Fix: hdfs: org.apache.hadoop.security.AccessControlException (Permission denied)

Problem description: while using DataGrip to insert data into a Hive table, the job failed with the following error:

Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":root:supergroup:drwx------

Root cause: the job is submitted as a user (anonymous) that differs from the owner of /tmp on HDFS, and /tmp has mode 700 (drwx------), so no other user may access it.

Note: the permissions on /tmp can also be changed directly on HDFS.
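A minimal sketch of changing the /tmp permissions directly, assuming shell access on a cluster node and the ability to act as the hdfs superuser (1777 is the conventional sticky-bit mode for a shared /tmp):

```shell
# Assumption: this runs on a cluster node where the hdfs superuser is available.
# Give /tmp the world-writable sticky-bit mode so any user can submit jobs.
sudo -u hdfs hdfs dfs -chmod 1777 /tmp

# Verify the new mode (/tmp should now show drwxrwxrwt):
hdfs dfs -ls /
```

The sticky bit keeps users from deleting each other's files even though the directory is world-writable, which is why 1777 is preferred over a plain 777 here.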

Solution:

Option 1: change the cluster configuration.

In hdfs-site.xml, turn off HDFS permission checking by setting dfs.permissions.enabled to false:

    <property>
        <name>dfs.permissions.enabled</name>
        <value>false</value>
        <description>
            If "true", enable permission checking in HDFS.
            If "false", permission checking is turned off,
            but all other behavior is unchanged.
            Switching from one parameter value to the other does not change the mode,
            owner or group of files or directories.
        </description>
    </property>

Option 2: set HADOOP_USER_NAME in the shell environment.

Add the following to ~/.bash_profile:

    export HADOOP_USER_NAME=hdfs

Then apply it:

    source ~/.bash_profile
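A quick sketch of verifying that the variable is active in the current shell (it only affects Hadoop clients launched afterwards):

```shell
# Set the HDFS user for this shell session ("hdfs" per the fix above).
export HADOOP_USER_NAME=hdfs

# Any Hadoop client started from this shell now identifies as that user.
echo "$HADOOP_USER_NAME"
```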

 

Option 3: set the user in code.

Add the following line at the top of the main method:

    System.setProperty("HADOOP_USER_NAME", "hdfs");

This sets the HADOOP_USER_NAME system property to "hdfs" before the Hadoop client initializes, so the job is submitted as the hdfs user instead of anonymous.
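A minimal sketch of a job driver using this approach; the class name HiveJobDriver and the submission placeholder are illustrative, only the System.setProperty call comes from the post:

```java
// Sketch of a driver class; HiveJobDriver is a placeholder name.
public class HiveJobDriver {
    public static void main(String[] args) {
        // Must run before any Hadoop client class (Configuration, FileSystem,
        // UserGroupInformation) is touched, because the Hadoop client caches
        // the login user once on first use.
        System.setProperty("HADOOP_USER_NAME", "hdfs");

        // ... build the Hadoop Configuration and submit the Hive job here ...
    }
}
```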
---------------------

 


Origin www.cnblogs.com/GuangMingDingFighter/p/10958989.html