Azkaban: running a Hive script and loading data from a file (org.apache.hadoop.security.AccessControlException: Permission denied...)

xiaoxiao  2022-06-26

The exception reported:

FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=aseema, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Solution:

hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -mkdir /tmp
hadoop fs -chmod -R 777 /user/hive
hadoop fs -chmod 777 /tmp
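As a quick sanity check (not in the original post), you can list the directories afterwards and confirm they are now world-writable; the error above came from user aseema lacking WRITE on the warehouse inode owned by hduser:

hadoop fs -ls -d /user/hive/warehouse /tmp
# Expect drwxrwxrwx in the mode column for both entries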

    test.sql

use default;
drop table aztest;
create table aztest(id int,name string) row format delimited fields terminated by ",";
load data local inpath 'b.txt' into table aztest;
create table azres as select * from aztest;

     hi.job
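The contents of hi.job did not survive in the original post. A minimal sketch of what it presumably contained, modeled on the hdfs.job shown later (the exact command line is an assumption):

# hi.job -- hypothetical reconstruction
type=command
command=hive -f test.sql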

     

The key point is load data local inpath 'b.txt'

Without local, the path is read from HDFS; with local, the file is read from the local filesystem. Since we are running under Azkaban, use local with a relative path.
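To make the difference concrete, here is a sketch of both variants run through the Hive CLI (the HDFS path is illustrative, not from the original post):

# LOCAL: 'b.txt' is resolved on the local filesystem, relative to the working directory
hive -e "load data local inpath 'b.txt' into table aztest;"
# Without LOCAL, the path is resolved against HDFS
hive -e "load data inpath '/user/hduser/b.txt' into table aztest;"

Note that LOAD DATA without LOCAL moves the HDFS file into the table's warehouse directory rather than copying it.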

For example:

hi.zip
----b.txt
----hi.job
----test.sql

     

OK, all of this works fine when run from Xshell.

But it broke when run through Azkaban.

Sometimes "permission denied", sometimes "file not found", all sorts of problems. Here is the final working setup:

The Hadoop user is hdfs.

    hdfs.job

#hivef.job
type=command
command=sudo bash start.sh
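Because Azkaban runs the job non-interactively, sudo must not prompt for a password. One way to arrange that, assuming the executor runs as an OS user named azkaban (a placeholder; adjust to your setup), is a sudoers entry:

# /etc/sudoers.d/azkaban -- hypothetical; always edit via visudo
azkaban ALL=(ALL) NOPASSWD: ALL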

     start.sh

    echo "************Bash Start********************" #echo chown hdfs:hdfs * chown hdfs:hdfs b.txt chown hdfs:hdfs test.hql mv b.txt /var/lib/hadoop-hdfs mv test.hql /var/lib/hadoop-hdfs su - hdfs -c "hive -f test.hql" rm -rf /var/lib/hadoop-hdfs/* echo "************Bash End**********************"

/var/lib/hadoop-hdfs is the directory that the "file not found" error points at if you leave out the mv commands: su - hdfs starts a login shell in the hdfs user's home directory, so hive -f test.hql looks for its files there.
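You can confirm that this is the hdfs user's home directory, and hence the working directory of the su - hdfs login shell, with:

getent passwd hdfs | cut -d: -f6
# Typically prints /var/lib/hadoop-hdfs on CDH-style installs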

    test.hql

create database IF not EXISTS test3;
use test3;
DROP TABLE if exists aztest;
create table aztest(id int,name string) row format delimited fields terminated by ",";
load data local inpath 'b.txt' into table aztest;
create table azres as select * from aztest;

     b.txt

2,bb
3,cc
7,yy
9,pp

Zip the four files (hdfs.job, start.sh, test.hql, b.txt) into one archive, upload it to Azkaban, run it, done.
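To verify the load actually worked, a quick check (not part of the original post) is to query the result table as the hdfs user:

su - hdfs -c "hive -e 'select * from test3.azres;'"
# Should print the four rows from b.txt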

If you get errors like "\r is not a shell command", the files have Windows line endings; use Notepad++ to convert them to Unix (LF) format (Edit -> EOL Conversion -> Unix (LF)).
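A command-line alternative to the editor fix, assuming GNU sed on the host:

sed -i 's/\r$//' hdfs.job start.sh test.hql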

All in all, one pitfall after another.

