Hive Study Notes: Hive Hooks

Hive hooks are callbacks that Hive invokes at various points in a query's lifecycle, for example before execution, after a successful execution, or on failure. The following slides give an overview of the available hook points:

https://www.slideshare.net/julingks/apache-hive-hooksminwookim130813
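
For reference, some of the commonly used hook configuration points are listed below (property names set in hive-site.xml or through the management console); the slides above cover the full list:

hive.exec.pre.hooks          run before query execution (ExecuteWithHookContext)
hive.exec.post.hooks         run after the query finishes successfully
hive.exec.failure.hooks      run when query execution fails
hive.semantic.analyzer.hook  run around semantic analysis (HiveSemanticAnalyzerHook)
hive.exec.driver.run.hooks   run at the beginning and end of Driver.run (HiveDriverRunHook)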

Below we implement a simple hook.

Dependencies

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>interview-parent</artifactId>
        <groupId>com.interview</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <artifactId>interview-bigdata</artifactId>

    <dependencies>
        <!-- route log4j calls to slf4j -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
            <version>1.7.25</version>
        </dependency>
        <!--hive-->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>1.1.0</version>
        </dependency>
    </dependencies>

</project>

The hook class itself just logs a single line:

package com.bigdata.hive;

import org.apache.hadoop.hive.ql.hooks.ExecuteWithHookContext;
import org.apache.hadoop.hive.ql.hooks.HookContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyHiveHook implements ExecuteWithHookContext {

    private static final Logger logger = LoggerFactory.getLogger(MyHiveHook.class);

    // Invoked by Hive at the configured hook point (e.g. before query
    // execution when registered under hive.exec.pre.hooks).
    @Override
    public void run(HookContext hookContext) throws Exception {
        logger.info("this is my hive hook");
    }
}
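
The HookContext passed to run() also exposes query metadata. The sketch below is a minimal example of logging some of it; it assumes the HookContext getters available in Hive 1.x (getQueryPlan(), getHookType(), getUgi()), and the class name QueryLoggingHook is only for illustration.

package com.bigdata.hive;

import org.apache.hadoop.hive.ql.QueryPlan;
import org.apache.hadoop.hive.ql.hooks.ExecuteWithHookContext;
import org.apache.hadoop.hive.ql.hooks.HookContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Sketch: a hook that logs basic query metadata taken from the HookContext.
public class QueryLoggingHook implements ExecuteWithHookContext {

    private static final Logger logger = LoggerFactory.getLogger(QueryLoggingHook.class);

    @Override
    public void run(HookContext hookContext) throws Exception {
        QueryPlan plan = hookContext.getQueryPlan();
        // Which stage fired the hook: PRE_EXEC_HOOK, POST_EXEC_HOOK or ON_FAILURE_HOOK
        logger.info("hook type:    {}", hookContext.getHookType());
        logger.info("query id:     {}", plan.getQueryId());
        logger.info("query string: {}", plan.getQueryString());
        // The user running the query, taken from the UGI attached to the context
        logger.info("user:         {}", hookContext.getUgi().getShortUserName());
    }
}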

Build the jar

mvn clean package

Upload the built jar to /var/lib/hive on every host where HiveServer2 runs (alternatively, place it in an HDFS directory):

root@master:/var/lib/hive# ls
examples.desktop  interview-bigdata-1.0-SNAPSHOT.jar

Change the owner to hive:

sudo chown hive:hive ./interview-bigdata-1.0-SNAPSHOT.jar

Then configure the jar directory and the hook class in Cloudera Manager.

The jar directory (the Hive Auxiliary JARs Directory setting in Cloudera Manager):

The hook class. Here it is configured under hive.exec.pre.hooks, so the hook will run before each SQL statement is executed.
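
To try the hook out without changing the cluster-wide configuration, it can also be enabled for a single session (assuming hive.exec.pre.hooks is not on the restricted-parameter list), for example in beeline:

set hive.exec.pre.hooks=com.bigdata.hive.MyHiveHook;
select 1;

After the query runs, the "this is my hive hook" line should show up in the HiveServer2 log, subject to the logging configuration.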

Reposted from www.cnblogs.com/tonglin0325/p/12542656.html