Hive UDF Development and Deployment

  1. Add the Maven dependencies
 <dependencies>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.alibaba/fastjson -->
        <!-- Optional: only needed because this UDF parses JSON with Alibaba's fastjson -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.57</version>
        </dependency>

    </dependencies>
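On a real cluster, hive-exec and hadoop-common are already on Hive's classpath, so they are usually marked `provided` to keep the fat jar small. A sketch of what that scope declaration looks like (not from the original pom):

```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.1.0</version>
    <!-- supplied by the cluster at runtime; excluded from the fat jar -->
    <scope>provided</scope>
</dependency>
```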
  2. Write the code: extend UDF and overload evaluate
package io.udf.hive;

import com.alibaba.fastjson.JSONObject;
import org.apache.hadoop.hive.ql.exec.UDF;

public class UDFjson extends UDF {

    // Called by Hive once per row: returns the string value stored under
    // `key` in the JSON document `input`.
    public String evaluate(String input, String key) {
        return jsonParser(input, key);
    }

    public String jsonParser(String in, String key) {
        if (in == null) {
            return null; // Hive may pass NULL values
        }
        JSONObject object = JSONObject.parseObject(in);
        return object == null ? null : object.getString(key);
    }

}
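The lookup the UDF delegates to fastjson can be sanity-checked without any cluster or fastjson dependency. The stdlib-only helper below is a hypothetical sketch that mimics `getString` for a top-level string value; real parsing (nesting, escapes, non-string types) is what fastjson handles:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MiniJson {
    // Extract the value of a top-level "key": "value" string pair.
    // Illustration only; the UDF itself uses fastjson's JSONObject.
    public static String getString(String json, String key) {
        Pattern p = Pattern.compile(
                "\"" + Pattern.quote(key) + "\"\\s*:\\s*\"([^\"]*)\"");
        Matcher m = p.matcher(json);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(getString("{\"name\":\"hive\",\"ver\":\"1.1.0\"}", "name"));
    }
}
```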

  3. Package as a jar and upload it to HDFS
hdfs dfs -put /home/webserver/jars/UDFjson-1.0-SNAPSHOT-jar-with-dependencies.jar /udf/ 

  4. Open the Hive CLI and switch to the database where the UDF should be registered:

CREATE FUNCTION json_tuple_test AS 'io.udf.hive.UDFjson' USING JAR 'hdfs:///udf/UDFjson-1.0-SNAPSHOT-jar-with-dependencies.jar';
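Once registered, the function can be called like a built-in. A hedged usage example, assuming a hypothetical table `logs` with a STRING column `line` holding JSON:

```sql
-- Table and column names are illustrative; json_tuple_test is the function created above.
SELECT json_tuple_test(line, 'name') AS name
FROM logs
LIMIT 10;
```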

  5. Check the function by running the following in the Hive CLI:
show functions;

PS: A registered UDF is prefixed with its database name, so keep that in mind when looking it up.
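Since `show functions;` lists every function, the output can be filtered instead. The statements below assume the function was created in the `default` database:

```sql
SHOW FUNCTIONS LIKE "*json*";               -- matches e.g. default.json_tuple_test
DESCRIBE FUNCTION default.json_tuple_test;  -- shows the registered class
```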

Reprinted from blog.csdn.net/xiaozhaoshigedasb/article/details/102719983