How to test MapReduce locally with Hadoop

If you develop MapReduce jobs, you will inevitably need to test them.

In most cases this testing is done locally.

To run local tests, you need to add one extra dependency.

Search for hadoop on the following site:

http://mvnrepository.com

Add the dependency below to pom.xml (see http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-minicluster):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-minicluster</artifactId>
    <version>2.7.6</version>
    <scope>test</scope>
</dependency>

Term notes:

scope: test means the dependency is only used for tests; it is not included when the project is packaged or released.
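Because the dependency is test-scoped, a typical way to use it is from a small test class. Below is a minimal sketch (the class name and file path are illustrative, not from the original post) that starts a MiniDFSCluster from hadoop-minicluster inside the current JVM, writes a file into the embedded HDFS, and shuts the cluster down:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class MiniClusterSketch {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Start a single-datanode HDFS cluster inside the current JVM.
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
                .numDataNodes(1)
                .build();
        try {
            FileSystem fs = cluster.getFileSystem();

            // Write a small test file so a job has some input to read.
            Path input = new Path("/test/input.txt");
            try (FSDataOutputStream out = fs.create(input)) {
                out.writeBytes("hello hadoop\n");
            }

            System.out.println("input exists: " + fs.exists(input));
        } finally {
            // Always stop the embedded cluster, even if the test fails.
            cluster.shutdown();
        }
    }
}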

Testing on the local machine here means running the MapReduce program against a Hadoop environment set up on Windows.
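As an alternative to depending entirely on the Windows Hadoop installation, the job can also be forced to run in local mode through standard Hadoop configuration keys. The helper below is a sketch under that assumption (the class and method names are illustrative):

import org.apache.hadoop.conf.Configuration;

public class LocalConfSketch {

    // Build a Configuration that runs MapReduce in-process against the local filesystem.
    public static Configuration localConf() {
        Configuration conf = new Configuration();
        // Use the local job runner instead of YARN.
        conf.set("mapreduce.framework.name", "local");
        // Read and write the local filesystem instead of HDFS.
        conf.set("fs.defaultFS", "file:///");
        return conf;
    }
}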

Below is a default local test driver for MapReduce:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MapReduceTest extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.out.println("usage: <input path> <output path>");
            return -1;
        }

        // 1. Get the configuration for the job
        Configuration conf = getConf();

        // Connect to the filesystem; if the output path already exists, delete it
        Path src = new Path(args[0]);
        Path desc = new Path(args[1]);

        FileSystem fs = FileSystem.get(conf);
        if (fs.exists(desc)) {
            fs.delete(desc, true);
        }

        // 2. Create a job
        Job job = Job.getInstance(conf);
        // Set the job name
        job.setJobName("defaultmapreduce");

        // 3. Tell Hadoop which jar/class to ship
        job.setJarByClass(getClass());

        // 4. Set the mapper
        /*job.setMapperClass(MyMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(LongWritable.class);
        */

        // 5. Set the reducer

        // 6. Set the input and output paths
        FileInputFormat.addInputPath(job, src);
        FileOutputFormat.setOutputPath(job, desc);

        // 7. Run the job and wait for it to finish
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        int code = ToolRunner.run(new MapReduceTest(), args);
        System.exit(code);
    }

    /*public static class MyMapper extends Mapper<LongWritable, Text, Text, LongWritable>{

    }*/
}
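To turn the skeleton above into a runnable job, the commented-out mapper (and a matching reducer) would be filled in and registered in run(). The word-count style classes below are an illustrative sketch, not part of the original post; they could equally be written as static nested classes of MapReduceTest.

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCountClasses {

    // Emits (word, 1) for every word in the input line.
    public static class MyMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private final Text word = new Text();
        private final LongWritable one = new LongWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, one);
                }
            }
        }
    }

    // Sums the counts emitted for each word.
    public static class MyReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }
}

In run(), these would be wired in with job.setMapperClass(...) and job.setReducerClass(...), along with the output key/value classes shown in the commented-out lines, and the driver would then be started from the IDE with two arguments: the input path and the output path.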


 


Reposted from blog.csdn.net/qq_42482484/article/details/81158764