MASTERING HADOOP (XI) - THE BASICS OF MAPREDUCE JOBS - SUMMARY

1.1 Summary


This chapter explained how to run a MapReduce job. You now have a basic understanding of the JobConf object and how to use it to tell the framework which classes and settings your job requires.
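As a reference point, here is a minimal sketch of configuring and submitting a job with the old org.apache.hadoop.mapred API that JobConf belongs to. The class names, paths, and reducer count are illustrative assumptions, not details taken from the chapter:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class WordCountDriver {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(WordCountDriver.class);
            conf.setJobName("word-count");

            // Tell the framework which mapper and reducer classes to use.
            conf.setMapperClass(WordCountMapper.class);
            conf.setReducerClass(WordCountReducer.class);

            // Declare the job's output key/value types.
            conf.setOutputKeyClass(Text.class);
            conf.setOutputValueClass(IntWritable.class);

            // Input and output locations (illustrative paths from the command line).
            FileInputFormat.setInputPaths(conf, new Path(args[0]));
            FileOutputFormat.setOutputPath(conf, new Path(args[1]));

            // Optional: how many reduce tasks this job should run.
            conf.setNumReduceTasks(1);

            JobClient.runJob(conf);
        }
    }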


You have seen how to write mapper and reducer classes, and how to use the Reporter object to report progress, status, and counter information about your job at runtime. Finally, the section on output is important: it covered when and why to configure your job to run a reduce phase, and how many reducers to use.
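A minimal sketch of such mapper and reducer classes in the old API, showing how a task can use the Reporter for counters and status. The counter group name and status message are assumptions for illustration; the two classes would normally live in separate source files:

    // --- WordCountMapper.java ---
    import java.io.IOException;
    import java.util.Iterator;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    public class WordCountMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {

        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                output.collect(word, ONE);
                // The Reporter lets the task publish counters visible in the job UI.
                reporter.incrCounter("WordCount", "WORDS_EMITTED", 1);
            }
            // It also lets the task publish a human-readable status string.
            reporter.setStatus("processed record at offset " + key.get());
        }
    }

    // --- WordCountReducer.java ---
    public class WordCountReducer extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {

        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }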


As a budding Hadoop expert, you were not surprised to see that files opened and written in the mapper and reducer classes appeared empty or short while the job was running: the framework does not flush the last filesystem block of data to disk until the file is closed.
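A minimal sketch of that point, assuming a task that writes a side file to HDFS in addition to its normal output (the side-file path and configuration key are illustrative assumptions). Until close() runs, the last block of the side file may not be visible on disk:

    import java.io.IOException;

    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class SideFileMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, NullWritable> {

        private FSDataOutputStream sideFile;

        @Override
        public void configure(JobConf job) {
            try {
                FileSystem fs = FileSystem.get(job);
                // Illustrative per-task side-file path.
                sideFile = fs.create(new Path("/tmp/side-output-" + job.get("mapred.task.id")));
            } catch (IOException e) {
                throw new RuntimeException("could not open side file", e);
            }
        }

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, NullWritable> output, Reporter reporter)
                throws IOException {
            // While the task is still running, bytes written here may sit in an
            // unflushed block and the file will look empty or short.
            sideFile.writeBytes(value.toString() + "\n");
            output.collect(value, NullWritable.get());
        }

        @Override
        public void close() throws IOException {
            // Closing the stream flushes the final block, so the complete file
            // becomes visible in the filesystem.
            if (sideFile != null) {
                sideFile.close();
            }
        }
    }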


In the next chapter, you will learn how to set up a multi-machine cluster.
