Install Hadoop 1.0.4 under Mac OS

   http://andy-ghg.iteye.com/blog/1165453

I followed the example in the reference above, but my 1.0.4 installation differs slightly.

1: Added the namenode address to core-site.xml

   

<property>
  <name>hadoop.tmp.dir</name>
  <value>/Users/apple/tmp/hadoop/hadoop-${user.name}</value>
  <description>A base for other temporary directories.</description>
</property>
<property>
     <name>fs.default.name</name>
     <value>hdfs://localhost:8020</value>
</property>

Setting hadoop.tmp.dir this way seems to avoid having to answer Y during bin/hadoop namenode -format, and it also puts the data directories in a location of your own choosing.
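
For the record, here is roughly how I format and start everything afterwards (run from the Hadoop install directory; these are the standard 1.x scripts):

bin/hadoop namenode -format   # initializes namenode storage under hadoop.tmp.dir
bin/start-all.sh              # starts NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
jps                           # lists the running Java daemons as a sanity check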

 

2: Added the following configuration to hadoop-env.sh

# JDK 6 location on Mac OS X
export JAVA_HOME=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home
# heap size for the Hadoop daemons, in MB
export HADOOP_HEAPSIZE=2000
# where Hadoop 1.0.4 was unpacked
export HADOOP_INSTALL=/Users/apple/hadoop-1.0.4
# Kerberos workaround for a Mac OS X-specific problem (see below)
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

The HADOOP_OPTS line works around some Mac OS X bug, though I've forgotten exactly which; judging by the Kerberos settings it is likely the well-known "Unable to load realm info from SCDynamicStore" error.
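
As an aside, if you are unsure what to put in JAVA_HOME, Mac OS X ships a helper that prints the JDK path (a small sketch; the -v flag picks the Java version):

/usr/libexec/java_home -v 1.6
# prints something like /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home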

 

In addition, there is the 1.0.4 Eclipse plug-in. It is hard to find: after a lot of searching, none of the copies I turned up actually worked, so I built one locally, and it passed my tests.

 

Finally, the test program

package hadoop;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class MyTest {

	public static void main(String[] args) throws Exception {
		// Host and port must match fs.default.name in core-site.xml.
		String uri = "hdfs://localhost:8020/Users/apple/tmp/hadoop/Zookeeper.doc";
		Configuration conf = new Configuration();
		FileSystem fs = FileSystem.get(URI.create(uri), conf);
		FSDataInputStream in = null;
		try {
			in = fs.open(new Path(uri));
			System.out.println("the first print:");
			IOUtils.copyBytes(in, System.out, 4096, false);
			in.seek(0); // seek() works on input streams; HDFS files are written append-only, so on a write stream you can read the position but not change it.
			System.out.println("the second print:");
			IOUtils.copyBytes(in, System.out, 4096, false);
		} finally {
			IOUtils.closeStream(in);
		}
	}

}
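
For fs.open() to succeed, the file has to exist in HDFS first. A minimal sketch with the 1.x file-system shell (the local source path ~/Zookeeper.doc is just a stand-in for wherever your copy lives):

bin/hadoop fs -mkdir /Users/apple/tmp/hadoop                 # create the target directory in HDFS
bin/hadoop fs -put ~/Zookeeper.doc /Users/apple/tmp/hadoop/  # upload; ~/Zookeeper.doc is a placeholder
bin/hadoop fs -ls /Users/apple/tmp/hadoop                    # confirm the file is there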

 Tested under Hadoop 1.0.4 and MyEclipse 8.6.


This is my first blog post and it is quite rough, but a Weibo post from ThoughtWorks said that if you want to improve your technical skills, you should write more blogs: it deepens your thinking and exercises your ability to express yourself. That feels about right.

 
