Setting Up a Local Big Data Development Environment on macOS

Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/datadev_sh/article/details/87937567

Edit the configuration files as the root user. If there is no root user, create one first.

For example:
vi /etc/profile

Save and exit with:
:wq!

1. Java

Typical installation location:

/Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk/Contents/Home
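To confirm the JDK path on your own machine (1.8.0_201 is simply the version installed here; yours may differ), macOS ships a helper that lists installed JDKs and prints the home directory of a given version:

/usr/libexec/java_home -V      # list all installed JDKs
/usr/libexec/java_home -v 1.8  # print the home of the 1.8 JDK

The second command's output is what JAVA_HOME should be set to in /etc/profile.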

2. Install Hadoop

Reference: https://blog.csdn.net/fox64194167/article/details/80617527

Hadoop location:

/usr/local/Cellar/hadoop/3.1.1

Hadoop configuration file location:

/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop

vi /etc/profile

# hadoop
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

The Hadoop installed via brew never worked properly for me, so I ended up downloading the tar package and installing from that instead.
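For a single-machine pseudo-distributed setup, the two files under the configuration directory above that usually need editing are core-site.xml and hdfs-site.xml. A minimal sketch, assuming HDFS listens on localhost:9000 (both values are common single-node defaults, not taken from the original post):

core-site.xml:

<configuration>
  <property>
    <!-- assumption: HDFS NameNode on localhost:9000 -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

hdfs-site.xml:

<configuration>
  <property>
    <!-- single machine, so keep one replica -->
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

After that, format the NameNode once with hdfs namenode -format and bring HDFS up with start-dfs.sh.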

3. Install Scala

brew install scala

Installation location:

/usr/local/Cellar/scala/2.12.8

vi /etc/profile

# scala
export SCALA_HOME=/usr/local/Cellar/scala/2.12.8
export PATH=$SCALA_HOME/bin:$PATH
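To double-check what brew actually installed (2.12.8 above is simply the version I got at the time):

brew info scala   # shows the installed version and its Cellar path
scala -version    # should report the same version number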

4. Install Spark

Reference: https://spark.apache.org/downloads.html
An error was reported on startup:

/Volumes/ds/service/spark-2.4.0-bin-hadoop2.7/bin/spark-class: line 71: 
/Library/Java/JavaVirtualMachines/jdk1.8.0_201/Contents/Home/bin/java: No such file or directory

Reference: https://blog.csdn.net/datadev_sh/article/details/87937117

The path in the error message is missing the .jdk suffix, so create a symlink under the name Spark is looking for:

ln -s  /Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk /Library/Java/JavaVirtualMachines/jdk1.8.0_201
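The complete /etc/profile at the end of this post does not include Spark; if you want spark-shell and spark-submit on the PATH, a sketch assuming the tarball was unpacked to the /Volumes/ds/service path shown in the error above:

# spark (path is an assumption based on the error message above)
export SPARK_HOME=/Volumes/ds/service/spark-2.4.0-bin-hadoop2.7
export PATH=$SPARK_HOME/bin:$PATH

After that, running spark-shell should drop into a local REPL, which is a quick way to confirm the install.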

5. MySQL

References: https://www.jianshu.com/p/07a9826898c0
https://dev.mysql.com/downloads/mysql/
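MySQL is mainly here to back the Hive metastore in the next step. A minimal sketch of preparing a metastore database and account; the database name, user, and password are placeholders of my own, not from the original post. Log in with mysql -u root -p, then run:

CREATE DATABASE metastore DEFAULT CHARACTER SET utf8;
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive_password';
GRANT ALL PRIVILEGES ON metastore.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;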

6. Hive

Reference: https://www.jianshu.com/p/5c11073d19d3

brew install hive

Installation location:

/usr/local/Cellar/hive/3.1.1

vi /etc/profile

# hive
export HIVE_HOME=/usr/local/Cellar/hive/3.1.1
export PATH=$HIVE_HOME/bin:$PATH

Hive configuration file: point hive-site.xml at the MySQL metastore set up in step 5; a sketch follows.
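A minimal hive-site.xml sketch for a MySQL-backed metastore. It assumes the placeholder database, user, and password from the MySQL sketch above, and that a brew install keeps its config under /usr/local/Cellar/hive/3.1.1/libexec/conf:

<configuration>
  <property>
    <!-- placeholder database name from the MySQL sketch above -->
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?useSSL=false</value>
  </property>
  <property>
    <!-- with Connector/J 5.x the class name is com.mysql.jdbc.Driver -->
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive_password</value>
  </property>
</configuration>

Copy the MySQL Connector/J jar into Hive's lib directory, then initialize the metastore schema once with schematool -dbType mysql -initSchema.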

7. ZooKeeper

Reference: https://www.jianshu.com/p/b889b86536be

brew install zookeeper

Installation location:

/usr/local/Cellar/zookeeper/3.4.13

Change dataDir in the zoo.cfg configuration file to an ordinary directory that the current user has write permission for.
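For a brew install the file is /usr/local/etc/zookeeper/zoo.cfg (this matches the "Using config" line in the startup output below). A sketch of the relevant lines; the data directory is a placeholder of my own:

tickTime=2000
clientPort=2181
# placeholder: any directory the current user can write to
dataDir=/Users/xin/zookeeper/data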
Start it with ./zkServer start

Run it from the /usr/local/Cellar/zookeeper/3.4.13/bin directory:

sh-3.2# ./zkServer start
ZooKeeper JMX enabled by default
Using config: /usr/local/etc/zookeeper/zoo.cfg
Starting zookeeper ... STARTED
sh-3.2# jps
5334 QuorumPeerMain
5337 Jps
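To confirm the server actually answers, connect with the client script (brew installs a zkCli wrapper next to zkServer; a tarball install uses bin/zkCli.sh instead):

./zkCli -server 127.0.0.1:2181
ls /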

8. HBase

Reference: https://blog.csdn.net/fox64194167/article/details/80638126

brew install hbase
# hbase
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.9
export PATH=$HBASE_HOME/bin:$PATH

In hbase-env.sh, set JAVA_HOME, HBASE_LOG_DIR, and HBASE_REGIONSERVERS; do not put a trailing slash on the directory paths.
Then edit hbase-site.xml (a sketch follows).

If you use HBase's built-in ZooKeeper, do not also start the separately installed ZooKeeper, otherwise the port will already be occupied.
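A minimal hbase-site.xml sketch for standalone mode. Both directories are placeholders of my own; hbase.rootdir can instead point at HDFS (e.g. hdfs://localhost:9000/hbase) if the HDFS from step 2 is running:

<configuration>
  <property>
    <!-- placeholder path: where HBase stores its data in standalone mode -->
    <name>hbase.rootdir</name>
    <value>file:///Users/xin/hbase/data</value>
  </property>
  <property>
    <!-- placeholder path: data directory for the built-in ZooKeeper -->
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/Users/xin/hbase/zookeeper</value>
  </property>
</configuration>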

Start it:

/usr/local/opt/hbase/bin/start-hbase.sh

Complete /etc/profile for reference:


#JAVA_HOME
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk/Contents/Home
export PATH=$JAVA_HOME/bin:$PATH 

# neo4j
export NEO4J_HOME=/Volumes/ds/service/neo4j-community-3.5.3
export PATH=$NEO4J_HOME/bin:$PATH

# scala
export SCALA_HOME=/usr/local/Cellar/scala/2.12.8
export PATH=$SCALA_HOME/bin:$PATH

# hive
export HIVE_HOME=/usr/local/Cellar/hive/3.1.1
export PATH=$HIVE_HOME/bin:$PATH

# hbase
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.9
export PATH=$HBASE_HOME/bin:$PATH

export HADOOP_HOME=/Users/xin/Downloads/service/hadoop-3.1.2
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

export HADOOP_MAPRED_HOME=$HADOOP_HOME
export PATH=$PATH:$HADOOP_MAPRED_HOME

export HADOOP_COMMON_HOME=$HADOOP_HOME
export PATH=$PATH:$HADOOP_COMMON_HOME

export HADOOP_HDFS_HOME=$HADOOP_HOME
export PATH=$PATH:$HADOOP_HDFS_HOME

export YARN_HOME=$HADOOP_HOME
export PATH=$PATH:$YARN_HOME
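
After editing, reload /etc/profile in the current shell so the variables take effect, and spot-check one of them:

source /etc/profile
echo $HADOOP_HOME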

