E-commerce platform log analysis system under big data

## The ETL stage

 

Run the ETL: after cleaning, the data is loaded into HBase.

Start ZooKeeper and HDFS first:

zkServer.sh start
start-dfs.sh

Then start YARN in HA mode:

[root@node1 ~]# ./shells/start-yarn-ha.sh

The start-yarn-ha.sh script runs:

start-yarn.sh
ssh root@node3 "$HADOOP_HOME/sbin/yarn-daemon.sh start resourcemanager"
ssh root@node4 "$HADOOP_HOME/sbin/yarn-daemon.sh start resourcemanager"

Finally, start HBase:

start-hbase.sh

------------------- Create the HBase table

hbase shell

hbase(main):001:0> create 'EventLog', 'log'

## After executing the program, scan the table to confirm that the ETL data is present.
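Besides `scan` in the HBase shell, a minimal Java client sketch along these lines could read back a few rows of the cleaned data. The class name EventLogScanner is hypothetical; it only assumes the 'EventLog' table and the node2,node3,node4 ZooKeeper quorum that the post itself configures:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical verification helper, not part of the project: prints the
// first few row keys of 'EventLog' so you can confirm the ETL job wrote data.
public class EventLogScanner {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "node2,node3,node4");
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("EventLog"));
             ResultScanner rs = table.getScanner(new Scan())) {
            int shown = 0;
            for (Result r : rs) {
                System.out.println(Bytes.toString(r.getRow()));
                if (++shown >= 10) break; // peek at the first 10 rows only
            }
        }
    }
}
```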

  

------------------------------ Run the project BIG_DATA_SXT_1 with the following configuration changes:

public class AnalyserLogDataRunner implements Tool {
    private static final Logger logger = Logger.getLogger(AnalyserLogDataRunner.class);
    private Configuration conf = null;

    public static void main(String[] args) {
        try {
            ToolRunner.run(new Configuration(), new AnalyserLogDataRunner(), args);
        } catch (Exception e) {
            logger.error("Error executing the log-parsing job", e);
            throw new RuntimeException(e);
        }
    }

    @Override
    public void setConf(Configuration conf) {
        // Point the job at the HDFS namenode and the HBase ZooKeeper quorum
        conf.set("fs.defaultFS", "hdfs://node1:8020");
        // conf.set("yarn.resourcemanager.hostname", "node3");
        conf.set("hbase.zookeeper.quorum", "node2,node3,node4");
        this.conf = HBaseConfiguration.create(conf);
    }

    // .......
}
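The elided body of the class (the `.......`) is not shown in the post. For orientation only, a Tool that loads parsed logs into HBase usually has a run method roughly like the hypothetical sketch below; AnalyserLogDataMapper, the job name, and the input-path argument are assumptions, not the project's actual code:

```java
// Hypothetical sketch of the elided run() method; the project's real code is
// not shown in the post. Assumes a mapper that emits (NullWritable, Put) and
// imports from org.apache.hadoop.mapreduce and org.apache.hadoop.hbase.mapreduce.
@Override
public int run(String[] args) throws Exception {
    Job job = Job.getInstance(this.conf, "analyser_logdata");
    job.setJarByClass(AnalyserLogDataRunner.class);
    job.setMapperClass(AnalyserLogDataMapper.class); // assumed mapper class
    job.setMapOutputKeyClass(NullWritable.class);
    job.setMapOutputValueClass(Put.class);
    FileInputFormat.addInputPath(job, new Path(args[0])); // HDFS log directory
    // Map-only job: the Put objects go straight into the EventLog table
    TableMapReduceUtil.initTableReducerJob("EventLog", null, job);
    job.setNumReduceTasks(0);
    return job.waitForCompletion(true) ? 0 : 1;
}
```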
------------- Next section:

The data above cannot meet the requirements (too few distinct IPs and dates), so test data needs to be generated with the class /BIG_DATA_SXT_1/test/com/sxt/test/TestDataMaker.java.
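The post gives only the generator's path, not its source. A minimal sketch of such a data maker might look like the following; the log-line layout, output file name, and record count are assumptions for illustration:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.util.Random;

// Hypothetical sketch of a test-data generator; the real TestDataMaker in the
// project may use a different log format. It emits lines spanning many IPs
// and dates so the later analysis has enough variety to work with.
public class TestDataMaker {
    public static void main(String[] args) throws IOException {
        Random rnd = new Random();
        try (FileWriter out = new FileWriter("access.log")) {
            for (int i = 0; i < 10000; i++) {
                // Random IP in 192.168.x.x and a random day in August 2019
                String ip = "192.168." + rnd.nextInt(256) + "." + rnd.nextInt(256);
                String date = String.format("2019-08-%02d", 1 + rnd.nextInt(31));
                long ts = System.currentTimeMillis() - rnd.nextInt(86_400_000);
                out.write(ip + "\t" + date + "\t" + ts + "\tu_" + rnd.nextInt(500) + "\n");
            }
        }
    }
}
```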

  
