JStorm: ProcessLauncher processes that never exit

`ps -elf|grep Launcher|wc -l` showed a large number of ProcessLauncher processes that had never exited.
Check the supervisor's threads:

jstack -l 44493|grep waitForProcessExit|wc -l

The supervisor also had a large number of threads parked waiting for a child process to exit, matching the ProcessLauncher count one for one.

We used jstack to see where the launchers were stuck.

"main" #1 prio=5 os_prio=0 tid=0x00007fc93c00a000 nid=0x7db9 runnable [0x00007fc9438fb000]
   java.lang.Thread.State: RUNNABLE
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java:326)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
    - locked <0x00000000fab1fd98> (a java.io.BufferedOutputStream)
    at java.io.PrintStream.write(PrintStream.java:480)
    - locked <0x00000000fab1fce8> (a java.io.PrintStream)
    at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
    at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:291)
    at sun.nio.cs.StreamEncoder.flushBuffer(StreamEncoder.java:104)
    - locked <0x00000000fab1ffc0> (a java.io.OutputStreamWriter)
    at java.io.OutputStreamWriter.flushBuffer(OutputStreamWriter.java:185)
    at java.io.PrintStream.write(PrintStream.java:527)
    - locked <0x00000000fab1fce8> (a java.io.PrintStream)
    at java.io.PrintStream.print(PrintStream.java:669)
    at java.io.PrintStream.println(PrintStream.java:806)
    - locked <0x00000000fab1fce8> (a java.io.PrintStream)
    at com.alibaba.jstorm.utils.ProcessLauncher.main(ProcessLauncher.java:130)

Or a stack like the following:

   java.lang.Thread.State: RUNNABLE
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java:326)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    - locked <0x00000000fab1fe20> (a java.io.BufferedOutputStream)
    at java.io.PrintStream.write(PrintStream.java:482)
    - locked <0x00000000fab1fd70> (a java.io.PrintStream)
    at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
    at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:291)
    at sun.nio.cs.StreamEncoder.flushBuffer(StreamEncoder.java:104)
    - locked <0x00000000fab20048> (a java.io.OutputStreamWriter)
    at java.io.OutputStreamWriter.flushBuffer(OutputStreamWriter.java:185)
    at java.io.PrintStream.write(PrintStream.java:527)
    - locked <0x00000000fab1fd70> (a java.io.PrintStream)
    at java.io.PrintStream.print(PrintStream.java:683)
    at ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.print(OnPrintStreamStatusListenerBase.java:44)
    at ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.addStatusEvent(OnPrintStreamStatusListenerBase.java:50)
    at ch.qos.logback.core.status.OnConsoleStatusListener.addStatusEvent(OnConsoleStatusListener.java:25)
    at ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:87)
    - locked <0x00000000faea1948> (a ch.qos.logback.core.spi.LogbackLock)
    at ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59)
    at ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:79)
    at ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:84)
    at ch.qos.logback.core.rolling.RollingPolicyBase.determineCompressionMode(RollingPolicyBase.java:50)
    at ch.qos.logback.core.rolling.TimeBasedRollingPolicy.start(TimeBasedRollingPolicy.java:62)
    at ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy.start(SizeAndTimeBasedRollingPolicy.java:23)
    at ch.qos.logback.core.joran.action.NestedComplexPropertyIA.end(NestedComplexPropertyIA.java:167)
    at ch.qos.logback.core.joran.spi.Interpreter.callEndAction(Interpreter.java:317)
    at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:196)
    at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:182)
    at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:62)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:149)
    - locked <0x00000000faea1b58> (a ch.qos.logback.core.spi.LogbackLock)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:135)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:99)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:49)
    at ch.qos.logback.classic.util.ContextInitializer.configureByResource(ContextInitializer.java:75)
    at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:148)
    at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:85)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:55)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:107)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:295)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:269)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:281)
    at backtype.storm.utils.Utils.<clinit>(Utils.java:103)
    at com.alibaba.jstorm.utils.ProcessLauncher.main(ProcessLauncher.java:157)

The code is stuck in System.out.println. Insufficient disk space and similar causes were ruled out first. All the other threads in the process are daemon threads, so this is not a thread deadlock either.
The next thing to check was the number of open files.

##System-wide open-file limit
cat /proc/sys/fs/file-max
3247228
##Number of files currently open
lsof|wc -l
1132023

So the total number of open files is below the limit, but over a million open files still seems excessive. It turned out that a few worker processes had each opened an enormous number of descriptors.

[jboss5@h3715217148-1 ~]$ lsof -n|awk '{print $2}'|sort|uniq -c|sort -nr|more
 321480 52771
 321300 52744
 283815 52738
  30030 48039
  29601 48028
  29574 44493
  22860 48430
  14544 47458
   9432 52748
   6160 56821
   2464 39005
   1113 8791
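As a cross-check that does not depend on lsof's row semantics (lsof also lists mmap'd files, cwd, and text segments, so its counts overstate true FD usage), the same per-process ranking can be read straight from /proc. A sketch, assuming a Linux /proc layout:

```shell
# Count real file descriptors per process from /proc/<pid>/fd
for p in /proc/[0-9]*; do
    n=$(ls "$p/fd" 2>/dev/null | wc -l)   # empty for other users' processes
    [ "$n" -gt 0 ] && printf '%7d %s\n' "$n" "${p#/proc/}"
done | sort -rn | head
```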

Open files of one of the launchers that refused to exit:

java    3535 jboss5    0r  FIFO                0,8       0t0 2725987468 pipe
java    3535 jboss5    1w  FIFO                0,8       0t0 2725987469 pipe
java    3535 jboss5    2w  FIFO                0,8       0t0 2725987469 pipe

These are clearly stdin, stdout and stderr. Each is a pipe whose read end belongs to the parent supervisor process. `0t0` is the offset lsof reports; note, though, that lsof shows `0t0` for pipes regardless of traffic, so by itself it does not prove that nothing was ever written.
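The stacks above, stuck inside FileOutputStream.writeBytes, are exactly what a full, undrained pipe produces: a write blocks once the kernel pipe buffer (about 64 KiB on Linux) fills and the read end, though open, is never read. A minimal sketch of that state (the FIFO path and sizes here are illustrative):

```shell
fifo=$(mktemp -u)
mkfifo "$fifo"

# "supervisor": holds the read end open but never reads from it
sleep 30 < "$fifo" &
reader=$!

# "launcher": tries to write 1 MiB; it stalls once the pipe buffer is full
head -c 1048576 /dev/zero > "$fifo" &
writer=$!

sleep 2
if kill -0 "$writer" 2>/dev/null; then state=blocked; else state=finished; fi
echo "writer is $state"
kill "$writer" "$reader" 2>/dev/null
rm -f "$fifo"
```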

On a worker that had started successfully, we also observed a pipe internal to the process that appears to be tied to an eventpoll (epoll) instance:

java    52771 jboss5 2610u  a_inode                0,9         0       6372 [eventpoll]
java    52771 jboss5 2611r     FIFO                0,8       0t0 2460351397 pipe
java    52771 jboss5 2612w     FIFO                0,8       0t0 2460351397 pipe

Unable to explain this, we decided to first kill the stuck ProcessLauncher processes by hand to relieve the immediate problem:

ps -ef|grep -i processlauncher |grep -v grep|cut -c 9-15|xargs kill -9
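A side note on that one-liner: `cut -c 9-15` depends on fixed `ps` column widths, which vary across systems. Assuming procps's `pkill` is available, matching on the full command line is more robust: `pkill -9 -f ProcessLauncher`. A small demo of the same pattern against a dummy process:

```shell
# Launch a dummy long-running process with a recognizable command line
sleep 31337 &
victim=$!
sleep 1                        # let the child exec so its cmdline is visible
pkill -9 -f 'sleep 31337'      # same idea as: pkill -9 -f ProcessLauncher
wait "$victim"
status=$?
echo "dummy exit status: $status"   # 137 (= 128 + SIGKILL) in most shells
```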

After the kill the machine quieted down, and we went back to starting our workers. No luck: they still would not start. So the stuck ProcessLaunchers were an effect of the cluster's problem, not its cause.
We then checked the thread count first:

cat /proc/5056/status

The result showed 115 threads.
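`cat` dumps the whole status file; the thread count is just its `Threads:` field, and `/proc/<pid>/task` has one entry per thread. Using the current shell's PID as a stand-in for the supervisor's:

```shell
pid=$$                               # substitute the supervisor PID here
grep '^Threads' "/proc/$pid/status"  # prints e.g. "Threads: 115" for the supervisor
threads=$(ls "/proc/$pid/task" | wc -l)
echo "thread count: $threads"
```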

The supervisor's GC statistics showed nothing unusual:

$ jstat -gc 44493
 S0C    S1C    S0U    S1U      EC       EU        OC         OU       MC     MU    CCSC   CCSU   YGC     YGCT    FGC    FGCT     GCT
52416.0 52416.0 472.2   0.0   419456.0 135049.7  524288.0   21829.5   17792.0 17285.1 1920.0 1796.8  10014  146.775   0      0.000  146.775

So for some reason the Launcher processes' writes to the pipe were stalling. A plausible mechanism: a write to a pipe blocks once the kernel's pipe buffer (typically 64 KiB on Linux) is full and the reader is not draining it, so if the supervisor never reads the launcher's stdout, printing the entire environment and all system properties can fill the buffer and block forever.
The fault could be on the supervisor side or on the launcher side. Since this was a production supervisor, we could only start with the launcher. The stacks point at two println calls, so we deleted them and also disabled log output along the relevant execution path.

public static void main(String[] args) throws Exception {
    System.out.println("Enviroment:" + System.getenv());
    System.out.println("Properties:" + System.getProperties());

We rebuilt the jstorm-core jar with these changes and swapped it in on the affected machines. Problem solved.

——————————————the fact-checking dividing line————————————————————————
The first lines of a healthy worker log are shown below; whenever these appear, the problem is gone for that worker. We then turned to comparing the supervisor logs.

[INFO 2018-09-06 10:49:24 c.a.j.d.w.Worker:412 main] !!!!!!!!!!!!!!!!!!!!!!!!!!!
[INFO 2018-09-06 10:49:24 c.a.j.d.w.Worker:413 main] Begin to start worker:demo-120-1536202211 0ce5d63c-7a60-4422-9137-01df91d192d8 6803 6f0986e6-1c44-423e-806e-3f3034056f62 /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo-120-1536202211/stormjar.jar & 
[INFO 2018-09-06 10:49:24 c.a.j.d.w.Worker:414 main] !!!!!!!!!!!!!!!!!!!!!!!!!!!
[INFO 2018-09-06 10:49:24 c.a.j.d.w.Worker:314 main] Begin to execute ps -Af 

With the logging left in place (not commented out), the supervisor log looks like this:

[INFO 2018-09-06 10:53:47 c.a.j.c.DistributedClusterState:86 main-EventThread] Received event SyncConnected:NodeChildrenChanged:/assignments
[INFO 2018-09-06 10:53:47 c.a.j.d.s.SyncSupervisorEvent:476 EventManagerImp] Downloading code for storm id demo1-121-1536202427 from /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data
[INFO 2018-09-06 10:53:47 b.s.s.a.ThriftClient:108 EventManagerImp] masterHost:10.237.65.11:8627
[INFO 2018-09-06 10:53:47 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo1-121-1536202427-stormjar.jar from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:53:49 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo1-121-1536202427-stormcode.ser from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:53:49 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo1-121-1536202427-stormconf.ser from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncSupervisorEvent:494 EventManagerImp] Finished downloading code for storm id demo1-121-1536202427 from /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncProcessEvent:1004 EventManagerImp] Launching worker with assiangment LocalAssignment[topologyId=demo1-121-1536202427,topologyName=demo1,taskIds=[1, 2, 3, 4, 5, 6],mem=2147483648,cpu=1,jvm=,timeStamp=1536202427683] for the supervisor 294af591-c9d2-4021-b776-12c83ac081b3 on port 6803 with id 7b5935aa-5739-4c8e-a321-96629a08ebb2
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncProcessEvent:480 EventManagerImp] Remove jars [slf4j-log4j, log4j]
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncProcessEvent:480 EventManagerImp] Remove jars [slf4j-log4j, log4j]
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncProcessEvent:875 EventManagerImp] Launching worker with command: java -Xms256m -Xmx256m -Djstorm.log.dir=/opt/applog/MskyLog/jstorm -Dlogfile.name=demo1-worker-6803.log -Dtopology.name=demo1 -Dlogback.configurationFile=/opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf/jstorm.logback.xml -cp /opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/netty-3.9.0.Final.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/clojure-1.6.0.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/jstorm-core-2.2.1.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-core-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/slf4j-api-1.7.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/log4j-over-slf4j-1.6.6.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/servlet-api-2.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-classic-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/commons-logging-1.1.3.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/rocksdbjni-4.3.1.jar::/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo1-121-1536202427/stormjar.jar com.alibaba.jstorm.utils.ProcessLauncher java -server -Xms2147483648 -Xmx2147483648 -Xmn1073741824 -XX:PermSize=67108864 -XX:MaxPermSize=134217728 -XX:ParallelGCThreads=4 -XX:SurvivorRatio=4 -XX:+UseConcMarkSweepGC -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 -XX:CMSFullGCsBeforeCompaction=5 -XX:+HeapDumpOnOutOfMemoryError -XX:+UseCMSCompactAtFullCollection -XX:CMSMaxAbortablePrecleanTime=5000 -Xloggc:/opt/applog/MskyLog/jstorm/demo1/demo1-worker-6803-gc.log -verbose:gc -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:HeapDumpPath=/opt/applog/MskyLog/jstorm/demo1/java-demo1-121-1536202427-20180906105349.hprof -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Djstorm.home=/DATA/app/mskyprocess/jstorm/jstorm-2.2.1 -Djstorm.log.dir=/opt/applog/MskyLog/jstorm -Dlogfile.name=demo1-worker-6803.log 
-Dtopology.name=demo1 -Dlogback.configurationFile=/opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf/jstorm.logback.xml -cp /opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/netty-3.9.0.Final.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/clojure-1.6.0.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/jstorm-core-2.2.1.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-core-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/slf4j-api-1.7.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/log4j-over-slf4j-1.6.6.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/servlet-api-2.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-classic-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/commons-logging-1.1.3.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/rocksdbjni-4.3.1.jar::/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo1-121-1536202427/stormjar.jar com.alibaba.jstorm.daemon.worker.Worker demo1-121-1536202427 294af591-c9d2-4021-b776-12c83ac081b3 6803 7b5935aa-5739-4c8e-a321-96629a08ebb2 /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo1-121-1536202427/stormjar.jar
[INFO 2018-09-06 10:53:49 c.a.j.d.s.SyncProcessEvent:876 EventManagerImp] Environment:{LD_LIBRARY_PATH=/usr/local/lib:/opt/local/lib:/usr/lib, jstorm.home=/DATA/app/mskyprocess/jstorm/jstorm-2.2.1, jstorm.workerId=7b5935aa-5739-4c8e-a321-96629a08ebb2, REDIRECT=false}
[INFO 2018-09-06 10:53:55 c.a.j.d.s.SyncProcessEvent:270 EventManagerImp] Successfully start worker 7b5935aa-5739-4c8e-a321-96629a08ebb2

After commenting out the relevant logging, the supervisor log looks like this:

[INFO 2018-09-06 10:49:21 c.a.j.c.DistributedClusterState:86 main-EventThread] Received event SyncConnected:NodeChildrenChanged:/assignments
[INFO 2018-09-06 10:49:21 c.a.j.d.s.SyncSupervisorEvent:476 EventManagerImp] Downloading code for storm id demo-120-1536202211 from /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data
[INFO 2018-09-06 10:49:21 b.s.s.a.ThriftClient:108 EventManagerImp] masterHost:10.237.65.11:8627
[INFO 2018-09-06 10:49:21 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo-120-1536202211-stormjar.jar from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:49:23 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo-120-1536202211-stormcode.ser from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:49:23 c.a.j.b.BlobStoreUtils:353 EventManagerImp] download blob demo-120-1536202211-stormconf.ser from nimbus 10.237.65.11:8627
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncSupervisorEvent:494 EventManagerImp] Finished downloading code for storm id demo-120-1536202211 from /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncProcessEvent:1004 EventManagerImp] Launching worker with assiangment LocalAssignment[topologyId=demo-120-1536202211,topologyName=demo,taskIds=[1, 2, 3, 4, 5, 6],mem=2147483648,cpu=1,jvm=,timeStamp=1536202212233] for the supervisor 0ce5d63c-7a60-4422-9137-01df91d192d8 on port 6803 with id 6f0986e6-1c44-423e-806e-3f3034056f62
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncProcessEvent:480 EventManagerImp] Remove jars [slf4j-log4j, log4j]
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncProcessEvent:480 EventManagerImp] Remove jars [slf4j-log4j, log4j]
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncProcessEvent:875 EventManagerImp] Launching worker with command: java -Xms256m -Xmx256m -Djstorm.log.dir=/opt/applog/MskyLog/jstorm -Dlogfile.name=demo-worker-6803.log -Dtopology.name=demo -Dlogback.configurationFile=/opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf/jstorm.logback.xml -cp /opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/netty-3.9.0.Final.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/clojure-1.6.0.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/jstorm-core-2.2.1.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-core-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/slf4j-api-1.7.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/log4j-over-slf4j-1.6.6.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/servlet-api-2.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-classic-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/commons-logging-1.1.3.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/rocksdbjni-4.3.1.jar::/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo-120-1536202211/stormjar.jar com.alibaba.jstorm.utils.ProcessLauncher java -server -Xms2147483648 -Xmx2147483648 -Xmn1073741824 -XX:PermSize=67108864 -XX:MaxPermSize=134217728 -XX:ParallelGCThreads=4 -XX:SurvivorRatio=4 -XX:+UseConcMarkSweepGC -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 -XX:CMSFullGCsBeforeCompaction=5 -XX:+HeapDumpOnOutOfMemoryError -XX:+UseCMSCompactAtFullCollection -XX:CMSMaxAbortablePrecleanTime=5000 -Xloggc:/opt/applog/MskyLog/jstorm/demo/demo-worker-6803-gc.log -verbose:gc -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:HeapDumpPath=/opt/applog/MskyLog/jstorm/demo/java-demo-120-1536202211-20180906104923.hprof -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Djstorm.home=/DATA/app/mskyprocess/jstorm/jstorm-2.2.1 -Djstorm.log.dir=/opt/applog/MskyLog/jstorm -Dlogfile.name=demo-worker-6803.log 
-Dtopology.name=demo -Dlogback.configurationFile=/opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf/jstorm.logback.xml -cp /opt/app/mskyprocess/jstorm/jstorm-2.2.1/conf:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/netty-3.9.0.Final.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/clojure-1.6.0.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/jstorm-core-2.2.1.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-core-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/slf4j-api-1.7.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/log4j-over-slf4j-1.6.6.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/servlet-api-2.5.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/logback-classic-1.0.13.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/commons-logging-1.1.3.jar:/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/lib/rocksdbjni-4.3.1.jar::/DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo-120-1536202211/stormjar.jar com.alibaba.jstorm.daemon.worker.Worker demo-120-1536202211 0ce5d63c-7a60-4422-9137-01df91d192d8 6803 6f0986e6-1c44-423e-806e-3f3034056f62 /DATA/app/mskyprocess/jstorm/jstorm-2.2.1/data/supervisor/stormdist/demo-120-1536202211/stormjar.jar
[INFO 2018-09-06 10:49:23 c.a.j.d.s.SyncProcessEvent:876 EventManagerImp] Environment:{LD_LIBRARY_PATH=/usr/local/lib:/opt/local/lib:/usr/lib, jstorm.home=/DATA/app/mskyprocess/jstorm/jstorm-2.2.1, jstorm.workerId=6f0986e6-1c44-423e-806e-3f3034056f62, REDIRECT=false}
[INFO 2018-09-06 10:49:24 c.a.j.d.s.SyncProcessEvent:347 EventManagerImp] Worker:6f0986e6-1c44-423e-806e-3f3034056f62 state:notStarted WorkerHeartbeat:null assignedTasks:{6800=LocalAssignment[topologyId=adsb-87-1535695254,topologyName=adsb,taskIds=[64, 194, 6, 134, 264, 74, 204, 14, 144, 274, 84, 214, 24, 154, 94, 224, 34, 164, 104, 234, 44, 174, 114, 244, 54, 184, 124, 254],mem=2147483648,cpu=1,jvm=,timeStamp=1535695254744], 6801=LocalAssignment[topologyId=flightest-36-1534411874,topologyName=flightest,taskIds=[64, 33, 4, 37, 8, 41, 45, 14, 16, 49, 21, 53, 25, 57, 60, 29],mem=2147483648,cpu=1,jvm=,timeStamp=1534411874959], 6802=LocalAssignment[topologyId=umepsr_lk-88-1535892467,topologyName=umepsr_lk,taskIds=[33, 4, 36, 37, 7, 40, 10, 43, 13, 46, 16, 19, 24, 27, 30],mem=2147483648,cpu=1,jvm=,timeStamp=1535892467962], 6803=LocalAssignment[topologyId=demo-120-1536202211,topologyName=demo,taskIds=[1, 2, 3, 4, 5, 6],mem=2147483648,cpu=1,jvm=,timeStamp=1536202212233], 6804=LocalAssignment[topologyId=CancelTopo-135-1526890047,topologyName=CancelTopo,taskIds=[33, 5, 37, 9, 41, 13, 45, 17, 49, 21, 25, 29],mem=2147483648,cpu=1,jvm=,timeStamp=1533621992358], 6805=LocalAssignment[topologyId=UmeEid-116-1536133052,topologyName=UmeEid,taskIds=[33, 66, 3, 38, 70, 7, 73, 43, 12, 78, 48, 17, 83, 53, 23, 28, 60],mem=2147483648,cpu=1,jvm=,timeStamp=1536133052881], 6806=LocalAssignment[topologyId=waypoint-115-1536132773,topologyName=waypoint,taskIds=[16, 4, 20, 8, 24, 11],mem=2147483648,cpu=1,jvm=,timeStamp=1536132774009]} at supervisor time-secs 1536202163
[INFO 2018-09-06 10:49:24 c.a.j.d.s.SyncProcessEvent:264 EventManagerImp] 6f0986e6-1c44-423e-806e-3f3034056f62 still hasn't started
[INFO 2018-09-06 10:49:34 c.a.j.d.s.SyncProcessEvent:270 EventManagerImp] Successfully start worker 6f0986e6-1c44-423e-806e-3f3034056f62


Reposted from blog.csdn.net/define_us/article/details/82379558