Druid exception handling: Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType

Copyright notice: This is an original post by the author; please credit the source when reposting: https://blog.csdn.net/oDaiLiDong/article/details/83932759
2018-11-10T19:54:35,072 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job wikiticker-determine_partitions_hashed-Optional.of([2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z]) submitted, status available at: http://stone.lan:8088/proxy/application_1541847157114_0006/
2018-11-10T19:54:35,073 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Running job: job_1541847157114_0006
2018-11-10T19:54:41,142 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_1541847157114_0006 running in uber mode : false
2018-11-10T19:54:41,143 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job -  map 0% reduce 0%
2018-11-10T19:54:46,592 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_0, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2018-11-10T19:54:50,636 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_1, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
2018-11-10T19:54:56,679 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_2, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2018-11-10T19:55:02,714 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job -  map 100% reduce 100%
2018-11-10T19:55:03,728 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_1541847157114_0006 failed with state FAILED due to: Task failed task_1541847157114_0006_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

2018-11-10T19:55:03,834 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Counters: 16
   Job Counters 
      Failed map tasks=4
      Killed reduce tasks=1
      Launched map tasks=4
      Other local map tasks=3
      Data-local map tasks=1
      Total time spent by all maps in occupied slots (ms)=13567
      Total time spent by all reduces in occupied slots (ms)=0
      Total time spent by all map tasks (ms)=13567
      Total time spent by all reduce tasks (ms)=0
      Total vcore-milliseconds taken by all map tasks=13567
      Total vcore-milliseconds taken by all reduce tasks=0
      Total megabyte-milliseconds taken by all map tasks=13892608
      Total megabyte-milliseconds taken by all reduce tasks=0
   Map-Reduce Framework
      CPU time spent (ms)=0
      Physical memory (bytes) snapshot=0
      Virtual memory (bytes) snapshot=0
2018-11-10T19:55:03,836 ERROR [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job failed: job_1541847157114_0006
2018-11-10T19:55:03,836 INFO [task-runner-0-priority-0] io.druid.indexer.JobHelper - Deleting path[var/druid/hadoop-tmp/wikiticker/2018-11-10T115425.566Z_24d5a38f4b164508a3efbc594d03433e]
2018-11-10T19:55:03,884 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[AbstractTask{id='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', groupId='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', taskResource=TaskResource{availabilityGroup='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', requiredCapacity=1}, dataSource='wikiticker', context={}}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
   at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:222) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) [druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) [druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_171]
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_171]
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_171]
   at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
Caused by: java.lang.reflect.InvocationTargetException
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
   at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   ... 7 more
Caused by: io.druid.java.util.common.ISE: Job[class io.druid.indexer.DetermineHashedPartitionsJob] failed!
   at io.druid.indexer.JobHelper.runJobs(JobHelper.java:391) ~[druid-indexing-hadoop-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:325) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
   at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   ... 7 more
2018-11-10T19:55:03,895 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_wikiticker_2018-11-10T11:54:25.566Z] status changed to [FAILED].
2018-11-10T19:55:03,898 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_wikiticker_2018-11-10T11:54:25.566Z",
  "status" : "FAILED",
  "duration" : 33977
}

Solution:

The map tasks are failing with what is effectively a NoSuchMethodError on com.google.inject.util.Types.collectionOf: the Hadoop cluster's classpath ships an older version of the Guice library whose collectionOf signature differs from the one Druid was compiled against, so the cluster's copy shadows Druid's at runtime. Adding "mapreduce.job.user.classpath.first": "true" to jobProperties makes MapReduce load the job's own jars (Druid's) ahead of the cluster's, which resolves the conflict.

"jobProperties": {
  "mapreduce.job.user.classpath.first": "true"
}
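For context, in a standard Druid Hadoop batch ingestion spec the jobProperties map lives under tuningConfig of the index_hadoop task. A minimal sketch of where it goes (the dataSource name is taken from the wikiticker log above; the elided sections depend on your data):

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "wikiticker"
    },
    "ioConfig": {
      "type": "hadoop"
    },
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "mapreduce.job.user.classpath.first": "true"
      }
    }
  }
}
```

After updating the spec, resubmit the task; the DetermineHashedPartitionsJob should then run with Druid's jars taking precedence.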
