Spark job fails when executed on yarn-cluster

2018-04-16 15:40:13

A Spark program that runs successfully in local (single-machine) mode fails on YARN. The exception is:

14/08/14 02:05:42 INFO DAGScheduler: Completed ResultTask(2, 0)
14/08/14 02:05:42 INFO DAGScheduler: Stage 2 (saveAsTextFile at FileUtil.scala:114) finished in 0.179 s
14/08/14 02:05:42 INFO SparkContext: Job finished: saveAsTextFile at FileUtil.scala:114, took 0.331739293 s
14/08/14 02:05:42 INFO TaskSetManager: Finished TID 2 in 184 ms on localhost (progress: 1/1)
14/08/14 02:05:42 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool 
14/08/14 02:05:42 INFO ApplicationMaster: finishApplicationMaster with SUCCEEDED
Exception in thread "main" java.lang.AssertionError: assertion failed
    at scala.Predef$.assert(Predef.scala:165)
    at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkContextInitialized(ApplicationMaster.scala:222)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:111)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:469)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:468)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
14/08/14 02:05:44 INFO ApplicationMaster: AppMaster received a signal.
14/08/14 02:05:44 INFO ApplicationMaster: Deleting staging directory .sparkStaging/application_1408004797389_0007

Judging from the log, the job itself finished successfully, but the ApplicationMaster apparently never received the result it was waiting for from YARN: the assertion fails in waitForSparkContextInitialized.

Debugging revealed the following problem:

spark-submit --class org.andy.hadoop.ETL --master yarn-cluster  ../lib/rdbms-0.0.1-SNAPSHOT-jar-with-dependencies.jar /dest/ETL2

The job was submitted in yarn-cluster mode, but the code initialized the context as:

var conf = new SparkConf().setAppName("testFilter").setMaster("local")
var sc = new SparkContext(conf)

Because the SparkContext was initialized with a local master, the YARN ApplicationMaster never saw it register, and the assertion fired. After the fix:

var conf = new SparkConf().setAppName("testFilter").setMaster("yarn-cluster")
var sc = new SparkContext(conf)

It now runs successfully!
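As a side note, a more portable pattern is to leave the master out of the code entirely and let spark-submit's --master flag decide, so the same jar runs both locally and on YARN without recompiling. A minimal sketch (the object name ETL mirrors the class used in the submit command above; the job body is omitted):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ETL {
  def main(args: Array[String]): Unit = {
    // No setMaster here: the master comes from spark-submit's --master flag,
    // so the same build works with --master local[*] or --master yarn-cluster.
    val conf = new SparkConf().setAppName("testFilter")
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}
```

With this layout, switching between a local test run and a cluster run is purely a matter of the submit command, e.g. `--master local[*]` versus `--master yarn-cluster`, and a hard-coded master can no longer conflict with the deploy mode.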
