魂落忘川犹在川 2022-01-11 11:04

Spark 3.0 fails at startup

Running Spark code from IDEA on Windows throws an error.

  • spark-shell works fine from the command line
  • environment variables are configured

Log output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/01/11 10:41:38 INFO SparkContext: Running Spark version 3.0.1
22/01/11 10:41:38 INFO ResourceUtils: ==============================================================
22/01/11 10:41:38 INFO ResourceUtils: Resources for spark.driver:

22/01/11 10:41:38 INFO ResourceUtils: ==============================================================
22/01/11 10:41:38 INFO SparkContext: Submitted application: HelloWorld
22/01/11 10:41:38 INFO SecurityManager: Changing view acls to: jiantang.y
22/01/11 10:41:38 INFO SecurityManager: Changing modify acls to: jiantang.y
22/01/11 10:41:38 INFO SecurityManager: Changing view acls groups to: 
22/01/11 10:41:38 INFO SecurityManager: Changing modify acls groups to: 
22/01/11 10:41:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jiantang.y); groups with view permissions: Set(); users  with modify permissions: Set(jiantang.y); groups with modify permissions: Set()
22/01/11 10:41:39 INFO Utils: Successfully started service 'sparkDriver' on port 60984.
22/01/11 10:41:39 INFO SparkEnv: Registering MapOutputTracker
22/01/11 10:41:39 INFO SparkEnv: Registering BlockManagerMaster
22/01/11 10:41:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/01/11 10:41:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
java.lang.NoSuchFieldError: JAVA_9
  at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
  at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
  at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
  at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
  at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:272)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:447)
  ... 28 elided
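
This NoSuchFieldError usually points at a dependency conflict rather than at the Spark code itself: StorageUtils in Spark 3.0 references JavaVersion.JAVA_9 via commons-lang3's SystemUtils, and the JAVA_9 field only exists in commons-lang3 3.5 and later, while Spark 3.0.x is built against 3.9. If an older commons-lang3 (pulled in transitively, e.g. by a Hadoop or Hive dependency) wins on the classpath, class initialization fails exactly like this; spark-shell does not hit it because it runs against the jars shipped with the Spark distribution, not the project classpath IDEA builds. A minimal sketch of a fix, assuming an sbt project (the spark-core coordinates match the log above; the rest is illustrative):

// build.sbt -- a sketch, assuming an sbt project; the commons-lang3
// version below is the one Spark 3.0.x itself depends on.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.1",
  // Pin commons-lang3 explicitly so an older transitive copy cannot win.
  "org.apache.commons" % "commons-lang3" % "3.9"
)

// Force the version even if another dependency drags in an older one.
dependencyOverrides += "org.apache.commons" % "commons-lang3" % "3.9"

After changing the build, reimport the project in IDEA so the module classpath is regenerated; seeing a single commons-lang3 entry under External Libraries is a quick sanity check.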

Source code

import org.apache.spark.{SparkConf, SparkContext}

object HelloWorld1 {

  def main(args: Array[String]): Unit = {
    // Run locally on a single thread; the app name matches the one in the log above.
    val conf = new SparkConf().setMaster("local").setAppName("HelloWorld")

    val sc = new SparkContext(conf)

    // Distribute a small in-memory list as an RDD and print each element.
    val helloWorld = sc.parallelize(List("Hello,World!", "Hello,Spark!", "Hello,BigData!"))

    helloWorld.foreach(println)

    sc.stop()
  }
}
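
If the build file looks right but the error persists, one way to see which commons-lang3 jar the driver actually loaded is to ask the class itself (a diagnostic sketch; the printed location is whatever happens to be on the run configuration's classpath):

// Print the jar that JavaVersion was loaded from. If its version is
// below 3.5 (Spark 3.0.x expects 3.9), the NoSuchFieldError above is expected.
println(classOf[org.apache.commons.lang3.JavaVersion]
  .getProtectionDomain.getCodeSource.getLocation)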