Spark 3.0 error: "java.lang.NoSuchFieldError: JAVA_9"

Spark 3.0 throws the following error when creating a SparkSession:

Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:447)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)
    at com.ppdai.etl.spark.ParserTest$.main(ParserTest.scala:10)
    at com.ppdai.etl.spark.ParserTest.main(ParserTest.scala)

The code that fails is in StorageUtils.scala: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala

private val bufferCleaner: DirectBuffer => Unit =
  if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
    val cleanerMethod =
      Utils.classForName("sun.misc.Unsafe").getMethod("invokeCleaner", classOf[ByteBuffer])
    val unsafeField = classOf[Unsafe].getDeclaredField("theUnsafe")
    unsafeField.setAccessible(true)
    ...
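The field the error complains about, JAVA_9, is an enum constant of org.apache.commons.lang3.JavaVersion, introduced in commons-lang3 3.5. If an older commons-lang3 is first on the classpath, initializing StorageUtils fails with exactly this NoSuchFieldError. The version test Spark runs here is roughly equivalent to this stdlib-only sketch (the class and method names below are mine, not Spark's):

```java
public class JavaVersionCheck {
    // Roughly what SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)
    // decides, reimplemented with only the JDK so it carries no
    // commons-lang3 dependency at all.
    static boolean isAtLeastJava9() {
        // "1.8" on Java 8 and earlier; "9", "11", "17", ... from Java 9 on
        String spec = System.getProperty("java.specification.version");
        if (spec.startsWith("1.")) {
            return false; // the "1.x" numbering scheme ended with Java 8
        }
        return Integer.parseInt(spec) >= 9;
    }

    public static void main(String[] args) {
        System.out.println("Running on Java 9+: " + isAtLeastJava9());
    }
}
```

In Spark itself the check is evaluated while the StorageUtils singleton is being initialized, which is why the bad commons-lang3 surfaces as a class-init error rather than a plain exception at the call site.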

The cause is a conflict among the Maven dependencies: the CDH Hadoop artifacts pull in an older commons-lang3 (earlier than 3.5) that does not define JAVA_9. In my project the hadoop-common and hadoop-mapreduce-client-core dependencies were the CDH versions:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <!--<version>2.6.0-cdh5.12.1</version>-->
        <version>3.0.0-cdh6.2.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <!--<version>2.6.0-cdh5.12.1</version>-->
        <version>3.0.0-cdh6.2.0</version>
        <scope>provided</scope>
    </dependency>

Switching them to the Apache versions resolves the error:

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>3.0.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>3.0.0</version>
    </dependency>
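If the CDH artifacts have to stay (for example, for cluster compatibility), an alternative worth trying is to pin commons-lang3 explicitly to the version Spark 3.0 builds against, which to the best of my knowledge is 3.9. Treat this as a sketch to adapt to your own pom:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Pin the commons-lang3 that defines JavaVersion.JAVA_9 (3.5+);
             3.9 is the version Spark 3.0 itself depends on. -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.9</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Running `mvn dependency:tree -Dincludes=org.apache.commons:commons-lang3` afterwards confirms which version actually wins resolution.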

 
