I hit the error below when running Spark from spark-shell and pyspark, so here is how I fixed it.
'Unsupported class file major version 55'
Environment: Ubuntu 18.04, Spark 2.4.3
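For context, the class file major version maps directly to a Java release: 52 is Java 8 and 55 is Java 11, so the message means Java 11 bytecode is showing up somewhere Spark 2.4 cannot handle it. If you have a compiled .class file lying around, javap will tell you which release it was built for (SomeClass here is just a placeholder):
$ javap -verbose SomeClass | grep "major version"
  major version: 55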
The cause seems to be that Java 11 is being used. Check which version of Java you are on, and switch to another one like this:
$ sudo update-alternatives --config java
There are 3 choices for the alternative java (providing /usr/bin/java).
  Selection    Path                                             Priority   Status
------------------------------------------------------------
  0            /usr/lib/jvm/java-11-openjdk-amd64/bin/java      1101       auto mode
  1            /usr/lib/jvm/java-11-openjdk-amd64/bin/java      1101       manual mode
* 2            /usr/lib/jvm/java-11-oracle/bin/java             1091       manual mode
  3            /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java   1081       manual mode
Press <enter> to keep the current choice[*], or type selection number: 3
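For reference (this is the expected result, not output I captured at the time): after choosing selection 3, the alternatives symlink itself should point at Java 8, which you can confirm by resolving /usr/bin/java:
$ readlink -f /usr/bin/java
/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java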
I expected this would let me use Java 8, but as shown below it did not.
$ java --version
openjdk 11.0.2 2019-01-15
OpenJDK Runtime Environment 18.9 (build 11.0.2+9)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.2+9, mixed mode)
So first, let's find out which java is actually being used:
$ which java
/home/ksn/.sdkman/candidates/java/current/bin/java
This shows that Java (along with Kotlin) had been installed using SDKMAN.
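This also explains why the alternatives switch had no visible effect: SDKMAN prepends its candidate bin directories to PATH when its init script is sourced from ~/.bashrc, so its java shadows /usr/bin/java. The PATH should look roughly like this (exact contents will differ):
$ echo $PATH
/home/ksn/.sdkman/candidates/java/current/bin:...:/usr/bin:...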
At this point I learned that SDKMAN can also install Spark, so I decided to give that a try:
$ sdk install spark
With that installed, I tried launching $ spark-shell to check that it worked, but I still got the same 'Unsupported class file major version 55' error.
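To double-check that it was still SDKMAN's Java 11 being picked up, you can also ask SDKMAN which version is active (sdk current is a standard SDKMAN command; output should look roughly like this):
$ sdk current java
Using java version 11.0.2-open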
$ sdk list java
Using this, I checked whether Java 8 could be installed through SDKMAN, but it turned out that Java 8 and older were not available there. So I concluded the problem could not be solved from the SDKMAN side.
Therefore, I uninstalled the Java that SDKMAN had installed:
$ sdk uninstall java 11.0.2-open
With that,
$ java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (build 1.8.0_212-8u212-b03-0ubuntu1.18.04.1-b03)
OpenJDK 64-Bit Server VM (build 25.212-b03, mixed mode)
$ which java
/usr/bin/java
Now the java under /usr/bin/ is the one being picked up.
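As an aside, rather than relying on whichever java happens to be first on PATH, Spark can also be pointed at a specific JVM by exporting JAVA_HOME in conf/spark-env.sh inside the Spark installation directory. A minimal sketch, assuming the Ubuntu OpenJDK 8 package path:
$ cat conf/spark-env.sh
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64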
Now let's try launching the Spark installed with SDKMAN again:
$ spark-shell
...
scala> val textFile = spark.read.text("README.md")
textFile: org.apache.spark.sql.DataFrame = [value: string]
scala> textFile.count()
res0: Long = 109
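Since the same error originally appeared from pyspark as well, the equivalent check there (same README.md, so the same count) is:
$ pyspark
...
>>> spark.read.text("README.md").count()
109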
And the error is gone.
As far as I can tell, it is still too early to use Java 11 with Spark and Hadoop; if you want Java 11, wait for Spark 3.x. This also seems likely to solve the problem I hit when setting up Hadoop the other day.