
Commit 185ffd4

xiejiajun authored and zjffdu committed
[ZEPPELIN-4962]. Support for manually specifying the Java version of Spark Interpreter Scala REPL and fix the CI failure due to low Scala version
### What is this PR for?
- fix the [CI failure](https://travis-ci.org/github/apache/zeppelin/builds/709913046) due to [PR-3852](apache#3852)

### What type of PR is it?
[Bug Fix]

### Todos
* [ ] - Task

### What is the Jira issue?
* [ZEPPELIN-4962](https://issues.apache.org/jira/projects/ZEPPELIN/issues/ZEPPELIN-4962)

### How should this be tested?
* CI test

### Screenshots (if appropriate)

### Questions:
* Does the licenses files need update? NO
* Is there breaking changes for older versions? NO
* Does this needs documentation? Yes

Author: xiejiajun <[email protected]>
Author: xie-jia-jun <[email protected]>
Author: JakeXie <[email protected]>

Closes apache#3860 from xiejiajun/ZEPPELIN-4962 and squashes the following commits:

9128c9b [JakeXie] spark.repl.target docs update
ad4c0e3 [xiejiajun] Clear irrelevant code
a12d3a9 [xiejiajun] Support for manually specifying the Java version of Spark Interpreter Scala REPL and fix the CI failure due to low Scala version
ab2b191 [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
5569788 [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
0a9af6c [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
be36b37 [xiejiajun] Resolve merge conflicts with the Apache master branch
1335d55 [xiejiajun] Merge remote-tracking branch 'origin/master'
fc59f57 [JakeXie] Merge pull request apache#4 from apache/master
9cc70fe [xiejiajun] Merge remote-tracking branch 'origin/master'
6ef9b23 [xie-jia-jun] Merge pull request apache#3 from apache/master
45af87a [xiejiajun] added timeout for getting Thrift client to avoid situations where the interpreter may not be restarted when the interpreter process exits unexpectedly
f149c3b [xie-jia-jun] Merge pull request #1 from apache/master
5d4b645 [xie-jia-jun] Support OSSConfigStorage of Aliyun
dbb6639 [xie-jia-jun] Add Aliyun OSS SDK
bb47849 [xie-jia-jun] Support S3ConfigStorage of AWS
1 parent 7d18dd7 commit 185ffd4

File tree

3 files changed: +17 -1 lines changed


docs/interpreter/spark.md

Lines changed: 12 additions & 1 deletion
@@ -200,10 +200,21 @@ You can also set other Spark properties which are not listed in the table. For a
     (ex: http://{{PORT}}-{{SERVICE_NAME}}.{{SERVICE_DOMAIN}})
   </td>
 </tr>
-  <td>spark.webui.yarn.useProxy</td>
+<tr>
+  <td>spark.webui.yarn.useProxy</td>
   <td>false</td>
   <td>whether use yarn proxy url as spark weburl, e.g. http://localhost:8088/proxy/application_1583396598068_0004</td>
 </tr>
+<tr>
+  <td>spark.repl.target</td>
+  <td>jvm-1.6</td>
+  <td>
+    Manually specify the Java version targeted by the Spark Interpreter Scala REPL. Available options:<br/>
+    scala-compile v2.10.7 to v2.11.12 supports "jvm-1.5, jvm-1.6, jvm-1.7 and jvm-1.8", and the default value is jvm-1.6.<br/>
+    scala-compile v2.10.1 to v2.10.6 supports "jvm-1.5, jvm-1.6 and jvm-1.7", and the default value is jvm-1.6.<br/>
+    scala-compile v2.12.x defaults to jvm-1.8, and only supports jvm-1.8.
+  </td>
+</tr>
 </table>
 
 Without any configuration, Spark interpreter works out of box in local mode. But if you want to connect to your Spark cluster, you'll need to follow below two simple steps.
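The values documented above mirror the choices exposed by the Scala compiler's own `-target` setting, so what `spark.repl.target` may be set to ultimately depends on the `scala-compiler` version on the interpreter's classpath. A minimal sketch for checking those choices locally (the `ShowReplTargets` object is illustrative, not part of Zeppelin; it assumes `scala-compiler` is on the classpath, and the output varies with the Scala version):

import scala.tools.nsc.Settings

// Illustrative helper (not part of Zeppelin): print the -target choices of the
// scala-compiler on the classpath, which is what spark.repl.target must match.
object ShowReplTargets {
  def main(args: Array[String]): Unit = {
    val settings = new Settings()
    println(s"supported targets: ${settings.target.choices.mkString(", ")}")
    println(s"current default:   ${settings.target.value}")
  }
}

On a Scala 2.11.12 compiler, for example, this is expected to list jvm-1.5 through jvm-1.8 with jvm-1.6 selected by default, matching the table above.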

spark/scala-2.10/src/main/scala/org/apache/zeppelin/spark/SparkScala210Interpreter.scala

Lines changed: 3 additions & 0 deletions
@@ -67,10 +67,13 @@ class SparkScala210Interpreter(override val conf: SparkConf,
       sparkHttpServer = server
       conf.set("spark.repl.class.uri", uri)
     }
+    val target = conf.get("spark.repl.target", "jvm-1.6")
 
     val settings = new Settings()
     settings.embeddedDefaults(sparkInterpreterClassLoader)
     settings.usejavacp.value = true
+    settings.target.value = target
+
     this.userJars = getUserJars()
     LOGGER.info("UserJars: " + userJars.mkString(File.pathSeparator))
     settings.classpath.value = userJars.mkString(File.pathSeparator)

spark/scala-2.11/src/main/scala/org/apache/zeppelin/spark/SparkScala211Interpreter.scala

Lines changed: 2 additions & 0 deletions
@@ -66,12 +66,14 @@ class SparkScala211Interpreter(override val conf: SparkConf,
       sparkHttpServer = server
       conf.set("spark.repl.class.uri", uri)
     }
+    val target = conf.get("spark.repl.target", "jvm-1.6")
 
     val settings = new Settings()
     settings.processArguments(List("-Yrepl-class-based",
       "-Yrepl-outdir", s"${outputDir.getAbsolutePath}"), true)
     settings.embeddedDefaults(sparkInterpreterClassLoader)
     settings.usejavacp.value = true
+    settings.target.value = target
 
     this.userJars = getUserJars()
     LOGGER.info("UserJars: " + userJars.mkString(File.pathSeparator))
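Both interpreter changes apply the same two-line pattern: read the optional `spark.repl.target` property from the `SparkConf` (falling back to `jvm-1.6`) and pass it to the compiler `Settings` before the REPL is built. A condensed, self-contained sketch of that pattern, with the surrounding Zeppelin plumbing omitted (`configureReplTarget` and `ReplTargetSketch` are hypothetical names, not Zeppelin APIs):

import org.apache.spark.SparkConf
import scala.tools.nsc.Settings

object ReplTargetSketch {
  // Hypothetical helper mirroring the change in both interpreters:
  // honour spark.repl.target when set, otherwise keep the jvm-1.6 default.
  def configureReplTarget(conf: SparkConf, settings: Settings): Settings = {
    val target = conf.get("spark.repl.target", "jvm-1.6")
    settings.usejavacp.value = true
    settings.target.value = target // must be a choice supported by the Scala version in use
    settings
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().set("spark.repl.target", "jvm-1.8")
    val settings = configureReplTarget(conf, new Settings())
    println(s"REPL will target ${settings.target.value}")
  }
}

In Zeppelin itself the property would typically be set in the Spark interpreter settings (as the docs note, other `spark.*` properties can be set there and reach the `SparkConf`), so no code change is needed to switch the REPL target to jvm-1.8 where the Scala version allows it.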
