This repository was archived by the owner on Dec 20, 2022. It is now read-only.

Commit 4545823

Spark-2.4 support

In Spark 2.4, mapOutputTracker.getMapSizesByExecutorId returns an Iterator rather than a Seq.

Change-Id: I4a65a6e66af34792e29ed758fe81281df2cb908b
Parent: 616834d

3 files changed: 12 additions, 4 deletions

* README.md
* pom.xml
* src/main/scala/org/apache/spark/shuffle/rdma/RdmaShuffleReader.scala

README.md

Lines changed: 4 additions & 3 deletions

@@ -34,7 +34,7 @@ Mellanox ConnectX-5 network adapter with 100GbE RoCE fabric, connected with a Me
 For more information on configuration, performance tuning and troubleshooting, please visit the [SparkRDMA GitHub Wiki](https://github.com/Mellanox/SparkRDMA/wiki)
 
 ## Runtime requirements
-* Apache Spark 2.0.0/2.1.0/2.2.0/2.3.0
+* Apache Spark 2.0.0/2.1.0/2.2.0/2.3.0/2.4.0
 * Java 8
 * An RDMA-supported network, e.g. RoCE or Infiniband
 
@@ -49,14 +49,15 @@ The pre-built binaries are packed as an archive that contains the following file
 * spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
 * spark-rdma-3.1-for-spark-2.2.0-jar-with-dependencies.jar
 * spark-rdma-3.1-for-spark-2.3.0-jar-with-dependencies.jar
+* spark-rdma-3.1-for-spark-2.4.0-jar-with-dependencies.jar
 * libdisni.so
 
 libdisni.so **must** be in `java.library.path` on every Spark Master and Worker (usually in /usr/lib)
 
 ### Configuration
 
 Provide Spark the location of the SparkRDMA plugin jars by using the extraClassPath option. For standalone mode this can
-be added to either spark-defaults.conf or any runtime configuration file. For client mode this **must** be added to spark-defaults.conf. For Spark 2.0.0 (replace with 2.1.0, 2.2.0 or 2.3.0 according to your Spark version):
+be added to either spark-defaults.conf or any runtime configuration file. For client mode this **must** be added to spark-defaults.conf. For Spark 2.0.0 (replace with 2.1.0, 2.2.0, 2.3.0 or 2.4.0 according to your Spark version):
 ```
 spark.driver.extraClassPath /path/to/SparkRDMA/target/spark-rdma-3.1-for-spark-2.0.0-jar-with-dependencies.jar
 spark.executor.extraClassPath /path/to/SparkRDMA/target/spark-rdma-3.1-for-spark-2.0.0-jar-with-dependencies.jar
@@ -76,7 +77,7 @@ Building the SparkRDMA plugin requires [Apache Maven](http://maven.apache.org/)
 
 1. Obtain a clone of [SparkRDMA](https://github.com/Mellanox/SparkRDMA)
 
-2. Build the plugin for your Spark version (either 2.0.0, 2.1.0, 2.2.0 or 2.3.0), e.g. for Spark 2.0.0:
+2. Build the plugin for your Spark version (either 2.0.0, 2.1.0, 2.2.0, 2.3.0 or 2.4.0), e.g. for Spark 2.0.0:
 ```
 mvn -DskipTests clean package -Pspark-2.0.0
 ```
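
For a Spark 2.4.0 deployment, applying the README's own substitution rule gives the following spark-defaults.conf entries (the /path/to/SparkRDMA placeholder is carried over from the example above):

```
spark.driver.extraClassPath /path/to/SparkRDMA/target/spark-rdma-3.1-for-spark-2.4.0-jar-with-dependencies.jar
spark.executor.extraClassPath /path/to/SparkRDMA/target/spark-rdma-3.1-for-spark-2.4.0-jar-with-dependencies.jar
```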

pom.xml

Lines changed: 6 additions & 0 deletions

@@ -61,6 +61,12 @@
         <spark.version>2.3.0</spark.version>
       </properties>
     </profile>
+    <profile>
+      <id>spark-2.4.0</id>
+      <properties>
+        <spark.version>2.4.0</spark.version>
+      </properties>
+    </profile>
   </profiles>
 
   <dependencies>
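
With the new profile in place, the Spark 2.4.0 build follows the same Maven invocation pattern the README documents for the other profiles:

```
mvn -DskipTests clean package -Pspark-2.4.0
```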

src/main/scala/org/apache/spark/shuffle/rdma/RdmaShuffleReader.scala

Lines changed: 2 additions & 1 deletion

@@ -46,7 +46,8 @@ private[spark] class RdmaShuffleReader[K, C](
       startPartition,
       endPartition,
       handle.shuffleId,
-      mapOutputTracker.getMapSizesByExecutorId(handle.shuffleId, startPartition, endPartition))
+      mapOutputTracker.getMapSizesByExecutorId(handle.shuffleId,
+        startPartition, endPartition).toSeq)
 
     val dummyShuffleBlockId = ShuffleBlockId(0, 0, 0)
     // Wrap the streams for compression based on configuration
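
The fix works because .toSeq type-checks against both return types: on pre-2.4 Spark, where getMapSizesByExecutorId already returns a Seq, it is effectively a no-op, while on 2.4 it materializes the single-pass Iterator. A minimal, self-contained Scala sketch of this idea, using hypothetical stand-in types rather than the real BlockManagerId/BlockId classes:

```
// Sketch with stand-in types: the real Spark signature deals in
// (BlockManagerId, Seq[(BlockId, Long)]) tuples.
object ToSeqCompat {
  type BlocksByAddress = (String, Seq[(String, Long)])

  // Shape of the tracker call before Spark 2.4: already a Seq.
  def getMapSizesPre24: Seq[BlocksByAddress] =
    Seq(("executor-1", Seq(("shuffle_0_0_0", 1024L))))

  // Shape of the tracker call in Spark 2.4: a single-pass Iterator.
  def getMapSizes24: Iterator[BlocksByAddress] =
    Iterator(("executor-1", Seq(("shuffle_0_0_0", 1024L))))

  def main(args: Array[String]): Unit = {
    // Appending .toSeq compiles against either version of the API:
    val fromSeq: Seq[BlocksByAddress] = getMapSizesPre24.toSeq  // no-op
    val fromIter: Seq[BlocksByAddress] = getMapSizes24.toSeq    // materializes
    println(fromSeq == fromIter) // true: same contents either way
  }
}
```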
