@@ -16,7 +16,17 @@ Examples are available [here](./examples).
``` bash
sbt package # Creates a minimal jar.
sbt assembly # Creates the full assembly with all dependencies, notably hadoop cloud.
- ```
+ ```
+
+ ## Formatting Code
+
+ Formatting is done with `scalafmt`. It can be triggered with the following commands.
+
+ ``` bash
+ sbt scalafmtAll # Format the source code.
+ sbt scalafmtSbt # Format the sbt build definition.
+ ```
+
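The same formatting can also be verified without rewriting any files, which is useful in CI. A minimal sketch, assuming the standard sbt-scalafmt plugin (the usual provider of the `scalafmtAll` and `scalafmtSbt` tasks above):

``` bash
# Check-only variants of the formatting tasks above; they fail if any file
# is not formatted, which makes them suitable for CI. Assumes sbt-scalafmt.
sbt scalafmtCheckAll  # Check all Scala sources without modifying them.
sbt scalafmtSbtCheck  # Check the sbt build definition without modifying it.
```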
## Required configuration
@@ -86,14 +96,14 @@ to Java > 11:
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
- --add-opens=java.base/java.util=ALL-UNNAMED
- --add-opens=java.base/java.util.concurrent=ALL-UNNAMED
- --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
- --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
- --add-opens=java.base/sun.nio.cs=ALL-UNNAMED
+ --add-opens=java.base/java.util=ALL-UNNAMED
+ --add-opens=java.base/java.util.concurrent=ALL-UNNAMED
+ --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
+ --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
+ --add-opens=java.base/sun.nio.cs=ALL-UNNAMED
--add-opens=java.base/sun.security.action=ALL-UNNAMED
- --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
- --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
+ --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
+ --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
```
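These flags have to reach both the driver and the executor JVMs. A minimal sketch of one common way to pass them via spark-submit; the `JAVA_OPENS` variable, the truncated flag list, and the application jar are illustrative placeholders, not part of this README:

``` bash
# Sketch: forwarding the --add-opens flags above to driver and executors.
# JAVA_OPENS is an illustrative variable; fill it with the full list shown above.
JAVA_OPENS="--add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED"  # ...and the rest

spark-submit \
  --conf "spark.driver.extraJavaOptions=${JAVA_OPENS}" \
  --conf "spark.executor.extraJavaOptions=${JAVA_OPENS}" \
  your-application.jar  # placeholder
```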
## Usage
@@ -119,7 +129,7 @@ Add the following lines to your Spark configuration:
--conf spark.hadoop.fs.s3a.endpoint=S3A_ENDPOINT
--conf spark.hadoop.fs.s3a.path.style.access=true
--conf spark.hadoop.fs.s3a.fast.upload=true
-
+
--conf spark.shuffle.manager="org.apache.spark.shuffle.sort.S3ShuffleManager"
--conf spark.shuffle.sort.io.plugin.class="org.apache.spark.shuffle.S3ShuffleDataIO"
--conf spark.hadoop.fs.s3a.impl="org.apache.hadoop.fs.s3a.S3AFileSystem"
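Put together, these options can be supplied in a single spark-submit invocation. A minimal sketch; the assembly jar path and the application jar are illustrative placeholders, and `S3A_ENDPOINT` is the same placeholder used above. Depending on the deployment, the plugin jar may need to go on the driver and executor class paths instead of `--jars`.

``` bash
# Sketch: combining the shuffle-manager and S3A settings above in one invocation.
# Paths are placeholders; build the plugin jar with `sbt assembly`.
spark-submit \
  --jars /path/to/spark-s3-shuffle-assembly.jar \
  --conf spark.hadoop.fs.s3a.impl="org.apache.hadoop.fs.s3a.S3AFileSystem" \
  --conf spark.hadoop.fs.s3a.endpoint=S3A_ENDPOINT \
  --conf spark.hadoop.fs.s3a.path.style.access=true \
  --conf spark.hadoop.fs.s3a.fast.upload=true \
  --conf spark.shuffle.manager="org.apache.spark.shuffle.sort.S3ShuffleManager" \
  --conf spark.shuffle.sort.io.plugin.class="org.apache.spark.shuffle.S3ShuffleDataIO" \
  your-application.jar  # placeholder
```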