
Update flint version to 1.0.0 in main branch #1071


Merged · 1 commit · Mar 4, 2025
1 change: 1 addition & 0 deletions .github/pull_request_template.md
@@ -10,6 +10,7 @@ _List any issues this PR will resolve, e.g. Resolves [...]._
- [ ] Implemented tests for combination with other commands
- [ ] New added source code should include a copyright header
- [ ] Commits are signed per the DCO using `--signoff`
+- [ ] Add `backport 0.x` label if it is a stable change that won't break existing features

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check [here](https://github.com/opensearch-project/sql/blob/main/CONTRIBUTING.md#developer-certificate-of-origin).
4 changes: 2 additions & 2 deletions README.md
@@ -65,7 +65,7 @@ sbt clean standaloneCosmetic/publishM2
```
then add org.opensearch:opensearch-spark-standalone_2.12 when running a Spark application, for example:
```
-bin/spark-shell --packages "org.opensearch:opensearch-spark-standalone_2.12:0.7.0-SNAPSHOT" \
+bin/spark-shell --packages "org.opensearch:opensearch-spark-standalone_2.12:1.0.0-SNAPSHOT" \
--conf "spark.sql.extensions=org.opensearch.flint.spark.FlintSparkExtensions" \
--conf "spark.sql.catalog.dev=org.apache.spark.opensearch.catalog.OpenSearchCatalog"
```
@@ -81,7 +81,7 @@ sbt clean sparkPPLCosmetic/publishM2
Then add org.opensearch:opensearch-spark-ppl_2.12 when running a Spark application, for example:

```
-bin/spark-shell --packages "org.opensearch:opensearch-spark-ppl_2.12:0.7.0-SNAPSHOT" \
+bin/spark-shell --packages "org.opensearch:opensearch-spark-ppl_2.12:1.0.0-SNAPSHOT" \
--conf "spark.sql.extensions=org.opensearch.flint.spark.FlintPPLSparkExtensions" \
--conf "spark.sql.catalog.dev=org.apache.spark.opensearch.catalog.OpenSearchCatalog"

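The README changes above only swap the version segment of the Maven coordinate passed to `--packages`. A minimal shell sketch of how that coordinate is composed (the group, artifact, and version values are taken from this diff; the decomposition itself is illustrative):

```shell
# Sketch: the Maven coordinate used with spark-shell --packages after this bump.
group="org.opensearch"
artifact="opensearch-spark-standalone_2.12"   # Scala 2.12 binary suffix
version="1.0.0-SNAPSHOT"
coordinate="${group}:${artifact}:${version}"
echo "$coordinate"   # → org.opensearch:opensearch-spark-standalone_2.12:1.0.0-SNAPSHOT
```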
2 changes: 1 addition & 1 deletion build.sbt
@@ -21,7 +21,7 @@ val sparkMinorVersion = sparkVersion.split("\\.").take(2).mkString(".")

ThisBuild / organization := "org.opensearch"

-ThisBuild / version := "0.7.0-SNAPSHOT"
+ThisBuild / version := "1.0.0-SNAPSHOT"

ThisBuild / scalaVersion := scala212

2 changes: 1 addition & 1 deletion docker/apache-spark-sample/.env
@@ -1,4 +1,4 @@
MASTER_UI_PORT=8080
MASTER_PORT=7077
UI_PORT=4040
-PPL_JAR=../../ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-0.7.0-SNAPSHOT.jar
+PPL_JAR=../../ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-1.0.0-SNAPSHOT.jar
6 changes: 3 additions & 3 deletions docker/integ-test/.env
@@ -5,9 +5,9 @@ MASTER_UI_PORT=8080
MASTER_PORT=7077
UI_PORT=4040
SPARK_CONNECT_PORT=15002
-PPL_JAR=./ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-0.7.0-SNAPSHOT.jar
-FLINT_JAR=./flint-spark-integration/target/scala-2.12/flint-spark-integration-assembly-0.7.0-SNAPSHOT.jar
-SQL_APP_JAR=./spark-sql-application/target/scala-2.12/sql-job-assembly-0.7.0-SNAPSHOT.jar
+PPL_JAR=./ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-1.0.0-SNAPSHOT.jar
+FLINT_JAR=./flint-spark-integration/target/scala-2.12/flint-spark-integration-assembly-1.0.0-SNAPSHOT.jar
+SQL_APP_JAR=./spark-sql-application/target/scala-2.12/sql-job-assembly-1.0.0-SNAPSHOT.jar
OPENSEARCH_NODE_MEMORY=512m
OPENSEARCH_ADMIN_PASSWORD=C0rrecthorsebatterystaple.
OPENSEARCH_PORT=9200
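The jar paths in these `.env` files embed the sbt build version, which is why each version bump has to touch every Docker sample in lockstep. A shell sketch of the naming convention (module names and version are taken from this diff; deriving the names this way is an illustrative assumption, not project tooling):

```shell
# Sketch: assembly jar names in the .env files follow the sbt build version.
version="1.0.0-SNAPSHOT"
scala_binary="2.12"
ppl_jar="ppl-spark-integration-assembly-${version}.jar"
flint_jar="flint-spark-integration-assembly-${version}.jar"
echo "PPL_JAR=./ppl-spark-integration/target/scala-${scala_binary}/${ppl_jar}"
echo "FLINT_JAR=./flint-spark-integration/target/scala-${scala_binary}/${flint_jar}"
```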
2 changes: 1 addition & 1 deletion docker/spark-emr-sample/.env
@@ -1 +1 @@
-PPL_JAR=../../ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-0.7.0-SNAPSHOT.jar
+PPL_JAR=../../ppl-spark-integration/target/scala-2.12/ppl-spark-integration-assembly-1.0.0-SNAPSHOT.jar
6 changes: 3 additions & 3 deletions docs/index.md
@@ -60,7 +60,7 @@ Currently, Flint metadata is only static configuration without version control a

```json
{
-  "version": "0.7.0",
+  "version": "1.0.0",
"name": "...",
"kind": "skipping",
"source": "...",
@@ -711,7 +711,7 @@ For now, only single or conjunct conditions (conditions connected by AND) in WHE
### AWS EMR Spark Integration - Using execution role
Flint uses [DefaultAWSCredentialsProviderChain](https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/auth/DefaultAWSCredentialsProviderChain.html). When running on EMR Spark, Flint uses the execution role credentials
```
---conf spark.jars.packages=org.opensearch:opensearch-spark-standalone_2.12:0.7.0-SNAPSHOT \
+--conf spark.jars.packages=org.opensearch:opensearch-spark-standalone_2.12:1.0.0-SNAPSHOT \
--conf spark.jars.repositories=https://aws.oss.sonatype.org/content/repositories/snapshots \
--conf spark.emr-serverless.driverEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64 \
--conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64 \
@@ -753,7 +753,7 @@ Flint use [DefaultAWSCredentialsProviderChain](https://docs.aws.amazon.com/AWSJa
```
3. Set the spark.datasource.flint.customAWSCredentialsProvider property to com.amazonaws.emr.AssumeRoleAWSCredentialsProvider. Set the environment variable ASSUME_ROLE_CREDENTIALS_ROLE_ARN to the ARN of CrossAccountRoleB.
```
---conf spark.jars.packages=org.opensearch:opensearch-spark-standalone_2.12:0.7.0-SNAPSHOT \
+--conf spark.jars.packages=org.opensearch:opensearch-spark-standalone_2.12:1.0.0-SNAPSHOT \
--conf spark.jars.repositories=https://aws.oss.sonatype.org/content/repositories/snapshots \
--conf spark.emr-serverless.driverEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64 \
--conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64 \
4 changes: 2 additions & 2 deletions docs/ppl-lang/PPL-on-Spark.md
@@ -34,7 +34,7 @@ sbt clean sparkPPLCosmetic/publishM2
```
then add org.opensearch:opensearch-spark_2.12 when running a Spark application, for example:
```
-bin/spark-shell --packages "org.opensearch:opensearch-spark-ppl_2.12:0.7.0-SNAPSHOT"
+bin/spark-shell --packages "org.opensearch:opensearch-spark-ppl_2.12:1.0.0-SNAPSHOT"
```

### PPL Extension Usage
@@ -46,7 +46,7 @@ spark-sql --conf "spark.sql.extensions=org.opensearch.flint.spark.FlintPPLSparkE
```

### Running With both Flint & PPL Extensions
-In order to make use of both flint and ppl extension, one can simply add both jars (`org.opensearch:opensearch-spark-ppl_2.12:0.7.0-SNAPSHOT`,`org.opensearch:opensearch-spark_2.12:0.7.0-SNAPSHOT`) to the cluster's
+In order to make use of both flint and ppl extension, one can simply add both jars (`org.opensearch:opensearch-spark-ppl_2.12:1.0.0-SNAPSHOT`,`org.opensearch:opensearch-spark_2.12:1.0.0-SNAPSHOT`) to the cluster's
classpath.

Next, configure both extensions:
@@ -21,6 +21,7 @@ object FlintVersion {
val V_0_5_0: FlintVersion = FlintVersion("0.5.0")
val V_0_6_0: FlintVersion = FlintVersion("0.6.0")
val V_0_7_0: FlintVersion = FlintVersion("0.7.0")
+  val V_1_0_0: FlintVersion = FlintVersion("1.0.0")

-  def current(): FlintVersion = V_0_7_0
+  def current(): FlintVersion = V_1_0_0
}
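Bumping `FlintVersion.current()` is what propagates the new version into the Flint metadata shown earlier in the docs/index.md hunk. A hedged sketch of the observable effect (the metadata JSON shape is copied from this PR's docs snippet; that `current()` is what populates the `version` field is an assumption based on the diff):

```shell
# Sketch (assumed behavior): metadata written by the new build should
# report "version": "1.0.0", matching FlintVersion.current().
metadata='{"version": "1.0.0", "name": "...", "kind": "skipping"}'
echo "$metadata" | grep -o '"version": "[0-9.]*"'   # → "version": "1.0.0"
```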