apache-spark formula seems to be broken #210638
That's not a path that has the install. I'd recommend setting these opt paths instead of Cellar, because they are stable between updates:

export JAVA_HOME="/opt/homebrew/opt/openjdk@17/libexec/openjdk.jdk/Contents/Home/"
export SPARK_HOME="/opt/homebrew/opt/apache-spark/"
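To see why the opt path is the stable one (a sketch, assuming a Homebrew-on-Apple-Silicon install under /opt/homebrew): the opt entry is a symlink that Homebrew repoints on every upgrade, while the Cellar path embeds a version number.

```shell
# /opt/homebrew/opt/<formula> is a symlink that Homebrew repoints at the
# currently installed keg on every upgrade, so it never goes stale.
ls -l /opt/homebrew/opt/apache-spark

# Its target is the versioned keg inside the Cellar; hardcoding that
# versioned path breaks as soon as the formula is upgraded.
readlink /opt/homebrew/opt/apache-spark
```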
Closing, as the formula is behaving as expected. Apache Spark will fail if you give it incorrect paths. The formula is set up to work out of the box, so if you want to use environment variables then it is up to you to make sure they are correct.
This is the path that is logged when you run the installation command, so that's not obvious, I think. Could the "stable path" be surfaced better?

I'm on an M2 Pro with Sequoia 15.3.1, if that helps. My setup works thanks to manually downloading, so I am trying to help the next people.
Incorrect:
Worked for me out of the box from a fresh Homebrew install, with no extra environment variables set.
Sorry, but you're actually confusing everyone by claiming you need to set additional environment variables in an unnecessary and incorrect way. If you had to set these variables for this (or any) formula to work, the necessary instructions would have been printed out in a Caveats section after installation.

If you're following online instructions to set these variables, note that they tell you to do that because you can put the binaries you download anywhere in your filesystem, which is probably not where the hardcoded defaults point.
Thank you for trying to reproduce.
The path shown is

but you used

export SPARK_HOME="/opt/homebrew/Cellar/apache-spark/"

Note the missing version directory.
Perhaps, but it's not really clear how/where/when to. If it's any easier, you could always do something like

export SPARK_HOME="$(brew --prefix apache-spark)"

and that always uses the stable path.
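Extending that idea to both variables, a hypothetical shell-profile snippet (assuming openjdk@17 and apache-spark are installed via Homebrew) could avoid versioned Cellar paths entirely:

```shell
# Hypothetical ~/.zprofile snippet: let brew resolve the stable opt paths
# instead of hardcoding versioned Cellar directories.
export JAVA_HOME="$(brew --prefix openjdk@17)/libexec/openjdk.jdk/Contents/Home"
export SPARK_HOME="$(brew --prefix apache-spark)"
export PATH="$SPARK_HOME/bin:$PATH"
```

`brew --prefix <formula>` prints the opt symlink for that formula, so these values stay valid across upgrades.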
❯ JAVA_HOME=/some/incorrect/path spark-shell --version
/opt/homebrew/Cellar/apache-spark/3.5.5/libexec/bin/spark-class: line 71: /some/incorrect/path/bin/java: No such file or directory
/opt/homebrew/Cellar/apache-spark/3.5.5/libexec/bin/spark-class: line 97: CMD: bad array subscript
head: illegal line count -- -1
❯ SPARK_HOME=/some/incorrect/path spark-shell --version
/opt/homebrew/Cellar/apache-spark/3.5.5/libexec/bin/spark-shell: line 60: /some/incorrect/path/bin/spark-submit: No such file or directory

Spark's own commands (e.g. spark-shell) fail loudly when given incorrect paths. I guess the question is if there are common scenarios for Homebrew's users that need these variables set, the way the dotnet formula does:

homebrew-core/Formula/d/dotnet.rb, lines 98 to 101 in af00d1e
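A hypothetical way to check the out-of-the-box behaviour is to run the wrapper with both variables removed from its environment, so the formula's own defaults are used. For Spark that would be `env -u JAVA_HOME -u SPARK_HOME spark-shell --version`; the `env -u` mechanism itself can be demonstrated without Spark installed:

```shell
# `env -u NAME cmd` runs cmd with NAME removed from the environment,
# even if it is exported in the current shell.
export JAVA_HOME=/some/incorrect/path
env -u JAVA_HOME sh -c 'echo "JAVA_HOME=${JAVA_HOME:-<unset>}"'
# → JAVA_HOME=<unset>
```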
The path you are looking for is
brew gist-logs <formula> link OR brew config AND brew doctor output

Verification
- My brew doctor output says Your system is ready to brew. and I am still able to reproduce my issue.
- I ran brew update and am still able to reproduce my issue.
- I ran brew doctor and that did not fix my problem.

What were you trying to do (and why)?
Install Apache Spark to run Apache Spark code locally.
What happened (include all command output)?
The brew command worked without errors, but the installation is not functional. All the Spark commands exit immediately.
For example, spark-shell exits immediately with no error message.

What did you expect to happen?
Spark commands should work.
Step-by-step reproduction instructions (by running brew commands)

brew install openjdk@17
brew install [email protected]
brew install apache-spark
export JAVA_HOME="/opt/homebrew/Cellar/openjdk@17/17.0.13/libexec/openjdk.jdk/Contents/Home/"
export SPARK_HOME="/opt/homebrew/Cellar/apache-spark/"
export PATH=$SPARK_HOME/bin:$PATH
I know the JDK and Scala are installed correctly, and the variables are correct, because everything works if I download the package from the official website (https://spark.apache.org/downloads.html) and point SPARK_HOME to that package. Consequently, I think something may be broken with the apache-spark formula.