This repository has been archived by the owner on Jan 17, 2019. It is now read-only.

Docker failed to launch spark Worker #39

Open
zengqicheng opened this issue Feb 9, 2017 · 1 comment

Comments

@zengqicheng

starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark--org.apache.spark.deploy.master.Master-1-sandbox.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-sandbox.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: at java.lang.ClassLoader.loadClass(libgcj.so.10)
localhost: at gnu.java.lang.MainThread.run(libgcj.so.10)
localhost: full log in /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-sandbox.out
about to fork child process, waiting until server is ready for connections.

@bhlx3lyx7
Contributor

Actually, you have started it successfully; just wait a few minutes for Tomcat to start up, and then you can follow the next steps.
The failure to launch the Spark worker comes from the script starting the Spark cluster manually: in this simple Docker image there is only one master node and no worker nodes. To avoid this error we have modified the script inside the Docker image and rebuilt it, so if you pull the image again and run it, this failure will no longer appear.
Thanks for your question, and we hope you enjoy it.
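For reference, a minimal sketch of the kind of script change described above, assuming the image previously used Spark's start-all.sh (which also tries to launch workers over SSH on localhost) and that Spark lives under /usr/local/spark as in the log paths; the actual script and paths in the image may differ:

# Before (hypothetical): starts the master and then tries to launch a worker on localhost,
# which fails in this single-node image.
# /usr/local/spark/sbin/start-all.sh

# After: start only the master, since this image has no worker node.
/usr/local/spark/sbin/start-master.sh

With only start-master.sh invoked, the "failed to launch org.apache.spark.deploy.worker.Worker" message should no longer appear, because no worker startup is attempted.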
