Cannot connect to Sparkmagic kernels
See original GitHub issue. Discussed in https://github.com/microsoft/vscode-jupyter/discussions/11125
<div type='discussions-op-text'>Originally posted by jvaesteves, August 16, 2022. Hello, I've been trying to get VS Code to pick up the Sparkmagic kernels that I installed in a venv using Poetry, so I can connect to an EMR instance via Livy. However, when listing the kernels in the notebook interface, all that appears are the Python versions and venvs from my computer. I tried the same setup in PyCharm Pro and it works well.
Setup:
- VSCode version: 1.70.1
- Jupyter extension version: v2022.7.1102252217
- Python version: tried 3.10.5 and 3.7.13
- Poetry version: 1.1.14
- Packages: sparkmagic==0.20.0
Steps to reproduce:
SPARKMAGIC_LOCATION=$(pip show sparkmagic | grep Location | cut -d" " -f2)
jupyter nbextension enable --py --sys-prefix widgetsnbextension
jupyter-kernelspec install --user $SPARKMAGIC_LOCATION/sparkmagic/kernels/sparkkernel
jupyter-kernelspec install --user $SPARKMAGIC_LOCATION/sparkmagic/kernels/pysparkkernel
jupyter-kernelspec install --user $SPARKMAGIC_LOCATION/sparkmagic/kernels/sparkrkernel
jupyter serverextension enable --py sparkmagic
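The `SPARKMAGIC_LOCATION` lookup above shells out to `pip show` and `grep`; the same directory can be resolved from Python with the stdlib `importlib` machinery. A minimal sketch (the `package_location` helper is illustrative, and the stdlib `email` package stands in for sparkmagic, which may not be installed where this runs):

```python
import importlib.util
from pathlib import Path

def package_location(name: str) -> Path:
    """Return the directory containing the named package
    (what `pip show <name>` reports as Location)."""
    spec = importlib.util.find_spec(name)
    if spec is None or spec.origin is None:
        raise ModuleNotFoundError(f"package {name!r} not found")
    # origin is .../Location/<name>/__init__.py, so go up two levels
    return Path(spec.origin).parent.parent

# With sparkmagic installed, the kernelspec dirs would be, e.g.:
#   package_location("sparkmagic") / "sparkmagic/kernels/pysparkkernel"
# Demonstrated here with the stdlib `email` package as a stand-in:
print(package_location("email"))
```

After running the `jupyter-kernelspec install` commands, `jupyter kernelspec list` should show the three Sparkmagic kernels alongside the default `python3` kernel; if they are missing there, the problem is the installation rather than VS Code's kernel discovery.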
What is expected
For the PySpark, Spark, and SparkR kernels to appear in the VS Code kernel list.
What is happening
Only local Python interpreters and venvs are listed in VS Code's kernel picker; the installed Sparkmagic kernels do not appear.
</div>
Issue Analytics
- State:
- Created a year ago
- Comments: 13 (7 by maintainers)
@DonJayamanne Thanks a lot for the help. With this change, PySpark now works correctly! 😃
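The comment does not quote the change itself. In similar reports, the fix is to point each installed kernelspec's `argv` at the absolute Python executable of the venv that has sparkmagic, so VS Code can launch the kernel regardless of which interpreter is selected. A hypothetical edited `kernel.json` for the PySpark kernelspec (the venv path is illustrative, not taken from this issue):

```json
{
  "argv": [
    "/home/user/.cache/pypoetry/virtualenvs/myproject-py3.10/bin/python",
    "-m", "sparkmagic.kernels.pysparkkernel.pysparkkernel",
    "-f", "{connection_file}"
  ],
  "display_name": "PySpark",
  "language": "python"
}
```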
Closing this issue as it's been over 4 weeks since the information was requested. We'll be happy to reopen the issue when the requested information has been provided.