Comments (3)
Do you have logs from the NotebookApp? It sounds like something incorrect may be happening with how Apache Spark is started up from spylon-kernel.
```
[I 10:38:38.759 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[I 10:39:37.282 NotebookApp] Creating new notebook in
[I 10:39:41.453 NotebookApp] Kernel started: e634dfd9-9c9e-4024-94df-519fbfc874b5
[I 10:39:46.866 NotebookApp] Adapting to protocol v5.1 for kernel e634dfd9-9c9e-4024-94df-519fbfc874b5
[MetaKernelApp] ERROR | Exception in message handler:
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 235, in dispatch_shell
    handler(stream, idents, msg)
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 399, in execute_request
    user_expressions, allow_stdin)
  File "C:\ProgramData\Anaconda3\lib\site-packages\metakernel\_metakernel.py", line 357, in do_execute
    retval = self.do_execute_direct(code)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_kernel.py", line 141, in do_execute_direct
    res = self._scalamagic.eval(code.strip(), raw=False)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 155, in eval
    intp = self.get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 46, in get_scala_interpreter
    self.interp = get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 562, in get_scala_interpreter
    scala_intp = initialize_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 163, in initialize_scala_interpreter
    spark_session, spark_jvm_helpers, spark_jvm_proc = init_spark()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 78, in init_spark
    import pyspark.java_gateway
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 40, in <module>
    from pyspark.rdd import RDD, _load_from_socket, ignore_unicode_prefix
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\rdd.py", line 47, in <module>
    from pyspark.statcounter import StatCounter
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\statcounter.py", line 24, in <module>
    from numpy import maximum, minimum, sqrt
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\__init__.py", line 142, in <module>
    from . import add_newdocs
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\add_newdocs.py", line 13, in <module>
    from numpy.lib import add_newdoc
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\lib\__init__.py", line 8, in <module>
    from .type_check import *
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\lib\type_check.py", line 11, in <module>
    import numpy.core.numeric as nx
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\core\__init__.py", line 72, in <module>
    from numpy.testing.nosetester import _numpy_tester
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\testing\__init__.py", line 10, in <module>
    from unittest import TestCase
  File "C:\ProgramData\Anaconda3\lib\unittest\__init__.py", line 58, in <module>
    from .result import TestResult
  File "C:\ProgramData\Anaconda3\lib\unittest\result.py", line 7, in <module>
    from . import util
  File "C:\ProgramData\Anaconda3\lib\unittest\util.py", line 119, in <module>
    _Mismatch = namedtuple('Mismatch', 'actual expected value')
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'
[MetaKernelApp] ERROR | No such comm target registered: jupyter.widget.version
```
from spylon-kernel.
So this is probably due to an issue with pyspark 2.1.0 under Python 3.6. Try running it with Python 3.5.
from spylon-kernel.
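For background on why this fails: pyspark 2.1.0's serializers.py replaces collections.namedtuple with a patched copy (so that named tuples can be pickled), and it builds that copy from the original's code object via types.FunctionType. In Python 3.6, namedtuple's verbose and rename parameters became keyword-only and a module parameter was added; their defaults live in the function's __kwdefaults__ attribute, which that copying technique does not carry over, so every namedtuple call then fails with the TypeError seen in the traceback. A minimal sketch of the failure mode (the function names here are illustrative, not pyspark's actual code):

```python
import types

# A function with keyword-only defaults, shaped like namedtuple() in Python 3.6.
def original(name, *, verbose=False, rename=False, module=None):
    return (name, verbose, rename, module)

# Copy the function the way pyspark 2.1.0 copied namedtuple: the code object,
# globals, and positional defaults are carried over, but __kwdefaults__
# (the keyword-only defaults) is not.
broken_copy = types.FunctionType(
    original.__code__, original.__globals__,
    original.__name__, original.__defaults__, original.__closure__,
)

try:
    broken_copy("Point")
except TypeError as e:
    # missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'
    print(e)

# The repair is to copy the keyword-only defaults as well, which is what
# later pyspark releases do.
fixed_copy = types.FunctionType(
    original.__code__, original.__globals__,
    original.__name__, original.__defaults__, original.__closure__,
)
fixed_copy.__kwdefaults__ = original.__kwdefaults__
assert fixed_copy("Point") == ("Point", False, False, None)
```

So besides dropping back to Python 3.5, upgrading to a pyspark release that copies __kwdefaults__ should also resolve it.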
Related Issues (20)
- ExecutorClassLoader error in Spylon notebook
- How add additional jar files to SparkContext HOT 5
- [Request] provide example of Spark Yarn cluster connectivity (EMR) HOT 1
- Write NULL File to HDFS
- Graph Frames modules are missing
- Not able of import external packages HOT 6
- Question: How to gracefully stop execution in a cell?
- Does spylon-kernel support Spark 3.0? HOT 2
- spylon-kernel error : compilation: disabled (not enough contiguous free space left) HOT 1
- Unable to install spylon kernel HOT 2
- spylon launcher.packages inside kernel.json args
- s3 filesystem not found
- [BUG]: Spark submit fails: No such file or directory: '/opt/spark/python/pyspark/./bin/spark-submit'
- Cannot get Hive data HOT 1
- Run Scala cell on Jupyter notebook
- Cannot install spylon-kernel on Ubuntu 22 HOT 1
- Failed running `python -m spylon_kernel install`
- Unable to use existing spark server with spylon-kernel
- Using spylon-kernel with java?
- Outdated versioneer.py broken for Python 3.12