Note: no Hadoop daemons (of any version) were running during this test.
spark-3.5.4-bin-hadoop3
[root@master spark]# IPYTHON=1 ./bin/pyspark
Error in pyspark startup:
IPYTHON and IPYTHON_OPTS are removed in Spark 2.0+. Remove these from the environment and set PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS instead.
[root@master spark]# PYSPARK_DRIVER_PYTHON=ipython bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06)
Type "copyright", "credits" or "license" for more information.
IPython 3.2.3 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
/usr/local/src/spark/python/pyspark/shell.py in <module>()
     23
     24 import atexit
---> 25 import builtins
     26 import os
     27 import platform

ImportError: No module named builtins

In [1]:
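The ImportError is expected rather than a misconfiguration: Spark 3.x dropped Python 2 support, and `builtins` is the Python 3 name of the built-in namespace module, so `bin/pyspark` from Spark 3.5.4 dies on the very first Python-3-only import when driven by the system Python 2.7. A minimal sketch of the version difference (the Python 2 fallback shown is the conventional compatibility shim, not anything Spark itself does):

```python
# Python 3 exposes the built-in namespace as `builtins`;
# Python 2 called the same module `__builtin__`, which is why
# Spark 3.x's `import builtins` fails immediately under Python 2.7.
try:
    import builtins          # succeeds on Python 3
except ImportError:
    import __builtin__ as builtins  # Python 2 shim (illustrative only)

print(builtins.max(1, 2))
```

The practical fix for Spark 3.x is to point `PYSPARK_PYTHON` and `PYSPARK_DRIVER_PYTHON` at a Python 3 interpreter instead of the system Python 2.7.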
spark-2.4.3-bin-hadoop2.7
[root@master spark2.x]# PYSPARK_DRIVER_PYTHON=ipython bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06)
Type "copyright", "credits" or "license" for more information.
IPython 3.2.3 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
25/02/02 11:17:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
(the WARN line above repeats 16 times in total)
25/02/02 11:17:15 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址 (Cannot assign requested address): Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
25/02/02 11:17:15 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:238)
java.lang.Thread.run(Thread.java:748)
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
(the WARN line above repeats 16 times in total)
25/02/02 11:17:15 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址 (Cannot assign requested address): Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
/usr/local/src/spark2.x/python/pyspark/shell.py:45: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "/usr/local/src/spark2.x/python/pyspark/shell.py", line 41, in <module>
    spark = SparkSession._create_shell_session()
  File "/usr/local/src/spark2.x/python/pyspark/sql/session.py", line 583, in _create_shell_session
    return SparkSession.builder.getOrCreate()
  File "/usr/local/src/spark2.x/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 367, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 136, in __init__
    conf, jsc, profiler_cls)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 198, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 306, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/usr/local/src/spark2.x/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1525, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/usr/local/src/spark2.x/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: 无法指定被请求的地址 (Cannot assign requested address): Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)

An exception has occurred, use %tb to see the full traceback.
SystemExit: 1

In [1]:
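"Cannot assign requested address" from the driver's bind attempt usually means the local hostname resolves to an IP that no local interface actually owns (a stale /etc/hosts entry, or the machine's address changed), so Spark retries 16 random ports against an unbindable address and gives up. A quick diagnostic, independent of Spark, is to check what the hostname resolves to:

```python
import socket

# Resolve the machine's own hostname the same way the Spark driver does
# when picking a bind address.
hostname = socket.gethostname()
try:
    addr = socket.gethostbyname(hostname)
    # If `addr` is not configured on any local interface, the driver
    # cannot bind to it and fails after its 16 retries.
    print("%s resolves to %s" % (hostname, addr))
except socket.gaierror:
    print("%s does not resolve at all; check /etc/hosts" % hostname)
```

If the resolved address is wrong or missing, either fix the /etc/hosts mapping, or bypass resolution entirely by setting `SPARK_LOCAL_IP` (or `spark.driver.bindAddress`, as the error message itself suggests) to a reachable local address.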
spark-1.6.3-bin-hadoop2.3
[root@master spark1.x]# IPYTHON=1 bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06)
Type "copyright", "credits" or "license" for more information.
IPython 3.2.3 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
25/02/02 11:21:04 INFO SparkContext: Running Spark version 1.6.3
25/02/02 11:21:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/02/02 11:21:05 INFO SecurityManager: Changing view acls to: root
25/02/02 11:21:05 INFO SecurityManager: Changing modify acls to: root
25/02/02 11:21:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
(the WARN line above repeats 16 times in total)
25/02/02 11:21:05 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址 (Cannot assign requested address): Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:748)
25/02/02 11:21:05 INFO SparkContext: Successfully stopped SparkContext
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
/usr/local/src/spark1.x/python/pyspark/shell.py in <module>()
     41     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
     42
---> 43 sc = SparkContext(pyFiles=add_files)
     44 atexit.register(lambda: sc.stop())
     45

/usr/local/src/spark1.x/python/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    113 try:
    114     self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 115                   conf, jsc, profiler_cls)
    116 except:
    117     # If an error occurs, clean up in order to allow future SparkContext creation:

/usr/local/src/spark1.x/python/pyspark/context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
    170
    171 # Create the Java SparkContext through Py4J
--> 172 self._jsc = jsc or self._initialize_context(self._conf._jconf)
    173
    174 # Create a single Accumulator in Java that we'll send all our updates through;

/usr/local/src/spark1.x/python/pyspark/context.py in _initialize_context(self, jconf)
    233     Initialize SparkContext in function to allow subclass specific initialization
    234     """
--> 235     return self._jvm.JavaSparkContext(jconf)
    236
    237 @classmethod

/usr/local/src/spark1.x/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1062 answer = self._gateway_client.send_command(command)
   1063 return_value = get_return_value(
-> 1064     answer, self._gateway_client, None, self._fqn)
   1065
   1066 for temp_arg in temp_args:

/usr/local/src/spark1.x/python/lib/py4j-0.9-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    306 raise Py4JJavaError(
    307     "An error occurred while calling {0}{1}{2}.\n".
--> 308     format(target_id, ".", name), value)
    309 else:
    310 raise Py4JError(

<type 'str'>: (<type 'exceptions.UnicodeEncodeError'>, UnicodeEncodeError('ascii', u"An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.\n: java.net.BindException: \u65e0\u6cd5\u6307\u5b9a\u88ab\u8bf7\u6c42\u7684\u5730\u5740: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.\n\tat sun.nio.ch.Net.bind0(Native Method)\n\tat sun.nio.ch.Net.bind(Net.java:433)\n\tat sun.nio.ch.Net.bind(Net.java:425)\n\tat sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)\n\tat sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)\n\tat io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)\n\tat io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)\n\tat io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)\n\tat io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)\n\tat io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)\n\tat io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)\n\tat io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)\n\tat io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)\n\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)\n\tat java.lang.Thread.run(Thread.java:748)\n", 107, 117, 'ordinal not in range(128)'))

In [1]:
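The trailing UnicodeEncodeError here is a secondary, Python-2-only failure that masks the real BindException: the Chinese-locale JVM message (无法指定被请求的地址, "Cannot assign requested address") cannot survive Python 2's implicit ASCII coercion when py4j builds the exception string. The mechanism can be reproduced in any Python:

```python
# The JVM reported its BindException in the Chinese locale; forcing
# that text through the ASCII codec is what produced
# UnicodeEncodeError('ascii', ..., 'ordinal not in range(128)').
msg = u"\u65e0\u6cd5\u6307\u5b9a\u88ab\u8bf7\u6c42\u7684\u5730\u5740"  # "Cannot assign requested address"
try:
    msg.encode("ascii")
    print("encoded fine")
except UnicodeEncodeError as e:
    print("ascii codec failed:", e.reason)  # -> ordinal not in range(128)
```

One way to sidestep the masking (an assumption, not something tested above) is to make the JVM emit English messages, e.g. by exporting `JAVA_TOOL_OPTIONS="-Duser.language=en"` before launching pyspark; the underlying bind failure of course still needs the address fix.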