Spark: launching the Python shell (pyspark) under IPython from bash

Note: no Hadoop daemons of any version were running during these tests.

spark-3.5.4-bin-hadoop3

[root@master spark]# IPYTHON=1 ./bin/pyspark
Error in pyspark startup:
IPYTHON and IPYTHON_OPTS are removed in Spark 2.0+. Remove these from the environment and set PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS instead.
[root@master spark]# PYSPARK_DRIVER_PYTHON=ipython bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06) 
Type "copyright", "credits" or "license" for more information.

IPython 3.2.3 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
/usr/local/src/spark/python/pyspark/shell.py in <module>()
     23 
     24 import atexit
---> 25 import builtins
     26 import os
     27 import platform

ImportError: No module named builtins

In [1]: 
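The `ImportError` above is a Python-version mismatch, not a Spark bug: `pyspark/shell.py` in Spark 3.x imports `builtins`, a module that only exists in Python 3, while the driver here is the system Python 2.7 (Spark 3.x dropped Python 2 support entirely). A small sketch of why the import fails under 2.7:

```python
import sys

# `builtins` exists only in Python 3; Python 2 ships the same module
# under the name `__builtin__`, which is why Spark 3.x's shell.py
# cannot start on a Python 2.7 driver.
if sys.version_info[0] >= 3:
    import builtins
    mod_name = builtins.__name__
else:
    import __builtin__ as builtins
    mod_name = builtins.__name__

print(mod_name)
```

The practical fix for spark-3.5.4 is to run both driver and workers on Python 3, e.g. `PYSPARK_PYTHON=python3 PYSPARK_DRIVER_PYTHON=ipython3 bin/pyspark` (assuming `python3` and a Python 3 `ipython3` are installed on the box).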

spark-2.4.3-bin-hadoop2.7

[root@master spark2.x]# PYSPARK_DRIVER_PYTHON=ipython bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06) 
Type "copyright", "credits" or "license" for more information.

IPython 3.2.3 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
25/02/02 11:17:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
25/02/02 11:17:15 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor).  This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:238)
java.lang.Thread.run(Thread.java:748)
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
25/02/02 11:17:15 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
/usr/local/src/spark2.x/python/pyspark/shell.py:45: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "/usr/local/src/spark2.x/python/pyspark/shell.py", line 41, in <module>
    spark = SparkSession._create_shell_session()
  File "/usr/local/src/spark2.x/python/pyspark/sql/session.py", line 583, in _create_shell_session
    return SparkSession.builder.getOrCreate()
  File "/usr/local/src/spark2.x/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 367, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 136, in __init__
    conf, jsc, profiler_cls)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 198, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/local/src/spark2.x/python/pyspark/context.py", line 306, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/usr/local/src/spark2.x/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1525, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/usr/local/src/spark2.x/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: \u65e0\u6cd5\u6307\u5b9a\u88ab\u8bf7\u6c42\u7684\u5730\u5740: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)

An exception has occurred, use %tb to see the full traceback.

SystemExit: 1

In [1]: 
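The Chinese text in the exception, 无法指定被请求的地址, is the localized Linux `EADDRNOTAVAIL` message ("Cannot assign requested address"): the driver tried to bind an address that no local network interface actually owns, typically because the machine's hostname resolves to a stale IP (e.g. an outdated `/etc/hosts` entry). A minimal sketch reproducing the same errno — `203.0.113.1` is a documentation-only (TEST-NET-3) address, assumed not to be configured on any local interface:

```python
import errno
import socket

# Binding to an IP that no local interface owns fails with EADDRNOTAVAIL --
# the same "Cannot assign requested address" the Spark driver hits on each
# of its 16 retries.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    sock.bind(("203.0.113.1", 0))  # TEST-NET-3 address: not local here
    err = None
except OSError as exc:
    err = exc.errno
finally:
    sock.close()

print(err == errno.EADDRNOTAVAIL)
```

The usual fixes are to correct the hostname entry in `/etc/hosts`, or to bind explicitly before launching the shell, e.g. `export SPARK_LOCAL_IP=127.0.0.1` or `bin/pyspark --conf spark.driver.bindAddress=127.0.0.1`, exactly as the error message suggests.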

spark-1.6.3-bin-hadoop2.3

[root@master spark1.x]# IPYTHON=1 bin/pyspark
Python 2.7.5 (default, Nov 14 2023, 16:14:06) 
Type "copyright", "credits" or "license" for more information.

IPython 3.2.3 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
25/02/02 11:21:04 INFO SparkContext: Running Spark version 1.6.3
25/02/02 11:21:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/02/02 11:21:05 INFO SecurityManager: Changing view acls to: root
25/02/02 11:21:05 INFO SecurityManager: Changing modify acls to: root
25/02/02 11:21:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
25/02/02 11:21:05 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: 无法指定被请求的地址: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:748)
25/02/02 11:21:05 INFO SparkContext: Successfully stopped SparkContext
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
/usr/local/src/spark1.x/python/pyspark/shell.py in <module>()
     41     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
     42 
---> 43 sc = SparkContext(pyFiles=add_files)
     44 atexit.register(lambda: sc.stop())
     45 

/usr/local/src/spark1.x/python/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    113         try:
    114             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 115                           conf, jsc, profiler_cls)
    116         except:
    117             # If an error occurs, clean up in order to allow future SparkContext creation:

/usr/local/src/spark1.x/python/pyspark/context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
    170 
    171         # Create the Java SparkContext through Py4J
--> 172         self._jsc = jsc or self._initialize_context(self._conf._jconf)
    173 
    174         # Create a single Accumulator in Java that we'll send all our updates through;

/usr/local/src/spark1.x/python/pyspark/context.py in _initialize_context(self, jconf)
    233         Initialize SparkContext in function to allow subclass specific initialization
    234         """
--> 235         return self._jvm.JavaSparkContext(jconf)
    236 
    237     @classmethod

/usr/local/src/spark1.x/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1062         answer = self._gateway_client.send_command(command)
   1063         return_value = get_return_value(
-> 1064             answer, self._gateway_client, None, self._fqn)
   1065 
   1066         for temp_arg in temp_args:

/usr/local/src/spark1.x/python/lib/py4j-0.9-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    306                 raise Py4JJavaError(
    307                     "An error occurred while calling {0}{1}{2}.\n".
--> 308                     format(target_id, ".", name), value)
    309             else:
    310                 raise Py4JError(

<type 'str'>: (<type 'exceptions.UnicodeEncodeError'>, UnicodeEncodeError('ascii', u"An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.\n: java.net.BindException: \u65e0\u6cd5\u6307\u5b9a\u88ab\u8bf7\u6c42\u7684\u5730\u5740: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.\n\tat sun.nio.ch.Net.bind0(Native Method)\n\tat sun.nio.ch.Net.bind(Net.java:433)\n\tat sun.nio.ch.Net.bind(Net.java:425)\n\tat sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)\n\tat sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)\n\tat io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)\n\tat io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)\n\tat io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)\n\tat io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)\n\tat io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)\n\tat io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)\n\tat io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)\n\tat io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)\n\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)\n\tat java.lang.Thread.run(Thread.java:748)\n", 107, 117, 'ordinal not in range(128)'))

In [1]: 
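Spark 1.6.3 still honors `IPYTHON=1` and hits the same bind failure, but under Python 2 even the error reporting breaks: while Py4J formats the Java exception, the default ASCII codec cannot encode the localized Chinese strerror, so the real `Py4JJavaError` is swallowed by the `UnicodeEncodeError` shown above. A sketch of just that codec failure (runs under Python 3 as well, since `encode("ascii")` is explicit there):

```python
# Python 2's default ASCII codec cannot encode the localized strerror
# "无法指定被请求的地址" ("Cannot assign requested address"), so formatting
# the Py4JJavaError itself raises UnicodeEncodeError and masks the real error.
msg = u"无法指定被请求的地址"
try:
    msg.encode("ascii")
    codec_error = None
except UnicodeEncodeError as exc:
    codec_error = exc

print(codec_error is not None)
```

So the underlying problem is identical across all three versions (hostname resolving to a non-bindable address); only the error presentation differs. Running the OS with an English locale, or fixing the bind address as described earlier, avoids both layers of failure.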
