How to Analyze the Startup Log of a Spark Cluster

Published: 2021-12-17 09:53:23  Author: 柒染  Source: 亿速云

This article walks through how to analyze the startup log of a Spark cluster. The content is quite detailed; interested readers can use it as a reference and will hopefully find it helpful.

Created by Wang, Jerry, last modified on Aug 24, 2015

added by Jerry:…
/root/devExpert/spark-1.4.1/sbin/../conf – (Jerry: I haven't copied the template out into my own configuration file yet)
starting org.apache.spark.deploy.master.Master, logging to /root/devExpert/spark-1.4.1/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-NKGV50849583FV1.out
Jerry: location of the log file:

NKGV50849583FV1:~/devExpert/spark-1.4.1/logs # vi spark-root-org.apache.spark.deploy.master.Master-1-NKGV50849583FV1.out

added by Jerry: loading load-spark-env.sh !!!1
added by Jerry, number of Jars: 1
added by Jerry, launch_classpath: /root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar
added by Jerry, RUNNER: /usr/jdk1.7.0_79/bin/java
added by Jerry, printf argument list: org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
Jerry: these are the default values
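Jerry's note points at the defaults: in the Spark 1.x standalone scripts, the --ip, --port and --webui-port arguments can be overridden through environment variables that sbin/start-master.sh reads from conf/spark-env.sh. A minimal sketch, with the values shown being just the defaults visible in this log:

```shell
# conf/spark-env.sh -- sourced by the standalone scripts before the JVM starts
# (illustrative values, copied from the log above; adjust for your host)
SPARK_MASTER_IP=NKGV50849583FV1     # becomes --ip
SPARK_MASTER_PORT=7077              # becomes --port
SPARK_MASTER_WEBUI_PORT=8080        # becomes --webui-port
```

This also ties in with Jerry's earlier remark about the template: the scripts only pick these variables up once conf/spark-env.sh.template has been copied to conf/spark-env.sh.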

Spark Command: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/../conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
========================================
added by Jerry, I am in if-else branch: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/../conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/16 12:45:17 INFO Master: Registered signal handlers for [TERM, HUP, INT]
15/08/16 12:45:17 WARN Utils: Your hostname, NKGV50849583FV1 resolves to a loopback address: 127.0.0.1; using 10.128.184.131 instead (on interface eth0)
15/08/16 12:45:17 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Jerry: useful hint
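The hint is useful indeed: the master resolved its own hostname to 127.0.0.1 and fell back to the interface address 10.128.184.131. To bind a specific address explicitly and silence the warning, the same conf/spark-env.sh can pin it. A sketch, assuming you want the address already chosen in the log:

```shell
# conf/spark-env.sh -- pin the bind address so the hostname's loopback
# resolution is never used (address copied from the WARN line above)
SPARK_LOCAL_IP=10.128.184.131
```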
15/08/16 12:45:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/16 12:45:18 INFO SecurityManager: Changing view acls to: root
15/08/16 12:45:18 INFO SecurityManager: Changing modify acls to: root
15/08/16 12:45:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/16 12:45:19 INFO Slf4jLogger: Slf4jLogger started
15/08/16 12:45:19 INFO Remoting: Starting remoting
15/08/16 12:45:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@NKGV50849583FV1:7077]
15/08/16 12:45:19 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
15/08/16 12:45:19 INFO Utils: Successfully started service on port 6066.
15/08/16 12:45:19 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
15/08/16 12:45:19 INFO Master: Starting Spark master at spark://NKGV50849583FV1:7077
15/08/16 12:45:19 INFO Master: Running Spark version 1.4.1
15/08/16 12:45:19 INFO Utils: Successfully started service 'MasterUI' on port 8080.

15/08/16 12:45:19 INFO MasterWebUI: Started MasterWebUI at http://10.128.184.131:8080
15/08/16 12:45:20 INFO Master: I have been elected leader! New state: ALIVE - cool!
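With the log saved, the quickest health check is to pull out the "Successfully started service" lines and the final master state: a healthy standalone master binds its RPC port (7077 here), the REST submission port (6066) and the web UI port (8080), and ends in state ALIVE. A small sketch against a saved copy of the excerpt above (the file name master.out is arbitrary):

```shell
# Reproduce the relevant lines of the master log shown above
cat > master.out <<'EOF'
15/08/16 12:45:19 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
15/08/16 12:45:19 INFO Utils: Successfully started service on port 6066.
15/08/16 12:45:19 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
15/08/16 12:45:19 INFO Master: Starting Spark master at spark://NKGV50849583FV1:7077
15/08/16 12:45:19 INFO Utils: Successfully started service 'MasterUI' on port 8080.
15/08/16 12:45:20 INFO Master: I have been elected leader! New state: ALIVE
EOF

# Every port the master reported as successfully bound
grep "Successfully started service" master.out | sed -E 's/.*on port ([0-9]+).*/\1/'
# prints 7077, 6066, 8080 (one per line)

# Did the master reach the ALIVE state?
grep "New state: ALIVE" master.out
```

If one of the three ports is missing from the output, look just above that point in the real log for the bind error or the conflicting service.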

That is all we will share on analyzing the startup log of a Spark cluster. Hopefully the material above is of some help and teaches you something new. If you found the article worthwhile, feel free to share it so more people can see it.
