Issue
I'm running PySpark in the Spyder IDE, and these warnings come up every time:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/02/15 17:05:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/02/15 17:05:29 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
I tried editing the file C:\spark\spark-3.2.1-bin-hadoop2.7\conf\log4j.properties.template
to change the log level to 'ERROR', but it had no effect.
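For context, the sc.setLogLevel route mentioned in the log output can also be used from PySpark code, but it only takes effect after the SparkContext is created, so startup warnings such as the NativeCodeLoader message are printed before it can apply. A minimal sketch, assuming a plain local session (the app name "quiet-logs" is just illustrative):

    from pyspark.sql import SparkSession

    # Build (or reuse) a local session; "quiet-logs" is a placeholder name.
    spark = SparkSession.builder.appName("quiet-logs").getOrCreate()

    # Suppress WARN and below for everything logged from this point on.
    # Warnings emitted during JVM startup have already been printed.
    spark.sparkContext.setLogLevel("ERROR")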
Solution
- Rename log4j.properties.template to log4j.properties
- Make sure log4j.properties is on the classpath or under $SPARK_HOME/conf/ (in your setup, C:\spark\spark-3.2.1-bin-hadoop2.7\conf\)
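For reference, a minimal log4j.properties might look like the sketch below. This assumes Spark 3.2.x, which still bundles log4j 1.x, and mirrors the console appender from Spark's own template; raising log4j.rootCategory to ERROR is what suppresses the WARN lines:

    # Log only ERROR and above to the console, so WARN messages are hidden
    log4j.rootCategory=ERROR, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Restart the PySpark session after making the change; the file is only read when the JVM starts.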
Answered By - Warren Zhu