Code
from pyspark.sql import SparkSession

# Build a session against the standalone master (note the custom port 17077).
spark = SparkSession.builder \
    .appName('3_test_sparksession') \
    .master('spark://spark-master:17077') \
    .getOrCreate()

sc = spark.sparkContext

# getAll() returns every configured setting as a (key, value) tuple.
for setting in sc._conf.getAll():
    print(setting)

sc.stop()
Result
('spark.driver.port', '39007')
('spark.master', 'spark://spark-master:17077')
('spark.sql.warehouse.dir', 'file:/home/spark/dev/spark-warehouse')
('spark.rdd.compress', 'True')
('spark.driver.host', 'spark-client')
('spark.app.startTime', '1654338870411')
('spark.app.name', '3_test_sparksession')
('spark.serializer.objectStreamReset', '100')
('spark.submit.pyFiles', '')
('spark.executor.id', 'driver')
('spark.submit.deployMode', 'client')
('spark.app.id', 'app-20220604103432-0048')
('spark.ui.showConsoleProgress', 'true')
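Since `getAll()` returns a list of `(key, value)` tuples like the output above, it can be convenient to turn the snapshot into a dict and look up individual keys, with a fallback for settings that were never set. The sketch below uses a small hand-copied subset of the result above rather than a live SparkContext, so it runs without a cluster; for a running session, `spark.conf.get(key)` gives the same kind of lookup directly.

```python
# Hypothetical snapshot: a few (key, value) tuples copied from the
# sc._conf.getAll() output above, standing in for the live list.
settings = [
    ('spark.master', 'spark://spark-master:17077'),
    ('spark.app.name', '3_test_sparksession'),
    ('spark.submit.deployMode', 'client'),
]

# A dict makes per-key lookups easy.
conf = dict(settings)

print(conf['spark.master'])                            # spark://spark-master:17077
print(conf.get('spark.executor.memory', '<not set>'))  # default when the key is absent
```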