Code
from pyspark.conf import SparkConf
from pyspark.context import SparkContext

# 1 core per executor with a 2-core application cap:
# the standalone scheduler can grant up to two 1-core executors.
conf = SparkConf().setAll([('spark.app.name', '2_test_sparksession'),
                           ('spark.master', 'spark://spark-master:17077'),
                           ('spark.driver.cores', '1'),
                           ('spark.driver.memory', '1g'),
                           ('spark.executor.memory', '1g'),
                           ('spark.executor.cores', '1'),
                           ('spark.cores.max', '2')])
sc = SparkContext(conf=conf)
sc.stop()
Result
Code
from pyspark.conf import SparkConf
from pyspark.context import SparkContext

# 2 cores per executor with a 2-core application cap:
# the application now gets a single two-core executor instead of two one-core executors.
conf = SparkConf().setAll([('spark.app.name', '2_test_sparksession'),
                           ('spark.master', 'spark://spark-master:17077'),
                           ('spark.driver.cores', '1'),
                           ('spark.driver.memory', '1g'),
                           ('spark.executor.memory', '1g'),
                           ('spark.executor.cores', '2'),
                           ('spark.cores.max', '2')])
sc = SparkContext(conf=conf)
sc.stop()
Result
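To confirm which values the running context actually picked up, one option is to read them back from the active SparkContext before stopping it. Below is a minimal sketch, assuming the same master URL and application name as above; sc.getConf() returns a copy of the effective configuration.

from pyspark.conf import SparkConf
from pyspark.context import SparkContext

conf = SparkConf().setAll([('spark.app.name', '2_test_sparksession'),
                           ('spark.master', 'spark://spark-master:17077'),
                           ('spark.executor.cores', '2'),
                           ('spark.cores.max', '2')])
sc = SparkContext(conf=conf)

# Read the effective settings back from the running context.
for key in ('spark.executor.cores', 'spark.cores.max'):
    print(key, '=', sc.getConf().get(key))

sc.stop()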