Spark / Learning Spark

Getting Started with Spark 09 - Memory Exception

by java개발자, 2016-04-19

While running Spark, an exception suddenly occurred.


Settings at the time of the exception:

java: jdk1.8.0_77 (32-bit)

eclipse: MARS.1 (32-bit)

XXMaxPermSize: 256m

Xms1024m

Xmx1024m


Machine memory usage at the time: 2.0 GB / 4.0 GB (after launching Eclipse)

Spark setting: local[1]


> Nothing obviously wrong with these settings.


Exception message:

------------------------------------------------------------------------------------------------------------------------------------------------------------------

16/04/19 10:29:07 ERROR SparkContext: Error initializing SparkContext.

java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.

at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:193)

at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:175)

at org.apache.spark.SparkEnv$.create(SparkEnv.scala:354)

at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)

at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:169)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:186)

at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:65)

at org.mystudy.testcase.TestCase1.<init>(TestCase1.java:15)

at org.mystudy.testcase.TestCase1.main(TestCase1.java:19)

Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.

at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:193)

at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:175)

at org.apache.spark.SparkEnv$.create(SparkEnv.scala:354)

at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)

at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:169)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:186)

at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:65)

at org.mystudy.testcase.TestCase1.<init>(TestCase1.java:15)

at org.mystudy.testcase.TestCase1.main(TestCase1.java:19)

------------------------------------------------------------------------------------------------------------------------------------------------------------------
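The numbers in the message can be decoded. 4.718592E8 bytes is exactly 450 MB, which matches the check in Spark 1.6's UnifiedMemoryManager (a 300 MB reserved-memory constant times a 1.5 safety factor), while the reported system memory, 259522560 bytes, is only 247.5 MB — roughly the default max heap of a 32-bit JVM, suggesting the -Xmx1024m setting never reached the JVM that ran the program. A quick sketch of the arithmetic (the 300 MB reserve and the 1.5 factor come from Spark 1.6's UnifiedMemoryManager source):

```java
public class SparkMemoryCheck {
    public static void main(String[] args) {
        // RESERVED_SYSTEM_MEMORY_BYTES in Spark 1.6's UnifiedMemoryManager
        long reserved = 300L * 1024 * 1024;
        // The minimum heap Spark enforces: reserved * 1.5
        long minSystemMemory = (long) (reserved * 1.5);
        // The value reported in the exception message
        long reported = 259522560L;

        System.out.println("minimum required: " + minSystemMemory);  // prints 471859200, i.e. 4.718592E8
        System.out.println("reported heap MB: " + reported / (1024.0 * 1024));  // prints 247.5
        System.out.println("passes check:     " + (reported >= minSystemMemory));  // prints false
    }
}
```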


What is the problem?

Is 32-bit the culprit?

On 64-bit, the same memory settings worked fine...

I had only switched to 32-bit temporarily because Eclipse kept getting force-closed on 64-bit...



>>>


Apparently there is no Windows 32-bit version...

Source: http://qiita.com/ishida330/items/5302e1493e1d94e86adf (see the bottom of that page)
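For reference, when this check fails in a local run there are two usual ways out: make sure a large enough -Xmx actually reaches the launched JVM (in Eclipse that means the Run Configuration's VM arguments, not eclipse.ini, which only configures Eclipse itself), or override the value Spark compares against. A minimal sketch of the latter — spark.testing.memory is the internal setting that UnifiedMemoryManager reads before falling back to Runtime.getRuntime().maxMemory(); the class and app names here are placeholders, not the author's code:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class TestCase1Fix {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("TestCase1")   // placeholder app name
                .setMaster("local[1]")
                // Overrides the value getMaxMemory() checks (must be >= 471859200).
                // Meant for tests; raising the real heap via -Xmx is the cleaner fix.
                .set("spark.testing.memory", "536870912");  // 512 MB

        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}
```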