Building Spark with Maven: the PermGen space error and Java heap space error

Spark doesn’t build with a plain Maven invocation out of the box: Maven’s default JVM memory settings are too small for the job. While this is clearly stated on the Building Spark page of the documentation, it’s easy to miss.

You’ll get an error like one of the following:

$ mvn clean package
.....

[INFO] Compiling 203 Scala sources and 9 Java sources to /Users/me/Development/spark/core/target/scala-2.10/classes...
[ERROR] PermGen space -> [Help 1]

[INFO] Compiling 203 Scala sources and 9 Java sources to /Users/me/Development/spark/core/target/scala-2.10/classes...
[ERROR] Java heap space -> [Help 1]

In short, the Maven JVM has run out of PermGen space and/or Java heap space. Java heap space errors are common enough that I instinctively bump the heap when I see them (e.g., -Xmx2g), but PermGen is a separate memory region with its own limit that has to be raised explicitly.

Fortunately, the solution is pretty straightforward: give the Maven JVM more memory, including an explicit PermGen size, before running the build.
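The Building Spark page recommends doing this through MAVEN_OPTS; the values below are roughly what the documentation suggested at the time and make a reasonable starting point, so adjust them to your machine if needed:

# Give Maven's JVM a larger heap, a larger PermGen region, and more code cache
# (on Java 8 and later, PermGen no longer exists and -XX:MaxPermSize is ignored).
$ export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
$ mvn clean package

Putting the export in your shell profile saves you from having to set it again for every build.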

That should fix the problem and let Spark build properly. I also ran into a second, unrelated issue when building Spark in my home directory, which is encrypted; I posted a workaround for that in an earlier post.
