Kryo deserializer throws ClassNotFoundException in Spark: Unable to find class
I am writing a Spark job in Scala, running on Spark 1.3.0. My RDD transformation functions use classes from a third-party library that are not serializable. To make closure serialization possible, I wrap these objects in com.twitter.chill.MeatLocker, which implements java.io.Serializable and uses Kryo to serialize the wrapped objects. I build an uber JAR using assembly.
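For concreteness, here is a minimal sketch of the wrapping approach, assuming chill's MeatLocker(obj) / .get API; ThirdPartyParser is a hypothetical stand-in for the real non-serializable library class:

```scala
import com.twitter.chill.MeatLocker
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical stand-in for a non-serializable third-party class.
class ThirdPartyParser { def parse(s: String): Int = s.length }

object Job {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("meatlocker-demo"))

    // MeatLocker is java.io.Serializable and holds a Kryo-serialized copy
    // of the wrapped object, so the closure below can be shipped to executors.
    val locked = MeatLocker(new ThirdPartyParser)

    val lengths = sc.parallelize(Seq("a", "bb", "ccc"))
      .map(s => locked.get.parse(s)) // .get unwraps the original object
      .collect()

    println(lengths.mkString(","))
    sc.stop()
  }
}
```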
When I run the job, the executor tasks fail, throwing a ClassNotFoundException for the same classes that are wrapped inside the MeatLocker. I know there was a related bug in Spark 1.2.x, and googling the subject suggests it has been fixed in 1.3.0: https://issues.apache.org/jira/browse/SPARK-6069
I have tried the Spark configuration property spark.executor.userClassPathFirst=true, with no effect. I passed the property to the spark-submit script like so:

```
spark-submit --class <my-class> <my-jar> --conf spark.executor.userClassPathFirst=true
```
Beyond this I am stuck. The only option I can see is to take the source of the third-party library, include it in my project, and make these classes implement java.io.Serializable. That would eliminate the need for Kryo serialization altogether and thereby avoid the issue, but I hope there is a better way. A sketch of that fallback follows.
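The fallback would look roughly like this, reusing the hypothetical ThirdPartyParser stand-in from above; with the class itself Serializable, no MeatLocker wrapper is needed:

```scala
// Fallback sketch: if the library source is vendored into the project,
// the class can be made Serializable directly.
class ThirdPartyParser extends java.io.Serializable {
  def parse(s: String): Int = s.length
}

// The class can then be used in closures without any wrapper:
//   rdd.map(s => parser.parse(s))
```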