Spark and NegativeArraySizeException
Recently I was debugging the following crash in Spark: Disabling Kryo solves the issue. To do that, just set spark.serializer to org.apache.spark.serializer.JavaSerializer. Another workaround is to change Kryo's reference management, as explained on GitHub:
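A minimal sketch of both workarounds as they might appear in application code, assuming a standard SparkConf setup (the app name and master are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object KryoWorkaround {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("kryo-workaround-example") // hypothetical app name
      .setMaster("local[*]")                 // placeholder master

      // Workaround 1: fall back to Java serialization entirely.
      .set("spark.serializer", "org.apache.spark.serializer.JavaSerializer")

      // Workaround 2 (alternative): keep Kryo but disable its reference
      // tracking, which is the feature tied to this class of failures.
      // .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // .set("spark.kryo.referenceTracking", "false")

    val sc = new SparkContext(conf)
    // ... job logic goes here ...
    sc.stop()
  }
}
```

The same settings can also be passed without code changes, e.g. `spark-submit --conf spark.serializer=org.apache.spark.serializer.JavaSerializer ...`. Note that Java serialization is generally slower than Kryo, so the second workaround may be preferable when performance matters.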