PySpark error: AttributeError: 'NoneType' object has no attribute '_jvm'

Mariusz's answer didn't really help me. So if you, like me, found this because it's the only result on Google and you're new to PySpark (and Spark in general), here's what worked for me.

In my case I was getting that error because I was trying to execute PySpark code before the PySpark environment (the SparkContext/SparkSession) had been set up.

Making sure that a SparkSession (or SparkContext) was created before making any calls that depend on pyspark.sql.functions fixed the issue for me.
