PySpark: Need to Show a Count of Null/Empty Values for Each Column in a DataFrame

I have a Spark DataFrame and need to do a count of null/empty values for each column. I need to show ALL columns in the output. I have looked online and found a few similar questions.

Solution 1:

You can do the following; just make sure your df is a Spark DataFrame.

from pyspark.sql.functions import col, count, when

# Count the NULL values in every column; count() ignores the
# nulls produced by when() whenever the condition is false.
df.select(*(count(when(col(c).isNull(), c)).alias(c) for c in df.columns)).show()
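Since the question asks about null and empty values, a minimal sketch of a variant that also treats empty strings as missing (assuming the columns of interest are string-typed; the equality check is meaningless for non-string columns):

from pyspark.sql.functions import col, count, when

# Count values that are NULL or an empty string, per column.
# Non-string columns will only match the isNull() branch.
df.select(
    *(count(when(col(c).isNull() | (col(c) == ""), c)).alias(c) for c in df.columns)
).show()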
