DataFrame empty check in PySpark

I am trying to check if a DataFrame is empty in PySpark using the code below.

print(df.head(1).isEmpty)

But, I am getting an error

AttributeError: 'list' object has no attribute 'isEmpty'

I checked that my object really is a DataFrame using type(df), and it is <class 'pyspark.sql.dataframe.DataFrame'>.



Solution 1:[1]

I used df.first() == None to check whether my Spark DataFrame is empty.

Solution 2:[2]

When you call head(1), it returns a list, which is the reason for your error.

You have to just do df.isEmpty().

Solution 3:[3]

df.head(1) returns a list containing at most the first row of df.

You can check whether this list is empty ([]) with a truthiness test, as in:

if df.head(1):
    print("there is something")
else:
    print("df is empty")

Output: df is empty

Empty lists are implicitly falsy.

For a fuller explanation, see the Python docs on truth value testing.
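The truthiness rule itself can be demonstrated with plain Python lists, which is exactly what head(1) hands back (the Row string below is only a stand-in):

```python
# head(1) on an empty DataFrame returns [], and empty sequences are falsy.
rows = []          # what df.head(1) yields for an empty DataFrame
print(bool(rows))  # False

rows = ["Row(id=1)"]  # a one-element list, as for a non-empty DataFrame
print(bool(rows))     # True
```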

Solution 4:[4]

Another way to do this is to check whether df.count() == 0.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Padfoot123
Solution 2 ZygD
Solution 3 Andronicus
Solution 4 fuzzy-memory