How to find the size or shape of a DataFrame in PySpark?

I am trying to find out the size/shape of a DataFrame in PySpark. I do not see a single function that can do this.

In pandas, I can do this:

data.shape
Is there a similar function in PySpark? This is my current solution, but I am looking for a more elegant one:

row_number = data.count()
column_number = len(data.dtypes)

Computing the number of columns this way is not ideal…
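One option is to wrap the two calls above in a small helper, sketched below. The function name `shape` is my own choice, not a PySpark API; note that `df.count()` triggers a full Spark job over the data, while `len(df.columns)` only reads schema metadata, so the row count is the expensive part.

```python
# Hypothetical helper mimicking pandas' .shape for a PySpark DataFrame.
# PySpark has no built-in shape attribute, so we combine:
#   - df.count()       -> number of rows (runs a Spark job)
#   - len(df.columns)  -> number of columns (schema metadata only, cheap)
def shape(df):
    """Return (n_rows, n_cols) for a DataFrame-like object."""
    return (df.count(), len(df.columns))
```

This works with any object exposing a `count()` method and a `columns` attribute, so it is easy to unit-test without a running Spark session.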

This is not the best place to get help with this issue; try asking somewhere else (this forum is for general questions only).