pyspark.sql.Column.over
Column.over(window)
Define a windowing column.

New in version 1.4.0.

Parameters
    window : WindowSpec

Returns
    Column
Examples

>>> from pyspark.sql import Window
>>> window = Window.partitionBy("name").orderBy("age") \
...     .rowsBetween(Window.unboundedPreceding, Window.currentRow)
>>> from pyspark.sql.functions import rank, min
>>> from pyspark.sql.functions import desc
>>> df.withColumn("rank", rank().over(window)) \
...     .withColumn("min", min('age').over(window)).sort(desc("age")).show()
+---+-----+----+---+
|age| name|rank|min|
+---+-----+----+---+
|  5|  Bob|   1|  5|
|  2|Alice|   1|  2|
+---+-----+----+---+
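The doctest above relies on a pre-existing df (as provided by the PySpark doctest globals) and an active SparkSession. A minimal self-contained sketch, assuming a local SparkSession and a two-row DataFrame matching the output shown:

    from pyspark.sql import SparkSession, Window
    from pyspark.sql.functions import rank, min as min_, desc

    # Assumed setup: local SparkSession and a DataFrame with the same
    # rows as the doctest output, (2, "Alice") and (5, "Bob").
    spark = SparkSession.builder.master("local[1]").appName("over-example").getOrCreate()
    df = spark.createDataFrame([(2, "Alice"), (5, "Bob")], ["age", "name"])

    # Window partitioned by name, ordered by age, covering all rows from
    # the start of the partition up to the current row.
    window = (
        Window.partitionBy("name")
        .orderBy("age")
        .rowsBetween(Window.unboundedPreceding, Window.currentRow)
    )

    # Column.over() turns rank() and min() into windowed columns.
    result = (
        df.withColumn("rank", rank().over(window))
          .withColumn("min", min_("age").over(window))
          .sort(desc("age"))
    )
    result.show()

    spark.stop()

Importing min as min_ avoids shadowing Python's built-in min, which the doctest example does not bother with.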