pyspark.sql.functions.partitioning.hours
- pyspark.sql.functions.partitioning.hours(col)
Partition transform function: A transform for timestamps to partition data into hours.
New in version 4.0.0.
- Parameters
  - col : Column or str
    Target date or timestamp column to work on.
- Returns
  - Column
    Data partitioned by hours.
Notes
This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.
Examples
>>> df.writeTo("catalog.db.table").partitionedBy(
...     partitioning.hours("ts")
... ).createOrReplace()
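Conceptually, the hours transform buckets each row by its timestamp truncated to hour granularity; in Iceberg-style tables this is typically represented as the number of whole hours since the Unix epoch. A minimal stdlib-only sketch (an illustration of that idea, not part of the PySpark API) of computing such a partition value:

```python
from datetime import datetime, timezone

def hours_since_epoch(ts: datetime) -> int:
    # Illustrative only: computes an hour-granularity partition value as
    # whole hours since the Unix epoch. The real transform is applied by
    # the table catalog at write time, not in Python.
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    return int((ts - epoch).total_seconds() // 3600)

ts = datetime(1970, 1, 2, 3, 30, tzinfo=timezone.utc)
print(hours_since_epoch(ts))  # 27: one day (24 h) plus 3 full hours
```

Rows whose timestamps fall within the same clock hour receive the same partition value, which is what lets the engine prune whole partitions when a query filters on the timestamp column.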