pyspark.sql.functions.make_interval

pyspark.sql.functions.make_interval(years=None, months=None, weeks=None, days=None, hours=None, mins=None, secs=None)

Make interval from years, months, weeks, days, hours, mins and secs.

New in version 3.5.0.

Parameters
years : Column or str, optional

The number of years, positive or negative.

months : Column or str, optional

The number of months, positive or negative.

weeks : Column or str, optional

The number of weeks, positive or negative.

days : Column or str, optional

The number of days, positive or negative.

hours : Column or str, optional

The number of hours, positive or negative.

mins : Column or str, optional

The number of minutes, positive or negative.

secs : Column or str, optional

The number of seconds with the fractional part in microsecond precision.

Returns
Column

A new column that contains an interval.
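
Notes

Each parameter accepts a Column or a column-name string, and any component left unset defaults to zero. To build an interval from constant values rather than DataFrame columns, the values can be wrapped with pyspark.sql.functions.lit. A minimal sketch, assuming an active SparkSession named spark as in the examples below; the printed output is inferred from the display format those examples use:

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(
...     sf.make_interval(years=sf.lit(2), months=sf.lit(6))
... ).show(truncate=False)
+----------------------------------+
|make_interval(2, 6, 0, 0, 0, 0, 0)|
+----------------------------------+
|2 years 6 months                  |
+----------------------------------+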

Examples

Example 1: Make interval from years, months, weeks, days, hours, mins and secs.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(
...     df.year, df.month, df.week, df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+---------------------------------------------------------------+
|make_interval(year, month, week, day, hour, min, sec)          |
+---------------------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes 1.001001 seconds|
+---------------------------------------------------------------+

Example 2: Make interval from years, months, weeks, days, hours and mins.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(
...     df.year, df.month, df.week, df.day, df.hour, df.min)
... ).show(truncate=False)
+---------------------------------------------------+
|make_interval(year, month, week, day, hour, min, 0)|
+---------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes     |
+---------------------------------------------------+

Example 3: Make interval from years, months, weeks, days and hours.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(
...     df.year, df.month, df.week, df.day, df.hour)
... ).show(truncate=False)
+-------------------------------------------------+
|make_interval(year, month, week, day, hour, 0, 0)|
+-------------------------------------------------+
|100 years 11 months 8 days 12 hours              |
+-------------------------------------------------+

Example 4: Make interval from years, months, weeks and days.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(df.year, df.month, df.week, df.day)).show(truncate=False)
+----------------------------------------------+
|make_interval(year, month, week, day, 0, 0, 0)|
+----------------------------------------------+
|100 years 11 months 8 days                    |
+----------------------------------------------+

Example 5: Make interval from years, months and weeks.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(df.year, df.month, df.week)).show(truncate=False)
+--------------------------------------------+
|make_interval(year, month, week, 0, 0, 0, 0)|
+--------------------------------------------+
|100 years 11 months 7 days                  |
+--------------------------------------------+

Example 6: Make interval from years and months.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(df.year, df.month)).show(truncate=False)
+-----------------------------------------+
|make_interval(year, month, 0, 0, 0, 0, 0)|
+-----------------------------------------+
|100 years 11 months                      |
+-----------------------------------------+

Example 7: Make interval from years.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval(df.year)).show(truncate=False)
+-------------------------------------+
|make_interval(year, 0, 0, 0, 0, 0, 0)|
+-------------------------------------+
|100 years                            |
+-------------------------------------+

Example 8: Make an empty interval (all components default to zero).

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ["year", "month", "week", "day", "hour", "min", "sec"])
>>> df.select(sf.make_interval()).show(truncate=False)
+----------------------------------+
|make_interval(0, 0, 0, 0, 0, 0, 0)|
+----------------------------------+
|0 seconds                         |
+----------------------------------+
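
Every component accepts negative values (see Parameters above), so an interval can also point backwards in time. A hedged sketch, not part of the original example set; the expected output below is inferred from the display format of the examples above:

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[-1, 10]], ["year", "hour"])
>>> df.select(sf.make_interval(years=df.year, hours=df.hour)).show(truncate=False)
+----------------------------------------+
|make_interval(year, 0, 0, 0, hour, 0, 0)|
+----------------------------------------+
|-1 years 10 hours                       |
+----------------------------------------+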