:: DeveloperApi ::
The data type for collections of multiple values.
Internally these are represented as columns that contain a scala.collection.Seq.
Please use DataTypes.createArrayType() to create a specific instance.
An ArrayType object comprises two fields, elementType: DataType and containsNull: Boolean.
The field elementType specifies the data type of the array's elements.
The field containsNull specifies whether the array can hold null values.
The data type of the array's elements.
Indicates whether the array can contain null values.
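As a minimal sketch (assuming spark-sql is on the classpath), an ArrayType can be constructed directly or through the DataTypes factory:

```scala
import org.apache.spark.sql.types._

// An array column whose elements are integers and may contain nulls.
val arr = ArrayType(IntegerType, containsNull = true)

// The Java-friendly factory produces an equivalent instance:
// DataTypes.createArrayType(DataTypes.IntegerType, true)
```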
An internal type used to represent everything that is not null, UDTs, arrays, structs, and maps.
:: DeveloperApi ::
The data type representing Array[Byte] values.
Please use the singleton DataTypes.BinaryType.
:: DeveloperApi ::
The data type representing Boolean values.
Please use the singleton DataTypes.BooleanType.
:: DeveloperApi ::
The data type representing Byte values.
Please use the singleton DataTypes.ByteType.
:: DeveloperApi ::
The data type representing calendar time intervals. A calendar time interval is stored internally in two components: the number of months and the number of microseconds.
Note that calendar intervals are not comparable.
Please use the singleton DataTypes.CalendarIntervalType.
:: DeveloperApi :: The base type of all Spark SQL data types.
:: DeveloperApi ::
A date type, supporting "0001-01-01" through "9999-12-31".
Internally, this is represented as the number of days from 1970-01-01.
Please use the singleton DataTypes.DateType.
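The days-from-epoch encoding can be illustrated with the JDK's java.time (a plain-Scala sketch, independent of Spark):

```scala
import java.time.LocalDate

// DateType stores a date as the number of days since 1970-01-01 (day 0).
val days = LocalDate.of(2020, 1, 1).toEpochDay  // 18262
val date = LocalDate.ofEpochDay(days)           // 2020-01-01
```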
A mutable implementation of BigDecimal that can hold a Long if values are small enough.
The semantics of the fields are as follows:
- _precision and _scale represent the SQL precision and scale we are looking for
- If decimalVal is set, it represents the whole decimal value
- Otherwise, the decimal value is longVal / (10 ** _scale)
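The longVal / (10 ** _scale) interpretation can be sketched with plain Scala arithmetic (the field values below are hypothetical; no Spark required):

```scala
// With longVal = 123456 and _scale = 2, the represented decimal value
// is longVal / 10^scale:
val longVal = 123456L
val scale = 2
val value = BigDecimal(longVal) / BigDecimal(10).pow(scale)
// value == 1234.56
```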
:: DeveloperApi ::
The data type representing java.math.BigDecimal values.
A Decimal that must have fixed precision (the maximum number of digits) and scale (the number of digits on the right side of the dot).
The precision can be up to 38; the scale can also be up to 38 (less than or equal to the precision).
The default precision and scale is (10, 0).
Please use DataTypes.createDecimalType() to create a specific instance.
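A minimal sketch of creating a fixed-precision type (assuming spark-sql is on the classpath):

```scala
import org.apache.spark.sql.types._

// Precision 38 (the maximum) and scale 18; scale must not exceed precision.
val dt = DecimalType(38, 18)

// Equivalent via the Java-friendly factory:
// DataTypes.createDecimalType(38, 18)
```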
:: DeveloperApi ::
The data type representing Double values.
Please use the singleton DataTypes.DoubleType.
:: DeveloperApi ::
The data type representing Float values.
Please use the singleton DataTypes.FloatType.
:: DeveloperApi ::
The data type representing Int values.
Please use the singleton DataTypes.IntegerType.
:: DeveloperApi ::
The data type representing Long values.
Please use the singleton DataTypes.LongType.
:: DeveloperApi ::
The data type for Maps. Keys in a map are not allowed to have null values.
Please use DataTypes.createMapType() to create a specific instance.
The data type of map keys.
The data type of map values.
Indicates whether map values can be null.
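A minimal sketch of constructing a MapType (assuming spark-sql is on the classpath):

```scala
import org.apache.spark.sql.types._

// A map from (non-null) String keys to Double values that may be null.
val m = MapType(StringType, DoubleType, valueContainsNull = true)

// Equivalent via the Java-friendly factory:
// DataTypes.createMapType(DataTypes.StringType, DataTypes.DoubleType, true)
```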
:: DeveloperApi ::
Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata]. JSON is used for serialization.
The default constructor is private. Users should use either MetadataBuilder or Metadata.fromJson() to create Metadata instances.
:: DeveloperApi ::
Builder for Metadata. If there is a key collision, the latter will overwrite the former.
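The overwrite-on-collision behavior can be sketched as follows (assuming spark-sql is on the classpath):

```scala
import org.apache.spark.sql.types._

// Build Metadata incrementally; the second put for "comment" overwrites the first.
val meta = new MetadataBuilder()
  .putString("comment", "first")
  .putString("comment", "second")  // key collision: the latter wins
  .putLong("maxLength", 32L)
  .build()
```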
:: DeveloperApi ::
The data type representing NULL values.
Please use the singleton DataTypes.NullType.
:: DeveloperApi :: Numeric data types.
:: DeveloperApi ::
The data type representing Short values.
Please use the singleton DataTypes.ShortType.
:: DeveloperApi ::
The data type representing String values.
Please use the singleton DataTypes.StringType.
A field inside a StructType.
The name of this field.
The data type of this field.
Indicates whether values of this field can be null.
The metadata of this field. The metadata should be preserved during transformation if the content of the column is not modified, e.g., in selection.
:: DeveloperApi ::
A StructType object can be constructed by StructType(fields: Seq[StructField]).
For a StructType object, one or multiple StructFields can be extracted by names.
If multiple StructFields are extracted, a StructType object will be returned.
If a provided name does not have a matching field, it will be ignored. For the case
of extracting a single StructField, a null will be returned.
Example:

  import org.apache.spark.sql._
  import org.apache.spark.sql.types._

  val struct = StructType(
    StructField("a", IntegerType, true) ::
    StructField("b", LongType, false) ::
    StructField("c", BooleanType, false) :: Nil)

  // Extract a single StructField.
  val singleField = struct("b")
  // singleField: StructField = StructField(b,LongType,false)

  // This struct does not have a field called "d". null will be returned.
  val nonExisting = struct("d")
  // nonExisting: StructField = null

  // Extract multiple StructFields. Field names are provided in a set.
  // A StructType object will be returned.
  val twoFields = struct(Set("b", "c"))
  // twoFields: StructType =
  //   StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))

  // Any names without matching fields will be ignored.
  // For the case shown below, "d" will be ignored and
  // it is treated as struct(Set("b", "c")).
  val ignoreNonExisting = struct(Set("b", "c", "d"))
  // ignoreNonExisting: StructType =
  //   StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))

A org.apache.spark.sql.Row object is used as a value of the StructType.
Example:

  import org.apache.spark.sql._

  val innerStruct = StructType(
    StructField("f1", IntegerType, true) ::
    StructField("f2", LongType, false) ::
    StructField("f3", BooleanType, false) :: Nil)

  val struct = StructType(
    StructField("a", innerStruct, true) :: Nil)

  // Create a Row with the schema defined by struct
  val row = Row(Row(1, 2, true))
  // row: Row = [[1,2,true]]
:: DeveloperApi ::
The data type representing java.sql.Timestamp values.
Please use the singleton DataTypes.TimestampType.
An AbstractDataType that matches any concrete data types.
Extra factory methods and pattern matchers for Decimals.
Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.