pyspark.sql.functions.unix_timestamp
pyspark.sql.functions.unix_timestamp(timestamp: Optional[ColumnOrName] = None, format: str = 'yyyy-MM-dd HH:mm:ss') → pyspark.sql.column.Column
Convert a time string with the given pattern ('yyyy-MM-dd HH:mm:ss' by default) to a Unix timestamp (in seconds), using the default timezone and the default locale. Returns null if the conversion fails.
If timestamp is None, the current timestamp is returned.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
 - timestamp : Column or str, optional
   timestamps of string values.
 - format : str, optional
   alternative format to use for converting (default: yyyy-MM-dd HH:mm:ss).
- Returns
 - Column
   unix time as long integer.
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> time_df = spark.createDataFrame([('2015-04-08',)], ['dt'])
>>> time_df.select(unix_timestamp('dt', 'yyyy-MM-dd').alias('unix_time')).collect()
[Row(unix_time=1428476400)]
>>> spark.conf.unset("spark.sql.session.timeZone")
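The value returned above can be cross-checked with the Python standard library alone (a sketch for illustration, not part of the PySpark API): a date string parsed in the America/Los_Angeles timezone yields the same epoch seconds that unix_timestamp produces under that session timezone.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# Parse the same string with the same pattern semantics as the
# Spark example (midnight local time in America/Los_Angeles).
dt = datetime.strptime("2015-04-08", "%Y-%m-%d").replace(
    tzinfo=ZoneInfo("America/Los_Angeles")
)

# Seconds since the Unix epoch, matching the Spark result.
print(int(dt.timestamp()))  # 1428476400
```

This only agrees with Spark when the session timezone (spark.sql.session.timeZone) matches the zone passed to ZoneInfo; with a different session timezone the epoch seconds shift accordingly.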