Unix Timestamp

A Unix timestamp is the number of seconds since January 1, 1970 UTC. This page explains what Unix timestamps are, how to convert them, and how to use them in every major programming language.

What Is a Unix Timestamp?

A Unix timestamp is a single integer that represents a specific moment in time as the number of seconds elapsed since the Unix epoch — midnight on January 1, 1970, Coordinated Universal Time (UTC). It is the most widely used time format in software development, server systems, databases, and APIs.

Because a Unix timestamp is timezone-independent, the same number means the same moment everywhere on Earth. When you store a Unix timestamp in a database, you never have to worry about daylight saving time, timezone offsets, or calendar differences — the number is always unambiguous.
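To see this in practice, the same integer renders as different local clock readings but the same instant. A minimal Python sketch using the standard-library zoneinfo module (requires system timezone data, as on Linux):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1700000000  # one Unix timestamp, one moment in time

# The same instant, rendered in three different zones:
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
ny = datetime.fromtimestamp(ts, tz=ZoneInfo("America/New_York"))
tokyo = datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo"))

print(utc)    # 2023-11-14 22:13:20+00:00
print(ny)     # 2023-11-14 17:13:20-05:00
print(tokyo)  # 2023-11-15 07:13:20+09:00

# All three compare equal -- they are the same moment in time.
assert utc == ny == tokyo
```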

How to Get the Current Unix Timestamp

Every major programming language has a built-in way to get the current Unix timestamp. Here are the most common methods:

| Language   | Get Current Timestamp                       | Convert to Date                              |
|------------|---------------------------------------------|----------------------------------------------|
| JavaScript | `Math.floor(Date.now() / 1000)`             | `new Date(ts * 1000).toUTCString()`          |
| Python     | `import time; int(time.time())`             | `datetime.fromtimestamp(ts, tz=timezone.utc)` |
| PHP        | `time()`                                    | `date("Y-m-d H:i:s", $ts)`                   |
| Java       | `System.currentTimeMillis() / 1000L`        | `new Date(ts * 1000L)`                       |
| Go         | `time.Now().Unix()`                         | `time.Unix(ts, 0).UTC()`                     |
| Ruby       | `Time.now.to_i`                             | `Time.at(ts).utc`                            |
| C#         | `DateTimeOffset.UtcNow.ToUnixTimeSeconds()` | `DateTimeOffset.FromUnixTimeSeconds(ts)`     |
| Bash       | `date +%s`                                  | `date -d @$ts`                               |
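Whichever language you use, the two directions should round-trip cleanly. A quick sanity check in Python (using `datetime.fromtimestamp` with an explicit UTC zone, since `utcfromtimestamp` is deprecated in Python 3.12+):

```python
import time
from datetime import datetime, timezone

# Current timestamp, truncated to whole seconds
now = int(time.time())

# Timestamp -> timezone-aware datetime
dt = datetime.fromtimestamp(now, tz=timezone.utc)

# Aware datetime -> timestamp
back = int(dt.timestamp())

assert back == now  # the round trip is lossless at second precision
print(now, "->", dt.isoformat())
```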

Unix Timestamp vs. Milliseconds

Standard Unix timestamps are in seconds. However, JavaScript and many web APIs use milliseconds (1/1000th of a second). This is a common source of bugs — always check which unit your system expects.

| Format             | Example Value       | Used In                      |
|--------------------|---------------------|------------------------------|
| Seconds (standard) | 1700000000          | Unix, Linux, Python, PHP, Go |
| Milliseconds       | 1700000000000       | JavaScript, Java, .NET       |
| Microseconds       | 1700000000000000    | High-precision systems       |
| Nanoseconds        | 1700000000000000000 | Go `time.UnixNano()`         |
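One defensive pattern when ingesting timestamps from unknown sources is to guess the unit from the magnitude and normalize to seconds. The function below is my own sketch (not a standard-library API) and assumes dates fall in the modern range, roughly 1970 to 2100:

```python
def to_seconds(ts: int) -> float:
    """Normalize a timestamp in s, ms, us, or ns to seconds.

    Heuristic sketch: try each unit in turn and accept the first
    one whose result lands before Jan 1, 2100 UTC. Breaks for
    pre-1970 (negative) timestamps -- a deliberate simplification.
    """
    for divisor in (1, 1_000, 1_000_000, 1_000_000_000):
        seconds = ts / divisor
        if seconds < 4_102_444_800:  # Jan 1, 2100 UTC
            return seconds
    raise ValueError(f"timestamp out of plausible range: {ts}")

print(to_seconds(1_700_000_000))              # seconds
print(to_seconds(1_700_000_000_000))          # milliseconds
print(to_seconds(1_700_000_000_000_000_000))  # nanoseconds
```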

The Year 2038 Problem

On 32-bit systems, Unix timestamps are stored as a signed 32-bit integer. The maximum value is 2,147,483,647, which corresponds to January 19, 2038 at 03:14:07 UTC. After this point, 32-bit systems will overflow — the timestamp will wrap around to a large negative number, causing dates to appear as December 13, 1901.

This is known as the Year 2038 problem (or Y2K38). Modern 64-bit systems are not affected — they can represent timestamps up to the year 292,277,026,596. Most modern operating systems, databases, and programming languages have already migrated to 64-bit timestamps.
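The wraparound can be simulated in Python with ctypes' signed 32-bit integer type (a sketch; the real failures occur in C code that stores `time_t` in 32 bits, and rendering negative timestamps requires a platform that supports pre-1970 dates, as Linux does):

```python
import ctypes
from datetime import datetime, timezone

MAX_32 = 2_147_483_647  # largest signed 32-bit value

# One second past the limit, as a C int32 would store it
wrapped = ctypes.c_int32(MAX_32 + 1).value
print(wrapped)  # -2147483648

print(datetime.fromtimestamp(MAX_32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```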

Unix Timestamps in Real-World Systems

JWT Tokens

The "exp", "iat", and "nbf" claims in JSON Web Tokens are Unix timestamps. Always compare "exp" against the current timestamp to validate token expiry.
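A minimal expiry check, assuming the token payload is already decoded into a dict (signature verification, which any real JWT library performs, is omitted — this is a sketch, not production validation):

```python
import time

def is_expired(claims: dict, leeway: int = 30) -> bool:
    """Return True if the token's 'exp' claim is in the past.

    `leeway` tolerates small clock skew between issuer and
    verifier. Sketch only -- use a real JWT library in production.
    """
    exp = claims.get("exp")
    if exp is None:
        return True  # treat a missing expiry as invalid
    return time.time() > exp + leeway

claims = {"iat": 1700000000, "exp": 1700003600}  # issued + 1 hour
print(is_expired(claims))  # True -- this exp passed in 2023
```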

Databases

MySQL's UNIX_TIMESTAMP() and PostgreSQL's EXTRACT(EPOCH FROM ...) convert dates to Unix timestamps, and MongoDB's BSON dates are stored as milliseconds since the Unix epoch.

Server Logs

Apache and Nginx can be configured to log Unix timestamps (for example, Nginx's $msec variable), which allows precise event ordering across distributed systems.

Cloud APIs

AWS, Google Cloud, and Azure APIs use Unix timestamps in responses for created_at, updated_at, and expiry fields.

Version Control

Git stores commit timestamps as Unix timestamps. The "git log --format=%at" command outputs raw Unix timestamps.

Cache TTL

HTTP's Cache-Control: max-age header uses a relative number of seconds rather than a timestamp, but cache backends and CDN configurations frequently store the absolute expiry moment as a Unix timestamp.

Key Facts

| Fact        | Value              |
|-------------|--------------------|
| Epoch start | Jan 1, 1970 UTC    |
| Unit        | Seconds (standard) |
| JS unit     | Milliseconds       |
| 32-bit max  | Jan 19, 2038       |
| 64-bit max  | ~Year 292 billion  |
| Negative?   | Yes (pre-1970)     |

Milestones

| Timestamp     | Date         |
|---------------|--------------|
| 1,000,000,000 | Sep 9, 2001  |
| 1,111,111,111 | Mar 18, 2005 |
| 1,234,567,890 | Feb 13, 2009 |
| 1,500,000,000 | Jul 14, 2017 |
| 2,000,000,000 | May 18, 2033 |