A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC, a moment known as the Unix epoch.
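This definition can be checked directly in JavaScript: timestamp 0 is the epoch itself.

```javascript
// Timestamp 0 corresponds to the Unix epoch.
const epoch = new Date(0);
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```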
Why 1970?
The date was chosen by early Unix developers as a convenient starting point. It is entirely arbitrary, but has since become a universal standard.
Seconds vs milliseconds
Many modern languages and frameworks use milliseconds instead of seconds. JavaScript's Date.now() returns milliseconds. Always check which unit your system expects.
// Seconds
Math.floor(Date.now() / 1000)

// Milliseconds
Date.now()
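When a value arrives without documentation of its unit, its magnitude is a rough tell: second-precision timestamps for present-day dates have about 10 digits, millisecond ones about 13. The helper below is a hypothetical sketch of that heuristic, not a standard API; it misclassifies millisecond timestamps before September 2001 (below 1e12).

```javascript
// Hypothetical helper: guess the unit by magnitude and normalize
// to milliseconds. Values below 1e12 are assumed to be seconds.
function toMilliseconds(ts) {
  return ts < 1e12 ? ts * 1000 : ts;
}

console.log(toMilliseconds(1710000000));    // 1710000000000 (was seconds)
console.log(toMilliseconds(1710000000000)); // 1710000000000 (already ms)
```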
The year 2038 problem
A signed 32-bit integer can only hold values up to 2,147,483,647, which corresponds to 03:14:07 UTC on January 19, 2038. Systems still storing timestamps in 32-bit integers will overflow at that moment. Most modern systems have moved to 64-bit integers, which extend the range by roughly 292 billion years.
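The exact overflow moment falls out of the arithmetic: plugging the 32-bit maximum into a date conversion gives the last representable second.

```javascript
// The largest value a signed 32-bit integer can hold, interpreted
// as a Unix timestamp in seconds.
const INT32_MAX = 2 ** 31 - 1; // 2147483647
const lastMoment = new Date(INT32_MAX * 1000);
console.log(lastMoment.toISOString()); // "2038-01-19T03:14:07.000Z"
```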
Converting timestamps
To convert a Unix timestamp to a human-readable date in JavaScript:
const ts = 1710000000;
const date = new Date(ts * 1000); // multiply by 1000 for ms
console.log(date.toISOString()); // "2024-03-09T16:00:00.000Z"
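The reverse direction works too: `Date.parse` turns an ISO 8601 string into milliseconds, which divide back down to a Unix timestamp in seconds.

```javascript
// ISO 8601 string back to a Unix timestamp.
const ms = Date.parse("2024-03-09T16:00:00Z"); // milliseconds since epoch
const seconds = Math.floor(ms / 1000);
console.log(seconds); // 1710000000
```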
Working with timezones
Unix timestamps are always in UTC. When displaying to users, convert to their local timezone. The Intl.DateTimeFormat API is the modern way to do this in JavaScript.
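A minimal sketch of that conversion, using an illustrative locale and timezone (the `en-US` / `America/New_York` choices here are examples, not requirements):

```javascript
// Format a UTC instant for a specific timezone.
const date = new Date(1710000000 * 1000); // 2024-03-09T16:00:00Z
const fmt = new Intl.DateTimeFormat("en-US", {
  timeZone: "America/New_York",
  dateStyle: "medium",
  timeStyle: "short",
});
console.log(fmt.format(date)); // e.g. "Mar 9, 2024, 11:00 AM"
```

The same `Date` object can be rendered for any timezone just by changing the `timeZone` option; the underlying timestamp never changes.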