Deftkit

Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates. Free online epoch converter — supports seconds, milliseconds and ISO 8601.

Type a Unix timestamp or pick a date — edit either input and the other fills in automatically.

What is a Unix timestamp?

A Unix timestamp (also called Unix time, POSIX time, or epoch time) is a single integer counting the number of seconds elapsed since 00:00:00 UTC on January 1, 1970 — the Unix epoch. It is the most widely used way to represent a moment in time in computing. Operating systems, file systems, databases, network protocols, and APIs all rely on it because it has three properties that human-readable formats cannot match: it is compact (one integer), it is timezone-independent (always UTC), and arithmetic on times reduces to plain integer arithmetic. Any developer who has worked with APIs, log files, or database schemas will have encountered Unix timestamps regularly.
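The "plain integer arithmetic" point is easiest to see in code. A minimal illustration (the values are arbitrary examples):

```javascript
// Time arithmetic on Unix timestamps is ordinary integer arithmetic.
const DAY = 86400;               // seconds in a day (Unix time ignores leap seconds)
const now = 1712574896;          // 2024-04-08 11:14:56 UTC
const tomorrow = now + DAY;      // adding a duration is addition
const elapsed = tomorrow - now;  // a duration between two instants is subtraction

console.log(tomorrow, elapsed);  // 1712661296 86400
```

No calendar logic, no timezone tables — which is exactly why the format is so pervasive at the storage and protocol layers.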

Seconds vs milliseconds — the JavaScript trap

This is the most common source of confusion with Unix time. The Unix standard is seconds. Most languages and runtimes follow it: C, Python, Go, Ruby, PHP, MySQL's UNIX_TIMESTAMP(), and PostgreSQL's extract(epoch from ...) all return whole seconds.

JavaScript is the outlier. Date.now() returns milliseconds since the epoch — one thousand times larger than the Unix standard. This mismatch causes a steady stream of bugs whenever JavaScript code talks to a backend written in any other language. You will eventually multiply or divide by 1000 in the wrong direction. Two quick sentinels:

  • 10-digit value (e.g. 1712574896) → seconds. Ten-digit second counts cover dates from September 2001 through the year 2286, so every present-day second-precision timestamp falls in this range.
  • 13-digit value (e.g. 1712574896000) → milliseconds. This is what Date.now() produces.

This unix time converter autodetects the unit based on magnitude. If your value is roughly 10¹¹ (about 100 billion) or larger, it is treated as milliseconds. In code, the conversions are straightforward: seconds = Math.floor(ms / 1000) and ms = seconds * 1000.
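The detection heuristic can be sketched in a few lines. This is an illustration of the approach, not the tool's actual source; the function names are made up:

```javascript
// Values at or above 1e11 are treated as milliseconds: 1e11 seconds
// would be the year 5138, far beyond any plausible second-precision value.
function detectUnit(value) {
  return Math.abs(value) >= 1e11 ? "milliseconds" : "seconds";
}

function toSeconds(value) {
  return detectUnit(value) === "milliseconds" ? Math.floor(value / 1000) : value;
}

function toMilliseconds(value) {
  return detectUnit(value) === "seconds" ? value * 1000 : value;
}

console.log(detectUnit(1712574896));    // "seconds" (10 digits)
console.log(detectUnit(1712574896000)); // "milliseconds" (13 digits)
```

The threshold is deliberately far from both clusters, so ordinary 10-digit and 13-digit values can never be misclassified.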

The Year 2038 problem

Unix timestamps were historically stored as signed 32-bit integers. The maximum value of a signed 32-bit integer is 2,147,483,647, which corresponds to 03:14:07 UTC on 19 January 2038. One second after that moment, the value overflows and wraps to a large negative number, causing software to interpret timestamps as dates in December 1901.

This is known as the Year 2038 problem or Y2K38. It is structurally the same class of bug as Y2K, just at a different layer of the stack. Most modern systems — 64-bit operating systems, MySQL with BIGINT columns, JavaScript's 64-bit doubles — have already moved past this limit and are not affected. Embedded systems, legacy databases, and older C code that has not been rebuilt with a 64-bit time_t may still be at risk. This epoch converter uses 64-bit JavaScript numbers throughout and handles timestamps well beyond 2038 without issue.
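The overflow is easy to reproduce in JavaScript by forcing the value through signed 32-bit storage, the way a legacy C time_t field would behave:

```javascript
// Int32Array stores signed 32-bit integers, mimicking a 32-bit time_t.
const t = new Int32Array(1);
t[0] = 2147483647;   // 03:14:07 UTC, 19 January 2038 — the maximum value
t[0] = t[0] + 1;     // one second later: wraps around

console.log(t[0]);                                // -2147483648
console.log(new Date(t[0] * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```

The wrapped value lands in December 1901 because a signed 32-bit count reaches just as far below the epoch as above it.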

Timezone gotchas

A Unix timestamp is always UTC. There is no such thing as a timestamp "in PST" — a timezone is only a display preference applied on top of an underlying UTC value. When you convert a timestamp to a human-readable date, the result depends entirely on which timezone you choose to display it in. The same value 1712574896 is both 2024-04-08 11:14:56 UTC and 2024-04-08 04:14:56 PDT — the same instant expressed two different ways.

This tool shows you both: the ISO 8601 representation (always UTC) and your local time with the UTC offset spelled out explicitly. Note that DST (daylight saving time) does not affect Unix timestamps at all. Timestamps are monotonically increasing integers. DST only changes how a given timestamp is displayed, not its numeric value.
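The "display preference" idea can be demonstrated directly: one Date object, two renderings, one underlying number. (The exact toLocaleString output depends on the runtime's locale data, so it is shown only as an example.)

```javascript
const d = new Date(1712574896 * 1000); // the Date constructor expects milliseconds

console.log(d.toISOString());
// "2024-04-08T11:14:56.000Z" — toISOString is always UTC

console.log(d.toLocaleString("en-US", {
  timeZone: "America/Los_Angeles",
  timeZoneName: "short",
}));
// e.g. "4/8/2024, 4:14:56 AM PDT" — same instant, different display

console.log(d.getTime() / 1000);
// 1712574896 — the underlying value never changes
```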

How to use this Unix timestamp converter

  1. Enter a timestamp in the timestamp field. Paste any Unix time value in seconds or milliseconds — the tool autodetects which unit you are using based on the number of digits.
  2. Or pick a date using the date and time picker to convert in the other direction: from a human-readable date to its epoch equivalent.
  3. Below the inputs you'll see the converted value in five formats: Unix seconds, Unix milliseconds, ISO 8601 UTC, local date/time with UTC offset, and a relative description ("3 minutes ago").
  4. Click the Now button to populate both fields with the current moment and see what the current Unix timestamp is.
  5. Click Copy next to any output format to place it on your clipboard.
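If you want to perform the same two conversions outside the tool, both directions are one-liners in plain JavaScript:

```javascript
// Timestamp → date (remember: the Date constructor takes milliseconds):
const iso = new Date(1712574896 * 1000).toISOString();
console.log(iso); // "2024-04-08T11:14:56.000Z"

// Date → timestamp (Date.parse returns milliseconds, hence the division):
const seconds = Math.floor(Date.parse("2024-04-08T11:14:56Z") / 1000);
console.log(seconds); // 1712574896
```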

Common conversion examples

A few well-known Unix timestamps and their human-readable equivalents:

0              → 1970-01-01 00:00:00 UTC  (the Unix epoch — origin of Unix time)
946684800      → 2000-01-01 00:00:00 UTC  (Y2K)
1000000000     → 2001-09-09 01:46:40 UTC  (the "one billion seconds" moment)
1234567890     → 2009-02-13 23:31:30 UTC  (widely noticed at the time)
1712574896000  → 2024-04-08 11:14:56 UTC  (milliseconds — note the 13 digits)
2147483647     → 2038-01-19 03:14:07 UTC  (the Year 2038 overflow boundary)

Notice that the millisecond example has 13 digits while the second-precision examples have 10. That digit count is the fastest way to determine which unit a timestamp uses when you encounter one in the wild.
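Every second-precision row in the table above can be checked against JavaScript's own Date implementation:

```javascript
// Each [seconds, expected ISO 8601] pair from the table above.
const rows = [
  [0,          "1970-01-01T00:00:00.000Z"],
  [946684800,  "2000-01-01T00:00:00.000Z"],
  [1000000000, "2001-09-09T01:46:40.000Z"],
  [1234567890, "2009-02-13T23:31:30.000Z"],
  [2147483647, "2038-01-19T03:14:07.000Z"],
];

for (const [sec, expected] of rows) {
  // Multiply by 1000: the Date constructor expects milliseconds.
  console.log(new Date(sec * 1000).toISOString() === expected); // true for every row
}
```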

Frequently asked questions

Why does my timestamp give the wrong date?

Almost always the culprit is a seconds/milliseconds mismatch. If a 10-digit Unix timestamp is divided by 1000 before converting, the result lands in early January 1970, just seconds after the epoch. If a 13-digit millisecond value is fed into a function that expects seconds, the result is a date tens of thousands of years in the future. Check the digit count first: 10 digits means seconds, 13 digits means milliseconds. This unix timestamp converter autodetects, but if you are computing the conversion manually in code, multiply or divide by 1000 with care.
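Both failure modes described above can be reproduced in two lines:

```javascript
// 10-digit seconds value mistakenly passed where milliseconds are expected:
console.log(new Date(1712574896).toISOString());
// "1970-01-20T19:42:54.896Z" — about 20 days after the epoch, not April 2024

// 13-digit millisecond value mistakenly multiplied by 1000 again:
console.log(new Date(1712574896000 * 1000).getUTCFullYear());
// a year tens of thousands of years in the future
```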

What is the difference between "epoch time" and "Unix time"?

They are the same thing. "Epoch time" emphasizes that the count starts from a fixed reference point — the epoch — which is 00:00:00 UTC on January 1, 1970. "Unix time" emphasizes the convention's origin in the Unix operating system. POSIX time is a third name for the exact same concept, reflecting its standardization in the POSIX specification. All three terms are used interchangeably in documentation and APIs. When you see any of them, they refer to integer seconds (or milliseconds in JavaScript) counted from that single reference point.

Is the Unix epoch the same as my system clock?

On a correctly configured modern machine, yes — the system clock is maintained as a Unix timestamp internally. However, system clocks drift from true UTC over time. Applications that need precise, synchronized time use NTP (Network Time Protocol) or PTP (Precision Time Protocol) to discipline the clock against authoritative time servers. The Unix epoch itself is a definition — 00:00:00 UTC, January 1, 1970 — not a hardware component. Your system clock tracks elapsed seconds from that definition.

Does Unix time include leap seconds?

Officially no. The Unix time standard treats every day as exactly 86,400 seconds. When a real-world leap second is inserted, the operating system either smears the extra second across a longer interval (Google's "leap smear" approach) or repeats a second, depending on configuration. For day-to-day timestamp to date conversion this distinction does not matter — the accumulated error is at most tens of seconds over decades. For high-precision astronomy, navigation, or financial systems it can be significant.
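The "every day is 86,400 seconds" rule is observable in code. 2016-12-31 actually contained a leap second (86,401 SI seconds), but Unix time shows no trace of it:

```javascript
// Date.UTC returns milliseconds since the epoch; months are 0-indexed.
const dec31 = Date.UTC(2016, 11, 31); // 2016-12-31 00:00:00 UTC
const jan1  = Date.UTC(2017, 0, 1);   // 2017-01-01 00:00:00 UTC

console.log((jan1 - dec31) / 1000);   // 86400 — the leap second is invisible
```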

Can a Unix timestamp be negative?

Yes. Negative values represent instants before January 1, 1970. The value -1 is 1969-12-31 23:59:59 UTC. This tool accepts negative timestamps and converts them correctly. Whether a given database or API supports negative timestamps depends on the implementation — some systems store timestamps as unsigned integers and cannot represent pre-epoch dates at all.
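JavaScript's Date handles negative timestamps without any special casing:

```javascript
// Negative timestamps are ordinary instants before 1970.
console.log(new Date(-1 * 1000).toISOString());
// "1969-12-31T23:59:59.000Z" — one second before the epoch

console.log(new Date(-86400 * 1000).toISOString());
// "1969-12-31T00:00:00.000Z" — one full day before the epoch
```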

Why did JavaScript choose milliseconds instead of seconds?

When Unix was designed in the early 1970s, second-precision was adequate for practical use cases, and integer arithmetic was faster than fractional arithmetic on the hardware of the time. JavaScript, designed in the mid-1990s for the web, needed sub-second timing for animations and event handling from the start, so milliseconds were the natural choice. The two conventions were never reconciled into a single standard, which is why the seconds/milliseconds confusion persists across every codebase that mixes JavaScript with a backend written in another language.

Is my data sent anywhere when I use this tool?

No. All conversions happen entirely in your browser. The timestamps and dates you enter are never transmitted to any server, stored, or logged. This unix timestamp converter is a static client-side tool with no backend. No account or signup is required.