Time Converters — seconds, minutes, hours, days, milliseconds
Time conversions span five fundamental units that together cover everyday timekeeping, scientific and engineering measurement, sport-timing certification, computing and network latency, and biological-medical research. The second (s) is the SI base unit of time, defined since 1967 as exactly 9,192,631,770 periods of the Cs-133 hyperfine transition — the atomic-clock standard that anchors every other SI unit through traceable metrology. The minute (min) at exactly 60 seconds, the hour (h) at exactly 3600 seconds, and the day (d) at exactly 86,400 seconds preserve the Babylonian-Egyptian sexagesimal time divisions of the third millennium BC, with the modern SI second definition transitively fixing each non-SI multiple. The millisecond (ms) at exactly 0.001 second is the SI-derived submultiple essential for modern computing, sport-timing certification, signal processing, and ECG-interval analysis in cardiac medicine. Modern caesium-fountain atomic clocks achieve precision better than 1 part in 10^15, and optical-lattice clocks approach 1 part in 10^18 at the current frontier of national-metrology-institute primary timekeeping. Together the units span nearly eight orders of magnitude, from the millisecond (10^-3 s) to the day (8.64 × 10^4 s), with exact relationships throughout the hierarchy.
Units in this category
Seconds (s)
The second (s) is the SI base unit of time, defined since 1967 as exactly 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom (the Cs-133 hyperfine transition at 9.192631770 GHz). The 2019 SI redefinition preserved this atomic-clock definition. The recognised SI symbol is "s" (lowercase, always set upright, never italic).
Minutes (min)
The minute (min) is exactly 60 seconds, a factor inherited from the Babylonian sexagesimal time-division system and now fixed via the SI second. The recognised symbol is "min" with no following punctuation. The minute is not an SI unit but is recognised by NIST and the BIPM as a non-SI unit accepted for use with the SI.
Hours (h)
The hour (h) is exactly 3600 seconds (60 minutes of 60 seconds each), a factor inherited from the Babylonian-Egyptian sexagesimal time-division system and now fixed via the SI second. The recognised symbol is "h" (lowercase) under ISO 80000-3 conventions; "hr" appears in casual writing as a non-standard variant. The hour is not an SI unit but is recognised by NIST and the BIPM as a non-SI unit accepted for use with the SI.
Days (d)
The day (d) is exactly 86,400 seconds (24 hours × 3600 seconds per hour) as the civil day, a factor fixed transitively by the 1967 atomic definition of the SI second. The recognised symbol is "d" (lowercase) under ISO 80000-3 conventions. The day is not an SI unit but is recognised by NIST and the BIPM as a non-SI unit accepted for use with the SI.
Milliseconds (ms)
The millisecond (ms) is exactly 0.001 seconds (1/1000 of a second) by SI prefix definition. The recognised SI symbol is "ms" (lowercase, no spaces or punctuation). Since the 2019 SI redefinition the millisecond inherits the atomic-clock-based second definition, with sub-millisecond precision available through atomic-clock-based timing systems.
Weeks (wk)
The week (wk) is exactly 7 days = 168 hours = 10,080 minutes = 604,800 seconds via the SI-second-anchored civil day of 86,400 seconds and the seven-day-cycle convention inherited from ancient Babylonian astronomy. The factor is exact rather than measured. ISO 8601 formalises the week-based date format, with the week starting on Monday and weeks numbered 1 through 52 or 53 within each calendar year.
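Because every unit above relates to the second by an exact factor, conversion between any pair can route through the second. A minimal Python sketch (the table and function names are illustrative, not part of any standard library):

```python
# Exact number of SI seconds per unit. Every factor is exact, fixed
# transitively by the 1967 atomic definition of the SI second.
SECONDS_PER_UNIT = {
    "ms":  0.001,    # millisecond: SI prefix, 1/1000 s
    "s":   1,        # SI base unit
    "min": 60,       # 60 s
    "h":   3600,     # 60 min x 60 s
    "d":   86400,    # 24 h x 3600 s
    "wk":  604800,   # 7 d x 86,400 s
}

def convert(value, from_unit, to_unit):
    """Convert a time value by routing through the SI second."""
    seconds = value * SECONDS_PER_UNIT[from_unit]
    return seconds / SECONDS_PER_UNIT[to_unit]
```

For example, `convert(1.5, "h", "min")` returns 90.0 and `convert(1, "wk", "s")` returns 604800.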
History of time measurement
Time measurement traces from ancient Egyptian and Babylonian astronomy through twentieth-century atomic-clock standardisation. The 24-hour day was established by ancient Egyptian astronomy in the second millennium BC, with the day divided into 12 daylight hours and 12 nighttime hours. The Babylonian sexagesimal system divided each hour into 60 minutes and each minute into 60 seconds. The unit names derive from medieval Latin: "pars minuta prima" (first small part) for the minute and "pars minuta secunda" (second small part) for the second. The astronomical second was long defined as 1/86,400 of a mean solar day, the working standard of the mechanical-pendulum-clock era; from 1956 to 1967 the SI briefly used the ephemeris second, a fixed fraction of the tropical year 1900. The atomic redefinition came at the 13th CGPM in 1967, replacing the astronomical definition with the Cs-133 hyperfine-transition atomic-clock standard. Modern atomic-clock precision better than 1 part in 10^15 (caesium fountains) and approaching 1 part in 10^18 (optical-lattice clocks) sets the foundational standard for every SI unit globally. The 2019 SI redefinition preserved the atomic second.
Where time conversions matter
Time conversions appear across every measurement, scientific, computing, sport-timing, and biological-medical discipline. Atomic-clock metrology and GPS systems use the second as the SI-canonical primary, with caesium-fountain primary standards at NIST, NPL, PTB, and NMIJ achieving 1 part in 10^15 precision and optical-lattice clocks (Sr-87, Yb-171) approaching 1 part in 10^18; GPS satellites carry atomic clocks for nanosecond-precision timing. Everyday timekeeping uses minutes and hours universally — every clock, watch, smartphone, microwave timer, oven timer, transportation schedule, and meeting-scheduling system. Sport-timing certification uses millisecond precision for World Athletics records (Usain Bolt's 100 m at 9.58 s, Eliud Kipchoge's marathon world record at 7269 s). Computing and network latency use milliseconds universally — typical web-page loads of 1000-3000 ms, network pings of 5-300 ms, database queries of 1-100 ms, the 4G LTE TTI at 1 ms. Cardiac-medicine ECG monitoring uses milliseconds for QRS duration (80-120 ms normal) and QT interval (350-450 ms normal). Employment and payroll use hourly wage rates universally (US federal minimum $7.25/hour; UK National Living Wage £11.44/hour for ages 21+ in 2024). Astronomical observation and space-mission scheduling use days for orbital mechanics and calendar systems. Biological-medical research uses days for medication-dose intervals, clinical-trial follow-up timepoints, and pregnancy gestational-age tracking. Industrial-process specifications use minutes for cycle times, hours for shift scheduling, and days for production cycles across manufacturing, food-and-beverage, and chemical-processing industries. Cross-disciplinary engineering documentation preserves both the SI-canonical second and the everyday sexagesimal multiples for the appropriate audience and precision context.
How to convert time units
Time-unit conversion runs against the SI second as the primary reference, with each non-SI unit related to the second by an exact factor: 1 ms = 0.001 s, 1 min = 60 s, 1 h = 3600 s, 1 d = 86,400 s, all exact. Cross-conversion between non-SI units uses the directly tabulated factors: 1 min = 60 s, 1 h = 60 min = 3600 s, 1 d = 24 h = 1440 min = 86,400 s. Because the system is sexagesimal, converting a decimal fraction of an hour to minutes requires multiplication by 60 (not 100): 1.5 hours = 1 hour 30 minutes (0.5 × 60), not 1 hour 50 minutes. The conversion factors are exact, since the 1967 SI atomic-second definition transitively fixes every non-SI time multiple. The apparent solar day varies by roughly ±30 seconds across the year due to Earth's elliptical orbit and axial tilt; separately, the IERS leap-second system absorbs the slow cumulative drift between the 86,400-second civil day and Earth's actual rotation, keeping UTC within ±0.9 s of UT1. The conversion factors are universal across modern timekeeping with no jurisdictional variation.
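The sexagesimal pitfall above (multiply the decimal fraction by 60, not 100) can be made concrete in a short Python sketch; the function name is illustrative:

```python
def hours_to_hms(decimal_hours):
    """Split decimal hours into whole hours, minutes, and seconds.

    The fractional part is scaled by 60 (sexagesimal), never by 100:
    1.5 h -> 1 h 30 min, not 1 h 50 min.
    """
    total_seconds = round(decimal_hours * 3600)  # work in exact seconds
    hours, remainder = divmod(total_seconds, 3600)
    minutes, seconds = divmod(remainder, 60)
    return hours, minutes, seconds
```

For example, `hours_to_hms(1.5)` returns `(1, 30, 0)` and `hours_to_hms(2.75)` returns `(2, 45, 0)`.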
All time conversions
Frequently asked questions
How many seconds in a minute?
One minute equals exactly 60 seconds, a factor fixed via the SI second since the 1967 atomic definition. The relationship is exact and unchanged across every modern timekeeping context, with the SI second anchored to the Cs-133 hyperfine-transition atomic-clock primary standard. The "1 min = 60 s" factor is the canonical sexagesimal time division preserved unchanged from ancient Babylonian astronomy.
How many minutes in an hour?
One hour equals exactly 60 minutes by SI definition. The relationship is exact and unchanged across every modern timekeeping context. The "1 h = 60 min = 3600 s" reference is the canonical Babylonian-Egyptian sexagesimal time-division preserved unchanged from the third millennium BC into the modern SI second-anchored system.
How many seconds in an hour?
One hour equals exactly 3600 seconds (60 minutes × 60 seconds) by SI definition. The relationship is exact and unchanged across every modern timekeeping context, with the underlying SI second-definition fixed by the Cs-133 hyperfine-transition atomic-clock primary standard. The factor is universal across modern timekeeping with no jurisdictional variation.
How many hours in a day?
One civil day equals exactly 24 hours = 86,400 seconds by SI definition. The civil-day at 86,400 s differs slightly from the astronomical solar-day (which varies seasonally and averages 86,400.002 s due to Earth's elliptical orbit and tidal slowdown), with the IERS leap-second system absorbing the difference to maintain UTC within ±0.9 s of UT1.
How many milliseconds in a second?
One second equals exactly 1000 milliseconds by SI prefix definition. The thousandfold ratio is fixed and unchanging across every metric time-measurement context. Sub-second precision uses microseconds (μs, 10⁻⁶ s), nanoseconds (ns, 10⁻⁹ s), and picoseconds (ps, 10⁻¹² s) for higher-precision computing-and-electronics work.
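The decimal prefix chain (ms, μs, ns, ps) can be walked programmatically to pick a readable unit for a duration. A minimal sketch, assuming ASCII "us" stands in for μs; the function name is illustrative:

```python
# SI decimal submultiples of the second, largest first (symbol, factor in s).
SUBMULTIPLES = [("s", 1.0), ("ms", 1e-3), ("us", 1e-6), ("ns", 1e-9), ("ps", 1e-12)]

def format_duration(seconds):
    """Render a duration using the largest submultiple keeping the value >= 1."""
    for symbol, factor in SUBMULTIPLES:
        if seconds >= factor:
            return f"{seconds / factor:g} {symbol}"
    return f"{seconds / 1e-12:g} ps"  # below 1 ps, stay in picoseconds
```

For example, `format_duration(0.25)` returns `"250 ms"` and `format_duration(2)` returns `"2 s"`.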
How is the second defined?
The SI second has been defined since the 13th CGPM in 1967 as exactly 9,192,631,770 periods of the radiation corresponding to the Cs-133 hyperfine ground-state transition at zero magnetic field and 0 K. Modern caesium-fountain primary atomic clocks at NIST, NPL, PTB, NMIJ achieve precision better than 1 part in 10^15. The 2019 SI redefinition preserved this atomic-second as the foundational SI base unit.
Why does a minute have 60 seconds (not 100)?
The 60-second minute and 60-minute hour preserve unchanged the Babylonian sexagesimal (base-60) time-division system from the third millennium BC. The Babylonians chose base 60 because 60 has many divisors (1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60), making it convenient for fractional time-division. The system survived through Greek, Roman, and modern Western civilisations into the SI time-system unchanged.