Can someone explain why these two commands return different results:
> new Date(2019, 0, 1).getTime()
1546293600000
> Date.UTC(2019, 0, 1)
1546300800000
According to the MDN documentation of the getTime() method:
The getTime() method returns the number of milliseconds since the Unix Epoch.
getTime() always uses UTC for time representation. For example, a client browser in one timezone will get the same getTime() value as a client browser in any other timezone.
And the Date.UTC() method:
The Date.UTC() method accepts parameters similar to the Date constructor, but treats them as UTC. It returns the number of milliseconds since January 1, 1970, 00:00:00 UTC.
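To make the relationship concrete, here is a quick check (a small sketch; the numbers are the ones from my console in UTC+2 and will differ with your offset):

const local = new Date(2019, 0, 1);  // midnight Jan 1 2019, arguments interpreted in local time
const utc = Date.UTC(2019, 0, 1);    // midnight Jan 1 2019 in UTC, already a timestamp

console.log(local.getTime());                    // 1546293600000 (in UTC+2)
console.log(utc);                                // 1546300800000
console.log(local.getTime() - utc);              // -7200000
console.log(local.getTimezoneOffset() * 60000);  // -7200000, i.e. the gap is exactly the timezone offset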
Shouldn't they be the same?
What I'm really unsure about is how to calculate the difference between the year 1924 and today, because if we just add one year (in milliseconds) to a timestamp from 1924, the result is completely messed up. Here is an example of behavior I can't understand at all:
> new Date( new Date(1924, 0, 1).getTime() + 366 * 24 * 60 * 60 * 1000 )
Wed Dec 31 1924 23:57:56 GMT+0200 (Eastern European Standard Time)
Best Answer
new Date(2019, 0, 1)
creates a Date object representing the start of January 1, 2019 in your locally specified timezone. When converted to UTC, it will be a different time than the start of Jan 1 in UTC.
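You can see this by printing the same Date in both representations (a small sketch; the exact strings depend on your timezone, mine is CET in winter):

const d = new Date(2019, 0, 1);  // arguments interpreted in the local timezone
console.log(d.toString());       // e.g. Tue Jan 01 2019 00:00:00 GMT+0100 (Central European Standard Time)
console.log(d.toISOString());    // 2018-12-31T23:00:00.000Z -- the same instant expressed in UTC, not midnight Jan 1 UTC
console.log(new Date(Date.UTC(2019, 0, 1)).toISOString());  // 2019-01-01T00:00:00.000Z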
Running your second snippet to add one year gives the correct result for me. What browser or context are you running in?
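If you suspect a historical change in your zone's UTC offset is behind the odd 1924 result (this is only a guess at the cause, but the offsets of many Eastern European locations did change in the 1920s), you can compare the offset at both ends of the interval:

const start = new Date(1924, 0, 1);
const end = new Date(start.getTime() + 366 * 24 * 60 * 60 * 1000);

// getTimezoneOffset() is the offset in minutes (UTC minus local) at that instant.
// If the two values differ, the zone's offset changed somewhere in between, so adding
// a fixed number of milliseconds will not land on the wall-clock time you expect.
console.log(start.getTimezoneOffset(), end.getTimezoneOffset());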
new Date(2019, 0, 1)
creates an object in your local timezone, whereas Date.UTC(2019, 0, 1)
gives you the timestamp in UTC.
I am in CE(S)T, so there will be a difference of 1 hour, 3600000 milliseconds:
const timeInCEST = new Date(2019, 0, 1).getTime();
// 1546297200000
const timeInUTC = Date.UTC(2019, 0, 1);
// 1546300800000
const differenceBetweenMyTimeAndUTC = new Date(2019, 0, 1).getTime() - Date.UTC(2019, 0, 1);
// (I'm ahead of UTC, so the difference will be negative)
// -3600000
console.log(timeInCEST, timeInUTC, differenceBetweenMyTimeAndUTC);
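As for the difference between 1924 and today: rather than adding a fixed number of milliseconds per year, it is usually safer to compare calendar components, and using the UTC accessors avoids historical offset and DST surprises. A rough sketch of that idea (yearsBetween is just an illustrative helper, not a library function):

// Count full calendar years between two dates using UTC components.
function yearsBetween(a, b) {
  const years = b.getUTCFullYear() - a.getUTCFullYear();
  // Subtract one if b has not yet reached a's month/day in its final year.
  const beforeAnniversary =
    b.getUTCMonth() < a.getUTCMonth() ||
    (b.getUTCMonth() === a.getUTCMonth() && b.getUTCDate() < a.getUTCDate());
  return beforeAnniversary ? years - 1 : years;
}

console.log(yearsBetween(new Date(Date.UTC(1924, 0, 1)), new Date()));  // e.g. 95 when run in 2019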