I’m pretty sure it wasn’t Windows that was the main offender, but rather legacy systems of all kinds written since the 1970s, where people weren’t expecting their programs to still be running more than 30 years later.
Surprise! Businesses don’t care how old the code is, as long as it works - so the field you stored the year in only held two characters, and the "19" was hard-coded onto the front.
1999 would be written as 99. 19 + 99 = 1999 = computers were happy.
2000 would be written as 00. 19 + 00 = 1900 = computers went to shit.
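To make that concrete, here's a toy C sketch of the kind of thing those old record layouts did (the struct and field names are made up for illustration, not from any real system):

```c
#include <stdio.h>

/* Year stored as two digits -- all the space many old record layouts allowed. */
struct record {
    char year[3]; /* two digits plus terminator, e.g. "99" */
};

static void print_full_year(const struct record *r)
{
    /* The century isn't stored anywhere; "19" is simply hard-coded. */
    printf("19%s\n", r->year);
}

int main(void)
{
    struct record r1999 = { "99" };
    struct record r2000 = { "00" };

    print_full_year(&r1999); /* prints 1999 -- fine */
    print_full_year(&r2000); /* prints 1900 -- the Y2K bug */
    return 0;
}
```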
Yeah, good explanation. I was too young to have any real knowledge of this issue back then, and only saw it manifest when I had to adjust my Windows 95 clock :)
Next doom and gloom scenario is 2038, when poorly maintained *nix systems will overflow their signed 32-bit time counters and wrap back to December 1901 (or, depending on how they handle it, snap to the epoch, Jan 1st, 1970).
I’ll be pushing 68. Hopefully retired or dead by then.
… I’ll probably still be working, though…
Eh, it only being an issue for systems with a 32-bit time_t will hopefully help. But of course somebody will still be running those in 15 years.
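If anyone wants to see the 2038 wrap in actual numbers, here's a rough C sketch, assuming signed 32-bit time_t semantics and a gmtime() that accepts negative timestamps (glibc does):

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    int32_t last_ok = INT32_MAX; /* 2038-01-19 03:14:07 UTC, the last second a
                                    signed 32-bit counter can represent */
    int32_t wrapped = INT32_MIN; /* what the counter holds one tick later */

    time_t before = (time_t)last_ok;
    time_t after  = (time_t)wrapped; /* lands in December 1901 on systems that
                                        honour the negative value */

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&before));
    printf("last 32-bit second: %s\n", buf);
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&after));
    printf("one tick later:     %s\n", buf);
    return 0;
}
```

Systems that have already moved to a 64-bit time_t sail right past that moment, which is why it's mostly the old 32-bit installs people worry about.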