We're being short-sighted
-
[email protected] replied to [email protected] last edited by
Even if such parsers aren't used directly in critical systems, they'll surely be used in the supply chains of critical systems. Your train won't randomly derail, but disruptions in the supply chain can cause repair parts not to be delivered, that kind of thing.
And you can be certain such parsers are used in almost every application dealing with datetimes that hasn't been specifically audited or secured. 99% of software is held together with duct tape.
-
[email protected] replied to [email protected] last edited by
... any discrepancies in the use of "year" as a 4 digit number vs a 5 digit number, are entirely a display issue (front end).
That's exactly how I read the meme. It would still require a change.
Whether that is displayed to you correctly or not, doesn't matter in the slightest. The machine will function even if you see some weird shit,
I'm not sure if this is some nihilistic stuff, or you really think this. Of course nothing actually matters. The program will still work even if the time is uint32 instead of uint64. The machine of course will still work as well. Shit, your life will go on. The earth continues to spin and this will for sure not cause the heat death of the universe. But aside from actual crashes and some functionality bugs, UI issues should be the ones you worry about the most. If your users are a bank and they need to date the contracts, and you only offer 3 digits for the year? I think you'll agree with me that if users don't like using your program, it's a useless program.
-
[email protected] replied to [email protected] last edited by
True. But I wouldn't see this as extremely more critical than the hundreds of other issues we encounter daily in software. Tbh, I'd be glad if some of the software I have to use daily had more duct tape on it...
-
[email protected] replied to [email protected] last edited by
Look at this fucking piece of shit code, oh right, it's been written by a homo sapiens sapiens. No wonder they collapsed soon after.
-
[email protected] replied to [email protected] last edited by
This is for a 32-bit encoded epoch time, which will run out in 2038.
Epoch time on 64 bits will see the sun swallow Earth before it runs out.
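To put numbers on that, here's a small Python sketch of both claims: the exact moment a signed 32-bit epoch counter overflows, and roughly how long a signed 64-bit one lasts.

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit time_t can hold:
# 2_147_483_647 seconds since 1970-01-01 UTC.
INT32_MAX = 2**31 - 1

# The last representable moment before the counter wraps.
last_moment = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last_moment.isoformat())  # 2038-01-19T03:14:07+00:00

# A signed 64-bit counter holds roughly 292 billion years of seconds,
# comfortably past the sun swallowing the Earth (~7.5 billion years).
INT64_MAX = 2**63 - 1
print(INT64_MAX // (60 * 60 * 24 * 365))
```

The overflow instant, 2038-01-19 03:14:07 UTC, is why this is called the Year 2038 problem.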
-
[email protected] replied to [email protected] last edited by
Y2K was not fear mongering. There were a great many systems, in industrial, finance, and infrastructure applications, that definitely needed to be addressed. You know, the things that keep modern infrastructure running. Of course there were consumer-facing companies that took advantage of it, but that was small in comparison.
It ended up not being a disaster, because it was taken seriously.
-
Yup. Gentoo people are working on it as well. This is only a problem on 32-bit Linux too, right?
-
[email protected] replied to [email protected] last edited by
I think you might be underestimating the potential impact.
Remember the Crowdstrike Windows BSOD? It caused billions in damages, and it's the absolute best case scenario for this kind of issue. Our potential Y10K bug has a bunch of additional issues:
- you don't just have to patch one piece of software, but potentially all software ever written that's still in use, a bunch of which won't have active maintainers
- hitting the bug won't necessarily cause crashes (which are easy to recognize), it can also lead to wrong behavior, which will take time to identify. Now imagine hundreds of companies hitting the bug in different environments, each with their own wrong behavior. Can you imagine the amount of continuous supply chain disruptions?
- fixes have to be thought about and implemented per-application. There's no panacea, so it will be an incredible amount of work.
I really don't see how this scenario is comparable to anything we've faced, beyond Y2K.
-
You need to qualify your statement about Y2K being fear mongering. People saying all technology would stop (think planes crashing out of the sky) were clearly fear mongering or conspiracy theorists. People saying certain financial systems needed to be updated so loans didn't suddenly go from the year 1999 to 19100, or back to 1900, were not fear mongering. It's only because of a significant amount of work done by IT folks that we have the luxury of looking back and saying it was fear mongering.
Look at this Wikipedia page for documented errors. One in particular was at a nuclear power plant. They were testing their fix but accidentally applied the new date to the actual equipment. It caused the system to crash. It took seven hours to get back up, and they had to use obsolete equipment to monitor conditions until then. Presumably, if the patch hadn't been applied, this would have happened at midnight on January 1st, 2000 too.
Y2K was a real problem that needed real fixes. It just wasn't an apocalyptic scenario.
-
The comment you're replying to is really frustrating to me. It annoys me when people are so arrogant but also wrong. Do they live in a perfect world where nobody stores dates as ISO 8601 strings? I've seen that tons of times. Sometimes, it may even be considered the appropriate format when using stuff like JSON based formats.
-
Actual programmers wondering how you've never seen anyone use ISO 8601 strings as storage.
-
I've seen plenty of people use ISO 8601 for storage as well as display.
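One concrete reason people reach for ISO 8601 strings as a storage format, and why a five-digit year breaks them, is sortability. With fixed four-digit years, plain lexicographic string order equals chronological order; past year 9999 that property silently fails. A minimal Python illustration:

```python
# Fixed-width ISO 8601 strings sort correctly as plain text, which is
# exactly why they show up as storage in JSON payloads, logs, and DB keys.
dates = ["1999-12-31", "2000-01-01", "2038-01-19"]
assert sorted(dates) == dates  # lexicographic order == chronological order

# That guarantee breaks the moment a five-digit year appears:
dates_y10k = ["9999-12-31", "10000-01-01"]
print(sorted(dates_y10k))  # ['10000-01-01', '9999-12-31'] -- wrong order!
```

Any index, comparison, or range query relying on that string order would quietly misbehave, no crash involved.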
-
Probably some mainframe or something lol. Always a mainframe.
-
[email protected] replied to The Picard Maneuver last edited by
In other news, the colony Szinthar failed to update its software systems due to a lack of programmers and Techmancers. Signals received suggest there were no survivors.
-
[email protected] replied to The Picard Maneuver last edited by
Again?!
Rest of the world: I guess they overhyped that issue because nothing bad happened.
-
Planes crashing out of the sky wouldn't have been inconceivable. Say you have two air traffic control systems that are synchronizing - one handles dates with a modulo 100, another handles them in epoch time. All of a sudden the two reported time + positions of two different planes don't match up by a century, and collision projection software doesn't work right. I've seen nastier bugs than that, in terms of conceptual failure.
At no point is that a theory about a "conspiracy" either, IDK why you're bandying that term around.
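The mismatch described above can be sketched in a few lines of Python. The function names here are hypothetical, purely for illustration: one system keeps only the last two digits of the year (mod 100), the other keeps the full year, and a naive reconciliation between them falls apart by a century at the rollover.

```python
def system_a_year(full_year: int) -> int:
    # System A stores years modulo 100 (two digits).
    return full_year % 100

def reconcile(a_year: int, b_year: int) -> bool:
    # Naive check that two records describe the same year.
    return a_year == b_year

b = 2000                  # system B: full year from epoch-based time
a = system_a_year(b)      # system A: 2000 % 100 == 0
print(a, b, reconcile(a, b))  # 0 2000 False -- the records no longer match
```

Before 2000 this happened to work by accident whenever both systems only ever compared two-digit values; the rollover is what exposed the inconsistency.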
-
Years: [YYYY] or [±YYYYY]
ISO 8601 prescribes, as a minimum, a four-digit year [YYYY] to avoid the year 2000 problem. It therefore represents years from 0000 to 9999, year 0000 being equal to 1 BC and all others AD, similar to astronomical year numbering. However, years before 1583 (the first full year following the introduction of the Gregorian calendar) are not automatically allowed by the standard. Instead, the standard states that "values in the range [0000] through [1582] shall only be used by mutual agreement of the partners in information interchange".[20]
To represent years before 0000 or after 9999, the standard also permits the expansion of the year representation but only by prior agreement between the sender and the receiver.[21] An expanded year representation [±YYYYY] must have an agreed-upon number of extra year digits beyond the four-digit minimum, and it must be prefixed with a + or − sign[22] instead of the more common AD/BC (or CE/BCE) notation; by convention 1 BC is labelled +0000, 2 BC is labeled −0001, and so on.[23]
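Mainstream date libraries mirror that four-digit minimum, which is part of the problem. As a quick Python sketch: the standard `datetime` module caps out at year 9999, and the expanded ±YYYYY form has to be produced by hand (the `expanded_year` helper below is hypothetical, following the convention quoted above that 1 BC is +0000 and 2 BC is −0001).

```python
from datetime import MAXYEAR

# Python's datetime mirrors ISO 8601's four-digit minimum: years end at 9999.
print(MAXYEAR)  # 9999

def expanded_year(astronomical_year: int, digits: int = 5) -> str:
    """Format a year in the expanded +/-YYYYY form (digit count by agreement)."""
    sign = "-" if astronomical_year < 0 else "+"
    return f"{sign}{abs(astronomical_year):0{digits}d}"

print(expanded_year(10000))  # +10000
print(expanded_year(-1))     # -00001  (astronomical year -1, i.e. 2 BC)
```

Note that the number of extra digits is itself only valid "by mutual agreement", so two parties using different widths would disagree on what the same string means.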
-
[email protected] replied to The Picard Maneuver last edited by
The two most difficult things in programming: dealing with time, naming things, and boundary conditions.
-
[email protected] replied to [email protected] last edited by
Amazon is already testing robotic loaders, self driving trucks are already in development, and vending machines retail everything in Japan.
-
I'm 100% with you - it's the dangerous level of knowledge where someone understands the technical background for the most part, but is lacking real world experience. Reminds me of the blog posts titled "Misconceptions programmers have about X" - almost everything we touch in IT is complicated if you get deep enough.
But their style of commenting really jibes with Lemmy on technical topics. I can't count the number of posts where people proudly shout fundamentally wrong explanations for current AI models, yet any corrections are downvoted to oblivion. It's not as bad on non-AI topics, but I can't imagine anyone in the field reading GP's comment and agreeing...