Only somebody unaware of whose "Congress speech received multiple standing ovations, touted 'most by any world leader'" would be surprised by the bipartisan support you mention. That happened in 2024, before Trump began his second term, but it shows how the system works.
Another advantage of Pascal was that programs written in it crashed much less, which also made development much safer on the machines of that era, which had no memory-write protection.
And that safety during development translated into a less crashy product, too.
Pascal is very much like a managed language but without GC or borrow checker. It's not formally memory-safe, but its syntax discourages a developer from playing with fire unless it's really needed.
Additionally, all the flaws around it being designed for teaching, and the rise of incompatible dialects, were already addressed by 1978 with Modula-2, which Niklaus Wirth then created, applying the lessons from Mesa at Xerox PARC.
Later, we also got the managed-language genealogy, via the Modula-2+ branch and Niklaus Wirth's own Oberon variants, plus dialects inspired by them.
Nowadays GCC has Ada, Modula-2 and Algol 68 as official frontends, and we have Free Pascal and Delphi.
Then there are all the other modern languages that drew some inspiration from this history.
Thus we as an industry aren't lacking alternatives.
It sorta did as Ada, tho Ada is a much bigger language than M2 (or Pascal)[1]. There was at least one Ada-83 compiler for DOS (Janus), but it was a tight fit and a miserable experience. To me, the missed opportunity was Modula-3; much of what was nifty about Ada (and other things) in a smaller package.
[1] People often forget how compact a language Pascal (and somewhat M2) is. It comfortably self-hosts on 8-bit machines with a few dozen K (not M) of memory. It does an OK job even on the 6502 (admittedly, p-code). There was even a cross compiler for the 8051.
I think that D and C# are the rightful descendants of Modula-3, and many are still unaware of how much C# has improved for low-level programming tasks.
Unfortunately its adoption window has passed, although there is someone keeping it going on GitHub, based on the official Critical Mass compiler that once existed.
sizeof s (fun fact: you can omit the parentheses because it's not a function, it's an operator) reports the size of the array s, which is a variable of automatic storage duration.
Consider the following program:
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char * const s = "gogo\x00gogo";
    const char t[] = "gogo\x00gogo";
    printf("%d,%d,%d,%d\n",
           (int)sizeof s,
           (int)strlen(s), // DO NOT EVER USE THIS FUNCTION
                           // YOU WILL BE FIRED IF YOU DO
           (int)sizeof t,
           (int)strlen(t)  // DO NOT EVER USE THIS FUNCTION
                           // YOU WILL BE FIRED IF YOU DO
           );
}
> 8,4,10,4
That's what Doctorow calls "a cartoonish vision of markets in which “the customer is king” and successful businesses are those who cater to their customers." In reality, the capitalists also don't care, _when they can get away with it._
"To understand whom a platform treats well and whom it abuses, look not to who pays it and who doesn’t. Instead, ask yourself: who has the platform managed to lock in? "
"Smearing" is much more sensible approach for the use cases where it is implemented. There is simply no solution that is optimal for every use case, and insisting on using just one approach everywhere is unreasonable.
We stop pretending that universal time exists and attach a location to every timestamp, relative to some predefined clock at a known location. The issues around when something happened disappear.
If you stop pretending that universal time exists, then each timezone would need its own NTP stratum-0 atomic clock. Higher-stratum NTP servers would have to be told which timezone you wanted, and would only return time sourced from whichever country's national labs count for that timezone, and those labs would not be allowed to compare each other's times.
Each clock would run at a slightly different rate, based on imperfections in the equipment, the altitude of the laboratory, and the amount of mass in its vicinity. National clocks would drift relative to each other, and so would everybody's timestamps. CET would be perhaps +1:00:00.000002 ahead of GMT rather than exactly +1:00.
You'd have to continually measure and publish this drift in some kind of timezone-to-timezone comparison service, so that people who make network connections across the world don't end up finding that packets appear to arrive before they've been sent (due to national clock drift).
It's an interesting thought experiment. But I prefer what we actually have: all the world's labs working together to produce a consensus universal time, to which we just add fixed political offsets.
I like the idea of reference frames. But it isn't just geographic. The Unix clock on a Starlink satellite is doing double-digit Mach and goes horizon to horizon in minutes. GPS satellites have always had to account for relativity, so we could look to that system... Hmm, it seems GPS time doesn't add leap seconds, only keeping those already applied when it started ;)
Sure, you can define the reference frames to allow any orbit on any body. Mars landers & orbiters have slightly different time due to general & special relativity than Earth's surface, and also don't need Earth days/months/years, for example.