The interesting NABU was IMHO not these, but the NABU 1200, an early 8086 Unix machine. I got one at a garage sale in 1993 or so. It worked, and its Microsoft Xenix 1.0 was a direct port of V7 Unix from Bell Labs, quite educational precisely because it was still simple and understandable compared to the workstation OS's of the time.
Proprietary no-source OS's were still common then, so binary patching the kernel to put in a different hard disk parameter table (to use a luxuriously large 20MB drive in place of the ST412 the machine came with - precious! Must not mess with the irreplaceable original OS image) was undaunting, especially with a .h file handy that gave the structure. Compiling "elvis" to get vi in the absolutely stripped-down Minix mode, which used 63K of the maximum 64K of code space that executables could use... fun times. Of course, back then you still had a hope of compiling current C with an ancient pre-ANSI K&R C compiler. Most stuff that I ran on the machine didn't need much porting.
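The binary-patch trick described above amounts to: take the struct layout from the .h file, find the old geometry's bytes in the kernel image, and overwrite them with the new drive's. Here's a minimal sketch of that idea; the field layout, the offset, and the geometries used are assumptions for illustration, not the actual Xenix table:

```python
# Hypothetical sketch of patching a disk parameter table in a kernel image.
# The struct layout below is invented (a plausible shape a Xenix .h might
# describe), NOT the real one.
import struct

# Little-endian: uint16 cylinders, uint8 heads, uint16 sectors per track.
LAYOUT = "<HBH"

def patch_geometry(image: bytes, old, new) -> bytes:
    """Locate the old drive's parameter bytes in the image and replace them."""
    old_blob = struct.pack(LAYOUT, *old)
    new_blob = struct.pack(LAYOUT, *new)
    offset = image.find(old_blob)  # find the table by its byte pattern
    if offset < 0:
        raise ValueError("old parameter table not found in image")
    return image[:offset] + new_blob + image[offset + len(old_blob):]

# ST-412 geometry: 306 cylinders, 4 heads, 17 sectors/track (~10 MB)
st412 = (306, 4, 17)
# A 20 MB drive such as the ST-225: 615 cylinders, 4 heads, 17 sectors/track
st225 = (615, 4, 17)

# Stand-in for the kernel binary; in reality you'd read the file from disk.
kernel = b"\x00" * 32 + struct.pack(LAYOUT, *st412) + b"\x00" * 32
patched = patch_geometry(kernel, st412, st225)
print(struct.unpack_from(LAYOUT, patched, 32))  # (615, 4, 17)
```

In practice you'd do this with a hex editor or a tiny C program rather than Python, and you'd work on a copy of the kernel image, exactly as the comment's "must not mess with the irreplaceable original" warns.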
You know what I would truly love out of an AI/LLM:
A crawler across everything tech which takes a comment such as yours, and then parses out all the systems/people/code/languages/companies/timeframe and builds a really good history of computing.
That would be absolutely beautiful.
AI keeping its own evolutionary tree documented... and turned into a teaching platform.
I think "really good" is where we cross the point from LLM to AGI, in that it can't just be a fancy autocomplete. It has to have a decent model of readers to test various prose and structure options against, to figure out which ones are "really good" for particular readers.
Considering the number of people making up s** here by accident or on purpose, this would demand a very sophisticated level of caution and plausibility-checking, as well as the ability to follow the discussion and interweave corrections. This would demand attention to detail and an understanding of human interaction at least on the level of a proper human historian, IMHO.
But wouldn't it be better to focus the "attention to detail" on the details, and let the AI handle the fluff explanations around them? For example, restrict the model to only research certain aspects of the prompt's intent...
So get a listing of languages, companies, technologies, industries, resources, and supply chains that can be directly tied to the evolution of computing, and let it web them together?
Truly dystopian wish. In your vision, knowledge would stop at the day we have that tech. Nothing new would be added to the repository those LLMs parrot from.
For me, I'd rather we go back to having the time and energy to write decently about our history, so others can learn from it in the future.
The paragraph you've provided is rich with technology details, which offer a good opportunity to delve into the history of computing. Here's a summary of the key technologies, people, and companies:
1. *NABU 1200:* This is a somewhat obscure early PC built around the Intel 8086 processor. The NABU Network was a Canadian computer network that existed from 1983-1985. NABU PCs were designed for home use and offered unique connectivity options for the time.
2. *8086 Unix machine:* Intel's 8086 was a 16-bit microprocessor designed in the late 1970s, which was the basis for the x86 architecture. UNIX, a powerful multi-user, multi-tasking operating system developed in the late 1960s and early 1970s at Bell Labs, was often used in early computers with this processor.
3. *Microsoft Xenix 1.0:* This was Microsoft's version of the Unix operating system, licensed from AT&T in the late 1970s and released in 1980. Xenix was a direct port of V7 Unix from Bell Labs. The "1.0" suggests this is the earliest version of Xenix.
4. *Bell Labs:* The historic research and development company responsible for a host of important technologies and programming languages, including Unix, the C programming language, and others.
5. *ST412 hard drive:* This was an early hard disk drive produced by Seagate Technology in the early 1980s. It was among the first 5.25-inch hard drives (following Seagate's ST-506) and had a formatted capacity of about 10 megabytes.
6. *K&R C Compiler:* This refers to the original version of the C programming language as defined by Brian Kernighan and Dennis Ritchie (hence "K&R") in their classic book "The C Programming Language" (1st edition published in 1978). It predates the ANSI C standard.
7. *Elvis and vi:* Elvis is a clone of vi, a visual text editor for Unix written by Bill Joy at Berkeley in 1976. The reference to "compiling 'elvis' to get vi in the absolutely stripped down Minix mode" refers to building Elvis from source in its minimal configuration intended for Minix, a Unix-like system designed for teaching purposes, so that the result fits within very tight code-space limits.
8. *Minix:* A Unix-like operating system created by Andrew S. Tanenbaum for educational purposes. The first version of Minix was released in 1987.
Overall, this paragraph presents a snapshot of a transitional time in computing, when PCs were still relatively new, Unix was becoming more common on these machines, and software had to be patched and compiled from source code to run on specific systems. The timeframe seems to be early to mid-1990s, considering the user bought the NABU 1200 at a garage sale around 1993.
Are there many of these still existing? If I'm not set on having something Canadian, which would be a good model to look into? Would it be a good way to introduce a kid to computers / Unix / programming?
I learned to code on a TRS-80 Model 100, in BASIC, when I was 7 years old or so. Nothing Unix-like about it, but as an introduction to programming concepts I got a lot of mileage out of it. The upside is, there are no distractions, and it basically doesn't do anything except let you write text or short BASIC programs. It's essentially a big keyboard with a little chunky LCD display and it runs a long time on AA batteries. I dragged it everywhere with me as a kid, along with the BASIC manual, trying to figure out how to get it to do things. The portability was a big plus, cause I could just keep experimenting whenever I had ideas.
There's no point in running antiques any more, not when, for example, you can buy a reproduction PDP-11/70 front panel driven by a Raspberry Pi running an emulator running genuine V7 Unix!
You can probably emulate that pretty easily with a Raspberry Pi, which has vastly more processing power, but I guess I understand how it's just "not the same".