1991 was the vibrant, exciting, crazy "adolescence" of the PC age and well into the period where it was cool to have a desktop PC and really learn about it.
Phones are dominant now and have passed the PC generation by - in number, not capability. The concept of copy/paste/save for arbitrary data lives on for the non-tech masses only in the form of screenshots and screen recording features.
The thing that stands out to me looking back over a few decades is how much of consumer/public computing is exploring the latest novel thing and companies trying to cash in on it. Multimedia was the buzzword aeons ago, but it was a gradual thing: increasing color depth and resolution, video, 3D rendering, storage capable of local playback, sound going from basic built-in speaker beeps to surround and spatial processing. Similar with the internet, from modems to broadband to being almost ubiquitously available on mobile. Or stereoscopic 3D, or VR, or touchscreens, or various input devices.
Adolescence is a very good word to encompass it: lots of awkward experiments trying to make the latest thing stick, with some of them getting discarded along the way when we grow out of them, they turn out not to be (broadly) useful, or fashion moves on. What I wonder about is whether the personal computer has hit maturity now and we're past that experimental phase; for most people it's an appliance. Obviously you can still get a PC and treat it as a workstation to dive into whatever you're enthusiastic about, but you need to specifically go out and pursue that. Where the ecosystem might be lacking is a bridge between the device most people have as their personal computer (phone/tablet) and something that'll introduce them to other areas.
If it were a powerful, useful device that I could load my own software onto and make programmable without jumping through a bunch of hoops, instead of the ad-laden crapware that resulted from primarily two megacorps duking it out over how to best extort billions from app developers and users for their own benefit, then sure, I'd agree.
But phones aren't awesome little PCs, they're zombifying the majority of the public. They also, incidentally, are insidious little snitches busy at work trying to monetize every single thing about our daily lives.
> But phones aren't awesome little PCs, they're zombifying the majority of the public. They also, incidentally, are insidious little snitches busy at work trying to monetize every single thing about our daily lives.
Yes, and corporations are doing all the same stuff to our PCs as well.
If you think a developer-mode switch on your smartphone that enabled shell access and a build env would stop "the majority of the public" from "zombifying", either you need to talk with more of "the majority of the public", or I've been talking to the wrong "majority of the public".
The general public doesn't know how to program. They don't know what variables are, that they have types, they think functions are what rich people call a dinner party or corporate event. On computers, where there are no such restrictions, the majority of the public haven't suddenly become hobbyist programmers in their spare time.
If you're so blinded by hate because there are hoops (which there absolutely are) that you refuse to jump at all, not even a little bit, simply on principle, I mean, you do you. Meanwhile, there are people who aren't the majority of the public, but who want to do things, and who are able to get into tech and learn to code despite the epic of Apple vs Google vs Gilgamesh flattening towns. It would be great if it were easier because the phones were more open, but at some point you gotta go with the serenity prayer.
There's definitely a mismatch between what you inferred I meant and what I really mean. We agree that the majority of people are not going to suddenly stop being zombies if the platform were more open for development. It's a complex societal issue driven by the media atmosphere and the attention economy, and it affects all platforms. But smartphones are the platform that seems most extremely affected, and it's definitely accelerated by the locked-down, content-consuming, ad-laden nature of everything the platform drives them to do. Nothing about the interaction mode of a touchscreen phone lends itself to deep work particularly well, and on top of that all the platforms' incentives push away from it again.
> If you're so blinded by hate because there are hoops (which there absolutely are), and you refuse to jump at all
It's not necessary to bring that energy to HN, and I'm going to nope right on out at the point you accuse me of not being technical enough.
It's not a question of being technical enough. It's actually a question of being too technical. You're here, which for me garners you a base level of respect, if only because I've had the creators of certain subjects of discussion respond directly to my grousing here.
Because you're technical, the iOS restriction that code must be signed seems insurmountable, because it is. But if you knew less about computers, you'd find bitrig or Swift Playgrounds or Pythonista. And knowing even less, you'd get into building web apps. For what people want to do and create (they don't know frontend from backend and are just getting their feet wet), a phone does alright.
Could it be better at it? Absolutely, no question about that! But so could everything else in life. It depends on where on the spectrum you exist. A laptop is better than a phone for writing code for a lot of reasons, but when we're looking at the bigger picture, a phone is better than nothing.
> Phones are dominant now and have passed the PC generation by - in number, not capability.
And I'm saying phones have passed PCs in capabilities. Don't put words in my mouth; not all of them, obviously. I'm just pointing out that a desktop with a 5090 and a 42" widescreen monitor doesn't fit in my pocket, and that fitting into my pocket is a capability that some people value.
Oh fuck you, I just spent $1,500 I didn't have on Amazon for one of those! I've been waiting forever for them to make one with a fingerprint sensor, and I thought you were responding to a different comment, so I looked it up. Thank you :)
That's not remotely true. The only person I've ever seen in public using something like https://www.newegg.com/p/3C6-018V-01637 to STAND in line while using a laptop is me.
Depending on where personal/portable AI devices go, in 10 years phones might be significantly different or might not exist as they do today.
There might be a resurgence of some kind of device like a PC.
Seeing iPadOS gain desktop features, and macOS starting to adopt more and more iPadOS-type features, clearly shows that the desktop, laptop, and tablet experiences will be merged at some point, by Apple at least.
I think it'd be biased more in the direction of the iPad. If anything, there's one feature Apple's trying to avoid, and that's macOS's waning ability to run third-party binaries.
There's a double-edged sword here, and I'm not even talking about DRM.
I would love to be able to decide, myself, whether it's fine to capture the screen for an application, but I'd also love to protect myself and my non-tech-savvy relatives from accidentally sharing sensitive info (e.g. banking) when screen casting.
In my opinion there should be a waiver buried deep in the settings that allows me to disable such protections, with a grace period of a week or so. The grace period is crucial because scammers are able to make people do virtually anything under stress and time pressure.
No, it's also iOS that's arbitrarily restricting it. I opened a bare .webm directly in Safari and got nothing on long press and nothing in any of the control widgets to save it.