I am not strictly entitled to answer this but I will just in case.
(Language is a bit different in Australia.)
I completed a Bachelor CS degree in 1995. I think that's a "CS major program".
It was very theoretical, in that the languages we learnt were too old, too new, and not industry-led. So, Eiffel for OO, Cobol(!), and some proper maths thrown in.
It got me a solid 25 years of work.
After about a five year gap in software development as a job, I am now doing a Masters of Computer Science at the same place (by name alone, maybe) and the tech they teach is ten years old.
I'm not averse to this so far. I finish in a year, and I'll know if it was a waste of time to get back into the industry then.
However, I have done six of the twelve subjects and they ALL filled gaps in my understanding from both my original Bachelor and my work experience. I am a better programmer now.
I am currently in an interview process where I surprised myself with my own knowledge. YMMV of course.
"But is this shift actually worth worrying about? Or are younger people just projecting their own anxieties about screen time onto their parents and grandparents?"
False dichotomies can either be the worst thing that happened to humankind or a pathway to a new way of understanding each other.
I'm still using a 2010 MacBook Pro with a 1TB SSD for Logic Pro and MainStage. Does it struggle? Yes. Does it work? Yes. It's still amazing technology that makes my keyboards and guitars sound bananas. To be fair, I just muck around with it, but it still offers more than I'll ever need or be able to discover.
This is tongue-in-cheek, but you spent years in management because "the thought of spending your life staring at a screen and dealing with insignificant minutia seemed horrible?" I need to read your management book!
It’s a lot of 1:1s and talking to people directly and strategy about setting up performant teams. I enjoy it way more and don’t spend a lot of time looking at screens.
Tried to play a free-for-all card game with blank cards with friends in a bar thirty years ago. It was too far out for the group. Writing the rules for a game during your own turn is pretty great. But if there isn't an improv idea of "and then" among the group the game won't work. It's certainly not about winning :)
Not sure if this has been posted (I see stephenwoo has mentioned him further down), but it's a break-down of how sugary foods damage the body, particularly fructose.
It's 16 years old and covers about 30 years of previous research.
I haven't had a _terrible_ UI experience with Win 11 that Apple hasn't put me through already. But it took away my sideways toolbar. I don't click anything that loads Edge, like "Show me more from the web" type links, so I don't see ads. I use Firefox and Thunderbird.
The telemetry all the way through the operating system is ethically awful, but I'm invested in and familiar with Windows and Office. Not being able to make Copilot disappear is annoying.
However, the games and software that work for me on Windows won't necessarily work on Linux. I am not interested in making a political stand and giving up abilities and features I currently have.
Yeah, I think it's just a matter of what you know. I recently got a Mac for work and the UI is horrid, and I have no idea how people put up with it. It's as if it were designed by people who never had to use it afterwards. But clearly it works for others, so I strongly suspect it's just personal bias here.
Maybe I'm still in denial about the benefit of AI code design, but when I have an initial set of requirements for a program, the design begins. That is just a set of unanswered questions that I address with slowly growing code and documents. The final documents and code then match the answers to all the questions that arose from the answers to previous questions. More importantly, I know how the code answers them, and someone else can learn from the documentation.

Since the invention of "velocity" I feel like much of the industry treats code and programmers like tissues: wipe your nose and throw it away. Now we have AI-based automatic tissue dispensers, and Weizenbaum's gripe about programmers creating work for themselves rather than solving the requirements of the actual problems continues.