The Developer's Dystopian Future (the-pastry-box-project.net)
475 points by bjxrn on July 6, 2014 | 201 comments


So I'm in the 40+ crowd, and things are different from when I was a 20-something. But learning new stuff isn't a problem. I pulled Angular into the new project I'm building at work. I've moved from C to C++ to Java to Python and am now playing with Racket (Python generators have nothing on define-syntax :).
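(For anyone who hasn't run into them: a Python generator is a function that lazily yields a sequence of values, while Racket's define-syntax rewrites code at compile time, which is why the comparison is a bit of a joke. A minimal generator sketch, purely illustrative:)

```python
def countdown(n):
    """Lazily yield n, n-1, ..., 1; nothing runs until a value is requested."""
    while n > 0:
        yield n
        n -= 1

# The whole sequence is never materialized unless you ask for it.
print(list(countdown(3)))  # → [3, 2, 1]
```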

My point is not to say I'm better than the author. I'm nothing special. My point is that there isn't a magical end-of-life for developers. Learning changes. Interests change. Priorities change. Nothing to be afraid of.

And there are positives: I'm much better at prioritizing. I don't waste my time on "the next hotness" just to watch it tank in obscurity three months later. I'm good at understanding what infrastructure is important and what isn't. My hunches have gotten better.

What scares me about this post has nothing to do with the author. I worry more about it propagating the myth that developers have a limited lifespan, that they expire. That's a falsehood that needs to die.


"My hunches have gotten better" is perhaps the best description of the value of experience.

I am 38 myself, and sometimes, when I am surrounded by a new batch of fresh 20-somethings, I feel a little like the author of the original article. They need only a couple of hours to understand a new framework or tool to an extent that I feel would take me days or weeks. And they think nothing of burning the midnight oil for whatever reason, while I really have to get home in time to pick up the kids from daycare.

But then they suddenly make a design decision I just know is never going to work out. Not because I'm smarter (it's probably the other way around), but because I saw a similar mistake being made 10 years ago. Or maybe it's not even that explicit, and a certain solution just "feels" wrong even before I can clearly articulate why.


That always amazed me when I worked with younger peers. They could pick up the new JS framework/build tool/library-du-jour and whip together an amazing, reactive, fluid GUI in an afternoon. I'd try to cobble together a simple "Hello world!" with the same technologies and throw my hands up after a few days, proclaiming it to be the worst invention since the dust jacket.

But I taught them about unit testing, data structures, and algorithms. In return for making their programs easier to maintain and faster, they answered my dumb questions about crazy JS frameworks. Eventually I did get "Hello world!" to appear on my browser screen.

I often wonder why it's so complicated but they assure me it's for good reasons.


Yeah, very true. I'd also say that learning new things has only gotten easier with experience. Not only can I get up to speed faster, but it's often easier to spot the pitfalls of a new technology before even using it.


Until cybernetics becomes more mature and cost-effective, I'm pretty sure we all have a limited lifespan.

What's funny is that you see this article as a kind of threat, while the author is voicing his concern about the advance of technology for technology's sake, in contrast to his shrinking interest in learning additional skills over time.

If you think about it, it's kind of depressing. Like a cobbler having to learn how to repair a newly invented shoe every couple years, constantly purchasing new tools and learning how they work, only to have the business change a few quarters later. A constant churn of a technological wheel that doesn't get the cobbler anywhere other than the exact same place. You'd think the world would be happy with the old, long-lasting, easily-repaired leather oxfords. But that isn't the case with technology, and I share the author's lament for less unnecessary complication.


The problem in software is that the assumed useful lifetime of a developer is much shorter than their actual useful lifetime.

And, while I disagree with your reading of the article, I do agree that change for the sake of change is a bad idea, and it is something that just increases risk without benefit. However, I don't think all of the change (or maybe even a majority of it) in software is the result of change for change's sake. Most of the change I've seen in my career has been complex abstractions of even more complex realities. The Java VM was a complex abstraction to an even more complex reality of developing in heterogeneous computing environments (I think it's something else now, having evolved as the computing environment has changed). This abstraction reduces complexity in many ways, making it possible for developers to do more, but it comes at a cost of needing to know more places where abstractions leak and how to deal with those leaks.


Racket really is awesome. I used the Racket web server for a small project once (that never got finished), and it's just so pleasant to work with!


I'm not even legally an adult, and at 17 I already feel like things are moving too fast for me. I guess I shouldn't feel like that, but I can't help it.

I started with VB6 as a kid, 6 years ago. I moved on to C++, then learned about 3D graphics and game engines, then learned HTML, CSS, and JavaScript, got into web development, learned Photoshop, some basic design, working with Linux, PHP, a little C#. Recently I tinkered with MongoDB. Now I'm learning Common Lisp, but I feel I've missed so much, and I'm slowly losing pace.

All those frameworks, Angular, Ember, intimidate me. ZeroMQ. RabbitMQ. AWS, EC2. "Big data". Swift, Dart, even Python and Ruby. Responsive design. Scala. Backbone.js, underscore.js, Node, NPM. Neural networks. How do those things work?

I'm freaking out a little, not even sure if I should continue pursuing this. I so love coding and new challenges, but I just fear it's moving too fast for me and one day I simply won't be relevant anymore. I take a look at HN's front page, and I don't recognize so many technologies mentioned in the titles of articles.

And I'm feeling like one day it'll go over the top, I won't be able to keep up anymore (and I can barely keep up even now), and I'll just quit and go learn to play drums. It would be heartbreaking for me, but I'm not even sure if computers are the right thing for me in the long term.

So I understand this post, from the bottom of my heart. And there's this anxiety creeping up about it. Who knows if I'll still be able to do this in 10 or 20 years? And finding another craft I love so much is also a scary task. I think I might never find one.


Relax.

One of the lovely things about being older is having the confidence to point out that the latest fad is just a warmed-over rehash of something old.

It's important to keep up with the new things, but the fact of the matter is that there is very little in the world of software development that's actually new. Swift? Please. If you know ML (1970's), Smalltalk (1980), and Objective-C (1983), you know 95% of what there is to know about Swift. If you know Smalltalk, you know about 90% of what you need to about Java, and with those two, you know 98% of what you need to about Dart.

Learn the big important things, focus on principles rather than specific implementations, and you'll realize that there's very little new under the sun. At this age, while your brain is still limber, learn as much mathematics and theory as you can. You will probably not use Dart 10 years from now. You can bet you'll use the knowledge you picked up in your combinatorics or algorithms courses.


> One of the lovely things about being older is having the confidence to point out that the latest fad is just a warmed-over rehash of something old.

Sane advice. I'm 39, and I'm rather stunned to see 27yr olds calling themselves "old" and disillusioned with tech and what not ... let alone 17!

If you like to build things, hone your system-level thinking, which lasts longer than Angular/Backbone/whatever. (Heck, I'm working with the intention of obsoleting them, based on old Smalltalk ideas!) If you like to build things the possibility of which average society has no idea of, hone your algorithmic thinking. I'm certainly more productive today than I was, say, a decade ago because I chose to focus on the ideas rather than specific tools during that time. I'm confident I can work with any tool at hand due to exactly what rayiner says. Good debugging skills, for example, don't die, because debugging has to be approached scientifically.

rayiner - while you're right about the "if you know ..." part, many companies today ask for specific skills, and hirers don't know enough to say "we need people to work on a Java code base; you know Smalltalk, you can handle it, hop on board". So it is up to us tech folks who write these job specifications to hint at the broader kind of people we can accept.


With regards to your last point: it's also up to job seekers and employees to signal to organizations that they need to take professional development seriously. You have to fight the pressure to over-specialize and demand opportunities to learn new tools and techniques. There's a basic conflict of interest here between employer and employee: the employer wants a highly specialized cog they can replace when their needs change. The employee wants to develop a speciality, but also build a general base of skills and dabble in new technologies so he can stay employable for years down the road. Employees need to fight for their interest in this regard, and employers need to have the foresight to realize that the most talented people who have the most options will not take a job at a place that stifles their professional development.


I'm in my 50s, and I'm having a ton of fun with this new stuff.

After burning out at a startup that did everything in 68000 assembler (nice architecture...) in my 20s I'm getting back to dev work now and finding I still have aptitude and curiosity.

There is a lot happening. But the change has barely started yet. I can see hints of new shapes coming over the horizon, and anyone who thinks they're going to be needing the same skills ten years from now is fooling themselves.

Imperative and functional styles have been around for decades already; it's true there's a lot of pointless reinvention (at worst) and refinement (at best) happening.

But very different things will start happening within the decade. IMO the fallout after the bootstrapping will be like nothing anyone has seen before.


The ability to think clearly and critically is certainly something I don't expect to be obsoleted in a decade. I do expect what I think about and think with to change though. Maintaining a curiosity about new ideas and how they relate to the old ones is a healthy trait to have that will help one survive such change.

Wolfram Language certainly looks like an interesting and somewhat new take on building systems using advanced computation. For those who are continuously curious about genuinely new ideas (not just rehashes), it will not be "nothing anyone has seen before" when it (whatever it is) finally happens. To them, it will be more like "hey, it's cool that this idea I've been dabbling with for the past 5 years is pretty powerful and feasible to use today, yippie!"

A minor example: I had two Haskell "aha" moments - one around 1997 and one around 2000. The first was exposure to Haskore, which offered a neat design that clearly illustrated the "separation of concerns" design principle. The second was when I implemented an algorithm in Haskell in 4 hours, and it worked like a charm and was more general than some C code for the same problem that I had been hacking on for almost 2 weeks prior and which continued to be buggy. The language made a difference to how I thought about the problem, taking away many of the low-level concerns. Back then, for the cases where the C code worked, the Haskell code ran 60x slower. Today, Haskell is blazing fast compared to those years and pretty viable for just about anything. Yippie!


I think Machine Learning will become much more commonplace. It's a radically different approach.


I definitely agree with you on this point. A lot of the "new stuff" is just a rehash of the old or a new implementation in a different language of something that's been around for forever.

I'd say the front-end MV* stuff is a great example of this. I started out with KnockoutJS a few years ago and that's all but dead by now. Was it a waste? No, not really. It was pretty easy to go from Knockout into Angular. Sure, some concepts are different, but the ideas are the same. Moving on from that, I picked up VueJS pretty quickly and after tinkering with its Component-based example, I saw that it's pretty much almost the same as Polymer. BAM! All of these frameworks wrapped up in a nice package.

And what's behind it all? An idea. The concepts behind all of these are the same. Same goes for working with PHP vs Ruby: same concepts, different syntax, slightly different ecosystem. NodeJS (as much as I love to say how amazing and different it is) is pretty much the same as Ruby and PHP and Python, except for its damned callbacks.

And then there are things like underscore (which is pretty much an array manipulation library), and backbone (which is just a library that gives you the MVC of javascript) etc.

All of these are old concepts, and all of the similar libraries work very similarly. There's no need to relearn everything from scratch once you get the patterns and concepts down.
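To make that concrete, the core idea these MV* frameworks share (Knockout's observables, Vue's reactive data) can be sketched in a few lines. Python here purely for illustration, and all names are made up:

```python
class Observable:
    """Toy data binding: subscribers re-run whenever the value changes."""

    def __init__(self, value):
        self._value = value
        self._subscribers = []

    def subscribe(self, fn):
        self._subscribers.append(fn)

    def set(self, value):
        self._value = value
        for fn in self._subscribers:
            fn(value)  # push the new value to every subscriber

    def get(self):
        return self._value


rendered = []
name = Observable("world")
name.subscribe(lambda v: rendered.append("Hello, %s!" % v))
name.set("HN")
print(rendered)  # → ['Hello, HN!']
```

Swap the append for a DOM update and you have the gist of it; the frameworks differ mostly in how automatically that wiring happens.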


Just wait till you look for a new job, and it happens to be at a startup. The more mature companies tend to be a little slower at adopting new technologies.

In my situation I've been in the enterprise/automation world, which is a little bit behind. [OK, a lot, but you work with what you can and what works within the requirements.] You'll get the "you're not the right skillset" response when the new organization is using *js, NoSQL DBs, the FB APIs, etc. That's rather odd when you've been developing for more than 10 years and have shown that you can learn new technologies.

I'm completely in agreement with you; however, when it comes to a new position, employers "don't want to risk you not knowing jquery 12".


You should be worried.

You are joining a pop culture that measures, judges, and hires people by how much they know about the current pop culture. Worse yet, everyone who tries to escape this conformity appears to just switch to a different conformity (OOP programmers switching to FP. Emacs users switching to SublimeText. Javascript people switching to Go. Programmers switching to Entrepreneurism. Etc.).

- Choose what not to learn (while being outwardly supportive of endless learning).

- Learn your excuse list for not understanding things now, and sit quietly while energetic people try to "convert" you.

- Choose what you actually want. (Many people are successful in unseen ways, and you never hear about them.)

- Know that you won't get it right the first time, but you will develop a feel for what to pay attention to after a few times around the hype cycle.

And, as always, keep learning.


"Choose what not to learn (while being outwardly supportive of endless learning)."

So that's why I didn't hear back from the (initially enthusiastic) Google recruiter.

(edit): I re-read the email thread and I think I was being unfair in the previous iteration of this comment. "See randomness" and all that.


You must keep in mind Google's perspective. They have a firehose of great candidates to choose from, requiring an assembly line of recruiters and interviewers to filter them. They could use any filtering method, no matter how inaccurate or impersonal, and still get a large group of great people.


I'm 20 and I've felt the same way. I just returned from a semester abroad and effectively took a 6 month break from all coding following my 7 years of continuous interest. When I came back to the states to start my summer programming job a few weeks ago, I was hit with a feeling of emptiness and mild depression. The reasons for this are likely tied up in my reverse-culture-shock, and the fact that I spent a majority of my time in India socializing and exploring. I came back to my home, to a job I was really looking forward to, and didn't find programming to be very fun.

To find traveling enjoyable and purposeful, you have to become a bit of an extrovert. My days were filled with interesting sights, sounds, smells, and people. I felt happy to have learned a bit more about the "real" world around me. Then I came home to a desk job, where I unenthusiastically wired up programming logic to produce results. A few days ago I realized the obvious reason why I was feeling down. I had planned my summer around my work, and I wasn't enjoying my work, because I was still searching for external reward. To enjoy programming again, I would have to dig deep, remember why I loved programming in the first place, and promote the introvert I had been suppressing in myself.

I told myself to forget that I'm working on a product that might not improve the lives of tons of people, or that my code might be obsolete by the end of the summer, or that outsiders find it hard to relate to stories of my progress at work. Instead I allowed myself to enjoy building cool stuff that would be mostly divorced from the physical world. It's worked pretty well for me so far. Reward is relative; nobody can tell you that you should find less reward in designing fascinating and complex sand castles in your mind than in more normative tasks like socializing.

The takeaway I'm going for here is that if you're feeling burnt out, it might be because your motivations for programming have been messed up by social pressures since you started.


I don't know your wealth situation, but one positive thing about our profession is that it provides us with the finances to travel and to continue experiencing the things you did during your semester abroad. In my 20s, looking around, it was difficult finding people who had the vacation time and money to travel abroad.

If you can remember that this is just your job, and take time to take advantage of your situation, I feel it can offset the sadness you experience from our profession.


To see a young person say this bolsters my opinion that technology is moving too fast.

Everybody is in such a rush to learn new frameworks, new languages, new toolsets; nobody is learning how to actually _program_. I've worked with people who looked down on me for not knowing WeekOldFramework, but who couldn't architect a simple CRUD application to save their lives.

I feel like so many developers have this "quantity over quality" attitude about knowledge, and it's hurting the industry. With such shallow knowledge, we're doomed to never learn from history and forever write bad software.

Get off my lawn, etc.


It isn't moving too fast. Picking up new library X is like going from PHP to Python: true holistic understanding and a grasp of theory go a long way toward enabling someone to look past the superficial differences and focus on the substance. Everything else falls into place.

It's also not about learning to program; that's like telling an adult who knows how to write, "learn how to write!" That doesn't improve their writing.

It's about thinking. Structured languages (like those used for programming) are Yet Another medium for translating Thought. Learn to think better, learn better tools that enable better thinking (Haskell, Idris, Mercury, etc...)


HN is the worst possible place for you to spend your time.

A constant stream of new ideas - many of them not worth the bits they're stored as - punctuated by periodic tales of extreme success and extreme failure. It doesn't cover the in-betweens - yet the in-between is where 99% of the rest of the world's software developers live.

Someone like you, who had the natural drive and curiosity to start programming at 11 and dive into C++ soon after, will never be obsolete.

Find what you like in the field. Learn it well. Accept that what you like may change and that you'll be inclined to learn something else down the road. Things that interest you today may not tomorrow; and things that seem out of reach today you may find coming to you as naturally as breathing tomorrow.

Most industry professionals don't chase after every new trend - we learn the fundamentals well, we pursue the things that interest us and let go of the things that don't. We even have time in our lives for things that aren't technology, yet we manage to progress in our careers as software engineers. (I would say making time for those things is critical to retaining your enjoyment of technology as you get older.)

My contact info is in my profile; feel free to shoot me a message if you want to talk further.


Stop following Hacker News and you will rapidly recover from this illness.

Source: me, someone who sees COBOL all day and still lives happily after that.


Exactly.

I work in a similar place. IMS and COBOL is all over the place.

Delphi is too. I maintain mostly C code written 20 years ago.

It pays better than any job I've had previously with up to date technology stacks.

A bunch of mates are Coldfusion coders. Now they are in demand because all these odd places have legacy CF code they have to deal with.

In 10 years' time, all over the place there will be piles of Angular, Node & whatnot to maintain.

The pile just keeps getting bigger.


I loved Delphi 15 years ago. It was WYSIWYG: click on a button, write an event handler, drag and drop components; it even generated some boilerplate code for me. That was bliss. I learned how to do event-driven UIs back then. Today I use jQuery much the same way, but without the nice RAD. Is there a Delphi-esque RAD for JavaScript?
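The Delphi workflow described above (drop a button, wire up a handler, let the event loop do the rest) boils down to a pattern you can sketch in any language. A toy version in Python, with all names invented for illustration:

```python
class Button:
    """Bare-bones event source: handlers registered now, run on click."""

    def __init__(self, label):
        self.label = label
        self._handlers = []

    def on_click(self, handler):
        self._handlers.append(handler)  # like writing an event handler in Delphi

    def click(self):
        for handler in self._handlers:
            handler(self)


log = []
ok = Button("OK")
ok.on_click(lambda btn: log.append(btn.label + " pressed"))
ok.click()
print(log)  # → ['OK pressed']
```

Delphi, jQuery, and the rest mainly differ in how much of this registration and dispatch they generate for you.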


Calm down! Nobody in the industry knows everything about everything :). It's OK that you feel a little overwhelmed, because there is so much to learn. At your age, you can take your time, learn a little of everything, discover what it is that you like the most, and focus on it. And I'm referring to domains, not languages or techniques: if you're into web backends, there is not that much difference between, say, Python and Ruby, even if knowing both will make you appreciate the advantages and shortcomings of each one.

In the end, if you work steadily and know where to apply for a job, getting one is easier than it sounds (nobody's days have more than 24 hours, and if you started that early, chances are that you're well ahead of the average for your age). Just be aware that, with all those languages, frameworks, etc., there will always be people (maybe even younger than you) who know more than you about some of them.


I'm over 30, I've been working in the industry for ~6 years now, and I say don't worry. Yeah, there are segments of the software world that push out new frameworks and languages at a breakneck pace that nobody could keep up with. There's no reason why you need to know them to work, though. The vast majority of developers out there are working in time-tested languages and frameworks: C++, C#, and Java for most of the enterprise and big-business world; most of the web work that uses any other language is Perl and PHP. There are jobs out there in the latest cool stuff, but they're a small fraction of the overall market. I haven't really worked with that side of the industry, but I'd have to guess that any company there knows they aren't going to find many people on the market who already know their stack, and so will be open to teaching you.

Don't worry about trying to keep up with the latest stuff. Do pick a language and framework, and go build something in it. Make it do something useful and solve somebody's problems. A proven ability to execute on a business idea is worth far, far more than knowing the latest languages and frameworks off the top of your head.


I've got 10 years on you, and while that still isn't much, I can at least give you some advice on how to deal with this.

Nobody can keep up with the entire industry. It seems like you're trying to know everything about everything, and that's simply not possible. What is possible is learning a little bit about everything. If you see a word, technology or concept on HN that you don't know about, spend 30 seconds googling it. Usually, 30 seconds is enough to get a basic idea of what it is about. The more you do this, the more you will be able to relate things to each other, and you'll be able to intuit a lot of context from just a single-line description of something.

Don't try to learn a lot of languages. When you're 17, it's useless to know VB, C++, PHP, C#, Lisp, Swift, Dart, Python AND Ruby. Pick one you already know and like, and stick to it for a while. Become a genuinely good coder in one language, and you'll be able to pick up another in no time. Knowing one language really well also teaches you loads of the basic concepts that you can use to understand other technologies. Whether it's Angular, RabbitMQ or Big Data, you need a concept of the problem it's trying to solve before you will understand it properly.

Once you have this nailed down, you can learn about whatever you like. Every single thing you learn, whether cursory or in-depth, will help you understand everything else better.

In short, don't try to be a renaissance man, but be a T-shaped person (http://darrennegraeff.com/the-importance-of-t-shaped-individ...)


I love how you've got so many responses, and hope I'm adding signal more than just repeating what everyone has already said.

It's crazy overwhelming to look at all of the technology and opportunity out there and to know that you can't keep up with it all. But you don't need to keep up with everything. You need to know what tools are out there, and what their general purpose is. 15 minutes with Google will inform you enough about any single technology. In one day you can knock out 10 or 20 paradigms.

Once you know what's out there, you learn more about the things that are specifically relevant to what you're working on and what you will be working on in the near future. And you Google around to see what paradigms are being used by people working on similar problems.

Especially if you already have a specific thing you want to accomplish, it'll only be a week or so before you've narrowed your needs down to just a few specific tools. And those are tools that have arisen from decades of refinement, and will be relevant for at least a decade on their own even if they'll only be the "coolest" framework for a few months. You don't need to be at the forefront of every technology, you just need to be good enough.

And knowing that makes staying relevant feel a lot less scary. Most people only know a few things, and they only _really_ know one or two things. And each of those things is good enough to last at least 5 years if they were the most relevant thing when they were learned. Often they'll be good for multiple decades.

So keep your head up, and pay attention to what's cool and hot, but don't feel behind if you've never really understood what AWS does, or how Angular works, or if you don't know the difference between Unsupervised and Supervised machine learning. When you need the tools, you'll be able to learn them in time to remain relevant.


Don't freak out too much. Make a list of the minimum set of skills you need to do a non-trivial project solo, and learn those. Once you've got that base you can expand out, and if you've got a few different things to build on, it should make that easier.

When I started, it was PHP/JS, but after a while quickly changed that to Ruby/Rails, Python, JS/CSS/HTML and design (can't do a solo project without knowing some colour theory, typography basics etc). With only six things to tick off, it was less of a daunting task and I felt like I was making progress.

I've added to that list both at work and around work since, but as somebody above noted, you can get burned out from coding all the time - sometimes you need to recharge and wait for the fire to come back.

The one thing I do know, is that if you have a passion for the craft and nurture it, you will be able to find a balance :)


This has already been said by others, but let me take this opportunity to talk to you as if I were talking to 17-year-old me.

Focus on what you're making. Let the process sort itself as you go. Step back from the computer and think about what you can do by yourself, with simple tools. Get a pencil and a sheet of paper. With this simple sheet of paper and this simple pencil, you can draw many things. You can paint a whole landscape (in gray tones, but still), you can make a dumb doodle or you can design a machine. You don't have to know how to draw, you can start today and eventually get good at it. First the circle, then the rest of the owl. [0]

All that with just a simple pencil and a sheet of paper. Both have a lot of engineering behind them, so making paper and pencils isn't easy from scratch, but as objects, they're simple and you can use them for many things.

I spent so much time worrying about tools, worrying about browser compatibility, worrying about doing things the right way, increasing performance... that I didn't do what I really wanted to do: make good stuff, for me and for other people.

There's no simple way to avoid the trap of keeping up with everything, but I've found that focusing on the intended result (what you want to create, be it a product, an experiment or a tool), picking a toolset early and sticking to this toolset/stack is what worked best for me.

* * *

Here's good news from present me: as you gain more experience, (which can correlate to the older you get, especially when you start early), you get better at knowing when something will really improve your work (making it faster, better looking, more enjoyable, better performing, more efficient, etc.) and when it won't. Perhaps the article's author is actually on to something. He intuitively knows New Tool X won't improve his work much, and he's wiser than his younger self, who would waste time chasing non-essential niceties.

[0]: https://imgur.com/RadSf


You've got plenty of time. Other commenters are right, focus on theory, and bolster that by building things. Constantly. If there's one lesson I could hope to impart to any budding developer, it's to never stop building. Learn a bit about genetic algorithms - then implement a simple one. Learn a little Ruby, then build something with it. This will teach you a lot of what you need to know - fill in the gaps with Google. Focus on building good abstractions, and you'll be fine wherever you go.


Are you familiar with the adage that the longer something has been around, the longer it will likely continue to be around? Also known as the Lindy effect.

Anyway, my point is that C++ has been around for over 30 years (and has been a dominant language for most of those), so it will likely be around longer than anything that surfaced in the last 6 years. Plus, most new things are just refinements or variations on a theme, so you hardly need to learn them; just keep abreast of what's interesting about them.


It's going to be okay. It sounds like your problem is that you want to learn everything- and at 17, that's not really a problem. You can be successful in this industry by choosing any two (ok, almost any two) of the technologies you listed above, and focusing on them for a couple years. If they last longer than that, great.

You certainly don't need to keep up with all of the latest and greatest JS frameworks and understand machine learning to be successful :)


It all seems overwhelming when you look at the whole of the industry like that, but it's a lot easier if you pick a niche for now. If you like backend work, just ignore Javascript for now. If you need it for a job, you'll be able to pick it up, I promise. You sound a lot like me at 17, although to be fair to you there weren't so many technologies at the time. I started with C/C++ and Win32, played with MFC, learnt OpenGL and tried to write a game engine with a friend, got into Java at Uni, did language parsing and C# WinForms at my first job, moved to a SaaS company having done no real web development before, now I work for another web startup where I need to know a whole chunk of those techs you listed.

I tell people I probably won't be working in tech when I'm 40. If you think that's where you'll end up, plan for it. My plan is to buy a business with my wife in her field and do some contracting on the side. No idea if that will work out or not, but that's a 10 year plan. Just having a plan can help with the anxiety.


There will always be more frameworks, languages, and technologies to learn. Don't freak out. It is impossible to keep up with all of them; just accept that you can never know everything.

I'd recommend a 'wide and deep' strategy: try to get an overview of as much as you can, but specialize in a couple.


Most of the tools you talk about will be old before you get a job; don't even bother. Among the people who know how to use them (and most know only a few of them), very few know how to actually program. They only regurgitate patterns they learned by heart. They're lost when they're left on their own. There are good odds you can do better.

My advice is to keep the fundamental books close to you and practice a lot. Programming is mastered by practicing, not by reading blog posts; never forget that. Filter the noise. And most importantly: have fun. Mastery will come as a byproduct, so don't even worry about that. You can trust your craving for knowledge; it will always lead you down the right path.

I'm probably not the most knowledgeable out there, but you can reach out to me if you want to talk.


First day of CompSci 101: "The language you will work with professionally hasn't been released yet, so we are learning this." I wouldn't worry about it. I work with developers in their late 40s who started out learning Fortran in the 1980s.


Woah woah woah, relax. OK: C++ and client-side tech (HTML/CSS/JS) are enough to go on for some time. Common Lisp is a good way to go next, but you'll see most of its return in how it affects your use of other languages. As for the rest of your list: they're solving overlapping sets of problems with overlapping sets of techniques.

You need to know very few of them to be quite productive. You also have lots and lots of time. I'd honestly focus more on getting better at the ones you already know than on adding more. And just start building shit. Nobody cares what technology you use, as long as it's not PHP[2].

The more technologies you know, the easier the new ones will be. I don't know why, but people keep adding them to their resumes like merit badges. They're tools, and you aren't responsible for learning tools; you're responsible for making solutions.

But, if you want to learn a bunch of them, here's how you get started. When you start a new technology, ask yourself a few questions about it:

- What's the problem it's trying to solve? (hint: this applies to programming languages as well)

- What're the constraints imposed by the environment?

- What fundamental techniques is it using?

- What are its predecessors?

Those four questions will explain most of the technology[1], before you even look at the details. Do this a few times, and you can predict much of a new technology just from the answers to these questions.

Example: C++:

A systems language for large programs. Must have as close to minimal runtime overhead and optimal code as possible. It's native-code compiled with the platform ABI, usually with the same compiler backend as the C compiler, and without any runtime interpreter or compiler. Predecessors include C and Simula. Footnote 1 applies here, as it is designed by committee, with a "shove a lot of stuff in the language, and let each user figure out which parts are useful" attitude.

[1] Modulo random experiments, social system issues, laziness, and bad ideas.

[2] The only reason to use PHP is "someone made me."


You remind me of myself at 17. Your post tells me you have both the capacity and the appetite to learn, which is the foundation of success in this field. Keep chaining one body of knowledge to the next.


I just ignore everything and drip-feed myself new tech that looks good until I find something I like. Don't take everything shown off on HN too seriously, but pay attention.


You've gotten plenty of replies already, many of which I haven't read. But I have this to say: I'm 31 and I often feel the same, but it's not real. Just read Dijkstra and don't worry about Angular, Ember, or the latest fad.

You'll find out that the real knowledge is relatively stable and how well people know the latest fad frameworks has little to do with their long term relevance or employability.


I literally took the same path: VB6, C++, games, web development, and beyond. Just keep going; you'll be fine.


I was there ;) (now I'm 35).

... in my time the changes were driven by Microsoft, and I'm really a sucker for learning new things.

"The more things change, the more they stay the same."

Despite all the noise that new stuff generates, there is timeless wisdom that permeates all the good things, so focus on answering the question "What good, timeless thing can I get from this?" Even if some tech fades with time, you keep what you learned.

I started programming with FoxPro 2.5 in DOS, then Visual FoxPro 3. I still credit having used an xBase language first for how "good" I am (in contrast with some people) at database work, and for learning OO properly, unlike the people who chose Java... even though Fox died a long time ago.

Everything will teach you something good. For example, Pascal/Delphi showed me what it is to work with structure. Python, what it looks like to have simplicity & readability. Fox: this is how you do databases, and Visual Fox: this is how you do OO. Delphi: this is what it looks like to have performance and RAD without needing C/C++. Erlang: this is how multi-core development must/could be. Haskell: this is the crazy world of immutability (and this is functional programming)... etc.

The mistake is to follow a BIG player. This burned me and other devs in my community when we followed whatever MS said. Don't do what a BIG company tells you: they are big, and their problems and resources are not yours. Instead, I focus on learning stuff that is practical for a solo developer or a very small team (well, if you want to be part of a BIG company, then do otherwise, i.e. understand the motivations and intentions behind whatever you are listening to).

Don't attach to anything... too much or for too long. Learn as broadly as you can, yes... but don't commit until you need to.

For example: I have read a bit about Erlang and Go. However, I haven't done anything (in code) with them. Why? I don't have a real, present requirement to use those tools, so I don't get burned by that tech. I just read the websites, download the binaries, run some samples, and get back to my actual, present work. Still, I become aware that these things called the Actor Model & CSP exist (and I google "CSP vs actor model"), so I can use them, even with limitations, in Python. Or I just learn that shared state/mutation is bad for multi-tasking and that passing immutable messages is good, and use the libraries at hand with that in mind.

I have not committed to Go or Erlang, but I have still learned a good, timeless, practical thing that I can carry with me.

So, don't worry about going deep too soon, too fast. Learn broadly (looks like that's your thing!) but don't worry until the time to worry comes.


Relax. A lot of those technologies will be dead. And you should be intimidated by Angular and Ember; they have the smell of Enterprise Java to me.

Learn paradigms and principles. If you know what to do, you will be able to find the tools to do it.

It's a bit like cooking: the cookbooks for the peasants give recipes. The ones for the chefs give mechanics and techniques.

So learn pointers, functional programming, asynchronous operations and few more important paradigms, learn how to keep a code base tidy and organized and just ignore the foam on the water that is the hot new tech. You will be able to learn it in a week when need arises.
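To make the "paradigms over tools" point concrete, here's a tiny sketch (in Python, purely for illustration; the variable names are mine) of the same task written in imperative and functional style. Once you recognize both shapes, any particular language's version is quick to pick up:

```python
# Same task two ways: sum the squares of the even numbers.
nums = [1, 2, 3, 4, 5, 6]

# Imperative style: an explicit loop mutating an accumulator.
imperative_total = 0
for n in nums:
    if n % 2 == 0:
        imperative_total += n * n

# Functional style: describe the result as a transformation, no mutation.
functional_total = sum(n * n for n in nums if n % 2 == 0)

print(imperative_total, functional_total)  # 56 56
```

The interesting part is that the second form transfers almost verbatim to map/filter/reduce in JavaScript, LINQ in C#, or streams in Java; the paradigm carries over even when the tool doesn't.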


> It's a bit like cooking: the cookbooks for the peasants give recipes. The ones for the chefs give mechanics and techniques.

See eg http://www.amazon.com/Ratio-Simple-Behind-Everyday-Cooking/d...


AngularJS and Ember can be intimidating if you've never:

A) Built an API
B) Used an MVC framework (front or back end)

This can be easily overcome by following either a book like "AngularJS in Action" or spending some time using a framework on a small scale project.

At the end of the day, I've found that when using Angular, my code is much easier to read, maintain, and reason about. I used it for one project about a year ago, and just started another project with it last week - which was easy since I'd spent some time with it previously.

Never feel that you should be intimidated by a technology; but definitely be able to reason about its use cases and the value it may add to your toolbox.

I stuck with Angular because I tend to write larger applications, and Angular had shining recommendations from those using it in similar contexts.


I find these kinds of frameworks intimidating, because there is usually a "magic" phase in them that is difficult to debug when it does not work.


> Learn paradigms and principles.

This. It seems to me that many authors are pointing to the importance of a balance between the long-term paradigmatic and principled knowledge and short-term tactical knowledge. Both are important and not to be neglected at the expense of the other.


40-ish developer here, and I think the author is at risk of being left behind.

I just think that when we choose this industry as our career, we have to go in with the knowledge that this isn't something where we can learn a set of skills once and then focus on that for twenty years. Our career is learning. Our value isn't what we've done before, it's what we're able to do.

That isn't to say you always have to learn the newest thing - definitely not. But it's important to stay current with the largest general trends. In the past it meant being facile with a handful of MVC frameworks (however you define MVC). In the more recent past it meant being familiar with some sort of RPC architecture (whether SOA, REST, "API-first", whatever). At present you probably want to be roughly familiar with at least one javascript framework that has MVC on the client side. And looking towards the future, you have to bone up on math concepts and functional programming, so you can move into parallel programming and concurrency concepts.

That's my sense of the rough direction. If in five years you haven't paid attention to that stuff, you'll still be able to find work, but it'll be grunt work that is at greater risk of being outsourced.


I don't think that the future is that scary. The current themes at a shallow level are changing very rapidly but the underlying principles and techniques are changing at a much slower rate. The latest, coolest, brightest startup is using a framework that's only been in existence for 6 months (and replaces another framework that's only been around for 18!) but most of Google's codebase is still in C++. Most of Amazon's codebase is Java.

You don't have to keep up with the latest fads. You have to pay attention to the big themes, and be willing to change frameworks as you change jobs, but a company that updates its entire framework model every 12 months is going to waste a lot of effort on maintenance that could have been put to better use. Most companies will keep their existing frameworks because they're good enough.

It looks scary but programming as a whole moves much slower than the surrounding hype makes it appear.


I've been coding for 30 years. I've seen many people like the OP who are not totally passionate about tech and it's sort of a "grind" to keep up. Meanwhile I have six children and I've been through the "PTA President" years and now as my children grow up I've found myself engulfed in all sorts of new tech. I am not a young man but I'm not old either. I know people who were in tech 20 years before me and some of them stay sharp while most of them find new things to do in life. (Play the piano, learn to play guitar, what ever). I have zero judgement of these people, they find a path through life that is rewarding to them and I'm glad to see them happy. I personally find learning new technologies and pushing myself to the edge an exhilarating experience. I am always frustrated when I start working with a new technology, but I push through that and once I "understand" it I'm fired up with new ideas on how to apply that technology to problems that come my way.

I think the OP should certainly worry if their drive comes from the need to "turn the crank". In 10 years the crank will be harder to turn and if you're not passionate about it you might as well find a new vocation now.

My father learned HTML when he was 62, quit his job and started a web development shop. I hope when I'm in my late years I'm learning new tech and adopting it as quickly as I can to apply it to whatever new problems come my way. My chosen vocation is my passion and thus the parts that feel like a grind to most people feel like a challenge to me and I enjoy it.

I hope all of us find our passions and it's perfectly fine if it's not learning the latest and greatest language brought to you by the multibillion dollar company of the day.

Follow your passions, be happy, life is short.


I love this post. So awesome. Good for your dad.


Do you exercise?

I ask because the time you spend exercising is time you don't end up spending with family, but you do it because: 1. you know it's good for your body; 2. you know that exercising each day will allow you to spend more time with your family in the long run; 3. it's accepted by society because it keeps you healthy.

Similarly, while it's not yet as accepted by society, exercising your mind should be a priority. You don't necessarily have to be learning new tech trends, but do keep learning. Just keep your neuroplasticity high, i.e. make sure that, no matter your age, when you try to learn something, it comes naturally. 1. It will let you get more out of the time you spend with your family in the long run. 2. I would argue it might even allow you to bond with them better if you're more willing to learn the things they like. 3. If and when you need to learn the current tech trend, this "muscle" you've kept healthy should let you do so relatively easily.

For good reason, society/family/yourself have put so many negative connotations on "working too much", but what most people don't get is that this isn't working, it's learning. This is staying healthy. Everyone understands staying healthy.

Long story short, if you're already dedicating time to your health, don't neglect your brain.


I've found enduring value in these things:

* Understanding binary vs decimal math, deeply. Implementing decimal math out of integers when nothing else is available.

* Understanding locking strategies and the need for shared-data protection. Build, break, and rebuild things until you can intuit, then prove, deadlock, lock failure, or long waits when you suspect their effects.

* Getting good at debugging. Don't point the finger of suspicion at anyone or anything (unless it's at you or your work), /prove/ what's wrong. Toward that end, don't "kill the (error) messenger"; it's just the first to speak up about a problem that might be layers deeper in your software stack.

* Learn data; it outlives code. SQL may bore you but it pays the bills. You didn't like Linear Algebra in school? Well, I didn't much either, but I sure am glad now I took it back then.

* Be the person who can say, "Yeah, I can fix that," and just do that.

* Learn business areas like General Ledger and Accounts Receivable. Be the person who can say "Yeah, I can keep a balance on that for you" and do it.

* As far as application areas go, remember that money, unlike computer languages, never goes out of style.
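As a minimal illustration of the first bullet (decimal math out of integers), here is a hedged sketch in Python; the helper names are made up for the example. The idea is that money kept as an integer count of cents stays exact, where binary floats drift:

```python
# "Decimal math out of integers": represent money as an integer count of
# cents, so addition and comparison are exact, unlike binary floats.
def to_cents(amount):
    """Parse a string like '19.99' into an exact integer number of cents."""
    sign = -1 if amount.startswith('-') else 1
    amount = amount.lstrip('+-')
    whole, _, frac = amount.partition('.')
    frac = (frac + '00')[:2]  # pad/truncate to exactly two decimal places
    return sign * (int(whole or '0') * 100 + int(frac))

def fmt_cents(cents):
    """Format an integer number of cents back into a dollar string."""
    sign = '-' if cents < 0 else ''
    cents = abs(cents)
    return f"{sign}{cents // 100}.{cents % 100:02d}"

# 0.10 + 0.20 is exact here, whereas 0.1 + 0.2 != 0.3 in binary floats.
total = to_cents('0.10') + to_cents('0.20')
print(fmt_cents(total))  # 0.30
```

The same trick generalizes to any fixed number of decimal places, and it is essentially what ledger systems do when a proper decimal type isn't available.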


All the old stuff will need maintenance for decades to come.

Code always lives longer than we think and a lot of the stuff that was created in the last 15 years will outlive half its programmers.


I relate to this quite strongly. Whether you like it or not, a software career forces you into being a generalist eventually. Technology changes dramatically every few years - unable to improve what we have, we do entire revolutions to make incremental progress, and in the process, we make everybody's skills stale. The only things that survive are the principles that are common to everything. There's nothing you can do about that. And once you end up a generalist it is very hard to be appreciated for your wide breadth of general skills. People want the best person for exactly the job they are about to do, very few take a long view, very few even have the luxury to do that.

From another perspective, we are in unknown territory here. We 40+ developers are at the vanguard of a generation of software people with no trail blazed before them to tell us where our careers should go. There simply isn't a widespread population of professional software people in their 50s and 60s for us to look at and say "that will be me in 10 years". We were largely the first generation for whom software was a major industry. I take comfort from this, because in all likelihood it will work out better than our worst fears would have us imagine. It is as much the lack of examples and the uncertainty as the reality that makes us feel insecure.


I've met COBOL/z/OS people who talk about retirement "next year". They are ~60 to 70.

That shows something: every generation has its path (must have :D "push/squeeze")


> Where will I be in 10 years? I don’t know. I hope I still will have some in-demand skills to pay the bills. But it feels like all I see are DevOps and JavaScript, and I know less and less every day about those things.

In 10 years you'll be The Gray Beard, the mythical wizard who interfaces with Earth and Heaven, bringing otherworldly knowledge to our ultimately trivial concerns. You'll guide the younger and unwise in the creation of more ultimately trivial concerns, but with a heightened sense of what the Heavens want us to do.

You'll also support legacy systems, because penitence is one of the keys to sapience and the aforementioned otherworldly knowledge and ability to interface with Earth and Heaven.


Your dystopia should include the millions of engineers India graduates each year. In many ways, the difference between these groups and US groups is the technology curve: US groups offer cutting-edge technology while Indian companies will build Java backends with AWT. If you fall behind the tech curve, you might be outsourced.


I think a lot of the fear of not keeping up might be connected to the fear of not being able to make money on something you enjoy doing.


Instead of commenting, I'm just going to link you to http://www.smbc-comics.com/?id=2722.


Shorter version:

David Bowie


It is the loss of the sense of wonder. The trick is that computers today are really really really fast versions of computers from 60 years ago. Fundamentally the same box, just faster and with more memory and disk space.

That isn't entirely true of course, there are special function architectures like GPUs which have a specialness all their own, but it is true for most people.

But the thing that drives people in the beginning is the sense of wonder. Same thing with relationships: when they are new, everything about them is new and exciting, and then they aren't that new any more and you don't get any relationship augmentation from the newness. So once you've discovered computers, and made them do your bidding, that is all new and all exciting and the feedback loop is strong. But once you have done all of the 'usual' things once, and due to work obligations are doing them again, but in a slightly different way, you aren't getting the adrenaline rush from the newness. Computer games, same thing; movie franchises, same thing. So what to do?

Well when the rush of newness runs out, you have three choices; pick a new field, pick a different area in the field, or take it to the next level. New field or different area choices give you a new thing to be excited about, taking it to the next level requires that you challenge yourself.

Taking an example from the article, "... I don’t feel like I really grok the module system. I definitely don’t understand the class system. What the hell is a generator and how does it work the way it does? I am so lost."

Decide to take your understanding of Python to the next level, and by that seek out the more advanced texts, read them, read the source code, build Python from scratch, add some change to the module system, analyze the impacts, look at how classes work in Python and compare them to classes in C++, Scala, Java, and Go. If you can get over that hump then you can get engaged in learning things at a deeper level. I really didn't understand Python modules at all well at Google until I took apart the SWIG system to write some C code that was accessible by Python. Port it to a different architecture, benchmark it, change it and benchmark the changes. That is what I mean by "going deep." You will find after a while that there is a lot of similarity at that next level too.
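The generator confusion quoted from the article does yield to exactly this "go deep" approach. As a rough sketch (not the real CPython machinery), a generator is an object implementing the iterator protocol that pauses at each `yield` and resumes with its local state intact; the hand-written `Countdown` class below is only an illustrative equivalent:

```python
# A generator function: calling it runs none of the body; it returns a
# generator object whose code executes lazily, pausing at each `yield`.
def countdown(n):
    while n > 0:
        yield n
        n -= 1

# Roughly what that gives you under the hood: an object implementing the
# iterator protocol (__iter__/__next__) that carries its own paused state.
class Countdown:
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        value = self.n
        self.n -= 1
        return value

print(list(countdown(3)))   # [3, 2, 1]
print(list(Countdown(3)))   # [3, 2, 1]
```

Writing the class version by hand once, then comparing it with the three-line generator, is a small example of the "take it apart to understand it" work described above.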

Another skill people don't work on until later than they should, reading code. Read a bunch of code, figure out how it works, prove your understanding by changing it.

All of these things are the programmatic equivalent of strength training at a professional level. If you can't get any satisfaction from this though, option 1 (try something new) will give you more satisfaction long term.


The author specifically mentions Angular - probably that was the last straw. I want to tell you: I feel your pain! You are not alone!

I've been in the field for almost 35 years, and I think I've seen a lot. But Angular really stands out. It's absolutely the worst kludge ever unleashed on the field with so much hype; I can't remember anything like this since EJB, and EJB is a nice piece of engineering compared with this cake. And there's not much choice: others have somehow acquired a taste for it, and the next thing that's going to fall on our heads is Polymer (a kludge of comparable proportions). Brace yourselves!

Good that I'm going to retire soon: accepting Angular and friends as a career is going to kill any remaining professional self-respect even if one had some left.

The reason for this unfortunate development (as I see it) is a total lack of culture of professional criticism in programming. People who were supposed to speak out didn't do so.


Funny you mention that. Just today I was doing some research about Angular, Ember, Backbone, etc. The fundamental question I was trying to address was, why would one need a client-side MVC framework in the first place? After reading several lengthy articles, as well as looking at some documentation, I just couldn't grasp it. They all seem... off.


If something is called a "framework", you can safely ignore it; it's clearly a pile of crap. If the thing is useful, it is called a library. As a corollary, if a language or programming paradigm requires frameworks to work, it's a failure right from the start. The problem is that when you throw away all frameworks, you are not employable any more. And maybe that's the solution: find a profession where you won't waste your life fighting the next fad (which is different every year).


Try to imagine a larger project, an entire site running client side with just API endpoints on the server. Each framework you mentioned has a different approach, but they all try to tackle the task of organizing the heavy client-side setup. If you did that project without those frameworks, you would end up writing most of a framework of your own.


You get a cute spinner animation instead of a blinking page. Thick client is coming back. Also JS developers are cheaper, although good ones are hard to find, because everyone likes server-side.


The honesty is relatable, the fear is real. As someone who is working with/on some of the newfangled technologies it's as if my coworkers are constantly trying to put me out of touch!

I've seen a bunch of new and sweet technologies come out in the last 3 years that I've been doing web development. The thing that comforts me is that what I haven't seen is a diminished number of problems that need solving. In fact it's the opposite, the more things tech enables, the more problems people seem to have!

If you change your mindset from "what technologies do I need to know" to "what are the problems I can solve with technology" everything changes. There are thousands of organizations getting by on excel spreadsheets to model and manage billions of dollars. Do you need to know responsive-Angular-distributed-Go to tailor some software that solves real problems? no. Do you need it to be cool on HN? maybe.


For the old guys in the room who want a simple way to add AJAX/rich client functionality that stays true to the original web development model:

http://intercoolerjs.org

(NB: Basecamp 2 was built this way. So it is possible to build a fairly rich app without the complexity of client-side MVC.)


Luckily you don't need any of these fad/new libraries to make good products and websites.


Unfortunately, when you're job hunting ... you do.

As I'm finding out. Two years as a system admin and not keeping up with trending tech and I can't land anything here.


Where do you live? In the center of San Francisco?

In every other place in the world, nobody gives a damn about the latest technology. They only need something that works for their problems.


Don't be a web developer if you hate fads. There are ERP systems, business systems etc that pretty much use the same tech as 20 years ago...


Mobile developers are ironically stable, being forced to use the language of their platform and maybe whatever their company's cross-platform layer is made out of. I've been doing iOS dev for 4 to 5 years, and it's only whatever is new in Objective-C for the new OS version. I was paid to learn all the newness of iOS 7 as I transitioned the app to it. Swift is mostly Objective-C in a new skin with welcome improvements, so learning it is easy.


He is a sysadmin...


I dunno. Web developers in Ithaca, NY use AngularJS, I know that.


Ithaca is full of hippies, it doesn't count.

/s (GREETINGS, MY ITHACAN BROTHER)


There's plenty of remote sysadmin jobs https://careers.stackoverflow.com/jobs/16884/unix-devops-sys...

Note: "If you don't meet all those criteria, apply anyway. We'd love to talk to as many people as we can."

My local craigslist has dozens of jobs posted today looking for "Linux sysadmin" with scant requirements, offering $60-70k. Not huge money but better than being unemployed.


I keep wondering if the only reason you see that many ads for a trending tech is because those are newly-founded startups or small web dev shops that will be going under in a year anyway, or maybe because tech recruiters are out of ideas.


What happened to the tech recruiters that continuously send everyone job offers? Or is that only in the Valley?


I feel his pain, in a sense.

I'm 24, and 'cloud' to me is 'a VPS somewhere'. I don't grok the difference; I've worked in elastic compute environments before, but that's definitely not what you're buying with things like EC2, and I simply don't see the value in it.

Other things, like the 'DevOps' title, seem to mean either a sysadmin who can do configuration management or a developer who can configure Apache.

everywhere I go, I feel like I should have a large set of development skills, but I've been in operations for 5 years now, and I'm regarded as being a very good ops person.

I'm a 24 year old sysadmin, and I feel like I'm being left behind.

this is not an industry you can grow old in, unless you become a manager or specialise.

at least... that's what I fear.


>I'm 24, and 'cloud' to me is 'a vps somewhere'

It is a VPS somewhere, ultimately. It's a VPS somewhere but it may be two of them or more. It may not be exactly where you think "somewhere" is, but it is somewhere. You don't know exactly what is running the VPS (or the bunch of them, somewhere), and what it runs, but it's running, it's a VPS, and it's somewhere.

We'll charge you 50% more to enjoy this uncertainty and to be able to use the word "cloud" alongside "synergy", "blue-sky thinking", "out of the box" and a few others when you describe your stack.

(On a less cynical note: with the "cloud" you're essentially paying to replace the known, common possible points of failure with unknown, usually rare points of failure, and some redundancy. Uncertainty can be a feature.)


> I simply don't see the value in it.

Your intuition is probably right. On EC2, you pay a lot and get very little, for the sake of flexibility that you may not need.

If you never find yourself saying "oh man I need another server for the next 6 hours", then EC2 is just overpriced hosting.


[deleted]


Devops is an attitude more than a particular practice.

As far as AWS goes, I'm an AWS maniac, and I would say it is a bit complicated. You have a huge number of choices, and if you pick the wrong ones you will pay in more ways than one.

I've gotten to the top of the learning curve fortunately because I read AWS docs over and over when I work out at the gym, but I've got little interest in working with other cloud platforms because I don't want to climb that hill again.


>because I read AWS docs over and over when I work out at the gym

Now that's... dedication. How does the "working out" part go, though? There is no way I could see myself reading AWS docs repeatedly while lifting; it actually sounds kind of dangerous. Edit: you and I have the same alma mater and have probably worked out in the same gym! Go Big Red.


ah yes, teagle hall IIRC. i don't recall anyone browsing documentation in the squat rack (on the admittedly rare occasion I went to the gym in college), but that was more than a decade ago so maybe this is part of the 'brogrammer' thing :)


For various reasons I've been doing a lot of cardio for the last year. I just started lifting weights again and it's true that I don't read docs while lifting.


DevOps in my area means you do everything except clean the bathrooms, for half the salary of whoever you are replacing, and work at least 10-12 hours per day, because it's unlikely you can fit customer support, Chef/Ansible/Puppet deployment, databases, front- and back-end development, testing and debugging, presentations, and piles of internal paperwork and emails into 8 hours.


In 2000 I was sitting in an XML proseminar and I asked: why? Why validation? Why XPointer?

I wrote my thesis (2005) on the Semantic Web (RDF & OWL). In 2012 my professor's company ("Ontoprise") went bankrupt.

I started working with "SOA" and ESBs in 2006 (it was just the thing then). In 2014 I open-sourced the essence of SOA[1] (http://www.use-the-tree.com). Oops, soooo easy...

But I think things are getting better (e.g. Hacker News tends to tear BS apart).

(I am prepared to be downvoted; duck & cover, shiver)

[1] The essence of SOA? Fast message transformation, of course (the rest is not specific to connecting computer systems)... but that is easy. Sorry, microservices?!


I think the author vastly over-estimates what it takes to be a proficient technologist. He can figure things out. That's all you need to have a job/career in tech. Because technology is hard enough that the vast majority of people just won't want to do it. Everything else is just stuff that makes that job/career easier and more fulfilling.


Two weeks is what it usually takes to catch up with all the new libraries in all the languages you care about. Not more. It's as easy as googling it and checking GitHub Explore, then trying each one out. Simple as that. Do that every 6 months and you're set. What could go wrong? :)


This post really resonated with me. The fact is, there used to be less to know to be a "full stack" engineer, and the expectations for what "good front end design" means were much lower 7+ years ago. I sense a widening gap between the realms of front end and back end engineering/design. It used to be you could write a small PHP app and put it on a shared hosting account to serve your small user base and people (stakeholders/users/peers) were mostly happy with that. Now, if you're not "infinitely scalable" plus doing TDD/BDD plus Continuous Integration or even Continuous Delivery right away, that's considered lacking. This does create opportunities for those of us who've been able to grow with the field and truly understand "how computers work", but I imagine it's frustrating for those who are trying to get into the field now or even just trying to broaden their skillset such as from front-end engineering to devops and system engineering/architecture.


I think there is still room for a generalist to make a dent in the world. If you can build full stack apps you're at least worth gold to someone / yourself as a technical cofounder if you can stomach the risk.


Thank you for boosting my spirits. I can build a modern web app from design to deployment, but 2-3 years experience seems to not be enough for most jobs. Considering just starting my own business at this point.


I'm 40-something now, and my fear is always not being allowed to bring in another technology (new or otherwise) that could help us, because my team mates aren't familiar with it.

At the same time, a lot of new technology is just "Oh, that's nice, but actually not much different or better than what I'm using now." Which is fine. I can skim through a couple of articles linked from Hacker News and get an idea of whether I should care or not.

I'm also more realistic about picking my battles. The Swift keynote demo was really cool! But realistically, it's just going into the queue of all the other "that would be fun to learn some day" technologies.

Overall, though, I am always thankful to be able to work in an industry where there are always new things to learn. I imagine I am in a very rare position relative to most of the human beings who ever lived, being able to earn a good living while learning new things every day. The thought of just solving the same problems in the same ways every day until I retire scares me.


I feel like it is pretty safe to specialize in C and C++. A lot of production systems still use these languages, and I don't foresee companies switching all their systems to Go/Rust any time soon. I think that having a good understanding of C, C++, and the Linux kernel is still very valuable. It is quite difficult to find good C/C++ programmers these days.


I'm going to try to provide an alternative perspective, one which is more optimistic. This is love letter to programming, written in my late 30s.

I've been programming since my parents brought home an Apple II. Even though I sometimes get burnt out for a couple of months, sooner or later I find myself hacking some fun little program together, and I'm obsessed once again.

When I work for somebody full-time, I search out jobs that throw lots of random, interesting problems my way. When I consult, I like medium-sized projects that teach me something new.

In the late 90s, I hacked on Lisp and Dylan compilers. In 2001, I had strong opinions on how generators should work. In 2007, I spent time fooling around with probability monads. Right now, I'm delivering production code in Ember.js, and on the bleeding edge, I'm spending some personal time messing around with Rust.

I want my code to be simple, beautiful, expressive, correct and fast. Every few years, somebody invents a library (or sometimes a language!) that allows me to inch a little closer to that goal. I suppose that if I ever felt my tools and my code were truly good enough, I might get frustrated with the rate of change. But if anything, I'm impatient. Our tools could be better. When will I get to play with the good stuff?

Of course, I do get tired of some things. If I never again see another huge Rails 2.3 app with undeclared dependencies on 30 dodgy abandoned gems, I'll be a happy programmer. And I wish that the various JavaScript frameworks would finally settle down enough that I don't have to budget one month per year just to keep production apps up to date. But then again, I didn't much like that sort of foolishness when I was younger, either.

My only real problem is that the day is short, I do have outside interests, and there are so many cool programs left to write.


I started programming in 5th grade, and still love it after 40.

But the world has so many large problems these days. I think many people decide at some point, there is more reward and more meaning in working on those things, than following the latest Android API diffs.

Technology is still interesting to me, but only as a tool to accomplish more important things.

Unfortunately, as a software developer - I view the most important technology not to be software, web frameworks, or mobile, but developments in energy, material science, etc. Things that are out of reach for me as a developer.

How can I solve any meaningful problems, when the main tool at my disposal is mostly just copying bits from one memory location to another?

That's what makes me depressed and disenchanted with software development at 40. I'd like to work on more meaningful things, but I find my skills rather impotent given the enormity and complexity of the world's problems.


I have felt the same from time to time (45 here). However, as software technologists we have one advantage: our work can be completely independent.

In this case I mean if you can conceive a project to better a part of the world, it may be possible to implement with resources you already have at your disposal.

The materials and energy researchers need resources beyond the reach of most individuals (and their personal budgets), and in some cases face corporate resistance to embedded profit centers.


Are there any energy startups, materials science labs etc. who need modeling, simulation, data processing done, don't have funds to pay market rate for a good programmer and are stumbling along with physicists and chemical engineers having to divert some of their time into trying to debug code?

I'd be surprised if there aren't some. Maybe find one and offer assistance?


Looks like a possible candidate for Impostor Syndrome to me


If I get it right, Ed is concerned about specialization. He misses the days when he was a generalist.

Well, I've been consistently "forced" to generalize my skills for the last 9 years and I have similar concerns.

Now that I'm looking for another company, it seems I'm not specialized enough for anybody. And while I know for sure my value is, let's say, 9/10, where 10 is the likes of Jeff Atwood, John Resig, and so on, it seems prospective employers value me 5/10 because I lack specialization.

I am a generalist (from sysadmin all the way up to frontend development and copywriting), but I have my own specialization: HTML, CSS, Web Accessibility. Yet my specialization seems to be not important enough for most dev teams today.

[Edit: rephrased a bit, for clarity.]


I imagine it is very hard to distinguish your specialization from the mass of people who will claim to know HTML and CSS and pay lip service to the accessibility part as well.

Have you thought about going the independent / consulting route? It seems to me a lot of companies will value having a consultant come and tell them how to fix their accessibility but not be willing to employ someone full time on that basis.


Recently I marvelled at how far we have come since 1999, and how many changes in focus that meant. Not just tech itself, but also paradigms and methodologies.

Like the OP, I started out doing design-y tasks as an intern in a web shop. Then followed a lot of t.gif-style table layouts. Then some Flash development, and my first scripting with ActionScript and copy-paste JS stuff. Then ASP Classic and PHP.

Then followed the push towards semantic html and progressive enhancement, and the first JS libs like mootools started popping up to be eventually boiled down into jQuery.

A couple of years later it was all about MVC frameworks, databases and so on. Like the OP I did PHP-based stuff, and it meant learning 4+ different frameworks, e.g. Cake, Zend, Symfony and so on.

At the same time there was Agile and TDD coming into the mix and you had to learn a whole lot on how to do your work.

Following that I started looking into node pretty early on and that shifted my understanding of JS as a language and pushed me into the niche where I am working today, namely JS stuff be it server or client.

Looking at the current tech choices and popular stacks, or even just the browser APIs available and coming, I cannot believe how far things have gotten. On one hand, things have become a lot easier for menial dev tasks: you have things like Bower and Grunt/Gulp really helping you, boilerplate generators and the like. Deployment is often just a git push to some PaaS, or a Dockerfile and a shell script.

On the other hand, there is just SO MUCH OF IT that it's impossible to keep on top of even one aspect of the stack properly if you need to get on with making stuff and have a life. Just take the browser these days (audio API, WebGL and so on). In 15 years of dev work I have learnt and forgotten so much that my brain feels like a sieve.

My approach to dealing with that is to not desperately try to REALLY remember many things. Instead I focus on the current stack I'm working with and remember its general concepts, plus some good knowledge of where to look things up. Come the next project in a different stack, I basically anticipate a learning curve.


I think the stack switching is probably what leads a lot of older folks to force their stack on folks. I used to see it as inflexible but I'm starting to think it's just economical.


Y2K was solved by "out of date" programmers: the ones who still knew the code behind the stuff that only took two-digit years got hired to fix it, so banking and whatnot would not implode.

Some guy who made bricks found himself very in demand when he was older, because he was the only person who knew how to make authentic historic bricks with which to restore historic properties in the nation's capital.

Being old(er) can be an asset. It can be a feature, not a bug.

(shrug)


I agree with you completely man. I'm working on my senior thesis to wrap up my degree and I have to use the Microsoft stack to write a real-time web app.

I haven't used the MS stack since MVC2, and now we're at MVC5. I'm looking over the new things and now there's this new stuff called OWIN and Katana. Wat?

Buzzwords galore: iterative, modular, lightweight. Ugh. It used to work fine and now I have to learn some new thingamajig again.


Sorry, but you don't actually learn anything new there, buddy (though there is more to it). You're right that MS packs all the un-branded wisdom into products with custom APIs and SDKs, coming with 20 new buzzwords describing the improvements. Well, that's because people love good packages. For MS software there are the Rx extensions and F#, but I'm sure you can live with just ReactJS too. Anyway, here is a short but good read: http://www.infoq.com/news/2013/08/reactive-programming-emerg...

But once you've learned what FRP is, what software engineering means, OOP patterns, functional programming, then you'll be set for at least half of your life, man. Also, real-time applications for the web are a big misconception: they are only of use where you really, really need hard real-time, which is mostly not the case with web apps. What you need is a reactive web application, but please don't eat the BS on reactive-manifesto.org. It's not wrong, but it's not intended for engineers and comes packaged with hot air that, according to the writers, is "useful" for communicating with managers. I digress, but reactive programming is the concept you would be better off focusing on instead of real-time, unless you have one of the rare cases that actually needs hard real-time synchronization.

OT: Am I hell-banned or something? My submissions appear dead on arrival.


How can you suggest reactive web if you don't know what I'm building?


I love my work, still writing software over 40. I am intolerant of technologies that are poor reinventions of octagonal wheels. Now they're triangular wheels: they roll even less well than the crappy octagonal ones that were going to cure cancer back then (when we were writing them in Java). Now the triangular ones are shockingly more complex and offer even less (and they're written in Scala, no less).


I think the place you can deliver the most value is strategy and leadership. There is a tremendous dearth of good communicators in this space.

I barely code anymore - the labour overseas is so cheap I can't justify spending 8 hours trying to figure out some bug ... But I've found joy in managing teams and coming up with plans and strategies and helping clients get results and things like that.

In fact it's far more enjoyable: when I was doing all the work the limit on how many of my ideas could come to fruition was my own time. Now I am able to see more of my plans and ideas come to life more quickly and that's the part that I always really enjoyed.

I've let go of trying to keep up with specific technological details... They all sort of fade into insignificance and you realise that it's all sort of the same stuff; different tactics but the same strategies. Old folks win by being better at strategy.

That's how it works in a lot of fields where the energy of youth is an advantage. I don't think tech and/or business is any different.


How do you deal with bad coding practices? Such as over-complicated, verbose code and copy-paste everywhere, leading to an unmaintainable mess?


Short answer: testing.


Unmaintainable copy-paste code can still pass tests in a degenerate way. And things like UI code, and other things that depend heavily on third-party code, are definitely harder to write proper tests for.


Not just automated testing, quality assurance.

The job of QA is to find bugs.

The job of automated testing is to reduce the cost of QA.

With proper project documentation, separation of backend from interface using DOM templating, and a compartmentalised, non-monolithic application architecture (i.e. not using "frameworks" in the traditional sense), the code that drives an interface is not only trivial to build but also pretty trivial to replace.


It's not so much change as it is exploration. You can either wait for the best player to come out or start betting.

In the 70s, the exploration was with different hardware architectures and then we settled with general purpose computer chips.

In the 90s, it was operating systems, protocols and standards. Eventually commercial interests (mainly Microsoft) lost, and we settled on open stuff they couldn't control.

Same story with cloud computing, a technology that didn't really exist 10 years ago and is still being developed. Eventually we will settle on AWS or OpenStack, common processes will be developed that companies like Linode, Amazon, Google etc. can all share and use, and the software will be open source because there will be no major differences between them.

Every technology experiences a phase of exploration where people contribute left and right and eventually we settle for a handful of major players. Remember the MVC frameworks competition in Ruby/Python/PHP less than 8 years ago?

Just bet carefully


That's not a dystopian future.

I appreciate the honesty. I've spoken with my developers faced with the same situation. The common theme is that they have better things to do. That's fine too. Not everyone is interested in digging into combinatorial algorithms, and why should they be? We all have varying levels of interest in the field. It only seems reasonable that you should be able to use what you need of it to get your tasks done.

You certainly don't need the plethora of front-end frameworks and tools to make a good site today. Those tools are great when you need them for large, modular user interface applications. It's still as easy today as it was ten years ago to set up a dynamic web site. PHP is still around if that's your game and there's no shame in using it. Use what works and get it done.

Nobody is going to take away your license.

I'm 32. I still love programming. I go through morose periods where I lament the lack of innovation and ingenuity that the industry had in spades while I was growing up. Settling into a world of incremental improvements and watching the end of Moore's law approaching is disturbing. However I can't stop. It's what I do. But instead of focusing on new frameworks or languages I have moved more towards pure mathematics and finding applications or links to computer science. Knuth's work has been taking on new dimensions for me in recent years. I've obsessively digested everything that Sussman, Friedman and Felleisen have written. I've meditated on the ANSI C specification and the CLHS. I enjoy talks on the nature of computing and speculation about the future. It's an exciting field for me that is just ripe with fruit.

That said it's also healthy to encourage an interest in other things as well. My daughter. Literature. Music. Chemistry. Electronics... these are my digressions. I always come back to programming though.


I'm probably considered an "old guy" in the software world. But I realized a few years ago that there is little new under the sun. We get new names on stuff all the time; we get old concepts repackaged as "new." Data still must be organized and searched, transmitted and received. Instructions must still be transcribed and turned into something the silicon can use. The only differences are in the syntax. Sometimes, those differences are what we needed to make a breakthrough in comprehension of complex systems. Oftentimes, there's little difference.

I was once worried I'd be left behind. I don't worry about that now. Mainly because I've got the skills to grok the "new" and catch up. But also because "new" is just repackaging that research paper from 1964 into another package for $YOUR_FAVORITE_LANGUAGE.


I read the post, and I've been coding every day since I was about 7; I'm over 40 now. The problem is not whether I'll be employable. The problem is that there are not enough programmers to do the work, and not enough marketing staff to sell the products and get the software/hardware out there. So yeah, this guy is toying with JavaScript and all the random frameworks that come out every day, but I'd say stick to performance, security and scalability in your code; there are only a few languages you'll ever really be writing software in anyway. It's also better to be writing your own software, not something for someone else, because at the end of the day it's better to make yourself wealthy than someone else.


A question for the intersection of Python and Clojure people, is a generator kinda like a lazy sequence from Clojure? Or is that wrong or is there more to it?

Yes, I'm well aware looking at the dates that python generators came a bit before clojure lazy sequences BUT I personally learned Clojure lazy sequences first.

What I'm getting at, and what's relevant here, is that the author of the OP article needs to tune up his analogy engine. The older I get, the more often it's "I've seen this before and know exactly what to do". As long as I can draw (hopefully correct) analogies, I can learn something very quickly.

I don't know any Python at all, and 2 or 3 minutes of Google searching suggests a Python generator is like a Clojure lazy seq, so if I'm on the right track, I'm doing well.
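For what it's worth, the analogy mostly holds: both produce values lazily, on demand. A minimal sketch (the `naturals` name is just for illustration):

```python
import itertools

def naturals():
    # An infinite generator: each value is produced lazily,
    # only when the consumer asks for the next one.
    n = 0
    while True:
        yield n
        n += 1

# Like a Clojure lazy seq, nothing is computed until you take from it:
first_five = list(itertools.islice(naturals(), 5))
print(first_five)  # [0, 1, 2, 3, 4]
```

One real difference: a Python generator is single-pass (once consumed, it's exhausted), while a Clojure lazy seq caches its realized elements and can be walked again.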


One thought that soothes me a bit when I think about my future as a (relatively young) web developer is to remember that there is currently plenty of demand for 'greybeards' in companies that use older technologies or companies (even some startups?) that understand the value of developers with a lot of experience, or a particular skill-set.

I fully expect that especially the latter will be realized by more and more (web) companies by the time I'm an old-timer.

That said, I do understand it might be a little bit harder to find the kind of work/location situation I want. But considering that right now I have ridiculous freedom of choice in that regard, I can accept the idea that this might not last forever.


Be worried, stay worried, but don't let your angst control you!

The post describes a peripheral vision of progress, but no relation to it. It doesn't matter one bit what new software, libraries and technologies are out there. What matters is that you have your own goals and advance toward them! The rest doesn't matter. Don't worry about others; you cannot help anybody if you cannot help yourself.

I always tell my friends who don't know anything about computer science that my job is to make everyone else's job obsolete. After convincing them that their career might one day be replaced by software, robots and algorithms, I explain that they don't need to be alone with that fear. Of course I know this makes me look like a freak, an asshole, or both. But the important part of the discussion is to communicate what really is going to happen. It happens, but the progress advances as slowly as the growth of a tree. One day it will be large, it will be a forest, and then there will be no way to ignore it anymore. This is why it matters to open someone's eyes. Of course, only those you deem worth it; most people still won't listen and just let time pass.

So yes, I share the same fear. Our destiny and true job is to make ourselves obsolete. Creating software that can create machines and products and manage people, schedules and resources. Creating software that can create better versions of itself, or even create other software to fulfill its goals. This sounds like a big stretch, but we're talking about a time-span of 20 to 30 years.

We write code that will one day replace the need for us as a regular work-force, but not as actors in the process. The future won't stop there; it'll instead create new markets and jobs where we proceed as architects, using our creativity and intelligence, until our machines create hardware that surpasses the constraints on the software, which were set by us.

This will require us to allocate new resources for energy, either beyond earth using SpaceX like programs, the invention of a replicator (using quantum teleportation and molecular re-organization) or light to mass converters.


I don't worry about this happening to me because I know exactly why I learned how to program - because I wanted to build cool shit. I do not fall in love with my languages and tools and therefore I don't feel any stress dropping them and moving onto the next thing that will better help me reach my ultimate and unchanging goal of building cool shit. I think the OP's anxiety comes from their desire to "master" a particular language or framework. If you just stop worrying about your chops and start worrying instead about the thing you're building you'll find all the volatility of the programming world much more manageable.


I think this is down to education. The move from designer to developer is possible down a narrow stack in the context of rare skills in demand (financing the learning curve on that stack). The challenge that developers from this track face is that when they have to transfer their experience from one stack to another they lack the intellectual framework that CS graduates spend 2 years having beaten into them.

All that stuff about types, functional programming, logic programming, modeling, proofs... It helps you understand things - if nothing else as a shared short hand for communication, but really as a deep set of concepts.


Nobody can stay on top during their whole life. Technology marches very fast, and as we get older it's hard to keep up and compete with younger, more motivated brains. Everybody should expect that.

This guy should think about all the people that did not even have any skill or talent in IT in the first place. Or people that are not just very smart. They manage to live anyway. They get by. My advice is to do what you enjoy and not compare yourself with others. If that means your value on the employment market is low, so be it. Let programming be a hobby, not a job.


As we get older it gets harder and harder for our brains to learn new things, but it also gets harder for our bodies to do a lot of things. Rebounding from injury or a night of heavy drinking is much tougher. Staying in shape becomes an almost part time job. You have to go to the gym and watch what you eat just so you don't get overweight.

Think of learning new programming frameworks or languages like going to the gym. Studies have shown that learning new things throughout your life can prevent Alzheimer's disease.


If you're a developer and you absolutely loathe learning new technologies then you're in trouble, I'd find a new career. Otherwise, I think the key is striking a balance between recognising the hype and fads, then spending time learning the new technologies that are going to stick around, because they provide a substantial benefit over their predecessors.

Give technologies a chance to mature before you jump in. Look at what actual companies/sites are using, not just what people on HN say is cool.


We're in our late thirties and just spent the last month (pretty much 24/7) learning golang while developing our own cloud-based relational database (which has some really impressive benchmarks, by the way), so this whole idea that you cease to be adaptable to new technologies the moment you hit 30 is silly.

If anything, we find it's the early 20-somethings these days that are rusted on to javascript or ruby and can't seem to work in anything else. There isn't any passion anymore.


Relax. Technology is part of a means to create value. Having skilled leaders is more important in value creation at scale than skilled engineers.

Projects fail because obvious risks are ignored - finding the integrity to fix those will maximise your value to anyone far more than learning AngularJS

This is not to say code should be ignored, but that code is software literacy - and soon "everyone" will be literate and the supply/demand curve will balance back out, but with a new higher baseline for entry.


The perennial question is: what do you like to do, that is worth money? What Color is Your Parachute is a job-hunting guide, but most of it is working out what you like.

What skills have lasting value? JS frameworks seem to come and go faster and faster, each one tweaking some aspect and reinventing the rest. Is there any evidence that the current crop will last any longer?

But the underlying principles are lasting. And writing, mathematics, problem-solving, people-skills are lasting.


As many have said here, focus on what you're actually building. That means at some point you have to put aside the fear that the framework you've chosen "isn't good enough" and just build something with it. With any framework, there are going to be some things that fit well and more often than not some things that don't - I think it's rare that any one framework perfectly fits any one project, which is probably why so many frameworks are in existence in the first place.

So my ideal philosophy is to pick a technical stack and try something with it. Until you're building something with a stack, you're just messing around.

Now, the harder to solve (and more infuriating) problem I see is a recruitment culture that seems to favor buzzword bingo. Why don't job ads just ask for "a developer who is comfortable having to work with or pick up the following frameworks quickly" instead of "MUST have at least 3-5 years of experience in such-and-such"?

Don't even get me started on 'language quizzes'. Oh man, I hate those. Learning language and framework minutiae by heart is pointless. In production code, it's rare to use ALL the features of a given language. So you have to be inquisitive, but at the same time, turning down candidates because they know .NET 3.0 but not the minutiae of .NET 4.5 is just lazy, unless the codebase actually uses the new features (and then you have Google anyway).

(on that note, one of the best job ads I ever saw was for a games company where their only requirement was "show us a complete game you've built". Now THAT is sane and says to me that these guys/gals know how development works in the real world. Unfortunately I didn't have a complete game to send them, oh well. )

WRT learning new things, I'm sort of going through that now. I'm "de-specializing" myself a bit by learning HTML, CSS and JS. I always pegged myself as 'not a web dev guy', but unfortunately the country I live in doesn't have the healthiest games industry. I had to drop the 'not that kind of developer' attitude and go into it with as much of a blank slate as possible. It's been interesting so far; the hardest mental roadblock I've had to overcome is the notion that "web pages are just documents", from my late-90s/early-2000s exposure to basic HTML/CSS.


about python

> I definitely don’t understand the class system.

this could be worrying if you mean you don't understand class-based OO languages as a whole. if you specifically mean you don't understand multiple inheritance, that's less worrying, and reading "super() considered super" ought to clear things up.

so much of the so-called "JS hotness" seems not entirely red hot to me. perhaps you can skip over angular and proceed straight to react and see if that makes more sense?

based on this post, i don't know if the issue is ageism so much as foregoing breadth for depth. being able to tweak SQL configurations is definitely a valuable (in terms of $$$) skill, but what if some of that time had been spent reading SICP, learning how to implement class and module systems "from scratch?" it's not something that you'd ever do in your day job, but it would probably make understanding the variations on the overall class n' module theme easier.
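if it helps, the gist of "super() considered super" is that super() follows the class's MRO (method resolution order), not its literal parent, so a diamond hierarchy gets linearized and each class runs exactly once. a tiny sketch (class names made up for illustration):

```python
class Base:
    def tag(self):
        return ["Base"]

class A(Base):
    def tag(self):
        return ["A"] + super().tag()

class B(Base):
    def tag(self):
        return ["B"] + super().tag()

class C(A, B):
    def tag(self):
        return ["C"] + super().tag()

# super() inside A does NOT jump straight to Base here; it calls the
# next class in C's MRO, which is B. the diamond is linearized as
# C -> A -> B -> Base, and each tag() runs exactly once.
print(C().tag())  # ['C', 'A', 'B', 'Base']
```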


I find as I get older the urgency to do something technically relevant and interesting increases because time is running out. When I was a kid, I would focus on technical tasks that were not core because I had all the time in the world. Fortunately, now that I'm older, my ability to judge what is relevant and interesting has got much better.


In 1973, when I was working at a summer job as an assembly language programmer for a university medical research computer center, I asked the section supervisor about the career path for programmers. I was told: there are no old programmers, after 15-20 years or so you will go into management or leave the profession.


I find myself moving farther down the stack rather than picking up the "new hotness" framework etc. Right now I'm very interested in embedded / low-level systems. Someone still needs to maintain and build the actual technology that these young'uns use. :P


"There are other, more important needs in my life that are not related to programming languages.

What I’m most scared of, though, is being left behind."

At least there is a choice. Also being creative is one of the important needs IMHO.


Devops actually makes things a little easier, I think. With Ansible, setting up a server on AWS feels a little like maintaining a vimrc file. Get a starter playbook from someone and run it to set up a server on AWS, then modify it as needed as you go through your days as a developer. The great thing about Ansible playbooks is that they codify the setup: you have a "recipe" that you can modify and run as needed. For me, this is much better than the days when I would set up a VPS "firing from the hip." If you had me set up 10 different servers back then, I would have set them up 10 different ways.

I think for development knowledge in general, it might be good to look at the following people...

http://www.kalzumeus.com/ (patio11 here)

http://brennandunn.com/

http://unicornfree.com/

Take a read through. For these guys, it's all about products. In the end, that's all that matters. Maybe try for a change of perspective where you look at your skills in terms of products.

At Amazon, there are probably projects where they lock developers in dark basements to learn arcane crafts so specialized that these engineers forget how to spell their names. These engineers would probably struggle just the same as you if the need for these specialties were to disappear. Faced with the need to "retool", would they be able to do it? Some might, some might not.

There is (and probably will be for quite some time) still a lot of need for people doing the sort of "LAMP" development that you cut your teeth on. Just because the stack wouldn't run an F-35 doesn't mean it isn't valuable.

So, look at what you know in terms of business models rather than actual skills. Focus more on the product which people need and then fill in the skills.

Do people need AWS and Devops? The client probably isn't going to fuss over this. The client just wants to ship a product. You handle the details.

Most businesses start here and then change as they get more successful. These projects still need the general guys to come in and get the first iteration shipped. Then later they might need to bring in more specialized engineers to deal with more specific issues. Eventually even the founders may hand over the reins to "grown-up" leadership.

So, you need to figure out where your "market fit" is. You can't be everything to everyone (though at one time it seemed like we could be). Once you find that, just carve out some time to continually develop it and keep an eye on where it's going. Maybe you could even have a side project to use as a plan B. You don't have to spend so much time on it that you take away from other important things. Just plan a good hour of focus at regular intervals, even if it's just an hour every couple of weeks. You might find that it develops in an interesting direction and you may want to pursue it further.

Further reading...

http://letsworkshop.com/freelance-as-a-service/


I am scared as well. Most jobs seem like dead ends, and I bet this will get worse as I get older.


A lot of the comments here are about age, ability to learn, etc. All valid, but I'm going to switch it up and give a different point of view, because this is very close to something I was discussing with some friends over a meal recently: three of us at the table were what I'd call "advanced" developers and one was a "scrappy PHP guy."

Our backgrounds are LAMP (isn't everyone's in web?), and we were discussing that despite technology getting "better" (quoted because that itself is arguable, but for the sake of this comment, accept it), getting something out the door using "modern techniques" has a terribly steep learning curve.

I learned web programming when I was 12: slap some Apache + mod_php (a couple package installs away), FTP up some PHP files, and BOOM, you had a website that was scalable to most needs. I learned PHP by pretending it was HTML and not really understanding the PHP. It was easy that way.

Python, Ruby (Rails), JavaScript (Node) are not easy that way. You have to actually understand some things: web serving, HTTP, cookies maybe, GET vs POST, etc. I probably went years without knowing what any of those were, and still strung together stuff that worked (for my use cases: family websites, small business websites, etc.).

And then it doesn't stop there, that's just at the language level. Now you have a choice of databases, you have these PaaS things that are sometimes great and sometimes not great, you have shared hosting and VPS still but you also have fancy new IaaS things, you have people rambling on about configuration management, and now people are starting to throw the word containers into the fray.

In short: for the programmer who just wants to get things done (maybe not correctly, but correct enough for them), things seem to have gotten worse. Much, much worse.

And I admit (or believe) that it has, but I argue that it is worse _temporarily_. And worse (worse for a certain type of programmer) _necessarily_.

First, LAMP is still available, still heavily used, and still easily learn-able and deployable. The issue is that apps done so cavalierly are prone to maintainability, scalability, and security issues. As the web has grown, we've learned some things about how to improve that. For maintainability, we introduce some concepts like testing, and language constructs like classes. For security, web frameworks help mask the really low-hanging fruit security issues that usually cover you in most cases unless Mossad is after you (then you're fucked anyway). And for scalability, we've leaned towards horizontal scalability for various reasons.

Now, each of these things can be poorly bolted on to existing programming styles. But in general, to do it correctly, a lot of things had to be redone from the ground up. Low-level components had to be rebuilt. Examples: web frameworks, safer languages (in that they're harder to make dumb mistakes in), configuring servers, deploying apps.

I argue we're still at that very low-level stage. The old "learn PHP like it is HTML and deploy some stuff" programmer just isn't comfortable at the low level we're currently treading through. But as the low-level components stabilize (over many years), eventually the tools will become higher level and easier.

I don't do any frontend, but from the sidelines it looks like the same way: there are dozens (more?) package managers for frontend and it looks like people MIGHT be starting to stabilize to a few, but it still might be years before we see anything super stable.

Of course, technology will always be moving forward. So as things stabilize, someone will be working on the next thing. But I hope that that next thing is a higher level problem. As we figure out how to build secure, scalable, maintainable web apps, maybe we can move higher up the stack to make this stuff more teachable without understanding everything.


Honest, legitimate post that I very much enjoyed.

However, I do think posts like this feed into ageism. I say this because it posits that the change is that the author is older and has more things to do, and has therefore fallen out with technology. The take-away invariably being that people who are older and have more things to do have fallen out with technology, whereas the young are still with it, whatever "it" is.

Let's back up for a moment.

I've worked with people throughout my career who never learned a thing that wasn't specifically taught to them on the job. They had very little curiosity, and an email saying something like "try to look into this" would get met with derision. "Is this on the test?" they would effectively ask, when really I was just trying to get people interested in something that might have an impact on our work. So soon enough I had to be very specific and guide people along the technology path.

They left work and played volleyball and worked on their tuner car and went to family events and so on. They learned exactly what was specifically necessary for their job today and tomorrow. Never beyond.

Most of them were 20-ish year olds.

I was the prototypical "love technology around the clock" sort, and I was very, very much the exception. I know no one I've ever worked with who savored technology like I do.

But there were moments when I lost interest. One was on a return from my honeymoon. Being away from technology for a couple of weeks, suddenly it was all very "Meh". I could easily have taken that moment and said "Oh well, I guess this is what happens when you turn 27. Guess it's time to look into management."

But I didn't because that would be wrong. I lost my mojo and was burned out. Not long after I changed jobs, got recharged, and was at it again.

And I love it. I love the change. I love the technology.


I'm 27, used to be in tech like the biggest nerd of them all. But not anymore, frankly I'm tired of all of the changes that happened in the industry - it's not that I'm tired of learning but it's that the changes lack substance. Learning a new framework that does the same thing like layout or user design; it's not like learning music and digging into a deeper discipline like moving from acoustic strumming to jazz improvisation, it's just learning new keywords to do the same crap because the old keywords have been remapped.

I'm reminded of a writing blog: like writers of popular fiction, programmers aren't creating technology anymore; we are simply selling an idea (e.g., Martha Stewart for home-furnishings flash-sale sites, fear of missing out for Facebook/Instagram, a diversion or outlet for teenage aggression/frustration for iPhone games) that validates a user's idea, wrapped around technology. Hence, tech trends surround consumer tech instead of systems programming now, and new web frameworks sprout by the day that emphasize shaving the man-hours of a routine set of mundane tasks down by a fraction of an hour for the benefit of the agile burn-down chart.

This is not meant to be a depressing or cynical comment but a liberating one for me. Too often I have watched myself and many of my mostly younger peers try to reconcile art and commerce, or to put it bluntly, to have our cake and eat it too. I see that as a cowardly or indecisive but perhaps necessary stage in one's journey of growth: not committing yourself, not expressing your personal values through the imperfect tradeoffs and limited circumstances of a finite professional life, but being swayed by the arbitrary trinkets one's professional guild assigns and equating its idea of worth with your self-worth.

VB, Pythonistas, Rails, Node.js: I've seen them come and go like names on the Billboard charts moving to VH1's "Behind the Music/Where Are They Now?". I'm convinced now that you have two routes in this business: 1) make your programming craft secondary to the development of your domain expertise, so learn finance, SEO, or sales, whatever is the main objective of your business; or 2) make your programming craft your domain business and work at companies whose business is the software.


I reminded a co-worker last week, as he was figuring out whether to use Grunt or Gulp: "Computers aren't the thing. They're the thing that gets us to the thing." (Quoted from Halt and Catch Fire.)

Photography has its own version of the same quote: amateurs worry about equipment, professionals worry about time, masters worry about light.

Users do not care about the technology used; they care about results.


Same thing, bro, with musicians ("amateurs are gear-heads, pros worry about technique, and masters try to listen to the sound"), with traders/gamblers ("amateurs focus on their winnings, pros worry about preserving their capital, and masters worry about executing their performance, where losing and winning money is only a side effect"), and so on ad infinitum.


Reminds me of the Freaks and Geeks episode where he goes to a drumming audition and gets made fun of a bit for his enormous kit before completely bombing.


And in war, strategy, logistics, capability.


They may come and go in the small bubble of blogs you read, but the real world works very differently.

Heck, I work with Perl on a regular basis. Not for some amateur web pages but for real, important, back end stuff. What matters is that the code is solid, not how many new blogs are started on the subject.

Remember that the ones who get things done in the world do not blog about it. No blogs chronicled how they chose dynamic frameworks for the first version of Google. Because they didn't. They whipped out their editor and compiler and went to work.


> Remember that the ones who get things done in the world do not blog about it.

Perhaps not necessarily true. I don't blog as often as I used to but I still get things done. The act of writing helps to strengthen and reinforce good ideas and train one to recognize them in the future. Publicly or privately matters little... but some of us do "blog."


I think your parent's language was a bit inflammatory, but I like the reminder that not everybody doing interesting things is blogging and tweeting about it. Lots of people just work.


> Learning a new framework that does the same thing like layout or user design; it's not like learning music and digging into a deeper discipline like moving from acoustic strumming to jazz improvisation, it's just learning new keywords to do the same crap because the old keywords have been remapped.

Perhaps you should look into Haskell, Clojure or Erlang for a change?


I love HN. In response to a complaint about learning new frameworks, you suggest three other alternatives.


They are all languages (not frameworks), which are valid analogies to the guitar metaphor. They are so different from your usual C-like languages that they will be challenging to learn (and likely fun as well).


I think the difference is not just that these are languages rather than frameworks, but also that each of them teaches you different skills and approaches by its nature (which aren't "new," so they have clearly stood the test of time).


All but one are profoundly old, yet making a come back.


Haskell is from the 80s. Erlang is also pretty old, and Clojure is a Lisp, the second oldest programming language around. Which is the `but one'?


If you put it this way, Clojure is old too I guess.


Though to be honest, Lisp has come a long way since the 50s.


In the context of web applications, what do Haskell or Clojure bring to the table that makes them worth the learning curve?


In the context of noname123's post they bring interesting learning to the table: they are not more of the same with renamed keywords. To quote,

> [...] it's not like learning music and digging into a deeper discipline like moving from acoustic strumming to jazz improvisation, it's just learning new keywords to do the same crap because the old keywords have been remapped.

Haskell _is_ digging deeper into the discipline.

I am sure one can make good arguments for some of the Haskell and Clojure frameworks for web applications. Alas, I am not qualified. I only used Haskell professionally for embedded development, and for rapid desktop application development, but I've never done web development seriously.

In any case, the Haskell answer for web development will probably include what all Haskell advocacy ever includes: purity and the strong static typing.


How do you Haskell on embedded systems? Are you talking about baremetal with 1-32K SRAM?


To take this thread yet further into self-parody, you might like Rust; it borrows some interesting stuff from Haskell (or really, the ML family), but is actively targeting embedded systems.

Edit: Also, I have heard that Ocaml can be made to work pretty nicely in an embedded environment.


You can make OCaml (and Haskell) run on Xen, no need for a full blown operating system. (Not sure if that counts for anything.) The OCaml project for that is called mirage.


No, not nearly that baremetal. I was talking about control processes running in dom0 of xen. They are embedded in the sense that there are `no user serviceable parts'.

(I just looked up embedded software on Wikipedia; it seems like the traditional use of the term is very different to how I used it here. Please pardon the confusion.)


Ignoring the joy of writing software with much more powerful abstractions for a moment (Python, Ruby, JS, and PHP all look like jalopies now compared to Haskell), it has very real industrial benefits:

Write software faster with fewer bugs, easier to maintain down the road because the "mental model" is maintained in the types by the compiler, it's more succinct, it's fast, some of the world's brightest computer scientists work on it, and so on...


Thanks for your response. I'd like to dive a little deeper into the basis for these claims. My current stance is that there isn't sufficient reason to adopt Haskell for use in production.

> Write software faster with fewer bugs

Is there an evidentiary basis for the 'fewer bugs' claim? What type of software are people writing faster with Haskell?

> easier to maintain down the road because the "mental model" is maintained in the types by the compiler

That seems fair.

> it's more succinct, it's fast

In isolation, that's a bit hand wavy IMO.

> some of the world's brightest computer scientists work on it

This is something I hear quite often from the Haskell community in particular. It may or may not be true, but it gets repeated far too often IMHO. It feels like a bit of an appeal to authority.


Interesting you should ask for more verification of my claim of fewer bugs, because I'm about to embark on using Haskell at the new startup I'm working for, and they may require stronger reasoning than I have been providing.

I've been thinking about how to qualify the claim logically or maybe quantify it. Anywho, aside from that I can tell you that the vast majority of bugs I introduce into my Python code (or other languages) are caught by GHC - these are bugs that purely have to do with my inability to remember what something is or is doing, or fear of refactoring something, or some tangled mess of types that are harder to reason about in Python because they're in my head instead of encoded in Haskell.

I've discovered that when I've been able to compile my Haskell programs the bugs I find in them are usually business logic bugs now. Occasionally I'll make use of QuickCheck / HUnit for automated property testing and unit testing, and that will help catch those business logic bugs.

Very subjective and anecdotal but I can promise you that the experience is very real.

> In isolation, that's a bit hand wavy IMO.

Sure, but if you take into account Haskell's denotational semantics vs. other languages' operational semantics, it should be pretty clear that Haskell inherits mathematics' idiom for succinct expression. That idiom also informed the language designers when building Haskell's grammar and syntax; it's very flexible and abstract, both in the essence of the language (its semantics) and in its modality.

When reading idiomatic Haskell, there's a lot of information packed into a line, conceptually and syntactically.

> This is something I hear quite often from the Haskell community...

Sure, it probably borders on it, but in many ways other language communities commit the bandwagon fallacy (Go is one), so I think you can poke holes all over the place. The fact, though, that proof systems and dependently typed languages like Agda and Idris (which are the future and, I think, will supplant Haskell unless Haskell can evolve) are being written in Haskell speaks to the level of intellect and forward thinking in the community.


>> Write software faster with fewer bugs

> Is there an evidentiary basis for the 'fewer bugs' claim? What type of software are people writing faster with Haskell?

http://donsbot.wordpress.com/2007/05/01/roll-your-own-window... and especially http://donsbot.wordpress.com/2007/05/17/roll-your-own-window... give a good introduction into the mindset behind programming in Haskell.

The latter focuses on zippers. Zippers solve the problem of having a collection and a `cursor' into that collection to mark one element--in the case of XMonad we mark the window that has focus.

In C you would probably solve this with an array and an int. Unfortunately, the compiler can't help ensure that your int always points into the array, and that deleting and inserting are doing the right thing. Zippers help here.
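To make the idea concrete outside Haskell, here is a minimal list zipper sketched in Python (illustrative only; XMonad's actual type is richer). The point carries over: moving the focus returns a new zipper or `None`, so there is no integer cursor that can silently drift out of range.

```python
from dataclasses import dataclass
from typing import Any, List, Optional

@dataclass
class Zipper:
    """A list zipper: the focused element plus the elements to its
    left (nearest first) and to its right."""
    left: List[Any]
    focus: Any
    right: List[Any]

    def move_right(self) -> Optional["Zipper"]:
        # None instead of an out-of-range cursor.
        if not self.right:
            return None
        return Zipper([self.focus] + self.left, self.right[0], self.right[1:])

    def move_left(self) -> Optional["Zipper"]:
        if not self.left:
            return None
        return Zipper(self.left[1:], self.left[0], [self.focus] + self.right)
```

Insertion and deletion can be defined the same way, always relative to the focus, so they cannot invalidate it the way shifting an array under a stored index can.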


> Ignoring the joy of writing software with much more powerful abstractions for a moment (Python, Ruby, JS, PHP all look like jalopies now compared to Haskell)

Just out of curiosity: Which abstractions are we talking about here that Haskell doesn't have? OO?


I didn't write that sentence very well if that's how it was interpreted, sorry. I meant to say that the abstractions available to Haskell are more powerful.

I can't really think of any abstraction that's not available to it - even an Object System is possible in Haskell, it just wouldn't be as easy to use as Monads are.


> I can't really think of any abstraction that's not available to it [...]

Dependent typing doesn't work very well in Haskell (yet).


I believe he meant Haskell's abstractions are more powerful.


I used to have that passion when I was young (I'm 29 now, so not very young for the tech world, although not old either). And to be honest, it was the 40+ hour work week that killed it. I never liked specialization, so to me it makes a lot of sense that, if you're already investing that much time in technology, you may as well spend the rest of the week doing a different thing. I'm getting a new degree (in principle, I don't plan to make a career change, I just want to do something different with my evenings). And this summer I'm learning a bit of Finnish on my own, because why not. Obviously, it's not like I ran out of curiosity; rather, I prefer to direct it at several different things than to focus on a single one.

If I devoted my free time to technology, I would be doing so just to not become obsolete, not out of genuine interest, and that would be too frustrating. I actually have some tech interests (Scala, Hadoop, any kind of complex algorithm, etc.), but when I get home, there is so much I'd rather do that I don't program at home any more, except on rare occasions.

There is a lot of pressure to spend all your waking time on technology and things related to your work or your career (I'd say this is part of the so-called "Californian ideology"; maybe I'm wrong). But, in actuality, not that many people are so obsessed with learning the same area, and the competition to not fall behind is mostly illusory (ageism is a different problem, and no amount of knowledge will free you from it). So my advice here is: if you don't feel like devoting your life to a single thing, forget about your job the very moment you close your office's door, and enjoy any of the million other things life has to offer. And of course, if you are really that passionate, keep learning new programming languages or techniques in your free time (just be careful about burnout).

Final note: the most tech-passionate guy at my job is one of the oldest (36). It's not a matter of losing it over time; it's a simple matter of people's preferences (and in some cases their evolution over time).


I think the tech industry discounts the value of expanding your horizons. I have a background in design and writing, and when I choose to incorporate those disciplines into what I do with code, a lot of magic happens. Let's say I order a class like I do a news article (inverted pyramid, aka most important shit first). Or I apply my own design eye to the motion or interactivity of a static design. Or even to how I interpret the structure of the code.

Reach out. Learn as much as you can. You end up finding a lot of relationships between a lot of things in life that can apply equally to each other. Right now I'm digging deep into car repair. It's fun to find the correlations between the component systems of a car and software.


> I think the tech industry discounts the value of expanding your horizons. I have a background in design and writing,

I had a developer get up in my face once and chew me out for studying design in my spare time instead of trying more new languages and frameworks, because I'm a developer, dammit, and it should be all I live, eat, and breathe.

That's bull. Well-rounded people bring a lot to teams, not least the ability to speak to other specialties in a common language.

Keep broadening your horizons & keep being awesome.


I think expanding horizons beyond technology is a really important key idea that you've stated.

Programmers that are learning more mathematics, music, painting, or a scientific discipline beyond computers really expand their available set of symbols and motifs - creative ideas then emerge from that soup.


Is it ageism? Or is it just experience?

If you've been in the industry for 10+ years, there's a certain cynicism that occurs, when you're learning a new framework/language every 2 years that's just an incremental improvement on another language.

Just because a boss tells you to. "Oh learn this, it's gonna be the next big thing. It's gonna propel your career. Oh, you're done with that? Now, learn this new thing for your next project. 3 months later.. OK, forget that, we're switching frameworks. Learn this now."


It's very hard to care a lot about the new technology that is just the same as the old technology you used 10 years ago.

It's a giant zero-sum game to expect everyone to pour time into running in place to not fall behind, and we're not getting any better.


So your boss is basically telling you to learn all of these things so that you will remain a cog in the machine. Like running really fast trying to move ahead, but really you're just on the treadmill and staying in the same place.


That's a great way of putting it. It's exactly where I feel I am now.

I wish I did C more often. Most of that knowledge has longevity.


My first website made CGI calls to an API written in C. 17 years later, I'm using the same basic model, except with Go.


> The take-away invariably being that people who are older and have more things to do have fallen out with technology, whereas the young are still with it, whatever it is.

But it's still true, in general.

Sure, I work with plenty of early 20s people who have (frustratingly) no interest in their domain outside of what is necessary to get them through the work day. These young people exist.

But there are also the young people who are actually interested, and, in general, they have a comparatively infinite amount of free time to dig into languages and libraries.

If you have a spouse/partner, some of the time you used to spend working on your interests is now spent nurturing your relationship (and if you don't do that, I guess you'll have more free time for yourself again later on). If you have a child, then you must take another chunk of time spent on personal interests, or work. For most people it ends up being their own interests, since they still have bills to pay.


But it's still true, in general.

At best what you could say is true is that people with no relationships and no "extra-curriculars/external interests" have more free time.

This crosses all ages. There are young people with filled social and activity calendars (I've worked with young people who seemingly had filled social calendars at work). There are older people who have all the time in the world.

I remember reading once that the best employees are the unhappiest (in life). These were the people with endless time because they had no activities, no family or social relationships, etc. They'd happily burn the candle 120 hours a week because they had nothing else to do. Not sure how beneficial that is to knowledge work.


I agree completely, you sound exactly like the kind of person I am!

I just finished my computer engineering bachelor's degree a month ago; I am 26 years old. Of all the people I know from school, only a small handful of them have ever done any kind of coding or anything outside of the school curriculum (more than just reading a tutorial one night).

That is, for me at least, a frightening thought. You want to work as a developer but you do not spend any time outside work to learn?

Some of us are curious and try things out. I believe those who do are (mostly) the people who can be good developers. A situation that was particularly curious to me: we had our first Programming 101 (Java) class the semester before Christmas, and then the second class started around mid-February. In between, a huge portion of the class did not look at any Java, or do anything, or even try it out. So, unsurprisingly, barely anyone remembered anything. There was just a complete lack of interest, and I can't figure out why they would choose this career path if they did not enjoy it. I do not understand how those people managed to pass classes, or how they are supposed to do good work later.

I have had a growing interest in computers since I was a child, and I think I have been coding since I was around 12-13 years old. During my time at university I have attempted to get people more interested in coding by suggesting they figure out some problem they want to solve, and then spend some time making something. I think I managed to get only a few of them to try anything, while the rest were questioning why they would want to do anything like that... I know quite a few of them have gotten jobs directly out of school, but I have no idea how.

Edit:

I do understand that not everyone has time to nerd out all the time, especially when you get older and have a family. I also like to relax and recharge, so I try to keep busy doing other things like exercising. Sometimes I just get so sick of work, school, and everything that I can't seem to write any code at home for personal projects. It is nice to be able to burn off that unspent energy mountain biking, for example. And sometimes that just seems to shake loose a solution to a problem I have been having, or give me inspiration to create something new.


Yes. Somehow "I am less motivated to work hard and learn new things in my 40s" turns into "Everyone is less motivated to work hard and learn new things in their 40s."

The author sounds a bit bored. His mind is telling him, "After 15 years of LAMP development, I'm bored, let's do something different!"

What should he do? Maybe he should take some time off, learn something new and exciting, or maybe move to the countryside and become an organic farmer. Only he can know.


If you're sick of building CRUD web applications, it's probably a good idea to remind yourself that there are an infinite number of other things you can create with computers.


Yes. Somehow "I am less motivated to work hard and learn new things in my 40s" turns into "Everyone is less motivated to work hard and learn new things in their 40s."

Yes, it very much is worded exactly like that. If you describe a starting state, then a change of environment or inputs (getting older, which happens to all of us), and then hold up the resulting state as the consequence of that change, you are generalizing, intentionally or not. It is simply unavoidable. Sarcasm about this obvious reality isn't helpful.

To put it another way -- if a technology-ignorant HR recruiter read that piece, they would absolutely think "older, established people are less valuable hires" rather than "this individual is a less valuable hire."

This, of course, could be the cycle turning in on itself, in much the same way that some minorities attribute their condition to being a minority rather than to individual, more unique traits.


> They left work and played volleyball and worked on their tuner car and went to family events and so on.

'They had interests besides developing work skills.' You say this like it's a bad thing. This is a very disturbing aspect of HN/startup culture: the disdain for reasonable life-work balance & non-obsessive programmers.

Life goes by fast, I hope you're stopping to smell more than just the new programming frameworks. :)


I share the same sentiment and thoughts. Have you ever read Zen and the Art of Motorcycle Maintenance? If not I recommend it, been a great book to read and think about over the summer.


"However I do think posts like this feed in ageism"

That was my exact thought after I read this as well. I'm 36 and starting to feel the creeping fear of being seen as "too old" to do what I do. People posting "I did PHP for 15 years and don't get Python and don't want to learn DevOps and AWS is so confusing" sounds like a senile old man complaining about the good old days. If you want to be marked as a crotchety old man then great, don't make the rest of us look that way too.


You're never too old to "do what you do", unless you have nothing to show for it. Have you written software that others find awesome and useful? Are you still capable of writing such software? If so, you have nothing to worry about.


Right, it's not a question of being too old to do the job, it's a question of the PERCEPTION of being too old to do the job.


"They left work and played volleyball and worked on their tuner car and went to family events and so on. They learned exactly what was specifically necessary for their job today and tomorrow. Never beyond.

Most of them were 20-ish year olds."

Any idea what happened to your colleagues?


Most of them are (if not all...I can't think of a single one who isn't) still developers. Few have made any great impact, and most have had the "people get hired in as my boss" experience a number of times.

But they're still in the industry, doing a job, getting paycheck, and supporting a living.

Our industry is rather unique in that it's a 24/7 job for some, even at the lowest tiers.


I'm 31 and can relate to some of this. Learning new technologies is fun when it's a genuine step forward, or something new to explore. It's not fun when it's just "how we do things here" bullshit that you have to take in because some 27-year-old "wunderkind" (who isn't that great, but is politically protected) called those shots that way. There are a lot of new technologies that aren't impressive, or are obvious bad ideas (ORMs).

The lack of coherence in the software career is troubling. See, what we're not told is that (at least under closed allocation) 4/5 of us are going to be bench-warmers, given low-importance evaluative projects, and what determines whether a person gets promoted to something real, or gets enough clout to direct his career and protect his specialties, is mostly politics, not merit. (This can be afforded because, put to genuine work, good programmers are worth $500k-2M/year to the business, and great programmers are worth $2-10M. So low utilization is quite affordable with programmer salaries at their current level.) The result of this is that we have to job-hop until we find jobs where we "click" (find like-minded people, or win the political lottery, or just come in at the right time) every 1-3 years. It's not fun to change jobs constantly, but there's often no other way to have a legit career in this industry.

The other thing that's really shitty is that employers demand specialization and high-quality work in job candidates, but refuse to recognize specialties once the person is on board. In terms of work experience, most companies don't eat their own dogfood. This is also why software firms have a terrible record on internal promotion. Even they would rather hire, for a high-impact position, a mercenary who only did cool stuff over the loser "team player" who let them load him up with grunt work.

I think it's time to consider some form of collective action. I don't want an old-style union (with wages set by seniority alone) because that tends toward mediocrity. I think our model needs to be more like the Hollywood model (Screen Actors Guild, worker-side talent agents). We're generating a lot of value, but ultimately the people who decide what to do with our work aren't using it to improve the world or the state of technology, but to cut jobs. We're in the business of helping executive assholes unemploy people, and that just fucking sucks. We need to take charge of this industry and make it work on our terms, not theirs.


At some point in life we become so busy with something else that we stop keeping up to date with the latest trends. Suddenly we realize everything has changed and has become foreign and complicated. We feel stupid; incapable of catching up. We feel old.

But listen, it's crucial to ignore this fear, because it's wrong. We don't really lose our ability to learn until retirement (and, for the lucky ones, even past it).

When you decide to stop looking back, and decide to bravely dive into the big unknown, you realize "you still got it".

Been there, done that. I still got it ;)


The global marketplace will also bring more change to our industry. I just got let go from a contract because the owner had 5 people on the team from overseas (India and Egypt) getting paid 1/4 my salary. They were also willing to work all hours of the day, weekends, and US holidays.

Open source has made it so businesses don't really need to hire engineers anymore, only developers good enough to make changes to a system engineered for free. As the older generation dies out and the younger generation starts businesses, it will only get worse.

This is why I started my own business and totally quit development (aside from my own business).


I'm 27 and I've been thinking about the problems described in the article. All the new learning becomes bullshit when you work on your own project. I can see that it's a huge annoyance when it's required to earn a salary. For example, we are a TDD shop for rapidly changing software: learn or know TDD, or gtfo. Even though you know TDD is solely practiced by the sadomasochistic crowd stuck in an acid trip, you have no choice.

What if you just focused on creating your own software and your own projects? For me, when I focus on my own development projects, all of the angst and worry described in the article goes away because I stick to the tried and tested tools instead of jumping on trendy frameworks or javascript-not-just-on-browser type of crowd. I just use the best tool that I am familiar with.

It really doesn't help your career when you've written a desktop app using Java Swing on your own project, or when you've spent a long time working on a SaaS using LAMP, or when you haven't implemented it with AQMPMONGODBMETEORJSANGULARJSAGILENODEJS, or when you apply for jobs that list 10 years of AngularJS experience.

Ultimately, shouldn't we, as holders of a highly intellectual skillset, be masters of our own destiny, instead of worrying solely about the career aspect of it? Why not say, fuck this job, I'm gonna build something and sell it? This is the way I see it.


GET OFF MY LAWN!

