After having been in this industry for 20+ years, I can pretty confidently say the world of software will never wake up. It's just been an endless cycle of adding abstractions, removing them again, using back end languages on the front end, using front end languages on the back end, using relational databases instead of a key value store, using key value stores instead of a relational database, unit tests over integration tests, integration tests over unit tests, silver bullet framework after silver bullet framework.
My feeling now is that in the greater scheme of things none of the technology choices really matter all that much; the main thing is whether the software is written in a clear, documented and maintainable fashion. Unfortunately people will still throw that away and pointlessly rewrite it in a few years, so choosing to contribute in that fashion is more a matter of professional pride than practical utility. Arguably it's better for your career to let go of that pride and just embrace the constant unnecessary rewrite process.
I do wonder if there are still some corners of software development which aren't like this. Perhaps in public universities or government?
This I think shows how "clear, documented, and maintainable" can mean different things to different people.
I specifically left Rails and Ruby because I failed to see any production Rails code actually meet any of those criteria. All of the metaprogramming, OOP, inheritance, and DSLs just made the code more confusing than it needed to be. Ruby is a cult of "the code documents itself", but code can never document itself, because good documentation includes the how and the why and some examples. Code is horrible at describing how or why it exists, and no, unit tests are often not sufficient examples. And then there's the issue of Rails apps taking way too long to boot despite their only job being to serve webpages. Debugging serious issues is a pain when the Ruby code takes minutes to actually run, and code that is neither clear nor documented doesn't help. It's always abstractions upon abstractions upon abstractions.
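As a minimal sketch of the kind of metaprogramming the comment above complains about (the class and method names here are hypothetical, not from any real codebase): methods are generated at runtime, so grepping the project for "def price" or "def total_price" finds nothing, even though both calls work.

```ruby
class Order
  DEFAULTS = { price: 100, quantity: 2 }.freeze

  # Readers searching for "def price" or "def quantity" find nothing;
  # the methods are conjured in a loop.
  DEFAULTS.each do |name, default|
    define_method(name) { default }
  end

  # "total_*" methods exist only by naming convention, resolved at call time.
  def method_missing(name, *args)
    if name.to_s.start_with?("total_")
      public_send(name.to_s.delete_prefix("total_")) * quantity
    else
      super
    end
  end

  def respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("total_") || super
  end
end

puts Order.new.total_price # => 200, yet "def total_price" appears nowhere
```

The code works, but the reader has to mentally execute `define_method` and `method_missing` to discover what the object's interface even is.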
But some people like Ruby because it's a beautiful language and Rails gives them an opinionated structure, and maybe that matters more to them.
I'll take simple functions and primitive data structures with detailed comments any day over design patterns with a bunch of classes to describe abstract ideas that inherit from one another and fail to self-describe.
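The alternative style described above might look something like this (again a made-up example, not from any particular project): one plain function over primitive data, with comments carrying the how and why.

```ruby
# Each line item is just a Hash: { name:, unit_price_cents:, quantity: }.
# Prices are integers in cents to avoid floating-point rounding errors.
# Returns the order total in cents.
def order_total_cents(line_items)
  line_items.sum { |item| item[:unit_price_cents] * item[:quantity] }
end

items = [
  { name: "widget", unit_price_cents: 250,  quantity: 4 },
  { name: "gadget", unit_price_cents: 1000, quantity: 1 },
]
puts order_total_cents(items) # => 2000
```

No classes, no inheritance, and the entire behavior is visible at the call site.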
Unfortunately, in practice there is so much Ruby/Rails magic going on that most Rails projects end up in terrible shape.
It's kind of a catch-22, and tons of companies have overcome it, but Rails by far has the least long-term maintainable defaults. Good for quick prototypes and small teams, but bad for large and scaling teams.
This was my experience with Rails (to be fair, some years ago). Each new version reinvented some huge chunk of the previous version, with magic all the way down. That's basically why I stopped using it despite having invested lots of time.
I think it's a maturity thing. I've found myself going from wanting to use cool technology to build whatever to wanting to use boring technology to build cool software.
A few decades of programming has taught me that cool technology inevitably turns out to be janky and annoying and half-finished.
I agree it's a maturity thing, but I wouldn't say the cool technology 'inevitably turns out to be janky and annoying and half-finished.' I think it's more that the cool technology is developed for a certain use case and, developers being the way we are, we pick up our new hammer and proceed to test whether every object around us is a nail.
We also won't accept other people's test results. Sure, 80 other devs have said this isn't a nail, but are they SURE?
Maybe not inevitably, but the vast majority of yesteryear's cool technology is obsolete and forgotten today. What remains became mature, "old" technology.
There was a time Java used to be cool new technology.
20+ years here too, and I agree to a large extent.
The conclusion I've come to is that we're making expensive sandcastles. I don't know when I start a project how big the sandcastle will need to get, or exactly what it will end up looking like. I also don't know when the tide is coming in. Some of them were pretty good, others were disasters, but they've all washed away now.
I do quite like building sandcastles though, so I don't worry too much about it.