You are spot on w.r.t every assertion you've made. When bean-counters took over the ecosystem they optimised immediate profitability over everything else. Which in turn means, in their mind, every part of the system needs to be firing at 100% all the time. There's no room for experimentation, repair, or anything else.
I've commented several times here on HN about lack of slack, because when I notice a broken system nowadays, 90% of the time it's due to a lack of slack in the system to absorb short-term shocks.
The problem is, in the minds of these people 'firing at 100% all the time' generally means doing busywork and/or thinking of ways to cheat/manipulate their customers and the market for maximum gain while delivering minimum value. I would have loved to be 100% engaged working on solving real problems in honest ways at some of my past jobs, but alas, MBA/marketing leadership, which has taken over much of tech, has very little interest in actually building good things and solving real problems in honest ways.
No, the process has impeded an even higher standard of living, because it misallocates resources from value generation to value appropriation.
It's the extreme short term profit maximization that makes the economy a zero sum game. Otherwise it is not.
I don’t know who his wealth was transferred from, exactly. But I do know what he’s using it for now: as a gravitational force to unilaterally screw with public institutions and systems the rest of us depend on.
Even if you agree with some of the DOGE project's goals, the way it operated was wildly thoughtless about consequences beyond Musk's personal wishes, and almost completely unaccountable.
I’m honestly sick that my personal Model Y purchase helped add to that power.
And I say that as someone who was a huge Musk fan for years, despite the warning signs — the Thai “pedo” comments, and his very public turn during COVID.
You can always trick investors. For example, all the overpromising Musk has done over the years. Also, when you are that famous you can sell lower-quality goods at a higher price that people will buy because they are associated with you.
All true. That's why there are laws against fraud.
What do you think of the tricks that California pulled to have billions vanish with not a mile of track laid? Or all those hospices with no beds? Or this fun one:
Unfortunately, as a taxpayer, I am on the hook for those tricks. With a business, I can do some due diligence and then decide if I want to get in or not.
I suspect you and I are being fleeced far more by government waste and fraud than by businesses.
The “created vs transferred” thing is more nuanced than that.
At some point, accumulated wealth becomes power, and that power can be used to pull energy, attention, labor, and public resources out of the system for one person’s agenda.
And in Tesla’s case, the stock value creation story is insanely unorthodox, to be charitable. A lot of that valuation was supercharged by years of market-moving hype done personally by Elon: 10+ years of FSD timelines that never happened, the more recent “buy a Tesla, it’s an appreciating asset!” super-lie, the mission gradually being abandoned, GOP craps on EVs and it’s crickets from Elon, etc.
Now we have the ultimate example of wealth gravity distortion: Musk helped put Trump back in power, and the relationship looks openly transactional. But Elon is not just benefiting from Trump’s transactional nature. He is also benefiting from an administration where white collar crime and regulatory accountability seem to have basically stopped being real things.
So with all that, the kind of shady behavior Elon pulled that might normally trigger government scrutiny or enforcement is now being smothered by political influence.
Even today, Tesla appears to be the highest P/E outlier in the S&P 500 among profitable companies. So the market is not just valuing the current business. It is valuing the story, the hype, and increasingly Musk’s ability to buy influence to stay outta trouble.
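For anyone who wants the mechanics: P/E is just share price divided by trailing earnings per share, so a very high multiple means the market is paying mostly for a story about future earnings rather than the business as it exists. A minimal sketch in Python, with made-up numbers (none of these figures are Tesla's actual financials):

    def price_to_earnings(share_price: float, eps_ttm: float) -> float:
        # Price-to-earnings ratio: what the market pays per dollar of
        # trailing twelve-month earnings.
        return share_price / eps_ttm

    # A "boring" profitable company, priced near historical S&P norms:
    print(price_to_earnings(100.0, 5.0))    # 20.0x

    # A story stock: the same dollar of earnings priced ten times higher,
    # i.e. the market is mostly buying the narrative, not the business.
    print(price_to_earnings(100.0, 0.5))    # 200.0x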
And to be clear, I’m not trying to pretend nothing real was built. SpaceX is impressive. Tesla really did help kickstart the EV market. But that does not make up for the harm, distortion, and unaccountable power being exercised now.
So yes, maybe the wealth was “created” in an accounting sense. But the concern is what happens when that created value becomes an unaccountable force acting on everything else.
Wealth is not created; it's stripped from the natural commons, and it is, despite being massive, obviously finite. What you're referring to is "value creation"--the transformation of this natural wealth into a form that some other humans find valuable at a point in time. This value creation is rewarded via capitalism by the accumulation of monetary instruments, which very much represent an appropriation of this wealth. If this system weren't zero-sum, we wouldn't have inflation.
The vast majority of it was stored underground as petrochemicals. A fact made immediately apparent by looking at basically any chart. Is this a serious conversation?
In the vast majority of cases that energy could have come from other sources, though the cost would have been somewhat higher. In the hypothetical case of solar would you still describe it as being finite or stripped from the natural commons? I suppose raw land area or 1 AU solar sphere surface area could be viewed that way but it seems reductionist to me.
What if I use what would otherwise be a waste product to create something people are willing to pay for? For example sawdust. Is that not value creation?
What point about Musk are you making? Monetary success does not necessarily correlate with the common good created. In Musk's case there are a lot of government subsidies, lots of market manipulation, and false claims. So not all activities that bring profit to the richest are good for the rest, and the stats on inequality just highlight that trend.
While their activities certainly fall in the realm of capitalism, and are just blips at longer time scales, it certainly feels like capitalism has been a bit under the weather for the past couple decades.
Regarding the money invested in AI, it all gives me "irrational exuberance" vibes.
> Musk got a huge leg up through the government, whether it be tax credits, incentives, side-stepping regulations, etc.
Nope. (Any government incentives were available to all the other car companies.)
> Bezos ran at a loss for so long it drove out actual and potential competitors.
Where do you think he got the money to sustain those losses? Investors! Including me. That is not a "transfer" of wealth, as in exchange the investors received an ownership share of the company.
> Regarding the money invested in AI, it all gives me "irrational exuberance" vibes.
I was not making an argument that the economy is zero-sum, or that Musk or Bezos did not build wealth. I merely pointed out methods they used to build their empires. For example, Musk did take advantage of government incentives, sidestepped regulations, etc.
Again, I never claimed there was any sort of zero-sum transfer of wealth. I'm simply pointing out there are varied ways to build up wealth; people have various opinions about those ways. It's right to call out misconceptions or outright falsehoods, but it's also good to understand what leads people to form or accept those misconceptions in the first place.
"Profit maximization" on its own would have left most people working 12+ hours a day 6 days a week, like it was very common in the 19th century. Luckily, it's never been the only force shaping our societies.
Sure, productivity increase is hugely important, but if you only pursue profit maximization, then all the productivity increase goes into profits, which means that the general population doesn't increase their well being much if at all.
The 40hr work week didn't come by as a consequence of the profit maximization mentality, but as a consequence of hard fought battles by the workers/employees against that mentality. And when I say "hard fought" I mean in the literal sense, with at least 1,000 workers killed just in the US in those days. https://en.wikipedia.org/wiki/List_of_worker_deaths_in_Unite...
The Law of Supply and Demand is in play, and it means a company cannot dictate prices, wages, or working conditions in a free market economy. Rising productivity would have reduced the average work week regardless.
If you still aren't convinced, consider that the benefits package routinely offered to employees is worth around 40% of their pay.
Most people do a lot of work themselves that the Richie Rich would pay somebody else to do, like cooking, cleaning, childcare, gardening, etc. If it counts as work when you hire somebody to do it, it should equally count as work if you do it yourself.
People still did cooking, cleaning, childcare, and gardening in those times of 12 hour work days.
BTW, cooking in those days was an all day affair. The wood stove required continuous feeding and watching. Today one can just put the food in a microwave.
I cook a steak now and then, it's the only cooking I do. It takes about 10 minutes. The dishwasher does the cleaning.
Rich people hire others to do the cooking because the rich peoples' jobs pay off far more per hour of work. For example, if my profession pays me $100/hr, it makes perfect sense to pay someone $30/hr to do the cooking for me, as I am still $70/hr ahead.
I think it's more accurate to say it is a process that has resulted in our high standard of living faster than other processes... so far.
There is no guarantee it will keep working for the majority of us going forward; as is becoming very clear all around the world, it also has downsides, especially without checks and balances (downsides that were predicted and observed in the past, which is why other processes were conceptualized in the first place!)
As a trivial example, profit maximization is directly responsible for the enshittification we're seeing everywhere, which definitely is negatively impacting our standard of living.
Your line of reasoning misses the clear counterexample: China pulled 1.4 billion people out of poverty and created mass literacy before embracing capitalism.
I don't recall ever seeing USSR products in stores, while plenty of manufactured goods from other countries were. (By products I meant manufactured products, not extracted resources like oil.)
I remember books (there was a famous Soviet science publisher, which I believe we later learned had gulag deportees working on its printing presses), and I seem to recall toys and some foods.
My memory from the period is far from perfect, though, as I was a kid when the USSR collapsed.
I think the bean counters get a bad rap for this a bit unfairly. The past century has seen more progress in knowledge and technology than the rest of human history combined. The world and business environment are changing too rapidly to make longtermist thinking practical.
Few care if you have a lifetime warranty and excellent service or replacement parts if the majority will upgrade in a few years! Mature technologies increasingly become cheaply available as services, e.g. laundry, food, transportation. That further reduces demand on production, as many can get by with the bare minimum and don't need the highest-quality, longest-lasting appliances. Software is even more ephemeral and specialized.
Developing education and training pipelines is wasting money if the skills you need are constantly changing! There is plenty of "slack" in the workforce so this works just fine in most cases - somebody will learn what they need to get paid. There are very few fields where qualified worker shortages are a real problem.
R&D can be outsourced or bought and subsidized by the government in universities, so why do everything yourself? Open source software has even further muddied the waters. Applications have only a limited lifetime before being replicated and becoming free products (this has only been intensified by the introduction of AI), so companies develop services instead.
Technology and knowledge deepening and rapidly becoming more specialized makes the monolithic corporation much less practical, so companies also need to specialize in order to effectively compete. Going too far in the name of efficiency can destroy core competencies, but moving away from the old model was necessary and rational.
> R&D can be outsourced or bought and subsidized by the government in universities, so why do everything yourself?
Because some problems that many companies in very specialized industries work on are so special that outside of this industry, nearly all people won't even have heard about them.
Additionally, many problems companies have where research would make sense are not the kind of problems that are a good fit for universities.
Those fields still develop in-house expertise and world-leading products. General Electric was cited above, but their turbine engine division is producing the most fuel-efficient, reliable, and lowest-TCO aircraft engines there have ever been. The materials science and engineering expertise needed to do this isn't something you can find in a freshly-graduated university student.
Products like jet engines, though, are still those where quality matters. They are so costly that there's room in the finances to deliver it. Unlike household appliances, where consumers make decisions mostly on the basis of price and being $5 cheaper than the competition is what will get you the sale even if it means using plastic instead of cast or forged metal parts.
> Unlike household appliances, where consumers make decisions mostly on the basis of price and being $5 cheaper than the competition is what will get you the sale even if it means using plastic instead of cast or forged metal parts.
A part of this is that consumers usually don't have very good information about products like that. I would almost always pay twice as much for an appliance that's going to last three times as long, but I usually can't find a review that's based on a teardown and rebuild or testing to destruction.
> Unlike household appliances, where consumers make decisions mostly on the basis of price and being $5 cheaper than the competition is what will get you the sale even if it means using plastic instead of cast or forged metal parts.
Some of this seems reverse-causal to me. There were many consumers interested in options other than a race to the bottom. I certainly remember the 90s Consumer Reports era and its consumer consciousness, with people trying to find the best products even as the products all seemed to race to the bottom.
The irony seems to be that now that GE has sold GE Appliances (just because activist US shareholders wanted slightly higher dividends each quarter), the brand has been returning to higher quality and cutting fewer corners. It feels like only a matter of time before Haier finishes the next steps in the Lenovo playbook, stops paying GE to license the brand, and stops giving credit to a US company that stopped caring about consumers and consumer product quality decades ago.
> Developing education and training pipelines is wasting money if the skills you need are constantly changing! There is plenty of "slack" in the workforce so this works just fine in most cases - somebody will learn what they need to get paid. There are very few fields where qualified worker shortages are a real problem.
Here's the problem with your reasoning. This paragraph is simply wrong, with each sentence being untrue. Education and training are never wasted money, the skills aren't changing that quickly, there isn't any slack in the workforce, and qualified worker shortages are being reported in every trade across the board. Someone needs to solve the problems you hand-wave away.
> this works just fine in most cases - somebody will learn what they need to get paid.
That's me. I specialize in learning new domains. I cost like 8x more than the random junior you'd be able to hire with a functional onboarding program.
Universities don't do product-oriented research. They do more general research. And they shouldn't do product-oriented research; that's the companies' role.
And universities' research capabilities are being destroyed right now too.
> Which in turn means, in their mind, every part of the system needs to be firing at 100% all the time
Not just that, you have to be always doing less for more gains. Real work is bad work. Shrinkflation good. I don't know what that is if not a pure scammer mindset.
I believe private equity ownership represents this in an aggressive form. The "2 and 20" take (a 2% annual management fee plus 20% of profits) that PE usually mandates as part of their purchase agreements means they are highly incentivized to maximize short-term "wins" over long-term survival.
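A back-of-the-envelope sketch of that incentive, assuming the classic 2-and-20 structure (the fund sizes, exit values, and holding periods below are made up for illustration):

    def pe_take(committed: float, exit_value: float, years: float) -> float:
        # Management fee: 2% per year on committed capital, collected
        # regardless of how the underlying business actually fares.
        management = 0.02 * committed * years
        # Carried interest: 20% of any profit realized at exit.
        carry = 0.20 * max(exit_value - committed, 0.0)
        return management + carry

    # A $100M fund sold for $150M after 5 years: $10M in fees + $10M carry.
    print(pe_take(100e6, 150e6, 5))   # 20000000.0

    # The same exit after only 2 years still pays $14M, at a far higher
    # annualized return -- hence the pressure for quick "wins".
    print(pe_take(100e6, 150e6, 2))   # 14000000.0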
I think Chesterton and Taleb also had pretty reasonable things to say about understanding a system before you make changes and fragile/anti-fragile systems as well.
I’ll note that at the end of the last century I worked at IBM Research, which had a budget of 6 billion dollars. Management was trying very hard to get a better return on that investment. Even today IBM, though often ridiculed in the tech space (sometimes deservedly), spends a lot on R&D.
I've been to the Holmdel office in the decline years. It was very sad. A fraction of the former staff was rattling around in what could've been used as a post-apocalyptic sci-fi set. In its heyday it must've been magnificent. Imagine taking an entire great research university and putting it into a single architectural masterpiece. I've also been to Nokia HQ after Elop ruined the place. Also sad.
IBM employees have garnered six Nobel Prizes, seven Turing Awards, 20 inductees into the U.S. National Inventors Hall of Fame, 19 National Medals of Technology, five National Medals of Science and three Kavli Prizes. As of 2018, the company had generated more patents than any other business in each of 25 consecutive years.
> the company had generated more patents than any other business in each of 25 consecutive years.
A couple of things about those patents, from a former IBMer who filed quite a few in his time there.
First, not all patents are created equal. Most of those IBM patents are software-related, and for pretty trivial stuff.
Second, most of those patents are generated by the rank and file employees, not research scientists. The IBM patent process is a well-oiled machine but they ain't exactly patenting transistor-level breakthroughs thousands of times a year.
Why do you need to generate transistor-level breakthroughs multiple times a year? Those breakthroughs are hard to generate, but they're important and industry-spanning. The problem is we've mostly stopped generating them.
I wasn't saying anything about that, I was just pointing out that yes, IBM produces a ton of patents, but they're mostly trivial junk that regular employees generate en masse in order to earn accomplishments and make up for the insultingly low bonuses.
> they're mostly trivial junk that regular employees generate en masse in order to earn accomplishments and make up for the insultingly low bonuses
We did that at Meta and Amazon too (for polycarbonate puzzle pieces, with no monetary award at all!). Every now and then something meaningful came out of it.
I also worked (briefly, as an intern) at IBM, and its management likewise sometimes undermined the R&D that happened at the company.
I started at the tail of one research group’s mass exodus. It was like a bomb had gone off; the people left behind were trying to pick up the pieces. In essence, this group developed a sophisticated new technique, which the company urged them to commercialize. Pivoting to commercialization was a big effort, and not naturally within the expertise of this group, but they did it, largely at the expense of their own research productivity, for several years. They even hired programmers (i.e., not people who are primarily computer scientists) and got it done. But just before launch, IBM pulled the plug.
This infuriated the researchers in the group. Keep in mind that career advancement in research is largely predicated on producing new research. In effect, IBM asked people to take a time out and then punished them for agreeing to do it. The whole group was extremely demoralized. Google was the largest beneficiary of this misstep.
I also had a similar, frustrating experience working for Microsoft, so it’s not just IBM, but the same dynamics were at work: bean counters asking researchers to commercialize something and then axing a project as it becomes deliverable.
If AI replaces any role in the company of the future, please let it be the managerial class.
The thing is, Nobel Prizes and other awards don't pay the bills.
Patents do, but in most cases it's trivial patents or patents for a "mutually assured destruction" portfolio (aka, you keep them in hand should someone ever decide to sue you).
That's a fundamental problem with how the Western sphere prioritizes and funds R&D. Either it has direct and massive ROI promises (that's how most pharma R&D works), some sort of government backing (that's how we got mRNA, which pharma corps weren't initially interested in, and how we got the Internet, lasers, radar, and microwaves), or some uber-wealthy billionaire (that's how we got Tesla and SpaceX, although government aid certainly helped).
While we cut back government R&D funding in the pursuit of "austerity", China just floods the system with money. And they are winning the war.
mRNA is not a good example. If anything, it's a demonstration of why the Western capitalist model is superior to anything else. Most of the mRNA research was funded by venture capital as a high-risk high-reward investment.
In the world of government-sponsored research, mRNA likely would have been passed over in favor of funding research with more assured results.
Every year they grant prizes. If hardly anyone is doing core R&D because of cost cutting, there is a higher chance that those doing even a small amount of R&D get the prizes.
A Nobel in 2026 doesn't carry the same weight as a Nobel in 1955.
Toshiba, IBM, and Siemens had a DRAM joint development program from 1993 to 1998. Several generations of DRAM were developed there. Also, while IBM exited the DRAM business, the knowledge survived to an extent in Rambus.
They also took out all the quality, though in pure business terms one can argue that's a kind of "slack" by itself.
The beancounters have cut all the corners on physical products that they could find. Now even design and manufacturing is outsourced to the lowest bidder, a bunch of monkeys paid peanuts to do a job they're woefully unqualified for.
And the end result is just a market for lemons. Nobody trusts products to be good anymore, so they just buy the cheapest garbage.
Which, inevitably, is the stuff sold directly by Chinese manufacturers. And so the beancounters are hoist by their own petard.
We've seen it happen to small electronics and general goods.
We're seeing it happen right now to cars. Manufacturers are clinging to combustion engines and cutting corners. Why spend twice the money on a western brand when their quality is rapidly declining to meet BYD models at half the price?
---
And we're seeing it happen to software. It was already kind of happening before AI; so much of software was enshittifying rapidly. But AI is just taking a sledgehammer to quality. (Setting aside whether this is an AI problem or a "beancounters push everyone into vibecoding" problem.)
E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there. Windows is just going down in flames. People are jumping ship now.
SaaS is quickly going that way as well. If it's all garbage, why pay for it? Either stop using it or just slop something together yourself.
---
And in the background of this, something ominous: Companies can't just pivot back to higher quality after they've destroyed all their in-house knowledge. So much manufacturing knowledge is just gone; starting a new manufacturing firm in the west is a staffing nightmare. Same story with cars: China has the EV knowledge. And software's going the same way. These beancounters are all chomping at the bit to fire all their devs and replace them with teenagers in the developing world spitting out prompts. They can't move back upmarket after that's done.
Even when the knowledge still lives, when the people with the required skills have simply moved to other industries and jobs, who's going to come back? Why leave your established job for your former field, when all it takes is the manager or executive in charge being replaced by another dipshit beancounter for everyone to be laid off again?
> E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.
Desktop Linux has gotten better, though much of the improvement happened decades ago. I believe the first person to prematurely declare "the year of Linux on the desktop" was Dirk Hohndel in 1999: https://www.linux.com/news/23-years-terrible-linux-predictio...
And speaking as someone who was running desktop Linux in 1999, I remember just how bad it was. Xfce, XFree86 config files, and endless messing around with everything. The most impressive Linux video game of 2000 was Tux Racer.
But over the next 10 years, Gnome and KDE matured, X learned how to auto-detect most hardware, and more and more installs started working out of the box.
By the mid-2010s, I could go to Dell's Ubuntu Linux page and buy a Linux laptop that Just Worked, and that came with next day on-site support. I went through a couple of those machines, and they were nearly hassle free over their entire operational life. (I think one needed an afternoon of work after an Ubuntu LTS upgrade.)
The big recent improvement has been largely thanks to Valve, and especially the Steam Deck. Valve has been pushing Proton, and they're encouraging Steam Deck support. So the big change in recent years is that more and more new game releases Just Work on Linux.
Is it perfect? No. Desktop Linux is still kind of shit. For example, Chrome sometimes loses the ability to use hardware acceleration for WebGPU-style features. But I also have a Mac sitting on my desk, and that Mac also has plenty of weird interactions with Chrome, ones where audio or video just stops working. The Mac is slightly less shit, but not magically so.
So yes, Desktop Linux has "gotten better". What it hasn't done is solved any of the systemic problems.
The Open Source development quirks that created the shitshow of 1999 are still here. Gnome is better but still suffers massively from mainstream features being declared stupid by the maintainers. (A power button that turns off the machine? Heretical.)
Valve's recent successes are pretty illustrative here. They used their money to directly hijack the projects their products rely on.
As far as the comparison goes, Windows is not without this "slow" improvement either. 95 and 98 are light-years behind contemporary Windows in so many ways. Until quite recently it still made about as much sense to use Linux as it did back then; not much.
Take your Linux Laptop example. Sure, Linux finally kind of worked on some specific models that were tested for it. Meanwhile, Windows had moved from "it'll work with some mucking about with drivers" to "It works universally, on practically all hardware". Really, by the mid 2010s Windows would finally be quite tolerant of you changing the hardware.
Hence my original point; Desktop Linux hasn't really caught up with Windows in any meaningful sense. Windows is just nose-diving into the ground in the last few years.
> The Open Source development quirks that created the shitshow of 1999 are still here. Gnome is better but still suffers massively from mainstream features being declared stupid by the maintainers. (A power button that turns off the machine? Heretical.)
Gnome has been chopping off its own limbs because it reduces weight. All in the name of simplicity. I think they are not the best example of Open Source development.
KDE on the other hand had a hard fall once and basically recovered and invested long term in Plasma and that has paid off handsomely. Today, it is one desktop that I can say is closest to typical/standard desktop paradigms out of the box while retaining a high degree of flexibility for those who choose to customise it. I have been using KDE on Fedora for a while now and it has been basically solid.
> I think they are not the best example of Open Source development.
They're not. I'm using them as an example of the "bad" in Open Source development.
But it's also not so much the individual OS components that are the problem; their interactions are just as fragile, and usually neither party takes ownership of the problem.
For some reference: back in the Ubuntu 6 days, around 2006, I switched. It took me 2 weeks to get X.Org to run with my nvidia card at the time. 2 weeks of messing with config files. I only persisted because I was so sick of Windows.
> And in the background of this, something ominous: Companies can't just pivot back to higher quality after they've destroyed all their in-house knowledge. (...) They can't move back upmarket after that's done.
The knowledge isn't the problem. It can be quickly regained, and progress of science and technology often offer new paths to even better quality, which limits the need for recovering details of old process.
The actual problem is, there is no market to go up to anymore. Once everyone is used to garbage being the only thing on offer, and adjusts to cope with it, you cannot compete on quality anymore. Customers won't be able to tell whether you're honest, or just trying to charge suckers for the same garbage with a nicer finish, like every other brand that promises quality. It would take years of effort and low sales to convince the customers to start believing you're the real deal, which (as beancounters will happily tell you) you cannot afford. And even if you could, how are you going to convince people you're not going to start cutting corners again a few years down the line? In fact, how do you convince yourself? If it happened once, if it keeps happening everywhere across the whole economy, it's bound to happen to your business too.
Wrong on the first point, right on the second. Institutional knowledge can't be easily regained. To build up the knowledge to, say, make a transistor, you need a bunch of people experimenting with a bunch of things. Published scientific papers and patents will get you part of the way there, but the final stretch is still up to you, including things like which equipment to buy, the purity of supplies (and where to get them!), how long the chip needs to be bombarded by each kind of particle, how much air the cleanroom needs to move. All the tiny details. You have to discover them by trial and error. Actual chip manufacturing companies have found themselves unable to get good yield until they copied the floor plan of another working fabrication plant, and they still have no idea why that mattered, but that's an extreme case. Maybe nobody expected that minuscule air contamination from one process step was affecting another nearby process step, and in the original plan they were farther apart.
Yes if you want to wire a neighborhood for internet you can skip DSL and go straight to fiber. That's not the problem. The problem is that nobody in your company knows how deep to put the fiber to minimize problems, how much redundancy is needed, how strong the mechanical armor around the fiber needs to be, how many fibers per cable to meet future capacity needs without excessive costs, which landlords are friendly to you, nobody has the right connections to city hall to get digging permits approved expediently, and so on.
Sure, hypothetically e.g. any western car manufacturer could poach a bunch of BYD employees. But it's not really practical for most businesses.
> The actual problem is, there is no market to go up to anymore.
This is the "Market for Lemons" problem, yes.
It's less of a problem than you might think. Convincing the entire wider world that you're legitimate is a problem. One made infinitely worse by store marketplaces like Amazon preferring to push "aqekj;bgrsabhghwjbgawrjwsraG" brand garbage.
So you just don't. The trick is to start small. The smallest you can sustain. (This doesn't work for cars, or anything that's sufficiently complex. You won't be taking on Salesforce.)
But so long as you can find a market niche where there's demand for quality, you can carve out a living, and from there, scale up.
The problem with that is twofold: Venture Capital has supplanted other forms of investment and "small business generating single digit millions in revenue" is utterly unappealing to VCs, even though the investment required is downsized accordingly.
And problem #2: The cost of starting a business is too high right now. Real estate and cost of living just make it unaffordable to even try. + Healthcare if you're in the US.
I think you’re not blaming political leadership enough. NAFTA, and other programs were always going to lead to the state of affairs we have now. This was a choice. Blaming greed is like blaming gravity.
> Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.
Desktop Linux mostly works these days. It does everything most regular people would want of it, with zero fuss. Including playing games. In some respects, it's easier to use than Mac or Windows.
When it has trouble with some things, one must remember neither Mac nor Windows is perfect, and they can be extremely frustrating at times.
Engineers seem to think business people don’t know what they are doing, but if your post were true, then companies would add slack to outperform their competitors.
The broken system likely doesn’t have enough business impact to justify the investment to maintain it.
When you plan on working 3-5 years at a single company, you don't care if it crashes and burns a month after you leave; you just move on to burn down the next one.
Conversely, we see the same dynamic with engineers: they build stuff to prop up their CVs and don't care whether the company can still support the crap they built after they leave.
> companies would add slack to outperform their competitors.
I think if they did this they'd get buried by the market. Your slack is someone else's opportunity to undercut you. It's a systemic problem, it's in every individual's self interest to work towards instability.
Are stock market profit expectations mostly long term? Stock markets have been wrong before.
Besides that, the U.S. stock market went up over several decades while manufacturing capabilities were transferred overseas. That has had, and will continue to have, domestic ramifications that might not be captured by investor profits.
> Without people who have actually worked with the system, you end up with a loss of tacit knowledge—and eventually, declining productivity.
> You are spot on w.r.t every assertion you've made.
Huh? What happened to the concept of "debate" on HN? It's just a bunch of people agreeing with each other. Yet the data doesn't support OP's thesis at all.
Here's a chart of the rise in productivity per hour worked in the United States since 1947. It's a steady linear increase every single year: https://fred.stlouisfed.org/series/OPHNFB
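If you'd rather eyeball the series yourself than trust the chart, here's a minimal sketch (assuming pandas_datareader is installed; OPHNFB is the FRED series ID from the link above):

    import pandas_datareader.data as web

    # Nonfarm business sector: output per hour of all workers, quarterly.
    productivity = web.DataReader("OPHNFB", "fred", start="1947-01-01")

    # Average year-over-year growth over the whole series (quarterly data,
    # so a lag of 4 periods is one year).
    yoy = productivity["OPHNFB"].pct_change(4)
    print(yoy.mean())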
Yours is the type of story big company workers tell themselves to feel important while refusing to learn anything new and never taking any risks. But the truth is 99.999% of companies are not doing anything that unique or complex. Most companies are not ASML.
If I had a nickel for every time I've heard someone justify their do-nothing position within a giant bureaucracy while saying the phrase "institutional knowledge" I'd be rich. This is just a sign of a poorly run giant company full of engineers building esoteric and overly complex in-house solutions to already-solved problems as job security.
The truth is all of this "institutional knowledge" is worthless in the face of disruption, and it has a half life that's getting shorter every day.
Everybody talks shit about global just-in-time supply chains and specialization...but just because we had a fake toilet paper shortage for a few months during a 100-year global pandemic doesn't mean running things like it's 1947 for the last 70 years would have been better. You enjoy a much higher quality of life today due to these "evil" JIT supply chains which it turns out are far more durable than people want to claim.
Most measurements measured in dollars are just stealth measurements of inflation. Even inflation adjusted measurements, because official inflation metrics are always lowball numbers with shady methodology.
US aggregate productivity metrics fail to address this nuance. There is a fundamental difference in abstraction layers between a macro-system becoming more efficient and an individual enterprise experiencing operational failure. As a software engineer, distinguishing between these layers is critical. Your argument is akin to claiming that because the Google Play Store sees a higher volume of app releases (increased productivity), the intrinsic quality of individual apps has naturally improved.
In this analogy, the individual app represents a company, and the Play Store represents the broader US market. Silicon Valley’s highly liquid labor market allows talent to flow freely, which opens up and elevates the baseline of the overall market. However, that is entirely distinct from the fact that individual companies are suffering severe drops in internal quality and productivity.
Furthermore, in software architecture, 'productivity' and 'quality' are rarely directly proportional. With AI coding tools, we can ship an app orders of magnitude faster. Historically, it took me three months to write 60,000 lines of code; recently, I am generating that same volume in just two weeks. My productivity has undeniably spiked, but can I confidently claim the code quality is better than when I manually scrutinized every single line?
The real issue is not whether the broader economy has grown more productive since 1947. The core issue is whether a specific organization bleeds capability when the exact people who understand its real-world constraints, failure modes, and operational history walk out the door.
Both realities can co-exist: National productivity can trend upwards, while individual companies simultaneously suffer operational regressions due to botched migrations, failed refactors, or the loss of tacit knowledge.
I agree that 'institutional knowledge' is sometimes weaponized to defend unnecessary complexity. However, the opposite fallacy is treating all localized, domain-specific knowledge as worthless. While some of it is merely job-security folklore, the rest is literally the only surviving documentation of why the system functions in the first place.