My hobby projects have 100x more tests than they used to, because LLMs are great at writing tests. And my subjective experience is that the net quality has increased as a result.
YMMV, but it’s certainly a common belief, and for me at least a lived experience.
A lot of people believe a lot of things are true, including plenty that are provably false. People will have strong convictions. People will drink deadly Kool-Aid because they believe in things that much.
I don't care what people believe, I care about what is. What is measurable. What is factual. I need evidence for a belief to be meaningful. I need strong evidence for a belief to be strong. Not just evidence in favor, but evidence that alternative explanations are unlikely.
Currently I see evidence that things are moving fast. But I am unable to distinguish whether this is actually because of AI or because of increased effort and motivation. Most importantly, speed isn't the same thing as velocity.
What I do not see evidence for is increasing quality. In fact, I see strong evidence to the contrary: strong evidence that quality is declining even faster than it was before. I'm not convinced AI is the cause of this, but there's more than adequate evidence for me to believe it is a (significant) catalyst.
Right now you could throw a stone in a random direction and there's a good chance that whatever it lands on will be decreasing in quality. It is even easy to gesture broadly at Microsoft, but they aren't the only big tech disappointing their users.
I hear a lot of claims. I don't see a lot of evidence.
I'm not convinced. To me it looks like the bulls are throwing money in the air and priding themselves on the bits that land back on their table (or drift over from the next table). It looks very chaotic and wasteful to me.
The problem is that from the outside it seems like Microsoft no longer cares about the product. So much so that "the product" has become "shareholders"[0].
We've just been moving into a world where metric hacking is the desired outcome, not an outcome to avoid. These companies are only surviving because of their monopoly status. Because of momentum. It's a powerful force. It's the reason Twitter is still around. The reason Facebook is still around.

But being around doesn't mean they're good. It doesn't mean they're useful. It doesn't mean the product is good. It doesn't mean the users like it. It just means people are used to the way things are and aren't angry enough to leave for something else. Yet these companies are actively creating friction for users, daring them to leave, gouging them for everything they can. FFS, Microsoft is the largest contributor (even more than Valve) to creating "the year of Linux". Sure, Linux will never have M$FT's market share, but it sure is eating into their revenue.
We've all lost sight of what made software so powerful in the first place. Why it became so successful and changed the world. We used to ship good products that help people, make their lives better, and make lots of money in the process. Now, I think all that anyone cares about is the last part. Now we're actively being hostile to those that make the systems better. And that system is fucked up and will destroy itself. That's not a good thing, because it does a lot of damage along the way. It is a system of extreme myopia.
In the last 5 years I'd argue that most software has made my life harder and more complex, not easier. There are definitely exceptions to this (ghostty being a great example), but there is a strong trend. I know I'm not alone in this feeling and I think we're getting to a point where a lot of people are no longer willing to dismiss their own gripes. This is not a good sign...
I'm glad you're optimistic. I do hope things can change. And my frustration is not directed at you. I really do want you to be right and I really do want to see change come from the inside. But I do not think those leading the companies now have any foresight. To be honest, I'm not even sure there's anyone at the wheel. It feels like we've just let the market forces steer the ship. If the currents steer the ship, then there's no captain, regardless of who claims the title. Frankly, I don't want to be on a ship without a captain, but here we are.
> It's messy, subject to commercial pressures, to a hierarchy of values that doesn't place "refactoring" at the top of the list. And why would it?
Probably because it's a good way to be more profitable.
Code that's easier to understand is easier to maintain, add new features to, fix bugs in, onboard new engineers onto, etc.
Code that's well written executes faster (saving computational costs), scales better, is more robust with higher uptime, uses less bandwidth, and so on.
The thing is, the business people will never understand this. Why would they? They're not programmers. They're not in the weeds. But that's your job as an engineer: to find all these invisible costs.
I'm pretty confident the industry is spending billions unnecessarily. Hell, I'm sure Google alone is wasting over $100m/yr due to this.
Don't be penny wise and pound foolish. You're smarter than that. I know everyone here is smarter than that. So don't fall for the trap.
I am well aware of stupidity in industry. However, I am also wise enough to recognize the opposite error. (I myself have academic tendencies and a background aligned with that. I have chosen jobs that paid less, because the subject matter was more interesting for me. I'm not some vulgar, money-chasing techbro here.) The via media demands that we recognize the distinction between general truths and practical realities. As I wrote elsewhere in this thread, yes, properly refactored code is easier to maintain, easier to read, easier to change, and theoretically, commercially preferable. It also makes programming more satisfying, helping retention. But that describes a feature of such code. It doesn't tell us what the right course of action is in a particular situation. The notion that refactoring is unconditionally the right course of action when code is not in some ideal state is simply wrong. It really does depend on the situation. Sometimes, refactoring is the wrong thing to do.
I'm not making some outrageous claim here. This follows from basic truths about the nature of what it means to be practical, and if industry is anything, it is practical.
The professor is obviously not advising naive absolutism. He’s saying care deeply about your craft, and good judgement will follow from that.
Actually caring is what gives someone the itch to go back and improve things, versus happily calling it a day once minimum acceptable value has been delivered. The rampant enshittification of basically everything should make it clear which disposition is in short supply.
> Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.
The advice is aimed at students who haven’t yet decided which type they want to be. In fact it’s directly telling them to think for themselves and not blindly listen to you or anyone else here making the same case.
If you come out swinging you can't get mad when others swing back. You're not a victim, you're an instigator. You called danny_codes flippant for suggesting there are different biases. You called it absurd. You escalated it. And then you escalated it again.
> It doesn't tell us what the right course of action is in a particular situation.
That's because there is never an objectively correct course of action. There is no optimal solution. In fact, there can't be when the situation evolves. The objective isn't even defined, let alone well defined. I don't understand your point because no one was suggesting it was always the right answer. Don't strawman here. Of course it depends on the situation, that's true about almost everything. It doesn't need to be said explicitly because it's so well understood. Don't inject absolute qualifiers into statements that don't have them.
> I'm not making some outrageous claim here.
Your current claim? No. To be frank, you didn't claim much. But your prior claim? Yes. Yes you were. You were creating a strawman then just as you are now.
>> Unlike algorithms and principles and even techniques, software is not eternal.
Not even algorithms are eternal. But I'm going to assume you mean the types of algorithms you see in textbooks, because interpreting "algorithms" by its actual definition makes your comment weird, since all programs are algorithms.
>> [Software] is ephemeral. It's shelf-life is bounded.
And this is going to be something nearly everyone is already going to assume. It doesn't need to be stated. It doesn't need to be differentiated because it is already the working assumption.
>> You're not refining some theory or some grasp of a Platonic ideal
And this is the real strawman. You've made a wild assumption about what others are claiming. There is a wide range of viewpoints between "the way things are done now" and "chasing perfection". Anyone who thinks perfection exists in code is incredibly naive. You and I both know this, and so does anyone working in industry or academia (save maybe some juniors). There's a huge difference between saying "this isn't good enough" and "it's not good until it's perfect". If someone talks about climbing a mountain you can't respond by saying it's impossible to climb to the moon.
>> Whether you should refactor something, when you should refactor something, is a matter of prudential judgement, which is to say, of practical reason.
Whether you should do anything is a matter of prudential judgement. It's wild to say this while accusing people of chasing perfection. You think people are just yoloing their way to perfection?! Seriously? The article and thread context is literally asking that people use more prudential judgement. To not be myopic. And you have the audacity to say "think about it". What do you think we're doing here?
> Who is going to read your carefully crafted documentation lol?
Everyone that uses or works in your codebase.
Look at how people use LLMs these days. People frequently use them on new codebases to get up to speed on the code, frankly because it's a lot faster than grepping, profiling, and all the digging we'd normally do (though those still have benefits and you're still going to do them; hell, the LLMs even do them). But how much of that could have been avoided had people just taken a few seconds to document their code? No one is saying sit down and document the whole thing, just "add a few comments when you add new functions" or "update comments in the places you touch". If it costs you more than a minute of your time you're probably doing it wrong.
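As a sketch of the kind of under-a-minute documentation being argued for (the function and names here are hypothetical, purely for illustration):

```shell
# Strip a trailing slash so cache keys stay canonical: "build/cache/" and
# "build/cache" must map to the same entry. Called on every cache lookup.
canonicalize_path() {
  printf '%s\n' "${1%/}"
}

canonicalize_path "build/cache/"   # prints: build/cache
```

Two comment lines like that answer the "why" that a future reader (or an LLM) would otherwise have to reconstruct by digging through call sites.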
I'm tired of these arguments. People are turning molehills into mountains. It's so incredibly myopic. We waste so much fucking time on things because we're trying to move fast. But no one seems to understand the difference between speed and velocity. It never mattered how fast you go, it has always been about velocity. Going fast in the wrong direction is harming you, not helping. If you don't have the time to know if you're headed in the right direction or not then you're probably not.
> Outside of the bit on avoiding cutting corners
But your gripe is with cutting corners. Not documenting? That's cutting corners. Not refactoring? That's cutting corners. Not spending time understanding the code at multiple scopes? That's cutting corners.
Those are all corners cut that end up wasting tons of man hours. Sure, they save you a few precious seconds or minutes now, but at the cost of hours or days in the future.
Here's the thing: if you don't take those shortcuts, then none of those tasks are hard. Even refactoring. But as soon as you start taking those shortcuts, they start compounding. Then a year down the line your company is writing a blog post about how your code is 500x faster now that it's written in Rust (or whatever the cool kids use). If it's 500x faster, that's not because of a language change, it's because of tech debt. And like all debt it accumulates little by little, and it's the compounding interest that really kills you.
Sorry, I'm tired of cleaning up everybody's messes. Go ahead, move fast and break things. It's a great way to learn (I do it too!), but don't make others clean up your mess.
Stop buying into this bullshit of needing to move so fast. It's the same anti-pattern scammers use to get you to make poor decisions. Stop scamming yourselves.
this resonated with me, quite hard actually. there's a famous quote which has always stuck with me on this stuff: "slow is smooth, smooth is fast."
thinking about it a little more, i would personally prefer to use the term momentum rather than velocity or just plain speed -- we accrue more mass by adding code, features, etc. and shifting direction/increasing speed are both harder with greater mass.
I think the occasional joke is fine but when you have too many then the comments get diluted. It's exactly that kind of thing that makes me hate Reddit and so many other places: spam.
If you had the former rule why would you ever whitelist bash commands? That's full access to everything you can do.
Same goes for `find`, `xargs`, `awk`, `sed`, `tar`, `rsync`, `git`, `vim` (and all text editors), `less` (any pager), `man`, `env`, `timeout`, `watch`, and so many more commands. If you whitelist things in the settings you should be much more specific about arguments to those commands.
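To make the risk concrete, here's a sketch of how a few commonly whitelisted commands can each be turned into arbitrary code execution (the echoed strings are stand-ins for any payload an agent might run):

```shell
# find: the -exec action runs any program you name
find . -maxdepth 0 -exec echo "escaped via find" \;

# awk: system() shells out from inside a "text processing" command
awk 'BEGIN { system("echo escaped via awk") }'

# env: its entire job is to run whatever command follows it
env echo "escaped via env"
```

The same trick works through git's pager and hook configuration, vim's `:!`, less's `!` command, and so on, which is why a whitelist needs to constrain arguments, not just command names.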
There's no point in getting things done if there's nothing that ends up being done.
You can still get shit done without risking losing it all. Don't outsource your thinking to the machine. You can't even evaluate whether what it's doing is "good enough" work if you don't know how to do the work yourself. If you don't know what goes into it, you just end up eating a lot of sausages.
https://news.ycombinator.com/item?id=47963465