Visa/Mastercard are also 100% surveilled. The nightmare of your transactions being monitored and arbitrarily blocked that's peddled as an argument against CBDCs or public financial infrastructure like Pix is already a reality for all of traditional finance (with the exception of physical cash, which happens to also be a central bank product). Sometimes even balances get confiscated; read the horror stories from Stripe customers. Americans seem to believe that the same thing is necessarily worse if it's done in the public sector. Other countries tend to see it differently. Just let them be.
I'm not American. I'm commenting on my own country.
As I said, I'm not defending credit card companies here. I'm just pointing out that there are problems with Pix. It's not the silver bullet people think it is.
And yes, it is worse when the public sector does it. You can actually use the government to fight the private sector when it screws you over. When the government does it, then it's just the system working as intended.
You cannot "use the government". I'm not aware of any countries where it would work like that. You can use the judiciary, which is independent of the government, and you can use it to go against the government the same way you can go against corporations.
My pet theory is that package managers will one day be seen the way we see object-oriented programming today: as something that was once popular but that we've since grown out of. It's also a design flaw that I see in Cargo/Rust. Having to import third-party packages with who-knows-what dependencies to do pretty much anything, from using async to parsing JSON, is a supply-chain vulnerability baked into the language's philosophy. npm is no better, but I'm mentioning Rust specifically because it's an otherwise security-conscious language.
The industry hasn't grown out of OOP. Go look at any major production codebase businesses rely on and it's full of objects and classes, including new codebases written very recently.
Package managers aren't going anywhere. Even languages that historically bet on large standard libraries have been giving up on that over time (e.g. Java's stdlib comes with XML support but not JSON).
Unfortunately, LLMs are also not cheap enough to just create whole new PL ecosystems from scratch. So we have to focus on the lowest-hanging fruit here. That means making sandboxing and containers far more accessible and easy for developers. Nobody should run "npm install" outside a sandbox.
It's a condensed statement. There was a time when I would start a new programming project thinking about class hierarchies, maybe drawing some UML diagrams. I don't do that anymore, and I don't believe it's very common for greenfield projects these days. But educate me if that's wrong. We've kept some of the good ideas from OOP, like namespaces and interfaces, and we use them in slightly different contexts now, where OOP may still be technically possible but is no longer the primary way of doing things. I believe, or at least hope, that we will see a similar kind of evolution for package managers: it will still be possible to use other people's code, but packages like left-pad or is-even will no longer be how it's commonly done, even if they remain technically possible.
I think that's normal; what universities teach as OOP is very different from what's actually done in the real world. But it was always that way. I learned OOP as a kid and UML didn't feature. Then at university it was taught in a very theoretical way. On the other hand, things like encapsulation and inheritance are still widely used.
I'm all for including more things in the Rust standard library, but anyhow and syn are literally from a core Rust dev. It's not some left-pad rando; it's like a Linux user saying they don't trust the Git developer.
I don't have an answer for what the alternative is going to look like. But people smarter than me may find something. C/C++ are doing fine without package managers. Go at least has a more capable standard library than Rust. But I'm not sure if Go's import-from-GitHub approach is the answer.
One idea I've been entertaining is to not allow transitive imports in packages. It would probably lead to far fewer and more capable packages, and a bigger standard library. Much harder to imagine a left-pad incident in such an ecosystem.
More or less the entire Debian apparatus is an organization devoted to being a C/C++ package manager, and while as an end user it's adequate for installing applications, it's still an enormous pain to use packages as libraries even with apt and friends. And once you get outside of apt, you're in an endless hellscape. People don't seem to understand that the real reason people love Rust is not memory safety (let's be honest, most people are too short-sighted to care about that); it's Cargo.
> it's still an enormous pain to use packages as libraries even with apt and friends. And once you get outside of apt, you're in an endless hellscape
I strongly doubt that, especially with tools like pkg-config that generate the right compiler flags for a package. If anything, I've seen more horrendous build scripts from people trying to be clever and support everything under the sun.
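To illustrate (a minimal sketch, assuming a Debian-style system where the zlib1g-dev package is installed; the file name demo.c is just an example): pkg-config hands the compiler everything it needs, and the program simply links against whatever the distribution ships.

    /* Build with flags supplied by pkg-config, e.g.:
     *   cc demo.c $(pkg-config --cflags --libs zlib) -o demo
     */
    #include <stdio.h>
    #include <zlib.h>

    int main(void) {
        /* zlibVersion() comes from the distro-packaged library found via pkg-config. */
        printf("linked against zlib %s\n", zlibVersion());
        return 0;
    }

Security fixes for that library then arrive through the distribution's normal update channel rather than through a per-project lockfile.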
They're not, either: every one of these projects contains a gigantic vendor/ folder full of unmaintained libraries, modified so much that keeping up with the latest changes is impossible, so they're stuck with whatever version they copied back in 2009.
You make that sound worse than it is. On the overall topic, you have 0 supply chain risk, and the whole thing is local. Also, your code from 2009 is still valid. That would be a foreign concept in some languages like Python.
You still have your supply-chain risk; it's just frozen as of 2009, and whatever you vendored back then is Swiss cheese by today's standards. Also, you'd better have the compiler suite vendored too (as you should with this strategy).
There's nothing stopping you from using Python from 2009, except why would you want to do that to yourself? But the same strategy applies; the reference Python implementation is written in C, after all.
> Go at least has a more capable standard library than Rust.
Many Golang projects I see in the wild import a number of dependencies with significant feature overlap with sections of the standard library, or are even intended as replacements for them. So it seems that having an expansive stdlib isn't sufficient to avoid deep dependency trees; it probably helps to some degree, but it's definitely not a panacea.
That's not really that surprising when you think about it. Standard-library implementations aim to work OK for as many scenarios as possible, not to be the best possible implementation for every scenario.
I think what we really need is better sandboxed languages. I'd be much happier if my compression algorithm only had an input stream and an output stream. Maybe my GUI library shouldn't have network or filesystem access. It just draws what I give it and gives me back what users press. You could still make evil software in this world, of course.
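A minimal sketch of that shape in C (a hypothetical interface; nothing in C itself enforces the restriction): the codec is handed exactly two streams by the caller and nothing else.

    #include <stdio.h>

    /* Hypothetical capability-limited codec interface: it receives exactly two
     * streams and nothing else, no paths, no sockets, no ambient authority. */
    typedef int (*codec_fn)(FILE *in, FILE *out);

    /* Toy "compressor" that just copies bytes, to keep the sketch self-contained. */
    static int copy_codec(FILE *in, FILE *out) {
        int c;
        while ((c = fgetc(in)) != EOF) {
            if (fputc(c, out) == EOF)
                return -1;
        }
        return 0;
    }

    int main(void) {
        /* The caller decides which streams the codec may touch. */
        codec_fn codec = copy_codec;
        return codec(stdin, stdout) == 0 ? 0 : 1;
    }

Of course a C library can still open files or sockets behind your back; actually enforcing an interface like this is exactly what a sandboxed language or a WASM-style runtime would add.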
The solution exists, and it's curated package repositories, as we have in Linux distributions. In C I can simply install a -dev package and use a library that gets quality control and security updates from the distribution.
The problem is that the UNIX shell model got very successful and is now also used on other platforms with poor package management, so all the language-level packaging systems were created instead. But those did not learn from the lessons of Linux distributions. Cargo is particularly bad.
> But those did not learn from the lessons of Linux distributions. Cargo is particularly bad.
I recall a decade ago listening to native app developers lamenting how web pages were inferior to native apps and gnashing their teeth at why browsers wouldn't learn the lessons of native apps. It was, and remains, a shocking lack of self-awareness to fail to understand why web pages, despite doing many things worse than native apps, managed to blow native apps out of the water when it comes to the things that actually matter to users. This is how it feels listening to the above comment; you have failed to reflect on why both programming language authors and programming language users were pushed to language-specific package managers in the first place, and you have failed to put forth any improvements to OS-level package managers that would address those underlying flaws.
TFA is literally talking about vulnerabilities in Linux packages. There are gajillions of them. Curated package repositories are not solving this problem.
I think curated package repositories solve a problem, but not all of them.
For example, I'm not sure if the world of Windows freeware ever moved past this, but very often the home page for a freeware package will look nearly identical to a page set up to deliver malware. With every package you download, you wonder "is this the legit version?". To push it further, there were multiple examples of previously trusted download sites (SourceForge and the installer debacle) that began packaging spyware or adware into downloads.
With either delivery method, you're not quite safe from supply chain attacks, but with the curated repo, you at least have a single source of packages where you can trust it 99% of the time.
It talks about "installing software". You should definitely install updates from your Linux distribution, and installing new packages from a curated repository is certainly not worse than having software already installed. Reducing the footprint is always a good idea, though. Installing software from random, uncurated sources is generally risky.
A stdlib doesn't have to provide everything under the sun in order to be helpful here.
Languages with rich standard libraries provide enough common components that it's feasible to build things using only a small handful of external dependencies. Each of those can be carefully chosen, monitored, and potentially even audited by an individual or small team.
That doesn't make the resulting software exploit-proof, of course, but it seems to me much less risky than an ecosystem where most programs pull in hundreds of dependencies, all of which receive far less scrutiny than a language's standard library.
Contrary take: I believe we will see an expanded market for capable PCs that can sanely be put in a living space, as an extension of the gaming-PC niche to local AI. Both NVIDIA and AMD are developing product lines in that direction (DGX Spark, Ryzen AI Max). And Linux will be more prominent than ever, for several independent reasons: MS dropping the ball hard on Windows, SteamOS making Linux attractive for gamers, 'digital sovereignty' as a trend, and Linux being the de facto standard for hosting AI (or anything, really).
Well, the two chips I mentioned (DGX Spark uses the GB10) are both SoCs, so no separate motherboard is needed there. I don't know if that's the full explanation, but it could be a factor.
The SoC design with unified memory is generally well suited for residential use because it's quite energy-efficient, quiet and small (compared to traditional GPU-powered gaming rigs). Great performance-per-annoyance, so to speak.
Yeah, the DGX Spark could qualify as a mini PC too. The AMD chip is sold as a laptop chip I believe, but I've mostly seen it in mini PCs, and in the Framework Desktop, a brand that probably carries a lot of trust among the kind of tinkerers who would consider buying a barebones motherboard in the first place.
I don't like it. It's inevitable, but that's no reason to cheer it on. I find it similar to Google Mail or YouTube auto-translating content without opt-in (and sometimes without opt-out). It continues a trend where you can't really trust that the content you see is the content someone else sees, or what they actually sent. It says it only changes accents; soon it'll filter swear words, and what else? The end game for the legal use of such tech is always injecting ads. And with this particular tech, we know that the legal uses will be a negligible fraction of the real uses.
> The end game for the legal use of such tech is always injecting ads
From GP
> Almost every time I get a call from TELUS about a new service or promotion
I'd hate to see accents removed in movies and e.g. YouTube review videos. But sales and customer service lost their humanity long ago. At least the call center workers will receive less bigoted hate and hard-of-hearing customers will be less confused.
Things I can do to help someone understand me are, generally, a net plus. Same for someone trying to help me understand them. But this has complicated effects, some surely unforeseen.
It's also going to be a landmine. First, you can't force ToS on support calls, although I've seen companies try. If a company has charged you erroneously, for example, you by no means have to adhere to their terms to resolve the issue. The very notion is absurd, both ethically and legally, and no recorded message saying so holds water.
My reason for mentioning this is that there are going to be weird bugs in any such system. Systems hallucinate. They misunderstand words. I can see accent removal producing different words than were spoken, and in the wrong context those different words could be a disaster. This immediately opens up liability, because it doesn't matter whether it was a computer, a human, or whoever: the company is on the hook.
It also doesn't matter if another company is providing this service; your contract is with Telus. Telus may sue that company, but you're going to go after Telus. A company could agree to all sorts of things without meaning to, or make fraudulent statements, and yes, they are liable and always have been. That also covers hate-crime-related legislation, harmful insults, snide comments, and here's the fun part...
The person on the other end doesn't even know what they're saying to the customer, not accurately anyway. This is supposed to be seamless, so they'll think that what they're saying is coming through correctly, and they'll keep talking.
Yes, humans can do all of these things. But often there's a manager walking around the room, listening, who would hear someone raising their voice, yelling at the end-user, swearing, or making inappropriate statements. That would stand out.
Yet here we have a system altering what's being heard, and no one is directly in the loop on that. No manager. No person on the floor.
Frankly, I hope this explodes in their face. Hard. I want to see them sued so hard that no other company ever tries to interfere with human conversation again. Go full AI? OK. Full human? OK. But this nonsense???
Lean isn't the proof assistant most loved by mathematicians, and it's not the one best suited to formal verification of software; it just sort of works for both. Yet it has the thing that's arguably the hardest to achieve: momentum. That's compounded by the fact that, in the AI age, the amount of publicly available human-written code directly affects how well agents can produce new code.
They do also make a lot of money selling hardware, and as things stand today that business happens to make them look like the first tech giant to actually profit from the AI boom (because the hardware they've been developing internally for years happens to be among the best consumer-grade options for self-hosting LLMs). Making their hardware more attractive to tinkerers could be a winning move right now.
Hexagonal, with shaded colors? QR codes are, by definition, square and binary, and traditionally black and white. They're also typically used for a different purpose. They could easily have made them look more like QR codes if they had wanted to, but they made their own artistic choices, which I love by the way. They could maybe have chosen better wording, though: something like 'fingerprint' or 'mugshot' would have conveyed much better the idea of it being useful, if not perfect, for identification.
They are vastly different. Incus is aimed at providing a minimal, immutable OS for hosting VMs & containers. NixOS provides a full Linux OS that is reproducible and declarative.