Hacker News | new | past | comments | ask | show | jobs | submit | oliwarner's comments

Excuse me.

That's all it means. Excuse my impertinence, presence, gall, mere existence; I need your attention. Many languages have this overloaded phrase and use it just as a Brit would "sorry". It's formal deference. It's polite.

And it's not like we don't also shout "Oi!" when we need to determine whether or not some brigand possesses a licence for whatever it is they're doing.


PPAs are still timing out for me in apt update.

Ditto, launchpad.net is also extremely flaky.

And now the GB mirror is down. I know DDoSes are hard to handle, but they're not covering themselves in glory.

In raw grunt, it's equivalent to a PC with a ~3060Ti/6700XT.

You can get a refurb PS5 for £250. Good luck building the equivalent PC for that.


You could get a board built on the PS5 APU (BC-250) for cheaper than that.

£150 plus taxes. By the time you add a PSU, a case, and a good controller, it's pretty close.

Is that really true? If so, is there a saner way to handle this than upgrading all the things to 10GbE? Like a PoE ethernet condom that interfaces with both network and devices at native max speeds, without the core network having to degrade?


> Is that really true?

It's not; cf. the sibling posts. The GP probably learned networking in the '80s–'90s when it was true, but those times are long gone.

(unless you're talking wifi.)


This feels like a compelling reason to joke around more.

If inaccuracies make it to your patient record, it's defamatory. Your doctor must sign off on the transcript and if they're letting through poor results, make it their problem to fix. That'll either force the tech to get better or to fall back on better note taking practices.


Be warned though that life and disability insurance will absolutely use errors in your medical records to refuse your coverage or claims.


And that's what makes it actionable defamation. If your doctor signs off on an AI summary that accuses you of being a drug-dependent sex worker, that's serious malpractice.


How do we make those markets more competitive?


Yeah my parents thought it was funny and I was like... yeah not actually. You need to get that fixed.


Might be immature but personally once I knew this was possible I'd go for the high score. Try to get every substance I can think of listed plus a supposed admission of murder and whatever other ridiculous stuff I can come up with.

"Well you know me doc, I keep my drugs in the deep freezer with the bodies waiting for disposal so I'm quite confident in their shelf life." I wonder what an AI scribe would make of such a remark.


Initially nothing, but then two weeks later you'll start getting more push ads for high end chest freezers.


Your username is uncanny for this comment. Well played.


Only if you enjoy lawsuits, though.


Notes need writing though.

You can do that by recording and transcribing (many methods) or your doctor has to write on the fly, or worse, has their head in their computer while you talk in their general direction.

Letting doctors talk and examine and not write is a wholly better experience.

Offsite third parties are the problem here. If this was done automatically without data leaving the room, is there a problem? Do you have the same objections to how your digital notes are stored?


We agree on the desired outcome, but couldn't we also give doctors more time to do that job without AI? Feels like the blame is in the wrong place.


Maybe it's a regional thing, but in my last 3 appointments, 2 had an assistant doing the note-taking (as prompted by the treating physician or PA). The third was a virtual appointment, so no idea what notes were taken, if any.


Sounds cushy, but not everywhere can afford 2:1 healthcare for every primary contact. It's not a thing here until you get to a ward or hospital-based clinic and you're seeing a team.

I don't like off-site data vacuums. Palantir can get fucked. But good ML transcription tools don't have to be run off-site. Even to get you 90%, or serve as a backup. And as I've said in other threads here, it's hard to be angry about consented audio recording and AI transcription when my entire medical history is floating around in a database that could be hacked, or its data deliberately passed through (eg) a Palantir tool. I think audio of me complaining about lower back pain is the very least of our worries.

Personally, I'd prefer AI and better doctor availability. To have that admin time back as consultation time, or more appointments, or just less overworked doctor.

But also, there have to be weapons-grade consequences for people who leak patient data: loss of registration, a permanent ban from working with sensitive data, and jail.


If there is, couldn't they exist in any model?

I don't mean that flippantly. These things are dumped in the wild, used on common (largely) open source execution chains. If you find a software exploit, it's going to affect your population too.

Wet exploits are a bit harder to track. I'd assume there are plenty of biases based on training material but who knows if these models have a MKUltra training programme integrated into them?


Many memberships (like phone and internet service contracts) aren't rolling contracts; they're annual or biennial to lock you in. Of course you can cancel at the end of the term, but many people start strong, let it peter off, and by the time they realise they're paying for a service they're not using, they're well out of any cooling-off period.


Because there's money in letting people make mistakes on their own dime.

For every story about a mistake in the tens of thousands, I wonder how many there are in the single thousands, hundreds, and even tens where people just suck it up and pay the bill. That's what happens when you provide such a limited billing support surface, employ a thousand lawyers in-house, and hold as many cards as Google does (losing my G account would ruin my year).

Multinational service providers need better regulation to ensure consumers' rights are protected. In the UK utilities and banks have to absorb (or insure) against some leaks, theft and external fraud. If Google (and others!) were subjected to similar regulation, problems like this would evaporate overnight.


... And most streams are fraudulent.

I'm not sure I'd care if AI generated music was competing against my own organic music, but having the stream-reward diluted down by bots is actually hurting artists.

