I'm creating a new startup called QuantumFlop Electricity - there's a 10% chance it will cause a black hole to open up in the Atlantic Ocean that may eventually consume us all, but a 50% chance we'll have unlimited clean energy. We'll never know for sure whether or when that black hole might open, since it's borrowing energy from the 81st dimension, but the upside seems pretty good.
Funny, I was just rereading the Hyperion series. It clearly states that it was the AIs that created the black hole that led to the destruction of Old Earth. Intentionally.
I've never read the book but I assume there were some sycophants who were praising the AI right up until it created a black hole that destroyed the planet.
> akshually, you can never make anything 100% safe
Yes, Sherlock. And especially a natural language product that can't output the same thing twice for unchanged input.
Besides, when you say "safe" I think of the idiots at Anthropic deleting "the hell" when I pasted a string into Claude and asked "what the hell are those unprintable characters at the beginning and end"...
How many correct answers did they suppress in their quest to make their chatbot "family friendly"?
What exactly are you implying? It sounds to me like you're saying that if it's impossible to make a product safe, then there shouldn't be any safety requirements. I think a more sensible position is that if it's impossible to make a product safe, then it should be illegal to build.
Make a nondeterministic product safe how?