Hacker News | greendestiny's comments

I think it's absurd that we've jumped to the conclusion that backpropagation in neural networks should be legally treated the same as human learning.

I mean, I don't think I could find a better description for following the derivatives of error in reproducing a set of works than creating a "derivative work".


>> ... we've jumped to the conclusion backpropagation in neural networks should be legally treated the same as human learning.

I agree. However, the converse also holds: it cannot currently be denied that learning in humans differs from learning in artificial neural networks, at least when it comes to producing works that mix ideas/memes from several works read/processed. And, as the article says, copyright law talks exclusively about humans, not machines, not animals.


As I understand the article, the point about 'learning' is that if the model and its outputs are derivative works, then the copyright belongs to the human creators of the works it was trained on.

Edit: Or, perhaps put more pseudo-legally, the created works infringe on the copyrights of the original human creators.


The part I agree with is that copyright law calls out humans specifically as the potential owners of copyright. So what you suggest seems to be the only way out. Calling out humans could imply that when a human reads a thousand books and then writes something based on them which is not a substantial copy of anything explicitly read, that human owns the copyright to the text written. Whereas if an artificial neural network does the same (hypothetically writing the same text), it would not.

The above does not follow from, imply or conclude anything about learning in artificial neural networks and humans being similar or dissimilar.


Sure, as a word it can be broad; as a concept in our legal system it should be much more nuanced.

The relevant extension of your analogy is: should birds be required to obey FAA rules? Or should plane factories be protected as nesting sites?


It's a relevant extension if you think the ability to learn from a work is a right people have that exempts them from the more general lockdown copyright would impose.

If you come at it from the view of copyright being a limited set of control over some areas but not others, then if copyright doesn't block human learning it shouldn't affect anything similar either, unless a specific rule is added to make those situations be handled differently.



You can get ChatGPT to generate PlantUML.


Disclosing vulnerabilities.


They should definitely disclose vulnerabilities in American-made software for SIGSEC. If it's foreign, then not disclosing would be SIGINT.


If the software is foreign-made but used in the US, Americans are still vulnerable. And open source isn't American/national.


The problem with that is there's a lot of software that's American made that is also used by potential targets of SIGINT.


I think the theory is suggesting that the photon was emitted with less energy.


I think the question is: if the currently accepted theory for the redshift is correct, then where has the energy gone?
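A rough numeric sketch of the standard picture may help here: cosmological redshift stretches a photon's wavelength by a factor of (1 + z), and since E = hc/λ, the observed energy drops by that same factor. The specific values below (a 500 nm photon, z = 1) are hypothetical, chosen only for illustration.

```python
PLANCK = 6.62607015e-34    # Planck constant, J*s
LIGHT_SPEED = 299_792_458  # speed of light, m/s

def photon_energy(wavelength_m: float) -> float:
    """Energy of a photon with the given wavelength, in joules (E = h*c/lambda)."""
    return PLANCK * LIGHT_SPEED / wavelength_m

z = 1.0  # hypothetical redshift
emitted = photon_energy(500e-9)             # energy at emission
observed = photon_energy(500e-9 * (1 + z))  # wavelength stretched by (1 + z)

print(observed / emitted)  # 0.5 -- half the emitted energy arrives
```

So at z = 1, half the photon's energy is simply gone as far as the observer is concerned, which is exactly what motivates the "where did it go" question.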


Would the star then appear to be a different color to "someone" its own solar system than it does us? Does OUR star thus appear to be moving TOWARD them since it's "our" color vs what "they" would consider to be the correct color for a star our size?


Doesn’t this require the photon to be emitted differently depending on where it ends up, as it would experience different amounts of redshift depending on the length of flight?

That’s some serious retro-causality if a photon from 13B ly hits us instead of Andromeda.


You and me both. :(


Actually, despite your skepticism, Tesla's PR spin has already beaten you.

"but autopilot still lowers the death rate on average"

That's not what they said; they said the death rate was lower than the average. And yet you can't help hearing that it lowered the death rate. I think it's very likely that turning on autopilot massively increases the rate of death for Tesla drivers, but they've managed to deflect from that so skilfully.


I don't get it. These two are the same thing.


The comment above you supposes that people who drive Teslas have fewer accidents on average, even without autopilot. Saying that autopilot "lowers the average" would mean autopilot lowers the number of accidents for Tesla drivers, while "lower than average" could mean that while a Tesla with autopilot is safer than the average car, it is less safe than a Tesla without autopilot. Pretty complicated.
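The distinction can be made concrete with a toy example. All of the numbers below are invented for illustration; they show only that "lower than the fleet average" is compatible with "worse than Teslas without autopilot".

```python
# Hypothetical death rates per billion miles (all numbers invented).
deaths_per_billion_miles = {
    "all cars (fleet average)": 12.0,  # hypothetical baseline
    "tesla, autopilot off": 4.0,       # hypothetical: newer cars, highway driving
    "tesla, autopilot on": 8.0,        # hypothetical
}

autopilot = deaths_per_billion_miles["tesla, autopilot on"]
fleet = deaths_per_billion_miles["all cars (fleet average)"]
no_autopilot = deaths_per_billion_miles["tesla, autopilot off"]

print(autopilot < fleet)         # True  -> "lower than the average"
print(autopilot < no_autopilot)  # False -> yet it does NOT lower Tesla's own rate
```

With these made-up figures, both statements in the thread hold at once: the PR claim is technically true while autopilot doubles the rate relative to the same cars without it.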


Hah!


I don't think it's going to emerge without significant effort to make it happen. I think most of the 'intelligence' we desire will be attainable without sentience. Sentience itself will require a lot of specific research directed at the goal. It's certainly a risk though.


Right there with you on this; I love React's rendering model. Redux seems to get all the hype, but I'm beginning to suspect it might be mostly the wrong way to do things. It reminds me of TDD: most of the benefits seem to be moral in character rather than concrete.

