Hacker News

> It doesn't "learn" like a person learns

Who cares? Why should you expect that it would? These are systems moulded after data, so 'learn' seemed a decent label. If it is not, it is because "learning" is _active_, by philological analysis - which is happily consistent with the aim of AGI (intelligent entities learn actively).

A computer does not compute like a human would. Yet, no problem.

For that matter, you are using 'person' loosely - not even in a "personal" sense: a "person" learns according to individual nature, while you are using the word as a collective term.

As already expressed - nearby I wrote 'biomimicry' - what you are calling "anthropomorphizing" runs in the wrong direction: "evolutionary algorithms" were born out of cues taken from observing the natural world, and the terms express that. It is not that you saw the algorithm and went "It looks like my uncle Oscar"¹ (this side is active - it "learns").

(¹Those anthropomorphizing Hollywood cultists and all that sculpture...)



> Who cares?

> a "person" learns according to individual nature, while you are using it as a collective term

There's a very specific definition for learn:

>> to gain knowledge or understanding of or skill in by study, instruction, or experience[0]

There are a few more, but none of the definitions treat learn in a non-collective way. I guess Merriam-Webster's dictionary doesn't like treating people as individuals or something lol.

Additionally, all the definitions there speak in human contexts. They talk about learning in the sense of being taught, or gaining experience, or gaining knowledge. Sure, a computer kind of does this stuff, but it doesn't really. And that falls into the category of attributing human characteristics to an inanimate object.

I probably shouldn't have said that everything in the short list I wrote reeked of anthropomorphizing processes. But the evolutionary algorithm was more in line with what I mentioned immediately before. My whole comment read:

> I feel like the whole industry is inundated with aphorisms that are kind of true, but not wholly true. Evolutionary algorithms, neural networks, deep learning, deep mind, this stuff all reeks of anthropomorphizing fundamentally mathematical processes.

An evolutionary algorithm definitely falls into the category of kind of true but not wholly true. But it's not anthropomorphic.
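For concreteness, here is a minimal evolutionary algorithm (my own illustrative sketch, not from the thread): just selection plus mutation over candidate bit strings. The borrowing from biology is loose - "kind of true" - but it points at nature, not at a person.

```python
import random

def evolve(fitness, length=8, pop_size=20, generations=100):
    """Minimal evolutionary algorithm: keep the fitter half, mutate one bit per child."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half of the population survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Mutation: each survivor produces one child with a single bit flipped.
        children = []
        for parent in survivors:
            child = parent[:]
            i = random.randrange(length)
            child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# "OneMax" toy problem: fitness is simply the number of 1 bits.
best = evolve(fitness=sum)
print(best)
```

No uncle Oscar anywhere in there - only a (very compressed) caricature of variation and selection.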

> intelligent entities learn actively

Also, this is a very loaded statement. What is an intelligent entity? If you Google "is a computer intelligent" there are various papers, articles, and other pieces of media all claiming that we can't call a computer intelligent, and some claiming that we can consider certain algorithms somewhat intelligent. This is anything but an accepted standard today.

[0]: https://www.merriam-webster.com/dictionary/learn


> kind of true but not wholly true

Give us an example of some relevant label that would be "«wholly true»" instead of "«just kind of true»". Because metaphors - and language as a whole - are based on fuzzy pattern relations.

> none of the definitions treat learn in a non-collective way

You have misunderstood my post. I would prefer that you read it again.

You are complaining about loose use of the language: I noted that you yourself used the term 'person' more than loosely, with a dubious jump. When one says '«like a person learns»', that should mean "like a specific individual, with his own individual characteristics, learns" - instead you used it to mean "like people in general learn". A "person" is a "definite form", not a generic individual representing common features - it is the opposite.

> very loaded statement

Which you are taking out of context. I said that you have to call the moulding of your functions something, and that "learn" seems a very acceptable term, since the process is bottom-up instead of top-down, automated instead of encoded: the function is developed against data - it "learns". And if the term is disliked, there could be a very good reason: 'learn' was born as a sort of hunting term¹ - it really means something like "investigate" - which is a happy coincidence, because what is largely missing in AI is critical thinking, part of the active process of learning ("learning" is active, as investigation is). And the day John has to check «accepted standard[s]» to see how things are, I will be willing to comply with his sad request for mercy.²

¹Regardless of what the Merriam-Webster writes: from a "dictionary of use" you get a none-the-wiser relative notion, not knowledge - just as at the entry for 'life' you will not find the meaning of life.

²John must be, tautologically, an "active learner". (He will check personally.)
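The "moulded against data" point above can be shown in a toy sketch (my own example, assuming nothing beyond the thread's description of bottom-up fitting): nothing about the target rule is encoded top-down; a parameter is adjusted against observations by gradient descent until the data has shaped it.

```python
# Toy "learning": a parameter moulded bottom-up against data, not encoded by hand.
# Fits y = w * x by gradient descent on mean squared error.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # underlying rule: y = 2x

w = 0.0  # the rule is nowhere written into the program
for _ in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.05 * grad  # the data moulds the parameter

print(round(w, 3))  # converges near 2.0
```

Whether this mechanical moulding deserves the active verb "learn" is exactly what the two sides above dispute; the sketch only shows the bottom-up direction of the process.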



