Hacker News
dpweb
3 months ago
on:
On the Existence, Impact, and Origin of Hallucinat...
We don't understand the brain. We fully understand what LLMs are doing; humans built them. The idea that we don't understand what LLMs are doing is magical. Magical is good for clicks and fundraising.
allears
3 months ago
We know how we built the machines, but their complexity produces emergent behavior that we don't completely understand.
emp17344
3 months ago
This isn’t settled science, by the way. There’s evidence that the “emergent” behaviors are just a mirage of how we measure them.