lmm on Dec 5, 2024 | on: AI hallucinations: Why LLMs make things up (and ho...
The LLM is always bullshitting the user. It's just that sometimes the things it talks about happen to be real and sometimes they don't.