
I know someone who used ChatGPT to diagnose themselves with a rare and specific disease. They paid out of pocket for some expensive and intrusive diagnostics that their doctor didn't want to perform, and it turned out, surprise, that they didn't have this disease. Nonetheless, this person's faith in ChatGPT remains just as high.

I'm constantly amazed at the attitude that doctors are useless and that their multiple years of medical school and practical experience amounts to little more than a Google search. Or as someone put it, "just because a doctor messed up once it doesn't mean that you are the doctor now".



I have a family member with an uncommon (1/1000) genetic condition. The only doctor they have ever been to that didn’t google it in the exam room with us was the PI of a study on the condition.

The best part is they always immediately start badly explaining it to us like we’ve never heard of it either.

That, combined with having our concerns repeatedly dismissed before we secured a diagnosis, has sincerely changed my view of Dr. Google.


They're not useless, but they're also human, with limited time and a limited amount of input.

To me it's crazy that doctors rarely ask me if I'm taking any medications for example, since meds can have some pretty serious side effects. ChatGPT Health reportedly connects to Apple Health and reads the medications you're on; to me that's huge.


> To me it's crazy that doctors rarely ask me if I'm taking any medications for example, since meds can have some pretty serious side effects.

This sounds very strange to me. Every medical appointment I've ever been to has required me to fill out an intake form where I list medications I'm taking.


Understanding drug interactions is the job of pharmacists (who are also doctors…of pharmacy). Instead of asking Apple Health or ChatGPT about your meds, please try talking to your pharmacist.


Pharmacists are the last person I see on the way out. They do ask if I have any allergies, but by that time the doctor has already washed his hands.


Doctors are wrong all the time as well. There are quite a few studies on this.

I would in no way trust a doctor over ChatGPT at this point. At least with ChatGPT I can ask it to cite the sources behind its conclusions, and then I can verify them. I can't do that with a doctor; it's all "trust me bro".


You can sue doctors for malpractice.


Money from a lawsuit is nice but I'd rather get better than have the money.



