
If we are talking about jobs (quantity), maybe to some extent. But if we want to be honest, it's a qualitative (human-judgment) question. And even if a job seems totally AI-ready on paper, it might have invisible side effects.

(Thought experiment: do I want an AI robot to perform surgery on me if it only has a 2% chance of hallucinating? My answer is no, bring the surgeon.)



I wonder if we will see some perverse incentives emerge to make the AI seem even better. For example, say a well-rested, stress-free surgeon can have a 1% error rate. Well, let's make the job harder then: fatigue the surgeons, lay many of them off (or just don't rehire as they leave), and spread the remainder thin. Make them hit a 3% error rate. Then fire the lot, because it would be malpractice not to.


If that's the dystopia we would live in, I'd imagine an alternate healthcare/legal system would emerge. Also, personally I'm far more forgiving of human error than of machine error.


For the Musk class, maybe. But for you and me?


> if it only has 2% chance of hallucinating

I want people to have jobs.

Setting that aside, it depends on the error rate of human surgeons, right?



