
Yeah, definitely, especially for more complicated things like holding a conversation. State-of-the-art models will probably always need a server.

For simpler things, small models can definitely handle them: transcription, object detection, simple classification tasks. I expect more and more to fall under the category of “things which ML can do on $X of hardware” as hardware and software get better.
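To give a sense of just how small "simple classification" can get, here is a minimal sketch (my own illustration, not from the comment): a perceptron in pure Python whose entire model is two weights and a bias. Inference is a handful of multiply-adds, which is why tasks like this fit comfortably on cheap embedded hardware. The data and function names are made up for the example.

```python
def train_perceptron(samples, labels, epochs=200, lr=0.1):
    """Train weights [w0, w1] and bias b on 2-D inputs with labels in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x0, x1), y in zip(samples, labels):
            pred = 1 if w[0] * x0 + w[1] * x1 + b > 0 else 0
            err = y - pred  # -1, 0, or +1; no update when the prediction is right
            w[0] += lr * err * x0
            w[1] += lr * err * x1
            b += lr * err
    return w, b

def predict(w, b, x):
    # Inference is just two multiplies, two adds, and a comparison.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Toy, linearly separable data: class 1 roughly when x0 + x1 > 1.5.
data = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.9, 0.9), (0.1, 0.2)]
labels = [0, 0, 0, 1, 1, 0]
w, b = train_perceptron(data, labels)
```

The trained "model" here is three floats; the same scale argument is why transcription and object detection with suitably small networks keep migrating onto commodity devices.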


