Hacker News

So, I found this from back in 2011: http://www.tomshardware.com/news/ibm-patent-gpu-accelerated-... However, I couldn't find any commercial or even active open-source projects on this topic. It seems like something that would be valuable to businesses working with big data, so what's the holdup? Has nobody reached this scale yet? Is it still too expensive? I don't get it. Maybe I'm overthinking it.
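For context on why this seems like such a natural fit: the hot path of most big-data queries is an embarrassingly parallel scan, which is exactly the workload GPUs are built for. Here's a rough CPU-only sketch in plain Python (function and data names are made up, purely to show the shape of the work a GPU-accelerated database would offload):

```python
# Illustrative sketch (not a real GPU database): the hot loop in an
# analytics query is typically "apply a predicate to every row, then
# aggregate". Every iteration is independent, which is why it maps
# naturally onto thousands of GPU threads (one row per thread in CUDA).

def filtered_sum(prices, quantities, min_price):
    # On a GPU this loop disappears: each (price, qty) pair is handled
    # by its own thread, and the final sum is a parallel reduction.
    total = 0.0
    for price, qty in zip(prices, quantities):
        if price >= min_price:          # data-parallel predicate check
            total += price * qty        # per-row work, then reduce
    return total

prices = [5.0, 12.5, 30.0, 7.5]
quantities = [10, 4, 2, 8]
print(filtered_sum(prices, quantities, 10.0))  # 12.5*4 + 30.0*2 = 110.0
```

The catch, and possibly the answer to the question, is that the table has to fit in (or be streamed through) GPU memory, and PCIe transfer costs can eat the speedup for scan-once workloads.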



I know Folding@home was taking big advantage of CUDA-enabled Nvidia GPUs.



