
... which happens to be the exact opposite of what you want if you’re into deep learning.

Sigh.



Can one productively use a video card in a laptop, or even in a high-end desktop, for machine learning? At work we use a special workstation-type box with 4 NVIDIA cards. It is not connected to any display, and we do not run any remote graphical sessions on it that could otherwise have used the graphics cards. Still, it is slow at learning. If not for the size of our datasets (video in lossless compression), we would use cloud solutions.
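One thing worth checking before blaming the hardware is whether the GPUs are actually busy during training. A minimal sketch (plain C against NVML; the ten-second sampling loop is just an illustration, not anything from this thread) that polls utilization while a job runs:

    // Hedged diagnostic sketch: poll GPU utilization with NVML.
    // Compile with: gcc util.c -o util -lnvidia-ml
    #include <stdio.h>
    #include <unistd.h>
    #include <nvml.h>

    int main(void) {
        nvmlInit();
        unsigned int count = 0;
        nvmlDeviceGetCount(&count);
        for (int t = 0; t < 10; ++t) {      // sample roughly once a second
            for (unsigned int i = 0; i < count; ++i) {
                nvmlDevice_t dev;
                nvmlDeviceGetHandleByIndex(i, &dev);
                nvmlUtilization_t u;
                nvmlDeviceGetUtilizationRates(dev, &u);
                printf("gpu %u: compute %u%%  memory %u%%\n",
                       i, u.gpu, u.memory);
            }
            sleep(1);
        }
        nvmlShutdown();
        return 0;
    }

If compute utilization stays near zero on all four cards, the bottleneck is likely feeding them (decoding losslessly compressed video is expensive), not the cards themselves.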


Yeah, you can actually do it. Just run the desktop on your integrated Intel GPU and CUDA on the NVIDIA discrete GPU.
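A quick way to confirm that setup worked is a minimal sketch like the following (my own illustration, assuming the NVIDIA card is visible as a CUDA device while the Intel GPU drives the display): it enumerates CUDA devices and prints the display-watchdog flag for each.

    // Minimal sketch: list CUDA devices and check whether each one
    // also drives a display. Compile with: nvcc check.cu -o check
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // kernelExecTimeoutEnabled == 1 means a display watchdog
            // will kill long-running kernels; a compute-only card
            // should report 0.
            printf("device %d: %s, watchdog: %d\n",
                   i, prop.name, prop.kernelExecTimeoutEnabled);
        }
        return 0;
    }

If the desktop really is on the Intel GPU, the NVIDIA card should report watchdog 0, so long training kernels won't be killed by the display timeout.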



