Can one productively use a video card in a laptop, or even in a high-end desktop, for machine learning? At work we use a dedicated workstation-type box with 4 NVIDIA cards. It is not connected to any display, and we do not run any remote graphical sessions on it that could otherwise occupy the cards. Still, it is slow at training. If it were not for the size of our datasets (video in lossless compression), we would use cloud solutions instead.
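
(For anyone stuck in a similar spot: here is a minimal diagnostic sketch, assuming a PyTorch setup since no framework is named above. It first checks that the training process actually sees all four cards, then times data loading against GPU compute to see whether on-the-fly decoding of the lossless video is starving the GPUs. The `dataset` and `model` arguments are hypothetical placeholders for the real objects.)

    # Minimal sketch, assuming PyTorch; `dataset` and `model` are placeholders.
    import time
    import torch

    # Confirm the process can see all installed GPUs at all.
    print(f"CUDA available: {torch.cuda.is_available()}")
    print(f"Visible GPUs:   {torch.cuda.device_count()}")
    for i in range(torch.cuda.device_count()):
        print(f"  [{i}] {torch.cuda.get_device_name(i)}")

    # Rough comparison of data-wait time vs. GPU compute time. Assumes the
    # dataset yields (input, label) pairs. If data wait dwarfs GPU compute,
    # the cards are starved by video decoding / IO, not by model math.
    def profile_loader(dataset, model, device="cuda:0", num_batches=50):
        loader = torch.utils.data.DataLoader(
            dataset, batch_size=8, num_workers=8, pin_memory=True
        )
        model = model.to(device)
        wait_for_data = gpu_time = 0.0
        it = iter(loader)
        for _ in range(num_batches):
            t0 = time.perf_counter()
            batch, _ = next(it)              # time spent waiting on decode/IO
            t1 = time.perf_counter()
            model(batch.to(device, non_blocking=True))
            torch.cuda.synchronize(device)   # make the GPU timing honest
            t2 = time.perf_counter()
            wait_for_data += t1 - t0
            gpu_time += t2 - t1
        print(f"data wait: {wait_for_data:.1f}s   gpu compute: {gpu_time:.1f}s")

In our case I would expect the data-wait number to dominate, which is exactly the kind of bottleneck that more or faster GPUs cannot fix.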
Sigh.