
Just read DeepMind / Nvidia / Facebook (A-tier AI institution) papers or NIPS orals, and go to the results first: are they really better than what you've seen before? Then read the paper.

I know it's a bit elitist, but we all have limited time in life. Also, these institutions are usually the only ones making actual breakthroughs, for money reasons: they brute-force many hyperparameters on their giant clusters, in a context where a single training run can cost 15k USD.

Also most papers which "look interesting" or "edgy" are usually a disappointment.



> the only ones making actual breakthrough for money reasons

That also suggests those aren't breakthroughs. They're just someone getting 1% because their corporate sponsor spent $250k more than the other guys.

Look at the ideas, not the results. Is there something new and is it clearly expressed? If you can't answer that in five minutes, move on. Ideas transfer, results don't.

In particular, if an ML paper abstract states a percentage improvement over SOTA and then lists five existing techniques that were combined to get the result, you can just put it directly on the trash pile.


Mm, I get the point, but I disagree that "money" only brings small percentage gains. Entirely new possibilities can be uncovered. For example, I was thinking of these papers:

https://www.youtube.com/watch?v=vppFvq2quQ0

https://www.youtube.com/watch?v=XOxxPcy5Gr4

As further proof: the entire concept of neural networks has existed since the '80s. It's running them on new hardware (GPUs) that made them so important. And the "new hardware" is always expensive.

In 10 years, maybe every startup will have a massive cluster instead of a 4-GPU PC, with FLOPS capabilities that would have been a luxury a few decades ago.



