Every intelligent colleague I know is an interesting mix of 'sour but intrigued' on this.

Personally, I know I've lost a lot of street cred in certain work circles recently over my view that shops should pursue local LLM solutions [0]. Hopefully the '$6000, 4-8 tokens/second local LLM box' post making the rounds [1] gives orgs a better idea of what LLMs can do if we keep them from being 100% SaaS-like in structure.

I think a big litmus test for some orgs in the near future is whether they keep 'buying ChatGPT' or instead find a good way to quickly customize, or at least properly deploy, such models.

[0] - I mean, for starters, a locally hosted LLM resolves a LOT of concerns around infosec, since nothing sensitive ever leaves your own hardware...

[1] - Very thankful a colleague shared that with me...
