
Definitely agree: LLMs are only as useful as the person interpreting and implementing the output. If someone doesn't have enough knowledge or context about the thing they're trying to solve or create, blindly copy-pasting while asking the wrong questions will lead the project to disaster.

I witnessed this firsthand when I dove into the deep end on something over my head: GPT-4 Code Interpreter went into an error loop, and I ended up having to learn all the background knowledge I had foolishly been trying to avoid.


