
"This broke. Here is the error behavior, here are diagnostics, here is the code. Help me dig in and figure this out."


I'm sure it can diagnose common, easily searchable, well-documented issues. But when I've tried LLMs for debugging, they led me on a wild goose chase ~40% of the time.

But if you expect it to debug code that was itself written by another black box, you might as well use it to decompile software.


Sometimes the error message is a red herring and the problem lies elsewhere. It's a good way to test impostors who think prompting an LLM makes you a programmer: they secretly paste the error into ChatGPT and go off in the wrong direction...
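A contrived sketch of the kind of red herring I mean (hypothetical code, made up for illustration):

    import json

    def load_config(path):
        # The real bug: a missing or malformed file is silently
        # swallowed and replaced with an empty dict.
        try:
            with open(path) as f:
                return json.load(f)
        except (OSError, json.JSONDecodeError):
            return {}

    def connect(config):
        # The traceback points here: KeyError: 'host'.
        # Paste only that error into an LLM and you end up
        # "fixing" connect() instead of load_config().
        return f"{config['host']}:{config['port']}"

    print(connect(load_config("settings.json")))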



