
LLMs can add unsubstantiated conclusions at a far higher rate than humans working without LLMs.




True, but humans got a 20-year head start, and I am willing to wager the overwhelming majority of extant flagrant errors are due to humans making shit up and no other human noticing and correcting it.

My go-to example was the SDI page saying that Brilliant Pebbles interceptors were to be made out of tungsten (completely illogical hogwash that doesn't even pass a basic sniff test). This claim was added to the page in February of 2012 by a new Wikipedia user, with no edit note accompanying the change nor any change to the sources and references. It stayed in the article until October 29th, 2025. And of course this misinformation was copied by other people, and you can still find it being quoted, uncited, in other online publications. With an established track record of fact-checking this poor, I honestly think LLMs are just pissing into the ocean.


If LLMs 10X it, as the advocates keep insisting, that means it would only take 2 years to do as much or more damage as humans alone have done in 20.

Perhaps so. On the other hand, there's probably a lot of low-hanging fruit they can pick just by reading the article, reading the cited sources, and making corrections. Humans can do this, but rarely do, because it's so tedious.

I don't know how it will turn out. I don't have very high hopes, but I'm not certain it will all get worse either.


The entire point of the article is that LLMs cannot produce accurate text, so ironically, your claim that LLMs can illustrates your point about human reliability perfectly.

I guess the conclusion is there are simply no avenues to gain knowledge.


> I am willing to wager the overwhelming majority of extant flagrant errors are due to humans making shit up

In general, I agree, but I wouldn't want to ascribe malfeasance ("making shit up") as the dominant problem.

I've seen two types of problems with references.

1. The reference is dead, which means I can't verify or refute the statement in the Wikipedia article. If I see that, I simply remove both the assertion and the reference from the wiki article.

2. The reference is live, but it only almost confirms the statement in the Wikipedia article; whoever put it there over-interpreted the information in the reference. In that case, I correct the statement in the article, but I keep the ref.

Those are the two types of reference errors that I've come across.
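
For the dead-reference case, the first triage step is mechanical enough to script. Here's a minimal sketch in Python, assuming the requests library is available; the function name and labels are my own invention, not part of any actual Wikipedia tooling:

    # Hypothetical sketch: flag dead citation URLs so a human can decide
    # whether to remove the assertion and the reference (type 1 above).
    import requests

    def check_reference(url: str, timeout: float = 10.0) -> str:
        """Classify a citation URL as 'live' or 'dead'."""
        try:
            # Some servers reject HEAD; a real tool would fall back to GET.
            resp = requests.head(url, timeout=timeout, allow_redirects=True)
            # A live link still needs a human to confirm the source actually
            # supports the statement (the type 2 problem above).
            return "live" if resp.status_code < 400 else "dead"
        except requests.RequestException:
            return "dead"

    if __name__ == "__main__":
        for url in ["https://example.com/", "https://example.com/gone"]:
            print(url, "->", check_reference(url))

Even then, a "dead" result only tells you the link rotted; the Wayback Machine may still have the source, so removal is a judgment call.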

And, yes, I've come across these types of errors long before LLMs.


At some point you're forced to either believe that people have never heard of the concept of a force multiplier, or to return to Upton Sinclair's observation about getting people to believe in things that hurt their bottom line.

I don’t see why people keep blaming cars for road safety problems; people got into buggy crashes for centuries before automobiles even existed.

Because a difference in scale can become a difference in category. A handful of buggy crashes can be written off as operator error, but as the car becomes widely adopted and analysis matures, it becomes clear that the fundamental design of the machine and its available use cases have flaws that cause a higher rate of operator error than desired. Therefore, cars are redesigned to be safer, laws and regulations are put in place, licensing systems are introduced, and traffic calming and road design are considered.

Hope that helps you understand.


Is the sarcasm really that opaque? Who would unironically equate buggy accidents and automobile accidents?

I’d like to introduce you to the internet.

There’s a reason /s was a big thing: one person’s obvious sarcasm is (almost tautologically) another person’s true statement of opinion.


Thanks. I wasn’t aware of that.

It took me a minute to realise you were joking too! :)

How much time have you spent around developers?

I got my first tech job in 1998. Some of the most sarcastic people I’ve ever met.


