Hacker News | new | past | comments | ask | show | jobs | submit | CuriouslyC's comments | login

The US committed massive treaty violations and genocide, on top of huge imperialist destabilization of many sovereign nations. Tiananmen Square and the Uyghurs are bad, but we're straight up evil.

The Chinese government regularly kidnaps its own citizens, who have no due process rights, and is currently engaged in a mass genocide of a racial group they consider “inferior.”

Additionally, they have supported Russia consistently during their occupation of Ukraine, and just install leaders for life.

I’m confused how you think the US is worse. I say this as an Afroindigenous person who is very clear about the harms white supremacy has inflicted upon the cultures I am a part of.


> Additionally, they have supported Russia consistently during their occupation of Ukraine

And who are we supporting since roughly 01/2025? :-)


Just on the genocide scorecard, it's us 0, China 1. Ask a Native American what they think of the US govt.

Breaking shit is the path of most resistance. Do not do this unless you're young and poor.

The way to win is economic resistance. Stop spending and stop paying taxes. Crash the fucking economy so deep into the ground that the country self-immolates.


What? Do you understand what happens to the subject of self-immolation?

It gets turned into ashes to provide nutrients to something worth growing.

Yes, because never in history has a rotten economy empowered right-wing authoritarians.

>the country self-immolates

Right-wing authoritarianism is a primal response to perception of disorder my dude. Don't pour fuel on the fire.


A bad economy is a noose around the neck of the people who own it, which in this case is the right wing authoritarians. The next people in line are the social democracy leftists. Look to Mamdani for where we're going when we hang the traitors.

Gemini had the best long context support for the longest time, and even now at >400k tokens it's still got the best long context recall.

Gemini is just not trained for autonomy/tool use/agentic behavior to the same degree as the other frontier models. Goog seems to emphasize video/images/scientific+world knowledge.


My experience is that it advertises a large context window and then just becomes incoherent and confused as a session grows to fill that context.

e.g. it sucks at general tool use but sucks even more at it after a chunk of time in a session. One frustrating situation is to watch it go into a loop trying and failing to edit source files.

I often wonder how my old coworkers from Google get by, if this is the agentic coding they have available to them for working on projects on Google3. But I suspect the models they work with have been fine tuned on Google's custom tooling and perform better?


Thank god for the Chinese labs. Keeping us (relatively) honest.

The only AI use case that cares about latency is interactive voice agents, where you ideally want <200ms response time, and 100ms of network latency kills that. For coding and batch job agents anything under 1s isn't going to matter to the user.

tbh, that's a good point about the voice agents that I hadn't considered. I guess there are some latency-sensitive inference workloads. Thanks for pointing that out.

Yeah, also stuff like robotics which might not really exist today but could be big in the future.

You'll want the time-sensitive parts (motor control) to be running locally anyway.

A customer service chatbot can require more than one LLM call per response to the point that latency anywhere in the system starts to show up as a degraded end-user experience.
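A toy sketch of why this compounds: if each response is built from several sequential LLM calls, per-call network and inference latency add up. All numbers below are assumptions for illustration, not measurements of any real system.

```python
# Hypothetical latency budget for one chatbot response.
# Every figure here is an assumed placeholder, not a benchmark.
network_rtt_ms = 100      # assumed network round trip per call
inference_ms = 400        # assumed model inference time per call
calls_per_response = 3    # e.g. classify -> retrieve -> generate, run sequentially

# Sequential calls: latencies sum rather than overlap.
total_ms = calls_per_response * (network_rtt_ms + inference_ms)
print(f"end-to-end: {total_ms} ms")  # prints "end-to-end: 1500 ms"
```

Under these assumptions, 100 ms of extra network latency per call turns into 300 ms of added delay per response, which is why latency "anywhere in the system" starts to show for multi-call pipelines even when a single call feels fast.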

I think the intent is that if you can cleanly encapsulate some complexity so that people working on stuff that uses it don't have to understand anything beyond a simple interface, that complexity "doesn't exist" for all intents and purposes. Obviously this isn't universal, but a fair percentage of programmers these days don't understand the hardware they're programming against due to the layers of abstractions over them, so it's not crazy either.

They were running a 2x rate limit promo last month.

Theoretically yes. In practice even a few weeks before it ended, the actual rate limit was down to what it was before the promo. And now I'm getting roughly 0.25x of what I got before the promo.

To be fair, GPT 5.4 is mostly a better model than Opus 4.6 in terms of quality of work. The tradeoff is it's less autonomous and it takes longer to complete equivalent tasks.

The dynamics vastly favor China. Part of the reason the US sprinting towards "ASI" isn't totally boneheaded is that the US and its industry need a hail-mary play to "win" the game; if they play it safe, they lose for sure.

I'd be fine with a world without AI, honestly. Nobody really wins this race except the very wealthy. And I don't think it's really going to play out the way the wealthy think it will. It's more like a dog catching a car than it is a race.

> It's more like a dog catching a car than it is a race.

What does this mean? I didn't understand the analogy.


A dog that catches a car has no use for it. The chase ends with nothing gained.

"The dog that caught the car" refers to how dogs sometimes chase cars. Suppose the car stops and the dog catches up - what is it going to do? It has no plan, it has no purpose, it isn't going to bite the car, it isn't going to get anything out of catching the car. The car may even run it over. I intended it basically as "play stupid games, win stupid prizes", or "be careful what you wish for".

My observation is that the dog sniffs all the tires, picks one tire, lifts one leg and does the deed. I don't know if it's a way of marking territory or domination. We need a dogatologist to explain what it means.

That was quite the unexpected anticlimactic ending. I’m sure Terry Pratchett would be proud.

We did it reddit!

The irony is that we've just shifted the complexity. Anyone can make something now, but since everyone is making things, now you need to compete on reach/distribution more aggressively. The new "capital" is social media juice and pre-AI rep. Same problem, different skin.


It’s the same as it always was.

You could always take the time to do something or pay someone else to do it.

You pay others to focus on things you can’t.

Unless myths fully does that (which I say in full confidence that it doesn’t) it’s just making it cheaper to provide focus.

