Part of the issue is that they don't actually advertise what the token limit is. Just some vague, "this is 5x more than free, and 5x more than pro". They seem to be free to change the basis however they please, because most of us are more than happy to use what they give us at the discounted subscription pricing.
How much do you want to bet that the credential was stolen during the previous LiteLLM incident? At what point are we going to have to stop using these package managers because they're not secure? I've got to admit, it's got me nervous to use Python or Node.js these days, but it's really a universal problem.
> it’s got me nervous to use Python or Node.js these days
My feelings precisely. Min package age (supported in uv and all JS package managers) is nice but I still feel extremely hesitant to upgrade my deps or start a new project at the moment.
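For anyone who hasn't set this up, here's a rough sketch of what a minimum-package-age policy looks like in practice. The exact flag and setting names below are from memory of recent uv and pnpm releases, so double-check them against your tool's docs before relying on them:

```shell
# uv: refuse to resolve anything published after a cutoff timestamp (RFC 3339)
uv pip install --exclude-newer 2025-06-01T00:00:00Z requests

# pnpm: only install versions that have been public for a while
# (minimumReleaseAge is specified in minutes; ~1 week here)
pnpm config set minimumReleaseAge 10080
```

The idea in both cases is the same: most supply-chain compromises are caught within hours or days, so refusing brand-new releases buys the ecosystem time to notice.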
I don’t think this is going to stabilize any time soon, so figuring out how to handle potentially compromised deps is something we will all need to think about.
I would be avoiding npm itself on principle in the JS ecosystem. Use a package manager that has a history of actually caring about these issues in a timely manner.
It almost doesn't matter, because you can get pwned by a transitive dependency. If someone doesn't have the same scruples as you have, you're still at risk.
PNPM makes you approve postinstall scripts instead of running them by default, which helps a lot. Whenever I see a prompt to run a postinstall script, unless I know the package normally has one & what it does, I go look it up before approving it.
(Of course I could still get bitten if one of the packages I trust has its postinstall script replaced.)
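For context, the workflow in recent pnpm (v10+) looks roughly like this; the allowlist ends up under `pnpm.onlyBuiltDependencies` in package.json (details from memory, so verify against the pnpm docs):

```shell
# pnpm v10 blocks dependency build scripts by default; review and
# approve them interactively after an install:
pnpm approve-builds
```

Packages you don't approve still install, they just never get to run their lifecycle scripts.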
I suppose you would have to commit your node_modules, or otherwise cache your setup so that all prerequisite modules are built and ready to install without running post-install scripts?
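You can also just opt out of lifecycle scripts at install time; both npm and pnpm have a flag for this (packages that genuinely need a build step will then need manual handling):

```shell
# Skip all lifecycle (pre/post-install) scripts during install:
npm ci --ignore-scripts

# pnpm equivalent:
pnpm install --ignore-scripts
```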
"Kind of" is doing a lot of work there. This might be THE most misleading title I've heard. Jumping into this thread I expected they went from 30% to 0%, not to 20%, so I appreciate your comment for giving me more context.
Can Dang or the HN moderation team fix the title to better reflect the true state of things? As it stands, it's misleading.
“Did you hear? On Red Square they’re giving away cars.”
“Not quite. First, it’s not on Red Square but on Dzerzhinsky Square. Second, they’re not cars but bicycles. And third, they’re not giving them away, they’re stealing them.”
I feel like this is a bit of a sinking ship. I suppose if you want to avoid known sources of slop then this works … but beyond that it’s a bit of a lost cause. It’s like sports betting — once it’s there then there’s no saying who is (ab)using it.
It's not perfect, but in time my search results have gone from the first several pages being mostly garbage to mostly all good. Sure, new spam sites crop up every few days, but it's a quick block.
I have been using omz for YEARS. I have practically grown up with it. I resent that I would never have noticed it took several hundred milliseconds to load if it were not for this discussion. I never particularly felt the delay unless I was in a very large Git repository, which is rare.

On the bright side, now I know about `fish` and am learning it has some nice features I never had in Zsh (e.g., much more advanced autosuggestions). What I basically use omz for is 1) autocomplete, 2) git aliases, and 3) my beloved prompt. In about 30 minutes, Claude helped me port my prompt from Zsh, autocomplete comes OOB, and a cursory Google search shows someone has made a Fish plugin to satisfy my Git alias muscle memory.

I could leave Zsh/omz in the rear-view mirror tomorrow, but for why? I never would have noticed before this discussion...