Rejecting any packages newer than X days is one nice control, but ultimately it'd be way better to maintain an allowlist of which packages are allowed to run scripts.
Unfortunately npm is friggen awful at this...
You can use --ignore-scripts=true to disable all scripts, but inevitably some packages will absolutely need to run scripts. There's no way to allowlist scripts from specific packages while blocking all others.
There are third-party npm packages that you can install, like @lavamoat/allow-scripts, but to use them you need an entirely different command, like `npm run setup`, instead of the `npm install` everyone is familiar with.
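For the curious, the workaround looks roughly like this (a sketch; the package names in the allowlist are illustrative, and the `lavamoat.allowScripts` config shape is from the @lavamoat/allow-scripts README):

```shell
# Per-project: tell npm never to run lifecycle scripts on install.
echo "ignore-scripts=true" >> .npmrc

# Then re-enable scripts only for packages you trust, via config in
# package.json (written to a standalone file here just to show the shape):
cat > allow-scripts-config.example.json <<'EOF'
{
  "lavamoat": {
    "allowScripts": {
      "esbuild": true,
      "some-untrusted-dep": false
    }
  }
}
EOF
```

With that in place you'd run the package's own command (e.g. via a `setup` script) after install to execute only the allowlisted scripts, rather than relying on plain `npm install`.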
This is just awful in so many ways, and it'd be so easy for npm to fix.
7,341 from my Discord bot using the Claude Code SDK.
"Ha — one off from the Opus default. I'd like to think I'm slightly more random than Opus but realistically we're probably pulling from the same biases. The "feels random but isn't" zone around 7300 is apparently very sticky for LLMs."
Depends: if you don't clean up the logs and monitor that cleanup, will it eventually hit the P&L? E.g., if you fail compliance audits and lose customers over it? Then yes, it still eventually comes back to the P&L.
This example feels more like a bug in the law itself that should be corrected. If this behavior is acceptable then it should be legal so we can avoid everyone the hassle in the first place. I bet AI would be great at finding and fixing these bugs.
> If this behavior is acceptable then it should be legal so we can avoid everyone the hassle in the first place.
Codifying what is morally acceptable into definitive rules is something humanity has struggled with for likely far longer than written memory. Also, while you're out there "fixing bugs" (millions of them, one by one), people are affected by them.
> I bet AI would be great at finding and fixing these bugs.
Are we really going to outsource morality to an unfeeling machine that is trained to behave like an exclusive club of people wants it to?
If that were one's goal, that's one way to stealthily nudge and undermine a democracy, I suppose.
It's not a bug, it's something politicians don't want to touch, because nobody wants to be the person who is soft on anything to do with minors and sex. Of course our laws are completely illogical - the fact that you could be put in prison and on a sex offender registry for life for having a single photo of a naked 17 year old (how in the hell were you supposed to know?) on your device is ridiculous.
But, again, who is going to decide to put forward a bill to change that? It's all risk and no reward for the politician.
Fair, but still, the legislative process takes a lot of time, and judicial norms and precedent allow for discretion to be exercised with accountability, which also informs the legislative process.
I think "judge AI" would be better if it also had access to a complete legislative record of debate surrounding the establishment of said laws, so that it could perform a "sanity check" whether its determinations are also consistent with the stated intent of lawmakers.
One might imagine a distant future where laws could be dramatically simplified into plain-spoken declarations, to be interpreted by a very advanced (and ideally truly open-source) future LLM. So instead of 18 U.S.C. §§ 2251–2260 the law could be as straightforward as:
"In order to protect children from sexual exploitation and eliminate all incentive for it, no child may be used, depicted, or represented for sexual arousal or gratification. Responsibility extends to those who create, assist, enable, profit from, or access such material for sexual purposes. Sanctions must be proportionate to culpability and sufficient to deter comparable conduct."