What the law does:
SB 53 establishes new requirements for frontier AI developers, creating stronger:
Transparency: Requires large frontier developers to publicly publish a framework on their websites describing how they have incorporated national standards, international standards, and industry-consensus best practices into their frontier AI frameworks.
Innovation: Establishes a new consortium within the Government Operations Agency to develop a framework for creating a public computing cluster. The consortium, called CalCompute, will advance the development and deployment of artificial intelligence that is safe, ethical, equitable, and sustainable by fostering research and innovation.
Safety: Creates a new mechanism for frontier AI companies and the public to report potential critical safety incidents to California’s Office of Emergency Services.
Accountability: Protects whistleblowers who disclose significant health and safety risks posed by frontier models, and creates a civil penalty for noncompliance, enforceable by the Attorney General’s office.
Responsiveness: Directs the California Department of Technology to annually recommend appropriate updates to the law based on multistakeholder input, technological developments, and international standards.
> This product contains AI known in the state of California to not incorporate any national standards, international standards, or industry-consensus best practices into its framework
It sounds like a nothing burger? Pretty much the only thing tech companies have to do in terms of transparency is create a static web page with some self-flattering fluff on it?
I was expecting something more like a mandatory BOM-style list of "ingredients", regular audits, public reporting on safety incidents, etc.
By putting "ethical" in there, it essentially gives the California AG the right to fine companies that provide LLMs capable of expressing controversial viewpoints.
This is so watered down and so full of legal details for corps to exploit as loopholes. I like the initiative, but I wouldn't count on safety, or on model providers being forced to do the right thing.
And when the AI bubble pops, does it also prevent corps from getting themselves bailed out with taxpayer money?
At least a bunch of lawyers and AI consultants (who conveniently, are frequently also lobbyists and consultants for the legislature) now get some legally mandated work and will make a shit ton more money!
And what nobody seems to notice: that last part looks like it was generated by Anthropic's Claude (it likes to make bolded lists with check emojis, structured exactly in that manner). Kind of scary, since it implies they could be letting these models draft legislation.
It's possible that AI was used for this summary section, which isn't as scary as you make it out to be. It's def scary that AI is used in a legislative doc at all, though.
Students have access to OPT (1 year) and STEM OPT (2 years) on the same visa to work after their degree. If they go for a higher degree, they can get OPT again. Grad students from US universities also get a separate quota in the H-1B cap.
All of this should, to some extent, alleviate the concerns.
The weighted system should still work since the candidate pool (from within the US) is likely mostly students on OPT. They should have comparable salaries, unless they are hired by rotten companies.
I once had an algorithms professor who would give us written homework assignments and then, on the day of submission, give a quiz with identical questions. A significant portion of the class did poorly on these quizzes despite scoring well on the assignments.
I can't even imagine how learning is impacted by the (ab)use of AI.
These numbers are misleading (not an apples-to-apples comparison). The M4 has a matrix multiply hardware extension which can accelerate code written (or compiled) specifically for that extension.
It will be faster only for code that uses, or is optimized for, that specific extension. And the examples you give are not really correct.
If you add a supercharger you will get more power, but if the car's transmission is not upgraded, you might just get some broken gears and shafts.
If you add more solar panels to your roof, you might exceed the inverter power, and the panels will not bring benefits.
It's true that you will benefit from the changes above, but not just by themselves - something else needs to change so you can benefit. In the case of the M4 and these extensions, the software needs to be changed, and there needs to be a use case for them.
That is an example of the kind of hardware and software synergy that gives good performance on Apple Silicon for apps on iOS and macOS. Apple execs have given interviews where they talk about this kind of thing: they look at code in the OS and in their application libraries that can benefit from hardware optimization, and they build in hardware support for it to improve performance overall. This helps all kinds of apps running on the OS and using the standard libraries.
Are you implying there are no use cases for matrix multiply?
In any case, the two main deep learning packages have already been updated, so for the use case this change was almost certainly targeted at, your complaint is answered. I'm just stunned that anyone would complain about hardware matrix multiplication; I've wondered why it hasn't been ubiquitous for the past 20 years.
Everyone should make that improvement in their hardware. Everyone should get rid of code implementing matrix mult and make the hardware call instead. It's common sense. Not to put too fine a point on it, but your complaint assumes that Geekbench is based on code that has implemented all those changes.
> Are you implying there are no use cases for matrix multiply?
The whole point is that these highly specialized operations only feature in very specialized use cases, and don't show up in overall performance.
We've been dealing with the regular release of specialized processor operations for a couple of decades. This story is not new. You see cherry-picked microbenchmarks used to plot impressive bar charts, immediately followed by the realization that a) in general this sort of operation is rarely invoked frequently enough to be noticeable, b) you need to build code with specialized flags for software to actually leverage the feature, and c) even then it's only noticeable in very specialized workloads that already run in the background.
I still recall when fused multiply-add was going to be such a game changer, because everyone used polynomials and these operations would triple performance. That wasn't the case.
And more to the point, do you believe that matrix multiplication is a breakthrough discovery that is only now surfacing? Computers were being designed around matrix operations long before they were even considered household items.
I'm not complaining, I'm just saying that the higher numbers in that benchmark do not translate directly into better performance for all the software you run. Deep learning as it stands right now is probably the main application that benefits from this extension (and probably the reason it was added in hardware at this point in time).
Well, you're really just describing benchmarks: if the benchmark doesn't represent your standard workflow, then it probably isn't a good reference for you. But Geekbench includes a bunch of components based on real-world applications like file compression, web browsing, and PDF rendering. So it probably isn't perfect, but it's likely that the M4 will feel a bit faster in regular use compared to an older-generation MacBook Pro.
> Homeless people are unlikely to pay $20 000 per year which is the price of the drug though.
True, but think about all the people who are fully functioning and productive members of society and got afflicted with this disease. This med will increase the likelihood that they continue to be highly functioning and compliant with the treatment. This will allow them to keep their jobs and cognitive abilities.
Every person I know with this disease has trouble sticking to meds due to side effects, and not sticking to the meds and relapsing multiple times is probably one of the most important reasons that their condition regresses.
It’s not just the side effects that cause people with schizophrenia to be non-compliant with meds. Many people with schizophrenia stop taking meds because they do not think they are sick - anosognosia. It’s why a longer lasting injectable is often recommended over daily oral meds.
> longer lasting injectable is often recommended over daily oral meds
Both are recommended. Why? Daily meds are not only effective, but they remind the patient that they have an illness that needs daily maintenance; it keeps them involved in the treatment. Long-acting injectables are added monthly to prevent backslides in case the patient forgets or does not want to take their medicine.
It would be nice if everyone could afford it, and if the people involved were still filthy rich, just not necessarily exorbitantly rich. It cost the company that discovered the drug around $11M.
Yep the poor people will only have to wait 20 years or so for the patent to expire. I’m happy it will help people but I can still get angry at companies that will watch people die and suffer, give a shrug, and continue counting their billions.
> Dementia is not a specific disease but is rather a general term for the impaired ability to remember, think, or make decisions that interferes with doing everyday activities.
Article says:
> Long before people develop dementia, they often begin falling behind on mortgage payments, credit card bills and other financial obligations, new research shows.
> Credit scores among people who later develop dementia.....
Author is conflating the diagnosis of dementia with the development of dementia. It is not like a person wakes up one day with dementia. It progresses slowly. Early-stage dementia is still dementia. I don't know what the formal requirements for diagnosis are, but my guess is that there has to be a significant quality-of-life degradation for it to be diagnosed (or even for someone to be taken to see a doctor, for that matter).
> New research shows that people who develop dementia often begin falling behind on bills years earlier.
I would say something like "New research shows that people in early stages of dementia often begin falling behind on bills years before it can be diagnosed."
It is not about "can diagnose"; we refuse to diagnose more minor decline as dementia for social and political reasons.
The study quantitatively evaluates some of the costs of the political choice to limit medicine to treating as a disease only a decline well below the average (instead of below the individual's peak) and below the ability to perform self-care.
At $190/share this is $102 million.
Also:
> Vistra Energy, another 19% chunk, was wiped out as well.
> Moreover, the fund’s turnover hovered over 80%, leaving just three holdings: Tesla, Microsoft, and Apple.
Some would argue that Tesla is a bigger bubble.
> his personal net worth is an eye-popping $16.3 billion as of 2025.