So I'm glad they fixed the problems of depending on the kilogram sample object for anything, but the definitions are now bizarre if you want to learn them in reference to real-world things. How will they be taught now?
The Ampere went from being the thought experiment of two infinite wires 1m apart creating a certain force to being something nearly unintelligible to the average person.
The Kelvin was just 0C turned into absolute units, carefully formalized as the triple point of water (which is actually 0.01C). Now it's something nearly unintelligible to the public.
Another problem is that I'm not entirely sure atomic mass is even well defined anymore. That number at the bottom of each box of the periodic table of elements may not have a valid unit anymore??
> So I'm glad they fixed the problems of depending on the kilogram sample object for anything, but the definitions are now bizarre if you want to learn them in reference to real-world things. How will they be taught now?
This hasn't changed, really.
The kilogram started as the mass of one litre of water (under defined conditions, etc.). Since everyone who reproduced this got a slightly different measurement because of various experimental errors, it was changed to a physical artefact, which isn't ideal either.
So now we have a definition purely based on fundamental constants.
The best way to represent and approximate it in daily life is still 1 litre of water. 1 litre is also very simple to visualise if you don't have a measuring jug or bottle handy since that's a cube with 10cm edges.
For teaching purposes that also looks much less random than an artefact.
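The cube-of-water mental model is easy to sanity-check with a few lines of arithmetic (the density figure is the usual approximation for water near 4 °C, not part of any SI definition):

```python
# Sanity-check the "10 cm cube of water is about 1 kg" approximation.
edge_m = 0.10                  # 10 cm edge
volume_m3 = edge_m ** 3        # 0.001 m^3
volume_l = volume_m3 * 1000    # 1 m^3 = 1000 L, so this is 1 litre
density = 1000                 # kg/m^3, approximate density of water near 4 C
mass_kg = volume_m3 * density  # roughly 1 kg

print(volume_l, mass_kg)
```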
Right. So in other words, the SI unit definitions are for scientists wishing to realise the base units precisely, but the old definitions are superior for educational purposes (well, except maybe for the prototype kilogram...)
The same as they always have for practical use, since the new definitions are chosen precisely because they make virtually no difference in practice while being more sound.
Most people won't notice, just like they didn't when US customary units were redefined as derived values from SI units.
I must strongly disagree about the ampere; the new definition is much simpler conceptually. The wording is a bit obtuse, but that is a side matter. Practically, they fixed the value of the coulomb (to a constant multiple of the elementary charge), and the ampere now follows from the basic definition: a current of one ampere is one coulomb of charge passing a given point per second. That is much more concrete than the previous, imho very abstract, definition involving infinities and forces at a distance.
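Put concretely: with the elementary charge fixed exactly, one ampere is simply a fixed (if huge) count of charges per second. A quick sketch:

```python
# Since 2019 the elementary charge is exact by definition, so one coulomb
# is an exact count of elementary charges, and one ampere is that many
# charges passing per second.
e = 1.602176634e-19            # C, exact by definition

charges_per_coulomb = 1 / e    # about 6.24e18 elementary charges
charges_per_second_at_1A = charges_per_coulomb

print(f"{charges_per_second_at_1A:.6e} charges/s")
```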
Most likely how they've always been taught. For most people, it's enough to know that a kilogram equates to roughly a liter of water (H2O). It's not like the Ampere was ever actually equivalent to "two infinite wires 1m apart creating a certain force" either, since that's not a scenario repeatable in the real world. Teaching science involves lots and lots of simplification. Just like you learn that electrons don't really orbit around the nucleus when studying science in academia, you'll learn the proper definition of the kilogram.
They've definitely improved atomic mass and Avogadro's constant. Chemists and students have been fooled into believing they have some special importance but in fact they're nothing more than ugly accidents of history that somehow survived standardization.
We used to have two independently defined mass units! The unified atomic mass unit, and the kg. Now we have only one - the kg, and the other is defined in terms of it.
Avogadro's constant is now clearly identified as an arbitrary number with no illusion of importance since it's no longer part of such an interconnected web of dependencies. Hopefully this will help students to realize that it's not some important chemical quantity but just a way for old people to count big numbers because that's the only way they ever learnt. Nothing more special than the number of feet in a mile.
As for periodic tables not having valid units: they didn't anyway. Chemists, even textbook writers, often neglect to include units for atomic masses. The numerical values will be identical, though. The difference is that as we measure them more precisely with future technology, they'll diverge slightly from what they would have been under the old system.
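That divergence is already visible for carbon-12 itself. A sketch, using the exact 2019 value of N_A and the CODATA 2018 measured value of the atomic mass constant (the measured value will shift with future refinements; the point is only that it is measured, not defined):

```python
# Under the old SI, the molar mass of carbon-12 was exactly 12 g/mol by
# definition. Now N_A is exact and the atomic mass constant m_u is measured,
# so M(12C) = 12 * m_u * N_A is experimental and merely happens to be
# extremely close to 12 g/mol.
N_A = 6.02214076e23            # 1/mol, exact by definition since 2019
m_u = 1.66053906660e-27        # kg, CODATA 2018 measured value

molar_mass_c12 = 12 * m_u * N_A * 1000   # g/mol
print(molar_mass_c12)          # close to, but no longer exactly, 12
```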
That's a common misconception but all the things that it relates to SI units are themselves redundant. There's an entire parallel system of constants and units built around it which only exists for legacy reasons, as well as people's desire to avoid very big and very small numbers.
If we started from scratch, Avogadro's number would be a simple exact power of 10 or nothing at all and we'd just tolerate extreme numbers the way computer people tolerate terabytes and electronics people tolerate picofarads. There's nothing natural or fundamental about it.
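The "number of feet in a mile" comparison can be made literal: both constants are just fixed conversion factors between a human-scale unit and a raw count (a toy illustration, not anyone's real API):

```python
# Avogadro's number is structurally the same kind of thing as "feet per
# mile": an exact, historically arbitrary conversion factor between a
# convenient unit (moles, miles) and the underlying quantity.
N_A = 6.02214076e23        # entities per mole, exact since 2019
FEET_PER_MILE = 5280       # feet per mile, exact and equally arbitrary

def moles_to_count(n_mol):
    """Convert an amount in moles to a raw count of entities."""
    return n_mol * N_A

def miles_to_feet(d_miles):
    """Convert a distance in miles to feet."""
    return d_miles * FEET_PER_MILE

print(moles_to_count(0.5))     # half a mole of molecules, as a raw count
print(miles_to_feet(0.5))      # half a mile, in feet
```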
So you make the calculations and use some wire length that places the error within your acceptable margin. You use the completely precise, but impractically small, value of the electron charge in those calculations.
You also use the completely impractical value of the caesium emission frequency in your calculations for your tabletop ammeter, as well as the speed of light in vacuum (good luck having any "vacuum" around). So why are people complaining specifically about the electron charge?
Both the cesium frequency and the speed of light are practically reproducible and are used to calibrate instruments. Not so with the elementary charge. (Similarly, in chemistry, nobody would count the quantity of stuff in molecules, they use moles instead.)
Well, I do know that infinite wires aren't practically reproducible.
In physics, people talk about measuring single electrons all the time. Electrons can be counted with tabletop equipment built on a hobbyist budget. You can easily count them up into the trillions, which is far from a coulomb, but still a practical amount of them.
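Just how far trillions of electrons is from a coulomb is a one-line calculation:

```python
# Charge carried by a trillion counted electrons, using the exact
# post-2019 value of the elementary charge.
e = 1.602176634e-19    # C, elementary charge, exact since 2019
n = 1e12               # a trillion electrons
q = n * e              # total charge in coulombs

print(q)               # on the order of 1e-7 C, a small fraction of a coulomb
```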
> The Ampere went from being the thought experiment of two infinite wires 1m apart creating a certain force to being something nearly unintelligible to the average person.
The ampere is defined by taking the fixed numerical value of the elementary charge e to be 1.602 176 634 × 10^−19 when expressed in the unit C, which is equal to A s, where the second is defined in terms of ∆ν_Cs.