I'd love to know the history of the (misleading) unemployment indicator. I feel that the true indicator of employment should be very simple: the number of people who have full-time jobs divided by the number of people, period. The number of people who are "not looking" or who "dropped out" can be gamed so badly, not to mention that it's irrelevant.
For the longest time, married women had extremely low participation in the labor market. Even still, many women choose to work part-time or not at all. This represents a very large portion of the adult population.
> the number of people who have full-time jobs divided by the number of people, period.
150 million employed / 300 million people = ~50% [1], so that's roughly half the country not working. Retirees, the disabled, minors, and others all pull this number down. When it swings by 100,000 either way, the resulting change in the percentage is only a few hundredths of a percentage point. Sure, that's okay, we just have to get used to dealing with small numbers, right? Well, lots of people have a lot of trouble comprehending numbers like this IMO.
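To make the scale concrete, here's a quick back-of-the-envelope sketch (the 300M / 150M figures are just the rough round numbers above, not official BLS data):

    # Rough illustration, assuming ~300M people and ~150M employed full-time
    population = 300_000_000
    employed = 150_000_000

    ratio = employed / population      # employment share of the whole population
    swing = 100_000 / population       # effect of a 100k monthly jobs swing

    print(f"{ratio:.1%}")              # 50.0% -- roughly half the country
    print(f"{swing:.3%}")              # 0.033% -- a few hundredths of a point

So a headline-grabbing 100,000-job change barely moves this raw ratio, which is part of why statisticians normalize it.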
It's not political for statisticians to want to normalize a raw stat like this in order to show what seems like the "true" number that matters. How many constituents out there want a job and can't find one? That's a very different stat from "how many humans in this geographical region have jobs?"
> It's not political for statisticians to want to normalize a raw stat like this in order to show what seems like the "true" number that matters. How many constituents out there want a job and can't find one? That's a very different stat from "how many humans in this geographical region have jobs?"
I disagree. The definitions themselves are political.