Could you say a bit more about your workflow? I'm used to having source code files open in emacs (or zmacs) and a lisp REPL in another buffer. I edit the source files directly and then recompile changes into the running lisp. The code stays in sync because the text code files are the master copy and the running image is updated (when I choose) to reflect changes. Program state (e.g., large data structures) isn't generally kept in the source files. If what's in memory needs to be preserved, then I need to have serialization and deserialization methods, or a database, or I need to save the running image to a snapshot file. I've probably so internalized this process that there are problems with it that are invisible to me.
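To make the state-preservation point concrete, here's a minimal sketch of the serialize/deserialize pattern (in Python rather than Lisp, purely for illustration; the file name and data are made up): the code is the master copy in source files, while in-memory data built up during a session is saved and restored separately.

```python
import pickle

# A large in-memory structure built up during an interactive session.
# This lives only in the running image, not in the source files.
index = {"alpha": [1, 2, 3], "beta": [4, 5]}

# Preserve the state explicitly, since recompiling code won't carry it over.
with open("index.pkl", "wb") as f:
    pickle.dump(index, f)

# Later, e.g. in a fresh image, restore the saved state.
with open("index.pkl", "rb") as f:
    restored = pickle.load(f)

assert restored == index
```

The alternative mentioned above, saving the whole running image to a snapshot file, preserves everything at once but couples code and data together; explicit serialization keeps the text files authoritative for code.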
So SmallJS works the same way: update the text files, press Run, and the 'image' is updated in a second or two. But it contains only code, not data, and I think that's a good thing.
The commenter's point was to disagree with the previous comment that "souls aren't real". Lack of evidence either way means we don't know. Occam's razor, while a good heuristic, is a heuristic, not a theorem.
It hasn't been limited to probationary employees. Here's one example: https://www.science.org/content/article/nih-ban-renewing-sen...
Agencies have also been directed to make plans for "significant reductions". For example, EPA plans to cut 65%. Fish and Wildlife and the Bureau of Indian Affairs are preparing for up to 40%. These latter cuts haven't happened yet, but they're very likely.
That's correct. It hasn't been limited to probationary employees, but many of those let go are probationary. Those who are not either aren't needed or will be rehired if operations can't continue without them.
Didn't the NIH freeze the review meetings in this year's proposal review process, putting all grant funding that would start next fiscal year in question? This is separate from the change to the overhead rate.
Only if the organization with the money wants to do that. Flip it around. Do you think the sports program at any major university pays for physics research facilities (or any topic outside of sports medicine)?
The federal workforce, as a percentage of all jobs in the U.S., was 4% in the 1950s, decreased steadily to 2% by 2000, and has held roughly steady since then. (The source is https://usafacts.org/articles/how-many-people-work-for-the-f... second figure, and I'm taking total jobs as a proxy for the population that the workforce serves.)
The end of that period of reduction was Clinton's presidency. Clinton's National Performance Review (NPR) started at the beginning of his term in '93. It had goals very similar to the stated goals of this efficiency effort, but it was organized completely differently. He said, "I'll ask every member of our Cabinet to assign their best people to this project, managers, auditors, and frontline workers as well."
GPT4o: The NPR's initial report, released in September 1993, contained 384 recommendations focused on cutting red tape, empowering employees, and enhancing customer service. Implementation of these recommendations involved presidential directives, congressional actions, and agency-specific initiatives. Notably, the NPR led to the passage of the Government Performance and Results Act (GPRA) of 1993, which required federal agencies to develop strategic plans and measure performance outcomes. Additionally, the NPR contributed to a reduction of over 377,000 federal jobs during the 1990s, primarily through buyouts, early retirements, natural attrition and some layoffs (reductions-in-force or RIFs).
The recommendations that required changes to law became the GPRA, which passed both houses of Congress by unanimous voice vote.
I don't think the stated goals of the current efficiency drive are controversial. The problem is the method. I want to understand the basis for people supporting those methods, the "we've got to break some eggs" crowd, when the example of the NPR exists. In my opinion, it didn't cause conflicts between branches of government, didn't disrupt markets, and was wildly successful. It also caused much less disruption in people's lives, because the changes were implemented over several years with much more warning.
I, personally, don't think the real goals of this effort are the stated goals, but that's a different issue.
Research funding awarded to universities and to performers internal to NASA (back when there was a reasonable amount of that) had overhead rates that were similar to the NIH rates. When I worked at Xerox PARC, we would perform research for other parts of the company and charged overhead too, although the rate was a little lower (around 40%). Institutional overhead has been a regular feature of how research has been organized and funded for 60 years. Change is fine, but most of the costs are legitimate, and it takes time for the rest of the system to adjust to changes in one part of it. Doing it abruptly is damaging the system and will negatively impact the careers of many students and young researchers.
This equivalence between a company that provides one app that, if it were to disappear, would hurt no one, and a government that has thousands of functions, many of which are life-and-death in both the short and long run, is just ridiculous.
Let's take one example. The Epidemic Intelligence Service (EIS) is a two-year post-residency program that trains health professionals in applied epidemiology. These officers are crucial for on-the-ground investigations of disease outbreaks. The program has 50-60 doctors in each year. All of the first-year doctors in this year's program were fired by DOGE, so far, a capacity reduction of 50%. Both years fall into the 'probationary' civil-servant category, so the jobs of the rest of them are still at risk.
I asked ChatGPT 4o for other examples, and it generated a list of 40. You can do that for yourself, if you're interested.
I don't know. Mark Hamill is a voice acting chameleon. James Earl Jones was himself, which was great for many roles and impossible to improve upon for some. Their strengths lie on different axes.