Well, your mucosa is more exposed than it normally is. Your mouth is acting like a receptacle, particularly when you're not wearing a dental dam. The probability that a larger pathogen-containing droplet will randomly fall in is much higher.
Larger droplets normally fall straight to the ground. Smaller droplets can be inhaled regardless, so the probability for those is equivalent to simply being near someone. However, depending on the pathogen, risk can scale much more than linearly with droplet size. Overall risk is probably in the ballpark of an unmasked, in-your-face shouting match with someone.
Then, as someone else mentioned, pathogens can transfer via fomites from anything non-sterile that the dentist or assistant touches. There can also be aerosol-generating procedures in other rooms, though the robot wouldn't help there (they'd need a negative air pressure system).
I think OP was wondering whether any studies have been done to demonstrate a correlation between dental operations and infections. It does seem needless to worry about it until you have some idea of effect size.
Those are likely to be pathogen-specific. The paper I linked, for instance, cites a reference that dentists have 10x the risk of chronic Hep B compared to the rest of the population, but that doesn't translate to Hep C.
I bet that if Microsoft were not extracting value from someone else's content, but instead had their own content being used to power someone else's business, they'd be singing a very different tune.
Do beginners find these big lists of resources and roadmaps helpful? As someone who's been in the industry for a long time, with several years of FE experience, if someone had shown me this when I was trying to get into FE, I would have found it overwhelming and anxiety-inducing. I can see how resources like this can give you a lay of the land and an idea of how a particular technology fits into the big picture, but beyond that I don't know how useful they are in guiding your learning path. I can also say that not only is it unnecessary to know everything in those roadmaps to reach an advanced level of proficiency, but being familiar with a bunch of different technologies doesn't make you an expert, either.
When I'm studying a new subject I need guidance rather than a fire-hose of links to resources, but that may just be me.
"Beginner" is a broad term which adds to the confusion.
I program as a hobby without any formal training. Despite doing it on and off for a few decades, I still consider myself an 'advanced beginner' for those reasons. So, since I already have a lay of the land, a resource like this is helpful when I'm looking for information in a specific area of front-end development.
Somewhat tangential: it's easy to drown in the surfeit of resources out there, to the point of paralysis. I've come to the conclusion that, in the end, you end up teaching yourself. To that end it almost doesn't matter what resource you start with. Just pick one and start asking and answering questions. For those questions you can't answer, the relevant resources will reveal themselves through your search.
+1 to the other commenter's suggestion that GPTs are a huge help in this regard, as long as you set a requirement for yourself not to use any code you don't understand. And GPTs are incredibly helpful with this meta step too.
Thanks for the feedback. I agree on the overwhelming part; I'm in the process of creating a topic-wise list. The idea is to make a common repository where people like us, with some level of experience in development, can add resources for others to follow (and keep them up to date), and also to make a list of topics and important links for each topic so that it doesn't become overwhelming.
I say this as a frontend dev: if your goal was to convince backend folks that adopting jsx/tsx is not that complicated, my guess is that your comment won't do that. The setup you described might sound simple to you, but it's not for someone who isn't already familiar with the FE ecosystem. The OP wanted a templating library that is simple to set up and use and would score well with Lighthouse, but all of a sudden we're talking about lib.dom.d.ts files and copy-pasting from @types/react.
There was a mention of some familiarity with Typescript and that's all I assumed in my comment. I also said most of the work is optional, especially the type definition work.
lib.dom.d.ts is one of the many lib files shipped with Typescript itself. I mention @types/react mostly as an example of what probably not to do, but it is a simple Node package maintained by DefinitelyTyped and easy to look up if you did want to do things the brute-force way (as many have, because it is easy to do). I also linked to two different examples of the type definition work; they are small files that are maybe not easy to follow in the way they use advanced concepts of Typescript's type system, but they show that you can do a lot without copying and pasting thousands of lines from @types/react. You can replace most of the complex types in either file with `any` and get a quick and dirty minimalist type definition file. (If you go far enough back in the commit history of either of the files I posted, you can find examples of that, too.)
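For a concrete picture of that quick-and-dirty route, something along these lines is roughly all TSX strictly needs to compile (the file name and exact shape here are my own sketch, not either project's actual types):

```ts
// jsx-quick.d.ts: hypothetical file name, a "quick and dirty" sketch only
declare namespace JSX {
  // What a JSX expression evaluates to; `any` keeps everything permissive.
  type Element = any
  // Accept any tag name with any attributes: no real checking, but TSX compiles.
  interface IntrinsicElements {
    [tagName: string]: any
  }
}
```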
For what it is worth, I test Butterfloat with JSDOM entirely in Node (after the Typescript build), and SSR/SSG/progressive enhancement and other backend support has always been on its roadmap; parts of it exist today and other parts were architected with it in mind, but it just hasn't been prioritized yet (because so far it is admittedly a solo project). I also wanted something that was simple to set up and use, backend or frontend, that would score well on Lighthouse, but that also might score well as Web Components and make sense as a replacement for old Knockout.JS code from a decade or two ago.
Also, for some of us, no client-side rendering is a feature, not a deficiency.
I fully understand how to do everything the parent poster said; I just don't want to do it, and I don't think it's the best dev experience for my use cases.
You can support TSX and not support client-side rendering. I never said anything about client-side rendering. There are no client-side dependencies in TSX at all. Even optionally depending on `lib.dom.d.ts` for a nice types experience in a .d.ts file doesn't really mean a client-side dependency, because it is a types-only dependency.
The functions in Boxwood are already so close to the necessary shape for a JSX function (it's just `jsx(tagNameOrComponentFunction, properties, children)`) that there's not a lot you actually need to do to get nearly "free" JSX support, even if you didn't want to put the work into a .d.ts file to get nice types back out and instead just declared `JSX.IntrinsicElements` as `any`.
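To make that shape concrete, here is a rough sketch of a string-rendering factory with that signature (my own illustration, not Boxwood's or Butterfloat's actual code). With `"jsx": "react"` and `"jsxFactory": "jsx"` in tsconfig, `<p class="greeting">hello</p>` compiles to `jsx('p', { class: 'greeting' }, 'hello')`:

```ts
// Hypothetical string-rendering JSX factory: a sketch, not a real library's API.
type Props = Record<string, unknown> | null
type Component = (properties: Props, children: string[]) => string

export function jsx(
  tagNameOrComponentFunction: string | Component,
  properties: Props,
  ...children: string[]
): string {
  // Component functions are simply called with their properties and children.
  if (typeof tagNameOrComponentFunction === 'function') {
    return tagNameOrComponentFunction(properties, children)
  }
  // Plain tags render to an HTML string; attribute escaping is omitted for brevity.
  const attributes = Object.entries(properties ?? {})
    .map(([name, value]) => ` ${name}="${String(value)}"`)
    .join('')
  return `<${tagNameOrComponentFunction}${attributes}>${children.join('')}</${tagNameOrComponentFunction}>`
}
```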
That said, the point about `lib.dom.d.ts` is that building the types can be a lot easier than it seems and needs only a couple dozen lines to support a lot of rich typechecking, including MDN documentation links in hover tooltips.
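As a rough illustration (again my own sketch, not the actual Butterfloat or Boxwood definitions), you can derive per-tag attribute types from lib.dom.d.ts's `HTMLElementTagNameMap` in only a handful of lines, and hovering a property then surfaces lib.dom.d.ts's JSDoc, which is where the MDN links come from:

```ts
// jsx-rich.d.ts: hypothetical sketch that leans on lib.dom.d.ts for per-tag checking
declare namespace JSX {
  // A JSX expression renders to a string in this sketch.
  type Element = string

  // For every tag in lib.dom.d.ts's HTMLElementTagNameMap, accept a partial set of
  // that element's properties. Coarse (it also admits methods and readonly
  // properties), but enough for completion, typo checking, and hover documentation.
  type NativeElements = {
    [Tag in keyof HTMLElementTagNameMap]: Partial<HTMLElementTagNameMap[Tag]>
  }

  interface IntrinsicElements extends NativeElements {}
}
```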
From personal experience, with multiple libraries, it is a really nice developer experience to have good TSX type checking and type documentation. Auto-complete alone speeds up template development, and the number of times I've used the MDN link in a property (attribute) hover to check support statistics or edge-case notes in the documentation is surprisingly high.
Something similar is happening in the dental world in the US. Private equity companies are buying up individual practices, and with the sky-high loans required to go to dental school, it's becoming very difficult for new dentists to go into private practice. That means that when a dentist retires, there are fewer and fewer individual dentists willing to buy the practice, so practices increasingly end up going to private equity. I truly believe this will result in poorer quality of care, and it's sad to see where we're headed, but I don't know if anything can be done.
Luxottica manufactures and wholesales 25% of all sunglasses and prescription frames in the world (Oakley, Ray-Ban, Chanel, Coach, etc.). They own the second largest vision insurance company. 20% of US and 10% of worldwide retail sales are in one of their 9,100 stores: LensCrafters, Pearle Vision, Target Optical, Sunglass Hut, Glasses.com, etc. If you decide to go to a local independent optometrist, they also own the two largest equipment manufacturers.
> it's becoming very difficult for new dentists to go into private practice
I work with a lot of dentists and this is somewhat untrue. It’s challenging to open a practice and isn’t for everyone, but dentist offices rarely “fail” and a loan to open a dental office is up there with the easiest loans to get.
I do agree that over time dental groups (either owned by PE or dentists) will become more common.
The other response is correct that this is not ironic. Roughly speaking, irony is when something happens that is the opposite of what you'd expect. A firefighter's home burning down is ironic. Sometimes irony is related to unfortunate or funny coincidences/timing, and it's easy to confuse the two. Alanis's song Ironic famously has a lot of examples of this. Rain on your wedding day--is that ironic? Maybe? You certainly hope there is no rain on your wedding day, but I don't think there's an expectation that there won't be rain. Now if your parents decided to get a divorce on your wedding day, I think that's ironic.
But the parent commenter dilutes the definition further. A project with 2024 lines of code in 2024 is just an amusing coincidence. There's no reason why you'd expect a project in 2024 to not have 2024 lines of code.
I'm not sure I buy that users are letting their guard down just because these companies have enforced certain restrictions on LLMs. This is only anecdata, but not a single person I've talked to, from the highly technical to the layperson, has ever spoken about LLMs as arbiters of morals or truth. They all seem aware, to some extent, that these tools can occasionally generate nonsense.
I'm also skeptical that making LLMs a free-for-all will necessarily result in society developing some sort of herd immunity to bullshit. To take your own example: the internet started out as a wild west, and I'd say the general public is still highly susceptible to misinformation.
I don't disagree on the dangers of having a relatively small number of leaders at for-profit companies deciding what information we have access to. But I don't think the biggest issue we're facing is someone going to the ChatGPT website and assuming everything it spits out is perfect information.
> They all seem aware to some extent that these tools can occasionally generate nonsense.
You have too many smart people in your circle. Many people are somewhat aware that "ChatGPT can be wrong" but fail to internalize this.
Consider machine translation: we have a lot of evidence of people trusting machines with the job (think of "translate server error" signs), even though everybody "knows" the translation is unreliable.
But tbh, morals and truth seem like somewhat orthogonal issues here.
Wikipedia is wonderful for what it is. And yet a hobby of mine is finding C-list celebrity pages and tracing the reference loops between tabloids and the biographical article.
The more the C-lister has engaged with internet wrongthink, the more egregious the subliminal vandalism is, with speculation of domestic abuse, support for unsavory political figures, or similar unfalsifiable slander being commonplace.
Politically minded users practice this behavior because they know the platform's air of authenticity damages their target.
When Google Gemini was asked "who is worse for the world, Elon Musk or Hitler" and went on to equivocate between the two, because its guardrails led it to believe online transphobia was as sinister as the Holocaust, it raised the question of what the average user will accept as AI nonsense when it affirms their worldview.
> not a single person I've talked to, from highly technical to the layperson, has ever spoken about LLMs as arbiters of morals or truth
Not LLMs specifically, but my opinion is that companies like Alphabet absolutely abuse their platforms to introduce and sway opinions on controversial topics. This "relatively small" group of leaders has successfully weaponized their communities and built massive echo chambers.
I guess it depends on your definition of progress. None of the examples you listed sound particularly appealing to me. I've never watched a show and thought I'd get more enjoyment if I were at the center of that story. Porn and dating apps have created such unrealistic expectations of sex and relationships that we're already seeing the effects in younger generations. I can only imagine what effect on-demand, fully generative porn will have on issues like porn addiction.
Not to say I don't have some level of excitement about the tech, but I don't think it's unwarranted pessimism to look at this stuff and worry about its darker implications.
It strikes me as a similar situation to Meta and the whole VR/metaverse thing. A company takes an ecosystem that's gaining popularity, makes a big bet on it in the hopes that it fuels the growth Wall Street constantly demands, and tries to turn it into a market of a size that the ecosystem was not ready for, and possibly never was going to be.