This is always, always the answer. It's not only the decent thing to do, it ensures you'll get an accurate representation of the quality of the prospective employee's work.
It's just as misleading to string together dozens of "sources" in an article to prop up a narrative. It appears to me that the website you've linked is just playing on people's paranoia, a la YouTube "news" channels. If the actual content has to chase the dramatic headline the entire time, you can be sure that it's a whole lot of BS.
News channels do not broadcast 24 hours of news. The majority of programming is personalities, opinions and "recaps." The actual news comes on in hour-long chunks once early in the morning, at mid-day and at primetime.
I'm not going to say my news is perfect, but it's from a non-commercial / nonpartisan organization, they get money from the government / taxes (which is less than ideal, but their independence is codified I believe), etc. It sticks to the news, professional-like.
Switch to the commercial channels (we thankfully don't have major 24-hour news channels) and the news on there has much more, er, personality: quirkiness, some silly background stories, and a weatherman that travels the country to visit events and petting zoos and shit.
The trick is paying attention long enough to determine if sources stay consistent, issue proper corrections & retractions, are intellectually honest, etc.
Or if the purveyors are just hacking ratings, for more ad revenue.
In the 80s, I eschewed a career in broadcast media. I loved the gear, doing creative stuff, working in the studio. But absolutely hated the business. "If it bleeds, it leads."
We've been complaining about outrage engines since the beginning. Social media just made things much much worse.
State-funded media has its own problems, but at least there's some accountability, which is effectively nonexistent for large media companies. As long as the free market rewards sensationalism, there will be profit-driven "journalism." Ultimately, the general populace needs to be educated on the veracity of news in the digital age: how to spot a misleading headline, and how to corroborate actual expository works.
In what world is an attempt to understand the actions of a governing body more disturbing than disappearing someone? The pondering of morality and underlying intentions are what fuel the entire "East vs West" debate.
Chalk it up to systemic casualties all you'd like, but let's not pretend as if that isn't reductive as well. Behind every system, there are decisions being made by human beings and it's not flippant to be cognizant of that.
It reminds me of what Apple does when it redefines a product category by "just" bringing together existing technology into a more convenient form factor and UX.
That's a wonderful observation. The form factor is as clever as it is logical. Stuff like this makes me really optimistic for the future of Raspberry Pi (and widespread computing!).
It's one of those designs where you kind of go "ah yes, I thought that would be a good idea". But before you saw it you might never have planned it in your head exactly the way it is. The iPod was like that for me, with its scrolling click wheel.
Like getting to the platonic form of a specific type of device.
The Pi 400 is the completion of the idea of an educational computer for kids, IMO. It's not going to compete with a Mac mini or anything like that, and the keyboard is likely not going to be anyone's top choice for a 'professional' keyboard. But it's good enough in all the right ways as a general purpose computer.
The one weird thing to me is their mouse placement decision. Their promo photos and videos show it on the left side because of the most convenient usb port. I have never actually seen a left-placed mouse in the wild. Having it all look so clean and organized in the promo is a bit disingenuous.
In reality, the mouse will be on the right, and the cable will have to cross over top of all cables except the ethernet and usbc power. For such a beautiful form factor, it will look like a mess.
The funny thing is that all of the lefties I know just adapted to the right-side mouse (likely just because that is how it was set up at school/office).
While yes, the BT mouse is going to be the better option, they're pushing the kit-included mouse.
Me too. Due to some issues with my right hand, I retrained myself to use the mouse with my left hand. Now I can use the mouse with either hand, and in both button orientations.
Switching is easier than most people imagine, IMHO. I started to feel the beginnings of an RSI and went from right to left for half a year or so, no problems.
You basically just convinced me that I should be training my left hand to use my mouse. In all cases except precision work (eg graphic design) it should be doable.
Side note, but if you actually do graphic design, get a digitizer/graphics pen tablet. It is simply "another world" compared to a mouse; nowadays even an el-cheapo one (i.e. something in the 40-60 US$ range) is good enough for non-professional use.
Actually this has little to do with being left-handed, it depends on other factors.
I am right handed and use the mouse with the left, so that I have my right hand "free" to use a pen or the (right side) numeric keypad when crunching numbers.
And I do not (as many left-handers do) "invert" the mouse buttons; maybe I am strange/an exception.
The keyboard comes with Bluetooth 5.0; maybe the promo would look a bit cleaner with a bluetooth mouse, or at least a 2.4GHz wireless mouse and dongle.
It's a laptop minus the screen... that's all it amounts to. It's a keyboard with the proc and memory underneath it. I don't understand how any of this is revolutionary or special. Hell, it doesn't even have a battery.
I think the price alone makes it revolutionary. Laptops aren't $100 brand new and if it were what would that screen look like? Most people who would benefit from this price probably have a TV they can connect it to.
Also, screens last a long time, but it's the computers themselves that need continual upgrading. Case in point: we have a Dell screen from 18 years ago. The Dell desktop long since became obsolete.
So this makes a lot of sense economically, especially for cash-strapped organizations like schools that have to manage large fleets of machines.
Stuff that isn't touched and lasts a long time (screens) doesn't get replaced. Stuff that is touched and likely to wear out over time (keyboards), or that gets out of date due to Moore's Law (the computer), is fused into one package, so there are fewer cables to worry about getting unplugged, worn out, etc.
That's fair, but there's another problem. It's hard enough to find a regular educator who's qualified to teach kids about computers on Windows or a Mac. Linux is going to be easier?
The education sector has already bought into Chromebooks, which means that most in-class technology solutions are going to target the browser. The only thing anyone will need to learn to move between OSes is: how do I turn it on, where's the button to launch the browser, how do I turn it off. Everything else will already be familiar.
As British people, we tend to tell everyone we do that, and people believe it. However, it's not necessarily true. We produced a ridiculously large amount of crap, to the point that we nearly killed our manufacturing industry.
If there’s anything to aspire to, it’s Hewlett-Packard’s test and measurement and computing offerings between 1960 and 1990.
I've still got a 1971 MGB. Apart from problems with the electrics (batteries go flat in winter), it's pretty easy to fix most things by hitting it with a hammer.
Perhaps that was the case with e.g. vintage sports cars from the 50s and 60s. But I don't think the British manufacturing industry has been doing well ever since. I would love to be wrong, though.
In terms of using deep fakes to impersonate political figures, I do think we'll ultimately be okay on that front. I can see the entertainment industry being flipped on its head and fabricated conspiracy theories hitting a fever pitch, but with initiatives like CAI* cropping up, my hope is that there will be a digital wax seal of sorts to make sure what you're seeing is what the author intended.
I'm not so sure that it's the cringe that is the problem. I think it's the notion that these social media platforms are acting as a lens for how a lot of people see the world.