It is not a protected term, so anything is state-of-the-art if you want it to be.
For example, Gemma models at the moment of release were performing worse than their competition, but still, it was "state-of-the-art". That does not mean it's a bad product at all (Gemma is actually good), but the claims are very free.
Juicero was state-of-the-art on release too, though hands were better, etc.
> It's just marketing. [...] It is not a protected term, so anything is state-of-the-art if you want it to be.
But is it true?
I think we ought to stop indulging and rationalizing self-serving bullshit with the "it's just marketing" bit, as if that somehow makes bullshit okay. It's not okay. Normalizing bullshit is culturally destructive and reinforces the existing indifference to truth.
Part of the motivation people have seems to be a cowardly morbid fear of conflict or the acknowledgment that the world is a mess. But I'm not even suggesting conflict. I'm suggesting demoting the dignity of bullshitters in one's own estimation of them. A bullshitter should appear trashy to us, because bullshitting is trashy.
My rule for modern TVs:
1. Never connect the TV panel itself to the internet. Keep it air-gapped. Treat it solely as a dumb monitor.
2. Use an Apple TV for the "smart" features.
3. Avoid Fire TV, Chromecast, or Roku.
The logic is simple: Google (Chromecast) and Amazon (Fire TV) operate on the same business model as the TV manufacturers: subsidized hardware in exchange for user data and ad inventory. Apple is the only mainstream option where the hardware cost covers the experience, rather than your viewing habits subsidizing the device.
That's exactly my own thought process. I don't pretend that Apple is saintly, but their profit model is currently to make money through premium prices on premium products. They have a lot to lose, like several trillion dollars, in betraying that trust.
A large % of their revenue comes from the App Store/services, and they have incentives to lock you into the ecosystem, sell you digital shit and take a cut of everything.
I saw an ad for Apple's gaming service in my iPhone system settings recently!
That's not to say that Google isn't worse but let's not pretend Apple is some saint here or that their incentives are perfectly aligned with the users. Hardware growth has peaked, they will be forced to milk you on services to keep growing revenue.
Personally I'm looking forward to the Steam Deck; if SteamOS gets annoying, it's a PC built for Linux, so there's going to be something else available.
True. The best option currently is to buy an Nvidia Shield TV, unlock the bootloader and install a custom Android ROM. The hardware is great, and if you install a custom ROM, you have more freedom than Apple TV will ever give you.
The comment about the ad wasn't about the ad itself. It was an Apple ad for an Apple service, so they didn't make any money at all on the ad. The remark was about the service Apple was pushing, and just how intrusively.
But the comment OP was replying to was about their ad services and what incentive the company has to operate in good faith or risk impacting sales to the majority of their business.
Correct, and didn’t sell your data to do it. I’m okay with that. If I trust Apple with basically my life stored on their phone and in their cloud, and processing payments for me, and filtering my email, and spoofing my mac address on networks (and,and,and), it seems foolish to be worried about them knowing what tv shows I like to watch at night too. At least to me. It’s gonna be a sad day when Tim leaves and user privacy isn’t a company focus anymore.
Services are 25% of revenue and the only segment that is growing/can grow - that means all focus is going to be on expanding that revenue = enshittification.
Hardware is now purely a way to get you onto the App Store - which is why iOS is so locked down and the iPad has a MacBook-level processor with a toy OS.
If you stop looking at the marketing speak and look at it from a stockholder perspective, all the user-hostile moves Apple doublespeaks into security and UX actually make a lot more sense.
Hardware is still 3x the revenue of services, and though it has a lower margin it is the bulk of the company's profit. Apple was 3% of the PC market in 2010 and is 10% today, while Android is 75% of the global cellphone market - there's plenty of room for growth in hardware... if you stop looking at the marketing speak, whatever that means.
I don't see how this really changes the underlying problem of the device spying on you and them selling that information to the highest bidder? I'm not reaching for a financial report to fix that.
Apple doesn't sell information, they sell access to eyeballs. Quite a big difference. The whole point of the first OP's comment was that ad revenue is not worth Apple hurting the other parts of their business built around privacy. Pointing out that Apple shows ads for its own services within its own OS isn't a case to the contrary.
Apple absolutely does allow wholesale data harvesting by turning a blind eye to apps that straight up embed spyware SDKs.
This isn’t some hypothetical or abstract scenario, it’s a real life multi billion dollar a year industry that Apple allows on their devices.
You can argue that this is not the same thing as the native ad platform that they run and I’d agree but it’s also a distinction without a meaningful difference.
All you've done is move the goal posts, and it's not even ads related. I'm not entirely certain what you're arguing, other than having some feelings about Apple.
Like another comment mentioned, I'm ready to go back to torrenting. I'm currently paying for 4 streaming subscriptions (if you count YouTube Premium), with super segmented and annoying search UX, and Apple won't even let me pay for their service in my EU country (Croatia). And the DRM story is ridiculous. I'll just set up an Arr stack and have a better experience than I can pay for - for free.
Jellyfin + Arr stack would take a couple of hours to set up and cost $10/month for a seedbox in Europe, but it's not as convenient as downloading an app and logging in.
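The server half is genuinely small, though. A minimal sketch, assuming Docker and the official jellyfin/jellyfin image (ports and paths here are just examples):

# run Jellyfin, point it at your media, then open http://<host>:8096
$ docker run -d --name jellyfin \
    -p 8096:8096 \
    -v /srv/jellyfin/config:/config \
    -v /srv/media:/media:ro \
    jellyfin/jellyfin

The Arr apps and the seedbox are the fiddly part; the player itself is one command.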
If it was just one app or even two I would agree, but there's:
- Netflix
- HBO max
- Sky Showtime
- Amazon Prime
- Apple TV+
- Disney+
This is just the stuff I watched this year.
Add in all the region locks, plus not all the services having rights to local dubs despite them existing (more of an issue for children's stuff, but still relevant; Disney+ is unusable for me because of this).
Netflix used to have a catalog worth keeping the subscription on, nowadays I maybe get to watch something once a quarter and keep it on for kids stuff.
Streaming is not convenient anymore, it's a shitshow.
I think a Jellyfin/Arr/seedbox setup is going to be the solution this year.
> I don't pretend that Apple is saintly, but their profit model is currently to make money through premium prices on premium products
Is this statement based on anything other than Apple marketing materials, perhaps a meaningful qualification from an independent third party? I worry this falsehood is being repeated so much it has become "truth".
For some reason, some people have this inexplicable rose-tinted vision of Apple. Until they release the source code of their products, the only rational stance is to treat their software as malware.
If further evidence is necessary, any Apple device that I have owned pings multiple Apple domains several times per minute, despite disabling every cloud dependency that can be disabled. The roles of the domains are partially documented, but traffic is encrypted and it is impossible to know for sure what information Apple is exfiltrating. It is certainly a lot more than a periodic software update check. It certainly seems that Apple is documenting how people interact with the devices they own very closely. That's an insane amount of oversight over people's lives considering that some (most?) people use their phones as their primary computer.
I just opened Activity Monitor - a process called "dasd" is the 5th largest consumer of CPU time. What does it do? Apple does not want you to know. Apple also will not let you disable it. Apple will not even tell you if this process is legitimate (it is signed by "Software Signing" lmao).
$ man dasd
No manual entry for dasd
There are like two dozen processes like this, half of which open network connections despite me never invoking any Apple services or even built-in apps. macOS has basically become malware.
Even excusing that daemon, here is a list of processes which have attempted to contact Apple in the past 24 hours, according to Little Snitch. I am certain this is not even a complete list, because macOS is closed source and likely can bypass application firewalls altogether:
Again, I have never used iCloud/Apple services, turned off all available telemetry options and did not open any Apple applications while all this took place (I only use Firefox and iTerm). Almost all of these processes lack a man page, or if they have one, it's one-line nonsense which explains nothing. This is beyond unprofessional.
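If anyone wants to reproduce this on their own machine, the built-in tools get you most of the way (the dasd path below is my assumption; confirm it with ps first):

# every process with an established network connection, full process names
$ sudo lsof -nP -iTCP -sTCP:ESTABLISHED +c0

# inspect the signature on one of these daemons (path assumed, verify via ps)
$ codesign -dv /usr/libexec/dasd 2>&1 | head -3

Little Snitch just makes the same information persistent and per-connection.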
A scheduling daemon shouldn't be the 5th largest consumer of CPU. The question is what it is scheduling. Collecting data about user behavior would be a background task, you know...
Absence of evidence isn't evidence of absence, but it certainly rhymes. Is there proof that Apple is monetizing our data with third parties? It's very clear how almost every other major company is, but Apple's been reasonably respectful about it.
Google is also vehemently opposed to selling your data to third parties. That's how they keep themselves as the middleman between advertisers and users. What they do is allow detailed behavioral targeting. Apple prefers to expose contextual targeting data to advertising instead. Apple is also better about not letting advertisers run random scripts.
But frankly the difference between the two companies seems more a matter of degree than kind. It's not like Apple has a strong, principled stance against collecting data. They have a strong principled stance against other ad networks collecting user data, which looks a lot like anticompetitiveness. Their first party software collects identifiable data on you regardless of whether you opt out. They just avoid using that to target you if you opt out.
The reason Apple says their advertising doesn't track you is because they define "tracking" as purchasing third party data, not first party data collection.
What falsehood? That Apple's profit mix contains much less advertising than its competitors' is just a fact about their incentives at the moment. He didn't really go all that far in claiming anything beyond that being better than the alternative of being mostly an advertising company.
My only * to this would be Google Chromecast devices specifically, if you already have them.
They have an option (buried deep in the settings) to make the home screen apps-only.
> Turn on Apps only mode
> From the Google TV home screen, select Settings, then Accounts & Sign In.
> Select your profile, then Apps only mode, then Turn on.
It also makes the device significantly more performant.
With a bit of fiddling, Android TV can be as good as Apple TV in terms of privacy. Not out of the box, of course, but ADB can remove advertising/surveillance-related APK files from most devices sold in big-box stores, and there are open-source alternative clients for YouTube and a few other platforms, thanks to the popularity of the underlying AOSP platform. The same is possible to varying extents on smart TVs that use Android TV as their OS.
You can even completely replace Google's sponsored-content-feed launcher/homescreen with an open source alternative that is just a grid of big tiles for your installed apps (FLauncher).
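For the curious, the whole debloat/launcher swap is a handful of adb commands. A sketch; the package names below are typical for Google TV devices but vary by vendor, so list what's actually installed first:

# enable developer options + network debugging on the TV, then (IP is your TV's):
$ adb connect 192.168.1.50:5555
$ adb shell pm list packages | grep -i google   # find the real package names

# remove an ads/recommendations package for the current user (device-dependent)
$ adb shell pm uninstall --user 0 com.google.android.tvrecommendations

# install FLauncher and disable the stock launcher so it becomes the default
$ adb install flauncher.apk
$ adb shell pm disable-user --user 0 com.google.android.tvlauncher

Since "pm uninstall --user 0" leaves the APK on the system partition, a factory reset undoes everything if you break something.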
For me, SmartTube with both ad-blocking and sponsor block is the killer feature of Android TV as a platform.
If you're into local network media streaming, Jellyfin's Android TV app is also great. Their Apple TV app is limited enough that people recommend using a paid third-party client instead. And that's inevitably the case with Apple's walled gardens... The annual developer fee means things that people would build for the community on AOSP/Android are locked behind purchases or subscriptions on iOS and Apple TV.
It never occurred to me that that's why all the macOS utilities cost money. (I mean, not literally all, but way more basic stuff than you'd ever think to pay for on Windows or Android.) I did figure Apple encouraged it because of their massive cut of the revenue, but I forgot they charge devs to publish in the first place.
macOS isn't as locked down as iOS or Apple TV (yet) unless you publish via the Mac App Store, but a secondary factor is that Apple customers expect to pay to solve a problem without having to think about it.
The good is that the above norm encourages the creation of high quality software. The bad is that, by the same token, some ideas that would be free/libre community projects on other platforms are instead paid utilities in Apple's walled garden, especially on iOS and Apple TV.
>It never occurred to me that that's why all the macOS utilities cost money
All macOS utilities absolutely don't cost money. There are countless free macOS utilities in the Mac App Store, as well as open source utilities for macOS specifically too.
I like the idea, but these Kodi-based devices are far too limited; they essentially only serve as media players for local content. For example, streaming YouTube is difficult and a poor experience relative to using VacuumTube on desktop Linux. It's even harder to get a browser working to stream from websites like Pluto and Flixer, especially if you want an adblocker. I haven't found a better option than an upscaled Linux DE on a mini-PC so far (however, see KDE Plasma Bigscreen).
Also, you can buy a more capable used ThinkCentre micro for less money, so the value proposition isn't exactly great.
I wouldn't expect Kodi/OSMC to provide an unofficial YT client. However, the "app" availability issue is a big one for devices like this if they are to compete with spyware-ridden Android TV boxes on the one hand and Linux HTPCs on the other. The Android TV boxes are cheap and support all streaming platforms. The Linux HTPCs are free (as in freedom), typically far more powerful (they can double as consoles/emulators) and don't restrict the user in any way.
Use a PC for "smart" features. Used PC hardware is cheap and plenty effective. And the Logitech K400 is better than any TV remote.
No spying (unless you run Windows). Easy ad blocking. No reliance on platform-specific app support. Native support for multiple simultaneous content feeds (windows) - even from different services.
And it's not like it's complicated. My parents are as tech-illiterate as they come and they've been happily using an HTPC setup for well over a decade. Anyone who can operate a "Smart TV" can certainly use a web browser.
Of course that's a viable option, but it likely uses far more electricity in a year, and unless you're sailing the high seas, you're unlikely to always get full 4K HDR from streaming services.
Unlikely, Apple TV is itself a "PC", not much different.
An actual PC doesn't cost much in electricity for a year either (say $30/year headless for watching several hours a day, with sleep mode the rest). Make it an ARM machine and it will be quite a bit less.
I have the same setup and have never looked back. My kids can control the TV now via the browser instead of asking me to fiddle with a smartphone, and I can easily block e.g. YouTube via the hosts file. The ability to have multiple streaming services open in different tabs and read online reviews all on the same screen is also vastly superior to any UX offered by e.g. Chromecast or similar devices.
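For anyone who wants to copy the hosts trick, it's just null-routing the obvious domains (apps occasionally pick up new endpoints, so treat this list as a starting point):

# /etc/hosts - block YouTube on the HTPC
0.0.0.0 youtube.com
0.0.0.0 www.youtube.com
0.0.0.0 m.youtube.com
0.0.0.0 youtubei.googleapis.com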
100%. Confirmed by my Firewalla. These and HomePods only access apple.com and icloud.com domains unless you're using apps. No mysterious hard coded IP addresses. Apple TV also has the best hardware, by far.
I used to work in the industry. I know the guys responsible for real-time data capture from various platforms like Roku and Vizio.
I 100% agree, and I own very nice LG TVs. They are not connected to the internet. They each have an Apple TV, which is the only way they get video, and they can't send data out.
I agree with you except for the Apple TV part. I use a mini-PC running Ubuntu and use a wireless keyboard with integrated touchpad to control it, and it works wonderfully and has a much better user experience than the Chromecast I was using before - a product which has progressively become more and more shitty over the years to the point of being unusable.
An Apple TV is probably also OK, but likely much more expensive. Also, Apple is a company that always has done all it could to lock down its platforms, lock in its users and seek exorbitant fees from developers releasing on its platforms.
I used to use a NUC with a K400 as well (and a Logitech Harmony (RIP)), and the Apple TV is a way better experience.
The Apple TV remote is way more useable, and HDMI CEC just works™, which never ever was true with the NUC. I really like the client-server model - the Apple TV is my dumb front end for Plex, Steam Link, and so on. It also is well supported by every streaming service.
All of the Apple TV apps are designed with a UI for a TV and remote, not a user sitting two feet from a computer with a keyboard and mouse, and are way easier to use sitting on a sofa than a keyboard + browser combo.
I could fiddle with the NUC and make it work, but it was not family friendly. In general, the "it just works" factor is extremely high, which I could not say for the NUC.
If Apple ever goes evil, I'll just switch to whatever the best solution is when that happens (maybe a rooted Android TV device?). It's not like I'm marrying it. An Apple TV is $150. I've gotten 4 years out of my current one. The cost is negligible.
As I've gotten older, I've really come to value the "it just works" factor. I don't have time or energy for fiddly stuff anymore. After I put in the time to set something up, I want it to be rock solid. To each their own though.
> and use a wireless keyboard with integrated touchpad to control it.
Which wireless keyboard do you use? I've got pretty much the exact same setup - TV + Linux Mint + Logitech K400+. I'm just looking to see if there are better options than the K400+.
Can't wait for Valve to release the new controller with touchpads. It should be more compact than a keyboard, and paired with some voice recognition it would make the keyboard almost obsolete for the smart-TV use case.
I believe HDMI has supported sharing an internet connection (HDMI Ethernet Channel) since 1.4, and I wouldn't be surprised to see TV makers attempting to leverage this in the future to get around you not connecting your TV directly to the internet.
I'm not any more in the ecosystem than an Apple ID and airpods, and it is just fine. The directional spatial audio with the airpods is cool, but we also use other BT headphones with it. I use the ATV almost exclusively for Jellyfin/Infuse.
If these things include WiFi hardware, it's not so simple.
You'd likely be surprised what proprietary WiFi-enabled consumer products do without your knowledge. Especially in a dense residential environment, there's nothing preventing a neighbor's WiFi AP giving internet access to everything it deems eligible within range. It may be a purely behind the scenes facility, on an otherwise ostensibly secured AP.
I don't have firsthand knowledge of TVs doing this, but other consumer devices with WiFi most definitely do this. If you don't control the software driving the TV, and the TV has WiFi hardware, I would assume it's at the very least in the cards.
It's rationalized by the vendors as a service to the customer. The mobile app needs to be able to configure the device via the cloud, so increasing the ability for said device to reach cloud by whatever means necessary is a customer benefit.
It most certainly is. It's not wifi, but it's definitely a thing. It lives down in the 900MHz world where things tend to be slower, but also travel further.
And of course: If it exists, it can be used.
That said, I haven't seen any evidence that suggests that televisions and streaming boxes are using it.
I’d kinda forgotten about it until someone mentioned open WiFi, and this seems like a use case tailor made for it. If not already, it looks like a near certainty to me.
Available as a one-time extra-cost feature on the first Kindle back in '07, Whispernet provided a bit of slow Internet access over cellular networks -- without additional payments or contracts or computers.
And really, Whispernet was great in that role.
But the world of data is shaped a lot differently these days. Data is a lot more-available and much less-expensive than it was back then, ~18 years ago -- and codecs have improved by leaps-and-bounds in terms of data efficiency.
Radios are also less expensive and more-capable compared to what they were in '07.
This will be sold as a feature: "Now with Amazon Whispernet, your new Amazon Fire TV will let you stream as much ad-supported TV as you want! For free! No home Internet connection or bulky antenna required! Say no to monthly bills and wanky-janky setups, and say yes to Amazon Fire TV!"
The future will be advertising. (Always has been, but always will be, too.)
Amazon Sidewalk is more about things connecting to the neighbor's always-plugged-in Echo Dot speaker than it is about them connecting to people walking down literal sidewalks.
As a thought exercise ask yourself would you notice if any of your closed WiFi-enabled hw scanned for APs and occasionally phoned home, if it didn't go out of its way to inform you of this? What would prevent the vendor from doing so?
Why would people even buy something like a smart TV if they know it's highly likely that it was created to spy on them? It's not a necessity, so maybe just don't get a smart TV in the first place? Otherwise, how sure are you that it won't search for an open Wi-Fi network or that it doesn't have a cellular connection?
Because intentionally non-smart TVs are an increasingly niche, and thus expensive, market, and not a categorical upgrade over simply not connecting a smart TV to the internet while pocketing the manufacturer subsidy from advertisers.
Even if dumb TVs were manufactured at a cost comparable to smart TVs (at the same volume, they'd be cheaper to manufacture!), smart TVs are subsidized by the expected behavioral tracking & ad sale revenue.
Right, but the cars here now have to have some kind of GPS tracker thing built in. And the jeans are 1% elastane(?) so that they fall to bits in the sun after 6 months. I remember a pair of real denim jeans I picked up in the States that lasted me 10 years.
Quality has gone out of everything in the last 15+ years.
So these items, along with anything marked Smart == Ad platform, or AI == Future Ad platform, are on my 'will not buy on principle' list regardless of need or wants.
Because the stereo doesn't spy on us (hopefully). If it did, I wouldn't buy one, as it's not a necessity, either.
The zipper also doesn't spy on us... yet? When smart zippers become the norm and you can't find jeans with dumb zippers, I'll return to using buttons even if they're a bit annoying to deal with.
Good luck finding a modern car that doesn't have a stereo. And continuing the analogy, good luck finding jeans without a zipper. When the only affordable and available options spy on you, it's simple enough to keep them air gapped from the internet... Electing not to own these devices at all is a much tougher sell.
> Apple is the only mainstream option where the hardware cost covers the experience, rather than your viewing habits subsidizing the device.
This might be temporarily a good rule of thumb to follow, but you will get monetized eventually. Nobody likes leaving money on the table. Same reason why subscription services now serve ads as well.
They generously offer you a free SIM card when going through passport control in Dubai. I can’t think of any other reason to do that, besides pure benevolence.
I read an article a few years ago about someone using a SIM card embedded in a product like this for free internet. The connection was severely limited though.
There already are on Sony TVs. My roommate is always connecting it when I’m away and I have to factory reset it and go through the dark pattern to use it without WiFi.
Prompt for a login or to check for updates on every start or once a week. It wouldn’t be difficult to get the numbers up for the number of online devices.
What would be the monthly cost per unit to LG for servicing those cell modems? Data-only, and I presume they could get some kind of bulk discount as a big manufacturer.
The alternative is they'll develop some common local mesh network that'll grab data through any gateway. Imagine your TV connecting to some wireless headphones which have the multipoint feature enabled and are connected to a smartphone which has WiFi: the TV sends encrypted data to the buds, the buds to the phone, and the phone to some external source. Of course it can be more sophisticated, but it's totally doable and plausible.
Or imagine some localized auto-mesh based on Zigbee/Matter: you have a Philips Hue lamp connected to WiFi, the TV connects to it and it forwards data... I totally believe this will be the next development of ad networks, sold as 'better smart home devices'. And it'll not require any LTE. Or it can have LTE only on some subset of devices while others use those as gateways.
Probably a couple of dollars a month, which would be very tough to actually make work. Even Facebook only makes a few hundred dollars a year per person in the US.
Amazon had a data deal for Kindles for a long time. If we're assuming nefariousness, the embedded SIM would only be used for analytics/telemetry not for content, so it shouldn't be too much data.
If Nielsen will give me $1 to have a journal of what I watch, they might give Samsung something to have actual logs.
If you get an Apple TV, also get the Infuse app. It is able to play anything that is on your home network - SMB, Plex, Jellyfin.
I also recommend running iSponsorBlockTV if you use the YouTube app; it auto-mutes and auto-skips ads.
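To be clear, it runs on some always-on box on the same LAN as the Apple TV, not on the Apple TV itself. Roughly like this, assuming Docker; I'm quoting the image path and setup flag from memory, so check the project's README:

# one-time interactive setup, pairs with the Apple TV over the LAN
# (image name and --setup flag are from memory - verify against the README)
$ docker run -it --rm --network host \
    -v ~/isponsorblocktv:/app/data \
    ghcr.io/dmunozv04/isponsorblocktv --setup

# then leave it running as a service
$ docker run -d --restart unless-stopped --network host \
    -v ~/isponsorblocktv:/app/data \
    ghcr.io/dmunozv04/isponsorblocktv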
I looked at the GitHub page for iSponsorBlockTV, but it's super confusing. It's talking about installing Python and Docker. You can't do that stuff on an Apple TV, right??
For me: I want something that will always work with minimal effort and is easy to use for the family.
I've farted around with every HTPC software from MythTV on and I'm over it. I'll happily pay the premium for an AppleTV that will handle almost everything in hardware.
I would honestly just use an Apple TV. But the killer feature for me (I currently use a Steam Deck/Steam Controller) is just YouTube without ads, reliably. Also total control: if YouTube jacked up the prices for YouTube Red, I always have uBlock.
Total control is the name of the game for me. I can load Steam. I can load Brave. I can load VLC. I can watch any streaming service, play any game (Proton-supported), or listen to any music.
It's just really grating to buy a nice screen and then have all the streaming services basically lock you to early-2000s picture quality. It's not that it doesn't work at all, but if I get the big nice modern screen I want to be able to use what I paid for.
Not user friendly, and it required dedicated hardware (TV tuners). Governing bodies also couldn't agree on HTPC standards, like PlaysForSure, causing even more confusion. Plex and Sonarr/Radarr are gaining some steam though.
They're great, but my friends get confused when they're staying over and I'm not there. Not having a normal remote throws people. Getting a remote to work perfectly and usefully in Linux isn't all that simple. Plus it's not at all easy for it to manage external inputs - a smart TV can just switch to the PS5 with a button; how would I do that from my Linux HTPC keyboard?
Don't get me wrong, I'm never giving up my uBlock-YouTube plus Steam plus Plex Linux HTPC, but there are plenty of reasons they're not super practical.
Also doesn't Netflix still throttle to 720p on PCs?
Pretty often, honestly. My friends and I all let each other crash at our places when we're in each other's towns, and somebody is in my town visiting probably 3-4 times a year, and then my brother and sister come out 1-2 times a year each. So in a busy year that's almost once a month.
So enough that I'd like to find a good solution, even if it's not super high priority. My sofabaton Bluetooth remote was hopefully the savior but its Bluetooth mode is pretty bad and makes macros unreliable.
100% agree and do the same. There's no way I'd let one of those things touch the network. That is insane for a techie and even scarier that normal people live that way.
This, except throw out the spyware that is an Apple TV and get an Intel N150-based mini PC (Aoostar makes a nice one), throw Bazzite on it, tell KDE to auto-login and auto-load Jellyfin, and attach a Flirc IR receiver and get a Flirc remote for it. If you want to get fancy, set a systemd timer to reboot it in the middle of the night, as sketched below.
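The timer bit is stock systemd, something like this (unit names are mine):

# /etc/systemd/system/nightly-reboot.service
[Unit]
Description=Reboot the HTPC

[Service]
Type=oneshot
ExecStart=/usr/bin/systemctl reboot

# /etc/systemd/system/nightly-reboot.timer
[Unit]
Description=Nightly HTPC reboot

[Timer]
OnCalendar=*-*-* 04:00:00

[Install]
WantedBy=timers.target

$ sudo systemctl enable --now nightly-reboot.timer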
> Apple is the only mainstream option where the hardware cost covers the experience, rather than your viewing habits subsidizing the device.
Years ago our refrain was "if you're not paying for the service, you're the product".
Nowadays we all recognize how naive that was; why would these psychopathic megacorporations overlook the possibility of both charging us and selling our privacy to the highest bidder?
In other words, Apple doesn't have a pass here. They're profiting from your data too, in addition to charging you the usual Apple tax. Why wouldn't they? Apple's a psychopathic megacorporation just like all the rest of them, whose only goal is to generate profit at any cost.
My new rule for modern TVs is don't have a TV at all. The social role of having a TV is rapidly dwindling. First, the number of movies and TV shows that merit even being watched is dwindling. Second, even if you find something worth watching, the odds that anybody else will want to watch it are small; everybody has been atomized by recommendation algorithms, everybody gets shown a different set of ads and media, and there's no longer any shared culture when it comes to media.

It used to be that everybody went home and watched NBC or one of the two other channels, all saw the same ads for the same movies and shows, so if you mentioned one the next day everybody knew what you were talking about. This is no longer true; if you try to bring up some Netflix show you heard of last night, probably nobody else has heard of it. Now let's say you actually talk somebody into watching something with you despite that... What are the odds that both of you get through the show or movie without reaching for your phones? Almost zero, in my experience.
It's done. The cultural significance of TV is toast. Our culture is too atomized, too personalized for shared experiences. Large TVs, centerpiece of the living room, are becoming an anachronism that dates people as being from a previous era when television was still a shared cultural experience.
I like rewatching old TV shows and films, streamed from my Jellyfin server.
For me, my rule is to get a Google TV, because I can change out the launcher to FLauncher. At least that way I don't see any ads. Google may well still be tracking me, but they do that all over the web, and I have an Android phone, so they've already got plenty of data on me. I just avoid their ads so that it minimises the profitability of that data.
I use FLauncher too. I also use NetGuard from F-Droid and block everything except streaming apps and their dependencies. I only unblock Google when a streaming app requires an update. I'm slowly dropping subscriptions and moving to Jellyfin too, though.
I agree. My Google TV with Projectivy Launcher shows zero launcher ads, unlike my Apple TV. As a bonus, it lets me install SmartTube and use DeArrow and SponsorBlock.
I just wish I could get something similar as a native iOS app. Although I can use Safari extensions, the Safari YouTube experience on iOS is terrible.
I love Projectivy Launcher on my Google Streamer, but I can't figure out how to really replace the built-in launcher. Sometimes the device falls back to the default launcher until I press the "home" button on my remote.
That's a popular and socially safe point of view, but it's completely wrong. Artistic merit, like truth and beauty, is an objective quality completely orthogonal to cultural differences or personal opinion. To illustrate this orthogonality, I invite you to realize there exists art which has great merit and yet which you personally do not like. You should be able to do this, if you can't manage I can provide my own examples for you. The existence of such art proves that personal preferences don't weigh on the recognition of artistic merit.
Except... art and beauty (idk about truth) are subjective. You can attribute grades and points to art or TV shows, but whether any one person likes it is entirely subjective.
But of course, you mention "merit", which if my English is correct is "the amount of work / effort / skill involved". But I personally do not like the duct taped banana, and the work / effort / skill involved is minimal - and yet it's considered art, and people go out of their way to view it.
The Mona Lisa is "fine" (in my opinion), took skill to make, but it wasn't considered particularly exceptional in the works of Da Vinci - until it got stolen. It has objective "artistic merit", its beauty is subjective, but its financial and cultural value exploded through the roof due to its story.
How interesting! These great minds you speak of, are they objectively great, or only subjectively? If they are only subjectively great, then why should a lazy appeal to them sway me? And if they are objectively great minds, then how does that not acknowledge my premise?
I'm teasing you; I do acknowledge that there are great minds, past and present, who disagree with me. And I trust you can acknowledge the same: there is no shortage of great minds who believed and argued that objective truth, beauty and merit really do exist. The question I have for you or anybody who disagrees is this: can you acknowledge the existence of media you don't like one bit but nonetheless acknowledge as having merit which transcends your own personal opinions? I can easily; I can't stand Shakespeare's Othello, and I simultaneously acknowledge it as possessing a great deal of objective artistic merit. For me, there is no contradiction here, because merit is not a function of personal opinion.
Using flowery language does not automatically make you correct, and even if on the hard facts you are correct, it comes across as condescending and arrogant.
What you're saying, "There are shows on TV worth watching and the art form is still evolving, and one person not liking it doesn't mean that it is bad" would have come across much more cleanly if you had stated it plainly.
> What you're saying, "There are shows on TV worth watching and the art form is still evolving, and one person not liking it doesn't mean that it is bad"
That's not even remotely what I'm saying!
I would instead say that media is only "evolving" insofar as it is being optimized by media corporations for reliable fiscal return. No risks allowed, everything needs to be cookie cutter, no risks, barely any new IPs even, the industry just wants to get committees of nepo hire "writers" to remix shit they can be reasonably confident will find reliable audiences, and that means pandering to viewers instead of challenging them. Every decision gets run through test audiences, opinion polls, and legions of executives and their consultants. Art cannot be created in such an industry environment. All they can make is base slop.
That I acknowledge there are great minds on both sides of the debate means that I wouldn't treat it as a hard fact when talking about it in an online forum, which was the explicit point of my response.
If you can recognize the greatness of somebody you disagree with, then you should also be able to recognize merit in media you don't personally like. And if merit can be thus decoupled from personal opinion, that affirms my point that objective merit really does exist.
I agree that the days when “everyone” watched the same show are done. But if you can find a small group to watch a show with (better in person), then there are better shows available for that experience these last several years, even if the average quality has gone down.
What are some of your favorite shared experiences to replace tv?
It frustrates me that this is where we have come to.
I refuse to connect any of my TVs to the internet, but I have to wonder how long until a few different things happen:
- The TVs just connect to unsecured WiFi and collect the data anyway (I think there were reports of at least one manufacturer already doing this at one point?). Or they just make a deal with Xfinity to use their mesh network that seems to be everywhere.
- The TVs don't work without being connected to the internet.
- The manufacturers find out that the cost of adding a cellular modem is justified by the increase in data they can collect.
I would love the idea of buying a modern TV without any of this crap shoved in; I happily use my Apple TV for everything that isn't video games.
It bothers me, though, when it seems like to fix an issue with HDR or something I need to update the firmware. I have wondered on occasion if this is intentional, to "force" people to connect. If I have to do this, I will run an ethernet cable to connect temporarily.
99.999% of TVs are connected directly to the internet by their users without any restrictions. Investing in additional hardware or operator deals to capture the remaining 0.001% isn't typically worth it, for now.
For one of our Samsung TVs, we were able to put the update on a thumb drive (we're old, we still have some around) and then use that to install the update on the TV.
It's kind of funny, we bought these TVs because they were "smart" (when they first came out) but they were so clunky and unreliable we disconnected them and used either PS or Apple TV for other things. Now we wouldn't connect our TVs to the internet for anything, and only use PS5s for specific things. We mostly just use our Apple TV.
My OG Chromecast frequently crashes (I suspect due to running out of RAM when casting content), so I ended up replacing it with the Chromecast Ultra. The OG Chromecast is still usable if you can tolerate the crashing, but I imagine it's going to get worse as developers keep updating their cast players (basically a webpage that plays video) with new features that consume more and more RAM.
Even if they don't brick them explicitly they will no longer provide security updates for them.
I'm in the same boat: the smart TV has never been online, and all content is just cast from a media server/phone/tablet straight to the Chromecast. It works, no fuss, glitch free, and of course they will kill it.
Even if they were to provide security updates, a few platforms no longer work with them. At the very least, Disney+ now refuses to stream to my original Chromecast dongle.
What do you think of Nvidia Shield? I haven't tried it, but I think it should also belong to 2). It's clearly much more expensive than a FireTV, but as you say it shouldn't be subsidized by ads. As an Android device it should be more open than an Apple TV. While I recognize the near flawless UI and high hardware quality of most Apple devices, I disagree with their "golden cage" or walled garden approach.
I see many people liking their Shield, and with good reason it seems, but is it a worthy ecosystem to buy into when it has not seen a new hardware revision since 2019?
This. The lack of a Shield hardware refresh seems insane.
I get Nvidia (the company) has other priorities with higher revenue.
But they have a product, with proven product-market fit, that gives them a last mile connection with end users, in one of the highest utilization home spaces.
How has no one at Nvidia looked at that and said "I'm not saying we orient our entire focus around it, but shouldn't we at least fund it as a strategic priority?"
If datacenter revenue falls off, it's going to look awfully short-sighted not to have diversified customer base when they had the chance.
What benefits would a hardware upgrade bring the end user? Not releasing a new model every year sounds like a perfectly good thing to me as long as they keep updating the software without introducing performance problems.
My biggest gripe with the Shield is the newest one has a remote that I really don’t like. Luckily it can be replaced with a third party remote!
I too think yearly updates are a bit too much and I too want to keep my devices for a long time. Still rocking an iPhone 12 (mini).
But support for newer codecs like AV1 and general hardware refreshes to keep up with the underlying Android base would still seem like good ideas to me.
Reading the specs, it seems the Shield would also benefit from frame-rate detection to auto-switch the display's refresh rate via HDMI.
Higher network bandwidth to play UHD Blu-ray rips seems like something people want.
Same, one of mine is from the initial models, and still working and receiving updates... it doesn't have the 4k upscaling of the newer models, but I've been happy and have several in my house.
About half my watching is YouTube on a paid account, the other half via Kodi and the high seas. My SO uses the regular apps for Netflix, Amazon and HBO currently. Having support for hacker-friendly features as well as blessed apps with 4k support has been pretty great.
As another post mentioned, the remotes (current and previous) have been less than stellar... I've been using the one linked below [1], which works pretty well, though it uses a USB dongle. FWIW, you can also pair a bluetooth headset if you want the big screen experience but don't want to blow out the house with audio sometimes.
The Shield Pro is perfect for me and I have no reason to upgrade. Have mine downgraded and de-bloated using this guide [0] running a custom launcher. Like you said being Android and more open helps a lot.
I use the non-Pro version for 1080p streaming and have for years. It’s great, does what I want and gets out of the way. Some years ago they were forced by Google to use the standard AndroidTV UI instead of their own custom one, which means it now shows ads on the home screen (a carousel of “watch this on service X”), which are inoffensive enough I haven’t bothered to circumvent them. You can swap to your own custom UI if you want with some ssh futzing.
Chromecast hardware wasn't ever sold at a loss, AFAIK. These things were/are pretty pricey for being long-outdated SoCs equivalent to low-range smartphone SoCs and an HDMI driver chip.
I disconnected our living room LG TV from the internet and got a Fire Stick 4K Max, but I hate it; 90% of the screen is advert, and you get a tiny sliver for the 5 apps it lets you see, and you have to go digging for the rest, not to mention the home-screen advertising isn't always appropriate for young children.
I hadn't considered Apple TV because I've never been an Apple user, but perhaps this is what I need.
Though I'm an Android user, all of the Android TV devices seem to be junk or ad-ridden junk.
Is Apple TV the way to go? (Asking for other opinions.)
The only other one I'd seriously consider is the nVidia Shield (Pro?). But the risk with that is that it's decade old hardware with no updates in sight. It's more for the "My Plex/Jellyfin server has all the movies and TV shows ever" -crowd :)
Meanwhile my 1st gen 4k AppleTV (6-ish years old?) is chugging away perfectly and runs every single 3rd party streaming platform I need - even the local ones. As a market it's just too big to ignore.
And no ads anywhere on the front page. The top row apps get to show their stuff on the top part, but it's not "ads" in my book - unlike Google TV that just shoves full-screen crap of "YOU WANNA SEE THIS MARVEL MOVIE?!" at you no matter where you browse.
I've been aware of the Shield devices for some years now, as an Android user, but something always put me off them.
I recently bought the Fire TV 4K in a last-ditch effort to find something I could at least have some control over; if I could replace the launcher with something that's just app icons and not all adverts it would have been perfect, but alas, Amazon has prevented that, so onto the next thing.
It's really sounding like Apple TV is the best option for something suitable for the whole family.
Can I ask: what is the purpose of the relatively large storage on an Apple TV? Do they support "apps" of some kind?
You can play games decently on Apple TV, those require some storage and I believe at least the native Apple apps cache pretty heavily to local storage instead of relying on streaming.
There are "apps" too, but all of them are related to streaming video in some way, except for the games of course.
It's still seeing software updates and can play H.265 and AV1 content in 4K without issue. The latest model is from 2019, though... that said, it works great... the latest software update was just a couple weeks ago.
Also, you can swap the Android TV launcher relatively easily.
I've been using an AppleTV as the primary way to get content to my (dumb, vintage 2007) TV for approximately a decade now.
While my usage has increasingly shifted toward drawing from my personal library through first Plex, then Jellyfin, I've also used Netflix, YouTube, Twitch, Amazon Prime Video, AppleTV+, and probably a couple of other content apps I'm forgetting on it. Aside from some issues with the UI of individual apps (which is, of course, on the developers), it all works great. Many of the apps can even show you a couple of tiles of "suggested content" right from the home screen (for instance, when I select the Netflix app, but before I launch it, it currently shows the next episodes from the most recent two shows I've been watching on it).
There are various ways in which an AppleTV can be better if you're already in the Apple ecosystem (which I am), but you absolutely do not need to be to make excellent use of it.
It can even join your Tailscale network and act as an exit node, giving you a quick & dirty VPN into your home network!
Happy Apple TV user for >5 years now. It has icons for the apps you want to start on the home screen. You click the icons. The apps start.
It's connected to a Samsung TV that's not allowed WiFi access. Besides the bad and steadily worsening UX of streaming apps like Netflix, my setup itself never shows me any ads.
Also, the Apple TV remote has a very solid, premium feel, which I like.
Yes. MacRumors says "don't buy" [0], but only because they tend to recommend buying only if the tech is "new". I have this Apple TV gen and it's not perfect, but it is really the best streaming box, having used Roku and Nvidia Shield. PS: get the version with wired ethernet if you can... WiFi works, but no surprise that wired is more solid.
Before you buy an Apple TV you can try installing Projectivy Launcher and see if that suits your needs. It's basically a simplified launcher UI for Android TV devices.
It's not perfect, but if it suits your needs you won't have to buy another device.
That's the one I tried on the Fire Stick; it doesn't always launch, sometimes launches a few minutes after start, and when pressing home it would go back to the default launcher.
I think it's related to the accessibility settings it wants enabled; I was never able to turn them on like it says, because the setting just refuses to enable.
I just hook up a (linux) laptop to my TV, personally. I have a mouse (and a bluetooth keyboard which I rarely use) to interface.
I have no idea if that would in some way impact something like streaming quality, because I don't have any streaming services; I live in Australia, where the streaming companies simply don't bother organising streaming rights for worthwhile media. I also like to own things I want to rewatch.
If I wanted to get fancy (and if I had a TV capable of connecting to the internet, which I don't) I might consider setting it up as a media server or look at NAS solutions, but my laptop is perfect for me as is.
A cheap used mini desktop with a Linux install on it is also a good way to go. Throw in a wireless mouse and keyboard and you can do not only what an Apple TV or Android box does but also everything a cheap used mini PC can do.
It would be a media powerhouse compared to almost any set-top box you can buy.
Throw OpenELEC or OSMC on it for a simple media setup, or Bazzite or Ubuntu for a normal Linux desktop with downloadable applications for most streaming platforms.
I'm a Linux user myself, and into hardware and self-hosting, so I have hardware coming out of my ears, but I wouldn't dare try and use a Linux box in the living room.
My wife and kids use the living room TV most, I barely use it, and only then if I'm watching with them, they use all of the streaming services so they want it to play in 4K and "just work", for which Linux is unfortunately not the solution.
Honestly whatever makes you happy is fine. My point is that there are a lot of choices out there that don't require you to hand over your personal data to companies or to pay absurd prices for a sliver of privacy.
But I understand when you have to account for the other people you live with and their comfort.
The down side is if you actually want streaming apps with 4K support for the paid services.
I've been using NVidia Shield TV (pro) since the first gen, still have my OG device as well as the updated models. I'm also running a Beelink SER8 with Bazzite for some living-room gaming and classic emulation.
The Apple TV 4K has an idle power draw of 0.49W and a 4K streaming power draw of 2.31W. That mini desktop will likely run at around 20W idle, approach 30W under relatively light load, and hit up to 60W on high loads. Plus a keyboard and mouse are generally terrible couch devices. I've already got a NAS and plenty of devices I can stream from. The Apple TV is an almost perfect small and efficient device to stream to.
Unfortunately that's not possible on the Fire Stick (4K Max); Amazon has modified it to disallow other launchers. There are some "hacks", but none have worked for me, and the closest one I did get working was too much of a pain for the rest of the family.
I've found no way to root it either, so I just want rid of it. Every time something inappropriate for the kids appears almost-full-screen on the home page, with no regard for what time of day it is, my wife gets all the more annoyed by it; she never wanted it in the first place, so the poor experience is not helping my case.
What's wrong with Roku? They have a few ads here and there but I've always found the interface to be super slick. And they aren't Google, so not as harmful to share my data with? (a big assumption, I know)
I wouldn't assume Roku is better to share your data with. Google uses your data to feed their own algorithms instead of just straight up selling it. Their incentive is to keep the data internal so they alone can extract value from it.
This works until eARC breaks and you have to update (LG C6, never connected to the internet, only using AppleTV). And then of course the next LG update will break eARC again.
I’ve thought about doing this, what kind of device do you use for input?
I have a Logitech K400 Plus portable keyboard and it works great for general use, but I end up using my Apple TV on the couch instead since I prefer using a TV remote / gamepad to navigate.
I just use an Apple TV box these days, but ~15 years ago I was running a Windows 7 Media Center PC for DVR and a few saved movies. The best remote was a TiVo Slide Pro (no longer made, and my wife broke the one we had). Normal TiVo remote buttons on top, slides open to expose a small keyboard. They occasionally show up on eBay or if you have a local source, use that. You'll need one that has the RF dongle to be able to use the keyboard with anything other than an RF-capable TiVo.
I have a Logitech K830; it's great. Alas, they're not manufactured any more.
Modern equivalents are a lot cheaper. I think they might not have the backlight feature, which is extremely useful. I've dropped mine many times, even spilled coffee over it, and it still works flawlessly.
I just use the cheapest wireless keyboard and mouse I can. I'm accessing content through the browser, so there's enough typing that I wouldn't want to go without a keyboard, personally.
Imo you only need remote like hardware if you’re planning to scroll through Netflix style walls of content.
> 1. Never connect the TV panel itself to the internet. Keep it air-gapped. Treat it solely as a dumb monitor.
A sensible rule, indeed. Next level of dystopia: cellular modems becoming so cheap that every TV, fridge and washing machine comes with one that connects it to the Internet whether you like it or not. And then when we Faraday cage those, the device refuses to function.
Laws need to keep up and ban this shit outright. It sounds exactly like something that the EU could help with.
The EU actually mandated that cars have a modem ("eCall"), so they can self-report accidents. I think this has been underreported even in tech circles.
You also need to have a spare tire or an inflator kit; that doesn't mean you can throw it at somebody's head or spray them in the eyes.
Said in another manner: having eCall doesn't mean that they are authorized to send telemetry back in non-emergency situations or use it for anything unrelated to its main function. Now, if there is no law that forbids that, car makers are going to exploit that loophole for sure, but that does not mean the EU is evil in this context.
Having two independent cellular modems in a car is obviously silly, so it only makes sense to use the same module both for the mandatory emergency calling and for the telemetry.
Because the emergency calling is mandatory, it'll of course be made impossible to disable the modem - and by extension the telemetry. Oh, you disabled the telemetry? I bet that'll be called "tampering with safety equipment", and your insurance is now void, and your car is no longer road legal.
If the law doesn't mandate that eCall has to be fully independent, it'll 100% be used to spy on you.
But the EU can just (and maybe already does) mandate that such telemetry must be opt-in for the user, and on top of that the data collected that way must be treated in accordance with the GDPR anyway.
> Next level of dystopia: cellular modems becoming so cheap that every TV, fridge and washing machine comes with one that connects it to the Internet whether you like it or not.
I also don't like this precedent, but I do still feel cars are quite different. You need a license to drive a car on public roads. The car needs lots of certifications. You need an insurance. You need to prominently display your (your car's) ID for all to see. If you make mistakes while operating a car, the police can stop you and the state can take away your right to drive a car.
This makes it all very different from a gadget you use for entertainment in your own home.
Over twenty years ago there came a mandate that all places where many people gather (both residential and commercial housing) should have an EN 54-21 compliant alarm transmitter to automatically notify authorities in case of a fire.
I'm afraid that we are crying wolf right now and undermining our efforts to permanently shut down Chat Control and the like when we complain about measures with a history of not being misused.
I feel like this product is optimizing for an anti-pattern.
The blog argues that AI workloads are bottlenecked by latency because of 'millions of small files.' But if you are training on millions of loose 4KB objects directly from network storage, your data pipeline is the problem, not the storage layer.
Data Formats: Standard practice is to use formats like WebDataset, Parquet, or TFRecord to chunk small files into large, sequential blobs. This negates the need for high-IOPS metadata operations and makes standard S3 throughput the only metric that matters (which is already plentiful).
Caching: Most high-performance training jobs hydrate local NVMe scratch space on the GPU nodes. S3 is just the cold source of truth. We don't need sub-millisecond access to the source of truth, we need it at the edge (local disk/RAM), which is handled by the data loader pre-fetching.
It seems like they are building a complex distributed system to solve a problem that is better solved by tar -cvf.
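To make the tar -cvf point concrete, here is a minimal sharding sketch using only the Python standard library. The directory layout, naming scheme, and 1 GiB shard size are illustrative assumptions, not anything the product in question prescribes:

    # Pack many small files into a few large tar "shards" so training reads
    # big sequential blobs instead of millions of tiny objects.
    import os
    import tarfile

    def write_shards(src_dir, out_dir, shard_bytes=1 << 30):  # ~1 GiB shards
        os.makedirs(out_dir, exist_ok=True)
        shard_idx, current, tar = 0, 0, None
        for name in sorted(os.listdir(src_dir)):
            path = os.path.join(src_dir, name)
            size = os.path.getsize(path)
            if tar is None or current + size > shard_bytes:
                if tar:
                    tar.close()
                tar = tarfile.open(
                    os.path.join(out_dir, f"shard-{shard_idx:05d}.tar"), "w")
                shard_idx, current = shard_idx + 1, 0
            tar.add(path, arcname=name)  # sequential append, no per-file request
            current += size
        if tar:
            tar.close()

A loader then streams whole shards and does any shuffling in memory.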
In AI training you want to sample the dataset in arbitrary fashion, and you may want to arbitrarily subset it for specific jobs. These demands are fundamentally opposed to linear access: to make your tar-file approach work, the data has to be ordered to match the sample order of your training workload, coupling data storage and sampler design.
There are solutions for this, but the added complexity is significant. In any case, your training code and data storage become tightly coupled. If a faster storage solution lets you avoid that, I for one would be highly appreciative.
- Modern DL frameworks (PyTorch DataLoader, WebDataset, NVIDIA DALI) do not require random access to disk. They stream large sequential shards into a RAM buffer and shuffle within that buffer (see the sketch after this list). As long as the buffer size is significantly larger than the batch size, the statistical convergence of the model is identical to perfect random sampling.
- AI training is a bandwidth problem, not a latency problem. GPUs need to be fed at 10GB/s+. Making millions of small HTTP requests introduces massive overhead (headers, SSL handshakes, TTFB) that kills bandwidth. Even if the storage engine has 0ms latency, the network stack does not.
- If you truly need "arbitrary subsetting" without downloading a whole tarball, formats like Parquet or indexed TFRecords allow HTTP Range Requests. You can fetch specific byte ranges from a large blob without "coupling" the storage layout significantly.
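To illustrate the first point, a minimal shuffle-buffer sketch. The bucket, shard keys, and parse_shard are hypothetical stand-ins for whatever record format you use; with S3, the sequential reads below are plain GetObject calls, and the "arbitrary subsetting" case maps onto GetObject with a Range header:

    # Stream large sequential shards and approximate random sampling with a
    # RAM shuffle buffer -- no random disk or object access required.
    # parse_shard() is a hypothetical stand-in for your record format
    # (tar members, TFRecord entries, Parquet row groups, ...).
    import random
    import boto3

    s3 = boto3.client("s3")

    def stream_records(bucket, shard_keys):
        for key in shard_keys:
            body = s3.get_object(Bucket=bucket, Key=key)["Body"]
            yield from parse_shard(body)

    def shuffled(records, buffer_size=100_000):
        buf = []
        for rec in records:
            if len(buf) < buffer_size:
                buf.append(rec)        # fill the buffer first
                continue
            i = random.randrange(buffer_size)
            buf[i], rec = rec, buf[i]  # emit a random buffered record
            yield rec
        random.shuffle(buf)            # drain the remainder
        yield from buf

As long as buffer_size is large relative to the batch size, sampling looks random to the model even though every byte arrived sequentially.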
Highly dependent on what you are training. "Shuffling within a buffer" still makes your sampling dependent on the data storage order. PyTorch DataLoader does not handle this for you; high-level libraries like DALI do, but that is exactly the coupling I was saying we should avoid. These libraries have specific use cases in mind, and therefore have restrictions that may or may not suit your needs.
> AI training is a bandwidth problem, not a latency problem. GPUs need to be fed at 10GB/s+. Making millions of small HTTP requests introduces massive overhead (headers, SSL handshakes, TTFB) that kills bandwidth. Even if the storage engine has 0ms latency, the network stack does not.
Agree that throughput is more of an issue than latency, as you can queue data in CPU memory. Small-object throughput is definitely an issue though, which is what I was talking about. Also, there's no need to use HTTP for your requests, so HTTP or TLS overhead is more of a self-induced problem of the storage system itself.
> You can fetch specific byte ranges from a large blob without "coupling" the storage layout significantly.
This has the exact same throughput problems as small objects though.
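Back-of-the-envelope numbers behind that; the per-request overhead and link speed are assumptions for illustration, not measurements of any particular store:

    # Effective throughput when every object fetch pays a fixed per-request
    # overhead, regardless of protocol. All numbers are illustrative.
    LINE_RATE = 100e9 / 8   # 100 Gbit/s link, in bytes/s
    OVERHEAD = 500e-6       # 500 us per serial request (RTT, parsing, ...)

    def effective_bw(object_bytes):
        wire_time = object_bytes / LINE_RATE
        return object_bytes / (OVERHEAD + wire_time)

    for size in (4 * 1024, 1024**2, 1024**3):
        print(f"{size:>13} B/object -> {effective_bw(size) / 1e6:9.1f} MB/s")
    # ~8 MB/s for 4 KiB objects vs ~12 GB/s for 1 GiB blobs on the same
    # link: per-request cost, not the protocol, sets the ceiling.

Whether the overhead comes from HTTP or a custom wire protocol, serial small fetches hit the same wall unless requests are heavily pipelined.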
I agree that this is an anti-pattern for training.
In training, you are often I/O bound over S3 - high-bandwidth networking alone doesn't fix it (.safetensors files are typically ~4GB in size). You need NVMe and high-bandwidth networking along with a distributed file system.
We do this with tiered storage over S3 using HopsFS, which has an HDFS API with a FUSE client, so training can just read data (from the HopsFS datanode's NVMe cache) as if it were local, even though it is pulled from NVMe disks over the network.
In contrast, writes go straight to S3 via HopsFS's write-through NVMe cache.
> It seems like they are building a complex distributed system to solve a problem that is better solved by tar -cvf
That doesn't work on Parquet or anything compressed. In real-time analytics you want to load small files quickly into a central location where they can be both queried and compacted (different workloads) at the same time. This is hard to do in existing table formats like Iceberg. Granted not everyone shares this requirement but it's increasingly important for a wide range of use cases like log management.
You can do app-level optimizations to work with object databases that are slow for small objects, or you can have a fast object database - it doesn't seem that black and white. If you can build a fast object database that is robust and solves that problem well, it's (hopefully) a non-leaky abstraction that can justify some complexity inside.
The tar -cvf point is a good analogy though: are you working with a virtual tape drive or a virtual SSD?
Expecting the storage layer to fix an inefficient I/O pattern (millions of tiny network requests) is optimizing the wrong part of the stack.
> are you working with a virtual tape drive or a virtual SSD.
Treating a networked object store like a local SSD ignores the Fallacies of Distributed Computing. You cannot engineer away the speed of light or the TCP stack.
SSD (over NVMe) and TCP (over 100GbE) both exhibit low tens of microseconds of latency as the lower bound. This ignores redundancy for both, of course, but the cost of that should also be similar between the two.
If the storage is farther away, then you'll go slower, of course. But since the article is comparing against EFS and S3 Express, it's fitting to talk about nearby scenarios, I think. And the point of the article was that S3 Express was more problematic for cost reasons than for small-object performance.
Yeah, I was a bit lost from the introduction. High-performance object stores are "too expensive"? We live in an era where I can store everything forever and query it in human-scale time frames at costs far lower than what we paid for much worse technologies a decade ago. But I was thinking of data lakes, not vector stores or whatever they are trying to solve for AI.
I have a question related to this: if a photon has zero proper time between emission and absorption, how should I think about the influence of later-created photons or fields on it?
In our frame, we can interact with a photon long after it's emitted: send it through a filter, bounce it off a mirror, measure it, etc. But from the photon's own "no proper time" perspective, does it make sense to ask how something created after its emission could affect its path?
The photon doesn't have an inertial frame of reference precisely because it's moving at the speed of light, so it doesn't have a perspective. It's a (quantized) wave in the electromagnetic field. The closer you get to the speed of light, the closer the proper time of the journey goes to zero, but actually taking the limit does not make sense physically.
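For concreteness, the standard time-dilation relation behind that limit (nothing specific to this thread, just special relativity in LaTeX form):

    \Delta\tau = \Delta t \,\sqrt{1 - v^2/c^2} \quad\longrightarrow\quad 0 \text{ as } v \to c

The expression is only defined for v < c; at exactly v = c there is no rest frame to transform into, which is the formal sense in which the photon has no "perspective".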
I wonder if the photon lacks an inertial frame of reference only from our point of view.
Like, from our point of view, we assume that from the photon's point of view it has no perspective.
But maybe we are limited by our spacetime, where the photon goes through our universe in an instant, and continues to pierce infinite universes over its own non-zero frame of reference.
This looks impressive. Could someone familiar with Postgres internals explain the hidden trade-offs of this approach?
I understand the obvious limitations of it being embedded/single-host, but I'm curious about the engine itself. Does running in this environment compromise standard features like ACID compliance, parallel query execution, or the ecosystem of tools/extensions we usually rely on?
This is what I'm most interested in. I have an application with a smaller, trimmed-down client version that shares a lot of code with the larger full version of itself. Part of that code is query logic, and it's very dependent on multiple connections; even the simplest transactions will deadlock without them. Right now, if one wants to use the Postgres option, Postgres needs to be manually installed and connected, which is a mess. It would be a dream to have a way to easily ship Postgres in a small-to-medium-sized app in an enterprise-Windows-sysadmin-friendly way and be able to use the same Postgres queries.
Yep. I have some suspicions about how the information travels lately (it goes both ways, kinda, depending on the 'type' of news), but it would absolutely be of general interest.
I don't know why you would say juniors handle this better. From what I've seen, juniors who ship fast often do so by ignoring domain complexity and testing against a single "happy path" prompt.
Seniors take longer because they are actually trying to understand the domain and cover edge cases. Speed isn't a metric of success if the feature is half-baked and breaks the moment a user deviates from the test case.
Juniors are juniors because they haven't yet struggled with mistakes of their own creation. In a few years we should see some pretty strong senior engineers emerging.
Even Gemini Flash did really well for me[0] using two prompts - the initial query and one to fix the only error I could identify.
> Please generate an analog clock widget, synchronized to actual system time, with hands that update in real time and a second hand that ticks at least once per second. Make sure all the hour markings are visible and put some effort into making a modern, stylish clock face.
Followed by:
> Currently the hands are working perfectly but they're translated incorrectly, making them uncentered. Can you ensure that each one is translated to the correct position on the clock face?
I don't think this is a serious test. It's just an art piece to contrast different LLMs taking on the same task, and against themselves since it updates every minute. One minute one of the results was really good for me and the next minute it was very, very bad.
Aren't they attempting to also display current time though? Your share is a clock starting at midnight/noon. Kimi K2 seems to be the best on each refresh.
what makes this state of the art?