The sentiment here on HN is to never connect the TV to the Internet and instead use a digital media player or external streaming device. I wholeheartedly agree. But then several people mentioned wanting to update the firmware. If you'll never connect it to the Internet, then you should never update the firmware either. It goes against conventional wisdom, but updating the firmware in this situation is more likely to cause problems than to bring you desired improvements.
The firmware update might have stricter DRM controls; it could have a tricky new way to exfiltrate data out via your streaming box; maybe the firmware update itself contains some static ads; perhaps in the near future we'll have public wifi or free Google community wifi, and the new firmware will have the smarts to use that and bypass your wifi.
And these days, once you update the firmware, you often cannot revert back to earlier firmware.
Firmware updates may resolve HDMI compatibility issues, though, and in my mind it's always worth the risk even if there's only a 10% chance of improvement, because HDMI reliability is terrible.
When I mounted my TV, I embedded a single HDMI cable, a single cat5e cable, and a single optical cable in the wall. Conduit wasn't an option due to the age and construction of the wall, so changing it would require power tools and drywall mud.
I have a sound bar rather than a receiver because it makes my wife happy. The sound bar works best with ARC. Optical works too, but power and volume aren't synchronized.
I have an NVIDIA shield mounted behind the TV because the TV's software stack got too lethargic, and the TV's built-in decoder has silicon bugs that break video in Netflix, and break surround sound in everything but Netflix.
Surprisingly, everything works reliably about 95% of the time. It's unfortunate that I consider that a win. I just added a 4-1 HDMI splitter with ARC passthrough in order to get a Blu-ray player back in the mix, and it was boring!
I wish I was in your shoes! I also had no issues with HDMI until recent years, but HDMI-eARC is so buggy on Samsung TVs, especially when used with surround sound or a sound bar. I had no problems before I moved to a surround system/beyond ARC, but with eARC I need to reboot the TV at the wall once every two weeks or it will "forget" it's plugged into a sound system, which is very annoying. It was far worse when I first got the TV, and updates have improved it somewhat, but it's still irritating. I don't hold out hope for Samsung updating the TV for much longer; they abandoned my last TV from them after the first year. :/
I have a 5-year-old Samsung 4k TV with a Samsung sound bar. ARC does not work reliably. It did at first, but it just stops working. I have to power cycle the TV, change HDMI cables, etc. All the normal I-am-an-engineer stuff: made a checklist, swapped parts in a controlled manner. I even tried another soundbar that works fine on another set. Googling tells me others have this issue and Samsung does not care because the TV is too old (it is likely a software issue). The TV is going to be replaced with a much more expensive Sony, as they still seem to take pride in their product quality. Oh, as to the ads: never hook the TV up to the internet. I use an AppleTV and it just works.
I want ARC/CEC to work because I want a single remote. You lose that when using optical.
If optical works reliably, you could possibly leave both HDMI and optical plugged in to the sound bar. CEC volume control and power seems to still work with my sound bar when set to the optical input.
My setup currently has an audio drop out every few minutes. It seems to happen over both ARC and optical, though, so it's presumably just the Nvidia Shield being buggy. The UI bugs out too lately. I've had various forms of audio drop outs on that TV since at least 2020. It could be the TV corrupting the passthrough I guess. I'm not going to buy a whole new TV just to test that hypothesis.
From what source though? Antenna broadcast? Otherwise it’s better to route through the set-top box if at all possible, and that can output directly to the sound bar.
OP said they have both an Nvidia Shield and a Blu-Ray player so there is no single set-top box. Running from the TV means you catch all audio output no matter the source.
Yeah, the only device with a "passthrough" is the TV itself, via ARC. If I had a 4k capable receiver instead of the sound bar, then it could act as the hub and output everything to the TV.
Previously I did have a receiver, and no Nvidia Shield device. The TV has Android TV built in to it, so I still relied on ARC most of the time. The TV's built in functionality got too buggy (borked video, audio dropouts during loud scenes) and slow to use day to day, so I added the Shield. My receiver didn't support 4k, though, so I plugged straight into the TV and used ARC still. I since replaced the receiver and all the mid-size speakers with a 5.1 wireless soundbar to free up shelf space in that room. Since I know you're curious, the receiver got paired with a 1080p plasma TV in a guest house where it can retire with dignity, and the speakers got moved to a bedroom, connected to a pre-HDMI receiver and a cheap Bluetooth dongle.
Me too. I just put a digital TOSLink cable between my TV and the soundbar. Works a treat, and it's funky knowing that the audio was produced and recorded digitally, sent across the internet, decoded by my computer, pumped to a TV, re-encoded into laser light, turned back into a digital electrical signal, converted into an analog electrical signal, and finally played through speakers. Pretty amazing.
Sure, the "DVI" subset of HDMI generally works everywhere, although I've had signal integrity issues from cheap cables in the past.
It's all the other stuff that's supposed to seamlessly integrate your components. Audio Return Channel so sound makes it to the speakers regardless of where a source is plugged in, Consumer Electronics Control so you can use one Bluetooth remote for everything, automatic power synchronization, etc.
Even 4k is a little weird, with different HDMI versions supporting different frame rates. I have a Yamaha receiver from 2014 or so that doesn't support 4k, but claims to "support" it; all you have to do is turn off the receiver and it will pass the signal through!
This thread has motivated me to try and tackle my current audio drop out problem. Every 5-10 minutes, the audio drops out for a second. I'm currently updating the firmware on my soundbar as a shot in the dark...
Well, I tried a new HDMI topology and that didn't improve things. I tried intentionally mis-configuring the TV to break ARC but still have CEC power and volume control of my sound bar, but the sound bar still defaults to ARC on power up rather than SPDIF.
I've given up on ARC completely in my setup. The Nvidia Shield remote can control sound bar volume via IR. I just successfully watched a full movie without any drop outs. Also, I may be imagining it, but the Shield feels more responsive. It seems like it must be the TV's fault. Don't buy a Sony I guess.
I learned this with my router. I updated the firmware thinking it was the smart thing to do, security fixes and all that, but instead I was greeted with ads for their other products and them wanting to shove their stupid phone app in my face to interact with the router.
I then installed OpenWRT and said nuts to using proprietary router firmware.
> perhaps in the near future we'll have public wifi or free Google community wifi, and the new firmware will have the smarts to use that and bypass your wifi.
Xfinity WiFi has been around for at least 8 years. Amazon Sidewalk has been around for over a year. Wanna bet those are, or will be, used by smart TV makers to connect their TVs to the internet via wifi?
What makes me think this will never happen is that
- Open Wi-Fi networks are a thing of the past. There haven't been any around me in a residential area for a long time now. Businesses and workplace lobbies are more likely, though.
- No one is going to just give Samsung free Internet except the hapless consumer by supplying Wi-Fi credentials.
- Samsung might make a deal with providers, but it would have to have unique credentials embedded in its OS and firmware, and I doubt Samsung has the ability to keep that totally secure.
Think about it. If you could get free, anonymous Internet with credentials in a Samsung TV, crackers would be all over that - they'd be searching every nook and cranny for exploits, desoldering NAND and sniffing buses for encryption keys, connecting with Chinese friends to get original datasheets, etc.
Even if Samsung embedded an LTE/5G SIM, eSIM, whatever, it would be hacked to bits. "Get model X of samsung TV, get free Internet with this Linux application". It's not realistic for there to be a network connection that you don't know about, pay for, and have your name attached to.
Of course the p2p network interface that shows up on the Netflix diagnostic screen is concerning, though.
Now if cellular providers such as AT&T, Verizon, etc. start selling TVs with Internet bundled in, then it could happen.
> No one is going to just give Samsung free Internet except the hapless consumer by supplying Wi-Fi credentials.
I think it's implied that Samsung would pay Amazon for Sidewalk access.
> Samsung might make a deal with providers, but it would have to have unique credentials embedded in its OS and firmware, and I doubt Samsung has the ability to keep that totally secure.
I don't think this is as hard a problem as you're making it sound. Each TV ships with a serial number, let's suppose; it tries to handshake with the Sidewalk network. Sidewalk phones home to Amazon, Amazon talks to Samsung, Samsung says "yes, we sold that S/N recently and it has never connected before, here's its public key".
So if I can spoof communication with that serial number on another device, I get free Internet. Same concept as MAC filtering not being really secure because I can just change MAC addresses in my packets.
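For what it's worth, the MAC-spoofing half of that analogy is trivial on Linux. A hedged sketch (the interface name `wlan0` and the address are made up; requires root):

```shell
# Sketch: change a Linux network interface's MAC address with iproute2.
# Interface name wlan0 is an assumption about the hardware.
ip link set dev wlan0 down
ip link set dev wlan0 address 02:de:ad:be:ef:01  # locally administered MAC
ip link set dev wlan0 up
ip link show dev wlan0   # the 'link/ether' line should show the new address
```

Spoofing the Sidewalk handshake itself would presumably be harder, since it would involve a key rather than just an identifier.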
Find a remote vulnerability, or find the device on the circuit board where it's stored, connect a reader to it, and dump it. Not trivial, but not impossible. The TV software just has to have one mistake, and TV companies aren't security experts.
All of the popular embedded platforms have had scores of vulnerabilities - Qualcomm, Android, WebOS, etc. - patched over time, with new ones found, etc.
Heck, it even took Microsoft more than one try to start getting it right. An interesting story is Microsoft attempting to protect its first game platform, the original Xbox from the early 2000s. There were numerous security protections and all were bypassed - from encrypted boot code to a device-unique hard drive key stored in EEPROM.
Microsoft got better and smarter with the 360 - this time with unique keys and eFuses in the CPU - but it was still eventually bypassed, albeit within the effective lifetime of the platform.
Honestly I do not think Samsung would be concerned about the single-digit number of people who manage to get free internet this way.
If you really wanted it to be secure, you could use a TPM instead of a private key in memory, but that's overkill IMO. Who wants to take their TV apart in exchange for free crappy internet?
Every TV ships with a non-unique secret key and an agreement with some major internet service provider which specifies that accounts using that key will gain network access but only to a specific list of IP addresses that host or proxy firmware updates and advertising content.
[edit - to be clear, I'm not saying that this is what Samsung is doing, I'm just describing a plausible way how this might get done]
Everyone has said stuff like this but there has been zero proof of TVs connecting to wifi like this on their own. If you have that proof please share it.
>> Wanna bet [Broad Company Wifi Networks] are or will be used by your smart TV
> What makes me think this will never happen...
The comment suggests that the behavior of auto-connecting to wifi is infeasible for technical reasons. My comment and the one below it show that this is technically feasible.
I've been to two medical facilities and a large regional hospital in the last week where there were open wifi networks with no portals. My apartment building operates an open wifi network for guests so we don't have to bother giving out passwords to visitors. An airport I visited last month has wide-open wifi. I see ads on transit buses all the time stating that the bus has wifi. I suspect that is wide open because the transit agency didn't want to deal with tech support.
It's pretty common for public networks to still have a captive portal to get the user to view an ad or click "I agree" before actually granting full connectivity.
Fully open wifi that didn't require posting to an http or https endpoint was never common in the first place.
Consumer routers are now shipped pre-configured with a password on the network so random joe who bought his router at best buy or got it from his ISP doesn't accidentally provide free wifi to his 20 closest neighbors.
Meanwhile, out of the box, every single Xfinity-provided modem/router combo provides by default an open network with no password that allows any other Xfinity subscriber to access the internet via your device. They have 18 million such hotspots throughout the US. Given the expected usage of a few MB per year, this would seem to be an easy ask, and easily sold to the end user as a feature rather than a cost.
Likewise, nearly every major business that serves customers food, refreshments, or products to buy on site provides wifi that requires only a simple tap-through response to open it up. This can be, and in fact already is, automated on your phone, for example.
Instead of talking about open wifi, I would redirect the discussion to negotiable connections - and they are everywhere.
And by "residential areas," I assume you mean "the very specific residential area where I live in my neighborhood, in my city, in my county, in my state, in my nation" since there is simply no way for you to have made a detailed assessment of the availability of open wifi for the entirety of the rest of the planet, or even for the small subset of its people who are on HN.
But thanks for informing me, and the 300 other people who reside in my building that we don't live in a residential area.
I'm sure there's something in the water that's driving my neighbours towards protected-by-default wifi, and not the defaults their ISP-provided routers ship with.
Yes! There's less focus on crypto part of it, and more emphasis on connecting people and communities as a WISP, but underneath it all, Althea is doing just that.
> Even if Samsung embedded an LTE/5G SIM, eSIM, whatever, it would be hacked to bits. "Get model X of samsung TV, get free Internet with this Linux application". It's not realistic for there to be a network connection that you don't know about, pay for, and have your name attached to.
Kindles and cars have had those for years and people haven't torn those apart to come up with free internet.
Your connection could be limited to receiving ads and firmware updates, be incredibly slow, and be tied to a key stored in hardware that is both near impossible to retrieve and nearly worthless if retrieved. This connection would only be used if a primary connection was unavailable.
You could basically use 10MB per 10 customers per year, and the only question is whether ensuring everyone gets ads earns you more than the peanuts paid to people like Comcast - or, at worst, more than the cost of a chip with a cellular modem versus wifi alone.
The problem is that we're at a point in the evolution of TVs where robust support for HDMI-CEC (especially in combination with ARC/eARC) isn't guaranteed; but might be something you can get later through a firmware update.
When you finally decide to shell out for a sound bar, and after hooking it up, your TV now suddenly takes 30 seconds to flicker to life from sleep, you'll really want that firmware update.
(It's similar to where we were ~5 years ago with motherboards and NVMe boot support. The motherboard had the M.2 socket; but whether the NVMe device showed up in the boot options was up to chance. Often it'd only work in legacy mode but not UEFI mode. But, after a few-months-later motherboard firmware update, things would begin to work the way you'd expect.)
This is good advice. I kept my 4k Vizio offline until updating the firmware on a whim. The update broke the ability to run 1080p@60hz from my PC, making it effectively useless for normal use. The annoying part about the panel is that it was advertised as 4k but only works at 30hz at that resolution. Cheap, so should have figured.
Not a TV, but my monitor is 3440x1440. However, it only supports 30Hz at that resolution over HDMI since it has an older HDMI version. It only supports 3440x1440 at 60Hz over DisplayPort.
So, if your TV has any other inputs, might want to try those too.
Do you have links to blogs that show this happening? I see plenty of hearsay, but I know my LG doesn't attempt to connect to my honeypot wifi. I'd love to see proof.
Samsung (Dacor) discloses this behavior in their fridge manual; it says it will mesh with Samsung TVs to better target ads.
I’m not about to buy a Samsung set to find out what it actually does in practice. The fridge has deep learning object classifiers and internal cameras; I assume that is a big part of its ad targeting capability.
Note that the fridge has demand response / energy use time shifting features that don’t work unless it is connected to the internet.
You know all those stories about weird little shops run by suspicious characters but filled with incredible items that end up being cursed. That's every single shop right now. All of our technology is cursed. They will bring you great things, as promised, but always at some hidden price because those items exist to serve a dark master.
Maybe the targeting works the other way around. Those refrigerators have cameras. Maybe a little bit of machine learning figures out what you buy, and it plays targeted ads based on what's in your fridge - or what used to be in your fridge but isn't currently.
>demand response / energy use time shifting features that don’t work unless it is connected to the internet.
I don't understand. What extra info does it need from the internet to keep the food at the right temperature, other than just the current temperature inside the fridge?
Maybe not straight up arrested, but I think I recall a case in the past where legal action was taken over someone using a restaurant's free WiFi everyday while not being a customer (he used it in his car in the parking lot). Like the network is open but technically you need permission to connect to it (a sign at the airport saying that free WiFi is available at this SSID, or the restaurant stating that WiFi is available to all paying customers counts as permission). Obviously very few of these cases will be prosecuted, but I would be very concerned if my smart devices were connecting to open networks automatically without my consent.
So far, my experience with Samsung products is that they're really badly put together. In the past, I've had multiple Samsung Note smartphones (changed every 2-3 years because they would stop functioning), two Samsung smart TVs (the latest one still in use is one whose wifi module stopped functioning after a firmware update), and a Samsung 27-inch monitor (unstable connectivity to the computer).
All the Samsung products were poor experiences one way or another.
I've replaced my last Samsung Note with my first iPhone (which has lasted 6-7 years since and is still going!), and replaced the monitor with a Dell one (none of my current or previous Dell & Philips monitors ever gave issues!).
Hearing that some Samsung TVs now come with ads feels like a new low for the user experience. I'm personally avoiding Samsung products until things change.
I stay on top of device firmware updates in order to stay current on security patches, lest bad folks come crawling through the connection. I assume that's the "conventional wisdom" spoken of here. But if the TV never goes anywhere near a network...
If it's not a networked device, then the FW update better have enhancements that benefit me, the user, and not Samsung. Otherwise, GTFO with your crab bucket of new bugs.
Aah, the ever-present, nebulous and unlisted "bug fixes and security enhancements".
...oh, and more ads (not mentioned in the changelog).
...and feature removals (not mentioned in the changelog).
...oh, and you can't downgrade (not mentioned in the changelog).
Firmware updates are fine, but it should definitely be illegal to not allow a device to be returned by a consumer to the precise functionality it had at the time of purchase.
I can't believe people have been brainwashed to accept this as normal.
I'm stuck in a similar boat. An expensive TV now tries to steal my attention when I turn it on because my 12yo thought they were being helpful by connecting it to the internet. I was just thinking about trying a factory reset. 30s of "look at our apps" followed by 30s of "update your software." I love the picture quality and HATE the "smart" features.
Obviously most users won't be able to do this, but this is exactly the type of reason I use OpenWRT on my RPi-based router. I have a firewall rule that prevents the TV from communicating with anything in the WAN zone. I can still have the TV on the Wi-Fi and still use features like Airplay.
I disable the firewall rule if I want to try updating the TV firmware.
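In case anyone wants to replicate it: on OpenWRT, a rule like that takes a handful of UCI commands. This is just a sketch - the TV's static IP (192.168.1.50 here) and the default lan/wan zone names are assumptions about my setup:

```shell
# Sketch: OpenWRT firewall rule rejecting traffic from one LAN host
# (the TV, assumed to have static IP 192.168.1.50) to the WAN zone.
uci add firewall rule
uci set firewall.@rule[-1].name='Block-TV-WAN'
uci set firewall.@rule[-1].src='lan'
uci set firewall.@rule[-1].src_ip='192.168.1.50'
uci set firewall.@rule[-1].dest='wan'
uci set firewall.@rule[-1].proto='all'
uci set firewall.@rule[-1].target='REJECT'
uci commit firewall
/etc/init.d/firewall restart
```

Traffic to other LAN hosts (Airplay, casting, etc.) is unaffected because the rule only matches the wan destination zone.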
That breaks the built in chromecast, right (since the TV is the one fetching the stream), but I suppose you could plug in a regular chromecast dongle if that matters to you.
Google may well be violating their privacy policy which specifically claims they don't do that, but I don't think it's likely; they run a pretty tight ship RE legal messaging, and _someone would have noticed_ since all you have to do is watch a video on a non-google property and watch the network traffic.
> my cable provider flat out refused to give me the PPP / etc details or I have to go find an RPi with adsl io.
In the USA and perhaps other countries the cable provider is required to let you use your own DOCSIS modem. The list of modems that work with their system has to be on their web site.
They don't want to do this because they are all shitheads and also because they want to charge you "rent". But TBF I imagine it's also because it's easier to debug customer support calls when they manage the whole connection (e.g. GF's kids complain that "WiFi is down" when they mean the cable connection is down).
Sadly, enforcement of this directive is up to each country's own regulator, and as the map in the above link shows, only a few countries have actually enshrined this in their national laws.
You can chain routers. Disable the ISP router's wifi, or just ignore it. Then place your router of choice on the LAN side of the ISP's router, as its Ethernet client. Make up a new subnet of your choice on your router's LAN side and use its wifi and DHCP.
See if your cable modem has a pass-through mode. This disables the NAT, DHCP, etc and passes the IP address of the cable side of the modem through to one of the ethernet jacks. Then you plug your own router in to that and do your own NAT, DHCP, firewall, etc. I know Arris used to have this, haven't checked recently.
My current ISP has the cable modem completely locked down. No control over DHCP, no punching service ports through the NAT for home servers. Also they've got me double NATed.
Add a second router and put it in your ISP router’s DMZ. You shouldn’t have trouble with double NAT since all traffic will get forwarded on to your “real” router.
Hmm, I like just buying 4k displays with no smart features at all. My 65" Sceptre works great... it's just a display (it does need a sound bar, as the built-in speakers are poor).
Sure, but that’s not a very high-end panel. When Samsung and LG make almost all the best panels, it’s tricky to avoid if you aren’t willing to sacrifice picture quality.
I also considered maybe trying to import a TV from the EU - do they have stronger privacy laws that would prevent this? Panasonic makes some VERY nice OLED TVs outside of this country.
I had a Panasonic TV about ten years ago that refused to play any content off my home server unless it could talk to its own servers over the internet first.
And that sentiment is soon going to be obsolete, once 5G is everywhere and your smart TV can connect to the internet without asking for your wifi password.
I have an untrusted segment of my network I connect my tv to. If I streamed from something else, I would have to put that device also in an untrusted segment and dedicate it to that task but that means more time spent admining a device that no one is paying me to admin.
This has been discussed previously and one of the major issues is that many devices require internet connectivity to complete initial setup before they can be used.
A Pi-hole fixed the issues I had with my Samsung smart TV. No more ads on the home screen, and there's a way to disable analytics deep in the TV's settings menu.
I'm not so sure. I've been using a projector as a TV for something like four years straight: no TV at home, only a projector. As a bonus the living room becomes a home cinema. I haven't seen many downsides. Zero issues. No ads. Cheaper than a TV for a much wider diagonal, and a more "cinema-ish" picture too (I hate it when movies look like cheap sitcoms on modern TVs).
And as many of these are made to show slides and whatnots in corporate settings, ads are a big no-no.
I think more people should seriously consider that option.
EDIT: well, I do remember one issue... I decided to fix it, and it took longer than the time it'd have taken to hook up a TV. But I did it exactly once, used some fish tape to pull HDMI cables in the system ceiling, and the family was good to go for years.
> I hate it when movies look like cheap sitcoms on modern TVs
This really is the trope about LCD TVs that will not die. I blame the manufacturers.
Yes, it looks terrible. It’s called motion smoothing (or something like it) and it’s often switched on - for reasons I cannot possibly fathom - when TVs are in demo mode and/or in the showroom.
And it is absolutely trivial to disable. Most any modern TV is either fully capable of playing 24p / 30p and 60p at native frame rate, or playing 60p at native, 30p at half-rate and 24p at 3:2 pulldown, in each case without any interpolated frames muddying the native presentation. It looks perfectly great (while 3:2 pulldown is perceived as juddery by some, that’s an issue any projector incapable of either 120hz or native 24p would also share).
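The 3:2 pulldown arithmetic is easy to check: at 60 fields per second, alternating 3-field and 2-field repeats cover exactly 24 film frames each second:

```shell
# 3:2 pulldown: of 24 film frames per second, 12 are shown for 3 fields
# and 12 for 2 fields, mapping 24fps film onto 60Hz video exactly.
echo $((12 * 3 + 12 * 2))   # fields per second
```

That uneven 3/2 repeat is exactly the cadence some viewers perceive as judder on slow pans.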
There are loads of legitimate reasons to prefer projectors over flat panels (and vice versa); motion smoothing is not one of them.
I've been using a projector in my bedroom because I do occasionally want to watch something while lying in bed, but I don't generally like a TV in the room.
Potential problems, all fixable:
Can't see it during the day or when using the lights.
Had to run power and data to the ceiling.
Lower resolution for the price.
Can't use the wall for anything else or need to install a screen; paint choice a potential issue (bad color, too glossy). (Also need a wall without windows that is large enough.)
That said, the pros vastly outweigh the cons for me and projectors have gotten cheaper and better in the near decade since I installed mine.
You sound like somebody going on about how vinyl is better than CDs. Sure, projectors give you that "authentic" cinema look, in the sense of bad color reproduction, especially being unable to show dark blacks like OLEDs can.
The "cheap sitcom" look is objectively better. With the right filters you can reproduce the cinema look, but nobody who isn't already used to it would want to do that, unless you are going for a specific "old school" style.
The "cheap sitcom" look is objectively worse, but it can be disabled on every TV I've encountered so far. If I bought a TV that didn't allow me to disable it I'd return it as defective.
>I hate it when movies look like cheap sitcoms on modern TVs
That has nothing to do with "TVs" in general and more to do with the software smoothing settings that are on by default. Properly configured (takes literally minutes or less, one time) using a website like rtings.com for the optimal settings, a TV will look better than a similar-quality projector every time, and in a greater range of lighting situations. There's a reason projectors aren't popular, and it's not because people forget about them. Movies on my OLED TV look incredible, better than a cinema frankly.
Since we're talking Samsung, there's a picture mode called FILMMAKER (weirdly all caps I know) that promises to not do any 'smart' fuckery. And I guess there's an alliance of other manufacturers too who offer the same mode with the same name.
A big con for me is that it's useless in a bright room, and I prefer a bright room. A darker room is fine for watching a movie sometimes, but I'd rather just buy a TV that covers all use cases.
Not 100% silent, but lots of new projectors are based on LED/laser lamps, which produce a lot less waste heat.
Plus there's now ultra-short-throw projectors that sit in the front of the room, rather than above / behind you.
Ultra-short-throw also has the advantage that you can use from-below lenticular screens. This means that ambient room light affects them a lot less, since they reflect light coming mostly from only where the projector is.
Sadly, "smart TV" has infected projectors as well. But maybe not as bad as normal TVs. Most projectors still have dumb inputs that work fine.
I got one of these from work, it's not a great solution. First, they're more expensive than regular TVs. Bigger issue though is that there's no remote so changing the volume can be a pain in the ass.