(2) Almost all of the "smart" TVs in my area aren't used for any of their smart features. They are all used as dumb monitors: they take their signal from a proprietary cable box, a game console, or some other media player.
(3) I have yet to see a TV that requires an internet connection to function. If you have one of these things, blacklist it on your network at the firewall, or just don't give it your wifi password.
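If you're not sure which device on your LAN is actually the TV before you block it at the router, here's a minimal stdlib-only Python sketch of a port probe; the IP is a placeholder and the port list is a guess (some smart TVs expose local web/control services), so adjust both for your own network:

    import socket

    TV_IP = "192.168.1.50"         # placeholder: whatever your router's DHCP table shows
    PORTS = [80, 443, 8001, 8080]  # guesses: ports some smart TVs leave open for local services

    for port in PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(1.0)
            state = "open" if s.connect_ex((TV_IP, port)) == 0 else "closed/filtered"
            print(f"{TV_IP}:{port} -> {state}")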
A lot of people watch physical TVs, even if they don't watch "TV". This is about tracking what people are watching on the physical device, no matter the source.
If I watch a Blu-ray or Netflix, my smart TV will still send a fingerprint of that content to their servers for analytics.
As far as not connecting your TV to the Internet: (1) The default setup says you need to, so most people will, since they want to watch Netflix. (2) There's so much free wifi around these days that I wouldn't be surprised if TVs try to connect to any open wifi for "automatic updates".
Have you actually tried any? I have a 4 year old Samsung TV and its Netflix app is great. I don't want the hassle of turning on my PS4 or setting up a Pi. I don't really see why people hate on Smart TVs; I've used both Samsung and LG and they're great for netflix/plex/youtube (other than the obvious privacy issues here, but I can't imagine most external devices are better).
Samba can use fingerprinting to detect what you are watching even if you are watching via a Roku (assuming you have somehow opted in to Samba on the TV itself). It can even detect games, etc.
The statistic you're citing lumps together people watching directly on a TV vs using it as a display for a device like a Roku or X-Box, so it doesn't actually support the point you're trying to make :)
I don't think you are understanding how they are identifying content. It doesn't need to be playing via the built-in 'app'. Basically they are fingerprinting the video feed that passes through the TV's video processors regardless of the source (internal, external, whatever). This is a video analog of what apps like Shazam do to identify songs. If you are playing content from any source on one of these devices, they can detect what the content is and report back on it. About the only way you could avoid this is to never connect the TV to your network, but even that may not be sufficient.
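To make the Shazam analogy concrete, here's a toy Python sketch of the general idea (my own illustration, not Samba's actual algorithm; the frame format and the reference database are invented): downscale each decoded frame, threshold it against its mean brightness to get a 64-bit perceptual hash, and match that against a server-side database by Hamming distance.

    def average_hash(frame, size=8):
        # frame: 2D list of grayscale pixel values, any resolution
        h, w = len(frame), len(frame[0])
        small = [[frame[r * h // size][c * w // size] for c in range(size)]
                 for r in range(size)]            # crude downscale to size x size
        mean = sum(sum(row) for row in small) / (size * size)
        bits = 0
        for row in small:
            for px in row:
                bits = (bits << 1) | (1 if px > mean else 0)
        return bits                               # 64-bit fingerprint

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # hypothetical reference database: fingerprint -> title
    REFERENCE = {
        0xF0F0F0F0F0F0F0F0: "some Netflix show S01E01",
        0x00FF00FF00FF00FF: "some Blu-ray movie",
    }

    def identify(frame, max_distance=10):
        fp = average_hash(frame)
        best = min(REFERENCE, key=lambda ref: hamming(fp, ref))
        return REFERENCE[best] if hamming(fp, best) <= max_distance else None

Since this runs on the raw decoded frames, the TV doesn't care whether they came from its own app, an HDMI input, or a game console.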
I think the point the GP is making is that TVs used merely as displays and never connected to the internet have no way to send this data anywhere, whereas connected TVs can send whatever fingerprinted data they want back home.
Sure does because there is no difference. It's still watching tv regardless of where the source comes from. People are literally watching or staring at their tvs. That's what watching tv means. It has nothing to do with the source of the content and never did.
I have an LG Smart TV, and I use the built-in apps because I'm not aware of any devices that provide 12-bit 4k output without some sort of signal processing that I don't want.
Admittedly, it's been a while (and a couple of Apple TV 4k software revisions) since I've checked, but as you get into newer display technologies it's harder to find devices that can provide you the same experience you'd get from the built-in app.