Hacker News

I never worked for Samsung, but I built TVs for JVC and LG, among many other brands. I don't work in consumer electronics anymore but a decade ago that was my field.

TVs are a wildly unprofitable business. It's astoundingly bad. You get 4-6 months to make any profit on a new model before it gets discounted so heavily by retailers that you're taking a bath on each one sold. So every dollar in the BOM (bill of materials) has to be carefully considered, and until not long ago the CPUs in practically every TV were single or dual core, and still under 1GHz. Bottom-of-the-bin ARM cores you'd think twice about fitting to a cheap tablet.

They sit within a custom app framework which was written before HTML5 was a standard. Or, do you want to write in an old version of .NET? Or Adobe Stagecraft, another name for Adobe Flash on TV?

Apps get dropped on TVs because the app developers don't want to support ancient frameworks. It's like asking them to still support IE10. You either hold back the evolution of the app, or you declare some generation of TV now obsolete. Some developers will freeze their app, put it in maintenance mode only and concentrate on the new one, but even then that maintenance requires some effort. And the backend developers want to shut down the API endpoints that are getting 0.1% of the traffic but costing them time and money to keep. Yes, those older TVs are literally 0.1% or less of usage even on a supported app.

After a decade in consumer electronics, working with some of the biggest brands in the world (my work was awarded an Emmy) I can confidently say that I never saw anyone doing what could be described as 'planned obsolescence'. The single biggest driver for a TV or other similar device being shit is cost, because >95% of customers want a cheap deal. Samsung, LG and Sony are competing with cheap white-label brands where the customer doesn't care what they're buying. So the good brands have to keep their prices somewhere close to the cheap products in order to give the customers something to pick from. If a device contains cheap components, it's because someone said "If we shave $1 off here, it'll take $3 off the shelf price." I once encountered a situation where a retailer, who was buying cheap set-top boxes from China to stick a now-defunct brand name on, argued to halve the size of an EEPROM. It saved them less than 5c on each box made.

As for long-term support of the OS and frameworks: aside from the fact that the CPU and RAM are poor, Samsung, LG and Sony don't make much money from the apps. It barely pays to run the app store itself, let alone maintain upgrades to the OS for an ever-increasing, aging range of products.

And we as consumers have to take responsibility for the fact that we want to buy cheap, disposable electronics. We'll always look for the deal and buy it on sale. Given the choice of high quality and cheap, most people choose cheap. So manufacturers hear that message and deliver.



Yeah, but is there a way for consumers to compare the compute performance of any given TV?

If OEMs differentiated their TVs based on compute performance, consumers might be able to make an informed choice. (See smartphones: consumers expect a Galaxy Sxx to have faster compute than a Galaxy Axx.)

If not, consumers just see TVs with similar specs at different prices, so of course they’re going to pick the cheaper one.


It's really hard to get these things across to consumers.

This is why we ended up with phrases like "Full HD".

The average consumer doesn't know what these numbers mean; people who read Hacker News aren't the 99%. Phones have helped a little with spreading the idea that newer = better, but ask the average person how many cores their phone has, or how much RAM? They don't know.

Also, it's hard to benchmark TV performance as a selling point. Perhaps sites like rtings need to have UX benchmarks as well? They could measure channel change times, app load times, etc. That might create some pressure to compete.
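A minimal sketch of what such a UX benchmark could look like: time a user-visible action repeatedly and report median and tail latency. The `action` callable here is a stand-in; actually wiring it to a real TV (e.g. triggering a channel change over some network remote-control API) is an assumption left to the tester.

```python
import statistics
import time

def benchmark(action, trials=20):
    """Time a zero-arg UX action repeatedly; return latency summary in ms.

    `action` should perform and wait for the operation being measured,
    e.g. a channel change or app launch (hypothetical hookup)."""
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        action()
        samples.append((time.perf_counter() - t0) * 1000.0)
    samples.sort()
    return {
        "median_ms": statistics.median(samples),
        "p95_ms": samples[min(len(samples) - 1, int(0.95 * len(samples)))],
    }

# Stand-in action so the sketch runs; replace with a real trigger + wait.
result = benchmark(lambda: time.sleep(0.01), trials=5)
```

Publishing median and p95 rather than a single number matters here, because the occasional multi-second stutter is exactly what annoys users most.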


>I can confidently say that I never saw anyone doing what could be described as 'planned obsolescence'. The single biggest driver for a TV or other similar device being shit is cost, because >95% of customers want a cheap deal.

You are literally the first person I have ever seen say this online, besides myself. I have worked in hardware for years and can vouch that there is no such thing as planned obsolescence, but obsession over cost is paramount. People think LED bulbs fail because they are engineered that way, but really it's because they just buy whatever is cheapest. You cannot even really support a decent mid-grade market because it just gets eviscerated by low cost competitors.


I was in a meeting with a senior guy from one of the top Asian brands and I said "We're getting out of TVs, we've lost $x millions and that's enough."

He said "Hah, we can lose way more than that!"


Thanks for sharing. Without insight beyond being a consumer, I do think there's room for disruption (ideally from within the industry itself) compared with 10 years ago.

Comparing models from 2005/2015/2025, for example. Most people literally can't tell 4K from 1080p, and anything new in the HD race mostly feels like a scam. The software capabilities are all there. I think that to differentiate from the no-name stuff, longevity is going to become a more significant differentiator.


We tried to disrupt the market, back about 10 years ago.

One of the significant problems is that 80% of TV SoCs are made by one company, MStar (or their subsidiary). And there's only a handful of companies who make the motherboards with those chipsets. Anyone entering the market either buys those or isn't competitive, and it's hard to be competitive because everything is so concentrated and consolidated. Since STMicroelectronics and Broadcom left the TV chip market, it has become a much less diverse market.

We were an established company that made software for STBs; we had done a ground-up build of what was probably the most capable and powerful framework for TVs/DVRs. The new design was commissioned from us by a well-known open source Linux distro, who then decided they didn't want to continue with the project after they realised that getting into TV OSes was hard. We then took on ownership of that project, but getting investment or even commitments from buyers was impossible.

The retailers and TV brands wanted to rehash the same thing over and over because that was tried and tested. It didn't matter that we made something that was provably better and used modern approaches, it wasn't worth the effort for them. If you can't order about 500,000 TVs then you're not going to get anyone to make anything custom for you these days and you'll not make a profit.

--

It was a DVR/TV framework that was designed by people who had worked for big names in the TV business with a clean slate. It would handle up to 16 different broadcast networks (e.g. satellite, terrestrial, cable) and up to 255 tuners, even hot pluggable. Fast EPG processing and smart recording to either internal storage or USB storage. It was user friendly and allowed for HTML5 apps. We pushed it as much as we could but eventually on the brink of financial ruin the company was sold to someone who had no interest in what had been built. I will always feel that something great was lost.


The problem is getting that jank even when you buy the expensive models, though.


But then they're running on the same common platform as the models at half the price. More than 95% of the cost of the TV is in the panel itself; a fancy model is usually just a bigger model, maybe with a different, higher-end panel. The CPU inside is nothing special, because that's how they keep costs down to compete with the cheap 60-inch TV you saw while shopping for groceries.


> TVs are a wildly unprofitable business... not far back the CPUs in practically every TV was single core or dual core

Explain to me then how an Apple TV device for $125 (Retail! not BOM!) can be staggeringly faster and generally better than any TV controller board I've seen?

I really want to highlight how ludicrous the difference is: My $4,000 "flagship" OLED TV has a 1080p SDR GUI that has multi-second pauses and stutters at all times but "somehow" Apple can show me a silky smooth 4K GUI in 10 bit HDR.

This is dumbass hardware-manufacturer thinking of "We saved 5c! Yay!" Of course, now every customer paying thousands is pissed and doesn't trust the vendor.

This is also why the TVs go obsolete in a matter of months, because the manufacturers are putting out a firehose of crap that rots on the shelves in months.

Apple TV hasn't had a refresh in years and people are still buying it at full retail price.

I do. Not. Trust. TV vendors. None of them. I trust Apple. I will spend thousands more with Apple on phones, laptops, speakers, or whatever they make, precisely because of these self-defeating decisions from traditional hardware vendors.

I really want to grab one of these CEOs by the lapels and scream in their face for a little while: "JUST COPY APPLE!"


> Explain to me then how an Apple TV device for $125 (Retail! not BOM!) can be staggeringly faster and generally better than any TV controller board I've seen?

This is the result of Apple being vertically integrated and reusing components from other product lines in products like the Apple TV. The SoCs used in the Apple TV come from lower-tier bins of chips produced for mobile applications.

With the Apple TV, you are getting a SoC that is effectively the same as a recent-year iPhone. With most other Smart TV devices you are getting a low computational power SoC, Raspberry Pi tier, with processing blocks that are optimized for the video playback and visual processing use cases.

Apple also does this with the iPhone where the non-flagship variants will reuse components or designs from prior years.

Television/Smart TV manufacturer margins are in the single-digit percentages, and the Samsung and LG TV businesses are significantly threatened since their high-volume products have been commoditized by competition from Chinese producers. Most potential customers shop based on screen size per dollar rather than specs like peak luminance and contrast ratio. Flagship TV products like "The Wall" are low-volume halo products. Lifestyle products like "The Frame" exist because they differentiate to certain segments of customers who place enough value on the aesthetics to buy a higher-priced product with better margins for the manufacturer.

Most other hardware device manufacturers are jealous of Apple's margins. Nvidia would probably be one of the few exceptions.

Thin margins on commodity tier products drive these manufacturers to cut their BOM costs as much as possible, even if it makes the product worse in other ways. This is also the big driver for why ads are appearing as part of the Smart TV experience at the device/screen level. Vizio for example shared that they made more money from their ACR business than they did from the device sales themselves. There are companies with business models based around giving you the screen for "free" in exchange for permanent ad-space. Even adjacent products and companies like Roku have business models where they are selling their hardware at near break-even cost points because their business model is built around 'services' from having a large user audience.


Budget mobile phones exist, and make a profit. These have 4G radios, screens, batteries, cameras, and storage.

There is no excuse for TV manufacturers when selling premium devices costing thousands of dollars.


Greater than 95% of the cost of a TV is in the panel.

TV panels must have a near-0% defect rate, and a single piece of dust during manufacturing will render the finished panel e-waste. The bigger the panel, the higher the risk of a defect, roughly exponentially with area, because there's more surface for any defect to land on. It's the same issue that drove chip companies to chiplets: smaller die sizes improve yield, so less silicon gets thrown away.

A TV panel is basically a 50in chip, and a mobile phone display is a 6in chip.
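The "bigger panel, exponentially worse yield" point can be made concrete with the classic Poisson yield model, yield = exp(-defect density × area). The defect density below is an illustrative assumption, not a real fab figure; the point is the relative gap between a 6-inch and a 50-inch panel at the same process quality.

```python
import math

def panel_area_cm2(diagonal_in, aspect=(16, 9)):
    # Convert a diagonal (inches) to area (cm^2) for a 16:9 panel.
    w, h = aspect
    diag_cm = diagonal_in * 2.54
    unit = diag_cm / math.hypot(w, h)
    return (w * unit) * (h * unit)

def poisson_yield(area_cm2, defects_per_cm2):
    # Poisson yield model: probability of zero defects on the panel.
    return math.exp(-defects_per_cm2 * area_cm2)

D = 1e-4  # assumed defect density: 1 defect per 10,000 cm^2 (illustrative)
for diag in (6, 50):
    a = panel_area_cm2(diag)
    print(f'{diag}in panel: {a:,.0f} cm^2, yield {poisson_yield(a, D):.1%}')
```

With these assumed numbers the phone-sized "chip" yields about 99% while the TV-sized one yields about 50%: the same process quality, but the 50-inch panel has roughly 69× the area, so far more panels come off the line as e-waste.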


Samsung also has access to competitive mobile SoCs through vertical integration though.


In theory they do have access and should, but in practice they don't.

Samsung's flagship mobile phones tend to ship with Qualcomm Snapdragon SoCs in competitive markets, such as USA/North America, versus their "in-house" Exynos SoCs in markets where consumers tend to have less choice (e.g. Samsung S-series phones with Snapdragon for the USA, Exynos for the EU and KDM markets).



