Headline should read “The Death of small LCDs…”.
MiniLED backlit LCD display tech can come pretty close to OLED and it’s only going to get cheaper. I’m a longtime OLED enthusiast but the M1 series displays were what convinced me there’s a long future for LCD TVs and monitors.
Apple exited microLED after investing multiple billions: https://web.archive.org/web/20240624161521/https://www.yoleg... & https://www.fierceelectronics.com/electronics/microled-marke...
> The cost of microLEDs remains high and the technology has struggled to compete with organic LEDs (OLEDs), he said, which keep improving while getting cheaper and lasting longer. It’s possible that Apple went the wrong route when it came to developing the technology. “Maybe their design was over engineered,” Virey said. “Maybe it was expensive by design.”
I don't think this has anything to do with LCDs, since microLED doesn't use LCD, unlike miniLED.
Yeah the names are confusing here. I actually had to edit my post after I realized I used “MicroLED” when I meant “MiniLED” so I’m not sure if you saw that one. If so, sorry!
For those who are mercifully unfamiliar: MiniLED is different than MicroLED. MiniLED means a backlight panel with more addressable “zones” due to use of smaller and more densely packed LEDs. It’s arguably more of a marketing term for high-resolution LED backlighting, but the impact of this incremental improvement shouldn’t be overlooked - it’s very effective for most applications.
MicroLED on the other hand is like OLED in that each pixel is a single addressable unit that emits its own light, but differs in that it’s essentially using an actual grid of semiconductor-style LEDs, where OLED manufacturing uses something totally different IIRC.
Thanks for the explanation!
It's no surprise that companies are pushing OLED as they actually wear out, while LCDs don't.
Isn't that the opposite of the conclusions from https://www.rtings.com/tv/learn/longevity-results-after-10-m... ? Every LCD in their test seems to look like stark junk.
It's definitely incorrect to say that backlit LCD screens don't wear out.
However, it's definitely correct to say that quality OLED screens wear out much, much faster than quality backlit LCD screens.
And as always, garbage-grade screens [0] fall to pieces much, much faster than good ones.
You might also be interested in this follow-up report that was published about a year later than the link you posted: <https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...>.
Also interesting is a video that RTINGS made a little while back (which I can't currently find) where they took some of the TVs from this test with failing screens that looked SUPER bad in the still photos, showed them playing moving pictures, and asked "So, hey, can you spot the problem?". For some the answer was an obvious "YES", but for others, no, not at all.
[0] Like dreadfully thin edge-lit ones: <https://www.rtings.com/research/thin-lcd-tvs-break-faster-un...>
Some of those look pretty bad but I feel like having a news chyron on-screen for 10000 hours indicates something deeper than having bought an imperfect TV.
Just a TV somewhere in the lobby of a doctor's office, or some such, that permanently shows news, interspersed with some short information pieces from the medical establishment. It would also usually run closed captions at all times, also at about the same area of the screen.
Yep. The test they're running is exactly this scenario.
In one of the articles related to the project, they talk about how they considered handling the fact that the 24/7 news channel they selected for the test significantly changed something about the bottom bar (the logo? the crawl design? both? I can't remember.). IIRC, they considered running a lengthy pre-recorded loop through the screens, but decided against it because that might not be representative of real-world programming decisions or whatever.
I collect old laptops.
The difference in brightness and colour reproduction in a pair of 2003 PowerBooks, one used heavily and the other just stored…
It’s quite stark. Dim backlights and washed out colours.
Yeah, just like OLED, the LED backlight on an LCD doesn't last. I just recently swapped out the backlight on my ~10-year-old TV with a $30 new one off AliExpress. Way brighter again, and way less color accurate. At least it doesn't need to be e-waste now.
2003 PowerBooks didn’t have an LED backlight, they were CCFL. MacBook Pros didn’t get an LED backlight until 2007.
Apple TV is the only option I know of (other than using it as an HDMI output device for a PC) that allows for color calibration. This needs to become mainstream!
TVs generally have a service menu with color calibration options.
The Apple TV setup has you use a phone to handle it for you (it auto calibrates from video) rather than having to fiddle with 20 different sliders in 7 different menus.
With a colorimeter?
It uses a known (to Apple) iPhone camera as a colorimeter.
I’m aware, that’s what I was referring to in GP. Read the post I was replying to again.
Are you saying your replacement while being brighter is now less color accurate?
OLEDs do burn in, but it takes a LOT of time. Here is an OLED Switch that has 18,000 hours of static screen time. Yes, it has burned in, but under the most extreme conditions possible.
https://www.youtube.com/watch?v=Po8jAQjvd88
I don't think that's true, certainly not in my experience. My LG OLED has burn in for something that's rarely on the screen.
Everything wears out. Early OLEDs wore out faster than the already very mature LCD tech, but that's just the nature of early adoption of new tech.
I don’t understand what you mean. Both of our LCD TVs have failed or degenerated and I was looking to OLED for something that would last longer.
>Furthermore, LCD screens have a finite lifespan, typically around 30,000 to 60,000 hours, after which the quality of the display can start to degrade. In contrast, OLED screens can potentially last up to 100,000 hours
Maybe I don't use TVs the way most people do. But the most pessimistic lifetime estimate for my use case on a 30,000-hour panel is 41 years. I don't really see a television as something to pass on to my heirs, so that seems like a solved problem for non-commercial use.
I've had televisions fail, but it has always been a connector or capacitor that I could replace with a soldering iron and about 46 minutes of disassembly/re-assembly.
A lot of us watch more tv than two hours a day. I have over 1000 hours in the game Elden Ring alone! :)
I also consider life estimates as very fluid. My gf’s Hisense 75 inch can’t be more than a few years old and it has already gone to pot with weird circles visible on any scene with a white or light background. 8 hours a day for a few years is only ~6,000 hours, far less than 30k.
~45000 / 7 hours per day = 17 years.
17 years ago was 2007 - I can’t remember every TV our family had since 2007, but it was more than one or two, for sure.
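The arithmetic behind these estimates is easy to sanity-check; a minimal sketch, using only the rated-hours and hours-per-day figures quoted in the comments above:

```python
def panel_lifetime_years(rated_hours, hours_per_day):
    """Calendar years until a panel reaches its rated lifetime."""
    return rated_hours / (hours_per_day * 365)

# 30,000-hour panel at ~2 hours/day of viewing
print(int(panel_lifetime_years(30_000, 2)))   # 41
# 45,000-hour panel at 7 hours/day
print(int(panel_lifetime_years(45_000, 7)))   # 17
```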
The failure points for LCDs are usually not the panel itself (the most expensive part), unlike with OLEDs.
Maybe planned obsolescence has something to do with it. But mostly, OLEDs look so much better than LCDs. I can’t believe we put up with light grey “black” pixels for so long.
We've put up with gray for black since the 1950s, when the first CRTs were sold. Not sure why you stopped in history at LCD. LCD was an improvement, just like LED/OLED was the next improvement over LCD.
It isn’t that mysterious. I stopped at LCDs because they were the direct predecessor to OLED, and so they are the familiar comparison point for most people.
I wouldn’t call them an improvement over CRTs, except in terms of weight. But, no accounting for taste.
> since 1950s when the first CRTs were sold
1930s.
LCDs are still widely available and dirt cheap thanks to cutthroat competition, but the only reason why you'd buy one these days is if you can't afford OLED. (That includes everyone that's worried about burn-in)
No, it doesn't include everyone.
My daughter loves to put a beautiful image on her iPad and make it stay like a picture frame for hours. She’s a kid, she doesn't care if pixels burn out, and if I try to prohibit it, it will just happen more often.
For me, OLED is better, sure.
Or, you know, if you want to maintain any level of brightness over a window greater than 10%. I have OLED and MiniLED on my desk. Each excels in a different area. The OLED can't even get close to the MiniLED's HDR performance, whilst the MiniLED gets close enough to the OLED's strengths for 95% of content, save for motion: OLED reams LCD in motion. I just wish I could have everything in the one panel, but alas, we are just not there.
I'm hoping we see major bumps to desktop OLED brightness over the next few generations of panels, as it's still early days, to be fair.
For now, it's likely I'll keep both on my desk: OLED for darker titles only supplying the occasional highlight, MiniLED for everything else, including titles calling for large and sustained brightness values.
OLED brightness still hasn’t quite caught up, and many of the bright LED backlit LCDs have great color gamuts, and supposedly almost true black:
https://www.rtings.com/tv/reviews/best/bright-room
Anyway, if you have a big window in the room where your TV is, brightness probably should be the deciding factor.
> LCDs still dominate larger displays, with OLED accounting for just 3.1% of all TVs in 2023
Announcing the death of LCDs seems slightly immature.
Premature. Jeez.
I have a theory that native 720p/768p panels are going to go up in value on the second-hand market once they're no longer available. Xbox 360/PS3-era games don't look quite right scaled up to higher resolutions (and a lot of stuff from the generation afterwards too). The retro gamers with cash burning a hole in their pocket are going to get to that generation soon and won't have nostalgia for old CRTs anymore.
I bet you could do pretty convincing 3x scaling on a 4k oled and simulate the individual R, G, and B pixels with the right hardware though.
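For what it's worth, the idea is straightforward to sketch in software. A toy version in Python, purely illustrative (a real scaler would do this in hardware, and `simulate_lcd_subpixels` is a made-up name):

```python
def simulate_lcd_subpixels(img):
    """3x integer upscale that mimics an RGB-stripe subpixel layout.

    img is a 2-D list of (r, g, b) tuples. Each source pixel becomes a
    3x3 block whose left/middle/right columns light only the R/G/B
    channel, roughly like the vertical subpixel stripes of an LCD.
    """
    out = []
    for row in img:
        stripe_row = []
        for r, g, b in row:
            stripe_row += [(r, 0, 0), (0, g, 0), (0, 0, b)]
        out += [list(stripe_row) for _ in range(3)]  # 3 rows tall
    return out

# One white source pixel turns into a 3x3 block of R/G/B stripes.
print(simulate_lcd_subpixels([[(255, 255, 255)]]))
```

A real simulation would also need to blur the stripes slightly to mimic the glow of physical subpixels, but the channel-masking step is the core of it.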
1440p displays should be just fine for a 2x upscale off 720p.
Uneven multiples don't behave nicely. Even multiples, however, are fine.
It's not the CRT era upsampling this time around.
There's a qualitative difference between an integer upscale and actual LCD pixels at native res, especially at low resolutions. The glow and shape of the individual pixels makes edges a little softer without looking blurry, and anything taking advantage of subpixel rendering doesn't look quite right when upscaled.
A couple months back I suddenly got nostalgic for MGS4 and was shocked to find out there's no way to play it other than to power up your PS3 again. Ditto for FF13 (but who cares about that game)
I suppose the PS3 is also the last console with a very bespoke architecture making it hard to port etc
I want to see 2.9 x 3.1 meter chips now. Wall height.
Can't it also be a display? Nothing fancy; just rows of flashing LEDs to illustrate activity would be enough.
The bigger the size, the greater the chances of unacceptable defects. Which drives the price of non-defective units up commensurately (just like regular wafer size). And makes replacements equally onerous and expensive.
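This is the same reasoning behind classic chip-yield math. A rough sketch using the Poisson yield approximation, yield = exp(-D·A); note the defect density and panel areas below are made-up illustrative numbers, not real fab data:

```python
import math

def poisson_yield(area_cm2, defects_per_cm2):
    """Fraction of defect-free units under the Poisson yield model."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.0001  # assumed defect density in defects/cm^2 (illustrative only)
small = poisson_yield(55 * 31, D)    # ballpark 25" panel area in cm^2
large = poisson_yield(165 * 93, D)   # ballpark 75" panel, ~9x the area
print(f'~25": {small:.0%} defect-free, ~75": {large:.0%} defect-free')
```

Because area sits in the exponent, the same defect density that leaves most small panels flawless wipes out the majority of wall-sized ones, which is why price rises much faster than area.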
The defects can stay. In my oblivious opinion they should just drill holes in them.
Does this mean LCD displays are going to get a lot cheaper? That would be nice.
LCD panels are already dirt cheap. E.g., look at prices from wholesalers on Panelook.
I bought a 42 inch TV for just under $300 AUD a few weeks back. I would suspect the panel isn't even the most expensive part any more on a TV, logistics for that thing have to be one of the biggest parts nowadays.
How hard is it to get an LCD panel professionally cut with holes in it?
Holes where? I'm pretty sure cutting into the display area will break all the rows or columns that hit each hole
I'm pretty sure there have been LCD smartphones with a camera hole.
E.g. from 2019:
> Alongside the new OLED announcements, CSOT also demonstrated a new LCD model with a front-camera hole-punch design. Here the company just wanted to demonstrate their ability to employ the technology which will be in high demand this year.
https://www.anandtech.com/show/14002/tclcsot-reveals-mobile-...
If LCD fabs could be retooled to make chips that would be amazing.
Nice to see that display technology enshittification is continuing apace. Well-cared-for CRTs from 50 or more years ago still work. Meanwhile, Game & Watch games and other 80s gadgets -- even new old stock ones -- are turning into useless kipple that will never be seen in working order again, because the irreplaceable custom LCDs are deteriorating. I can't wait for future retro enthusiasts to discover the 10-year (or less) half-life on OLED devices like the recent rev of Switch.
Are you implying that the dying LCDs of 40 year old video games are an example of enshittification?
No, but the LCDs in more recent laptops and devices are doomed to a similar fate, on 40-year or less timescales, and the world has literally forgotten how to make a more durable display. There's like one active manufacturer of CRTs left, and they only cater to military applications like fighter-plane HUDs (and charge an arm and a leg), which means the standard for display technology longevity has gone down.
And this is an example of enshitification?