Response time of Pandora successor screen


_wb_

I like the Pandora screen a lot. It's crisp, nice pixel density, efficient in power consumption. The only two things that could be improved in my opinion are the size (slightly larger would be nice) and the response time (ghosting). If you don't know what ghosting is, start Pandora System Info, go full screen ("f") and use dpad left/right to navigate. Look at how smoothly the pixels flow. Yep. Not that smoothly.

LCD screens have always had a problem of ghosting, with response times in the order of 30 ms (so essentially the pixels flip at ~30FPS, even if the display takes 60FPS input). The problem has not gone away on the latest LCD screens, although it has somewhat improved thanks to built-in compensation mechanisms. But for 60 FPS gaming, it's still not that good.

CRT and plasma screens don't have a ghosting issue, but those are obviously not suitable for a handheld.

So AFAIK that leaves only OLED. OLED has response times in the order of 1 or 2 ms, so no ghosting at all. They also have other desirable properties (low power consumption, good contrast, etc).

The only problem with OLED is that there are not a lot of manufacturers to choose from. Basically it's just Samsung who makes them AFAIK. So there are not a lot of displays to choose between; (long-term) availability may be an issue. Also many of Samsung's panels use PenTile pixel arrangements, which is something I don't really like - I prefer my pixels to have 3 subpixels.

So my question is: how important is it to have a display with a good response time? As far as I'm concerned, it's the one thing that needs improvement.
 
An additional issue with OLED, which may have been fixed by now, is lifespan. The OLED displays from 3-4 years ago had a tendency to lose brightness over time. As a side effect of the lifespan issue, the color balance can start to drift.

I'm not saying that we shouldn't do OLED, just that it isn't perfect either.

From http://en.wikipedia.org/wiki/OLED

Lifespan

The biggest technical problem for OLEDs was the limited lifetime of the organic materials. One 2008 technical report on an OLED TV panel found that "After 1,000 hours the blue luminance degraded by 12%, the red by 7% and the green by 8%."[64] In particular, blue OLEDs historically have had a lifetime of around 14,000 hours to half original brightness (five years at 8 hours a day) when used for flat-panel displays. This is lower than the typical lifetime of LCD, LED or PDP technology. Each currently is rated for about 25,000–40,000 hours to half brightness, depending on manufacturer and model.[65][66] Degradation occurs because of the accumulation of nonradiative recombination centers and luminescence quenchers in the emissive zone. It is said that the chemical breakdown in the semiconductors occurs in four steps: 1) recombination of charge carriers through the absorption of UV light, 2) homolytic dissociation, 3) subsequent radical addition reactions that form π radicals, and 4) disproportionation between two radicals resulting in hydrogen-atom transfer reactions.[67] However, some manufacturers' displays aim to increase the lifespan of OLED displays, pushing their expected life past that of LCD displays by improving light outcoupling, thus achieving the same brightness at a lower drive current.[68][69] In 2007, experimental OLEDs were created which can sustain 400 cd/m2 of luminance for over 198,000 hours for green OLEDs and 62,000 hours for blue OLEDs.[70]

Color balance issues

Additionally, as the OLED material used to produce blue light degrades significantly more rapidly than the materials that produce other colors, blue light output will decrease relative to the other colors of light. This variation in the differential color output will change the color balance of the display and is much more noticeable than a decrease in overall luminance.[71] This can be avoided partially by adjusting color balance, but this may require advanced control circuits and interaction with the user, which is unacceptable for some users. More commonly, though, manufacturers optimize the size of the R, G and B subpixels to reduce the current density through the subpixel in order to equalize lifetime at full luminance. For example, a blue subpixel may be 100% larger than the green subpixel. The red subpixel may be 10% smaller than the green.
 
The best I found with LCDs is 20ms so far. The current Pandora one has 30ms, so 20ms is a lot better already 
 
LCDs don't have to have 30ms response time. Most decent modern desktop LCDs are much lower than that. Even older mobile displays can be better than what's in Pandora, GP2X for example has much less ghosting.

I think the problem is that most mobile displays are just not optimized for it these days, they prefer better static image quality and lower power consumption.

Something that would be interesting to try is to take some demo that ghosts easily on Pandora @ a 60Hz update, change the refresh rate to 120Hz, and insert pure black every other frame. It'll be half as bright, but it may be less ghosty.. would be curious to see how that works out.
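
A rough sketch of what that black-frame-insertion test could look like with SDL2, assuming the panel can actually be driven in a 120Hz vsynced mode (the window size and the moving square are just placeholders for a real ghosting demo):

```
/* Black-frame-insertion sketch (SDL2). Assumes a 120Hz vsynced display mode;
 * content is drawn on even vsyncs only, so it still updates at 60Hz, with a
 * pure black frame presented in between. */
#include <SDL2/SDL.h>

int main(void)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("bfi-test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 480, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    int x = 0, frame = 0, running = 1;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) running = 0;

        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        if (!(frame & 1)) {
            /* even vsync: draw the actual content (a moving bright square) */
            SDL_Rect r = { x % 800, 200, 64, 64 };
            SDL_SetRenderDrawColor(ren, 255, 64, 64, 255);
            SDL_RenderFillRect(ren, &r);
            x += 4;
        }
        /* odd vsync: nothing drawn, so a pure black frame is inserted */
        SDL_RenderPresent(ren);   /* waits for vsync, ~8.3 ms per frame at 120Hz */
        frame++;
    }

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```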
 
Well, the faster the screens are, the more power they need, and most use cases don't need a really fast screen.

The GP2X had a Top-Sun TS35ND2501 screen with a typical response time of 25ms, only 5ms faster than the Pandora one.

Those 5ms can make a huge difference!

The screen where I'll get samples from soon will have 20ms, so that's pretty good.

FullHD though...
 
Many of the faster screens have inverse ghosting as a result of overdriving the panel. I would say brightness, and brightness uniformity (backlight saturation on LCD), are things to look out for. That being said, it isn't just OLED vs LCD; there are TN panels, PVA, S-IPS, etc.
 
Well, the faster the screens are, the more power they need, and most use cases don't need a really fast screen.

The GP2X had a Top-Sun TS35ND2501 screen with a typical response time of 25ms, only 5ms faster than the Pandora one.

Those 5ms can make a huge difference!

The screen where I'll get samples from soon will have 20ms, so that's pretty good.

FullHD though...
Those figures tend to be "typical." They often don't include min and max at all, so who knows what the distribution is like. From anecdotes regarding Pandora we get a lot of different accounts but that could be subjective.

From spec sheet to spec sheet the actual definitions used can also mean slightly different things.
 
An additional issue with OLED, which may have been fixed by now, is lifespan. The OLED displays from 3-4 years ago had a tendency to lose brightness over time. As a side effect of the lifespan issue, the color balance can start to drift.
This was an issue with the early OLEDs, but they have done lots of work to protect them from oxidation and that problem is virtually gone (i.e. you will replace your device way before your OLED screen goes bad).
 
LCD screens have always had a problem of ghosting, with response times in the order of 30 ms (so essentially the pixels flip at ~30FPS
One thing I'd like to add is that there's no single number for response time. First, manufacturers use different tests, and second, the results vary hugely from one color to another. You may be fast to go from yellow to red but slow to go from blue to green, for example. What's critical for ghosting is the uniformity of the transition from one state to another, across all color-switch conditions and across the range of visible colors to be reproduced.

Net, you can't really rely on the numbers coming from the marketing departments of said manufacturers; you need to test the panels yourself to really see what they mean in practice.
 
Most datasheets I've seen these days use the same measurements for response time (from black to white) and offer typical and maximum values.

It's become a lot better than years ago, less marketing in the datasheets :)
 
Most datasheets I've seen these days use the same measurements for response time (from black to white) and offer typical and maximum values.
Yes, but that's not what I meant. From black to white is just a change from pixel off to pixel lit. Changing from one color to another color could take more time than changing from off to on. That's where ghosting comes from as well.
 
From black to white is just a change from pixel off to pixel lit.
In LCD screens, going from black to white means the RGB values go from 0 to 255, so it's not a pixel on/off.

The backlight is always on on these screens ;)

That's why, with LCD, going from white to black and back are the "best" measures; all 3 sub-pixels impact the speed of that change.
 
I never really experienced ghosting issues with the P1 screen. Pinball is a nice genre for seeing effects like this, so I just tried a game of Psycho Pinball on Picodrive, but I didn't notice any ghosting. Only when I activate the flippers very fast can you see that the screen is not able to catch up.

So if P2 is similar or a bit better it should be ok imo. I would love to see an OLED screen though, but then just for the clarity. Blue fonts on a black background (emacs - orgmode) are very hard to read on the tiny P1 screen, and I think an OLED screen would really help.

BTW.. this is just an example. I don't need a how-to adjust the background or font color / size in emacs. :)
 
Try playing some VCS2600 games like Phoenix, etc. Pretty much unplayable, as the shots disappear.
 
From black to white is just a change from pixel off to pixel lit.
In LCD screens, going from black to white means the RGB values go from 0 to 255, so it's not a pixel on/off.

The backlight is always on on these screens ;)

That's why, with LCD, going from white to black and back are the "best" measures; all 3 sub-pixels impact the speed of that change.
Gray-to-gray is often the measurement given, though. The problem is that going from black to white overshoots the mark, so once you go from one thing to another, that's not quite the end of it in many cases. Backlighting can be a matrix, which is a semi-decent idea; changing the backlighting dynamically for the whole panel is next to useless with moving media.

I like OLED for being simpler, and for its environmental side in power use and recycling. How much it degrades over time I don't really know; I think that was pretty much fixed by the Galaxy S2 era. And it's mitigated by being able to change the panel: getting spare parts, or help with installing something better, if that comes to fruition.

Also there is 10-bit aRGB or sRGB and all that, but for what it's going to be used for I don't think that's much of a factor. Slight oversaturation of one colour, like on the Galaxy S3, is not that important either.
 
If my calculations are correct, I don't see much point in dropping below 20ms for most applications. That's sufficient to render black to white and back to black within one frame of a 25fps video, which is approximately what you need for even the most demanding console games. For native games that might be a problem, but even at 60Hz you're talking about roughly 8ms of response time for that.

Obviously, if they can do 10ms without throwing the budget or the power consumption requirements out, I'd love to see them do it; I'm just not sure that it's that big of a problem. 20ms would likely be more than sufficient for most cases that people are going to have. I know I've not noticed the ghosting, as I'm rarely playing anything that goes from black to white and back to black again quickly enough for it to be a problem.
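
As a quick sanity check of that arithmetic (assuming a full black→white→black swing has to complete within a single frame, which is the worst case the post describes):

```
/* Back-of-the-envelope check: how much time each pixel transition gets
 * if a full black->white->black swing must fit inside one frame. */
#include <stdio.h>

int main(void)
{
    double rates[] = { 25.0, 60.0 };              /* frames per second */
    for (int i = 0; i < 2; i++) {
        double frame_ms = 1000.0 / rates[i];      /* duration of one frame */
        double per_transition = frame_ms / 2.0;   /* two transitions per frame */
        printf("%4.0f fps: frame %5.1f ms, per transition %5.1f ms\n",
               rates[i], frame_ms, per_transition);
    }
    return 0;
}
/* 25 fps: 40.0 ms frame -> 20.0 ms per transition
 * 60 fps: 16.7 ms frame ->  8.3 ms per transition */
```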
 
If my calculations are correct, I don't see much point in dropping below 20ms for most applications. That's sufficient to render black to white and back to black within one frame of a 25fps video, which is approximately what you need for even the most demanding console games. For native games that might be a problem, but even at 60Hz you're talking about roughly 8ms of response time for that.
Most NTSC 2D consoles updated the screen automatically at 60Hz. While frame data for sprites and tiles was usually not updated every refresh, the positions of sprites and the scrolling of the background usually were. And it was normal for the background to scroll at least 1 pixel per frame, sometimes several more.

These games would often use thick black outlines around brightly colored graphics. Look at this screenshot from StarTropics II for example: http://www.hardcoregaming101.net/nes/star2-2.gif When you move around the screen, those bright near-white colors in the mountains will bleed badly into the near-black outlines.
 
If my calculations are correct, I don't see much point in dropping below 20ms for most applications. That's sufficient to render black to white and back to black within one frame of a 25fps video, which is approximately what you need for even the most demanding console games. For native games that might be a problem, but even at 60Hz you're talking about roughly 8ms of response time for that.
Most NTSC 2D consoles updated the screen automatically at 60Hz. While frame data for sprites and tiles was usually not updated every refresh, the positions of sprites and the scrolling of the background usually were. And it was normal for the background to scroll at least 1 pixel per frame, sometimes several more.


These games would often use thick black outlines around brightly colored graphics. Look at this screenshot from StarTropics II for example: http://www.hardcoregaming101.net/nes/star2-2.gif When you move around the screen, those bright near-white colors in the mountains will bleed badly into the near-black outlines.
But wasn't that interlaced? As in, they weren't really updating things that quickly; each scan line was only being touched every other time, so really only at 30Hz rather than at the full 60Hz.

Or is there something that I'm missing here? A CRT TV that was able to do twice the FPS of the source material seems like an awful waste of money. It's not like LCDs today, where you need to provide time for the shutters to adjust; the adjustments to the cathode ray should have been much quicker.
 
But wasn't that interlaced? As in, they weren't really updating things that quickly; each scan line was only being touched every other time, so really only at 30Hz rather than at the full 60Hz.

Or is there something that I'm missing here? A CRT TV that was able to do twice the FPS of the source material seems like an awful waste of money. It's not like LCDs today, where you need to provide time for the shutters to adjust; the adjustments to the cathode ray should have been much quicker.
The traditional NTSC signal contains interlaced "even" and "odd" fields of 262 and 263 lines respectively to gain an effective 525 line 30Hz rate. But you don't have to actually interlace the signal at all, you can send nothing but even or nothing but odd fields every frame - the scanline timing is controlled by the signal so you have this flexibility, it's totally independent of the display itself. And that's what the 2D consoles did almost exclusively, even if they offered interlaced modes they were rarely used. So it was really a true progressive scan 60Hz signal with 192-242 or so visible lines. This is also why the consoles had noticeable dark gaps between the scanlines.
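
To put rough numbers on that, here is a small sketch using the nominal NTSC colour line rate (the consoles' own pixel clocks deviate slightly from this, so the figures are approximate):

```
/* Rough NTSC timing arithmetic for the interlaced vs. progressive cases
 * described above, using the nominal colour NTSC line rate. */
#include <stdio.h>

int main(void)
{
    const double line_rate = 15734.26;            /* nominal NTSC scanlines per second */

    double interlaced_frames = line_rate / 525.0; /* full 262+263 line interlaced frame */
    double interlaced_fields = 2.0 * interlaced_frames;
    double progressive       = line_rate / 262.0; /* console trick: same-length field every time */

    printf("interlaced: %.2f frames/s (%.2f fields/s)\n",
           interlaced_frames, interlaced_fields);
    printf("progressive console output: %.2f Hz\n", progressive);
    return 0;
}
/* prints roughly 29.97 frames/s (59.94 fields/s) interlaced,
 * and ~60.05 Hz for the progressive 262-line case */
```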
 