5" screen (on a P2)


Print something - text even - on a printer at 300dpi then at 600dpi. Hold them at 25cm. You cannot see the individual pixels on either - but you can definitely tell a difference in readability, crispness and overall quality of the output.
A 600dpi printer is approximately equivalent to 212ppi, and could be as low as 150ppi with a tradeoff being better colour per pixel. A 300dpi printer would equal about 106ppi. On a 5 inch screen, you're looking at an approximate difference between 1000x600 and 500x300, at best. Depending on the printer, it may be an even sharper difference, like 800x480 vs 400x240. At those resolutions you are absolutely going to see a difference.


A high quality 1200dpi printer is going to give you the desired 300ppi. To make your comparison relevant, you'd need to print something off at 2400dpi and 1200dpi at the same colour density. That might prove difficult though, as most 2400 dpi printers are basically 1200 dpi printers with double the colour density: they don't give more "pixels" they just give twice the colour range.
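For anyone who wants to poke at the arithmetic behind those figures, here is a rough sketch (my own, not from the thread) that assumes a dithering printer builds each perceived grey "pixel" out of a small cell of binary dots, so effective ppi is roughly dpi divided by the cell size. A cell of about 2.8 dots reproduces the 106 and 212 ppi figures above, while a 4-dot cell gives the 300 ppi figure quoted for 1200 dpi:

```python
# Hedged sketch of the dpi-to-ppi conversion being argued about above.
# Assumption (mine, not from the posts): a dithering printer builds each
# perceived grey "pixel" from a cell of n binary dots per side, so the
# effective pixel density is roughly dpi / n.

def effective_ppi(printer_dpi: float, dither_cell: float) -> float:
    """Approximate perceived pixels-per-inch for a dithering printer."""
    return printer_dpi / dither_cell

for dpi in (300, 600, 1200):
    # cell sizes of ~2.83 and 4 bracket the figures quoted in the thread
    print(dpi, {n: round(effective_ppi(dpi, n)) for n in (2.83, 3.0, 4.0)})
```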
 
Yes Grench, please don't compare screen ppi with printer dpi. Ink or paper is 1 bit (afaik, no printer dilutes ink to create shades of color), so CMYK would give you at most 4 bits per dot, and one of those is even redundant in theory. Most screens do 24 bits per pixel.
 
I really hope the team doesn't go for too high a res. Just look at the iPad 3: all the resolution in the world, but without the GPU to power it!


On a mobile device I would rather have a lower res than a high res if that means higher FPS.
 
Didn't you read what I said about printers? :/
You provided no supporting documentation of your assertions AND they were irrelevant to the analogy that I was making.

Print something - text even - on a printer at 300dpi then at 600dpi. Hold them at 25cm. You cannot see the individual pixels on either - but you can definitely tell a difference in readability, crispness and overall quality of the output.
A 600dpi printer is approximately equivalent to 212ppi, and could be as low as 150ppi with a tradeoff being better colour per pixel. A 300dpi printer would equal about 106ppi. On a 5 inch screen, you're looking at an approximate difference between 1000x600 and 500x300, at best. Depending on the printer, it may be an even sharper difference, like 800x480 vs 400x240. At those resolutions you are absolutely going to see a difference.


A high quality 1200dpi printer is going to give you the desired 300ppi. To make your comparison relevant, you'd need to print something off at 2400dpi and 1200dpi at the same colour density. That might prove difficult though, as most 2400 dpi printers are basically 1200 dpi printers with double the colour density: they don't give more "pixels" they just give twice the colour range.
You claim that 600x600 dpi laser printing is the equivalent of 212 pixels per inch black and white on a screen? Do you have anything that supports that claim?


Also - even if true in some twisted fashion, NONE of that detracts from or discredits the ANALOGY that I was making.

Yes Grench, please don't compare screen ppi with printer dpi. Ink or paper is 1 bit (afaik, no printer dilutes ink to create shades of color), so CMYK would give you at most 4 bits per dot, and one of those is even redundant in theory. Most screens do 24 bits per pixel.
I said nothing about color. I was making an analogy using a simplified system. Set the entire analogy to black/white if you would like.

/**********/


I was using printing as an analogy.


http://dictionary.reference.com/browse/analogy


The analogy fits just fine. In either case, you cannot see the individual pixels at 24cm, but you CAN tell a difference in quality. THAT is the point of my post.
 
Not really too weak for the resolution itself - more like too weak for rendering certain programs at that res.


Edit: Also halving the res doesn't give a great boost in e.g. Mupen.
 
You provided no supporting documentation of your assertions AND they were irrelevant to the analogy that I was making.

I didn't think I had to provide links explaining how printers work. Unless your printer has variable sized dots it can only print dots of one of the colored inks that it has access to. This should be common sense. And most printers only have 4 tones of ink. Here's a link explaining it:


http://www.1960pcug.org/pcnews/2000/03/more_about_%20inkjet_printers.htm


Dithering increases the resolution of perceived samples at the expense of spatial resolution (number of samples). It works so well for printers precisely because the individual dots are beyond the resolution capabilities of the human eye, so the eye, acting as a low pass filter, averages color values within regions, giving the appearance of more color tones. Your own argument seems to be inadvertently stumbling upon this fact, since you point out that the higher resolution printer images look better despite not being able to see dots on either. But if the printer had the ability to output samples beyond the spatial resolution perception of the human eye AND could output each sample beyond the sample differentiation capabilities of the human eye, then increasing the spatial resolution WOULD be pointless.


All this stuff about just using "analogies" falls flat too, because it makes the assumption that more samples per area = better just because it does in the printer example you gave. But the printer example was hinging on the DPI being beyond human resolution capabilities in both cases. So the fact that the effective color depth is much different kills the relevance...
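To make the trade-off described above concrete, here is a minimal ordered-dithering sketch (my own illustration, not code from the post): a 1-bit output device approximates a grey level by toggling dots within a small cell, and the eye's low-pass filtering averages them back into a perceived tone.

```python
# Minimal sketch of ordered (Bayer) dithering: trading spatial resolution
# for perceived tone depth on a device that can only put down 1-bit dots.

BAYER_2x2 = [[0, 2],
             [3, 1]]  # threshold indices for a 2x2 cell

def dither_row(grey_row, y):
    """Map 8-bit grey samples to black(0)/white(1) dots, one row at a time."""
    out = []
    for x, g in enumerate(grey_row):
        threshold = (BAYER_2x2[y % 2][x % 2] + 0.5) / 4.0   # 0..1
        out.append(1 if g / 255.0 > threshold else 0)
    return out

# A flat mid-grey patch becomes a 50% checkerboard of dots, which the eye
# averages back to grey when the individual dots are too small to resolve:
patch = [[128] * 8 for _ in range(4)]
for y, row in enumerate(patch):
    print("".join("#" if dot else "." for dot in dither_row(row, y)))
```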
 
You claim that 600x600 dpi laser printing is the equivalent of 212 pixels per inch black and white on a screen? Do you have anything that supports that claim?
"I'm sorry, I clearly mispoke. I should have mentioned I was speaking strictly of black and white images. I realize now in the context of the discussion that has lead up to this point how someone might think I might be trying to apply the analogy to coloured images, since we were previously talking about using the screen in real world applications, things such as watching movies and playing video games. I should have been specific."


That's cool. In that case, yes, you would be absolutely correct: a 600dpi printer capable of printing in black and white would print at 600ppi. Unfortunately I've tried several printers around the office and none have a black and white setting to test; the best they can do is greyscale, which has the same problem as colour when trying to make the direct comparison. I did try a greyscale comparison (which should be a more favourable comparison than colour) between 1200 and 600 using a black and white image (involving some circles, lines, and the alphabet), if that makes you feel any better. I can safely say I didn't see a difference at about 25cm. I even shuffled the images and asked people around the office to pick the better one, and they consistently said they were the same.

The analogy fits just fine. In either case, you cannot see the individual pixels at 24cm, but you CAN tell a difference in quality. THAT is the point of my post.
The analogy is flawed because, even if what you were asserting were true, they act on different technologies: the printer has certain "tricks" available to it to make an image look better that simply aren't available on an LCD screen.
 
You claim that 600x600 dpi laser printing is the equivalent of 212 pixels per inch black and white on a screen? Do you have anything that supports that claim?
"I'm sorry, I clearly mispoke. I should have mentioned I was speaking strictly of black and white images. I realize now in the context of the discussion that has lead up to this point how someone might think I might be trying to apply the analogy to coloured images, since we were previously talking about using the screen in real world applications, things such as watching movies and playing video games. I should have been specific."


That's cool. In that case, yes, you would be absolutely correct: a 600dpi printer capable of printing in black and white would print at 600ppi. Unfortunately I've tried several printers around the office and none have a black and white setting to test; the best they can do is greyscale, which has the same problem as colour when trying to make the direct comparison. I did try a greyscale comparison (which should be a more favourable comparison than colour) between 1200 and 600 using a black and white image (involving some circles, lines, and the alphabet), if that makes you feel any better. I can safely say I didn't see a difference at about 25cm. I even shuffled the images and asked people around the office to pick the better one, and they consistently said they were the same.

Go back. Re-read my analogy. It was comparing 300 to 600 dpi.


The difference between 600 and 1200 dpi that you supposedly did this test with would obviously be more difficult to tell apart.

The analogy fits just fine. In either case, you cannot see the individual pixels at 24cm, but you CAN tell a difference in quality. THAT is the point of my post.
The analogy is flawed because, even if what you were asserting were true, they act on different technologies: the printer has certain "tricks" available to it to make an image look better that simply aren't available on an LCD screen.

Now you're arguing 180° opposite of what you were - by the logic of this statement, display technologies would require a -higher- effective resolution.
 
You provided no supporting documentation of your assertions AND they were irrelevant to the analogy that I was making.

I didn't think I had to provide links explaining how printers work. Unless your printer has variable sized dots it can only print dots of one of the colored inks that it has access to. This should be common sense. And most printers only have 4 tones of ink. Here's a link explaining it:


http://www.1960pcug.org/pcnews/2000/03/more_about_%20inkjet_printers.htm


Dithering increases the resolution of perceived samples at the expense of spatial resolution (number of samples). It works so well for printers precisely because the individual dots are beyond the resolution capabilities of the human eye, so the eye, acting as a low pass filter, averages color values within regions, giving the appearance of more color tones. Your own argument seems to be inadvertently stumbling upon this fact, since you point out that the higher resolution printer images look better despite not being able to see dots on either. But if the printer had the ability to output samples beyond the spatial resolution perception of the human eye AND could output each sample beyond the sample differentiation capabilities of the human eye, then increasing the spatial resolution WOULD be pointless.


All this stuff about just using "analogies" falls flat too, because it makes the assumption that more samples per area = better just because it does in the printer example you gave. But the printer example was hinging on the DPI being beyond human resolution capabilities in both cases. So the fact that the effective color depth is much different kills the relevance...

Really. You want to argue over the accuracy of an analogy. Sad.


Ignore color entirely. Consider just on/off at the level of each resolved black/white dot.


Can you tell the difference in quality between a 300dpi black and white laser printer result and a 600 dpi black and white laser printer result at 24cm distance?


If not, please have your vision checked.
 
The reason he's doing 600 vs 1200 is that he's compensating for the lack of color depth, since his printers are ALWAYS applying dithering. He's trying to do something that best approximates ~250ppi vs ~350ppi, and his choice does it better than yours.


WizardStan is actually incorrect about printers having more tricks: you can dither an image on a display too (and it's done all the time), increasing effective color depth at the expense of spatial resolution. You only don't need to do it if your pixels are 8 bits per component to begin with; it's usually done with 16bpp images.


What are you printing that is ONLY sending pure black and pure white pixels to the printer? If it has text, it's anti-aliased and includes more than 1-bit sample resolution, and is therefore invalid for comparison.
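As a hedged illustration of the display-side dithering mentioned above (my own sketch, not code from the post), here is one way an 8-bit colour channel gets squeezed into the 5 bits available per red/blue channel of a 16bpp RGB565 framebuffer: add a small ordered-dither offset before truncating, so the quantisation error is spread over neighbouring pixels and averaged back out by the eye.

```python
# Hedged sketch: ordered dithering while quantising an 8-bit channel down to
# the 5 bits per red/blue channel of a 16bpp (RGB565) framebuffer.

BAYER_2x2 = [[0, 2],
             [3, 1]]

def to_5bit_dithered(value8: int, x: int, y: int) -> int:
    """Quantise one 8-bit sample to 5 bits with a 2x2 ordered dither."""
    step = 255 / 31                                    # one 5-bit step, in 8-bit units
    offset = (BAYER_2x2[y % 2][x % 2] + 0.5) / 4.0 * step
    return min(31, int((value8 + offset) // step))

# A value roughly midway between two 5-bit levels alternates between them,
# so the average over a small area stays close to the original value:
print([to_5bit_dithered(136, x, 0) for x in range(4)])   # [16, 17, 16, 17]
print([to_5bit_dithered(136, x, 1) for x in range(4)])   # [17, 16, 17, 16]
```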
 
So to sum up:


800x480


pros:


-High compatibility with emulated systems


-Tendency to use less power


-Current screen uses this and works well


-Higher power per pixel.


cons:


-Could lead to extra effort for adapting some GNU/Linux applications that assume a higher resolution


-Inability to take advantage of resolutions >640x480 on the increasing library of Windows applications that can be handled by Qemu+VM, Qemu/ARM Wine, and related.


1920x1080


pros:


-Reasonable compatibility with integer scaling, if full screen isn't utilized


-320x240*4=640x480*2=1280x960 (89%)


-320x200*5=1600x1000 (92.5%)


-All applications can fit from a pixel perspective


cons:


-Excessive power use.


-Scaling complexities for almost all applications, since software at this native resolution typically assumes a ~22" screen.


-Very low per pixel processing power.


1280x800/768


pros:


-Increased resolution allows emulating a 800x600 or 1024x768 screen, and otherwise clearing the normal minimum resolution requirement on some applications


-Reasonable integer scaling possibilities for consoles and early PC games:


-320x200*4=1280x800 (100%)


-320x240*3=960x720 (90%)


-Can play 720p HD movies at native resolution


cons:


-Non-integer scaling required for 640x480, thus creating smudgy graphics for a significant number of PC games and converted DVD movies.


-More power use and less power per pixel than 800x480




There is a valid question, as raised earlier in the thread, whether any resolution increase is justified given we're already over 200ppi.
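As a hedged check on the integer-scaling figures in the list above (my own arithmetic; the percentages appear to mean how much of the panel's height the scaled image fills, which is the assumption used here):

```python
# Sketch reproducing the integer-scaling figures quoted in the summary,
# assuming the percentages mean vertical fill of the panel.

def integer_fit(src_w, src_h, dst_w, dst_h):
    """Largest integer scale that fits, plus the resulting vertical fill (%)."""
    scale = min(dst_w // src_w, dst_h // src_h)
    return scale, src_h * scale / dst_h * 100

for src, dst in [((320, 240), (1920, 1080)),   # 4x -> 1280x960, ~89%
                 ((320, 200), (1920, 1080)),   # 5x -> 1600x1000, ~92.6%
                 ((320, 200), (1280, 800)),    # 4x -> 1280x800, 100%
                 ((320, 240), (1280, 800))]:   # 3x -> 960x720, 90%
    scale, fill = integer_fit(*src, *dst)
    print(src, dst, scale, round(fill, 1))
```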
 

You missed a few bits/options.


1920x1080


pros:


Marketing it as a 'High definition HD' and/or 1080p display device.


High enough resolution that non-integer scaling would still look nice.


Direct resolution translation/fit between the handheld and the HDMI out port.


cons:


could be a very expensive screen (bleeding-edge pricing)


questionable availability


1280x720 (left out? Why?)


pros:


Marketing it as a 'True HD' and/or 720p device.


Direct resolution translation/fit between the handheld and the HDMI out port.


High availability - screens of this size and resolution are already appearing in several devices.


Reasonable integer scaling applies - though not as strong as 1280x800.


cons:


Non-integer scaling required for 640x480: x2 and x1.5 to stretch fit.


Don't underestimate the marketing potential of claiming '720p' and '1080p'.
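A small follow-up sketch (my own arithmetic, not from the post) for the 640x480-on-1280x720 case mentioned above: integer scaling tops out at 1x, an aspect-preserving fit needs a non-integer 1.5x (960x720), and a full-screen stretch is the x2 by x1.5 mentioned in the list.

```python
# Sketch of the scaling options for a 640x480 image on a 1280x720 panel.

def scale_options(src_w, src_h, dst_w, dst_h):
    integer = min(dst_w // src_w, dst_h // src_h)   # largest pixel-perfect scale
    aspect  = min(dst_w / src_w, dst_h / src_h)     # aspect-preserving fit
    stretch = (dst_w / src_w, dst_h / src_h)        # full-screen stretch
    return integer, aspect, stretch

print(scale_options(640, 480, 1280, 720))   # (1, 1.5, (2.0, 1.5))
```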
 
Go back. Re-read my analogy. It was comparing 300 to 600 dpi.


The difference between 600 and 1200 dpi that you supposedly did this test with would obviously be more difficult to tell apart.
Go back, re-read everything that has already been posted about how printers work and you'll understand why your analogy is flawed and why I had to adjust it to 600 vs 1200.


Good point about screen dithering, Exophase. I had completely forgotten that was even a thing. Let's use that in the analogy!


Comparing a printer to a video screen is like comparing a monitor capable of ONLY black and white that achieves shades of grey by dithering across multiple pixels to another screen actually capable of lighting up each pixel individually with its own shade of grey at the same resolution. The first one takes more pixels to achieve the same effect, so it would need a higher density at the same resolution to look the same.


"That's why I (eventually) said use just black and white!"


Yes, but as I said I don't have a printer capable of printing in JUST black and white: even a black and white image is still dithered to grey scale, it's just that each black "pixel" is made up of 9 solid "dots". 600 vs 1200 gives an approximate 300 vs 400 ppi.


I seriously have no idea how else to explain how flawed your analogy is.
 
Oh boy, so we really are discussing the marketability of specs. Another poll incoming!
 
oh-no...-not-this-shit-again.jpg
 
There is a valid question, as raised earlier in the thread, whether any resolution increase is justified given we're already over 200ppi.

Some increase in resolution is needed to maintain the current ppi if the screen size is going to be larger (e.g. 5" instead of 4.3"). Also it will be easier to find suppliers of high-res screens.


So imo, the P2 should be higher res than 800x480.


Anything higher res than 1280x800 is overkill and would probably do more harm than good in terms of power and cpu/gpu consumption.


Something like 1024x600 would be ideal in my opinion, but too bad that doesn't seem to be a common resolution.


In the end, the question will most likely be decided by whatever is available with good specs at a reasonable price - we're not going to make our own screens, we'll have to be happy with something that is already used in some other product. Currently it looks like 1280x800 and 1280x720 are good candidates. I would prefer 1280x800, since I like an aspect ratio that is not too wide. Wide aspects are great for movies, but I don't want to watch movies on tiny screens anyway. For anything else (emulation, desktop applications, web browsing, watching TV series, etc.), it's better to be closer to 4:3.
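For reference, a quick geometry check (my own, not from the post) of the ppi figures being discussed, assuming a 5" diagonal for the P2 candidates and the 4.3" of the current 800x480 screen:

```python
# Pixels-per-inch for the candidate resolutions, from simple geometry:
# ppi = diagonal resolution in pixels / diagonal size in inches.

from math import hypot

def ppi(width, height, diagonal_inches):
    return hypot(width, height) / diagonal_inches

print(round(ppi(800, 480, 4.3)))    # ~217  (current screen, "over 200ppi")
print(round(ppi(800, 480, 5.0)))    # ~187
print(round(ppi(1024, 600, 5.0)))   # ~237
print(round(ppi(1280, 720, 5.0)))   # ~294
print(round(ppi(1280, 800, 5.0)))   # ~302
print(round(ppi(1920, 1080, 5.0)))  # ~441
```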
 
Something like 1024x600 would be ideal in my opinion, but too bad that doesn't seem to be a common resolution.

Maybe not for the handheld market, but for netbooks 1024x600 is the most common from what I've seen. I think we share a common vein with them in portable computing, so in my eyes this is the proper next step.


Granted, that fact may change by 2014, but over the span of almost 5 years starting with the eee 701 in 2007 at 800x480, netbook screen resolutions only bumped up slightly to 1024x600, with a few higher end ones at 1366x768.


I own both the eee 701 and an Acer Aspire One from just about a year ago, and that slight increase makes all the difference in software usability. Every level beyond what we need, we pay for in all the ways mentioned. I'm not saying we can't go higher, just that we can take guidance from other small computers on what is optimal.


EDIT: Well, netbook sales are on the decline, so maybe the common niche in portable computing should have less bearing on our future specs than what I'm suggesting.
 