3DS Emulator and 3D screen for Pandora's Successor?


...

Both options are practically free, without negative consequences for 2D-mode.
Yea, but there is the negative consequence of looking like a total dork wearing those glasses. Also, I've never found polarized glasses, or the 3D effect they produce, to be comfortable.

-God Ginrai
Yep, I'm nearsighted and I already wear a pair of thick glasses and hate it.  Don't want to wear another pair over them :)
 
There is total surround 3D video and audio with direct body controls in the parks outside, so let's just use it. ;)
 
Quick question: Could the 3D effect be done via software?
No. If that were the case, every TV would be 3D
But TVs don't have processing power like computers, right?
Well, assuming you don't use a cable box, there actually is a computer inside the TV that decompresses data and turns it into the video you see. It's basically a graphics card plus a decompression-specific computer. If you use a cable box, though, the TV simply acts as a monitor. But seriously, there is enough power there to process it properly if such a thing were possible.
 
We could cut the frame rate in half and use active shutter glasses. Wired LCD ones are pretty cheap, under $10 apiece.
 
>cut the framerate in half


Dude, no matter what you do, when making a gaming device NEVER intentionally cut the framerate
 
Yes, but it still requires rendering twice per frame: i.e., it cuts the frame rate in half.
 
Ok, half of 120 is 60. 60 is fine. But if you mean half of 60 then that's not fine
I mean half of whatever it can produce. If the game being played on the hardware (be it a P2 or the 3DS) can only do 60 fps, then adding the 3D effect is going to reduce it to 30 frames per second per eye. This is a fact regardless of the type of screen used. If a theoretical Pandora 2 had a lenticular display like the 3DS has, it would still cut the frame rate in half in order to produce the 3D effect (rough numbers in the sketch at the end of this post).

Benefit of a lenticular display similar to that in the 3DS vs the active shutter glasses: assuming the software can push 120 frames per second, those frames can be interlaced so the display itself only needs to refresh 60 times; doesn't need glasses.

Benefit of the active shutter glasses vs lenticular display: can work with any stock LCD, doesn't require double horizontal pixel density or a special overlay; works from any angle, no "magic sweet spot"; rendering without the 3D effect has absolutely no drawback in terms of resolution or frame rate.

If the 3D is a gimmick, the answer seems obvious to me: source a 120hz LCD (or even 90hz, 45fps is the practical limit of human vision) and use the active shutter glasses. It'll be cheaper, customizable (people can choose to pay for the glasses separately or not at all), and the only downside is that you are wearing glasses, but since it's a gimmick you won't be wearing the glasses all that often anyway; the benefits outweigh that one shortcoming. On the other hand, if the 3D effect is going to be a core feature like it is on the 3DS then you'll want to follow in their footsteps and do away with the glasses.
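
To put rough numbers on the frame-rate point above, here's a quick Python sketch; the 120fps budget and the function names are just mine for illustration, not anything pulled from real hardware:

# Rough frame-budget maths for stereo 3D, assuming the hardware can
# render some fixed number of full frames per second (numbers invented).

def per_eye_fps(render_fps):
    # Stereo 3D renders every scene twice (once per eye),
    # so the budget is split in half.
    return render_fps / 2

def panel_refresh_rate(render_fps, method):
    if method == "shutter":
        # Shutter glasses show left and right on alternate refreshes,
        # so the panel refreshes once per rendered frame.
        return render_fps
    if method == "lenticular":
        # A lenticular screen interleaves both eyes' columns into one
        # image, so a single refresh carries a full stereo pair.
        return render_fps / 2
    raise ValueError(method)

budget = 120  # what the software can push, hypothetically
print("per-eye fps in 3D:", per_eye_fps(budget))                      # 60.0
print("shutter panel refresh:", panel_refresh_rate(budget, "shutter"))        # 120
print("lenticular panel refresh:", panel_refresh_rate(budget, "lenticular"))  # 60.0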
 
45hz is human limit? Then how is it when I compare my 60" 120hz TV with my 20" 240hz TV, I can clearly see the flickering with the big TV but not at all with the small TV?
 
45hz is human limit? Then how is it when I compare my 60" 120hz TV with my 20" 240hz TV, I can clearly see the flickering with the big TV but not at all with the small TV?
Because you accidentally catch the change between the current and the next pic.

The chance for this is reduced with a higher refresh rate.
 
45hz is human limit? Then how is it when I compare my 60" 120hz TV with my 20" 240hz TV, I can clearly see the flickering with the big TV but not at all with the small TV?
Possibly lots of reasons.

From the biological side, our eyes are analogue. It takes the cells about 22ms to completely transition from one colour to another: i.e., if you look at a screen that is red and it suddenly changes to blue, it takes 22ms before your eyes have completely changed to recognize blue. But your brain is still processing the colour your eyes are seeing, so for that 22ms your brain sees a scale going from red to blue. If the screen is displaying a constant colour the transition will be perfect.
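
Not a serious model, but here's a tiny Python sketch of what I mean by that 22ms ramp (the linear blend is just for illustration):

# Toy model of the ~22ms transition: the perceived colour ramps from the
# old colour to the new one instead of snapping instantly.

def perceived(old, new, t_ms, transition_ms=22.0):
    # Linearly blend from old to new over the transition time.
    f = min(t_ms / transition_ms, 1.0)
    return tuple(round(o + f * (n - o)) for o, n in zip(old, new))

red, blue = (255, 0, 0), (0, 0, 255)
for t in (0, 5, 11, 16, 22, 30):
    print(f"{t:2d} ms after the switch:", perceived(red, blue, t))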

On the hardware side, the way it refreshes makes a huge difference as well. If it's an LED or plasma TV, these display images with charged particles, and over time they lose their charge: it may be displaying an image for 22ms (or faster) before switching to the next, but towards the end of that 22ms it is losing its strength and getting darker, and your eyes start to adjust accordingly. This will happen with faster displays as well: if it displays a new image every 11ms then your eyes "see" less of the drop in strength towards the end of each frame, but it's still there and your brain goes "wait, did it just get fractionally dimmer for 1/100th of a second?". Looking at one screen you don't actually notice, but if you compare side by side you can say "this one is getting fractionally dimmer than the other one".
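
Throwaway Python sketch of that droop; the decay rate is completely made up, it's only there to show why a shorter hold period is harder to spot:

# Toy model of brightness droop within one refresh: the pixel is driven
# at the start of the hold period and fades slightly until the next one.

def brightness_over_hold(hold_ms, decay_per_ms=0.01, samples=4):
    # Relative brightness sampled across a single hold period.
    return [round((1 - decay_per_ms) ** (hold_ms * i / samples), 3)
            for i in range(samples + 1)]

print("22 ms hold:", brightness_over_hold(22))  # droops further before the refresh
print("11 ms hold:", brightness_over_hold(11))  # droops less, so it's harder to spot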

There's also interpolation being done: TV shows are 30/60fps, and in order to display at higher frame rates the TV actually calculates and inserts frames in between each one. Personally I hate this effect, it makes things look... weird (it's called the soap opera effect, it's a real thing), but that seems to be the way things are going as they push for faster and faster TVs. Anywho, the point I was getting at is that even when using LCD, or if the timing on the LED TV is absolutely perfect so there's no degradation between frames, the fact that it is inserting extra frames means that there will be colour change on the 240hz TV where there isn't on the 120hz TV, and even though it is such a tiny, tiny fraction of a second and your eyes cannot possibly perceive the full change, the cones in the retina still take some small step towards the correct colour, which your brain might interpret.
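
Here's a naive Python sketch of where those in-between colours come from; real TVs do motion estimation rather than a plain blend, so treat this as illustration only:

# Naive frame interpolation: invent frames between two real ones by
# blending them. A plain blend is enough to show that the inserted
# frames contain colours that were never in the source material.

def blend(frame_a, frame_b, t):
    # Pixel-wise linear blend between two frames, t in [0, 1].
    return [[tuple(round(a + t * (b - a)) for a, b in zip(pa, pb))
             for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two 1x1 "frames": a red pixel followed by a blue pixel (60fps source).
f0 = [[(255, 0, 0)]]
f1 = [[(0, 0, 255)]]

# Going from 60 to 240hz means three synthetic frames between each real pair.
for i in range(1, 4):
    print(f"inserted frame {i}:", blend(f0, f1, i / 4))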

Eyes are like film (old school camera film that needs to be exposed to light) that is self-regenerating: the longer it is exposed to something, the brighter that image becomes.

If that's still not making any sense, think of this: you know those video effects where it retains a few frames and then overlays them, discarding the earliest frame so it always has the same length of ghosting? Imagine taking one frame every 1/2 a second for 22 seconds and overlaying them all in a continuous video. Each frame that you see, with its ghosting image of past frames, is what your eye is seeing and your brain interpreting every 1/45th of a second. Except because it's every 1/45th of a second instead of every 22 seconds there's obviously a lot less ghosting. Faster displays intentionally insert frames between those 1/45ths; they introduce ghosting because, at such high refresh rates, the ghosting actually helps your eyes transition. You aren't seeing the refresh rate, you're seeing the artificially created frames between the actually recorded frames ghosting in your vision, and whether that's a good or bad thing depends entirely on your opinion.
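
If it helps to picture the overlay, something like this in Python (single-pixel "frames", equal weights, all invented):

from collections import deque

# Sketch of the overlay/ghosting effect: keep the last few frames and
# average them, discarding the earliest so the trail stays the same length.

def ghosted(history, new_frame, depth=4):
    history.append(new_frame)
    if len(history) > depth:
        history.popleft()  # drop the earliest frame
    # Average the retained frames channel by channel (1x1 frames here).
    return tuple(round(sum(ch) / len(history)) for ch in zip(*history))

frames = deque()
for colour in [(255, 0, 0), (255, 0, 0), (0, 0, 255), (0, 0, 255), (0, 0, 255)]:
    print(ghosted(frames, colour))  # red lingers as a fading trail after the cut to blue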

Incidentally, the reverse is how you get ghosting on a screen: the screen is given a new image every X ms, but it takes some length of time Y, which is greater than X, to transition from one colour to the next. So if it is told to go from blue to red, the blue pixel slowly turns off while the red pixel slowly turns on, requiring Y ms but only getting most of the way there after X ms before it has to change course: the effect to our eyes is a light purple pixel, though very close to red. Do this over enough full images and the ghost effect becomes very noticeable.
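
One last toy Python sketch of that, with X and Y as made-up numbers: the pixel keeps chasing its target but never finishes before the next frame arrives.

# Toy model of display ghosting: a new target colour arrives every
# frame_ms, but the pixel needs response_ms (> frame_ms) to complete a
# transition, so it only gets part-way there each time.

def advance(current, target, frame_ms=8.3, response_ms=16.0):
    # The pixel covers frame_ms worth of a response_ms-long transition.
    f = min(frame_ms / response_ms, 1.0)
    return tuple(round(c + f * (t - c)) for c, t in zip(current, target))

pixel = (0, 0, 255)            # starts out blue
for frame in range(1, 4):      # told to show red for the next three frames
    pixel = advance(pixel, (255, 0, 0))
    print(f"after frame {frame}: {pixel}")  # drifts toward red, leaving a purplish trail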
 