Pyra maximum resolution on external monitor


edlee

What is the maximum resolution that can be used on an external monitor that is connected to a Pyra? The Pyra could be used as a desktop computer replacement if it supports 1920x1080 pixel resolution on an external monitor.
 
Is there a way to force it to stay 720p even on an external monitor, so interfaces and such don't change size?
 
According to the OMAP5 reference manual, the HDMI output can be set to resolutions as high as 2560x1600, as well as more standard ones like 1920x1200 and 1920x1080.
 
Presumably selecting such high resolutions will have a performance tradeoff though, as graphics ops will have to traverse more pixels to draw the same image scaled up.
 
Presumably selecting such high resolutions will have a performance tradeoff though, as graphics ops will have to traverse more pixels to draw the same image scaled up.


For gaming you mostly won't need the higher resolution; just use 720p and let the display do the scaling.
For desktop work not that much performance is needed. For web browsing, LibreOffice and the like you shouldn't notice a big difference, I suppose.
And as we've seen on the devboard, 1920x1080 video playback is already possible.
 
According to the OMAP5 reference manual, the HDMI output can be set to resolutions as high as 2560x1600, as well as more standard ones like 1920x1200 and 1920x1080.
Which again makes me want a docking station that supports two HDMI outputs. At least the pixel sum suggests it could be possible to drive two 1680x1050 monitors, which would make a killer desktop for such a small device...
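The arithmetic behind that "pixel sum" remark, for what it's worth:

```python
# Two 1680x1050 panels vs. the highest single mode mentioned above.
dual = 2 * 1680 * 1050   # 3,528,000 pixels
single = 2560 * 1600     # 4,096,000 pixels
print(dual, single, dual <= single)  # the dual setup needs fewer pixels
```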
 
According to the OMAP5 reference manual, the HDMI output can be set to resolutions as high as 2560x1600, as well as more standard ones like 1920x1200 and 1920x1080.
What reference manual are you referring to?
[1] doesn't mention anything about the maximum resolution. However, it links to [2], which links to [3]. [3] merely states that the HDMI 1.4 data rate is supported. In theory that would be enough to run a 4K display at 30 Hz, but it also depends on color depth and available graphics memory.

Bottom line: I'd guess that 2560x1600 (and even more) would work, but it's just a guess because I don't have all the necessary info.
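A rough pixel-clock check of that guess (a minimal sketch: the 340 MHz limit is HDMI 1.4's single-link TMDS clock, the 1.12 blanking factor is just my approximation of reduced-blanking timings):

```python
# Estimate the pixel clock a mode needs and compare it to HDMI 1.4's
# 340 MHz TMDS clock limit (at 8 bits per color channel).
MAX_TMDS_MHZ = 340.0
BLANKING = 1.12  # rough CVT reduced-blanking overhead (my assumption)

def pixel_clock_mhz(w, h, hz):
    return w * h * hz * BLANKING / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1600, 60), (3840, 2160, 30)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= MAX_TMDS_MHZ else "too fast"
    print(f"{w}x{h}@{hz}Hz -> {clk:.0f} MHz ({verdict})")
```

All three modes fit, which matches the 4K-at-30Hz estimate above.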

Which again makes me want a docking station that supports two HDMI outputs.
That would be hard. HDMI doesn't support daisy-chaining. In theory one could probably build a splitter that announces a resolution of 2x*y (or x*2y), cuts the image in the middle and outputs each x*y half on a separate port. To the computer the splitter would have to look like one big screen. But I've never seen an HDMI splitter actually do that; all splitters I've seen merely clone the input image.
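Per frame, such a splitter would conceptually do something like this (just a sketch using the monitor numbers from above):

```python
# Take one wide frame, as announced to the source, and cut it in the
# middle; each half would then go out on its own HDMI port.
import numpy as np

W, H = 1680, 1050
frame = np.zeros((H, 2 * W, 3), dtype=np.uint8)  # one 3360x1050 RGB frame

left, right = frame[:, :W], frame[:, W:]
assert left.shape == right.shape == (H, W, 3)
```

The hard part is obviously not the cutting, but doing it in hardware at pixel-clock speed while presenting a single consistent EDID to the source.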

I think your best bet would be to use a USB-to-HDMI converter. Due to the required video compression, USB 2 converters tend to work okay-ish for mostly static content (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.
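Some rough numbers behind the compression point (the 35 MB/s figure is my assumption for typical real-world USB 2.0 throughput; the bus tops out at 60 MB/s on paper):

```python
# Uncompressed 1080p60 at 24-bit color vs. what USB 2.0 can move.
usb2_mb_s = 35
raw_mb_s = 1920 * 1080 * 3 * 60 / 1e6  # bytes per second, in MB
print(f"raw video: {raw_mb_s:.0f} MB/s -> needs roughly "
      f"{raw_mb_s / usb2_mb_s:.0f}:1 compression")
```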


[1] http://www.ti.com/lit/ug/swcu130/swcu130.pdf
[2] http://www.ti.com/product/tpd12s016
[3] http://www.ti.com/product/TPD12S016/datasheet
 
What reference manual are you referring to?

The 6000-page OMAP543X_public_Technical_Reference_Manual.pdf, the one you need to sign your life away to the US government to get access to.
 
I think your best bet would be to use a USB-to-HDMI converter. Due to the required video compression, USB 2 converters tend to work okay-ish for mostly static content (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.

I used one with my Pandora to output a native 1920x1080.
It was not the best experience, but I preferred it over the low-res TV-out cable.

Of course it was only usable for a few office tasks, no gaming etc.
But mouse movement and the like was fluid.
 
I think your best bet would be to use a USB-to-HDMI converter. Due to the required video compression, USB 2 converters tend to work okay-ish for mostly static content (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.
Those ain't "converters": they have a fully functional GPU that requires its own driver, usually something from DisplayLink. Their USB 3 chips introduced some kind of data encryption, which knocked their Linux support back into the Stone Age.
 
I think your best bet would be to use a USB-to-HDMI converter. Due to the required video compression, USB 2 converters tend to work okay-ish for mostly static content (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.

Would the Numato Opsis work? You could expect the same quality from both HDMI ports (the ones that do output, that is).

I don't know if it would provide a lag-free gaming experience out of the box, but it's an FPGA board, so a lag-free experience is possible.
 
Those ain't "converters": they have a fully functional GPU that requires its own driver
In my book, everything that actively converts one protocol to another is a "converter". Everything passive that merely translates one plug to another mechanically (and electrically), without changing the protocol, is an "adapter".
According to that, a GPU is a converter. In this case it converts USB to HDMI.

That being said, I'd keep my hands off polyphase/Gardena adapters. But I'd totally buy polyphase/Gardena converters, just for the ingenuity of the design. ;)

usually something from DisplayLink.
Well, that sounds bad (some links: [1][2]). I've only ever seen those converters attached to Windows machines.

Their USB 3 chips introduced some kind of data encryption, which knocked their Linux support back into the Stone Age.
Doesn't seem like much of a setback. According to [1] they barely left the tool age with their USB 2 support.

Would the Numato Opsis work?
I doubt that. [3] says the firmware currently supports 1024x768 and 720p (1280x720). That said, the basic concept might have potential. But I don't think they even thought of using their device as an extending HDMI splitter (which would have to be considered when writing the firmware). If someone made them aware of that application, a potential version 2 might be able to do it.


[1] https://displaylinklinuxdriver.wordpress.com/
[2] http://plugable.com/2014/03/06/displaylink-usb-2-0-graphics-adapters-on-linux-2014-edition
[3] https://hdmi2usb.tv/faq/
 
In my book, everything that actively converts one protocol to another is a "converter". Everything passive that merely translates one plug to another mechanically (and electrically), without changing the protocol, is an "adapter".
According to that, a GPU is a converter. In this case it converts USB to HDMI.
The difference is that there is no standard protocol to convert from: the USB HID protocol family doesn't include support for display controllers, and the DL chips don't just display a given framebuffer either (that would be way too inefficient over USB 2.0 anyway). The DL chips process data instead of just converting it, and they need instructions on how to do so; there are registers to be filled. Hence it is an active device. A converter would act independently; the source shouldn't have to be aware of something that is converting its output. Ironically, those DL-based products are commonly called adapters.

How would you call a converter which converts a protocol that wouldn't even exist without the converter? :p

Well, that sounds bad
Well, AFAIK the free drivers for the USB 2.0 chips do work quite well: there's full KMS and DMA-Buf support, you just need a driver for your main GPU that is able to leverage that, and then it's all plug in and do your stuff. However, that's obviously not the case with the proprietary SGX drivers, so you're left with Xinerama and friends, which is a general issue when you're working with multiple GPUs.
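For anyone wanting to check whether the free KMS driver actually grabbed their device: DisplayLink USB 2.0 chips are handled by the kernel's udl driver. A quick sysfs peek (my sketch, assuming the standard /sys/class/drm layout):

```python
# List each DRM card and the kernel driver bound to it; a DisplayLink
# USB 2.0 device driven by the free KMS driver shows up as "udl".
import os, glob

for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
    driver_link = os.path.join(card, "device/driver")
    driver = os.path.basename(os.path.realpath(driver_link))
    print(os.path.basename(card), "->", driver)
```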
 
This one should also support the 5.x Devices (i.e. USB 3)
It's not a driver in the traditional sense; IIRC it's totally useless unless you want to write a program that directly and exclusively accesses its framebuffer. Also: it's binary only, and ARM is not supported.
 