Presumably selecting such high resolutions will have a performance tradeoff though, as graphics ops will have to traverse more pixels to draw the same image scaled up.
> Reading the OMAP5 Reference manual the HDMI output can be set to resolutions as high as 2560x1600 as well as more standard ones like 1920x1200 and 1920x1080

Which again makes me want a docking station that supports two HDMI outputs. At least the pixel sum suggests that it could be possible to drive two 1680x1050 monitors, which would make a killer desktop for such a small device...
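For what it's worth, the pixel sum does work out. A rough sketch of the arithmetic, assuming the 2560x1600 maximum is roughly a per-frame pixel budget rather than a hard list of supported modes:

```python
# Back-of-the-envelope pixel budget, assuming the 2560x1600 maximum
# roughly reflects how many pixels the display pipeline can push per frame.
max_pixels = 2560 * 1600        # 4,096,000 pixels
dual_1680  = 2 * 1680 * 1050    # 3,528,000 pixels for two 1680x1050 panels
print(dual_1680 <= max_pixels)  # True: two such monitors fit in the budget
```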
> Reading the OMAP5 Reference manual the HDMI output can be set to resolutions as high as 2560x1600 as well as more standard ones like 1920x1200 and 1920x1080

What reference manual are you referring to?
> Which again makes me want a docking station that supports two HDMI outputs.

That would be hard. HDMI supports no daisy-chaining. In theory one could probably build a splitter that announces a resolution of 2x*y (or x*2y), cuts the image down the middle and outputs each x*y half to a separate port. To the computer the splitter would have to look like one big screen. But I've never seen an HDMI splitter actually do that. All splitters I've seen merely clone the input image.
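Purely to illustrate the splitting idea (hypothetical, nothing like this seems to exist): the box would advertise one doubled-width mode and then crop each incoming frame into two halves, one per output. A minimal sketch of the cropping step:

```python
def split_frame(frame, width, height, bytes_per_pixel=3):
    """Split one (2*width) x height frame of raw pixel rows into a left and
    a right width x height frame, one for each output port.
    Purely illustrative; real hardware would do this per scanline."""
    row_bytes = 2 * width * bytes_per_pixel
    half = width * bytes_per_pixel
    left, right = bytearray(), bytearray()
    for y in range(height):
        row = frame[y * row_bytes:(y + 1) * row_bytes]
        left += row[:half]      # left monitor gets the first x pixels
        right += row[half:]     # right monitor gets the rest
    return bytes(left), bytes(right)
```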
> What reference manual are you referring to?
The 6000-page OMAP543X_public_Technical_Reference_Manual.pdf, the one you need to sign your life away to the US government to get access to.
I think your best bet would be to use a USB/HDMI converter. Due to the required video compression, USB 2 converters tend to work okayish for mostly static content (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.
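To put rough numbers on why the compression is required (a back-of-the-envelope sketch, assuming 24 bpp and ignoring blanking intervals and protocol overhead):

```python
# Rough numbers behind the "required video compression" point.
width, height, fps, bytes_per_px = 1920, 1080, 60, 3
uncompressed = width * height * bytes_per_px * fps * 8   # bits per second
usb2_raw = 480e6   # USB 2.0 signalling rate; real throughput is far lower
usb3_raw = 5e9     # USB 3.0 (5 Gbit/s) signalling rate
print(uncompressed / 1e9)        # ~2.99 Gbit/s for uncompressed 1080p60
print(uncompressed / usb2_raw)   # ~6.2x more than USB 2.0 can even signal
print(uncompressed / usb3_raw)   # ~0.6x of USB 3.0, so it fits in theory
```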
Those ain't "converters", they have a fully functional GPU that requires its own driver - usually something from DisplayLink. Their USB 3 chips introduced some kind of data encryption, which fired their Linux support back into Stone Age.I think your best bet would be to use a USB/HDMI converter. Due to the required video compression USB2 converters tend to work okayish for mostly static contents (like office desktops), but produce no satisfactory results when dynamic content (videos, games) is involved. USB 3.0 converters could potentially solve that due to their higher data rate, but I've never seen one in action.
> That's the public one? What do you have to sign away for the private one then, your soul?

Your bank account, I think.
> Those ain't "converters", they have a fully functional GPU that requires its own driver

In my book, everything that actively converts one protocol to another is a "converter". Everything passive that merely translates one plug to another mechanically (and electrically), without changing the protocol, is an "adapter".
> usually something from DisplayLink.

Well, that sounds bad (some links: [1][2]). I've only ever seen those converters attached to Windows machines.
> Their USB 3 chips introduced some kind of data encryption, which knocked their Linux support back into the Stone Age.

Doesn't seem like much of a setback. According to [1] they barely left the tool age with their USB 2 support.
> Would the Numato Opsis work?

I doubt that. On [3] they say that the firmware currently supports only 1024x768 and 720p (1280x720). That said, the basic concept might have potential. But I don't think they even thought of using their device as an extending HDMI splitter (which would have to be considered when writing the firmware). If someone made them aware of that application, a potential version 2 might be able to do it.
> In my book, everything that actively converts one protocol to another is a "converter". Everything passive that merely translates one plug to another mechanically (and electrically), without changing the protocol, is an "adapter".

The difference is that there is no standard protocol to convert from: the USB HID protocol family doesn't include support for display controllers, and the DL chips don't just display a given framebuffer either (that would be way too inefficient on USB 2.0 anyway). The DL chips process data instead of just converting them, and they need instructions on how to do so (there are registers to be filled), hence it is an active device. A converter would act independently; the source shouldn't have to be aware of something that is converting its output. Ironically, those DL-based products are commonly named adapters.
According to that, a GPU is a converter. In this case it converts USB to HDMI.
> Well, that sounds bad

Well, AFAIK the free drivers for the USB 2.0 chips do work quite well, though - there's full KMS and DMA-Buf support; you just need a driver for your main GPU that is able to leverage that, and then it's all just plug in and do your stuff. However, that's obviously not the case with the proprietary SGX drivers, so you're left with Xinerama and friends - which is a general issue when you're working with multiple GPUs.
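As a hypothetical example of the "plug in and do your stuff" part under X: with a KMS-capable main GPU driver you can slave the DisplayLink output to it via xrandr's provider offloading. The provider indices and output names below are placeholders; check `xrandr --listproviders` and `xrandr` on the actual machine first.

```python
# Hypothetical sketch: slave a DisplayLink output to the main GPU under X.
import subprocess

subprocess.run(["xrandr", "--listproviders"], check=True)
# Let provider 1 (assumed to be the DisplayLink device) scan out images
# rendered by provider 0 (assumed to be the main GPU).
subprocess.run(["xrandr", "--setprovideroutputsource", "1", "0"], check=True)
# Enable the new output next to the internal panel (output names are examples).
subprocess.run(["xrandr", "--output", "DVI-I-1-1", "--auto",
                "--right-of", "LVDS-1"], check=True)
```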
> This one should also support the 5.x Devices (i.e. USB 3)

It's not a driver in the traditional sense; IIRC it should be totally useless unless you want to write a program that directly accesses its framebuffer exclusively. Also: it's binary only, and ARM is not supported.
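For anyone wondering what "directly accesses its framebuffer" means in practice, here is a minimal hypothetical sketch. The device node, resolution and pixel format are assumptions; a real program would query them via the FBIOGET_VSCREENINFO ioctl instead of hardcoding them.

```python
# Minimal sketch: write raw pixels into the fbdev node a driver exposes,
# bypassing X/KMS entirely. All parameters below are assumptions.
import mmap, os

WIDTH, HEIGHT, BYTES_PER_PX = 1280, 720, 4        # assumed XRGB8888 mode
fb = os.open("/dev/fb1", os.O_RDWR)               # /dev/fb1: assumed USB display
buf = mmap.mmap(fb, WIDTH * HEIGHT * BYTES_PER_PX)
buf[:] = bytes([0x40, 0x40, 0x40, 0x00]) * (WIDTH * HEIGHT)  # fill with grey
buf.close()
os.close(fb)
```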