Not everything is as fast as planned


I got the same email. There were two with 4G/LTE when I checked, but I didn't order one.
 
No. The M4s are, afaik, 200 MHz without any SIMD. They aren't useful for much of anything that doesn't require realtime response. As said a billion times already on these boards, they're not high performers, be it computation or just raw bandwidth.

What are the M4s good for anyway?
 
What are the M4s good for anyway?



Background tasks that need to be really low power and/or low latency and coordinating other parts of the SoC like the video stuff.


If that sounds vague it's because I doubt it'll get any real use on Pyra.
 
What are the M4s good for anyway?

Well, let's put an e-ink display on top of the lid, put most of the OS to suspend (together with the 2 other cores) and have some really "spartan" parts running on them (like calendar, contacts and MP3s).


That will run for days.....


(Honestly: I was thinking of taking the CPU board of the Pyra and building a mainboard that fits inside the old Palm III cases. The battery compartment becomes the connector base, and then do the above scenario. Add the foldable keyboard, use a monochrome LCD/e-ink.... I REALLY would like to do that!)
 
As mentioned, in command mode the SoC only sends a frame whenever the content changes. So if a smartphone shows a dashboard that doesn't change for a minute or so, it doesn't send a stream, which reduces power usage.


So there's no sync coming from the SoC to the display or bridge during that time.


So the display controller (they usually have a framebuffer) has to keep the sync.


When the SoC wants to send one or more frames because the content has been updated, it lets the display/bridge know, which then sends the anti-tear signal (which is basically a 'send the frame now' signal).


With that in mind, it makes sense that the sync comes from the display controller / bridge to the SoC and not the other way round.


The rotator chip we're using is used in millions of Chinese Android smartphones, all using command mode with the anti-tearing signal coming from the chip to the SoC.
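For anyone who prefers code to prose, here's a minimal, purely illustrative C sketch of that flow. Every function in it (content_changed, wait_for_te, push_frame) is a made-up stub, not a real driver API; the point is just the shape of the loop: nothing changed means nothing is sent, something changed means wait for the bridge's TE pulse and then send exactly one frame.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stubs standing in for the real DSI/bridge driver. */
static bool content_changed(void) { static int n; return (n++ % 30) == 0; }
static void wait_for_te(void)     { /* block until the bridge raises TE */ }
static void push_frame(void)      { puts("one frame sent over DSI"); }

int main(void)
{
    for (int i = 0; i < 120; i++) {
        if (!content_changed()) {
            /* Nothing new: the DSI link stays quiet while the bridge's own
             * framebuffer keeps refreshing the panel; that's where the
             * power saving comes from. */
            continue;
        }
        /* Content changed: wait for the 'send the frame now' (TE) signal
         * from the bridge, then transfer exactly one frame. */
        wait_for_te();
        push_frame();
    }
    return 0;
}
```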



Sounds to me like command mode is more power-friendly, and more suitable for anything except 60fps games. When doing something like web browsing or document editing, probably not much more than one or two frames per second are needed on average. If I understand correctly how X11 works, every screen update goes through X, so in principle it knows when the screen needs to be refreshed.


Of course full screen games/emulators may want to bypass X11 for various reasons, so this is only a partial solution. But regardless of the tearing/rotation issue, I think it would be nice to have a command-mode X11 video driver just to improve power consumption.
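As a rough illustration of "X knows when the screen changed": the X Damage extension reports exactly those updates to any interested client. The sketch below just logs damage rectangles; a command-mode driver would instead use such events to decide when to push a frame. It assumes libX11 and libXdamage are available (build with something like `cc damage.c -lX11 -lXdamage`).

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xdamage.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    int ev_base, err_base;
    if (!XDamageQueryExtension(dpy, &ev_base, &err_base))
        return 1;

    /* Track damage on the whole root window; "non-empty" reports are enough
     * to know that something changed and a frame should be pushed. */
    XDamageCreate(dpy, DefaultRootWindow(dpy), XDamageReportNonEmpty);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == ev_base + XDamageNotify) {
            XDamageNotifyEvent *d = (XDamageNotifyEvent *)&ev;
            printf("screen changed: %dx%d at %d,%d -> push one frame\n",
                   d->area.width, d->area.height, d->area.x, d->area.y);
            /* Acknowledge the damage so we get notified again next time. */
            XDamageSubtract(dpy, d->damage, None, None);
        }
    }
}
```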
 
Background tasks that need to be really low power and/or low latency and coordinating other parts of the SoC like the video stuff.


If that sounds vague it's because I doubt it'll get any real use on Pyra.



Might they not be used to monitor the 4G module for phone calls while the rest of the device sleeps? It would be useful, I think, for the Pyra to have a "dumbphone" mode where nothing is powered but the RAM, the 4G module and an M4 processor -- I assume that powering down the main processor would give a significant power saving. Ideally, the M4 would power on everything else when the Pyra gets a phone call (to wake Linux from Suspend, it would need to be a virtual keystroke, or a virtual power-button press, or some such thing).


That way, one could leave the "phone" on all day with confidence that the battery would not run out, and that one was not being wasteful. Unfortunately, one is expected to be contactable at all times, these days.
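Just to make the idea concrete, here's what such an M4 "dumbphone loop" might look like in spirit. Everything in it is hypothetical: nobody has shown yet that the M4s can see the modem or the wakeup lines, and all four helper functions are stubs invented so the sketch compiles.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical hooks, stubbed so this compiles; not a real OMAP5/M4 API. */
static void m4_sleep_until_irq(void)   { /* WFI on real hardware */ }
static bool modem_ring_indicator(void) { static int n; return ++n == 5; }
static void assert_wake_line(void)     { puts("waking the main cores"); }
static void inject_power_button(void)  { puts("virtual power-button press"); }

int main(void)
{
    /* "Dumbphone mode": RAM, the 4G module and this loop stay powered,
     * everything else sleeps. Wake Linux only when a call comes in. */
    for (;;) {
        m4_sleep_until_irq();
        if (modem_ring_indicator()) {
            assert_wake_line();      /* bring the SoC out of suspend...    */
            inject_power_button();   /* ...looking like a key/power press  */
            break;
        }
    }
    return 0;
}
```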
 
Might they not be used to monitor the 4G module for phone calls while the rest of the device sleeps? It would be useful, I think, for the Pyra to have a "dumbphone" mode where nothing is powered but the RAM, the 4G module and an M4 processor -- I assume that powering down the main processor would give a significant power saving. Ideally, the M4 would power on everything else when the Pyra gets a phone call (to wake Linux from Suspend, it would need to be a virtual keystroke, or a virtual power-button press, or some such thing).


That way, one could leave the "phone" on all day with confidence that the battery would not run out, and that one was not being wasteful. Unfortunately, one is expected to be contactable at all times, these days.

Maybe. Google still hasn't given me any hints on how to access these cores. The OMAP5 product bulletin PDF hints that they may be usable in such a manner: http://www.ti.com/pdfs/wtbu/SWCT010.pdf
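For what it's worth, the usual way recent mainline kernels hand firmware to coprocessors like these is the remoteproc framework; whether the OMAP5/Pyra kernel exposes the M4s that way yet is exactly the open question. Purely as a sketch (assuming a kernel new enough to have the remoteproc sysfs interface; the sysfs index and the firmware name below are guesses), starting an M4 would then look roughly like this:

```c
#include <stdio.h>

/* Write a single value to a sysfs attribute. */
static int write_sysfs(const char *path, const char *value)
{
    FILE *f = fopen(path, "w");
    if (!f) {
        perror(path);
        return -1;
    }
    int ok = fputs(value, f) >= 0;
    return (fclose(f) == 0 && ok) ? 0 : -1;
}

int main(void)
{
    /* Both the remoteproc index and the firmware name (looked up under
     * /lib/firmware) are placeholders; the real names depend on the
     * device tree and kernel config. */
    if (write_sysfs("/sys/class/remoteproc/remoteproc0/firmware",
                    "pyra-m4-fw.elf"))
        return 1;
    return write_sysfs("/sys/class/remoteproc/remoteproc0/state", "start")
               ? 1 : 0;
}
```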
 
I would use the command mode for anything that uses X11, and anything that uses more direct rendering such as OpenGL etc. would use video mode. That way we'll only have tearing some of the time. :)
 
I think Wayland updates the screen only when an application notifies it about a graphical change (because programs render directly on the GPU; then the compositing manager reads from RAM and draws on the screen). Isn't that basically the same effect as using X in command mode + that library, but without patching and compiling?


I recently got a TV and as a result I'm now interested in using Wayland for tear-free video playback. KDE Plasma 4.5 and the latest GNOME both have Wayland support : )


About the vsync issue - I preordered and don't care. Give them a bit of free space to work on the important stuff. First, people need to get their devices to start working. Light tearing is nothing in my opinion. Everyone needs to decide for themselves whether to get one in the first batch or to wait until the software is at a more mature level. Buying a Pyra would probably help more to fix that issue than nagging about it and demanding disproportionate delays xD
 
X11 itself doesn't even support VSync for 2D stuff, so why bother about tearing? Or how extreme is the tearing? Can you make a video about that, ED?
 
The problem, as you say, is most likely to be noticeable in full-screen games and the like that use the framebuffer directly, but the Pyra will probably be capable of playing hi-res movie files in a window pretty well.


The tearing is quite likely to be pretty noticeable IMO, as both the OMAP and the rotation chip maintain 60Hz timers for their own vsync purposes.  These drift apart, but over a few seconds or even minutes they will probably be quite stable, even if unsynced - so the vsync line will be visible at a fairly consistent height between frames.
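To put a rough number on "drift apart": the snippet below assumes a 100 ppm mismatch between the two 60 Hz timers and a 720-line visible height, both made-up figures, and works out how fast the tear line would crawl.

```c
#include <stdio.h>

int main(void)
{
    const double frame_hz = 60.0;
    const double mismatch = 100e-6;          /* assumed 100 ppm clock error */
    const double lines    = 720.0;           /* assumed visible lines       */
    const double frame_s  = 1.0 / frame_hz;  /* ~16.7 ms                    */

    /* Phase slips by mismatch * frame period every frame...               */
    printf("phase slip per frame : %.2f us\n", frame_s * mismatch * 1e6);
    /* ...so the tear line moves a tiny, steady amount each frame...       */
    printf("tear line moves      : %.3f lines per frame\n", lines * mismatch);
    /* ...and takes 1/mismatch frames to wander across the whole screen.   */
    printf("full sweep takes     : %.0f s\n", frame_s / mismatch);
    return 0;
}
```

With those numbers the tear line creeps by well under a tenth of a line per frame and takes a couple of minutes to sweep the screen, which is why it sits at a fairly consistent height from one frame to the next.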


The problem might be even worse if there are any apps that don't use double buffering, and rely on flyback or scan time to build certain elements, but I guess that approach is less common these days with different computers being able to get different amounts of stuff done in those periods.


A video would be good, but first of all ED needs a prototype which boots to X, and we're still some way off that last time I heard.
 
so the vsync line will be visible at a fairly consistent height between frames

The obvious solution is to add a tuner dial like on the old CRT TVs.  I remember playing with ours as a kid to mess up the VSync and make the picture just loop around and around and around.  I was never bored.


Seriously though, if the GPU and the rotator maintain distinct vsyncs, is it possible to resynchronize them, maybe once a minute?  Is that a possible solution, or is it too naive? Or is it such that if you're able to do that, you may as well go a step further and fix the driver to share the vsync every frame?
 
The obvious solution is to add a tuner dial like on the old CRT TVs.  I remember playing with ours as a kid to mess up the VSync and make the picture just loop around and around and around.  I was never bored.


Seriously though, if the GPU and the rotator maintain distinct vsyncs, is it possible to resynchronize them, maybe once a minute?  Is that a possible solution, or is it too naive? Or is it such that if you're able to do that, you may as well go a step further and fix the driver to share the vsync every frame?

That's sort of what Nikolaus is planning to do:


http://projects.goldelico.com/p/gta04-kernel/issues/703/
 
The obvious solution is to add a tuner dial like on the old CRT TVs.  I remember playing with ours as a kid to mess up the VSync and make the picture just loop around and around and around.  I was never bored.


Seriously though, if the GPU and the rotator maintain distinct vsyncs, is it possible to resynchronize them, maybe once a minute?  Is that a possible solution, or is it too naive? Or is it such that if you're able to do that, you may as well go a step further and fix the driver to share the vsync every frame?



Crazy thought...  Could an M4 sit behind the scenes, watch the two sync signals, and issue a re-sync command to one or the other when they drift apart?
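To spell that out, and nothing more than that (every function below is a made-up stub, and the next post explains why the M4s may not even see these signals): the M4 would timestamp both sync edges, fold the difference into a phase error, and ask for a resync when it grows too large.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical hardware hooks, stubbed so this compiles. */
static uint32_t last_soc_vsync_us(void)  { return 16700; } /* placeholder */
static uint32_t last_rotator_te_us(void) { return 12000; } /* placeholder */
static void request_resync(void)         { puts("resync requested"); }

#define FRAME_US     16667u  /* nominal 60 Hz frame period        */
#define MAX_SKEW_US   2000u  /* tolerated phase error (arbitrary) */

static void sync_watchdog_poll(void)
{
    uint32_t skew = (last_soc_vsync_us() - last_rotator_te_us()) % FRAME_US;
    if (skew > FRAME_US / 2)      /* fold to the nearest edge */
        skew = FRAME_US - skew;
    if (skew > MAX_SKEW_US)
        request_resync();         /* e.g. poke the display driver */
}

int main(void)
{
    sync_watchdog_poll();         /* on real hardware: run periodically */
    return 0;
}
```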
 
Until we even know how to get the M4s to start running, it's probably a little premature to start allocating work to them.  Although code written for a driver to run periodically can probably run just as well on an M4 if we can figure them out, so there's unlikely to be any work lost even if we do switch to using the M4s somewhere down the line.


Personally I'm dubious that the M4s even have direct access to the GPIOs though, so they won't be able to read the VSync signal from the rotation chip.  They're probably sat behind a very simple MMU.
 