The Open Pandora 2 must use an AMD APU!


What does that have to do with anything I just said? Are you disputing my 50% claim? You act as if you have no need for more CPU power, while using language like "sorta run."


These subjective takes on emulator performance don't do anything for me. I have pretty much zero interest in emulating anything that isn't full speed most of the time, with a modest amount of frameskip. The criteria will be different for different people; it's totally subjective. What isn't subjective is that the faster your CPU is, the more people you will satisfy, and that no one is indifferent to more correct emulation speed.
 
The ARM SoCs possible for Pandora 2, most likely OMAP5 and Exynos 5, will have much faster CPUs than Z-60, yes.


As for the GPU: AFAIK Z-60 has an HD 6250 @ 280MHz with 16 VLIW5 clusters (16 x 5 ALUs x 2 for multiply-add = 160 FLOPS/cycle), 8 TMUs, and 4 ROPs. The SGX544MP2 in OMAP5 is alleged to beat Apple's A5X, so it probably runs at a clock speed well over 500MHz; call it around 2x HD 6250's clock. That chip has 4 TMUs (ROP count is a little nebulous for SGX stuff) and 8 USSE2 pipelines that can each run 8 FLOPS/cycle (dual-issue paired shared-operand FP32 or independent FP16 FMA), so 64/cycle. Scale everything by 2x to normalize against Z-60's clock and you get similar fillrate. AMD has a bit more ALU power, but it's notoriously hard to utilize all of it in VLIW5 format (hence why they've since moved away from it), so I doubt they actually win here. Plus PowerVR has much more flexible thread granularity, and has bandwidth/compute-saving optimizations AMD doesn't, like TBDR (instead you need stuff like depth pre-passes).
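
If you want to sanity check that, here's the back-of-the-envelope version (every input below is one of the assumptions above, not a confirmed spec):

```python
# Back-of-the-envelope throughput from the figures assumed above.
# HD 6250: 16 VLIW5 clusters = 80 ALUs, x2 FLOPS for multiply-add = 160 FLOPS/cycle.
# SGX544MP2: 8 USSE2 pipes x 8 FLOPS/cycle = 64 FLOPS/cycle, at an assumed ~2x clock.

def throughput(flops_per_cycle, tmus, rops, clock_mhz):
    return {
        "GFLOPS": flops_per_cycle * clock_mhz / 1000.0,
        "Mtexels/s": tmus * clock_mhz,
        "Mpixels/s": rops * clock_mhz,  # SGX ROP count is a placeholder guess
    }

hd6250 = throughput(160, tmus=8, rops=4, clock_mhz=280)    # Z-60's HD 6250
sgx544mp2 = throughput(64, tmus=4, rops=4, clock_mhz=560)  # assumed ~2x clock

print(hd6250)     # {'GFLOPS': 44.8, 'Mtexels/s': 2240, 'Mpixels/s': 1120}
print(sgx544mp2)  # {'GFLOPS': 35.84, 'Mtexels/s': 2240, 'Mpixels/s': 2240}
```

Same texel fillrate, AMD about 25% more raw ALU throughput, which is exactly why the VLIW5 utilization question decides it.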


So at first glance I'd call the GPU capability pretty similar.


I don't know enough about Exynos 5's GPU to really comment.


But I still contend that all this talk is moot if the damn thing has too high a TDP rating to fit in a shrunken Pandora. Even AMD is only positioning it for 10mm tablets instead of the thinner Clover Trail and ARM offerings; why do you think that is?


Here's another bad point for this processor in Pandora 2: http://arstechnica.c...dows-8-tablets/


No Linux or Android support, so barring successful third-party integration, that means it has to ship with Windows 8. Not everyone wants Windows 8, and I'm sure people won't appreciate the impact it has on the price tag, especially as OPT probably won't be able to get a competitive OEM rate on it (and no one on the team is very fond of Windows).
 
I wouldn't buy a Pandora 2 if that meant I had to buy a copy of Windows, and if it would only run Windows, I wouldn't touch it even if someone paid me to.
 
Any P2 must run Linux cleanly, and must have huge battery life.


It sounds like these x86 chips can only survive in something with a tablet battery and a port of Win NT


No, thank you.
 
Tell us more about this power consumption test. What are you measuring exactly, 12V load to the APU? What are you using to perform the measurement? Did you log CPU clockspeed and load during all of these activities? What's it like while running that last-gen emulator that "sorta ran"? What game was it, what was the frame rate, what were the settings? Even in this context you could easily end up severely GPU limited. What type of video were you running?


Emulators are not analogous to demanding 3D games; their CPU demand scales much higher relative to GPU demand.


And no matter what you measure the power load to be, so long as a chip is rated for a certain TDP (and the TDP can't be reconfigured), you must design the hardware to accommodate it. You can't have something that works in nice scenarios, then melts when a game starts using 100% of both cores. The only way they can offer anything configured below that is with thermal or power throttling.
 
That's very similar to what the P1 achieves - but why does a little quick googling bring up a TDP of 9W?
 
That's very similar to what the P1 achieves

No it isn't. Pandora 1's battery capacity is about 15.5Wh. If your Pandora lasts 10 hours doing something, that means it's consuming about 1.55W. That's for everything: the SoC, display, and RAM will be the main consumers, plus wifi if you're using it. It also includes waste due to the inefficiency of the power regulators. So the SoC will only be a fraction of that. Meanwhile, monstercameron isn't even accounting for the FCH, which uses about 0.5W any time it's on in a Z-60 system (and probably uses substantially more in his).
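
The arithmetic is trivial to check (15.5Wh is the pack figure; treating that as ~4200mAh at 3.7V nominal is my assumption):

```python
battery_wh = 15.5        # Pandora 1 pack; ~4200mAh * 3.7V nominal is an assumption

print(battery_wh / 10)   # 10 hour runtime -> ~1.55W average for EVERYTHING
print(battery_wh / 3.3)  # a constant 3.3W draw -> only ~4.7 hours on a charge
```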


And this 0.5W-for-video claim is fishy, because AMD themselves claim 1.57W for Z-60 during 720p playback (http://www.tweaktown...load/index.html), and I'm pretty confident they used the best numbers they could. So if monstercameron really measured this, it was probably with a much less demanding video.

- but why does a little quick googling bring up a TDP of 9W?

TDP is a design parameter that's supposed to tell you how much heat the thing can generate under a reasonably high load scenario, a level the thing won't exceed (or at least won't exceed for long). Hence "Thermal Design Power." In other words, it's the specification for how much heat the system needs to be prepared to tolerate, not how little power it can use while doing the things you subjectively think the user most wants to do.


So if Hondo has a TDP of 4.5W your design must be prepared to handle 4.5W coming out from that chip.
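
Or, restated as a design rule (the numbers here are purely illustrative):

```python
# The rule: cooling is sized from the rated TDP, never from power measured
# in a friendly workload. All numbers here are illustrative only.
TDP_W = 4.5

def cooling_is_adequate(cooling_capacity_w):
    # What you happened to measure during video playback doesn't enter into it.
    return cooling_capacity_w >= TDP_W

print(cooling_is_adequate(2.0))  # False: fine in nice scenarios, melts (or
                                 # throttles) when a game pegs both cores
```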
 
I got a 3.3W power draw from my Pandora with the screen, USB, wifi and BT all on full.


If the results described by monstercameron are in a real device, then they are comparable, surely.


If a system under high load can turn 9W of electricity into 9W of heat, yet a demanding 3D game (on the same system) only uses a third of that, then isn't something fishy afoot?
 
Binky, if your Pandora uses 3.3W and lasts not even 5 hours on a full charge, then most of what you're consuming isn't coming from the SoC. How can you call that representative of Pandora use while plugging in "USB", when USB alone can provide 2.5W to an external device (and that's after boosting the battery's 3.7V to 5V)? Are you running a lamp off of it like WizardStan was? And who runs the screen at full brightness with wifi and BT constantly stressed?


10 hours is the real number people actually get while putting the CPU under a real load. Less if they bang on wifi (Pandora uses an old module that consumes a lot; wifi power consumption has gotten a lot better since then).


You have to get this idea out of your head that a "demanding game" necessarily means you're maxing out the chip's power consumption, especially when we don't even know what this demanding game was. If you're looking at a landscape with no enemies on screen while running at 11FPS (you know, like in his Rage video), you might not be using very much CPU time at all. Even the GPU isn't necessarily fully taxed if the game has a low TEX:ALU ratio and is therefore struggling to put those 80 shader ALUs to work, or is constantly stalling on external memory bandwidth.


These numbers are really annoying, yet everyone does it. I got this battery life while browsing the web! Okay, doing what? Browsing the web can mean pretty much anything. I got this battery life while watching a video! With what encoding?? I got this battery life while playing some game! What game? And what was the CPU load for all of this? Windows puts it in a nice little graph, take a screenshot!


Tell us MORE.
 
Thanks for the video, that helps a lot. Although I'd still like to see CPU usage plots (and clock speed) if you can provide that.


The FCH is probably counted in "base", not CPU. Sunspider is strictly single threaded, so if you hit up to 3W while it's running, you can expect at least 4-5W with both cores pegged (depending on what junk is running and how much the uncore is taking).
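
Roughly the extrapolation being made, with the base figure a pure guess on my part:

```python
# Sunspider is single threaded: peak observed there ~= base + one pegged core.
observed_sunspider_w = 3.0  # peak reported while Sunspider runs
base_w = 1.0                # pure guess at the idle/"base" share

one_core_w = observed_sunspider_w - base_w  # ~2W per pegged core
print(base_w + 2 * one_core_w)              # ~5W with both cores pegged, before
                                            # whatever the uncore and junk add
```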


Finally, going from C-50 to Z-60 definitely doesn't mean your average power consumption drops by half. Think about it: it's the exact same CPU and GPU design on the exact same manufacturing process (TSMC 40G); any optimizations they made would be minimal. The TDP gains came from limiting the worst-case scenarios, probably from a) using a more limited memory controller that doesn't handle DDR3, b) cutting the number of PCIe links down, and c) if necessary, more aggressive throttling to maintain the limits in unusual cases. I definitely don't think your processor would normally hit 9W, although the netbook manufacturer probably still had to design for it. 9W definitely does not include the FCH; AMD provides a separate TDP for that (AFAIK it's around 2W for what's used in Ontario).


As I suspected, your video is more DVD quality than "HD". I'd like to know more about what codec was used. For that quality, the bitrate and >2GB file size are pretty big, which makes me think it's not using a very aggressive encoding. People routinely run 720p and 1080p content on mobile devices now, and I expect they'll want to on Pandora 2, so maybe you could do a demo of that. If you need material, there's stuff you can get here: http://www.h264info.com/clips.html
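
For reference, file size follows directly from average bitrate and duration (example numbers only):

```python
def file_size_gb(bitrate_kbps, duration_min):
    return bitrate_kbps * 1000 * duration_min * 60 / 8 / 1e9

# A two-hour film at ~4 Mb/s total comes out well past 2GB:
print(file_size_gb(4000, 120))  # ~3.6 GB
```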


3W for a game on high settings, while wandering around an empty environment and through some scripted cutscenes. Do you have frame rate numbers? CPU utilization? It's really hard for me to get an idea from youtube videos (I often see stuff that looks a lot smoother than it feels to play, because of camera blur and because I don't personally have to interact with it).


All told, is a 2004 game really what you'd call demanding? Do you feel that this surpasses high end games on the best ARM tablet hardware? How about some power consumption tests with the most demanding emulators?
 
using a more limited memory controller that doesn't handle DDR3

It does support DDR3-1066 and DDR3L-1066.

would like to know more about what codec was used. For that quality, the bitrate and >2GB file size are pretty big, which makes me think it's not using a very aggressive encoding

From VLC: MPEG-4 Part 10 AVC1, resolution 1280x544, framerate 23.97, audio MPEG AAC (mp4a), content bitrate 4249 kb/s.

if you need material there's stuff you can get here: http://www.h264info.com/clips.html

Will do an updated, more scientific video... running a battery life test in Ubuntu 12.10 now (mixed usage).

3W for a game on high settings, while wandering around an empty environment and through some scripted cutscenes

I did try Rage at the end... with 2 live AI enemies and cloth physics.

Do you have frame rate numbers? CPU utilization?

Will use the AMD utilization utility to break down CPU and GPU usage.

Do you feel that this surpasses high end games on the best ARM tablet hardware?

You must be joking... Far Cry maxed out beats every mobile game out there!
 
It does support DDR3-1066 and DDR3L-1066.

Okay, I'll have to look at what the specific differences were. I talked about this with someone on S|A once; it was based on some old slides from long before the Z-60 announcement.

You must be joking... Far Cry maxed out beats every mobile game out there!

Yeah, that was a gaffe on my part; I forgot how ridiculous Far Cry was for its time. It's just really hard to discern image quality and settings from your video.


Being so GPU demanding in 2004 further reinforces my belief that it pushes the GPU a lot harder than the CPU. GPU load is easy to scale across utilization vs features; CPU load isn't. If it had used a ton of CPU by 2004 standards, it would really have shut out its target market.
 
New video up soon... there were some oddities: after recalibration the base is using more power (possibly due to switching Windows from power saver to performance), and the CPU update speed felt slower...
 
I'm looking more into how Joulemeter works. I assumed it was directly reading values from internal sensors that measure current on the voltage rails, but that's not the case.


From the manual (http://research.microsoft.com/en-us/downloads/fe9e10c5-5c5b-450c-a674-daf55565f794/usermanual.pdf):

Joulemeter estimates the power usage through a power model that relates the computer resource usage and hardware power state (processor utilization, processor frequency, screen brightness, monitor on/off state, disk utilization) to power drawn. This relationship, known as a power model, is learned using a process called calibration.

The calibration is done either by reading battery charge over time or by taking values from a watt meter. Then it tries to measure the impact that certain activities have on this. The categories CPU, disk, monitor, and base correspond to actual tasks performed during calibration. You can see on page 9 the different measurements used by the power model. You might be able to read them back somehow.
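
So the estimate is basically a learned linear model, something like this (a minimal sketch of the idea; the coefficient names and values are invented here, not Joulemeter's actual internals):

```python
# Minimal sketch of a calibrated linear power model of the kind the manual
# describes. Coefficient names and values are invented for illustration; the
# real tool fits its model during calibration against the battery or watt meter.
def estimated_power_w(cpu_util, cpu_freq_ghz, brightness, disk_util,
                      base_w=1.2, k_cpu=1.8, k_screen=0.9, k_disk=0.4):
    return (base_w
            + k_cpu * cpu_util * cpu_freq_ghz  # CPU term: utilization x frequency
            + k_screen * brightness            # monitor term
            + k_disk * disk_util)              # disk term

# One core pegged at full clock, screen at half brightness, light disk I/O:
print(estimated_power_w(1.0, 1.0, 0.5, 0.1))  # ~3.49W
```

Notice there is no GPU input anywhere in that model, which is the whole problem.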


Since the tool has no way of measuring GPU load, and from what I can tell doesn't try to exercise the GPU during calibration anyway, I don't see how it could possibly measure the true power load of the APU while the GPU is heavily exercised. This leaves me with the impression that 3W is about the value you'd get JUST from heavily utilizing a single CPU core (consistent with the Sunspider numbers).
 
This thread needs more popcorn.


But seriously, thanks Exophase for putting a silly amount of education into your posts. They may be wasted on Mr Cameron, but others such as myself surely enjoy learning from them.


And more seriously do you know what the crazy Germans here put on popcorn? Sugar! Sugar I tell you. The mind reels.
 