possible SoC updates.


@ exo, in a non-mod capacity:
Could you try to sound a little less adversarial towards my comments?
I've probably been doing it myself, so I'll try to avoid turning conversations into standoffs.
 
Friends?
 
---
 
I have seen several people comment on the poor feel of the P1's rubbery keys, so I do think it's worth looking into.
Having said that, the iCP2's keys look like they should be an improvement. If they are, I'm happy to go with something of a similar design.
 
Personally, I'd want something taller than (or as tall as) 768 px, but as you say, resolution is not the most important thing.
 
I said 1000 people because there are 2000 iCP2 Kickstarter backers, so it seems a reasonable number.
I'd like the P2 keyboard to be as good as the P1's d-pad.

Stepping back from this, I'd say that the discussion arose from Craig making premature announcements (SD slots, full HD, Android, etc.).

If ED organises things, I'm happy

I'll drop the 'm' word :)
 
Samsung Exynos 5410 sounds good (I like the implications of big.LITTLE for power consumption, although 4 big cores and 4 little ones is maybe a bit of overkill), and it does seem to have a more reliable future than the TI OMAP5 from the point of view of availability and manufacturer support. I hope it will be slightly outdated by the time P2 production starts, so it is not too expensive. In my opinion, we should aim at something that is very good but not the absolute bleeding edge. It's silly to pay a premium for the best possible SoC if the plan is to sell the device for a couple of years (an eternity) anyway, since after a few months the specs will start to look "old" regardless.

So it's better to make a smart choice of a "mature" SoC that gets good perf/W and is affordable and easily available (by the time P2 production starts) than to choose some SoC that has "the best" specs but is very expensive, hard to get, and maybe has issues because it's so cutting edge.

Not quite sure what we can do with 8 cores, but multi-core seems to be the future anyway, so I guess we will see more (FOSS) software that makes good use of it.
 
There is a reason I posted those SoCs: they have open-source drivers, or drivers in development, and if this is to be a proper Linux system, the P2 should stay far away from the closed-off PVR GPUs. I mentioned the Tegra 3 because it would be cheap and offers decent Linux drivers, good CPUs, a power-saving companion core and a decent GPU, and when running Android it may be possible to run the Ouya environment on it for Android gaming. If not the Tegra 3, then I would say the Snapdragon APQ8064/S600, which rival the AMD E-350: fast CPUs, fast GPUs, built-in wireless radios, and the open-source reverse-engineered Freedreno graphics drivers; the APQ8064 is also a little older, so it might be lower priced.
 
Having open source GPU drivers in development is all well and good, but that doesn't say anything about the suitability of the drivers, just that someone started a reverse engineering initiative. Unless they're actually competitive with the blobs, most people won't use them. And there's no guarantee that just because an RE project started it'll get anywhere anytime soon, or ever for that matter. I know some people will insist on open source drivers no matter how low quality they are, but they are a minority and hardware decisions should not cater to them.

And regardless of third-party RE efforts, all of these GPUs are closed off. Who knows, maybe in the future IMG GPUs will get better RE'd drivers than the others...
 
Would a set of fully open GPU drivers enable proper OpenGL support?

As for CPUs, I'd pick the ones with the most per-core computing power (for the larger cores of big.LITTLE, if relevant) - yes, I know computing power is extremely hard to measure.

Then I'd start considering the number of cores and the GPU power.

For example, I see 4 cores @ 1.5GHz as worse than 2 cores @ 2GHz (assuming the same architecture, memory bandwidth, etc.).
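To put rough numbers on that, here is a minimal Amdahl's-law sketch; the 60% parallel fraction and the clock/core combinations are made-up illustration values, not benchmarks of any real SoC:

```c
/* Rough Amdahl's-law comparison: effective throughput of N cores at clock F
 * when only a fraction p of the workload runs in parallel.  All figures are
 * invented for illustration. */
#include <stdio.h>

/* throughput relative to a single 1 GHz core */
static double effective_speed(double ghz, int cores, double parallel_fraction)
{
    double serial = 1.0 - parallel_fraction;
    return ghz / (serial + parallel_fraction / (double)cores);
}

int main(void)
{
    double p = 0.6; /* assume only 60% of the work parallelises */
    printf("4 x 1.5 GHz: %.2f\n", effective_speed(1.5, 4, p)); /* ~2.73 */
    printf("2 x 2.0 GHz: %.2f\n", effective_speed(2.0, 2, p)); /* ~2.86 */
    return 0;
}
```

With a higher parallel fraction the quad pulls ahead, so which one "wins" really depends on how well the software threads.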
 
Proper OpenGL support, I don't know, but there was a thread on here about some kind of translation layer for OpenGL, or something to that effect. Either way, being able to see and mess with the GPU driver source should help greatly.
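I don't remember the details of that thread, but the rough idea of such a layer is just to map the old desktop GL calls onto the GLES path that mobile GPUs actually expose. A purely hypothetical toy sketch (not code from any real project; it assumes a GLES2 shader program is already bound and its position attribute looked up):

```c
/* Toy illustration of an OpenGL-on-GLES translation layer: buffer up legacy
 * glBegin/glVertex calls and replay them through the GLES2 vertex-array path.
 * Hypothetical sketch only. */
#include <GLES2/gl2.h>

#define MAX_VERTS 4096
static GLfloat verts[MAX_VERTS * 3];
static GLsizei nverts;
static GLenum current_mode;
static GLuint position_attrib; /* assumed: attribute location from the bound shader */

void my_glBegin(GLenum mode) { current_mode = mode; nverts = 0; }

void my_glVertex3f(GLfloat x, GLfloat y, GLfloat z)
{
    if (nverts < MAX_VERTS) {
        verts[nverts * 3 + 0] = x;
        verts[nverts * 3 + 1] = y;
        verts[nverts * 3 + 2] = z;
        nverts++;
    }
}

void my_glEnd(void)
{
    /* modes like GL_QUADS would need converting; GL_TRIANGLES etc. map directly */
    glVertexAttribPointer(position_attrib, 3, GL_FLOAT, GL_FALSE, 0, verts);
    glEnableVertexAttribArray(position_attrib);
    glDrawArrays(current_mode, 0, nverts);
}
```

A real layer would also have to deal with the matrix stack, texturing and so on, which is where most of the work is.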

As for CPUs, a lower frequency should mean a lower voltage, which means lower power consumption, so a quad at 1GHz might use less power than a dual at 2GHz, all things being equal. The only thing is that multithreaded designs often don't adequately spread the workload around, so a dual 2GHz might seem faster with less-than-stellar multithreaded code.
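As a crude illustration of the voltage point (dynamic power scales roughly with C·V²·f per core; the voltages and the capacitance constant below are invented ballpark values, not vendor data):

```c
/* Crude dynamic-power comparison: P ~ C * V^2 * f per core.
 * Voltage and capacitance values are invented for illustration only. */
#include <stdio.h>

static double dynamic_power(double volts, double ghz, int cores)
{
    const double c = 1.0; /* arbitrary switching-capacitance unit */
    return c * volts * volts * ghz * (double)cores;
}

int main(void)
{
    /* a lower clock usually allows a lower voltage, and V enters squared */
    printf("4 x 1.0 GHz @ 0.9 V: %.2f units\n", dynamic_power(0.9, 1.0, 4)); /* 3.24 */
    printf("2 x 2.0 GHz @ 1.2 V: %.2f units\n", dynamic_power(1.2, 2.0, 2)); /* 5.76 */
    return 0;
}
```

So on paper the quad wins on power for the same total GHz, but only if the software actually keeps all four cores busy.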
 
I'm guessing that FLOSS software will be slow to implement efficient multithreading.
Not necessarily, maybe just beginner projects and so forth. A quad core may be better for a heavy-overhead OS like Android or Ubuntu... also, GPU performance is very important, and that is where the Tegra 3 falters and where the Qualcomm APQ8064/S600 would be better.

Aside from that, I still (more quietly) maintain that x86 is the way to go. There's just much more software and compatibility, and it could help bring in less technical gamers compared to an ARM system, especially when it comes to finding ARM-compatible software. The AMD Temash APU is, with the little info out there, competitive power-wise with the Tegra 3; CPU performance should be a little below the Exynos 5250, but GPU power should be competitive with the iPad 4, with the added bonus of OpenGL 4.3 via fglrx and OpenGL 3.1 via Mesa (the open-source drivers).

That means running Xonotic (DarkPlaces-engine games) at 30 fps on low settings, Doom 3 (BFG edition) at 30 fps on low settings, SuperTuxKart at 30 fps on low settings, Modern Warfare at >20 fps on low settings, CS:S at >30 fps, and pretty much every emulator below the PS2 generation (although some 2D PS2 games could run!).

Another use case could be running popular Linux distros on the thing without waiting for an ARM port. Just download the ISO and boot the regular x86_64 version from a pendrive, with no weird flashing procedures: Android, Linux Mint, Windows 8, maybe Mac OS X, Ubuntu.
 
Availability of FLOSS drivers is an important thing for me, but alas the situation is that, afaik, there are not a lot of mobile GPUs (and wifi chips for that matter) that currently have good FLOSS drivers. We have the best chance of getting those if we pick hardware that is popular and widespread, so there is an interest from the FLOSS community to RE that hardware and write the drivers, independently of the Pandora 2 project. In the meantime, I can live with a binary blob or two, provided those blobs work well.

Ease of running existing binaries is not at all important for me, and I don't think that should be a factor at all. Not for the choice of CPU architecture - I don't care about x86 binaries, and also not for the choice of the GPU - I don't care about OpenGL or DirectX support. Raw performance and performance per watt are the only things I care about. It would be nice if the Pandora 2 can be made backwards compatible with P1 PNDs (which would require an ARMv7 compatible CPU), but it would not be a huge problem if it isn't.

Attracting non-technical gamers who want to install Windows and run their x86 games natively is something I'd rather avoid than promote.
 
I think that the situation can only get better (in regards to mobile FLOSS driver support with mobile CPUs).  We all have toofbrushes, I have one...just one toffuss and my cousin.....awww, git over here!  She's open and free...Now that's Floss!  Right?  Okay I was mostly sleeping during that last post by _wb_, but his head is on straight and he's ethical, like a good guy, so let's get behind that floss crap.  I'm gonna look into that stuff fo' real.

Edit1:  I hadn't even considered the possibility of the P1's "library" not being compatible with a future Pandora.  That's just another thing to consider I guess...for the OP Team...do you attempt to establish backwards compatibility throughout the platform base?  Interesting question.
 
Compatibility with a large percentage of Pandora's library on day one of release would be a huge asset, especially if this doesn't go through two years of development hell like the Pandora did, which would give select devs less prep time before release to put together a library.

You could get a lot of working software for an x86 version, but it's going to be a big patchwork of stuff pulled from various repositories and web sites, with a lot of Wine setup thrown in. Trying to run Windows games that need physical media is going to be especially annoying. A big library of software you can get from a single easy-to-use repository on release day is going to make a big difference for a lot of people.
 
I read up a bit on the forthcoming 22nm Silvermont Atom SoC.  Obviously someone is going to make an x86 palmtop computer out of it at some point.  It would probably be an OQO-type tablet thing running Windows 8 and completely ignoring the gaming aspect.

The 22nm and 14nm Atoms, though - those may put x86 in the ARM realm for mobile devices.  I could actually see potential for a handheld with the Pandora form factor and controls using Silvermont, running Debian, Ubuntu or another mainstream Linux release.

It's hard to guess based on future tech though.  The ARM camp has a lot of strengths too.
 
That guy is kicking some serious butt on that driver... less serious... now the Pandora 2 can use an Allwinner SoC!

NOOOOOOOOOOOOOOOOOOOPE
 
I'm still holding out to see the benchmarks and battery life for the quad-core Atom Bay Trail SoCs coming out later this year.


Looks like they will support 64-bit instruction sets, if you believe Intel's chief product officer, who said the following in the linked article:


"Perlmutter made it clear that Intel has ARM directly in its sights these days. Silvermont products based on Intel's x86 architecture will outperform the next wave of ARM-based SoCs in power efficiency and offer support for 64-bit instruction sets."


Source : http://www.pcmag.com/article2/0,2817,2418611,00.asp
 
Doubt it, but we'll see soon enough.
 
"Perlmutter made it clear that Intel has ARM directly in its sights these days. Silvermont products based on Intel's x86 architecture will outperform the next wave of ARM-based SoCs in power efficiency and offer support for 64-bit instruction sets."

Source : http://www.pcmag.com/article2/0,2817,2418611,00.asp
That's actually not a confirmation at all.

Silvermont is the name of the CPU core. It'll come in multiple different SoCs and platforms. Merrifield will be the phone platform, Bay Trail-T the tablet platform. There was never any question that the Silvermont core supports 64-bit; the very first Bonnell core released in 2008 supported 64-bit too. But Intel disabled the feature on all phone and tablet SoCs. One would think that come 2014 they would stop doing this, but leaked Bay Trail-T slides indicate that it'll only run the 32-bit version of Windows 8 - which strongly suggests it won't have 64-bit support. And if the tablet platform doesn't have it, chances are poor that the phone platform will. Note that the phone platform traditionally hasn't supported Windows at all; it's intended exclusively for Android, and as long as that lacks 64-bit support Intel will have even less reason to care.
 
^ I don't know; the context in which he makes the statement is directly in regard to SoCs.  I guess we will see in about three months' time when they start hitting the market.
 