We all know what it CAN do


Indeed, I would like to know why you can't add an accelerometer, magnetometer, and cell phone with add-on peripherals.

Those all have no standardized interface between the software and the peripherals, I believe. Even if you did have a USB stick with an accelerometer on it (which I didn't think existed, by the way), it's the protocol used by the stick that the software must be able to speak in order to access the accelerometer and decode its messages. Is that protocol standard? I don't think it is. Without standardization, all software has to support every peripheral individually, and each one may have its own protocol. And every new peripheral that comes out requires a patch to the software before it can be understood.


This is too much to ask of software developers. They would rather there be an intermediate layer between their software and the hardware, so that the layer gets updated rather than the software. Either that, or some standardization of the hardware. But I doubt that exists yet for accelerometers, magnetometers, and cell phone dongles (most of which are proprietary).


So there is a good case to be made for using built-in hardware. It means you can now standardize around it.


It would also mean you could free up your USB port.

What about the 3DS display do you think is processed by the CPU?

I don't understand the question. I said it's better not to process graphics with the CPU but instead use a graphics processor for that. The CPU would still be utilized more with 3D graphics, since there's more data to process, by the way.

It's just 600MHz that the manufacturer stands behind, but "overclocking" can be a bit of an abstract concept.

I design processors. I get scared every time people try to overclock. Yeah it can be done, but it's not good to go more than, say, 5-10%. When you're talking about increasing clock speed from 600MHz to 800MHz, that's probably too much. Even if you cool it or bring the voltage up, it's still pretty risky.


Processor designers do speed testing really well these days (the "binning" procedure). When they say a chip is good for a certain speed, there's not usually a lot of room left to overclock. In the past it was a lot easier to overclock, simply because the test boards weren't very discerning and test throughput was slow. A lot managed to sneak by. Not much anymore.


Usually you'll see spectacular failures and know that you've gone too far, but sometimes you don't see any problems even though they exist. Not all parts of a chip are designed the same way. In fact, there are hundreds of parts on a chip, and each one gets designed by a different designer. Each designer has his/her own way of designing. So one person's part is a lot more conservative, while another person's part is a lot more risky and sensitive to overclocking. During boot-up or normal usage of the system, you may not be using the riskier parts. It's only later, when an application exercises that part of the chip, that it fails.


I overclocked an Intel x86 processor recently by 20% or so, and everything looked fine until I noticed the files on my hard drive were getting corrupted by single-bit changes over time. Turns out the system bus was being tainted by the CPU glitching onto it. I only noticed it by accident when I was building my own intrusion detection system to tell when a file had changed without me knowing it. I ended up doing a CRC-32 of all my files in their current state vs. the archived versions. Turned out I had corrupted files everywhere. But the corruption was small enough that it went unnoticed for months. Scary stuff.
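

If anyone wants to run the same kind of check, here's a minimal sketch of the idea in Python (the paths and manifest handling are placeholders, not the exact tool I used):

```python
# Minimal sketch: compute a CRC-32 per file and compare against a previously
# saved manifest to flag silent corruption. Paths/manifest are placeholders.
import os
import zlib

def crc32_of(path, chunk_size=1 << 16):
    """Stream the file through zlib.crc32 so big files don't need to fit in RAM."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

def scan(root):
    """Return {relative_path: crc32} for every file under root."""
    result = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            result[os.path.relpath(full, root)] = crc32_of(full)
    return result

def diff(archived, current):
    """Paths whose checksum changed (or that disappeared) since the archive."""
    return [p for p, crc in archived.items() if current.get(p) != crc]
```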


- Steve
 
Ekiga is not Skype, and it's not compatible with Skype.
Indeed it is not Skype, but everyone before you who came looking for Skype was not actually looking for Skype; they were looking for some kind of voice and video communication service, and Skype was all they knew about. You're the first person to request it for whom nothing but Skype will do.

And "instant on" is not what Pandora does. Pandora still requires a lengthy boot-up every time you turn it on. There is an idle/sleep mode you can place it in, the same as laptops. But sleep mode is not the same as a cold start, instant on. Why wait any longer than a second or two for you to be back up and running?
In that case, instant on is not what anything does. There is no device out there currently that does what you think it does. If you've got a device that seems to be powering on from cold in a second or two, it's actually just waking up from some sort of hibernation. You can demonstrate it by forcing a true cold boot: remove the battery for a minute (some devices have a small backup battery to let you swap batteries without losing state) and see what happens when you boot it up then.

Yes, I'm sure the future will kick it over the 1GHz threshold. But by then, I'll want 1.5GHz instead. See, the measuring stick changes as time goes on. The 1GHz number is right now. And Pandora does just 600MHz without overclocking, right?
600Mhz is "overclocking" as far as TI is concerned. 500Mhz is the base. Most people are able to double it. But as Exophase says, "overclock" doesn't really mean the same thing it does for x86.


As far as the future kicking it over 1GHz: we already have higher-than-1GHz processors. My point was that we're already able to overclock above 1GHz without a fan, and since ARM processors above 1GHz (before "overclocking") already exist without needing one, at no point in the future should we expect to require a cooling fan for the processor.

The rest are not reasonably resolved by using add-on peripherals. And we can discuss why if you'd like:
Indeed, I would like to know why you can't add an accelerometer, magnetometer, and cell phone with add-on peripherals.
Me too. A quick Google search shows all kinds of small USB solutions for each.

What about the 3DS display do you think is processed by the CPU?
That's probably my fault for talking about "lag" in the CPU. What I meant was that the GPU is doing the rendering, but the CPU has to keep telling the GPU to render a different eye (hence my "in software" remark) and keep signaling the glasses to switch lenses, which is different from how the 3DS does the whole thing natively in hardware.


And I only brought up the active sync glasses because "3D viewing" was on his list, and this is the only way it could possibly be added with the current hardware.
 
Those all have no standardized interface between the software and the peripherals, I believe. Even if you did have a USB stick with an accelerometer on it (which I didn't think existed, by the way), it's the protocol used by the stick that the software must be able to speak in order to access the accelerometer and decode its messages. Is that protocol standard? I don't think it is. Without standardization, all software has to support every peripheral individually, and each one may have its own protocol. And every new peripheral that comes out requires a patch to the software before it can be understood.
So it's exactly like cameras, graphics cards, wifi chips, pretty much everything not specified explicitly in the USB standard (and even some things that are).


You need a driver to control these things. Every USB device not specified in the USB standard has a different set of commands for doing things, some of which are even radically different from another device of the same class. The driver then takes this non-standard I/O and exposes it in the dev or sys directories in a way that is acceptable and frequently standard. I believe the standard way of handling accelerometers in Linux is to create a 1-axis joystick device for each axis on the accelerometer. The driver itself may have additional features (mapping two axes to a two-axis joystick, for example). This would allow for automatic use of a USB accelerometer in almost any application which is joystick-configurable.
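

Reading it back is then just standard joystick I/O; here's a minimal Python sketch, assuming the driver exposes the accelerometer at /dev/input/js0 (the path is a guess):

```python
# Minimal sketch: read Linux joystick events (8 bytes each) from a device
# that an accelerometer driver exposes. /dev/input/js0 is an assumption.
import struct

JS_EVENT_AXIS = 0x02                        # event type bit for axis motion
EVENT_FORMAT = "IhBB"                       # u32 time(ms), s16 value, u8 type, u8 number
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)  # 8 bytes

with open("/dev/input/js0", "rb") as dev:
    while True:
        time_ms, value, ev_type, number = struct.unpack(EVENT_FORMAT,
                                                        dev.read(EVENT_SIZE))
        if ev_type & JS_EVENT_AXIS:
            # value ranges -32767..32767; an app would scale this to g-forces
            print("axis %d: %d" % (number, value))
```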


HDMI output is just a graphics card; they're about as standardly non-standard as you can get.


Cell phone USB dongles frequently just use the standard modem interface that has been around since the dial-up days for data, and either act as a USB sound card for voice or require a Bluetooth connection for the audio, both of which are standard and available on the Pandora.


Unfortunately, I can't find a fully standard way that Linux handles magnetometers the way it does accelerometers, but I was able to find that most magnetometers (including USB ones) simply output their data as an RS-232 stream: i.e., a USB magnetometer just acts like a serial port, you open that port, and it's a stream of numbers from 0 to 255 giving the uncalibrated direction. Can't get much more standard than that.
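

So reading one is plain serial I/O; here's a minimal pySerial sketch, with the port name and baud rate as guesses for whatever device you'd actually have:

```python
# Minimal sketch: a USB magnetometer that enumerates as a serial port just
# streams heading bytes. Port name and baud rate below are assumptions.
import serial  # pySerial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1) as port:
    while True:
        data = port.read(1)                   # one uncalibrated byte, 0-255
        if data:
            heading = data[0] * 360.0 / 256.0  # rough conversion to degrees
            print("heading: %.1f degrees" % heading)
```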
 
No need to touch on the standardization stuff, I think WizardStan had a good response. Anyway, you said those things couldn't be served by add-ons, not that they couldn't be served well. With anything that has a data rate greater than what USB 2.0 can provide, or that won't fit the form factor, you have a valid point; otherwise...

I don't understand the question. I said it's better not to process graphics with the CPU but instead use a graphics processor for that. The CPU would still be utilized more with 3D graphics, since there's more data to process, by the way.

Look at the sentence in which you used it:


"Active sync glasses are kind of impractical. I much prefer the Nintendo 3DS screen idea. Very simple idea, and no glasses needed. But it would probably violate international patent agreements. And CPU shouldn't be what's used to process graphics. That's best left up to a stand-alone graphics core."


It looked like you were tying the thought to a 3DS-like implementation.


I don't agree with you that 3D means more data to process. 3D is just rendering the same scene from two slightly different camera positions (one for each eye). So long as all T&L is done by the GPU subsystem the CPU workload shouldn't scale one bit. Probably why 3DS is rumored to only be packing low speed ARM11s.
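

To make that concrete, here's a rough sketch of the loop; every name in it is a hypothetical stand-in for whatever engine API you'd use, but it shows the CPU's extra cost is just one more draw submission with a shifted eye position:

```python
# Rough sketch: stereo 3D = the same scene submitted twice, with the camera
# shifted along its right axis. draw_scene stands in for the real GPU call.
EYE_SEPARATION = 0.065  # meters; a typical inter-pupillary distance

def render_stereo(scene, eye, right_axis, draw_scene):
    """eye and right_axis are (x, y, z) tuples; draw_scene issues one GPU draw."""
    for sign in (-1.0, +1.0):  # left eye, then right eye
        offset = sign * EYE_SEPARATION / 2
        eye_pos = tuple(e + offset * r for e, r in zip(eye, right_axis))
        draw_scene(scene, eye_pos)  # GPU does all the T&L; CPU just submits

# Two submissions per frame, one per eye:
render_stereo("demo scene", (0.0, 1.6, 5.0), (1.0, 0.0, 0.0),
              lambda s, pos: print("draw", s, "from", pos))
```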

I design processors. I get scared every time people try to overclock. Yeah it can be done, but it's not good to go more than, say, 5-10%. When you're talking about increasing clock speed from 600MHz to 800MHz, that's probably too much. Even if you cool it or bring the voltage up, it's still pretty risky.

Processor designers do speed testing really well these days (the "binning" procedure). When they say a chip is good for a certain speed, there's not usually a lot of room left to overclock. In the past it was a lot easier to overclock, simply because the test boards weren't very discerning and test throughput was slow. A lot managed to sneak by. Not much anymore.

No offense, but how could anyone who designs processors make such a blanket generalization as to say that it's not good to overclock more than 5-10%? Processors are binned for clock speed, but also for artificial market segmentation. A lot of the time you have processors sold vastly underspecified in order to increase the perceived value of the higher-end processors and make higher margins on them. Intel is an expert at this, but pretty much everyone does it. As a design and process mature, yield variations decrease, and you'd expect the lower end of the binning spectrum to increasingly match the higher end.


TI sells 720MHz OMAP3530s at 65nm, do you think the design is different? No, there are just some they're willing to bin out at that speed and some they aren't. For what it's worth, the ones ordered for the Pandora were pre-binning.


Why do you think it's risky anyway, if you don't increase the voltage beyond safe levels? Temperature can be monitored and kept under control (the actual operating temperature of these chips tends to be at least 90°C anyway); other than that, it's just a matter of whether or not you still meet timing. Damaging a CPU due to overclocking is incredibly rare. People overclock > 50% with great success all the time, and as I said, ALL Pandoras so far are known to be fine at 750MHz. If the voltage is the same and they aren't overheating, why do you think they're at risk?

Usually you'll see spectacular failures and know that you've gone too far, but sometimes you don't see any problems even though they exist. Not all parts of a chip are designed the same way. In fact, there are hundreds of parts on a chip, and each one gets designed by a different designer. Each designer has his/her own way of designing. So one person's part is a lot more conservative, while another person's part is a lot more risky and sensitive to overclocking. During boot-up or normal usage of the system, you may not be using the riskier parts. It's only later, when an application exercises that part of the chip, that it fails.

Fails as in doesn't work properly. Then you probably crash, reboot, and don't run it at the same clock again. People aren't exactly using Pandoras for high-reliability operations.

I overclocked an Intel x86 processor recently by 20% or so, and everything looked fine until I noticed the files on my hard drive were getting corrupted by single-bit changes over time. Turns out the system bus was being tainted by the CPU glitching onto it. I only noticed it by accident when I was building my own intrusion detection system to tell when a file had changed without me knowing it. I ended up doing a CRC-32 of all my files in their current state vs. the archived versions. Turned out I had corrupted files everywhere. But the corruption was small enough that it went unnoticed for months. Scary stuff.

- Steve

Guess it's fortunate for us that you can't overclock the memory bus on the Pandora, huh? Was that processor already binned at the top? Pretty bizarre story, given that your hard drive traffic goes over SATA and not "the system bus"; if you had errors in RAM you would have seen it a lot sooner than file corruption. And it's hard to imagine you'd overclock whatever bus SATA was on, as usually those clocks are asynchronous with the main CPU and RAM clocks (themselves also asynchronous to each other).
 
The rest are not reasonably resolved by using add-on peripherals. And we can discuss why if you'd like:


...


- An x86 compatibility mode for running Windows apps.

I've been thinking about this one a bit, and would like to contradict my first post in this topic (the second post of the topic). I think I've figured out a way to do this, mostly in order to run the Spiderweb Software games. For any program that you have source for, Wine itself has ARM support: one recompile later, and you're done. However, if you're lacking the source, I'm pretty sure you could still hack together an x86 emulator. The simplest method would be to create an x86 QEMU image and install something very small on it, like Debian Stable or Puppy Linux with the minimum library requirements to run Wine. Add in the kqemu kernel module and presto, you've got a cross-arch emulated system at 80-90% of the original speed.


That should be plenty fast to run turn-based games, and Excel. The only bugger is the amount of RAM available, but if you're using the MiniMenu and a very small host distro, it shouldn't be too much of an issue. Sure it's a bit hacky and would require some development, but it's certainly not an unavailable add-on.
 
Been discussed many, many times, myownlittlworld. You definitely won't get 80-90% native speed (whatever that means exactly; it's difficult to compare the different archs). You can potentially do a lot better than something like DOSBox, though.
 
Indeed it is not Skype, but everyone before you who came looking for Skype was not actually looking for Skype; they were looking for some kind of voice and video communication service, and Skype was all they knew about. You're the first person to request it for whom nothing but Skype will do.

I've seen people here on these boards who have said that Skype is what they want, not Ekiga. Why? Because everyone knows Skype. Nobody has ever heard of Ekiga. Even my new TV has Skype built-in.

In that case, instant on is not what anything does. There is no device out there currently that does what you think it does. If you've got a device that seems to be powering on from cold in a second or two, it's actually just waking up from some sort of hibernation. You can demonstrate it by forcing a true cold boot: remove the battery for a minute (some devices have a small backup battery to let you swap batteries without losing state) and see what happens when you boot it up then.

Yes I'm aware the technology doesn't really exist yet. For perfection, you'd need non-volatile, high-speed main memory. Something like MRAM (magneto-resistive RAM), which is currently in production but might be too expensive right now.


But if you look into it, you'll find that there are a large number of sites dedicated to reducing Linux boot time. Like I said, 10 seconds is about all I would like to see. Two seconds would be great. How long does Pandora take? A minute? 30 seconds? I can't tell from the videos, but it looks like it takes a lot of time to boot. The GP2X Wiz apparently boots in about 10 seconds. I think there's a version of Ubuntu that promises boots of 8 seconds, at least on current desktop machines, dunno about netbooks and such.


Also, there are some approaches that build a core memory image, save it to flash, and then load it back very quickly. I believe Windows does something similar now. You'd only have to change that memory image if your setup changed. I think this may be an example of that:


http://www.embeddedarm.com/software/arm-linux-fastboot-ts7300.php


They apparently boot Debian Linux on a 200MHz ARM chip in about 2 seconds from SD card. That is, if it's really doing what I think it's doing.


- Steve
 
Yes I'm aware the technology doesn't really exist yet. For perfection, you'd need non-volatile, high-speed main memory. Something like MRAM (magneto-resistive RAM), which is currently in production but might be too expensive right now.


But if you look into it, you'll find that there are a large number of sites dedicated to reducing Linux boot time. Like I said, 10 seconds is about all I would like to see. Two seconds would be great. How long does Pandora take? A minute? 30 seconds? I can't tell from the videos, but it looks like it takes a lot of time to boot. The GP2X Wiz apparently boots in about 10 seconds. I think there's a version of Ubuntu that promises boots of 8 seconds, at least on current desktop machines, dunno about netbooks and such.


Also, there are some approaches that build a core memory image, save it to flash, and then load it back very quickly. I believe Windows does something similar now. You'd only have to change that memory image if your setup changed. I think this may be an example of that:


http://www.embeddedarm.com/software/arm-linux-fastboot-ts7300.php


They apparently boot Debian Linux on a 200MHz ARM chip in about 2 seconds from SD card. That is, if it's really doing what I think it's doing.


- Steve

This is the only thing I've read from you on this subject that doesn't make me grimace. But that's not what I came here to post.


I think that lowering the boot time for this device is very possible. After all, the hardware is open to an extent, and since the OS and kernel are both open source, it should be pretty easy to strip out the fat left in the kernel and throw in a few optimizations of our own. All Pandoras are the same hardware-wise. It really shouldn't be a problem to lower the boot time a considerable amount.
 
No need to touch on the standardization stuff, I think WizardStan had a good response. Anyway, you said those things couldn't be served by add-ons, not that they couldn't be served well.
I did say "reasonably well". It's a matter of perspective. Yes, you can get a bunch of things on USB stick these days. Heck, an entire desktop PC can be "on USB port" if you connect to one. Heh. But is it reasonable to want to plug in a different USB device for every simple application? I mean, shouldn't it be built-in? Lots of devices nowadays have a magnetometer, accelerometer, cell phone, GPS, and camera built-in. That series of technologies seems to be pretty common nowadays. But Pandora doesn't do any of those. At least not built-in. And with just a single USB port, good luck trying to do all of them at once.


Okay, you can buy a USB hub if you want and plug in a half dozen different USB peripherals. Great, so what about that sounds at all reasonable to do with a Pandora that fits in the palm of your hand? And aren't you just making excuses at this point?

I don't agree with you that 3D means more data to process. 3D is just rendering the same scene from two slightly different camera positions (one for each eye). So long as all T&L is done by the GPU subsystem the CPU workload shouldn't scale one bit. Probably why 3DS is rumored to only be packing low speed ARM11s.

Yeah, maybe. I've not looked into how 3D is handled. I suspect it's more than just turning it over to the GPU.

No offense, but how could anyone who designs processors make such a blanket generalization as to say that it's not good to overclock more than 5-10%? Processors are binned for clock speed, but also for artificial market segmentation.

I think you need to see how speed sorts are done to understand exactly how a processor gets stamped with a certain MHz qualification. Then you can tell me if you'd like to overclock your CPU. Marketing does have some say in it, sure. But there's generally nothing an end customer can rely on. If you do intend to overclock, you should perform a lot of testing.

A lot of the time you have processors sold vastly underspecified in order to increase the perceived value of the higher-end processors and make higher margins on them. Intel is an expert at this, but pretty much everyone does it. As a design and process mature, yield variations decrease, and you'd expect the lower end of the binning spectrum to increasingly match the higher end.

Well, maybe that's true for Intel. But for smaller players, the number of chips manufactured is much smaller. By the time yield is improved at a given process technology, the designers are already working on the next one. So instead of continuing to produce 65nm parts, they sell off that inventory and start producing 45nm parts instead. The shrink means higher profit margins (more chips per wafer), faster speeds, and lower power use. And that is advertised to customers in order to get design wins. Unless we're steadily doing better (faster, cheaper, smaller, lower power), we lose business.


We designers would absolutely love it if we could stick with a single technology and not have to continuously shrink everything every year or two. Less learning curve, less effort for us. But in reality, that just doesn't happen. And we don't get money by producing 2GHz chips and selling them as 1GHz chips.

TI sells 720MHz OMAP3530's at 65nm, do you think the design is different? No, there are just some they're willing to bin out at that and some they aren't. For what it's worth, the ones ordered for the Pandora were pre-binning.

Pre-binning? They're not going to do a speed sort? They're just getting sold as-is? I didn't know TI did that. I guess it's cheaper for them, and they don't care since Pandora is a low-volume vendor.

Why do you think it's risky anyway

Because things tend to fail when they're pushed outside of design-spec conditions. You might not even see it right away. Things like electromigration and Joule heating can eat away at the microscopic metal interconnect wires inside chips little by little. A chip will naturally fail after 3-10 years of use because of this, depending on duty cycle. Increasing voltage results in an exponential increase in electromigration. Upping the voltage and increasing clock speeds can really reduce the lifetime of your part: instead of something dying in 5 years, it dies in just 2. Not to mention that ground bounce and local voltage drop problems are exacerbated by voltage and speed changes. That can lead to failures. And those kinds of failures often exhibit hysteresis, which means you can get rare failures that aren't very reproducible.
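

For what it's worth, the textbook first-order model here is Black's equation for electromigration lifetime:

\[ \mathrm{MTTF} = \frac{A}{J^{n}}\, e^{E_a / (kT)} \]

where J is the current density in the wire (which rises with voltage), T is the absolute temperature, E_a is an activation energy, k is Boltzmann's constant, and A and n (typically around 2) are fitted empirically. Push the voltage up and both J and T rise, so the expected lifetime drops off sharply.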


When a chip is spec'd, we often don't do a full analysis at every possible combination of process, temperature, voltage, and clock speed like you'd think. That would take way too long. Instead, we pick out a few points that match what a part is likely to encounter in the real world. Outside of those points, your guess is as good as mine. You might wish we'd spec an entire island of PTV (process, temperature, voltage) points, draw a circle around it, and say that everything inside the circle will work, but that doesn't happen in the industry (it takes too long for that kind of analysis). Overclocking typically pushes a processor into uncharted territory. All bets are off.


Okay, this is taking too much of my time. I'm going to wrap this up. I'm not seeing anyone budge. Pandora is a wonderful product, I agree. But it can't do everything. Nor should it. But boy it would be nice to have some of the things I mentioned. Many of you disagree. That's fine.


- Steve
 
Just curious, what sort of chips do you design, SteveInAustin? I'm finding this thread very interesting to read, personally, because of the minds involved.
 
Not really. I want Skype. I think a lot of us are saying the same thing, and it has been discussed sufficiently already.
Are we saying things?


I don't like Skype. I don't want all my phone calls routed through a single company and their closed-source SaaS.


In fact, I was supposed to set up VoIP on my server, but I don't know how. Is there some well-known server for that?


Re: PandaCrazy: The Pandora cannot act as a Black Metal Detector either.
 
[…]
In fact, I was supposed to set up VoIP on my server, but I don't know how. Is there some well-known server for that?


[…]

At a company I worked for, they were using Asterisk, an open-source VoIP server.


From a simple user's perspective, it worked pretty well.


But I don't know if it's hard to set up.
 
I did say "reasonably well". It's a matter of perspective. Yes, you can get a bunch of things on USB stick these days. Heck, an entire desktop PC can be "on USB port" if you connect to one. Heh. But is it reasonable to want to plug in a different USB device for every simple application? I mean, shouldn't it be built-in?
You keep moving the goal posts, and that's just not fair.


"What can't the Pandora do?" "Well, it can't do any of these things..."


"Yes it can, here's all this USB stuff you can plug into it." "Well there's no standard way, so that makes programming a pain."


"Yes there are, here's the standards for them." "Well all that stuff makes it bulky, it should really be built in."


Whether USB add-on or built-in chips, the simple fact is that the Pandora can do these things. It is by virtue of its being open that it can do these things. Can you take a 3D picture with an iPhone? Don't think so. Can you take a 3D picture/video with the Pandora? Absolutely, you just need to buy two webcams and mount them properly. Is it an easy thing to do for the average person? Probably not, but the fact that it is possible counts for a lot in my opinion.
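

Just to sketch how little software the second half takes (OpenCV, two cameras at indices 0 and 1, and same-sized frames are all assumptions on my part):

```python
# Minimal sketch: grab one frame from each of two webcams and merge them
# into a red-cyan anaglyph. Camera indices and matching frame sizes assumed.
import cv2

left_cam, right_cam = cv2.VideoCapture(0), cv2.VideoCapture(1)
ok_left, left = left_cam.read()
ok_right, right = right_cam.read()
if ok_left and ok_right:
    anaglyph = right.copy()            # OpenCV is BGR: keep blue+green (cyan) from the right eye
    anaglyph[:, :, 2] = left[:, :, 2]  # take the red channel from the left eye
    cv2.imwrite("3d_photo.png", anaglyph)
left_cam.release()
right_cam.release()
```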


Now if you want to discuss what you think a future iteration should include, by all means, make your suggestions. The same thing comes up every few weeks and it'd be interesting to have a chip designers input next time.


As far as Skype is concerned, you seem to think you're speaking for more than yourself, but if you actually do a search on this forum you'll see that I'm correct: people come asking for Skype, get told about open standard equivalents like Ekiga, and start looking forward to Ekiga. They don't care whether it's Skype or not, they just want to make VoIP calls to a telephone for cheap.
 
Just curious, what sort of chips do you design, SteveInAustin? I'm finding this thread very interesting to read, personally, because of the minds involved.

I used to be a VLSI design engineer at Texas Instruments on their c6x DSP processor team. Yeah the same team that made the c64x DSP inside the OMAP that's on Pandora, as far as I can tell. I was more of a jack of all trades back then, getting into just about all aspects of design, analysis and verification. Now I'm a CAD / EDA engineer helping design the VIA C3, C7, Nano, etc. line of x86 chips. It's very challenging, cutting edge work.


- Steve
 
You keep moving the goal posts, and that's just not fair.

I said "reasonably well". You can plug a desktop PC into a Pandora by connecting it to USB port. Is that something you're saying would really work well, though? Is it reasonable?


There are all these USB add-on peripherals that you could dig up somewhere. I'm sure there's some company selling an accelerometer on USB port (and for a premium price, most likely). That doesn't mean it's a particularly good fit for the Pandora, because a software developer can't just assume the user is going to have one, since it's not a built-in device. Plus, that software developer would need to support not just that one accelerometer but perhaps several. This scheme doesn't work in reality. Don't expect games to use it, for example. Don't expect nifty new apps to come out taking advantage of it. You'll be sitting there writing a Perl script or something to do anything with it. Yeah, you might be able to use it for your own purposes, but with a built-in accelerometer, everyone could use it, and you'd see a lot more apps taking advantage of it.


Plus, an accelerometer on USB port may not be a good idea, because it's far away from the center of gravity of the Pandora. I'm not sure if that would be a big problem, but it might require some compensation and calibration in software - if that's even possible.


That was actually the primary reason going through my head for why an accelerometer wouldn't be a reasonable thing to add on USB port. But I've been thinking about it and haven't decided if it's a big issue or not. Pandora is small, after all, so maybe it doesn't matter too much.

"What can't the Pandora do?" "Well, it can't do any of these things..."


"Yes it can, here's all this USB stuff you can plug into it." "Well there's no standard way, so that makes programming a pain."


"Yes there are, here's the standards for them." "Well all that stuff makes it bulky, it should really be built in."

You're just being overly defensive about this. Honestly, I've not seen a standard interface for an accelerometer on USB. And I've not even seen accelerometer USB peripherals in the real world. Nobody I know has one. Where are all the apps that take advantage of these USB accelerometer peripherals, then? Right, they don't exist. If, however, you build the accelerometer into the Pandora itself, now everyone can build apps around it. That works. Build it, and they will come.

Whether USB add-on or built-in chips, the simple fact is that the Pandora can do these things. It is by virtue of its being open that it can do these things. Can you take a 3D picture with an iPhone? Don't think so.

Yes, sure it can already:


http://www.tuaw.com/2009/04/01/take-3d-pix-on-your-iphone/


http://www.tuaw.com/2009/06/26/3d-camera-adds-depth-to-your-iphone-photography/


The iPhone is a good example of what I've been saying. The iPhone's success is due in large part to its hardware platform (not to mention the easy to use OS and the ease of purchasing apps using iTunes). The fact that it has a lot of stuff built-in is key. All those built-in things, like cell-phone, GPS, magnetometer, accelerometer, video camera, and multi-touch display, are driving the apps. People who develop software can easily take advantage of it. If they relied on iPhone users to go out and buy a bunch of bulky add-on peripherals just to do the most basic things, it wouldn't work. That market would be non-existent.

As far as Skype is concerned, you seem to think you're speaking for more than yourself, but if you actually do a search on this forum you'll see that I'm correct: people come asking for Skype, get told about open standard equivalents like Ekiga, and start looking forward to Ekiga. They don't care whether it's Skype or not, they just want to make VoIP calls to a telephone for cheap.

Some do, not all. Skype is a "killer app". The Pandora team would do well to try to get it working. And I guarantee you that if Skype was working for Pandora, that would be a selling point.


- Steve
 
Well, from this point of view, absolutely any device able to take a photo is able to generate 3D images.


You just have to take 2 photos from a slightly different position (with mixed results) and combine them with software.
;)


Some do, not all. Skype is a "killer app". The Pandora team would do well to try to get it working. And I guarantee you that if Skype was working for Pandora, that would be a selling point.


- Steve
What you don't understand here is that a Skype port for Pandora is independent of the will of OPT, since Skype isn't open source.


(There is a Skype internal project to make a binary blob that would allow third-party clients. But there's no information about an ARM-compatible one…)


The only "offical" way to get Skype on Pandora would be to ditch massive amount of money to Skype, money that is needed to get Pandoras produced.


The other way is to hack the Skype client from the N900, a murky way of doing it.


Most of us aren't against Skype on Pandora, but at the moment it's not doable and Ekiga is a really good alternative.
 
I said "reasonably well". You can plug a desktop PC into a Pandora by connecting it to USB port. Is that something you're saying would really work well, though? Is it reasonable?
I think you're still missing the point of what I've been saying, which is unfortunate.
 
But if you look into it, you'll find that there are a large number of sites dedicated to reducing Linux boot time. Like I said, 10 seconds is about all I would like to see. Two seconds would be great. How long does Pandora take? A minute? 30 seconds?

35 seconds, of which 10 seconds is needed to start the X server.


The Pandora is not using Upstart yet, which should lower the boot time to about 20 seconds (into X).

I can't tell from the videos, but it looks like it takes a lot of time to boot. The GP2X Wiz apparently boots in about 10 seconds.

The Wiz needs more; the Caanoo is pretty fast, though.


However, they don't load ANY X server, and not really a proper Linux either. They don't need to load a lot of drivers; it's actually not too hard to start a simple system like that so fast.


But it's crippled, and that's where some users complain. E.g., you can't even shut it down properly from the command line.

I think there's a version of Ubuntu that promises boots of 8 seconds, at least on current desktop machines, dunno about netbooks and such.

Haven't seen that, but a current desktop machine is WAY faster (memory, hard disk, CPU). A normal desktop usually takes longer to boot than a Pandora, though.

Also, there are some approaches that build a core memory image, save it to flash, and then load it back very quickly. I believe Windows does something similar now. You'd only have to change that memory image if your setup changed. I think this may be an example of that:


http://www.embeddedarm.com/software/arm-linux-fastboot-ts7300.php


They apparently boot Debian Linux on a 200MHz ARM chip in about 2 seconds from SD card. That is, if it's really doing what I think it's doing.

Yes, but they mention on that site that it's the Linux command line.


You could also create a kernel and initrd with a minimal system for the Pandora, which should be able to boot up in a few seconds as well - but starting X will take a while, and so will PROPERLY starting a full Linux system.


Of course you can hack it to make it way faster, but that would be with some stuff missing.


The two-second boot actually is something similar to hibernation. It doesn't really boot; it restores the hardware components, including RAM, to exactly the state they'd be in right after booting.


This makes it hard to customize the boot process AFTER that image has been created though.
 
What you don't understand here is that a Skype port for Pandora is independent of the will of OPT, since Skype isn't open source.


(There is a Skype internal project to make a binary blob that would allow third-party clients. But there's no information about an ARM-compatible one…)


The only "offical" way to get Skype on Pandora would be to ditch massive amount of money to Skype, money that is needed to get Pandoras produced.


The other way is to hack the Skype client from the N900, a murky way of doing it.


Most of us aren't against Skype on Pandora, but at the moment it's not doable and Ekiga is a really good alternative.

So you couldn't, say, turn it over to an x86 emulator and run the x86 binary of Skype? Is that not doable? Is the ARM processor too slow? Would it be better at 1.0GHz instead of 600MHz? Or would it just not work?


- Steve
 