SteveInAustin
Indeed, I would like to know why you can't add an accelerometer, magnetometer, and cell phone with add-on peripherals.
Those all lack a standardized interface between the software and the peripheral, as far as I know. Even if you did have a USB stick with an accelerometer on it (which I don't think exists, by the way), the software still has to speak whatever protocol the stick uses in order to access the accelerometer and decode its messages. Is that protocol standardized? I don't think it is. Without standardization, the software has to support every peripheral individually, each with its own protocol, and every new peripheral that comes out requires a patch to the software before it can be understood.
That's too much to ask of software developers. They would rather have an intermediate layer between their software and the hardware, so that the layer gets updated instead of the software itself. Either that, or some standardization of the hardware. But I doubt that exists yet for accelerometers, magnetometers, and cell phone dongles (most of which are proprietary).
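To illustrate what that intermediate layer buys you, here's a minimal sketch assuming a Linux IIO-style sysfs interface (the device paths are whatever the kernel happens to expose on a given system, and the scale handling is simplified). The kernel driver speaks the peripheral's own wire protocol; the application only reads standardized attribute files, so supporting a new accelerometer means adding a driver, not patching every program:

```python
import glob
import os

# Minimal sketch, assuming a Linux IIO-style sysfs layer. The kernel driver
# handles the peripheral's own protocol; the application only reads
# standardized attribute files under /sys/bus/iio/devices/.

def find_accelerometer():
    """Return the sysfs directory of the first IIO device exposing accel axes."""
    for dev in glob.glob("/sys/bus/iio/devices/iio:device*"):
        if os.path.exists(os.path.join(dev, "in_accel_x_raw")):
            return dev
    return None

def read_axis(dev, axis):
    """Read one raw axis value and apply the driver-reported scale, if any."""
    with open(os.path.join(dev, "in_accel_%s_raw" % axis)) as f:
        raw = int(f.read().strip())
    scale = 1.0
    scale_path = os.path.join(dev, "in_accel_scale")
    if os.path.exists(scale_path):
        with open(scale_path) as f:
            scale = float(f.read().strip())
    return raw * scale

if __name__ == "__main__":
    dev = find_accelerometer()
    if dev is None:
        print("No accelerometer exposed by the kernel")
    else:
        print({axis: read_axis(dev, axis) for axis in ("x", "y", "z")})
```

The same idea covers magnetometers (the in_magn_* attributes) and anything else the layer chooses to standardize.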
So there is a good case to be made for using built-in hardware. It means you can now standardize around it.
It would also mean you could free up your USB port.
What about the 3DS display do you think is processed by the CPU?
I don't understand the question. I said it's better not to process graphics with the CPU but to use a graphics processor for that instead. The CPU would still be utilized more heavily with 3D graphics, by the way, since there's more data to process.
It's just 600MHz that the manufacturer stands behind, but "overclocking" can be a bit of an abstract concept.
I design processors. I get scared every time people try to overclock. Yeah, it can be done, but it's not good to go more than, say, 5-10%. Going from 600MHz to 800MHz is a 33% increase, which is probably too much. Even if you cool the chip or raise the voltage, it's still pretty risky.
Processor manufacturers do speed testing really well these days (the "binning" procedure). When they say a chip is good for a certain speed, there's usually not a lot of headroom left to overclock. In the past it was a lot easier to overclock, simply because the test boards weren't very discerning and test throughput was slow, so a lot of chips with extra headroom slipped through. Not so much anymore.
Usually you'll see spectacular failures and know you've gone too far, but sometimes you won't see any problems even though they exist. Not all parts of a chip are designed the same way. In fact, there are hundreds of blocks on a chip, each designed by a different designer, and each designer has his or her own style. So one person's block is quite conservative, while another's is riskier and more sensitive to overclocking. During boot-up or normal usage of the system, you may never exercise the riskier blocks. It's only later, when you run an application that hits that part of the chip, that it fails.
I overclocked an Intel x86 processor recently by 20% or so, and everything looked fine until I noticed that files on my hard drive were getting corrupted by single-bit changes over time. It turns out the system bus was being tainted by the CPU glitching onto it. I only noticed it by accident, while building my own intrusion detection system to tell me when a file had changed without my knowledge. I ended up computing a CRC-32 of all my files in their current state and comparing against the archived versions. It turned out I had corrupted files everywhere, but the corruption was subtle enough that it went unnoticed for months. Scary stuff.
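If anyone wants to run the same kind of check, here's a minimal sketch in Python of the idea (the manifest filename and directory are just placeholders, not what I actually used): build a CRC-32 manifest of a directory tree once, then re-run it later and report any file whose checksum has changed.

```python
import json
import os
import zlib

MANIFEST = "crc_manifest.json"  # placeholder name, not what I actually used

def crc32_of(path, chunk=1 << 20):
    """Stream the file through zlib.crc32 so big files don't need to fit in RAM."""
    crc = 0
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            crc = zlib.crc32(block, crc)
    return crc & 0xFFFFFFFF

def scan(root):
    """Map every file under root (by relative path) to its CRC-32."""
    crcs = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.abspath(path) == os.path.abspath(MANIFEST):
                continue  # skip the manifest itself
            crcs[os.path.relpath(path, root)] = crc32_of(path)
    return crcs

def check(root):
    """Compare the current tree against the saved manifest and report changes."""
    with open(MANIFEST) as f:
        old = json.load(f)
    for rel, crc in scan(root).items():
        if rel in old and old[rel] != crc:
            print("CHANGED:", rel)

if __name__ == "__main__":
    root = "."  # directory to watch; adjust as needed
    if os.path.exists(MANIFEST):
        check(root)
    else:
        with open(MANIFEST, "w") as f:
            json.dump(scan(root), f)
```

Note that a mismatch only tells you the two copies differ; you still need a known-good archive to say which side is the corrupt one.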
- Steve