I'm really depressed at all the flaming. Yeah, this guy did not do the research (a quick glance at Wikipedia would have been a good idea) and made some strange presumptions, but that doesn't make him a malicious troll, and it's pretty mean to call him "stupid" over it. Keep things technical, to the point, and polite, and if they still get all bent out of shape, THEN you can be justified in telling them to piss off.
Anyway...
The 3 W TDP is the maximum power, only reached at full CPU AND GPU utilization. At idle it uses less than a tenth of that. It has a graphics unit built in, so no additional graphics card is necessary.
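To put the TDP-versus-idle point in concrete terms, here's a minimal back-of-envelope sketch. The battery capacity and the idle figure are assumptions picked purely for illustration (idle taken as roughly a tenth of TDP, per the claim above), not measurements of any real device, and it ignores the screen, RAM, and the rest of the board.

```python
# Rough battery-life arithmetic with illustrative numbers only.
CPU_TDP_W  = 3.0    # quoted max TDP, CPU and GPU fully loaded
CPU_IDLE_W = 0.3    # "less than a tenth" of TDP at idle (assumed)
BATTERY_WH = 16.0   # hypothetical battery capacity in watt-hours

print(f"Hours at full load (CPU/GPU only): {BATTERY_WH / CPU_TDP_W:.1f}")
print(f"Hours at idle (CPU/GPU only):      {BATTERY_WH / CPU_IDLE_W:.1f}")
```

The gap between the two numbers is the whole point: TDP tells you the worst case, not what the chip draws most of the time.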
The Z500 and Z600 Atom processors were designed in 2010 and 2009. They are modified Pentium 4 processors, optimized for low power. They were originally created to fit in cellphones, but never caught on there because they used too much power for a phone (which would give it only about a day of battery life).
Manufacturers have used many of these processors in all kinds of other devices, ranging from tablets to netbooks to portable media players.
Most of these processors are too slow to comfortably run a full operating system like Windows, but they work fine for watching HD video, listening to MP3s, and browsing the web.
If MS-DOS were the host operating system, even the lowest-end Atom processors (like the 800 MHz Z500) would be sufficient to run most x86 programs written for MS-DOS. But as others have said, this is not a DOS console, so I will end my chatter here.
Atom is actually a new (rather cumbersome) CPU design, not in any way related to the Pentium 4. I imagine NetBurst would be the last design they'd want to revisit for power-sensitive applications; it was a real power drain.
Intel has made some good progress in reducing power consumption for Atom and its chipsets, but they're still nowhere close to Cortex-A9 SoCs, which tend to have an over 2x advantage in performance per watt and lower leakage (although leakage is the area where Intel has improved the most). Citing the Z500 is very misleading: not just because it doesn't include a GPU and memory controller on-die, but because the associated I/O hub chip you have to use is nowhere near as power-optimized and, being made on an older process, burns far more power at the system level. Suffice it to say that there isn't a price advantage either, far from it; Intel has much higher margins, and $20 for just a CPU at this level is actually very expensive (an entire Tegra 2 costs less than that). So Menlow, the original Z-series platform, was basically dead on arrival. It made its way into some UMPCs, but we all know how niche that market is, and how it's apparently okay to only last a couple of hours on battery there.
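Here's a rough sketch of the system-level argument, with entirely made-up placeholder numbers (none of these are datasheet figures). The point is only that a separate, less optimized I/O hub is added on top of the CPU's power, while an SoC with the GPU and memory controller on-die doesn't pay that tax.

```python
# Illustrative system-level power comparison; every value is a placeholder.
atom_cpu_w = 2.0   # hypothetical average Z-series CPU power
atom_hub_w = 2.5   # hypothetical older-process I/O hub power
arm_soc_w  = 1.5   # hypothetical Cortex-A9 SoC (GPU + memory controller on-die)

atom_system_w = atom_cpu_w + atom_hub_w
print(f"Atom platform: {atom_system_w:.1f} W (CPU + separate I/O hub)")
print(f"ARM SoC:       {arm_soc_w:.1f} W (single chip)")
print(f"Ratio:         {atom_system_w / arm_soc_w:.1f}x at the system level")
```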
Moorestown at 3 W is better, naturally, but that's still only a 1.1 GHz single core; a dual-core Cortex-A9 at 1 GHz or more is a better choice and uses substantially less power. Bottom line: Moorestown has been out for a good long while and has not shipped in a single product, or at least none I've heard of, and I pretty routinely check to see if I've missed any. It's quite the market failure on Intel's part, with the only saving grace being that the main Lincroft chip is used in Oak Trail, which does at least have a few design wins.
Note that even Moorestown has a much higher cost and board-space footprint than ARM-based chipsets. This is because it's a three-chip set instead of two, because the chip itself is pretty big (and isn't using anything close to TSMC's super-dense 40nm transistors), and because it doesn't use PoP memory, meaning it needs extra board space for RAM. Check out a picture of the Pandora PCB some time; it's pretty crammed. Frankly, I doubt you could even fit Moorestown on it. And you certainly wouldn't be able to buy it in volumes of a few thousand; the OMAP35xx is attractive because it's a hobbyist-friendly chip available in small volumes from suppliers.
Word has been that Intel dumped Moorestown to focus on the 32nm Medfield, but it turns out Medfield isn't coming out until 2012, and it'll still be single core and will probably look ridiculous next to quad-core Cortex-A9s and dual-core Cortex-A15s. In 2013 Intel will be releasing Silvermont, which will be both out-of-order and on 22nm (well ahead of the competition's ability to ship anything near that node), and they're anticipating a massive jump in performance. They could very well overtake ARM at that point, but until then I'm not seeing it.
Of course, if you want x86 compatibility, going with Atom is a no-brainer. It goes without saying that DOS emulation is not really that high on the list compared to everything else; it's hard to imagine anyone making a handheld just for DOSBox. If they did, it'd be weird to go with ARM, but I think even Atom wouldn't be as seamless as you expect. For over a decade now I've seen major compatibility problems in running straight DOS on modern PCs: devices hogging the first 1MB of the memory space that needs to stay free for various things, or the need for direct hardware interfaces that no longer exist, like Sound Blaster (and somehow I doubt the SGX graphics on current Z-series Atoms offers vanilla VGA or VESA, or at least it'd be pretty weird if it did). These days everyone uses DOSBox, although I suppose in a pinch you could use dosemu on Linux to get better performance while avoiding the hardware problems. If you do use DOSBox, x86 only gives you a small theoretical advantage, although in practice the x86 recompiler is far superior to the current ARM one. I expect that could eventually change.
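As a side note on the recompiler point, here's a toy Python sketch of why dynamic recompilation matters at all. This is not DOSBox code; it only shows that a plain interpreter pays a decode-and-dispatch cost on every guest instruction, which is the overhead a recompiler (on x86 or ARM) avoids by translating hot blocks to native code once and rerunning them directly.

```python
# Toy interpreter for a fake instruction stream, to illustrate dispatch overhead.
def run_interpreted(program, regs):
    """Execute a tiny fake instruction stream one opcode at a time."""
    for op, dst, src in program:
        if op == "mov":
            regs[dst] = src
        elif op == "add":
            regs[dst] += src
        # Every guest instruction goes through this decode/dispatch, every time;
        # a recompiler would emit native code for the block instead.
    return regs

program = [("mov", "ax", 1), ("add", "ax", 2), ("add", "ax", 3)]
print(run_interpreted(program, {"ax": 0}))   # -> {'ax': 6}
```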