which we don't need.
reading analog joysticks isn't that hard; they sell ready-made analog-only controls, and all you need is some form of ADC plus a small lookup table in the driver if they're not linear.
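something like this is all the driver-side linearization amounts to. just a rough sketch, with made-up table values and an assumed 8-bit ADC reading:

```c
/* Hypothetical driver-side linearization: map a raw 8-bit ADC reading onto a
 * centered, roughly linear axis value via a small lookup table, interpolating
 * between entries.  The table contents here are placeholders, not a real
 * calibration curve. */
#include <stdint.h>

static const int16_t axis_lut[17] = {
    -127, -112, -96, -78, -58, -38, -20, -8,
       0,    8,  20,  38,  58,  78,  96, 112, 127
};

static int16_t linearize(uint8_t raw)
{
    unsigned idx  = raw >> 4;        /* which of the 16 segments */
    unsigned frac = raw & 0x0f;      /* position within the segment */
    int16_t  lo   = axis_lut[idx];
    int16_t  hi   = axis_lut[idx + 1];

    return lo + ((hi - lo) * (int)frac) / 16;
}
```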
the pandora nubs are over-engineered.
the cheapest solution is to use a 555 or similar timer, the way it was done on PCs, and "horrible" software polling.
on the pandora, you may even use the DSP for this mostly-software ADC solution and leave the main CPU free.
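for reference, the old PC gameport trick looked roughly like this: fire the one-shots, then count how long each axis bit stays high. purely an illustration of the "horrible" polling; the port address and x86 port I/O calls are the old PC details and have nothing to do with the pandora:

```c
/* Sketch of the classic PC gameport approach: a write to the port triggers
 * the 555-style one-shots, then software busy-waits and counts how long the
 * axis bit stays high.  The count is proportional to the pot's resistance,
 * i.e. the stick position. */
#include <stdint.h>
#include <sys/io.h>   /* inb/outb; needs ioperm() and root on a real PC */

#define GAMEPORT 0x201

static unsigned poll_axis(uint8_t axis_mask)
{
    unsigned count = 0;

    outb(0, GAMEPORT);                           /* trigger the one-shots */
    while ((inb(GAMEPORT) & axis_mask) && count < 10000)
        count++;                                 /* wait for the bit to drop */

    return count;                                /* bigger = more resistance */
}
```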
My rule of thumb is: if you need to buy an additional multi-channel ADC for an apps processor and it doesn't have to be high resolution (>12-bit) or high speed (>1MHz), then you're best off just buying a microcontroller. It's a lot more flexible and often cheaper. They're really not expensive. Even Cortex-M3s and Cortex-M0s are available for less than a dollar now.
On Pandora the PMIC has extra ADC channels, since they're needed for the touchscreen controller. I don't know if it has enough available, though. For the nubs you need at least 4 (X and Y for each of the two nubs).
I think the external sampling rate should be at least 60Hz, which means the frequency response of the output must be completely flat out to 30Hz. The problem with using a really low sampling rate and relying on an analog anti-aliasing filter is that a cheap first-order filter has a very gradual roll-off that'll give you much less usable bandwidth than half the sampling rate (depending on the stop-band attenuation you need). Better filters need active components (op-amps), from one to several per channel depending on how good you want the filter to be. It's going to be really easy to once again raise the price and board complexity beyond what you get using a microcontroller that samples at a very high rate and performs digital filtering. I've personally filtered 8 channels sampled at 16kHz with a 255th-order FIR on a little 32MHz STM32F (and that part is four years old now). You don't need that much power to do a decent amount of filtering, but you do need a steady input stream at a high sampling rate. You may be able to rig something like the OMAP3530 to DMA at a high rate from something like its PMIC's ADC, I don't know, but it sounds like a pain.
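Just to give an idea of the shape of that kind of filtering (this is a sketch, not the actual code; the tap count, channel count and coefficients are placeholders):

```c
/* Per-channel fixed-point FIR filtering sketch for a small MCU: each new
 * sample goes into a circular history buffer, and the output is the dot
 * product of the history with the coefficient table.  The coefficients
 * below are placeholders; a real low-pass design would come from a filter
 * design tool. */
#include <stdint.h>

#define NUM_TAPS  64
#define NUM_CHANS 8

static const int16_t fir_coeff[NUM_TAPS] = { 0 };   /* Q15 low-pass taps */
static int16_t history[NUM_CHANS][NUM_TAPS];
static unsigned head[NUM_CHANS];

static int16_t fir_step(unsigned chan, int16_t sample)
{
    int32_t  acc = 0;
    unsigned pos = head[chan];

    history[chan][pos] = sample;
    head[chan] = (pos + 1) % NUM_TAPS;

    /* acc = sum over i of coeff[i] * x[n - i] */
    for (unsigned i = 0; i < NUM_TAPS; i++)
        acc += (int32_t)fir_coeff[i] *
               history[chan][(pos + NUM_TAPS - i) % NUM_TAPS];

    return (int16_t)(acc >> 15);   /* back down from Q15 accumulation */
}
```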
Personally I'd have a single microcontroller that handles all inputs, including the touchscreen, because Pandora already has problems with noise on its touchscreen lines, although I guess this will depend on exactly what hardware is available for whatever touchscreen controller is used. The microcontroller handles sampling and calibration and is of course fully program-modifiable. It sends single packets for all of the input state, and it has (programmable) thresholds such that it only sends these updates when the state has changed enough. The microcontroller could of course also buffer events and only hand them over when the program running on the SoC requests input. Either way, when your Pandora is sitting there doing nothing but handling really low-frequency events, the CPU doesn't have to be woken up constantly to deal with interrupts.
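As a sketch of the "only report when it changed enough" part (the packet layout, threshold value and send_packet stand-in are all hypothetical, just to show the idea):

```c
/* The MCU keeps the last state it reported and compares fresh samples
 * against a programmable dead-band before sending a new packet to the SoC.
 * Touch handling is omitted for brevity. */
#include <stdint.h>
#include <stdlib.h>

struct input_state {
    int16_t  nub[4];        /* left X/Y, right X/Y */
    uint16_t buttons;       /* button bitmask */
    uint16_t touch_x, touch_y;
};

static struct input_state last_sent;
static int16_t nub_threshold = 8;       /* programmable from the SoC side */

/* stand-in for however the packet actually leaves the MCU (I2C, SPI, ...) */
static void send_packet(const void *buf, unsigned len)
{
    (void)buf;
    (void)len;
}

static void maybe_report(const struct input_state *now)
{
    int changed = (now->buttons != last_sent.buttons);

    for (int i = 0; i < 4 && !changed; i++)
        if (abs(now->nub[i] - last_sent.nub[i]) > nub_threshold)
            changed = 1;

    if (changed) {
        send_packet(now, sizeof(*now));
        last_sent = *now;
    }
}
```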
The alternative of using the DSP for this has the problem of needing a decent preemptive OS on the DSP, because you don't want to waste the whole thing by tying it up with analog sampling and filtering that only needs a small percentage of its capability (this is assuming the DSP even has full access to the same external peripherals; I haven't checked). But on this DSP architecture real-time interrupts are a bad idea, because interruptible code takes a performance hit.
The biggest problem with Pandora's design is using TWO microcontrollers that run immutable software blobs with questionable auto-calibration. But using a separate microcontroller isn't necessarily a bad idea in itself.