My favourite way to explain RAM is with an analogy to caching.
A computer has various forms of storage available to it as it runs a program. Memory is the storage it uses to hold data and the results of calculations.
The fastest type of memory is the CPU registers; these are used directly by the CPU to store the results of calculations. A register can usually be accessed at the speed of the CPU, i.e. a 33 MHz machine could theoretically access a CPU register 33,000,000 times a second.
The next fastest memory is the on-board CPU cache (which may run at half the speed of the registers), followed by RAM (perhaps running at a quarter of the speed), and then the hard drive (or, in the case of handhelds like the GP2X, SD memory cards), which really is just another form of memory. Once you reach the hard drive, access speed drops dramatically: on a 33 MHz machine you might be lucky to achieve 1,000,000 accesses per second, for example.
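If you want to see this hierarchy for yourself, here is a minimal C sketch (not from the original article) that times a walk through a small array that fits in CPU cache against a walk through a large array that spills out to RAM. The array sizes and iteration count are illustrative assumptions; tune them for your machine (on a handheld like the GP2X the caches and RAM are far smaller than on a desktop).

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Walk 'len' ints by following an index chain and return the average
 * time per access in nanoseconds.  Consecutive accesses jump roughly
 * half-way round the array, so they land on different pages and the
 * hardware prefetcher cannot easily hide the memory latency. */
static double walk(size_t len, size_t steps)
{
    int *next = malloc(len * sizeof *next);
    if (!next) { perror("malloc"); exit(1); }

    /* Link the array into one big cycle.  The sizes used below are
     * powers of two, so this stride is odd and co-prime with len,
     * meaning every element is visited exactly once per lap. */
    size_t stride = len / 2 + 1;
    size_t pos = 0;
    for (size_t i = 0; i < len; i++) {
        size_t nxt = (pos + stride) % len;
        next[pos] = (int)nxt;
        pos = nxt;
    }

    clock_t t0 = clock();
    volatile int idx = 0;           /* volatile stops the loop being optimised away */
    for (size_t i = 0; i < steps; i++)
        idx = next[idx];            /* each step is one dependent load */
    clock_t t1 = clock();

    free(next);
    return (double)(t1 - t0) / CLOCKS_PER_SEC / steps * 1e9;
}

int main(void)
{
    size_t steps = 20 * 1000 * 1000;
    /* ~16 KB of data: should sit comfortably in cache. */
    printf("small array (fits in cache): %.1f ns per access\n",
           walk(4 * 1024, steps));
    /* ~256 MB of data: far too big for cache, so most loads go to RAM. */
    printf("large array (spills to RAM): %.1f ns per access\n",
           walk(64 * 1024 * 1024, steps));
    return 0;
}

On most machines the large array should come out noticeably slower per access, even though both runs execute exactly the same loop; the only difference is where in the hierarchy the data ends up living.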
The reason there are various forms of memory is essentially economic. A computer made entirely of CPU-register-speed memory would be incredibly fast, but also incredibly expensive to make. So the amount of each type is scaled to keep the whole thing viable, and we end up with the configuration commonly seen today.
Having more RAM means that you can hold more data and results of calculations at once, though, as you can see, there are many levels of memory. RAM is the first level that a user can easily change to increase the computer's capability; registers and CPU cache are usually properties of the CPU and cannot easily be changed without changing the CPU itself.
So what's more important? It depends entirely on the application you want to run.