Grench said:
I wonder if there is a way to download the top 20,000 (by hits) or so Wikipedia pages with full graphics and the rest with just text... How much space would that consume?
Why yes there is!
...only without the graphics; because of licensing issues they don't let people download dumps with images.
http://en.wikipedia.org/wiki/Wikipedia_database
Not sure about the top-20k-pages part either, but a complete dump is certainly available, and it could then be trimmed down on a desktop computer with some creative scripting.
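To give an idea of the kind of creative scripting I mean, here's a rough Python sketch that streams through the pages-articles dump and keeps only the articles whose titles show up in a top-N list. The file names, the XML namespace version, and the top20000.txt title list (one title per line, which you'd have to build yourself from the page-view stats) are all assumptions on my part, not anything the dump page hands you:

```python
import bz2
import xml.etree.ElementTree as ET

# The MediaWiki export namespace changes between dump versions; check the
# xmlns attribute at the top of the XML you actually download.
NS = "{http://www.mediawiki.org/xml/export-0.10/}"

def trim_dump(dump_path, top_titles_path, out_path):
    # top20000.txt is a hypothetical file with one article title per line
    with open(top_titles_path, encoding="utf-8") as f:
        keep = {line.strip() for line in f if line.strip()}

    with bz2.open(dump_path, "rb") as src, open(out_path, "w", encoding="utf-8") as out:
        out.write("<pages>\n")
        # iterparse handles one <page> at a time, so the 6+ gig dump never
        # has to fit in RAM
        for event, elem in ET.iterparse(src, events=("end",)):
            if elem.tag == NS + "page":
                if elem.findtext(NS + "title") in keep:
                    out.write(ET.tostring(elem, encoding="unicode"))
                elem.clear()  # drop the page we just looked at
        out.write("</pages>\n")

trim_dump("enwiki-latest-pages-articles.xml.bz2", "top20000.txt", "trimmed.xml")
```

Slow, but it only has to run once on the desktop, and the result would be a fraction of the full dump.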
I personally would download this one... it contains only the latest revision of every article, without any of the user or talk pages. It's 6 gigs compressed, and their page says it can be up to 20 times that size uncompressed!
I know I for one will be looking into this further once I get my Pandora and a 32+ gig SD card.
If the articles can be read individually from within the compressed file, then this might be viable on the Pandora... if not, you'd need roughly 120 gigs of space (6 gigs compressed × 20) to decompress the archive into. Hey, a real use for a pair of SDXC cards!
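On that "read individually from within the compressed file" question, here's a minimal sketch, assuming you grab the multistream variant of the dump plus its index file. The index line format (byte_offset:page_id:title), the file names, and the read size are all my assumptions, so verify them against what the download page actually offers:

```python
import bz2

def find_offset(index_path, wanted_title):
    # the index is itself bz2-compressed; each line maps a title to the byte
    # offset of the compressed stream that contains that article (assumed format)
    with bz2.open(index_path, "rt", encoding="utf-8") as idx:
        for line in idx:
            offset, page_id, title = line.rstrip("\n").split(":", 2)
            if title == wanted_title:
                return int(offset)
    return None

def read_stream_at(dump_path, offset, chunk=1024 * 1024):
    # decompress just the one bz2 stream starting at `offset` (a small batch
    # of pages) instead of the whole ~120 gig archive
    decomp = bz2.BZ2Decompressor()
    parts = []
    with open(dump_path, "rb") as f:
        f.seek(offset)
        while not decomp.eof:
            block = f.read(chunk)
            if not block:
                break
            parts.append(decomp.decompress(block))
    return b"".join(parts).decode("utf-8", errors="replace")

off = find_offset("enwiki-latest-pages-articles-multistream-index.txt.bz2", "Linux")
if off is not None:
    print(read_stream_at("enwiki-latest-pages-articles-multistream.xml.bz2", off)[:500])
```

If something like that pans out, the card would only need the compressed dump and its index, not the full 120 gigs.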
It would be awesome if there were a way to do incremental updates... but alas, I think that would put a bit too much load on the Wikipedia servers.
Maybe I'll just have to download the weekly dump to a server of my own, decompress it, and set up an SVN repository that fellow Pandorians could do incremental updates from... I imagine a small server could handle a dozen or so wiki-crazy Pandorians doing SVN updates off it every week.
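Roughly what that weekly server-side job could look like, sketched in Python just for concreteness: fetch the dump, unpack it into an SVN working copy, commit, and let everyone else run svn update. The dump URL, the paths, and the assumption that wget, bunzip2 and svn are installed and the repository already exists (created with svnadmin) are all placeholders:

```python
import subprocess

# placeholder URL; point it at whatever the current weekly dump actually is
DUMP_URL = "http://download.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"
WORKING_COPY = "/srv/wikidump-wc"  # a checkout of the repo made with `svnadmin create`

def weekly_update():
    # fetch and unpack the latest dump, then move it into the working copy
    subprocess.run(["wget", "-O", "/tmp/dump.xml.bz2", DUMP_URL], check=True)
    subprocess.run(["bunzip2", "-f", "/tmp/dump.xml.bz2"], check=True)
    subprocess.run(["mv", "/tmp/dump.xml", WORKING_COPY + "/enwiki.xml"], check=True)

    # put any new files under version control and commit; clients then pull
    # only the deltas with a plain `svn update`
    subprocess.run(["svn", "add", "--force", "."], cwd=WORKING_COPY, check=True)
    subprocess.run(["svn", "commit", "-m", "weekly Wikipedia dump"], cwd=WORKING_COPY, check=True)

weekly_update()
```

In practice you'd probably want to split the XML into lots of smaller files so the deltas stay small, but that's the general shape of it.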
edit: well, disregard most of what I said... those guys thought of it all already. Once I actually read the whole page in my first link, I realized they provide the means for everything we could want, including pre-parsed HTML dumps (14 gigs compressed) and incremental updating!