64GB SDXC Cards


Gricey said:
Prometheus said:
Jourdy288 said:
We could make games bigger than the ones that come on DVD/Blu-Ray/Whatever, and this means we'll be capable of... A lot.
But why would you want to? Compelling content matters much more than the filesize. :p

That's what people say when they are packing no content. :p
And that is what people say when they try to justify their huge data repositories. :p
 
WizardStan said:
Gricey said:
Prometheus said:
Jourdy288 said:
We could make games bigger than the ones that come on DVD/Blu-Ray/Whatever, and this means we'll be capable of... A lot.
But why would you want to? Compelling content matters much more than the filesize. :p

That's what people say when they are packing no content. :p
And that is what people say when they try to justify their huge data repositories. :p
Is it just me or did we create some odd chain of unusual smiley-faces? :p
But what I'm saying is that we'll be able to pack in LOTS of content. Good content. And plenty of it.
 
WizardStan said:
And that is what people say when they try to justify their huge data repositories. :p
How would you define a huge data repository?

In my experience, data will always fill up all available space, no matter how much space that is.
 
Caine said:
WizardStan said:
And that is what people say when they try to justify their huge data repositories. :p
How would you define a huge data repository?

In my experience, data will always fill up all available space, no matter how much space that is.
+1!
 
Caine said:
In my experience, data will always fill up all available space, no matter how much space that is.
But that's the point: developers saying "we have all this space!" and then doing whatever it takes to just fill it in. This can become detrimental as resources are pulled from gameplay and put into making cutscenes more realistic simply because they can. Graphics quality is asymptotic: the closer you get to "perfect", the more resources it takes to get a little more perfect.
If you have less space, it is much easier to fill with lower quality sounds and graphics, which frees up resources for other things.
Imagine a game with true-to-life boob physics. I like jiggly boob physics as much as the next guy, but think: that's one guy (or sometimes more) who was solely responsible for writing the code to make those boobs jiggle. Imagine that team was instead tasked with coding some aspect that actually affected gameplay. Is the addition of jiggly boobs really worth whatever cool thing this team might have added if they weren't otherwise occupied?
If nothing else, the team could simply have never existed, reducing the overall cost of development. Do this with enough superfluous features and you still have a good game with reasonable quality stuff, but at a cut of the cost.
But no, we've got team leads saying "we've got 4GB of storage space, may as well fill it up with jiggly boobs!"
 
Mjlink said:
http://www.microcenter.com/single_product_results.phtml?product_id=0314845
$59.99 32GB Class VI
Too bad they don't ship outside US :(

@ WizardStan: BOOBS! \o/


This SDXC support is simply awesome!
Now I don't have to carry around a bulky external HD ;)
 
WizardStan said:
Caine said:
In my experience, data will always fill up all available space, no matter how much space that is.
But that's the point: developers saying "we have all this space!" and then doing whatever it takes to just fill it in. This can become detrimental as resources are pulled from gameplay and put into making cutscenes more realistic simply because they can. Graphics quality is asymptotic: the closer you get to "perfect", the more resources it takes to get a little more perfect.
If you have less space, it is much easier to fill with lower quality sounds and graphics, which frees up resources for other things.
And that's kinda why I'm thinking let's fill it up - but with quality content! That's all I have to say, really.
 
Jourdy288 said:
And that's kinda why I'm thinking let's fill it up - but with quality content! That's all I have to say, really.
But that's the problem with having too much space. You can say to fill it only with quality, but there comes a point where you just don't have anything else to add: your game is done, and it is good! You're sitting at three gigs, one gig remains, you've done well, it's a quality game, and then some dumbass says "so what are we going to do with the extra gig?" and you're back to implementing jiggle physics, because you just have to fill up that last gig for some reason, trying to justify the need for that much space and to push the bounds of the media to justify the need for even more space.
Having more space available would be good if it were filled with quality, as you say, and then STOPPED. There's no shame in not filling every nook and cranny with whatever blasted feature they can throw on there simply because they have the room for it.

Basically I'm saying: you can't get true quality from "we've got this space, what can we do with it?". Storage is incredibly flexible: design your quality around the amount of RAM and CPU you've got, and one way or another your storage will accommodate it.
 
I got that, Stan - perhaps we could come up with some new stuff? Like maybe more advanced graphics? Keep in mind that the possibilities are (nearly) endless, so the stuff you think 'hey, wouldn't it be cool if...' about is now possible. I'm not saying to just throw together random thoughts - you'll get garbage - but with that kind of space the thinking should be less 'why add it?' and more 'why not add it?'. Get where I'm coming from?
And I agree. There's nothing wrong with having that extra space left over. I'm just saying we should utilize what's available to us.
 
I wonder if there is a way to download the top 20,000 (by hits) or so Wikipedia pages with full graphics and the rest with just text... How much space would that consume?
 
Grench said:
I wonder if there is a way to download the top 20,000 (by hits) or so Wikipedia pages with full graphics and the rest with just text... How much space would that consume?

Why yes there is!

...only without the graphics; due to licensing issues, they don't let people download dumps with images.

http://en.wikipedia.org/wiki/Wikipedia_database

Not sure about the top 20k pages thing either, but certainly a complete dump is available and could then be trimmed down on a desktop computer with some creative scripting.
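
Purely as a sketch of what that "creative scripting" might look like (the file names, and the assumption that every article sits in its own <page>...</page> block with a <title> line, are mine - treat it as a starting point, not a recipe), something like this streams the compressed dump and keeps only the articles whose titles you list:
Code:
# titles.txt is assumed to hold one wanted article title per line
bzcat enwiki-latest-pages-articles.xml.bz2 | awk '
  BEGIN { while ((getline t < "titles.txt") > 0) want["<title>" t "</title>"] = 1 }
  /<page>/   { buf = ""; keep = 0 }            # start buffering a new article
             { buf = buf $0 "\n" }             # accumulate the current article
  /<title>/  { line = $0; gsub(/^[ \t]+/, "", line); if (line in want) keep = 1 }
  /<\/page>/ { if (keep) printf "%s", buf }    # emit only the wanted articles
' > trimmed-pages.xml

That is the sort of thing you would run overnight on a desktop and then copy the result over to the SD card.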

I personally would download this one... it's only the latest revisions of every article without any of the user or talk pages.

6 gigs compressed and it says on their page that it can be up to 20 times that size uncompressed!

I know I for one will be looking into this further once I get my Pandora and a 32+ gig SD card.

If the articles can be read individually from within the compressed file then this might be viable on the Pandora... if not you'd need approx. 120 gigs of space to decompress the archive into. Hey a real use for a pair of SDXC cards :D
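
If individual access does turn out to be possible, I suspect it would be via the "multistream" flavour of the dump, which - and this is my assumption, so check the dump page - ships with an index of one "offset:page_id:title" line per article pointing into the compressed file. With the index decompressed, pulling a single block out would look roughly like this (the article title is only an example):
Code:
# Hypothetical sketch: each bz2 block in the multistream dump holds ~100 pages,
# so extract and decompress just the block containing the wanted article.
idx=enwiki-latest-pages-articles-multistream-index.txt
dump=enwiki-latest-pages-articles-multistream.xml.bz2
start=$(grep -m1 ':Pandora (console)$' "$idx" | cut -d: -f1)      # byte offset of our block
end=$(awk -F: -v s="$start" '$1 > s { print $1; exit }' "$idx")   # byte offset of the next block
tail -c +$((start + 1)) "$dump" | head -c $((end - start)) | bzcat > one-block.xml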

It would be awesome if there was a way to do incremental updates... but alas I think this would put a bit too much load on the wikipedia servers.

Maybe I'll just have to download the weekly dump to a server of my own, decompress it, and setup an SVN repository that fellow Pandorians could do incremental updates from... I imagine a small server could handle a dozen or so wiki-crazy Pandorians doing SVN updates off it on a weekly basis.

edit: well disregard most of what I said... those guys thought of it all already... once I actually read the whole page in my first link, I realized that they provide the means for everything we could want... including pre-parsed html dumps (which are 14 gigs compressed) and incremental updating!
 
paulguy said:
You can use
Code:
tune2fs -L <label> <device>
to add a volume name.
bash: tune2fs: command not found :-(
So, as I have a freshly formatted card, I'll just repeat mkfs, this time with the -L "myvolume" argument :)
But ED should add tune2fs to the next HotFix. EvilDragon, do you read this - should I send a PM? :)
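
For anyone else doing the same, a minimal sketch of both routes (the device node below is only an example - double-check which one is actually your card before formatting anything!):
Code:
sudo mkfs.ext2 -L myvolume /dev/mmcblk0p1   # set the label while formatting (wipes the card)
sudo tune2fs -L myvolume /dev/mmcblk0p1     # or relabel an existing ext2/ext3 partition in place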
 
peca said:
bash: tune2fs: command not found :-(
So, as I have a freshly formatted card, I'll just repeat mkfs, this time with the -L "myvolume" argument :)
But ED should add tune2fs to the next HotFix. EvilDragon, do you read this - should I send a PM? :)
file a bug report (feature request) at http://bugs.openpandora.org/
 
mitosis said:
Grench said:
I wonder if there is a way to download the top 20,000 (by hits) or so Wikipedia pages with full graphics and the rest with just text... How much space would that consume?

Why yes there is!

...only without the graphics; due to licensing issues, they don't let people download dumps with images.

http://en.wikipedia.org/wiki/Wikipedia_database


So, back to my question, no - you are not aware of any way to download the top 20K (or so) Wikipedia articles WITH pictures.

I wonder... Are there any Wikipedia stats published? I.e., the hit-counts-per-URL sort of thing?
 
Jourdy288 said:
I got that, Stan - perhaps we could come up with some new stuff? Like maybe more advanced graphics? Keep in mind that the possibilities are (nearly) endless, so the stuff you think 'hey, wouldn't it be cool if...' about is now possible. I'm not saying to just throw together random thoughts - you'll get garbage - but with that kind of space the thinking should be less 'why add it?' and more 'why not add it?'. Get where I'm coming from?
And I agree. There's nothing wrong with having that extra space left over. I'm just saying we should utilize what's available to us.
The CPU/GPU remains the same, the amount of RAM remains the same.
 
Grench said:
I wonder... Are there any Wikipedia stats published? I.e., the hit-counts-per-URL sort of thing?
http://wikistics.falsikon.de/latest/wikipedia/en/
http://stats.grok.se/en/top
http://stats.wikimedia.org/
 
Grench said:
So, back to my question, no - you are not aware of any way to download the top 20K (or so) Wikipedia articles WITH pictures.
There is no way to mass download any of the pictures, period. At least, nothing endorsed by Wikipedia.
As far as the top 20K is concerned: define "top". Most edited? Most viewed? Most relevant? In any event, there's still no easy way of doing it: the functionality just isn't important to enough people for it to exist.
 
So, what we need is an HTML robot.

Start with the top 3K searches and grab the HTML tree of the responses as of the execution date/time.
Download all of the links from that tree, three layers deep, for offline viewing (a rough sketch follows at the end of this post).

Download all articles in text form as a companion.

Allow the user to flag an article for addition to the graphics download simply by browsing the article. I.e. if I've ever read or looked up an article, bring me back that page (and 3 pages deep from it by HTML tree) with full graphics.

Have an application button to refresh it.

It shouldn't be that hard to build. The engine itself would carry no content - the user would query the content on a user by user basis.

Using it would probably violate the Wikipedia user agreement - even they seem to feel some need to lock down content. I don't see any reason why it shouldn't be built though.
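
For what it's worth, the fetching half of such a robot is not far off a plain wget loop. This is only a rough sketch (articles.txt, the depth and the flags are my assumptions), and as noted above you would want to be very gentle with the crawl rate, since recursive crawling of the live site is exactly what Wikipedia discourages:
Code:
# Hypothetical sketch: mirror each listed article plus the pages it links to,
# three levels deep, with images, rewritten for offline viewing.
# articles.txt is assumed to hold one URL-ready (underscored) title per line.
while read -r title; do
    wget --recursive --level=3 --page-requisites --convert-links \
         --adjust-extension --no-parent --wait=1 \
         "http://en.wikipedia.org/wiki/${title}"
done < articles.txt

The flag-an-article and refresh-button parts would then just be appending titles to articles.txt and re-running the loop.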
 
http://lifehacker.com/217250/download-of-the-day-wikipedia-cd-all-platforms

Disregard the 'for school children' bit. The articles are not dumbed down in the least. In actual fact, they have been checked and lengthened in most cases.

I had this on my PSP a few years ago. It's all static HTML with pictures and an index.
 
B_Lizzard said:
SDHC devices will only support the SDXC cards which use UHS104 speeds; SDHC devices will not recognize the SDXC cards which use the faster (SD 4.0), final specification of SDXC.
Just to be clear, does this mean that SD4.0 SDXC cards will not work on Pandora?
 