Possible PS2 Emu?


I know dozens of people who bought a netbook with Windows because it was cheaper and/or more powerful than those sold with Linux (don't even try to buy a Linux netbook at a store; those were never sold here). All of them replaced Windows with Linux because it works better for them. Maybe I know the wrong people, or it's because Germany's so Linux-friendly, but my personal guess is that the reason for Windows' strength in the netbook market is the retailers, not the customers. ARM-powered netbooks could indeed turn the market, as there are real arguments to prefer ARM over x86 and therefore Linux over Windows CE, but it again depends on the will of the shops whether those devices will ever hit the mass market.
 
My personal belief is that the Google Chrome OS will actually cause many people to convert to Linux.
You buy a netbook running Chrome, and want to get more power, what do you do?
You load up Linux, that's what.

The Chrome thing will make it "OK" to be running something other than Windows, for the average user.
 
Exophase said:
cb88 said:
Well we have multicore ARM processors that will take over the netbook market
I don't know about that. People like their netbooks with Windows, as the market continues to prove.
Unless MS comes out with Win7 for ARM (which is unlikely, but certainly not impossible), don't expect double-digit market share for non-x86 netbooks.

Exophase said:
cb88 said:
and personally I think AMD's Bulldozer may turn the tide toward AMD for a spell... especially if Intel actually tries to ship Larrabee, which is a terrible design
Do you have any links on Bulldozer and Larrabee benchmarks?
Bulldozer won't be out for over a year; it's a non-player in 2010. Larrabee has been "delayed indefinitely" as a consumer product, though Intel still intends to make it available eventually as a research platform.

Anything could happen in 2011 and beyond, but just looking at the upcoming year, expect more of the same from everybody. The only thing changing in the next 12 months is Intel moving their northbridge (including IGP) onto the CPU die. It should make for small but noticeable cost and power savings, but won't really affect user experience at all. The IGPs they're using are only slight spec-bumps above the current offerings. Nothing revolutionary, but still pretty smart evolutions.
 
cb88 said:
@fischju2000 I agree. Even though I'm not fond of Java (and thus Dalvik), hopefully C code support will be allowed on such devices; otherwise I think they will flop, even if there are a lot of Dalvik apps. I do find Dalvik interesting, though, as it claims to be more efficient than Java...

More efficient in size maybe, but right now there is no Dalvik JIT and I absolutely guarantee you that interpreters will never be close in performance to recompilers. I personally find Dalvik's whole spiel to be laughable.

There is actually a JIT in the works but what can be used right now is very unstable, introduces a lot of sudden slowdowns, uses way too much memory, and is overall barely any faster. With JITs like this it's no wonder they don't like them.
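To illustrate the gap (a toy C sketch with invented opcodes, nothing to do with Dalvik's actual instruction set): a pure interpreter pays an opcode fetch plus a hard-to-predict dispatch branch on every emulated instruction, every time through, while a recompiler pays the translation cost once and then runs straight-line native code.

[code]
#include <stdint.h>
#include <stdio.h>

enum { OP_ADDI, OP_SUBI, OP_HALT };   /* invented toy opcodes */

int main(void) {
    uint8_t code[] = { OP_ADDI, 5, OP_ADDI, 7, OP_SUBI, 2, OP_HALT };
    int acc = 0;

    /* interpreter: fetch + dispatch per emulated instruction, every pass */
    for (int pc = 0;;) {
        switch (code[pc++]) {
        case OP_ADDI: acc += code[pc++]; break;
        case OP_SUBI: acc -= code[pc++]; break;
        case OP_HALT: goto halted;
        }
    }
halted: ;
    /* what a recompiler would emit for the same block, once:
     * straight-line native code with no fetch/dispatch left */
    int acc2 = 0;
    acc2 += 5; acc2 += 7; acc2 -= 2;

    printf("interpreted=%d recompiled=%d\n", acc, acc2);
    return 0;
}
[/code]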
 
Exophase said:
More efficient in size maybe, but right now there is no Dalvik JIT and I absolutely guarantee you that interpreters will never be close in performance to recompilers. I personally find Dalvik's whole spiel to be laughable.

There is actually a JIT in the works but what can be used right now is very unstable, introduces a lot of sudden slowdowns, uses way too much memory, and is overall barely any faster. With JITs like this it's no wonder they don't like them.
What, Dalvik is an interpreter?! OK, I haven't really cared about the whole Android transition thing a whole lot, but I assumed that since Dalvik bytecode uses virtual registers, compared to normal JVM bytecode which uses a stack, at least some translation to machine code would have to happen for it to have any point at all.

But yeah, Dalvik should use less memory, since its more lenient memory structure makes it possible to drop the JVM's stupid 8-byte in-memory class header and 8-byte structure alignment. A char (aka wchar_t in C lingo) stored on the heap on a PC uses from 16 up to 32 bytes of memory. 32! That should shrink to a more reasonable 4 bytes (when not contained in a packed array or a java.nio.CharBuffer, in which case it uses 2 bytes as always) with the Dalvik model, if they're doing things right.

And if Dalvik is only starting on a JIT implementation now, it might just be a tad too late. We all know that HotSpot is unbeaten in practice when it comes to JIT systems, but even it suffers from flaws inherited from pre-JIT JVMs: programmers expect a JVM to behave a certain way, and suddenly, on a JITed system, find that their programs deadlock because of races that never surfaced before. The JIT in the JVM has to avoid such situations, and thus introduces artificially conservative optimizations so that code speeds up semi-proportionally across all portions of it. And on a limited platform such as the ones Dalvik has to run on... let's just say the JIT won't have such a big performance playpen, and that it will have to grow up quickly to become successful.
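To make the register-vs-stack difference concrete, here's a toy C sketch (the opcodes are invented, not real JVM or Dalvik bytecode): computing a = b + c takes five fetch/dispatch rounds on a stack machine but only two on a register machine, because the operands are encoded into the instruction.

[code]
#include <stdio.h>

enum { ILOAD_B, ILOAD_C, IADD, ISTORE_A, HALT };  /* stack-style, invented */
enum { ADD_A_B_C, RHALT };                        /* register-style, invented */

int main(void) {
    int a = 0, b = 3, c = 4;

    /* stack machine: five fetch/dispatch rounds for one add */
    int stack[8], sp = 0;
    int scode[] = { ILOAD_B, ILOAD_C, IADD, ISTORE_A, HALT };
    for (int pc = 0;; pc++) {
        switch (scode[pc]) {
        case ILOAD_B:  stack[sp++] = b; break;
        case ILOAD_C:  stack[sp++] = c; break;
        case IADD:     sp--; stack[sp-1] += stack[sp]; break;
        case ISTORE_A: a = stack[--sp]; break;
        case HALT:     goto stack_done;
        }
    }
stack_done: ;
    /* register machine: two rounds, operands named in the opcode */
    int rcode[] = { ADD_A_B_C, RHALT };
    for (int pc = 0;; pc++) {
        switch (rcode[pc]) {
        case ADD_A_B_C: a = b + c; break;
        case RHALT:     goto reg_done;
        }
    }
reg_done:
    printf("a = %d\n", a);
    return 0;
}
[/code]

The flip side is that register-style instructions are wider, which is part of why the size comparison above isn't clear-cut.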
 
Is there a PS2 emulator that runs decently on a PC?
If so, can the Pandora 2 be designed to run a PS2 emulator?
I know how everybody hates to talk about the Pandora 2, but there's nothing to do on the forum yet except wait.
 
trats20050 said:
Is there a PS2 emulator that runs decently on a PC?
If so, can the Pandora 2 be designed to run a PS2 emulator?
I know how everybody hates to talk about the Pandora 2, but there's nothing to do on the forum yet except wait.

Maybe in about 10 years. High-end PCs today can barely run most PS2 games at full speed. Handhelds seem to be about 10 years behind PCs in terms of power. It was around this time 10 years ago that Quake 3 and Unreal Tournament came out, and they needed high-end PCs to play at great frame rates and full-quality graphics settings.
 
Exophase said:
Do you have any links on Bulldozer and Larrabee benchmarks? I haven't seen anything on either, and I suddenly feel very out of touch with current x86 news :/
Like Chip said, Larrabee will be a no-show and Bulldozer is at least a year away from the market. But from what I've read, it looks like they're prepping it to integrate GPU features right into the CPU architecture (instead of having a small GPU placed on the die next to the CPU cores, like the new Atom chips coming out).

conso said:
I know dozens of people who bought a netbook with Windows because it was cheaper and/or more powerful than those sold with Linux (don't even try to buy a Linux netbook at a store; those were never sold here).
That's weird; the situation used to be completely the opposite here: since Linux costs OEMs almost nothing except tweaking (and a few codecs if they threw those in for media playback), they offered it with twice the RAM and extra storage space. And when I went shopping at Target about six months ago, the only Eee PCs they had left were the Linux ones.

There are companies out there pursuing and demoing ARM-powered "smartbooks," but so far they look like regular netbooks running on Qualcomm, Samsung, and TI SoCs, and there's a lot of talk about them pursuing Android or Chrome OS as operating systems (though I bet a version of Ubuntu Netbook Remix for ARM would be a strong contender as a fully-featured OS with a mobile-friendly interface). The Pandora's still pretty unique in that it's going for a handheld format with game controls and a full desktop environment.
 
MDave said:
Maybe in about 10 years. High-end PCs today can barely run most PS2 games at full speed. Handhelds seem to be about 10 years behind PCs in terms of power. It was around this time 10 years ago that Quake 3 and Unreal Tournament came out, and they needed high-end PCs to play at great frame rates and full-quality graphics settings.

Depends on where the market goes, though. Let's say we start seeing decent FPGAs appear in handhelds. That could start to help emulation immensely. You only need an FPGA at a few hundred MHz to provide a massive improvement; they can be much more efficient at emulation than a CPU is. Now granted, you probably won't be seeing FPGAs that let you dump an Emotion Engine on them any time soon, but if the interface between an FPGA and a CPU is somehow made tight enough, you could enhance emulation a lot with just some blocks of custom fabric. Imagine if an FPGA were mounted via the ARM coprocessor interface and you could define custom operations that can do anything on a given register, including using any state available to the FPGA. That'd be pretty slick.
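Purely as a hypothetical sketch of that last idea (the coprocessor number, register assignments, and behavior below are all invented; nothing like this exists on the Pandora's OMAP3, and executing it on real hardware would just trap on an undefined coprocessor): custom FPGA-backed operations could look like ordinary MCR/MRC accesses from C. ARM-only, so it needs an ARM toolchain.

[code]
#include <stdint.h>

/* send a value into the FPGA fabric (made-up custom op on made-up p7) */
static inline void fpga_put(uint32_t x)
{
    __asm__ volatile("mcr p7, 0, %0, c0, c0, 0" :: "r"(x));
}

/* read back the fabric's result (made-up custom op) */
static inline uint32_t fpga_get(void)
{
    uint32_t r;
    __asm__ volatile("mrc p7, 0, %0, c1, c0, 0" : "=r"(r));
    return r;
}

/* e.g. let the fabric perform one emulated-chip step on a register */
uint32_t emulate_step(uint32_t state)
{
    fpga_put(state);
    return fpga_get();
}
[/code]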
 
Laurent said:
If you know what you're talking about, frequency can be used: the R5900 is a dual-issue 64-bit processor running at 300 MHz; the Cortex-A8 in the Pandora is a dual-issue 32-bit processor running at 600 MHz. From there you can deduce that the Pandora probably doesn't have enough power even for the CPU alone (as a reference, the N64 has an R4300i, which is a single-issue 64-bit core running at 100 MHz).
As with N64 emulation, it largely depends on the cache miss rate and how fast the memory system is. That 8-cycle L1 miss penalty accounts for a large amount of CPU time, and so does the store buffers filling up.
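A rough, illustrative calculation (assumed figures, not measurements): at a base CPI of 1.0, if just 5% of instructions miss L1 with an 8-cycle penalty, effective CPI = 1.0 + 0.05 x 8 = 1.4, so misses alone eat nearly a third of the CPU time.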

Laurent said:
I fully agree, and I think that's the reason why emulators should move to multicore. Of course there will always be one core that is the hot spot (the one simulating the CPU), but if you can move some tasks to other cores (sound, graphics, etc.) and have low-overhead synchronization, it's a win.
It's possible to do a dual-processor dynamic recompiler: one CPU starts out running an interpreter and sends the addresses it executes to the second processor. The second processor recompiles the code, then signals the first CPU to execute the compiled blocks instead of the interpreter. After a while the second CPU is mostly idle, but it improves the startup time.
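A minimal sketch of that pattern (assumed structure, not Ari64's actual code; the block addresses and timings are faked): one thread interprets and queues block addresses, a second thread "compiles" them, and the interpreter switches to the compiled version of a block once it's flagged ready.

[code]
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>
#include <unistd.h>

#define NBLOCKS 8

typedef enum { COLD, QUEUED, COMPILED } bstate;
static _Atomic bstate block_state[NBLOCKS];

/* single-producer/single-consumer queue of block addresses */
static int queue[NBLOCKS];
static atomic_int qhead, qtail;

static void interpret_block(int pc) { printf("interp   %d\n", pc); usleep(1000); }
static void run_compiled(int pc)    { printf("compiled %d\n", pc); usleep(100);  }

/* second core: pull addresses, "recompile" them, flag them ready */
static void *compiler_thread(void *arg) {
    (void)arg;
    for (;;) {
        int slot = atomic_load(&qhead);
        while (slot == atomic_load(&qtail))
            ;                           /* mostly idle once all blocks are done */
        int pc = queue[slot % NBLOCKS];
        atomic_store(&qhead, slot + 1);
        usleep(5000);                   /* stand-in for recompilation work */
        atomic_store(&block_state[pc], COMPILED);
    }
    return NULL;
}

int main(void) {
    pthread_t tid;
    pthread_create(&tid, NULL, compiler_thread, NULL);

    /* first core: a fake program that keeps revisiting 8 blocks */
    for (int i = 0; i < 64; i++) {
        int pc = i % NBLOCKS;
        switch (atomic_load(&block_state[pc])) {
        case COMPILED:
            run_compiled(pc);           /* fast path once translation is done */
            break;
        case COLD: {
            int slot = atomic_load(&qtail);
            queue[slot % NBLOCKS] = pc;
            atomic_store(&qtail, slot + 1);   /* publish after the write */
            atomic_store(&block_state[pc], QUEUED);
        }   /* fall through: keep interpreting until the block is ready */
        case QUEUED:
            interpret_block(pc);
            break;
        }
    }
    return 0;
}
[/code]

Compile with -pthread. The point is that the interpreter never waits on the compiler, so the translation cost hides behind useful work.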

Exophase said:
What's needed all depends on the synchronization demands of what's being emulated, which is of course a fuzzy number since it's really a question of how sensitive the software is to timing inaccuracy. For PCSX2 they have determined that the switching between the emulated CPU cores (R5900, VUs in micro-mode, IOP) happens every 512 cycles. I think this is still too fine-grained for threads on different CPUs to synchronize at without introducing more overhead than the emulation of the block itself plus the single-core task-switch overhead. This might not strictly be the case; spinning on a timestamp counter should be fast enough, although it's still putting out loads to whatever the shared cache is, so probably a few hundred clock cycles. But the single-core switching overhead is going to be quite high too, since it'll have to switch register sets. DS games also require tight synchronization between their two CPUs to even boot; it's within several dozen bus cycles.
SNES is worse. The sync between the 65816 and the SPC700 is 1 or 2 instructions, and the sync with the graphics chip is generally one scanline (around 50-100 instructions).
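For reference, the fixed-slice switching being discussed boils down to something like this toy C sketch (cpu_a_run/cpu_b_run are invented stand-ins for the emulated cores):

[code]
#include <stdint.h>
#include <stdio.h>

#define SLICE 512   /* cycles per slice, per the PCSX2 figure above */

typedef struct { uint64_t now; /* ...plus the register file, etc. */ } cpu_t;

/* invented stand-ins: run one core until the target timestamp */
static uint64_t cpu_a_run(cpu_t *c, uint64_t target) {
    while (c->now < target) c->now += 2;  /* pretend ops take 2 cycles */
    return c->now;
}
static uint64_t cpu_b_run(cpu_t *c, uint64_t target) {
    while (c->now < target) c->now += 3;  /* a slower second core */
    return c->now;
}

int main(void) {
    cpu_t a = {0}, b = {0};
    uint64_t frame_end = 100000, target = SLICE;

    /* each core catches up to a shared target, then we move on;
     * smaller slices mean tighter sync but more switch overhead */
    while (target <= frame_end) {
        a.now = cpu_a_run(&a, target);
        b.now = cpu_b_run(&b, target);
        target += SLICE;
    }
    printf("a=%llu b=%llu\n",
           (unsigned long long)a.now, (unsigned long long)b.now);
    return 0;
}
[/code]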
 
Ari64 said:
SNES is worse. The sync between the 65816 and the SPC700 is 1 or 2 instructions, and the sync with the graphics chip is generally one scanline (around 50-100 instructions).

byuu, the author of bsnes, can tell you that the synchronization granularity is the full 21 MHz and 24 MHz clocks of the CPU/PPU and the audio subsystem respectively. Not that you have to switch that frequently, just whenever one accesses the state of the other.

But the SNES's clocks are still much lower than the PS2's, or even the DS's for that matter, so it's hardly slower to emulate in the end...
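That "switch only when one side touches the other" scheme looks roughly like this (a toy model, not bsnes code; it assumes the APU never needs to poke the CPU on its own, which is exactly the assumption that breaks down when sync is as tight as described above):

[code]
#include <stdint.h>
#include <stdio.h>

static uint64_t cpu_clock, apu_clock;
static uint8_t  apu_port;

static void apu_step(void) { apu_clock += 3; apu_port++; /* pretend work */ }

/* sync point: run the APU until it reaches the CPU's timestamp */
static void apu_catch_up(void) {
    while (apu_clock < cpu_clock)
        apu_step();
}

static uint8_t cpu_read_apu_port(void) {
    apu_catch_up();   /* only sync here, not on every cycle */
    return apu_port;
}

int main(void) {
    cpu_clock += 2000;   /* the CPU runs far ahead with no switching */
    printf("port=%u, apu caught up to %llu\n",
           (unsigned)cpu_read_apu_port(), (unsigned long long)apu_clock);
    return 0;
}
[/code]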
 
Exophase said:
MDave said:
Maybe in about 10 years. High-end PCs today can barely run most PS2 games at full speed. Handhelds seem to be about 10 years behind PCs in terms of power. It was around this time 10 years ago that Quake 3 and Unreal Tournament came out, and they needed high-end PCs to play at great frame rates and full-quality graphics settings.

Depends on where the market goes, though. Let's say we start seeing decent FPGAs appear in handhelds. That could start to help emulation immensely. You only need an FPGA at a few hundred MHz to provide a massive improvement; they can be much more efficient at emulation than a CPU is. Now granted, you probably won't be seeing FPGAs that let you dump an Emotion Engine on them any time soon, but if the interface between an FPGA and a CPU is somehow made tight enough, you could enhance emulation a lot with just some blocks of custom fabric. Imagine if an FPGA were mounted via the ARM coprocessor interface and you could define custom operations that can do anything on a given register, including using any state available to the FPGA. That'd be pretty slick.

This is a shot in the dark, but:
Could a USB FPGA be used? The OP has USB 2.0. I know it would be in the way; it's more just a thought out loud...

[post='http://www.fpgaz.com/usbp/']Similar to This?[/post]
I have been learning about FPGAs from a mate of mine who is a telecoms tech and into radios and all that nerd stuff.
He has a GNU Radio setup (there's an FPGA in that, etc.).
 
MDave said:
trats20050 said:
Is there a PS2 emulator that runs decently on a PC?
If so, can the Pandora 2 be designed to run a PS2 emulator?
I know how everybody hates to talk about the Pandora 2, but there's nothing to do on the forum yet except wait.

Maybe in about 10 years. High-end PCs today can barely run most PS2 games at full speed. Handhelds seem to be about 10 years behind PCs in terms of power. It was around this time 10 years ago that Quake 3 and Unreal Tournament came out, and they needed high-end PCs to play at great frame rates and full-quality graphics settings.

:blink: I just built my brother a computer for under $1000 that can run any PS2 game that actually works with PCSX2 at >60 fps, with max settings and the internal resolution set to 1920x1080. This computer is nowhere near top of the line.

I don't think a handheld emulating the PS2 will be feasible in under 4 years, but I think 10 might be a bit extreme.

Unfortunately, as they say, "An optimist is never pleasantly surprised."
 
You might as well just get a PS2 (fat) and install a hard drive and boot games (that you own) from that.
That would cost under $100-$150 (country dependent)!
It saves all the effort of emulation and incompatibility.
 
Unless, of course, you want to do other computer stuff as well :unsure:
 
kingoddball said:
You might as well just get a PS2 (fat) and install a hard drive and boot games (that you own) from that.
That would cost under $100-$150 (country dependent)!
It saves all the effort of emulation and incompatibility.
I always saw emulation as a convenience. I could hook up my NES to my CRT monitor after digging it out of storage to have a perfect experience, or I could fire up an NES emulator on any of the 4 devices I use every day, even if the sound isn't perfect.

Emulation for me has never been about a 100% perfect experience; it's mainly about the convenience of playing games on devices I use every day.
 
In terms of emulation, the PS2 generation of consoles should be out of the question for reasons other than performance. The most widely used file system today for USB drives and memory cards is FAT32, which limits files to a maximum size of 2GB. That is not enough for consoles that use DVDs (up to 8GB images).

Furthermore, you will want a decent amount of games available on your storage without having to buy lots of SD cards and constantly switch them. Two 32GB SDHC cards are the maximum for the Pandora. Even if their size doubles to 2x 64GB in one or two years' time, that would not be enough with PS1 images and other stuff also on the cards. You will need at least 2x 120GB to be comfortable.
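For a rough sense of scale: at up to 8GB per dual-layer image, 2x 64GB would hold only about 16 full-size PS2 games, before counting PS1 images or anything else.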
 
second exodous said:
kingoddball said:
You might as well just get a PS2 (fat) and install a hard drive and boot games (that you own) from that.
That would cost under $100-$150 (country dependent)!
It saves all the effort of emulation and incompatibility.
I always saw emulation as a convenience. I could hook up my NES to my CRT monitor after digging it out of storage to have a perfect experience, or I could fire up an NES emulator on any of the 4 devices I use every day, even if the sound isn't perfect.

Emulation for me has never been about a 100% perfect experience; it's mainly about the convenience of playing games on devices I use every day.


I have to agree with you. I always loved turning on Sonic 2 (with Knuckles) and playing around for a while, then switching to SMW or SMB3 (NES). You are correct.
I've always loved emulation, but for the larger, newer consoles (still on the market), you might as well stick them on a full-screen TV.
Just my thoughts. But I love my old 16-bit games!


Pleng said:
Unless, of course, you want to do other computer stuff aswell :unsure:

Want a smart-ass answer?
Here it is: the PS2 Linux Kit.

[post='http://playstation2-linux.com/']PS2 Linux[/post]



MiniSinisterMinister said:
In terms of emulation, the PS2 generation of consoles should be out of the question for reasons other than performance. The most widely used file system today for USB drives and memory cards is FAT32, which limits files to a maximum size of 2GB. That is not enough for consoles that use DVDs (up to 8GB images).

Furthermore, you will want a decent amount of games available on your storage without having to buy lots of SD cards and constantly switch them. Two 32GB SDHC cards are the maximum for the Pandora. Even if their size doubles to 2x 64GB in one or two years' time, that would not be enough with PS1 images and other stuff also on the cards. You will need at least 2x 120GB to be comfortable.


FAT32 has a 4GB file limit.
HFS/NTFS does not seem to have a (practical) limit.
Storage is CHEAP and easily available: 1000GB (3.5" HDD, or 500GB 2.5") for under $100, and you can get a USB case for $20.
It's pointless using SD cards as a reference; the Pandora won't be playing PS2 games anyway. Simple.
Most PCs don't have an SD slot, and SD read/write speeds are not as fast as USB flash drives unless you want to buy higher-class cards.
PS1 images are 700MB max, and if you cut down the FMVs that drops even lower, like ISO/CSO for the PSP.
 
MiniSinisterMinister said:
In terms of emulation, the PS2 generation of consoles should be out of the question for reasons other than performance. The most widely used file system today for USB drives and memory cards is FAT32, which limits files to a maximum size of 2GB. That is not enough for consoles that use DVDs (up to 8GB images).

Furthermore, you will want a decent amount of games available on your storage without having to buy lots of SD cards and constantly switch them. Two 32GB SDHC cards are the maximum for the Pandora. Even if their size doubles to 2x 64GB in one or two years' time, that would not be enough with PS1 images and other stuff also on the cards. You will need at least 2x 120GB to be comfortable.
Comment on your first point: you don't need to format your SD cards to FAT32; mine are formatted EXT3, so that argument is moot.

Comment on your second point: yes, storage is limited, but it's enough to fit a few games you really like, even if they are DVD-sized. Plus, are there any PS2 games that use the full 8GB? I never ripped one, so I wouldn't know, but PSX games are rarely even 650MB in size.

Anyway, your two arguments are invalid. I'm guessing we won't see PS2 on handhelds for a while anyway, and by that time your limitations won't exist.
 