WizardStan
Mega GP Mania
Joined: May 24, 2008
Messages: 16,731
So it has been about 7 years since the last time I built a computer. My current one still crushes pretty much everything I throw at it, but it could crush them better. My brother made me a good offer to buy my existing one, and I planned on using that money to build an all-new one. Apparently, in the intervening 7 years, while I wasn't really paying attention to the desktop processor market, a lot of things changed, and I am confused.
Apparently AMD is no longer considered serious competition for Intel. I guess that makes that decision easy. Cores, clock speed, cache size: all straightforward and easy to compare.
But when I was investigating these new Intel CPUs, I encountered something that definitely wasn't there before: GPU architecture. I can't find a single website that explains what this actually means. Does the CPU actually have a GPU on it now? How does that even work? Do motherboards not come with an onboard GPU anymore? If I buy a separate graphics card, am I effectively wasting money buying a CPU with a built-in GPU?

I found a number of comparison sites that compare "onboard" graphics (which I am assuming means the GPU in the CPU) with graphics cards, and any recent card seemed to always edge out, and in some cases completely blow away, the onboard graphics, so it makes me wonder why they would even bother. I don't want to pay for something I'm not going to use; my current motherboard didn't have onboard graphics either. And even if I do use this onboard graphics processor, there seems to be mixed information about whether Linux can take advantage of it (or, in fact, anything older than Windows 7).
This is frustrating. What happened to CPUs being CPUs and RAM being RAM and graphics cards being graphics cards?
edit: I've apparently had this desktop for 7 years.