AMD Radeon graphics thing.
- Epicpeople321
- Offline
- Posts: 34
- Joined: January 24th, 2012, 10:58 pm
- Location: Your pants
AMD Radeon graphics thing.
WELL, I know integrated graphics are a no-no, but does anyone know anything about this graphics card? The AMD Radeon HD 6530D. I've found a quad-core processor that has it integrated as dual graphics, with up to 1GB of memory. So I guess the question is: since it's dual graphics, does it have 2GB? I'm also not really sure whether it shares RAM or not.
<Wundsalz> some people will be pissed at your formula
<#Ollieboy> They can suck it.
Re: AMD Radeon graphics thing.
That's a Llano-based quad core; think of it more like an Athlon II X4 (roughly Core 2 Quad level) in terms of processor performance.
Now as far as your Dual Graphics = Dual memory thing goes... No, it doesn't work like that.
If you have two graphics cards, each with 1GB of RAM, that RAM isn't pooled or shared between the graphics chips; instead, the same data is copied into each card's memory at the same time.
So you still end up with 1GB usable overall.
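To put rough numbers on that mirroring, here's a toy sketch (illustrative only, not how any driver actually reports memory):

```python
# Toy sketch of multi-GPU memory mirroring (illustrative only):
# each GPU keeps a full copy of the working set, so the usable amount
# is the smallest single card's VRAM, not the sum across cards.
def usable_vram_mb(cards_mb):
    """cards_mb: per-card VRAM sizes in MB."""
    return min(cards_mb)  # every card mirrors the same data

print(usable_vram_mb([1024, 1024]))  # -> 1024, not 2048
```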
With that said, more RAM doesn't make a graphics card faster; a low-end graphics solution like that will *never* be able to handle 2GB of game data at once and still deliver acceptable performance.
Plus, the ONLY time you will see a dramatic performance increase from 2GB or more of graphics memory is when you are running 2560x1440, 2560x1600, Eyefinity, or Surround Vision resolutions, and let's face it...
If you can afford a display set-up with those resolutions, you would also be shopping for high-end $300-$1000+ multiple graphics cards.
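For a back-of-the-envelope feel for why resolution is what drives memory use, here's a rough framebuffer calculation. Real usage is dominated by textures and render targets, so treat these numbers as a floor, not a total:

```python
# Rough framebuffer arithmetic: memory for the on-screen buffers alone.
# Textures and render targets dominate real VRAM use; this is just the
# part that scales directly with resolution.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Triple-buffered 32-bit colour by default."""
    return width * height * bytes_per_pixel * buffers / 1024**2

print(round(framebuffer_mb(1920, 1080), 1))  # ~23.7 MB, single 1080p screen
print(round(framebuffer_mb(2560, 1600), 1))  # ~46.9 MB, 30-inch panel
print(round(framebuffer_mb(5760, 1080), 1))  # ~71.2 MB, 3x1080p Eyefinity/Surround
```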
As for whether it's shared memory or not... That's hard to know without the actual model of the system itself, so if you can share the make and model that would be awesome. At a minimum, though, the integrated graphics HAS to use system memory.
The only exception to this rule is the old 600/700/800-series chipset IGPs, which can have SidePort memory that can be used either stand-alone or in conjunction with system memory.
Xecutioner91890: I wanna meet the owners that would be a dream !
<@hafnium> Fuck off
- Cartoonman
- SupOP
- Offline
- Posts: 229
- Joined: June 23rd, 2011, 7:42 pm
- Location: NYC
- Contact:
Re: AMD Radeon graphics thing.
Key phrase: "UP to 1GB". That means the chipset uses *up to* 1GB, not a guaranteed 1GB. In most cases, with integrated graphics (or even CPUs in general), memory is a shared resource: just as multiple CPU cores must share the same L3 cache, the dual integrated graphics must share that 1GB between them.
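The practical upshot of "up to 1GB" shared memory, as a hedged sketch (the actual carve-out size is set in the BIOS and varies by board; the figures below are examples, not a real system):

```python
# Hedged sketch: a shared-memory (UMA) IGP carves its frame buffer out
# of system RAM, so whatever it claims is subtracted from what the OS
# and your games can use. Sizes are example values, not a real board.
def remaining_system_ram_mb(installed_mb, igp_carveout_mb):
    return installed_mb - igp_carveout_mb

# 4GB installed, IGP taking its full 1GB allowance:
print(remaining_system_ram_mb(4096, 1024))  # -> 3072 MB left over
```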
In short, get a dedicated card if you can. My recommendation is the HD 5850 if you can get one. If not, anything in the 58xx, 68xx, or 78xx range or above should run about 99% of the games on the market at 15+ fps.
Pemalite here is a gaming freak, so his views on gaming involve three gigantic screens, the very best computing software, and a pile of graphics-intensive games to show off his e-peen, all while contributing nothing to Folding@home or any BOINC scientific project that could greatly benefit from his overpowered gaming monster. But I digress.
markiled: i just want to hump the speakers
Re: AMD Radeon graphics thing.
Cartoonman wrote: all while contributing nothing to Folding@home nor any BOINC scientific project that could greatly benefit from his overpowered gaming monster
Last I heard, Folding@Home had two of Pem's cores dedicated to it.
<TKB> Hit_Girl: zombies don't hurt
<TKB> Hit_Girl: weird.
<TKB> Hit_Girl was slain by Zombie
Re: AMD Radeon graphics thing.
Ollieboy wrote: Last I heard, Folding@Home had two of Pem's cores dedicated to it.
When I had your Phenom II X6 chip, I had two cores dedicated to it, always.
When I'm asleep, I fire up the CPU and GPU clients and my system happily folds proteins all night long.
When I'm not gaming, or I'm just playing Minecraft, I limit it to six cores and a single GPU, so my "overpowered" system is not going to waste.
On top of that, when it comes to upgrade time, other people end up with my old hardware, so they benefit from my purchases too.
Plus, running games at 5760x1080 is stupidly taxing; you can't run the latest games at max with anything less than two high-end GPUs. Hate on it alllll you want, I worked for my money, and I'll spend it the way I bloody well want!
Xecutioner91890: I wanna meet the owners that would be a dream !
<@hafnium> Fuck off
- Cartoonman
- SupOP
- Offline
- Posts: 229
- Joined: June 23rd, 2011, 7:42 pm
- Location: NYC
- Contact:
Re: AMD Radeon graphics thing.
Cartoonman wrote: But I digress.
Besides, Folding@home is too mainstream.
markiled: i just want to hump the speakers
- Epicpeople321
- Offline
- Posts: 34
- Joined: January 24th, 2012, 10:58 pm
- Location: Your pants
Re: AMD Radeon graphics thing.
Pemalite wrote: Plus, running games at 5760x1080 is stupidly taxing; you can't run the latest games at max with anything less than two high-end GPUs. Hate on it alllll you want, I worked for my money, and I'll spend it the way I bloody well want!
I don't think that's entirely true. I seem to remember you saying Minecraft can run on an ancient Roman sundial. Unless that's not considered a game.
<Wundsalz> some people will be pissed at your formula
<#Ollieboy> They can suck it.
Re: AMD Radeon graphics thing.
Epicpeople321 wrote: I don't think that's entirely true. I seem to remember you saying Minecraft can run on an ancient Roman sundial. Unless that's not considered a game.
I'm talking about the new and shiny games.
Xecutioner91890: I wanna meet the owners that would be a dream !
<@hafnium> Fuck off
- Epicpeople321
- Offline
- Posts: 34
- Joined: January 24th, 2012, 10:58 pm
- Location: Your pants
Re: AMD Radeon graphics thing.
I have yet another question: can you disable the integrated graphics, install a new graphics card, and then enable the new card? I couldn't find a clear answer on Google, so yeah.
<Wundsalz> some people will be pissed at your formula
<#Ollieboy> They can suck it.
- Sanjar Khan
- Trustee
- Offline
- Posts: 1766
- Joined: May 24th, 2011, 1:40 pm
- Location: Leiden, Zuid Holland
Re: AMD Radeon graphics thing.
What? If you install an actual card, you're done.
Ferrisbuler2: i will stay but i might not post cus of ollieboy
- Epicpeople321
- Offline
- Posts: 34
- Joined: January 24th, 2012, 10:58 pm
- Location: Your pants
Re: AMD Radeon graphics thing.
So even if you have a graphics card integrated into the CPU, you can just install a new one and use that? O:
<Wundsalz> some people will be pissed at your formula
<#Ollieboy> They can suck it.
- boblol0909
- SupOP
- Offline
- Posts: 314
- Joined: June 24th, 2011, 10:27 pm
- Epicpeople321
- Offline
- Posts: 34
- Joined: January 24th, 2012, 10:58 pm
- Location: Your pants
Re: AMD Radeon graphics thing.
Cool, thanks bobo.
<Wundsalz> some people will be pissed at your formula
<#Ollieboy> They can suck it.
Re: AMD Radeon graphics thing.
Epicpeople321 wrote: So even if you have a graphics card integrated into the CPU, you can just install a new one and use that? O:
Yes, as long as it's not a laptop.
Also, some integrated graphics chips can "work together" with a dedicated graphics card to boost your overall performance.
For example... You buy an AMD Llano quad-core desktop with a Radeon 6550D; drop in a Radeon 6670 and both Radeons will work together, providing performance roughly on par with a Radeon 6750.
It's essentially a free performance upgrade.
However, if you go for a faster card than the 6670, the two graphics chips won't work together; then again, there would be no need for them to. If you want to keep things cheap, though, it's an option, as the sketch below illustrates.
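If you want the pairing rule as a sketch (the partner list here is an assumed simplification for illustration; check AMD's own compatibility chart for the real one):

```python
# Illustrative sketch of the Dual Graphics rule: a Llano IGP only pairs
# with certain low-end discrete Radeons; anything faster just runs alone.
# The partner set below is an assumed example, not AMD's official table.
DUAL_GRAPHICS_PARTNERS = {
    "HD 6550D": {"HD 6450", "HD 6570", "HD 6670"},
}

def pairs_with(igp, card):
    return card in DUAL_GRAPHICS_PARTNERS.get(igp, set())

print(pairs_with("HD 6550D", "HD 6670"))  # True  -> they work together
print(pairs_with("HD 6550D", "HD 6850"))  # False -> the 6850 runs solo
```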
Xecutioner91890: I wanna meet the owners that would be a dream !
<@hafnium> Fuck off