I've been using an 8800GT for 2 years and it seems to be wearing out: games are lagging, and there are noises coming from the card. Is this normal? Any recommendations for a graphics card that lasts and performs well?
Did you update your graphics card drivers?
Originally posted by Battlecrusier: I've been using an 8800GT for 2 years and it seems to be wearing out: games are lagging, and there are noises coming from the card. Is this normal? Any recommendations for a graphics card that lasts and performs well?
Try one of those ATI graphics cards with a big, long heatsink and minimal fans, or none at all.
But I'm not an expert, because I also need a lot of help in this area.
Originally posted by Sgforum King: Did you update your graphics card drivers?
I already updated to the latest version, but it didn't help. Can anyone who knows IT help? Or is it time to buy a new card? If I need to buy one, which is the most long-lasting?
Sometimes whether a graphics card lasts has to do with the computer's cooling.
Best is to get one with non-reference cooling. 'Non-reference' means the heatsink design is not the same as ATI/NVIDIA's original reference design. Non-reference coolers are a bit more expensive, but they run at cooler temperatures.
Your 8800GT is still under warranty, right? They will replace it with a GTS 250 (IIRC) if they cannot fix it, as the 8800GT is already out of production.
If it's under warranty, go for an RMA and they will replace the card with a similar but newer model: a GTS 250, I think, or at the very least a 9800GT.
If not, I would suggest ATI cards like the 4850 or 4870 with good non-reference coolers, such as the PowerColor HD4850/4870 PCS or the XFX 4870s (a bit expensive, though). Gainward has a few non-reference designs too, though I hear they are noisy.
My card is not under warranty any more; the warranty was only for 1 year. So ATI is much better than NVIDIA? Thanks for your advice. For ATI I'll need to check whether my motherboard supports it, or else I'd need a full overhaul, and my budget is only $200 =.=
NVIDIA has PhysX, why would you go get ATI?
Haha, no need to worry about the motherboard. I'm using the PowerColor HD4850 PCS (about 20°C cooler under load than ATI's reference design) on an NVIDIA 630i motherboard, no problems whatsoever. My sister is using my old GeForce 9500 on an ATI motherboard, so you don't really need to worry about chipset incompatibility. As long as the slot (PCI-E) is the same, why not?
PhysX is more for eye candy than real-world applications. Besides, enabling PhysX will reduce framerates. Most people don't use PhysX anyway.
ATI also includes Folding@Home in their drivers, so when you aren't playing games, it can use your graphics card to help process scientific data. GPU acceleration is much faster because new GPUs can crunch this kind of data far faster than a CPU can. NVIDIA also has F@H based on CUDA, IIRC.
A Q6600 (a quad-core) has roughly 38.7 GFLOPS of processing power, while an HD4850 has close to 1 TeraFLOP, or 1000 GFLOPS. A GTS 250 from NVIDIA has roughly 738 GFLOPS. Of course, these are all theoretical peaks.
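Those theoretical figures come from simple arithmetic: number of cores (or stream processors) × clock speed × floating-point operations per cycle. A rough sketch in Python, using approximate published clocks and per-cycle figures (the exact per-cycle numbers vary by source, so treat these as illustrations, not official specs):

```python
# Rough theoretical peak throughput: cores x clock (GHz) x FLOPs per cycle.
# All figures are approximations for illustration.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak in GFLOPS (single precision)."""
    return cores * clock_ghz * flops_per_cycle

# Q6600: 4 cores at 2.4 GHz, ~4 SSE FLOPs per core per cycle -> ~38 GFLOPS
q6600 = peak_gflops(4, 2.4, 4)

# HD4850: 800 stream processors at 0.625 GHz, 2 FLOPs (multiply-add) per cycle
# -> ~1000 GFLOPS, i.e. about 1 TFLOP
hd4850 = peak_gflops(800, 0.625, 2)

print(f"Q6600:  ~{q6600:.1f} GFLOPS")
print(f"HD4850: ~{hd4850:.1f} GFLOPS")
```

Real-world throughput is far below these peaks, since they assume every unit issues its maximum FLOPs every cycle.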
I use my computer just to play games, and my CPU is only a Core 2 Duo, so PhysX is necessary for me.
Actually PhysX helps the CPU by calculating the 'physics' in the game, like how an object falls, so your game looks more realistic. If you let the CPU handle those calculations and you turn the resolution up, there might be some lag, because your CPU is handling more tasks.
In the past, people would buy a dedicated physics card to do those calculations, relieving the CPU of some stress, but now NVIDIA has that work running on their GPUs, so when people buy their GPUs they get physics processing included. It's partly marketing.
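For illustration only, the kind of per-frame 'physics' being offloaded is mostly repeated arithmetic like this minimal falling-object sketch (plain Python, not actual PhysX or Havok code); real engines run this for thousands of objects every frame, which is why a GPU helps:

```python
# One falling object under gravity, stepped with semi-implicit Euler
# integration once per frame at 60 FPS.
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # duration of one 60 FPS frame

def step(pos, vel):
    """Advance one object's position and velocity by one frame."""
    vel += GRAVITY * DT
    pos += vel * DT
    return pos, vel

pos, vel = 10.0, 0.0      # start 10 m up, at rest
for _ in range(60):       # simulate one second of game time
    pos, vel = step(pos, vel)

print(f"after 1s: height {pos:.2f} m, velocity {vel:.2f} m/s")
```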
To me there's not much difference between physics on and physics off; you can go to YouTube, search for NVIDIA PhysX tests, and see for yourself.
Maybe I'm more of an ATI fan, so I don't really like NVIDIA. NVIDIA has its own technology, CUDA, and ATI has its own ATI Stream technology.
I'd say wait until your graphics card really dies, then decide whether you want a new graphics card or a new computer. Maybe you have too many background programs running, and that's why your system is slowing down?
If your GPU is fast but your CPU is too slow, your computer is slow; likewise if your CPU is fast but your GPU is slow. What you want is a balanced pair that works nicely together.
Originally posted by Raraken: Haha, no need to worry about the motherboard. I'm using the PowerColor HD4850 PCS (about 20°C cooler under load than ATI's reference design) on an NVIDIA 630i motherboard, no problems whatsoever. My sister is using my old GeForce 9500 on an ATI motherboard, so you don't really need to worry about chipset incompatibility. As long as the slot (PCI-E) is the same, why not?
PhysX is more for eye candy than real-world applications. Besides, enabling PhysX will reduce framerates. Most people don't use PhysX anyway.
ATI also includes Folding@Home in their drivers, so when you aren't playing games, it can use your graphics card to help process scientific data. GPU acceleration is much faster because new GPUs can crunch this kind of data far faster than a CPU can. NVIDIA also has F@H based on CUDA, IIRC.
A Q6600 (a quad-core) has roughly 38.7 GFLOPS of processing power, while an HD4850 has close to 1 TeraFLOP, or 1000 GFLOPS. A GTS 250 from NVIDIA has roughly 738 GFLOPS. Of course, these are all theoretical peaks.
In the future almost all 3D games will require PhysX, because it looks so fucking nice. If your card has PhysX, there's no reason why it should reduce the framerate too much.
Originally posted by Sgforum King: In the future almost all 3D games will require PhysX, because it looks so fucking nice. If your card has PhysX, there's no reason why it should reduce the framerate too much.
PhysX is more of a cosmetic add-on, like cloth physics and realistic water physics. IMO, excellent physics has already been achieved with the Havok engine, used in Source games and others, and Source runs pretty well on a 1.7GHz Pentium 4. Tests have already shown that PhysX reduces framerates, and most of the time it grants no additional benefit except for special maps and missions (UT3 PhysX extensions) or realism (cloth physics in Mirror's Edge). The card sets aside a few stream processors, or CUDA cores, or whatever NVIDIA calls them, and that can make, say, a GTX 275 with PhysX enabled perform like a GTX 260 Core 216 without PhysX enabled. The difference may be negligible, or at high resolutions, drastic. By drastic, I mean an 8+ FPS loss.
Oh, and try to watch your language. Thanks!
Originally posted by Raraken: PhysX is more of a cosmetic add-on, like cloth physics and realistic water physics. IMO, excellent physics has already been achieved with the Havok engine, used in Source games and others, and Source runs pretty well on a 1.7GHz Pentium 4. Tests have already shown that PhysX reduces framerates, and most of the time it grants no additional benefit except for special maps and missions (UT3 PhysX extensions) or realism (cloth physics in Mirror's Edge). The card sets aside a few stream processors, or CUDA cores, or whatever NVIDIA calls them, and that can make, say, a GTX 275 with PhysX enabled perform like a GTX 260 Core 216 without PhysX enabled. The difference may be negligible, or at high resolutions, drastic. By drastic, I mean an 8+ FPS loss.
Oh, and try to watch your language. Thanks!
Batman: Arkham Asylum looks nicer with PhysX.
Originally posted by Sgforum King: Batman: Arkham Asylum looks nicer with PhysX.
Yep, it looks nicer, but apart from that, PhysX doesn't really do much. It was intended to enhance the gaming experience, nothing more. By 'enhance' they obviously meant, "Let's make it look flashy!"
Originally posted by Raraken: Yep, it looks nicer, but apart from that, PhysX doesn't really do much. It was intended to enhance the gaming experience, nothing more. By 'enhance' they obviously meant, "Let's make it look flashy!"
So you don't agree that graphics are a big part of games nowadays?
Originally posted by Sgforum King: So you don't agree that graphics are a big part of games nowadays?
Graphics are a big part, but not the biggest. What use are stunning visuals with crappy gameplay? Take Crysis, for instance: the gameplay is repetitive and similar throughout, so it has become a benchmark instead, thanks to its stunning visuals. Then take Team Fortress 2: its graphics are nowhere near as nice as Crysis's, but its gameplay is excellent, and it doesn't need a bleeding-edge computer to play at decent settings. It even runs on a 1.7GHz Pentium 4, which Crysis obviously can't.
I'd say PhysX can even interfere with gameplay. Mirror's Edge with PhysX enabled gets you cloth banners and canvas coverings that block you from seeing your enemy, while the enemy acts as if the cloth isn't there. It gets irritating when you get shot out of nowhere from behind a cloth, so PhysX isn't for everyone. Besides, the price-to-performance ratio favours ATI, not NVIDIA. My 4850 is my first ATI card ever. I wouldn't have gone for ATI had NVIDIA offered competitive alternatives, but alas, that wasn't the case.
Originally posted by Sgforum King: NVIDIA has PhysX, why would you go get ATI?
How many people actually use that thing?
Be realistic.
Then again, pitting an ATI HD4870 against an NVIDIA GTX 260 or thereabouts, the NVIDIA card won on framerate but really suffered on detail.
The ATI card could actually reproduce the rough texturing on a large rock, while the NVIDIA card produced a bleached-looking rock.
Originally posted by Sgforum King: So you don't agree that graphics are a big part of games nowadays?
It definitely is.
Then again, ATI still wins.
Refer to my earlier point about the HD4870 reproducing minute details that the NVIDIA card could not.
Looks like most of you support ATI for gaming, so I should buy an HD5770. What is the difference between the HD 5000 and HD 4000 series?
Originally posted by Battlecrusier: Looks like most of you support ATI for gaming, so I should buy an HD5770. What is the difference between the HD 5000 and HD 4000 series?
The HD5000 series supports DX11, the new standard in computer graphics, as well as OpenGL 3.2 (exact version I forget).
The HD4000 series only supports up to DX10.1. The 5770 performs about the same as a 4890, so go for it. BTW, what is your CPU? Monitor resolution? A 5770 might be overkill for some systems.
My CPU is a Core 2 Duo, or Duo Core 2, I forget already; my monitor is 19 inch.
Originally posted by Battlecrusier: My CPU is a Core 2 Duo, or Duo Core 2, I forget already; my monitor is 19 inch.
Hmm, I'd hazard a guess it's a 65nm CPU, if you bought the CPU and motherboard at the same time. Either way, those maxed out at 3GHz, and that was the top-of-the-line model. Care to elaborate? Speed, model?
Ways to check:
Vista/7: Start Menu > right-click Computer > Properties, then look at "Processor" under "System"
XP/Vista/7: Start Menu > Run > dxdiag, then look under CPU
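If Python happens to be installed, the same information can also be read from the standard library (a small sketch, not a replacement for dxdiag):

```python
# Prints the CPU name and logical core count, similar to what dxdiag shows.
# Uses only the standard library; works on Windows, Linux and macOS.
import os
import platform

# platform.processor() can be empty on some systems, so fall back to machine()
print("Processor:", platform.processor() or platform.machine())
print("Logical CPUs:", os.cpu_count())
```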
CPU speed is important, since a slow CPU will bottleneck the 5770. A 19-inch monitor usually means 1440x900, and a 4850/4870/4890/5770 is more than powerful enough for that. Watch out, though: the 40nm HD5000 series cards are running into availability issues.
Intel(R) Core(TM)2 CPU 6300 @ 1.86GHz (2 CPUs) is my CPU
Hmm, that CPU might bottleneck the HD4800 or HD5700 series... You might need a more powerful CPU to take full advantage of those cards.