If it were just a matter of picking the fastest GPU, things would be simple. Alas, the complications are endless. One of the biggest is AMD’s seeming focus on Microsoft’s DirectX (specifically Direct3D) at the expense of OpenGL support. Since Linden Lab’s viewers depend on OpenGL for their 3D rendering, that is a problem for SL users. nVidia does a better job of supporting OpenGL.
The nVidia 400 series cards have some problems, mostly manufacturing problems. The cards were six months late, and band-aids were used to get them to market: parts of the GPU were literally disabled to get the chips working and out the door. The new 580 is the first card in the family to have all the hardware problems resolved. The 400 series cards do work; they are just less than they were hoped to be.
Linden Lab has lagged in updating its render pipeline. OpenGL outpaced it to the point that SL’s render pipeline became incompatible and had to be upgraded. While it was upgraded for mesh, little effort seems to have gone into modernizing the pipeline and extending graphics card support. By August 2011, crashes and failures to launch had become a significant problem for those using new graphics cards. Since then the Lab has been upgrading graphics compatibility and things are working better. Most of the fixes are coming out of QA now (Nov 2011) and being rolled out to the production viewers.
The Brand Winner
Amidst all the confusion, the winner, unclear as it is, seems to be nVidia. That is not gospel from on high, so take it with some salt. But I think it is the general consensus among SL users.
The Model Needed?
nVidia model numbers actually mean something. But figuring out what they mean is complex. They seem designed more to get you to spend money than to explain anything. So, one must tread carefully.
For some perspective, consider the GTX 590. It is designed for the extreme graphics high end, extreme as in running three HD 3D monitors in nVidia 3D Vision Surround mode (meaning you are wearing the 3D LCD glasses). That is not something typically done with Second Life. Using a 590 to run Second Life is massive overkill.
So, which video card is best for Second Life? The nVidia GTX 275 and GTX 295 are considered minimums on the SL System Requirements page. My 8800, and from what I hear the 9800 cards, perform very well. Depending on the other components in your system, the 8800/9800 cards give 15 to 60 FPS. Asking them to do shadows or high anti-aliasing pushes them and can drop FPS to 4 or 5. Generally the 8800/9800 is good for the High graphics setting in SL.
Tom’s Hardware provides a list with cards of similar capabilities grouped together: Video Card Hierarchy (published Aug 2011). In that chart you can see that the GTX 470 and GTX 560 have about the same capability, with the 560 having a slight edge.
As a rough guide, the 260, 360, 460, and 560 use the same core processor architecture, with the 560 using the latest version of it. The 470 and 570 use another processor family with a slightly different architecture, the 570 being the newer version. The 580 and 590 use the exact same processor from yet another architecture family; the 590 just uses two of them. Like I said, a rough idea. The pattern is a clue, but there are no hints I can give you for when a model number deviates from the pattern.
There are some nice benchmark comparisons at VideoCardBenchmark.net. The comparisons are the average of many benchmark runs on a large number of different computers. For my GTX 560 Ti there are 3,205 tests. Results averaged across so many different machines are probably the best representation of a video card’s actual performance.
KirstenLee was the developer of the Kirsten Viewer, at least up until about September 2011. Kirsten rated the various video cards for performance with SL. See Kirstens GPU Awards 2011!
Stop by and donate to the KirstenLee Viewer Fund.
As you read through the various comparisons you may notice some conflicting ratings. That is normal. Performance very much depends on all the components in the computer and the settings. So, different testers get different results.
Second Life Video Features
Second Life is not optimized for high-end graphics cards. If SL is the only game you play, a lower-end card will work. I suggest spending about US$50 to $150.
If you play other games and think you may want to make machinima then you need to at least get a card in the $100 to $200 range.
You want to pay some attention to the level of OpenGL support a card provides. The video drivers from nVidia supply the OpenGL version, and nVidia is good about bringing older cards forward to new versions. When you update nVidia drivers, you update OpenGL.
The SL Viewer will use whatever OpenGL version you have installed. How well it works with a given OpenGL version depends on your version of the viewer. Another SL limitation is that many rendering features in the video card are overridden by the SL Viewer. So, performance is very dependent on your hardware, your drivers, and Linden Lab’s viewer.
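If you want to check which OpenGL version your driver reports, the string returned by OpenGL’s glGetString(GL_VERSION) call always begins with "major.minor", usually followed by driver-specific text. Here is a minimal sketch of parsing that string and comparing it to a minimum version; the sample driver strings are made-up examples of the typical format, not output from any specific driver.

```python
import re

def parse_gl_version(version_string):
    """Extract (major, minor) from an OpenGL version string.

    The string from glGetString(GL_VERSION) starts with "major.minor"
    and may be followed by vendor text, e.g. "4.2.0 NVIDIA 285.62".
    """
    match = re.match(r"(\d+)\.(\d+)", version_string)
    if not match:
        raise ValueError("unrecognized version string: %r" % version_string)
    return (int(match.group(1)), int(match.group(2)))

def meets_minimum(version_string, minimum=(2, 1)):
    """True if the reported OpenGL version is at least `minimum`.

    The (2, 1) default is an arbitrary cutoff for this sketch, not a
    number from Linden Lab.
    """
    return parse_gl_version(version_string) >= minimum

# Hypothetical driver strings in the standard format:
print(parse_gl_version("4.2.0 NVIDIA 285.62"))  # (4, 2)
print(meets_minimum("4.2.0 NVIDIA 285.62"))     # True
print(meets_minimum("1.4 Mesa 7.11"))           # False
```

In a real program the string would come from a live OpenGL context; tools like GPU-Z or the viewer’s Help > About panel show the same version information without any coding.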
Supported and Unsupported
There actually is a list of video cards that are and are not supported by Linden Lab. See: BitBucket – GPU table improvements VWR-25931 – Oct 2011.
JIRA item VWR-7964 – Add SLI and Crossfire support, was opened in June 2008, triaged in June 2009, and last updated in October 2011. The Lab is aware of the problem.
The comments show those using SLI or Crossfire are NOT seeing any performance improvement. So, it appears that SL Viewer 3 is not benefiting from SLI or Crossfire.
New video cards plug into motherboards via a PCI-e expansion slot. PCI-e slots first appeared in 2003, so motherboards purchased before 2003 won’t have PCI-e. There is no point plugging a new video card into an old PCI or AGP slot; even if the card works, it will be hamstrung. PCI-e 1.0 was fast in its day, but it is slow by today’s standards. New cards need at least PCI-e (e = express) 2.0. PCI-e 2.0 came on the scene in late 2007. PCI-e 3.0 started appearing in motherboards this year.
A change from PCI-e 2.0 to PCI-e 3.0 can make a big performance difference; 3.0 moves data twice as fast as 2.0. But one has to replace the motherboard to go from 2 to 3.
Motherboards may also have an option to set the expansion slots to run as PCI or PCI-e. GPU-Z will tell you your motherboard’s capability and in what mode the slot is running. The video card’s interface is shown in an adjacent field, so it is easy to see if there is a mismatch. If the slot runs in a legacy PCI mode, it will choke the data flow to the video card.
Correcting the setting usually requires a BIOS change. Google for help on changing BIOS settings, or look in the motherboard’s manual.
When to Upgrade Video Cards
When considering technical information it is easy to feel overwhelmed. But there is a simple way to know whether a video card upgrade will make a large or small difference. Using the GPU-Z tool you can see the load on the graphics processor. If the GPU is carrying a 100% load, a video card upgrade will make a significant difference. If the GPU is carrying only a 10% load, an upgrade is probably not going to make much of a difference.
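That rule of thumb is simple enough to write down as a sketch. The 90% threshold below is an arbitrary cutoff I chose for illustration; the text only contrasts "pegged at 100%" with "idling at 10%":

```python
def upgrade_likely_to_help(gpu_load_percent, threshold=90):
    """Rule of thumb: a GPU running near 100% load is the bottleneck,
    so a faster card should raise frame rates. A lightly loaded GPU
    means the bottleneck is elsewhere (CPU, drivers, or the viewer),
    and a new card would mostly sit idle too.

    `threshold` is an assumed cutoff, not a measured value.
    """
    return gpu_load_percent >= threshold

print(upgrade_likely_to_help(100))  # True  - GPU-bound, upgrade helps
print(upgrade_likely_to_help(10))   # False - bottleneck is elsewhere
```

Read the load off GPU-Z while actually running Second Life at your usual settings; an idle desktop will always show a low number.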
Dual Core Problems
I’ve looked through a number of reviews of various systems running Second Life. If you run a dual Core 2, you will likely have the worst possible performance. A single or quad Core 2 both seem to give better performance. When one moves to the newer Core i3, i5, and i7, performance improves. Games like Skyrim do take advantage of i7 multi-cores and high-end graphics cards. In the Core 2 there just isn’t much of which to take advantage.
A dual Core 2 and my new GTX 560 drag along at 15 to 35 FPS using the current Development Viewer 3.2.4 (245519). My next upgrade will be to a quad core, then eventually a new motherboard and CPU.
There are benchmarks for the various CPUs at CPUBenchmark.net. The Core2 Duo 6600 @ 2.4 GHz I use comes in at 1,508. The Core2 Quad 6600 @ 2.4 GHz comes in at 2,983, almost twice as fast. An i3-370M @ 2.4 GHz comes in at 2,219, and it too is a 2-core processor. The 2-core i3-530 @ 2.9 GHz comes in at 2,725. So, I would need to move up to at least an i5 to beat a Core2 Quad. Check your CPU to see what you need to move up to before you will see an improvement. New doesn’t necessarily mean better.
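The comparison is easier to see when the scores above are expressed as multiples of the CPU you already own. A quick sketch, using the four scores quoted in this section (CPU labels follow the names used in the text):

```python
# PassMark-style scores quoted above (higher is faster):
scores = {
    "Core2 Duo 6600 @ 2.4 GHz":  1508,
    "Core2 Quad 6600 @ 2.4 GHz": 2983,
    "Core i3-370M @ 2.4 GHz":    2219,
    "Core i3-530 @ 2.9 GHz":     2725,
}

# Express each score relative to the CPU currently installed.
current = scores["Core2 Duo 6600 @ 2.4 GHz"]
for cpu, score in scores.items():
    print(f"{cpu}: {score / current:.2f}x the Duo's score")

# The Quad works out to 1.98x the Duo, the i3-370M to 1.47x,
# and the i3-530 to 1.81x - both i3s fall short of the Quad.
```

This is why the quad Core 2 is the sensible next step here: the newer i3s score higher than the old Duo but still lower than the old Quad.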
Look up the benchmarks on your CPU and find out which CPUs can be put in your motherboard. The benchmarks will show whether a replacement will outperform your existing CPU. If your motherboard can handle newer CPUs, you can save a bundle; some older motherboards will handle CPUs considerably newer than what they shipped with.
The Duo I have draws about 35 watts and the Quad about 45 watts, so changing CPUs should not overload the power supply.