Second Life Deferred Render

There is a problem with Deferred Render (L&S = Lighting & Shadows) crashing some viewers. The Firestorm Viewer team thinks most people cannot run L&S. That may say more about who is using Firestorm and Phoenix than about SL users in general. But, as far as I know, there are no good stats to answer the question of how many people can run L&S, at least not for SL users specifically.

As we move forward, this will become a more important issue. It certainly makes a difference in how Second Life™ appears on your screen.

What It Is

Deferred Rendering is a high-fidelity way of rendering lighting and, on some systems, shadows. There is no hard limit on the number of lights that can be rendered, which is a big step up from the 6 or 8 lights we could previously render in SL.

Lighting is done per pixel, which gives better speed and several other advantages. Some of those advantages will only kick in when we get the coming Materials System. There are also some issues with anti-aliasing and transparent objects.

The implementation in SL is customized for SL. So, if you read about deferred render on various sites, it may not sound exactly like what we see or are familiar with.
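
To show the basic idea in general terms, here is a small, purely illustrative Python sketch; it is not Linden Lab's code, and the shading math is deliberately simplified. The key structure is that geometry is resolved to one surface per pixel first (the "G-buffer"), and only then are lights accumulated per pixel, so nothing in the design imposes a small fixed light count the way the old forward path did (classic fixed-function OpenGL lighting capped out at 8 lights).

```python
# Conceptual sketch of deferred lighting, for illustration only -- this is
# not Linden Lab's renderer, and the shading math is deliberately simple.
# Pass 1 (not shown) rasterizes the scene once and keeps the nearest surface
# per pixel in a "G-buffer"; Pass 2 below accumulates every light per pixel,
# so the cost scales with pixels x lights instead of needing a small fixed
# light count baked into every object's shading pass.

import math
from dataclasses import dataclass

Vec3 = tuple  # (x, y, z), kept simple for the sketch

@dataclass
class GBufferPixel:
    position: Vec3   # world-space position of the visible surface
    normal: Vec3     # unit surface normal
    albedo: Vec3     # base color, 0..1 per channel

@dataclass
class PointLight:
    position: Vec3
    color: Vec3
    radius: float

def shade(px: GBufferPixel, light: PointLight) -> Vec3:
    """Lambert term with a simple linear distance falloff."""
    lx, ly, lz = (light.position[i] - px.position[i] for i in range(3))
    dist = math.sqrt(lx * lx + ly * ly + lz * lz) or 1e-6
    ndotl = max(0.0, (px.normal[0] * lx + px.normal[1] * ly + px.normal[2] * lz) / dist)
    falloff = max(0.0, 1.0 - dist / light.radius)
    return tuple(px.albedo[i] * light.color[i] * ndotl * falloff for i in range(3))

def lighting_pass(gbuffer, lights):
    """Second pass: for each surface pixel, add up the contribution of every light."""
    image = []
    for row in gbuffer:
        out = []
        for px in row:
            if px is None:                       # background pixel, nothing to light
                out.append((0.0, 0.0, 0.0))
                continue
            color = (0.0, 0.0, 0.0)
            for light in lights:                 # any number of lights
                c = shade(px, light)
                color = tuple(color[i] + c[i] for i in range(3))
            out.append(color)
        image.append(out)
    return image

if __name__ == "__main__":
    # A one-pixel "G-buffer" and two lights, just to show the call shape.
    gbuffer = [[GBufferPixel((0, 0, 0), (0, 0, 1), (0.8, 0.8, 0.8))]]
    lights = [PointLight((0, 0, 2), (1.0, 0.9, 0.8), 10.0),
              PointLight((2, 0, 1), (0.2, 0.3, 1.0), 5.0)]
    print(lighting_pass(gbuffer, lights))
```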

Requirements

According to Unity3D's documentation, deferred rendering requires a graphics card with Shader Model 3.0 (or later), support for depth render textures, and two-sided stencil buffers. Most graphics cards made after 2004 support deferred lighting, including GeForce FX and later, Radeon X1300 and later, and Intel 965 / GMA X3100 and later. However, it is not currently available on mobile platforms or in Flash.

So, it would seem this is another factor that is keeping Second Life off mobile devices.
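
If you are curious what your own card reports, one rough, unofficial check is to ask the OpenGL driver directly. The sketch below assumes Python with the third-party glfw and PyOpenGL packages installed; the extension names are approximate OpenGL-side counterparts of the Direct3D-style requirements above (two-sided stencil is core in OpenGL 2.0 and later), not anything the SL viewer itself tests.

```python
# Rough, unofficial capability check, assuming Python plus the third-party
# 'glfw' and 'PyOpenGL' packages. It only prints what the driver exposes;
# it is not a test the SL viewer itself performs.
import glfw
from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION, GL_EXTENSIONS

def main():
    if not glfw.init():
        raise SystemExit("GLFW failed to initialize")
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # context only, no visible window
    window = glfw.create_window(64, 64, "caps", None, None)
    if not window:
        glfw.terminate()
        raise SystemExit("Could not create an OpenGL context")
    glfw.make_context_current(window)

    print("Renderer:", glGetString(GL_RENDERER).decode())
    print("OpenGL:  ", glGetString(GL_VERSION).decode())

    # glGetString(GL_EXTENSIONS) is fine on the legacy/compatibility context
    # GLFW creates by default; core-profile contexts need glGetStringi instead.
    extensions = set(glGetString(GL_EXTENSIONS).decode().split())
    # Approximate OpenGL-side counterparts of the requirements listed above.
    for ext in ("GL_ARB_depth_texture",        # depth render textures
                "GL_EXT_stencil_two_side",     # two-sided stencil (core in GL 2.0+)
                "GL_ARB_framebuffer_object"):  # render-to-texture support
        print(ext, "yes" if ext in extensions else "no")

    glfw.terminate()

if __name__ == "__main__":
    main()
```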

Changes in Viewer Settings

Recently the default settings for video cards changed. The video card tables were updated, and more settings between LOW and ULTRA were added. These changes are directly related to the Lab’s effort to classify cards by actual measured performance and to base default feature settings on the frame rates a specific card can actually deliver.

Because of rendering system changes, the Lab needed to reclassify video cards. In doing so, they decided more granularity was needed in the settings. The aim is to give the most people the best experience and performance, based on actual measurements.
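
As a rough sketch of that approach (the thresholds and intermediate tier names below are invented for the example; the viewer's real GPU and feature tables are more involved), classifying by measured performance boils down to something like this:

```python
# Illustration only: the FPS thresholds and intermediate tier names are
# invented for this example, and the viewer's real GPU/feature tables are
# more involved. The point is the approach: measure the card, then pick a
# default preset from the measurement.

QUALITY_TIERS = [        # (minimum average FPS, default graphics preset)
    (55, "Ultra"),
    (40, "High"),
    (28, "Mid"),
    (18, "Low-Mid"),
    (0,  "Low"),
]

def default_preset(measured_fps: float) -> str:
    """Return the highest preset whose FPS floor the measured card clears."""
    for floor, preset in QUALITY_TIERS:
        if measured_fps >= floor:
            return preset
    return "Low"

if __name__ == "__main__":
    # Hypothetical benchmark results for two cards.
    for card, fps in [("Mid-range card", 41.0), ("Old integrated GPU", 12.5)]:
        print(f"{card}: ~{fps} fps -> default preset {default_preset(fps)}")
```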

However, as people upgrade to new viewers, they find that they cannot run the viewer at their previous settings. The graphics settings need to change. Not knowing this, people’s reaction to the problems is to think the viewer is broken and switch back to the previous version.

I suspect this is why the SL Viewer will, on occasion, force an update of the viewer’s graphics settings. Third-Party Viewers (TPVs) are less likely to force settings resets, and some use different default settings than the SL Viewer does. I suspect this contributes to why some people think one viewer works on their computer and another doesn’t.

Solution

If your updated viewer starts crashing on login, change your graphics settings to LOW and then log in. Once logged in, you can start increasing the quality until you run into problems.

9 thoughts on “Second Life Deferred Render”

  1. I’ve always had better luck with shadows if I turn off antialiasing and anisotropic filtering in the viewer and make those settings in the software for my NVidia card itself. Since I did that a year or so ago, I no longer crash when I turn on shadows.

  2. On my Mac I turn on Lighting and Shadows but then set ‘shadows’ to none so I can still enjoy the wonderful lighting and projectors.

  3. I find myself running with deferred rendering all the time these days. Once you get used to real-time shadows, ambient occlusion, and much more realistic lighting, it’s very hard to go back without feeling a bit handicapped.

    It does require a bit from your hardware setup, but if you play “modern games” on your PC, you should be quite used to that. I’ve NOT found it to be a problem for stability at all; in fact, I crash very little these days, and when I do it’s a sim-crossing failure causing it.

    I do think it’s a problem, though, that we have “two sorts of users” based on hardware, and especially after materials come into play soon, there will be a big difference in what each type of user experiences behind their screen. That is not only problematic for creators, but really for SL as a whole. I’d love to see deferred rendering get an overhaul; I’m sure it could be optimized and fixed in some places to give better support for low-end systems, and I think SL would benefit greatly from it becoming more of a “standard” for all users.

    • I don’t code render pipelines, so I can’t say for sure. But, since deferred rendering first came out in 1988, appeared as we know it using an image buffer in 1990, and then moved into graphics cards in 2004, I think about all the optimization that can be done has been done. So any computer with mid-range graphics should be able to handle deferred render.

      Since video cards with good deferred-render-capable engines are available on eBay for less than $25, I can’t see a reason for there to be two classes of SL user: deferred capable and incapable. Nor do I see a reason to try to hold SL development back for those who choose not to upgrade their computers.

      If this were a matter of having to upgrade to a top end gaming machine, I could see two classes being created. But, that is not the case.

      I’ve asked the Lab to give us stats on how many SL users are running with deferred enabled. I think that will give us an idea of how to build going forward. In a little less than a year, most users (in the very high 90-percent range) had started to run SL with mesh-capable viewers. I think most of those people will also be deferred capable.

      The last I heard, old viewers were used by less than 3% of SL users. I can’t see making 97+% unhappy to keep 3% happy.

      In a creative environment, forcing a standard on all users is counterproductive. I think trying to support a low-end ‘standard’ would do more harm to SL than good. The cutting-edge, highly motivated creative types would move on to something else, and SL would stagnate.

  4. I run the viewer with AA, anisotropic filtering, and shadows, and it runs pretty well. I’ve never had “problems” or “bugs” from having those features enabled. The only bad thing is that, aside from FXAA, it doesn’t matter what kind of AA I set in the NVIDIA panel; SL always overrides it and sets its own ugly AA. At least that kind of AA is faster than the usual MSAA.

    Running the viewer with deferred rendering enabled but shadows off isn’t really that heavy for a PC. It mostly runs SL at the same speed as without deferred rendering. I just think that calling the option “Lighting and Shadows” is not a good name, since people think that using it will always give them shadows and make SL slow. Something like “Advanced Shaders” would be a better name, since we already have the basic shaders option. That could be helpful when materials hit the viewer. Anyway, the viewer can handle normal maps without using deferred rendering, so if LL wants, they could make it available for nearly everyone.

    • I agree the labeling is weak. Enhanced Lighting or Deferred Lighting, the latter being sort of the industry-standard label, would make more sense.

      I run AA mostly in the viewer. I do have one install that runs with viewer AA off and the nVidia AA on full blast. It is interesting but not really useful. When I want large, sharp photos from the viewer, I have to use the viewer’s AA. Otherwise I get jaggies.

