At the Open Source User Group meeting, Oz Linden explained how the new avatar baking system will work and some of the ideas the Lab has for implementing it. I've added in a description of how things work now to provide contrast. I've covered this topic before, but there is new information here.
Oz Linden pointed out that a large part of the viewer-side avatar baking process is done in the local computer's video card. Since there are so many different cards, drivers, and driver versions, the results are inconsistent. By moving the process to the server side, the results will be consistent.
As it is now, users with the exact same input (think clothes) can get different results (look different in the same clothes). Once this moves to the server side, the same inputs will produce the same result.
As you edit appearance, your system will still use its video card to create your appearance. We see that now as we put clothes on and see them go from fuzzy to sharp. With the new system that will still happen.
As it is now, when you finish editing appearance the viewer sends off the baked texture. Actually, a time limit simply expires and whatever you have baked at that point is sent. The server then sends your baked texture to you and everyone else. This is when you see the avatar go fuzzy a second time, as it downloads and re-renders your baked appearance texture.
With the new system, when your appearance editing is complete the viewer will send a signal to the server, which will start the server-side bake process. The server will look in the server-side data for your inventory folder named Current Outfit (COF). Using that list of items, it will pull the items that make up your appearance from the asset servers. Using those assets, it will bake your appearance texture. That texture will be cached in the Bake System, and its UUID will be sent to everyone that can see you. You will get the UUID too.
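The steps above can be sketched in a few lines. This is only an illustration of the flow, not Linden Lab's code: `ASSET_STORE`, `BAKE_CACHE`, the asset IDs, and the byte-concatenation "compositing" are all hypothetical stand-ins, since the real asset-server and Bake System interfaces are not public.

```python
import hashlib
import uuid

# Hypothetical stand-ins for the asset servers and the Bake System cache.
ASSET_STORE = {
    "shirt-asset": b"shirt texture bytes",
    "pants-asset": b"pants texture bytes",
}
BAKE_CACHE = {}  # bake UUID -> baked texture bytes

def read_current_outfit_folder(avatar_id):
    """Return the asset IDs listed in the avatar's Current Outfit (COF)
    folder. Hard-coded here; the real service reads server-side inventory."""
    return ["shirt-asset", "pants-asset"]

def bake_avatar(avatar_id):
    """Sketch of the server-side bake: read the COF, pull the assets,
    composite them, cache the result in the Bake System, return a UUID."""
    assets = [ASSET_STORE[item] for item in read_current_outfit_folder(avatar_id)]
    baked = b"".join(assets)  # stand-in for the real texture compositing
    # Derive the UUID from the baked bytes so identical inputs always
    # produce the identical ID (the "same inputs, same result" property).
    bake_id = uuid.UUID(bytes=hashlib.sha1(baked).digest()[:16])
    BAKE_CACHE[bake_id] = baked
    return bake_id  # viewers fetch the texture from the Bake System by UUID

bake_id = bake_avatar("some-avatar")
assert bake_id in BAKE_CACHE
```

Deriving the UUID from the inputs is my own way of modeling the determinism the article describes; however the Lab actually assigns IDs, the point is that two avatars wearing identical items end up referencing one cached texture.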
Because the server is in the data center, it will be using a much faster and more reliable network. It also moves texture caching out of all the region servers and into a global server-side cache. One of the problems with the local region caches is the region servers figuring out when they can drop items; that is much less of a problem with a global cache.
Since your viewer is getting all the avatar textures from a single source, the Lindens think they can implement persistent connections to pipeline data to the viewer. That should provide faster avatar rendering.
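To see why a persistent connection helps, here is a toy model. The `TextureService` class and its handshake counter are invented for illustration; they just make visible the connection-setup cost that a persistent, pipelined connection pays once instead of once per texture.

```python
class TextureService:
    """Toy stand-in for a texture endpoint, counting connection setups
    so the benefit of connection reuse is visible."""
    def __init__(self):
        self.handshakes = 0

    def connect(self):
        self.handshakes += 1  # models the per-connection setup cost
        return self

    def fetch(self, texture_uuid):
        return f"texture:{texture_uuid}"

service = TextureService()

# Without persistence: a new connection for every texture.
for tex in ["a", "b", "c"]:
    service.connect().fetch(tex)
per_request_cost = service.handshakes  # 3 setups for 3 textures

# With a persistent connection: one setup, many fetches in a row.
service.handshakes = 0
conn = service.connect()
textures = [conn.fetch(tex) for tex in ["a", "b", "c"]]
assert service.handshakes == 1
```

With many avatars in view, each needing textures from the same Bake System, avoiding a fresh connection per texture is where the faster avatar rendering would come from.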
Your viewer will then download the newly baked texture and use it for your appearance. So, you will see a second fuzzy appearance as it downloads and updates, just as you do now. The difference is you may not see yourself EXACTLY as you did when changing appearance. Hopefully the differences will be unnoticeable, but those video-card-specific inconsistencies may show up.
The current work is to move the viewer-side bake processes into a library, a single body of code containing the functions used to bake your appearance. That same code (library) will be used in building the server-side bake process. A single library can then be updated to update both the servers and the viewers.
Because the baking will be done server side and CACHED SERVER SIDE in the Bake System, region servers will no longer have to deal with your appearance texture. As it is now, if I understand correctly, the regions cache the appearance texture. So, as you move region to region, the texture is another hunk of data that has to move with you. With the new system the regions avoid that task. All viewers will ask for the appearance texture directly from the Bake System.
The result is avatars should rez more quickly, your cache should be more effective as the same appearance textures will have the same UUIDs in all regions, the regions will be handing out less data, and the regions will have less to do… in general: many things are touched and changed for the better by this new system.
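The cache win comes from the UUID being the same everywhere. A minimal sketch of a viewer-side cache, with `get_texture`, the download counter, and the `fetch` callback all hypothetical names of my own, illustrates why a region crossing becomes a cache hit:

```python
import uuid

viewer_cache = {}  # local viewer cache keyed by texture UUID
downloads = 0

def get_texture(texture_uuid, fetch_from_bake_system):
    """Return a texture, downloading only on a cache miss. Because the
    same appearance bakes to the same UUID in every region, moving
    between regions hits the cache instead of re-downloading."""
    global downloads
    if texture_uuid not in viewer_cache:
        downloads += 1
        viewer_cache[texture_uuid] = fetch_from_bake_system(texture_uuid)
    return viewer_cache[texture_uuid]

bake_uuid = uuid.uuid4()  # stand-in for a baked-appearance texture ID
fetch = lambda u: f"baked texture {u}"

get_texture(bake_uuid, fetch)  # first region: downloads the texture
get_texture(bake_uuid, fetch)  # next region, same UUID: cache hit
assert downloads == 1
```

Under the current system, a texture re-baked or re-served per region can arrive under a different identity, defeating exactly this kind of lookup.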
Any viewer that does not adopt the new avatar bake process will see only grey avatars. It will take some time to get this all implemented, but once it is, only updated viewers will show textures on avatars. Even your own avatar will be grey, because your viewer will get your appearance texture from the servers.
Oz Linden said, “That’s why we’re going to publish the changes viewers need to make well before we start deploying the servers on main grid…. so they have plenty of time to be updated to be able to detect which sort of server they are talking to and do the right thing.”
Once the project is far enough along there will be test regions in ADITI and a Project Viewer.
We have no ETA. I’m still thinking like 2 or 3 months.
Current Pre-Baked Avatars
You may remember that Library Avatars are using pre-baked appearance textures. They don’t use the new Bake System. What the Lab did is pre-load the textures into the caches on all the servers. They did this to get data on the actual effective savings and reliability of the process.
Would be nice to see some comparison screenshots
Ideally, would you see any difference at all?
Except that you don't get IMs from others telling you that you are wearing 2 different kinds of boots, or that an alpha from a former outfit cut off your legs, etc.?
Ideally, we will all see the same thing and bake fail will be a thing of the past.
However, connection problems will still affect what we see.
Nice writeup! Thanks!