For those making animations this is an important issue. I haven't been sure which side of the debate to take, because I am pretty much an amateur when it comes to making animation. Medhue, of Medhue Animations, made a video that explains the problem, though I'm not sure it depicts the problem as well as it could.

I think because Medhue does so much animation he can see the problem in ways those of us less familiar with animating can't. So, if his video still doesn't explain the problem for you, read on.
What is possible is shown here:
There is a point at about 1:15 where translation is required to create the motion shown, the lips pooching out. I made a GIF to show the point I mean.
[GIF: the lips pooching out at about 1:15]
From the GIF you can see the lips move in a direction that bone rotation alone cannot replicate. So, to have good facial expressions we need translation too.
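To make the rotation-versus-translation point concrete, here is a minimal Python sketch. This is my own illustration, not anything from the SL viewer or any animation tool: a vertex rigidly attached to a bone that can only rotate about its pivot always stays the same distance from that pivot, so it can never move straight outward the way the lips do in the GIF. Moving it outward takes a translation.

```python
# A minimal sketch of why rotation alone cannot "pooch" the lips out.
# A vertex driven only by bone rotation stays on a sphere around the
# bone's pivot; pushing it outward requires translating the bone.
import math

def rotate_z(point, angle):
    """Rotate a 3D point about the Z axis through the origin (the pivot)."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

def translate(point, offset):
    """Shift a 3D point by an offset vector (what bone translation adds)."""
    return tuple(p + o for p, o in zip(point, offset))

def dist_from_pivot(point):
    return math.sqrt(sum(p * p for p in point))

lip_vertex = (0.0, 1.0, 0.0)  # a vertex 1 unit from the lip bone's pivot

# Rotation only: the vertex moves, but its distance from the pivot is fixed.
rotated = rotate_z(lip_vertex, math.radians(20))
print(dist_from_pivot(rotated))   # 1.0 -- still on the unit sphere

# Rotation + translation: the vertex can move away from the pivot.
pooched = translate(rotated, (0.0, 0.3, 0.0))
print(dist_from_pivot(pooched))   # about 1.29 -- the "pooch out" motion
```

No matter what rotation angle you pick, the first distance stays 1.0; only the translated version ever leaves the sphere. That is the whole argument for adding bone translation to Bento in one example.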
What the industry is moving toward:
https://www.youtube.com/watch?v=CvaGd4KqlvQ
You heard the speaker mention that the data load and processing power needed to handle Ira's facial animation is high. So, they built an engine to offload that work to the video card. This is 2013 technology. The Lab is concerned about how much of a load this will put on the viewer. Their thinking is that current hardware can handle it, and that the load won't have that big an impact on older hardware. But part of the beta test running now is to determine whether that thinking is accurate. For more information see Inara's article: The Drax Files Radio Hour 100: of gateways and Bento.
We are not likely to get what NVIDIA is showing in Second Life. But the ability to translate bones (change their position) looks to be important. So, click over to the JIRA and click WATCH.
We might have something more like the NVIDIA video in Project Sansar. But I don't recall hearing anything about facial bones or bone translation in relation to Sansar, or seeing any avatars from it.
There is going to be a Bento user group meeting.
http://wiki.secondlife.com/wiki/Bento_User_Group
Nice. Thanks.