This morning Daniel Voyager posted a link to this video, which runs about 33 minutes. Daniel also has an index to the video: VWBPE 2014: Philip Rosedale Keynotes – Live Updates. It is based on time of day rather than minute marks. But, you can get a sense of where in the video things are.
I didn’t hear much new information in this speech/video. Drax, or at least some Draxtor… and it sounds like a question the world-famous Drax we know would ask… asks whether High Fidelity (HF) will make Second Life™ obsolete. In answering, Philip sounds a bit like a politician, meaning I didn’t really hear an answer. There is no yes, no, maybe, or what would have been most accurate: I don’t know. He does point out that Linden Lab is an investor in HF. He expects HF technology to make it into Second Life.
But, the question is valid. Think about it. If HF can eliminate lag, then Second Life will have to eliminate lag too, or I believe it will die. Which would you rather use: a laggy world or a highly responsive one? We already know lag is a major objection new users of Second Life have.
To get its lag-reducing computing power, HF plans to use distributed computing, something like what SETI does. That seems to be a very basic paradigm change in how the system works. I think that will require a major rewrite of the code that runs Second Life.
Philip expects bits of technology, like facial expression tracking, may make it into SL. But, we start to run into problems, or at least I can imagine some. HF uses a Kinect to do facial recognition for expressions. Now put on an Oculus Rift and tell me how that works…
Philip is asked about LEAP Motion control. His answer covers the same problems I’ve found with my LEAP, which sits unused most of the time. The LEAP provides a small sensor space and you have to hold your arms up. Plus you have to develop and learn motion control, which I think for now is somewhat like having to develop and learn a new sign language. I see this last one as a major obstacle.
The problem with the LEAP is we have to teach it to turn our gestures into something computer programs understand. Most computer programs are designed to understand the mouse and keyboard. So, we teach the LEAP to convert our hand motions into equivalent mouse motions and keystrokes. That pretty much works as a second layer of translation. Things are likely to stay that way until software adapts to use hand motions directly.
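To make that second layer of translation concrete, here is a minimal sketch in Python of mapping a hand position above a motion sensor onto mouse coordinates. This is purely illustrative: the sensor ranges, millimeter units, and the `hand_to_mouse` function are my own assumptions, not the actual LEAP SDK.

```python
# Hypothetical sketch of gesture-to-mouse translation: map a hand position
# (in millimeters over the sensor) onto screen pixel coordinates.
# The sensor dimensions here are assumptions, not real LEAP specifications.

def hand_to_mouse(hand_x_mm, hand_y_mm,
                  screen_w=1920, screen_h=1080,
                  sensor_half_width_mm=150.0,
                  sensor_y_min_mm=100.0, sensor_y_max_mm=400.0):
    """Map a hand position above the sensor to a screen pixel position."""
    # Normalize x from [-half_width, +half_width] to [0, 1].
    nx = (hand_x_mm + sensor_half_width_mm) / (2 * sensor_half_width_mm)
    # Normalize y; a higher hand should point higher on screen, so invert.
    ny = 1.0 - (hand_y_mm - sensor_y_min_mm) / (sensor_y_max_mm - sensor_y_min_mm)
    # Clamp to the screen edges and convert to integer pixels.
    px = int(min(max(nx, 0.0), 1.0) * (screen_w - 1))
    py = int(min(max(ny, 0.0), 1.0) * (screen_h - 1))
    return px, py

# A hand centered over the sensor at mid-height lands near the screen center.
print(hand_to_mouse(0.0, 250.0))
```

A real driver would feed these coordinates to the operating system's cursor, and layer gesture recognition (a pinch becomes a click, and so on) on top; the point is that everything still funnels down to mouse events the software already understands.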
Philip thinks the mouse is the big obstacle. The LEAP is not that good. The things you hold in your hands so the computer can track hand and arm motion are not that good either. Jo Yardley has written about those. Even if you have gloves or rings, or a Kinect videoing you, you still have your hands and arms up, moving around.
The idea that having the computer interpret our hand movements, the way we move them in RL, will somehow make manipulating things in a virtual world more natural… I don’t think so… All the systems I have seen so far lack the visual and tactile feedback needed to let us interact as we do with the physical world. In the physical world we look, figure out where we are visually, and then move to touch something. In VR, so far, much of that process happens in our mind. Getting the system to provide the feedback we need, and then getting the information from our mind back to the computer, is still far too primitive to feel natural. These problems will be solved. But, I doubt that will be this year.
So, will Second Life become obsolete? I think it is a coin toss. There are too many factors to consider. Some days I think yes. Other days I think no, it will grow. But, I never doubt there will be a creative space something like Second Life.