A June 1 announcement I just noticed says AVsitter is going open source on July 31, 2017. See the official announcement here: AVsitter to become open-source!
The video is for AVsitter™ (2013), which I’ll call AVsitter1. I couldn’t find an AVsitter1 video by AVsitter.
https://www.youtube.com/watch?v=tk_xLVYe9zQ
If you don’t know, AVsitter is a tool for building and controlling animations in things, furniture mostly. It handles everything from simple to complex animation control. There are videos and various other tutorials on how to use both AVsitter1 and 2. (AVsitter2 on YouTube) These days most are for AVsitter2™.
There will be one final update of AVsitter1, and some support will continue after that.
Today a new AvaStar release came out. You can get it from your products page. I’m not sure what, if anything, is new or changed; I assume there are bug fixes. If you read about the problems I had adding the Slink 2.0.30 rig to the then-current 2.0.46, you are aware of one problem.
Crystal Gardens
The new version is 2.0.48. It updates easily. I disabled and removed 2.0.26, installed and enabled 2.0.48 (2RC10) without error.
Of first interest to me is whether I can update the Slink model I want to use. Slink updated their May 2016 files to a May 2017 version. The 2017 version used AvaStar 2.0.30. The Slink people, the AvaStar people, and I had trouble updating. Slink was gracious enough to update their files to 2.0.46.
With this new AvaStar version I tried updating the Slink 2.0.46 version to 2.0.48. At first, it seemed to have updated without a problem. I did have to close and reopen Blender to get the Outliner to update. Then I realized all the Slink meshes failed to parent to the rig in the update. So, I’ll have to move the Slink model and my things manually.
The Slink people have come out with a new tool for those making Appliers for Slink bodies. Siddean Munro has made a video of the tool: Slink Texture Studio. The formal announcement from Slink is here: The Slink Texture Studio – for Creators!
A really interesting feature is the ability to use textures on your local computer without having to upload them. Instructions start at 04:50.
I know Firestorm has a feature allowing the use of local textures, those on your computer. I hadn’t realized the feature is also available in the Linden viewer.
The Studio has a land impact cost of 71 prims. That pushes me over my main grid parcel’s limit. So, for me, this is a tool for the ADITI grid or AGNI sandboxes. Of course, while it can be used in either grid, if you do use it on ADITI the UUIDs used and generated will not work in the AGNI grid.
Not much has changed since week 21… said Oz as he listed off these changes…
There is a new AssetHTTP version. It is doing reasonably well, and Oz Linden thinks it is likely to be promoted to the default viewer.
A new Voice RC is out, but it is too new for the Lab to know much about how it is doing.
Also, a new Maintenance viewer is out, and it too is too new to say much about, performance-wise.
.Psychosis.
Alex Ivy, the 64-bit version of the Linden project viewer, has been updated. An RC version of it is in QA. The RC is likely to release in week 22.
The Lindens are changing the server-side support for the viewer update process. This is the process that decides whether you should run a 32- or 64-bit version of the viewer. The update process is taking a fair amount of effort. A couple of unanticipated, easier-said-than-done issues are being resolved.
That is a ‘double take’ inspiring line. For a couple of days, people have been writing about M.I.T.’s project to make digital copies of people that can look and speak like the person. Of course, it is hype based on people’s curiosity.
M.I.T. has developed an amalgam of technologies that allows the creation of an avatar that looks and sounds like a specific person. (Example) But that is about it.
Second Life has been doing the ‘looks’ part for years. The voice part is new tech. Prior voice imitation was almost only about pitch and word speed. The new tech controls pronunciation, word choice, cadence, and the other factors that make a voice recognizable as a specific person. The idea is to have more AI interacting with people, a sort of replacement for customer service people.
In psychology, the idea of the soul is self-aware life. Figuring these things out gets complicated because we can’t know what another person experiences. There is the challenge of whether the color you perceive as red is the same perception I have.
All your life you have looked at a color people call red, so you too call it red. We can measure the frequency of the color we all agree is red. But the color-blind show us that we are not all perceiving the color the same way. For me, there is a huge difference between a bright green and a bright red. That isn’t true for the color-blind, to whom the colors look nearly the same.
We can’t know what color others actually perceive. So, how do we know when another person or creature is self-aware? Are animals self-aware? If so, do they have souls?
Who is stomping on our free speech rights? Who is working to silence public speech?
Kathy Griffin 2017
This is where you find out whether YOU are into democracy or fascism. Are you allowing free speech? What about hate speech? What about the speech of climate skeptics? Liberals, anti-Trump’ers, pro-Trump’ers, and Conservatives?