Is it just me or is the Second Life Mac Viewer the worst it’s been for a while?

The performance of the Second Life Mac viewer is the worst it has been in a long time. It's so bad it actually makes me not want to log in.

The performance really started to degrade beyond its usual level just before Fantasy Faire. I put it down to the traditional slowdown before major SL events, but it has persisted and perhaps got worse. Maybe I should be filing Jira reports, but I'm not really sure where to start, and I'm never sure whether I'm the only one having these troubles.

When I say poor performance I don't mean low frame rates. I'm a Mac user, so I'm used to existing and working in a low-FPS world. The performance loss starts with simply logging in: it's taking three times longer than usual just to get in. Once I arrive inworld I'm greeted with the usual 'where's the damn cache?' phenomenon. The non-existent cache has been with me ever since the Lab switched to using CDNs to quickly deliver nothing for three minutes. That's OK as long as I'm not crashing at an event where I need to get back into SL quickly. It's the next thing that's really causing me grief.

The UI has become a sticky, horrid experience of constant pauses. It used to be that when I clicked to open the preferences window I'd get a pause of about five seconds. Now that pause has gone up to 20 seconds and isn't localised to just preferences: it happens when you open an IM with someone, click to open inventory, and, most annoyingly, click to open the texture window when editing objects. It's driving me mad.

I should also mention that I still can't post snapshots to my SL feed since the last official viewer update.

The performance drop is across all the official viewers and release candidate viewers. Of course, it hasn't happened in the Firestorm viewer, either because Firestorm has yet to inherit this performance drop or because Firestorm is 64-bit. I don't know.

I don't want to use the Firestorm viewer even though it gives a smoother frame rate. I find it bloated with options and features, but I may have to if I'm going to build an exhibit for SL13B. Makes me wonder what we're going to be celebrating. Not improved stability; well, I won't be.

Is anyone else having the same performance drop? I'd like to know, in case it is just me and my stupid computer is to blame. That would be embarrassing.

[UPDATE] I made a video comparing the Firestorm viewer's performance with the current official viewer. See, I'm not going crazy, it really is poor!

Firestorm/Official SLviewer Comparison by lokieliot

OMG IT’S FINALLY HERE!!!!! XPtools

It has been four long years since Linden Lab first announced they were working on adding more tools for making fluid gaming experiences. I've blogged numerous times about how they could revolutionise Second Life. Linden Lab even asked me to guest blog on the SL forums once to discuss games.

Yesterday Linden Lab finally added XPtools (Experience Tools) to the main viewer, giving you, the user, access to the Experiences window where you can manage which experiences you are part of.

After all this time, are Experience Tools still revolutionary, especially with Sansar right around the corner (after just a year), which apparently plans to be about experiences from the start? I guess that will come down to what users do with XPtools now. I'm fully aware that, for someone who has dreamed for years of making interactive, immersive stories, Sansar could be a better place to invest time and effort, but we'll cross that bridge another day.

In the meantime I need to start putting very old plans into motion, dust off the old notebooks from 2010 and see what's now possible. Making experiences where things happen to you is a lot easier now than it was, though it's still quite hard. There are still a lot of limitations in SL, but XPtools takes a huge step towards replacing lines and lines of 'workaround' scripting.
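To give a rough idea of what that means in script terms, here's a minimal LSL sketch of the sort of thing an experience-enabled script can do: request permissions silently and act on an avatar without the old blue permission dialog. The landmark name and chat messages are made-up placeholders, and the script has to be compiled against your XPkey.

```lsl
// Minimal sketch of an experience-enabled teleporter.
// The script must be compiled against an Experience (XPkey).
// "Quest Start Room" is a placeholder landmark in the object's inventory.

default
{
    touch_start(integer num)
    {
        // Ask for experience permissions. If the avatar has already
        // accepted this experience, no dialog is shown at all.
        llRequestExperiencePermissions(llDetectedKey(0), "");
    }

    experience_permissions(key agent)
    {
        // Permission granted silently: whisk the avatar off to the quest.
        llTeleportAgent(agent, "Quest Start Room", ZERO_VECTOR, ZERO_VECTOR);
    }

    experience_permissions_denied(key agent, integer reason)
    {
        llRegionSayTo(agent, 0, "You need to join the experience to play.");
    }
}
```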

In truth, nobody should get excited about XPtools, except people like me :-p. The best experiences in SL that use XPtools will be the ones you don't even realise use them. Creators who have an XPkey will use it for more than one experience, and you will go from one to the other without thinking about it. That fact raises a question about how experiences are listed.

Each premium user gets a single XPkey, which is used to grant permissions for their experiences. That XPkey gets a profile displayed in the Experiences list, and you can only have one. If you make multiple experiences, as I have, you will find you cannot make a profile for each of them. So really, when you search for 'experiences' you are in fact searching XPkeys. It seems like a minor oversight to me, as I have two experiences (Escapades Bashables and the SL12B Dreams Dome) with three more planned, all using the same XPkey that's profiled as just one. Meh. Maybe I'll just rebrand my XPkey profile to 'LOKI EXPERIENCES' and have the profile landmark lead to an experience-select room :-P.

Anyway, thank you, Linden Lab, for finally releasing XPtools. Great things will come; just give me some time to make them happen. Escapades is about to get a whole lot more action-packed 🙂

For experience creators, there is a wiki, a forum and a Knowledge Base article devoted to XPtools.

My first time in SL with the Oculus [UPDATED]

So, thanks to Cinder Roxley and David Rowe, I finally got my first taste of Oculus VR in Second Life on a Mac using the CtrlAltStudio viewer.

And what a wonderful experience it is. I say this even though I threw up this morning from using it. CtrlAltStudio gives us the bare bones of Oculus integration and throws up all the problems and issues I expect Linden Lab is currently working out behind closed doors. I must point out that the CtrlAltStudio viewer has added its own Oculus compatibility and does not represent what Linden Lab may release in future.

Convincing the brain

Until I got to try this out for myself, my main concern had been the UI: interacting with SL's many options, inventory and communication. How do we do all that with an Oculus on our heads? Now, though, I feel that movement, our view and keeping the immersion in SL are the more important concerns with the Oculus.

Lag is still the enemy, even more so with the Oculus. A busy environment leads to lag, lag leads to latency issues, latency issues lead to the brain questioning the validity of the environment it thinks it's in, and the brain questioning the validity of the environment it thinks it's in leads to puking.

There are other things that can make the brain go WTF too, namely uncharacteristic movement, or movement the brain was not expecting.

The most popular questions the brain will ask before telling the stomach to throw up:

• Are we moving?

• Are we not moving?

• I can’t bend that way?

• How did you change the laws of physics?

CtrlAltStudio, I think, uses controls already in the SL viewer, and as such your head will sometimes suddenly rotate in a way that feels like whiplash, or like having your head fall off. The combination of moving your head and using the mouse for viewing direction also seems to cause an issue where, when you look down, the viewer can't decide which direction you are facing and spins you out.

Playing around with CtrlAltStudio shows that Linden Lab can't simply add Oculus support; they will need to create an entirely new viewing experience rather than just forcing people into mouselook. I expect Linden Lab to take as long as they need, and I don't expect anything out this year.

Thanks again to the CtrlAltStudio team for giving us a taste of the Oculus in Second Life, from which we can get a sense of what works and what doesn't.

IT'S VIDEO TIME!

OK, so I made two videos. The first is of my pride-and-joy builds in New Babbage. It's also a good test, in that New Babbage is notoriously laggy because of the massive builds (especially with my evil sculpt house across the canal). I had advanced lighting on but no shadows, and no materials because TPVs still don't have that feature. All in all the experience was good, and it was great being able to look around. I felt a bit woozy due to some latency from the slow FPS. The video itself is dark and a bit frazzled from compression, so filming Babbage may have been pointless.

The second video is from my sky pirates game, where I am the gunner of an airship. Wherever I'm looking is where the cannons fire. This turned out to work really well. The fact that you are sitting and not controlling the movement helped a lot: you concentrate on where you are aiming, and there were no latency issues because you are so high up with little else around. I did suffer from the looking-down spin issue once or twice, which made me feel a bit woozy. Other Oculus explorers can try the game for themselves HERE (just click the barrel to rez a balloon).

So what made me throw up?

It's my own fault, not the viewer's or the Oculus's. After the sky pirates game I was on the ground and accidentally clicked the wrong button, which detached the camera from my avatar. This made my view suddenly lean to the left, my brain just could not handle that, and it told my stomach to evacuate its contents. It may also have had something to do with the ginger beer I drank the night before.

I've played plenty of Oculus demos, some of which I can play for over an hour with no sickness issues. I believe it all comes down to the movement of your view, something the CtrlAltStudio viewer delightfully made me aware of 🙂

[UPDATE] One thing I forgot to mention was standing up versus sitting down. For me, at least, it does seem to make all the difference. A day after my first experience I ventured back into SL with the Oculus again, but this time I stood instead of sat and the whole experience was much better. After chatting with Rowe on Twitter I realised I had forgotten to enable 'view avatar in mouselook', which lets me see my feet while in the Oculus.

Another first that surprised me was meeting another avatar. So far I'd walked around New Babbage and hung out in my building studio without meeting anyone else, so turning around in my store and suddenly seeing someone was an 'OH WOW' moment: the sudden presence of another person standing in front of you. A shame, though, that I had to take the headset off to chat with them.

Pathfinding Post Mortem

I'm starting to see a pattern in how I get excited about new features in SL. The fun and excitement comes from the announcement, then downloading the development viewers to test on the beta grid, then blogging about how it will add so much to Second Life once they iron out the bugs.

By the time the features come out on the main grid or in the release viewer, no one gives much of a crap, because they have no effect on how most people use Second Life. Only the super content-creating geeks get giddy over the new tech.

So it was with Pathfinding, an amazing bit of tech for more dynamic movement of NPCs, AIs and bad guys. Before Pathfinding I had been using sensor scripting with 'move to target' commands, so my goblins would sense a user, target them and move to that target, then sense again and move to the new target.

Pathfinding allows the goblin to target a person and then track that person without any further sensing. It keeps the goblin rooted to the surface it is walking on, and it lets the goblin run away, intercept, and find pathways around objects to reach its target. There's a rough sketch of the difference below.
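For anyone curious what that looks like in script, here's a bare-bones LSL sketch of the Pathfinding approach, assuming a simple goblin object; the speed, range and timing numbers are just illustrative, not what I actually use.

```lsl
// Rough sketch of the Pathfinding approach: register the goblin as a
// "character", then hand the chasing over to llPursue. The old method
// needed a repeating sensor plus llMoveToTarget on every sweep; here the
// sensor only picks a victim and the region does the tracking.

default
{
    state_entry()
    {
        // Turn this object into a pathfinding character.
        llCreateCharacter([CHARACTER_DESIRED_SPEED, 3.0,
                           CHARACTER_TYPE, CHARACTER_TYPE_A]);
        // Look for an avatar within 20 m every 5 seconds.
        llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 5.0);
    }

    sensor(integer detected)
    {
        // Chase the nearest avatar; Pathfinding handles obstacles,
        // terrain and intercept routes from here on.
        llPursue(llDetectedKey(0), [PURSUIT_OFFSET, <1.0, 0.0, 0.0>]);
    }

    no_sensor()
    {
        // Nobody around: wander near home instead.
        llWanderWithin(llGetPos(), <10.0, 10.0, 2.0>, []);
    }
}
```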

Pathfinding is amazing, it really is, but now that it's out on the grid I can't find any use for it. Or should that be: I can find uses, but bugs are stopping me from using it.

My first test of Pathfinding was rats animated using a technique called alpha flipping, which allows for smooth animations by flipping through the faces of an object. The test rats on the beta grid worked really well, and I showcased them in a video.
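For anyone unfamiliar with the trick, here's a bare-bones LSL sketch of alpha flipping on a single prim, assuming each animation frame is textured onto a different face; the frame count and timer rate are placeholders.

```lsl
// Bare-bones alpha flipping: each "frame" of the animation lives on a
// different face of the prim, and a timer shows one face at a time by
// toggling transparency.

integer FRAMES  = 4;   // faces 0..3 hold the animation frames
integer current = 0;

default
{
    state_entry()
    {
        llSetAlpha(0.0, ALL_SIDES);   // hide everything
        llSetAlpha(1.0, 0);           // show the first frame
        llSetTimerEvent(0.2);         // roughly 5 frames per second
    }

    timer()
    {
        llSetAlpha(0.0, current);             // hide the old frame
        current = (current + 1) % FRAMES;
        llSetAlpha(1.0, current);             // show the next one
    }
}
```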

My first proper main-grid use of Pathfinding was to be the Cheese Fairy quest. I had the cheese fairy waddle about and wander freely, but I started to notice that his appearance was getting stuck, or he would disappear entirely. Further investigation led me to believe that Pathfinding was having some kind of side effect on the alpha-flipping technique. I filed a bug report on the Jira and a Linden investigated, but eventually I got snowed under with other projects.

Months later, a new project required new NPC monsters, so again I looked at Pathfinding. This time they were big crazy rabbits that pounced at you when they got close, and the animation looked awesome. But this time I discovered that when the animated rabbits had Pathfinding scripts put in them, they would fail to rez at all. More investigation revealed the objects weren't being rezzed by the viewer at all. In the meantime I decided to remove the animated aspect of the rabbits in order to keep the Pathfinding. I filed another bug report, a Linden came to investigate, and then I got sidetracked with other projects.

A month later, I'm now working on new NPC monsters for the latest upgrade to my weapons system. These little critters are also animated and, like the rabbits, fail to rez, so I'm probably not going to use Pathfinding this time. I want to: I find the movement and tracking really challenging for users, and it takes things a step closer to proper gaming. But I want my NPCs to look lively and engaging, which I can't get with Pathfinding any more.


It's rare to find Pathfinding being used anywhere; I think the best example I've seen was cockroaches scurrying around on the ground. I'm sure at some point I'll find a good use for dynamically moving static objects.

Maybe LL are working on fixing this issue; maybe it's related to the bigger problem of objects not rezzing, which is in turn probably related to Project Shining, in which case I'd expect it to be fixed eventually. All I know is that I can't do what I did when testing on the beta grid, and that ultimately SUCKS, because they did a lot of work on this feature and I wanted to move forward.

Of course, the whole thing would be a moot point if only LL implemented a way to animate a rigged object. 😛

Second Life Materials Feature is AWESOME!

Just how important is the new materials feature to Second Life content creators? You need to have deferred rendering enabled (that's Lighting and Shadows to regular users, or 'Advanced Lighting' as LL have renamed it), which for most people means killing their SL experience.

I'm on a top-spec iMac. I sold my grandma, parents and sister to get a computer that runs SL pretty well, and I sometimes forget that not everyone can switch on Advanced Lighting and shadows and keep up with me as we run across the grid.

So if I praise the awesomeness of the new materials feature, am I preaching only to the small number of users who can run with shadows?

As usual with Second Life, you have to look at things from another angle. Why is it that only top-spec computers can run Lighting and Shadows? I'd wager it has something to do with the amount of geometry the light has to bounce off and reflect from, and all the other technical gobbledygook the computer overheats doing.

 

What Materials Does

Materials in SL basically lets you make custom fake bumps and shiny surface effects for objects. It's a pretty standard technique used in many 3D programs: instead of rendering a really detailed model that uses up lots of computing power, you make a basic low-detail shape and cover it in a special texture that tells the computer how light should react to it, which gives an illusion of shape.

In Second Life these special textures are called normal maps, for bumpiness, and specular maps, for shininess.


The way I see it, if content creators start reducing their objects' geometric detail in favour of using material mapping to create the illusion of high detail, lower-spec computers should find it easier to run Lighting and Shadows (Advanced Lighting).

 

How to do it?

There is an experimental viewer, which does have some bugs, which is why it's called an alpha… DUH! But for the most part it works spiffingly.

So for my first proper test I created a VERY simple pillar in Blender. I wanted to see just how much extra geometry (detail) I could add to a smooth cylinder shape.


Next I created a texture of stone and carvings in Photoshop. This would serve as the basis for my normal and specular maps.


Unwrapped UV texture with stone and carvings added in Photoshop, ready to be imported into CrazyBump

I discovered a small app called CrazyBump. The Mac version seems to be in beta testing but works really well and is currently free to download; the PC version is $99. It has a simple user interface that lets you import a base image and then adjust how shiny and how bumpy your maps will be via simple sliders, while giving you a nice preview of how lighting will react to them.


CrazyBump is a gem of an app with a simple-to-use UI for creating just what I need for SL

 

CrazyBump then lets you save the normal or specular maps to your desktop as .BMPs with two clicks, ready to upload into SL.

Next I recorded a video where I add the normal and then the specular maps to the mesh pillar.
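As a side note, once the maps are uploaded they don't have to be applied by hand through the texture tab; a script can set them too. Here's a minimal LSL sketch under that assumption, where the three UUIDs are placeholders for your own diffuse texture and the two uploaded maps, and the glossiness value is just the default.

```lsl
// Minimal sketch: apply a diffuse texture plus normal and specular maps
// to every face of a prim by script. The three UUIDs are placeholders.

string DIFFUSE_TEX = "00000000-0000-0000-0000-000000000001";
string NORMAL_MAP  = "00000000-0000-0000-0000-000000000002";
string SPEC_MAP    = "00000000-0000-0000-0000-000000000003";

default
{
    state_entry()
    {
        llSetPrimitiveParams([
            PRIM_TEXTURE,  ALL_SIDES, DIFFUSE_TEX, <1,1,0>, ZERO_VECTOR, 0.0,
            PRIM_NORMAL,   ALL_SIDES, NORMAL_MAP,  <1,1,0>, ZERO_VECTOR, 0.0,
            // white tint, default glossiness (51 of 255), no environment
            PRIM_SPECULAR, ALL_SIDES, SPEC_MAP,    <1,1,0>, ZERO_VECTOR, 0.0,
                           <1.0, 1.0, 1.0>, 51, 0
        ]);
    }
}
```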

 

One thing I'm beginning to realise is the importance of lighting. It's the atmospheric lighting and how it reacts to the mapping that gives the 3D appearance, and changing that lighting can change how the object looks rather significantly. That means an object could look amazing when SL is set to midnight and awful when SL is set to midday. It's not a simple matter of adding the normal and specular maps; you have to consider the object's environment and the windlight settings of the region.

materialsLighting

1. No Advanced Lighting 2. Advanced Lighting Midday 3. Advanced Lighting Midnight

Materials adds an extra dimension to atmosphere. Whether it will widen the gap between those who can run deferred rendering and those who can't, or close it, we will have to wait and see.