I'm known for documenting VR events. I mostly film events that I or others organise and edit the footage into a narrative.
I used to do this with real-world events as a hobby, and found that documenting in VR owes a lot to real life. People assume that because I film inside a computer world it's staged or planned, when in fact it's all live, chaotic, and usually luck that I capture what I do.
This week I made a new machinima, but as you might have seen, it is different to what I have produced in the past. It was a staged, planned video to help show off my new mesh avatar and clothes in a way that I hoped had never been seen in Second Life before.
When you hear the word 'machinima' there is a tendency to think of a certain style, especially from Second Life. I wanted to try and blow that assumption up and make something that was created in Second Life but didn't look like it was created using Second Life.
So what do you think: did I achieve my goal?
The Making of ‘Achievement’
I started with a simple idea in my head: I wanted the main character to battle a monster in a Saturday-morning Pokemon-style cartoon. From there I sat down and wrote out a quick script, and after hearing the Drax Radio Hour I decided to add a plot point about winning achievements. Then I storyboarded the whole thing.
The storyboard was very important, as it helped me visualise how many scenes and animations would be needed, where scenes would be cut, which camera angles to use, and where scripts would be needed for visual FX.
The main character was already available, but I needed two others: the monster called 'GARGHH' and the annoying 'Zeb'. Using Blender I created simplistic characters, textured them in Photoshop, and rigged them with Avastar before importing them into Second Life. I used a technique I learnt back in 2011, known as 'cel shading', to produce black outlines around 3D models and mimic the look of 2D cartoons.
Looking at the storyboard, I worked out that I would need at least 20 poses or animations to express the story. I use Poser 7 to quickly create animations and poses for import into Second Life.
The actual filming took place in a 30m x 30m box, carefully textured so that it appeared to be a long path stretching off into the distance.
I was determined not to add any visual FX during post-production. I wanted everything you see to be done within Second Life, not touched up afterwards. So to produce FX such as the ball flash and the monster exploding, I scripted prims to change size and spin on hearing a chat command. I think this worked better than I expected.
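For anyone curious how a chat-triggered prim effect like that might be wired up, here is a minimal sketch in LSL (Second Life's scripting language). The channel number, the "flash" command word, the sizes, and the timing are all illustrative assumptions, not the actual script used for the video.

```lsl
// Hypothetical FX script: on a chat command, flash the prim
// by scaling it up while spinning, then shrink it back down.
integer CHANNEL = 42;   // assumed private chat channel for FX cues

default
{
    state_entry()
    {
        // Listen for commands from anyone on the chosen channel
        llListen(CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string message)
    {
        if (message == "flash")
        {
            llSetScale(<2.0, 2.0, 2.0>);                  // grow suddenly
            llTargetOmega(<0.0, 0.0, 1.0>, TWO_PI, 1.0);  // spin around Z
            llSleep(0.5);                                 // hold the flash
            llTargetOmega(ZERO_VECTOR, 0.0, 0.0);         // stop spinning
            llSetScale(<0.5, 0.5, 0.5>);                  // shrink back
        }
    }
}
```

An off-camera avatar (or another scripted object) would then trigger the effect during filming with something like `/42 flash`, keeping the whole FX pipeline inside Second Life.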
Finally, I edited the recorded shots in Apple iMovie, adding sound FX and copyright-free music. All in all it took me a solid day's work to produce, and I'm pretty pleased with the result. It looks quite unique and has an energetic punch. Perhaps it is at least a proof of concept for using virtual worlds to quickly produce anime-style cartoons?