I have uploaded the latest version of AtlasMaker to GitHub, so it should be easier for you to hack on, report bugs, and so on. There's not much in my GitHub account at the moment, but I plan to upload some of my experiments soon.
So, I haven’t written a blog post since 2014! What have I been doing since I released Antigen?
Well, I have been busy:
Learning 3d modelling
Improving my art skills
Improving my maths skills
Learning Unreal Engine
Learning Common Lisp
Developing Antigen was a pretty frustrating experience. I had no problems with the programming, but creating the artwork was difficult and took a great deal of time and experimentation. I threw away an enormous amount of crap artwork during the production of the game. So I resolved that once the game was finished, I would spend some time working on art fundamentals before doing another game.
I’ve been working through Drawing on the Right Side of the Brain, which I actually find quite difficult. I’ve never been interested in drawing things in the real world or in representing reality; I only ever wanted to express what was in my imagination. My brain goes crazy when I try to draw a chair or a cup or something – it’s like it’s not interested in purely sensory phenomena, but constantly seeks out ideas or “personalities” in things. I’m going to stick with it. There’s a lot of value in doing things that don’t come naturally to you.
I’ve also been working through the exercises on Draw A Box, which I’ve found extremely helpful for building basic pencil-handling skills (which I sorely need: I have dyspraxia, and had to have a special pen at school!).
Coffee machine modelled in Maya. One of my first models since starting the Digital Tutors course. One day I’ll do a nice render of it.
On top of that, I’ve been learning 3d modelling in preparation for my next game, which will use the Unreal game engine. I learned to model years ago, using Imagine on the Amiga, but my skills were pretty out of date. Maybe I’ll post some of my models and renders when I start getting good. I’ve been following tutorials on Digital Tutors. The tutorials are very good, but the site is a bit pricey, especially as you’ve gotta sign up as a premium user to access the project files and reference images, which you need if you really want to succeed.
Imagine 2.0 on the Amiga. I got this free from a computer magazine in the early ’90s. My first 3d modelling experience.
Apart from the game development stuff, I’ve also been learning the Common Lisp programming language. I got interested in Lisp a couple of years ago when I first found out about Lisp Machines: high-end workstations from the 70s and 80s that worked in a fundamentally different way from computers today. They offered programmers a positively luxurious environment for building software, and I found them extremely inspiring. I want to do something special with Lisp, but I ain’t gonna talk about that yet 😉
Last week I received my Oculus Rift DK2 headset which I ordered back in August. I had my eye on the Oculus DK1, the first version of their headset, but I was too busy getting Antigen out of the door to let myself be distracted by other projects. Now Antigen is out of the way, I can let myself experiment a little while I figure out what my next big project will be.
It comes in a padded, reusable cardboard box, which is fine for storage, but if I were going to take it anywhere I would probably get a Pelican camera case or something similar and cut some custom foam inserts for it, like this Oculus user has done.
The headset is surprisingly light, and connects to your computer with a sturdy cable that runs over the top of your head along one of the headset’s straps. I imagine this was done for weight distribution and to help keep the cable out of the way when you move around.
The cable connects to your computer via one USB port and one HDMI port. There is also a motion-tracking IR camera that clips to your monitor like a webcam and takes up another USB port. There’s an optional power supply as well, for when you want to use the USB port built into the headset. One device that uses this is the Leap Motion, which I’d like to get my hands on sooner or later…
The Rift also comes with two sets of detachable lenses: a longer lens that is already installed in the headset, and a shorter lens intended for nearsighted users.
Setup was easy. I downloaded the drivers from the Oculus website, installed them and, as instructed, updated the headset firmware using a simple and straightforward preferences utility. This tool also lets you create user profiles for the Rift, containing your height and other details.
The Rift appears to your computer as a second monitor. You can view anything you drag onto it, but it will look like crap unless it is rendered stereoscopically and positioned correctly. There is also a DirectX-style “Direct Mode” that lets applications talk directly to the Rift, but it doesn’t work on the Mac yet. (The Mac is a bit of a second-class citizen for the Rift at the moment. Unfortunately, most of the coolest demos are Windows-only too.)
At this point I could hardly contain my urge to jack in to the matrix and become the console cowboy I’d always dreamed of being, so I hit “Show Demo” and braced myself.
I found myself sitting at a desk in cyberspace, somewhere on an infinite Tron-esque plain. The first thing that struck me was the stereoscopy, the 3d-ness of the lamp and the tower of playing cards on my desk. I really felt I could reach out and touch them.
The resolution is good at 960×1080 per eye, but I don’t think it’s high enough for consumer use yet. I suspect that in order to approximate the usability and experience of a desktop monitor, VR displays will need a much higher pixel density per degree of view. There is also a kind of “screen door effect” where you can see the pixel grid. Most of the time this isn’t too bad, but it is noticeable when you try to view details that are further away, such as the faces of game characters.
Text is readable as long as it is large and in the middle of the display. It’s a bit like looking at text on a C64 with a bad TV set. Towards the edges of the display it gets blurry and there is some chromatic aberration. You need to look directly at text in order to read it properly, which will have ramifications for UI design, particularly in games, where information is usually positioned at the edges of the screen.
All the same, these aren’t really criticisms; they are just statements about where the developer kit is right now. The DK2 is a prototype that a committed experimenter works with and adapts to, rather than a consumer product. I’ve not had a chance to try the new Crescent Bay prototype, but reports suggest that it eliminates the screen-door effect and that we can expect the consumer version to be a real advance on what is currently available.
There’s a nice demo included with the SDK where you can walk around an Italian country house and its surrounding grounds. The house sits on a cliff above the sea, and you really feel like you could fall off when you approach the edge. One thing I found interesting is how your body is tricked into reacting to things when you accidentally bump into them; it suddenly feels like there is something real there. This demo also shows how high-quality shading and lighting can mitigate some of the shortcomings of the headset, such as the resolution and the screen-door effect. Another demo I tried, 4thFlrStudio by Brendan Coyle, demonstrates this clearly with its remarkable lighting and detail.
After a bit of messing about I got Steam and Half-Life 2 working on the Rift. Within minutes it gave me motion sickness! It clearly showed how VR requires both a rock-solid framerate and well-calibrated motion settings in order to be effective. Anything less than 75fps (the refresh rate of the DK2’s display) will likely send the user running to the toilet!
The presence of other characters in the game is startling. This was the first time I had encountered another being in VR, albeit a simulated one. At the beginning of HL2 you go through a police checkpoint and one of the officers thumps you in the chest. I actually felt it! The enemies were extremely intimidating; I dread to think what it’s like on the later levels when you encounter the Hunters and Striders! I really felt I was inside the game. It’s kind of like a fuzzy lucid dream. I will have to experiment with the settings to see how far I can minimise the motion sickness.
The next thing for me to do is get the SDK going and see what I can create with it. I might also need a more powerful computer. Some of the demos I tried only ran at 16fps. My 2012 MacBook Pro is great for general use and for my iOS development, but its NVIDIA GT 650M has about 1/7th the power of a top desktop GPU.
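To get a feel for what the SDK gives you before wiring it into a real renderer, here’s the kind of minimal program I expect to start with. This is just a sketch against the 0.4-era LibOVR C API that ships with the DK2, written from memory of the documentation, so treat the exact function and field names as assumptions that may need adjusting for whatever SDK version you have installed:

```c
/* Minimal DK2 hello-world against the 0.4-era LibOVR C API.
   Names are from memory and may differ slightly between SDK versions. */
#include <stdio.h>
#include <OVR_CAPI.h>

int main(void)
{
    ovr_Initialize();

    /* Grab the first attached headset; fall back to a "debug" DK2 so the
       program still runs with nothing plugged in. */
    ovrHmd hmd = ovrHmd_Create(0);
    if (!hmd)
        hmd = ovrHmd_CreateDebug(ovrHmd_DK2);

    printf("Found: %s (%dx%d panel)\n",
           hmd->ProductName, hmd->Resolution.w, hmd->Resolution.h);

    /* Ask for orientation tracking plus positional tracking from the
       IR camera. */
    ovrHmd_ConfigureTracking(hmd,
        ovrTrackingCap_Orientation |
        ovrTrackingCap_MagYawCorrection |
        ovrTrackingCap_Position,
        0);

    /* Poll the head pose a few times. A real application would do this
       once per frame and feed the pose into its camera transform. */
    for (int i = 0; i < 10; ++i) {
        ovrTrackingState ts =
            ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());
        if (ts.StatusFlags & ovrStatus_OrientationTracked) {
            ovrQuatf q = ts.HeadPose.ThePose.Orientation;
            printf("head orientation: %+.3f %+.3f %+.3f %+.3f\n",
                   q.x, q.y, q.z, q.w);
        }
    }

    ovrHmd_Destroy(hmd);
    ovr_Shutdown();
    return 0;
}
```

All this does is report the headset and print the head pose. The real work is on the rendering side: stereo render targets, the distortion pass, and keeping that solid 75fps, which is exactly what I want to experiment with once I have hardware that can keep up.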