An Engine to Power Virtual Worlds


In our last post, we mentioned that we were developing a custom game engine for our “Side Trip to Wonderland” and “VR Toyroom”. Our Wonderland Engine gave us full control over performance.
As a result, performance was great: while VR games need to stay below roughly 11 ms per frame to keep latency low (the Oculus Rift CV1 refreshes at 90 Hz, which leaves about 11.1 ms per frame), VR Toyroom came in at 4 ms on the minimum required hardware for the Rift!
Our game binaries were very small, compilation was very fast and the editor was integrated directly into Blender.

On the other hand, development took incredibly long. If we wanted shadows, we had to implement them in the engine ourselves first. If we had wanted real-time ambient occlusion, we would have had to implement that too. In Unreal Engine or Unity you tick a checkbox and you’re done.

Why not use Unity or Unreal Engine?

Our games are pretty simple. They do not require a lot of visual fidelity, nor elaborate AI, networking or complex texture streaming code. In addition, since we develop exclusively for virtual reality, there are a lot of post-processing filters you want to avoid anyway:
Lens flares, vignettes and depth of field, for example, emulate artifacts that only appear with cameras, not with the human eye. Motion blur, if applied to the entire screen, can even induce motion sickness.

In VR, performance is king. Latency here means the time from a user’s head motion until the image on the screen is updated accordingly; the goal is to keep it below 20 ms, the commonly cited threshold for “fooling your subconscious”. The lower the latency of your game, the better it will feel in VR and the easier it will be to create presence and immersion. This even works with stylized or cartoony worlds; you do not need photorealistic rendering.

When we started VR development, “deferred rendering” was very popular among game engines. It is a rendering technique that allows you to render many light sources without a huge performance impact. For VR, though, this technique is less desirable, since the head position and rotation are needed earlier while rendering a frame, which increases latency; see the Oculus blog post on this topic for further information. By now, though, Unreal Engine has caught up and provides a Forward Renderer as an alternative to its Deferred Renderer.

Why *do* we use Unreal Engine?

Unreal and Unity provide tools to get everything done for a simple game extremely quickly. Entire production pipelines are built around them, asset management and versioning tools are integrated already, there is support for diverse file formats and a healthy community builds plugins to get them to do basically anything you want.

All in all, everything is there already, so you can focus on creating the game rather than the engine. Features like Blueprints let you quickly click together game ideas; it makes game development almost too easy.

We have now started a third project with Unreal Engine to see how it goes (putting the other two on hold for now). We basically had three options: Unreal, Unity and CryEngine. Here is a quick summary of what led to the final choice:
Since our developers know C++ very well and Unreal Engine provides a C++ API, that was one of the most important deciding factors. Access to Unity’s source code is expensive, and even a simple Pro license is not affordable for an eight-person zero-budget team. CryEngine currently does not support Android, which is required when targeting mobile VR headsets such as Gear VR or Daydream.

Warning, Technical: The Wonderland Engine

I spent a lot of time on our engine, so I will dedicate a few paragraphs to it in the hope that someone reads them and thinks “Wow, hey, that is pretty cool!” 😉

Our game engine is fittingly named “Wonderland Engine”, since it is meant to power the wonderlands we, as Vhite Rabbit, guide you to. The engine itself is written in C++ and based on Magnum, an open source C++11/14 OpenGL graphics engine. Our editor is written as an addon for the open source 3D creation suite Blender, which allows our artists to create assets and then view them directly in the modelling tool (at least that’s the idea). At the same time, we profit from Blender’s unbelievably fast startup times.
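To give you an idea of what “editor as a Blender addon” means in practice, here is a minimal sketch of such an addon skeleton. This is not our actual editor code; the operator name and what it does are made up for illustration, and the real addon is considerably larger.

```python
# Minimal Blender addon skeleton (Blender 2.7x API) illustrating the shape of
# an editor addon. Not our actual code; names are made up for illustration.
bl_info = {
    "name": "Wonderland Editor (sketch)",
    "blender": (2, 78, 0),
    "category": "Import-Export",
}

import bpy


class PreviewInEngine(bpy.types.Operator):
    """Export the current scene and view it in the engine runtime."""
    bl_idname = "wonderland.preview"      # hypothetical identifier
    bl_label = "Preview in Wonderland Engine"

    def execute(self, context):
        # The real operator exports the scene and launches the runtime;
        # here we only report that we would do so.
        self.report({'INFO'}, "Scene would be exported and launched here")
        return {'FINISHED'}


def register():
    bpy.utils.register_class(PreviewInEngine)


def unregister():
    bpy.utils.unregister_class(PreviewInEngine)
```

Registering operators like this is all it takes to make them reachable from Blender’s operator search and keymaps, which is part of what makes Blender such a convenient editor host.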

To manage compile-time dependencies, we wrote a custom package manager in Python, which lets us deploy binaries manually or via GitLab CI and provide automatic updates to the team.
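To illustrate the idea (this is not our actual code, and the manifest URL and file layout below are invented for the example), the core of such an updater boils down to comparing installed versions against a remote manifest and fetching whatever is outdated:

```python
# Rough sketch of a binary package updater: compare installed versions against
# a remote manifest (e.g. one published by CI) and fetch whatever is outdated.
# URL and file layout are placeholders, not our actual setup.
import json
import urllib.request
from pathlib import Path

MANIFEST_URL = "https://example.com/packages/manifest.json"   # hypothetical
INSTALL_DIR = Path.home() / ".wonderland-packages"            # hypothetical


def installed_versions():
    versions_file = INSTALL_DIR / "installed.json"
    return json.loads(versions_file.read_text()) if versions_file.exists() else {}


def update_packages():
    INSTALL_DIR.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(MANIFEST_URL) as response:
        manifest = json.loads(response.read().decode("utf-8"))

    installed = installed_versions()
    for name, info in manifest["packages"].items():
        if installed.get(name) != info["version"]:
            print("Updating", name, "to", info["version"])
            archive = INSTALL_DIR / (name + ".tar.gz")
            urllib.request.urlretrieve(info["url"], str(archive))
            # ...unpack the archive and update build paths here...
            installed[name] = info["version"]

    (INSTALL_DIR / "installed.json").write_text(json.dumps(installed))
```

With something like this in place, CI only needs to upload fresh builds and bump the manifest, and everyone on the team stays up to date.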

Data exchange between Blender and our engine uses the OpenGEX scene file format via a custom exporter plugin (which is open source!). Compared to the official exporter, ours sacrifices features we do not need in favour of speed, but it also lets us export extensions containing properties not covered by the original format specification, such as physics materials, audio properties or particle system settings.
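As a sketch of what emitting such an extension could look like: the Extension structure with its applic and type properties comes from the OpenGEX specification, while the “PhysicsMaterial” type and the values written into it are placeholders standing in for our own extensions, not a verbatim excerpt of the exporter.

```python
# Sketch of how an exporter might emit a custom OpenGEX Extension structure.
# The Extension structure itself ("applic" and "type" properties) is part of
# the OpenGEX spec; the "PhysicsMaterial" type and its contents are made up.
import sys


def write_physics_material_extension(out, friction, elasticity):
    out.write('\tExtension (applic = "WonderlandEngine", type = "PhysicsMaterial")\n')
    out.write('\t{\n')
    out.write('\t\tfloat {%f}\t\t// friction\n' % friction)
    out.write('\t\tfloat {%f}\t\t// elasticity\n' % elasticity)
    out.write('\t}\n')


# Example values; in the real exporter these would come from the Blender material.
write_physics_material_extension(sys.stdout, friction=0.8, elasticity=0.2)
```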

In addition, we can compile the engine and game projects through Blender, which allows artists to view their changes directly in the game.
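There is no magic behind that; in essence the addon just shells out to the native build system, roughly like the helper below (the cmake command line is a placeholder, not our actual build invocation).

```python
# Sketch of the "build from inside Blender" idea: the addon simply runs the
# native build and checks the result. Command line is a placeholder.
import subprocess


def build_project(build_dir):
    """Run the native build for the given build directory; return True on success."""
    return subprocess.call(["cmake", "--build", build_dir]) == 0
```

A Blender operator can then call this, report success or failure via self.report() and reload the exported scene.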

Our custom engine allows our artists to “wish” for features, which can then be implemented while still keeping performance really really good.
If you are interested in further details, go tweet at me @Squareys!


March 29th 2017 10:41 PM | by Squareys | posted in Wonderland Engine