Welcome to the 4th part of my series of blog posts about mobile Samsung Gear VR development. In this part I will look at useful tips and tricks for VR and mobile performance optimisation and testing, which will enable you to create an enjoyable, smooth experience for your users (and speed up your app submission process, as covered in the next and final part of this blog series).
If you have already read parts 1, 2 and 3, where we looked at why you should design mobile VR apps for Samsung Gear VR (and, to a degree, prepare for Google Daydream VR), how to get your development environment set up, and general VR design considerations, and you’re still here, then you must be pretty interested! From this point onwards it is therefore assumed that you have at least a basic knowledge and understanding of 3D asset creation and app development, since I’m switching to a more specific set of technical terminology in order to relate to key aspects of design and development appropriately. You have been warned…
Optimising the performance of your mobile VR app is key: it ensures that the user has a comfortable experience (horror titles aside) and that your app passes the submission review as part of the curated store process.
There are three main areas to performance: overall app performance, 3D optimisation and battery lifetime. These all play a part in ensuring that users can enjoy your app for as long as possible, give great reviews and tell their friends and associates about it to help spread the word.
- Optimise for a solid 60 frames per second; you cannot afford to drop frames. Asynchronous TimeWarp will mask the occasional missed frame in more complex scenes, but do not rely upon it.
- Don’t rely on the frame rate counter in the Unity editor, since the editor is doing everything twice when you play a scene on your computer. Whilst it can give you a good indication of your performance levels, build out and test on the target hardware to ensure a smooth experience.
- Users often can’t tell beyond a virtual distance of around 20m whether the image or scene they are looking at is stereoscopic or monoscopic. Use this to your advantage and swap far-off environments for skyboxes to save rendering load on a mobile device.
- Use the tools built into Unity to help you: the Profiler and the Frame Debugger. These will show you where your app is lagging or overloading a scene, allowing you to go through frame by frame and examine how the scene is constructed by stepping through the draw calls. You will likely find objects you don’t need to render, reducing your overall draw call count.
- Furthermore, batch your draw calls wherever possible using the static batching and dynamic batching tools built into the Unity editor.
- Cull the faces of your 3D models’ geometry that will never be seen to remove wasteful polys.
- Similarly, use occlusion culling to avoid rendering things that cannot yet be seen, e.g. the geometry of a room beyond a door that hasn’t been opened.
- Simplify your 3D meshes as much as possible so that objects use the lowest level of detail you can get away with without losing finer information.
- Reduce overdraw as much as possible so that fewer objects are drawn over the top of one another. The Unity Scene View control bar will give you an understanding of what can be optimised.
- Whilst you are optimising the scene, use lightmapping to bake shadows onto objects and scenes rather than using expensive dynamic shadows.
- Beyond skyboxes, if you do have to render a distant object in 3D, use a lower level of detail (LOD) model with fewer triangles, and swap in higher-LOD models as the object gets closer to the user’s viewpoint.
- Make sure CPU and GPU throttling levels are set, since failing to initialise these values will result in your app running down-clocked by default. Gear VR apps are typically CPU-bound, so favouring the CPU over the GPU will often get the best performance. However, if your app is well optimised, you might be able to down-clock both the CPU and GPU, increasing battery life and therefore session playtime.
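To illustrate the throttling point, here is a minimal sketch assuming the Oculus Utilities for Unity package, where `OVRPlugin` exposes the CPU and GPU clock levels; the exact API and property names may differ between SDK versions, so treat this as a starting point rather than gospel:

```csharp
using UnityEngine;

// Sketch only: assumes the Oculus Utilities for Unity package, which
// exposes clock levels through OVRPlugin (values 0 = lowest, 3 = highest).
public class ClockLevels : MonoBehaviour
{
    void Start()
    {
        // Gear VR apps are typically CPU-bound, so favour the CPU.
        OVRPlugin.cpuLevel = 2;
        OVRPlugin.gpuLevel = 1;
    }
}
```

If your app is well optimised, try lowering both values and verify the frame rate on the device before shipping.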
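The LOD swapping described above can be driven by Unity’s built-in `LODGroup` component. The sketch below wires up three renderers of decreasing triangle count; the field names and screen-relative transition heights are illustrative, not prescriptive:

```csharp
using UnityEngine;

// Illustrative sketch: builds a LODGroup from three meshes of decreasing
// triangle count (the renderer fields are placeholders to assign in the
// Inspector).
public class SetupLod : MonoBehaviour
{
    public Renderer highDetail;
    public Renderer mediumDetail;
    public Renderer lowDetail;

    void Start()
    {
        LODGroup group = gameObject.AddComponent<LODGroup>();
        LOD[] lods = new LOD[]
        {
            // Screen-relative heights below which the next level takes over.
            new LOD(0.6f, new Renderer[] { highDetail }),
            new LOD(0.3f, new Renderer[] { mediumDetail }),
            new LOD(0.1f, new Renderer[] { lowDetail }),
        };
        group.SetLODs(lods);
        group.RecalculateBounds();
    }
}
```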
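To check the 60 frames per second target on the actual hardware rather than in the editor, a simple frame-time logger helps. This sketch averages over one-second windows and writes to the log, which you can read from a connected device with `adb logcat -s Unity`:

```csharp
using UnityEngine;

// Minimal on-device frame rate logger: accumulates frame times and logs
// the average frames per second once every second.
public class FrameRateLogger : MonoBehaviour
{
    float accum;
    int frames;

    void Update()
    {
        accum += Time.unscaledDeltaTime;
        frames++;
        if (accum >= 1f)
        {
            Debug.LogFormat("Avg FPS: {0:F1}", frames / accum);
            accum = 0f;
            frames = 0;
        }
    }
}
```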
The key to a great app is to test regularly and iterate after each session, folding relevant suggestions and improvements to user flow, interface, process and design in as you go, rather than storing all the effort up until the last minute when you think it is 100% done. This way you will be able to make continuous small adjustments, which overall requires less effort than suddenly discovering a major flaw in your design too late and having to rework and fix it at huge expense.
As the developer, you will be too close to and involved with the app to see its issues and bugs, so user testing early on and throughout the development process is critical to ensure you aren’t missing something obvious that a first-time user can spot straight away. However, there are still a number of tests that you can and should carry out yourself before unleashing it upon others.
The main types of testing you will be carrying out are functionality- and performance-related: ensuring that the app operates as it should, at a basic level, in a way that is comfortable for users. You can write unit tests for aspects of functionality, but sometimes you are just going to have to carry out manual testing and spot issues yourself.
If you have decided to manage the development process using an Agile methodology, then you can create a series of test cases from your epics and user stories to ensure that the app functions and includes the features as intended. Otherwise, you will need to devise a series of test cases that effectively cover all possible conditions and uses: not just the expected behaviour and user journey, but also what a user could do to disrupt it and potentially end up in a locked state, e.g. not accepting a checkbox, or not meeting the score required to progress to the next level without being offered the option to try again.
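As a sketch of how such a locked-state condition can be captured in a unit test, the example below uses NUnit (which Unity’s test tools build on); `LevelGate`, its pass mark and its method names are hypothetical, not from any real project:

```csharp
using NUnit.Framework;

// Hypothetical rule under test: a player needs 100 points to unlock the
// next level, and a failed attempt must always offer a retry.
public static class LevelGate
{
    public const int PassMark = 100;
    public static bool CanProgress(int score) { return score >= PassMark; }
    public static bool OffersRetry(int score) { return !CanProgress(score); }
}

[TestFixture]
public class LevelGateTests
{
    [Test]
    public void FailingScoreStillOffersRetry()
    {
        Assert.IsFalse(LevelGate.CanProgress(99));
        Assert.IsTrue(LevelGate.OffersRetry(99)); // no dead end for the user
    }

    [Test]
    public void PassMarkUnlocksNextLevel()
    {
        Assert.IsTrue(LevelGate.CanProgress(100));
    }
}
```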
Testing VR for functionality is harder than testing a normal flat app, since you are best running it on the device in the VR headset, which means you cannot quickly switch over to a spreadsheet or notepad to detail issues you spot. Testing in pairs is therefore recommended: one person interacts and carries out the tests whilst the other notes down the issues described aloud.
Before you get to this stage, though, you can run the app directly in the Unity Editor to check functionality and performance without having to make a build and deploy it to a mobile device. As referenced above in the performance optimisation section, the Unity Profiler, Frame Debugger and Scene View provide a good starting point for performance testing, and the Editor console will also surface any edge-case exceptions and code errors.
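When the Profiler points at a heavy frame, you can narrow the cause down by wrapping suspect code in named samples, which then appear as their own entries in the Profiler window. A minimal sketch (the `EnemyUpdater` class and sample name are hypothetical):

```csharp
using UnityEngine;

// Hypothetical component: wraps a suspect section in a named Profiler
// sample so its cost shows up separately in the Unity Profiler window.
// (From Unity 5.5 the Profiler class lives in the UnityEngine.Profiling
// namespace; earlier versions expose it directly under UnityEngine.)
public class EnemyUpdater : MonoBehaviour
{
    void Update()
    {
        Profiler.BeginSample("EnemyUpdater.Pathfinding");
        // ... pathfinding or other suspect code here ...
        Profiler.EndSample();
    }
}
```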
User testing requires more preparation and time to get right, to ensure you are getting valid feedback about your app and not about the technology. As mentioned in an earlier blog post, roughly 9 out of 10 people still haven’t tried VR before, so using them as fresh test subjects, whilst necessary, needs managing to make their feedback useful to you.
When arranging a user testing session, have each tester carry out a period of VR familiarisation and acclimatisation before asking them to try out and test your app. If they are new to VR, this gives them a chance to be wowed by the technology and the experience of being immersed in another world without that excitement affecting the usefulness of their feedback about your VR app specifically. Once they are aware of what VR can do and how it works at a basic level, they will be ready to test your app with a clearer understanding of how it should work and feel. Some good examples of familiarisation apps on Gear VR are the Samsung Introduction to Virtual Reality (free) and Welcome to Virtual Reality by SliceVR (paid).
Prepare a set of questions to ask after their test session to gather useful feedback about how they felt, how easily they understood what to do, where they struggled or got stuck, and any aspects that made them feel uncomfortable (from a performance perspective, not a scary-content one).
Remember, it’s harder to mirror the content shown on a mobile VR device, so you likely won’t be able to see in real time what testers are looking at (and likely pointing at in thin air). Bring a set of printouts of key screens and menus from your app so that, after their test session, testers can refer to them when describing the screens or panels they saw; they won’t necessarily know or use the same names for them as you do.
If you have the budget available, there are companies now offering VR testing services to take some of the time, effort and strain off you. Testronic Labs offer a paired VR testing service for functionality and compatibility, whilst Player Research are leaders in user research and user testing; both create and provide comprehensive post-test reports for you to take on board as part of the service.
So by now your app should be running smoothly at a solid 60 frames per second, be as bug-free as a piece of software can ever be, and have been tested and verified as comfortable and easy to use by a range of intended end users. It’s time to submit your app to the Oculus Store and get ready for launch! Look out for part 5 of this series of blog posts, covering just that process, coming soon.
Oculus Blog - Squeezing Performance Out of Your Gear VR Title pt.2: https://developer3.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game-continued/
Oculus Mobile SDK - Testing & Troubleshooting: https://developer3.oculus.com/documentation/mobilesdk/latest/concepts/book-testing/
By Sam Watts
Sam Watts has been involved in interactive, immersive content production for over 15 years, from learning development and simulation to AAA and casual games. Currently employed as Operations Lead for Make REAL and Game Producer for Tammeka, he keeps busy by evangelising the possibilities and real-world benefits of immersive technologies like VR and AR to anyone who will listen. Tammeka’s first VR game Radial-G: Racing Revolved launched alongside Oculus Rift in March and HTC Vive in April 2016. Make REAL are currently powering the McDonald’s “Follow Our Foodsteps” VR farming experiences at numerous agricultural and countryside shows around the UK.
Disclaimer: In the rapidly changing and advancing tech climate around VR, where things never stay still for long, this blog is written as things stand at the time of publishing. In a month or two, some elements or details may be incorrect or surpassed by new technology that does what I say currently cannot be done.