Just a few days before Microsoft Build 2019, the inaugural Mixed Reality Dev Days were held right on the Microsoft campus in Redmond. How was it? Learn all about it here!
What are #MRDevDays, anyway?
Build 2018 hosted a unique invite-only event for participants interested in mixed reality development. The event was part of Build, yet also separate: it had its own agenda and ran in parallel with other Build sessions. Moreover, it was not a wholly Microsoft-driven effort. Instead, one of the main organizers who brought the idea to life was Jesse McCulloch, then a community member with a passion for mixed reality.
The event was a tremendous success, and many were hoping it would get a “sequel” in 2019 as well. And they were right! Jesse, who joined Microsoft in September 2018, started planning Mixed Reality Dev Days for 2019, which took place on May 2nd and 3rd right on the Microsoft campus in Redmond.
To attend, we needed to fill out a survey and share how we work with mixed reality and why we should be invited. Even though I had so far only tried out the basics of mixed reality development, I was lucky to be one of the selected participants. I hope this blog article will also serve as a thank you to the organizers for inviting me 😉 .
On the morning of May 2nd, we gathered in the lobby of Microsoft Building 92, which also hosts the Microsoft Company Store and Microsoft Visitor Center. We got our attendee badges and were soon led upstairs to the first floor, where the whole event took place.
The opening keynote was led by Jesse McCulloch himself, who shared a bit about the Microsoft Life and the journey that led him there, along with his passion and vision for Mixed Reality and his excitement about the two-day event. Lila Tretikov, the corporate vice president of AI, Perception and Mixed Reality, followed, welcomed all attendees, and talked about the importance of Mixed Reality and the next wave of MR ushered in by HoloLens 2. By 2025, mixed reality is expected to be ubiquitous and of crucial importance in all areas of our lives.
Microsoft HoloLens 2 was the overall star of the event, as most of the sessions focused on developing experiences for this device. Charlie Han took us on a deep dive into the hardware that makes HoloLens 2 come to life. The amount of innovation and attention to detail in the small form factor of the headset is incredible. From improved weight distribution, through the use of carbon fiber to limit thermal expansion, to smart microphone design, everything is significantly better.
One of the most amazing new features in HoloLens 2 is, without a doubt, eye tracking. Sophie Stellmach walked us through the APIs available to developers for understanding where the user is looking. This can be used to infer the user’s intent and make actions contextual, to add implicit actions like eye-gaze-based auto-scrolling, zoom, and pan, and to track user attention, which will be incredibly useful in user research studies, for example. We were also told that eye tracking has its challenges: it is always on, so it is crucial to differentiate intentional eye movement from random, spontaneous movement. Compared to head gaze and hand motions, it is a less precise form of interaction, but it has enormous potential for creating unique experiences.
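To make this more concrete, here is a minimal sketch of what reading eye gaze can look like with the Mixed Reality Toolkit in Unity. The type and property names follow MRTK v2 as I understand it and may differ between toolkit versions; the 10-meter ray length and the logging are purely illustrative choices of mine.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch: each frame, cast a ray along the user's eye gaze and log
// whatever object they are currently looking at.
public class GazeAttentionLogger : MonoBehaviour
{
    private void Update()
    {
        IMixedRealityEyeGazeProvider eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze != null && eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            // Build a ray from the gaze origin and direction; 10 m is arbitrary.
            var gazeRay = new Ray(eyeGaze.GazeOrigin, eyeGaze.GazeDirection);
            if (Physics.Raycast(gazeRay, out RaycastHit hit, 10f))
            {
                Debug.Log($"User is looking at: {hit.collider.name}");
            }
        }
    }
}
```

A real attention-tracking study would aggregate these hits over time rather than log every frame, but the API surface is this small.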
The second and probably most crucial addition to HoloLens 2 is fully articulated hand tracking. Julia Schwarz told us the first version of the device allowed only a simple “air-tap” gesture, but most users immediately tried to touch and interact with holograms directly, which did not work. Luckily, HoloLens 2 now supports these instinctual interactions, and it makes a huge difference in user experience. Hands are now fully tracked, including the fingers and even individual finger joints, so you can grab, push, and pinch everything you see. The MR Toolkit team talked about the challenges of creating a natural user interface for a holographic world: it required them to build many different prototypes for known UI controls like buttons, sliders, and gauges, which all needed to be customized and adapted to the 3D, real-world interaction model.
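As an illustration of how accessible the per-joint data is, here is a rough MRTK v2 sketch that detects a pinch by measuring the distance between the thumb tip and index tip of the right hand. The API names follow MRTK v2 as I recall them, and the 2 cm threshold is just an illustrative value.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch: query individual finger-joint poses and treat a small
// thumb-to-index distance as a pinch.
public class SimplePinchDetector : MonoBehaviour
{
    private const float PinchThreshold = 0.02f; // ~2 cm, illustrative

    private void Update()
    {
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.ThumbTip, Handedness.Right, out MixedRealityPose thumb) &&
            HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose index))
        {
            float distance = Vector3.Distance(thumb.Position, index.Position);
            if (distance < PinchThreshold)
            {
                Debug.Log("Right-hand pinch detected");
            }
        }
    }
}
```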
Of course, all attendees were eager to get hands-on with HoloLens 2. MR Dev Days had a special room dedicated to this purpose. You could try to deploy and test your own HoloLens 2 apps on the device or choose the UI interaction demo focused on hand tracking. In the demo, we were first greeted by a holographic Alex Kipman and then presented with a set of different 3D UI controls, including a holographic piano. The interaction was really precise, and you could actually see a “skeleton” model of your hands and fingers reacting to your movements. I had tried HoloLens 1 twice before, and the new version feels much better. It sits comfortably on your head, and there is no longer a problem adjusting the visor so that the view is centered. In addition, thanks to a field of view twice as large, the whole experience is much more immersive.
For lunch, we had a chance to take part in the Microsoft Life on campus and could choose from a number of restaurants available to employees, using a dining coupon card. I had a MOD Pizza, but there was a vast range of other options, including sushi and more.
After lunch, it was time for a session with Don Box, which was not only a lot of fun but also included the official announcement of the HoloLens 2 Development Edition. Although the price is ultimately the same as the standard edition at $3,500, it includes $500 in Azure credits and three months of Unity Pro and Unity PiXYZ. In addition, the Development Edition can be purchased as a “subscription” at $99/month. The total is still $3,500, and you get to keep the device once it is fully paid off, but it is a more affordable approach than the full up-front payment.
The next session I attended focused on the new Azure Kinect. Like HoloLens 2, this new version is awe-inspiring and packs significant improvements over the Xbox One Kinect. Erica Towle and Chris Edmonds walked us through building a simple app with the Azure Kinect SDK and showed us the RGB and depth cameras, distance measurement, and how Kinect can recognize people. Although we probably won’t see Kinect back in our living rooms for some time, it will have broad applications in many industries.
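For a taste of what that simple app looks like, here is a small sketch based on my understanding of the Azure Kinect Sensor SDK for C#: it opens the device, starts the cameras, and grabs a single capture. The specific configuration values are illustrative, not the ones used in the session.

```csharp
using System;
using Microsoft.Azure.Kinect.Sensor;

// Sketch: open the first Azure Kinect, start its cameras, read one frame.
public static class KinectSketch
{
    public static void Main()
    {
        using Device device = Device.Open(0); // first connected device
        device.StartCameras(new DeviceConfiguration
        {
            ColorFormat = ImageFormat.ColorBGRA32,
            ColorResolution = ColorResolution.R720p,
            DepthMode = DepthMode.NFOV_Unbinned,
            CameraFPS = FPS.FPS30,
        });

        using Capture capture = device.GetCapture();
        Console.WriteLine($"Depth frame: {capture.Depth.WidthPixels} x {capture.Depth.HeightPixels}");

        device.StopCameras();
    }
}
```

Body (people) tracking lives in a separate Body Tracking SDK layered on top of these captures.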
Yoyo Zhang showed us the Mixed Reality Toolkit and the wide range of features Unity developers get out of the box with it. The toolkit provides a very good starting point for any mixed reality app. Included are an input system complete with support for articulated hands, an eye tracking API, voice commanding, controller visualization, teleportation, and UI controls. Unreal Engine support is planned for the future as well.
We ended the sessions of day 1 with the amazing Mark Bolas, who shared his insights on how mixed reality brings about the next step in human-computer interaction. He asserted that each step in the evolution of computer user interfaces gives us more and more freedom to express ourselves and interact instinctively, liberating us from the limited interaction model that traditional computers offer. The challenge for developers is to think about how to blend user interfaces with the real world and make them natural for the user.
The day ended with a welcome reception. I had the pleasure of meeting and talking with many interesting people and was quite amazed by how many developers work with HoloLens full-time. It seems the community is growing, and HoloLens 2 will only accelerate that. I don’t consider myself a mixed reality developer yet, so it was an inspirational experience to talk with people who have such advanced first-hand experience and are clearly excited about what comes next.
I started the second day with a session by Hakon Strande and Nick Landry, who talked about augmenting HoloLens with Azure Speech Cognitive Services. Although the device itself offers basic voice recognition features, the true magic happens when it utilizes the cloud. Speech Cognitive Services offer many useful capabilities, such as speech-to-text transcription with real-time translation. Combined with LUIS (Language Understanding Intelligent Service), you can understand user intent based on natural speech and perform actions in response to prompts. The service learns from actual interactions, so the more you use it, the better the model you can train. When you take this capability and put it in a HoloLens app, the result is nothing short of spectacular. We also had a chance to see the neural text-to-speech service, which uses deep neural networks to make computer-synthesized voices sound nearly indistinguishable from a human. We were presented with a direct comparison of human and synthesized recordings, and it was tough to tell which was which! You can try this yourself as well.
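As a small illustration of how little code the cloud speech-to-text path needs, here is a sketch using the Azure Speech SDK for C#. The subscription key and region strings are placeholders you would supply yourself, and error handling is omitted for brevity.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;

// Sketch: one-shot speech-to-text against the Azure Speech service.
public static class SpeechDemo
{
    public static async Task Main()
    {
        // Placeholders: use your own Speech resource key and region.
        var config = SpeechConfig.FromSubscription("<your-speech-key>", "<your-region>");
        using var recognizer = new SpeechRecognizer(config);

        Console.WriteLine("Say something...");
        SpeechRecognitionResult result = await recognizer.RecognizeOnceAsync();

        if (result.Reason == ResultReason.RecognizedSpeech)
        {
            Console.WriteLine($"Recognized: {result.Text}");
        }
    }
}
```

The same recognizer pattern works inside a HoloLens Unity app; the heavy lifting happens in the cloud either way.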
Neena Kamath and Jonathan Lyons talked about Azure Mixed Reality Services. Azure Spatial Anchors is the first new Mixed Reality offering, announced alongside HoloLens 2 at Mobile World Congress. This service allows for easy creation of multi-user, cross-platform, spatially aware mixed reality experiences: multiple users can view the same holograms simultaneously across devices, including Android and iOS phones. It can also be used for indoor wayfinding apps and for integrating IoT data into mixed reality as holograms. The whole service is very ambitious. We were given a live demo on stage, and it was quite impressive for a service in early preview.
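The core flow of the Spatial Anchors SDK is quite compact. The following is a rough sketch from my notes: the names follow the Azure Spatial Anchors C# SDK, the account credentials are placeholders, and the local anchor handle is platform-specific (a Windows SpatialAnchor on HoloLens, an ARAnchor on iOS, and so on), so I am passing it in as an opaque handle here.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.SpatialAnchors;

// Sketch: save a locally created anchor to the cloud so other devices
// can locate the same spot later by its identifier.
public static class AnchorSketch
{
    public static async Task<string> SaveAnchorAsync(IntPtr localPlatformAnchor)
    {
        var session = new CloudSpatialAnchorSession();
        session.Configuration.AccountId = "<account-id>";   // placeholder
        session.Configuration.AccountKey = "<account-key>"; // placeholder
        session.Start();

        // Wrap the platform-specific local anchor and upload it.
        var cloudAnchor = new CloudSpatialAnchor { LocalAnchor = localPlatformAnchor };
        await session.CreateAnchorAsync(cloudAnchor);

        // Sharing this identifier is what enables multi-user experiences.
        return cloudAnchor.Identifier;
    }
}
```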
Azure Remote Rendering is particularly interesting for developers who need to render high-fidelity 3D models with hundreds of thousands of polygons and display and interact with them on devices that simply don’t, and can’t, have that computing power, including HoloLens, mobile phones, and tablets. Remote rendering is currently in closed preview and still far from finished, but it seems very promising and should be easy to integrate into any application.
Azure Digital Twins offers a way to virtually replicate a physical space and model the relationships between people and devices. This way, businesses can optimize, automate, and infuse their spaces with intelligence. Use cases include building management, space usage analysis, energy management, and more.
Ester Barbuto had an interesting session about Dynamics 365 solutions with Mixed Reality and showed several use cases for how HoloLens can be utilized in customer interaction. Although it was quite outside my scope, it was still interesting to see how mixed reality can transform the future of sales. Customers will be able to see the product they are purchasing right in their actual workspace, interact with it, and even provide feedback and change requests directly on the holographic model.
The last session I attended was with Alex Turner, who talked about the OpenXR standard, developed collaboratively by companies across the mixed reality space. The goal is to create a specification that allows high-performance access to any AR or VR device. Developers coding against this specification will no longer need to rewrite their applications from scratch when targeting different headsets. Instead, they can start with a common API and add platform-specific features in the form of optional vendor-specific extensions. This should prevent fragmentation of the mixed reality headset space and significantly simplify further development.
The Mixed Reality Dev Days were a fabulous event, and my thanks go to the whole team for making it possible. Every session was exceptionally well prepared, and the entire event was very well organized. I loved that it was separate from Microsoft Build, so choosing among sessions was less complicated. I enjoyed the chance to visit the Microsoft campus for the first time and experience the Microsoft Life at least a little bit. The fact that Mixed Reality Dev Days were free for all attendees was the cherry on top (although I had a hard time avoiding spending all my money on swag in the Microsoft Company Store – that’s my problem though 😂 ).
Again, I sincerely thank all the organizers of Mixed Reality Dev Days 2019, and I am looking forward to next year (if you’ll have me 🙂 )!