Augmented Reality for .NET Developers on iOS


Lee Englestone 💡🧠👨‍💻

Posted on December 5, 2020


So for this year's contribution to the C# Advent Calendar, I thought I would give the gift of knowledge.

What I want to share is the little-known secret of just how easy it is for .NET developers to create Augmented Reality experiences for iOS devices like the iPhone and iPad using C#.

This has largely been made possible by the amazing people working on Xamarin, who ported Apple's Augmented Reality framework, ARKit, to .NET so that we can use it in conjunction with Visual Studio for Mac, write lovely C# to create some interesting AR experiences, and deploy them to our iOS devices.

Here are just a few examples of the Augmented Reality functionality built into the feature-rich ARKit and SceneKit, all of which is achievable using C# and .NET.

Plane Detection

Plane detection is important as it can help us find the limits of our environment. Planes provide a good point of reference on which to place other virtual items.
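To give a flavour of the code involved, here is a minimal sketch of enabling horizontal plane detection with Xamarin.iOS; the view controller, delegate class and colour choice are just illustrative:

```csharp
using System;
using ARKit;
using SceneKit;
using UIKit;

public class PlaneDetectionViewController : UIViewController
{
    ARSCNView sceneView;

    public override void ViewDidLoad()
    {
        base.ViewDidLoad();

        sceneView = new ARSCNView { Frame = View.Frame };
        sceneView.Delegate = new PlaneDetectionDelegate();
        View.AddSubview(sceneView);
    }

    public override void ViewDidAppear(bool animated)
    {
        base.ViewDidAppear(animated);

        // Look for horizontal planes (table tops, floors, etc.)
        var configuration = new ARWorldTrackingConfiguration
        {
            PlaneDetection = ARPlaneDetection.Horizontal
        };
        sceneView.Session.Run(configuration, ARSessionRunOptions.ResetTracking);
    }
}

public class PlaneDetectionDelegate : ARSCNViewDelegate
{
    // Called when ARKit adds a node for a new anchor, including detected planes
    public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
    {
        if (anchor is ARPlaneAnchor planeAnchor)
        {
            // Visualise the detected plane with a translucent rectangle
            var plane = SCNPlane.Create(planeAnchor.Extent.X, planeAnchor.Extent.Z);
            plane.FirstMaterial.Diffuse.Contents = UIColor.Blue.ColorWithAlpha(0.3f);

            var planeNode = SCNNode.FromGeometry(plane);
            planeNode.EulerAngles = new SCNVector3((float)(-Math.PI / 2), 0, 0); // lay it flat
            node.AddChildNode(planeNode);
        }
    }
}
```

The sketches in the following sections reuse the same sceneView field and delegate set-up rather than repeating them.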

Image Recognition

Something that works very smoothly in ARKit is image recognition. When ARKit detects a pre-defined image, we can respond to that event and use the detected image's location to attach other items to it, as can be seen below with my business card and a few transparent PNGs. Simple.
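A rough sketch of image detection, reusing the earlier set-up; the "AR Resources" asset-catalog group name and overlay.png are placeholders for your own reference image and overlay:

```csharp
// 1. Run a session that knows which images to look for (e.g. in ViewDidAppear)
var configuration = new ARWorldTrackingConfiguration
{
    DetectionImages = ARReferenceImage.GetReferenceImagesInGroup("AR Resources", NSBundle.MainBundle)
};
sceneView.Session.Run(configuration, ARSessionRunOptions.ResetTracking);

// 2. In the ARSCNViewDelegate, overlay a plane the same physical size as the detected image
public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
{
    if (anchor is ARImageAnchor imageAnchor)
    {
        var size = imageAnchor.ReferenceImage.PhysicalSize; // in metres

        var overlay = SCNPlane.Create(size.Width, size.Height);
        overlay.FirstMaterial.Diffuse.Contents = UIImage.FromFile("overlay.png"); // placeholder PNG

        var overlayNode = SCNNode.FromGeometry(overlay);
        overlayNode.EulerAngles = new SCNVector3((float)(-Math.PI / 2), 0, 0); // lie flat on the image
        node.AddChildNode(overlayNode);
    }
}
```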

Face Tracking

ARKit can track up to three different faces in a scene. It is possible to add virtual items to the scene in relation to a detected face, for example hats, glasses, etc.
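Something like the following is enough to get started, again reusing the earlier sceneView and delegate set-up; the floating box is just a stand-in for a proper hat model:

```csharp
// 1. Face tracking needs a TrueDepth front camera (iPhone X or later)
if (ARFaceTrackingConfiguration.IsSupported)
{
    sceneView.Session.Run(new ARFaceTrackingConfiguration(), ARSessionRunOptions.ResetTracking);
}

// 2. In the ARSCNViewDelegate, parent content to each detected face
public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
{
    if (anchor is ARFaceAnchor)
    {
        // A stand-in for a virtual hat: a flattened box floating just above the face
        var hat = SCNNode.FromGeometry(SCNBox.Create(0.25f, 0.05f, 0.25f, 0.01f));
        hat.Position = new SCNVector3(0, 0.15f, 0); // metres, relative to the face anchor
        node.AddChildNode(hat);
    }
}
```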

Facial Expression Detection

ARKit is able to identify the movements of a surprisingly large number of facial features (how else do you think those Animojis work?), something we can respond to when detected, as the video below shows, where I am simply changing the colour of the facial mesh depending on the detected expression.
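A sketch of the idea, assuming a face-mesh geometry was attached to the node when the face was first detected, and assuming the Xamarin binding exposes the blend-shape coefficients as nullable floats (JawOpen is one of ARKit's blend-shape locations):

```csharp
// Runs every time ARKit refreshes a face anchor; blend-shape values range from 0 to 1
public override void DidUpdateNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
{
    if (anchor is ARFaceAnchor faceAnchor && node.Geometry != null)
    {
        // How far open the jaw is, 0..1 (binding assumed to expose this as float?)
        var jawOpen = faceAnchor.BlendShapes.JawOpen ?? 0f;

        // Recolour the face mesh depending on the detected expression
        node.Geometry.FirstMaterial.Diffuse.Contents =
            jawOpen > 0.5f ? UIColor.Red : UIColor.Green;
    }
}
```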

Body Tracking

Using ARKit we can track the presence and orientation of a body in the scene. ARKit is able to tell us the position of a tracked body's major joints in 3D space and infer from those the location of minor joints, as can be seen in the following video.
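A minimal sketch, again reusing the earlier sceneView and delegate set-up; here I only mark the body anchor's root rather than drawing the full skeleton:

```csharp
// 1. Body tracking is ARKit 3+ and needs an A12 chip or newer
if (ARBodyTrackingConfiguration.IsSupported)
{
    sceneView.Session.Run(new ARBodyTrackingConfiguration(), ARSessionRunOptions.ResetTracking);
}

// 2. In the ARSCNViewDelegate, react to a detected body
public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
{
    if (anchor is ARBodyAnchor bodyAnchor)
    {
        // Mark the body anchor's root (roughly the hips) with a sphere;
        // per-joint transforms are available from bodyAnchor.Skeleton
        var marker = SCNNode.FromGeometry(SCNSphere.Create(0.05f));
        node.AddChildNode(marker);
    }
}
```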

Object Detection

Similar to image recognition, in ARKit it is possible to scan and record the features of a 3D object, then have your app detect the presence of that scanned item in your environment at a later date.
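In sketch form it looks very similar to image detection; the "AR Objects" resource group name is a placeholder for wherever your scanned .arobject files live:

```csharp
// 1. Run a session that knows which scanned objects to look for
var configuration = new ARWorldTrackingConfiguration
{
    DetectionObjects = ARReferenceObject.GetReferenceObjectsInGroup("AR Objects", NSBundle.MainBundle)
};
sceneView.Session.Run(configuration, ARSessionRunOptions.ResetTracking);

// 2. In the ARSCNViewDelegate, respond when the scanned object is recognised
public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
{
    if (anchor is ARObjectAnchor objectAnchor)
    {
        Console.WriteLine($"Detected scanned object: {objectAnchor.ReferenceObject.Name}");
    }
}
```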

Lighting and Shadows

Lighting and shadows are important if we are to make our virtual objects look as realistic as possible. Our brains use light to infer a great deal about our environments. Don't believe me? Turn out the lights and tell me about the environment around you. We can add virtual lighting and shadows to our scenes. Bad virtual lighting can make our virtual items look fake; conversely, good virtual lighting can make them look more real, as can be seen below.
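A small sketch of turning on ARKit's light estimation and adding a shadow-casting directional light; the intensity and angle are just illustrative values:

```csharp
// Ask ARKit to estimate the real-world light level
var configuration = new ARWorldTrackingConfiguration { LightEstimationEnabled = true };
sceneView.Session.Run(configuration, ARSessionRunOptions.ResetTracking);

// Add a directional light that casts shadows onto other geometry in the scene
var light = SCNLight.Create();
light.LightType = SCNLightType.Directional;
light.CastsShadow = true;
light.Intensity = 1000f;

var lightNode = SCNNode.Create();
lightNode.Light = light;
lightNode.EulerAngles = new SCNVector3((float)(-Math.PI / 3), 0, 0); // angle the light downwards
sceneView.Scene.RootNode.AddChildNode(lightNode);
```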

Touch Gestures and Interactions

We can take touch gestures on the screen, such as tap, pinch, rotate and pan, and translate them into actions on the virtual objects our fingers are touching. AR would be pretty boring if we couldn't interact with the virtual items we place into the scene.
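A sketch of handling a tap, hit-testing from the 2D touch point into the 3D scene and doing something to whatever node was touched; the spin is just an example response:

```csharp
// Wire up a tap recogniser on the AR view (e.g. in ViewDidLoad)
sceneView.AddGestureRecognizer(new UITapGestureRecognizer(HandleTap));

void HandleTap(UITapGestureRecognizer gesture)
{
    // Hit-test from the 2D touch point into the 3D scene
    var location = gesture.LocationInView(sceneView);
    var hits = sceneView.HitTest(location, (SCNHitTestOptions)null);

    if (hits.Length > 0)
    {
        // Spin whichever virtual node the finger touched
        hits[0].Node.RunAction(SCNAction.RotateBy(0f, (float)Math.PI, 0f, 0.5));
    }
}
```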

Animations

Animations are incredibly important to make our AR experiences pop and show movement. Below you can see an AR periodic table animated into place and responding to screen touches. A bit of opacity also makes things cooler. It's just a general rule, I find.
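A sketch of the kind of SCNAction sequence involved: fading a box in, moving it into place and leaving it spinning; the sizes and timings are illustrative:

```csharp
// Start invisible, then fade the node in, move it into place and keep it slowly spinning
var node = SCNNode.FromGeometry(SCNBox.Create(0.1f, 0.1f, 0.1f, 0f));
node.Opacity = 0f;
node.Position = new SCNVector3(0f, -0.2f, -0.5f); // half a metre in front of the camera
sceneView.Scene.RootNode.AddChildNode(node);

var appear = SCNAction.Sequence(new[]
{
    SCNAction.FadeIn(0.5),
    SCNAction.MoveBy(0f, 0.2f, 0f, 1.0)
});
node.RunAction(appear);
node.RunAction(SCNAction.RepeatActionForever(SCNAction.RotateBy(0f, (float)Math.PI, 0f, 2.0)));
```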

Physics

SceneKit provides us with a physics engine we can play with. Below you can see how we can give our virtual items a solid mesh and have them appear to be affected by gravity.
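A sketch of the idea: a dynamic body falls under gravity and a static body gives it something to land on:

```csharp
// A dynamic box that falls under gravity...
var box = SCNNode.FromGeometry(SCNBox.Create(0.1f, 0.1f, 0.1f, 0f));
box.Position = new SCNVector3(0f, 0.5f, -0.5f);
box.PhysicsBody = SCNPhysicsBody.CreateDynamicBody();

// ...and a static floor for it to land on
var floorNode = SCNNode.FromGeometry(SCNBox.Create(1f, 0.01f, 1f, 0f));
floorNode.Position = new SCNVector3(0f, -0.5f, -0.5f);
floorNode.PhysicsBody = SCNPhysicsBody.CreateStaticBody();

sceneView.Scene.RootNode.AddChildNode(box);
sceneView.Scene.RootNode.AddChildNode(floorNode);
```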

Video & Sound

You can add virtual video and sound to your AR scene. Below you can see a video played at the location of a detected image.
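A sketch of one approach: play a bundled video in a SpriteKit scene used as a plane's material, and attach positional audio to the same node. The file names are placeholders, and you will also need the SpriteKit, AVFoundation, CoreGraphics and Foundation namespaces:

```csharp
// Play a bundled video file (name assumed) on a plane via a SpriteKit scene
var player = new AVPlayer(NSUrl.FromFilename("intro.mp4"));
var videoNode = SKVideoNode.FromPlayer(player);
var spriteScene = SKScene.FromSize(new CGSize(640, 360));
videoNode.Position = new CGPoint(spriteScene.Size.Width / 2, spriteScene.Size.Height / 2);
videoNode.Size = spriteScene.Size;
spriteScene.AddChild(videoNode);

var screen = SCNPlane.Create(0.2f, 0.1125f); // 16:9, sized in metres
screen.FirstMaterial.Diffuse.Contents = spriteScene;
var screenNode = SCNNode.FromGeometry(screen);
screenNode.Position = new SCNVector3(0f, 0f, -0.5f);
sceneView.Scene.RootNode.AddChildNode(screenNode);
player.Play();

// Positional audio attached to the same node (file name assumed)
var audioSource = new SCNAudioSource("ping.mp3");
screenNode.RunAction(SCNAction.PlayAudioSource(audioSource, false));
```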

3D Models

We can take 3D models in formats such as .dae, .obj and .scn and place them in the scene. You can even create your own in Blender and export them to those supported formats. Below you can see a 3D model placed at the location of a detected image.
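A sketch of loading a bundled .scn file and placing it in the scene; the file and node names are placeholders for your own model:

```csharp
// Load a .scn scene bundled with the app (file and node names assumed)
var modelScene = SCNScene.FromFile("art.scnassets/ship");
var modelNode = modelScene.RootNode.FindChildNode("ship", true);

modelNode.Scale = new SCNVector3(0.01f, 0.01f, 0.01f); // most models need scaling down to metres
modelNode.Position = new SCNVector3(0f, 0f, -0.5f);
sceneView.Scene.RootNode.AddChildNode(modelNode);
```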

Combining Functionality

The best AR experiences are achieved when you combine a few of the aforementioned features together, as can be seen below.

And sometimes it is fun to just mess around and see what kind of experience you can make...

Summary

Again, all the above were achieved just using .NET, C# and Visual Studio for Mac to leverage Apple's ARKit and SceneKit frameworks.

If this has piqued your interest, know that I am writing a book on the topic with Apress, which will hopefully be completed and published in the new year, and I have a dedicated website, XamarinArkit.com, where I share code samples and approaches for leveraging the functionality discussed to create interesting AR experiences.

In order to help share these amazing capabilities with my fellow .NET developers, I've given a few talks to .NET user groups on the topic and hope to do more in the future.

What is most important, though, is imagination. With Augmented Reality, you really are only limited by your imagination. It provides an immersive, interactive canvas like we've never had before. You can see why AR is being used in an increasing number of industries.

If you want to see more examples, make sure you follow me on Twitter and YouTube.

Thanks,

-- Lee
