Case study: Augmented Reality – the next big thing in tech
Nowadays, people are fascinated by new technologies and by building things that were deemed impossible not too long ago. Today, we are going to discuss Augmented Reality software and its real-life applications.
First of all, let’s define what Augmented Reality is. A well-known source, Wikipedia, provides the following definition: “Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics or GPS data”. In other words, it is a technology that enriches the real world with digital information and media, such as 3D models and videos, overlaid on the real-time camera view of your smartphone, tablet, PC or connected glasses.
On smartphones and tablets, Augmented Reality feels like a magic window. You just turn on your camera, point it at the environment and the program places life-size 3D models within it. For example, you can put monsters on the street, a plane in your room, or even a cat in a microwave.
The goal of Augmented Reality is to create a system in which the user cannot tell the difference between the real world and the virtual augmentation in it.
The idea of AR first appeared in 1901, when the author L. Frank Baum described an electronic display – spectacles that overlay data onto real life (in this case, onto people); he called the device a 'character marker'. The first AR software, however, was created in 1990 by Professor Tom Caudell, who built a system for a neural systems project at Boeing. This project was focused on finding new ways to support the company’s engineering process and involved the use of virtual reality. Caudell developed software that displayed the position of important cabling during construction, which removed the need for complex user manuals.
Today, Augmented Reality is used in entertainment, healthcare, military training, engineering design, robotics, manufacturing and other industries. For instance, in sales and marketing, the Ikea application works in the following way: when you point your phone’s camera at an empty room, you can begin to decorate it with furniture from Ikea’s catalog, and the app then lets you purchase those goods.
Recently, we had the chance to research and develop an Augmented Reality application that makes it easy to find interesting places in your surroundings.
We had to implement a simple, basic Augmented Reality task: help the user navigate the real world and search for points of interest around them in real time. The idea is simple: any time users start the app, they can see where interesting places are with the help of the camera. For example, I want to find one peculiar monument in the city where I’m located. I don't know the address, I don't know the language – all I have is an app. I launch the app, and by panning the camera a full 360 degrees I can literally find the monument I am looking for, marked on the screen with a 3D object. I can then see how far away it is and in which direction. The app can also show markers that indicate other interesting places around you, each of them displayed in the camera view as a 3D object.
To implement such functionality on Apple’s iPhone, one can use the following iOS frameworks:
- SceneKit (to display the 3D scene);
- CoreLocation (to track the user’s location and the device’s heading);
- CoreMotion (to track the device’s movement in real time).
Step 1: We start by getting the user's current location (a latitude/longitude value). Afterwards, we need to get the correct heading of the device.
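This step can be sketched with CoreLocation. The code below is a minimal illustration under stated assumptions, not the article's actual implementation: `LocationTracker` is a hypothetical name, and a real app must also declare a location-usage description in its Info.plist.

```swift
import CoreLocation

// Minimal sketch: obtain the user's location and the device heading.
// Assumes iOS and a location-usage key in Info.plist.
final class LocationTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private(set) var lastLocation: CLLocation?
    private(set) var lastHeading: CLHeading?

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation() // latitude/longitude fixes
        manager.startUpdatingHeading()  // compass heading
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        lastLocation = locations.last
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        // trueHeading is degrees from true north; it is negative when
        // unavailable, in which case magneticHeading can be used instead.
        lastHeading = newHeading
    }
}
```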
Step 2: Now we are ready to draw the 3D scene with SceneKit. We build a view controller with a camera layer and a SceneKit layer on top of it. We should rotate the initial 3D scene so that the axes are positioned in the following manner:
- 1. the X and Y axes lie at ground level;
- 2. the Z axis is altitude, with z = 0 at sea level.
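The layering described above can be sketched as follows. This is one plausible setup, not the article's code: an `AVCaptureVideoPreviewLayer` as the camera layer with a transparent `SCNView` on top, and a root-node rotation to make Z point up (SceneKit's default "up" axis is Y).

```swift
import AVFoundation
import SceneKit
import UIKit

// Minimal sketch of the camera + SceneKit layering; names are illustrative.
final class ARViewController: UIViewController {
    private let session = AVCaptureSession()
    private let sceneView = SCNView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. Camera preview layer at the bottom.
        if let camera = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: camera) {
            session.addInput(input)
            let preview = AVCaptureVideoPreviewLayer(session: session)
            preview.frame = view.bounds
            preview.videoGravity = .resizeAspectFill
            view.layer.addSublayer(preview)
            session.startRunning()
        }

        // 2. Transparent SceneKit layer on top.
        sceneView.frame = view.bounds
        sceneView.backgroundColor = .clear
        sceneView.scene = SCNScene()
        view.addSubview(sceneView)

        // 3. Rotate the scene so X/Y lie at ground level and
        //    Z becomes altitude (z = 0 is sea level).
        sceneView.scene!.rootNode.eulerAngles.x = -.pi / 2
    }
}
```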
Step 3: Finally, our 3D scene is bound to the real world, so we can convert the location values of points of interest into X/Y coordinates; for that, we use our secret magic formula. Now each object has X and Y values, and we can place it in the 3D scene (its Z value is the object's altitude above sea level).
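The article keeps its formula secret, but one common way to do this kind of conversion is an equirectangular approximation, which is accurate enough over the short distances an AR viewfinder deals with. The sketch below is that standard approximation, not the author's formula.

```swift
import Foundation

// Mean Earth radius in metres.
let earthRadius = 6_371_000.0

/// Converts a point of interest (latitude/longitude in degrees) into
/// X/Y offsets in metres relative to the user's position.
/// X points east, Y points north (equirectangular approximation,
/// valid for short distances).
func localXY(userLat: Double, userLon: Double,
             poiLat: Double, poiLon: Double) -> (x: Double, y: Double) {
    let latRad = userLat * .pi / 180
    let dLat = (poiLat - userLat) * .pi / 180
    let dLon = (poiLon - userLon) * .pi / 180
    let x = earthRadius * dLon * cos(latRad) // eastward offset
    let y = earthRadius * dLat               // northward offset
    return (x, y)
}
```

For example, a point 0.001° of longitude east of a user standing on the equator comes out roughly 111 metres away along X.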
Step 4: With CoreMotion, we can connect the real-time rotation of the device to the camera in the 3D scene. Every time the user rotates the device, the camera in the 3D scene rotates by the same angle. The camera now acts as a spyglass through which we can see our 3D objects placed in the real world.
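This step can be sketched with `CMMotionManager`, feeding the device's attitude quaternion into the SceneKit camera node. This is a simplified illustration: a production app typically needs an extra fixed rotation to reconcile CoreMotion's reference frame with SceneKit's axes.

```swift
import CoreMotion
import SceneKit

// Minimal sketch: mirror device rotation onto the virtual camera.
let motionManager = CMMotionManager()
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    // .xTrueNorthZVertical keeps rotations aligned with true north,
    // so the virtual camera matches the compass heading.
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical,
                                           to: .main) { motion, _ in
        guard let q = motion?.attitude.quaternion else { return }
        // Apply the device's real-world rotation to the 3D camera.
        cameraNode.orientation = SCNQuaternion(Float(q.x), Float(q.y),
                                               Float(q.z), Float(q.w))
    }
}
```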
This is a fairly simple example of Augmented Reality; with further improvements to accuracy and interaction, it can become really helpful in our everyday lives. Augmented Reality features are not perfect yet, but this industry keeps growing and expanding, and I’m certain that it will change our lives tremendously soon enough (hopefully, for the best). Our lives could always use a little bit of augmenting.
Read more about TechMagic's research here:
Case Study: In-app purchase programming in iOS and Android
Case study: Playing video in your iOS application
R&D project: GPS tracking app. Experiments in Android