Check how your business can use augmented reality to improve customer experience, increase engagement, and boost brand promotion.
With relatively cheap implementation costs and easy accessibility, mobile AR is a great choice for a goal-oriented tech asset. Businesses can use augmented reality to increase customer engagement, improve brand promotion and awareness, and facilitate the creation of product demos.
Multiple high-profile companies are investing in AR tech. Among them are Qualcomm, Apple, Facebook, and Google.
Both Apple and Google are heavily developing their respective AR software development kits (SDKs). ARInsider projects that by the end of 2020, there will be over 2.5 billion AR-compatible mobile devices out there, compared to only around 500 million active AR users. There’s a huge potential for commercial adoption of this tech.
Update October 2021: As of 2021, there are roughly 2 billion AR-enabled devices, with 851 million Android smartphones and 1.25 billion iOS phones.
Because no costly technology is needed to bring it to users, augmented reality is that much more accessible. Virtual reality and mixed reality headsets, on the other hand, are still too expensive for such widespread adoption.
But what is all the fuss about? Can AR apps transform businesses and add tangible value? Check out how augmented reality is helping businesses.
Houzz enhances its selling and reach capabilities with AR to help customers better visualize how a product would look in their house. A growing selection of products available for AR viewing helps Houzz fuel sales and increase engagement.
The 3D view features tiles, furniture, lighting, and other accessories. Houzz’s CEO Adi Tatarko said that users who used the My Room 3D tool were 11 times more likely to make a purchase.
Once you’ve set up AR capability inside an ecommerce app, creating and adding 3D models of products is a relatively inexpensive way to bring AR to your customers.
Great-looking shoes don’t always look equally great on feet. Wanna Kicks gives its customers a chance to see how a pair of sneakers would look on their feet before making a purchase.
There’s still a huge opportunity in retail AR that will most likely be explored in the upcoming years. For example, customers could create a 3D image of their body to be used across shops for better fitting experiences. And I’m not talking here about a simple superimposition of an image of a shirt onto the picture of your body. A 3D scan would reflect how the material flows on different body shapes.
Now that’s an app that I absolutely love. When I first downloaded Google Lens, I had this nagging urge to g-lens everything around me. And even if not all the stuff I g-lensed was properly identified (for some reason, my sleeping cat’s paw got tagged as a rat), Google Lens has proved to be really useful in real-life situations where you need to translate or identify something.
But that’s just one side of Google Lens. The app can also be used as a powerful marketing tool. For example, by scanning a product, customers can check product reviews, product information, or price comparisons. Brands can offer coupons or promos hidden inside ads. The key is to provide plenty of info that can be delivered to your customers via Google Lens.
Moreover, users can explore an ad further and get greater background simply by scanning a billboard, no QR codes involved.
The healthcare industry is projected as one of the greatest beneficiaries of augmented reality tech. Learn how AR is already reshaping clinical practice.
Over the past century, clinical practice has undergone an almost unthinkable transformation. Just over 150 years ago, doctors didn’t even know they could transmit germs on their hands.
On maternity wards of the past, doctors would move from one female patient to the next, examining each without washing their hands in between, much less wearing gloves.
Needless to say, maternal death ran rampant.
So to think we can now use augmented reality during reconstructive surgery to locate bone fractures and blood vessels is quite an astonishing advancement.
Check out what else augmented reality makes possible in healthcare.
The potential of AR is visible across industries. From education to manufacturing to automotive, augmented reality fills gaps in workflows, offering tangible opportunities for improvement.
For example, AR can be used to eliminate inefficiencies in manufacturing — the constant loss of focus and time necessary for the engineers to refer to paper instructions amounts to considerable losses over time.
If you've ever struggled with a paper instruction for a shiny new IKEA bookshelf, quality assurance in the automotive industry is like the IKEA situation times 1,782.4.
For extremely precise and complex tasks, the help of computers is invaluable. And the better the augmented reality technology gets, the more advanced its use cases in clinical practice.
In fact, augmented reality in healthcare is projected to give the global economy a $47.9 billion boost by 2022.
The current clinical practice is evolving at light speed, but the sector still has multiple areas in need of solutions that can benefit from technology.
Is AR an entirely new concept to you? Read our AR guide to learn the basics of augmented reality.
Reconstructive surgery calls for high precision to yield the expected results and shorten a patient's recovery time.
Augmented reality goggles can be used to feed information from CT scans and MRI images directly into a surgeon’s field of vision. This way, the surgeon knows where blood vessels and bone fractures are and can increase the precision of the incision during reconstructive surgery.
First, surgeons perform diagnostic imaging on the patient. Then, the data from CT scans, MRI, and X-rays is digitized and transformed into a 3D model, which shows the location of soft tissues, blood vessels, and bones.
This rendering is then fed into the AR device. During the surgery, the 3D rendering is mapped on a patient’s body, providing the surgeon with critical information.
Thanks to this approach, surgeons don’t have to look back and forth between the patient and the images, or rely on audible Doppler ultrasound, which is currently the prevalent method during reconstructive surgeries.
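The first step of the pipeline described above, turning raw scan intensities into 3D geometry, can be illustrated with a toy sketch. This is only an illustration of the idea: real pipelines work on DICOM volumes and use surface-extraction algorithms such as marching cubes, and the threshold value and function names here are assumptions, not any vendor's actual API.

```python
# Toy sketch of the segmentation step that turns volumetric scan data
# into 3D geometry. A real pipeline processes DICOM volumes and runs
# marching cubes; here we simply threshold a tiny synthetic voxel grid.
# All names and values are illustrative.

BONE_THRESHOLD = 700  # intensities above this are treated as bone (illustrative)

def segment_bone(volume):
    """Return (x, y, z) coordinates of voxels whose intensity exceeds the threshold."""
    points = []
    for x, plane in enumerate(volume):
        for y, row in enumerate(plane):
            for z, intensity in enumerate(row):
                if intensity > BONE_THRESHOLD:
                    points.append((x, y, z))
    return points

# A 2x2x2 synthetic "scan": one bright voxel standing in for bone.
scan = [
    [[100, 50], [800, 30]],
    [[20, 10], [60, 40]],
]
print(segment_bone(scan))  # [(0, 1, 0)]
```

The resulting point cloud (or, in practice, a mesh) is what gets registered to the patient's body in the AR headset.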
The technology is far from mature, with many challenges waiting in line before mainstream adoption is possible.
For example, transforming information from CT scans and MRI into 3D models is time-consuming, so the ER won’t benefit from AR, at least for now.
Neurosurgeons at Johns Hopkins used augmented reality during spinal fusion surgery on June 8, 2020. Two days later, surgeons relied on an AR headset during the removal of a cancerous tumor from the spine of another patient. Both procedures were conducted using a pilot augmented reality headset from Augmedics.
In 2021, Dr. Harvinder Sandhu at Stamford Health performed a successful spinal surgery using AR goggles. The AR technology uses data from MRIs to overlay critical tissue around a patient's spine. Provided with detailed data projected directly on the retina, the doctor is able to perform more precise surgeries that speed up recovery and decrease the likelihood of infection.
Global skill gaps, pandemic-related border closures, and travel limitations stress the need for remote assistance solutions. Discover Remote Assist App.
For many of us, technology is a staple commodity. It permeates a growing number of households, and factories increasingly rely on highly specialized machinery. And while technology has catapulted the evolution of our civilization, it’s prone to breaking.
But technicians who can fix problems may not always be available, and on-site service is often costly. Pandemic-related border closures and travel restrictions also limit the availability of specialists. In this context, remote assist solutions emerge as reliable support across industries.
In the last 30 years, the adoption of specific technologies in the US alone has skyrocketed.
In the business world, technology has spurred production and manufacturing on a palpable scale. According to the World Economic Forum, OECD producers that adopted technology “have grown at a rate of 3.5%, compared with an anaemic 0.5% for the laggards.”
Even small manufacturing facilities have seen an increase in the diffusion of technological development, which led to a more productive workforce and output increases.
Developing countries have also observed an increase in technological innovation, albeit at a lower scale.
The global technological advancement and the ubiquitous presence of technologies in many sectors call for engineers and technicians who can not only drive that innovation but also maintain it.
Yet there’s an alarming shortage of skilled engineers and technicians who can carry out innovative infrastructure projects. On the other hand, less qualified personnel can complete tasks of varying complexity when given professional assistance.
Remote assistance might therefore become an affordable and scalable solution that bridges the gap between the need for support and the lack of on-site skill.
The app has two modes: consultant and user.
When a user connects, they choose a problem from a list of available topics.
The user then waits for a connection with a technician.
After logging in, the technician sees a list with active user sessions awaiting help for a given topic.
Once connected with the user, the technician has access to the user’s rear camera.
The technician communicates with the user throughout the session.
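The matching flow above can be sketched as a simple session queue: users register under a topic, and a technician picks up the longest-waiting session. This is a minimal sketch; the class and method names are assumptions for illustration, not the actual app's API.

```python
# Minimal sketch of the remote-assist matching flow: users queue up
# under a problem topic, and a technician accepts the oldest waiting
# session. Names are illustrative, not the real app's API.
from collections import deque

class SessionQueue:
    def __init__(self):
        self.waiting = {}  # topic -> deque of user ids, oldest first

    def user_connects(self, user_id, topic):
        """A user picks a problem topic and waits for a technician."""
        self.waiting.setdefault(topic, deque()).append(user_id)

    def active_sessions(self, topic):
        """What a technician sees after logging in: users awaiting help."""
        return list(self.waiting.get(topic, []))

    def technician_accepts(self, topic):
        """Connect the technician with the longest-waiting user, if any."""
        queue = self.waiting.get(topic)
        return queue.popleft() if queue else None

q = SessionQueue()
q.user_connects("alice", "router-setup")
q.user_connects("bob", "router-setup")
print(q.active_sessions("router-setup"))    # ['alice', 'bob']
print(q.technician_accepts("router-setup")) # 'alice'
```

In the real app, accepting a session would also open the video link that gives the technician access to the user's rear camera.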
How we evaluated different input methods available for mobile augmented reality and why we didn’t choose a hand gesture-based interaction method.
In this article, we’ll explain why we considered incorporating hand gestures as an interaction method for one of our mobile XR projects, what options we evaluated, and why we ultimately decided not to follow that path, at least for the first release.
In April 2020, we started working on a new AR project for a large entertainment center in Las Vegas. The project was a treasure hunt-like experience to be deployed as an interactive installation.
During the initial discussions, iPhone 11 Pro Max was selected as the target platform. The devices would be encased in custom-designed handheld masks to amplify the feeling of immersion into the whole theme of the venue.
We chose that specific iPhone model for the horsepower necessary to handle the demanding particle-based graphics. The phone also offered a reasonably sized AR viewport while keeping the weight of the entire setup relatively low.
Considering the project’s scale, its high dependency on the graphical assets, and our pleasant experiences with developing the Magic Leap One app for HyperloopTT XR Pitch Deck, we selected Unity as our main development tool.
In the AR experience, the users would rent the phone-outfitted masks and look for special symbols all around the dim but colorfully lit, otherworldly venue. The app would recognize the custom-designed symbols covered with fluorescent paint and display mini-games over them for the users to enjoy.
We already had several challenges to solve.
The main one was detecting the symbols — there was nothing standard about them in terms of AR-tracking. Also, the on-site lighting was expected to be atmospheric, meaning dark — totally AR-unfriendly.
The games themselves were another challenge. Users could approach and view the AR games from any distance and at any angle, meaning the graphics could appear over any background. The graphics would consist mostly of very resource-intensive particle effects, as that was the theme of the app.
Plus, we only had a couple of months to develop the AR experience.
There was also one big question left: how are the users going to interact with the app?
During our work on a mobile AR project, we reviewed many non-touch screen interaction methods. Read our findings and learn where to use different solutions.
For one of our recent projects — a mobile AR treasure-hunt type experience that consisted of a series of mini-games — we were challenged to find a way of interacting with the app that would be more engaging and fun than a touchscreen interface.
Since we’ve spent some time and effort reviewing various options, we thought it might be helpful to others if we shared our findings.
Our goal is to show you what’s out there and highlight the solutions that we find to be the most promising for mobile interactions.
Not all of the solutions target mobile AR or are actual controllers. But we thought it makes sense to include them as well, since, as you will see, some of them have the potential to be viable in the future or are simply really cool.
The list is by no means exhaustive, and we’d be happy to hear if you know of any awesome AR interaction solutions that we might have missed.
Litho is a very promising mobile AR controller. The device is small with a futuristic look, which matched the sci-fi theme of our project really well.
The initial research showed great potential, and, since we developed our app in Unity and Litho offers an SDK for it, we decided to take it to round two of our evaluation.
Our colleague Dominik prepared a prototype that allowed us to determine how well Litho fit in with the requirements of our project.
The SDK setup was straightforward and implementing the prototype took relatively little time. From the video, you can see that Litho requires minimal initial calibration. The interaction is based on a precise virtual laser pointer with a fixed origin. Litho offers 3 DOF readings, meaning you can detect how the hand is oriented in space, but not where it’s located. The quality of the interactions doesn’t depend on the lighting conditions.
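The "precise virtual laser pointer with a fixed origin" idea can be shown with a bit of math: orientation alone (3 DOF) defines a ray, and intersecting that ray with app geometry yields a pointing target. This is generic ray math under assumed conventions, not code from the Litho SDK.

```python
# Sketch of how a 3 DOF (orientation-only) controller can still drive a
# precise pointer: the orientation defines a ray from a fixed, assumed
# origin, and the ray is intersected with scene geometry (here, a
# virtual wall at z = 5 meters). Generic math, not the Litho SDK.
import math

def pointer_hit(yaw_deg, pitch_deg, wall_z=5.0, origin=(0.0, 0.0, 0.0)):
    """Intersect an orientation-only pointer ray with the plane z = wall_z."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Direction vector: z is "forward"; yaw turns right, pitch tilts up.
    dx = math.sin(yaw) * math.cos(pitch)
    dy = math.sin(pitch)
    dz = math.cos(yaw) * math.cos(pitch)
    if dz <= 0:
        return None  # pointing away from the wall
    t = (wall_z - origin[2]) / dz
    return (origin[0] + t * dx, origin[1] + t * dy)

print(pointer_hit(0, 0))   # (0.0, 0.0): straight ahead
print(pointer_hit(45, 0))  # roughly (5.0, 0.0): 45 degrees right
```

Notice the trade-off mentioned above: because only orientation is tracked, the origin of the ray has to be assumed fixed, which is exactly why hand position tracking is out of reach for such devices.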
After evaluation, we decided not to use Litho for our entertainment experience, mainly because it didn’t offer hand position tracking, which we considered crucial for achieving a high fun factor.
This mattered because adding any kind of external controller made the logistics more complex, so the entertainment value it provided needed to be worth that extra cost.
Our app would be preinstalled on mobile devices, which would then be rented on the premises for the users to enjoy for a specified time.
Using any kind of controller required us to pair it with the devices, ensure it remained charged, and account for it when picking up the equipment once a visitor finished using the app.
Even though we decided not to use Litho in our project, I think it could work well in any kind of mobile AR application where the precision of the interaction is key and where the users interact with the app regularly.
FinchDash / FinchShift are AR/VR motion controllers. A Unity SDK is available. Of the two, only FinchDash is described as supporting mobile platforms, which is a bit unfortunate as it only allows 3 DOF tracking.
Also, the controller’s rather unremarkable looks don’t fit well with themed experiences.
FinchShift, FinchDash's sibling device, uses an additional piece of equipment in the form of an armband to offer full 6 DOF input. It’s also more visually appealing in my opinion.
ManoMotion is a camera-based mobile hand tracking solution that offers a Unity SDK. It’s easy to set up and doesn’t require additional calibration. It can scale depending on the needs from 2D tracking through 3D tracking of the hand position and finally to a full-blown skeletal tracking with gesture recognition.
As with any camera-based solution though, it has some technical limitations that need to be considered.
The main one is the effective control area, which is only as large as the camera’s field of view. Since we’re discussing a mobile use case, this is even more significant: the device in our project would be held in one hand, and you can only extend the other one so far before it becomes awkward.
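How limiting is that control area? A back-of-the-envelope calculation gives a feel for it: the region the camera can see at a given hand distance follows directly from its field of view. The FOV figures below are illustrative, not taken from any specific phone's spec sheet.

```python
# Back-of-the-envelope sketch of the "effective control area" limit:
# the region a camera can see at a given distance, derived from its
# field of view. The FOV values below are illustrative.
import math

def control_area(h_fov_deg, v_fov_deg, distance_m):
    """Width and height (in meters) of the camera frustum at a given distance."""
    width = 2 * distance_m * math.tan(math.radians(h_fov_deg) / 2)
    height = 2 * distance_m * math.tan(math.radians(v_fov_deg) / 2)
    return width, height

# A hand held about 0.5 m from a camera with a 60 x 45 degree field of view:
w, h = control_area(60, 45, 0.5)
print(f"{w:.2f} m x {h:.2f} m")  # roughly 0.58 m x 0.41 m
```

So at a comfortable arm distance, the usable gesture space is barely half a meter wide, which is why the field-of-view constraint weighed heavily in our evaluation.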
Another disadvantage is the reliance on computer vision algorithms, which makes the accuracy inconsistent across different lighting conditions. Colored light in particular can degrade the experience quite a bit.
That said, we had the chance to work with ManoMotion’s support on our challenging use case (dim, colored lighting). It turns out that ManoMotion can adjust their algorithm if the target conditions are known in advance. In our case, this allowed us to achieve a level of accuracy in the challenging lighting similar to that in optimal conditions, which was very impressive.
Google MediaPipe is an open-source camera-based hand tracking solution, similar to ManoMotion, and as such it shares the same limitations. In terms of platforms, it supports Android and iOS. But at the time of our research, it didn’t offer an officially supported Unity SDK.
ClayControl is another option in the category of camera-based hand tracking solutions. It seems to cover a wide range of platforms, including mobile, and has Unity on the list of compatible software platforms.
ClayControl’s website mentions low latency as one of the key selling points, which is interesting considering that solutions based on cameras and computer vision usually involve some degree of input lag. It seems there is no SDK openly available for it, so we didn’t have a chance to evaluate it.
Polhemus is a very promising wearable camera-less hand tracking solution. It’s based on 6 DOF sensors, which don’t depend on a camera’s field of view and can provide continuous tracking even in complete darkness.
At the time of our research, however, it was PC-only. On the website, you can find information about VR, but unfortunately no AR support yet.
Although Xsens DOT is a motion-tracking device and not an actual controller, it could be used as one with some additional work. It does offer 3 DOF support, so the orientation data is accurate, but the position is only estimated.
It’s smaller than Litho, which itself is quite small. At the time of writing, I wouldn't consider it practical for typical AR interactions. It might be worth considering for more specific motion tracking needs.
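As a side note on why orientation-only devices can merely estimate position: recovering position from inertial data means double-integrating acceleration, so even a tiny constant sensor bias snowballs into large position error. The numbers below are illustrative, not Xsens specifications.

```python
# Tiny illustration of why position from an inertial, orientation-only
# device is "only estimated": double-integrating acceleration lets even
# a small constant sensor bias grow into meters of drift over time.
# Numbers are illustrative.
def drift_after(bias_m_s2, seconds, dt=0.01):
    """Position error accumulated from a constant acceleration bias."""
    velocity = position = 0.0
    for _ in range(int(seconds / dt)):
        velocity += bias_m_s2 * dt   # integrate acceleration into velocity
        position += velocity * dt    # integrate velocity into position
    return position

# A mere 0.05 m/s^2 bias produces over 20 m of drift within 30 seconds:
print(round(drift_after(0.05, 30), 1))
```

This quadratic error growth is why unaided IMUs are trusted for orientation but not for absolute position.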
FingerTrak is another promising wearable, this time in the form of a bracelet. It allows continuous hand pose tracking using relatively small thermal cameras attached to a wristband.
On its own, FingerTrak allows detecting gestures without the field-of-view limitation that affects the other camera-based solutions. It doesn’t look like a finished product yet, but it seems to be an interesting approach that could turn out well in the future.
Leap Motion Controller uses a relatively small sensor with cameras and a skeletal tracking algorithm to accurately detect the user’s hands. Unfortunately, at the time of writing, it doesn’t support mobile yet.
Wait, what? As surprising as it sounds, someone actually seems to have tried using AirPods Pro for 6 DOF tracking! This is more of a curio, but if the video is actually legitimate, AirPods open up some unexpected possibilities.
Ensuring your Android app development project is successful is easy when you know what’s involved in the process. Learn all about Android app development.
Shopping, organizing, scheduling, banking, working, or simply having fun: it’s often a mobile app that does the job, bringing convenience galore.
All right, so how come such a staggering number of apps fail?
That’s easy — because they don’t solve anything for their target audience. Indeed, it’s the failure to research the market that’s the top reason why mobile applications flop on app stores.
Whether we’re talking about iOS or Android app development, the principle is the same:
Do your research and battle test your app idea with potential users. Only then will you be able to use the application as a vehicle for business growth.
In this article, I’ll take you through the basic steps you need to take to make your Android development project successful.
A business analysis of your idea is the most important part of the app creation process.
I can’t stress this enough: validating your idea before investing money in its creation will help you approach the process with the necessary level of confidence, backed by data.
Here, you have to consider concepts such as:
A key phase during idea validation is a detailed analysis of your competitors.
Looking at how your competitors solve problems for their audiences will help you build a better strategy for your product — you’ll know what worked for them and what drew negative feedback.
Similarly important is determining whether your product is lucrative. To do this, think about what your competitors’ growth looks like. Have they developed a sizable customer base? Are they out looking for specialists to expand their business even more?
In essence, you validate your idea to learn if the product will pick up on the market and give you sustainable growth opportunities.
Use tools such as lean canvas for a deeper dive into your product analysis.
Be it an internal application for employees, a customer-facing app, or a game, design is important across all app types. A positive user experience comes from both how the app looks and how it works.
That said, when creating designs, make sure the designer understands the project and knows your target audience. The colors, typography, pictures, animations — all have to be aligned to reflect the needs and preferences of your users.
If you’re releasing your app into a highly competitive market with many similar apps, the design of the interface can make or break a deal.
For example, users might love the functionalities offered by your competitor. But if that product has a poor interface and usability, consumers would just as well leave the application in search of a better solution.
To be a viable alternative, however, your product needs seamless UI and UX.
So, regardless of application type, keep your design smart, intuitive, and clean. A cluttered interface is a sure-fire way to make your users wrinkle their noses.