Mobile development blog


Premortem — Why It Should Be a Core Part of Your Digital Product

by Mateusz Płatek, April 7, 2021

A premortem helps teams predict and mitigate product development problems before they happen. Learn how to perform a premortem.

Failures are a natural event in the life cycle of a product, but they’re especially painful when a whole project ends up unsuccessful.

When a digital product fails, the team usually performs an inspection called a postmortem. The team gathers, analyzes what went wrong, draws conclusions, and writes them down. The learnings from that failure help the company adjust its processes so as not to repeat the same mistakes.

But what if we could look into the future before the project failed? At nomtek, we use a premortem — an attempt to predict and mitigate problems before they happen.

What Is a Postmortem?

Before explaining what a premortem is, let’s take a closer look at a postmortem.

In medical language, a postmortem describes the procedure after the patient's death, where doctors and pathologists try to understand the cause of demise. The detailed analysis of why something as complex as a human body has failed can be applied to multiple contexts.

For example, when you need to understand what exactly went wrong to be able to avoid it in the future.

Software development loves postmortems because they’re a great learning tool.

What Is a Premortem?

The main purpose of a premortem is to simulate failure before it happens.

More specifically, a premortem is a meeting that takes place before the first line of code is written. All team members involved in a project attend it:

  • account manager
  • development team
  • moderator

Depending on the size of the project and the team, a premortem meeting lasts one to two hours.

three people talking about a project
A premortem meeting starts well ahead of development work.

We follow these steps:

#1. The scope of the project is explained so that each participant has a full picture of the situation and potential threats.

At nomtek, we do a premortem after the kick-off meeting, when the team has already had a chance to familiarize themselves with the current state and plans for the future.

#2. The moderator announces that the project has failed — this is important. We are talking in the past tense: the project has ended. Participants must feel that the action has already happened.

#3. By using sticky notes or an agile retrospective tool, participants list the reasons why the project failed.

These reasons can be obvious, like exceeding the budget; less obvious, like the whole team falling ill with COVID and the timeline crumbling; or totally abstract, like a falling meteorite.

We get all the ideas out before moving on to the next stages.

#4. Then we proceed as in a typical agile retro. We group similar cards and vote for the most important problems.

A good criterion is to vote for the problems that have the highest chance of occurring and that we can actually influence. Sorry, we won't upvote the meteorite scenario ;)

#5. Here we discuss and come up with solutions. There are different techniques for how to carry out this step.

We simply give participants a moment to come up with specific actions and then post them simultaneously on the Slack channel where we can review the available solutions.

Experience tells us that sometimes the most unconventional ideas end up on the action point list, so it's important not to reveal solutions too early, so as not to disturb the creative phase.

The selected solutions must be actionable, precise, and measurable. We can select more than one way of mitigating the risk of a single problem.

#6. The developed action points are assigned to the appropriate people, and then we set a regular checkpoint to check the progress, e.g., in two weeks or at the beginning of a regular retrospective if it’s a repetitive activity.

people putting together a mobile application

How to Build a Minimum Viable Product

by Wojciech Czajkowski, March 25, 2021

Learn what an MVP is and how to build a minimum viable product that will help you validate your hypothesis.

Most legendary products out there start with a minimum viable product (MVP) that expands and blooms into a feature-rich solution. But not all companies get the MVP stage right. Learn what a minimum viable product is and how to build it the right way.

What Is an MVP?

An MVP is a release of your product with a minimum set of features, just enough to validate your main hypothesis.

In other words, an MVP is a product that solves your users’ core problems.

Eric Ries says a minimum viable product “allows a team to collect the maximum amount of validated learning about customers with the least effort.”

An MVP is a low-risk way to answer the following questions:

  • Do people need that product?
  • Can I generate a following and a returning audience with it?
  • Is my idea of solving a problem something that people really need? And are willing to pay for?

To do that, you don’t need fireworks and dozens of sparkling features — you need the core functionality that gives you answers to the above questions. Glitter and complexity will come later, in the next iterations.

It’s worth keeping in mind that you don’t need an MVP at all to confirm a hypothesis or check if your idea is viable. You can first look into building a proof of concept (POC) or a prototype.

For a detailed discussion of the differences between a proof of concept (POC), prototype, and MVP read our article.

What Is the Purpose of Building MVPs?

In short, the purpose of building MVPs is to validate your assumptions quickly and improve the product, pivot, or abandon the project before losing any more resources.

people designing an app interactivity

What You Should Know Before Developing a Mobile App

by Maja Nowak, March 17, 2021

Developing a mobile app is a complex undertaking. Learn what to know before developing a mobile app.

Developing a mobile app is a complex process that involves technical knowledge, product strategy experience, and good old organizational skills. Here are the basic questions you have to answer before developing a mobile app.

Does Your Business Need an App?

Mobile apps might be used by billions of users multiple times a day, but would your business actually benefit from an app?

In other words, do you have business goals, especially long-term ones, that a mobile app would support?

A mobile app should solve an existing problem that’s been vetted and — more importantly — verified to generate a demand for a solution.

How do you know if your business needs an app?

For example, besides an in-depth business analysis, you can also check your website analytics to learn if your customers would find value in a mobile app.

If your website is seeing a lot of mobile traffic, seize and analyze this data. Maybe you could create an app that delivers what your users are coming for to your website via their mobile phones. Check traffic sources and events in your analytics software to gain insight.

An app might also support an upcoming event your business is hosting. With an event-focused app, you can drive more user engagement and facilitate communication between participants.

Can I Make an App with No Experience?

It all depends on the complexity of your app.

When you lack programming skills, you can consider using low- or no-code platforms to build a simple app. That said, low- or no-code solutions still call for a level of experience, but the entry threshold is lower than that of traditional programming.

So if you need an app that will support internal company processes for employees, low-code solutions can help build one that addresses simple tasks and workflows. Similarly, no-code platforms can also be used to build e-commerce shops.

drag-and-drop low-code programming platform
Low- and no-code development platforms work as drag-and-drop interfaces. Source: Creatio

But again, there’s a learning curve — getting enough expertise will take some time. And if you need complex programming to add additional functionality to your app, some serious skills are required, code or no code involved.

Also, low-code development doesn’t completely eliminate the need to hire a developer, but it can immensely support your dev team and let them focus on creating solutions for complex business processes.

And while you can take the time and effort to build a mobile app in a no-code platform, the technical side of an app is only one part of the story.

To build, release, and maintain a successful mobile application, you’ll need experience in idea validation, data analytics, and product design just to name three.

A mobile app agency brings a wealth of experience to the table. You get the expertise of people who have gone through the idea validation process many times before and know what it takes to make an app pick up on the market — sometimes without writing a single line of code.

Still, if you decide to pursue the process of mobile app development on your own, be sure to first validate your idea extensively using tools such as the lean canvas.

hands interacting with a mobile phone

Choosing a Hand Gesture-Based Input Method for a Mobile AR App

by Sebastian Ewak, March 8, 2021

How we evaluated different input methods available for mobile augmented reality and why we didn’t choose a hand gesture-based interaction method.

In this article, we’ll explain why we considered incorporating hand gestures as an interaction method for one of our mobile XR projects, what options we evaluated, and why we ultimately decided not to follow that path, at least for the first release.

A New XR Project

In April 2020, we started working on a new AR project for a large entertainment center in Las Vegas. The project was a treasure hunt-like experience to be deployed as an interactive installation.

During the initial discussions, iPhone 11 Pro Max was selected as the target platform. The devices would be encased in custom-designed handheld masks to amplify the feeling of immersion into the whole theme of the venue.

We chose that specific model of the iPhone for the horsepower necessary to handle the demanding particle-based graphics. Also, the phone had a reasonably sized AR viewport at a relatively low weight of the entire setup.

Considering the project’s scale, its high dependency on the graphical assets, and our pleasant experiences with developing the Magic Leap One app for HyperloopTT XR Pitch Deck, we selected Unity as our main development tool.

The Heart and Soul of the Experience

In the AR experience, the users would rent the phone-outfitted masks and look for special symbols all around the dim but colorfully lit, otherworldly venue. The app would recognize the custom-designed symbols covered with fluorescent paint and display mini-games over them for the users to enjoy.

We already had several challenges to solve.

The main one was detecting the symbols — there was nothing standard about them in terms of AR-tracking. Also, the on-site lighting was expected to be atmospheric, meaning dark — totally AR-unfriendly.

The games themselves were another challenge. Users could approach and view the AR games from any distance and at any angle (e.g., over any background). The graphics would consist mostly of very resource-intensive particle effects, as it was the theme of the app.

Plus, we only had a couple of months to develop the AR experience.

There was also one big question left: how are the users going to interact with the app?

a prototype of a mobile phone

What Is a POC, Prototype, and MVP — Explaining the Differences

by Maja Nowak, March 4, 2021

Learn the differences between a proof of concept (POC), prototype, and minimum viable product (MVP) to know how to approach product development.

Building good digital products is a combination of being innovative and following tested mobile app development methods. A proof of concept (POC), prototype, and minimum viable product (MVP) help test a product idea before you make a significant investment.

What are the differences between a POC, prototype, and MVP, and how to choose the one that fits your project best? Read on for answers.

POC vs. MVP vs. Prototype: Short Definition

Proof of concept — A POC is a method of validating assumptions with target users and checking if your idea is feasible technically.

Prototype — A mobile app prototype evaluates the general “shape” of your idea (e.g., look, flow, user interaction).

Minimum viable product — An MVP is a fully working version of your product but with only the core features that let you collect initial user feedback.

We talk in detail about how to build an MVP in our guide.

What Is a Proof of Concept?

In the world of mobile app development, a POC is a simple project that validates or demonstrates an idea. The purpose of a POC is to check if an idea can be developed and won’t consume excessive resources or time.

With a POC you essentially evaluate core functionality. If your app idea is complex, you can have many POCs to test each functionality.

User experience is pushed aside when you build a POC. That’s because it takes lots of time and work to create an optimal user experience, and that’s not the point of creating a POC. The goal is to validate technical capability.

Features of a proof of concept

Catch early investor interest. You can build a POC to present your idea to investors to acquire seed funding for further development.

Innovate. Innovation happens at the intersection of technological viability and market demand. A POC will help you check if your idea can be built using current technology.

Save time. When you check if your idea can be built, you automatically save time that would be wasted if you were to figure out technical viability issues once you hired developers and committed significant resources and time.

Pick the technology. Creating many POCs using different technologies can help you decide which technology stack is the most suitable for your project. This way, you’ll know early on what’s possible as you move forward and how to structure your product’s roadmap.

Check against the competition. If you plan to release a mobile application in a heavily competitive market, a POC will help you validate unique features in your offer. Your product will need to include a unique approach to solving the same problem to be a better alternative to what’s already out there.

Example of a proof of concept

PONS XR Interpreter

Companies around the world are increasingly embracing remote-work solutions and collaboration methods. We worked with PONS — a global publishing house and our long-term partner — to create a proof of concept for an XR cross-language communication solution supported by AI.

The POC helped validate if XR Interpreter could be used in a professional environment to make communication easier.

product validation augmented reality
The POC was built to prove that professionals could communicate in different languages in real time to discuss complex mechanical issues. Source: nomtek


woman analyzing data

Use Mobile Analytics to Transform Your Digital Product

by Mat Zaleski, February 25, 2021

Mobile analytics gives you all the necessary insight into in-app user behaviors. Learn how to use mobile analytics to transform your products from good to excellent.

Mobile analytics gives you all the necessary insight into in-app user behaviors. The data from mobile analytics can help you make informed decisions about changes in your features or design. Mobile analytics is also invaluable in adjusting in-app processes and funnels because you can directly measure how every step affects user experience (UX).

What Is Mobile App Analytics?

Mobile analytics is the process of collecting and analyzing in-app data. This data gives you precise information about app performance and in-app user journey, letting you fix ineffective elements.

Mobile analytics is one of the key elements to improving conversions, retention, and user engagement. It’s the foundation of great digital products that engage target audiences.

How Does Mobile Analytics Work?

Mobile analytics software integrates with your mobile application to gather and analyze the data produced by users and the app itself. For example, an analytics tool can be embedded as an SDK into the code. You need different SDKs depending on the platform release of your app (e.g., iOS, Android).

Analytics lets you hypothesize, make assumptions, and evaluate experiments. It’s the key to refining and adjusting your product so that it matches the real-life behaviors that users display in your app.

With analytics, you know how users are using your app, what in-app actions they take, and what features they’re using (or not).

Feature usage analytics. Source: Amplitude

For example, you can check the completion of onboarding or how the buying process looks from the user’s perspective.

Identifying steps causing a drop in the % of users. Source: Piwik PRO
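To make the drop-off idea concrete, here is a small self-contained Kotlin sketch (our own illustration, not tied to Amplitude or Piwik PRO) that computes the percentage of users lost at each consecutive funnel step:

```kotlin
// Given user counts at each funnel step (e.g., app opened -> onboarding
// started -> onboarding finished), compute the percentage drop-off
// between consecutive steps.
fun dropOffPercentages(stepCounts: List<Int>): List<Double> =
    stepCounts.zipWithNext { prev, next ->
        if (prev == 0) 0.0 else (prev - next) * 100.0 / prev
    }

fun main() {
    // Example: 1000 installs, 600 started onboarding, 300 finished it
    println(dropOffPercentages(listOf(1000, 600, 300)))  // [40.0, 50.0]
}
```

A spike in one of these numbers points at the exact step in the funnel worth redesigning, which is the kind of insight the screenshots above visualize.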


hands with mobile augmented reality interaction devices

Reviewing Mobile Augmented Reality App Interaction Methods

by Sebastian Ewak, February 16, 2021

During our work on a mobile AR project, we reviewed many non-touch screen interaction methods. Read our findings and learn where to use different solutions.

For one of our recent projects — a mobile AR treasure-hunt type experience that consisted of a series of mini-games — we were challenged to find a way of interacting with the app that would be more engaging and fun than a touchscreen interface.

Since we’ve spent some time and effort reviewing various options, we thought it might be helpful to others if we shared our findings.

Our goal is to show you what’s out there and highlight the solutions that we find to be the most promising for mobile interactions.

Not all of the solutions target mobile AR or are actual controllers. But we thought it makes sense to include them as well, since, as you will see, some of them have the potential to be viable in the future or are simply really cool.

The list is by no means exhaustive, and we’d be happy to hear if you know of any awesome AR interaction solutions that we might have missed.

What Mobile Augmented Reality App Interaction Methods Are There?

Litho

Litho is a very promising mobile AR controller. The device is small with a futuristic look, which matched the sci-fi theme of our project really well.

The initial research showed great potential, and, since we developed our app in Unity and Litho offers an SDK for it, we decided to take it to round two of our evaluation.

Our colleague Dominik prepared a prototype that allowed us to determine how well Litho fit in with the requirements of our project.

The SDK setup was straightforward and implementing the prototype took relatively little time. From the video, you can see that Litho requires minimal initial calibration. The interaction is based on a precise virtual laser pointer with a fixed origin. Litho offers 3 DOF readings, meaning you can detect how the hand is oriented in space, but not where it’s located. The quality of the interactions doesn’t depend on the lighting conditions.
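To illustrate what a 3 DOF, fixed-origin pointer means in practice, here is a minimal Kotlin sketch (our own illustration, not actual Litho SDK code; all names are made up): the controller reports orientation only, so the app can cast a laser-pointer ray from a fixed origin in the direction obtained by rotating a forward vector by that orientation.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Minimal 3D vector with just the operations we need.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    fun cross(o: Vec3) = Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x)
}

// Unit quaternion representing the 3 DOF orientation reading.
data class Quat(val w: Double, val x: Double, val y: Double, val z: Double)

// Rotate vector v by unit quaternion q: v' = v + w*t + u x t, where t = 2(u x v).
fun rotate(q: Quat, v: Vec3): Vec3 {
    val u = Vec3(q.x, q.y, q.z)
    val t = u.cross(v) * 2.0
    return v + t * q.w + u.cross(t)
}

fun main() {
    // A 90-degree rotation about the Y axis turns the forward ray (0, 0, -1)
    // into (-1, 0, 0), up to floating-point error. With 3 DOF, only this
    // direction changes; the ray's origin stays fixed.
    val halfAngle = Math.PI / 4
    val q = Quat(cos(halfAngle), 0.0, sin(halfAngle), 0.0)
    println(rotate(q, Vec3(0.0, 0.0, -1.0)))
}
```

The missing 3 DOF of position is exactly why the origin in a sketch like this has to be hardcoded, and why we judged the interaction less fun for our use case.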

After evaluation, we decided not to use Litho for our entertainment experience, mainly because it didn’t offer hand position tracking, which we considered crucial for achieving a high fun factor.

The significance of this argument came from the fact that adding any kind of external controller made the logistics more complex so the provided entertainment value needed to be worth that extra cost.

Our app would be preinstalled on mobile devices, which would then be rented on the premises for the users to enjoy for a specified time.

Using any kind of controller required us to pair it with the devices, ensure it remained charged, and account for it when picking up the equipment once a visitor finished using the app.

Even though we decided not to use Litho in our project, I think it could work well in any kind of mobile AR application where the precision of the interaction is the key and where the users interact with the app regularly.

FinchDash / FinchShift

FinchDash / FinchShift are AR/VR motion controllers. A Unity SDK is available. Of the two, only FinchDash is described as supporting mobile platforms, which is a bit unfortunate as it only allows 3 DOF tracking.

Also, the controller’s rather unremarkable looks don’t fit well with themed experiences.

FinchDash, an AR motion controller. Source: Finch XR

FinchShift, FinchDash's sibling device, uses an additional piece of equipment in the form of an armband to offer full 6 DOF input. It’s also more visually appealing in my opinion.

ManoMotion

ManoMotion is a camera-based mobile hand tracking solution that offers a Unity SDK. It’s easy to set up and doesn’t require additional calibration. It can scale depending on the needs from 2D tracking through 3D tracking of the hand position and finally to a full-blown skeletal tracking with gesture recognition.

As with any camera-based solution though, it has some technical limitations that need to be considered.

The main one is the effective control area, which is only as large as the camera's field of view. Since we're discussing a mobile use case, this is even more significant: the device in our project would be held in one hand, and you can extend the other hand only so far before it becomes awkward.

Another disadvantage is the reliance on computer vision algorithms, which causes the accuracy to be inconsistent across different lighting conditions. Especially colored light can degrade the experience quite a bit.

That said, we had the chance to work with ManoMotion’s support on our challenging use case (dim colored lighting). It turns out that ManoMotion can adjust their algorithm if the target conditions are known in advance. In our case, it allowed achieving a similar level of accuracy in the challenging lighting as in an optimal one, which was very impressive.

Google MediaPipe Hand Tracking

Google MediaPipe is an open-source camera-based hand tracking solution, similar to ManoMotion, and as such it shares the same limitations. In terms of platforms, it supports Android and iOS. But at the time of our research, it didn’t offer an officially supported Unity SDK.

Google’s MediaPipe hand tracking. Source: Google Blog

ClayControl

ClayControl is another option in the category of camera-based hand tracking solutions. It seems to cover a wide range of platforms, including mobile, and has Unity on the list of compatible software platforms.

ClayControl's website mentions low latency as one of the key selling points, which is interesting considering that solutions based on cameras and computer vision usually involve some degree of input lag. It seems there is no SDK openly available for it, so we didn't have a chance to evaluate it.

Polhemus

Polhemus is a very promising wearable camera-less hand tracking solution. It’s based on 6 DOF sensors, which don’t require a field of view and can provide continuous tracking even in complete darkness.

At the time of our research, however, it was PC-only. On the website, you can find information about VR, but unfortunately no AR support yet.

Real-time hand and finger tracking by Polhemus. Source: Polhemus

Xsens DOT

Although Xsens DOT is a motion-tracking device and not an actual controller, it could be used as one with some additional work. It does offer 3 DOF support, so the orientation data is accurate, but the position is only estimated.

It’s smaller than Litho, which itself is quite small. At the time of writing, I wouldn't consider it practical for typical AR interactions. It might be worth considering for more specific motion tracking needs.

FingerTrak

FingerTrak is another promising wearable, this time in the form of a bracelet. It allows continuous hand pose tracking using relatively small thermal cameras attached to a wristband.

On its own FingerTrak allows detecting gestures without the field of view limitation, which affects the other camera-based solutions. It doesn’t look like a finished product yet, but it seems to be an interesting approach that could turn out well in the future.

Leap Motion Controller

Leap Motion Controller uses a relatively small sensor with cameras and a skeletal tracking algorithm to accurately detect the user’s hands. Unfortunately, at the time of writing, it doesn’t support mobile yet.

Apple AirPods Pro

Wait, what? As surprising as it sounds, someone actually seems to have tried using AirPods Pro for 6 DOF tracking! This is more of a curio, but if the video is actually legitimate, AirPods open up some unexpected possibilities.

person moving icons on an iPhone screen

Working with an iOS App Development Company

by Maja Nowak, February 1, 2021

To develop a great iOS app, you need a reliable partner. Find out what iOS development looks like and how to find an iOS app development company.

To develop a great mobile app, you need a reliable partner who will help you build a product that aligns with your business goals. With many iOS development companies to choose from, it can be daunting to pick a trustworthy contractor.

The key is to find a partner who will not only develop your product but also actively participate in the idea validation process and further product refinement.

Learn what the iOS development process looks like and how to work with an iOS development company.

What Is Required for iOS App Development?

Apple has a whole set of resources and tools to help developers build apps that work on devices running iOS. That said, the iOS app development process is somewhat different from Android app development — to build an iOS app, you'll need to equip yourself with prerequisite tools:

  • an Apple Mac
  • an Apple Developer account (priced $99 yearly, $299 for an enterprise account)
  • and Xcode to sign and publish a native iOS application

While an Apple Mac and an Apple Developer account are self-explanatory, Xcode is an integrated development environment (IDE) built specifically for creating native apps for Apple’s devices.

You can’t publish an iOS app without Xcode and a valid Apple Developer Program membership.

Considering the necessary tools, it’s not exactly free to develop an iOS app (in contrast, Android calls for a one-time $25 fee, and that’s it).

But the Xcode requirement is hardly a nuisance — Xcode is a pretty decent IDE with a code editor, UI designer, testing features, and other essential tools for iOS mobile development.

Note: In cross-platform mobile development, Xcode is also required to sign and submit the app to the App Store.
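As an aside, when signing and exporting a build from the command line, Xcode's `xcodebuild` tool consumes an "export options" property list. A minimal sketch might look like this (the `teamID` value is a placeholder, and the exact keys you need depend on your distribution method):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Distribution method: app-store, ad-hoc, enterprise, or development -->
    <key>method</key>
    <string>app-store</string>
    <!-- Placeholder Apple Developer team identifier -->
    <key>teamID</key>
    <string>ABCDE12345</string>
</dict>
</plist>
```

This file is passed to `xcodebuild -exportArchive` via the `-exportOptionsPlist` flag as part of preparing an App Store submission.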

woman presenting slide about mobile app

Android App Development: How to Build a Great Mobile Product

by Maja Nowak, January 15, 2021

Ensuring your Android app development project is successful is easy when you know what’s involved in the process. Learn all about Android app development.

Shopping, organizing, scheduling, banking, working, or simply having fun: it's often a mobile app that does the job, bringing convenience galore.

All right, so how come such a staggering number of apps fail?

That’s easy — because they don’t solve anything for their target audience. Indeed, it’s the failure to research the market that’s the top reason why mobile applications flop on app stores.

Whether we’re talking about iOS or Android app development, the principle is the same:

Do your research and battle test your app idea with potential users. Only then will you be able to use the application as a vehicle for business growth.

In this article, I’ll take you through the basic steps you need to take to make your Android development project successful.

Start with an In-Depth Idea Validation and Analysis

A business analysis of your idea is the most important part of the app creation process.

I can't stress this enough: validating your idea before investing money in its creation will help you approach the process with the necessary level of confidence backed by data.

Here, you have to consider concepts such as:

  • Unique value proposition — how is your offer different from everything out there?
  • Target audience — who are your preferred users?
  • Reach channels — how will you reach your users with your mobile product?
  • Revenue channels — how will you monetize your product? (e.g., in-app sales, advertising, paid features)

A key phase during idea validation is a detailed analysis of your competitors.

Looking at how your competitors solve for their audiences will help you build a better strategy for your product — you’ll know what worked for them and what has drawn negative feedback.

Similarly important is determining if your product idea is lucrative. To do this, look at what your competitors' growth looks like. Have they developed a sizable customer base? Are they out looking for specialists to expand their business even more?

In essence, you validate your idea to learn if the product will pick up on the market and give you sustainable growth opportunities.

Use tools such as lean canvas for a deeper dive into your product analysis.

Idea validation is the first step in the lifecycle of your product.

Create Stunning, Engaging, and Simple UI and UX

Be it an internal application for employees, a customer-facing app, or a game, design is important across all app types. A positive user experience comes from both how the app looks and how it works.

That said, when creating designs, make sure the designer understands the project and knows your target audience. The colors, typography, pictures, animations — all have to be aligned to reflect the needs and preferences of your users.

If you’re releasing your app into a highly competitive market with many similar apps, the design of the interface can make or break a deal.

For example, users might love the functionality offered by your competitor. But if that product has a poor interface and usability, consumers will just as well leave the application in search of a better solution.

To be a viable alternative, however, your product needs seamless UI and UX.

So, regardless of application type, keep your design smart, intuitive, and clean. A cluttered interface is a sure-fire way to make your users wrinkle their noses.

Keeping a clean and intuitive design will make your application more attractive.
android mobile phone animations

Discover an Easy Way to Create Complex Animations with MotionLayout

by Krzysztof Król, January 13, 2021

Mobile animations on Android can be just as polished as those on iOS apps. Find out how to create refined animations on Android.

Animations are important. iOS developers seem to know that because applications in the App Store are usually much more polished than their Android counterparts.

What’s the reason behind this? Are Android devs simply lazy?

Well, no.

The problem is that for a long time the Android SDK didn't offer great tools for creating animations. This has been changing over the years, and nowadays creating beautiful animations on Android is a lot easier.

Developers at Google created many amazing tools like Transitions API or CoordinatorLayout. We wrote an overview of Android animation libraries that we use every day at work.

At the I/O 2018 conference, Google introduced yet another great library — MotionLayout. In this article, we will take a deep dive into the world of Motion Layout and explore countless possibilities that it offers to developers.

MotionLayout: Basics

So, what’s MotionLayout? Simply put, MotionLayout is a ViewGroup that extends ConstraintLayout.

We can define and constrain children just like we would with a standard ConstraintLayout. The difference is that MotionLayout builds upon its capabilities: we can now describe layout transitions and animate view property changes.

The amazing thing about MotionLayout is that it’s fully declarative. All transitions and animations can be described purely in XML.

Before we start playing with MotionLayout, we need to add suitable dependencies to a project.

dependencies {
    implementation 'androidx.constraintlayout:constraintlayout:2.1.0-alpha1'
}

Now, let’s define our initial Fragment (an Activity would work as well).

MotionLayoutFragment.kt

class MotionLayoutFragment : Fragment() {

    override fun onCreateView(
        inflater: LayoutInflater,
        container: ViewGroup?,
        savedInstanceState: Bundle?
    ): View? {
        return inflater.inflate(R.layout.fragment_motion_layout, container, false)
    }
}

As you can see, this Fragment is very basic. There is no animation-related code; the only thing we need to do is inflate our layout. Let’s take a look at it.

fragment_motion_layout.xml

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.motion.widget.MotionLayout xmlns:android="http://schemas.android.com/apk/res/android"
 xmlns:app="http://schemas.android.com/apk/res-auto"
 xmlns:tools="http://schemas.android.com/tools"
 android:id="@+id/motionLayout"
 android:layout_width="match_parent"
 android:layout_height="match_parent"
 app:layoutDescription="@xml/scene_01"
 app:motionDebug="SHOW_ALL"
 tools:showPaths="true">

<View
     android:id="@+id/button"
     android:layout_width="64dp"
     android:layout_height="64dp"
     android:background="@color/colorAccent" />

<ImageView
     android:id="@+id/imageView"
     android:layout_width="128dp"
     android:layout_height="128dp"
     android:src="@drawable/android" />

</androidx.constraintlayout.motion.widget.MotionLayout>

The most important parameter here is app:layoutDescription. It points to a scene definition, in which we will define our layout transitions.

Another interesting parameter is app:motionDebug="SHOW_ALL". Thanks to this parameter, our layout will show information helpful for debugging and adjusting animations, namely the path and progress of the animation.

Now we need to define our scene. Take a look at the diagram below.

Diagram for defining MotionLayout scene

As you can see, MotionScene consists of two major blocks.

Transition block contains several pieces of information:

  • The touch handler defines the way users will interact with the animation. The animation might be started with a click, or the user might progress it with a swipe gesture.
  • KeyFrameSet will enable us to fully customize animations. We will take a detailed look at keyframes later in this article.
  • References to starting and final layout constraints.

Apart from that, we need to define the starting and final constraint sets. We don’t need to define constraints for views that stay still during the animation.

Now that we know the basics, we can create our scene.

<?xml version="1.0" encoding="utf-8"?>
<MotionScene xmlns:motion="http://schemas.android.com/apk/res-auto"
 xmlns:android="http://schemas.android.com/apk/res/android">

<Transition
     motion:constraintSetEnd="@+id/end"
     motion:constraintSetStart="@+id/start"
     motion:duration="1000">
<OnSwipe
         motion:dragDirection="dragRight"
         motion:touchAnchorSide="right"
         motion:touchAnchorId="@id/button"/>
</Transition>

<ConstraintSet android:id="@+id/start">
<Constraint
         android:id="@+id/button"
         android:layout_width="64dp"
         android:layout_height="64dp"
         android:layout_marginStart="8dp"
         android:elevation="0dp"
         motion:layout_constraintBottom_toBottomOf="parent"
         motion:layout_constraintStart_toStartOf="parent"
         motion:layout_constraintTop_toTopOf="parent"/>

<Constraint
         android:id="@+id/imageView"
         android:layout_width="128dp"
         android:layout_height="128dp"
         android:alpha="1"
         android:scaleX="1"
         android:scaleY="1"
         motion:layout_constraintBottom_toBottomOf="parent"
         motion:layout_constraintEnd_toEndOf="parent"
         motion:layout_constraintStart_toStartOf="parent" />
</ConstraintSet>

<ConstraintSet android:id="@+id/end">
<Constraint
         android:id="@+id/button"
         android:layout_width="64dp"
         android:layout_height="64dp"
         android:layout_marginEnd="8dp"
         android:elevation="20dp"
         motion:layout_constraintTop_toTopOf="parent"
         motion:layout_constraintBottom_toBottomOf="parent"
         motion:layout_constraintEnd_toEndOf="parent"/>

<Constraint
         android:id="@+id/imageView"
         android:layout_width="256dp"
         android:layout_height="256dp"
         android:alpha="0.5"
         motion:layout_constraintTop_toTopOf="parent"
         motion:layout_constraintStart_toStartOf="parent"
         motion:layout_constraintEnd_toEndOf="parent"/>
</ConstraintSet>

</MotionScene>

Let’s take a look at the more interesting parameters.

<Transition>

  • motion:constraintSetStart — references the starting layout constraints
  • motion:constraintSetEnd — references the final layout constraints
  • motion:duration — defines the duration of the animation. Note that it has no effect if the touch handler is defined as OnSwipe. In that case, the duration is determined by the velocity of the user's gesture and the additional parameters described below.

<OnSwipe>

  • motion:dragDirection — determines the direction of the gesture that needs to be performed to progress the animation. If it’s equal to “dragRight,” a user needs to swipe from left to right to progress the animation. If a user swipes from right to left, the animation will be reversed.
  • motion:touchRegionId — defines the view that needs to be dragged to progress the animation. If it’s not defined, a user might swipe anywhere within MotionLayout.
  • motion:touchAnchorId — this parameter might be a little confusing. We need to tell MotionLayout how far the animation should progress for a given distance dragged by the user's finger. The library determines this from the distance between the starting and final positions of the touchAnchor view.
  • motion:touchAnchorSide — determines which side of the touchAnchor view is tracked.
  • motion:dragScale — determines how much a swipe gesture will progress the animation. If dragScale is equal to 2 and a user's finger moves 2cm, the touch anchor view will move 4cm.
  • motion:maxAcceleration — determines how fast the animation will snap to the initial or final state once the user lifts their finger.
  • motion:maxVelocity — similar to motion:maxAcceleration, but caps the maximum velocity of that snap.
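Putting several of these parameters together, a hypothetical OnSwipe handler might look like this (all values here are illustrative, not taken from the scene above):

```xml
<!-- Illustrative values: the swipe progresses the animation twice as fast
     as the finger moves, and only drags that start on @id/button count. -->
<OnSwipe
    motion:dragDirection="dragRight"
    motion:touchAnchorId="@id/button"
    motion:touchAnchorSide="right"
    motion:touchRegionId="@id/button"
    motion:dragScale="2"
    motion:maxAcceleration="100"
    motion:maxVelocity="50" />
```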

<ConstraintSet> — a set of initial or final layout constraints. Each constraint defines attributes for a particular view. Note that not every view attribute can be declared as part of a constraint. A constraint should describe the view's position or one of the following:

  • Alpha
  • Visibility
  • Elevation
  • Rotation
  • Translation
  • Scale

If we need to animate a different view attribute, we should declare it as <CustomAttribute>. We will learn how to do this in the next section.
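As a quick preview, here's a hedged sketch of a <CustomAttribute> animating a view's background color (the color value is illustrative). MotionLayout interpolates between the values declared in the start and end constraint sets:

```xml
<ConstraintSet android:id="@+id/start">
    <Constraint
        android:id="@+id/button"
        android:layout_width="64dp"
        android:layout_height="64dp"
        motion:layout_constraintStart_toStartOf="parent"
        motion:layout_constraintTop_toTopOf="parent">
        <!-- The end ConstraintSet would declare the same attributeName
             with a different customColorValue. -->
        <CustomAttribute
            motion:attributeName="backgroundColor"
            motion:customColorValue="#FFFFFF" />
    </Constraint>
</ConstraintSet>
```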

Let’s take a look at the animation we created.

As you may remember, we enabled the debug overlay.

Thanks to it, we can see text at the bottom describing the animation's progress and frame rate. We can also see the paths of our animated views.

The blue rectangle is the anchor view.

human and robot hands touching

Technology Trends for 2021

by
Maja Nowak
Maja Nowak
,
January 5, 2021

The technology trends for 2021 are likely to be the continuation of the technological evolution and adoption that started in 2020.

Read more

One thing in 2020 was certain — the uncertainty. All the trends and projections announced for 2020 have either been modified or their emergence delayed. But in technology, we’ve seen an unprecedented evolution.

E-commerce sales soared, online education matured, and on-demand services rose to huge popularity. To meet the sudden customer demand, companies across the globe have increased their spending on digital transformation.

This demand has in turn spurred the growth and branching out of multiple related services. Will 2021 be a continuation of that expansion, or will other technologies see increased adoption?

5G Fuels the World

5G is a gateway to the realm of a mind-bending technological revolution. That’s not an overstatement — most of the technology trends of the future will rely on that connectivity.

According to the annual state of the global mobile economy report by GSMA, by 2025, 5G will amount to 20% of global connectivity. And while the rollout pace isn’t yet astonishing, the hype around 5G keeps the public’s interest and curiosity up.

5G is rolling out for good
5G is rolling out for good.

In 2021, we’re likely to see the expansion of 5G networks and the doubling of devices with 5G capability.

Internet of Behavior

Activity trackers, phones, smartglasses, cars, cameras, and plenty of other devices collect data about you and your close ones. From your physical geographic location to your browsing history to even face recognition, companies have data galore about you.

The analysis of this highly personal data (your behavior, interests, and preferences) gathered by the Internet of Things devices is dubbed the Internet of Behavior (IoB).

internet of behavior and internet of things
Connected everything — an increasing number of devices can collect and share data.

The more we use online services and connected devices, the more personal data we leave behind. Companies know our political preferences, where we live, what we do, what we believe in, what our interests are, who we associate with, and more.

This data, along with a slew of information coming from devices that are yet to enter the market (e.g., smartglasses that know exactly where you look at any given moment), will give businesses an unprecedented wealth of information they can use to influence our behavior.

But the IoB also means several customer benefits — for example, the more data about driving patterns is collected from connected cars, the better driving experiences the automotive companies can build.

In 2021, we’re likely to see companies use the data from connected devices to create extremely personalized offers and products.

woman looking through augmented reality headset

What Is Augmented Reality? Your Guide to Immersive Technologies

by
Maja Nowak
Maja Nowak
,
December 22, 2020

Augmented reality is a flexible technology that blends digital information with the real world. Learn what is augmented reality in our guide.

Read more

We’re slowly entering the era of augmented reality dominance. With prices for AR hardware dropping and tech giants announcing the release of their AR devices in 2021–2022, mainstream AR adoption is just a matter of time.

But it’s no longer a far-flung future.

Dive deeper into the world of augmented reality in our guide to immersive tech.

With all the technology that goes into an AR experience, the definition of augmented reality is pretty straightforward.

Augmented reality (AR) is a computer-enhanced version of the physical world, with digital content used to amplify the user experience of reality.

Digital content in AR can range from simple graphics and animations to videos and GPS overlays. The rendered content can “feel” and respond to the user’s environment and the user can also manipulate the content (via gestures or movement, for example).

With all the theory checked off, let’s check out some of the most exciting augmented reality apps out there.

Mondly

Mondly helps users learn 33 languages via adaptive learning, which customizes the curriculum based on user progress. The app offers AR-supported conversations, pronunciation advice, and advanced statistics to improve language learning and knowledge retention.

Mondly also features augmented reality lessons where you can view animals and objects in your space for deeper engagement and better experience.

An augmented reality language lesson. Source: Mondly

80's AR Portal

Now this is a hugely entertaining AR mobile app that tickles the imagination in all the right ways.

Anyone a fan of Stranger Things, here? The 80’s AR Portal gives you a chance to immerse yourself in a fun experience and feel the crazy retro-cosmic spirit of the 80s. It’s definitely not a perfect app, but the concept is alluring.


Stepping into the past with 80's AR Portal. Source: Google Play

ARLOOPA

ARLOOPA is an educational app with a flair for entertainment. With the app, you can view multiple AR models in your space and engage with them.

Interestingly, the app has three different types of AR available for you to experience: marker-based, markerless, and geo-location.

ARLOOPA features a number of AR models, with the solar system, animals, furniture, and art. Source: ARLOOPA via App Store

Human Anatomy Atlas 2021

The Human Anatomy Atlas 2021 is one of the best augmented reality examples out there. Excellent for medical students and health enthusiasts, the app features some of the most detailed human anatomy models, with tissues, muscles, bones, and nerves. All supported by interactive lectures.

And you can view all those fancy body parts in AR.

Get to know your body better through augmented reality. Source: Human Anatomy Atlas 2021 via App Store

Bringing immersive technologies into the healthcare industry can help doctors explain complex procedures to patients and serve as an excellent reference during reconstructive surgeries.

We talked in detail about augmented reality in healthcare in one of our articles.

IKEA Place

No respected AR app list can go without the IKEA home decor app.

IKEA Place lets users render augmented reality furniture in their living spaces. The app displays real-size furniture, letting homeowners better visualize items from IKEA’s catalog before deciding on a purchase.

Decorate your room in AR. Source: IKEA Place via App Store

For home decor, there’s also the amazing Houzz app, which got the Editor's Choice award on the App Store.

Shapr3D

Shapr3D is an award-winning app for 3D modeling workflows. It’s a professional CAD (computer-aided design) tool with robust capabilities. A workhorse for designers.

With the expansion of the app to macOS, Shapr3D iPad users can now collaborate on prototypes across devices.

Shapr3D is a robust CAD tool for designers
Shapr3D is a robust CAD tool for designers. Source: Shapr3D via App Store

In 2020, Shapr3D got Apple’s Design Award (the first CAD application to get it).

Canvas

Canvas is a house-scanning AR application that uses Apple’s raw LiDAR data to create interior CAD models and floor plans.

The improved depth mapping of LiDAR technology results in more accurate models (the developer claims a 5x increase in accuracy compared to the non-LiDAR app version).

In the Canvas app, LiDAR scans the room with laser beams
LiDAR scans the room with laser beams. Source: Occipital HQ via YouTube


calling phone and film slate illustration

Android Animation Libraries You Should Know

by
Krzysztof Król
Krzysztof Król
,
December 18, 2020

In-app animations play a critical part in how users perceive apps. Check out some of the great animation libraries that we use for projects at Nomtek.

Read more

Here at nomtek, we pay special attention to application responsiveness. It’s also very important that the apps we create are intuitive for our users. Great animations play a huge part in how users perceive apps.

In Android's early days, we didn’t have much choice in the available solutions. Now there are hundreds of awesome libraries that make our lives as app developers easier.

In this article, we will give you an overview of animation libraries that we use in our everyday work.

All the examples below are from our public repository.

Loading Button

Loading Button example
Loading Button example

Android button widget that uses morph animation to transform into a circular progress bar.

It’s an easy way to make your application more responsive. You can customize the button in many ways to achieve the effect desired in your application.

Transitions API — Scenes

To animate between two layouts with the Transitions framework, you can use the Scenes API.

Steps needed to create the animation:

  1. Create two Scene objects — one for the starting layout and the second one for the ending layout.
  2. Create a Transition object. You can customize the type of animations and their order.
  3. Call TransitionManager.go(targetScene, transition).

Source: https://developer.android.com/training/transitions
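The three steps above can be sketched in Kotlin roughly as follows (the layout and view ids here are hypothetical):

```kotlin
import android.view.ViewGroup
import androidx.transition.ChangeBounds
import androidx.transition.Scene
import androidx.transition.TransitionManager

// Inside an Activity; R.id.scene_root and the two layouts are placeholders.
val sceneRoot: ViewGroup = findViewById(R.id.scene_root)

// 1. Create two Scene objects, one per layout. (The layout currently
//    shown in sceneRoot can also serve as the implicit starting scene.)
val startScene = Scene.getSceneForLayout(sceneRoot, R.layout.scene_start, this)
val endScene = Scene.getSceneForLayout(sceneRoot, R.layout.scene_end, this)

// 2. Create a Transition object and customize it.
val transition = ChangeBounds().apply { duration = 300L }

// 3. Animate to the target scene.
TransitionManager.go(endScene, transition)
```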

Transitions API scenes diagram
Transitions API scenes diagram



Transitions API scene examples
Transitions API scene examples
doctors viewing an augmented reality heart

Improving Clinical Practice with Augmented Reality

by
Maja Nowak
Maja Nowak
,
December 1, 2020

The healthcare industry is projected as one of the greatest beneficiaries of augmented reality tech. Learn how AR is already reshaping clinical practice.

Read more

Over the past century, clinical practice has undergone an almost unthinkable transformation. Just over 150 years ago, doctors didn’t even know they could transmit germs on their hands.

On maternity wards of the past, doctors would move from one female patient to the next, examining each without washing their hands in between, much less wearing gloves.

Needless to say, childbirth deaths ran rampant.

So to think we can now use augmented reality during reconstructive surgery to locate bone fractures and blood vessels is quite an astonishing advancement.

Check out what else augmented reality makes possible in healthcare.

Fast Forward to the Twenty-First Century

The potential of AR is visible across industries. From education to manufacturing to automotive, augmented reality fills gaps in workflows, offering tangible opportunities for improvement.

For example, AR can be used to eliminate inefficiencies in manufacturing caused by engineers having to check paper instructions.

While insignificant at first glance, the constant loss of focus and the time needed to refer to the instructions amount to considerable losses over time.

To give it a more personal context: you know the feeling when you have to focus on the paper instructions for your shiny new IKEA bookshelf while holding the darn thing together with your other two hands?

Well, quality assurance in the automotive industry is like the IKEA instruction times 1,782.4.

For extremely precise and complex tasks, the help of computers is indeed invaluable. And the better the augmented reality technology gets, the more advanced its use cases in clinical practice.

In fact, AR and VR in healthcare are projected to add a $47.9 billion boost to the global economy by 2022.

The role of AR/VR in the global economy
The role of AR/VR in the global economy. Source: PwC

The current clinical practice is evolving at lightspeed, but the sector still has multiple areas in need of solutions that can benefit from technology.

Here are just some examples of AR in healthcare.

<quote>Is AR an entirely new concept to you? Read our AR guide to learn the basics of augmented reality.</quote>

Advancing Clinical Practice with Augmented Reality

There’s still lots of research, refinement, and mainstream adoption needed for augmented reality to enhance clinical practice in healthcare systems across the world.

As the technology matures and overcomes technical and implementation hurdles, augmented reality will most likely become a go-to tool for treating a variety of ailments. Doctors will use AR solutions to improve patient outcomes and the quality of surgeries.

people sticking post it cards on the board

Android Development: How to Implement Complex Features in a Finite Time

by
Wojciech Dawiskiba
Wojciech Dawiskiba
,
November 27, 2020

Organizing your work during feature implementation helps improve the process of application development.

Read more

I’ve seen lots of articles about using various libraries and frameworks, clean code, and programming practices but almost none about organizing a developer’s work around implementing a feature. So I decided to share what works for me, and I hope it will help someone improve their process.

I’ll tell you how I analyze a user story, design and implement a solution, and prepare a merge request.

I’m currently working on an Android project that follows a Clean Architecture variation, with Gradle submodules containing features. All the examples in the article will be based on this project.

Getting to Know the Feature

I start by reading the whole story and looking through the designs to load all the context into my head. Then I go through the story again, but this time listing everything that needs to be done.

And I do mean everything, not only the implementation pieces.

Need to add copy to the translation tool — list it. Need to talk with other teams about some integration details — list it.


List all the things to master feature implementation. Source: Imgflip

Entries like “load data” are good for now. I worry about all the details later (fetching data from the API, caching locally, handling errors, etc.).

In a perfect world, the user stories that we pick up to work on should be small enough to result only in a few points on the list. Unfortunately, this is not always the case in the real world.

Making the Code Review Easy

I follow this pattern until I'm out of items on the sublist.

Then I prepare a pull request from the sub-branch to the feature branch: “OA-5_load_data” ->“OA-5.” This way, reviewers can get through all the code in smaller pieces that are easier to digest.

Before actually creating the pull request, I read through all the changes. Such a pre-review step makes the code review easier: most typos and styling issues are caught here, and other developers don't have to point them out.

After merging all the pieces, I create another pull request — this time to the development branch: “OA-5” -> “develop.” In this one, reviewers can only skim through the code looking mostly for leftover developer tweaks (unnecessary logs, navigation shortcuts, etc.), since they've seen all of the code before.

Once this pull request is merged, the feature is ready for QA.

person playing with an augmented reality solar system

AR in Education: How to Make Learning Fun with Tech

by
Maja Nowak
Maja Nowak
,
November 23, 2020

Augmented reality in education helps increase student learning motivation, engagement, and satisfaction. See examples of AR apps for educators.

Read more

Bored out of my mind with a growling stomach, I remember sitting during biology lessons with no clue whatsoever what my teacher was saying about cellular structures and the human nervous system.

Well, if it were possible for the teacher to put it this way:

human nervous system in augmented reality
The human nervous system from Google’s search result page. Source: Google

… I might have even become a doctor. I mean, who knows, right?

But two decades ago, diving into an immersive augmented reality biology lesson was stuff out of wicked dreams or good sci-fi movies.

I mean, those were the times of the Nokia 3310.

snake game nokia 3310
Remembering the thrill of the Snake game on the Nokia 3310.

Currently, augmented reality in education is reviving traditional classes, giving educators a new way of engaging students and teaching them difficult concepts.

On top of reviving the classroom, augmented reality apps for education also help increase student learning motivation and amplify professional training.

How Augmented Reality Helps Educators Teach

According to research, most of us are visual learners — we learn and understand concepts best when provided with visual cues.

Augmented reality is therefore a great enabler of learning experiences, especially with the growing capabilities of this technology.

The rendering of objects has improved, along with the ability of 3D models to react to the environment in which they’re projected.

A systematic review of a decade of using AR in education (2008–2018) has revealed that augmented reality is “increasing motivation (24%) and facilitating interaction (18%).”

Various studies analyzed in the review also found that AR:

  • Increases confidence in students
  • Raises the level of commitment and interest
  • Provides opportunities for self-learning
  • Boosts collaborative learning
  • Improves satisfaction

Now let’s take a closer look at how exactly augmented reality helps students learn better and more effectively.

<quote>Don't know much about AR? Head over to our guide on augmented reality for more details about this tech.</quote>

AR Boosts Student Engagement

While learning is fun in itself, some classes can bore even the most curious person out there. And let’s face it, when you’re bored, the last thing you want to do is to actually understand something.

With schools and universities in many countries still relying on remote education, engagement is more important than ever to maintain the quality of education.

Augmented reality helps teachers stimulate students’ interest by providing them with immersive visual experiences.

AR mitochondrion
A mitochondrion — Google’s jab at AR cellular structures. Source: Google

Google is developing its AR capabilities full-throttle. With the company’s Google for Education initiative, educators can use several 3D models to help students better understand concepts and retain knowledge.

Many popular queries can be rendered in AR directly from the mobile browser on an Android phone.

AR Increases Learning Motivation

AR delivers immersive experiences with interaction features that increase student motivation to learn.

For example, undergraduate health science students exhibited greater learning motivation when using an augmented reality app to study human organs.

Human organs in Anatomy 4D. Source: Educational App Store

Interestingly, researchers found that “Anatomy 4D mobile application was better able to hold the attention of the students than the anatomy notes.”

By leveraging augmented reality, students can receive practical experience in multiple fields in a simulated learning environment.

AR Facilitates the Understanding of Difficult Concepts

Augmented reality can help students understand difficult and abstract concepts. For example, how the human digestive system looks and works.

Digestive system in Hololens
Digestive system as seen in mixed reality via HoloLens. Source: HoloPundits

On a bigger scale, the enormity of the universe and our own solar system are difficult to comprehend, but augmented reality lets students get a more immersive view of such gargantuan structures.

Solar system in AR Solar System
Solar system in AR Solar System. Source: AR Solar System

Augmented reality lets students experience and understand scientific phenomena without actually running experiments.

For example, various chemical reactions are too dangerous to run in the classroom. By bringing these experiments into a tangible virtual environment, AR gives students access to many topics otherwise beyond their reach.

Another way augmented reality makes learning immersive is by letting students manipulate objects and experience the resulting reactions and phenomena.

This way, students gain a better conceptual understanding of topics that could otherwise be inaccessible to them.

Augmented Reality Improves Learning Performance

Augmented reality helps students learn new concepts faster and more effectively.

For example, in a study evaluating zSpace, an AR/VR learning solution, researchers found that students understood concepts more easily when augmented reality was involved.

zSpace is a comprehensive solution built to create immersive classrooms for experiential learning. Using zSpace, educators can prepare custom learning material enriched with advanced augmented reality elements.

woman manipulating an augmented reality heart
zSpace lets students manipulate objects and have an immersive experience through special glasses. Source: zSpace

With tools such as zSpace, educators can craft curriculums in a way that makes it easier and faster for students to absorb information.

Medical students and engineers can perform procedures on virtual objects to practice their skills and gain the necessary knowledge.

AR Adds Context to Lessons

For history teachers, it can be tricky to maintain a seamless experience for students for the duration of the whole lesson.

Without an easily relatable context, students usually pore over dates and time periods — tangible visual material could help them digest the lesson better.

For example, when talking about hieroglyphs or historical structures, educators can render the object right in the classroom, and then deliver the lesson.

Building Immersive Homework with AR

Not all students are equally eager to do their homework. Some need a little more incentive.

Augmented reality helps make workbooks and textbooks more interesting by letting students engage in immersive 3D renderings of studied concepts.

Enriching student materials with AR is affordable and accessible. In the US, 53% of children under the age of 11 and 84% of teenagers own a smartphone.

a reptile in augmented reality
A snapshot from Reptiles & Amphibians: An Augmented Reality Popup Book. Source: Amazon
onboarding in augmented reality with lego bricks

Onboarding in Augmented Reality: A New Way to Build LEGO Bricks

by
Maja Nowak
Maja Nowak
,
November 18, 2020

Onboarding in augmented reality can help accelerate new employee productivity. Learn how we used AR to showcase its onboarding capabilities using LEGO bricks.

Read more

Here at nomtek, we’re huge fans of immersive technologies. We also believe that this tech will experience mainstream adoption within two to three years.

And so we explore the possibilities of augmented reality to find out use cases of this tech in different business settings.

Cross-Reality Technologies in Business

Many top companies out there pursue cross-reality solutions to improve their operations. We have Mercedes-Benz, Toyota, Lockheed Martin, Porsche, BMW, and even the US Army eagerly using augmented reality headsets to streamline information sharing.

Augmented reality patches numerous gaps in workflows and can help companies streamline processes.

Recognizing this great potential of AR tech, we decided to showcase AR’s capabilities from a rather unconventional angle.

person interacting with an AR engine

Augmented Reality in Business: How and When to Use It

by
Maja Nowak
Maja Nowak
,
November 12, 2020

Businesses can use AR to improve various workflows and processes. Find out how and when to use augmented reality in business.

Read more

While we’re yet to experience a full-scale AR disruption, the potential of this tech to revolutionize workflows and processes is already huge. Let’s see how and when businesses can use augmented reality.

Note: I’ll be using the terms augmented reality and mixed reality interchangeably throughout the article.

Read why mixed reality is synonymous with augmented reality in What Is Mixed Reality? The No BS Explanation

How Is Augmented Reality Used in Businesses?

Businesses across industries are already using augmented reality solutions for a variety of purposes.

For example, AR is used to:

  • Improve brand awareness
  • Showcase product demos
  • Enhance customer experience
  • Streamline workflows

While the benefits can directly contribute to, say, increased sales in e-commerce businesses — Houzz’s customers are 11x more likely to make a purchase after using the company’s mobile AR feature — the AR technology isn’t universally viable for all sectors.

How to Implement AR in Your Business Strategy

Before jumping into the world of augmented reality tech, there are a few questions you have to answer.

If you want to include AR in your business strategy, first ask yourself what it is specifically that you want to achieve through an AR solution.

To give you an example: a problem can be something missing in the workflow.

Let’s say quality assurance at your company takes a lot of time to complete. The reason why might be that QA professionals need to comb through stacks of paper instructions to complete the process.

This inefficient approach results in a waste of time: seconds turn into minutes and minutes into hours. In the long term, it amounts to a significant drop in productivity.

Augmented reality could come in handy here by feeding all the steps and actions necessary to conduct a QA test into a mixed reality headset. The application would interact with and respond to the actions of the tester in real-time.

Here’s Renault’s road to quality assurance supported by mixed reality:

Wondering what companies use augmented reality? Let’s look at some of the use cases of augmented reality across industries and sectors.

Manufacturing

The manufacturing sector is expected to be one of the biggest beneficiaries of cross-reality solutions. Combined with the rollout of 5G connectivity that offers speeds 100x faster than 4G LTE, augmented reality can be a huge opportunity for manufacturing facilities to improve a number of their processes and workflows.

The upside of using AR in manufacturing facilities is that it’s a fraction of the cost compared to investing in complex hardware.

Besides, augmented reality makes exchanging information much more convenient since there are no physical restrictions such as cables or extra devices. Data is fed to the AR application virtually.

Onboarding. With AR, employees just starting out in a manufacturing plant could see interactive hints and instructions on how to use machinery, with all important information layered over the physical equipment. AR onboarding can increase employee safety and shave off ramp-up time.

See an example of using AR for training purposes at BMW:

Productivity. AR headsets equipped with AI technology could help employees get from point A to point B in the most efficient fashion, leading to potential productivity gains in the long term. Moreover, engineers could send a request for a specific part by simply pointing at it.

Operational information. With AR elements overlaid in a factory, manufacturing employees could have easy access to information about the performance of existing equipment and infrastructure. For example, interactive gauges over different areas on the assembly line offer real-time insight for employees.

Safety. Mixed reality solutions can inform employees about dangers (e.g., areas closed for maintenance/cleaning). And when an emergency happens, workers in need of help or assistance could transmit an interactive beam with their whereabouts.

Automotive

Because cars rely on increasingly complex technology and hardware, they are becoming harder for mechanics to maintain, especially those who may not yet have the know-how necessary to service them. The pace of digital evolution in the automotive industry calls for improvements in various workflows and processes.

Remote assistance. AR-based remote assist lets engineers show other employees how to conduct complex repairs and service maintenance on vehicles.

Mercedes-Benz US is using Microsoft HoloLens 2 and Dynamics 365 Remote Assist to help technicians perform maintenance activities remotely with the assistance of an expert engineer.

Training. Cross-reality workshops are a safe and efficient way to share information. Engineers can enroll in digital training where they learn the workings of complex machinery and how to assemble various parts. Instructors show trainees how to disassemble an engine without actually taking it apart.

Porsche says it has tripled the usage of augmented reality in its workshops. “Tech Live Look” is Porsche’s in-house app for connecting technicians with experts to help solve complex car repairs. Porsche has been using augmented reality for years.

Tech Live Look from Porsche
Tech Live Look from Porsche. Source: Porsche

Tech Live Look speeds up car service per vehicle by up to 40%, significantly improving Porsche’s customer experience.

See how augmented reality can be used in education.

Prototyping. Every new iteration of a prototype can be costly for an automotive company. Car prototypes can cost upwards of $100,000. Immersive designs let companies iterate on and improve product designs without spending on expensive physical prototypes.

Now, $100,000 might not seem like a lot for big automotive companies, but across multiple iterations, it can amount to a substantial sum.

BMW is already using AR in vehicle prototyping. AR lets workers know sooner whether a component will fit once production starts, reducing the number of physical test setups needed.

Entertainment

Content consumption. Enhanced with augmented reality, written content can be expanded to include immersive experiences. AR can also serve as a visual aid in non-fiction writing to better illustrate concepts and events.

Board games. Traditional board games could use augmented reality to enhance the level of immersion for gamers. Physical boards can be transformed from 2D experiences into interactive 3D adventures — the board stays the same while the elements turn virtual.

A real world board game with augmented reality elements
A real world board game with augmented reality elements. Source: Tilt Five via Kickstarter

Military

In the military, soldiers can use augmented reality that transforms sensor data into visual input to gain greater insight into their surroundings as well as to improve navigation.

Situational awareness. Sensors and cameras implemented in AR tech provide soldiers with more information regarding their surroundings. Other critical information can also be fed into a headset from headquarters.

Since 2018, the US Army has been looking into AR while developing the Integrated Visual Augmentation System (IVAS). The IVAS provides mission-critical information to soldiers on the battlefield; for example, the system performs quick object identification.

Navigation. In aviation, augmented reality blends complex charts and maps into a pilot’s field of view, decreasing the need to check the information on displays.

An example of heads-up display (HUD) in aviation
An example of heads-up display (HUD) in aviation.

The US Army is exploring the possibilities of augmented reality goggles for combat dogs. The idea is to give dogs in the field more contextual information, along with visual indicators that tell dogs where to go.

Ordinarily, soldiers guide their dogs with lasers or hand gestures. During a mission, however, it might not be possible for the soldier to be close enough to the dog to give it commands. This is where AR goggles step in, letting soldiers guide their dogs through visual cues rendered in the glasses. Additionally, the goggles attached to the dog’s head transmit what the dog sees back to the soldier.

AR goggles for dogs -- augmented reality technology in military
AR goggles for dogs. Source: US Army

Travel

Tenant instructions. Landlords renting apartments can use augmented reality to provide tenants with instructions. For example, to help tenants orient themselves around the apartment or explain how to use and locate different utilities.

Along with smart locks that eliminate the need for the landlord to hand the tenant the keys, augmented reality further decreases the necessity for contact.

Virtual AR Map Concept for rentals
AR Map Concept for rentals. Source: Isil Uzum via dribbble

Tenants simply put on a headset or open an app and explore the flat themselves with detailed instructions.

Immersive experiences. To advertise offered destinations and facilities, travel agencies can turn to augmented reality to create AR tour presentations. This way, customers get to experience interactive content and learn more about a destination. AR tour presentations also help travel agencies prepare offers with content customized to cater to different target audiences.

Moreover, to improve customer experience, travel agencies can equip their customers with advanced digital tour guides. These augmented reality guides can be further tweaked to deliver memorable and personalized experiences to tourists in a given location. For example, an AR guide could highlight sightseeing spots that match customer needs and preferences.

woman with hair color change via augmented reality app

How Businesses Use Augmented Reality: App Examples across Industries

by
Maja Nowak
,
November 2, 2020

Check how your business can use augmented reality to improve customer experience, increase engagement, and boost brand promotion.

Read more

With relatively cheap implementation costs and easy accessibility, mobile AR is a great choice for a goal-oriented tech asset. Businesses can use augmented reality to increase customer engagement, improve brand promotion and awareness, and facilitate the creation of product demos.

Multiple high-profile companies are investing in AR tech. Among them are Qualcomm, Apple, Facebook, and Google.

Both Apple and Google are heavily developing their respective AR software development kits (SDKs). ARInsider projects that by the end of 2020, there will be over 2.5 billion AR-compatible mobile devices, yet only around 500 million active AR users. There’s a huge potential for commercial adoption of this tech.

Combined forecast for ARCore and ARKit devices and user base
Combined forecast for ARCore and ARKit devices and user base. Source ARInsider

In fact, IDC’s Worldwide Augmented and Virtual Reality Spending Guide predicts that global spending on cross-reality tech will grow at a 77% CAGR between 2019 and 2023.
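To put that growth rate in perspective, here is the arithmetic behind a 77% compound annual growth rate over the four-year window the forecast covers. The numbers below are just an illustration of what the CAGR implies, not figures from the IDC report:

```python
# A 77% CAGR compounds for four years (2019 -> 2023),
# so total spending grows by a factor of 1.77 ** 4.
growth = 1.77 ** 4
print(round(growth, 1))  # roughly a 9.8x increase
```

In other words, the forecast implies spending close to ten times the 2019 level by 2023.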

The AR market alone is booming, with various forecasts projecting it to reach between $30 billion and $80 billion in the upcoming years (1, 2, 3).

Since no costly hardware is needed to bring augmented reality to users, the technology is that much more accessible. Virtual reality and mixed reality headsets, on the other hand, are still too expensive for such widespread adoption.

But what is all the fuss about? Can AR apps transform businesses and add tangible value? Check out how augmented reality is helping businesses.

Houzz

Houzz enhances its selling and reach capabilities with AR to help customers better visualize how a product would look in their house. A growing selection of products available for AR viewing helps Houzz fuel sales and increase engagement.

The 3D view features tiles, furniture, lighting, and other accessories. Houzz’s CEO Adi Tatarko said that users who used the My Room 3D tool were 11 times more likely to make a purchase.

Floor tile selection in augmented reality
Tile selection in AR. Source: Houzz

Once you’ve set up AR capability inside an ecommerce app, creating and adding 3D models of products is a relatively inexpensive way to bring AR to your customers.

Wanna Kicks

Great-looking shoes don’t always look equally great on feet. Wanna Kicks gives customers a chance to see how a pair of sneakers would look on their feet before making a purchase.

There’s still a huge opportunity in retail AR that will most likely be explored in the upcoming years. For example, customers could create a 3D image of their body to be used across shops for better fitting experiences. And I’m not talking here about a simple superimposition of an image of a shirt onto the picture of your body. A 3D scan would reflect how the material flows on different body shapes.

RoomScan Pro and RoomScan LiDAR

Using patented technology, RoomScan LiDAR creates floor plans using Apple’s newest hardware, including the LiDAR Scanner and the A12Z Bionic chip.

Multiple export options, floor plan views, and whole floor scan are just some of the features RoomScan LiDAR offers its users. It’s a great tool for architects who want to quickly draw floor plans or for homeowners.


AR Ruler App

How many times have you been in a situation where you had to measure something fast but had no tools at hand? AR-based measuring apps such as AR Ruler App come to the rescue in exactly these situations.

AR Ruler App lets users measure distance, angles, volume, and height, among many other available measurement options.

Various measurement options in an AR tool
Various measurement options in an AR tool. Source: AR Ruler App

While the app still shows some inaccuracies, the introduction of the iPhone 12 Pro, which features Apple’s LiDAR Scanner, should bring significantly better measurement accuracy to this family of apps.

SunSeeker

SunSeeker is another tool valued by various professionals. From architects and homeowners to realtors and gardeners, SunSeeker gives users detailed information on the sun’s path and position at a given location.

Solar tracking in augmented reality
Solar tracking in augmented reality. Source: SunSeeker via App Store

Check out how we helped Lighticians design and develop a mobile app for controlling light fixtures.

magic leap headset mixed reality

What Is Mixed Reality? The No BS Explanation

by
Maja Nowak
,
October 19, 2020

Defining mixed reality can be tricky. Find out what mixed reality is and how it works.

Read more

MR, AR, XR, 3D, VR… makes your head spin just thinking about deciphering those, right?

It so happens that deciphering them can actually be tricky.

As the features and capabilities of mixed reality, augmented reality, and virtual reality grow more aligned, it’s becoming increasingly difficult to give each a clear definition. Augmented reality and mixed reality in particular have become problematic to tell apart.

So without further ado, let’s see what mixed reality (MR) is.

Defining Mixed Reality: Different Takes on MR Tech

Microsoft describes mixed reality as a spectrum ranging from virtual reality to augmented reality (AR). This definition is also shared by Unity.

mixed reality spectrum
Definition of mixed reality by Microsoft. Source: Microsoft

According to Deloitte, however, mixed reality is a subset of augmented reality.

But when we think about how the capabilities of augmented reality have grown over the years, AR can now be considered synonymous with mixed reality. At least, from a strictly technical point of view.

So why the disparity?

A few years back, most of us associated augmented reality solely with apps. At the time, however, AR apps and tech were rather limited in how they could inject interactive digital objects into the user’s environment.

The biggest problem was the lack of occlusion: the ability of 3D models to “hide” behind objects in the real world, as they would if they were really present in our space. Without occlusion, the perceived level of immersion can easily be ruined for the user.

augmented reality dragon in a room
An AR dragon suspended in space, chilling and defying physics. Source: AR Dragon

But in 2019, updates from Apple and Google eliminated the issue, letting digital AR objects better react to and co-exist with the real-world environment.

A virtual cat with occlusion turned off and occlusion turned on with Google’s ARCore Depth API
On the left, occlusion turned off. On the right, occlusion turned on with Google’s ARCore Depth API. Source: Google

Mixed reality used to be touted as superior to augmented reality precisely because MR was the more immersive technology, capable of sensing and reacting to the world it was projected into.

That’s no longer the case.

The current AR technology lets mobile phones create high-resolution depth maps that accurately place objects in the real-world environment, taking into account other physical objects present in the space.
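At its core, depth-based occlusion is a per-pixel depth test: a virtual object’s pixel is drawn only where it is closer to the camera than the real-world surface reported by the depth map. Here is a minimal, deliberately simplified sketch of that idea in plain Python; the grids, values, and function names are illustrative, not the actual ARCore or ARKit API:

```python
# Sketch of depth-based occlusion. Depths are in meters; smaller = closer.
# A virtual_depth entry of None means the virtual object doesn't cover
# that pixel at all.

def composite(real_depth, virtual_depth, virtual_color, background):
    """Per-pixel depth test: draw the virtual object only where it is
    closer to the camera than the real world."""
    out = []
    for real_row, virt_row in zip(real_depth, virtual_depth):
        row = []
        for real_d, virt_d in zip(real_row, virt_row):
            if virt_d is not None and virt_d < real_d:
                row.append(virtual_color)  # virtual object is in front
            else:
                row.append(background)     # real world occludes it
        out.append(row)
    return out

# A 2x2 frame: a real object 1.0 m away in the left column,
# a wall 3.0 m away in the right column.
real = [[1.0, 3.0],
        [1.0, 3.0]]
# A virtual cat placed 2.0 m away, covering the whole frame.
virt = [[2.0, 2.0],
        [2.0, 2.0]]

frame = composite(real, virt, "cat", ".")
# Left column: the real object (1.0 m) is closer, so the cat is hidden
# there. Right column: the cat (2.0 m) sits in front of the wall (3.0 m).
```

The ARCore Depth API and ARKit perform this comparison on the GPU against a camera-derived depth map every frame, but the principle is the same comparison shown above.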

So why mixed reality? My guess is that the term mixed reality was coined to help users associate it with headset-based AR and remove any possible confusion with mobile AR.

This was a legitimate approach — a few years ago. Just think about the hype with Pokémon Go. It’s no wonder that an enterprise-targeting company such as Microsoft wanted to differentiate its technology.

HoloLens 2 Enterprise
HoloLens 2 Enterprise. Source: Microsoft

And how does virtual reality fit into the picture?

Virtual reality completely occludes the user’s vision and replaces what the user sees with a simulation. Read more about virtual reality and augmented reality in our article.

There’s also Facebook’s take on VR in its Infinite Office, where the user sees a monochromatic version of their physical surroundings (as fed into the headset via a camera). This makes the experience fall somewhere on the spectrum of mixed reality if we use the definition proposed by Microsoft or Unity.

Infinite Office by Facebook
Infinite Office by Facebook. Source: Facebook

Interaction

Another historically significant feature that is said to distinguish mixed reality from augmented reality is interactivity. When using Magic Leap One or HoloLens 2, users can use hand gestures to manipulate digital objects.

Check out how our partnership with Jagermeister resulted in an interactive mixed reality experience where users could “touch” the music with their hands.

With mobile AR, on the other hand, users interact via on-screen gestures.

On-screen interactions in mobile AR
On-screen interactions in mobile AR. Source: Apple

Different Experiences

Lastly, mixed reality headsets simply offer a more powerful way of experiencing augmented reality. The headsets pack more sensors and give us a greater sense of immersion thanks to advanced optics through which the experience is delivered. But it’s still AR.

As you can see, the line between augmented reality and mixed reality is currently blurred to the point of nonexistence. But since the solutions that deliver AR via headsets are often called mixed reality, we’ll stick to this term for the rest of the article.
