Smartlearn
Our final project was with Xaltius, a Singapore-based firm that uses Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and other technologies to create tools that enhance business performance.

The brief was to build an AR-based mobile platform that lets learners explore different pieces of equipment in Augmented Reality.

In addition, we were to help the client list product features, conduct a SWOT analysis of the product, suggest a name for the app, and design its logo and color scheme.
My Role:
UX Researcher, UX Designer, Interaction Designer
Tools Used:
Paper Prototyping, Figma, 3D Model Making
RESEARCH
​
First, to validate the idea and gauge user feedback on the mobile platform, we ran a survey covering learning preferences and awareness of Augmented Reality.
Some of the key results:

- 96% of participants indicated Visual (Spatial) as one of their preferred learning styles, while only 25% preferred Aural (Auditory).

- 80% of participants cited 'Ease of Use' as vital to their digital learning experience, and more than 60% also rated credibility, interactivity, and learnability as important factors.

- 59% were receptive to learning a new tool or platform.

- Most participants' exposure to AR came from entertainment and gaming platforms; only 11% had encountered AR platforms for education.
We also conducted a competitor analysis of some major Augmented Reality training apps in healthcare, manufacturing, and other sectors: TouchSurgery, HaloEye Anatomy, Complete Anatomy, and HP Reveal. This made us aware of how these apps worked, what features they offered, their popularity, and their target markets.
DESIGN
We decided to test the platform physically with a low-fidelity prototype, both to gather some initial data and to gauge the usability of a new platform like this one.

Next, we drew up a user flow:
1. The user points the camera at a piece of equipment they want to learn about.
2. The app identifies the device, and the user confirms it.
3. Text labels pop up in a numbered sequence, marking the parts needed to use the device. When the user goes to a label, they are shown the corresponding step.
4. If the user gets stuck at any point, they can access the troubleshooting section.
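The flow above can be sketched as a small screen-to-screen state machine. This is a minimal illustration only; the screen names, event names, and transitions here are our assumptions based on the flow, not the app's actual implementation.

```typescript
// Hypothetical screens and events derived from the four-step user flow.
type Screen = "scan" | "confirm" | "steps" | "troubleshoot" | "done";
type Event = "deviceDetected" | "confirmed" | "stepsComplete" | "stuck" | "resolved";

// Which event moves the user from which screen to which screen.
const transitions: Record<Screen, Partial<Record<Event, Screen>>> = {
  scan: { deviceDetected: "confirm" },                      // step 1: camera identifies equipment
  confirm: { confirmed: "steps" },                          // step 2: user confirms the device
  steps: { stepsComplete: "done", stuck: "troubleshoot" },  // steps 3 and 4
  troubleshoot: { resolved: "steps" },                      // return to the guided steps
  done: {},
};

// Events that don't apply on the current screen are ignored.
function next(screen: Screen, event: Event): Screen {
  return transitions[screen][event] ?? screen;
}
```

Keeping the flow in one table like this made it easy for us to reason about where a user could get stuck and how they would recover.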
​
We used a simple piece of equipment, a fire extinguisher, and built a prototype with paper labels and a plastic transparency standing in for the mobile interface.
Our findings:

We needed to refine the interaction design of the prototype, since most of the labels were not intuitive enough for users. Users were also unaware that the step-by-step process was in fact teaching them how to use a fire extinguisher. We concluded they needed an onboarding flow.
Round One:
Round Two:
The user flow was revised, factoring in all the input and feedback we received from Round One.

The changes made were:

- Maintenance was taken out.

- Functions was split into two categories: How to Use (step-by-step instructions on how to use the equipment) and Functions (learning the different parts of a device).

- Any information could be accessed in two ways: Text Mode and AR Mode.

- We also incorporated some additional features: Screenshot and Library.
We then decided to build a prototype around a more complex piece of equipment, which also had to be a medical device: a keratometer. We couldn't source one to test with, so we made one ourselves.
Our Findings:

100 percent of users said they preferred learning the device through this app over working through a manual: it was interactive and visual. They liked the AR mode, where the virtual assistant performed the demonstrations.

They still had some issues:

- They wanted to jump straight to a particular step instead of going through the sequence.

- Some of the interactions were not intuitive enough.

But these were also limitations of (a) testing with a physical prototype and (b) the novelty of the platform.