Tailor it swift: Personalized interfaces can turbo-charge your mobile journey


Smartphones, wearables and other smart devices are redefining the contours of personalization and device interfaces. Personalization is not just a new normal; it is the opium of a mobile, customer-obsessed world. And it is this obsession with customers and the experience they derive that is defining how user interfaces will be designed and moulded. With personalized interfaces moving towards the next stage of maturity, "intelligent personalization" is becoming the order of the day. Personalization, seguing from basic user-driven customization of colours and layouts, is wearing a "human" face by leveraging machine learning and cognitive intelligence. Aravind Ajad Yarra (AAY), a Distinguished Member of Technical Staff at Wipro whose expertise lies in smart and mobile applications, speaks with Bhaswati Mukhopadhyay (BM) from Wipro's Thought Leadership Team on how personalized interfaces can take mobile experiences to new highs.

BM: What are personalized interfaces? Will they extend the boundaries of all things mobile? How?

AAY: In this increasingly connected world, we are moving away from one-way user interfaces to those that are highly personalized: interfaces that sense our preferences, and predict and pre-empt what we want to do, at what time, and more. Today, mobile is not just about holding a smartphone; it is about having a deeply personalized interaction with a device, be it a conversation with a voice assistant like Alexa or immersion in an augmented world while on the move. You may be surprised to know that our mobile devices sometimes carry more than 20 sensors that provide valuable information related to, say, our health or even our mood swings. What personalized interfaces are really doing today is not just picking up random behaviour patterns and data points, but offering insights that are contextual and non-intrusive while being relevant to our needs. A case in point is the way the good old alarm can be made "smart" and personalized: sensing our body movement, heart rate and the natural light in the room to decide the right time to wake us up.
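As a rough illustration of how such a "smart" alarm might weigh its inputs, here is a minimal rule-based sketch. The sensor fields, thresholds and the SensorSnapshot structure are hypothetical, not a real wearable API; a production system would learn these thresholds from the user's sleep history.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    # Hypothetical readings a phone or wearable might expose
    movement: float       # 0.0 (perfectly still) .. 1.0 (restless)
    heart_rate: int       # beats per minute
    ambient_light: float  # lux from the phone's light sensor

def in_light_sleep(s: SensorSnapshot) -> bool:
    """Heuristic: light sleep shows more movement and a higher heart rate."""
    return s.movement > 0.3 and s.heart_rate > 55

def should_wake(s: SensorSnapshot, minutes_to_alarm: int) -> bool:
    """Fire up to 30 minutes early if the user is already in light sleep
    or natural light suggests sunrise; otherwise wait for the set time."""
    if minutes_to_alarm <= 0:
        return True
    if minutes_to_alarm <= 30 and (in_light_sleep(s) or s.ambient_light > 50):
        return True
    return False

# Example: 20 minutes before the alarm, the user stirs in a brightening room
print(should_wake(SensorSnapshot(movement=0.4, heart_rate=62, ambient_light=80), 20))  # True
```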

BM: Are personalized interfaces only about contextualization?

AAY: Personalization leverages context, no doubt. What we see is contextualization injecting "intelligence" into user interfaces, and what helps us in this regard are machine learning (ML) and artificial intelligence (AI). When real-time data fraternizes with ML and AI, what we get is a heady mix: real-time personalization. Personalizing the e-commerce experience based on contextual information such as search queries and past shopping behaviour is not new. Processing data in real time and applying AI models make contextualization useful in areas beyond e-commerce. For instance, an Oil & Gas company is processing data from a variety of sources, including sensors and task workloads, to build highly personalized interfaces that an engineer operating in a remote location can use to take real-time decisions on exploration and drilling. Personalization, however, can go beyond contextualization into the zone of subtle behavioural aspects of users, such as emotions. Emotions are what make interfaces interact; they lend a certain "personality" to the interface. Delivering a personalized experience based on emotions can be as simple as parsing a user's voice input and responding accordingly. For example, if a bank's service assistant chatbot senses that the user is getting restive and angry, it hands the conversation over to a human assistant to pick up the thread. And this is done in such a seamless way that I, as a user, would not even know when the handover happened.
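A minimal sketch of such an escalation rule follows, assuming a sentiment score in [-1, 1] from some analyser. The analyse_sentiment stub, its keyword list and the thresholds are placeholders for illustration, not any bank's actual logic; a real system would use a trained sentiment model over voice or text.

```python
def analyse_sentiment(utterance: str) -> float:
    """Placeholder: in practice this would call a trained sentiment model.
    Returns a score in [-1.0, 1.0]; negative means angry or frustrated."""
    angry_markers = ("useless", "angry", "ridiculous", "cancel", "complaint")
    return -0.8 if any(w in utterance.lower() for w in angry_markers) else 0.2

def route_turn(utterance: str, recent_scores: list) -> str:
    """Escalate to a human agent when the running mood trends negative."""
    recent_scores.append(analyse_sentiment(utterance))
    window = recent_scores[-3:]                 # look at the last few turns only
    if sum(window) / len(window) < -0.4:
        return "human_agent"                    # seamless handover point
    return "chatbot"

scores = []
print(route_turn("What is my card balance?", scores))               # chatbot
print(route_turn("This app is useless", scores))                    # chatbot (mood dipping)
print(route_turn("I am angry, I want to file a complaint", scores)) # human_agent
```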


BM: What are the key things we need to focus on for creating next-gen interfaces?

AAY: The focus has to be on two key areas: building interfaces that need very little cognitive attention, and creating interfaces that are truly immersive and useful, so that they do not fail to hold the user's attention. The type of interface is also something that needs attention. Next-gen interfaces will be more mobile-enabled and should leverage more sensory triggers (voice, brain, eye, other biometrics) than they do now. Building these interfaces requires integrating with a lot more sensors and crunching more data coming out of those sensors. Mobile processors with more computing capability, combined with edge and fog computing, are the way forward. Some of this has already started taking shape; one example is how we can now run a trained ML model on a smartphone, as sketched below. There is no one size that fits all interfaces. One needs to look at the unique characteristics of each interface, be it VR or voice, and design the personalized interface in keeping with them.
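On-device inference of this kind is typically done with a mobile runtime such as TensorFlow Lite. The sketch below shows the equivalent interpreter flow in Python; the "model.tflite" file, its input shape and the float32 dtype are assumptions for illustration, since they depend on whatever model has been converted.

```python
import numpy as np
import tensorflow as tf  # the TF Lite interpreter ships with TensorFlow

# Load a pre-trained, converted model; "model.tflite" is a placeholder path.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one input of whatever shape the model declares (assumed float32 here).
sample = np.random.random_sample(input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()

prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```

The same flow runs on the phone itself via the TensorFlow Lite runtimes for Android and iOS, which is what makes low-latency, offline personalization feasible without shipping sensor data to a server.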

BM: Will interfaces for VR/AR, so far thought to be in the early stages of development, become more personalized? What are the new things we shall see in this space in terms of personalized UI?

AAY: For AR/VR, we have barely scratched the surface in terms of personalization. However, there is immense scope for personalizing AR/VR interfaces as they get embedded in more and more mobile devices, from smartphones to the regular glasses we wear. Virtual and augmented reality can significantly improve personalization in many areas of the mobile experience, from shopping and gaming to workplace productivity. Let's take the example of a customer who is using his smartphone to shop for his family room upholstery. The 3D sensor on his smartphone scans the room to figure out its layout, the kind of furniture used, the rest of the décor and even the lighting, and then culls out options for upholstery that could be the right fit in terms of colour and texture. Thus, we see the shopping interface getting completely personalized to suit the design and décor sensibilities of the customer. Personalization of interfaces in VR/AR does not stop there. We now have AR capabilities even on smartphones with plain 2D cameras; for example, Team Zuckerberg recently demonstrated Facebook's camera that turns a 2D picture into 3D, betting big on AR.
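One small piece of that pipeline, matching catalogue upholstery to the scanned room's palette, could be as simple as a nearest-colour lookup. The catalogue names and RGB values below are invented for illustration; a real system would work in a perceptual colour space and weigh texture and lighting as well.

```python
# Hypothetical catalogue of upholstery fabrics with representative RGB colours
CATALOGUE = {
    "slate linen":   (112, 128, 144),
    "sand weave":    (194, 178, 128),
    "forest velvet": (34, 85, 51),
    "terracotta":    (204, 102, 51),
}

def colour_distance(a: tuple, b: tuple) -> float:
    """Euclidean distance in RGB space; a real system would use a
    perceptual space such as CIELAB, but RGB keeps the sketch short."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def suggest_fabrics(room_palette: list, top_n: int = 2) -> list:
    """Rank fabrics by their closeness to any dominant room colour."""
    scored = [
        (min(colour_distance(rgb, room) for room in room_palette), name)
        for name, rgb in CATALOGUE.items()
    ]
    return [name for _, name in sorted(scored)[:top_n]]

# Dominant colours the 3D scan might extract: a grey wall, a warm wood floor
print(suggest_fabrics([(120, 120, 125), (150, 111, 51)]))
# -> ['slate linen', 'terracotta']
```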

BM: Let's talk at an enterprise level. What are the top things in an organization's strategy for ushering in personalized interfaces?

AAY: On the enterprise front, building personalization requires overhauling the interface that is currently in use. The focus has to shift from simplicity and standardization to a more dynamic and adaptive interface that caters to user preferences in a seamless way. Personalization should be a key part of an organization's technology strategy; it can't just be a tactical initiative. New interfaces bring new challenges with them, so strategy at an organization level is imperative.

• A platform approach is key to building the basic capabilities required for personalization. Just as we made Web servers and content management part of our Web application infrastructure, next-gen personalized interfaces will require mobile application platforms enhanced with context brokers and real-time event processing engines.

• Given that personalized interfaces are largely dependent on multiple threads of data gathered from users, handling privacy and security will be an important part of organization strategy.

• Protecting personally identifiable information (PII) has to be a priority. Secure vaults, homomorphic encryption, data obfuscation and anonymization will play an important role.

• Personalized experience design requires a more experimental approach to see what works and what does not. More investment has to go into automated measurement techniques that go beyond traditional A/B testing for weighing personalization options (see the sketch after this list).

• Centres of innovation and experience labs will play a key role in developing personalized interfaces by experimenting with new technologies such as Google Tango and Microsoft HoloLens.
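One automated technique that goes beyond a fixed A/B split is a multi-armed bandit, which keeps shifting traffic toward whichever personalization variant is performing best while still exploring the others. Here is a minimal epsilon-greedy sketch; the variant names and the simulated engagement rates are purely illustrative.

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy bandit over personalization variants."""

    def __init__(self, variants: list, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.counts = {v: 0 for v in variants}
        self.values = {v: 0.0 for v in variants}  # running mean reward per variant

    def choose(self) -> str:
        if random.random() < self.epsilon:             # explore occasionally
            return random.choice(list(self.counts))
        return max(self.values, key=self.values.get)   # exploit the best so far

    def update(self, variant: str, reward: float) -> None:
        self.counts[variant] += 1
        n = self.counts[variant]
        self.values[variant] += (reward - self.values[variant]) / n  # incremental mean

# Illustrative usage: reward is 1.0 if the user engaged with the variant shown
bandit = EpsilonGreedyBandit(["static_ui", "contextual_ui", "emotion_aware_ui"])
for _ in range(1000):
    variant = bandit.choose()
    engaged = random.random() < {"static_ui": 0.1,
                                 "contextual_ui": 0.2,
                                 "emotion_aware_ui": 0.3}[variant]
    bandit.update(variant, 1.0 if engaged else 0.0)
print(bandit.values)  # the best-performing variant should end up dominating
```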

Reference: "Here's Everything Facebook Announced at F8, From VR to Bots", Wired, April 2017. https://www.wired.com/2017/04/everything-facebook-announced-at-f8

About the author

Aravind Ajad Yarra
Distinguished Member of Technical Staff, Wipro

He is a chief architect and architecture practice leader focusing on emerging technologies and digital architectures. With over 20 years' experience in the IT services industry, Aravind helps clients adopt emerging technologies to build smart applications leveraging Cloud Computing and Mobility. In his previous roles, he worked as a solution architect for several complex transformational programs across the banking, capital markets and insurance verticals. He can be reached at [email protected].

Wipro Limited
Doddakannelli, Sarjapur Road, Bangalore-560 035, India
Tel: +91 (80) 2844 0011
Fax: +91 (80) 2844 0256
wipro.com

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading global information technology, consulting and business process services company. We harness the power of cognitive computing, hyper-automation, robotics, cloud, analytics and emerging technologies to help our clients adapt to the digital world and make them successful. A company recognized globally for its comprehensive portfolio of services, strong commitment to sustainability and good corporate citizenship, we have a dedicated workforce of over 170,000, serving clients across six continents. Together, we discover ideas and connect the dots to build a better and a bold new future.

For more information, please write to us at [email protected]
