The future of apps: Contextual Awareness

The future is upon us…

Imagine your phone reminding you to take an umbrella on your way out because a storm is approaching. Or warning you of traffic ahead and suggesting a faster route as you drive to work, reminding you to stop by the grocery store on your way home, or informing you of a discount on items the moment you walk into a shop, all without you having to look any of this information up yourself! It proactively provides you the right information at the right time and in the right place.

Some of the examples mentioned are already possible today, thanks to advances in Natural Language Processing, Machine Learning and Artificial Intelligence. The future, once seen only in science fiction movies, is happening, and it is happening fast. It's only a matter of time before a “Jarvis-like” assistant becomes a reality.

One word – “Context-awareness”

Context-awareness is the ability of an app to sense and respond to the user's current situation based on the data it has access to. This data can be collected implicitly through the device’s sensors, GPS, radios and so on, and/or explicitly entered by the user.
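To make the distinction concrete, here is a minimal, purely illustrative sketch in Java (all class and field names are invented for this example, not any real API): a single “context snapshot” combines implicitly sensed signals with explicitly entered data, and a simple rule decides when to surface a reminder.

```java
// Illustrative sketch, not a real API: one context snapshot mixing
// implicitly sensed data with explicitly entered data.
class ContextSnapshot {
    // Implicit: collected from GPS and hardware state without user action
    double latitude;
    double longitude;
    boolean headphonesPluggedIn;

    // Explicit: typed in by the user
    int itemsOnShoppingList;

    // A context-aware rule: remind about groceries only when the user is
    // near home and there is actually something left to buy.
    boolean shouldRemindGroceries(double homeLat, double homeLon) {
        double dLat = latitude - homeLat;
        double dLon = longitude - homeLon;
        // Crude straight-line distance in degrees; ~0.01 degrees is roughly 1 km
        boolean nearHome = Math.sqrt(dLat * dLat + dLon * dLon) < 0.01;
        return nearHome && itemsOnShoppingList > 0;
    }
}
```

The point of the sketch is the shape of the logic: the app never asks the user “are you near home?”; it fuses signals it already has and stays silent unless the rule fires.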

The idea behind context-awareness in apps is to provide the user with relevant information without them having to dig for it, a concept which Mark Weiser and John Seely Brown termed “Calm Technology”. The growing popularity of mobile devices has been the driving force behind this technology. More people spend more time on their mobile devices than on desktops, and most of that time is spent in apps. This realization necessitates a paradigm shift towards context-awareness in apps to increase their capabilities and enhance the user experience.

You may have seen rudimentary applications of context-awareness in action in some of the services you use in your day to day lives. One of the biggest proponents of context-awareness, Google, incorporates context-awareness into their services in a number of ways:

  • Google Keep lets you set not only time-based but also location-based reminders that notify you when you are in certain places, and can link your reminders to your Google Calendar
  • Google Photos processes images and categorizes them by faces, things and places for easy searching, and can automatically create photo albums or collages from specific moments
  • Google Play Music and YouTube curate playlists based on your music preferences
  • Google Maps provides ETAs, traffic conditions, opening and closing times of establishments, the fastest route from one location to another, public transport fares, etc.
  • Searching for “Starbucks” in Google Dialer brings up the nearest Starbucks with its phone number, without you having to save it
  • Google Fit detects how far you’ve run, jogged, walked or cycled, and keeps track of your health and fitness goals
  • Google Translate lets you tap on any word in any app to translate it into another language
  • Google Allo can pick up context from a conversation and suggest a reply
  • Google Nest is a home automation system that can be controlled from anywhere over Wi-Fi and adjusts automatically based on your usage
  • Google Home is a Wi-Fi speaker with Google Assistant built in that can converse with you

Furthermore, with the Awareness API, Google aims to push the boundaries of context-aware applications by making it easier for developers to create context-aware Android apps using sensor-derived data. Soon, apps will be able to tell where you are, what you’re doing and what objects are around you, and trigger appropriate responses and suggestions based on that. For instance, when you plug in your headphones and your device detects that you are jogging, your music app can immediately suggest a suitable playlist. Brilliant!
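The Awareness API expresses conditions like this as “fences” that fire when several context signals line up (on Android, real fences are built from classes such as AwarenessFence, DetectedActivityFence and HeadphoneFence). The sketch below is a plain-Java analogue of the headphones-plus-jogging example above; the class and method names are illustrative, not the actual API.

```java
// Plain-Java analogue of an Awareness-style "fence": a condition over
// context signals that triggers a suggestion only when every signal matches.
class PlaylistFence {
    enum Activity { STILL, WALKING, RUNNING, IN_VEHICLE }

    // The fence condition: headphones plugged in AND the user is running
    static boolean fenceTriggered(boolean headphonesPluggedIn, Activity activity) {
        return headphonesPluggedIn && activity == Activity.RUNNING;
    }

    // Proactive suggestion: offered without the user asking, and absent
    // otherwise, in the spirit of "calm technology"
    static String suggestPlaylist(boolean headphonesPluggedIn, Activity activity) {
        return fenceTriggered(headphonesPluggedIn, activity) ? "Workout mix" : null;
    }
}
```

In the real API the fence is registered once and the system calls your app back when it triggers, so the app doesn't have to poll the sensors itself.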

Of course, Google isn’t the only company that employs context-awareness in its services (I used Google for most of this article because I’m heavily invested in their services). Advertising companies have been using it for the longest time to target ads at you. Facebook, Apple, Amazon and Microsoft all have their own personal assistants embedded in their systems, along with many apps that use contextual cues to maximise their usefulness.

Dangers

While the potential of context-awareness seems very promising, it also presents new opportunities for abuse by unscrupulous app developers. Context-awareness thrives on data, lots and lots of data, and a major concern with “big data” is privacy and security. On one hand, we want technologies that are passive and can anticipate our needs before we do (because nobody likes clicking through a bunch of buttons to configure apps anymore); on the other hand, we want to keep certain aspects of our lives private. We don’t want technologies tracking our every move.

However, for context-awareness to accurately anticipate our needs, it has to collect as much data as it can. The more we use context-aware apps, the more they learn about us, our habits and our usage patterns, and the better their responses become. Google Nest, for instance, learns your usage over time and as a result can automatically adjust the lighting, thermostat and so on, to the point that it can even “see” whether you’re home or not. If something like this gets incorporated into Google Home, it could be very powerful. It could also easily go from convenient to borderline creepy. Do you really want your devices eavesdropping on you?

The pervasiveness of modern technology is both a blessing and a curse. It has given us the seamless experience of having our needs catered to before we even think about them. This is the big promise of the Internet of Things, the concept that connected devices everywhere will eventually automate our lives, of which context-awareness is part and parcel. But then we start to realize that our everyday interactions with technology create data that is valuable to various entities out there.

With virtual assistants like Google Assistant accessible everywhere, we leave ourselves open to exploitation. We leave digital footprints everywhere, and we don’t always know exactly what those footprints are or what they are used for. This raises serious questions about privacy and security. Shouldn’t we have access to the data about us? Should we simply trust companies to be responsible with our data? That data could be sold to advertisers to cajole unsuspecting users into buying products.

I believe in the promise of the Internet of Things. The idea of having my needs anticipated before I even know I have them, and taken care of so I can spend the time on something else, is pretty awesome. But knowing that I’m being tracked all the time is quite unsettling. With context-awareness increasingly gaining popularity in mobile computing, it is important that we understand the implications of embracing it. Being aware of the consequences can help us mitigate any negative fallout along the way and reap the benefits.

Brace yourselves for the next app revolution!

Featured image credit: forbes.com

Accountant by day, tech blogger by night.
