Artificial intelligence (AI) is one of the most exciting technological growth areas of recent years, with some investors, such as technology-focused entrepreneur Tej Kohli, predicting the sector will be worth $150 trillion (£125tn) by 2025. But why do we need the technology in our phones?
Flagship devices today all come equipped with specialised AI processing chips, known as neural engines or neural processing units, from Apple’s A12 Bionic chip to Huawei’s Kirin 980 and Qualcomm’s Snapdragon 845, and more and more tasks are using their advanced processing capabilities.
Google Assistant, Alexa and Siri
The most obvious artificial intelligence in our phones comes from the voice assistants that learn to understand our voice commands and then act appropriately, from telling us the weather to playing our favourite song or adding an appointment to our calendar.
Google, Apple, and Amazon have steered clear of labelling their services as AI so as not to scare away users fearful of a robot takeover, but these services rely on machine learning to function – understanding what you are telling them to do and then performing the right action.
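At its simplest, that machine-learning pipeline maps an utterance to an intent and then to an action. The real assistants use large models trained on huge volumes of speech and text, but a toy keyword-based version illustrates the idea (all the intent names and keywords below are invented for the sketch):

```python
# Toy intent matcher: real assistants use large machine-learned models;
# this keyword version only illustrates the utterance -> intent -> action
# pipeline. Intent names and keyword sets are invented for illustration.
INTENTS = {
    "weather": {"weather", "forecast", "rain", "temperature"},
    "play_music": {"play", "song", "music"},
    "add_event": {"appointment", "calendar", "remind", "schedule"},
}

def classify(utterance: str) -> str:
    words = set(utterance.lower().split())
    # Pick the intent whose keyword set overlaps the utterance the most.
    best = max(INTENTS, key=lambda name: len(INTENTS[name] & words))
    return best if INTENTS[best] & words else "unknown"

print(classify("play my favourite song"))  # play_music
```

Once an intent is chosen, the assistant dispatches to the matching action, such as querying a weather service or starting music playback.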
Possibly the most advanced implementation of any digital assistant is Google’s Duplex service that will make calls and interact with other people and businesses on your behalf. This expands the possibilities of Google Assistant exponentially, as it means that it can not only perform actions on the smartphone or connected Google service, but actually make calls to other businesses to book events or pass information externally. Terrifyingly, Duplex emulates the pauses, “umm”s, and “ahh”s of real people when contacting external people and businesses, so they may never know they were not talking to a real person.
Object and scene recognition
Smartphone cameras have improved with better sensors and more megapixels in recent years, but the biggest improvements to our snapshots recently have come from AI that interprets the scene we are shooting and processes the image accordingly. The Huawei Mate 10 was the first smartphone to introduce an AI image processing unit that could interpret 13 different scene types and in-scene objects, from dogs or cats playing to sunsets or snowscapes, so even the worst photographers amongst us benefited from better snaps.
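The final step of that recognition is straightforward: a neural network outputs one score (a logit) per scene class, and the camera app picks the most likely scene to select its processing preset. A minimal sketch of that step, with made-up scene labels and scores:

```python
import math

# Sketch of the last stage of scene recognition: a network has already
# produced one logit per scene class; softmax turns those into
# probabilities and the camera picks the winner. The scene labels and
# logit values are invented for illustration.
def softmax(logits):
    m = max(logits)                      # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

SCENES = ["sunset", "snowscape", "pet", "food", "portrait"]

def pick_scene(logits):
    probs = softmax(logits)
    best = max(range(len(SCENES)), key=probs.__getitem__)
    return SCENES[best], probs[best]

scene, confidence = pick_scene([2.1, 0.3, -1.0, 0.5, 1.2])
print(scene)  # sunset
```

The chosen label then drives scene-specific tuning, such as boosting saturation for a sunset or exposure for a snowscape.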
Most recently, Google impressed with its AI image processing on the Pixel 3 range of phones, where the sensors and lenses were not as impressive as those in similar flagships from the likes of Apple or Samsung, but the resultant imagery was generally better. AI processing is not a replacement for better hardware and photography talent, but it goes a long way to making sure everyone’s photographs look better and will generate more likes on social media.
Selfies remain one of the most popular uses for phone cameras, and artificial intelligence is used to improve those images more than any other. Blurring the background may seem a simple “trick”, but in reality it takes a huge amount of AI processing for your smartphone to work out where your face and messy hair end and the background begins. When Google released this feature on its Pixel 2 range of smartphones in 2017, the search giant said it trained the underlying model on over a million images.
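Once a segmentation model has produced a per-pixel mask separating subject from background, the compositing step itself is simple: keep the masked pixels sharp and swap the rest for a blurred copy. A heavily simplified one-dimensional sketch (the tiny "image", binary mask, and box blur are stand-ins for a real photo and learned mask):

```python
# Sketch of portrait-mode compositing: given a per-pixel foreground mask
# (in practice produced by a learned segmentation model), keep the
# subject sharp and replace the background with a blurred copy.
# A 1-D list of brightness values stands in for a real image.
def box_blur(pixels, radius=1):
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def portrait(pixels, mask, radius=1):
    blurred = box_blur(pixels, radius)
    # mask[i] == 1 means "subject": keep the original pixel there.
    return [p if m else b for p, m, b in zip(pixels, mask, blurred)]

img = [10, 10, 200, 200, 10, 10]   # bright "subject" in the middle
mask = [0, 0, 1, 1, 0, 0]
result = portrait(img, mask)
```

The hard part on a phone is not the blur but the mask: getting the subject/background boundary right around hair and glasses is what requires the heavyweight AI processing.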
Shooting in the dark has traditionally been a major problem for smartphone cameras, as the compact sensors and short exposure times have left us with dark and grainy images. However, thanks to some clever AI, the latest smartphones manage to emulate long exposure times by capturing a whole burst of shots at different exposure levels and merging the results into a single image.
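The core of that merging step is averaging: each frame is normalised for its exposure and the frames are combined, which cancels out much of the random sensor noise. Real pipelines also align the frames and weight individual pixels, but a stripped-down sketch (with invented frame and exposure values) shows the principle:

```python
# Sketch of multi-frame night-mode merging: normalise each frame of a
# burst by its relative exposure, then average, which reduces random
# sensor noise. Real pipelines also align frames and weight pixels;
# the frames and exposure values below are invented for the sketch.
def merge_burst(frames, exposures):
    n = len(frames)
    merged = []
    for i in range(len(frames[0])):
        merged.append(sum(f[i] / e for f, e in zip(frames, exposures)) / n)
    return merged

# Three frames of the same scene at 1x, 2x and 4x relative exposure.
frames = [[10, 20, 30], [20, 40, 60], [40, 80, 120]]
merged = merge_burst(frames, [1, 2, 4])
print(merged)  # [10.0, 20.0, 30.0]
```

Because noise varies randomly between frames while the scene does not, averaging many short exposures approximates one long, clean exposure without the motion blur a real long exposure would cause.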
Many flagship devices now come with an advanced AI night mode, but it was again Huawei that was first out of the gate, with its P20 Pro and Mate 20 Pro devices last year, which at the time had little competition for night-time photography.
As the technology continues to be developed, we are likely to see technology firms introduce AI to a wide array of services, with integrations that are ever more seamless. The question will soon be whether it matters if some of our interactions are with bots, so long as they get the job done efficiently and politely.