The iOS of the future might be able to read your emotions. Without saying a word, without touching a button, your next iPhone might be able to know exactly how you’re feeling or what you need. Could this be what Apple needs to leave Android and Windows Phone in the dust? Quite possibly.
iOS 7 is already more “emotional” than its predecessors
One of the main features of iOS 7 was the removal of unnecessary visual metaphors from the interface. There's no wood, no metal, nothing that obviously mimics real objects. The "furniture" that once stood between the user and the system is gone.
For iOS, the consequence of "tossing the furniture" is that the interface now seems to breathe like a living being. With iOS 7, we see the operating system stripped bare, without distractions. "What do we want people to feel?" That's the question the Apple design team asked themselves recently. And the answer, as the video below elegantly explains, is basic emotions: happiness, surprise, love… connections.
I’ve been chatting about this with several editors here at Softonic, and the verdict is the same: iOS 7 is flexible, elegant and doesn’t get in your way. It’s the perfect preparation for what’s to come.
Emotional software is ready and waiting
Emotions are increasingly important in the world of software, and recognizing them will be an essential part of future operating systems and applications. Imagine an operating system that can read your emotions from the look on your face, your tone of voice, or the electrical resistance of your skin: an operating system capable of guessing your mood.
The Sension app for Glass helps people with autism to recognize emotions (source)
The uses of an "Emotion API" are as exciting as they are numerous: better recommendations when shopping, advertising that's in tune with your mood, and a whole range of possibilities for social games and applications. These affective computing techniques already exist, and software authors are busy exploring their possibilities. For example, Sension, a company in San Francisco, is creating an app for Google Glass that can read emotions.
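To make the idea concrete, here's a minimal sketch of how an app might consume such an "Emotion API" to tune shopping or content recommendations. Everything here is hypothetical: no such API exists in iOS today, and the names (`EmotionReading`, `recommend`) and thresholds are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class EmotionReading:
    """Hypothetical OS-provided emotion scores, each in the range [0, 1]."""
    happiness: float
    surprise: float
    stress: float

def recommend(catalog, reading):
    """Pick catalog items whose tone matches the user's current mood.

    catalog: list of (item_name, tone) pairs.
    The threshold values are arbitrary, chosen only for this sketch.
    """
    if reading.stress > 0.7:
        tone = "calming"
    elif reading.happiness > 0.6:
        tone = "upbeat"
    else:
        tone = "neutral"
    return [item for item, item_tone in catalog if item_tone == tone]

catalog = [("rain sounds", "calming"), ("party mix", "upbeat"), ("news digest", "neutral")]

# A happy, relaxed user gets upbeat suggestions
print(recommend(catalog, EmotionReading(happiness=0.8, surprise=0.2, stress=0.1)))
# → ['party mix']
```

The interesting design question isn't the matching logic, which is trivial, but who computes the emotion scores: a system-level API (as imagined here) would let every app react to mood without each one shipping its own camera and microphone analysis.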
Will you join Apple in the “emotional revolution”?
When I saw that Apple had launched a fingerprint reader system, the intention behind it seemed obvious: establishing direct, discreet contact between the system and the user, with no middleman.
Photomontage of what an emotion-recognition calibration screen for iOS could look like
If Apple released this type of feature in the next few years, it would make perfect sense. User emotions have always played a key role in the design decisions that Apple makes. Some say it’s the secret to Apple’s success.
But transmitting positive emotions through design isn't enough. The next step is to detect and monitor users' emotions precisely, establishing a definitive link between human and machine.
Establishing this link could be the differentiating factor that ensures that iOS can still compete against Android and Windows Phone, which are getting better by the day.