Both iOS and Android have undoubtedly transformed how we interact with devices and how we design digital experiences, but Apple’s closed developer system has run its course, and it’s time for a change.
In less than a decade we’ve moved from phones tethered to the kitchen wall by a cord to being tethered to our phones from morning to night. Despite the iPhone’s impact, it remains a very closed system: all iOS apps must adhere to Apple’s strict terms, and Siri (a truly powerful piece of software) is still not open to developers. Now, though, there are rumors of a sea change, and developers may soon have access to a Siri API.
With Google Glass and the GDK, however, you can create a Glass app with few limitations, and even add unique voice commands to Glass. For instance, let’s say we want to solve a common problem: capturing to-dos that come to mind while you’re busy and on the go. We’ll create a Google Glass app called “Reminder” that lets a user dictate personal to-do lists and items using a custom voice command like “Ok, Glass, remind me to…”
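To make this concrete, here’s a minimal sketch of how a Glass app registers a voice trigger with the GDK: you declare a service (or activity) in the manifest that listens for the GDK’s `VOICE_TRIGGER` action and point it at an XML resource defining the command phrase. The app name “Reminder,” the service name `ReminderService`, and the string resource are assumptions for illustration.

```xml
<!-- AndroidManifest.xml (excerpt) — hypothetical "Reminder" Glass app -->
<!-- Unlisted/custom voice commands require the GDK development permission. -->
<uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT" />

<application>
    <service android:name=".ReminderService">
        <!-- Launch this service when the user speaks the voice trigger -->
        <intent-filter>
            <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
        </intent-filter>
        <!-- Points to the XML resource that defines the trigger phrase -->
        <meta-data
            android:name="com.google.android.glass.VoiceTrigger"
            android:resource="@xml/voice_trigger" />
    </service>
</application>

<!-- res/xml/voice_trigger.xml -->
<!-- keyword references a string resource, e.g. "remind me to" -->
<trigger keyword="@string/voice_trigger_remind" />
```

With this in place, the phrase shows up in the “Ok, Glass” voice menu, and speaking it launches the app directly, no touch input required.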
Google allowing developers to submit new voice commands at the OS level is a simple but powerful aspect of the platform. Imagine what we could create if Siri allowed for the same. What could designers and developers build with the ability to add custom voice commands? Therein lies a bit of tragedy with Apple and the iPhone: such amazing technology has transformed culture and behavior, and yet it remains a closed platform, cut off from the third-party innovations and ideas that could improve people’s lives.