Editor’s Note: Semil Shah works on product for Swell, is a TechCrunch columnist, and an investor. He blogs at Haywire, and you can follow him on Twitter at @semil.

Leading up to last week’s Apple event and the unveiling of the iPhone 5s, the Internet may have led you to believe that “Android is better.” The reality for startups, however, is more nuanced. When it comes to the first platform developers choose for starting in mobile, Apple still leads the competition; mobile expert Steve Cheney asserts these advancements place iOS roughly 18-24 months ahead of Android. (If you haven’t read Cheney’s post from early August 2012, please do -- he called much of this months ago.) And while the runaway growth of Android has shrunk the lag before hot iOS apps arrive on Android -- Instagram released its Android app right before its Series B, while Snapchat already had an Android client by the time of its Series A -- for small technology startups, iOS remains the platform of choice upon which to build new mobile experiences. Advancements in mobile hardware and the corresponding OS improvements often present new opportunities for software developers to exploit: Instagram used software to improve image resolution leading up to the iPhone 4, and more recently, the improved front-facing camera in the iPhone 5 helped Frontback rise in popularity.

We’ve all read about the new iPhone 5s by now, but what about its specific hardware improvements, and how they could create new opportunities for iOS developers? From my informal conversations, startup developers are not moving to Android anytime soon, despite what you may have read, and the intermingling of these advancements with iOS 7 could make for an even more innovative future for mobile computing. Let’s start with the most important mobile sensor: the camera.
On the iPhone 5s, an improved camera helps people take better photos in low light, thanks to a wider aperture and features like the True Tone flash that don’t even appear on some DSLRs. These continuous improvements to still photos, video capture, and playback keep most Android handsets in the rearview mirror (though some objects in the mirror may be closer than they appear), and the new camera keeps Apple on a path of devouring more of the point-and-shoot market’s dollars. As for what developers could do with the next iterations of the iPhone camera, I’m waiting for depth-sensing cameras, which can determine distances between objects, and potentially even better software to recognize the objects themselves. As the first five years of the iPhone have demonstrated, there is no greater communication currency than images on our mobile devices, and camera-related apps remain among the most popular on all of iOS. It is early innings for digital imaging, and when it comes to mobile and the rate at which pictures are captured and shared, the scale is accelerating.

When it comes to location tracking, the M7 coprocessor presents groundbreaking opportunities. To date, most quantified-self endeavors have involved an external piece of hardware that captures data from our wrists and sends it to our phones or computers via USB uploads, Wi-Fi transfers, or other methods. Now the iPhone comes with a separate “coprocessor,” the M7, which captures fine-grained data about a user’s movements -- fusing readings from the accelerometer, gyroscope, and compass -- without drawing power from the main chip. iOS apps like Moves, and Highlight before it, convinced an early-adopter set to give up precious battery life in order to get valuable location data, but people remain concerned about battery limits, which may be a few years away from the fundamental shift in battery tech (from ions to electrons) many are waiting for.
Now with the M7, iOS developers can write apps that read the data the M7 records and build new mobile experiences on top of it. Furthermore, the M7 can inform the OS itself to be more intelligent about location and, in turn, make the OS and other apps behave more contextually. It will be interesting to see how the fidelity of the M7 stacks up against its potential (for instance, in a popular app like Strava), and what the effect of all of this could be on fitness wearables going forward. Moreover, now that everyone expects Apple to ship a smartwatch at some point, many wonder if the M7 could also be placed inside a watch (and other wearables) to communicate relevant data to another nearby interface.

Speaking of processors, there’s Apple’s new 64-bit A7. Apple claims the A7 delivers up to twice the CPU and GPU performance of its predecessor -- faster than anything put inside a mobile phone to date -- and its 64-bit architecture brings the iPhone’s processing capabilities closer to those of Mac apps. So there’s faster compute, but also more headroom for resource-intensive applications, especially the graphical demands of games, the category that dominates the iOS App Store’s successes. While resource-intensive apps may place extra strain on the battery, the effects of the A7 may be slow to trickle out, as most apps will still be written to 32-bit specifications. Reading the documentation leads me to believe much of this advancement helps Apple build more of a beachhead into mobile gaming and a future where games are played on many screens.

Touch ID reimagines what “touch” means. This is the most mainstream sci-fi advancement in the iPhone’s hardware and software. In talking with iOS developers over the last week, they’re very excited about the possibility of the fingerprint sensor unlocking more downloads, since users won’t get hung up on remembering and entering yet another password. Beyond this, I expect Touch ID to help with OAuth and in-app logins.
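To make the M7 opportunity concrete: the motion data it records is exposed to developers through the Core Motion framework that ships with iOS 7. A minimal sketch of querying a week of recorded activity -- written in modern Swift for readability, with all handling logic hypothetical -- might look like:

```swift
import Foundation
import CoreMotion

// Sketch: reading the M7's motion-activity history via Core Motion.
// Requires a device with the M7 coprocessor (iPhone 5s or later).
let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    let weekAgo = Date(timeIntervalSinceNow: -7 * 24 * 60 * 60)
    // Query up to a week of activity the M7 logged even while the app
    // was not running -- no battery-draining background polling needed.
    activityManager.queryActivityStarting(from: weekAgo,
                                          to: Date(),
                                          to: .main) { activities, error in
        for activity in activities ?? [] {
            if activity.walking { print("walking at \(activity.startDate)") }
            if activity.running { print("running at \(activity.startDate)") }
        }
    }
}
```

This is what makes the M7 different from the external trackers mentioned above: the data is already on the phone when the app asks for it, rather than synced from a separate device.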
I also expect the protocol to be opened up to other apps to help tie biometric identity to each iTunes account, which should grease the wheels for easier in-app purchases (which also require iTunes passwords, and net Apple 30 percent tolls) and potentially app-commerce on the whole.

iBeacon creates new local, private sharing opportunities. Much has been written about the effect of iBeacon on NFC, so I won’t go into that here, other than to say that iBeacon’s ability to let devices and their applications be positioned indoors -- where cellular and Wi-Fi signal strength varies -- could unleash a wave of entirely new consumer experiences. iBeacon enables iOS devices to share information over short distances by harnessing Bluetooth Low Energy, without the need for cellular data or Wi-Fi networks. This means iOS users could, for example, initiate, split, or receive payments among friends or stores through iBeacon, or share documents and images. Users could receive retail information based on their micro-location indoors, get better indoor navigation, and even “check in and out.” Imagine that: mobile check-ins may come back in style because they could be passive, without destroying battery life. From a commercial point of view, with companies like Euclid successfully providing real-world analytics to retailers by using Wi-Fi signals to identify unique device IDs, iBeacon creates another layer of quantitative data for retailers (the type Estimote focuses on), and could be one of the first steps for connected devices to pry open a market for the Internet of Things.

In catching up on all the technical documentation, news, and analysis from this week, it became apparent to me that most of the headlines covered the specs but missed the forest for the trees. With iOS 7, that forest is a world of iOS interconnectedness beyond just iPhones.
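The iBeacon micro-location described above surfaces through Core Location: an app declares interest in a beacon UUID and receives periodic callbacks with the beacons in range and a rough proximity bucket. A sketch, again in modern Swift -- the UUID and identifier are made-up placeholders, not any real retailer’s values:

```swift
import Foundation
import CoreLocation

// Sketch: ranging a hypothetical store's beacons via Core Location.
class BeaconRanger: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    // Placeholder UUID and identifier for illustration only.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.store")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    // Called roughly once a second with the beacons currently in range.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            // proximity is a coarse bucket: .immediate, .near, .far, .unknown
            print("beacon \(beacon.major)/\(beacon.minor): \(beacon.proximity.rawValue)")
        }
    }
}
```

The major/minor values are what let a retailer distinguish one store, or one shelf, from another -- the raw material for the passive check-ins imagined above.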
When all the new advancements of iOS are considered together, Apple’s mobile future hints at a world where iOS devices and their apps freely and efficiently communicate with one another at an intra-OS level, in ways not yet possible across a fragmented Android handset and OS landscape. So when it comes to early-stage technology startups, iOS 7 again pushes the boundaries of the applications developers can build and put into the wild: devices become more aware of context, users can touch more than they type, new location-based opportunities and data emerge, and other mobile devices (like watches) could effortlessly communicate with our mobile computers. The future of these new mobile experiences is exciting, and I can’t wait to see what developers cook up.