Friday, November 16, 2012

The future of Siri and Apple's services

It's been over a year now since Siri launched alongside the iPhone 4S in October of 2011. When I first saw Siri, it seemed to have enormous potential as: 1) a natural language interface that may one day do to multitouch and graphics what they did to the command line; 2) thanks to that interface, a way for Apple to intermediate and broker search away from Google and towards partner content; and 3) by virtue of that intermediation and brokerage, a gateway into customer insight analytics.

On the client side, I've enjoyed the type of results Siri delivers enough, both in terms of content and presentation, to wish Apple would: 1) hook it into Spotlight so I could still use it when talking would be impossible or inappropriate, or the natural language parser wasn't available; and 2) fix it so the natural language parser wasn't so frequently unavailable. (Purple-dot-purple-dot-purple-dot-nothing is the mouse only randomly getting food.)

Since then, Apple has brokered deals for sports, restaurant, and movie knowledge bases in Siri, including the ability to start table reservations and, soon, movie ticket purchases right from within the service. However, also since then, Google has launched their competing Google Now service. And Google knows services the way Apple knows hardware and software. Google Now offers on-device voice parsing and Google's industry-leading backend infrastructure, and it goes a step beyond Siri by attempting to predictively provide information and answer questions before you even ask them.

Now, Apple has started hiring people away from Amazon to help with the service and, in the wake of a management re-organization, Siri has been given to Apple's "fixer", senior vice president Eddy Cue, to help set, or reset, its course going forward.

Because Siri is only as useful as its weakest server and slowest response, and both those things are going to need some serious attention.

Speed and reliability

It's tough to argue against the idea that the biggest problems Siri faced at launch, and continues to face today, are that it sometimes doesn't work, and that even when it does, it's oftentimes annoyingly slow. Part of that is due to the network. Literally everything you send to Siri needs to go to Apple's servers for parsing and back to your device before you get a response. That's certainly understandable if the result set includes information stored on the internet, but for local tasks like setting an alarm, it's a single point of unnecessary congestion and, all too frequently, failure.

Google switched to on-device voice parsing for Android 4.1, and that should be at the top of Apple's Siri list for iOS 7. Moving all of that on-device is no doubt non-trivial, but removing the burden of the cloud from where it's not needed has so many benefits that it's absolutely worth the effort. That way, not just setting alarms but anything involving local or locally cached data in apps, most especially dictation (say goodbye to purple-dot-purple-dot-purple-dot-nothing!), becomes not only nearly instantaneous but immune to outages.
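
To make that split concrete, here's a minimal sketch, in Swift, of the kind of routing being argued for: intents that only touch local data get handled on-device, and only queries that genuinely need a remote knowledge base pay the network round trip. The types and names are invented for illustration; Apple's actual Siri pipeline isn't public and certainly doesn't look like this.

```swift
import Foundation

// Hypothetical intent categories, purely for illustration.
enum Intent {
    case setAlarm(Date)
    case dictation(String)
    case webQuery(String)   // sports scores, Wolfram|Alpha, and friends
}

enum Response {
    case handled(String)
    case needsNetwork(String)
}

// Handle intents that only touch local data entirely on-device; only
// queries that genuinely need remote knowledge bases pay the round trip.
func route(_ intent: Intent) -> Response {
    switch intent {
    case .setAlarm(let date):
        // Local task: no trip to Apple's servers, no outage exposure.
        return .handled("Alarm set for \(date)")
    case .dictation(let text):
        // On-device recognition keeps dictation working offline.
        return .handled(text)
    case .webQuery(let query):
        // Only these requests inherit the network's latency and failures.
        return .needsNetwork(query)
    }
}
```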

Sports scores, movie listings, Wolfram|Alpha queries, restaurant table bookings, and anything else that absolutely has to hit the internet would still be slower and riskier, but even local map and point-of-interest data could be cached locally, greatly reducing the dependency on Apple's backend. And about that backend...

Infrastructure

The elephant in Apple's room, the wrench in their reliability, is their server-side infrastructure and its glass jaw. Siri and its issues since launch are just one example. Game Center has infamously gone down thanks, perhaps, to the launch of just one popular game. iMessage has had its ups and downs. So has the App Store (in fact, as I write this, App Store downloads aren't working). iOS 6 Maps feels more like a data aggregation, cleansing, and quality assurance issue than an infrastructure one at this point; I haven't seen maps "go down". The Apple Online Store has to go offline simply to be updated (even if there's marketing value to a stunt like taking the store down, there's real-world value to live updates on ecommerce engines).

Google and other competitors like Facebook and Amazon come from the cloud. Their infrastructure isn't as old as Apple's WebObjects past, and it has been their singular focus since their respective launches. As good as Apple is at hardware and software, that's how good Google, Facebook, and Amazon are at the data centers, servers, and services that comprise their clouds.

For Apple to re-create their backend architecture in a way that's more modern and advanced, or even as modern and advanced, as Google's, Facebook's, and Amazon's will be non-trivial. One look at Microsoft's valiant efforts to date in that respect shows just how non-trivial it is to turn an old, stubborn aircraft carrier into a new, shiny helicarrier.

Maybe Apple is already doing that. They're slowly but surely pushing their Objective-C based development platforms forward; maybe they're doing the same thing with their cloud infrastructure. Maybe something just as good as what runs on Macs and iOS devices is being worked on to run iCloud and all of Apple's ancillary services.

If not, however, there should be. And soon. And with massive, billion-dollar efforts spent not on data centers alone but on the next generation of software to run them.

Google, Facebook, and Amazon are buying up apps, developers, and designers to address their cultural weaknesses, and the Sofa, Sparrow, Snapseed, and other teams are hard at work making sure every new generation of native app they release is less embarrassing than the last. And it's working.

Apple has a much harder problem to fix, but that just means they have to work harder at fixing it. Whether it's buying Nuance or former OS X head Bertrand Serlet's new startup (if what it does is even appropriate), or raiding Google, Facebook, and Amazon (again) for every cloud engineer they can, they need to get it done.

Otherwise Game Center, App Store, iTunes, iMessage, iCloud, and yes, Siri will suffer.

API

While engineers and architecture are vitally important to Apple, APIs are what matter to developers. And developers have wanted a Siri API since they first saw Siri announced at the iPhone 4S event. And one still seems unlikely.

Guy English of Kicking Bear explained how Apple's internal secrecy made even the integration of an Apple app, Find My Friends, convoluted. Here's the crux (but read the whole thing, the linen-play is killer):

Apple doesn't appear to have an internal SPI for Siri yet, and it's my bet that they're a year or so away from it. Even internally it appears that they've not yet drafted an approximation. And I don't blame them. For an AI system like Siri that would require determining the plug-in based upon confidence that it could handle the request. How do you write an API that gauges the trustability of thousands of plug-ins to properly report their confidence?

Guy also talks about hand-off collisions in the first episode of Debug, where different apps offer up potentially overlapping knowledge sources, and the Siri AI has to try to figure out which one gets what and when. A pop-up requester, the kind Siri already uses to offer up different contacts or locations, could handle the obvious stuff, but not everything is obvious. Good natural language parsing is all about subtlety, context, and yes, nuance.
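
To make the problem concrete, here's a rough Swift sketch of what arbitration between self-reporting plug-ins might look like, with a learned trust weight as one possible answer to Guy's question and the pop-up requester as the fallback when scores are too close to call. The protocol and types are entirely hypothetical; this isn't any real Apple SPI.

```swift
import Foundation

// A hypothetical plug-in interface: nothing like this exists publicly.
protocol SiriHandler {
    var name: String { get }
    // Self-reported confidence, 0.0...1.0, that this handler can answer.
    func confidence(for utterance: String) -> Double
}

struct Arbiter {
    // Trust learned per handler (say, from past accept/reject rates), so a
    // plug-in that always shouts 1.0 gets discounted over time.
    var trust: [String: Double]

    // Returns a single winner, or several candidates when the weighted
    // scores are too close to call and a pop-up requester is needed.
    func resolve(_ utterance: String, handlers: [any SiriHandler]) -> [any SiriHandler] {
        let scored = handlers
            .map { ($0, $0.confidence(for: utterance) * (trust[$0.name] ?? 0.5)) }
            .sorted { $0.1 > $1.1 }
        guard let best = scored.first else { return [] }
        // Anything within 10% of the leader goes to user disambiguation.
        return scored.filter { $0.1 >= best.1 * 0.9 }.map { $0.0 }
    }
}
```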

A Siri API wouldn't just have the potential for conflicting app hand-offs, but for conflicting with Apple's partnership deals. That goes back to Apple using Siri as a way to intermediate and broker search. An API intermediates Apple. What value would a content deal have between Apple and Yelp, or Apple and OpenTable or Fandango or anyone else, if any app could hook into an API "for free"? Right now Apple seems to want to handle Siri access the way they handle Apple TV access, through closed partnering deals rather than open access.

That sucks for developers, and may or may not suck for users. Apple might feel controlling access provides a better, saner experience, even if many power users would disagree -- the classic conflict.

Either way, I'd argue fixing Siri's speed and reliability, fixing iCloud's backend infrastructure, and adding in predictive functionality should all be done way before Apple even considers taking on the responsibility of a Siri API.

Functionality

Beyond speed and reliability, architecture and API, for end users Siri is still a mixed bag when it comes to functionality. Even with Apple's built-in apps, there's no shortage of inconsistency. For example, Siri can compose both emails and messages, but can only read incoming messages, not emails. That Siri will tell you it can't do certain things shows that the natural language and contextual parsing knows what you want to do; the ability to do it simply hasn't been implemented, turned on, or allowed. Over a year later, and Siri still doesn't provide basic Settings toggle functionality.

Kontra, questioning whether Siri is Apple's future on Counternotions, points out the advantage that contextually aware, targeted search has over Google's traditional, linear search algorithms. Here's an excerpt, and again, go read the whole thing:

A conventional search engine like Google has to maintain an unpalatable level of click-stream snooping to track your financial transactions to build your purchasing profile. That's not easy (likely illegal on several continents) especially if you're not constantly using Google Play or Google Wallet, for example. While your credit card history or your bank account is opaque to Google, your Amex or Chase app has all that info. If you allow Siri to securely link to such apps on your iPhone, because this is a highly selective request and you trust Siri/Apple, your app and/or Siri can actually interpret what "nice" is within your budget: up to $85 this month and certainly not in the $150-$250 range and not a $25 hole-in-the-wall Chinese restaurant either because it's your mother's birthday.
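
As a toy illustration of the budget-aware interpretation Kontra describes, here's a small Swift sketch that turns recent dining charges from some linked finance app into a "nice but not a splurge" price band. The data source, multipliers, and types are all invented for illustration, not anything Apple or the card apps actually expose.

```swift
import Foundation

struct Restaurant {
    let name: String
    let averageCheck: Double
}

// Interpret "nice" relative to what the user actually spends on dining,
// using a list of recent charges supplied by a (hypothetical) linked app.
func niceWithinBudget(_ candidates: [Restaurant],
                      recentDiningCharges: [Double]) -> [Restaurant] {
    guard !recentDiningCharges.isEmpty else { return candidates }
    let typical = recentDiningCharges.reduce(0, +) / Double(recentDiningCharges.count)
    // "Nice" here means noticeably above the usual spend, but not a splurge:
    // with a typical check around $45, accept roughly $55 to $85.
    let lower = typical * 1.2
    let upper = typical * 1.9
    return candidates.filter { $0.averageCheck >= lower && $0.averageCheck <= upper }
}
```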

That limitation holds even for the excellent Google Search iOS app. In terms of speed and, so far, reliability, it positively schools Siri. Yet it remains trapped in Google's traditional search paradigm.

But not so Google Now. In my experience Google Now isn't (yet) the contextual equal of Siri, but it does something Apple hasn't (yet) been willing to do with Siri: predictive response.

The idea isn't new. Roger McNamee, back when Elevation Partners still owned Palm, pitched the idea that your phone, because it knows where you are, what you have scheduled, and who your contacts are, could alert you if traffic became bad enough that you could no longer make your meeting downtown, and prepare messages to send to the people you were supposed to meet to excuse your tardiness. Instead of a static alarm, it could remind you to leave for a meeting only a few minutes ahead of time if it was down the hall, or hours ahead if it was across town and there'd been an accident. Rather than you asking about the weather, it would know you had a trip planned for Banff and, when snow started to fall, alert you to bring your jacket. It could anticipate, and instead of making you ask for information, it could bring the information to you.

And forget asking to have Wi-Fi or LTE toggled off or on. It could know when you entered or left a trusted network or planned location and just do it for you. Not to mention, "it's 48 hours until your anniversary, dumbass, and you haven't made dinner reservations yet, would you like to see a list of romantic restaurants with seating for 2 still available?"
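
Under the hood, a lot of that is simple arithmetic and rules sitting on top of data the phone already has. Here's a rough Swift sketch of two such triggers, a traffic-aware leave-by reminder and a geofenced Wi-Fi toggle; the travel-time and location inputs are assumed to come from elsewhere (a maps service, the location stack), and none of this reflects how Apple or Google actually implement it.

```swift
import Foundation

// A traffic-aware "leave by" reminder: instead of a static alarm offset,
// subtract the current travel time (from some maps service) plus a buffer.
func leaveByDate(meetingStart: Date,
                 currentTravelTime: TimeInterval,
                 buffer: TimeInterval = 5 * 60) -> Date {
    return meetingStart.addingTimeInterval(-(currentTravelTime + buffer))
}

// A geofenced Wi-Fi rule: enable Wi-Fi whenever the device is inside a
// trusted region, rather than waiting for the user to ask.
struct TrustedRegion {
    let center: (lat: Double, lon: Double)
    let radiusMeters: Double
}

func shouldEnableWiFi(at location: (lat: Double, lon: Double),
                      trusted: [TrustedRegion]) -> Bool {
    // Crude flat-earth distance; good enough at neighbourhood scale.
    func meters(_ a: (lat: Double, lon: Double), _ b: (lat: Double, lon: Double)) -> Double {
        let dLat = (a.lat - b.lat) * 111_000
        let dLon = (a.lon - b.lon) * 111_000 * cos(a.lat * .pi / 180)
        return (dLat * dLat + dLon * dLon).squareRoot()
    }
    return trusted.contains { meters(location, $0.center) <= $0.radiusMeters }
}
```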

Add automatic search widening to that as well, so if the perfect restaurant is 11 miles away instead of 10, or there's nothing Italian available but there is something French, you don't get zero results back, and the future starts to become much more interesting and convenient.
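
Search widening is the easiest piece to picture: instead of returning an empty set, the query relaxes its own constraints a little at a time until something comes back. A minimal Swift sketch under that assumption, with the actual restaurant lookup stubbed out as a closure and the relaxation order (radius first, then cuisine) chosen purely for illustration:

```swift
import Foundation

struct RestaurantQuery {
    var radiusMiles: Double
    var cuisines: [String]
}

// Keep loosening the query until something comes back, or nothing is
// left to relax. `search` stands in for whatever backend does the lookup.
func widenedSearch(start: RestaurantQuery,
                   maxRadiusMiles: Double,
                   search: (RestaurantQuery) -> [String]) -> [String] {
    var query = start
    while true {
        let results = search(query)
        if !results.isEmpty { return results }

        if query.radiusMiles < maxRadiusMiles {
            // First nudge the radius out (10 miles becomes 11, and so on).
            query.radiusMiles = min(query.radiusMiles + 1, maxRadiusMiles)
        } else if !query.cuisines.isEmpty {
            // Then relax the cuisine filter (nothing Italian? try anything).
            query.cuisines = []
        } else {
            // Nothing left to relax: report the honest empty result.
            return []
        }
    }
}
```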

Google Now is doing some of that already, with a nice-looking interface, and creepy as it is, it's convenient enough that many of us probably wouldn't be bothered by the privacy issues any longer than it took us to agree to the access requester.

While Siri was ahead of Google in terms of personal search, Google is getting ahead of Siri in terms of predictive search, and if it takes until iOS 7, presumably in the fall of 2013, for Apple to respond, Google Now will likely be even further ahead.

The bottom line

Whether it was the command line with the Apple II, the GUI with the Mac, or multitouch with the iPhone, Apple has been at the forefront of every major mainstream computing interface revolution in modern memory. They're not there yet with Siri. Siri is the Apple I. The Lisa. The unreleased Safari Pad before the iPhone. Apple needs the Apple II, the Mac, the iPhone version of Siri, or they'll cede the next great interface to the likes of Google Now or Microsoft Kinect, or whatever else comes next.

Services have never been Apple's forte, so the coming revolution could well favor the competition. But that just means Apple has to be bolder and fight harder to win this next battle for the future.

(Seriously, and not trying to be a pain, but we talk about a lot of this with Loren Brichter of Tweetie and Letterpress fame in this week's episode of Debug so check it out if you haven't already.)



Source: http://feedproxy.google.com/~r/TheIphoneBlog/~3/m6C54N37jYY/story01.htm

