The Underwhelming Apple HomePod Launch. Blame it on Siri and Maps.
Apple announced the HomePod smart speaker last week and with it introduced a new competitor to Amazon Echo and Google Home. Well, a competitor that will arrive in six months. But the December delivery date wasn’t the only launch detail that disappointed observers. When Amazon Echo was introduced, the focus was on all of the things the Alexa voice assistant could do. The same was true for Google Home and the many capabilities of Google Assistant. The focus of the Apple HomePod launch was almost entirely on the device hardware. The sound quality delivered by the tweeters. The sound quality of the bass. The microphone. Siri was an afterthought.
Blame it on Siri
Apple has a reputation for quality. It is hard for the company to enter a new market and be perceived as inferior to competitors. Siri has feature depth for a few capabilities, but lacks breadth. As a result, it looks less impressive than Amazon Alexa and Google Assistant. Apple has focused on making Siri really good at a few things across a number of languages, so the Siri feature list simply looks anemic compared to its rivals. And third-party developers only have access to Siri for nine limited domains such as workouts, messaging and QR codes. So, the strategy to emphasize the device over the voice assistant makes sense.
Siri is shackled with the need to be really good at a few things in part because of its legacy. A reputation overhang from quality issues in the early days after Siri’s initial launch still persists. Apple’s advertising overpromised and the product underdelivered. Many people still think Siri isn’t very reliable in terms of voice recognition. Even if most of those early issues have been addressed, any Siri performance expectations that are not met risk amplifying old concerns.
As voice assistants have become more robust, Siri’s narrow domain knowledge has led to a new set of critiques. The legendary digital product reviewer Walt Mossberg opined last fall:
Why does Siri seem so dumb?
Mr. Mossberg was not alone. Kirk McElhearn of Macworld wrote a 2016 article titled, “Siri-ous Mistakes: Apple’s personal assistant still needs a lot of work.” The subtitle? “Siri performs well in Apple’s ads, but in real life, it’s a different story.” He goes on to enumerate a few simple tasks that Siri handles well and then lists many more examples where it fails miserably. If Apple were ready to take on these criticisms and compete effectively against Alexa and Google Assistant, the HomePod announcement would have focused far more on Siri. Apple wasn’t ready with great software, so it retreated to hardware specs.
Blame it on Maps
The other Apple baggage at play is the memory of the ill-fated 2012 launch of Apple Maps for iOS 6. In that case, Apple introduced its own maps app and removed Google Maps as the default, meaning the expectation was to match or exceed the Google capabilities most people were already using. Tim Cook’s first major product release as CEO led The Verge to report:
Apple’s buggy iOS 6 maps lead to widespread complaints.
The incident also led Cook to issue a public apology for the fiasco. You can bet that Cook remembers this period and doesn’t want to repeat it by overpromising on Siri capabilities. The company has seemed less bold and more cautious ever since.
Is Sound Quality Really an Important Differentiator?
You can already connect Amazon Echo to high-end Bluetooth speakers. In fact, Apple even showed us the two paired up. The point made by Apple’s Phil Schiller was that, until now, speakers offered either great sound quality or access to an intelligent voice assistant like Alexa, but not both. He may not have realized that Amazon Echo has been integrated with Sonos since August 2016. And that’s not all. You can choose Bose or any other Bluetooth speaker, or tether an Amazon Echo Dot to a high-end speaker through its 3.5 mm wired output.
Apple positioned the HomePod as a “breakthrough home speaker.” It may be. The onboard A8 chip powers multi-channel echo cancellation and real-time acoustic modeling that supposedly optimizes sound based on the dimensions of the room. This may help capture the audiophile market and will no doubt entice many Apple fanatics, but it is not an obvious winner in a market that now has multiple offerings.
Besides, there was no mention of Spotify or Pandora. Apple Music may be the number two music streaming subscription service worldwide with 27 million subscribers, but Spotify has more than twice the subscribers and over 100 million monthly active users in total. Pandora has 81 million monthly active users in the US alone. Will these services be accessible through HomePod? Probably, but if you are reinventing home music listening, these are important details to people who already have music service preferences.
Does Music Matter for Voice Assistants?
It is true that music listening is consistently listed as one of the top uses for Amazon Echo. A recent comScore study found that 54% of Amazon Echo owners report using it to listen to music. A GfK survey puts that number at a little over 60%. RadioCentre found that 63% of UK residents use the devices to listen to radio in the morning and 54% in the evening. Streaming services were used but less frequently. Listening is an important use case.
However, is music listening why people are purchasing Amazon Echo and Google Home today? Once you have an audio device in your vicinity, you are likely to start using it for music, podcasts and radio. That doesn’t mean music was the primary purchase motivator. People are buying what is new, and what is new is the AI-based voice assistants: Alexa, Google Assistant and soon Microsoft’s Cortana.
Apple is Missing an Opportunity
The focus on speaker quality is genuine to a degree. The HomePod may well be a great speaker. However, the audio quality focus is primarily about posturing. Apple is ignoring the feature that is weak, Siri, and promoting an area that its rivals are not stressing. That may be the best approach given the current state of Siri’s technology.
The bigger issue is that Apple is missing an opportunity. Apple stressed that Siri has 375 million monthly active users. If true, this makes it the most widely used voice assistant globally. But used for what? Executing a few transactional tasks like sending a message, making a phone call or setting a timer? By constraining the domains that Apple users can access, Apple is not accumulating training data around new capabilities and information sources. Google Assistant and Amazon Alexa are capturing this information, learning and improving.
This is how machine learning works, and neural networks in particular: the algorithms aren’t perfect when they start, but they get better over time. Apple’s insistence that Siri only deliver bulletproof experiences is out of step with this approach, in part due to history but also due to philosophy. It’s a philosophy that makes it hard to fully commit to machine learning, much less deep learning, where errors are expected during the training process. Siri isn’t going to get better and catch up to Alexa, Assistant and Cortana unless she is permitted to fail. That is why HomePod was all about the hardware. The Siri software isn’t ready to graduate because she is not offered the opportunity to learn at the rate of her competitors.
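To make that learning dynamic concrete, here is a minimal sketch in Python. It is entirely hypothetical and nothing like the actual systems behind Siri, Alexa or Google Assistant: a toy intent classifier that misclassifies unfamiliar phrasings when it has only a couple of labeled utterances, but handles them once it has accumulated more usage data.

```python
# Toy sketch (hypothetical -- not any vendor's real system): a tiny perceptron
# "intent classifier" that separates "play music" requests from timer/reminder
# requests. The point: the assistant that accumulates more labeled usage data
# covers more of the ways people actually phrase requests.

VOCAB = ["play", "music", "song", "put", "on", "set", "timer", "remind", "me", "minutes"]

def featurize(utterance):
    """Bag-of-words features over a tiny fixed vocabulary."""
    words = utterance.lower().split()
    return [1.0 if w in words else 0.0 for w in VOCAB]

# Labeled usage data in the order it was "collected": 1 = music, 0 = timer/reminder.
TRAIN = [
    ("play some music", 1), ("set a timer", 0),
    ("put on a song", 1), ("remind me in five minutes", 0),
    ("play that song", 1), ("set a timer for ten minutes", 0),
    ("put on some music", 1), ("remind me in an hour", 0),
]

# Held-out phrasings the classifier is never trained on.
TEST = [
    ("put on my song", 1), ("play music", 1),
    ("remind me in twenty minutes", 0), ("set a ten minute timer", 0),
]

def train(examples, epochs=20, lr=0.1):
    """Classic perceptron updates: adjust weights only when a prediction is wrong."""
    weights, bias = [0.0] * len(VOCAB), 0.0
    for _ in range(epochs):
        for utterance, label in examples:
            x = featurize(utterance)
            pred = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
            error = label - pred
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

def accuracy(weights, bias, examples):
    hits = 0
    for utterance, label in examples:
        x = featurize(utterance)
        pred = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
        hits += int(pred == label)
    return hits / len(examples)

# Train on a small slice of the data, then on all of it.
for n in (2, 8):
    w, b = train(TRAIN[:n])
    print(f"trained on {n} examples -> held-out accuracy: {accuracy(w, b, TEST):.2f}")
```

The model trained on just two utterances has never seen the “put on …” phrasing and gets those requests wrong; the model trained on all eight handles them. That, in miniature, is the advantage the wide-open assistants accrue every day at scale.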