Could New Apple AI Chip Help Siri Take on Alexa? Probably Not.

Bloomberg is reporting that a new Apple AI chip is in development. It is part of the company’s strategy to compete with Amazon’s and Google’s AI initiatives and would have the added benefit of reducing battery power consumption. Mark Gurman’s article, citing a person familiar with the matter, reports:

Apple is working on a processor devoted specifically to AI-related tasks, according to a person familiar with the matter. The chip, known internally as the Apple Neural Engine, would improve the way the company’s devices handle tasks that would otherwise require human intelligence — such as facial recognition and speech recognition…

The move follows similar initiatives by Google, Qualcomm and Nvidia to develop AI-specific chips. However, it is unclear how much local speech processing is practical.

Lower Battery Consumption is the Long Game

The battery consumption angle is about handling AI functions on-device as opposed to sending them to cloud services. There are clear benefits to this when Wi-Fi is not available, but even a more sophisticated chip will be limited in the scope of tasks it can handle locally. The Bloomberg article does note that some AI capabilities are already handled on-device in Apple products:

Apple devices currently handle complex artificial intelligence processes with two different chips: the main processor and the graphics chip. The new chip would let Apple offload those tasks onto a dedicated module designed specifically for demanding artificial intelligence processing, allowing Apple to improve battery performance.

This is About Visual Recognition, Not Speech

The third-chip approach seems logical for on-device AI processing. However, few AI processes actually occur on-device today. Whether it is Amazon’s Alexa or Apple’s Siri, the language processing and understanding occurs in the cloud. It would be impressive if Apple could actually bring all of Siri’s language understanding onto a mobile device, but that is unlikely in the near term. It’s not just about analyzing the data; it’s also about having access to information that helps the assistant interpret and respond to requests. The cloud is well suited to these challenges.

More limited facial and image recognition tasks are a different matter. They rely on constraining the domain the device is asked to process locally. A new AI chip could store facial recognition data for a limited set of people and enable personalized user profiles and security access on that basis without requiring cloud access. The key is drawing boundaries around the AI’s function so that it works within the constraints of on-device resources. Some limited voice commands and routines could clearly fit this “domain constrained” model, but human-level natural language understanding is unlikely.
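For illustration only, here is a minimal Swift sketch of what that “domain constrained” pattern could look like today, using Apple’s Vision framework for on-device face detection. The `knownFaces` store, the `embedding` closure, and the matching threshold are hypothetical placeholders rather than anything Apple has announced; they simply stand in for the kind of bounded workload a dedicated AI chip would be well suited to accelerate.

```swift
import Vision
import CoreImage

// Squared-error distance between two embedding vectors.
// Keeping the enrolled set small is what makes matching tractable on-device.
func distance(_ a: [Float], _ b: [Float]) -> Float {
    var sum: Float = 0
    for (x, y) in zip(a, b) {
        sum += (x - y) * (x - y)
    }
    return sum.squareRoot()
}

// Detect faces locally with the Vision framework, then identify them against
// a small, pre-enrolled set of people. `knownFaces`, `embedding`, and the 0.6
// threshold are hypothetical placeholders; Apple does not expose a public
// face-identification API, so a real app would plug in its own Core ML model.
func identifyFaces(in image: CIImage,
                   knownFaces: [String: [Float]],
                   embedding: (CIImage) -> [Float]) throws -> [String] {
    let request = VNDetectFaceRectanglesRequest()      // runs entirely on-device
    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try handler.perform([request])

    var matches: [String] = []
    for face in request.results ?? [] {
        // Convert the normalized bounding box to pixel coordinates and crop the face.
        let box = VNImageRectForNormalizedRect(face.boundingBox,
                                               Int(image.extent.width),
                                               Int(image.extent.height))
        let vector = embedding(image.cropped(to: box))
        // Accept the closest enrolled person if it falls within an illustrative threshold.
        if let best = knownFaces.min(by: { distance($0.value, vector) < distance($1.value, vector) }),
           distance(best.value, vector) < 0.6 {
            matches.append(best.key)
        }
    }
    return matches
}
```

Because the comparison is made against only a handful of enrolled people, the whole pipeline stays within on-device compute and storage budgets, which is exactly the kind of boundary-drawing described above.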

More Rumors Sure to Come

With Apple’s annual Worldwide Developer Conference (WWDC) starting next week, we are bound to hear more rumors about the latest initiatives from Cupertino. The Apple AI chip rumor complements credible forecasts about a planned Amazon Echo competitor. Apple seems to be slipping behind its tech rivals in the AI arms race, but the company has successfully come from behind before, so expect some big announcements next week.
