
Bixby Views is What Every Voice Developer Wants and Reinforces Samsung’s Biggest Differentiator

“Write once. Publish everywhere.” For voice developers, this phrase usually refers to the idea of publishing voice apps to multiple voice assistants instead of coding each individually. Bixby Views, which was announced at this week’s Samsung Developer Conference (SDC), offers a different take on the phrase. You don’t have to create new code for your voice app to render properly on Samsung devices beyond the smartphone. A single line of code makes a Bixby capsule (i.e., voice app) accessible on any Samsung device and lets it render properly on that device’s screen. Bixby’s AI will automatically scale the visual user interface (UI) to a watch, refrigerator, television, and so on.
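For context, a Bixby capsule declares the devices it runs on in its capsule.bxb configuration file, and adding a device target amounts to the single line of code described above. The abbreviated sketch below is illustrative only: the capsule id is hypothetical, and the exact target identifiers for non-phone devices may differ from what Samsung ships.

    capsule {
      id (example.voiceApp)           // hypothetical capsule id
      version (1.0.0)
      targets {
        target (bixby-mobile-en-US)   // existing smartphone target
        target (bixby-watch-en-US)    // the added “one line” for watch support (identifier assumed)
      }
    }

With the extra target in place, Bixby Views takes over the job of adapting the capsule’s screens to the new form factor instead of the developer rebuilding the UI by hand.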

Accessing Samsung’s Device Ecosystem

Developers can still define custom rendering for other device types if they choose to do so. However, Bixby Views makes that an option rather than a requirement. This is a particularly important announcement for Samsung, as a core motivation for developers to support Bixby is its availability across the hundreds of millions of devices the company sells annually. The upside for developers is the potential for ubiquitous reach, but the downside could be customizing a Bixby capsule for hundreds of different device specifications. Bixby Views is designed to eliminate that potential downside. When asked what new Bixby announcements he was most interested in, Piyush Hari, founder of Dilli Labs, didn’t hesitate:

Putting Bixby on other Samsung devices particularly the refrigerators…I think that is the competitive edge Samsung has over the other competitors…More often than not, consumers have a Samsung appliance inside their home. If they buy a new one that is Bixby enabled then they have the interface right on their device versus adding a plugin.

Developers Realize Zero UI is a Myth

The idea of Zero UI and its potential to reduce developer time has turned out to be a myth. Voice is a UI, and designing it well is complex. With that said, there was hope that with voice apps developers could at least avoid the complexities of graphical user interfaces (GUI) that invariably need to be ported to dozens or hundreds of form factors. The thinking was that a speaker and microphone are essentially the same on all devices, so developers really could “write once; publish everywhere.”

But those voice-only experiences didn’t last long. Soon, Amazon, Google, and other OEMs introduced smart displays and started demonstrating voice apps on smart TVs. Google insisted that all Actions for Assistant at least render text on the screen.

More Tools and More Modes to Support

Amazon then introduced Alexa Presentation Language (APL) and Google launched Interactive Canvas, and suddenly there was not only more work to do to “publish everywhere” but also new tools to learn. Multimodal means that developing voice apps can actually be more complex because you must accommodate both a voice user interface (VUI) and a GUI. This isn’t how it was supposed to be. But here we are.

Bixby is a bit different from its voice assistant peers because it assumes multimodal features from the start. Google Assistant does as well, but you can get away with voice and text and skip the visuals if need be. Bixby expects an all-of-the-above approach of voice, visuals, touch, and more. That makes it very flexible, but it also adds some work. Then there is the issue of optimizing for all of those Samsung devices. So, it’s no surprise that so many developers at SDC 19 were more excited about Bixby Views than about the other announcements. It will help them reach more devices and save time.
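To make the multimodal-by-default point concrete, Bixby’s declarative views pair a spoken message with a visual layout in a single definition, and Bixby Views then scales that layout across device screens. The sketch below is simplified and assumes a hypothetical Greeting concept that a real capsule would define in its models.

    result-view {
      match: Greeting (this)            // hypothetical concept from the capsule’s models

      message {
        template ("Hello from Bixby!")  // spoken and written dialog
      }

      render {
        layout {
          section {
            content {
              paragraph {
                value ("Hello from Bixby!")  // visual UI that Bixby Views scales per device
              }
            }
          }
        }
      }
    }

Voice output and visual output live in the same view definition, which reflects the all-of-the-above approach described above.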

Reducing Developer Time

Cory Wixom is the developer behind BabyStats. The solution enables new parents to track activities such as feeding schedules and diaper changes by speaking. It was originally launched on Alexa, then made its way to Google Assistant as well as iOS, Android, and Amazon apps. More recently, Wixom has become a Bixby developer. His reaction to Bixby Views was enthusiastic:

It’s true multimodal. What they showed is that you just change one line of your code to say yes this can support a watch and then it preformatted it for you so that was interesting…You think what some companies are investing in to create responsive designs, it’s built into Bixby the responsive design that it can switch between a TV and a watch. So, I thought that was pretty impressive.

So, Bixby Views will be popular, but it will be limited to Samsung devices. Rendering your capsule on a non-Samsung device will still require custom coding, as in the past.
