Android Things: Adding Google Assistant

With the growth of the Internet of Things (IoT), developers and engineers have had to rethink how users interact with devices on a day-to-day basis. 

While screens work well for websites and most apps, devices that interface with the real world can be tedious to operate if the user has to press multiple buttons or navigate a screen just to perform a task. One way around this is to enable voice controls on your devices. 

In this tutorial you will learn about Google Assistant and how you can add it to your Android Things IoT devices.

If you need a little background on Android Things before you start, check out some of my other posts here on Envato Tuts+.

  • Android
    Introduction to Android Things
    Paul Trebilcox-Ruiz
  • Android SDK
    Android Things: Your First Project
    Paul Trebilcox-Ruiz

Assistant SDK

The Google Assistant SDK allows you to add voice controls with key word detection, natural language processing, and other machine learning features to your IoT devices. There’s a lot that can be done with the Assistant SDK, but this tutorial will just focus on the basics: how you can include it on your Android Things devices in order to ask questions, get information, and interact with standard “out of the box” Assistant functionality.

As far as hardware requirements go, you have a few options. You can use a Raspberry Pi flashed with Android Things together with an AIY Voice Kit.

Or you can use a standard speaker with AUX connector and a USB microphone.

Additionally, you can use any other I²S hardware configuration. While we won’t discuss I²S in detail in this tutorial, it’s worth noting that the Voice Kit will use this protocol. Once you have a microphone and speaker set up, you will also need to add a button to your device. This button will need to keep track of two states: pressed and released. You can accomplish this with a multi-pronged arcade button, or a standard button with a pull-down resistor attached to one of the poles.


Once you have hooked up your hardware, it’s time to add the Assistant SDK to your device. First, you will need to create a new credentials file for your device. You can find the instructions for this in the Google Assistant docs. Once you have your credentials.json file, you will need to place it into the res/raw directory of your Android Things module.

The credentials.json file in the res/raw directory

After your credentials are created with Google, you will need to declare some permissions for your app. Open the AndroidManifest.xml file and add the following lines within the manifest tag, but before the application tag.
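Based on Google's Android Things Assistant sample, the required entries look something like this (double-check the permission names against the current sample before relying on them):

```xml
<!-- Record audio from the attached microphone -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Communicate with the Assistant API over the network -->
<uses-permission android:name="android.permission.INTERNET" />
```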

It’s worth noting that you will need to restart your device after installing the app with these permissions in order for them to be granted.

Next you will need to copy the gRPC module into your app for communicating with the home device. This gets a little tricky, so the best place to get it is from the Google Assistant Android Things sample app, which can be found in the Android Things GitHub account. You will then need to update your settings.gradle file to reflect the new module.
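Assuming you copied the module into a directory named grpc at the project root, as in Google's sample (your own module names may differ), the updated settings.gradle might look like this:

```gradle
// settings.gradle — ':grpc' is the module copied from Google's sample
include ':things', ':grpc'
```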

After updating settings.gradle, include the module as a dependency in your things module's build.gradle file. In the same dependencies block, add Google's button driver (you will need this for activating the microphone) and, if you are using that hardware, the optional Voice Hat driver.
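Under those assumptions (a module named ':grpc', and contrib driver versions that were current at the time of writing — check for the latest), the dependencies block might look like this:

```gradle
dependencies {
    // gRPC module copied from Google's Assistant sample
    compile project(':grpc')
    // button driver, used to trigger the microphone
    compile 'com.google.android.things.contrib:driver-button:0.3'
    // optional: only needed for the AIY Voice Kit hardware
    compile 'com.google.android.things.contrib:driver-voicehat:0.2'
}
```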

You’ll also need to include protobuf as a dependency in your project-level build.gradle file.
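The plugin version below is illustrative; use whichever version the copied gRPC module calls for:

```gradle
// project-level build.gradle
buildscript {
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.0'
    }
}
```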

Next, let’s include the oauth2 library in our project by opening the things module’s build.gradle file and adding the following under the dependencies node:
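As in Google's sample, the bundled Apache HTTP client should be excluded, since it conflicts with the version Android ships (the library version shown is illustrative):

```gradle
dependencies {
    compile('com.google.auth:google-auth-library-oauth2-http:0.6.0') {
        exclude group: 'org.apache.httpcomponents', module: 'httpclient'
    }
}
```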

You may run into conflicts here if your project has the Espresso dependency, with an error message similar to this:
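The message typically resembles the following (the exact library and version numbers in your build will differ):

```
Warning: Conflict with dependency 'com.google.code.findbugs:jsr305'.
Resolved versions for app (3.0.1) and test app (2.0.1) differ.
```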

If so, just remove the Espresso dependency from build.gradle.

After you have synced your project, create a new class to access your credentials (in Google's sample app this class is named Credentials).
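A minimal version of this class, modeled on Google's sample, reads the raw JSON resource and builds a UserCredentials object from it:

```java
public class Credentials {
    // Parses res/raw/credentials.json into OAuth2 user credentials
    static UserCredentials fromResource(Context context, int resourceId)
            throws IOException, JSONException {
        InputStream inputStream = context.getResources().openRawResource(resourceId);
        byte[] bytes = new byte[inputStream.available()];
        inputStream.read(bytes);
        JSONObject json = new JSONObject(new String(bytes, "UTF-8"));
        return new UserCredentials(
                json.getString("client_id"),
                json.getString("client_secret"),
                json.getString("refresh_token"));
    }
}
```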

Embedded Assistant Helper Class

Once your class is created, it’s time to create a new class named EmbeddedAssistant. This is a helper class that was originally written by engineers at Google to easily wrap the Google Assistant for Android Things. While this class is fairly straightforward to use by just including it in your project, we will want to dive into it and understand how it actually works. 

The first thing you will do is create two inner abstract classes that will be used for handling callbacks in the conversation and requests to the Assistant API.
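A trimmed sketch of the two classes (Google's versions declare a few more hooks, but these are the ones relied on later in this tutorial):

```java
// Callbacks fired while a request is being sent to the Assistant
public static abstract class RequestCallback {
    public void onRequestStart() {}
    public void onAudioRecording() {}
}

// Callbacks fired as the Assistant's response comes back
public static abstract class ConversationCallback {
    public void onResponseStarted() {}
    public void onConversationFinished() {}
    public void onVolumeChanged(int percentage) {}
    public void onConversationError(Status error) {}
    public void onError(Throwable throwable) {}
}
```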

Once your two inner classes are written, go ahead and define the following set of global values at the top of your class. The majority of these will be initialized later in this file. These values are used to keep track of device state and interactions with the Assistant API.
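For reference, the fields used throughout the rest of this class include state like the following (a partial sketch; Google's sample declares more):

```java
private int mVolume = 100;            // current playback volume percentage
private int mSampleRate;              // audio sample rate, set by the Builder
private UserCredentials mUserCredentials;

// Audio in/out for the microphone and speaker
private AudioRecord mAudioRecord;
private AudioTrack mAudioTrack;

// Background thread for Assistant API traffic
private HandlerThread mAssistantThread;
private Handler mAssistantHandler;

// gRPC service stub and the request-side stream
private EmbeddedAssistantGrpc.EmbeddedAssistantStub mAssistantService;
private StreamObserver<ConverseRequest> mAssistantRequestObserver;

// Callbacks supplied by the caller
private RequestCallback mRequestCallback;
private ConversationCallback mConversationCallback;
```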

Handling API Responses

Just as there is a StreamObserver object for requests to the Assistant API, you will also need one for responses. This object consists of a switch statement that checks the state of the response and then handles it accordingly.

The first case checks for the end of a user speaking and uses the ConversationCallback to let the rest of the class know that a response is imminent.

The next case will check and update conversation, volume, and microphone state.

The third case will take an audio result and play it back for the user.

The final case will simply forward errors that occurred during the conversation process.

The final two methods within this stream handle error states and cleanup on completion of a conversation result.
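Putting those pieces together, a condensed version of the response observer, closely following Google's sample, looks like this:

```java
private StreamObserver<ConverseResponse> mAssistantResponseObserver =
        new StreamObserver<ConverseResponse>() {
    @Override
    public void onNext(ConverseResponse value) {
        switch (value.getConverseResponseCase()) {
            case EVENT_TYPE:
                // The user has stopped speaking; a response is imminent
                if (value.getEventType()
                        == ConverseResponse.EventType.END_OF_UTTERANCE) {
                    mConversationCallback.onResponseStarted();
                }
                break;
            case RESULT:
                // Update conversation, volume, and microphone state here
                break;
            case AUDIO_OUT:
                // Play the Assistant's audio reply back to the user
                ByteBuffer audioData = ByteBuffer.wrap(
                        value.getAudioOut().getAudioData().toByteArray());
                mAudioTrack.write(audioData, audioData.remaining(),
                        AudioTrack.WRITE_BLOCKING);
                break;
            case ERROR:
                // Forward conversation errors to the callback
                mConversationCallback.onConversationError(value.getError());
                break;
        }
    }

    @Override
    public void onError(Throwable t) {
        mConversationCallback.onError(t);
    }

    @Override
    public void onCompleted() {
        mConversationCallback.onConversationFinished();
    }
};
```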

Streaming Audio

Next, you will need to create a Runnable that will handle audio streaming on a different thread.
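A sketch of that Runnable, which reads one block of microphone audio, forwards it to the request stream, and re-posts itself until it is removed from the handler (SAMPLE_BLOCK_SIZE is an assumed buffer-size constant):

```java
private Runnable mStreamAssistantRequest = new Runnable() {
    @Override
    public void run() {
        // Read one block of audio from the microphone
        ByteBuffer audioData = ByteBuffer.allocateDirect(SAMPLE_BLOCK_SIZE);
        int result = mAudioRecord.read(audioData, audioData.capacity(),
                AudioRecord.READ_BLOCKING);
        if (result < 0) {
            return;
        }
        mRequestCallback.onAudioRecording();
        // Forward the audio to the Assistant API
        mAssistantRequestObserver.onNext(ConverseRequest.newBuilder()
                .setAudioIn(ByteString.copyFrom(audioData))
                .build());
        // Keep streaming until stopConversation() removes this callback
        mAssistantHandler.post(mStreamAssistantRequest);
    }
};
```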

Creating the Assistant

Now that your global values are defined, it's time to go over the framework for creating the EmbeddedAssistant. You will need to be able to retrieve the credentials for your app using the class that was created earlier.

In order to instantiate itself, this class uses a private constructor and the builder pattern.

The Builder inner class contains multiple methods for initializing the values within the EmbeddedAssistant class, such as sample rate, volume, and user credentials. Once the build() method is called, all of the defined values will be set on the EmbeddedAssistant, global objects necessary for operation will be configured, and an error will be thrown if any necessary data is missing.
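An abbreviated look at the Builder, with only a couple of setters shown (the real class has one per configurable value):

```java
public static class Builder {
    private EmbeddedAssistant mEmbeddedAssistant = new EmbeddedAssistant();

    public Builder setAudioSampleRate(int sampleRate) {
        mEmbeddedAssistant.mSampleRate = sampleRate;
        return this;
    }

    public Builder setCredentials(UserCredentials userCredentials) {
        mEmbeddedAssistant.mUserCredentials = userCredentials;
        return this;
    }

    public EmbeddedAssistant build() {
        // Fail fast if required values are missing
        if (mEmbeddedAssistant.mUserCredentials == null) {
            throw new NullPointerException("User credentials must be defined");
        }
        // ...configure AudioRecord/AudioTrack from the collected values...
        return mEmbeddedAssistant;
    }
}
```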

Connecting to the Assistant API

After the EmbeddedAssistant has been created, the connect() method will need to be called in order to connect to the Assistant API.
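In outline, connect() spins up the background thread and opens an authenticated gRPC channel (the endpoint constant name here is illustrative):

```java
public void connect() {
    // Background thread for Assistant API traffic
    mAssistantThread = new HandlerThread("assistantThread");
    mAssistantThread.start();
    mAssistantHandler = new Handler(mAssistantThread.getLooper());

    // Open a gRPC channel to the Assistant endpoint, authenticated
    // with the OAuth2 credentials loaded earlier
    ManagedChannel channel = ManagedChannelBuilder
            .forTarget(ASSISTANT_ENDPOINT)
            .build();
    mAssistantService = EmbeddedAssistantGrpc.newStub(channel)
            .withCallCredentials(MoreCallCredentials.from(mUserCredentials));
}
```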

After you have connected to the API, you will use two methods for starting and stopping conversations. These methods will post Runnable objects to mAssistantHandler in order to pass conversation state objects to the request and response streams.
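A sketch of the pair, following the structure of Google's sample:

```java
public void startConversation() {
    mAssistantHandler.post(new Runnable() {
        @Override
        public void run() {
            // Open the bidirectional stream and begin the request
            mAssistantRequestObserver =
                    mAssistantService.converse(mAssistantResponseObserver);
            mRequestCallback.onRequestStart();
        }
    });
    mAudioRecord.startRecording();
    mAssistantHandler.post(mStreamAssistantRequest);
}

public void stopConversation() {
    mAssistantHandler.post(new Runnable() {
        @Override
        public void run() {
            // Stop streaming audio and close out the request
            mAssistantHandler.removeCallbacks(mStreamAssistantRequest);
            if (mAssistantRequestObserver != null) {
                mAssistantRequestObserver.onCompleted();
                mAssistantRequestObserver = null;
            }
        }
    });
    mAudioRecord.stop();
}
```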

Shutting Down

Finally, the destroy() method will be used for teardown when your app is closing and no longer needs to access the Assistant API.
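A minimal teardown, under the same assumptions as the sketches above:

```java
public void destroy() {
    // Stop any in-flight audio streaming, then shut the thread down
    mAssistantHandler.post(new Runnable() {
        @Override
        public void run() {
            mAssistantHandler.removeCallbacks(mStreamAssistantRequest);
        }
    });
    mAssistantThread.quitSafely();

    // Release audio resources
    if (mAudioRecord != null) {
        mAudioRecord.stop();
        mAudioRecord = null;
    }
    if (mAudioTrack != null) {
        mAudioTrack.stop();
        mAudioTrack = null;
    }
}
```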

Using the Assistant

Once your helper classes are fleshed out, it's time to use them. You will do this by editing your Android Things MainActivity class to interact with the EmbeddedAssistant and hardware for controlling the Google Assistant. First, add the Button.OnButtonEventListener interface to your Activity.

Next you will need to add the member variables and constants that will be required by your app. These values will control the debounce of the button that triggers the Assistant, as well as the volume, the audio format, the UserCredentials class that you created earlier, and the hardware for your device.

Once you have your constants defined, you will need to create a few callback objects that will be used for conversations and requests with the assistant.

In mConversationCallback, you will notice that we save a volume change percentage in a shared preference. This allows your device volume to stay consistent for your users, even across reboots.
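The relevant callback override (PREF_CURRENT_VOLUME is a preference-key constant assumed for this tutorial):

```java
@Override
public void onVolumeChanged(int percentage) {
    // Persist the volume so it survives reboots
    SharedPreferences.Editor editor = PreferenceManager
            .getDefaultSharedPreferences(MainActivity.this)
            .edit();
    editor.putInt(PREF_CURRENT_VOLUME, percentage);
    editor.apply();
}
```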

As the assistant works asynchronously on your device, you will initialize everything for using the Assistant API in onCreate() by calling a set of helper methods that we will define over the rest of this tutorial.

The first helper method is initVoiceHat(). If the Voice Hat shield is attached to a Raspberry Pi, this method will initialize the device so that users can use the attached microphone and speaker. If a Voice Hat is not attached, then a standard AUX speaker and USB microphone can be used, and audio will be routed to them automatically. The Voice Hat uses I²S to handle audio peripherals on the bus, and is wrapped by a driver class that was written by Google.

The assistant will only respond in this sample while a triggering button is held down. This button is initialized and configured like so:
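Using the contrib button driver, initialization might look like this (the pin name and debounce constant are assumptions that depend on your wiring):

```java
// BUTTON_GPIO_PIN and BUTTON_DEBOUNCE_DELAY_MS are constants you
// define for your hardware, e.g. "BCM23" and 20
try {
    mButton = new Button(BUTTON_GPIO_PIN, Button.LogicState.PRESSED_WHEN_LOW);
    mButton.setDebounceDelay(BUTTON_DEBOUNCE_DELAY_MS);
    mButton.setOnButtonEventListener(this);
} catch (IOException e) {
    Log.e(TAG, "error initializing button", e);
}
```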

When the button is pressed, the assistant will start listening for a new conversation.
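The press-and-hold behavior maps directly onto the two conversation methods from the helper class:

```java
@Override
public void onButtonEvent(Button button, boolean pressed) {
    if (pressed) {
        // Start listening while the button is held down
        mEmbeddedAssistant.startConversation();
    } else {
        // Stop listening on release
        mEmbeddedAssistant.stopConversation();
    }
}
```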

You can find more information about GPIO and Android Things in my tutorial about input and output with Android Things.

  • Android SDK
    Android Things: Peripheral Input/Output
    Paul Trebilcox-Ruiz

Since we stored volume information in our device's SharedPreferences, we can access it directly to initialize the device's volume.
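Reading it back uses the same preference key as the save (DEFAULT_VOLUME is an assumed fallback constant, e.g. 100):

```java
SharedPreferences preferences =
        PreferenceManager.getDefaultSharedPreferences(this);
int initVolume = preferences.getInt(PREF_CURRENT_VOLUME, DEFAULT_VOLUME);
```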

The Assistant SDK requires authentication for use. Luckily we created a method in the EmbeddedAssistant class earlier in this tutorial specifically for this situation.
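As in Google's sample, the helper exposes this through a static generateCredentials() method:

```java
UserCredentials userCredentials = null;
try {
    // Loads and parses res/raw/credentials.json
    userCredentials = EmbeddedAssistant.generateCredentials(this, R.raw.credentials);
} catch (IOException | JSONException e) {
    Log.e(TAG, "error retrieving user credentials", e);
}
```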

The final helper method that was called in onCreate() will initialize the EmbeddedAssistant object and connect it to the API.
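Pulling the earlier pieces together (setter names follow Google's sample; initVolume is the volume value restored from SharedPreferences):

```java
mEmbeddedAssistant = new EmbeddedAssistant.Builder()
        .setCredentials(userCredentials)
        .setAudioSampleRate(SAMPLE_RATE)
        .setAudioVolume(initVolume)
        .setRequestCallback(mRequestCallback)
        .setConversationCallback(mConversationCallback)
        .build();

// Open the authenticated channel to the Assistant API
mEmbeddedAssistant.connect();
```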

The last thing that you will need to do is properly tear down your peripherals by updating the onDestroy() method in your Activity.
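A sketch of that teardown:

```java
@Override
protected void onDestroy() {
    super.onDestroy();
    // Release the button GPIO
    if (mButton != null) {
        try {
            mButton.close();
        } catch (IOException e) {
            Log.e(TAG, "error closing button", e);
        }
        mButton = null;
    }
    // Tear down the Assistant helper and its audio resources
    mEmbeddedAssistant.destroy();
}
```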

After all of this, you should be able to interact with your Android Things device as if it were a Google Home!

Conclusion

In this tutorial, you learned about the Google Assistant and how it can be added to your Android Things applications. This feature gives your users a new way of interacting with and controlling your device, as well as access to the many features available from Google. This is only one part of the fantastic features that can go into an Android Things app and allow you to create new and amazing devices for your users.

While you're here, check out some of my other posts on Android Things on Envato Tuts+!

  • Android Things
    Android Things and Machine Learning
    Paul Trebilcox-Ruiz
  • Android SDK
    Android Things: Understanding and Writing Drivers
    Paul Trebilcox-Ruiz
  • Android Things
    Android Things: Creating a Cloud-Connected Doorman
    Paul Trebilcox-Ruiz
