Project

TinyML In Action—Creating a Voice-Controlled Robotic Subsystem

July 03, 2022 by Jake Hertz

We’ll be walking you through creating a robotic subsystem with a voice-activated motor leveraging machine learning (ML) and an Arduino Nano 33 BLE Sense.

With a solid foundational understanding of the concepts that underlie the field of TinyML, we’ll be applying our knowledge to a real-life project. 

A note before digging in: this project uses pre-existing datasets, Google Colabs, and Arduino code developed by Pete Warden and the TinyML team at Harvard University. To deploy on our microcontroller unit (MCU), their resources provide us with:

  • Access to datasets
  • Model architectures
  • Training scripts
  • Quantization scripts
  • Evaluation tools
  • Arduino code 

As a disclaimer, we did not develop the vast majority of this code, and we do not own the rights to it. 

That said, this project assumes a basic understanding of programming and electronics.

 

TinyML Project—Building a Voice Command Robotic Subsystem

In this project, we will be building a simple robotic subsystem that uses machine learning to respond to voice commands. A microcontroller will collect inputs from a microphone, use ML to listen for the wake words “forward” and “backward,” and then drive a small DC motor in the commanded direction.

Since there is already a lot of good information on controlling motors with a microcontroller, this article will primarily focus on demonstrating how to train a keyword spotting model, quantize it for an MCU, deploy it to the Arduino, and interpret its inference output to drive the motor.

 

BOM for a TinyML Robotic Subsystem With a Voice-activated Motor

Below in Table 1, you'll find a bill of materials (BOM) for this project.

 

Table 1. BOM for the example TinyML voice-activated motor project. The whole project will cost under $30.

Part | Example | Cost* (USD) | Notes
Arduino Nano 33 BLE Sense | Link | $22.50 | This is a standard TinyML development device.
L293D dual H-bridge motor driver | Link | $4.50 | I’m using this IC since I already had it lying around. In your project, it can be replaced by an H-bridge of your choosing.
DC motor | Link | $1.95 | I’m using a DC motor from the Elegoo Uno R3 starter kit. Any cheap 5 VDC motor will work for this project.
3x 1 kΩ resistor | | $0.30 |
4x AA battery and connector | Link | $0.58 | The input voltage range for this project is 4.5 to 21 V. I used 4x AA batteries (~6 V) because I had them available, but a 9 V battery or another source in this range will also work.

*All costs are as of September 2021.

 

For this specific project, I selected most of the parts, seen in Figure 1, from what I already had on hand.

 


Figure 1. The parts I used in this project

 

You'll have a large amount of freedom to choose other, similar parts when replicating this project.

 

Setting Up TinyML Software for Arduino Nano 33 BLE Sense

To run TinyML scripts on our Arduino Nano 33 BLE Sense, we need to install some packages and dependencies. If you don’t already have the Arduino IDE installed on your computer, you can find it here.

Once that is installed, we’ll need to install the board files for the Arduino Nano 33 BLE Sense. To do this, from the IDE, go to Tools → Board → Boards Manager. Here, search for “mbed nano” and install “Arduino Mbed OS Nano Boards.”

This is shown in Figure 2 below.

 


Figure 2. We need to install the board files for the Nano 33 BLE Sense

 

After this, we’ll need to install the necessary libraries for this project. To do this, go to Tools → Manage Libraries. From there, search for and install the following library:

  • Harvard_TinyMLx (the library containing the micro_speech example we’ll modify later)

With this done, we can start the project!

 

Step 1: Training a Machine Learning Model With TensorFlow Lite 

Generally, an ML workflow begins with collecting and labeling a dataset, followed by designing a model architecture from scratch. For the sake of time and simplicity, we’ll be “cheating” by leveraging a ready-made dataset and a predefined keyword spotting model architecture, both developed by Pete Warden. To utilize these resources and train our model, we will use the scripts in a Google Colab developed by the TinyML team at Harvard University.

The Google Colab needed can be found here.

First, make sure you are using a graphics processing unit (GPU) runtime in your Colab (as shown in Figure 3), as this will significantly speed up training. Once you’ve done this, all of the code is ready to be used as-is. Simply run each cell in order by clicking the black “run” button at the upper left of each cell.

 


Figure 3. You must ensure that you’re using a GPU runtime in your Colab

 

The model architecture we are using is tiny_conv, and we will train for 15,000 steps in total: the first 12,000 with a learning rate of 0.001, and the last 3,000 with a learning rate of 0.0001. Additionally, we will train the model to recognize the words “forward” and “backward,” which Warden’s keyword spotting (KWS) dataset already includes. This can be seen in Figure 4.

 


Figure 4. This is the section in our Colab where we define what words we’re training for, our training parameters, and our model architecture

 

Keep in mind that training may take a couple of hours to complete, so make sure your computer is plugged in and your internet connection is stable.

 

Step 2: Quantize and Evaluate the ML Model

When training is done, you will reach the point in the Colab labeled Step 2. This is where quantization begins.

First, we freeze the model, which is the process of combining all relevant training results (graph, weights, etc.) into a single file for inference. Once we have a frozen model, we convert it into a TFLite model. The script Harvard has set up makes this process relatively easy, and the resulting TFLite model should be fully quantized. The final model should be under 20 kB in size.

Once the model is fully converted, the Colab provides scripts for comparing the accuracy of the quantized and unquantized models. If everything went correctly, the two accuracies should be nearly identical.

 

Step 3: Deploy the Machine Learning Model to Arduino

Before going any further, I want you to know that you can find my full code as a reference here.

Once we have a fully quantized and converted TensorFlow Lite model, we need to deploy it to our Arduino. We will be modifying Harvard’s pre-existing micro_speech example, which you can find in the Arduino IDE under File → Examples → INCOMPATIBLE → Harvard_TinyMLx → micro_speech.

You might be a bit overwhelmed since there is a lot going on in this code; however, for the purposes of this project, we don’t need to concern ourselves with most of it.

First, we must insert our new TFLite Micro model in place of the one currently used in the micro_speech example. The very last cell of the Colab should have output a large matrix of hexadecimal characters, as shown in Figure 5. This is our TensorFlow Lite for Microcontrollers model that will be used in our Arduino code.

 


Figure 5. A snippet of the outputted TensorFlow Lite Micro model from our Google Colab

 

In the micro_features_model.cpp file, copy and paste just the hexadecimal characters from your Colab in place of the characters that are already in the file. At the very bottom of the Colab’s printout, there should be a line that reads unsigned int g_model_len followed by a number. The last thing to do is to copy this number from your Colab and insert it in place of the number currently assigned to const int g_model_len at the bottom of the file.
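For reference, here is a minimal sketch of what the edited file should end up looking like. The byte values and the length below are placeholders, so use the exact output from your own Colab run:

    // micro_features_model.cpp (sketch) -- byte values and length are placeholders
    #include "micro_features_model.h"

    // Paste the full hexadecimal matrix from the Colab between these braces.
    const unsigned char g_model[] = {
        0x20, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,  // placeholder bytes
        // ... thousands more bytes from the Colab printout ...
    };

    // Replace 18712 with the g_model_len value printed at the bottom of the Colab.
    const int g_model_len = 18712;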

After this, the only other model-related change is in the micro_features_micro_model_settings.cpp file. As shown in Figure 6, change the category labels “yes” and “no” to “forward” and “backward.” Make sure you don’t touch the “silence” or “unknown” labels.

 


Figure 6. We must change the expected category labels for our new words
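In code, this change amounts to swapping the last two entries of the label array. Here is a sketch based on the structure the example already uses:

    // micro_features_micro_model_settings.cpp -- only the last two labels change.
    // The order must match the model's output categories: "silence" and
    // "unknown" are always the first two.
    const char* kCategoryLabels[kCategoryCount] = {
        "silence",
        "unknown",
        "forward",
        "backward",
    };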

 

Step 4: Interpret Inference and Write Our Motor Driver Code

At this point, the TFLite Micro model should run as intended, and now we need to drive our motor in response to the TinyML inference output. To do this, we will be modifying the arduino_command_responder.cpp file.

As shown in Figure 7, at the top of the file we will add a couple of #define statements to map pins on the Arduino to pins on our motor driver. For this project, we’ll use D2 for the ENABLE signal, D3 for the Driver1A input, and D4 for the Driver2A input. Make sure to also set these pins as outputs with the pinMode() function inside RespondToCommand().

 


Figure 7. We need to define our pins, set them as outputs, and write our simple motorCTRL function.

 

From there, we can define our motor control function. This function takes in a speed (which we won’t alter for the purposes of this project) and a logic value for each of Driver1A and Driver2A. If Driver1A is HIGH and Driver2A is LOW, the motor will spin in one direction; if the values are reversed, the motor will spin in the opposite direction.
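Putting the pin definitions, the pinMode() setup, and the motor function together, the additions look roughly like the sketch below. The pin numbers come from our schematic, but the motorCTRL body is an assumption that matches the behavior described above; the exact code in Figure 7 may differ slightly:

    // arduino_command_responder.cpp (additions, sketch)
    #define EN_PIN  2  // D2 -> L293D EN1,2 (enable)
    #define IN1_PIN 3  // D3 -> L293D input 1A
    #define IN2_PIN 4  // D4 -> L293D input 2A

    // Spin the motor: in1/in2 set the direction, and speed (0-255) drives the
    // enable pin with PWM. We leave speed at full scale for this project.
    void motorCTRL(int speed, int in1, int in2) {
      digitalWrite(IN1_PIN, in1);
      digitalWrite(IN2_PIN, in2);
      analogWrite(EN_PIN, speed);
    }

    // Inside RespondToCommand(), configure the pins once before first use:
    static bool is_initialized = false;
    if (!is_initialized) {
      pinMode(EN_PIN, OUTPUT);
      pinMode(IN1_PIN, OUTPUT);
      pinMode(IN2_PIN, OUTPUT);
      is_initialized = true;
    }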

Now the only thing left to do is change the command responses that already exist in the code. As shown in Figure 8, we’ll change the response so that if the first character of the found command is “f” (i.e., the found command is “forward”), the motor spins forward. We do the same for the “backward” command.

 


Figure 8. We’ll be controlling the motor to move either forward or backward based on the found command by the ML model.
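The modified response might look like the following sketch, where found_command and is_new_command are parameters the example’s RespondToCommand() already provides, and motorCTRL is the helper from the previous step:

    // Inside RespondToCommand(), replacing the example's "yes"/"no" responses:
    if (is_new_command) {
      if (found_command[0] == 'f') {
        motorCTRL(255, HIGH, LOW);   // "forward": 1A high, 2A low
      } else if (found_command[0] == 'b') {
        motorCTRL(255, LOW, HIGH);   // "backward": reversed polarity
      }
    }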

 

Step 5: Build the Motor Driver Circuit

With all of the software out of the way, we can now build our motor driver circuit. The BOM is listed above, and the schematic is shown in Figure 9 below.

 


Figure 9. Our motor driver circuitry.

 

Using a voltage source from 4.5 to 21 V, we power both the Arduino and the L293D. The wiring has D4 going to the motor driver 2A input, D3 going to motor driver 1A input, and D2 going to EN1,2. We have a 1 kΩ pull-down resistor on each of these signals to make sure our states are always defined, and we have a 0.1 μF capacitor for decoupling just to be safe.

 

Step 6: Upload the Code and Show it Off!

Once everything is wired up, we can upload our code to the Arduino and watch it work. Shown here is a demonstration of our finished working project [video].

 

 

In this project, we were able to create an audio keyword spotting model small enough to be run locally on an MCU powered by standard AA batteries. Hopefully, this project helped demonstrate the value and applications of TinyML.

7 Comments
  • mattganis July 07, 2022

    question -  I don’t see a mic in the circuit.  How does the Arduino get your voice to figure out the spoken command ?

    -Matt

    • jhertz2 July 07, 2022
      Thanks for the question. I did not clarify in the project, but the Arduino Nano BLE33 has a built-in microphone!
      • mattganis July 07, 2022
        I did look at the specs for the BLE and didn't see it. I have a bunch of students that are using the IoT 33 Nano - I'm hoping I can have them use that by just connecting up a mic - does that seem plausible ?