The Wi-Fi Eye, Part 4: Power System and Neopixel Lights
July 28, 2017 by Jeremy Lee
In part 4 of our Wi-Fi Eye project, we deal with the power supply and wire up our servos, lights, and batteries to the controller we prepared earlier.
Welcome back to the Wi-Fi Eye project! Please check out the previous parts of this series before continuing on.
Table of Contents
- Part 1: Project Introduction and BOM
- Part 2: Firmware, Wiring, and Connectivity
- Part 3: Servos
- Part 4: Power and Neopixel Lights
Also, check out the articles on DIY Hacking that will guide you through 3D printing your own case.
If everything went well in the previous parts of this series, you should have a tiny microcontroller programmed with all the firmware we need and you should be able to configure it over Wi-Fi. But it’s not really a “robot” unless it physically flails something around. Also, everyone knows that robots have glowy lights, especially for eyes. (No, I don’t know how they see out, either.)
We could use direct remote-control, but that means we’d have to sit there and manually puppeteer our robot the whole time. If we give it a set of pre-programmed actions on a loop, that gets old pretty fast and doesn’t communicate much information. To really get “character”, animatronics need a touch of randomness, as well as purpose.
Also, the state of the robot should be affected by outside stimuli—which, in our case, is the presence of Wi-Fi signals—so the movements have some meaning. You would be surprised how good humans are at extracting meaning from a few blinking lights and an errant twitch. It's kind of our thing.
The ESP-12 prefers a clean 3.3V @150mA power supply that can handle high-current bursts of 300-500mA when the Wi-Fi radio transmitter turns on. Switch-mode supplies sometimes don’t like those conditions (the pulses are confusing) so an LDO (low drop-out) linear regulator is ideal here. We only need to shave off a volt, so we’re not dissipating much power.
I’ve been fairly happy with the AMS1117 3.3V regulator, which you can get on pre-built modules from the internet for less than a dollar. They have terrible heat-sinking for the stated 1-amp capacity, but that’s fine for our purposes. They also have no voltage adjustment, so they’re basically foolproof. And the ones I get have a red LED to let you know they’re working, which I tend to remove after a while.
This regulator has a 1V dropout (so, not that low), which means that even with a fully-charged 4.2V battery, we only get 3.2V to the microcontroller (at best), not the full 3.3V. And that will go as low as 2.2V when the battery is drained. Fortunately, the ESP8266 works well across the 2.5-3.3V range at the standard 80MHz clock rate.
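The headroom math above is worth spelling out. Here's a minimal sketch of it in Python (the firmware itself is Arduino C++; this is just the arithmetic):

```python
# LDO headroom math for the AMS1117: the output is the regulator's set
# voltage, capped at the input voltage minus the ~1V dropout.
DROPOUT_V = 1.0   # AMS1117 typical dropout
TARGET_V = 3.3    # the regulator's set output

def regulator_output(v_battery):
    """What the ESP actually sees for a given battery voltage."""
    return min(TARGET_V, v_battery - DROPOUT_V)

for v in (4.2, 3.7, 3.2):
    print(f"{v:.1f}V in -> {regulator_output(v):.1f}V out")
# 4.2V in -> 3.2V out, 3.7V in -> 2.7V out, 3.2V in -> 2.2V out
```

So across a lithium cell's discharge curve, the ESP rides from 3.2V down to about 2.2V, which is why it matters that the chip tolerates the 2.5-3.3V range.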
The ESP much prefers running lower than 3.3V to going above—as you increase to 5V, it will overheat and slowly fry itself to death. (Guess how I know?) Lower voltages, on the other hand, only prevent you from overclocking reliably.
Most of the problems people have with the ESP8266 seem to come from the radio transmitter bursts causing “brown outs” in the power system, leading to corrupt serial data or erroneous PWM signals. This can cause servos to glitch or LEDs to be updated incorrectly (which may then cause their own small power spikes). Problems occur when the power supply can only provide the average 100-200mA for the ESP, in which case you need huge "bypass" capacitors to store enough current for the radio bursts.
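To see why those bypass capacitors end up "huge", you can size one with the standard C = I·Δt/ΔV estimate. The burst figures below are assumptions for illustration, not measurements:

```python
# Rough bypass-capacitor sizing for radio bursts: C = I * dt / dV.
# Assumed figures: 300mA of extra burst current, a 2ms burst, and a
# 0.3V sag budget before the ESP browns out.
def bypass_cap_farads(extra_current_a, burst_s, allowed_sag_v):
    """Capacitance needed to supply a current burst with limited voltage sag."""
    return extra_current_a * burst_s / allowed_sag_v

c = bypass_cap_farads(0.3, 0.002, 0.3)
print(f"{c * 1e6:.0f} uF")   # -> 2000 uF, which is electrolytic territory
```

Two thousand microfarads is a physically large part, which is why it's usually better to fix the supply than to paper over it with capacitance.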
It’s critical to ensure that your power source has the oomph (that’s a technical term) to provide half an amp to the ESP-12 at the same time it’s giving another half an amp to the servos during a high-torque move. The supply doesn’t need to sustain it, but it’s going to happen all at once eventually, and being able to “burst” over an amp is not something that all batteries (or even power adaptors) can cope with. That only increases if you start stacking on more LEDs and servos (i.e., if you don’t spec your power supply for the worst possible moment in the device’s life, that’s the moment it will crash).
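A worst-case budget is just a sum of simultaneous peaks. The figures below are ballpark assumptions taken from the text, not measured values:

```python
# Worst-case power budget: assume everything bursts at the same instant,
# because eventually it will.
loads_ma = {
    "ESP-12 radio burst": 500,
    "servo (high-torque move)": 500,
    "neopixel (full white)": 60,
}
total = sum(loads_ma.values())
print(f"Worst case: {total} mA")  # spec the supply for this, not the average
```

The average draw might be a tenth of that, but the supply gets judged on its worst millisecond.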
Lithium cells, whether Li-Ion or Li-Po, are very good at supplying bursts of current that are 10 times the base load. They make ideal batteries for this kind of application. I personally like repurposing “USB battery banks” that you can get for a few dollars for a few reasons:
- They usually contain a protected 1800-2400mAh Li-Ion 25500 single-cell battery.
- Plus a 5V DC converter.
- And a USB charger.
- They’re cheap since they're actually pretty terrible at their stated purpose (recharging phones).
- They're easy to replace or upgrade with an identical form-factor later.
- It’s less hassle getting batteries shipped inside a product because some carriers won’t accept packets marked “lithium batteries” anymore.
So I recommend picking a couple up from the internet or a bargain store. They’ll power a project like the Eye of Agomotto for over 20 hours continuously and it won’t matter what kind of servos you get.
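The "over 20 hours" figure follows from a simple capacity-over-draw estimate. The average draw here is an assumption (ESP mostly idle, with occasional servo moves), not a measurement:

```python
# Rough runtime estimate for a repurposed USB battery bank.
capacity_mah = 2000      # typical of the 1800-2400mAh cells in cheap banks
average_draw_ma = 90     # assumed average: ESP idle + occasional bursts
hours = capacity_mah / average_draw_ma
print(f"~{hours:.0f} hours")   # ~22 hours: comfortably over 20
```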
You’ll need some way of charging (or swapping) the battery. I mounted the USB charger module directly on the back of my Wi-Fi Eye so that I could just plug a USB cable in, on the assumption that If I made a separate charger, I’d lose it. There are many options here.
The only tricky choice is whether to run the LEDs from the raw power side or the regulated power side. The issue here is that the WS2812b "Neopixels" don’t like being given a control-signal “1” bit that’s more than a volt below their power rails.
Unless we want to add a logic level converter (we don’t), we can’t run the LEDs at more than 4.3V if the ESP is on 3.3V. That’s perfect if we’re running from a 4.2V battery, but no higher.
Specifically, if you run Neopixels at 5V (their "nominal" voltage), they won’t reliably accept data from the ESP anymore. Fortunately, if you have 5V available then the AMS1117 regulator will be providing the full 3.3V, which is good enough for a couple of lights.
In other words, more volts are not necessarily better here. We can’t drift too far above the microcontroller’s supply rail or the LEDs will go strange.
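The rule of thumb above ("no more than about a volt below the LED's rails") can be checked for the two configurations we care about. This is the article's rule of thumb, not the datasheet's exact threshold:

```python
# Sanity check for the level-shifting rule of thumb: a WS2812b data "1"
# shouldn't sit more than ~1V below the LED's own supply rail.
def data_level_ok(v_signal, v_led_supply, margin_v=1.0):
    """True if the LED should reliably read the signal as a '1'."""
    return v_led_supply - v_signal <= margin_v

print(data_level_ok(3.3, 4.2))  # ESP at 3.3V, LED on a full battery: True
print(data_level_ok(3.3, 5.0))  # ESP at 3.3V, LED at 5V: False (may glitch)
```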
The circuit diagrams show the Neopixel connected to the regulator output (the same rails powering the ESP-12) because that should work in all cases. However, if you want to make super-efficient use of your battery power (or get maximum brightness) then route power from the battery/input side instead, just like the servos do. This is actually how the Eye of Agomotto was wired—and trust me, it's super bright!
If you want multiple lights to act identically, you can wire them in parallel. If the same serial data is sent to many Neopixels (or strips), they will act in the same way and you don't even have to tell the software.
Striking a Pose
The firmware has a set of internal states that it chooses in response to events, and many of these states have associated poses; a pose is activated whenever the firmware enters the corresponding state.
Each pose is stored in a file in SPIFFS (SPI Flash File System, a flash filesystem for embedded devices), and contains a set of properties which modify the current outputs wherever new values are supplied. If a value is not supplied, it is left to whatever the last pose set it to. So poses are incremental. A pose could change the pulsing speed of the LED without changing its color, or vice versa.
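Incremental poses behave like a dictionary merge: only the keys a pose supplies are changed, and everything else keeps its last value. A quick sketch (the field names are illustrative, not the firmware's actual pose-file schema):

```python
# Incremental pose application as a dict merge: supplied keys overwrite,
# missing keys keep whatever the last pose set.
current = {"led_color": "green", "pulse_s": 2.0, "servo_deg": 90}
pose = {"pulse_s": 0.5}   # change the pulse speed, touch nothing else

current.update(pose)
print(current)   # led_color and servo_deg are untouched
```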
Briefly, each Neopixel can be assigned a base color, which it will transition to over a short period of time (in seconds). It can also randomly flicker, filtered by a second color. This way, you can have a green light that flickers with a little bit of blue.
The Neopixels can also pulse. They will dim from their usual color to a minimum brightness, using a gamma curve to adjust the linear ramp so it has a bit more character. The pulsing rate is defined by the time to complete one cycle, plus an amount based on the RSSI (received signal strength indicator) of the currently tracked access point. That means some states (e.g., idle) can be defined to pulse at a constant rate, while others (e.g., detect) can pulse at a rate depending on the signal strength being received from the tracked target.
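Here's a sketch of that pulse behavior: a linear dim-and-return ramp pushed through a gamma curve, with the cycle time stretched by RSSI. The gamma value and the RSSI mapping are assumptions for illustration, not the firmware's exact numbers:

```python
# Gamma-curved pulsing with an RSSI-dependent period (illustrative values).
GAMMA = 2.2   # assumed gamma; gives the linear ramp "a bit more character"

def pulse_brightness(t, period_s, min_level=0.1):
    """0..1 brightness at time t for one dim-and-return cycle."""
    phase = (t % period_s) / period_s            # 0..1 through the cycle
    ramp = abs(2 * phase - 1)                    # triangle wave: 1 -> 0 -> 1
    level = min_level + (1 - min_level) * ramp   # never fully dark
    return level ** GAMMA                        # gamma correction

def pulse_period(base_s, rssi_dbm, rssi_scale=0.05):
    """Weaker signal (more negative RSSI) -> longer, slower pulse."""
    return base_s + (-rssi_dbm) * rssi_scale

print(f"{pulse_brightness(0.0, 2.0):.2f}")  # cycle start: full brightness
print(f"near: {pulse_period(1.0, -40):.1f}s, far: {pulse_period(1.0, -85):.1f}s")
```

With this mapping, a "detect" state pulses visibly faster as the tracked access point gets closer, while an "idle" state can simply ignore RSSI and use a constant period.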
Each servo has the position (in degrees) it should move to, with a speed measured roughly in degrees per second. The servo angle can also have a random amount of twitch added to it.
The twitch behavior applies to all the selected servos simultaneously, and it will happen randomly within the given time. So a twitch time of "8" means you will get at least one twitch every eight seconds.
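One way to get the "at least one twitch every N seconds" guarantee is to schedule each next twitch a uniformly random interval into the window, so no gap can ever exceed the twitch time. A sketch (the firmware's actual scheduler may differ):

```python
import random

# Twitch scheduling: each inter-twitch gap is uniform in (0, twitch_s],
# so a twitch time of 8 guarantees at least one twitch every 8 seconds.
def next_twitch_time(now_s, twitch_s, rng=random):
    """Absolute time of the next twitch."""
    return now_s + rng.uniform(0, twitch_s)

rng = random.Random(42)   # seeded so the sketch is repeatable
t = 0.0
for _ in range(3):
    t = next_twitch_time(t, 8.0, rng)
    print(f"twitch at {t:.1f}s")
```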
One thing to note: the pose editor uses a different method of transmitting data back to the server. It actually writes files directly to the flash filesystem using HTTP POSTs, rather than using a specialized AJAX call or update command as most other things do. This saves a lot of complicated JSON re-parsing in the firmware (the client browser does all the work, and it's got more memory) but it means the pose editor will not work on some older/mobile browsers that lack some HTML5 features. Also, when you save a pose (the first time) you will be asked to HTTP authenticate with a username and password (the ones set in the system configuration).
The limitations on the number of LEDs and Servos might seem to preclude larger and more complicated animatronics, but it is possible to divide the task and have several ESP-12s listening for the same events. They won’t be exactly synchronized (which can add character), but you’ll save on wire by “distributing” the receivers around a large costume or installation.
Remember you can build a Wi-Fi device without any servos or lights, and it will act as a beacon when turned on. Just its presence can be like a remote switch for other devices.
In some future version, it could be interesting to have nodes rename themselves depending on their mood. That would act as a signaling channel to other nodes, without any other fancy TCP protocols needed.
Limitations, Going Further
Honestly, the ESP8266 isn’t the greatest real-time servo controller, and I wouldn’t trust it to run anything critical, anything sharp, or anything that needs to be secure. The PWM signal is generated by a software interrupt, not dedicated hardware, and it’s vulnerable to anything that messes with interrupts.
If you want to reliably control more than a couple of servos (four should be possible if you increase SERVO_COUNT) there are dedicated chips like the PCA9685, which will drive 16 servos with 12-bit accuracy over an I2C bus. These are something I will be exploring in the future. The aim here was to see how much we could do with the ESP-12 alone without extra hardware.
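To get a feel for what the PCA9685's 12 bits buy you at servo rates: at 50Hz the 20ms frame is divided into 4096 counts, so a pulse width maps to a count value with sub-5-microsecond resolution. A quick sketch of that arithmetic:

```python
# PCA9685 resolution at servo rates: a 20ms (50Hz) frame divided into
# 4096 counts by the 12-bit counter.
FRAME_US = 20_000   # one 50Hz servo frame, in microseconds
COUNTS = 4096       # 12-bit counter

def pulse_to_counts(pulse_us):
    """Convert a servo pulse width to the nearest PCA9685 count value."""
    return round(pulse_us * COUNTS / FRAME_US)

print(pulse_to_counts(1500))   # center position -> 307 counts
print(FRAME_US / COUNTS)       # ~4.88us of resolution per count
```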
In software, it's pretty trivial to extend the number of Neopixels (just increase LEDS_COUNT in the firmware and web interface) but even short chains of WS2812b's are very power-hungry, and I didn't want to make that the focus here. One was enough to show the principles and to give you a starting point.
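The power-hunger is easy to quantify: a WS2812b can draw up to roughly 60mA at full white (about 20mA per color channel), so even a modest chain adds up fast:

```python
# Why even short WS2812b chains are power-hungry: worst-case draw is
# roughly 60mA per pixel at full white.
MA_PER_PIXEL_FULL_WHITE = 60

def chain_max_ma(n_pixels):
    """Worst-case current for a chain, all pixels at full white."""
    return n_pixels * MA_PER_PIXEL_FULL_WHITE

print(chain_max_ma(1))    # one pixel: 60mA, easily manageable
print(chain_max_ma(16))   # a short ring: 960mA, nearly an amp on its own
```

That one 16-pixel ring would nearly double the worst-case power budget worked out earlier, which is why one pixel was enough for this project.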
Hopefully, the software goes beyond the usual “bare minimum” approach and provides a full example of several advanced techniques, including authentication and real-time comms, all at once. Changing it should be more about removing the stuff you don’t need.
There's enormous scope to improve the sophistication of the animation system, with queued or timed poses, or pre-recorded motion-capture. The ESP-12's flash filesystem has megabytes of space that could hold animation files, but that would tie the firmware to specific animatronics/editor software, so I didn't take that route.
Enormous thanks go to Naomi Wu (maker extraordinaire and 3D printing luminary) who introduced me to OpenSCAD and helped with some design elements for the Eye. Her patience and encouragement were more than I deserved. You can follow her on Twitter @RealSexyCyborg, if you can keep up with her awesomeness.
The original Doctor Strange Eye of Agamotto movie prop was designed by Alexandra Byrne and Barry Gibbs, who have created many of Marvel's iconic props and hero costumes. It's a work of art, and I've tried to pay homage to its many elements and subtle layers without being a direct copy. (Did you notice I changed the name by one character?)
Thanks also to Bill and Britt Doran from Punished Props (@Chinbeard and @LadyLongShanks) for getting me obsessed with Destiny, and to Amie D.D (@amiedoubleD) in particular for making me realize a Ghost is just a flying Eye-bot. I wish I'd thought of that.
And finally, thanks to the many wonderful people in the 3D Printing Community, including Thomas Sanladerer (@toms3dp), Joel Telling (@joeltelling), Angus Deveson (@makersmuse), and Naomi Wu who have put so much useful information on the internet and made it easier than ever for all of us to get into this rewarding hobby.