This is a bit of a convoluted story, but I think it's important for me to first set the scene so that you understand all of the ramifications…
This all started several months ago when I proposed the Propeller Beanie as the official headgear for members of the All Programmable Planet community, for which I don the ermine robe and brandish the scepter of authority in my role as Editor in Chief.
One of the members asked why we needed an official hat in the first place, and I replied that if a number of us were attending a technical conference, for example, then – if we were all sporting our propeller beanies – we would be able to quickly and easily spot each other in the crowd.
But then someone raised the issue that anyone can purchase a propeller beanie and wear it at a technical conference. On this basis, if you were to see someone else wearing a propeller beanie, you still couldn’t tell if that person was "one of us" or "one of them" as it were.
The solution we came up with was for all of our propeller beanies to form a wireless mesh network and to automatically keep track of each other's locations. This way, when two or more members move into close proximity, their propellers can start to spin and LEDs can start to flash (I like LEDs).
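The proximity-detection idea is simple enough to sketch in a few lines of Python. To be clear, this is not the actual CapNet firmware — the function names, positions, and the 10-meter threshold are all invented for illustration — but it shows the basic check each beanie would need to make once the mesh has shared everyone's location:

```python
import math

# Hypothetical (x, y) positions in meters that each beanie might learn
# from the mesh network. Names and the threshold are invented here.
PROXIMITY_THRESHOLD_M = 10.0

def distance(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearby_peers(me, peers, threshold=PROXIMITY_THRESHOLD_M):
    """Return the peers close enough to trigger spinning and flashing."""
    return [name for name, pos in peers.items() if distance(me, pos) <= threshold]

peers = {"max": (0.0, 0.0), "david": (3.0, 4.0), "bryan": (50.0, 50.0)}
others = {k: v for k, v in peers.items() if k != "max"}
print(nearby_peers(peers["max"], others))
# -> ['david']  (david is 5 m away; bryan is ~70 m away)
```

In the real beanies this check would presumably run on each node against positions gossiped over the mesh, with the result driving the motor and LED outputs.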
Of course "talk is cheap" as they say – actually implementing something like this takes a bit more effort. Thus it was that I turned to my chum David Ewing, who is the CTO at Synapse Wireless – a company that boasts that its wireless technology can be used to "Control and Monitor Anything from Anywhere" (I think it's fair to say that my propeller beanies took them by surprise).
After mulling this over for a while, David said that it wouldn’t be a problem, and he created a formal specification for CapNet – the world's first wireless mesh network to be deployed in propeller beanies. Of course, being an engineer, David couldn’t help himself from augmenting my original requirements specification with all sorts of "extras" like accelerometers, thermocouples, and cloud-based communications capabilities.
The real cream on the cake was when David told me that he would build six of these little rascals for me to take to the forthcoming DESIGN West Conference and exhibition, which is to be held in San Jose on 22-25 April, 2013 (Click Here to see the conference schedule and Click Here to register – note that the Expo Only pass is FREE!)
So far so good, but then things started to get more complicated. The conference organizers heard about CapNet and they invited David to give a paper describing the hardware components and software code that make this project spin. In this talk – Cool Beanies! A Mesh Networked Cranial Cooling System – David will discuss all of the design decisions and tradeoffs he made with regard to motor control, temperature sensors, accelerometers, LEDs, battery management, and… the list goes on.
Of course, this meant that David would need to build a few more of these wireless mesh networked propeller beanies to support his presentation… the tale was growing in the telling…
It was around this time that I heard that we were looking for companies to provide hands-on speed training sessions on the exhibit floor at DESIGN West. The idea is to provide the students with some form of development platform and to give them 45 minutes of speed training. Afterwards, the students can keep the development platform to experiment with further at home. (You can see the various hands-on training sessions by visiting the Schedule Builder page, clicking the Clear link, and then selecting the Hands-On Speed Training track. Note that this training is available to everyone attending the conference on a first-come, first-served basis, including those holding free Expo Only passes.)
So I asked David if Synapse Wireless would be interested in providing some of these training sessions using their low-power wireless mesh modules. Most wireless devices of this ilk have to be programmed in C/C++, and creating applications for these little rascals requires a lot of expertise. By comparison, applications for use on Synapse's wireless modules can be created in the easy-to-learn-and-use Python language and uploaded "over-the-air" (wirelessly) into the modules.
Originally, I had thought that Synapse would use their off-the-shelf wireless modules for this training. But Synapse's marketing manager, Bryan Floyd, said "Why not use the CapNet propeller beanies as the training platform?" This is obviously a wonderful idea – the only slight problem is that it would now require David to build 250 of these wireless mesh networked propeller beanies. (I'm happy to report that I was visiting with Synapse just a couple of days ago, and production is in full swing.)
But wait, there's more, because David's talk and the CapNet hands-on training all take place on Tuesday 23 April. Also, the conference "meet and greet" party will be held at the end of that day on the main exhibition floor from 5:30 to 7:00 p.m. All of the people receiving the CapNet training will be told to wear their wireless mesh networked propeller beanies at this party for a chance to win a prize.
And what a prize! This is something any engineer would want. Well, it's certainly something I would want, because I'm the one that picked it. This is a Parrot AR.Drone 2.0 Quadricopter featuring forward-facing and downward-facing video cameras that you can control using your iPod touch, iPhone, iPad, and/or Android devices.
The thing is that it's the propeller beanies that will communicate amongst themselves to determine who is the lucky winner. David says the algorithm they use is so cunning that my mind is incapable of understanding it, and I can’t argue with that.
And the MCU powering CapNet is… One of the things I've been interested in is which components David decided to use to implement the CapNet beanies. At the heart of the system, of course, is the microcontroller (MCU).
David told me that Synapse uses a wide variety of MCUs depending on the requirements of the target application. In some cases, these are standalone MCUs combined with external RF (radio frequency) transceiver devices; in other cases they are MCU/RF combo devices (i.e., the MCU and RF transceiver functions are presented in the same package – I think they are on the same silicon chip, but I'm not 100% sure about this).
Synapse has developed its own wireless OS called SNAP. This has a very low memory footprint (~40KB) and can run on affordable, low-power MCUs. In fact, SNAP can happily run on 8-bit, 16-bit, and 32-bit MCUs. Of particular interest is the fact that user applications are created in the easy-to-learn Python language, which is compiled down into "bytecode." This bytecode is loaded "over-the-air" into the wireless module(s), where it is executed on a Python virtual machine in SNAP. This has several important ramifications, including the fact that the same application can be run on any SNAP-enabled MCU without the need for recompilation. Also, since each bytecode instruction equates to between 1 and 10 native opcodes, the resulting applications are extremely memory efficient.
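You can get a feel for this compile-to-bytecode idea using desktop Python itself. To be clear, the SNAP virtual machine and its bytecode format are Synapse's own and differ from CPython's — this is just an analogy showing how a human-readable script becomes a compact, portable blob of bytecode that a VM can execute:

```python
# Compile a tiny "application" from source text down to bytecode, then
# execute it on the (CPython) virtual machine. SNAP does something
# conceptually similar with its own compiler and its own on-module VM.
source = "def blink(count):\n    return [n % 2 for n in range(count)]\n"

code_obj = compile(source, "<beanie_app>", "exec")  # source -> bytecode
namespace = {}
exec(code_obj, namespace)                            # VM executes the bytecode
blink = namespace["blink"]

print(blink(4))                      # -> [0, 1, 0, 1]
print(len(blink.__code__.co_code))   # the bytecode is just a handful of bytes
```

Because the bytecode is what travels over the air, the same application blob can land on any SNAP-enabled module regardless of which MCU sits underneath — the per-MCU porting work lives in the VM, not in your application.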
Synapse uses MCUs from multiple vendors, including Atmel, Freescale, Silicon Labs, and STMicro. In this case, however, David opted for the ATmega128RFA1 from Atmel. This little rascal (the chip, not David) combines an 8-bit AVR MCU with a low-power 2.4GHz RF transceiver. You can see the full range of capabilities offered by the ATmega128RFA1 by Clicking Here to visit the Arrow website (Arrow is supplying the majority of the CapNet components). However, David summarized the key points that led to his selecting this MCU for this project as follows:
The transceiver/MCU combo chip is small and cost effective
The power consumption is very low (less than 25mA with the radio on)
The sleep current gets down to about 1uA without compromising I/O functions or performance
Atmel’s 8-bit AVR architecture yields very fast CPU performance
The Received Signal Strength Indication (RSSI) has good resolution
There are plenty of PWM outputs available to control the motor and LEDs
The on-chip 128KB flash memory provides enough space for massively complex programs
The 16KB of on-chip SRAM provides lots of packet buffer and script-data space
The 4KB on-chip EEPROM is handy for nonvolatile data
The radio TX power and RX sensitivity is great
The radio supports interoperable 802.15.4 mode as well as higher bandwidths up to 2Mbps
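That RSSI resolution point is worth dwelling on, because RSSI is the obvious raw material for the beanies' proximity sensing. A common way to turn an RSSI reading into a rough distance estimate is the log-distance path-loss model. The constants below (received power at 1 meter, path-loss exponent) are illustrative guesses on my part, not measured CapNet values:

```python
# Log-distance path-loss model: rssi(d) = rssi(1m) - 10 * n * log10(d),
# solved here for d. The constants are assumptions for illustration only.
RSSI_AT_1M_DBM = -40.0    # assumed received power at 1 m
PATH_LOSS_EXPONENT = 2.0  # ~2 in free space; higher indoors or in crowds

def estimate_distance_m(rssi_dbm):
    """Estimate distance in meters from an RSSI reading in dBm."""
    return 10 ** ((RSSI_AT_1M_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

print(round(estimate_distance_m(-40.0), 1))  # -> 1.0
print(round(estimate_distance_m(-60.0), 1))  # -> 10.0
```

In practice RSSI is noisy — bodies, walls, and antenna orientation all move the reading around — so a real system would smooth over multiple samples, which is exactly where good RSSI resolution on the radio pays off.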
And finally… Lest you think that the idea of a bunch of guys and gals wandering around in propeller beanies is a tad far-fetched, may I be so bold as to mention the forthcoming movie The Internship, which will be hitting the streets in June 2013. This stars Owen Wilson and Vince Vaughn as two salesmen whose careers have been torpedoed by the digital age. Our heroes work their way into a coveted internship at Google, where they must compete with a group of young, tech-savvy geniuses for a shot at employment.
As you can see from the following video trailer (starting around 1:20), propeller beanies are the order of the day:
I'm prepared to bet that if the film's producers had been aware of CapNet when they were making this movie, then the propeller beanies you see in the trailer would have been wireless mesh networked!
If you found this article to be of interest, visit Microcontroller / MCU Designline where – in addition to my Max's Cool Beans blogs on all sorts of "stuff" – you will find the latest and greatest design, technology, product, and news articles with regard to all aspects of designing and using microcontrollers.
Also, you can obtain a highlights update delivered directly to your inbox by signing up for my weekly newsletter – just Click Here to request this newsletter using the Manage Newsletters tab (if you aren't already a member you'll be asked to register, but it's free and painless so don't let that stop you [grin]).
Last but certainly not least, make sure you check out all of the discussions and other information resources at All Programmable Planet. For example, in addition to blogs by yours truly, microcontroller expert Duane Benson is learning how to use FPGAs to augment (sometimes replace) the MCUs in his robot (and other) projects.