My uncle's quest for a beer-fetching robot
People devote careers and millions of dollars to designing robots that do amazing things: perform intricate surgeries, explore other planets. My Uncle Bill has simpler dreams for his robot: he just wants to build one that will bring him a beer every now and then.
That might sound like a relatively simple project, but it has proved to be a decades-long undertaking, one that has required him to learn several programming languages and demanded far more study time than the average casual hobby.
"When I started, I really didn't know what I was getting into," my uncle told me recently. "I soon realized it was a lofty goal."
Bill's inspiration came when he was in high school, though he's quick to clarify that the beer part of it didn't come into play until his college years. In 1977, a little sci-fi flick called "Star Wars" hit theaters, and that opened up a world of imagination for him and many others at that time.
"When 'Star Wars' came out, it was all over. I knew I wanted to be an engineer," Bill said. He's now a test engineer at a semiconductor company in Silicon Valley.
Smitten with R2-D2, Bill decided that he wanted to build a robot of his own. And later, that idea took the form of a beer-fetching bot. Here's the dream: He sits in his Santa Clara, Calif., home's living room, kitchen, or office, and says, "BeerBot, go get me a Heineken." BeerBot dutifully rolls off to the kitchen to retrieve said beer, then navigates its way back to Bill with beer and bottle opener in tow.
Sounds simple, right? Not so much. This would require voice recognition (so Bill can simply speak his command), object recognition (so BeerBot can distinguish a Heineken from a Guinness, or, for that matter, from a bag of carrot sticks), and some kind of mapping and navigation capability (so it can find the refrigerator from anywhere in the house). Add to that the strength and balance to open a modern refrigerator and, well, Bill's got his work cut out for him.
In 1977, when he conceived of this thing, he was attempting to build it without the aid of personal computers or any kind of modern, user-friendly software.
"When I started, I was using logic gates, or TTL (transistor-transistor logic), which don't do a lot," he said. That would allow the machine to make simple decisions (move forward vs. stop, turn left vs. go straight), but in the case of the BeerBot, you'd have to map out its exact path ahead of time. And the bot would have to start from the exact same position every time it began its great journey to the fridge. And even that would be difficult because, due to accumulated errors caused by wheel slippage, you could end up several feet away from your intended destination.
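The wheel-slippage problem can be sketched in a few lines of Python (a toy simulation with made-up slip numbers, not Bill's actual hardware): a pre-mapped path assumes perfect wheels, but a small random heading error at every step compounds into real drift by the time the bot reaches the fridge.

```python
import math
import random

def drive_straight(steps, slip_radians=0.05, seed=1):
    """Dead-reckoning toy: the bot executes a pre-programmed straight run
    of one-foot steps, but each step its heading drifts by a small random
    angle caused by wheel slippage."""
    rng = random.Random(seed)
    x = y = heading = 0.0
    for _ in range(steps):
        heading += rng.uniform(-slip_radians, slip_radians)
        x += math.cos(heading)  # progress in the intended direction
        y += math.sin(heading)  # sideways drift the bot can't detect
    return x, y

# With no slippage, the bot lands exactly where the pre-mapped path says.
print(drive_straight(30, slip_radians=0.0))  # → (30.0, 0.0)
# With slippage, the endpoint wanders, and the bot has no way to know.
x, y = drive_straight(30)
```

The point is that without sensing, the error only ever accumulates; no amount of careful path-mapping fixes it.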
"I don't even know if you could call what I ended up with a robot," Bill said. "It was more like a remote-controlled toy."
In college, he started getting into microcontrollers (and, of course, beer). Microcontrollers are small, draw very little power, and are programmable, which makes them far better suited to robotics projects than TTL ever was. Even today, microcontrollers are at the core of many of his other, simpler robotics projects.
As time passed, new technologies came into play that enabled Bill to make further advances, but each came with its own set of problems. Using a Webcam and a laptop computer attached to an iRobot Create (essentially a Roomba without the vacuum), he was able to get his bot to recognize colors, but, as he says, that's not a very practical way to navigate. It needs to be able to recognize objects.
He tried working with SIFT (scale-invariant feature transform), a patented, experimental algorithm published in 1999 that can be used for object recognition and robotic navigation. Bill says SIFT has quite a bit of potential for his project, but at the time, the implementation he found ran only on Linux. So he spent months learning Linux so he could use it.
So far, the SIFT-aided bot has proved to be very slow. Hooked up to a Webcam and laptop, it would take a picture and spend 7 seconds analyzing the image before moving forward just a foot. That would make for a mighty long wait for one beer.
Bill tried using a desktop computer linked to an onboard wireless camera, but that setup failed: the image quality from wireless cameras was not high enough for the vision software to work reliably.
Another technology with promise is an FPGA (field-programmable gate array) board, a semiconductor device containing millions of logic gates whose connections can be configured by the user; in effect, the user defines how the chip works.
Hooked up to a camera, an FPGA-embedded BeerBot could potentially process several frames per second, a vast improvement in its object recognition capabilities. But FPGA implementation also requires coding in yet another language, and after spending months trying to figure out how to program one, Bill gave up in frustration. Now he's hoping that someone will come out with an FPGA board that already has vision algorithms programmed into it.
In the meantime, other projects have been more successful. He's part of a growing community of robotics hobbyists who build advanced Halloween bots to scare the bejesus out of people. His haunted house has become something of a local spectacle, and he adds to it every year.
Motion or light sensors trigger most of the robots, from corpses that jump out at people the moment they pass by to a headless waiter (above) whose head springs up at visitors just as they reach for candy from the silver tray he holds.
Bill's proudest achievement so far is something he designed himself: a bot that appears to transform from Dr. Jekyll to Mr. Hyde right before your eyes (see animation below). When you enter his garage, you see a life-size Dr. Jekyll standing with a test tube in his hand. As you approach the figure, it turns into a ghoulish Mr. Hyde.
What's happening behind the scenes is that the bot has two masks, one on each side of its head. The head is on a motor salvaged from an old drill press and is spinning at 720 revolutions per minute. Two strategically timed strobe lights shine on the bot, each one synced up with one of the masks, and visitors can only see the mask when its corresponding strobe light is shining on it. At first, visitors can see only Dr. Jekyll's face.
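The timing works out neatly. Taking the 720 rpm figure from above, the arithmetic for syncing the strobes looks like this (the flash-rate reasoning is mine, derived from the description, not Bill's schematic):

```python
# Strobe timing for the spinning two-mask head (720 rpm per the article).
rpm = 720
revs_per_second = rpm / 60              # 12 full spins every second
flash_period_s = 1 / revs_per_second    # each strobe fires once per revolution
half_rev_offset_s = flash_period_s / 2  # the second strobe lags half a spin
# Each mask faces the audience 12 times a second, so each strobe flashes
# at 12 Hz; that's fast enough that the lit mask appears to hold still.
print(revs_per_second, round(flash_period_s * 1000, 1))  # → 12.0 83.3
```

So each strobe fires roughly every 83 milliseconds, offset about 42 milliseconds from the other, which is why each visitor sees only whichever mask "its" strobe catches.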
A motion sensor detects people approaching and triggers a microcontroller connected to a servomotor, which slowly swings a square-foot piece of cardboard from in front of one strobe light to the other, blocking more of the first strobe's light while uncovering the second's. This gives the impression of a gradual transformation, until all you can see is Mr. Hyde.
Bill's Halloween machines are much simpler than the elusive BeerBot. For that project, he'll keep plugging away, testing out different ideas and tinkering with new technologies as they come along. And while he hasn't met anyone working on a project quite like BeerBot, he has found a fair amount of resources out there. He's attended some Homebrew Robotics Club meetings. And he's garnered inspiration and ideas from online message boards, e-mail lists, and the shelves upon shelves of books he's collected over the years.
For now, Uncle Bill is perfectly capable of making the trek to the fridge himself whenever he's in need of a beverage. Having a droid as advanced as R2-D2 remains a distant dream. But technology keeps advancing, and he keeps learning. And maybe, just maybe, he'll one day toast his own success, and the fun he's had getting there, with a beer delivered by his BeerBot.
I, for one, hope it happens. And I hope that he thinks to give BeerBot two arms: one for him and one for his niece.