A while back I was contacted by an Applications Engineer at National Instruments who said NI was interested in releasing its own toolkit of VIs to control the AR Drone. He asked my permission to build their software on top of my toolkit, so I sent him my code. Now, I'm pleased to announce that NI's AR Drone Toolkit was released two weeks ago at NI Week 2012. You can get more information and download it from LabviewHacker.com/ardrone.php.
What does this new version give you?
- Support for the AR Drone 2.0
- An automated installer
- Xbox 360 controller support
- Animations (aka backflips and barrel rolls!)
- More frequent software updates and bug fixes
- The technical support of the LabVIEW Hacker team and larger NI community
Below is a low-quality video of the presentation:
My interest in aerial robotics began with an undergraduate summer project I did for my advisor. Over the years, students at the CEEO have done many amazing LEGO NXT-based projects including, but not limited to: milling machines, automated puppets, pancake makers, hamburger cookers, and sundae makers (I don't know what's with all the food projects). My advisor challenged me to make a flying LEGO NXT robot. I settled on the quadrotor platform, and over the course of a summer I constructed this craft:
What you see is an NXT, an Arduino-based inertial measurement unit, and brushless motors with electronic speed controllers, all mounted on an aluminum TETRIX frame. The NXT reads orientation information from the Arduino, calculates the motor speeds required to maintain stable flight, and sends these motor commands back to the Arduino, which spins the motors via the speed controllers.
So does it fly? Yep. Is it graceful? Nope. Check out the video of its first (and only) flight.
I had big plans for this robot but by the time I got it flying, the summer was over. This experience prompted the thought that became my thesis: “What if students could start with a functioning flying robot? What cool things could they create in the same amount of time it had taken me to build my quadrotor?”
I presented this weekend at the Boston Museum of Science as part of their Robot Block Party. I showed my AR Drone as well as a couple of other projects from the CEEO. I had a number of in-depth conversations with 9-year-olds about how computers see colors and how I could get my quadrotor to follow me around based on the color of my shirt. However, most of these kids' parents failed to understand why anyone would want to program an AR Drone quadrotor themselves instead of just buying the iPhone app.
I also realized that I created this site without any way to contact me. My email address is . Feel free to message me and discuss how aerial drones are the future and/or will be the death of us all.
I received a couple of requests to expand on how I got the AR Drone to fly through a hula hoop, as seen in this video:
Below is a slide from my defense:
The first step is identifying the hoop in the image. We compute a binary mask of the pixels whose green channel exceeds a certain threshold. Since the hoop is wrapped in bright green tape, only the pixels belonging to the hoop make it through the mask. However, the resulting binary image is often missing parts of the hoop; it could be partially off screen, as in the image above, so a simple blob detector would fail to identify the true center of the hoop.
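The masking step can be sketched in a few lines of Python with NumPy (the toolkit itself is written in LabVIEW; the threshold values and the tiny test frame here are illustrative, not taken from my code):

```python
import numpy as np

# Hypothetical RGB frame as an H x W x 3 uint8 array; a real frame would
# come from the drone's front camera.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1, 2] = (30, 220, 40)  # one bright-green "hoop" pixel for the demo

r = frame[..., 0].astype(int)
g = frame[..., 1].astype(int)
b = frame[..., 2].astype(int)

# Keep pixels whose green channel is bright AND clearly dominates red and
# blue, so white highlights don't leak through. Thresholds are made up.
mask = (g > 180) & (g > r + 60) & (g > b + 60)

# Coordinates of the surviving pixels, ready for the circle fit.
ys, xs = np.nonzero(mask)
```

The dominance checks (`g > r + 60`, `g > b + 60`) matter more than the brightness cutoff in practice, since plain thresholding on green alone also passes gray and white regions.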
My solution is to fit the equation of a circle to the binary data, much the same way Excel fits a linear or quadratic trend line to data. Our goal is to minimize the expression printed above the hoop picture, where r is the radius of our circle, a and b are the coordinates of its center, and xi and yi are the locations of each nonzero pixel. Luckily for us, this problem has a closed-form solution, printed in the top right of the slide. We can evaluate that expression by computing the summations listed below it. This happens for each new image we receive from the drone, and from this process we know the center and radius of our hoop.
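The same idea can be sketched in Python. The standard algebraic (Kåsa-style) circle fit has a closed-form solution via linear least squares: rewriting (x − a)² + (y − b)² = r² as 2ax + 2by + c = x² + y², with c = r² − a² − b², makes the problem linear in (a, b, c). Whether this is term-for-term the expression on my slide I leave to the slide; the sketch below is one common closed-form fit, not a transcription of my VI:

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares circle fit.

    Solves 2*a*x + 2*b*y + c = x**2 + y**2 for the center (a, b) and
    c = r**2 - a**2 - b**2, then recovers the radius r.
    """
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    z = xs**2 + ys**2
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Demo: recover a circle from only half its arc, like a hoop that is
# partially off screen. Center (3, -2), radius 5.
theta = np.linspace(0.0, np.pi, 50)
a, b, r = fit_circle(3.0 + 5.0 * np.cos(theta), -2.0 + 5.0 * np.sin(theta))
```

The key property, and the reason this beats a blob detector here, is visible in the demo: the fit recovers the full circle's center and radius from a partial arc, exactly the situation when the hoop is half out of frame.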
That information is fed into a four-state controller. We manipulate yaw and climb rate to center the hoop in the front camera's field of view, pitch to propel the drone forward through the hoop, and roll to cancel any sideways drift that would slide our drone right past the hoop.
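A toy version of that controller might look like the following Python sketch. Everything here is an assumption for illustration: the gains, the frame size, the sign conventions, and the simplification of holding roll at zero rather than feeding it a drift estimate as the real controller does. The actual toolkit implements this as LabVIEW VIs.

```python
def clamp(v, lo=-1.0, hi=1.0):
    """Limit a command to the normalized range [-1, 1]."""
    return max(lo, min(hi, v))

def hoop_controller(cx, cy, img_w=640, img_h=360,
                    k_yaw=0.8, k_climb=0.8, forward_pitch=0.3):
    """Toy proportional controller for flying toward a detected hoop.

    cx, cy: hoop center in pixels from the circle fit.
    Returns normalized (roll, pitch, yaw, climb) commands.
    Gains, frame size, and sign conventions are illustrative only.
    """
    # Normalized offset of the hoop center from the image center.
    ex = (cx - img_w / 2) / (img_w / 2)
    ey = (cy - img_h / 2) / (img_h / 2)

    yaw = clamp(k_yaw * ex)        # turn toward the hoop
    climb = clamp(-k_climb * ey)   # image y grows downward, so climb up
    pitch = clamp(-forward_pitch)  # constant nose-down pitch: fly forward
    roll = 0.0                     # simplification: real controller cancels
                                   # sideways drift here
    return roll, pitch, yaw, climb
```

With the hoop centered in the frame, yaw and climb commands go to zero and the drone simply pitches forward through it; an off-center hoop produces a corrective turn and climb first.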
Dear small niche audience of interested students, educators, hobbyists, and confused blog readers who have stumbled onto this post: my name is Michael Mogenson, and I'm a graduate student in Mechanical Engineering at Tufts University in Boston, MA. I work at the Center for Engineering Education and Outreach, and I'm releasing the first iteration of a project I've been working on: the AR Drone LabVIEW Toolkit.
What is the AR Drone LabVIEW Toolkit? It's a palette of VIs for National Instruments LabVIEW that can control the Parrot AR Drone quadrotor. With these VIs you can fly the AR Drone around your living room with a USB joystick or gamepad. You can write a program to fly your AR Drone autonomously. You can view the video stream from the AR Drone's cameras or navigation data from the on-board sensors. You can use the many included image processing algorithms to analyze images from the AR Drone and recognize objects, track people, navigate through enclosed spaces, or anything else you can think of.
If you want the AR Drone LabVIEW Toolkit you can download it from here. Follow the instructions in the READ ME file to install the Toolkit. It should work with the base package of LabVIEW 2010 or later on Windows, Mac, and Linux.
Check out these YouTube videos to see what I and other people have created with the AR Drone LabVIEW Toolkit.
Still reading this and have a lot of free time on your hands? You can read my master's thesis about this project here.