April 23, 2007
Our finalized circuit board
Early in the project, I decided it would be a good idea to design an entirely new blimp control board, so that we would have a lot more control over what happens onboard each blimp. For projects like this, it is definitely good to see whether you can just purchase a canned solution, or if you will have to build it completely from scratch. Choosing either extreme is probably a bad choice.
With this PCB design and the accompanying firmware that runs on it, we ended up a little more on the "from scratch" end, but now hopefully others can leverage our work. We used expressPCB to produce our boards for a very reasonable charge, accompanied with fast delivery. Our final design is available for download. You'll need the (free) design software from expressPCB to open and manipulate the designs. An accompanying schematic is also available.
Now, in terms of features, this PCB has quite a lot packed into a relatively small package:
- Onboard Microchip PIC18F1330 microcontroller, able to run at 32MHz with its internal oscillator. Beware of this inaccuracy in the currently-posted PDF manual!
- Serial communications with pins designed to mate with the Sparkfun BlueSMiRF. This allows the unit to operate remotely with very low power at a distance of 100 meters!
- Speed and direction control for 3 DC motors (400mA).
- Motor source voltage is separable from the main battery voltage -- so you can use different batteries to match your motors. If you don't need this, you just jump the main battery over.
- Two mutually-exclusive high-power outputs (400mA). We use these for two high-intensity LEDs.
- 0.01V-precise battery voltage sensing down to 3.0V.
- Pins designed to mate with an HMC6352 compass.
- Another pin set alternately compatible with this 3-axis accelerometer.
- Pin header for In-Circuit-Serial Programming with the Olimex PIC-MCP-USB programmer, which is compatible with the Microchip MPLAB.
- The whole thing weighs less than 15 grams populated, and measures just under 2.5" x 1.9".
Not bad for a first hack at a PCB, right?
April 21, 2007
All good things
The demonstration has concluded, and not without a bit of unintended excitement. (More on that soon, with photos and eventually video.) The project has concluded, but the conclusion is a soft one, and we will continue to broadcast blimpbot details here. Perhaps a flurry of such details as we wrap things up. All of our designs and much of our software will be available soon.
April 04, 2007
Painting on plastic
Painting our new clear polyurethane blimp has been among the many things we've been working on in the past few weeks. We tested Krylon engine enamel and Krylon fusion spraypaint on plastic, and neither seemed to work particularly well. We ended up using fabric paint for the first blimp -- and you can see us applying it in the video below. Beyond this, there has been much progress on all other aspects, but we will save those details for forthcoming entries.
March 05, 2007
Octopart, and "It Just Works"
Today I learned of a website called "Octopart," which is a search engine designed around locating electrical and mechanical parts. This probably would have saved me hours during the Blimpbot component search. For instance, I previously spent at least 30 minutes finding somewhere to buy our main microcontroller, because it was relatively new and not yet stocked by all the usual sellers. With Octopart, you can find the product and its availability across all of the major sites in, say, 0.005 seconds, according to the Octopart server. Pretty cool.
On another note, for software developers in particular, I have been impressed by the "It Just Works" (IJW) C++ unmanaged / C++ managed interoperability in Visual Studio 2005. Read on if interested. IJW is the name given to the ability to let unmanaged code access managed objects, and to let managed code seamlessly instantiate and call unmanaged objects. The upshot of this is that you can avoid expensive copy operations when passing data between the two realms. This is special to Visual C++; you can't do it with Visual C# or any of the other managed languages.
For the Blimpbots project, we're using a little bit of this to call several unmanaged libraries, including Intel OpenCV and the SURF keypoint detector. For starters, you can mark any unmanaged code blocks with a #pragma unmanaged directive. Then, within managed C++, you just call the unmanaged libraries as you normally would, as long as you're passing value types. If you want to pass arrays or reference types, you first use a special pin_ptr syntax to tell the garbage collector not to move a block of memory temporarily, and then you just call the unmanaged code and pass the pointer. I'm actually not quite sure how this is accomplished behind the scenes, but it just works, and has been quite helpful!
March 01, 2007
We've mentioned cameras and infrared LEDs, but how does all of that turn into a tracking system that monitors blimps flying around in a 3D environment? To be honest, we're not even totally sure yet, but we can at least describe the basic plans.
- Experiment with our cameras and figure out how much they distort images. The fanciest cameras out there still turn straight lines into curves, particularly near the outer edges of an image. We can summarize this distortion in the K matrix from the last post.
- Once we've placed the cameras in a room, figure out exactly where they are positioned and what direction they face. We will do this by positioning one or more objects at known locations, and use matrix operations to solve for the camera details.
- Now we need to look at the camera video and see whether we can find any blimps. First we perform a feature detection pass on the whole image using an algorithm called SIFT or SURF, which finds distinct points of interest in the image. We may also use specialized convolutions to look for bright spots if we employ IR LEDs. We then try to match features in the image to those found in pre-processed images of our blimps. Each match constitutes a "vote" for a particular location of a blimp. We tally up the votes, and cluster them, to see whether a large number of votes pool up in one or more areas. We store these groups for the next step. We'll perform the clustering with either the K-means algorithm or a heavily discretized Hough transform.
- We take our possible matches, and verify that they actually appear to be a blimp by trying to find an affine transform or a homography that maps the known blimp points to the measured ones. This can account for rotations and stretching/skewing, since the blimp will be viewed from a variety of angles.
- If this succeeds, we store our best guess for the front and the back of the blimps that are detected and send them over the network to a central computer.
- The central computer waits for data from all of our cameras, and uses the information we extracted in the first two steps to estimate the position and orientation of each blimp. The central computer will use the technique of Kalman filtering to smooth out irregularities and develop an estimate for the velocity of the blimps as well.
- Finally, we can send this information into a controller, which transforms the blimp locations into the terms it needs to control the blimp.
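To make the vote-clustering step above more concrete, here is a minimal K-means sketch in Python/NumPy. This is not the project's actual tracking code; the vote coordinates, cluster count, and threshold-free tallying are all made up for illustration:

```python
import numpy as np

def kmeans(votes, k, iters=20, seed=0):
    """Minimal k-means over 2D vote coordinates (illustrative only)."""
    rng = np.random.default_rng(seed)
    # Start from k randomly chosen votes as initial centers.
    centers = votes[rng.choice(len(votes), size=k, replace=False)]
    for _ in range(iters):
        # Assign each vote to its nearest center.
        d = np.linalg.norm(votes[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned votes.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = votes[labels == j].mean(axis=0)
    return centers, labels

# Fake votes: two pools of feature matches at different image locations.
rng = np.random.default_rng(1)
pool_a = rng.normal([100, 120], 5, size=(40, 2))
pool_b = rng.normal([400, 300], 5, size=(40, 2))
votes = np.vstack([pool_a, pool_b])

centers, labels = kmeans(votes, k=2)
# A cluster that accumulates many votes is a candidate blimp location.
counts = np.bincount(labels, minlength=2)
print(centers, counts)
```

In the real pipeline the clusters would then be handed to the affine/homography verification step rather than trusted directly.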
February 23, 2007
As we've mentioned, cameras will be used to determine the position of our blimps. But to do this, we have to understand a few things about the cameras. Quite obviously, we have to know where the camera is located, and where it's facing. But cameras also have certain intrinsic properties which cause straight lines to appear as curves, particularly near the outer edges of an image. The intrinsic properties also include the camera's resolution.
It turns out that we can capture all of these properties in a 3x4 matrix, P. The matrix is set up such that, when postmultiplied by a 3D coordinate, the product is a 2D coordinate. That is, where x and X are 2D and 3D homogeneous coordinates respectively,
x = PX
Algorithms exist to automatically compute P, but let's start with a manual example. Consider a single 3D point X at coordinates [x,y,z]=[1,1,1]. If the camera were sitting at coordinate [1,1,0] and facing along the z-axis, this point should be right in the center of view. If the camera had a resolution of 640x480, the point X should show up on the image at (320,240).
For a moment, let's ignore the camera's intrinsic properties and just figure out how to account for the camera's position C=(1,1,0) and orientation (facing straight along the z-axis). We can do this in two steps: shifting the camera, then rotating it. We achieve this with a homogeneous translation followed by a rotation. So we translate by C. Actually, we translate by -C, because we're moving the point X rather than the camera. What about the rotation? Well, we have already implied that the Z-direction is "straight ahead" for the camera, so it turns out the camera is already facing the right direction, and we can use the identity matrix for the rotation, R=I. These two operations produce a matrix operation X' = RTX, where X' is the position of the fixed-frame point X in the camera's reference frame, and R and T are the aforementioned rotation and translation.
In MATLAB, noting that X has an extra "1" scale factor tacked on, we get the result we expect, X' = (0 0 1), that is, just one unit in the Z-direction in front of the camera!
R = eye(3);                 % camera already faces along the z-axis
T = [eye(3) [-1 -1 0]'];    % 3x4 homogeneous translation by -C
X = [1 1 1 1]';             % homogeneous 3D point
X_prime = R*T*X             % returns [0 0 1]'
In the above calculations, we have still ignored the fact that the camera actually projects 3D coordinates onto a 2D plane, and that the camera has other intrinsic properties (like focal length). It turns out that these can all be represented by one more matrix K, multiplied by the two we have already seen, yielding the full camera matrix P = K*R*T. The camera has a focal length, which we'll just arbitrarily specify as 35.0, and two other intrinsic parameters k_x and k_y, which we also arbitrarily specify as 1.0 for now. With the image center at (320, 240), the K we use here is:

K =
    35     1   320
     0   -35   240
     0     0     1

(The sign flip on the y focal term accounts for the image's y-axis pointing downward, a common convention.)
Now we have everything to fully specify our camera, so we compute the result of the multiplication in MATLAB:

P = K*R*T =
    35     1   320   -36
     0   -35   240    35
     0     0     1     0
Now let's see what this camera can do. Remember all along we've been trying to take our coordinate X=(1,1,1) and turn it into camera coordinates, which we know from experience to be (320,240). So we will "take a picture" by premultiplying X by P, and we get precisely that:
>> P*[1 1 1 1]'
ans =
   320
   240
     1
Actually, it may be confusing that we have that extra "1". Again, this is a scale factor, and as long as it is "1," the coordinate is what we would expect. The K matrix is responsible for the camera's projection into 2D, which in general requires a division that yields a non-unit scale factor. In that case, we simply normalize the coordinate by dividing through by the scale factor, e.g., x = x/x(3) in MATLAB.
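For readers who would rather poke at this outside MATLAB, the whole worked example fits in a few lines of Python/NumPy. The K values here simply mirror the P = K*R*T result printed above; everything else follows the derivation in this post:

```python
import numpy as np

# Intrinsics: focal length 35, principal point (320, 240). The skew and
# sign convention are taken from the P matrix printed in the post.
K = np.array([[35.0,   1.0, 320.0],
              [ 0.0, -35.0, 240.0],
              [ 0.0,   0.0,   1.0]])

R = np.eye(3)                        # camera already faces along +z
C = np.array([1.0, 1.0, 0.0])        # camera center
T = np.hstack([np.eye(3), -C.reshape(3, 1)])  # 3x4 translation by -C

P = K @ R @ T                        # full 3x4 camera matrix

X = np.array([1.0, 1.0, 1.0, 1.0])   # homogeneous 3D point
x = P @ X                            # "take a picture"
x = x / x[2]                         # normalize by the scale factor
print(x[:2])                         # -> [320. 240.]
```

The normalization step is a no-op here because the scale factor is exactly 1, but it matters for points off the optical axis.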
Next, I'll apply the camera matrix to visualize something other than a single point, and then we'll show how the camera matrix is crucial in using multiple 2D images to reconstruct 3D scenes.
February 02, 2007
Keep your eye on the blimp
Or, alternately, the coffee mug, if you don't have a blimp available. We're switching from SIFT to SURF for 2D image processing, because of some great speed gains. This puts us one step closer to our next demo -- controlled vertical positioning.
January 31, 2007
the mysterious third guy
I'm Sam, the other member of the blimpbots team. I'm a second-year grad student in AI, studying nothing to do with blimps, but I'm working with Jeff on the technical side. I've been out of commission for most of January due to some other commitments, but now I'm full speed ahead on the project.
The completed blimpbot control board 1.0 is up and running, courtesy of Jeff (see the picture here, or Jeff's video below). There are a few details to change in a second revision, but it does what we need. We have arbitrary motor control over bluetooth, which is pretty awesome. The next step is to demonstrate a simple control loop -- we're going to set up one blimp and control it enough to stay at a set altitude. That involves getting a decent start on the control software and camera tracking, at least for the vertical dimension. So there's a bit of software to write, but stay tuned for our first demo.
January 28, 2007
All systems go (?)
Excellent news! The Pulse-Width-Modulated (PWM) signals that control the three independent blimp motors are working just fine under remote control. This means we can control power and direction independently, as hoped. There was a worry that the directional control would not work when all three motor outputs were enabled... but on that front, all systems go.
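Conceptually, the power/direction logic amounts to mapping a signed speed onto an H-bridge direction pair plus a PWM duty cycle. Here's a hedged Python sketch of that mapping -- the pin ordering and polarity are hypothetical, not our actual firmware, which runs in C on the PIC:

```python
def motor_command(speed):
    """Map a signed speed in [-1, 1] to an (in1, in2, duty) triple for a
    typical H-bridge: direction comes from the sign, PWM duty from the
    magnitude. Illustrative only -- pin names and polarity are made up."""
    if not -1.0 <= speed <= 1.0:
        raise ValueError("speed must be in [-1, 1]")
    duty = abs(speed)
    if speed >= 0:
        return (1, 0, duty)   # forward
    return (0, 1, duty)       # reverse

print(motor_command(0.75))    # -> (1, 0, 0.75)
```

The worry mentioned above was essentially whether three of these channels could run simultaneously without interfering, which the PIC18F1330's independent PWM outputs handle.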
The Blimpbot also has an onboard 3.3V voltage regulator, to keep the voltage to our future 3-axis accelerometer constant and within acceptable levels. The voltage is then routed to a reference pin on the microcontroller. Since we haven't bought the accelerometers yet, I was trying to use this pin to measure the battery's voltage (i.e., pin value = 3.3V / battery voltage, as a fraction of full scale). However, I'm just getting '0' back from the Analog-to-Digital converter. This isn't critical to flight, so I'm going to quit on this for now and hope for a response on the Microchip forums.
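For reference, once the ADC cooperates, recovering the battery voltage from that ratio is a one-liner. A sketch in Python, assuming a 10-bit ADC -- the actual resolution and scaling on the PIC may differ:

```python
def battery_voltage(adc_reading, adc_max=1023, vref=3.3):
    """Estimate battery voltage from the ADC reading of the fixed 3.3 V
    regulator output measured against the battery rail. Assumes a 10-bit
    ADC (hypothetical); the reading is vref/vbatt as a fraction of full
    scale, so vbatt = vref * adc_max / reading."""
    if adc_reading == 0:
        raise ValueError("ADC returned 0 -- exactly the failure we're seeing")
    return vref * adc_max / adc_reading

# Example: a reading of 845 on a 10-bit ADC implies roughly 4 V.
print(round(battery_voltage(845), 2))   # -> 4.0
```

This also shows why a '0' reading is useless rather than merely imprecise: the formula divides by the reading.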
In other news, Jason's excited that he might have enough leftover weight to add fun accessories to the Blimpbot. For example, maybe something like electroluminescent wire.
So, how heavy is it?
The Blimpbot circuit board was cut in half today. Don't worry, this is a good thing, as the board came with two identical sets of traces. I had some help making the cut, but broke a table jigsaw in the process. Jason is going to see if we can get access to the high-powered laser cutter for future PCB splits.
So, with a fully-assembled single board, I made some approximate measurements:
- Main board: 15.75g
- BlueSMiRF: 3.25g
- Battery: 18g (from web)
- Total: ~40g
Other news today is that the H-bridges seem to be wired correctly, and they will at least power some blue LEDs that I have with me. Spinning real motors forward and reverse still remains to be tested.
Also, some work on the bluetooth link has Blimpbot 1 talking reliably at 9.6 kbps. We can probably go a lot faster, but it shouldn't be needed.
January 27, 2007
So far, so good
The PCB arrived on schedule yesterday, and after the weekly GROCS meeting, I got started with assembling our actual prototype. Before I touched the PCB, I wanted to get basic serial communication working over Bluetooth. It turns out that our new PIC Microcontroller is so new, the manufacturer didn't finish updating the library that makes serial (USART) easy to implement. So I had to build the library myself, and got a simple test program working shortly thereafter.
And with that, it was time for circuit assembly onto the PCB. The moment of truth for the layout, and my soldering skills. The bluetooth connector was a little too close to the exposed pins on the back, so I angled it out a bit. Everything else kind of dropped into place. And when I powered it up, the example program I had written previously fired up and functioned just as it did on the breadboard. The In-Circuit Serial Programming port worked, too. However, several other parts remain to be tested, so we need to wait a bit before breaking out the champagne (or, alternately, the sparkling grape juice if on University premises). Check out a time-lapse MPEG video of circuit assembly!
(Video made using our new Axis camera.)
January 25, 2007
Not quite the Graf Zeppelin...
Hi! I'd like to introduce myself: I'm Jason Dietrich, and I'm the Art and Design student who should be taking most of the blame for keeping Jeff and Sam up nights slaving over hot soldering irons. While Jeff's been working on the electronic and camera tracking components of the project, I've been working on coming up with a chassis to secure those components into the blimps themselves.
Most of the preliminary designs were loosely based on the geometry of the Plantraco microblimp. They were modeled in CAD and cut using the U's CNC laser. They fit the miniature motors/gearboxes we've chosen well, but don't have enough clearance for the final PCB or the propellers, so we're back to the drawing board on that. After consulting with design Prof. Jan-Henrik Andersen, we may be taking a very different tack on chassis construction. More on that as it evolves.
Wednesday morning we did get to try out the microblimp with a 52" blimp envelope. We had been using oblong party balloons which inflated to approximately 28". The new envelope killed our mini helium tank, and we didn't even get it all the way full. But even so, it lifted the microblimp chassis, the 18g battery, the microblimp ballast and about $1.65 in loose change (a quarter weighs about 2.5g). These blimp bags are rated to about 106g, which should be more than we need. The microblimp motors were able to muscle the larger bag around at an adequate speed. We also did some rough tests on the strength of the G05 motors and gear box, with their super trick carbon propellers. These motors are about half the size of the microblimp motors, but due to their gearing and bigger prop, push more air. Nice.
I'm out of town for the next week, but I'll be trying to think blimp.
January 23, 2007
PIC shakeup, PCB sent-to-the-ether
The BlimpBot's schematic has been translated into a PCB layout, and the layout sent in for production to expressPCB.com. I sent them the design above, and an actual circuit board will arrive here Friday. There was of course a little money involved too.
Given the short timespan allotted to actually designing the PCB and reworking the schematic (as discussed below), I am crossing my fingers that it does not contain any big mistakes. Time will tell...
UPDATE: The PCB design underwent 2 revisions subsequent to this one. See newer PCB-related-posts for details. Here is the final design.
In other news, we skirted a near-disaster with the original schematic...
On Sunday it was discovered that the microcontroller used in our original design (i.e. all of the schematics below) is a bit insufficient. The short story is that it was only capable of controlling one of the three motors on the blimp. The only alternatives we could find were bigger -- 40 pins versus 18 -- overkill for what we want. Plus, each extra pin means more solder, more square area of PCB, and more weight, all of which are detrimental to miniature helium blimps.
But some more searching revealed the Microchip PIC18F1330, a brand new model with only a slightly incremented part number. This one can do just what we need -- control 3 independent PWM outputs. So the new PIC was added, the schematic revised, and the PCB design laid out, all since Sunday night.
January 20, 2007
Lab work day one
Today I actually put together a small fraction of the blimpbot circuit on a breadboard. Specifically, the parts that have to do with power, ground, power stabilization, the microcontroller, and its status LED.
The good news is that all of this works, and Microchip lets students use their C18 compiler for free! (It works with the PIC18 series of microcontrollers.)
The bad news is that one of the 18F1320 PICs we got from SparkFun was defective, so we're left with only one working one at the moment, which makes me slightly nervous.
On another note, the motors arrived, and they are impossibly small. Just compare them to the standard household tack in the photo below.
January 19, 2007
Since we haven't been able to find any help with soldering itsy bitsy components, the design has been revised to use more soldering-iron friendly parts. And a tweak to the serial communications circuit left more pins open, so we can now toggle between two bright LEDs. Other tweaks are contained within the new schematic.
We've also been working through some designs for the assembly which the motors and PCB will attach to. They're looking pretty cool, but Jason would be the appropriate person to discuss that.
January 17, 2007
A revised schematic
After correcting a mistake in the logic for the motor outputs, we have a slightly revised schematic.
One of our current challenges is to find someone who can help us attach very small surface-mount components. The motor drivers and accelerometers are quite small.