March 05, 2007
Octopart, and "It Just Works"
Today I learned of a website called "Octopart," a search engine designed around locating electrical and mechanical parts. This probably would have saved me hours during the Blimpbot component search. For instance, I previously spent at least 30 minutes finding somewhere to buy our main microcontroller, because it was relatively new and not carried by all of the usual sellers. With Octopart, you can find a part and its availability across all of the major sites in, say, 0.005 seconds, according to the Octopart server. Pretty cool.
On another note, for software developers in particular, I have been impressed by the "It Just Works" (IJW) managed/unmanaged C++ interoperability in Visual Studio 2005. Read on if interested. IJW is the name given to the ability to let unmanaged code access managed objects, and to let managed code seamlessly instantiate and call unmanaged objects. The upshot is that you can avoid expensive copy operations when passing data between the two realms. This is special to Visual C++; you can't do it with Visual C# or any of the other managed languages.
For the Blimpbots project, we're using a little bit of this to call several unmanaged libraries, including Intel's OpenCV and the SURF keypoint detector. For starters, you can mark any unmanaged code blocks with a #pragma unmanaged directive. Then, within managed C++, you just call the unmanaged libraries as you normally would, as long as you're passing value types. If you want to pass arrays or reference types, you first use a special pin_ptr syntax to tell the garbage collector not to move a block of memory temporarily, and then you call the unmanaged code and pass the pointer. I'm actually not quite sure how this is accomplished behind the scenes, but it just works, and has been quite helpful!
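To make that concrete, here is a minimal sketch of the pattern, assuming a single .cpp file compiled with /clr in Visual C++ 2005. This is not our actual Blimpbot code; the function and variable names (ProcessPixels, InvertManagedImage) are made up for illustration.

#pragma unmanaged
// Ordinary unmanaged function that expects a raw pointer to pixel data.
void ProcessPixels(unsigned char* pixels, int length)
{
    for (int i = 0; i < length; ++i)
        pixels[i] = 255 - pixels[i];   // e.g. invert an 8-bit image
}

#pragma managed
using namespace System;

void InvertManagedImage(array<Byte>^ image)
{
    // Pin the managed array (assumed non-empty) so the garbage collector
    // won't move it while the unmanaged code holds a raw pointer into it.
    pin_ptr<Byte> pinned = &image[0];

    // "It Just Works": call straight into the unmanaged function.
    ProcessPixels(pinned, image->Length);

    // When 'pinned' goes out of scope, the array is free to move again.
}

The pin_ptr is what lets the garbage collector and the raw pointer coexist: the array stays put for the duration of the call, and no copy of the pixel data is ever made.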
Hey Blimpbot fans-
I don’t know if you all could tell, but during the past week we’ve been on spring break. Before we left, though, we had a design review with all the other GROCS groups and a few interested members of the public. We gave a simple presentation outlining our project. Then we ran a relatively successful hover test, where we used a single camera and a computer running SURF to keep the blimp more or less in one spot vertically (we kept it off the ground and the ceiling, anyway). Then we passed out a questionnaire to spur some discussion on the project. Here are some of the suggestions we received from the questions we asked:
What behaviors would you like to see in the blimps?
-Predator/Prey- one blimp is chasing the others. Possibly if the “Predator” blimp “gets” another blimp by getting too close to it, that blimp has to descend to a certain altitude.
-Tag- a variation on Predator/Prey might be “tag,” where the “predator” switches when it gets close enough to another blimp, “tagging” it. Possibly the blimp’s LED comes on or changes color.
-Herding- another version might be “herding,” where one blimp tries to “round up” other blimps that are trying to avoid it.
-Pack pursuit- The opposite of herding, multiple blimps pursue one blimp. An example that kept coming up was of a girl at a club being pursued by a bunch of guys. (Obviously, some people were already thinking about spring break ;-) )
-Patrolling behavior- The blimps run a set course, which has the possibility of evolving into Shriner-car-style coordinated movements (figure 8s, clover leafs, etc.)
-Swarm/flock- The blimps obey simple rules for their motion creating a complex and ever-changing pattern of movement
-Target chasing- The blimps pursue aerial targets or the blimps react to a target on the floor, be it a specific object, a hat, the beam of a laser pointer or groups of people. Blimps might cluster around the object or just change their patterns of movement according to the position of the object.
-Noise chasing- The blimps move to the loudest or quietest point in the room
-Stochastic self-assembly- The blimps have to follow 5-10 simple rules
-Voice commands- The blimps react to voice commands
-Scrabble- The blimps attempt to spell a word or words (using letters painted on the blimp’s side or an LCD/LED display), possibly controlled by some sort of user interface
-Dance- The blimps create aesthetically pleasing patterns. This may only be realized in time-lapse video, or it could be seen by creating an image of the paths the blimps trace, using the trails their LEDs make over the period of, say, 40 minutes.
-Follow the leader- The blimps follow a “Lead Blimp”
-Mirror movement- Two sets of blimps mirror each other’s movements.
-Buddy system- The blimps move in pairs or families. Or even in trains.
-Collector- A version of Predator/Prey where the “Tagged” blimps have to follow the “Predator”
-Relax mode- Blimps fly slowly
-Pouty mode- Blimps sulk in corners
-Spin mode- Blimps rotate 360 while maintaining their position
-Zone defense- Blimps defend their “territory” while trying to muscle other blimps around
-Dive bomb- Blimps climb as high as possible, then descend as quickly as possible (Probably not all that exciting with LTA craft…)
-Docking- Blimps navigate to a pre-determined point
-Disease- Blimps have one color LED displayed. A “diseased” blimp is introduced displaying a different color LED. When it gets close enough to a “well” blimp (without actively pursuing it) that blimp’s LED changes color as it gets “diseased”, and is also able to pass on the “infection.”
Should we focus on allowing users to define behaviors (which may or may not be interesting), or on having more pre-programmed, interesting behaviors?
-There was generous support for interesting pre-programmed behaviors over heavy user interaction, especially if user error could lead to technical problems and the system crashing. Literally.
-Perhaps you could just control the lead blimp, and the others could follow it using swarming rules
-Perhaps the user could just control which blimp was the “lead” blimp
-Perhaps the user could simply switch the blimps from patrol to swarming behavior, either by flashing a light or making a sound?
-Simple user controls will probably be the most engaging
-It might not matter what the behaviors are as long as they aren’t predictable
-User defined behaviors should have an immediate response, so this likely restricts those behaviors to trivial changes, like the colors of LEDs
-Wagering on the “performance” of individual blimps (I think this might require a Vegas simulcast…)
-A combination of some kind of low-level user control that modifies a pre-programmed behavior.
What kinds of things could we add to the blimps to make them appear more “lifelike”?
-Why not use them for guerrilla marketing?
-Could the blimps somehow be used to make paintings? Could they drop paint?
-If they flew very low, it would be easier for people to interact with them.
-Each blimp should have its own color
-Each blimp could behave differently, fast/slow, like to fly high, like to fly low
-Making each blimp look different might give them their own identity
-Breaking up the outline of the blimp shape might make them look more interesting
-Use the blimp’s LED to reflect its personality and/or emotional state
-Use the blimps to convey a bigger message.
-Use natural materials if possible, and consider things resembling scales, fur, etc
-Aesthetically, the blimp prototype is already very cool.
-Sorry, I don’t know enough about how “live” blimps behave to comment.
-Names and personas, for each blimp, perhaps displayed on the NET or on posters? It would be nice to be able to say, “Oh my goodness! Hank is chasing Bertha! That rascal.”
-Eyes could reflect different personalities (LEDS?)
-Could they make different sounds?
-Their shape evokes fish…
-Reaction to the environment, being aware of people/noise in the room
-Additional balloons resembling appendages or eyes might give added buoyancy and suggest other forms.
March 01, 2007
We've mentioned cameras and infrared LEDs, but how does all of that turn into a tracking system that monitors blimps flying around in a 3D environment? To be honest, we're not even totally sure yet, but we can at least describe the basic plans.
- Experiment with our cameras and figure out how much they distort images. The fanciest cameras out there still turn straight lines into curves, particularly near the outer edges of an image. We can summarize this distortion in the K matrix from the last post.
- Once we've placed the cameras in a room, figure out exactly where they are positioned and what direction they face. We will do this by positioning one or more objects at known locations and using matrix operations to solve for each camera's position and orientation.
- Now we need to look at the camera video and see whether we can find any blimps. First we perform a feature detection pass on the whole image using an algorithm called SIFT or SURF, which finds distinct points of interest in the image. We may also use specialized convolutions to look for bright spots if we employ IR LEDs. We then try to match features in the image to those found in pre-processed images of our blimps. Each match constitutes a "vote" for a particular location of a blimp. We tally up the votes and cluster them to see whether a large number of votes pool up in one or more areas, and we store these groups for the next step. We'll perform the clustering with either the K-means algorithm or a heavily discretized Hough transform (a K-means sketch appears after this list).
- We take our possible matches and verify that they actually correspond to a blimp by trying to find an affine transform or a homography that maps the known blimp points onto the measured ones. This can account for rotations and stretching/skewing, since the blimp will be viewed from a variety of angles (a simple affine-fit sketch appears after this list).
- If this succeeds, we store our best guess for the front and the back of the blimps that are detected and send them over the network to a central computer.
- The central computer waits for data from all of our cameras, and uses the calibration information we extracted in the first two steps to estimate the position and orientation of each blimp. The central computer will use the technique of Kalman filtering to smooth out irregularities and develop an estimate for the velocity of the blimps as well (a small Kalman filter sketch appears after this list).
- Finally, we can send this information into a controller, which translates the blimp locations into the commands it needs to steer the blimp.
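For the curious, here are a few rough sketches of the pieces above, starting with the vote clustering. This is not our actual code; it is plain C++ with no OpenCV dependencies, the numbers in main() are made up, and the names (Point2D, clusterVotes) are just for illustration. The real system would feed in the image coordinates suggested by the SURF feature matches.

#include <iostream>
#include <vector>

struct Point2D { double x, y; };

Point2D makePoint(double x, double y) { Point2D p; p.x = x; p.y = y; return p; }

// Cluster 2D "votes" (image locations suggested by feature matches) into k
// candidate blimp locations with plain K-means. Assumes votes.size() >= k.
std::vector<Point2D> clusterVotes(const std::vector<Point2D>& votes, int k, int iterations = 20)
{
    // Seed the centers with the first k votes (good enough for a sketch).
    std::vector<Point2D> centers(votes.begin(), votes.begin() + k);
    std::vector<int> assignment(votes.size(), 0);

    for (int iter = 0; iter < iterations; ++iter) {
        // Assignment step: attach each vote to its nearest center.
        for (size_t i = 0; i < votes.size(); ++i) {
            double best = 1e300;
            for (int c = 0; c < k; ++c) {
                double dx = votes[i].x - centers[c].x;
                double dy = votes[i].y - centers[c].y;
                double d = dx * dx + dy * dy;
                if (d < best) { best = d; assignment[i] = c; }
            }
        }
        // Update step: move each center to the mean of its assigned votes.
        for (int c = 0; c < k; ++c) {
            double sx = 0, sy = 0;
            int n = 0;
            for (size_t i = 0; i < votes.size(); ++i)
                if (assignment[i] == c) { sx += votes[i].x; sy += votes[i].y; ++n; }
            if (n > 0) { centers[c].x = sx / n; centers[c].y = sy / n; }
        }
    }
    return centers;
}

int main()
{
    // Made-up example: two tight clumps of votes, as if two blimps were in view.
    std::vector<Point2D> votes;
    votes.push_back(makePoint(101, 200)); votes.push_back(makePoint(103, 198));
    votes.push_back(makePoint(99, 202));  votes.push_back(makePoint(104, 201));
    votes.push_back(makePoint(400, 310)); votes.push_back(makePoint(398, 312));
    votes.push_back(makePoint(402, 308)); votes.push_back(makePoint(401, 311));

    std::vector<Point2D> candidates = clusterVotes(votes, 2);
    for (size_t i = 0; i < candidates.size(); ++i)
        std::cout << "candidate blimp " << i << " near ("
                  << candidates[i].x << ", " << candidates[i].y << ")\n";
    return 0;
}

The Hough-transform alternative would replace the iteration with a coarse 2D histogram of the votes and pick the fullest bins; K-means has to be told how many blimps to look for, which is one reason we might prefer the histogram approach once the number of blimps in view starts to vary.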
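Next, a rough sketch of the verification step. A full homography fit is more involved (OpenCV provides routines for it), so this shows the simpler case mentioned above: an affine transform fit to exactly three model-to-image correspondences, then used to check how many of the remaining matches are consistent with it. The names and the pixel tolerance are illustrative, not from our actual code.

#include <cmath>
#include <vector>

struct Pt { double x, y; };

// 2x3 affine transform: (x, y) -> (a*x + b*y + c, d*x + e*y + f)
struct Affine { double a, b, c, d, e, f; };

static double det3(const double m[3][3])
{
    return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
         - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
         + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
}

static double detWithColumn(const double m[3][3], int col, const double rhs[3])
{
    double t[3][3];
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            t[r][c] = (c == col) ? rhs[r] : m[r][c];
    return det3(t);
}

// Fit the affine transform that maps three model points exactly onto three
// image points (Cramer's rule on a 3x3 system, once per output coordinate).
bool fitAffine(const Pt model[3], const Pt image[3], Affine& out)
{
    const double M[3][3] = {
        { model[0].x, model[0].y, 1.0 },
        { model[1].x, model[1].y, 1.0 },
        { model[2].x, model[2].y, 1.0 },
    };
    double D = det3(M);
    if (std::fabs(D) < 1e-9) return false;   // collinear points: no unique fit

    double u[3] = { image[0].x, image[1].x, image[2].x };
    double v[3] = { image[0].y, image[1].y, image[2].y };
    out.a = detWithColumn(M, 0, u) / D;
    out.b = detWithColumn(M, 1, u) / D;
    out.c = detWithColumn(M, 2, u) / D;
    out.d = detWithColumn(M, 0, v) / D;
    out.e = detWithColumn(M, 1, v) / D;
    out.f = detWithColumn(M, 2, v) / D;
    return true;
}

static Pt applyAffine(const Affine& A, const Pt& p)
{
    Pt q;
    q.x = A.a * p.x + A.b * p.y + A.c;
    q.y = A.d * p.x + A.e * p.y + A.f;
    return q;
}

// A cluster of votes is accepted as a real blimp only if most of its matches
// are consistent with a single transform of the stored blimp image.
int countConsistentMatches(const Affine& A, const std::vector<Pt>& modelPts,
                           const std::vector<Pt>& imagePts, double tolerancePixels)
{
    int consistent = 0;
    for (size_t i = 0; i < modelPts.size(); ++i) {
        Pt p = applyAffine(A, modelPts[i]);
        double dx = p.x - imagePts[i].x;
        double dy = p.y - imagePts[i].y;
        if (std::sqrt(dx * dx + dy * dy) <= tolerancePixels)
            ++consistent;
    }
    return consistent;
}

A real implementation would likely try several different triples of matches (RANSAC-style) and keep whichever transform explains the most matches.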
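Finally, a sketch of the Kalman filtering idea, again in plain C++ and with made-up noise values and measurements. It tracks a single axis with a constant-velocity model, measuring only position; the real filter would track all three axes plus orientation, and OpenCV also ships its own Kalman filter implementation.

#include <iostream>

// One axis of a constant-velocity Kalman filter: the state is [position, velocity]
// and the cameras measure position only. Run one copy per axis per blimp.
struct AxisKalman {
    double pos, vel;     // state estimate
    double P[2][2];      // estimate covariance
    double q;            // process noise: how much the blimp can accelerate unexpectedly
    double r;            // measurement noise: how jittery the camera position fixes are

    AxisKalman(double processNoise = 0.1, double measurementNoise = 4.0)
        : pos(0), vel(0), q(processNoise), r(measurementNoise)
    {
        P[0][0] = 100; P[0][1] = 0;   // start out very uncertain
        P[1][0] = 0;   P[1][1] = 100;
    }

    // Predict the state dt seconds ahead using the constant-velocity model.
    void predict(double dt)
    {
        pos += vel * dt;
        // P = F P F^T + Q, with F = [[1, dt], [0, 1]] and Q approximated as q*I.
        double p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q;
        double p01 = P[0][1] + dt * P[1][1];
        double p10 = P[1][0] + dt * P[1][1];
        double p11 = P[1][1] + q;
        P[0][0] = p00; P[0][1] = p01; P[1][0] = p10; P[1][1] = p11;
    }

    // Fold in a new position measurement from the cameras.
    void correct(double measuredPos)
    {
        double innovation = measuredPos - pos;
        double s = P[0][0] + r;        // innovation covariance (H = [1, 0])
        double k0 = P[0][0] / s;       // Kalman gain for position
        double k1 = P[1][0] / s;       // Kalman gain for velocity
        pos += k0 * innovation;
        vel += k1 * innovation;
        // P = (I - K H) P
        double p00 = (1 - k0) * P[0][0];
        double p01 = (1 - k0) * P[0][1];
        double p10 = P[1][0] - k1 * P[0][0];
        double p11 = P[1][1] - k1 * P[0][1];
        P[0][0] = p00; P[0][1] = p01; P[1][0] = p10; P[1][1] = p11;
    }
};

int main()
{
    AxisKalman x;                                       // track one coordinate
    double fixes[] = { 10.2, 10.9, 12.1, 12.8, 14.1 };  // made-up camera fixes
    for (int i = 0; i < 5; ++i) {
        x.predict(0.1);                                 // assume 0.1 s between frames
        x.correct(fixes[i]);
        std::cout << "pos " << x.pos << "  vel " << x.vel << "\n";
    }
    return 0;
}

Even this toy version shows the two things we are after: the noisy position fixes get smoothed, and a velocity estimate falls out for free, which no single camera frame can measure directly.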