Imagine RIT 2010
This weekend was the third annual Imagine RIT festival. A record-breaking 32,000 people attended, and it felt like every one of them walked by our booth and asked us to explain Retaliator. Unfortunately, Thomas’s pedestrian-tracking algorithm wasn’t developed enough to track in real time by the time of the festival, so I threw together my first attempt at video processing. The end result was a turret that would track and fire at motion!
Brian was manning the etch-a-sketch booth, where he showed off some pretty impressive pictures rendered by the automated etch-a-sketch controller. He put in a huge amount of work this week gathering calibration data for backlash and friction losses, and ended up impressing a lot of people. Grant and Howland were driving around the shopping cart (which unfortunately met an untimely end when they found a curb and a battery went through a laptop screen). Finally, Crawford and Green showed off their implementations of iButton and Bluetooth door locks, respectively.
With our project, the hardest part of moving from mouse control to image processing was converting the two-dimensional fix produced by the image-processing software into spherical coordinates to feed directly into the servos. Because of how the servos are positioned in the gimbal, the “pan” servo’s axis is transformed by the tilt servo, so control is slightly unintuitive. By using a function that accepts a Cartesian three-dimensional fix as its argument, more advanced image-processing techniques can be layered onto the system, for example to account for ballistic-arc information. Right now the image processing is very “naive,” as Thomas would say, but the physical platform is stable enough that a permanent installation can be made for people to test their own image-processing implementations.
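The Cartesian-to-spherical conversion described above might be sketched like this. This is not the club's actual code; the function name, the axis convention (x right, y up, z forward out of the camera), and the assumption of an ideal gimbal with the pan axis vertical are all mine for illustration.

```python
import math

def cartesian_to_pan_tilt(x, y, z):
    """Convert a 3-D Cartesian fix into (pan, tilt) angles in degrees.

    Assumed camera frame: x right, y up, z forward. With the pan servo
    carrying the tilt servo, pan is the rotation about the vertical axis
    and tilt is the elevation above the horizontal plane.
    """
    pan = math.degrees(math.atan2(x, z))                   # left/right deflection
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # elevation above horizon
    return pan, tilt

# A target dead ahead requires no deflection on either axis:
pan, tilt = cartesian_to_pan_tilt(0.0, 0.0, 1.0)   # -> (0.0, 0.0)
```

Taking a full 3-D fix rather than a 2-D pixel coordinate is what leaves room for the ballistic-arc correction mentioned above: a range estimate lets you add a drop compensation to the tilt angle before commanding the servos.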