June 10, 2014


Filed under: Daily — profmason @ 7:37 pm

The new CMUCam V5 uses an ARM microprocessor with a camera to do hue-based color tracking.  It can track up to 100 colored blobs of seven different categories at 50 Hz.

Previously, the CMUCam V4 had been used for the successful Stryker Humanoid robot (gold medal winner in obstacle avoidance and bronze in Lift and Carry).  The V4 would track a single blob based on a range of RGB color values.

The CMUCam V4 was well documented and easy to integrate with the microcontroller of your choice by passing serial messages.

The CMUCam V5 seems to support similar options, but the documentation is lacking at this point.  Here is what I can piece together from the wiki:

There are two external commands implemented:

  • getBlocks(uint16_t maxBlocks=1000);

This command takes as an argument the maximum number of blocks requested (or uses the default of 1000). It returns the number of blocks detected.  It also populates an array within the pixy object called pixy.blocks.

The pixy.blocks array contains one entry per detected block, indexed from 0. Each entry contains:

  1. pixy.blocks[#].x
  2. pixy.blocks[#].y
  3. pixy.blocks[#].width
  4. pixy.blocks[#].height

The getBlocks command must be called repeatedly to get the block data to update.

  • int8_t setServos(uint16_t s0, uint16_t s1);

setServos sets the values of the servos between 0 and 1000.  As far as I can tell the pulse width sent out is about (500 + value) µs, but I haven’t put it on the scope yet. In practice 500 is a reasonable center for most servos.
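Assuming that (500 + value) µs relationship holds (again, my guess from observation, not verified on a scope or taken from the documentation), the mapping can be sketched as:

```cpp
#include <algorithm>
#include <cstdint>

// Map a Pixy setServos() value (0-1000) to an output pulse width in
// microseconds, under the ASSUMED relationship pulse = 500 + value.
// Values outside 0-1000 are clamped to the valid range.
uint16_t servoValueToPulseUs(int value) {
    int clamped = std::max(0, std::min(1000, value));
    return static_cast<uint16_t>(500 + clamped);
}
```

Under this assumption a value of 500 gives a ~1000 µs pulse, which matches it being a reasonable center.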

At this point there is no way to programmatically define a set of hue and saturation targets.  Instead, this is done via a button on the device.

To integrate the PIXY into a small robot, an Arduino is used to process the pixy.blocks data and send information to the servo channels using the two commands.

Building your PIXYBot

Parts list:
  • Two RC Servos (I used HK15138 as they are about as cheap as you can get)
  • Two RC Servo wheels (I cut mine on the router, but they could be laser cut, 3D printed or purchased)
  • Either 4xAA battery holder or small 2S lithium battery. (I used a 350 mAh 2S pack)
  • 1/8″ wood scrap.
  • Arduino UNO or equivalent
  • hot glue
  • foam tape

Step 1: Glue the two servos together end to end as shown, using hot glue.

Step 2: Cut a piece of 1/8″ scrap to ~72 mm x 80 mm (this is what I used, but precise dimensions are not important). Drill two holes ~49 mm apart, about 4 mm back from one edge, to mount your PIXY.

Step 3: Hot glue the scrap plate to the servos so that the back of the plate lines up with the back of your servos.

Step 4: Attach your PIXY using the 4x right-angle hardware.

Step 5: Use foam tape to attach your Arduino to the top of the platform in the orientation indicated in the picture at right.

Step 6: Use foam tape to attach the battery in the area behind the servos.

Step 7: Use a 40 mm 4-40 bolt with double nuts as a front leg for your robot.  If you want better performance, take a hole puncher to a plastic milk bottle or the equivalent and glue the resulting disk over the head of the screw to decrease friction.

Step 8: If you want to make it extra fancy, attach a JST power switch between the battery and the battery connector on the board.

Congratulations, your fabrication is done.

Below is the program that runs on the Arduino. Once you have loaded it, you will need to train your PIXY on the object of interest for your PIXYbot.

// PIXYBot
// Simple implementation of a robot based on two RC servos and a PIXY
// Martin S. Mason  6/10/2014

#include <SPI.h>
#include <Pixy.h>

#define LEFT_CENTER   410
#define RIGHT_CENTER  480
#define X_CENTER      160L
#define Y_CENTER      100L
#define RCS_MIN_POS   200L
#define RCS_MAX_POS   650L

int leftMotor;   // Servo command for the left motor
int rightMotor;  // Servo command for the right motor
int Error;       // Horizontal offset of the target from the frame center

Pixy pixy;  // Define the PIXY

void stopMotors()  // Primitive to stop the robot
{
  leftMotor = LEFT_CENTER;
  rightMotor = RIGHT_CENTER;
}

void turn()  // Primitive to turn toward the target
{
  leftMotor = LEFT_CENTER + Error;
  rightMotor = RIGHT_CENTER + Error;
}

void goForward()  // Primitive to go forward
{
  leftMotor = RCS_MIN_POS;
  rightMotor = RCS_MAX_POS;
}

void setup()
{
  pixy.init();  // Initialize the PIXY
}

void loop()
{
  uint16_t blocks;
  blocks = pixy.getBlocks();  // Detect blocks and pump the pipeline
  if (blocks)  // Did we see an object in the frame?
  {
    Error = pixy.blocks[0].x - X_CENTER;  // Distance between the target and the center of the frame
    if (abs(Error) > 20)  // If the target is out of the center, turn toward it
      turn();
    else                  // Otherwise go forward
      goForward();
    pixy.setServos(leftMotor, rightMotor);  // Update the motors
  }
}

March 15, 2014

Investigating HC-SR04 ultrasonic sensors for suitability of use with Vernier Sensor Systems

Filed under: Daily — profmason @ 6:10 am

The Vernier Labpro system is a microprocessor-based data acquisition system with four analog channels and two digital channels.  It uses a serial protocol to communicate with the host PC, which is well documented in the Vernier Labpro technical manual. The data acquisition system is robust and suitable for classroom use, especially with either their well-thought-out Logger Pro software or with their LabVIEW VIs.

The Labpro system has a wide variety of sensors that are robust and, again, suitable for student use.  There are two sensors that consistently require replacement in student labs: the motion detector and the force sensor.

Vernier uses British Telecom connectors called BTD and BTA depending on the orientation of the plug.  I cut a cable and found the following:

Vernier uses an Auto-ID system for the sensors that is based on reading the voltage on the Auto-ID pin (I assume using an ADC pin).  They publish some values in the Labpro technical manual.  Tying a 15 kΩ resistor between the Auto-ID pin and ground made the cable consistently ID as a motion detector.
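One way this read could work is with the ID resistor forming a divider against an internal pull-up; this is only a model, and the pull-up value and supply voltage below are placeholders, not figures from the Labpro manual:

```cpp
// HYPOTHETICAL model of the Auto-ID read: the ID resistor to ground forms a
// voltage divider with an internal pull-up to the supply rail. R_PULLUP and
// V_SUPPLY are placeholder values, not taken from the Labpro manual.
double autoIdVoltage(double rIdOhms,
                     double rPullupOhms = 10000.0,  // placeholder pull-up
                     double vSupply = 5.0) {        // placeholder supply
    return vSupply * rIdOhms / (rIdOhms + rPullupOhms);
}
```

Under these placeholder values, a 15 kΩ ID resistor would read 3.0 V; the actual voltage depends on the real pull-up.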

The motion detector is based on the Polaroid ultrasonic ranging element coupled with the TI6500 ranging chipset.  This is a good system and provides long range (classroom use suggests it is reliable between 20 cm and 5 m).  The image below shows how the sensor functions.

1.  The INIT line is pulled high and the sensor generates a pulse train.  There is an internal blanking interval that limits the minimum range to ~50 cm. (This can be switched off with a switch on the sensor if the shorter ~20 cm range is desired.)

2.  A timer is started in the Labpro.

3.  Once the TI6500 detects the return pulses, it sets the Echo line high.  Once the Echo line goes high, the timer in the Labpro stops.  The distance is calculated as the speed of sound × the return time / 2.
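That timing relationship can be sketched as a small helper (plain C++, assuming 343 m/s for sound in room-temperature air):

```cpp
// Convert a measured return time to a distance. The sound travels out and
// back, so the one-way distance is speed_of_sound * time / 2.
double returnTimeToMeters(double returnTimeSeconds) {
    const double speedOfSound = 343.0;  // m/s in air at roughly 20 C
    return speedOfSound * returnTimeSeconds / 2.0;
}
```

For example, a target 1 m away returns the echo after about 5.8 ms.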

The sampling speed is thus limited by the distance of the object.  If objects are closer, the sensor could conceivably sample faster.  In practice the sample rate is limited to about 30 Hz, as the transducer resonates and some damping time is needed.
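The time-of-flight bound on the sample rate is easy to work out (a sketch; it ignores the ring-down time, which lowers the practical rate further):

```cpp
// Upper bound on sample rate set by time-of-flight alone: you cannot ping
// again until the previous echo has returned from the farthest target.
// Transducer ring-down (ignored here) lowers the practical rate further.
double maxSampleRateHz(double rangeMeters) {
    const double speedOfSound = 343.0;  // m/s in air
    double roundTripSeconds = 2.0 * rangeMeters / speedOfSound;
    return 1.0 / roundTripSeconds;
}
```

At the 5 m maximum range this gives roughly 34 Hz, consistent with the ~30 Hz seen in practice once damping time is added.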

Unfortunately the Polaroid element is large and fragile.  The most common failure mode is for students to drop something onto the sensing element or run a cart or other object into it.  As long as the membrane is not damaged, the sensor can be repaired by carefully bending the metal mesh back into position.   As we run 1000 students a semester through physics there are inevitable accidents which are beyond repair.  As the Sensor (now price reduced to $80) is expensive it would be ideal to have a less expensive option.

The HC-SR04 is an extremely inexpensive ultrasonic module made in china and based on the design of SRF-04 back in 2000.   It contains a small micro, a MAX232 to drive the Transmit Transducer at 16V, a quad LM324 for amplification of the return signal and a couple of transistors for shaping.

The HC-SR04 uses a different pulse scheme than the TI6500, as shown below:

This means that you have to time the echo pulse width instead of running a timer until the echo line goes high.  This has a number of advantages in terms of scalability to multiple sensors, and it allows you to use a single line per sensor by tying the Trigger and Echo lines together. Unfortunately it is significantly different from the TI6500 scheme.

Here is the schematic of the HC-SR04:

The TL074 op-amp shown in the schematic was replaced with an LM324 on the part I was using.

Connecting the HC-SR04 to the Init and Echo pins gave the following:

It is difficult to see, but the HC-SR04 is timing out and then sending a small pulse (as can be seen on the Echo line).  The width of this pulse does correspond to the distance.  However, the Labpro doesn’t know what to make of this.

Next, another line was connected to the output of the amplifier so that we could see the returning pulses.  It was confirmed that the start of these pulses corresponded to the return time for the pulse, measured from the end of the init pulse.

At this point, perhaps the next step is to try a hex inverter to flip the init signal and do some pulse shaping on the return signal.

Fortunately, we have had better luck with the Force sensors, but that will be another article.

December 19, 2013

3pi with easy integrated encoders

Filed under: Daily — profmason @ 4:31 am

After some searching I couldn’t find anyone who had attached the new micro gear motor encoders to the 3pi platform. At the Black Friday sale I picked up a little bit of everything, and I spent an hour putting it together this morning. Doing the following is an intermediate soldering job. The good news is that everything fits.

1. Remove the stock 30:1 3pi motors. (Unscrew the plastic motor cover, and then desolder the two motor pins from the 0.1 inch pins.)
2. Replace them with the 30:1 dual-shaft motors.
3. Insert the encoder electronics module.
4. Solder at least 4 wires to the module. I used wires terminated in standard 0.1 inch pins, but you can also use 2 mm cable, as the pin spacing is 2 mm instead of 2.54 mm. This is a bit fiddly after two cups of coffee. (Hey Pololu, dump the M1 and M2 pins and give us a 2.54 mm connector for fat American hands. I really wanted to solder a 2.54 mm male header here. Yes, I have 2 mm headers from working on so many Korean projects, but who else does?)

5. Hook up power and ground and check the output on a scope.

It is going to be a bit tricky: it would be nice to have one external interrupt per encoder, but one of the interrupt pins is tied to a motor control pin, so I will steal IO off the LCD to make this work. I will probably end up using the pin-change interrupts.
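Whichever interrupt scheme ends up being used, the decode step on each pin-change event is the same. Here is a sketch of one table-driven quadrature decode step (plain C++, not Pololu's library code):

```cpp
#include <cstdint>

// One step of quadrature decoding: given the previous and current 2-bit
// (A,B) encoder states, return -1, 0, or +1 counts. Table-driven; invalid
// transitions (both bits changed at once) count as 0 here.
int quadratureStep(uint8_t prevAB, uint8_t currAB) {
    // Index = prev*4 + curr; the valid Gray-code sequence is 00-01-11-10.
    static const int8_t table[16] = {
         0, +1, -1,  0,
        -1,  0,  0, +1,
        +1,  0,  0, -1,
         0, -1, +1,  0 };
    return table[((prevAB & 3) << 2) | (currAB & 3)];
}
```

In a pin-change ISR you would read the two channel bits, call this with the previous state, and accumulate the result into a signed tick count.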

November 9, 2013

WIN USA Mini Urban Challenge

Filed under: Daily — profmason @ 11:13 pm

WIN USA in collaboration with Mt. San Antonio College presents:

MINI URBAN CHALLENGE and Robot Showcase

December 7th from Noon to 3PM.

Robots from all over the East San Gabriel Valley will descend on Mt. San Antonio College for the first annual WIN USA Mini Urban Challenge. Participants will compete with their autonomous robots to drive through a model city, completing missions that challenge both novice and expert roboticists.

  • Mini Urban Challenge Competition
  • Sumo Robot Competition
  • Line Race Competition
  • Tour of the Mt. SAC exploratorium
  • Exposition of Mt. SAC robotics team VEX robots
  • Demonstration of Mt. SAC Intelligent Ground Vehicle
  • Open Source WIN robot demonstration
  • Open Source WIN robot will be available for sale

Rules are available at

Schedule for the Event:

  • 12:00 Check In
  • 12:15 Opening Ceremony
  • 12:30 Line Race
  • 12:45 Sumo Robots
  • 1:15 Break / Lunch
  • 1:45 Robot Exposition
    • WIN Robot
    • VEX Robots
    • IGVC Robots
  • 2:15 Mini Urban Challenge
  • 2:50 Closing Ceremony

September 11, 2013

Mt. SAC Roboexpo V

Filed under: Daily — profmason @ 6:15 pm

Welcome to the Mt. SAC Roboexpo V.

Mt. San Antonio College is proud to host a State Final Qualifying VEX Robotics Competition (VRC) event for the fifth time.  Five state qualifying spots will be available to top finishers.

There will be two arenas running competitions, in addition to a practice arena outside the pit area and a VEX College arena also available for practice.   Skills and Programming Challenges will run in the competition fields midday, after the qualifying matches and before the semi-final matches.

The event will start promptly at 9 AM.  Doors open at 8 AM; please arrive by 8:30 AM for registration.   The event organizers will work hard to have the event over by 5 PM.  Be sure to register for the Skills and Programming Challenges when you arrive at the event so that the competitors can queue in a timely fashion. Free parking will be available in nearby lots.

The world champion Robomagellan team will be on hand to demonstrate their autonomous outdoor navigation platform and talk about their preparations for the Army / AUVSI Intelligent Ground Vehicle Challenge.  The gold-medal-winning humanoid robotics team will be present to talk about the challenges of humanoid robots.    The developer of the RobotC-compatible VEXduino will be present and giving out free VEXduino PCBs, which allow your VEX sensors and motors to be connected to the Arduino platform. There will also be representatives from local aerospace companies to talk about opportunities in robotics.

Participants will be asked to sign a waiver as there will be a feature article in Robot Magazine on Roboexpo V.  Please be ready to promote your school and team to be included in the article.

Map to Venue
