CSCE 496/896: Robotics
Lab 4/Final Challenge: Ball Detecting and Retrieval


Instructor: Carrick Detweiler
carrick _at_ cse.unl.edu
University of Nebraska-Lincoln
Spring 2011

Started: November 4, 2011
Checkpoint: November 18, 2011
Due Date: December 9, 2011

1  Overview

In this lab you will implement some of the final pieces that you will need for the final challenge. You will build your ball grasping mechanism and will use and modify ball detecting code. You only need to turn in a single report at the end of the final challenge. However, there is a checkpoint due on November 18th before lab. At this checkpoint you need to: 1) demonstrate your arm picking up a ball autonomously (e.g. rotating, detecting a ball, and moving to pick it up), and 2) turn in a short (2-4 page) proposal that describes the approach you plan to take for the final challenge and any changes you have made or plan to make to your grasping device. You should also include a preliminary analysis of how well the arm performs. Finally, you should create a timeline for completing the remaining tasks and discuss the distribution of work.

2  Final Challenge Overview

The competition will take place on the last day of lab, Friday, December 9th. The final project challenge will entail navigating around an environment, picking up balls, and dropping them off at particular locations. Different balls and dropoff locations will have different point values. You will be given the locations of most of the balls before the competition starts, and the environment will be augmented with some visual landmarks (from Lab 3) at known locations. Once you start your robot you cannot direct or control it; however, you will have two "resets" where you can manually restart your robot. Each use of a reset will result in a two-point reduction off your score (not your final grade). If you manually intervene, you can change the rotation of your robot and can change its position by up to 0.5 meters, but no more.
The final challenge will most likely require many of the subcomponents that you previously implemented in this course, including: reactive control for obstacle avoidance, basic vector navigation, visual localization based on landmark detection, vision-based ball detection, and a gripper/mechanism for picking up and carrying balls. The approach you take, however, is completely up to you. We will discuss additional details of the final challenge in lab and in class.

3  Ball Detection

On the course website there is code to download that can detect balls (called ballDetector). Download this code onto your netbook and gumstix and compile it. Note that on the gumstix this will take 3-5 minutes to complete. In the launch directory, there are two launch files (similar to the landmark detection code). The first, ballDetector.launch, launches the camera and ball detector code. The second, displayDebugImages.launch, will display three different images you can use for debugging. Note that you should operate this system similarly to the landmark detection code. You should always run the debug display on your netbook (only use it when you need it as it will slow down the processing) and can run the ball detector code on the netbook for testing and debugging, but you should ultimately run it on the gumstix.
The ball detector works by first converting the image to HSV color space. It then performs thresholding to keep only the pixels that fall between a low and a high HSV threshold. These values can be configured in the launch file (see ballDetector.launch for an example) and can be changed dynamically by running the command rosrun ballDetector configGUI.py. It then searches for the largest group of connected pixels that have similar height and width (a crude approximation for finding circular groups of pixels). The largest group that meets these criteria is selected as the "ball" in the image, and a message is published with the ball's location.
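The two filtering steps above can be sketched in plain Python. This is only an illustration of the logic, not code from the ballDetector source; the function names, the tolerance, and the example threshold values are all made up for the sketch.

```python
def in_hsv_range(pixel, low, high):
    """Return True if an (h, s, v) pixel lies between the low and high
    thresholds, inclusive, on every channel."""
    return all(lo <= p <= hi for p, lo, hi in zip(pixel, low, high))

def roughly_square(width, height, tol=0.3):
    """Crude circularity check: the bounding box of a ball-like group of
    pixels should have similar width and height. The tolerance is an
    illustrative guess."""
    return abs(width - height) <= tol * max(width, height)

# A threshold window that might match an orange ball (values invented).
LOW = (5, 100, 100)
HIGH = (25, 255, 255)

print(in_hsv_range((15, 180, 200), LOW, HIGH))  # True: inside the window
print(in_hsv_range((90, 180, 200), LOW, HIGH))  # False: hue far outside
print(roughly_square(40, 48))                   # True: near-square blob
print(roughly_square(10, 30))                   # False: too elongated
```

The real detector applies the range test to every pixel of the camera image and the shape test to each connected group that survives thresholding.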
The debug images display the HSV image, thresholded image (white pixels indicate those that are between the low and high thresholds), and a marked up image with white pixels indicating the boundaries of connected groups. In addition, in this image the final ball location is marked with a circle. Use these images and the configGUI to pick good HSV low and high thresholds.
Question: How did you go about picking the HSV thresholds for various balls? Do this for a couple of different color balls and report on the results. How does lighting affect the values?
Question: How many locations per second does the ball detector detect? Does it change based on different HSV thresholds?
For the final challenge, there may be more than one ball color that you will need to detect.
Question: How can you change the thresholds automatically? Note, you don't need to modify the ball detector code to do this.
For the final challenge, you may want to limit the ball detector code to only look for balls in a certain part of the frame (for instance nearby on the floor).
Question: Make sure to report on any modifications you made to the ball detector code for the final challenge. Also report on how you used it in general.

4  Vision Arbitrator

For the final challenge you will most likely need to look for landmarks and balls. Create a new launch file on the gumstix that will launch both (without launching the actual camera node twice).
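A combined launch file might be structured like the sketch below. The package, node, and file names here are placeholders, not taken from the actual course code; copy the real entries from ballDetector.launch and your landmark detection launch file, keeping only one camera node.

```xml
<launch>
  <!-- Start the camera once; both detectors subscribe to its image topic. -->
  <node pkg="usb_cam" type="usb_cam_node" name="camera" />

  <!-- Placeholder detector entries: replace with the node lines from the
       real ballDetector and landmark detection launch files. -->
  <node pkg="ballDetector" type="ballDetector" name="ballDetector" />
  <node pkg="landmarkDetector" type="landmarkDetector" name="landmarkDetector" />
</launch>
```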
Question: With both running, what detection rates do you achieve on the gumstix for landmark and balls? What about on the netbook?
Most of the time you will not need to detect landmarks at the same time as balls. In this section you should create a system that lets you switch between landmark detection, ball detection, or both. Do this by creating a vision arbitrator that only passes images on to these nodes when they are needed.
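One way to structure the arbitrator's routing decision is sketched below in plain Python. This models only the switching logic; in ROS the class would wrap an image subscriber and republish each frame on the selected topics, with the mode set via a topic or service. The mode names and topic names are assumptions for the sketch.

```python
class VisionArbitrator:
    """Decides which detector(s) an incoming camera frame is forwarded to.
    Modes: 'ball', 'landmark', 'both', or 'off'."""

    ROUTES = {
        "ball": ["/arbitrated/ball_image"],
        "landmark": ["/arbitrated/landmark_image"],
        "both": ["/arbitrated/ball_image", "/arbitrated/landmark_image"],
        "off": [],
    }

    def __init__(self, mode="off"):
        self.set_mode(mode)

    def set_mode(self, mode):
        if mode not in self.ROUTES:
            raise ValueError("unknown mode: %s" % mode)
        self.mode = mode

    def targets(self):
        """Topics the current frame should be republished on."""
        return self.ROUTES[self.mode]

arb = VisionArbitrator()
arb.set_mode("ball")
print(arb.targets())  # ['/arbitrated/ball_image']
```

With this structure, the detectors' subscribed topics are simply remapped to the arbitrated topics, so the detector code itself does not need to change.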
Question: Describe the structure and how you implemented your vision arbitrator.

5  Gripper Construction and Characterization

In the previous lab you proposed the design of a gripper/mechanism to pick up and drop off balls. On the course website there is a hoverboard download that contains an updated hoverboard node enabling control of servos through PWM channels. Each of the 6 servo channels is controlled by specifying a 0 to 100 control output. This does not map directly to angles on the servo, so you may want to create a new node that does this conversion to enable easier control. If you pass a value less than zero, the servo output will be disabled. I strongly recommend doing this whenever your "arm" is "at rest" to save energy and reduce the chance of damage to your mechanism and hovercraft.
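The core of such a conversion node might look like this sketch. The 0-180 degree range and the linear mapping are assumptions; calibrate the endpoints against your actual servos, since the true angle-to-PWM relationship depends on the servo and linkage.

```python
def angle_to_command(angle_deg, min_deg=0.0, max_deg=180.0):
    """Map a servo angle in degrees onto the hoverboard node's 0 to 100
    control range, assuming a linear relationship. Returns -1 (which
    disables the servo output) for out-of-range requests."""
    if not (min_deg <= angle_deg <= max_deg):
        return -1  # negative values disable the servo output
    span = max_deg - min_deg
    return round(100.0 * (angle_deg - min_deg) / span)

print(angle_to_command(90))   # mid-travel -> 50
print(angle_to_command(200))  # out of range -> -1 (output disabled)
```

A node built around this function would subscribe to angle commands and publish the converted 0 to 100 values to the hoverboard node, sending a negative value whenever the arm should rest.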
In this lab and for the checkpoint, you should build and characterize the details of your gripper.
Question: Describe the construction process and any problems you encountered and how you overcame them.
Question: You should characterize the performance of your mechanism, examining and reporting on whichever properties are applicable to your design.

6  Challenge Day

On the day of the challenge (Friday, December 9th) you will compete against the other groups in the ball collection challenge. In addition to the competition, you should prepare a short 10 minute presentation that describes your hovercraft system and the approach you are taking to the challenge. You will present this immediately before the run of your hovercraft in the challenge environment.
On the challenge day you will receive the course "map" with ball and landmark locations by 11am. You will have until the start of the challenge to parse the map and update your system accordingly. However, at 11am you will also have to turn in all batteries for the hovercraft (for charging). You can still make minor changes after that, although this is not recommended. At the start of lab at 2:30, you will no longer be able to work on your system.

7  To Hand In

You should write up your final project as you have with all the previous labs. However, in addition to answering the specific questions above, you should also include a detailed description of your approach to the final challenge. Also include relevant tests to characterize the performance of the components of your system and the system as a whole.
You should designate one person from your group as the point person for this lab (each person needs to do this at least once over the semester). This person is responsible for organizing and handing in the report, but everyone must contribute to writing the text. You should list all group members and indicate who was the point person on this lab. Your lab should be submitted by email before the start of class on the due date. A PDF-formatted document is preferred.
Your lab report should have an introduction and conclusion and address the various questions (highlighted as Question: ) throughout the lab in detail. It should be well written and have a logical flow. Including pictures, charts, and graphs may be useful in explaining the results. There is no set page limit, but you should make sure to answer questions in detail and explain how you arrived at your decisions. You are also welcome to add additional insights and material to the lab beyond answering the required questions. The clarity, organization, grammar, and completeness of the report are worth 10 points of your lab report grade.
Question: Please include your code with the lab report. Note that you will receive deductions if your code is not reasonably well commented. You should comment the code as you write it; do not leave writing comments until the end.
Question: Include an rxgraph of your final system and comment on your overall system architecture.
Question: How many hours did each person in your group spend on this part and on the lab in total? Did you divide the work? If so, how? Or did you work on everything together?
Question: Please discuss and highlight any areas of this lab that you found unclear or difficult.


