Group 5 (Left to Right): Tom Jozefowicz, Jeff Nie, Yuchen Xu, Morgan Aavang



Video of Run:


A* Strategy:


Starting from the homework A* code, several features were added to our robot's path-planning strategy. First, instead of checking only the 4 edge-adjacent neighbor nodes, all 8 neighbors, including the diagonal ones, are checked. Second, while the homework code adds a cost of 1 for every step, our code adds one of 6 different costs per step depending on the type of step. A diagonal step is assigned a higher cost than a straight step, and a step that requires a turn is assigned a higher cost than a step that requires no turn or a gentler turn, because the robot is slower while turning than while driving in a straight line. These strategies help our robot find a path that is not only short in true distance but also requires the least turning between two points.
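The weighted-step idea above can be sketched as follows. This is a minimal illustration rather than our actual code, and the cost constants (straight, diagonal, and the two turn penalties) are placeholder values, not the six values we tuned:

```python
import heapq
import itertools
import math

# Illustrative costs; the report's actual six step costs are not listed here.
STRAIGHT, DIAGONAL = 1.0, 1.4        # base cost of a straight vs. diagonal step
TURN_GENTLE, TURN_SHARP = 0.3, 0.6   # extra cost for a 45-degree vs. sharper turn

NEIGHBORS = [(-1, 0), (1, 0), (0, -1), (0, 1),
             (-1, -1), (-1, 1), (1, -1), (1, 1)]

def turn_penalty(prev_dir, new_dir):
    """Extra cost when a step changes heading; sharper turns cost more."""
    if prev_dir is None or prev_dir == new_dir:
        return 0.0
    diff = abs(math.atan2(new_dir[1], new_dir[0]) -
               math.atan2(prev_dir[1], prev_dir[0]))
    diff = min(diff, 2 * math.pi - diff)
    return TURN_GENTLE if diff <= math.pi / 4 + 1e-9 else TURN_SHARP

def astar(grid, start, goal):
    """8-connected A* on a 2-D list of 0 (free) / 1 (obstacle) cells.
    For brevity the search state is the cell alone, so turn penalties are
    handled approximately rather than exactly."""
    def h(p):  # octile-distance heuristic, consistent with the step costs
        dx, dy = abs(p[0] - goal[0]), abs(p[1] - goal[1])
        return STRAIGHT * max(dx, dy) + (DIAGONAL - STRAIGHT) * min(dx, dy)

    rows, cols = len(grid), len(grid[0])
    tie = itertools.count()  # tie-breaker so the heap never compares nodes
    frontier = [(h(start), next(tie), 0.0, start, None, None)]
    parent = {}
    while frontier:
        _, _, g, node, prev, d = heapq.heappop(frontier)
        if node in parent:
            continue                  # already expanded at lower cost
        parent[node] = prev
        if node == goal:              # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for dd in NEIGHBORS:
            nxt = (node[0] + dd[0], node[1] + dd[1])
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and not grid[nxt[0]][nxt[1]] and nxt not in parent):
                step = DIAGONAL if dd[0] and dd[1] else STRAIGHT
                ng = g + step + turn_penalty(d, dd)
                heapq.heappush(frontier,
                               (ng + h(nxt), next(tie), ng, nxt, node, dd))
    return None                       # no path exists
```

Because diagonal steps and turns carry extra cost, the search naturally prefers long straight runs over zig-zag routes of the same cell count.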

Obstacle Scanning Strategy:


All possible obstacles are recorded in a structure in our code. The structure records the coordinates of the center of the obstacle, the position of the obstacle in the A* grid map, how many times it has been scanned, whether it is recorded as an obstacle, and whether it has been plotted in the LabVIEW program. 12 LADAR distances at different angles are acquired, and the distance from each LADAR scan point to each obstacle is calculated. If the distance is smaller than 0.2 tiles, the obstacle is "found" once, and if the same obstacle is found three times, it is concluded that the obstacle exists. Once an obstacle is confirmed, its coordinates are sent to LabVIEW and plotted on the map.
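A minimal sketch of this bookkeeping, assuming obstacle centers and LADAR points are measured in tiles; the field names and types are illustrative, mirroring the structure described above rather than reproducing it:

```python
import math
from dataclasses import dataclass

HIT_RADIUS = 0.2    # tiles: a LADAR point this close to a center is a hit
CONFIRM_COUNT = 3   # times an obstacle must be found before it is declared real

@dataclass
class Obstacle:
    x: float                  # center x-coordinate (tiles)
    y: float                  # center y-coordinate (tiles)
    grid_cell: tuple          # position of the obstacle in the A* grid map
    times_found: int = 0      # how many scans have "found" it
    confirmed: bool = False   # decided that the obstacle exists
    plotted: bool = False     # already drawn on the LabVIEW map

def update_obstacles(obstacles, ladar_points):
    """Process one LADAR scan (a list of 12 (x, y) points) and return any
    obstacles that just became confirmed, ready to send to LabVIEW."""
    newly_confirmed = []
    for ob in obstacles:
        hit = any(math.hypot(px - ob.x, py - ob.y) < HIT_RADIUS
                  for px, py in ladar_points)
        if hit:
            ob.times_found += 1   # "found once" this scan
        if ob.times_found >= CONFIRM_COUNT and not ob.confirmed:
            ob.confirmed = True
            newly_confirmed.append(ob)
    return newly_confirmed
```

Counting at most one hit per scan keeps a single noisy scan from confirming an obstacle by itself.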




Golf Ball Detection Strategy:
Following the same strategy applied in Lab 8, the HSV center and radius values determined for the orange and blue golf balls were used for object recognition and centroid locating. The HSV values were determined with a provided MATLAB program called Colorthreshold, using pictures of the golf balls taken in various locations with the robot's camera. The top part of the camera image was excluded from examination in the code so that brightly colored orange or blue shirts would not be detected as golf balls. Additionally, every 40 ms the vision code switched between looking for blue and for orange golf balls. If the robot saw an orange and a blue golf ball at the same time, the tile-approximation function developed in a previous lab determined which golf ball was closer, telling the robot to collect that one. Once a ball was detected, the robot used the x-coordinate of its centroid and an appropriate proportional gain to turn until the golf ball was straight ahead.
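The thresholding and proportional-steering step might look like the following sketch. The image size, crop height, gain, and the box-style per-channel HSV threshold are assumptions for illustration, not the values from our vision code:

```python
KP = 0.004            # hypothetical proportional gain, not our tuned value
IMG_W, IMG_H = 160, 120
TOP_CROP = 30         # top rows ignored so bright shirts are not detected

def ball_centroid_x(hsv_img, center, radius):
    """Mean column index of pixels whose (h, s, v) values fall within a
    per-channel box of half-widths `radius` around `center`. Rows above
    TOP_CROP are skipped; returns None when no pixel matches."""
    total = count = 0
    for row in hsv_img[TOP_CROP:]:
        for col, (h, s, v) in enumerate(row):
            if (abs(h - center[0]) <= radius[0]
                    and abs(s - center[1]) <= radius[1]
                    and abs(v - center[2]) <= radius[2]):
                total += col
                count += 1
    return total / count if count else None

def steering_command(cx):
    """Proportional turn command from the centroid's horizontal error:
    zero when the ball is centered, larger the further off-center it is."""
    return KP * (IMG_W / 2 - cx)
```

A blob right of the image center yields a negative command (turn one way), a blob left of center a positive one; the sign convention depends on the motor wiring.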


Golf Ball Collecting:



Figure 1. Fully assembled golf ball collector


To collect and segregate the different colored golf balls, we CADed and 3D printed a collector actuated by two RC servos (Figure 1). One servo controlled a front gate that opened when the robot detected and approached a golf ball. The other servo controlled a divider that moved between two positions depending on the color of the detected golf ball. Upon detecting a golf ball, the robot would pause its path planning, adjust the two servos' positions, and speed toward the ball. After collecting the ball, the robot sent its position and color to LabVIEW, which displayed the robot's path, the detected obstacles, and the collected golf balls' positions on an image. Figure 2 shows an example image after a completed run.
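The two-servo logic reduces to a small setpoint function; a sketch follows, in which the angles and the function name are hypothetical, since the real setpoints depend on the printed linkage:

```python
# Hypothetical servo angles in degrees; real values depend on the mechanism.
GATE_CLOSED, GATE_OPEN = 0.0, 80.0
DIVIDER_ORANGE, DIVIDER_BLUE = 40.0, 140.0

def collector_setpoints(ball_color, approaching):
    """Return (gate_angle, divider_angle) for the two RC servos.
    The gate opens only while approaching a detected ball; the divider
    swings toward the bin matching the ball's color."""
    gate = GATE_OPEN if approaching else GATE_CLOSED
    divider = DIVIDER_ORANGE if ball_color == "orange" else DIVIDER_BLUE
    return gate, divider
```

Keeping the gate closed except during an approach prevents already-collected balls from escaping while the robot turns.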


Figure 2: An example image in LabVIEW after a test run. Five golf balls, either orange or blue, were placed inside the course, outlined in purple. These golf balls' positions are indicated by the blue or orange circles inside the course. The green lines outline detected obstacles, and the light pink circles show the robot's path.