Path Planning

The robot used the A* algorithm to plan paths to the required nodal positions on the course. A* builds an optimal path from the robot's current position to the next requested waypoint by searching the available nodes on a map of the course. The algorithm returns the sequence of optimal nodal positions, and the built-in xy_controller function is used to move between them.
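
A minimal C sketch of this plan-and-follow loop is shown below. The astar_plan() helper and the exact xy_controller signature are assumptions for illustration, not the project's actual code.

typedef struct { float x; float y; } node_t;

#define MAX_PATH_NODES 64

/* Hypothetical A* planner: fills path[] with the optimal node sequence from
   the robot's position to the goal and returns the number of nodes found. */
extern int astar_plan(node_t start, node_t goal, node_t path[], int max_nodes);

/* Built-in controller named in the report (signature assumed here): drives
   the robot toward (x, y) and returns nonzero once that node is reached. */
extern int xy_controller(float x, float y);

void go_to_waypoint(node_t robot_pos, node_t waypoint)
{
    node_t path[MAX_PATH_NODES];
    int n = astar_plan(robot_pos, waypoint, path, MAX_PATH_NODES);

    /* Step through the returned optimal nodal positions one at a time. */
    for (int i = 0; i < n; i++) {
        while (!xy_controller(path[i].x, path[i].y)) {
            /* keep commanding the controller until the node is reached */
        }
    }
}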

Obstacle Avoidance

The robot used the LADAR sensor to detect nearby obstacles. The LADAR provides 240 distance readings; the algorithm samples 24 of these in a for loop, and a second loop checks whether each sampled reading falls below a distance threshold. A reading is marked as an obstacle only if the past five corresponding LADAR readings were all below the threshold, which filters out spurious detections. To speed up the search, the code loops only through the possible obstacle locations on the half of the course (top or bottom) that the robot currently occupies, so only half the field is searched each time. Finally, the map is updated if the detected object's coordinates match one of the initialized possible obstacle locations. After every map update, A* is run again so that new obstacles are avoided as the robot continues through the course.
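
The sketch below illustrates this detection logic in C. The threshold value, reading indexing, and helper functions (ladar_to_course, match_candidate_obstacle, update_map, rerun_astar) are assumed names for illustration, not the team's exact code.

#define NUM_LADAR     240
#define NUM_USED       24
#define STEP          (NUM_LADAR / NUM_USED)
#define OBST_THRESH   0.5f   /* assumed distance threshold */
#define HITS_NEEDED   5      /* consecutive low readings before marking */

extern float ladar_dist[NUM_LADAR];                        /* latest scan */
extern void  ladar_to_course(int idx, float d, float *ox, float *oy);
extern int   match_candidate_obstacle(float ox, float oy, int top_half);
extern void  update_map(float ox, float oy);
extern void  rerun_astar(void);

static int low_count[NUM_USED];   /* consecutive below-threshold tallies */

void check_for_obstacles(float robot_y, float field_mid_y)
{
    for (int i = 0; i < NUM_USED; i++) {
        float d = ladar_dist[i * STEP];

        /* Require five consecutive low readings to reject noise spikes. */
        if (d < OBST_THRESH)
            low_count[i]++;
        else
            low_count[i] = 0;

        if (low_count[i] >= HITS_NEEDED) {
            float ox, oy;
            ladar_to_course(i * STEP, d, &ox, &oy);   /* to course coordinates */

            /* Only compare against candidate locations on the half of the
               field the robot currently occupies. */
            int top_half = (robot_y > field_mid_y);
            if (match_candidate_obstacle(ox, oy, top_half)) {
                update_map(ox, oy);   /* mark the matching node as blocked */
                rerun_astar();        /* replan around the new obstacle    */
            }
            low_count[i] = 0;
        }
    }
}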


Color Following

The robot has an RGB camera on the front that was used to locate and collect the colored weeds. A provided color vision file converted the camera's RGB values to HSV values. A MATLAB program was used to determine a range of HSV values for the pink and blue weeds, and these threshold values were entered into the color vision file, which scanned the bottom half of the image for pink or blue pixels. If more than 40 blue or pink pixels were detected, the robot would exit the A* code and align the center of the camera with the centroid of the weed. The distance between the robot and the weed's centroid was calculated and compared against previously found weeds. If the new weed was not within 1.5 tiles of an existing weed, the robot drove forward for 1.5 seconds after the weed left the camera's view so that the weed was fully covered. The robot's position was then stored as the current weed's position for future comparison, and the program returned to A*.
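
The following C sketch illustrates the thresholding and centroid step for one color. The image size, HSV ranges, and function name are placeholders; the actual ranges came from the MATLAB tuning and the provided color vision file.

#define IMG_W 160
#define IMG_H 120
#define MIN_PIXELS 40   /* from the report: more than 40 pixels triggers tracking */

typedef struct { unsigned char h, s, v; } hsv_t;

/* Returns 1 and writes the weed centroid column if enough pink pixels are
   found in the bottom half of the image. */
int find_pink_weed(const hsv_t img[IMG_H][IMG_W], float *centroid_col)
{
    /* Placeholder pink range (scale-dependent); the tuned MATLAB values
       would replace these. */
    const unsigned char H_MIN = 150, H_MAX = 180;
    const unsigned char S_MIN = 80,  V_MIN = 60;

    long count = 0, col_sum = 0;

    /* Only the bottom half of the image is scanned, as in the report. */
    for (int r = IMG_H / 2; r < IMG_H; r++)
        for (int c = 0; c < IMG_W; c++) {
            hsv_t p = img[r][c];
            if (p.h >= H_MIN && p.h <= H_MAX && p.s >= S_MIN && p.v >= V_MIN) {
                count++;
                col_sum += c;
            }
        }

    if (count > MIN_PIXELS) {
        *centroid_col = (float)col_sum / (float)count;  /* steer camera center here */
        return 1;
    }
    return 0;
}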

LabVIEW

The LabVIEW program is an enhanced version of the Lab 5 LabVIEW program. It displays the contest area, including the 5 target points, the weed-count display zones, the found obstacles and weeds, and the robot's position. The program receives 14 variables from the DSP: the robot's X and Y position, the robot's state, a variable used only for testing, a found obstacle wall's coordinates (the X and Y positions of its two endpoints) and tally index, and a found weed's X and Y position, tally index, and color. These values are all sent every 6 ms. The tally index values for both walls and weeds also serve as flags: if -1 is sent, there are no new weeds/walls; otherwise an incremented integer index is sent, and the corresponding wall/weed data are stored in an array under that index. Every coordinate sent to LabVIEW is transformed via a subVI, since the LabVIEW coordinate system differs from the one used on the DSP.
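
The C sketch below illustrates the tally-index convention on the DSP side of this link; the variable and function names are assumptions for illustration, not the project's actual code.

static float new_wall_tally = -1.0f;   /* -1 = no new wall this packet */
static float new_weed_tally = -1.0f;   /* -1 = no new weed this packet */
static int   wall_count = 0;
static int   weed_count = 0;

void report_new_wall(void)
{
    new_wall_tally = (float)wall_count++;   /* LabVIEW stores wall data at this index */
}

void report_new_weed(void)
{
    new_weed_tally = (float)weed_count++;   /* LabVIEW stores weed data at this index */
}

void clear_flags_after_send(void)
{
    /* After each 6 ms transmission, reset the flags so LabVIEW adds each
       wall/weed to its arrays only once. */
    new_wall_tally = -1.0f;
    new_weed_tally = -1.0f;
}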

Video

https://youtu.be/XTLjtz_89ts

Individual Photo

Ryan Wu, Samantha Rivera, Amir Tajbakhsh, Andrzej AJ Szopa