The Identifying the Targets section explains a theoretical approach to locating the Vision Targets on the 2014 FRC field. This document covers the details of the C++ and Java examples which implement that approach. Note that in addition to the typical differences between C++ and Java WPILib code, there are a few additional differences prompted by the way the NIVision functions are accessed from Java. Though the syntax may differ slightly, the general approaches are similar enough that this document walks through the C++ code, which should provide sufficient insight into the functionality for both C++ and Java teams.
Finding the Example: C++
For C++ teams, the example can be found by selecting File >> New >> Example. Then select VxWorks Downloadable Kernel Module Sample Project and click Next. Select FRC 2014 Vision Sample Program and click Finish to open the sample.
Finding the Example: Java
For Java teams, the example can be found by selecting File >> New Project. Then expand the Samples folder, select FRC Java, click on the 2014VisionSampleProject, and click Next. Enter a name and location for the project, then click Finish.
Before examining the code, it is worth noting that the program samples are written using the Simple Robot framework and do not contain any other code in autonomous. The code will execute continuously, as quickly as possible, resulting in 100% usage of the cRIO CPU. When integrating this code into a different robot framework or integrating other team code, teams should make sure to add waits as appropriate and may wish to reduce the rate at which the code processes (for example, processing every X loops in the Iterative Robot framework, or every X milliseconds in the Command framework).
Also note that all of the methods are contained within a single class and file in order to make the example easier to read and understand. When adding to the example or integrating it with team code, teams may wish to break the scoring methods and structures out into separate classes and files in order to better organize the code.
The sample code uses a number of constants that can be modified to tweak the behavior of the code. Many of these constants are defined at the top of the code, but teams should note that there are additional values contained inline that may also be tweaked such as the threshold values for the color threshold. Note that when changing camera resolutions, in addition to changing the resolution constant, it may also be necessary to change the Area Minimum constant to an appropriate value.
Scores and Target Report Structures
In order to store the scores for all of the individual tests for a particular particle together, a structure is used to contain all of the scores. A separate structure is used to contain information about targets.
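A minimal sketch of what these two structures might look like is shown below; the field names are illustrative assumptions, not necessarily the sample's exact members.

```cpp
// Sketch of a per-particle score container; one instance per particle.
struct Scores {
    double rectangularity;          // particle area vs. bounding box area
    double aspectRatioVertical;     // fit against the vertical target shape
    double aspectRatioHorizontal;   // fit against the horizontal target shape
};

// Sketch of a target report describing a vertical/horizontal pairing.
struct TargetReport {
    int verticalIndex;    // which particle is the vertical target
    int horizontalIndex;  // which particle is the paired horizontal target
    bool hot;             // was a valid Hot pairing found?
    double totalScore;    // combined score used to pick the best pairing
    double leftScore;     // individual pairing-test scores
    double rightScore;
    double tapeWidthScore;
    double verticalScore;
};
```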
The threshold values used for the color threshold and the criteria object used for filtering out small particles are defined here. The filter criteria runs from the specified minimum area to the maximum integer value in order to filter out all particles smaller than the minimum.
The first step of the processing is to perform the image operations: thresholding, and filtering. Code has been provided, but commented out, to write out each step of the image processing to the cRIO flash where it can be retrieved using FTP. To access and view the images, open up a Windows Explorer window and enter "FTP://10.XX.YY.2" in the navigation bar, where XXYY is a 4 digit FRC team number.
To execute properly, the code as written must have an image named testImage.jpg stored in the cRIO's root directory. To do this, open a Windows Explorer window and enter "FTP://10.XX.YY.2" in the navigation bar, where XXYY is a 4 digit FRC team number. You can then copy and rename an image from the sample images folder described below in the "Sample Images" section. It is highly recommended to take a new set of sample images using the actual robot, camera, and lighting when available.
Alternatively, commented code is also provided to read from the Axis camera.
Note: It is strongly recommended to comment out the while loop before enabling the image writes in order to preserve the cRIO flash memory. Writing the images within the while loop may result in excessive wear to the cRIO flash memory.
A report is generated for each particle and then the reports are ordered from largest particle to smallest. An array of scores is created based on the number of particles.
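The ordering step can be sketched as follows, using a stand-in ParticleReport type (the real code uses the NIVision particle analysis report):

```cpp
#include <algorithm>
#include <vector>

// Stand-in for the NIVision particle analysis report.
struct ParticleReport {
    double area;
};

// Sort reports so the largest particle comes first.
void sortByAreaDescending(std::vector<ParticleReport> &reports) {
    std::sort(reports.begin(), reports.end(),
              [](const ParticleReport &a, const ParticleReport &b) {
                  return a.area > b.area;
              });
}
```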
Each particle is scored according to the approach described in the Identifying the Targets section, then the scores are compared to the defined minimum scores for both horizontal and vertical targets. The determination on the particle (target or not) is printed to the console along with the center and scores of the particle for debugging purposes.
This section of code is one that teams are recommended to modify to suit their robot and approach to vision processing and debugging. Teams may wish to modify the information printed to the console, or replace the console code with SmartDashboard code for the same purpose.
Score Aspect Ratio
The scoring of the Aspect Ratio is broken out into its own method for clarity. This method compares the aspect ratio of the equivalent rectangle to the ideal aspect ratio for the target (the target type to use is determined by a parameter passed in).
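A sketch of this comparison is shown below. The ideal ratios are assumptions taken from the published 2014 tape dimensions (4 in x 32 in vertical, 23.5 in x 4 in horizontal), and the normalized ratio is mapped onto 0-100 using the piecewise function covered in the Ratio To Score section.

```cpp
#include <algorithm>
#include <cmath>

// Compare the equivalent rectangle's W/H ratio to the ideal ratio for the
// requested target type, then map the result onto a 0-100 score.
double scoreAspectRatio(double rectWidth, double rectHeight, bool vertical) {
    double idealRatio = vertical ? (4.0 / 32.0) : (23.5 / 4.0);
    // Normalize so that a perfect match produces a ratio of exactly 1.0.
    double ratio = (rectWidth / rectHeight) / idealRatio;
    // A ratio of 1 scores 100; ratios of 0 or 2 (and beyond) score 0.
    return std::max(0.0, std::min(100.0 * (1.0 - std::fabs(1.0 - ratio)), 100.0));
}
```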
Score Rectangularity
The scoring of the Rectangularity is broken out into its own method for clarity. This method compares the area of the particle to the area of the bounding box and returns a score between 0 and 100.
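This calculation can be sketched as follows; a perfectly rectangular particle fills its bounding box and scores 100.

```cpp
// Particle area as a fraction of bounding-box area, scaled to 0-100.
double scoreRectangularity(double particleArea, double boundingWidth,
                           double boundingHeight) {
    double boxArea = boundingWidth * boundingHeight;
    if (boxArea == 0.0) {
        return 0.0;  // guard against a degenerate bounding box
    }
    return 100.0 * particleArea / boxArea;
}
```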
Ratio To Score
Many of the score calculations utilize the same subcalculation to convert a ratio with an ideal value of 1 to a 0-100 score value using a piecewise linear function that goes from (0,0) to (1,100) to (2,0). This calculation was broken out into a method to allow the code to be re-used for multiple score calculations.
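The piecewise linear mapping described above can be written as a one-line helper:

```cpp
#include <algorithm>
#include <cmath>

// Map a ratio with an ideal value of 1 onto a 0-100 score using a piecewise
// linear function through (0,0), (1,100), and (2,0), clamped outside [0,2].
double ratioToScore(double ratio) {
    return std::max(0.0, std::min(100.0 * (1.0 - std::fabs(1.0 - ratio)), 100.0));
}
```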
Score Compare
The scoreCompare method checks if a particle is a target (meets all of the score minimums) of a specified type.
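A sketch of this check is shown below; the score fields and the 50-point minimum are illustrative assumptions, not necessarily the sample's exact values.

```cpp
const double SCORE_MIN = 50.0;  // assumed common minimum for illustration

struct Scores {
    double rectangularity;
    double aspectRatioVertical;
    double aspectRatioHorizontal;
};

// A particle is a target of the requested orientation only if every
// relevant score clears its minimum.
bool scoreCompare(const Scores &scores, bool vertical) {
    bool isTarget = scores.rectangularity > SCORE_MIN;
    if (vertical) {
        isTarget = isTarget && scores.aspectRatioVertical > SCORE_MIN;
    } else {
        isTarget = isTarget && scores.aspectRatioHorizontal > SCORE_MIN;
    }
    return isTarget;
}
```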
Finding Hot Targets
This section of the code iterates through each previously detected vertical target and calculates the scores for each detected horizontal target. After a pair is scored, the code checks if the total score for this pair is the highest detected so far; if so, information about the target is saved for later use. Finally, the code checks whether the target is a hot target.
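The pairing loop can be sketched as follows, with a stand-in scoring function in place of the sample's left/right/width/vertical pair scoring:

```cpp
#include <vector>

struct BestPair {
    int verticalIndex;
    int horizontalIndex;
    double totalScore;
};

// Stand-in for the real pair-scoring calculation.
double scorePair(double verticalScore, double horizontalScore) {
    return verticalScore + horizontalScore;
}

// Score every vertical/horizontal combination and remember the pair with
// the highest total score.
BestPair findBestPair(const std::vector<double> &verticalScores,
                      const std::vector<double> &horizontalScores) {
    BestPair best = {-1, -1, 0.0};
    for (int v = 0; v < static_cast<int>(verticalScores.size()); ++v) {
        for (int h = 0; h < static_cast<int>(horizontalScores.size()); ++h) {
            double total = scorePair(verticalScores[v], horizontalScores[h]);
            if (total > best.totalScore) {  // keep the highest-scoring pair
                best = {v, h, total};
            }
        }
    }
    return best;
}
```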
Hot or Not
The hotOrNot method compares the scores for a target to the specified minimums in order to determine if the target is a Hot Target.
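A sketch of this check is shown below; the limit constants and field names are illustrative assumptions, not necessarily the sample's exact values.

```cpp
const double TAPE_WIDTH_LIMIT = 0.5;
const double VERTICAL_SCORE_LIMIT = 50.0;
const double LR_SCORE_LIMIT = 50.0;

struct PairReport {
    double tapeWidthScore;
    double verticalScore;
    double leftScore;
    double rightScore;
};

bool hotOrNot(const PairReport &report) {
    bool isHot = report.tapeWidthScore >= TAPE_WIDTH_LIMIT;
    isHot = isHot && report.verticalScore >= VERTICAL_SCORE_LIMIT;
    // The horizontal target may sit to either side of the vertical target,
    // so only one of the left/right scores has to clear the limit.
    isHot = isHot && (report.leftScore > LR_SCORE_LIMIT ||
                      report.rightScore > LR_SCORE_LIMIT);
    return isHot;
}
```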
Print Target Info
This section of the code computes the distance to the best detected target and prints out whether the best target is Hot and the distance to it.
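A hedged sketch of the pinhole-camera distance estimate typically used for this step is shown below: given the target's height in pixels, the image height in pixels, the known physical target height, and the camera's vertical field of view, solve for the distance. The parameter names and units (feet, degrees) are assumptions, not necessarily the sample's exact formulation.

```cpp
#include <cmath>

double computeDistance(double targetHeightPixels, double imageHeightPixels,
                       double targetHeightFeet, double verticalFovDegrees) {
    const double PI = 3.14159265358979323846;
    // Physical height of the full field of view at the target's distance.
    double fovHeightFeet = targetHeightFeet * imageHeightPixels / targetHeightPixels;
    // distance = (half the FOV height) / tan(half the FOV angle)
    double halfAngleRadians = verticalFovDegrees * PI / 360.0;
    return (fovHeightFeet / 2.0) / std::tan(halfAngleRadians);
}
```

With a 90 degree vertical field of view, a 2 ft target filling half of a 240-pixel-tall image works out to a distance of 2 ft.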
After processing is complete it is critical to release the memory from dynamically allocated objects such as the images, array of scores and vector of reports. Failing to release the memory used by these objects will "leak" the references to this memory and will result in the memory usage of the program steadily climbing until no free memory remains and the program crashes.
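The release pattern can be illustrated with a small sketch; the Tracked type below is a stand-in that counts live instances so a leak would be visible. In the real sample, the corresponding releases free the images, the array of scores, and the vector of reports.

```cpp
// Illustrative type: counts live instances so a missing delete shows up.
struct Tracked {
    static int liveCount;
    Tracked() { ++liveCount; }
    ~Tracked() { --liveCount; }
};
int Tracked::liveCount = 0;

void processFrame() {
    Tracked *scores = new Tracked[4];  // e.g. the per-particle score array
    Tracked *image = new Tracked();    // e.g. an intermediate image
    // ... image processing and scoring would happen here ...
    delete [] scores;                  // release the array
    delete image;                      // release the single object
}
```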
Sample Images
A number of sample images are provided in the VisionImages folder in the example project directory. The provided images are broken up into groups: one group shows a pseudo-target in the Hot position (reflective tape attached directly to the polycarbonate in the correct locations), another shows the pseudo-target in the not-Hot position, and the last shows the actual target setup from the kickoff filming field. All lit images were taken with a pair of green LED ring lights that nest one inside the other. While these images should help teams test algorithms quickly, it is highly recommended to utilize the reflective material provided in the Kit of Parts to create a target to test the camera and lighting setup that will be used on the robot. Note that the measurements in the image filenames are very rough and should not be taken as accurate measurements to be used for distance calibration calculations.