Recent Updates

  • Updated on: Dec 29, 2016

    Using GRIP with a Kangaroo Computer

    A recently available computer called the Kangaroo looks like a great platform for running GRIP on FRC robots. Its specs include:

    • Quad-core 1.4 GHz Atom processor
    • HDMI port
    • 2 USB ports (1 USB2 and 1 USB3)
    • 2GB RAM
    • 32GB Flash
    • Flash card slot
    • WiFi
    • Battery with 4 hours running time
    • Power supply
    • Windows 10 installed
    • Fingerprint reader

    All this costs only $99, or $90 for students and faculty members through Microsoft.

    The advantage of this setup is that it offloads image processing from the roboRIO, and because the Kangaroo runs a normal Windows system, all of our software should work without modification. Be sure to read the caveats at the end of this page before jumping in.

  • Updated on: Dec 10, 2016

    Processing Images from the 2016 FRC Game

    GRIP can be used to locate the goals in FIRST Stronghold using a series of thresholding and contour-finding operations. This page describes the algorithm that was used. You should download the set of sample images from the WPILib project on http://usfirst.collab.net. The idea is to load all of the images into a multi-image source and test your algorithm by cycling through the pictures.
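The thresholding step mentioned above can be sketched in plain NumPy. The function name `hsv_threshold` and the H/S/V ranges below are illustrative placeholders, not GRIP's tuned values; a real GRIP pipeline performs the equivalent operation via OpenCV.

```python
import numpy as np

def hsv_threshold(hsv, lower, upper):
    """Keep pixels whose H, S, and V channels all fall within the bounds.

    hsv: an (H, W, 3) uint8 array in HSV color space.
    Returns a binary mask: 255 where the pixel passes, 0 elsewhere.
    """
    lower = np.asarray(lower, dtype=np.uint8)
    upper = np.asarray(upper, dtype=np.uint8)
    in_range = np.all((hsv >= lower) & (hsv <= upper), axis=-1)
    return in_range.astype(np.uint8) * 255

# Two pixels: one bright green (inside the range), one black (outside).
pixels = np.array([[[60, 255, 200], [0, 0, 0]]], dtype=np.uint8)
mask = hsv_threshold(pixels, [40, 100, 100], [80, 255, 255])
```

In a full pipeline, contour finding (OpenCV's `findContours`) then runs on this binary mask to locate the goal shapes.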

  • Updated on: Dec 10, 2016

    Processing Images from the 2014 FRC Game

    This is a quick example of using GRIP to process a single image from the 2014 FRC game, Aerial Assist. Keep in mind that this is just a single image (all that I could find in a hurry), so it is not necessarily a particularly robust algorithm. You should take many pictures from different distances, angles, and lighting conditions to ensure that your algorithm performs well in all of those cases.

  • Updated on: Dec 10, 2016

    Processing Images from the 2009 FRC Game

    In the 2009 FRC game, Lunacy, robots were required to put orbit balls into the trailers of opponents' robots. To differentiate robots on the two alliances, a "flag" was attached to the center of the goal. The flag was a cylinder that was either green on top and red on the bottom, or red on top and green on the bottom. During the autonomous period, robots could look for opponent robots and shoot balls into their trailers using the vision targets.
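Once the red and green regions of a flag have been located, one way to tell the two colorings apart is to compare their vertical centers. The function below is an illustrative sketch, not part of any published 2009 code; it assumes image coordinates that grow downward.

```python
def flag_orientation(green_center_y, red_center_y):
    """Classify a Lunacy flag by which color band sits higher.

    Image coordinates grow downward, so a smaller y means higher in the frame.
    """
    if green_center_y < red_center_y:
        return "green over red"
    return "red over green"

# A green blob centered at y=40 sits above a red blob centered at y=90.
result = flag_orientation(40, 90)
```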

  • Updated on: Dec 10, 2016

    Generating Code from GRIP

  • Updated on: Dec 10, 2016

    Introduction to GRIP

    GRIP is a tool for developing computer vision algorithms interactively rather than through trial-and-error coding. After developing your algorithm, you can run GRIP in headless mode on your roboRIO, on a Driver Station laptop, or on a coprocessor connected to your robot network. With GRIP you choose vision operations to build a graphical pipeline that represents the sequence of operations performed to complete the vision algorithm.

    GRIP is based on OpenCV, one of the most popular computer vision libraries, used for research, robotics, and vision algorithm implementations. The operations available in GRIP are almost a one-to-one match with those available if you were hand-coding the same algorithm in a text-based programming language.
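The pipeline idea above can be sketched in a few lines: each GRIP operation maps to a function, and the pipeline applies them in order. The names and toy "operations" here are illustrative, not GRIP's generated API; real stages would be OpenCV calls such as blur, threshold, and erode.

```python
def run_pipeline(frame, steps):
    """Apply each vision operation in sequence, like GRIP's graphical pipeline."""
    for step in steps:
        frame = step(frame)
    return frame

# Toy operations on a flat list of gray values stand in for real image stages.
threshold = lambda pixels: [255 if v > 128 else 0 for v in pixels]
invert = lambda pixels: [255 - v for v in pixels]

result = run_pipeline([10, 200, 130], [threshold, invert])  # → [255, 0, 0]
```

The output of each stage becomes the input of the next, which is exactly the dataflow the GRIP editor shows as wires between operation blocks.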

  • Updated on: Jul 09, 2016

    Robot Preemptive Troubleshooting

  • Updated on: Jul 09, 2016

    Using limit switches to control behavior