Bring Your Own Drone™ Prep Flowsheet

First-time users of the Agribotix Bring Your Own Drone (BYOD) image processing services often have questions about how to make sure output image products can be easily ported into existing software packages like SST or SMS and combined with existing data layers. Generally, if you're interested in precision agriculture applications, it's important that imagery products be created with map information so that they can be easily combined with other information like soil test or harvest yield data.

If you're planning to use the Agribotix Bring Your Own Drone processing system, follow this flow chart to help us create high quality georeferenced maps from your imagery. Click on the image below to go to the Agribotix Bring Your Own Drone page and find more detailed info.

Just How Good is Drone Imagery?

Agribotix founder, Tom McKinnon, wrote an excellent post on Agribotix near infrared imagery, vegetation indices, and how cameras modified to image in the near infrared can be used to reliably monitor within-field crop variations. 

I wondered how Agribotix imagery would compare with the gold standard of moderate resolution land imaging sensors, Landsat 8.

To find out, I went to the USGS EarthExplorer and downloaded two Landsat 8 scenes (acquired 9-20-2014 and 10-06-2014) collected close in time to an Agribotix test flight over Anderson Farms, near Erie, CO on September 23rd. Fall of 2014 on the Colorado Front Range was dry, warm and clear for the several weeks over which this imagery was collected: atmospheric conditions were optimal for imaging. See the beautiful Landsat image from September 20th that captured much of the Colorado Front Range, including the Agribotix test plot.

Landsat 8 color infrared composite image acquired over the Colorado Front Range, September 20th, 2014.  The test plot is inside the red box.


Results for the Agribotix imaging system look great!  Check out a plot of the mean modified-NDVI measured by the Agribotix imaging system when compared to m-NDVI calculated from the two dates of Landsat imagery.  Normalized difference vegetation index values range between -1.0 and 1.0, so the fact that Agribotix-measured m-NDVI is within 0.1 of m-NDVI measured from Landsat on two separate dates on either side of the Agribotix acquisition is very encouraging.

Find full details below on how we arrived at the plot of Agribotix vs Landsat m-NDVI.

Agribotix modified-NDVI from September 23rd compared with m-NDVI from Landsat on September 20th and October 6th.  Error bars are standard deviations of 100 randomly sampled pixels.





Landsat 8 images were calibrated to units of radiance (W/(m²·sr·µm)).
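For readers curious what that calibration involves: Landsat 8 products ship with per-band rescaling coefficients in their MTL metadata file, and the standard conversion from quantized digital numbers to top-of-atmosphere radiance is a simple linear rescale. The gain and offset values below are placeholders for illustration; real values come from the RADIANCE_MULT_BAND_x and RADIANCE_ADD_BAND_x fields of each scene's metadata.

```python
import numpy as np

# Placeholder coefficients for illustration only; real values come
# from the scene's MTL file (RADIANCE_MULT_BAND_x / RADIANCE_ADD_BAND_x).
RADIANCE_MULT = 0.012  # ML, multiplicative rescaling factor
RADIANCE_ADD = -60.0   # AL, additive rescaling factor

def dn_to_radiance(dn, mult=RADIANCE_MULT, add=RADIANCE_ADD):
    """Convert Landsat 8 quantized digital numbers to top-of-atmosphere
    radiance in W/(m^2 * sr * um) via the standard linear rescaling."""
    return mult * np.asarray(dn, dtype=float) + add

dn = np.array([10000, 20000, 30000])   # example quantized pixel values
radiance = dn_to_radiance(dn)
```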

Agribotix images are currently uncalibrated, so this experimental comparison with a well-characterized system like Landsat gives great insight into the radiometric quality of Agribotix imaging.

Agribotix images were acquired with the GoPro HERO3+ Silver Edition, collecting images as 8-bit JPEGs. 

Image Stitching

Individual Agribotix flight images were stitched using the Agribotix Image Processing Service.

Image Registration

Agribotix image georegistration was improved using the Google Satellite web map service as 'truth', achieving geopositioning accuracy within 1 to 2 meters.  The geopositioning accuracy that the USGS provides for Landsat is exceptional, so the Landsat images were used as-is.


Last summer, Tom and his team created vegetation index images using the near infrared and green bands from the Agribotix sensor system.  Here, we test creation of vegetation index images using the near infrared and blue band.

The blue and green channels have been selected for generating NDVI products from Agribotix imagery because Agribotix sensors don't image in the red portion of the electromagnetic spectrum.  Agribotix installs a custom filter in camera systems that blocks red light and allows imaging of the near infrared, green and blue (NGB) portions of the spectrum.  While the Normalized Difference Vegetation Index (NDVI) has traditionally been created using the near infrared and red channels, it turns out that almost any channel in the visible range will work to highlight vegetation abundance and condition in imagery when compared with a near infrared channel.

As discussed by Tom in his recent blog, both green and blue channels offer an opportunity for vegetation index creation because of the way visible light is reflected from plants relative to near infrared light.  See the figure of the reflectance signatures of three species of shrubs and trees (Toyon, Live Oak and Blue Oak), and how each of those plants reflects orders of magnitude more radiation in the near infrared than it does in the blue, green and red bands of the visible spectrum.  You can also see in the figure how plants preferentially absorb both blue and red light for use in photosynthesis, and reflect more green light.  This is the basis for our using the near infrared and blue channels in the calculation of our 'modified' normalized difference vegetation index.

Landsat and Agribotix Images were converted to a modified normalized difference vegetation index (m-NDVI) by applying the following equation:

m-NDVI = (NIR – blue) / (NIR + blue)
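In code, this is a simple per-pixel operation on the near infrared and blue bands. Here's a minimal NumPy sketch (the divide-by-zero guard is our addition, for pixels where both bands are zero):

```python
import numpy as np

def m_ndvi(nir, blue):
    """Modified NDVI: (NIR - blue) / (NIR + blue), ranging over [-1, 1].
    Pixels where NIR + blue == 0 are returned as 0."""
    nir = np.asarray(nir, dtype=float)
    blue = np.asarray(blue, dtype=float)
    denom = nir + blue
    out = np.zeros_like(denom)
    np.divide(nir - blue, denom, out=out, where=denom != 0)
    return out

# Healthy vegetation reflects much more NIR than blue, giving a high index.
veg = m_ndvi(0.5, 0.1)   # (0.4 / 0.6), roughly 0.67
```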

Several vegetation signatures with the blue, green, red and near infrared portions of the spectrum highlighted.


Spatial Resampling

The Agribotix image was resampled to the 30-meter pixel resolution of Landsat using nearest neighbor resampling. 
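The nearest-neighbor idea can be illustrated with a small sketch: each coarse output pixel takes the value of the nearest fine-resolution input pixel, with no averaging. In practice this step would be done with a GIS tool on the georeferenced rasters; the array sizes below are made up for illustration.

```python
import numpy as np

def resample_nearest(img, out_shape):
    """Nearest-neighbor resample of a 2-D array to out_shape,
    as when downsampling fine drone pixels onto a coarse Landsat grid."""
    rows = np.arange(out_shape[0]) * img.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * img.shape[1] // out_shape[1]
    return img[np.ix_(rows, cols)]

fine = np.arange(36).reshape(6, 6)        # pretend 6x6 fine-resolution grid
coarse = resample_nearest(fine, (2, 2))   # downsample to a 2x2 grid
```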


One hundred randomly sampled points were established for extracting pixel values from the test plot.
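A sketch of that sampling step, using a synthetic m-NDVI raster in place of the real ones (the raster size and random seed here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the sample is repeatable

# Stand-in m-NDVI raster over the test plot (values in [-1, 1]).
m_ndvi_img = rng.uniform(-1.0, 1.0, size=(300, 300))

# Draw 100 random pixel coordinates and extract the values there.
rows = rng.integers(0, m_ndvi_img.shape[0], size=100)
cols = rng.integers(0, m_ndvi_img.shape[1], size=100)
samples = m_ndvi_img[rows, cols]

mean, std = samples.mean(), samples.std(ddof=1)  # error bars = std dev
```

Presumably the same 100 coordinates would be applied to each raster (Agribotix and both Landsat dates) so the means and standard deviations being compared come from identical locations.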

We’re happy to see that the Agribotix imaging system performs so well when compared to Landsat 8, one of the highest quality land imaging satellite systems.  This result gives us great confidence in our system for providing growers with on-demand in-field intelligence for precision agriculture.

Agribotix and Lean: Learning, Innovating, Iterating & Improving

Agribotix is making great strides applying lean startup principles to the business of drones and precision agriculture.  Agribotix brings amazing new technology to market for precision agriculture: a low-cost system that allows growers to easily integrate imagery into operations, reducing costs while increasing productivity and profitability on the farm.

I recently picked up The Lean Startup, Eric Ries’ book on lean principles for growing new technology companies, and have been thinking about how to apply lessons from the book to product development at Agribotix.  

A company like Agribotix wouldn’t have been possible just a few short years ago…  Agribotix is riding a wave of technological convergence, delivering new capabilities to a global agriculture market enabled by the remarkable pace of innovation that we’ve seen across technologies, from cloud computing, to open source hardware and software, to advances in robotics and aerial systems.  Any of these technologies is impressive by itself, but combined they become a force multiplier, enabling Agribotix to rapidly introduce useful new products to market at an affordable price.  Here’s a quick rundown of some of the technologies that are converging at Agribotix, and how we’re using lean to be successful in the marketplace.

At Agribotix, we focus on delivering actionable intelligence to growers, like the infrared imaging and variable rate prescription map depicted above.  One of our primary tools for rapidly creating actionable intelligence for agriculture is cloud computing: scaling on-demand computation at the touch of a button & processing large amounts of data in a short amount of time.  Utilizing cheap and scalable virtual cloud hardware, Agribotix is blessed with the ability to test products with a global audience: whether you’re a vineyard manager in Chile or a corn grower in Ohio, you can access the Agribotix WebApp and take advantage of scalable Agribotix analytics. 

Open source software: A whole collection of software tools from the open source community has matured and, when combined, provides Agribotix with lean capability.  Python for scripting, rapid prototyping and automation; tools like Orfeo Toolbox, QGIS & GDAL for geospatial data management and analytics; and geospatial servers like GeoServer for testing products with customers in the field.  Together these technologies provide an open software stack, with a global community of users supporting the platform and contributing to the Agribotix lean capability.

Unmanned aerial systems: Agribotix integrates the most capable, durable and affordable components for drones.  UAV systems continue the trend of getting cheaper while packing more capability into smaller packages.  Agribotix applies a lean strategy of rapidly prototyping UAS technology, testing it in the field, and adjusting designs to fit customer requirements.  Combining this strategy with open source products like the 3D Robotics Mission Planner/PixHawk platform enables Agribotix to deliver drone systems molded to the agriculture market, with capabilities like fully autonomous flight, long endurance, and sensor systems tailored to applications.

All of these capabilities would mean nothing without the technology to tie it all together in the field.  The fact that most new tractors come equipped with GPS and computerized systems as standard equipment means that the promise of remote sensing for precision agriculture is finally here. 

Agribotix is using lean startup principles to make overhead imaging affordable and accessible. When combined with enabling technologies like GPS, variable rate application, UAVs, and open source hardware and software, Agribotix products can be used to monitor crops from above, improving yields and saving money on the farm.