How do I optimize the quality of my drone imagery?


The imagery Agribotix can produce for a given customer is only as good as the provided data; poor data will produce poor results. Below are three of the most common problems with stitched imagery and a lengthy discussion of what causes each of them and how to fix them. These problems are cloud and sky shadows, stitching artifacts, and GPS failure. If your imagery results are less than satisfactory, there is a good chance that your problem is one of the three listed below.

Cloud and Sky Shadows

Cloud shadows appear when cloud cover overhead blocks the sunlight to varying degrees while the drone is in the air taking photos.  Cumulus clouds moving past the sun create the most noticeable cloud shadows.  Even when the sky is overcast, clouds that vary in thickness will create cloud shadows as they move across the sky and block the sun.  Ideal weather conditions for drone photography are either clear skies or 100% overcast with a consistent cloud layer.

The telltale signs of cloud shadows are:

1.) A broad, consistent transition from "good" crop health to "bad" crop health, as if the bad areas were airbrushed onto the false-color map using Photoshop.

2.) A consistent "bad" area that moves across several crop fields of varying species and growth stages, as well as natural foliage.  These "bad" areas do not conform to any apparent geographical feature such as topography or soil type, nor do they conform to row direction, vehicle tracks, or irrigation layouts.

cloudShadows.jpg

It is also possible to see the opposite effect, which we call 'sky shadows': these occur when the drone is flying on an overcast day but sunlight manages to penetrate the cloud deck for short periods during the flight.  Sky shadows appear as healthy-looking airbrushed patches on a crop field rather than unhealthy ones.

Below are examples of cloud shadows from a photo set; note the otherwise bright pink canopy interrupted by blotches of dark pink. The dark pink areas have a blocked view of the sun. IMG_1296.JPG in the middle of the bottom row has no cloud shadow at all. IMG_1291.JPG on the upper right-hand side is almost entirely covered by cloud shadow except for the upper left-hand corner of the image. If you have images such as these in your photo set, you should remove them before uploading to Agribotix for processing. Removing too many images may leave holes in your final stitched image, but our experience has been that such holes are better than the false information introduced by cloud and sky shadows.

recognizeCloudShadow.png

Spotting cloud shadows in final results is part art, part science. At Agribotix we've gotten good at it after ground-truthing dozens of fields over the last two years, and when we see cloud shadows in your imagery we will try to point them out, but the customer is ultimately responsible for the raw data input and the final interpretation of results. The best way to avoid cloud and sky shadows altogether is, of course, to fly only under clear skies, or at the very least with clouds that are not blocking the sun.


Stitching Artifacts

Stitching artifacts are features in your final stitched image that appear to show differences in crop health but do not actually exist on the ground; they are by-products of the image processing. Telltale signs of stitching artifacts are:

  1. Regions of your false-color map changing colors as you zoom in and out on Google Earth.

  2. Visually apparent differences in resolution on the CIR stitched image.

  3. Several areas (all roughly the same size) that appear to be 'plastered' onto the image.

These artifacts can appear in part of your field or across the whole field. Stitching artifacts are highlighted in the red boxes below.

stitchingArtifacts.png

Stitching artifacts can be caused by fast-moving clouds that produce cloud shadows over a particular geographic spot in one photo but not in subsequent photos; this confuses the stitching software as to whether such a spot is supposed to be dark or light. Varying degrees of focus between individual photos taken by the drone's camera will also lead to stitching artifacts and poor stitching. Below are two photos; the one on the left is less sharply focused than the one on the right. Whatever your camera's focus quality, it should be consistent; otherwise, photos like these will produce stitching artifacts.

blur.png
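
If you want to screen a photo set for inconsistent focus before uploading, one common heuristic is the variance of the Laplacian: sharp images score high, blurry ones low. Below is a minimal sketch assuming OpenCV is installed; the threshold and the photo-set path are hypothetical starting points and should be tuned against known-sharp photos from your own camera.

```python
# Sketch: flag unusually blurry photos by variance of the Laplacian.
# BLUR_THRESHOLD and the photoset path are hypothetical -- tune the
# threshold against known-sharp photos from your own camera.
import glob
import cv2

BLUR_THRESHOLD = 100.0  # hypothetical; lower variance = blurrier image

for path in sorted(glob.glob("photoset/*.JPG")):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue  # skip unreadable files
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    if sharpness < BLUR_THRESHOLD:
        print(f"{path}: sharpness {sharpness:.1f} -- consider removing")
```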

Below is a photo set taken with a camera that had poor automatic exposure control; note the four sets of three dark photos highlighted in red boxes. If you have images like this in your photo set, they should be removed before sending the remainder to Agribotix for image processing; these too will produce stitching artifacts.

poorAutoExposure.png
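
One quick way to catch frames like these before upload is to compare each image's mean brightness against the median for the whole set and flag outliers. Below is a minimal sketch assuming Pillow and NumPy are available; the 25% cutoff and the photo-set path are hypothetical starting points, not tested values.

```python
# Sketch: flag photos much darker than the rest of the set.
# The 25% cutoff below is a hypothetical starting point, not a tested value.
import glob
import numpy as np
from PIL import Image

paths = sorted(glob.glob("photoset/*.JPG"))
brightness = {p: np.asarray(Image.open(p).convert("L")).mean() for p in paths}
median = np.median(list(brightness.values()))

for path, value in brightness.items():
    if value < 0.75 * median:  # more than 25% darker than the set's median
        print(f"{path}: mean brightness {value:.1f} vs median {median:.1f}")
```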

Another cause of stitching artifacts is insufficient oversampling. Agribotix recommends an oversampling of 10; that is, each point on your field should appear in at least 10 different individual photos from your photo set. Many customers get excellent results with an oversampling of 6.5, but their fields are small, so the stitching software has several reference points around the field edges to work with when aligning the photos. Orchards and vineyards need less oversampling because the stitching software can find reference points within individual trees and vines. Below is an individual photo taken over a wheat field; notice the general uniformity in color and texture. It is extremely difficult for the stitching software to find reference points in this almost featureless terrain, so it needs an oversampling of 10 to align the photos into a single orthorectified mosaic.

Notice the false-color image on the lower left; the center of the image is green whereas the outer edges are grey/black. The original infrared image is on the right and is brighter in the middle and darker around the edges. This phenomenon is known as 'vignetting' and is common in photography. Low oversampling by itself is one thing, but when it is coupled with a photo set in which all or most of the photos show noticeable vignetting, the result will be severe stitching artifacts throughout the stitched image.

noOversampling.png
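
Oversampling can be estimated from a flight plan before you fly. For a camera pointed straight down, the ground footprint of one photo follows from altitude and field of view, and the number of overlapping photos follows from the shot interval and lane spacing. The sketch below works through that arithmetic under a simple rectilinear camera model; all of the numbers are illustrative, not Agribotix flight parameters.

```python
# Sketch: estimate oversampling from flight parameters.
# Assumes a nadir-pointed camera with a rectilinear lens model;
# all numbers below are illustrative, not recommended settings.
import math

altitude_m = 100.0        # height above ground level
fov_along_deg = 60.0      # camera field of view along the flight line
fov_across_deg = 45.0     # field of view across the flight line
speed_ms = 15.0           # ground speed
shot_interval_s = 2.0     # seconds between photos
lane_spacing_m = 30.0     # distance between lawnmower-pattern passes

# Ground footprint of one photo: width = 2 * h * tan(FOV / 2)
along_m = 2 * altitude_m * math.tan(math.radians(fov_along_deg) / 2)
across_m = 2 * altitude_m * math.tan(math.radians(fov_across_deg) / 2)

# How many photos along-track cover a given ground point, and how many
# adjacent lanes see that point; the product approximates oversampling.
photos_along = along_m / (speed_ms * shot_interval_s)
lanes_across = across_m / lane_spacing_m
print(f"footprint: {along_m:.0f} m x {across_m:.0f} m")
print(f"approx. oversampling: {photos_along * lanes_across:.1f}")
```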

The final cause of stitching artifacts is aircraft altitude that varies greatly. Below are two images, one from the drone's takeoff and another from its landing approach. Because these are taken at an altitude far below the cruising altitude of the drone's lawnmower flight pattern, they will produce high-resolution patches in the stitched image, leading to stitching artifacts. The Agribotix Field Extractor automatically ignores images taken below a certain altitude above ground level to avoid this problem entirely. If you are not using Field Extractor, you should review your photo set and remove such images before uploading to Agribotix for processing.

airSpeed.png
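
If you want to replicate that filtering yourself, most geotagged photos carry a GPS altitude in their EXIF data. Below is a minimal sketch using Pillow's EXIF support (it assumes a recent Pillow version); the cutoff and ground elevation are hypothetical values, and note that EXIF GPSAltitude is usually altitude above sea level, so the launch site's elevation must be subtracted to get height above ground.

```python
# Sketch: drop photos taken below a cruising-altitude cutoff.
# The cutoff and elevation below are hypothetical; EXIF GPSAltitude
# (tag 6) is typically altitude above sea level, so subtract the
# launch site's elevation to approximate height above ground (AGL).
import glob
from PIL import Image

GPS_IFD = 0x8825             # EXIF pointer to the GPS information block
ALTITUDE_TAG = 6             # GPSAltitude within that block
GROUND_ELEVATION_M = 1500.0  # hypothetical launch-site elevation (MSL)
MIN_AGL_M = 50.0             # hypothetical cutoff below cruise altitude

for path in sorted(glob.glob("photoset/*.JPG")):
    gps = Image.open(path).getexif().get_ifd(GPS_IFD)
    if ALTITUDE_TAG not in gps:
        continue  # no altitude recorded; leave for manual review
    agl = float(gps[ALTITUDE_TAG]) - GROUND_ELEVATION_M
    if agl < MIN_AGL_M:
        print(f"{path}: ~{agl:.0f} m AGL -- likely takeoff/landing, remove")
```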

Below is an example of a field that is completely covered in stitching artifacts; there is no actual difference between those areas shown as generally green and those shown as generally yellow. Note the large black holes at the upper end of the image, caused by flooding. Just because stitching artifacts are present doesn't mean your imagery can't tell you valuable information about your fields; the key is to recognize stitching artifacts when they appear and not let them lead you to false conclusions.

stitchingArtifacts2.png

GPS Failure

Below is an example of what happens when your drone system's GPS sensor fails to write accurate geo-coordinates to your camera's individual photos, a process called geotagging; the final stitched image will not properly overlay onto Google Earth and will often be the wrong size or have an inaccurate orientation.

gpsFailure.png

While some cameras have their own internal GPS sensors, these sensors are less accurate than those embedded in a drone's flight navigation system. The Agribotix Field Extractor uses the GPS data pulled from the drone's navigation system to optimize the georeferenced results. 

Just How Good is Drone Imagery?

Agribotix founder Tom McKinnon wrote an excellent post on Agribotix near infrared imagery, vegetation indices, and how cameras modified to image in the near infrared can be used to reliably monitor within-field crop variations.

I wondered how Agribotix imagery would compare with the gold standard of moderate resolution land imaging sensors, Landsat 8.

To find out, I went to the USGS EarthExplorer and downloaded two Landsat 8 scenes (September 20, 2014 and October 6, 2014) collected close in time to an Agribotix test flight over Anderson Farms, near Erie, CO, on September 23rd.  Fall of 2014 on the Colorado Front Range was dry, warm and clear for the several weeks over which this imagery was collected; atmospheric conditions were optimal for imaging.  See the beautiful Landsat image from September 20th that captured much of the Colorado Front Range, including the Agribotix test plot.

Landsat 8 color infrared composite image acquired over the Colorado Front Range, September 20th, 2014.  The test plot is inside the red box.


Results for the Agribotix imaging system look great!  Check out the plot of the mean modified NDVI measured by the Agribotix imaging system compared with m-NDVI calculated from the two dates of Landsat imagery.  Normalized difference vegetation index values range between -1.0 and 1.0, so the fact that Agribotix-measured m-NDVI is within 0.1 of m-NDVI measured from Landsat on two separate dates on either side of the Agribotix acquisition is very encouraging.

Find full details below on how we arrived at the plot of Agribotix vs Landsat m-NDVI.

Agribotix modified-NDVI from September 23rd compared with m-NDVI from Landsat on September 20th and October 6th.  Error bars are standard deviations of 100 randomly sampled pixels.



Preprocessing

Calibration

Landsat 8 images were calibrated to units of radiance (W / (m² · sr · µm)).
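
For reference, converting Landsat 8 digital numbers to top-of-atmosphere radiance uses per-band gain and offset coefficients published in each scene's MTL metadata file. A minimal sketch with NumPy, using illustrative coefficient values rather than ones from a real scene:

```python
# Sketch: Landsat 8 DN-to-radiance conversion, L = ML * Qcal + AL,
# where ML and AL come from the scene's MTL file (RADIANCE_MULT_BAND_x /
# RADIANCE_ADD_BAND_x). The coefficient values below are illustrative only.
import numpy as np

ML = 0.012  # RADIANCE_MULT_BAND_x from the MTL file (illustrative)
AL = -60.0  # RADIANCE_ADD_BAND_x from the MTL file (illustrative)

def dn_to_radiance(qcal: np.ndarray) -> np.ndarray:
    """Convert quantized digital numbers to radiance (W / (m^2 * sr * um))."""
    return ML * qcal.astype(np.float64) + AL
```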

Agribotix images are currently uncalibrated, so this experimental comparison with a well-characterized system like Landsat gives great insight into the radiometric quality of Agribotix imaging.

Agribotix images were acquired with the GoPro HERO3+ Silver Edition, collecting images as 8-bit JPEGs. 

Image Stitching

Individual Agribotix flight images were stitched using the Agribotix Image Processing Service.

Image Registration

Agribotix image georegistration was improved using the Google Satellite web map service as 'truth', achieving a geopositioning accuracy of 1 to 2 meters.  The geopositioning accuracy that the USGS provides for Landsat is exceptional, so those images were used as-is.

Modified-NDVI

Last summer, Tom and his team created vegetation index images using the near infrared and green bands from the Agribotix sensor system.  Here, we test creation of vegetation index images using the near infrared and blue band.

The blue and green channels have been selected for generating NDVI products from Agribotix imagery because Agribotix sensors don’t image in the red portion of the electromagnetic spectrum.  Agribotix installs a custom filter in camera systems that blocks red light and allows imaging of the near infrared, green and blue (NGB) portions of the spectrum.  While the Normalized Difference Vegetation Index (NDVI) has traditionally been created using the near infrared and red channels, it turns out that almost any channel in the visible range will work to highlight vegetation abundance and condition in imagery when compared with a near infrared channel.  As discussed by Tom in his recent blog post, both green and blue channels offer an opportunity for vegetation index creation because of the way visible light is reflected from plants relative to near infrared light.  See the figure of the reflectance signatures of three species of shrub and tree (Toyon, Live Oak and Blue Oak), and how each of those plants reflects orders of magnitude more radiation in the near infrared than it does in the blue, green and red bands of the visible spectrum.  You can also see in the figure how plants preferentially absorb both blue and red light for use in photosynthesis, and reflect more green light.  This is the basis for our using the near infrared and blue channels in the calculation of our ‘modified’ normalized difference vegetation index.

Landsat and Agribotix Images were converted to a modified normalized difference vegetation index (m-NDVI) by applying the following equation:

m-NDVI = (NIR − blue) / (NIR + blue)
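
As a concrete illustration, the per-pixel calculation is straightforward with NumPy; this sketch assumes the NIR and blue channels have already been extracted from the imagery as same-shaped arrays:

```python
# Sketch: per-pixel modified NDVI from NIR and blue channels.
# Assumes nir and blue are same-shaped arrays already extracted from
# the imagery; the epsilon guards against divide-by-zero pixels.
import numpy as np

def modified_ndvi(nir: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """m-NDVI = (NIR - blue) / (NIR + blue), with values in [-1, 1]."""
    nir = nir.astype(np.float64)
    blue = blue.astype(np.float64)
    return (nir - blue) / (nir + blue + 1e-12)
```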

Several vegetation signatures with the blue, green, red and near infrared portions of the spectrum highlighted.


Spatial Resampling

The Agribotix image was resampled to the 30-meter pixel resolution of Landsat using nearest neighbor resampling. 
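
A minimal sketch of that resampling step, assuming the mosaic is already co-registered with the Landsat grid; SciPy's zoom with order=0 performs nearest-neighbor interpolation, and the source resolution below is illustrative, not the actual flight's ground sample distance:

```python
# Sketch: nearest-neighbor resampling of the drone mosaic to Landsat's
# 30 m grid. Assumes the mosaic is already co-registered with the
# Landsat scene; source_res_m is illustrative, not the flight's GSD.
import numpy as np
from scipy.ndimage import zoom

source_res_m = 0.05   # illustrative ground sample distance of the mosaic
target_res_m = 30.0   # Landsat 8 pixel size

def resample_to_landsat(mosaic: np.ndarray) -> np.ndarray:
    """Shrink the mosaic so each output pixel spans 30 m (order=0 = nearest)."""
    factor = source_res_m / target_res_m
    return zoom(mosaic, factor, order=0)
```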

Sampling

One hundred randomly sampled points were established for extracting pixel values from a test plot.
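
A sketch of that sampling step, assuming both m-NDVI rasters are on the same grid and a boolean mask marks the test plot; it draws 100 random pixel locations and reports the mean and standard deviation, matching the error bars in the plot above:

```python
# Sketch: sample 100 random pixels within the test plot and report mean
# and standard deviation. Assumes mndvi and plot_mask are co-registered
# arrays, with plot_mask a boolean mask of the test plot.
import numpy as np

rng = np.random.default_rng(seed=42)

def sample_stats(mndvi: np.ndarray, plot_mask: np.ndarray, n: int = 100):
    rows, cols = np.nonzero(plot_mask)          # all in-plot pixel locations
    idx = rng.choice(rows.size, size=n, replace=False)
    values = mndvi[rows[idx], cols[idx]]
    return values.mean(), values.std()
```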

We’re happy to see that the Agribotix imaging system performs so well when compared to Landsat 8, one of the highest-quality land imaging satellite systems; this gives us great confidence in our system for providing growers with on-demand, in-field intelligence for precision agriculture.

Agribotix and Lean: Learning, Innovating, Iterating & Improving

Agribotix is making great strides applying lean startup principles to the business of drones and precision agriculture.  Agribotix brings amazing new technology to market for precision agriculture: a low-cost system that allows growers to easily integrate imagery into operations, reducing costs while increasing productivity and profitability on the farm.

I recently picked up The Lean Startup, Eric Ries’ book on lean principles for growing new technology companies, and have been thinking about how to apply lessons from the book to product development at Agribotix.  

A company like Agribotix wouldn’t have been possible just a few short years ago…  Agribotix is riding a wave of technological convergence, delivering new capabilities to a global agriculture market enabled by the remarkable pace of innovation we’ve seen across technologies, from cloud computing, to open source hardware and software, to advances in robotics and aerial systems.  Each of these technologies is impressive by itself, but when combined they become a force multiplier, enabling Agribotix to rapidly introduce useful new products to market at an affordable price.  Here’s a quick rundown of some of the technologies that are converging at Agribotix, and how we’re using lean to be successful in the marketplace.

At Agribotix, we focus on delivering actionable intelligence to growers, like the infrared imaging and variable rate prescription map depicted above.  One of our primary tools for rapidly creating actionable intelligence for agriculture is cloud computing: scaling on-demand computation at the touch of a button & processing large amounts of data in a short amount of time.  Utilizing cheap and scalable virtual cloud hardware, Agribotix is blessed with the ability to test products with a global audience: whether you’re a vineyard manager in Chile or a corn grower in Ohio, you can access the Agribotix WebApp and take advantage of scalable Agribotix analytics. 

Open source software: A whole collection of software tools from the open source community has matured and, when combined, provides Agribotix with lean capability.  Python for scripting, rapid prototyping and automation; tools like Orfeo Toolbox, QGIS & GDAL for geospatial data management and analytics; and geospatial servers like GeoServer for testing products with customers in the field.  Together these technologies provide an open software stack, with a global community of users supporting the platform and contributing to the Agribotix lean capability.

Unmanned aerial systems: Agribotix integrates the most capable, durable and affordable components for drones.  UAV systems continue the trend of getting cheaper while packing more capability into smaller packages.  Agribotix applies a lean strategy of rapidly prototyping UAS technology, testing it in the field, and adjusting designs to fit customer requirements.  Combining this strategy with open source products like the 3D Robotics Mission Planner/PixHawk platform enables Agribotix to deliver drone systems molded to the agriculture market, with capabilities like fully autonomous flight, long endurance, and sensor systems tailored to applications.

All of these capabilities would mean nothing without the technology to tie it all together in the field.  The fact that most new tractors come equipped with GPS and computerized systems as standard equipment means that the promise of remote sensing for precision agriculture is finally here. 

Agribotix is using lean startup principles to make overhead imaging affordable and accessible. When combined with enabling technologies like GPS, variable rate application, UAVs, and open source hardware and software, Agribotix products can be used to monitor crops from above, improving yields and saving money on the farm.   

Agribotix GoPro Adventure: Part 3 - Rolling Shutter

In our two previous GoPro blogs (Part 1 and Part 2) we've experimented with ways of hacking the GoPro to make it useful as an ag drone camera and pointed out some problems.  We solved most of them, but Part 2 left us with a show stopper: the shutter speed of the GoPro can't be set by the user.  That's not a problem in bright sun, but in low light the shutter speed gets as low as 1/60th of a second, which is a disaster for an aircraft moving at 15 m/s.

Or is it?  Jimmy was out flying last week right around sunset and returned some absolutely stunning imagery shot at around 1/60th.  The shot below is a zoomed-in portion of an image showing a dirt road about 5 m wide.  Note the fence posts along the left edge, which are sharp.  With the plane flying at 100 m AGL and 15 m/s, the camera traveled 0.25 m while the shutter was open, so the smearing should have been horrible.  But we can't detect any at all.  What happened?

We're still trying to figure it out, but we believe it is due to the rolling shutter on the GoPro's CMOS sensor (and all CMOS sensors).  Normally rolling shutter is considered an annoyance (e.g. propellers that turn into scimitars), but it appears to be saving our bacon in this case.  Our theory (yet to be proven) is that the rolling shutter scans through the entire image in the shutter-open time slot, but every line of pixels is exposed for only a fraction of that time (if this isn't clear, there's a nice little animation in this link).  We're doing some tests on the GoPro with moving objects to try to get a more quantitative handle on it so we should have some more refined ideas soon.

So is there a downside?  Perhaps.  The way we orient the camera on the plane, the bottom of the image has moved forward 25 cm relative to its location when the top of the image was exposed.  Effectively we are compressing the aspect ratio of the picture a bit in the landscape direction.  But this is at most 0.25% distortion (25 cm in a 100 m image size) in the worst case (normally the shutter speed is faster than 1/60, even on dark days).  I expect that the inherent lens distortion is even higher than that.
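
The arithmetic behind those numbers is easy to check; a short sketch using the figures quoted in this post:

```python
# Sketch: worst-case forward motion during exposure and the resulting
# aspect-ratio distortion, using the figures quoted in this post.
speed_ms = 15.0         # ground speed of the aircraft
shutter_s = 1.0 / 60.0  # slowest shutter speed observed in low light
footprint_m = 100.0     # approximate along-track image footprint at 100 m AGL

travel_m = speed_ms * shutter_s      # 0.25 m of forward motion
distortion = travel_m / footprint_m  # 0.25% aspect compression
print(f"camera travel during exposure: {travel_m:.2f} m")
print(f"worst-case distortion: {distortion:.2%}")
```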

The jury is still out on this theory and we welcome comments from anyone who perhaps knows more about CMOS rolling shutter issues and/or specifics of the GoPro.  But tentatively we are withdrawing our "show stopper" verdict on the GoPro.  Now if we could only get the cost of the unit down...