We were delighted last fall when we were invited to speak at the 2014 High Plains No-Till Conference put on by the Colorado Conservation Tillage Association. Clearly we weren't anticipating the sub-zero temperatures and blowing snow between Boulder and the Kansas border, but against all odds we made it to Burlington and had a great meeting. Our sessions on ag drones (titled "UAVs in Agriculture: The Good, The Bad, and The Ugly," with the "ugly" being the current regulatory swamp the FAA has dealt us) played to a packed hall on both days. Beyond the presentations, the networking opportunities were awesome.
We made the road trip out to Salina, Kansas on January 23 and 24 to attend the Kansas Ag Research and Technology Association annual conference. The drone presentations by Richard Ferguson and his group from the University of Nebraska-Lincoln and by Dietrich Kastens generated the most interest of any sessions at the two-day event. Two drone manufacturers presented in the trade show. All in all we generated a lot of interest in Agribotix and will be making a number of trips back to Kansas this growing season.
Colorado State University, a hub of agricultural innovation, hosted a breakfast prior to the National Western Stock Show to discuss the future of agriculture in Colorado and how innovative small businesses like Agribotix are contributing significantly to this essential sector of the state's economy. Governor Hickenlooper joined CSU president Tony Frank and four agricultural experts, including Doug Johnson of the Rocky Mountain Innosphere, who gave us a nice shout out during his remarks, to outline a vision for the next five to ten years. Before and after the remarks we had the opportunity to showcase our drones to ag innovators from all over the state and were overwhelmed by the enthusiasm shown by growers and agronomists. We look forward to continued collaborations with Colorado State University, the Rocky Mountain Innosphere, and Northern Colorado Ag Innovators and thank everyone who stopped by our booth. Check out more coverage at http://www.today.colostate.edu/story.aspx?id=9568
The guys doing the cool stuff at Conservation Drones assembled and tested almost a dozen drones in 48 hours! The aircraft are bound for the four corners of the earth to help shine a bright light on poaching. Great work!
We are super impressed with the work being done by Kevin Price and his group at Kansas State University. These guys are doing the hard work to figure out all the components required to do aerial remote sensing in agriculture. In particular, they are very cognizant of keeping costs down and thus are using reasonably priced airframes and consumer-grade cameras. Much of what we've done at Agribotix has been inspired by their innovation. We were pleased to see a concise article describing the K-State work. Enjoy.
Our friend Jasen, from our time working in Shenzhen, got a nice plug when some of his staff made a quadrotor using parts from Makeblock components (video here). Jasen was just getting Makeblock off the ground when we were sharing space at HAXLR8R. It's a fun little video, although his flight safety officer was asleep at the switch!
Doug Johnson, Maggie Flanagan, and Rob Writz of the Rocky Mountain Innosphere paid a visit to our shop last week. The RM Innosphere is in the business of accelerating entrepreneurship and creating jobs. Their interest in Agribotix is due to their new program in agriculture technology and precision agriculture. With their tight connections to Colorado State University, this interest in agriculture carries some real weight. We're super excited about collaborating with this vibrant group and we're already beginning to see results from it (more on that later...).
We will be making a presentation on Friday, December 19 to the Energy Fellows Institute in Golden, Colorado. The topic will be using drones for agriculture. The EFI, an arm of the Colorado Cleantech Industries Association, is a cohort of entrepreneurs developing businesses in the clean tech and energy space.
Most photo mosaic generation software requires that the photos be geotagged as a prerequisite. That is, the GPS latitude/longitude coordinates must be incorporated into the EXIF metadata. Many cameras have a built-in GPS but it is generally accepted that this isn’t adequate because (a) the GPS patch antenna is facing sideways instead of up and (b) the camera is moving at about 10 m/s and the GPS units built into cameras are built for walking-around speeds.
A better option is to use the GPS data from the flight log of the APM flight controller. The idea is to read the time stamp from the EXIF data of the photo and match it to the corresponding time stamp from the flight log. Then take the GPS lat/lon from that time and insert it into the EXIF metadata. There may be some interpolation involved, but you get the idea. There is a geotagging utility built into the Mission Planner ground control station software but I found the utility to be pretty buggy. And since there are no error messages, when it wouldn't work I was dead in the water without any clue on how to fix it.
Fortunately, I found a reliable alternative. The exiftool software can do exactly what we need (I reported on exiftool previously as a way to recover the time stamp after the photos went through an NDVI conversion). All that exiftool needs is a file with time-stamped GPS data. It supports a bunch of file types, but KML files are the easiest to generate from the flight log.
One issue is that the flight controller clock and the camera clock are rarely exactly synchronized. ExifTool has an option called geosync to take care of this (see example below for syntax). The easiest way to find the offset is to take a photo of the Mission Planner clock with the camera.
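If you'd rather not do the clock arithmetic in your head, the offset can be computed in the Terminal. The two timestamps below are made-up examples; substitute the camera's EXIF time and the Mission Planner clock reading from your photo:

```shell
# Compute the geosync offset: flight-controller time minus camera time.
# Both timestamps here are hypothetical examples.
camera="2013-11-12 10:15:32"      # the camera's DateTimeOriginal
autopilot="2013-11-12 10:15:37"   # the Mission Planner clock in the same photo

# GNU date syntax; on a Mac, install coreutils and use gdate instead
cam_s=$(date -d "$camera" +%s)
apm_s=$(date -d "$autopilot" +%s)
echo "geosync offset: $((apm_s - cam_s)) seconds"   # prints: geosync offset: 5 seconds
```

A positive result means the camera clock is behind the flight controller.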
Here are the few simple steps:
1. Using the Mission Planner, generate a "KML" file from the flight log (“Telemetry Logs” tab >> Tlog>KML button). Note that Mission Planner is being almost constantly updated. This was the sequence in late 2013 but it may shift.
2. Mission Planner actually generates a KMZ file, which is a zipped KML file. To make the shift back to KML, change the .kmz extension to .zip. Unzip it using a regular ZIP utility and the .kml file will come spilling out.
3. Run exiftool with this syntax (here’s a link to the full description of the geotagging capabilities):
exiftool -geotag flight_log.kml -geosync=+5 photo_directory
In this example, the camera clock is 5 seconds behind the flight controller. Use a negative number if it goes the other way. Note that this sign convention is the opposite of what is used by the geotagging feature in the Mission Planner.
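For reference, step 2's rename-and-unzip shuffle can be done entirely from the command line. The snippet below first fabricates a stand-in KMZ so it's self-contained (a real one comes from Mission Planner's Tlog>KML button) and uses Python's standard-library zipfile module as a portable stand-in for a regular ZIP utility:

```shell
# Fabricate a stand-in .kmz (really just a zipped .kml) -- a real one
# would come from Mission Planner's Tlog>KML button.
printf '<kml xmlns="http://www.opengis.net/kml/2.2"></kml>' > doc.kml
python3 -m zipfile -c flight_log.kmz doc.kml
rm doc.kml

# The shuffle from step 2: rename .kmz to .zip, then unzip it.
mv flight_log.kmz flight_log.zip
python3 -m zipfile -e flight_log.zip .
ls doc.kml    # the .kml has come spilling out
```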
Once I figured out all the upfront steps, this process worked flawlessly. Note that if the bugs in the Mission Planner utility are resolved, that route would be simpler. But for now, exiftool provides a reliable fallback option.
This is another in the series of wonk-blogs; if you aren’t interested in the messy details of processing aerial images you might want to move along. But if you are, perhaps this post could save you some time.
Ned Horning has built a fantastic ImageJ (and Fiji) plugin to create NDVI images from a camera with the near IR filter removed (see a previous blog post for details on the camera conversion). The only downside is that ImageJ strips the time stamp from the EXIF metadata of the JPEG file – and we’re going to need that later when we go to geotag the image (I’ll have a future post on this process). Fortunately, there is another bit of open-source software to the rescue: ExifTool developed by Phil Harvey. As the name implies, this tool is the Swiss Army Knife for EXIF metadata. But right now we are only interested in one function: moving the time stamp from one JPEG to another. That is, we’re going to take the time stamp from the original file that came off the camera and shift it to the NDVI image. The syntax is:
exiftool -tagsFromFile originalfile.jpg NDVI_file.jpg
(thanks to archaeometallurgy.de for turning me on to this trick).
Since there are hundreds of photos, doing this by hand would get old pretty quick. On the Mac, exiftool operates from the command line of the Terminal. Normally I hate command line stuff, but the Terminal does allow for bash scripts to be executed. Here’s the drill.
1. In any text editor (Nano from the Terminal is the simplest), create this script (I called mine "exiftagger"):
#!/bin/bash
# This script copies the EXIF header from one file to another
for ((c=431; c<=600; c++))
do
    exiftool -tagsFromFile IMG_0"$c".jpg IMG_0"$c"_NDVI_Color.jpg
done
In this case my camera IMG files started with 0431 and ended in 0600 – clearly you would just replace the number in the “for” command with the appropriate numbers from your shooting session. “_NDVI_Color” is the bit that Ned Horning’s plugin adds to the file name.
Next, you need to make this script file executable. Simply type:
chmod +x exiftagger
Finally, run it from the command line with this command:
./exiftagger
This was the first bash script I’ve ever written, but it worked flawlessly – hopefully you can do the same. Sorry I can’t provide any direct recipes for Windows but the scripting must be similar. And exiftool has versions for Windows, OSX, and Linux.
I read an article in the New York Times this morning about the Google Books Ngram viewer. Those clever geeks at Google have created a data mining tool for the Google Books database that can tell us the normalized frequency of a word's usage over the last couple of centuries -- cool! Naturally the high brows at the NYT are using it for pointy head stuff, but for me it's drones!! The figure below shows that we really aren't as drone obsessed as we may have thought -- although it's unfortunate that the Google database stops at 2008. The peak in the 40s must have been the fascination with target drones during the war. And before that I guess the authors were more interested in honey bees.
In many ways the GoPro is an ideal camera for drones: small, light, rugged, decent resolution, and not too expensive. The showstopper is the fisheye lens that makes image stitching impossible. Now there could be a fix on the way. Backbone, a Canadian company, is making an aftermarket hack to allow C-mount lenses to be used with the GoPro. And as an extra bonus for those of us interested in NDVI, the IR filter is optionally removable. Their product, called the Ribcage, will be a bit pricey at $200 (in addition to the cost of the GoPro) but it might turn out to be a cheaper yet comparable alternative to something more expensive like the Sony NEX series cameras. The Ribcage isn't available now, but it should ship soon.
There is a petition on We the People to request that the White House simplify the rules for small unmanned aerial systems. Since the FAA itself doesn't appear to be moving anywhere quickly, perhaps this will help. Give it your vote!
Justin Dougherty of News 9 in Oklahoma City had a story that aired on November 25 where he claims that the FAA is allowing the use of UAS on farms under the Academy of Model Aeronautics rules. News 9 claims they received an "exclusive statement" indicating:
"Farmers may operate an unmanned aircraft over their own property for personal use and guidelines for the operation of model aircraft, such as those published by the Academy of Model Aeronautics, may be used by farmers as reference for safe model UAS operations."
While we would dearly love for this to be true, the lack of attribution to an actual person at the FAA makes us skeptical. It just doesn't make sense that the agency would dribble out such important information to a small TV news outlet without putting anything on their web site. As we reported in a previous blog post, Jim Williams wrote this text to us on October 27,
However, the FAA Councils Office has not yet rendered a legal opinion on this proposal so we have not published an official policy. Please monitor the FAA UAS web page for an official determination.
There certainly doesn't seem to be anything on the FAA UAS web page corroborating Mr. Dougherty's statement.
As we reported in a previous blog post, the FAA has been indicating that they are at least interested in loosening up the UAS rules for farmers in advance of the much-anticipated September 30, 2015 rule-issuing date. Specifically, they are considering allowing the Academy of Model Aeronautics safety code to apply to farmers. This change would be exactly what the industry needs: the chance to gain many thousands of hours of flight data using small aircraft flying low in Class G airspace with no privacy issues to deal with.
Just when we were feeling optimistic, along comes the report “Integration of Civil UAS in the National Airspace System (NAS) Roadmap” released on November 7, 2013. The Roadmap does contain a bit of language indicating that small UAS (sUAS are defined as less than 55 pounds) may have somewhat less strict requirements than larger UAS, e.g.:
“Except for some special cases, such as sUAS with very limited operational range, all UAS will require design and airworthiness certification to fly civil operations in the NAS.”
But in general, the Roadmap reads an awful lot like the 30 July 2013 document released by the FAA showing regulations for getting a Certificate of Authorization (N 8900.227). That is, tight certification requirements on the pilot in command and visual observer as well as a mountain of paperwork to file. In my opinion, these rules are totally appropriate for a DOD contractor flying a large, high-speed, high-altitude UAS, but they are absurd for a farmer flying a 4-pound foamie at 300 ft AGL on his own farm. If the FAA decides to apply the current full COA regulations to farm UAVs, the industry will be killed off before it is born.
In fairness to the FAA, the Roadmap is just that, a roadmap and not regulations. But since the FAA has been so opaque, we on the outside are forced to read tea leaves. And my read of the leaves in the Roadmap makes me worried. My wish is that the FAA would provide some clarity so entrepreneurs can make sensible business decisions. Is that too much to ask for?
We are deep into wonk-land with this post, but it would be useful to anyone wanting to use the Canon Hacker's Development Kit (CHDK) on their SX260. In short, CHDK allows control of almost any camera function and allows scripts to be run. As an open source project, there is a ton of information out there but it is not well organized. These tips should help anyone putting CHDK on an SX260 using a Mac.
1. First, find what version of the firmware is loaded. Using Text Edit, create a file named ver.req in the root of the SD card in Unicode UTF-16 format. Do not use a .txt extension.
2. Start camera in PLAY mode. Press and hold FUNC/SET and then rapidly press DISP. The firmware version and some other data will display for a few seconds. It would be useful if you had a helper to take a picture of the screen. Otherwise, read fast and memorize.
3. Download the correct version of the CHDK .zip file from this page (scroll about half way down to find the SX260). My camera had the GM1.01a firmware so I downloaded 101a. I used the Stable 1.2 build. Ver 1.3 is for the adventurous types.
4. Unzip the file and load the individual files to the root of the SD card. The root directory should have these files: changelog.txt, DISKBOOT.BIN, PS.FI2, readme.txt, vers.req
5. I don't know if this next step is required, but it probably can't hurt. According to the CHDK docs, "the Mac adds extra hidden stuff to files that are downloaded from the internet (as a security feature)." To fix this, open the Terminal (found in your Applications > Utilities folder) and enter this command:
xattr -d com.apple.quarantine DISKBOOT.BIN PS.FI2
6. Put the card back in the camera, start it up in PLAY mode (blue right-facing arrow), press MENU. Scroll to the bottom and a new entry called Firmware Update should be present. Select it and say OK if the version numbers look OK.
7. Go into Camera mode. The button to switch into and out of CHDK mode is called the <ALT> button. By default it is PLAY, but this can be switched in the menu. I changed it to DISP.
8. In CHDK Camera mode, PLAY displays some shortcuts, MENU brings up a bunch of CHDK parameters, and FUNC SET brings up the menu for the script that was selected in the Menu mode.
9. With this method, Step 6 must be repeated every time you turn on the camera. There is a way to make the camera go into CHDK mode upon startup, but I haven't figured it out yet.
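As an aside, the ver.req file from step 1 can also be created from the Terminal, which sidesteps the Text Edit encoding fuss; as far as I can tell an empty file is sufficient, since the camera only checks that it exists. The SD card path below is just a placeholder; substitute whatever your card mounts as:

```shell
# Point SDCARD at wherever the card mounts; this path is a placeholder.
SDCARD="${SDCARD:-/tmp/sdcard_demo}"
mkdir -p "$SDCARD"
touch "$SDCARD/ver.req"   # an empty file appears to be enough
ls "$SDCARD/ver.req"
```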
1. Obviously, the reason we are doing this is for scripting. CHDK comes with an intervalometer script. The intervalometer is capable of shooting at 1.8 frames/sec (much better than my S100 running CHDK). Or we can have a script to trigger the camera by the flight controller (stay tuned for details).
2. CHDK allows the RGB histogram to display on the screen. Very useful for seeing in real time how the red (i.e. NIR) and blue channels change as the camera views healthy and sickly leaves.
3. CHDK allows RAW, something not available in the standard SX260. For me the jury is still out on the value of RAW vs the extra hassle.
We'll be presenting at the High Plains No-Till Conference put on by the Colorado Conservation Tillage Association in Burlington, Colorado on February 4 and 5. The title will be "UAVs in Agriculture: The Good, The Bad, and The Ugly." The Good, of course, is all the good that can come out of low-cost, on-demand remote sensing of crops: input cost reduction, reduced groundwater contamination, and increased profits. The Bad is the uncertainty around the economics of this new industry which is closely related to The Ugly. The Ugly is the specter of DOD-oriented UAV regulations and the unwillingness of the FAA to give any guidance whatsoever on the form of the new rules -- or even any assurance that they will meet their Congressionally set deadline of September 2015. More on that later...
If you are in eastern Colorado in February, drop by and let's chat!
I just started working with my newly converted camera to try to understand making it sing in NDVI. Since it's November in Colorado, finding healthy green vegetation in sunlight is a bit tough. But I have been working with some photos of dead leaves on greenish grass.
The top image is unmodified. The colors are washed out because of the blue filter on the lens and because I reset the white balance against a solid blue panel. I played around with the images using the Infragram Sandbox, a great tool that allows the images to be manipulated on the fly. From Photoshop I could see that the blue channel had much less intensity than the red channel (red becomes NIR after the mods made to the camera), so I modified the regular NDVI formula (NIR-R)/(NIR+R) and instead used (NIR-2*B)/(NIR+B). Note that in the DIY NDVI we use the blue channel as the baseline instead of red. This works because the reflectance of chlorophyll is about the same in the blue as in the red.
The black bits are the dead leaves and the grass shows up as various gray scales. This is far from any definitive proof that it will work with crops but it's a step in the right direction.
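As a toy arithmetic check of the modified index, here are two hypothetical pixels run through (NIR-2*B)/(NIR+B); the channel values are invented, but the signs come out the way the image suggests:

```shell
# Hypothetical healthy leaf: strong NIR (the red channel after the
# camera conversion), weak blue.
awk 'BEGIN { nir=200; b=50; printf "healthy: %.2f\n", (nir-2*b)/(nir+b) }'   # healthy: 0.40

# Hypothetical dead leaf: NIR and blue reflectance roughly comparable.
awk 'BEGIN { nir=90; b=80; printf "dead: %.2f\n", (nir-2*b)/(nir+b) }'       # dead: -0.41
```

Healthy vegetation lands positive and dead material negative, which is consistent with the dead leaves showing up black after the index is scaled to gray levels.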
We'll lighten up the blog today with a fun YouTube video. Despite its parody nature there are some cool farming shots in it.