Sunday, September 11, 2016

Daniel Schmoldt of USDA/NIFA presenting at NREC 20th anniversary seminar

Streamed live on Sep 8, 2016 - “Daniel Schmoldt completed his academic training in 1987 from the University of Wisconsin-Madison with degrees in mathematics, computer science, and forest science. The latter included completion of both Masters and Ph.D. programs. From 1987 until 2001, he held several research scientist positions with the U.S. Forest Service while conducting research in a variety of forestry areas: wildfire management, atmospheric deposition, artificial intelligence, decision support systems, ecosystem management, machine vision systems, and automation in forest products utilization. From 1997-2004, he served as Joint Editor-in-Chief for the Elsevier journal, Computers and Electronics in Agriculture, and remains on their editorial board. Since 2001, he has filled a newly created position as National Program Leader for Instrumentation and Sensors with the National Institute of Food and Agriculture, and helps to prioritize, develop, focus, and coordinate USDA research, education, and extension programs covering the development of sensors, instrumentation, and automation technologies related to precision agriculture/forestry, robotics, processing of agricultural and forest products, detection of contaminants in agricultural products, and monitoring and management of air, soil, and water quality. His current $100M+ portfolio of grant programs include specialty crops, agroclimatology, robotics, engineering, nanotechnology, and cyber-physical systems. Finally, he currently serves as the USDA representative to several Office of Science and Technology Policy working groups on engineering and technology.”

Saturday, September 03, 2016

Sensing and Sensory Response in Plants

Presented for a general audience, this video is a gentle introduction to the ability of plants to detect and respond to aspects of their environments.

"Pesticides include both insecticides and herbicides"

In an article titled “How GMOs Cut The Use Of Pesticides — And Perhaps Boosted It Again”, Dan Charles writes, “Pesticides include both insecticides and herbicides”.

Up to this point, I have used ‘pesticide’ and ‘herbicide’ as disjoint terms rather than treating pesticides as inclusive of herbicides, thinking of ‘pesticide’ as referring to any chemical agent applied to control animal pests. That usage may not be strictly correct.

In any case, it is at variance with one authoritative interpretation of these terms.

Wednesday, August 03, 2016

Assessing the Present Moment

I've long contended that, with the partial exception of Facebook, my online presences aren't about me. They're about something of interest to me, inevitably filtered through my perspective and constrained by the amount of time I have to give to each, but they're not actually about me. I'm personally not that interesting; I'm just fiendishly drawn to topics that are.

Nevertheless, life sometimes impinges.

In Robotics for Gardeners and Farmers, Part 6, I said "I think it is time to bring this series to a close and begin a new one which attempts to bring these two audiences together...", implying, without saying so directly, that this new series would follow almost immediately.

What did occur to me almost immediately after posting the above is that the effort to bring roboticists together with gardeners and farmers — particularly those engaged in organic/biological/ecological/regenerative approaches — has been the primary mission of this blog from its outset, ten years ago, so a series for this purpose would seem somewhat redundant. Also, the effort to produce the twelve posts outlined in Robotics for Gardening and Farming: A Guide to Two Recent Series exhausted me more than I appreciated at the time. I need a break.

Happily, the dramatic success of the pre-order campaign for FarmBot Genesis came along just in time to take up my slack. So far as I'm concerned, the ball is in their court for the moment.

I expect to return to posting here at a more sedate pace, and to give more of my time to other interests.

In the meantime, I will continue to be on the lookout for anything I can simply link to that contributes to building a bridge between robotics and its application to making the best practices of __*__ scalable. *(Filling in that blank is a bit tricky. There are quite a few overlapping communities of practice, and I don't wish to exclude any of them.)

If what you see here leaves you wanting more, I post more frequently to my topic, and more frequently yet to my Twitter account, both of which have a similar central focus.

Sunday, July 17, 2016

Robotics for Gardening and Farming: A Guide to Two Recent Series

Over the past weeks, I've written two series of posts, the first titled "Biological Agriculture for Roboticists" and the second "Robotics for Gardeners and Farmers", with the intention of helping to bridge the gap between these occupations and those engaged in them. What appears below is a table of contents for those series.

Biological Agriculture for Roboticists

Part 1

Part 2

Part 3

Part 4

Part 5

Part 6

Robotics for Gardeners and Farmers

Part 1

Part 2

Part 3

Part 4

Part 5

Part 6

Robotics for Gardeners and Farmers, Part 6

Imagine, for a moment, that you are a baby chipmunk, emerging from the burrow for the first time and having your first look around. The world is amazing, full of light and sound, most of which doesn't make much sense at first, although your highly evolved mammalian brain quickly learns to turn that barrage of data into a plausible model of what is happening around you. But, in that first instant, it's cacophony.

Now let's take this one step further. Imagine you're a newborn tree squirrel, nearly devoid of usable senses, that somehow fell from the nest but survived the fall. This is essentially the situation faced by any computing device without sensory hardware or some other source of information about its environment, except that the device doesn't experience distress; it just runs code.

Machines only have the senses provided through the inclusion of sensory hardware in their construction, and even that by itself is insufficient. Sensory hardware must be attached to computing hardware in a manner that allows it to pass along signals representing what it has sensed, and that computing hardware must be able to interpret those signals meaningfully, either automatically, as a result of its design, or under the control of software. Those meaningful interpretations must then be passed along to software which chooses among available actions and plans the execution of whatever action it has chosen, with the resulting action feeding back into the cycle as altered sensory input.
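
To make that cycle a little more concrete, here is a minimal sketch of a sense-interpret-decide-act loop in Python. The function names (read_sensors, interpret, choose_action, perform) are hypothetical placeholders rather than any particular library's API; they simply mark where each stage of the cycle would live.

```python
import time

def read_sensors():
    """Poll whatever sensory hardware is attached and return its raw signals.
    (Hypothetical placeholder -- the real call depends on your hardware.)"""
    return {}

def interpret(raw):
    """Turn raw electrical readings into meaningful quantities
    (distances, moisture levels, edge-detected flags, and so on)."""
    return raw

def choose_action(state):
    """Pick among the available actions based on the interpreted state."""
    return None

def perform(action):
    """Drive the actuators; the result alters what the sensors see next."""
    pass

while True:
    state = interpret(read_sensors())   # sense and interpret
    perform(choose_action(state))       # decide and act
    time.sleep(0.05)                    # repeat roughly 20 times per second
```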

This all sounds very complicated, but it needn't always be so. Say you have a triangular platform supported by three steerable, powered wheels near the corners, all of which always point in the same direction, meaning they are steered in unison, perhaps by a single motor and a chain drive. The platform sits on a table top, and its purpose is to roll around randomly without falling over the edge. All that is required to accomplish this is three edge detectors, basically simple feelers extending beyond the wheels. Each produces a simple signal (no clock needed) that tells the processor an edge has been detected, so it should go no further in that direction and should pick another one, a direction that moves the device away from the detected edge. If the device detects edges at two corners at more or less the same time, it knows to move toward the corner from which it is not receiving such a signal. If this is the only challenge the device is presented with, it will happily roll around, without falling off the table, until its batteries can no longer power the circuitry or turn the motors.
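
As a sketch of how little logic that table rover actually needs, the direction choice might look something like the snippet below. The corner names, the headings assigned to them, and the random wander are assumptions made for illustration; the idea is simply to steer away from whichever corners report an edge.

```python
import random

# Headings (degrees) pointing from the platform's center toward each corner.
CORNER_HEADINGS = {"A": 90, "B": 210, "C": 330}

def choose_heading(edges):
    """edges maps corner name -> True if that corner's feeler senses the table edge.
    Returns a heading that moves the platform away from any detected edge."""
    clear = [corner for corner, at_edge in edges.items() if not at_edge]
    if not clear:
        return None                                   # every feeler off the table: stop
    if len(clear) == len(edges):
        return random.uniform(0, 360)                 # no edge detected: wander randomly
    if len(clear) == 1:
        return CORNER_HEADINGS[clear[0]]              # two edges: head for the lone clear corner
    return CORNER_HEADINGS[random.choice(clear)]      # one edge: head for one of the clear corners

# Example: corner A's feeler has just left the table top.
print(choose_heading({"A": True, "B": False, "C": False}))
```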

While this example isn't particularly useful, except perhaps for keeping small children or pets entertained, there are even simpler devices which are, such as a hose-following lawn sprinkler. Even when you're setting out to design a full-blown robotic system, it's good practice to make a first pass using the simplest approach that will do a passable job of whatever you're out to accomplish.

That said, let's dive into the discussion of enabling machines to garner some information about their environments.

One of the most important categories of information for a machine that moves about is location, with respect to any boundaries it should not venture beyond and to any significant features within those boundaries.

For some purposes, knowing where it is to within a few yards might be enough. Say you wanted a lawn sprinkler that moved itself about more intelligently than one that just follows a hose, and you're only going to use it in the back yard, so you don't need to worry about it sprinkling visitors or your mail carrier. It's going to need a time source, so you can tell it when to start sprinkling, a map of the back yard, and some means of determining where it is within that map. One obvious way to determine location would be GPS. There are GPS receivers available for single-board computers and microcontrollers, typically as plug-in boards called shields, and, if your yard is fenced in and you don't mind some imprecision, GPS might be good enough.
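
As a rough sketch of how that might look in practice, the snippet below reads NMEA sentences from a serial-attached GPS receiver using the pyserial and pynmea2 libraries and checks each fix against a simple rectangular "map" of the back yard. The serial port name, the corner coordinates, and the idea of reducing the map to a bounding box are all assumptions made for illustration.

```python
import serial      # pyserial: read the receiver's serial stream
import pynmea2     # parse NMEA sentences such as $GPGGA

# Hypothetical back-yard bounding box (decimal degrees).
LAT_MIN, LAT_MAX = 38.88950, 38.88985
LON_MIN, LON_MAX = -77.03540, -77.03500

def in_back_yard(lat, lon):
    return LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:   # port name is an assumption
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if not line.startswith(("$GPGGA", "$GNGGA")):          # position-fix sentences only
            continue
        try:
            msg = pynmea2.parse(line)
        except pynmea2.ParseError:
            continue
        print("lat %.6f lon %.6f inside=%s" %
              (msg.latitude, msg.longitude, in_back_yard(msg.latitude, msg.longitude)))
```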

For other purposes, like edging the lawn along walks and around garden spaces, GPS alone doesn't come close to being precise enough, and you might wish to rely upon some other positioning technology, or upon a hybrid system, perhaps utilizing technology more usually applied indoors. One approach would be to use an array of ZigBee protocol nodes spread around the perimeter of your yard, triangulating position based on signal strengths from those nodes, although this too might not be precise enough for edging.
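
A back-of-the-envelope version of that signal-strength positioning might look like the sketch below: convert each node's received signal strength (RSSI) to an approximate distance with a log-distance path-loss model, then trilaterate from three nodes at known corners of the yard. The node coordinates, the reference RSSI at one meter, and the path-loss exponent are all assumptions that would need calibrating, and in practice the estimate will wander by a meter or more.

```python
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-45.0, path_loss_exp=2.7):
    """Log-distance path-loss model: rssi = rssi_at_1m - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for (x, y) given three anchor points and estimated distances to each."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("anchor nodes are collinear; cannot trilaterate")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical nodes at three corners of the yard (meters) and measured RSSI values.
nodes = [(0.0, 0.0), (12.0, 0.0), (0.0, 15.0)]
rssi = [-58.0, -67.0, -71.0]
distances = [rssi_to_distance(r) for r in rssi]
print(trilaterate(*nodes, *distances))
```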

For rectangular garden spaces and raised beds, the rail and gantry approach employed by FarmBot provides enough precision for most operations, and provides a good foundation for greater precision based on imagery and force control, topics beyond the scope of this installment.

Returning to our lawn sprinkler example, you might want to take soil moisture levels into account, but incorporating a soil moisture sensor into the sprinkler itself would make it considerably more complicated, and, in any case, healthy turf can be very difficult to penetrate. You might prefer instead to distribute several of these sensors around your yard and network them together using WiFi, Bluetooth, or ZigBee. These soil moisture sensing nodes could also be used to provide a local positioning system, as described above.
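
As a small sketch of how the sprinkler might use those readings, suppose the latest moisture values from the nodes have already been collected over the network; the node names, the percentages, and the dryness threshold below are all placeholders.

```python
# Hypothetical latest readings from the networked nodes,
# expressed as volumetric water content percentages.
readings = {"bed-north": 31.5, "lawn-east": 18.2, "lawn-west": 24.9}

DRY_THRESHOLD = 22.0   # assumed percentage below which a zone needs water

def zones_needing_water(latest):
    """Return the names of nodes whose soil is drier than the threshold."""
    return [name for name, pct in latest.items() if pct < DRY_THRESHOLD]

print(zones_needing_water(readings))   # -> ['lawn-east']
```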

But what if you have children who leave their toys strewn about? Those toys are going to get wet; there's no helping that at this level of sophistication, but we'd like to be able to detect their presence, to avoid running into them and, if possible, to avoid wrapping the water hose around them. Several fixed ultrasonic range finders, or a single one on a motorized mount that sweeps from side to side, can provide good information about such obstacles, if they can be made to operate while sealed to protect them from water. Whiskers connected to microswitches may be a more practical solution.
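
For the ultrasonic option, the conversion from echo time to distance is simple enough to sketch. The timing value fed to the functions below would come from however your particular range finder or GPIO library reports the round-trip time of the ping; the 0.5-meter stand-off distance is an assumption.

```python
SPEED_OF_SOUND_M_S = 343.0   # in dry air at roughly 20 degrees C
OBSTACLE_RANGE_M = 0.5       # assumed stand-off distance before steering away

def echo_to_distance_m(echo_seconds):
    """The ping travels out and back, so halve the round-trip time."""
    return echo_seconds * SPEED_OF_SOUND_M_S / 2.0

def obstacle_ahead(echo_seconds):
    return echo_to_distance_m(echo_seconds) < OBSTACLE_RANGE_M

# Example: a 2.5 millisecond round trip corresponds to roughly 0.43 m.
print(echo_to_distance_m(0.0025), obstacle_ahead(0.0025))
```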

There are many more types of sensors available, but all have one thing in common: they convert some bit of information about the physical world into an electrical signal that then becomes digital grist for the mill of some processor and the code running on it, providing a basis for choosing what, if anything, to do next.

Taken in order, the next installment would be about that processing, but I've already gone into some detail about processing hardware and software, and have mentioned ROS in passing, so it would make more sense for me to skip on to the subject of actuators. However, because the topic of actuators and end effectors to perform detailed manipulations of living plants and their environments is nearly as unexplored for roboticists as it is for gardeners and farmers, I think it is time to bring this series to a close and begin a new one which attempts to bring these two audiences together, probably including explanations for new terms in brief glossaries at the bottom of the installments in which they are introduced, linking to these and to supplementary material from the text.

Previous installments

Thursday, July 14, 2016

TED talk by Emma Marris

I first learned about Emma Marris from another video, posted in conjunction with the publication of her book Rambunctious Garden...

...which I have previously linked to here.

The reason I believe her vision and my own are complementary is that devices using cultivation techniques sufficiently meticulous and noninvasive to enable mechanization of intensive polycultures could also allow some selective wildness (something other than aggressive and/or noxious weeds) back onto land used for production, intermixed with crops grown for harvest.