Sunday, May 29, 2016

Biological Agriculture for Roboticists, Part 6

In a previous installment, I said that identifying and clipping weeds based on what's left standing after a patch of ground has been grazed won't control low-growing plants, using goatheads as an example.

To begin with, what one type of herbivore (cattle) finds distasteful, another (goats) may find delectable, so not everything left standing by a single species is useless. It's a good idea to run cattle, which strongly prefer grass, together with, or immediately followed by, another, less picky herbivore, like goats.

Secondly, being unpalatable doesn't automatically make a plant a weed. Weeds are plants that move aggressively into disturbed ground, smother or chemically inhibit other plant life, and/or put most of their energy into producing above-ground growth and seed rather than roots. They are typically annuals or biennials (producing seed in their second year). If a plant does none of these things and isn't toxic to livestock or wildlife, it's probably not accurate to describe it as a weed. Even so, if livestock won't eat it, it's not a candidate for protection as a rare, threatened, or endangered species, and it's not vital to some rare and endangered animal, you probably don't want it taking up ground that could be producing something more useful in your pasture. So what's left standing after grazing isn't such a bad indication, but, as already mentioned, this test won't catch low-growing plants.

So, how to deal with those low-growing plants? Good question, and a good subject for further research. First you have to be able to detect their presence and distinguish them from the grass stubble left behind by grazing. Then there's the matter of locating the main stem and the point where it connects to the root system. If a plant is lying on the ground, supported by it and not swaying in the breeze, the approach I referenced earlier, modeling its branching structure from video of its motion, won't work. One way to accomplish this might be to use a vacuum that pulls in a large enough volume of air to pick up the vining tendrils and suck them in; if you have a serious infestation of this sort of weed, using such equipment might be a reasonable choice. Another might be a pincer-like manipulator with cylindrical, counter-rotating rotary rasps for fingers, which would pinch the vine at any point, determine which direction to rotate by trial and error, then use the resulting tension to guide the manipulator to the main stem so it can be uprooted.

Such a manipulator might be generally better at uprooting than a simple grasping manipulator, since the rotation of the fingers would replace retracting the robotic arm, potentially making the overall operation more efficient. A variation on the theme which might prove more generally useful would have low points on each finger, matched by shallow indentations on the other, at the end furthest from the motors driving finger rotation, progressing to protruding hooks matched by deep indentations at the end nearest the motors. This would allow the same attachment to be used both for ordinary uprooting and for gathering up something like goatheads, simply by adjusting where along the length of the rotating fingers it grasps the plant.
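To make that trial-and-error idea a bit more concrete, here's a minimal control sketch in Python. The gripper interface (rotate_fingers, read_tension, step_toward_anchor) is entirely hypothetical, standing in for whatever the real hardware would provide, and the simulated gripper exists only so the sketch runs end to end.

```python
# Minimal sketch of the trial-and-error winding logic described above.
# SimulatedGripper is a crude stand-in for real hardware; the method names
# (rotate_fingers, read_tension, step_toward_anchor) are hypothetical.

class SimulatedGripper:
    """Tension rises only when the fingers rotate the 'right' way for this vine."""
    def __init__(self, winding_direction=+1):
        self._winding_direction = winding_direction
        self._tension = 0.0

    def rotate_fingers(self, direction, turns=1):
        if direction == self._winding_direction:
            self._tension += 0.8 * turns                    # vine winds on, pulling taut
        else:
            self._tension = max(0.0, self._tension - 0.2 * turns)

    def read_tension(self):
        return self._tension                                # newtons, notionally

    def step_toward_anchor(self, step_m=0.02):
        self._tension = max(0.0, self._tension - 0.5)       # slack gained by moving closer


def wind_and_trace(gripper, taut_n=2.0, max_steps=50):
    """Pick a winding direction by trial, then follow the taut vine to its base."""
    # Trial: rotate one way briefly; if tension doesn't rise, the other way must wind.
    before = gripper.read_tension()
    gripper.rotate_fingers(+1)
    direction = +1 if gripper.read_tension() > before else -1

    for _ in range(max_steps):
        gripper.rotate_fingers(direction)
        if gripper.read_tension() >= taut_n:
            gripper.step_toward_anchor()        # let the tension lead us to the main stem
            if gripper.read_tension() >= taut_n:
                break                           # no slack left: at (or near) the stem
    return direction


if __name__ == "__main__":
    print("winding direction found:", wind_and_trace(SimulatedGripper(winding_direction=-1)))
```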


I also promised to get back to the use of sound, in the context of fauna management and pest control. This by itself could easily be the subject of a lengthy book. Information about the environment can be gleaned from ambient sounds as well as from active sonar, and a robot might also emit sounds for the effects they can produce.

Sonar is already widely used in robotics as a way of detecting and determining the distance to obstacles. While thus far more sophisticated technologies, such as synthetic aperture sonar, have primarily been developed for underwater use, a large market for autonomous robots operating at modest ground speeds in uncontrolled environments might prove incentive enough to justify developing versions for use in air.

Meanwhile, there is a wealth of information available from simple microphones. From tiny arthropods to passing ungulates, many animals produce characteristic sounds, with familiar examples including crickets, frogs, and all types of birds and mammals. These sounds can help identify not only what species are present but where they are and what they are doing.
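As a crude illustration of what passive listening might involve, the sketch below flags audio frames whose energy is concentrated in a band loosely typical of cricket chirps. The band and threshold are illustrative guesses, and a real system would use trained classifiers on spectrogram features rather than a single band-energy ratio.

```python
import numpy as np

def band_energy_ratio(samples, sample_rate, f_lo=4000.0, f_hi=5000.0):
    """Fraction of spectral energy between f_lo and f_hi (a hypothetical cricket band)."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    in_band = spectrum[(freqs >= f_lo) & (freqs <= f_hi)].sum()
    return in_band / (spectrum.sum() + 1e-12)

if __name__ == "__main__":
    sr = 22050
    t = np.arange(sr) / sr
    # Synthetic one-second test clip: a 4.5 kHz tone pulsed at 5 Hz over background noise.
    chirp = np.sin(2 * np.pi * 4500 * t) * (np.sin(2 * np.pi * 5 * t) > 0)
    clip = chirp + 0.1 * np.random.randn(sr)
    ratio = band_energy_ratio(clip, sr)
    print(f"in-band energy ratio: {ratio:.2f}",
          "-> cricket-like activity" if ratio > 0.5 else "-> nothing detected")
```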

Sound can also be used to affect the behavior of animals, for example discouraging deer from spending too much time browsing on your vegetable garden or keeping chickens from venturing too far afield. Through sound, a robot might signal the presence of a predator, or food, or a potential mate.

But it's not just animals; even plants produce sounds. A tree that has sustained wind damage, introducing cracks into its trunk, will sound different from one which has not. A plant with wilted leaves sounds different from one that is fully turgid, and one from which the leaves have fallen sounds different yet.

So far as I'm aware, all such potential uses of sound remain largely unexplored areas of research, so it's hard to know everything a machine might be able to learn about its biological environment just by listening and processing the resulting data, or in what ways it might use sound to exert some control over that environment.


I've concentrated on tying up loose ends here because I'm eager to get on to the series on Robotics for Gardeners and Farmers. That's not to say that this will be the last installment in this series; after all, I've yet to address planting, pruning, pest control, harvest, or dealing with the plant matter left behind after harvest, not to mention animal husbandry. Whether I eventually get to all of these remains to be seen. Touching on every such topic probably isn't as important as conveying the nature of the opportunities presented by the application of robotics to methods founded in horticulture rather than in conventional agriculture, with an eye to then making them scalable.

Previous installments:

Building Soil Health for Healthy Plants by soil scientist Dr. Elaine Ingham

You might think of this as a mini-course in soil science, with an emphasis on soil microbiology.

Saturday, May 28, 2016

Joel Salatin: Successional Success - Field of Farmers

No mention of robotics here, except as might be implied by portable infrastructure, but this speech is a real eye-opener, well worth the time investment in watching and listening.

Pushing Back Deserts through Aerial Seeding


Source / License. Photo unmodified from original.

Start with a seed ball, containing seeds of one or more drought tolerant, deep-rooted perennial plants.

Next, assemble some feathers or vanes, rather like those found on a badminton shuttlecock, but with an adaxial (inner) surface that is both a good radiator of thermal energy and hydrophobic, or that has a branching network of hydrophobic veins converging at the stem end.

Attach the feathers/vanes to the seed ball to form a seed bomb, and experiment iteratively to refine the design. The combination of mass and terminal velocity in free fall must be such that the seed bomb will penetrate a dry clay soil surface sufficiently to anchor itself against wind. The feathers or vanes should open up like a flower upon impact and remain in that configuration thereafter. This may require spring-loaded anchors that are triggered by the impact, to keep winds from tearing the seed bomb loose from the soil by its feathers/vanes.
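As a rough illustration of that mass and terminal-velocity tradeoff, here's a back-of-the-envelope sketch. The seed ball diameter, drag coefficient, and the impact energy needed to seat a seed bomb in dry clay are all placeholder guesses; the iterative experimentation described above would be the real source of those numbers.

```python
import math

AIR_DENSITY = 1.2   # kg/m^3, near sea level
GRAVITY = 9.81      # m/s^2

def terminal_velocity(mass_kg, frontal_area_m2, drag_coefficient):
    """Terminal velocity where drag balances weight: v = sqrt(2 m g / (rho A Cd))."""
    return math.sqrt(2 * mass_kg * GRAVITY /
                     (AIR_DENSITY * frontal_area_m2 * drag_coefficient))

def impact_energy(mass_kg, velocity_ms):
    """Kinetic energy at impact, in joules."""
    return 0.5 * mass_kg * velocity_ms ** 2

if __name__ == "__main__":
    diameter_m = 0.04       # assumed 4 cm seed ball
    frontal_area = math.pi * (diameter_m / 2) ** 2
    drag_cd = 0.8           # guess: vanes add drag relative to a bare sphere
    needed_j = 5.0          # guess: energy needed to seat the bomb in dry clay
    for mass_g in (20, 40, 80, 160):
        m = mass_g / 1000.0
        v = terminal_velocity(m, frontal_area, drag_cd)
        e = impact_energy(m, v)
        verdict = "penetrates" if e >= needed_j else "too light"
        print(f"{mass_g:4d} g: v_t = {v:5.1f} m/s, impact = {e:6.1f} J -> {verdict}")
```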

Equip an aircraft with sensors that enable automatic determination of whether there are any people, domestic animals, or wildlife below, and use this information to interrupt the release of seed bombs so as to avoid harming them. Drop the seed bombs near the desert's edge, where there is occasional rainfall, but not enough to support grazing, much less agriculture. Where there is enough rainfall to support grazing, a different type of seed bomb should be used.

Even without precipitation, so long as there is some humidity in the air, condensation (dew) will collect on the inner, now upward-facing radiative surfaces of the feathers/vanes, from where it will run down towards the seed ball due to the hydrophobic character of those surfaces.

In this manner, it should be possible to establish greenery at the edge of a desert, with the effect of locally altering the climate, perhaps enough so that a few years later another swath, closer to the center of the desert, can be seeded.

Friday, May 27, 2016

Why Agriculture Can Never Be Sustainable, and a Permacultural Solution, presented by Toby Hemenway

This video was recorded two years ago, but only recently posted to YouTube. I think it's amazingly good!

Tuesday, May 24, 2016

Biological Agriculture for Roboticists, Part 5

To be most useful, agricultural robots need not only to be able to distinguish plants from a background of soil and decaying plant matter, but to distinguish them from each other and to quickly model their branching structures, at least approximately, if only so they can locate the main stem and the point at which it emerges from the soil. They also need to be able to recognize as something new any plant that doesn't belong to a type they've already learned to identify.

This is a tall order, and I'll get into some specifics on how it might be accomplished a bit further on, but first: why would robots need to be able to recognize plants as something they haven't seen before? Isn't it enough to be able to tell whether they've been planted intentionally, crop or not?

In Part 4 of this series, I provisionally claimed that, in a recently tilled field which has not yet been planted to the next crop, any green growing thing can be presumed to be a weed. While that's usually the case, there are exceptions.

Even in a monoculture scenario with routine tillage, where you don't really expect to find anything in the field other than the crop the farmer has planted and weeds, seed may be brought in from elsewhere: blown on the wind, in bird droppings, or in the scat of, or clinging to the fur of, some wide-ranging mammal. Generally these, too, might be considered weeds, but occasionally they will be rare and endangered species themselves, or vital to the survival of rare and endangered animals (milkweed for monarch butterflies), and should therefore be allowed to grow and mature, even at the expense of a small percentage of crop production and some inconvenience. (Farmers should be compensated for allowing this to happen, and robotic equipment can help document that they have done so.)

In a poly/permaculture scenario, native plants that aren't poisonous to livestock or wildlife, and which don't compete aggressively with crops, are usually welcome, because they increase the diversity of the flora, supporting a more diverse fauna, which is more likely to include beneficial species, all of which implies a more stable environment, less prone to overwhelming infestations of all sorts.

Plants look different under different lighting conditions — dawn, mid-morning, noon, mid-afternoon, dusk, and under clear sky versus clouds versus overcast conditions — and different in shade than when standing alone on otherwise clear ground. Beyond that, plants look very different as seedlings than they do after a few weeks of growth, different yet when they've gone to flower, and different once again when mature, and for perennials still more different in their winter or dry season dormancy. Unless you've seen them in all of these stages and conditions, even a human gardener might mistake one crop plant for another, or for a weed, and based upon that select an inappropriate action. Recognizing continuity between stages and across diverse conditions is even more challenging for a machine.

For all of these reasons, once the technology is up to making such differentiations quickly enough that it is no longer the limiting factor in machine performance, the default needs to be that, when confronted with something unfamiliar, the machine does nothing other than keep track of it and send a notification up the escalation chain. Now back to the question of how, which is about sensory modes and sensory processing. What information about an environment composed of crops, a smattering of native plants, and weeds, on a background of soil and decaying plant matter, can a machine usefully collect and process?
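A minimal sketch of that default policy might look something like the following; the confidence threshold and the act/track/notify hooks are hypothetical placeholders for whatever a real system would provide.

```python
# Sketch of the "when in doubt, do nothing but watch and report" default.

from dataclasses import dataclass

CONFIDENCE_FLOOR = 0.85    # below this, the identification is treated as unknown

@dataclass
class Detection:
    location: tuple          # (x, y) in field coordinates, metres
    label: str               # best-guess species/type
    confidence: float        # classifier confidence, 0..1

def handle_detection(det, act, track, notify):
    """Act only on confident identifications; otherwise track and escalate."""
    if det.confidence >= CONFIDENCE_FLOOR:
        act(det)                             # e.g. weed it, or leave a known crop alone
    else:
        track(det)                           # keep watching it on future passes
        notify(f"unfamiliar plant at {det.location}, best guess "
               f"{det.label!r} ({det.confidence:.0%}); taking no action")

if __name__ == "__main__":
    handle_detection(
        Detection(location=(12.4, 7.9), label="pigweed", confidence=0.41),
        act=lambda d: print("acting on", d.label),
        track=lambda d: print("tracking", d.location),
        notify=print,
    )
```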

Among the most obvious and most valuable is location. To a very close approximation, plants stay where they're planted, so if today you find a plant in the same location as you found one yesterday, there's a high probability that it's the same plant, just a day older. (It's true that some plants send up new shoots from their root systems, remote from the original stem, but that belongs to a discussion of modeling things that can't be directly sensed, or, in that example, would require something like ground-penetrating radar.) Generally speaking, for plants over a short interval, location is synonymous with identity. GPS by itself is inadequate to establish location with sufficient precision to be used in this manner, so it must be supplemented with other methods, such as fixed markers, odometry, and maps created on previous passes over the same ground. More precise local positioning systems could also prove very helpful.
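Here's a small sketch of what treating location as identity might look like in practice: match today's detections to yesterday's map by nearest neighbor within a tolerance. The 5 cm tolerance is an assumption about the combined positioning error, not a measured figure.

```python
import math

MATCH_RADIUS_M = 0.05   # assumed combined positioning error budget

def match_to_map(new_points, known_map):
    """Return (matches, unmatched). known_map maps plant_id -> (x, y)."""
    matches, unmatched = {}, []
    for p in new_points:
        best_id, best_d = None, MATCH_RADIUS_M
        for plant_id, q in known_map.items():
            d = math.dist(p, q)
            if d <= best_d:
                best_id, best_d = plant_id, d
        if best_id is None:
            unmatched.append(p)          # probably a newly emerged plant
        else:
            matches[best_id] = p         # same plant, one day older
    return matches, unmatched

if __name__ == "__main__":
    yesterday = {"bean-17": (3.20, 1.10), "bean-18": (3.50, 1.10)}
    today = [(3.21, 1.11), (3.80, 1.12)]
    print(match_to_map(today, yesterday))
```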

Another obvious collection of modalities centers on imagery based on sensing reflected electromagnetic energy, including everything from microwaves through infrared and visible light to ultraviolet, as snapshots and over time (video), using ambient or active illumination, or a combination of the two. (Introduction to RADAR Remote Sensing for Vegetation Mapping and Monitoring) Color video camera modules have become so inexpensive that using an array of them is a reasonable proposition, and modules containing two or more lens/sensor systems are becoming widely available. Cameras sensitive to ultraviolet, near-infrared (wavelengths just longer than visible light), and far-infrared (thermal radiation) are also becoming common and dropping in price. Even phased-array radar is being modularized and should be reasonable to include in mobile machines within a very few years.
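One simple, widely used example of what such imagery buys you is the normalized difference vegetation index (NDVI), which contrasts near-infrared and red reflectance to separate living vegetation from soil and dead plant matter. A sketch, with a rule-of-thumb threshold rather than a calibrated one:

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + 1e-9)

def vegetation_mask(nir, red, threshold=0.3):
    """True where the index suggests living vegetation (threshold is a rule of thumb)."""
    return ndvi(nir, red) > threshold

if __name__ == "__main__":
    # Tiny synthetic example: leafy pixels reflect strongly in near-infrared,
    # bare soil reflects about equally in both bands.
    nir = np.array([[0.60, 0.25],
                    [0.55, 0.20]])
    red = np.array([[0.10, 0.22],
                    [0.08, 0.18]])
    print(np.round(ndvi(nir, red), 2))    # leafy pixels score high, soil near zero
    print(vegetation_mask(nir, red))
```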

Other sensory modes that are either already in common use, or may soon be so, include sound, both passive (hearing) and active (sonar), pressure/strain (touch-bars, whiskers, and controlling manipulator force), simple gas measurement (H2O, CO2, CH4) and volatile organic compound detection (smell, useful in distinguishing seedlings). I'll get back to the use of sound in a future installment, in the context of fauna management and pest control.

The stickier problem is how to transform all the data produced by all available sensors into something useful. This can be somewhat simplified by the inclusion of preprocessing circuitry in sensor modules, so that, for example, a camera module serves processed imagery instead of a raw data stream, but that still leaves the challenge of sensor fusion, weaving all of the data from all of the various sensors together to create an integrated model of the machine's environment and its position within it, both reflecting physical reality and supporting decisions about what to do next, quickly enough to be useful. Again, research is ongoing.
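To give a flavor of the simplest end of sensor fusion, here's a sketch that accumulates evidence from independent sensors about a single question for one map cell ("is there a plant here?") as log-odds, so that agreeing sensors reinforce each other and no single noisy reading dominates. The prior and the per-sensor probabilities are invented; real fusion involves far more than this.

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def fuse(prior_p, sensor_ps):
    """Fuse per-sensor probabilities for one cell into a posterior probability,
    assuming each sensor's estimate is conditionally independent given the cell state."""
    log_odds = logit(prior_p) + sum(logit(p) - logit(prior_p) for p in sensor_ps)
    return 1.0 / (1.0 + math.exp(-log_odds))

if __name__ == "__main__":
    prior = 0.10                       # say plants cover roughly 10% of the ground
    # One cell as seen by three sensors: colour camera, near-infrared, thermal.
    readings = {"camera": 0.70, "near_ir": 0.65, "thermal": 0.30}
    print(f"fused probability of a plant: {fuse(prior, readings.values()):.2f}")
```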

Previous installments:

Sunday, May 22, 2016

Biological Agriculture for Roboticists, Part 4

So how can robotics contribute to agriculture, or, more generally, to land management? Let's start with a relatively simple example, where the robot need not concern itself with differentiating between crops and weeds, and the only required manipulations are of nonliving materials. I'm talking about erosion control. In the video linked below there is no mention of robots, but the speaker does describe one of the techniques he employs, close placement of stones such that they aren't vulnerable to being washed away in the next flood, as time consuming and tedious. This is almost certainly a task that could be automated.

One step up from this is weed elimination in a recently tilled field which has not yet been seeded for the next crop, sometimes referred to as 'fallow' although that word is also used to mean a field that is simply being left alone for a time and has not been tilled. In that recently tilled field, any green growing thing can be assumed to be a weed — well, not exactly, but for the present purpose yes; I'll get back to this in a future installment — so all the robot has to be able to do is differentiate between the color of tilled soil and the color of any green growing thing within a geofence. Spot application of herbicide directly onto the plant, scaled to the size of the plant (less on smaller plants), is a huge improvement over area-wide application of the same herbicide, because it results in much less being used, and there are already machines available that do this.
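That soil-versus-green test can be made concrete with something like the excess-green index (ExG = 2g - r - b on normalized channels), a common way to separate green vegetation from soil in ordinary color imagery. The threshold below is illustrative and would need tuning for local soil color and lighting.

```python
import numpy as np

def excess_green(rgb):
    """Per-pixel ExG from an HxWx3 array of 8-bit RGB values."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2 * g - r - b

def green_mask(rgb, threshold=0.10):
    """True wherever a pixel looks green enough to be a growing thing."""
    return excess_green(rgb) > threshold

if __name__ == "__main__":
    patch = np.array([[[150, 100, 80],     # tilled soil (brownish)
                       [60, 160, 60]]],    # green growing thing
                     dtype=np.uint8)
    print(np.round(excess_green(patch), 2))   # soil scores near zero, green well above
    print(green_mask(patch))                  # [[False  True]]
```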

Still using herbicides, further improvement is possible if the robot can model the plant in some detail, instead of only detecting a green blob, and determine where the various parts of the plant are in space and relative to each other, and how they are attached. Given such a model, more precise application of still smaller amounts of herbicide becomes possible, dropwise onto the point or points where cell division takes place (the meristem or meristems) or, in the case of an established plant, by injection into the main stem, just above where it emerges from the soil. Drop application can be tricky in a breeze, with the plant moving around, but if that model includes the structure of the plant and how it moves in response to air currents — something that is extractable from video (see below) — then those motions can be predicted and compensated for in the positioning and timing of the drop dispenser. Other methods, not involving herbicides, are also possible, and are made a great deal easier by this sort of modeling.
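To illustrate the prediction step, here's a sketch that fits a second-order linear predictor to the recent lateral positions of the meristem, as might be extracted from video, and rolls it forward over the drop's fall time to decide where to aim. A pure sinusoidal sway is captured exactly by such a model; real motion is messier, so treat this as the kind of prediction a fielded system might start from, with the sway amplitude, frequency, and fall time all assumed.

```python
import numpy as np

def fit_ar2(x):
    """Least-squares fit of x[t] = a * x[t-1] + b * x[t-2]."""
    A = np.column_stack([x[1:-1], x[:-2]])
    a, b = np.linalg.lstsq(A, x[2:], rcond=None)[0]
    return a, b

def predict_ahead(x, steps):
    """Predict the position `steps` frames beyond the end of the series x."""
    x = np.asarray(x, dtype=float)
    a, b = fit_ar2(x)
    prev2, prev1 = x[-2], x[-1]
    for _ in range(steps):
        prev2, prev1 = prev1, a * prev1 + b * prev2
    return prev1

if __name__ == "__main__":
    fps = 30.0
    t = np.arange(60) / fps                       # two seconds of tracked positions
    sway = 12.0 * np.sin(2 * np.pi * 1.5 * t)     # lateral sway in mm at 1.5 Hz (assumed)
    fall_time_s = 0.2                             # assumed time of flight for the drop
    steps = int(round(fall_time_s * fps))
    aim = predict_ahead(sway, steps)
    truth = 12.0 * np.sin(2 * np.pi * 1.5 * (t[-1] + fall_time_s))
    print(f"aim offset {aim:+.1f} mm (true future position {truth:+.1f} mm)")
```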

Pastures are a bit more complicated than recently tilled fields, since the contrast between weeds and everything else is more subtle, but help is available. Herbivores can be picky about what they eat and don't eat, and what they don't eat can produce seed or spread vegetatively, reducing the value of the pasture. In combination with holistic management of the animals themselves, keeping unpalatable plant species in check can help protect and improve pasture land. This is a task that can be performed by robots, at its simplest by clipping off at ground level anything left standing more than a few centimeters high immediately after a paddock has been grazed, after the animals have already decided for themselves what's good to eat and what isn't. This approach won't control low-growing plants, like goatheads (photo courtesy of Forest & Kim Starr), but it will control thistles and other erect plants. (Again, more in a later installment.)

Weed control becomes more challenging when the machine must be able to distinguish between crops and weeds, and research into accomplishing this is ongoing. Given that perceptual ability, the ways in which a robot might deal with a weed are numerous, and generally fall into three categories: control of seedlings, control of established plants that can easily be uprooted, and control of established plants that break off if you attempt to uproot them, leaving the root behind to produce a new shoot.

Seedlings are difficult to distinguish, but easy to kill. Most are easily uprooted, but moving a mechanism into position to do the uprooting takes time, especially if there are many weed seedlings to be dealt with in this manner. Using a high-pressure water jet to expose the root or sever the stem should take less time, since no movement reversal (pulling) is required. Using projectiles (ice, dry compressed compost, ...) might accomplish much the same thing from a slightly greater distance, given accurate targeting. With even more precise targeting, a laser might heat just the meristem enough to stop a seedling's development, from an even greater distance, requiring even smaller movements for retargeting and enabling faster operation. Laser heating might be combined with LIDAR sensing, as a precisely timed high-energy pulse. A downside of both projectile and laser methods is that they require a clear path from a remote position to the target, something that becomes more problematic as the growing season progresses and leaf canopies become more dense. Seedlings that emerge later in the season are less of a concern, though, both because they will develop more slowly in the shade created by established plants and because, by the time they have developed enough to represent significant competition for resources, annual crops will already be maturing or have already been harvested.

Plants with established root systems can sometimes also be uprooted, and any established plant can be clipped off mechanically near the soil surface, or cut off with a jet of high-pressure water. All of these techniques require positioning a mechanism at or near the base of the plant, and uprooting also requires support sufficient not only for the weight of that mechanism but also to offset the force required to accomplish uprooting, which can be considerable. Clipping or cutting a plant off near the soil line may not kill it, but it will set back its growth, and doing so repeatedly can eventually exhaust the resources it draws upon to regenerate above-ground growth, provided that the root system isn't being fed by foliage elsewhere. Machines relying upon such methods should be programmed to revisit those locations periodically, checking for regrowth. Plants with very tenacious root systems may require more aggressive treatment, which could mean herbicides but might also mean coring (cutting a deep cylindrical plug from the soil around the plant's stem) or steam injection, to cut the node from which that stem emerged off from the rest of the root system or locally kill the root system. All of this is a little like the botanical equivalent of Whac-A-Mole, which means tedium, something robots excel at coping with.
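The "keep coming back" part is easy to mechanize. Here's a small sketch of a revisit queue that schedules a regrowth check some days after each clipping; the ten-day interval is an arbitrary placeholder that would depend on species and season.

```python
import heapq
from datetime import date, timedelta

REVISIT_INTERVAL = timedelta(days=10)   # placeholder; would depend on species and season

class RevisitQueue:
    def __init__(self):
        self._heap = []                 # (due_date, location) pairs, earliest first

    def schedule(self, location, clipped_on):
        """Queue a regrowth check for a location that was just clipped (or rechecked)."""
        heapq.heappush(self._heap, (clipped_on + REVISIT_INTERVAL, location))

    def due(self, today):
        """Pop and return every location whose check is due on or before today."""
        out = []
        while self._heap and self._heap[0][0] <= today:
            out.append(heapq.heappop(self._heap)[1])
        return out

if __name__ == "__main__":
    q = RevisitQueue()
    q.schedule((12.4, 7.9), clipped_on=date(2016, 5, 20))
    q.schedule((15.0, 3.2), clipped_on=date(2016, 5, 28))
    print(q.due(date(2016, 5, 31)))     # only the first location is due by now
```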

Next I'll go into perceptual systems (sensing and sensory processing) and plant differentiation in greater depth.

Previous installments: