Mamut is a new AI-powered autonomous robot prototype that can map and navigate unstructured, natural environments. It can alter its route as plant and crop growth changes, and it can inspect and monitor high-value crops such as vineyards and orchards.
But its ultimate goal is the automated collection of precise, granular data that helps growers make better-informed decisions.
“Mamut goes far beyond what manual collection could feasibly manage,” said Chris Roberts, Head of Industrial Robotics, Cambridge Consultants. “Many crops require constant monitoring, an important but time-consuming and tedious job, but Mamut can gather data 24/7.”
This level of automation, says Roberts, can provide early identification of disease or pests, estimates of crop yield, and guidance on when and where to harvest for the best yield and quality. He added:
“We’re also focused on the practicalities of getting it into the field – literally. The objective is to create a platform that can autonomously map and navigate in an unstructured, natural environment – one that changes with plant growth. Depending on the specific build, it can be made low-cost and robust enough to make commercial sense in the agricultural environment, rather than being an expensive research tool.”
Cambridge Consultants is using Mamut to demonstrate autonomous navigation in a real agricultural environment. The robot has been under test since early 2018 on Mackleapple’s 500-acre orchard. Each field test lasts about eight hours, with Mamut covering around 15 miles while testing collision avoidance, the effectiveness of real-time mapping, route planning and route following, and different sensor combinations.
While operating in the field, Mamut collects detailed crop data and builds maps and routes in real time. From the sensors and data, maps of the fields are generated automatically. Stored data from Mamut lets farmers analyze what has been gathered on their crops and gives them actionable information to help predict and optimize yields. Roberts added:
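The article does not describe how Mamut actually builds its maps, but real-time field mapping of this kind is commonly based on an occupancy grid updated from range readings. The sketch below is purely illustrative – the function name, grid representation, and parameters are assumptions for explanation, not Cambridge Consultants' software:

```python
import math

def update_occupancy_grid(grid, pose, ranges, cell_size=0.5):
    """Mark grid cells as occupied from (bearing, distance) range readings
    taken at the robot's current pose.

    grid      -- dict mapping (col, row) -> hit count
    pose      -- (x, y, heading_radians) of the robot in world coordinates
    ranges    -- iterable of (bearing_radians, distance_m) sensor readings
    cell_size -- edge length of one grid cell in metres
    """
    x, y, heading = pose
    for bearing, dist in ranges:
        # Convert each sensor-relative reading into a world-frame point.
        wx = x + dist * math.cos(heading + bearing)
        wy = y + dist * math.sin(heading + bearing)
        # Quantise the point onto the grid and count the hit.
        cell = (int(wx // cell_size), int(wy // cell_size))
        grid[cell] = grid.get(cell, 0) + 1
    return grid

# Example: robot at the origin facing along +x, one obstacle 2 m ahead.
grid = update_occupancy_grid({}, (0.0, 0.0, 0.0), [(0.0, 2.0)])
```

Repeating this update as the robot moves accumulates evidence about where obstacles (trees, trellis posts) sit, which is what makes route planning and collision avoidance possible.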
“Within an orchard, the ultimate goal is to be able to capture an image of every apple every day, to spot disease, to estimate crop yield and to help growers choose the optimum time to harvest. Testing the robot in the field helps us to learn about autonomous navigation in unstructured environments, and this is one of the central challenges in all robotics projects.”
To navigate in the field, Mamut uses a stereo camera, an onboard AI system, LIDAR and a compass. The platform carries a range of inspection sensors, including six cameras for 360-degree image capture and a multispectral imaging camera that can detect different types of chlorophyll. With its adaptable design, farmers can fit the sensors relevant to the crop they are inspecting.
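The article lists a compass among the navigation sensors but does not say how its readings are combined with the others. A common pattern in mobile robotics is a complementary filter that blends a smooth but drifting odometry heading with an absolute but noisy compass heading. This is a minimal sketch under that assumption – the function name and weighting constant are hypothetical, not Mamut's actual implementation:

```python
import math

def fuse_heading(odom_heading, compass_heading, alpha=0.98):
    """Blend odometry and compass headings (both in radians).

    alpha near 1 trusts odometry for short-term smoothness, while the
    compass term slowly corrects accumulated long-term drift.
    """
    # Take the shortest angular difference so the blend behaves
    # correctly across the -pi/+pi wrap-around.
    diff = math.atan2(math.sin(compass_heading - odom_heading),
                      math.cos(compass_heading - odom_heading))
    return odom_heading + (1.0 - alpha) * diff
```

Called once per control cycle, a filter like this nudges the estimated heading toward the compass reading without passing the compass's high-frequency noise straight into the steering.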
The company wanted Mamut to learn on the job in a real-world environment, not just a lab.
“‘Learning’ can be described in several ways,” said Roberts. “The robot itself learns the environment in which it is moving and the best way to navigate between points on the map it creates. But AIs must learn and be taught. So, for example, the robot may navigate perfectly in a lab, but will display anomalous behavior in a field.”
“An example of this is that the natural environment is ‘soft’ and the robot initially tried to cut corners. This behavior wasn’t wrong per se, but we chose to ‘teach’ it not to do that. Another example: early tests saw the robot weave along a straight path, much like a new driver overcompensating on the steering wheel. Over time this was tuned out.”
“Initial interest has been from growers of high-value crops, but ultimately, we believe that Robotics-as-a-Service (RaaS) will be the preferred commercial model. With RaaS, a technology provider owns the hardware while growers purchase a service contract,” said Roberts.