How can sensors contribute towards more efficient farming?

4 mins read

There are currently 7.6 billion people on the planet. By 2050 this figure is estimated to rise by a further two billion.

Resources are already stretched, and farming processes are struggling to cope not only with increased demand but also with the pressures of climate change, which is disrupting production methods. Farmers are now having to find innovative ways to cope while addressing environmental concerns.

Much of this innovation is being driven by the Internet of Things (IoT), which has given rise to ‘smart farming’ or ‘agri-tech’ and the introduction of intelligent robots to help address these challenges.

The Small Robot Company, an agri-tech start-up, is among those looking to inspire traditional farmers to adopt new practices. According to the company, 95% of energy is used to plough – a technique that is only needed because heavy machinery compacts the soil. The Small Robot Company is building intelligent machines that it says will seed and look after each plant in a crop, feeding and spraying accurately and only when necessary. This precision farming will limit wastage by providing individual plants with specific care and support. In theory, this should reduce chemical usage while increasing yields and farmers’ revenues.

At the heart of precision farming are sensors – components that companies such as Xsens Technologies, an innovator in 3D motion tracking technology, supply for smart farming vehicles and other autonomous applications such as drones.

The company recently launched the MTi-7 – a small, low-power sensor module with an external Global Navigation Satellite System (GNSS) receiver. The GNSS receiver helps provide an accurate, real-time stream of position, velocity and orientation data, but as Arnout Koelewijn, Key Account Manager EMEA – Inertial Sensor Modules at Xsens, explains: “It won’t tell you where that object is looking or the angle.” That orientation information is essential for autonomous farming vehicles because of the environments they encounter – the uneven, soft surface of a field.

“If you have a tractor on a tilted plane of land, it will be positioned at an angle,” Koelewijn says. “The GNSS receiver will be at the top of the vehicle, which could be positioned at, say, 5 degrees. The top of the GNSS receiver, where the antenna is located, is not in the same position as the wheels due to this angle. Therefore, additional sensors are required to compensate for the position error you get from the elevations and contours that an autonomous tractor would meet on a field.”
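Koelewijn’s example can be made concrete with a little trigonometry. The sketch below – using an illustrative 3m antenna height, not a figure from Xsens – shows how far a tilted, mast-mounted GNSS antenna ends up from the point above the wheels:

```python
import math

def antenna_ground_offset(antenna_height_m: float, tilt_deg: float) -> float:
    """Horizontal offset between a mast-mounted GNSS antenna and the
    point on the ground directly beneath the wheels, caused by tilt."""
    return antenna_height_m * math.sin(math.radians(tilt_deg))

# A 3m-high antenna on a vehicle tilted 5 degrees reports a position
# roughly 26cm away from where the wheels actually are.
print(f"{antenna_ground_offset(3.0, 5.0):.3f} m")  # 0.261 m
```

Without extra sensors to measure that tilt, the error simply appears in the reported position – which is why the MTi-7 pairs the receiver with inertial sensors.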

To overcome this challenge, the MTi-7 features a gyroscope, an accelerometer and a magnetometer – which, according to Koelewijn, are the key elements of a successful farming navigation system.

“The gyroscope senses rotation; the accelerometer senses acceleration; and the magnetometer, magnetic field,” he explains. “These readings enable us to calculate the roll, pitch and yaw angles.”

Imagine an aeroplane with three axes running through its body, meeting at right angles at the plane’s centre of gravity. Rotation around the front-to-back axis is roll, rotation around the side-to-side axis is pitch, and rotation around the vertical axis is yaw. These angles allow for precise navigation in tricky environments – the challenge is calculating them in such an environment.
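As a sketch of how those three readings become angles – using standard textbook formulas, not Xsens’ proprietary algorithm – roll and pitch can be estimated from the accelerometer’s gravity reading, and yaw from a tilt-compensated magnetometer:

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float):
    """Estimate static roll and pitch (radians) from gravity as seen by
    a 3-axis accelerometer. Valid only when the vehicle is not
    accelerating; in practice these are fused with gyroscope data."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def yaw_from_mag(mx: float, my: float, mz: float,
                 roll: float, pitch: float) -> float:
    """Tilt-compensated heading (yaw) from a 3-axis magnetometer."""
    # Rotate the measured magnetic field back into the horizontal plane.
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh)

# A level, stationary sensor: gravity along z, magnetic field along x.
r, p = roll_pitch_from_accel(0.0, 0.0, 9.81)   # both angles ~0
y = yaw_from_mag(1.0, 0.0, 0.0, r, p)          # heading ~0
```

The accelerometer formulas only hold when gravity is the dominant acceleration, which is exactly why the gyroscope is needed the rest of the time.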

For Koelewijn, accuracy is critical as the demand for real-time data processing increases.

“Low latency is vital,” he clarifies. “If you have a drone in the air and there is a gust of wind, you need it to compensate for that immediately. The longer it takes the sensor to calculate and provide feedback, the longer it will take the autopilot or stabilisation mechanism to compensate.” A delay means the data will be out of date by the time it is acted on, and for autonomous vehicles this could result in an accident.

“Low latency is vital for smart farming solutions. You need to compensate for environmental occurrences immediately.”
Arnout Koelewijn

Smart farming challenge

Koelewijn compares smart farming to the challenges associated with driverless cars. “Generally, a road is fixed and flat, but a field can be muddy and slippery. It’s a very uncontrolled environment. Imagine a tractor pulling a heavy load across the field: it may be ‘dancing’ left and right because of the unevenness and softness of the ground. GNSS alone wouldn’t be able to identify the direction precisely. That’s why the gyroscope, accelerometer and magnetometer are so vital.”

To achieve the low latency Koelewijn speaks of, the MTi-7 module runs at a high update frequency and “fuses” the sensor readings with the GNSS receiver’s output. The high update rates are made possible by the accelerometer, which, according to Koelewijn, can produce readings much faster than a GNSS receiver, allowing more samples per second.

“We sample each sensor internally on its three axes, so essentially we’re collecting nine samples at 1,000Hz, together with an orientation estimate,” Koelewijn says.

But low latency comes at a price – it’s more power hungry. “We are limited by the individual sensor elements available to us,” Koelewijn contends. “If we implemented a low-power mode within our technology, we would have to decrease the sample frequency, meaning the accuracy of the sensor would also drop.”

Alongside the sensors, there is also a small microprocessor which gathers all the data. “Each of the sensors has three axes – x, y and z,” Koelewijn says, “so internally it actually acts as three gyroscopes, three magnetometers and three accelerometers.”

The sensors output a voltage that varies with the acceleration, magnetic field or rotation they experience. These voltages are converted into a digital signal and processed by Xsens’ algorithms.

“Anything which is happening physically is obviously not digital,” he says. “What Xsens does is provide data that can be processed by a computer, which says: ‘this vehicle moved from location a to b via this route, and during that travel it had this orientation, at this time’. In a sense, we digitise motion.”

According to Koelewijn the sensors must be “adaptable” to work in accordance with the user’s needs. “The MTi-7 has a lot of settings which can be modified. An engineer can configure it as they would like it to act.”

Despite the well-publicised issues concerning Uber, Koelewijn believes the autonomous market will only get bigger due to its convenience. “It means you’ll need fewer people to operate your business, and it will be cheaper in the long run. It can make farming much more efficient. One farmer can deploy 10 autonomous tractors at a single time, instead of having to do it all themselves.

“The population is growing and there are more mouths to feed, so food production needs to become cheaper and one way to achieve this is through autonomous farms.”

The sensor technology Xsens is deploying is just one part of this solution. Full autonomy requires many different types of components working together with the sensors – a steering-angle sensor, for example.

Consequently, Xsens is working towards accepting new data sources, and Koelewijn believes camera integration is the next step.

“Cameras typically measure the same things as an inertial measurement unit (IMU), just in a different way. The inertial sensor measures rates of turn and the camera measures pixel movement – they’re moving in the same direction, so they could assist each other.

“There are all sorts of technology that could be interesting to combine and fuse in our algorithms to make everything easier and work together. A camera is a logical step because it’s being implemented in most devices already.”

Whatever the next stage, accuracy will be the key to feeding a growing global population, and sensors will have a big role in delivering it.