The ARPON project is a Franco-Brazilian PRCI (International Collaborative Research Project), funded by the ANR in France and by FACEPE in Brazil. It is part of the broader movement that seeks to meet the modern challenges of the agricultural world through robotic technology. The project aims at designing and developing a framework that allows a robotic mobile base to navigate autonomously in commercial orchards. Being able to convey a payload to any point of the orchard is a keystone of automating orchard farming: it then becomes possible to achieve farming tasks such as monitoring, spraying, harvesting, pruning, weeding, detecting irrigation leaks, and transporting fruits.
Four main scientific objectives have been identified to carry out the project: (i) auto-guidance system design, (ii) perception system design, (iii) mapping and localization system design, and (iv) interaction system design. Moreover, the project also aims at proving the concept of sensor-based navigation through orchards, which in turn requires conducting large-scale experimental campaigns in such agricultural environments. All the methods will be developed within the ROS framework, and a parallel GPU implementation may be considered for some of them if required. The scientific program is thus structured around the following scientific and technical workpackages (WP).
- Auto-guidance system design (WP1): This WP is intended to design a sensor-based autonomous guidance system to drive the robot through the orchard. It will be based on a Visual Predictive Control scheme. This kind of automatic control allows the task to be defined in the image space while taking into account a set of constraints, especially those related to obstacles and vibrations.
- Localization and mapping (WP2): The goal is to design an appearance-based visual SLAM scheme for orchard-like environments, using visual information collected by the embedded cameras. Specific AI techniques based on self-organizing maps will be used. The resulting methods will have to remain efficient despite light and weather variations over the days and the structural evolution of the trees over the seasons.
- Perception system design (WP3): This WP aims at designing the perception part of the navigation system. Its goal is to detect and identify the orchard trees from the data provided by the cameras, and from this to extract all the landmarks required for the robot's guidance. Here again, the proposed method will have to be robust to the constraints introduced by the particular nature of the considered environment (light variations, etc.).
- Interaction system design (WP4): This WP aims at defining the robot's missions and collecting data about the robot's state. It is also intended to handle the communication with the user and to provide feedback about task execution. Finally, it must allow the user to remotely take control of the robot in order to deal with unexpected failures.
- Experimental validation (WP5): This WP is devoted to large-scale experimental tests to thoroughly validate the complete navigation system. These tests will be conducted in the orchards of ArboNovateur (Tarn-et-Garonne). The task will consist in safely navigating through an orchard under two targeted scenarios. In the first one, the mission will be performed with a recently learned map in an orchard free of obstacles. In the second one, the same map will be used and updated to perform navigation over two different seasons.
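To give a feel for the Visual Predictive Control idea behind WP1, the sketch below implements a toy 1-D receding-horizon controller: at each step, the control is chosen from a bounded candidate set so as to minimize the predicted image-space error over a short horizon. The unit-gain feature-motion model, horizon length, and velocity bound are illustrative assumptions, not the project's actual design.

```python
# Toy 1-D visual predictive control: pick, at each step, the bounded control
# minimizing the predicted image-feature error over a short horizon.
# The unit-gain feature model f' = f + v*DT is a simplifying assumption.

DT = 0.1  # control period (s), chosen arbitrarily for the example

def vpc_step(feature, target, candidates, horizon=3):
    """Return the candidate control with the lowest predicted horizon cost."""
    def predicted_cost(v):
        f, cost = feature, 0.0
        for _ in range(horizon):
            f += v * DT                  # predict feature motion under control v
            cost += (f - target) ** 2    # accumulate image-space error
        return cost
    return min(candidates, key=predicted_cost)

feature, target, v_max = 0.0, 1.0, 0.5
# Constrained control set (|v| <= v_max), mimicking actuator limits.
candidates = [v_max * i / 10 for i in range(-10, 11)]
for _ in range(40):                      # receding-horizon loop
    v = vpc_step(feature, target, candidates)
    feature += v * DT                    # apply only the first control, then replan
print(f"final feature position: {feature:.3f}")
```

Note the receding-horizon structure: only the first control of each optimized sequence is applied before re-optimizing, which is what lets such a scheme absorb constraints and disturbances online.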
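WP2 names self-organizing maps as the AI technique underpinning the appearance-based mapping. As a rough illustration of the idea, the sketch below trains a tiny 1-D Kohonen map on toy 2-D "descriptors": inputs are clustered onto a chain of nodes via the classic best-matching-unit update. The data, map size, and hyper-parameters are invented for the example and bear no relation to the project's actual models.

```python
# Minimal 1-D self-organizing map (Kohonen network) trained online:
# each input pulls its best-matching unit (BMU) and, more weakly,
# the BMU's neighbors on the chain, with decaying rate and radius.
import math
import random

def train_som(data, n_nodes=5, epochs=50, lr0=0.5, radius0=2.0, seed=0):
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                     # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)   # shrinking neighborhood
        for x in data:
            b = bmu(weights, x)
            for i in range(n_nodes):
                # Gaussian neighborhood on the 1-D chain of nodes
                h = math.exp(-((i - b) ** 2) / (2 * radius ** 2))
                weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
    return weights

def bmu(weights, x):
    """Index of the node whose weight vector is closest to x."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# Two well-separated clusters of toy descriptors; after training,
# inputs from different clusters should map to different nodes.
cluster_a = [(0.10, 0.10), (0.15, 0.05), (0.05, 0.12)]
cluster_b = [(0.90, 0.90), (0.85, 0.95), (0.92, 0.88)]
weights = train_som(cluster_a + cluster_b)
```

In an appearance-based mapping setting, the node an image descriptor maps to can serve as a compact place signature, which is the property the example's final assertion exercises.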
The project involves two partners: CNRS-LAAS (Toulouse, France) and UFPE-CIn (Recife, Brazil). The former has strong skills in robotics and perception and will therefore lead WP1 and WP3. The latter has strong skills in computer science and artificial intelligence and will be in charge of WP2 and WP4. Both the LAAS and CIn teams will collaborate closely on WP5 to handle the aspects related to implementation in the Robot Operating System (ROS) and integration on the platforms. See here for more details about the consortium.