Air-Cobot
| Country | France |
| --- | --- |
| Type | Cobot |
| Website | https://aircobot.akka.eu/ |
Air-Cobot (Aircraft Inspection enhanced by smaRt & Collaborative rOBOT) is a French research and development project of a wheeled collaborative mobile robot able to inspect aircraft during maintenance operations. This multi-partner project involves research laboratories and industry. Research around this prototype was developed in three domains: autonomous navigation, nondestructive testing and human-robot collaboration.
Air-Cobot is presented as the first robot able to perform visual inspections of aircraft. Inspection robots using other types of sensors had been considered before, such as in the European project Robair. Since the launch of the project, other solutions based on image processing have been developed, such as EasyJet's use of a drone and the swarm of drones from the Toulouse company Donéclé. Currently, the Air-Cobot robot can inspect the lower parts of an aircraft. If the project continues, a planned extension is to couple the robot with a drone to inspect an aircraft's upper parts.
Project description
Objectives
Launched in January 2013,[1] the project is part of the Interministerial Fund program of Aerospace Valley, a business cluster in southwestern France.[2] With a budget of over one million euros,[3] Air-Cobot aims to develop an innovative collaborative mobile robot, autonomous in its movements and able to perform the inspection of an aircraft with nondestructive testing sensors during preflight or during maintenance operations in a hangar.[2][4] Testing has been performed at the premises of Airbus and Air France Industries.[5]
Partners
The project leader is Akka Technologies. There are two academic partners, while Akka Technologies and four other companies make up the five industrial partners.[6]
- Academic partners
- Armines and Institut Clément Ader of the École des mines d'Albi-Carmaux are in charge of nondestructive testing.[6][7]
- Laboratoire d'analyse et d'architecture des systèmes (LAAS-CNRS) with the Robotics, Action and Perception (RAP) team handles the autonomous navigation.[6][7][8]
- Industrial partners
- Akka Technologies, particularly the center for research and development Akka Research Toulouse, leads the project and brings skills in image analysis, navigation and aircraft maintenance.[3][6][7][9]
- Airbus Group Innovations is the initiator of the project, providing CAD models of the Airbus A320 and developing operating scenarios.[3][6][7]
- 2MoRO Solutions, a company based in the French Basque Country, is in charge of the maintenance information system.[6][7]
- M3 Systems, a Toulouse-based company, takes care of the outdoor localization solution based on the Global Positioning System (GPS).[6][7][10]
- Sterela, based in the south of Toulouse, provides the 4MOB mobile platform.[6][7][11]
Project finance
Project finance is provided by the Banque publique d'investissement, the Aquitaine Regional Council, the Pyrénées-Atlantiques Departmental Council, the Midi-Pyrénées Regional Council and the European Union.[12]
Expected benefits
Aircraft are inspected during maintenance operations either outdoors on an airport between flights, or in a hangar for longer-duration inspections. These inspections are conducted mainly by human operators, visually and sometimes using tools to assess defects.[A 1] The project aims to improve the inspection of aircraft and its traceability. A database dedicated to each aircraft type, containing images and three-dimensional scans, will be updated after each maintenance operation. This makes it possible, for example, to assess the propagation of a crack.[4][13]
The human operator's eyes fatigue over time, while an automatic solution ensures reliable, repeatable inspections. Reducing inspection time is a major objective for aircraft manufacturers and airlines: faster maintenance operations optimize aircraft availability and reduce maintenance operating costs.[4][13]
Robot equipment
All electronic equipment is carried by the 4MOB mobile platform manufactured by Sterela. The off-road platform, equipped with four-wheel drive, can move at a speed of 2 metres per second (7.2 km/h; 4.5 mph).[11] Its lithium-ion battery allows an operating time of eight hours. Obstacle-detection bumpers at the front and rear stop the platform if they are compressed.[11]
The cobot weighs 230 kilograms (507 lb). It has two computers, one running Linux for the autonomous navigation module and the other Windows for the non-destructive testing module. The robot is equipped with several sensors. The pan-tilt-zoom camera manufactured by Axis Communications and the Eva 3D scanner manufactured by Artec 3D are dedicated to inspection. The sensors for navigation are an inertial measurement unit; two benches, each equipped with two PointGrey cameras; two Hokuyo laser range finders; and a GPS unit developed by M3 Systems that allows for geofencing tasks in outdoor environments.[3][7]
Autonomous navigation
The autonomous navigation of the Air-Cobot robot has two phases. The first, navigation in the airport or the factory, allows the robot to move close to the aircraft. The second, navigation around the aircraft, allows the robot to position itself at control points referenced in the aircraft's virtual model. In addition, the robot must insert itself in a dynamic environment where humans and vehicles are moving; to address this problem, it has an obstacle-avoidance module. Several navigation algorithms run simultaneously on the robot under real-time constraints, and research has been conducted on optimizing their computation time.
Navigation in the airport or the factory
In an outdoor environment, the robot is able to go to the inspection site by localizing through Global Positioning System (GPS) data. The GPS device developed by M3 Systems allows geofencing. At the airport, the robot operates in dedicated navigation corridors respecting speed limits. Alerts are sent to the operator if the robot enters a prohibited area or exceeds a given speed.[10]
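The internals of the M3 Systems device are not public, but the corridor-and-speed-limit logic described above can be sketched as a simple geofencing check. The corridor polygon, 2 m/s limit and alert strings below are illustrative assumptions, not the project's actual implementation:

```python
# Minimal geofencing sketch: raise alerts when the robot leaves an
# authorized corridor (point-in-polygon test) or exceeds a speed limit.

def in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def geofence_alerts(x, y, speed, corridor, speed_limit=2.0):
    """Return the list of alerts for one GPS fix (positions in metres)."""
    alerts = []
    if not in_polygon(x, y, corridor):
        alerts.append("outside authorized corridor")
    if speed > speed_limit:
        alerts.append("speed limit exceeded")
    return alerts
```

For example, with a rectangular corridor `[(0, 0), (10, 0), (10, 2), (0, 2)]`, a fix inside the corridor at a legal speed produces no alert, while a fix outside it at 3 m/s produces both.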
In an indoor environment, or an outdoor environment where GPS information is not available, the cobot can be switched to follower mode to move behind the human operator and follow him or her to the aircraft to be inspected.[14]
Navigation around the aircraft
To perform the inspection, the robot has to navigate around the aircraft and reach the checkpoints referenced in the aircraft's virtual model. Since the position of the aircraft in the airport or factory is not known precisely, the cobot needs to detect the aircraft in order to estimate its own position and orientation relative to it. To do this, the robot can localize itself either with the laser data from its range finders,[A 2] or with image data from its cameras.[A 1][A 3]
Near the aircraft, a point cloud in three dimensions is acquired by changing the orientation of the laser scanning sensors fixed on pan-tilt units. After filtering the data to remove the floor and point clusters that are too small, a registration technique matching the aircraft model is used to estimate the static orientation of the robot. The robot then moves and maintains this orientation estimate by combining its wheel odometry, its inertial unit and visual odometry.[A 2]
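The registration step is not detailed in the source. As a hedged illustration, the core of an ICP-style alignment — the least-squares rigid transform between point sets whose correspondences are already known, i.e. the Kabsch algorithm used inside each ICP iteration — can be sketched with NumPy:

```python
import numpy as np

def rigid_align(model_pts, scene_pts):
    """Least-squares rotation R and translation t mapping model_pts onto
    scene_pts (Kabsch algorithm). Both arguments are (N, 3) arrays of
    corresponding points, as produced inside one ICP iteration."""
    cm = model_pts.mean(axis=0)
    cs = scene_pts.mean(axis=0)
    H = (model_pts - cm).T @ (scene_pts - cs)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R, t
```

Applied to a scene cloud that is a rotated and translated copy of the model cloud, the function recovers the rotation and translation exactly; with noisy real data it returns the least-squares best fit.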
[Image gallery: "Matching clouds" – laser point clouds registered with the aircraft model]
Laser data are also used horizontally, in two dimensions. An algorithm provides a real-time estimate of the robot's position when enough elements of the landing gear and engines are visible. A confidence index is calculated from the number of elements detected by the lasers; if the confidence is high enough, the position is updated. This mode is particularly used when the robot moves beneath the aircraft.[A 2]
For visual localization, the robot estimates its position relative to the aircraft using visual elements of the aircraft (doors, windows, tires, static ports, etc.). As the robot moves, these visual elements are extracted from a three-dimensional virtual model of the aircraft and projected into the image plane of the cameras. The projected shapes are used for pattern recognition to detect the corresponding visual elements.[A 3] The other detection method is based on feature extraction with a Speeded Up Robust Features (SURF) approach: images of each element to be detected are paired against the current scene.[A 1]
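The pairing step in such feature-based approaches is nearest-neighbour descriptor matching. The library-agnostic sketch below shows this with Lowe's ratio test; the descriptor dimensionality and the 0.75 ratio are conventional values, not taken from the project:

```python
import numpy as np

def match_descriptors(desc_model, desc_scene, ratio=0.75):
    """Pair each model descriptor with its nearest scene descriptor,
    keeping only matches that pass Lowe's ratio test (the best distance
    must be clearly smaller than the second best). Descriptors are
    (N, D) float arrays, e.g. 64-D for SURF."""
    matches = []
    for i, d in enumerate(desc_model):
        dists = np.linalg.norm(desc_scene - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, best))   # (model index, scene index)
    return matches
```

Ambiguous descriptors — ones roughly equidistant from several scene descriptors — are discarded by the ratio test, which is why it is widely used to reject false pairings.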
By detecting and tracking visual landmarks, the robot can perform visual servoing in addition to estimating its position relative to the aircraft.[A 4] Research in vision is also conducted on simultaneous localization and mapping (SLAM).[A 5][A 6] Fusing information from the two acquisition methods, laser and vision, is under consideration, as is an artificial intelligence arbitrating between the various localization estimates.[A 2][A 1]
Obstacle avoidance
In both navigation modes, Air-Cobot is also able to detect, track, identify and avoid obstacles in its way. Both the laser data from the range finders and the visual data from the cameras can be used for detection, tracking and identification of obstacles. Detection and tracking work better in the two-dimensional laser data, while identification is easier in the camera images; the two methods are complementary. Information from the laser data can be used to delimit work areas in the image.
The robot has several possible responses to an obstacle, depending on its environment (navigation corridor, tarmac area with few obstacles, cluttered indoor environment, etc.) at the time of the encounter. It can stop and wait for a gap in traffic, avoid the obstacle using a technique based on a spiral, or perform path planning around it.[A 4]
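The published spiral technique (Futterlieb, Cadenat & Sentenac 2014) combines visual servoing with the avoidance motion. As a simplified stand-in for the geometric idea only, waypoints on an Archimedean spiral around the obstacle can be generated like this (all parameter values are illustrative):

```python
import math

def spiral_waypoints(obstacle_x, obstacle_y, r0=1.0, growth=0.15,
                     turns=1.5, n_points=30):
    """Waypoints on an Archimedean spiral r = r0 + growth * theta,
    centred on an obstacle, so the robot circumnavigates it while
    gradually increasing its clearance."""
    pts = []
    for k in range(n_points):
        theta = 2 * math.pi * turns * k / (n_points - 1)
        r = r0 + growth * theta
        pts.append((obstacle_x + r * math.cos(theta),
                    obstacle_y + r * math.sin(theta)))
    return pts
```

The clearance grows monotonically with the angle travelled, which is the property that makes spiral-based avoidance safe: the robot never spirals back into the obstacle.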
Computing time optimization
Given the number of navigation algorithms running simultaneously under real-time constraints, research has been conducted on improving the computation time of some numerical methods using field-programmable gate arrays (FPGAs).[A 7][A 8][A 9] The research focused on visual perception. The first part addressed simultaneous localization and mapping with an extended Kalman filter, which estimates the state of a dynamic system from a series of noisy or incomplete measurements.[A 7][A 9] The second addressed obstacle localization and detection.[A 8]
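For context, the measurement-update step of a standard extended Kalman filter is dominated by the matrix products that such an FPGA accelerator targets. The sketch below is the textbook EKF update, not the project's FPGA design:

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard (extended) Kalman filter measurement update.

    x : state estimate, P : state covariance,
    z : measurement, H : linearized measurement model,
    R : measurement noise covariance.
    """
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y                   # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P  # reduced covariance
    return x_new, P_new
```

In EKF-based SLAM the state vector grows with the number of mapped landmarks, so these dense matrix multiplications become the computational bottleneck — which is precisely why offloading them to hardware pays off.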
Nondestructive testing
Image analysis
After positioning itself to perform a visual inspection, the robot acquires images with its pan-tilt-zoom camera. Several steps take place: pointing the camera, detecting the element to be inspected, repointing and zooming if needed, image acquisition and inspection. Image analysis is used to determine whether doors are open or closed, whether protective covers are present on certain equipment, the state of the turbofan blades and the wear of the landing-gear tires.[A 10][A 11][A 12][A 13]
Detection uses pattern recognition of regular shapes (rectangles, circles, ellipses); for more complex shapes, the 3D model of the element to be inspected can be projected into the image plane. The evaluation is based on indices such as the uniformity of the segmented regions, the convexity of their shapes, or the periodicity of the image pixels' intensity.[A 12]
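Such evaluation indices can be sketched in simplified form. The two functions below are hypothetical stand-ins — a uniformity index based on intensity spread and a convexity index as region area over convex-hull area — and are not the project's actual formulas:

```python
import numpy as np

def uniformity_index(region):
    """Uniformity of a segmented region's pixel intensities, in [0, 1]:
    1.0 for a perfectly uniform region, lower as intensities spread.
    Assumes 8-bit intensities (0-255)."""
    region = np.asarray(region, dtype=float)
    if region.size == 0:
        return 0.0
    spread = region.std() / 255.0
    return 1.0 - min(spread, 1.0)

def convexity_index(region_area, hull_area):
    """Ratio of a region's area to that of its convex hull:
    1.0 for a convex shape, lower for concave or ragged shapes."""
    return region_area / hull_area if hull_area > 0 else 0.0
```

A segmented door panel with uniform paint scores near 1.0 on uniformity, while a region crossing a shadow boundary scores lower; a dented or ragged segmentation lowers the convexity index.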
Feature extraction using speeded up robust features (SURF) can also inspect elements that have two possible states, such as pitot probes and static ports being covered or uncovered. Images of the element to be inspected in its different states are paired against those present in the scene. For these simple items, analysis during navigation is possible and preferable because it saves time.[A 1]
Point cloud analysis
After positioning itself to perform a scanning inspection, the robot's pantograph raises the 3D scanner to the fuselage, and a pan-tilt unit moves the scanning device to acquire the hull. By comparing the acquired data to the three-dimensional model of the aircraft, algorithms can diagnose faults in the fuselage structure and provide information on their shape, size and depth.[15]
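The comparison step amounts to measuring how far each scanned point lies from the reference model and flagging large deviations as candidate defects. A brute-force sketch (the nearest-point comparison and the 2 mm tolerance are illustrative assumptions, not the project's algorithm):

```python
import numpy as np

def deviation_map(scan_pts, model_pts):
    """For each scanned point, distance to the nearest point of the
    reference model cloud. Brute-force O(N*M) nearest neighbour,
    adequate for a sketch; real systems use spatial indexing."""
    diffs = scan_pts[:, None, :] - model_pts[None, :, :]
    return np.linalg.norm(diffs, axis=2).min(axis=1)

def flag_defects(scan_pts, model_pts, tolerance=0.002):
    """Indices of scanned points deviating from the model by more than
    a tolerance (in metres): candidate dents or bumps."""
    return np.nonzero(deviation_map(scan_pts, model_pts) > tolerance)[0]
```

The deviation map also carries the defect's geometry: the magnitude of each flagged deviation gives its depth, and the flagged indices outline its shape and extent.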
By moving the pan-tilt units of the laser range finders, it is also possible to obtain a point cloud in three dimensions. Registration between the aircraft model and the scene point cloud is already used in navigation to estimate the robot's static position. It is planned to make targeted acquisitions, simpler in terms of movement, to verify the absence of chocks in front of the landing-gear wheels or the proper closing of the engine-cowling latches.[A 2]
Human-robot collaboration
As the project name suggests, the mobile robot is a cobot – a collaborative robot. During the navigation and inspection phases, a human operator accompanies the robot and can take control if necessary, add inspection tasks, note a defect that is not on the robot's checklist, or validate the results. In the case of pre-flight inspections, the diagnosis of the walk-around is sent to the pilot, who decides whether or not to take off.[7][14][A 14]
Other robotic inspection solutions
European project Robair
The inspection robot of the European project Robair, funded from 2001 to 2003, is designed to mount on the wings and fuselage of an aircraft to inspect rows of rivets. To move, the robot uses a flexible network of pneumatic suction cups that adjust to the surface. It can inspect the lines of rivets using ultrasonic, eddy-current and thermographic techniques, detecting loose rivets and cracks.[16][17][18]
EasyJet drone
The airline EasyJet is interested in inspecting aircraft with drones and made a first inspection in 2015. Equipped with laser sensors and a high-resolution camera, the drone performs an autonomous flight around the aeroplane, generates a three-dimensional image of the aircraft and transmits it to a technician. The operator can then navigate in this representation and zoom in to display a high-resolution picture of some part of the aircraft, visually diagnosing the presence or absence of defects. This approach avoids the use of platforms to observe the upper parts of the aeroplane.[19]
Donéclé drone
Founded in 2015, Donéclé, a Toulouse start-up, has also launched a drone-based approach, initially specialized in the detection of lightning strikes on aeroplanes.[20][21] Such an inspection, performed by five people equipped with harnesses and platforms, usually takes about eight hours; immobilizing the aircraft and mobilizing the staff are costly for the airlines, an estimated $10,000 per hour. The solution proposed by the start-up takes twenty minutes.[21]
Donéclé uses a swarm of drones equipped with laser sensors and micro-cameras. The algorithms for automatic defect detection, trained on an existing image database with machine-learning software, can identify various elements: texture irregularities, pitot probes, rivets, openings, text, defects, corrosion and oil stains. A damage report is sent to the operator's tablet, with each area of interest and its proposed classification given a probability percentage. After reviewing the images, the verdict is pronounced by a qualified inspector.[21]
Project continuation
In 2015, in an interview given to the French weekly magazine Air et Cosmos, Jean-Charles Marcos, CEO of Akka Research, explained that once developed and marketed, the Air-Cobot should cost between 100,000 and 200,000 euros. It could meet civilian needs in nondestructive testing as well as military ones.[3] A possible continuation of the project could be the use of the robot on aircraft larger than the Airbus A320. The CEO also revealed that Akka Technologies plans to work on a duo of robots for inspection: the same mobile platform for the lower parts, and a drone for the upper parts. If funding is allocated, this second phase would take place during the period 2017–2020.[3] At the Singapore Airshow in February 2016, Airbus Group presented Air-Cobot and its use in its vision of the hangar of the future.[22]
Communications
On October 23, 2014, a patent was filed by Airbus Group.[23] From 2014 to 2016, the robot was presented at five exhibitions, including the Paris Air Show 2015[1][24][25] and the Singapore Airshow 2016.[22][26] The research developed in the project was presented at twelve conferences. Fourteen scientific articles were published: eleven conference papers and three journal articles.[27] Some of the publications center on navigation and/or inspection by Air-Cobot, while the rest focus on specific numerical methods or hardware solutions related to the project's problems.
On April 17, 2015, Airbus Group released a project presentation video, made by the communication agency Clipatize, on its YouTube channel.[14][28] On September 25, 2015, Toulouse Métropole broadcast a promotional video on its YouTube channel; it presents the metropolis as an attractive ecosystem able to build the future, highlights its international visibility, and chose the Air-Cobot demonstrator to illustrate the city's robotics research.[29] Based at the Laboratoire d'analyse et d'architecture des systèmes during development, the robot was regularly demonstrated by the project's researchers and engineers to visitors (external researchers, industrial partners, or students); it was also demonstrated to the general public during the 2015 Feast of Science.[30] On February 17, 2016, Airbus Group broadcast a YouTube video presenting its vision of the hangar of the future, in which it plans to use Air-Cobot.[22]
See also
Wikimedia Commons has media related to Air-Cobot.
Notes and references
Research articles of the project
- Villemot, Larnier & Vetault 2016, RFIA
- Frejaville, Larnier & Vetault 2016, RFIA
- Jovancevic et al. 2016, ICPRAM
- Futterlieb, Cadenat & Sentenac 2014, ICINCO
- Esparza-Jiménez, Devy & Gordillo 2014, FUSION
- Esparza-Jiménez, Devy & Gordillo 2016, Sensors
- Tertei, Piat & Devy 2014, ReConFig
- Alhamwi, Vandeportaele & Piat 2015, ICVS
- Tertei, Piat & Devy 2016, CEE
- Jovancevic et al. 2015, QCAV
- Jovancevic et al. 2015, CMOI
- Jovancevic et al. 2015, JEI
- Jovancevic et al. 2016, MECO
- Donadio et al. 2016, MCG
Proceedings
- Futterlieb, Marcus; Cadenat, Viviane; Sentenac, Thierry (2014). "A navigational framework combining Visual Servoing and spiral obstacle avoidance techniques" (pdf). Informatics in Control, Automation and Robotics (ICINCO), 2014 11th International Conference on, Vienna: 57–64.
- Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José Luis (2014). "EKF-based SLAM fusing heterogeneous landmarks" (pdf). 17th International Conference on Information Fusion (FUSION): 1–8.
- Tertei, Daniel Törtei; Piat, Jonathan; Devy, Michel (2014). "FPGA design and implementation of a matrix multiplier based accelerator for 3D EKF SLAM" (pdf). International Conference on ReConFigurable Computing and FPGAs (ReConFig14): 1–6.
- Jovancevic, Igor; Orteu, Jean-José; Sentenac, Thierry; Gilblas, Rémi (April 2015). "Automated visual inspection of an airplane exterior" (pdf). Proceedings of SPIE – the International Society for Optical Engineering. Twelfth International Conference on Quality Control by Artificial Vision 2015. 9534: 95340Y. Bibcode:2015SPIE.9534E..0YJ. doi:10.1117/12.2182811.
- (French) Jovancevic, Igor; Orteu, Jean-José; Sentenac, Thierry; Gilblas, Rémi (November 2015). "Inspection d'un aéronef à partir d'un système multi-capteurs porté par un robot mobile" (pdf). Actes du 14ème Colloque Méthodes et Techniques Optiques pour l'Industrie.
- Alhamwi, Ali; Vandeportaele, Bertrand; Piat, Jonathan (2015). "Real Time Vision System for Obstacle Detection and Localization on FPGA" (pdf). Computer Vision Systems – 10th International Conference, ICVS 2015: 80–90.
- Jovancevic, Igor; Viana, Ilisio; Orteu, Jean-José; Sentenac, Thierry; Larnier, Stanislas (February 2016). "Matching CAD model and images features for robot navigation and inspection of an aircraft" (pdf). International Conference on Pattern Recognition Applications and Methods: 359–366.
- Jovancevic, Igor; Arafat, Al; Orteu, Jean-José; Sentenac, Thierry (2016). "Airplane tire inspection by image processing techniques" (pdf). 5th Mediterranean Conference on Embedded Computing.
- (French) Frejaville, Jérémy; Larnier, Stanislas; Vetault, Stéphane (2016). "Localisation à partir de données laser d'un robot naviguant autour d'un avion" (pdf). Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle.
- (French) Villemot, Tanguy; Larnier, Stanislas; Vetault, Stéphane (2016). "Détection d'amers visuels pour la navigation d'un robot autonome autour d'un avion et son inspection" (pdf). Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle.
- Donadio, Frédéric; Frejaville, Jérémy; Larnier, Stanislas; Vetault, Stéphane (2016). "Human-robot collaboration to perform aircraft inspection in working environment". Proceedings of 5th International conference on Machine Control and Guidance.
Journal articles
- Jovancevic, Igor; Larnier, Stanislas; Orteu, Jean-José; Sentenac, Thierry (November 2015). "Automated exterior inspection of an aircraft with a pan-tilt-zoom camera mounted on a mobile robot" (pdf). Journal of Electronic Imaging. 24 (6): 061110. Bibcode:2015JEI....24f1110J. doi:10.1117/1.JEI.24.6.061110.
- Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José Luis (2016). "EKF-based SLAM fusing heterogeneous landmarks" (pdf). Sensors. 16 (4).
- Tertei, Daniel Törtei; Piat, Jonathan; Devy, Michel (2016). "FPGA design of EKF block accelerator for 3D visual SLAM" (pdf). Computers and Electrical Engineering.
Other references
- (French) Xavier Martinage (17 June 2015). "Air-Cobot : le robot dont dépendra votre sécurité". lci.tf1.fr. La Chaîne Info. Archived from the original on 3 January 2016. Retrieved 12 July 2016.
- (French) "Air-Cobot : un nouveau mode d'inspection visuelle des avions". competitivite.gouv.fr. Les pôles de compétitivité. Retrieved 12 July 2016.
- (French) Olivier Constant (11 September 2015). "Le projet Air-Cobot suit son cours". Air et Cosmos. Retrieved 12 July 2016.
- (French) "Rapport d'activité 2013–2014 de l'Aerospace Valley" (PDF). aerospace-valley.com. Aerospace Valley. Retrieved 12 July 2016.
- (French) "News du projet Air-Cobot". aircobot.akka.eu. Akka Technologies. Retrieved 12 July 2016.
- (French) "AKKA Technologies coordonne le projet Air-COBOT, un robot autonome d'inspection visuelle des avions". Capital. 1 July 2014. Retrieved 14 July 2016.
- (French) "Air-Cobot, le robot qui s'assure que vous ferez un bon vol !". Planète Robots: 32–33. 2016.
- (French) "Contrats RAP". Laboratoire d'analyse et d'architecture des systèmes. Retrieved 17 July 2016.
- (French) "Akka Technologies : une marque employeur orientée sur l'innovation". Le Parisien. 15 February 2016. Retrieved 17 July 2016.
- "M3 Systems Flagship Solution". M3 Systems. Retrieved 17 July 2016.
- (French) "4MOB, plateforme intelligente autonome" (pdf). Sterela Solutions. Retrieved 17 July 2016.
- (French) "Financeurs". aircobot.akka.eu. Akka Technologies. Retrieved 15 July 2016.
- (French) Véronique Guillermard (18 May 2015). "Aircobot contrôle les avions avant le décollage". Le Figaro. Retrieved 14 July 2016.
- Air-Cobot on YouTube
- (French) Pascal NGuyen (December 2014). "Des robots vérifient l'avion au sol". Sciences et Avenir (814). Retrieved 17 July 2016.
- (French) "Robair, Inspection robotisée des aéronefs". European Commission. Retrieved 16 July 2016.
- "Robair". London South Bank University. Retrieved 16 July 2016.
- Shang, Jianzhong; Sattar, Tariq; Chen, Shuwo; Bridge, Bryan (2007). "Design of a climbing robot for inspecting aircraft wings and fuselage". Industrial Robot: An International Journal. 34 (6): 495–502. doi:10.1108/01439910710832093.
- (French) Newsroom (8 June 2015). "Easy Jet commence à utiliser des drones pour l'inspection de ses avions". Humanoides. Retrieved 16 July 2016.
- (French) Florine Galéron (28 May 2015). "Aéronautique : la startup Donéclé invente le drone anti-foudre". Objectif News, La Tribune. Retrieved 16 July 2016.
- (French) Arnaud Devillard (20 April 2016). "Des drones pour inspecter des avions". Sciences et Avenir. Retrieved 16 July 2016.
- Innovations in Singapore: the Hangar of the Future on YouTube
- "Espacenet – Bibliographic data – Collaborative robot for visually inspecting an aircraft". worldwide.espacenet.com. Retrieved 1 June 2016.
- (French) Juliette Raynal; Jean-François Prevéraud (15 June 2015). "Bourget 2015 : les dix rendez-vous technos à ne pas louper". Industrie et Technologies. Retrieved 16 July 2016.
- (French) "Akka Technologies au Salon du Bourget". Maurice Ricci. 21 June 2015. Retrieved 16 July 2015.
- "Singapore Airshow 2016 Trends: Emerging Technologies Take Off – APEX | Airline Passenger Experience". apex.aero. Retrieved 1 June 2016.
- (French) "Communications du projet Air-Cobot". aircobot.akka.eu. Akka Technologies. Retrieved 14 July 2016.
- "AirCobot – Introducing Smart Robots for Aircraft Inspections". clipatize.com. Clipatize. Retrieved 15 August 2016.
- (French) Toulouse métropole, construire le futur on YouTube
- (French) Air-Cobot, le robot d'assistance aux inspections des aéronefs (pdf). Programme de la fête de la science. 2015. Retrieved 17 July 2016.
External links
- (French) Official website
- Air-Cobot
- Akka Technologies