This newsletter revisits the process that eventually led to Autonomous Supervision from a human and systems point of view, although some engineering basics will come in handy.
It has been a long and winding road.
The initial finding
Albatroz Engenharia first encountered aerial robotic systems – they have been known as RPAS, UAV and more recently UAS, and we will use the latter – in 2009, with a three-year R&D project named “AIRTICI”, together with LABELEC – EDP Group, the Institute for Systems and Robotics (ISR-Lisboa) and other partners, for aerial inspection of critical infrastructure: dams, bridges, pipelines, overhead lines, etc. The aircraft used had a typical helicopter configuration, with a large main rotor and a small tail rotor perpendicular to it (Figure 1).
One of the earliest and key findings of AIRTICI was that fatigue set in faster for pilots on the ground than for helicopter pilots in the air, in spite of the similarities between the two aircraft and their procedures.
After study and consultation with pilots holding both competences – crewed and uncrewed flight – it became apparent that the main cause of this phenomenon was the difference in experience between the two positions: in the air there is no parallax error between pilot and aircraft, there is less control-to-actuator delay, and the “envelope” around the pilot relates wholly to the mission. Last but not least, the experience of flight is multi-sensorial, while on the ground the pilot’s body does not receive the inputs – vibrations, accelerations, sounds, odours, etc. – that convey essential components of flight dynamics. A younger generation that has never piloted a crewed aircraft may not feel the need for such stimuli, but they are irreplaceable for seasoned pilots of aircraft with people on board.
The opportunity = the challenge
If human pilots struggle to maintain from the ground, over multi-hour flights, the performance they achieve from a helicopter, and if overhead line inspection is a hazardous task that does not require immediate human presence at close range, then there is a gap between UAS and crewed helicopter operations that opens an opportunity for R&D, one that could later lead to an innovation: autonomous power line maintenance inspections.
In 2007, Albatroz Engenharia had created Power Line Maintenance Inspection [PLMI], the world’s first real-time supervision system for overhead lines [OHL], to support the safety of the crew inside a helicopter operating for LABELEC. Albatroz’ purpose was to support the pilot’s decisions by feeding him instant information about clearances to the obstacles in the scenario, filtered by relevance – first by an algorithm and then by the other members of the crew. Moreover, the system recorded all relevant variables for later debriefing and safety improvements.
The four variables monitored and gauged each second were:
- Clearance to ground, or Altitude above Ground Level [AGL];
- Indicated Air Speed [IAS];
- Clearance to the line, or Distance Helicopter to Line [DHL, which would become DVL = Distance Vehicle to Line];
- Clearance to any obstacle.
Altitude above Ground Level is important to gauge the likelihood of a successful auto-rotation, the manoeuvre by which the pilot safely brings the helicopter to the ground in case of engine failure: too close to the ground, there is no time to perform it. Auto-rotation also depends on the Indicated Air Speed, which determines the lift generated by the air on the main rotor blades: higher speed means more lift. As for the Distance Helicopter to Line, safety margins grow as the helicopter moves away from the OHL.
IAS is measured with a pitot tube, an air pressure sensor that does not operate properly at typical inspection speeds (< 25 kts = 46 km/h = 29 mph), requiring its replacement by a surrogate measurement: Velocity to Ground [VTG], measured with GPS, which works across all relevant velocity ranges.
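The fallback from pitot-based IAS to GPS-based VTG can be sketched as a simple source-selection rule. This is a minimal illustration in Python; the function name and the placement of the 25 kts threshold are assumptions for the sketch, not Albatroz’ actual implementation:

```python
KTS_TO_KMH = 1.852  # one knot is exactly 1.852 km/h


def speed_estimate_kmh(ias_kts: float, gps_vtg_kmh: float) -> float:
    """Pick the speed source: below ~25 kts the pitot tube is unreliable,
    so fall back to the GPS-measured Velocity to Ground (VTG)."""
    PITOT_MIN_KTS = 25.0  # assumed reliability threshold, from the text above
    if ias_kts >= PITOT_MIN_KTS:
        return ias_kts * KTS_TO_KMH  # trust the pitot reading
    return gps_vtg_kmh               # surrogate measurement
```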
The first two variables, AGL & IAS, form a critical safety specification of a helicopter: the height-velocity diagram, which defines the kinematic flight envelope with safe areas (green) and transition and hazardous areas (red). Such a diagram is shown in Figure 2 (a) for a particular type of helicopter and as a generic case in (b). The same safety approach was generalized to AGL, IAS and DHL, creating similar 2-axis diagrams: DHL vs AGL (c) and IAS vs DHL (d).
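The zone logic of such a 2-axis diagram can be illustrated with a toy classifier. All thresholds below are invented for illustration; real envelope boundaries are aircraft-specific curves, as in Figure 2, not straight lines:

```python
def hv_zone(agl_m: float, vtg_kmh: float,
            agl_red: float = 20.0, agl_green: float = 60.0,
            vtg_red: float = 10.0, vtg_green: float = 30.0) -> str:
    """Classify a point in a simplified 2-axis safety diagram:
    red near the origin, green when both variables are comfortably
    large, transition in between (thresholds are placeholders)."""
    if agl_m >= agl_green and vtg_kmh >= vtg_green:
        return "green"       # safe area of the envelope
    if agl_m < agl_red and vtg_kmh < vtg_red:
        return "red"         # hazardous area near the origin
    return "transition"
```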
The result was implemented in the real-time interface of PLMI shown in Figure 3. The main innovative feature of PLMI was the identification of OHL cables, ground, towers and vegetation in real time based on LiDAR measurements, shown in the main window.
The robotics in PLMI was only about sensing and computing. All reasoning was carried out by the crew: a) assessing the relevance of the gauge readings and of the scenario, as depicted in the main window of Figure 3; b) deciding what to communicate to the pilot; and c) the decisions the pilot takes and how they translate into actions. This sequence covers the three main roles of a flight crew: aviate, navigate and communicate.
The role of the Navigator is to keep the aircraft safe from other aircraft and obstacles en route and to show the pilot where to go – often there are multiple OHL confluences and overpasses that require dedicated attention. The role of the Mission Expert is to report on how the systems are performing, both for data acquisition and for safety and flight management. Both communicate with the Pilot (usually by voice), who takes all pieces of information into account when handling the actuators that control the dynamics of the helicopter.
Learning from humans to design robots
To implement inspections using UAS, both the Navigator and the Mission Expert had to become autonomous systems that would feed the Pilot – in this case, an Auto-Pilot – the best inputs to carry out the mission.
Considering that the Auto-Pilot is a system appropriately created by others, common to a type of aircraft across different missions – and that, in the case of UAS, full auto-pilots exist for multiple aircraft types – the Auto-Pilot role was left beyond the scope of the ideation and R&D.
Above all, a “Supervisor” acts as pilot-in-command, coordinating the three sub-systems – Navigator, Mission Expert and Auto-Pilot (which provides a great deal of information) – and arbitrating between competing goals. This is how Autonomous Supervision was conceived. For safety and quality control reasons, all autonomous systems should operate under the aegis of a human crew, henceforth on the ground. In short, while most of the “aviate” role was entrusted to the Auto-Pilot, the remainder of aviate, all of navigate and most of communicate were assigned to Autonomous Supervision.
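This arbitration – a Supervisor coordinating Navigator, Mission Expert and Auto-Pilot – can be sketched in a few classes. Every name, limit and command string below is hypothetical; the sketch only shows the precedence of safety floors over mission-quality ceilings over route following:

```python
from dataclasses import dataclass


@dataclass
class State:
    agl_m: float    # Altitude above Ground Level
    vtg_kmh: float  # Velocity to Ground
    dvl_m: float    # Distance Vehicle to Line


class Navigator:
    """Supplies the route; here, a fixed list of waypoints (illustrative)."""
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)

    def next_waypoint(self):
        return self.waypoints[0] if self.waypoints else None


class MissionExpert:
    """Checks the inspection-quality ceilings (assumed placeholder values)."""
    def __init__(self, agl_max=60.0, vtg_max=46.0, dvl_max=25.0):
        self.limits = (agl_max, vtg_max, dvl_max)

    def within_ceilings(self, s: State) -> bool:
        agl_max, vtg_max, dvl_max = self.limits
        return s.agl_m <= agl_max and s.vtg_kmh <= vtg_max and s.dvl_m <= dvl_max


class Supervisor:
    """Pilot-in-command role: safety first, then mission quality, then route."""
    def __init__(self, navigator: Navigator, expert: MissionExpert):
        self.navigator, self.expert = navigator, expert

    def decide(self, s: State) -> str:
        if s.agl_m < 10.0 or s.dvl_m < 5.0:   # safety floors (assumed values)
            return "climb_away"               # safety overrides everything
        if not self.expert.within_ceilings(s):
            return "tighten_track"            # restore inspection quality
        wp = self.navigator.next_waypoint()
        return f"proceed_to:{wp}" if wp else "hold"
```

In this sketch the commands would be handed to the Auto-Pilot, which remains a black box, exactly as in the text.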
Safety requires that none of the three variables be too small (apart from take-off and landing, which happen at IAS = 0 and AGL = 0), so all charts come close to red near the origin and turn green for large values (see Figure 2). On the other hand, the Mission Expert requires:
- To put a ceiling on AGL, to maintain the quality of inspection of features on the ground;
- To put a ceiling on IAS (or VTG), to maintain the quality of image capture and the density of data;
- To put a ceiling on DHL, to maintain quality in the visual or thermal inspection of OHL components and reliable detection of the OHL cables using LiDAR.
The qualitative two-axis diagrams for the three pairs of variables emphasise that there is a sweet spot in the centre of a cube defined by IAS (= VTG), AGL and DHL, which are monitored by the Mission Expert.
If the three variables are considered at the same time, they form a 3D state space where the role of the Supervisor is to keep the UAS within the green volume inside the state-space cube, while advancing along the lines according to the route shown by the Navigator.
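Keeping the UAS inside the green volume amounts to a box test on the three variables: each one must sit above its safety floor and below its mission-quality ceiling. The limits below are placeholder values for the sketch, not actual operational limits:

```python
def in_green_volume(agl_m: float, vtg_kmh: float, dvl_m: float,
                    floors=(10.0, 5.0, 5.0),       # safety floors (assumed)
                    ceilings=(60.0, 46.0, 25.0)):  # quality ceilings (assumed)
    """True when the point (AGL, VTG, DVL) lies inside the green box of the
    3D state space: above every floor and below every ceiling."""
    point = (agl_m, vtg_kmh, dvl_m)
    return all(lo <= x <= hi for x, lo, hi in zip(point, floors, ceilings))
```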
Conclusion (of part I)
In a nutshell:
- safety puts floors on AGL, IAS and DHL;
- quality and robustness of inspection put ceilings on AGL, IAS and DHL; and
- the same LiDAR sensor is used both for safety and for vegetation and clearance measurements, ensuring consistency of scene modelling.
The interesting point is that a control architecture devised for the safety of human crews proved to be the best fit for UAS operations.
Sounds simple? It was only the beginning.
In 2012, an aeronautics engineer at Albatroz Engenharia, Ms. Sandra Antunes, started a PhD programme at Universidade da Beira Interior (www.ubi.pt) to prove this was feasible, safe and efficient.
It took ten years – until 7th April 2022, four years ago to this day – to prove it was feasible and safe. Read the continuation of this endeavour in the coming editions…
One picture per month
A look behind the curtains to discover some significant memories from Albatroz Engenharia’s archives.
The Talisman
If you need to improvise in a helicopter, put all your gear in a shoe box.
At the start of airborne demonstrations for the 4th PLMI unit, time constraints prevented the design of a proper data processing unit for that specific helicopter. As a temporary fix, the team installed all components inside a crude laboratory steel box – nicknamed “the shoebox” for its makeshift look. What was meant to last only a few days ended up serving for months, accompanying extensive deployments while a proper Digital Recording Unit (DRU) was being developed. When the DRU finally replaced it, the client – out of either satisfaction or superstition – insisted on keeping the shoebox as a backup.
Years later, when the whole system was replaced, the shoebox was shelved in the company museum as a cherished artifact, a symbol of ingenuity and reliability.




