IFATCA @ ICAO Assembly #40

The 40th edition of the ICAO Assembly is underway at the organisation’s headquarters in Montréal, Canada. The Assembly meets at least once every three years and is convened by ICAO’s governing body, the Council.

During Assembly Sessions, ICAO’s complete work programme in the technical, economic, legal and technical cooperation fields is reviewed in detail. Assembly outcomes are then provided to the other bodies of ICAO and to its Member States in order to guide their continuing and future work.

IFATCA has observer status at ICAO and can make interventions. Below are links to such interventions made by our ICAO ANC Liaison Officer, Mr. Jean-François Lepage.

Video of the interventions, courtesy of the ICAO livestream of the meetings, can be found here:


European Aviation Artificial Intelligence High-Level Group

The kick-off meeting of the European Aviation Artificial Intelligence High-Level Group took place at the EUROCONTROL headquarters in Brussels on 16 September 2019. This group, formed of experts from different aviation areas, is tasked with studying how Artificial Intelligence can be included in ATM. Its ambitious target is to produce a report on the subject, including recommendations, by the end of the year. IFATCA is included in the team and will ensure that the operational point of view is covered in the final report.


Just Culture: Are we sustaining a false belief?

Prof. Dekker

By Sidney W. A. Dekker, currently a professor at Griffith University in Brisbane, Australia, where he founded the Safety Science Innovation Lab. He is also Honorary Professor of Psychology at the University of Queensland. He flew as First Officer on Boeing 737s for Sterling and later Cimber Airlines out of Copenhagen. In 2004, he wrote the following article in the aftermath of the Linate accident.

We like to believe that accidents happen because a few people do stupid things. That some people just do not pay attention. Or that some people have become complacent and do not want to do a better job.

On the surface, there is often a lot of support for these ideas. A quick look at what happened at Linate, for example, shows that controllers did not follow up on position reports, that airport managers did not fix a broken radar system in time, that nobody had bothered to maintain markings and signs out on the airport, and that controllers did not even know about some of the stop marks and positions out on the taxiway system. And of course, that a Cessna pilot landed in conditions that were below his minima. He should never have been there in the first place.

When we dig through the rubble of an accident, these shortcomings strike us as egregious, as shocking, as deviant, or even as criminal. If only these people had done their jobs! If only they had done what we pay them to do! Then the accident would never have happened. There seems only one way to go after such discoveries: fire the people who did not do their jobs. Perhaps even prosecute them and put them in jail. Make sure that they never touch a safety-critical system again. In fact, set an example by punishing them: make sure that other people like them will do their jobs diligently and correctly, so that they do not also lose their jobs or land in jail.

The problem with this logic is that it does not get us anywhere. The problem with this logic is that it does not work the way we hope. What we believe is not what really happens. The reason the logic does not work is twofold. First, accidents don’t just happen because a few people do stupid things or don’t pay attention. Second, firing or punishing people does not create progress on safety: it does not prevent such accidents from happening again. The only thing that we sustain by this logic of individual errors and punishment is our illusions. Systems don’t get safer by punishing people. Systems don’t get safer by thinking that humans are the greatest risk.

Let’s look at the first problem. Accidents don’t just happen because a few people do stupid things or don’t pay attention. Accidents are not just “caused” by those people. There is research that shows how accidents are almost normal, expected phenomena in systems that operate under conditions of resource scarcity and competition; that accidents are the normal by-product of normal people doing normal work in everyday organizations that operate technology that is exposed to a certain amount of risk. Accidents happen because entire systems fail. Not because people fail. This is called the systems view. The systems view is in direct contrast to the logic outlined above. The systems view sees the little errors and problems that we discover on the surface as symptoms, not as causes. These things do not “cause” an accident. Rather, they are symptoms of issues that lie much deeper inside a system. These issues may have to do with priorities, politics, organizational communication, engineering uncertainties, and much more.

To people who work in these organizations, however, such issues are seldom as obvious as they are to outside observers after an accident. To people inside organizations, these issues are not noteworthy or special. They are the stuff of doing everyday work in everyday organizations. Think of it: there is no organization where resource scarcity and communication problems do not play some sort of role (just think of your own workplace). But connecting these issues to an accident, or the potential of an accident, before the accident happens, is impossible. Research shows that it is basically outside our ability to imagine accidents as possible. We don’t believe that it is possible that an accident will happen. And what we don’t believe, we cannot predict.

An additional problem is that the potential for having an accident can grow over time. Systems slowly, and unnoticeably, move towards the edge of their safety envelopes. In their daily work people — operators, managers, administrators — make numerous decisions and trade-offs. They solve numerous larger and little problems. This is part and parcel of their everyday work, their everyday lives. With each solved problem comes the confidence that they must be doing the right thing; a decision was made without obvious safety consequences. But other ramifications or consequences of those decisions may be hard to foresee, they may be impossible to predict. The cumulative effect is called drift: the drift into failure. Drifting into failure is possible because people in organizations make thousands of little and larger decisions that to them are seemingly unconnected. But together, eventually, all these little, normal decisions and actions can push a system over the edge. Research shows that recognizing drift is incredibly difficult, if not impossible — either from the inside or the outside of the organization.

Continue to part 2


Safeguarding the Aeronautical Frequency Spectrum

Europe is currently studying the possibility of allowing low-power audio “Programme Making and Special Events” (PMSE) equipment to operate in the 960 – 1164 MHz band of the spectrum, which is currently reserved for aeronautical use.

Allocations and services operating in the 960-1215 MHz band (source: Ofcom, UK)

Sharing this portion of the spectrum with other users would potentially affect systems widely used in civil aviation like Secondary Radar, DME, ADS-B, Multi-lateration and ACAS/TCAS. Aviation equipment more commonly used by the military like TACAN or Link 16 communication systems also makes use of this part of the spectrum. The same is true for future developments, like LDACS (L-Band Digital Aeronautical Communication System).
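As a toy illustration of why band sharing is delicate, the sketch below checks whether a proposed emission intersects the protected aeronautical band. The band edges come from the article; the example PMSE carrier frequencies are hypothetical, chosen only to demonstrate the check:

```python
# Protected aeronautical band from the article (MHz), home to DME,
# Secondary Radar (1030/1090 MHz), ADS-B, multilateration and ACAS/TCAS.
AERONAUTICAL_BAND = (960.0, 1164.0)

def overlaps_protected_band(f_low: float, f_high: float,
                            band: tuple = AERONAUTICAL_BAND) -> bool:
    """Return True if the emission [f_low, f_high] MHz intersects the band."""
    lo, hi = band
    # Two intervals overlap iff each starts before the other ends.
    return f_low < hi and f_high > lo

# A hypothetical 200 kHz-wide PMSE audio link centred at 1000 MHz:
print(overlaps_protected_band(999.9, 1000.1))   # True: inside the protected band
# A hypothetical link well above the band:
print(overlaps_protected_band(1200.0, 1200.2))  # False: clear of 960-1164 MHz
```

In practice, regulators assess far more than simple band overlap (power levels, duty cycles, guard bands, aggregate interference), which is exactly why the safety cases mentioned below are so hard to reuse once new users enter the band.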

Aviation equipment has to comply with strict requirements, and stringent rules apply to personnel training and licensing. Sharing a frequency spectrum that was so far reserved for a tightly regulated activity such as aviation raises many questions. The technical requirements for equipment are one aspect, but there are also concerns about liability in case of accidents or incidents where aviation systems suffer interference from non-aviation equipment, and about the validity of safety cases developed before the entrance of these new stakeholders.

It is vital for safety that aviation systems are not compromised by the sharing of a frequency band. To ensure these concerns are heard before a final decision is taken, IFATCA and other major stakeholders have formally raised them. In addition, IFATCA has requested its European Member Associations to contact their Aviation Authorities and invite them to participate in the Public Consultation that will end in September 2019.


Just Culture: Learning or Punishing?

By Job Brüggen, Safety Officer of LVNL (Air Traffic Control The Netherlands).

Our feature articles on Just Culture, triggered by the conviction of an Air Traffic Controller in Switzerland, aim to trigger thoughts and ideas for how to proceed. This article is written as a personal opinion of the author and does not necessarily reflect the opinion of LVNL or that of IFATCA.

The Federal Court has given its verdict and this time it is final. The conviction of a Swiss controller strongly stirred the aviation community. Because of a missed “readback/hearback”, two aircraft came closer than our standards prescribe, and two electronic safety nets, STCA for the controller on the ground and TCAS for the pilots in the air, saved the day, as they were designed to do. The controller on the ground was found guilty, through ‘unconscious negligence’, of disturbance of public transport and fined 60 days. End of story?

Let us ask ourselves what the effect of this verdict will be on all professionals in jobs that are meaningful to the general public and also carry certain risks to that same public. Policemen, doctors, nurses, train drivers, pilots and controllers are professionals, but also humans. And humans make mistakes. If we want these people to perform these jobs for us, we must find a way to deal with these inevitable errors.

I reckon there are five objectives of penal law:

  1. Specific prevention – to prevent the wrongdoer acting again by teaching him/her a lesson;
  2. General prevention – to deter other people from doing the same thing. That is why punishments or executions used to be very public;
  3. To add misery to the perpetrator – compensating for the wrong act in the view of (and on behalf of) the general public and underlining the privilege of the State to enforce the law, lest vendettas or family gangs have their way again;
  4. To compensate for the bad things that have befallen those who suffered from the bad act. That is, if compensation is at all reasonably possible;
  5. To show the general public that the law itself is being upheld – and thus maintaining faith that the justice system actually works.

Not being a lawyer, I am sure there are more goals that the justice system serves, but these five are generally accepted to be laudable and indeed useful for a society to maintain order in a country or state.

Specific prevention and general prevention (objectives one and two) are unlikely to be served well here. Missing a readback/hearback is quite common in aviation and happens hundreds of times every day. Communication through half-duplex VHF channels is arguably one of the weakest links in our system – it still amazes me that we are using it. The aviation community is at best reminded that pilots and controllers do indeed have serious accountabilities for ensuring safety. But they are humans and not infallible – the very reason STCA and TCAS were invented in the first place.

Objectives three and four (adding misery and compensating) completely miss the point here. There were no casualties, so what was there to compensate for? The Federal Court states that danger was created, and I think nobody will dispute that. Flying inherently carries risk, and that risk already starts when the pilot gets out of bed at four in the morning to get to the airport and prepare for the flight.

That leaves us with objective number five: to show that the law is being upheld. How many people will feel more comfortable with the idea that Swiss controllers may report fewer incidents for fear of prosecution? It is legitimate to assume that Swiss controllers already report less, or with fewer details, than their colleagues in other countries, thus hampering the self-learning abilities that made aviation – inherently risky – so incredibly safe.

Note that nowhere do I judge whether the acts of the controller or pilot were good, normal, weak, reckless or exceptionally stupid. The Swiss judicial system, ticking like a Swiss watch, concluded that the behavior of the controller was negligent and not in accordance with professional standards, brutally brushing aside fundamental systemic questions: was the behavior of the controller seen as ‘normal’ in the community of controllers? Are other controllers experiencing the same events? Does the system lure controllers into a spring-loaded trap waiting to snap shut? What is the likelihood of this happening again? With this verdict, answering the question of how we can prevent this systemic safety flaw from appearing again will be seriously hampered – not a pleasant thought for such a civilized country as Switzerland. My favorite phrase ‘learning is safer than punishing’ surely does apply.

“Never let a crisis go to waste”. It takes insight and courage to question why things are done the way they are. With the final verdict, there now is a great opportunity for the Swiss people to reflect and ask themselves what they prefer: learning or punishing?

Job Brüggen
23 July 2019


Just Culture: Where are we going now?

IFATCA will be publishing several short articles on our website in the coming days. The articles will be asking important questions about the current status of Just Culture, triggered by the conviction of an Air Traffic Controller in Switzerland. The purpose is to trigger thoughts and ideas on how to proceed.

To describe one of the operational dilemmas created by convicting individuals for being involved in incidents, we would like to tell a story that could happen to any of us:

Following the Swiss Federal Court verdict against an air traffic controller for an incident where nobody was hurt and no material damage was done, a conflict has materialized between the duty of the judiciary and the needs of a safety-relevant reporting system in complex systems. Through this, aviation is subjected to a stress test.

You are driving your car on a small country road you are not familiar with. You suddenly find yourself driving 70 km/h past houses with small children playing on the sidewalk. You realise you are in a village, but you missed the road sign. Looking back, you see it is there, but it was mostly hidden by the branches of a tree growing out towards the street.

You are a responsible driver and want to prevent somebody else falling into the same trap and possibly hitting a child, so you drive to the local authorities (a police station) and report your experience, arguing that one day someone might hit a child involuntarily. The mayor is grateful and will trim the tree, and the parents of the local kids would probably agree, but the police officer says: you were driving 70 in a 50 km/h zone? Here you go: a 150 EUR fine!

So, tell me, what are you going to do next time you find yourself in a similar situation? Go to the police again?

Is this the way we want to go in the future of Air Traffic Control?

It looks like today, the common law, which is applicable to every citizen, is also applied to an air traffic controller who reports an incident. If this is the case, then perhaps you should do the same as most normal citizens do: not report your own mistakes or violations of the law to the authorities, whether that is your regulator or the police. You should not have to incriminate yourself; there are even laws for this (like the Fifth Amendment in the USA).

The danger of all this:

Once again, common sense means reporting incidents to prevent them from becoming accidents. Our authorities are implementing Just Culture to protect us from disciplinary action when reporting and investigating incidents, but this protection should also have been extended to the judicial level. Failure to do so treats us just like normal citizens before the law; but then, following that logic, we should act like most normal citizens too, and that means keeping our mistakes to ourselves.

We need to make the case for a change in the law for professionals, similar to the recent Italian laws for medical doctors: you should not be punished for doing your job according to best practices, nor for reporting and talking about your honest mistakes while performing your job. But this needs to be done for EVERY country that wants to apply Just Culture and a free incident reporting system.

Achieving this is one of IFATCA’s top priorities.


Single European Sky III – Mission Possible?

Click the image to download the report (PDF)

The Single European Sky (SES) concept was initially introduced by the European Commission in 1999 to tackle the inefficiencies of the European Air Traffic Management (ATM) system and to ensure it could meet future demand for air travel effectively. However, despite the introduction of two regulatory frameworks and implementation initiatives, SES I in 2004 and SES II in 2009, we are still a long way away from the full implementation of the SES.

IFATCA has now compiled a white paper, in which it presents its views on the reasons behind this delay and gives five recommendations to achieve an interoperable, standardised and efficient SES and ATM system, without compromising safety. The main reasons behind the failure to implement the SES are the lack of an agreed long-term vision and strategy for the SES, an inefficient legal framework which reinforces the idea of short-term performance targets, the lack of political will amongst Member States to break free from national boundaries, and the absence of technological and procedural standards to ensure Europe-wide interoperability.

IFATCA is committed to and has been supporting the SES since its inception. We strongly believe that the SES is possible. However, the onus is on all the stakeholders to collaborate, leave vested interests aside and find a way forward, which avoids the mistakes of the past and addresses the current problems of the ATM system. Only then will the SES become a reality.


Working conditions of controllers in Cyprus

ATCEUC and IFATCA criticize the working conditions of the ATCOs in Cyprus

ATCEUC and IFATCA express their support to the Air Traffic Controller who was injured during the collapse of the ceiling on 13 June 2019 in the Area Control Centre of Nicosia. ATCEUC and IFATCA also express their sympathy with the employees of the Department of Civil Aviation Cyprus over the effects that this accident has on their working environment. It will increase pressure on the workforce, reduce morale, and create further delays and safety concerns in the area.

Ceiling collapse in Nicosia ACC on 13 June 2019 – photo CYATCU

ATCEUC and IFATCA have criticized the working conditions in Cyprus for more than 10 years. The interventions did not result in actions from the national or European competent authorities. In the opinion of ATCEUC and IFATCA the current situation is the result of years of mismanagement and underinvestment.

The Cypriot Air Navigation Service Provider is understaffed, works with antiquated equipment, and operates from the third floor of an abandoned office building with very little space. Furthermore, the operational environment is characterized by many different political interests, e.g. the lack of communication between Ankara and Nicosia, making it difficult to maintain a safe and orderly flow of traffic.

Unfortunately, in some cases it takes an accident to show decision makers that action is needed:

ATCEUC and IFATCA call upon the European Commission, Eurocontrol, airlines, the Cypriot government and the DCA Cyprus to invest in significantly improving the working conditions for our Cypriot friends and colleagues.
We suggest that the European Commission suspend the performance scheme for Cyprus and engage in negotiations with the Cypriot authorities to develop a sufficiently funded national performance plan.

A PDF version of the press release can be downloaded here.


360 Rue Saint-Jacques,
Suite 2002,
Montreal, Quebec,
Canada H2Y 1P5


Tel: +1 514 866 7040
Fax: +1 514 866 7612
Email: [email protected]
Hours: 0900-1700 (EST) Mon-Fri

Terms & Conditions  |  Privacy Policy

Copyright 2021 © All Rights Reserved