From the archives: Computers in Air Traffic Control

This is another piece from my archives. This was written for Personal Computer World magazine, published in the UK by VNU. It followed on from the one on fly-by-wire, so I’m going to guess this piece appeared in 1991. By that time I was doing a lot of work for IBM, especially its RS/6000 and AIX division which was engaged on the New En Route Centre (NERC) ATC project.


COMPUTERS IN AIR TRAFFIC CONTROL

While you’re cruising at 30,000ft, snoozing, suffering the in-flight movie or trying to keep down the airline food, your safety doesn’t totally depend on the person up at the sharp end waggling the stick. Down on the ground – often buried beneath it – air traffic controllers peer into glowing green dustbin lids, watching over aircraft represented by radar blips, making sure they don’t bang into each other.

Air Traffic Control centre
US Marine Corps Air Station Cherry Point, North Carolina, USA

The controller’s job is to assign each aircraft a flight level (altitude) and heading, responding to requests from pilots for higher levels (because they want to save fuel) or lower levels (because they want to land), and ensuring that the planes don’t wander outside controlled airspace and the designated airways. It’s a task that has often been compared to 3D chess, but the complexity is usually much higher, the pieces don’t stay put while you decide what to do, and the consequences of making a major blunder are unthinkable.

This, then, would seem to be an application ripe for computerisation. Yet air traffic control (ATC) has remained generally resistant to the lure of silicon assistance. That’s not to say that computers aren’t involved in the process: air traffic would grind almost to a halt if all the computers used by ATC authorities went down. But the level of automation is often surprisingly low. That is changing, though, and what’s driving the change is the advent of new, powerful processors and high-resolution displays.

There are compelling reasons for bringing computers into the loop. For one thing, their ability to look ahead, extrapolating from any given situation, should help make the skies safer by giving earlier warning of collision threats.

Some air traffic control authorities are looking to computers as a way of reducing staffing levels and riding out strikes. It has been suggested that if a fully integrated European ATC network becomes a reality, UK controllers could, for example, maintain control of their aircraft through France, should the French controllers indulge in their hobby of striking during peak holiday periods.

Congestion will be eased because with more accurate position reporting and the computer providing another pair of eyes, separation rules can be relaxed, effectively providing more airspace. The CAA currently demands five nautical miles horizontal and 1000 ft vertical separation between aircraft, but in other parts of Europe, with sparser radar coverage and less sophisticated ATC facilities, separation can be as much as 60 miles. Having a cool, logical computer assign flight paths and flight levels, rather than a harassed human, should also help make more efficient use of the airspace.
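To make the arithmetic concrete, here is a minimal sketch (not any real ATC algorithm, and using invented traffic) of the kind of look-ahead check described above: extrapolate each aircraft’s track a few minutes into the future and flag any pair predicted to breach the 5 nautical mile / 1000ft minima.

```python
from dataclasses import dataclass
from itertools import combinations
from math import hypot

# Illustrative only: a toy conflict probe using the 5 nm / 1000 ft minima
# mentioned above. Positions are nautical miles on a flat local grid,
# speeds are knots, altitudes are feet; real systems use far richer models.

HORIZONTAL_MIN_NM = 5.0
VERTICAL_MIN_FT = 1000.0

@dataclass
class Track:
    callsign: str
    x_nm: float          # east position, nautical miles
    y_nm: float          # north position, nautical miles
    vx_kt: float         # east ground speed, knots
    vy_kt: float         # north ground speed, knots
    altitude_ft: float

def position_at(track: Track, minutes: float) -> tuple[float, float]:
    """Straight-line extrapolation of the track's position."""
    hours = minutes / 60.0
    return (track.x_nm + track.vx_kt * hours,
            track.y_nm + track.vy_kt * hours)

def predicted_conflicts(tracks: list[Track], lookahead_min: float = 10.0,
                        step_min: float = 0.5) -> list[tuple[str, str, float]]:
    """Return (callsign, callsign, minutes ahead) for each predicted breach."""
    conflicts = []
    for a, b in combinations(tracks, 2):
        if abs(a.altitude_ft - b.altitude_ft) >= VERTICAL_MIN_FT:
            continue  # vertically separated throughout (level flight assumed)
        t = 0.0
        while t <= lookahead_min:
            ax, ay = position_at(a, t)
            bx, by = position_at(b, t)
            if hypot(ax - bx, ay - by) < HORIZONTAL_MIN_NM:
                conflicts.append((a.callsign, b.callsign, t))
                break
            t += step_min
    return conflicts

if __name__ == "__main__":
    traffic = [
        Track("BAW123", 0.0, 0.0, 420.0, 0.0, 33000.0),
        Track("AFR456", 60.0, 2.0, -430.0, 0.0, 33000.0),   # head-on, same level
        Track("DLH789", 10.0, 40.0, 0.0, -380.0, 37000.0),  # separated vertically
    ]
    for a, b, t in predicted_conflicts(traffic):
        print(f"Predicted loss of separation between {a} and {b} in ~{t:.1f} min")
```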

ATC computers may draw flak from hapless travellers, whose long-term stays in departure lounges are often blamed on computer malfunctions, but it’s still a seductive market for computer manufacturers. There are big bucks to be made in this industry: by definition this is a market that demands high performance (that is, expensive) hardware backed up by extensive (that is, expensive) support and maintenance. For example, Unisys recently closed a deal worth $47 million to supply 2200/600 mainframes for the West German ATC system. And last year the firm landed a $44.9 million contract with the FAA to upgrade Automated Radar Terminal Systems computers.

HUMAN BURDEN

The issue of ATC automation is becoming ever more urgent as the skies fill up. The growth in commercial air traffic has exceeded all predictions: by the end of the century there may be twice as much traffic as in the late 1980s. Around 400 million people will travel by air in Europe this year. By 2010 that could rise to 1 billion – a level of traffic that no-one has planned for.

Something like a third of European flights are delayed (that is, they leave the gate more than 15 minutes after the scheduled departure time). Charter flights probably have a worse record but there is little data on them. Some estimates put the cost to airlines of scheduled flight delays in Europe at $980 million a year, mostly due to 330,000 wasted aircraft hours. Gellman Research, a firm of US consultants, puts the cost to the European economy, in terms of lost working hours and disrupted distribution, at around $400 million.

Inside portable military air traffic control radar room, US Marine Corps Air Station Cherry Point, NC, USA

In spite of all that, much of the ATC work is still down to what the RAF likes to call the Mk.1 eyeball – in this case, people staring at screens. Certainly technology is playing its part: for example, aircraft will take on some of the burden themselves through the use of position reporting equipment and the introduction of airborne collision alert and avoidance systems. Down on the ground, however, much of the equipment is old and tired. The air traffic centre that controls aircraft in the lower half of Britain has only just replaced computers that were first brought online in 1975, and which used technology that was ten years old even then.

A significant proportion of a controller’s time is taken up giving clearances to aircraft. This involves things like telling the pilot which altitude to fly at, course to steer and so on. Although English is the international aviation language, if pilot and controller don’t share the same accent, there is plenty of scope for mistakes and incomprehension. This isn’t helped by crowded and noisy radio frequencies. In an effort to overcome this problem, pilots read the information back to the controller, to check that it’s been received and understood correctly and completely. But it’s not unknown for this not to happen, or for the readback to be incomplete without the controller noticing. And, of course, the readback adds to the time taken and the workload.

Air Canada has developed a datalink system which will allow its aircraft to receive ATC clearances as data direct from the ATC computers. The airline has been testing the system in its Boeing 767s flying the North Atlantic since 1985: they receive eastbound clearances from Gander and, more recently, westbound clearances from Shanwick.

The messages emerge from a small printer mounted on the centre console of the flight deck. Outgoing messages from the aircraft are typed on a small alphanumeric keyboard fitted with function keys for several of the standard operations. On the Airbus A320 – an aircraft bristling with computers – crews will be able to enter data using a touch-sensitive screen.

The information can include: flight plan revisions and acceptance; fuel and weight and balance data; a weather briefing; departure time and estimated time of arrival (ETA); and flight level changes. Some of the data can be triggered automatically, relieving the crew of the job and so further easing their workload. For example, take off time can be triggered by the release of the brakes, landing time by gear compression and the time of the aircraft’s arrival at the gate by the opening of the passenger door.
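As a rough sketch of how that automatic triggering might be wired together (the event names and message formats below are invented, not Air Canada’s), the logic amounts to little more than time-stamping a downlink message whenever a monitored sensor changes state:

```python
from datetime import datetime, timezone

# Illustrative only: invented event names and message formats, showing how
# the sensor events described above (brake release, gear compression, door
# opening) could each trigger an automatic datalink report.

def downlink(message: str) -> None:
    """Stand-in for the aircraft's datalink transmitter."""
    print(f"DOWNLINK: {message}")

def utc_now() -> str:
    return datetime.now(timezone.utc).strftime("%H%M:%S")

# Map a sensor event to the report it should trigger.
AUTOMATIC_REPORTS = {
    "parking_brake_released": lambda: downlink(f"TAKE-OFF TIME {utc_now()}Z"),
    "main_gear_compressed":   lambda: downlink(f"LANDING TIME {utc_now()}Z"),
    "passenger_door_opened":  lambda: downlink(f"ARRIVED AT GATE {utc_now()}Z"),
}

def on_sensor_event(event: str) -> None:
    """Called by the aircraft systems whenever a monitored sensor changes state."""
    report = AUTOMATIC_REPORTS.get(event)
    if report is not None:
        report()  # no crew action needed

if __name__ == "__main__":
    # Simulated flight: departure, arrival, gate.
    for event in ("parking_brake_released", "main_gear_compressed",
                  "passenger_door_opened"):
        on_sensor_event(event)
```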

The firm claims not to have received a single garbled or corrupted message. During the trial, a belt and braces system of receiving verbal confirmation and making voice readbacks is being used. Air Canada hopes to do away with the confirmation soon, although readbacks will still be made verbally, rather than by data downlink, until Gander equips itself to receive the transmissions.

ADS

The same datalink also provides position reports – something else normally done over the radio. These reports can then be fed to a computer which in turn produces a kind of ‘pseudo-radar’ display. The system is known as Automatic Dependent Surveillance (ADS), and by employing satellite relay of the data it can provide information on aircraft positions in areas out of range of normal radar installations – for example, on trans-oceanic routes.

The only thing stopping ADS becoming a major force in air traffic control is that not all aircraft have the necessary equipment: an airborne datalink is far from being standard equipment on airliners, not least because some of the flying hardware is 15-20 years old, and upgrades tend to concentrate on essential items that bring a tangible financial return – more efficient engines, for example.

Europe is also working on an ADS system. The Prodat project, started in 1987 by Eurocontrol’s Bretigny centre, has already carried out practical trials in co-ordination with the European Space Agency, and three airlines – Sabena, Saudia and Air France – have started equipping their fleets for the system.

With an ADS system, an ATC centre can request the crew of an aircraft to key data. But most of the exchanges between ground and aircraft will be automatic, with the aircraft sending regular reports and the ground installation ‘polling’ all the aircraft in its region for updates. The rate at which an aircraft is polled can be modified automatically depending on its position – say, every five minutes while it’s out over the ocean, reducing to two minutes once it has entered the control centre’s flight information region and maybe as little as a few seconds if there is other traffic in the area or the aircraft is nearing its destination.
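A minimal sketch of that adaptive polling schedule could look like the following; the intervals come straight from the figures quoted above, while the decision order and the flags are my own invention:

```python
# Illustrative only: choose an ADS polling interval from an aircraft's
# situation, using the example figures quoted above. The flags and the
# decision order are invented for this sketch.

def polling_interval_seconds(in_oceanic_airspace: bool,
                             inside_fir: bool,
                             nearby_traffic: bool,
                             nearing_destination: bool) -> int:
    """How long the ground station waits before polling this aircraft again."""
    if nearby_traffic or nearing_destination:
        return 10           # "as little as a few seconds"
    if inside_fir:
        return 2 * 60       # every two minutes inside the flight information region
    if in_oceanic_airspace:
        return 5 * 60       # every five minutes out over the ocean
    return 5 * 60           # default to the most relaxed rate

if __name__ == "__main__":
    print(polling_interval_seconds(True, False, False, False))   # 300 s mid-ocean
    print(polling_interval_seconds(False, True, False, False))   # 120 s in the FIR
    print(polling_interval_seconds(False, True, True, False))    # 10 s with traffic nearby
```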

The system isn’t yet perfect. The data is derived from the aircraft’s flight management system, which in turn obtains position information from satellite navigation aids and its own inertial navigation system. Both of the latter are subject to errors – inertial systems, in particular, tend to ‘drift’. One proposal is that aircraft are represented not by dots on the controller’s screen but by circles, the size of the circle representing the probable margin of error.
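The error-circle idea itself is simple enough to sketch: grow the circle for as long as the position has gone unconfirmed. The drift rate and base error below are made-up figures, not real inertial navigation specifications.

```python
# Illustrative only: radius of the 'probable error' circle proposed above,
# growing with the time since the last confirmed position fix. Both figures
# are invented, not real inertial-navigation performance numbers.

ASSUMED_DRIFT_NM_PER_HOUR = 1.5   # hypothetical INS drift rate
BASE_ERROR_NM = 0.3               # hypothetical error even with a fresh fix

def error_circle_radius_nm(minutes_since_last_fix: float) -> float:
    return BASE_ERROR_NM + ASSUMED_DRIFT_NM_PER_HOUR * (minutes_since_last_fix / 60.0)

if __name__ == "__main__":
    for minutes in (0, 30, 120):
        print(f"{minutes:3d} min since fix -> circle radius "
              f"{error_circle_radius_nm(minutes):.2f} nm")
```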

The US is also to make use of datalinks, although in this case it will be while the aircraft are still on the ground. San Francisco will be the first airport to benefit from the new predeparture clearance (PDC) system that will replace the lengthy radio call which normally precedes aircraft leaving their gates. It employs the Arinc datalink carried by many aircraft for communicating with their own airlines (to update ETAs, report the need for maintenance before they land, and so on). This links with a terminal in the control tower. Clearances can be received in the cockpit via a printer or electronic display.

RELIABILITY / SAFETY

Now, if you’ve got any kind of experience with computers, the thought of some software shuffling thousands of souls around the world’s crowded skies might leave you feeling a tad uneasy. We all know that computers can fail. On one day in July 1989, the ATC computer at Los Angeles airport failed 104 times! Even if the hardware holds up, the inevitably complex software is sure to contain bugs: it could give a whole new meaning to the term ‘computer crash’. That’s why one study has called for programmers working on safety-critical applications to be licensed.

And a computer problem doesn’t have to endanger life for it to be undesirable. One aspect of our reliance on computers is that we tend to develop the system that the computer supports way beyond our ability to run that system should the computer fail.

We’ve already been given a taste of the kind of chaos that can result from an ATC computer problem. At 8.00am on 28 July of this year the computers at London Air Traffic Control Centre (LATCC) hit a bug. LATCC, at West Drayton, is responsible for the London Flight Information Region (FIR) which covers the lower half of the UK – possibly one of the busiest pieces of airspace in the world. According to the Civil Aviation Authority, safety was not compromised. The primary radar displays, which provide the main blip on the controllers’ screens but carry little other information, were unaffected. But data from the secondary radars – identification code, callsign, and aircraft height – was missing.

The problem persisted for around two hours. LATCC immediately issued a take-off ban in the London area, and contacted aircraft overseas that were due to come into London, telling them not to take off. Towards the end of the two-hour problem period a manual system was introduced, but it worked at a much slower rate than the computers normally provide. The cancelling of so many take-off and landing ‘slots’ led to flights being cancelled, rerouted or massively delayed. Those that did make it in to Heathrow had to cope with waits of up to 1.5 hours before they could pull up to a vacant stand and disembark their passengers. And scores of planes were left in the wrong places – British Airways calculated that it took several days to sort out the problem.

And what was wrong with the computers? At the time of writing, nobody seemed too sure. The CAA was quick to point out that it was a software fault: anything else would have been embarrassing considering the Authority’s recent investment in new computers.

CAA COMPUTERS

The CAA has embarked on a £600 million upgrade of the UK’s air traffic system. Part of this has been a £22 million investment in new computer hardware for LATCC.

The centre first felt the benefits of computers in the early 1970s when it received its IBM 9020D triplex system. This system, and the software that came with it, was originally developed by IBM for the US Federal Aviation Administration (FAA): an agreement between the FAA and the Foreign Office meant that the software was supplied free to the CAA. The ATC system was based on three IBM 360/65 computers working in triplex, the idea being that any one computer could fail and the other two would carry the workload. And it soldiered on until earlier this year, not always behaving impeccably.

On 18 June, control of the traffic in the London FIR switched to two new machines, although ‘new’ may be putting it a little strongly. The system now uses dual IBM 4381 machines, running in a ‘mirrored’ configuration to provide fault tolerance and a safety back-up. If a fault does occur in the online machine, the other is designed to take over in under 10 seconds. The change was prompted, to some extent, by a similar upgrade in the US. The Americans changed over to liquid-cooled IBM 3090s: alas, there simply wasn’t the room at West Drayton for these hulking monsters, and so the CAA went for the smaller, air-cooled 4381s.
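The principle behind the mirrored pair can be illustrated with a toy heartbeat monitor (this bears no relation to the 4381s’ actual mechanism, which isn’t described here), in which the standby machine promotes itself if the online machine falls silent for longer than the failover budget:

```python
import time

# Illustrative only: a toy standby monitor for a mirrored pair, showing the
# idea of taking over within a fixed budget (the 'under 10 seconds' quoted
# above). It bears no relation to the actual IBM 4381 mechanism.

FAILOVER_BUDGET_SECONDS = 10.0

class StandbyMonitor:
    def __init__(self) -> None:
        self.last_heartbeat = time.monotonic()
        self.active = False

    def heartbeat_received(self) -> None:
        """Called each time the online machine signals that it is healthy."""
        self.last_heartbeat = time.monotonic()

    def check(self) -> None:
        """Called periodically on the standby machine."""
        silence = time.monotonic() - self.last_heartbeat
        if not self.active and silence > FAILOVER_BUDGET_SECONDS:
            self.active = True
            print(f"No heartbeat for {silence:.1f}s - standby taking over")

if __name__ == "__main__":
    monitor = StandbyMonitor()
    monitor.heartbeat_received()
    monitor.last_heartbeat -= 11.0   # simulate 11 seconds of silence
    monitor.check()                  # standby promotes itself
```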

In all, six machines, representing a quarter of the budget, were bought. The second pair is being used for testing and development work, and the final two, bought secondhand to save money, are used purely for training.

The CAA didn’t even consider writing new software. The existing 1.3 million lines of code took an estimated 2000 man-years to develop and test, and the CAA calculated that any new software wouldn’t go live until around the year 2000. So the decision was made to re-host the existing code, even if it is 20 years old or so. This is what soaked up about half of the budget. The software grew to around 1.7 million lines of code, but it can now handle twice as many aircraft at one time – around 3000. This software is now being combed ‘line by line’ for the fault that caused that two-hour downtime.

The CAA says the software, some of it in Basic, is constantly updated: but of course this persistent tweaking can often lead to software that is ever more twisted and complex and hard to maintain.

The good news is that the software doesn’t have any direct input to the controlling of aircraft. It is used mainly as a huge database, holding details of all scheduled flights and flight plans. It prints flight progress strips, which contain the details of each flight, and these are handed to the controllers. Without the computers these strips have to be written by hand, which is what causes the delays.
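To give a flavour of that job, here is a toy example of turning a flight-plan record into a printable strip. The record fields and the strip layout are invented for illustration; real strips follow a strict standard format and carry far more detail.

```python
from dataclasses import dataclass

# Illustrative only: an invented flight-plan record and strip layout, showing
# the flavour of turning a database entry into a printed flight progress strip.

@dataclass
class FlightPlan:
    callsign: str
    aircraft_type: str
    departure: str
    destination: str
    requested_level: str      # e.g. "FL330"
    estimated_time: str       # e.g. "1405" over the relevant fix
    route: str

def format_strip(plan: FlightPlan) -> str:
    """Render a one-line 'strip'; by hand, this is what controllers fall back to."""
    return (f"{plan.callsign:<8}{plan.aircraft_type:<6}"
            f"{plan.departure}-{plan.destination:<6}"
            f"{plan.requested_level:<7}{plan.estimated_time:<6}{plan.route}")

if __name__ == "__main__":
    plan = FlightPlan("BAW123", "B747", "EGLL", "KJFK",
                      "FL330", "1405", "UL9 STU UN546")
    print(format_strip(plan))
```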

The system was tested for 600 hours before being allowed on line, with 100 deliberate faults being introduced to test its robustness. The testing had to be extremely rigorous because the new system must be more reliable than the one it replaced. The 9020D-based system was taken down every night for maintenance, but now LATCC plans only one short break a week – and later perhaps once a month – otherwise the machines will be running 24 hours a day, every day.

ADIS

While West Drayton continues to use printed flight progress strips, the next logical step is to display the strips directly on computer monitors. ATC authorities in other countries have adopted this approach, and the CAA is itself using it in its Airport Display Information System (ADIS) which controls aircraft on the ground and within five miles of four airport terminals.

ADIS provides approach and ground controllers with flight, weather and other information on a single screen, presented using the DECwindows graphics system. Previously they had to glean this data from a variety of sources, although it is slightly worrying that these sources may no longer be available in the event of the computers going down.

This real-time system was developed by SD, the defence and aerospace arm of software house SD-Scicon, under a £2 million contract for the CAA. It is based on paired Digital MicroVAX 3400 machines, installed at Heathrow, Gatwick, Stansted and Manchester airports. These machines are connected to the National Area System database at West Drayton and in turn feed back information to update the database. The links are provided by the Civil Aviation Packet Switching Network, an X.25 network linking all CAA sites.

EFMS

Other research is going on at the Royal Aerospace Establishment (RAE) facility in Bedford. As part of its air traffic management (ATM) research, the establishment is working on an experimental flight management system (EFMS). At the moment this is concentrating on just one part of the system – trajectory management – using 68030-based computers running software written in Ada. But the RAE is looking towards having a complete system by the end of 1991 that will have downlink capabilities for use with ADS systems.

An important part of this development will be to have an FMS capable of negotiating a ‘timed trajectory’ through space – obtaining the clearances automatically from ATC computers. For this the RAE is having to add an extra dimension to current flight management systems: most aircraft FMSs report position and height, but the EFMS project is adding time to that list. A system capable of providing information accurate to 0.1 nautical miles and five seconds has been demonstrated, which the RAE claims could halve the amount of airspace needed by aircraft to maintain safe separation.
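To illustrate what ‘adding time’ means in practice, here is a toy conformance check against a four-dimensional clearance point, using the 0.1 nautical mile and five-second figures above as tolerances. The data structure, and the altitude tolerance, are my own framing rather than the EFMS design.

```python
from dataclasses import dataclass
from math import hypot

# Illustrative only: checking an aircraft's actual state against a 4D
# (position + time) trajectory point, using the 0.1 nm and five-second
# figures quoted above as tolerances. Not the RAE EFMS design.

POSITION_TOLERANCE_NM = 0.1
TIME_TOLERANCE_S = 5.0

@dataclass
class FourDPoint:
    x_nm: float        # east of some local reference
    y_nm: float        # north of some local reference
    altitude_ft: float
    time_s: float      # seconds past a reference time

def conforms(actual: FourDPoint, cleared: FourDPoint,
             altitude_tolerance_ft: float = 200.0) -> bool:
    """True if the aircraft is where the clearance says, when it says."""
    lateral_ok = hypot(actual.x_nm - cleared.x_nm,
                       actual.y_nm - cleared.y_nm) <= POSITION_TOLERANCE_NM
    vertical_ok = abs(actual.altitude_ft - cleared.altitude_ft) <= altitude_tolerance_ft
    timing_ok = abs(actual.time_s - cleared.time_s) <= TIME_TOLERANCE_S
    return lateral_ok and vertical_ok and timing_ok

if __name__ == "__main__":
    cleared = FourDPoint(42.0, 17.0, 33000.0, 600.0)
    actual = FourDPoint(42.05, 17.02, 33050.0, 603.0)
    print("conforming" if conforms(actual, cleared) else "off trajectory")
```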

A ground simulation of the system’s ability to transmit and execute four-dimensional clearances will come at the end of next year, with an airborne demonstration following sometime in 1994.

Although the system could be fully automated, the RAE is keen to keep humans as part of the process of requesting and issuing clearances. However, it does recognise the need to reduce controller workload and is looking at direct voice input as one option.

Links between the ATC centres are often simple telephone lines, and these have proved an inadequate method of warning other countries of traffic build-ups originating in your own area. Controllers can be suddenly overwhelmed by an unexpected influx of aircraft: that’s what led to the Captain of one British Airways TriStar suddenly having his windscreen filled with an uncomfortably close view of a Bulgarian Tu154.

Controllers need advance warning. Ironically it is the comparatively sleepy centre in Jersey – handling just 110,000 aircraft movements a year, compared to LATCC’s 1.23 million – that is leading the way. Its £2 million system, based on telephone data lines, gives Jersey access to every flight plan filed on Heathrow’s computers, and similar information can be obtained from the French centre at Brest.

EUROCONTROL

The air traffic situation in Europe is complicated by the fact that there are so many national borders in such a small area, resulting in the airspace being watched over by 42 control centres in 23 countries. Eurocontrol is an organisation that was set up to help solve some of the problems by coordinating ATC activities throughout the continent.

In 1965, Eurocontrol established an experimental control centre in Bretigny, south of Paris. It has been running extensive real-time simulations that have emphasised the need for ergonomic displays. It has found, for example, that touch-screens are more effective than keyboards, and that displays need to be moved closer to the operators.

The centre is also looking at methods of displaying data: predictably, for a joint European venture, there are national differences. The UK favours a monochrome display for the radar and a colour display for data. West Germany prefers colour for both displays, and while France likes all-colour too, it wants both displays combined on a single monitor. That monitor, incidentally, is likely to be a 50cm-square Sony Trinitron model, driven by Thomson-CSF/Raytheon hardware and capable of producing a resolution of 2048 x 2048 pixels with good definition right out to the edges. The display makes use of windowed graphics and a mouse-driven pointer.

Mind you, it isn’t just fancy graphics that the controller needs from a computer, but help in making decisions. That’s why the centre is also exploring the possibilities of artificial intelligence. Its first move in this direction is the air traffic management strategic and tactical advisor (ASTA), a knowledge-based system initially aimed at flight level management at sector boundaries. There’s no intention to let the computer assign flight levels to aircraft, even when datalinks become common: that is still the controller’s job, and the computer is there simply to advise.
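A knowledge-based advisor of that kind boils down to rules applied to the traffic picture. The toy rules below are pure invention (nothing to do with ASTA’s actual knowledge base) but they convey the ‘advise, don’t decide’ division of labour:

```python
# Illustrative only: a toy rule-based advisor suggesting a flight level at a
# sector boundary. The rules are invented and bear no relation to ASTA's
# knowledge base; the point is that the output is advice for a controller,
# never an instruction sent to the aircraft.

def advise_flight_level(requested_fl: int, occupied_fls: set[int],
                        sector_ceiling_fl: int = 410) -> tuple[int, str]:
    """Return (suggested flight level, reason) for the controller to consider."""
    if requested_fl not in occupied_fls and requested_fl <= sector_ceiling_fl:
        return requested_fl, "requested level is free in the next sector"
    # Search nearby levels in 1000 ft (10 FL) steps, nearest first.
    for offset in (10, -10, 20, -20, 30, -30):
        candidate = requested_fl + offset
        if 0 < candidate <= sector_ceiling_fl and candidate not in occupied_fls:
            return candidate, f"requested level busy; FL{candidate} is free"
    return requested_fl, "no nearby level free; coordination required"

if __name__ == "__main__":
    suggestion, reason = advise_flight_level(330, occupied_fls={330, 340})
    print(f"Advise FL{suggestion}: {reason}")   # the controller decides whether to use it
```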

FAA SYSTEM

European controllers often look with envy at the US. In many ways it is similar to Europe in having a network of ATC centres – 20 en route control centres scattered across the country. But the US has succeeded in integrating them, very much like a distributed processing network, but with one office looking at the big picture.

Around 200,000 people on 3000 flights are travelling through America’s skies at any given moment. Every one of those flights is monitored by the FAA’s Central Flow Control Facility in Washington, which the administration claims is the ‘largest, most complex computer system in the world’. It draws its data from the 20 regional centres, and through the use of the Aircraft Situation Display, can find the location of any airborne aircraft with an accuracy of two to five minutes. The centre’s coordinating activities have greatly reduced the amount of time aircraft spend circling in stacks waiting to land. Back in the 1970s you might have found as many as 300 aircraft stacked at one time, burning expensive fuel, stressing out controllers and creating a collision hazard.

There is a certain irony in the fact that Central Flow reduces controller workload. It was introduced initially after Ronald Reagan sacked 11,400 air traffic controllers when they went on strike. Something was needed to help the fewer, less-experienced controllers who subsequently took on the job.

ADVANCED AUTOMATION SYSTEM (AAS)

Like LATCC, the FAA has also been busy upgrading its systems, but it’s also going somewhat further. The original US system, as adopted by the UK, was the child of IBM’s 30-year-old Air Traffic Control Organisation. The same outfit is now working with the FAA developing a $3.6 billion real-time system known as the Advanced Automation System – claimed to be one of the largest real-time systems under development. It is built around a new design of console based on IBM’s RS/6000 RISC-based computer running Ada-based software and driving the world’s highest resolution colour monitor. Consoles will be linked by 4-megabit and 16-megabit Token Ring networks. The project is about a year behind schedule thanks largely to bugs in the new version of AIX intended for the consoles, although IBM also blames changes in the ATC network.

Around 350,000 lines of code – a third of the total – have already been delivered to the FAA. The rest of the software will be delivered, tested and brought on-line over the course of the next 11 years. IBM has selected Ada as the language of choice. Indeed, Big Blue claims this is the largest-ever Ada project.

The first piece of hardware IBM delivered was the peripheral adapter module replacement item (PAMRI) which should come on-line in 1991. Its job is to collect data from remote subsystems and distribute it to the FAA’s 20 en route control centres. This is faster at transmitting radar and flight plan information than the system it replaces, and this will allow the FAA to add more radar and other peripheral systems to the network without overloading it, which should help to increase safety.

NASA is also in on the act. At its Ames Research Centre it is using computer simulation to test terminal area automation software. The Centre/Tracon Automation System (CTAS) is designed to increase landing rates by selecting optimum approach speeds and altitudes according to aircraft type. It will also save fuel by designing descents from cruise altitude that can be flown with the engines at idle.

The simulation, running on Sun SparcStation 1s, models Denver airport, and recent tests have used controllers borrowed from the real airfield. One of them said that the software mostly mirrored her own decisions, but that where they disagreed the computer generally came up with a good solution.

Something similar, if less ambitious, is already being introduced at Frankfurt in an attempt to squeeze every ounce of efficiency out of this grossly congested airport. It will help reduce the delays and increase flight frequencies while the subject of adding another runway gets the political football treatment.

Nervousness about hardware and software reliability means we are still a long way off the day when air traffic controllers can put their feet up, admire the colours on their radar displays and let the computers get on with the job of threading aircraft around each other. But at least computers are finally starting to do something about making our skies safer and less congested.

[ENDS]

 

 

 
