
Automation in railway control – The human factors


At 35,000 ft above the Atlantic Ocean, the pitot tubes (air speed sensors) iced up and failed (a known phenomenon), causing the autopilot to disengage, writes David Bickell.

The inexperienced co-pilot did not know how to react correctly to the situation. The captain returned to the cockpit after taking a rest but could not assimilate the scenario presented by the instruments in time to prevent the state-of-the-art Airbus A330 from plunging into the ocean with the loss of all 228 people on board. The fate of Air France flight AF447 in 2009 is a horrific example of the failure of human/system interaction in transportation systems.

Is there a parallel in the automation of train control? The introduction of computer technology and automation has not always produced the expected outcomes in the day-to-day operation of the railway. Two recent examples from the RAIB archive demonstrate the need to fully understand the human factors associated with the design and operation of modern systems.

Inherent flaws

In 2008, following resignalling work at Milton Keynes, a driver waiting to depart the station observed his signal change momentarily to green with a train visible in the section ahead. This wrong-side failure was traced to a design error that resulted in the omission of some computer data in a complex matrix of adjoining signal interlockings. The designer, design checker and signalling principles tester all failed to spot the error. Under different circumstances the error could have caused another ‘Clapham’.

In 2013, a cyclist was fatally injured by a train whilst using a bridleway level crossing equipped with Miniature Stop Lights near Witham station in Essex. Whilst the MSLs were correctly displaying a red aspect, some users had become intolerant of the long red aspect closure periods on this busy main line and were using their own judgement to cross safely.

The RAIB investigation revealed that, in response to drivers complaining about delayed clearance of signals approaching the nearby station, the closure time had been extended by the use of the 'non-stopping' setting for trains which were due to stop at Witham. Signallers were unaware of the implications for crossing warning time. Also, the designer of the Automatic Route Setting system applied a programming rule that was intended to delay the operation of the level crossing, without realising that the controls within the interlocking itself already provided for such a delay.

Automation of train control

Automatic route setting has been in place on the London Underground since 1958, when the first Programme Machine was introduced at Kennington. The machine contained a roll of plastic into which holes encoding timetable information had been punched, read by electrical contacts – old-fashioned and crude, but also robust and effective. The British Rail network introduced ARS as a computer-controlled add-on to the Integrated Electronic Control Centres from 1989.

Automatic train operation (ATO) was introduced with the opening of the Victoria Line in 1968. The Docklands Light Railway opened in 1987 fitted with ATO, without a 'driver' riding up front. Subsequently, the Central, Jubilee and Northern lines have been equipped with ATO, whilst the Victoria Line and DLR have received new ATO equipment.

ATO is generally considered best suited to self-contained metro systems. However, Network Rail has an ambitious scheme to introduce ATO to the core part of the Thameslink route which, in its entirety, contains all the elements of a main line, suburban and metro railway (issue 109, November 2013).

From December 2017, the Thameslink project team are planning to introduce Automatic Train Operation (ATO) through the core from St Pancras International to London Bridge. ATO is required to ensure optimum throughput of trains. However, to safeguard the passenger service in the event of a partial control system failure, train drivers are being provided with four modes of movement authority. The selected mode will depend upon the health of various components of the system, and a driver must be prepared to change over to another mode at any moment should the need arise.

The modes are: fully automatic driving; manual driving using movement authority information displayed in the cab; traditional driving in accordance with the lineside colour light signalling; and, finally, the 'proceed on sight' scenario using the special lineside POSA signals. The signallers' workstations will be more complex than hitherto, with the need to show the current mode of each train and to facilitate the use of the POSA signals.
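By way of illustration, the sketch below (in Python) shows one way such a fallback could be expressed: given the health of the relevant sub-systems, select the highest-performance driving mode still available. The mode names, health flags and selection logic are illustrative assumptions, not a description of the actual Thameslink design.

```python
from enum import IntEnum
from dataclasses import dataclass

class DrivingMode(IntEnum):
    """Four modes of movement authority, best first (names are illustrative)."""
    AUTOMATIC = 0          # fully automatic driving (ATO)
    CAB_MANUAL = 1         # manual driving on in-cab movement authority
    LINESIDE = 2           # traditional driving on lineside colour light signals
    PROCEED_ON_SIGHT = 3   # movement under the special lineside POSA signals

@dataclass
class SystemHealth:
    """Hypothetical health flags for the sub-systems each mode depends on."""
    ato_ok: bool             # ATO equipment (trainborne and trackside) healthy
    cab_signalling_ok: bool  # in-cab movement authority display healthy
    lineside_signals_ok: bool

def best_available_mode(health: SystemHealth) -> DrivingMode:
    """Pick the highest-performance mode the current failures still allow."""
    if health.ato_ok and health.cab_signalling_ok:
        return DrivingMode.AUTOMATIC
    if health.cab_signalling_ok:
        return DrivingMode.CAB_MANUAL
    if health.lineside_signals_ok:
        return DrivingMode.LINESIDE
    return DrivingMode.PROCEED_ON_SIGHT  # last resort: proceed on sight

# Example: ATO fails but cab signalling survives -> fall back to manual cab driving
print(best_available_mode(SystemHealth(False, True, True)).name)  # CAB_MANUAL
```

In practice such a decision would be governed by the signalling system and the operating rules rather than a simple look-up; the point being illustrated is the ordering of the fallbacks and the need for driver and signaller to know which one currently applies.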

There are challenges to ensure that drivers and signallers correctly interpret failure symptoms and react appropriately. However, unlike flight AF447, the risk here is most likely to be commercial. At worst, wrong decisions by driver/signaller will bring the service to a standstill.

Human Factors in Railway Control

With computers and automation taking over ever more functions of railway control, the management of human factors associated with system design and operation needs to be at the forefront of railway projects around the world. Lloyd's Register Rail has addressed this topical subject in a recent paper to a global audience, describing the research undertaken.

Dr Daniel Woodland (professional head of signalling with Lloyd’s Register Rail) and his colleagues Ajai Menon (principal consultant, signalling) and Harry Blanchard (principal consultant, human factors) have conducted a review of ‘Human Factors in Railway Control’ and presented their findings at a recent Middle East Rail conference in Dubai. A summary of their work follows:

Benefits and risks of automation

Automation, generally understood as 'the use of control systems and information technologies to reduce the need for human intervention', offers many attractions for a railway operation, whether in the control centre, on a station or on trains. It promises increased quality of output (including greater consistency and impartial, emotion-free decision making), together with a reduction in operator workload, improved levels of safety and potential cost savings.

However, along with these advantages come some potentially significant drawbacks. If the 'level' of automation is pitched wrongly, if the automation is programmed with parameters that turn out not to meet the real need, or if the automation itself fails, the system will respond incorrectly to the conditions. This can lead to a loss of situational awareness, competence and understanding among the now-overloaded supervising staff, and in turn to security risks and vulnerability.

The potential for human error to contribute to train accidents is significantly reduced by automation systems, but issues still exist. Errors in preparation and maintenance, a de-skilling of the workforce and communication errors can all lead to problems.

While humans are good at adapting to the circumstances through rapid thinking and action, they are not good at handling the stress and work overload when emergencies arise and intervention is required. In fact, they are particularly poor at the passive monitoring of systems – their attention can only be maintained for about half an hour.

Operators need to be seen as an active part of the system's activities if they are to contribute to their full potential – and be ready to intervene effectively when needed. To play that active role, they must understand what the system is doing and where they fit in.

It is important that the operator knows how to interpret and respond to information from the system. Advanced forms of training using simulations can assist with developing experience of low-probability events.

To ensure that these issues are addressed, all aspects of the Human Machine Interface must be considered during the early stages of design, with sufficient analysis undertaken early on to understand their impact on system function and performance. Early engagement with the rail operators helps to clarify what is expected, especially where it is not yet clear how the railway should be run and operated.

Limitations of technology

An over-reliance on automation systems can lead to disruption of the smooth operation of the railway service and, in the worst cases, to unsafe conditions and accidents.

There are two significant limitations within current technology and application know-how.

Firstly, the dynamics of railways in general – let alone specific applications in a particular railway’s environment (particularly if that is a new application) – are not yet fully understood. This means that automation solutions which will deal optimally with all possible situations cannot be completely specified.

Secondly, the integration of automation and humans within a system, and how to implement solutions that enable knowledge sharing and mutual support, is still in its early days. This is a limitation within current technology itself that restricts the ability to provide effective automation.

So automation should only be implemented where the technological capability, the understanding of the system and the ability to address related human factors are robust. Automation is not a desirable end in itself – it is a tool that, if appropriately applied, can assist in achieving improvements in both operational efficiency and safety. That is the real objective.

(Photo: Deuta-Werke GmbH cab displays)

There is a need to carefully consider the 'art of the possible' and also to carry out a cost/benefit analysis to determine the extent of automation that is appropriate. This depends on the client and a wide variety of associated factors, in addition to how the specific railway will need to be run.

If automation is implemented too early (and the automation is not accurate or effective), operators will ignore, bypass and/or turn off the support and automation systems – discrediting the concept of automation and deferring the potential benefits until at least the next upgrade cycle.

Given the current state of technology, there would seem to be distinct, supportive roles for human operators and automation systems. Humans make good information managers and can make complex decisions. Automation can filter information and present concise analyses, then implement the decision and raise 'alarms' if anything goes outside set parameters.
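As a simple illustration of that division of labour, the sketch below filters routine train reports and raises an 'alarm' only when a value falls outside set parameters. The data structure, thresholds and messages are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class TrainStatus:
    """Hypothetical snapshot of a train as reported to the control centre."""
    train_id: str
    lateness_s: int      # seconds behind the working timetable
    dwell_s: int         # current dwell time at a platform, in seconds

# Illustrative thresholds that would in practice be set by the operator
LATENESS_ALARM_S = 180
DWELL_ALARM_S = 90

def filter_for_operator(statuses: list[TrainStatus]) -> list[str]:
    """Suppress routine traffic and present only out-of-parameter events."""
    alarms = []
    for s in statuses:
        if s.lateness_s > LATENESS_ALARM_S:
            alarms.append(f"{s.train_id}: {s.lateness_s}s late - regulation decision needed")
        if s.dwell_s > DWELL_ALARM_S:
            alarms.append(f"{s.train_id}: extended dwell ({s.dwell_s}s)")
    return alarms

statuses = [TrainStatus("2W10", 40, 25), TrainStatus("9O31", 260, 30)]
for line in filter_for_operator(statuses):
    print(line)   # only 9O31 is brought to the operator's attention
```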

Implementing automation

The Lloyd's Register team concluded by suggesting that a railway considering the introduction of control-centre automation should start with a baseline of a manually-controllable system, augmented by 'easy to implement' and well understood (routine/repetitive task) automation and simple 'process reminder' decision support. This should include the 'filtering' and presentation of information to the human operator.

More comprehensive decision support features can then be developed as the system becomes established and the characteristics in service become better understood, building up models for 'prediction' and using these to enable pre-emptive advice. Over time, if decision support proves to be highly reliable in certain areas, full automation can be introduced in those areas.
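One hypothetical way to express that staged approach is to record, for each task area, how reliable its decision support has proved in service and to allow promotion to full automation only once a track record exists. The levels, thresholds and figures below are illustrative assumptions rather than anything proposed in the paper.

```python
from enum import Enum
from dataclasses import dataclass

class AutomationLevel(Enum):
    MANUAL = "manual"      # operator acts, system only reminds
    ADVISE = "advise"      # system proposes, operator decides
    AUTOMATE = "automate"  # system acts, operator supervises

@dataclass
class TaskArea:
    """Hypothetical control-centre task with its decision-support track record."""
    name: str
    level: AutomationLevel
    advice_given: int      # suggestions made while at the ADVISE level
    advice_accepted: int   # suggestions the operator accepted unchanged

def consider_promotion(task: TaskArea, min_samples: int = 500,
                       min_acceptance: float = 0.98) -> AutomationLevel:
    """Promote a task to full automation only once its advice has proven
    highly reliable in service (the thresholds here are purely illustrative)."""
    if (task.level is AutomationLevel.ADVISE
            and task.advice_given >= min_samples
            and task.advice_accepted / task.advice_given >= min_acceptance):
        return AutomationLevel.AUTOMATE
    return task.level

routine_routing = TaskArea("routine route setting", AutomationLevel.ADVISE, 1200, 1190)
print(consider_promotion(routine_routing).value)   # "automate"
```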

In the future, it may be possible to build in additional automation, but operators need to take one step at a time and not let their enthusiasm and eagerness to see the final objectives achieved lead them to jump too far ahead of their ability to deliver.

2 COMMENTS

  1. I am a retired Signal Engineer, with 48 years of continuous service. (I am now a volunteer in the heritage sector). Within the last year I visited York IECC. The SB supervisor permitted us to walk round at the back of the signalmen, asking us specifically not to talk to the Leeds West man as that position is exceptionally busy. When we arrived at that position the duty signalman welcomed us and it seemed that he was relieved that he had someone to talk to. He (while still keeping an eye on his area) enthusiastically explained the workings under his supervision. I got the impression that monitoring the ARS was somewhat monotonous and he would rather have operated the area himself.

  2. As a seasoned manager of safety critical staff, human factors are definitely the strength and weakness in any automated system. It is important, as outlined in the article, that ‘humans’ are not exposed to lengthy periods of intense monitoring. Regular breaks need to be built into a shift and also regular training on what happens when it goes wrong. Test scenarios should be part of any continual assessment as a matter of course.
    In my experience, this is not happening in control staff rosters and is an accident waiting to happen.
    Stress is also a major factor. As a therapist who manages corporate staff absence within the rail industry, I often speak to staff who take their decisions personally and are unable to walk away from the job for lunch, or even when they are not at work. This needs to be addressed, with appropriate treatment and a pro-active approach…
    As an aside, I was part of the delivery team on the Dubai Metro in 2009. Automation does work, and this installation is an excellent example. The issue is when automation is introduced into what began as a manual system. Again, change is a psychological challenge.
