
Excessive trust: the real vulnerability behind the US Navy collisions?

By Guy Buesnel on September 18, 2017
Positioning
GPS, GNSS, navigation, sea, collision, US Navy

“Once is happenstance. Twice is coincidence. Three times is enemy action.”

Ian Fleming’s famous quote sums up a lot of the speculation surrounding the recent collisions involving US Navy ships USS Fitzgerald and USS John S. McCain in east Asia.

The fact that two similar incidents occurred in a short timeframe has led to a huge amount of theorising that the cause could have been a cyber-attack; more specifically, a GPS spoofing attack.

It’s easy to see why conspiracy theorists suspect enemy action. The two Navy collisions more or less coincided with something that really does look like a mass spoofing attack on ships in the Black Sea. And once you see one GPS cyber-attack, it’s easy to start seeing them everywhere.

Spoofing is a red herring

But the US Navy and experts in maritime navigation have rejected the spoofing hypothesis. Admiral John Richardson said in a Facebook all-hands call that the Navy had found “no evidence of any kind of a cyber-intrusion”. And Todd Humphreys of the University of Texas and Dana Goward of the Resilient Navigation and Timing Foundation - experts in maritime GPS vulnerabilities - both told USA Today that a spoofing attack is a highly unlikely explanation.

So if we file the possibility of GPS cyber-attack under “highly unlikely”, what else could have caused two serious collisions in quick succession?

In my view, it’s likely to be a much less talked-about vulnerability: excessive trust. When the output of a system - especially one as successful as GPS - is trusted implicitly, that system becomes a point of vulnerability, because its outputs are taken as truth with no secondary checks applied.

Garbage in, gospel out

I call this situation “garbage in, gospel out” - as it means the output of the system is accepted unquestioningly, even when the system isn’t functioning properly.

At sea, global navigation satellite systems (GNSS) like GPS aren’t the only option for navigation. There are other systems - like radar - that can be used alongside GNSS, and plenty of opportunities for humans to check the integrity of the data through visual reference from the bridge (aka “looking out of the window”).

If these secondary navigation methods were always used, a faulty GNSS sensor providing inaccurate or misleading information would be detected quickly and appropriate action taken. But if the output from the GNSS sensor is always considered to be the “truth”, that doesn’t happen.
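
To make that idea concrete, here’s a minimal sketch in Python - purely illustrative, not drawn from any real bridge system - of the kind of automated cross-check a navigation system could apply: compare each GNSS fix against an independent dead-reckoning estimate and warn when the two diverge. The function names and the 0.5 nautical mile threshold are assumptions for the example.

import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in nautical miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def fix_agrees_with_dead_reckoning(gnss_fix, dr_estimate, threshold_nm=0.5):
    """Warn when a GNSS fix and the dead-reckoning estimate diverge."""
    separation = haversine_nm(gnss_fix[0], gnss_fix[1], dr_estimate[0], dr_estimate[1])
    if separation > threshold_nm:
        print("WARNING: GNSS fix is %.2f NM from the dead-reckoning estimate -"
              " verify position by radar or visual bearing" % separation)
        return False
    return True

# Example: a fix that has drifted well clear of the dead-reckoning track
fix_agrees_with_dead_reckoning((41.10, -69.50), (41.00, -69.20))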

Two examples spring to mind.

2012: Two vessels collide in an avoidable accident

In March 2012, the vessels Seagate and Timor Stream collided 24 miles north of the Dominican Republic, in good weather conditions.

The incident report notes that Timor Stream was en route to the United Kingdom, while Seagate was heading to Africa. Seagate’s chief officer saw Timor Stream but assumed it was an overtaking vessel which would keep clear.

Timor Stream was actually on a heading of 041°, but at the time of the accident the AIS data it was transmitting showed a heading around 160° different from its true one. It was this misleading data that led Seagate’s chief officer to conclude that Timor Stream was overtaking him.

The report states that while the chief officer had several clues that this assumption was wrong - including a lookout’s warning that a collision situation was developing - they weren’t sufficient to prompt him to double-check his assumptions. The ships duly collided, leaving both badly damaged.
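
For illustration only - this isn’t from the incident report - here’s a sketch of the kind of automated consistency check that could flag faulty AIS data like this: derive the target’s course over ground from two successive radar plots and compare it with the heading it is broadcasting over AIS. The function names and the 30° tolerance are assumptions.

import math

def course_from_plots(plot_a, plot_b):
    """Course over ground (degrees true) implied by two successive (lat, lon) plots."""
    mean_lat = math.radians((plot_a[0] + plot_b[0]) / 2.0)
    dlat = plot_b[0] - plot_a[0]
    dlon = (plot_b[1] - plot_a[1]) * math.cos(mean_lat)
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def ais_heading_consistent(ais_heading_deg, plot_a, plot_b, tolerance_deg=30.0):
    """Return False when the broadcast AIS heading disagrees badly with the radar track."""
    observed = course_from_plots(plot_a, plot_b)
    diff = abs((ais_heading_deg - observed + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg

# Radar shows the target tracking roughly 041 degrees, but its AIS broadcast
# claims about 200 degrees - a ~160-degree discrepancy, as in the Seagate case
print(ais_heading_consistent(200.0, (19.00, -69.70), (19.05, -69.655)))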

An earlier incident demonstrates how quickly the fast, accurate position fixes provided by GPS became assimilated into bridge systems and started being treated as totally reliable.

1995: Royal Majesty runs aground near Nantucket


In 1995 the cruise ship Royal Majesty, equipped with state-of-the-art GPS and LORAN-C navigation systems, ran aground in good weather near Nantucket Island. (This vivid retelling of the incident is well worth a read for anyone interested in maritime accidents.)

Unnoticed by the crew, the navigation system’s connection to the GPS antenna had broken not long after the ship left port in Bermuda. For the next 27 hours, the GPS receiver operated in “Dead Reckoning” mode, calculating positions that were progressively less accurate. The only warning that anything was amiss was a tiny indicator on the GPS receiver display, which no one noticed.
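
As a purely illustrative sketch of what a louder safeguard might look like, the snippet below parses the fix-quality field of a standard NMEA 0183 GGA sentence and raises an alarm whenever the receiver stops reporting a genuine satellite fix - quality code 6, notably, means “estimated (dead reckoning)”. The print-based alarm is a stand-in for a proper bridge alert, and the example sentences are simplified (checksums omitted).

FIX_QUALITY = {
    "0": "invalid",
    "1": "GPS fix",
    "2": "differential GPS fix",
    "6": "estimated (dead reckoning)",
}

def check_gga_fix_quality(sentence):
    """Alarm unless a GGA sentence reports a genuine satellite fix."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        return  # not a GGA sentence, nothing to check
    quality = fields[6]
    if quality not in ("1", "2"):
        label = FIX_QUALITY.get(quality, "unknown")
        print("ALARM: receiver reports '%s' - position data may be unreliable!" % label)

# A healthy satellite fix, then the dead-reckoning case that went unnoticed in 1995
check_gga_fix_quality("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,2.0,M,46.9,M,,")
check_gga_fix_quality("$GPGGA,123620,4807.050,N,01131.020,E,6,00,9.9,2.0,M,46.9,M,,")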

The navigation system was outputting increasingly unreliable position data. But Royal Majesty’s crew had come to trust the system so much they assumed the data was accurate.

Here again, there were opportunities for the crew to realise that the GPS position information was misleading – but all were missed. The ship eventually grounded, more than 14 miles off course.

There are many examples of incidents where excessive trust of GPS was part of the chain of events. Outside of the maritime world, we’ve all read stories of drivers who trusted their satnav, even when it was clearly taking them into an impassable street, or even a subway.

None of these incidents means that GPS / GNSS is unusable. But they do highlight a human tendency to implicitly trust GNSS equipment, to the point of excluding evidence that the system is providing incorrect (and sometimes very improbable) position, navigation or timing data.

Can human error be prevented?

Is there anything that the designers and users of GPS / GNSS systems can do to take these human factors into account? The many incidents in which excessive trust leads to poor decision-making and unpleasant (even tragic) consequences reveal a need to think laterally in systems testing.

Most system designers test for common or known errors, but fail to test for unusual or unexpected scenarios. There’s a need to understand the circumstances in which navigation systems might come to output misleading data, and to put safeguards in place that stop human users from relying on that data.
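
One such safeguard, sketched here purely for illustration: a plausibility check that flags any pair of consecutive position fixes implying a speed the vessel cannot physically achieve. The 25-knot limit and the flat-earth distance approximation are assumptions suitable only for an example.

import math

def approx_distance_nm(fix_a, fix_b):
    """Distance between two nearby (lat, lon) fixes in nautical miles.
    Flat-earth approximation: one degree of latitude is about 60 NM."""
    dlat_nm = (fix_b[0] - fix_a[0]) * 60.0
    mean_lat = math.radians((fix_a[0] + fix_b[0]) / 2.0)
    dlon_nm = (fix_b[1] - fix_a[1]) * 60.0 * math.cos(mean_lat)
    return math.hypot(dlat_nm, dlon_nm)

def fixes_plausible(fix_a, fix_b, seconds_elapsed, max_speed_knots=25.0):
    """Return False when consecutive fixes imply a speed the ship cannot achieve."""
    speed_knots = approx_distance_nm(fix_a, fix_b) / (seconds_elapsed / 3600.0)
    return speed_knots <= max_speed_knots

# A 6 NM jump in 60 seconds implies about 360 knots - clearly suspect data
print(fixes_plausible((41.00, -69.00), (41.10, -69.00), 60))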

In the case of the Royal Majesty, for example, the system did have a means of warning crew that its performance was degraded, but in the hubbub of activity on a busy ship’s bridge, it wasn’t sufficiently noticeable.

Trust, but verify

And given that a system might not always know when it has started to output inaccurate data, there needs to be awareness training on the user side, too. The US Coast Guard, in an alert sent to mariners in January 2016, recommended a “trust, but verify” approach. GPS data is usually accurate, but can sometimes be disrupted through interference or jamming, it warned. Better to use one or more secondary sources to verify what the GPS data is telling you.

We don’t yet know for sure what caused two US Navy ships to collide in the space of two months. But if excessive trust in GPS is found to be a contributing factor, it may bring about a cultural change in maritime navigation techniques - and prevent more such accidents happening in future.

 