
It would be nice to get an actual technical rebuttal describing why his stuff doesn't work, rather than the sleight-of-hand English and PR responses in that article (the whole "industry standard robustness yada" bit makes me cringe).

Based on their responses it sounds like "Yes, these systems are vulnerable, but a good pilot will ignore the bad data, so the plane is not vulnerable," which does not exactly give me the warm fuzzies.

(Good point here at the bottom: http://arstechnica.com/security/2013/04/hacking-commercial-a... )

All his hack needs to be able to do is cause problems for the pilot (bad information etc.) for this to be a problem.

I mean, if there is no issue then surely he is now justified in publishing his work and them publishing in detail why it is not a risk.



Brief technical explanation.

The "attack" consists of the following steps:

1) Modify the desktop simulator FMS code to support commands in the data protocols (e.g. ADS-B).

2) Send commands over the data link to use the newly created control channel.

What the FAA is saying is that:

1) It is hard, if not impossible, to actually inject unauthorized code into an embedded airplane system (FMS, GPS, ...) due to the strict quality controls in place.

2) Even if one succeeds with 1), there are still limits on what the FMS can actually do to the plane, because it is a separate unit from other systems with well-defined protocols (e.g. the FMS doesn't control the lights in the plane).

IMHO, the whole "hack" sounds like a BS/PR action. Yes, you can "fake" GPS, ADS-B, and other communication protocols. However, there are other sources of information for pilots (e.g. the old and true magnetic compass) that can and should be used to validate and cross-reference the data. From a pilot's perspective, a "fake" GPS is no different from a "failed" GPS (yes, this happens). One should be ready to deal with this to qualify as a pilot.
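To make the cross-referencing idea concrete, here is a minimal sketch (function names, field choices, and the 20-degree tolerance are all my own illustrative assumptions, not anything from avionics software): compare the GPS ground track against the magnetic compass heading, corrected for magnetic variation, and flag gross disagreement.

```python
# Hypothetical sketch: sanity-checking GPS track against the magnetic compass.
# Thresholds and names are illustrative assumptions, not real avionics code.

def angle_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def gps_plausible(gps_track_true: float, compass_heading_mag: float,
                  magnetic_variation: float, tolerance_deg: float = 20.0) -> bool:
    """Flag a GPS track that grossly disagrees with the magnetic compass.

    magnetic_variation: degrees, east positive (true = magnetic + variation).
    tolerance_deg: a generous margin for wind drift and compass error.
    """
    compass_heading_true = (compass_heading_mag + magnetic_variation) % 360
    return angle_diff(gps_track_true, compass_heading_true) <= tolerance_deg

print(gps_plausible(95.0, 90.0, 2.0))   # small disagreement: plausible → True
print(gps_plausible(270.0, 90.0, 2.0))  # opposite directions: suspect → False
```

A spoofed GPS that points the "plane" the wrong way fails this kind of check immediately; only a subtle, slowly-drifting fake would stay inside the tolerance.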


Yep, hit the nail on the head.

Thing is, if you're able to inject un-authorised code into the FMS, chances are you have bigger concerns than a single aircraft getting hijacked.

It's the equivalent of saying "If I had access to a bank's mainframe and network infrastructure, I could steal millions of dollars with an Android App." Sure you could, but is the problem the fact you can do it with an Android App, or the fact you were able to inject the code in the first place?


Exactly. If I had that level of access, I would just buy the airline. It seems neater.


"From a pilot's perspective, a "fake" GPS is no different from a "failed" GPS (yes, this happens)."

Fake and failed are completely different things.

Take AF447, for example: it had UNRELIABLE instrument readings which the pilots still trusted. That is much more dangerous than an instrument giving up the ghost and telling you so.

Modifying data subtly enough that it doesn't trip the BS detector is a valid attack vector.

Of course the rest of the story is theoretical, but so were many attacks when they first emerged.

I consider this a tool that could, in theory, be used in combination with other tricks to do something bad. Having it in the open means the vendors have to put more safeguards in place and make the system safer.


The old vacuum attitude indicators have a tendency to fail in a really slow and subtle way as the gyro winds down. Yet there are other instruments in the cockpit to figure this out, and there are procedures for detecting failed or unreliable indicators. It would be really hard (read: impossible) for an attacker to fake all the instruments together into a believable picture.
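The detection logic behind those procedures is essentially a consistency vote across redundant sources. A toy sketch of that idea, with made-up instrument names and an illustrative tolerance:

```python
# Illustrative sketch of instrument cross-checking: with several independent
# sources, a single slowly-failing indicator stands out against the median.
from statistics import median

def flag_outliers(readings: dict[str, float], tolerance: float) -> list[str]:
    """Return the names of sources that disagree with the median reading."""
    m = median(readings.values())
    return [name for name, value in readings.items()
            if abs(value - m) > tolerance]

# Hypothetical pitch readings in degrees; one gyro is slowly toppling.
pitch = {"adi_1": 2.1, "adi_2": 2.3, "standby_gyro": 14.0}
print(flag_outliers(pitch, tolerance=5.0))  # → ['standby_gyro']
```

This is also why faking all instruments at once is so hard: the attacker would have to keep every spoofed value consistent with every genuine one, across systems that don't share a data path.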

The AF447 crash was caused by failure to follow the procedures for loss of airspeed indication, followed by a lack of basic manual flying skills (e.g. a complete inability to fly the plane at high altitude without the autopilot). Read the final report: it's not about the instruments, it's about the pilots.


I did read the report and disagree with you.

The pilots were inexperienced, in a violent thunderstorm, with computers giving intermittent false information. There were no outside visual references. It was a complex set of circumstances that converged.

I have a pilot's license, and I agree that, sitting on a couch reading the report in retrospect, there were cues that could have been interpreted differently, procedures that should have been followed more closely, and other things that could have been done to avoid getting into this kind of mess in the first place.

BUT. Would I have performed better in the same position? I don't know. Reading crash reports is about educating yourself so you can perform better in the next sticky situation, or spot the problem a mile away.

Can the attack in the original article crash airplanes on its own today? Definitely not.

Could it evolve into a part of some other foul play sometime in the future? I would not underestimate the inventiveness of people.


I personally had a frozen pitot tube due to failed tube heating in solid IMC (no thunderstorm, though). You just disable the autopilot and fly manually by the known manifold pressure numbers, vertical speed, and attitude, plus double-check against GPS ground speed (estimating the wind). This is not even an emergency (you do have to advise ATC about the situation, though).
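The GPS double-check above is simple vector arithmetic: subtract the estimated wind vector from the GPS ground velocity to get a rough true airspeed. A back-of-the-envelope sketch (all numbers and names are illustrative, and real wind estimates are of course approximate):

```python
# Rough airspeed estimate with a failed pitot: true airspeed is approximately
# the magnitude of (ground velocity vector - wind vector). Illustrative only.
import math

def estimated_tas(ground_speed_kt: float, ground_track_deg: float,
                  wind_from_deg: float, wind_speed_kt: float) -> float:
    """Estimate true airspeed in knots from GPS ground velocity and wind."""
    gt = math.radians(ground_track_deg)
    # The wind vector points in the direction the wind blows TOWARD,
    # i.e. the reciprocal of the reported "wind from" direction.
    wt = math.radians((wind_from_deg + 180) % 360)
    gx, gy = ground_speed_kt * math.sin(gt), ground_speed_kt * math.cos(gt)
    wx, wy = wind_speed_kt * math.sin(wt), wind_speed_kt * math.cos(wt)
    return math.hypot(gx - wx, gy - wy)

# 120 kt over the ground heading north, into a 20 kt headwind from the north:
print(round(estimated_tas(120.0, 0.0, 0.0, 20.0)))  # → 140
```

A headwind makes airspeed higher than ground speed and a tailwind makes it lower, which is exactly the sanity check a pilot does mentally.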

Things fail all the time. This is why there is redundancy built into the system. A pilot just needs to know how to detect the problem and how to use the secondary systems.


Even briefer explanation. The attack consists of:

1. Get root on one of the plane's systems somehow.

2. After completing step (1), use the Android app to display some messages, which the pilot will identify as bogus.


None of the articles I read mentioned that he had actually modified the FMS code. Could you give a link?

If he did that then that is a colossally huge caveat to his claims.


>"Yes, these systems are vulnerable but a good pilot will ignore the bad data so the plane is not vulnerable."

Right, like on Air France Flight 447.

Obligatory Dilbert: http://search.dilbert.com/search?w=Flight++laptop&view=l...


I think what you're describing is more a culture problem than an issue with the design of the FMS or pilots receiving incorrect information from instruments.

Any pilot will tell you the very last thing you want is to be at war with your own aircraft (or its instruments), but incorrect readings alone should not cause an incident. This is also the case with AF447. After reading the CVR, I'm more inclined to put the incident down to poor communication between the crew (known in the industry as CRM, Crew / Cockpit Resource Management) [1].

The crew failed to effectively communicate what each of them was doing, to the point where they were making opposite inputs on the flight controls, and they misunderstood which conditions triggered certain flight control modes [2] of the Airbus' autopilot.

Coming back to the original point around culture and training, it was evident that the more junior pilots were relying on certain "protections" the autopilot has against conditions like a stall. It's this reliance upon a computer (commanding a full nose-up during a stall) and lack of understanding of how the aircraft's flight computer acts under certain conditions that ultimately ended 228 people's lives. The French investigation concluded that the inconsistency of speed measurements was only one of seven factors that caused the accident.

[1] http://en.wikipedia.org/wiki/Crew_Resource_Management [2] http://en.wikipedia.org/wiki/Flight_control_modes_%28electro...

EDIT: Having said that, I would not want to be the pilot flying an aircraft where I can't trust my own instruments. However, there are numerous cases (QF72 [3] comes to mind) where pilots have had to disregard most if not all digital instrument readings and stick to the bare minimum to land safely.

[3] http://en.wikipedia.org/wiki/Qantas_Flight_72



