Hacker News

I don't find it likely that pilots hand-fly approaches to make a point. Isn't it more likely that this is codified in airline operation instructions?

As for completely autonomous, do you seriously think we have the technology available today to make an autopilot handle any conceivable situation without a human being present and ready to take over? (non-IR PP here)



It isn't "just to make a point". It's also to stay proficient, or to make the job less tedious. But whether to hand-fly an approach is always at the pilot's discretion (as far as I know -- I'm just a private pilot so I could be wrong about that).

And no, automation can't handle "every conceivable situation", but neither can humans. Furthermore, humans screw up more often than autopilots. Pilot error is currently the single biggest contributor to the overall accident rate.

Mind you, I'm not advocating fully autonomous aircraft. I like having a human in the loop, but that's in part because I am the human in the loop. It's far from clear that human pilots are a net win for safety.


Neither can humans, but humans can handle far more situations. We can take a wide range of information into account and come up with creative solutions, while today's computers need a team of humans to anticipate and program potential scenarios beforehand. I think we'll eventually have AI that performs better than humans in accident scenarios (AI is improving and our brains are not), but that's a long way off, and not an imminent threat to pilot jobs as some would believe.

In the meantime, the solution to the human factor isn't to eliminate humans, but to improve training (which is already happening after AF447).

I like having human pilots primarily because I'm a programmer and I know how difficult it is to design robust computer systems. There has already been one runway overrun and one serious in-flight incident with passenger injuries due to software design faults. Now try to design a system that makes sense of audio, video and smell in addition to the existing sensors, and not have it fail in some spectacular unforeseen way.


Actually, it is extremely rare for a human pilot to come up with a "creative solution" to an emergency. The vast majority of emergency responses are established procedures for which pilots train. Most of the time they're following a checklist.

I can only think of a single example of a "creative" response to an in-flight emergency that actually helped, and that was UA232 in 1989.


Aircraft accidents are extremely rare in the first place, so that's a given. The question should rather be whether computers would do a better job than pilots in the same situation, and currently the answer is no. The actual flying of the airplane, which autopilots do today, is just a small part of the pilot's job, and automating the rest of the tasks a pilot performs is non-trivial.


Are you a pilot? Because I am, and I'm telling you from firsthand experience that you're wrong. Except for takeoff and landing, there's next to nothing I have to do. And the only reason I have to do the landing is because my plane is a small GA aircraft without autothrottle or autoland technology.


Yes, I am a pilot, albeit not an instrument-rated one. Are you telling me you don't do anything? Do you bring a book to read instead of monitoring the instruments? You don't talk to ATC, you never have to make a decision regarding a route deviation? You never weigh the weather information and decide en route whether to press on or to find another place to land?


No, of course I'm not saying that. I'm saying all those decisions could be automated, not that they have been. Except for ATC communications, all the information I use to make in-flight decisions is already available in digital form. All the engine parameters are digital. I get en-route METARs via XM. I have a WAAS GPS coupled to the autopilot. The only thing standing in the way of making my aircraft completely autonomous is a throttle actuator and the right software. And no, writing that software would not be trivial, but neither would it be impossible.


For normal operations, I think it's possible to make it _mostly_ autonomous today (TTS and voice recognition for ATC, and some heuristic METAR/weather-radar analysis might work). I don't think we have the technology today to make such a system safe enough to not have a human ready as a backup. And for an accident scenario, I think it's completely impossible today, since we would need to integrate audio, video and smell sensors and AI software that rivals humans in situational awareness. This would mean exceptionally complex software.
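As a rough illustration of the kind of heuristic METAR analysis mentioned above, here's a minimal Python sketch. The regexes, field layout, and thresholds are simplifying assumptions for illustration, not real METAR handling or dispatch minima:

```python
import re

def parse_metar_minima(metar: str):
    """Extract visibility (statute miles) and ceiling (feet AGL) from a
    simplified METAR string. Illustrative only -- real METARs have many
    more forms (fractional visibility, RVR, CAVOK, vertical visibility)."""
    vis_m = re.search(r"\b(\d+)SM\b", metar)
    visibility = int(vis_m.group(1)) if vis_m else None
    # The lowest broken (BKN) or overcast (OVC) layer counts as the ceiling.
    ceil_m = re.search(r"\b(BKN|OVC)(\d{3})\b", metar)
    ceiling = int(ceil_m.group(2)) * 100 if ceil_m else None
    return visibility, ceiling

def go_decision(metar: str, min_vis=3, min_ceiling=1000):
    """Crude continue/divert heuristic using assumed minima of 3 SM
    visibility and a 1000 ft ceiling (not actual regulatory minima)."""
    vis, ceiling = parse_metar_minima(metar)
    if vis is not None and vis < min_vis:
        return "divert"
    if ceiling is not None and ceiling < min_ceiling:
        return "divert"
    return "continue"
```

Even this toy version hints at the problem: every unhandled METAR variant is a potential silent misjudgment, which is exactly why the safety margin has to come from somewhere else.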

The way we have solved reliability in autopilots and FBW systems today is to make them as simple as possible and to give them sensible fallback modes (like the Airbus FBW removing stall protection when certain inputs are missing), and even then we have had real-life accidents because of programming or design errors. So if you think pilot automation is mainly a question of politics, as you said earlier, I think you are being overly optimistic (which, of course, is not uncommon in the software industry).
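The graceful-degradation idea described above can be sketched as a small decision ladder. The law names loosely echo Airbus normal/alternate/direct law, but the thresholds and voting logic here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class AirDataChannel:
    """One of several redundant air-data sources (e.g. an ADIRU)."""
    airspeed: float   # knots
    valid: bool = True

def vote_out_outlier(channels):
    """Drop the channel whose airspeed is farthest from the median."""
    speeds = sorted(c.airspeed for c in channels)
    median = speeds[len(speeds) // 2]
    worst = max(channels, key=lambda c: abs(c.airspeed - median))
    return [c for c in channels if c is not worst]

def select_control_law(channels):
    """Degrade gracefully as redundant inputs are lost or disagree.
    Hypothetical thresholds; real FBW mode logic is far more involved.
    NORMAL:    3+ agreeing valid channels -> full envelope protection
    ALTERNATE: 2 valid channels          -> stall protection removed
    DIRECT:    fewer                     -> stick commands surfaces directly
    """
    good = [c for c in channels if c.valid]
    if len(good) >= 3:
        spread = max(c.airspeed for c in good) - min(c.airspeed for c in good)
        if spread <= 15:          # channels agree within tolerance
            return "NORMAL"
        good = vote_out_outlier(good)
    return "ALTERNATE" if len(good) >= 2 else "DIRECT"
```

The point of the sketch is that even this "simple" fallback logic has corner cases (what if two channels fail the same way?), and those corner cases are where the real-life accidents have come from.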

I think a better approach to removing pilots is to see the strengths in both computers and humans and design systems where the advantages of both are maximized.

If you are interested in reading more on software safety, http://sunnyday.mit.edu/ is a good starting point.


Is it standard procedure to land A320s in the Hudson? Prior to 2009, how likely do you think it would have been for someone to have programmed an autopilot with such a capability?

Do pilots typically glide unpowered 767s to landings at abandoned military airfields? (A result, incidentally, that could not be reproduced by other crews in simulators; are you sure it's the crews, and not inadequate programming? Still trust the computer?)

Are you willing to bet your life that the computer on a 737 that looked like this[1] could find its way to a safe landing?

As programmers, we should know better.

[1] http://en.wikipedia.org/wiki/File:Aloha_Airlines_Flight_243_...


> Is it standard procedure to land A320s in the Hudson?

Yes, ditching an airplane that has lost all of its engines is a standard emergency procedure.

> Prior to 2009, how likely do you think it would have been for someone to have programmed an autopilot with such a capability?

What difference does that make? I'm not saying it's a good idea to take pilots out of the cockpit right now, I'm just saying it's a lot more plausible than most people think. The main limiting factor is politics, not technology.

> Are you willing to bet your life that the computer on a 737 that looked like this[1] could find its way to a safe landing?

Sure, why not? Losing the top of the fuselage looks dramatic, but it probably doesn't change the flight characteristics all that much. Also, very good adaptive control algorithms exist that could almost certainly handle this.
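One classic example of such an adaptive control algorithm is model-reference adaptive control with the MIT rule, where a controller gain is adjusted online until the plant tracks a reference model despite unknown plant parameters. This is a textbook-style sketch (first-order dynamics, Euler integration); the numbers are illustrative and have nothing to do with any real aircraft:

```python
def simulate_mrac(k_plant=2.0, k_model=1.0, gamma=0.5, dt=0.01, t_end=100.0):
    """Plant and reference model share the dynamics 1/(s+1) but differ
    in gain; the adaptive feedforward gain theta should converge to
    k_model / k_plant (here 0.5) without ever knowing k_plant."""
    y = ym = theta = 0.0
    n = int(t_end / dt)
    for i in range(n):
        r = 1.0 if (i * dt) % 20 < 10 else -1.0   # square-wave reference
        u = theta * r                              # adaptive feedforward
        y  += dt * (-y  + k_plant * u)             # plant:  dy/dt  = -y + k*u
        ym += dt * (-ym + k_model * r)             # model:  dym/dt = -ym + k0*r
        e = y - ym                                 # tracking error
        theta += dt * (-gamma * e * ym)            # MIT-rule gain update
    return theta
```

The controller "discovers" the right gain from the tracking error alone, which is the sense in which adaptive control can cope with changed flight characteristics it was never explicitly programmed for.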

Also, you're cherry-picking your anecdotes. There are plenty of examples of flights that would almost certainly have ended safely but for some stupid mistake the pilot made. Controlled flight into terrain accidents, for example, are much more common than heroic rescues, and they could be entirely eliminated if you took human pilots out of the loop.


> Yes, ditching an airplane that has lost all of its engines is a standard emergency procedure.

See my reply to krisoft. You're changing a specific situation into a general one. You can't just say "ditch the plane if you lose power", you have to anticipate every possible variable that may influence whether that is actually the correct course of action, and the manner in which it is carried out. And you have to do that before it ever happens.

> What difference does that make?

Unanticipated situations are unanticipated situations. We don't have AI. We haven't replicated the ability of a human being to adapt on-the-fly. There is no reason to believe we will in the near future.

> Controlled flight into terrain accidents [...] could be entirely eliminated if you took human pilots out of the loop.

The problem I have with this line of reasoning is that so much more could be done to prevent them even without taking the pilots out of the loop, and yet it's not. That does nothing to give me confidence that the right thing will be done when pilots ARE taken out of the loop.

Edit: What it comes down to is this. All you're ultimately doing is completely and irrevocably substituting the unalterable judgement of somebody in a completely different time, place, and circumstance for the adaptable judgement of the person on the plane. When you do that, what you're really saying is "I refuse to give people in a scenario I didn't think about the chance to survive". I can't accept that.


CFIT used to be common, but it no longer is on commercial flights, thanks to GPWS. There has not yet been a CFIT accident involving an airplane equipped with EGPWS.


Emergency landing on water is "standard procedure". You might have the impression that it was something creatively invented by the pilot at a moment's notice, but it really wasn't. The engineers designed the airplane with that capability, included advice and a checklist in the operations manual, and even put a button labeled "ditch" on the dash! I can't really see why they couldn't make such an autopilot program.


You completely misunderstand. I wasn't talking about a generic "emergency landing on water".

A plane suddenly loses power in a highly urbanized area. It's very near multiple airports. Prior to 2009, I have no expectation that an autopilot would have been equipped to judge whether it should attempt to land in a heavily trafficked river. (I further have no expectation that it would have been equipped to notice, in the highly plausible scenario, one or more small boats being in the way.)

It's not a question of landing on water, it's a question of making the decision to do so and where.

Humans have the judgement gained by a lifetime of learning. Computers only have what it occurs to us to put into them while we're sitting safe and sound in our little offices pushing little buttons that don't threaten our lives.



