I’m not a fan of AI, but everyone in this thread… It’s air traffic control, not pilots! There has been AI for pilots for decades… it’s called autopilot!
This is for coordinating planes and ground vehicles moving around the airport.
Autopilot is a deterministic algorithm specifically for flying a plane.
There’s a 95% chance this air traffic controller “AI” will be a non-deterministic LLM, because that’s the go-to grift. The error rate LLMs make on a task like this is 40–60%. People will die and no one will be responsible.
True, if they try this with an LLM that’s both stupid and will kill people. I disagree that no one will be responsible; whoever made that system should be responsible.
The only use for an LLM here is MAYBE interpreting the speech over the comms between the control tower and pilots, and using that to put entries in the tracker for who is going where. But this could also be a simple program on a screen: tap the plane’s tag, tap what it’s going to do (take off, land, taxi, etc.), tap where that will happen, and then you don’t need any LLM.
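A tap-driven tracker like that needs no language model at all; here’s a minimal sketch of the idea (the flight tags, actions, and runway names are all invented for illustration, not real ATC vocabulary):

```python
from dataclasses import dataclass

# Hypothetical set of actions the controller can tap on screen.
ACTIONS = ("takeoff", "land", "taxi", "hold")

@dataclass
class TrackerEntry:
    flight_tag: str   # the tag shown next to the plane on the display
    action: str       # one of ACTIONS
    location: str     # runway or taxiway identifier

def record_clearance(tracker, flight_tag, action, location):
    """Three taps -> one deterministic tracker entry; no speech recognition."""
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    tracker[flight_tag] = TrackerEntry(flight_tag, action, location)

tracker = {}
record_clearance(tracker, "UAL123", "land", "27L")
print(tracker["UAL123"])
```

The point of the sketch is that the entire “interpretation” step collapses into a dictionary write once the controller does the tapping.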
It’s still AI because the computer is determining whether there will be a collision and activating alarms, but that’s deterministic and provable. Plane A is at 7,500 ft, 7 miles out, traveling at 250 mph toward runway 47R. Plane B is at 35,000 ft, 20 miles out, traveling at 400 mph. Can Plane B also use runway 47R? Yes, you can do the math to calculate it. But let’s say you make a mistake and drop a zero somewhere. Wouldn’t it be nice to have an additional system that alerts you if the minimum distance between the planes will be less than, say, 1,000 ft?
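That kind of deterministic safety net is just geometry. A minimal sketch, assuming straight-line, constant-velocity motion (the coordinates, speeds, and the 1,000 ft threshold are made-up illustrations, not real separation standards):

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Minimum future separation of two aircraft assuming each holds a
    straight line at constant velocity. Positions in ft, velocities in ft/s.
    Returns (distance_ft, time_s); time is clamped so we only look forward."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    # Time minimizing |dp + dv*t|; 0 if parallel or already diverging.
    t = 0.0 if dv2 == 0 else max(0.0, -sum(a * b for a, b in zip(dp, dv)) / dv2)
    sep = [a + b * t for a, b in zip(dp, dv)]
    return math.sqrt(sum(c * c for c in sep)), t

MIN_SEPARATION_FT = 1000  # illustrative alert threshold

# Two converging aircraft (made-up numbers):
dist, t = closest_approach((0, 0, 7500), (600, 0, -50),
                           (40000, 0, 9000), (-500, 0, -80))
if dist < MIN_SEPARATION_FT:
    # Separation dips to roughly 400 ft about 36 s out, so the alarm fires.
    print(f"ALERT: predicted separation {dist:.0f} ft in {t:.0f} s")
```

Nothing here is stochastic: the same inputs always produce the same alarm, which is exactly the provable property you want from a backstop.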
You sound like you touch the paintings at the museum.
?? I don’t even know what this means…
In most cases this will not cause “planes to fall out of the sky.” There is a serious risk of more ground collisions, which are also horrible, if it’s poorly implemented. Air traffic control is for airports, not pilots. Depending on the definition of AI, this could be good or bad. If it’s LLMs trying to do this, it’s totally a bad idea; that’s not even what they’re meant to do.
There are systems that detect planes and ground vehicles at an airport. Currently, air traffic controllers have to keep all of these in mind and estimate their future locations. I could see AI uses here: more accurate calculations of future locations, and confirmation that two planes should not collide. Those are all deterministic and still “AI.” As I said in the other thread, the only LLM use I could maybe see is interpreting the voice comms between pilot and tower, but that could just be a simple form app the controller taps.
Here’s the confusion, I suspect: everyone wants to put AI in their sales pitch because it’s the new hotness, but many uses of “AI” are not LLMs and can be deterministic and useful. We don’t know what they mean by “AI.” The LLM form of AI is what gets attention, but it’s totally useless in a lot of cases. People assume AI means LLM because of news stories about LLMs, but marketing departments are like, “How can we incorporate the word AI into this announcement?”, the engineers are like, “Umm… well, technically this counts as AI,” and marketing runs with it.
AI in any of its forms is stochastic, not deterministic. It just fits a multivariable function. It can always make a nonsense prediction out of nowhere.
These systems cannot be verified because they are just a gazillion numbers and algebra.
That is simply not acceptable for flight control.
Really? No! That’s LLMs; for those I agree. AI has been around for decades. Technically, the Goombas in the original Super Mario Bros. on the NES were AI: they had a set of inputs and, based on those inputs, changed their next action. Look at video games from the ’90s; it was often “player vs. AI,” and yes, they were marketed and called AI in a lot of cases.
This is what I meant by the last paragraph. LLMs should never be acceptable for flight control. As you correctly stated, they’re not deterministic, but that’s not all of “AI.” If the FAA comes out and says they’re using Claude Code to handle this, those people should be fired and the project scrapped. Tracking the locations of objects and detecting whether two of them will collide has been effectively solved for decades, so that could be a good use case. We don’t know what they’re using AI for or what kind of AI it is.
I’m not going to convince you. Read some papers yourself. Maybe check Nielsen’s website if you want to learn a bit.
These people obviously are not proposing to incorporate “Mario Bros” AI into control towers. They are talking about modern AI.
I get what you’re saying. My point is that there are many types of AI. We don’t know which they’re going to use, since they just said “AI,” which is such an overused and overloaded term that we can’t tell what they mean. You’re assuming LLMs, which would be stupid, wasteful, and likely dangerous. But not all AI is LLMs.
In the modern context where this is happening, it is. Would it be clearer if they said “LLM” explicitly? Yes. But that’s what is meant; they don’t mean one of the previous definitions of “AI.”
Is it? How do you know? But let’s say you’re right: how would that work? Have a person type in, “I’m at Seattle airport, I have a plane looking to land from Arizona, they are 18 miles out and 2,500 feet high, which runway should they use?”
That takes longer to type, not even including the calculation time. There is literally no benefit.
The fact that you say that I am assuming LLMs tells me you don’t understand what you are talking about.
No shame in not knowing. But some shame in having a strong opinion on something you don’t know.
(btw, I have a phd in CS)
I don’t have a strong opinion. My whole point was that AI is a buzzword with many possible meanings, and its use was likely pushed by the PR department. Before we form strong opinions, let’s get details on what they will be using and how.