Alpha School, a pricey private school with campuses nationwide, uses artificial intelligence to instruct students. AI schools have been praised by the Trump administration, but researchers say there's limited evidence the model works.
That’s not what the data says. These kids are going to outpace traditional learning kids by miles.

Is this data in the room with us?

https://www.mdpi.com/2078-2489/16/10/895

“AI should serve as a scaffold for cognitive construction rather than a substitute.”

“…the teacher’s role is shifting from knowledge transmission to instructional design and behavioral facilitation… Teachers must develop digital literacy and data fluency while acting as safeguards against over‑automation, ensuring that human judgment and educational values mediate AI adoption.”

“…while AI offers efficiency and feedback advantages, traditional teaching remains essential for tasks requiring cultural interpretation, discourse depth, and emotional connection. A blended model—AI for repetitive or procedural tasks and teachers for critical discourse—appears most effective.”

This study explicitly does not advocate for replacing teachers with AI, and repeatedly cautions against doing so.

“These findings highlight both the promise and the limitations of AI in language education, underscoring the importance of teacher facilitation and thoughtful design of human–AI interaction to support deep and sustainable learning.”

You have to excuse them, they used AI to summarize it.

The problem is there are no teachers in this scenario, at least that’s my understanding.

The only research I’ve seen on using LLMs in a school setting found that the kids who were given access to an LLM performed a bit better on exercises than those without. At the same time, their experienced learning was a lot better. But when they finally got a test assignment, the kids who had been using LLMs during the exercises flopped and performed significantly worse than those who hadn’t.

AI hasn’t even been around long enough for any meaningful data to be collected, surely. Also, post this “data” you’ve twice now claimed exists.

Why are you hounding them for the data? They would swear on their honor that Grok said it, and that’s somehow not enough for you. They even asked a follow-up “Are you sure?”, to which Grok reaffirmed its findings. Maybe you should be practicing law if you want to act like you care so much about “evidence”.

https://www.mdpi.com/2078-2489/16/10/895

This is for college students (aka students educated enough to learn on their own already), reads like a promotion for AI, has a limited sample size, and does not translate to school kids at all. And from the study itself:

“Finally, the study’s limitations include its single-institution sample, short duration, and reliance on proxy behavioral indicators. Ethical concerns around informed consent, data privacy, and AI dependency also warrant closer attention. Future research should pursue longer-term and cross-institutional designs, employ multimodal behavioral measures, and develop governance frameworks that align technical gains with equity, autonomy, and critical capacity.”

This “study” seems to spend more time opining on AI learning frameworks than actually measuring scores on standardised testing, and dedicates only a minimal amount of the paper to the results. It also states in the paper that higher-achieving college students saw less benefit (for a poorer-performing student, AI can bump your grades enough to be noticeable for a unit, or to pass an exam).

Did you read this study, or google something in order to provide a study? This study does not support the claim that “these kids are going to outpace traditional learning kids by miles”.

It’s also for learning English, which is probably the thing a large language model is most suited to. It’s not going to be much use teaching music or drama.

No, the end part was my own opinion. I do believe classrooms that embrace AI will outperform traditional learning classrooms by a mile.

Alright, yes, the study is limited; AI learning is very new. Want me to pull out a study from 20 years ago with decades of proven data?

You said the data says otherwise, which you then used to support that opinion. The data doesn’t say otherwise.

Almost like that was in my original comment, which you then replied to with a study as if it were compelling, so spare me the sassy comment. Don’t claim the data says otherwise when it doesn’t if you don’t want to be called out on it.

Really? Because the data I’ve seen says the exact opposite: that Gen Z is the first generation of people dumber than the generation before them. These kids are already fucked, and AI is going to make it even worse.