6 Comments
A, Macare

I don’t think this is the full picture. What about the fact that our brains are consuming, filtering, and selecting far more information than ever before? What about the possibility that reducing the amount of energy needed for previous tasks frees up that energy to use the brain in ways we haven’t imagined yet? And further still, what about the likelihood that a student is using ChatGPT, in part, because they are doing the task for some external, meaningless thing like a grade, and they really don’t care?

In my experience, when students care, they learn. When I use AI for countless things I once labored over, it frees me to do more.

I love brains. I teach about them all the time, but there’s more than one way to light them up.

It’ll be interesting to see how this all plays out.

Marco D'Alessio

Feel free to conduct academically sound research, supported by brain scans and in-depth analyses, that tests your hypothesis. Otherwise, this is just whataboutism.

Stephen Fitzpatrick

This certainly resonates on multiple levels. When we look back, it will seem fairly obvious in hindsight that putting AI tools in the hands of young learners created a profound shift in skill development. I was interested in the finding that, once students are capable of directing an AI, it can enhance their abilities — which is likely why so many adults report positive experiences with AI. We forget what it's like to be a nascent learner. So what now? How do we protect students from AI? Bans have been ineffective. The writing is on the wall that white-collar jobs will likely demand AI fluency. Maybe some schools will distinguish themselves by capitalizing on this research and designing learning environments that create the best of both worlds. But it's not going to be easy. Educational institutions don't reorganize very efficiently. This will likely contribute to more inequality, not less. Thanks for the write-up - the report is over 200 pages!

Claude COULOMBE

The flagship application of generative AI is cheating. We're moving toward a Kafkaesque scenario: the homework is prepared by teachers using an AI tool, then the students do their homework with an AI tool, and finally the homework is assessed by teachers using an AI tool. The only one really doing any work is the AI tool…

Neuroscience is clear: developing and maintaining complex cognitive abilities requires active work and cannot rely solely on technological assistance. For our natural neural networks, it's "use them or lose them." In the long term, the misuse of AI and the law of least effort will likely create a generation of cognitively assisted people, subscribed for life to generative AI tools. That's another elephant in the room, alongside the unreliability of LLMs, their environmental impact, and the impact on jobs... Generative AI risks making us stupid. https://bit.ly/3I55FAs

The prospect of cognitive decline doesn't personally excite me. That said, everyone is free to make their own "informed" choices — which excludes children and other vulnerable people, who lack the maturity to make those choices for themselves.

Michael Spencer

A lot of teachers and professors have economic and financial incentives to be AI boosters and to benefit from AI being introduced into coursework in various ways. Meanwhile, a lot of young people are being hacked by corporations through radical consumer behavior modification programs at scale. I find both of these trends fairly nefarious in view of studies like these.

Roi Ezra

This is so important, and so well articulated. I’ve been writing about this too, from a different lens: not as a researcher or educator, but as someone who’s lived what it feels like to drift — to outsource the thinking just enough that it starts to feel like mine again only when I stop.

The part that hit hardest? The timing. The idea that AI can become an amplifier only after clarity has been earned. I’ve seen this again and again, not in theory, but in how I work, write, lead. When I protect my own rhythm first, the tools help. When I don’t, they shape me before I notice.

Thank you for naming this so precisely. It’s one of the only things that feels urgent enough right now. I tried to name a similar tension in a piece called “Curiosity Is the Real Advantage.”
