From left: panelists Dr. Brian Leech, Aaron Wetzel ’90 and Alex Mayszak ’13 with moderator TY Stone ’26
Futurist Thinking Series explores what AI can’t teach
As artificial intelligence rapidly reshapes how people learn, work and create, a more urgent question is emerging across education: What must remain distinctly human?
That question anchored the March 31 Futurist Thinking Series conversation at Augustana College, where faculty, alumni, students and community members gathered to examine the future of learning in the age of AI.
The panel brought together three perspectives on the technology’s impact, spanning scholarship, education and industry:
• Historian Dr. Brian Leech, director of Augustana’s Honors Program, explored how AI is changing intellectual formation.
• Alex Mayszak ’13, director of Digital Learning and Innovation for East Moline School District, shared how schools are adapting in real time.
• Aaron Wetzel ’90, vice president for Production and Precision Ag Systems at John Deere, offered insight into how artificial intelligence is transforming complex global systems.
The conversation was moderated by Augustana senior TY Stone, who is working to launch a nonprofit that introduces at-risk youth to artificial intelligence and digital skills.
Stone opened by naming the reality students already face.
“AI is already fundamentally changing the way we research, write and innovate,” Stone said. “The question here is what must remain distinctively human in the process … how do we use these tools without losing the judgment, without losing the curiosity and the human relationships that education is built on.”
The work of thinking
Dr. Leech emphasized that generative AI is forcing educators to reconsider how students develop habits of mind.
“One concern I often have is that AI can produce what everyone already thinks,” he said. “That’s not the goal of a college education. We want students to develop their own ideas.”
At stake is not simply how students complete assignments, but how they learn to think, he explained.
Writing and research, Dr. Leech noted, are not just about producing answers. They are central to how students learn to question sources, interpret information and construct meaning.
“The risk isn’t just that students use AI,” he said. “It’s that they use it as a crutch when thinking gets hard.”
Dr. Leech also pointed to something less often discussed in conversations about AI: the role of vulnerability in learning.
“Belonging requires vulnerability,” he said. “Students have to be willing to take intellectual risks, to be wrong, to work through uncertainty. That’s something technology can’t replicate.”
Learning in the age of AI
For Mayszak, the shift is not incremental. “At times, I wish we could put the entire educational system on a rocket, send it to space and start from scratch,” he said.
The comment, delivered with humor, reflects a deeper reality: the structures that have defined learning for decades are being challenged in real time. In K–12 classrooms, educators are grappling with how learning itself must be redesigned.
“We have to build human skills before we rely on tools,” he said. “Otherwise, without context, AI can short-circuit learning.”
At the same time, he sees this moment as a significant opportunity.
“This is the best opportunity we’ve had to rethink what learning looks like,” Mayszak said. The challenge, he suggested, is not preserving existing models, but reimagining them.
AI tools, he noted, are allowing students to test ideas, simulate real-world scenarios and engage in more complex problem-solving earlier in their education.
“When the foundations are strong, students can do things with these tools that simply weren’t possible before,” he said. “That’s where the energy is.”
Technology in practice
From the perspective of industry, Wetzel described how artificial intelligence is already embedded in modern agriculture.
“As a kid, we walked fields and pulled weeds by hand,” he said. “Today, self-propelled machines use cameras to identify and spray weeds at 20 miles per hour with incredible precision.”
John Deere has spent more than a decade developing connected systems that use data, machine learning and automation to support farmers in making better decisions.
“We have machines operating today, fully autonomous with no one in the cab, actually tilling fields right now,” Wetzel said. “When those machines see obstacles along the way, the machine will stop and send an alert to the customer. We have a human in the loop that is discerning those images.”
Yet even as machines become more capable, Wetzel emphasized that human leadership is becoming more important. “I firmly believe we need people who can think broadly, understand systems and ask the right questions.” In this environment, he said, the competitive advantage lies in interpretation: knowing what to challenge and what to ask in order to get the right outcomes.
As AI reshapes decision-making environments, Wetzel emphasized that leadership itself is evolving.
“The ability to navigate ambiguity and process large volumes of information quickly is becoming essential,” he said. “Leaders need to learn faster, interpret more and make decisions with incomplete information.”
Opportunity and access
Stone also highlighted how artificial intelligence is reshaping access to opportunity. He works with young people from underrepresented communities who are learning skills in web development, artificial intelligence and digital marketing.
“I want my community to be part of this conversation now, not after everyone else has already figured it out,” he said.
Access to technology alone, he noted, is not enough. Students must also develop the knowledge and confidence to question, apply and understand it. Without that foundation, AI risks widening gaps rather than closing them.
What remains human
Across each perspective, from the classroom to scholarship and industry, the message was consistent: technology is accelerating capability, but education must remain focused on developing the human capacities that guide it.
Panelists pointed to a set of capacities that will increasingly define leadership in an AI-driven world:
• Judgment — the ability to interpret information and make decisions in context
• Discernment — distinguishing insight from noise and consensus from original thought
• Intellectual stamina — the willingness to stay with complexity and do the work of thinking
• The ability to navigate ambiguity — making decisions without complete information
• Learning velocity — absorbing and applying new information quickly
• Relational intelligence — building trust, listening and working across difference
These capacities are cultivated through inquiry, challenge and shared learning, according to the panelists. The more capable artificial intelligence becomes, the more essential they become.
The Futurist Thinking Series was created to bring together students, faculty, alumni and regional leaders to explore questions shaping the future of higher education.
As Stone reflected at the close of the discussion, the conversation was ultimately about more than technology. “There is no world where people aren’t needed, no matter how strong AI gets,” he said.