From Reactive to Redemptive

How Christian Schools Can Build a Mission-Aligned AI Strategy

Leading with Lament

When I lead AI workshops for Christian school faculty, I always start in the same place: lament.

Teachers and school leaders have experienced AI as a net negative. It arrived on the heels of smartphones, social media, and COVID disruption. Burned-out educators were suddenly faced with a technology that students could use as a universal cheating device, just as we were trying to help them recover from pandemic-era learning losses. Many students now use it in ways that short-circuit learning and diminish their cognitive, volitional, and creative capacities. For most teachers, AI has felt like one more wave crashing over them without warning.

We need to name that loss honestly and sit with it before we try to solve it.

But there is a second lament that is harder to see because it has become normal. We are all teaching in some version of an industrial model school.

We put kids on a conveyor belt and push them through the system, even though we know each student has different needs and gaps. We are told to differentiate instruction for twenty students at once, which is unrealistic. Some students are lost, others are bored, and we end up teaching to the middle. This is the system we have inherited. Most teachers know it falls short of forming each student to the fullest of their potential, because the factory model makes that goal unattainable.

We have been trying to fix this system for years. Despite pouring roughly $30 billion into educational technology over the past two decades, the results are sobering. According to PISA data, global test scores in math, science, and reading have been declining since the early 2010s, coinciding with mass adoption of digital devices. The digital revolution digitized the industrial model without transforming it.

We need to live with both laments. The technology that opened the first wound may also be what makes the second problem solvable, if we are wise about it. If we respond to AI only at the level of surface problems, we will miss the deeper opportunity and repeat the mistakes of the past.

The Scale of What’s Coming

Andy Crouch’s “A Redemptive Thesis for Artificial Intelligence,” written with his colleagues at Praxis, captures the magnitude of this moment. AI is likely as consequential as electricity and may prove as transformative as agriculture. Technologies of this scale create real common good, but they are also shaped by disordered human desires.

We have seen this pattern before. Christian schools largely accepted the internet, smartphones, and social media without shaping them, and then responded after the damage was done. We cannot afford that posture again.

AI is here to stay. Many educators still think of it as just another tool they can choose to learn or ignore. But AI is not simply a tool. It is a new form of infrastructure that will undergird work for the foreseeable future. As such, it will reshape education. Our generation of Christian school leaders and teachers has been entrusted with the responsibility to steward this technology in ways that align with and advance our mission.

God has placed us in this moment. It may feel like too much to handle, but now is not the time to panic. God has not been caught off guard by AI, and He will not abandon us in our time of need.

Why Policies Are Necessary, but Not Enough

Most schools were caught off guard by generative AI, and many still lack formal policies. Schools that do have policies are investing significant time trying to curb misuse. These efforts are understandable and necessary. But they do not address the core problem. Policies fail at multiple levels, and understanding why is essential to building something better.

Start with the most basic policy: banning AI use. Even if a school enforces this rigorously, students are going to use it. And the students who need the most support will use it the most foolishly. Research confirms this pattern. Students with stronger academic habits use AI strategically when they encounter it. Novice learners accept outputs uncritically and become more dependent. Without intentional intervention, AI widens the very gaps schools are trying to close. A ban does not prevent this. It just makes it invisible.

Now consider the alternative: allowing AI but keeping everything else the same. This is where most progressive schools land, and the results are predictable. Students who use AI often produce higher-quality work, but those gains disappear when AI is removed. The mechanism is what I call the illusion of understanding. Consuming polished output feels like learning, but it does not produce durable, transferable understanding. Teachers have seen a version of this for decades. A student watches a teacher demonstrate how to solve a problem. It makes sense. They think they understand it. But when it comes time to do it on their own, they cannot. AI accelerates this dynamic. Anthropic’s AI Fluency Index documented what they call the “artifact effect”: when AI produces polished outputs, users become less critical, less likely to check facts, and less likely to question the reasoning behind what they see. Permitting AI use without redesigning the learning is the same trap with a different trigger.

Go deeper, and a third problem emerges. No policy, whether it bans or permits AI, cultivates the wisdom students need to use it well. Policies draw lines. They do not form character. Students will graduate into a world where AI is embedded in every profession. They need to learn when to lean on it and when to resist the temptation to outsource their thinking.

The challenge is not primarily behavioral. It is formational and structural. And that requires a different kind of response.

Getting Theology and Ethics in Order First

If the problem runs deeper than misuse, the response must run deeper as well. Before adopting tools or redesigning structures, Christian schools need a clear theological and ethical framework. Otherwise we are building on sand.

Drawing in part on Crouch’s work, I have developed nine redemptive principles that function as decision-making filters for every AI adoption question a school will face. These are not abstract ideals. They are practical constraints and guides for the concrete decisions schools will need to make.

  1. Imago Dei and Human Agency. AI should inform human choice, never replace it. Students, teachers, and staff remain responsible image-bearers.
  2. Christian Character Formation. AI adoption must serve the school’s primary mission of forming students in character, cultivating virtue and resisting vice.
  3. Wisdom over Convenience. We adopt AI that deepens attention, creativity, and understanding rather than outsourcing thought. Wisdom, not convenience, is the measure of a good tool.
  4. Embodied Learning and Sabbath. AI must respect human embodiment and protect healthy rhythms of work, rest, and worship. Technology serves embodied learners; it does not disembody them.
  5. The Primacy of Real Relationships. AI may assist communication but must never pose as a person or replace genuine human community.
  6. Transparency, Privacy, and Trust. AI tools must operate transparently, protect student and family data, and maintain the trust our communities depend on. It only takes one breach to undermine everything.
  7. Justice for the Vulnerable. We prefer AI tools that widen access and blessing, especially for those with linguistic, economic, and physical disadvantages, over those that concentrate power or profit.
  8. Creation Care. As stewards of God’s creation, we evaluate the energy use and ecological impact of our AI adoption.
  9. Vocational Stewardship. We equip students to see their vocations as callings, with AI as a tool for meaningful work and kingdom service, not a replacement for the human labor that gives life dignity and purpose.

These principles change the questions we ask. When evaluating an AI tutoring tool: does it preserve space for human agency, or does it replace it? Does it strengthen relationships, or weaken them? Does it cultivate wisdom, or simply increase efficiency?

A school that filters decisions through these principles will make fundamentally different choices than one simply chasing efficiency. And those choices point toward something bigger than policy reform. They point toward a different kind of school.

From Strategy to Structure

This is where the conversation becomes exciting and generative. If we take our theology seriously, I believe we will neither create a new form of anti-tech legalism nor mindlessly infuse AI into our broken model. Instead we will ask what kind of school AI makes possible, and what kind of school our mission demands.

I say this to every school I work with: AI is a force that will reshape time, space, work, and money within education. Some AI applications, such as well-designed tutoring systems, can produce meaningful learning gains. But those gains depend on intentional design and wise teacher deployment. Generic AI use does not produce the same results. If we simply layer AI onto the industrial model, we are unlikely to see meaningful improvement. What is required is a better model.

Consider time. If adaptive AI can help students master foundational knowledge more efficiently, what should we do with the time that opens up? In my conversations with school leaders, this is where the energy shifts. We could recover classical and medieval forms of learning that the industrial model squeezed out: sustained reading and writing during the school day, engaging Socratic discussions, and Oxford-style tutorials where a teacher sits with one or two students and works through their thinking. These are the things Christian educators went into education to do. AI might give us the time to do them.

Consider space. Traditional classrooms assume uniform instruction. If part of the day becomes individualized and part becomes deeply relational, our physical spaces will need to reflect that shift.

Consider work. Teacher roles, assessment practices, and administrative responsibilities are all up for reimagination. If AI can handle routine tasks, teachers can focus more fully on formation, mentorship, and meaningful feedback. The goal is not efficiency alone, but a more human form of teaching.

And consider money. Schools will need to rethink how they allocate resources. AI implementation will cost money, but it may also make subscriptions to frustrating SaaS products unnecessary as schools learn to vibe code their own custom tools and platforms. AI may create opportunities to expand access to high-quality Christian education, including new curricular pathways aligned with students’ individual callings. But that only happens if financial models are aligned with mission rather than driven purely by cost reduction.

What It Takes

This kind of change does not happen through isolated initiatives. It requires sustained institutional commitment. When I work with schools on AI strategy, I tell them there are three non-negotiables.

Governance comes first. Schools need leadership structures responsible for developing and implementing AI strategy in alignment with mission. Someone has to own this, and “someone” cannot mean everyone hoping the IT director figures it out.

Faculty development is the engine. This cannot be a single professional development day. Teachers need sustained formation. They need to understand how AI works, how it shapes learning, and how to use it wisely in their specific disciplines. In my work with schools, that often includes helping teachers design new forms of AI-proof assessment, use AI as a co-worker and a coach, and even build simple tools that support their instructional goals. The goal is teacher agency: not teachers who are told what to do with AI, but teachers who are equipped to make wise decisions about how to use it to know and love their students better.

Communication holds it together. Parents and boards are paying attention. They want to know that the school has a framework, a plan, and people who are leading thoughtfully. Clear communication builds trust and helps families navigate these changes alongside the school.

Neither Optimist Nor Doomer

I am neither an optimist nor a doomer about AI. It will produce both good and harm because fallen and redeemed image-bearers have their fingerprints all over it.

The pace of change is blistering and bewildering, but the task before us is not new. It is the same task the church has always faced when the world shifts beneath its feet: stewardship. We are called to use the tools of our time in ways that align with God’s purposes. For Christian schools, that means moving from reactive policies to a redemptive strategy, from managing disruption to building something better.

Christian schools are uniquely positioned to lead in this moment. We understand human nature. We know the formative purpose of education. And we have the Holy Spirit to guide us through uncertain terrain.

AI is already reshaping education. Our task is to shape its use in ways that align with and advance our mission. If we take the creation mandate seriously, we can begin to heal what was already broken in our schools.

While the world divides along Luddite and techno-optimist lines, Christian schools have the opportunity to begin the creative, redemptive work of reimagining education in the age of AI.


Sean Riley is Chief Strategy Officer at The Stony Brook School and Executive Director of Gravitas, the school’s online program. He also works with Christian schools through EDified Strategies to develop mission-aligned approaches to artificial intelligence.

Sean has served at Stony Brook for nearly two decades in a variety of roles, including teacher, Ethics Bowl coach, athletic coach, dorm parent, History Department Chair, and Academic Dean. He holds a PhD in philosophy from Baylor University.

He lives in New York with his wife and their four teenage children.