
RAISING HEALTHY CHILDREN
Educators from Carolina Friends School in Durham explore the relationship between health and education.
Contact the school at:
CAROLINA FRIENDS SCHOOL
4809 Friends School Road
Durham, NC 27705
Telephone: (919) 383-6602
John Sharon joined Carolina Friends School as Assistant Head for Teaching and Learning in 2022. In this role, he assists the Head of School with academic initiatives and special projects; supports staff through professional development, recruiting, and retention; and provides ongoing, direct support for head teachers.
John has served as Assistant Head of School at Trinity School of Durham and Chapel Hill and at The Fenn School in Concord, MA. Prior roles include head of middle school and English and History teacher. In his career as a caring teacher and accomplished administrator, John has also served at independent schools in the Northeast and Washington, DC.
When it comes to the anxieties of parents and caregivers, some issues come and go in time. One topic that has stayed consistently near the top, despite (or maybe because of) the ways in which it is constantly changing, is the impact of personal technology on adolescents and teens. My own children were growing up just at the beginning of the technology wave, but as a teacher and administrator, I’ve seen the incredible growth of technology use and its impact. Technology does provide us with many powerful and useful tools that we want to embrace in the classroom, but we do a disservice to our students, their families, and ourselves if we wrap our arms around it without careful thought and planning.
Artificial Intelligence and Schools
Among the more recent concerns regarding technology in students’ lives is the advent of easily accessible and nuanced artificial intelligence (AI). In a recent Pew Research survey of U.S. teenagers, 19 percent say they have used AI to help with their homework, and 39 percent say it’s acceptable to use it to help them solve math problems. Moreover, 92 percent of teachers have varying degrees of doubt about the usefulness of the technology. We know that AI is here to stay and will always be changing. Those of us in schools, in particular, have to do our best to stay ahead of the curve and to be transparent with families about how we’re teaching children to interact with it.
In my role as Assistant Head for Teaching and Learning at Carolina Friends School, I think it’s important to look at AI use from both a student-user perspective and a teacher-user perspective, as the two are really connected. What is true for both is that AI can’t simply be relied upon without careful thought, process, and filtering of the information it provides. It’s important that we create a culture of curiosity rather than fear, particularly for the adults in our community.
We also want to be sure that we, as teachers and adults, are not embracing it unquestioningly. Working with AI is meant to be an iterative process, but you can only use it properly if you are well trained in its use. We want our students to enter the world as sophisticated users who know how to use it well. The goal of any school’s response to AI’s impact on student learning has to center on teaching students how to be critical users of AI as an information tool, rather than relying on it exclusively as a research tool. (In that same Pew Research survey, 69 percent of teenagers say it’s acceptable to use AI as a research tool.)
We need to be able to differentiate between the function of a search engine and that of large language models like ChatGPT, which generate responses from an amalgamation of internet data. For research, it’s important to be able to trace the source of the information. What comes through a tool like ChatGPT might be easily accessible, but it is not necessarily true or accurate.

And I want to say that teachers are already doing this. As they teach students to use AI carefully and thoughtfully, the students will learn that what they are using is a flawed system that needs to be filtered. In one recent survey, only 18 percent of teachers nationally reported using AI in the classroom; those who hold back often do so not because they don’t know how, but because they are afraid of it. If we don’t teach our students what it is and how to use it well, we are fooling ourselves and doing them a disservice. But we are also potentially doing a disservice to ourselves, as AI has some really interesting potential for helping teachers and administrators do our jobs better. And that could be a good thing!
I’ve seen some really interesting work being done around AI’s ability to sift through quantitative data. Its output still should be checked by humans, but it can be incredibly helpful in enabling schools to make better use of the data they are already collecting. Sometimes the limitation to harnessing data well is simply not having the human resources to analyze it meaningfully. AI can also be a tool for generating lesson plan ideas and a brainstorming partner for potential classroom activities. There are no ethical issues that I see in using it as a thought generator to find patterns in data or to ask specific queries that spark fresh ideas.
A lot of schools I know are writing policies right now about AI’s use among students and adults. Everything I’ve looked at so far suggests the best policies are ones that can stay nimble as the technology evolves. There’s been a lot of talk about AI in relation to plagiarism, and specifically the role it could play in college admissions. The jury is still out on how that’s going. Some college admissions offices say they can tell when an essay is generated by AI, but I’m not completely sure that’s true. Some colleges invite applicants to use AI as long as it is acknowledged as a source or tool.
Exploring Uses of AI in the Classroom
At Carolina Friends School, we’ve very intentionally allowed our teachers to explore AI use at their own pace, based on their own needs and curiosities and those of their students. For instance, two teachers in our Upper School’s language arts program received a curriculum grant this summer to develop an AI policy, with the idea that it could be the basis for something shared across subject-area departments and other units. They’ve already uncovered some great research on AI’s advantages and potential pitfalls. Our Middle School head teacher and librarian have also been at the forefront in exploring ways we can use AI strategically while also teaching students to be critical users. An organization called One Schoolhouse has been a great resource for some of our teachers and head teachers. This fall, we’re taking the conversation school-wide so that we can address the needs of teachers and students at each developmental level.
We’ve been hesitant thus far to create a top-down policy, in part because our method of making decisions as a Quaker school invites multiple perspectives and is built on a shared commitment to the continual revelation of truth. It’s also in part because so much is changing so quickly with the technology. The landscape is constantly evolving, and it’s on us as educators to continually examine what’s happening in real time.
Still, we want to be open to any technology—including AI—that supports diverse ways of teaching and learning, and we want students to leave our school feeling equipped and confident in using technology for the greater good.
Partnering with Families
We know it’s our job as educators to scaffold these skills in our classrooms, but as with any aspect of a whole-child education, it is important for parents and caretakers to be curious about this topic as well. You should feel empowered to ask questions about AI use at home, about whether your school has policies or frameworks around AI in general, and about how it’s being taught in the classrooms. It’s a little bit like when smartphones showed up and families were completely caught off guard; students knew more about the technology than their parents and guardians, which is almost always true! It is important to keep working at it, because the outcomes will be better if children are aware their families are having these conversations and asking these questions.
Some Resources for Parents and Caregivers:
Common Sense Media, an American non-profit that provides ratings for media and technology, with the goal of informing families about their suitability for children:
- “Parents’ Ultimate Guide to Generative AI,” https://www.commonsensemedia.org/articles/parents-ultimate-guide-to-generative-ai
- “Helping Kids Navigate the World of Artificial Intelligence,” https://www.commonsensemedia.org/articles/helping-kids-navigate-the-world-of-artificial-intelligence
Internet Matters, a non-profit launched by the United Kingdom’s largest internet service providers to offer child internet safety advice to parents, carers, and professionals:
- “A Parent’s Guide to AI” (interactive), https://www.internetmatters.org/resources/parent-guide-to-artificial-intelligence-ai-tools/