
AI in Education: Ethics, Safety Considerations, and Questions Parents Should Ask


Published: May 16, 2024 | Updated: Aug. 26, 2024


With anything as revolutionary as AI in education, concerns about ethics, privacy, security, and safety are very much on our minds. It’s natural for parents to feel a sense of anxiety around AI in the classroom: everything feels new, and we don’t necessarily know how our kids will be affected. As with any internet technology, you want to be careful about what personal information you share.

Polls and studies from the National Parents Union and the National Coalition for Public School Options found that while most parents are uncertain, they are open to the possibilities of how AI can transform their children’s learning. Parents in the studies largely support AI being used to customize lessons and develop class materials, give feedback on students’ homework, and help with online tutoring, but the majority don’t want AI evaluating their child’s work.

A report from the Center for Democracy & Technology explores how school technology has the potential to harm students, with the harm often felt most by vulnerable students such as kids with disabilities. The report found, for example, that students, parents, and teachers aren’t receiving adequate information or training on privacy, student activity monitoring, content filtering and blocking, and generative AI. It also found that students with IEPs and/or 504 plans are disciplined for using, or being accused of using, generative AI at higher rates than their peers. Often, parents are left in the dark when it comes to school policies on the use of AI.

Is using AI cheating?

It’s no surprise that AI tools like ChatGPT have been compared to steroids in sports, creating fear that they could lead to a cheating crisis in education. Dr. James Basham, professor in the Department of Special Education at the University of Kansas and principal investigator and director of the Center for Innovation, Design, and Digital Learning (CIDDL), often hears concerns from individuals and professionals that AI will help students cheat, but it all depends on one’s perspective. “If we're measuring student growth and understanding in only one way, then we're really kind of missing the boat on how to assess learning,” he says. “What we know is that most students are going to be able to express themselves through writing and that's usually what people are picking up on: ‘Well, the AI can write for them.’ But if that, again, is the only way you're assessing a student's ability, knowledge, and skills, then you're probably assessing it too narrowly; we have to really look at multiple ways in which we're assessing the learning process.”


Knowing how to use AI meaningfully is a human skill, and whether using it is cheating depends on people’s perception. For example, Dr. Sarah Howorth, BCBA-D, associate professor of special education at the University of Maine School of Learning and Teaching in their College of Education and Human Development and the director of Maine Access to Inclusive Education Resources (MAIER), explains how when the chalkboard was first invented, people considered it cheating because it showed students how to do the work instead of figuring it out on their own. “I see AI and any kind of technology the same way. It depends on how you look at it and how you use it. Yes, you can use a chalkboard to show how to calculate an equation, but the human has to write on the chalkboard — the same with AI,” she says.

For example, while a generative AI tool can write things, a student still has to tell it what to write, how to write it, and in what tone or voice. This is called prompt engineering, which means that a student would be “structuring text so that it can be interpreted and understood by a generative AI model — to create materials that meet teachers’ instructional needs and students’ goals.”

Charmaine Thaner, special education advocate, parent trainer, owner of Collaborative Special Education Advocacy, and a parent of an adult son with Down syndrome, shares a sample question parents can ask their kids when prompting for an essay: what persona would you want your AI tool to have if it’s going to help you with this writing assignment? For example, you could tell ChatGPT, “You are an excellent creative writer with much experience writing mystery stories.” The student has to think of a persona that fits the assignment to feed the AI tool, which is a skill and a task in itself. A student could also use ChatGPT to reword a passage of text into a different reading level, or to create images to pair with vocabulary words or other text. These are all skills that students learn through the use of AI tools.
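
For readers who are curious what this looks like in practice, here is a minimal sketch in Python, assuming the OpenAI Python SDK and an example model name; the library, model, and wording are our own illustration rather than a tool any of the experts specifically recommend.

```python
# A hedged illustration of "persona" prompting, assuming the OpenAI Python SDK.
# The persona and the request are the parts the student still has to think through.
from openai import OpenAI

client = OpenAI()  # expects an OPENAI_API_KEY environment variable

persona = (
    "You are an excellent creative writer with much experience "
    "writing mystery stories."
)
request = (
    "Reword the following paragraph at a fifth-grade reading level, "
    "keeping the suspenseful tone: ..."  # the student supplies their own text here
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; any chat model is used the same way
    messages=[
        {"role": "system", "content": persona},  # the persona the student chooses
        {"role": "user", "content": request},    # what, how, and in what tone to write
    ],
)
print(response.choices[0].message.content)
```

Notice that the persona and the request are still the student’s own thinking; the tool only responds to what it is given.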

Thaner believes students should also be given ample time to explore on their own how AI can help, and that it can be a very creative experience. “With all tools or educational materials, like new manipulatives for math, we want to give kids time to explore and play with that new tool. And so for kids using AI, I think they also need that exploration time,” she says. For example, kids can work together in a social environment, come up with some wild story, and then have AI illustrate it. “And you’d think, ‘Well, isn’t that cheating? I mean, isn’t AI doing all the work?’ But there is still work that the students have to do — the persona that you give the AI tool.”

What about using AI to write essays?

When it comes to writing essays, a tool such as ChatGPT can be, and is being, used to cheat. Although AI can generate written responses for students who struggle with writing and other areas, its use is still controversial. Special education advocate and owner of KnowIEPs Dr. Sarah Pelangka says that some parents think AI is fantastic and can be an excellent accommodation for their child, while other parents are more hesitant. Some parents have said, “My child is just trying to graduate, he’s just trying to get through school, he doesn’t want to write in his career, so we don’t really care.” Others have said, “This is cheating. We don’t want this. We want to continue to push.”

Howorth tells us, “If a computer can do the assignment, you’re giving the wrong assignment.” There are ways teachers and parents can teach kids to use these tools more meaningfully when writing essays, she says, in ways that promote deeper learning.

How can you ensure your child is actually learning and engaging meaningfully with the educational material? “I would do comprehension checks,” Dr. Pelangka says, “and have them dictate whatever the topic of the written response or the essay is: ‘Tell me what you learned about as it relates to World War II, for example,’ or, ‘When you read this text, summarize it to me,’ to ensure that they’re at least comprehending the information. I think that’s probably the most important part.”

Data privacy: what to know

Data privacy and surveillance are two of the main concerns regarding the use of AI. A report on the future of AI in special education states, “All students should be taught about what information any AI collects, how it is stored and how it is shared. Parents have a role to play in that regard as well, in considering whether a school that uses AI is right for their child, if it complies with an Individualized Education Plan and if it can be personalized while being respectful of diverse student backgrounds and values.”

It’s important to note that all the experts we interviewed recommend not entering a child’s personally identifiable information into AI tools because of data privacy issues.

Giving an AI tool a profile about your child

"The thing that really makes the difference when you're using AI tools is how much you tell AI about your child. And I warn parents, do not use your child's name, don't use your name, the school's name, any identifying information, because what you put into AI is out there in the cloud or wherever. However, you can describe your child almost like your About Me page if you use a profile that you give to the teacher in the fall. And you can insert that information into your AI tool. When [AI develops] modified lessons, it knows, 'Oh, this student learns best when they're working with a peer,’ or, ‘This student learns best when they have more visual prompts.’ So those little details and nuances that you know about your child, when you put that in your AI tool, you're going to get a much more individualized, really appropriate activity for your child," Thaner says.

This also ties into the issue of surveillance, especially if these tools are being taken home. Studies show that students with disabilities are more likely to use generative AI tools such as ChatGPT, and that they’re also more likely to be disciplined for using them. With AI tools creating more access to education and leveling the playing field for kids with disabilities, is it fair that they are disciplined for using those tools?

Protecting our children's private information

Thaner tells us we don't really know where all the information can go when we plug it into an AI tool. “Just like we have HIPAA rules and privacy rules for other things in school, we need to have that also when we're using AI tools. You can share a couple stories with parents of ‘What are some of the possibilities if someone on the internet had all this private information about your child? How could that be used against you or your child?’ It’s always better to err on the side of caution and not give any of that private information to the AI tool to begin with.”

Basham explains that it’s vital for school districts to provide guidance on how parents, teachers, and students should appropriately interact with the data systems they have in place. These are discussions that should be happening during the IEP meeting if AI tools are being added to the IEP. “One of the things parents can do is talk to their schools about how AI is being used in our classrooms. What are some of the ethical sort of considerations that are being made to support that?”

Modeling AI safety for our kids

Howorth reminds us how important it is to exercise parental controls over technology, and that we need the same kind of guidance for AI that we already have for internet and social media safety. Modeling how to respond when an unknown message arrives on social media, for example, encourages open dialogue. It signals to kids that they can talk about it when something like that happens to them, and it is bound to.

“I'll say out loud, ‘I don't know this person. This is creepy,’ and I'm modeling that for my children,” Howorth says. “So that when they get it, they're like, ‘Oh, yeah, look at all these I got.’ Then we have that safe place to have the conversation. Because if you don't talk about it, they feel as though they shouldn't be talking about it.”

Is AI accurate and trustworthy?

AI literacy: it’s a thing. Are students being taught how to use AI? Where to find valid information? How to discern true information from false? “You can’t trust everything that comes out of an AI tool,” Thaner says. This is important for parents and students to keep in mind when using AI for schoolwork and homework.

Sometimes AI comes up with information that isn’t accurate, so it’s important to always read, edit, and revise what tools like ChatGPT give you. For example, Howorth explains that kids might use ChatGPT to create a cover letter for a job but not evaluate it before sending it out, and it may contain inappropriate things. “So again, we have to teach people to evaluate what it’s creating.”

Should AI be banned from the classroom?

Some educators have called for banning AI, but despite the ethical implications, our experts tell us that’s not the best long-term solution. AI exists and is not going away anytime soon. Knowing these concerns makes you more aware of what’s going on in the AI world so that you can take steps to protect your own and your kids’ information on AI platforms, because policies can only go so far. Parents are encouraged to do the best they can in the situations they can control.

Howorth says, “We can't ban it all. We can't just say, ‘No AI anywhere.’ Because that would eliminate spell checkers, grammar checkers, transcription, sometimes there's text-to-speech or speech-to-text, which is an accessibility feature, but it's run off [of] AI. Otter.ai is a very good example. There are others. So we just have to teach people safety.”

Basham shares a story of how New York City schools banned ChatGPT after it launched in 2022, and how the city’s Department of Education has since reversed that decision to explore how AI can help students in the classroom. He argues in “The Future of Artificial Intelligence in Special Education Technology” that banning AI can’t work because it would lead back to traditional pen and paper, which would leave out many students with disabilities.

While some people might be quick to react to new technologies, AI will still “make a progression into schools in a meaningful sort of way,” he says. We need to keep discussing how to use AI effectively and to continue researching it, since relatively little research on its use in classrooms has been done so far. He says the next step is developing best practices and usage guidelines, getting those into schools, and testing them to make sure they work as intended.

Legislation surrounding AI in education

Policies are now being developed to address many of these ethical concerns. According to the California Department of Education, schools must consider COPPA (Children’s Online Privacy Protection Act) and FERPA (Family Educational Rights and Privacy Act) when evaluating AI use in the educational setting.

The U.S. Department of Education, Office of Educational Technology, released new guidance on how to weigh the benefits of AI against growing concerns about its risks. In the guidance, they write, “Education-specific policies are needed to address new opportunities and challenges within existing frameworks that take into consideration federal student privacy laws (such as the Family Educational Rights and Privacy Act, or FERPA), as well as similar state related laws. AI also makes recommendations and takes actions automatically in support of student learning, and thus educators will need to consider how such recommendations and actions can comply with laws such as the Individuals with Disabilities Education Act (IDEA).”

The department will work across agencies and use the White House Office of Science and Technology Policy’s Blueprint for an AI Bill of Rights as a guide to ensure that whatever measures are taken, “AI is trustworthy and equitable.”

Questions for parents to ask about AI

Whether you’re in an IEP meeting, in a parent support group, or at home doing your own research, here are some questions to ask about AI as it relates to your child’s education (with some questions from Basham’s report Inclusive Intelligence: The Impact of AI on Education for All Learners, the Office of Educational Technology, and this report from ISET):

AI and its uses

  • How is AI being used by the school? How is it not?
  • What are appropriate uses of AI in my child’s learning?
  • What AI tools are being used at school and can I use them at home, too?
  • What types of AI are available to increase my child’s access to and ability to benefit from the general curriculum?
  • Is my child continually showing growth in the learning process?
  • Can the AI provide personalized instruction while being respectful of my child’s disability?
  • Is the AI being used to enhance my child’s learning experience or to replace actual learning?
  • How are AI tools being used to supplement my child’s learning?

AI and IEPs

  • Does my child need AI and why?
  • What areas of need will the AI/AT evaluation be addressing?
  • How can AI be used in my child’s IEP?
  • What are the long-term implications for my child?
  • Does the AI align with my child’s IEP goals?
  • How can the use of AI tools be integrated into the curriculum in a meaningful way for my child’s needs?
  • How will the AI tools in my child’s IEP be evaluated and measured?

AI and inclusion and independence

  • How can AI facilitate my child’s inclusion in the general education curriculum?
  • How can AI promote my child’s independence?
  • How can AI make learning more accessible for my child?
  • How is AI enabling adaptation to my child’s strengths and not just deficits?
  • How is my child being involved in choosing and using AI for their learning?
  • Is AI supporting the whole learner, including social dimensions of learning such as enabling my child to be an active participant in small group and collaborative learning?
  • Does AI contribute to aspects of student collaboration, like shared attention, mutual engagement, peer help, self-regulation, and building on each other’s contributions?

AI and parent empowerment

  • How can I advocate for the use of AI in my child’s education and IEP?
  • What do I do if the school denies my child AI?
  • How can AI empower me to participate further in my child’s learning?
  • How can I be trained in the AI tools in my child’s IEP?

AI and data safety and privacy

  • Are my child and I okay with their personal data being collected and analyzed by a computer system?
  • What systems are being used to monitor my child’s use of AI?
  • Will the AI systems’ analysis of my child and their disability lead to any bias that limits perceptions of their performance?
  • Are my child and I being properly taught and informed about data privacy and cybersecurity?
  • Who has access to my information/data and who owns it?
  • Is the AI “always listening”? (For example, Howorth explains: “If there's any kind of AI that will record what's happening in a classroom — for transcription for students — that also has to be looked at carefully to make sure it's not recording everything else that's going on, because of privacy issues.”)

For more about using AI in schools and in our kids' IEPs, be sure to check out our main article AI in Special Education.


Author

Sarah Bun, Writer

Co-author: Adelina Sarkisyan, Undivided Content Editor and Writer

Reviewed by: Cathleen Small, Editor

Contributors:

  • James Basham, Ph.D., professor in the Department of Special Education at the University of Kansas and principal investigator and director of the Center for Innovation, Design, and Digital Learning (CIDDL)
  • Sarah Howorth, Ph.D., BCBA-D, associate professor of special education at the University of Maine School of Learning and Teaching in their College of Education and Human Development and director of Maine Access to Inclusive Education Resources (MAIER)
  • Charmaine Thaner, special education advocate, parent trainer, owner of Collaborative Special Education Advocacy, and a parent of an adult son with Down syndrome
  • Dr. Sarah Pelangka, BCBA-D, special education advocate and owner of KnowIEPs
