Liberty Rose Studios

Featured Image Composed with Adobe Express AI Text-to-Image Generator

Using AI In School

The Good, The Bad, and the Ugly

Written by Mattske // July 8, 2024 (Read Time: 7 Minutes)

SUMMARY

Professionals across the education industry, along with students, are currently exploring the multifaceted impact of artificial intelligence on the learning process itself. One of the central problems to solve is figuring out where the responsible use of AI begins and ends without sacrificing academic integrity. Concerns about potential plagiarism must be balanced against the real opportunities for personalized learning. It is critically important to teach students (and educators) how to use AI to enhance critical thinking and creativity in and outside of the classroom.

Standard Deviations

Many studies have recently examined the potential benefits (and harms) of integrating AI into the classroom. One of the more common tropes in the surrounding hype is the idea of students using text generators to cheat themselves out of learning, and to cheat teachers out of knowing whether a student actually used any of their own brain power to compose essays or test answers. According to a recent Pew Research Center study, a quarter of U.S. teachers say AI tools do more harm than good in K-12 education.

However, all of this demonstrates that AI is a tool which can be used or abused accordingly. If teachers receive multiple copies of the exact same essay from multiple students, it hardly matters whether AI was involved; that is obviously plagiarism or some form of collective cheating. In that sense, it would behoove students and teachers to encourage the sharing of information while students work on things like essays about existing literature or historical events, because that produces more human dialogue and creates the conditions for better prompts, and therefore more individual returned assignments.

According to Luona Lin of Pew Research Center, “among teens who know of ChatGPT, 19% say they’ve used it for schoolwork.” Use becomes more common as kids get older, with 24% of surveyed 11th and 12th graders saying they have used it for schoolwork. Other data from the same study suggest that 69% of students believe it’s acceptable to use AI to research new topics, 39% say it’s acceptable for solving math problems, and 20% say it’s acceptable for writing essays.

“Don’t Talk So Much, Old Sport”

Sure, if you prompt ChatGPT to “write an essay on The Great Gatsby,” you will get a generic response. Many students may prompt ChatGPT with that same lack of enthusiasm and disinterest in the subject matter.

However, there are many ways a student could prompt ChatGPT that would not produce an essay but could instead help explain or explore aspects of the novel that were not covered in class. Remember, ChatGPT is a “conversational AI,” which means it is meant to evoke discussion between the user and the text-generative program. If students were encouraged to discuss their work with each other as well as with AI, they would become part of the machine-learning process instead of only taking advantage of what has already been done. ChatGPT and other programs like it are in the language-learning business, which means they are still consuming new information and working it into their models. Much of the problem, therefore, is a misunderstanding on the teachers’ side of how to explain this technology to young people.

Education Week journalist Arianna Prothero asks at what age students should be taught about AI, and most educators agree that students should definitely be taught how it works. That will be exceedingly difficult so long as educators are ignorant of what this technology is, what it can and can’t do, and how it should be used for human advancement instead of cheating schemes.

Curve Your Enthusiasm

One of the more exciting ways to curb cheating lies in the creation of tests themselves. Caroline Preston & Javeria Salman from The Hechinger Report discuss this in their article about how AI could transform the way schools test young people. Imagine a class of students all given a math test: it could become somewhat individualized, at a minimum with different exact problems, and be constructed to create a sort of curve whereby more advanced students are further challenged while students who need to work on basics first get that chance. It is true, as Preston & Salman say, that “the technology isn’t there yet, and educators and test designers need to tread carefully” before trying this out at scale. But rather than fixating on the ways this technology could be used to cheat, there are definitely ways to use AI to enhance the learning experience and provide individualized instruction depending on where students are and how well they currently understand the material.
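To make the adaptive-testing idea a bit more concrete, here is a minimal, purely illustrative sketch in Python. The tiny question bank, the difficulty tiers, and the accuracy thresholds are all assumptions invented for this example; they are not drawn from Preston & Salman’s reporting or from any real testing product.

```python
# Illustrative sketch only: a toy adaptive quiz that raises or lowers
# question difficulty based on how the student has done so far.
# The question bank, tiers, and thresholds are hypothetical simplifications.

import random

QUESTION_BANK = {
    "basic":    ["7 + 8", "12 - 5", "6 x 4"],
    "standard": ["3/4 + 1/8", "15% of 80", "solve 2x + 3 = 11"],
    "advanced": ["factor x^2 - 5x + 6", "solve |2x - 1| = 7"],
}

def next_difficulty(correct: int, attempted: int) -> str:
    """Pick a difficulty tier from a simple running accuracy score."""
    if attempted == 0:
        return "standard"          # start everyone in the middle
    accuracy = correct / attempted
    if accuracy >= 0.8:
        return "advanced"          # stretch students who are ahead
    if accuracy < 0.5:
        return "basic"             # reinforce fundamentals first
    return "standard"

def next_question(correct: int, attempted: int) -> str:
    """Draw a question appropriate to the student's current level."""
    tier = next_difficulty(correct, attempted)
    return random.choice(QUESTION_BANK[tier])

# Example: a student who has answered 4 of 5 correctly gets a harder item.
print(next_question(correct=4, attempted=5))
```

The point of the sketch is only that every student can see different problems at a level matched to their current performance, which is the “curve” described above.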

The Imbalanced Equation

At the most heightened level of fear & anxiety about the introduction of artificial intelligence into education, Tech Business News reported that “ChatGPT May Lead To The Downfall Of Education And Critical Thinking.”

The Editorial Desk at Tech Business News believes that the following are the biggest risks to incorporating AI into the classroom:

  1. Encourages academic dishonesty
  2. Diminishes critical thinking
  3. Reduces creativity
  4. Promotes laziness
  5. Impacts memory retention
  6. Disrupts the learning experience
  7. Inequity of access

It is unlikely that conversational AI will slow down, which means it needs to be embraced in much the same way search engines were for research and education. But students must be constantly reminded that the results AI produces are not to be taken at face value. The student must continue to question what the machine produces, at a minimum by fact-checking it. According to one recent survey, more than half of students (51%) consider using AI tools such as ChatGPT to complete assignments and exams to be a form of cheating.

One of the easiest ways to address this, in my opinion, is to include the transcript of any conversational or other AI-generated content when citing sources. At a minimum, teachers could then use their own AI tools to cross-reference the prompts, the output, and the student’s final written product to see how much human brain power went into it compared to how much machine content was used. There will always be the potential for a student to mislead teachers or obfuscate some of the content generated with an AI tool. It is also possible, however, for schools themselves to adopt models that can be monitored in one way or another, with students signing in to use the app so there is a record of what they prompted compared to what they were given.
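As a rough illustration of what that kind of record keeping could look like, the sketch below simply appends each prompt and response to a transcript file that a student could hand in alongside their essay. The file name, the `log_exchange` helper, and the student and assignment identifiers are hypothetical; this is not a feature of any real school platform or AI product.

```python
# Illustrative sketch only: logging each AI prompt/response pair so a
# student can submit the transcript with their work. The file format,
# identifiers, and helper names are hypothetical.

import json
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("ai_transcript.jsonl")  # one JSON record per line

def log_exchange(student_id: str, assignment: str, prompt: str, response: str) -> None:
    """Append one prompt/response pair, with a timestamp, to the transcript."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "student_id": student_id,
        "assignment": assignment,
        "prompt": prompt,
        "response": response,
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Example: whatever AI tool the school provides, each exchange gets recorded
# so a teacher can compare the prompts and output against the final essay.
log_exchange(
    student_id="student-042",
    assignment="Great Gatsby essay",
    prompt="Explain how the green light functions as a symbol in The Great Gatsby.",
    response="(model output would be stored here)",
)
```

Whether this is done by citing transcripts or by a school-managed app that logs sign-ins, the goal is the same: a verifiable record of what the student asked for and what the machine gave back.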

What Is The Sound Of One AI Clapping?

These kinds of safeguards will eventually need to be brought into the education environment so that students can make the most of artificial intelligence while ensuring that their intellectual development is not negatively affected. We are going to see a new generation of students learning this way, which means that despite adults’ conviction that they know better, in some ways they will have to become students themselves. A “beginner’s mind,” as the zen masters call it, will serve educators and students best as they incorporate artificial intelligence into their workflow.

It cannot be stopped, and it should not be stopped.

But it must be moderated, leveraged responsibly, and used to create human benefit. Whether or not we can figure out ahead of time all the ways this technology will be misused is not the point. Educators already have a challenge on their hands in identifying the developmental problems and deleterious effects of technologies such as social media that have wreaked havoc on younger generations. Perhaps encouraging more emotionally optimistic discussions with sophisticated computer models will be a good thing for teenagers especially, instead of comparing themselves to one another on social media apps designed for advertising purposes that dehumanize them. In summary, artificial intelligence in the classroom poses more of a threat to the perceived necessity of teachers than it does to the development of a student’s young mind.

All in all, artificial intelligence in the classroom will be a good thing for teachers, students, and parents alike, so long as it is programmed for that purpose and young people are respectfully guided to use it that way.
