
As the popularity of Artificial Intelligence (AI) platforms like ChatGPT skyrockets, their role has come under scrutiny in conversations about academic integrity. Personal statements are a standard part of the college admissions process, once thought to be the only medium for showcasing an applicant’s unique qualifications. Administrators are concerned that AI will undermine the integrity of these statements and of the admissions process as a whole. However, the canned statements applicants already submit don’t contain much sincerity to begin with, and these platforms will make college admissions advice more financially accessible. Using AI as a tool in the admissions process is equivalent to hiring a professional consultant.

AI, like most tools, is only as good as its user, and it’s up to the applicant to use any platform wisely. If someone is going to lie on a college application or plagiarize a response, they can do so without ChatGPT. Many students already have an advantage over their peers because their parents can afford standardized test preparation, professional advisors and tutors. The situation is already unfair, and students who cannot afford those resources or do not have time for them should not be penalized for asking ChatGPT for advice. Even so, AI has not leveled the playing field: professional human support and assistance remain superior to AI assistance. Additionally, it would be fairly obvious if a student had AI spit out a personal statement and copied and pasted it onto their application.

It would not be effective for universities to ban AI outright since there’s very little they can do to regulate its usage. However, there should be consequences and accountability for blatant violations. Students do put effort into their applications, and that effort shouldn’t be disrespected or ignored. It’s worth noting that there is also no mechanism to monitor college admissions consultants and whether they cross boundaries or violate academic integrity, and there doesn’t seem to be the same degree of uproar about those advantages or possible abuses.

There is already a serious problem with the content of personal statements. More and more frequently, students are expected to capitalize on hardship or trauma in order to gain admission. Applicants are fully aware that their chances of acceptance increase if they write about the adversity they have faced due to any number of life circumstances, incentivizing the trauma dumping of 18-year-olds. Personal statements already have minimal integrity, and the standards drop with each admissions cycle.

There does need to be a line when it comes to actual assignments and coursework. Students should not rely on AI to complete their education for them; there’s no point in paying tens of thousands of dollars to let a chatbot attend classes in the student’s place. Using these resources for clarification or guidance does not warrant punishment, but universities should take steps to prevent abuse. Universities across the U.S. have taken different approaches, from discouraging the use of AI in writing and editing statements to allowing it.

Professors are also revamping their teaching methods to prevent the use of AI. Courses are being restructured as students face more oral exams, group collaboration and handwritten exams. Some universities also provide screen-recording platforms that students can be required to use during exams. These are the steps universities should take in response to AI rather than blindly and uniformly punishing all usage. AI can be a study tool that helps students understand the material, but the work on exams and assignments should ultimately be the students’ own.

Hopefully, the presence of AI will push universities to craft unique personal statement questions rather than regurgitating the same mind-numbing prompts year after year. Perhaps it will act as a catalyst for a more equitable and holistic process that focuses on who applicants are, not on their trauma or the world’s most boring 500-word essays. AI will be part of education going forward, but its impact ultimately comes down to student choices and personal responsibility. Universities need to take proactive steps toward revolutionizing education instead of targeting students who use the only tools available to them.

Author

  • The Editorial Board

    The Highlander editorials reflect the majority view of the Highlander Editorial Board. They do not necessarily reflect the opinions of the Associated Students of UCR or the University of California system.