In April we wrote about suggested guidelines for educators developing AI policies, followed up with a blog for students, and created resources to help students protect themselves against false positives on original work. Many in our industry casually brush off the issue of false positives, pointing to the 99% accuracy of their algorithms and ignoring the faces behind the other 1%. One reason AI Detector is the only AI detection software aimed at individuals, rather than institutions, is that the faces behind the 1% are the little guys, the ones with the least power when their academic integrity is questioned. Institutions will be fine; students suffer. False allegations of academic misconduct disproportionately impact students, both emotionally and financially.

The Work of Black and Asian Students Is Surveilled More Closely

In 2022, Inside Higher Ed published an article citing data showing that Black and Asian students are accused of plagiarism more often than other groups, with Black students accused of academic misconduct at the highest rate of all. Responding to allegations of academic misconduct compounds the trauma of being a Black student at a predominantly white school. As of March 2023, most faculties lacked AI policies. That means students whose work is already surveilled inside racialized institutions are the most likely to suffer the fallout of a false positive from outdated AI detection software.

Isn’t this unfair labor I shouldn’t have to do?

Yes, absolutely. You’re right that the burden has been shifted to students to prove themselves because the adults in the room are still wringing their hands and trying to catch up, and having to do this as a BIPOC student is even worse. More than just 14% of college administrations should be issuing guidelines on AI and evaluating the systemic inequalities academic institutions perpetuate against BIPOC students. Still, we advocate a practical, proactive approach over the emotional trauma of defending yourself after the fact. Weigh what is at stake for a Black student, given the racial equity gaps in the value of a college education: the debt, the misconduct marks on your transcript, the disruption of your future career or graduate education plans. Against all that, the labor of planning your campus life to accommodate inaccurate AI software is far less. Signing up for an account with AI Detector Pro and creating a digital paper trail to protect yourself is easier, faster, and cheaper. Remember, it’s not just about running our software; it’s also about using the resources at your disposal, advocating for yourself and others, and gaining clarity. Figuring out in advance whether an inaccurate tool will mark your original work as AI-generated is far less emotional and financial labor than fending off an allegation of misconduct.

Opting for Practicality: 4 points to consider  

It is completely unacceptable to be targeted by a professor who tries to replicate student essays using an unproven and frankly ridiculous method of AI detection (his own work has likely been scraped into an LLM anyway). But these horror stories will keep happening for the next few years, and students will be the ones who pay. For that reason alone, we suggest you take the initiative in protecting yourselves.

Let’s conclude this article with our Final Four recommendations for Students:

  • First, follow the AI guidelines under the most conservative interpretation. Our difference is that we start from the assumption that you are following them, not that you aren’t.
  • Second, if no guidelines have been issued, force the point at your institution or with your educator.
  • Third, if you are a BIPOC student, make yourself part of the campus conversation on AI as a student ombudsman, or advocate through your campus club for diverse faculty on the AI committee.
  • Fourth, sign up for our student-focused detection software and run a report before submitting your work to your professor.
