Imagine discovering your doctor used AI to get through med school—would you still trust them?

Would You Trust a Doctor Who Let AI Do Their Homework?

College Students Are Receiving Degrees Without Doing the Work, Thanks to AI

Artificial intelligence (AI) has transformed many aspects of daily life, and education is one area where its impact is especially pronounced. AI tools like ChatGPT have made it easier for students to complete assignments and pass exams without fully engaging in the learning process. But is this trend hurting education? Here’s what we know.

AI and Academic Dishonesty

Since AI tools became widely available, students have been using them to get around doing the hard work themselves. A survey by BestColleges found that 56% of college students admit to using AI to complete assignments (Social Science Space). With AI’s ability to generate essays, solve complex math problems, and even conduct research, it’s become easier for students to rely on these tools instead of developing their own understanding of the material.

While some schools have policies in place to prevent AI-enabled academic dishonesty, the rules vary widely. Some professors allow limited AI use, while others ban it completely (Inside Higher Ed). The lack of consistent policies means many students can get away with using AI to earn their degrees without mastering the required skills.

Med Students and AI: A Growing Concern?

One of the most alarming implications of AI in education is its potential use by medical students. With the ability to generate essays, complete exams, and even solve complex problems, AI may offer a shortcut for med students who should instead be developing the critical skills their future careers demand. If a medical student relies heavily on AI to pass their courses, it raises serious concerns about their competence when it comes to treating real patients.

While AI can be a valuable tool in medical research and diagnostics, over-reliance on it during training could leave future doctors underprepared. The healthcare industry demands hands-on experience, critical thinking, and problem-solving abilities, none of which AI can replace. For patients, the idea of being treated by a doctor who used AI to complete their studies could undermine trust in their physician's abilities. As a result, ensuring that medical students engage fully and appropriately with their learning is essential for maintaining the integrity of the medical profession.

This further emphasizes the need for frameworks like AI TRiSM, not just in general education but in professional training as well. Establishing clear guidelines and requiring transparent use of AI in academic settings will help ensure that students, especially those in high-stakes fields like medicine, graduate with the skills they need to succeed and to keep the public's trust.

Why Do Students Turn to AI?

There are several reasons why students may feel compelled to rely on AI. For some, it’s a matter of convenience: AI tools can quickly and efficiently complete tasks that would otherwise take hours. Others may feel pressure to keep up with peers who are also using AI, making it seem like a necessary part of their academic toolkit (Inside Higher Ed).

In many cases, students may not fully understand when it’s appropriate to use AI. A survey found that 40% of students at two-year colleges didn’t know the proper way to use AI for their coursework (Inside Higher Ed). This uncertainty, combined with the rapid development of AI tools, has created a confusing landscape where students may unintentionally cross the line into academic dishonesty.

The Consequences

If students rely too heavily on AI to complete their work, they risk graduating without the skills they need to succeed in the workforce. Employers are increasingly looking for workers with critical thinking and problem-solving abilities, and those are difficult to develop if AI does the work for them. In fact, research has shown that 92.7% of executives see data issues, such as poor quality or over-reliance on AI tools, as major barriers to successful implementation in real-world business environments (Social Science Space, Inside Higher Ed).

AI’s role in education has also raised concerns about fairness. Students who use AI may have an unfair advantage over those who don’t, leading to discrepancies in grading and academic achievement. This could widen gaps in academic performance, especially among disadvantaged students who may not have access to the same AI resources.

AI TRiSM: A Potential Solution

AI Trust, Risk, and Security Management (AI TRiSM) could be a solution to the challenges AI presents in education. By establishing clear guidelines for how AI should be used, educators can ensure that students are still developing the skills they need while leveraging the benefits AI offers. AI TRiSM emphasizes trust and accountability, meaning that both students and teachers can rely on AI without sacrificing the quality of education.

With proper management and understanding, AI can be an incredible tool for learning. However, if left unchecked, it could lead to a generation of students who hold degrees but lack the skills to thrive in the real world.

Conclusion

The growing use of AI in education has made it easier for students to earn degrees without doing the work, but it comes with significant risks. Schools must establish clear policies and ensure that students understand how to use AI appropriately. With proper oversight through frameworks like AI TRiSM, we can harness AI's potential while maintaining the integrity of education.

If you'd like to explore more about AI’s impact on education, take a look at reports and discussions from higher education platforms like Inside Higher Ed and Social Science Space.