DEAR EDITOR:
AI is showing up in more and more classrooms — including ours. At Los Medanos College, students are using tools like ChatGPT, Copilot, and Grammarly to help with everything from brainstorming to grammar checks. For some, it’s a way to stay on top of assignments when life gets hectic. For others, it’s part of how they learn.
A recent BestColleges survey found that 56% of college students have used AI for schoolwork, and most said it helped.
AI can explain confusing concepts, summarize long readings, and help organize ideas when your brain’s running on fumes.
It’s not hard to see why it’s become part of the routine.
But there’s another side to it. When AI starts doing the work for us — writing full essays or solving problems we don’t understand — it raises questions.
Not just about cheating, but about what learning really means. If we’re not engaging with the material ourselves, are we missing the point?
Faculty responses at LMC vary. Some professors have banned AI tools altogether. Others allow them with clear guidelines. What seems to matter most is transparency: students knowing what’s okay, and instructors explaining why it matters. That kind of clarity can make a big difference.
It also opens up a bigger conversation. If AI can handle traditional assignments, maybe it’s time to rethink those assignments.
In-class writing, presentations, and creative projects might offer a better way to show what students actually understand. And when AI is used, reflection could be part of the process — what did the student learn, and how did the tool help?
This isn’t just about school policies. It’s about preparing for a future where AI is everywhere. The World Economic Forum says AI-related skills are among the most in-demand in today’s job market. Learning how to use these tools responsibly might be part of what modern education looks like.
At LMC, students and faculty are still figuring it out. The conversation is ongoing — and it’s one worth having.