Episode 58: Jason Stephens

  • drbertramgallant
  • 12 hours ago
  • 2 min read

"Our tendency to deceive and our tendency to make moral judgments — both of these things are bred in the bone. These things live in tension inside of us."


"I'm working on what I call a wise model of use — where I want to help students balance outsourcing stuff that's appropriate to outsource versus offloading the stuff I really should be engaging in because learning is difficult."


In this 58th episode of The Opposite of Cheating Podcast, David reconnects with longtime friend and collaborator Dr. Jason Stephens, a professor of psychological studies at the University of Auckland, for a deep dive into the moral psychology of cheating.


Jason explains why, if he had to pick one variable to understand academic dishonesty, it would be moral disengagement — the rationalizations we use to protect our self-image after doing something we know is wrong. Drawing on Bandura, Freud, and evolutionary psychology, he traces the tension between two tendencies bred into humans: the impulse to deceive (older than morality itself, visible across nearly every species) and the social need to make moral judgments and appear trustworthy.


Jason outlines his three-cluster model of why students cheat — under pressure, under interested, and unable — and describes how context and culture determine which tendency wins out. The conversation turns to AI, where Jason shares his "wise model of use," helping students distinguish between outsourcing extraneous cognitive load and offloading the hard thinking that constitutes real learning, using tools like Cogniti and Cadmus to scaffold that process. Both scholars agree that the deeper threat of AI may not be academic integrity at all, but the erosion of human connection as people increasingly turn to machines for social and emotional needs.


(Disclaimer: episode quotes and summary were created using YouTube's transcript and Anthropic's Claude, but edited by a human. Any errors are the responsibility of the human.)
