03/01/2026
“A Warning Sign We Should Not Ignore About Today’s Students: ‘AI Keeps My Secrets’”
During my recent engagement with university students in Livingstone, Zambia, a new and unexpected concern emerged, one that deserves serious attention.
Several students admitted that they now turn to generative AI for emotional advice.
When asked why, their response was strikingly honest:
“AI doesn’t judge us.”
“AI keeps our secrets.”
“We don’t trust people to keep things confidential.”
This is not a technology story. It is a trust story.
I find this deeply concerning, not because AI is involved, but because human support systems are being bypassed.
Let us be clear:
🔹 Generative AI has no empathy, only pattern recognition
🔹 It does not understand cultural context, trauma, or consequence
🔹 It cannot carry ethical responsibility
🔹 It may offer comforting words, but not care
When students rely on AI for emotional guidance, several risks emerge:
• Delayed access to professional or peer support
• Reinforcement of isolation rather than healing
• False reassurance or inaccurate guidance
• Emotional dependency on systems never designed for care
The real issue is not that students trust AI.
The real issue is that they no longer trust us: the institutions, counselors, and peers who should listen safely, confidentially, and without judgment.
This should force us to reflect.
👉 Are our campuses psychologically safe?
👉 Do students feel heard without fear of exposure?
👉 Have we invested enough in human-centered counseling and mentorship?
AI can assist learning.
AI can enhance productivity.
But AI must never replace human connection, especially in moments of vulnerability.
If students are seeking emotional refuge in machines, it is a signal not of progress but of a relational gap we must urgently close.
My message to institutions and educators is simple:
Build trust faster than technology advances.
And to students:
Your struggles deserve human understanding, not just algorithmic responses.
The future of education is not only digital; it must also remain deeply human.