Quick note: You’re receiving this via a new email platform as I’ve moved FutureSkills updates onto beehiiv. Nothing else has changed, and you’re in the right place. Thanks for being here.

Over the past few months, I’ve noticed something consistent in conversations about AI and education across schools, policy discussions, and product spaces.

Everyone agrees AI literacy matters, but very few people agree on what it actually means. That’s because we’ve been using the same phrase to describe very different things. Below is a distinction that has helped me make sense of the confusion.

Three types of AI literacy

Most conversations about AI literacy collapse these into one, even though they’re not the same. Treating them as interchangeable creates real gaps in learning.

1. Functional AI literacy (most visible)

  • Knowing what AI tools exist

  • Understanding basic concepts like data, models, and prompts

  • Being able to use systems effectively

This literacy answers the question: “How does this work?”

2. Critical AI literacy (less visible, most important)

  • Understanding how systems are trained

  • Recognizing bias, limitations, and tradeoffs

  • Questioning outputs rather than accepting them

This literacy answers the question: “What is this system actually doing?”

3. Human AI literacy (least discussed, hardest to teach)

  • Judgment about when to use AI and when not to

  • Awareness of how AI shapes thinking and behavior

  • A sense of agency rather than dependence

This literacy answers the question: “What role should this play in my thinking and decisions?”

Why this distinction matters

When schools focus only on functional literacy, students learn speed without judgment. Adding critical literacy teaches skepticism. But without human literacy, students never learn how to responsibly situate AI within their own thinking. AI literacy isn’t one skill but a progression, and confusing the categories makes it harder to design learning experiences that actually help students grow and prepare them for an AI-shaped future.

A question I’m sitting with

If students leave school knowing how to use AI, but not how to judge it, question it, or place boundaries around it, what have we actually prepared them for?

That question is shaping how I think about FutureSkills, and I’ll continue exploring it here.

If you found this distinction useful, I’d love to know where it resonated or where it raised questions. Those responses will help guide what I explore next.

Until next week,

Kingsley

Founder & CEO, FutureSkills

Empowering people to think, work, and lead with intention in an AI-shaped world
