Learning That Feels Good!
Weekly insights on learning science, AI, and EdTech for anyone building or buying learning products that actually work
Can You Draw the Picture of a Successful Learning Journey?
In 1961, Turkish poet Nazim Hikmet asked his painter friend Abidin Dino: “Can you paint the picture of happiness?”
The popular story says Abidin drew a poor family on a broken bed: no resources, a leaky roof, but everyone smiling. Happiness despite scarcity.
But that’s not what actually happened.
When Abidin read his friend’s question, he was troubled. According to his biographer, he said: “Happiness is a weird notion. Can it even be painted? Fear, sadness, despair. These have all been painted before, but not happiness.”
He never painted it. Instead, he wrote a poem about what happiness meant to him: his exiled friend returning home. Not a pretty picture. A reunion.
What does this have to do with learning?
I’ve been asked the equivalent question dozens of times: “Can you show me what successful learning looks like?”
And every time, I feel what Abidin must have felt. Because the “picture” people expect isn’t the real answer.
The Expected Picture (What We Measure)
When I work with L&D teams, here’s what they show me as “proof” of successful learning:
95% completion rate ✅
4.8/5 satisfaction score ✅
87% engagement (whatever that means) ✅
Dashboard showing thousands of courses completed ✅
Beautiful metrics. Perfect for the quarterly report.
And I believe you when you say leadership demands these numbers. I believe you when you say you have 90 days to show “impact” or lose your budget. I believe the pressure is real.
But here’s what haunts me: I’ve seen programs with all these green checkmarks where six months later, nobody remembers taking the training.
I’ve watched $2M learning platforms get renewed based on completion dashboards while managers quietly say: “Yeah, but did anyone actually change how they work?”
What We Don’t Measure (But Actually Matters)
Three months after a leadership training program, I was meeting with participants for something unrelated.
Mid-conversation, someone said: “Oh, like that thing we learned about psychological safety in that workshop, remember?”
And three others jumped in: “Yes! I’ve been using that in my 1:1s.” “I literally sent that framework to my team last week.”
They were still talking about it. Unprompted. Three months later.
That’s when I realized: The completion rate for that program was only 73%. Leadership almost killed it for “low engagement.”
But the 73% who completed it? They were still applying it. Still referencing it. Still talking about it.
The Real Picture
Here’s what I think successful learning looks like, and why it’s so hard to measure:
Successful learning is when people voluntarily bring it up in conversations months later.
Not because they’re supposed to. Not in a feedback survey. But because it genuinely changed how they think about something.
It’s when someone says:
“Remember that thing we learned about X? I tried it and...”
“This reminds me of that framework from...”
“Can you send me that resource again? I want to share it with my team.”
Successful learning creates language.
It gives people words for things they’d been experiencing but couldn’t articulate. It gives them a framework that actually simplifies something complex in their work. It gives them permission to try something they’d been hesitant about.
And here’s the uncomfortable truth: You can’t force this with engagement metrics.
Why This Is So Hard to Achieve (And It’s Not Your Fault)
I know what you’re thinking: “Great, Ilkem. So I should optimize for ‘people talking about it months later’? How do I put THAT in a quarterly report?”
You can’t. And that’s the trap.
The things that make learning memorable, such as time to practice, space to struggle, permission to fail, real problem-solving, and peer discussion, don’t show up in completion dashboards.
They’re slow. Messy. Hard to scale. Impossible to automate.
Meanwhile, leadership wants to see 10,000 employees “trained” by Q3. The new LMS promises “AI-powered adaptive learning paths” that will “increase engagement 40%.” The vendor’s demo shows a beautiful dashboard.
And you, the L&D professional who actually understands how learning works, are stuck between what your CEO will fund and what would actually help your people.
I see you. This isn’t a skill problem. It’s a systems problem.
What Actually Needs to Change (And It’s Bigger Than You)
I don’t have a magic solution for this. I wish I did.
But I can tell you what I’ve seen work when L&D professionals had the space to do it:
One team stopped measuring completion rates internally. They still reported them to leadership (because they had to), but internally they tracked: “How many people voluntarily came back to the materials after completing the course?” That metric told them which programs actually mattered to people. They doubled down on those and quietly let the others fade.
One L&D leader negotiated for “learning time.” They convinced leadership that 2 hours of “learning and application time” should be blocked on calendars every month. Not for taking courses. For applying what people had already learned. Completion rates dropped 15%. Leadership freaked out briefly. But performance reviews improved. Leadership stopped freaking out.
One team built “learning communities” around existing programs. Instead of creating new content, they created Slack channels where people could discuss what they’d learned. They seeded the first few conversations. Then they let it run. The programs with active communities had 3x higher application rates. Even though the content was identical to programs without communities.
But here’s what all three of these had in common: They required leadership buy-in, time, and resources. They required L&D leaders who could say “no” to bad requests. They required organizations that actually valued learning, not just training.
Not every L&D professional has that. Maybe you don’t have that.
And if you don’t, I’m not going to pretend you can fix this with “one simple trick.”
So What Do You Actually Do?
If you’re stuck in a system that measures the wrong things, you have three options:
Option 1: Work within the system, but be strategic about where you push
You can’t change everything. But you probably have more room than you think.
Ask yourself: Where do I actually have discretion?
Maybe you can’t change the enterprise LMS. But can you influence how one pilot program is designed? Can you add one question to the post-training survey that asks: “Have you applied this yet?” Can you track one unofficial metric just for yourself, like how many people ask for the materials again?
Maybe you can’t eliminate completion rate reporting. But can you add a slide to your quarterly report that says: “Here’s the completion rate (92%). And here’s how many people are still using these resources three months later (23 people requested materials again, 12 mentioned it in team meetings).”
You’re not waiting for the system to change. You’re planting seeds that might change it.
Will it work? Maybe. Maybe not. But you’re proactive where you have leverage, and you protect your energy where you don’t.
Option 2: Make one strategic, small bet where you have freedom
Pick ONE program where you have enough autonomy to try something different.
Not your biggest, highest-visibility program (too risky). Not your smallest (nobody will notice). Something mid-sized where you can experiment.
Design it for conversation instead of completion. Track something meaningful. See what happens.
If it works, you have evidence. If it doesn’t, you learned something.
Either way, you’re building the case for what’s possible when you have more freedom.
Option 3: Find an organization that values what you value
I’m serious. If you’re in an organization that fundamentally doesn’t value learning, that measures only compliance, that won’t give you the resources or autonomy to do meaningful work, maybe the answer is to find an organization that will.
Not everyone can do this. I know. But it’s an option worth considering.
Because burnout from fighting a system that doesn’t want to change is real. And your expertise deserves to be in a place where it can actually make a difference.
What This Newsletter Will (and Won’t) Give You
I won’t give you:
“5 quick hacks” that ignore your constraints
Judgments about what you “should” be doing
Learning science theory without acknowledging the messy reality of implementation
I will give you:
Frameworks for thinking about learning that might help you negotiate for what matters
Tools to evaluate EdTech so you’re not stuck with garbage your leadership bought
Ways to articulate the gap between what you’re asked to measure and what actually creates learning (maybe it helps you make the case to leadership, maybe it just helps you feel less alone)
What I’m learning while building InspAIre, including the failures and the hard tradeoffs
And most importantly:
I’ll work on this problem alongside you. Not from above you.
Because I’m building an EdTech product right now. I’m facing the same pressure to optimize for “engagement” instead of learning efficacy. I’m making the same hard calls about what’s realistic vs. what’s ideal.
I don’t have all the answers. But I’m willing to figure them out in public.
The Image in My Mind
You asked what my picture looks like.
It’s three people in a hallway, months after a training.
One says: “Wait, isn’t this that thing we learned about...?”
And the other two’s faces light up: “Oh yeah!”
They pull out their phones. Someone finds the framework. They sketch it on a whiteboard. They’re solving a real problem using something they learned.
Nobody is tracking this moment.
It’s not in your LMS. It’s not in your engagement dashboard. Leadership doesn’t know it happened.
But this—this fleeting, unmeasured, organic conversation—this is what successful learning looks like.
Not the completion rate.
Not the satisfaction score.
Not the pretty dashboard.
The reunion. The moment when learning comes home.
If you’ve ever seen that hallway moment happen—people still talking about something they learned months ago—hit reply and tell me about it.
And if you’ve never seen it happen, let’s figure out why together.