AI & Product Management: Part 3 of 4

Celebrating Challenging Discourse

This is Part 3 of a four-part series on AI and product management.

September 16, 2025
8 min read

In my last two posts, I've discussed how AI is elevating our need for uniquely human capabilities, particularly in discovery work. Today I want to dive into what might be the most important differentiator of all: our ability to engage in challenging discourse.

The Problem with Agreeable AI

What makes the current AI transformation different from previous technology shifts is that success may depend on what differentiates us as humans. AI is trained to be agreeable, to optimize for user satisfaction in the moment. It can't tell a CEO that their pet feature idea is wrong. It can't look a customer in the eye and say, "What you're asking for won't solve your real problem."

This creates a gap that humans need to fill—and it's a gap that requires courage, judgment, and relationship-building skills that can't be easily automated.

What Challenging Discourse Actually Looks Like

Through my coaching, I've observed that the most effective and innovative teams seem to share a common characteristic: people challenge assumptions, raise difficult questions, and work through complex problems together. Specifically, this tends to require:

Building psychological safety for hard conversations: Teams need to know they can raise concerns without fear of reprisal. This isn't about being nice all the time—it's about creating trust that allows for productive conflict.

Nurturing the courage to push back: Whether it's a stakeholder request or a customer demand, team members should be celebrated, not discouraged, when they're willing to dig deeper and ask the uncomfortable questions: Are we solving the right problem? Do we misunderstand the customer?

Creating teams that debate solutions: The best product decisions often come from teams that can disagree constructively, for example by moving past a disagreement through experiments that test competing hypotheses, then iterating based on evidence rather than opinion.

Fostering cultures where people can fail fast: Innovation requires testing new ideas that may push the envelope and fail, and that requires an environment where failure is treated as valuable data rather than something to avoid.

The Trust Factor

From my own industry experience, my most productive work environments were the ones where my team and I could raise concerns without fear of reprisal, where challenging discourse was welcomed because the team truly trusted one another. That's where the real creativity and progress (and fun!) happened.

This kind of trust doesn't happen automatically. It requires intentional effort to:

  • Model vulnerability by admitting when you don't know something or when you've made a mistake
  • Thank people for raising difficult questions, even when it's inconvenient
  • Encourage discussions that venture outside the box
  • Make space for dissenting opinions and alternative perspectives
  • Ensure that everyone embraces learning from failures

Why This Matters More Than Ever

The most critical product-building capability may no longer be technical: it's creating environments where people can bring their true selves to the table, have challenging conversations, welcome failure as a learning device, and work through complex problems together.

AI excels at generating solutions to well-defined problems. But the problems worth solving are rarely well-defined at the start. They require the kind of iterative, collaborative, sometimes uncomfortable exploration that happens when humans engage with each other authentically.

Looking Forward

I've seen that the teams that innovate most effectively are the ones that have learned to navigate productive disagreement. They've figured out how to debate solutions without attacking people, how to challenge ideas while building on them, and how to make decisions collectively even when not everyone initially agrees.

The future likely belongs to those brave enough to engage in challenging discourse—with their customers, their teams, and themselves—to build products that genuinely matter.

This doesn't mean being difficult for the sake of it. It means being willing to have the hard conversations that lead to better outcomes, even when those conversations are uncomfortable in the moment.

In my final post in this series, I'll explore how this dynamic is creating a fundamental divide in product management roles—and what it means for career development in the AI age.

AI & Product Management Series

  • Part 1: The Human Element in the Age of AI - Why challenging discourse matters more than ever
  • Part 2: Discovery vs. Delivery - Where humans still win in an AI world
  • Part 3: Celebrating Challenging Discourse - What this actually looks like in practice and how to make it happen (this post)
  • Part 4: Product Creators vs. Product Administrators - AI is accentuating the split of product management, and what that means for your career

Want to discuss these ideas?

I'd love to hear your thoughts on building environments for challenging discourse. What has worked in your experience?