Learning comes from evidence. We learn when we trust the evidence — what we see, sense, experience, and believe using whatever data we have on hand. We don’t learn from things we don’t trust. That’s what’s remarkable about the human nervous system: we can be deceived into trusting things that aren’t true, but we can’t learn from something we don’t trust.
In my work, I create a space for trust between me and my clients or partners, them and the data we generate or gather, and a process that connects all of this together into a strategy. Trust is at the heart of what gets us from one place to the next.
I’m asking a lot of questions about trust these days. Much of it is tied to Artificial Intelligence (AI) and the algorithms that feed so much of what we see. I’ve written about this elsewhere. Mark Leung, my friend and colleague, calls what we’re seeing an AI-driven Arms Race — with algorithms (and their human inventors) looking for more ways to mine data to manipulate our actions.
A Matter of Trust
Billy Joel was right: it’s all just a matter of trust.
It was refreshing and validating to read about Elise Loehnen and her journey porting a thriving Instagram account to Substack. The reasons she cites are many, but one of them is trust. As she puts it:
I feel much more ownership on Substack, where I have immediate access to a list of people who would like to hear from me. I know that I’m landing in their inboxes. While I’m grateful to have 80K or so Instagram followers, I have no clue how many people are actually seeing my content because of the algorithm. It’s also frustratingly deeply, deeply unpaid. Not only is it unpaid for creators—unless you do a lot of partnership work, which hasn’t been my speed—but you are essentially underwriting the content creation of Instagram’s own monetization efforts. So it feels really, really bad. A lot of free labor to enrich Meta. Not great, but there haven’t been alternatives. My hope and dream is to build my Substack enough that I can get off Instagram or dramatically limit the amount of content I make for that platform. (I make a lot!)
Substack provides a far more focused environment with greater controls for people looking to engage. It’s offering what we once had on Facebook, X, and other platforms but now have lost.
Another way to build trust is to get your hands dirty — literally. My great friend Matt Keene knows this firsthand. He’s worked with AI, learning, and education and uses the garden as both a metaphor and a literal platform for learning. In his chronicles about homeschooling, exploring AI and its relationship with nature (also on Substack), and how we can evaluate our efforts with both, he documents how the lived, unfiltered experience of complexity in the garden fosters learning. Each day, he and his two daughters (and often other guests) confront their garden's unfiltered changes and evolution. Asking questions about this is how they learn.
They engage in what Matt calls an evaluative evolution.
Matt’s not a techno-Luddite. He knows what AI can do, and he recognizes where it serves a purpose and when it actually helps learning. For him, trust and the values around our relationship with nature, AI, and our experience of it all create learning.
Creating Trust Conversations
Social platforms are where many of us have conversations. As those platforms change and the trust in what we see and what others see degrades, the lessons from people like Elise Loehnen and Matt Keene become more important. Both are creating relationships through their content and with their audiences. They are building communities of trust.
This happens by asking, answering, and offering space to talk, engage with the content, and pause to learn, unfiltered by algorithms. Substack is an opt-in, direct-to-subscriber system that offers ways to engage through Notes and chat. Its multi-purpose means of creating conversations lets people opt in to content or ignore it without asking where it came from. The connection between evidence and effect is tight.
If we want to learn, we’ll need to couple our evidence — the things we see, organize, and experience through data — with trust. Can AI help? Surely, yes. But only if we design what’s behind it for trust, not for clicks.
So rich, @Cameron Norman. Design for trust. Design for evidence we want and that we can trust. You are provoking serious inquiry. I wonder about the differences between blanket, wholesale, blind trust and the trust associated with a few particular values that are prioritized in a given situation. How to choose the values. How to verify whether the evidence we generate engenders trust, how much, how long it lasts, how long it takes to emerge, and what effort is required or worth investing to create it. And I wonder about the wisdom of our society, our culture, our children… I wonder if we have empowered them, and ourselves, with the evaluative thinking capabilities required to navigate the world you are showing us. A world where we, as individuals and groups, are the targets of surveillance capitalism, where our behavioral surplus is the food fed to the AIs whose values are chosen by others and may be directly in conflict with our own. These perspectives, these concerns, I quietly take with us into the garden as we wonder, inquire, puzzle, learn, and train to navigate the world we have built so that we might create a future we want. Thank you, Cam, for pushing me toward these thoughts and helping me zoom out before zooming back in as we walk back out into the garden to say goodnight to the peas. Peace.