I hate security questions. I think I’d rather tell you how much I weighed on the morning after the Super Bowl than say the maiden name of my first pet or whatever it’s supposed to be.
I know two-factor authentication matters for company security. It matters for your security. But it’s easy to miss what these requirements say about work today:
None of us is trustworthy until undeniable evidence arrives.
As one professional report notes, our digitalized workplaces have replaced “cultures of trust” with “cultures of proof.”
Look, I hope your supervisor thinks you’re amazing. I hope your coworkers rely on you implicitly. I hope your job doesn’t make you tap-dance to demonstrate your worth. But even if your organizational culture aspires to good relationships, the digital platforming of work increasingly makes those relationships difficult.
Sometimes an evil intelligence seems to conspire against working relationships. Maybe that’s just me, after watching the latest Mission Impossible flick, which I confess I saw this past Sunday. I know, I know, I should have been watching Tay Tay chugging and the Chiefs winning. Instead, I was making homemade pizza and watching Tom Cruise stay ahead of an AI called The Entity, “a self-aware, self-learning, truth-eating digital parasite infesting all of cyberspace.”
Truth-eating digital parasite sounds a lot like Workday to me—or any of the other digital tools seemingly designed to be easy for administrators and hard for employees.
So here’s my three-step authenticity plan for this week’s newsletter.
(1) Let me tell you a story about automation in the 90s. (2) Then, let’s talk about algorithms today. (3) Finally, let’s figure out how to build trust amongst the protocols.
What the Devil Was Going on with Automation in the 1990s?
I was a junior in college, and my new manager was taking me around the radio station, giving notes on broadcast procedures. Then, he suddenly stopped, gave me a sideways smile, and told me a story from his early career. (I know. Nested storytelling. Whoa. Very Inception-y.)
One night, working third shift, he was loading programs, degaussing cassettes, setting up music for the next day—basically, spreading out the work, keeping track of the on-air programming with an eighth of his attention.
But then, the current show leapt into the front of his awareness. He stood for a moment, pressing his clipboard to his chest, staring at the overhead speaker.
The late-night show was, for some reason, talking about exorcism.
He sighed, “Just great,” and walked around the studio, trying not to think about being alone—or maybe not alone?—in the building after midnight.
And then, for no explicable reason, the radio station went silent. He dashed into the automation room, his flesh crawling, and stared at the tape deck. The red audio levels were gone, the airwaves dead. He reached up and slowly, deliberately tapped the Play button. The tape started rolling again, the voices started up, the audio waves started flowing.
But then, a minute later, the signal went dead. He hit Play again. Then, before the tape could stop a third time, he ran out of the room and started turning on every light in every room he could find.
My manager made a scary tale sound funny. But the ghost in the machine in your workplace’s processes is probably not funny to you. And sometimes it’s scary.
What Algorithms Do in Today’s Workplace
You do know, don’t you, that algorithms haunt every part of your job?
You see their ghostly outlines in the hiring process. I mean, you can’t watch their icy fingers sorting through eligible candidates—but you know they’re doing it.
You feel their presence in HR processes, giving zombified answers to FAQs.
Even your manager summons the bots, like some second-rate sorcerer, to schedule your shifts and monitor your mousepad movements.
And the blood-sucking performance review—even that’s increasingly algorithmic.
But how, I want to know, can we circumvent the distrust that our digital systems perpetuate in working community? How can we sidestep the imperative that everybody must verify their identity—an imperative that quickly becomes the mandate that everybody must verify their worth?
The workplace bias deeper than politics is the bias that we prove our very selves.
How to Build Trust in the Algorithmic Workplace
I think building faith in the workplace—and this is going to sound silly—starts with being polite to robots.
Why? Well, it’s bad for your own mental health to rage, rage against the algorithm. But worse, it’s bad for people nearby, too. Anger has a way of uniting people in detrimental ways. (Tech journalist Emily Dreyfuss tells a story about yelling at Alexa, only to hear her toddler say, “Gobamm it Alessa.”) Anger builds trust, in a way. It builds the trust of tribalism. But anger can’t build the kind of trust we need, the kind of reliance at the basis of good working community.
But the most important reason to be kind to (or at least neutral toward) bots is that they don’t, and can’t, care about you as the person you actually are. They can’t see your interior life, your spiritual habits, your thoughtful manner, your goofy laugh. All they can see is your online self, your shadowy and ghostly replicant, your digital doppelganger. All they see is the “you” made visible in a ghostly slime trail across all your socials as you hover and click and share and like.
So, don’t give the bird to the bot. Instead, love your algorithms as you love yourself.
Think for a second about the hiring process. Let’s say you post your resume on LinkedIn. You’re one among three thousand applicants. But if you interact with the company’s socials first and then post your application on LinkedIn, the algorithm is likely to notice you and the company to trust you. (Want to read more on this? I learned this insight and more from this piece out of NYU.)
Or think about how your boss wields algorithms like a weapon, monitoring your remote work, driving you to distraction and mad mistrust. (Remember how, for a hot minute in 2023, everybody was asking if that dude should have been fired for his mouse jiggler?)
Here, I think, there’s wisdom in simply slowing down. You can’t hiss out that your surveillant boss is an anxious slit-eyed prig. (Well, I mean, you can. But the Mode/Switch 100% does not recommend.) What you can do is practice the simple, hard virtues of patience and gentleness and watchfulness and hope.
The biggest surprise for me when I changed jobs two years back was how much I had been relying on social capital in my previous job. Only in my new workplace did I learn how lucky I’d been to be so trusted in my old one. Reversing an understandable skepticism among new coworkers—I gradually learned—required me to slow down everything. How I talked. How I reacted. Even how I walked. I needed to move at the speed of watchfulness, if I hoped to persuade anyone I was trustworthy. (Still working on all of that, by the way.)
At times, slowing down will feel like you’re tucking trust ideas deep in your coworkers’ consciousness, inception-style.
The Algorithms Aren’t Going Away
It’s constantly tempting to say that, geez, if only we could get rid of digital processes and, ya know, just have a real, face-to-face conversation, everybody could trust everybody again. But the old-fashioned sit-down will still confront you with distrust. That’s why one HBR headline puts it bluntly: “Reverting to Human Judgment Doesn’t Solve the Problem.” It’s an uncomfortable truth that algorithmic code is easier to rewrite than cognitive prejudices.
I stand by my hatred of security questions. But those protocols are a mere sliver of the real problem in the digitalized workplace. The larger crisis is that our workplaces—increasingly managed by algorithms—demand constant, quantifiable evidence, while our work itself requires personal authenticity and reliability.
The good news is that, if algorithms are resilient, so are possibilities for human trust.
And now, if you wouldn’t mind selecting all the squares with stoplights in them, I can bring this newsletter to a close.
We are so pleased to have new subscribers joining every week! Look for this newsletter every Tuesday, followed by a podcast convo on Fridays. And please share this newsletter with a rising professional in your life.
What I’m Up to These Days…
Reading: Meghan O’Gieblyn’s God, Human, Animal, Machine. Just finished this book for the second time. O’Gieblyn is amazing in her capacity to show how our ways of talking about tech rely on dead metaphors. She’s pretty wild when it comes to discussing the possibility we’re all living in a simulation, too.
Watching: HBO Max’s Succession. I’m watching with my Mode/Switch lens on. The precarious rising professionals in this show are navigating the weirdness of corporate America in hilarious and frightening ways. (My wife wants to know: what makes Brian Cox so repulsive and so likable at the same time?)
Planning: Getting ready for a leadership workshop in a nearby correctional facility this spring. It’s strange and wonderful that some of my freest experiences of learning and conversation happen in communities of incarceration.
Waiting: My book manuscript Digital Overwhelm’s coming back to me with final edits any day now. The book comes out later this spring with a cover like this:
As usual, the Mode/Switch’s comms coordinator, Hannah, knows you need to Listen While You Work. Here’s this week’s playlist voicing challenges for staying human in de-humanizing digital spaces.