Note from the Editor, Tricia Wang: The fourth contributor to the Co-designing with machines edition is Molly Templeton (@mollymeme), digital and social media expert, Director of Social Media at Everybody at Once, and one of the internet's first breakaway YouTube stars. Her piece urges brands to look beyond the numbers in their social media strategy when working in the digital entertainment and marketing industry. Molly gives specific examples where algorithms don't know how to parse tweets by humans that are coded with multiple layers of emotional and cultural meaning. She offers the industry a new way to balance the emotional labor in audience management with data analysis. Her article draws on her work at Everybody at Once, a consultancy that specializes in audience development and social strategy for media, entertainment, and sports.
@Tacobell spent an hour sending this same gif out to dozens of people. The account is probably run by humans (most social media presences today are). And they were following best practice by "replicating community behavior," that is, talking the way normal people talk to each other (a human Taco Bell fan would definitely send a gif). But when @tacobell sends the same gif out over and over again, it's uncanny. It's pulling the right answers from the playbook, but at the wrong frequency.
Why do brands lose their chill?
I think that brands lose their chill when they don't let their social media managers exercise empathy. The best brands on social media balance the benefit of interaction with the risk of human error – managers are constantly worried about pissing off the organization or the audience, and ultimately trying to please both sets of real people. Hitting campaign goals and maximizing efficiency are important, but social media managers need to bring humanity to their work. They have to understand the audience's moods and where they're coming from, and they have to exercise empathy at every level: customer service, information and content sharing, community management, calls to action, participation campaigns, crisis and abuse management. That is a lot of emotional labor.
With the recent chatter about chat bots on Facebook's Messenger platform, a lot of people are thinking about how bots can take over communications roles from humans. I've been thinking a lot about the opposite: how can machines help people manage the emotional labor of working with audiences? Can bots ever help with the difficult, and very human, task of managing with empathy?
Social media is a business of empathy
Emotional connections drive social media. When people gather around the things they feel passionate about, they create energy. It’s because of limbic resonance — the deep, neurological response humans have to other people’s emotions. As my colleague Kenyatta Cheese says, it’s that energy that makes participating as a fan on social media feel as electric as it does when you’re part of a physical crowd.
It's incredibly powerful when the person at the center of the attention (the television show, the celebrity on Instagram, or even Taco Bell) interacts with the audience, especially if they turn the attention back on the fans to celebrate and recognize their love. That spotlight builds more energy, and the crowd's energy grows. It's magic!
But to do that well, the person at the center of the attention has to have empathy for the people in that crowd. It’s not just about understanding what people are feeling, but who they are, where they’re coming from, and what they need to feed their energy.
I think that sharing empathy through social media is important because it allows us to recognize when our values align with another person's (or #brand's). It's about us and our peers being understood, and seeing our humanity reflected back at us. When social interactions become formulaic, they can quickly lead to the complete opposite reaction – feeling devalued, misunderstood, or neglected – and these feelings create distance, not love.
Take the common example below of someone complaining on the Internet. A mother complains about frozen pasta on the Birds Eye Vegetables Facebook page, and the brand responds by saying "My kids love this product, so I do understand your disappointment," then directs her to phone their business-hours customer service hotline. (Moving a customer exchange to a private message on the same platform where you received the complaint is one thing – but it is an exceptionally large ask to invite a customer to CALL you.)
It's not actually enough for a brand to just say, We Can Relate, We Have Kids Too. If that were all there was to it, an algorithm could do the job: collect data sets, understand the correlations, and produce an appropriate, efficient, on-brand response. But how can a brand understand how a person feels? (How can a brand have kids? That's just weird.)
If I put myself in the shoes of that customer, I'm on Facebook and I kinda just wanted to complain about my kids. What humans bring to the equation is an understanding of not just the frustration, but the context. More than an apology, people want to be recognized. Instead of a transactional customer relationship, this could become an understanding – "When you're busy, important family time is scarce and getting kids to eat healthy things is tough – we're glad you shared Birds Eye at your table. Thanks for being a great parent."
While we're talking about empathy, I really empathize with the people behind the @. Even those who have the freedom to express themselves honestly, outside of brand guidelines, won't feel inspired to do it for every single person they interact with. I think it's important to check ourselves, and to monitor how we prioritize our empathic energy – it's emotionally laborious even to stay *self-aware* of your constant potential to over-exert or over-invest. How much can you empathize if you spend every day at work fielding complaints about the quantity of stems versus whole vegetables in frozen pasta?
Fluently speaking in gifs
Another reason we can’t turn social media management over to the bots is that language is constantly changing. I work with entertainment brands, developing audiences and community around shows like Orphan Black and Doctor Who. Understanding what people want when they’re tweeting about a film or television show is often complicated and intense. In my experience as a manager and as a fan in entertainment communities, I’ve learned that people don’t just relate to their favorite characters and storylines – they seriously identify with them and develop an understanding of the world through this lens. They have conversations about their own morality. They talk about gender politics, sexuality, spirituality, and self-identity, all through chatting online with other people. Their reactions are often impossible for algorithms to pick up:
This is a typical fan reaction to an intense dramatic moment: "I'm crying #lrt I can't believe I used to be spn trash but this is good." But the analytics tools I've used can mangle tweets like this, classifying them as "very negative" when I read them as a declaration of ultimate love.
Fans often speak in code. "OMG I HATE YOU I HATE YOU I HATE YOU I HATE YOU #OUAT." It's hard for algorithms to even count their conversations, because their language and communication style seamlessly shifts from @-replying the official show handle, to using the official hashtag, to posting only a gif or image, to subtweeting and vaguebooking, to self-destructing posts on Snapchat.
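To make the failure concrete, here is a minimal sketch in Python. The open-source VADER sentiment lexicon stands in for whichever analytics tool a team happens to use; the exact scores will vary by tool, but the failure mode is the same.

```python
# pip install vaderSentiment
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# Tweets a fluent human reads as declarations of ultimate love
fan_tweets = [
    "I'm crying #lrt I can't believe I used to be spn trash but this is good",
    "OMG I HATE YOU I HATE YOU I HATE YOU I HATE YOU #OUAT",
]

for tweet in fan_tweets:
    # 'compound' ranges from -1 (most negative) to +1 (most positive)
    score = analyzer.polarity_scores(tweet)["compound"]
    print(f"{score:+.2f}  {tweet}")
```

A lexicon takes "crying," "trash," and "HATE" at face value, so both tweets come out strongly negative: the exact opposite of what a fluent fan reader sees.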
These are not messages that we can quantify. But we can empathize with them. And this is where the work of humans and data and algorithms can all come together.
Algorithms that monitor, interpret and contextualize audience emotion do exist, but they require human supervision to improve and develop intelligence. Instead of training machines to mimic empathy, I think we should be using machines to help us be more empathetic.
Bots can lighten our emotional load.
They can capture the broad statements, highlight a few, and build more space for us to use our empathy. Social media is a stream of emotions: listening, absorbing, contextualizing, understanding, sorting, interrogating, and reporting on it can be exhausting. Algorithms can help. They can reduce an onslaught of comments into numeric volume, giving us a comparative, data-driven view of a conversation. Understanding the true volume of sentiment helps managers prioritize conversations and then layer emotional understanding on top. The machines show us the forest, but we, the humans, recognize the trees.
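As a rough illustration of that kind of summarization, here is a small sketch. The data shape is hypothetical (a real listening tool would export something richer), but the idea is simply counting mentions per hour and per sentiment bucket so a human knows where to look first.

```python
from collections import Counter
from datetime import datetime

# Hypothetical export from a listening tool: (timestamp, sentiment label)
mentions = [
    ("2016-07-14T20:05:00", "negative"),
    ("2016-07-14T20:07:12", "positive"),
    ("2016-07-14T21:01:45", "negative"),
    ("2016-07-14T21:02:30", "negative"),
]

# Collapse individual posts into per-hour, per-sentiment counts
hourly_volume = Counter()
for timestamp, label in mentions:
    hour = datetime.fromisoformat(timestamp).strftime("%Y-%m-%d %H:00")
    hourly_volume[(hour, label)] += 1

for (hour, label), count in sorted(hourly_volume.items()):
    print(hour, label, count)
```

The counts point a manager at the hour when negativity spiked; reading and answering the individual posts is still human work.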
And what if we had an algorithm that would point out anomalies for us, or users with a history of complaining on social media? Or one that points us to community members who are linked to broad networks? Or the opposite: participants who so *rarely* get rewarded for their tweets, or so rarely get replies, that an empathetic response might have more impact than usual? And how do we do these things in a way that respects the privacy people expect from the brands and media properties they trust?
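None of these algorithms needs to mimic empathy; they only need to triage. A toy heuristic might look like the sketch below, where every field name and threshold is a hypothetical stand-in rather than any real dashboard's schema.

```python
def outreach_priority(user: dict) -> float:
    """Toy triage score: surface the people for whom a human,
    empathetic reply is likely to matter most.
    Every field is a hypothetical stand-in, not a real tool's schema."""
    score = 0.0
    if user["prior_complaints"] >= 3:
        score += 2.0   # repeat complainers need a human, fast
    if user["network_reach"] > 10_000:
        score += 1.5   # well-connected members carry energy outward
    if user["replies_received"] == 0:
        score += 2.5   # fans who never get answered feel it most
    return score

community = [
    {"handle": "@fan_a", "prior_complaints": 0, "network_reach": 50_000, "replies_received": 4},
    {"handle": "@fan_b", "prior_complaints": 4, "network_reach": 200, "replies_received": 0},
]

# The bot sorts; the human decides what an empathetic reply looks like.
for user in sorted(community, key=outreach_priority, reverse=True):
    print(user["handle"], outreach_priority(user))
```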
I'm looking forward to bots helping me at work. Sometimes knowing that we're chatting with a machine can even inspire more empathy, making us "more willing to forgive" than if we were talking with a person. And the more we can find ways for bots to support us in our human skills, and to keep us from getting burned out on our incredible new capacity for connection, the more we'll grow to love them.
This article is part of the Co-designing with Machines Edition. Read other articles in this edition.
Like what you're reading? Ethnography Matters is a volunteer-run site with no advertising. We're happy to keep it that way, but we need your help. We don't need your donations; we just want you to spread the word. Our reward for running this site is watching people share our articles. Join us on Twitter, tweet about articles you like, share them with your colleagues, or become a contributor. Also join our Slack to have deeper discussions about the readings and/or to connect with others who use applied ethnography; you don't need any training in ethnography to join, and we welcome all human-centric people, from designers to engineers.