What’s It Like Being Accountable for 250K Communities? A Chat with Brandon Rhea at Fandom



Fandom is a platform celebrating the world of pop culture, composed of over 250,000 wiki communities and a global audience of over 300 million monthly unique viewers. Imagine the challenge of creating and enforcing policies that help administrators foster a safe and engaging environment to keep users coming back again and again.

In the latest episode of Brand Safety Exchange, Brandon Rhea, VP of Growth at Fandom, joins Oasis Consortium GM and Host Tiffany Xingyu Wang to chat about the brand safety accountability and community policy challenges he and his teams face. Learn how he is infusing trust and safety by design into his evolving role at Fandom, how to build policies that accommodate many communities, why better self-regulation can help online platforms avoid government interference, and much more.

Listen and subscribe to the Brand Safety Exchange podcast via your favorite podcast service, or read the interview transcript below.

Listen on Apple Podcasts
Listen on Google Podcasts
Listen on Spotify

The following interview transcript has been edited to make it more concise.

Tiffany Xingyu Wang  0:00  

Hi everyone, I'm Tiffany Xingyu Wang, the GM of Oasis Consortium, a nonprofit setting global standards for brand and user safety. Welcome to Brand Safety Exchange, a podcast featuring interviews with veterans in the space about their solutions and their thoughts on this topic. And today, I'm very, very honored to have Brandon Rhea, who is the VP of Growth at Fandom. Welcome, Brandon.

Brandon Rhea  0:28

Hey, thanks for having me, Tiffany.

Tiffany  0:30  

Tell us a little bit about what's going on in your world and Fandom, and for your team.

Brandon  0:36  

Sure. So I have been the VP of Growth at Fandom, but I'm finally changing that. And the reason why I say finally is that growth has been sort of a misnomer. I lead our community teams. What we recently did was restructure how we approach community at Fandom. We used to come at things from the point of view of having a gaming community team and an anime community team. And I felt that wasn't the most consistent strategy in the world, so we now have a few teams. One is called Community Experience, which directly supports our creator community. We have a Community Development team, which focuses on the optimization of content and making sure that wikis are being built the best way they can be. There's a Customer Support team for some of the long-tail customers; Community Activations, which brings the development work to the next level through sold deals; and Community Safety, which to me is an incredibly important team. Trust and safety - sometimes you find it in marketing, you find it in revenue, you can even find it in product. I decided to put it in community - our first time having a trust and safety focused team. Because in my view, everything else that I just mentioned - that dynamic between the creator and the consumer, and the content that's being created - doesn't work if users don't feel safe using the sites and consumers don't trust the content. So that's something that I'm really looking forward to building over the next year.

Tiffany  2:03 

I love it. You see the big bucket of traditional routes to drive growth - using community for engagement and retention - but now we realize that if you don't fix the trust and safety issues, then all your investments in growth will be wiped out. So I'm really excited to hear that you've infused this trust and safety by design concept into the accountability part of your world. For folks who are not familiar with Fandom, it is a community of many, many communities. Do you want to tell us a little bit about Fandom as well?

Brandon  2:41 

Sure, I probably should have started there. So Fandom is a number of things. We have a number of creator and community platforms, and what I focus on specifically is our wiki platform. We have about 250,000 wiki communities covering every topic you could possibly think of - most of our bread and butter is in pop culture: entertainment, anime, gaming. We focus on cultivating those and providing a space where people can really celebrate their passion, however they want, primarily through the creation of encyclopedic content, not unlike Wikipedia. The idea when we were founded was that if Wikipedia was the encyclopedia, Fandom would be the rest of the library - Fandom is the site that goes much deeper.

Tiffany  3:28  

You said there are 250,000 communities inside your community - it's remarkable when you think about how you manage trust and safety and community policy across such a high diversity of communities. So bravo for taking on such a big, big job.

Brandon  3:45  

That to me is the most important reason to have a trust and safety team at something like Fandom - something at that scale, where 10,000 communities probably drive 90% of our traffic. I'm sure a lot of platforms see things like that. And so we don't have as much insight as I would want into the long tail of that traffic, and that to me is where trust and safety is most important: making sure we have some sort of eyeballs in there to keep people safe there as well.

Tiffany  4:11  

That's awesome. So as a tradition we do a segment called Oasis Rapid Refresher. So we go through the same three questions with all the interviewees and the guests on Brand Safety Exchange. So are you ready for this round?

Brandon  4:28  

Let's do it.

Tiffany  4:30  

So the first question is why do you think brand and user safety is becoming so important these days, especially taking from the vantage point of Fandom and with what's going on in the world right now?

Brandon  4:42  

I think what we've seen increasing over the years is how the internet - I don't think it has created any new problems. There are a lot of historical forces at work when we look at something like the storming of the US Capitol; the issues that went into that are not necessarily new issues. But the internet has become a new weapon, a new tool to generate extremism, promote misinformation or disinformation, and a way for people to exploit others. And I don't think we as an industry have done as good a job as we could have at keeping people safe and really thinking long-term, like, what are the implications of what we're doing? And I think now there's more and more awareness of that within the public. So that's why we're seeing a lot of these issues in the news a lot more, and we will continue to do so - especially after January 6th.

Tiffany  5:40  

Yeah, it's true that the internet, and social media especially, have been weaponized for a lot of political, religious, and racial goals behind the scenes. So the second question is, why are you personally invested in this topic?

Brandon  5:58  

For me, when I was growing up and going to high school, I didn't really have a ton of friends. I mostly kept to myself, and there were a lot of social anxieties that went into that. And where I found community was online. I joined a Star Wars website - I'm a big Star Wars fan. And that ultimately led me to Fandom, where I became a contributor before I was a staff member. So I always had very positive experiences online, and I always enjoyed my time doing that. And as I see where things are going, I feel a certain responsibility as a leader within the industry to make sure that people can continue to have that sort of positive experience, the same one that I felt I had. So it's very personal to me in that regard.

Tiffany  6:44  

I love it. Because you started as a user, you have the empathy, and given the years that have passed, you have the history of how the community has evolved to really make the decisions today to build trust and safety for the community. So the last question is: as you're part of Oasis Consortium and leading the charge, especially from an accountability perspective, we aim to set the guardrails for building brand and user safety. What do you think the world would become if we don't build such guardrails and principles?

Brandon  7:19  

I think it'll become a world - and you could argue whether this is the right call or not - where if we don't self-regulate, regulation will be imposed upon us. I listen personally to how some members of Congress, for instance, talk about the internet... there was famously a senator who asked the CEO of Google how to fix his iPhone. I don't feel like we're necessarily dealing with a government that knows a lot about the internet. So if we want to avoid what we might think could be overly imposing regulations - or maybe just the wrong regulations would be a better way to say it - then I think it's important for us to be able to get our own house in order, or somebody else will do it for us. The tea leaves are pointing that way: we just had a presidential election where both candidates, both administrations now, are in favor of, for example, the repeal of Section 230, for wildly different reasons. But that's the direction that things are trending, and I think we need to get ahead of that trend.

Tiffany  8:26  

So true. There is a gap. When we build brand and user safety principles, we try to get the leaders from the security and trust and safety space to talk with people from the brand advertising world - there's a gap between these two worlds. But there is also a gap between the private and public sectors: how can we come together and tell policymakers and politicians what this really means? Because they need to have empathy - as you have had, being a user in a community - to make the decision calls that will be enforced through law and policy.

Brandon  9:08  

And I also think it just makes good business sense. Spectrum Labs did a talk last week where one of the panelists said, "The job of content moderation, the job of trust and safety is to make sure that what the marketing team says about the brand is actually true." So to me, it just makes business sense to live up to what your highest aspiration as a company is.

Tiffany  9:31 

Spot on, Brandon. The content moderation needs to be aligned with the community policy, which needs to be aligned with the brand promise. So you are a platform of many platforms, and each community has its own policy and a promise to its users. How do you ensure global policy enforcement when you are a platform of so many platforms?

Brandon  10:03  

So it has to start from having clear terms of use that people can understand. And that has to be as consistent as it possibly can be. I think where companies fall into traps, especially recently with the de-platforming of certain high-profile individuals, is the tension around the inconsistency there. Like, why that person? Why now? What's the specific rule? And I think even if you agree with the action, you still sometimes wonder why exactly they're carrying it out now versus two years ago or something like that. So that's really important just to get buy-in, I think, from your community and from your users. And then on Fandom, with 250,000 wikis, they're all going to have their own set of policies as well. One of the things that we're coming out with is essentially a policies policy, which says that you need to have some sort of policy, even if it's a really basic one. So administrators on wikis aren't able to go in and just ban people for no reason, or just because they don't like them - there's got to be a clear rule that's being violated. And that's true on a global level, regardless of what the platform is. A key part of the term trust and safety - I think we always focus on the safety part, but it's trust: people need to trust you, and trust that you're going to consistently and fairly enforce policies. If they don't, then your social dynamic doesn't work as well as it could.

Tiffany  11:40  

Yeah, it's so true that you can have a policy, but if you don't enforce it, or if you enforce it but don't back it up with transparent reporting and a transparent process, then you lose the original purpose of having the policy, which is to gain trust from the community. And another point that is very true, coming back to what you alluded to, is how trust and safety ties to community growth and the whole brand promise. When a user chooses a platform, they believe in its ideology, philosophy, and brand. So if you do not demonstrate that you stick to your ideology and brand, you end up losing your users. And one thing at Oasis that we talk a lot about is speed to trust becoming a community, platform, and brand differentiator in the coming decade.

Brandon  12:40 

Yeah, I think having trust and safety be an extension of your company values is really important too... the marketing angle of living up to how you talk about yourself externally, that's important, but you have internal customers as well. And if you're coming out with a certain policy - regardless of how well-intentioned somebody wants a policy to be, regardless of how, let's say, nonpartisan you may feel going into it - there's going to be somebody who accuses you of some sort of bias. I've just accepted the premise that somebody is going to say I'm being too liberal at some point. But if you can clearly demonstrate where it's rooted in the DNA of your company, I think the better off you are. One of our values at Fandom is, "We bring joy." So when I think about the policies that we're going to have and how we're going to evolve them over 2021, I look at how we're a platform called Fandom. It's about pop culture. Sure, somebody may feel that they can get into a debate about Trump versus Biden or something, but how is that bringing joy? They may have fun arguing with each other, but everybody around them is rolling their eyes and saying, “I don't want this in my community.” So I think rooting it in not just your brand story, but your corporate values, is a really important piece of the puzzle.

Tiffany  14:00  

Thank you so much Brandon for leading the charge of building accountability around trust and safety for your community and the industry. Thank you for coming on the show today!

Brandon  14:11  

Thanks for having me. And thanks for calling me a veteran earlier! I've never been called that in the context of this before. So that was fun.

Tiffany  14:17  

Oh, man, how many years have you been working in community?

Brandon  14:20  

I've been doing this for 11 years, but I'm also only 32. So I'm like, wait... I'm allowed to be a veteran? That's not right.

Tiffany  14:29 

Come on. You are a veteran! Thank you so much, Brandon, for today and for sharing your experience.

Brandon  14:36  

Thanks for having me, Tiffany. It's always a pleasure.