One of the Oasis User Safety Standards’ principles is “People.” This means organizations should develop policies that value representation, learning, and wellness, while remaining mindful of the societal implications of new technologies.
We spoke about this imperative with someone who is doing a ton of work to grow and diversify the ‘Responsible Tech’ community, representing those who care deeply about the impact of technology: Rebekah Tweed.
Rebekah is the Program Director for All Tech Is Human, a non-profit organization that creates pathways and develops opportunities for individuals from various cultural, academic, and professional backgrounds to get involved in the field.
From Music Publicist To AI Skeptic
Like many people in the industry, Rebekah's pathway to the field is non-traditional. She first became concerned about the impact of new technologies on society while still working as a music publicist.
"My interest was particularly piqued when Grimes in her pre Elon Musk days, went on a neuroscience podcast and said that live music is going to be obsolete soon because of AI" she told us on the show.
The idea that ours could be the last generation of human artists, and the implications this could have for musicians, led her to look into tech's impact on society and to study the ethics of artificial intelligence.
Rebekah soon discovered that AI applications in the industry span music composition, where generative models can produce original pieces or assist composers across different genres; music recommendation systems, such as those used by Spotify and Pandora, which provide personalised song suggestions and curated playlists based on individual listening habits; all the way to music generation, with models like DeepBach using deep learning techniques to produce harmonisations.
AI algorithms are also capable of analysing large music datasets, extracting patterns, structures, and other musical features to provide insights into music theory, composition trends, and genre classification.
As she dug in, Rebekah realised that these technologies had been used for years within the music space with no significant oversight, and in more controversial scenarios. One example: back in 2018, cameras with facial recognition technology were deployed at a kiosk showing Taylor Swift rehearsal clips at one of her concerts, with the purpose of luring in any of the hundreds of the artist's known stalkers, who could then be identified and apprehended.
Rebekah started asking herself questions: Was it okay to capture the biometric data of thousands of people without their consent or knowledge? Did it matter that it was done for the safety of a single person? How much privacy would people be willing to trade for safety and security?
Those privacy issues sparked her interest in responsible tech more broadly, driven by the undeniable fact that online harms affect real people's offline lives. Studies have shown that victims of cyberbullying are at higher risk of depression, anxiety, and suicidal ideation. Similarly, misinformation propagates rapidly through social media platforms, creating cascades that amplify public misconceptions. During the COVID-19 pandemic, for example, the dissemination of false claims and conspiracy theories on social media hindered public health efforts and contributed to vaccine hesitancy, according to research conducted by the Massachusetts Institute of Technology (MIT).
Designing The New Web With A Human Approach
All Tech Is Human was founded by David Ryan Polgar in 2018. It’s a nonprofit based in Manhattan, but with a global lens and reach, committed to building a stronger responsible technology ecosystem.
Rebekah joined a few years later to support the organization in building a more responsible tech ecosystem, one that serves not only technologists but also the public interest, by fostering a diverse, multidisciplinary, and multi-stakeholder tech industry.
A huge focus for All Tech Is Human is “design justice”, an approach to design led by traditionally marginalized communities that aims to challenge structural inequalities.
“It's a proactive Trust and Safety approach. And if we can design new immersive products and features in such a way that it is led by marginalized communities and it's structured with that user experience in mind, then we will be able to minimize misuse and we will be able to promote positive uses,” she explains.
The discipline looks at creating welcoming, immersive online spaces while minimizing potential harms. Without a change in the industry’s trajectory, today’s online safety problems will be exponentially worse in the metaverse and Web3: immersive experiences amplify the impact of toxicity and come with their own set of vulnerabilities. And the fact that these spaces are interoperable makes content moderation that much more challenging, and that toxicity much more intractable.
But, asks Rebekah, how do we empower all types of users to have an enjoyable experience with the tech of the future? And how do we engage with them so that they feel valued and that their voices matter? If we can empower these users, all types of people will want to spend time in those spaces, because safety will become an increasingly decisive factor in whether people adopt the metaverse into their daily lives.
Design justice does some of that heavy lifting on the front end of the trust and safety process, and it also protects and benefits the workforce, helping companies retain talent and prevent burnout. One important way to do that, of course, is to make sure that brands bring a wide array of talent, with a diverse set of backgrounds, disciplines, lived experiences, and perspectives, into that process.
It is well-known that the best way for a company to ensure that all types of users will want to spend time on an immersive platform is to have the right people working on it. Even if a product has the best graphics and the most intuitive user interface, and checks all of the boxes, “if it is a toxic, predatory, abusive, unsafe online space, people won't want to be there,” Rebekah remarks.
By contrast, a diverse workforce puts you in the best position to perceive blind spots, gain a full perspective on the problems your product may face, and work out the right solutions to fix them. DEI is a business imperative.
Ultimately, the people designing, developing and deploying new technologies should reflect the diversity of those people who will be impacted by those technologies, who will be using them and who those technologies will be used on. This is part of the motivation for the Responsible Tech Job Board – the best way to improve our tech future is to work in it.
Key Trust & Safety Lessons
- Always ask yourself what the right balance is between privacy, trust, and security, and adopt measures to implement the right checks and ethical policies to safeguard your users
- Prioritise design justice, an approach to design that puts traditionally marginalised communities at its core and specifically aims to challenge structural inequalities, to retain not only customers but your own workforce as well
- Create a safe, diverse work environment if you want your product to perform well, reflect its users, and foster a community around it