
The Path to Rebuild Trust for the Coming Era of Web 3.0



How can we build trust for the coming Web 3.0? This week our host, Tiffany Xingyu Wang, walked delegates through an answer to this delicate and important question in her opening keynote at the 4th annual HUE Tech Summit 2021, a premier conference powered by the Tech Women Network in partnership with LinkedIn Learning, The New York Times, Urban Outfitters and SAP.

The loss of trust in Web 2.0

At present, we are living in Web 2.0, commonly known as the “social web”: an era characterised by the power of people’s data and dominated by a few large platforms that own the process of generating it. As Tiffany highlights in her opening observations, these platforms can literally “see us”, thanks to all the identifiable information companies hold about us, from our name, address and phone number to our interests, digital footprint and consumer tastes.

It is in this era of the web that we lost trust. Alarmingly, a cyber attack occurs every 30 seconds, and over 40% of Internet users in the US report having been harassed. One of the most shocking concerns is that facial recognition AI identifies light-skinned males 34% more accurately than dark-skinned females. So how can we trust a web where our data is breached, our safety is threatened, and biased AI discriminates against human beings?

But there is still a path to fix this. As platforms look to rebuild trust for the coming era of Web 3.0, Tiffany urges leaders to build on three crucial pillars of design:

  • Privacy
  • Safety 
  • Representation

Pillar 1: Privacy

According to estimates, over 50 billion devices will be online by 2025. With the perimeter of connected devices becoming so vast, protecting data will be one of the hardest challenges in maintaining ethics in Web 3.0.

The priority for any technology leader is to put people at the centre of data protection, and we can address this challenge through three movements:

  • Baking automated and orchestrated data protection into the design phase of the system, combining it with content management
  • Implementing differential privacy, a mathematical framework that ensures no more information can be derived from published data than is intended to be shared
  • Using blockchain, a tamper-proof, distributed ledger, to protect privacy and security
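The differential-privacy idea above can be sketched with the classic Laplace mechanism. This is a minimal illustration, not code from the talk, and the data and names here are hypothetical:

```python
import random

def private_count(values, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon masks any individual's presence. A Laplace sample can be
    drawn as the difference of two independent exponential samples.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return len(values) + noise

# Hypothetical example: publish roughly how many users filed reports,
# without revealing whether any specific user appears in the data set.
reports = ["user_a", "user_b", "user_c"]
noisy_total = private_count(reports, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the published figure stays useful in aggregate while no single person's presence can be inferred from it.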

Pillar 2: Safety 

The pandemic has catalysed the world’s digital migration. The web is now the place we go to play, learn, socialise and even date. Whilst the freedom and opportunities of the Web are vast and exciting, it is also quite scary that there is no governance in this digital universe.

When we do not address safety, our platforms and brands suffer a loss of trust. We know that unsafe content is more likely to be pulled from distribution channels, and that it ultimately drives consumers to abandon the purchase journey. Safety is tied to trust.

There are two sides to addressing the safety issue: technology and governance. Technology can only be relied on when it is given enough context to work effectively and accurately, and governance requires us to embrace and define a set of rules so we can truly act on a shared consensus to make the Web a safe place.

This is exactly what Oasis Consortium is doing by building industry trust through our five core values of:

  1. Openness
  2. Accountability
  3. Security
  4. Innovation
  5. Sustainability

Pillar 3: Representation

Lack of representation is a crucial part of the trust issue. In the physical world we suffer endless biases, discrimination and inequality. As we enter an appealing new world shaped by AI, it is critical to reverse bias in the machines, ensuring diversity and inclusion and that people of all colours are represented fairly.

There are a few things we can do to ensure fairer representation:

  • Detect the inputs and outputs that could discriminate
  • Hire legal and technologist teams who monitor the process
  • Act on the content by making it more inclusive and diverse so AI learns in a fair way
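The first step above, detecting discriminatory outputs, often starts with a simple audit: compare a model's accuracy across demographic groups, which is exactly the kind of check that surfaces gaps like the recognition disparity cited earlier. A minimal sketch, with hypothetical group names and data:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute per-group accuracy for a set of model predictions.

    Each record is (group, predicted_label, true_label). A large gap
    between groups is a signal of bias worth investigating.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical audit data illustrating a disparity this check would flag.
audit = [
    ("group_a", "match", "match"),
    ("group_a", "match", "match"),
    ("group_b", "match", "no_match"),
    ("group_b", "match", "match"),
]
print(accuracy_by_group(audit))  # {'group_a': 1.0, 'group_b': 0.5}
```

In practice such audits run continuously over both inputs and outputs, feeding the legal and technologist review teams mentioned above.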

We hope Tiffany’s three-pillar framework will inform every conversation amongst fellow leaders, cybersecurity professionals and privacy regulators. Trust in the internet, and especially in the incoming Web 3.0, can be a powerful medium to correct the wrongs and amplify the beauty of the physical world, if built ethically.

Watch the full interview below or listen to the Brand Safety Exchange podcast on Apple Podcasts and Spotify.

Key Lessons To Amplify Your Brand’s Safety:

  • Put people at the centre of data protection
  • Provide technology with enough context to address safety issues effectively and accurately
  • Define a set of rules to truly act on the consensus to make the Web a safer place
  • Act on the content used for machine learning by making it more inclusive and diverse