Data privacy is a common concern in contemporary conversations. With governments worldwide trying to launch data privacy, market regulation, and anti-trust bills to counter Big Tech’s dominance of the online world, social media and networking have also come under scrutiny. However, with few alternatives to platforms such as Google, Facebook, and Twitter, consumers have become dependent on centralised social media<1>.
Social media and social networks have become an irremovable part of individual existence. Networks like Facebook and Google dominate not only access to their own services but also identity verification on most other services online, ranging from private banking and online dating to work tools across industries.
At their foundation, social networks aimed to connect people and act as social tools. Still, with growing databases and audiences, social networks have become more than an online representation of socialising; they became tools for data collection and market analysis. As a result, these platforms became owners of the data hosted on them rather than remaining media platforms meant to boost user reach and amplify users’ posts.
This databasing of individuals has allowed users to be connected to their extended circles, find connections based on interaction and geography (friend-of-a-friend) and access information and advertising that holds personal relevance.
However, social media has also removed users’ agency over what they can post and which posts they can access. In addition, social media contributes significantly to echo chambers and to the censorship of users’ submissions.
Advantages of Decentralised Online Social Networks
These drawbacks have encouraged many, especially staunch believers in Digital Ledger Technology (DLT), to migrate to Decentralised Online Social Networks (DOSN). DOSN uses peer-to-peer infrastructure to host data and interactions rather than centralised information by one organisation.
While DOSNs face many challenges, such as climate concerns over the growing number of users and servers and issues of unregulated access by minors, the main challenge for most DOSNs remains finding an audience. Centralised social media attracts people by providing entertainment and the ability to interact with others. Data privacy is, therefore, not a primary concern for most “normal” users compared with finding contacts, staying connected, and accessing different services.
Thus, DOSN must establish itself as a good alternative to attract an audience and ensure data privacy.
While many DOSNs like Mastodon and Pleroma attempt to emulate sites like Twitter and Facebook without the issues of centralised ownership, these platforms have drawbacks that need to be addressed.
On most DOSNs, users can install, own, and manage their own servers, called independent instances in this context. These instances let users participate in the community and interact on the platform while retaining ownership of their data.
To enable users across such instances to interoperate, federation protocols (shared standards that let independently run servers authenticate one another and exchange content) allow information and interactions to flow across DOSN instances, creating a larger interconnected community. This process creates a landscape of interaction similar to that of traditional social media, but on physically separate servers.
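As a toy illustration of this relay step (the instance names, message shape, and `federate` function here are hypothetical, not any real protocol), federation can be pictured as independent servers delivering a user’s post to followers hosted elsewhere:

```python
from dataclasses import dataclass, field

@dataclass
class Instance:
    """A single, independently administered DOSN server."""
    domain: str
    inbox: list = field(default_factory=list)

@dataclass
class Post:
    author: str   # e.g. "alice@social-a.example"
    content: str

def federate(post: Post, home: Instance, followers: dict) -> None:
    """Deliver a post to the home instance and to every remote
    instance that hosts at least one follower of the author."""
    home.inbox.append(post)
    for instance in followers.get(post.author, []):
        if instance.domain != home.domain:
            instance.inbox.append(post)

# Two physically separate servers forming one logical community.
a = Instance("social-a.example")
b = Instance("social-b.example")
post = Post("alice@social-a.example", "Hello, fediverse!")
federate(post, home=a, followers={"alice@social-a.example": [b]})
```

In real deployments, this delivery step is standardised by federation protocols such as ActivityPub (used by Mastodon and Pleroma), with signed server-to-server requests rather than in-memory lists.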
One of the highlights of this system is free speech outside the remit of Big Tech. Yet while Big Tech censorship is one of the drawbacks of centralised social media, removing it adds to the scope of challenges in DOSNs. Traditional moderation usually addresses hate, racism, sexism, and violence. Alongside supporters of data privacy, however, many bad actors who have been censored on traditional social media have also migrated to DOSNs.
Thus, most DOSNs now require content monitoring and moderation, which administrators of instances address on a case-by-case basis.
Concerns around DOSN
These administrators of instances, however, may hold no authority over, or knowledge of, the area of conflict; they are usually ordinary users attempting to maintain peace and grow the user base of their instances. When such administrators block content chains or other users on criteria meant to increase participation, further questions and concerns arise, such as:
- What policies should be formulated and applied, and to which instances?
- What is the scope of content moderation without venturing into censorship?
- How will federation policies targeted at bad actors affect the larger user base?
These questions are important to answer, as collateral damage to a larger user base based on the actions of a few bad actors can lead to issues of censorship and audience loss that already afflict centralised social media.
Collateral damage to users can also deter new users from joining DOSNs. As mentioned earlier, users tend to join social platforms where their friends already are, i.e. the network effect. This could be one of the primary reasons some decentralised networks struggle to accumulate a non-floating user base. If a user’s post or instance is tagged or rejected unfairly, it could result in growing user dissatisfaction, especially if the user has friends on an instance they are banned from following. Negative experiences like these can create distrust and dissatisfaction in the user base and contribute to the loss of that audience.
Solution Scope
User-driven policies, designed on a rolling basis to streamline and incorporate novel requirements, should be prioritised to enable administrators to moderate their instances. An example is a Tagging Policy, an approach in which users allocate tags to, or report, material. A streamlined moderation interface devised to make tagging individual posts and users straightforward (potentially assisted by automated classifiers) is a solution that empowers users on both ends of the pipeline.
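A minimal sketch of such a tagging policy might look as follows; the threshold, tag name, and class design are illustrative assumptions, not any platform’s actual rules. User reports are counted per post, and a post surfaces to the instance administrator once enough independent users flag it:

```python
from collections import defaultdict

REVIEW_THRESHOLD = 3  # illustrative: reports needed before moderator review

class TaggingQueue:
    def __init__(self):
        # post_id -> set of users who tagged it (a set prevents double-counting)
        self.reports = defaultdict(set)

    def tag(self, post_id: str, reporter: str, tag: str = "needs-review"):
        """Record one user's tag/report on a post."""
        self.reports[post_id].add(reporter)

    def for_moderation(self):
        """Posts with enough independent reports to warrant admin review."""
        return [post for post, who in self.reports.items()
                if len(who) >= REVIEW_THRESHOLD]

queue = TaggingQueue()
for user in ["u1", "u2", "u2", "u3"]:   # u2's duplicate report is ignored
    queue.tag("post-42", user)
```

Requiring multiple independent reporters is one simple guard against a single user weaponising the reporting tool against content they merely dislike.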
Another framework, developed under an EU-funded research and innovation project in collaboration with Mastodon (a popular DOSN), is the EUNOMIA framework. This concept is based on a circular, data-driven approach: data is collected and shared within a defined security and privacy framework, which is built on an analytical system with a Human-as-Trust-Sensor (HaTS) component.
The HaTS component is one of the paradigms employed in frameworks such as EUNOMIA. It leverages an individual’s ability to consume data visually and logically to evaluate trustworthiness: origin data is collected and represented for human consumption, and users validate it with their own judgement using voting tools. This system is then supported by machine learning on a rolling basis, performed locally (on users’ devices) or remotely (on EUNOMIA servers), allowing EUNOMIA and other HaTS-based frameworks to remain entirely independent of third-party centralised cloud servers.
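The voting side of this paradigm can be sketched as below. The simple averaging rule, thresholds, and labels are assumptions for illustration only, not EUNOMIA’s actual scoring model; they show how human votes could produce a provisional trust score that a classifier might later refine:

```python
def trust_score(votes: list) -> float:
    """Aggregate human trust votes (+1 trustworthy, -1 not) into a
    score in [-1, 1]; 0.0 means undecided or no votes yet."""
    if not votes:
        return 0.0
    return sum(votes) / len(votes)

def label(score: float, threshold: float = 0.5) -> str:
    """Map the crowd's score onto a coarse label shown beside a post."""
    if score >= threshold:
        return "likely-trustworthy"
    if score <= -threshold:
        return "likely-untrustworthy"
    return "undecided"

# Four users act as "trust sensors" on one piece of content.
votes = [+1, +1, +1, -1]
score = trust_score(votes)
```

Because the aggregation needs only the votes themselves, it can run on a user’s own device, which is what lets HaTS-style frameworks avoid depending on a centralised cloud service.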
Governance and DOSN
DOSNs do not rely on centralised governance systems. Unlike traditional social media, since DOSNs are user-centric and user-controlled, no single jurisdiction can govern the social network. Thus, users are protected only under the federation policies of their instances, not by governments or government systems.
While this is the true definition of a free network and enhances freedom of speech and access, it also strengthens any bad actors who may use such platforms, whose numbers will only grow with the popularity of these platforms.
For this reason, it is essential for DOSNs to not only create federation policies on a case-by-case basis but to host policies around hate speech and radicalism that are not dependent on instance administrators but instead on the platform itself.
Here, policies that address the foundations and areas of impact of disruption will help counter bad or radical actors more effectively than policies that address contemporary issues in microscopic detail.
New generic and foundational policies could be outlined that rely on trusted, curated lists of well-known instances in the fediverse (the colloquial term for the interconnected network of federated servers). An administrator could select the relevant lists and apply them to instances, when necessary, under a common, well-understood consensus.
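A sketch of how an administrator might apply such curated lists is below; the list names and domains are hypothetical, and real implementations would fetch and verify the lists rather than hard-code them. The admin’s effective policy is the union of the curated lists they opt into, consulted before federating with a peer instance:

```python
# Hypothetical curated denylists, each maintained by a trusted party.
CURATED_LISTS = {
    "hate-speech":   {"badsite.example", "worse.example"},
    "spam-networks": {"spamfarm.example"},
}

def build_denylist(selected: list) -> set:
    """Union of the curated lists this instance's admin opted into."""
    deny = set()
    for name in selected:
        deny |= CURATED_LISTS.get(name, set())
    return deny

def may_federate(peer_domain: str, denylist: set) -> bool:
    """Check a peer instance against the admin's combined denylist
    before exchanging content with it."""
    return peer_domain not in denylist

# An admin subscribes to both lists and applies the combined policy.
denylist = build_denylist(["hate-speech", "spam-networks"])
```

Delegating list curation to well-known maintainers keeps the foundational judgement (what counts as hate speech or spam) out of the hands of any single instance administrator, while the opt-in step preserves each instance’s autonomy.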
While DOSNs are a space supported by the tech-savvy and by believers in data privacy and free speech, many areas within them require internal governance if the space is to be a genuinely safe platform that not only allows ‘access to all’ and encourages free speech but also ensures ‘access to all’ isn’t hindered by ‘safety to all’.
Thus, solving the issues of reducing the impact of bad actors, collateral damage, loss of user base and internal governance is vital before a more extensive audience moves to DOSNs that rely on individual empowerment over centralised ownership or government jurisdiction.
<1> Social media owned by a single organisation or conglomerate.
The views expressed above belong to the author(s).