Published on Jun 23, 2022
With increasing gender-based violence in online spaces, the need of the hour is holistic user protection.
Gender-based violence in private online spaces

Cyberspace is growing rapidly to encompass the professional, personal, and social aspects of our lives. Online communities have allowed even minority groups to reach large audiences, and finding communities that echo one's beliefs has entrenched the opinions of many. Unfortunately, this newfound agency for the marginalised has triggered reactions from privileged sections. With greater access and the allowance of anonymity, abuse, oppression, and other forms of violence continue to fester. This anonymity carries greater weight in private online spaces than in public ones: in public spaces, the target of soft violence, especially women and other minority genders, can still call upon their own community and use it as a defence equal in number to the perpetrators of gender-based violence (GBV). Private online spaces, by contrast, create an illusion of access because they are not enclosed in the way physical spaces are. This perception of access, along with the deepening of para-social relationships, has made invasion of privacy and harassment much easier. Even within the digital realm, GBV has established a clear presence, especially towards the disenfranchised.


Crime and associated offences are hard to define in cyberspace, and especially so in private online spaces. However, a few forms of GBV are clearly demarcated as offences by the Indian Penal Code (IPC) and are offensive in general, irrespective of the identity of the participants, though in most cases women are at the receiving end of such invasions. In this discussion, GBV includes sending lewd unsolicited pictures, making threatening comments, using technology to distort images and spreading them without consent, using deepfake technology to create and/or spread non-consensual intimate images, cyber flashing, and similar acts. Several provisions of the IPC and the Information Technology (IT) Act offer some protection against such violence, including those covering email spoofing (Section 463, IPC), cyber-hacking (Section 66, IT Act), sending obscene messages (Section 66A, IT Act), and pornography (Section 292, IPC).

Spreading via social media

Despite the existence of these legal provisions, reports of online harassment increased by 110 percent between 2018 and 2020, and cyber-based sexual exploitation accounted for 6.6 percent of all cybercrimes in 2020. This growth has coincided with the proliferation of private online spaces: social media platforms, video conferencing applications, dating platforms, and online networking platforms. Some social media and dating apps, most prominently Bumble, have successfully lobbied for regulation in Texas and California, and these bills have helped create a safer online environment. Dating apps have also introduced Artificial Intelligence (AI) tools that scan images for offensive content, such as guns, depictions of violence, or lewd displays of body parts, and leave the decision to view flagged images with the recipient, as sketched below; similar controls apply to spam accounts and users who violate platform guidelines. Many of these applications also use photo verification to ensure that users do not create fake profiles during sign-up. However, similar photo verification, combined with user complaints, has been turned against the profiles of transsexual people. In 2019, Tinder began attempting to counter such harassment and exclusion to make its platform safer for all; yet, with an automated system behind most account bans, it remains difficult for transsexual people and other members of the LGBTQIA+ community to find security and peace in using these platforms. Despite the clear need for one, there is no trans-inclusive sexual harassment law in India.
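To illustrate the kind of image screening and consent flow described above, here is a minimal sketch, assuming a hypothetical classifier, labels, and threshold; it is not Bumble's or Tinder's actual system.

```python
# Illustrative sketch only: the classifier below is a hypothetical placeholder
# standing in for a real image-moderation model, not any platform's actual system.
from dataclasses import dataclass

FLAGGED_LABELS = {"weapon", "graphic_violence", "lewd_content"}

@dataclass
class Detection:
    label: str
    confidence: float

def classify_image(image_bytes: bytes) -> list:
    """Placeholder: a production system would run a trained detector here."""
    return [Detection("lewd_content", 0.93)]

def deliver_image(image_bytes: bytes, threshold: float = 0.8) -> str:
    """Blur flagged images and leave the decision to view with the recipient."""
    detections = classify_image(image_bytes)
    if any(d.label in FLAGGED_LABELS and d.confidence >= threshold for d in detections):
        return "blurred_with_consent_prompt"  # recipient chooses whether to reveal
    return "shown_directly"

print(deliver_image(b"example image bytes"))  # -> blurred_with_consent_prompt
```

The design choice worth noting is that flagged content is hidden behind a consent prompt rather than deleted outright, so the final decision stays with the recipient.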


This version of GBV against transsexual people is mirrored across platforms, including non-personal platforms such as LinkedIn. On a professional platform like LinkedIn, it is harder to find trans persons; those who are visible usually have a political or activist platform outside their professional career. Though LinkedIn, as a platform, does advocate for trans visibility and inclusion, the community remains a minority even online, and their mere presence often attracts offensive and transphobic comments, both on the publicly visible parts of the platform and in private interactions.

LinkedIn, a platform for professional connection and communication, has also seen an increase in harassment reports and reports of inappropriate private messages. To counter this, the platform has begun to use AI and machine-learning tools to grade users by their usage and interactions and to reduce harassment at its point of origin (a rough sketch of such scoring follows below). Though a step forward, this approach still places the responsibility on those at the receiving end to make decisions and report the harassment. That raises two main issues: users' lack of control outside the platform, and perceived para-social relationships that may hinder reporting.

Deepfake technology is an example of users lacking control over GBV outcomes. Often used to create false pornographic content, it has almost immediate social consequences when used against marginalised genders. The law, both in India and globally, is currently ill-equipped to contain the spectrum of deepfake use, especially where it amounts to sexual harassment. Though the IPC does protect against receiving unwarranted obscene images and videos, the technology offers perpetrators many escape routes, ranging from freedom-of-speech claims to copyright law. The solution would be a comprehensive regulation to oversee the creation and circulation of non-consensual videography and pornography.

GBV in private spaces is often a result of para-social relationships. The perception of these relationships affects not just the offender but also the victims, with some feeling guilty about reporting a "connection" and others not viewing the harassment as offensive because of a previous or underlying relationship. The process of reporting puts the onus on marginalised genders to ensure that those sending lewd messages are monitored, removed, or banned. This subjective policing prevents many who are disenfranchised by status or position, especially professional position, from reporting instances that they may deem insignificant forms of harassment.
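As a rough illustration of what "grading users by their use and interaction" could mean in practice, the sketch below accumulates a simple per-user risk score from message reports and blocks; the signals, weights, and thresholds are assumptions made for this example, not LinkedIn's actual model.

```python
# Illustrative sketch: a naive per-user risk score built from interaction signals.
# The signals, weights, and thresholds are invented for the example and are not
# any platform's real method of grading users.
from collections import defaultdict

WEIGHTS = {"message_reported": 3.0, "blocked_by_recipient": 2.0, "connection_accepted": -0.5}
SUSPEND_THRESHOLD = 10.0

scores = defaultdict(float)

def record_event(user_id: str, event: str) -> None:
    """Update a user's running score when a moderation-relevant event occurs."""
    scores[user_id] += WEIGHTS.get(event, 0.0)

def moderation_action(user_id: str) -> str:
    """Map a user's score to an action, intervening before harassment spreads further."""
    score = scores[user_id]
    if score >= SUSPEND_THRESHOLD:
        return "suspend_account"
    if score >= SUSPEND_THRESHOLD / 2:
        return "limit_messaging"
    return "no_action"

for event in ("message_reported", "message_reported", "blocked_by_recipient"):
    record_event("user_42", event)
print(moderation_action("user_42"))  # -> limit_messaging
```

Even in this toy form, the score only moves when a recipient reports or blocks, which is precisely the burden-of-reporting problem described above.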


Frequently, the victim is blamed for reporting sexual harassment, and the fear this instils limits reporting on non-physical platforms. Current sexual harassment laws do not protect marginalised genders from online workplace harassment and must be extended to do so as the law progresses. Online harassment on professional platforms has created a loophole for workplace harassment that is not regulated or monitored directly by employers. As many of these platforms are meant for use within private communities, victims of such abuse may not report the first few instances of soft violence, under the guise of familiarity, for fear of retaliation from larger communities, and for fear of ostracisation, both social and professional. These fears largely stem from the lack of accountability placed on the offender; under Section 67A of the IT Act, for instance, even the viewer of a lewd image can be held accountable.

The integration of online and physical GBV grows with technology. These areas of private violence include online surveillance and marketing platforms that target the urban upper class and monitor the household labour of the urban underprivileged, such as cameras in housing societies and gateway applications that log entries and exits. The guise of safety and concern has been used to monitor and restrict women, and that of networking, friendship, and romance to approach them across platforms. Technology has been used not only to offend but also to control marginalised genders online. This has initiated a second wave of GBV in the physical world: in surveilled housing complexes, in workplaces that rely on LinkedIn and Zoom, and in the personal lives of those who use social media and online dating.


The existing scenario

Currently, no comprehensive national or global law or regulation mandates user protection across these growing platforms. Though efforts are being made, they have not kept pace with the rate of innovation. Platform-specific guidelines and outdated regulations are not enough to address contemporary issues, and alternatives are required. These may include regulations for different technologies and industry verticals, and not just for their use cases, much as drone policies have emerged to govern the use of drones alongside broader IT and imagery policies. Implementing only one of the three levels of regulation (government, platform, or industry/technology) will always leave scope for loopholes and permit GBV to continue. Holistic user protection needs to be ensured through integrated regulation.
The views expressed above belong to the author(s).

Author

Shravishtha Ajaykumar


Shravishtha Ajaykumar is Associate Fellow at the Centre for Security, Strategy and Technology. Her fields of research include geospatial technology, data privacy, cybersecurity, and strategic ...
