With gender-based violence increasing in online spaces, holistic user protection has become the need of the hour.
Crime and associated offences are hard to define in cyberspace, and especially so in private online spaces. Nevertheless, a few forms of gender-based violence (GBV) are clearly demarcated as offences under Indian law, irrespective of the identity of the participants, though in most cases women are at the receiving end of such invasions. In this discussion, GBV includes sending lewd unsolicited pictures, making threatening comments, using technology to distort images, spreading such images without consent, using deepfake technology to create and/or spread non-consensual intimate images, and cyber flashing. Several statutory provisions offer some protection against these forms of violence, including those covering email spoofing (Section 463, Indian Penal Code (IPC)), cyber-hacking (Section 66, Information Technology (IT) Act), sending obscene messages (Section 66A, IT Act, struck down by the Supreme Court in 2015), and pornography (Section 292, IPC).
Targets of this soft violence, especially women and other gender minorities, can still call upon their own communities as a collective defence, matching the perpetrators of GBV (gender-based violence) in numbers.
This version of GBV, directed at transgender people, is mirrored across platforms, including professional ones such as LinkedIn. Trans persons are harder to find on LinkedIn, and those who are visible usually maintain a political or activist platform outside their professional career. Though LinkedIn as a platform does advocate for trans visibility and inclusion, the community remains a minority even online, and its members' mere presence often attracts offensive and transphobic comments, both on the publicly visible parts of the platform and in private interactions.

LinkedIn, a platform for professional connection and communication, has also seen an increase in reports of harassment and inappropriate private messages. To counter this, the platform has begun to use AI and machine-learning tools that grade users by their usage and interactions, aiming to reduce harassment at its point of origin. Though a step forward, this approach increases the responsibility of those on the receiving end to make decisions and report the harassment. This raises two main issues: users lack control over what happens beyond the platform, and perceived para-social relationships may hinder reporting.

Deepfake technology illustrates how users lack control over the outcomes of GBV. Often used to create false pornographic content, it carries almost immediate social consequences when deployed against marginalised genders. The law, both in India and globally, is currently ill-equipped to contain the spectrum of deepfake misuse, especially as a form of sexual harassment. Though the IPC does protect against receiving unwarranted obscene images and videos, the technology offers perpetrators many escape routes, ranging from freedom-of-speech defences to even copyright law.
The solution would be a comprehensive regulation to oversee the creation and circulation of non-consensual videography and pornography. GBV in private spaces is often a result of para-social relationships. The perception of these relationships affects not just the offender but also the victims: some feel guilty about reporting a “connection”, and some do not view the harassment as offensive because of a previous or underlying relationship. The process of reporting puts the onus on marginalised genders to ensure that those sending lewd messages are monitored, removed, or banned. This subjective policing prevents many who are disenfranchised by status or position, especially professional position, from reporting instances they may deem insignificant forms of harassment.
Frequently, the victim is blamed for reporting sexual harassment, and the fear this insinuates limits reporting on non-physical platforms. Current sexual harassment laws do not protect marginalised genders from online workplace harassment and will need to cover it as the law progresses. Harassment on professional platforms has created a loophole: workplace harassment that is not regulated or monitored directly by employers. Because many of these platforms are meant for use within private communities, victims may not report the first few instances of soft violence, under the guise of familiarity, for fear of retaliation from larger communities, and to avoid ostracisation, both social and professional. These fears mostly stem from the lack of accountability placed on the offender, despite provisions such as Section 67A of the IT Act, which penalises the publication and transmission of sexually explicit material.

The integration of online GBV and physical GBV grows with technology. These areas of private violence include online platforms of surveillance and marketing that are targeted toward the urban upper class and monitor the household labour of the urban underprivileged, such as cameras in urban housing societies and gateway applications that log entries and exits. The guise of safety and concern has been used to monitor and restrict women, and that of networking, friendship, and romance to approach them across platforms. Technology has been used not only to offend but also to control marginalised genders online. This has initiated a second wave of GBV in the physical world: in housing complexes under surveillance, in workspaces that use LinkedIn and Zoom, and in the personal lives of those who enjoy social media and online dating.
The views expressed above belong to the author(s).
Shravishtha Ajaykumar is Associate Fellow at the Centre for Security, Strategy and Technology. Her fields of research include geospatial technology, data privacy, cybersecurity, and strategic ...