
Big brother watching: Proposed Digital India Act will cut both ways on privacy matters


The proposed Digital India Act could model itself on the European Union’s Digital Services Act, which requires governments’ takedown orders to be proportionate and reasoned, and allows them to be challenged.

The government is now in the final stages of formulating a law that has the potential to revolutionise India’s digital space. Under it, the Union government plans to make internet and social media firms accountable for the algorithms they deploy to aggregate content. These algorithms use users’ browsing records and profiles, which has been flagged as a major privacy concern. If the Digital India Act comes into force, India will be among the first nations to mandate legal oversight of such proprietary code.

The Digital India Act is modelled on the drafts of the country’s Telecom Bill and the Digital Personal Data Protection Bill, which were released earlier this year. The DIA will include provisions to regulate the way in which large internet firms process data. The government believes the law will ensure that social media firms cannot identify an individual user from an anonymised dataset, and thus avert harm to users. The premise is simple: the government wishes to prevent the misuse of Indian citizens’ data. The proposed legislation will replace the 22-year-old Information Technology Act, 2000, which has governed the country’s technology sector to date.
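To see why re-identification is a real concern, the following toy Python sketch, in which every field name and record is purely hypothetical, shows how a supposedly anonymised browsing dataset can be linked back to named individuals by joining quasi-identifiers such as age, gender and pincode with publicly available information.

# Toy sketch only: all names, fields and records below are hypothetical.
# It shows how an "anonymised" dataset can be re-identified by joining
# quasi-identifiers (age, gender, pincode) with publicly available data.

anonymised_browsing = [
    {"age": 34, "gender": "F", "pincode": "560001", "sites_visited": ["news", "health"]},
    {"age": 29, "gender": "M", "pincode": "110003", "sites_visited": ["sports", "shopping"]},
]

public_directory = [
    {"name": "A. Sharma", "age": 34, "gender": "F", "pincode": "560001"},
    {"name": "R. Verma", "age": 29, "gender": "M", "pincode": "110003"},
]

KEYS = ("age", "gender", "pincode")

for record in anonymised_browsing:
    for person in public_directory:
        if all(record[k] == person[k] for k in KEYS):
            # The "anonymous" browsing record is now linked to a named person.
            print(person["name"], "->", record["sites_visited"])

The point of the sketch is that stripping names from a dataset is not enough; a handful of ordinary attributes can be sufficient to single out an individual, which is the kind of harm the proposed law seeks to avert.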


How do social media algorithms work?

Algorithms are the building blocks of social media applications. They are at the heart of what users see in their feeds. For instance, internet firms such as Twitter and Facebook display news feeds tailored to each user’s location, interests and previously liked posts. The proprietary code can also analyse wide swathes of data to create specific profiles based on browsing or shopping history, along with other personal factors such as gender, age, friends and family.
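To illustrate the idea at a toy level, the following Python sketch ranks posts by combining such profile signals. The UserProfile and Post structures, the signals and the weights are entirely hypothetical and do not represent any platform’s actual proprietary code.

# Toy sketch only: structures, signals and weights are hypothetical.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    location: str
    interests: set = field(default_factory=set)      # inferred from browsing history
    liked_topics: set = field(default_factory=set)   # topics of previously liked posts

@dataclass
class Post:
    topic: str
    region: str
    popularity: float   # e.g. normalised engagement count

def relevance(post: Post, user: UserProfile) -> float:
    """Combine profile signals into a single ranking score for one user."""
    score = post.popularity
    if post.topic in user.interests:
        score += 2.0        # boost topics matching the inferred interest profile
    if post.topic in user.liked_topics:
        score += 1.0        # boost topics the user engaged with before
    if post.region == user.location:
        score += 0.5        # prefer geographically close content
    return score

def build_feed(posts: list, user: UserProfile, limit: int = 10) -> list:
    """Return the highest-scoring posts, i.e. a feed tailored to this user."""
    return sorted(posts, key=lambda p: relevance(p, user), reverse=True)[:limit]

The sketch makes the underlying trade-off visible: every signal a platform collects about a user feeds directly into what that user is shown next.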

India had earlier urged member states of the Global Partnership on Artificial Intelligence (GPAI) to work towards a framework to prevent the misuse of artificial intelligence and the user harm it can cause. It is not the only country concerned about the growing hold big tech firms have over ordinary citizens. The Australian government had previously raised concerns over the profiling of its citizens by the ByteDance-owned TikTok.

India is the second largest online market for social media companies such as Meta, Google and Twitter. User privacy has always been a concern for governments across the world, which have been brainstorming ways to tame these tech giants. Social media intermediaries handle large amounts of personal data on a day-to-day basis. With regulation in place, they may be asked to disclose, in specific cases, the methods by which they process the data they collect. However, officials insist that such requests will be entertained only with the backing of law enforcement agencies through court orders.

Digital India Act: A double-edged sword

While the stated purpose of the exercise is the protection of users, it may also make social media companies liable to disclose users’ personal data to the government, leading to massive breaches of privacy. Privacy experts describe such laws as double-edged swords.

A large number of activists across the world have decried government control of online speech. Social media platforms have contributed to the democratisation of public speech and carry the views of dissenting communities. With millions of users, social media shapes public discourse, but attempts to regulate these platforms can cut both ways. There is no doubt that social media has amplified hate speech against particular communities. Nonetheless, the increased stakes for free speech require a modern intermediary law from the global community, one that reimagines the role of governments.

Examples can be taken from existing laws such as the European Union’s Digital Services Act, which requires governments’ takedown orders to be proportionate and reasoned. In fact, under the DSA, intermediaries can challenge government decisions to take down content and defend themselves. Intermediary law must also devolve content moderation decisions to the platform level. Moreover, platforms themselves need to assume greater responsibility for content moderation, free from political pressure.
