Big tech, big money: Why India needs a fair play code for news

Big Tech must pay media a fair share of revenue, and free news from unfair algorithms and a biased digital ecosystem.

The rise of Big Tech companies like Google and Meta has brought immense benefits to our lives, but it has also exposed worrying trends of monopolistic control and unfair practices. Two such concerns impacting publishers globally are: Big Tech dictating terms for publisher data used in search engines, and unauthorised use of publisher content by generative AI developers. These issues threaten not only fair compensation for publishers but also the future of journalism and public discourse.

Big Tech's monopoly tends to skew negotiations over payment for publisher content. A recent US judgment found that Google had entered tacit agreements with Android phone makers to keep rival app stores off devices, declared Google a monopoly, and faulted the fees it mandates for in-app billing. Such illegal monopoly structures damage the subscription business of digital news players.

In India, the Competition Commission of India (CCI) made similar observations. The mandatory pre-installation of the Google Mobile Suite, without an option to uninstall it, and its prominent placement were deemed an imposition of unfair conditions on device manufacturers. This practice was found to violate Section 4(2)(a)(i) of the Competition Act. In March 2023, the NCLAT upheld CCI's Rs 1,337.76 crore penalty on Google for these antitrust violations, citing abuse of market dominance in the Android ecosystem.


AI and publisher content

Another critical issue is the use of publisher content to train generative AI systems, which constitutes direct copyright infringement and cannot be considered fair use. While publishers invest and take risks, generative AI developers often reap the rewards in users, data, brand value, and advertising revenue. This unlicensed use of journalistic content threatens the public interest and may ultimately impede generative AI innovation itself.

The rise of generative AI presents both opportunities and challenges for the media industry. While AI-powered tools can automate certain tasks and enhance content creation, the unauthorised use of copyrighted content raises serious ethical concerns. This practice not only undermines the financial viability of publishers but also stifles innovation and creativity. To address this issue, stricter copyright enforcement mechanisms are needed, along with increased transparency from AI developers regarding data sources and content usage.

Furthermore, promoting responsible AI development practices that respect intellectual property rights and prioritise ethical data sourcing is crucial. By ensuring fair compensation for content creators and fostering a culture of ethical AI development, we can unlock the true potential of AI for journalism while safeguarding the rights and interests of publishers.

Big tech vs national governments

Internationally, various countries have enacted laws or issued directions to establish a bargaining code, emphasising the exclusive right of publishers to reproduce their work. Such a code seeks to ensure that if Big Tech companies use publisher content, they must pay for it. The EU has issued directions to this effect, while Australia has a mandatory bargaining code administered by the Australian Communications and Media Authority (ACMA). The need now is to define such a bargaining code with transparency.

Given these global developments that negatively impact news publishers due to big tech monopolies, there is a pressing need for the Indian government and regulatory bodies to take proactive steps. This includes ensuring that big tech’s misuse of monopoly does not harm Indian content providers. The ministry of electronics and information technology (MeitY) should amend existing IT Act rules to include a revenue test, defining a threshold for applicability under the IT Act 2000. This should encompass good faith negotiation mechanisms, mediation if necessary, and provisions for arbitration.

Big tech companies hold immense power over information dissemination. Their algorithms determine the visibility of news articles, political messages, and public discourse. Changes in these algorithms can disrupt information flow and potentially favour certain viewpoints or interests. Addressing this issue requires the strict implementation of policies and regulations for equitable revenue distribution, promoting fair competition and a balanced digital economy. Collaboration between the government, stakeholders, and technology companies is crucial in resolving these concerns.

Transparency is another vital issue. Minimum standards should be set for digital platforms, requiring them to provide clear explanations about the type of user data collected and how it is used. Transparency in algorithmic decision-making is crucial to maintain the integrity of India’s democracy, particularly during election seasons. Guidelines and disclosure mechanisms should be mandated to prevent big tech companies from influencing the electoral process.

Government intervention is essential to address the challenges faced by news publishers in the face of big tech dominance. By ensuring fair revenue sharing, protecting data privacy, and promoting innovation, governments can safeguard the democratic role of journalism and counter unfair practices by big tech companies, preserving the vitality of the news industry.

The Digital India Act

While the proviso in the proposed Digital India Act is a step forward, immediate intervention is necessary. The ministry of electronics and information technology should issue an advisory or amend rules mandating digital intermediaries to offer news publishers a bargaining code through good faith negotiation. This needs to be supplemented with transparent algorithmic decision-making, a revenue test, and provisions for negotiations.

In 2021, the government of India framed intermediary guidelines under the Information Technology Act, 2000, mandating digital intermediaries to follow certain due diligence guidelines. To enhance transparency and accountability in algorithmic decision-making, it is crucial to mandate the sharing of algorithmic decision-making processes used by big tech with the public.

The latest amendment to the Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021 defines publishers, intermediaries, and aggregators. While the Digital India Act awaits enactment, however, the injury to Indian content providers continues. The ministry of electronics and information technology should issue an interim advisory or rule amendment mandating digital intermediaries to share all data generated from the online advertising space with news publishers, insisting on revenue tests and good faith negotiations.

This interim measure would initiate a transparent regime to prevent unethical practices by big tech. The amendment should also mandate fair contract negotiations between big tech and Indian news publishers. Establishing guidelines and restrictions on big tech operations will ensure transparency, fairness, and accountability. Proactive steps by the Indian government can address concerns such as unfair revenue sharing, data privacy breaches, and algorithmic biases.

Beyond financial harm, Big Tech’s dominance poses a significant threat to the very fabric of democracy. Their algorithms control the visibility and reach of information, potentially shaping public discourse and influencing elections. Unchecked algorithmic bias can amplify certain viewpoints and suppress others, creating echo chambers and undermining informed democratic participation. For instance, studies have shown how targeted news feeds can sway voting intentions and exacerbate political polarisation. This necessitates not only fair revenue distribution but also robust regulations and oversight mechanisms to ensure transparency and accountability in algorithmic decision-making, especially during elections.

In a world increasingly reliant on digital information, protecting diverse voices and ensuring fair play for content creators is critical. The Indian government’s proactive intervention is crucial to address the challenges faced by Indian publishers from Big Tech dominance. By implementing transparent revenue-sharing models, enforcing data privacy regulations, and promoting algorithmic fairness, the government can safeguard the democratic role of journalism and foster a sustainable and ethical digital ecosystem. India, with its diverse content and emerging economy, has a unique opportunity to set a global precedent for a level playing field in the digital landscape.

While government intervention is pivotal, addressing the challenges faced by publishers requires a multi-pronged approach. Publishers themselves can come together and leverage collective bargaining power to negotiate fairer deals with Big Tech platforms. Technologists can develop alternative content distribution models and promote ethical AI practices that respect intellectual property rights.

Civil society organisations can play a crucial role in raising awareness about Big Tech’s monopolistic practices and advocating for transparency and accountability. By fostering collaboration among these diverse stakeholders, we can create a more balanced digital ecosystem where diverse voices are heard and content creators are adequately compensated for their work.


Dr Aruna Sharma is a New Delhi-based development economist. She is a 1982-batch Indian Administrative Service officer. She retired as steel secretary in 2018.