Why Social Media Needs to Be Regulated: A Larger Perspective
Social media has become the primary medium through which people across the globe interact and communicate with one another.
Since the advent of modern technology, communication mediums have undergone a significant transformation. From the telegraph to the fax machine, from the telephone to smartphones—each advancement has reshaped the way we connect.
In the 1830s, the first telegraphic device was invented by Joseph Henry, marking a pivotal moment in the history of communication. This was followed by the groundbreaking invention of the telephone by Alexander Graham Bell in 1876, with notable contributions from Antonio Meucci and Elisha Gray.
In 1990, the development of the World Wide Web (WWW) by Tim Berners-Lee revolutionized how people accessed and shared information. The decade of the 1990s witnessed an extraordinary technological boom, as the internet began to reshape the landscape of communication, laying the foundation for the rise of social media.
The early 2000s witnessed the bursting of the dot-com bubble; a stock market crash ensued, followed by the emergence of social media platforms such as Facebook, Twitter, and many others.
Laws and Rules
These inventions played a key role in the development of nations and, consequently, had to be regulated.
For example, some of the earliest laws regulating the use of the telegraph were enacted in the following countries:
• France (1849): France passed one of the first laws allowing the public to use the telegraph. This law also gave the state the right to monitor and stop telegrams deemed illegal or suspicious, and prohibited the use of secret codes except by the state
• United States (1860): The Pacific Telegraph Act of 1860 authorized the U.S. government to fund and regulate the construction of a transcontinental telegraph line, giving priority to government and military communications and allowing public use under certain conditions
• India (1885): The Indian Telegraph Act, 1885, gave the government exclusive control over telegraphy, telephones, and later digital communications, including the power to intercept and monitor communications. In 2023, the Telecommunications Act was passed, replacing the 1885 Act.
Social media, however, is a different ball game. India has enacted the following laws to regulate intermediaries and prevent cyber offences committed through the internet:
1. The Information Technology Act, 2000 and the Intermediary Guidelines, 2021;
2. The Digital Personal Data Protection Act, 2023;
which together govern the social media realm in India.
Other countries
In the USA
Section 230 of the Communications Decency Act (CDA):
This is arguably the most significant law regarding internet content. It generally protects social media platforms and other online services from liability for content posted by their users. It also allows platforms to moderate content in “good faith” without facing legal repercussions for doing so.
Children’s Online Privacy Protection Act (COPPA):
This law focuses on protecting the privacy of children under 13 online. It requires websites and online services to obtain parental consent before collecting personal information from young children.
In the UK
The Online Safety Act 2023 aims to protect online users by imposing duties on social media companies and search services to ensure user safety and remove illegal content.
The important questions that arise for this article are:
1. Whether a new law is needed for social media regulation, especially penal provisions in relation to:
• spread of harmful content,
• doxing,
• hate mongering,
• spreading of misinformation and disinformation on platforms,
• online abuse and trolling;
2. Whether there is a need for content regulation and for government audits/monitoring of the algorithms used by these platforms;
and who should be held accountable for such acts, or whether the current laws are enough to tackle these issues.
Considering the first issue, India does not have a special statute tailored to these offences. Enforcement agencies have to place reliance on general law such as the Bharatiya Nyaya Sanhita, 2023 (BNS).
Relevant provisions under the BNS, 2023 under which a person may be prosecuted for online acts:
S78. Stalking
S152. Act endangering sovereignty, unity and integrity of India
S196. Promoting enmity between different groups on grounds of religion, race, place of birth, residence, language, etc., and doing acts prejudicial to maintenance of harmony
S197. Imputations, assertions prejudicial to national integration
S353. Statements conducing to public mischief
S356. Defamation
As these sections use the words 'electronic communication' and 'electronic means', they bring cyber crimes within their ambit.
The IT Act, 2000, though a special statute for cyber crime regulation with overriding effect over other Acts, is not tailored to address these unique and fast-evolving issues.
The Act provides punishment for various cyber offences under the following sections:
S66A. Punishment for sending offensive messages through communication service, etc. (struck down as unconstitutional in Shreya Singhal v. Union of India, 2015)
S66B. Punishment for dishonestly receiving stolen computer resource or communication device.
S66C. Punishment for identity theft.
S66D. Punishment for cheating by personation by using computer resource.
S66E. Punishment for violation of privacy.
S66F. Punishment for cyber terrorism.
S67. Punishment for publishing or transmitting obscene material in electronic form.
S67A. Punishment for publishing or transmitting of material containing sexually explicit act, etc., in electronic form.
S67B. Punishment for publishing or transmitting of material depicting children in sexually explicit act, etc., in electronic form.
Persons committing such offences are punishable, but intermediaries in such scenarios are protected under Section 79 of the Information Technology Act, 2000, which provides a 'safe harbour' to social media platforms. However, this is not blanket immunity: Section 79(3) carves out an exception, providing that an intermediary loses protection if it conspired, aided or abetted in the unlawful act or, after notification by the government, fails to expeditiously remove or disable access to the illegal material on its resource without vitiating the evidence.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 provide, under Rule 3, for due diligence by intermediaries, requiring them to publish their rules and regulations, privacy policy and user agreement on their websites or apps. The Rules also prescribe a digital media ethics code and content regulation.
When it comes to the above issues of spread of harmful content, doxing, hate mongering, spreading of misinformation and disinformation on platforms, and online abuse and trolling, there are no specific rules or provisions under the IT Act, 2000. The Act is therefore not equipped to handle these problems, which have become a menace for a society in which doomscrolling has become second nature owing to algorithms.
Turning to the second issue, auditing and monitoring of algorithms is the need of the hour, as algorithms shape a person's perspective.
Perspective is a quintessential element that moulds the thinking of individuals, and even societies, on various aspects of life.
When it comes to sovereign issues, perspectives shaped by these social media algorithms may hamper a country's national growth and may even endanger the nation's security, identity and dignity.
Example:
Consider Country A, where a specific village or city—perhaps one facing economic hardship—is repeatedly shown on social media platforms in a way that emphasizes poverty, deteriorating infrastructure, and social dysfunction. These portrayals, often amplified by social media algorithms that prioritize sensational or emotionally charged content, can spread misleading or incomplete narratives about the entire country.
As a consequence, the global perception of Country A becomes skewed. When citizens of Country A travel or interact with people from other countries, they may be unfairly judged or stereotyped based on these distorted portrayals. This not only harms the international reputation of Country A, but also infringes on the dignity and identity of its citizens. It undermines their right to be treated as individuals rather than being reduced to a narrow, often negative, stereotype shaped by selective media exposure.
Such misinformation can contribute to cultural bias, discrimination, and even policy-level repercussions, highlighting the urgent need for ethical media representation and algorithmic responsibility on digital media.
AUTHOR'S OPINION
Social media's portrayal of certain regions, often those facing poverty or hardship, can distort the global perception of an entire nation. In the case of Country A, repeatedly broadcasting images of a struggling village or city skews the narrative, making hardship appear as the national norm. These depictions, driven by algorithms that favor sensationalism, not only misrepresent reality but also fuel stereotypes that follow citizens across borders. As a result, individuals from Country A may be unfairly judged based on media-fueled perceptions rather than their personal identity or achievements. This kind of reductionist portrayal harms national dignity and reinforces global inequalities, making ethical media practices and algorithmic accountability essential in the digital age.
Imagine you’re from Country A, and every time you go online, the world sees only the poorest parts of your country—broken infrastructure, hardship, and dysfunction. These images are shared widely, often without context, and they shape how the rest of the world sees your entire nation. Now imagine traveling abroad and being treated according to that narrow image. This isn’t just about misinformation—it’s about dignity. When media platforms prioritize sensational content over accurate storytelling, they don’t just distort facts—they erode identities. That’s why we need greater responsibility from both content creators and tech platforms to ensure fair, ethical representation in the digital space.
Therefore, the current laws are not enough to tackle these issues; policy and regulation are required, as social media platforms can be used as a tool for digital warfare by an enemy state to degrade another country and impose its own narrative globally.
Recommendation:
Promoting Ethical Media Representation and Algorithmic Accountability
Governments and international organizations should collaborate with social media platforms and intermediaries to implement standards for ethical content dissemination.
This may include:
• Flagging contextually incomplete or potentially harmful portrayals of regions and communities
• Intermediary accountability/liability
• Individual accountability/liability
• Prohibition of AI-related content
• Encouraging algorithmic transparency and diversity in content curation
• Supporting digital literacy initiatives to help users critically evaluate online content
Such measures would help protect the reputations and identities of marginalized or misrepresented populations while fostering a more balanced global discourse.
References
- https://www.legislation.gov.uk/ukpga/2023/50
- https://en.wikipedia.org/wiki/Section_230
- https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf
- https://www.indiacode.nic.in/bitstream/123456789/13116/1/it_act_2000_updated.pdf
- https://www.ftc.gov/legal-library/browse/rules/childrens-online-privacy-protection-rule-coppa
- https://www.mha.gov.in/sites/default/files/250883_english_01042024.pdf
- https://www.meity.gov.in/static/uploads/2024/02/Information-Technology-Intermediary-Guidelines-and-Digital-Media-Ethics-Code-Rules-2021-updated-06.04.2023-.pdf