AI education rollout in Punjab raises alarms over digital rights and gendered online abuse
Pakistan’s rapid adoption of AI, including Punjab’s new school curriculum, is raising serious concerns over digital rights, data protection and rising online harms against women and minorities.
LAHORE (Ayesha Babar) – Pakistan’s push towards artificial intelligence has entered a new phase after the Punjab government last week mandated AI education in all government schools across the province, a first-of-its-kind move aimed at boosting digital literacy and technological capacity.
While the initiative marks a significant step in the country’s digital transformation, human rights organisations and digital rights experts warn that the rapid adoption of AI technologies carries serious and often overlooked risks, particularly for women and marginalised communities.
AI systems are evolving at unprecedented speed and scale. Geoffrey Hinton, the computer scientist widely known as the “godfather of AI”, has previously cautioned that advanced systems could surpass human intelligence and pose existential risks if left unchecked. Beyond such long-term concerns, rights advocates point to immediate social harms rooted in how AI is developed and deployed.
Many AI systems are built predominantly by male developers and trained on datasets shaped by patriarchal norms. As a result, these systems often reproduce discrimination, exclusion and harmful stereotypes, disproportionately affecting women, religious minorities and gender-diverse communities.
Rise of AI-enabled abuse
In recent years, the proliferation of generative AI tools has fuelled a surge in technology-facilitated gender-based violence (TFGBV). One of the most visible manifestations is the creation of sexualised deepfake images of women and girls. Earlier this month, Grok, the generative AI chatbot developed by Elon Musk’s xAI and integrated into the X platform, generated an estimated 1.8 million sexualised images of women and girls, according to figures cited by the Center for Countering Digital Hate.
UN Women defines TFGBV as any act committed, aggravated or amplified through digital technologies that results in, or is likely to result in, physical, sexual, psychological, social, political or economic harm, or infringements of rights and freedoms.
Advanced image and video generation tools can now produce hyper-realistic content that closely mimics real faces, skin textures and voices, making it increasingly difficult to distinguish fabricated material from authentic media. Many of these tools are freely available and require little technical skill, significantly lowering the cost and effort needed to perpetrate abuse. Features such as voice cloning and automated bots further intensify harassment campaigns, reputational harm and public shaming.
Pakistan’s heightened vulnerability
Experts warn that Pakistan is particularly exposed to these risks due to weak data protection frameworks and limited institutional capacity to address digital harms. As the country moves to integrate AI into governance and public service delivery, the gendered and social impacts of digitalisation remain insufficiently examined.
Most AI systems used in Pakistan are developed in the Global North and deployed in the Global South without adequate localisation. This mismatch can amplify harm in contexts where social norms differ, stigma against women and minorities is stronger, and the risk of misuse is higher. Rights groups describe this dynamic as digital colonialism, in which developing countries bear disproportionate social costs while also supplying cheap labour for data labelling and content moderation. In some documented cases, workers have been paid as little as $1.32 per hour to train large language models.
Documented scale of online harm
According to the Digital Rights Foundation’s Annual Helpline Report 2024, the organisation received 3,171 complaints last year, with 1,772 filed by women. Since its launch in 2016, the Digital Security Helpline, previously known as the Cyber Harassment Helpline, has handled more than 20,000 complaints related to TFGBV and online harassment, the majority from women.
These figures represent only reported cases. Thousands more incidents remain undocumented due to social stigma, fear of retaliation and limited awareness of reporting mechanisms. Separate research by the Digital Rights Foundation found that 70 percent of women feared the misuse of their images online.
Religious minorities also face distinct risks in digital spaces. False accusations involving manipulated images or videos have, in some cases, gone viral and triggered offline violence before verification can take place. In a study on religious minorities’ experiences online, 61 percent of respondents said they felt unsafe expressing their opinions digitally, while 55 percent reported online abuse or threats that led to real-world consequences.
Data protection gaps
Concerns over AI-driven harms are compounded by the absence of comprehensive data protection legislation. Millions of Pakistanis’ personal records are stored in government databases without robust safeguards, clear access controls or enforceable rights to challenge misuse.
A major data breach at the National Database and Registration Authority (NADRA) highlighted the scale of the vulnerability. Between 2019 and 2023, personal data of more than 2.7 million citizens was stolen from NADRA’s regional offices in Karachi, Peshawar and Multan.
Civil society organisations have repeatedly called for the passage of the long-pending Personal Data Protection Bill, which would establish legal principles around consent, accountability, purpose limitation and data security.
Calls for rights-based governance
In response to these challenges, the Digital Rights Foundation recently published a policy brief titled The Cost of Going Digital: Evaluating Rights Risks in Pakistan’s Digital Governance. The report was followed by a multi-stakeholder roundtable involving members of national commissions, civil society groups, activists and representatives of marginalised communities.
The policy brief recommends mandatory gender and minority impact assessments in digital policymaking, inclusive governance structures, and awareness initiatives to strengthen understanding of privacy, informed consent and digital rights. It also stresses the importance of providing such information in local languages to improve accessibility.
The recommendations align with Pakistan’s international commitments, including its obligations under the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW) and its adoption of the UN’s 2030 Sustainable Development Goals, particularly those relating to gender equality, innovation and reduced inequalities.
Digital rights advocates also note that progress in these areas is closely linked to Pakistan’s continued eligibility for the European Union’s GSP+ trade status, which is assessed against benchmarks including freedom of expression, access to information and the elimination of violence against women, transgender persons and minorities.