Sweet Zannat’s name began circulating online when rumours linked her to an unverified “19-minute video” that several users claimed was spreading privately on messaging platforms. The creator swiftly denied the allegations, calling them a deliberate attack on her reputation.
She later shared an emotional video in which she appeared visibly distressed. Many assumed she was reacting to the circulating rumours. However, Zannat clarified that her emotional state was triggered by a separate deepfake video, created by two boys from her hometown using artificial intelligence.
This manipulated video, she said, had gone viral locally, subjecting her to immense humiliation and digital exploitation, far beyond the false claims surrounding the alleged 19-minute clip.
Deepfake creators issue public apology
In a surprising development, a new video emerged showing the two boys responsible for the AI-generated clip publicly apologising to Zannat.
Admission of guilt
The boys confessed to using artificial intelligence tools to create and circulate the deepfake video. Their actions, they acknowledged, had caused serious damage to Zannat’s public image.
Their apology
Addressing her as their “sister”, they said “Galti ho gayi” (“We made a mistake”), admitting guilt openly and apologising for the emotional and reputational harm inflicted.
The apology rapidly went viral, drawing strong reactions from viewers who condemned the misuse of AI to target women.
Zannat responds with forgiveness—and a warning
Following the apology, Sweet Zannat posted a video of her own, marked by calm, restraint and maturity. Her response earned widespread praise from social media users.
Choosing forgiveness
Zannat said she forgave the two boys, explaining that she chose to respond as an elder sister rather than escalate the situation. Many users viewed her approach as dignified and compassionate.
A strong warning for the future
Despite her forgiveness, Zannat was firm in her message:
“I wasn’t at fault, but many assumed it was my video.”
She warned the boys never to repeat such behaviour with her—or with any other girl. Her statement highlighted the serious consequences deepfakes can have, especially for women whose online reputations are vulnerable to manipulation and misuse.
Her blend of kindness and firmness was widely appreciated, with several viewers praising her for standing up against digital harassment without calling for retaliation.
A wider conversation on deepfakes and online safety
The incident has reignited discussions on the alarming rise of deepfake technology being used for harassment. Across India, similar incidents have fuelled concerns about children and young adults gaining easy access to AI tools capable of creating misleading and harmful content.
Digital rights groups emphasise that:
- deepfake content can cause irreversible damage to victims,
- misinformation spreads rapidly even before facts are established,
- and accountability remains limited in many jurisdictions.
Several industry experts have urged governments and platforms to strengthen reporting mechanisms, AI regulation and online safety laws to prevent similar cases.
For Karnataka, where online creators and youth communities are highly active, the case has served as another reminder for families and schools to encourage responsible digital behaviour.
Social media reactions highlight support for Zannat
Users across Instagram and other platforms applauded Zannat’s handling of the situation, lauding her emotional honesty and the grace with which she addressed a painful episode.
Comments included:
- “Your strength is inspiring. Not everyone can forgive.”
- “Deepfakes are becoming dangerous. Thank you for speaking out.”
- “Proud of you for standing up with dignity.”
The episode has left a deep impression on viewers, who acknowledge both the risks posed by misused technology and the resilience shown by content creators who navigate such challenges.
