The father of a schoolgirl who committed suicide has accused Instagram of helping to kill her.

Ian Russell said 14-year-old Molly took her own life after looking at pictures on the social network that glorified self-harm and suicide.

Molly was found dead just hours after handing in her homework and returning to her family home, where she had packed a bag to go to school the next day.

In a devastating note, she told her parents and two sisters: "I'm sorry. I did this because of me."
Speaking publicly about her death for the first time, Mr Russell said last night: "I have no doubt that Instagram helped kill my daughter. She had so much to offer and that's gone."

His criticism of the photo-sharing site, which is owned by Facebook, comes after experts warned Instagram helped to glorify self-harm among vulnerable youngsters.

Last night hundreds of thousands of images depicting people harming themselves and discussing suicide could be viewed on the site, which is hugely popular among teenagers.

According to the Daily Mail, Mr Russell said Molly, who went to Hatch End High School in Harrow, Middlesex, had started viewing disturbing posts on the social network without the family's knowledge.

He told the BBC: "She seemed to be a very ordinary teenager. She was future-looking. She was enthusiastic.

"She handed her homework in that night. She packed her bags and was preparing to go to school the next day and then when we woke up the next morning, she was dead."

It was only after her death in 2017 that the teenager's parents delved into her social media accounts and realised she was viewing distressing images.

One account she followed featured an image of a blindfolded girl, seemingly with bleeding eyes, hugging a teddy bear.

Ian Russell said his daughter took her own life after looking at pictures on the social network that glorified self-harm and suicide. Photo / Daily Mail

The caption read: "This world is so cruel, and I don't wanna see it any more."

Mr Russell said Molly had access to "quite a lot of content" that raised concern.

"There were accounts from people who were depressed or self-harming or suicidal," he said. "Quite a lot of that content was quite positive. Perhaps groups of people who were trying to help each other out, find ways to remain positive to stop self-harming.

"But some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter.

"The posts on those sites are so often black and white, they're sort of fatalistic. [They say] there's no hope, join our club, you're depressed, I'm depressed, there's lots of us, come inside this virtual club."

Mr Russell, who directed the BBC coverage of the Queen's 90th birthday service at St Paul's Cathedral, questioned why huge numbers of posts were still available to view on Instagram despite repeated warnings by experts.

Algorithms on Instagram mean that youngsters who view one account glorifying self-harm and suicide can see recommendations to follow similar sites.

Experts say some images on the website, which has a minimum joining age of 13, may act as an "incitement" to self-harm.

Instagram's guidelines say posts should not "glorify self-injury" while searches using suspect words, such as "self-harm", are met with a warning. But users are easily able to view the pictures by ignoring the offers of help.

Ged Flynn, from the UK's suicide prevention charity Papyrus, said: "Suicide is not a hashtag. It is an unimaginable, devastating tragedy.

"If an algorithm behind a social media platform is engineered to encourage further access, in this case it must be looked at more seriously."

He told the BBC: "I would say [Instagram] need to look long and hard about changing their algorithms and do it now. It cannot be right that a child can access such graphic imagery."

Instagram said: "Our thoughts go out to Molly's family and anyone dealing with the issues raised. We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and work hard to remove it.

"However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues, is an important part of their recovery.

"This is why we don't remove certain content and instead offer people looking at, or posting it, support when they might need it most."

Where to get help:

Lifeline: 0800 543 354 (available 24/7)

Suicide Crisis Helpline: 0508 828 865 (0508 TAUTOKO) (available 24/7)

Youth services: (06) 3555 906

Youthline: 0800 376 633

Kidsline: 0800 543 754 (available 24/7)

Whatsup: 0800 942 8787 (1pm to 11pm)

The Word

Depression helpline: 0800 111 757 (available 24/7)

Rainbow Youth: (09) 376 4155

CASPER Suicide Prevention
If it is an emergency and you feel like you or someone else is at risk, call 111.