Instagram helped to kill my daughter: Father of 14-year-old schoolgirl who took her own life after looking at pictures that glorified self-harm and suicide says social media is to blame

Ian Russell has accused Instagram of helping to kill his daughter Molly, 14

23 January 2019 - 08:24

The father of a schoolgirl who committed suicide has accused Instagram of helping to kill her.

Ian Russell said 14-year-old Molly took her own life after looking at pictures on the social network that glorified self-harm and suicide.

Molly was found dead just hours after handing in her homework and returning to her family home, where she had packed a bag to go to school the next day.

14-year-old Molly took her own life after handing in her homework and returning to her family home

Father Ian Russell said his daughter took her own life after looking at pictures on the social network that glorified self-harm and suicide

In a devastating note, she told her parents and two sisters: 'I'm sorry. I did this because of me.'

Speaking publicly about her death for the first time, Mr Russell said last night: 'I have no doubt that Instagram helped kill my daughter. She had so much to offer and that's gone.' 

His criticism of the photo-sharing site, which is owned by Facebook, comes after experts warned Instagram helped to glorify self-harm among vulnerable youngsters.

Last night hundreds of thousands of images depicting people harming themselves and discussing suicide could be viewed on the site, which is hugely popular among teenagers.

Mr Russell said Molly, who went to Hatch End High School in Harrow, Middlesex, had started viewing disturbing posts on the social network without the family's knowledge.

He told the BBC: 'She seemed to be a very ordinary teenager. She was future-looking. She was enthusiastic. 

'She handed her homework in that night. She packed her bags and was preparing to go to school the next day and then when we woke up the next morning, she was dead.'

It was only after her death in 2017 that the teenager's parents delved into her social media accounts and realised she was viewing distressing images.

Pictured: One of the images that was shared on Instagram depicting people harming themselves 

One account she followed featured an image of a blindfolded girl, seemingly with bleeding eyes, hugging a teddy bear. 

The caption read: 'This world is so cruel, and I don't wanna see it any more.'

Mr Russell said Molly had access to 'quite a lot of content' that raised concern.

'There were accounts from people who were depressed or self-harming or suicidal,' he said. 'Quite a lot of that content was quite positive. Perhaps groups of people who were trying to help each other out, find ways to remain positive to stop self-harming.

'But some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter. 

'The posts on those sites are so often black and white, they're sort of fatalistic. [They say] there's no hope, join our club, you're depressed, I'm depressed, there's lots of us, come inside this virtual club.'

Mr Russell, who directed the BBC coverage of the Queen's 90th birthday service at St Paul's Cathedral, questioned why huge numbers of posts were still available to view on Instagram despite repeated warnings by experts.

Algorithms on Instagram mean that youngsters who view one account glorifying self-harm and suicide can see recommendations to follow similar sites.

Experts say some images on the website, which has a minimum joining age of 13, may act as an 'incitement' to self-harm.

Instagram's guidelines say posts should not 'glorify self-injury' while searches using suspect words, such as 'self-harm', are met with a warning. But users are easily able to view the pictures by ignoring the offers of help.

Ged Flynn, from suicide prevention charity Papyrus, said: 'Suicide is not a hashtag. It is an unimaginable, devastating tragedy.

'If an algorithm behind a social media platform is engineered to encourage further access, in this case it must be looked at more seriously.'

He told the BBC: 'I would say [Instagram] need to look long and hard about changing their algorithms and do it now. It cannot be right that a child can access such graphic imagery.'

Hospital admissions for self-harming among girls under 18 have almost doubled over the past 20 years, from 7,327 in 1997 to 13,463 in 2017.

Instagram said: 'Our thoughts go out to Molly's family and anyone dealing with the issues raised. We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and work hard to remove it.

'However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues, is an important part of their recovery.

'This is why we don't remove certain content and instead offer people looking at, or posting it, support when they might need it most.'
