As a dangerous TikTok trend that encourages teens to pass out does the rounds, we are left asking how liable the social media giant would be if a worst-case scenario were to occur.
'Blackout' is the current TikTok craze among youngsters, in which a person forces themselves to pass out while being filmed, with the video subsequently uploaded to the platform.
TikTok is a video-focused social networking service owned by the Chinese company ByteDance Ltd.
The service hosts a variety of short-form user videos across genres such as pranks, stunts, tricks, jokes, dance, and entertainment, with durations from 15 seconds to three minutes.
Northland health professionals have labelled 'blackout' as "potentially life-threatening" because of the risk of brain injury or death.
A Northland teen, who did not want to be named, says she suffered a seizure when attempting 'blackout'.
The 16-year-old lost consciousness for less than a minute. She struggles to recall much about the incident due to the seizure.
She was the last of her friend circle to try the trend.
A friend of the teen told the Advocate peer pressure may have been to blame.
"Every one of [her] friends did it and they didn't have seizures."
While the teen, fortunately, didn't need hospital treatment, Whangārei general practitioner Geoff Cunningham warned the incident could have gone very badly.
"There are all sorts of potential complications from it, not just people passing out. It can lead to heart issues, respiratory arrest, or worse.
"It is a real concern and something that should be discouraged completely."
Cunningham was aware of the dangerous TikTok trend. He said it would be a tragedy if someone were to die or get permanent brain damage from it.
"The danger is, just because it is on TikTok, does not make it okay. There are potential complications that people watching TikTok videos are not aware of.
"Parents should be discouraging kids from taking part in anything that is potentially life-threatening. This is the recommendation that the medical fraternity in New Zealand is trying to encourage," he said.
Netsafe research showed young people spent more time online watching videos than playing games, creating a blog, online story or website, or looking at the news.
Up to May last year, Netsafe had received 432 personal harm reports from or about young people aged between 10 and 17 - a 30 per cent increase compared to 2020. Most of those reports related to people aged between 13 and 15.
Samantha Chow, a solicitor at Stace Hammond Lawyers, said Parliament was being pressured to address the issue of social media platforms allowing or encouraging harmful posts.
However, after the Christchurch mosque shooting the push did not result in any bill or legislation being seriously considered or enacted.
"...nor did it lead to any notable amendments to somewhat relevant legislation that already exists [such as the Harmful Digital Communications Act 2015].
"A notable difference between this TikTok trend and the mosque shooter's manifesto is the latter is essentially a single video; whereas when there is a new TikTok trend, there become countless videos from countless users, making it practically near impossible for anyone to monitor, manage and ban as the content comes up."
Chow said the specific limitation clauses under Clause 10 of TikTok's terms of service seemed to only really cover financial losses.
However, a more general exclusion under Clause 7 existed: "We accept no liability in respect of any content submitted by users and published by us or by authorised third parties".
Chow said if the consequence was injury, there would be no responsibility on TikTok's or anyone else's part, as it would most likely be covered by ACC, which bars civil claims for personal injury.
"Technically, it could be argued that users posting content of them [or others] passing out on purpose constitutes a breach of the Terms on the user's part, TikTok would not be held responsible for the actions and breach of Terms from users, and no valid or applicable exclusions (if any) require consideration."
Any definitive duty on TikTok or any other social media platform to constantly monitor and remove or ban certain content was more a question of moral duty than a legal one.
"We would tend to think that there should there be some general expectation of self-monitoring on the part of TikTok and other platforms, but if not then we anticipate that legislation will likely have to step in," Chow said.
"The caveat to this is, with New Zealand being such a small country and market, it is unlikely we will be that influential over social media platforms like TikTok. Working with larger countries would probably be more effective."
Chow said it was the same with recent cases of TikTok-fuelled car thefts in the country.
"TikTok is unlikely to be held legally responsible for the car thefts arising from the car theft trend.
"Dangerous TikTok trends are arguably no different to other past dangerous internet trends to which other social media platforms were not held legally responsible."
Most social media platforms, including TikTok, had some sort of age restriction, but when it came to minors, Chow said, it was probably safe to say that parents were, to a certain degree, responsible for the online content their children consume.
"...though with that being said, parents cannot possibly be expected to supervise their children 24/7. If we're talking about who bears the actual legal responsibility, once again, it is hard to say – probably no one."