Facebook says it does not offer tools to target people for advertising based on their emotional state.
The social media giant, however, has acknowledged that research shared with an advertiser did not follow its own processes.
Facebook was responding to a news report by The Australian, which claimed a leaked document showed that the company conducted research into how it could exploit the moods of teenagers for potential advertising purposes.
The confidential document detailed how Facebook had a tool to figure out when people - as young as 14 - feel "defeated", "overwhelmed", "stressed", "anxious", "nervous", "stupid", "silly", "useless", and a "failure" by monitoring their posts, comments and interactions on the site.
Facebook has since responded to the leaked document, saying it has an established process to review the research it performs.
"On May 1, 2017, The Australian posted a story regarding research done by Facebook and subsequently shared with an advertiser. The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state," the company said in a statement on its newsroom website.
"The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.
"Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight," it said.
The Office of the Privacy Commissioner said it believed Facebook's response was satisfactory and did not raise any ongoing concerns.
Netsafe chief executive Martin Cocker also acknowledged Facebook's response.
"You've got to take Facebook's response at face value. You've got to assume that what they are doing is being honest about their response," Cocker said.
Cocker said that wherever people share enough information about themselves online, or spend enough time, websites and networks can undertake this kind of analysis.
"In this particular case we're talking about vulnerable young people, and society and corporations all have a responsibility to protect and support young people who are going through a difficult period, not to exploit that information to provide marketing services to them. Absolutely provide social support services."
Over the years there have been calls for social networks to use their capabilities to ensure users have access to support services, Cocker said.
"If Facebook can use this kind of capability to connect genuinely vulnerable young people with support services then I think that is a good thing. But it does come with a whole lot of complications and I understand why companies would want to steer clear of that."
Jordan Carter, chief executive of Internet New Zealand, said platforms like Facebook had a corporate responsibility to act in the best interests of everyone.
"If everyone is to have confidence in using internet services, they need to be able to rely on service providers to treat their information sensitively. Platforms like Facebook have a corporate responsibility to act in the best interests of all.
Carter said Facebook should completely rule out allowing advertisers to target troubled young people.
"The insights from its knowledge of all of us can be used for good or ill, and the more transparent the company can be about what it learns and how it uses that, the better," he said.