Facebook has monitored the posts of Australian children and used algorithms to help advertisers target them
Facebook monitored the posts of children in Australia to help advertisers target them, says a report in The Australian. The report is based on internal Facebook documents, which show how the social network can identify moments when young people need a confidence boost by monitoring their posts, pictures and interactions.
According to the report in The Australian, Facebook analysed these posts, pictures and comments in real time to work out when a child was feeling particularly low, anxious or stressed. That mood data was then used to predict the behaviour that could follow online, which is something advertisers could use to their benefit.
According to the report, Facebook holds a database of information on these young users and could infer their moods from their activity on the site. All of this information is gathered through Facebook’s own internal data collection and is not available to the public.
The report says the internal document was prepared by “two of Facebook’s top Australian executives, David Fernandez and Andy Sinn.” As the report notes, Facebook’s tactic violates the Australian Code for Advertising and Marketing Communications to Children guidelines.
Facebook has admitted it was wrong to target the children and has apologised. The company has also ordered an investigation into the data collection.
The fact that Facebook holds vast amounts of information on its 1.9 billion monthly active users is not new. Privacy settings on Facebook can often be complicated, and most users, adults and children alike, might not have a clear idea of which of their posts are public and which are private.
But what the report makes clear is that Facebook can use this data to its monetary advantage. Tapping into a person’s mood and using it to serve relevant ads raises serious concerns about the kind of control Facebook can exercise over people. It also raises ethical questions about how Facebook uses the information its users share.
The latest report recalls the 2014 controversy over Facebook’s News Feed experiment, in which the company tweaked the feeds of over 600,000 users to show them more positive or more negative content. That study caused outrage, with people questioning how the social network could manipulate the News Feed to push users towards a certain mood.
Facebook has not confirmed whether it uses similar tactics elsewhere to collect data on users’ moods. It also told the paper that all of the information collected is aggregated and consistent with Australia’s current privacy laws.