What in the hell?
Facebook has grown into the most controversial major company in the social media market today. It has run campaigns against Snapchat (and other smaller competitors), failed to address certain issues on its platform up front, faced several lawsuits, and accumulated a history of legal disputes with other companies.
The company's ambition is not the real problem; what is curious is the way it behaves around these realities. For a company in an apparently good position, the events of late make little sense.
And adding this latest issue to the mix certainly won't help the cause.
Terrorists can do as they please on the social network, with very little effort.
Yes, as terrible as it reads, it is true. Facebook's moderation leadership appears to be letting violent images from terrorist campaigns, along with other dangerous content, remain on public pages without censoring anything. How is this possible? A big loophole explains it easily.
Facebook's internal rulebook makes it clear that images with graphic content and a controversial background can be "invisible" to the network's censors. Why? The rules act only when a caption reveals the image's true purpose; without that caption, the image slips through.
This is a grave issue for Facebook, since society has grown highly sensitive to, and very aware of, media that might offend different social groups around the world.
Facebook’s Insecurity
The big reveal came from a leak of Facebook's 100-page rulebook, which exposed the company's unusual priorities regarding content on the network. According to the moderators' guidelines, the number of credible potential terrorist threats identified can reach roughly 1,000 or more, yet only those matching "specific details" about their nature get removed. This is supposedly a system for double-checking the information exposed in such images, but that doesn't seem to be how it works in practice.
Billions of people around the globe use this social network. The content uploaded to the platform can come from untraceable sources (when terrorists are involved), and Facebook may not be responsible for stopping those people from tampering with whatever they can. BUT when it happens through Facebook's own channel, and these shadowy actors carry out activities that upset and harm so many others, the role Facebook has to play is being ignored.
Social media giants that dominate the worldwide market are not just big dogs who own money and other companies. They also carry bigger responsibilities. Being a global leader brings the challenge of not becoming a preferential company that serves one part of your user base ten times better than the rest; the right path is to address each and every one of your users' needs and desires and work your way around them. Even more so when it is about terrorists harming their way into society.