Facebook’s Corporate Practices Regarding Questionable or Offensive Content

Facebook

Because Facebook is, first and foremost, an advertising company, its directors have strived to build infrastructure specifically designed to keep people logged into Facebook for as long as possible. Facebook has discovered that a single person with fringe or extreme ideologies can often provoke dozens of others, which in turn draws still more people into the conversation and generates greater advertising revenue. Offensive and disturbing content is an example of material that Facebook publicly denounces but internally welcomes, as it keeps people signed in and conversing; every second spent on Facebook is another chance to see a new or repeat advertisement.

Unless streamed live, videos of child abuse in the U.S. typically go unreported to law enforcement and are rarely removed from Facebook unless an intervening law enforcement agency requests it. Facebook has implemented a corporate policy under which users and moderators can flag content as disturbing so that users under 18 cannot view it. This is presented as a reasonable intervention for maintaining a degree of control over how content is shared, but Facebook deliberately allows questionable content to remain because it provokes discussion and sharing between users. Facebook publicly cites opposition to censorship as the justification for keeping content of this nature online, but the true reason is most likely advertising revenue: the more the company earns, the higher its valuation becomes; the higher its valuation, the more investors want to invest; and that investment feeds back into an even higher valuation of the company. Facebook employs full-time experts (e.g. psychologists) to continuously moderate and review its acceptable terms of use. Facebook uses the acronym “MAD,” which stands for “marked as disturbing,” to classify content that many rational adults would argue should be removed. Fortunately, one of Facebook’s policies states that “if a parent or guardian views a video of their child in circumstances which they object to, they have the ability to request and/or demand Facebook moderators to remove the content.” Moderators are required to follow this corporate policy; however, the request must come from a parent or guardian of the child actively appearing in the video or image in question.

Fortunately, high-priority issues (e.g. self-harm and suicide) are sent to a specialized queue with an intervention turnaround time of 24 hours, but all other flagged content falling outside this classification has a turnaround time of approximately 5 days, five times longer than the target Facebook has aimed to achieve. Priority-queue material is not always removed, however; in many cases, helpful resources (e.g. digital ebooks, audiobooks, and telephone helpline numbers) are sent to the content creator and/or the person(s) found within the material, while the content itself remains available for users to view. Facebook argues that removing content flagged as an immediate priority would mean that friends, family, and loved ones would not be aware of the issue (e.g. suicide) affecting the person(s) depicted and therefore could not intervene helpfully, rationalizing the distribution of such content as a means of starting an open dialog between affected people and putting them in touch with the resources they need.

Facebook’s User Agreement states that no user under 13 years of age is permitted to use Facebook’s services; however, age violations are only investigated if other Facebook users report someone as underage. Content that is racist or biased against a particular ethnicity is left available for consumption so long as the criticism castigates immigration policy rather than any specific person or group. Highly popular Facebook pages are shielded from these removal policies to a degree: ordinary content moderators cannot remove their content directly, and material is removed only if senior moderators flag it into a queue for review by even higher-ranking Facebook moderators. Surprisingly, some Facebook pages, particularly high-traffic pages with tens of thousands or millions of followers, are afforded the same protection status as government and news organization Facebook pages.

Nostradamus’ Books

Nostradamus

Other than the Bible, Nostradamus’ self-published books have sold more copies than any other published work in the history of the printed word. Nostradamus made his ominous predictions of the future by inhaling as well as orally ingesting nutmeg, which breaks down into an amphetamine-like compound when processed by the liver. Nostradamus would then pour black ink into a bowl of water, cover his head with the hood of his robe, and stare into the bowl of water and ink, claiming this allowed him to view depictions of future events. Nostradamus wrote down his visions in cryptic verses referred to as “quatrains,” which contain only four lines. Quatrains have been used over the course of history by many different European as well as Asian cultures.