Part III-H: The Algorithm (continued)
As complex as the Facebook algorithm is, it does not require much over-thinking. Artificial intelligence is designed to automate processes as much as possible, and the best way to utilize these tools is to let them run their course. Throughout this series we have discussed how the system is fed information (voluntarily), processes that information and ultimately sells it. The true power behind the process is the mathematical predictions that the algorithm offers (and also sells).
A user can provide all types of information that the algorithm can segment...but the true art of the software lies in understanding the user's next move. In order to master this, the system must learn human tendencies, emotions, addictions and habits. I really don't want to start sliding down a 'Terminator' path where machines are ultimately going to take over...but it is important to understand the evolution of this technology and how it is morphing our own behavior even away from the platform itself.
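To make the "next move" idea concrete, here is a deliberately simple sketch. Facebook's actual models are proprietary and vastly more complex; a first-order Markov model is just the smallest possible illustration of the concept described above: learn a user's habits from past actions, then predict the most likely next action. All action names here are hypothetical.

```python
from collections import Counter, defaultdict

class NextActionPredictor:
    """Toy habit model: counts which action tends to follow which."""

    def __init__(self):
        # transitions["scroll"]["like"] = how often "like" followed "scroll"
        self.transitions = defaultdict(Counter)

    def train(self, action_history):
        for prev, nxt in zip(action_history, action_history[1:]):
            self.transitions[prev][nxt] += 1

    def predict(self, current_action):
        counts = self.transitions.get(current_action)
        if not counts:
            return None  # never seen this action before
        return counts.most_common(1)[0][0]

# Hypothetical browsing history for one user.
history = ["scroll", "like", "scroll", "like",
           "scroll", "comment", "scroll", "like"]
model = NextActionPredictor()
model.train(history)
print(model.predict("scroll"))  # "like" -- followed "scroll" 3 times vs. "comment" once
```

The point of even this toy version: the model is only as good as the habits it was trained on, which is exactly why changing habits become a problem, as the next paragraphs discuss.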
One of the things that makes humans stand out above most creatures is our ability to evolve, pivot, change and adapt to various conditions and situations. While most of us are stubborn and resist new norms, sometimes we simply have no choice...The events since 2020 are a perfect example of that. The question is...if we (as humans) build an artificial intelligence to learn our tendencies and habits over time, what happens when those tendencies change at the drop of a dime? The answer...is also change!
A system like the Facebook algorithm works best by adapting to its hosts. To be most effective, it must tap into the users and businesses within its network and determine what changes those inputs are going through. While much relies on the algorithm functioning on its own, the truth is there are hundreds, if not thousands, of programmers and data analysts whose responsibility it is to study current conditions and adjust the system accordingly. This human-element adjustment can have dire consequences on the performance of the AI and ultimately impact the output. Case in point...once again...2020!
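One standard way engineers notice that user behavior has shifted out from under a model is "concept drift" detection: watch the model's rolling accuracy, and when it degrades, flag it for retraining. The sketch below is purely illustrative, assuming a simple accuracy threshold; it is not how Facebook actually does it, only the general shape of the idea.

```python
from collections import deque

class DriftMonitor:
    """Tracks rolling prediction accuracy and flags when it drops."""

    def __init__(self, window=100, threshold=0.6):
        self.results = deque(maxlen=window)  # 1 = correct, 0 = wrong
        self.threshold = threshold

    def record(self, predicted, actual):
        self.results.append(1 if predicted == actual else 0)

    def needs_retraining(self):
        # Not enough observations yet: assume the model is still fine.
        if len(self.results) < self.results.maxlen:
            return False
        accuracy = sum(self.results) / len(self.results)
        return accuracy < self.threshold

monitor = DriftMonitor(window=10, threshold=0.6)
# Simulate a behavior shift: predictions stop matching reality.
for predicted, actual in [("click", "click")] * 5 + [("click", "scroll")] * 5:
    monitor.record(predicted, actual)
print(monitor.needs_retraining())  # accuracy fell to 0.5 -> True
```

In practice the retraining itself is where the humans come in, and as the text notes, that human step is where new problems can enter the system.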
As events unfolded in the early months of 2020, social media became a primary source for staying current on just what the hell was going on. With every user having their own soapbox on a network like this, there are too many cooks in the kitchen. All opinions are presented as facts, arguments ensue, exaggerated or false information is published...and republished...and republished. Before you know it, the system enters a vortex of chaos that is more confusing than useful. This ultimately turns users and businesses away from the chaos, hurting the performance of the algorithm. The solution? Intervene!
Unfortunately, human interference slanted the system toward a one-way train of thought. Censorship, deletion, cancelation...whatever you want to call it...if a user or business was engaging in activity that added to the chaos, those assets were eliminated from the equation. Many will argue it came down to a bias toward a political party, scientific approach or religious ideology...but the facts are the facts: Facebook is a business. It is in the business of gathering information, segmenting that information and ultimately selling it. The machine has a very fixed method of performing these functions. An outlier to this method creates a glitch in the process and therefore needs adjusting in order to keep users engaged and inputting more data.
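The "outlier" framing above has a simple statistical analogue worth sketching: flag any account whose activity sits far from the rest of the population. This is an illustrative z-score filter, not Facebook's actual moderation logic; the account names, scores and cutoff are all hypothetical.

```python
import statistics

def flag_outliers(scores, z_cutoff=2.0):
    """Return account ids whose score sits more than z_cutoff
    standard deviations from the population mean."""
    mean = statistics.mean(scores.values())
    stdev = statistics.pstdev(scores.values())
    if stdev == 0:
        return []  # everyone behaves identically; nothing to flag
    return [acct for acct, s in scores.items()
            if abs(s - mean) / stdev > z_cutoff]

# Hypothetical "chaos activity" scores: eight typical accounts, one extreme.
activity = {"user_a": 10, "user_b": 12, "user_c": 11, "user_d": 9,
            "user_e": 10, "user_f": 13, "user_g": 11, "user_h": 12,
            "user_i": 95}
print(flag_outliers(activity))  # ["user_i"]
```

Note what the code does not do: it carries no notion of politics, science or religion, only deviation from the statistical norm. That is the business-first logic the paragraph describes, whatever one thinks of its consequences.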
The human element obviously brings human emotions into these decisions. But never judge the size of the fire by the amount of smoke. Facebook will do anything to keep the machine functioning...even if it means eliminating users in the process.