Everyone knows algorithms control much of what we see online. If you didn’t deliberately search for something on the web, odds are it was served to you by an algorithm.
But sometimes those algorithms miss the point. Sometimes they show the complete opposite of what they should be showing. This is often the case on Facebook, where the algorithm tries to react quickly to whatever a user just wrote.
A good example of this happened to me on November 5th. Following the Texas church shooting, I wrote the following on Facebook:
Given the current political climate, this could be construed as an anti-gun post, although I did not use the typical key phrases of one. I didn’t mention the following:
- hearts and prayers
So perhaps the algorithm didn’t know what to do with me when it posted the following ads on my page:
Two of the three ads are for firearms-related activities. One is to build a weapon that has often been used in mass shootings. That’s not a good look by Facebook.
No, I will not learn the art of reloading, build an AR-15, and drive my pick-up truck. Those may be enjoyable activities for some, but I am not in the market for them while posting about a mass shooting with multiple victims.
Perhaps these ads are what the Russians would want me to support, given my demographic: guns and trucks. If that’s the case, then, as I wrote in my post: put me down for the opposite.