Algorithms operate everywhere in our daily lives. Using the information we give them, they're constantly learning about who we are and what we're more likely to buy. (Remember how that pricey coffee maker you looked at online showed up in your Facebook ads for the next two weeks?)
Most of the time, it's no big deal. But in an era where more than 40% of Americans get their news from Facebook, these algorithms can have a real impact on how we see the world. They may even have the power to shape our democracy. (Cue ominous music.)
So here's the thing: every time you "like" something, share something, tag yourself in a photo, or click on an article on Facebook, the site collects data on you and files it away in their folder of YOU. And it's not just your activity on Facebook that they're keeping track of. They also track what device you used to log on, what other app you came from, other sites you've visited, and much more.
All that data helps Facebook paint, for advertisers, a detailed picture of who you are and what you like. The problem is that we don't know how, exactly, that picture is formed. The algorithms at work are a "black box." We don't know how they decide whether we're a "trendy mom" or a "frequent traveler," and we don't know how they decide which ads to show us. In short, no one is really accountable.
And here's where you come in, dear N2S listener. We are collaborating with ProPublica on their newly launched Black Box Data Project, and you can take part in this important digital experiment. Go download the Google Chrome extension for your web browser at propublica.org/blackbox, then tell us what you find out and how it makes you feel. Reach out in the comments section below; email us at firstname.lastname@example.org; holler at us on Twitter or Facebook; and fill in ProPublica and Julia Angwin too.
Music in this episode:
"Time Waste" by Podington Bear
"Bad Cut" by Podington Bear
"Saunter" by Podington Bear
"Bit Rio" by Podington Bear