What that change means for you, and why some people are upset about it.
Automation can make a system simpler and more efficient, but a human being has to build the automated system first, and that person's underlying values and goals come through in the finished product. It may sound simplistic, but you have to ask what the automation is trying to accomplish, a question that's especially important given that most major tech companies frame updates and algorithm changes as delivering a vague "better user experience." What, in the company's terms, does "better user experience" mean? One of Netflix's goals, which most users understand intuitively, is to attract as many subscribers as possible and give them personalized experiences that keep them watching.
It's not that this system is inherently bad so much as that it's not inherently good, especially in the broader context of how the internet works. Search and recommendation algorithms can be pretty effective at surfacing content a user already likes, but they're not very good at exposing an individual to content she doesn't like, or content she doesn't even know she likes. The issue comes down to a basic push-pull concept in machine learning called exploitation vs. exploration -- the exploitation side surfaces what a user knows and loves, the "go-tos," while the exploration side is exactly what it sounds like, discovering new things you might like. (Spotify's "Discover Weekly" feature is a prime example of the exploration approach, though it, too, mines your "go-tos" for recommendations.)
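Netflix and Spotify don't publish their recommenders, but the exploitation-vs.-exploration trade-off is classically illustrated with an epsilon-greedy strategy: most of the time, serve the user's proven favorite; occasionally, roll the dice on something new. The sketch below is purely illustrative; the genre names and the `epsilon_greedy_pick` helper are invented, not anything from an actual service.

```python
import random

def epsilon_greedy_pick(watch_counts, epsilon=0.1, rng=random):
    """Pick a genre to recommend from.

    With probability epsilon, explore: choose a random genre.
    Otherwise, exploit: choose the genre with the most watches (the "go-to").
    """
    genres = list(watch_counts)
    if rng.random() < epsilon:
        return rng.choice(genres)             # exploration: try something new
    return max(genres, key=watch_counts.get)  # exploitation: the known favorite

# A hypothetical viewing history: mostly thrillers, a few documentaries.
history = {"thrillers": 42, "documentaries": 3, "anime": 1}
pick = epsilon_greedy_pick(history, epsilon=0.1)
```

With `epsilon=0.1`, roughly nine picks in ten are the user's top genre; the rest are random draws, which is where "Discover Weekly"-style surprises come from. Tuning epsilon is exactly the push-pull described above.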
The vague word "content" actually applies here, because it really could refer to anything online. All the products you use, from Google search to Facebook to Spotify, try to predict what you want in real time, whether it's an image that will intrigue you, a hard news story, a music video, or an uncle's Facebook rant. It forces content creators, especially those who want to appeal to as many people as possible, to play by the algorithms' rules, and no one's immune. Your media becomes a product of your existing tastes, as filtered through a tech company's assumptions about what you already like.
Of course, because online algorithms are human creations, they have biases. Almost daily, you see and hear criticisms leveled at Facebook and Twitter for failing to control what information is disseminated on their platforms, which the companies often dismiss by citing freedom of speech and the fact that the content on their services is dictated by algorithms. The fact that these algorithms rely on user input -- i.e. what you click and engage with -- to deliver new content for you allows the companies to say that any perceived biases are actually the result of user prejudice, as opposed to an inherent flaw in their design.
Case in point: The recent uproar over Netflix's perceived targeting of black users with images of black actors who are often minor characters in a film or show. When people complain, Netflix can say, as they did to Fader, that they're not targeting black users with anything at all; it's a function of the algorithm. "Reports that we look at demographics when personalizing artwork are untrue," the company's statement says. "We don't ask members for their race, gender, or ethnicity so we cannot use this information to personalize their individual Netflix experience. The only information we use is a member's viewing history."
That's technically true, but it's also a cop-out, because it doesn't reveal any of the input factors feeding the algorithm that delivers those images in the first place. In other words: what's making Netflix's algorithm think certain users will respond best to black actors in a show or movie's cover image? The company conveniently sidesteps this question by saying it doesn't collect demographic information that would let it target users by race, without acknowledging that that's effectively what the image-selection algorithm is doing anyway.
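Netflix hasn't published its image-selection system, but the proxy effect at issue is easy to sketch: a selector that only ever sees a viewing-history cluster and click data, with no demographic field anywhere, can still end up segmenting users along demographic lines whenever viewing history correlates with demographics. Everything here (`ArtworkSelector`, the cluster labels, the click numbers) is a hypothetical illustration, not Netflix's method.

```python
from collections import defaultdict

class ArtworkSelector:
    """Serve, for each viewing-history cluster, the artwork variant with
    the best observed click-through rate. Note that no demographic data
    is stored or consulted anywhere in this class."""

    def __init__(self):
        self.shows = defaultdict(int)   # (cluster, artwork) -> impressions
        self.clicks = defaultdict(int)  # (cluster, artwork) -> clicks

    def record(self, cluster, artwork, clicked):
        key = (cluster, artwork)
        self.shows[key] += 1
        self.clicks[key] += int(clicked)

    def best_artwork(self, cluster, artworks):
        def ctr(artwork):
            key = (cluster, artwork)
            return self.clicks[key] / self.shows[key] if self.shows[key] else 0.0
        return max(artworks, key=ctr)
```

If cluster membership (built purely from what people watch) happens to track race, the selector will learn to show different cover images to different racial groups while its owner can truthfully say it "only uses viewing history." That's the gap between Netflix's denial and what users observed.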
Stephen Colbert recently made fun of Netflix's racially targeted promotion, but most responses to the controversy seem to reflect a fundamental misunderstanding of even the most basic functionality of Netflix's service, which makes it harder to hold the company (or Facebook, or Google, or Twitter) accountable.