Recommended Content – A Calculated Way to Annoy, Irritate and Frustrate Your Users

Fun fact: if someone were to watch everything on Netflix, it would take them 236 and a half days. And that was back in 2017 – the amount of content has only grown since. Factor in other platforms, like Hulu and HBO, and that figure multiplies at a shocking rate. Thankfully, each service has some type of ‘recommended’ section that helps viewers discover new, hand- or machine-picked content, saving them from having to dredge through a sea of options. But, often, when someone peeks through this personalized selection, they’re met with generic and inaccurate suggestions. With a sigh of frustration, viewers will turn on something familiar like The Office for the 15th time (which, if you’re on a dating app, counts as a personality trait) or consider finding a different streaming service. With the number of services increasing and content becoming more diverse, there is a need to rethink content curation to curb the frustration caused by recommendation engines and their generic results.

First, it should be said that recommendation engines are a fantastic invention that has become an integral mechanism of streaming services. These engines are AI algorithms that dissect viewing habits and parse content descriptions to calculate who would most likely enjoy what. Theoretically, this means every user gets a curated experience unique to them but, in practice, these recommendations are built on assumptions. For instance, they assume content descriptions are accurate and that two users who share similar viewing habits truly have similar tastes. When those assumptions are wrong, the suggested content feels disingenuous. Users already feel uncomfortable with algorithms, producing phenomena like ‘algorithm anxiety,’ where a user feels powerless in the face of an algorithm they depend on (think Uber drivers, or people submitting their résumés). If anxiety exists around algorithms that influence important life events, then there might be a subtler form of it that viewers experience daily because of streaming services. This could stem from inaccurate results reinforcing that the user’s preferences are, in fact, not being considered, despite being told the results are ‘recommended for you.’ It can feel dehumanizing, and lead to frustration, when services take action that is disconnected from what viewers want. When Netflix announced that it would be canceling shows like The OA and Tuca & Bertie due to poor performance, both fans and creators blamed the service’s algorithm for not connecting the shows with the right audience. Fans then organized a mass unsubscribe campaign in response, to demonstrate that Netflix had not been listening to them. When users feel that they are not being listened to, they feel betrayed by a system that is marketed as a curation tool for their preferences.
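The “similar viewing habits imply similar tastes” assumption is the heart of collaborative filtering, the technique most recommendation engines lean on. A minimal sketch of the idea – all user names, titles, and ratings here are invented for illustration, and real systems operate on vastly larger, sparser data:

```python
import math

# Hypothetical viewing histories: user -> {title: rating}.
ratings = {
    "alice": {"The OA": 5, "Tuca & Bertie": 4, "The Office": 3},
    "bob":   {"The OA": 5, "Tuca & Bertie": 5, "Dark": 4},
    "carol": {"The Office": 5, "Parks and Recreation": 5},
}

def cosine_similarity(a, b):
    """How alike two users are, judged only by the titles both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(user, ratings, top_n=3):
    """Score unseen titles by the similarity-weighted ratings of others."""
    seen = ratings[user]
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(seen, their)
        for title, rating in their.items():
            if title not in seen:
                scores[title] = scores.get(title, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("alice", ratings))  # → ['Dark', 'Parks and Recreation']
```

Notice the failure mode the article describes: the scores say nothing about *why* alice and bob overlap, so one coincidental shared show can drag in an entire stranger’s catalogue.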

To address this, we need to build more ‘human’ influence into streaming platforms. Some services have gotten a jump on this and are already experimenting. HBO revealed ‘Recommended by Humans’ only a few months ago, meant as a jab at automated suggestions by featuring fans’ tweets reacting to HBO shows. Netflix also recently started testing humanized recommendations, trialing human-curated playlists on iOS as a way of giving people content lists that aren’t auto-generated from video descriptions. Vimeo has focused on human recommendations for years with its Staff Picks, a high honor among indie filmmakers. While the jury is still out on whether these systems quell the ire brought on by malfunctioning recommendations, they do remedy part of the problem: they make recommendations feel less manufactured and, as a result, more genuine. It is a good start, but these platforms are still relying on mass-communication methods of delivering recommendations, which miss a core cause of the dehumanizing effect of algorithms. To ensure that the average user feels like their opinion matters, steps should be taken to allow each viewer to influence content curation.

To learn the best practices and pitfalls of user-led content curation, streaming services can look to platforms that prioritize the voice of the user, such as Reddit and Discord, to see how to incorporate user sentiment into curation. Content voting should always be a consideration. Reddit has thrived by letting smaller communities vote on the content featured on their pages, with the highest-rated content earning a spot on the coveted front page for everyone to see. The key is that Reddit weighs votes so that featured content reflects the site’s zeitgeist. This voting allows users to represent their communities and organically filter posts based on praise rather than automated assumptions. Another method of content curation is to give users a direct line to creators. Discord (the unproductive brother of Slack) has transformed the relationship between viewers and content creators, evolving the average viewer into a micro-producer who can give feedback, establish timelines, and propose ideas. There is a line to be drawn, of course, as to how much influence viewers should have on content, but these chat rooms can become melting pots of new ideas that keep creators accountable while also helping them understand their core audience. There are other lessons to be learned here as well: how YouTube gives creators a way to suggest other content you’d like, how Twitch transformed every stream into a vibrant viewing party, or how Patreon gives people a chance to fund their favorite projects. The key takeaway is that these systems have made the user’s voice a mechanism of curation.
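Reddit’s vote weighting can be approximated with a ‘hot’-style ranking: the vote margin counts logarithmically, and post age drags the score down, so a fresh post with a hundred upvotes can outrank a day-old post with a thousand. A rough sketch – the decay constant and exact shape here are illustrative, not Reddit’s precise production formula:

```python
import math

def hot_score(upvotes, downvotes, age_hours):
    """Reddit-style 'hot' rank: logarithmic vote margin minus an age
    penalty. Constants are illustrative, not Reddit's exact values."""
    margin = upvotes - downvotes
    # Diminishing returns: going from 10 to 100 votes adds as much
    # as going from 100 to 1,000.
    order = math.log10(max(abs(margin), 1))
    sign = (margin > 0) - (margin < 0)
    # Every ~12.5 hours of age costs one order of magnitude of votes.
    return sign * order - age_hours / 12.5

# A fresh, modestly upvoted post outranks an older, hugely upvoted one.
print(hot_score(100, 0, 1) > hot_score(1000, 0, 24))  # → True
```

The design choice matters for the article’s argument: the logarithm keeps a handful of megahit posts from permanently crowding out what smaller communities are praising right now.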

It can be hard to make every single person feel valued, especially when you have millions of subscribers, each with their own unique tastes. This makes automating content recommendations an immensely difficult task, because a recommendation serves both a practical and an emotional purpose. But by introducing systems that make the individual’s voice count in how content is curated, you both learn more about your audience and help remedy the frustration a recommendation engine can cause when it breaks down.