YouTube’s ‘dislike’ and ‘not interested’ options don’t do much for your recommendations, study says

It is no secret that both viewers and creators are confused by the puzzle that is YouTube's recommendation algorithm. Now, a new study by Mozilla suggests that users' recommendations change little even when they use options like "dislike" and "not interested" to stop YouTube from suggesting similar videos.

The study found that YouTube kept serving participants videos similar to the ones they had rejected, even after they used feedback tools or changed their settings. Clicking "not interested" and "dislike" was largely ineffective at preventing bad recommendations, stopping only 11% and 12% of them, respectively. Options like "don't recommend channel" and "remove from watch history" ranked higher in effectiveness, cutting 43% and 29% of bad recommendations, respectively. Overall, participants were dissatisfied with YouTube's ability to keep bad recommendations out of their feeds.

Image Credits: Mozilla

Mozilla's study drew on data from 22,722 users of its RegretReporter browser extension, which lets users report "regrettable" videos and exert more control over their recommendations, and analyzed more than 567 million videos. The organization also conducted a detailed survey of 2,757 RegretReporter users to better understand their feedback.

The report noted that 78.3% of participants used YouTube's own feedback buttons, changed their settings or avoided certain videos to "teach" the algorithm to make better suggestions. Of the people who took some kind of step to control YouTube's recommendations, 39.3% said those steps didn't work.

“Nothing changed. Sometimes I would report things as misleading and spam and the next day it was back in. It almost feels like the more negative feedback I provide to their suggestions the higher bullshit mountain gets. Even when you block certain sources they eventually return,” a survey taker said.

Another 23% of people who tried to change YouTube's suggestions gave a mixed response, citing effects like unwanted videos creeping back into their feed or needing sustained time and effort before recommendations improved.

"Yes they did change, but in a bad way. In a way, I feel punished for proactively trying to change the algorithm's behavior. In some ways, less interaction provides less data on which to base the recommendations,” another study participant said.

Mozilla concluded that even YouTube's most effective tools for staving off bad recommendations were not sufficient to change users' feeds. It said that the company "is not really that interested in hearing what its users really want, preferring to rely on opaque methods that drive engagement regardless of the best interests of its users."

The organization recommended that YouTube design easy-to-understand user controls and give researchers granular data access to help them better understand the video-sharing site's recommendation engine.

"We offer viewers control over their recommendations, including the ability to block a video or channel from being recommended to them in the future. Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers. We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights," Elena a YouTube spokesperson said in response to the study.

Mozilla conducted another YouTube study last year, which found that 71% of the videos users "regretted" watching, including clips featuring misinformation and spam, had been suggested by the service's algorithm. A few months after that study was made public, YouTube wrote a blog post defending its decision to build the current recommendation system and to filter out "low-quality" content.

After years of relying on algorithms to suggest more content to users, social networks including TikTok, Twitter and Instagram are trying to provide users with more options to refine their feeds.

Lawmakers across the world are also taking a closer look at how the opaque recommendation engines of different social networks can affect users. The European Union passed the Digital Services Act in April to increase platforms' algorithmic accountability, while the U.S. is considering the bipartisan Filter Bubble Transparency Act to address a similar issue.

This story has been updated with a response from Google.