• breakfastmtn@lemmy.ca
    4 months ago

    There’s nothing wrong with MBFC. The arguments against it are silly. Most of the time, people link a single entry they typically haven’t read, one that very often says the opposite of what they claim. Even then, that’s a look at about one hundredth of one percent of the content MBFC has evaluated: a serious sample size problem. When researchers use the entire dataset to compare MBFC to other bias monitors (both organizations and academic sources), what they find is consensus.

    This study compared 6 organizations and found consensus across thousands of news sites. Another recent study concluded that it doesn’t matter which one you use because the level of agreement between them is so high. One thing you won’t find ‘critics’ doing is citing peer-reviewed research in high-quality journals to support their arguments, because it just doesn’t exist. MBFC is used in research all the time by people who’ve dedicated their lives to understanding media, bias, misinformation, and propaganda. Those people have real skin in the game and could see their careers damaged or destroyed by relying on a poor resource. And yet. The idea that it’s good enough for serious research but not good enough for our news-sharing communities is pretty laughable.
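
    For anyone wondering what ‘level of agreement’ means in practice: these comparisons boil down to checking how often different raters give the same label (or closely correlated scores) to the same outlets. A minimal sketch of that idea, with made-up ratings purely for illustration (not from MBFC or any study):

    ```python
    # Toy pairwise-agreement check between two hypothetical bias raters.
    # All ratings below are invented placeholders, not real data.
    ratings_a = {"site1.com": "left", "site2.com": "center", "site3.com": "right", "site4.com": "right"}
    ratings_b = {"site1.com": "left", "site2.com": "center", "site3.com": "right", "site4.com": "center"}

    shared = ratings_a.keys() & ratings_b.keys()               # outlets rated by both
    matches = sum(ratings_a[s] == ratings_b[s] for s in shared)
    print(f"Agreement on {len(shared)} shared outlets: {matches / len(shared):.0%}")  # 75%
    ```

    The actual studies presumably use more rigorous statistics (correlations, chance-corrected agreement) across the thousands of outlets they cover, but the underlying question is the same.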

    Here are some questions those folks don’t have great answers for:

    1. How could 6 independent groups, using different methodologies, arrive at the same conclusions by random chance?
    2. If the bias in MBFC is so pervasive, why can’t anyone find it?
    3. Why is MBFC never shown to be an outlier if it’s ‘one guy’s arbitrary opinion’?