Dealing with algorithm bias in news

An image of a fact-checking unit created using Microsoft’s AI Image Designer

Journalists must be trained in how to recognise and deal with algorithm bias in order to counter the speed and scale at which biased content is being spread. That’s the view of Gemini, the artificial intelligence (AI) conversation tool, which says that the need for such training is “more urgent than ever”.

Gemini says “the rapid advancement of artificial intelligence and machine learning has led to a significant increase in the use of algorithms in news production” and that “algorithms can introduce bias into news content, which can have serious consequences for both individuals and society.”

Media Helping Media (MHM) asked Gemini how journalists can recognise and deal with algorithm bias. We also asked it to design a one-day training course to equip journalists with the tools they need to tackle the issue.

The following is the text of that conversation, which took place in September 2024. During the conversation Gemini states that while the challenge of bias in news production is not entirely new, the rise of AI has made it more urgent and complex.

Gemini warns that “journalists and news organisations must adapt to this new reality by developing strategies to mitigate algorithm bias and ensure that the news they produce is fair, accurate, and representative”.

We asked Gemini the following questions. The answers were immediate. Gemini’s responses are unedited, other than being changed from American English to British English (MHM’s preferred house style).

MHM: What is algorithm bias in news production?

Gemini AI: Algorithm bias in news production refers to the unintentional or intentional bias introduced into news content by the algorithms used to curate, personalise, or generate news. This bias can manifest in various ways, including:

  • Selection bias: Algorithms may prioritise certain stories or topics over others, leading to a skewed representation of the news.
  • Confirmation bias: Algorithms may reinforce existing beliefs or biases by recommending content that aligns with a user’s preferences.
  • Representation bias: Algorithms may underrepresent certain groups or perspectives, leading to a lack of diversity in the news.
  • Amplification bias: Algorithms may amplify certain narratives or misinformation, contributing to the spread of false or misleading information.
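
To make the first of these failure modes concrete, the short Python sketch below, built on entirely hypothetical story data, shows how a feed ranked purely by engagement can introduce selection bias: the story pool is balanced across topics, but the top of the feed is not.

```python
# A minimal sketch with hypothetical data: an engagement-ranked feed
# over-represents the topic that happens to attract the most clicks,
# even though the underlying story pool is balanced.
from collections import Counter

# Hypothetical story pool: ten stories per topic, but 'crime' stories
# happen to attract far more clicks than the others.
stories = (
    [{"topic": "crime",   "clicks": 900 + i} for i in range(10)]
    + [{"topic": "health",  "clicks": 300 + i} for i in range(10)]
    + [{"topic": "economy", "clicks": 250 + i} for i in range(10)]
)

def top_feed(stories, n=10):
    """Rank purely by engagement, a common naive curation rule."""
    return sorted(stories, key=lambda s: s["clicks"], reverse=True)[:n]

pool_share = Counter(s["topic"] for s in stories)
feed_share = Counter(s["topic"] for s in top_feed(stories))

print("Pool:", dict(pool_share))  # balanced: 10 / 10 / 10
print("Feed:", dict(feed_share))  # skewed: crime fills every top slot
```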

Factors contributing to algorithm bias:

  • Biased data: If the data used to train algorithms is biased, the algorithms will likely produce biased outputs.
  • Biased algorithms: The design of algorithms themselves can introduce bias, such as by prioritising certain features or metrics over others.
  • Lack of transparency: The lack of transparency in how algorithms work can make it difficult to identify and address bias.

Addressing algorithm bias in news production requires a multi-faceted approach, including:

  • Improving data quality: Ensuring that the data used to train algorithms is diverse, representative, and free from bias.
  • Designing unbiased algorithms: Developing algorithms that are transparent, accountable, and resistant to bias.
  • Promoting transparency: Increasing transparency in how algorithms are used and the decisions they make.
  • Human oversight: Ensuring that human editors and journalists play a role in reviewing and curating news content.

By understanding and addressing algorithm bias, news organisations can help to ensure that the news they produce is fair, accurate, and representative.
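
To illustrate the “improving data quality” point above, the sketch below, which uses hypothetical records and an arbitrary tolerance threshold, checks whether each category in a training sample is represented in rough proportion to a reference distribution. It is a starting point for an audit, not a substitute for one.

```python
# A minimal pre-training data audit, using hypothetical data: flag any
# category whose share of the sample falls well below its share of a
# reference distribution (for example, census figures).
from collections import Counter

def audit_representation(records, key, reference, tolerance=0.5):
    """Return categories whose share in `records` is below
    `tolerance` times their share in `reference`."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    flags = []
    for category, expected in reference.items():
        actual = counts.get(category, 0) / total
        if actual < tolerance * expected:
            flags.append((category, actual, expected))
    return flags

# Hypothetical training sample versus a reference distribution.
sample = [{"region": "urban"}] * 80 + [{"region": "rural"}] * 20
reference = {"urban": 0.55, "rural": 0.45}

for category, actual, expected in audit_representation(sample, "region", reference):
    print(f"Under-represented: {category} ({actual:.0%} in sample vs {expected:.0%} expected)")
```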

MHM: How can journalists avoid algorithm bias if they are not in control of the algorithms?

Gemini AI: While journalists may not have direct control over the algorithms used in news production, they can take several steps to mitigate algorithm bias:

  1. Be aware of algorithm bias: Journalists should understand the concept of algorithm bias and how it can manifest in news content. This knowledge can help them identify potential biases and take steps to address them.
  2. Critically evaluate news sources: Journalists should be sceptical of news stories generated or curated by algorithms. They should carefully evaluate the sources of information and verify the accuracy of the content.
  3. Seek diverse sources: Journalists should strive to gather information from a variety of sources, including those that may not be prominently featured in algorithm-driven news feeds. This can help to ensure that the news they report is balanced and representative.
  4. Use human judgement: Journalists should rely on their own judgement to assess the quality and relevance of news stories. They should be willing to override algorithmic recommendations if they believe that they are biased or misleading.
  5. Advocate for transparency: Journalists can advocate for greater transparency in how algorithms are used in news production. This can help to identify and address biases and ensure that the news is produced in a fair and ethical manner.
  6. Collaborate with data scientists: Journalists can collaborate with data scientists to understand how algorithms work and identify potential biases. This can help them to develop strategies for mitigating algorithm bias.

By taking these steps, journalists can help to ensure that the news they report is accurate, balanced, and free from algorithm bias.

MHM: So mitigating algorithm bias is a new task brought about by the advance of AI but should now become a crucial part of a news producer’s job?

Gemini AI: That’s a great observation. While the term ‘algorithm bias’ may be relatively new, the underlying issue of bias in news production has existed for centuries. However, the advent of AI and machine learning has amplified the potential for bias to be introduced into news content at a larger scale and at a faster pace.

Here’s a breakdown of how traditional news production compares with AI-driven news:

Traditional News Production:

  • Human curation: News editors and journalists manually select and curate stories based on their judgement and expertise.
  • Limited reach: News distribution is primarily through print and broadcast media, limiting both the audience and the potential spread of bias.

AI-Driven News Production:

  • Algorithmic curation: Algorithms use data to recommend news stories to users, often based on their past behaviour and preferences.
  • Vast reach: News can be disseminated instantly to a global audience through digital platforms, increasing the potential impact of bias.

Key differences:

  • Scale: AI can process and analyse vast amounts of data, increasing the potential for bias to be introduced at a larger scale.
  • Speed: AI can disseminate news at a much faster pace than traditional methods, making it difficult to identify and correct biases before they have a significant impact.
  • Personalisation: AI can personalise news content to individual users, potentially reinforcing existing biases and limiting exposure to diverse perspectives.
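
The personalisation point lends itself to a toy model. The sketch below, with hypothetical topics and an arbitrary reinforcement factor, shows the “rich get richer” feedback loop: each recommended item nudges its topic’s weight upwards, and the simulated feed narrows over time.

```python
# A toy model of a personalisation feedback loop, with hypothetical
# topics: sampling is proportional to weight, and every recommendation
# is treated as a click that reinforces its topic.
import random

random.seed(42)
topics = ["politics", "sport", "science", "culture"]
weights = {t: 1.0 for t in topics}

def recommend(weights):
    # Sample one topic in proportion to its current weight.
    return random.choices(list(weights), weights=list(weights.values()))[0]

for _ in range(200):
    weights[recommend(weights)] *= 1.1  # reinforcement on each 'click'

total = sum(weights.values())
for share, topic in sorted(((w / total, t) for t, w in weights.items()), reverse=True):
    print(f"{topic}: {share:.0%}")
# One topic typically ends up dominating, despite neutral starting weights.
```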

Therefore, while the challenge of bias in news production is not entirely new, the rise of AI has made it more urgent and complex. Journalists and news organisations must adapt to this new reality by developing strategies to mitigate algorithm bias and ensure that the news they produce is fair, accurate, and representative.

MHM: So would you say that dealing with algorithm bias should now be part of every journalist’s basic training?

Gemini AI: Absolutely. Dealing with algorithm bias should be a fundamental part of every journalist’s training in the digital age. Here’s why:

  1. Understanding the problem: Journalists need to understand how algorithms work, how they can introduce bias, and the potential consequences of this bias.
  2. Identifying and mitigating bias: Journalists should be able to identify signs of algorithm bias in news content and take steps to mitigate its effects.
  3. Evaluating sources: Journalists should be able to critically evaluate the sources of information they rely on, especially those that are generated or curated by algorithms.
  4. Promoting diversity and inclusion: Journalists should strive to promote diversity and inclusion in the news they report, by seeking out a variety of sources and avoiding algorithmic biases that may underrepresent certain groups or perspectives.
  5. Advocating for transparency: Journalists can advocate for greater transparency in how algorithms are used in news production, which can help to identify and address biases.

By incorporating training on algorithm bias into journalism programmes, we can equip journalists with the skills and knowledge they need to produce fair, accurate, and representative news in the digital age.

MHM: Could you suggest the outline for a one-day training course for journalists who want to understand and tackle algorithm bias?

Gemini AI: Absolutely. Here is an outline for a one-day training course on algorithm bias for journalists:

Morning Session: Understanding Algorithm Bias

  • Introduction to algorithms in news production:
    • How algorithms are used in news curation, personalisation, and generation
    • Common types of algorithms (e.g., recommendation engines, natural language processing)
  • Understanding bias:
    • Defining bias and its different forms (e.g., selection bias, confirmation bias, representation bias)
    • The impact of bias on news quality and credibility
  • Sources of algorithm bias:
    • Biased data
    • Biased algorithms
    • Lack of transparency
  • Case studies of algorithm bias in news:
    • Real-world examples of how algorithm bias has affected news coverage
    • Lessons learned from these cases

Afternoon Session: Tackling Algorithm Bias

  • Identifying algorithm bias in news content:
    • Recognising signs of bias in news stories generated or curated by algorithms
    • Tools and techniques for detecting bias (one such technique is sketched after this outline)
  • Mitigating algorithm bias:
    • Strategies for reducing the impact of bias in news production
    • Best practices for data collection and curation
    • Designing unbiased algorithms
    • Promoting transparency and accountability
  • Human oversight and editorial judgement:
    • The role of human editors and journalists in ensuring the quality and fairness of news content
    • Balancing algorithmic recommendations with human expertise
  • Advocating for change:
    • How journalists can advocate for greater transparency and accountability in the use of algorithms in news production
    • Engaging with news organisations and policymakers to address algorithm bias

Workshop Activities:

  • Group discussions: Facilitated discussions on specific case studies and challenges related to algorithm bias
  • Hands-on exercises: Practical exercises to identify and analyse algorithm bias in news content
  • Role-playing scenarios: Simulations of real-world situations where journalists need to deal with algorithm bias
  • Guest speakers: Invited experts from the fields of journalism, technology, and ethics to share their insights and experiences

Additional Considerations:

  • Tailoring the course: The content of the course can be tailored to the specific needs and interests of the participating journalists.
  • Online or in-person: The training can be conducted either in-person or online, depending on the preferences of the participants and the resources available.
  • Continuous learning: Encourage participants to continue learning about algorithm bias and best practices for mitigating its effects through ongoing professional development.
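
For the detection techniques and hands-on exercises mentioned in the outline above, one simple starting point, sketched below with hypothetical counts, is to compare the topic distribution of an algorithmic feed against the full story pool, using total variation distance as a crude skew score.

```python
# A minimal detection sketch with hypothetical counts: score how far an
# algorithmic feed's topic mix has drifted from the full story pool.
def distribution(counts):
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def total_variation(p, q):
    """Total variation distance: 0 = identical, 1 = maximally skewed."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)

pool = distribution({"politics": 40, "health": 30, "economy": 30})
feed = distribution({"politics": 75, "health": 15, "economy": 10})

print(f"Skew score: {total_variation(pool, feed):.2f}")
# A score that rises over time is a prompt for human editorial review.
```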

MHM: And what sources would you recommend for supporting the course on algorithm bias which you have just outlined?

Gemini AI: Recommended sources for a course on algorithm bias in news production include:

Books:

  • Weapons of Math Destruction by Cathy O’Neil: This book explores how algorithms can perpetuate inequality and discrimination.
  • Automating Inequality by Virginia Eubanks: This book examines the ways in which algorithms are used to reinforce existing social and economic inequalities.
  • Algorithms of Oppression by Safiya Umoja Noble: This book focuses on the racial bias inherent in many algorithms and their impact on marginalised communities.
  • Algorithms to Live By by Brian Christian and Tom Griffiths: This book provides a general introduction to algorithms and their applications.

Articles and Papers:

  • Journal of Computer-Mediated Communication: This journal publishes research on the intersection of technology and communication, including studies on algorithm bias.
  • Digital Journalism & Ethics: This journal focuses on ethical issues in digital journalism, including the use of algorithms.
  • Pew Research Center: This organisation conducts research on a variety of topics related to technology and society, including algorithm bias.
  • MIT Technology Review: This publication covers emerging technologies and their societal implications, including the impact of algorithms.

Online Resources:

  • AlgorithmWatch: This non-profit organisation works to increase transparency and accountability in the use of algorithms.
  • Open Rights Group: This organisation campaigns for digital rights, including the right to be free from algorithmic discrimination.
  • Mozilla Foundation: This foundation is dedicated to building a healthy internet, including efforts to address algorithm bias.
  • The Markup: This investigative journalism outlet focuses on the impact of technology on society, including the role of algorithms.

Case Studies:

  • The Facebook–Cambridge Analytica scandal: This case highlights the potential for algorithms to be used to manipulate public opinion.
  • The YouTube algorithm and the spread of misinformation: This case demonstrates how algorithms can amplify harmful content.
  • The use of algorithms in criminal justice: This case raises concerns about the potential for algorithms to perpetuate racial bias.

By incorporating these sources into your training course, you can provide participants with a comprehensive understanding of algorithm bias and the tools they need to address it.

MHM: Would you say that the need for journalists to be trained in dealing with algorithm bias is urgent?

Gemini AI: Absolutely. The need for journalists to be trained in dealing with algorithm bias is more urgent than ever. The rapid advancement of artificial intelligence and machine learning has led to a significant increase in the use of algorithms in news production. These algorithms can introduce bias into news content, which can have serious consequences for both individuals and society.

Here are some reasons why this training is urgent:

  • Increasing reliance on algorithms: News organisations are increasingly relying on algorithms to curate, personalise, and generate news content. This means that the potential for algorithm bias to impact news coverage is growing.
  • The potential for harm: Algorithm bias can lead to the spread of misinformation, the amplification of harmful stereotypes, and the underrepresentation of certain groups or perspectives. This can have serious consequences for individuals and society.
  • The need for accountability: Journalists have a responsibility to ensure that the news they report is accurate, fair, and unbiased. This includes understanding and addressing the potential for algorithm bias.

By training journalists in how to identify and mitigate algorithm bias, we can help to ensure that the news we consume is reliable and trustworthy.


MHM’s conclusion

The fact that AI is warning that journalists need to be trained in recognising and dealing with algorithm bias suggests that human judgement, journalistic ethics, and editorial values are crucial in a world where artificial intelligence and machine learning are crafting so much content. As Gemini says in the interview above, “the underlying issue of bias in news production has existed for centuries. However, the advent of AI and machine learning has amplified the potential for bias to be introduced into news content at a larger scale and at a faster pace.”

Ensuring female representation in news leadership and coverage

12 steps designed to tackle the “cultural exclusion” of women in news leadership roles and “unmute” the voices of women in the global news industry.

Leadership training for Sudanese journalists – Image by Media Helping Media released via Creative Commons BY-NC-SA 4.0

Is your news organisation and its output male-dominated? Are women fairly represented in newsroom leadership roles and in the stories covered? Do your staff and your news agenda reflect the society they serve?

A 12-point plan containing “substantive opportunities to reshape the status quo” and improve the representation and voices of women in the global news industry was published in November 2022. We look at the steps suggested and examine the possible consequences.

The report Outrage to Opportunity looks at the representation of women in newsrooms in India, Kenya, Nigeria, South Africa, the UK, and the USA.

Luba Kassova, the report’s author, says in-depth interviews with more than 40 senior editors in the six countries revealed two areas of concern: a “cultural exclusion” of women in news leadership roles, and the fact that the voices of women “remain muted in a global news industry”.

However, the study also recognises that progress is being made. It looks at 168 existing initiatives which aim to improve women’s representation or inclusion in news.

And it identifies practical steps that could be taken to “include the missing perspectives of women of all colours in news leadership and coverage”.

The report says that taking these steps could open up significant business opportunities, and that “global revenues in the newspaper industry would increase by a cumulative $11 billion by 2027 and $38 billion by 2032”.

This article looks at those 12 practical steps that could be taken by news organisations. They are listed below with the author’s permission.

The 282-page document – embedded at the foot of this article – sets out three areas which the author says require the news industry’s “urgent attention”.

1: Inclusion

The report found that women are still on the margins of editorial decision-making in the highest-profile news areas. In the major journalistic specialisms such as business, politics, and foreign affairs, women hold as few as one in six editorial roles in the countries researched. Occasional instances of parity, for example South African political editors or US business editors, are counterbalanced by exclusively male leadership in political editor roles in countries such as Kenya and India.

2: Race

According to the report, women news leaders are subject to marginalisation in countries with multi-racial populations such as South Africa, the UK, and the US. This, the report says, is a news industry “blind spot”. In the countries examined, their representation is significantly below their proportion in the working population. Kassova says that when interviewed, news leadership teams were either unaware of the problem, merely reactive, inclined to push back, or turned a blind eye to it.

3: Coverage

The report says the coverage of issues that affect women is disproportionately small. Just 0.02% of news coverage globally focuses on the gaps between men and women in pay, power, safety, authority, confidence, health, and ageism. According to the report:

Just having women in many leadership positions is not the only solution. You need women’s voices to be heard, and, when women are in leadership positions, for them to be in meaningful positions.

Steps to be taken

The report outlines 12 solutions for:

“changing the status quo and improving women’s representation and inclusion in news leadership and creating a more balanced and inclusive news coverage that engages more female and racially-diverse audiences”.

News sector level 

  1. Consolidate efforts at an industry level to enhance progress 
  • Set up a body that unifies all working in this area under the umbrella of the gender equity cause.
  • Nonprofits and associations working to improve women’s position in news should collaborate more and compete less.
  • Develop industry-wide initiatives that aim to understand women audiences.

News organisational level

  2. Conduct a comprehensive gender diversity audit across all elements of the news operation
  • Carry out an audit of the existing gender balance and news initiatives for women along all the elements of the value chain.
  3. Make the gender diversity strategy explicit
  • Agree the goal to pursue gender balance in your news organisation and coverage and make it explicit in the organisation’s strategy.

News leadership level

  4. Improve the representation of all women
  • Raise awareness that male-favouring norms prevail in society and in journalism.
  • Be intentional about change at all levels within your news organisation by setting targets and measuring representation continuously along the whole news value chain.
  • Challenge the persistent soft vs. hard news gender stereotyping that keeps women out of editorial roles in high-profile areas such as politics, economics, and foreign affairs.
  • Develop talent retention programmes, especially for mid-level managers.
  5. Improve specifically the representation of women of colour
  • Measure women’s representation to understand the role of race and ethnicity when overlaid onto gender.
  • Review recruitment and retention processes to accommodate diverse talent.
  • Establish and measure targets for representation and inclusion of women of colour.
  6. Improve the inclusion of all women in decision-making
  • Ensure everyone understands that a more diverse team does not automatically translate into inclusive decision making.
  • Consciously tone down the competitive win-lose journalistic values inside the newsroom to create a more empathetic culture.
  • Encourage support groups for women, with men on board.
  • Use employee engagement surveys to measure employees’ and leaders’ perceived inclusion.
  • Set up initiatives that specifically support individual women’s safety and well-being.
  7. Improve specifically the inclusion of women of colour in decision-making
  • Beware of knee-jerk reactions to external trigger events which lead to superficial solutions.
  • Educate yourself about the stressful experiences of women of colour in your news organisation.
  • When considering a Diversity, Equity and Inclusion (DEI) initiative, bring in experts to run it rather than allocating it to staff with lived experiences but no formalised expertise.
  • Do not expect women or ethnic minority groups to lead and resolve their own underrepresentation or exclusion. Offer support.
  • If you are a woman of colour, do not take on DEI initiatives if this is too emotionally draining.
  • Measure and track the success of DEI initiatives. Establish baselines before the initiatives start.

Newsroom and journalism level

  8. Improve the representation and inclusion of women in newsrooms

Newsgathering and news coverage

  9. Improve the representation of all women
  • Focus on circumventing the five main biases in journalism (status quo, male bias leading to gender blindness, short-term outlook, reductive journalism, organisations’ unwillingness to report on themselves).
  • Measure the impact of your women-centric journalism.
  • Make your newsroom more inclusive and accessible through outreach training.
  • Ensure the sustainability of interventions to diversify contributors: keep them simple, get backing from both the top and the grassroots, and, ideally, make them voluntary.
  10. Improve storytelling about women of all colours
  • Look for story angles that appeal to both women and men.
  • Use more micro angles in storytelling, including human stories to make macro stories relevant.
  • Increase news coverage of seven gender gaps (power, pay, safety, authority, confidence, health, and ageism), which are wider for people of colour.
  • Introduce inclusive storytelling and encompass the perspectives of different communities and audience groups about how a particular story affects them.
  11. Improve the portrayal of all women
  • Recognise that the portrayal of women of all colours in the news is an industry blind spot.
  • Start measuring the use of common gender or racial stereotypes in your news coverage.
  • Focus on expertise rather than identity when interviewing women contributors.
  • Conduct portrayal analyses to understand how news contributors are portrayed (a minimal starting point is sketched after this list).

News consumption and impact

  12. Reframe the case for change to include the business opportunity of serving women audiences
  • Develop a business plan for increasing revenue from women audiences whilst still retaining men’s readership.
  • Research and develop women-friendly news products and formats.
  • Track women’s engagement and consumption.
  • Track the impacts of any new strategy (on finance, brand, individuals, influencers, and decision-makers).
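
Several of the steps above call for measurement. As one minimal, hypothetical starting point, the Python sketch below counts the share of women among named sources quoted in a set of stories. Real portrayal analyses rely on manual coding or trained classifiers; the crude name lookup here is for illustration only.

```python
# A minimal representation audit with hypothetical stories and a
# hypothetical name-gender lookup (the lookup is the hard part in
# practice, and manual coding or classifiers are used instead).
from collections import Counter

GENDER_BY_NAME = {"Amina": "woman", "Grace": "woman",
                  "John": "man", "Peter": "man", "David": "man"}

stories = [
    {"headline": "Budget passes",     "quoted_sources": ["John", "Peter"]},
    {"headline": "Clinic opens",      "quoted_sources": ["Amina", "David"]},
    {"headline": "Floods hit region", "quoted_sources": ["Peter"]},
]

counts = Counter(
    GENDER_BY_NAME.get(name, "unknown")
    for story in stories
    for name in story["quoted_sources"]
)
total = sum(counts.values())
for gender, n in counts.most_common():
    print(f"{gender}: {n}/{total} ({n / total:.0%})")
```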

The business case

The report says gender equality in news makes business sense. It claims that if the “addressable gender consumption gap” in the global news industry is closed there could be a potential cumulative revenue opportunity of $43 billion between 2023 and 2027 and $83 billion between 2023 and 2032. See the section of the report covering business opportunities (part 4, chapter 2).

Outrage to Opportunity

The entire 282-page report Outrage to Opportunity is embedded below.

The report was produced by the international audience strategy consultancy AKAS, and published by Internews.

The gender parity in news checklist

This checklist appeared on pages 166 and 167 of the 2020 report by Luba Kassova, “The Missing Perspectives of Women in News”. It is republished here with the author’s permission.

Gender parity in news checklist, pages one and two – author Luba Kassova

