At LLYC, we believe that Corporate Affairs and Marketing can be used to promote social progress. That’s why we’re launching an initiative to promote gender equality and encourage more people to get involved in making it a reality.

Our “Out of focus” report looks at how news media and social media talk about gender-based violence. Unfortunately, we found that there is still a lot of work to be done to ensure that media coverage of gender-based violence follows international ethical standards. This is especially true on social media.

On this path of purple tiles, the media play a major role in driving real social change: they provide information that can shape our thoughts and opinions and even influence social structures and systems.

This is why, on International Women’s Day (March 8), we are promoting an initiative to analyze how the language used in online media and social messages affects our ideas about machismo, gender roles, and stereotypes.

Can we help change the situation?

“The Purple Check” was born from LLYC’s purpose to serve the media and the general public, taking action on the findings of our “Out of focus” report.

It’s an artificial intelligence tool designed to detect bias in language by scrutinizing the words used in news headlines. The AI flags biased wording and suggests alternative ways to convey the same message, aiming to inform without perpetuating inequality. This effort seeks to refocus communication on its intended purpose.
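
The tool’s internal implementation is not public, but the approach described above (flag biased wording in a headline and propose a more neutral rewrite) can be sketched with a general-purpose LLM. The snippet below is a minimal illustration only: the OpenAI client, the model name, the prompt wording, and the JSON output format are our own assumptions, not The Purple Check’s actual configuration.

```python
# Minimal sketch of headline bias checking with an LLM.
# Assumptions: OpenAI Python SDK (v1+), an API key in OPENAI_API_KEY,
# and a GPT-4 class model. Prompt and JSON schema are illustrative only.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You review news headlines about gender-based violence. "
    "Flag wording that blames the victim, excuses the offender, or uses "
    "sensationalist language, and propose a neutral rewrite. "
    'Reply as JSON: {"biased": bool, "issues": [...], "rewrite": "..."}'
)

def check_headline(headline: str) -> dict:
    """Ask the model whether a headline shows bias and how to rephrase it."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; the report only mentions "GPT-4"
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": headline},
        ],
        response_format={"type": "json_object"},
        temperature=0,
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    # Example input chosen for illustration; "jealousy" framing is a classic bias.
    result = check_headline("Jealous man kills his wife after an argument")
    print(result["biased"], result["issues"], result["rewrite"])
```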

We all harbor biases and ingrained habits, and changing them takes time. That is why we are launching a tool to complement our report: one that is accessible to everyone and raises awareness about ways to refocus coverage of gender-based violence. It is meant to send a message about the importance of an egalitarian perspective, whether you are producing or consuming information.


How to use it?

  1. Enter the headline or text you want to analyze and click the purple check button.
  2. Wait a few seconds for the tool to analyze the words.
  3. Discover the result.
  4. If the tool detects bias, you will see suggested alternative ways to convey the same message, informing without perpetuating inequality (see the sketch below for how this flow might look programmatically).
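
The steps above describe the web interface. For illustration only, here is how the same submit-and-review flow might look if the checker were exposed as an HTTP API; the endpoint URL, request fields, and response fields are hypothetical, as LLYC has not published an API specification for the tool.

```python
# Hypothetical client for a Purple Check style HTTP endpoint.
# URL and field names are assumptions for illustration, not a documented API.
import requests

API_URL = "https://example.com/purple-check/analyze"  # hypothetical endpoint

def analyze_text(text: str) -> None:
    # Steps 1-2: submit the headline or text and wait for the analysis.
    response = requests.post(API_URL, json={"text": text}, timeout=30)
    response.raise_for_status()
    result = response.json()

    # Step 3: discover the result.
    if result.get("bias_detected"):
        # Step 4: show the suggested, more egalitarian rewrite.
        print("Bias detected:", result.get("explanation"))
        print("Suggested rewrite:", result.get("suggestion"))
    else:
        print("No bias detected in the submitted text.")

analyze_text("Headline or text you want to analyze")
```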

There is still work to be done to ensure that media coverage of gender-based violence complies with international guidelines.

Therefore, every year our knowledge team conducts studies in 12 countries, drawing on our professionals’ expertise in Deep Learning and Data Analytics methodologies. In this report, we analyzed data from a conversation spanning over 226.2 million news reports: 5.4 million news items related to gender-based violence and 14 million messages on X (formerly Twitter).

Large Language Models (LLMs, such as GPT-4) were used to identify and isolate victim and offender descriptors in news reports collected through web scrapers, while Natural Language Processing (NLP) techniques in four languages were applied to analyze compliance with 21 best-practice rules from the Mediterranean Network of Regulatory Authorities (MNRA) and the United Nations Development Programme (UNDP) guidelines.
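
As a rough illustration of the compliance step, the sketch below checks a headline against two simplified example rules. These rules are our own stand-ins for illustration, not the actual 21 MNRA/UNDP criteria, and the real pipeline combines LLM-based descriptor extraction with multilingual NLP rather than simple keyword matching.

```python
# Illustrative sketch of a rule-compliance check on a headline.
# The two rules below are simplified examples, not the 21 MNRA/UNDP criteria
# used in the report.
import re
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    pattern: re.Pattern
    advice: str

EXAMPLE_RULES = [
    Rule(
        name="Avoid 'crime of passion' framing",
        pattern=re.compile(r"\b(jealous|crime of passion|love triangle)\b", re.I),
        advice="Name the act as gender-based violence instead of a romantic motive.",
    ),
    Rule(
        name="Do not publish identifying details about the victim",
        pattern=re.compile(r"\b(home address|full name of the victim)\b", re.I),
        advice="Omit details that could identify or expose the victim.",
    ),
]

def check_compliance(headline: str) -> list[str]:
    """Return the advice for every example rule the headline appears to break."""
    return [rule.advice for rule in EXAMPLE_RULES if rule.pattern.search(headline)]

print(check_compliance("Jealous ex attacks woman in a love triangle gone wrong"))
```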

Our contribution to innovation lies in a unique AI model that aims to make societal issues more visible. We believe in designing tools that raise awareness about social issues such as gender-based violence. Our reports offer a more objective, balanced, and fair treatment of information and are freely accessible to anyone.

DOWNLOAD THE FULL REPORT

You can also learn how we did it.
Raising awareness and sharing information about the fight against gender-based violence is a human rights issue. Only if we all detect and call out the bias in news reports can we make language a tool for change.

Media and social networks: heading toward the right approach to gender-based violence.