At LLYC, we believe that Corporate Affairs and Marketing can be used to promote social progress. That’s why we’re launching an initiative to promote gender equality and encourage more people to get involved in making it a reality.
Our “Out of focus” report looks at how news media and social media talk about gender-based violence. Unfortunately, we found that there is still a lot of work to be done to ensure that media coverage of gender-based violence follows international ethical standards. This is especially true on social media.
Along this path of purple tiles, the media play a major role in driving real social change. The information they provide can shape our thoughts and opinions and even affect social structures and systems.
This is why, on International Women’s Day (March 8), we are promoting an initiative to analyze how the language used in online media and social messages affects our ideas about machismo, gender roles, and stereotypes.
Can we help
change the situation?
“The Purple Check” was born from LLYC’s purpose to contribute to the media and the general public, aiming to take action to counter the findings of our “Out of focus” report.
It’s an artificial intelligence tool designed to detect biases in language by scrutinizing the words used in news headlines. The AI analyzes text for bias and suggests alternative ways to convey the same message, aiming to inform without perpetuating inequality. This effort seeks to refocus communication on its intended purpose.
We all harbor biases and habits, and until we can effectively change them, we are launching a tool to complement our report. It will be accessible to everyone and will raise awareness about ways to refocus gender-based violence news coverage. It is meant to send a message about the importance of an egalitarian perspective, whether you’re producing or consuming information.
How to use it?
- Enter the headline or text you want to analyze and click the purple check button.
- Wait a few seconds for the tool to analyze the words.
- Discover the result.
- If the tool detects bias, you will see recommendations of alternative ways to convey the same message, aiming to inform without perpetuating inequality.
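The Purple Check’s underlying AI model is not public, so as a rough illustration of the workflow above, here is a minimal rule-based sketch. Everything in it is hypothetical: the pattern list, the recommendations, and the `purple_check` function name are stand-ins, not the tool’s actual logic.

```python
import re

# Hypothetical stand-in for The Purple Check's AI model: a small set of
# regex patterns for wording that ethical guidelines on gender-based
# violence coverage commonly flag, each paired with a rewording suggestion.
BIAS_PATTERNS = {
    r"\bcrime of passion\b": "Name the act plainly (e.g. 'murder'); 'passion' excuses the aggressor.",
    r"\bjealous(y)?\b": "Avoid presenting jealousy as a motive that explains the violence.",
    r"\bwas (found )?dead\b": "Prefer active voice that names the aggressor when known.",
    r"\bdomestic dispute\b": "Call gender-based violence what it is, not a 'dispute'.",
}

def purple_check(headline: str) -> list[str]:
    """Return rewording recommendations; an empty list means no bias detected."""
    recommendations = []
    for pattern, advice in BIAS_PATTERNS.items():
        if re.search(pattern, headline, flags=re.IGNORECASE):
            recommendations.append(advice)
    return recommendations

# Example: a headline with passive voice and euphemistic framing
for tip in purple_check("Woman was found dead after domestic dispute"):
    print(tip)
```

The real tool replaces this keyword matching with a language model that understands context, but the contract is the same: text in, a list of bias findings and alternative phrasings out.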
Every year, the knowledge area team conducts studies in 12 countries, drawing on our professionals’ capabilities in Deep Learning and Data Analytics methodologies. For this report, we analyzed conversation data spanning over 226.2 million items, including 5.4 million news items related to gender-based violence and 14 million messages on X (formerly Twitter).
Our contribution to innovation lies in our unique AI model that aims to make societal issues more visible. We believe in designing tools that can raise awareness about social issues, such as gender violence. Our reports offer a more objective, balanced, and fair treatment of information that can be freely accessible to anyone.
DOWNLOAD THE FULL REPORT
Gender violence
is a feminized issue.
Detecting information bias
is everyone’s business
Media and social networks:
moving toward the right approach to gender-based violence.