Raheem Sterling questions social media companies' will to combat online racism

(Image: Manchester City v Arsenal, Premier League, Etihad Stadium. Image credit: Martin Rickett)

Raheem Sterling has questioned whether there is the will among social media companies to combat online abuse towards footballers after a new study highlighted the shocking extent of the problem.

More than 3,000 explicitly abusive messages were sent publicly via Twitter over the six-week Project Restart period to 44 high-profile players currently or formerly involved in English football, research has shown.

Forty-three per cent of the Premier League players in the study (13 out of 30) experienced targeted and explicitly racist abuse, while three players who called out racism during the period – Manchester City and England forward Sterling, Crystal Palace forward Wilfried Zaha and Wycombe’s Adebayo Akinfenwa – received 50 per cent of all the abuse.

Sterling said far more needed to be done to tackle the problem.

“I don’t know how many times I need to say this, but football and the social media platforms need to step up, show real leadership and take proper action in tackling online abuse,” he said.

“The technology is there to make a difference, but I’m increasingly questioning if there is the will.”

The study, which analysed a total of 825,515 messages sent to the players, was commissioned by the Professional Footballers’ Association Charity, carried out by data scientists at Signify and supported by anti-discrimination group Kick It Out.

It also found that 29 per cent of the abuse came in emoji form, which the study commissioners described as a “glaring oversight” in the algorithms used by Twitter to spot hateful content.

A Twitter spokesperson said: “Racist behaviour has no place on Twitter and we strongly condemn it. We continue to take action on any account that violates the Twitter Rules.

“We welcome people to freely express themselves on our service, however, as outlined in our Hateful Conduct Policy, account holders cannot promote violence against, threaten or harass other people on the basis of race, ethnicity or other protected groups.

“We have proactively engaged and continue to collaborate with our valued partners in football, to identify ways to tackle this issue collectively. We remain focused on proactively actioning hateful content – now more than 1 in 2 Tweets are identified and removed without reports.

“We want to reiterate that abusive and hateful conduct has no place on our service and we will continue to take swift action on the minority that try to undermine the conversation for the majority. We will continue to play our part in curbing this unacceptable behaviour — both online and offline.”


Earlier this month, it was confirmed that the social media giant was in partnership with Kick It Out over its ‘Take A Stand’ initiative. At that time, Twitter said it proactively removed more than one in two tweets deemed hateful from its platform.

Facebook’s vice-president for northern Europe, Steve Hatch, said that between April and June action had been taken against 22.5 million pieces of content and that 94.5 per cent of that was detected and removed proactively by Facebook, rather than by users reporting it.

Akinfenwa said: “As someone who has experienced online abuse first-hand and spoken to team-mates who have experienced the same, I can say that players don’t want warm words of comfort from football’s authorities and social media giants, we want action.

“The time for talking has passed, we now need action by those who can make a difference.”

The report authors say the data demonstrates the need for football’s stakeholders to work together on the funding of a system to proactively monitor online abuse using artificial intelligence.

It also calls for more work to be done to ensure there are “real world consequences” for online abusers such as prosecutions and stadium bans, and for more pressure to be brought to bear on social media companies to act proactively and strongly against abuse.