Epic Games’ battle royale video game Fortnite has disrupted the gaming industry in a major way, but a new voice reporting feature has raised privacy and ethical concerns among cybersecurity experts.
Since its 2017 launch, Fortnite has amassed over 500 million registered players and now averages around 221 million monthly active players.
Fortnite’s free-to-play, cross-platform release seven years ago made it unprecedentedly accessible to players. Combine that with colourful, bombastic art design and genuinely addictive shooting-and-building gameplay, and it has kept the world, and young people in particular, in a chokehold.
To date, Fortnite has generated revenue of over $26bn, of which over $6bn came in 2022 alone.
However, as with any online video game popular with young people, the onus has fallen on its publisher, Epic Games, to keep the platform as safe a space as possible, something that, despite its efforts, has not always been the case.
According to a 2023 ExpressVPN study, one in five children in the UK aged between four and 13 has experienced some form of harassment on Fortnite.
To combat this, Epic Games introduced a controversial voice reporting feature in Fortnite at the end of 2023. The security tool allows players to report another player’s last five minutes of voice chat, which is recorded continually on a rolling basis.
When a user reports a conversation, “the voice chat audio captured from the last five minutes will be uploaded with the report and sent to Epic moderators for review,” Epic says.
Any previous audio over five minutes old is automatically deleted as new audio is captured.
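To illustrate the data-retention model Epic describes, the following is a minimal, hypothetical Python sketch of such a rolling buffer: only the most recent five minutes of audio are kept on the device, older audio is discarded as new audio arrives, and only the retained window is packaged when a report is filed. The class and method names are assumptions for illustration, not Epic Games’ actual code.

```python
# Hypothetical sketch of a client-side rolling voice buffer. Illustrative only;
# this is not Epic Games' implementation.
import time
from collections import deque


class RollingVoiceBuffer:
    def __init__(self, window_seconds=300):
        self.window_seconds = window_seconds  # five-minute retention window
        self._chunks = deque()                # (timestamp, audio bytes) pairs

    def add_chunk(self, audio, now=None):
        """Append a new audio chunk and discard anything older than the window."""
        now = time.time() if now is None else now
        self._chunks.append((now, audio))
        cutoff = now - self.window_seconds
        while self._chunks and self._chunks[0][0] < cutoff:
            self._chunks.popleft()            # older audio is deleted as new audio arrives

    def snapshot_for_report(self):
        """Return only the retained five-minute window, e.g. to attach to a voice report."""
        return b"".join(audio for _, audio in self._chunks)
```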
The reporting feature is automatically enabled for all players under 18; minors who do not wish to have their voice chat audio captured must mute themselves or turn off voice chat completely.
Epic Games claims that, to protect privacy, Fortnite’s voice chat audio is captured securely on a user’s device rather than on Epic Games’ servers.
“Epic has no way of accessing any voice chat audio unless voice reporting is on and a participant submits a voice report,” according to the company.
However, the Fortnite-maker has come under fire from gaming and cybersecurity professionals who say the voice recording feature for minors opens up a myriad of privacy issues.
Lauren Hendry Parsons, a privacy advocate at ExpressVPN, told Verdict that voice recording minors, even on a five-minute basis, is not a reasonable solution to deter hate speech.
“Children may not fully comprehend the consequences of sharing personal information online, and taking this a step further, the idea that their ‘private’ conversations with their friends may be saved by a corporation at a moment’s notice with little visibility of how they’re used is very concerning,” Parsons tells Verdict.
“These decisions move gaming companies into incredibly murky territory, especially as many of their customers are under the age of 18,” Parsons says.
Epic Games says it will auto-delete clips after 14 days or “the duration of a sanction”. If a player appeals, the company notes, it will keep the clips for a further 14 days while a decision is made.
Epic Games also said clips will be kept for “as long as legally required” if necessary.
Vlad Susanu, founder of Game Clubz, believes that Epic Games’ ability to listen in on private conversations has raised a flurry of ethical questions.
“Epic is treading into murky waters when it comes to protecting minors and preventing unauthorised access to private conversations,” Susanu told Verdict. “While the intent may be solely to combat hate speech, the reality is that their AI system could inadvertently expose far more sensitive information.”
Susanu questions whether the reporting tool will have the power to detect conversations that go darker than harassment – such as an adult engaging in predatory behaviour with an underage player.
“That’s not something a game developer should tackle alone,” Susanu says.
The gaming industry and children’s privacy
Protecting children in the gaming industry has become more challenging as toxicity in games increases.
A 2023 Unity survey, focused primarily on the UK, South Korea and the US, found that the proportion of players facing toxic behaviour increased by 6% to 74% between 2021 and 2023.
“Gaming companies should ensure strict age-based settings to protect minors from this kind of behaviour,” Eren Cicyasvili, analyst at research company GlobalData, tells Verdict.
“For instance, on Minecraft: Java Edition, users can use a profanity filter to cover up harmful language. This filter cannot be turned off on child accounts. Roblox also has an age verification process that users must go through before using voice chat,” Cicyasvili added.
Cicyasvili says that game publishers will be increasingly scrutinised and compelled to comply with regulations in the coming years.
“Their failure to comply could attract hefty penalties and bans and cause them to lose players,” he says.
Lawmakers and regulators worldwide have taken different approaches to ensure the safety of children in video games.
Video games like Grand Theft Auto and PUBG Mobile, which include elements such as self-harm, violence, and vulgarity, have been banned in some countries.
In 2023, the UK gaming industry decided to limit children’s access to in-game loot boxes through a collection of guidelines.
Epic Games did not respond to Verdict‘s request for comment.