Fortnite Players Can Submit Audio Recordings to Report In-Game Harassment

Online video games have become toxic places for many players due to the growing problem of abusive text and voice chat. While text chats can easily be flagged for offensive content, voice chat is harder to police. Fortnite is making an effort to punish players who evade text-chat moderation by taking their abuse to voice chat instead.


Audio Evidence of Abusive Players

Despite its large base of younger players, Fortnite has long been a toxic environment, with players harassed or bullied by enemy teams and even their own teammates. Some abusers avoid suspensions or bans by speaking their abuse over voice chat instead of typing it.

Unfortunately for bullies, Epic Games has now implemented a way for players to report abusive voice chats as well. Fortnite's new voice reporting feature records voice chats in rolling five-minute segments, which serve as evidence when a player decides to report another player.

Players over 18 can toggle the option on and set it to "Always On" or "Off When Possible," the latter of which makes the feature inactive in Party Channels with friends, as reported by Engadget. Those under 18 don't have much of a choice.


Minors playing the battle royale game will have the feature on at all times. If they choose not to have their voice chat audio recorded, they can mute themselves or turn voice chat off entirely through the voice chat settings.

Parents can also manage their children's voice chat permissions by logging into the Epic Account Portal, where they can find Epic's Parental Controls. As for privacy, players can rest assured that the audio is not kept indefinitely.

Because each segment is overwritten by the next recording, players have only five minutes to report abusive audio before the previous recording is deleted. Once a clip is submitted in a report, it is kept for only 14 days or until the matter has been resolved.
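To make the rolling-window behavior concrete, here is a minimal Python sketch of the kind of buffer the article describes. The class name, method names, and in-memory design are illustrative assumptions, not Epic's actual implementation:

```python
from datetime import timedelta

# Hypothetical sketch (not Epic's code): only the most recent
# five-minute segment is kept, and a segment survives past the
# rolling window only if it is submitted with a report.

SEGMENT_LENGTH = timedelta(minutes=5)
REPORT_RETENTION = timedelta(days=14)

class VoiceEvidenceBuffer:
    def __init__(self):
        self.current_segment = None   # (start_time, audio_bytes)
        self.reported_clips = []      # [(submitted_at, audio_bytes)]

    def record_segment(self, audio_bytes, start_time):
        # Each new recording overwrites the previous segment,
        # mirroring the five-minute rolling window.
        self.current_segment = (start_time, audio_bytes)

    def submit_report(self, now):
        # A report can only capture what is still in the buffer;
        # audio older than the current segment is already gone.
        if self.current_segment is None:
            return False
        self.reported_clips.append((now, self.current_segment[1]))
        return True

    def purge_expired(self, now):
        # Reported clips are dropped after 14 days (the article notes
        # they may also be deleted earlier, once a case is resolved).
        self.reported_clips = [
            (submitted_at, clip)
            for submitted_at, clip in self.reported_clips
            if now - submitted_at < REPORT_RETENTION
        ]
```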

Those who report abusive players do so anonymously, meaning the reported player will not know who filed the report. If sanctioned after a report, the abusive player will receive an email at the address linked to their Epic account.

Read Also: The 7 Most Toxic Gaming Communities You Should Avoid

Call of Duty is Using AI to Moderate Voice Chats

Developers of online games are well aware of the offensive language thrown around during matches. Activision knows well enough that Call of Duty's player base also includes plenty of toxic players who use offensive language to bully others.

With that in mind, the franchise will moderate its voice chats using AI. Activision is partnering with Modulate to make the moderation possible. Aptly named ToxMod, the system can detect hate speech, discrimination, and harassment, as reported by The Verge.

Call of Duty: Modern Warfare II and Call of Duty: Warzone were the first titles to receive the moderation system, with other games in the franchise set to get it by November 10. Hopefully, the feature works as intended and will be applied to other online games as well.

Related: Call of Duty Starts Moderating Voice Chats Through AI

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
