First content moderation transparency report released by Microsoft Xbox


  • Xbox cracked down on more than 4.3 million fraudulent accounts between January and June, a ninefold increase in its proactive moderation over the same period last year. These unauthenticated accounts are typically automated or bot accounts that spam players to trick or harass them, promote cheating, inflate friend and follower counts, or launch distributed denial-of-service (DDoS) attacks.

Dave McCarthy, corporate vice president of Xbox player services, said the increase in proactive moderation is an effort to remove accounts before they enter the system. Proactive enforcement, which accounts for 65% of total enforcement actions, involves computer programs or human moderators finding, examining, and acting on conduct that violates Xbox’s community rules.

Microsoft also relies on users to report offensive content, a practice known as reactive moderation. Fake accounts can come from sources other than bots advertising game cheats. “State actors and other financed entities regularly engage in activities to propagate content that is not appropriate for our service,” McCarthy said.

Xbox joins a growing number of game and gaming-service providers that publish regular transparency reports to combat abuse and toxicity and give players a safer experience. Twitch, the live-streaming gaming platform owned by Amazon.com Inc., released its first report in early 2021, while Discord released its first in 2019. No regulation currently requires such disclosures. Microsoft said a report will be published every six months.

Microsoft’s 2021 acquisition of content moderation provider Two Hat, known for its text-filtering software, allowed the company to boost proactive moderation on the platform. McCarthy also pointed to Microsoft’s wider resources, outlined in the company’s biannual digital trust report, which enable it to “pull more products out of Microsoft Research utilizing video and image detection.”

McCarthy declined to comment on the number of Xbox’s human moderators or their employment status. Discord and Twitch disclose far more about their moderation activities, such as the number of subpoenas handled and details on extreme content. Xbox is “finding our way into what a decent transparency report looks like for us,” McCarthy said.

Unlike Riot Games, the Tencent Holdings Ltd.-owned maker of the popular video games League of Legends and Valorant, Xbox does not capture or examine player voice audio when a harassment claim is made. McCarthy, who also raised privacy concerns, said Xbox may invest more resources in this area in the future.

In the first six months of 2022, Xbox took action against more than 1 million accounts for profanity, 814,000 for adult sexual content, and 759,000 for harassment or bullying. Xbox players submitted more than 33 million reports in the same period, down 36% from the same period last year.
