No More Jerks With The Xbox One’s Reputation System


Michael Dunn, program manager of Xbox Live, discussed the Xbox One’s reputation system in a post on Xbox Wire. The new community-powered reputation model for Xbox One helps users avoid the players they do not want to play with: if a user does not want to play with cheaters or jerks, they do not have to. The new model helps expose people who aren’t fun to be around and creates real consequences for troublemakers who harass other players.

This is done by simplifying the feedback mechanism on Xbox One – moving from a survey to more direct signals, folding actions like “block” and “mute player” into the feedback model. The new model takes all of the feedback from a player’s online sessions and runs it through an algorithm that was created and validated with Microsoft Research PhDs to make sure things are fair for everyone.

Ultimately, a reputation score will determine which category a user is assigned: “Green = Good Player,” “Yellow = Needs Improvement” or “Red = Avoid Me.” Reputation can easily be seen by looking at someone’s gamer card, and the score is ultimately up to the individual user: the more hours a user plays online without being a jerk, the better their reputation will be.

Most players will have good reputations and be seen as a “Good Player.” The algorithm is looking to identify players who are repeatedly disruptive on Xbox Live. Before a player ends up at the “Avoid Me” reputation level, Microsoft will have sent many different alerts to the “Needs Improvement” player reminding them how their social gaming conduct is affecting other gamers.

The algorithm is sophisticated: it will not penalize users for a few bad reports, and it helps prevent the system from being gamed. Even good players might receive a few player feedback reports each month, and that is OK. The algorithm weighs the data it collects, so if a dozen people suddenly report a single user, the system looks at a variety of factors before docking their reputation. The Xbox Live team will verify whether those people actually played an online game with the person reported – if not, their combined feedback will matter less than that of a single person who spent 15 minutes playing with the reported player. The system also looks at the reputation of both the reporter and the alleged offender, the frequency of reports from a single user and a number of other factors.
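Microsoft has not published the actual algorithm, but the factors described above suggest a weighted-report scheme. The sketch below is purely illustrative – every function name, factor and weight is an assumption, not the real system – and only shows how shared playtime, reporter reputation and report frequency could combine so that a dozen drive-by reports count for less than one credible report.

```python
# Hypothetical sketch of weighted report scoring. Microsoft has not
# published the real algorithm; every name and weight here is an
# illustrative assumption, not the actual Xbox Live system.

def report_weight(reporter_reputation, shared_play_minutes, reports_filed_recently):
    """Weight a single feedback report by the factors the article lists."""
    # Reports from people who never played with the target carry almost no weight.
    if shared_play_minutes <= 0:
        return 0.02
    # Longer shared sessions make the report more credible (capped at 1.0).
    play_factor = min(shared_play_minutes / 15.0, 1.0)
    # A reporter with a poor reputation counts for less.
    reputation_factor = {"green": 1.0, "yellow": 0.6, "red": 0.3}[reporter_reputation]
    # Heavy reporters (possible spammers) are discounted.
    spam_factor = 1.0 / (1.0 + reports_filed_recently)
    return play_factor * reputation_factor * spam_factor

# A dozen sudden reports from users who never played with the target...
drive_by = sum(report_weight("green", 0, 1) for _ in range(12))
# ...versus one report from someone who actually played 15 minutes.
genuine = report_weight("green", 15, 1)
print(drive_by < genuine)  # the single genuine report outweighs all twelve
```

Under these made-up weights, the twelve zero-playtime reports total 0.24 while the single 15-minute report scores 0.5, matching the behavior the article describes.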

The system is also dynamic and will continue to evolve and get better as Microsoft tracks feedback from players and titles, and adds more consequences for the jerks. In addition, Smart Match technology will help players find well-matched opponents and give them more control. Because the system depends on community feedback, all players need to do is report those who are abusive, cheating or causing mayhem, and their reputations will reflect it.

Source: Xbox Wire



About Author

Suril is a scientist, journalist and obsessive Microsoft observer. He holds an advanced degree in Biotechnology with minors in Biochemistry, Microbiology, and Molecular Biology. Send him tips on twitter: http://www.twitter.com/surilamin

  • Name

    Aren’t they concerned with some jerks/hackers developing tools to use automation to ruin someone’s reputation, such as large numbers of “block”s or “mute”s generated?

    • DustiiWolf

If you read, the algorithm looks into that. So if there’s 500 blocks from one central user, or 500 blocks from 500 users created 5 seconds ago, it will take that into account and ignore them. It basically, from my understanding, makes sure the user is real, has played with the character, and isn’t spamming

    • BIAS

      Read again.