No-one likes a cheater, and no-one likes a jerk, either. Thankfully, the new Xbox Live "community-powered" reputation system will help you filter out unpleasant players and flag poor conduct, in theory creating a more positive online environment for the Xbox One as time goes on. Last month, we shared some details on the new 'learning system' that would replace the traditional five star player ratings, and now Michael Dunn, Program Manager for Xbox Live, has gone into a little more detail in a blog post on Xbox Wire as to how exactly that learning system will work.
"With the new community-powered reputation model for Xbox One," Dunn wrote, "we want to help you avoid the players you don't want to play with. If you don't want to play with cheats or jerks, you shouldn't have to. Our new reputation model helps expose people that aren't fun to be around and creates real consequences for trouble-makers that harass our good players.
"We are simplifying the mechanism for Xbox One - moving from a survey option to more direct feedback, including things like "block" or "mute player" actions into the feedback model," Dunn went on. "The new model will take all of the feedback from a player's online flow, put it in the system with a crazy algorithm we created and validated with an MSR PhD to make sure things are fair for everyone.
Ultimately, your reputation score will determine which category you are assigned: "Green = Good Player," "Yellow = Needs Improvement" or "Red = Avoid Me." Looking at someone's gamer card, you'll be able to see their reputation at a glance. And your reputation score is ultimately up to you.
"The more hours you play online without being a jerk, the better your reputation will be; similar to the more hours you drive without an accident, the better your driving record and insurance rates will be. Most players will have good reputations and be seen as a "Good Player." The algorithm is looking to identify players that are repeatedly disruptive on Xbox Live."
So effectively, cheaters and repeat online abusers will be shown a red card, and appropriate consequences will follow. But, Dunn says, players shouldn't worry about receiving a poor rating unfairly, as the algorithm is designed to take lots of different scenarios into account before downgrading someone's rating.
"We'll identify those players with a lower reputation score and in the worst cases they will earn the "Avoid Me" reputation," Dunn wrote. "Before a player ends up with the "Avoid Me" reputation level we will have sent many different alerts to the "Needs Improvement" player reminding them how their social gaming conduct is affecting lots of other gamers.
"The algorithm is sophisticated and won't penalize you for a few bad reports. Even good players might receive a few player feedback reports each month and that is OK. The algorithm weighs the data collected, so if a dozen people suddenly report a single user, the system will look at a variety of factors before docking their reputation.
"The system also looks at the reputation of the person reporting and the alleged offender, frequency of reports from a single user and a number of other factors."
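To make the factors Dunn describes a little more concrete, here is a minimal sketch of how a report-weighting scheme along those lines could work. Everything below is an illustrative assumption: the function names, weights, and thresholds are invented for this example and are not Microsoft's actual algorithm.

```python
# Hypothetical sketch of report weighting as Dunn describes it:
# trust reputable reporters more, and cap how much any single
# reporter can contribute. All names and numbers are assumptions.
from dataclasses import dataclass

@dataclass
class Report:
    reporter_id: str
    reporter_reputation: float  # 0.0 (bad) to 1.0 (good)

def weighted_report_score(reports: list[Report], max_per_reporter: int = 3) -> float:
    """Sum report weights, discounting low-reputation reporters and
    ignoring repeated reports from the same user past a cap."""
    per_reporter: dict[str, int] = {}
    score = 0.0
    for r in reports:
        count = per_reporter.get(r.reporter_id, 0)
        if count >= max_per_reporter:
            continue  # a single user spamming reports stops counting
        per_reporter[r.reporter_id] = count + 1
        score += r.reporter_reputation  # low-rep reporters weigh less
    return score

def category(score: float) -> str:
    """Map a cumulative report score to the three reputation tiers."""
    if score < 5.0:
        return "Good Player"        # green
    if score < 15.0:
        return "Needs Improvement"  # yellow
    return "Avoid Me"               # red
```

The key design point, per the article, is that a burst of reports from one source counts far less than steady reports from many reputable players, which is why a handful of bad reports per month leaves a "Good Player" rating untouched.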
Sounds pretty solid, but of course, for the system to improve and grow, players will need to be active with their feedback, reporting those that deserve it and raising flags where necessary.
What do you think of the new system?
Source: Rheena.com