
Activision Blizzard partners with researchers to create AI against online abuse


Sign up for the GI Daily here to get the biggest news straight to your inbox


Activision Blizzard is collaborating with researchers to develop AI against online abuse.


The two-year project aims to detect harmful user behavior online.


Joining the gaming firm are Anima Anandkumar, senior director of AI research at Nvidia and Bren professor, and Michael Alvarez, professor of political and computational social science.


“We want to know how players interact. What kind of language do they use? What kinds of biases do they have? What should we be looking for? That requires domain expertise,” Anandkumar said.


Additionally, Activision Blizzard announced the actions it's taking to create safer online experiences.


The Call of Duty maker attributed the announcement to the Anti-Defamation League’s newest report. Since the survey’s release, the firm says that it’s been in talks with the ADL on how it’s combating player abuse.


The gaming giant also shared how it's protecting players online; some of the efforts are listed below:

  • A code of conduct for Call of Duty
  • An improved in-game reporting system
  • A warning system for players who exhibit disruptive behavior
  • A mandatory social contract for users who play World of Warcraft


“In no uncertain terms, disruptive behavior online is an unacceptable problem that transcends any single industry. No person should be meant to feel unwelcome, unsafe, or targeted when they’re online – especially because of who they are, what they believe, or their background,” Activision Blizzard said.


Activision Blizzard was one of the gaming firms that members of Congress questioned about how it was combating extremism.


