AI mediation tool could help narrow culture war rifts, say researchers


Artificial intelligence could help narrow some of the most contentious divisions in the culture war through a mediation process, researchers claim.

Experts say a system that can create group statements that reflect the views of the majority and minorities could help people find common ground.

Prof. Chris Summerfield, co-author of the study at the University of Oxford, who also works for Google DeepMind, said the AI tool could have multiple purposes.

“What I would like to see it do is to give political leaders in Britain a better idea of what people in Britain are really thinking,” he said, noting that surveys provide only limited insights, while forums known as citizens’ assemblies are often expensive, logistically challenging and limited in size.

Writing in the journal Science, Summerfield and colleagues from Google DeepMind describe how they built the ‘Habermas Machine’ – an AI system named after the German philosopher Jürgen Habermas.

The system works by taking written positions from individuals within a group and using them to generate a series of group statements that are acceptable to all. Group members can then rate these statements, a process that not only trains the system but can also select the statement with the greatest support.

Participants can also feed criticism of this initial group statement back to the Habermas Machine, resulting in a second collection of AI-generated statements that can be reordered, and a final revised text selected.
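
As a rough illustration of that two-round loop, the sketch below is a minimal Python mockup, not the researchers’ actual code: the drafting and ranking functions are hypothetical stand-ins for the language model that writes candidate statements and for participants’ ratings. It only shows how positions, ratings and critiques could flow through such a process.

# Minimal mockup of the two-round mediation loop described above.
# The drafting and ranking steps are placeholders: in the real system a
# language model drafts statements and participants rate them.

from typing import Callable, List

def mediate(positions: List[str],
            draft: Callable[[List[str], List[str]], List[str]],
            rank: Callable[[List[str]], List[float]]) -> str:
    """Draft -> rate -> critique -> redraft -> rate, returning the final statement."""
    # Round 1: generate candidate group statements from the written positions
    candidates = draft(positions, [])
    scores = rank(candidates)
    best = max(zip(scores, candidates))[1]

    # Round 2: feed critiques of the winning statement back in and redraft
    critiques = [f"Critique of: {best}"]  # placeholder for participants' feedback
    revised = draft(positions, critiques)
    scores = rank(revised)
    return max(zip(scores, revised))[1]

# Toy stand-ins so the sketch runs end to end
def toy_draft(positions: List[str], critiques: List[str]) -> List[str]:
    tag = "revised" if critiques else "initial"
    return [f"{tag} statement {i} covering {len(positions)} views" for i in range(3)]

def toy_rank(candidates: List[str]) -> List[float]:
    return [float(i) for i in range(len(candidates))]  # pretend the last draft scores best

print(mediate(["position A", "position B", "position C"], toy_draft, toy_rank))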

The team used the system in a series of experiments involving a total of more than 5,000 participants in Britain, many of whom were recruited through an online platform.

In each experiment, the researchers asked participants to respond to topics ranging from the role of monkeys in medical research to religious education in public schools.

In one experiment, involving approximately 75 groups of six participants, researchers found that the Habermas Machine’s initial group statement was preferred by participants over a group statement produced by human mediators 56% of the time. The AI-generated statements were also rated as higher quality, clearer and more informative, among other things.

Another set of experiments found that the full two-step process with the Habermas Machine increased the degree of agreement within groups relative to participants’ initial views before AI mediation began. Overall, the researchers found that agreement increased by an average of eight percentage points, equivalent to four in a hundred people changing their minds on an issue where opinions were initially evenly divided.
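
One way to read that equivalence, assuming “evenly divided” means a 50/50 split: if four out of every 100 people switch sides, the split moves from 50/50 to 54/46, opening up an eight-point gap between the two camps where there was none before.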

However, the researchers emphasize that participants did not always come off the fence or switch opinions to support the majority position.

The team found similar results when they used the Habermas Machine in a virtual citizens’ assembly in which 200 participants, representative of the British population, were asked to deliberate on questions related to topics ranging from Brexit to universal childcare.

The researchers say that further analysis looking at how the AI system numerically represents the texts it receives will shed light on how it generates group statements.

“What [the Habermas Machine] seems to do is broadly respect the opinion of the majority in each of our small groups, but try to write a piece of text that does not make the minority feel that they have no rights – so it more or less recognizes the opinion of the minority,” said Summerfield.

However, the Habermas Machine itself has proven controversial, with other researchers noting that the system does not help translate democratic considerations into policy.

Dr. Melanie Garson, a conflict resolution expert at UCL, added that while she was a tech optimist, one concern was that some minorities may be too small to influence such group statements but could still be disproportionately affected by the outcome.

She also noted that the Habermas Machine does not give participants the opportunity to explain their feelings, and thus to develop empathy with those who hold a different view.

Fundamentally, she said, when using technology, context is critical.

“[For example] How much value does this provide in the perception that mediation is more than just finding an agreement?” Garson said. “Sometimes, if it’s in the context of an ongoing relationship, it’s about learning behavior.”


