A group of tech companies has recently signed on to a plan for safer online spaces
But many of Oasis's plans remain, at best, aspirational. One example is the proposal to use machine learning to detect harassment and hate speech. As my colleague Karen Hao reported last year, AI models too often either give hate speech a chance to spread or overreach and flag legitimate content. Still, Wang defends Oasis's promotion of AI as a moderation tool. "AI is only as good as the data it gets," he says. "Platforms share different moderation practices, but all can work toward better accuracy, faster reaction, and prevention through safety by design."
The document itself is seven pages long and outlines the consortium's future goals. Much of it reads like a mission statement, and Wang says the first several months of work have focused on creating advisory groups to help set those goals.
Other elements of the plan, such as its certification system, remain abstract. Wang says he would like member companies to hire dedicated staff to understand and combat racial and ethnic discrimination. But the plan offers no other means of achieving that goal.
The consortium also expects member companies to share data on abusive users, which is important for identifying repeat offenders. Participating tech companies will partner with nonprofits, government agencies, and law enforcement to help shape safety policies, Wang says. He also plans for Oasis to have a law enforcement response team, whose job will be to notify police about harassment and abuse. But it remains unclear how that team's work with law enforcement will differ from the status quo.
Balancing privacy and safety
Despite the lack of concrete detail, the experts I spoke to think the consortium's standards document is a good first step, at least. "It's a good thing that Oasis is looking at self-regulation, starting with the people who know their systems and their limitations," says Brittan Heller, a lawyer specializing in technology and human rights.
It's not the first time tech companies have worked together in this way. In 2017, some agreed to freely exchange information with the Global Internet Forum to Combat Terrorism. Today, GIFCT remains independent, and the companies that sign on to it self-regulate.
Lucy Sparrow, a researcher at the School of Computing and Information Systems at the University of Melbourne, says what sets Oasis apart is that it offers companies something to work from, rather than waiting for them to come up with the language themselves or for a third party to do that work.
Sparrow adds that baking ethics into design from the start, as Oasis pushes for, is admirable, and that her research on multiplayer game systems shows it makes a difference. "Ethics tends to be pushed to the sidelines, but here, they [Oasis] encourage thinking about ethics from the beginning," she says.
But Heller argues that ethical design may not be enough. She suggests tech companies also revisit their terms of service, which have been widely criticized for taking advantage of consumers who lack legal expertise.
Sparrow agrees, saying she's skeptical that a group of tech companies will act in consumers' best interest. "It really raises two questions," she says. "One, how much do we trust profit-driven corporations to control safety? And two, how much control do we want tech companies to have over our virtual lives?"
It's a sticky situation, especially because users have a right to both safety and privacy, yet those needs can be in tension.