Last week, the US House of Representatives grilled a number of major tech CEOs about the potential dangers of algorithmic bias and amplification. While the hearing quickly devolved into political theater, Democratic lawmakers did manage to take a brief look at how these recommendation systems can help spread online misinformation and extremist ideas. The problems and pitfalls posed by these systems are well known and well documented. So, what are we going to do about it?
“I think, to answer that question, something important has to happen: we need independent researchers to be able to audit these platforms and their systems,” Dr. Brandie Nonnecke, Director of the CITRIS Policy Lab at UC Berkeley, told Engadget. Tech companies “know that they have to be transparent about what is happening on their platforms, but I believe that, for that transparency to be legitimate, there needs to be a collaboration between the platforms and independent researchers who verify the work.”
That may prove easier said than done, unfortunately. “There’s a bit of a problem right now in that platforms are interpreting well-intentioned privacy laws like the GDPR and the California Consumer Privacy Act as reasons to withhold data from independent researchers, claiming they’re protecting privacy and security,” she said.
And even setting aside the fundamental black box issue, namely that “it may be impossible to tell how an AI that ingests massive amounts of data is making its decisions,” per Yavar Bathaee in the Harvard Journal of Law & Technology, the inner workings of these algorithms are often treated as protected trade secrets.
“AI that relies on machine-learning algorithms, such as deep neural networks, can be as difficult to understand as the human brain,” Bathaee continued. “There is no straightforward way to map out the decision-making process of these complex networks of artificial neurons.”
Take the COMPAS case from 2016 as an example. COMPAS is an algorithmic system designed to recommend sentencing lengths to judges based on a number of factors and variables relating to the defendant’s life and criminal history. In 2016, the AI suggested to a Wisconsin court judge that Eric L. Loomis be sent to prison for six years for “eluding an officer”... because reasons. Secret proprietary business reasons. Loomis subsequently sued the state, arguing that the opaque nature of the COMPAS AI’s decisions violated his constitutional due process rights since he could neither review nor challenge them. The Wisconsin Supreme Court eventually ruled against Loomis, holding that he would have received the same sentence even without the AI’s assistance.
But the algorithms that recommend Facebook Groups can be just as dangerous as the ones that recommend overlong prison sentences, especially given the extremism metastasizing across social media these days.
“Social media platforms use algorithms that shape what billions of people read, watch and think every day, but we know very little about how these systems operate and how they’re affecting our society,” Sen. Chris Coons (D-Del.) told Politico ahead of the hearing. “Increasingly, we’re hearing that these algorithms are amplifying misinformation, feeding political polarization and making us more distracted and isolated.”
While Facebook routinely publicizes its efforts to remove the pages of hate groups and disrupt their organizing on its platform, the company’s own internal reporting indicates that it hasn’t done nearly enough to stem the tide of extremist content on the site.
As journalist and Culture Warlords author Talia Lavin notes, Facebook’s platform has been a boon to hate groups looking to organize. “In the past, they were limited to paper magazines, ham radio or meetups where they had to gather in physical spaces and find ways to reach people who were interested in their message,” she told Engadget.
On Facebook, though, recommendation algorithms face no such limitations, except when the company intentionally hobbles them to head off further unrest, as it did in the run-up to the presidential election.
“Obviously, over the past five years, we’ve seen this dramatic uptick that I think is very much tied to social media, and I do think the algorithms matter,” Lavin said. “But they’re not the only driver here.”
Lavin points to recent testimony from Dr. Joan Donovan, Research Director at Harvard University’s Kennedy School of Government, noting the rapid collapse of local, independent news networks that coincided with the rise of social media giants like Facebook.
“You have this platform that is able to deliver misinformation to millions every day, alongside conspiracy theories and sensationalism,” she continued. “That scale has an enormous influence on where we are now.”
For an example of this, one need look no further than Facebook’s fumbling response to Stop the Steal, the online movement that sprang up after the election and allegedly helped organize the January 6th assault on the Capitol. As an internal review found, the company failed to recognize the threat or take appropriate action in response. Facebook’s moderation guidelines were heavily geared toward spotting inauthentic behaviors such as spamming and fake accounts, Lavin explained. “They didn’t have guidelines in place for authentic people engaging in harmful behaviors and coordinating harm under their own names.”
“Stop the Steal is a great example of months and months of escalation driven by social media,” she continued. “You had this lie spreading and inflaming people, and then these precursor events staged in a number of cities where there was violence against counter-protesters and passersby. You had people showing up armed, and in the same period you had anti-lockdown protests, also heavily armed, which allowed these disparate groups, from anti-government militias to white nationalists, to become more visible and connected to one another.”
Though Congress’s collective grasp of modern technology rivals that of a Rolodex, some of its members are determined to keep trying.
In late March, a pair of House Democrats, Reps. Anna Eshoo (CA-18) and Tom Malinowski (NJ-7), reintroduced the Protecting Americans from Dangerous Algorithms Act, which would “hold large social media platforms accountable for their algorithmic amplification of harmful, radicalizing content that leads to offline violence.”
“When social media companies amplify extreme and misleading content on their platforms, the consequences can be deadly, as we saw on January 6th. It’s time for Congress to step in and hold these platforms accountable,” Eshoo said in a press statement. “That’s why I’m proud to partner with Rep. Malinowski to narrowly amend Section 230 of the Communications Decency Act, the law that immunizes tech companies from liability for user-generated content, so that companies can be held liable if their algorithms amplify misinformation that leads to offline violence.”
Specifically, the Act would hold a social media company liable if its algorithm is used to “amplify or recommend content directly relevant to a case involving interference with civil rights (42 U.S.C. 1985); neglect to prevent interference with civil rights (42 U.S.C. 1986); and in cases involving acts of international terrorism (18 U.S.C. 2333).”
If enacted into law, the Act could serve as a powerful motivator for tech executives, but Dr. Nonnecke insists that more research into how these algorithms operate in the real world is needed before we go locking the barn door after the horses have already bolted. Such research would also help legislators craft more effective technology policy in the future.
“Having transparency and accountability benefits not only the public but the platforms themselves,” she argued. “If there’s research into what’s actually happening on their systems, that research can inform appropriate legislation, so platforms aren’t saddled with rules or regulations that wouldn’t actually address the problem.”
“There is precedent for this kind of collaboration: Social Science One, the partnership between Facebook and researchers,” Nonnecke said. “To address these issues, we need more of that: independent research to understand what’s actually happening on these platforms.”
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.