
Stop talking about AI ethics. It’s time to talk about power.


Over her 20-year career, Crawford has grappled with the real-world consequences of large-scale data systems, machine learning, and artificial intelligence. In 2017, with Meredith Whittaker, she cofounded the AI Now Institute, the first research organization dedicated to studying the social implications of these technologies. She is also a professor at USC Annenberg in Los Angeles, the inaugural chair of AI and Justice at the École Normale Supérieure in Paris, and a senior principal researcher at Microsoft Research.

Five years ago, Crawford says, she was still working to get across the simple idea that data and AI are not neutral. Now the conversation has evolved, and AI ethics has grown into its own field. She hopes her book will help that field mature further.

I sat down with Crawford to discuss her book.

The following interview has been edited for length and clarity.

Why did you decide to write this book, and what does it mean to you?

Crawford: Many books written about artificial intelligence focus only on technical achievements. Some of them are about the great men of AI, but that has been about the extent of how we have grappled with what artificial intelligence really is.

I think this has produced a very skewed understanding of the field as purely technical and apolitical, and, as Stuart Russell and Peter Norvig put it in their book, as intelligent agents making the best possible decisions.

I wanted to do something very different: to understand how artificial intelligence is made in the widest sense. That means looking at the natural resources that drive it, the energy it consumes, the hidden labor all along the supply chain, and the vast amounts of data that are extracted from every platform and device we use each day.

In doing so, I wanted to open up an understanding of AI as neither artificial nor intelligent. It is the opposite of artificial. It comes from the most material parts of the Earth and from the bodies of working people, as well as from everything we make and say and photograph every day. Nor is it intelligent. I think there is a great original sin in the field, where people assumed that computers are somehow like human brains, and that if we just train them like children, they will slowly grow into superhuman beings.

That is something I think is really problematic: we have bought into the idea of intelligence when in fact we are just looking at forms of statistical analysis at scale, which have as many problems as the data they are given.

Was it obvious to you that this is how people should think about AI? Or was it a journey?

That has absolutely been a journey. I would say one of the things that changed it for me was back in 2016, when I started a project called “Anatomy of an AI System” with Vladan Joler. We met at a conference about voice-enabled AI, and we were trying to visualize what it takes to make an Amazon Echo work. What are its components? How does it extract data? What are the layers in the data pipeline?

We realized, well, to actually understand that, you need to understand where the components come from. Where were the chips made? Where are the mines? Where does the metal get smelted? Where are the logistical and supply chain paths?

Finally, how do we trace the end of life of these devices? How do we look at where the e-waste tips are located in places like Malaysia and Ghana and Pakistan? What we ended up with was a very time-consuming, two-year research project to trace that chain of materials from cradle to grave.

When you start looking at AI systems on that bigger scale, and over that longer time horizon, you move away from these narrow accounts of “AI fairness” and “ethics” to say: these are systems that produce lasting geomorphic changes to our planet, and that also amplify the forms of labor inequality we already have in the world.

That made me realize I had to expand beyond that one analysis of a single device, the Amazon Echo, and apply this kind of analysis to the entire industry. That to me was the big task, and it is why Atlas of AI took five years to write. There is such a need to see what these systems really cost us, because we so rarely do the work of understanding their true implications.

Another thing I would say has been a real inspiration is the growing field of scholars asking bigger questions about labor, data, and inequality. Here I am thinking of Ruha Benjamin, Safiya Noble, Mar Hicks, Julie Cohen, Meredith Broussard, Simone Browne, and the list goes on. I see this book as a contribution to that body of knowledge, bringing in perspectives that connect the environment, labor rights, and data protection.

You travel a great deal in this book. Almost every chapter begins with a look at a particular place. Why was that important to you?

It was a very conscious decision to ground the analysis of AI in specific places, to move away from the abstract “nowheres” of algorithmic space where so many debates about machine learning happen. And I hope it also shows that when we don’t do that, when we treat these systems as placeless and objective, that is itself a political choice, and it has consequences.

In terms of weaving the locations together, that is why I started thinking about the metaphor of an atlas, because atlases are unusual books. They are books you can open up to see the scale of an entire continent, or you can zoom in to look at a mountain range or a city. They give you these shifts in perspective and in scale.

There is a lovely line I use in the book from the scientist Ursula Franklin. She writes about how maps join together the known and the unknown in ways that produce collective insight. For me, it was about drawing on the knowledge I had, but also thinking about the actual locations where AI is being constructed from rocks and sand and oil.

What kind of response has the book received?

One of the things that surprised me about the early response was that people felt this kind of perspective was overdue. There is a moment of recognition that we need a different sort of conversation than the ones we have been having over the past few years.

We have spent too much time thinking about AI systems in narrow terms and always reaching for technical fixes. Now we have to contend with the environmental footprint of these systems. We have to contend with the very real forms of labor exploitation that go into building them.

And we are also beginning to see the toxic legacy of what happens when you scrape as much data off the internet as you can and simply call it ground truth. That extractive way of framing the world has produced so many harms, and as always, those harms are felt most of all by communities that were already marginalized and never experienced the benefits of these systems.

What do you hope people will do differently?

I hope it becomes harder to have these cul-de-sac conversations where terms like “ethics” and “AI for good” have been so completely drained of any actual meaning. I hope it pulls back the curtain and says: let’s look at who is actually running the levers of these systems. That means shifting away from focusing only on things like ethical principles and starting to talk about power.

How do we break out of those patterns?

If there has been a real trap in the tech sector over the last decade, it is that the theory of change has always centered engineering. It has always been, “If there is a problem, there is a tech fix for it.” And only recently are we starting to see that expand to, “Oh, well, if there is a problem, then regulation can fix it. Policymakers have a role.”

But I think we need to widen that even further. We must also ask: Where are the civil society groups, where are the activists, where are the advocates who are addressing issues of climate justice, labor rights, and data protection? How do we include them in these conversations? How do we include the communities that are affected?

In other words, how do we make this a far deeper democratic conversation about how these systems are already affecting the lives of billions of people in largely unaccountable ways that sit outside regulation and democratic oversight?

In that sense, the book tries to de-center technology and begin asking bigger questions: What sort of world do we want to live in?

What sort of world do you want to live in? What future do you dream of?

I want to see the groups that have been doing the hard work of addressing questions like climate justice and labor rights come together, and recognize that these previously quite separate fronts for social change and racial justice share concerns and common ground on which to coordinate and organize.

Because we are looking at a very short time horizon here. We are dealing with a planet that is already under severe strain. We are looking at a profound concentration of power in extraordinarily few hands. You would have to go back to the early days of the railways to see another industry so concentrated, and now you could say that tech has surpassed even that.

That is why we must fight for ways to pluralize our societies and build stronger forms of democratic accountability. And that is a collective problem. It is not a matter of individual choice. It is not as if we can just pick the more ethical technology off the shelf. We need to find ways to work together on these planet-scale challenges.

