The Humanities Can’t Save Big Tech From Itself
The problem with tech, many declare, is its quantitative bent, its “hard” math deployed in the soft human world. Tech is Mark Zuckerberg: famously reducing pretty girls to numbers, and now professing faith in the wonders of the metaverse while fumbling whenever genuinely human activity is required. The human world contains Zuck, but it is everything he fails to grasp. That failure of social and humanistic understanding, many believe, is shared by the companies he has come to stand for.
Because Big Tech fails to understand people, the argument goes, its workforces simply need more people who do understand them. Headlines like “Liberal arts majors are the future of the tech industry” and “Why computer science needs the humanities” have been a recurring feature of tech and business commentary over the past several years. Social workers and librarians, it is suggested, could help tech companies curb social media’s harm to Black youth and the spread of disinformation, respectively. Many anthropologists, sociologists, and philosophers, especially those with advanced degrees who feel the economic pressure of academia’s preference for STEM, are rushing to demonstrate their usefulness to tech firms whose starting salaries would make the average humanities professor blush.
I have been studying nontechnical workers in the tech and media industries for the past several years. Arguments to “bring in” sociocultural experts elide the fact that these roles and workers already exist in the tech industry and, in varied ways, always have. For example, many current UX researchers hold advanced degrees in sociology, anthropology, and library and information science. And EDI (Equity, Diversity, and Inclusion) educators and experts often occupy roles in tech companies’ HR departments.
Recently, the tech industry has been exploring where nontechnical expertise might address some of the societal challenges tied to its products. Increasingly, tech companies look to law and philosophy professors to help them navigate the legal and moral intricacies of platform governance, to activists and experts to help protect vulnerable users, and to other specialists to assist with platform problems like algorithmic oppression, disinformation, community management, user experience, and digital transformation. These data-driven industries are trying to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often call “soft” data.
But you can add all the soft-data workers you want, and little will change unless companies value that kind of data and expertise. In fact, many academics, policy experts, and other social scientists working in AI and tech have noticed a troubling habit: companies seek out their expertise, then disregard it in favor of more technical work and workers.
Such experiences illustrate the fraught moment in which the field of AI ethics has emerged, one in which tech companies may claim to incorporate nontechnical roles while actually adding social and cultural framings to job descriptions ultimately meant to be held by the “same old” technologists. More important, in our affection for often-undervalued “soft” disciplines, we should not overlook their limitations when it comes to achieving these stated goals.
As worthy as it is to champion the important work of undervalued and under-resourced disciplines, there is no reason to believe their members are inherently better equipped to arbitrate what is right. These people have real and important social and cultural expertise, but their fields are all reckoning with their own serious challenges and areas of weakness.
Take anthropology, a discipline that emerged as part of the Western colonial project. Although cultural anthropology now often aligns itself with social justice aims, there is no guarantee that an anthropologist (85 percent of whom are white in the US) would orient or deploy algorithms in a less biased way than, say, a computer scientist. Perhaps the most infamous example is PredPol, the multimillion-dollar predictive-policing company that Ruha Benjamin called part of The New Jim Code. PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.
Other disciplines promoted by those who push soft data are similarly fraught. Sociology’s early surveys and quantification of Black people played a role in today’s surveillance technologies, which overwhelmingly monitor Black communities. My own research field, critical internet studies, skews very white and has failed to center race and racism. Indeed, I am often one of the few Black and brown researchers in attendance at our conferences. There have been times I encountered more diversity at tech company events than in the academic spaces from which Big Tech’s foundational critiques emerge.