The Statesman

The Student News Site of Stony Brook University

We have to teach computer science students about maintaining algorithmic neutrality

Binary code for computers. Tech companies like Google, Twitter or Facebook run on specific algorithms that influence what each individual sees online, such as advertisements on Facebook or the top hashtags on Twitter. CHRISTIAAN COLEN/FLICKR VIA CC BY SA 2.0

Rohit Panda is a freshman computer science major and a member of the SBU College Republicans.

When someone says they are studying computer science (CS), you don’t normally associate them with doing anything remotely political in the future. A stereotypical assumption is that people in the field have no real influence on society beyond writing code. Many computer science students share the same dream of landing a lucrative job at a big tech company like Google, Twitter or Facebook. However, these companies run on specific algorithms that influence what you see in your day-to-day life; for example, the advertisements you see on Facebook or the top hashtags, like #ImpeachmentHearings, on Twitter.

These companies have left-leaning ideologies that are palpable in their workplace cultures. At two leading tech companies, Google and Amazon, employees tend to support Democrats.

The pressure of “being liberal” can essentially coerce new employees into taking a specific stance. It is not limited to workplace banter about left-leaning topics; the real problem is that it creeps into their actual work, including the work of the people who design and create the algorithms. This is why it is imperative that we teach computer science students at Stony Brook not only how to create algorithms, but also how to detect bias in their code by fully auditing it.

The process of auditing an algorithm generally involves feeding it various forms of input, such as different search queries, and analyzing the output for biases. In May, the Computational Journalism Lab at Northwestern University undertook an algorithm audit of the “Top Stories” box on Google search.

At the end of their rigorous analysis, the study found that, overall, the top three news outlets accounted for a total of 23% of the impressions. Unsurprisingly, those three sources were CNN, the New York Times and the Washington Post, all known as primarily left-leaning outlets. Fox News was the only conservative outlet among the top 20 sources, accounting for just 3% of the total impressions.
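To make the idea of an impression audit concrete, here is a minimal sketch in Python. It assumes we already have, for each query, a list of the outlets shown in the “Top Stories” box; the audit_impressions function and the sample data are illustrative placeholders, not part of the Northwestern study or of Google’s actual systems.

```python
from collections import Counter

def audit_impressions(results_by_query):
    """Tally how often each outlet appears across all queries and
    return each outlet's share of total impressions.

    results_by_query maps a query string to a list of outlet names
    shown in its Top Stories box (one entry per impression)."""
    counts = Counter()
    for query, outlets in results_by_query.items():
        counts.update(outlets)
    total = sum(counts.values())
    return {outlet: n / total for outlet, n in counts.most_common()}

# Illustrative data only; a real audit would collect live search results.
sample = {
    "impeachment hearings": ["CNN", "New York Times", "Fox News"],
    "2020 election": ["Washington Post", "CNN", "New York Times"],
}

for outlet, share in audit_impressions(sample).items():
    print(f"{outlet}: {share:.0%} of impressions")
```

A real audit would repeat this over thousands of queries collected over time, which is essentially how a study can arrive at figures like the 23% share cited above.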

Knowing that search results greatly influence how people think about elections, it is honestly despicable that it took an outside institution to expose the inherent bias in the results of one of the top search engines.

Google describes its “rigorous process” for testing search algorithms as focusing mostly on the relevance of search results and on advertising in its numerous experiments, rather than on the underlying social impact.

Software engineers at tech companies like Google should have to do algorithm audits that reflect a level playing field as part of their algorithm design process.

It is not surprising that the only real consideration of algorithmic bias in current CS ethics classes seems to be in relation to gender and race, not ideology. SBU’s most relevant course, CSE 312: Legal, Social, and Ethical Issues in Information Systems, does not even cover viewpoint discrimination as a major topic. The well-known algorithm design class, CSE 373: Analysis of Algorithms, does not touch on it either.

I believe algorithmic practices that affect an entire generation should be based solely on academic standards taught in classrooms worldwide, rather than defined by a large corporation trying to score political points.

The true aim of these algorithms should be maintaining complete neutrality. A purely conservative-dominated feed would be disastrous for our country as well. I myself always try to get a sense of all sides of a major news story. For example, if I’m reading an article about the Mueller investigation on the Daily Caller, a very conservative news source, I will try to find a matching article on that same story on a site like Vox, which is left-leaning in its views.

This helps me see not only what all kinds of people think, but also whether the truth is being distorted on either side. I shouldn’t have to put in extra effort to get this balanced perspective, though; it should already be provided in our searches.

One might point out that software engineers at Google and Twitter should be free to introduce bias in this way precisely because they work for private, non-governmental companies. The idea that these companies can do what they want is something many conservatives like me believe is a fundamental right. But what often goes undiscussed is Section 230 of the Communications Decency Act, which treats companies like these as platforms and therefore shields them from liability for the content their employees and users produce. Because their employees support one viewpoint, however, they should be forced to take on the role of publishers, like newspapers, and give up that liability protection.

It is clear to me that the root of the problem of large tech companies imposing their biases on the public lies in the first-hand coding experiences of these employees. Universities have to revamp the curriculum of CS ethics and algorithms classes to include the forgotten idea that people can simply hold a worldview contrary to popular belief. We have to understand that software engineers are not just “employees.” They have the power to influence who America chooses to be president.

Comments (2)

Navkiran, May 15, 2020 at 1:33 am

I completely agree that universities should discuss algorithmic neutrality in their curricula. The tech industry must also take responsibility by doing the algorithm audits to test for neutrality that you mentioned. We should all discuss algorithmic neutrality more, because the ramifications of the work done in engineers’ cubicles clearly extend far beyond them into the real world. Thank you for sparking this discussion!

We Binod Chandra Padhi, Feb 18, 2020 at 12:00 am

It is a really thought-provoking article from a young brain, worthy of praise and recognition. Impartiality should be the foundation stone of education, without any intentional bias. The proposed audit must be accepted by the authorities that decide the curriculum. My heartfelt thanks to the author, Mr. Rohit Panda, and I wish him a bright future in building a prosperous USA and world welfare.
