CITAP speaker examines algorithms

Why does YouTube seem to know just what you want to watch?
How can Google fill in your search questions before you even finish them?
How does Facebook decide which ads you’d like?

One word: algorithms.

They’re ubiquitous in our tech-drenched world, but do people understand how these computational sets of instructions influence the internet content we see, from cat videos to political ads?
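The phrase “computational sets of instructions” can be made concrete with a toy example. The Python sketch below is purely illustrative — none of the names or rules come from YouTube, Google or Facebook — but it shows the basic idea of a system deciding what to show you: candidate videos are ranked by how much their tags overlap with what a user has already watched.

```python
# A toy illustration (not any real platform's algorithm): rank items
# by how many of their tags overlap with a user's watch history.

def recommend(history_tags, candidates):
    """Return candidates ordered by tag overlap with the user's history."""
    def score(item):
        return len(set(item["tags"]) & set(history_tags))
    # Highest-scoring items come first -- this sort IS the "algorithm"
    # deciding which content the user sees at the top of the feed.
    return sorted(candidates, key=score, reverse=True)

videos = [
    {"title": "Cat compilation", "tags": ["cats", "funny"]},
    {"title": "Election explainer", "tags": ["politics", "news"]},
    {"title": "Kitten rescue", "tags": ["cats", "rescue"]},
]

# A user who has watched cat videos gets more cat videos first.
ranked = recommend(["cats", "funny"], videos)
```

Even this trivial rule reshapes what a user is exposed to — which is the point Hargittai’s research probes: real systems are vastly more complex, and their inner workings are rarely visible.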

Eszter Hargittai, professor and chair of Internet Use and Society at the Institute of Communication and Media Research of the University of Zurich, seeks answers to those questions in research focusing on the social and policy implications of digital media and how differences in people’s Web-use skills influence what they do online.

Hargittai gave a talk January 28 entitled “Algorithm Skills: What Are They and How Do We Measure Them?” at the UNC Hussman School of Journalism and Media.

Hargittai, who defines algorithm skills as “being aware that there’s a system at work deciding what to show you,” is the second speaker hosted by UNC’s Center for Information, Technology, and Public Life (CITAP).

Funded by the John S. and James L. Knight Foundation, the Center combines faculty from UNC Hussman, the UNC School of Information and Library Science and the Department of Communication. The Center is dedicated to researching, understanding and responding to the growing impact of the internet, social media and other forms of digital information sharing.

For Deen Freelon, a principal investigator with CITAP and an associate professor at Hussman, Hargittai’s talk fit perfectly with that mission.

“In a society saturated with digital technology, the natural question is what effect is this technology having?” said Freelon, whose research interests focus on political expression through digital media, as well as data science and computational methods for analyzing large digital datasets.

Understanding algorithms is key in understanding what happens online, Hargittai said.

“Algorithms are relevant everywhere. This process very much influences the information we’re exposed to,” Hargittai said during her speech to the standing-room-only crowd in Hussman’s Freedom Forum Conference Center.

However, studying algorithms is tricky, because the organizations that build them, such as private companies and governments, typically keep their inner workings secret, Hargittai said as she told the crowd about her study of people’s algorithm skills in the U.S. and five European countries.

The study surveyed participants’ understanding of algorithm function in typical internet activities like watching videos, using map applications, searching for products and services, and using voice assistants such as Alexa and Siri.

Among the study’s findings: women often rated their algorithm skills lower than they actually were, and participants from higher socioeconomic backgrounds tended to have stronger skills than those from lower socioeconomic backgrounds.

Attention to such societal inequalities in technology use has long run through Hargittai’s research, stemming from her sociology background. Hargittai received a Ph.D. in sociology from Princeton University with a dissertation entitled “How Wide a Web? Inequalities in Accessing Information Online.”

Sonoe Nakasone, an archivist at UNC’s Wilson Special Collections Library who attended Hargittai’s talk at Hussman, appreciated Hargittai’s sociological lens.

“Algorithms come up a lot in libraries, especially in thinking about access to information and equity issues,” Nakasone said.

Tripp Tuttle, a Ph.D. student at UNC’s School of Information and Library Science — also in attendance — appreciated learning about Hargittai’s research methods. “A talk like this gives you the inside story on the study methods that research publications often gloss over,” Tuttle said.

Learn more about Hargittai’s work and research.