
SDA 270: Data, Ethics and Society

Data, Ethics and Society introduces students in the new Social Data Analytics Minor program to the ethical concerns surrounding gathering, analyzing and interpreting social data sets.

September 01, 2020

Analyzing data is only one of the skills required to generate meaningful results from social data. To gather data ethically and deploy it without causing harm, social data analysts also need to be aware of current ethics and privacy issues surrounding the collection, analysis and interpretation of big data sets.

To equip students to tackle these concerns proactively, the new Social Data Analytics Minor program includes SDA 270: Data, Ethics and Society as one of its core courses. SDA 270 introduces students to the ethical, legal and privacy issues surrounding the collection and use of big data. It also illustrates how failing to attend to these issues can harm vulnerable populations.

But why should students working with data care about ethics? Surely, it’s all about gathering enough data and conducting a robust analysis with a decent p-value?

Privacy and data analytics

Speaking with Department of Philosophy professor Chelsea Rosenthal, it’s clear that ethics, and privacy issues in particular, are essential considerations when dealing with social data analytics.

“We need to ask ourselves,” says Rosenthal, explaining the overall emphasis behind SDA 270, “does our analysis focus on what’s going to help us achieve valuable goals, and are we achieving those goals in a way that overlooks something of ethical importance?”

Rosenthal will be examining data, ethics and society. Among other topics, she’ll be looking at privacy: why it’s valuable and why it matters. Covering informed consent and disclosure, this part of the course looks at how safeguarding individual privacy interacts with innovation. She will also be asking students to consider what is accomplished by protecting privacy in social data analytics.

“Traditionally, ethical research requires that participants give informed consent for the use of their data, but with large-scale, population-level data, meaningful, informed consent may not be possible – and some scholars argue that it’s also not enough to prevent abuse. If that’s the case, what does adequately protecting privacy look like?” she asks.

Rosenthal will also be covering algorithmic bias and transparency, asking students to think about whether algorithms are as objective as they are intended to be. Alongside privacy, bias and discrimination are important concerns: they can be introduced, often without intent, as technology is developed. Without transparency, ethical oversight, and a sense of which issues are important to address, algorithms can reflect and perpetuate existing biases.
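
To make that worry concrete, here is a minimal, hypothetical sketch – with invented data, not course material – of how a resume-screening score trained on biased historical hiring decisions reproduces that bias, much like the Amazon system described later in this article:

    from collections import defaultdict

    # Invented historical hiring records: (resume keywords, was the candidate hired?)
    # Past hiring here favoured a gender-coded keyword, as real-world audits have found.
    historical = [
        ({"python", "mens_chess_club"}, True),
        ({"python", "womens_debate_team"}, False),
        ({"java", "mens_chess_club"}, True),
        ({"java", "womens_debate_team"}, False),
    ]

    # "Training": record how often each keyword co-occurs with a hire.
    counts = defaultdict(lambda: [0, 0])  # keyword -> [hires, appearances]
    for keywords, hired in historical:
        for kw in keywords:
            counts[kw][1] += 1
            counts[kw][0] += hired

    def score(resume):
        """Score a resume by the average historical hire rate of its keywords."""
        rates = [counts[kw][0] / counts[kw][1] for kw in resume if counts[kw][1]]
        return sum(rates) / len(rates) if rates else 0.0

    # Two candidates with identical skills differ only in a gender-coded keyword,
    # yet receive different scores: the past bias has become the model.
    print(score({"python", "mens_chess_club"}))     # 0.75
    print(score({"python", "womens_debate_team"}))  # 0.25

No one wrote “prefer men” anywhere in this code; the discrimination is inherited silently from the training data, which is exactly why transparency and ethical oversight matter.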

What matters?

She’ll also be challenging students to ask if what’s being measured is actually what matters.

“Is GDP really telling us what matters in an economy?” she suggests. “And are Quality Adjusted Life Years such a good measure for health outcomes?”
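
For context, a Quality Adjusted Life Year weights each year of life by a health-related quality score between 0 and 1, where 1.0 represents full health. A quick illustration, with numbers invented purely for this example:

    # Standard QALY formula: years of life x quality-of-life weight (1.0 = full health).
    years = 10
    quality_weight = 0.7   # e.g., a chronic condition rated at 70% of full health
    qalys = years * quality_weight
    print(qalys)  # 7.0 -- ten years at weight 0.7 "count" as seven healthy years

Rosenthal’s question is whether a single number like this captures everything that matters about a health outcome.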

SDA 270 – not the answers, but a toolbox

Rosenthal is quick to point out that SDA 270 is not about handing students solutions. Instead of answers, students should expect to finish the course with what she describes as a conceptual toolbox for ethically responsible social data analytics. By studying ongoing conversations in the field, students will not only understand the ethical worries being raised but also know how to approach them.

“I won’t be giving the answers,” says Rosenthal, “but I will be giving the tools so students will be able to reach their own ethical judgements.”

Why take SDA 270?

Human oversight and attention to ethical issues help ensure fair play and equity whenever social data sets are collected and wherever they’re implemented.

“That is what’s most chilling: many of us may not know that there’s been an algorithm that has been involved in the decision. You saw in the case of Amazon where they were actually trying to do something positive by eliminating bias and creating a system that sorted through resumes, and unknowingly, the algorithm picked up on past biases and discriminated against all women who applied for the job.

“These systems are invisible, opaque, and disseminated and deployed at massive scale. That makes it so dangerous, and like climate change, it requires an understanding of science for us to bridge this massive gap.”
Interview with Shalini Kantayya, https://www.inverse.com/innovation/shalini-kantayya

Bias and Discrimination

If data collection is biased, intentionally or not, the outcome for some demographics is invisibility or worse.

Poet of Code Joy Buolamwini shares "AI, Ain't I A Woman", a spoken word piece that highlights the ways in which artificial intelligence can misinterpret the images of iconic black women: Oprah, Serena Williams, Michelle Obama, Sojourner Truth, Ida B. Wells, and Shirley Chisholm.

Invisibility

“I have a confession. Sometimes, with no one in sight, I code in a white mask…”

The Coded Gaze: Unmasking Algorithmic Bias by Joy Buolamwini. Debuted at the Museum of Fine Arts Boston, The Coded Gaze mini documentary follows the Poet of Code’s personal frustrations with facial recognition software and the need for more inclusive code.

Representation

If vital data are missing, then entire demographics go missing.

This became apparent in 2012 as humanitarian agencies started collecting hashtag-based data sets around Hurricane Sandy. Researchers at the University of Pennsylvania noticed that people were left out of the data sets, and their observations were thus not visible to, for example, the U.N. Office for the Coordination of Humanitarian Affairs.

Missing representation also affects more mundane daily activities, such as being able to operate an automatic soap dispenser.

Philosophy Courses for the SDA Minor

Core course: SDA 270 – Data, Ethics and Society – This course introduces students to the ethical, legal, and privacy issues surrounding the collection and use of big data and the implications of these for vulnerable populations.

Elective course: PHIL 315 – Formal Methods in Philosophy (3) – A survey of formal methods used in philosophy. Topics will include some of the following: propositional logic, predicate logic, formal syntax, formal semantics, the probability calculus, decision theory, game theory and formal causal modeling. Prerequisite: one of PHIL 110, 210, 310, 314, MACM 101, BUEC 232 or STAT 270. Students with credit for COGS 315 cannot take this course for further credit.

“As a philosophy major, I found the background in logic and formal methods that I gained in this course to be super helpful for understanding and formulating arguments in my other philosophy classes. I also especially enjoyed learning about modal propositional logic and tense logic.

I imagine this class would also be great for data analytics, particularly because it looks at set theory, probability, decision theory, and game theory.

We also took a look at vagueness, which could be applied to many different areas.

We were given three exams (not cumulative) and had the option to substitute a paper for either the second or third exam. This flexibility allowed us some leeway to focus on what was most interesting to us. We were given problem sets regularly, as well as a practice exam before each exam, which we had the opportunity to go over in class. We weren’t required to hand them in, but they were a huge help in understanding the material.

This is one class where you learn by doing. It’s also really fun to apply all the skills you’ve learned and see yourself making progress. The class provided a good overview of various areas within formal methods and is much more concrete than other philosophy classes.

Overall, I found the class to be rigorous, but well worth it.”

Student testimonial for PHIL 315

Additional Resources

Read more about ethics and social data analytics in our Flipboard curation.

 


Privacy Matters

“Robert Williams spent over a day in custody in January after face recognition software matched his driver’s license photo to surveillance video of someone shoplifting, the American Civil Liberties Union of Michigan (ACLU) said in the complaint. In a video shared by ACLU, Williams says officers released him after acknowledging “the computer” must have been wrong.

“Government documents seen by Reuters show the match to Williams came from Michigan State Police’s digital image analysis section, which has been using a face matching service from Rank One Computing.”

NBC News, June 2020

The Big Three: Ethics in Data Analysis

Rosenthal summarizes the ethics of social data analysis by focusing on major concerns within the field. These include:

  • Ethics of gathering data: For example, how do we balance respecting privacy with gathering sufficient information – two goals that sometimes compete?
  • Ethics of using data: For example, what is the risk that biased or discriminatory algorithms produce results that are then applied downstream?
  • Ethics of deciding what to measure: What should we be measuring? With all the big data around, especially social data, what do the data tell us and what do they not tell us?
What does a philosopher say about ethics and SDA? SFU Philosophy alum Katherine Creel spoke at our launch event, covering:

  • automation and algorithms
  • should we have made this system?
  • bias and facial recognition
  • ethics and design decisions
