SDA 270: Data, Ethics and Society

Data, Ethics and Society introduces students in the new Social Data Analytics Minor program to the ethical concerns surrounding gathering, analyzing and interpreting social data sets.

September 01, 2020

Analyzing data is only one of the skills required to generate meaningful results from social data. To gather data ethically and deploy it without causing harm, social data analysts need to be aware of current ethics and privacy issues surrounding the collection, analysis and interpretation of big data sets.

To fully equip students to tackle these concerns in a proactive manner, SDA 270 (Data, Ethics and Society) is one of the core courses in the new Social Data Analytics Minor program. It introduces students to the ethical, legal and privacy issues surrounding the collection and use of big data, and illustrates how failing to attend to these issues can harm vulnerable populations.

But why should students working with data care about ethics? Surely, it's all about gathering enough data and conducting a robust analysis with a decent p-value?

Privacy and data analytics

Speaking with Department of Philosophy professor Rosenthal, it's clear that ethics, and specifically privacy issues, are essential factors when dealing with social data analytics.

"We need to ask ourselves," says Rosenthal, explaining the overall emphasis behind SDA 270, "does our analysis focus on what's going to help us achieve valuable goals, and are we achieving those goals in a way that overlooks something of ethical importance?"

Rosenthal will be examining data, ethics and society. Among other topics, she'll be looking at privacy: why it's valuable and why it matters. Covering informed consent and disclosure, this topic will examine the interplay between safeguarding individual privacy and enabling innovation. She will also be asking students to consider what is accomplished by protecting privacy in social data analytics.

"Traditionally, ethical research requires that participants give informed consent for the use of their data, but with large-scale, population-level data, meaningful, informed consent may not be possible, and some scholars argue that it's also not enough to prevent abuse. If that's the case, what does adequately protecting privacy look like?" she asks.

Rosenthal will also be covering algorithmic bias and transparency, asking students to think about whether algorithms are as objective as intended. Along with privacy, bias and discrimination are important factors. These can be introduced, often unintentionally, when technology is being developed. Without transparency and ethical oversight, and without knowing which issues are important to address, algorithms can reflect and perpetuate existing biases.

What matters?

She'll also be challenging students to ask if what's being measured is actually what matters.

"Is GDP really telling us what matters in an economy?" she suggests. "And are Quality-Adjusted Life Years such a good measure for health outcomes?"

SDA 270: not the answers, but a toolbox

Rosenthal is quick to point out that SDA 270 is not just about solving the issues. Instead of answers, students should expect to leave the course with what she describes as a conceptual toolbox for ethically responsible social data analytics. By studying conversations in the subject area, students will not only understand the ethical worries being raised but also know how to approach them.

"I won't be giving the answers," says Rosenthal, "but I will be giving the tools so students will be able to reach their own ethical judgements."

Why take SDA 270?

Human oversight and attention to ethical issues help ensure fair play and equity whenever social data sets are collected and wherever they're deployed.

"That is what's most chilling: many of us may not know that there's been an algorithm involved in the decision. You saw in the case of Amazon where they were actually trying to do something positive by eliminating bias and creating a system that sorted through resumes, and unknowingly, the algorithm picked up on past biases and discriminated against all women who applied for the job.

These systems are invisible, opaque, and disseminated and deployed at massive scale. That makes it so dangerous, and like climate change, it requires an understanding of science for us to bridge this massive gap."
Interview with Shalini Kantayya

Bias and Discrimination

If data collection is biased, intentionally or not, the outcome for some demographics is invisibility or worse.

Poet of Code Joy Buolamwini shares "AI, Ain't I A Woman", a spoken word piece that highlights the ways in which artificial intelligence can misinterpret the images of iconic black women: Oprah, Serena Williams, Michelle Obama, Sojourner Truth, Ida B. Wells, and Shirley Chisholm.

Invisibility

"I have a confession. Sometimes, with no one in sight, I code in a white mask…"

The Coded Gaze: Unmasking Algorithmic Bias by Shalini Kantayya. Debuted at the Museum of Fine Arts Boston, The Coded Gaze mini-documentary follows Poet of Code Joy Buolamwini's personal frustrations with facial recognition software and the need for more inclusive code.

Representation

If vital data are missing, then entire demographics go missing.

This became apparent in 2012 as humanitarian agencies started collecting hashtag-based data sets around Hurricane Sandy. They noticed that people were left out of the data sets, and their observations were thus not visible to the U.N. Office for Humanitarian Affairs, for example.

Missed representation also affects more mundane activities in daily life.

Philosophy Courses for the SDA Minor

Core course: SDA 270 introduces students to the ethical, legal, and privacy issues surrounding the collection and use of big data, and the implications of these for vulnerable populations.

Elective course: PHIL 315 – A survey of formal methods used in philosophy. Topics will include some of the following: propositional logic, predicate logic, formal syntax, formal semantics, the probability calculus, decision theory, game theory and formal causal modeling. Prerequisite: one of PHIL 110, 210, 310, 314, MACM 101, BUEC 232 or STAT 270. Students with credit for COGS 315 cannot take this course for further credit.

"As a philosophy major, I found the background in logic and formal methods that I gained in this course to be super helpful for understanding and formulating arguments in my other philosophy classes. I also especially enjoyed learning about modal propositional logic and tense logic.

I imagine this class would also be great for data analytics, particularly because it looks at set theory, probability, decision theory, and game theory.

We also took a look at vagueness, which could be applied to many different areas.

We were given three exams (not cumulative) and had the option to substitute either the second or third exam for a paper. This flexibility allowed us some leeway to focus on what was most interesting to us. We were given problem sets regularly, as well as a practice exam before each exam, which we had the opportunity to go over in class. We weren't required to hand them in, but they were a huge help in understanding the material.

This is one class where you learn by doing. It鈥檚 also really fun to apply all the skills you鈥檝e learned and see yourself making progress. The class provided a good overview of various areas within formal methods and is much more concrete than other philosophy classes.

Overall, I found the class to be rigorous, but well worth it."

Student testimonial for PHIL 315

Additional Resources

Read more about ethics and social data analytics in our curation.

Study Philosophy at SFU

SFU Philosophy's collection on ethics and social data

Privacy Matters

"Robert Williams spent over a day in custody in January after face recognition software matched his driver's license photo to surveillance video of someone shoplifting, the American Civil Liberties Union of Michigan (ACLU) said in the complaint. In a video shared by ACLU, Williams says officers released him after acknowledging 'the computer' must have been wrong.

Government documents seen by Reuters show the match to Williams came from Michigan state police's digital image analysis section, which has been using a face matching service from Rank One Computing."

The Big Three: Ethics in Data Analysis

Rosenthal summarizes the ethics of social data analysis by focusing on major concerns within the field. These include:

  • Ethics of gathering data: For example, how do we respect privacy while gathering sufficient information? How do we balance these two sometimes competing goals?
  • Ethics of using data: For example, what is the risk of biased and discriminatory algorithms in generating results for onward application?
  • Ethics of deciding what to measure: With all the big data around, especially social data, what should we be measuring? What do the data tell us, and what don't they tell us?
What does a philosopher say about ethics and SDA? SFU Philosophy alum Katherine Creel spoke at our launch event on:

  • automation and algorithms
  • should we have made this system?
  • bias and facial recognition
  • ethics and design decisions