Check out some of my ongoing research projects!
algorithmic ethics, transparency and accountability
Algorithms permeate the world around us. Yet we have little insight into how they are designed, how their data are collected and analyzed, and how policy decisions are made based on those analyses. How do they affect privacy? Speech? Constitutional rights? Are they capable of great harm and discrimination? We need human-centered research that develops socio-technical solutions.
Currently, students are working on two different projects:
– What are the unanticipated outcomes of applying popular machine learning algorithms in crime analysis? This work responds to recent developments and failures in the field. We are focusing on crime analysis in Milwaukee and are building an interactive web application to document and highlight this project.
– Survey research is one of the most popular approaches in contemporary research. In response to recent work on algorithmic gold standards, we ask: what is the general quality of survey research in HCI, and how can it be improved? What are the implications for transparency, generalizability, and replication?
aspects of privacy in social networks
I am curious about the different norms and perceptions that people have about privacy in social networks.
Specifically, I am interested in themes like anonymity, surveillance, impression management, and non-use from the perspective of different groups, especially marginalized populations.
The strategies that people employ to manage these concerns are also very exciting. For instance, obfuscation, deception, and non-use are approaches that users commonly adopt in social networks.
computing in non-WEIRD contexts
I collaborate with Ishtiaque Ahmed (assistant professor, Toronto) and a broad group of other researchers on issues encompassing privacy, security, access, and surveillance in non-Western, culturally situated, and developing contexts.
For instance, the Bangladesh government recently instituted mandatory biometric registration for anyone owning a mobile phone. This policy is a direct outcome of WEIRD governmental assumptions, yet it is applied to a very different context. How do we think about the outcomes of such policies for computing and society?