  • CSS 581 - Introduction to Machine Learning

  • CSS 490 - NLP

  • CSS 444 - Analyzing Biases in the Age of Big Data

  • CSS 590 - AI for Social Good (Spring 2022)

  • CSS Capstone. If you are interested in taking your capstone project with me, you must have taken one of my non-core courses or have been part of my research team.

  • CSS 499 - Faculty Research: Exposing students to research opportunities, both in my lab and with our outside partners, is something I cherish. If you are an independent, self-driven student, I will always have a project for you in my lab. See my research page for how to get involved.

  • CSS 600 - Graduate Research

  • CSS 490 - Bias and Ethics in ML

  • CSS 342-343 - Data Structures, Algorithms, and Discrete Mathematics I/II

  • CSS 142 - Programming I

  • SOC 225 - Data and Society

  • SOC 201 - Introductory Topics in Sociology

I will be presenting our pedagogical findings on integrating tools into the ML curriculum at the UW Teaching Symposium 2022. Find our resources here.

I was named a Research Mentor Awardee at UW in 2021. Read more here.

Current Graduate Research Students

Corey Zhou

Alex Kyllo

Sana Sue

Andrew Chekerylla

Manuja Sharma (UW Seattle)

Yuting Zhan, PhD student (Imperial College London)

Current Undergraduate Research Students

Michael Cho (Mary Gates Fellow)

Larissa Gao

Tiffany Chen

Mandy Chen


Mary Gates Fellows 2020-2021

Joshua Sterner (right), fellow

Joshua Sterner is working on how to preserve the privacy of the data used to train a machine-learning model while also keeping the model itself private. He previously worked on the implications of federated deep embedded clustering and has published multiple papers with our research team.

Ali Jahangirnezhad (left), fellow

Ali is developing a model that accounts for the properties and dynamics of sound. More specifically, he is designing an unsupervised representation-learning model based on LSTMs and applying deep embedding techniques. Applications of his work include detecting the sounds of marine animals in hydrophone data (in collaboration with Dr. Shima Abadi).

Inkar Kapen, fellow 2023

Mary Gates Fellows 2021-2022

Project: Independent Measurement Platform for Federated Learning Models on Android Devices

Machine learning is a powerful tool that allows us to use data to make predictions and decisions about the world, but it requires expensive centralized hardware and data, is prone to algorithmic biases, and raises privacy concerns surrounding the data it requires. In contrast, Federated Learning (FL) allows users to collaboratively train a shared model under a central server while keeping personal data on their devices. This ability potentially addresses the problems of traditional machine learning by using widely available mobile devices to increase accessibility for mainstream users, and it leverages decentralized user data and computational resources to train machine learning models more efficiently. However, this emerging field still lacks established processes for training FL models on edge devices and measuring their efficiency. This research provides an inclusive framework to federatively train models on Android devices and analyze their computational and energy efficiency. On the mobile devices, I leveraged a terminal application to install dependencies and natively train machine learning models on the device. I then analyzed the device's efficiency by measuring computational, energy, and network resources through terminal applications. This flexible framework can deploy diverse machine learning models and datasets for training on Android devices. In preliminary experiments, I used the framework to measure the efficiency of a PyTorch obstacle detection model and a TensorFlow abnormal heartbeat detection algorithm. These experiments showed that training machine learning models on mobile phones makes efficient use of CPU, memory, and bandwidth, and consumes minimal energy compared to centralized machine learning systems. With few existing examples of FL on Android devices, this framework provides a novel plug-and-play solution for native FL on mobile devices. Applications of this research will also demonstrate novel methods for using FL techniques to address topics of accessibility, privacy, algorithmic bias, and hardware limitations for machine learning.
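For readers curious about what such a measurement loop might look like, below is a minimal sketch, not the project's actual Android tooling: it runs one round of federated averaging over a few simulated clients in plain Python (NumPy and psutil) and records wall time, CPU, and memory around each local training step. The model, dataset, and helper names (local_train, measure, fedavg) are hypothetical placeholders; in the project itself, measurements on the device were gathered through terminal applications rather than psutil.

```python
# Minimal sketch (assumed, illustrative only): one round of federated
# averaging over simulated clients, with per-client resource measurements.
import time

import numpy as np
import psutil


def local_train(weights, data, lr=0.01):
    """Placeholder local update: one gradient step of linear regression."""
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad


def measure(fn, *args):
    """Run fn and report wall time, CPU percent, and RSS memory delta."""
    proc = psutil.Process()
    rss_before = proc.memory_info().rss
    psutil.cpu_percent(interval=None)          # reset the CPU counter
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    stats = {
        "seconds": elapsed,
        "cpu_percent": psutil.cpu_percent(interval=None),
        "rss_delta_bytes": proc.memory_info().rss - rss_before,
    }
    return result, stats


def fedavg(client_weights, client_sizes):
    """Weighted average of client models (FedAvg aggregation step)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = np.zeros(5)
    # Three simulated clients, each holding its own local dataset.
    clients = [(rng.normal(size=(100, 5)), rng.normal(size=100)) for _ in range(3)]

    updates, sizes = [], []
    for i, data in enumerate(clients):
        new_w, stats = measure(local_train, global_w.copy(), data)
        updates.append(new_w)
        sizes.append(len(data[1]))
        print(f"client {i}: {stats}")

    global_w = fedavg(updates, sizes)
    print("aggregated weights:", global_w)
```

The aggregation step shown here is standard FedAvg (size-weighted averaging of client updates); the resource accounting simply brackets each local update, which mirrors the project's approach of measuring CPU, memory, and network use around on-device training.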

Previous Interns and Students

Students at UWB (underlined have co-authored research papers with me)

  • Nicholas Feuster

  • Jaeha Choi

  • Hanna Hunde

  • Benjamin Lin

  • Amarjot Kaur (now at Microsoft)

  • Edward Kim (now at SAP)

  • Sneha Manchukonda (now at Visa)

  • Jay Quedado

  • Joshua Sterner

  • Ali Jahangirnezhad

  • Afrooz Rahmati (now at Oracle)

  • Brenden Hurt



Interns at Bell Labs:

  1.  Alessandro Montanari (2014)

  2.  Moustafa Alzantot (2013)

  3.  Vaclav Belak (2012)

Interns at eScience UW:

• Rachael Dottle

• Myeong Lee 

• Carlos Espino

• Imam Subkhan

If you are interested in working with me, either as an undergraduate research assistant or as a collaborator, email me at last name at uw dot edu.
