Assistant Professor
Department of Electrical and Computer Engineering
Department of Computer Science and Engineering (Cooperating faculty)
University of California, Riverside
Google Scholar
GitHub
Email: yzhu@ucr.edu
Office: Winston Chung Hall, Room 431
I am an assistant professor in the Department of Electrical and Computer Engineering at the University of California, Riverside. In 2023, I received my Ph.D. in Computer Sciences from the University of Wisconsin–Madison, where I was fortunate to be advised by Robert Nowak. In 2021, I spent a wonderful summer at Microsoft Research NYC, working with Paul Mineiro, Dylan Foster, and John Langford.
I work on interactive machine learning (e.g., active learning, bandits, and reinforcement learning), where my goal is to develop efficient human-in-the-loop learning algorithms and systems. Recently, I have become interested in connecting interactive machine learning with large models (e.g., large language models), from both algorithmic and systems perspectives. Specifically, some research topics I would like to further explore include:
Data selection and generation for large models
LLM-powered interactive learning agents and systems
Efficient training and inference systems for large models
AI safety and alignment (e.g., using reinforcement learning)
Theoretical foundations of interactive ML/transformer/attention
\(\bigstar\) Prospective Students. I am actively looking for Ph.D. students (Fall 2024/Spring 2025, fully funded) and short-term visitors/interns who are broadly interested in AI/ML and/or ML Systems. I'm happy to work with students from CS/ECE/Math/Stat/Physics or other STEM backgrounds. If you are interested in joining my lab, please (i) fill out this form and (ii) drop me an email with your CV and transcripts.
Spring 2024: EE 260 Large Models and Advances in AI
Fall 2023: EE/CS 228 Introduction to Deep Learning
Infinite Action Contextual Bandits with Reusable Data Exhaust
Mark Rucker, Yinglun Zhu, and Paul Mineiro
International Conference on Machine Learning (ICML) 2023
Active Learning with Neural Networks: Insights from Nonparametric Statistics
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Efficient Active Learning with Abstention
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Contextual Bandits with Large Action Spaces: Made Practical
Yinglun Zhu, Dylan Foster, John Langford, and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code], [Spotlight talk, 6 min]
\(\bigstar\) Now available in the leading machine learning library Vowpal Wabbit (see here for instructions) and commercially incorporated into Microsoft Azure Personalizer!
Contextual Bandits with Smooth Regret: Computational Efficiency in Continuous Action Spaces
Yinglun Zhu and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code]
\(\bigstar\) Selected for a full oral presentation (top 2.1%), [Oral talk, 17 min]
Near Instance Optimal Model Selection for Pure Exploration Linear Bandits
Yinglun Zhu, Julian Katz-Samuels, and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pareto Optimal Model Selection in Linear Bandits
Yinglun Zhu and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pure Exploration in Kernel and Neural Bandits
Yinglun Zhu\(^\star\), Dongruo Zhou\(^\star\), Ruoxi Jiang\(^\star\), Quanquan Gu, Rebecca Willett, and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2021, [Code]
On Regret with Multiple Best Arms
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2020, [Code]
Robust Outlier Arm Identification
Yinglun Zhu, Sumeet Katariya, and Robert Nowak
International Conference on Machine Learning (ICML) 2020, [Code]