
Assistant Professor
Department of Electrical and Computer Engineering
Department of Computer Science and Engineering (Cooperating)
Department of Statistics (Cooperating)
University of California, Riverside
Google Scholar | GitHub
Email: yzhu@ucr.edu
Office: Winston Chung Hall, Room 431
I am an assistant professor in the ECE department at UC Riverside.
My research focuses on machine learning, reinforcement learning, and foundation models, with an emphasis on developing efficient and reliable learning algorithms and systems for large-scale, multimodal problems. I am particularly interested in the following research directions:
Test-time training and scaling: enhancing pretrained model performance and reliability during inference, especially on new or unseen tasks.
Interactive agents: leveraging foundation models to develop efficient and adaptive decision-making agents in interactive environments.
Previously, I received my Ph.D. in Computer Sciences from the University of Wisconsin–Madison, advised by Robert Nowak. During my Ph.D., I worked on the theoretical foundations of interactive machine learning, including active learning and contextual bandits. I now aim to incorporate these algorithmic insights into the design of practical AI systems — see examples below.
Test-Time Matching: Unlocking Compositional Reasoning in Multimodal Models
Yinglun Zhu, Jiancheng Zhang, and Fuzhi Tang
arXiv preprint 2025
Online Finetuning Decision Transformers with Pure RL Gradients
Junkai Luo and Yinglun Zhu
arXiv preprint 2025
Strategic Scaling of Test-Time Compute: A Bandit Learning Approach
Bowen Zuo and Yinglun Zhu
arXiv preprint 2025
Towards Multimodal Active Learning: Efficient Learning with Limited Data Pairs
Jiancheng Zhang and Yinglun Zhu
arXiv preprint 2025
Chain-of-Region: Visual Language Models Need Details for Diagram Analysis
Xue Li, Yiyou Sun, Wei Cheng, Yinglun Zhu, and Haifeng Chen
International Conference on Learning Representations (ICLR) 2025
Efficient Sparse PCA via Block-Diagonalization
Alberto Del Pia, Dekun Zhou, and Yinglun Zhu
International Conference on Learning Representations (ICLR) 2025, [Code]
LeMix: Unified Scheduling for LLM Training and Inference on Multi-GPU Systems
Yufei Li, Zexin Li, Yinglun Zhu, and Cong Liu
Real-Time Systems Symposium (RTSS) 2025
Efficient Sequential Decision Making with Large Language Models
Dingyang Chen, Qi Zhang, and Yinglun Zhu
Conference on Empirical Methods in Natural Language Processing (EMNLP) 2024, [Code]
An Experimental Design Framework for Label-Efficient Supervised Finetuning of Large Language Models
Gantavya Bhatt, Yifang Chen, Arnav M. Das, Jifan Zhang, Sang T. Truong, Stephen Mussmann, Yinglun Zhu, Jeffrey Bilmes, Simon S. Du, Kevin Jamieson, Jordan T. Ash, and Robert Nowak
Findings of the Association for Computational Linguistics: ACL (ACL Findings) 2024
LabelBench: A Comprehensive Framework for Benchmarking Adaptive Label-Efficient Learning
Jifan Zhang, Yifang Chen, Gregory Canal, Arnav Das, Gantavya Bhatt, Stephen Mussmann, Yinglun Zhu, Jeffrey Bilmes, Simon Du, Kevin Jamieson, and Robert Nowak
Journal of Data-Centric Machine Learning Research 2024, [Code]
\(\bigstar\) Also selected for a Poster Award at Midwest ML Symposium 2024
Interactive Machine Learning: From Theory to Scale
Yinglun Zhu
Ph.D. Dissertation, University of Wisconsin–Madison, 2023
Infinite Action Contextual Bandits with Reusable Data Exhaust
Mark Rucker, Yinglun Zhu, and Paul Mineiro
International Conference on Machine Learning (ICML) 2023
Active Learning with Neural Networks: Insights from Nonparametric Statistics
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Efficient Active Learning with Abstention
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Contextual Bandits with Large Action Spaces: Made Practical
Yinglun Zhu, Dylan Foster, John Langford, and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code]
\(\bigstar\) Incorporated into the leading machine learning library Vowpal Wabbit (see here for detailed instructions).
Contextual Bandits with Smooth Regret: Computational Efficiency in Continuous Action Spaces
Yinglun Zhu and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code]
\(\bigstar\) Selected for a full oral presentation (top 2.1%)
Near Instance Optimal Model Selection for Pure Exploration Linear Bandits
Yinglun Zhu, Julian Katz-Samuels, and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pareto Optimal Model Selection in Linear Bandits
Yinglun Zhu and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pure Exploration in Kernel and Neural Bandits
Yinglun Zhu\(^\star\), Dongruo Zhou\(^\star\), Ruoxi Jiang\(^\star\), Quanquan Gu, Rebecca Willett, and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2021, [Code]
On Regret with Multiple Best Arms
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2020, [Code]
Robust Outlier Arm Identification
Yinglun Zhu, Sumeet Katariya, and Robert Nowak
International Conference on Machine Learning (ICML) 2020, [Code]
Ph.D. Students: Jiancheng Zhang (Fall 2024 - present), Bowen Zuo (Fall 2024 - present)
Prospective Students: If you are interested in joining my lab, please apply to the ECE or CSE Ph.D. program at UC Riverside and list me as a potential advisor.
Spring 2025: EE 260 Large Models and Advances in AI
Winter 2025: EE 114 Probability, Random Variables, and Random Processes in Electrical Engineering
Fall 2024: EE/CS 228 Introduction to Deep Learning
Spring 2024: EE 260 Large Models and Advances in AI
Fall 2023: EE/CS 228 Introduction to Deep Learning