About Me
Hi! I’m Victoria Zhanqi Zhang (张展旗), a Ph.D. candidate in Computer Science at UC San Diego, supported by the HDSI Ph.D. Fellowship. I’m co-advised by Dr. Mikio Aoi and Dr. Gal Mishne. I earned a dual B.S. in Computer Science and Electrical Engineering and an M.S. in Computer Science from Washington University in St. Louis, where I worked with Dr. Carlos Ponce as an undergraduate researcher.
I develop machine learning models to uncover structure and dynamics in complex, large-scale spatiotemporal systems such as the brain and behavior.
I grew up in Jinan, China; moved to Beijing in my teens; began my undergraduate studies in St. Louis, MO, in 2016; and am now based in San Diego, CA, for grad school.
I enjoy traveling, art, and animals, and share my home with birds. See my art portfolio.
For more details, see my CV.
News
- Hire Me — Open to full-time Research Scientist roles in 2026. Connect with me on LinkedIn!
- My work, BEHAVE: Behavioral Ethology for Human Assessment via Variational Encoding, has been accepted at the NeurIPS 2025 Workshop on Data on Brain and Mind (DBM).
- My paper, Brain Feature Maps Reveal Progressive Animal-Feature Representations, is published in Science Advances and featured as “When Neurons Discover on Their Own” by Harvard Brain Institute News.
- Check out my preprint on Behavioral Dynamics in Bipolar Disorder on medRxiv.
- My co-authored paper, Arousal as a universal embedding for spatiotemporal brain dynamics, is now published in Nature.
Research
Machine Learning I develop robust machine learning methods for large-scale spatiotemporal data, studying how self-supervised and generative models learn hierarchical, multi-scale structure. My work includes using adversarial training to address distribution shift in brain–computer interfaces (BCIs) (preprint), developing multimodal foundation models for neural interfaces, and improving handwriting recognition generalization under challenging kinematic movements for the Meta Neural Band.
Computational Neuroscience I study how neural representations and learning dynamics are encoded in the brain and manifest in behavior. I showed how hierarchical feature representations arise along the primate ventral visual stream (paper) and am developing computational models of behavioral dynamics to characterize state-dependent structure in psychiatric disorders such as bipolar disorder (preprint).
Industry Experience In 2024, at Meta, I developed EMG–vision multimodal foundation models for neural wristbands, improving performance across multiple downstream tasks such as pose estimation. In 2025, I developed models to improve the robustness of a handwriting recognition system productionized in the Meta Neural Band, which captures electrical activity from subtle forearm muscle movements for intuitive, gesture-based input. This work was demoed live at Meta Connect 2025 and later recognized as one of TIME’s Best Inventions of 2025.