About Me
I will join TTIC as a Research Assistant Professor in September 2021.
I am a Ph.D. student (2016-2021) in the Department of Computer Science at Johns Hopkins University (JHU) and a member of the Center for Language and Speech Processing (CLSP). I am fortunate to be advised by Jason Eisner and supported by a Bloomberg Data Science Ph.D. Fellowship. At JHU, I have collaborated with Yanxun Xu from the Dept. of Applied Mathematics and Statistics, as well as Benjamin Van Durme and Kevin Duh from the Dept. of Computer Science.
Before joining JHU-CLSP, I obtained an M.S. in Physical Science at The University of Chicago and did research in natural language understanding and generation with Mohit Bansal and Matthew R. Walter at the Toyota Technological Institute at Chicago. Prior to that, I obtained a B.E. in Electrical Engineering from Huazhong University of Science and Technology (HUST).
Research & Publications
I believe that good research stems from curiosity.
I admire David Blackwell's attitude toward research: "I've worked in so many areas – I'm sort of a dilettante. Basically, I'm not interested in doing research and I never have been. I'm interested in understanding, which is quite a different thing. And often to understand something you have to work it out yourself because no one else has done it."
My research interests are rooted in designing models and algorithms to solve challenging real-life problems.
My current focus is modelling irregular time series, and my work has been covered by news articles in Fortune Magazine and Tech At Bloomberg.
Here is a series of my papers on this topic:
Noise-Contrastive Estimation for Multivariate Point Processes
Hongyuan Mei, Tom Wan, Jason Eisner
Proceedings of NeurIPS 2020, Virtual-only.
[pdf]
[bib]
[code]
[slides]
[talk (English)]
Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification
Hongyuan Mei, Guanghui Qin, Minjie Xu, Jason Eisner
Proceedings of ICML 2020, Virtual-only.
[pdf]
[bib]
[code]
[slides]
[talk (English)]
Imputing Missing Events in Continuous-Time Event Streams
Hongyuan Mei, Guanghui Qin, Jason Eisner
Proceedings of ICML 2019, Long Beach, California.
[pdf]
[bib]
[code]
[poster]
[slides]
The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process
Hongyuan Mei, Jason Eisner
Proceedings of NeurIPS 2017, Long Beach, California.
[pdf]
[bib]
[code]
[poster]
[spotlight (English)]
[talk (Chinese)]
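The papers above model event streams in continuous time via multivariate point processes, where each event type k has an intensity λ_k(t) kept positive by a scaled softplus, and the log-likelihood of a sequence combines log-intensities at observed events with an integral of the total intensity. Below is a minimal toy sketch of these two ingredients — not the papers' implementation: the parameters `W`, `s` and the frozen hidden state are illustrative stand-ins for the learned LSTM dynamics, and the integral is estimated by simple Monte Carlo.

```python
import numpy as np

def softplus_scaled(x, s):
    # f_k(x) = s * log(1 + exp(x / s)): a scaled softplus that keeps
    # each intensity strictly positive.
    return s * np.log1p(np.exp(x / s))

def intensities(h, W, s):
    # lambda_k(t) = f_k(w_k . h(t)) for each event type k.
    return softplus_scaled(W @ h, s)

# Toy setup: 3 event types, 4-dim hidden state (parameters are illustrative).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
s = np.ones(3)

def log_likelihood(events, hidden_at, T, n_mc=1000):
    # log L = sum_i log lambda_{k_i}(t_i) - int_0^T sum_k lambda_k(t) dt,
    # with the integral estimated by Monte Carlo over uniform times.
    ll = 0.0
    for t, k in events:
        ll += np.log(intensities(hidden_at(t), W, s)[k])
    ts = rng.uniform(0.0, T, size=n_mc)
    total = np.mean([intensities(hidden_at(t), W, s).sum() for t in ts])
    return ll - T * total

# A frozen toy hidden state stands in for the model's recurrent dynamics.
hidden_at = lambda t: np.tanh(np.array([t, 1.0, -t, 0.5]))
events = [(0.5, 0), (1.2, 2), (2.0, 1)]  # (time, event type) pairs
print(log_likelihood(events, hidden_at, T=3.0))
```

In the real models, `hidden_at` would be the state of a continuous-time recurrent network updated at each event, and `W`, `s` would be trained by maximizing this likelihood.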
I also develop statistical and machine learning methods for other domains such as medicine, natural language processing and computer vision:
Personalized Dynamic Treatment Regimes in Continuous Time: A Bayesian Joint Model for Optimizing Clinical Decisions with Timing
William Hua, Hongyuan Mei, Sarah Zohar, Magali Giral, Yanxun Xu
Bayesian Analysis.
[pdf]
[code]
Selected for an International Biometric Society Eastern North American Region (ENAR) Distinguished Student Paper Award
On the Idiosyncrasies of the Mandarin Chinese Classifier System
Shijia Liu, Hongyuan Mei, Adina Williams, Ryan Cotterell
Proceedings of NAACL 2019, Minneapolis, Minnesota.
[pdf]
[bib]
Halo: Learning Semantics-Aware Representations for Cross-Lingual Information Extraction
Hongyuan Mei, Sheng Zhang, Kevin Duh, Benjamin Van Durme
Proceedings of *SEM 2018, New Orleans, Louisiana.
[pdf]
[bib]
Coherent Dialogue with Attention-based Language Models
Hongyuan Mei, Mohit Bansal, Matthew R. Walter
Proceedings of AAAI 2017, San Francisco, California.
[pdf]
[bib]
What to talk about and how? Selective Generation using LSTMs with Coarse-to-Fine Alignment
Hongyuan Mei, Mohit Bansal, Matthew R. Walter
Proceedings of NAACL 2016, San Diego, California.
[pdf]
[code]
[bib]
Listen, Attend, and Walk: Neural Mapping of Navigational Instructions to Action Sequences
Hongyuan Mei, Mohit Bansal, Matthew R. Walter
Proceedings of AAAI 2016, Phoenix, Arizona.
[pdf]
[code]
[bib]
Selected for an oral presentation with the NVIDIA Award (TITAN X GPU Prize) at the NeurIPS 2015 Multimodal Machine Learning workshop
Accurate Vision-based Vehicle Localization using Satellite Imagery
Hang Chu, Hongyuan Mei, Mohit Bansal, Matthew R. Walter
Presented at NeurIPS 2015 Transfer and Multi-Task Learning workshop.
[pdf]
[bib]
Teaching
I believe that explaining is understanding.
This philosophy is summarized from the Feynman story told by David Goodstein: "Feynman was a truly great teacher. He prided himself on being able to devise ways to explain even the most profound ideas to beginning students. Once, I said to him, 'Dick, explain to me, so that I can understand it, why spin one-half particles obey Fermi-Dirac statistics.' Sizing up his audience perfectly, Feynman said, 'I'll prepare a freshman lecture on it.' But he came back a few days later to say, 'I couldn't do it. I couldn't reduce it to the freshman level. That means we don't really understand it.'"
Here is a list of courses that I have been involved in:
- Course Instructor, Bloomberg ML Course on Modeling Irregular Time Series, Bloomberg, Fall 2020
This was a series of lectures with hands-on exercises. Material: slides + IPython notebooks
- Guest Speaker, 2018 JHU Summer School on Human Language Technology, Johns Hopkins University, Summer 2018
This was part of the Fifth Frederick Jelinek Memorial Summer Workshop.
I gave a 60-minute tutorial on deep learning with MXNet Gluon. Material: slides
- Teaching Assistant, 600.465---Natural Language Processing (Instructor: Jason Eisner), Johns Hopkins University, Fall 2017
I held a series of technical discussion sessions for the course. Material: optimization
- Guest Lecturer, 600.466---Information Retrieval and Web Agents (Instructor: David Yarowsky), Johns Hopkins University, Spring 2017
Life Outside Lab
When I am away from my workstation, I may be: