Current and past students:

  • Abhipsha Das, M.Sc. in Computer Science at New York University (Master’s Thesis) on Diffusion Modeling for Text, 2023-Present (co-supervised with Kyunghyun Cho)
  • Carter Teplica, M.Sc. in Computer Science at New York University (Independent Study). Research on cross-lingual adaptation for generative language models, 2024-Present
  • Edwin Huang, B.Sc. in Computer Science and Philosophy at New York University (Independent Study). Research on formal evaluation of reasoning with large language models, 2024-Present
  • Cecilia Zheng, B.Sc. in Computer Science at New York University (Independent Study). Research on analyzing and interpreting non-linear properties of multilingual representation learning models, 2023-Present
  • Tia Chen, B.Sc. in Computer Science at Tufts University (Pathways to AI Program Scholar), and Grace Wang, B.Sc. in Computer Science at New York University (Dean’s Undergraduate Research Scholar). Co-authors of Delving into Evaluation Metrics for Generation: A Thorough Assessment of How Metrics Generalize to Rephrasing Across Languages. To appear at the Eval4NLP Workshop at IJCNLP-AACL, 2023
  • Saksham Bassi, M.Sc. in Computer Science at New York University (Independent Study). Research on cross-lingual generalization (under review), 2023
  • Zijian Jin, M.Sc. in Computer Engineering at New York University (Independent Study) on The Effect of Logographic Information on Semantic Inference (published at AACL), 2022
  • Francesco Tinner, B.A. in Computational Linguistics at the University of Zürich (Thesis) on Zero-shot Crosslingual Transfer of the Topic Modeling Task, 2021
  • Vivien Angliker, M.A. in Multilingual Text Analysis at the University of Zürich (Thesis) on Transfer Learning Methods for Extractive Text Summarization in Spanish, 2020
Duygu Ataman
Assistant Professor of Computer Science