Xiang Lisa Li

Hi! I am a final-year Ph.D. student at Stanford University, co-advised by Percy Liang and Tatsunori Hashimoto. My research is supported by the Stanford Graduate Fellowship and the Two Sigma PhD Fellowship.

I work on developing methods to overcome structural limitations of language models. My research encompasses many stages of language model development, including architecture (Diffusion-LM), adaptation (Prefix-Tuning), self-supervision (GV-consistency), decoding (Contrastive Decoding) and evaluation (AutoBencher).

Previously, I received undergraduate degrees from Johns Hopkins University, majoring in Computer Science and in Applied Mathematics and Statistics. I was fortunate to be advised by Prof. Jason Eisner.

If you're interested in getting started in research and think it'd be useful to chat, please feel free to email me.

Email: xlisali [at] stanford.edu

Links: [Github] [Google Scholar]


Selected Publications

(for the full publication list, please check out my [Google Scholar])

  • Diffusion-LM Improves Controllable Text Generation
    Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto
In NeurIPS 2022
    [bib] [abstract] [arxiv]

  • Prefix-Tuning: Optimizing Continuous Prompts for Generation
    Xiang Lisa Li and Percy Liang
    In ACL 2021
    [bib] [abstract]

  • Benchmarking and Improving Generator-Validator Consistency of Language Models
    Xiang Lisa Li, Vaishnavi Shrivastava, Siyan Li, Tatsunori Hashimoto, and Percy Liang
In ICLR 2024
    [bib] [abstract]

  • Contrastive Decoding: Open-ended Text Generation as Optimization
    Xiang Lisa Li, Ari Holtzman, Daniel Fried, Percy Liang, Jason Eisner, Tatsunori Hashimoto, Luke Zettlemoyer, and Mike Lewis
    In ACL 2023
    [bib] [abstract]

  • AutoBencher: Towards Declarative Benchmark Construction
    Xiang Lisa Li, Farzaan Kaiyom, Evan Zheran Liu, Yifan Mai, Percy Liang, and Tatsunori Hashimoto
arXiv 2024
    [bib] [abstract]

  • Posterior Control of Blackbox Generation
    Xiang Lisa Li and Alexander Rush
In ACL 2020
    [bib] [abstract] [appendix]

  • Specializing Word Embeddings (for Parsing) by Information Bottleneck
    Xiang Lisa Li and Jason Eisner
In EMNLP-IJCNLP 2019
    Best Paper Award at EMNLP-IJCNLP 2019
    [bib] [abstract] [appendix]



Honors & Awards

  • (Apr. 2023) Two Sigma PhD Fellowship
  • (Sep. 2020) Stanford Graduate Fellowship
  • (May 2020) Outstanding Senior Award
  • (Dec. 2019) Outstanding Undergraduate Researcher Award (Computing Research Association)
  • (Nov. 2019) Best Paper Award at EMNLP-IJCNLP

Teaching Experience

  • (Spring 2023) TA @ CS 224U at Stanford
  • (Winter 2023) TA @ CS 224N at Stanford
  • (Spring 2020) TA @ Introduction to Statistics (AMS 553.430/630)
  • (Spring 2019) TA @ Introduction to Probability (AMS 553.420/620)
  • (Fall 2018) TA @ Introduction to Probability (AMS 553.420/620)
  • (Fall 2017) TA @ Introduction to Probability (AMS 553.420/620)
  • (Spring 2017) TA @ Introduction to Probability (AMS 553.420/620)
  • Interestingly, a perpetual probability TA is switching to stats... Hope we can have fun in 430 :)