I am a PhD candidate at HKU, supervised by Lingpeng Kong. My current research interests include controllable text generation and large pretrained language models.
Previously, I worked at Shark-NLP, Shanghai AI Lab as an NLP researcher. I graduated from Shanghai Jiao Tong University (SJTU), where I was supervised by Kenny Zhu. I have also worked on pose estimation, face recognition, hierarchical text classification, and recommendation systems.
I am eager to explore different kinds of generation paradigms and keep an eye on long sequence modeling. My ultimate goal is to narrow the language barrier between humans and machines by creating more controllable, personalized, and supportive natural language systems.
➡️ Download my Resumé
“I can only show you the door, you’re the one that has to walk through it” – Morpheus (The Matrix)
📚 Publications
* indicates equal contribution
Preprint
L-Eval: Instituting Standardized Evaluation for Long Context Language Models | L-Eval
Chenxin An, Shansan Gong, Ming Zhong, Mukai Li, Jun Zhang, Lingpeng Kong, Xipeng Qiu
In-Context Learning with Many Demonstration Examples | EVALM
Mukai Li, Shansan Gong, Jiangtao Feng, Yiheng Xu, Jun Zhang, Zhiyong Wu, Lingpeng Kong
Published
DiffuSeq-v2: Bridging Discrete and Continuous Text Spaces for Accelerated Seq2Seq Diffusion Models
Shansan Gong, Mukai Li, Jiangtao Feng, Zhiyong Wu, Lingpeng Kong
Code | Accelerated version of DiffuSeq, where discrete noise bridges the training and sampling stages, reducing the time consumption of both.
DiffuSeq: Sequence to Sequence Text Generation With Diffusion Models
Shansan Gong, Mukai Li, Jiangtao Feng, Zhiyong Wu, Lingpeng Kong
DiffuSeq | Poster | DiffuSeq is a powerful model for text generation, matching or even surpassing competitive autoregressive (AR), iterative non-autoregressive (NAR), and pretrained language model (PLM) baselines on quality and diversity.
Transferable and Efficient: Unifying Dynamic Multi-Domain Product Categorization
Shansan Gong*, Zelin Zhou*, Shuo Wang, Fengjiao Chen, Xiujie Song, Xuezhi Cao, Yunsen Xian, Kenny Zhu
Data | Poster | A new framework to unify the categorization process as well as leverage knowledge from different domains.
Positive, Negative and Neutral: Modeling Implicit Feedback in Session-based News Recommendation
Shansan Gong, Kenny Q. Zhu
TCAR | Slides | By leveraging different kinds of implicit feedback, we alleviate the trade-off between precision and diversity.
🎊 Honors and Awards
- 2022 SIGIR Student Travel Award
- 2022 Outstanding Graduate in Shanghai Municipality
- 2021 Wish Scholarship
- 2020 Shenzhen Stock Exchange Scholarship
- 2020 2nd Prize, Post-Graduate Mathematical Modeling Contest of China
- 2019 Outstanding Undergraduate in SJTU
- 2019 Wenyuan Pan Scholarship
- 2016, 2017, 2018 Academic Excellence Scholarship of SJTU
💬 Invited Talks
- 2023.06, DiffuSeq, Youth PhD Talk-ICLR 2023 by AI Time. | [Slides]
- 2023.05, Incorporate Diffusion Models into Conditional Text Generation, Global Lunch Seminar at SJTU CS department. | [Slides]
📖 Education
- 2019.06 - 2022.03, Master, Computer Science, SEIEE, Shanghai Jiao Tong University.
- 2015.09 - 2019.06, Undergraduate, Information Engineering, SEIEE, Shanghai Jiao Tong University.
💻 Experience
- 2021.12 - 2022.03, Research Engineer, Product Categorization, Meituan, Shanghai.
- 2021.06 - 2021.10, SDE, Bing Search Optimization, Microsoft STCA, Beijing.
- 2019.12 - 2022.03, CTO, iWenBooks APP Development, Yousheng Tech Inc, Shanghai.
📌 Services
- Conference Reviewer: COLING2022, ACL2023, NeurIPS2023, EMNLP2023, ICLR2024
All those moments will be lost in time, like tears in rain. – Blade Runner