The next Shocklab Seminar will take place online on Wednesday, 3 September 2025, at 16:00 (GMT+2). Everlyn Asiko Chimoto will present "Improving Quantized Multilingual LLMs".

Title: Improving Quantized Multilingual LLMs
Speaker: Everlyn Asiko Chimoto
Date: Wednesday, 3 September 2025
Time: 16:00-17:00 (GMT+2)
Zoom Meeting Link: https://uct-za.zoom.us/j/92750361177?pwd=QzNiRzBJRjRITVlwa2k5SVNkVmx5UT09

Abstract: This talk explores how the choice of calibration data impacts the performance of quantized multilingual large language models (LLMs). We systematically evaluate multiple calibration strategies across languages and quantization methods, showing that multilingual and language-matched calibration sets significantly improve perplexity. The results highlight the importance of data diversity and language relevance in building efficient, high-performing LLMs for low-resource and multilingual settings.
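
For readers unfamiliar with calibration in this context: post-training quantization typically uses a small calibration set to estimate the statistics (for example, activation ranges or importance scores) that fix the quantizer's scales, which is where the choice of calibration language enters. The short NumPy sketch below is a generic, hypothetical illustration of that mechanism; the helper names and toy data distributions are invented for this example, and it is not the speaker's experimental setup, which concerns weight quantization of multilingual LLMs.

import numpy as np

rng = np.random.default_rng(0)

def calibrate_scale(calib_acts, n_bits=8):
    # Pick a symmetric quantization scale from calibration activations:
    # the observed max magnitude is mapped onto the integer grid.
    qmax = 2 ** (n_bits - 1) - 1
    return np.abs(calib_acts).max() / qmax

def quantize(acts, scale, n_bits=8):
    # Round onto the integer grid defined by `scale`, clipping out-of-range
    # values, then dequantize back to float for comparison.
    qmax = 2 ** (n_bits - 1) - 1
    q = np.clip(np.round(acts / scale), -qmax - 1, qmax)
    return q * scale

# Toy stand-ins: the "deployment" activations have a wider dynamic range
# than the mismatched calibration set (think: a different language/domain).
deploy = rng.normal(scale=3.0, size=(1024, 64))
calib_matched = rng.normal(scale=3.0, size=(256, 64))
calib_mismatched = rng.normal(scale=1.0, size=(256, 64))

for name, calib in [("matched", calib_matched), ("mismatched", calib_mismatched)]:
    scale = calibrate_scale(calib)
    err = np.mean((deploy - quantize(deploy, scale)) ** 2)
    print(f"{name:10s} calibration: activation MSE = {err:.5f}")

In real LLM pipelines (e.g. GPTQ- or AWQ-style weight quantization), the calibration text plays an analogous role: it supplies the activation statistics that guide how the weights are rounded, so its language and diversity matter.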

Bio: Everlyn is a PhD student in Natural Language Processing at the University of Cape Town. She specializes in Neural Machine Translation for low-resource languages under Prof. Bruce Bassett’s supervision. Her research focuses on data and model-efficient methods for NLP. Currently, she is exploring efficient methods for adapting large language models (LLMs) to low-resource languages.

See you there!

Housekeeping: