ECE’s Wu Honored With Best Paper at IEEE Symposium

Hsiao-Chun Wu

July 3, 2024

BATON ROUGE, LA – LSU Electrical and Computer Engineering Professor Hsiao-Chun Wu was recently honored by the Institute of Electrical and Electronics Engineers (IEEE) Broadcast Technology Society with the 2024 Best Paper Award at the IEEE International Symposium on Broadband Multimedia Systems and Broadcasting in Toronto, Canada.

The paper, titled “Fine-Tuning Optimization of Small Language Models: A Novel Graph-Theoretical Approach for Efficient Prompt Engineering,” was co-authored by Wu, who is also an IEEE Fellow; Venkata Gadiraju, LSU Ph.D. graduate in electrical engineering and computer science; Hao-Yu Tsai, Ph.D. student in the National Tsing Hua University Department of Electrical Engineering; Manali Singha, LSU Ph.D. graduate in computational system biology; Scott C.-H. Huang, professor in the National Tsing Hua University Department of Electrical Engineering; Guannan Liu, assistant professor in the San Jose State University Department of Applied Data Science (also an LSU Ph.D. graduate in electrical engineering and computer science); Shih Yu Chang, assistant professor of applied data science at San Jose State University; and Yiyan Wu, professor in Western University’s Department of Electrical and Computer Engineering.

In recent years, large language models, or LLMs, have been widely applied in artificial intelligence (AI)-driven prompt engineering for tasks such as question answering and text summarization. However, there is growing interest in small language models, or SLMs, for resource-constrained, application-specific data mining. SLMs have far fewer parameters than LLMs and offer advantages in the form of reduced carbon emissions, shorter training times, and lower computational complexity.

The paper details the research group’s work with SLMs: designing a new systematic approach to extract the semantic, contextual, and domain-relevant relationships among users’ prompts; extending the conventional clique-finding paradigm for training data reduction (TDR) and evaluating the approach using OpenAI’s GPT-2 model; developing a time-complexity analysis for the new TDR scheme; and comparing the new approach with the conventional model, showing on-par or better performance as measured by BERTScore, a metric for evaluating the quality of a text summary relative to the original text.
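The clique-finding idea behind training data reduction can be illustrated with a small sketch. This is not the authors’ method; it is a minimal, hypothetical example assuming that prompts are linked when they are sufficiently similar (here, crude token-overlap Jaccard similarity stands in for the paper’s semantic, contextual, and domain-relevant relationships), that maximal cliques of near-duplicate prompts are then found, and that one representative per clique is kept to shrink the training set.

```python
from itertools import combinations

def jaccard(a, b):
    """Token-overlap similarity; a crude stand-in for semantic similarity."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

def build_graph(prompts, threshold=0.5):
    """Connect prompts whose pairwise similarity meets the threshold."""
    adj = {i: set() for i in range(len(prompts))}
    for i, j in combinations(range(len(prompts)), 2):
        if jaccard(prompts[i], prompts[j]) >= threshold:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def maximal_cliques(adj, r=None, p=None, x=None, out=None):
    """Enumerate maximal cliques (classic Bron-Kerbosch, no pivoting)."""
    if r is None:
        r, p, x, out = set(), set(adj), set(), []
    if not p and not x:
        out.append(r)
    for v in list(p):
        maximal_cliques(adj, r | {v}, p & adj[v], x & adj[v], out)
        p.remove(v)
        x.add(v)
    return out

def reduce_training_data(prompts, threshold=0.5):
    """Keep one representative prompt per maximal clique of near-duplicates."""
    adj = build_graph(prompts, threshold)
    reps, covered = set(), set()
    for clique in sorted(maximal_cliques(adj), key=len, reverse=True):
        if clique - covered:          # clique adds prompts not yet represented
            reps.add(min(clique))     # deterministic representative choice
            covered |= clique
    return [prompts[i] for i in sorted(reps)]

prompts = [
    "what is the capital of france",
    "what is the capital of germany",
    "summarize this article please",
]
# The two capital questions form a clique; one representative survives,
# so three prompts reduce to two.
reduced = reduce_training_data(prompts)
```

In a real system, the token-overlap similarity would be replaced by a learned semantic measure, and the reduced prompt set would then be used to fine-tune the language model at lower cost.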

Wu is the Michel B. Voorhies Professor in the LSU Division of Electrical and Computer Engineering. He earned his Ph.D. from the University of Florida, and his research interests include statistical learning on optimization, estimation and detection applications, embedded algorithms for digital signal processing, speech and image processing, and wireless communications.

Like us on Facebook (@lsuengineering) or follow us on X (formerly Twitter) and Instagram (@lsuengineering).


Contact: Joshua Duplechain
Director of Communications