Large Language Models in HeXie Management: A Comparative Evaluation of RAG and Finetune

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

HeXie Management Theory (HXMT) is a modern management theory for organizations. It provides macro-level considerations for both the internal mechanisms of an organization and its overall operational patterns. It has proved useful in domains such as healthcare, rural construction, university management, and large-scale engineering projects, and there is pressing demand for its wider adoption. Large Language Models (LLMs) are widely used in natural language processing and content generation, and adapting an LLM to incorporate HXMT knowledge could make it useful in such settings. There are two popular methods to achieve this goal: fine-tuning and Retrieval-Augmented Generation (RAG), each with its own pros and cons. This paper reports a comparative study of the two approaches. Our research employs HXMT datasets; for fine-tuning we use the open-source platforms LLaMA-2, LLaMA-3, and ERNIE-Speed, assessed with four metrics and manual evaluation, while for RAG we use ERNIE models assessed along five dimensions.
Our results show that the RAG-based ERNIE-speed-App outperforms the fine-tuned ERNIE-speed-8k model given the same volume of training data. This may shed light on similar applications where a new theory needs to be integrated into an LLM to specialize it for a particular domain. Our work is available at https://alex17swim.com/hxjun2.
Original language: English
Title of host publication: ACM International Conference Proceedings Series
Publisher: Association for Computing Machinery (ACM)
Number of pages: 9
Publication status: Accepted/In press - 9 Dec 2024
