TY - GEN
T1 - Formulating Financial Trading Strategies Using LLM
T2 - 7th International Conference on Natural Language Processing, ICNLP 2025
AU - Wu, Jinheng
AU - Zhang, Di
AU - Niu, Qiang
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Large language models (LLMs) offer the potential to streamline traditional tools through intuitive interfaces, but financial trading requires clear, interpretable solutions. Direct conversion of natural language instructions into general-purpose programming languages via LLMs often leads to redundancy and hallucination. To address these challenges, we propose a domain-specific language (DSL) as an intermediary. The proposed two-stage process first converts user instructions into DSL through in-context learning (ICL), followed by conversion into general programming languages. This approach improves efficiency, reduces manual intervention, and improves interpretability, particularly for exchange-traded fund (ETF) strategies. The experimental results show an exact match rate of 0.953, demonstrating the effectiveness of translating complex trading logic into DSL. The impact of DSL design and LLM selection is also discussed, highlighting the broader potential of this technique in the development of financial strategies and its applicability in other domains. Furthermore, the Introduction briefly touches on the limitations of traditional approaches, outlining how the DSL-mediated solution overcomes these challenges, offering a cost-effective and efficient alternative.
AB - Large language models (LLMs) offer the potential to streamline traditional tools through intuitive interfaces, but financial trading requires clear, interpretable solutions. Direct conversion of natural language instructions into general-purpose programming languages via LLMs often leads to redundancy and hallucination. To address these challenges, we propose a domain-specific language (DSL) as an intermediary. The proposed two-stage process first converts user instructions into DSL through in-context learning (ICL), followed by conversion into general programming languages. This approach improves efficiency, reduces manual intervention, and improves interpretability, particularly for exchange-traded fund (ETF) strategies. The experimental results show an exact match rate of 0.953, demonstrating the effectiveness of translating complex trading logic into DSL. The impact of DSL design and LLM selection is also discussed, highlighting the broader potential of this technique in the development of financial strategies and its applicability in other domains. Furthermore, the Introduction briefly touches on the limitations of traditional approaches, outlining how the DSL-mediated solution overcomes these challenges, offering a cost-effective and efficient alternative.
KW - Domain-specific Language
KW - Exchange-Traded Fund
KW - In-context Learning
KW - Large language models
KW - Natural Language Processing
UR - https://www.scopus.com/pages/publications/105015624759
U2 - 10.1109/ICNLP65360.2025.11108657
DO - 10.1109/ICNLP65360.2025.11108657
M3 - Conference Proceeding
AN - SCOPUS:105015624759
T3 - 2025 7th International Conference on Natural Language Processing, ICNLP 2025
SP - 6
EP - 13
BT - 2025 7th International Conference on Natural Language Processing, ICNLP 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 March 2025 through 23 March 2025
ER -