TY - GEN
T1 - Word Orthography and Relationship-Dominant Engineering (WOR-De) Model for Wordle Game
AU - Hu, Yifei
AU - Zhuang, Xinyao
AU - Wan, Yuxin
AU - Jin, Nanlin
AU - Zhu, Xiaohui
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2024/5
Y1 - 2024/5
N2 - This paper introduces the Word Orthography and Relationship-Dominant Engineering (WOR-De) model for predicting the distribution of Wordle game completions across the number of attempts (DWCA). Our model employs five feature engineering methods: Word Frequency Referencing (WFR), Orthographic Feature Extraction (OFE), Respective Adjacency Probability (RAP), Dominant Pattern Matching (DPM), and Edit Distance Counting (EDC). These methods collaboratively transform each input word into a five-dimensional vector that encapsulates intrinsic attributes such as frequency, adjacency, and commonality. This vector serves as a specialized word representation within the WOR-De model. When integrated with machine learning algorithms, the WOR-De model can effectively forecast the DWCA. Comparative analyses with established word embedding models such as GloVe and FastText demonstrate that WOR-De excels in capturing the semantic elements essential for Wordle game prediction.
AB - This paper introduces the Word Orthography and Relationship-Dominant Engineering (WOR-De) model for predicting the distribution of Wordle game completions across the number of attempts (DWCA). Our model employs five feature engineering methods: Word Frequency Referencing (WFR), Orthographic Feature Extraction (OFE), Respective Adjacency Probability (RAP), Dominant Pattern Matching (DPM), and Edit Distance Counting (EDC). These methods collaboratively transform each input word into a five-dimensional vector that encapsulates intrinsic attributes such as frequency, adjacency, and commonality. This vector serves as a specialized word representation within the WOR-De model. When integrated with machine learning algorithms, the WOR-De model can effectively forecast the DWCA. Comparative analyses with established word embedding models such as GloVe and FastText demonstrate that WOR-De excels in capturing the semantic elements essential for Wordle game prediction.
KW - Feature engineering
KW - Machine learning
KW - Natural language processing
KW - Word embedding
KW - Wordle
UR - http://www.scopus.com/inward/record.url?scp=85192852997&partnerID=8YFLogxK
U2 - 10.1109/iThings-GreenCom-CPSCom-SmartData-Cybermatics60724.2023.00106
DO - 10.1109/iThings-GreenCom-CPSCom-SmartData-Cybermatics60724.2023.00106
M3 - Conference Proceeding
AN - SCOPUS:85192852997
T3 - Proceedings - IEEE Congress on Cybermatics: 2023 IEEE International Conferences on Internet of Things, iThings 2023, IEEE Green Computing and Communications, GreenCom 2023, IEEE Cyber, Physical and Social Computing, CPSCom 2023 and IEEE Smart Data, SmartData 2023
SP - 563
EP - 569
BT - 2023 IEEE International Conferences on Internet of Things (iThings) and IEEE Green Computing & Communications (GreenCom) and IEEE Cyber, Physical & Social Computing (CPSCom) and IEEE Smart Data (SmartData) and IEEE Congress on Cybermatics (Cybermatics)
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE Congress on Cybermatics: 16th IEEE International Conferences on Internet of Things, iThings 2023, 19th IEEE International Conference on Green Computing and Communications, GreenCom 2023, 16th IEEE International Conference on Cyber, Physical and Social Computing, CPSCom 2023 and 9th IEEE International Conference on Smart Data, SmartData 2023
Y2 - 17 December 2023 through 21 December 2023
ER -