TY - GEN
T1 - A Graph Convolution-Transformer Neural Network for Drug-Target Interaction Prediction
AU - Wang, Tianjun
AU - Liu, Xin
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/5/27
Y1 - 2022/5/27
N2 - Identifying the ligand-binding affinity toward a target is crucial to ensure a drug's effects. Wet-lab experiments are demanding when measuring massive numbers of drug-target interactions (DTI). Thus, many researchers have applied machine learning to accelerate drug design. This study applied a transformer and a graph convolutional neural network (GCN) to predict DTI. The molecules in the neural network were represented as graphs, and their features were then updated by convolution. The self-attention mechanism of the transformer can seek connections between source and target subjects. Thus, it was used to learn unique features of drugs and targets and then mathematically regress the interactions. The GCN-Transformer was trained on widely used benchmark DTI datasets. The best model's classification performance was approximately 0.85 AUC-ROC and 0.86 AUPR. However, for the biased kinome selectivity dataset, the model's AUPR dropped to 0.75. The attention weights were also visualised, highlighting regions of the compounds. The atoms emphasised by the weights contribute to the critical binding chemical bonds and conformational stability, showing reasonable explainability. This model might help provide insight into the DTI mechanism.
AB - Identifying the ligand-binding affinity toward a target is crucial to ensure a drug's effects. Wet-lab experiments are demanding when measuring massive numbers of drug-target interactions (DTI). Thus, many researchers have applied machine learning to accelerate drug design. This study applied a transformer and a graph convolutional neural network (GCN) to predict DTI. The molecules in the neural network were represented as graphs, and their features were then updated by convolution. The self-attention mechanism of the transformer can seek connections between source and target subjects. Thus, it was used to learn unique features of drugs and targets and then mathematically regress the interactions. The GCN-Transformer was trained on widely used benchmark DTI datasets. The best model's classification performance was approximately 0.85 AUC-ROC and 0.86 AUPR. However, for the biased kinome selectivity dataset, the model's AUPR dropped to 0.75. The attention weights were also visualised, highlighting regions of the compounds. The atoms emphasised by the weights contribute to the critical binding chemical bonds and conformational stability, showing reasonable explainability. This model might help provide insight into the DTI mechanism.
KW - Deep learning
KW - drug-target interaction
KW - structural bioinformatics
UR - http://www.scopus.com/inward/record.url?scp=85144280742&partnerID=8YFLogxK
U2 - 10.1145/3543377.3543399
DO - 10.1145/3543377.3543399
M3 - Conference Proceeding
AN - SCOPUS:85144280742
T3 - ACM International Conference Proceeding Series
SP - 145
EP - 150
BT - ICBBT 2022 - Proceedings of 2022 14th International Conference on Bioinformatics and Biomedical Technology
PB - Association for Computing Machinery
T2 - 14th International Conference on Bioinformatics and Biomedical Technology, ICBBT 2022
Y2 - 27 May 2022 through 29 May 2022
ER -