A risk-aware maintenance model based on a constrained Markov decision process

Jianyu Xu, Xiujie Zhao*, Bin Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The Markov decision process (MDP) model has been widely studied and applied in sequential decision-making problems. In particular, it has proven effective in maintenance policy optimization, where the system state is assumed to evolve continuously under sequential maintenance actions. Traditional MDP models for maintenance take the long-run expected total discounted cost as the objective function: the maintenance manager seeks, through the corresponding MDP model, an optimal policy that minimizes this cost. A significant drawback of these existing MDP-based maintenance strategies, however, is that they fail to incorporate and characterize the safety of the system during the maintenance process. Consequently, in applications that are sensitive to functional risks, such strategies cannot accommodate the requirement of risk awareness. In this study, we apply the concept of risk aversion to the MDP maintenance model to develop risk-aware maintenance policies. Specifically, we use risk functions to measure system indexes that reflect the safety level and formulate a safety constraint. We then cast the problem as a constrained MDP and use a linear programming approach to compute the proposed risk-aware optimal maintenance policy.
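A constrained discounted MDP of this kind is commonly solved as a linear program over state-action occupation measures, with the risk function entering as a linear safety constraint. The sketch below illustrates that general formulation only; it is not the authors' implementation, and the three-state degradation model, transition kernel, cost matrix, risk function, discount factor, and risk budget are all hypothetical placeholders.

```python
# A minimal sketch, assuming a discounted constrained MDP solved by linear
# programming over state-action occupation measures; all model data below
# (states, transition kernel, costs, risk function, budget) are hypothetical.
import numpy as np
from scipy.optimize import linprog

S, A = 3, 2            # states: 0 = good, 1 = degraded, 2 = severely degraded; actions: 0 = do nothing, 1 = repair
gamma = 0.95           # discount factor
mu0 = np.array([1.0, 0.0, 0.0])     # initial state distribution (start as good as new)

# P[s, a, s']: hypothetical degradation / imperfect-repair transition kernel
P = np.array([
    [[0.7, 0.2, 0.1], [0.9, 0.1, 0.0]],
    [[0.0, 0.6, 0.4], [0.7, 0.3, 0.0]],
    [[0.0, 0.0, 1.0], [0.5, 0.4, 0.1]],
])
cost = np.array([[0.0, 6.0], [1.0, 6.0], [2.0, 10.0]])   # maintenance/operating cost c(s, a)
risk = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 0.5]])    # risk function d(s, a) reflecting the safety level
budget = 2.0           # upper bound on expected total discounted risk (the safety constraint)

n = S * A  # decision variables: occupation measure x(s, a), flattened row-major

# Flow balance: sum_a x(s', a) - gamma * sum_{s, a} P(s' | s, a) x(s, a) = mu0(s') for every s'
A_eq = np.zeros((S, n))
for sp in range(S):
    for s in range(S):
        for a in range(A):
            A_eq[sp, s * A + a] = float(sp == s) - gamma * P[s, a, sp]
b_eq = mu0

# Safety constraint: expected total discounted risk must stay within the budget
A_ub = risk.reshape(1, n)
b_ub = np.array([budget])

res = linprog(c=cost.reshape(n), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
assert res.success, res.message
x = res.x.reshape(S, A)

# Recover a (possibly randomized) policy; states never visited keep a uniform default
occ = x.sum(axis=1, keepdims=True)
policy = np.divide(x, occ, out=np.full((S, A), 1.0 / A), where=occ > 1e-9)
print("Minimum expected total discounted cost:", round(res.fun, 3))
print("Risk-aware policy pi(a | s):\n", policy.round(3))
```

With the single linear safety constraint, the resulting optimal policy may need to randomize in at most one state; dropping the inequality constraint recovers the usual risk-neutral discounted-cost policy from the same program.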

Original language: English
Pages (from-to): 1072-1083
Number of pages: 12
Journal: IISE Transactions
Volume: 54
Issue number: 11
DOIs
Publication status: Published - 15 Oct 2021

Keywords

  • Markov decision process
  • risk aversion
  • condition-based maintenance
  • imperfect repair
  • safety constraint
