References
[1] Childress, S. and Durango-Cohen, P. (2005) On Parallel Machine Replacement Problems with General Replacement Cost Functions and Stochastic Deterioration. Naval Research Logistics, 52, 409-419.
http://dx.doi.org/10.1002/nav.20088
[2] Cooper, L. and Cooper, M. (1981) Introduction to Dynamic Programming. Pergamon Press, London.
[3] Howard, R. (1960) Dynamic Programming and Markov Processes. The MIT Press, Cambridge.
[4] Sernik, E. and Marcus, S. (1991) Optimal Cost and Policy for a Markovian Replacement Problem. Journal of Optimization Theory and Applications, 71, 105-126.
http://dx.doi.org/10.1007/BF00940042
[5] Kristensen, A. (1996) Dynamic Programming and Markov Processes. Dina Notat No. 49.
http://www.prodstyr.ihh.kvl.dk/pdf/notat49.pdf
[6] Sethi, S., Sorger, G. and Zhou, X. (2000) Stability of Real-Time Lot Scheduling and Machine Replacement Policies with Quality Levels. IEEE Transactions on Automatic Control, 45, 2193-2196.
http://dx.doi.org/10.1109/9.887687
[7] Hadley, G. (1964) Nonlinear and Dynamic Programming. Addison-Wesley, Massachusetts.
[8] Fu, M., Hu, J. and Shi, L. (1993) An Application of Perturbation Analysis to a Replacement Problem in Maintenance Theory. Proceedings of the 1993 Winter Simulation Conference, 12-15 December 1993, 329-337.
[9] Pérez, G., Álvarez, M. and Garnica, J. (2006) Stochastic Linear Programming to Optimize Some Stochastic Systems. WSEAS Transactions on Systems, 9, 2263-2267.
[10] Sherwin, J. and Al-Najjar, B. (1999) Practical Models for Condition-Based Monitoring Inspection Intervals. Journal of Quality in Maintenance Engineering, 5, 203-221.
http://dx.doi.org/10.1108/13552519910282665
[11] Lewis, E. (1987) Introduction to Reliability Theory. Wiley, Singapore.
[12] Cheng, T. (1992) Optimal Replacement of Ageing Equipment Using Geometric Programming. International Journal of Production Research, 30, 2151-2158.
http://dx.doi.org/10.1080/00207549208948142
[13] Karabakal, N., Bean, C. and Lohman, J. (2000) Solving Large Replacement Problems with Budget Constraints. The Engineering Economist, 45, 290-308.
http://dx.doi.org/10.1080/00137910008967554
[14] Dohi, T., Ashioka, A., Kaio, N. and Osaki, S. (2004) A Simulation Study on the Discounted Cost Distribution under Age Replacement Policy. Journal of Management and Engineering Integration, 3, 134-139.
[15] Bellman, R. (1955) Equipment Replacement Policy. Journal of the Society for Industrial and Applied Mathematics, 3, 133-136.
http://dx.doi.org/10.1137/0103011
[16] White, R. (1969) Dynamic Programming. Holden-Day, San Francisco.
[17] Davidson, D. (1970) An Overhaul Policy for Deteriorating Equipment. In: Jardine, A.K.S., Ed., Operational Research in Maintenance, Manchester University Press, Manchester, 72-99.
[18] Walker, J. (1992) Graphical Analysis for Machine Replacement: A Case Study. International Journal of Operations and Production Management, 14, 54-63.
http://dx.doi.org/10.1108/01443579410067252
[19] Bertsekas, D. (2000) Dynamic Programming and Optimal Control. Athena Scientific, Belmont.
[20] Plá, L., Pomar, C. and Pomar, J.A. (2004) Markov Sow Model Representing the Productive Lifespan of Herd Sows. Agricultural Systems, 76, 253-272.
http://dx.doi.org/10.1016/S0308-521X(02)00102-6
[21] Nielsen, L. and Kristensen, A. (2006) Finding the K Best Policies in a Finite-Horizon Markov Decision Process. European Journal of Operational Research, 175, 1164-1179.
http://dx.doi.org/10.1016/j.ejor.2005.06.011
[22] Nielsen, L., Jorgensen, E., Kristensen, A. and Ostergaard, S. (2009) Optimal Replacement Policies for Dairy Cows Based on Daily Yield Measurements. Journal of Dairy Science, 93, 75-92.
http://dx.doi.org/10.3168/jds.2009-2209
[23] Hillier, F. and Lieberman, J. (2002) Introduction to Operations Research. McGraw-Hill, New York.
[24] Meyer, C. (1994) Sensitivity of the Stationary Distribution of a Markov Chain. SIAM Journal on Matrix Analysis and Applications, 15, 715-728.
http://dx.doi.org/10.1137/S0895479892228900
[25] Abbad, M. and Filar, J. (1995) Algorithms for Singularly Perturbed Markov Control Problems: A Survey. Control and Dynamic Systems, 73, 257-288.
[26] Feinberg, E. (2000) Constrained Discounted Markov Decision Processes and Hamiltonian Cycles. Mathematics of Operations Research, 25, 130-140.
[27] Coolen-Schrijner, P. and van Doorn, E. (2001) The Deviation Matrix of a Continuous-Time Markov Chain. Probability in the Engineering and Informational Sciences, 16, 351-366.
[28] Ross, S. (1992) Applied Probability Models with Optimization Applications. Holden-Day, San Francisco.