COVID-19 Case and Recovery Predictions with Deep Learning Nested Sequence Prediction Models Using a Long Short-Term Memory (LSTM) Architecture
Keywords: COVID-19, Deep Learning, LSTM, Nested Sequence Prediction

Abstract
More than 854 thousand COVID-19 cases had been confirmed as of March 31, 2020, among which at least 176 thousand had recovered. Success in controlling COVID-19 infections and recoveries requires timely and accurate monitoring of the epidemic with rather limited data. In this study, we use Deep Learning nested sequence prediction models with a Long Short-Term Memory (LSTM) architecture for continuous monitoring of the infection and recovery processes. The models were built on the epidemic data of 79 countries, covering the period between each country's first case and March 13, 2020. The data comprise 12 variables for predicting the cumulative number of cases and 13 variables (among them the cumulative number of cases) for predicting the cumulative number of recoveries. Results are very promising despite the small amount of data available. The models can evolve and become more accurate through continuous updating, enriching the data with the experiences of all affected countries. To our knowledge, our work is the only one to propose this approach in the context of COVID-19, generalized to all affected countries. The methods used in this study can inform and encourage the general public, public health professionals, clinicians and decision-makers to take coordinated and collaborative efforts to control the epidemic.
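The authors' nested models and their 12/13 input variables are not reproduced here, but the LSTM update at the heart of any such sequence predictor can be sketched. The following is a minimal, hypothetical single-unit LSTM cell stepped over a toy normalized cumulative-count series; the scalar weights are made up for illustration and are not trained on the paper's data.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    # Single-unit LSTM cell with scalar input, for illustration only.
    # W holds the input/forget/output gate weights and the candidate weights.
    i = sigmoid(W["wi"] * x + W["ui"] * h_prev + W["bi"])   # input gate
    f = sigmoid(W["wf"] * x + W["uf"] * h_prev + W["bf"])   # forget gate
    o = sigmoid(W["wo"] * x + W["uo"] * h_prev + W["bo"])   # output gate
    g = math.tanh(W["wg"] * x + W["ug"] * h_prev + W["bg"]) # candidate state
    c = f * c_prev + i * g          # updated cell state (long-term memory)
    h = o * math.tanh(c)            # hidden state (short-term output)
    return h, c

# Hypothetical untrained weights and a short normalized cumulative-case series.
W = {k: 0.5 for k in ("wi", "ui", "bi", "wf", "uf", "bf",
                      "wo", "uo", "bo", "wg", "ug", "bg")}
series = [0.1, 0.2, 0.4, 0.7]
h, c = 0.0, 0.0
for x in series:
    h, c = lstm_step(x, h, c, W)
print(round(h, 4))
```

In a real setting, the scalar input `x` would be a vector of the per-country epidemic variables, the cell would be one of many units in a trained network, and the final hidden state would feed a regression head producing the next cumulative count.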
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors contributing to this journal agree to publish their articles under the Creative Commons Attribution 4.0 International License, allowing third parties to share their work (copy, distribute, transmit) and to adapt it, under the condition that the authors are given credit and that in the event of reuse or distribution, the terms of this license are made clear.