Contest Problems


1. Introduction

Ventricular fibrillation (VF) and ventricular tachycardia (VT) are two types of life-threatening ventricular arrhythmias (VAs), which are the main cause of Sudden Cardiac Death (SCD). More than 60% of deaths from cardiovascular disease are due to out-of-hospital SCD [1]. SCD is caused by a malfunction in the heart's electrical system that can occur when the lower chambers of the heart suddenly start beating in an uncoordinated fashion, preventing the heart from pumping blood to the lungs and body [1]. Unless the heart is shocked back into a normal rhythm, the patient rarely survives.

People at high risk of SCD rely on the Implantable Cardioverter Defibrillator (ICD) to deliver proper and timely defibrillation when experiencing life-threatening VAs. An ICD is a small device, typically implanted under the skin of the left upper chest, that is programmed to release defibrillation (i.e., electrical shock) therapy upon detecting VT or VF in order to restore a normal rhythm. Early and timely diagnosis and treatment of life-threatening VAs increase the chances of survival. However, current detection methods deployed on ICDs rely on a wide variety of criteria obtained from clinical trials, and there are hundreds of programmable criteria parameters affecting the defibrillation delivery decision [2, 3].

Automatic detection that requires less expert involvement can further improve detection performance and reduce physicians' workload in criteria design and parameter adjustment [4-6]. This year's Challenge asks you to build a deep learning algorithm that classifies life-threatening VAs using only labeled one-channel intracardiac electrograms (IEGMs) sensed by single-chamber ICDs, while satisfying the requirements of in-time detection on a resource-constrained microcontroller (MCU) platform. We will test each algorithm on databases of IEGMs in terms of both detection performance and practical performance, and the combined score will reveal the utility of the algorithm for life-threatening VA detection in ICDs. Please refer to our paper (accepted and to appear in TCAD), which describes last year's contest in detail [7].


2. Objective

The goal of the 2023 Challenge is to discriminate life-threatening VAs (i.e., ventricular fibrillation and ventricular tachycardia) from single-lead (RVA-Bi) IEGM recordings.

We ask participants to design and implement a working, open-source machine learning or deep learning algorithm that can automatically discriminate life-threatening VAs (i.e., binary classification: VA or non-VA) from IEGM recordings while being deployable and runnable on the given MCU platform. We will award prizes to the teams with the top combined performance in terms of detection precision, memory occupation, and inference latency.


3. Data

The data contains single-lead IEGM recordings retrieved from the RVA-Bi lead of ICDs. Each recording is 5 seconds in length with a 250 Hz sampling rate. The recordings are pre-processed with a band-pass filter with a pass-band frequency of 15 Hz and a stop-band frequency of 55 Hz. Each IEGM recording has one label that describes the cardiac abnormality (or normal sinus rhythm). For the classification task, the VA class contains the labels VT (ventricular tachycardia), VFb (ventricular fibrillation), and VFt (ventricular flutter). Segments with any label other than VT, VFb, or VFt belong to the non-VA class. We have provided the lists of labels for reference.

The IEGM recordings are partitioned patient-wise into training and testing sets. 80% of the subjects' IEGM recordings are released as training material. The remaining 20% will be used to evaluate the detection performance of the submitted deep learning algorithms. The dataset for final evaluation will remain private and will not be released.

4. Data Format

All data is formatted as text. Each text file contains a 5-second IEGM recording with one sampling point per row, 1,250 points in total. The value of each sampling point is stored as a floating-point number.

The name of each text file contains the patient ID, label, and segment index, separated by dashes ("-"). For example, in the file name "S001-VT-1.txt", "S001" is the subject number, "VT" is the type of arrhythmia or rhythm (refer to the provided label list), and "1" is the segment index.
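
As an illustration of the format above, the following Python sketch loads one recording and parses its file name. The helper names are our own, not part of the released contest code:

```python
import os

def parse_filename(path):
    # Split e.g. "S001-VT-1.txt" into (subject ID, label, segment index).
    stem = os.path.splitext(os.path.basename(path))[0]
    subject, label, index = stem.split("-")
    return subject, label, int(index)

def load_recording(path):
    # Read one 5-second IEGM recording: one float per row, 1,250 rows.
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    if len(samples) != 1250:
        raise ValueError("expected 1,250 samples (5 s at 250 Hz)")
    return samples

def is_va(label):
    # Binary target: VA if the label is VT, VFb, or VFt; non-VA otherwise.
    return label in {"VT", "VFb", "VFt"}
```

For instance, `parse_filename("S001-VT-1.txt")` returns `("S001", "VT", 1)`.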

5. MCU Platform and X-Cube-AI

The development board required in the Challenge is the STM32L031K6T6, which has an ARM Cortex-M0+ core, 32 Kbytes of flash memory, 8 Kbytes of SRAM, and an embedded ST-LINK/V2-1 debugger/programmer. The Challenge also refers to the STM32F303K8T6, which has an ARM Cortex-M4 core at 72 MHz, 64 Kbytes of flash memory, 16 Kbytes of SRAM, and an embedded ST-LINK/V2-1 debugger/programmer. The development board also supports STM32 X-Cube-AI, an STM32Cube Expansion Package that is part of the STM32Cube.AI ecosystem. It extends STM32CubeMX with automatic conversion of pre-trained artificial intelligence algorithms, including neural network and classical machine learning models, and integrates the generated optimized library into the user's project. The deep learning algorithm will be deployed on the development board and its practical performance evaluated with the help of X-Cube-AI. Note that you do not have to use X-Cube-AI to implement the project on the STM32F303K8T6, but we will need to be able to evaluate your model's performance in a straightforward way.

The board costs only around $11, and we anticipate that most participating teams can easily obtain one online. However, if you do need support, please contact us.

Purchase Link (mouser)
Purchase Link (digikey)


6. Scoring

We will evaluate the submitted algorithms with a scoring metric that combines detection performance and practical performance. It is defined as follows:

For detection performance:

  • F_β score: This score measures the VA detection performance of the model. We will obtain the confusion matrix of the classification generated by the submitted design for each testing subject, where the positive class is VA. The score of each individual subject from the testing dataset is then calculated as

      F_β = (1 + β^2) · TP / ((1 + β^2) · TP + β^2 · FN + FP), with β = 2.

    This setting gives a higher weight to recall, since the detection accuracy of life-threatening VAs is the most important metric in ICDs: the model is expected to discriminate as many VA recordings as possible to avoid missed detections, while achieving high detection accuracy on non-VA recordings to reduce the inappropriate-shock rate. The final detection score is obtained by averaging F_β over all N testing subjects:

      Score_detection = (1/N) · Σ_i F_β(i).

  • Generalization score: This score evaluates the generalization of the submitted model for VA detection across different subjects. A well-generalized model performs well on all subjects. We first calculate F_β for each subject from the testing dataset and count N_G, the number of subjects whose F_β surpasses a preset threshold of 0.95. The generalization score is defined as

      Score_gen = N_G / N,

    where N is the total number of testing subjects.
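
The two detection metrics above can be sketched in Python as follows, assuming per-subject confusion-matrix counts are available. The formula is the standard F-beta score with beta = 2, consistent with the description; the function names are illustrative:

```python
def f_beta(tp, fn, fp, beta=2.0):
    # F-beta from one subject's confusion matrix (positive class = VA).
    # beta = 2 weights recall more heavily than precision.
    denom = (1 + beta**2) * tp + beta**2 * fn + fp
    return (1 + beta**2) * tp / denom if denom else 0.0

def detection_score(confusions):
    # Average F-beta over all testing subjects; `confusions` is a list
    # of (TP, FN, FP) tuples, one per subject.
    scores = [f_beta(tp, fn, fp) for tp, fn, fp in confusions]
    return sum(scores) / len(scores)

def generalization_score(confusions, threshold=0.95):
    # Fraction of subjects whose F-beta surpasses the preset threshold.
    scores = [f_beta(tp, fn, fp) for tp, fn, fp in confusions]
    return sum(s > threshold for s in scores) / len(scores)
```

For example, a subject with TP = 8, FN = 2, FP = 1 yields F_beta = 40/49 ≈ 0.82, below the 0.95 generalization threshold.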

For practical performance:

  • Inference latency: The average inference latency L (in ms) on the STM32L031K6T6 over recordings from the testing dataset will be measured. The latency score is normalized as Score_latency = (L_max - L) / (L_max - L_min), where L_max and L_min are preset upper and lower latency bounds.
  • Memory occupation: The memory occupation M (in KiB) will be measured based on the Flash occupation of the NUCLEO-L432KC for the storage of the deep neural network model and the program. More specifically, M is Code + RO Data + RW Data as reported by Keil when building the project. The memory score is normalized as Score_memory = (M_max - M) / (M_max - M_min), where M_max and M_min are preset upper and lower memory bounds.
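
Both practical scores use a min-max style normalization. A minimal sketch follows; the bounds in the example are hypothetical placeholders, since the official bounds are set by the organizers:

```python
def minmax_score(value, lower, upper):
    # Map a measured value (latency in ms or memory in KiB) to [0, 1]:
    # values at or below `lower` score 1.0, at or above `upper` score 0.0.
    clipped = min(max(value, lower), upper)
    return (upper - clipped) / (upper - lower)

# Hypothetical example: 2 ms inference with assumed bounds of [1 ms, 5 ms].
latency_score = minmax_score(2.0, 1.0, 5.0)  # (5 - 2) / (5 - 1) = 0.75
```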

The final score will be calculated as a weighted combination of the detection, generalization, latency, and memory scores defined above:

  Score = w_1 · Score_detection + w_2 · Score_gen + w_3 · Score_latency + w_4 · Score_memory,

where the weights w_i are preset by the organizers and a higher score is better.
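
Assuming a weighted-sum combination of the four sub-scores, the final score can be sketched as below; the weights are hypothetical placeholders, not the official weighting:

```python
def final_score(detection, generalization, latency, memory,
                weights=(0.4, 0.2, 0.2, 0.2)):
    # Weighted sum of the four sub-scores, each already in [0, 1].
    # The weights here are hypothetical; the official weighting is
    # defined by the contest organizers.
    parts = (detection, generalization, latency, memory)
    return sum(w * p for w, p in zip(weights, parts))
```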

7. Example Code

We have provided an example algorithm that illustrates how to train a model with PyTorch and how to evaluate the combined performance in terms of detection performance, flash occupation, and latency.
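
For reference, a minimal PyTorch sketch of a model matching the task's input shape (1 channel × 1,250 samples) is shown below. The architecture and layer sizes are illustrative only, not the provided example model:

```python
import torch
import torch.nn as nn

class TinyIEGMNet(nn.Module):
    # Minimal 1-D CNN for binary VA / non-VA classification.
    # Layer sizes are illustrative, chosen only to keep the parameter
    # count small enough to be plausible for an MCU target.
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(16, 2)

    def forward(self, x):               # x: (batch, 1, 1250)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)       # logits for {non-VA, VA}

model = TinyIEGMNet()
logits = model(torch.randn(4, 1, 1250))  # a batch of four recordings
```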


References

[1] Adabag, A.S., Luepker, R.V., Roger, V.L. and Gersh, B.J., 2010. Sudden cardiac death: epidemiology and risk factors. Nature Reviews Cardiology, 7(4), pp.216-225.

[2] Zanker, N., Schuster, D., Gilkerson, J. and Stein, K., 2016. Tachycardia detection in ICDs by Boston Scientific. Herzschrittmachertherapie + Elektrophysiologie, 27(3), pp.186-192.

[3] Madhavan, M. and Friedman, P.A., 2013. Optimal programming of implantable cardiac-defibrillators. Circulation, 128(6), pp.659-672.

[4] Jia, Z., Wang, Z., Hong, F., Ping, L., Shi, Y. and Hu, J., 2021. Learning to Learn Personalized Neural Network for Ventricular Arrhythmias Detection on Intracardiac EGMs. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-21), pp.2606-2613.

[5] Acharya, U.R., et al., 2018. Automated identification of shockable and non-shockable life-threatening ventricular arrhythmias using convolutional neural network. Future Generation Computer Systems, 79, pp.952-959.

[6] Hannun, A.Y., et al., 2019. Cardiologist-level arrhythmia detection and classification in ambulatory electrocardiograms using a deep neural network. Nature Medicine, 25(1), pp.65-69.

[7] Jia, Z., Li, D., Liu, C., Liao, L., Xu, X., Ping, L. and Shi, Y., 2023. TinyML Design Contest for Life-Threatening Ventricular Arrhythmia Detection. arXiv preprint arXiv:2305.05105.