The Application of Computer Techniques to ECG Interpretation

Introduction
It is over 120 years since Einthoven introduced the electrocardiogram. The technology has changed dramatically, from recording three signals (or leads) to recording over 300, while paradoxically, at the other extreme, single-lead recording is becoming popular in wearables such as the wristwatch. At its simplest, the ECG can be used to monitor heart rate and rhythm; at the other end of the scale, it can confirm a myocardial infarction or even suggest left ventricular diastolic dysfunction. In all of these areas, the digital computer or microprocessor nowadays plays a very significant role.
The concept behind this publication of many articles related to different aspects of ECG analysis was essentially to highlight the various areas where computer techniques support electrocardiology. The topics vary widely and should be of interest to clinicians, biomedical engineers and computer scientists with an interest in electrocardiography.

Historical Links
The compendium opens with a review of the history of automated ECG interpretation [1], which had its beginnings around 1960 and is still in development today. Whereas large digital computers were involved in ECG processing at the outset, today's technology allows automated ECG interpretation in extremely small devices. The future may well see more centralisation of interpretation, which would have the benefit that a single copy of an analytical program could be updated, rather than thousands of individual ECG machines. On the other hand, given the rate at which technology advances, it is not difficult to foresee updated software being downloaded to individual ECG machines, though that would mean more costly equipment.
One of the most significant projects in the field of automated ECG analysis was the work entitled Common Standards for Quantitative Electrocardiography, often known as the CSE project. Some of the output from that project is still very relevant today, particularly the database of ECGs in which the clinical details for each patient are documented. It is now a little outdated, but it is still used as a yardstick for comparing the performance of different algorithms for ECG interpretation. Various definitions of ECG component amplitudes and durations were also set out as a result of this work [1].

Updated Standards
An important spin-off from the CSE project was the creation of a Standard Communication Protocol (SCP). The concept was that there should be a common format in which ECG data were stored, so that the same data could be analysed by different software packages, each accepting the data in a well-defined format. The protocol was not widely adopted, possibly for commercial reasons, but Rubel has persisted with the concept and has contributed in a major way to a recently updated protocol [2].
There are significant regulations governing whether software is regarded as suitable for inclusion in an electrocardiograph for general use. Through the years, different standards have been proposed by the International Electrotechnical Commission (IEC) to which software for automated ECG interpretation, and the associated hardware, should conform. For many years the IEC 60601-2-51 standard was in place, but this was replaced by IEC 60601-2-25. The more recent standard omitted any reference to the diagnostic performance of the software, much to the amazement of those in the field, who could not understand how this came about! In order to resolve this shortcoming of the current standard, Young, Schmid and colleagues have been extremely active over the past few years in preparing a new IEC standard for automated ECG interpretation, and its broad principles are outlined [3]. It is hoped that this new standard will be accepted by the various bodies and become operational in the near future. One shortcoming remains: there is no currently available database of cardiac arrhythmias that can be used to test software, but work is in progress to remedy this situation.

Modelling
Given the deficiencies in well-defined databases for evaluating the performance of ECG software, the work of Doessel et al. in modelling the electrical activity of the heart is of particular relevance [4]. This detailed paper illustrates how modelling has reached the point of being able to generate a wide variety of ECG abnormalities, including arrhythmias and conduction defects as well as myocardial infarction and ischaemia. This is part of a wider European collaboration involving a number of centres interested in modelling. It could ultimately lead to the establishment of a large database to serve as a yardstick against which interpretative software could be evaluated.

Big Data
Ribeiro and colleagues in Brazil have been prominent in recording large numbers of ECGs in Health Centres within one state of that vast country. Their work is now expanding into other states in Brazil and in all probability they have the biggest networking facility worldwide for the routine collection of ECGs.
The authors have made use of almost 2.5 million ECGs from individuals followed up for 3.7 years. The group had a mortality rate of 3.3% and the authors examine the predictive value of a variety of ECG abnormalities with respect to overall mortality [5]. This is an excellent example of big data and computer techniques enhancing the value of ECG analysis.

Artificial Intelligence
One of the more recent developments in ECG interpretation has been the use of artificial intelligence to facilitate ECG classification. This approach, while extremely promising, is still in its infancy with many aspects remaining to be carefully evaluated.
One of the groups which has been most active in this area is from the Mayo Clinic, Minnesota, USA. Rafie et al. describe an approach to 12 lead ECG interpretation which uses one form of artificial intelligence [6]. This is one of the earliest examples of 12 lead interpretation using this newer technique, and the authors frequently refer to the "potential" use of the approach in routine ECG interpretation.
AI has also been used in ambulatory ECG analysis. Xue and Yu [7] outline the areas in ambulatory monitoring where AI methodology is likely to be of greatest help. Ambulatory ECG presents particular problems: patients may be active at any time during a recording, increasing the likelihood of noise contaminating the ECG, and the amount of data generated is enormous, particularly when, as is sometimes the case nowadays, the recording runs for up to 14 days. Automated methods for detecting abnormalities within such a long recording are therefore essential.
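The scale of the data-handling problem in long-term ambulatory monitoring can be illustrated with a back-of-envelope calculation. The figures below (3 leads, 250 Hz sampling, 2 bytes per sample) are illustrative assumptions, not specifications from the article:

```python
# Rough estimate of the raw data volume of a 14-day ambulatory ECG recording.
# Assumed parameters (illustrative only): 3 leads, 250 Hz, 2 bytes per sample.
days, leads, rate_hz, bytes_per_sample = 14, 3, 250, 2

samples = days * 86_400 * rate_hz * leads          # 86,400 seconds per day
volume_gb = samples * bytes_per_sample / 1e9

print(f"{samples:,} samples ≈ {volume_gb:.1f} GB")  # ≈ 0.9 billion samples, ~1.8 GB
```

Even under these modest assumptions, a single recording approaches two gigabytes of raw signal, far beyond what could be reviewed beat by beat without automated screening.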
AI in routine ECG interpretation, in the editor's opinion, still has some hurdles to clear before being fully accepted for basic 12 lead ECG interpretation. Principal amongst these is the positive predictive value of a diagnostic statement. This is not always considered in publications relating to the technique, but it is gradually being acknowledged as one aspect of AI that has to be reviewed very carefully [1].
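Why positive predictive value deserves such scrutiny can be seen from Bayes' theorem: when an abnormality is rare, even an apparently accurate classifier produces mostly false positives. The sketch below uses illustrative figures (90% sensitivity, 95% specificity, 2% prevalence), which are assumptions for demonstration, not results from any of the cited studies:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive diagnostic statements that are true positives."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative (assumed) figures: a classifier with 90% sensitivity and
# 95% specificity, applied to an abnormality present in 2% of recordings.
ppv = positive_predictive_value(0.90, 0.95, 0.02)
print(f"PPV = {ppv:.2f}")  # ≈ 0.27: nearly three in four positive calls are false
```

High sensitivity and specificity on a curated test set therefore say little about how a diagnostic statement will perform in a low-prevalence screening population, which is precisely the setting of routine 12 lead interpretation.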

Monitoring
Monitoring of the ECG in a coronary care unit, intensive care unit or high dependency unit is commonplace. Many monitoring devices have in-built computer algorithms to facilitate detection of life-threatening arrhythmias. However, the biggest problem these devices have faced in recent years has been dealing with artefact due to patient movement and the like, which is inevitable during long-term monitoring.
The article by Pelter et al. [8] highlights the problem and outlines an attempt to produce a database of ECGs from patients being monitored, which should ultimately lead to enhanced algorithms for accurate detection of significant arrhythmias. Current algorithms have a very high percentage of false positive alarms, often to the extent that nursing staff simply turn off the alarms to avoid continuous interruption for checking what frequently turns out to be a false alarm, a scenario known as alarm fatigue. Of course, this raises the obvious problem that a genuine alarm can then be missed. The authors describe the creation of a database of ECGs with genuine alarms, and illustrate how three experienced individuals may be required to agree an interpretation before it can be included in the database. This work suggests that much can still be done to enhance the accuracy of alarm detection by patient monitors. There are many interesting examples of life-threatening arrhythmias in this article.
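An adjudication rule of the kind described, in which an event enters the reference database only when reviewers agree, can be sketched in a few lines. This is a hypothetical illustration of such a protocol, not the authors' actual procedure, and the labels used are invented examples:

```python
from collections import Counter

def adjudicate(reviewer_labels, required=3):
    """Return the agreed label if at least `required` reviewers concur,
    otherwise None (the event is excluded from the reference database).
    A sketch of a consensus rule, not the procedure used by Pelter et al."""
    label, count = Counter(reviewer_labels).most_common(1)[0]
    return label if count >= required else None

# Hypothetical alarm events labelled independently by three reviewers:
print(adjudicate(["VT", "VT", "VT"]))        # VT — unanimous, enters the database
print(adjudicate(["VT", "VT", "artifact"]))  # None — disagreement, excluded
```

Requiring unanimity keeps the reference labels trustworthy at the cost of discarding ambiguous events, which is the trade-off any such database has to make.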

Body Surface Mapping
Body surface mapping (BSM) has been available since the initial work of Taccardi in the 1960s. Nowadays, as many as 300 electrodes can be placed on the anterior and posterior surfaces of the thorax in order to obtain as detailed a pattern as possible of the cardiac excitation as it appears on the body surface. Automated methods for processing such a large amount of data are clearly invaluable; indeed, the technique would not have been pursued to any significant extent had it not been for the revolution in technology over the past 50 years. BSM can be used in different ways. The most basic is simply to visualise the spread of excitation on the body surface and from that infer the nature of any abnormality. Another application is to use the activity on the body surface to determine the electrical activity on the surface of the heart and the spread of excitation within the myocardium. This so-called inverse modelling is gaining importance and has led to clinically available devices for investigating cardiac arrhythmias by localising the source of the abnormal rhythm. Bergquist et al. [9] outline the leading-edge methodology which they use to process body surface maps.
The use of the inverse modelling approach has given rise to the term of electrocardiographic imaging (ECGI). Currently, there is a need for a patient to undergo computed tomography (CT) or magnetic resonance imaging (MRI) in order to obtain accurate cardiac "geometry" prior to ECGI. This is then linked with the mathematical model which allows the cardiac excitation to be determined from the body surface potentials. The authors refer to their recent work on "imageless" ECGI, which represents a significant advance in inverse modelling with CT and MRI being unnecessary prior to ECGI.
The authors conclude that BSM is predominantly a research tool, but nevertheless its use has led to ECGI among other things and it will continue to be of very significant benefit to electrocardiological developments in general.

Application of ECGI
Andrews et al. describe the use of ECGI in facilitating the understanding of cardiac resynchronisation therapy [10]. When ECGI is combined with speckle tracking echocardiography, it is possible to look at the motion of the left ventricle and at the same time link this with the spread of electrical activation. ECGI was in large part developed by one of the authors of this article (Rudy) some years ago.
The study describes remodelling of the heart after biventricular pacing has been initiated. Using ECGI, epicardial electrograms can be determined and, from these, various other measures of electrical activity are computed. This allows the investigators to assess the effects of pacing over time. Using this technique, the link between myocardial activation and muscular contraction can be assessed in relation to the effectiveness of biventricular pacing. The authors conclude that the ECGI-measured delay in activation of the left ventricle is an excellent index for selecting patients for cardiac resynchronisation therapy.

Interatrial Block
Interest in interatrial block has increased recently due to the work of Bayes-de-Luna and his colleagues. They have described different types of interatrial block which, in general, have not often been reported as part of an ECG interpretation. The work presented [11] is therefore significant in highlighting these electrocardiographic abnormalities and the underlying myocardial problems they reflect.
It would indeed be difficult to produce a detailed automated interpretation of P wave abnormalities, together with PR interval changes, given that the P wave can often be one of the most difficult components of the ECG to measure accurately. Its projection onto multiple leads means that on occasion the P wave is essentially not seen in some leads, making analysis even more difficult. The door is therefore open for those who wish to undertake further work in automated ECG interpretation to apply their talents in this particular area.

Conclusions
It is hoped that the articles in this book will show the reader that there is still life in electrocardiographic research! Although it is a relatively old investigational technique, the ECG still remains of paramount importance in clinical investigation of patients. Advances in technology are gradually leading to advances in understanding of various aspects of the ECG although there are still many areas where knowledge is incomplete. Perhaps in due course there will be a complete understanding of the genesis of the ECG, and automated techniques will be able to give a fully detailed interpretation of the individual spread of cardiac activation in every patient.