eHealth Hub Blog

Big Data and its Impact on Cardiology

Last July, the FDA approved alirocumab (Praluent), the first monoclonal antibody that inactivates proprotein convertase subtilisin/kexin type 9 (PCSK9), a circulating negative regulator of LDL receptors in the liver. By blocking PCSK9, more receptors remain available to clear LDL cholesterol from the blood, so LDL cholesterol levels fall below what statins alone achieve.

But beyond the indisputable pharmacological advance this drug represents, what is truly important is that it embodies a new vision of medicine: it is one of the first successful therapeutic targets identified through genomic analysis, specifically mutations of PCSK9 as a causal genetic mechanism of familial hypercholesterolemia.

This new development in so-called inductive drugs is based on the premise that, once it is verified that these genes play an important role in disease progression and qualify as therapeutic targets, we must identify the patients who display the necessary and sufficient conditions linking the chosen target to the pathophysiology of their cardiovascular disease.

It is only now that we can achieve this goal, through the analysis of Big Data generated by epidemiological variables, electronic health records and genomic databases.

Meanwhile, the leading Japanese cardiologist Masafumi Kitakaze has just published an interesting article (Journal of the American College of Cardiology, Vol. 66, No. 2, 2015) on cardiovascular disease trends in Asia and Japan. He highlights the need for epidemiological studies that accurately characterize risk factors, their distribution and their synergistic effects, so that they can be modified in the short and medium term and ischemic heart disease and heart failure can be prevented effectively at the primary, secondary and tertiary levels. He suggests that Big Data and data mining may be ways of obtaining such evidence, though he is skeptical about immediate applications.

Many signs are directing cardiologists to tune into Big Data and the other components of this new vision of medicine as soon as possible. There are irreversible technological realities that every cardiologist needs to know, such as:

  • High-performance computing (HPC) uses parallel processing to run advanced applications quickly, efficiently and reliably (see the sketch after this list).
  • The number of supercomputer centers that employ co-processors and accelerators has doubled in the past two years.
  • HPC resources in the cloud are increasingly available as a consumer service.
  • We are reaching a new technology platform that was identified by IDC (International Data Corporation) in 2007 and consists of mobile computing, cloud services, Big Data, analytics and social networks.
  • Global spending on software and hardware related to Big Data and analytics will grow to 125 billion dollars, and big data supply chains will grow in importance as a cloud platform service.
  • The adoption of cloud infrastructure as a service (IaaS) is expected to grow by 36% this year.
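
To make the first point concrete, here is a minimal sketch of the parallel-processing idea behind HPC, written in Python on a single machine. A real HPC cluster distributes work across many nodes rather than local CPU cores, and every name in this example is invented for illustration.

    from multiprocessing import Pool

    def simulate_patient_risk(seed: int) -> float:
        # Stand-in for an expensive per-patient computation (hypothetical).
        total = 0.0
        for i in range(1, 100_000):
            total += ((seed * i) % 97) / 97.0
        return total / 100_000

    if __name__ == "__main__":
        patients = range(1_000)
        # Spread the independent per-patient jobs across all available cores.
        with Pool() as pool:
            risks = pool.map(simulate_patient_risk, patients)
        print(f"processed {len(risks)} synthetic patients in parallel")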

Data analysis should be performed by dedicated analysts. Their exact profile is still being debated, but the need for quantitative training is pressing, since the rate at which data is being generated far exceeds the supply of specialists able to analyze it. The medical specialty that manages to design a communicative, intelligent and goal-oriented relationship with these analysts will have a great advantage.

The Internet of Things is an accelerator of innovation and of the growth of the other components, through new solutions based on intelligent embedded devices that reach beyond the telecommunications industry and are transforming fields as varied as finance, transportation, healthcare, location-based services and construction.

Wireless data will be the fastest-growing segment of telecommunications, and mobile applications and devices such as smartphones and tablets are projected to reach sales of 484 billion units by 2016. In 2015 there were already 291 million connected objects in Latin America alone. But are there tools to analyze all of this? Which ones should we know?

Hadoop, an open-source data storage and Big Data analysis platform, was developed to solve part of the problems associated with Big Data and the emergence of data science. It is optimized to handle massive datasets through parallelism on inexpensive hardware, and its advantages are obvious: computing and storage capacity scale linearly thanks to fault-tolerant distributed processing, and it runs on anything from a few servers to several thousand while delivering consistent service across all of them.

MapReduce, the programming model Google used to support parallel computing, processes large batches of data across groups of computers, or clusters. It offers several advantages, such as streaming access to data, meaning data is delivered as it is consumed rather than downloaded in advance. Hadoop clusters are equipped to store large volumes of files of all types, can run on machines from different manufacturers, move the algorithm to the data rather than the other way around, and provide monitoring tools.
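
As an illustration of the model, below is the canonical MapReduce example, a word count, written in Python in the style of a Hadoop Streaming job. In practice the map and reduce phases live in two separate scripts, and the paths in the closing comment are invented.

    import sys

    def mapper():
        # Map phase: emit one "<word>\t1" pair for every word on stdin.
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # Reduce phase: Hadoop sorts map output by key, so identical words
        # arrive consecutively and a running sum per key is enough.
        current_word, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t", 1)
            if word != current_word and current_word is not None:
                print(f"{current_word}\t{count}")
                count = 0
            current_word = word
            count += int(value)
        if current_word is not None:
            print(f"{current_word}\t{count}")

    # Submitted (roughly) as two scripts via Hadoop Streaming:
    #   hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py \
    #       -mapper mapper.py -reducer reducer.py -input notes/ -output counts/

Hadoop handles the shuffle and sort between the two phases and ships the scripts to the nodes holding the data, which is exactly the "algorithm to the data" property described above.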

Processing provides these important results (a minimal sketch follows the list):

  • It provides logical structure to the processed data sets.
  • It stores data in the selected repository.
  • It analyzes the available data to find relationships.
  • It applies algorithms to the data.
  • It applies statistical processes.
  • It resolves the requests launched through modeling.
  • It interprets Big Data.
  • It interprets the different solutions.
  • It provides a final result.
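
As a minimal sketch of those steps end to end, the following Python fragment structures synthetic data, looks for relationships, fits a model and reports a result; every column name and coefficient is invented for illustration.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Steps 1-2: give the raw data a logical structure and store it in a
    # repository (an in-memory DataFrame stands in for a real data store).
    df = pd.DataFrame({
        "age": rng.integers(40, 85, 500),
        "systolic_bp": rng.normal(135, 18, 500),
        "ldl": rng.normal(130, 30, 500),
    })
    # Synthetic outcome so the example runs end to end.
    logit = 0.04 * df["age"] + 0.02 * df["systolic_bp"] + 0.01 * df["ldl"] - 7
    df["event"] = (rng.random(500) < 1 / (1 + np.exp(-logit))).astype(int)

    # Step 3: analyze the available data to find relationships.
    print(df.corr(numeric_only=True)["event"])

    # Steps 4-7: apply an algorithm and statistical process; resolve the
    # modeling request.
    features = ["age", "systolic_bp", "ldl"]
    model = LogisticRegression(max_iter=1000).fit(df[features], df["event"])

    # Steps 8-10: interpret the solution and provide a final result.
    print(dict(zip(features, model.coef_[0].round(3))))
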
For the sake of synthesis, and in the certainty that new spaces for debate will remain available, I will offer examples of the impact in three areas of cardiology: cardiovascular epidemiology, cardiac imaging and evidence-based medicine.

Cardiovascular epidemiology rests partly on risk scores such as Framingham, Gaziano or PAHO, built on well-known risk factors (cholesterol, hypertension, smoking, diabetes, etc.), along with a set of secondary lipid and non-lipid risk factors such as Lp-PLA2 or C-reactive protein, whose specific weights always have limitations when applied to the individual patient.

Adding data from electronic health records tends to improve these predictions, establishing hierarchical orders of significance within defined groups, something we do not do at present. And since cardiovascular disease shares risk factors with cerebrovascular disease and cancer, the models can be extended toward global health prediction.
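
A hedged sketch of the idea, on entirely synthetic data: two logistic models are compared, one using only classic risk factors and one adding an invented EHR-derived feature, with discrimination measured by AUC. Nothing here reproduces the coefficients of any real risk score.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 2_000
    classic = rng.normal(size=(n, 3))        # stand-ins: age, cholesterol, SBP
    ehr_extra = rng.normal(size=(n, 1))      # stand-in: prior-admission count
    logit = classic @ np.array([0.8, 0.5, 0.4]) + 0.9 * ehr_extra[:, 0]
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    for name, X in [("classic factors", classic),
                    ("classic + EHR feature", np.hstack([classic, ehr_extra]))]:
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        probs = LogisticRegression().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
        print(f"{name}: AUC = {roc_auc_score(y_te, probs):.3f}")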

Big Data in cardiovascular epidemiology makes it possible to study different geographies and demographic groups, whose scope can be modeled according to the aims of the prevalence study, selecting subpopulations in specific areas. Studying population health in small areas allows local health policies to be designed and ever-dwindling resources to be planned optimally. Where defined areas or populations have electronic records, the data can be correlated with hospitalizations, allowing more accurate verification of incidence and continuous monitoring.

Where data is integrated across institutions or areas, it becomes easier to adopt agile, centrally defined interventions. A further advantage is the availability of abundant environmental data that, though not specifically health-related, allow a more complete investigation of, for example, the effects of air pollution on ischemic heart disease.
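
A minimal sketch of that kind of integration, with both tables and all columns invented: an air-quality series is joined to a hospital admissions series and the correlation is checked. A real study would, of course, adjust for season, weather and other confounders.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    days = pd.date_range("2015-01-01", periods=365, freq="D")

    # Invented environmental table: daily fine-particulate readings.
    air = pd.DataFrame({"date": days, "pm25": rng.gamma(4.0, 6.0, 365)})
    # Invented hospital table: daily ischemic-event admissions, loosely
    # tied to PM2.5 so the example shows a detectable relationship.
    admissions = pd.DataFrame({
        "date": days,
        "ihd_admissions": rng.poisson(5 + air["pm25"] / 15),
    })

    merged = air.merge(admissions, on="date")
    print(merged[["pm25", "ihd_admissions"]].corr())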

Cardiac imaging has advanced with the emergence of new techniques for determining function, structure, perfusion and metabolism and for characterizing tissue.

Multimodality imaging, comprising echocardiography, multidetector computed tomography, nuclear medicine, MRI, angiographic studies and functional evaluation of lesions, generates high volumes of data.

The ability to access vast quantities of stored information will give rise to new forms of image analysis, allowing patients to be classified into new categories and improving the image display workflow.

The union of Big Data and HPC will identify new patterns in large datasets, facilitating medical decision-making: for example, correlating the SYNTAX score with different imaging modalities in different patient subgroups, or simplifying the noninvasive diagnosis of congenital heart disease, which is still considered an art form in cardiology.
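
As a hedged sketch of what finding "new patterns in large datasets" can mean in practice, the fragment below clusters synthetic imaging-derived features to propose patient subgroups; the features, values and group structure are all invented.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Invented per-patient imaging features (ejection fraction %, wall
    # thickness mm, perfusion score), drawn from two synthetic groups.
    group_a = rng.normal([60, 9, 0.9], [5, 1.0, 0.05], size=(150, 3))
    group_b = rng.normal([35, 13, 0.6], [6, 1.5, 0.10], size=(150, 3))
    features = np.vstack([group_a, group_b])

    scaled = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
    print("patients per discovered subgroup:", np.bincount(labels))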

The evidence-based medicine we have known until now should gradually shift toward a new dimension through the convergence of technological advances, especially mobile data and new methodologies, that transform massive amounts of apparently meaningless data into data about data with analytic purpose. This is coupled with increasingly effective cloud processing and the emergence of new sensors and devices.

In cardiology, the empowerment of patients through mobile devices and applications will be a critical field of development, primarily for the outpatient management of heart failure and the reduction of costly readmissions, but also in atrial fibrillation and specific therapeutic settings.
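
As a minimal sketch of this kind of outpatient tool, the following flags a telemonitored heart-failure patient whose weight rises rapidly, a common sign of fluid retention. The roughly 2 kg in 3 days threshold is a widely cited rule of thumb used here only for illustration, not a clinical recommendation, and the readings are invented.

    def weight_gain_alert(daily_weights_kg, window_days=3, threshold_kg=2.0):
        """Return True if weight rose by >= threshold within any window."""
        for i in range(len(daily_weights_kg) - window_days + 1):
            window = daily_weights_kg[i:i + window_days]
            if window[-1] - window[0] >= threshold_kg:
                return True
        return False

    # Synthetic daily telemonitoring readings for one patient.
    readings = [81.0, 81.2, 81.4, 82.1, 83.5]
    print(weight_gain_alert(readings))  # True: +2.1 kg over the last 3 days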

Once the problem of data standardization within and between hospitals with electronic records is solved, those records may provide a pool from which to select candidates for clinical trials. Pfizer has taken an important step with Exco InTouch's eDiary tool for its mobile-enabled Participatory Patient-Centered (PPC) clinical trial, although I believe we still need to model how to reach the right patient, evaluate them, enroll them and monitor them.
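
A minimal sketch of what such a selection pool makes possible once records are standardized: eligibility criteria become a declarative query. The table, columns and criteria below are all hypothetical.

    import pandas as pd

    # Invented, already-standardized patient table.
    ehr = pd.DataFrame({
        "patient_id": [1, 2, 3, 4, 5],
        "age": [58, 72, 45, 66, 81],
        "lvef_percent": [32, 55, 28, 38, 41],
        "on_ace_inhibitor": [True, True, False, True, False],
    })

    # Hypothetical inclusion criteria: age 50-80, LVEF < 40%, on an ACE inhibitor.
    candidates = ehr.query("50 <= age <= 80 and lvef_percent < 40 and on_ace_inhibitor")
    print(candidates["patient_id"].tolist())  # -> [1, 4]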

A few obstacles remain to strengthening the role of Big Data, such as incentives for sharing data, preserving privacy, and setting limits on the anonymization of datasets.

The eminent Pio Baroja, one of the most emblematic figures of Spain’s Generation of ’98, said of the advances of the late nineteenth century, “Modernity may be good or bad, but there is only one way to face it: Accepting it.”

Faced with these new changes, we too have only one way to confront them: adopting new technologies and improving them for the benefit of our patients.

Dr. Juan Prohías

Current position: Director of the Cardiocentro at the Hospital Clínico-Quirúrgico Hermanos Ameijeiras. Specialties: cardiology and cardiac imaging. Doctor of Medicine from the University of Havana. Professor at the University of Havana. Member of the American College of Cardiology. Head of the National Cardiology Group, Cuba. Medical Director of the CYTED consortium project from 2007 to 2011.
