
Heart Attack Detection in Python using TensorFlow and Keras

By Anoushka Mergoju

In this project, we will explore the TensorFlow and Keras deep learning APIs to detect heart attacks and, along the way, understand deep neural networks in Python programming. We will use an open-source dataset, Heart_dataset.

Download the dataset, and we can move on to implementing the model using the TensorFlow and Keras deep learning APIs.

 

from google.colab import drive
drive.mount("/content/gdrive")

After downloading the dataset, upload the CSV file to your Google Drive and mount the drive in Colab as shown above.
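If you are not working in Google Colab, the drive mount is not needed; a minimal sketch is to read the CSV from a local path instead (the filename below is just a placeholder for wherever you saved the dataset).

import pandas as pd

# Read the dataset from a local file instead of Google Drive
# (replace "heart.csv" with the actual path to your downloaded file)
df = pd.read_csv("heart.csv")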

 

# Importing libraries
import numpy as np
import pandas as pd

%matplotlib inline
import matplotlib as mpl
import matplotlib.pyplot as plt

import tensorflow as tf
from tensorflow import keras

In the above code snippet, we have imported the NumPy, pandas, and Matplotlib libraries. We have also imported TensorFlow and Keras, which will play the major role in building a model for detecting/predicting heart attacks.

 

# Loading dataset
df=pd.read_csv('/content/gdrive/My Drive/CodeSpeedy/heart.csv')
df.head()

In the above code snippet, we loaded the dataset and displayed the first five rows using head(). Now, let's get the information from our dataset.

df.info()
# describe the dataset
df.describe()

In the above code snippets, df.info() shows the column types and non-null counts, while df.describe() gives summary statistics, so we understand the amount of data we'll be dealing with in this project. Now, let's move further into the correlation matrix of this dataset.

 

mpl.rcParams['figure.figsize'] = 12, 10
plt.matshow(df.corr())
plt.yticks(np.arange(df.shape[1]), df.columns)
plt.xticks(np.arange(df.shape[1]), df.columns)
plt.colorbar()
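If you prefer numbers to a colour map, an optional extra check (not part of the snippet above) is to print how strongly each feature correlates with the label; this assumes the label column is named 'target', as it is in heart.csv.

# Correlation of every feature with the target, sorted from most positive to most negative
print(df.corr()['target'].sort_values(ascending=False))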

Now, let's look at the dataset by depicting it as a histogram.

df.hist()
Now, let's take a closer look at the target classes in our dataset.
 
dataset = df
mpl.rcParams['figure.figsize'] = 7, 5
target_counts = dataset['target'].value_counts().sort_index()
plt.bar(target_counts.index, target_counts.values, color = ['blue', 'yellow'])
plt.xticks([0, 1])
plt.xlabel('Target Classes')
plt.ylabel('Count')
plt.title('Count of each Target Class')
The bar chart above visualizes the count of records in each target class of the dataset.
 
from sklearn.preprocessing import StandardScaler
df = pd.get_dummies(df, columns = ['sex', 'cp', 'fbs', 'restecg', 'exang', 'slope', 'ca', 'thal'])
standardScaler = StandardScaler()
columns_scale = ['age', 'trestbps', 'chol', 'thalach', 'oldpeak']
df[columns_scale] = standardScaler.fit_transform(df[columns_scale])

df.head()
We observe here that there are lots of categorical values, such as 0's and 1's, among our input features, so it is advisable to obtain dummy variables for the aforementioned categorical columns.

Here, we used the function pd.get_dummies() to get dummy variables for categorical ones.
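To make the effect of pd.get_dummies() concrete, here is a small self-contained sketch on a toy column; the values 0 to 3 mirror the 'cp' (chest pain type) column, but the data itself is made up purely for illustration.

import pandas as pd

# A toy frame with a categorical column taking the values 0..3
toy = pd.DataFrame({'cp': [0, 2, 1, 3, 2]})

# One-hot encode it: each category becomes its own 0/1 indicator column (cp_0, cp_1, cp_2, cp_3)
print(pd.get_dummies(toy, columns=['cp']))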


# Let's split the data into train and test using sklearn
from sklearn.model_selection import train_test_split
y = df['target']
X = df.drop(['target'], axis = 1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.33, random_state = 0)

np.random.seed(42)
tf.random.set_seed(42)

X_train.shape

In the above code snippets, we set fixed random seeds for NumPy and TensorFlow so that the results are reproducible, and checked the shape of our training data. After one-hot encoding, X_train has 30 feature columns, which matches the input_dim=30 we will pass to the first layer of our model.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
model = Sequential()
model.add(Dense(15, input_dim=30, activation='relu'))
model.add(Dense(10, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dropout(.2))
model.add(Dense(1, activation='sigmoid'))

In the above code snippet, we built a Sequential model. The hidden Dense layers use the ReLU activation function, the single-unit output layer uses a sigmoid activation suitable for binary classification, and a Dropout layer with a 20% rate is placed before the output to reduce overfitting.

Now, let's summarise our model.

model.summary()
Here, we'll notice that we have 722 trainable parameters: (30*15 + 15) + (15*10 + 10) + (10*8 + 8) + (8*1 + 1) = 465 + 160 + 88 + 9 = 722. The Dropout layer adds no parameters.
 
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=['accuracy'])
model_history = model.fit(X_train, y_train, epochs=200, validation_data=(X_test, y_test))
In the above code, we compiled our model with the binary cross-entropy loss function and the Adam optimizer, and then trained it for 200 epochs, using the test set as validation data.
 
model.compile(loss="binary_crossentropy", optimizer="SGD", metrics=['accuracy'])
model_history.history
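A quick optional sketch, using the model_history object returned by fit() above, is to plot how the training and validation accuracy evolve over the 200 epochs (in recent TensorFlow versions the history keys are 'accuracy' and 'val_accuracy'; older Keras releases use 'acc' and 'val_acc').

# Plot the training and validation accuracy recorded for each epoch
plt.plot(model_history.history['accuracy'], label='train accuracy')
plt.plot(model_history.history['val_accuracy'], label='validation accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
plt.show()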
Now, let's predict the values using our model.
 
y_pred = model.predict(X_test)
print(y_pred)

Since the sigmoid output layer produces probabilities between 0 and 1, we can verify the predicted output against the actual values in y_test.
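As a minimal sketch of such a check, we can threshold the predicted probabilities at 0.5 to turn them into class labels and compute the accuracy on the test set; the 0.5 cutoff is an assumption for illustration, not something fixed by the model.

from sklearn.metrics import accuracy_score

# Convert the sigmoid probabilities into 0/1 class labels using a 0.5 threshold
y_pred_labels = (y_pred > 0.5).astype(int).ravel()

# Compare the predicted labels against the true labels from the test split
print("Test accuracy:", accuracy_score(y_test, y_pred_labels))

Keras also offers model.evaluate(X_test, y_test), which reports the same loss and accuracy directly.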

In conclusion, we have predicted/detected heart attacks using TensorFlow and Keras APIs in order to understand deep neural networks in Python programming.

Thank you.

 
