TensorFlow is arguably the most popular library for deep learning. I have written many TensorFlow tutorials before and will continue to do so. TensorFlow is a well-organized, easy-to-use package where you don't need to worry too much about model development and training; the package takes care of most of it. That is probably why it has become so popular in industry. At the same time, it is sometimes nice to have control over what happens behind the scenes. That control gives you a lot of room to experiment with models, and if you are looking for a job, the extra knowledge can give you an advantage.
Previously, I wrote an article on how to develop custom activation functions, layers, and loss functions. In this article, we will see how to train a model manually and update its weights yourself. But don't worry: you don't need to relearn differential calculus. TensorFlow's GradientTape() takes care of that part.
If GradientTape() is completely new to you, feel free to check out these exercises that show how it works: Introduction to GradientTape in TensorFlow – Regenerative (regenerativetoday.com)
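For a quick sense of the mechanics, here is a minimal sketch: GradientTape() records the operations you run inside its context and can then return the gradient of the result with respect to any watched variable. The variable and the toy function below are purely illustrative.

import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    # Operations on tf.Variable objects are recorded on the tape automatically
    y = x ** 2 + 2 * x

# dy/dx = 2x + 2, which evaluates to 8.0 at x = 3.0
dy_dx = tape.gradient(y, x)
print(dy_dx.numpy())  # 8.0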
Data preparation
In this article, we work through a simple classification model in TensorFlow using GradientTape(). Download the dataset from this link:
Heart Failure Prediction Dataset (kaggle.com)
This dataset is available under an open database license.
These are the necessary imports:
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.ticker as mticker
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
import itertools
from tqdm import tqdm
import tensorflow_datasets as tfds
Creating the DataFrame from the dataset (pandas is already imported above):
df = pd.read_csv('heart.csv')
df
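Before moving on to the model, it helps to take a quick look at the data. Here is a minimal sketch; note that 'HeartDisease' as the name of the binary target column is an assumption based on the Kaggle heart-failure dataset, so confirm it against your downloaded file.

# Inspect the shape and the first few rows of the DataFrame
print(df.shape)
print(df.head())

# 'HeartDisease' is assumed to be the binary target column -- verify it in your copy of heart.csv
print(df['HeartDisease'].value_counts())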