Anomaly Detection in TensorFlow and Keras Using the Autoencoder Technique | by Rashida Nasrin Sucky | Sep 2023


Photograph by Leiada Krozjhen on Unsplash

A cutting-edge unsupervised method for noise removal, dimensionality reduction, anomaly detection, and more

All the tutorials about TensorFlow and neural networks I have shared so far have been about supervised learning. This one will be about the autoencoder, which is an unsupervised learning technique. To put it simply, an autoencoder reduces noise by compressing the input data, encoding it, and then reconstructing it. In this way, autoencoders can reduce the dimensionality or the noise of the data and focus on the real substance of the input.

As you can see from this introduction to autoencoders, more than one step is required:

  1. First, a model that compresses the input data: the encoder model.
  2. Then another model that reconstructs the compressed data so that it is as close as possible to the original input: the decoder model.

Through this process, the autoencoder can remove noise, reduce dimensionality, and clean up the input data. A minimal sketch of the encoder/decoder pair follows below.
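To make the two pieces concrete, here is a minimal Keras sketch of an encoder and a decoder joined into an autoencoder. The input size, bottleneck size, and layer choices are illustrative assumptions and are not the model built later in this tutorial.

import tensorflow as tf
from tensorflow.keras import layers, Model

input_dim = 784    # e.g., a flattened 28x28 image (assumed for illustration)
latent_dim = 64    # size of the compressed (encoded) representation

# Encoder: compresses the input into a small latent vector
encoder_input = tf.keras.Input(shape=(input_dim,))
latent = layers.Dense(latent_dim, activation='relu')(encoder_input)
encoder = Model(encoder_input, latent, name='encoder')

# Decoder: reconstructs the input from the latent vector
decoder_input = tf.keras.Input(shape=(latent_dim,))
reconstruction = layers.Dense(input_dim, activation='sigmoid')(decoder_input)
decoder = Model(decoder_input, reconstruction, name='decoder')

# Autoencoder: encoder followed by decoder, trained to reproduce its own input
autoencoder = Model(encoder_input, decoder(encoder(encoder_input)), name='autoencoder')
autoencoder.compile(optimizer='adam', loss='mse')
# autoencoder.fit(x_train, x_train, ...)  # note: the input and the target are the same

Because the model is trained to reproduce its input, samples it reconstructs poorly (high reconstruction error) are candidates for anomalies.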

In this tutorial, I will explain in detail how an autoencoder works, with a working example.

For this example, I chose to use a public dataset (Apache License 2.0) named deep_weeds.

import tensorflow as tf
import tensorflow_datasets as tfds

# Load the deep_weeds dataset from TensorFlow Datasets
ds = tfds.load('deep_weeds', split='train', shuffle_files=True)
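If you want to confirm what was loaded, tfds can also return the dataset metadata. The small check below is an illustrative addition, not part of the original snippet.

# Optional: load the dataset together with its metadata to inspect its size and classes
ds, info = tfds.load('deep_weeds', split='train', shuffle_files=True, with_info=True)
print(info.features['label'].num_classes)   # number of classes in the dataset
print(info.splits['train'].num_examples)    # number of images in the train split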

Data Preparation

We need to prepare a dataset for this unsupervised anomaly detection example. Only one class will be taken as our main class, which will be considered the valid class. Then I will mix in a few data points from another class as anomalies. Finally, we will develop the model and see if it can find those few anomalous data points.

I chose class 5 as the valid class and class 1 as the anomaly. In the code block below, I first take all the data of classes 5 and 1 and create lists of the images and their corresponding labels.

import numpy as np

# Lists to hold the valid-class (main) and anomaly-class images and labels
images_main = []
images_anomaly = []
labels_main = []
labels_anomaly = []

ds = ds.prefetch(tf.data.AUTOTUNE)
for example in ds:
    ...  # loop body truncated in the source
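The loop body is truncated in the source. A plausible sketch of what it does, assuming the 'image'/'label' keys that tfds provides for deep_weeds, is shown below; the exact filtering logic is an assumption, not the author's original code.

# Assumed completion of the truncated loop: separate class 5 (valid) from class 1 (anomaly)
for example in ds:
    image, label = example['image'], example['label']
    if label.numpy() == 5:        # valid (main) class
        images_main.append(image)
        labels_main.append(label)
    elif label.numpy() == 1:      # anomaly class
        images_anomaly.append(image)
        labels_anomaly.append(label)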
