Saturday, April 8, 2023

Part G: Text Classification with a Recurrent Layer

 



Author: Murat Karakaya
Date created: 17 02 2023
Date published: 08 04 2023
Last modified: 08 04 2023

Description: This is Part G of the tutorial series “Multi-Topic Text Classification with Various Deep Learning Models”, which covers all the phases of multi-class text classification:

  • Exploratory Data Analysis (EDA),

We will design various Deep Learning models by using

  • Keras Embedding layer,

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python / TensorFlow / Keras environment.

We will use a Kaggle Dataset in which there are 32 topics and more than 400K total reviews.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net.

You can access all the codes, videos, and posts of this tutorial series from the links below.



PARTS

This tutorial series consists of several parts covering Text Classification with various Deep Learning Models. You can access all the parts from this index page.

In this part, we will use the Keras Bidirectional LSTM layer in a Feed Forward Network (FFN).
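As a preview of the idea, a Bidirectional LSTM text classifier can be sketched in Keras as follows. This is a minimal sketch: the vocabulary size, sequence length, and layer widths are assumed values for illustration, not necessarily the ones used in the tutorial.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 32     # the dataset used in this series has 32 topics
VOCAB_SIZE = 20000   # assumed vocabulary size
SEQ_LEN = 100        # assumed padded review length

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN,), dtype="int32"),
    layers.Embedding(VOCAB_SIZE, 64),
    # The Bidirectional wrapper runs the LSTM forward and backward over
    # the sequence and concatenates both final states (2 * 64 = 128 dims)
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

# A batch of 8 fake token-id sequences, just to check the output shape
probs = model(np.random.randint(0, VOCAB_SIZE, size=(8, SEQ_LEN)))
```

Each row of `probs` is a probability distribution over the 32 topics.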

If you are not familiar with the Keras LSTM layer or the Recurrent Networks concept, you can check the following Murat Karakaya Akademi YouTube playlists:

English:

Turkish:

If you are not familiar with classification with Deep Learning, you can find the 5-part tutorial in the Murat Karakaya Akademi YouTube playlists below:

Saturday, November 19, 2022

Part F: Text Classification with a Convolutional (Conv1D) Layer in a Feed-Forward Network

 




Author: Murat Karakaya
Date created: 17 09 2021
Date published: 11 03 2022
Last modified: 29 12 2022

Description: This is Part F of the tutorial series “Multi-Topic Text Classification with Various Deep Learning Models”, which covers all the phases of multi-class text classification:

  • Exploratory Data Analysis (EDA),

We will design various Deep Learning models by using

  • Keras Embedding layer,

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python / TensorFlow / Keras environment.

We will use a Kaggle Dataset in which there are 32 topics and more than 400K total reviews.
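As a preview, a Conv1D-based text classifier for the 32 topics can be sketched as follows. The vocabulary size, sequence length, and filter settings here are assumed values for illustration, not necessarily those used in the post.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 32     # 32 topics in the Kaggle dataset
VOCAB_SIZE = 20000   # assumed vocabulary size
SEQ_LEN = 100        # assumed padded review length

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN,), dtype="int32"),
    layers.Embedding(VOCAB_SIZE, 64),
    # Conv1D filters slide over word embeddings, acting like n-gram detectors
    layers.Conv1D(128, kernel_size=5, activation="relu"),
    layers.GlobalMaxPooling1D(),   # keep the strongest activation per filter
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

probs = model(np.random.randint(0, VOCAB_SIZE, size=(4, SEQ_LEN)))
```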

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net.

You can access all the codes, videos, and posts of this tutorial series from the links below.



PARTS

This tutorial series consists of several parts covering Text Classification with various Deep Learning Models. You can access all the parts from this index page.



Photo by Josh Eckstein on Unsplash

Wednesday, November 16, 2022

Sequence To Sequence Learning With Tensorflow & Keras Tutorial Series



The Seq2Seq Learning Tutorial Series aims to build an Encoder-Decoder Model with Attention. I would like to develop the solution step by step, showing the shortcomings of other possible approaches as well. Therefore, in the first 2 parts, we will observe that the initial models have their own weaknesses, and we will understand why the Encoder-Decoder paradigm is so successful.

You can access all the parts in the below links.


Photo by Clay Banks on Unsplash

Thursday, November 10, 2022

Seq2Seq Learning Part A: Introduction & A Sample Solution with MLP Network

 


If you are interested in Seq2Seq Learning, I have good news for you. Recently, I have been working on Seq2Seq Learning, and I decided to prepare a series of tutorials that progresses from a simple Multi-Layer Perceptron Neural Network model to an Encoder-Decoder Model with Attention.
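To give a flavor of the starting point, the sketch below presses a plain MLP into a Seq2Seq-style task. The toy problem here (reversing a fixed-length sequence of numbers) is an assumption for illustration only; the post defines its own sample problem.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Toy stand-in problem (an assumption, not the post's exact sample problem):
# learn to reverse a fixed-length sequence with a plain MLP.
SEQ_LEN = 4
rng = np.random.default_rng(0)
X = rng.random((1000, SEQ_LEN)).astype("float32")
Y = X[:, ::-1].copy()  # target: the input sequence reversed

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(SEQ_LEN),  # one output per position; no recurrence at all
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=5, verbose=0)

pred = model.predict(X[:2], verbose=0)
```

Because the MLP has no notion of time steps, it must treat every position independently, which is exactly the weakness later parts address with recurrent layers.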

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net.

Thank you!



Photo by Hal Gatewood on Unsplash

Seq2Seq Learning Part B: Using the LSTM layer in a Recurrent Neural Network

 


Welcome to Part B of the Seq2Seq Learning Tutorial Series. In this tutorial, we will use several Recurrent Neural Network models to solve the sample Seq2Seq problem introduced in Part A.

We will use LSTM as the Recurrent Neural Network layer in Keras.
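A minimal sketch of the idea (the dimensions are assumed toy values): setting return_sequences=True makes the LSTM emit a hidden state at every time step, so a TimeDistributed Dense head can produce one prediction per step.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN, FEATURES = 4, 1   # assumed toy dimensions

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN, FEATURES)),
    layers.LSTM(32, return_sequences=True),   # a hidden state per time step
    layers.TimeDistributed(layers.Dense(1)),  # one prediction per time step
])

out = model(np.random.rand(2, SEQ_LEN, FEATURES).astype("float32"))
```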

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Thank you!


Photo by Jess Bailey on Unsplash

Seq2Seq Learning Part C: Basic Encoder-Decoder Architecture & Design

 


Welcome to Part C of the Seq2Seq Learning Tutorial Series. In this tutorial, we will design a Basic Encoder-Decoder model to solve the sample Seq2Seq problem introduced in Part A.

We will use LSTM as the Recurrent Neural Network layer in Keras.
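The basic Encoder-Decoder wiring can be sketched as follows (all dimensions are assumed toy values): the encoder compresses the input into its final hidden and cell states, which then initialize the decoder's LSTM.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN, FEATURES, LATENT = 4, 1, 32   # assumed toy dimensions

# Encoder: discard the per-step outputs, keep only the final LSTM states
# as a fixed-size summary of the whole input sequence
enc_in = layers.Input(shape=(SEQ_LEN, FEATURES))
_, state_h, state_c = layers.LSTM(LATENT, return_state=True)(enc_in)

# Decoder: start from the encoder's states and emit one output per step
dec_in = layers.Input(shape=(SEQ_LEN, FEATURES))
dec_seq = layers.LSTM(LATENT, return_sequences=True)(
    dec_in, initial_state=[state_h, state_c])
dec_out = layers.TimeDistributed(layers.Dense(1))(dec_seq)

model = tf.keras.Model([enc_in, dec_in], dec_out)

out = model([np.random.rand(2, SEQ_LEN, FEATURES).astype("float32"),
             np.random.rand(2, SEQ_LEN, FEATURES).astype("float32")])
```

Note that the fixed-size state bottleneck between encoder and decoder is precisely what the Attention mechanism, covered later in the series, relaxes.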

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Thank you!


Photo by Med Badr Chemmaoui on Unsplash

LSTM: Understanding Output Types

 


INTRODUCTION

In this tutorial, we will focus on the outputs of the LSTM layer in Keras. To create powerful models, especially for solving Seq2Seq learning problems, LSTM is the key layer. To use LSTM effectively in models, we need to understand how it generates different results with respect to given parameters. Therefore, in this tutorial, we will learn and use 3 important parameters (units, return_sequences, and return_state).

At the end of the tutorial, you will be able to configure the LSTM layer correctly to satisfy your model's requirements.
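The effect of the three parameters can be observed directly from the output shapes. A minimal sketch with assumed toy dimensions:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((2, 5, 3))  # (batch, time steps, features)
UNITS = 4

# Default: only the hidden state of the LAST step -> (batch, units)
last = layers.LSTM(UNITS)(x)

# return_sequences=True: hidden state of EVERY step -> (batch, steps, units)
seq = layers.LSTM(UNITS, return_sequences=True)(x)

# return_state=True: the output plus the final hidden and cell states;
# with return_sequences=False, the output and the hidden state are identical
out, h, c = layers.LSTM(UNITS, return_state=True)(x)
```

The `units` parameter sets the size of the hidden (and cell) state, and therefore the last dimension of every tensor above.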

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net.  Thank you!


Photo by Victor Barrios on Unsplash