
Chatbot using Deep Learning and Natural Language Processing (Python)

By SUMIT HARESH GANGARAMANI

This packet is a chatbot built using deep learning with PyTorch, along with Natural Language Processing concepts such as tokenization, stemming, and bag of words, implemented with the nltk library.

Approach and Concepts used

 

  1. Training Data - I built the intent.json file, which holds the intents used as training data for the neural network (nn) model. The file is organised into different tags such as greeting, goodbye, coffee types, payments, etc. Each tag has its own patterns and responses: patterns are the possible inputs given by the user, and responses are the outputs the chatbot can reply with. The patterns are stored in an array, and the Natural Language Processing concepts described next are applied to them. A sketch of one possible entry is shown after this list.
  2. NLP Concepts and Preprocessing Pipeline -

    The Natural Language Toolkit (nltk), a Natural Language Processing library, was used. The NLP concepts applied were tokenization, stemming, and bag of words.

    Tokenization is splitting a string into meaningful units such as words, punctuation characters, and numbers.

    For example - "Hey there! It's 3 pm"

    is tokenized into

    ["Hey", "there", "!", "It", "'s", "3", "pm"]

     

    Stemming generates the root form of a word. It is a crude heuristic that chops off the ends of words.

    For example - "organize", "organizes", "organizing"

    are all stemmed to

    ["organ", "organ", "organ"]

     

    Tokenization and stemming are then applied to all the patterns. This produces one large array containing every stemmed word from the patterns, and each sentence is encoded as a bag-of-words vector over that array: the entry for a word is 1 if the word appears in the sentence and 0 otherwise. This bag-of-words vector is what is sent to the neural network layers. A short sketch of these preprocessing steps is given after this list.

     

     

  3. Feed Forward Neural Network - The bag-of-words vector is now sent to the neural network, which has three linear layers. The first layer takes the bag-of-words vector as input, so its size is the length of that vector. The last layer, which feeds the softmax, has one output per class (one per tag). Softmax is then applied to the output, giving a probability for each tag. The model and the softmax step are sketched below.
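To make the structure from item 1 concrete, here is a minimal sketch of what one entry in the intents file might look like. The tags, patterns, and responses below are purely illustrative; the actual file in the packet may differ.

```python
import json

# Hypothetical excerpt showing the tag / patterns / responses layout described
# above; the actual intents file in the packet may use different tags and wording.
sample = """
{
  "intents": [
    {
      "tag": "greeting",
      "patterns": ["Hi", "Hey", "Hello", "Good day"],
      "responses": ["Hello! How can I help you?", "Hi there, what can I do for you?"]
    },
    {
      "tag": "goodbye",
      "patterns": ["Bye", "See you later"],
      "responses": ["Goodbye!", "Have a nice day."]
    }
  ]
}
"""

intents = json.loads(sample)  # in the project this would be json.load(open("intent.json"))
for intent in intents["intents"]:
    print(intent["tag"], "-", len(intent["patterns"]), "patterns")
```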

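The preprocessing steps from item 2 could be sketched roughly as follows, assuming nltk's word_tokenize and the Porter stemmer. The function names and the small vocabulary are illustrative and may not match nltk_util in the packet exactly.

```python
import numpy as np
import nltk
from nltk.stem.porter import PorterStemmer

# nltk.download("punkt")  # tokenizer data, needed once (newer nltk may ask for "punkt_tab")

stemmer = PorterStemmer()

def tokenize(sentence):
    # "Hey there! It's 3 pm" -> ["Hey", "there", "!", "It", "'s", "3", "pm"]
    return nltk.word_tokenize(sentence)

def stem(word):
    # "organize", "organizes", "organizing" all reduce to "organ"
    return stemmer.stem(word.lower())

def bag_of_words(tokenized_sentence, all_words):
    # 1.0 at every position whose vocabulary word occurs in the sentence, 0.0 otherwise
    stemmed = [stem(w) for w in tokenized_sentence]
    bag = np.zeros(len(all_words), dtype=np.float32)
    for idx, word in enumerate(all_words):
        if word in stemmed:
            bag[idx] = 1.0
    return bag

# Illustrative vocabulary of stemmed words; in the project it is built from all the patterns.
all_words = ["hi", "hello", "organ", "thank"]
print(bag_of_words(tokenize("Hi, do you organize payments?"), all_words))  # -> [1. 0. 1. 0.]
```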
 
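The feed-forward network from item 3 might look roughly like the sketch below. The layer sizes are placeholders; the packet's Model file may use different dimensions.

```python
import torch
import torch.nn as nn

class NeuralNet(nn.Module):
    """Feed-forward net: bag-of-words vector in, one raw score per intent tag out."""
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.l1 = nn.Linear(input_size, hidden_size)   # input layer: size of the bag-of-words vector
        self.l2 = nn.Linear(hidden_size, hidden_size)  # hidden layer
        self.l3 = nn.Linear(hidden_size, num_classes)  # output layer: one score per tag
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.l1(x))
        out = self.relu(self.l2(out))
        return self.l3(out)  # raw scores; softmax is applied separately at chat time

# Placeholder sizes: input = length of the bag-of-words vector, output = number of tags
model = NeuralNet(input_size=54, hidden_size=8, num_classes=7)
```

Returning raw scores from forward keeps the model compatible with nn.CrossEntropyLoss during training, since that loss applies softmax internally; softmax is then applied explicitly only when chatting.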

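Finally, here is a rough sketch of the softmax step from item 3, reusing the NeuralNet sketch above to turn the model output into a probability per tag. The confidence threshold is an assumption for illustration, not taken from the packet.

```python
import torch

# Reusing the model from the sketch above; x stands in for the bag-of-words vector of
# one user message, shaped (1, input_size). In the project it would come from
# bag_of_words(tokenize(user_message), all_words).
x = torch.rand(1, 54)

with torch.no_grad():
    output = model(x)                        # raw scores, one per tag
    probs = torch.softmax(output, dim=1)     # probabilities over the tags
    prob, predicted = torch.max(probs, dim=1)

# Hypothetical confidence threshold before replying with one of the tag's responses
if prob.item() > 0.75:
    print("predicted tag index:", predicted.item())
else:
    print("I do not understand...")
```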
Files in this project

intent.json - To get the data for training

nltk_util - For applying various NLP concepts

Train - For training the nn model

Model - For building the nn model

Chat - The implementation of the ChatBot
