My team got selected in the Smart India Hackathon (SIH).

Problem Statement - Our problem statement in this hackathon was to develop an AI-based chatbot (mobile/web app) that creates answers to queries based on FAQs, with new FAQs being added automatically, including categorization.
Technology Stack - Backend: Python, JavaScript, Flask, TensorFlow, Keras, OMW-1.4, NLTK, NumPy, ngrok
Frontend: HTML, CSS
Working-
The chatbot uses natural language processing (NLP) to analyse the language of the user, relying on NLTK (the Natural Language Toolkit).
NLTK is a Python library for working with human language. The bot uses a tokenizer to break the text into smaller pieces.
Tokenization breaks the raw text into words or sentences called tokens. These tokens help the NLP model understand the context of the input. The bot then applies lemmatization with NLTK: WordNetLemmatizer reduces each word to its base form so that variants of a word are analysed as a single item. Using this processed input together with its database, the bot comes up with an answer.
The problem statement we chose for SIH was number 25 on the list, which is about removing the human dependency in answering frequently asked questions (FAQs) by using AI.
Our AI chatbot is based on machine learning.
It recognizes the user's FAQ and replies with an answer from its database. It is very helpful for questions that are asked frequently.
A pretrained model built from the database is used for answering FAQs; the chatbot gets smarter and smarter as the data in the database grows. Here we use natural language processing (NLP), a field that focuses on making natural human language usable by computer programs. NLTK, or Natural Language Toolkit, is a Python package you can use for NLP, and it is what the chatbot uses.
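The idea of answering a user's question from a stored FAQ database can be sketched with a simple word-overlap score. This is a deliberately simplified stand-in for the trained model, and the FAQ entries below are hypothetical examples, not the real database:

```python
# Minimal sketch: pick the stored FAQ whose words overlap most with the query.
# A simplified stand-in for the trained model; FAQ entries are hypothetical.
FAQ_DB = {
    "what are the library hours": "The library is open 9am-8pm on weekdays.",
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
    "where is the admission office": "The admission office is in Block A, ground floor.",
}

def best_answer(query):
    """Return the answer whose stored question shares the most words with the query."""
    query_words = set(query.lower().split())

    def overlap(question):
        return len(query_words & set(question.split()))

    best_question = max(FAQ_DB, key=overlap)
    if overlap(best_question) == 0:
        return "Sorry, I don't know that one."
    return FAQ_DB[best_question]

print(best_answer("How can I reset my password"))
```

A real system would score similarity with the trained model instead of raw word overlap, but the lookup-and-reply flow is the same.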
The chatbot uses TensorFlow, a free and open-source software library for machine learning and artificial intelligence. Along with TensorFlow we use Keras, an open-source software library that provides a Python interface for artificial neural networks; Keras acts as an interface for the TensorFlow library. In this project we use tokenization: by tokenizing, you can conveniently split up text by word or by sentence. This lets you work with smaller pieces of text that are still relatively coherent and meaningful even outside the context of the rest of the text. It is your first step in turning unstructured data into structured data, which is easier to analyze. And lastly, Flask is used for making the web application.
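The "unstructured to structured" step can be sketched as a bag-of-words vectorizer (a minimal sketch using NumPy; the vocabulary and sentence are made-up examples, and a real vocabulary would be built from the FAQ data). Vectors like this are what a Keras classifier would take as input:

```python
import numpy as np

# Minimal bag-of-words sketch: turn a tokenized sentence into a fixed-length
# numeric vector over a known vocabulary. The vocabulary here is hypothetical;
# a real one would be built from all words in the FAQ database.
VOCAB = ["hours", "library", "password", "reset", "admission", "office"]

def bag_of_words(tokens, vocab=VOCAB):
    """1.0 in each slot whose vocabulary word appears in the tokens, else 0.0."""
    token_set = set(tokens)
    return np.array([1.0 if word in token_set else 0.0 for word in vocab])

vec = bag_of_words(["reset", "my", "password"])
print(vec)  # one slot per vocabulary word
```

Such a fixed-length vector is easy to analyze: it can be fed straight into a small Keras dense network whose output is the predicted FAQ category.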