Friday, May 1 | Digital Marketing Journals

tensorflow

Types of Loss Functions in Deep Learning explained with Keras. | by Tripathi Aditya Prakash | Oct, 2022
ai bot, ai chat, ai chatbot, best chatbot, chatbot, chatbot ai, chatbot app, chatbot online, chatbot website, conversation with ai, creating chatbots, deep-learning, keras, loss-function, machine-learning, robot chat, tensorflow


Loss functions are what let an ANN (Artificial Neural Network) understand what is going wrong and how to reach that golden accuracy range, just like a loss in business makes you cherish the profit and identify what went wrong. Because profit will come after, right?.. RIGHT??

Now, coming to the mathematical definition: a loss function measures the inconsistency between the predicted value (ŷ) and the actual label (y). It is a non-negative value, and the robustness of the model increases as the value of the loss function decreases.

It's straightforward, actually: it is basically a measure of how well your algorithm models the dataset. If its predictions are totally off, the loss function outputs a higher number; if they are pretty good, it outputs a lower number. In this article, they are d...
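The excerpt is cut off before any Keras code appears, so here is a minimal NumPy sketch of the "off predictions give a higher number" idea for two common losses; the label and prediction arrays are illustrative values, not from the article, and the functions mirror (in simplified form) what `keras.losses.MeanSquaredError` and `keras.losses.BinaryCrossentropy` compute.

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average of the squared differences
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    y_true = np.asarray(y_true, dtype=float)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = [1.0, 0.0, 1.0]
good = [0.9, 0.1, 0.8]   # predictions close to the labels
bad = [0.2, 0.9, 0.3]    # predictions far from the labels

# A worse prediction yields a larger (non-negative) loss value
print(mse(y_true, good) < mse(y_true, bad))                                   # True
print(binary_crossentropy(y_true, good) < binary_crossentropy(y_true, bad))  # True
```

Either function could be passed to `model.compile(loss=...)` in Keras in place of the built-in versions, since a Keras loss is just a callable of `(y_true, y_pred)`.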
Types of Activation Functions in Deep Learning explained with Keras | by Tripathi Aditya Prakash | Sep, 2022
activation-functions, best chatbot, chatbot, chatbot app, chatbot online, chatbot website, creating chatbots, deep-learning, keras, neural-networks, robot chat, tensorflow


Activation: does it mean activating your car with a click (if it has that, of course)? Well, it is the same concept, but in terms of neurons. Neurons as in the human brain? Again, close enough: neurons, but in an Artificial Neural Network (ANN). The activation function decides whether a neuron should be activated or not.

[Image: a biological neuron in the human brain]

If you have seen an ANN (which I sincerely hope you have), you have seen that they are linear in nature, so to introduce non-linearity we use activation functions to generate output from the input values fed into the network.

[Image: a sample ANN network]

Activation functions can be divided into three types:

1. Linear Activation Function
2. Binary Step Activation Function
3. Non-linear Activation Functions

The linear activation function's output is proportional to its input; it just applies the weighted total to...
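The excerpt names the three types but is cut off before showing them, so here is a small NumPy sketch of one function from each category (the non-linear category is represented by sigmoid and ReLU, two of the standard choices; the input values are illustrative). In Keras these would be the strings `"linear"`, `"sigmoid"`, and `"relu"` passed as the `activation` argument of a layer.

```python
import numpy as np

def linear(x, a=1.0):
    # Linear: output is directly proportional to the input
    return a * x

def binary_step(x):
    # Binary step: the neuron fires (1) if input >= 0, else stays off (0)
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Non-linear: squashes any input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Non-linear: passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 3.0])
print(linear(x))       # [-2.  0.  3.]
print(binary_step(x))  # [0. 1. 1.]
print(relu(x))         # [0. 0. 3.]
```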
Using Transfer Learning with Word Embeddings for Text Classification Tasks | by Manuel Gil | Jul, 2021
ai bot, ai chat, ai chatbot, best chatbot, chatbot, chatbot ai, chatbot app, chatbot online, chatbot website, conversation with ai, creating chatbots, robot chat, tensorflow, text-classification, transfer-learning, word-embeddings


When we are working on computer vision tasks, there are scenarios where the amount of data (images) is small, or not enough to reach acceptable performance. In addition, dealing with image data and Convolutional Neural Networks (CNNs) is expensive in terms of computational power.

Due to the aforementioned issues, in most cases it is convenient to use a technique called Transfer Learning, which consists of reusing models trained on millions of images to improve performance during the training process. We can apply this technique to Natural Language Processing (NLP) tasks as well, but for text classification, instead of pre-trained CNN models, we are going to use pre-trained Word Embeddings.

When we have so little data available to learn an appropriate task-specific embedding of ...
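The excerpt stops before showing how pre-trained embeddings are wired in, so here is a minimal sketch of the usual first step: building an embedding matrix that maps each word in your own vocabulary to its pre-trained vector. The `pretrained` dictionary and the four-word vocabulary are hypothetical stand-ins; in practice the vectors would be loaded from a file such as GloVe or word2vec.

```python
import numpy as np

# Hypothetical pre-trained vectors (real ones would be parsed from a GloVe/word2vec file)
pretrained = {
    "movie": np.array([0.1, 0.3, -0.2]),
    "great": np.array([0.4, -0.1, 0.5]),
    "bad":   np.array([-0.4, 0.2, -0.5]),
}
embedding_dim = 3

# Vocabulary built from our own (small) corpus; index 0 is reserved for padding
word_index = {"movie": 1, "great": 2, "bad": 3, "plot": 4}

# Rows for words missing from the pre-trained set ("plot" here) stay at zero
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, i in word_index.items():
    vector = pretrained.get(word)
    if vector is not None:
        embedding_matrix[i] = vector

print(embedding_matrix.shape)  # (5, 3)
```

In Keras, this matrix would then typically initialize an `Embedding` layer that is frozen during training (e.g. `Embedding(input_dim=5, output_dim=3, weights=[embedding_matrix], trainable=False)`), which is what makes it transfer learning rather than learning the embedding from scratch.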