Posts

Showing posts from August, 2021

Python - Queue

We generally stand in a line when we need to wait for a service. The same goes for the queue data structure, where the items are arranged in a queue. Queue behavior is controlled by how items are added and removed: items are only allowed in at one end and removed from the other end. It is the first-in, first-out (FIFO) method. A queue can be implemented using a Python list, where the insert() and pop() methods are used to add and remove items, since data items are always added at one end of the queue.

Operations on a Queue

1) Enqueue: Adds an element to the rear of the queue. Overflow condition - the queue is full.
2) Dequeue: Removes an element from the front of the queue. Underflow condition - the queue is empty.
3) Front: Get the first item.
4) Rear: Get the last item.

Implementation

We can implement a queue using the following data structures and Python modules (see the sketch after this list):

1) list
2) collectio...
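To make the operations concrete, here is a minimal sketch of a queue built on Python's collections.deque (the variable name queue and the sample values are illustrative, not from the post):

from collections import deque

# A queue using collections.deque: append() enqueues at the rear,
# popleft() dequeues from the front, giving first-in, first-out order.
queue = deque()

queue.append("a")   # Enqueue
queue.append("b")
queue.append("c")

print(queue[0])     # Front -> 'a'
print(queue[-1])    # Rear  -> 'c'

print(queue.popleft())  # Dequeue -> 'a'
print(queue.popleft())  # Dequeue -> 'b'

# Calling popleft() on an empty deque raises IndexError
# (the underflow condition described above).

A plain list works the same way with append() and pop(0), but pop(0) has to shift every remaining element, so deque is the better choice for larger queues.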

Namespaces and Scope in Python

In this article, we will look at namespaces in Python, the types of namespaces, and scopes in Python. In Python programming, everything is treated as an object. A name is simply the identifier of an object, and space is a location in main memory associated with that object. We can define a namespace as a collection of names associated with locations in main memory. There are three types of namespaces: the Built-in Namespace, the Global Namespace, and the Local Namespace. Each lower namespace can access the upper namespaces, and the scope of variables in Python depends on the namespaces.

Namespace in Python

Every entity in Python is treated as an object. We give names to each object, such as a variable, class, or function, for identification. Often these names are known as identifiers. So, a name is simply an identifier. All of these names and the values we use are stored in main memory at a unique location. This location is known as the space. So th...
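As a small illustration of how these namespaces relate (the variable names here are made up for the example), Python resolves a name by searching the local namespace first, then any enclosing one, then the global namespace, and finally the built-in namespace:

x = "global"            # stored in the global namespace

def outer():
    x = "enclosing"     # stored in outer()'s local namespace
    def inner():
        x = "local"     # stored in inner()'s local namespace
        print(x)        # found in inner()'s namespace -> "local"
    inner()
    print(x)            # found in outer()'s namespace -> "enclosing"

outer()
print(x)        # found in the global namespace -> "global"
print(len(x))   # 'len' is found in the built-in namespace -> 6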

ReLU Activation Function

ReLU stands for Rectified Linear Unit. The ReLU activation function is one of the most used activation functions in deep learning models. The ReLU function is used in almost all convolutional neural networks and deep learning models, and it is currently more popular than the sigmoid function and the tanh function.

Advantages of the ReLU function

When the input is positive, there is no gradient saturation problem. The calculation speed is very fast: the ReLU function involves only a direct comparison, so whether forward or backward, it is much faster than tanh and sigmoid. (tanh and sigmoid need to calculate an exponent, which is slower.)

Disadvantages of the ReLU function

When the input is negative, ReLU is not activated at all: once a negative number is fed in, ReLU outputs zero and the neuron can die. This problem is also known as the Dead Neurons problem. During forward propagation this is not a problem. Some are...
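For illustration, here is a minimal sketch of ReLU and its gradient using NumPy (the function names relu and relu_grad are made up for the example):

import numpy as np

# A sketch of ReLU: passes positive inputs through unchanged
# and clamps negative inputs to zero.
def relu(x):
    return np.maximum(0, x)

# The gradient is 1 for positive inputs and 0 otherwise (the gradient
# at exactly 0 is conventionally taken as 0 here). A neuron whose
# inputs stay negative gets zero gradient - the "dead neuron" problem.
def relu_grad(x):
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]

The zero gradient for negative inputs is exactly what makes dead neurons possible: once a unit's inputs stay negative, its weights stop receiving updates during backpropagation.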