DEEP LEARNING: NEURAL NETWORK AND BEYOND

By Dr. S. Suganya, Dr. Sunila, Sivasubramanian Balasubramanian, Dr. Haewon Byeon

Deep learning has revolutionized artificial intelligence by providing sophisticated tools for solving difficult problems across many fields. At its core is the neural network, a computational model inspired by the structure and function of the human brain. A neural network consists of interconnected nodes, called neurons, arranged in layers. Each neuron processes its input and transmits a signal to the neurons in the next layer, ultimately producing the network's output. Neural networks learn from data through backpropagation, which adjusts the strength of the connections between neurons to minimize prediction error.

Deep learning, however, extends well beyond basic neural networks, and researchers continually investigate novel architectures and methods to improve these models. Convolutional neural networks (CNNs) are designed for processing grid-like data such as images. By using convolutional layers, CNNs effectively capture spatial hierarchies in visual input, enabling them to perform tasks such as image classification and object detection with exceptional accuracy.

Recurrent neural networks (RNNs) are another key innovation, particularly well suited to sequential data processing tasks such as natural language understanding and time-series prediction. Unlike feedforward networks, RNNs contain connections that form directed cycles, giving them the ability to remember previous inputs.
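The learning loop described above can be sketched in a few lines. The following is a minimal illustration, not the book's own code: a two-layer network trained with backpropagation on the XOR problem, where the layer sizes, learning rate, and epoch count are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: a two-layer network learning XOR via backpropagation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))   # input -> hidden connections
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))   # hidden -> output connections
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 1.0, []
for epoch in range(5000):
    # Forward pass: each layer processes its input and signals the next.
    h = sigmoid(X @ W1 + b1)          # hidden-layer activations
    out = sigmoid(h @ W2 + b2)        # network output
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: propagate the error back through the layers and
    # adjust connection strengths to reduce prediction error.
    d_out = (out - y) * out * (1 - out)    # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)     # error pushed back to hidden layer

    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Each iteration runs the forward pass (signals flowing layer to layer) and then the backward pass, which is exactly the connection-strength adjustment described above; the loss shrinks as the network's predictions improve.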
This memory lets RNNs capture temporal dependencies in data, making them especially useful for tasks that require context or continuity. Beyond these well-established designs, researchers are exploring newer models such as transformers and generative adversarial networks (GANs). A GAN consists of two neural networks, a generator and a discriminator, engaged in a competitive learning process: the generator produces candidate samples while the discriminator learns to distinguish them from real data. This setup allows GANs to generate realistic synthetic data, with applications ranging from image synthesis to drug discovery.
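The adversarial setup can be illustrated with a deliberately tiny example. The following sketch (our own simplification, not from the book) uses a linear generator and a logistic discriminator on one-dimensional data, just to show the alternating updates; real GANs use deep networks on the generator and discriminator sides.

```python
import numpy as np

# Toy GAN sketch: the generator reshapes noise to imitate a Gaussian,
# while the discriminator learns to score samples as real or fake.
rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

real_mean, real_std = 4.0, 1.0       # the distribution to imitate

g_w, g_b = 1.0, 0.0                  # generator: G(z) = g_w * z + g_b
d_w, d_b = 0.0, 0.0                  # discriminator: D(x) = sigmoid(d_w*x + d_b)

lr, batch = 0.05, 64
for step in range(2000):
    z = rng.normal(size=batch)
    fake = g_w * z + g_b
    real = rng.normal(real_mean, real_std, size=batch)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    p_real = sigmoid(d_w * real + d_b)
    p_fake = sigmoid(d_w * fake + d_b)
    d_w -= lr * (np.mean((p_real - 1) * real) + np.mean(p_fake * fake))
    d_b -= lr * (np.mean(p_real - 1) + np.mean(p_fake))

    # Generator update: push D(fake) toward 1, i.e. fool the discriminator.
    p_fake = sigmoid(d_w * fake + d_b)
    g_grad = (p_fake - 1) * d_w      # chain rule through D and G
    g_w -= lr * np.mean(g_grad * z)
    g_b -= lr * np.mean(g_grad)

samples = g_w * rng.normal(size=1000) + g_b
print(float(samples.mean()))  # drifts from 0 toward the real mean
```

The two updates pull in opposite directions, which is the competitive learning the paragraph describes: as the discriminator improves, the generator's samples are forced to become more realistic.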

