Part 22: PyTorch, Introduction to Neural Networks

by digitaltech2.com

Neural networks are the foundation of deep learning. They consist of layers of interconnected neurons (also called nodes), which process input data to produce an output. Understanding the basics of neural networks is crucial for building more complex models in PyTorch.

Basic Concepts of Neural Networks
  1. Neurons: The basic units of a neural network, inspired by biological neurons, that receive input, process it, and pass it on to the next layer.
  2. Layers: Collections of neurons, where each layer processes the input from the previous layer and passes its output to the next layer.
  3. Activation Functions: Functions that introduce non-linearity into the network, allowing it to learn complex patterns. Common activation functions include ReLU, Sigmoid, and Tanh (a short example follows the diagram below).
  • Simple Neural Network Diagram:
Input Layer -> Hidden Layer 1 -> Hidden Layer 2 -> Output Layer
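
As a quick sketch of how these activations behave in practice, the snippet below applies ReLU, Sigmoid, and Tanh to the same small tensor (the input values are arbitrary, chosen only for illustration):

import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])  # arbitrary sample values

print(torch.relu(x))     # negatives clamped to 0, positives unchanged
print(torch.sigmoid(x))  # every value squashed into (0, 1)
print(torch.tanh(x))     # every value squashed into (-1, 1)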

Building Neural Networks with nn.Module

In PyTorch, neural networks are built using the torch.nn.Module class. This class provides a convenient way to define and manage layers and parameters.
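
As a minimal sketch of what nn.Module manages for you, note that even a single nn.Linear layer is itself a module and exposes its learnable weight and bias through named_parameters():

import torch.nn as nn

layer = nn.Linear(10, 50)              # one layer is already an nn.Module
for name, p in layer.named_parameters():
    print(name, tuple(p.shape))        # weight: (50, 10), bias: (50,)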

Defining a Neural Network

To define a neural network, you create a class that inherits from nn.Module and define the network architecture in the __init__ method. The forward computation is implemented in the forward method.

  • Example of a Simple Neural Network:
import torch
import torch.nn as nn

class SimpleNN(nn.Module):
    def __init__(self):
        super(SimpleNN, self).__init__()
        self.fc1 = nn.Linear(10, 50)  # First fully connected layer
        self.fc2 = nn.Linear(50, 1)   # Second fully connected layer

    def forward(self, x):
        x = torch.relu(self.fc1(x))  # Apply ReLU activation function
        x = self.fc2(x)              # Output layer
        return x

model = SimpleNN()
print(model)

In this example, SimpleNN is a simple neural network with two fully connected layers. The forward method defines how the input data flows through the network.
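
To try the model out, you can pass a batch of inputs through it; a randomly generated batch is used here purely for illustration. Calling the model invokes forward under the hood, and the feature size of 10 must match the input size of fc1:

batch = torch.randn(4, 10)  # batch of 4 samples with 10 features each
output = model(batch)       # equivalent to calling model.forward(batch)
print(output.shape)         # torch.Size([4, 1])
print(sum(p.numel() for p in model.parameters()))  # 601 trainable parameters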

Adding Activation Functions

Without activation functions, a stack of linear layers collapses into a single linear transformation, no matter how many layers it has. Applying a non-linearity such as ReLU, Sigmoid, or Tanh between layers is what lets the network learn complex patterns.

  • Example with Different Activation Functions:
class AdvancedNN(nn.Module):
    def __init__(self):
        super(AdvancedNN, self).__init__()
        self.fc1 = nn.Linear(10, 50)
        self.fc2 = nn.Linear(50, 50)
        self.fc3 = nn.Linear(50, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))  # ReLU activation after first layer
        x = torch.sigmoid(self.fc2(x))  # Sigmoid activation after second layer
        x = self.fc3(x)              # Output layer
        return x

model = AdvancedNN()
print(model)

In this example, AdvancedNN has three layers with ReLU and Sigmoid activation functions applied to the first and second layers, respectively.
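
For a plain chain of layers like this, PyTorch also provides nn.Sequential as a more compact alternative. The sketch below is one equivalent way to express the same architecture without writing a custom class, using the module forms nn.ReLU and nn.Sigmoid of the same activations:

equivalent = nn.Sequential(
    nn.Linear(10, 50),
    nn.ReLU(),        # module form of torch.relu
    nn.Linear(50, 50),
    nn.Sigmoid(),     # module form of torch.sigmoid
    nn.Linear(50, 1),
)
print(equivalent)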
