Models#

The Models module provides a collection of neural network architectures for use with the Client and Server classes in the Federated Learning Framework. These models include fully connected networks, convolutional networks, logistic regression, and ResNet architectures, tailored for datasets such as MNIST, CIFAR-10, CIFAR-100, and ImageNet.

Available Models#

MNIST Models#

  • `fc_mnist`: Fully connected neural network for MNIST.

  • `cnn_mnist`: Convolutional neural network for MNIST.

  • `logreg_mnist`: Logistic regression model for MNIST.

CIFAR Models#

  • `cnn_cifar`: Convolutional neural network for CIFAR datasets.

ResNet Models#

  • `ResNet18`: ResNet with 18 layers.

  • `ResNet34`: ResNet with 34 layers.

  • `ResNet50`: ResNet with 50 layers.

  • `ResNet101`: ResNet with 101 layers.

  • `ResNet152`: ResNet with 152 layers.

Usage Examples#

Each model can be imported and used directly in your training framework. For instance:

Using ResNet18 for CIFAR-10:

from byzfl import ResNet18
model = ResNet18(num_classes=10)
print(model)

Using fc_mnist for MNIST:

from byzfl import fc_mnist
model = fc_mnist()
print(model)
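
All of these models are standard torch.nn.Module subclasses, so they plug into an ordinary PyTorch training loop. The sketch below runs a single optimization step on a random batch standing in for MNIST data; the optimizer settings and the cross-entropy loss are illustrative assumptions, not requirements of the framework:

import torch
import torch.nn.functional as F
from byzfl import cnn_mnist

model = cnn_mnist()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

x = torch.randn(32, 1, 28, 28)   # random batch standing in for 32 MNIST images
y = torch.randint(0, 10, (32,))  # random integer labels in [0, 10)

optimizer.zero_grad()
loss = F.cross_entropy(model(x), y)  # assumes the model returns class scores
loss.backward()
optimizer.step()
print(loss.item())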

Note

A detailed description of these models, including their architecture and intended use, can be found below.

API Documentation#

class byzfl.fc_mnist[source]#

Bases: Module

Fully Connected Network for MNIST.

Description:#

A simple fully connected neural network for the MNIST dataset, consisting of two fully connected layers with ReLU activation and softmax output.

Examples:#

>>> model = fc_mnist()
>>> x = torch.randn(16, 28*28)  # Batch of 16 MNIST images
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 10])
forward(x)[source]#

Perform a forward pass through the model.
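
For orientation, a functionally similar network can be written as follows; the hidden width (128 here) is an assumption of this sketch and not necessarily the width used by byzfl:

import torch
import torch.nn as nn

class TwoLayerMLP(nn.Module):
    """Illustrative stand-in for fc_mnist: two linear layers, ReLU, softmax."""
    def __init__(self, num_classes=10, hidden=128):
        super().__init__()
        self.fc1 = nn.Linear(28 * 28, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x):
        x = torch.relu(self.fc1(x))               # ReLU activation
        return torch.softmax(self.fc2(x), dim=1)  # softmax over the 10 classes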

class byzfl.cnn_mnist[source]#

Bases: Module

Convolutional Neural Network for MNIST.

Description:#

A simple convolutional neural network designed for the MNIST dataset. It consists of two convolutional layers, ReLU activation, max pooling, and fully connected layers.

Examples:#

>>> model = cnn_mnist()
>>> x = torch.randn(16, 1, 28, 28)  # Batch of 16 grayscale MNIST images
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 10])
forward(x)[source]#

Perform a forward pass through the model.
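
To sanity-check shapes against real data, the model can be fed a batch from torchvision's MNIST loader; torchvision is used here purely for illustration and is an assumption of this sketch, not a byzfl dependency:

import torch
from torchvision import datasets, transforms
from byzfl import cnn_mnist

dataset = datasets.MNIST(root="./data", train=False, download=True,
                         transform=transforms.ToTensor())
loader = torch.utils.data.DataLoader(dataset, batch_size=16)

model = cnn_mnist().eval()
images, labels = next(iter(loader))  # images: [16, 1, 28, 28]
with torch.no_grad():
    predictions = model(images).argmax(dim=1)
print((predictions == labels).float().mean())  # ~0.1 for an untrained model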

class byzfl.logreg_mnist[source]#

Bases: Module

Logistic Regression Model for MNIST.

Description:#

A simple logistic regression model for the MNIST dataset. It consists of a single linear layer.

Examples:#

>>> model = logreg_mnist()
>>> x = torch.randn(16, 28*28)  # Batch of 16 MNIST images
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 10])
forward(x)[source]#

Perform a forward pass through the model.
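
Logistic regression on MNIST amounts to a single linear map from the flattened 784-pixel image to the 10 class scores; an illustrative equivalent (not byzfl's exact code):

import torch.nn as nn

class LogReg(nn.Module):
    """Illustrative stand-in for logreg_mnist: a single linear layer."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.linear = nn.Linear(28 * 28, num_classes)

    def forward(self, x):
        # Class scores; the softmax is typically folded into the loss function.
        return self.linear(x)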

class byzfl.cnn_cifar[source]#

Bases: Module

Convolutional Neural Network for CIFAR.

Description:#

A convolutional neural network designed for the CIFAR-10 and CIFAR-100 datasets. It consists of three convolutional layers, max pooling, and fully connected layers.

Examples:#

>>> model = cnn_cifar()
>>> x = torch.randn(16, 3, 32, 32)  # Batch of 16 CIFAR images
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 10])
forward(x)[source]#

Perform a forward pass through the model.
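
A sketch of the general shape of such a network is given below; the channel widths and the hidden layer size are assumptions of the sketch and may differ from byzfl's exact implementation:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCifarCNN(nn.Module):
    """Illustrative stand-in for cnn_cifar: three conv layers, pooling, FC head."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, padding=1)
        self.conv3 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(128 * 4 * 4, 256)
        self.fc2 = nn.Linear(256, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # 32x32 -> 16x16
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 16x16 -> 8x8
        x = F.max_pool2d(F.relu(self.conv3(x)), 2)  # 8x8  -> 4x4
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        return self.fc2(x)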

class byzfl.ResNet18(num_classes=10)[source]#

Bases: Module

Description:#

ResNet18 is a convolutional neural network architecture with 18 layers. It is designed for image classification tasks and includes skip connections for efficient gradient flow.

Parameters:#

num_classes : int

The number of output classes for classification (default is 10).

Examples:#

>>> model = ResNet18(num_classes=10)
>>> x = torch.randn(16, 3, 32, 32)  # Batch of 16 CIFAR images
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 10])
forward(x)[source]#

Perform a forward pass through the model.
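
The skip connections mentioned above take the form of residual blocks that add a block's input back to its output. A minimal sketch of the standard BasicBlock design (same channel count and stride 1; this illustrates the idea rather than byzfl's exact code):

import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicBlock(nn.Module):
    """Minimal residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # the skip connection: add the input back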

class byzfl.ResNet34(num_classes=10)[source]#

Bases: Module

Description:#

ResNet34 is a deep convolutional neural network with 34 layers, designed for image classification tasks. It uses residual connections to improve gradient flow and enable training of very deep networks.

Parameters:#

num_classes : int

The number of output classes for classification (default is 10).

Examples:#

>>> model = ResNet34(num_classes=100)
>>> x = torch.randn(8, 3, 32, 32)  # Batch of 8 images with CIFAR-like dimensions
>>> output = model(x)
>>> print(output.shape)
torch.Size([8, 100])
forward(x)[source]#

Perform a forward pass through the model.
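
The depth of each variant comes from how many residual blocks are stacked in the network's four stages. The canonical configurations from the original ResNet paper (He et al., 2015) are listed below for orientation; it is assumed here that byzfl follows this standard family:

# Blocks per stage in the standard ResNet family.
RESNET_STAGES = {
    "ResNet18":  ([2, 2, 2, 2],  "BasicBlock"),
    "ResNet34":  ([3, 4, 6, 3],  "BasicBlock"),
    "ResNet50":  ([3, 4, 6, 3],  "Bottleneck"),
    "ResNet101": ([3, 4, 23, 3], "Bottleneck"),
    "ResNet152": ([3, 8, 36, 3], "Bottleneck"),
}

for name, (stages, block) in RESNET_STAGES.items():
    print(f"{name}: {stages} x {block}")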

class byzfl.ResNet50(num_classes=10)[source]#

Bases: Module

Description:#

ResNet50 is a deeper ResNet variant with 50 layers. It employs the Bottleneck block to reduce computational complexity while maintaining accuracy, making it suitable for larger-scale datasets and more complex tasks.

Parameters:#

num_classes : int

The number of output classes for classification (default is 10).

Examples:#

>>> model = ResNet50(num_classes=1000)
>>> x = torch.randn(16, 3, 224, 224)  # Batch of 16 images with ImageNet-like dimensions
>>> output = model(x)
>>> print(output.shape)
torch.Size([16, 1000])
forward(x)[source]#

Perform a forward pass through the model.
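
The Bottleneck block referred to above squeezes the channel count with a 1x1 convolution, applies a 3x3 convolution at the reduced width, then expands back. A minimal sketch of the standard design (for the stride-1, same-width case; not byzfl's exact code):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Bottleneck(nn.Module):
    """Minimal bottleneck block: 1x1 reduce, 3x3 conv, 1x1 expand, plus a skip."""
    expansion = 4

    def __init__(self, channels):
        super().__init__()
        width = channels // self.expansion
        self.conv1 = nn.Conv2d(channels, width, kernel_size=1, bias=False)  # reduce
        self.conv2 = nn.Conv2d(width, width, kernel_size=3, padding=1, bias=False)
        self.conv3 = nn.Conv2d(width, channels, kernel_size=1, bias=False)  # expand
        self.bn1 = nn.BatchNorm2d(width)
        self.bn2 = nn.BatchNorm2d(width)
        self.bn3 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = F.relu(self.bn2(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        return F.relu(out + x)  # skip connection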

class byzfl.ResNet101(num_classes=10)[source]#

Bases: Module

Description:#

ResNet101 is a deeper ResNet variant with 101 layers, designed for highly complex tasks. It leverages Bottleneck blocks to maintain performance while keeping computational costs manageable.

Parameters:#

num_classes : int

The number of output classes for classification (default is 10).

Examples:#

>>> model = ResNet101(num_classes=100)
>>> x = torch.randn(4, 3, 64, 64)  # Batch of 4 images
>>> output = model(x)
>>> print(output.shape)
torch.Size([4, 100])
forward(x)[source]#

Perform a forward pass through the model.
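
At this depth, inference and training are considerably faster on a GPU. A short sketch of device placement, using CUDA when available and falling back to the CPU otherwise:

import torch
from byzfl import ResNet101

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = ResNet101(num_classes=100).to(device)

x = torch.randn(4, 3, 64, 64, device=device)
with torch.no_grad():
    print(model(x).shape)  # torch.Size([4, 100])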

class byzfl.ResNet152(num_classes=10)[source]#

Bases: Module

Description:#

ResNet152 is the deepest ResNet variant among the standard configurations. With 152 layers, it is highly effective for complex tasks, including image classification, segmentation, and detection. The model achieves a balance between depth and computational feasibility using Bottleneck blocks.

Parameters:#

num_classes : int

The number of output classes for classification (default is 10).

Examples:#

>>> model = ResNet152(num_classes=10)
>>> x = torch.randn(2, 3, 128, 128)  # Batch of 2 high-resolution images
>>> output = model(x)
>>> print(output.shape)
torch.Size([2, 10])
forward(x)[source]#

Perform a forward pass through the model.
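
To compare the memory and compute footprint of the ResNet variants, the parameter counts can be inspected directly; the exact numbers depend on byzfl's implementation, so treat the output as indicative rather than a specification:

from byzfl import ResNet18, ResNet34, ResNet50, ResNet101, ResNet152

for cls in (ResNet18, ResNet34, ResNet50, ResNet101, ResNet152):
    model = cls(num_classes=10)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{cls.__name__}: {n_params / 1e6:.1f}M parameters")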