🎡 Architecture

The library provides four neural network architectures: two main architectures, each also available with DropBlock regularization:

  • ConvNet

  • ConvNet with DropBlock regularization

  • ResNet

  • ResNet with DropBlock regularization

ConvNet

A Convolutional Neural Network (ConvNet or CNN) is a type of deep learning algorithm primarily used for processing data with a grid-like topology, such as images, using convolutional layers to automatically and adaptively learn spatial hierarchies of features.
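
To make the idea concrete, here is a minimal ConvNet sketch in PyTorch. It is illustrative only and does not reflect the library's actual ConvNet implementation; the layer sizes and input shape (CIFAR-10-like 3×32×32 images) are assumptions.

```python
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Convolutional layers learn local spatial features; pooling shrinks the grid.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Example: a batch of 4 CIFAR-10-sized images -> logits of shape (4, 10).
logits = SmallConvNet()(torch.randn(4, 3, 32, 32))
```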

ResNet

ResNet, short for Residual Network, is a type of convolutional neural network (CNN) that introduces residual connections or “shortcuts” to jump over some layers, helping to solve the vanishing gradient problem and enabling the training of much deeper networks.
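
The sketch below shows a basic residual block, the building element of a ResNet. It is a generic illustration (assuming PyTorch), not the library's exact block definition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # The residual "shortcut": the identity path lets gradients flow
        # directly to earlier layers, mitigating the vanishing gradient problem.
        return F.relu(out + x)

y = BasicBlock(64)(torch.randn(2, 64, 8, 8))  # output has the same shape as the input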

Dropout

Dropout is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations on training data.
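
As a quick illustration (standard PyTorch, not library-specific): during training, dropout zeroes each activation with probability p and rescales the survivors by 1/(1-p); at evaluation time it is a no-op.

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(1, 8)

drop.train()
print(drop(x))   # roughly half the entries are 0, the others are 2.0

drop.eval()
print(drop(x))   # identity at evaluation time
```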

DropBlock

“DropBlock is a structured form of dropout directed at regularizing convolutional networks. In DropBlock, units in a contiguous region of a feature map are dropped together. As DropBlock discards features in a correlated area, the networks must look elsewhere for evidence to fit the data.”

Paper
Code
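
Below is a rough DropBlock sketch in PyTorch, following the idea of the paper: sample block centers, expand each center into a square region, drop those regions together, and rescale the surviving activations. The gamma formula and default values follow the paper, but the library's DropBlock layer may differ in its details.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropBlock2d(nn.Module):
    def __init__(self, drop_prob: float = 0.1, block_size: int = 3):
        super().__init__()
        self.drop_prob = drop_prob
        self.block_size = block_size

    def forward(self, x):
        if not self.training or self.drop_prob == 0.0:
            return x
        n, c, h, w = x.shape
        # gamma is chosen so the expected fraction of dropped units is ~drop_prob.
        gamma = (self.drop_prob / self.block_size ** 2) * (h * w) / (
            (h - self.block_size + 1) * (w - self.block_size + 1)
        )
        centers = (torch.rand(n, c, h, w, device=x.device) < gamma).float()
        # Expand each sampled center into a block_size x block_size square.
        block_mask = F.max_pool2d(centers, kernel_size=self.block_size,
                                  stride=1, padding=self.block_size // 2)
        keep_mask = 1.0 - block_mask
        # Rescale so the expected activation magnitude is preserved.
        return x * keep_mask * keep_mask.numel() / keep_mask.sum().clamp(min=1.0)

y = DropBlock2d(drop_prob=0.1, block_size=3)(torch.randn(2, 16, 8, 8))
```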

Training methods

Currently, five training methods are available. They take the form of five classes:

  • ClassicalTraining

  • AdversarialTraining

  • AutoAttackTraining

  • FireTraining

  • TradesTraining

ClassicalTraining

This is the class used to train a model without any adversarial-robustness considerations (standard training).

All other training methods inherit from this class.
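
For intuition, a classical training step boils down to the loop below. This is a generic PyTorch sketch of standard (non-robust) training, not the library's actual ClassicalTraining API.

```python
import torch
import torch.nn as nn

def classical_training_epoch(model, loader, optimizer, device="cpu"):
    model.train()
    criterion = nn.CrossEntropyLoss()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)   # cross-entropy on clean inputs only
        loss.backward()
        optimizer.step()
```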

AdversarialTraining

This is the class used to train a model against adversarial attacks. This method relies on an external library: cleverhans
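
The sketch below shows the general idea of adversarial training with cleverhans' PGD attack (assuming PyTorch and cleverhans >= 4.0): adversarial examples are crafted on the fly and the model is trained on them. The attack parameters here are assumptions, not the defaults of the library's AdversarialTraining class.

```python
import torch
import torch.nn as nn
from cleverhans.torch.attacks.projected_gradient_descent import (
    projected_gradient_descent,
)

def adversarial_training_epoch(model, loader, optimizer, eps=8 / 255, device="cpu"):
    model.train()
    criterion = nn.CrossEntropyLoss()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        # Craft adversarial examples with PGD inside an L-infinity ball of radius eps.
        x_adv = projected_gradient_descent(
            model, x, eps=eps, eps_iter=eps / 4, nb_iter=10, norm=float("inf")
        )
        optimizer.zero_grad()
        loss = criterion(model(x_adv), y)  # train on the perturbed inputs
        loss.backward()
        optimizer.step()
```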

AutoAttackTraining

“Reliable evaluation of adversarial robustness with an ensemble of diverse parameter-free attacks”, Francesco Croce and Matthias Hein, ICML 2020. AutoAttack
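
For reference, this is how the AutoAttack ensemble is typically invoked with the `autoattack` package; the AutoAttackTraining class may wrap it differently. The placeholder model, data, and parameter values below are assumptions for the sake of a runnable example.

```python
import torch
import torch.nn as nn
from autoattack import AutoAttack

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # placeholder model
model.eval()

x = torch.rand(16, 3, 32, 32)        # inputs scaled to [0, 1]
y = torch.randint(0, 10, (16,))      # ground-truth labels

# Run the standard AutoAttack ensemble (APGD-CE, APGD-T, FAB-T, Square).
adversary = AutoAttack(model, norm="Linf", eps=8 / 255, version="standard", device="cpu")
x_adv = adversary.run_standard_evaluation(x, y, bs=16)
```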

FireTraining

This class implements the FIRE adversarial training method. Fire

TradesTraining

TRADES minimizes a regularized surrogate loss L(·, ·) (e.g., the cross-entropy loss) for adversarial training. Trades
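
The sketch below illustrates the TRADES objective in PyTorch: an inner maximization finds the perturbation that maximizes the KL divergence between the predictions on the clean and perturbed inputs, and the outer loss combines natural cross-entropy with a beta-weighted robust KL term. The values of beta, eps, the step size, and the number of steps are assumptions, not the library's TradesTraining defaults.

```python
import torch
import torch.nn.functional as F

def trades_loss(model, x, y, eps=8 / 255, step_size=2 / 255, steps=10, beta=6.0):
    model.eval()
    # Inner maximization: find x_adv in the eps-ball that maximizes
    # KL(model(x_adv) || model(x)).
    p_clean = F.softmax(model(x), dim=1).detach()
    x_adv = (x + 0.001 * torch.randn_like(x)).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        kl = F.kl_div(F.log_softmax(model(x_adv), dim=1), p_clean, reduction="batchmean")
        grad = torch.autograd.grad(kl, x_adv)[0]
        x_adv = x_adv.detach() + step_size * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0.0, 1.0)

    model.train()
    # Outer minimization: natural loss plus beta-weighted robust KL regularizer.
    loss_natural = F.cross_entropy(model(x), y)
    loss_robust = F.kl_div(F.log_softmax(model(x_adv), dim=1),
                           F.softmax(model(x), dim=1), reduction="batchmean")
    return loss_natural + beta * loss_robust
```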