graph LR
BaseModel["BaseModel"]
IncrementalClassifier["IncrementalClassifier"]
DynamicOptimizer["DynamicOptimizer"]
PackNetModule["PackNetModule"]
PNN["PNN"]
PNNColumn["PNNColumn"]
CosineLinear["CosineLinear"]
utils["utils"]
BaseModel -- "informs" --> DynamicOptimizer
BaseModel -- "utilizes" --> utils
IncrementalClassifier -- "extends" --> BaseModel
PackNetModule -- "implements" --> BaseModel
PNN -- "implements" --> BaseModel
PNN -- "manages" --> PNNColumn
PNNColumn -- "adapts from" --> PNNColumn
CosineLinear -- "used by" --> IncrementalClassifier
The avalanche.models subsystem provides a framework for building and managing dynamic neural network architectures in continual-learning scenarios. At its core, BaseModel defines the abstract interface for adaptable models, which specialized implementations such as PackNetModule and PNN build on. DynamicOptimizer works alongside these models so that optimizer state stays consistent as the architecture evolves. PNN orchestrates the creation and management of PNNColumn instances; each new column learns the current task while drawing on frozen earlier columns through lateral connections. IncrementalClassifier adjusts classification heads on the fly, often using specialized layers such as CosineLinear to accommodate new classes. The utils module supplies helper functions for model manipulation that support this adaptation process. Together, this design allows flexible and efficient model evolution, which is crucial for mitigating catastrophic forgetting in sequential learning tasks.
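The interaction described above can be sketched as a minimal training loop in plain PyTorch. The model, its `adapt` method, and the rebuild-the-optimizer step are all hypothetical stand-ins for the roles of BaseModel and DynamicOptimizer, not the library's actual API:

```python
import torch
import torch.nn as nn

class TinyAdaptiveModel(nn.Module):
    """Hypothetical stand-in for a dynamic model: grows its head on demand."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 8)
        self.head = nn.Linear(8, 2)

    def adapt(self, n_classes: int) -> None:
        # Replace the head when more classes appear (weights not carried over
        # in this sketch, for brevity).
        if n_classes > self.head.out_features:
            self.head = nn.Linear(8, n_classes)

    def forward(self, x):
        return self.head(torch.relu(self.backbone(x)))

model = TinyAdaptiveModel()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
for n_classes in (2, 3, 5):          # classes seen in each experience
    model.adapt(n_classes)           # the model may change shape here...
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # ...so the optimizer must follow
    loss = model(torch.randn(2, 4)).sum()
    loss.backward()
    optimizer.step()
```

Rebuilding the optimizer after every adaptation is the simplest correct policy; the point of a dedicated DynamicOptimizer component is to do this bookkeeping (including preserving state for unchanged parameters) automatically.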
BaseModel: Serves as the core abstraction for models that can adapt their architecture during continual learning, providing a unified interface for dynamic model adaptation.
Related Classes/Methods:
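Such an abstract interface boils down to one contract: every dynamic model must declare how it reshapes itself when new data arrives. A minimal sketch (the names `AdaptableModel` and `adaptation` are illustrative, not the library's exact signatures):

```python
from abc import ABC, abstractmethod

class AdaptableModel(ABC):
    """Hypothetical minimal contract mirroring BaseModel's role."""

    @abstractmethod
    def adaptation(self, experience) -> None:
        """Reshape the architecture to fit the incoming experience."""

class NoOpModel(AdaptableModel):
    """Trivial conforming implementation: a static model adapts by doing nothing."""
    def adaptation(self, experience) -> None:
        pass
```

Concrete components such as an incremental classifier or a progressive network then differ only in what their `adaptation` step does.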
IncrementalClassifier: Manages the dynamic addition and selection of classification heads, crucial for scenarios where new classes are encountered over time without retraining the entire model.
Related Classes/Methods:
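The core mechanic is growing the output layer while preserving the units already learned. A minimal sketch in plain PyTorch (the class name `GrowingClassifier` and method `adapt` are hypothetical, not the library's API):

```python
import torch
import torch.nn as nn

class GrowingClassifier(nn.Module):
    """Sketch of an incrementally growing classification head."""
    def __init__(self, in_features: int, initial_classes: int = 2):
        super().__init__()
        self.head = nn.Linear(in_features, initial_classes)

    @torch.no_grad()
    def adapt(self, max_class_id: int) -> None:
        needed = max_class_id + 1
        old = self.head
        if needed <= old.out_features:
            return  # head already large enough
        new = nn.Linear(old.in_features, needed)
        new.weight[: old.out_features] = old.weight  # keep learned units intact
        new.bias[: old.out_features] = old.bias
        self.head = new

    def forward(self, x):
        return self.head(x)
```

The key design point is that old rows of the weight matrix are copied over verbatim, so previously learned class boundaries are untouched when new classes appear.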
DynamicOptimizer: Dynamically adjusts optimizer parameters and state to accommodate architectural changes in the model, ensuring training stability and efficiency during adaptation.
Related Classes/Methods:
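The underlying problem is that a PyTorch optimizer holds references to the parameter tensors it was constructed with; after the model grows, those references are stale. A hypothetical helper showing the simplest fix (dropping optimizer state entirely; a real implementation would carry state over for parameters that survived the change):

```python
import torch
import torch.nn as nn

def refresh_optimizer(optimizer, model):
    """Hypothetical helper: rebuild parameter groups after the model changed.

    Optimizer state (e.g. momentum buffers) is discarded here for simplicity.
    """
    optimizer.param_groups.clear()
    optimizer.state.clear()
    # add_param_group fills in missing hyperparameters (lr, etc.) from the
    # optimizer's stored defaults, so settings are preserved.
    optimizer.add_param_group({"params": list(model.parameters())})
```

Usage: call the helper once per adaptation step, right after the model's architecture has been updated and before the next backward/step cycle.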
PackNetModule: Implements a specific dynamic neural network architecture focused on model compression and task isolation, allowing for efficient multi-task learning by dynamically allocating capacity.
Related Classes/Methods:
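PackNet's capacity allocation rests on magnitude pruning: after training a task, the largest still-free weights are claimed for that task and frozen, leaving the rest for future tasks. A minimal sketch of the mask-selection step (the function name and `keep_ratio` parameter are illustrative):

```python
import torch

def allocate_task_mask(weight, free_mask, keep_ratio=0.5):
    """Assign the largest-magnitude still-free weights to the new task
    (magnitude pruning in the spirit of PackNet). Returns a binary mask."""
    scores = weight.abs() * free_mask          # ignore weights owned by old tasks
    k = int(free_mask.sum().item() * keep_ratio)
    thresh = scores.flatten().topk(k).values.min()
    task_mask = (scores >= thresh) & free_mask.bool()
    return task_mask
```

After allocation, `free_mask` would be updated to exclude the claimed weights, so each task ends up owning a disjoint slice of the network's capacity.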
PNN: Implements the Progressive Neural Network (PNN) architecture, which adds new columns (task-specific sub-networks) for each new task to prevent catastrophic forgetting.
Related Classes/Methods:
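The orchestration role amounts to two invariants: one new column per task, and everything learned so far is frozen before the new column starts training. A minimal sketch (class and method names are hypothetical):

```python
import torch.nn as nn

class Progressive(nn.Module):
    """Sketch of PNN-style orchestration: one column per task, old ones frozen."""
    def __init__(self):
        super().__init__()
        self.columns = nn.ModuleList()

    def add_column(self, column: nn.Module) -> None:
        # Freeze everything learned so far; only the new column will train.
        for p in self.parameters():
            p.requires_grad = False
        self.columns.append(column)
```

Because old columns never change, forgetting is ruled out by construction; the cost is that the parameter count grows linearly with the number of tasks.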
PNNColumn: Represents a task-specific sub-network within the PNN architecture, designed to learn a new task while leveraging knowledge from previous tasks via lateral connections.
Related Classes/Methods:
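A column combines its own layer with lateral adapters, one per earlier column, that inject the earlier columns' hidden activations. A minimal single-layer sketch (the `Column` class and its constructor arguments are illustrative):

```python
import torch
import torch.nn as nn

class Column(nn.Module):
    """One task column: its own layer plus lateral adapters from earlier columns."""
    def __init__(self, in_features: int, hidden: int, n_prev_columns: int):
        super().__init__()
        self.own = nn.Linear(in_features, hidden)
        self.laterals = nn.ModuleList(
            [nn.Linear(hidden, hidden, bias=False) for _ in range(n_prev_columns)]
        )

    def forward(self, x, prev_hiddens):
        h = self.own(x)
        for lat, ph in zip(self.laterals, prev_hiddens):
            h = h + lat(ph)  # inject knowledge transferred from earlier tasks
        return torch.relu(h)
```

This is why the component graph shows PNNColumn adapting from PNNColumn: each new column consumes the (frozen) activations of every column created before it.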
CosineLinear: Provides a dynamically adaptable linear layer based on cosine similarity, often used in classification tasks for improved feature separation and robustness.
Related Classes/Methods:
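The idea is to normalize both the input features and the weight rows, so logits become cosine similarities (bounded in [-1, 1]) scaled by a learnable temperature. This keeps logit magnitudes comparable across old and new classes. A minimal sketch, not the library's exact implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineLayer(nn.Module):
    """Linear layer whose logits are cosine similarities, scaled by a learnable sigma."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.sigma = nn.Parameter(torch.ones(1))  # learnable scale

    def forward(self, x):
        # Normalize inputs and weight rows, so x . w_i is a cosine similarity.
        return self.sigma * F.linear(
            F.normalize(x, dim=1), F.normalize(self.weight, dim=1)
        )
```

Because every class's logit lives on the same bounded scale, newly added heads cannot dominate old ones purely through larger weight norms.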
utils: A module providing common model manipulation utilities, supporting operations like pruning, adding layers, or modifying existing model structures, essential for dynamic adaptation.
Related Classes/Methods:
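A representative example of such a helper is freezing a sub-network so it stops receiving gradients, which several of the components above (PNN columns, PackNet task weights) rely on. A generic sketch, not the module's actual function:

```python
import torch.nn as nn

def freeze(module: nn.Module) -> None:
    """Typical model-manipulation helper: stop gradient flow through a sub-network."""
    for p in module.parameters():
        p.requires_grad = False
```

Complementary helpers in the same spirit would unfreeze modules, swap layers in place, or count trainable parameters.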