To support autograd-based training, the PyTorch library needs to set up the necessary data structures in the base class: nn.Module.__init__() initializes the internal registries (such as _parameters, _modules, and _buffers) that track a module's parameters and submodules. If super().__init__() is not called, the module will not be able to keep track of its parameters and other attributes.
For example, when assigning a layer like nn.Linear to an attribute inside __init__ without first calling super().__init__(), the assignment will fail when nn.Module.__setattr__ tries to register it as a submodule of the parent module.
import torch.nn as nn

class MyCustomModule(nn.Module):
    def __init__(self, input_size, output_size):
        # super().__init__() is deliberately omitted here
        self.fc = nn.Linear(input_size, output_size)

model = MyCustomModule(10, 5)  # AttributeError: cannot assign module before Module.__init__() call
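For comparison, here is a minimal sketch of the same module with the missing call added (the forward method and the parameter count check are illustrative additions, not part of the original snippet). Once super().__init__() runs before any attribute assignment, the nn.Linear layer is registered as a submodule and its weight and bias appear in model.parameters().

import torch.nn as nn

class MyCustomModule(nn.Module):
    def __init__(self, input_size, output_size):
        super().__init__()  # sets up _parameters, _modules, _buffers, etc.
        self.fc = nn.Linear(input_size, output_size)

    def forward(self, x):
        return self.fc(x)

model = MyCustomModule(10, 5)
print(len(list(model.parameters())))  # 2: the weight and bias of self.fc

With the base initializer in place, optimizers such as torch.optim.SGD(model.parameters(), lr=0.01) can find and update the module's weights.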