EALSTM

class neuralhydrology.modelzoo.ealstm.EALSTM(cfg: Config)

Bases: BaseModel

Entity-Aware LSTM (EA-LSTM) model class.

This model was proposed by Kratzert et al. [1] as a variant of the standard LSTM. The main difference is that the input gate of the EA-LSTM is modulated using only the static inputs, while the dynamic (time series) inputs are used in all other parts of the model (i.e., forget gate, cell update, and output gate). To control the initial forget gate bias, use the config argument initial_forget_bias. It is often useful to set this value to a positive number at the start of training, which drives the forget gate towards 1 (retaining past cell state) and facilitates gradient flow. The EALSTM class only supports single-timescale predictions. Use MTSLSTM to train an LSTM-based model that produces predictions at multiple temporal resolutions at the same time.
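The gating scheme described above can be sketched with a minimal NumPy stand-in (random weights, hypothetical sizes; not the library's implementation). The key point is that the input gate `i` is computed once from the static inputs, while the forget, cell-update, and output gates see the dynamic inputs at every time step:

```python
import numpy as np

rng = np.random.default_rng(0)
n_dyn, n_stat, n_hid = 3, 2, 4  # hypothetical feature/hidden sizes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random weights for illustration; "f"/"g"/"o" are forget, cell-update, output gates.
W = {k: rng.normal(size=(n_hid, n_dyn)) for k in "fgo"}
U = {k: rng.normal(size=(n_hid, n_hid)) for k in "fgo"}
b = {k: np.zeros(n_hid) for k in "fgo"}
b["f"] += 3.0  # positive initial forget bias, as with initial_forget_bias
W_i, b_i = rng.normal(size=(n_hid, n_stat)), np.zeros(n_hid)

x_s = rng.normal(size=n_stat)     # static inputs (e.g. catchment attributes)
i = sigmoid(W_i @ x_s + b_i)      # input gate: computed ONCE from statics only

h, c = np.zeros(n_hid), np.zeros(n_hid)
for _ in range(10):               # dynamic inputs drive all other gates
    x_d = rng.normal(size=n_dyn)
    f = sigmoid(W["f"] @ x_d + U["f"] @ h + b["f"])
    g = np.tanh(W["g"] @ x_d + U["g"] @ h + b["g"])
    o = sigmoid(W["o"] @ x_d + U["o"] @ h + b["o"])
    c = f * c + i * g             # static-input gate i modulates the cell update
    h = o * np.tanh(c)
```

Because `i` depends only on the static inputs, the same entity (e.g. the same catchment) always opens the same input-gate pattern, which is the "entity-aware" idea.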

Parameters:

cfg (Config) – The run configuration.

References

[1] Kratzert, F., Klotz, D., Shalev, G., Klambauer, G., Hochreiter, S., and Nearing, G.: Towards learning universal, regional, and local hydrological behaviors via machine learning applied to large-sample datasets, Hydrology and Earth System Sciences, 23, 5089–5110, 2019.

forward(data: Dict[str, Tensor]) → Dict[str, Tensor]

Perform a forward pass on the EA-LSTM model.

Parameters:

data (Dict[str, torch.Tensor]) – Dictionary, containing input features as key-value pairs.

Returns:

Model outputs and intermediate states as a dictionary.
  • y_hat: model predictions of shape [batch size, sequence length, number of target variables].

  • h_n: hidden state at the last time step of the sequence, of shape [batch size, 1, hidden size].

  • c_n: cell state at the last time step of the sequence, of shape [batch size, 1, hidden size].

Return type:

Dict[str, torch.Tensor]
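The returned dictionary can be illustrated with a toy stand-in (random values, hypothetical sizes; not the library's forward pass). It assumes the usual LSTM convention that the last-step states have hidden-size width:

```python
import numpy as np

batch, seq, n_hid, n_targets = 8, 5, 16, 1  # hypothetical sizes
rng = np.random.default_rng(1)

def toy_forward(data):
    """Mimic the shape contract of the EA-LSTM output dictionary."""
    h = rng.normal(size=(batch, seq, n_hid))          # per-step hidden states
    y_hat = h @ rng.normal(size=(n_hid, n_targets))   # linear head on hidden states
    return {
        "y_hat": y_hat,        # [batch size, sequence length, number of targets]
        "h_n": h[:, -1:, :],   # hidden state at the last time step
        "c_n": rng.normal(size=(batch, 1, n_hid)),  # cell state at the last time step
    }

out = toy_forward({"x_d": rng.normal(size=(batch, seq, 3))})
```

Downstream code would typically read `out["y_hat"]` for the predictions and use `h_n`/`c_n` only when inspecting or re-seeding model state.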

module_parts = ['embedding_net', 'input_gate', 'dynamic_gates', 'head']