mlp_modules
- class MultiLayerPerceptron(*args: Any, **kwargs: Any)
Bases: sensai.torch.torch_base.MCDropoutCapableNNModule
- __init__(input_dim: float, output_dim: float, hidden_dims: Sequence[int], hid_activation_fn: Callable[[torch.Tensor], torch.Tensor] = torch.sigmoid, output_activation_fn: Optional[Callable[[torch.Tensor], torch.Tensor]] = torch.sigmoid, p_dropout: Optional[float] = None)
  - Parameters:
    - input_dim – the dimension of the input vectors
    - output_dim – the dimension of the output vectors
    - hidden_dims – the sizes of the hidden layers, in order
    - hid_activation_fn – the activation function applied after each hidden layer
    - output_activation_fn – the activation function applied to the output layer (None for no activation)
    - p_dropout – the dropout probability to apply; None to disable dropout
- forward(x)
  - Applies the network to the input tensor x and returns the output tensor.
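To illustrate how the parameters above fit together, here is a minimal, hypothetical sketch of an MLP with the same constructor shape — an illustrative reimplementation in plain PyTorch, not the library's own code (the class name `SimpleMLP` is an assumption):

```python
import torch
import torch.nn as nn


class SimpleMLP(nn.Module):
    """Illustrative sketch of an MLP analogous to MultiLayerPerceptron.

    NOTE: this is a hypothetical reimplementation for explanation only,
    not sensai's actual code.
    """

    def __init__(self, input_dim, output_dim, hidden_dims,
                 hid_activation_fn=torch.sigmoid,
                 output_activation_fn=torch.sigmoid,
                 p_dropout=None):
        super().__init__()
        self.hid_activation_fn = hid_activation_fn
        self.output_activation_fn = output_activation_fn
        self.dropout = nn.Dropout(p_dropout) if p_dropout is not None else None
        # Chain the layer sizes: input -> hidden_dims... -> output
        dims = [int(input_dim)] + list(hidden_dims) + [int(output_dim)]
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1))

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            is_last = i == len(self.layers) - 1
            if self.dropout is not None:
                x = self.dropout(x)
            x = layer(x)
            # Hidden layers use hid_activation_fn; the last layer uses
            # output_activation_fn (which may be None for a linear output).
            if not is_last and self.hid_activation_fn is not None:
                x = self.hid_activation_fn(x)
            elif is_last and self.output_activation_fn is not None:
                x = self.output_activation_fn(x)
        return x


mlp = SimpleMLP(input_dim=4, output_dim=2, hidden_dims=[8, 8], p_dropout=0.1)
out = mlp(torch.randn(5, 4))
print(out.shape)
```

With a sigmoid output activation (the default above), every output component lies in (0, 1); pass `output_activation_fn=None` to obtain a linear output instead.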