Stationary Activations for Uncertainty Calibration in Deep Learning


Presentation video for the paper: Lassi Meronen, Christabella Irwanto, and Arno Solin (2020). Stationary Activations for Uncertainty Calibration in Deep Learning. Advances in Neural Information Processing Systems (NeurIPS).
arXiv preprint: arxiv.org/abs/2010.09494

Comments: 1

  • @nguyenngocly1484 (3 years ago)

    You can also swap the roles in a neural net: fix the dot products (enacted with fast transforms) and make the activation functions adjustable. Parametric (adjustable) ReLU is already a known thing. Of course, you have to prevent the first transform from simply taking the spectrum of the input, which you can do with a fixed random pattern of sign flips. Then use a final transform as a readout layer. The fast Walsh-Hadamard transform is a good choice.
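    A minimal sketch of the commenter's idea (an editorial illustration, not from the paper): fixed mixing via the fast Walsh-Hadamard transform, a fixed random sign-flip pattern on the input, per-unit parametric-ReLU slopes as the only adjustable parameters, and a final transform as readout. The names `fwht` and `FixedTransformNet`, and the slope initialization 0.25, are assumptions; NumPy is used for brevity, so the slopes are shown but no training loop is.

    ```python
    import numpy as np

    def fwht(x):
        # Unnormalized fast Walsh-Hadamard transform along the last axis.
        # The last-axis length must be a power of two.
        y = x.copy()
        n = y.shape[-1]
        h = 1
        while h < n:
            for i in range(0, n, 2 * h):
                a = y[..., i:i + h].copy()
                b = y[..., i + h:i + 2 * h].copy()
                y[..., i:i + h] = a + b
                y[..., i + h:i + 2 * h] = a - b
            h *= 2
        return y

    class FixedTransformNet:
        # Hypothetical illustration of the commenter's architecture:
        # fixed dot products (FWHT), adjustable activations (PReLU).
        def __init__(self, n, n_layers, seed=0):
            rng = np.random.default_rng(seed)
            # Fixed random sign flips keep the first transform from
            # simply taking the spectrum of the input.
            self.signs = rng.choice([-1.0, 1.0], size=n)
            # Per-unit PReLU slopes are the only adjustable parameters.
            self.alphas = [np.full(n, 0.25) for _ in range(n_layers)]
            self.scale = 1.0 / np.sqrt(n)  # makes each FWHT orthonormal

        @staticmethod
        def prelu(x, alpha):
            return np.where(x >= 0.0, x, alpha * x)

        def forward(self, x):
            y = fwht(x * self.signs) * self.scale   # first fixed transform
            for alpha in self.alphas:
                y = self.prelu(y, alpha)            # adjustable activation
                y = fwht(y) * self.scale            # fixed mixing; the last one acts as the readout
            return y

    # Usage: one forward pass on a toy input of power-of-two width.
    net = FixedTransformNet(n=8, n_layers=2)
    print(net.forward(np.arange(8.0)))
    ```

    The 1/sqrt(n) scaling makes each transform orthogonal, so stacking layers does not change the signal scale; a real implementation would make the slopes trainable (e.g. via torch.nn.PReLU) and batch the transform.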
