Paper Title

MorphoActivation: Generalizing ReLU activation function by mathematical morphology

Paper Authors

Santiago Velasco-Forero, Jesús Angulo

Paper Abstract

This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) by means of the algebraic basis of mathematical morphology. Additionally, a general family of activation functions is proposed by considering both max-pooling and nonlinear operators in the context of morphological representations. The experimental section validates our approach on classical benchmarks for supervised learning with DCNNs.
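
To make the abstract's idea concrete, the following is a minimal sketch of a morphological (max-plus) activation that absorbs spatial max-pooling and generalizes ReLU. It assumes PyTorch; the module name `MorphoActivationSketch`, the per-channel shift/threshold parameterization, and the `pool_size` argument are illustrative choices and not the exact operator defined in the paper.

```python
import torch
import torch.nn as nn


class MorphoActivationSketch(nn.Module):
    """Illustrative max-plus activation fused with spatial max-pooling.

    A sketch of the general idea only; the class name, the per-channel
    shift/threshold parameterization, and pool_size are assumptions,
    not the operator defined in the paper.
    """

    def __init__(self, channels: int, pool_size: int = 2):
        super().__init__()
        # Learnable per-channel shift and threshold (assumed parameterization).
        self.shift = nn.Parameter(torch.zeros(channels, 1, 1))
        self.threshold = nn.Parameter(torch.zeros(channels, 1, 1))
        self.pool = nn.MaxPool2d(pool_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Spatial max-pooling: a flat morphological dilation plus subsampling.
        p = self.pool(x)
        # Max-plus nonlinearity: max(p + shift, threshold).
        # With shift = 0 and threshold = 0 this equals max(p, 0), i.e.
        # max-pooling followed by ReLU (both are increasing max operators,
        # so the two steps commute).
        return torch.maximum(p + self.shift, self.threshold)


if __name__ == "__main__":
    layer = MorphoActivationSketch(channels=3)
    y = layer(torch.randn(1, 3, 8, 8))
    print(y.shape)  # torch.Size([1, 3, 4, 4])
```

With zero parameters the layer reduces exactly to max-pooling followed by ReLU, which is why the max-plus (dilation) view can treat the two nonlinearities as a single morphological operator.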
