Graduation Year
2024
Document Type
Thesis
Degree
M.S.C.S.
Degree Name
MS in Computer Science (M.S.C.S.)
Degree Granting Department
Computer Science and Engineering
Major Professor
Ankur Mali, Ph.D.
Committee Member
Lawrence Hall, Ph.D.
Committee Member
John Murray-Bruce, Ph.D.
Keywords
Neural Networks, Convergence Acceleration, Learning Stability, Computational Efficiency
Abstract
We propose the Hyperbolic Tangent Exponential Linear Unit (TeLU), a neural network hidden activation function defined as $\mathrm{TeLU}(x) = x \cdot \tanh(e^x)$. TeLU's design is grounded in the core principles of key activation functions: it achieves strong convergence by closely approximating the identity function in its active region while effectively mitigating the vanishing-gradient problem in its saturating region. Its simple formulation enhances computational efficiency, improving scalability and convergence speed. Unlike many modern activation functions, TeLU seamlessly combines the simplicity and effectiveness of ReLU with the smoothness and analytic properties essential for learning stability in deep neural networks. Because TeLU mimics the behavior and optimal hyperparameter settings of ReLU while introducing the benefits of smoothness and curvature, it is an ideal drop-in replacement. Its analytic nature positions TeLU as a powerful universal approximator, enhancing both robustness and generalization across a multitude of experiments. We rigorously support these claims through theoretical analysis and extensive experiments, demonstrating TeLU's performance across challenging benchmarks, including ResNet18 on ImageNet, Dynamic-Pooling Transformers on Text8, and recurrent neural networks (RNNs) on the Penn Treebank dataset. These results highlight TeLU's potential to set a new standard for activation functions, driving more efficient and stable learning in deep neural networks and thereby accelerating scientific discoveries across various fields.
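For readers who want to try the activation directly, the following is a minimal PyTorch sketch based only on the formula given in the abstract; it is an illustration, not the author's reference implementation, and the class name `TeLU` and the surrounding example model are hypothetical.

```python
import torch
import torch.nn as nn

class TeLU(nn.Module):
    """TeLU activation: x * tanh(exp(x)), per the definition in the abstract."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # For large positive x, exp(x) overflows to inf and tanh(inf) == 1.0,
        # so the output correctly approaches the identity x (the active region);
        # for large negative x, exp(x) -> 0 and the output saturates toward 0.
        return x * torch.tanh(torch.exp(x))

# Hypothetical usage: a drop-in replacement wherever nn.ReLU() would appear.
model = nn.Sequential(nn.Linear(128, 64), TeLU(), nn.Linear(64, 10))
```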
Scholar Commons Citation
Fernandez, Alfredo, "TeLU Activation Function for Fast and Stable Deep Learning" (2024). USF Tampa Graduate Theses and Dissertations.
https://digitalcommons.usf.edu/etd/10618