Fractional deep neural network via constrained optimization

Antil, Harbir and Khatri, Ratna and Löhner, Rainald and Verma, Deepanshu (2021) Fractional deep neural network via constrained optimization. Machine Learning: Science and Technology, 2 (1). 015003. ISSN 2632-2153

Antil_2021_Mach._Learn.__Sci._Technol._2_015003.pdf - Published Version (961 kB)

Abstract

This paper introduces a novel algorithmic framework for a deep neural network (DNN) which, in a mathematically rigorous manner, allows us to incorporate history (or memory) into the network: it ensures all layers are connected to one another. This DNN, called Fractional-DNN, can be viewed as a time-discretization of a fractional-in-time nonlinear ordinary differential equation (ODE). The learning problem is then a minimization problem subject to that fractional ODE as a constraint. We emphasize that the analogy between existing DNNs and ODEs with a standard time derivative is by now well known; the focus of our work is the Fractional-DNN. Using the Lagrangian approach, we derive the backward-propagation and design equations. We test our network on several datasets for classification problems. Fractional-DNN offers various advantages over existing DNNs. The key benefits are a significant mitigation of the vanishing-gradient issue, due to the memory effect, and better handling of non-smooth data, due to the network's ability to approximate non-smooth functions.
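The time-discretization view from the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: a forward pass through a fractional residual network, assuming an L1-type discretization of a Caputo time derivative of order α ∈ (0, 1]; all function and variable names are hypothetical. Note how each layer's update sums over all previous layers, which is the memory effect the abstract describes.

```python
import numpy as np
from math import gamma


def l1_weights(n, alpha):
    """L1-scheme coefficients a_j = (j+1)^(1-alpha) - j^(1-alpha), j = 0..n-1."""
    j = np.arange(n, dtype=float)
    return (j + 1.0) ** (1.0 - alpha) - j ** (1.0 - alpha)


def fractional_forward(u0, layers, alpha=0.5, tau=0.1):
    """Forward pass of a fractional residual network (hypothetical sketch).

    layers: list of (W, b) pairs defining f(u) = tanh(W @ u + b).
    Discretizing D^alpha u = f(u) with the L1 scheme gives
        u_n = u_{n-1} - sum_{j=1}^{n-1} a_j (u_{n-j} - u_{n-j-1})
              + Gamma(2 - alpha) * tau^alpha * f(u_{n-1}),
    so the state at layer n depends on all earlier states (memory).
    """
    us = [np.asarray(u0, dtype=float)]
    for n, (W, b) in enumerate(layers, start=1):
        a = l1_weights(n, alpha)
        # History term: every previous layer contributes to the current update.
        hist = sum(a[j] * (us[n - j] - us[n - j - 1]) for j in range(1, n))
        f = np.tanh(W @ us[-1] + b)
        us.append(us[-1] - hist + gamma(2.0 - alpha) * tau ** alpha * f)
    return us
```

As a sanity check on the analogy the abstract invokes: for α = 1 the L1 coefficients a_j vanish for j ≥ 1, the history term drops out, and the update reduces to the standard forward-Euler (ResNet-style) step u_n = u_{n-1} + τ f(u_{n-1}).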

Item Type: Article
Subjects: STM Academic > Multidisciplinary
Depositing User: Unnamed user with email support@stmacademic.com
Date Deposited: 03 Jul 2023 04:58
Last Modified: 31 Oct 2023 06:34
URI: http://article.researchpromo.com/id/eprint/1204
