Research

Selected Publications

15+ papers across activation functions, neural network optimization, and applied ML, published at ICLR, in Evolutionary Intelligence, and on arXiv.

Papers

Published Work

arXiv · Dec 2024
Making Sigmoid-MSE Great Again: Output Reset Challenges Softmax Cross-Entropy
Kanishka Tyagi, Chinmay Rane, et al.
Evolutionary Intelligence · Oct 2024
Optimizing performance through dynamic activation functions
Chinmay Rane, Kanishka Tyagi, et al.
ICLR 2024 · Mar 2024
Dynamic Activations for Neural Net Training
Chinmay Rane, Kanishka Tyagi, et al.
PhD Research

Core Thesis

AdAct — Adaptive Activation Functions

My doctoral research introduced AdAct, an adaptive activation-function algorithm for neural networks that outperforms fixed ReLU-based activations across CNN, RNN, and Transformer architectures. The work demonstrates that learned, data-driven activations yield consistent improvements in convergence speed and final accuracy without significant computational overhead.
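To illustrate the general idea of a learned, data-driven activation, here is a minimal NumPy sketch of a piecewise-linear activation with trainable hinge heights. This is an illustrative assumption about how such an activation could be structured, not the actual AdAct implementation; the class name, hinge layout, and ReLU-style initialisation are all hypothetical.

```python
import numpy as np

class AdaptivePWLActivation:
    """Hypothetical sketch of a learnable piecewise-linear activation.

    The output at input x is a linear interpolation between trainable
    "hinge" heights placed at fixed positions, so the shape of the
    nonlinearity can be adjusted during training rather than fixed in
    advance. Names and initialisation are illustrative only.
    """

    def __init__(self, num_hinges=7, x_min=-3.0, x_max=3.0):
        # Fixed hinge positions spanning the expected input range.
        self.hinges = np.linspace(x_min, x_max, num_hinges)
        # Trainable hinge heights, initialised to ReLU(hinge) so the
        # activation starts out as a familiar nonlinearity.
        self.heights = np.maximum(self.hinges, 0.0)

    def forward(self, x):
        # Linear interpolation between hinge heights; inputs outside
        # [x_min, x_max] clamp to the boundary heights.
        return np.interp(x, self.hinges, self.heights)

act = AdaptivePWLActivation()
out = act.forward(np.array([-2.0, 0.0, 2.0]))
```

In a real training loop the `heights` array would receive gradients alongside the network weights, which is what lets the activation shape adapt to the data.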