15+ papers across activation functions, neural network optimisation, and applied ML, published at ICLR, in Evolutionary Intelligence, and on arXiv.
My doctoral research introduced AdAct, an adaptive activation method for neural networks that outperforms traditional ReLU-based activations across CNN, RNN, and Transformer architectures. The work demonstrates that learned, data-driven activations deliver consistent improvements in convergence speed and final accuracy without significant computational overhead.
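The exact formulation of AdAct isn't reproduced here; as an illustration of the general idea of a learned activation, the following PyTorch sketch implements one common form: a trainable softmax blend of fixed basis nonlinearities (ReLU, tanh, SiLU), optimised jointly with the network weights. The class name `AdaptiveActivation` and the choice of basis functions are illustrative assumptions, not the published AdAct design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveActivation(nn.Module):
    """Hypothetical sketch of a learned activation: a trainable convex
    blend of fixed basis nonlinearities. Not the published AdAct code."""

    def __init__(self):
        super().__init__()
        # One mixing logit per basis function, learned by backprop
        # alongside the rest of the network.
        self.logits = nn.Parameter(torch.zeros(3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax keeps the mixture a convex combination, so the
        # activation interpolates smoothly between the basis functions.
        w = torch.softmax(self.logits, dim=0)
        return w[0] * F.relu(x) + w[1] * torch.tanh(x) + w[2] * F.silu(x)

# Usage: drop it in wherever a fixed nonlinearity would go.
net = nn.Sequential(nn.Linear(32, 64), AdaptiveActivation(), nn.Linear(64, 10))
```

Because the mixing weights receive gradients, each layer can drift toward whichever nonlinearity suits the data, which is the intuition behind "learned, data-driven activations"; per-layer or per-channel parameters are a common refinement of this basic scheme.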