Advancements in Memory-Efficient and Variance-Optimized Pairwise Learning, Extended to Nonlinear Modeling and Spiking Neural Networks
Author
AlQuabeh, Hilal
Department
Machine Learning
Embargo End Date
2024-01-01
Type
Dissertation
Date
2024
Language
English
Abstract
This thesis presents an exploration of online pairwise learning, extending its traditional boundaries through novel algorithms and the application of Spiking Neural Networks (SNNs) to deep metric learning. The initial chapters introduce and refine advanced online gradient descent (OGD) algorithms tailored for pairwise learning in non-i.i.d. environments, demonstrating their scalability and efficiency by mitigating the quadratic growth in computational complexity and extending their applicability to kernelized models. Our proposed OGD methods, characterized by sub-linear regret and reduced complexity, demonstrate marked superiority over existing kernel and linear models across various datasets. Building upon these foundations, the exploration continues into the domain of deep metric learning with SNNs, with attention to the robustness of rate encoding in SNNs, a critical aspect that directly enhances pairwise learning strategies. By bridging the conceptual gap between SNN robustness and pairwise learning, this work illuminates the multifaceted interplay between encoding stability, metric learning effectiveness, and the overall robustness of learning systems. In a complementary exploration, we propose a novel approach that exploits the temporal dynamics of binary encoding to achieve high-dimensional data representation with low latency and superior accuracy. This methodology illustrates the potential of exploiting the temporal dimension within the SNN framework for enhanced discriminative capabilities in the metric space. Overall, this thesis not only advances the state of the art in online pairwise learning and deep metric learning with SNNs, but also opens new avenues for research by highlighting the importance of robustness in rate encoding and its implications for pairwise learning methodologies.
Through a blend of theoretical innovation and empirical validation, we showcase the synergy between these seemingly disparate domains, paving the way for more resilient and accurate machine learning models in complex environments.
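To make the core technique concrete: naive online pairwise learning compares each new example against the entire history, so the per-step cost grows linearly (and the total cost quadratically) with the stream length. The memory-efficient variants described above avoid this by updating against a small fixed-size buffer of past examples. The sketch below is an illustrative reconstruction, not the thesis's actual algorithm: it uses a pairwise hinge loss and a reservoir-sampled buffer, both of which are assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_hinge_grad(w, x_t, y_t, x_s, y_s):
    """Subgradient of the pairwise hinge loss
    max(0, 1 - (y_t - y_s) * w @ (x_t - x_s)) with respect to w."""
    diff_x = x_t - x_s
    diff_y = y_t - y_s
    if 1.0 - diff_y * (w @ diff_x) > 0.0:
        return -diff_y * diff_x
    return np.zeros_like(w)

def buffered_pairwise_ogd(stream, dim, buffer_size=16, lr=0.1):
    """Online gradient descent over pairs, but each new example is paired
    only with a fixed-size buffer, keeping the per-step cost O(buffer_size)
    instead of O(t)."""
    w = np.zeros(dim)
    buffer = []  # fixed-size sample of past (x, y) examples
    for t, (x_t, y_t) in enumerate(stream, start=1):
        if buffer:
            # average the pairwise gradients against buffered examples only
            grads = [pairwise_hinge_grad(w, x_t, y_t, xs, ys)
                     for xs, ys in buffer]
            w -= lr * np.mean(grads, axis=0)
        # reservoir sampling keeps the buffer a uniform sample of the history
        if len(buffer) < buffer_size:
            buffer.append((x_t, y_t))
        elif rng.integers(t) < buffer_size:
            buffer[rng.integers(buffer_size)] = (x_t, y_t)
    return w
```

The buffer bounds both memory and per-step compute; the theoretical contribution summarized in the abstract concerns showing that such approximations still enjoy sub-linear regret despite seeing only a subsample of all pairs.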
Citation
H. AlQuabeh, "Advancements in Memory-Efficient and Variance-Optimized Pairwise Learning, Extended to Nonlinear Modeling and Spiking Neural Networks", PhD Dissertation, Machine Learning, MBZUAI, Abu Dhabi, UAE, 2024
