Preprint, Working Paper. Year: 2024

A Gated Residual Kolmogorov-Arnold Networks for Mixtures of Experts

Abstract

This paper introduces KAMoE, a novel Mixture of Experts (MoE) framework based on Gated Residual Kolmogorov-Arnold Networks (GRKAN). We propose GRKAN as an alternative to the traditional gating function, aiming to enhance efficiency and interpretability in MoE modeling. Through extensive experiments on digital asset markets and real estate valuation, we demonstrate that KAMoE consistently outperforms traditional MoE architectures across various tasks and model types. Our results show that GRKAN exhibits superior performance compared to standard Gated Residual Networks, particularly in LSTM-based models for sequential tasks. We also provide insights into the trade-offs between model complexity and performance gains in MoE and KAMoE architectures.
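As a concrete illustration of the gating idea the abstract describes, the sketch below shows how a gated residual block could produce mixture weights over a set of experts. It is a minimal, hypothetical reconstruction assuming the standard gated-residual pattern (projection, nonlinearity, GLU-style gate, residual skip, layer normalization); plain `nn.Linear` layers stand in for the paper's KAN layers, and the names `GRKANGate` and `KAMoE` below are illustrative, not taken from the authors' code.

```python
import torch
import torch.nn as nn

class GRKANGate(nn.Module):
    """Hypothetical sketch of a Gated Residual KAN gating block.

    Follows the gated-residual pattern (projection -> nonlinearity ->
    GLU-style gate -> residual skip -> layer norm). An nn.Linear stands in
    for each KAN layer, since the paper's exact implementation is not
    reproduced on this page.
    """

    def __init__(self, d_in: int, n_experts: int, d_hidden: int = 64):
        super().__init__()
        self.proj = nn.Linear(d_in, d_hidden)           # stand-in for a KAN layer
        self.act = nn.SiLU()
        self.glu = nn.Linear(d_hidden, 2 * n_experts)   # value and gate halves
        self.skip = nn.Linear(d_in, n_experts)          # residual path, matches output dims
        self.norm = nn.LayerNorm(n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.act(self.proj(x))
        value, gate = self.glu(h).chunk(2, dim=-1)
        gated = value * torch.sigmoid(gate)             # GLU-style gating
        weights = self.norm(self.skip(x) + gated)       # gated residual connection
        return torch.softmax(weights, dim=-1)           # mixture weights over experts


class KAMoE(nn.Module):
    """Minimal MoE wrapper: the GRKAN-style gate weights a set of experts."""

    def __init__(self, d_in: int, d_out: int, n_experts: int = 4):
        super().__init__()
        self.gate = GRKANGate(d_in, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(), nn.Linear(64, d_out))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.gate(x)                                           # (batch, n_experts)
        outs = torch.stack([e(x) for e in self.experts], dim=-1)  # (batch, d_out, n_experts)
        return (outs * w.unsqueeze(1)).sum(dim=-1)                 # weighted combination


# Usage: route a batch of 8 feature vectors through 4 experts.
model = KAMoE(d_in=16, d_out=1)
y = model(torch.randn(8, 16))
print(y.shape)  # torch.Size([8, 1])
```

The key design choice this sketch highlights is that the gating network itself carries the residual and gating structure, so the mixture weights can fall back on the skip path when the nonlinear transformation adds little, which is the interpretability angle the abstract emphasizes.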
Main file: 2409.15161v2.pdf (295.24 KB). Origin: files produced by the author(s)

Dates and versions

hal-04923946, version 1 (01-02-2025)

Cite

Hugo Inzirillo, Rémi Genet. A Gated Residual Kolmogorov-Arnold Networks for Mixtures of Experts. 2025. ⟨hal-04923946⟩