[ICLR2025] Kolmogorov-Arnold Transformer
ikan: many KAN variants for everybody
This is the implementation of the paper Enhanced Photovoltaic Power Forecasting: An iTransformer and LSTM-Based Model Integrating Temporal and Covariate Interactions
Parametric differentiable curves with PyTorch for continuous embeddings, shape-restricted models, or KANs
High-order and sparse layers in PyTorch: Lagrange polynomial, piecewise Lagrange polynomial, piecewise discontinuous Lagrange polynomial (Chebyshev nodes), and Fourier series layers of arbitrary order. Piecewise implementations can be thought of as a 1D grid (per neuron) where each grid element is a Lagrange polynomial. Both fully connected a…
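
As a rough illustration of the piecewise idea above (a learnable univariate curve per connection, summed over the inputs), here is a minimal single-cell sketch in PyTorch. The `LagrangeLayer` class, the `chebyshev_nodes` helper, and the chosen order are hypothetical names for this sketch, not the repository's API; a piecewise layer would tile the input range with several such cells per neuron.

```python
import math

import torch
import torch.nn as nn


def chebyshev_nodes(n: int) -> torch.Tensor:
    """Chebyshev nodes of the first kind on [-1, 1]."""
    k = torch.arange(n, dtype=torch.float32)
    return torch.cos((2 * k + 1) * math.pi / (2 * n))


class LagrangeLayer(nn.Module):
    """One Lagrange-polynomial 'grid cell' per (input, output) connection.

    Each connection carries a learnable univariate curve given by its values
    at Chebyshev nodes; the layer sums these curves over the inputs, KAN-style.
    """

    def __init__(self, in_features: int, out_features: int, order: int = 4):
        super().__init__()
        self.register_buffer("nodes", chebyshev_nodes(order + 1))
        # Learnable function values at the nodes, one curve per (out, in) pair.
        self.values = nn.Parameter(0.1 * torch.randn(out_features, in_features, order + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features), assumed to lie in [-1, 1].
        x = x.clamp(-1.0, 1.0)
        n = self.nodes.numel()
        idx = torch.arange(n, device=x.device)
        diff = x.unsqueeze(-1) - self.nodes                        # (batch, in, n): x - x_m
        numer = diff.unsqueeze(-2).repeat(1, 1, n, 1)              # (batch, in, n_j, n_m)
        numer[..., idx, idx] = 1.0                                 # drop the m == j factor
        denom = self.nodes.unsqueeze(1) - self.nodes.unsqueeze(0)  # x_j - x_m
        denom = denom + torch.eye(n, device=x.device, dtype=x.dtype)
        # Lagrange basis: L_j(x) = prod_{m != j} (x - x_m) / (x_j - x_m)
        basis = numer.prod(-1) / denom.prod(-1)                    # (batch, in, n)
        # Sum the per-input univariate functions for each output unit.
        return torch.einsum("bin,oin->bo", basis, self.values)


if __name__ == "__main__":
    layer = LagrangeLayer(in_features=8, out_features=3, order=4)
    print(layer(torch.rand(5, 8) * 2 - 1).shape)  # torch.Size([5, 3])
```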
Kolmogorov–Arnold Networks (KAN) in PyTorch
Short experiment with Deep Q-Learning + KAN to play Flappy Bird.
LinearKAN: A very fast implementation of Kolmogorov-Arnold Networks
PyTorch implementation of QKAN "Quantum-inspired Kolmogorov-Arnold Network" https://arxiv.org/abs/2509.14026
Implicit representation of various things using PyTorch and high order layers
Neural Network Implicit Representation of Partial Differential Equations
Baantu Research: Hybrid KAN-Transformer for investigating learnable activations in LLM reasoning. Built on nanochat by Andrej Karpathy.
A rigorous 2x3 factorial comparison of neural network architectures: KAN vs MLP feedforward layers combined with Transformer vs Mamba sequence models. Investigates whether KAN advantages stem from B-spline activations or network topology.
Empirical investigation of grokking in KAN. Key finding: KAN groks multiplication 12x faster than MLP!
A KAN that classifies handwritten digits from the MNIST dataset, with efficient prediction and automated data handling.
A multi-agent deep reinforcement learning model to de-traffic our lives
Bottleneck KANConv for Unet
Lightweight Kolmogorov-Arnold Network based model for Image Classification
Experiments in language interpolation with high order sparse neural networks
PyTorch implementation of Multifidelity Kolmogorov-Arnold Networks (MFKANs) for data-efficient learning. Train accurate models with sparse high-fidelity data by leveraging correlations with abundant low-fidelity data.
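
The sketch below illustrates the multifidelity composition described above: a surrogate trained on abundant low-fidelity data, plus a correction network trained on the sparse high-fidelity data. `MultifidelityModel`, `low_fidelity`, and `correction` are illustrative placeholder names rather than the MFKAN repository's API, and plain MLP blocks stand in where an MFKAN would use KAN layers.

```python
import torch
import torch.nn as nn


class MultifidelityModel(nn.Module):
    """Low-fidelity surrogate plus a learned correction to high fidelity."""

    def __init__(self, in_dim: int, hidden: int = 32):
        super().__init__()
        # Placeholder sub-networks; an MFKAN would use KAN layers here instead.
        self.low_fidelity = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )
        self.correction = nn.Sequential(
            nn.Linear(in_dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y_low = self.low_fidelity(x)
        # High-fidelity prediction = low-fidelity output + a correction that
        # models the correlation between the two fidelity levels.
        return y_low + self.correction(torch.cat([x, y_low], dim=-1))


model = MultifidelityModel(in_dim=2)
# Typical two-stage training: fit model.low_fidelity on the abundant
# low-fidelity set, freeze it with requires_grad_(False), then fit
# model.correction on the sparse high-fidelity set.
print(model(torch.rand(4, 2)).shape)  # torch.Size([4, 1])
```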