Perpetual Generation: Online Learning of Linear SSMs from a Single Stream

Conference Paper

Can neural networks learn to represent a signal and keep generating it over time? Is the process stable? Is learning feasible in a fully online setting? Read this paper if you care about online learning and perpetual generation!
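To give a flavor of the setting, here is a minimal sketch (not the paper's actual algorithm, and all names and values are illustrative): a linear state-space model x_{t+1} = A x_t, y_t = C x_t whose rotation-matrix dynamics can generate a sinusoid indefinitely, with the readout C adapted online, one sample at a time, from a single streamed target signal.

```python
import math

# Illustrative sketch, not the paper's method: a 2-state linear SSM
# x_{t+1} = A x_t, y_t = C x_t. A is a rotation, so the state orbits
# the unit circle forever (perpetual generation); C is learned online.
omega = 0.1
A = [[math.cos(omega), -math.sin(omega)],
     [math.sin(omega),  math.cos(omega)]]  # eigenvalues on the unit circle
C = [0.0, 0.0]                              # readout, adapted from the stream
x = [1.0, 0.0]                              # initial state
lr = 0.05                                   # online learning rate

def step(A, x):
    # One state transition x_{t+1} = A x_t.
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

# Consume a single stream of targets and update C by stochastic
# gradient descent on the squared error, one sample at a time.
for t in range(2000):
    target = math.sin(omega * t)
    y = C[0]*x[0] + C[1]*x[1]
    err = y - target
    C = [C[0] - lr*err*x[0], C[1] - lr*err*x[1]]
    x = step(A, x)

# After online adaptation, the model generates the signal on its own,
# with no further input: this is the "perpetual generation" regime.
x = [1.0, 0.0]
preds = [None] * 100
for t in range(100):
    preds[t] = C[0]*x[0] + C[1]*x[1]
    x = step(A, x)
```

The key point the sketch mirrors is stability: because the state-transition matrix keeps the state bounded, generation never diverges, and the only learning signal is the streaming prediction error.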

Details

  • Authors: Michele Casoni, Tommaso Guidi, Stefano Melacci, Alessandro Betti, Marco Gori
  • Title: Perpetual Generation: Online Learning of Linear State-Space Models from a Single Stream
  • Where: International Conference on Artificial Neural Networks (ICANN) 2025

BibTeX

@inproceedings{DBLP:conf/icann/CasoniGMBG25,
  author       = {Michele Casoni and
                  Tommaso Guidi and
                  Stefano Melacci and
                  Alessandro Betti and
                  Marco Gori},
  editor       = {Walter Senn and
                  Marcello Sanguineti and
                  Ausra Saudargiene and
                  Igor V. Tetko and
                  Alessandro E. P. Villa and
                  Viktor K. Jirsa and
                  Yoshua Bengio},
  title        = {Perpetual Generation: Online Learning of Linear State-Space Models
                  from a Single Stream},
  booktitle    = {Artificial Neural Networks and Machine Learning - {ICANN} 2025 - 34th
                  International Conference on Artificial Neural Networks, Kaunas, Lithuania,
                  September 9-12, 2025, Proceedings, Part {I}},
  series       = {Lecture Notes in Computer Science},
  volume       = {16068},
  pages        = {533--544},
  publisher    = {Springer},
  year         = {2025},
  url          = {https://doi.org/10.1007/978-3-032-04558-4\_43},
  doi          = {10.1007/978-3-032-04558-4\_43}
}