Computing frustration and near-monotonicity in deep neural networks
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Electrical Engineering, Communication Systems. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0002-7599-4367
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0003-4142-6502
(English) Manuscript (preprint) (Other academic)
Abstract [en]

For the signed graph associated to a deep neural network, one can compute the frustration level, i.e., test how close or distant the graph is to structural balance. For all the pretrained deep convolutional neural networks we consider, we find that the frustration is always less than expected from null models. From a statistical physics point of view, and in particular in reference to an Ising spin glass model, the reduced frustration indicates that the amount of disorder encoded in the network is less than in the null models. From a functional point of view, low frustration (i.e., proximity to structural balance) means that the function representing the network behaves near-monotonically, i.e., more similarly to a monotone function than in the null models. Evidence of near-monotonic behavior along the partial order determined by frustration is observed for all networks we consider. This confirms that the class of deep convolutional neural networks tends to have a more ordered behavior than expected from null models, and suggests a novel form of implicit regularization.
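For small signed graphs, the frustration level described above can be computed exactly by brute force: minimize, over all spin assignments, the number of edges left "unsatisfied". A minimal sketch, assuming NumPy and a symmetric signed adjacency matrix; the exhaustive search is only feasible for small graphs and is an illustration, not the method used for the deep networks in the paper:

```python
from itertools import product
import numpy as np

def frustration(J):
    """Brute-force frustration of a signed graph with symmetric adjacency J.

    Minimizes, over all spin assignments s in {-1, +1}^n, the number of
    unsatisfied edges, i.e. edges (i, j) with J_ij * s_i * s_j < 0.
    Frustration 0 means the graph is structurally balanced.
    """
    n = J.shape[0]
    best = np.inf
    for spins in product([-1, 1], repeat=n):
        s = np.array(spins)
        # count unsatisfied edges once each (upper triangle only)
        unsat = np.sum(np.triu(J * np.outer(s, s), k=1) < 0)
        best = min(best, unsat)
    return int(best)

# A triangle with two negative edges is balanced (frustration 0) ...
balanced = np.array([[0, 1, -1],
                     [1, 0, -1],
                     [-1, -1, 0]])
# ... while an all-negative triangle is frustrated: one edge must stay unsatisfied.
frustrated = np.array([[0, -1, -1],
                       [-1, 0, -1],
                       [-1, -1, 0]])
print(frustration(balanced))    # 0
print(frustration(frustrated))  # 1
```

Testing against null models, as in the abstract, then amounts to comparing this quantity for the real graph with its value for randomized versions of the same graph.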

Keywords [en]
Disordered Systems and Neural Networks, Machine Learning
National subject category
Control Engineering; Artificial Intelligence
Identifiers
URN: urn:nbn:se:liu:diva-221214
DOI: 10.48550/arXiv.2510.05286
OAI: oai:DiVA.org:liu-221214
DiVA, id: diva2:2038299
Available from: 2026-02-13 Created: 2026-02-13 Last updated: 2026-02-13
Part of thesis
1. A Graph-Based Perspective on Neural Networks
2026 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

The empirical success of deep learning in a wide range of applications over the last decade has been remarkable. Neural networks can now achieve human-like or superhuman performance at tasks such as image recognition and segmentation, speech recognition, and natural language generation.

Despite decades of research dedicated to understanding how such models learn, there are still many unresolved questions. For instance, neural networks are often severely overparameterized, sometimes with many more parameters than training samples, which according to intuition from classical theory should lead to high sensitivity to noise and poor performance when encountering new data. Yet with enough parameters or training, one can overcome this issue, even without explicit regularization. Understanding implicit biases in training and the induced behavior of neural networks is an important puzzle piece towards understanding how these models learn so efficiently.

This thesis emphasizes the ‘network’ part of neural networks, and uses tools from graph theory to view this class of models from a new perspective that adds to our understanding of their inner workings.

The first paper treats deep linear neural networks, which are neural networks where the nonlinear activations have been removed. The gradient flow equations describing the network's learning process constitute an analytically treatable dynamical system, and although it is a simplified model, a deep linear network shares several interesting features with its nonlinear counterpart, such as a non-convex loss function and nonlinear dynamics induced by the overparameterization. The network is considered as a directed acyclic graph and the learning dynamics are described in terms of its adjacency matrix. This reformulation simplifies the gradient flow equations and provides insight into the system properties. For instance, it allows us to highlight an equivalence relation among adjacency matrices, and to investigate stable and unstable manifolds at the critical points of the system without needing to compute the Hessian of the loss function.
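To illustrate the kind of system the first paper studies, here is a minimal sketch of (Euler-discretized) gradient flow for a depth-3 linear network fitting a target linear map. The dimensions, step size, iteration count, and initialization scale are arbitrary choices for the example, not taken from the paper, and the code does not use the paper's adjacency-matrix reformulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Depth-3 linear network f(x) = W3 @ W2 @ W1 @ x, trained to fit a
# target map T by gradient flow on L = 0.5 * ||W3 W2 W1 - T||_F^2,
# discretized here with a small explicit Euler step.
d = 4
T = rng.standard_normal((d, d))
Ws = [0.1 * rng.standard_normal((d, d)) for _ in range(3)]

dt = 1e-2
for _ in range(50000):
    E = Ws[2] @ Ws[1] @ Ws[0] - T          # residual of the end-to-end map
    grads = [
        (Ws[2] @ Ws[1]).T @ E,             # dL/dW1
        Ws[2].T @ E @ Ws[0].T,             # dL/dW2
        E @ (Ws[1] @ Ws[0]).T,             # dL/dW3
    ]
    for W, g in zip(Ws, grads):
        W -= dt * g

# residual norm of the product map; should be near 0 once the flow converges
print(np.linalg.norm(Ws[2] @ Ws[1] @ Ws[0] - T))
```

Note how the dynamics are nonlinear in the weights even though the network itself is linear in its input: each layer's gradient couples to the products of the other layers, which is exactly the overparameterization-induced nonlinearity mentioned above.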

The second paper uses the concept of frustration from statistical physics in the context of deep neural networks, and relates frustration to monotonicity of the network when viewed as a function. It is shown that state-of-the-art convolutional neural networks trained on image classification tasks are less frustrated, and thus closer to monotone functions, than what is expected from null models. This suggests an implicit bias in the kind of function that they learn.
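The comparison against null models can be sketched with a cheap proxy for frustration: the fraction of triangles whose edge-sign product is negative (a balanced signed graph has none), compared against sign-shuffled copies of the same graph. Both the triangle index and this particular null model are simplifications for illustration, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def negative_triangle_fraction(J):
    """Fraction of triangles whose edge-sign product is negative.

    A signed graph is structurally balanced iff every cycle is positive,
    so the share of negative triangles is a cheap frustration proxy.
    """
    S = np.sign(J)
    A = (S != 0).astype(float)
    total = np.trace(A @ A @ A) / 6        # number of triangles
    signed = np.trace(S @ S @ S) / 6       # (# positive - # negative) triangles
    return (total - signed) / (2 * total)

def sign_shuffled(J, rng):
    """Null model: keep the topology and weights, shuffle the edge signs."""
    iu = np.triu_indices_from(J, k=1)
    signs = np.sign(J[iu])
    mask = signs != 0
    shuf = signs.copy()
    shuf[mask] = rng.permutation(signs[mask])
    out = np.zeros_like(J)
    out[iu] = np.abs(J[iu]) * shuf
    return out + out.T

# Example: a near-balanced signed graph (two factions, two flipped edges)
n = 10
s = np.array([1] * 5 + [-1] * 5)
J = np.outer(s, s).astype(float)
np.fill_diagonal(J, 0)
J[0, 1] = J[1, 0] = -1.0                   # flip one within-faction edge
J[0, 5] = J[5, 0] = 1.0                    # flip one across-faction edge

f_real = negative_triangle_fraction(J)
f_null = np.mean([negative_triangle_fraction(sign_shuffled(J, rng))
                  for _ in range(200)])
print(f_real, f_null)  # the near-balanced graph is far less frustrated than its nulls
```

A finding of "reduced frustration" in the paper's sense corresponds to the real network's graph sitting well below its null ensemble in this kind of comparison.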

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2026. p. 36
Series
Linköping Studies in Science and Technology. Licentiate Thesis, ISSN 0280-7971 ; 2028
National subject category
Computer Sciences; Control Engineering
Identifiers
urn:nbn:se:liu:diva-221215 (URN)
10.3384/9789181184822 (DOI)
9789181184815 (ISBN)
9789181184822 (ISBN)
Presentation
2026-03-13, Ada Lovelace, B-huset, Campus Valla, Linköping, 10:15
Opponent
Supervisors
Available from: 2026-02-13 Created: 2026-02-13 Last updated: 2026-03-17 Bibliographically approved

Open Access in DiVA

fulltext (3622 kB), 54 downloads
File information
File name: FULLTEXT01.pdf
File size: 3622 kB
Checksum (SHA-512): c215330521978ff92cf725eef4cffa0ca23c9b970e618c7744a7ba4fae0c63cfe520f349f3c09ab9b94ff034b89f5c1565770772f49a375954c3fe3a74645024
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Authors

Wendin, Joel; Larsson, Erik G.; Altafini, Claudio
