BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Cristhian Garay Lopez (Centro de Investigación en Matemáticas)
DTSTART:20241111T060000Z
DTEND:20241111T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/1
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/1/">Idempotization of schemes and sheaves</a>\nby Cristhian G
 aray Lopez (Centro de Investigación en Matemáticas) as part of Tropical 
 mathematics and machine learning\n\n\nAbstract\nTropical algebraic geometr
 y tries to be the algebraic geometry of the tropical semiring\, which is an e
 xample of an idempotent semiring.\n\nMotivated by several relevant probl
 ems in tropical and non-Archimedean algebraic geometry (e.g. the definitio
 n of tropical schemes\, or the analytification and the schematic tropicali
 zation of algebraic varieties defined over a valuated field) we present an
  algebraic process for the “idempotization” of both schemes and sheave
 s of rings and modules over them\, understanding “idempotization” as a
  process that associates idempotent algebraic objects to the usual objects
  of algebraic geometry and commutative algebra.\n\nThis goal is achieved b
 y first constructing the affine scheme case\, and then globalizing it usin
 g a fixed affine open cover of a given scheme. The affine case is governed
  by certain idempotent semirings and idempotent semi-modules defined over 
 them (that we call realizable semirings and semimodules\, respectively)\, 
 which turn out to be semi-lattices that can be studied through a version o
 f commutative algebra for idempotent semirings. We show that these objects
  are suitable for our formalism because their lattices of sub-objects are (
 in a precise way) a combinatorial reflection of the usual lattices obtaine
 d in commutative algebra.\n\nThis is joint work with Félix Baril Boudre
 au (U. of Luxembourg).\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Gabriele Balletti (RaySearch Laboratories)
DTSTART:20241118T063000Z
DTEND:20241118T073000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/2
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/2/">Machine Assisted Proofs and Disproofs in Discrete Geometr
 y</a>\nby Gabriele Balletti (RaySearch Laboratories) as part of Tropical m
 athematics and machine learning\n\n\nAbstract\nI will discuss how modern c
 omputational techniques can help us get a better understanding of the math
 ematics of geometric structures\, with examples from Ehrhart Theory of lat
 tice polytopes. \nI will present several machine-assisted proofs where com
 putational aid has been essential\, and a more recent counterexample - ach
 ieved through a genetic algorithm - that gives a negative answer to a ques
 tion by Ferroni and Higashitani.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Yuto Yamamoto (RIKEN iTHEMS)
DTSTART:20250106T060000Z
DTEND:20250106T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/3
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/3/">Period integrals of hypersurfaces via tropical geometry</
 a>\nby Yuto Yamamoto (RIKEN iTHEMS) as part of Tropical mathematics and ma
 chine learning\n\n\nAbstract\nAbouzaid--Ganatra--Iritani--Sheridan compute
 d asymptotics of integrals of holomorphic volume forms on toric Calabi-
 -Yau hypersurfaces over Lagrangian sections of SYZ fibrations by using tro
 pical geometry. They gave a new proof of the gamma conjecture for ambient 
 line bundles on Batyrev pairs of mirror Calabi--Yau hypersurfaces. In the 
 talk\, we review their work and discuss its generalization to the case of 
 toric hypersurfaces which are not necessarily Calabi--Yau hypersurfaces.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Edward Hirst (Queen Mary\, University of London)
DTSTART:20250210T080000Z
DTEND:20250210T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/4
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/4/">Machine Learning Combinatorics from hep-th</a>\nby Edward
  Hirst (Queen Mary\, University of London) as part of Tropical mathematics
  and machine learning\n\n\nAbstract\nAn informal review of works applying 
 supervised machine learning architectures to combinatorial objects in hep-t
 h will be provided. These objects are related to quiver gauge theories\, an
 d include quivers\, Hilbert Series\, Amoebae\, and Brane Webs\; we discuss t
 heir efficient representation and amenability to ML architectures on some s
 imple tasks.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Samuel Bernard-Bernardet and Benjamin Apffel (DotWave Lab (S.B.B.)
 \, LWE\, EPFL\, Switzerland (B.A.))
DTSTART:20250224T080000Z
DTEND:20250224T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/5
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/5/">The spinorial ball : a macroscopic object of spin-1/2</a>
 \nby Samuel Bernard-Bernardet and Benjamin Apffel (DotWave Lab (S.B.B.)\, 
 LWE\, EPFL\, Switzerland (B.A.)) as part of Tropical mathematics and machi
 ne learning\n\n\nAbstract\nIn quantum physics lectures\, half-integer spin
 s are generally introduced as “objects that do not come back to their or
 iginal state after one full turn but that do after two” and are often bel
 ieved to be purely quantum-mechanical behavior. However\, spin-1/2 is abov
 e all a geometrical property of the rotation group SO(3) and can\, therefo
 re\, also have practical consequences at the macroscopic scale. To illustr
 ate this\, we will describe in this seminar a new visualization tool calle
 d the spinorial ball\, which allows one to concretely manipulate a macrosc
 opic spin-1/2. It is based on the group homomorphism between SU(2) and SO(
 3)\, which will be discussed extensively during the seminar. We will also s
 how\, via livestream\, how to visualize the Poincaré-Bloch sphere\, the H
 opf bundle\, or the homotopy classes of SO(3) using the ball. Finally\, we w
 ill describe some key practical aspects of the implementation.\n\nUseful l
 inks:\n\nGitHub of the project (open source): https://github.com/heligone/
 spinorialBall\n\nA second paper: https://arxiv.org/abs/2411.15059\n\nSpino
 rial ball website: www.spinorialball.com\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Baran Hashemi (cancelled) (Technical University of Munich)
DTSTART:20250217T080000Z
DTEND:20250217T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/6
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/6/">Can Transformers Do Enumerative Geometry?</a>\nby Baran H
 ashemi (cancelled) (Technical University of Munich) as part of Tropical ma
 thematics and machine learning\n\n\nAbstract\nWe introduce a Transformer-b
 ased approach to computational enumerative geometry\, specifically targeti
 ng the computation of $\\psi$-class intersection numbers on the moduli spa
 ce of curves. Traditional methods for calculating these numbers suffer fro
 m factorial computational complexity\, making them impractical to use. By 
 reformulating the problem as a continuous optimization task\, we compute i
 ntersection numbers across a wide value range from $10^{45}$ to $10^{-45}$
 . To capture the recursive nature inherent in these intersection numbers\,
  we propose the Dynamic Range Activator (DRA)\, a new activation function 
 that enhances the Transformer's ability to model recursive patterns and ha
 ndle severe heteroscedasticity. Given precision requirements for computing
  the intersections\, we quantify the uncertainty of the predictions using 
 Conformal Prediction with a dynamic sliding window adaptive to the partiti
 ons of equivalent number of marked points. To the best of our knowledge\, 
 there has been no prior work on modeling recursive functions with such hig
 h variance and factorial growth. Beyond simply computing intersection nu
 mbers\, we explore the enumerative "world-model" of Transformers. Our inte
 rpretability analysis reveals that the network is implicitly modeling the 
 Virasoro constraints in a purely data-driven manner. Moreover\, through ab
 ductive hypothesis testing\, probing\, and causal inference\, we uncover e
 vidence of an emergent internal representation of the large-genus asympto
 tic of the $\\psi$-class intersection numbers. These findings suggest tha
 t the network internalizes the parameters of the asymptotic closed-form a
 nd the polynomiality phenomenon of $\\psi$-class intersection numbers in a n
 on-linear manner.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Olexandr Konovalov (University of St Andrews)
DTSTART:20250428T080000Z
DTEND:20250428T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/7
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/7/">Open science and reproducible research</a>\nby Olexandr K
 onovalov (University of St Andrews) as part of Tropical mathematics and ma
 chine learning\n\n\nAbstract\nDr Olexandr Konovalov is a lecturer in the S
 chool of Computer Science of the University of St Andrews\, where he lead
 s the Research Software Group. He is also one of the developers of the op
 en-source mathematical software system GAP (https://www.gap-system.org). H
 e will talk about open science practices in computational research\, and w
 ill present some novel ways of using Jupyter notebooks to share reproducib
 le computational experiments in Python\, R\, GAP\, and other programming l
 anguages supported by Jupyter. Furthermore\, there will be a discussion of t
 he technical skills needed for modern collaborative research\, of training o
 pportunities offered by the Carpentries (https://carpentries.org/)\, and of c
 ollaborative translation projects to maintain multi-language versions of t
 he Carpentries training resources.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Marissa Masden (University of Puget Sound)
DTSTART:20250331T020000Z
DTEND:20250331T030000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/8
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/8/">Sign Sequence Combinatorics for Topological Measures of R
 eLU neural networks</a>\nby Marissa Masden (University of Puget Sound) as 
 part of Tropical mathematics and machine learning\n\n\nAbstract\nA (ReLU) 
 neural network is a type of piecewise linear (PL) function F which induces
  a canonical polyhedral subdivision\, $\\mathcal C(F)$\, on its input spac
 e (Grigsby and Lindsey\, 2022). This class of functions is commonly used in
  modern machine learning applications. Following a brief introduction to t
 hese functions and a topological perspective on data classification\, we w
 ill then discuss how ReLU networks induce a polyhedral complex on their in
 put space which arises from hyperplane arrangements. The face poset of thi
 s polyhedral complex (for a given ReLU neural network) is entirely determi
 ned by combinatorial "sign sequence" information about the vertices of the
  complex. We will explore how combinatorial properties of the face poset o
 f this polyhedral subdivision may be used to compute topological propertie
 s of a given ReLU function such as its level set topology\, critical point
 s\, and (most recently) a discrete gradient vector field agreeing with the
  function\, among other useful measures\, and demonstrate how this may be 
 used to understand ReLU neural networks as a class of functions.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/8/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Priyaa Varshinee Srinivasan (Tallinn University of Technology)
DTSTART:20250310T080000Z
DTEND:20250310T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/9
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/9/">Drazin inverses in Categories</a>\nby Priyaa Varshinee Sr
 inivasan (Tallinn University of Technology) as part of Tropical mathematic
 s and machine learning\n\n\nAbstract\nIn this talk\, we will explore Drazi
 n inverses through the lens of category theory. Drazin inverses are a fund
 amental algebraic structure that has been extensively studied in semigrou
 p theory and ring theory. Drazin inverses can also be defined for endomap
 s in any category. We will introduce the Drazin inverse of an endomap\, D
 razin categories (categories in which every endomorphism has a Drazin inv
 erse)\, and provide various examples of such categories includin
 g the category of matrices over a field\, and explore a few properties of 
 these inverses.\n\nThe talk will feature lots of pictures!\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/9/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Anibal Medina Mardones (Western University)
DTSTART:20250317T070000Z
DTEND:20250317T080000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/10
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/10/">What makes math problems hard for RL: a case study</a>\n
 by Anibal Medina Mardones (Western University) as part of Tropical mathema
 tics and machine learning\n\n\nAbstract\nUsing a long-standing conjecture 
 from combinatorial group theory\, we explore\, from multiple perspectives\
 , the challenges of finding rare instances carrying disproportionately hig
 h rewards. Based on lessons learned in the context defined by the Andrews
 –Curtis conjecture\, we propose algorithmic enhancements and a topologic
 al hardness measure with implications for a broad class of search problems
 . As part of our study\, we also address several open mathematical questio
 ns. Notably\, we demonstrate the length reducibility of all but two presen
 tations in the Akbulut–Kirby series (1981) and resolve various potential
  counterexamples in the Miller–Schupp series (1991)\, including three in
 finite subfamilies.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/10/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Siddharth Pritam (Chennai Mathematical Institute)
DTSTART:20250407T060000Z
DTEND:20250407T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/11
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/11/">Classification of Temporal Graphs using Persistent Homol
 ogy</a>\nby Siddharth Pritam (Chennai Mathematical Institute) as part of T
 ropical mathematics and machine learning\n\n\nAbstract\nTemporal graphs ef
 fectively model dynamic systems by representing interactions as timestampe
 d edges. However\, analytical tools for temporal graphs are limited compar
 ed to static graphs. We propose a novel method for analyzing temporal grap
 hs using Persistent Homology. Our approach leverages δ-temporal motifs (r
 ecurrent sub-graphs) to capture temporal dynamics. By evolving these motif
 s\, we define the average filtration and compute PH on the associated cliq
 ue complex. This method captures both local and global temporal structures
  and is stable with respect to reference models. We demonstrate the applic
 ability of our approach to the temporal graph classification task. Experim
 ents verify the effectiveness of our approach\, achieving over 92% accurac
 y\, with some cases reaching 100%. Unlike existing methods that require no
 de classes\, our approach is node class free\, offering flexibility for a 
 wide range of temporal graph analysis tasks.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/11/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Keiji Miura (Kwansei Gakuin University)
DTSTART:20250414T060000Z
DTEND:20250414T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/12
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/12/">Tropical Neural Networks and Its Applications to Classif
 ying Phylogenetic Trees</a>\nby Keiji Miura (Kwansei Gakuin University) as
  part of Tropical mathematics and machine learning\n\n\nAbstract\nDeep neu
 ral networks show great success when input vectors are in a Euclidean spac
 e. However\, those classical neural networks show poor performance when
  inputs are phylogenetic trees\, which can be written as vectors in the tr
 opical projective torus. Here we propose tropical embedding to transform a
  vector in the tropical projective torus to a vector in the Euclidean spac
 e via the tropical metric. We introduce a tropical neural network where th
 e first layer is a tropical embedding layer and the following layers are t
 he same as the classical ones. We prove that a tropical neural network is 
 a universal approximator and we derive a backpropagation rule for deep tro
 pical neural networks. Then we provide TensorFlow 2 code for implementing a tr
 opical neural network in the same fashion as the classical one\, where the w
 eight initialization problem is considered according to extreme value stati
 stics. We apply our method to empirical data including sequences of hemaggl
 utinin for the influenza virus from New York. Finally\, we show th
 at a tropical neural network can be interpreted as a generalization of a t
 ropical logistic regression.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/12/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Bruno Gavranović
DTSTART:20250512T080000Z
DTEND:20250512T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/13
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/13/">Learning Functors using Gradient Descent</a>\nby Bruno G
 avranović as part of Tropical mathematics and machine learning\n\n\nAbstr
 act\nCycleGAN is a general approach to unpaired image-to-image translation
  that has been getting attention in recent years. Inspired by categorical 
 database systems\, we show that CycleGAN is a "schema"\, i.e. a specific c
 ategory presented by generators and relations\, whose specific parameter i
 nstantiations are just set-valued functors on this schema. We show that en
 forcing cycle-consistencies amounts to enforcing composition invariants in
  this category. We generalize the learning procedure to arbitrary such cat
 egories and show that a special class of functors\, rather than functions\
 , can be learned using gradient descent. Using this framework we design a 
 novel neural network system capable of learning to insert and delete objec
 ts from images without paired data. We qualitatively evaluate the system o
 n three different datasets and obtain promising results.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/13/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Geoffrey Cruttwell (Mount Allison University)
DTSTART:20250519T020000Z
DTEND:20250519T030000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/14
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/14/">An introduction to tangent categories</a>\nby Geoffrey C
 ruttwell (Mount Allison University) as part of Tropical mathematics and ma
 chine learning\n\n\nAbstract\nOne of the central constructions in differen
 tial geometry is the tangent bundle: the collection of all "tangent vector
 s" at all points of a space. In this talk\, we'll look at an axiomatizatio
 n/formalization of the tangent bundle called a "tangent category".  We'll 
 start with a brief review of category theory\, then look at some of the ca
 tegorical structure of the tangent bundle\, and finish by discussing some 
 of the different examples of tangent categories.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/14/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Horacio Rostro-Gonzalez
DTSTART:20250901T070000Z
DTEND:20250901T080000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/15
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/15/">Canceled</a>\nby Horacio Rostro-Gonzalez as part of Trop
 ical mathematics and machine learning\n\nAbstract: TBA\n\nCanceled meeting
 . Seminar starts next week.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/15/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Ismail Khalfaoui-Hassani (Forschungszentrum Jülich)
DTSTART:20250609T080000Z
DTEND:20250609T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/16
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/16/">Polynomial\, trigonometric\, and tropical activations</a
 >\nby Ismail Khalfaoui-Hassani (Forschungszentrum Jülich) as part of Trop
 ical mathematics and machine learning\n\n\nAbstract\nWhich functions can b
 e used as activations in deep neural networks? This article explores famil
 ies of functions based on orthonormal bases\, including the Hermite polyno
 mial basis and the Fourier trigonometric basis\, as well as a basis result
 ing from the tropicalization of a polynomial basis. Our study shows that\,
  through simple variance-preserving initialization and without additional 
 clamping mechanisms\, these activations can successfully be used to train 
 deep models\, such as GPT-2 for next-token prediction on OpenWebText and C
 onvNeXt for image classification on ImageNet. Our work addresses the issue
  of exploding and vanishing activations and gradients\, particularly preva
 lent with polynomial activations\, and opens the door for improving the ef
 ficiency of large-scale learning tasks. Furthermore\, our approach provide
 s insight into the structure of neural networks\, revealing that networks 
 with polynomial activations can be interpreted as multivariate polynomial 
 mappings. Finally\, using Hermite interpolation\, we show that our activat
 ions can closely approximate classical ones in pre-trained models by match
 ing both the function and its derivative\, making them especially useful f
 or fine-tuning tasks. These activations are available in the torchortho li
 brary\, which can be accessed via: https://github.com/K-H-Ismail/torchorth
 o.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/16/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Susana López-Moreno (Pusan National University)
DTSTART:20250908T060000Z
DTEND:20250908T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/17
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/17/">Poset Neural Networks</a>\nby Susana López-Moreno (Pusa
 n National University) as part of Tropical mathematics and machine learnin
 g\n\n\nAbstract\nThe paper ''Tropical Geometry of Deep Neural Networks'' b
 y L. Zhang et al. introduces an equivalence between integer-valued neural 
 networks (IVNN) with $\\text{ReLU}_{t}$ and tropical rational functions\, 
 which come with a map to polytopes. Expanding this connection to posets\, 
 we will see how neural networks are constructed from an order polytope.\nW
 e then explain how posets with four points induce neural networks that can
  be interpreted as $2\\times 2$ convolutional filters\, which can be used n
 ot only in IVNNs but in any general neural network.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/17/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Aldo Guzmán-Sáenz (IBM Research)
DTSTART:20250915T000000Z
DTEND:20250915T010000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/18
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/18/">Topological Data Analysis and Machine Learning - an appl
 ication to computational genomics</a>\nby Aldo Guzmán-Sáenz (IBM Researc
 h) as part of Tropical mathematics and machine learning\n\n\nAbstract\nPer
 sistent homology is a tool derived from algebraic topology that has been s
 uccessfully applied to real-world problems\, either on its own or combined
  with standard machine learning techniques. To further strengthen its appl
 icability\, it is necessary to establish mappings from topological feature
 s in homology to the original data space. In this talk we present one poss
 ible approach using harmonic persistent homology to identify biomarkers re
 levant to disease subtyping.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/18/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jose Perea (Northeastern University)
DTSTART:20250929T120000Z
DTEND:20250929T130000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/19
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/19/">Learning functions on the space of persistence diagrams<
 /a>\nby Jose Perea (Northeastern University) as part of Tropical mathemati
 cs and machine learning\n\n\nAbstract\nThe persistence diagram is an incre
 asingly useful shape descriptor from Topological Data Analysis\, but its u
 se alongside typical machine learning techniques requires mathematical fin
 esse.  We will describe in this talk a mathematical framework for featuriz
 ation of said descriptors\, and show how it addresses the problem of appro
 ximating continuous functions on compact subsets of the space of persisten
 ce diagrams.  We will also show how these techniques can be applied to pro
 blems in semi-supervised learning where these descriptors are relevant.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/19/
END:VEVENT
BEGIN:VEVENT
SUMMARY:John Abascal (Northeastern University)
DTSTART:20251027T000000Z
DTEND:20251027T010000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/20
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/20/">Privacy (Attacks) in Machine Learning</a>\nby John Abasc
 al (Northeastern University) as part of Tropical mathematics and machine l
 earning\n\n\nAbstract\nDespite achieving state-of-the-art performance acro
 ss numerous domains\, deep learning models often memorize sensitive inform
 ation from their training data. This leaves models and the samples they ar
 e trained on vulnerable to a wide range of privacy attacks.\n\nThis talk p
 rovides an introduction to this landscape of privacy in machine learning\,
  exploring risks like membership inference attacks and provable mitigation
 s such as differential privacy. Throughout the discussion\, we will examin
 e how these risks arise\, fundamental trade-offs between privacy and model
  utility\, and the current state of privacy in the context of machine lear
 ning. We will also highlight some open challenges and future directions in
  the ongoing effort to build models that are performant\, private\, and tr
 ustworthy.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/20/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Timothy Duff (University of Missouri)
DTSTART:20251103T010000Z
DTEND:20251103T020000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/21
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/21/">Compatibility of Fundamental and Essential Matrix Triple
 s</a>\nby Timothy Duff (University of Missouri) as part of Tropical mathem
 atics and machine learning\n\n\nAbstract\nThe fundamental matrix of a pair
  of pinhole cameras lies at the core of systems that reconstruct 3D scenes
  from 2D images. However\, for more than two cameras\, the relations betwe
 en the various fundamental matrices of camera pairs are not yet completely
  understood. In joint work with Viktor Korotynskiy\, Anton Leykin\, and To
 mas Pajdla\, we characterize all polynomial constraints that hold for an a
 rbitrary triple of fundamental matrices. Unlike most constraints in previo
 us works\, our constraints hold independently of the relative scaling of t
 he fundamental matrices\, which is unknown in practice. We also provide a 
 partial characterization for essential matrix triples arising from calibra
 ted cameras.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/21/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chung To Kong (University of Hong Kong)
DTSTART:20251020T060000Z
DTEND:20251020T063000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/22
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/22/">The possibility of making $138\,000 from shredded bankno
 te pieces using computer vision</a>\nby Chung To Kong (University of Hong 
 Kong) as part of Tropical mathematics and machine learning\n\n\nAbstract\n
 Every country must dispose of old banknotes. At the Hong Kong Monetary Aut
 hority visitor center\, visitors can buy a paperweight souvenir full of sh
 redded banknotes. Even though the shredded banknotes are small\, by using 
 computer vision\, it is possible to reconstruct the whole banknote like a 
 jigsaw puzzle. Each paperweight souvenir costs \\$100 HKD\, and it is clai
 med to contain shredded banknotes equivalent to 138 complete \\$1000 HKD b
 anknotes. In theory\, \\$138\,000 HKD can be recovered by using computer v
 ision. This paper discusses the technique of collecting shredded banknote 
 pieces and applying a computer vision program.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/22/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Redi Haderi
DTSTART:20251215T100000Z
DTEND:20251215T110000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/23
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/23/">A practical introduction to operads</a>\nby Redi Haderi 
 as part of Tropical mathematics and machine learning\n\n\nAbstract\nIn thi
 s talk\, we will introduce the notion of an operad as a tool to control a v
 ariety of algebraic structures. We describe the notion of an algebra over a
 n operad\, give some examples of interesting algebras which arise operadica
 lly\, explain how to recognize that an algebraic structure is operadic\, an
 d discuss the relationship between operads and monoidal categories.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/23/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Redi Haderi
DTSTART:20251222T100000Z
DTEND:20251222T110000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/24
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/24/">Infinity operads via simplicial lists</a>\nby Redi Hader
 i as part of Tropical mathematics and machine learning\n\n\nAbstract\nWe i
 ntroduce the notion of a simplicial list as a combinatorial tool to underst
 and operads and their homotopy-coherent variants (i.e. infinity operads). We
  will focus on the analogy between simplicial sets and simplicial lists an
 d present a nerve theorem which recognizes operads as certain simplicial l
 ists. This leads to an interesting quasi-categorical and combinatorial not
 ion of an infinity operad.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/24/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Joe Moeller (Caltech)
DTSTART:20260202T000000Z
DTEND:20260202T010000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/25
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/25/">Schur Functors and Categorified Plethysm</a>\nby Joe Moe
 ller (Caltech) as part of Tropical mathematics and machine learning\n\n\nA
 bstract\nIt is known that the Grothendieck group of the category of Schur 
 functors is the ring of symmetric functions. This ring has a rich structur
 e\, much of which is encapsulated in the fact that it is a "plethory": a m
 onoid in the category of birings with its substitution monoidal structure.
  We show that similarly the category of Schur functors is a "2-plethory"\,
  which descends to give the plethory structure on symmetric functions. Thu
 s\, much of the structure of symmetric functions exists at a higher level 
 in the category of Schur functors.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/25/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Abbas Shoja-Daliklidash (University of Mohaghegh Ardabili)
DTSTART:20260105T060000Z
DTEND:20260105T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/26
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/26/">Postponed</a>\nby Abbas Shoja-Daliklidash (University of
  Mohaghegh Ardabili) as part of Tropical mathematics and machine learning\
 n\n\nAbstract\nSandpiles with finite-range interactions. We investigate th
 e sandpile model with Yukawa-type interactions\, whose effective range is 
 tuned by an external parameter R. Our results reveal that at specific valu
 es of R\, the system exhibits giant avalanches that span the system\, lead
 ing to percolation. The probability of such giant avalanches demonstrates 
 two distinct regimes as a function of R: for sufficiently small R\, it inc
 reases monotonically\, whereas for large R it undergoes threshold dynamics
 \, so that at certain values of R\, the percolation probability exhibits a
 brupt jumps. We refer to these as \\textit{pseudo-percolation transitions}
 \, based on which we propose a hierarchical percolation model at the mean-fie
 ld level: each percolation transition corresponds to percolation within a 
 disc of radius R. We further examine both local and global geometrical obs
 ervables. The local quantities include avalanche size\, mass\, and duratio
 n and sub-avalanche mass\, while for the global characterization we analyz
 e the loop length and gyration radius of the external perimeter\, as well 
 as the mass of sub-avalanches. Remarkably\, all these observables exhibit 
 power-law scaling for all values of R\, with exponents that vary systemati
 cally with R. Notably\, in the vicinity of the pseudo-percolation transiti
 on points\, the exponents approach characteristic values\, signaling a dis
 tinct critical behavior.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/26/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jesse Wolfson (University of California\, Irvine)
DTSTART:20251208T044500Z
DTEND:20251208T054500Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/27
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/27/">Fractals and Africanist Music</a>\nby Jesse Wolfson (Uni
 versity of California\, Irvine) as part of Tropical mathematics and machin
 e learning\n\n\nAbstract\nSymmetry has been a structuring device and motif
  for music in many cultures\, and probably everyone who has taken music le
 ssons as a child can call to mind the time translation symmetry encoded by
  the regular beats of a metronome.  Nonetheless\, the idea of fractal symm
 etry in music seems much less common.  In this talk\, I'll describe recent
  research investigating the presence of fractal structures in Africanist m
 usic\, in response to a hypothesis of choreographer Reggie Wilson. I'll de
 scribe both the project and "scientific" findings\, as well as the broader
  context of my unexpected and ongoing engagement with Reggie Wilson and hi
 s Fist and Heel Performance Group at the interplay of math and dance.  Thi
 s is joint work with Claudio Gómez-Gonzáles\, Sidhanth Raman and Siddhar
 th Viswanath.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/27/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Eva Yi Xie (Princeton Neuroscience Institute\; Allen Institute)
DTSTART:20260112T000000Z
DTEND:20260112T010000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/28
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/28/">A Multi-Region Brain Model to Elucidate the Role of Hipp
 ocampus in Spatially Embedded Decision-Making</a>\nby Eva Yi Xie (Princeto
 n Neuroscience Institute\; Allen Institute) as part of Tropical mathematic
 s and machine learning\n\n\nAbstract\nBrains excel at robust decision-maki
 ng and data-efficient learning. Understanding the architectures and dynami
 cs underlying these capabilities can inform inductive biases for deep lear
 ning. We present a multi-region brain model that explores the normative ro
 le of structured memory circuits in a spatially embedded binary decision-m
 aking task from neuroscience. We counterfactually compare the learning per
 formance and neural representations of reinforcement learning (RL) agents 
 with brain models of different interaction architectures between grid and 
 place cells in the entorhinal cortex and hippocampus\, coupled with an act
 ion-selection cortical recurrent neural network. We demonstrate that a spe
 cific architecture--where grid cells receive and jointly encode self-movem
 ent velocity signals and decision evidence increments--optimizes learning 
 efficiency while best reproducing experimental observations relative to al
 ternative architectures. Our findings thus suggest brain-inspired structure
 d architectures for efficient RL. Importantly\, the models make novel\, te
 stable predictions about organization and information flow within the ento
 rhinal-hippocampal-neocortical circuit: we predict that grid cells must co
 njunctively encode position and evidence for effective spatial decision-ma
 king\, directly motivating new neurophysiological experiments.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/28/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Assaf Shocher (Technion – Israel Institute of Technology)
DTSTART:20260316T080000Z
DTEND:20260316T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/29
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/29/">Teaching Neural Networks Linear Algebra</a>\nby Assaf Sh
 ocher (Technion – Israel Institute of Technology) as part of Tropical ma
 thematics and machine learning\n\n\nAbstract\nNeural networks are powerful
  but notoriously difficult to analyze\, compose\, or control. Linear algeb
 ra\, by contrast\, is the mathematical ideal of tractability. In this talk
  I will present several works from my lab that aim to import the principle
 s of linear algebra into deep learning. I begin with projection as a gener
 ative principle: Idempotent Generative Networks (IGN) train a neural netwo
 rk to satisfy f(f(z)) = f(z)\, so that the data manifold emerges as the se
 t of fixed points of the operator. Generation then becomes projection: a s
 ingle forward pass maps noise to the manifold\, while repeated application
  enables principled refinement. We then ask a more provocative question: "
 Who said neural networks aren't linear?". Neural networks are famously non
 linear\, but nonlinear with respect to which vector spaces? Using the alge
 braic notion of transport of structure\, the Linearizer framework identifi
 es non-standard vector spaces in which a neural network acts as a linear o
 perator. In these spaces\, tools such as SVD\, pseudo-inverses\, and compo
 sition become directly applicable to neural networks\, with consequences r
 anging from algebraic analysis of models to collapsing diffusion sampling 
 into a single step. Finally\, I will present recent work that generalizes 
 the Moore-Penrose pseudo-inverse to nonlinear mappings. Surjective Pseudo-
 invertible Neural Networks (SPNN) satisfy the classical Penrose identities
  by construction\, enabling nonlinear back-projection and extending diffus
 ion-based zero-shot inverse problem solving from linear degradations to ar
 bitrary nonlinear ones. These are steps we are taking towards systems that
  we can study and design with the same rigor and elegance that linear alge
 bra brings to the physical sciences.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/29/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Shailesh Lal (Beijing Institute of Mathematical Sciences and Appli
 cations)
DTSTART:20260309T060000Z
DTEND:20260309T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/30
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/30/">Postponed</a>\nby Shailesh Lal (Beijing Institute of Mat
 hematical Sciences and Applications) as part of Tropical mathematics and m
 achine learning\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/30/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Changqing Fu (CEREMADE\, Paris Dauphine University - PSL and Paris
  AI Institute (PRAIRIE))
DTSTART:20260223T080000Z
DTEND:20260223T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/31
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/31/">Transformers as Effective Fields: From Quantum Physics t
 o AI</a>\nby Changqing Fu (CEREMADE\, Paris Dauphine University - PSL and 
 Paris AI Institute (PRAIRIE)) as part of Tropical mathematics and machine 
 learning\n\n\nAbstract\nApproximation and algebraic theories are not yet s
 ufficient to prove the optimality of Transformers: it is known that even s
 hallow infinite-width neural networks are approximately universal\, and Re
 LU networks are within the rational function class under tropical (max-plu
 s) algebra. However\, these facts still cannot explain the effectiveness o
 f Transformers\, since a constructive proof of their form is missing.\n\nI
 n this talk\, we propose a novel theory to fully classify all possible neu
 ral networks and argue that linear/softmax Transformers are optimal under 
 several minimal axioms. To model the reasoning process\, we treat the neur
 al ODE as the geodesics of some canonical field\, where time represents la
 yer depth. To model the interaction among concepts\, we pass from the vect
 or flow to the matrix flow\, denoted as $\\bm X$\, whose rows are tokens a
 nd columns are neurons. The Transformer is then a natural consequence:\n\n
 Linear Attention: the first interaction term under left unitary invariance
 .\n\nSoftmax Attention: the entropic regularization of the field under lef
 t permutation invariance.\n\nTwo-Layer ReLU Network: the projected gradien
 t flow\, where the feasible set is conic and permutation invariant. \n\nGa
 ted Activation Network: the minimal nonlinear non-interactive field under 
 left permutation invariance.\n\nSparse Attention*: the token-pairwise non-
 commutative correction that leads to a mask on the attention matrix.\n\nIn
  conclusion\, we provide a theoretical proof for why the “bitter lesson
 ” holds\, a theoretical guarantee for the technical path of Transformers
 \, and a paradigm to study the interpretability of intelligence.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/31/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tom Jacobs (CISPA Helmholtz Center for Information Security)
DTSTART:20260209T080000Z
DTEND:20260209T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/32
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/32/">Weight Decay Controls Implicit Regularization:  Insights
  on Generalization and Sparsity</a>\nby Tom Jacobs (CISPA Helmholtz Center
  for Information Security) as part of Tropical mathematics and machine lea
 rning\n\n\nAbstract\nClassical statistics teaches us that overparameteriza
 tion causes overfitting\, which prevents good generalization. However\, hi
 ghly overparameterized neural network architectures generalize surprisingl
 y well. This is because the training of these models tends towards low ran
 k or sparse solutions\, without requiring explicit constraints. This prefe
 rence is known as implicit regularization\, and it can be found in a varie
 ty of contexts\, including attention layers\, LoRA\, matrix sensing\, and 
 diagonal linear networks. As a result\, implicit regularization helps expl
 ain how overfitting is avoided and generalization is improved in neural ne
 tworks.\n\nIn this work I will show how weight decay controls implicit reg
 ularization beyond its explicit role of constraining the model capacity. F
 or instance\, it moves the implicit regularizer from $L_2$ to $L_1$\, whic
 h leads to more sparsity in the model. This demonstrates how weight decay 
 not only serves as a model constraint\, but also has an implicit effect. B
 y turning off weight decay during training\, only the implicit effect rema
 ins\, resulting in better generalization overall. Besides better generaliz
 ation\, I use these insights to induce sparsity in deep neural networks. S
 parsity aims to reduce model size and inference time by removing as many w
 eights as possible. This results in a new method: PILoT (Parametric Implic
 it Lottery Ticket\, from our previous work)\, a sparsification method based o
 n overparameterization and weight decay that uses the transition of the impl
 icit regularization from $L_2$ to $L_1$ to gradually sparsify\, achieving 
 high sparsity with a smaller performance drop.\n\nTheoretically\, we use t
 he connection between reparameterizations (a specific form of overparamete
 rization) and mirror flows (Riemannian gradient flows) and extend it to ti
 me-varying mirror flows. The mirror flow controls the implicit bias\, and w
 eight decay in turn controls the time-varying mirror flow.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/32/
END:VEVENT
BEGIN:VEVENT
SUMMARY:José Simental Rodríguez (UNAM)
DTSTART:20260323T010000Z
DTEND:20260323T020000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/33
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/33/">Big hypercubes in the Bruhat order</a>\nby José Simenta
 l Rodríguez (UNAM) as part of Tropical mathematics and machine learning\n
 \n\nAbstract\nThe Bruhat order is a basic structure on the symmetric group
  (or more generally\, any Coxeter group) that is surprisingly still very p
 oorly understood. Using permutations suggested by AlphaEvolve (an evolutio
 nary coding agent of Google DeepMind) we find an interval in the Bruhat or
 der whose elements can be explicitly described in terms of binary expansio
 ns and that\, moreover\, form a hypercube of dimension much larger than on
 e could initially expect can occur within the symmetric group. In fact\, t
 he dimension of this hypercube matches\, up to a constant\, the dimension 
 of the largest possible hypercube that can appear as an interval in the sy
 mmetric group. Time permitting\, I will elaborate on the consequences that
  the existence of these big hypercubes has on the recently discovered clus
 ter structures on open Richardson varieties. This is joint work with Jorda
 n Ellenberg\, Nicolás Libedinsky\, David Plaza\, and Geordie Williamson.\
 n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/33/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Yu Tian (Max Planck Institute for Physics of Complex Systems (MPI 
 PKS))
DTSTART:20260511T080000Z
DTEND:20260511T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/34
DESCRIPTION:by Yu Tian (Max Planck Institute for Physics of Complex System
 s (MPI PKS)) as part of Tropical mathematics and machine learning\n\nAbstr
 act: TBA\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/34/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Austin Rodriguez (Michigan State University)
DTSTART:20260330T010000Z
DTEND:20260330T020000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/35
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/35/">Projected Hessian Learning: Scalable Curvature-Aware Tra
 ining for Reactive Machine Learning Interatomic Potentials</a>\nby Austin 
 Rodriguez (Michigan State University) as part of Tropical mathematics and 
 machine learning\n\n\nAbstract\nMachine learning interatomic potentials (M
 LIPs) are transforming reactive chemistry\, materials discovery\, and mole
 cular design by enabling near-quantum-chemical predictions of potential en
 ergies and forces at far lower cost than direct electronic structure calcu
 lations. While training to energies and forces improves the description of
  potential energy surfaces (PESs)\, these quantities alone do not fully ca
 pture local curvature. The Hessian matrix\, which contains second-derivati
 ve information\, provides a much richer description of PES topology and ca
 n improve extrapolation to nonequilibrium geometries\, reaction pathway mo
 deling\, transition-state characterization\, vibrational analysis\, molecu
 lar dynamics\, and nudged elastic band calculations. However\, conventiona
 l Hessian training is often impractical because explicit construction and 
 storage of Hessian matrices scale quadratically in memory and computationa
 l cost.\nTo address this limitation\, we introduce Projected Hessian Learn
 ing (PHL)\, a scalable second-order training framework that incorporates c
 urvature information through Hessian-vector products (HVPs) rather than fu
 ll Hessian matrices. PHL uses stochastic probe directions and an unbiased 
 trace-based loss to inject second-order information with favorable scaling
 \, avoiding the prohibitive cost of explicit Hessian supervision. We evalu
 ate this approach on a chemically diverse reactive dataset containing reac
 tants\, products\, transition states\, intrinsic reaction coordinate geome
 tries\, and normal-mode sampled structures computed at the ωB97X-D/6-31G(
 d) level of theory. Models trained only on equilibrium geometries and firs
 t-order saddle points are assessed for their ability to extrapolate to non
 equilibrium configurations. Compared with conventional energy-force traini
 ng\, curvature-informed models substantially improve predictions of energi
 es\, forces\, and Hessian-related properties for unseen geometries\, while
  also reducing the amount of training data required to achieve strong perf
 ormance. Moreover\, randomized HVP-based PHL schemes recover nearly all th
 e benefits of full Hessian training while achieving substantial speedups a
 nd avoiding quadratic memory growth. These results show that PHL provides 
 a practical route to scalable second-order MLIP training\, retaining the a
 ccuracy and transferability benefits of Hessian information while extendin
 g curvature-aware learning to larger and more complex molecular systems.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/35/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Hana Dal Poz Kouřimská (University of Potsdam)
DTSTART:20260427T080000Z
DTEND:20260427T090000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/36
DESCRIPTION:by Hana Dal Poz Kouřimská (University of Potsdam) as part of
  Tropical mathematics and machine learning\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/36/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sherkhon Azimov (Pusan National University)
DTSTART:20260406T060000Z
DTEND:20260406T070000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/37
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/Tropi
 calmathandML/37/">Adaptive Nonlinear Vector Autoregression: Robust Forecas
 ting for Noisy Chaotic Time Series</a>\nby Sherkhon Azimov (Pusan National
  University) as part of Tropical mathematics and machine learning\n\n\nAbs
 tract\nNonlinear vector autoregression (NVAR) and reservoir computing (RC)
  have shown promise in forecasting chaotic dynamical systems\, such as the
 Lorenz-63 model and El Niño-Southern Oscillation. However\, their relianc
 e on fixed nonlinear transformations - polynomial expansions in NVAR or ra
 ndom feature maps in RC - limits their adaptability to high noise or compl
 ex real-world data. Furthermore\, these methods also exhibit poor scalabil
 ity in high-dimensional settings due to costly matrix inversion during opt
 imization. We propose a data-adaptive NVAR model that combines delay-embed
 ded linear inputs with features generated by a shallow\, trainable multila
 yer perceptron (MLP). Unlike standard NVAR and RC models\, the MLP and lin
 ear readout are jointly trained using gradient-based optimization\, enabli
 ng the model to learn data-driven nonlinearities\, while preserving a simp
 le readout structure and improving scalability. Initial experiments across
  multiple chaotic systems\, tested under noise-free and synthetically nois
 y conditions\, showed that the adaptive model outperformed the standard NV
 AR\, a leaky echo state network (ESN) - the most common RC model - and a h
 ybrid ESN in predictive accuracy\, thereby showing robust forecasting unde
 r noisy conditions.\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/37/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Shailesh Lal (Beijing Institute of Mathematical Sciences and Appli
 cations)
DTSTART:20260602T070000Z
DTEND:20260602T080000Z
DTSTAMP:20260404T110823Z
UID:TropicalmathandML/38
DESCRIPTION:by Shailesh Lal (Beijing Institute of Mathematical Sciences an
 d Applications) as part of Tropical mathematics and machine learning\n\nAb
 stract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/TropicalmathandML/38/
END:VEVENT
END:VCALENDAR
