BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Isay Katsman (Yale)
DTSTART:20221024T121500Z
DTEND:20221024T131500Z
DTSTAMP:20260404T110913Z
UID:MaML/1
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 1/">Riemannian Geometry in Machine Learning</a>\nby Isay Katsman (Yale) as
  part of Mathematics and Machine Learning\n\n\nAbstract\nAlthough machine 
 learning researchers have introduced a plethora of useful constructions fo
 r learning over Euclidean space\, numerous types of data in various applic
 ations benefit from\, if not necessitate\, a non-Euclidean treatment. In t
 his talk I cover the need for Riemannian geometric constructs to (1) build
  more principled generalizations of common Euclidean operations used in ge
 ometric machine learning models as well as to (2) enable general manifold 
 density learning in contexts that require it. Said contexts include theore
 tical physics\, robotics\, and computational biology. I will cover one of 
 my papers that fits into (1) above\, namely the ICML 2020 paper “Differe
 ntiating through the Fréchet Mean.” I will also cover two of my papers 
 that fit into (2) above\, namely the NeurIPS 2020 paper “Neural Manifold
  ODEs” and the NeurIPS 2021 paper “Equivariant Manifold Flows.” Fina
 lly\, I will briefly discuss directions of relevant ongoing work.\n
LOCATION:https://stable.researchseminars.org/talk/MaML/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sebastian Fischer (LMU Munich)
DTSTART:20221107T131500Z
DTEND:20221107T141500Z
DTSTAMP:20260404T110913Z
UID:MaML/2
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 2/">Benchmarking Machine Learning Methods Using OpenML and mlr3</a>\nby Se
 bastian Fischer (LMU Munich) as part of Mathematics and Machine Learning\n
 \n\nAbstract\nBenchmark studies are an integral part of machine learning r
 esearch. The two main components are the datasets that are used to compare
  the methods and the software that supports the researcher in carrying out
  the experiment. OpenML is a platform for sharing datasets and machine lea
 rning results and is a great tool to obtain datasets.\nmlr3 is an ecosyste
 m of machine learning packages in the R language\, which among other thing
 s allows for benchmarking algorithms with ease. This presentation will sho
 w how OpenML and mlr3 can be used together to make benchmarking machine le
 arning methods as easy as possible\, by using the interface R package mlr3
 oml.\n
LOCATION:https://stable.researchseminars.org/talk/MaML/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Edward de Brouwer (KU Leuven)
DTSTART:20221219T131500Z
DTEND:20221219T141500Z
DTSTAMP:20260404T110913Z
UID:MaML/3
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 3/">Topological Graph Neural Networks</a>\nby Edward de Brouwer (KU Leuven
 ) as part of Mathematics and Machine Learning\n\n\nAbstract\nGraph neural 
 networks (GNNs) are a powerful architecture for tackling graph learning ta
 sks\, yet have been shown to be oblivious to prominent substructures such as
  cycles. In this talk\, we introduce TOGL\, a novel layer that incorporate
 s global topological information of a graph using persistent homology. TOG
 L can be easily integrated into any type of GNN and is strictly more expre
 ssive (in terms of the Weisfeiler–Lehman graph isomorphism test) than messa
 ge-passing GNNs. Augmenting GNNs with TOGL leads to improved predictive pe
 rformance for graph and node classification tasks\, both on synthetic data
  sets\, which can be classified by humans using their topology but not by 
 ordinary GNNs\, and on real-world data.\n
LOCATION:https://stable.researchseminars.org/talk/MaML/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jim Halverson (Northeastern)
DTSTART:20221121T131500Z
DTEND:20221121T141500Z
DTSTAMP:20260404T110913Z
UID:MaML/4
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 4/">Machine Learning for Pure Math</a>\nby Jim Halverson (Northeastern) as
  part of Mathematics and Machine Learning\n\n\nAbstract\nProgress in machi
 ne learning (ML) is poised to revolutionize a variety of STEM fields. But 
 how could these techniques — which are often stochastic\, error-prone\, 
 and black-box — lead to progress in pure mathematics\, which values rigor
  and understanding? I will exemplify how ML can be used to generate conjec
 tures in a Calabi-Yau singularity problem that is relevant for physics\, a
 nd will demonstrate how reinforcement learning can yield truth certificate
 s that rigorously demonstrate properties of knots. The second half of the 
 talk will utilize ML theory instead of applied ML. Specifically\, I will d
 evelop a neural tangent kernel theory appropriate for flows in the space o
 f metrics (realized as neural networks)\, and will realize Perelman’s fo
 rmulation of Ricci flow as a specialization of the general theory.\n
LOCATION:https://stable.researchseminars.org/talk/MaML/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alex Davies (DeepMind)
DTSTART:20221212T120000Z
DTEND:20221212T130000Z
DTSTAMP:20260404T110913Z
UID:MaML/5
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 5/">Machine Learning with Mathematicians</a>\nby Alex Davies (DeepMind) a
 s part of Mathematics and Machine Learning\n\n\nAbstract\nCan machine lear
 ning be a useful tool for research mathematicians? There are many examples
  of mathematicians pioneering new technologies to aid our understanding of
  the mathematical world: using very early computers to help formulate the 
 Birch and Swinnerton-Dyer conjecture and using computer aid to prove the f
 our colour theorem are among the most notable. Up until now\, there hasn
 ’t been significant use of machine learning in the field and it hasn’t
  been clear where it might be useful for the questions that mathematicians
 care about. In this talk\, we will discuss how we worked together with to
 p mathematicians to use machine learning to achieve two new results – pro
 ving a new connection between the hyperbolic and geometric structure of k
 nots\, and conjecturing a resolution to a 50-year-old problem in represen
 tation theory\, the combinatorial invariance conjecture. Through these ex
 amples\,
  we demonstrate a way that machine learning can be used by mathematicians 
 to help guide the development of surprising and beautiful new conjectures.
 \n
LOCATION:https://stable.researchseminars.org/talk/MaML/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Noemi Montobbio (IIT)
DTSTART:20230109T131500Z
DTEND:20230109T141500Z
DTSTAMP:20260404T110913Z
UID:MaML/6
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/MaML/
 6/">Emergence of Lie Symmetries in Functional Architectures Learned by CNN
 s</a>\nby Noemi Montobbio (IIT) as part of Mathematics and Machine Learnin
 g\n\n\nAbstract\nConvolutional Neural Networks (CNNs) are a powerful tool 
 providing outstanding performance on image classification tasks\, based o
 n an architecture designed in analogy with information processing in biolo
 gical visual systems. The functional architectures of the early visual pat
 hways have often been described in terms of geometric invariances\, and se
 veral studies have leveraged this framework to investigate the analogies b
 etween CNN models and biological mechanisms. Remarkably\, upon learning on
  natural images\, the translation-invariant filters of the first layer of 
 a CNN have been shown to develop as approximate Gabor functions\, resembli
 ng the orientation-selective receptive profiles found in the primary visua
 l cortex (V1). With a similar approach\, we modified a standard CNN archit
 ecture to insert computational blocks compatible with specific biological 
 processing stages\, and studied the spontaneous development of approximate
  geometric invariances after training the network on natural images. In pa
 rticular\, inserting a pre-filtering step mimicking the Lateral Geniculate
  Nucleus (LGN) led to the emergence of a radially symmetric profile well a
 pproximated by a Laplacian of Gaussian\, which is a well-known model of re
 ceptive profiles of LGN cells. Moreover\, we introduced a lateral connecti
 vity kernel acting on the feature space of the first network layer. We the
 n studied the learned connectivity as a function of relative tuning of fir
 st-layer filters\, thus re-mapping it into the roto-translation space. Thi
 s analysis revealed orientation-specific patterns\, which we compared qual
 itatively and quantitatively with established group-based models of V1 hor
 izontal connectivity.\n
LOCATION:https://stable.researchseminars.org/talk/MaML/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Anna Seigal (Harvard)
DTSTART:20230123T141500Z
DTEND:20230123T151500Z
DTSTAMP:20260404T110913Z
UID:MaML/7
DESCRIPTION:by Anna Seigal (Harvard) as part of Mathematics and Machine Le
 arning\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/MaML/7/
END:VEVENT
END:VCALENDAR
