BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Alexander Shen (CNRS - LIRMM)
DTSTART:20221207T160000Z
DTEND:20221207T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/1
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 1/">Information inequalities: combinatorial interpretation</a>\nby Alexand
 er Shen (CNRS - LIRMM) as part of Seminar on Algorithmic Aspects of Inform
 ation Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Frederique Oggier (Nanyang Technological University)
DTSTART:20230104T160000Z
DTEND:20230104T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/2
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 2/">An overview of Ingleton's inequality from a group theory point of view
 .</a>\nby Frederique Oggier (Nanyang Technological University) as part of 
 Seminar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nWe wil
 l review several works that use group theory to approach Ingleton's inequa
 lity and entropic vectors.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Cheuk Ting LI (Chinese University of Hong Kong)
DTSTART:20230118T160000Z
DTEND:20230118T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/3
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 3/">Existential information inequalities\, non-Shannon inequalities\, and 
 automated theorem proving.</a>\nby Cheuk Ting LI (Chinese University of Ho
 ng Kong) as part of Seminar on Algorithmic Aspects of Information Theory\n
 \n\nAbstract\nExistential information inequalities are a generalization of
  linear information inequalities in which random variables can be not only
  universally quantified\, but also existentially quantified. We study the
  stru
 cture of existential information inequalities\, and describe algorithms fo
 r automated verification of existential information inequalities\, which a
 re also useful for proving (non-existential) non-Shannon inequalities. We 
 also describe how a wide range of results in network information theory (e
 .g. 32 out of 56 theorems in Chapters 1-14 of Network Information Theory b
 y El Gamal and Kim) can be proved automatically using the proposed algorit
 hms. The algorithms are implemented in the PSITIP framework ( github.com/c
 heuktingli/psitip ).\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alexander Kozachinskiy (Catholic University of Chile)
DTSTART:20230201T160000Z
DTEND:20230201T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/4
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 4/">On the recent progress on Frankl's conjecture.</a>\nby Alexander Kozac
 hinskiy (Catholic University of Chile) as part of Seminar on Algorithmic A
 spects of Information Theory\n\n\nAbstract\nFrankl's conjecture states tha
 t for any non-empty family of non-empty finite sets which is closed under 
 finite union there exists an element belonging to at least 1/2 of the sets
  from the family. I will present a recent result of Gilmer showing that this
  conjecture holds with 1/2 replaced by some positive constant.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andrei Romashchenko (CNRS - LIRMM)
DTSTART:20230215T160000Z
DTEND:20230215T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/5
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 5/">Quasi-uniform distributions and the method of random covers.</a>\nby A
 ndrei Romashchenko (CNRS - LIRMM) as part of Seminar on Algorithmic Aspect
 s of Information Theory\n\n\nAbstract\nWe will talk about variations of
  the classical method of typical sequences and its applications in
  studying information inequalities. In particular\, we will discuss the
  Ahlswede-Körner lemma in a form that can be used to prove non-Shannon
  type inequalities for entropy.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Milan Studeny (Institute of Information Theory and Automation\, Pr
 ague)
DTSTART:20230301T160000Z
DTEND:20230301T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/6
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 6/">On conditional Ingleton inequalities</a>\nby Milan Studeny (Institute 
 of Information Theory and Automation\, Prague) as part of Seminar on Algor
 ithmic Aspects of Information Theory\n\n\nAbstract\nLinear information ine
 qualities valid for entropy functions induced by discrete random variables
  play an important role in the task to characterize discrete conditional i
 ndependence structures. Specifically\, the so-called conditional Ingleton 
 inequalities in the case of 4 random variables are in the center of intere
 st: these are valid under conditional independence assumptions on the indu
 cing random variables. Four inequalities of this form were revealed
  earlier: by Yeung and Zhang (1997)\, by Matúš (1999)\, and by Kaced and
  Romashchenko (2013). In our recent paper (2021) a fifth inequality of this
  type was found. These five information inequalities can be used to
  characterize all conditional independence structures induced by four
  discrete random variables. One of the open problems in that 2021 paper was
  whether the list of conditional Ingleton inequalities over 4 random
  variables is complete\; a recent finding of Boege (2022) answers that
  question.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Vlad Demartsev (Max Planck Institute of Animal Behavior)
DTSTART:20230315T160000Z
DTEND:20230315T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/7
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 7/">Economy of vocal repertoires: Zipf's laws in animal communication</a>\
 nby Vlad Demartsev (Max Planck Institute of Animal Behavior) as part of Se
 minar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nZipf's L
 aw of Brevity (a negative correlation between word length and frequency of
  use) has been found across multiple lexicons and text corpora and is often
  claimed to be one of the unifying features of human language. This
  intriguing linguistic regularity can be explained by the Principle of
  Least Effort: a compromise between the need to transfer information in a
  detailed and comprehensive manner and the pressure to minimise the effort
  (cost) associated with producing signals. If Zipf's principles are indeed
  fundamental for communication\, we should be able to find them in animal
  systems. Animal communication systems are likely constrained by the same
  fundamental trade-off between minimizing signalling costs and maintaining
  informational integrity. However\, the pressure towards optimization of
  signalling is not the only force shaping animal vocal repertoires. Factors
  like sexual selection\, social organization\, and habitat features can
  generate evolutionary forces which might push vocal repertoires further
  away from Zipfian optimization.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Oriol Farràs (Universitat Rovira i Virgili)
DTSTART:20230322T160000Z
DTEND:20230322T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/8
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 8/">Introduction to Secret Sharing Schemes: Constructions</a>\nby Oriol Fa
 rràs (Universitat Rovira i Virgili) as part of Seminar on Algorithmic Asp
 ects of Information Theory\n\n\nAbstract\nA secret sharing scheme is a met
 hod by which a dealer distributes shares of a secret to parties such that
  only authorized subsets of parties can reconstruct the secret. The family
  of these aut
 horized subsets is called the access structure of the scheme. This introdu
 ctory talk is focused on the problem of finding efficient secret sharing s
 chemes for general access structures. We will see some general constructio
 ns as well as their limitations. We will review connections between the pr
 oblem of finding efficient schemes and other problems\, such as finding sm
 all monotone formulas and monotone span programs for monotone Boolean func
 tions\, and problems of matroid theory and entropy optimization.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/8/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Carles Padro (Universitat Politècnica de Catalunya)
DTSTART:20230329T150000Z
DTEND:20230329T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/9
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 9/">Lower Bounds in Secret Sharing from Information Theory.</a>\nby Carles
  Padro (Universitat Politècnica de Catalunya) as part of Seminar on Algor
 ithmic Aspects of Information Theory\n\n\nAbstract\nInformation theory
  has been used for decades in the search for lower bounds in secret
  sharing. Th
 e main achievements and limitations of that technique are discussed in thi
 s survey talk. Among the former\, Csirmaz's general lower bound\, which is
  still the best of the known ones\, and the findings of exact optimal info
 rmation ratios for several families of access structures. For the latter\,
  some intuition on the limits of the method and a comparison with the exis
 ting lower bounds for linear secret sharing schemes.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/9/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tianren Liu (Peking University)
DTSTART:20230412T150000Z
DTEND:20230412T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/10
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 10/">Barriers for Secret Sharing.</a>\nby Tianren Liu (Peking University) 
 as part of Seminar on Algorithmic Aspects of Information Theory\n\nAbstrac
 t: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/10/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Hung Q. Ngo (relationalAI)
DTSTART:20230426T150000Z
DTEND:20230426T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/11
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 11/">An Information Theoretic Approach to Estimating Query Size Bounds.</a
 >\nby Hung Q. Ngo (relationalAI) as part of Seminar on Algorithmic Aspects
  of Information Theory\n\n\nAbstract\nCardinality estimation is one of the
  most important problems in database management. One aspect of cardinality
  estimation is to derive a good upper bound on the output size of a query\
 , given a statistical profile of the inputs. In recent years\, a promising
  information-theoretic approach was devised to address this problem\, lead
 ing to robust cardinality estimators which are used in practice. The infor
 mation theoretic approach led to many interesting open questions surroundi
 ng optimizing a linear function on the almost-entropic or polymatroidal co
 nes. This talk introduces the problem\, the approach\, summarizes some kno
 wn results\, and lists open questions.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/11/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Raymond W. Yeung (The Chinese University of Hong Kong)
DTSTART:20230510T140000Z
DTEND:20230510T151500Z
DTSTAMP:20260404T111133Z
UID:AAIT/12
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 12/">Entropy Inequalities: My Personal Journey.</a>\nby Raymond W. Yeung (
 The Chinese University of Hong Kong) as part of Seminar on Algorithmic Asp
 ects of Information Theory\n\n\nAbstract\nShannon's information measures\,
  namely entropy\, mutual information\, and their conditional versions\,
  can each be expressed as a linear combination of unconditional joint
  entropies. These measures are central in information theory\, and c
 onstraints on these measures\, in particular in the form of inequalities\,
  are of fundamental interest. It is well-known that all Shannon's informat
 ion measures are nonnegative\, forming a set of inequalities on entropy. H
 owever\, whether these are all the constraints on entropy was unknown\, an
 d the problem had been rather evasive until the introduction of the entrop
 y function region in the late 1990s.\n\nThis talk consists of two parts. Th
 e first part is about how I became interested in this subject in the late 
 1980s. It began with my PhD thesis in which I needed to use inequalities o
 n Shannon's information measures (viz. information inequalities\, or equiv
 alently entropy inequalities) to prove converse coding theorems. During th
 at time I was also intrigued by the underlying set-theoretic structure of 
 Shannon's information measures\, namely their representation by diagrams t
 hat resemble the Venn diagram. In 1991 I introduced the I-Measure to forma
 lly establish the one-to-one correspondence between set theory and Shannon
 's information measures. In the mid-1990s\, I introduced the entropy funct
 ion region Γ* that provides a geometrical interpretation of entropy inequ
 alities. Shortly after\, Zhen Zhang and I discovered so-called non-Shannon
 -type inequalities which are beyond the nonnegativity of Shannon's informa
 tion measures. Since then this subject has been under active research.\n\nA
  byproduct of the entropy function region and the associated geometrical i
 nterpretation is the machine-proving of entropy inequalities. In the secon
 d part of this talk\, I will discuss the research along this line\, includ
 ing some recent results.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/12/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Laszlo Csirmaz (Central European University\, Budapest)
DTSTART:20221012T150000Z
DTEND:20221012T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/13
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 13/">Around entropy inequalities.</a>\nby Laszlo Csirmaz (Central European
  University\, Budapest) as part of Seminar on Algorithmic Aspects of Infor
 mation Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/13/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andrei Romashchenko (CNRS LIRMM\, Montpellier)
DTSTART:20221026T150000Z
DTEND:20221026T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/14
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 14/">Basic proof techniques for entropic inequalities.</a>\nby Andrei Roma
 shchenko (CNRS LIRMM\, Montpellier) as part of Seminar on Algorithmic Aspe
 cts of Information Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/14/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andrei Romashchenko (CNRS LIRMM\, Montpellier)
DTSTART:20221109T160000Z
DTEND:20221109T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/15
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 15/">The Zhang-Yeung inequality: an attempt to explain.</a>\nby Andrei Rom
 ashchenko (CNRS LIRMM\, Montpellier) as part of Seminar on Algorithmic Asp
 ects of Information Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/15/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Laszlo Csirmaz (Central European University\, Budapest)
DTSTART:20221123T160000Z
DTEND:20221123T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/16
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 16/">The entropy region is not polyhedral.</a>\nby Laszlo Csirmaz (Central
  European University\, Budapest) as part of Seminar on Algorithmic Aspects
  of Information Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/16/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Mahmoud Abo Khamis (relationalAI)
DTSTART:20230524T150000Z
DTEND:20230524T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/17
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 17/">The Polymatroid Bound: Extensions and Applications in Database Query 
 Evaluation.</a>\nby Mahmoud Abo Khamis (relationalAI) as part of Seminar o
 n Algorithmic Aspects of Information Theory\n\n\nAbstract\nInformation the
 ory has been used to compute upper bounds on the output sizes of database 
 queries as well as to devise matching algorithms that can answer these que
 ries within these bounds. The polymatroid bound generalizes many of these 
 bounds and has a matching query evaluation algorithm\, called PANDA [Abo K
 hamis et al\, PODS'17]. In this talk\, we present extensions of the polyma
 troid bound that utilize the query structure to derive new constraints on 
 entropies. We state open problems regarding the relationship between these
  extensions and the original bound. On the algorithmic side\, we present t
 he PANDA algorithm in detail and discuss its shortcomings and related open
  problems.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/17/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Mahmoud Abo Khamis (relationalAI)
DTSTART:20230607T150000Z
DTEND:20230607T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/18
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 18/">The Polymatroid Bound: Extensions and Applications in Database Query 
 Evaluation (part 2).</a>\nby Mahmoud Abo Khamis (relationalAI) as part of 
 Seminar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nThis i
 s the second part of the talk\, the first part of which took place on May 
 24. I continue our discussion of information-theoretic upper bounds on the
  output sizes of database queries. I will recap the polymatroid bound and 
 present a corresponding query evaluation algorithm\, called PANDA\, whose 
 runtime matches the polymatroid bound [Abo Khamis et al\, PODS’17]. I wi
 ll also discuss this algorithm's shortcomings and related open problems. A
 cquaintance with the first part of the talk is desirable\, though not
  formally required.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/18/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alexander V. Smal (Technion\, Israel\; PDMI\, Russia)
DTSTART:20230621T150000Z
DTEND:20230621T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/19
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 19/">Information theoretic approach to Boolean formula complexity.</a>\nby
  Alexander V. Smal (Technion\, Israel\; PDMI\, Russia) as part of Seminar 
 on Algorithmic Aspects of Information Theory\n\n\nAbstract\nWe will discus
 s the information-theoretic approach to proving lower bounds on the formul
 a complexity of Boolean functions. In the late 80s\, Karchmer and Wigderso
 n discovered that Boolean formulas could be characterized using the langua
 ge of communication complexity. This observation allowed us to apply commu
 nication complexity methods to prove lower and upper bounds on the Boolean
  formula complexity. In particular\, we can use information-theoretic meth
 ods from communication complexity to prove formula complexity lower bounds
 . This talk is a brief introduction to these techniques\, as well as to so
 me applications.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/19/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Laszlo Csirmaz (Alfréd Rényi Institute of Mathematics\, Budapest
  and UTIA\, Prague)
DTSTART:20230913T150000Z
DTEND:20230913T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/20
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 20/">The power of non-Shannon inequalities: a short proof of the Gács-Kö
 rner theorem.</a>\nby Laszlo Csirmaz (Alfréd Rényi Institute of Mathemat
 ics\, Budapest and UTIA\, Prague) as part of Seminar on Algorithmic Aspect
 s of Information Theory\n\n\nAbstract\nWe present a short proof of a celeb
 rated result of Gács and Körner giving a necessary and sufficient
  condition on the joint distribution of two discrete random variables X and
  Y for their mutual information to match the extractable (in the limit)
  common information. Our proof is based on the observation that the mere
  existence of certain random variables jointly distributed with X and Y ca
 n impose restrictions on all random variables jointly distributed with X a
 nd Y.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/20/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andrei Romashchenko (LIRMM - CNRS)
DTSTART:20230927T150000Z
DTEND:20230927T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/21
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 21/">On the common information and the Ahlswede-Körner lemma in one-shot 
 settings.</a>\nby Andrei Romashchenko (LIRMM - CNRS) as part of Seminar on
  Algorithmic Aspects of Information Theory\n\n\nAbstract\nThe Gács-Körne
 r common information\, Wyner's common information\, and other related info
 rmation quantities apply by design to long series of independent identical
 ly distributed random variables. We will talk about a possible adaptation 
 of these quantities to the one-shot setting\, when the random objects cann
 ot be represented as series of i.i.d. variables\, and the usual techniques o
 f typical sequences do not apply. We will discuss the connection of inform
 ation-theoretic profiles with combinatorial and spectral properties of gra
 phs.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/21/
END:VEVENT
BEGIN:VEVENT
SUMMARY:LI Cheuk Ting (Chinese University of Hong Kong)
DTSTART:20231011T150000Z
DTEND:20231011T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/22
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 22/">Functional Representation Lemma and Minimum Entropy Coupling</a>\nby 
 LI Cheuk Ting (Chinese University of Hong Kong) as part of Seminar on Algo
 rithmic Aspects of Information Theory\n\n\nAbstract\nThe functional repres
 entation lemma states that for two jointly-distributed random variables X\
 , Y\, it is possible to express Y as a function of X and another random va
 riable Z that is independent of X. There are two interesting extreme point
 s. If we minimize H(Y|Z)\, the minimum can be bounded by the "strong funct
 ional representation lemma"\, and has implications in lossy compression an
 d channel simulation. If we instead minimize H(Z)\, this is equivalent to 
 the minimum entropy coupling problem\, which concerns finding the coupling
  of several given distributions that gives the smallest joint entropy. In 
 this talk\, we will discuss several recent developments regarding these pr
 oblems.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/22/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alexander Shen (CNRS - LIRMM\, Montpellier)
DTSTART:20231025T150000Z
DTEND:20231025T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/23
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 23/">Information theoretic proofs (a survey).</a>\nby Alexander Shen (CNRS
  - LIRMM\, Montpellier) as part of Seminar on Algorithmic Aspects of Infor
 mation Theory\n\n\nAbstract\nSome theorems that have nothing to do with in
 formation theory can be proven using an information-theoretic argument (en
 tropy\, or Kolmogorov complexity\, or counting via compression). For examp
 le\, why are there infinitely many prime numbers? If there were only M pri
 me numbers\, then each integer could be encoded by the list of M exponents
  in its prime decomposition\, so we could specify each n-bit number by onl
 y M×O(log n) bits\, a contradiction. We will discuss several examples of 
 proofs of that type\, including other bounds for the distribution of prime
  numbers\, but not only them.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/23/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Batya Kenig (Technion)
DTSTART:20231108T160000Z
DTEND:20231108T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/24
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 24/">Approximate Implication for Probabilistic Graphical Models.</a>\nby B
 atya Kenig (Technion) as part of Seminar on Algorithmic Aspects of Informa
 tion Theory\n\n\nAbstract\nThe graphical structure of Probabilistic Graphi
 cal Models (PGMs) represents the conditional independence (CI) relations t
 hat hold in the modeled distribution. The premise of all current systems-o
 f-inference for deriving conditional independence relations in PGMs is
  that the set of CIs used for the construction of the PGM holds exactly.
  In p
 ractice\, algorithms for extracting the structure of PGMs from data discov
 er approximate CIs that do not hold exactly in the distribution. In this w
 ork\, we ask how the error in this set propagates to the inferred CIs read
  off the graphical structure. More precisely\, what guarantee can we provi
 de on the inferred CI when the set of CIs that entailed it holds only
  approximately? In this talk\, I will describe new positive and negative
  results
  concerning this problem.\n\nBased on:\nhttps://lmcs.episciences.org/8943
 \nhttps://proceedings.mlr.press/v161/kenig21a.html\nhttps://arxiv.or
 g/abs/2310.13942\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/24/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Mokshay Madiman (University of Delaware)
DTSTART:20231122T160000Z
DTEND:20231122T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/25
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 25/">Analogies between entropy\, volume\, and cardinality inequalities for
  projections.</a>\nby Mokshay Madiman (University of Delaware) as part of 
 Seminar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nIt is 
 well known that entropy inequalities are a quick way of obtaining volume i
 nequalities for projections of sets in a Euclidean space (or cardinality i
 nequalities for projections of subsets of the integer lattice) - for examp
 le\, the Loomis-Whitney inequality follows easily from the classical Han
 ’s inequality. There is also a well known connection with integral inequ
 alities. We will review more such analogies\, as well as their limitations
 . For example\, we will observe that volume of projections to coordinate s
 ubspaces is not submodular (though entropy is)\, and discuss general duali
 ties between entropy and integral inequalities. We will also discuss some 
 particularly useful classes of Shannon-type inequalities that may be new t
 o the AAIT audience - these also have applications to volume or cardinalit
 y. Most of this talk will be tutorial - it will be based on the work of ma
 ny people in the geometry\, combinatorics\, probability\, and information 
 theory communities\, rather than just the work of the speaker.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/25/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Marius Zimand (Towson University)
DTSTART:20231206T160000Z
DTEND:20231206T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/26
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 26/">Lossless expanders and Information Reconciliation with no shared rand
 omness.</a>\nby Marius Zimand (Towson University) as part of Seminar on Al
 gorithmic Aspects of Information Theory\n\n\nAbstract\nLossless expanders 
 are bipartite graphs in which for every sufficiently small set on the left
  side of the bipartition\, most of the outgoing edges lead to distinct ver
 tices. I will present an application of such graphs in Information Reconci
 liation. This is a protocol with 2 parties: Sender and Receiver. Sender ha
 s a string x and Receiver has a string y. For instance\, think that x is a
 n updated version of y. Sender sends a message m which allows Receiver to 
 construct x. The goal is to minimize the length of m\, taking into account
  the correlation of x and y. This correlation can be formalized in differe
 nt ways and is only known by Receiver. Standard solutions require Sender a
 nd Receiver to share randomness. In my talk\, I will explain how lossless 
 expanders are used to obtain Information Reconciliation with no shared ran
 domness.\n\nThe main idea is from the paper B. Bauwens and M. Zimand\, Uni
 versal almost optimal compression and Slepian-Wolf coding in probabilistic
  polynomial time\, Journal of the ACM\, April 2023 (also available at arXi
 v:1911.04268)\, but I'll focus on just one particular aspect.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/26/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Amin Gohari (Chinese University of Hong Kong)
DTSTART:20240124T100000Z
DTEND:20240124T111500Z
DTSTAMP:20260404T111133Z
UID:AAIT/27
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 27/">Secret Key Generation from Dependent Source\, part 1</a>\nby Amin Goh
 ari (Chinese University of Hong Kong) as part of Seminar on Algorithmic As
 pects of Information Theory\n\n\nAbstract\nWe consider the problem of extr
 acting a secret key from dependent sources. In this problem\, the legitima
 te terminals observe i.i.d. repetitions of correlated random variables and wis
 h to create a shared secret key that is secure from a passive eavesdropper
  using a noiseless public communication channel. Finding the maximum achie
 vable key rate is a fundamental open problem in information-theoretic secu
 rity. We review the history and the state-of-the-art results for this prob
 lem\, with an emphasis on the speaker's prior work on this problem. Certai
 n extensions of the problem will also be discussed.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/27/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Amin Gohari (Chinese University of Hong Kong)
DTSTART:20240131T100000Z
DTEND:20240131T111500Z
DTSTAMP:20260404T111133Z
UID:AAIT/28
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 28/">Secret Key Generation from Dependent Source\, part 2</a>\nby Amin Goh
 ari (Chinese University of Hong Kong) as part of Seminar on Algorithmic As
 pects of Information Theory\n\n\nAbstract\nWe consider the problem of extr
 acting a secret key from dependent sources. In this problem\, the legitima
 te terminals observe i.i.d. repetitions of correlated random variables and wis
 h to create a shared secret key that is secure from a passive eavesdropper
  using a noiseless public communication channel. Finding the maximum achie
 vable key rate is a fundamental open problem in information-theoretic secu
 rity. We review the history and the state-of-the-art results for this prob
 lem\, with an emphasis on the speaker's prior work on this problem. Certai
 n extensions of the problem will also be discussed.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/28/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chao Tian (Texas A&M University)
DTSTART:20240228T160000Z
DTEND:20240228T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/29
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 29/">Computer-Aided Investigation of Information-Theoretic Limits: An Over
 view.</a>\nby Chao Tian (Texas A&M University) as part of Seminar on Algor
 ithmic Aspects of Information Theory\n\n\nAbstract\nThe linear programming
  (LP) formulation of information measures provides a solid mathematical fr
 amework to identify the fundamental limits of information systems computat
 ionally. A critical issue of this approach is however its high computation
 al complexity. To reduce the computation burden of this approach\, we can 
 utilize the symmetry structure in such systems. The strength of the symmet
 ry-reduced approach is illustrated in several well-known difficult problem
 s\, such as regenerating codes\, coded caching\, and private information r
 etrieval\, which provides new and non-trivial outer bounds. In addition to
  rate bounds\, more in-depth studies can be conducted on the joint entropy
  structure of these computed bounds\, which often lead to reverse-engineer
 ed novel code constructions and further allow disproving linear code achie
 vability. Finally\, we discuss two new directions: the first is to allow t
 he utilization of non-Shannon-type inequalities in the computational appro
 ach\, and the second is to convert the original LP into a sequence of smal
 ler LPs\, both of which appear to be awaiting certain suitable machine-lea
 rning techniques.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/29/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chao Tian (Texas A&M University)
DTSTART:20240313T160000Z
DTEND:20240313T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/30
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 30/">Computer-Aided Investigation of Information-Theoretic Limits: An Over
 view. (2nd part)</a>\nby Chao Tian (Texas A&M University) as part of Semina
 r on Algorithmic Aspects of Information Theory\n\n\nAbstract\nThe linear p
 rogramming (LP) formulation of information measures provides a solid mathe
 matical framework to identify the fundamental limits of information system
 s computationally. A critical issue of this approach\, however\, is its high c
 omputational complexity. To reduce the computation burden of this approach
 \, we can utilize the symmetry structure in such systems. The strength of 
 the symmetry-reduced approach is illustrated in several well-known difficu
 lt problems\, such as regenerating codes\, coded caching\, and private inf
 ormation retrieval\, where it provides new and non-trivial outer bounds. In a
 ddition to rate bounds\, more in-depth studies can be conducted on the joi
 nt entropy structure of these computed bounds\, which often lead to revers
 e-engineered novel code constructions and further allow disproving linear 
 code achievability. Finally\, we discuss two new directions: the first is 
 to allow the utilization of non-Shannon-type inequalities in the computati
 onal approach\, and the second is to convert the original LP into a sequen
 ce of smaller LPs\, both of which appear to be awaiting certain suitable m
 achine-learning techniques.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/30/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alexander Kozachinskiy (Catholic University of Chile)
DTSTART:20240410T150000Z
DTEND:20240410T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/32
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 32/">Information-theoretic methods in communication complexity.</a>\nby Al
 exander Kozachinskiy (Catholic University of Chile) as part of Seminar on 
 Algorithmic Aspects of Information Theory\n\n\nAbstract\nImagine that Alic
 e and Bob both hold a binary string of length n\, and they want to know if
  there is a position where both their strings have a 1. This problem is ca
 lled Disjointness. A fundamental result in communication complexity states
  that even if Alice and Bob have access to randomness and are allowed to
  err with small probability\, they need to send each other Omega(n) bits t
 o solve Disjointness. \n\nIn the talk\, I will present an information comp
 lexity proof technique that was developed by different authors in the 2000
 s and by now has become classical. This method\, in a nutshell\, uses th
 e chain rule for mutual information to formalize an intuition that we cann
 ot do better in solving Disjointness than checking individually all n posi
 tions.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/32/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Esther Galbrun (University of Eastern Finland\, Kuopio)
DTSTART:20240424T140000Z
DTEND:20240424T151500Z
DTSTAMP:20260404T111133Z
UID:AAIT/33
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 33/">The minimum description length principle for pattern mining (MDL4PM).
 </a>\nby Esther Galbrun (University of Eastern Finland\, Kuopio) as part o
 f Seminar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nCons
 idering that patterns express the repeated presence in the data of particu
 lar items\, attribute values or other discrete properties\, mining pattern
 s is a core task in data analysis. Beyond issues of efficient enumeration\
 , the selection of patterns constitutes a major challenge. The MDL princip
 le\, a model selection method grounded in information theory\, has been ap
 plied to pattern mining with the aim of obtaining compact high-quality sets o
 f patterns. After introducing some necessary concepts to formalise the pat
 tern mining task\, we will review MDL-based methods for mining various typ
 es of data and patterns and try to highlight their common characteristics 
 and differences. Finally\, we will discuss some of the issues regarding th
 ese methods\, and highlight currently active related data analysis problem
 s.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/33/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Lasse Harboe Wolff (University of Copenhagen)
DTSTART:20240515T150000Z
DTEND:20240515T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/34
DESCRIPTION:by Lasse Harboe Wolff (University of Copenhagen) as part of Se
 minar on Algorithmic Aspects of Information Theory\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/34/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tiago Pimentel (ETH Zurich)
DTSTART:20240529T150000Z
DTEND:20240529T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/35
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 35/">Revisiting the optimality of word lengths.</a>\nby Tiago Pimentel (ET
 H Zurich) as part of Seminar on Algorithmic Aspects of Information Theory\
 n\n\nAbstract\nZipf posited that wordforms are optimized to minimize utter
 ances' communicative costs. He supported this claim by showing that words'
  lengths are inversely correlated with their frequencies. This correlation
 \, however\, is only expected if one assumes that a word's communicative c
 ost is given by its length. In this talk\, we will explore this assumption
 \, comparing the predictive power over word lengths we get when assuming d
 ifferent operationalisations of communicative cost.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/35/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Dan Suciu (University of Washington)
DTSTART:20240605T150000Z
DTEND:20240605T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/36
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 36/">Tight upper bounds on the number of homomorphic images of a hypergrap
 h in another.</a>\nby Dan Suciu (University of Washington) as part of Semi
 nar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nGiven two 
 hypergraphs $H$\, $G$\, we are interested in the number of homomorphisms $
 H \\to G$. The hyperedges are typed\, and homomorphisms need to preserve t
 he type. The graph $G$ is unknown\; instead\, we only know statistics abo
 ut $G$\, such as the total number of edges of each type\, or informatio
 n abou
 t their degree sequences. The problem is to compute an upper bound on the 
 number of homomorphisms $H \\to G$\, when $G$ satisfies the given statisti
 cs. We say that this bound is tight within a constant $c \\le 1$ if there 
 exists a graph $G$ satisfying the given statistics for which the number of
  homomorphisms is at least $c$ times the bound.\n\nWe start with the AGM bou
 nd (by Atserias\, Grohe\, Marx)\, which strengthens a prior result by Frie
 dgut and Kahn. This bound uses only the cardinalities of each type of hype
 redge of the hypergraph $G$. We will prove the AGM bound by using a numeri
 cal inequality due to Friedgut\, which generalizes the Cauchy-Schwarz an
 d Hölder inequalities. The AGM bound can be computed in polynomial time in th
 e size of $H$\, and is tight within a constant $1/2^k$\, where $k$ is the 
 number of vertices in $H$. \n\nNext\, we prove a general bound\, which us
 es both the cardinalities of each type of hyperedge and the $\\ell_p$ nor
 ms of their degree sequences. In particular\, the $\\ell_\\infty$ norm i
 s the largest degree of that type of hyperedge in $G$. The proof uses inf
 ormation inequalities
 \, and for that reason we call the bound the "entropic bound". However\, i
 t is an open problem whether the entropic bound is computable. \n\nFinally
 \, we restrict the statistics to "simple" statistics\, where all degrees a
 re from a single node\, as opposed to tuples of nodes: in particular\, if 
 $G$ is a graph\, then all statistics are simple. In this case we show that
  the entropic vector in the entropic bound can be restricted to be one wit
 h non-negative $I$-measure. Furthermore\, the entropic bound is computable
  in polynomial time in the size of $H$\, and it is tight within a constant
  $1/2^{2^k-1}$.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/36/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Milan Studený (Institute of Information Theory and Automation\, P
 rague)
DTSTART:20240619T150000Z
DTEND:20240619T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/37
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 37/">Self-adhesivity in lattices of abstract conditional independence mode
 ls.</a>\nby Milan Studený (Institute of Information Theory and Automation
 \, Prague) as part of Seminar on Algorithmic Aspects of Information Theor
 y\n\n\nAbstract\nIn the first part of the talk\, the concept of a self-adh
 esive polymatroid\, introduced in 2007 by Matus\, will be recalled. This c
 oncept can be viewed as the formalization of a method to derive non-Shanno
 n information-theoretic inequalities\, known in the context of the so-called
  copy lemma. Then an abstraction of this method leads to the notion of a s
 elf-adhesive conditional independence (CI) model. These CI models will be 
 defined relative to an abstract algebraic frame of CI models closed under 
 three basic operations\, called copying\, marginalization\, and intersecti
 on. Three examples of such abstract frames will be given. The application 
 of this theory to the theme of entropic region delimitation will be mentio
 ned at the end of the first part. Specifically\, most of the extreme ray
 s of the 5-variable polymatroidal cone were computationally excluded fro
 m being almost entropi
 c.\n\nThe second part of the talk will start with recalling certain basic 
 facts from lattice theory and formal concept analysis. Two basic ways to d
 escribe finite lattices in a condensed form will be presented. Particu
 lar attention will be devoted to the concept of a pseudo-closed set and a
 n implicational generator for a Moore family of sets. This leads to combi
 natori
 al algorithms to compute the so-called canonical implicational basis. The 
 results of computations on relevant families of CI models will then be pr
 esented\; these results can be interpreted as canonical "axiomatizations"
  of these CI families. The rest of the talk will be devoted to ot
 her sensible operations with CI models\, other conceivable abstract algebr
 aic CI frames\, and to open questions raised in this context. \n\nThe talk
  is based on joint work with T. Boege and J.H. Bolt.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/37/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andrei Romashchenko (LIRMM Univ Montpellier and CNRS)
DTSTART:20241023T150000Z
DTEND:20241023T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/38
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 38/">Information inequalities on graphs with a strong mixing property and 
 their applications.</a>\nby Andrei Romashchenko (LIRMM Univ Montpellier an
 d CNRS) as part of Seminar on Algorithmic Aspects of Information Theory\n\
 n\nAbstract\nWe will discuss the argument of Andrei Muchnik (that goes bac
 k to the 1990s) that allows one to prove information inequalities specific for
  neighboring vertices sampled on a graph with a strong mixing property. We
  will use this technique to show that the classical two-phase protocol of
  secret key agreement proposed by Ahlswede-Csiszár and Maurer (informatio
 n reconciliation + privacy amplification) is in some settings the only pos
 sible solution. In particular\, in the one-shot setting\, this protocol i
 s unavoidably asymmetric: (almost) the entire burden of communication fall
 s on one of the protocol participants. (Based on joint work with Geoffroy
  Caillat-Grenier and Rustam Zyavgarov.)\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/38/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Satyajit Thakor (Indian Institute of Technology Mandi)
DTSTART:20241120T150000Z
DTEND:20241120T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/39
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 39/">The Constrained Entropy Vectors Characterization Problem.</a>\nby Sat
 yajit Thakor (Indian Institute of Technology Mandi) as part of Seminar on 
 Algorithmic Aspects of Information Theory\n\n\nAbstract\nProperties of the
  Shannon entropy are instrumental for analyzing the limits of reliable com
 munication for various communication system models. Often\, situations ari
 se where we have additional constraints on random variables and the corres
 ponding entropy vectors due to the underlying model. In this talk\, we foc
 us on the problem of characterizing constrained entropy vectors. Constrain
 ts can be entropy equalities or membership in a particular class of entropy
  vectors\, e.g.\, quasi-uniform entropy vectors. We visit both direct and 
 converse approaches to tackle the characterization problem. The direct app
 roach is to construct entropy vectors given the constraints\, and the conv
 erse approach is to find necessary conditions\, usually equalities and ine
 qualities\, for entropy vectors given the constraints.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/39/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chandra NAIR and Chin Wa (Ken) LAU (The Chinese University of Hong
  Kong)
DTSTART:20241211T160000Z
DTEND:20241211T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/40
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 40/">Additive Combinatorics\, Abelian Groups\, and Entropic Inequalities.<
 /a>\nby Chandra NAIR and Chin Wa (Ken) LAU (The Chinese University of Hong
  Kong) as part of Seminar on Algorithmic Aspects of Information Theory\n\n
 \nAbstract\nRuzsa’s equivalence theorem provided a framework for converti
 ng certain families of inequalities in additive combinatorics to entropic 
 inequalities (which sometimes did not possess stand-alone entropic proofs)
 . In this talk\, we first establish formal equivalences between some famil
 ies (different from Ruzsa) of inequalities in additive combinatorics and e
 ntropic ones. Secondly\, we provide stand-alone entropic proofs for previo
 usly known entropic inequalities that we established via Ruzsa’s equival
 ence theorem. As a first step to further these equivalences\, we establish
  an information-theoretic characterization of the magnification ratio tha
 t is of independent interest.\n\nThis talk is based on the following pape
 r: Informa
 tion Inequalities via Ideas from Additive Combinatorics (conference versio
 n ISIT 2023: C. W. Ken Lau and C. Nair\, "Information Inequalities via Ide
 as from Additive Combinatorics\," 2023 IEEE International Symposium on Inf
 ormation Theory (ISIT)\, Taipei\, Taiwan\, 2023\, pp. 2452-2457\, doi: 10.
 1109/ISIT54713.2023.10206561.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/40/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Bryon ARAGAM (University of Chicago)
DTSTART:20250115T160000Z
DTEND:20250115T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/41
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 41/">Learning compositional structure from data</a>\nby Bryon ARAGAM (Univ
 ersity of Chicago) as part of Seminar on Algorithmic Aspects of Informatio
 n Theory\n\n\nAbstract\nWe introduce the neighbourhood lattice decompositi
 on of a distribution\, which is a compact\, non-graphical representation o
 f conditional independence that is valid in the absence of a faithful grap
 hical representation. The idea is to view the set of neighbourhoods of a v
 ariable as a subset lattice\, and partition this lattice into convex subla
 ttices\, each of which directly encodes a collection of conditional indepe
 ndence relations. We show that this decomposition exists in any compositio
 nal graphoid and can be computed consistently in high dimensions without t
 he curse of dimensionality. In particular\, this gives a way to learn from
  data all of the independence relations implied by any graphical model and
  thus in particular its structure.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/41/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alexander SHEN (CNRS LIRMM\, Montpellier)
DTSTART:20250129T160000Z
DTEND:20250129T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/42
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 42/">Revisiting combinatorial applications of information inequalities.</a
 >\nby Alexander SHEN (CNRS LIRMM\, Montpellier) as part of Seminar on Algo
 rithmic Aspects of Information Theory\n\n\nAbstract\nThis (short) talk is 
 a reaction to Dan Suciu's talk from last year. He showed some new combina
 torial inequalities that use the L_p-sizes of sections of some sets. One
  could note that they have Kolmogorov complexity versions (even slightly
  more general ones that give a bound for the pair (C(x)\,C(y|x)) in terms
  of the section s
 ize statistics)\, and it would be interesting to find out whether one can 
 get a similar bound for Shannon's entropies.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/42/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Qi CHEN (Xidian University)
DTSTART:20250212T140000Z
DTEND:20250212T151500Z
DTSTAMP:20260404T111133Z
UID:AAIT/43
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 43/">Matroidal entropy functions (Part 1)</a>\nby Qi CHEN (Xidian Universi
 ty) as part of Seminar on Algorithmic Aspects of Information Theory\n\n\nA
 bstract\nA matroidal entropy function is an entropy function in the form l
 og v · r_M \, where v is an integer exceeding one and r_M is the rank fun
 ction of a matroid M. For a matroid M\, the characterization of matroidal 
 entropy functions induced by M is to determine its probabilistic-character
 istic set Χ_M\, i.e.\, the set of integers v such that log v · r_M is en
 tropic. When M is a connected matroid with rank not less than 2\, such cha
 racterization also determines the entropic region on the extreme ray of th
 e polymatroidal region containing log v · r_M.\n\nTo characterize matroid
 al entropy functions\, variable-strength orthogonal arrays (VOAs) are def
 ined\, which can be considered as special cases of the classic combinator
 ial structure of orthogonal arrays (OAs). It can be proved that whether l
 og v · r_M i
 s entropic depends on whether a VOA(M\, v) is constructible. The construct
 ibility of VOA(M\, v) is also equivalent to the partition-representability
  of M over an alphabet with cardinality v\, defined by Fero Matúš\, whic
 h generalizes the linear representability of a matroid over a field.\n\nI
 n this part\, we characterize all matroidal entropy functions with ground se
 t size not exceeding 5 except for log v · r_{U_{2\,5}} and log v · r_{U_
 {3\,5}} for some v. We also briefly discuss the application of matroidal e
 ntropy functions to network coding and secret sharing.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/43/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Qi CHEN (Xidian University)
DTSTART:20250219T140000Z
DTEND:20250219T151500Z
DTSTAMP:20260404T111133Z
UID:AAIT/44
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 44/">Matroidal entropy functions (Part 2)</a>\nby Qi CHEN (Xidian Universi
 ty) as part of Seminar on Algorithmic Aspects of Information Theory\n\n\nA
 bstract\nLeveraging the correspondences between matroidal entropy function
 s and VOAs discussed in part 1 of the talk\, we characterize the matroidal
  entropy functions induced by matroids obtained from matroid operations su
 ch as deletion\, contraction\, minor\, series and parallel connection and 
 2-sum. Utilizing these results\, we characterize two classes of matroidal 
 entropy functions\, i.e.\, those induced by regular matroids and matroids 
 with the same p-characteristic set as uniform matroid U_{2\,4}\, which are
  located at the bottom of the lattice of all matroids ordered by the oper
 ation of "taking minors". Some further research topics on matroidal entro
 py functions are also discussed at the end of the talk.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/44/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tobias BOEGE (UiT The Arctic University of Norway in Tromsø)
DTSTART:20250305T160000Z
DTEND:20250305T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/45
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 45/">On the Intersection and Composition properties for discrete random va
 riables.</a>\nby Tobias BOEGE (UiT The Arctic University of Norway in Trom
 sø) as part of Seminar on Algorithmic Aspects of Information Theory\n\n\n
 Abstract\nCompositional graphoids are fundamental combinatorial structures
  in the field of causality where they usually appear disguised as graphica
 l models. Their usefulness relies on the Intersection and Composition prop
 erties. We survey sufficient conditions for a system of discrete random va
 riables to satisfy either of these properties and tie this investigation t
 o conditional information inequalities.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/45/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Marius ZIMAND (Towson University)
DTSTART:20250326T160000Z
DTEND:20250326T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/46
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 46/">On the existence of one-way functions and the hardness of approximati
 ng Kolmogorov complexity.</a>\nby Marius ZIMAND (Towson University) as par
 t of Seminar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nO
 ne-way functions are functions that are easy to compute and hard to invert
 . They yield private-key encryption\, zero-knowledge proofs\, digital sign
 atures\, commitment schemes\, and other applications in cryptography. They
  are widely believed to exist\, but an unconditional proof of this fact w
 ould imply P \\not= NP. I will present a recent characterization due to I
 lango\, Ren\, and Santhanam: there exists a one-way function if and only
  if approximating Kolmogorov complexity is hard on average with respect t
 o some efficiently samplable distribution.\n\nThe paper by Ilango\, Ren\,
  and Santhanam is av
 ailable in Proc. STOC 2022\, https://dl.acm.org/doi/10.1145/3519935.352005
 1\nSee also by M. Zimand https://eccc.weizmann.ac.il/report/2024/201/\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/46/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chris F. Westbury (University of Alberta) and Michelle Yang (McGil
 l University)
DTSTART:20250611T150000Z
DTEND:20250611T161500Z
DTSTAMP:20260404T111133Z
UID:AAIT/47
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 47/">Orthographic uncertainty: An entropy-based measure of word form typic
 ality.</a>\nby Chris F. Westbury (University of Alberta) and Michelle Yan
 g (McGill University) as part of Semi
 nar on Algorithmic Aspects of Information Theory\n\n\nAbstract\nMeasures o
 f orthographic typicality have long been studied as predictors of lexical 
 access. The best-known orthographic typicality measure is orthographic nei
 ghbourhood size (Coltheart’s N or ON)\, the number of words that are one
  letter different\, by substitution\, from the target word. A more recent 
 related measure of orthographic typicality is orthographic Levenshtein dis
 tance 20 (OLD20)\, the average Levenshtein orthographic edit distance of a
  target word from its 20 closest neighbours (Yarkoni\, Balota\, and Yap\, 
 2008). Both measures have been implicated in lexical access. We will discu
 ss a family of measures of word form similarity we call orthographic uncer
 tainty. These measures are based on Shannon entropy (Shannon\, 1948)\, whi
 ch has a long history of being considered psychologically relevant. Orthog
 raphic uncertainty measures are superior to ON and OLD20 at predicting wor
 d/nonword decision times and word reading times and accuracies. They are a
 lso superior to the older measures insofar as they are naturally tied to t
 he widely accepted Shannon-entropy quantification of the psychologic
 al functions of familiarity\, uncertainty\, learnability\, and representat
 ional and computational efficiency.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/47/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Laszlo Csirmaz
DTSTART:20260211T160000Z
DTEND:20260211T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/48
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 48/">Information Inequalities for Five Random Variables.</a>\nby Laszlo Cs
 irmaz as part of Seminar on Algorithmic Aspects of Information Theory\n\n\
 nAbstract\nSplit the base set N into the disjoint union YXZ\, and let Y1\,
 ...\,Yn be copies of Y\, and Z1\,...\,Zm be copies of Z. For any probabili
 ty distribution on YXZ there is another probability distribution on (Y1 ..
 . Yn X Z1 ... Zm) such that the marginals on XYi and XY are the same\; the
  marginals on XZj and XZ are the same\; moreover the variable sets {Yi\,Zj
 } are completely conditionally independent over X.\n\nApplying this proper
 ty for Y={cd}\, X={ab}\, and Z={z}\, all consequences are computed for n<1
 0. Based on the results\, two infinite families of five-variable non-Shann
 on inequalities are defined and proved to be consequences of the above pro
 perty. We investigate the “extremal” inequalities among them\, their a
 symptotic behavior\, and how they delimit the five-variable entropy region
 . At the end we discuss how they relate to Matus’ inequalities.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/48/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Cheuk Ting LI (The Chinese University of Hong Kong)
DTSTART:20260225T160000Z
DTEND:20260225T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/49
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 49/">Discrete Layered Entropy\, Conditional Compression and a Tighter Stro
 ng Functional Representation Lemma.</a>\nby Cheuk Ting LI (The Chinese Uni
 versity of Hong Kong) as part of Seminar on Algorithmic Aspects of Informa
 tion Theory\n\n\nAbstract\nGiven two pieces of information\, we can take t
 he "union" of them (the joint random variable)\, but the natural definitio
 n of the difference between them (the information in X but not in Y) is le
 ss clear. The conditional entropy H(X|Y) is often interpreted as the amoun
 t of information in X but not in Y\, but H(X|Y) cannot be regarded as the 
 entropy of the "conditional random variable X|Y"\, i.e.\, conditional entr
 opy is not a special case of entropy. In this talk\, we discuss a notion o
 f difference motivated by a conditional compression problem\, and a quanti
 ty Lambda(X)\, called discrete layered entropy. Lambda(X) has properties s
 imilar to the Shannon entropy H(X)\, and is close to H(X) within a logarit
 hmic gap. Moreover\, Lambda(X|Y) is indeed the discrete layered entropy of
  the "conditional random variable X|Y"\, so conditional discrete layered e
 ntropy is a special case of discrete layered entropy (unlike Shannon entro
 py). These properties make Lambda(X) useful for analyzing conditional comp
 ression problems such as channel simulation with common randomness. In par
 ticular\, it can give a tighter bound for the strong functional representa
 tion lemma.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/49/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Carles Padró (Universitat Politècnica de Catalunya)
DTSTART:20260311T160000Z
DTEND:20260311T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/50
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 50/">Interaction between skew-representability\, tensor products\, extensi
 on properties\, and rank inequalities.</a>\nby Carles Padró (Universitat 
 Politècnica de Catalunya) as part of Seminar on Algorithmic Aspects of In
 formation Theory\n\n\nAbstract\nSkew-representable matroids form a fundame
 ntal class in matroid theory\, bridging combinatorics and linear algebra. 
 They play an important role in areas such as coding theory\, optimization\
 , and combinatorial geometry\, where linear structure is crucial for both 
 theoretical insights and algorithmic applications. Since deciding skew-rep
 resentability is computationally intractable\, much effort has been focuse
 d on identifying necessary or sufficient conditions for a matroid to be sk
 ew-representable.\n\nIn this paper\, we introduce a novel approach to stud
 ying skew-representability and structural properties of matroids and polym
 atroid functions via tensor products. We provide a characterization of ske
 w-representable matroids\, as well as of those representable over skew fie
 lds of a given prime characteristic\, in terms of tensor products. As an a
 lgorithmic consequence\, we show that deciding skew-representability\, or 
 representability over a skew field of fixed prime characteristic\, is co-r
 ecursively enumerable: that is\, certificates of non-skew-representability
  — in general or over a fixed prime characteristic — can be verified. 
 We also prove that every rank-3 matroid admits a tensor product with any u
 niform matroid and give a construction yielding the unique freest tensor p
 roduct in this setting. Finally\, as an application of the tensor product 
 framework\, we give a new proof of Ingleton's inequality and\, more import
 antly\, derive the first known linear rank inequality for folded skew-repr
 esentable matroids that does not follow from the common information proper
 ty.\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/50/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Slava Matveev (Leipzig)
DTSTART:20260325T160000Z
DTEND:20260325T171500Z
DTSTAMP:20260404T111133Z
UID:AAIT/51
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/AAIT/
 51/">Spectral Conditions for the Ingleton Inequality.</a>\nby Slava Matvee
 v (Leipzig) as part of Seminar on Algorithmic Aspects of Information Theor
 y\n\n\nAbstract\nThe Ingleton inequality is a classical linear information
  inequality that holds for representable matroids but fails to be universa
 lly valid for entropic vectors. Understanding the extent to which this ine
 quality can be violated has been a longstanding problem in information the
 ory. In this paper\, we show that for a broad class of jointly distributed
  random variables (X\,Y) the Ingleton inequality holds up to a small addit
 ive error\, even though the mutual information between X and Y is far
  from being extractable. Contrary to common intuition\, strongly non-extra
 ctable mutual information does not lead to large violations of the Ingleto
 n inequality in this setting. More precisely\, we consider pairs (X\,Y) th
 at are uniformly distributed on their joint support and whose associated b
 iregular bipartite graph is an expander. For all auxiliary random variable
 s A and B jointly distributed with (X\,Y)\, we establish a lower bound on 
 the Ingleton quantity I(X:Y|A)+I(X:Y|B)+I(A:B)-I(X:Y) in terms of the spec
 tral parameters of the underlying graph. Our proof combines the expander m
 ixing lemma with a partitioning technique for finite sets (cf. Alon\, Newm
 an\, Shen\, Tardos\, Vereshchagin\, 2007).\n
LOCATION:https://stable.researchseminars.org/talk/AAIT/51/
END:VEVENT
END:VCALENDAR
