BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Paata Ivanisvili (UC Irvine)
DTSTART:20211112T230000Z
DTEND:20211113T000000Z
DTSTAMP:20260404T143359Z
UID:ags/9
DESCRIPTION:Title: Learning low degree functions in logarithmic number o
 f random queries.\nby Paata Ivanisvili (UC Irvine) as part of Analysis an
 d Geometry Seminar\n\nAbstract\nPerhaps a very basic question in learni
 ng theory is the following: we have an unknown function $f$ on the hyp
 ercube $\\{-1\,1\\}^n$\, and we are allowed to query samples $(X\, f(X
 ))$ where $X$ is uniformly distributed on $\\{-1\,1\\}^n$. After obtai
 ning the samples $(X_1\, f(X_1))\, \\dots\, (X_N\, f(X_N))$ we would l
 ike to construct a function $h$ which approximates $f$ up to error $\\
 varepsilon$ (say in $L^2$). Of course $h$ is a random function\, as it
 s construction involves the i.i.d. random variables $X_1\, \\dots\, X_
 N$. Therefore\, we want to construct an $h$ which approximates $f$ wit
 h probability at least $1-\\delta$. So\, given parameters $\\varepsilo
 n\, \\delta \\in (0\,1)$\, the goal is to minimize the number of rando
 m queries $N$. I will show that around $\\log(n)$ random queries are s
 ufficient to learn bounded "low-complexity" functions. Based on joint wor
 k with Alexandros Eskenazis.\n
LOCATION:https://stable.researchseminars.org/talk/ags/9/
END:VEVENT
END:VCALENDAR
