Λέσχη Φίλων Στατιστικής - GrStats forum
AUEB Stats Webinar 22/10/2020: Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction by P. Dellaportas

For registration problems and other information, contact: grstats.forum@gmail.com or grstats@stat-athens.aueb.gr

grstats
Posts : 966
Join date : 2009-10-21
http://stat-athens.aueb.gr/~grstats/

AUEB Stats Webinar 22/10/2020: Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction by P. Dellaportas

Thu 1 Oct 2020 - 12:39
STATISTICS SEMINAR SERIES, OCTOBER 2020



Petros Dellaportas
Professor in Statistical Science, Department of Statistical Science, University College London, and Professor of Statistics, Department of Statistics, Athens University of Economics and Business

Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction

THURSDAY 22/10/2020
12:30

Google Meet link: meet.google.com/iut-yhkj-uqh

ABSTRACT

We provide a linear-time inferential framework for Gaussian processes that supports automatic feature extraction through deep neural networks and low-rank kernel approximations. Importantly, we derive approximation guarantees bounding the Kullback-Leibler divergence between the idealized Gaussian process and one resulting from a low-rank approximation to its kernel, under two types of approximation, which yield two instantiations of our framework: Deep Fourier Gaussian Processes, obtained from random Fourier feature low-rank approximations, and Deep Mercer Gaussian Processes, obtained by truncating the Mercer expansion of the kernel. An extensive experimental evaluation of these two instantiations on a broad collection of real-world datasets provides strong evidence that they outperform a range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
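
As a rough illustration of the kind of low-rank kernel approximation the abstract mentions (this is a generic NumPy sketch in the spirit of random Fourier features, not the speaker's implementation; the feature count, lengthscale, and toy data are arbitrary choices for the example):

```python
import numpy as np

def rff_features(X, num_features=100, lengthscale=1.0, seed=0):
    """Map X of shape (n, d) to random Fourier features of shape (n, 2 * num_features)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the spectral density of the RBF kernel.
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    Z = X @ W
    # cos/sin pairs give an unbiased low-rank estimate of the kernel matrix.
    return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(num_features)

# Compare the exact RBF kernel with its low-rank approximation on toy data.
X = np.random.default_rng(1).normal(size=(5, 3))
phi = rff_features(X, num_features=2000)
approx = phi @ phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
exact = np.exp(-0.5 * sq_dists)  # RBF kernel with lengthscale 1
print(np.max(np.abs(exact - approx)))  # small for large num_features
```

Because the feature map is finite-dimensional, Gaussian process inference with the approximate kernel scales linearly in the number of data points, which is the computational point of such approximations.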

Facebook event: https://www.facebook.com/events/1028215017627341
AUEB web link: https://www.dept.aueb.gr/el/stat/events/dellaportas22_10