29th European Summer School in Logic, Language, and Information
University of Toulouse (France), 17-28 July 2017

Embeddings and Deep Learning

Hinrich Schütze

Language and Computation (Advanced)

First week, from 11:00 to 12:30

Abstract

An embedding for a word w is a vector in n-dimensional Euclidean space that encodes the properties of w in terms of its similarity to other words; e.g., words whose vectors are close to that of w have similar meanings. Embeddings are a modernized form of distributional semantic models and are widely used in computational linguistics and natural language processing. This one-week course will provide a detailed overview of embeddings and (on the last day) an introduction to their use in deep learning. The course aims to provide (i) a comprehensive overview of the state of the art in embeddings, (ii) an overview of the most important tools for computing and using embeddings, and (iii) an introduction to the main research questions that embeddings currently pose.
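To make the opening definition concrete, the following is a minimal sketch of how similarity between word embeddings is typically measured, namely by cosine similarity. The words and the 4-dimensional vectors are purely illustrative toy values, not taken from any trained model.

```python
import math

# Toy 4-dimensional word embeddings (hypothetical values for illustration).
embeddings = {
    "king":  [0.8, 0.6, 0.1, 0.0],
    "queen": [0.7, 0.7, 0.1, 0.1],
    "apple": [0.0, 0.1, 0.9, 0.8],
}

def cosine(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# In this toy space, "king" is much closer to "queen" than to "apple",
# illustrating the claim that nearby vectors indicate similar meanings.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

Real embedding models (e.g., word2vec or GloVe) learn such vectors from large corpora with n in the hundreds, but the similarity computation is the same.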