ASHISH VASWANI

GOOGLE AI

The Research and Applied AI Summit (RAAIS) is a community for entrepreneurs and researchers who accelerate the science and applications of AI technology. In the lead-up to our 5th annual event on June 28th 2019 in London, we’re running a series of speaker profiles to shed more light on what you can expect to learn on the day!


Over the last 12 months, we’ve seen significant breakthroughs on natural language processing tasks such as machine translation, document generation and syntactic parsing. A few of these breakthroughs were produced by models such as Google AI’s BERT, OpenAI’s GPT, and Microsoft’s MT-DNN. A key enabler of these results is the Transformer model architecture, co-authored by Ashish Vaswani and colleagues and published at NIPS 2017, which has now been cited more than 1,500 times.
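At the heart of the Transformer is scaled dot-product attention, softmax(QKᵀ/√d_k)V, from the NIPS 2017 paper. Here is a minimal NumPy sketch of that single operation (an illustration only, not the paper’s full multi-head implementation; the variable names are ours):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V                                   # weighted sum of values

# Toy example: 3 tokens, model dimension 4 (hypothetical sizes)
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Because every token attends to every other token in one step, the model captures long-range dependencies without the sequential recurrence of an RNN.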

At this year’s RAAIS, we’re excited to be hosting Ashish, who is a Senior Research Scientist on the Brain team within Google AI in San Francisco. His research has focused on developing purely attention-based models, such as the Transformer, for generation and classification. Before joining Google, he was a PhD student, and later a Research Scientist, in natural language processing at the University of Southern California’s Information Sciences Institute.

Ashish’s co-authored research and collaborations have been accepted at NIPS 2017 (Attention is All You Need), ICML 2018 (Image Transformer, Fast Decoding in Sequence Models Using Discrete Latent Variables) and ICLR 2019 (Music Transformer: Generating Music with Long-Term Structure).

You can read more about his work here and find him on Twitter here. Welcome, Ashish!