Abstract:
Deep learning architectures designed for natural language, such as the transformer, can be used to solve problems of mathematics and physics, by representing problems and solutions as sequences of words in some formal language, and training the model, from examples only, to translate the problem into the solution. Conversely, mathematical problems often prove to be good benchmarks for understanding these models. I will present applications of transformers to advanced mathematical problems, and results on mathematical benchmarks that shed light on how language models learn.
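As a minimal sketch (my own illustration, not material from the seminar), the encoding step described above can be pictured as follows: a symbolic problem and its solution are flattened into prefix-notation token sequences, which is the kind of "sequence of words in a formal language" a seq2seq transformer can be trained on. The expressions and the helper are hypothetical examples.

```python
def to_prefix(expr):
    """Flatten a nested (operator, args...) tuple into prefix-notation tokens."""
    if isinstance(expr, tuple):
        op, *args = expr
        tokens = [op]
        for a in args:
            tokens.extend(to_prefix(a))
        return tokens
    return [str(expr)]

# Example problem: differentiate x * cos(x); solution: cos(x) - x * sin(x).
problem = ("diff", ("mul", "x", ("cos", "x")), "x")
solution = ("sub", ("cos", "x"), ("mul", "x", ("sin", "x")))

src_tokens = to_prefix(problem)   # ['diff', 'mul', 'x', 'cos', 'x', 'x']
tgt_tokens = to_prefix(solution)  # ['sub', 'cos', 'x', 'mul', 'x', 'sin', 'x']

# A transformer is then trained, from such (problem, solution) pairs only,
# to translate the source sequence into the target sequence.
print(src_tokens)
print(tgt_tokens)
```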
The seminar will be held in the Dvořák hall, FZU, Na Slovance 2, Prague (entrance from the street "Pod Vodárenskou věží 1").
Location: https://goo.gl/maps/wEf7PsiLimSXMZhE9
The seminar will also be available via the ZOOM video conference system:
Meeting ID: 674 9629 6646
Passcode: 575511
or join the meeting via the direct ZOOM link