November 18, 2019
Estrelsaal C5 & C6
Sufficient training data is often a bottleneck for real-world machine learning applications. The computer vision community mitigated this problem by pretraining models on ImageNet and transferring the learned knowledge to the desired task. Thanks to an emerging class of deep language models, transfer learning has now also become hot in NLP. In this talk, Malte will share strategies, tips & tricks across all model phases: pretraining a language model from scratch, adjusting it for domain-specific language, and fine-tuning it for the desired downstream task. He will demonstrate the practical implications by showing how BERT was deployed at a Fortune 500 company.