More on Transformer Language Models

January 22, 2023 - 2 minute read - Category: Intro - Tags: Deep learning


This post covers the fifth lecture in the course: “More on Transformer Language Models.”

This lecture continues our discussion of transformer language models, focusing on the interpretation and visualization of textual embeddings, and on the challenges and interesting questions raised by evolving language.

Lecture Video

Part 1

Watch the video

Part 2

Watch the video

  • Lecture notes 1
  • Lecture notes 2

References Cited in Lecture 5: More on Transformer Language Models

Academic Papers

Interpreting Textual Embeddings

Changing Language

Other Resources

  • TensorFlow Embedding Projector:
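As a quick illustration of how the Embedding Projector is typically used: it accepts a pair of tab-separated files, one with the embedding vectors and one with the aligned labels. The sketch below writes such a pair from toy, made-up 3-d vectors (not output of any real language model); the file names and vectors are illustrative assumptions.

```python
# Export embeddings to the two TSV files the TensorFlow Embedding
# Projector (projector.tensorflow.org) accepts for upload:
#   vectors.tsv  - one tab-separated embedding vector per line
#   metadata.tsv - one label per line, aligned with vectors.tsv
# The 3-d "embeddings" here are toy placeholder values for illustration.

embeddings = {
    "king":  [0.51, 0.12, -0.33],
    "queen": [0.48, 0.10, -0.29],
    "apple": [-0.21, 0.77, 0.05],
}

with open("vectors.tsv", "w") as vec_file, open("metadata.tsv", "w") as meta_file:
    for word, vector in embeddings.items():
        vec_file.write("\t".join(str(x) for x in vector) + "\n")
        meta_file.write(word + "\n")
```

Uploading the two files under "Load" in the projector then lets you explore the vectors with PCA, t-SNE, or UMAP.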

Code Bases

Historical Language Models

  • Hugging Face: open-source library with a large variety of NLP models, including MacBERTh and several other historically trained or fine-tuned language models

Image Source: Devlin, J., Chang, M., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding