Context Atlas
In transformer models like BERT, a word's embedding depends on its linguistic context. Type in a word to see its embeddings in different sentence contexts drawn from Wikipedia.

Each point is the query word's embedding at the selected layer, projected into two dimensions using UMAP.
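The demo reduces each high-dimensional per-layer embedding to a 2-D point with UMAP. As a minimal stand-in sketch (UMAP itself requires the umap-learn library; plain PCA via numpy is used here instead, since both map each embedding vector to a 2-D point), with randomly generated embeddings in place of real BERT hidden states:

```python
import numpy as np

def project_2d(embeddings: np.ndarray) -> np.ndarray:
    """Project (n_points, hidden_dim) embeddings down to (n_points, 2) via PCA."""
    centered = embeddings - embeddings.mean(axis=0)
    # Top-2 right singular vectors give the two principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T

# Illustrative stand-in: 50 contexts of the query word,
# each a 768-dim vector (the hidden size of BERT-base).
rng = np.random.default_rng(0)
fake_layer_embeddings = rng.normal(size=(50, 768))
points = project_2d(fake_layer_embeddings)
print(points.shape)  # (50, 2)
```

Each row of `points` would become one dot in the scatterplot; re-running on a different layer's embeddings produces a different layout.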

The cluster labels are words that are shared across the sentences in a cluster.
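One plausible way to derive such labels is to collect the words that appear in most of a cluster's sentences. A small sketch under that assumption (the sentences, stopword list, and threshold here are illustrative, not taken from the demo's actual implementation):

```python
from collections import Counter

def cluster_labels(sentences, min_fraction=0.5,
                   stopwords=frozenset({"the", "a", "to", "of"})):
    """Return words appearing in at least `min_fraction` of the sentences."""
    counts = Counter()
    for s in sentences:
        # Count each word once per sentence, ignoring case and stopwords.
        counts.update({w.lower() for w in s.split()} - stopwords)
    cutoff = min_fraction * len(sentences)
    return sorted(w for w, c in counts.items() if c >= cutoff)

cluster = [
    "He sat on the river bank",
    "The river bank was muddy",
    "They walked along the bank of the river",
]
print(cluster_labels(cluster))  # ['bank', 'river']
```

For the "bank" example above, the shared words "bank" and "river" would label a cluster of riverside contexts, distinguishing it from, say, a cluster of financial contexts.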

Visualization created by Google PAIR. See the blog post for more details.