Everyday Embeddings

Mar 27, 2018   •   Noon van der Silk

If you’ve ever spent any time around someone interested in deep learning or machine learning, one day they’ll say the word embeddings.

Generally, what we mean by an embedding is a mapping between two objects that preserves the structure.

Typically we dive into technical definitions of embeddings, but I thought here I’d give a few examples of everyday embeddings: these are ones that you will be familiar with and use regularly. By thinking of them, you will gain intuition into how embeddings work.

A few everyday embeddings

Example 1

Food -> A bite of that food

We would call this an embedding of the dish into the bite. By having a bite, you get an idea of the taste of the entire dish. Things that taste similar when you bite them have similar characteristics as dishes (i.e. a bite of a salad sandwich tastes sort of similar to a bite of a salad bagel, but quite different to a bite of a pancake with ice cream).

Example 2

Natural scene -> Photo

Consider a photo of a natural scene. It's not the scene itself (that has long passed), but it has similar visual properties to the original. A photo of horses in a field captures the sunshine, the mood of the horses, and the state of the trees and grass at the time.

Example 3

Thoughts -> Words

Still I Rise

You may write me down in history
With your bitter, twisted lies,
You may trod me in the very dirt
But still, like dust, I'll rise.

Does my sassiness upset you?
Why are you beset with gloom?
’Cause I walk like I've got oil wells
Pumping in my living room.

Just like moons and like suns,
With the certainty of tides,
Just like hopes springing high,
Still I'll rise.

Did you want to see me broken?
Bowed head and lowered eyes?
Shoulders falling down like teardrops,
Weakened by my soulful cries?

Does my haughtiness offend you?
Don't you take it awful hard
’Cause I laugh like I've got gold mines
Diggin’ in my own backyard.

You may shoot me with your words,
You may cut me with your eyes,
You may kill me with your hatefulness,
But still, like air, I’ll rise.

Does my sexiness upset you?
Does it come as a surprise
That I dance like I've got diamonds
At the meeting of my thighs?

Out of the huts of history’s shame
I rise
Up from a past that’s rooted in pain
I rise
I'm a black ocean, leaping and wide,
Welling and swelling I bear in the tide.

Leaving behind nights of terror and fear
I rise
Into a daybreak that’s wondrously clear
I rise
Bringing the gifts that my ancestors gave,
I am the dream and the hope of the slave.
I rise
I rise
I rise.

There is clearly some concept Maya Angelou is able to get across to us here. This particular set of words is just one way of realising the thoughts that she was trying to convey. We could imagine slight variations in the words she uses here that would still get across the idea, or similar ideas. Words are one way of embedding and communicating our ideas.

Example 4

Photos -> Paintings

Just as we can embed a natural scene by taking a photo, we can embed it in a painting; perhaps from the photo itself. In either case, the effect is the same: we retain some visual characteristics.

Example 5

Person -> Collection of clothes

Imagine just looking at all the outfits Prince wore over his lifetime. This would be informative in some sense about the nature of Prince himself.

Mappings that aren’t embeddings?

Hopefully the above few examples have prompted you to think of some of your own everyday embeddings.

What did you come up with?

To test the boundaries of this concept, let’s try and pick some things that aren’t embeddings. Here’s the first thing that came to my mind as an attempt:

People -> Phone Numbers

The question is: Does the phone number retain any characteristics that I have? Certainly it doesn’t visually look like me; I don’t have a huge personal relationship with it, and I don’t think that if anyone actually had it they would be able to know large amounts of information about me from the number alone.

However, there is some data in the number. If I write it in international form (+61 413 … …) it indicates that I’m in Australia. Therefore, if we were to plot mobile numbers in international form on a map of the earth by their country, we could use this mapping to determine what country the given person is (probably) in. The fact that it starts with 0413 also probably tells us something about the time when I purchased my phone number.
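To make the point concrete, here's a minimal sketch of the idea that a phone number "embeds" at least the country of its owner. The prefix table is a tiny hand-picked sample for illustration, not a complete dataset.

```python
# A phone number preserves some structure about the person: at minimum,
# the international prefix tells us their (probable) country.
# NOTE: this prefix table is a small illustrative sample, not exhaustive.
COUNTRY_PREFIXES = {
    "+61": "Australia",
    "+64": "New Zealand",
    "+44": "United Kingdom",
    "+1": "United States/Canada",
}

def country_of(number: str) -> str:
    """Recover the country component preserved by the mapping."""
    # Check longer prefixes first so "+61" matches before "+6" would.
    for prefix in sorted(COUNTRY_PREFIXES, key=len, reverse=True):
        if number.startswith(prefix):
            return COUNTRY_PREFIXES[prefix]
    return "unknown"

print(country_of("+61 413 000 000"))  # Australia
```

The mapping throws away almost everything about the person, but not quite everything: that residual structure is what makes it an embedding rather than a meaningless relabelling.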

So while this mapping doesn’t have a lot of information in it, it does have some, so I would still call it an embedding.

In fact, it seems that by our earlier definition of embedding (a structure-preserving map), basically all mappings we typically think of are embeddings. Of course, we could come up with purposefully meaningless mappings that throw away all the structure, but those aren't particularly natural.

Why are embeddings interesting?

The examples listed above are useful in our everyday life.

We can take a tentative bite of a new meal, and get an idea of whether or not we want to continue eating.

We can take a memory of a natural scene and keep it forever; using it to prompt the memories and feelings we had at the time.

We can have thoughts and communicate them to other people.

We can express ourselves through our outfits.

In general, each embedding we’ve seen here improves our life in some way. It allows us to communicate complicated concepts succinctly, to group things together, and to save rich experiences.

Embeddings in deep learning perform exactly the same task: they let us represent some piece of data by a meaning-preserving (typically smaller) piece of data. With this second piece of data, typically called an embedding vector, we can perform operations that preserve the meaning inherent in the original data.
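As a toy sketch of what "meaning-preserving operations on embedding vectors" means: below, each dish from our first example is represented by a small hand-made vector (the numbers are invented for illustration, not learned by any model), and cosine similarity on the vectors mirrors similarity between the dishes.

```python
import math

# Toy, hand-invented "embedding vectors": each dish is described by a
# few features, e.g. roughly [savoury, sweet, has-bread].
embeddings = {
    "salad sandwich": [0.9, 0.1, 1.0],
    "salad bagel":    [0.8, 0.2, 1.0],
    "pancake":        [0.1, 0.9, 0.3],
}

def cosine(a, b):
    """Cosine similarity: a standard way to compare embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Similar dishes get similar vectors, so their similarity score is higher.
s_bagel = cosine(embeddings["salad sandwich"], embeddings["salad bagel"])
s_pancake = cosine(embeddings["salad sandwich"], embeddings["pancake"])
print(s_bagel > s_pancake)  # True
```

In real deep learning systems the vectors are learned from data rather than written by hand, but the principle is the same: nearby vectors stand for similar things.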

Here are a couple of technical links going into these ideas in the context of deep learning:
