The meaning/implication of the matrices generated by Singular Value Decomposition (SVD) for Latent Semantic Analysis (LSA)

StackOverflow https://stackoverflow.com/questions/20988969

Question

SVD is used in LSA to extract latent semantic information, but I am confused about how to interpret the resulting SVD matrices.

We first build a document-term matrix and then use SVD to decompose it into three matrices.

For example:

The doc-term matrix M1 is M x N, where:

M = the number of documents
N = the number of terms

M1 is then decomposed as:

M1 = M2 * M3 * M4, where:

M2: M x k

M3: k x k

M4: k x N
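
For concreteness, here is a minimal sketch of that decomposition in NumPy, using a hypothetical 4-document by 5-term count matrix (the variable names M1–M4 mirror the notation above). A real LSA pipeline would typically apply tf-idf weighting and a sparse, truncated SVD, but the shapes come out the same:

```python
import numpy as np

# Hypothetical toy document-term matrix: 4 documents (rows) x 5 terms (columns)
M1 = np.array([
    [2., 1., 0., 0., 0.],
    [1., 2., 0., 0., 0.],
    [0., 0., 1., 2., 1.],
    [0., 0., 2., 1., 1.],
])

# Thin SVD: M1 = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(M1, full_matrices=False)

k = 2                    # number of latent dimensions to keep
M2 = U[:, :k]            # M x k  (documents expressed in the latent space)
M3 = np.diag(s[:k])      # k x k  (diagonal matrix of the top k singular values)
M4 = Vt[:k, :]           # k x N  (terms expressed in the latent space)

print(M2.shape, M3.shape, M4.shape)          # (4, 2) (2, 2) (2, 5)

# Keeping all singular values reconstructs M1 exactly; keeping only the top k
# gives the best rank-k approximation of M1.
print(np.allclose(M1, U @ np.diag(s) @ Vt))  # True
```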

I have seen it interpreted as follows:

The k columns of M2 stand for categories of similar semantics, and the k rows of M4 stand for topics.
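
To see why that reading is plausible (this is only an illustration of the claim, not a proof), you can inspect the rows of M4 directly: each row assigns a weight to every term, and the terms with the largest weights in a row tend to form a coherent group. A self-contained sketch with a hypothetical five-word vocabulary:

```python
import numpy as np

terms = ["cat", "kitten", "car", "engine", "wheel"]   # made-up vocabulary
M1 = np.array([
    [2., 1., 0., 0., 0.],
    [1., 2., 0., 0., 0.],
    [0., 0., 1., 2., 1.],
    [0., 0., 2., 1., 1.],
])

U, s, Vt = np.linalg.svd(M1, full_matrices=False)
M4 = Vt[:2, :]   # keep k = 2 latent dimensions

# For each latent dimension, list the terms carrying the largest absolute weight.
for dim, row in enumerate(M4):
    top = np.argsort(-np.abs(row))[:2]
    print(f"dimension {dim}:", ", ".join(terms[i] for i in top))
# One dimension loads mostly on "cat"/"kitten", the other on the car-related
# terms -- which is why each latent dimension gets read as a "topic".
```

The columns of M2 play the symmetric role for documents along the same k dimensions.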

My questions are:

  1. Why is k interpreted this way? How do we know the dimensions correspond to similar semantics and topics?

  2. Why do the "similar semantics" equal the "topics"?

  3. Why is k interpreted differently between M2 and M4?

  4. How should M3 be interpreted?

I am really confused. The interpretation seems totally arbitrary. Is that what "latent" is supposed to mean?

Solution

I warmly recommend reading the information retrieval chapter in the SNLP bible (Foundations of Statistical Natural Language Processing) by Manning and Schütze. In five pages it explains everything you want to know about LSI and SVD.

You will find paragraphs there that address exactly these questions.

(The original answer quoted an excerpt from the chapter as an image, which is not reproduced here.)
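
To make the chapter's point concrete (toy data, not an excerpt from the book): the diagonal of M3 holds the singular values, which measure how much of the original matrix each latent dimension accounts for, and keeping only the top k of them yields the best rank-k approximation of M1. That smoothing is where the "latent" behaviour appears: terms that never co-occur directly can still become associated because they share a latent dimension.

```python
import numpy as np

# Hypothetical counts: the first document mentions "cat" but never "kitten",
# the second the reverse; both mention "pet". The third is about cars.
terms = ["cat", "kitten", "pet", "car", "engine"]
M1 = np.array([
    [2., 0., 1., 0., 0.],
    [0., 2., 1., 0., 0.],
    [0., 0., 0., 2., 1.],
])

U, s, Vt = np.linalg.svd(M1, full_matrices=False)
print(s)   # singular values; M3's diagonal keeps the top k of these

k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation

# In the rank-k reconstruction, the "cat" document picks up weight on
# "kitten" (and vice versa), because both co-occur with "pet" and the shared
# latent dimension ties them together.
print(np.round(approx, 2))
```

So M3 is not arbitrary: it orders the latent dimensions by importance, and the choice of k decides how much of that structure is kept.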

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow