Latent semantic analysis


Latent semantic analysis (LSA) is a technique in natural language processing, in particular in vectorial semantics, for analyzing relationships between a set of documents and the terms they contain by producing a set of concepts related to the documents and terms. Latent semantic indexing (LSI) is the application of LSA to document indexing and retrieval.

LSA uses a term-document matrix that describes the occurrences of terms in documents; it is a sparse matrix whose rows correspond to terms, typically stemmed words that appear in the documents, and whose columns correspond to documents. A common weighting of the matrix elements is tf-idf (term frequency–inverse document frequency): each element is proportional to the number of times the term appears in the document, with rare terms upweighted to reflect their relative importance.
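As a rough sketch of this weighting, the following builds a toy term-document count matrix (the terms and documents are invented for illustration) and applies a simple tf-idf scheme with numpy; real systems vary in how they normalize tf and smooth idf.

```python
import numpy as np

# Toy term-document count matrix: rows are terms, columns are documents.
# The terms ("ship", "ocean", "voyage") are illustrative assumptions.
counts = np.array([
    [2, 0, 1],   # "ship"
    [0, 3, 0],   # "ocean"
    [1, 1, 1],   # "voyage"
], dtype=float)

n_docs = counts.shape[1]

# Term frequency: raw count normalized by each document's total term count.
tf = counts / counts.sum(axis=0, keepdims=True)

# Inverse document frequency: log(#documents / #documents containing the
# term); terms appearing in fewer documents receive larger weights.
df = (counts > 0).sum(axis=1)
idf = np.log(n_docs / df)

# Elementwise tf-idf weighting of the matrix.
tfidf = tf * idf[:, None]
```

Note that a term occurring in every document (here "voyage") gets an idf of log(1) = 0, so its tf-idf weights vanish: it carries no discriminating information.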

LSA transforms the occurrence matrix into a relation between the terms and some concepts, and a relation between those concepts and the documents. Thus the terms and documents are now indirectly related through the concepts.
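A minimal sketch of this transformation, using a truncated singular value decomposition (the standard construction behind LSA) on the same kind of toy matrix as above:

```python
import numpy as np

# Toy term-document matrix (rows = terms, columns = documents).
X = np.array([
    [2, 0, 1],
    [0, 3, 0],
    [1, 1, 1],
], dtype=float)

# SVD factors X into U (term-to-concept), s (concept strengths),
# and Vt (concept-to-document).
U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 2  # keep only the k strongest latent concepts
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Rank-k reconstruction: terms and documents are now related only
# indirectly, through the k retained concepts.
X_k = U_k @ np.diag(s_k) @ Vt_k
```

The rows of `U_k` give each term's coordinates in concept space, and the columns of `Vt_k` give each document's; comparing those low-rank vectors (e.g. by cosine similarity) is how LSA relates terms and documents through concepts.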


See also: Latent Semantic Indexing

