© 2008-23 SmartData Collective. All Rights Reserved.

Why Does Latent Semantic Analysis Work?

Daniel Tunkelang
Last updated: 2008/11/01 at 10:02 PM
4 Min Read


Warning to regular readers: this post is a bit more theoretical than average for this blog.

Peter Turney has a nice post today about “SVD, Variance, and Sparsity”. It’s actually a follow-up to a post last year entitled “Why Does SVD Improve Similarity Measurement?” that apparently has remained popular despite its old age in blog years.

For readers unfamiliar with singular value decomposition (SVD), I suggest a brief detour to the Wikipedia entry on latent semantic analysis (also known as latent semantic indexing). In a nutshell, latent semantic analysis is an information retrieval technique that applies SVD to the term-document matrix of a corpus in order to reduce this sparse, high-dimensional matrix to a denser, lower-dimensional matrix whose dimensions correspond to the “latent” topics in the corpus.
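The mechanics can be sketched in a few lines of NumPy. The term-document matrix below is hypothetical toy data, just to show the shape of the computation:

```python
import numpy as np

# A toy term-document count matrix: rows are terms, columns are documents.
# (Hypothetical data, purely illustrative.)
A = np.array([
    [2, 0, 1, 0],   # "data"
    [1, 0, 2, 0],   # "mining"
    [0, 3, 0, 1],   # "music"
    [0, 1, 0, 2],   # "guitar"
], dtype=float)

# Full SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top-k singular values ("latent topics").
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# A_k is the best rank-k approximation of A in the least-squares sense;
# documents are now compared as columns of the k-dimensional representation.
doc_vectors = np.diag(s[:k]) @ Vt[:k, :]
print(doc_vectors.shape)  # (2, 4): four documents in two latent dimensions
```

Real implementations use a sparse truncated SVD rather than the dense decomposition above, but the idea is the same.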

Back to Peter’s thesis. He’s observed that document similarity is more accurate in the lower-dimensional vector space produced by SVD than in the space defined by the original term-document matrix. This isn’t immediately obvious; after all, SVD is a lossy approximation of the term-document matrix, so you might expect accuracy to decrease.
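A toy case makes the effect concrete (hypothetical data): two documents share no terms, so their raw cosine similarity is zero, yet an indirect association through a third document makes them similar after rank reduction:

```python
import numpy as np

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Terms x documents: "car" and "auto" never co-occur,
# but both appear in the third document.
A = np.array([
    [1.0, 0.0, 1.0],  # "car"
    [0.0, 1.0, 1.0],  # "auto"
])

d1, d2 = A[:, 0], A[:, 1]
print(cosine(d1, d2))  # 0.0 -- no shared terms

# Project the documents onto the top singular dimension.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
docs_k = Vt[:1, :]  # 1-D latent representation of each document
print(cosine(docs_k[:, 0], docs_k[:, 1]))  # 1.0 -- similar in the latent space
```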

In his 2007 post, Peter offers three hypotheses for why SVD improves the similarity measure:

  1. High-order co-occurrence: Dimension reduction with SVD is sensitive to high-order co-occurrence information (indirect association) that is ignored by PMI-IR and cosine similarity measures without SVD. This high-order co-occurrence information improves the similarity measure.
  2. Latent meaning: Dimension reduction with SVD creates a (relatively) low-dimensional linear mapping between row space (words) and column space (chunks of text, such as sentences, paragraphs, or documents). This low-dimensional mapping captures the latent (hidden) meaning in the words and the chunks. Limiting the number of latent dimensions forces a greater correspondence between words and chunks. This forced correspondence between words and chunks (the simultaneous equation constraint) improves the similarity measurement.
  3. Noise reduction: Dimension reduction with SVD removes random noise from the matrix (it smooths the matrix). The raw matrix contains a mixture of signal and noise. A low-dimensional linear model captures the signal and removes the noise. (This is like fitting a messy scatterplot with a clean linear regression equation.)
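The noise-reduction hypothesis is easy to demonstrate numerically. Here is a sketch with synthetic data: a true rank-2 matrix plus Gaussian noise, where the rank-2 SVD fit ends up closer to the clean signal than the raw noisy matrix is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: a true rank-2 "signal" matrix plus random noise.
signal = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 40))
noisy = signal + 0.1 * rng.normal(size=signal.shape)

# Fit the best rank-2 approximation of the noisy matrix.
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = U[:, :2] @ np.diag(s[:2]) @ Vt[:2, :]

err_noisy = np.linalg.norm(noisy - signal)        # error of the raw matrix
err_denoised = np.linalg.norm(denoised - signal)  # error of the rank-2 fit
print(err_denoised < err_noisy)  # the low-rank fit recovers the signal better
```

This is the matrix analogue of the messy-scatterplot-versus-regression-line picture in the hypothesis above.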

In today’s follow-up post, he drills down on this third hypothesis, noting that noise can come from either variance or sparsity. He then proposes independently adjusting the sparsity-smoothing and variance-smoothing effects of SVD to split this third hypothesis into two sub-hypotheses.

I haven’t done much work with latent semantic analysis. But work that I’ve done with other statistical information retrieval techniques, such as using Kullback-Leibler divergence to measure the signal of a document set, suggests a similar benefit from preprocessing steps that reduce noise. Now I’m curious about the relative benefits of variance vs. sparsity reduction.
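For the curious, the kind of measure alluded to here can be sketched as the KL divergence between a document set’s term distribution and the background corpus distribution; a topically focused set diverges more from the background, i.e. carries more signal. The distributions below are hypothetical, and this is the general idea rather than any exact published method:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), assuming q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical term distributions over a 4-word vocabulary.
background = np.array([0.4, 0.3, 0.2, 0.1])    # the whole corpus
focused    = np.array([0.05, 0.05, 0.1, 0.8])  # a topically focused document set
diffuse    = np.array([0.4, 0.3, 0.2, 0.1])    # a set that mirrors the corpus

print(kl_divergence(focused, background))  # large: strong signal
print(kl_divergence(diffuse, background))  # 0.0: indistinguishable from background
```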

Link to original post
