Data Mining / Data Warehousing

Gathering Information on a Global Scale

GCerami
Last updated: 2012/01/05 at 2:09 PM
4 Min Read

The financial services industry is an ever-evolving whirlwind of information. Stock prices, trades and analyst ratings are critical components of an organization's ability to succeed in this ultra-competitive market. Timing is everything in the financial world: organizations that cannot keep up with the unyielding flow of data risk making critical decisions based on outdated information, a misstep that could irrevocably tarnish their firm's brand.

FactSet, which has gathered and published company and financial data from around the globe since 1978, collects data on tens of thousands of private and public entities and delivers the information to over 48,000 users of its software products, data and publications. While most internal market intelligence operations monitor a limited set of companies or individuals of interest to their organization, FactSet must monitor every source, including web sites, traditional media, government bodies and registries, for every company.

FactSet needs to deliver a consistent, reliable, high-quality data product at high volume, continuously. For that reason, it recently looked for a new way to automate its data gathering process to keep up with the growing, fast-changing flood of information on the Web.

FactSet needed to capture changes to Web sites as soon as they occurred, but at the scale at which it operates, it wanted only substantive changes; otherwise it would be inundated by alerts for cosmetic changes in layout or format. After considering a number of solutions, FactSet chose one that automates Web site monitoring and data gathering and raises alerts as data changes occur. FactSet analysts are able to calibrate the tool to gather specific data on any Web site, using a simple interface that requires no coding.
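
The article does not describe the tool's internals, but the core idea of flagging only substantive changes can be sketched: instead of diffing raw HTML, hash the page's normalized visible text so that layout and formatting tweaks do not trigger alerts. This is a minimal illustration in Python; the URL, polling interval and libraries (requests, BeautifulSoup) are assumptions for the example, not FactSet's actual stack.

```python
# Minimal sketch, not FactSet's tooling: detect substantive page changes by
# hashing normalized visible text, so cosmetic markup changes are ignored.
import hashlib
import time

import requests                # assumed dependency
from bs4 import BeautifulSoup  # assumed dependency

def content_fingerprint(html: str) -> str:
    """Hash only the visible text so layout/format tweaks don't raise alerts."""
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    normalized = " ".join(text.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def monitor(urls, interval_seconds=3600):
    """Poll each URL and report when its substantive content changes."""
    last_seen = {}
    while True:
        for url in urls:
            html = requests.get(url, timeout=30).text
            fingerprint = content_fingerprint(html)
            if url in last_seen and last_seen[url] != fingerprint:
                print(f"Substantive change detected: {url}")  # hand off to analysts
            last_seen[url] = fingerprint
        time.sleep(interval_seconds)

if __name__ == "__main__":
    monitor(["https://example.com/company-filings"])  # placeholder source
```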

FactSet leveraged the Web data monitoring and extraction tool to dramatically increase the number of companies it could monitor with the same number of people. The percentage of valid hits increased from 35% to over 90% because the tool aggregated only the hits that mattered. The immediacy of the information (changes to Web sites were detected as they were posted) enabled FactSet to deliver fresher information to its clientele. Information collection has become a continuous process for each site rather than an annual update. More importantly, FactSet's team could spend more time on high-value operations like research and quality assurance rather than on repetitive, time-consuming collection.
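
One plausible way a valid-hit rate can jump like this is field-level extraction: an alert fires only when the specific fields of interest change, not when anything on the page does. The sketch below illustrates the idea with hand-written CSS selectors; the site, field names and selectors are invented for the example (the tool described in the article exposes this kind of calibration through a no-code interface).

```python
# Minimal sketch, assuming requests and BeautifulSoup: extract only the fields
# of interest from a page and report a "hit" only when one of them changes.
import requests
from bs4 import BeautifulSoup

# Hypothetical site and selectors, chosen purely for illustration.
EXTRACTION_RULES = {
    "https://example.com/investor-relations": {
        "company_name": "h1.company-name",
        "latest_filing": "table#filings tr:nth-of-type(1) td.title",
    },
}

def extract_fields(url: str, rules: dict) -> dict:
    """Pull the configured fields from one page."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    record = {}
    for field, selector in rules.items():
        node = soup.select_one(selector)
        record[field] = node.get_text(strip=True) if node else None
    return record

def changed_fields(previous: dict, current: dict) -> dict:
    """Return only the fields whose extracted values actually differ."""
    return {key: value for key, value in current.items() if previous.get(key) != value}

if __name__ == "__main__":
    for url, rules in EXTRACTION_RULES.items():
        snapshot = extract_fields(url, rules)
        print(url, snapshot)  # a real pipeline would persist and compare snapshots
```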

In summary, FactSet finds that the investment in Web data extraction and monitoring tools has been worthwhile. FactSet has been able to deliver a timelier, greatly expanded, consistent, reliable, high-quality product to its customers without having to hire additional employees.

TAGGED: big data, data aggregation, data warehousing