Gathering Information on a Global Scale

The financial services industry is an ever-evolving whirlwind of information. Stock prices, trades and analyst ratings are critical components in organizations’ ability to be successful within this ultra-competitive market. Timing is everything in the financial world as organizations that find themselves unable to keep up with the unyielding data flow run the risk of making critical decisions based on outdated information – an action that could irrevocably tarnish their firm’s brand.

FactSet, which has gathered and published company and financial data from around the globe since 1978, collects data on tens of thousands of private and public entities and delivers that information to over 48,000 users of its software products, data and publications. While most internal market-intelligence operations monitor a limited set of companies or individuals of interest to their organization, FactSet must monitor every source, including Web sites, traditional media, government sources and registries, for every company.

FactSet must continually deliver a consistent, reliable, high-quality data product at high volume. For that reason, it recently looked for a new way to automate its data-gathering process to keep up with the growing, fast-changing flood of information on the Web.

FactSet needed to capture changes to Web sites as soon as they occurred, but at the scale at which it operates, it wanted only substantive changes; otherwise it would be inundated by cosmetic changes in layout or format. After considering a number of options, FactSet chose a solution that automates Web site monitoring and data gathering and sends alerts as data changes occur. FactSet analysts can calibrate the tool to gather specific data from any Web site through a simple interface that requires no coding.
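The article does not describe the tool's internals, but the idea of alerting only on substantive changes can be sketched roughly: extract just the data fields of interest from a page and fingerprint those values, so that edits to layout or formatting never trigger an alert. The Python snippet below is an illustrative sketch only, not FactSet's solution; the URL, CSS selectors and field names are assumptions made for the example.

```python
# Illustrative sketch: a minimal monitor that alerts on substantive data changes
# while ignoring cosmetic layout edits. The URL, selectors and field names are
# hypothetical and do not describe FactSet's actual tool.
import hashlib
import time

import requests
from bs4 import BeautifulSoup

# Fields of interest, mapped to hypothetical CSS selectors.
WATCHED_FIELDS = {
    "company_name": "h1.company-name",
    "ticker": "span.ticker",
    "analyst_rating": "td.analyst-rating",
}


def extract_fields(html: str) -> dict:
    """Pull only the watched data fields out of the page, discarding layout."""
    soup = BeautifulSoup(html, "html.parser")
    fields = {}
    for name, selector in WATCHED_FIELDS.items():
        node = soup.select_one(selector)
        fields[name] = node.get_text(strip=True) if node else None
    return fields


def fingerprint(fields: dict) -> str:
    """Hash the extracted values so cosmetic HTML changes do not register."""
    canonical = "|".join(f"{k}={fields[k]}" for k in sorted(fields))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def monitor(url: str, interval_seconds: int = 300) -> None:
    """Poll the page and report only when a watched field actually changes."""
    last_hash = None
    while True:
        html = requests.get(url, timeout=30).text
        fields = extract_fields(html)
        current_hash = fingerprint(fields)
        if last_hash is not None and current_hash != last_hash:
            print(f"Substantive change detected at {url}: {fields}")
        last_hash = current_hash
        time.sleep(interval_seconds)


if __name__ == "__main__":
    monitor("https://example.com/company-profile")  # hypothetical page
```

Because the fingerprint is computed over extracted values rather than raw HTML, a redesign of the page's layout produces no alert, while a change to a rating or a ticker does.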

With the Web data monitoring and extraction tool, FactSet dramatically increased the number of companies it could monitor with the same number of people. The percentage of valid hits rose from 35% to over 90% because the tool surfaced only the hits that mattered. Because changes to Web sites were detected as soon as they were posted, FactSet could deliver fresher information to its clientele, and collection became a continuous process for each site rather than an annual update. More importantly, FactSet's team could spend more time on high-value work such as research and quality assurance rather than repetitive, time-consuming collection.

In summary, FactSet finds that the investment in Web data extraction and monitoring tools has been worthwhile. The company has been able to deliver a timelier, greatly expanded, consistent, reliable and high-quality product to its customers without having to hire additional employees.
