Web Crawling Automation

Editor SDC

Apart from the various ways you can use Perl or other scripting languages for automated web crawling, this is a relatively low-technology solution for people who want to download web pages or web data. Some people also call this web scraping.

 

The first method uses the RCurl package (this approach comes from the R-Help archives).

The R-help list archive is available at http://www.nabble.com/R-help-f13820.html.

 

> library(RCurl)
> my.url <- "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"
> getURL(my.url)
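
Once a page is fetched, you will usually want to keep it. A minimal sketch of one way to extend this (the URLs and file names below are placeholders, not from the original post):

# Fetch several pages and save the raw HTML of each one to disk.
library(RCurl)
urls <- c("http://www.example.com/", "http://www.example.org/")  # placeholder URLs
pages <- sapply(urls, function(u) getURL(u))
for (i in seq_along(urls)) {
  # Write each page to its own file, e.g. page_1.html, page_2.html, ...
  writeLines(pages[i], paste0("page_", i, ".html"))
}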

A variation is the following line of code, which tells RCurl to follow HTTP redirects:

getURL(my.url, followlocation = TRUE)

To watch the information being sent from R to the server and received back from it, turn on verbose output:

getURL(my.url, verbose = TRUE)
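
These curl options can also be combined in a single call; for example (the user-agent string below is just an illustrative value):

# Follow redirects, log the exchange, and identify the client to the server.
getURL(my.url, followlocation = TRUE, verbose = TRUE, useragent = "R-RCurl")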

The second method uses the RDCOMClient package in R, which drives Internet Explorer through COM (and therefore works only on Windows):

> library(RDCOMClient)
> my.url <- "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"
> ie <- COMCreate("InternetExplorer.Application")
> txt <- list()
> ie$Navigate(my.url)
NULL
> while(ie[["Busy"]]) Sys.sleep(1)
> txt[[my.url]] <- ie[["document"]][["body"]][["innerText"]]
> txt
$`http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2`

[1] "Skip to article Try Electronic Edition Log …
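
The same pattern scales to several pages; a rough sketch, assuming Windows with RDCOMClient installed (the URLs are placeholders):

# Reuse one Internet Explorer instance to collect the text of many pages.
library(RDCOMClient)
urls <- c("http://www.example.com/", "http://www.example.org/")  # placeholder URLs
ie <- COMCreate("InternetExplorer.Application")
txt <- list()
for (u in urls) {
  ie$Navigate(u)
  while (ie[["Busy"]]) Sys.sleep(1)  # wait for the page to finish loading
  txt[[u]] <- ie[["document"]][["body"]][["innerText"]]
}
ie$Quit()  # close the IE instance when done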

The third way (a personal favorite) is the Firefox add-on iMacros from www.iopus.com, which is handy if you need to extract large amounts of data and copy and paste it into text and Excel files. The add-on works much the same way as the Record Macro feature in Excel, with the difference that it records all the clicks, downloads, URLs, and so on from the browser.

It can even automate website testing and data-entry tasks.

While the Firefox add-on is free, the Internet Explorer version costs 49 USD.

Source: www.decisionstats.com. Tags: R, automation, web scraping, web crawler

 
