SAS Admin: Process Data Faster in RDBMS by Buffering the Data in Memory

Posted by Tricia Aanderud | October 3, 2012

Contributed by Stephen Overton to BI Notes

By default, accessing third-party relational databases from SAS can be very slow if the connection is not configured properly. I recently started using PostgreSQL 9.1, an open-source database, to store high volumes of data for my SAS Global Forum 2013 paper. At first it was taking forever to load data because SAS was inserting one row at a time into the database table. After adding a simple option, my data processing was off to the races!

Buffering Options

The SAS INSERTBUFF and READBUFF options can dramatically improve the performance of ODBC and OLE DB libraries. For ODBC connections, they default to 1 and 250 rows respectively. Other third-party databases, such as Oracle, DB2, or MS SQL Server, will probably benefit as well, but I have not been able to test them. These buffer sizes tell SAS how many rows to hold in memory before processing.

Using the LIBNAME Option

These options can be added to the LIBNAME statement to set the buffer sizes for all processing done on tables within the library. Ideally, if you have a SAS Metadata Server running, your SAS administrator should set these options through the Data Library Manager in SAS Management Console.


If you are using Base SAS or writing code in SAS Enterprise Guide, you can also write the LIBNAME statement manually:

LIBNAME pgsgf13 ODBC DBCOMMIT=10000 READBUFF=30000 INSERTBUFF=30000 DATASRC=sasgf13 SCHEMA=public;

Be sure to check out SAS support for more information on the INSERTBUFF and READBUFF options for the LIBNAME statement.
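To verify that buffering is actually taking effect, you can turn on SAS/ACCESS tracing, which writes each call to the database into the SAS log. Here is a minimal sketch (SASTRACE and SASTRACELOC are standard SAS system options; the trace_test table is hypothetical and reuses the library defined above):

options sastrace=',,,d' sastraceloc=saslog nostsuffix; /* log all DBMS calls to the SAS log */

data pgsgf13.trace_test; /* hypothetical table in the buffered library */
   do i = 1 to 1000;
      output;
   end;
run;

With INSERTBUFF in effect, the log should show rows being sent in batches rather than one INSERT statement per row.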

Using a SAS Data Set Option

You can also explicitly define these buffer options for an individual data step in your code. This may come in handy depending on the type, size, and width of the data you plan on inserting.

LIBNAME pgsgf13 ODBC DATASRC=sasgf13 SCHEMA=public;

data pgsgf13.test (dbcommit=500000 insertbuff=10000 readbuff=10000);
   *** DATA STEP STUFF ****;
run;

Be sure to check out SAS support for more information on the INSERTBUFF and READBUFF data set options.
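These data set options are not limited to the data step; they also apply to procedures that write to a database library. For example, a hypothetical bulk load with PROC APPEND might look like this (work.staged is an illustrative staging table, not from the original example):

proc append base=pgsgf13.test (dbcommit=500000 insertbuff=10000) /* buffered insert */
            data=work.staged;                                    /* staging table */
run;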


Considerations

Careful consideration is required when setting these options. The optimal values depend on your SAS compute server resources and network capacity. The number of rows to buffer should be much lower for very wide tables with lots of character data, because of the physical byte size of character columns and the overall width of the table. In my project I am using very skinny fact tables with numeric data, which requires only 8 bytes per numeric column. Assuming I have 10 numeric columns, that is only about 80 bytes of data per row. For my data step, which inserts a huge volume of data, I could theoretically set INSERTBUFF to something like 1,000,000 rows, but SAS has a hard limit of approximately 32,000 rows that it can buffer in memory.
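As a quick sanity check before choosing a buffer size, you can estimate the memory one buffered batch will consume as rows times bytes per row. A minimal sketch using the figures above (10 numeric columns at 8 bytes each, a buffer of 30,000 rows as in the LIBNAME example):

data _null_;
   row_bytes = 10 * 8;                       /* 10 numeric columns x 8 bytes each */
   buff_rows = 30000;                        /* rows buffered per batch */
   batch_kb  = row_bytes * buff_rows / 1024; /* memory used per buffered batch */
   put 'Approximate memory per batch: ' batch_kb comma12.1 ' KB';
run;

For a skinny numeric table like this, one batch is only about 2.3 MB, so a large buffer is cheap; a wide table full of character columns would consume memory far faster per row.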


Tags: metadata, sas