SmartData Collective > Data Management > SAS Admin: Process Data Faster in RDBMS by Buffering the Data in Memory
Data Management

SAS Admin: Process Data Faster in RDBMS by Buffering the Data in Memory

Tricia Aanderud
4 Min Read

Contributed by Stephen Overton to BI Notes

Contents
  • Buffering Options
  • Using the LIBNAME Option
  • Using a SAS Data Set Option
  • Considerations

By default, access to third-party relational databases can be very slow if the connection is not configured properly. I recently started using PostgreSQL 9.1, an open-source database, to store high volumes of data for my SAS Global Forum 2013 paper. At first, loading data took forever because SAS was inserting one row at a time into the database table. After I added a simple option, my data processing was off to the races!

Buffering Options

The SAS INSERTBUFF and READBUFF options can dramatically improve the performance of ODBC and OLE DB libraries. By default, these are set to 1 row and 250 rows respectively for ODBC connections. Other third-party databases, such as Oracle, DB2, or MS SQL Server, will probably benefit as well, but I have not been able to test them. Setting these buffer sizes tells SAS how many rows to hold in memory before sending them to the database.
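One way to confirm whether SAS is batching rows or sending them one at a time is to turn on SAS/ACCESS tracing before running your step (a sketch; the exact trace output varies by engine and version):

```sas
/* Write the calls SAS/ACCESS makes to the DBMS into the SAS log,
   so you can see whether inserts are batched or row-by-row */
OPTIONS SASTRACE=',,,d' SASTRACELOC=SASLOG NOSTSUFFIX;
```

Remember to turn tracing back off (`OPTIONS SASTRACE=OFF;`) when you are done, since it adds noticeable overhead and log volume.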

Using the LIBNAME Option

These options can be added to the LIBNAME statement to set the buffering sizes for all processing done on tables within the library. Ideally, if you have the SAS Metadata Server running, your SAS administrator should set these options through the Data Library Manager in SAS Management Console.

If you are using Base SAS or writing code in SAS Enterprise Guide, you can also write the LIBNAME statement manually, like this:

LIBNAME pgsgf13 ODBC
  DBCOMMIT=10000
  READBUFF=30000
  INSERTBUFF=30000
  DATASRC=sasgf13
  SCHEMA=public;

Be sure to check out SAS support for more information on the INSERTBUFF and READBUFF options for the LIBNAME statement.

Using a SAS Data Set Option

You can also set these buffer options explicitly for an individual DATA step in your code. This can come in handy depending on the type, size, and width of the data you plan to insert.

LIBNAME pgsgf13 ODBC DATASRC=sasgf13 SCHEMA=public;

data pgsgf13.test(dbcommit=500000 insertbuff=10000 readbuff=10000);
  /* data step logic */
run;

Be sure to check out SAS support for more information on using the INSERTBUFF and READBUFF options as data set options.

Considerations

Take care when setting these options. The optimal values depend on your SAS compute server resources and network capacity. The number of rows to buffer should be much smaller for very wide tables with lots of character data, because character columns add to the physical byte size of each row and the overall width of the table. In my project I am using very skinny fact tables with numeric data, which requires only 8 bytes per numeric column. Assuming I have 10 numeric columns, that's only about 80 bytes of data per row. For my DATA step, which inserts a huge volume of data, I could theoretically set INSERTBUFF to something like 1,000,000 rows, but SAS has a hard limit of approximately 32,000 rows it can buffer in memory :-) .
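The back-of-the-envelope sizing above can be sketched as a quick DATA _NULL_ step (an illustration only; the column count and buffer size are hypothetical values, not recommendations):

```sas
data _null_;
  /* hypothetical skinny fact table: 10 numeric columns, 8 bytes each */
  row_bytes   = 10 * 8;
  buffer_rows = 30000;
  /* approximate memory one read/insert buffer of that size would hold */
  buffer_mb   = (row_bytes * buffer_rows) / 1024**2;
  put 'Bytes per row: ' row_bytes ' | Buffer memory: ' buffer_mb 6.2 ' MB';
run;
```

For this shape of table, even a 30,000-row buffer is only a couple of megabytes, which is why skinny numeric tables tolerate large buffer settings so well.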

Related content:

  1. Web Report Studio: Adding a Confidentiality Disclaimer
  2. SAS Enterprise Guide: Updating the Metadata with New/Modified Datasets
  3. Administration: Cleaning Up the WORK Library Automatically in UNIX
  4. Administration: Fall in Love with JBoss Again by Configuring the JGroup Bind Address
  5. SAS Code: Simple Macro to Benchmark Data Performance

The post SAS Administration: Process Data Faster in RDBMS by Buffering the Data in Memory appeared first on Business Intelligence Notes for SAS® BI Users.
