SmartData Collective
© 2008-25 SmartData Collective. All Rights Reserved.
Artificial Intelligence | Exclusive | Programming

RAG – The Newest Advance in AI Is All About Context

RAG is a great new breakthrough in AI that is changing the future of contextual data management.

Ed Thompson
8 Min Read
[Image: RAG large language model. Credit: DepositPhotos / BiancoBlue]

There have been a lot of amazing developments in AI over the last few years. ChatGPT first reached the market in November 2022, a remarkable breakthrough that made headlines around the world. ChatGPT and other AI products are driving demand for software developers.

More recently, we have heard about even newer developments in AI. Just today, Microsoft announced that it is introducing AI "employees" that can handle queries.

But one of the biggest developments is the emergence of RAG. Keep reading to learn how it is shaping our future.

RAG Is the Newest Shiny Toy in AI

When we’re talking about AI, Retrieval Augmented Generation (RAG) and the like, it helps to think of an LLM as a person.


We’ve all heard the phrase “Jack of all trades, master of none,” and it applies to large language models (LLMs). In their default form, LLMs are generalists. IBM has a great overview of them.

If you want an LLM to participate in a business and either create productive output or make decisions – to move beyond generalist – you need to teach it about your business, and you need to teach it a lot! The list is long, but as a baseline you need to teach it the basic skills to do the job, the organization and its processes, and the desired outcome and potential problems, and you need to feed it the context required to solve the problem at hand. You also need to provide it with the tools to either effect a change or learn more. This is one of the newest examples of how AI can help businesses.
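Those teaching layers can be pictured as sections of a single prompt. The sketch below is purely illustrative; the function and section names are assumptions, not any real API.

```python
# A minimal sketch of layering business context into one LLM prompt.
# All names here (build_prompt, the sections, the invoice example) are
# hypothetical and only illustrate the idea of stacked context.

def build_prompt(skills: str, org_process: str, goal: str,
                 task_context: str, question: str) -> str:
    """Compose the 'teaching' layers into a single prompt string."""
    sections = [
        ("Role and skills", skills),
        ("Organization and process", org_process),
        ("Desired outcome", goal),
        ("Context for this task", task_context),
        ("Question", question),
    ]
    return "\n\n".join(f"## {title}\n{body}" for title, body in sections)

prompt = build_prompt(
    skills="You are an accounts-payable analyst.",
    org_process="Invoices over $10k need a second approval.",
    goal="Flag invoices that break policy.",
    task_context="Invoice #123: $12,400, single approval.",
    question="Does this invoice comply with policy?",
)
```

In a real system each section would be filled by retrieval rather than typed by hand, which is exactly where RAG comes in.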

In this way the LLM is very like a person. When you hire someone you start by finding the skills you need, you help them to understand your business, educate them on the business process they are working within, give them targets and goals, train them on their job, and give them tools to do their job.

For people, this is all achieved with formal and informal training, as well as providing good tools. For a Large Language Model, this is achieved with RAG. So, if we want to leverage the benefits of AI in any organization, we need to get very good at RAG.

So what’s the challenge?

One of the limitations of modern Large Language Models is the amount of contextual information that can be provided for each and every task you want that LLM to perform.
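That limit is a token budget: everything you want the model to know for a task has to fit its context window. A rough sketch of trimming to a budget, under the assumption of roughly four characters per token (a crude heuristic, not any specific model's real tokenizer):

```python
# Illustrative token-budget check: context must fit the model's window,
# so the least relevant snippets are dropped first. The 4-chars-per-token
# estimate and the budget value are assumptions for demonstration only.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude heuristic: ~4 characters per token

def fit_to_budget(snippets: list[str], budget_tokens: int) -> list[str]:
    """Keep snippets in priority order until the budget is spent."""
    kept, used = [], 0
    for snippet in snippets:  # assumed pre-sorted, most relevant first
        cost = estimate_tokens(snippet)
        if used + cost > budget_tokens:
            break
        kept.append(snippet)
        used += cost
    return kept

snippets = ["A" * 40, "B" * 40, "C" * 40]  # ~10 tokens each
kept = fit_to_budget(snippets, budget_tokens=25)  # only the first two fit
```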

RAG provides that context. As such, preparing a succinct and accurate context is crucial. It’s this context that teaches the model about the specifics of your business and of the task you’re asking of it. Give an LLM the correct question and the correct context, and it will give an answer or make a decision as well as a human being (if not better).
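The retrieval step that supplies this context can be sketched in a few lines. A production RAG system would use embeddings and a vector store; simple word overlap stands in for similarity here, and the knowledge snippets are invented examples.

```python
# Illustrative retrieval step of RAG: score stored snippets against the
# question and keep only the most relevant ones as context for the LLM.
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def overlap_score(query: str, doc: str) -> int:
    # Stand-in for embedding similarity: count shared words.
    return len(tokens(query) & tokens(doc))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    return sorted(docs, key=lambda d: overlap_score(query, d), reverse=True)[:k]

knowledge = [
    "Refunds are issued within 14 days of a return request.",
    "Our office is closed on public holidays.",
    "Return requests must include the original order number.",
]
context = retrieve("How do I request a refund for a return?", knowledge)
```

The retrieved snippets, not the whole data estate, are what get packed into the prompt alongside the question.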

It’s important to make the distinction that people learn by doing; LLMs don’t learn naturally, they are static. To teach the LLM, you need to create that context, as well as a feedback loop that updates the RAG context so it does better next time.
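Because the model itself is static, the feedback loop writes corrections back into the retrieval store, not into model weights. A hedged sketch, with an invented store and stub functions in place of a real retriever and LLM call:

```python
# Sketch of the feedback loop: a reviewer's correction updates the RAG
# context store, so the next query sees better context. The store, topics,
# and prices are illustrative only.

context_store = {"pricing": "Pro plan costs $20/month."}

def answer(topic: str) -> str:
    # Stand-in for: retrieve context, then call the LLM with it.
    return f"Based on our records: {context_store.get(topic, 'no context found')}"

def record_feedback(topic: str, correction: str) -> None:
    # Human oversight: corrected facts update the context, not the model.
    context_store[topic] = correction

first = answer("pricing")
record_feedback("pricing", "Pro plan costs $25/month as of this year.")
second = answer("pricing")  # now reflects the correction
```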

How efficiently that context is curated is key: it drives the performance of the model and is directly correlated with cost. The heavier the lift to create that context, the more expensive the project becomes in both time and money.

Equally, if that context isn’t accurate, you will find yourself spending far longer correcting, tweaking and improving the model, rather than getting results straight off the bat.

This makes AI a data problem.

Creating the context needed for LLMs is hard because it needs lots of data – ideally everything your business knows that might be relevant. And then that data needs to be distilled down to the most relevant information. No mean feat in even the most data-driven organization.

In reality, most businesses have neglected large parts of their data estate for a long time, especially the less structured data designed to teach humans (and therefore LLMs) how to do the job.

LLMs and RAG are bringing an age-old problem even further to light: data exists in silos that are complicated to reach.

When you consider that we’re now looking at unstructured data as well as structured data, we’re looking at even more silos. The context needed to get value from AI means the scope of data is no longer solely about pulling numbers from Salesforce. If organizations are going to see true value in AI, they also need the training materials used to onboard humans, PDFs, call logs; the list goes on.

For organizations, starting to hand over business processes to AI is daunting, but the organizations with the best ability to curate contextual data will be best placed to achieve it.

At its core, ‘LLM + context + tools + human oversight + feedback loop’ is the formula for AI accelerating just about any business process.
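The whole formula can be sketched end to end. Everything below is a stub: `call_llm` stands in for a real model API, the policy lookup is an invented "tool", and the invoice scenario is hypothetical.

```python
# Sketch of 'LLM + context + tools + human oversight + feedback loop'.
# call_llm is a stub; in practice it would be an API request to a model.

def call_llm(prompt: str) -> str:
    # Stubbed model: decides based on the context packed into the prompt.
    return "APPROVE" if "within policy" in prompt else "ESCALATE"

def lookup_policy(amount: float) -> str:
    # A 'tool' the pipeline calls to fetch context for this task.
    return "within policy" if amount <= 10_000 else "requires second approval"

def process_invoice(amount: float, human_review) -> str:
    context = lookup_policy(amount)                      # context via a tool
    decision = call_llm(f"Invoice for ${amount}: {context}. Approve?")
    return human_review(decision)                        # human oversight

final = process_invoice(8_000, human_review=lambda d: d)
```

A real deployment would also log each human override and feed it back into the context store, closing the loop described above.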

Matillion has a long and storied history of helping customers be productive with data. For more than a decade, we’ve been evolving our platform – from BI to ETL, and now to the Data Productivity Cloud – adding building blocks that enable our customers to make the most of the latest technological advancements in data productivity. AI and RAG are no exceptions. We’ve been adding building blocks to our tool that allow customers to assemble and test RAG pipelines, prepare data for the vector stores that power RAG, assemble that all-important context for the LLM, and give feedback on and assess the quality of LLM responses.

We’re opening up access to RAG pipelines without the need for hard-to-come-by data scientists or huge amounts of investment, so that you can harness LLMs that are no longer just a ‘jack of all trades’ but a valuable and game-changing part of your organization.

Tagged: AI, artificial intelligence, large language model, RAG
By Ed Thompson
Ed Thompson is CTO and co-founder of Matillion. He started his career as an IBM software consultant and spent 11 years consulting for some of the premier blue-chip companies in the UK. Along with CEO Matthew Scullion, Ed launched Matillion in 2011 and set about building a crack team of data integration experts and software engineers. He and his team launched Matillion’s flagship ETL product in 2014, which has driven the company’s growth ever since. Ed’s strength is his ability to bring together best-in-class technologies from across the software ecosystem and apply them to solving the deep and complex requirements of modern businesses in new and disruptive ways.
