© 2008-25 SmartData Collective. All Rights Reserved.
Technical Debt and the Cost/Benefit of Knowledge Retention

Jim MacLennan
A rather rigorous, financial-sounding title for a high-concept line of thought …

Thanks to Jeff Atwood at Coding Horror for calling my attention to this article by Martin Fowler on Technical Debt:

    Technical Debt is a wonderful metaphor developed by Ward Cunningham to help us think about this problem. In this metaphor, doing things the quick-and-dirty way sets us up with a technical debt, which is similar to a financial debt. Like a financial debt, the technical debt incurs interest payments, which come in the form of the extra effort that we have to do in future development because of the quick-and-dirty design choice. We can choose to continue paying the interest, or we can pay down the principal by refactoring the quick-and-dirty design into the better design. Although it costs to pay down the principal, we gain by reduced interest payments in the future.

Now, before you write off Cunningham as a techie snob or an academic hold-out for unattainable perfection, listen to this healthy dose of reality…

    The metaphor also explains why it may be sensible to do the quick and dirty approach. Just as a business incurs some debt to take advantage of a market opportunity, developers may incur technical debt to hit an important deadline. The all too common problem is that development organizations let their debt get out of control and spend most of their future development effort paying crippling interest payments.

I think most of us have seen this phenomenon before; sometimes it manifests as an open willingness to trade away quality as just another feature (as measured by the amount of testing before code is put into production). Documentation is another common sacrifice – too often we accept e-mail summaries or PowerPoint outlines as a reasonable facsimile of knowledge capture.

You’ve probably seen this phenomenon where you work, and not just in your IT organization. Many areas of the business will rationalize over-budget schedules by summarizing critical findings in a brief email – or, worse, in a Status Update Meeting. “This is an expensive meeting,” I might quip upon entering the room, seeing the conference table ringed with upper- and middle-managers, each weighing in with their understandings and opinions. Don’t misunderstand me – these are typically very effective conversations, with exactly the right people: the folks who know and live the issues, and fully understand the implications of any process change. But my witty entrée was tragically accurate; the understanding and decisions developed at this meeting are too often lost a few minutes after it ends – ideas with a half-life of roughly 10 minutes into the start of the next meeting.

Think of it as a knowledge expense (vs. depreciation, as value is lost rather quickly). The expedience and effectiveness of face-to-face communication, with everyone in the same room hearing the same thing consistently and able to ask questions to validate their understanding, typically does not scale beyond the attendees. It’s like listening to a band vs. buying the album (ah, more poetic than downloading…).

In his article, Atwood continues along the Fowler / Cunningham line of thought, discussing the need to budget a certain amount of time to pay down our technical debt by going back and finishing that unfinished work: document the things that you glossed over, rework the inelegant parts of your database schema, and recode the interfaces that rely a little bit too much on assumptions.
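The interest-vs-principal arithmetic behind that budgeting advice is easy to make concrete. The sketch below is a toy cost model with entirely made-up numbers (hours of recurring drag per sprint, a one-time refactoring cost) – an illustration of the break-even logic, not anything measured in the articles quoted above:

```python
# Toy cost model for the technical-debt metaphor: recurring "interest"
# vs. a one-time "principal" payment. All numbers are illustrative
# assumptions, not data from Fowler's or Atwood's articles.

def cumulative_cost(sprints, interest_per_sprint, payoff_cost, payoff_sprint):
    """Total extra effort (hours) over `sprints` sprints.

    interest_per_sprint: recurring drag caused by the quick-and-dirty design.
    payoff_cost: one-time effort to refactor ("pay down the principal").
    payoff_sprint: sprint at which we refactor, or None to keep paying interest.
    """
    total = 0.0
    for sprint in range(sprints):
        if payoff_sprint is not None and sprint == payoff_sprint:
            total += payoff_cost          # pay down the principal, once
        if payoff_sprint is None or sprint < payoff_sprint:
            total += interest_per_sprint  # keep paying interest until then
    return total

# Carrying the debt for 12 sprints at 4 hours of drag per sprint:
never = cumulative_cost(12, 4.0, payoff_cost=20.0, payoff_sprint=None)  # 48.0
# Refactoring at sprint 3: 12 hours of interest plus 20 hours of principal:
early = cumulative_cost(12, 4.0, payoff_cost=20.0, payoff_sprint=3)     # 32.0
```

With these (made-up) numbers, the refactoring pays for itself after five sprints of avoided interest – which is exactly the judgment call the metaphor asks you to make explicitly rather than by default.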

The same can be said for process design and problem-solving sessions – remain aware of your level of knowledge debt and budget time to document your findings. I like to call these chunks of captured knowledge “white papers” – I’ll pause while you admire that stunning originality, but there’s a method to my blandness. Calling these things “white papers” helps folks understand the purpose and value of such a document: reasonably short and idea-complete. The sweet spot seems to be two to four pages – well-organized, not too wordy, but clear enough that it remains effective months after the design or process rework sessions took place.

Just remember, organizations do the expedient thing all the time, streamlining meetings and decision-making by going light on the documentation. Every once in a while, you’ll pay the cost of rework and rediscovery; as our experience grows, and our patience for such “wasted effort” grows thin, task effort times will increase as we invest a little bit more time in better, clearer documentation.

Previously …

  • Thoughts on Why Tech Folks Hate Documentation (July 8, 2006)
  • The Iron Triangle – Quality is a Feature that We Choose to Omit from Projects (October 28, 2006)
  • Innovation That Matters – Substance Over Style (January 12, 2008)
  • Do you want it good or fast? Prioritizing Time-to-Value over Requirements (February 10, 2008)
  • Optimizing the Wrong Part of Knowledge Management (March 16, 2008)
  • Facilitating Innovation: Establishing an Environment of Possibilities (August 22, 2008)
  • A Plea for Empathetic Communication (November 16, 2008)
  • Over / Under Communication for Project Managers (June 29, 2009)

Technorati Tags: collaboration, documentation, Knowledge Management


By Jim MacLennan
Jim MacLennan is Senior Vice President and Chief Information Officer at IDEX Corporation, a Fortune 1000 manufacturer that sells highly engineered products in a variety of markets worldwide. MacLennan has responsibility for Corporate IT services for all IDEX business units, and also drives innovation through initiatives that leverage Information and Technology as growth drivers for the industrial manufacturing space. He regularly publishes his observations and insights on the intersection of business and technology - check out his work at www.cazh1.com.
