Underground Markets: A Macro Risk Factor or a Better Supply Chain Model?

January 17, 2012

Recently, I read through the latest World Economic Forum “Global Risks 2011” report, an initiative of the Risk Response Network.  It’s an impressive assessment of global risks produced in cooperation with Marsh & McLennan, Swiss Re, the Wharton Center for Risk Management at the University of Pennsylvania, and Zurich Financial.  What is compelling about the report is that it is not simply a survey result or a ranked list; rather, it details and illustrates the interrelationships between risk areas, identifying causes in an effort to find points of intervention.  The report highlights response strategies and even proposes long-term approaches.

As with any risk report, it has a tendency to feel alarmist, but its value and content cannot be dismissed, and its emphasis on response is encouraging.  The two most significant risks the report identifies relate to economic disparity and global governance.  The main point is that while we are achieving greater degrees of globalization and inherent connectedness, the benefits are narrowly spread, with a small minority benefitting disproportionately.  Global governance is a key challenge because each country has differing ideas on how to promote sustainable, inclusive growth.

The Rise of the Informal Economy

The report goes on to highlight a number of risks, including the “illegal economy”.  This risk encompasses a cluster of related risks: the political stability of states, illicit trade, organized crime and corruption.  Specifically, the issue lies with the failure of global governance to manage the growing level of illegal trade activity.  In a recent book entitled “Stealth of Nations: The Global Rise of the Informal Economy”, author Robert Neuwirth estimates that off-the-books business amounts to trillions of dollars of commerce and employs half of all the world’s workers.  If the underground markets were a single political entity, its roughly $10 trillion economy would trail only the US in total size.  Further, it is thought to represent in the range of 7-10% of the global economy, and it is growing.  To be clear, underground markets are not dealing only in illegal substances, crime or prostitution; they mostly deal in legal products.  Some of the examples Mr. Neuwirth provides include:

  • Thousands of Africans head to China each year to buy cell phones, auto parts, and other products that they will import to their home countries through a clandestine global back channel.
  • Hundreds of Paraguayan merchants smuggle computers, electronics, and clothing across the border to Brazil.  
  • Scores of laid-off San Franciscans, working without any licenses, use Twitter to sell home-cooked foods.  
  • Dozens of major multinationals sell products through unregistered kiosks and street vendors around the world.

A Global Risk?

Are the underground markets really a global macro-economic risk?  Mr. Neuwirth makes solid arguments that these markets provide jobs and goods essential to these populations, and that what gets worked around are the corrupt authorities of most developing countries.  In some ways, it can be argued that these unlicensed vendors and importers are the purest of capitalists, innovatively providing goods while avoiding intervention.  In a recent interview in WIRED magazine, Mr. Neuwirth points out that Procter & Gamble, Unilever, Colgate-Palmolive and other consumer products companies sell through small unregistered, unlicensed stores in parts of the developing world.  He goes on to point out that P&G’s sales in these unlicensed markets make up the greatest percentage of the company’s sales worldwide.  I found this tidbit shocking.  Really, a company that brings in over $80 billion in revenue a year pulls its largest share through unlicensed channels?  Now, that doesn’t mean P&G sells directly through those channels, but it sells through distributors that may in turn supply unlicensed vendors who don’t pay taxes.

The WEF concludes that illicit trade has a major effect on fragile states, given the high value of the commerce involved and the resulting loss of tax revenue, which impinges on public salaries and government budgets.  An example included in the report is that of Kyrgyzstan.  “Members of the Forum’s Global Agenda Councils argue that the undermining of state leadership and economic growth by corrupt officials and organized crime contributed significantly to social tensions which erupted in violent conflict in June 2010, causing widespread destruction, hundreds of civilian deaths and the displacement of 400,000 ethnic Uzbeks.”

The Threat to Quality and Public Safety

So, if you were to guess what type of goods tops the list of sales in these underground markets, what would you guess?  Cocaine?  Opium?  Software piracy?  Cigarette smuggling?  Small arms?  Topping the list, with a rough estimate of $200 billion in value, is counterfeit pharmaceutical drugs.  Just behind, at $190 billion, is prostitution.  That leads me to the next serious risk if global efforts to govern these markets don’t improve: quality.  I’m not qualified to address the quality of prostitution, but let’s consider the quality of counterfeit pharmaceuticals and the general issue of public safety.  If these markets go unregulated and unmonitored, we are likely to see terrible abuse by profiteers whose only concern is to bring high-value products to market quickly.  No regulation also means an inability to create safe work environments and to protect the rights of laborers all along the supply chain.

On the other hand, the vast majority of workers and consumers in developing countries thrive because of these markets.  A strong effort to disrupt or disband them would cause a high degree of distress in communities that rely on them for access to essential goods.  Yet only with the tax revenue that comes from legitimate, licensed businesses can governments function and provide the oversight that would address quality and public safety concerns.  It’s an endless loop, as we say in the software world; a true catch-22.  Even relatively well-functioning supply chain operations at pharmaceutical companies in developed countries are consistently challenged to maintain a high degree of quality (note the recent impact of product recalls at Novartis).  Considering how much effort and money is spent on quality assurance, inspections and FDA audits for legitimate pharmaceuticals, it’s beyond scary to consider the quality of the counterfeit pharmaceuticals circulating in illicit markets.

Within the US, the state of California offers recent evidence of one solution: bringing the trade of marijuana within the framework of the law.  Potential results include ensuring quality and safety for the public, raising tax revenue and reducing the profits of organized crime.  Still, the issue of economic disparity is a much tougher nut to crack.  Widening income gaps within all economies give lower-income individuals an incentive to work outside established trade structures.  That incentive leads to greater illicit trade, which in turn hinders a government’s ability to effectively tax businesses and provide services such as regulatory oversight.

Can We Govern Illicit Markets?  And If So, Should We?

These are obviously very difficult challenges, but ones that the WEF is analyzing in an effort to form solutions.  The relationships between economic disparity, illicit commercial trade, public safety and government corruption become glaringly clear.  How can the global community govern these illicit markets?  They exist everywhere to some degree, even in the US, where informal markets are estimated to account for 10-20% of GDP.  One solution the WEF recommends is to strengthen financial systems.  The implication is that weakened systems are a result of the heightened volatility and risk deriving from the recent capital markets crisis, and with diminished confidence comes incentive to work outside the system.  Some suggestions include:

  • Better surveillance of the financial sector, including all systemically relevant players
  • Tighter capital and liquidity ratios for all institutions engaged in banking (including non-banks), with higher ratios for systemically relevant institutions
  • Risk retention for securitization (so-called “skin in the game”)
  • Improved transparency and counterparty risk management in “over-the-counter” derivative markets

Perhaps the most interesting part of this global risk challenge is how interrelated these issues are.  The influence that government corruption has on illicit markets is direct, but it is not the only factor.  Further, the ability of governments to regulate, control and tax this commerce is not straightforward, and overly severe policies can prove detrimental to workers and consumers.  And how much do other factors, such as financial stability, contribute to activity moving outside conventional channels?  There is no settled view on these underground markets; we must consider why they exist, for whom they exist and how valuable they are for the good of all.

Notes From The Leading Edge

October 6, 2011

The past few weeks have been a whirlwind of activity, taking me to DC, Philadelphia, Silicon Valley and Las Vegas.  As I work primarily with pharmaceutical, life sciences and fast-moving consumer goods (FMCG) organizations, I have seen tremendous interest in improving compliance and quality, and an overall uptick in investment aimed at mitigating risk.  Now, working as part of TIBCO, I’m involved in a number of initiatives that pull process improvement into the world of real-time execution.  What the heck does that mean?  Consider a scenario that’s not in my sector but clearly illustrates the power of real-time execution: you’re a retailer whose main objective is to optimize value to customers and maximize sales opportunities through targeted promotion.  Now, consider all the ways there are to accomplish that.  Your customers gather information in many ways.  They gain awareness of products and retail stores through traditional channels such as TV, newspapers, magazines and radio.  Increasingly, awareness comes through the internet, delivered on a growing variety of devices such as laptops, desktops, tablets and smart phones.

Customer Understanding and Vast Amounts of Data

With the number of channels and devices growing, so is the amount of data that needs to be designed, managed and pushed to these different sources.  Given the variety of touch points with customers, the task of managing communications (specifically promotions) is not easy, so how can retailers be as effective as possible?  Let’s start with the retailer’s objective of maximizing sales opportunities through targeted promotion.  There are many ways to promote.  There’s the old-school carpet-bombing method of devising an advertising campaign and simply pushing a variety of ads through each channel or a chosen subset.  Those methods are extremely expensive, and targeting tends to be poor at best.  While some web sites and cable networks make targeting a bit more accurate, there is still quite a bit of dilution.  And perhaps the biggest issue with the traditional campaign push is that it relies on some action in the future, when the message about a sale or coupon may be easily forgotten or lost.  One of the important developments in event pattern matching technology is that it enables retailers to know what customers are doing in real time.  Imagine this: a person goes into a retail store to purchase a pair of shoes.  The retailer’s system knows the customer’s purchase frequency and preferred brands, and the retailer also knows that 85% of shoe buyers are interested in deals on accessories such as socks.  Imagine that person, while still in the shoe department, receiving a text message with a 25% discount coupon for socks.  This is the power of event-based processing, and its impact is immense.
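To make the mechanics concrete, here is a minimal sketch of event-pattern matching in Python.  All names and numbers are hypothetical illustrations of the scenario above, not TIBCO’s actual API; a production complex-event-processing engine would replace this simple dispatch function.

    from dataclasses import dataclass

    @dataclass
    class PurchaseEvent:
        customer_id: str
        department: str
        brand: str

    # Hypothetical cross-sell statistics: department -> (accessory, affinity)
    CROSS_SELL = {"shoes": ("socks", 0.85)}

    def on_event(event: PurchaseEvent, send_sms) -> None:
        """React the moment the event occurs, not in a nightly batch."""
        rule = CROSS_SELL.get(event.department)
        if rule and rule[1] >= 0.80:  # only promote high-affinity accessories
            accessory, _ = rule
            send_sms(event.customer_id,
                     f"25% off {accessory} today - show this text at checkout")

    # Usage: the point-of-sale or in-store location system feeds events live.
    on_event(PurchaseEvent("c-1001", "shoes", "Brooks"),
             lambda cid, msg: print(f"SMS to {cid}: {msg}"))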

The Need for a New Architecture

The last twenty years have seen companies operating on the premise that all information needed to be stored, analyzed and reported to enable effective decision making.  What TIBCO is doing is nothing short of revolutionary.  Given that organizations have seen roughly 10X growth in generated data in just the last two years, we can no longer expect traditional database storage and analysis architectures to support this growth and the need for real-time responsiveness.  Data flows in real time, and responsiveness to customers must also be executed in real time.  Threats to systems and resources happen in real time, and again, the response must be instantaneous.  Far too many organizations still rely on massive storage footprints, old business intelligence methods for data warehousing, the creation of data marts, massive libraries of metrics and performance measures, and an abundance of continuously evolving report generation.  To what end?  How well can organizations respond in near real time to meet customer, partner and risk/compliance requirements?  Not very well, if they do not adapt to this tidal wave of data growth.
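The architectural shift is from “store everything, then report” to “evaluate each event as it arrives.”  A minimal sketch of the idea, with hypothetical names and thresholds; this shows the general streaming pattern, not any particular product:

    from collections import deque
    import time

    class SlidingWindowMonitor:
        """Keep a rolling window of event timestamps instead of landing
        every record in a warehouse and reporting on it tomorrow."""

        def __init__(self, window_seconds: float, threshold: int):
            self.window = window_seconds
            self.threshold = threshold
            self.events = deque()  # timestamps of recent events

        def observe(self, ts: float) -> bool:
            """Record one event; return True the instant the rate is breached."""
            self.events.append(ts)
            while self.events and self.events[0] < ts - self.window:
                self.events.popleft()  # evict events older than the window
            return len(self.events) > self.threshold

    # Usage: 101 events inside one minute trips the (hypothetical) alert.
    monitor = SlidingWindowMonitor(window_seconds=60, threshold=100)
    now = time.time()
    for i in range(101):
        breached = monitor.observe(now + i * 0.1)
    print("alert!" if breached else "normal")  # -> alert!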

The Advantage of Statistical Insight

This weekend, I took my 12-year-old son to see the recent film release “Moneyball”.  The movie is based on the book by Michael Lewis entitled “Moneyball: The Art of Winning an Unfair Game”.  I won’t diverge into writing a review (it is good, go see it), but what’s most interesting to me about the subject of Mr. Lewis’ book is the idea that the game is won and lost based on probability, not the conventional wisdom of aged scouts who see romantic and timeless qualities in each ballplayer.  The success that Billy Beane, General Manager of the Oakland A’s, has with inexpensive players, out of necessity, brings to light a truth about the value of understanding probability and having the tools to make decisions ahead of your competitors.  Through the analytical principles of sabermetrics and the brilliance of his assistant, Paul DePodesta, Beane achieves as much with his paltry budget as teams with four times more to spend.  Unfortunately, that advantage diminishes fairly rapidly as the rest of Major League Baseball catches on in successive years, but the point is not lost.  If you can gain key insight into your customers, competitors, fraudulent attackers, supply chain partners, etc., you can greatly improve how you interact and respond to create distinct advantage.  As I’ve discussed in other posts on Business Agility, the key to operational effectiveness and risk management is agility and responsiveness.  Given the vast volume of data now being thrust upon organizations by the constant stream of connectivity, organizations have a pressing choice to make: utilize real-time capabilities and analytics to create an agile business operation, or risk being overwhelmed and unresponsive in the marketplace.
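The sabermetric insight reduces to a simple calculation: find production the market underprices.  A toy sketch with invented numbers, using on-base percentage per salary dollar, one of the measures the book popularized:

    # Toy illustration: rank players by on-base percentage per salary dollar
    # to surface undervalued production. All numbers are made up.
    players = [
        {"name": "Star free agent", "obp": 0.360, "salary": 8_000_000},
        {"name": "Castoff A",       "obp": 0.350, "salary": 1_200_000},
        {"name": "Castoff B",       "obp": 0.340, "salary":   900_000},
    ]

    for p in sorted(players, key=lambda p: p["obp"] / p["salary"], reverse=True):
        print(f'{p["name"]}: {p["obp"] / (p["salary"] / 1e6):.3f} OBP per $1M')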

As I continue to work with organizations that innovate around these real-time technologies, I am seeing a growing performance and capability gap between them and the laggards.  The growth of mobility, social networks and connectivity is fueling a step-change in how we manage marketing, production, quality, compliance, governance and virtually all related services (internal and external).  The organizations that lead the way with investments in these technologies will be best positioned to innovate and adapt within their respective industries.

Real Time vs Analysis

September 19, 2011

It happens every day, every hour, minute and second.  Stuff.  Stuff happens, and lots of it.  Every so often, something happens that makes us go, “oh, that’s big”.  And sometimes it’s so “big” that we scramble to react, to either take advantage or take cover; to move money in or out; to run for higher ground or head out to sea.  Sometimes we have a bit of notice, but other times we don’t.

Previously, I wrote about risk, fraud and how Barings Bank was brought down by a single rogue trader.  Well, it happened again just a few days ago.  UBS AG, a large Swiss bank, appears to have lost somewhere in the neighborhood of $2 billion.  The news caused its stock to drop promptly, closing 11% lower than the previous day’s close.  Moody’s Investors Service quickly reacted, saying it would review UBS for a possible downgrade, citing concerns that the bank is not adequately managing risk.

It’s much too early to determine how this trader pulled off his scheme.  Early information suggests he may have manipulated back-office operational systems; he previously worked in back-office operations and would have had that knowledge.  Did UBS have a policy restricting back-office workers from transferring to front-office trading positions?  The bank didn’t comment.

There is much that still needs to come to light.  Was this the work of a single trader, Kweku Adoboli, as is currently being implied, or were others involved?  What controls were in place to prevent these types of trades, and why did they fail?  How long did it take for monitors to catch the rogue activity, and did they prevent additional damage?

To give a sense of scale, it took only Nick Leeson’s $1.3 billion cheat to bring down Barings in 1995.  Jerome Kerviel devised a scheme that cost Societe Generale $7.16 billion in 2008.  Other scandals have hit banks over the years, and the fraudulent events don’t seem to end.  Regulations can be implemented and made more stringent; auditors can review organizations’ processes for compliance with those regulations, but still, big stuff happens.  It’s the kind of big stuff that wipes out all other assumptions.  You can be the finest analyst in the universe, performing all the due diligence necessary to make the most prudent investments.  You believe in UBS and the fact that it brought back senior leadership as a sign it was serious about reform.  Oswald Grubel was supposed to be turning around the troubled UBS, but it appears he and his leadership team were just not that concerned about managing operational risk.  The simple bottom line: one event can be catastrophic, erasing all other assumptions.

So, the most pertinent questions are these: which operational events need real-time monitoring?  Which events need process controls in place to automatically prohibit additional risk exposure?  How can managers respond in real time to both opportunities and adverse situations?  As Pete Seeger adapted from the Book of Ecclesiastes, “there’s a time to gain, a time to lose, a time to rend, a time to sew”.  Similarly, there is a time for analysis and there is a time for real-time response.  All the analysis in the world cannot determine the future.  As the Heisenberg Uncertainty Principle states, the more precisely one property is measured, the less precisely a complementary property can be determined.  In other words, the mere act of observation introduces yet another factor into the set of conditions.  There are no absolutes about tomorrow, and there is no such thing as risk-free.  So, while I pointed out in a previous blog post on interconnectedness the immense advantage that doing your homework will bring, at the end of the day a single event can wipe out all of your assumptions.
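For reference, the formal statement of that principle for the canonical position/momentum pair is the following (standard quantum mechanics, offered here only to ground the analogy, not as part of the risk argument):

    \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}

where \sigma_x and \sigma_p are the standard deviations of position and momentum, and \hbar is the reduced Planck constant.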

Well, I know what you’re thinking… that sucks.  First I tell you to do fantastic amounts of due diligence to identify opportunities, but then I say, “ahhh, it’s all a waste once a single unexpected event strikes.”  Okay, I can see the paradox, but really what I’m saying is: you have to do both.  Good operational process management is about analysis of the details, of every single activity, every single owner, reviewer, regulation and risk.  And yet, it’s also about agility.  What do we do when things don’t go as planned?  What do we do when the proverbial poop hits the fan?  Can we analyze each activity for its risk exposure?  Can we find methods and control activities to mitigate adverse events, especially the catastrophic ones?  And can we buy insurance to position ourselves for gain if adverse events strike?  Absolutely, I say!  Why some organizations don’t, especially financial institutions that are particularly vulnerable, is beyond me.  Sometimes it’s just incompetent management, but often it’s a simple lack of appreciation that solid operational process management requires a sizable investment in process thinking, risk management and the development of a process-improvement culture.

Fortunately, a lot is being done in this generation to advance process-based thinking and to raise consciousness about business process management and its impact on corporate governance and risk.  But it’s happening slowly.  Maybe events like last week’s UBS debacle will open a few eyes… let’s hope so.

New Process Adoption: How do we get people to change behavior?

August 29, 2011

Earlier in my career, circa 1994, I was working for Lotus Development and having lunch with my boss.  We were sharing a bit of small talk, and I remember telling him about a personal accounting application called Quicken.  I explained all the cool things it did: how I could track my expenses, put line items into categories, build charts and graphs and see where my money went.  He listened as I went on and on about how empowered I felt knowing what I was spending my money on.  He then asked, “Are you really changing your spending behavior now that you use it?”  I thought about that for a moment and had to admit, “Well, no.  Not yet anyway.”  That was all he needed to ask.  I got his point.

The same phenomenon appears in the areas of governance, risk and compliance, as well as in operational excellence and quality initiatives.  It can be exciting to start thinking about improving process execution: understanding exactly what is happening with end-to-end processes, who is responsible and how activities should be measured.  For large risk and compliance efforts, new control methods, systems and activities are designed to address areas identified as high-risk, high-impact.  Documentation efforts are extensive, and personnel are trained to ensure understanding of these new processes.  The question that should be asked is the one my old boss asked me: “Are you really changing people’s behavior?”  It’s one thing to implement a method and provide training, but not everyone is going to adopt the new application, system, method or rule.  There are those in the organization who have been doing their job, their specialized skill, for a very long time.  Simply publishing a new process diagram, a new policy or a best-practice document is not going to ensure adoption of a new process.  Further, when process change or compliance regulations touch a wide variety of process areas, with dozens or hundreds of roles affected, how do we ensure adherence to the newly stated “way of working”?  Is it okay to have 80% adoption?  90%?  99%?  How do we know where the newly defined processes are not being followed?  And if we are not at 100%, what is at stake?

Pharmaceutical Quality Management

Let me cite a recent example.  I spent several months working with a global pharmaceutical company on SOPs, or Standard Operating Procedures.  SOPs are at the heart of managing quality throughout product development, starting with R&D, through clinical trials, and ultimately manufacturing and distribution.  The criticality of execution is magnified when you’re dealing with medications that will ultimately be marketed, administered by physicians and taken by large numbers of patients.  The stakes are extremely high, with enormous investment by the company in each effort and an extremely high risk of failure, including scrutiny by the FDA as well as auditors.

Quality management is a fundamental discipline within pharmaceuticals.  So much so that these companies maintain a governance function purely for managing SOPs and ensuring operational participants have read and understood a process before they can execute any activity on each and every project.  That was a bit of a mouthful, so let me simplify it from the end-user perspective.  If you are a clinical technician and a new trial is starting, a new SOP will be published and you will be asked to read the document (usually 40 or so pages long), take a quiz online and then “sign off” that you have read and understood (R&U) the procedure.  From an administrative perspective, it’s also quite a job managing not only the content that needs to be gathered to define the SOP, but also the administration of the R&U tasks themselves.  This organization conducts dozens of trials per year, many running simultaneously, with hundreds of participants just within the clinical trials team.  So, you can imagine the complexity.  Now, at the heart of the issue is the SOP itself.  Over most of the past twenty years, SOPs have been large documents created in Microsoft Word, reviewed, approved and then converted to a read-only Adobe PDF.  The document is then stored in an EMC Documentum document management system (DMS).  The DMS captures the necessary metadata about the SOP and ensures that it was “published”.  This distinction is important for compliance with FDA 21 CFR Part 11, a regulatory standard to which all participants must adhere.
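To illustrate the administrative burden, here is a minimal sketch of the R&U bookkeeping in Python.  All names and the passing score are hypothetical, and a real system would also need the audit trails and validated electronic signatures that 21 CFR Part 11 requires, which this omits:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class SOP:
        sop_id: str
        version: int
        signed_off: dict = field(default_factory=dict)  # user -> timestamp

        def record_sign_off(self, user: str, quiz_score: float) -> bool:
            """Record R&U only if the comprehension quiz was passed."""
            if quiz_score < 0.8:  # hypothetical passing bar
                return False
            self.signed_off[user] = datetime.utcnow()
            return True

        def outstanding(self, trial_team: list) -> list:
            """Who may not yet execute activities governed by this SOP?"""
            return [u for u in trial_team if u not in self.signed_off]

    sop = SOP("SOP-CT-042", version=3)
    sop.record_sign_off("tech.jones", quiz_score=0.9)
    print(sop.outstanding(["tech.jones", "tech.smith"]))  # -> ['tech.smith']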

Now, my client had a number of problems that many other pharmaceutical companies share.  The big ones were:

  1. How do we ensure that people on the project really understand the procedure?
  2. When participants are unsure of a procedural step, the existing PDF documentation within the DMS is unwieldy, and it is difficult to find answers.
  3. Many SOP documents contain procedural details that overlap other SOPs, and there are often inconsistencies between them.
  4. Gaps may exist between SOPs, leaving exact steps and responsibilities unclear.
  5. The ability of the SOP documents to actually change behavior is widely believed to be lacking.

So, back to my parable about Quicken.  Good material, lots of investment, but does the material actually change how work gets done?  The short answer is “not well enough”.  When you give a broad audience a massive document locked away in a complicated environment, you don’t get the intended results.  You don’t get adoption of, and adherence to, the stated process.

The Power of Simplicity

Atul Gawande, a general surgeon, is also the author of several books, including a recent release entitled “The Checklist Manifesto”.  In it, Dr. Gawande details how surgery, along with other very complex endeavors such as constructing skyscrapers and flying airplanes, shares a common tool that greatly improves the quality of execution: a simple checklist.  Recently, Dr. Gawande spearheaded efforts to educate hospitals and deploy checklists for surgical procedures across a variety of environments, many in developing countries but also in inner-city hospitals in the US and other developed countries.  The results are astounding: the checklist program greatly reduced problems from surgical errors such as post-operative infections, bleeding, unsafe anesthesia and operating-room fires.  Incidents of these common problems dropped 36 percent after the introduction of checklists, and deaths fell by 47 percent.  When staff were surveyed after the study and asked whether they would want the checklist used if they were being operated on, 93 percent said “yes”.

How does the concept of a checklist apply to the clinical trials process within a pharmaceutical operation?  A checklist can serve exactly the purpose the existing SOP is intended to serve.  People on the trial team are responsible for understanding the whole process as well as their individual roles.  But as we now understand, massive documents have inherent problems:

  1. maintaining integrity of the information
  2. readability by the audience
  3. providing guidance at the point of need

The net objective of ensuring execution of each process step is not being achieved with SOP documents.

That’s where a checklist comes in.  The work I’ve been a part of involved a major paradigm shift away from traditional large-volume, free-form SOP documents and toward a new model that takes advantage of cutting-edge BPM technology.  The client is using the Nimbus Control enterprise BPM application.  This model involves the following core BPM principles (a sketch of the resulting structure follows the list):

  • documenting end-to-end processes in a universal process notation
  • linking all relevant electronic documentation at the activity level
  • assigning ownership for every activity
  • building a Nimbus Storyboard (checklist) to correspond to each SOP
  • establishing end-user views to provide pre-populated lists of SOPs
  • providing end users with search capabilities to retrieve process diagrams, related documents and Storyboards via a structured keyword taxonomy
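Here is a minimal sketch of how such a structure might be modeled, with hypothetical names; this is only an illustration of the principles above, not the actual Nimbus Control data model:

    from dataclasses import dataclass, field

    @dataclass
    class Activity:
        name: str
        owner: str  # every activity has an assigned owner
        references: list = field(default_factory=list)  # linked docs/regulations

    @dataclass
    class Storyboard:
        sop_id: str
        steps: list  # the ordered checklist corresponding to one SOP

        def step(self, name: str) -> Activity:
            """Jump straight to one activity and its guidance."""
            return next(a for a in self.steps if a.name == name)

    board = Storyboard("SOP-CT-042", steps=[
        Activity("Screen subject", "Clinical Ops", ["ICH-GCP 4.3"]),
        Activity("Obtain informed consent", "Site PI", ["21 CFR Part 50"]),
    ])
    print(board.step("Obtain informed consent").references)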

A Revolutionary Tool

From an operational execution perspective, the use of Storyboards is revolutionary.  A Clinical Operations Director can now open a Storyboard, essentially a list of activities relevant to their project, jump to the exact activity at the point of need and see all the guidance, references and regulations necessary to perform that step.  A large team of participants contributed to defining this new way of working within the clinical trials operational team.  Contributors came from the operational team, the governance office, IT and the Quality Assurance organization.  The result is a solution that promises an ROI of over 30X within three years.

Improving operations that affect the quality of how we develop medicines matters not only to the companies that invest in that work; it also matters to doctors, patients and all those who care about those patients.  The impact of process performance, process execution and quality cannot be overstated.  These initiatives are not driven purely by regulatory requirements and audit findings.  The investment in these technologies improves all aspects of the organization, the work experience of all participants and the pride the organization can take in reducing errors and improving quality execution.

Simplicity Takes Work

I’m reminded of a quote from Steve Jobs, who just this past week resigned as CEO of Apple: “That’s been one of my mantras — focus and simplicity.  Simple can be harder than complex:  You have to work hard to get your thinking clean to make it simple.  But it’s worth it in the end because once you get there, you can move mountains.”

The checklist, or the Storyboard, makes massive amounts of complexity and detail quite simple.  I applaud the ambition and courage of my client in taking bold action to transform the SOP process.  As Mr. Jobs noted, you have to work hard to get your thinking clean and make it simple.  My client has worked incredibly hard to design a simpler way for people to perform exacting work.

Compliance: Headache or Windfall?

August 8, 2011

Forming a Process Centric Model

Regulatory bodies and compliance rules are as old as civilization.  The early Egyptians, Greeks, Romans and Indians created standards and rules for business.  Those rules centered on weights, measures and currency, but today regulations come from many sources.

When we look at a regulatory construct, we are effectively looking at rules, laws, guidelines and best practices dictated by a governing body.  I say dictated, but these rules are generally a set of statements that have been developed, reviewed and ultimately enacted by a governing board within a corporation.  Regulations may also be established by industry governing boards, and many are based on governmental laws (federal, state and local).  Depending on the industry and the size of the corporation, the volume of these regulatory books, and of the statements contained in each, can be enormous.  Public companies have the Securities and Exchange Commission to deal with.  Companies with global operations must comply with the laws of each operating country and adhere to health and safety standards, hiring and firing requirements, social responsibility requirements, and so on.  If you’re a financial services firm, a plethora of regulations governs how you account, record, trade and settle.  If you’re a pharmaceutical company, strict standards dictate how you run your clinical trials, record your findings, label your products, etc.  Now, add in all of the internal standards that govern best practices for your unique products, partnerships and contract types.  As we can easily surmise, the resulting complications are immense.

The SOX Phenomenon

In 2003, after starting my own software services firm, I sat with the head of compliance for a Fortune 100 construction company to review their requirements for Sarbanes-Oxley.  Within a day of information gathering, it became clear that their main objective was to manage a set of “controls” by recording who was responsible for each and whether it was working or not.  To be clear, a “control” is simply a process step that has an owner and is in place to mitigate a risk to the organization.  So, what this company was doing was creating a “matrix” of relationships between identified risks, controls (mitigation steps), owners and the process areas they relate to.  As I discovered in the weeks following these meetings, almost every company scrambling to comply with SOX was doing exactly the same thing, and almost all of those risk/control matrices lived in spreadsheets.  The problems with that approach were universal, and the need for a collaborative, relational data storage solution was obvious.
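A minimal sketch of the relational structure those spreadsheets were straining to represent, with hypothetical names and records; the point is that risks, controls, owners and process areas are linked entities, not rows in disconnected workbooks:

    from dataclasses import dataclass

    @dataclass
    class Control:
        control_id: str
        risk: str          # the risk this process step mitigates
        owner: str         # who is accountable for the control
        process_area: str  # e.g. "Financial Close"
        effective: bool    # is the control operating as designed?

    matrix = [
        Control("C-101", "Unauthorized journal entries", "J. Rivera",
                "Financial Close", effective=True),
        Control("C-102", "Unapproved vendor payments", "A. Chen",
                "Procure-to-Pay", effective=False),
    ]

    # The question every SOX team was asking of its spreadsheet:
    failing = [c.control_id for c in matrix if not c.effective]
    print(failing)  # -> ['C-102']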

Process is the common denominator

While I grew my business by developing software to address this requirement, other interesting similarities emerged across my client base.  Companies were not interested only in passing an audit or dealing with SOX.  They had a dozen or more other pressing regulations that required the same type of solution.  In each case, whether it was FDA 21 CFR Part 11, ISO 9000, Basel II or a variety of internal standards, the same basic needs existed.  Companies needed to understand the regulations; identify the risks, controls, gaps, remediation steps, owners and process areas; and manage all of that information somewhere.  Most commonly, that meant an independent spreadsheet.  And what was the one thread that formed the backbone of all compliance management?  Process.

Another Fire Drill?

What I found was that each process area (e.g., HR, Finance, Manufacturing Ops) was being hounded by internal audit teams, compliance directors, external auditors and quality managers to document their processes; document their controls; document their risks; document their issues; document remediation tasks, and on and on.  It’s amazing that anyone was ever actually doing their day job.  In the nearly ten years I’ve worked with organizations on regulatory requirements, very little has changed in this regard.  I have yet to encounter a company that manages all of its compliance and regulatory requirements from a single platform.  Some organizations have made strides in managing process details in a more coordinated fashion, but most still treat each compliance requirement as a separate challenge involving separate projects.

The issues with this condition are perhaps obvious: each time one of these regulatory initiatives is executed, operational leaders relive the exact same nightmare.  It’s Groundhog Day!  Leaders within pharmaceutical clients have told me that rarely does a year pass before they have to execute another fire drill of process capture, internal review, internal audit and external audit.  Invariably, it’s a short-sighted exercise to check a bunch of boxes and get a rubber stamp so everyone can get back to normal operations.

The Single Platform Vision

Now for the good news: things are changing.  During my four-year tenure at Nimbus, I’ve seen an awakening within highly regulated industries, a resolve to stop the nonsense.  It all begins with proper process management, wherein organizations do the following:

  1. Define end-to-end processes using a simple notation for business end users.
  2. Govern process definitions and all related reference materials in support of process execution.
  3. Manage regulatory and internal standards as structured, governed statements.

Process management should not be the byproduct of fire-drill exercises to satisfy auditors; rather, BPM should be an integral part of knowledge capture, process improvement, compliance management and business agility.  As one executive summarized while heading into a board meeting after meeting with me, “We can’t improve what we can’t understand.”  As I’ll discuss in later postings, BPM has both a mechanical nature and a cultural one.  Very few cultures are used to maintaining a high level of accountability and continuous management of process content.  Just putting systems in place is not a cure-all and, as we’ll explore, organizational culture plays a huge role.
