Underground Markets: A Macro Risk Factor or a Better Supply Chain Model?

January 17, 2012

Recently, I read through the latest World Economic Forum “Global Risks 2011” report, an initiative of the Risk Response Network.  It’s an impressive assessment of global risks produced in cooperation with Marsh & McLennan, Swiss Re, the Wharton Center for Risk Management at the University of Pennsylvania and Zurich Financial.  What is compelling about the report is that it is not simply a survey result or a ranked list; rather, it details and illustrates the interrelationships between risk areas, identifying causes in an effort to find points of intervention.  The report highlights response strategies and even proposes long-term approaches.

As with any risk report, it has a tendency to feel alarmist, but its value and content cannot be dismissed, and its emphasis on response is encouraging.  The two most significant risks the report identifies relate to economic disparity and global governance.  The main point is that while we are achieving greater degrees of globalization and inherent connectedness, the benefits are narrowly spread, with a small minority benefitting disproportionately.  Global governance is a key challenge, as each country has differing ideas on how to promote sustainable, inclusive growth.

The Rise of the Informal Economy

The report goes on to highlight a number of risks, including the “illegal economy”, which comprises a cluster of risks: the political stability of states, illicit trade, organized crime and corruption.  Specifically, the issue lies with the failure of global governance to manage the growing level of illegal trade activity.  In a recent book, “Stealth of Nations: The Global Rise of the Informal Economy”, Robert Neuwirth estimates that off-the-books business amounts to trillions of dollars of commerce and employs half of all the world’s workers.  If the underground markets were a single political entity, their roughly $10 trillion economy would trail only that of the US in total size.  Further, they are thought to represent in the range of 7-10% of the global economy, and they are growing.  To be clear, underground markets are not only dealing in crime, prostitution or illegal drugs; they mostly deal in legal products.  Some of the examples Mr. Neuwirth provides include:

  • Thousands of Africans head to China each year to buy cell phones, auto parts, and other products that they will import to their home countries through a clandestine global back channel.
  • Hundreds of Paraguayan merchants smuggle computers, electronics, and clothing across the border to Brazil.  
  • Scores of laid-off San Franciscans, working without any licenses, use Twitter to sell home-cooked foods.  
  • Dozens of major multinationals sell products through unregistered kiosks and street vendors around the world.

A Global Risk?

Are the underground markets really a global macro-economic risk?  Mr. Neuwirth makes solid arguments that these markets provide jobs and goods that are essential to these populations, and that what is being worked around is the corrupt authorities in most developing countries.  In some ways, it can be argued that these unlicensed vendors and importers are the purest of capitalists, innovatively providing goods by avoiding intervention.  In a recent interview in WIRED magazine, Mr. Neuwirth points out that Procter & Gamble, Unilever, Colgate-Palmolive and other consumer products companies sell through small unregistered, unlicensed stores in parts of the developing world.  He goes on to point out that P&G’s sales in these unlicensed markets make up the greatest percentage of the company’s sales worldwide.  I found this tidbit shocking.  Really, a company that brings in over $80 billion in revenue a year is actually pulling in most of its revenue through unlicensed channels?  Now, that doesn’t mean P&G is directly selling through those channels, but it sells through distributors whose downstream partners may sell on to unlicensed vendors who don’t pay taxes.

The WEF concludes that illicit trade has a major effect on fragile states, given that the high value of this commerce and the resulting loss of tax revenue impinge on national salaries and government budgets.  An example included in the report is that of Kyrgyzstan: “Members of the Forum’s Global Agenda Councils argue that the undermining of state leadership and economic growth by corrupt officials and organized crime contributed significantly to social tensions which erupted in violent conflict in June 2010, causing widespread destruction, hundreds of civilian deaths and the displacement of 400,000 ethnic Uzbeks.”

The Threat to Quality and Public Safety

So, if you were to guess what type of goods tops the list of sales in these underground markets, what would you guess?  Cocaine?  Opium?  Software piracy?  Cigarette smuggling?  Small arms?  Topping the list, with a rough estimate of $200 billion in value, is counterfeit pharmaceutical drugs.  Just behind, at $190 billion, is prostitution.  Which leads me to the next serious risk if global efforts to govern these markets don’t improve: quality.  I’m not qualified to address the quality of prostitution, but let’s consider the quality of counterfeit pharmaceuticals and the general issue of public safety.  If these markets go unregulated and unmonitored, we are likely to see terrible abuse by profiteers whose only concern is bringing high-value products to market quickly.  No regulation also means no ability to create safe work environments or to protect the rights of laborers all along the supply chain.

On the other hand, the vast majority of workers and consumers in developing countries thrive because of these markets.  A strong effort to disrupt or disband them would cause a high degree of distress in communities that rely on them for access to essential goods.  But without the tax revenue that can only be gathered from legitimate, licensed businesses, how can governments function and provide the oversight services that would address quality and public safety concerns?  It’s an endless loop, as we say in the software world; a true catch-22.  Even relatively well-functioning supply chain operations at pharmaceutical companies in developed countries are consistently challenged to maintain a high degree of quality (note the recent impact of product recalls at Novartis).  Considering how much effort and money is spent on quality assurance, inspections and FDA audits for legitimate pharmaceuticals, it’s beyond scary to consider the quality of the counterfeit pharmaceuticals circulating in illicit markets.

Within the US, in the state of California, we’ve seen recent evidence of solutions such as bringing the trade of marijuana within the framework of the law.  Potential results include ensuring quality and safety for the public, raising tax revenue and reducing the profits of organized crime.  Still, the issue of economic disparity is a much tougher nut to crack.  Widening income gaps within all economies give lower-income individuals an incentive to work outside established trade structures.  That incentive leads to greater illicit trade, which in turn hinders a government’s ability to effectively tax businesses and provide services such as regulatory oversight.

Can We Govern Illicit Markets?  And If So, Should We?

These are obviously very difficult challenges, but ones that the WEF is analyzing in an effort to form solutions.  The relationships between economic disparity, illicit commercial trade, public safety and government corruption become glaringly clear.  How can the global community govern these illicit markets?  They exist everywhere to some degree, even in the US, where informal markets are estimated to account for 10-20% of GDP.  One solution the WEF recommends is to strengthen financial systems.  The implication is that weakened systems are the result of the heightened volatility and risk deriving from the recent capital markets crisis, and with diminished confidence comes an incentive to work outside the system.  Some suggestions include:

  • Better surveillance of the financial sector, including all systemically relevant players
  • Tighter capital and liquidity ratios for all banking institutions (including non-banks), with higher ratios for systemically relevant institutions
  • Risk retention for securitization (so-called “skin in the game”)
  • Improved transparency and counterparty risk management in “over-the-counter” derivative markets

Perhaps the most interesting part of this global risk challenge is how interrelated these issues are.  The influence that government corruption has on illicit markets is direct, but it is not the only factor.  Further, the ability of governments to regulate, control and tax this commerce is not straightforward, and overly severe policies can prove detrimental to workers and consumers.  And how much do other factors, such as financial stability, contribute to activity moving outside conventional channels?  There is no settled view on these underground markets; we must consider why they exist, for whom they exist and how valuable they are for the good of all.


Addiction to Prediction

January 3, 2012

With each year end, all forms of media spew a tidal wave of predictions.  From the apocalyptic to the mundane, we get prognostications on who will win an Oscar, which Republican will win in Iowa, how well the market will do in 2012 and who will win the Super Bowl.  But it’s not only at year end that we get a hefty dose of soothsaying; it’s a non-stop public obsession.  Dare I say, it’s an addiction.  Predictions are in every facet of society.  Within industry, we are constantly trying to get insight into the level of demand for our products this month, the level of prices within each product type, and which company will gobble up which other company.  The fact that foresight can be a key advantage when competing for resources and competitive superiority is not surprising.  What is surprising is the amount of noise pollution and the insatiable desire to listen to that noise.

What is an Expert?

I can still hear my favorite finance professor lecturing about “experts” during one of my b-school classes.  He illustrated quite powerfully (obviously, it’s stayed with me all these years) how poor the predictions made by economists were on interest rates, GDP growth, oil prices and stock prices, among many other measures.  In article after article, economists, industry experts, political experts and scientific experts were shown to be only slightly better than random guessing.  What’s worse, most “experts” tended to influence each other, so that consensus predictions prevailed.  Economists predicting the direction of interest rates tended to lump together in narrow ranges, which indicated that, working from the same sets of data with the same sets of assumptions, they also tended to produce the same range of estimates.

Risk and Probability

We all know that the future is uncertain and that many unknown factors impact future events, yet we go to great lengths to predict.  The bottom line is that the conclusions we can draw are about probability rather than pinpoint calculation.  If we treat the probabilistic outcomes for Apple’s earnings per share this coming quarter as normally distributed, we can estimate EPS outcomes within ranges.  If we believe published consensus estimates by analysts, the mean estimate is 9.81 with a coefficient of variation of 4.39.  Statistically, this variance describes historical dispersion and should not be seen as a predictor, but since analysts do not have crystal balls, they still use it as the main input for setting probabilities.  So, if we conclude there is a 95% chance that EPS will fall within the range 8.56 – 11.06 (two standard deviations), we are essentially placing bets based on probability.  Now, the valuation of a share of Apple common stock will vary greatly depending on where in this range actual EPS falls.  Of course, there is still a 5% chance that EPS falls outside the expected range.  And further, these numbers are purely estimates based on one set of assumptions that no two analysts would ever agree on.
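To make the arithmetic concrete, here is a minimal sketch of the two-standard-deviation band above.  The mean is the quoted consensus estimate; the standard deviation (roughly 0.625) is an assumption backed out of the quoted 8.56 – 11.06 range:

```python
# A minimal sketch of the two-standard-deviation logic above.
# The mean is the quoted consensus estimate; the standard deviation
# is an assumption derived from the quoted 8.56-11.06 range.
mean_eps = 9.81
std_eps = (11.06 - 8.56) / 4  # assumed: the range spans ~4 sigma

low, high = mean_eps - 2 * std_eps, mean_eps + 2 * std_eps
print(f"~95% of outcomes expected in [{low:.2f}, {high:.2f}]")
# Under a normal assumption, ~5% of outcomes still land outside this
# band, and the normal assumption itself is what tail events violate.
```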

Critical Analysis

When looking at all these predictions, it quickly becomes apparent which “experts” are really viewing their data through a critical lens and which are simply along for the ride, echoing other experts’ viewpoints.  What I find most discouraging is how confident some prognosticators are, especially those on television and web broadcasts.  They emphatically proclaim their view in an effort to persuade viewers they are right, trying to create self-fulfilling prophecies through persuasion; perhaps the most egregious offense.  We see this regularly on political discussion panels, where party-aligned or candidate-partial analysts make their case for what people really want and how they will vote.  Are they really giving us a scientifically sound viewpoint, or simply trying to manipulate our view of what will be?

Predicting Human Behavior

The digital age has provided a powerful platform for gathering information on individuals’ behavior.  Companies can gain insight into buying behaviors as well as data on individual and group interests in entertainment, politics, and professional and social connections.  The usefulness of this information is at once obvious and complex.  For instance, if we know there is a better than 50% chance that a person buying a camera will also buy a camera case, then it is an effective sales practice to suggest a camera case at the point the buyer selects a camera.  This practice is quite common now with online purchasing, and we see it fairly frequently with phone sales as well as fast food ordering; e.g., “Would you like fries with that?”

Recently, I was shopping for a new lawn mower and researching options on homedepot.com.  Within several minutes of performing a search on the site, a pop-up offer to chat with a representative came up.  I took that offer, as I had some questions about the models.  After about five minutes, the rep offered me a 10% discount if I purchased the item online, and he’d help me through the purchase process.  I had already decided to buy the item, so I was happy to get the discount, but I wanted to pick it up at a store near me rather than have it shipped.  Their process allowed for this flexibility: I could purchase online and the item would instantly be put aside for me to pick up that evening.  This process was ingenious.  What percentage of people shop to the point of sale and then drop, reducing the chance of buying the item through the original site, or perhaps not buying it at all?  This discount offer through the chat rep can help homedepot.com reduce that drop percentage.  Now, Home Depot does not know whether I would have purchased the item online that day anyway, and it does not know whether I would have gone to the store and been willing to pay full price.  The fact is, there will be some percentage of margin they relinquish for the sake of capturing a higher percentage of potential sales.  Home Depot, like so many retailers, banks, insurance companies and drug companies, is in the business of prediction: predicting what you will do if they communicate with you in a certain way at a certain time.
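For what it’s worth, the camera-and-case logic above reduces to a simple conditional probability.  Here’s a toy sketch; the transaction data and the 50% threshold are illustrative assumptions, not real retailer data:

```python
# A toy sketch of the conditional-probability logic behind
# "would you like fries with that?". The transactions and the
# 50% threshold are illustrative assumptions.
transactions = [
    {"camera", "case"},
    {"camera", "case", "sd_card"},
    {"camera"},
    {"case"},
    {"camera", "case"},
]

# Estimate P(case | camera) from the transaction history.
with_camera = [t for t in transactions if "camera" in t]
p_case_given_camera = sum("case" in t for t in with_camera) / len(with_camera)

if p_case_given_camera > 0.5:  # the threshold from the example above
    print(f"P(case|camera) = {p_case_given_camera:.2f} -> suggest a case at checkout")
```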

Statistical Sampling – A Foundation for Predictions

Statisticians often tell a joke about a man with his head in a refrigerator and his feet in an oven: on average, he feels about right.  Sampling is used to determine probabilities and to make decisions about the level of risk we are taking.  It’s sampling and probability that determine the rate we pay for life insurance, car insurance and every other type of insurance.  So, whether we are trying to predict Apple’s earnings per share for next quarter or whether a catastrophic disaster will strike an Asia-Pacific country next year, we must calculate the odds, the percentages, the probabilities.  The fact remains, however, that not all outcome distributions follow a normal curve, and highly unlikely events can be game changers.
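To make the insurance example concrete, here’s a back-of-the-envelope sketch of probability-based pricing; the claim probability, claim size and loading factor are all assumed figures:

```python
# A toy illustration of sampling-based insurance pricing.
# All three inputs are assumptions, not real actuarial figures.
p_claim = 0.002        # assumed annual probability of a claim
avg_claim = 250_000    # assumed average claim size
loading = 1.3          # assumed margin for expenses and profit

premium = p_claim * avg_claim * loading
print(f"Indicative annual premium: ${premium:,.0f}")
# Note: this assumes claims follow the sampled distribution;
# fat-tailed "game changer" events are exactly where it breaks down.
```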

Question Authority

Prediction should be all about risk, uncertainty and likelihood, but what you’ll hear this week, and throughout the year, is a chorus of experts telling you with great certainty what the future will bring.  Don’t believe them.  If you’re jonesing for advice, try listening to those who provide detail on probability, risk and trends.  But know that the future is never about certainty and always about probability.  When prognosticators get it right, they were mostly just lucky.  They may have played the odds.  They may have had some truly intuitive insight that others did not.  But there is never a sure thing.

Notes From The Leading Edge

October 6, 2011

The past few weeks have been a whirlwind of activity, taking me to DC, Philadelphia, Silicon Valley and Las Vegas.  As I work primarily with pharmaceutical, life sciences and fast-moving consumer goods (FMCG) organizations, I’ve seen tremendous interest in improving compliance and quality, and an overall uptick in investment aimed at mitigating risk.  Now, working as part of TIBCO, I’m involved in a number of initiatives that pull process improvement into the world of real-time execution.  What the heck does that mean?  Consider a scenario that’s not in my sector but clearly illustrates the power of real-time execution: you’re a retailer whose main objective is to optimize value to customers and maximize sales opportunities through targeted promotion.  Now, consider all the ways there are to accomplish that.  Customers gather information in many ways.  They gain awareness of products and retail stores through traditional channels such as TV, newspapers, magazines and radio.  Increasingly, awareness comes through the internet, delivered through a growing variety of devices such as laptops, desktops, tablets and smartphones.

Customer Understanding and Vast Amounts of Data

With the number of channels and devices growing, so is the amount of data that needs to be designed, managed and pushed to these different sources.  Given the variety of touch points with customers, the task of managing communications (specifically promotions) is not easy, so how can retailers be as effective as possible?  Let’s start with the retailer’s objective of maximizing sales opportunities through targeted promotion.  There are many ways to promote.  There’s the old-school carpet-bombing method of devising an advertising campaign and simply pushing a variety of ads through each channel or a chosen subset.  Those methods are extremely expensive, and targeting tends to be poor at best.  While some web sites and cable networks make targeting a bit more accurate, there is still quite a bit of dilution.  And perhaps the biggest issue with the traditional ad campaign push is that it relies on some action in the future, when the message about a sale or coupon may easily be forgotten or lost.  One of the important developments in event pattern matching technology is that it enables retailers to know what customers are doing in real time.  Imagine this: a person goes into a store to purchase a pair of shoes.  The retailer’s system knows the customer’s purchase frequency and preferred brands, and the retailer also knows that 85% of shoe buyers are interested in deals on accessories such as socks.  Imagine that person, while in the shoe department, receiving a text alerting them to a 25% discount coupon for socks.  This is the power of event-based processing, and its impact is immense.
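To give a feel for the mechanics, here’s a deliberately simplified sketch of that kind of event rule.  The event fields, the affinity table and the coupon stub are assumptions for illustration; real CEP engines express such rules declaratively rather than in hand-rolled Python:

```python
# A highly simplified sketch of the event-pattern idea described above.
# The lookup table, threshold, and send_coupon stub are all assumed.
ACCESSORY_AFFINITY = {"shoes": ("socks", 0.85)}  # assumed affinity data

def on_event(event, send_coupon):
    # Fire only when a known customer is detected in a department
    # whose accessory affinity clears an (assumed) 0.5 threshold.
    if event["type"] != "customer_in_department":
        return
    accessory, affinity = ACCESSORY_AFFINITY.get(event["department"], (None, 0))
    if accessory and affinity > 0.5:
        send_coupon(event["customer_id"], accessory, discount=0.25)

on_event(
    {"type": "customer_in_department", "customer_id": 42, "department": "shoes"},
    lambda cid, item, discount: print(f"Text to {cid}: {int(discount * 100)}% off {item}"),
)
```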

The Need for a New Architecture

The last twenty years have seen companies operating on the premise that all information needed to be stored, analyzed and reported to enable effective decision making.  What TIBCO is doing is nothing short of revolutionary.  Given that we have seen 10X growth in data generated by organizations in just the last two years, we can no longer expect traditional database storage and analysis architectures to support this growth and the need for real-time responsiveness.  Data flows in real time, and responsiveness to customers must also be executed in real time.  Threats to systems and resources happen in real time, and again, the response must be instantaneous.  Far too many organizations still rely on massive storage, old business intelligence methods for data warehousing, the creation of data marts, massive libraries of metrics and performance measures, and an abundance of continuously evolving report generation.  To what end?  How well can these organizations respond in near real time to customers, partners, and risk and compliance requirements?  Not very well, if they do not adapt to this tidal wave of data growth.

The Advantage of Statistical Insight

This weekend, I took my 12-year-old son to see the recent film release “Moneyball”, based on the book by Michael Lewis, “Moneyball: The Art of Winning an Unfair Game”.  I won’t diverge into writing a review (it is good, go see it), but what’s most interesting to me about the subject of Mr. Lewis’ book is the idea that the game is won and lost based on probability, not the conventional wisdom of aged scouts who see the romantic and timeless qualities in each ballplayer.  The success that Billy Beane, General Manager of the Oakland A’s, has with inexpensive players, out of necessity, brings to light a truth about the value of understanding probability and having the tools to make decisions ahead of your competitors.  Through the analytical principles of sabermetrics and the brilliance of his assistant, Paul DePodesta, Beane is able to achieve as much with his paltry budget as teams with four times more to spend.  Unfortunately, that advantage diminishes fairly rapidly as the rest of Major League Baseball catches on in successive years, but the point is not lost.  If you can gain key insight into your customers, competitors, fraudulent attackers, supply chain partners, etc., you can greatly improve how you interact and respond to create distinct advantage.  As I’ve discussed in other posts on business agility, the key to operational effectiveness and risk management is agility and responsiveness.  Given the vast volume of data now being thrust upon organizations by the constant stream of connectivity, organizations have a pressing choice to make: utilize real-time capabilities and analytics to create an agile business operation, or risk being overwhelmed and unresponsive in the marketplace.

As I continue to work with organizations that innovate around these real-time technologies, I am seeing a growing gap in performance and capability between them and the laggards.  The growth of mobility, social networks and connectivity is fueling a step change in how we manage marketing, production, quality, compliance, governance and virtually all related services (internal and external).  The organizations that lead the way with investments in these technologies will be best positioned to innovate and adapt within their respective industries.

The Interconnectedness of Things

September 7, 2011

This past week, the company I work for, Nimbus Partners, was purchased by a larger software company, TIBCO.  I can’t comment on the due diligence process for the deal, but as with any large acquisition, a great amount of analysis must be performed.  To value a software company, the acquirer must assess the product technology, the position in the market, the product’s fit within the acquirer’s existing family of assets, and the company’s existing financial state as well as its projected earnings potential.

Minimizing Risk

This acquisition is one of many major decisions that executives at TIBCO and other corporations go through every year.  Some investment options require incredibly in-depth analysis, while other investment decisions may be made quickly with far less due diligence.  There are plenty of reasons for performing analysis on an investment to a given level and not to a finer level.  When purchasing a stock or making a trade on an existing holding, how much information is driving your decision?  Did you read the latest 10-Q?  Did you attend the recent investor conference calls with management?  Did you get answers to all your concerns about the latest one-time charge to net income?  The odds are you didn’t.  The odds are you’re trading on gut feel, or on some limited understanding, and you accept that risk because you simply don’t have the time to do all of the research you would have liked.  Now, you might also put your trust in money managers or fund managers, expecting that they are doing all the analysis required to make good value judgments in line with your risk profile and your investment objective.  Again, are you sure they are going down to a depth of analysis that ensures risk is minimized?

A Hedge Fund Legend

Recently, I read about a very successful investor named Michael Burry.  For those of you who haven’t heard of Mr. Burry, he gained a degree of notoriety for wisely betting against banks’ mortgage holdings and cashing in massive returns for his hedge fund when the credit crisis hit full tilt in 2007.  His brilliance wasn’t just that he recognized a bubble when he saw one; it was the way he figured out how to capitalize on his realization that a spectacular number of mortgages were doomed to fail.  The fact is, when Mr. Burry first became convinced that the type of lending banks were engaged in was destined to result in large numbers of defaults, there was no real instrument for wagering against the performance of these notes.  The various tranches of subprime mortgage bonds could not be sold short.  Even with his conviction that the subprime mortgage bond market was doomed, he could not capitalize on it.

Then came Mr. Burry’s discovery of the credit-default swap.  It was basically an insurance policy that could be purchased against corporate debt, but in that form it was only useful for betting against companies likely to default, such as home builders.  Ultimately, he convinced a number of big Wall Street firms, including Deutsche Bank and Goldman Sachs, to create credit-default swaps on subprime mortgage bonds.  Now, what made his work absolutely brilliant was that he would spend untold hours poring over each bond prospectus, only taking positions against the riskiest of those assets.  He performed the due diligence on the underlying loans: analyzing loan-to-value ratios, which loans had second liens, location, absence of income documentation, and so on.  Within each bond, he could sort out the riskiest lots, and incredibly enough, Deutsche and the other banks didn’t care which bonds he took positions against.  He essentially cherry-picked the absolute worst loans (best for him) and found the bonds that backed them.
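As an illustration of that kind of screen, here’s a toy sketch that ranks loans by crude risk flags.  The fields come from the factors mentioned above, but the weights and sample data are invented for illustration; this is in no way Mr. Burry’s actual model:

```python
# An illustrative screen: rank loans by crude risk flags and surface
# the worst. Fields follow the factors in the text; weights and data
# are assumptions, not Mr. Burry's actual methodology.
loans = [
    {"id": "A1", "ltv": 0.98, "second_lien": True,  "income_docs": False},
    {"id": "A2", "ltv": 0.70, "second_lien": False, "income_docs": True},
    {"id": "A3", "ltv": 0.95, "second_lien": True,  "income_docs": True},
]

def risk_score(loan):
    # Higher is riskier; the weights are arbitrary illustrative choices.
    return (
        loan["ltv"]
        + (0.5 if loan["second_lien"] else 0.0)
        + (0.5 if not loan["income_docs"] else 0.0)
    )

for loan in sorted(loans, key=risk_score, reverse=True):
    print(loan["id"], round(risk_score(loan), 2))
```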

Mr. Burry would ultimately bring his investors and himself astronomical returns at a time when the vast majority of investors lost roughly 50%.  If you read about Mr. Burry, you’ll find there is much more to his story, as he is unique in many ways, but one key point that separates him from the pack is that he does his homework.  Details matter.  How these loans were structured mattered to everyone connected to them.  In these bonds were real loans that represented real value.  Understanding the risk factors would immediately point to a very low valuation on these bonds.

I’m not going to delve into the full issue of responsibility relative to loan originators, banks, Fannie Mae, borrowers, etc, but suffice it to say that solid due diligence reduces the risk of any transaction.  The more you understand about the asset under consideration, the better you can predict its performance.  It’s as simple as that.

Oneness

So, what’s with my title, “The Interconnectedness of Things”?  Well, it got me thinking about just how interconnected we all are.  Without getting all Jean-Paul Sartre on you, let me point out the most common difficulty in all of management: interconnectedness.  That’s right, interconnectedness.  The fact is, executives hate it.  But it exists.  We have a tendency to measure the performance of an exact metric, an exact process step, or an exact person.  We like to think that sorting out the specific items of measurement can enable us to understand what is strong and what is weak.  Fix the weak bits, keep the strong bits, and voila, you have Lean.  But from the work I’ve been involved in, it’s not so simple.  Much like the difficulty of sorting out all the bits that make up a good loan versus a bad loan, or a good mortgage bond versus a bad mortgage bond, business processes can be extremely complex and highly interdependent.

How do we get our arms around the complexity of process?  Mostly, in very distinct ways.  How many of us love to look at organizational charts, value chain analysis diagrams and system architecture diagrams?  If you are nodding “yes”, I’m deeply sorry.  The fact is, we are trying to ensure we understand the interconnectedness of things, but we often do that work in silos.  Efforts to diagram processes, the entity relationships of systems, or the relationships between people are most often performed as one-off attempts with a singular purpose or project in mind.  They are not done to ensure a wider scope of understanding is gained and maintained.  And therein lies a serious shortcoming of those efforts.  With islands of understanding, there may be some level of interconnected understanding, but the silos remain silos, and whenever we look at those groupings within a map or chart or diagram, there is too much lost information.  The value of what you have is just as quickly defined by what it does not have.  (Perhaps some Camus?)

Devil’s in the Details

So, how do we connect all these silos, and how do we know when we have enough detail?  These are big questions to which there are no silver bullets.  During a recent engagement, I was working with a global IT organization that brought together four business units to define standard global processes.  Ultimately, the idea was to consolidate where possible, but initially they needed to capture how each unit was operating.  I’ve done this type of work a number of times, and what still amazes me each time is how often we find gaps in processes and areas that are not understood, as well as overlaps where steps are replicated and no one knew what the others were doing.  As we embarked on the journey of process design, the key question the team asked of me was, “How many levels down do we need to go?”  My answer was pretty simple: go down to the level of detail at which someone from outside this process area can read and understand what is happening without any ambiguity.

Imagine if you will an organization that has documented down to that level in a consistent way across their organization.  Further, imagine a singular map with diagrams that connect to all appropriate related process steps, to all related electronic content and within a platform that provides instant feedback from the personnel that perform the operations.  Now, that’s getting your arms around complexity and it tells the story of the interconnectedness of things.

Finally, once we gain perspective on this interconnectivity we can truly understand what is working and where risk lies.  For it is risk that we are constantly managing.  The banks that held large amounts of mortgage credit were blind to what was in the big bag of bonds that contained smaller bags of loans that contained all kinds of facts, some of which were never gathered (such as income verification).  Did they completely understand the interconnectedness of things?  Did they get down to a low enough level of detail to really understand the assets that so much was riding on?  To reduce operational risk, the devil’s in the details.  Get your arms around process, get your arms around the details and know what you’re buying into.

The Power of Events: Nimbus Acquired by TIBCO

September 1, 2011

Big news this week at Nimbus Partners, a company I joined exactly four years ago today: we were acquired by TIBCO, a larger company with a diverse portfolio of BPM products.  I’ve known about TIBCO for about ten years now, as they are pioneers in the development of middleware, messaging and enterprise application integration, what is now a core capability within Service-Oriented Architecture (SOA).  Now, SOA is by no means new, and its maturity is well advanced in large enterprises.  Many organizations have spent, and continue to spend, substantial amounts on its promise, and a fair number of challenges remain.  Still, like most revolutionary technologies, realizing the value of something that radically shifts what is possible takes time.  TIBCO has been at the forefront of SOA, BPM and BI with technology that alters how information flows between systems and how quickly business users can get to answers.  The potential that lies before me and my company is exciting, with the promise of connecting advanced infrastructure capabilities with Nimbus’ cutting-edge business process management platform.  Now, I’m not going to delve into the intricacies of what is possible or which bits fit with which widgets, as I’m sure whatever I imagine will evolve into something quite different.  What I will tell you is that this acquisition is an exciting event, one that will likely impact a wide variety of global enterprises.

The Complexity of Events

On the topic of events, I’m reminded of a book I read years ago called, aptly enough, “The Power of Events”, written by David Luckham.  I was fortunate enough to hear him speak at a Gartner conference not long after I read his book, when it was a groundbreaking topic.  Most interesting was his ability to illuminate the capabilities of Complex Event Processing (CEP).  This capability matters primarily for organizations that need to minimize risk to their systems and the underlying assets those systems control.  The need to minimize risk is evident in just about every organization I’ve ever entered.

I remember being inspired by Luckham’s book and speech, and it had a big influence when I founded AVIVA Consulting.  One of the first opportunities to realize some of the key capabilities of CEP was with a partner company.  We took a simple technology for tracking real-time data flowing from any web service and integrated their software with our Microsoft-focused collaboration stack of SharePoint, SQL, InfoPath, Office and K2, a third-party workflow product.  We were mildly successful with it, but quickly became focused on risk and compliance requirements and never fully developed the real-time collaboration solution.  Later, after I sold our flagship product, ACES, to a Microsoft service provider, Neudesic, I spent a year working to take a Microsoft-platform Enterprise Service Bus (ESB) to market.  I named it Neuron (yes, I’m still proud of how clever a name that is), and we launched it shortly before I left to work for Nimbus.  So, even though I was on the product management and product marketing side of the ESB product, I became intimately familiar with the core capabilities and potential of middleware messaging and SOA in general.

Now, coming full circle: this event is the joining of TIBCO, the most innovative company in the middleware space, with Nimbus, the most innovative company in the BPM content governance space.  I couldn’t be more excited to be at the nexus of this formation (you can kick me later if you get this).  Personally, I’m excited to see what will happen when we sit with financial services and pharmaceutical companies to look at how risk is managed within quality systems or compliance initiatives.  How well can most of these organizations manage real-time events, and how well designed are their processes to deal with adverse or opportunistic circumstances?  This is where the opportunity lies.  As I point out in my posting on business agility and the need to minimize risk through agile processes, organizations need to design processes that allow rapid response to unexpected conditions as well as the known possibilities.  The events that tend to be earth-shattering are not the anticipated ones, so how well we have modeled the organization to respond is critical.  Also, as I detail in my posting on checklists, at a minimum we must address the known risks with clear process handling instructions to ensure quality execution.

Rapid Response to Events = Reduced Operational Risk

So, imagine, if you will, a situation where fraudulent phishing attacks attempt to lure bank customers into providing their login credentials to make a change to their account.  Rather than connecting to the real bank, customers are connecting to a fraudulent system that grabs their login ID and password.  The fraudsters then log in to the real system, change the password and begin making transactions to pull money out of the victim’s real account.  With CEP technology, banks can see in real time how much activity is occurring, and when irregular volumes occur on a given function (such as 10X the usual number of password changes during the past minute), the system disables the password change function and alerts the appropriate administrators.  Cool stuff, right?  Now, tie in the ability to provide clear instruction on the manual handling the administrator needs to perform.  This outlier password change event is rare, and the steps required of the administrator may be exacting.  That’s where Nimbus comes in.  The admin will have clear steps to take, ensuring fast and accurate handling with quick access to all necessary resources and reference materials.  End game?  Very few, if any, customers are impacted.  Very little, if any, financial damage is done to the bank.  Preventable adverse events are prevented.  And we can imagine, in reverse, how opportunistic events can also be quickly acted upon, with decision-makers having clear instruction on execution.
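For the technically curious, here’s a minimal sketch of that rate-based trigger: count password changes in a sliding one-minute window and disable the function at 10X an assumed baseline.  The baseline figure and the disable/alert stubs are assumptions for illustration, not any bank’s actual implementation:

```python
# A minimal sketch of the rate-based trigger described above. The
# baseline, window, and stub actions are assumed for illustration.
from collections import deque
import time

BASELINE_PER_MINUTE = 5          # assumed normal rate
WINDOW_SECONDS = 60
events = deque()                 # timestamps of recent password changes

def record_password_change(now=None):
    now = now if now is not None else time.time()
    events.append(now)
    # Drop events that have aged out of the one-minute window.
    while events and events[0] < now - WINDOW_SECONDS:
        events.popleft()
    if len(events) > 10 * BASELINE_PER_MINUTE:
        disable_password_change()
        alert_admins(count=len(events))

def disable_password_change():
    print("Password-change function disabled pending review")

def alert_admins(count):
    print(f"ALERT: {count} password changes in the last minute")

# Simulate a burst far above the baseline to show the trigger firing.
for i in range(51):
    record_password_change(now=1000.0 + i * 0.5)
```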

Understanding Events in Context

How an organization processes and responds to a large volume of diverse events is at the core of what BPM is about.  It’s not just process definition for the sake of checking a box that the auditor approves.  It’s about improving the decision-making ability of management and other operational decision makers.  It’s about reducing operational risk.  And it’s about continually tweaking and improving those processes as we learn what is working and what is not.  Gaining real-time event information can be hugely beneficial, but its value increases when we understand these events in the context of precise operational activities.

Those of you who follow my blog already know I’m not in the habit of reporting news or projecting the future, so consider this post a rare exception.  Given the personal nature of this event and the impact it will likely have on the future of BPM technology, I felt compelled to comment.  In a future post, I will explore technology specifics, including how governance, risk and compliance requirements are handled with the variety of technologies available, and the specific categories of capability, including automation, content management, master data management, SOA, enterprise architecture, social networks, collaboration, search and reporting.  There are a variety of analysts and prognosticators jumping to conclusions about what this merging of technical capabilities will mean to the market.  I can tell you that this newly joined organization looks extremely promising, but the proof will be in how we make it happen with our customers.  That is how Nimbus has always proven its advantage in the market: through real execution and value creation in real customer environments.  With the added strength and reach that TIBCO brings, we should be proving what is possible very soon.

Operational Risk = Investment Risk

August 17, 2011

Investment Risk

The historical focus of securities analysis, valuation and prediction has centered on the tracking of historic prices.  Investment risk has been measured based on price volatility, particularly historic price volatility.  Price valuations (stock or bond values) have historically been derived from the financial performance, cash position and cash flow of the issuing corporation, with most measures reviewing comparable financial ratios and industry benchmarks to form an assessment of proper pricing.  This practice of value pricing is flawed.  Future prices have little to do with past price behavior, and risk is in no way correlated to price volatility.  Risk has everything to do with specific operational exposure, legal exposure, regulatory risk and financial risk.  Price risk is the tail, not the dog, and we are wasting our time measuring the price trading patterns of any particular security.

Risk is not something any company, operational unit, financial arm or investment firm can ever fully manage.  Risk is part of existence, for individuals as well as companies.  Minimizing operational risk, while noble in its cause, is completely inadequate for satisfying investor risk.  Investors are best served when companies handle adverse events in the most adaptable, expedient and agile manner; when companies know how to respond to the unknown unknowns, not only the expected risk scenarios.  Those are the companies that know how to deal with risk, and those are the companies that survive and thrive for decades or centuries.  This operational maturity in handling adverse events is where risk truly lies and is what is most relevant for investors to measure.  As with everything we measure, operational risk is relative, and how one organization measures up against comparable companies in the appropriate industry sector should determine the risk premium investors apply to prices and price change expectations.

Market Shifts: Could Blockbuster See the Netflix Threat?

So, how is there ever a solid method of valuing stocks: understanding the risk inherent in the security and making a value judgment on its price relative to alternative securities within a similar risk class?  I believe the analysis needs to be based on the management of operational, financial and legal risk, along with understanding the operational objectives, measures, tracking ability and business agility that each relevant comparable company displays.  How could the valuation of Blockbuster drop so precipitously?  Could we have measured their market position in 2003?  Of course!  Could we have measured their operational and financial controls?  Absolutely!  Could we also have seen how unprepared they were to alter their market model to deal with alternative competitive models?  Yes, we could have seen this risk, as Blockbuster had very little ability to manage their processes and alter them as needed.  They could not move when the market shifted, and they took years to try to compete with Netflix.  Why?  Because they were NOT agile.  They couldn’t just change their model, even though they knew they needed to.  By the time they implemented a service to compete with Netflix’s, their lunch had been eaten.

Fraudulent Schemes

What about Barclays circa 2003?  Consider a phishing attack in which a hacker steals email information from Barclays.  The hacker then sends emails to Barclays customers asking them to change their passwords.  The hacker steals the real passwords and immediately changes them, locking out customers.  Finally, the hacker transfers funds from each account, effectively stealing tens of millions of dollars in a two-day span.  What’s astonishing isn’t that Barclays or any other bank could be fooled by this scheme.  In the early 2000s, this type of phishing attack was a new method of fraud, and banks couldn’t have seen it coming, as obvious as it might seem today.  What is astonishing is that Barclays didn’t have system parameters on their password change function to prevent such outlier events from occurring.  When thousands of users changed their passwords in a few hours, there was no automated trigger to shut down the service.  If the normal rate of password changes is only 100 per day, an event multiple standard deviations away from the average usually means something is wrong and the service needs to be halted and evaluated.  But Barclays did not have such processes in place, and they were not protecting their assets in a prudent manner in 2003.  As a shareholder, the most important questions relative to the risk of your investment in Barclays must be: Does Barclays have their arms around their processes?  Do they consistently look to improve and tighten their risk and control structure?  How do they govern their processes?  How visible are their processes?  How much risk is embedded in individual or small-department knowledge domains, wherein a rogue trader can bring down the entire company, such as what occurred at Barings Bank in 1995?  We don’t have to look far for examples.  How about the latest with News Corp’s wiretapping scandal?  How well did Rupert Murdoch and the rest of the leadership team understand the risks they were taking?
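A back-of-the-envelope version of that outlier test is easy to sketch.  The history below is an assumed baseline of around 100 changes per day, and the three-sigma threshold is likewise an assumption; the point is simply that even this crude guardrail would have flagged the spike:

```python
# A back-of-the-envelope version of the outlier test described above.
# The history and three-sigma threshold are assumptions for illustration.
from statistics import mean, stdev

daily_changes = [95, 102, 98, 110, 101, 97, 104]   # assumed normal history
today = 4200                                        # the attack day

mu, sigma = mean(daily_changes), stdev(daily_changes)
z = (today - mu) / sigma
if z > 3:  # multiple standard deviations above normal
    print(f"z = {z:.0f}: halt the password-change service and investigate")
```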

Where traditional investment analysis becomes moot is when we consider these outlier events: market shifts, fraud attacks, internal fraud, legal rulings, reputational loss, etc.  While Sarbanes-Oxley put some basic, prudent rules in place for public companies in the US, this regulation does nothing to reveal the true risk position of the investment.  And it’s risk that is the issue here.  Every investment is a risk/reward proposition.  If I’m going to incur greater risk, I’d better have the opportunity for greater rewards.  And vice versa: I may choose to limit my risk exposure, knowing full well that my return opportunity is modest.  US Treasuries are considered among the least risky securities to hold, and consequently they yield very small returns relative to other notes of identical coupon and duration.  You are virtually guaranteed your 3.25% return on a 10-year note, with “virtually” no risk of default.  Okay, if we use the US Treasury as a benchmark for zero risk, then what measure above this benchmark captures relative risk, and how much should investors be compensated for each increment of additional risk?

Starting in the 1970s, financial scholars embraced the idea that a stock’s risk was associated with price volatility.  Further, they treated past volatility as a likely indicator of future volatility, and thus of inherent risk.  Barr Rosenberg’s consulting firm, Barra, would build a business on “beta”, a quantified measure that represents a stock’s sensitivity to movements in the overall market.  A stock with a beta of 1 has price volatility identical to the broader market; more than 1 means more volatile, and less than 1, less volatile.  Suddenly, the risk factor of a stock could be calculated, quantified and estimated.  Another economist, Robert Engle, would win a Nobel Prize in 2003 for his work on modeling time-varying volatility.  Amazing.  There you have it: investment risk is all about past pricing.  Nonsense, I say.
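For the record, beta itself is simple arithmetic: the covariance of a stock’s returns with the market’s returns, scaled by the variance of the market’s returns.  Here’s a minimal sketch with made-up return series:

```python
# A minimal sketch of what "beta" actually computes. The return
# series below are made-up illustrations, not real market data.
stock  = [0.02, -0.01, 0.03, -0.02, 0.04]
market = [0.01, -0.01, 0.02, -0.01, 0.03]

n = len(stock)
mean_s, mean_m = sum(stock) / n, sum(market) / n
cov = sum((s - mean_s) * (m - mean_m) for s, m in zip(stock, market)) / (n - 1)
var_m = sum((m - mean_m) ** 2 for m in market) / (n - 1)

beta = cov / var_m
print(f"beta = {beta:.2f}")  # >1: more volatile than the market; <1: less
```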

The High Impact of  Unlikely Events

In 2007, Nassim Taleb published what would become a Wall Street favorite, “The Black Swan: The Impact of the Highly Improbable” (not a lot of ballet in this one).  The important core theory Mr. Taleb establishes is that unexpected events happen, and the impact of some of these events is astronomical; whether we’re talking about 9/11, the capital markets collapse of 2008, BP’s oil catastrophe in the Gulf of Mexico, or Barings Bank’s rogue trader.  Micro-level or macro-level events, either of which may be completely unpredictable, have the potential to be game changers.  The big question becomes: how well positioned are we to deal with such events?  Forget about trying to capture, measure and plan for each exact event.  That’s all fine and good.  The real question is: how capable, how agile, is the organization in responding when the big one hits?

When a competitor exploits a new technology that undermines our traditional sales and delivery models, how well can we analyze the issues, develop strategic approaches, and institute new models and underlying processes to maintain our market position?  These are the questions Blockbuster could not answer adequately.  Their event wasn’t even an overnight impact; they had many months to adjust, but like a big aircraft carrier, they just couldn’t change course quickly enough.

Only The Nimble Survive

In a recent blog post (http://bit.ly/lQpHYn), Torben Rick asserts that “Business history is punctuated by seismic shifts that alter the competitive landscape. These mega trends create inescapable threats and game-changing opportunities.  They require businesses to adapt and innovate or be swept aside.”  So, when we return to the topic of risk and how well organizations are managing it, let’s not focus on stock price volatility.  Rather, let’s look at how organizations approach their business models in an agile manner.  Just how well are they positioned to change processes quickly and respond to “Black Swan” events?  Whether there will be such events is not in question.  There will be.  Another fine recent blog post is by Norman Marks (http://linkd.in/qyQazb), who notes that “an organization needs not only to understand and assess its risks, but it needs to have a culture that embraces the active consideration of risk…”.  It’s this consideration that I suggest includes active response to the unexpected.

If you are dealing with management that is not capable of rapid change, you are dealing with high risk.  Now, my definition of risk might not be as quantifiable or as easily comparable as a risk quotient like Barra’s beta.  But for my money, it’s what really matters.

Process Content = Secret Sauce

August 4, 2011

As I discuss in my posting on Active Governance, I find a great deal of reluctance among executives to invest in managing and controlling process information.  A few factors play into this reluctance.  The first fundamental fact is that leaders hate spending time, resources and money on things that do not obviously make them money.  No company is in the business of governance.  In the same category are other supporting operations such as Information Technology, Finance, Human Resources and Legal.  Most organizations do not see investments in those parts of the organization as value drivers.  They are viewed as infrastructure: necessary for holding up the theme park, but not for generating revenue and profits.  As long as these non-value-driver (NVD) processes serve the most basic needs, keeping costs minimized is the objective.

BPM treated as the “necessary evil”

The approach to NVD operations mirrors the approach to compliance and governance.  They are necessary evils.  They must be done, but the objective is to minimize costs and provide only the essential requirements.  Spending beyond the minimum provides no additional value to the organization or its investors.  If I spend more money to ensure a higher degree of IT support, it’s not likely to translate into more sales this quarter.  In fact, the cost of new systems or additional resources will shrink margins and drain profitability.

While there is some truth to this common view, I contend that organizations must appreciate the value of ALL processes.  They need to see help desk support processes as equal in value to sales processes.  Further, they must harvest and protect these unique processes as carefully as they protect the secret sauce locked in the impenetrable safe.  Where do Coca-Cola and Heinz keep their secret recipes?  You can bet they are secure and carefully managed.

An IT Transformation Case

One of my clients has over 3,000 IT professionals in the US supporting a subset of an organization of about 300,000 people.  How well this IT organization operates has a huge impact on the overall health of the company.  But how are these processes managed?  How are they understood?  When I embarked on a major transformation effort with them, several things were well understood:

  • Costs could be reduced if they could consolidate overlapping processes across four major business units.
  • Quality could be improved if the global organization could identify the best performing processes and then standardize on those high performing processes.
  • A technology platform for managing process content could allow them to sustain all changes and improve processes on an ongoing basis.
  • Future changes could be achieved far more quickly and with a greater degree of certainty if process content was maintained and understood.
  • Process content within a management platform could be leveraged for governance and compliance initiatives, eliminating future process documentation projects.


What Makes Us Tick?

From this experience and others, I’ve found that it’s the unique processes an organization possesses that can make massive differences in profit margins as well as in revenue generation.  A sales approach is a process.  A pricing approach is a process.  Partner channel development is a process, and success in those approaches needs to be harvested as repeatable processes.  These successes are some of the most important assets of an organization.  But how are they harvested?  How are they repeated?  Who actually understands them?  If a key individual leaves, or a group of key people leaves, does our ability to tap into that market disappear?  Do we lose the ability to form specific partner relationships?

And within IT or HR or Compliance… what about those areas of the business?  Are they just as important as the secret sauce?  A colleague of mine, Chris Taylor, recently highlighted the “secret sauce” issue in his posting on end goals.  He surmises that the “end goal of BPM is creating revenue for your company”.  As I will detail in later postings on this blog, BPM impacts top-line revenue, cost containment, bottom-line results, compliance management, risk management, business agility and investor confidence, among other key business benefits.

I find varying degrees of understanding and appreciation for protecting the “secret sauce” of the organization.  Some organizations are highly protective of their processes and understand that the unique way they manage provides higher margins, quality products, quality service, customer experience and competitive advantage.  Process management, too often viewed as mundane infrastructure, is the critical foundation; it is itself part of the secret sauce.  It may well be that new product development, marketing and sales truly deserve the accolades, but again we must ask: how well has the organization captured that secret sauce and protected it?
