January 17, 2012 § Leave a comment
Recently, I read through the latest World Economic Forum “Global Risks 2011” report, an initiative of the Risk Response Network. It’s an impressive assessment of global risks produced in cooperation with Marsh & McLennan, Swiss Re, the Wharton Center for Risk Management at the University of Pennsylvania, and Zurich Financial. What is compelling about the report is that it is not simply a survey result or a ranked list; rather, it details and illustrates the interrelationships between risk areas, tracing causes in an effort to locate points of intervention. The report highlights response strategies and even proposes long-term approaches.
As with any risk report, it has a tendency to feel alarmist, but its value and content cannot be dismissed, and its emphasis on response is encouraging. The two most significant risks the report identifies relate to economic disparity and global governance. The main point is that while we are achieving greater degrees of globalization and inherent connectedness, the benefits are narrowly spread, with a small minority benefitting disproportionately. Global governance is a key challenge, as each country has differing ideas on how to promote sustainable, inclusive growth.
The Rise of the Informal Economy
The report goes on to highlight a number of risks, including the “illegal economy”. This risk encompasses a cluster of related risks: the political stability of states, illicit trade, organized crime and corruption. Specifically, the issue lies with the failure of global governance to manage the growing level of illegal trade activity. In his recent book, “Stealth of Nations: The Global Rise of the Informal Economy”, Robert Neuwirth estimates that off-the-books business amounts to trillions of dollars of commerce and employs half of all the world’s workers. If the underground markets were a single political entity, its roughly $10 trillion economy would trail only the US in total size. Further, it’s thought to represent in the range of 7-10% of the global economy, and it’s growing. To be clear, underground markets are not only dealing in illegal substances, crime, prostitution or drugs; they mostly deal in legal products. Some of the examples Mr. Neuwirth provides include:
- Thousands of Africans head to China each year to buy cell phones, auto parts, and other products that they will import to their home countries through a clandestine global back channel.
- Hundreds of Paraguayan merchants smuggle computers, electronics, and clothing across the border to Brazil.
- Scores of laid-off San Franciscans, working without any licenses, use Twitter to sell home-cooked foods.
- Dozens of major multinationals sell products through unregistered kiosks and street vendors around the world.
A Global Risk?
Are the underground markets really a global macro-economic risk? Mr. Neuwirth makes solid arguments that these markets provide jobs and goods that are essential to these populations, and that it is the corrupt authorities in most developing countries that are being worked around. In some ways, it can be argued that these unlicensed vendors and importers are the purest of capitalists, innovatively providing goods by avoiding intervention. In a recent interview in WIRED magazine, Mr. Neuwirth points out that Procter & Gamble, Unilever, Colgate-Palmolive and other consumer products companies sell through small unregistered, unlicensed stores in parts of the developing world. He goes on to point out that P&G’s sales in these unlicensed markets make up the greatest percentage of the company’s sales worldwide. I found this tidbit shocking. Really, a company that brings in over $80 billion in revenue a year is actually pulling in most of its revenue through unlicensed channels? Now, that doesn’t mean P&G is selling directly through those channels, but they sell through distributors that may in turn use others that do sell through to unlicensed vendors who don’t pay taxes.
The WEF concludes that illicit trade has a major effect on fragile states, given that the high value of this commerce and the resulting loss of tax revenue impinge on national salaries and government budgets. An example included in the report is that of Kyrgyzstan. “Members of the Forum’s Global Agenda Councils argue that the undermining of state leadership and economic growth by corrupt officials and organized crime contributed significantly to social tensions which erupted in violent conflict in June 2010, causing widespread destruction, hundreds of civilian deaths and the displacement of 400,000 ethnic Uzbeks.”
The Threat to Quality and Public Safety
So, if you were to guess what type of goods tops the list of sales in these underground markets, what would you guess? Cocaine? Opium? Software piracy? Cigarette smuggling? Small arms? Topping the list, with a rough estimate of $200 billion in value, is counterfeit pharmaceutical drugs. Just behind, at $190 billion, is prostitution. Which leads me to the next serious risk issue should global efforts to govern these markets not improve: quality. I’m not qualified to address the quality of prostitution, but let’s consider the quality of counterfeit pharmaceuticals and the general issue of public safety. If these markets go unregulated and unmonitored, we are likely to see terrible abuse by profiteers whose only concern is to bring high-value products to market quickly. No regulation also means an inability to create safe work environments and to protect the rights of laborers all along the supply chain.
On the other hand, the vast majority of workers and consumers in developing countries thrive because of these markets. A strong effort to disrupt or disband them would cause a high degree of distress in communities that rely on them for access to essential goods. But without the tax revenue that can only be gathered from legitimate, licensed businesses, how can governments function and provide the oversight services that would address quality and public safety concerns? It’s an endless loop, as we say in the software world; a true catch-22. Even relatively well-functioning supply chain operations at pharmaceutical companies in developed countries are consistently challenged to maintain a high degree of quality (note the recent impact of product recalls at Novartis). Considering how much effort and money is spent on quality assurance, inspections, and FDA audits for legitimate pharmaceuticals, it’s beyond scary to consider the quality of the counterfeit pharmaceuticals circulating in illicit markets.
Within the US, in the state of California, we’ve seen recent evidence of solutions such as bringing the trade of marijuana within the framework of the law. Potential results include ensuring quality and safety for the public, raising tax revenue and reducing the profits of organized crime. Still, the issue of economic disparity is a much tougher nut to crack. Widening gaps in income within all economies provide incentive for lower income individuals to work outside of established trade structures. This incentive leads to greater illicit trade which in turn hinders a government’s ability to effectively tax businesses and provide services such as regulatory oversight.
Can We Govern Illicit Markets? And If So, Should We?
These are obviously very difficult challenges, but ones that the WEF is analyzing in an effort to form solutions. The relationships between economic disparity, illicit commercial trade, public safety and government corruption become glaringly clear. How can the global community govern these illicit markets? They exist everywhere to some degree, even in the US, where informal markets are estimated to account for 10-20% of GDP. One solution the WEF recommends is to strengthen financial systems. The implication is that weakened systems are the result of the heightened volatility and risk deriving from the recent capital markets crisis. With diminished confidence comes incentive to work outside the system. Some suggestions include:
- Better surveillance of the financial sector, including all systemically relevant players
- Tighter capital and liquidity ratios for all banking institutions (including non-banks), with higher ratios for systemically relevant institutions
- Risk retention for securitization (so-called “skin in the game”)
- Improved transparency and counterparty risk management in “over-the-counter” derivative markets
Perhaps the most interesting part of this global risk challenge is how interrelated these issues are. The influence that government corruption has on illicit markets is direct, but it is not the only factor. Further, the ability of governments to regulate, control and tax this commerce is not straightforward, and overly severe policies can prove detrimental to workers and consumers. And how much do other factors, such as financial stability, contribute to activity moving outside conventional channels? There is no settled view on these underground markets, as we must consider why they exist, for whom they exist and how valuable they are for the good of all.
January 3, 2012 § 11 Comments
With each year’s end, all forms of media spew a tidal wave of predictions. From the apocalyptic to the mundane, we get predictions from prognosticators on who will win an Oscar, which Republican will win in Iowa, how well the market will do in 2012, and who will win the Super Bowl. But it’s not only at year’s end that we get a hefty dose of soothsaying. It’s a public non-stop obsession. Dare I say, it’s an addiction. Predictions are in every facet of society – within industry, we are constantly trying to get insight into the level of demand this month for our products, the level of prices within each product type, and which company will gobble up which other company. That foresight can be a key advantage when competing for resources and competitive superiority is not surprising. What is surprising is the amount of noise pollution and the insatiable desire to listen to that noise.
What is an Expert?
I can still hear my favorite finance professor lecturing during one of my b-school classes about “experts”. He illustrated quite powerfully (obviously, it’s stayed with me all these years) how poor the predictions made by economists on interest rates, GDP growth, oil prices, and stock prices, among many other measures, turned out to be. In article after article, economists, industry experts, political experts and scientific experts were shown to be just slightly better than random guessing. What’s worse, most “experts” tended to influence each other, so that consensus predictions prevailed. Economists predicting the direction of interest rates tended to lump together in narrow ranges, which indicated that, working from the same sets of data with the same sets of assumptions, they also tended to produce the same range of estimates.
Risk and Probability
We all know that the future is uncertain and that many unknown factors impact future events, yet we go to great lengths to predict. The bottom line is that we can draw conclusions that are more about probability than pinpoint calculation. If we normalize probabilistic outcomes for earnings per share for Apple this coming quarter, we can estimate EPS outcomes within ranges. If we believe published consensus estimates by analysts, we find a mean estimate of 9.81 with a coefficient of variation of 4.39. Statistically, this variance is only significant as a historical measure and should not be seen as a predictor, but given that analysts do not have crystal balls, they still use it as the main factor for setting probabilities. So, if we conclude there is a 95% chance that EPS will fall within the range 8.56 – 11.06 (two standard deviations on either side of the mean), we are essentially placing bets based on probability. Now, the valuation of a share of Apple common stock will vary greatly depending on where in this range actual EPS falls. Of course, there is still a 5% chance that EPS falls outside the expected range. And further, these numbers are purely estimates based on one set of assumptions that no two analysts would ever agree on.
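The arithmetic behind that range can be sketched in a few lines. This is only an illustration, assuming the analyst estimates are roughly normally distributed; the mean and the implied standard deviation are back-calculated from the figures quoted above.

```python
# Sketch of the 95% range quoted above, assuming analyst EPS estimates
# are approximately normally distributed around the consensus mean.
mean_eps = 9.81

# The quoted 95% range (8.56 - 11.06) spans two standard deviations on
# either side of the mean, implying a standard deviation of about 0.625.
std_dev = (11.06 - 8.56) / 4

low = mean_eps - 2 * std_dev
high = mean_eps + 2 * std_dev
print(f"95% range: {low:.2f} - {high:.2f}")  # 95% range: 8.56 - 11.06
```

The point of working backward from the range is that the “95% confidence” headline is nothing more than a bet on two standard deviations; actual EPS has a real chance of landing outside it.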
When looking at all these predictions, it can quickly become apparent which “experts” are really viewing their data through a critical lens and which are simply along for the ride, echoing other experts’ viewpoints. What I find most discouraging is how confident some prognosticators are, especially those on television and web broadcasts. They emphatically proclaim their view in an effort to persuade viewers they are right – trying to create self-fulfilling prophecies through persuasion – perhaps the most egregious offense. We see this regularly on political discussion panels, where party-aligned or candidate-partial analysts make their case to persuade us about what people really want and how they will vote. Are they really giving us a scientifically sound viewpoint or simply trying to manipulate our view of what will be?
Predicting Human Behavior
The digital age has provided a powerful platform for gathering information on individuals’ behavior. Companies can gain insight into buying behaviors, as well as data on individual and group interests in entertainment, politics, and professional and social connections. The usefulness of this information is at once obvious and complex. For instance, if we know there is a better than 50% chance that a person buying a camera will also buy a camera case, then it is an effective sales practice to suggest a camera case at the point the buyer selects a camera. This practice is quite common now in online purchasing. We also see it fairly frequently in phone sales as well as in fast food ordering; e.g., “Would you like fries with that?”
Recently, I was shopping for a new lawn mower and researching options on homedepot.com. Within several minutes of performing a search on the site, a pop-up offer to chat with a representative came up. I took that offer as I had some questions about the models. After about five minutes, the rep offered me a 10% discount if I purchased the item online, and he’d help me through the purchase process. I had already made the decision to buy the item, so I was happy to get this discount, but I wanted to pick it up at a store near me rather than have it shipped. Their process allowed for this flexibility: I could purchase online and the item would be instantly put aside for me to pick up that evening. This process was ingenious. What percentage of people shop to the point of sale and then drop off, reducing the chance of buying the item through the original site or perhaps not buying that item at all? This discount offer through the chat rep can help homedepot.com reduce that drop percentage. Now, Home Depot does not know whether I would have purchased that item online that day anyway, and they do not know whether I would have gone to the store and been willing to pay full price.
The fact is: there is some percentage of margin they relinquish for the sake of capturing a higher percentage of potential sales. Home Depot, like so many other retailers, banks, insurance companies, and drug companies, is in the business of prediction – predicting what you will do if they communicate with you in a certain way at a certain time.
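The camera-and-case logic above — offer the add-on when the conditional probability of attachment clears some threshold — can be sketched in a few lines. To be clear, the products, attach rates, and the 50% threshold here are all illustrative assumptions, not any retailer’s actual data.

```python
# Toy cross-sell rule: recommend an accessory when past transactions say
# buyers of the main item also buy the accessory more than half the time.
# All figures below are made up for illustration.

# Estimated P(accessory | main item) from historical purchase data
attach_rates = {
    ("camera", "camera case"): 0.62,
    ("lawn mower", "gas can"): 0.35,
    ("burger", "fries"): 0.71,
}

def suggest_accessory(item, threshold=0.5):
    """Return the accessories worth offering at the point of sale."""
    return [acc for (main, acc), p in attach_rates.items()
            if main == item and p > threshold]

print(suggest_accessory("camera"))      # ['camera case']
print(suggest_accessory("lawn mower"))  # []
```

The interesting business question is where to set the threshold: too low and you annoy customers with irrelevant offers, too high and you leave attach-rate revenue on the table.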
Statistical Sampling – A Foundation for Predictions
Statisticians often tell a joke about a man with his head in a refrigerator and his feet in an oven – on average, he feels about right. Sampling is used to determine probabilities and to make decisions about the level of risk we are taking. It’s sampling and probability that determine the rates we pay for life insurance, car insurance, and all other types of insurance. So, when we try to predict Apple’s earnings per share for next quarter, or whether a catastrophic disaster will strike an Asia Pacific country next year, we must calculate the odds, the percentages, the probabilities. The fact remains, however, that not all outcome distributions follow a normal curve, and highly unlikely events can be game changers.
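That last caveat — not everything follows a normal curve — is easy to demonstrate with a quick simulation. The sketch below compares how often a plain normal model and a heavier-tailed model (here, an arbitrary mixture of two normals chosen purely for illustration) produce extreme outcomes; the specific distributions and cutoffs are my assumptions, not a model of any real market.

```python
import random

random.seed(42)
N = 100_000

# Count outcomes more than 4 standard deviations from the mean under a
# normal model vs. a heavier-tailed model (a mixture of two normals).
normal_extremes = sum(abs(random.gauss(0, 1)) > 4 for _ in range(N))

heavy_extremes = sum(
    abs(random.gauss(0, 10) if random.random() < 0.05 else random.gauss(0, 1)) > 4
    for _ in range(N)
)

# The heavy-tailed model produces far more "impossible" events than the
# normal model predicts.
print(normal_extremes, heavy_extremes)
```

If your risk model assumes the first distribution while reality follows the second, the “game changer” events arrive orders of magnitude more often than your probabilities say they should.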
Prediction should be all about risk, uncertainty, and likelihood, but what you’ll hear this week and throughout the year is a chorus of experts telling you with great certainty what the future will bring. Don’t believe them. If you’re jonesing for advice, try listening to those who are providing detail on probability, risk and trends. But know that the future is never about certainty and always about probability. When prognosticators get it right, they were just plain lucky. They may have played the odds. They may have had some truly intuitive insight that others did not. But there is never a sure thing.
November 7, 2011 § 2 Comments
On my white board sits a list of topics that are near and dear to my heart; topics that I think about often and want to espouse, pontificate and illuminate. Most often, I think I have original ideas on these subjects and while I don’t feel I have the time to get it all out at once, I keep this list with the intention of banging them out slowly – one by one. And almost without fail, in my regular reading or research, I’ll come upon an article or book on one of these topics and then suddenly, like a bolt of revelation, someone’s beaten me to the punch; made the key insights that I thought were my domain.
The Surety of Fools
One such happening occurred this past weekend as I perused the New York Times Magazine: a gentleman by the name of Daniel Kahneman had written an article entitled “The Surety of Fools”, an adaptation from his upcoming book, “Thinking, Fast and Slow”. He hit on a key observation that is at the core of what I’ve been writing about over these last few months: misperceptions of risk. I won’t rehash the whole article, but in essence, Mr. Kahneman points out how we often hypothesize based on logic, but when empirical evidence belies our theories, we simply don’t believe the facts. He calls this phenomenon the “illusion of validity.” I love this premise, as I see it so often with investment managers, news reporters, mortgage brokers, sports team coaches, politicians, voters and prognosticators in general – they all create their own reality.
Creating Your Own Reality
We ALL do it to some degree. We watch the news channel that validates our set biases. We befriend people who support and validate our opinions and views. On the topics of investment risk, operational risk and risk in general, how does that phenomenon play out? Do we see the facts, and are we able to evaluate data without bias? Mr. Kahneman illustrates the reality of investment bias with examples from studying investment managers and how their performance is measured. The vast majority of investment managers he studies do not perform better than a purely random pick of stocks. Yet the illusion of validity causes the management of the largest investment firms to pay those managers bonuses and commissions as if they are keenly skilled; as if the fund managers have brought tremendous value to their clients’ interests. They create their own reality – instead of accepting that the unbiased data shows no value in their management of investment assets.
Life Sciences’ High Stakes
There are even greater risk examples. Life Sciences companies such as pharmaceuticals, biotechnology and medical device firms have huge investments and pressures to produce new products. Each development stage requires rigorous testing and massive volumes of data. While the FDA enforces regulations and these companies are regularly audited both internally and externally, the pressure to produce is high. Time is of the essence when it comes to bringing a new drug to market; both for the sake of patients as well as profits. How well is the data reviewed and scrutinized before passing each validity stage? Is there a bias that errs on the side of validation ahead of rejection? Absolutely. Kahneman’s Illusion of validity is at play and the consequences are immense.
The Supply Chain Fog
For Life Sciences companies, the risks involve patient health as well as immense risks to the company itself, including product recalls, regulatory findings, lawsuits, and ultimately, reputation damage. The organizations I’ve worked with over these last few years are extremely diligent in their processes and methods for R&D, trials, manufacturing and distribution. But other operational risks do exist. In a post last year entitled “Life Science Executives Concerned about Outsourcing and Globalization Unintended Consequences”, Daniel R. Matlis notes, “In the drive to lower costs, manufacturing and sourcing of ingredients and components in countries such as China and India are playing a more prominent role. Yet, according to the research, outsourcing to manufacturers in developing economies carries significant operational risks. Industry Executives surveyed for the research said that Raw Materials sourced outside the US represented the greatest risk to the Value Chain, with 94% of those who responded seeing it as a significant or moderate risk. When comparing the risk profile of US vs. foreign raw material Suppliers, United States Suppliers were classified as low risk nearly 10 times as often as foreign Suppliers.” For any Life Sciences company, the need to define, monitor and track each and every one of its third-party providers adds a level of complexity and difficulty. This difficulty stems from what consultants at Nimbus have labeled the “fog of process accountability, control and oversight.”
To be certain, this fog exists to some degree everywhere and obviously with supply chain partners even more so, but how well an organization tries to create clarity of process definition and clarity of quality both from within and beyond the enterprise is critical when managing operational risk. Perhaps the biggest concern I have with the phenomena of “creating your own reality” is the fact that the “fog of accountability” provides a condition for pushing forward; an excuse for not accepting what the data is revealing; and a scenario wherein doubt can always be cast on outliers.
Focus on the Facts
I spent part of last week with a biotechnology firm’s scientific directors, their CIO and colleagues from TIBCO, briefing them on my company’s software technologies and how they apply to the wide variety of process areas they represent. The volume of data, and the complexity of that data as it applies within their product trials, is tremendous. Next week I’m with a medical device company that’s in the process of a major transformation and will need to address nearly every operational area as part of a corporate spinoff. These are just a couple of quick snapshots, but they epitomize the speed with which organizations change, adapt, and grow. Speed and volume are only increasing – further escalating the demands for validation of each initiative.
I can only hope that Mr. Kahneman’s “illusion of validity” is tempered when organizations manage operational risk and the key decisions that drive product development. The stakes are indeed high when it comes to Life Sciences, but every industry is predisposed to this condition. In short, we can never be too sure. Let’s not fall in love with our own marketing slogans. Let’s understand the complexity we’re faced with, make our best, valid judgments and do the best with the facts we have. While there is never purity in our judgments, we can at least try to be aware of the propensity to fulfill objectives by remaining blind to the facts.
October 6, 2011 § Leave a comment
The past few weeks have been a whirlwind of activity, taking me to DC, Philadelphia, Silicon Valley and Las Vegas. As I work primarily with Pharmaceutical, Life Sciences and Fast Moving Consumer Goods (FMCG) organizations, there has been tremendous interest in improving compliance and quality, and an overall uptick in investment aimed at mitigating risk. Now, working as part of TIBCO, I’m involved in a number of initiatives that pull process improvement into the world of real-time execution. What the heck does that mean? Consider this scenario that’s not in my sector, but clearly illustrates the power of real-time execution: you’re a retailer whose main objective is to optimize value to customers and maximize sales opportunities through targeted promotion. Now, consider all the ways there are to accomplish that. Customers gather information in many ways. They gain awareness of products and retail stores through traditional channels such as TV, newspapers, magazines and radio. Increasingly, awareness comes through the internet, delivered through a growing variety of devices such as laptops, desktops, tablets, and smart phones.
Customer Understanding and Vast Amounts of Data
With the volume of channels and devices growing, so is the amount of data that needs to be designed, managed and pushed to these different sources. Given the variety of touch points with customers, the task of managing communications (specifically promotions) is not easy, so how can retailers be as effective as possible? Let’s start with the retailer’s objective of maximizing sales opportunities through targeted promotion. There are many ways to promote. There’s the old-school carpet-bombing method of devising an advertising campaign and simply pushing a variety of ads through each channel or a chosen subset. Those methods are extremely expensive, and targeting tends to be poor at best. While some web sites and some cable networks make targeting a bit more accurate, there is still quite a bit of dilution. And perhaps the biggest issue with the traditional ad campaign push is that it relies on some action in the future, when the message about a sale or coupon may be easily forgotten or lost. One of the important developments in event pattern matching technology is that it gives retailers the ability to know what customers are doing in real time. Imagine this: a person goes into a retail store to purchase a pair of shoes. The retailer’s system knows the customer’s purchase frequency and preferred brands, and the retailer also knows that 85% of shoe buyers are interested in deals on accessories such as socks. Now imagine that person, while standing in the shoe department, receiving a text alerting them to a 25% discount coupon for socks. This is the power of event-based processing, and its impact is immense.
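A bare-bones version of the in-store scenario above might look like the following. This is a hedged sketch of an event-pattern rule in plain Python, not TIBCO’s actual engine; the event fields, the 80% trigger threshold, and the discount are assumptions for illustration only.

```python
# Minimal sketch of an event-pattern rule: when a known customer is
# detected in a department whose buyers historically attach strongly to
# an accessory, push a coupon in real time. All names and numbers here
# are illustrative.

ACCESSORY_INTEREST = {"shoes": ("socks", 0.85)}  # dept -> (accessory, attach rate)

def on_customer_event(event, send_text):
    """React to a single 'customer entered department' event."""
    dept = event["department"]
    if dept in ACCESSORY_INTEREST:
        accessory, rate = ACCESSORY_INTEREST[dept]
        if rate >= 0.80:  # only promote strong attachments
            send_text(event["customer_id"], f"25% off {accessory} today!")

# Simulate one event; in a real system send_text would hit an SMS gateway.
sent = []
on_customer_event({"customer_id": "C123", "department": "shoes"},
                  lambda cid, msg: sent.append((cid, msg)))
print(sent)  # [('C123', '25% off socks today!')]
```

The essential difference from the batch-analytics world is that the rule fires on the event as it happens, while the customer is still standing in the aisle, rather than on a report run overnight.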
The Need for a New Architecture
The last twenty years have seen companies operating on the premise that all information needed to be stored, analyzed, and reported to enable effective decision making. What TIBCO is doing is nothing short of revolutionary. Given that we have seen 10X growth in data generated by organizations in just the last two years, we can no longer expect traditional database storage and analysis architectures to support this growth and the need for real-time responsiveness. Data flows in real time, and responsiveness to customers must also be executed in real time. Threats to systems and resources happen in real time, and again, responsiveness must be instantaneous. Far too many organizations still rely on massive storage requirements, old business intelligence methods for data warehousing, the creation of data marts, massive libraries of metrics and performance measures, and an abundance of continuously evolving report generation. To what end? How well are organizations able to respond in virtually real time to meet customer, partner, and risk/compliance requirements? Not very well, if they do not adapt to this tidal wave of data growth.
The Advantage of Statistical Insight
This weekend, I took my 12-year-old son to see the recent film release “Moneyball”. The movie is based on a book by Michael Lewis entitled “Moneyball: The Art of Winning an Unfair Game”. I won’t diverge into writing a review (it is good, go see it), but what’s most interesting to me about the subject of Mr. Lewis’ book is the idea that the game is won and lost based on probability, and not the conventional wisdom of aged scouts who see the romantic and timeless qualities in each ballplayer. The success that the General Manager of the Oakland A’s, Billy Beane, has with inexpensive players, out of necessity, brings to light a truth about the value of understanding probability and having the tools to make decisions ahead of your competitors. Through the use of the analytical principles of sabermetrics and the brilliance of his assistant, Paul DePodesta, Beane is able to achieve as much with his paltry budget as teams with four times more to spend. Unfortunately, that advantage diminishes fairly rapidly as the rest of Major League Baseball catches on in successive years, but the point is not lost. If you can gain key insight into your customers, competitors, fraudulent attackers, supply chain partners, etc., you can greatly improve how you interact and respond to create distinct advantage. As I’ve discussed in other posts on Business Agility, the key to operational effectiveness and risk management is agility and responsiveness. Given the vast volume of data now being thrust upon organizations because of the constant stream of connectivity, organizations have a pressing choice to make: utilize real-time capabilities and analytics to create an agile business operation, or risk being overwhelmed and unresponsive in the marketplace.
As I continue to work with organizations that innovate around these real time technologies, I am seeing a growing gap in performance and capability from those that are laggards. The growth of mobility, social networks and connectivity are fueling a step-change in how we manage marketing, production, quality, compliance, governance and virtually all related services (internal and external). The organizations that lead the way with investments in these technologies will be best positioned to innovate and adapt within their respective industries.
September 19, 2011 § 4 Comments
It happens every day, every hour, minute and second. Stuff. Stuff happens, and lots of it. Every so often, something happens that makes us go, “oh, that’s big”. And sometimes it’s so “big” that we scramble to react, to either take advantage or take cover; to move money in or out; run for higher ground or head out to sea. Sometimes we have a bit of notice, but other times we don’t.
Previously, I wrote about risk, fraud and how Barings Bank was brought down by a single rogue trader. Well, it happened again just a few days ago. UBS AG, the large Swiss bank, appears to have lost somewhere in the neighborhood of $2 billion. The news caused its stock to drop promptly, closing 11% lower than the previous day’s close. Moody’s Investors Service quickly reacted, suggesting it would review UBS for a possible downgrade and citing concerns that the bank is not adequately managing risk.
It’s much too early to determine how this trader pulled off his scheme. Early information suggests he may have manipulated back-office operational systems as he previously worked in back-office operations and would have had that knowledge. Did UBS have a policy to restrict back-office workers from transferring to front-office trader positions? They didn’t comment.
There is much that still needs to come to light. Was this the work of a single trader, Kweku Adoboli, as is currently being implied, or were others involved? What controls were in place to prevent these types of trades, and why did they fail? How long did it take for monitors to catch the rogue activity, and did they prevent additional potential damage?
To give a sense of scale, it took only Nick Leeson’s $1.3 billion cheat to bring down Barings in 1995. Jerome Kerviel devised a scheme that cost Societe Generale $7.16 billion in 2008. Other scandals have impacted banks over the years, and the fraudulent events don’t seem to end. Regulations can be implemented and made more stringent; auditors can review organizations’ processes for compliance with those regulations; but still, big stuff happens. It’s the kind of big stuff that wipes out all other assumptions. You can be the finest analyst in the universe, performing all the due diligence necessary to make the most prudent investments. You believe in UBS, and the fact that they brought back senior leadership suggests they are serious about reform. Oswald Grubel was supposed to be turning around the troubled UBS, but it appears he and his leadership team were just not that concerned with managing operational risk. The simple bottom line is: one event can be catastrophic, erasing all other assumptions.
So, the questions that are most pertinent: Which operational events need real-time monitoring? What events need process controls in place to automatically prohibit additional risk exposure? How can managers respond in real time to both opportunities and adverse situations? As Pete Seeger adapted from the Book of Ecclesiastes, “there’s a time to gain, a time to lose, a time to rend, a time to sew”. Similarly, there is a time for analysis and there is a time for real-time response. All the analysis in the world cannot determine the future. As the Heisenberg Uncertainty Principle states, the more precisely one property is measured, the less precisely a complementary property can be determined. In other words, the mere act of observation introduces yet another factor into the set of conditions. There are no absolutes about tomorrow and there is no such thing as risk-free. So, while I pointed out the immense advantage that doing your homework brings in a previous blog post on interconnectedness, at the end of the day a single event can wipe out all of your assumptions.
Well, I know what you’re thinking…. that sucks. First you tell me that I should do fantastic amounts of due diligence to identify opportunities, but then you say, “ahhh, it’s all a waste once a single unexpected event strikes.” Okay, I can see that paradox, but really what I’m saying is: you have to do both. Good operational process management is about analysis of the details; of every single activity; every single owner, reviewer, regulation and risk. And yet, it’s also about agility. What do we do when things don’t go as planned? What do we do when the proverbial poop hits the fan? Can we analyze each activity for its risk exposure? Can we find methods and control activities to mitigate adverse events… especially the catastrophic ones? And can we buy insurance to position ourselves for gain if adverse events strike? Absolutely, I say! Why some organizations don’t, especially financial institutions that are particularly vulnerable, is beyond me. Sometimes it’s just incompetent management, but often it’s a simple failure to appreciate that solid operational process management requires a sizable investment in process thinking, risk management and the development of a process improvement culture.
Fortunately, a lot is being done during this generation to advance process-based thinking and to raise the level of consciousness about business process management and its impact on corporate governance and risk. But, it’s happening slowly. Maybe events like last week’s UBS debacle will open a few eyes…. let’s hope so.
September 7, 2011 § 6 Comments
This past week the company I work for, Nimbus Partners, was purchased by a larger software company, TIBCO. I can’t comment on the due diligence process of the deal, but as with any large acquisition, a great amount of analysis must be performed. To value any software company, the acquirer must assess the product technology, its position in the market, the product’s fit within the existing family of assets, and the company’s current financial state as well as its projected earnings potential.
This acquisition is one of many major decisions that executives at TIBCO and other corporations go through every year. Some investment options require incredibly in-depth analysis while other investment decisions may be made quickly with far less due diligence. There are plenty of reasons for performing an analysis on an investment to a given level and not to a finer level. When purchasing a stock or making a trade on an existing holding, how much information is driving your decision? Did you read the prospectus or the latest 10-Q? Did you attend the recent investor conference calls with management? Did you get all the answers to your concerns about the latest one-time charge to net income? The odds are you didn’t. The odds are you’re trading on gut feel of the situation, or you’re trading on some limited understanding and you accept that risk because you simply don’t have the time to do all of the research you would have liked. Now, you might also put your trust in money managers or fund managers, expecting that they are doing all the analysis required to make good value judgments in line with your risk profile and your investment objectives. Again, are you sure they are going down to a depth of analysis that ensures risk is minimized?
A Hedge Fund Legend
Recently, I read about a very successful investor named Michael Burry. For those of you who haven’t heard of Mr. Burry, he gained a degree of notoriety for wisely betting against banks’ mortgage holdings and cashing in massive returns for his hedge fund when the credit crisis hit full tilt in 2007. His brilliance wasn’t just that he recognized a bubble when he saw one; it’s the way he figured out how to capitalize on his realization that a spectacular number of mortgages were doomed to fail. The fact is, when Mr. Burry first became convinced that the type of lending banks were engaged in was destined to result in large numbers of defaults, there was no real instrument for wagering against the performance of these notes. The various tranches of subprime mortgage bonds could not be sold short. Even with his conviction that the subprime mortgage bond market was doomed, he could not capitalize on it.
Then came Mr. Burry’s discovery of the credit-default swap. It was basically an insurance policy that could be purchased against corporate debt, but it was only useful for betting against companies likely to default, such as home builders. Ultimately, he convinced a number of big Wall Street firms, including Deutsche Bank and Goldman Sachs, to create credit-default swaps on subprime mortgage bonds. Now, what made his work absolutely brilliant was that he would spend untold hours poring over each bond prospectus, betting only against the riskiest of those assets. He performed due diligence on the underlying loans, analyzing loan-to-value ratios, which had second liens, location, absence of income documentation, etc. Within each bond he could sort out the riskiest of the lot, and incredibly enough, Deutsche and the other banks didn’t care which bonds he took positions against. He essentially cherry-picked the absolute worst loans (best for him) and found the bonds that backed them.
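To make the cherry-picking concrete, here is a minimal sketch of ranking loan pools by the kinds of risk factors mentioned above. The field names, weights and thresholds are illustrative assumptions of mine, not Mr. Burry’s actual criteria or data.

```python
# Hypothetical sketch: score loans on crude risk factors (loan-to-value,
# second liens, missing income documentation) and rank bond pools by the
# average score of the loans that back them. Weights are illustrative.

def risk_score(loan):
    """Higher score = riskier loan (illustrative weighting)."""
    score = loan["ltv"]                 # high loan-to-value is riskier
    if loan["second_lien"]:
        score += 20                     # a second lien adds leverage
    if not loan["income_documented"]:
        score += 30                     # "no-doc" loans default more often
    return score

def riskiest_pools(pools, top_n=2):
    """Return the names of the top_n pools with the riskiest loans."""
    ranked = sorted(
        pools.items(),
        key=lambda kv: sum(risk_score(loan) for loan in kv[1]) / len(kv[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_n]]

pools = {
    "BOND-A": [{"ltv": 95, "second_lien": True,  "income_documented": False}],
    "BOND-B": [{"ltv": 70, "second_lien": False, "income_documented": True}],
    "BOND-C": [{"ltv": 90, "second_lien": True,  "income_documented": True}],
}
print(riskiest_pools(pools))  # ['BOND-A', 'BOND-C']
```

The point of the sketch is the design choice, not the numbers: once each loan is scored on observable facts, the worst pools sort themselves out, which is precisely why it mattered that the banks let him choose which bonds to bet against.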
Mr. Burry would ultimately bring his investors and himself astronomical returns at a time when the vast majority of investors lost roughly 50% during the crisis. If you read about Mr. Burry, you’ll find there is much more to his story, as he is unique in many ways, but one key point that separates him from the pack is that he does his homework. Details matter. How these loans were structured mattered to all who were connected to them. In these bonds were real loans that represented real value. Understanding the risk factors would immediately point to a very low valuation on these bonds.
I’m not going to delve into the full issue of responsibility relative to loan originators, banks, Fannie Mae, borrowers, etc, but suffice it to say that solid due diligence reduces the risk of any transaction. The more you understand about the asset under consideration, the better you can predict its performance. It’s as simple as that.
So, what’s with my title, “The Interconnectedness of Things?” Well, it got me thinking about just how interconnected we all are. Without getting all Jean-Paul Sartre on you, let me point out the most common difficulty in all of management: interconnectedness. That’s right, interconnectedness. The fact is, executives hate it. But it exists. We have a tendency to measure the performance of an exact metric, an exact process step, or an exact person. We like to think that sorting out the specific items of measurement can enable us to understand what is strong and what is weak. Fix the weak bits, keep the strong bits, and voila, you have Lean. But, from the work I’ve been involved in, it’s not so simple. Much like the difficulty of sorting out all the bits that make up a good loan versus a bad loan, or a good mortgage bond versus a bad one, business processes can be extremely complex and highly interdependent.
How do we get our arms around the complexity of process? Mostly, in very distinct ways. How many of us love to look at organizational charts, value chain analysis diagrams and system architecture diagrams? If you are nodding your head “yes”, I’m deeply sorry. The fact is, we are trying to ensure we understand the interconnectedness of things, but we often do that work in silos. Efforts to diagram processes, the entity relationships of systems, or the relationships between people are most often performed as one-off attempts with a singular purpose or project in mind. They are not done to ensure a wider scope of understanding is gained and maintained, and therein lies a serious shortcoming of those efforts. With islands of understanding there may be some level of interconnected insight, but the silos remain silos, and whenever we look at those groupings within a map or chart or diagram, too much information is lost. The value of what you have is just as quickly defined by what it does not have. (Perhaps some Camus?)
Devil’s in the Details
So, how do we connect all these silos, and how do we know when we have enough detail? These are big questions to which there are no silver bullets. During a recent engagement, I was working with a global IT organization that brought together four business units to define standard global processes. Ultimately, the idea was to consolidate where possible, but initially they needed to capture how each unit was operating. I’ve done this type of work a number of times, and what still amazes me each time is how often we find gaps in processes, areas that are not understood, as well as overlaps where steps are replicated and no one knew what the others were doing. As we embarked on the journey of process design, the key question that this team asked of me was, “how many levels down do we need to go?” My answer was pretty simple: go down to the level of detail at which someone from outside this process area can read and understand what is happening without any ambiguity.
Imagine if you will an organization that has documented down to that level in a consistent way across their organization. Further, imagine a singular map with diagrams that connect to all appropriate related process steps, to all related electronic content and within a platform that provides instant feedback from the personnel that perform the operations. Now, that’s getting your arms around complexity and it tells the story of the interconnectedness of things.
Finally, once we gain perspective on this interconnectivity we can truly understand what is working and where risk lies. For it is risk that we are constantly managing. The banks that held large amounts of mortgage credit were blind to what was in the big bag of bonds that contained smaller bags of loans that contained all kinds of facts, some of which were never gathered (such as income verification). Did they completely understand the interconnectedness of things? Did they get down to a low enough level of detail to really understand the assets that so much was riding on? To reduce operational risk, the devil’s in the details. Get your arms around process, get your arms around the details and know what you’re buying into.
September 1, 2011 § Leave a comment
Big news this week at Nimbus Partners, a company I joined exactly 4 years ago today. We were acquired by TIBCO, a larger company with a diverse portfolio of BPM products. I’ve known about TIBCO for about ten years now, as they are pioneers in the development of middleware, messaging and enterprise application integration – what is now a core capability within Service-Oriented Architecture, or SOA. Now, SOA is by no means new and its maturity is well advanced in large enterprises. Many organizations have spent and continue to spend substantial amounts on its promise, and a fair number of challenges remain. Still, like most revolutionary technologies, realizing the value of something that radically shifts what is possible takes time. TIBCO has been at the forefront of SOA, BPM and BI with technology that alters how information flows between systems and how quickly business users can get to answers. The potential that lies before me and my company is exciting, with promise to connect advanced infrastructure capabilities with Nimbus’ cutting-edge business process management platform. Now, I’m not going to delve into the intricacies of what is possible or which bits fit with which widgets, as I’m sure whatever I imagine will evolve to be something quite different. What I will tell you is that this acquisition is an exciting event, one that will likely impact a wide variety of global enterprises.
The Complexity of Events
On the topic of events, I’m reminded of a book I read years ago called, aptly enough, “The Power of Events.” It’s written by a gentleman named David Luckham. I was fortunate enough to hear him speak at a Gartner conference not long after I read his book, when it was a groundbreaking topic. Most interesting was his ability to illuminate the capabilities of Complex Event Processing (CEP). This capability matters primarily for organizations that need to minimize risk to their systems and the underlying assets that those systems control. The need to minimize risk is evident in just about every organization I’ve ever entered.
I remember being inspired by Luckham’s book and speech, and it had a big influence when I founded AVIVA Consulting. One of the first opportunities to realize some of the key capabilities within CEP was with a partner company. We took a simple technology for tracking real-time data flowing from any web service and integrated their software with our Microsoft-focused collaboration stack of SharePoint, SQL, InfoPath, Office and K2, a third-party workflow product. We were mildly successful with it, but quickly became focused on risk and compliance requirements and never fully developed the real-time collaboration solution. Later, after I sold our flagship product, ACES, to a Microsoft service provider, Neudesic, I spent a year working to take a Microsoft-platform Enterprise Service Bus (ESB) to market. I named it Neuron (yes, I’m still proud of how clever a name it is) and we launched it shortly before I left to work for Nimbus. So, even though I was on the product management and product marketing side of the ESB product, I became intimately familiar with the core capabilities and potential of middleware messaging and SOA in general.
Now, full circle back to the present… this event is the joining of TIBCO, the most innovative company in the middleware space, with Nimbus, the most innovative company in the BPM content governance space. I couldn’t be more excited to be at the nexus of this formation (you can kick me later if you get this). Personally, I’m excited to see what will happen when we sit with financial services and pharmaceutical companies to look at how risk is managed within quality systems or compliance initiatives. How well can most of these organizations manage real-time events, and how well designed are their processes to deal with adverse or opportunistic circumstances? This is where the opportunity lies. As I point out in my posting on business agility and the need to minimize risk through agile processes, organizations need to design processes that allow rapid response to unexpected conditions as well as the known possibilities. The events that tend to be earth-shattering are not the anticipated ones, so how well we have modeled the organization to respond is critical. Also, as I detail in my posting on checklists, at a minimum we must address the known risks with clear process handling instructions to ensure quality execution.
Rapid Response to Events = Reduced Operational Risk
So, imagine if you will the situation where fraudulent phishing attacks attempt to lure bank customers into providing their login credentials to make a change to their account. Rather than connecting to the real bank, customers are connecting to a fraudulent system that grabs their login ID and password. The fraudsters then log in to the real system, change the password and begin making transactions to pull money out of the victim’s real account. With CEP technology, banks can see in real time how much activity is occurring, and when irregular volumes occur on a given function (such as 10X the usual number of password changes during the past minute), the system disables the password change function and alerts the appropriate administrators. Cool stuff, right? Now, tie in the ability to provide clear instruction on the manual handling that the administrator needs to perform. This outlier password change event is rare, and the steps required of the administrator may be exacting. That’s where Nimbus comes in. The admin will have clear steps to take, ensuring fast and accurate handling with quick access to all necessary resources and reference materials. End game? Very few, if any, customers are impacted. Very little, if any, financial damage is done to the bank. Preventable adverse events are prevented. And we can imagine in reverse how opportunistic events can also be quickly acted upon, with decision-makers having clear instruction on execution.
Understanding Events in Context
How an organization processes and responds to a large volume of diverse events is at the core of what BPM is about. It’s not just process definition for the sake of checking a box that the auditor approves. It’s about improving the decision-making ability of management and other operational decision-makers. It’s about reducing operational risk. And it’s about continually tweaking and improving those processes as we learn what is working and what is not. Gaining real-time event information can be hugely beneficial, but its value increases when we understand these events in the context of precise operational activities.
Those of you who follow my blog already know I’m not in the habit of reporting news or projecting the future, so consider this post a rare exception. Given the personal nature of this event and the impact it will likely have on the future of BPM technology, I felt compelled to comment. In a future post, I will explore technology specifics, including how governance, risk and compliance requirements are handled with the variety of technologies available and the specific categories of capabilities, including automation, content management, master data management, SOA, enterprise architecture, social networks, collaboration, search and reporting. There are a variety of analysts and prognosticators jumping to conclusions about what this merging of technical capabilities will mean to the market. I can tell you that this newly joined organization looks extremely promising, but the proof will be in how we make it happen with our customers. That is how Nimbus has always proven its advantage in the market: through real execution and value creation in real customer environments. With the added strength and reach of capability that TIBCO brings, we should be proving what is possible very soon.