Ever since I was engaged in
writing Principia Mathematica I have come to realize a certain method
which consists of working towards building a bridge between the commonsense
world and the world of science. I accept
both, in broad outline, as not to be questioned. But as in cutting a tunnel through a mountain,
work must proceed from both ends in the hope that at last the painstaking
labour will be crowned with a meeting somewhere in the middle.
—Bertrand Russell
The Bernoulli Model is a risk
management and decision-making methodology that presents the same consistent
storyboard for all organizational risk factors.
The storyboard sits atop a stylishly engineered portfolio of scientific
management algorithms that form an advanced forecasting system which is
mathematically accessible to executives.
It is named after a family of Swiss mathematicians who lived several
hundred years ago and is founded on portfolio theory developed by Harry
Markowitz at the University of Chicago in 1952.
Markowitz forever linked reward with risk in the same way that Einstein
linked space with time in that both the expected outcome and the attendant
uncertainty of outcome are required to complete the picture. The Markowitz Model concatenates three
algorithms—forecasting, integration and optimization—in constructing a holistic
portfolio optimization algorithm that serves to maximize reward for given
levels of risk. Building on Markowitz,
The Bernoulli Model expands along a multitude of dimensions including
forecasting, efficiency analysis, utilization, accountability and
comparability. The Bernoulli Model
further adds the Delphi process to more specifically define organizational
values; utility theory to translate external values into internal values; Monte
Carlo simulation to aggregate heterogeneous risk components; the Camus
distribution to represent risk four-dimensionally; the Bernoulli Moment Vector
(BMV) for tracking forecasts; an alternative hypothesis to serve as the
loyal opposition to the null hypothesis; and highly advanced Excel charts.
The Bernoulli Model represents
a fundamentally different approach to managing risk and value creation. The essence of the difference is found in
three salient points. Firstly, it
involves a shift from an external, market-based focus to an internal,
mathematically based focus. Secondly,
The Bernoulli Model is a top-down approach that stands in stark contrast to
existing bottom-up approaches. The
Bernoulli Model is not intended to replace existing systems but to give
executives a thoroughly different perspective.
Thirdly, it is based in the common software applications of Microsoft
Excel, Visual Basic and Access, and includes all related material so that
organizations may, if they choose, carry on developing the model on their own
without reliance on outside consultants and software updates. The use of Microsoft Access as the database
for The Bernoulli Model means that it can easily be migrated to large-scale
database systems. The Bernoulli concept
involves building a foundation by developing micromodels for the purpose of
highlighting basic subjects. Potential
micromodel subjects for The Bernoulli Model include market risk, credit risk,
the forward curve, drilling risk, capital investment risk, intertemporal
risk-modeling, insurable risk, Bayesian analysis and expert opinion, game theory
and the Delphi process. The Bernoulli concept
includes all related material such as the Excel charts, Visual Basic code,
Access databases and RoboHelp files, as well as the Excel models themselves.
In the 1670s both Newton and
Leibniz formulated versions of calculus, the mathematics of motion. John and James Bernoulli picked up on
calculus, spread it through much of Europe, and set the roadmap for efficiency
analysis by finding the curve along which a bead could slide down in the shortest
time. John's son Daniel set the roadmap
for utility theory based on the idea that the value of an asset is determined
by the utility it yields rather than its market price. No less than eight Bernoullis made
significant contributions to mathematics.
In 1952 a twenty-five-year-old University of Chicago graduate student
named Harry Markowitz stood on the shoulders of giants like Archimedes,
Descartes, Newton, Leibniz, Pascal, Bayes, Cauchy, Gauss, Galois, Laplace, von
Neumann, Einstein and the Bernoullis in producing a fourteen-page paper
entitled Portfolio Selection. His
approach combines regression analysis with matrix algebra and linear
programming in the engineering of basic portfolio theory. The Bernoulli Model builds on Markowitz by
adding components like the Delphi and expert opinion programs along with
utility theory and event risk-modeling using decision trees, and by fortifying
existing components with advanced regression models and metaheuristic
algorithms such as Monte Carlo simulation, neural networks, genetic and
hill-climbing algorithms, and the state-of-the-art four-moment Camus
distribution. The Bernoulli Model is a
stylish Excel-based, RoboHelp-complemented, totally expandable,
enterprise-wide, actuarial-valuation, decision-making system.
Delphi Program 
The Delphi program allows for the input of organizational values,
which then guide the decision-making process.
The basic Delphi value is represented by a confidence level describing
the maximum allowable downside change in portfolio value, i.e. VaR. Advanced Delphi values include utility
translations and objectives relating to financial, strategic, operating and
competitive concerns. 
Forecasting 
The process of forecasting produces not only estimates of outcomes
but also estimates of the uncertainty surrounding outcomes. Advanced forecasting methods include
forward curve analysis, option price analysis, intertemporal risk-modeling,
expert opinion, Bayesian analysis, neural networks and event risk-modeling. 
Integration 
In addition to the basic closed-form method of integration involving
the two-moment normal distribution, The Bernoulli Model also employs Monte
Carlo simulation along with the four-moment Camus distribution in order
to capture and integrate the full spectrum of heterogeneously distributed
forecasts of outcomes and uncertainty surrounding outcomes. 
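The model itself lives in Excel and Visual Basic; as a language-neutral illustration, the simulated-form integration step can be sketched in Python. The two component distributions below (a normal return and a lognormal shock) are assumptions chosen only to show heterogeneous forecasts being aggregated into a single portfolio distribution:

```python
import random
import statistics

def simulate_portfolio(n_iter=20_000, seed=7):
    """Monte Carlo integration sketch: aggregate two heterogeneously
    distributed component returns into one portfolio distribution."""
    rng = random.Random(seed)
    portfolio = []
    for _ in range(n_iter):
        # Component A: roughly normal return (mean 2%, sd 5%) -- assumed
        a = rng.gauss(0.02, 0.05)
        # Component B: positively skewed return from a lognormal shock -- assumed
        b = rng.lognormvariate(0.0, 0.25) - 1.0
        # Equal-weighted portfolio of the two components
        portfolio.append(0.5 * a + 0.5 * b)
    return portfolio

returns = simulate_portfolio()
mu = statistics.fmean(returns)
sd = statistics.stdev(returns)
```

The resulting sample can then be summarized by its moments, which is the role the Camus distribution plays in the model.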
Optimization 
Optimization algorithms search risk-reward space in order to determine
the optimal set of decisions subject to Delphi constraints. Closed-form optimization algorithms include
linear programming, while open-form methods include hill-climbing and genetic
algorithms. 
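As an illustration of the open-form approach, here is a minimal hill-climbing sketch in Python. The toy objective and the 0.6 "Delphi cap" are hypothetical stand-ins, not parameters from the model:

```python
import random

def hill_climb(objective, x0, step=0.05, iters=500, seed=1):
    """Open-form optimization sketch: perturb the current point and
    keep the move only if it improves the objective."""
    rng = random.Random(seed)
    x, best = x0, objective(x0)
    for _ in range(iters):
        cand = min(1.0, max(0.0, x + rng.uniform(-step, step)))
        val = objective(cand)
        if val > best:
            x, best = cand, val
    return x, best

# Toy objective: reward grows with the weight held, but is penalized
# sharply past an assumed Delphi-style risk cap at 0.6.
def reward(w):
    risk_penalty = max(0.0, w - 0.6) * 2.0
    return 0.10 * w - risk_penalty

w_opt, r_opt = hill_climb(reward, x0=0.0)   # climbs toward the 0.6 cap
```

Genetic algorithms extend the same idea by maintaining a population of candidate points rather than a single one.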
Optimized Portfolio 
The optimized portfolio represents the raison d'être of portfolio
theory. Portfolio theory brings
together the consequences of a varied set of uncertain components. The Bernoulli Model employs sensitivity
analysis or stress-testing algorithms to ensure optimal portfolio robustness. 
Historical Data 
Historical data includes market rates, forward rates, option rates
and production data, both historical and current. In addition, portfolio accountability
analysis is fed back into the model along with the other historical data in
order to essentially make the forecasting process self-aware. 
Exposure Data 
Exposure is simply the initial asset value exposed to change. For example, five barrels of oil at $40 per
barrel equals $200 of exposure. A long
position has positive exposure while a short position has negative exposure. Exposure data also includes exposure
dynamics particularly relevant to risk components like credit risk. 
Expert Opinion 
Expert opinion is particularly useful when
historical data is unavailable or unreliable.
The Bernoulli Model integrates expert opinion into the forecasting
process, using Bayesian analysis to estimate the uncertainty surrounding
forecasts. The model also provides
experts with regular follow-up performance reports. 
Event Scenarios 
One of the first event risk studies was conducted in 1933 to examine
the effects of stock splitting on price.
Event risk-modeling is an advanced form of forecasting that uses
decision trees in order to see how specific scenarios might play out. The approach is well suited to contingency
planning. 
Comparability Analysis 
While the primary function of portfolio theory is to bring together
all uncertain components into a single view, the secondary function is to
provide comparability between components.
The Bernoulli Model employs the complementary principle with the null
and alternative hypotheses. 
Accountability Analysis 
The nineteenth century saw the inauguration of management accounting designed
to manage the efficient conversion of raw materials into finished
products. The Bernoulli Model is
designed to manage the efficient conversion of uncertain information into
organizational value. 
· Radically option-based
· Roadmaps covering all risk factors
· Top-down implementation potential
· From training-level to fully-operational
· Expanded definitions of both risk and reward
· The realization of organizational portfolio management
· An enterprise-wide, actuarial-valuation, decision-making system
· The state-of-the-art four-moment Camus distribution for modeling risk
· The Bernoulli Moment Vector (BMV) for tracking asset values and forecasts
· An engineered help system to perfectly complete the Bernoulli theatre of online scientific management
· Building the case
· The complementary principle
· From accounting to scientific management
· From a posteriori to a priori
· Predefined values and opinions
· The frontier of asset management
· The awesome power of prototyping
· Do you know what a paradigm shift is?
The Delphi definition
worksheet is the overriding guidance system of The Bernoulli Model, which
employs the iterative Delphi method designed
to draw out fundamental values from officers and directors. The
purpose of the Delphi process is to streamline decision-making for all concerned. The
Delphi method is named after the Socratic inscription, Know Thyself, at the
oracle at Delphi in ancient Greece. The
primary Delphi value pertains to the confidence level and value for allowable
downside risk exposure of the portfolio distribution, also known as
value-at-risk or VaR. The secondary Delphi value pertains to the
utility translation function, which adjusts market returns to more
accurately represent internal organizational values. If the Delphi process were applied to a sovereignty, then the outcome would be a
constitution. The political theorist
Jean-Jacques Rousseau (1712–1778) rejected the notion of representative
democracy and instead asserted that everyone should vote on every issue. The Delphi method offers a more enlightened solution
with the predefinition of values. The
Delphi definition worksheet also includes two additional charts: the Fractal
Scaling Definition, which relates the kurtosis to the fractal scaling
exponent, and the Bernoulli Form, which depicts the touchstone for The Bernoulli
Model by graphically representing the solution to the problem of the curve along
which a bead can slide down in the shortest time. Below is a list of possible additional Delphi
criteria.
Financial Objectives 
Cash flow, dividends, earnings, interest coverage, share price, debt
rating and financial solvency concerns. 
Strategic Objectives 
Basic organizational objectives. 
Operating Objectives 
Protection from property, business interruption and liability losses. 
Competitive Objectives 
Objectives regarding positions relative to competitors. 
Directors and Officers Bias 
Liability protection for directors and officers of the organization. 
Social Values 
Needs of the employees and the community. 
Valuation Parameters 

Portfolio Increment 
Portfolio increment describes the size of the portfolio under
consideration. 
Epoch Units 
An epoch is a basic unit of time.
The basic unit of time for The Bernoulli Model is one month. 
Valuation Epochs 
In this example the Delphi process has determined that the global
valuation period is three months. 
VaR Definition 

Frac-M9 
The fractal scaling switch determines whether to adjust risk normally
(i.e. with the square root of time, 0.5) or fractally (i.e. with an exponent
between 0.5 and 1.0 in accordance with the transformation of kurtosis). 
DelphiVaR 
The global DelphiVaR determines the global value (i.e. VaR, value at
risk) for downside risk exposure. 
DelphiCL 
The global DelphiCL determines the global confidence level (i.e. CL)
for downside risk exposure. 
DelphiSD 
The global DelphiSD is the calculated number of standard deviations
(i.e. SD) based on the CL and the Frac-M9 switch. 
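The worksheet formulas are not reproduced here, but under a normal assumption the CL-to-SD calculation reduces to an inverse-normal quantile, and the fractal switch applies a scaling exponent to the horizon. A Python sketch of that assumed relationship:

```python
from statistics import NormalDist

def delphi_sd(confidence_level: float) -> float:
    """Number of standard deviations for a one-sided confidence level
    under a normal assumption (e.g. 95% -> about 1.645)."""
    return NormalDist().inv_cdf(confidence_level)

def scale_risk(sd_one_epoch: float, epochs: int, frac: float = 0.5) -> float:
    """Scale one-epoch risk across epochs: normally (frac = 0.5)
    or fractally (0.5 < frac <= 1.0)."""
    return sd_one_epoch * epochs ** frac

sd = delphi_sd(0.95)         # standard deviations for a 95% confidence level
var_3m = scale_risk(sd, 3)   # three-epoch horizon under normal scaling
```

The function names here are illustrative; they are not the worksheet's own identifiers.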
Utility Definition 
The utility definition shows the unadjusted and adjusted returns at
three points, along with the parameters that create the curve which translates
unadjusted into adjusted returns. 
Workbook Settings 

Window Zoom 
The window zoom is a convenience feature that allows users to zoom
all worksheets at once. 
Smart Rounding 
Smart rounding is a display feature which formats displayed values to
the specified number of significant digits. 
The utility definition
converts unadjusted returns into adjusted returns in accordance with the
utility transformation curve. So while
the four-moment Camus distribution and fractal scaling provide an expanded
definition of risk, utility theory provides an expanded definition of reward by
changing external market values into internal Delphi values. For example, a 100 percent return (i.e. Mu-M1)
becomes a 50 percent return (i.e. VaL-M5), while a –20 percent return (i.e. Mu-M1)
becomes a –30 percent return (i.e. VaL-M5).
The rationale is that the translation reflects the gravity of the return. A 20 percent loss in value could well have
unexpected secondary effects that end up feeling like a 30 percent
loss. Similarly, while a 100 percent
return is obviously a desirable outcome, undue emphasis on trying to achieve
such a result may produce erratic outcomes and cause more conservative
opportunities to be missed. Therefore,
under utility translation, the 100 percent return is viewed internally as just
a 50 percent return. Utility theory was
founded by Daniel Bernoulli (1700–1782) with his 1738 paper entitled Exposition
of a New Theory on the Measurement of Risk, whose central theme is
that the value of an asset is the utility it yields rather than its market
price. His paper, one of the most
profound documents ever written, delineates the all-pervasive relationship
between empirical measurement and gut feel.
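The actual utility transformation curve parameters live on the worksheet and are not reproduced here; a minimal piecewise-linear sketch through the example points above (the assumption that 0 percent maps to 0 percent is mine) looks like this in Python:

```python
def utility_translate(r: float) -> float:
    """Piecewise-linear utility translation through the example points
    in the text: -20% -> -30%, 0% -> 0%, +100% -> +50%.
    (The worksheet's actual curve parameters are not specified here.)"""
    if r < 0:
        return 1.5 * r    # losses weigh heavier internally
    return 0.5 * r        # gains are discounted internally

loss_internal = utility_translate(-0.20)   # about -0.30
gain_internal = utility_translate(1.00)    # 0.50
```

A production curve would presumably be smooth rather than kinked at zero; the kink is only an artifact of fitting three points with two line segments.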
The fractal definition
determines the scaling exponent (i.e. Frac-M9) as a function of kurtosis (i.e.
Kurt-M4). A normal distribution has a
kurtosis of three, which translates into a fractal scaling exponent of 0.5,
the normal application with the square root of time. Going from a one-month valuation period to a
one-year valuation period under a normal assumption (i.e. M4 = 3, M9 = 0.5)
results in a scaling factor of 3.5 times (i.e. 12^0.5), while a similar
calculation under a Camus distribution with a kurtosis of six (i.e. M4 = 6, M9 =
0.75) produces a scaling factor of 6.4 times (i.e. 12^0.75). The rationale is simply that with higher
kurtosis comes the potential for larger jumps and thus a larger scaling
factor. The fractal scaling exponent is
also known as the Hurst exponent, named after the hydrologist Harold Hurst
(1880–1978), who managed the Nile River Dam from 1925 to 1952 by formulating water
discharge policies aimed at minimizing both overflow risk and the risk
associated with insufficient water reserves.
Fractals are self-similar geometric shapes in that there is no inherent
scale, so that each small portion can be viewed as a scaled replica of the
whole. Fractal analysis in risk-modeling
capitalizes on self-similarity, making it possible to extrapolate small-scale data
(e.g. daily) to larger-scale data (e.g. monthly) where the latter is unavailable. Fractals exist pervasively in nature because
their form is the most stable and error-tolerant.
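The scaling arithmetic above is easy to verify; a two-line Python sketch reproduces the 3.5x and 6.4x factors:

```python
def scaling_factor(epochs: int, hurst: float) -> float:
    """Scale one-epoch risk to a multi-epoch horizon: epochs ** hurst.
    hurst = 0.5 reproduces square-root-of-time scaling."""
    return epochs ** hurst

normal_scale = scaling_factor(12, 0.5)     # ~3.46x for a one-year horizon
kurtotic_scale = scaling_factor(12, 0.75)  # ~6.45x under heavier tails
```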
The Bernoulli Form is the
metaphor of efficiency that undergirds the entire Bernoulli methodology. The Bernoulli Model is mathematically and
aesthetically efficient in its singular direction towards organizational
efficiency. Efficiency is the ability to
achieve an objective with a minimal expenditure of resources. In the case of risk-reward efficiency, the objective
is to achieve the highest expected reward with the least expected exposure to
risk. In the case of cost-function
efficiency, the objective is to achieve optimal function with the least
expenditure of cost. The Bernoulli
brothers, James (1654–1705) and John (1667–1748), challenged each other with
the problem of finding the curve along which a bead could slide down in the
shortest time, and found the answer in a cycloid, thereby laying the foundation
for efficiency analysis. John's son
Daniel set the roadmap for utility theory and also formulated the Bernoulli
principle, which simply states that the faster a gas flows in a pipeline, the lower
the pressure on the walls of the pipe.
Airplane wings are designed in accordance with the Bernoulli principle
so that the distance air travels over the top of the wing is greater than the distance
it travels under the wing, thereby creating a pressure differential and thus
lift. No less than eight Bernoullis
made significant contributions to mathematics.
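For reference, the brachistochrone curve the Bernoulli brothers found is the cycloid, whose standard parametric form (a textbook result, not taken from the model) is

```latex
x(\theta) = a(\theta - \sin\theta), \qquad y(\theta) = a(1 - \cos\theta),
```

where $a$ is the radius of the generating circle and the descent time from cusp to low point is $T = \pi\sqrt{a/g}$.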
Exposure is simply the initial asset value exposed to change. For example, five barrels of oil at $40 per
barrel equals $200 of exposure volume. A
long position has positive exposure while a short position has negative
exposure. Exposure data also includes
exposure dynamics particularly relevant to risk factors like credit risk. Exposure status allows users to turn specific
exposures on and off. Note that the
sensitivity switch in the valuation parameters allows for inclusion of negative
and positive exposures. Contract Rate-M0
is the initial contracted rate of the position and also represents the initial
exposure for analysis of the rate.
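The exposure arithmetic can be sketched in a few lines of Python (the function name and signature are illustrative, not the model's own):

```python
def exposure(units: float, price: float, long_position: bool = True) -> float:
    """Exposure = initial asset value exposed to change;
    negative for short positions."""
    sign = 1.0 if long_position else -1.0
    return sign * units * price

oil = exposure(5, 40.0)                             # five barrels at $40 -> $200
short_oil = exposure(5, 40.0, long_position=False)  # short position -> -$200
```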
If the Delphi definition
worksheet is the overriding guidance system of The Bernoulli Model, then the
technical analysis worksheet is the engine, the actuarial valuation worksheet
is the cockpit and the pre-chart analysis worksheet is the transmission. The Bernoulli technical analysis worksheet
contains a systematic delineation of the technical analysis involved in the
actuarial valuation process. The Bernoulli
Model is a prototyping system where the micromodels serve as both templates and
checkers for the operational models. The
micromodels, combined with the Bernoulli online help system, may also act as
training models for executives and managers.
Ultimately it means that every asset management process in the
enterprise-wide actuarial valuation system is accessible to executives. The Harvard Business Review began publication
in 1922 with the mandate of connecting fundamental economic and business theory
with executive experience. The Bernoulli
Model represents a systematic realization of that very mandate. It is designed for everyday use by
executives, managers and analysts. The
model offers the ability to either skim the surface of risk-modeling and
decision analysis or, alternatively, to drill down and examine the inner workings
of specific decision problems.
Ultimately it frees up executives from the politics of decision-making
and allows them to think more broadly about values and objectives.
The Bernoulli pre-chart
analysis worksheet contains the translation of the technical analysis into the
graphical format found in the six charts on the actuarial valuation
worksheet. The charts themselves are
very sophisticated and use well-written Visual Basic code to generate the chart
data, all of which is available for use in other applications and as the
starting point for other charting applications.
The ability to present ideas in a consistent and uncluttered way is the
hallmark of The Bernoulli Model. The
concept of a theory comes from the ancient Greek idea of detachment as the path
to wisdom. The word theory shares its root
with the word theatre and is derived from the Greek verb theasthai, which
means to behold as an action in which the observer is not involved. In fact, the dirty little secret of all great
thinkers is that new ideas are formulated completely outside any everyday view
of reality. Einstein first introduced
relativity theory in 1905 as a simple set of algebraic equations. Yet the theory was largely ignored until four
years later, when Minkowski presented a geometric view of relativity as
characterized by the four-dimensional spacetime continuum. The Bernoulli Model represents a risk-modeling
and decision-making theatre for executives, managers and analysts alike.
Portfolio Efficiency Analysis Chart
The Bernoulli Model presents
the same consistent storyboard for all organizational risk factors. The display parameters are on the top while
the valuation parameters are on the left.
Below the valuation parameters is the component Bernoulli Moment Vector
(BMV), while on the right side of the charts is the portfolio BMV. If the three charts on the left of the
storyboard are the components or ingredients in a loaf of portfolio bread, the
three charts on the right are the different ways of slicing up the bread. The light blue is the null paradigm, the green is
the alternative paradigm and the dark blue is the overlap between the two paradigms. Chart V1 delineates the risk factor exposure
to change in value, e.g. five barrels of oil at $40 per barrel equals $200 of
exposure. Chart V2 captures the
component forecast distributions for two of the components. Chart V3 shows the correlation between
changes in value of components, i.e. a correlation of one means factors move in
lockstep and a correlation of zero means no correlation. Chart V4 illustrates first-order risk
management by contrasting risk (i.e. VaR) with risk exposure limits (i.e.
Delphi). Chart V5 captures the portfolio
forecast distributions and demonstrates the standardized Bernoulli paradigm
frame, i.e. ±six-sigma. Chart V6
illustrates second-order risk management by contrasting value creation against
the risk associated with value creation.
HistoIter 
HistoIter determines the number of histogram iterations or
statistical samples used to generate the distributions in Charts V2 and V5. 
V23Disp 
The V23Disp switch allows for toggling between rate and portfolio in Charts
V2 and V3. The rate setting shows value before exposure, while the
portfolio setting shows value after exposure. 
BMVdisp 
The BMVdisp switch changes the display base of the Bernoulli Moment Vector
(i.e. BMV). Value shows asset
values. M0 shows percentage of
exposure. M2 shows the number of standard deviations. 
V25Axis 
The V25Axis switch changes the X-axis for Charts V2 and V5.
Value shows asset values. M0
shows percentage of exposure. M2 shows the standard six-sigma paradigm
frame. 
TailScale 
The TailScale switch works in conjunction with the TailMult switch
and allows for settings of Off, 3SD and VaR. If only one ValStat switch
is on and the TailScale switch is on, then the tails are dark blue. 
TailMult 
The TailMult switch works in conjunction with the TailScale
switch—and is the tail multiplication factor. The valid range for
TailMult is zero or greater. 
ValStat 
The ValStat switch turns the Null and Alt valuation paradigms on and
off. The light blue is the null paradigm, the green is the alternative
paradigm and the dark blue is the overlap between the two paradigms. 
ValForm 
ValForm determines whether the actuarial valuation is closed-form or
simulated. Closed-form valuation utilizes matrix algebra and the normal
distribution to integrate forecasts, while the simulated form is much more
robust and is able to make use of the four-moment Camus distribution and the utility
translation feature. 
ValDate 
ValDate is the date at which the assets are valued. 
ValType 
An ex ante valuation uses only historical data, while an ex post
valuation uses both historical and future data, i.e. perfect information.
The overriding goal of forecasting is to construct models that best
align ex ante with ex post results. 
Epochs 
Epochs refers to the number of nonzero weighted epochs. 
Lambda 
Lambda determines the decay weights between adjacent epochs. 
Sensitvty 
The Sensitvty parameter is related to exposure and allows for either
short positions, long positions or both. 
Utility 
The Utility switch determines whether the utility transformation
defined on the Delphi definition page is activated or not. The Utility
transformation only works if the ValForm switch is set to Sim. 
M1stat 
In the on position M1stat uses the forecasted value, while in the off
position the expected value of the forecast is zero. 
MMstat 
The MMstat switch has three settings: off, M22 and M44. The off
setting produces expected correlations of zero, while the M22 setting
produces the forecasted correlations. The M44 setting stress-tests
correlations. 
SimIter 
SimIter determines the number of simulation iterations used to calculate
the BMV when the ValForm switch is set to Sim. 
Dist 
The Dist switch selects the distribution as either the normal or the
Camus. The normal is a two-moment distribution while the Camus is a
four-moment distribution. 
PortM3 
The PortM3 input allows for the third moment or skewness of the
portfolio to be overridden, but only if the Dist switch is set to Camus. 
PortM4 
The PortM4 input allows for the fourth moment or kurtosis of the portfolio
to be overridden, but only if the Dist switch is set to Camus. 
Offset 
The Offset switch is a display switch which determines the offsets to
be overridden, with the choices being MTV or return. MTV is mark-to-value and is the change
since the . Return . 
Marker 
The Marker switch toggles the triangle indicating M1, M5 and Sim for
Charts V2 and V5. 
Exposure is simply the
intuitive concept of the effective asset value exposed to change. For example, five barrels of oil at $20 per
barrel equals $100 of exposure. For
comparison, a penalty in hockey represents two minutes of power-play
exposure. A long position has positive
exposure while a short position has negative exposure.
Exposure is essentially the denominator of risk analysis and includes
adjustments for exposure dynamics and nonlinearities. The model also calculates the effective
portfolio exposure so that the notion of exposure is defined everywhere that
risk is defined. The Bernoulli Model
uses exposure as the starting point for the actuarial valuation process and
allows for full comparability between risk components in respect to both
exposure and distributions. The light
blue represents the null paradigm, the green represents the alternative
paradigm and the dark blue is the overlap between the two paradigms. The first
four moments of a statistical distribution are the mean, standard deviation,
skewness and kurtosis. The mean is
calculated as the average value, the standard deviation as the square root of
the average squared deviation about the mean, the skewness as the standardized
average cubed deviation about the mean, and the kurtosis as the standardized
average fourth-power deviation about the mean.
In The Bernoulli Model, exposure is the zero or null moment.
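The four moments described above can be computed directly; here is a Python sketch using the population definitions (kurtosis unexcessed, so a normal distribution yields three):

```python
import math

def four_moments(xs):
    """Mean, standard deviation, skewness and kurtosis of a sample
    (population definitions; kurtosis of a normal sample -> ~3)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - mu) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mu) ** 4 for x in xs) / (n * sd ** 4)
    return mu, sd, skew, kurt

mu, sd, skew, kurt = four_moments([1.0, 2.0, 3.0, 4.0, 5.0])
```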
While exposure represents the
denominator of risk analysis, the statistical distribution represents the
numerator. And while Chart V2 shows
component distributions, Chart V5 shows the portfolio distributions for both
the null and alternative hypotheses.
Both Charts V2 and V5 use the standard Bernoulli paradigm frame (i.e. ±six
standard deviations) and are affected by the V23Disp, V25Axis, HistoIter,
TailScale and TailMult display switches.
Chart V2 can show either the rate (shown above) or the portfolio (shown
below) distributions. The rate
distribution is prior to the application of exposure, while the portfolio distribution
shows the selected portfolio component after the application of exposure. While the two charts shown here only show
value along the bottom axis, the V25Axis switch allows for both exposure and
standard deviation along the bottom axis.
In fact, the alternative paradigm (i.e. the green) is calibrated to the
standard Bernoulli paradigm frame (i.e. ±six standard deviations). The two available valuation methods are
closed (i.e. matrix algebra) and open (i.e. Monte Carlo simulation). Using Monte Carlo simulation and the
four-moment Camus distribution, The Bernoulli Model is able to use the method
of moments to capture the first four moments from the simulation and then apply
them to the Camus distribution, with the end result being a smoothly presented statistical
distribution generated with a minimal number of simulation iterations.
While the volatility of components
is usually considered to make up the principal composition of a portfolio
distribution, it is often dramatic changes in correlation between components in
times of turbulence that lead to unforeseen shifts in portfolio value. Chart V3 shows the expected correlation
between changes in value of risk components.
A correlation of one means that the components move in lockstep, while a
value of zero indicates there is no correlation between components. While standard deviation is the second moment
of a risk component, correlation is the second moment between risk components. To more accurately capture kurtotic
correlation (i.e. large swings), the MMstat switch in the valuation parameters
includes an M22 option (i.e. normal correlation) and an M44 option, which
essentially selects the correlation against the portfolio and in effect
represents a stress-test of correlation.
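The second moment between components is the familiar Pearson correlation; a self-contained Python sketch (generic statistics, not the worksheet's own code):

```python
import math

def correlation(xs, ys):
    """Pearson correlation between two series of value changes:
    covariance divided by the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

lockstep = correlation([1, 2, 3, 4], [2, 4, 6, 8])   # components in lockstep
```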
Niels Bohr (1885–1962), one of the founding fathers of quantum theory,
defined the complementary principle as the coexistence of two necessary and
seemingly incompatible descriptions of the same phenomenon. One of its first realizations dates back to
1637, when Descartes revealed that algebra and geometry are the same
thing: analytic geometry. The ability to
contrast paradigms, scenarios, strategies, risk components and valuation
parameters, and thus provide complementary perspectives of the same
portfolio, represents an invaluable feature of The Bernoulli Model.
Value-at-risk or VaR is a
market risk measurement standard that was created to allow companies to relate
the income from trading to the risk inherent in trading. Its original purpose was to establish a frame
of reference for evaluating trades, as well as providing a mechanism for
officers and directors to set risk exposure limits. Although VaR is nothing more than a subset of
the Markowitz Model formulated in 1952, the modern VaR movement started with
the investment bank JP Morgan in its attempt to create a market risk
measurement standard. Traders were
playing the game of heads-I-win-tails-you-lose and exposing organizations to
huge risks. When gambles went south, the
traders simply moved on. The problem is
that VaR is an unsophisticated measure and traders are now gaming VaR. VaR is defined as a statement of probability
regarding the potential change in value of a portfolio resulting from changes
in market factors over a specified time interval, with a certain level of
confidence. A one-day time horizon and a
confidence level of 95 percent are commonly used. The Bernoulli Model uses a one-month time horizon
as a basis and forecasts as far forward as twenty years. The method includes all organizational risk
components and uses highly advanced forecasting techniques like intertemporal
risk-modeling and decision trees with Monte Carlo simulation. The Delphi process is used to establish the VaR
confidence level and value. Chart V4
shows the VaR/Delphi comparisons for the portfolio and three components.
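Under the basic normal, closed-form assumption, VaR reduces to a one-line calculation. The sketch below is a generic parametric VaR, not the model's own worksheet logic, and the 1 percent mean and 5 percent volatility are illustrative numbers:

```python
from statistics import NormalDist

def parametric_var(exposure: float, mu: float, sd: float,
                   confidence: float = 0.95) -> float:
    """One-period parametric VaR under a normal assumption:
    the loss not exceeded with the given confidence."""
    z = NormalDist().inv_cdf(confidence)
    return exposure * (z * sd - mu)

# $200 of exposure, 1% expected monthly return, 5% monthly volatility
var_95 = parametric_var(200.0, 0.01, 0.05)
```

Raising the confidence level raises the VaR figure, which is how the Delphi confidence level feeds directly into the risk exposure limit.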
The statistical distribution
is one of the most beautiful Forms for the reason that it represents both the
forecast of outcomes as well as the expected uncertainty surrounding
outcomes. A fractal is a mathematical
Form having the property of selfsimilarity in that any portion can be viewed
as a reduced scale replica of the whole.
There are explicit expressions for three fractal distributions—the Bernoulli
(ie. coin toss), the normal and the Cauchy.
The Bernoulli converges
to the normal distribution when the number of coins becomes sufficiently
large. The Cauchy is interesting in that
it possesses undefined moments. The
Bernoulli Model uses the fourmoment Camus distribution to model the full range
of fractal distributions. The first four
moments of a statistical distribution are the mean, standard deviation,
skewness and kurtosis. The Bernoulli
Model is able to capture the output from the most complicated forecasting
simulation and present the smoothly represented Camus distribution generated
with a minimal number of simulation iterations.
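The convergence of the Bernoulli to the normal, and the four moments themselves, can be verified with a short simulation; this sketch uses plain Python rather than the model's own machinery:

```python
import random
random.seed(42)

def four_moments(xs):
    """Sample mean, standard deviation, skewness and kurtosis,
    with skew and kurt normalized by the matching power of sigma."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * sd ** 4)
    return mean, sd, skew, kurt

# Sum 100 coin tosses, 20,000 times: the resulting distribution is
# close to normal, so skewness is near 0 and kurtosis is near 3.
sums = [sum(random.randint(0, 1) for _ in range(100)) for _ in range(20_000)]
mean, sd, skew, kurt = four_moments(sums)
```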
The three images of Chart V5 here show the portfolio distributions for
both the null and alternative hypotheses, for just the null, and for just the
alternative—with a base of asset values, percentage of exposure, and the
standard Bernoulli paradigm frame (i.e., +/– six standard deviations). Note that the display is not limited to the
specific combination of formats shown here.
The efficient frontier has
come to form the bedrock of modern portfolio theory since its introduction by
Harry Markowitz in 1952. It is the asset
allocation mechanism of choice for virtually all pension funds, and pension
fund money makes up the lion’s share of investments on Bay and Wall
streets. The efficient frontier is
defined as the risk-return tradeoff curve.
The principle of risk and return is fundamental to asset
management: the two are complementary in that a greater return on investment
is expected as more risk is accepted. Markowitz defined risk as the
variability of returns as measured by the standard deviation about the
mean. In practice, investors choose a
comfortable amount of risk to assume which then translates into an expected
return via the efficient frontier. The
efficient frontier is the best one can do in terms of risk-reward
efficiency. It represents the panoramic
view of the organizational portfolio depicting the fruition of the highest forecasting
and decision-making intelligence available.
And while the end result is sufficient reason for conducting the
exercise in the first place, the process of going through the analysis is often
worthwhile in and of itself. Chart V6
illustrates second-order risk management by contrasting value creation against
the risk associated with value creation—thus ensuring the efficient conversion
of uncertain information into value.
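A crude frontier can be traced by sampling random long-only portfolios and keeping the best return seen at each level of risk; the asset figures here are hypothetical and, to keep the sketch short, cross-correlations are assumed to be zero (the full Markowitz treatment uses the whole covariance matrix):

```python
import random
random.seed(1)

# Illustrative inputs: expected returns and volatilities of three assets
mu  = [0.04, 0.07, 0.11]
vol = [0.05, 0.12, 0.22]

def portfolio(weights):
    """Expected return and risk (standard deviation) of a weighted mix,
    assuming zero cross-correlations."""
    r = sum(w * m for w, m in zip(weights, mu))
    v = sum((w * s) ** 2 for w, s in zip(weights, vol)) ** 0.5
    return r, v

# Sample random long-only portfolios and keep, per risk bucket,
# the highest return seen -- a crude trace of the efficient frontier.
frontier = {}
for _ in range(50_000):
    raw = [random.random() for _ in mu]
    total = sum(raw)
    r, v = portfolio([w / total for w in raw])
    bucket = round(v, 2)
    frontier[bucket] = max(frontier.get(bucket, r), r)
```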
The Bernoulli moment vector
expands on the notion of the first two statistical moments of the distribution,
first by including M0, the value that is exposed to change, and then by
including the third and fourth moments of the distribution. M1 is translated into M5 using a utility
transformation defined by the Delphi process.
M2–M4 are translated into M6 via the selection of a local confidence
level of the portfolio distribution. The
local confidence level is a translation of the global confidence level, which
is also defined by the Delphi process.
While M6 represents the lower local confidence bound of the distribution,
M7 is the upper local confidence bound.
M8 is the global confidence level of the distribution. M9 is the fractal coefficient and depicts the
coefficient by which risk scales over time.
A value of one-half is the standard Brownian motion coefficient and is
equivalent to scaling according to the square root of time. MM is the correlation coefficient and
characterizes the portfolio correlation.
SM represents one iteration of a simulated return. A description of each element in the BMV
follows in the table below. Below that
table is a graphical depiction of the BMV with the BMVdisp switch set to Value,
M0 and M2. Value
shows asset values. M0 shows percentage
of exposure. M2 shows the number of standard deviations.
MTV
MTV is the change in value since the inception of the asset. The value can be determined externally (i.e., the market) or internally via a utility transformation of market value.

Return
Return is the backward rate of return since the inception of an asset. The value can be determined externally (i.e., the market) or internally via a utility transformation of market value.

Offset
The offset is the selected offset displayed in Charts V2 and V5, and can take on values of None, MTV or Return. The offset is expressed as either a change in value or a backward rate of return. (display)

Expo—M0
Exposure is the intuitive concept of initial asset value exposed to change. For example, five barrels of oil at $40 per barrel equals $200 of exposure. A long position has positive exposure while a short position has negative exposure.

Mu—M1
M1 is the first moment and is the expected or forecasted outcome.

SD—M2
M2, or standard deviation (SD), is the second moment: the square root of the mean of the squared deviations of returns from M1. It is equivalent to volatility and to the square root of variance.

Skew—M3
M3, or skewness, is the third moment and is calculated as the average of the cubed deviations from the mean, normalized by the cube of M2. M3 measures the asymmetry of a distribution, with an M3 of zero indicating a symmetric distribution. The normal distribution has an M3 of zero.

Kurt—M4
M4, or kurtosis, is the fourth moment and is calculated as the average of the fourth-power deviations from the mean, normalized by the fourth power of M2. M4 measures the tail-thickness as well as the peakedness of a distribution. A normal distribution has an M4 of three.

VaL—M5
M5 is the translation of M1, possibly using the utility transformation, normalized to M2.

VaR—M6
M2–M4 are translated into M6 via the selection of a local confidence level of the portfolio distribution. M6 is the lower local confidence bound of the distribution.

uVaR—M7
M7 is the upper local confidence bound of the distribution.

gVaR—M8
M8 is the global confidence level of the distribution.

Frac—M9
M9 is the fractal coefficient: the coefficient by which risk scales over time.

Corr—MM
MM is the correlation coefficient and characterizes the portfolio correlation, which ranges between minus one and one. A value of zero means that the portfolio components are uncorrelated.

Sim—SM
SM represents one iteration of a simulated return.
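As one hypothetical reading of the table, M0 through M2 can be combined with the fractal coefficient M9 into the local confidence bounds M6 and M7, here assuming a normal shape and a simple z-score confidence translation (the model's actual Camus-based translation is richer):

```python
def confidence_bounds(m0, m1, m2, z=1.645, horizon=1.0, m9=0.5):
    """Lower (M6) and upper (M7) local confidence bounds from exposure
    (M0), expected return (M1) and standard deviation (M2). Risk scales
    as horizon**m9; M9 = 0.5 is square-root-of-time Brownian scaling."""
    sd = m2 * horizon ** m9
    lower = m0 * (m1 * horizon - z * sd)   # VaR-style bound, M6
    upper = m0 * (m1 * horizon + z * sd)   # upper bound, M7
    return lower, upper

# $100 of exposure, 5% expected return, 10% standard deviation
m6, m7 = confidence_bounds(m0=100.0, m1=0.05, m2=0.10)
```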
The Bernoulli Market Model
(shown above) represents the beachhead for the Bernoulli concept and tracks the
analytical process from the inputs of the Delphi, rates and exposure through
the technical analysis to the actuarial valuation and decision-making. Advanced forms of the market model include
features like intertemporal risk-modeling
that involves reproducing data characterized by contemporaneous and
intertemporal dependencies—such as energy prices and foreign exchange
rates.
The Bernoulli Credit Model
starts from a base of the market model, with the main point of departure from
market risk to credit risk being the modeling of dynamic exposures. The general consensus in the literature is
that the VaR approach is most appropriate for credit risk management of both
pre-settlement and settlement risk.
Pre-settlement risk is a form of credit risk that arises whenever forwards
or derivatives are traded; settlement risk occurs when there is a non-concurrent
exchange of value. The essence of credit risk lies in the modeling
of dynamic exposure.
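A minimal sketch of what dynamic exposure means for a forward: only positive mark-to-market is at risk if the counterparty defaults, so expected exposure is the average of max(value, 0) over simulated value paths. A driftless random walk and illustrative parameters are assumed here:

```python
import random
random.seed(7)

def expected_exposure(steps=12, paths=5_000, sigma=0.03):
    """Simulate random-walk contract values and average the positive
    part, max(value, 0), at each step: the expected credit exposure
    profile of a forward grows with time rather than staying fixed."""
    ee = [0.0] * steps
    for _ in range(paths):
        value = 0.0
        for t in range(steps):
            value += random.gauss(0.0, sigma)
            ee[t] += max(value, 0.0) / paths
    return ee

profile = expected_exposure()
```

Because the value can wander further from zero as time passes, the profile rises over the life of the contract, which is exactly the dynamic quality that distinguishes credit exposure from a static market position.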
The Bernoulli Insurance Model
uses a GARCH process to break down historical losses into property, business
interruption and liability types. It
further breaks down losses into small and large, and then breaks down large
losses into frequency and severity. The
model applies Monte Carlo simulation of the frequency and severity distributions
against the alternative insurance deductible arrangements and the Delphi
results in order to identify the optimal arrangement.
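The frequency/severity step can be sketched as follows, with Poisson frequency, lognormal severity and hypothetical deductible/premium pairs; in the model itself the input distributions come from the GARCH breakdown described above:

```python
import random
random.seed(3)

def retained_cost(deductible, premium, years=10_000,
                  freq=2.0, sev_mu=10.0, sev_sigma=1.0):
    """Simulate annual large-loss experience (Poisson frequency via
    exponential inter-arrival times, lognormal severity) and return the
    average annual retained cost under a per-occurrence deductible,
    plus the premium for that arrangement."""
    total = 0.0
    for _ in range(years):
        # Count Poisson arrivals within one year
        n, t = 0, random.expovariate(freq)
        while t < 1.0:
            n += 1
            t += random.expovariate(freq)
        losses = (random.lognormvariate(sev_mu, sev_sigma) for _ in range(n))
        total += sum(min(x, deductible) for x in losses) + premium
    return total / years

# Hypothetical arrangements: a higher deductible buys a cheaper premium
options = {ded: retained_cost(ded, prem)
           for ded, prem in [(25_000, 90_000), (100_000, 60_000)]}
best = min(options, key=options.get)
```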
The Bernoulli Intertemporal
Model is an approach to modeling data characterized by both intertemporal and
contemporaneous dependencies—such as energy prices and foreign exchange
rates. Intertemporal risk-modeling
deconstructs historical data into correlated signal, wave and noise—each of
which is separately forecast—and then reconstructs within a Monte Carlo
simulation environment in order to produce the forecasted portfolio
distribution.
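A toy version of that deconstruction, with a centered moving average standing in for the signal, the average deviation by seasonal phase for the wave, and the remainder as noise (the model's actual decomposition and the Monte Carlo reconstruction are not reproduced here):

```python
import math

def decompose(series, period=12, window=12):
    """Split a series into signal (centered moving average), wave
    (average seasonal deviation by phase) and noise (the remainder),
    in the spirit of signal/wave/noise intertemporal modeling."""
    n = len(series)
    half = window // 2
    signal = [sum(series[max(0, i - half):min(n, i + half + 1)])
              / len(series[max(0, i - half):min(n, i + half + 1)])
              for i in range(n)]
    detrended = [x - s for x, s in zip(series, signal)]
    wave_by_phase = [sum(detrended[i] for i in range(p, n, period))
                     / len(range(p, n, period))
                     for p in range(period)]
    wave = [wave_by_phase[i % period] for i in range(n)]
    noise = [d - w for d, w in zip(detrended, wave)]
    return signal, wave, noise

# Synthetic monthly series: a linear trend plus an annual cycle
series = [0.1 * t + math.sin(2 * math.pi * t / 12) for t in range(120)]
signal, wave, noise = decompose(series)
```

By construction the three parts reconstruct the original series exactly; forecasting each part separately and recombining inside a simulation is what produces the forecasted portfolio distribution.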
The Bernoulli Efficient
Frontier Model focuses on optimization—the final of the three basic portfolio
algorithms—forecasting, integration and optimization—in constructing an efficient frontier by optimizing for all levels of
risk. The model compares the basic
Markowitz Model—method of moments, matrix algebra and linear programming—with
the advanced Bernoulli Model—progressive method of moments, Monte Carlo simulation
and the Camus distribution, and hill-climbing and genetic algorithms.
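A minimal hill-climbing sketch of the optimization step, maximizing expected return subject to a risk cap; the asset figures are hypothetical, correlations are assumed zero, and the genetic-algorithm variant is omitted:

```python
import random
random.seed(5)

mu  = [0.04, 0.07, 0.11]   # hypothetical expected returns
vol = [0.05, 0.12, 0.22]   # hypothetical volatilities (uncorrelated)

def score(w, risk_cap=0.10):
    """Objective for the climber: portfolio return, with an infeasible
    score when risk exceeds the cap (zero correlations assumed)."""
    risk = sum((wi * s) ** 2 for wi, s in zip(w, vol)) ** 0.5
    if risk > risk_cap:
        return float("-inf")
    return sum(wi * m for wi, m in zip(w, mu))

def hill_climb(steps=20_000, jitter=0.02):
    """Perturb the weights, renormalize to a long-only portfolio, and
    keep the move only when the objective does not worsen."""
    w = [1 / 3] * 3
    for _ in range(steps):
        cand = [max(0.0, wi + random.uniform(-jitter, jitter)) for wi in w]
        total = sum(cand) or 1.0
        cand = [c / total for c in cand]
        if score(cand) >= score(w):
            w = cand
    return w

best = hill_climb()
```

Unlike linear programming, the climber never needs the objective in closed form, which is why it pairs naturally with simulation-based forecasts.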
The Bernoulli Capital Decision
Model delineates the decision of whether Eagle Airlines should buy another
airplane—and was inspired by the example found in the book Making Hard
Decisions by Clemen. In addition to
tangibly mapping out the capital decision-making process, the basic capital
decision micromodel also features sensitivity analysis and sets the table for
the introduction of decision trees as a fully integrated component.
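The rollback logic of such a decision tree fits in a few lines; the payoffs and probabilities below are hypothetical stand-ins, not Clemen's actual Eagle Airlines figures:

```python
def expected_value(node):
    """Roll back a tiny decision tree: chance nodes average their
    branches by probability, decision nodes take the best branch."""
    kind, branches = node
    if kind == "leaf":
        return branches
    if kind == "chance":
        return sum(p * expected_value(child) for p, child in branches)
    # decision node: choose the alternative with the highest value
    return max(expected_value(child) for child in branches)

# Buy the airplane and face uncertain demand, or invest the capital
# at a known return (all figures hypothetical)
tree = ("decision", [
    ("chance", [(0.5, ("leaf", 100_000)),    # high demand
                (0.3, ("leaf", 30_000)),     # medium demand
                (0.2, ("leaf", -50_000))]),  # low demand
    ("leaf", 42_000),                        # safe alternative
])
best = expected_value(tree)  # 49,000: buying edges out the safe 42,000
```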