The Bernoulli Universe
An Essay by Christopher Bek
christopher.bek@gmail.com
Summary—This essay tells the story of portfolio theory as one argument for the theory of everything. It begins with Spinoza’s metaphysics, continues with the Copenhagen interpretation of quantum theory by Bohr, Heisenberg and Primas, and concludes with Modern Portfolio Theory by Markowitz and The Bernoulli Model by Bek.
Quotation— God is a mathematician. —Sir James Jeans
Baruch Spinoza’s Ethics is set out in the fashion of Euclid’s geometry, and its five parts deal with God, the mind, the emotions, and human bondage and freedom. He begins with his metaphysical system, characterized as God or Nature, with the following eight definitions:
1) the cause of itself is that whose essence involves existence; or that whose nature cannot be conceived except as existing;
2) a thing is finite in its kind when it can be limited by a thing of the same nature;
3) substance is that which is in itself and is conceived through itself;
4) attribute is that which the intellect perceives of substance as constituting its essence;
5) mode is the modification of substance; that which is in something else, through which it is also conceived;
6) God is an absolutely infinite being; that is, substance consisting of infinite attributes, each of which expresses eternal and infinite essence;
7) a thing is said to be free if it exists solely through the necessity of its own nature, and is determined into action by itself alone; and
8) eternity means existence itself, insofar as it is conceived to follow from the definition alone of the eternal thing.
The Copenhagen Interpretation is a collection of views about quantum theory principally attributed to Niels Bohr and Werner Heisenberg. Hans Primas described the nine principles of the Copenhagen interpretation as follows:
1) quantum physics applies to individual objects, not only ensembles of objects;
2) their description is probabilistic;
3) their description is the result of experiments described in terms of classical physics;
4) the frontier that separates the classical from the quantum can be chosen arbitrarily;
5) the act of observation or measurement is irreversible;
6) the act of observation or measurement involves an action upon the object measured and reduces the wave packet;
7) complementary properties cannot be observed simultaneously;
8) no truth can be attributed to an object except according to the results of its measurement; and
9) quantum descriptions are objective, in that they are independent of physicists’ mental arbitrariness.
Modern Portfolio Theory. Harry Markowitz is an economist who devised modern portfolio theory in his 1952 article entitled Portfolio Selection. His theories emphasized the importance of considering assets in the context of portfolio construction, risk, correlation, diversification and the efficient frontier. Markowitz shared the Nobel Prize in Economics with Merton Miller and William Sharpe in 1990. Investment decision-making prior to modern portfolio theory was based largely on the individual performance of an investment and its current price. Markowitz showed that the individual performance of a particular asset is less important than the mean and variance of the portfolio probability distribution. According to Wikipedia, modern portfolio theory, or mean-variance analysis, is the mathematical framework for assembling a portfolio of assets such that the expected return is maximized for a given level of risk. According to Investopedia.com, postmodern portfolio theory is a portfolio optimization methodology that uses only the downside risk of returns. The Bernoulli Model is postmodern in that it allows the use of only downside risk as a modeling choice. It is also postmodern in that it views the organization as a whole and applies a full spectrum of mathematical algorithms to the modeling process.
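The mean-variance idea can be shown in a few lines. The sketch below is illustrative only—the two assets, their expected returns, covariances and weights are invented—but it shows how a negative correlation lets the portfolio variance fall below the variance of either asset alone.

```python
# Minimal mean-variance sketch (asset figures are hypothetical).
# Portfolio return is the weighted mean of asset returns; portfolio
# variance combines weights with the covariance matrix, so negative
# correlation can push risk below that of any single asset.

means = [0.08, 0.12]                   # assumed expected returns
cov = [[0.04, -0.01],                  # assumed covariance matrix
       [-0.01, 0.09]]
w = [0.6, 0.4]                         # portfolio weights, summing to 1

port_mean = sum(wi * mi for wi, mi in zip(w, means))
port_var = sum(w[i] * w[j] * cov[i][j]
               for i in range(2) for j in range(2))

# port_mean is 0.096; port_var is 0.024, below either asset's
# own variance (0.04 and 0.09) thanks to diversification.
```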
The Bernoulli Model is a philosophic and scientific software application based in Microsoft Office and the Palisade.com suite of decision tools. Markowitz originally used the algorithms of forecasting, integration and optimization in constructing his portfolios. Specifically, he used regression analysis for forecasting, the central limit theorem for integration, and linear programming for optimal decision-making. The Bernoulli Model is also postmodern in the sense that it both consolidates and builds on Markowitz with the following ten featured mathematical algorithms:
1) the Delphi to know thyself;
2) the objective function to define values;
3) actuarial exposure to understand value and variance;
4) forecasting to utilize and synthesize an array of methods;
5) integration to convolve the moving parts;
6) optimization theory, including local and global search algorithms;
7) utility theory to translate market values into internal values;
8) the complementary principle to contrast and compare paradigms like open and closed forms;
9) the Camus distribution to model the first four statistical moments; and
10) the Bernoulli Moment Vector to provide an expanded definition of both risk and reward.
The Delphi is named after the Oracle at Delphi in ancient Greece, whose temple inscriptions read “Know Thyself” and “Nothing in Excess”. The Delphi method is a forecasting and value-defining process that employs multiple rounds of questionnaires completed by panels of experts or officers and directors, with response summaries provided after each round. The Bernoulli Model uses the Delphi for convergence on values and to capture expert forecasts.
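The round-by-round convergence of a Delphi panel can be sketched in code. The update rule here is an assumption for illustration—after seeing the round summary, each panelist moves halfway toward the panel median—but it shows how the spread of estimates narrows over successive rounds.

```python
import statistics

# Hedged Delphi-style sketch: the "move halfway toward the median"
# update rule is an invented simplification of how panelists revise
# after seeing each round's summary.

estimates = [10.0, 40.0, 25.0, 60.0]   # hypothetical round-one forecasts

for round_number in range(5):
    median = statistics.median(estimates)
    estimates = [e + 0.5 * (median - e) for e in estimates]

spread = max(estimates) - min(estimates)
# The initial spread of 50 halves each round, converging on the median.
```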
The Objective Function. Linear programming is an optimization algorithm for a system of linear constraints and a linear objective function. The objective function defines the quantity to be optimized, and is the key component in the mathematical optimization process. Traditionally, it is a real-valued function, while The Bernoulli Model allows for a complex-valued, non-linear objective function.
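A toy linear program makes the role of the objective function concrete. The problem below is invented: maximize 3x + 2y subject to x + y ≤ 4, x ≤ 3 and x, y ≥ 0. For a two-variable linear program the optimum sits at a vertex of the feasible region, so checking the corners suffices here.

```python
# Toy linear-programming sketch (problem is hypothetical). The objective
# function defines the quantity to optimize; with linear constraints the
# optimum lies at a vertex of the feasible polygon.

def objective(x, y):
    return 3 * x + 2 * y

# Corners of the region x + y <= 4, x <= 3, x >= 0, y >= 0.
vertices = [(0, 0), (3, 0), (3, 1), (0, 4)]

best = max(vertices, key=lambda v: objective(*v))
# best is (3, 1), where the objective reaches 11.
```

A production solver would of course use the simplex method or an interior-point algorithm rather than vertex enumeration.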
Actuarial Exposure. Traditionally, actuarial risk exposure is the measure of potential future loss resulting from a specific activity or event. Analysis of the risk exposure for a business often ranks risks according to the probability of occurrence multiplied by the potential loss. The Bernoulli Model defines exposure as any potential for change in value.
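The probability-times-loss ranking can be sketched directly. The risk names and figures below are hypothetical; the point is only the mechanics of scoring and ordering exposures.

```python
# Illustrative risk ranking (names and numbers are invented):
# score each risk as probability of occurrence times potential loss,
# then rank exposures from largest to smallest.

risks = {
    "fire":    (0.01, 5_000_000),   # (probability, potential loss)
    "fraud":   (0.05,   400_000),
    "fx_move": (0.30,   250_000),
}

ranked = sorted(risks, key=lambda r: risks[r][0] * risks[r][1],
                reverse=True)
# ranked is ["fx_move", "fire", "fraud"]: the frequent moderate loss
# outscores the rare catastrophe under this simple measure.
```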
Forecasting is a technique of predicting the future based on previous experience and data. Forecasting models include expert opinion, derivative pricing, GARCH models and neural networks. GARCH (generalized autoregressive conditional heteroskedasticity) is a class of autoregressive conditional time-series models. Neural networks are forecasting methods based on mathematical models of the brain. In a currency exchange rate study I conducted, I separated time-series data into the principal components of signal, wave and noise.
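One of the model classes named above can be simulated in a few lines. The coefficients in this GARCH(1,1) sketch are invented; it shows the defining recursion—today's variance depends on yesterday's squared return and yesterday's variance—which produces the volatility clustering seen in exchange-rate data.

```python
import random

# Hedged GARCH(1,1) sketch (coefficients are illustrative):
# sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.

random.seed(1)
omega, alpha, beta = 0.05, 0.10, 0.85   # assumed; alpha + beta < 1
sigma2 = omega / (1 - alpha - beta)     # start at the long-run variance

returns = []
for _ in range(1000):
    r = random.gauss(0, sigma2 ** 0.5)  # return drawn at current volatility
    returns.append(r)
    sigma2 = omega + alpha * r * r + beta * sigma2   # variance recursion
```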
Integration. Convolution is a mathematical operation on two functions producing a third function. Numerical integration comprises a broad family of algorithms for calculating the numerical value of an integral. In probability theory, the central limit theorem establishes that the sum of independent random variables tends to be normally distributed. Closed-form integration algorithms include the central limit theorem, while open-form integration includes Monte Carlo simulation.
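The closed-form and open-form routes can be compared directly. Summing twelve independent uniforms has closed-form mean n/2 = 6 and variance n/12 = 1; the Monte Carlo simulation below recovers both numerically.

```python
import random
import statistics

# Closed form vs open form: the central limit theorem says a sum of
# independent uniforms is approximately normal; Monte Carlo simulation
# checks the closed-form mean (n/2) and variance (n/12).

random.seed(0)
n = 12
sums = [sum(random.random() for _ in range(n)) for _ in range(20_000)]

mean_mc = statistics.mean(sums)      # closed form: 6.0
var_mc = statistics.variance(sums)   # closed form: 1.0
```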
Optimization Theory. Operations research deals with the application of advanced analytical methods to make better decisions. The algorithms for searching risk/reward space tend to be either global (e.g. genetic algorithms) or local (e.g. hill-climbing algorithms). A genetic algorithm is a metaheuristic method inspired by Charles Darwin’s theory of natural evolution. Hill-climbing algorithms are mathematical optimization techniques belonging to the family of local search.
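Local search is the simpler of the two families and fits in a short sketch. The objective here is an invented single-peak function; hill climbing takes random steps and keeps only improvements, which finds this peak but would stall on a local peak of a rougher landscape—the gap that global methods such as genetic algorithms are designed to cover.

```python
import random

# Hill-climbing sketch on an invented objective with one peak at x = 3.
# Take a small random step; keep it only if the objective improves.

def objective(x):
    return -(x - 3.0) ** 2

random.seed(42)
x = 0.0
for _ in range(5_000):
    candidate = x + random.uniform(-0.1, 0.1)
    if objective(candidate) > objective(x):
        x = candidate
# x ends near the peak at 3.0.
```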
Utility Theory. Daniel Bernoulli (1700-82) founded utility theory with his paper Exposition of a New Theory on the Measurement of Risk. His theme was that the value of an asset is determined by the utility it yields rather than by its market price. The paper delineates the all-pervasive relationship between empirical measurement and gut feel. The Bernoulli Model employs utility theory to translate external values into internal values.
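Bernoulli's distinction between market price and utility can be computed for a toy gamble (the stakes are hypothetical). With logarithmic utility—the function Bernoulli himself proposed—a fair coin flip between 50 and 200 has an expected value of 125 but is worth only 100 to the decision-maker, its certainty equivalent.

```python
import math

# Expected value vs expected utility for a hypothetical fair coin flip
# between wealth outcomes 50 and 200, under Bernoulli's log utility.

outcomes = [(0.5, 50.0), (0.5, 200.0)]   # (probability, wealth)

expected_value = sum(p * w for p, w in outcomes)              # 125.0
expected_utility = sum(p * math.log(w) for p, w in outcomes)
certainty_equivalent = math.exp(expected_utility)
# certainty_equivalent is sqrt(50 * 200) = 100: the risk-averse
# internal value falls below the external (market) expectation.
```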
The Complementary Principle. In what John Stuart Mill called the single greatest advance in the history of science, Descartes synthesized Greek geometry with Arab algebra in producing analytic geometry. From that point forward, mankind had two different views of the underlying Platonic form. Quantum theory is characterized by the principle of complementarity with its wave-particle duality. The Bernoulli Model employs the complementary principle, including dualities such as tabular/graphical and Null/Alt paradigms.
The Camus Distribution. The normal and Cauchy distributions are continuous probability distributions, symmetric about the mean. The Cauchy is also the distribution of the ratio of two independent normally distributed random variables with mean zero. The Cauchy distribution has undefined moments, but the first four may be calculated by truncating the tails. The Camus distribution is a four-moment, closed-form distribution that I
built by blending the normal with the Cauchy, using simulation-based optimization.
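The Camus distribution itself is the author's construction, built by simulation-based optimization, and its closed form is not given in this essay. As a loosely related illustration only, the sketch below samples a simple normal/Cauchy mixture—an assumed stand-in, not the Camus distribution—to show how blending a thin-tailed with a heavy-tailed distribution fattens the tails.

```python
import math
import random

# Illustrative normal/Cauchy blend (NOT the Camus distribution itself;
# the mixture weight is an assumption). Sampling mostly from the normal
# with an occasional Cauchy draw produces markedly heavier tails.

random.seed(7)

def blend_sample(p_normal=0.9):
    if random.random() < p_normal:
        return random.gauss(0, 1)
    # Standard Cauchy via the inverse-CDF method.
    return math.tan(math.pi * (random.random() - 0.5))

samples = [blend_sample() for _ in range(10_000)]
tail_share = sum(abs(s) > 4 for s in samples) / len(samples)
# tail_share is far above the normal's P(|Z| > 4) of roughly 0.00006.
```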
The Bernoulli Moment Vector. The Markowitz Model uses the mean to represent reward and the standard deviation to represent risk—thus laying the groundwork for risk-reward efficiency analysis. The Bernoulli Moment Vector builds on mean and variance by adding skewness and kurtosis, and then defines Val (value) as the utilitarian translation of outcome; Var (value at risk) as the confidence level for downside risk exposure; and uVar (upper value at risk), Min, Max and Sim (simulated or measured value).
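A partial sketch of such a vector can be computed from simulated outcomes using standard sample moments. The outcomes below are invented, the percentile used for the downside Var is an assumption, and the essay's Val, uVar and Sim fields are omitted since their exact definitions are the author's.

```python
import statistics

# Hedged moment-vector sketch: standard population moments plus Min,
# Max and a crude percentile-based downside Var, from invented outcomes.

outcomes = [-3.0, -1.0, 0.0, 1.0, 2.0, 2.0, 3.0, 4.0]
n = len(outcomes)

mean = statistics.fmean(outcomes)
var = statistics.pvariance(outcomes)
sd = var ** 0.5
skew = sum((x - mean) ** 3 for x in outcomes) / (n * sd ** 3)
kurt = sum((x - mean) ** 4 for x in outcomes) / (n * sd ** 4)
var95 = sorted(outcomes)[int(0.05 * n)]   # crude 95% downside Var

vector = {"mean": mean, "variance": var, "skew": skew,
          "kurtosis": kurt, "min": min(outcomes),
          "max": max(outcomes), "var95": var95}
```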
Closing Arguments. Spinoza’s Ethics is based on Euclidean geometry, a system of mathematics attributed to the Greek mathematician Euclid. In the Elements, Euclid assumes five intuitively appealing axioms and then deduces many propositions from them. This essay shows eight axioms for Spinoza’s God, nine axioms for the Copenhagen Interpretation, and ten axioms for The Bernoulli Model—thus building the case for synchronized and holistic decision-making applicable to all organizations at all hierarchical levels. And then, as Leibniz said, “Let us calculate.”