Financial Monte Carlo Simulation’s FLAW and FIXES

Physicians Must Understand Deus ex Machina

[By Wayne J. Firebaugh Jr, CPA, CFP®, CMP™]

SPONSOR: http://www.CertifiedMedicalPlanner.org

Named after Monte Carlo, Monaco, which is famous for its games of chance, MCS is a software technique that randomly varies one or more inputs over numerous iterations in order to simulate a range of outcomes and develop a probability forecast of achieving a given goal.

Endowment Fund Perspective

In private portfolio and endowment fund management, MCS is used to demonstrate the probability of “success” as defined by achieving the endowment’s asset growth and payout goals. In other words, MCS can provide the endowment manager with a comfort level that a given payout policy and asset allocation will not deplete the real value of the endowment.
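For illustration, here is a minimal sketch of how such a simulation works; the payout rate, return, volatility, and inflation figures are hypothetical assumptions chosen only for the example, and the normally distributed returns are the classic MCS simplification discussed below.

```python
import numpy as np

def endowment_success_probability(start_value=10_000_000, payout_rate=0.05,
                                  mean_return=0.07, volatility=0.12,
                                  inflation=0.03, years=20, trials=10_000,
                                  seed=0):
    """Estimate the probability that the endowment's real (inflation-
    adjusted) value is preserved after `years` of payouts, assuming
    normally distributed annual returns."""
    rng = np.random.default_rng(seed)
    # one matrix of simulated annual returns: trials x years
    returns = rng.normal(mean_return, volatility, size=(trials, years))
    values = np.full(trials, float(start_value))
    for year in range(years):
        values *= 1.0 + returns[:, year]   # market growth for the year
        values -= payout_rate * values     # spending under the payout policy
    real_target = start_value * (1.0 + inflation) ** years
    return float(np.mean(values >= real_target))

p = endowment_success_probability()
print(f"Probability of preserving real value: {p:.1%}")
```

Lowering the payout rate raises the simulated success probability, which is exactly the policy trade-off the technique is meant to quantify for the endowment manager.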

Divorce from Judgment

The problem with many quantitative software tools is the divorce of judgment from their use. Although useful, both mean-variance optimization (MVO) and MCS have limitations and should not supplant the experience of the physician investor or endowment manager. MVO generates an efficient frontier by relying upon several inputs: expected return, expected volatility, and correlation coefficients. These variables are commonly input using historical measures as proxies for estimated future performance. This poses a variety of problems.

Problems with MVO

First, the MVO will generally assume that returns are normally distributed and that this distribution is stationary. As such, asset classes with high historical returns are assumed to have high future returns.

Second, an MVO optimizer is not generally time sensitive. In other words, the optimizer may ignore current environmental conditions that would cause a secular shift in a given asset class’s returns.

Finally, an MVO optimizer may be subject to selection bias for certain asset classes. For example, private equity firms that fail will no longer report results and will be eliminated from the index used to provide the optimizer’s historical data [1].
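To make the optimizer’s reliance on these inputs concrete, the sketch below traces risk and return across allocations between two asset classes, using the 1979-era large-cap and small-cap figures quoted in the table in this article as the historical “proxies” an optimizer would have consumed (the two-asset setup itself is a simplification for illustration):

```python
import numpy as np

# 1979-era Ibbotson-style inputs quoted in this article:
mean_returns = np.array([0.081, 0.174])   # large cap, small cap returns
volatilities = np.array([0.165, 0.308])   # corresponding risk (std dev)
correlation = 0.78

covar = correlation * volatilities[0] * volatilities[1]
cov = np.array([[volatilities[0] ** 2, covar],
                [covar, volatilities[1] ** 2]])

def portfolio_stats(w):
    """Expected return and standard deviation for weight vector w."""
    return float(w @ mean_returns), float(np.sqrt(w @ cov @ w))

# Sweep the small-cap weight from 0% to 100% to trace the attainable set.
frontier = [portfolio_stats(np.array([1 - x, x]))
            for x in np.linspace(0.0, 1.0, 11)]
for ret, risk in frontier:
    print(f"return {ret:.2%}  risk {risk:.2%}")
```

Fed these 1979 inputs, an optimizer would have tilted heavily toward small caps, which is precisely the bet that disappointed over the following twenty years.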

Example:

David Loeper, CEO of Wealthcare Capital Management, made the following observation regarding optimization:

Take a small cap “bet” for our theoretical [endowment] with an S&P 500 investment policy. It is hard to imagine that someone in 1979, looking at a 9% small cap stock return premium and corresponding 14% higher standard deviation for the last twenty years, would forecast the relationship over the next twenty years to shift to small caps under-performing large caps by nearly 2% and their standard deviation being less than 2% higher than the 20-year standard deviation of large caps in 1979 [2].

Table: Returns and standard deviations for large- and small-cap stocks for the 20-year periods ended in 1979 and 1999. Twenty-Year Risk & Return, Small Cap vs. Large Cap (Ibbotson data).

                   1979                           1999
                   Risk    Return  Correlation    Risk    Return  Correlation
Small Cap Stocks   30.8%   17.4%   78.0%          18.1%   16.9%   59.0%
Large Cap Stocks   16.5%    8.1%   n/a            13.1%   18.6%   n/a

Reproduced from “Asset Allocation Math, Methods and Mistakes.” Wealthcare Capital Management White Paper, David B. Loeper, CIMA, CIMC (June 2, 2001).

More Problems with MCS

David Nawrocki identified a number of problems with typical MCS, namely that most optimizers assume “normal distributions and correlation coefficients of zero, neither of which are typical in the world of financial markets.”

Dr. Nawrocki subsequently describes a number of other issues with MCS including nonstationary distributions and nonlinear correlations.

Finally, Dr. Nawrocki quotes Harold Evensky, who eloquently notes that “[t]he problem is the confusion of risk with uncertainty. Risk assumes knowledge of the distribution of future outcomes (i.e., the input to the Monte Carlo simulation). Uncertainty or ambiguity describes a world (our world) in which the shape and location of the distribution is open to question.”

Contrary to academic orthodoxy, the distribution of U.S. stock market returns is far from “normal” [3]. Other critics have noted that many MCS simulators do not run enough iterations to provide a meaningful probability analysis.
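The iteration criticism can be quantified: under the usual binomial sampling assumption, the standard error of a simulated success probability shrinks only with the square root of the number of trials, so halving the error requires four times as many iterations. A small sketch:

```python
import math

def mc_standard_error(p: float, n_trials: int) -> float:
    """Standard error of a success-probability estimate computed from
    n_trials independent simulation runs (binomial sampling)."""
    return math.sqrt(p * (1.0 - p) / n_trials)

for n in (100, 1_000, 10_000, 100_000):
    half_width = 1.96 * mc_standard_error(0.80, n)  # 95% confidence
    print(f"{n:>7} trials: an 80% success estimate is +/- {half_width:.1%}")
```

A few hundred iterations, which some early simulators used, leaves a confidence band wide enough to blur the very probabilities the tool is supposed to report.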

Assessment

Some of these criticisms have been addressed by using MCS simulators with more robust correlation assumptions and with a greater number of iterative trials. In addition, some simulators now combine MVO and MCS to determine probabilities along the efficient frontier.
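A sketch of how such a combined tool can work (the stock and bond return figures are hypothetical, and correlation is set to zero for brevity, exactly the simplification the critics flag): sweep allocations along a simple two-asset frontier and attach a simulated success probability to each.

```python
import numpy as np

def success_probability(stock_weight, years=20, trials=20_000,
                        target_multiple=3.0, seed=1):
    """Probability that a stock/bond mix at least triples over `years`,
    estimated by simulating normally distributed annual returns."""
    rng = np.random.default_rng(seed)
    means = np.array([0.09, 0.04])   # assumed stock / bond returns
    vols = np.array([0.17, 0.06])    # assumed volatilities
    w = np.array([stock_weight, 1.0 - stock_weight])
    # independent draws per asset (zero correlation, for brevity)
    r = rng.normal(means, vols, size=(trials, years, 2))
    growth = np.prod(1.0 + (r * w).sum(axis=2), axis=1)
    return float(np.mean(growth >= target_multiple))

for sw in (0.2, 0.5, 0.8):
    print(f"{sw:.0%} stocks: P(>= 3x in 20 yrs) = {success_probability(sw):.0%}")
```

The output is the kind of display these hybrid tools produce: each point on the frontier carries its own probability of meeting the investor’s goal, rather than a single optimized allocation.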

Conclusion

Your thoughts and comments on this ME-P are appreciated. Feel free to review our top-left column, and top-right sidebar materials, links, URLs and related websites, too. Then, subscribe to the ME-P. It is fast, free and secure.

Link: http://feeds.feedburner.com/HealthcareFinancialsthePostForcxos

Speaker: If you need a moderator or speaker for an upcoming event, Dr. David E. Marcinko; MBA – Publisher-in-Chief of the Medical Executive-Post – is available for seminar or speaking engagements. Contact: MarcinkoAdvisors@msn.com

OUR OTHER PRINT BOOKS AND RELATED INFORMATION SOURCES:

DICTIONARIES: http://www.springerpub.com/Search/marcinko
PHYSICIANS: www.MedicalBusinessAdvisors.com
PRACTICES: www.BusinessofMedicalPractice.com
HOSPITALS: http://www.crcpress.com/product/isbn/9781466558731
CLINICS: http://www.crcpress.com/product/isbn/9781439879900
BLOG: www.MedicalExecutivePost.com
FINANCE: Financial Planning for Physicians and Advisors

References:

1. Clark, S.E. and Yates, T.T., Jr. “How Efficient is your Frontier?” Commonfund Institute White Paper (November 2003).

2. Loeper, D.B., CIMA, CIMC. “Asset Allocation Math, Methods, and Mistakes.” Wealthcare Capital Management White Paper (June 2001).

3. Nawrocki, D., Ph.D. “The Problems with Monte Carlo Simulation.” FPA Journal (November 2001).

***

What Physician Investors STILL NEED TO KNOW about Monte Carlo Simulation in 2022

Probability Forecasting and Investing

By Dr. David Edward Marcinko MBA CMP™

[Editor-in-Chief] www.CertifiedMedicalPlanner.org

Recently, I had a physician-client ask me about Monte Carlo simulation. You know the routine: what it is and how it works, etc.

From Monaco

Named after Monte Carlo, Monaco, which is famous for its games of chance, MCS is a technique that randomly changes a variable over numerous iterations in order to simulate an outcome and develop a probability forecast of successfully achieving an outcome.

In endowment management, MCS is used to demonstrate the probability of “success” as defined by achieving the endowment’s asset growth and payout goals. In other words, MCS can provide the endowment manager with a comfort level that a given payout policy and asset allocation will not deplete the real value of the endowment.

Quantitative Tools Problematic

The problem with many quantitative tools is the divorce of judgment from their use. Although useful, MCS has limitations and should not supplant the experience of the endowment manager, financial advisor, or physician-investor.

The mean-variance optimization [MVO] engine behind many simulators generates an efficient frontier by relying upon several inputs: expected return, expected volatility, and correlation coefficients. These variables are commonly input using historical measures as proxies for estimated future performance. This poses a variety of problems.

  • First, the simulation will generally assume that returns are normally distributed and that this distribution is stationary. As such, asset classes with high historical returns are assumed to have high future returns.
  • Second, MCS is not generally time sensitive. In other words, the optimizer may ignore current environmental conditions that would cause a secular shift in a given asset class’s returns.
  • Third, MCS may use a mean-variance optimizer [MVO] that may be subject to selection bias for certain asset classes. For example, private equity firms that fail will no longer report results and will be eliminated from the index used to provide the optimizer’s historical data.

Healthcare Investment Risks

A Tabular Data Example

This table compares the returns and standard deviations for large- and small-cap stocks for the 20-year periods ended in 1979 and 2010.

Twenty Year Risk & Return Small Cap vs. Large Cap (Ibbotson Data)

                   1979                           2010
                   Risk    Return  Correlation    Risk    Return   Correlation
Small Cap Stocks   30.8%   17.4%   78.0%          18.1%   26.85%   59.0%
Large Cap Stocks   16.5%    8.1%   n/a            13.1%   15.06%   n/a

[Reproduced from “Asset Allocation Math, Methods and Mistakes.” Wealthcare Capital Management White Paper, David B. Loeper, CIMA, CIMC (June 2, 2001)]

The Problems

Professor David Nawrocki identified a number of problems with typical MCS, namely that their mean-variance optimizers assume “normal distributions and correlation coefficients of zero, neither of which are typical in the world of financial markets.”

Dr. Nawrocki subsequently described a number of other issues with MCS including nonstationary distributions and nonlinear correlations.

Finally, Dr. Nawrocki quoted financial advisor Harold Evensky, MS, CFP™, who eloquently notes that “[t]he problem is the confusion of risk with uncertainty.” Risk assumes knowledge of the distribution of future outcomes (i.e., the input to the Monte Carlo simulation). Uncertainty or ambiguity describes a world (our world) in which the shape and location of the distribution is open to question.

Assessment

Contrary to academic orthodoxy, the distribution of U.S. stock market returns is “far from normal.”[1] Other critics have noted that many MCS simulators do not run enough iterations to provide a meaningful probability analysis.

Conclusion

Your thoughts and comments on this ME-P are appreciated. Feel free to review our top-left column, and top-right sidebar materials, links, URLs and related websites, too. Then, subscribe to the ME-P. It is fast, free and secure.


[1]   Nawrocki, D., Ph.D. “The Problems with Monte Carlo Simulation.” FPA Journal (November 2001).

ABOUT RHETORICAL DEVICES AND PERSUASIVE APPEALS: Ethos, Pathos and Logos; etc

KAIROS; TOO!

By Dr. David E. Marcinko MBA CMP®

SPONSOR: http://www.CertifiedMedicalPlanner.org

****

The modes of persuasion, modes of appeal or rhetorical appeals (Greek: pisteis) are strategies of rhetoric that classify a speaker’s or writer’s appeal to their audience. These include ethos, pathos, and logos.

CITE: https://www.r2library.com/Resource/Title/0826102549

***


Rhetorical appeals and the elements of persuasion are often key skills for doctors, medical professionals, lawyers, CPAs, and all sorts of financial advisors and medical management consultants.

Learning: https://medicalexecutivepost.com/2020/08/18/top-15-evolutions-of-learning/

So, here is a brief review for your consideration.

Examples: https://pathosethoslogos.com/

KAIROS: https://louisville.edu/writingcenter/for-students-1/handouts-and-resources/handouts-1/logos-ethos-pathos-kairos

YOUR COMMENTS ARE APPRECIATED.

Thank You

***

PARKINSON’S LAW: Beware in 2022

The 2-Ps [80/20] Rule

[By staff reporters]

Pareto’s law refers to either of two closely related ideas: the Pareto principle (or law of the vital few), which states that 80% of the effects come from 20% of the causes, and the Pareto distribution.

Pareto distribution

The Pareto distribution, named after the Italian civil engineer, economist, and sociologist Vilfredo Pareto, is a power-law probability distribution that is used to describe social, scientific, geophysical, actuarial, and many other types of observable phenomena [en.wikipedia.org].
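A quick numerical illustration (a Python sketch): for shape parameter alpha of roughly 1.16, the top 20% of draws from a Pareto distribution account for roughly 80% of the total, which is the origin of the 80/20 rule.

```python
import numpy as np

# Draw from a Pareto distribution and measure the share of the total
# contributed by the top 20% of observations. alpha ~ 1.16 is the shape
# for which the theoretical split is almost exactly 80/20.
rng = np.random.default_rng(42)
alpha = 1.16
samples = np.sort(rng.pareto(alpha, size=100_000) + 1.0)[::-1]  # descending
top_20_share = samples[: len(samples) // 5].sum() / samples.sum()
print(f"Share of the total held by the top 20%: {top_20_share:.0%}")
```

Because the distribution is heavy-tailed, the sample share fluctuates around the theoretical 80% from run to run; the fixed seed here just makes the sketch reproducible.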

Parkinson’s law

Originally, Parkinson’s law is the adage that “work expands so as to fill the time available for its completion”, and the title of the book that made it well known.

Assessment

However, in current understanding, Parkinson’s law is a reference to the self-satisfying uncontrolled growth of the bureaucratic apparatus in an organization.

Conclusion

Your thoughts are appreciated.

***

***

UNDERSTANDING MEDICAL PRACTICE CYBER SECURITY RISKS

Mitigations for the Digital Health Era

 By Shahid N. Shah MS

There has been a tremendous explosion of information technology (IT) in healthcare, driven by billions of dollars of government incentives for the use of digital healthcare tools. But IT systems face threats with significant adverse impacts on institutional assets, patients, and partners if sensitive data is ever compromised. Every health enterprise is required to ensure the confidentiality, integrity, and availability of its information assets (this is called “information assurance,” or IA). Confidentiality means private or confidential information must not be disclosed to unauthorized persons. Integrity means that information can be changed only in an authorized manner, so as to maintain its correctness. Availability means that information systems work as intended and that all services are available to users whenever needed.

It is well known that healthcare organizations have been managing risks such as investment risk, budgetary risk, program-management risk, safety risk, and inventory risk for many years. What is new in the last decade or so is that organizations must now also manage risks related to their information systems. IT is now just as critical an asset as most other infrastructure managed by health systems, so it is important that information security risks be given the same priority as other organizational risks.

As health records move from paper to digital, it is vital that organizations have information risk management programs and security procedures that are woven into the culture of the organization. For this to happen, the basic requirements of information security must be defined and implemented as part of both operational and management processes. What is needed is a framework that provides guidance on how to perform these activities and on the coordination required between them.

INTRODUCTION

The Risk Management Framework (RMF), developed by the National Institute of Standards and Technology (NIST), provides such a framework. The NIST 800-series publications provide a structured approach to risk management. The RMF offers broad guidance rather than rigid prescriptions, which means it can be tailored to meet an organization’s specific needs. Using the NIST RMF helps organizations manage risk not only in a repeatable manner, but also with greater efficiency and effectiveness. Healthcare information assurance is complex, and without a framework that takes a broad risk management approach, it is difficult to account for all the intricacies involved.

NIST Risk Management Framework

The NIST Risk Management Framework consists of a six-step process designed to guide organizations in managing the risks in their information systems. The steps, as defined in the NIST specifications, are the following:

  • Categorize the information system and the information processed, stored, and transmitted by that system based on an impact analysis.
  • Select an initial set of baseline security controls for the information system based on the security categorization, tailoring and supplementing the baseline as needed based on an organizational assessment of risk and local conditions.
  • Implement the security controls and describe how the controls are employed within the information system and its environment of operation.
  • Assess the security controls using appropriate assessment procedures to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system.
  • Authorize information system operation based on a determination of the risk to organizational operations and assets, individuals, other organizations, and the Nation resulting from the operation of the information system and the decision that this risk is acceptable.
  • Monitor the security controls in the information system on an ongoing basis including assessing control effectiveness, documenting changes to the system or its environment of operation, conducting security impact analyses of the associated changes, and reporting the security state of the system to designated organizational officials.
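The six steps above form a cycle rather than a straight line. A tiny sketch (step names follow the NIST descriptions; the code itself is only an illustration) makes the ordering, and the loop back from monitoring to categorization, explicit:

```python
from enum import Enum

class RMFStep(Enum):
    """The six NIST RMF steps, in order."""
    CATEGORIZE = 1
    SELECT = 2
    IMPLEMENT = 3
    ASSESS = 4
    AUTHORIZE = 5
    MONITOR = 6

def next_step(step: RMFStep) -> RMFStep:
    """Advance through the RMF; after MONITOR the cycle restarts,
    reflecting that risk management is continuous, not one-shot."""
    return RMFStep(step.value % len(RMFStep) + 1)

print(" -> ".join(s.name for s in RMFStep))
```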

***

Worst case scenario

All information systems process, store, and transmit information. What is the possible impact if a worst-case scenario endangers this information? A structured way to determine the potential impact on the confidentiality, integrity, and availability of information is the first step of the NIST RMF, the categorization of information systems; NIST SP 800-60 provides such guidance. The potential impact is assigned a qualitative value: low, moderate, or high. Based on these impact levels for each of the information types contained in the system, the “high water mark” level is calculated, which helps in selecting the appropriate controls in the subsequent steps.
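A minimal sketch of the high-water-mark rule (the function and rating names here are illustrative; see FIPS 199 and SP 800-60 for the authoritative tables): the system’s overall impact level is simply the maximum rating across every information type and every security objective.

```python
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def high_water_mark(info_types):
    """Overall system impact level: the maximum of the low/moderate/high
    ratings assigned to confidentiality, integrity, and availability
    for every information type the system handles."""
    worst = max(LEVELS[rating]
                for ratings in info_types
                for rating in ratings.values())
    return next(name for name, level in LEVELS.items() if level == worst)

# Example: a hypothetical clinical system handling two information types.
ehr_types = [
    {"confidentiality": "high", "integrity": "moderate", "availability": "moderate"},
    {"confidentiality": "moderate", "integrity": "low", "availability": "low"},
]
print(high_water_mark(ehr_types))
```

The resulting level then determines which of the three SP 800-53 control baselines applies in the selection step.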

Organizations need to mitigate risks adequately by selecting an appropriate set of controls that will work effectively. In the control-selection step, controls are chosen based on the categorization of the information system, its high water mark, and the goals of the organization. The baseline controls are selected from the NIST SP 800-53 catalog as one of three baselines corresponding to a low, moderate, or high impact rating of the information system. These baseline controls can be tailored to meet specific business needs and organizational goals, and supplemented with additional controls, if needed, to satisfy unique organizational policies, environmental factors, security requirements, and risk appetite. The minimum assurance requirements also need to be specified here.

All the activities necessary to put the selected controls in place are done in the implementation step. The implementation of the selected security controls will have an effect on the organization’s risks. NIST SP 800-70 can be used as guidance for implementation. An implementation strategy has to be planned, the actions defined, and the implementation plan reviewed and approved before the implementation is carried out.

Once the controls are implemented, an assessment of the security controls is performed to find out whether the controls have been correctly implemented, are working as intended, and are giving the desired output with respect to the security requirements. In short: whether the applied controls are indeed the right ones, implemented in the right way, giving the right outcome. NIST SP 800-53, NIST SP 800-53A, and NIST SP 800-115 provide the necessary guidance here.

IS authorization

The authorization of an information system is an official management decision that the system may be made operational, with the identified risks mitigated and the residual risks accepted; the authorizing official is accountable for any adverse impacts on the confidentiality, integrity, and availability of the system. If the authorizing personnel find that the risks have not been mitigated and could compromise sensitive information, they can deny the authorization. NIST SP 800-37 provides guidance on authorization. The authorizing personnel should be involved actively throughout the risk management process.

Risk management is not a one-time process that, once done, is forgotten. It is a continuous process, to be integrated with day-to-day activities. One of the key aspects of any risk management program is the monitoring of security controls to check whether they are performing as intended. The main focus of monitoring is to know whether the controls are still effective over a period of time, given the changes that occur in information systems: changes in hardware, software, and firmware; changes in environmental factors; changes in operating conditions; and so on. NIST SP 800-37 provides guidance here as well. If the security controls are found to be ineffective, the cycle starts again, with re-categorization, selection of another set of baseline controls, or reassessment of the effectiveness of the controls.

In all the steps of the risk management framework, one of the most important aspects is communication. Appropriate documents need to be generated in every step, reviewed, and kept up to date.

Assessment

Organizational risk management provides great benefits because it helps to prioritize resources, increase interoperability, and reduce the costs incurred from adverse events. It also helps to prevent unauthorized access to personally identifiable information that could otherwise lead to security breaches.

Conclusion

Your thoughts are appreciated.

***


***
