Proposed Changes to the Mark-to-Market Accounting Rule

March 28, 2009

Below is a summary of the changes FASB has proposed to FASB Statement No. 157, Fair Value Measurements.  This has been widely debated in the blogosphere and is set to have a major impact on the valuations of financial firms if it is passed this coming week on April 2nd.  The proposal is likely part of the tide lifting financial shares recently, too.  Some of my earlier comments on mark-to-market (fair value) accounting can be found in earlier posts: Comments on: Study on Mark-to-Market Accounting; Battleship; Is capitalism fated to lead to socialism?; and IMF Global Financial Stability Report October ‘08

The full content of the proposal is available at FASB here.


10. When evaluating whether it is necessary to make a significant adjustment to quoted prices for identical or similar assets or liabilities in markets that are not active, reporting entities shall apply the following two-step approach (this approach requires significant judgment):

Step 1: Determine whether there are factors present that indicate that the market for the asset is not active at the measurement date. Factors include: 

     a. Few recent transactions (based on volume and level of activity in the market).  Thus, there is not sufficient frequency and volume to provide pricing information on an ongoing basis.     

     b. Price quotations are not based on current information.

     c. Price quotations vary substantially either over time or among market makers (for example, some brokered markets).

     d. Indices that previously were highly correlated with the fair values of the asset are demonstrably uncorrelated with recent fair values.

     e. Abnormal (or significant increases in) liquidity risk premiums or implied yields for quoted prices when compared to reasonable estimates of credit and other nonperformance risk for the asset class.

     f. Significant widening of the bid-ask spread.

     g. Little information is released publicly (for example, a principal-to-principal market).

If after evaluating all the factors the sum of the evidence indicates that the market is not active, the reporting entity shall apply step 2.

Step 2: Evaluate the quoted price (that is, a recent transaction or broker price quotation) to determine whether the quoted price is not associated with a distressed transaction. The reporting entity shall presume that the quoted price is associated with a distressed transaction unless the reporting entity has evidence that indicates that both of the following factors are present for a given quoted price: 

     a. There was a period prior to the measurement date to allow for marketing activities that are usual and customary for transactions involving such assets or liabilities (for example, there was not a regulatory requirement to sell).

     b. There were multiple bidders for the asset.

11. If the reporting entity has evidence that both of the factors are present for a given quoted price, then that quoted price is presumed not to be associated with a distressed transaction. In that case, the quoted price may be a relevant observable input that shall be considered in estimating fair value. However, the reporting entity should consider whether any other factors or conditions warrant making an adjustment to the quoted price (see paragraph 29). For example, if a quoted price that is not associated with a distressed transaction is not current or is a consequence of a trade with an insignificant volume relative to the total market for that asset, the reporting entity should consider whether that quoted price is a relevant observable input (that is, whether the quoted price requires adjustment).

12. If the reporting entity does not have evidence that both of these factors are present for a given quoted price (including because there is insufficient information on which to base a conclusion), then the reporting entity shall consider the quoted price to be associated with a distressed transaction and shall use a valuation technique other than one that uses the quoted price without significant adjustment (that is, a significant adjustment is required, resulting in a Level 3 measurement). For example, the reporting entity could use an income approach (that is, a present value technique) to estimate fair value. However, the fair value resulting from the present value technique shall not be derived solely from inputs based on the quoted price associated with a distressed transaction. The inputs should be reflective of an orderly (that is, not distressed or forced) transaction between market participants at the measurement date. An orderly transaction would reflect all risks inherent in the asset, including a reasonable profit margin for bearing uncertainty that would be considered by market participants (that is, willing buyers and willing sellers) in pricing the asset in a non-distressed transaction. 
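The income approach in paragraph 12 can be sketched as a simple present value calculation. This is a toy illustration, not the FSP's prescribed method: the cash flows, rates, and the liquidity premium below are made-up numbers standing in for the "risks inherent in the asset" that a market participant would price into an orderly transaction.

```python
# Toy sketch of an income approach (present value technique) for a
# Level 3 measurement when the quoted price comes from a distressed
# transaction. All inputs are illustrative, not from the FSP.

def present_value(cash_flows, risk_free_rate, credit_spread, liquidity_premium):
    """Discount expected cash flows at a rate reflecting the risks a
    market participant would price in an orderly (non-distressed) sale."""
    discount_rate = risk_free_rate + credit_spread + liquidity_premium
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Hypothetical asset: 9% coupon on 1,000 face, five periods to maturity
expected_cfs = [90, 90, 90, 90, 1090]

fair_value = present_value(expected_cfs,
                           risk_free_rate=0.03,
                           credit_spread=0.04,
                           liquidity_premium=0.02)
print(round(fair_value, 2))  # → 1000.0 (a 9% coupon discounted at a 9% yield prices at par)
```

The point of the structure is that the distressed quote never enters the calculation directly; it can at most inform the size of the liquidity premium.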


13. The staff proposes prospective transition. Changes in fair value resulting from the application of the FSP are considered changes in estimate and affect results in the period of adoption. The staff believes there are two effective date alternatives: 

     a. Effective for interim and annual periods ending after March 15, 2009.

     b. Effective for interim and annual periods ending after June 15, 2009. Early adoption would be permitted.

14. The staff recommends that a final FSP be effective for interim and annual periods ending after March 15, 2009.


15. The staff recommends a comment period of 15 days ending April 1 so that the Board can finalize the proposed FSP at its Board meeting on April 2.

Proposed FSP FAS 157-x, Determining Whether a Market is Not Active and a Transaction is Not Distressed
FASB, March 16, 2009

U.S. financial firms want more on mark-to-market
Rachelle Younglai, Reuters, March 24, 2009


The Economist Has No Clothes

March 20, 2009

Stumbled on this old article from March 2008, one year ago.  The research here goes deeper than my own, but the sentiment is the same as what I have noted before, such as in Syndicate This: Carbon Bonds.  The article pasted below, from Scientific American, is a very worthy read on the unsustainability of the neoclassical economics model.

In a closed-loop world, where resources are constrained, you cannot externalize environmental waste, damage, or pollution.  The invisible hand has only allowed those with means and access to markets to steal resources from those with little or no knowledge.

The Economist Has No Clothes, March, 2008 in Society & Policy

Unscientific assumptions in economic theory are undermining efforts to solve environmental problems

By Robert Nadeau

The 19th-century creators of neoclassical economics—the theory that now serves as the basis for coordinating activities in the global market system—are credited with transforming their field into a scientific discipline. But what is not widely known is that these now legendary economists—William Stanley Jevons, Léon Walras, Francis Edgeworth and Vilfredo Pareto—developed their theories by adapting equations from 19th-century physics that eventually became obsolete. Unfortunately, it is clear that neoclassical economics has also become outdated. The theory is based on unscientific assumptions that are hindering the implementation of viable economic solutions for global warming and other menacing environmental problems.

The physical theory that the creators of neoclassical economics used as a template was conceived in response to the inability of Newtonian physics to account for the phenomena of heat, light and electricity. In 1847 German physicist Hermann von Helmholtz formulated the conservation of energy principle and postulated the existence of a field of conserved energy that fills all space and unifies these phenomena. Later in the century James Maxwell, Ludwig Boltzmann and other physicists devised better explanations for electromagnetism and thermodynamics, but in the meantime, the economists had borrowed and altered Helmholtz’s equations.

The strategy the economists used was as simple as it was absurd—they substituted economic variables for physical ones. Utility (a measure of economic well-being) took the place of energy; the sum of utility and expenditure replaced potential and kinetic energy. A number of well-known mathematicians and physicists told the economists that there was absolutely no basis for making these substitutions. But the economists ignored such criticisms and proceeded to claim that they had transformed their field of study into a rigorously mathematical scientific discipline.

Strangely enough, the origins of neoclassical economics in mid-19th century physics were forgotten. Subsequent generations of mainstream economists accepted the claim that this theory is scientific. These curious developments explain why the mathematical theories used by mainstream economists are predicated on the following unscientific assumptions:

  • The market system is a closed circular flow between production and consumption, with no inlets or outlets.
  • Natural resources exist in a domain that is separate and distinct from a closed market system, and the economic value of these resources can be determined only by the dynamics that operate within this system.
  • The costs of damage to the external natural environment by economic activities must be treated as costs that lie outside the closed market system or as costs that cannot be included in the pricing mechanisms that operate within the system.
  • The external resources of nature are largely inexhaustible, and those that are not can be replaced by other resources or by technologies that minimize the use of the exhaustible resources or that rely on other resources.
  • There are no biophysical limits to the growth of market systems.

If the environmental crisis did not exist, the fact that neoclassical economic theory provides a coherent basis for managing economic activities in market systems could be viewed as sufficient justification for its widespread applications. But because the crisis does exist, this theory can no longer be regarded as useful even in pragmatic or utilitarian terms because it fails to meet what must now be viewed as a fundamental requirement of any economic theory—the extent to which this theory allows economic activities to be coordinated in environmentally responsible ways on a worldwide scale. Because neoclassical economics does not even acknowledge the costs of environmental problems and the limits to economic growth, it constitutes one of the greatest barriers to combating climate change and other threats to the planet. It is imperative that economists devise new theories that will take all the realities of our global system into account.

The Economist Has No Clothes
Robert Nadeau, Scientific American, March 2008

Suburbia R.I.P.

March 19, 2009
David Dees: Foreclosure Street

Nice article in Fast Company about the end of suburbia.  While I think the end of suburbia taken to the extreme is far-fetched, I do think that suburbia as we know it will be transformed over the next 40 years or so from commuter living to community living.  The same mechanics driving sustainable distributed energy will also pave the way for a new suburbanism, one built on almost-off-the-grid living combined with the acreage necessary for micro farming.  With home prices in Detroit averaging $7,500, this idea is not all that far-fetched or expensive.  In any event, the Fast Company article pays homage to New Urbanism, a movement that I think will also succeed in transforming America’s largest cities.

Does the downturn spell the beginning of the end for suburbia? Some experts say yesterday’s cul-de-sac is tomorrow’s ghost town.

The downturn has accomplished what a generation of designers and planners could not: it has turned back the tide of suburban sprawl. In the wake of the foreclosure crisis many new subdivisions are left half built and more established suburbs face abandonment. Cul-de-sac neighborhoods once filled with the sound of backyard barbecues and playing children are falling silent. Communities like Elk Grove, Calif., and Windy Ridge, N.C., are slowly turning into ghost towns with overgrown lawns, vacant strip malls and squatters camping in empty homes. In Cleveland alone, one of every 13 houses is now vacant, according to an article published Sunday in The New York Times magazine.


The demand for suburban homes may never recover, given the long-term prospects of energy costs for commuting and heating, and the prohibitive inefficiencies of low-density construction. The whole suburban idea was founded on disposable spending and the promise of cheap gas. Without them, it may wither. A study by the Metropolitan Institute at Virginia Tech predicts that by 2025 there will be as many as 22 million unwanted large-lot homes in suburban areas.

The suburb has been a costly experiment. Thirty-five percent of the nation’s wealth has been invested in building a drivable suburban landscape, according to Christopher Leinberger, an urban planning professor at the University of Michigan and visiting fellow at the Brookings Institution. James Howard Kunstler, author of “The Geography of Nowhere,” has been saying for years that we can no longer afford suburbs. “If Americans think they’ve been grifted by Goldman Sachs and Bernie Madoff, wait until they find out what a swindle the so-called ‘American Dream’ of suburban life turns out to be,” he wrote on his blog this week.


So what’s to become of all those leafy subdivisions with their Palladian detailing and tasteful signage? Already, low- and middle-income families priced out of cities and better neighborhoods are moving into McMansions divided for multifamily use. Alison Arieff, who blogs for The New York Times, visited one such tract mansion that was split into four units, or “quartets,” each with its own entrance, which is not unlike what happened to many stately homes in the 1930s. The difference, of course, is that the 1930s homes held up because they were made with solid materials, while today’s spec homes are all hollow doors, plastic columns and faux stone facades.

There is also speculation that subdivision homes could be dismantled and sold for scrap now that a mini-industry for repurposed lumber and other materials has evolved over the last few years. Around the periphery of these discussions is the specter of the suburb as a ghost town patrolled by squatters and looters, as if Mad Max had come to the cul-de-sac.

If the suburb is the big loser in the mortgage-crisis episode, then who is the winner? Not surprisingly, the New Urbanists, a group of planners, developers and architects devoted to building walkable towns based on traditional designs, have interpreted the downturn as vindication of their plans for mixed-use communities where people can stroll from their homes to schools and restaurants.

Richard Florida, a Toronto business professor and author of “Who’s Your City?: How the Creative Economy Is Making Where to Live the Most Important Decision of Your Life,” argues that dense and diverse cities with “accelerated rates of urban metabolism” are the communities most likely to innovate their way through economic crisis. In an article published in this month’s issue of The Atlantic, he posits that New York is at a relative advantage, despite losing a chunk of its financial engine, because the jostling proximity of architects, fashion designers, software writers and other creative types will reenergize its economy.


Suburbia R.I.P.

Michael Cannell, Fast Company, March 11, 2009

Artwork: “Foreclosure St.”
David Dees

Market Efficiency & Market Liquidity

March 18, 2009

Circa 1970, Eugene Fama popularized a body of research known as random walk theory and, building on many earlier efforts, published his landmark paper on the Efficient Market Hypothesis (EMH), defining weak, semi-strong, and strong forms of market efficiency.  Efficient market theories have often been criticized by academics in the field of behavioral finance.

In essence, the body of work that grew out of Fama’s and others’ efforts set out to prove that the average investor was wasting effort picking stocks: if U.S. markets were fully efficient, it would be impossible to find a “cheap” or undervalued security.  Strong-form efficiency is summarized below:

  • Share prices reflect all information, public and private, and no one can earn excess returns.
  • If there are legal barriers to private information becoming public, as with insider trading laws, strong-form efficiency is impossible, except in the case where the laws are universally ignored.
  • To test for strong-form efficiency, a market needs to exist where investors cannot consistently earn excess returns over a long period of time. Even if some money managers are consistently observed to beat the market, no refutation even of strong-form efficiency follows: with hundreds of thousands of fund managers worldwide, even a normal distribution of returns (as efficiency predicts) should be expected to produce a few dozen “star” performers.
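The last bullet's point about "star" performers can be made concrete with a back-of-the-envelope calculation. A minimal sketch, assuming (purely for illustration) that each manager has an independent 50/50 chance of beating the market in any given year:

```python
# Toy illustration of the "star performer" point: even with no skill
# at all (a 50/50 chance of beating the market each year), a large
# population of managers still produces long winning streaks by luck.

def expected_streak_survivors(n_managers, n_years, p_beat=0.5):
    """Expected number of managers who beat the market every single year,
    assuming independent coin-flip outcomes with probability p_beat."""
    return n_managers * p_beat ** n_years

# 100,000 managers, 10 straight years of outperformance
print(expected_streak_survivors(100_000, 10))  # → 97.65625, i.e. roughly 98 "stars" by chance
```

So observing a few dozen managers with decade-long streaks is exactly what a no-skill world would produce, which is why such streaks cannot refute strong-form efficiency on their own.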

Our recent market crisis has me wondering what role liquidity plays in EMH or behavioral finance.  Efficiency in market pricing rests on the free flow of information and on information symmetry, where everything is essentially freely known to everybody.  Barring insider trading, and with Sarbanes-Oxley in place, one would think markets would have become more efficient.

However, with indexes moving 4, 5, and 6% in either direction on a daily basis, our current market has become very inefficient.  My unresearched and untested hypothesis, let’s call it LMH or the Liquid Market Hypothesis, is that market liquidity carries a positive correlation to market efficiency.  I have often heard it said recently that this is not an investor’s market but a trader’s market.  That could not be more true.  Good traders today know that if they pump liquidity into a security, momentum alone will help boost their profits.  In fact, managed futures funds have been star performers in this environment, and managed futures are fairly straightforward momentum strategies following trend-trading techniques.
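For what it's worth, the LMH could in principle be tested by correlating a liquidity proxy against an inefficiency proxy. A minimal sketch with made-up sample data, using daily volume as the liquidity proxy and the absolute daily index move as the inefficiency proxy (the hypothesis predicts a negative correlation between the two series):

```python
# Sketch of one way to test the (untested) Liquid Market Hypothesis:
# correlate a liquidity proxy with an inefficiency proxy. The sample
# data below are fabricated purely for illustration.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

daily_volume = [5.1, 4.8, 3.2, 2.5, 2.1, 1.8]    # liquidity proxy (billions of shares)
abs_daily_move = [0.8, 1.1, 2.4, 3.9, 4.7, 5.6]  # inefficiency proxy (% index move)

r = pearson(daily_volume, abs_daily_move)
print(round(r, 2))  # strongly negative in this toy sample, consistent with LMH
```

A real test would obviously need actual market data and a better-motivated efficiency measure than raw volatility, but the mechanics would look roughly like this.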

Part of the painful result of massive deleveraging is that a large pool of dollars that used to provide liquidity to the market is now gone, and much of it will never return in the wake of the mega-investment-banking model gone bust.  As a result, we are now in a world of higher volatility and even more opacity.

Lastly, taking the Liquid Market Hypothesis to the next level, I also wonder whether there is a relationship between market prices and the amount of information available about them.  Said another way, the internet, email, instant messaging, and the proliferation of business news networks have changed the information-to-investment ratio.  If one could count all the bits and bytes of information available to the trading public over time and form a ratio against the amount of liquidity available, I wonder what the result would look like.

My guess is that over the last year the amount of “noise” has skyrocketed while liquidity has plummeted.  This will benefit good stock pickers and excellent traders for some time to come.  In the meantime, look forward to seeing your 401(k) (201(k)?) bounce around at depressed levels as a reminder that, as Jon Stewart alluded to in his interview with Jim Cramer, you are just a pawn in someone else’s game.

Cambridge Associates Outlook for Hedge Funds

March 17, 2009

Cambridge Associates, the venerable investment consultancy, recently published a piece on its outlook for hedge funds as an asset class.  While nothing in it is terribly earth-shattering, it is always good to know what these guys are telling their clients, who represent billions in public and private investment pools.  Click on the link below to access the original document; I have pasted the conclusion here.

Cambridge Associates: Is the Hedge Fund Business Model broken? (January 2009)


Is the “Hedge Fund Business Model” Broken?
Cambridge Associates, January 2009

Was the Cold War Won by Reaganomics?

March 16, 2009

Here’s a novel idea giving credit to a regime I have very mixed feelings about; as far as I know this is, for better or worse, all my own dogma.

Was Reaganomics about economic progress, or was it really about geopolitical strategy?  Hindsight being 20/20, one could research the idea that 28 years of deregulation and leverage helped us win the Cold War.  Trickle-down economics was possibly the single greatest force that allowed us to defeat the Russians and the prevailing global socialist regime.  It both afforded us a growing tax base, created through economic stimulus, to redeploy into defense spending, and simultaneously signaled to the world that capitalism had succeeded and that American democratic capitalism would lead to the greatest prosperity the world had ever seen.

If one subscribes to this, though, it took only about nine years until the Berlin Wall came down; the following 20 or so years were a holdover from a policy that few understood but that everyone enjoyed.  I give the thought credit on the basis that it would be awfully poetic if the same policy that helped us disarm the Russians inadvertently forced us to concede geopolitical leadership to the Chinese.

Anyone else seen work on this idea?

In 2007 Cramer Warned Passionately that Trouble Lay Ahead

March 15, 2009

For all the hoopla this week over Jon Stewart taking Cramer to task for his call to buy Bear Stearns, some may have overlooked Cramer’s passionate on-air plea in August 2007 for Bernanke to cut rates and open the discount window in order to save Bear and other large banks from failing.  Huh.  That was seven months before Bear Stearns fell.