Monday, November 26, 2012

The CAPM Debate and the Search for "True Beta"

“What has happened is that we’ve used these assumptions for so long that we’ve forgotten that we’ve merely made assumptions, and we’ve come to believe that the world is necessarily this way.”
                                                                        ~Resistance is Futile: The Assimilation of Behavioral Finance

Conventional investment theory states that when an investor constructs a well-diversified portfolio, the unsystematic sources of risk are diversified away, leaving systematic, or non-diversifiable, risk as the relevant risk. The capital asset pricing model (CAPM), developed by Sharpe (1964), Lintner (1965) and Black (1972) [zero-beta version], asserts that the correct measure of this riskiness is the "beta coefficient", or simply "beta".

Effectively, beta is a measure of an asset's correlated volatility relative to the volatility of the overall market. Consequently, given an asset's beta, the risk-free rate and the expected market return, the CAPM predicts the expected return for that asset and, correspondingly, its expected risk premium.
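For illustration, the back-of-the-envelope sketch below (in Python, with purely hypothetical return series and rates) estimates beta from sample returns and plugs it into the CAPM relation; none of the numbers come from real data.

import numpy as np

# Hypothetical monthly returns for an asset and the market (illustrative only).
asset_returns = np.array([0.02, -0.01, 0.03, 0.015, -0.02, 0.01])
market_returns = np.array([0.015, -0.005, 0.02, 0.01, -0.015, 0.005])

# Beta: covariance of asset and market returns divided by the market variance.
beta = np.cov(asset_returns, market_returns, ddof=1)[0, 1] / np.var(market_returns, ddof=1)

# CAPM: expected return = risk-free rate + beta * (expected market return - risk-free rate)
risk_free = 0.03          # assumed annual risk-free rate
market_expected = 0.08    # assumed expected annual market return
expected_return = risk_free + beta * (market_expected - risk_free)
risk_premium = expected_return - risk_free

print(f"beta = {beta:.2f}, expected return = {expected_return:.2%}, risk premium = {risk_premium:.2%}")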

This explanation is textbook. However, unbeknownst to most, there has been a long-running argument in academic circles over the CAPM and other pricing models, even within the milieu of traditional investments. Without going into the details of this debate, certain empirical studies have revealed "cross-sectional variations" in returns that the CAPM fails to explain, calling the validity of the model into question.

In response to Fama and French's (1992) challenge, Jagannathan and Wang (1996) theorized that “…the lack of empirical support for the CAPM may be due to the inappropriateness of some assumptions made to facilitate the empirical analysis of the model. Such an analysis must include a measure of the return on the aggregate wealth portfolio of all agents in the economy.”

Financial institutions have not been left behind by these evolving academic theories. Index creation and benchmarking have become standard fare. Since the introduction of exchange-traded funds (ETFs), a veritable industry has developed around the "multiple beta" concept. But by no means has the plethora of these instruments captured every aspect of the aggregate wealth portfolio of the global economy, although, given new product development, this would seem to be the undeclared objective.

This backdrop provides the principal context and impetus for the notion of "exotic betas". The term, a relatively recent addition to the investment lexicon that evolved from ideas advanced by proponents of alternative investments, suggests that certain alternative investment assets and/or strategies, representing commonly pursued market paradigms, can be identified, tracked and replicated using a predefined passive approach/model similar to traditional index construction.

This leaves open the question as to whether institutions, through sophisticated financial engineering, can truly capture in a passive way all possible sources of return in the global economy. Or, does some aspect which the industry loosely calls alpha (i.e., skill-based returns) always remain outside the grasp of such institutions’ arbitrary models of beta?

Jagannathan - The CAPM Debate

References:
Black, Fischer (1972). “Capital Market Equilibrium with Restricted Borrowing” Journal of Business 45, July, pp. 444-455.

Fama, Eugene F.; French, Kenneth R. (1992). “The Cross-Section of Expected Stock Returns” Journal of Finance 47, June, pp. 427-465.

Jagannathan, Ravi; McGrattan, Ellen R. (1995). “The CAPM Debate” Federal Reserve Bank of Minneapolis Quarterly Review, Vol. 19, No. 4, Fall 1995, pp. 2-17

Jagannathan, Ravi; Wang, Zhenyu (1993). “The CAPM is Alive and Well” Research Department Staff Report 165. Federal Reserve Bank of Minneapolis

Jagannathan, Ravi; Wang, Zhenyu (1996). “The Conditional CAPM and the Cross-Section of Expected Returns” Journal of Finance, Vol. 51, No. 1, March, pp. 3-53.

Tuesday, November 13, 2012

Case Study: Roadmap to Dodd-Frank Compliance

Procrastination is opportunity's assassin. ~Victor Kiam, American entrepreneur

Last week, on the day of President Obama's re-election, CommodityPoint, a division of energy and utilities consultancy UtiliPoint International Inc., published a report on the current state of progress in implementing Dodd-Frank compliant processes and technology.[1] The study's survey responses indicated "widespread doubt... as to when the regulations will eventually come into force" and a "significant amount of confusion as to the requirements and burdens".

CommodityPoint's warnings to the industry could not be more clear:
In reviewing the data, there appears to be a general lack of urgency on the part of many market participants...
[A] lack of movement by a significant number of market participants (especially in the ‘end-user’ segment) is creating a large backlog of work across the industry... this circumstance should be considered a significant risk as companies consider their compliance planning and efforts.

As with all regulations, companies exposed to Dodd-Frank rules will be considered guilty until they prove themselves innocent... continuously and consistently.
Clearly the time for "wait-and-see" is over. Firms need to apply a structured and proactive approach going forward despite the difficulty of anticipating how regulations will eventually evolve. Key is preparation. Firms that jump-started their initiatives based on rule proposals are going to be better prepared than those left scrambling after final rules are published.

To be blunt, notwithstanding that the industry is taking the battle to the courts, time is starting to run out...

According to Davis Polk's October 2012 Progress Report, 45 out of the 90 Dodd-Frank required Title VII rules have been finalized. However, this halfway mark doesn't tell the whole story. The CFTC has finalized 80% or 34 out of 43 rules with only 9 proposed rules having missed their deadline. The SEC, on the other hand, is behind the eight ball with 9 rules finalized and 11 proposed out of the 20 required for which deadlines have passed.[2]

One advantage that results from basing work on final rules is increased certitude. Firms are now better positioned to determine where gaps exist in their processes, prioritize activities required to comply with regulations, and convert requirements into implementation tasks. Key is having a methodology which considers the broad contours of areas impacted, along with the recognition that each new rule brings about new challenges. A robust methodology, in turn, generates the roadmap.

Establishing project scope

This case study applies IQ3 Group's strategy assessment framework as a method for scoping regulatory projects and forging a pathway to compliance. The underlying approach involves topic decomposition until all relevant areas of investigation are defined in sufficient detail. The result of this analysis is then mapped to deliverables that need to be developed.

Figure 1


The CFTC has identified 8 categories and 38 areas where rules are necessary under Dodd-Frank Title VII and Title VIII.[3] A closer look, however, reveals 61 Federal Register (FR) publications of which 17 are still in the proposal stage with the balance representing final rules. The table below lists the 61 final and proposed rules within the categories established by the CFTC. Within each category, the rulemaking area is sorted by the effective date. Proposed rules are sorted by date of publication.

Figure 2


While the above table is helpful in providing a high-level overview of Title VII and Title VIII CFTC rules, it still does not provide detailed descriptions of the rules, or a calendar of compliance dates, which may differ from the effective date of the rule.

The following compliance matrix and calendar is a more complete overview and is also available in a spreadsheet format.

Dodd-Frank Act - Final and Proposed Rules Compliance Matrix and Calendar


Hypothesis and data collection

In order to focus the collection of data and begin to analyze and develop solutions, there must be a basis from which to drive an understanding of the issues. What are the potential operational gaps, problems or opportunities inherent in the rulemaking areas? What procedures and processes do we need to institute or re-engineer? What technology systems do we need to implement? What data will we need to support new processes and technology systems? Is compliance with these new regulations the responsibility of a particular department, or a firm-wide organizational commitment?

Formulating hypotheses (educated hunches to be tested) provides a focus for data collection and gives form to findings, which ultimately lead to conclusions and the development of recommendations. Subject matter expertise is often required to formulate relevant hypotheses. From hypotheses, specific questions can be developed that will help drive data collection.

Figure 3


Industry comments to proposed rules, and the corresponding CFTC responses within the FR publications, are an excellent source of insight. The following discussion on affiliates that qualify for the end-user exception[4] is a good example:
Section 2(h)(7)(D)(i) of the CEA provides that an affiliate of a person that qualifies for the end-user exception… may qualify for the exception only if the affiliate… uses the swap to hedge or mitigate the commercial risk of the person or other affiliate of the person that is not a financial entity.
As the CFTC reiterates, an affiliate may elect the end-user exception, even if it is a financial entity, if the affiliate complies with the requirement that the swap is used to hedge or mitigate commercial risk; provided, however, that the affiliate is not a swap dealer or major swap participant. Nevertheless, Shell Energy North America (US) raises the issue that:
...potential electing counterparties that centralize their risk management through a hedging affiliate that is designated as a swap dealer or major swap participant may be unable to benefit from the end-user exception. As a result, many potential electing counterparties may need to restructure their businesses and risk management techniques, thereby losing the many benefits of centralized hedging.
Kraft, Philip Morris and Siemens Corp clarify that this concern relates to how treasury subsidiaries function:
...the Commission should exclude wholly-owned treasury subsidiaries of non-financial companies from the ‘‘financial entity’’ definition, to the extent that they solely engage in swap transactions to hedge or mitigate the commercial risks of an entire corporate group. These commenters noted in particular that the treasury subsidiaries may be, or are likely to be, "financial entities" ... because they are predominantly engaged in activities of a financial nature as defined in Section 4(k) of the Bank Holding Company Act.
In response, the CFTC states that it lacks discretion because Congress specifically defined financial entities (which cannot use the end-user exception) to include swap dealers and major swap participants. Further, Congress specifically outlines who may qualify as an affiliate eligible for the end-user exception. The specificity with which Congress defines these concepts constrains the CFTC’s discretion in this area. The CFTC, however, notes "it is important to distinguish where the treasury function operates in the corporate structure" and then establishes means by which concerns can be alleviated:
Treasury affiliates that are separate legal entities and whose sole or primary function is to undertake activities that are financial in nature as defined under Section 4(k) of the Bank Holding Company Act are financial entities as defined in Section 2(h)(7)(C)(VIII) of the CEA because they are ‘‘predominantly engaged’’ in such activities. If, on the other hand, the treasury function through which hedging or mitigating the commercial risks of an entire corporate group is undertaken by the parent or another corporate entity, and that parent or other entity is entering into swaps in its own name, then the application of the end-user exception to those swaps would be analyzed from the perspective of the parent or other corporate entity directly.
In other words, a parent company or other corporate entity predominantly engaged in manufacturing, agriculture, retailing or energy may elect the end-user exception for inter-affiliate swaps. The CFTC explains how:
If the parent or other corporate entity then aggregates the commercial risks of those swaps with other risks of the commercial enterprise and hedges the aggregated commercial risk using a swap with a swap dealer, that entity may, in its own right, elect the end-user exception for that hedging swap. The parent or other corporate entity in the example is not a ‘‘financial entity’’ as defined in Section 2(h)(7)(C)(VIII) of the CEA, because that entity is ‘‘predominantly engaged’’ in other, nonfinancial activities undertaken to fulfill its core commercial enterprise purpose. However, if the parent or other corporate entity, including, for example, a separately incorporated treasury affiliate, is a ‘‘financial entity,’’ then that entity cannot elect the end-user exception unless one of the specific affiliate provisions of the statute, Section 2(h)(7)(C)(iii) or Section 2(h)(7)(D), apply.
Generally speaking, the CFTC notes that Congress did not treat inter-affiliate swaps differently from other swaps in Section 2(h)(7) of the CEA. Accordingly, if one of the affiliates is not a financial entity and is using the swap to hedge or mitigate commercial risk, even if the other affiliate is a financial entity, the non-financial entity affiliate may elect the end-user exception and neither affiliate needs to clear the swap. Based on this analysis, such entities face a strategic choice...


Findings and conclusions


Assuming that a corporate entity engaged in commercial activities is structured to include a treasury subsidiary engaged in swaps which hedges the commercial risks of the corporate group, such subsidiary can: (i) continue to operate as a "financial entity" and if applicable register as a swap dealer or major swap participant; or (ii) seek to elect the end-user exception by restructuring where in the corporate structure swaps are transacted. But we are getting ahead of ourselves...

After collecting data from questions based on our hypotheses, the next step is synthesizing such data to derive findings and draw conclusions about what was learned. Findings and conclusions are defined as follows:
  • Finding: a summary statement, derived from raw data, that directs our thinking toward solutions or opportunities regarding a problem.
  • Conclusion: a diagnostic statement, based on the data and findings, that explains problems or opportunities and is significant enough to warrant action.

Figure 4



In performing an in-depth examination of the final end-user exception rule to the clearing requirement for swaps we can arrive at findings and conclusions appropriate to the context of a market participant that may fall within such category. To accomplish this task, IQ3 Group assembled a variety of decision flow charts including the "end-user exception" (see Figure 5 below).[5]

Below we step through the analysis of §39.6 "Exceptions to the clearing requirement":
Under §39.6(a)(1), a counterparty to a swap may elect the exception to the clearing requirement provided that: [1] under §39.6(a)(1)(i), it is not a "financial entity" as defined by CEA §2(h)(7)(C)(i)*; [2] under §39.6(a)(1)(ii), it is using the swap to hedge or mitigate commercial risk as provided by CEA §2(h)(7)(A)(ii) or §39.6(b)(1)(ii)(B); and [3] it provides, or causes to be provided, the required information to a registered swap data repository (SDR) or, if no SDR is available, to the Commission. A counterparty that satisfies these criteria and elects the exception is an "electing counterparty".

*Under §39.6(d), for purposes of CEA §2(h)(7)(A), an entity that is a financial entity because of CEA §2(h)(7)(C)(i)(VIII) shall be exempt if: (i) it is organized as a certain type of bank [e.g., organized as a bank as defined in §3(a) of the Federal Deposit Insurance Act; see §39.6(d)(i)]; and (ii) it has total assets of $10 billion or less on the last day of the entity's most recent fiscal year.

When electing the exception under CEA §2(h)(7)(A), one of the counterparties (the "reporting counterparty") shall provide, or cause to be provided, the required information to a registered swap data repository (SDR) or, if no SDR is available, to the Commission. Under §39.6(b)(3), each reporting counterparty needs to have a reasonable basis to believe the electing counterparty meets the requirements for an exception to the clearing requirement.[6]

Under §39.6(b), the reporting counterparty will provide information in the following form and manner: (i) notice of the election of the exception; (ii) the identity of the electing counterparty; and (iii) the information described below [see §39.6(b)(iii)], unless...

...such information has previously been provided by the electing counterparty in a current annual filing pursuant to §39.6(b)(2), which states that an entity under this section may report the information annually in anticipation of electing the exception for one or more swaps. Further, any such reporting shall be effective for 365 days following the date of such reporting, provided the entity shall amend such information as necessary to reflect any material changes to the information reported.

Under §39.6(b)(iii) the following information shall be provided by the reporting counterparty:

(A) Whether the electing counterparty is a "financial entity," and if yes, whether it is: (1) electing in accordance with §2(h)(7)(C)(iii) or §2(h)(7)(D); or (2) exempt from the definition of "financial entity" as described in §39.6(d).

(B) Whether the swap(s) for which the electing counterparty is electing the exception are used by the electing counterparty to hedge or mitigate commercial risk as provided in §39.6(c). [See §39.6(c) discussion below.]

(C) How the electing counterparty generally meets its financial obligations associated with entering into non-cleared swaps by identifying one or more of the following categories, as applicable: (1) a written credit support agreement; (2) pledged or segregated assets; (3) a written third-party guarantee; (4) the electing counterparty's available financial resources; or (5) means other than those described.

(D) Whether the electing counterparty is an entity that is an issuer of securities registered under section 12 of, or is required to file reports under section 15(d) of, the Securities Exchange Act of 1934, and if so: (1) the relevant SEC Central Index Key number for that counterparty; and (2) whether an appropriate committee of that counterparty's board of directors has reviewed and approved the decision to enter into swaps that are exempt from the requirements of CEA §§2(h)(1) and 2(h)(8).

The following discussion analyzes a key concept pertinent to §39.6(c) "Hedging or mitigating commercial risk":
A swap is deemed to hedge or mitigate commercial risk if such swap:

(i) is economically appropriate to the reduction of risks in the conduct and management of a commercial enterprise, where the risks arise from §39.6(c)(i)(A), (B), (C), (D), (E), or (F);

(ii) qualifies as bona fide hedging for purposes of an exemption from position limits; or

(iii) qualifies for hedging treatment under (A) Financial Accounting Standards Board Accounting Standards Codification Topic 815, Derivatives and Hedging (formerly known as FAS 133) or (B) Governmental Accounting Standards Board Statement 53, Accounting and Financial Reporting for Derivative Instruments.

Additionally, a swap is deemed to hedge or mitigate commercial risk if such swap is:

(i) not used for a purpose that is in the nature of speculation, investing, or trading; and

(ii) not used to hedge or mitigate the risk of another swap or security-based swap position, unless that other position itself is used to hedge or mitigate commercial risk as defined by §39.6(c) or §240.3a67-4.
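Pulling the pieces of §39.6 together, the sketch below encodes one simplified reading of the election logic in code. It is illustrative only, not legal advice and not the CFTC's own logic: the attribute names are our own, the small-bank exemption is reduced to a flag plus an asset test, and the affiliate provisions of CEA §2(h)(7)(C)(iii) and §2(h)(7)(D) are omitted.

from dataclasses import dataclass

@dataclass
class Counterparty:
    """Illustrative attributes drawn from the Section 39.6 discussion above."""
    is_financial_entity: bool          # CEA 2(h)(7)(C)(i)
    is_swap_dealer_or_msp: bool        # swap dealer / major swap participant
    is_qualifying_bank: bool           # organized as a qualifying bank, etc. (39.6(d))
    total_assets_usd: float            # total assets at last fiscal year-end
    swap_hedges_commercial_risk: bool  # the 39.6(c) test
    information_reported: bool         # 39.6(b) reporting to an SDR (or the Commission)

def may_elect_end_user_exception(cp: Counterparty) -> bool:
    """Rough, simplified sketch of the end-user exception election under final rule 39.6."""
    # Swap dealers and major swap participants cannot elect the exception.
    if cp.is_swap_dealer_or_msp:
        return False
    # A "financial entity" is excluded unless it falls within the small-bank
    # exemption of 39.6(d): a qualifying bank-type institution with total
    # assets of $10 billion or less (simplified here).
    if cp.is_financial_entity:
        small_bank_exempt = cp.is_qualifying_bank and cp.total_assets_usd <= 10e9
        if not small_bank_exempt:
            return False
    # The swap must hedge or mitigate commercial risk (39.6(c)) and the
    # required information must be reported (39.6(b)).
    return cp.swap_hedges_commercial_risk and cp.information_reported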

Figure 5


A key finding identified by IQ3 Group regarding how the CFTC approached writing Title VII rules is the CFTC's departure from a legacy approach that relied on the concepts of "exclusion from the definition" and "exemption from the definition". As seen from the above analysis, if an entity transacts in swaps it falls under the definition, but can be "excepted from the definition". The burden of proof to elect such an exception, however, is upon the entity, which for that reason must continue to collect and report required data. If called upon by the regulators, such recordkeeping is necessary to support the election of the exception.


Generating recommendations

Based on conclusions regarding problems and opportunities, practical recommendations can be generated, evaluated and finalized. The first step is specifying alternatives including next steps, and describing the intended results and benefits related to each alternative. Such analysis should take into account existing conditions, as well as barriers and resource constraints. Recommendations should cover the topics and outputs originally scoped out, and trace back to address root findings:

Figure 6


Each regulation that is promulgated brings about new challenges. The underlying impetus of the Dodd-Frank regulations, however, is clear. By imposing "robust recordkeeping and real-time reporting regimes," regulators have signaled their intent to usher in stricter risk management across the financial system, supported by robust data governance and straight-through processing.

With that in mind, CommodityPoint's fulmination merits attention:
Given the potential legal and financial exposures of non-compliance... it is incumbent upon all levels of leadership, from risk managers to C-level executives, to create a culture of [Dodd-Frank] compliance within their companies. ...while the regulators' response will not be immediate, it will most likely be aggressive once in motion; and once a company is identified as one that has not been compliant in the past, that company will likely remain under CFTC scrutiny for a very long time.

Footnotes:

[1] Reames, P. and Bell, E. (2012). "2012 Dodd-Frank Market Survey and Report" CommodityPoint, sponsored by RiskAdvisory, November 2012

[2] Davis Polk & Wardwell LLP (2012). "Dodd-Frank Progress Report October 2012" Generated using the Davis Polk Regulatory Tracker™

[3] See: http://www.cftc.gov/LawRegulation/DoddFrankAct/Rulemakings/index.htm

[4] Federal Register / Vol. 77, No. 139 / Thursday, July 19, 2012 / Rules and Regulations (77 FR 42559)

[5] Decision flow chart based on Final Rule §39.6 "Exceptions to the clearing requirement" (77 FR 42590)

[6] The term "reasonable basis to believe" imposes a requirement upon the reporting counterparty that information from the electing counterparty supporting §39.6(b)(3) needs to be collected and maintained.

Saturday, November 3, 2012

Tower of Babel, Semantics Initiative, and Ontology

What's in a name? That which we call a rose by any other name would smell as sweet. ~ William Shakespeare 
The beginning of wisdom is to call things by their right names. ~ Chinese Proverb

At a symposium held by the Securities Industry and Financial Markets Association (SIFMA) in March 2012, Andrew G. Haldane, Executive Director of Financial Stability for the Bank of England, gave a speech titled, “Towards a common financial language”.[1] Using the imagery of the Tower of Babel, Mr. Haldane described how…
Finance today faces a similar dilemma. It, too, has no common language for communicating financial information. Most financial firms have competing in-house languages, with information systems silo-ed by business line. Across firms, it is even less likely that information systems have a common mother tongue. Today, the number of global financial languages very likely exceeds the number of global spoken languages.

The economic costs of this linguistic diversity were brutally exposed by the financial crisis. Very few firms, possibly none, had the information systems necessary to aggregate quickly information on exposures and risks.[2] This hindered effective consolidated risk management. For some of the world’s biggest banks that proved terminal, as unforeseen risks swamped undermanned risk systems.

These problems were even more acute across firms. Many banks lacked adequate information on the risk of their counterparties, much less their counterparties’ counterparties. The whole credit chain was immersed in fog. These information failures contributed importantly to failures in, and seizures of, many of the world’s core financial markets, including the interbank money and securitization markets.

Why is this? One would think that the financial industry would be in a great position to capitalize on the growth of digital information. After all, data has been the game changer for decades. But Wall Street, while proficient at handling market data and certain financial information, is not well prepared for the explosion in unstructured data.

The so-called “big data” problem of handling massive amounts of unstructured data is not just about implementing new technologies like Apache Hadoop. As discussed at the CFTC Technology Advisory Committee on Data Standardization held September 30, 2011, there is significant confusion in the industry regarding “semantics”.[3]

EDM Council - FIBO Semantics Initiative

The "semantic barrier" is a major issue in the financial industry, one that standards such as ISO 20022 were created to resolve.[4] For example, what some participants in the payments industry call an Ordering Customer, others refer to as a Payer or Payor, while still others refer to as a Payment Originator or Initiator. Context also plays a role here: the Payment Originator/Initiator is a Debtor/Payor in a credit transfer, while that same Payment Originator/Initiator is a Creditor/Payee in a direct debit.[5]
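As a toy illustration of what bridging such a semantic barrier can look like in code, the sketch below maps the context-dependent role names above onto a single canonical role. The mapping table and function are our own simplification for this post, not part of ISO 20022.

# Map context-dependent payment-industry terms to a canonical role.
# The pairings below are illustrative simplifications of the discussion above.
ROLE_SYNONYMS = {
    "ordering customer": "payment originator",
    "payer": "payment originator",
    "payor": "payment originator",
    "initiator": "payment originator",
    "payee": "payment recipient",
    "beneficiary": "payment recipient",
}

# Context matters: the same originator is a debtor in a credit transfer
# but a creditor in a direct debit.
ORIGINATOR_ROLE_BY_CONTEXT = {
    "credit transfer": "debtor",
    "direct debit": "creditor",
}

def canonical_role(term: str, context: str = "") -> str:
    role = ROLE_SYNONYMS.get(term.strip().lower(), term)
    if role == "payment originator" and context in ORIGINATOR_ROLE_BY_CONTEXT:
        role = f"payment originator ({ORIGINATOR_ROLE_BY_CONTEXT[context]})"
    return role

print(canonical_role("Ordering Customer", context="credit transfer"))
# -> payment originator (debtor)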

It should therefore be apparent that the intended use of systems relies on "human common sense" and understanding. Unfortunately, especially within the context of large organizations or across an industry, the boundaries of intended use are often not documented and exist only as "tribal knowledge". Even if well documented, informal language maintained in policies and procedures can result in unintentional misapplication, with consequences no less hazardous than intentional misapplication.


Overcoming semantic barriers...

If your avocation[6] involves organizing information and/or modeling data and systems, you invariably start asking epistemological[7] questions, even though such questions may not be immediately practical to the task at hand: What is knowledge? How is knowledge acquired? To what extent is it possible for a given concept, either physical or abstract, to be known? Can computers understand meaning from the information they process and synthesize knowledge? "Can machines think?"[8]

Such questions are the impetus for an ongoing debate about "the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling." Harnad (1990) framed this quandary as the Symbol Grounding Problem: "How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads?"[9] The meaning triangle[10] illustrates the underlying problem.

 
Figure 1 – Ogden and Richards (1923) meaning triangle 

Figure 1 is a model of how linguistic symbols stand for the objects they represent, which in turn provide an index to concepts in our mind. Note, too, that the triangle represents the perspective of only one person, whereas communication often takes place between two or more persons (or devices such as computers). Hence, in order for two people or devices to understand each other, the meaning that relates term, referent and concept must align.

Now consider that different words might refer to the same concept, or worse, the same word could have different meanings, as in our example of the term “orange”. Are we referring to a fruit or to a color? This area of study is known as semantics.[11]


Relating semantics to ontology

As the meaning triangle exemplifies, monikers—whether they be linguistic[12] or symbolic[13]—are imperfect indexes. They rely on people having the ability to derive denotative (ie, explicit) meaning, and/or connotative (ie, implicit) meaning from words/signs. If the encoder (ie, sender) and the decoder (ie, receiver) do not share both the denotative and connotative meaning of a word/sign, miscommunication can occur. In fact, at the connotative level, context determines meaning.

Analytic approaches to this problem fall under the domain of semiotics,[14] which for our purposes encompasses the study of words and signs as elements of communicative behavior. Consequently, we consider linguistics and semiosis[15] to come under the subject of semiotics.[16] Semiotics, in turn, is divided into three branches or subfields: (i) semantics; (ii) syntactics;[17] and (iii) pragmatics.[18]

Various disciplines are used to model concepts within this field of study. These disciplines include, but are not necessarily limited to, lexicons/synsets, taxonomies, formal logic, symbolic logic, schema related to protocols (ie, syntactics), schema related to diagrams (ie, semiosis), actor-network theory, and metadata [eg, structural (data about data containers), descriptive (data about data content)]. In combination, these various methods form the toolkit for ontology work.

Ontology involves the study of the nature of being, existence, reality, as well as the basic categories of being and their relations. It encompasses answering metaphysical[19] questions relating to quiddity, that is, the quality that makes a thing what it is—the essential nature of a thing.

Admittedly, there are divergent views amongst practitioners as to what constitutes ontology, as well as classification of semiotics and related methodologies. To be sure, keeping all these concepts straight in one’s mind is not without difficulty for those without formal training. Further, “ontology has become a prevalent buzzword in computer science. An unfortunate side-effect is that the term has become less meaningful, being used to describe everything from what used to be identified as taxonomies or semantic networks, all the way to formal theories in logic.”[20]

Figure 2 is a schematic diagram illustrating a hierarchical conceptualization of ontology, its relation to epistemology, metaphysics, and semiotics, as well as its relation to cognitive science. Figure 2 also shows how semiotics encompasses linguistics and semiosis.

 
Figure 2 – Conceptualization of epistemology, metaphysics, ontology, and semiotics


Knowledge representation and first order logic

Knowledge representation (KR) is an area of artificial intelligence research aimed at representing knowledge in symbols to facilitate systematic inferences from knowledge elements, thereby synthesizing new elements of knowledge. KR involves analysis of how to reason accurately and effectively, and how best to use a set of symbols to represent a set of facts within a knowledge domain.

A key parameter in choosing or creating a KR is its expressivity. The more expressive a KR, the easier and more compact it is to express a fact or element of knowledge within the semantics and syntax of that KR. However, more expressive languages are likely to require more complex logic and algorithms to construct equivalent inferences. A highly expressive KR is also less likely to be complete and consistent; whereas less expressive KRs may be both complete and consistent.

Recent developments in KR include the concept of the Semantic Web, and development of XML-based knowledge representation languages and standards, including Resource Description Framework (RDF), RDF Schema, Topic Maps, DARPA Agent Markup Language (DAML), Ontology Inference Layer (OIL), and Ontology Web Language (OWL).[21]
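For readers unfamiliar with these standards, the short sketch below shows what a few RDF assertions look like using the open-source rdflib Python library (rdflib 6+ assumed). The namespace, class names and the trade instance are invented purely for illustration.

from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/finance#")

g = Graph()
g.bind("ex", EX)

# Assert a tiny class hierarchy and one instance (all terms are illustrative).
g.add((EX.Swap, RDF.type, RDFS.Class))
g.add((EX.InterestRateSwap, RDFS.subClassOf, EX.Swap))
g.add((EX.trade123, RDF.type, EX.InterestRateSwap))
g.add((EX.trade123, EX.notionalAmount, Literal(10_000_000)))

# Serialize to Turtle, a human-readable RDF syntax.
print(g.serialize(format="turtle"))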

 
Figure 3 – Adapted from Pease (2011) [Figure 15] and Orbst (2012)

SUMO, an open-source formal ontology based on first-order logic,[22] resides on the higher end of the scale in terms of both formality and expressiveness. The upper-level ontology of SUMO consists of ~1120 terms, ~4500 axioms and ~795 rules, and has been extended with a mid-level ontology (MILO) as well as domain-specific ontologies. Written in the SUO-KIF language, SUMO is the only formal ontology that has been mapped to all of the WordNet lexicon.

Formal languages such as DAML, OIL, and OWL are geared towards classification. What makes SUMO distinct from other modeling approaches (eg, UML or frame-based languages) is its use of predicate logic. SUMO preserves the ability to structure taxonomic relationships and inheritance, but extends such techniques with an expressive set of terms, axioms and rules that can more accurately model geospatial and temporal concepts, both physical and abstract.

Nevertheless, KR modeling can suffer from the “garbage in, garbage out” syndrome. Developing domain ontologies with SUMO is no exception. That is why in a large ontology such as SUMO/MILO, validation is very important.[23]


Models of concepts are second derivatives

Returning to Ogden’s and Richards’ (1923) meaning triangle, the relations between term, referent and concept may be phrased more precisely in causal terms:
  • The matter (referent) evokes the writer's thought (concept). 
  • The writer refers the matter (referent) to the symbol (term). 
  • The symbol (term) evokes the reader's thought (concept). 
  • The reader refers the symbol (term) back to the matter (referent).

When the writer refers the matter to the symbol, the writer is effectively modeling the referent. The method used is informal language. However, without a formal semantic system in which to model concepts, the use of natural language as a representation of concepts suffers from the fact that informal languages have meaning only by virtue of human interpretation of words. Likewise, it is important not to mistake the term for the referent itself.

In calculus the second derivative of a function ƒ is the derivative of the derivative of ƒ. Likewise, a KR archetype or replica of a referent (ie, a physical or abstract thing) can be considered a second derivative, whereby the concept is the first derivative, and the model of the concept is the second derivative. What can add to the confusion is the term labeling the referent, versus the term labeling the model of the concept of the referent. One's inclination is to substitute the label for the referent.

Figure 4 – Adapted from Sowa (2000), “Ontology, Metadata, and Semiotics” 

Thus, it is important to recognize that a representation of a thing is at its most fundamental level still a surrogate, a substitute for the thing itself. It is a medium of expression. In fact, “the only model that is not wrong is reality and reality is not, by definition, a model.”[24] Still, a pragmatic method for addressing this concern derives from development and use of ontology.


Unstructured data and a way forward…

Over the past two decades much progress has been made on shared conceptualizations and the theory of semantics, as well as the application of these disciplines to advanced computer systems. Such progress has provided the means to derive meaningful information from the explosion of unstructured data (ie, "big data") overwhelming the banking industry, as well as the many other industries which suffer from the same issue.

The missing link underlying “the cause of many a failure to design a proper dialect... [is] the general lack of an upper ontology that could provide the basis for mid-level ontologies and other domain specific metadata dictionaries or lexicons.” The key then is use of an upper-level ontology that “gives those who use data modeling techniques a common footing to stand on before they undertake their tasks.”[25] SUMO, as an open source formal ontology (think Linux), is a promising technology for such purpose with an evolving set of tools and an emerging array of applications/uses solving real world problems.

As Duane Nickull, Senior Technology Evangelist at Adobe Systems explained, “[SUMO] provides a level setting for our existence and sets up the framework on which we can do much more meaningful work.”[26]


About SUMO:
The Suggested Upper Merged Ontology (SUMO) and its domain ontologies form the largest formal public ontology in existence today. They are being used for research and applications in search, linguistics and reasoning. SUMO is the only formal ontology that has been mapped to all of the WordNet lexicon. SUMO is written in the SUO-KIF language. Sigma Knowledge Engineering Environment (Sigma KEE) is an environment for creating, testing, modifying, and performing inference with ontologies developed in SUO-KIF (e.g., SUMO, MILO). SUMO is free and owned by the IEEE. The ontologies that extend SUMO are available under GNU General Public License. Adam Pease is the Technical Editor of SUMO.  

For more information: http://www.ontologyportal.org/index.html. Also see: http://www.ontologyportal.org/Pubs.html for list of research publications citing SUMO.

Users of SUO-KIF and Sigma KEE consent, by use of this code, to credit Articulate Software and Teknowledge in any writings, briefings, publications, presentations, or other representations of any software that incorporates, builds on, or uses this code. Please cite the following article in any publication with references:

Pease, A., (2003). The Sigma Ontology Development Environment. In Working Notes of the IJCAI-2003 Workshop on Ontology and Distributed Systems, August 9, 2003, Acapulco, Mexico.



Footnotes:
[1] Speech by Mr Andrew G Haldane, Executive Director, Financial Stability, Bank of England, at the Securities Industry and Financial Markets Association (SIFMA) “Building a Global Legal Entity Identifier Framework” Symposium, New York, 14 March 2012.

[2] Counterparty Risk Management Policy Group (2008).

[3] CFTC Technology Advisory Subcommittee on Data Standardization Meeting to Publicly Present Interim Findings on: (1) Universal Product and Legal Entity Identifiers; (2) Standardization of Machine-Readable Legal Contracts; (3) Semantics; and (4) Data Storage and Retrieval. Meeting notes by Association of Institutional Investors. Source: http://association.institutionalinvestors.org/ Document: http://bit.ly/QJIP4i

[4] See: http://www.iso20022.org/

[5] SWIFT Standards Team, and Society for Worldwide Interbank Financial Telecommunication (2010). ISO 20022 for Dummies. Chichester, West Sussex, England: Wiley. http://site.ebrary.com/id/10418993.

[6] The term "avocation" has three seemingly conflicting definitions: 1. something a person does in addition to a principal occupation; 2. a person's regular occupation, calling, or vocation; 3. Archaic: diversion or distraction. Note: we purposefully selected this term because it relates to pragmatics; specifically, the "semantic barrier".

[7] e•pis•te•mol•o•gy, n., 1. a branch of philosophy that investigates the origin, nature, methods, and limits of human knowledge; 2. the theory of knowledge, esp the critical study of its validity, methods, and scope.

[8] Turing, A.M. (1950). “Computing machinery and intelligence” Mind, 59, 433-460.

[9] Harnad, Stevan (1990) “The Symbol Grounding Problem” Physica, D 42:1-3 pp. 335-346.

[10] Ogden, C., and Richards, I. (1923). The meaning of meaning. A study of the influence of language upon thought and of the science of symbolism. Supplementary essays by Malinowski and Crookshank. New York: Harcourt.

[11] se•man•tics, n., 1. linguistics the branch of linguistics that deals with the study of meaning, changes in meaning, and the principles that govern the relationship between sentences or words and their meanings; 2. significs the study of the relationships between signs and symbols and what they represent; 3. logic a. the study of interpretations of a formal theory; b. the study of the relationship between the structure of a theory and its subject matter; c. the principles that determine the truth or falsehood of sentences within the theory. 

[12] lin•guis•tics, n., the science of language, including phonetics, phonology, morphology, syntax, semantics, pragmatics, and historical linguistics.

[13] Use of term “symbolic” refers to semiosis and the term “sign,” which is something that can be interpreted as having a meaning for something other than itself, and therefore able to communicate information to the person or device which is decoding the sign. Signs can work through any of the senses: visual, auditory, tactile, olfactory or taste. Examples include natural language, mathematical symbols, signage that directs traffic, and non-verbal interaction such as sign language. Note: we categorized linguistics to be a subclass of semiotics.

[14] se•mi•ot•ics, n., the study of signs and sign processes (semiosis), indication, designation, likeness, analogy, metaphor, symbolism, signification, and communication. Semiotics is closely related to the field of linguistics, which, for its part, studies the structure and meaning of language more specifically.

[15] The term “semiosis” was coined by Charles Sanders Peirce (1839–1914) in his theory of sign relations to describe a process that interprets signs as referring to their objects. Semiosis is any form of activity, conduct, or process that involves signs, including the production of meaning. [Related concepts: umwelt, semiosphere]

[16] One school of thought argues that language is the semiotic prototype and its study illuminates principles that can be applied to other sign systems. The opposing school argues that there is a meta system, and that language is simply one of many codes (ie, signs) for communicating meaning.

[17] syn•tac•tic, n., 1. the branch of semiotics that deals with the formal properties of symbol systems. 2. logic, linguistics the grammatical structure of an expression or the rules of well-formedness of a formal system.

[18] prag•mat•ics, n. 1. logic, philosophy the branch of semiotics dealing with the causal and other relations between words, expressions, or symbols and their users. 2. linguistics the analysis of language in terms of the situational context within which utterances are made, including the knowledge and beliefs of the speaker and the relation between speaker and listener. [Note: pragmatics is closely related to the study of semiosis.]

[19] met•a•phys•ics, n., 1. the branch of philosophy that treats of first principles, includes ontology and cosmology, and is intimately connected with epistemology; 2. philosophy, especially in its more abstruse branches; 3. the underlying theoretical principles of a subject or field of inquiry.

[20] Pease, Adam. (2011). Ontology: A Practical Guide. Angwin: Articulate Software Press.

[21] A review of the listed KR approaches is outside the scope of this discussion. See Pease (2011), Ontology: A Practical Guide, ‘Chapter 2: Knowledge Representation’ for a more in-depth discussion/comparison.

[22] While OWL is based on description logic, its primary construct is taxonomy (ie, frame language).

[23] See Pease (2011), Ontology: A Practical Guide, pp. 89-91 for further discussion on validation.

[24] Haldane, Andrew G. (2009). “Why banks failed the stress test” Speech, Financial Stability, Bank of England, at the Marcus-Evans Conference on Stress-Testing, London, 9-10 February 2009.

[25] Duane Nickull, Senior Technology Evangelist, Adobe Systems; foreword to “Ontology” by Adam Pease.

[26] Multiple contributors (2009). Introducing Semantic Technologies and the Vision of the Semantic Web Frontier Journal, Volume 6, Number 7 July 2009. See: http://www.hwswworld.com/pdfs/frontier66.pdf 

Thursday, November 1, 2012

Gold Loans and Reversing a Model’s Line of Causation

The 1970s was a crucial turning point in the history of 20th century gold markets. The costs of the Vietnam War and increased domestic spending had the effect of accelerating inflation. Meanwhile, US gold stock declined to $10 billion versus outstanding foreign dollar holdings estimated at about $80 billion.[1] Prior to that, the London Gold Pool, made up of seven European central banks and the US Federal Reserve, a group which cooperated in maintaining the Bretton Woods System, found itself increasingly unable to balance the outflow of gold reserves and defend the fixed gold price of US$35 per ounce.[2]

On August 15, 1971, President Nixon, a self-proclaimed Republican "conservative,"[3] imposed a 90-day wage and price control program and various other expansionary policies in what became known as the "Nixon Shock".[4] More importantly, Nixon closed the gold window to prevent foreign governments that had been holding dollar-denominated financial assets from demanding gold in exchange for their dollars. By March 1973, all of the major world currencies were floating, and in November 1975, the G-7 (i.e., Group of Seven) formed to hammer out the final details on a framework for a new monetary system. That agreement, which was finalized in January 1976, called for an end to the role of gold and the establishment of SDRs as the principal reserve asset, and legitimized the de facto system of fiat currencies and floating exchange rates.

The reason for retelling this story is that these events, along with the collapse in gold prices after their January 21, 1980 peak of $850, led directly to the formation of the gold leasing market during the mid-1980s. Gold loans evolved as a means for central banks to earn a return on their bullion inventories, covering the cost of warehousing bullion,[5][6] by leasing gold in exchange for a lease rate. This rate is derived from the difference between LIBOR and the Gold Forward Offered (GOFO) rate.[7] Alternatively, a central bank could swap gold in exchange for currency such as US dollars.
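As a minimal numerical sketch of that relationship (the rates below are made up), the implied lease rate is simply LIBOR minus GOFO for the matching tenor:

# Hypothetical 3-month rates, annualized (illustrative only).
libor_3m = 0.0225   # 2.25% USD LIBOR
gofo_3m = 0.0175    # 1.75% Gold Forward Offered rate

# Implied gold lease rate: the spread between LIBOR and GOFO.
lease_rate_3m = libor_3m - gofo_3m
print(f"Implied 3-month gold lease rate: {lease_rate_3m:.2%}")  # 0.50%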

On the Lease Rate and Convenience Yield of Gold Futures

A leasing transaction involves a central bank transferring ownership to a leasing institution (i.e., borrower), who could then sell the gold on the spot market and invest the proceeds. At a later date, the borrower would buy back the gold and return it to the central bank while paying the lease rate. Because gold could be leased at a relatively low rate from the central bank and then sold quickly on the spot market, participants in this market included gold producers who thereby gained cash to finance gold production at a comparatively low rate of interest, while simultaneously hedging against falling gold prices.[8]

The market for gold loans developed quickly after the October 1987 stock market crash left many mining companies with reduced access to capital. Prior to 1990, GOFO rates for gold normally were below 2 percent on an annualized basis and never exceeded 3 percent, providing an inexpensive source of finance for mining companies.[9] The Financial Times reported that some 30 central banks were estimated to have engaged in gold loans around this time.[10] Then in 1990 Drexel Burnham Lambert collapsed with large outstanding gold liabilities to many central banks, resulting in increased wariness and reduced supply of gold loans from central banks.[11] As a result, lease rates rose reflecting an increased tightness in the market after the loss of central bank suppliers, as well as a substantial risk premium over the implicit cost of providing such loans.

Nevertheless, the market for gold loans grew throughout the 1990s, and an informal global interbank system developed permitting dealers to borrow gold on a short-term basis in order to fulfill delivery requirements. When bullion subsequently dropped below $300 an ounce in late 1997, and drifted in that range through 2002 in what is now referred to as the “Brown Bottom,”[12] the gold carry trade came to dominate the derivatives markets. Gold’s steady appreciation since 2002, however, has rendered this trade obsolete. As a result, there has been a wholesale transformation in the gold market since the millennium began.

In a research paper published by the Swiss Finance Institute (SFI) titled On the Lease Rate, the Convenience Yield and Speculative Effects in the Gold Futures Market, the authors examine this aspect of the gold market in detail. They note that, "…since late 2001, the profitability of the carry trade has diminished. Rising gold prices have increased risk and diminished the trade’s profitability as a result of increasing repayment costs. Consequently, the prevalence of the gold carry trade is predicated on two factors: the rate at which the central bank is willing to swap or lease gold, and whether or not the gold price is increasing." Further, the authors Barone-Adesi, Geman and Theal (2009) observe that the COMEX "is witnessing historically low derived lease rates, decreasing hedging activity and steadily rising non-commercial open interest."

The reason is that the gold carry trade is risky along two dimensions. First, if the borrower invests in long-term bonds, rising interest rates could put downward pressure on bond prices, exposing the leasing institution to principal risk. Second, since the borrower is effectively short gold, if the loan is called by the central bank and gold has risen in value, the borrower may have to purchase gold at a higher price in the spot market. Hence, there always exists the potential of driving gold prices even higher due to short covering. This unwinding of the carry trade, as with other similar trades (e.g., the yen carry trade), can result in volatile markets.
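To make those two risks concrete, the toy calculation below compares the carry earned on the invested sale proceeds with the cost of repurchasing the gold when the lease is unwound. All figures are hypothetical, and the model ignores margin, transaction costs and reinvestment effects.

ounces = 10_000
spot_start = 1_200.0      # sale price per ounce when the borrowed gold is sold
lease_rate = 0.005        # annualized lease rate paid to the central bank (assumed)
invest_rate = 0.03        # annualized return earned on the sale proceeds (assumed)
term_years = 1.0

def carry_trade_pnl(spot_end: float) -> float:
    """Simplified one-period P&L of the gold carry trade."""
    proceeds = ounces * spot_start
    carry = proceeds * (invest_rate - lease_rate) * term_years
    repurchase_loss = ounces * (spot_end - spot_start)   # loss if gold rallies
    return carry - repurchase_loss

print(carry_trade_pnl(spot_end=1_180.0))  # falling gold: carry plus a price gain
print(carry_trade_pnl(spot_end=1_300.0))  # rising gold: short-covering loss swamps the carry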

The question, then, is to what extent speculation is having a "tangible effect" on gold valuations, and "if so, by what mechanism does speculation influence prices?" The SFI paper points to other academics, such as Kocagil (1997), who defined "speculative intensity" as the "spread between the futures and expected spot price," and concluded that "speculation increases spot price volatility and thus has a destabilizing effect on price." Another researcher, Abken (1980), based his analysis on the intuition that the only return that gold yields is based on the anticipated appreciation of gold above "any marginal costs associated with the storage of gold." Abken argues that, "during times of uncertainty, excess demand for gold as a store of value [drives] up the spot price causing stored gold to be brought to market."

The authors of the SFI paper, on the other hand, base part of their methodology on the work of Houthakker (1957), one of the first researchers to use trader commitment data to study speculation. To understand how speculative agents can affect the gold futures market, Barone-Adesi et al. (2009) examine the open interest data from the CFTC Commitment of Traders (CoT) report, thereby identifying commercial open interest with hedging activity, and conversely, non-commercial positions with speculative activity. The authors also study the relationship between gold leasing and the level of COMEX discretionary inventory.

Not surprisingly, Barone-Adesi et al. (2009) arrive at some obvious conclusions: First, they note an ever-increasing percentage of non-commercial open interest reflects increased speculation in the gold market. Second, “the lease rate and the speculative pressure appear to work in opposition to one another; the former acts to decrease short-term bullion inventories via lease repayments, while the latter result suggests speculators dominate leasing activity in the long term… Finally, the presence of speculation in gold futures contracts can be associated with increased futures contract returns and that this effect increases with increased futures contract maturity.” What these observations suggest in their entirety is that “speculation plays a significant role in the COMEX gold futures market” as opposed to hedging activities.

Uh, okay… but isn't this a foregone conclusion? Granted, On the Lease Rate, the Convenience Yield and Speculative Effects in the Gold Futures Market derives its determinations from some interesting theoretical ideas about the relationship between gold loans, bullion inventories, convenience yield and speculation; but in the final analysis this paper raises the specter of Muth's (1961) Rational Expectations and the Theory of Price Movements: "In order to explain fairly simply how expectations are formed, we advance the hypothesis that they are essentially the same as the predictions of the relevant economic theory."

In other words, models unfortunately have the bad habit of assuming a predetermined conclusion around which expectations are formed, which in effect reverses the model's line of causation. Our conclusion: research bias, the process whereby the scientists performing the research influence the results in order to portray a certain outcome, seems to be at work here, even though we happen to agree with Barone-Adesi, Geman and Theal's conclusions.

Footnotes:
[1] Spero, Joan Edelman, and Hart, Jeffrey A. (2010). The Politics of International Economic Relations. 7th ed. (originally published 1977). Boston, MA: Wadsworth Cengage Learning.

[2] Bordo, Michael D., and Barry J. Eichengreen (1993). A Retrospective on the Bretton Woods System: Lessons for International Monetary Reform. University of Chicago Press. pp. 461–494 “Chapter 9, Collapse of the Bretton Woods Fixed Rate Exchange System” by Peter M. Garber.

[3] Nixon tape conversation No. 607-11.

[4] “The Economy: Changing the World's Money” Time Magazine, Oct. 4, 1971 [First reference by Time of “Nixon Shock”]; http://www.time.com/time/magazine/article/0,9171,905418,00.html

[5] “Bullish on Bullion” by Peter Madigan, Risk Magazine, Feb. 1, 2008, Incisive Media Ltd.

[6] According to O’Callaghan, Gary (1991), two key disadvantages in holding gold as opposed to a financial instrument are storage costs and the fact that holding gold does not bear interest.

[7] Barone-Adesi, Giovanni, Geman, Hélyette and Theal, John (2009). “On the Lease Rate, the Convenience Yield and Speculative Effects in the Gold Futures Market” (March 12, 2009). Swiss Finance Institute Research Paper No. 09-07.

[8] O’Callaghan, Gary (1991). "The Structure and Operation of the World Gold Market" International Monetary Fund, IMF Working Paper WP/91/120, Master Files Room C-525, p 33.

[9] Ibid. pp 33-34.

[10] Gooding, Kenneth, “Gold Lending Rate at Record Level,” Financial Times (London), Dec. 4, 1990, p 34.

[11] “Fool’s Gold,” The Economist, Mar. 17, 1990, p 79.

[12] Term used to describe the period between 1999 and 2002, named from the decision of Gordon Brown, then the UK's Chancellor of the Exchequer to sell half of the UK's gold reserves in a series of auctions.

References:
Barone-Adesi, Giovanni, Geman, Hélyette and Theal, John (2009). “On the Lease Rate, the Convenience Yield and Speculative Effects in the Gold Futures Market” (March 12, 2009). Swiss Finance Institute Research Paper No. 09-07.

Bordo, Michael D., and Barry J. Eichengreen (1993). A Retrospective on the Bretton Woods System: Lessons for International Monetary Reform. University of Chicago Press. pp. 461–494 “Chapter 9, Collapse of the Bretton Woods Fixed Rate Exchange System” by Peter M. Garber.

O’Callaghan, Gary (1991). “The Structure and Operation of the World Gold Market” International Monetary Fund, IMF Working Paper WP/91/120, Master Files Room C-525

Spero, Joan Edelman, and Hart, Jeffrey A. (2010). The Politics of International Economic Relations. 7th ed. (originally published 1977). Boston, MA: Wadsworth Cengage Learning.