- Radically Inept
- Inspector Lohmann
- The Blogging of the President: 2004
- North Georgia Dogma
- The Corpus Callosum
- Drunken Monkey Style Blogging
- Fafblog! the whole worlds only source for Fafblog.
- Intel Dump
- Orcinus Fair and Balanced
- Brad DeLong's Semi-Daily Journal a Weblog
- Marginal Revolution
- How Appealing
- Chris C Mooney
- Kevin Drum
- Cyborg Democracy
- Iraq Now
- Talking Points Memo
- Roger Ailes
- The Panda's Thumb
- WorldChanging: Another World Is Here
- The Truth Laid Bear
- Fables of the reconstruction
- Oliver Willis: Like Kryptonite To Stupid
- Rogue Analyst (My other blog)
- CenterPoint - A Centrist Weblog
- Shock and Awe
- Whiskey Bar
- Farmers and Consumers Market Bulletin
- Open Government Information Awareness
- Agnosticism / Atheism - Skeptical Inquiry, Freethought, & Religious Philosophy
- Defense and the National Interest
- Google News
- TCS: Tech Central Station - Where Free Markets Meet Technology
- ajeeb, News
- Corp Watch
- Center for Strategic & Budgetary Assessments
- GHOST TOWN - Chernobyl Pictures - Kidofspeed - Elena
- Cooperative Extension Service (GA)- Publications
- MEGA START PAGE
- The Vaults of Erowid
- Eyeballing Series
My other blog, usually updated daily:
As time permits:
Infrequent, but worthy posters:
Excellent sources of info:
Fun and off the beaten path:
This site is concerned with exploring the concept of an information economy: to look at 'information' across disciplines - how it is measured, how value is assigned, how credibility is established, and more - including but not limited to economics, memetics, ecology, physics, public sector vs. private sector, philosophy, cognition theories, risk communication, security, and secrecy vs. transparency. And, if possible, in doing so to create a common, multi-disciplinary scaffolding for future policy development.
Saturday, November 13, 2004
I've posted this here as a reference to a post at Radically Inept, More points I'm pondering, and a little admin.
SUBJECT: Assessment of the National Highway Traffic Safety Administration’s Rating
System for Rollover Resistance, Chapter 3, “Statistics and Data Analysis”
DATE: March 17, 2002
The recently released Assessment of the National Highway Traffic Safety Administration’s Rating System for Rollover Resistance, Chapter 3, “Statistics and Data Analysis,” largely supports the findings of the National Highway Traffic Safety Administration (NHTSA) reported in the Federal Register in 2000 and 2001. “The first point to note is that an increase in the SSF reduces the probability of rollover” (Ibid. 3-15). However, there are a few notable caveats. This assessment’s findings and recommendations include: 1) criticism of the statistical methodology employed (Exponential Model) in the original findings (Federal Register, 2000); 2) support for the methodology (Logit Model) used in the second reporting (Federal Register, 2001); 3) suggestions for directions and methodologies (primarily the non-parametric binary-response model) for further analysis; and 4) criticism of the current practice of dividing the risk values across only five striations (Star System), with relevant recommendations for expanding and refining the relationship between rollover risk and Static Stability Factor (SSF). There are also two points in the body of the Assessment that were not included in the summary of findings and recommendations, which the reviewer believes are pertinent to future policy considerations by NHTSA.
The first two points, the criticism of the earlier Exponential Model as a viable statistical methodology and the validation by the assessment committee of the statistical merits of the second methodology, the Logit Model, are largely moot for policy purposes. The Exponential Model’s validity is sharply criticized for its failure to take advantage of the volume of data available to provide high confidence in its findings. However, both the criticized model and the validated model come to the same basic conclusion: “The problem…arises of how to predict the rollover probability for these make/model groups. The rollover curve provides a solution to the prediction problem…Given the SSF of the new make/model, the estimated rollover curve can be used to predict the rollover probability” (Ibid. 3-14).
On the third point, suggestions on directions and methodologies for future analysis, the Assessment concludes that the Logit Model is supported by the limited non-parametric binary-response model (Non-parametric Model) analysis conducted in the course of this assessment: “The estimated rollover curve based on the Logit Model appears to be a reasonable approximation to the Nonparametric[sic]-based rollover curve using limited data, suggesting the Logit Model is a sensible starting point for…a rollover rating system” (Ibid. 3-24). The Assessment states, “A more extensive analysis using a larger dataset will be required to obtain a rollover curve that provides information at the national level” (Ibid. 3-24). Figure 3-9, pg. 3-24 of the Assessment, graphically depicts the relative agreement between the Logit Model and the Non-parametric Model.
The final point is a criticism of the five star rating system currently employed by NHTSA. This is broken into two parts: 1) the lack of accuracy resulting from…an overly coarse discrete approximation, and 2) the lack of resolution resulting from the choice of breakpoints between star rating categories. Basically, the Assessment states that the current five star rating system does not allow for fine distinctions in performance within vehicle categories, notably the SUVs, and recommends further gradations within the rating system for more refined risk communication.
A point that failed to make its way into the findings and recommendations portion of the assessment was the value of historical make/model rollover data; the Assessment recognizes the value of historical data to provide simple unbiased information: “If the objective is to estimate the true probability of rollover for a given make/model group, then…the best estimate of the rollover probability is the sample proportion of rollovers calculated from the crash population mean…for an old make/model group, there is no reason to estimate the rollover curve” (Ibid. 3-24). It goes on to say, “Given the SSF of the new make/model, the estimated rollover curve can be used to predict rollover probability.”
The reviewer’s conclusions on the Assessment are mixed. On the first two points, the improved methodology (Logit Model), more accurate and statistically defensible, still supports the validity of the rollover curve and the SSF as useful tools. The third point is more contentious. The value of further spending to improve accuracy using the Non-parametric Model seems to contradict the finding that historical data negates the need for estimating the rollover curve; the value of additional funding for further statistical analysis is not adequately supported. The fourth point, that the current five star rating system fails to provide information of sufficient accuracy and refinement for the benefit of consumer decision making, seems well supported by the Assessment. Discernment of risk levels between models of a specific type (i.e., SUVs) through the five star rating method is not refined enough, and increased gradations would contribute to the precision of the data with which consumers make their choices.
However, based on the value of 'real' data to provide “simple unbiased” information and give the “true probability” of rollover with no need to estimate, NHTSA should use such data as the basis of its rollover rating system, with increased gradations.
Sunday, July 18, 2004
I've updated links, and plan on doing maintenance, as well as some posting here, in the near future.
Wednesday, May 05, 2004
I realize I have neglected this blog, and can only say that speed has to give way not only to the development of the argument, but to deciding upon a (hopefully) effective approach, which has its own development time. Also, I am going to update the interface here to at least the level of Radically Inept in the near future, a sort of penance work for the neglect. Rest assured, it is more based on the Germanic concept of the 'fates' than any Judeo-Christian structure. Combined with a healthy dose of Buddhism, Taoism, Gnosticism, chaos theory, quantum theory, Bentham, Kant, Hegel, Jung, Wheeler, Pirsig, Popper, Kuhn, Bill Hicks, Cheech and Chong, Humphrey Bogart, Lauren Bacall, Sidney Greenstreet, Robert Mitchum, Heinlein, Adams, London, Twain (especially), etc. So, now for some development of the idea.
But, I have joined the quest for the Grand Unification Theory (GUT). I do not expect to provide answers, but I do hope to inspire someone else to provide answers to the questions. I will provide what I hope proves to be a rational position, and hope that others contribute either in refutation or in support. Well, enough preview; let's get on with some meat.
Are photons an irreducible factor in all information exchanges?
That is ultimately what I hope to be the basis of a common point of reference in the dialog of information transfers. I mean this in the broadest of senses, and am trying to derive the basics. I am working 'ass-backwards', I know, but I'm hoping to find some value in this approach.
Let me start with this picture. Visualize a license plate. At every point of its existence, it is reflecting a certain value of information in photons, regardless of the degree to which these photons affect the macro level. How do you capture the economic value of the transfer of information through the course of the 'life' of the license plate? Is it all the photons reflected, or only the photons that were absorbed and acted upon, and then what metric is used to measure the impact of the effect? Dollars spent? Acres farmed? Miles driven? Gross receipts? Is there a common metric in these transactions that can be reduced to photon exchanges? Can this be the first rung of a matrix, which admittedly and over-ambitiously strives to provide a bridgework across disciplines?
That is part of the concept: it seems the influence can be measured, possibly by some stochastic method. Can we look to the math, and see the energy of the photon in all information exchanges? Can it be used as a common factor in measuring information exchanges? Can something like mega-photon output, or photon-micro influence, become valuable in the discussion of information?
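To make the quantitative side of this concrete, here is a minimal back-of-the-envelope sketch in Python. The physical constants are standard; the reflected power and wavelength are purely assumed, illustrative figures, not measurements of any actual license plate.

```python
# Rough sketch: photons per second implied by a given reflected
# radiant power, using E = h*c / wavelength for the energy per photon.
# The 0.1 W figure below is an assumed, illustrative value.

H = 6.626e-34  # Planck's constant, joule-seconds
C = 3.0e8      # speed of light, meters per second

def photons_per_second(reflected_watts, wavelength_m=550e-9):
    """Divide total reflected power by the energy of one photon
    (here a mid-visible 550 nm photon) to get a photon count."""
    energy_per_photon = H * C / wavelength_m  # joules per photon
    return reflected_watts / energy_per_photon

# Suppose a plate reflects ~0.1 W of visible light on a bright day:
n = photons_per_second(0.1)  # on the order of 10**17 photons per second
```

Even a dim reflector implies a staggering photon count, which suggests one practical obstacle to using raw photon totals as an economic unit: the numbers involved are astronomically large for everyday objects.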
So, the first question for you is: in your particular field, can information ultimately be factored down to some multiple of photon transfer? If not, and especially if not, why not?
I realized in speaking with Donna, from Merrill, and 'The Pacifier', that there is a specific set of categories in information measurement crucial to the discussion. The first is the distinction between the quantity of the information vs. the quality of the information. Using a metric like photons might prove valuable in measuring the quantity of the information involved in any exchange, and it is this that I am going to explore a little further. I deal a little with measuring quality in the post below, but I plan to do more on that subject later.
For the second category useful in measuring information processes, I am borrowing directly from Newtonian physics. All information is either kinetic information or potential information, and objects may possess both simultaneously. A quick example of what I mean is in order here, so consider a letter in a sealed envelope. As long as the letter stays in the envelope, it has information potential, but when someone opens the envelope and reads the contents, that information is kinetic at that stage. It does not, however, lose any appreciable potential information value unless it is destroyed. Someone else could read the letter and receive the same quantitative amount of information; the quantity of photons exchanged in the first and all later readings of the letter is the same. For now, let's call the letter, and other objects from which we derive information, 'messengers'.
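The sealed-letter example can be sketched as a toy data structure; this is a hypothetical illustration only, not part of any formal model in the post. Reading a messenger makes its content kinetic for that reader, but its fixed quantitative content is never depleted:

```python
from dataclasses import dataclass

@dataclass
class Messenger:
    """Toy model of the sealed-letter example: a messenger carries a
    fixed quantity of information (its photon exchange on reading)."""
    photon_quantity: float  # quantitative content, fixed at creation
    times_read: int = 0

    def read(self):
        """Each reading makes the information kinetic for that reader,
        without depleting the messenger's potential information."""
        self.times_read += 1
        return self.photon_quantity

letter = Messenger(photon_quantity=1.0e15)  # assumed photon figure
first = letter.read()   # first reader
second = letter.read()  # a second reader receives the same quantity
```

The point of the design is that `read()` returns the same value every time: quantity is a property of the messenger, while quality (comprehension, currency) would live with the reader, outside this model.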
The quality of the information exchanged could, on the other hand, vary widely depending on the reader and other factors, such as time (the currency of the information), events that have taken place external to the letter, and a person's ability to read. In fact, the volume of photons exchanged, the quantity, is independent of a reader's ability to comprehend, which is a qualitative measurement.
Consider the informational value of a license plate (our first choice of messenger) from its production to its ultimate demise. From the time it is completed by the prisoners, the quantity of the information 'emitted' by the plate in terms of photon exchanges is established. The quality of the information may change, but the photon value will remain, at least roughly, the same. Now, the prisoners put it in a box to ship to the department of motor vehicles, or some other bureaucratic system for dispersal. While in the box, the license plates have only potential information. Their photons are confined to the volume of the box, and no one is there to receive the information. This last point, about the role of a receiver in an information exchange, remains an intriguing vector throughout this concept.
When the owner finally receives his new car tag and mounts it on the car, it may also be the last time anyone reads that information. But the license plate will continue to emit the same level of information constantly, regardless of the existence of a receiver. And it should be possible to derive an average daily photon output for the license plate. Granted, when the light is dimmer, it will put out less than on a bright day, but a daily average should be definable. Here it bears repeating that while the license plate is a 'constant' emitter, its information potential does not degrade.
Let me quickly offer a somewhat counter example, which might help. Take as a messenger the thermal-paper receipt a patron now receives at, say, the gas pump at his local filling station. Here the information degrades quickly, so that when it is first printed, it emits far more photons than it will three months hence, and over time it will cease to contain any of the original information. Instead, it will become a smudged piece of paper. It still emits photons based on its area, but the complexity (information) is gone. So it loses its kinetic information and its potential information, whereas our license plate appears to stay constant.
So, this leads to a few more questions which I am pondering. Is there then a way to define the license plate using the costs of production, shipping, administrative record keeping, etc.? I'm not sure how to factor in the money/taxes paid by the car owner, i.e., whether it goes in the plus or minus column for the purposes of our accounting, but I'm sure it's easily resolvable. Anyway, it should be possible to find a dollar-to-photon rate, such that the license plate's value could be expressed in those terms, perhaps something akin to mega-photons per penny, and this ratio could be applied to virtually all economic information exchanges. At this point, I am not arguing it has a utilitarian value, only that it is how I'm constructing this mental model for further exploration.
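The dollar-to-photon rate suggested above reduces to simple division; the sketch below uses wholly hypothetical cost and photon figures just to show the shape of the calculation:

```python
def photons_per_dollar(total_photons_emitted, total_cost_dollars):
    """Toy 'photon-to-currency' ratio: lifetime photon output of a
    messenger divided by its total economic cost. Both inputs are
    assumed, illustrative quantities."""
    return total_photons_emitted / total_cost_dollars

# Assume a plate emits ~1e22 photons over its service life and costs
# $5.00 to produce, ship, and administer (hypothetical figures):
ratio = photons_per_dollar(1e22, 5.00)  # photons per dollar
```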
Now, a brief summary:
Premise: Information can be measured in quantity and in quality. These are separate aspects of an information exchange.
Premise: Photons are an irreducible factor in measuring the quantity of information exchanged in all exchanges. This quantitative value of information exchanges can be measured in photons.
Premise: The complexity of the photon pattern emitted is information. (I just added this, and I'm going to think further on whether this makes sense in this model, but I like it intuitively, so...)
Premise: Messengers have a kinetic informational value, a potential informational value, or both simultaneously.
Premise: Messengers may or may not degrade appreciably over time in the volume of photons emitted.
Premise: It should be feasible to develop a factor combining the 'economic' cost of information with the amount of information generated in photons, such that you have a photon-to-currency ratio as a useful expression of the value of an information exchange. And this expression will be useful in measuring information exchanges across fields, i.e., IT, marketing, risk communication, policy, etc.
Well, that's my thinking on the quantitative aspect of information exchanges so far.
The post below deals a little with the measurement of the qualitative aspect of information exchanges. I copied and pasted it from a .doc, and I apologize for all the crap, like boxes replacing quotation marks; if I find the time, I'll try to go in and clean it up. Oh, Wu is a concept I'm working on, and I chose Wu as a term because I had not heard it used in any of my cross readings. Wu should be thought of as the beginning of the original Star Trek series, where the music starts with 'woo wooooo, woo woo woo woo', or whatever. And the symbol for the term, which I don't know how to import here yet, is a cursive 'w' with a bar across it. It was a symbol that I had also not run across in my readings.
Well, your comments and criticisms are welcome, even urged. No sense in doing a whole lot of thinking in a bad direction if one of you sees a huge fallacy that I've overlooked. Save me the time, please.
Monday, May 03, 2004
There have been many attempts to analyze the data from a large number of surveys conducted in twenty-one of the wealthy democratic countries since the early 1960s, and to compare those results with surveys on the public's trust in the government and other large institutions conducted in the US going back to 1958. Researchers have concluded that the beginning of the trend toward ever lower levels of trust in institutions in the US seems to coincide with the Vietnam War and the Watergate scandal, but this correlation would not explain the same decline in trust in the other Western democracies, which began about the same time. This has led researchers to look at a host of other sources for an explanation of growing distrust. Among the hypotheses put forward are that the decline in trust parallels economic trends, or that it is due in part to the growth of negative media coverage and the increasing trend of the public using major media as its primary source of political information. Still others have looked to a generational shift from the materialism prevalent in the pre-WWII generations to a post-materialist outlook found increasingly among newer generations as a contributing factor. The increasing polarization of the American political parties to extremes not reflective of the populace in general, a growing trend on the part of politicians to criticize the very institutions for whose offices they are campaigning, and the reliance on negative campaign advertising, including disparaging the government's performance in its role of solving social and economic problems, have all been considered in the past forty years (Nye et al. 1997). In 1964, 75% of Americans said they trusted the federal government, while the comparable 1997 figure was only 25%. Simultaneously, trust in universities as institutions fell from 61% to 30%, major companies fell from 55% to 21%, and medical institutions saw a downward shift from 73% to 29%.
Trust in the media, which was not high to begin with in comparison to the other institutions measured, fell from 29% to 14% (Nye 1997). While the survey data cited above reflects a general downward trend in trust in institutions, the actual values at any given time are subject to sudden changes reflecting geopolitical events as they occur. A recent example was the event of '9/11' (2001):
A May 2002 survey commissioned by the Brookings Institution, however, contends the air of trust quickly faded. The survey, conducted by Princeton Survey Research Associates, found that the number saying they trusted the government to do what is right at least most of the time rose from 29 percent in July 2001 to 57 percent in October 2001, but dropped down to 40 percent in May 2002. The number who said the government could be trusted only "some of the time" rose from 39 percent in October to 53 percent in May. By September 2002, the CBS/New York Times survey found that only 37 percent said they trusted the government to do the right thing "always" or "most of the time" (http://www.publicagenda.org/specials/terrorism/terror_pubopinion10.htm 4/22/03).
The data above seems to validate the often cited concept that the public will rally around the country's leadership in times of crisis, but these types of rallies are often extremely short lived. The first President Bush saw his approval ratings jump during the Gulf War, only to decline swiftly in its aftermath. I think it falls far short of the argument I thought I could make, and I think I can make a better one now, but I think the overall points brought out in the piece would still be the basis of its own conception of a particular metric in information exchanges. I think of it as a potentially utilitarian equation, and the...well, admittedly it needs a lot of work, so give me a hand here: "Does this work as a simple framework in your particular field? If so, how would you use it? If not, where do you believe the fallacies are?" The whole thing is a work in progress, so all input is welcome.
While the growth of distrust in institutions appears to be statistically proven, no single cause, or aggregation of the causes looked at by various scholars, appears to adequately explain the phenomenon. The downward trend in trust continues; the World Economic Forum released Gallup International's 2002 Voice of the People survey of 36,000 citizens in 47 countries, in which 66% do not believe their country is "governed by the will of the people." Among the findings in the leadership categories included in the 2002 Voice of the People survey, leaders of non-governmental organizations (NGOs) are trusted by a majority of the respondents. Of particular note for the purposes of this paper is that, of the five characteristics important for trust in leaders, 49% chose honesty. And, of five factors causing distrust in leaders, respondents chose 'not doing what they say' (45%), self-interest (28%), and secrecy (11%) (World Economic Forum 2003). The growing body of research on corruption in government often looks closely at the role secrecy plays in developing or deconstructing trust in leaders and institutions, and appears to support the specific survey finding of the negative impact of secrecy, especially when secrecy involves transactions between elites.
“…it also may be that popular conceptions of wrongdoing rest not only on the law, but on a variety of other norms and values. For the non-rulebreaking actions, significant predictors of strict judgments included secrecy in a transaction (by far the strongest predictor); large and tangible stakes; high-status givers or victims, and high-status takers. By contrast, transactions involving a combination of private-sector giver and private-sector taker were seen as less seriously wrong, other things being equal. In other words, a secret, high-stakes transaction of tangible value between high-status, public-sector figures was likely to be seen as seriously wrong, even when no rules were being broken.” (Johnston, “Corruption, Inequality, and Change”)
An initial review of research from several disciplines and applied fields, including political science, computer science, public policy and management, sociology, psychology, and organizational behavior (Viklund 2002, Jane E. Fountain 2002, Walker 2000, Vargas 1998, Majtényi 2002, Gaines 2003, Minogue 2002, Stapenhurst 1997, Heller 1998, Johnston 1998, Helper 1999) revealed a great deal of research and data, but no consistent definitions of trust or credibility across, or even within, fields. And, though various papers discuss corruption and its dependency on secrecy (except in cultures where bureaucratic rent seeking is the norm and not secret), no research was found that measured the effect of secrecy on the level of trust in a manner useful for policy makers in developing and implementing policies. The reason for this may be the obvious: the impact of what is not known is difficult to measure, though as the physical sciences have demonstrated, it is not impossible. Of the sources reviewed, the following potentially provides the basis for a practical starting point with which to build a framework for measuring secrecy and its effects:
Hardin (1992) provides one promising way of moving beyond this conceptual impasse. It is useful, he argues, to conceptualize trust as a three-part relation involving properties of a truster, attributes of a trustee, and a specific context or domain over which trust is conferred. From this perspective, strategic, calculative and instrumental considerations would be expected to exert a dominant influence in some organizational contexts (e.g. transactions involving comparative strangers). However, in other contexts (such as those involving members of one's own group), relational considerations might be more salient and exert more influence over how trust is construed. Fully elaborated, a three-part theory of trust would thus afford adequate attention to both the calculative and relational underpinnings of trust (Vargas citing Hardin, 1999).
The first section will provide definitions of terms useful for building a common framework for discussing the potential costs and effects of information secrecy versus transparency relative to an individual or an organization. For the sake of consistency, the term 'principal' will refer to the possessor of information and the term 'agent' will refer to those with less or without access to the information in the principal's possession. Both principals and agents may be either individuals or organizations depending on the context of an information exchange. Also, to avoid confusion, this paper will not utilize the economic/game theory convention of the agent being in the employ of a principal; for those acting on behalf of or in the employ of a principal, 'principal associate' will be used. In many cases a principal associate will fall under the category of Third Party. Third parties are considered as 'parties' involved in an information transaction, but disinterested in the actual content of the information (see Exposure, Third Party initiated, below). For simplicity, and unless otherwise noted, information will always be deemed 'transparent' to the principal.
The following formula is generally in line with the suggestion above and will hopefully lend itself as an aid to policy makers in understanding an agent's perception of a principal or organization based on the principal's or organization's past known information transfer activities. Additionally, it is hoped that the formula will demonstrate some utility as a framework for policy makers and implementers when deciding on the degree of secrecy or transparency to be utilized in future policy decisions and implementation.
C ± w̄ ± R = T, where T denotes the resulting level of trust on the agent's part, based on the credibility (C) of the principal plus the 'Wu' (w̄), where w̄ is based on the agent's biased (B) estimate of the principal's integrity and the agent's perception of the degree to which the content of the information has been subjected to spin (S) by the information delivery medium (which may be the principal, a principal associate or a disinterested third party), and R is the risk as defined below. So, for 'positive action', C ± w̄ ≥ R: the level of trust required for a transaction to take place must be greater than the agent's perception of the risk involved.
Trust (T) is the sum of the principal's credibility (C), the 'Wu' (w̄), that is, the subjective assessment of the principal's integrity combined with the equally subjective valuation of the best and worst possible outcomes of a transaction with the principal, and the agent's perception of the risk (R) involved in the transaction. It is important to point out here that there are circumstances where the agent may 'trust' the principal to perform a negative action. For example, an agent may trust the principal's assertion that the principal will harm the agent, or something the agent values, unless the agent completes a specified action. In this case, there is a perverse relationship between a positive trust value and a negative transaction value.
Credibility (C) is based on the known history of past actions on the part of the principal and can be treated like the business and personal credit histories commonly used to determine 'credit worthiness'. Credibility is one factor in establishing trust; thus the credibility of a principal is determined by the agent's knowledge of the principal's past activity. Past actions may have positive, negative or neutral impacts on credibility, dependent on the agent's knowledge of those actions and the impact of the actions themselves. True secrets and truly anonymous information have a value of zero and have no effect on the balance sheet; they do not increase or decrease a principal's credibility. The degree to which information is not completely secret or anonymous is the degree to which an agent can use the information; however, partial knowledge may potentially influence the Bias (B) factor (explained below) in unexpected ways and have impacts out of proportion to its actual value. A principal associate's credibility may be largely determined by the principal's credibility; for instance, a person may not know the individual FBI investigator who comes to question them, so the investigator's credibility may be partially or solely dependent on the FBI's overall credibility as perceived by the agent or third party. It is helpful to consider credibility as the sum of transparent (Tr), exposed (E) and disclosed (D) information pertaining to a specific information area, or concerning a particular principal, received by the agent in the past: C = %Tr + %E + %D. Whether this results in a positive or negative credibility value is dependent on the actual content of the information. For example, it could be that the information exposed actually puts the principal in a better light than had the information remained secret, or that information revealed through transparency leads to a negative credibility rating.
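The arithmetic of the framework (C = %Tr + %E + %D, and trust as credibility plus Wu weighed against perceived risk) can be sketched numerically. This is only an illustration of the calculation under assumed, made-up component values; the text itself assigns no concrete numbers:

```python
def credibility(pct_transparent, pct_exposed, pct_disclosed):
    """C = %Tr + %E + %D. Each component may carry a positive or
    negative sign depending on the content of the information."""
    return pct_transparent + pct_exposed + pct_disclosed

def trust(c, wu, risk):
    """T = C +/- Wu +/- R: the Wu and risk terms enter with their
    own signs (risk may be positive or negative)."""
    return c + wu + risk

def positive_action(c, wu, risk_magnitude):
    """A transaction proceeds only when trust from credibility and
    Wu outweighs the agent's perceived risk."""
    return c + wu >= risk_magnitude

# Hypothetical values: a mostly transparent history, mildly negative
# spin, and a moderate perceived risk.
c = credibility(0.4, 0.3, 0.2)
t = trust(c, -0.1, -0.2)
go = positive_action(c, -0.1, 0.6)
```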
While it is useful to think of credibility as a composite term, for policy makers considering specific information to be released and the appropriate delivery vehicle, it may prove expedient to use survey results, if available, for a quick numeric value until more accurate values can be determined.
Risk (R), as utilized here, is the same as risk perception: the attempt on the agent's part to gauge the potential gain or harm which may result to the agent at some time in the future due to an incorrect assessment of the validity of a specific piece of information. Risk may have a positive or a negative value. For instance, the agent may perceive that trusting the information may save the agent's life, a positive R with a potential value of ∞, versus those times when the agent perceives trusting the information may cost the agent's life, a negative R with a potential value of -∞. In less extreme situations, the values may approach zero. Unless the agent is a paleontologist, there is no risk (or the risk is near zero) in believing that dinosaurs became extinct due to disease rather than a meteor strike. On the other hand, believing the proverbial used car salesman's pitch that the car was owned by a little old lady who only drove it on Sundays may lead the agent into a harmful financial transaction, and the risk value may be the potential financial outlay. Or, in the case of a Superfund site, an agent who lives adjacent to the site might perceive that acting on information from the Environmental Protection Agency carries less risk than acting on information communicated by alleged polluters, and in this case the risk value might be the cost to the agents of relocation versus the probability of acquiring a disease. An agent may tend to be more skeptical of information the greater the agent perceives the potential harm that may befall him if he has incorrectly placed his trust. Risk then, as used here, is potentially a combination of true risk (the actual probability of an event) measured through the subjective lens of the agent. Since it is not practical to measure every agent's perception of risk on all potential information, it may often be expedient to use probability estimates from appropriate sources.
The actual seismic probability of an earthquake in a specific region may of necessity be substituted for an individual's 'guesstimate' when policy makers are considering communicating related information.
Bias (B) is an agent's predisposition to trust the veracity of information based on the medium through which it was communicated. The communication medium may be the principal in person; the principal through electronic media such as television, the internet or the phone; or other physical media such as a letter or newspaper. The information may also be delivered by a principal associate, or by a disinterested third party, using any of these mediums. If the information does not come directly from the principal, the bias will potentially be affected by both the source and the principal. Whether the information was made available to the agent by the principal through active transparency, whether it was disclosed in response to routine compliance regulations versus a court order, or whether information about the principal was exposed against the principal's volition by a third party or another agent, are some of the potential motivations an agent may consider. It is possible that transparency would tend to influence bias in a positive manner, routine compliance might have a neutral effect, and exposure might tend to have a negative effect. The agent will make a 'biased' decision on the degree to which he can rely on the information content. This will be an inherently subjective estimate, influenced by personality, personal history, subject matter knowledge, and many other factors. In a study surveying Swedish citizens' trust in five Swedish institutions, the following attributes were found to have a positive effect [i.e., a positive bias] of various degrees on trust:
Commitment to a goal
In contrast, 'acting in self-interest' and being 'part of a power elite' were found to have a negative effect on people's perception of trustworthiness (Viklund 2002). The latter is largely supported by the growing and now fairly extensive research on corruption. However, the factors cited above provide only a limited idea of the range bias may have. Examples of bias's potential range at opposite extremes are: a parent who completely trusts their offspring (+∞) despite full and even firsthand knowledge of the offspring's history of extremely immoral and/or illegal past acts; and, in contrast, the bigot who refuses to trust a coworker of different ethnic descent (-∞) in spite of having had only positive interactions with this particular individual, and with all other individuals of the same ethnic persuasion, in the past. This remains a gross oversimplification, since in each information transfer the actual content of the information is greatly and independently subject to the agent's biases, and the biases themselves may change over time. Bias is sometimes inexplicably positive or negative toward the principal, toward a principal associate divorced from the principal, or even toward the potential outcome of the transaction itself. Bias includes those factors which affect an agent's rationality, such as addictions, situation-dependent desperation, religious/moral/ethical belief systems, compulsions such as greed, etc. And bias may affect the valuation of the potential gain or loss which could result from the transaction; i.e., some agents may place an 'irrational' economic or personal value on something, out of step with the common market value, as with the cherished family farm or the last item needed to complete a person's favorite collection of memorabilia.
'Wu' is the combination of the agent's Bias relative to the content of the information and the delivery vehicle, and the amount of Spin put on the information. Wu increases or decreases to the degree that the agent perceives that the information has been subjected to spin (S), and according to the agent's bias (B) toward the delivery medium or principal. With P as the agent's perception of spin, Wu = ±B ± (S/P). As defined above, transparency is the absence of spin, so in the case of transparent information, Wu = B. Since Wu is based on bias, it is also highly subjective, so its potential value ranges from -∞ to +∞ no matter how great the value of spin. Wu is a primary factor in establishing an agent's level of acceptance of information. In fact, the term 'Wu' was chosen to express the potential for a huge degree of irrationality on the agent's part in estimating the proper level of trust to accord a given piece of information. As stated above, 'spin' efforts made through marketing or disinformation campaigns on the part of the principal, or of third-party intermediaries such as the media or even an agent's trusted associates, will have a lesser effect to the extent that the agent perceives the spin. This is a generalization, and actual results will vary based on the specifics of a particular information transfer; i.e., transparent information may be perceived as negative by the agent, and conversely, detected efforts at spin may not affect the absolute value of a given piece of information.
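The relationship between these terms can be sketched as a small calculation. This is only an illustration of the arithmetic, not part of the framework itself: the function names, the choice to fold the ± signs into signed inputs, and the example values are all assumptions made here for clarity.

```python
# Sketch of the trust estimate T = C ± Wu ± R, where Wu = ±B ± (S/P).
# The ± terms are modeled by letting each input carry its own sign;
# all names and numbers below are illustrative assumptions.

def wu(bias: float, spin: float, perception_of_spin: float) -> float:
    """Wu combines the agent's bias with spin discounted by how much of
    that spin the agent perceives. Transparent information has spin = 0,
    so Wu reduces to the bias alone."""
    if spin == 0:
        return bias  # transparency: Wu = B
    return bias + spin / perception_of_spin

def trust(credibility: float, bias: float, spin: float,
          perception_of_spin: float, risk: float) -> float:
    """T = C + Wu + R, with signed inputs standing in for the ± terms."""
    return credibility + wu(bias, spin, perception_of_spin) + risk

# A skeptical agent: a slightly credible source, negative bias, and
# heavy spin that is largely perceived (and therefore heavily discounted).
print(trust(credibility=0.3, bias=-0.5, spin=2.0,
            perception_of_spin=4.0, risk=-0.2))
```

Note how the S/P term captures the generalization above: the more spin the agent perceives (larger P), the smaller spin's contribution to Wu.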
It is believed that the definitions and formulas presented above could be used in a method similar to the following. We can let the value of Spin (S) be the actual or estimated cost of developing and implementing a specific communications, marketing or disinformation campaign. The value for Credibility (C), in the absence of more refined information on a particular organization, could be assigned based on surveys measuring trust in a particular institution, such as:
From: 'How Americans View Government: Deconstructing Distrust,' by the Pew Research Center for the People & the Press, Survey Reports, released March 10, 1998
The value for an individual's Bias (B) factor could be based on some measure of interpersonal trust (Viklund 2002), perhaps using Julian Rotter's 1980 "Interpersonal Trust Scale," which measures the belief that another person's word or promise can be relied upon. Or, at a population/community aggregate level, the Community Quotient developed for the Social Capital Community Benchmark Survey on community trust, conducted in 2002 by Taylor Nelson Sofres Intersearch Corporation for the Saguaro Seminar of the John F. Kennedy School of Government at Harvard University, may prove a useful basis for assigning a value to Bias within the framework suggested here.
Community quotient -- Along every dimension of social capital (such as social trust, inter-racial trust, etc.) a community quotient (CQ) shows a community's performance on this dimension relative to what was predicted given its urbanicity, ethnicity, levels of education and age distribution. A CQ above 100 indicates that a community shows more of this community connectedness than its demographics would predict; conversely, a CQ below 100 indicates that a community shows less of this type of social capital than its demographics would suggest. Roughly 68% of all communities would fall in the 85-115 range, and almost 95% of all communities would fall in the 70-130 range (Social Capital Community Benchmark Survey by Taylor Nelson Sofres Intersearch Corporation for the Saguaro Seminar of the John F. Kennedy School of Government at Harvard University, 2002).
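The coverage figures quoted above (68% within 85-115, 95% within 70-130) are consistent with CQ scores following an approximately normal distribution with mean 100 and standard deviation 15. That distributional assumption is mine, not the survey's, but it can be checked quickly:

```python
from statistics import NormalDist

# Assumed for illustration: CQ ~ Normal(mean=100, sd=15).
cq = NormalDist(mu=100, sigma=15)

within_one_sd = cq.cdf(115) - cq.cdf(85)   # the 85-115 range
within_two_sd = cq.cdf(130) - cq.cdf(70)   # the 70-130 range

print(f"85-115: {within_one_sd:.0%}")   # roughly 68% of communities
print(f"70-130: {within_two_sd:.0%}")   # roughly 95% of communities
```

This matters for the framework because it suggests a natural way to center a community's Bias value: distance from 100, scaled by the distribution's spread.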
However, it must be noted that these scales were made for other purposes and do not reflect the full range of bias described above. Neither provides negative values, which would be especially useful in applying the formulas suggested here. The two scales are very different in their methodologies and ranges, but for general purposes, as long as they are kept separate, they may suffice for back-of-the-envelope calculations.
An example of T = C ± Wu ± R, where Wu = ±B ± (S/P), using the Environmental Protection Agency communicating information on Atlanta's water quality, might look something like this. The Credibility (C) rating might be considered slightly positive, based on the 69% favorability rating found in the Pew Research Center survey above. Spin (S) could be given the dollar value of the public information campaign that might be enacted; for example purposes, assuming the EPA and the information itself are not completely transparent to the populace at large (possibly in spite of the EPA's efforts), S might have a value of $10,000. The Bias (B) rating for Atlanta could be considered fairly negative, based on its community quotient of 83 found in the Social Capital Community Benchmark Survey mentioned above. For example purposes, the Risk of being infected by E. coli bacteria from swimming in the Chattahoochee River may in fact be one in ten, but what matters is the agent's evaluation not only of the risk of becoming infected, but also of the risk of death versus the risk of diarrhea or an eye infection, which depends on the degree to which individual agents value health and life. Trust would hypothetically be measured by the number of people who do not swim in the Chattahoochee as a result of the information campaign. Using this framework, or one similar, over time should allow policy makers to get a sense of how much money should be spent to overcome agents' biases when trying to communicate information.
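The Chattahoochee example can be walked through numerically. The conversions below, from survey percentages and a CQ score to signed values, and the perception scale used to discount spin, are illustrative assumptions of mine; the framework deliberately does not fix these scales.

```python
# A rough numerical walk-through of the Chattahoochee example.
# All conversions from survey figures to signed values are assumptions.

favorability = 0.69                        # Pew favorability for the EPA
credibility = favorability - 0.5           # center at 0: slightly positive

community_quotient = 83                    # Atlanta's CQ, benchmark survey
bias = (community_quotient - 100) / 100    # below 100 -> fairly negative

spin = 10_000                              # dollar cost of the campaign
perception = 50_000                        # assumed scale for perceived spin

infection_risk = 0.10                      # 1-in-10 E. coli infection risk
risk = -infection_risk                     # a potential harm, so negative

wu = bias + spin / perception              # Wu = B + S/P
trust = credibility + wu + risk            # T = C + Wu + R
print(round(trust, 2))
```

On these made-up scales the mildly positive credibility barely outweighs Atlanta's negative bias plus the perceived risk, which matches the intuition in the text: the campaign's spending (S) is what must do the work of overcoming agents' biases.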
Secrecy's impact in the above example might be found in the actions of a different principal, the polluter, should there be one. To the extent that the polluter actively works to keep its activities secret, or counters the EPA's information campaign with spin of its own based on misinformation or disinformation, the costs to the EPA, and potentially to the agents, will increase. It is also possible for the polluter to directly attack the EPA's credibility and information, with similar results. Thus, in our example, the polluter could also use the equation to estimate the value of its efforts and conduct a similar 'cost analysis', though with an eye toward secrecy.
Note: You only get limited comment space in my current 'comment' format. If you have something longer that you'd like to contribute to the discussion/debate, please email me at 'firstname.lastname@example.org', and if I think it contributes, I'll cite you and post it. I reserve the right to edit, but I'll communicate with you, and ultimately you'll decide whether it's posted or not. (The start of some sort of blog/reader/contributor agreement? Yes. Edited comments will not be posted without permission of the originating source.)
I'll come back to all this later, but I'm hoping to have further input from you first.
Note: I finished reading The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?, and, well, he does a good job of laying a lot out. But I found it a little plodding, and I found I differed, after all, in the conclusions. I think the current revelation of the 'torture' photos supports my idea that the asymmetry of information flows is going to become more and more difficult to control. Influence will always be possible, but ultimate control will remain an unachievable paradise. A fantasy. I don't think Huxley's vision is likely, though I haven't written it off yet.
If you're new to this blog, please check out the lighter side. Well, as light as it gets anyway: Radically Inept
Note: I plan to take this conversation back to atheism, I'm just a little plodding : )
Thursday, April 08, 2004
The Omni-Verse will be used to denote EVERYTHING. There is no realm, dimension or anything that exists outside my definition of the Omni-Verse. If you are a Christian and believe in a god that created our universe from outside the confines of our universe, then the realm said god exists within is within the Omni-Verse. And it is infinite in time, space and any other metric one can measure; it is boundless.
Visual aids often help, and my method is to visualize a non-existent point residing in a vacuum. From this point, radiating outward, visualize every possible fractal [see Infinite Fractal Loop: Index for visual examples]. Of course, the set of all possible fractals is infinite. Then, when you have all the fractals radiating out, add fractals moving in every other direction, including tangentially and in opposition. Once you have that visualized, take a moment and assign each shade of each color as a unique dimension, and couple that with the concept that each fractal measures 'its' own time in relation to the other fractals it 'encounters', but also that many of the fractals will not interact with all of the dimensions of the other fractals they come in contact with. Where there is interaction between fractals, energy is exchanged in some manner. This is not a perfect mental representation of my Omni-Verse, and I'm sure I will come back and make this passage clearer for readers, but it will have to do for now.
I must now confuse the matter further by saying that while the above passage makes the visualization easier, there is a problematic twist: none of the above exists. Everything in 'reality' is just an abstract relationship to a non-existent point. And I am totally enamored of an idea I first saw put forth in Michael Moorcock's Elric Saga, wherein the hero stands at the edge of the Universe and, through force of will, shapes the chaos beyond into usable order. I think it interesting that today some scientists are pondering whether quarks and other sub-atomic particles and their 'behaviors' even existed before we 'discovered' them, or whether we in fact created them by applying order to chaos.
Infinity is exactly that: without end. Infinity, since it can be applied to various aspects and areas, applies to and is contained within the Omni-Verse.
Life. Life is the reason time exists. If there were no life, there would be no need of time, and of course, nothing to measure it. 'Time' is a construct of life, to allow life to take advantage of any and all energy transfers within the Omni-Verse, including energy exchanges between life forms. [I believe that life in some form exists anywhere there is a predictable pattern of energy exchange.] All life is of equal value. There is no hierarchy for measuring the value of life; no 'higher' form of life exists. There are more complicated and less complicated forms of life, and some life forms are able to achieve a greater influence upon their environment spatially, chemically or through some other method, but this does not determine the 'value' of their existence within the Omni-Verse.
The value of a particular life, mine or yours, is entirely subjective and of our own making. And the idea that somehow the life of a plant or fish has less value than a mammal, or that our lives have greater value than those of the other mammals, is also of our own making. Further, there is no way for a human not to take life during their existence. Every breath causes the death of thousands, possibly millions (maybe I'll find out some specifics and add them later), of bacteria when they come into contact with our saliva and other bio-defense shields.
I must shut down for a while now, and I did not even get to the main subject of atheism. I will try to do some more posting later, but it might well be tomorrow before I find the time. Hopefully the above will have been worth the visit.
Saturday, April 03, 2004
A national security system was in place, and would thereafter be on the defensive more than otherwise. It became easy to argue that the Government was hiding something. Conspiracy theories emerged to explain misfortune or predict disaster. There is nothing novel in the appearance of conspiratorial fantasies, but it could be argued that it is something new for large portions of the American public to believe that agencies designed to protect them are, in fact, endangering them. Senator Daniel Patrick Moynihan, “REPORT of the COMMISSION ON PROTECTING AND REDUCING GOVERNMENT SECRECY”, 1997 (SENATE DOCUMENT 105-2 PURSUANT TO PUBLIC LAW 236, 103RD CONGRESS)
Polls show that nearly 80 percent of Americans believe JFK died as a result of a conspiracy, and about half believe the CIA was somehow involved. Whatever remains in the CIA files cannot be nearly as awful as the American public imagines. To be sure, I hardly saw everything there was to see, but I got not even a whiff of dirty tricks that had somehow remained hidden from Church Committee investigators or the army of historians and authors who write about the CIA. I really believe that it would be in the Agency's interest to let historians see for themselves what remains classified. I do not see why the Agency does not declassify almost any secret that is more than 30 years old. Evan Thomas, Gaining Access to CIA's Records, Studies in Intelligence, Volume 39 Number 5, 1996.
This paper explores the value of secrecy in its various forms and the relationship of secrecy to trust, and calls on policy makers to develop a framework that allows decision makers in various organizations and at multiple levels to visualize the potential benefits and consequences of different levels of information communication. It is with this in mind that this paper looks at our reliance on secrecy in conducting many of our basic interactions as a potential contributory source of a great deal of our dissatisfaction with, and distrust of, major institutions. Yet, as individuals, we wish to maintain our right to privacy, and few citizens would question the value of keeping the codes to our nuclear weapons secret. In the current climate, there are now calls to keep a great deal of research in biotechnology and nanotechnology from the public. Is the risk of keeping chemical companies' operations secret greater or less than that of continuing to require disclosure of the chemicals they use, which may potentially fall into the hands of terrorists? Have we come to rely on secrecy to too great an extreme? Have increasing levels of information technology made the cost of maintaining secrets far greater than in the past? To what extent does secrecy affect public trust? And can an effective framework be developed to help policy makers make decisions related to secrecy? An argument can be made that secrecy, by its nature invisible, is too great a challenge to attempt to measure and evaluate. But no one has seen electricity either, and yet we have all seen its effects; moreover, there exists a vast literature on electricity which not only defines it in a useful manner, but also provides the means to measure its effects.
In light of the impact of secrecy on the information economy and the economy in general, and of secrecy's impact on the public's trust in major institutions, especially when secret and often illegal activities are exposed, it is hoped that the following concepts, though by no means definitive or complete, will spur others to investigate 'secrecy', its costs and its benefits, to a greater depth than is available at present, and to pursue the development of a useful common framework upon which secrecy can be better defined and measured, and its impact predicted with an accuracy not available to policy makers today. Answering all the questions raised above is beyond the scope of this paper, but it does attempt to point out many of the difficulties.
This paper is composed of three major sections. The first section provides an overview of the current known status of secrecy in the United States, and provides definitions of the terms related to secrecy upon which the other two sections build. The second section provides an overview of the current state of trust in major institutions and defines terms for utilization in the final section, as well as introducing a formula for measuring trust. The final section uses the concepts, definitions and trust formula introduced in the first two sections to introduce 'Wu', a concept to aid policy makers in visualizing the effect of irrational biases and their influence on information transactions involving secrecy and trust.
The 1997 "Report of the Commission On Protecting and Reducing Government Secrecy", chaired by Senator Daniel Patrick Moynihan, is the second Congressional commission to attempt a comprehensive assessment of government secrecy. The first, the Commission on Government Security, was established in 1955 and issued its final report in 1957. The findings of the two commissions and their subsequent reports are remarkably similar. Both reports recognize the legitimate need for the government to protect information in the interests of national security. Surely, anyone can recognize the legitimacy of restricting access to information in the case of the codes and access procedures for arming and launching America's nuclear arsenal. Or, of similar importance, citizens understand the restriction of information concerning troop movements in time of war. However, both reports found the procedures for classifying information, and the amount of information classified, out of step with actual legitimate national security concerns. The report issued by the Moynihan Commission is a broad, detailed and comprehensive guide to current levels of government secrecy and the related costs. Two of the surveys cited in the report provide what may be a very conservative estimate of the costs to taxpayers of protecting classified information. The first survey, cited in 1994, estimated the total annual security costs of reporting agencies and departments for 1993 at approximately $2.27 billion (costs for the CIA were omitted). The second survey, using improved methods, was issued in 1996, and put the costs of classifying information at $2.7 billion for government agencies (again excluding CIA cost data) and $2.9 billion for related defense industry firms, for a total outlay to protect classified national security information of approximately $5.6 billion annually.
The report also looked at the broader intelligence community, distributed among various departments and agencies, the related contractor organizations and a large host of university and research institutions, as a large information economy. In this light, the report views secrecy and the classification system as a set of regulations, and provides insight into how the system distorts the information economy. Of course, much of this must rely on extrapolation from the data available, as this regulatory system regulates itself into intentional or unintentional obscurity. The report states that secrecy is the ultimate mode of regulation, leaving citizens unaware that they are being regulated. Normal regulations inform a citizen of his required behavior and are therefore disseminated to inform him. In contrast, secrecy regulates what knowledge a citizen may have, but does not let him know what he legally may not know.
Even so, "overregulation" is a continuing theme in American public life, as in most modern administrative states. Secrecy would be such an issue, save that secrecy is secret. Make no mistake, however. It is a parallel regulatory regime with a far greater potential for damage if it malfunctions. Sen. Moynihan, Chairman's Foreword, 1997
In keeping with the economic framing, the report points out that free markets provide players with the most information, and that as the free flow of information is restricted, markets become less efficient. In 1995, Executive Order 12958 authorized twenty officials to classify as Top Secret "information, the unauthorized disclosure of which could be expected to cause exceptionally grave damage to the national security." This authority has been delegated to 1,336 "original classifiers." "Derivative classification" authority is given to two million government officials and to one million industrial contractors. In 1995 there were 21,871 "original" Top Secret designations and 374,244 "derivative" designations. Were there 400,000 secrets created in 1995, the disclosure of any one of which would "cause exceptionally grave damage to the national security"? Ibid.
However, it is not just in matters of national security that there is a legitimate need for secrecy, which may nevertheless be abused. In many aspects of our system the use of secrecy is viewed as a legitimate tool for the protection of various institutions and systems, including the grand jury system, the government witness protection programs, and, with a proper court order, the authority to use covert surveillance while investigating criminal activity. The right to privacy, which may be considered the right of individuals to have secrets, has been alluded to in a number of cases as a protection arising from the 1st, 4th, and 5th amendments of the Bill of Rights. One of the first cases of significance might be Meyer v. State of Nebraska, 262 U.S. 390 (1923), in which the right of the state to prevent parents from teaching their children in a language other than English was considered an invasion of liberty related to parental privacy. It was not until Griswold v. Connecticut, 381 U.S. 479 (1965) that an independent right of privacy was explicitly expounded by the Court, which has since been refined in numerous decisions. The activities that have received various levels of legal protection include the lawyer-client privilege, the doctor/patient relationship, the confessional of the Catholic Church, and the prohibition on one spouse being forced to bear witness against another. In the market, businesses have the right, and even the legal obligation, to protect trade secrets from would-be competitors, and the media, though often contested, has the right to protect its information sources. With this in mind, secrecy of various levels and in various forms is an integral and important aspect of our existing system. There is a great deal of debate today on the impact of technology and interconnectedness on personal privacy and anonymity.
Interestingly, while there have been increasing cries for greater transparency on the part of governments, businesses and other large institutions, there has simultaneously been a vast amount of popular dialogue on the right of individuals to keep secrets from these same institutions. This appears to be a response to the increasing technological ability of these organizations, and of other individuals, to access personal information. It may well be due to the perception that individual secrets are under more imminent threat than those of larger organizations, or it may be motivated by an increasing perception, or recognition, of an asymmetric access to information which is by its very nature inequitable. With this in mind, the advancements in information technology over the past several decades have had several distinct impacts in the area of information, for both individuals and institutions. First, they have allowed an incredible increase in the volume, rate of exchange and transmission of information, much of it of suspect validity. They have aided the advancement of research, so that the pace of newly acquired information in all areas has advanced rapidly; in fact, so rapidly that ever-increasing specialization is required, and depth and breadth of information become almost mutually exclusive. Information technology has also enabled the storage of vast amounts of information in ever smaller spatial volumes, and allowed for the oft-cited global interconnectedness.
More to follow, and of course the right to revise and re-edit at will, which I think it needs...
Thursday, March 25, 2004
WorldChanging: Another World Is Here
It appears to do an excellent job of covering a lot of science news, especially in the realm of the environment and sustainability issues.