Tacit & explicit knowledge

Hector McNeill1
SEEL



The Real Incomes Approach is designed to encourage a sustained effort in increasing the productivity of economic activities as the driver of growth in real incomes.

An important aspect of achieving this is the optimization of the combinations of tacit and explicit knowledge. This vital topic is quite often not addressed in discussions of macroeconomics or of policies directed at encouraging increases in the productivity of economic units.

This article is the sequel to the article Slumpflation or Deflationary Slump? and explains the significance of these considerations.

Elton Mayo3 (1880–1949) was a psychologist who carried out research on industrial activities from the standpoint of human interactions and organization. In the year he died a book was published entitled "The Social Problems of an Industrial Civilization", containing some of Mayo's work and edited by Karl Mannheim (The International Library of Sociology and Social Reconstruction series, Routledge & Kegan Paul Ltd., London, 148 pp., 1949).

Elton Mayo on this important niche

"A simple distinction made by William James in 1890 has all the significance now that it had then; one can only suppose that its very simplicity has led the universities to brush it aside as obvious, which is true, or as of small account, which is not true. James pointed out that almost every civilized language except English has two commonplace words for knowledge-connaitre and savoir - knowledge-of-acquaintance and knowledge-about. This distinction, simple as it is, nevertheless is exceedingly important; knowledge-of-acquaintance comes from direct experience of fact and situation, knowledge-about is the product of reflective and abstract thinking."

"Knowledge derived from experience is hard to transmit, except by example, imitation, and trial and error, whereas erudition (knowledge-about) is easily put into symbols-words, graphs, maps. Now this means that skills, although transmissible to other persons, are only slowly so and are never truly articulate. Erudition is highly articulate and can be not only readily transmitted but can be accumulated and preserved. The very fact that erudition (logic and systematic knowledge) can be so easily transmitted to others tends to prejudice university instruction in the social sciences heavily in its favour."

"Physics, chemistry, physiology have learned that far more than this must be given to a student. They have therefore developed laboratories in which students may acquire manipulative skill and be judged competent in terms of actual performance. In such studies the student is required to relate his-logical knowledge-about to his own direct acquaintance with the facts, his own capacity for skilled and manipulative performance. James's distinction between the two kinds of knowledge implies that a well-balanced person needs, within limits, technical dexterity in, handling things, and social dexterity in handling people; these are both derived from knowledge-of-acquaintance. In addition to this, he must have developed clinical or practical knowledge which, enables him to assess a whole situation at a glance. He also needs, if he is to be a scientist, logical knowledge which is analytical, abstract, systematic-in a word, the erudition of which Dr. Alan Gregg speaks; but it must be an erudition which derives from and relates itself to the observed facts of the student's special studies".

"Speaking historically, I think it can be asserted that a science, has generally come into being as a product of well-developed technical skill in a given area of activity. Someone, some skilled worker, has in a reflective moment attempted to make explicit the assumptions that are implicit in the skill itself. This marks the beginning of logico-experimental method. The assumptions once made explicit can be logically developed; the development leads to experimental changes of practice and so to the beginning of a science. The point to be remarked is that scientific abstractions are not drawn from thin air or uncontrolled reflection: they are from the beginning rooted deeply in a pre-existent skill. At this point, a comment taken from the lectures of a colleague, the late Lawrence Henderson, eminent in chemistry' seems apposite:

“In the complex business of living, as in medicine, both theory and practice are necessary conditions of understanding, and the method of Hippocrates is the only method that has ever succeeded widely and generally. The first element of that method is hard, persistent, intelligent, responsible, unremitting labour in the sick room, not in the library: the complete adaptation of the doctor to his task, an adaptation that is far from being merely intellectual. The second element of that method is accurate observation of things and events, selection, guided by judgement born of familiarity and experience, of the salient and recurrent phenomena, and their classification and methodological exploitation. The third element of that method is the judicious construction of a theory - not a philosophical theory, nor a grand effort of the imagination, nor a quasi-religious dogma, but a modest pedestrian affair . . . a useful walking-stick to help on the way. . . . All this may be summed up in a word: The physician must have first, intimate, habitual, intuitive familiarity with things; secondly, systematic knowledge of things; and thirdly, an effective way of thinking about things.”


Reference: Mayo, E., "The Social Problems of an Industrial Civilization", Routledge & Kegan Paul Ltd., London, 148 pp., 1949.
Mayo's writing was detailed and covered an immense field of research, often referring to work undertaken by others, sometimes from some years before, which had not yet been translated into English.

He was aware of the importance of what is now referred to as tacit and explicit knowledge, and the passage quoted above covers his introduction to these issues and his justification as to why they were (and are) important.

Understanding resource inputs

Tacit Knowledge - the knowledge and information embedded in people2

The bounds of production performance feasibility are normally set by the known capabilities or performance of the people working in a defined process and deploying defined techniques in the use of specified tools and equipment. There is a trade-off between experience and performance: the capabilities of people (tacit knowledge) can be quantified in terms of practical performance related to the accumulated capabilities and knowledge derived from direct experience and embedded in people. Tacit knowledge is attained through experience in exercising given techniques in a specific task on a repetitive basis, and it can be roughly measured in terms of the time a person takes to complete that task. This means, for example, that a person with six months of experience will not match the performance, in a somewhat complex activity, of someone with fifteen years of practice. All business operations need to contend with the reality that, although a specific level of human operational attainment might be desirable, people with sufficient experience are often not available, so it is necessary to make use of less experienced people who learn on the job and gradually improve their performance.

These are realities, and managers who learn to work with and apply this knowledge can not only identify feasible quantitative operational targets but can also project the likely future profile of performance on an objective basis.

I will elaborate on this later in this article.

Tools & equipment

Automation - knowledge & information embedded in tools & equipment

Besides the trade-off between experience and performance, there is also a trade-off between the tools and equipment used and the degree to which they support work through automation (digitization). A machine tool with considerable knowledge and information embedded in it, including programs and electro-mechanical or electro-graphical components, will be more advanced and productive than one with fewer automation components.

Feasible productive solutions

Feasible solutions arise from the combination of people, tools and equipment according to practical quantified attainments based on the observation and measurement of performance across the normal ranges of attainment. These can be ranked as low, average or high performance, or be benchmarked in terms of poor, average or good practice.

Management of activities

General knowledge

The understanding and management of activities benefits from people's domain knowledge and experience. An indication of this experience is formal instruction in the domain or a related topic.

Orientation of decisions

People manage activities through instructions that set out what needs to be done, with its timing and sequencing, according to the specific requirements at any given point in time; these are communicated through spoken and written instructions expressed in the form of explicit knowledge and information. Explicit knowledge, unlike tacit knowledge, can be easily transmitted to another person within a relatively short period of time.

Explicit information communications infrastructures

Supportive resources for the storage, access and communication of explicit knowledge include information technologies and telecommunications, including the World Wide Web and the Internet backbone.

Decision analysis

The ability to make sound business decisions depends upon knowledge of cause and effect relationships (determinant functions), upon knowledge concerning the probability of events influencing decision outcomes and on the ability to identify, collect and analyse additional information to refine the understanding and relevance of the determinant functions and estimate probabilities. In this decision analysis process the vast range of resources that exist in the form of mathematical procedures, rules, logic and methodological elaborations in the form of statistical survey and analysis and operations research algorithms represent effective ways to apply explicit knowledge. Even the actual performance levels achieved by people with different levels of accumulated capabilities, or tacit knowledge, are expressed in the form of explicit knowledge for recording, accessing, transmitting, analytical and planning purposes.
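
To make this concrete, here is a minimal, hypothetical sketch in Python (the options, probabilities and payoffs are invented for the example) of how explicit knowledge of event probabilities and outcome values combines in a basic expected-value decision analysis:

# Hypothetical expected-value decision analysis: explicit knowledge of
# event probabilities and outcome payoffs is used to rank the options.
options = {
    "expand plant": {"strong demand": (0.6, 500), "weak demand": (0.4, -200)},
    "hold steady":  {"strong demand": (0.6, 150), "weak demand": (0.4, 100)},
}

def expected_value(scenarios):
    return sum(p * payoff for p, payoff in scenarios.values())

for name, scenarios in options.items():
    print(f"{name}: expected value = {expected_value(scenarios):.0f}")

best = max(options, key=lambda name: expected_value(options[name]))
print(f"Choose: {best}")  # "expand plant" (220 vs 130)

Refining the probability estimates with additional information, as described above, would simply update the numbers feeding this calculation.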

Artificial intelligence and automation

The terms tacit and explicit knowledge became more prominent during the 1980s when efforts were being made to advance Fifth Generation computing, also referred to as artificial intelligence (AI). Whereas data processing involving explicit knowledge can be accomplished far more effectively with digital processes than with human effort, when it comes to tacit knowledge the automation processes face a significant challenge in emulating the human learning curve based on direct experience. There is also a problem in writing code that can combine the three properties of the effective human element described in the closing lines of the Lawrence Henderson quotation above, which I paraphrase:

the effective practitioner must have:
  • first, intimate, habitual, intuitive familiarity with things;
  • secondly, a systematic knowledge of things; and
  • thirdly, an effective way of observing things (objects, processes and events) and analyzing them so as to arrive at conclusions through a process of deduction.
Notice that the third process is advanced on the basis of experience from the first (tacit knowledge) and learning acquired to build up the second (explicit knowledge).
George Boole

In a book entitled, "The Laws of Thought", published in 1854, George Boole described how humans deduce and make decisions. He also set this out as a practical mathematics of logic and probabilities. This work provided the rationale and methodology for reducing complex logical relationships to simpler sets of relationships which can reproduce all of the possible relationships from which the set was derived. This process is known as Boolean reduction. Boolean reduction is used to reduce the size and complexity of complex digital logic designs to produce workable logic designs for circuits for digital devices. The success of modern digital circuitry manufacturing, including micro-devices and the computer industry based upon these, rests directly upon the practical utility of the mathematics developed by George Boole.
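
As a minimal illustration of Boolean reduction, the Python sketch below verifies by exhaustive truth-table enumeration that (A AND B) OR (A AND NOT B) reduces simply to A, the kind of simplification used to shrink digital logic designs:

from itertools import product

# Boolean reduction example: (A and B) or (A and not B) reduces to A.
# The equivalence is verified by enumerating the full truth table.
def original(a, b):
    return (a and b) or (a and not b)

def reduced(a, b):
    return a

for a, b in product([False, True], repeat=2):
    assert original(a, b) == reduced(a, b)
    print(f"A={a!s:5} B={b!s:5} -> {original(a, b)}")

print("Equivalent: a two-gate expression collapses to a single signal.")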

George Boole's objective in developing this approach to logic was to explain how individuals use information and knowledge to deduce and take decisions. He succeeded in establishing, some 150 years ago, a practical basis for designing expert and knowledge-based systems.

Boolean Logic in the Digital World

The essential contribution of Claude Shannon



There is no doubt that the person who was instrumental in pointing out the importance of Boole's mathematical logic to digital systems was Claude Shannon (1916-2001). Shannon was born in Petoskey, Michigan, USA in 1916 and graduated from the University of Michigan in 1936 in mathematics and electrical engineering. In 1940 he gained a master's in electrical engineering and a Ph.D. in mathematics at MIT. He died in 2001.

Although broadly appreciated for its brilliance, Boole's work had found limited practical application. However, in 1938 Claude Shannon published a paper based on his 1937 thesis, entitled "A Symbolic Analysis of Relay and Switching Circuits", in which he explained how Boolean logic could contribute to more efficient circuit design. This seminal work launched Boolean logic into the digital world.

It is intriguing that Shannon, with combined expertise in electrical engineering and mathematics and having studied Boolean logic as an undergraduate, was able to correctly identify its contextual significance. In doing so he made a vital contribution to the efficiency of circuit design based on Boolean logic and accelerated the world's entry into the digital era as we know it today.
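
Shannon's insight can be sketched in a few lines: switches wired in series behave as a logical AND, and switches wired in parallel behave as a logical OR. The toy Python model below (the names and circuit are illustrative, not Shannon's notation) shows the correspondence:

# Toy model of Shannon's correspondence between switching circuits
# and Boolean algebra: series wiring acts as AND, parallel as OR.
def series(*switches):
    # Current flows only if every switch in the chain is closed.
    return all(switches)

def parallel(*switches):
    # Current flows if any branch offers a closed path.
    return any(switches)

# Switch A in series with (B in parallel with C) corresponds to
# the Boolean expression A AND (B OR C).
for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            assert series(A, parallel(B, C)) == (A and (B or C))

print("Series/parallel wiring reproduces AND/OR exactly.")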

An interesting paper entitled "1+1=1 - a tale of genius" was written by Ian Petticrew, the editor of the INTOSAI Magazine, which is hosted on the web site of the National Audit Office (INTOSAI: International Organization of Supreme Audit Institutions); it describes in more detail the linkage between George Boole's and Claude Shannon's work. This paper, from Issue 18, August 2003, of the INTOSAI IT Journal, is available in PDF format and can be accessed via this link:

"1+1=1 - a tale of genius"


Source: The George Boole Foundation Ltd (London)
This constellation of linked logical activities represents the complex that any attempt at automation, or at embedding in coherent code with adequate feedback and response to learning cycles, needs to contend with. In the 1980s there was a significant effort in Europe and the USA to counter the Japanese initiatives in artificial intelligence following the release of the ICOT report produced in the early 1980s. Between 1983 and 1987 I was involved in the identification of novel applications and their impacts on the European economy (PROGNOS-GTS) and then with the Information Technology and Telecommunications Task Force (ITTTF) of the European Commission in Brussels, identifying planning initiatives for the development of learning systems. As a result of this work I was very much aware of parallel efforts in the USA, European member states and Japan, including those of the European Strategic Programme in Information Technology (ESPRIT). As far as I could see, four things emerged from the period of the mid-1980s through the mid-1990s that could be classified as practical successes:
  • the introduction of technical operational standards by the European Commission and Telecom industry for the mobile telephony industry and market which helped the EU overtake the USA and Japan in this field
  • the development of the World Wide Web following the almost individual efforts of Tim Berners-Lee while working initially on a short term contract with CERN
  • the development of industrial process robots accelerated and was largely perfected during this period, notably in Japan
  • the major change in the programming paradigm to the object-oriented approach, based largely on the efforts of Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Center in Oslo since the 1960s, starting with their language Simula 1.
These more "mundane" developments had a far more profound practical, economic and social impact, but they did not emerge from the efforts supported by the billions of euros spent on research programmes and AI. Personally I came to the conclusion that there was little output that took us beyond the basic binary logic developed by George Boole and described above. Much of the more recent "advances" in AI, including verbal responses to questions, quiz systems and so-called evolutionary algorithms and complex simulations, can be explained in terms of Boolean logic (see the box above).

Moore's Law

What has been apparent is the continuing impact of Moore's Law in information technology. This is based on an observation made by Gordon Moore, one of the co-founders of Intel, that the number of transistors in a dense integrated circuit doubles approximately every two years. Although Moore's observation was made in 1965, the relationship seems to have endured, although the two-year cycle has been reduced to 18 months. This law has been used in the semiconductor industry as a guide to medium- and long-term planning, setting projected performance targets for research and development efforts. Moore's law is an important indication of the intensity of technological and social change, productivity, and economic growth.
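
The arithmetic is a simple exponential. As a minimal sketch, assuming an 18-month doubling period, transistor counts grow by a factor of 2^(t/1.5) over t years:

# Moore's law as a simple exponential: the count doubles every
# `period` years (18 months = 1.5 years assumed here).
def growth_factor(years, period=1.5):
    return 2 ** (years / period)

print(f"10-year growth factor: {growth_factor(10):,.0f}x")  # about 100x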

Management decision analysis

The subject of this article is fundamentally how an understanding of tacit and explicit knowledge can be used to improve management decision-making so as to increase real incomes within a Real Incomes Policy framework, and thereby contribute to a general rise in real growth in the economy as a whole. The initial step is to understand that the accumulation of tacit knowledge is an evolutionary process that occurs over time, and that its contribution to any current state of performance can be measured. In addition, an understanding of the learning curve relationships that trace the effect of tacit knowledge on productivity can be used to project the likely costs of operations into the future.
Wright's & Moore's laws compared

Forecasting technological progress is of vital importance to systems engineering economists, to policy makers and to investors. There are several models for predicting technological improvements, including the early hypothesis, made by Theodore Wright in 1936, that cost decreases as a power law of cumulative production. Moore's law is another “law”, which states that technologies improve exponentially with time. Other models have been proposed by Goddard, Sinclair et al., and Nordhaus. Six of these predictive models were tested and compared, making use of a database on the cost and production of 62 different technologies, to predict future costs. This involved hindcasting and developing a statistical model to rank the performance of the postulated laws. The results were published in the paper “Statistical Basis for Predicting Technological Progress” (2012) (see below).

Wright’s Law wins

Wright's law produced the best forecasts, but Moore's law is not far behind. The researchers discovered a previously unobserved regularity: production tends to increase exponentially. A combination of an exponential decrease in cost and an exponential increase in production would make Moore's law and Wright's law indistinguishable, and the researchers showed for the first time that these regularities are observed in the data to such a degree that the performance of the two laws is nearly equivalent. Most significantly, the results show that technological progress is forecastable. These results are important for theories of technological change and for assessments of candidate technologies and economic growth policies.


Bela Nagy, J. Doyne Farmer, Quan M. Bui and Jessika E. Trancik, "Statistical Basis for Predicting Technological Progress", 2012, Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA; St. John's College, 1160 Camino Cruz Blanca, Santa Fe, NM 87505, USA; and Engineering Systems Division, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
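
A minimal sketch of the kind of fit involved, using invented illustrative numbers rather than data from the paper: Wright's law, cost = c0 * x^(-w) for cumulative production x, is linear in log-log space, so an ordinary least-squares fit recovers the exponent:

import numpy as np

# Wright's law: unit cost c(x) = c0 * x**(-w) is linear in log-log space.
# Illustrative data only: cumulative production vs. observed unit cost.
x = np.array([1, 2, 4, 8, 16, 32], dtype=float)
cost = np.array([100.0, 81.0, 66.0, 52.0, 42.0, 34.0])

# Least-squares fit of log(cost) = log(c0) - w * log(x).
slope, intercept = np.polyfit(np.log(x), np.log(cost), 1)
progress_ratio = 2 ** slope  # cost multiplier per doubling of production

print(f"Wright exponent w = {-slope:.3f}")
print(f"Roughly an {progress_ratio:.0%} learning curve "
      f"({1 - progress_ratio:.0%} cost reduction per doubling)")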


Wright's Law

The effect of the learning curve was explained by Theodore Wright in a 1936 paper entitled "Factors Affecting the Cost of Airplanes" (Journal of Aeronautical Science, Volume 3, No. 2, 1936, pp. 122-128). The learning curve impact described by Wright, also referred to as Wright's Law, has turned out to be a reliable basis for predicting the impact of tacit knowledge on the productivity of processes involving humans, and in addition can be used to extend this analysis (see the box above).

Learning curve

The learning curve is the phenomenon of measurable reductions in the resources used (including time) in the production of an object as the cumulative quantity of throughput grows. In general terms, for many production situations there is a constant percentage reduction in resource consumption (the learning index) associated with every doubling of the historic cumulative production of a team or individual.

Belkaoui4 cites a summary by Hirschmann5 of the basic doctrine of the learning curve as:

1. Where there is life there is learning.

2. The more complex the life, the greater the rate of learning. Man-paced operations are more susceptible to learning or can give greater rates of progress than machine-paced operations.

3. The rate of learning can be sufficiently regular to be predictive. Operations can develop trends which are characteristic of themselves. Projecting such established trends is more valid than assuming a level of performance or no learning.

In general terms the learning curve effect is more pronounced if production processes are labour-intensive and less pronounced if production processes are more capital-intensive or automated.

Geometry of the learning curve

The learning index is the percentage reduction in resources used with each historic doubling of throughput. This is also expressed as a percentage curve: a more capital-intensive process might have a 90% curve, signifying 10% reductions, while a more labour-intensive process might have an 80% curve, indicating a 20% reduction. An example of an 80% curve is sketched below.
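
A minimal sketch of the geometry, assuming an 80% curve: unit resource use follows c(x) = c(1) * x^b with b = log2(0.80), so it falls by 20% with each doubling of cumulative output:

import math

# 80% learning curve: each doubling of cumulative output cuts
# resource use by 20%, i.e. c(x) = c1 * x**b with b = log2(0.80).
def unit_cost(x, c1=100.0, curve=0.80):
    return c1 * x ** math.log2(curve)

for x in (1, 2, 4, 8, 16):
    print(f"cumulative units: {x:3d}  unit cost: {unit_cost(x):6.1f}")
# Prints 100.0, 80.0, 64.0, 51.2, 41.0 -- a 20% drop per doubling.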

Real Incomes and Costs

As a result of our understanding of Wright's Law there is a degree of predictability in technological innovation, and this has profound implications. Not least, such knowledge provides a significant advance over conventional economic theories, enabling a more rational theory of economic growth. More significantly, it makes possible a more transparent alignment between public policies designed to support investment and the microeconomic imperatives of good systems and engineering design, investment implementation and the operations of economic units.

This is one of the main objectives of the Real Incomes Approach to economics. The researchers mentioned in the box entitled "Wright's & Moore's laws compared" explained that technological performance cannot be quantified by a single measure because technologies have several components. For example, a computer has speed, memory size, disc capacity, physical size and a cost (or price). However, by using the inflation-adjusted cost of one “unit” it was possible to compare many different technologies even though the specifications (qualities) of units were very different and might change over time. The researchers made use of such a cost model to undertake their comparative analysis of the technology prediction models. Even though unit cost is a crude measure, it is the best there is, and the researchers found common trends that proved to be predictable and useful.

This is the same rationale for the use of the Price Performance Ratio (PPR), which measures the response of unit output prices to unit input costs, as a key practical measure of performance in the Real Incomes Approach (See: Price Performance Ratio). The use of the attainment of PPR values by companies as the determinant of performance bonuses paid through the Price Performance Levy (PPL) (See: Price Performance Levy) is therefore applicable to all sectors of the economy and all statuses of economic unit. This underlying measure is the foundation of the general status of, and trends in, real incomes.
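
As a minimal sketch, reading the description above as the ratio of the percentage change in unit output prices to the percentage change in unit input costs (the formal definition is given in the linked Price Performance Ratio article):

# Hedged sketch of a Price Performance Ratio calculation, read from
# the description above: the response of unit output prices to unit
# input costs, expressed as a ratio of percentage changes.
def ppr(price_old, price_new, cost_old, cost_new):
    price_change = (price_new - price_old) / price_old
    cost_change = (cost_new - cost_old) / cost_old
    return price_change / cost_change

# Example: unit input costs rise 10% but the firm raises prices only
# 5%, passing productivity gains through to customers: PPR = 0.5.
print(ppr(price_old=100, price_new=105, cost_old=50, cost_new=55))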

The sequels to this article

The business management implications of this article are immense, and the next article in this sequence will describe the appropriate business rules that maintain the required levels of coherence between the optimization of corporate returns and the maximization of feasible growth in the economy as a whole.

Linked to this, another article will cover "beneficial work contract frameworks" which sustain the coherence between business rules and macroeconomic objectives while securing the mutual benefit of workers and business owners.


Driving productivity and competitive pricing

The tendency for Price Performance Policy to encourage lower unit prices has the benefit of raising consumption as a result of the price elasticity of consumption. This increases production throughput, shortening the time taken to reach higher levels of cumulative production and thereby increasing the impact of the learning curve. This in turn results in lower costs of production and lower feasible unit output prices. The setting of unit prices will depend upon the price elasticity of consumption. Because the PPR can be reduced through incremental rises in investment in technology and human resources, innovation-driving investment is encouraged.
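
A minimal simulation of this feedback loop, under invented illustrative parameters (an 80% learning curve, a price elasticity of -1.5 and a constant markup passing cost reductions through to prices):

import math

# Illustrative feedback loop: lower prices -> higher demand (via price
# elasticity) -> faster cumulative production -> learning-curve cost
# reductions -> still lower feasible prices. All parameters invented.
ELASTICITY = -1.5                  # price elasticity of consumption
B = math.log2(0.80)                # 80% learning curve exponent

price, base_demand, cumulative = 100.0, 1000.0, 1000.0

for period in range(1, 6):
    demand = base_demand * (price / 100.0) ** ELASTICITY
    cumulative += demand
    unit_cost = 80.0 * (cumulative / 1000.0) ** B
    price = unit_cost * 1.25       # constant 25% markup pass-through
    print(f"period {period}: demand={demand:7.0f} "
          f"cost={unit_cost:6.2f} price={price:6.2f}")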

The Production, Accessibility & Consumption (PAC) Model of the Economy

The general result, in terms of economic growth, productivity trends and real incomes, can be explained transparently by the Production, Accessibility & Consumption Model of the Economy (See: The PAC Model of the Economy), as opposed to the inexact and wanting Aggregate Demand Model (ADM) that is applied as the foundation for conventional economic theory and policy practice.


1 Hector McNeill is the director of SEEL - Systems Engineering Economics Lab.
2 McNeill, H.W., "Business Process Systems Performance Guidelines", SEEL, 2015.
3 George Elton Mayo (1880–1949) was a psychologist, industrial researcher, and organizational theorist. He was born in Australia.
4 Belkaoui, A., "The Learning Curve - A Management Accounting Tool", Quorum Books, 1986.
5 Hirschmann, W. B., "Learning Curve", Chemical Engineering, Volume 71, No. 7, 1964, pp. 95-100.

Update: 9th September, 2015: alterations to enhance clarity; sense maintained.

All content on this site is subject to Copyright
All copyright is held by © Hector Wetherell McNeill (1975-2015) unless otherwise indicated
