The Modeling Of Business Processes


02 Nov 2017


"Experience is an expensive school".

Benjamin Franklin.


A large part of modern business economics treats businesses as complex dynamic systems in which flows of materials, people, financial resources and information intersect, and in which different decision makers interact and influence each other. These systems receive stimuli from the environment in which they operate and exploit the strategic options that emerge. Companies, in turn, affect the environment, modifying the structure of their sectors with their innovations. Current management studies are focused in this direction and take inspiration from this idea to seek new frontiers in the strategic management of businesses (Mollona, 2000). "The research of competitive advantage as a result only of positioning decisions or the choices of resources and skills in which to invest is no longer sufficient" (Lodhi & K., 2011). The foundations on which companies' competitive advantages rest are changing: resources and skills erode if they are not continually updated, and sectors and business processes undergo radical metamorphosis and coevolution. Every decision belongs to a more or less articulated system in which the effect of a choice not only stimulates other choices but generates feedback, changing the situation in which the decision-making process takes place.

Mollona (2000) suggests that studies of business administration and management have been productive in delineating a large body of articles and conceptual tools that support managers in shaping the content of their decisions. Less fertile, and relatively recent, is the commitment to developing tools that support managers' decision-making by developing their ability to generate interpretative models of a systemic type, through which individual decisions can be placed in a broader context. "It is through the understanding of the systemic structure of decision-making that one can evaluate the dynamic consequences of decisions and predict the evolutionary phenomena that stem from them." [1] (Sterman J. D., 2000)

By synergistically exploiting the contributions of business administration, of systems theory, and of the school of studies developed at the Massachusetts Institute of Technology at the end of the 1950s known as System Dynamics [2], a new framework for the analysis, control and design of business systems has emerged in recent years. The improvement and implementation of this systemic approach to business issues will lead to a new way of understanding business administration and, especially, the management sciences.

A model of System Dynamics, as recalled by Richmond (1993) "is a reasonable abstraction of the observed system, which is a theory of the behaviour of that system."

This first definition already helps us understand the perspective of this approach, oriented towards the model's ability to explain, rather than predict, the observed phenomena. In this sense it is closer to the field of business administration. In decision support, however, it is predictive capacity that is most often taken as the reference, both because of the statistical-econometric origin on which such support is based and because of the univocal interpretation of the resulting models. More specifically, in system dynamics the main focus is on the internal consistency of the model and on its ability to shed light on how the interaction between variables explains the phenomenon observed; in econometric models, by contrast, the emphasis is placed on the ability of a model to reproduce the historical data available, verifying the statistical relationship between the time series of data available for a specific variable and the range of values simulated by the model.

The basic idea of this work is precisely that it is possible to improve the ability of system dynamics models to predict the phenomena under study, and thus increase their suitability for use in decision-making. This can be done by reviewing the deterministic logic that characterizes them and by introducing some probabilistic-econometric tools both in the design and in the validation of the models. With the aim of exploiting the huge potential that these models have for explaining phenomena, we will endeavour to make them easier to validate and to make their results "statistically significant".

"The greatest constant of modern times is change".

- John D. Sterman.

Why is it necessary to model?

One of the most significant changes that have occurred as a result of the great revolutions of the last centuries is certainly the exponential acceleration of the evolution of our environments, especially the social and economic ones. The pace of these changes may lead to improvement rather than deterioration, but above all it puts to the test the institutions, practices and beliefs that already exist.

Accelerating economic, technological, social, and environmental change challenges managers to learn at increasing rates, while at the same time the complexity of the systems in which we live is growing. Many of the problems we now face arise as unexpected side effects of our own past actions. All too often the strategies we implement to resolve problems fail, make the problem worse, or create new problems (Sterman J. D., 2000). Implementing and learning successful policies in a world of growing dynamic complexity requires managers and decision makers to develop systems thinking, which, as explained by Sterman, is the ability to see the world as a complex system, in which we understand that "you can’t just do one thing" and that "everything is connected to everything else".

The origins of systems thinking are still much debated; it is believed to have emerged and established itself as a trans-discipline in the 1940s and early 1950s. Systems thinking helps people understand problems as systems and find solutions by identifying the root causes and considering all of them as a whole system, thereby improving their understanding of planning systems (Batra, 2010). Systems theory is the one metaphor that highlights the relationships and interconnections among the biological, ecological, social, psychological, and technological dimensions of our increasingly complex lives (Hammond, 2003).

Systems thinking makes it possible to overcome one of the biggest problems that hinder the success of policy makers’ strategies: policy resistance. Using the thought of Sir Thomas More we can identify this problem more precisely:

"And it will fall out as in a complication of diseases, that by applying a remedy to one sore, you will provoke another; and that which removes the one ill symptom produces others..." (More, 1515)

It has long been acknowledged that people seeking to solve a problem often make it worse. Our policies may create unanticipated side effects. Our attempts to stabilize the system may destabilize it. Our decisions may provoke reactions by others seeking to restore the balance we upset (Sterman J. D., 2000). These dynamics often lead to policy resistance, the tendency for an intervention to be delayed or defeated by the response of the system to the intervention itself. [3]

And it is precisely to overcome this problem that the system dynamics view makes use of feedback. As explained by Sterman (2000), much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. You might imagine that there is an immense range of different feedback processes and other structures to be mastered before one can understand the dynamics of complex systems. In fact, the most complex behaviours usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves. All dynamics arise from the interaction of positive (or self-reinforcing) and negative (or self-correcting) loops (Forrester, 1971 a). [4] From the beginning, system dynamics emphasized the multi-loop, nonlinear character of the feedback systems in which we live (Forrester, 1961). The choices of any one manager form but one of many feedback loops that operate in any given system. These loops react to the decision maker’s actions in ways both expected and unexpected. Often the loops are imagined as instant, linear feedbacks that yield stable convergence to an equilibrium or optimal outcome, just as direct visual feedback lets you fill a glass of water without spilling. The real world is not so simple. Natural and human systems have high levels of dynamic complexity.

This combination of factors helps us better understand why a field of study was created for the dynamic complexity of systems. Loops, feedbacks and delays are only some of the tools used in the implementation of a system dynamics model. [5]

Systems thinking in the companies

Now is the time to understand why it is useful to use the concept of a system to study the behavior of companies. To fully understand the meaning of a theory of business systems, it is useful to reason about the relationship between the concept of the system and the object of study: the company.

A company is not a system: it is an entity that, for some of its key features, can be studied as a system or via "system models". We use the concept of system and build systemic theories of the operation of a company because we believe that its behavior has features that cannot be understood by the mere analysis of its elements taken individually, but should instead be studied as properties of the relations between the various elements (Mollona, 2000).

SD, and the "General Systems Theory" of von Bertalanffy (1969) to which it is connected, are in fact the first attempts to derive a theory, a «logical pattern of thought» or a hypothetical-deductive system, independent of its interpretation in terms of empirical phenomena but applicable to all empirical fields concerned with systems. The company has a systemic nature because it is composed of several interacting elements. If we think of a company, in fact, we imagine an aggregation of people, machines and economic flows. Among these elements, which are interconnected via transactions and activities, there are many relations of cause and effect. The use of systemic models to study the problems of corporate governance is especially rooted in the tradition of Italian business studies. [6] A similar vision, also derived from system dynamics studies, has become established in the Anglo-Saxon corporate literature as well.

The underlying assumption behind the dynamic analysis of business systems is that the behaviour or dynamic manifestations of companies can be explained by analysing their systemic structure. In other words, the company is seen as a system in which a number of variables affect each other in a dynamic way. The systemic structure, therefore, can be described at this point as a series of cause-effect concatenated circuits, that connect a set of resources to a set of activities, to collect information, decisions and actions. The structure of this aggregate of circuits becomes, in dynamic analysis of enterprise systems, the key to interpret the range of dynamic processes that are typical of companies. The fundamental problem underlying the analysis of dynamic enterprise systems thus becomes the identification of the structure of concatenated circuits responsible for a dynamic phenomenon (Forrester, 1961).

The System Dynamics approach to business process modeling.

One of the key assumptions that inspired this work is that the SD approach to modeling is particularly suited to providing a set of methods for the representation and study of companies. This belief finds support in the fact that SD was created by scholars whose object of study was large industrial enterprises, and in the fact that the discipline is firmly rooted in the management and organizational literature. The following section recalls some concepts that are the basis of the subsequent discussion.

How does SD work?

Forrester (1961) and Sterman (2000) explain that the idea inspiring the System Dynamics methodology is to use elementary circuits as basic concepts with which to generate complex systemic representations. In other words, the elementary circuits can be regarded as simple symbols that can be used to produce rich but understandable representations.

The basic elements from whose union the structure of a dynamic system originates are feedback loops. Locating and representing these circuits, and the configuration of their concatenation, is a crucial step in interpreting the behavior of the system. A feedback loop can be defined as a relationship between two or more variables such that the action of one variable provokes the reaction of the linked variables. Each variable included in a specific feedback loop may be part of several feedback circuits simultaneously. It is precisely through these variables, which play the role of connecting circuits, that more or less complex systemic architectures are formed (Forrester, 1961).

The concept of the feedback loop is essential for conducting a thorough analysis of the behaviour of companies. In fact, seen through systems thinking, many business events appear as the manifestation of a system that connects activities, operations and processes (Mollona, 2000). To use feedback loops as an interpretive tool for the dynamic behaviour of a system, we have to introduce the concept of polarity; more specifically, three types of polarity: the polarity of the causal relationships between variables, the polarity of the feedback loop, and the polarity of the system as a whole. The polarity of a relationship between two variables is indicated by a "+" sign (if positive) or a "-" sign (if negative) on the arrow that indicates how one variable affects a second variable. [7]

To explain the polarity of a feedback loop it is necessary to distinguish the positive loop from the negative one. Given a variable inserted in a feedback loop, if a movement of the variable in a certain direction produces a response of the system on that variable which amplifies the movement, we say that the variable is inserted in a positive feedback loop. The positive feedback loop gives rise to a process of reinforcement and is indicated by the "+" sign.

To connect this to our topic: in economics, for example, some investment decisions have increasing marginal returns, with subsequent marginal investments characterized by increased productivity. These are called "self-reinforcing mechanisms" because the efficacy of the investments is self-reinforcing.
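As an illustration of such a reinforcing loop, the following sketch (with purely hypothetical numbers) simulates capital that compounds through reinvested returns; each increment of capital raises the base on which the next return is earned.

```python
# A minimal sketch of a self-reinforcing (positive) feedback loop:
# more capital -> more return -> more capital. Illustrative numbers only.

def simulate_reinforcing_loop(initial_capital=100.0, return_rate=0.10, years=5):
    """Simulate capital that compounds through reinvested returns."""
    capital = initial_capital
    history = [capital]
    for _ in range(years):
        capital += capital * return_rate  # the reinforcing link
        history.append(capital)
    return history

trajectory = simulate_reinforcing_loop()
# The yearly increments themselves grow over time, the hallmark of a
# positive loop: the trajectory is exponential, not linear.
```

Note that nothing in the structure limits the growth; in a fuller model a balancing loop (e.g. diminishing returns) would eventually dominate.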

If, instead, the loop is a negative feedback loop and the system is in equilibrium, the stimuli that disturb this situation will be absorbed and the dynamic behavior of the system will return to the initial state of equilibrium. The presence of a negative feedback loop ensures that the system remains in equilibrium even in the presence of stimuli external to the system itself. It gives rise to a process of self-regulation and self-balancing and is indicated with the "-" sign.

A useful shortcut for determining the polarity of a feedback loop is to count the number of "-" signs among the causal relationships between its variables. If the number of "-" signs is even, the loop is positive; if it is odd, the loop is negative. [8]
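This sign-counting rule is easy to mechanize. The sketch below (with hypothetical loops) encodes each causal link as +1 or -1 and reads the loop's polarity off the parity of the negative links.

```python
# A sketch of the sign-counting rule: a loop's polarity follows from
# the parity of its negative causal links.

def loop_polarity(link_signs):
    """Return 'positive' if the count of negative links is even, else 'negative'."""
    negatives = sum(1 for s in link_signs if s < 0)
    return "positive" if negatives % 2 == 0 else "negative"

# Price -> (+) Supply -> (-) Price: one negative link, a balancing loop.
print(loop_polarity([+1, -1]))   # negative (balancing)
# Sales -> (+) Word of mouth -> (+) Sales: no negative links, reinforcing.
print(loop_polarity([+1, +1]))   # positive (reinforcing)
```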

At the level of the evolutionary dynamics of companies, the feedback loop provides a tool for conceptualizing patterns of adaptation and response to environmental stimuli. In this sense, one can think that all decision-making processes that tend to restore the system to a desired goal state constitute an essential component of a self-regulating mechanism that tends to keep the company close to its goals, in spite of "environmental perturbations" (Mollona, 2000).

To give meaning to the situation described in a feedback loop, one reconstructs, perhaps unconsciously, a temporal order. It therefore becomes implicit and natural to think that the variables included in a feedback loop influence one another in a certain order of precedence, giving rise to a phenomenon that unfolds along a temporal horizon. The representation of feedback loops only makes sense if the temporal dimension is introduced into the analysis. In this process there is a specific order; the actions are not simultaneous. Some variables send information to other variables, and the latter transform that information into action intended to change the state of the system. To represent a circuit with information feedback, in fact, we introduce the notions of stock variable (or level variable) and flow variable. The stock variables represent the state of the system. The flow variables collect the information arising from the stock variables and use it to change the state of the latter (Richardson, 1995). Stock and flow diagrams show more about the process structure than causal loop diagrams do (Kirkwood, 1998).
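A minimal stock-and-flow sketch may make the distinction concrete. In the hypothetical inventory model below, the stock holds the state of the system, while the flow (the ordering decision) reads that state and changes it only incrementally, step by step, forming a balancing loop around a goal.

```python
# A minimal stock-and-flow sketch (hypothetical numbers): inventory is the
# stock; the ordering decision is the flow that reads the stock's state
# and acts to close the gap with a desired level.

def simulate_inventory(desired=100.0, initial=20.0, adjustment_time=4.0,
                       dt=1.0, steps=30):
    inventory = initial           # stock variable: the state of the system
    history = [inventory]
    for _ in range(steps):
        # Flow (decision function): order in proportion to the remaining gap.
        orders = (desired - inventory) / adjustment_time
        # Stock update: the state changes only by accumulating the flow.
        inventory += orders * dt
        history.append(inventory)
    return history

path = simulate_inventory()
# The stock rises gradually toward the goal and levels off, rather than
# jumping there instantly: goal-seeking (negative feedback) behaviour.
```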

Another important concept in System Dynamics models is the delay. The analysis of time delays is essential for understanding the behavior of business systems in particular. Time delays, separating actions and consequences, make it difficult to interpret the relationship between cause and effect. A time delay can be considered a process that converts an input into an output. To simplify the use of delays, System Dynamics uses the concept of the stock variable. As is known, stock variables do not change instantaneously but change incrementally as a result of successive accumulations or erosions of the quantity they contain. A time delay can therefore also be thought of as related to the existence of a stock variable that separates the input from the output.
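The idea that a delay is just a stock separating input from output can be sketched as a first-order material delay (delay time and inflow below are illustrative): the in-transit stock accumulates the input and releases it gradually, so a one-period pulse of input emerges as output spread over several later periods.

```python
# A sketch of a first-order material delay: the in-transit stock separates
# input from output, and the output emerges gradually with average delay D.

def first_order_delay(inflow, delay_time=3.0, dt=1.0):
    """Pass a sequence of inflow values through a first-order delay."""
    in_transit = 0.0               # the stock that creates the delay
    outflows = []
    for x in inflow:
        outflow = in_transit / delay_time   # stock drains gradually
        in_transit += (x - outflow) * dt    # stock accumulates the difference
        outflows.append(outflow)
    return outflows

# A one-period pulse of 30 units spreads out over several later periods.
out = first_order_delay([30] + [0] * 9)
```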

The use of SD in companies.

SD models are homologues of business systems; that is to say, there is a homology of logical character between the elements that make up an SD model and the elements that constitute the structure of companies. This homology consists in the fact that the logic and principles that explain the behavior of dynamic systems also apply to the explanation of the behavior of companies, the latter described as dynamic systems (Mollona, 2000).

Mollona continues by saying that, in the explanation of behavior, there exists a relation of isomorphism between system and company which ensures that the feedback circuit is a fundamental concept for explaining movement both in systems and in companies.

However, to bring the feedback circuits to light it is necessary to understand whether, and in what way, the structure of business systems can be translated into the elementary structures that constitute feedback circuits: the flow variables, the level variables, the decision-making functions, and the information network that connects the level variables to the decision functions. In companies, too, there are elements that, like level variables, represent stock quantities. Companies, in the course of their activity, accumulate resources of various types: they hire employees, buy raw materials and machinery, collect financial resources and build knowledge. We therefore include both tangible and intangible variables.

If we take these concepts for granted, we can easily understand that every aspect of corporate life can be modeled by identifying the stock variables and their interactions with the flow variables, as well as the influences between the variables and the delays discussed earlier. Business process modeling is a huge field of study, and such models are widely used in the study of companies. [9] Grasl (2008) suggests that a business model that includes all the parts of a company can define how it adds value for all the actors within its value network. It shows which channels a firm provides to connect the actors in the product and factor markets and which transactions it supports or enables via these channels. It also identifies the resources and capabilities it needs to support these transactions, and the costs incurred in doing so. It explicitly states the business policies that govern the channels and transactions it supports and the development of resources and capabilities needed to create the products or services it sells, and how these policies are connected to each other. It also states the assumptions that are made about how a firm will perform in its market.

But the question that, more than any other, is worth considering in this context is: why use these models in companies?

We have already investigated the reasons that push us to model systems, but probably the one that most interests managers is the possibility of making forecasts.

The use of forecasts in business is widespread. "Estimates of future demand and performance are essential for many business decisions. Most companies devote significant effort to estimating future demand for their products, and to the consequences of that demand on business performance" (Forrester, 1961).

Forecasts in companies serve many purposes in support of business management. One of these, for example, is to set annual goals for each corporate function and periodically analyze the deviations from the current situation, thus enabling the formulation of immediate corrective strategies; this is the so-called budgeting process.

But there is an "unwillingness in the System Dynamics community to encourage the use of system dynamics models for forecasting. In part, this may be a reaction to the problems with the use of forecasts by businesses." (Barlas Y. , 1996)

In the first place, forecasts are likely to be wrong. "Inaccuracies in forecasts of economic growth and inflation are widely documented in the business press. While some of this error can be attributed to inaccurate or overly simplistic models, as Forrester clearly demonstrated even an accurate model can produce forecasts that diverge from reality." (Lyneis, 1998) "Random elements impinging on a system affect the point behaviour of an oscillatory system, and differences in the "noise" streams can quickly produce significant differences in behaviour. Since we cannot predict the random inputs, we cannot predict the behaviour of the system." (Forrester, 1961)

Second, "forecasts are a part of a system’s decision structure, and therefore can contribute to problematic behaviour. The adverse consequences which often befall businesses and industries as a result of decisions taken on the basis of inaccurate demand forecasts are less widely documented than forecasting inaccuracies, though still common." (Lyneis, 1998).

In addition to inaccuracies and potential misuse, the reluctance to use system dynamics models for forecasting may also result from a desire to shift managerial emphasis to understanding and policy design. Lyneis (1998) explains that "the business will inevitably use assumptions about the future as a basis for most decisions, even if only the "naïve" forecast of assuming the future will be like the past." [10]

Some academic writers, however, believe that "the proper use of system dynamics models can improve the existing forecast techniques" (Lyneis, 1998). But, as suggested by Meadows (1980), "The output of the model shouldn’t be seen as predictions of particular quantitative variables in certain years but to understand in advance the qualitative characteristics of the behaviour".

The practices most widely used to predict business phenomena are the statistical-econometric ones, but they may not be the most efficient; indeed, examples of misleading forecasts are countless. The idea of this study is, as we said, to merge the deep problem-understanding capacity of SD models with the solidity and flexibility of econometric practices, in order to improve decision-making processes in companies. [11]

The deterministic nature of System Dynamics.

We mentioned that the attempt to use system dynamics models to help make predictions is controversial. One of the most important problems lies in the very nature of SD models and in the aim for which they were created.

The field of system dynamics (Forrester, 1961; Sterman J. D., 2000) makes two simplifying assumptions: "Flows within processes are continuous, and they do not have a random component. By continuous flows, we mean that the quantity which is flowing can be infinitely finely divided, both with respect to the quantity of material flowing and the time period over which it flows. By not having a random component, we mean that a flow will be exactly specified if the values of the variables at the other end of information arrows into the flow are known. A variable that does not have a random component is referred to as a deterministic variable." (Kirkwood, 1998)

Clearly, the continuous flow assumption is not exactly correct for many business processes: "You can't divide workers into parts, and you also can't divide new machines into parts. However, if we are dealing with a process involving a significant number of either workers or machines, this assumption will yield fairly accurate results and it substantially simplifies the model development and solution. Furthermore, experience shows that even when quantities being considered are small, treating them as continuous is often adequate for practical analysis." (Kirkwood, 1998)

The assumption of no random component for flows is perhaps even less true in many realistic business settings. "But, paradoxically, this is the reason that it can often be made in an analysis of business processes. Because uncertainty is so widely present in business processes, many realistic processes have evolved to be relatively insensitive to the uncertainties." (Forrester, 1971 a) Because of this, "the uncertainty can have a relatively limited impact on the process. Furthermore, we will want any modifications we make to a process to leave us with something that continues to be relatively immune to randomness. Hence, it makes sense in many analyses to assume there is no uncertainty, and then test the consequences of possible uncertainties." (Sterman J. D., 1983)
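The practice described here, assuming determinism first and then testing the consequences of uncertainty, can be sketched on a simple goal-seeking model (all numbers hypothetical): the same structure is run once with deterministic flows and once with a random disturbance added to the flow, and the endpoints are compared.

```python
# Sketch: assume no uncertainty, then test the consequences of possible
# uncertainties. Hypothetical goal-seeking stock-and-flow structure.

import random

def simulate(desired=100.0, initial=20.0, adjustment_time=4.0,
             steps=40, noise_sd=0.0, seed=1):
    rng = random.Random(seed)
    stock = initial
    for _ in range(steps):
        flow = (desired - stock) / adjustment_time
        if noise_sd > 0:
            flow += rng.gauss(0.0, noise_sd)  # random disturbance on the flow
        stock += flow
    return stock

deterministic = simulate()          # baseline: no random component
noisy = simulate(noise_sd=2.0)      # same structure, disturbed flows
# The balancing loop absorbs the noise: both runs end near the goal of 100,
# illustrating why a deterministic baseline is often an adequate start.
```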

Practical experience indicates that "with these two assumptions, we can substantially increase the speed with which models of business processes can be built, while still constructing models which are useful for business decision making." (Kirkwood, 1998)

Validation of System Dynamics models.

Validation is one of the fundamental steps in the use of models: it "is the confirmation by examination and provision of objective evidence that the particular requirements for the intended use are fulfilled." It is an important and controversial issue for every type of model, and for system dynamics in particular.

Validity of the results in a model-based study "are crucially dependent on the validity of the model. In some important ways, the question of validating causal-descriptive (e.g. system dynamics) models has strong ties with philosophy of science issues." (Mollona, 2000) A system dynamics model "is rebutted if a critic can show that a relationship in the model conflicts with an established "real relationship", even if the output behaviour of the model matches the observed system behaviour". For such models, "validity primarily means validity of the internal structure of the model, not its output behaviour. It can be said that a valid system dynamics model embodies a theory about how a system actually works in some respect." (Barlas Y. , 1996) Therefore, there has to be a "strong connection between how theories are justified in the sciences (a major and unresolved philosophy of science question) and how such models are validated. This means that our conception of model validity depends on our philosophy (implicit or explicit) of how knowledge is obtained and confirmed." (Barlas Y. , 1996)

The assumption of deterministic components in SD models increases the difficulties associated with validation: the use of abstract variables that cannot be observed empirically; the lack of an empirical base and of an estimation procedure in the selection of the parameters; and the use of arbitrary functions, which, in the words of Mollona (2000), "encourages the inclusion of hidden arbitrary assumptions which are, at best, validated by plausibility arguments". The scholars of System Dynamics have doggedly replied to these criticisms in two ways.

On the one hand, it was emphasized that a qualitative approach to the validation of models does not imply a lack of rigour. This first group of contributions focused mainly on the differences between the scientific paradigm that underlies a major part of economic studies and the scientific paradigm that inspires the pragmatic-relativist approach of System Dynamics.

Second, an attempt was made to conform to common logic and language by introducing into the model validation process a greater emphasis on statistical techniques and empirical tests. In addition, as Wittgenstein says, in the holistic approach to scientific research the acceptability of a theory depends on its overall characteristics.

In the system dynamics validation process, the comparison between real data and simulated data means something different from the investigations carried out with the statistical analysis of time series. In SD, when we speak of "prediction" or, more generally, of the comparison between simulated and historical data, we mean a so-called "qualitative comparison". The logic with which historical data are used in System Dynamics models is therefore very different from that which inspires econometric models. The use of statistical tests in econometric models aims to understand whether a particular model is able to explain the historical data available. In particular, the analysis focuses both on estimating the values of the parameters and on estimating the statistical significance that the parameters have in explaining the observed data. In System Dynamics, by contrast, rather than using statistical tests and calculating confidence intervals, one analyzes the graphs of the data series (historical and simulated) and compares key features: trends, cycles, periods of oscillation, and the timing of maximum and minimum points.

Another reason why statistical tests are not frequently used in the validation of System Dynamics models lies in the fact that their use is often very complex compared to the real benefits. As explained by Barlas (1996), classical hypothesis tests, for example the F-test, the t-test or the χ2 test, are based on assumptions that are typically violated in System Dynamics models: normal distribution of errors, independence and stationarity. Meadows (1980) adds that "the non-linear structure that characterizes the feedbacks normally found in System dynamics models makes the standard statistical analysis often inapplicable or extremely complicated." One approach could be to remove unwanted features of System Dynamics models, such as autocorrelation and non-stationarity, and then apply standard tests to the transformed variables. However, as suggested by Barlas (1996), "... these undesirable characteristics are precisely the elements of behavior that system dynamics models attempt to reproduce." Very often, therefore, qualitative tests are preferred for this type of model, but the problem remains of communicating the accuracy and rigor of the qualitative tests used. For this reason, some scholars of system dynamics (such as Barlas, Sterman and Peterson) [12] have tried to support, not replace, qualitative tests with some statistical tests.
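A qualitative, feature-based comparison of the kind described above can itself be made explicit. The sketch below (with two hypothetical series) compares the timing of the peak and the overall trend of a historical and a simulated series instead of running a classical test.

```python
# A sketch of "qualitative comparison": compare behavioural features of a
# historical and a simulated series (peak timing, overall trend) rather
# than applying a point-by-point statistical test. Data are hypothetical.

def peak_time(series):
    """Index at which the series attains its maximum."""
    return max(range(len(series)), key=lambda i: series[i])

def trend(series):
    """Crude trend indicator: sign of last value minus first."""
    diff = series[-1] - series[0]
    return "rising" if diff > 0 else "falling" if diff < 0 else "flat"

historical = [10, 14, 19, 25, 22, 18, 15]
simulated  = [11, 13, 20, 27, 24, 17, 14]

same_peak  = peak_time(historical) == peak_time(simulated)   # peaks coincide?
same_trend = trend(historical) == trend(simulated)           # same direction?
# Matching features lend qualitative support to the model's structure even
# when point-by-point errors would fail a classical significance test.
```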

Statistical methods to support System Dynamics models.

For problems of parameter choice and validity, system dynamics users have usually relied on "manual" examination of the detailed structure of the model. The realism of both parameter values and model structure is assessed and improved by repeated simulation experiments. If a simulation experiment reveals something surprising or wrong, the modeller asks why, seeks the answer by examining the model structure, and tests the answer with new simulations. This informal procedure of model inspection and simulation is one of the great strengths of the system dynamics methodology. If the modeller proceeds with diligence and thoroughness, the model is greatly improved over its "first cut" form, and the modeller gains a deep understanding of the system being modelled. Numerical data contribute to the process, but usually only when the implications are obvious by inspection. While econometric methods are sometimes employed by system dynamics users, such use is infrequent. (Peterson, 1980)

System Dynamics vs. statistical and econometric methods of modelling.

Models can be categorized in many different ways, "according to different benchmarks, such as physical or symbolic; dynamic or static; deterministic or stochastic, etc. As it refers to the notion of validity, a crucial distinction must be carried out between models that are "causal-descriptive" (white-box) and models that are simply "correlational" (purely data-driven, "black-box")." (Barlas, 1996) In purely correlational (black-box) models, since there is no claim of causality in structure, "what matters is the aggregate output behaviour of the model; the model is assessed to be valid if its output matches the "real" output within some specified range of accuracy, without any questioning of the validity of the individual relationships that exist in the model." (Barlas, 1996) This type of "output" validation can often be cast as a classical statistical testing problem. "Models that are built primarily for forecasting purposes (such as time-series or regression models) belong to this category. On the other hand, causal-descriptive (white-box) models are statements as to how real systems actually operate in some aspects. In this case, generating an "accurate" output behaviour is not sufficient for model validity; what is crucial is the validity of the internal structure of the model. A white-box model, being a "theory" about the real system, must not only reproduce (or maybe forecast) its behaviour, but also explain how the behaviour is generated, and possibly, suggest ways of changing the existing behaviour." (Barlas, 1996) System dynamics models, and all design-oriented models in general, fall into this category. Such models are built to assess the effectiveness of alternative policies or design strategies in improving the behaviour of a given system. This is only possible if the model has an internal structure that adequately represents those aspects of the system which are relevant to the problem behaviour at hand.
In short, it is often said that a system dynamics model must generate the "right output behaviour for the right reasons." (Barlas, 1989)

Econometrics is defined as the use of statistical methods to verify and quantify economic theory. A set of theoretical relationships that has been verified and quantified for a particular economic system constitutes an econometric model of that system. The model can be used for structural analysis, for forecasting, or for testing the effects of policy alternatives.

The field of econometrics combines tools and concepts from the two older fields of statistics and economics. Therefore it shares aspects of both those paradigms, as well as adding its own special perspectives to the world-view of its practitioners. Statistical economics developed in the 1930’s as a result of rising interest in the quantitative behaviour of national economic variables. Much theoretical and practical work had already been done by the early 1950’s, when the development of the computer permitted a great expansion in the scope and complexity of econometric models. The dominating characteristic of the econometric paradigm is its reliance on statistical verification of model structure and model parameters. Econometricians are forced by their paradigm to tie their models firmly to statistical observations of real world systems. The formulation of an econometric model may be divided theoretically into two sequential phases, specification of structure from economic theory, and estimation of parameters by statistical analysis. The second phase is the centre of concern, however, occupying most of the modeller’s time and attention and most of the pages in econometric textbooks and journals. "To some extent the mathematical and data requirements of the estimation phase enter into the specification phase as well." (Meadows, 1980)

The information base from which an econometrician can draw his model structure is the same one underlying system dynamics or any other modelling technique: abstractions, intuitions, personal experiences, statistical data, established wisdom, experimentation, and guesswork. In practice, most econometricians are attracted to questions about the precise, short-term values of economic variables. They find most of the concepts they need in traditional economic theory. They tend to make only limited use of theories from other disciplines, and when they do, their bias tends to be as much towards the social sciences as that of system dynamicists is towards the physical sciences. No special distinction is made between the properties of physical and information flows in econometric models. The underlying economic theory from which econometrics is drawn is much richer in static concepts than in dynamic ones, perhaps because much of the theory was developed before computer simulation allowed dynamic analysis of complex, non-linear systems. Although many econometric models are dynamic, they maintain their parent field's emphasis on optima and equilibria rather than on dynamic characteristics. Furthermore, the relatively short-term focus of many economic problem statements means that analysts often need not take into account feedback processes with long time delays.

When two-way causation does appear in econometric models, it is typically represented by means of simultaneous equations. The simultaneous-equation formulation is equivalent to assuming that system equilibrium will occur within one calculation interval. Although most econometric models contain simultaneous-equation formulations and are driven dynamically by exogenously forecasted variables, many models also contain some feedback through lagged endogenous variables. These formulations are not essentially different from those in system dynamics models. The distinction between the two approaches is one of relative emphasis, not absolute contrast.

The variables that can be included in econometric models are restricted to a subset of all conceivable elements, because of the necessity for statistical validation. That requirement tends to eliminate the inclusion of most of what system dynamicists call the information components of any system.

The parametric methods (like almost all econometric ones) used to solve univariate and multivariate problems have, as a limitation, the need to introduce very restrictive hypotheses: often unjustified, if not impossible to justify, unrealistic, not always clear, difficult to interpret, and made ad hoc in order to do inference. To this must be added that the assumptions that make the application of these methods valid (normality, homoscedasticity, independence and identical distribution of the stochastic error component) are rarely fulfilled in practice and, even when fulfilled, the results are often obtained only through approximation.

At the same time, the greatest strength of the econometric paradigm is its insistence on continuous, rigorous checking of theoretical hypotheses against real-world data. This strength leads, however, to two problems already noted: the statistical methods used for estimation impose artificial restrictions on the initial formulation of the model, and the data necessary for proper verification are seldom available. The mathematical requirements of estimation cause econometricians to represent economic systems as linear, mostly simultaneous relationships connecting a few aggregate economic variables by means of historically observed coefficients. "A system dynamicist's bias causes me to suspect that real economic systems are nonlinear, multivariable, time-delayed, disaggregate, and ecological-socio-economic, and they may respond to policy decisions in ways that are not represented in historical data." (Meadows, 1980)

As we have seen in previous sections, another fundamental difference between econometric models and System Dynamics lies in validation. We have seen the great problems that system dynamics models face in this field, but in the words of Meadows (1980), "although econometrics techniques include a number of sophisticated statistical validity tests, establishing confidence in a model's output is as difficult and uncertain in this modelling school as it is in the others."


Statistical tools for System Dynamics.

This section is intended to be only a summary of existing techniques. [13] In general, we can distinguish five stages in the process of modelling: problem formulation; conceptualization; formulation of the mathematical model; evaluation and analysis of the model; analysis of policies and behaviours. (Richardson, 1995)

For each of these stages, statistical tools have been developed to support and implement the creation of a system dynamics model; the process most often supported is parameter estimation, which belongs to the middle stages of the modelling process.

Table: Tests for building confidence in System Dynamics models. Adapted from Sterman (1983). Each test is listed with the question it addresses.

Tests of Model Structure

Structure Verification: Is the model structure consistent with relevant descriptive knowledge of the system?
Parameter Verification: Are the parameters consistent with relevant descriptive knowledge of the system?
Extreme Conditions: Does each equation make sense even when its inputs take on extreme values?
Boundary Adequacy (Structure): Are the important concepts for addressing the problem endogenous to the model?
Dimensional Consistency: Is each equation dimensionally consistent without the use of parameters having no real-world counterpart?

Tests of Model Behaviour

Behaviour Reproduction: Does the model endogenously generate the symptoms of the problem, behaviour modes, phasing, frequencies, and other characteristics of the behaviour of the real system?
Behaviour Anomaly: Does anomalous behaviour arise if an assumption of the model is deleted?
Family Member: Can the model reproduce the behaviour of other examples of systems in the same class as the model?
Surprise Behaviour: Does the model point to the existence of a previously unrecognized mode of behaviour in the real system?
Extreme Policies: Does the model behave properly when subjected to extreme policies or test inputs?
Boundary Adequacy (Behaviour): Is the behaviour of the model sensitive to the addition or alteration of structure to represent plausible alternative theories?
Behaviour Sensitivity: Is the behaviour of the model sensitive to plausible variations in parameters?
Statistical Character: Does the output of the model have the same statistical character as the "output" of the real system?

Tests of Policy Implications

System Improvement: Is the performance of the real system improved through use of the model?
Behaviour Prediction: Does the model correctly describe the results of a new policy?
Boundary Adequacy (Policy): Are the policy recommendations sensitive to the addition or alteration of structure to represent plausible alternative theories?
Policy Sensitivity: Are the policy recommendations sensitive to plausible variations in parameters?

Test of parameters

The parameters in system dynamics models "are typically estimated in a one-time fashion by taking advantage of every source of information at our disposal. The information sources may range from the hard to the soft, as depicted in the information spectrum in Figure . Hard sources include physical laws and the results of controlled experiments. Social system data may take the form of time series and cross-sectional data" (Ford, 2005). "When numerical data is available, parameters may be estimated by generalized least squares or Kalman filtering" (Sterman J. D., 2000).

The softer sources of information are depicted at the right end of the spectrum. "These sources may not provide data in numerical form, but they are often the most important sources of information for model development and parameterization" (Forrester, 1971a). "Expert knowledge may be obtained from informal interviews, Delphi interviews, and intensive modelling workshops. Even better, expert judgment may be obtained when the experts become part of the modelling team and the entire modelling process. At the far end of the spectrum is personal intuition. System dynamics practitioners are willing to call on their personal intuition to estimate parameters even though the parameter value may be more of a "guesstimate" than an estimate. We include highly uncertain parameters when we believe our estimate is "better than zero." Our approach is to proceed with rough estimates, confident that the importance of the uncertain parameters can be tested through sensitivity analysis." (Ford, 2005)

Figure: The information spectrum, ranging from hard to soft sources: Physical Laws; Controlled Experiments; Uncontrolled Experiments; Social System Data; Social System Cases; Expert Judgment; Personal Intuition. Source: my elaboration on (Ford, 2005).

The system dynamics approach leads to models with a large number of highly uncertain parameters, so we should ask ourselves which of the parameters are really important. "One might think that the parameters with the greatest range of uncertainty are the most important and should be given the greatest attention. On the other hand, some readers would suspect that the key parameters are those located in a strategic position in the model." (Ford, 2005)

However, some scholars have taken an approach aimed at finding statistical methods to improve the estimation of the parameters; tools based on the so-called F.I.M.L.O.F. method were thus developed. [14]

The method of full-information maximum likelihood via optimal filtering (FIMLOF) for parameter estimation is best understood as an optimal compromise between two less satisfactory extremes. One extreme is "naïve" simulation (NS), and the other extreme is the classical econometric tool, ordinary least squares (OLS).

Consider the system:

X(t) = YX(t-1)+W(t)

Z(t) = X(t) + V(t)

X is the state of the system, Z are the measurements of X, Y is an unknown parameter, W(t) is driving noise ("equation error"), and V(t) is measurement noise (errors in the variables).

Take the case where W(t) = 0 and V(t) = 0, which is equivalent to perfect measurement of a deterministic system. In this noise-free case, the parameter Y can be estimated simply by taking the ratio between two successive values of Z(t). This example helps to illustrate more indirect methods, which succeed not only in the noise-free case but also in more complicated situations. The essence of the FIMLOF approach is to guess a value of Y and simulate the system; to measure the error between the simulated data and the actual data Z(t); and to repeat the process, making new guesses of Y, until no smaller error can be found. The estimated value of Y is then chosen as the value which minimizes the error between actual and simulated data.
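The guess-and-simulate logic can be sketched as follows (Python; noise-free synthetic data, with an illustrative grid search standing in for the numerical optimizer a real implementation would use):

```python
import numpy as np

# Noise-free case: X(t) = Y*X(t-1), Z(t) = X(t)
Y_true = 0.8
Z = [1.0]
for _ in range(30):
    Z.append(Y_true * Z[-1])
Z = np.array(Z)

def loss(Y, Z):
    """Simulate from the first data point with candidate Y and
    accumulate squared errors against the measurements."""
    x = Z[0]
    err = 0.0
    for z in Z[1:]:
        x = Y * x
        err += (z - x) ** 2
    return err

# Scan candidate values of Y and keep the one minimizing the error
candidates = np.linspace(0.5, 1.1, 601)
Y_hat = candidates[np.argmin([loss(Y, Z) for Y in candidates])]
# Y_hat recovers Y_true = 0.8 in the noise-free case
```

With noise present, the same outer loop survives; what changes is how the inner simulation is kept close to the data, which is exactly what distinguishes NS, OLS and FIMLOF below.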

Naïve Simulation

In the naïve simulation method (NS), the model is initialized at the first data point and simulated without further reference to the data:

X̂(0) = Z(0),  X̂(t) = Y·X̂(t-1)

The simulated values X̂(t) will not coincide with the data; the differences are called residuals:

r(t) = Z(t) - X̂(t)

The naïve-simulation sum of squared residuals (the loss function) is:

J = Σt r(t)²

The modeller may guess close to the "right" Y value, but essentially the idea is to adjust the guess until no smaller error can be found. If the system being modelled has equation noise W(t) ≠ 0, then NS may give minimum errors J for a completely wrong value of Y, since the real system may "drift" away from the deterministic trajectory; for noise-driven systems, NS ignores most of the data. In fact, as explained by Guizzardi (2001), the NS method can be used profitably when there is a strong "trend" component, which may suggest that the last observation of a parameter is the best predictor of its future values. Note that this method cannot capture the seasonal component and is therefore mainly used for short-term analysis.

Ordinary Least Squares

When driving noise W(t) is present but V(t) is absent, the modeller can obtain better estimates of Y by re-initializing the system at each data point and then applying the same squared-residual error function J as in the previous method. This procedure is known as ordinary least squares (OLS):

X̂(t) = Y·Z(t-1),  r(t) = Z(t) - X̂(t)

Every time the simulation reaches a data-sample time, the system is reinitialized, so each segment of the simulation begins at a specific data point.

The ordinary least squares method gives a good estimate of Y as long as V(t) = 0; but when measurement noise is present, measurement errors can lead to grossly inaccurate estimates of parameters in a classic system dynamics model. Re-initializing the model at each data point serves to keep the simulation close to the real state of the system, so that any divergence in behaviour (measured by the residual analysis) is meaningful; but doing this in the presence of noisy data is unlikely to keep the simulation close to the "true" state. In the statistical analysis of OLS models, one of the most important and thorny steps is the analysis of the residuals.
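The following sketch (Python, synthetic data of my own construction) illustrates both points: with driving noise but perfect measurement, the OLS regression of Z(t) on Z(t-1) recovers Y well; adding measurement noise V(t) biases the same estimate badly toward zero:

```python
import numpy as np

rng = np.random.default_rng(42)
Y_true, n = 0.9, 2000

# Driving noise W(t) present, perfect measurement (V = 0)
X = np.empty(n)
X[0] = 1.0
for t in range(1, n):
    X[t] = Y_true * X[t - 1] + rng.normal(0.0, 0.5)
Z = X.copy()

# OLS re-initializes at every data point: regress Z(t) on Z(t-1)
Y_ols = np.sum(Z[1:] * Z[:-1]) / np.sum(Z[:-1] ** 2)

# With measurement noise V(t) added, the same regression is biased
# toward zero (errors-in-variables attenuation)
Zn = Z + rng.normal(0.0, 1.0, n)
Y_biased = np.sum(Zn[1:] * Zn[:-1]) / np.sum(Zn[:-1] ** 2)
```

The attenuation in the second estimate is exactly the failure mode described above: re-initializing on noisy measurements drags the simulation away from the true state.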

A further avenue of analysis is the generalized least squares (GLS) technique. These methods are particularly suitable in situations where heteroscedasticity is present, or where there is a certain degree of correlation between the observed variables; in all those cases, in other words, in which OLS models may provide incorrect estimates.
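A minimal contrast between OLS and GLS under heteroscedasticity (Python, synthetic data; the error-variance structure is assumed known here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400

# Regression through the origin with heteroscedastic errors:
# the noise standard deviation grows with x
x = rng.uniform(1.0, 10.0, n)
y = 2.0 * x + rng.normal(0.0, 0.5 * x)

# OLS weights every observation equally
b_ols = np.sum(x * y) / np.sum(x * x)

# GLS weights each observation by the inverse of its error variance
w = 1.0 / (0.5 * x) ** 2
b_gls = np.sum(w * x * y) / np.sum(w * x * x)
# Both estimates are close to the true slope 2.0, but the GLS
# estimate has the smaller standard error
```

In practice the variance structure must itself be estimated, which is where GLS becomes more delicate than this sketch suggests.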

Full-Information Maximum Likelihood

The full-information maximum likelihood method re-initializes the system at each data point to the value of X(t) where the system is most likely to be, given all available data; this kind of simulation must also calculate the expected size of the error (the standard deviation) of the forecasted state. The iteration is as follows:

X̂(t|t-1) = Y·X̂(t-1|t-1),  Ẑ(t|t-1) = X̂(t|t-1)

X̂(t|t-1) is the most likely value of X(t), given all information through time t-1; X̂(t-1|t-1) is the most likely value of X(t-1), given the same information; and Ẑ(t|t-1) is the best guess of the next measurement Z(t), given all the previous data Z(0), …, Z(t-1).

The simulation is then updated to X̂(t|t), defined as the most likely value of X(t) given all information through time t. This information is embodied in Z(t), in X̂(t|t-1), and in the variances of these two quantities. The variance of Z(t) is simply the variance of the process V(t). The variance of X̂(t|t-1) is automatically derived from the variances of the processes V(t) and W(t), from the variance of the guess of the initial conditions, X̂(0|0), and from the structure of the model. The computation of the variances is made by an "optimal filter". [15]

Variances are employed in this kind of technique to avoid the pitfalls observed in the previous techniques. Here too, the system is simulated up to the first data point, and the first residual is computed as:

r(1) = Z(1) - Ẑ(1|0)

The difference between this last method and the others lies in how the model is reinitialized at each data point: FIMLOF re-initializes the model to a compromise point somewhere between X̂(t|t-1) and Z(t). The compromise is based on the variances of X̂(t|t-1) and Z(t). If the variance of Z(t) is large but the variance of X̂(t|t-1) is small, the re-initialization will be close to X̂(t|t-1) (as in NS); vice versa, it will be close to Z(t) (as in OLS). Only by making "astute" guesses of the parameters can the modeller arrive at the desired maximum-likelihood estimate.
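This variance-weighted compromise is exactly a Kalman filter update, and the likelihood of the one-step prediction errors gives the FIMLOF criterion. A scalar sketch for the system X(t) = Y·X(t-1) + W(t), Z(t) = X(t) + V(t) (Python; the noise variances are assumed known, and a grid search stands in for a proper optimizer):

```python
import numpy as np

rng = np.random.default_rng(7)
Y_true, q, r, n = 0.9, 0.25, 1.0, 3000   # q = Var(W), r = Var(V)

# Simulate X(t) = Y*X(t-1) + W(t) and noisy measurements Z(t) = X(t) + V(t)
X = np.empty(n)
X[0] = 0.0
for t in range(1, n):
    X[t] = Y_true * X[t - 1] + rng.normal(0.0, np.sqrt(q))
Z = X + rng.normal(0.0, np.sqrt(r), n)

def neg_log_lik(Y, Z, q, r):
    """Scalar Kalman filter: accumulate the negative log-likelihood of
    each one-step-ahead prediction error (the innovation)."""
    x, p = Z[0], r                # filtered estimate and its variance
    nll = 0.0
    for z in Z[1:]:
        x_pred = Y * x            # prediction, as in naive simulation
        p_pred = Y * Y * p + q
        s = p_pred + r            # innovation variance
        innov = z - x_pred
        nll += 0.5 * (np.log(2.0 * np.pi * s) + innov ** 2 / s)
        k = p_pred / s            # Kalman gain: the variance-based compromise
        x = x_pred + k * innov    # between the prediction and the measurement
        p = (1.0 - k) * p_pred
    return nll

# Scan candidate parameter values and keep the maximum-likelihood one
candidates = np.linspace(0.5, 1.0, 251)
Y_mle = candidates[np.argmin([neg_log_lik(Y, Z, q, r) for Y in candidates])]
```

Unlike the plain OLS regression, this estimate is not attenuated by the measurement noise, because the filter re-initializes to the compromise point rather than to the noisy measurement.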

The full-information maximum likelihood method operates under the following conditions (Peterson, 1980):

Nonlinearities in model dynamics.

Nonlinear measurement functions.

Measurement errors (errors in variables).

Mixed sampling intervals (e.g. one can estimate a weekly model using monthly and yearly data)

Missing data (Without sacrificing other data at the same sample time)

Models with unmeasured endogenous variables.

Cross-sectional, time series mixed data.

Unknown characteristics of equation errors and measurement noise.

Under these conditions, such techniques are compatible with system dynamics models.

Parameter estimation with the Monte Carlo method.

As we have seen, dynamic modelling can be an important statistical analysis tool, but computation in this kind of model can be very complex because of the nonlinear (and non-Gaussian) nature of the phenomena being explained. Monte Carlo computation methods [16] are useful for the real-time analysis of dynamic systems. Since its invention, the Monte Carlo method has experienced an exponential diffusion parallel to the spread of computers.

Given enough time, money, expertise and computing power, almost any system can be simulated on a computer, but this may not be sensible. Three main features characterize a system suitable for this kind of analysis: it is dynamic, it is interactive, and it is complicated. Computer simulation modelling is best suited to systems that are dynamic and interactive as well as complicated. The Monte Carlo method has a limit in this respect: it was constructed for spreadsheet-style static simulation, but with some devices of a computational nature it can be adapted for parameter estimation in System Dynamics.

A system defined as a sequence of evolving probability distributions π_t(x_t), indexed by discrete time t = 0, 1, …, is called a probabilistic dynamic system. The state variable x_t can evolve in the following three ways: increasing dimension (x_{t+1} has one more component than x_t); discharging (x_{t+1} has one fewer component than x_t); no change (Liu).

Liu and Chen (1999) suggest that, to implement Monte Carlo for a dynamic system, we need, at any time t, random samples either drawn from π_t directly, or drawn from another distribution, say g_t, and weighted properly (importance sampling). Static methods achieve this end by treating each π_t separately and repeating the same kind of iterative process; all of the results obtained at time t are discarded when the system evolves from π_t to π_{t+1}. However, when the system is slowly varying, the samples obtained at time t can be reused to construct random samples at time t+1 so as to improve efficiency.

A useful way to represent a complicated high-dimensional distribution such as π_t is by multiple Monte Carlo samples drawn from it; this idea underlies Sequential Importance Sampling (SIS).

A random variable X drawn from a distribution g is said to be properly weighted by a weighting function w(X) with respect to the distribution π if, for any integrable function h, the weighted expectation under g equals the expectation under π:

E_g[h(X)·w(X)] = c·E_π[h(X)]

for a normalizing constant c common to all h. A set of random draws and weights (x(j), w(j)), j = 1, 2, …, m, is said to be properly weighted with respect to π if:

Eq.
lim_{m→∞} Σ_{j=1}^{m} h(x(j))·w(j) / Σ_{j=1}^{m} w(j) = E_π[h(X)]

for any integrable function h. In a practical sense, we can think of π as being approximated by the discrete distribution supported on the x(j), with probabilities proportional to the weights w(j). Sequential importance sampling consists of a recursive application of these computational steps. [17]

In system dynamics one is often interested in obtaining online inference of the state variables; this is straightforward using Eq. when a sample properly weighted by π_t is available. However, several issues concerning the statistical efficiency of the estimates are worth mentioning. Casella et al. provide a general treatment of this issue: the estimation should be done before a resampling step, since resampling introduces extra random variation into the current sample; Rao-Blackwellization can improve the accuracy of the estimation [18]; and delayed estimation of the state at time t-k made at time t is usually more accurate than concurrent estimation made at time t-k, since it is based on more information.
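A bootstrap-filter sketch of these ideas (Python; the scalar model from the FIMLOF section, with resampling at every step, which is one simple choice rather than the recommended adaptive schedule):

```python
import numpy as np

rng = np.random.default_rng(3)
Y, q, r = 0.9, 0.25, 1.0
n, m = 100, 2000          # n time steps, m Monte Carlo samples (particles)

# Hidden state and noisy measurements
X = np.empty(n)
X[0] = 0.0
for t in range(1, n):
    X[t] = Y * X[t - 1] + rng.normal(0.0, np.sqrt(q))
Z = X + rng.normal(0.0, np.sqrt(r), n)

# Bootstrap filter: propagate, weight by the measurement likelihood,
# estimate online (before resampling), then resample
particles = rng.normal(Z[0], np.sqrt(r), m)
estimates = [particles.mean()]
for t in range(1, n):
    particles = Y * particles + rng.normal(0.0, np.sqrt(q), m)   # propagate
    w = np.exp(-0.5 * (Z[t] - particles) ** 2 / r)               # importance weights
    w /= w.sum()
    estimates.append(np.sum(w * particles))                      # concurrent estimate
    particles = particles[rng.choice(m, size=m, p=w)]            # resample

filter_rmse = np.sqrt(np.mean((np.array(estimates) - X) ** 2))
raw_rmse = np.sqrt(np.mean((Z - X) ** 2))   # error if Z is used directly
```

Note that the online estimate is computed from the weighted sample before resampling, in line with the efficiency advice above.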

Structural estimation.

The majority of models are constructed from a static standpoint; in these cases the parameters coincide with the structure and there are therefore few problems in their construction and estimation. For example:

x(n+1) = A·x(n) + B·u(n) + w(n)

where A and B are constant matrices, x(n) is the state vector at time n, u(n) is a vector of known inputs, and w(n) is a white, normal process with mean 0 and covariance Q. In this case, the parameters are simply the constant coefficients of the matrices A, B and Q.

General nonlinear systems, however, require a more general definition of a parameter, which in this case is a constant exogenous input to the system. A parameter in a nonlinear system may enter the system in any nonlinear fashion; it may be known or not, but it is always a constant whose value is not determined by the rest of the system. Thus, in nonlinear systems, parameters may take on qualities usually associated with structure. For example:

x(n+1) = α·f(x(n)) + (1-α)·g(x(n))

Here α would be considered a parameter in the equation, but α determines the structure of the system. If α = 0, the system has the structure determined by the function g; if α = 1, the system structure is determined by the function f.

The estimation of structure may be thought of as a kind of continuous hypothesis test. The maximum-likelihood value of such a parameter may be thought of as selecting the most likely structure from the range of structures implied by the equation. In addition, separate models may be compared by computing the likelihood of each with respect to the same database. (Peterson, 1975)
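As a sketch of this continuous hypothesis test (Python; the two structures f and g and the blended data-generating equation x(n+1) = a·f(x(n)) + (1-a)·g(x(n)) + W(n) are my own illustrative choices), least squares selects the most likely point a in the continuum between the two structures:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    return 0.9 * x                 # structure 1: linear decay

def g(x):
    return 3.0 * x * (1.0 - x)     # structure 2: logistic growth

# Data generated by the blended structure plus equation noise W(n)
a_true, n = 0.7, 500
x = np.empty(n)
x[0] = 0.5
for t in range(1, n):
    x[t] = (a_true * f(x[t - 1]) + (1.0 - a_true) * g(x[t - 1])
            + rng.normal(0.0, 0.02))

# The model is linear in the parameter a, so least squares selects the
# most likely structure between g (a = 0) and f (a = 1)
d = f(x[:-1]) - g(x[:-1])
a_hat = np.sum(d * (x[1:] - g(x[:-1]))) / np.sum(d * d)
# a_hat lands close to a_true = 0.7
```

An a_hat near 0 or 1 would amount to rejecting one structural hypothesis in favour of the other; intermediate values quantify the evidence for a mixture.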

The confidence-building process

In a classic study, analysts will give ranges to many, but not all, of the inputs. "We should expect that some individuals will be reluctant to assign ranges of uncertainty. The aim is to find a useful set of tolerance intervals. If specialists disagree on the appropriate ranges of uncertainty, the tolerance intervals can be recalculated with new ranges." (Ford, 2005) We then have to decide on the number of runs and assign values to each of the parameters; given the computing capacity of existing software, this is no longer seen as a problem. [19] In this type of software one can find an estimation of the tolerance intervals, the so-called sensitivity analysis.

Figure: (a) Vensim calculation of 100 simulations of a bank balance model; (b) Vensim's percentiles of a bank balance model. Source: (Ford, 2005). In my opinion, these graphs can be translated into confidence intervals only in the presence of independence among the input variables. Since system dynamics models are very often constructed by ignoring this problem, Ford (2005) suggests restricting the number of variables to the most relevant ones and expanding it gradually, checking the graphs of the intervals. Another tool available to users of system dynamics is the simple correlation coefficient (Eq. ), a simple statistical tool that investigates the correlation between variables, again in order to avoid errors in reading the results of screening tests. [20]

Eq. Correlation coefficient:

r = Σi (xi - x̄)(yi - ȳ) / √( Σi (xi - x̄)² · Σi (yi - ȳ)² )
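A direct implementation of Eq. (Python; the two correlated inputs are synthetic): when sampled model inputs are correlated, the coefficient flags the dependence that would invalidate naive tolerance intervals.

```python
import numpy as np

rng = np.random.default_rng(5)

def pearson_r(x, y):
    """Simple correlation coefficient between two samples."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.sum(xc * yc) / np.sqrt(np.sum(xc ** 2) * np.sum(yc ** 2))

# Two sampled model inputs; 'b' is partly driven by 'a', so the inputs
# are far from independent and naive tolerance intervals would mislead
a = rng.uniform(0.0, 1.0, 200)
b = 0.8 * a + rng.normal(0.0, 0.1, 200)

r = pearson_r(a, b)   # strongly positive: the inputs are not independent
```

A screening pass over all input pairs with this coefficient is a cheap first check before trusting the percentile bands produced by the sensitivity runs.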

But care should be taken when you a


