Qualitative and Quantitative Description of Models. Quantitative Analysis of Models

  • 19.05.2020

To carry out a quantitative analysis of the diagrams, we list the indicators of the model:

Number of blocks in the diagram - N;

Decomposition level of the diagram - L;

Balance of the diagram - B;

Number of arrows connected to a block - A.

This set of factors applies to each diagram of the model. Below are recommendations for the desired values of these factors.

One should strive to make the number of blocks on lower-level diagrams smaller than the number of blocks on the parent diagrams, i.e., the coefficient N/L should decrease as the decomposition level increases. A decrease in this coefficient indicates that functions are simplified as the model is decomposed, and therefore the number of blocks should decrease.

Diagrams must be balanced. This means that within one diagram there should not be the situation shown in Fig. 14: Job 1 has significantly more incoming and control arrows than outgoing ones. It should be noted that this recommendation may not hold for models describing production processes, which involve obtaining a finished product from a large number of components (assembly of a machine unit, production of a food product, and so on). For example, when describing an assembly procedure, a block may have many incoming arrows describing the components of a product and a single outgoing arrow: the finished product.

Fig. 14. An example of an unbalanced diagram

Let's introduce the diagram balance factor, comparing the average number of arrows per block with the maximum (A_i is the number of arrows connected to block i, N is the number of blocks):

K_b = |∑A_i / N - max A_i|.

One should strive for K_b to be minimal for the diagram.
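As an illustration, the balance factor can be computed from per-block arrow counts. The sketch below assumes the common definition K_b = |∑A_i / N - max A_i| (mean number of arrows per block versus the maximum); the function name and sample data are illustrative, not part of the original text.

```python
def balance_factor(arrows_per_block):
    """Balance factor K_b of one diagram.

    arrows_per_block: the number of arrows A_i connected to each of the
    N blocks of the diagram. Assumes K_b = |sum(A_i)/N - max(A_i)|.
    """
    n = len(arrows_per_block)
    return abs(sum(arrows_per_block) / n - max(arrows_per_block))

# A perfectly balanced diagram (equal arrows on every block) gives K_b = 0.
print(balance_factor([4, 4, 4]))  # -> 0.0
# One overloaded block pushes K_b up.
print(balance_factor([2, 2, 8]))  # -> 4.0
```

A K_b close to zero means the workload implied by the arrows is spread evenly across the blocks of the diagram.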

In addition to analyzing the graphic elements of the diagrams, it is necessary to consider the names of the blocks. To evaluate the names, a dictionary of elementary (trivial) functions of the simulated system is compiled. The functions of the lowest-level decomposition diagrams should fall into this dictionary. For example, for a database model the functions "find a record" and "add a record to the database" may be elementary, while the function "user registration" requires further description.

After forming the dictionary and compiling the package of system diagrams, it is necessary to consider the lowest level of the model. If the names of its blocks match words from the dictionary, this indicates that a sufficient level of decomposition has been achieved. The coefficient that quantitatively reflects this criterion can be written as L*C, the product of the model level and the number of matches between block names and dictionary words. The deeper the level of the model (the larger L), the more valuable each match.
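The L*C criterion can be sketched as a simple count of dictionary matches weighted by the diagram level; the dictionary entries and block names below are illustrative.

```python
def decomposition_score(level, block_names, elementary_functions):
    """L*C: diagram level L multiplied by the number C of block names
    found in the dictionary of elementary (trivial) functions."""
    matches = sum(1 for name in block_names if name in elementary_functions)
    return level * matches

# Illustrative dictionary and lowest-level diagram.
dictionary = {"find a record", "add a record to the database"}
blocks = ["find a record", "user registration"]

print(decomposition_score(3, blocks, dictionary))  # one match at level 3 -> 3
```

A larger score at a deeper level suggests that the decomposition has reached elementary functions.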

DFD Methodology

At the core of the DFD methodology lies the construction of a model of the analyzed AIS, whether designed or actually existing. The main tool for modeling the functional requirements of the designed system is the data flow diagram (DFD). In accordance with this methodology, the system model is defined as a hierarchy of data flow diagrams. With their help, the requirements are broken down into functional components (processes) and presented as a network connected by data flows. The main purpose of such tools is to demonstrate how each process transforms its inputs into outputs and to reveal the relationships between these processes.

The components of the model are:

Diagrams;

Data Dictionaries;

Process specifications.

DFD Diagrams

Data flow diagrams (DFD - Data Flow Diagrams) are used to describe document flow and information processing. A DFD represents the modeled system as a network of interconnected activities and allows current document-flow operations in corporate information processing systems to be displayed more visually.

DFD describes:

Information processing functions (works, activities);

Documents (arrows), objects, employees, or departments involved in information processing;

Places where documents are stored (data stores).

BPwin uses the Gane-Sarson notation to build data flow diagrams (Table 4).

Table 4. Gane-Sarson notation

In the diagrams, functional requirements are represented by processes and stores connected by data flows.

External entity - a material object or an individual, i.e., an entity outside the context of the system that acts as a source or receiver of system data (for example, customers, personnel, suppliers, a warehouse, etc.). Its name must contain a noun. Objects represented by such nodes are assumed not to participate in any processing.

System and subsystem - when building a model of a complex IS, the system can be represented on the context diagram in general form as a single whole, or it can be decomposed into a number of subsystems. The subsystem number serves to identify it. The name field contains the name of the system in the form of a sentence with a subject and the corresponding definitions and complements.

Processes are intended to produce output flows from input flows in accordance with the action specified by the process name. This name must contain a verb in the infinitive followed by an object (for example: calculate, check, create, get). The process number serves to identify the process and to refer to it within the diagram. This number can be used together with the diagram number to form a unique index of the process throughout the model.

Data flows are the mechanisms used to model the transfer of information from one part of the system to another. Flows are represented on the diagrams by named arrows whose orientation indicates the direction of information transfer. Sometimes information moves in one direction, is processed, and returns to its source; such a situation can be modeled either by two different flows or by a single bidirectional one.

Qualitative and quantitative methods are tools for working with data: for recording it and subsequently analyzing it.

Qualitative methods are aimed at collecting qualitative data and its subsequent qualitative analysis, using appropriate techniques and procedures for extracting meaning; quantitative methods are tools for collecting numerical data and its subsequent quantitative analysis using the methods of mathematical statistics (Fig. 3.1).

Fig. 3.1.

Accordingly, qualitative research can be defined as research that predominantly uses qualitative methods, while quantitative research can be defined as research built on the predominant use of quantitative methods.

It seems obvious to define the type of study by the corresponding type of methods. However, not all authors define qualitative and quantitative research in this way, and various interpretations can be found in the methodological literature. A number of authors (see, for example: Semenova, 1998; Strauss, Corbin, 2007) characterize qualitative studies as those in which non-quantitative data collection methods are used and data analysis is carried out using various qualitative interpretive procedures, without calculations or the methods of mathematical statistics. Other manuals devoted to qualitative research (the best known among them: Handbook of Qualitative Research..., 2008) analyze, alongside purely qualitative (phenomenological, discourse-analytical, narrative, psychoanalytic) methods, the so-called Q-methodology, which involves the collection of numerical data and their quantitative analysis.

Q-methodology is usually contrasted with "R-methodology". R-methodology uses the objective indicators of tests, questionnaires, and rating scales, which reflect constructs created by the researcher; it is these objective indicators that are subjected to mathematical processing in R-methodology (for example, using factor analysis procedures). Q-methodology, in turn, is aimed at obtaining subjective data. It is based on the Q-sorting procedure: subjects are asked to sort a certain set of statements (as a rule, obtained from the subjects themselves through a special survey or interview procedure), distributing the statements along a pre-organized continuum specified by some scale. The subjects sort the statements according to their own subjective assessment, and the resulting matrix of subjective assessments is then processed by methods of multivariate statistics.
As already mentioned, Q-methodology procedures are included in qualitative research manuals even though they involve obtaining quantitative data and applying statistical methods. The authors believe that Q-methodology is one of the possible alternatives to mainstream "objective" psychological research, and since it is qualitative research that embodies the spirit of such cognitive alternatives, Q-methodology, although based on quantitative methods, is discussed in the context of qualitative research.

As can be seen, the interpretation of qualitative and quantitative research is not always strictly tied to the types of methods used in the research. Very often the peculiarities of how a study is organized serve as the constitutive sign separating qualitative from quantitative research. The problem of distinguishing types of studies by their organization will be considered in the next paragraph. To avoid confusion here, we propose to keep to the methodological definition given at the beginning of this paragraph: qualitative and quantitative research are built on the predominant application of the corresponding type of methods. Qualitative research mainly deals with qualitative data and qualitative methods of analysis; quantitative research deals with quantitative data and their quantitative analysis.

Quantitative (mathematical-statistical) analysis is a set of procedures and methods for describing and transforming research data based on the use of a mathematical-statistical apparatus.

Quantitative analysis implies the ability to treat results as numbers, that is, to apply computational methods.

When deciding on quantitative analysis, we can turn immediately to parametric statistics or first carry out primary and secondary data processing.

At the stage of primary processing, two main tasks are solved: presenting the obtained data in a visual form convenient for preliminary qualitative analysis (as ordered series, tables, and histograms), and preparing the data for the application of specific methods of secondary processing.

Ordering (arranging numbers in descending or ascending order) makes it possible to identify the maximum and minimum values of the results, to estimate which results occur most often, and so on. A set of indicators from various psychodiagnostic methods obtained for a group is presented as a table in which the rows contain the survey data of one subject and the columns contain the distribution of the values of one indicator over the sample. A histogram is the frequency distribution of the results over a range of values.
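The primary-processing steps just described (ordering and building a frequency distribution) can be sketched in a few lines; the raw scores below are made up for illustration.

```python
from collections import Counter

# Illustrative "raw" scores for a group of ten subjects.
raw_scores = [7, 3, 7, 5, 9, 3, 7, 5, 5, 7]

# Ordering: an ascending series exposes the minimum and maximum at a glance.
ordered = sorted(raw_scores)
print(ordered[0], ordered[-1])  # -> 3 9

# Frequency distribution: the data behind a histogram.
frequencies = Counter(raw_scores)
print(frequencies.most_common(1))  # -> [(7, 4)]  (7 is the most common score)
```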

At the stage of secondary processing, the characteristics of the subject of research are calculated. Analysis of the results of secondary processing allows us to select the set of quantitative characteristics that will be most informative. The purpose of the secondary processing stage is not only to obtain information but also to prepare the data for a possible assessment of its reliability. In the latter case, we turn to parametric statistics.

Types of methods of mathematical-statistical analysis:

Descriptive statistics methods are aimed at describing the characteristics of the phenomenon under study: its distribution, features of relationships, etc.

Statistical inference methods serve to establish the statistical significance of the data obtained during experiments.

Data transformation methods are aimed at transforming data in order to optimize their presentation and analysis.

Quantitative methods of analysis and interpretation (transformation) of data include the following:

Primary processing of "raw" scores. To make the use of nonparametric statistics possible, two methods are used: classification (dividing objects into classes according to some criterion) and systematization (ordering objects within classes, classes among themselves, and sets of classes with other sets of classes).

This type of analysis is based on calculating a number of quantitative indicators for the constructed model. It should be borne in mind that these estimates are largely subjective, since the assessment is carried out directly on graphical models, whose complexity and level of detail are determined by many factors.

Complexity. This indicator characterizes how hierarchically complex the process model is. Its numerical value is determined by the complexity factor k_sl:

k_sl = ∑ur / ∑ekz

where ∑ur is the number of decomposition levels,

∑ekz is the number of process instances.

The complexity of the considered model is equal to:

For k_sl <= 0.25 the process is considered complex; for k_sl >= 0.66 it is not considered complex. For the process under consideration the coefficient equals 0.25, which does not exceed the complexity threshold.
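A minimal sketch of the complexity classification, using the formula k_sl = ∑ur / ∑ekz and the thresholds 0.25 and 0.66 stated above; the level and instance counts are illustrative.

```python
def complexity_factor(num_levels, num_instances):
    """k_sl = (number of decomposition levels) / (number of process instances)."""
    return num_levels / num_instances

def classify_complexity(k_sl):
    # Thresholds from the text: k_sl <= 0.25 -> complex, k_sl >= 0.66 -> not complex.
    if k_sl <= 0.25:
        return "complex"
    if k_sl >= 0.66:
        return "not complex"
    return "intermediate"

k = complexity_factor(1, 5)  # e.g. 1 decomposition level, 5 process instances
print(k, classify_complexity(k))  # -> 0.2 complex
```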

Processivity. This indicator characterizes whether the constructed process model can be considered essential (describing the structure of the subject area as a set of its main objects, concepts, and relationships) or process-oriented (all instances of the model's processes are connected by cause-and-effect relationships). In other words, this indicator reflects how well the constructed model of a given situation in the company corresponds to the definition of a process. Its numerical value is determined by the processivity factor k_pr:

k_pr = ∑raz / ∑kep

where ∑raz is the number of "breaks" (missing cause-and-effect relationships) between instances of business processes,

∑kep is the number of process instances.

Processivity is equal to

Controllability. This indicator characterizes how effectively process owners manage their processes. Its numerical value is determined by the controllability coefficient k_kon:

k_kon = ∑s / ∑kep

where ∑s is the number of process owners,

∑kep is the number of instances in one diagram.

Controllability is

When k_kon = 1, the process is considered controlled.
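A sketch of the controllability check, assuming k_kon = ∑s / ∑kep as defined above; the owner and instance counts are illustrative.

```python
def controllability_factor(num_owners, num_instances):
    """k_kon = (number of process owners) / (number of instances in one diagram)."""
    return num_owners / num_instances

k = controllability_factor(3, 3)
print(k)  # -> 1.0
# Per the text, k_kon = 1 means the process is considered controlled.
assert k == 1.0
```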

Resource intensity. This indicator characterizes the efficiency of resource use in the process under consideration. Its numerical value is determined by the resource intensity factor k_r:

k_r = ∑r / ∑out

where ∑r is the number of resources involved in the process,

∑out is the number of outputs.

The resource intensity is

The lower the value of the coefficient, the higher the value of resource efficiency in the business process.

For k_r < 1, resource intensity is considered low.

Adjustability. This indicator characterizes how strongly the process is regulated. Its numerical value is determined by the adjustability factor k_reg:

k_reg = D / ∑kep

where D is the amount of available regulatory documentation,

∑kep is the number of instances in one diagram.

Adjustability is equal to

For k_reg < 1, adjustability is considered low.

The parameters and values of the quantitative indicators are presented in Table 7.

Table 7. Quantitative indicators

For a general assessment of the analyzed process, the sum of the calculated indicators is computed:

K = k_sl + k_pr + k_kon + k_r + k_reg

The sum of the indicators is

K = 0.1875 + 0.25 + 0.9375 + 0.273 + 0.937 = 2.585

The calculated value satisfies the condition K > 1. For K > 2.86 the process is considered clearly inefficient; for 1 < K < 2.86 the process is partially efficient.
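The overall assessment can be reproduced directly from the five coefficient values computed in the text, matched to the coefficients in the order they appear in the sum K = k_sl + k_pr + k_kon + k_r + k_reg; the thresholds 1 and 2.86 are the ones stated above.

```python
coefficients = {
    "k_sl": 0.1875,   # complexity
    "k_pr": 0.25,     # processivity
    "k_kon": 0.9375,  # controllability
    "k_r": 0.273,     # resource intensity
    "k_reg": 0.937,   # adjustability
}

K = sum(coefficients.values())
print(round(K, 3))  # -> 2.585

# Thresholds from the text: K > 2.86 -> clearly inefficient,
# 1 < K < 2.86 -> partially efficient.
if K > 2.86:
    verdict = "clearly inefficient"
elif K > 1:
    verdict = "partially efficient"
else:
    verdict = "efficient"
print(verdict)  # -> partially efficient
```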


It is desirable that the balance factor be minimal for each diagram and constant across the model.

In addition to assessing the quality of individual diagrams and of the model as a whole by means of the balance and decomposition coefficients, it is possible to analyze and optimize the described business processes. The physical meaning of the balance coefficient is determined by the number of arrows connected to a block, so it can be interpreted as an estimate of the number of documents and job functions processed or received by a particular unit or employee. On graphs of the balance coefficient versus decomposition level, peaks relative to the average value therefore indicate overloaded or underloaded employees, since different levels of decomposition describe the activities of different departments or employees of the enterprise. Accordingly, if peaks appear on the graphs of real business processes, the analyst can issue recommendations for optimizing them: redistributing the functions performed, reworking the processing of documents and information, or introducing additional coefficients for employee remuneration.

Let's carry out a quantitative analysis of the models shown in Figures 12 and 13 using the method described above, and consider the behavior of the N/L coefficient for these models. The parent diagram "Processing a client's request" has a coefficient of 4/2 = 2, while its decomposition diagrams have 3/3 = 1. The coefficient decreases, indicating that the description of the functions becomes simpler as the model is decomposed to lower levels.
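The check just performed can be written as a direct comparison of the N/L coefficient for the parent and child diagrams (4 blocks at level 2 versus 3 blocks at level 3, as in the example).

```python
def blocks_per_level(n_blocks, level):
    """Coefficient N/L used to judge whether decomposition simplifies the model."""
    return n_blocks / level

parent = blocks_per_level(4, 2)  # "Processing a client's request": 4/2 = 2.0
child = blocks_per_level(3, 3)   # its decomposition diagram: 3/3 = 1.0
print(parent, child)  # -> 2.0 1.0

# A decreasing coefficient indicates that functions simplify as the model
# is decomposed.
assert child < parent
```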

Consider the change in the coefficient K_b for two versions of the models.

For the first option shown in Figure 20,

for the second option

The coefficient K_b does not change its value; therefore, the balance of the diagram does not change.

We assume that the decomposition level of the considered diagrams is sufficient for the purpose of modeling, and that on the lowest-level diagrams, elementary functions (from the point of view of the system user) are used as the names of the works.

Summing up this example, it is important to note the value of considering several variants of diagrams when modeling a system. Such variants may arise when adjusting diagrams, as was done with "Processing a client's request", or when creating alternative implementations of system functions (decomposition of the work "Changing the database"). Considering the variants allows you to select the best one and include it in the diagram package for further consideration.