Unambiguous Entropic Evaluation of the Efficiency of Complicated Technologies of Complex Processing of Natural Resources


1. Introduction

This criterion can best be justified using the example of plants that manufacture various products from the same raw material, such as “Norilsk Nickel” (Russia) or the Dead Sea Works (Israel). The first of these enterprises produces, alongside nickel, a number of other nonferrous metals. As for the Dead Sea Works, it produces five or six different products from sea water, each of them the final product of an individual plant. Many similar examples of the multiple use of natural resources can be found in other countries. To develop the mentioned method, it is first necessary to solve the problem of numerically evaluating the state of a complicated system.

Some explanation is needed here. The main problem reduces to developing a methodology for an unambiguous numerical estimate of the state of a system of any complexity. As a rule, complicated systems consist of several components, and their number varies. It is very important to characterize the relationships between these components in the system. Quite often, if the number of components is small, it is sufficient to determine the ratios between them. The most complete picture, however, is given by the usual percentages normalized to 100%. The estimate can also be expressed in fractions of unity, which relates it to probability. The probability is determined as

$P\left({x}_{i}\right)=\frac{{x}_{i}}{{\displaystyle \sum {x}_{i}}}.$ (1)

Here ${x}_{i}$ can have any dimension (tons, dollars, kilograms, percentage, pieces, etc.).
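Equation (1) can be sketched in a few lines of code; the three-component mixture below is an illustrative placeholder, not data from the text:

```python
# Sketch of Equation (1): converting raw component amounts (any units --
# tons, dollars, pieces) into probabilities by normalizing to their sum.
def probabilities(amounts):
    """Return P(x_i) = x_i / sum(x_i) for a list of component amounts."""
    total = sum(amounts)
    return [x / total for x in amounts]

# Example: a hypothetical three-component mixture given in tons.
mixture = [50.0, 30.0, 20.0]
print(probabilities(mixture))  # [0.5, 0.3, 0.2]
```

The result is dimensionless, which is what allows components measured in different units to be compared on a common probabilistic scale.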

2. Entropic Estimations of Complex Systems

If a system consists of two components, specifying the content of one component automatically defines the content of the other, since the sum of their contents is unity. Hence, for a binary system, a single-valued estimate is obtained by specifying the content of one of the components.

The situation is different if a system consists of more than two components. In this case, the content of one component does not determine those of the others. If the contents of all components are specified simultaneously, this gives a multiple (not a single-valued) estimate. In this case, therefore, we use another characteristic instead of the probability: a measure of uncertainty, introduced by Hartley in 1928 [4] [5] [6] [7] and then used by Shannon in 1948 when developing the theory of information. The notion of a measure of uncertainty can be clarified by the following elementary example. Assume that a random value ${x}_{i}$ has $k$ equiprobable outcomes (when tossing a coin, $k=2$; when casting a die, $k=6$).

According to the probability definition,

$P\left({x}_{i}\right)=\frac{1}{k}.$ (2)

The uncertainty is a function of the number of outcomes, and it can be denoted by $f\left(k\right)$ .

Shannon [3] has shown that the only suitable function of the number of outcomes is a quantity proportional to the logarithm of the number of outcomes:

$f\left(k\right)=A\mathrm{log}k=H\left(x\right)$ (3)

where $A$ is the proportionality coefficient; $H\left(x\right)$ is the uncertainty of a random value; $\mathrm{log}k$ is a quantity determined only up to a constant factor, because the base of the logarithm is not yet fixed.
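The coin and die examples above can be checked directly; here $A=1$ and a base-2 logarithm (units of bits) are assumed for concreteness:

```python
import math

# Hartley's measure of uncertainty, Equation (3), for k equiprobable
# outcomes, taking A = 1 and base-2 logarithms (units of bits).
def hartley(k, base=2):
    return math.log(k, base)

print(hartley(2))  # tossing a coin: 1.0 bit
print(hartley(6))  # casting a die: about 2.585 bits
```

With natural or decimal logarithms the numbers change only by a constant factor, as the text notes below.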

According to [1] , we can write the following for the entropy of a binary system:

$H\left(x\right)=-A\left({P}_{1}\mathrm{log}{P}_{1}+{P}_{2}\mathrm{log}{P}_{2}\right).$ (4)

${P}_{1},{P}_{2}$ are the probabilities of the component contents in a binary system.

For a multi-component system, the composition entropy is determined by the expression [1] :

$H\left(x\right)=-{\displaystyle \underset{i=1}{\overset{n}{\sum }}{\displaystyle \underset{j=1}{\overset{m}{\sum }}{P}_{ij}\mathrm{log}{P}_{ij}}}.$ (5)

${P}_{ij}$ is the probability of a component content in a multicomponent mixture.

As for the base of the logarithms, it can be chosen arbitrarily; different choices give results differing by a constant factor. When successive computations are compared in the course of the system's change, the influence of the logarithm base cancels out. Therefore, for convenience in practical computations we can recommend decimal logarithms, and for theoretical derivations, natural logarithms. There is no difference in kind between them.
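The base-independence just noted can be illustrated with a short sketch of the composition entropy $H=-\sum P\mathrm{log}P$; the three-component composition is hypothetical:

```python
import math

def entropy(probs, base=math.e):
    # Composition entropy H = -sum(P * log P), skipping zero-probability
    # components (whose contribution vanishes in the limit).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

comp = [0.5, 0.3, 0.2]      # illustrative three-component composition
h_nat = entropy(comp)        # natural logarithms
h_dec = entropy(comp, 10)    # decimal logarithms
# The two values differ only by the constant factor ln(10):
print(h_nat / h_dec)         # approximately 2.302585
```

Because any ratio of two such entropies cancels this constant, efficiency criteria built from entropy ratios do not depend on the chosen base.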

Having clarified these nuances, we can proceed to consider the general situation in analyzing the state of production of many products based on the complex use of a raw material.

3. Evaluation of Complicated Technology Perfection on the Basis of Entropy Criterion

Our analysis is based on the plants mentioned in the introduction that make multiple use of natural raw materials. First of all, data are needed on the number of tons of ore or the number of cubic meters of sea water used annually by the respective plants. In addition, complete data on the material composition of the ore or sea water are necessary.

Denote the reference quantity of these parameters by $F$ tons/year. If we denote the fraction of useful components in raw material by $k$ , then their total amount coming into production is $kF$ tons/year. Besides useful components, there is waste rock in this raw material, whose fraction is denoted by $m$ .

Clearly, $k+m=1.$

The waste rock can contain various substances, sometimes very valuable, but at present they are not the target product.

Determine the complexity of the composition of useful components that could be obtained, purely theoretically, at their ideal extraction. Denote the relative content of each component in this idealized balance by ${x}_{i}$; then their sum is

${x}_{1}+{x}_{2}+{x}_{3}+\cdots +{x}_{i}+\cdots +{x}_{n}=1.$ (6)

The components can be calculated in fractions of unity or percentage-wise.

The entropy of this initial composition is determined, by definition, as

${H}_{n}=-\left({x}_{1}\mathrm{ln}{x}_{1}+{x}_{2}\mathrm{ln}{x}_{2}+{x}_{3}\mathrm{ln}{x}_{3}+\cdots +{x}_{i}\mathrm{ln}{x}_{i}+\cdots +{x}_{n}\mathrm{ln}{x}_{n}\right).$ (7)

Denote the total yearly output of ready products of the plants under consideration by $G$ tons/year. It is absolutely clear that

$G<kF$ (8)

is always valid.

Denote the relative amount of each manufactured product by ${x}_{i}^{\prime }$. It is also clear that ${x}_{i}^{\prime }<{x}_{i}$ always holds if ${x}_{i}^{\prime }$ is calculated as a fraction of the ideal initial composition, and ${\displaystyle \underset{i=1}{\overset{n}{\sum }}{x}_{i}^{\prime }}<1$ or 100%.

The entropy of the obtained total production is

${H}_{s}^{\prime }=-{\displaystyle \underset{i=1}{\overset{n}{\sum }}{x}_{i}^{\prime }\mathrm{ln}{x}_{i}^{\prime }}<{H}_{n}.$ (9)

The efficiency of production using complicated technology can be unambiguously determined as

$E=\frac{{H}_{s}^{\prime }}{{H}_{n}}\cdot 100\%.$ (10)
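A minimal numerical sketch of Equations (7), (9) and (10) follows; the four-component ideal composition and the recovered fractions are hypothetical illustrative numbers, not data from the text:

```python
import math

def entropy(fractions):
    """H = -sum(x * ln x) over nonzero fractions, as in Equations (7) and (9)."""
    return -sum(x * math.log(x) for x in fractions if x > 0)

# Hypothetical ideal composition x_i (sums to 1) and the actual recovered
# fractions x_i', each below its ideal value.
ideal  = [0.40, 0.30, 0.20, 0.10]
actual = [0.34, 0.25, 0.15, 0.07]

H_n = entropy(ideal)            # entropy of the ideal composition
H_s = entropy(actual)           # entropy of the obtained production
E = H_s / H_n * 100.0           # efficiency per Equation (10)
print(f"E = {E:.2f}%")
```

The same `entropy` helper applied to a single component's ideal and actual fractions gives the per-product criterion of Equation (11).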

Besides the total technological evaluation of the entire production complex, the analysis of obtaining each product separately can be performed in a similar way using the relation

${E}_{i}=\frac{{H}^{\prime }\left({x}_{i}\right)}{H\left({x}_{i}\right)}\cdot 100\%.$ (11)

Moreover, this relation also directly shows the simple percentage extraction of each component separately. This makes it possible to analyze the level of perfection of the technology for obtaining each product. It is especially important that, using Equation (11), we can analyze the level of the entire combined production. Such an analysis reveals technological reserves and shows where, and to what extent, the process must be modernized in order to increase the overall effect.

Such an analysis, confirmed by financial calculations, can sometimes even justify decreasing the output of one product in favor of increasing that of another, in order to raise the total effect both technologically and financially.

For the sake of simplicity and clarity, we present a concrete example of the application of the suggested method. Usually the number of products made from the same raw material is not very high, at most five or six. The calculations are presented in Table 1.

To determine the total technological efficiency of all departments of the enterprise, it is necessary to sum the initial entropies of the raw material in line 2 and the obtained entropies of the manufactured products in line 4, and to form the ratio of the resulting numbers.

Table 1. Calculation example.

${\displaystyle \underset{i=1}{\overset{n}{\sum }}H\left({x}_{i}\right)}=326.05$ , ${\displaystyle \underset{i=1}{\overset{n}{\sum }}{H}^{\prime }\left({x}_{i}\right)}=258.09.$

The total efficiency of useful components usage in the production process is

$E=\frac{{\displaystyle \sum {H}^{\prime }\left({x}_{i}\right)}\cdot 100}{{\displaystyle \sum H\left({x}_{i}\right)}}=\frac{258.09\times 100}{326.05}=79.16\%.$

This also reveals untapped reserves in the overall production. At the same time, the obtained result shows that the overall use of the raw material is satisfactory.
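The total figure can be verified with the two entropy sums quoted above (a simple arithmetic sketch of Equation (10)):

```python
# Checking the worked example: total entropy of the raw material and
# total entropy of the manufactured products, as quoted in the text.
H_raw = 326.05
H_products = 258.09
E = H_products * 100 / H_raw
print(f"E = {E:.2f}%")  # rounds to 79.16%
```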

4. Conclusions

The possibility of an unambiguous evaluation of a complicated technological process based on the entropic parameter has been demonstrated. This parameter can be determined for any complicated system that admits an estimate within the framework of probability theory or mathematical combinatorics.

Both the production process as a whole and its separate stages can be analyzed on the basis of the entropic criterion, and its poorly performing sectors can be revealed. This makes it possible to intensify the cumulative effect through fuller use of the components of complex raw materials, which guides the allocation of financial means. The unequivocal evaluation gives production managers a powerful argument for optimal management.

References

[1] Barsky, E. (2016) Unambiguous Entropic Evaluation of a Complicated Construction Process. American Journal of Industrial and Business Management, 6, 382-391. https://doi.org/10.4236/ajibm.2016.63034

[2] Barsky, E. (2013) Entropy of Two-Phase Flows in the Mode of Separation. Journal of Mining Science, 49, 308-318. https://doi.org/10.1134/S1062739149020147

[3] Barsky, E. (2014) Entropy Invariants of Two-Phase Flows. Elsevier, Amsterdam.

[4] Shannon, C. (1963) A Mathematical Theory of Communication. Moscow.

[5] Wilson, A.G. (1970) Entropy in Urban and Regional Modelling, Pion Limited. Addison-Wesley Publishing, Boston.

[6] Martin, N.F. and England, J.W. (1981) Mathematical Theory of Entropy. Addison-Wesley Publishing, Boston.

[7] Chambadal, P. (1963) Evolution et Applications du Concept D'Entropie, Dunod, Paris.