Abstract
Target recognition in uncertain environments is a topical issue. Fusion rules are used to combine the sensor reports from different sources, and in this situation obtaining more information in order to make a correct decision is essential. The probability distribution is one of the most widely used ways to represent uncertain information. In addition, the negation of a probability distribution provides a new view of that information. In this article, the existing negation of the probability distribution is extended with Tsallis entropy. The main reason is that different systems have different parameter
Introduction
In recent years, information fusion has received great attention in military applications.1,2 Many methods based on information fusion have been proposed for object classification,3 target recognition4,5 and decision-making.6,7 In most cases, the final result obtained by fusing information from multiple sensors is reasonable. However, the recognition result may be counter-intuitive when the evidence is highly conflicting.8 Up to now, the choice of fusion rule is still an open question. Besides, the information gathered in sensor fusion systems9–11 is uncertain, as it is often incomplete, inconsistent and imprecise. Many methods have been proposed to obtain more information.12–14 However, due to the existing uncertainty,15–17 it is essential to extract more information from the known knowledge.
Presenting knowledge is an open issue.18–20 In most cases, we commonly use 'must', 'may' and 'likely' to estimate whether an event will happen or not. Because of this uncertainty, several methods have been developed to deal with it.21–24 The probability distribution is used to quantitatively describe the possibility of occurrence of an outcome in real applications.25 More importantly, in some circumstances it is much easier to describe the negation of an event than to describe the event directly. For example, even when a mathematical statement is difficult to prove rigorously, a single counterexample can easily show that it is false. Similarly, there is a significant property in probability, namely mutual exclusion. Mutual exclusion means that when one event occurs, its opposite cannot occur. In other words, the probability of an event affects that of its opposite. Therefore, it is meaningful to study the negation of probability distribution (NPD).26 Recently, the NPD based on Gini entropy was proposed by Yager to present knowledge from a new view.27
The motivation of this study is to find a more general and reasonable model to evaluate the uncertainty of NPD. If the correlations between the
More importantly, in nature and society, everything has its negation. Regret and expectation give us two views from which to consider a problem. Moreover, the best alternative is as close as possible to the ideal solution and as far as possible from the negative one. Besides, NPD provides additional information based on the known information. That is to say, we can analyse a target from two sides, which can improve the correctness of decision-making. The reason is that NPD captures imprecision and the unknown, which is beneficial for decisions. More importantly, even if the original information is highly conflicting, its negation may not be. Based on this discussion, it is of great significance to study negation. Hence, this article also uses negation to recognize targets based on sensor fusion.
The rest of this article is structured as follows: In the section ‘Preliminaries’, preliminaries of some entropies and Yager’s negation method are introduced. The proposed method is introduced in the section ‘The proposed method’. In the section ‘Examples and discussion’, a numerical example is used to illustrate the method of negation and using Tsallis entropy to evaluate the uncertainty. The application of negation in target recognition based on sensor fusion is introduced in the section ‘Application of negation based on sensor fusion’. Finally, some conclusions are given in the section ‘Conclusion’.
Preliminaries
In this section, the preliminaries of some entropies and NPD will be briefly introduced.
Gini entropy
Entropy plays a very important role in many systems.31–33 In Yager's NPD, the Gini entropy is adopted.27
Definition 1
Gini entropy is defined as follows34

$$E_G(P) = 1 - \sum_{i=1}^{n} p_i^{2}$$

where $P = \{p_1, p_2, \ldots, p_n\}$ is a probability distribution over $n$ events, with $p_i \geq 0$ and $\sum_{i=1}^{n} p_i = 1$.
Moreover, if the goal is only to compare the uncertainty associated with two distributions, Gini entropy may be preferred to Shannon entropy for its simpler calculation.35 Knowledge representation is of great significance to modern science. In many fields, it has been regarded as the main driving force for applying theory to practice, such as aggregation,36 evidence resolution,37–39 decision-making40–42 and so on.43–45 Interestingly, probability is similar to a coin: it has an original side and a negative side.46 In summary, negation provides a new view from which to investigate the properties of probability.
Definition 2
Assuming a probability distribution $P = \{p_1, p_2, \ldots, p_n\}$, Yager's negation of $P$ is defined as27

$$\bar{p}_i = \frac{1 - p_i}{n - 1}$$

Because the probabilities are completely mutually exclusive, the negation is normalized by the factor $n - 1$, so the NPD satisfies

$$\sum_{i=1}^{n} \bar{p}_i = \frac{\sum_{i=1}^{n} (1 - p_i)}{n - 1} = \frac{n - 1}{n - 1} = 1$$
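As a minimal sketch, Yager's negation $\bar{p}_i = (1 - p_i)/(n - 1)$ can be computed directly; the distribution used below is only illustrative, not taken from the paper's examples.

```python
def negation(p):
    """Yager's negation: map each p_i to (1 - p_i) / (n - 1)."""
    n = len(p)
    return [(1.0 - pi) / (n - 1) for pi in p]

# illustrative distribution (not from the paper's examples)
p = [0.7, 0.2, 0.1]
p_bar = negation(p)
print([round(x, 4) for x in p_bar])  # [0.15, 0.4, 0.45]
print(round(sum(p_bar), 12))         # 1.0 -- the negation is normalized
```

Note that for $n = 2$ the negation simply swaps the two probabilities.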
It should be pointed out that, since the basic probability assignment has more flexibility to represent uncertainty, the negation of basic probability assignment has also received attention recently.46
Tsallis entropy
Definition 3
Given a probability distribution $P = \{p_1, p_2, \ldots, p_n\}$, Tsallis entropy is defined as

$$S_q(P) = \frac{1}{q - 1}\left(1 - \sum_{i=1}^{n} p_i^{q}\right)$$

when $q \to 1$, Tsallis entropy degenerates to Shannon entropy, $\lim_{q \to 1} S_q(P) = -\sum_{i=1}^{n} p_i \ln p_i$.

One can rewrite Tsallis entropy as follows29

$$S_q(P) = -\sum_{i=1}^{n} p_i^{q} \ln_q p_i$$

where $\ln_q x = \frac{x^{1-q} - 1}{1 - q}$ is the $q$-logarithm.
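The definition can be checked numerically. The following sketch (with an illustrative distribution) confirms that $q = 2$ recovers Gini entropy $1 - \sum p_i^2$ and that values of $q$ near 1 approach the Shannon value.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(P) = (1 - sum_i p_i**q) / (q - 1)."""
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: Shannon entropy (natural logarithm)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
print(round(tsallis_entropy(p, 2.0), 4))    # 0.62 -- equals Gini entropy 1 - sum(p^2)
print(round(tsallis_entropy(p, 1.0), 4))    # Shannon entropy
print(round(tsallis_entropy(p, 1.001), 4))  # close to the Shannon value
```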
The proposed method
From the above, it can be seen that Yager's method is a special case of the proposed one: Gini entropy coincides with Tsallis entropy at $q = 2$. In order to expand the application of negation, this article proposes using Tsallis entropy to measure the uncertainty of NPD.
In the following part, the properties of the proposed negation method are discussed.
Property 1
Tsallis entropy can increase after negation.
A proof sketch is as follows. Write $p_i = 1/n + d_i$, where $d_i$ is the deviation from the uniform distribution. The negation gives

$$\bar{p}_i = \frac{1 - p_i}{n - 1} = \frac{1}{n} - \frac{d_i}{n - 1}$$

so every deviation from the uniform distribution shrinks in magnitude by a factor of $1/(n - 1)$ when $n > 2$. Equivalently, $\bar{P} = AP$, where $A$ is the doubly stochastic matrix with zero diagonal and off-diagonal entries $1/(n - 1)$, so $\bar{P}$ is majorized by $P$. Since Tsallis entropy with $q > 0$ is Schur-concave and is maximized at the uniform distribution, it follows that $S_q(\bar{P}) \geq S_q(P)$. When $n = 2$, the negation merely swaps the two probabilities and the entropy is unchanged. From the above, it can be seen that Tsallis entropy can increase after negation.
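Property 1 can also be spot-checked numerically for several values of $q > 0$; the distribution below is illustrative.

```python
def negation(p):
    """Yager's negation (1 - p_i) / (n - 1)."""
    n = len(p)
    return [(1.0 - pi) / (n - 1) for pi in p]

def tsallis_entropy(p, q):
    """Tsallis entropy for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.7, 0.2, 0.1]
for q in (0.5, 2.0, 3.0):
    before = tsallis_entropy(p, q)
    after = tsallis_entropy(negation(p), q)
    print(q, round(before, 4), round(after, 4), after >= before)  # always True here
```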
Property 2
There is maximum Tsallis entropy with uniform distribution after multiple negations.
The general formula of the negation method after $t$ iterations is as follows

$$\bar{p}_i^{(t)} = \frac{1}{n} + \left(\frac{-1}{n - 1}\right)^{t}\left(p_i - \frac{1}{n}\right)$$

where $t$ is the number of negation iterations. For $n > 2$, the deviation term vanishes as $t \to \infty$, so $\bar{p}_i^{(t)} \to 1/n$ and the distribution converges to the uniform distribution. From the above, it can be seen that Tsallis entropy reaches its maximum after multiple negations.
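Assuming the standard closed form for iterated negation, $\bar{p}_i^{(t)} = 1/n + (-1/(n-1))^{t}(p_i - 1/n)$, it can be checked against direct iteration; the starting distribution is illustrative.

```python
def negation(p):
    """One step of Yager's negation."""
    n = len(p)
    return [(1.0 - pi) / (n - 1) for pi in p]

def negation_closed_form(p, t):
    """Closed form after t negations: 1/n + (-1/(n-1))**t * (p_i - 1/n)."""
    n = len(p)
    return [1.0 / n + (-1.0 / (n - 1)) ** t * (pi - 1.0 / n) for pi in p]

p = [0.7, 0.2, 0.1]
dist = list(p)
for t in range(1, 6):
    dist = negation(dist)
    print(t, [round(x, 4) for x in dist])
# the iterates approach the uniform distribution [1/3, 1/3, 1/3]
assert all(abs(a - b) < 1e-12 for a, b in zip(dist, negation_closed_form(p, 5)))
```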
Besides, in nature, any operation causes energy consumption. A change of entropy means a consumption of energy, and the consumed energy cannot be reused. Similarly, a variation of information entropy is accompanied by a consumption of information. Among the various entropies, Tsallis entropy is non-extensive and is applied in artificial and social complex systems. Hence, using Tsallis entropy to measure the uncertainty of NPD can expand the applications of negation.
Examples and discussion
Example
Example 1
Given the event space
Comparing
It is apparent that the original probability distribution
It is necessary to explore what causes this irreversibility. From Example 1, it can be found that the probability is redistributed after negation. Hence, it should be considered whether the uncertainty changes after taking the NPD. In the next section, we discuss why the negation process is generally irreversible and how to measure the uncertainty of the probability under the proposed negation method.
Further discussion
Considering Example 1 again, Table 1 shows the change of probability after each iteration of the negation process. It is clear that probability is reallocated after negation. Gradually, the probability distribution comes closer and closer to the uniform distribution. What is the cause of this phenomenon?
The change of probability after each iteration of negation process.
The concept of entropy is derived from physics.47,48 With the increasing application of entropy,49,50 information entropy has become an indispensable part of modern science.51,52 Shannon53 first proposed the concept of information entropy to describe uncertainty, and it has been applied in many fields.54,55 However, as in typical physical problems, there are examples where the Boltzmann-Shannon entropy is not suitable.56,57 In 1988, Tsallis proposed a non-extensive entropy called Tsallis entropy. Subsequently, non-extensive statistical mechanics, a generalization of Boltzmann-Gibbs statistics, emerged based on Tsallis entropy. More importantly, Boltzmann-Gibbs statistics is recovered in the limit $q \to 1$.
Let us calculate the uncertainty using Tsallis entropy with different values of $q$.

The trend of Tsallis entropy after each iteration of negation process.
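The trend in the figure can be reproduced numerically: Tsallis entropy is non-decreasing over successive negations and approaches the uniform-distribution maximum. The distribution and $q$ values below are illustrative, not the paper's own.

```python
def negation(p):
    """One step of Yager's negation."""
    n = len(p)
    return [(1.0 - pi) / (n - 1) for pi in p]

def tsallis_entropy(p, q):
    """Tsallis entropy for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.7, 0.2, 0.1]
for q in (0.5, 2.0, 3.0):
    dist, trend = list(p), []
    for _ in range(6):
        trend.append(round(tsallis_entropy(dist, q), 4))
        dist = negation(dist)
    print(q, trend)  # non-decreasing, approaching the uniform-distribution maximum
```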
Application of negation based on sensor fusion
Target recognition has received great attention in military applications.60 There are several methods for making decisions.61–63 Fusion rules can help us make better decisions.64,65 However, using existing information to make more accurate decisions is also an open issue. Negation provides a new view from which to obtain a more accurate judgement with the collected information. A specific example is as follows.
There are three sensors to recognize the target, which may be
Dempster's rule is widely used in sensor data fusion.64 The combination results are shown as follows
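For singleton (Bayesian) mass functions, Dempster's rule multiplies agreeing masses and renormalizes by $1 - k$, where $k$ is the conflict coefficient. A minimal sketch with illustrative sensor reports (the paper's own numbers are not reproduced here):

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two Bayesian mass functions over the same singletons."""
    keys = m1.keys()
    # conflict: total mass assigned to disagreeing pairs of hypotheses
    k = sum(m1[a] * m2[b] for a in keys for b in keys if a != b)
    if k >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {a: m1[a] * m2[a] / (1.0 - k) for a in keys}

# hypothetical sensor reports (not the paper's numbers)
s1 = {"A": 0.6, "B": 0.3, "C": 0.1}
s2 = {"A": 0.5, "B": 0.4, "C": 0.1}
s3 = {"A": 0.7, "B": 0.2, "C": 0.1}
fused = dempster_combine(dempster_combine(s1, s2), s3)
print({a: round(v, 4) for a, v in fused.items()})  # support for "A" is reinforced
```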
Using the method of negation, new information is obtained as follows
Similarly, the fusion rule is applied as follows
It is well known that the conflict coefficient
More importantly, by comparing the original results with those of the negation, the decision gains more support for target
Next, considering the changes of entropy between the original and negation, assume that
From the above, it can be seen that the Tsallis entropy after fusion becomes smaller, showing that fusion can decrease the uncertainty of the information.
However, fusion rules do not work for extreme probability distributions. Negation provides another view from which to analyse this phenomenon.
There is a special example that better explains the application of negation. There are two sensors to recognize the targets, which may be
Using fusion rules, one can get the results as follows
From the result, it can be seen that the target must be
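The phenomenon can be reproduced with a Zadeh-style pair of highly conflicting reports (hypothetical numbers, not the paper's): Dempster's rule assigns all mass to the weakly supported target, while fusing the negations is no longer dominated by a vanishing normalization factor.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two Bayesian mass functions over the same singletons."""
    keys = m1.keys()
    k = sum(m1[a] * m2[b] for a in keys for b in keys if a != b)  # conflict
    return {a: m1[a] * m2[a] / (1.0 - k) for a in keys}

def negation(m):
    """Yager's negation applied to a mass function over n singletons."""
    n = len(m)
    return {a: (1.0 - v) / (n - 1) for a, v in m.items()}

# Zadeh-style highly conflicting reports (hypothetical numbers)
s1 = {"A": 0.99, "B": 0.01, "C": 0.0}
s2 = {"A": 0.0, "B": 0.01, "C": 0.99}

fused = dempster_combine(s1, s2)
print({a: round(v, 4) for a, v in fused.items()})      # all mass goes to "B"

fused_neg = dempster_combine(negation(s1), negation(s2))
print({a: round(v, 4) for a, v in fused_neg.items()})  # a smooth, non-degenerate result
```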
The fusion results are as follows
Next, considering the changes of entropy between the original and negation, assume that
Obviously, negation gives us some new information. From the above result, the probability that target is not
In addition, comparing the Tsallis entropy of the original and the negation, it can be found that the Tsallis entropy becomes larger. Besides, according to the second law of thermodynamics, the entropy of an isolated system never decreases. The negation can increase Tsallis entropy, and the process is irreversible. From this view, negation makes systems spontaneously evolve towards thermodynamic equilibrium, the state of maximum entropy. That is to say, negation tends to make the distribution uniform, and in fact it does. Hence, negation not only provides a new way to understand problems, but also improves the performance of decision support systems. Besides, because Tsallis entropy is widely applied, negation provides a new view from which to obtain information. Hence, using Tsallis entropy to measure uncertainty can enlarge the application of negation.
Conclusion
The probability distribution is an efficient way to represent knowledge. However, everything in nature and society has its negation, which shows that negation is essential. Similarly, a probability distribution also has its negation. This article extends the existing negation by using Tsallis entropy. Besides, a numerical example is used to calculate the uncertainty with different
This article is a preliminary study of the NPD based on uncertainty measurements. The work is mainly intended to design a more efficient negation process and uncertainty measurement, and to expand the application of negation. Besides, it is essential to determine the uncertainty related to the negation and to study its properties.
Footnotes
Handling Editor: Mohamed Abdel-Basset
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The work is partially supported by National Natural Science Foundation of China (Grant Nos 61573290, 61503237).
