Abstract
A measure of uncertainty and information for possibility theory is introduced in this paper. The measure is called the U-uncertainty or, alternatively, the U-information. Due to its properties, the U-uncertainty/information can be viewed as a possibilistic counterpart of the Shannon entropy and, at the same time, a generalization of the Hartley uncertainty/information.
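The abstract does not state the measure explicitly; as a hedged illustration, the U-uncertainty of a finite possibility distribution is commonly formulated as U(r) = Σᵢ (rᵢ − rᵢ₊₁) log₂ i, where the possibility values are arranged in non-increasing order and rₙ₊₁ = 0. The sketch below assumes this formulation; note how it reduces to the Hartley measure log₂ m for a crisp distribution over m elements.

```python
import math

def u_uncertainty(r):
    """U-uncertainty of a finite possibility distribution r.

    Assumes the common formulation: sort possibility values in
    non-increasing order r_1 >= r_2 >= ... >= r_n, set r_{n+1} = 0,
    and compute U(r) = sum_{i=1}^{n} (r_i - r_{i+1}) * log2(i).
    """
    s = sorted(r, reverse=True) + [0.0]
    # i runs from 1 to n; log2(1) = 0, so the first term always vanishes.
    return sum((s[i - 1] - s[i]) * math.log2(i) for i in range(1, len(r) + 1))
```

For a crisp (all-ones) distribution over four elements, `u_uncertainty([1, 1, 1, 1])` yields 2.0 = log₂ 4, the Hartley uncertainty, consistent with the generalization claimed above.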
A conditional U-uncertainty is also derived in this paper; it depends on the U-uncertainties of the joint and marginal possibility distributions in exactly the same way as the conditional Shannon entropy depends on the entropies of the joint and marginal probability distributions. The conditional U-uncertainty is derived without the use of the notion of conditional possibilities, thus avoiding a current controversy in possibility theory.
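Since the dependence is stated to be exactly that of the conditional Shannon entropy, the relation can be written (with $X$, $Y$ denoting the variables of the joint distribution):
\[
U(X \mid Y) = U(X, Y) - U(Y),
\]
mirroring the probabilistic identity $H(X \mid Y) = H(X, Y) - H(Y)$.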
The proposed measures of U-uncertainty and conditional U-uncertainty provide a foundation for developing an alternative theory of information, one based on possibility theory rather than probability theory.
Notes
†This work was supported by the National Science Foundation under Grant No. ECS-8006590