Where $N$ is the number of image pixels, $C$ is the number of clusters, $\mu_{ij}$ is the intuitionistic fuzzy membership function, $m$ is an intuitionistic fuzzy constant, $d_{ij}^{2}$ is the kernel-space distance, $M_{ij}$ is the local spatial-gray information measure, and $\mathrm{IFE}(A)$ is the intuitionistic fuzzy entropy.
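The objective function these symbols describe is not reproduced in this excerpt. A sketch of the typical form such kernel-based intuitionistic fuzzy clustering objectives take (the exact placement and weighting of the entropy term here is an assumption, not the source's definition) is:

```latex
J = \sum_{i=1}^{C}\sum_{j=1}^{N} \mu_{ij}^{\,m}\, d_{ij}^{2}\, M_{ij} \;+\; \mathrm{IFE}(A),
\qquad \text{subject to } \sum_{i=1}^{C} \mu_{ij} = 1 \quad \forall\, j .
```

The membership constraint is the standard fuzzy-partition condition; minimizing $J$ trades off kernel-space compactness (weighted by the local spatial-gray term $M_{ij}$) against the intuitionistic fuzzy entropy of the partition.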
Roy and Kafatos (2003) showed that Wootters' measure of information (and distance) is related to the Fisher information measure, considered the mother of all information measures, including Shannon's measure.
Let $n \in \mathbb{N}$ and $\xi > 0$ ($\xi \neq 1$) be arbitrarily fixed; then the mean length $L(\xi)$ corresponding to the generalized information measure $H(A; \xi)$ is given by the formula
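The formula itself does not appear in this excerpt. For orientation, one classical mean code-word length of this parametric kind (a sketch only, assuming a Campbell-style exponentiated mean; the source's $L(\xi)$ may be defined differently) is

```latex
L(t) \;=\; \frac{1}{t}\,\log_{D}\!\left(\sum_{i=1}^{n} p_i\, D^{\,t\, l_i}\right), \qquad t > 0,
```

where $p_i$ are the source probabilities, $l_i$ the code-word lengths, and $D$ the alphabet size. Campbell showed such a length is bounded below by the Rényi entropy of order $\alpha = 1/(1+t)$, which is the kind of relation results of this type establish between $L(\xi)$ and $H(A;\xi)$.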
(2009) that demonstrated a correlation between nurse staffing and improvements in the recommend-the-hospital and discharge-information measures. These referenced analyses were cross-sectional, but they included key covariates for the nursing work environment: a measure of nursing leadership capacity, nurses' participation in hospital affairs, and nurse-physician relationships.
Their findings strengthen the power of the Corwin-Schultz measure as an asymmetric information measure, but go against the stationarity finding, because poor quality of financial reporting generates asymmetric information.
Hence the selected features of the mixed feature set, such as homogeneity (homom), sum of averages (savgh), difference variance (dvarh), information measure of correlation-1 (inf1h), information measure of correlation-2 (inf2h), inverse difference normalized (indnc), short-run emphasis (SRE), short-run high gray-level emphasis (SRHGE), and length (l), show excellent accuracy.
(ii) to assess whether information measures derived from financial ratios with discriminant analysis can improve the prediction ability of business failure compared to just using information measures of financial ratios and financial ratios with discriminant analysis;
In this paper we propose an extension of the standard single-feature mutual information similarity measure to a multi-feature mutual information measure, and solve the problem of efficiently estimating the feature joint probability distribution in a high-dimensional feature space.
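The multi-feature estimator itself is not shown in this excerpt. As background, the single-feature measure it extends is the standard plug-in mutual information computed from a joint histogram of paired samples; a minimal sketch (the function name and the toy data are illustrative, not from the source):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X; Y) in bits from paired discrete samples
    using the plug-in joint-histogram estimator."""
    n = len(xs)
    joint = Counter(zip(xs, ys))   # joint counts for (x, y) pairs
    px = Counter(xs)               # marginal counts for x
    py = Counter(ys)               # marginal counts for y
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), rewritten with counts
        mi += pxy * math.log2(pxy * n * n / (px[x] * py[y]))
    return mi

# Identical binary sequences: I(X; X) equals H(X) = 1 bit when uniform.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))
```

The high-dimensional difficulty the paper addresses is visible here: with $k$ features the joint histogram lives over $k$-tuples, so naive counting becomes sparse and unreliable as $k$ grows.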
One way to overcome this problem is to apply an interval information measure. In Veerkamp & Berger (1997), Fisher information was integrated over a small interval around the ability estimate in order to correct for the uncertainty in the estimate.
In the present paper, we study the characterization results of the past lifetime distributions by using the following generalized information measure
Although often called an information measure, in reality H is not a direct measure of information and at best measures information indirectly, or inversely.
To have a consistent model of turbulence, an information measure of the Navier-Stokes (N-S) equations should be defined.
Perhaps it is to be expected that an information measure, such as is used by Barrett and Sober, is able to mirror these human abilities.
In this communication, we have proposed a new (R, S)-norm information measure of Pythagorean fuzzy sets and applied the information measure in an algorithm to solve multicriteria decision-making problems.
In this paper, we introduce and study a new information measure, called the Rényi-Tsallis entropy of order $\xi$, together with a new mean code-word length, and discuss their relation to each other.
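The combined Rényi-Tsallis measure is the paper's own construction and is not reproduced here. For reference, a sketch of the two standard one-parameter entropies it builds on, both of which recover Shannon entropy in the limit $\xi \to 1$ (function names are illustrative):

```python
import math

def renyi_entropy(p, xi):
    """Renyi entropy of order xi (xi > 0, xi != 1), in nats:
    (1 / (1 - xi)) * log( sum p_i^xi )."""
    return math.log(sum(pi ** xi for pi in p if pi > 0)) / (1.0 - xi)

def tsallis_entropy(p, xi):
    """Tsallis entropy of order xi (xi > 0, xi != 1):
    (1 - sum p_i^xi) / (xi - 1)."""
    return (1.0 - sum(pi ** xi for pi in p if pi > 0)) / (xi - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats, the xi -> 1 limit of both families."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
# Near xi = 1, both generalized entropies approach the Shannon value.
print(renyi_entropy(p, 1.0001), tsallis_entropy(p, 1.0001), shannon_entropy(p))
```

The two families differ in structure (Rényi is additive for independent sources; Tsallis is not), which is what makes hybrid measures and their code-length counterparts worth studying.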