The five machine learning algorithms used in the proposed work were Decision Table
(Chen, 2017), OneR (Alam and Pachauri, 2017), J48 (Katare and Dubey, 2017), Random Forest (Beaulac and Rosenthal, 2017) and Random Tree (Sutera, 2013).
Considering the limitations of some ML approaches, in this work we perform the quality assessment based on Decision Trees and Decision Tables, more precisely: M5P, REPTree, and Decision Table.
Generally, cause-effect tables, also known as decision tables
or input-output tables, relate input conditions and/or events to output states and/or events.
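The input-to-output mapping described above can be sketched in Python; the weather-related conditions and actions below are hypothetical, chosen only to illustrate the structure of a cause-effect table.

```python
# A minimal sketch of a cause-effect (decision) table: each rule maps
# a tuple of input conditions to an output action. The condition and
# action names are illustrative, not from any cited work.
decision_table = {
    ("raining", "windy"): "stay_inside",
    ("raining", "calm"):  "take_umbrella",
    ("sunny",   "windy"): "wear_jacket",
    ("sunny",   "calm"):  "go_outside",
}

def decide(weather: str, wind: str, default: str = "no_rule") -> str:
    """Look up the output state for a combination of input conditions."""
    return decision_table.get((weather, wind), default)

print(decide("raining", "calm"))  # take_umbrella
```

A dictionary keyed by condition tuples keeps the lookup O(1) and makes the table exhaustively enumerable, which is the usual appeal of the tabular form.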
In choosing the right weak classifier for AdaBoost, four classifiers were compared as base classifiers: NNge (Non-nested generalised exemplars), JRip (Extended Repeated Incremental Pruning), RIDOR (Ripple-Down Rule), and Decision Tables.
The traditional attribute reduction methods, such as the discernibility matrix method and reduction based on attribute importance, can find all reducts of a decision table [10, 11], but these methods are only suitable for decision tables with small amounts of data and low dimensionality.
A decision table S = (U, P, Q), where P = {a_1, a_2, ..., a_m} and Q = {d_1, d_2, ..., d_n} (m ≥ 1, n ≥ 1).
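A rough-set decision table of this form can be sketched directly in Python; the objects, attribute names, and values below are hypothetical placeholders for U, P, and Q.

```python
# Sketch of a decision table S = (U, P, Q): U is the universe of
# objects, P the condition attributes, Q the decision attributes.
# All names and values here are illustrative.
U = {
    "x1": {"a1": 1, "a2": 0, "d1": "yes"},
    "x2": {"a1": 1, "a2": 1, "d1": "no"},
    "x3": {"a1": 0, "a2": 0, "d1": "yes"},
}
P = ["a1", "a2"]  # condition attributes
Q = ["d1"]        # decision attributes

def indiscernible(x: str, y: str, attrs: list) -> bool:
    """Two objects are indiscernible w.r.t. attrs if they agree on all of them."""
    return all(U[x][a] == U[y][a] for a in attrs)

# Partition U into equivalence classes under the condition attributes P.
classes = []
for obj in U:
    for cls in classes:
        if indiscernible(obj, cls[0], P):
            cls.append(obj)
            break
    else:
        classes.append([obj])

print(classes)  # each class groups condition-indiscernible objects
```

The indiscernibility partition computed here is the basic construct on which rough-set reduct search operates.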
The best five algorithms were as follows (Figure 7): Naive Bayes, Simple Logistic, lazy.IBk, Decision Table, and LMT (Logistic Model Tree).
Compared with the analytic hierarchy process, fuzzy comprehensive evaluation, neural networks, etc., rough set theory needs no prior information, such as fuzzy membership functions, basic probability assignments, or statistical probability distributions, beyond a decision table. While retaining the key categories of knowledge, it can reduce attributes, simplify the decision table, and find a minimal expression of the classification knowledge by identifying and evaluating dependency relations, thereby overcoming shortcomings of traditional methods such as strong subjectivity and poor testability (Hu et al.).
If for each scheme we define the modulation mode, the encoding rate, and the channel state as conditional attributes, while the transmitting power is the decision attribute, then we get an initial decision table
. This table has two problems: 1) possible correlations between the conditional attributes can cause attribute confusion during decision-making and thus reduce the distinctiveness of the derived rules.
3, in which (i) a set of Boolean variables BV(U) is defined; (ii) a new decision table T^P is created using the Boolean variables BV(U) defined in the previous step (T^P is called the P-discretization of the decision table); and (iii) a minimal subset of P that discerns all the objects in different decision classes is searched for.
The values associated with the alternatives for MADM (multiple attribute decision making) problems presented in the decision table
A "decision table" models a set of conditionals and their corresponding results.
"Don't ask for permission, ask for forgiveness." This rallying cry against the "let's do as we always did" approach has helped the digital professionals take their seat at the decision table
in many institutions.
The inference mechanism of the fuzzy logic controller is represented by a 7×7 decision table.
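A 7×7 fuzzy rule table of this kind can be sketched as follows, assuming the customary seven linguistic terms (negative big through positive big); the skew-symmetric rule pattern used to fill the table is a plausible example, not the cited controller's actual rules.

```python
# Sketch of a 7x7 fuzzy-controller decision (rule) table over the
# usual seven linguistic terms. The fill pattern below (output index
# = clipped i + j - 3) is a common illustrative choice, not taken
# from any specific controller.
TERMS = ["NB", "NM", "NS", "ZE", "PS", "PM", "PB"]  # neg-big ... pos-big

# rule_table[i][j]: output term for error term i, change-of-error term j.
rule_table = [
    [TERMS[max(0, min(6, i + j - 3))] for j in range(7)] for i in range(7)
]

def infer(error_term: str, derror_term: str) -> str:
    """Look up the controller output term for a pair of input terms."""
    return rule_table[TERMS.index(error_term)][TERMS.index(derror_term)]

print(infer("ZE", "ZE"))  # ZE
```

In a full controller this lookup would be wrapped by fuzzification of the crisp inputs and defuzzification of the output term; the table itself is only the rule base.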