To select a suitable method for decision tree pruning, four well-known pruning
methods were compared in terms of computational complexity, traversal strategy, error estimation,
and theoretical principle, taking a classification and regression tree (CART) as an example. Compared with
pessimistic error pruning (PEP), minimum error pruning (MEP) is less accurate and produces a
larger tree. Reduced error pruning (REP) is one of the simplest pruning strategies, but it has the
disadvantage of requiring a separate data set for pruning. Cost-complexity pruning (CCP) produces
a smaller tree than REP with similar accuracy. In practice, if training data are abundant, REP is
preferable; if high accuracy is required but data are limited, PEP is a good
choice.
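To make the REP trade-off concrete, the following is a minimal sketch of reduced error pruning on a toy tree. The nested-dict tree representation (`feature`, `threshold`, `left`, `right` keys, with class labels as leaves) is a hypothetical choice made for illustration, not taken from any library; it shows why REP needs a separate pruning set: each subtree is replaced by a majority leaf only when doing so does not increase error on that held-out data.

```python
from collections import Counter

# Hypothetical tree encoding: internal nodes are dicts with a split
# ("feature" index, "threshold") and "left"/"right" children; leaves
# are plain class labels.

def predict(node, x):
    """Route example x down the tree until a leaf label is reached."""
    while isinstance(node, dict):
        node = node["left"] if x[node["feature"]] <= node["threshold"] else node["right"]
    return node

def errors(node, data):
    """Count misclassifications of (x, y) pairs by the (sub)tree."""
    return sum(1 for x, y in data if predict(node, x) != y)

def majority_label(data):
    """Majority class among the pruning examples reaching a node."""
    return Counter(y for _, y in data).most_common(1)[0][0]

def reduced_error_prune(node, data):
    """Bottom-up REP: replace a subtree with its majority-class leaf
    whenever that does not increase error on the separate pruning set."""
    if not isinstance(node, dict) or not data:
        return node  # leaf, or no pruning examples reach this node
    left = [(x, y) for x, y in data if x[node["feature"]] <= node["threshold"]]
    right = [(x, y) for x, y in data if x[node["feature"]] > node["threshold"]]
    node["left"] = reduced_error_prune(node["left"], left)
    node["right"] = reduced_error_prune(node["right"], right)
    leaf = majority_label(data)
    return leaf if errors(leaf, data) <= errors(node, data) else node

tree = {"feature": 0, "threshold": 0.5, "left": "B",
        "right": {"feature": 1, "threshold": 0.5, "left": "A", "right": "B"}}
# Separate pruning set: the right-hand split does not help here,
# so that subtree collapses to the single leaf "A".
prune_set = [((0.2, 0.3), "B"), ((0.7, 0.2), "A"), ((0.7, 0.8), "A")]
pruned = reduced_error_prune(tree, prune_set)
```

In this example the right subtree is replaced by the leaf `"A"` while the root split is kept, illustrating the method's main cost: the three examples in `prune_set` are withheld from training purely to drive pruning decisions, which is exactly the disadvantage noted above when data are limited.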