> As its associative modeling kernel, our CAEBM technology in most cases deploys rather simple "feed-forward" Neural Nets with only one hidden layer. This simple architecture nevertheless offers unrestricted modeling power for arbitrarily high-dimensional problems (cf. the Universal Approximation Theorem, e.g. at Wikipedia), and in particular high scalability with respect to problem complexity and model accuracy, which is very important for CAEBM. Moreover, measures for complexity, modeling accuracy, and even "modeling strength" are available, also to compare CAEBM work across different tasks, e.g. with different IP-sets for a given problem.
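A one-hidden-layer feed-forward net of the kind described above can be sketched as follows. This is a minimal illustration, not CAEBM's actual implementation: the layer sizes, tanh activation, learning rate, and the XOR toy problem are all assumptions made for the example, and the parameter count at the end stands in for one simple complexity measure.

```python
import numpy as np

# Sketch of a one-hidden-layer feed-forward net (illustrative assumptions,
# not CAEBM's actual implementation).
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 2, 6, 1
W1 = rng.normal(0, 1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, n_out)); b2 = np.zeros(n_out)

def forward(X):
    H = np.tanh(X @ W1 + b1)        # the single hidden layer
    return H, H @ W2 + b2           # linear output

# Toy problem: XOR, which no model without a hidden layer can represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)

def mse():
    return float(np.mean((forward(X)[1] - Y) ** 2))

loss_before = mse()
lr = 0.3
for _ in range(3000):               # plain gradient descent on squared error
    H, out = forward(X)
    err = out - Y
    dW2 = H.T @ err / len(X); db2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)    # backprop through tanh
    dW1 = X.T @ dH / len(X); db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1
loss_after = mse()
print(f"loss: {loss_before:.3f} -> {loss_after:.3f}")

# One simple complexity measure: the number of free weights in the model.
n_params = W1.size + b1.size + W2.size + b2.size
print(f"free parameters: {n_params}")
```

Despite the single hidden layer, widening `n_hidden` scales the model to arbitrarily complex mappings, which is what makes the architecture attractive for comparing accuracy against complexity across tasks.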
Alternatively, especially if non-numeric parameters have to be used for the problem at hand, recently developed Decision Tree Forest (DTF) techniques can be applied to set up powerful models that are comparable to Neural Nets in terms of model complexity, but are able to handle non-numeric parameters directly and efficiently, without any parameter transformation or coding.
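The point about direct handling of non-numeric parameters can be illustrated with a tiny ID3-style decision tree that splits on categorical values as-is, with no one-hot or integer coding. The attribute names and data are invented for the example; a real DTF would grow many such trees and aggregate them.

```python
import math
from collections import Counter

# Sketch: a decision tree splitting on non-numeric attributes directly,
# without any coding. Data and attributes are invented for illustration.

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(rows, labels, attrs):
    # choose the attribute whose value-partition reduces entropy most
    base, best, best_gain = entropy(labels), None, 0.0
    for a in attrs:
        parts = {}
        for r, y in zip(rows, labels):
            parts.setdefault(r[a], []).append(y)
        rem = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
        if base - rem > best_gain:
            best, best_gain = a, base - rem
    return best

def build(rows, labels, attrs):
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    a = best_split(rows, labels, attrs)
    if a is None:                       # no informative split left
        return Counter(labels).most_common(1)[0][0]
    branches = {}
    for v in {r[a] for r in rows}:      # one branch per categorical value
        idx = [i for i, r in enumerate(rows) if r[a] == v]
        branches[v] = build([rows[i] for i in idx], [labels[i] for i in idx],
                            [x for x in attrs if x != a])
    return (a, branches)

def predict(node, row):
    while isinstance(node, tuple):
        a, branches = node
        node = branches[row[a]]
    return node

# Purely categorical inputs, used directly:
rows = [{"weather": "sunny", "wind": "weak"},
        {"weather": "sunny", "wind": "strong"},
        {"weather": "rainy", "wind": "weak"},
        {"weather": "rainy", "wind": "strong"}]
labels = ["go", "go", "go", "stay"]
tree = build(rows, labels, ["weather", "wind"])
result = predict(tree, {"weather": "rainy", "wind": "strong"})
print(result)
```

Because every branch is keyed by the raw categorical value, nothing about the inputs has to be transformed before training, which is the advantage claimed for DTFs above.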
> In accordance with the powerful, prejudice-free modeling capabilities of our CAEBM technology, the training methods used for the neural models are chosen to match, as not solution speed but solution quality is the measure: For model complexity as well as for model accuracy, powerful Genetic Algorithms (GA) are used in concert to find the best modeling solution. And because model complexity and accuracy can be measured, modeling approaches, e.g. for different IP-sets, can be compared directly, to make the best modeling decisions.
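The GA-driven search over IP-sets can be sketched as follows. This is a hypothetical, simplified setup: a bit-string genome selects which IPs enter the model, a plain least-squares fit stands in for the real neural training, and the fitness combines accuracy with a complexity penalty. Penalty weight, population size, and the synthetic data are all assumptions for the example.

```python
import numpy as np

# Sketch: a Genetic Algorithm searching over IP subsets, scoring each
# candidate by accuracy plus a complexity penalty (all settings assumed).
rng = np.random.default_rng(1)

# Synthetic data: 6 candidate IPs, of which only IP0 and IP2 matter.
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=200)

def fitness(mask):
    if not mask.any():
        return 1e9                            # empty IP-set is invalid
    Xs = X[:, mask.astype(bool)]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rmse = np.sqrt(np.mean((Xs @ coef - y) ** 2))
    return rmse + 0.05 * mask.sum()           # accuracy + complexity, lower is better

pop = rng.integers(0, 2, (30, 6))
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[:10]]    # truncation selection
    children = [parents[0].copy()]            # elitism: keep the best
    while len(children) < len(pop):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 6)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        flip = rng.random(6) < 0.1                   # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = pop[np.argmin([fitness(m) for m in pop])]
print("best IP-set mask:", best)
```

Because fitness scores for different IP-sets live on one scale, any two candidate models are directly comparable, which is what the text means by making the best modeling decisions measurably.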
The same goes for using appropriate Boosting techniques (instead of GAs) on DTFs, where often less computing power is needed for a given problem model, and the explanation features (of Decision Trees!) surpass those of Neural Nets.
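Boosting on shallow trees can be sketched with AdaBoost over one-level decision stumps, the simplest member of the tree family. The data, number of rounds, and threshold grid are illustrative assumptions; real DTF boosting uses deeper trees and a proper library.

```python
import numpy as np

# Sketch: AdaBoost over decision stumps, as a stand-in for boosting on DTFs.
# Data and hyperparameters are illustrative assumptions.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)    # toy target: a diagonal boundary

def best_stump(X, y, w):
    # exhaustively pick (feature, threshold, sign) minimizing weighted error
    best = (1.0, 0, 0.0, 1)
    for f in range(X.shape[1]):
        for t in np.linspace(-1, 1, 21):
            for s in (1, -1):
                pred = np.where(X[:, f] > t, s, -s)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, f, t, s)
    return best

stumps, alphas = [], []
w = np.full(len(y), 1 / len(y))
for _ in range(20):                            # boosting rounds
    err, f, t, s = best_stump(X, y, w)
    err = max(err, 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)      # AdaBoost stump weight
    pred = np.where(X[:, f] > t, s, -s)
    w *= np.exp(-alpha * y * pred)             # up-weight hard examples
    w /= w.sum()
    stumps.append((f, t, s)); alphas.append(alpha)

def predict(X):
    score = sum(a * np.where(X[:, f] > t, s, -s)
                for a, (f, t, s) in zip(alphas, stumps))
    return np.sign(score)

acc = (predict(X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Each stump in the ensemble is a human-readable rule ("IP f above threshold t"), which is the explanation advantage of trees that the text refers to.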
> Additionally, the contribution of each IP and the "structure" of the high-dimensional modeling solution become visible to humans, whose problem-understanding capabilities are restricted to 2-5 dimensions: Does the problem have (only) one "best" solution? Or are there several, or even many, solutions, and what are their modeling qualities? And, of course, different modeling solutions (e.g. for different IP-sets) can be compared, e.g. concerning modeling complexity, modeling quality, and modeling strength, where appropriate. This way, the moderately higher cost in terms of computing power for the complexity and accuracy calculations is more than compensated for by the rich information delivered by the highly scalable modeling process. Not the fastest (in terms of computing time) but the "best" modeling result (in terms of different IP-sets) can be found along this line.
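One simple way to make each IP's contribution visible, sketched here under assumed data and with a least-squares model standing in for the trained model, is to refit with one IP removed and report the resulting accuracy loss:

```python
import numpy as np

# Sketch: per-IP contribution via drop-one refitting (data and the
# linear stand-in model are illustrative assumptions).
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))
y = 1.0 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=300)  # IP2, IP3 irrelevant

def rmse(cols):
    Xs = X[:, cols]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return np.sqrt(np.mean((Xs @ coef - y) ** 2))

full = rmse([0, 1, 2, 3])
losses = []
for ip in range(4):
    loss = rmse([c for c in range(4) if c != ip]) - full
    losses.append(loss)
    print(f"IP{ip}: RMSE increase when dropped = {loss:.3f}")
```

The resulting one-number-per-IP summary is exactly the kind of low-dimensional view that lets a human judge a high-dimensional solution.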
> Not to mention that Computer Aid is an indispensable ingredient of this modeling technology: again, CA + EBM = CAEBM!