THE INNER WORKINGS OF TRANSFORMATION
The processing chain takes raw data and, step by step, extracts the elements needed for decision-making. After denoising, the filtered data is used to build coherent training sets for the learning algorithms, whose performance is then measured on separate, equally coherent test sets.
We design your solution as a processing sequence built from a series of operators, using a representation similar to that of automata theory.
An application is defined as a response to a problem in the form of a sequence of actions. Each action is performed in response to a specific situation and the sequence of those actions leads to the objective.
Alongside the classic operators, we specialise in:
- Modelling operators,
- Optimising parameters in a chain of operators,
- Implementing real-time operation of a chain of operators.
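The idea of an application as a sequence of operators can be sketched in a few lines. This is an illustrative toy only: the operator names (`denoise`, `normalise`, `run_chain`) and the fixed plausible range are assumptions for the example, not part of any real toolchain.

```python
from typing import Callable, List

Operator = Callable[[list], list]

def denoise(data: list) -> list:
    # Toy denoising: drop values outside a fixed plausible range.
    return [x for x in data if -100.0 <= x <= 100.0]

def normalise(data: list) -> list:
    # Scale values into [0, 1] (assumes the data is not constant).
    lo, hi = min(data), max(data)
    return [(x - lo) / (hi - lo) for x in data]

def run_chain(operators: List[Operator], data: list) -> list:
    # Apply each operator in turn; the output of one feeds the next,
    # mirroring the automaton-like chain described above.
    for op in operators:
        data = op(data)
    return data

result = run_chain([denoise, normalise], [3.0, 999.0, -1.0, 7.0])
# The outlier 999.0 is removed, then the survivors are rescaled.
```

Each operator responds to one situation (noise, unscaled values), and the sequence of those responses leads to the objective, exactly as in the definition above.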
Creating learning sets
We refer to the set of data characterising the problem to be solved as the “data landscape”. We map this landscape using techniques such as:
- Confusion measures
- Ambiguity measures
- Distance rejection measures
- Factor analysis
- Spectral analysis
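As an example of one of these measures, a simple confusion measure can be computed by counting how often each (true, predicted) label pair occurs and reporting the off-diagonal rate. The function names and the exact measure are assumptions for illustration, not a description of the firm's internal tooling.

```python
from collections import Counter

def confusion_counts(y_true, y_pred):
    # Count each (true label, predicted label) pair.
    return Counter(zip(y_true, y_pred))

def confusion_rate(y_true, y_pred):
    # Fraction of samples whose predicted label differs from the true one
    # (the off-diagonal mass of the confusion matrix).
    counts = confusion_counts(y_true, y_pred)
    total = sum(counts.values())
    confused = sum(n for (t, p), n in counts.items() if t != p)
    return confused / total

y_true = ["a", "a", "b", "b", "b", "c"]
y_pred = ["a", "b", "b", "b", "c", "c"]
rate = confusion_rate(y_true, y_pred)  # 2 of the 6 samples are confused
```

A high confusion rate between two categories signals that their regions of the landscape overlap, which feeds directly into the boundary construction described next.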
With the data now available as a landscape, we must identify the homogeneous areas that form the categories at the heart of the problem to be solved.
Learning consists of constructing the boundaries between these areas with a suitable algorithm. The strategy for presenting data to the algorithm is fundamental: it must converge toward genuine learning without drifting into over-fitting.
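A minimal sketch of this discipline, assuming a toy nearest-centroid learner on one-dimensional data: part of the data is held out so that generalisation, not memorisation, is what gets measured. All names here are illustrative.

```python
import random

def split(samples, labels, test_fraction=0.25, seed=0):
    # Shuffle deterministically, then hold out a test portion.
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * (1 - test_fraction))
    train, test = idx[:cut], idx[cut:]
    return ([samples[i] for i in train], [labels[i] for i in train],
            [samples[i] for i in test], [labels[i] for i in test])

def fit_centroids(samples, labels):
    # One centroid (mean) per category.
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    # Assign x to the category with the nearest centroid.
    return min(centroids, key=lambda y: abs(x - centroids[y]))

samples = [0.1, 0.2, 0.3, 0.9, 1.0, 1.1, 0.15, 0.95]
labels = ["low", "low", "low", "high", "high", "high", "low", "high"]
xtr, ytr, xte, yte = split(samples, labels)
model = fit_centroids(xtr, ytr)
accuracy = sum(predict(model, x) == y for x, y in zip(xte, yte)) / len(xte)
```

The key point is that `accuracy` is computed only on the held-out samples; a model that merely memorised the training set would not be rewarded for it.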
The result is a software component representative of the dataset and the learning procedures.
As specialists, we have expertise in all effective approaches, such as:
- Probabilistic (Bayes, Markov, etc.)
- Non-probabilistic (belief functions, possibility theory, etc.)
- Connectionist (supervised or unsupervised)