Training and Consulting for operational excellence

Statistical Design of Experiments (DoE) is used to create process models and is a tool for the efficient development of new products and processes and for the optimization of existing ones. The advantage of statistically planned and evaluated experiments is that an adequate process model can be derived from a significantly reduced number of experiments.
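The idea can be illustrated with a minimal sketch (not the seminar's material): a two-level full factorial design for three factors, a hypothetical noise-free response, and a least-squares fit that recovers the process model's coefficients.

```python
# Minimal DoE sketch: 2^3 full factorial design and a linear process
# model fitted by least squares. The response function is invented
# purely for illustration.
import itertools
import numpy as np

# All 8 runs of a two-level design for three factors (coded -1/+1).
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical response: y = 10 + 2*A - 1*B + 0.5*C (no noise).
y = 10 + 2 * design[:, 0] - 1 * design[:, 1] + 0.5 * design[:, 2]

# Model matrix with an intercept column; solve for the coefficients.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # recovers [10, 2, -1, 0.5]
```

Eight runs suffice here to estimate all three main effects plus the intercept; a one-factor-at-a-time approach would need more experiments for less information.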

Correctly planned and evaluated experiments are the key to rapid marketability and high quality.

However, a prerequisite for a successful application is an understanding of the methods and a correct interpretation of the results.
In this seminar, practical examples in a lively but relaxed learning atmosphere help you develop growing confidence in the methods of statistics.


Cornerstone is statistical software that supports users in technically oriented industries with statistical procedures. Its focus is on Design of Experiments (DoE) and important statistical analysis procedures, and it provides methods for exploratory data analysis through an efficient implementation of graphical techniques.

The use of software is an important factor in increasing the speed of statistical analysis. A prerequisite is the efficient handling of different data sources and volumes, as well as of dynamic data.

In this seminar, you will learn about Cornerstone’s various methods using practical examples. By linking individual methods, the speed of data analysis is increased step by step.


Statistical process control supports the process engineer in the regulation and continuous improvement of processes. Using statistical methods, production processes are monitored and deviations are detected before any failure occurs. Furthermore, it is possible to check and analyse corrections to processes.

Every process exhibits a natural scatter. In an undisturbed state this scatter is random and is caused by natural influences on the process. It can be overlaid by systematic influences, which affect the location of the process. This results in two primary objectives for statistical process control:

1) the stabilization of the process location and
2) the reduction of process scatter.

Thus, statistical process control provides the essential decision support on whether and when to intervene in a process. It serves as the basis for many quality management strategies such as “zero defect”.
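The core mechanism can be sketched in a few lines (a simplified Shewhart-style chart, not Cornerstone's implementation): control limits at the mean ± 3 sigma are estimated from the stable phase of a process, and points outside them are flagged as systematic disturbances.

```python
# SPC sketch: individual-value control limits and out-of-control
# detection on simulated data (all values invented for illustration).
import numpy as np

rng = np.random.default_rng(0)
process = rng.normal(loc=50.0, scale=2.0, size=30)  # undisturbed: random scatter
process = np.append(process, 60.0)                  # systematic shift in location

center = process[:30].mean()          # estimated from the stable phase
sigma = process[:30].std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Indices of points outside the control limits.
out_of_control = np.flatnonzero((process > ucl) | (process < lcl))
print(out_of_control)  # includes index 30, the shifted point
```

The shifted value lies far outside the 3-sigma band estimated from the undisturbed phase, so the chart signals an intervention before a failure occurs.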


Tolerance calculation based on a statistical approach describes the variability of production processes more usefully than simply calculating the limits of the drawing tolerances. Thus, the potential for fulfilling customer specifications can be estimated more realistically at a very early planning stage. Critical variables can be easily identified.

The first building block of statistical tolerance calculation is a set of assumptions about the distributions of the input variables, which can be determined, e.g., from comparable production data or from suppliers' Cpk data. The second is a model of the relationship between the input variables and the target variables to be toleranced, which results from the design, from previously performed DoEs, or from the physical, chemical, or engineering understanding of the process.

The classical method of Gaussian error propagation is usefully replaced by distribution simulation. In Cornerstone, this works well for all common distributions; special cases such as truncated, cut-out, and mixed distributions are also possible. Such simulations can be combined with other methods such as DoE, allowing more economical or more robust designs of the planned processes.
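A minimal Monte Carlo sketch of such a distribution simulation (all distributions, the relationship model, and the specification limits below are invented examples, not Cornerstone output): two toleranced inputs feed a nonlinear target variable, and the simulated output distribution is checked against a specification.

```python
# Tolerance simulation sketch: propagate assumed input distributions
# through a nonlinear model and estimate the out-of-spec fraction.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assumed input distributions, e.g. derived from supplier Cpk data.
length = rng.normal(100.0, 0.05, n)                        # mm
angle = rng.uniform(np.radians(29.5), np.radians(30.5), n) # rad

# Assumed model between the inputs and the toleranced target variable.
height = length * np.sin(angle)

# Fraction outside an assumed specification of 50.0 +/- 0.5 mm.
out_of_spec = np.mean((height < 49.5) | (height > 50.5))
print(round(out_of_spec, 4))
```

Unlike Gaussian error propagation, this approach needs no linearization and handles the non-normal (here uniform) input directly; swapping in truncated or mixed distributions only changes the sampling lines.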

The aim of the course is an introduction to statistical tolerance calculation and its interfaces to thematically related methods.


Measurement System Analysis (formerly also known as measuring equipment capability) evaluates the measurement process of a parameter with regard to its specification limits. The measurement process must have sufficient resolution, deliver correct results, and be sufficiently repeatable and reproducible. Its long-term stability must also be guaranteed. This production-oriented view can also be transferred to applications in research and development.

Without suitable measuring and testing processes, it is not possible to ensure the quality required by the customer. Sufficiently reliable measurability also plays an important role in the run-up to production, since measurements are the extended sensory organs of researchers and developers.

Suitable testing processes and an understanding of the necessary work steps improve quality, reduce quality costs, and facilitate the development of new processes and products.
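A gauge study along these lines can be sketched as follows (invented data and a deliberately simplified variance split, not a full ANOVA gauge R&R): the measurement variation is separated into repeatability (within operator) and reproducibility (between operators) and compared to the tolerance width.

```python
# Simplified gauge R&R sketch: three operators measure the same part
# five times each; the combined measurement spread is compared to an
# assumed tolerance. Data and tolerance are invented for illustration.
import numpy as np

measurements = np.array([
    [10.02, 10.01, 10.03, 10.02, 10.02],  # operator A
    [10.05, 10.04, 10.06, 10.05, 10.04],  # operator B
    [10.01, 10.02, 10.01, 10.03, 10.02],  # operator C
])

# Repeatability: average within-operator variance.
repeatability_var = measurements.var(axis=1, ddof=1).mean()
# Reproducibility (simplified): variance of the operator means.
reproducibility_var = measurements.mean(axis=1).var(ddof=1)

grr_sigma = np.sqrt(repeatability_var + reproducibility_var)

tolerance = 0.30  # assumed specification width
percent_grr = 100 * 6 * grr_sigma / tolerance
print(round(percent_grr, 1))  # common rule of thumb: below 10 % is good
```

Here the operator-to-operator differences dominate, so training or better work instructions would improve the measurement process more than a better instrument.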

The course teaches you techniques to evaluate and improve your measurement processes. Over the centuries, improved measurement techniques have repeatedly driven scientific and technical progress (Kepler's laws based on measurements by Tycho Brahe, land surveying based on Gauss's accurate angular measurements, gravitational-wave detection based on extraordinarily precise interference measurements in the LIGO experiment).


Have we piqued your interest?

Then contact us!

The fields marked with an asterisk (*) are mandatory.

You can use this contact form to contact us. We will use your personal data only for the purpose of processing your query. Your contact request will be processed centrally within the camLine Group to ensure that your inquiry can be responded to as quickly and effectively as possible. More detailed information concerning the collection and processing of data in connection with this contact form is available in our Privacy Policy.