
Guest Editorial

Volume 10, Number 1, January 2014 pp 2

JEAN-FRANÇOIS DUPUY

IRMAR/INSA of Rennes, F-35708 Rennes, France


Traditional lifetime data analysis involves analysing time-to-failure data obtained under normal operating conditions (or stresses) in order to evaluate the reliability characteristics of a set of items (systems, components, or products). Obtaining such data, however, may require an unreasonable amount of time. This is the case, for example, for highly reliable units such as those encountered in the nuclear, aeronautic, or electronic fields, since these units are designed to operate without failing for years or decades. One way to obtain the desired reliability information is thus to test the units at higher-than-usual levels of stress and to infer, or extrapolate, the reliability characteristics of the units at use conditions. Obviously, this extrapolation is only possible if appropriate models relating the failure times of the units to the accelerating factors (stresses) are available. A large body of literature has so far been devoted to the design, modelling, and analysis of experiments that use higher-than-usual stress levels to accelerate the collection of failure time data. The phrase "accelerated life testing" has been adopted to describe such methods.
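The extrapolation step described above can be illustrated with a minimal sketch. Assuming an Arrhenius life-stress relationship, log(life) = a + b/T, fitted to hypothetical mean lifetimes observed at two elevated temperatures, the model predicts a lifetime at the unobserved use temperature. All numbers below are purely illustrative and are not taken from any paper in this issue.

```python
import math

# Hypothetical accelerated-test data: mean lifetimes (hours) observed
# at two elevated temperatures (Kelvin). Illustrative values only.
tests = [(353.0, math.log(500.0)),   # 80 C: mean life 500 h
         (393.0, math.log(100.0))]   # 120 C: mean life 100 h

# Arrhenius life-stress relationship: log(life) = a + b / T.
# With exactly two stress levels, the two parameters are determined exactly.
(t1, y1), (t2, y2) = tests
b = (y1 - y2) / (1.0 / t1 - 1.0 / t2)
a = y1 - b / t1

def predicted_life(temp_kelvin):
    """Model-predicted mean lifetime (hours) at a given temperature."""
    return math.exp(a + b / temp_kelvin)

# Extrapolate to a use temperature of 25 C (298 K): the whole point of
# accelerated life testing is that this lifetime was never observed directly.
use_life = predicted_life(298.0)
```

The extrapolated use-condition lifetime exceeds both observed accelerated lifetimes, which is exactly the information the accelerated experiment was designed to recover.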

In particular, numerous contributions have been made to the development of statistical methods for planning accelerated life testing experiments. This is the topic of the paper by N. Chandra et al., who propose an optimal experiment plan for a three-step step-stress accelerated life test when the unit lifetimes belong to the Lomax family of distributions. It is also the topic of the paper by F. Haghighi, who constructs a step-stress experiment in a competing risks setting (that is, one where several independent causes of failure act simultaneously on the units). The construction of this plan uses the additional information provided by available degradation data.

The analysis of accelerated failure time data requires careful modelling of the relation between the accelerating variables and the reliability characteristics of the units under study (such models are called accelerated life or accelerated failure time models). J.-F. Dupuy provides a review of the most widely used accelerated life models and a description of the associated statistical inference. D.I. De Souza et al. propose to use the Maxwell distribution law to extrapolate the information resulting from an accelerated life test experiment to normal or design operating conditions. G. Babykina and V. Couallier propose a new reliability model for analysing recurrent failure time data, which accounts for aging, previous failures, and covariates. This model is applied to repeated pipe failures in water distribution systems.
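As a minimal illustration of the accelerated failure time idea in general (not of the specific models proposed in any of the papers above), the sketch below assumes a Weibull baseline survival function and shows how, in an AFT model, a covariate acts by rescaling time; the parameter values are hypothetical.

```python
import math

# Minimal AFT sketch with a Weibull baseline; all values illustrative.
shape, scale = 1.5, 1000.0   # baseline Weibull shape and scale (hours)
beta = 0.8                   # hypothetical regression coefficient

def survival(t, x):
    """P(T > t) under the AFT model: the covariate x rescales time."""
    t_scaled = t * math.exp(beta * x)   # higher stress => time runs faster
    return math.exp(-((t_scaled / scale) ** shape))
```

At any fixed clock time, a stressed unit (x = 1) has lower survival probability than an unstressed unit (x = 0), which is the defining feature of the accelerated failure time formulation.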

Once an accelerated life experiment has been modelled, the investigator usually faces the issue of testing the fit of the proposed model. This step validates the conclusions and extrapolations obtained from the model. M.S. Nikulin and Q.X. Tran propose a chi-squared statistic for testing the appropriateness of parametric accelerated failure time models with possibly time-varying covariates. The proposed methodology is applied to two accelerated life tests conducted on insulating structures and electric insulating fluids.
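The general idea behind such goodness-of-fit checks can be sketched as follows: this is a plain Pearson chi-squared comparison of binned observed failure counts against counts expected under a fitted exponential model, not the Nikulin-Tran statistic itself, and the data are purely illustrative.

```python
import math

# Illustrative failure times (hours) from a hypothetical life test.
times = [12, 35, 48, 77, 90, 120, 155, 210, 260, 340, 410, 560]

# Fit an exponential model by maximum likelihood: rate = 1 / sample mean.
rate = len(times) / sum(times)

# Bin the data and compare observed counts with model-expected counts.
edges = [0, 50, 150, 300, float("inf")]
observed = [sum(e0 <= t < e1 for t in times)
            for e0, e1 in zip(edges, edges[1:])]

def cdf(t):
    """Fitted exponential distribution function."""
    return 1.0 if t == float("inf") else 1.0 - math.exp(-rate * t)

expected = [len(times) * (cdf(e1) - cdf(e0))
            for e0, e1 in zip(edges, edges[1:])]

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
# chi2 is then compared with a chi-squared quantile (degrees of freedom =
# number of bins - number of fitted parameters - 1) to decide whether the
# exponential model should be rejected.
```

A large value of the statistic casts doubt on the fitted model and, therefore, on any extrapolation to use conditions derived from it.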

All aspects of accelerated life testing (planning of experiments, modelling, and model validation) are therefore represented in this special issue.



JEAN-FRANÇOIS DUPUY

IRMAR/INSA of Rennes, F-35708 Rennes, France

Email: jean-francois.dupuy@insa-rennes.fr
