Fault Detection Using Q and T2 Statistics

PCA is highly effective in reducing the overall dimensionality of varied input data for analysis. Over time, PCA has been adopted for use in applications such as fault monitoring and diagnosis, signal processing, pattern recognition, data compression and other similar tasks (Zhu, Bai and Yang, 2010).

PCA has been employed with genetic algorithms (GA) to reduce data dimensionality for fault diagnosis of induction motors: PCA was used to remove correlated features, after which a GA selected the relevant features and optimised the ANN (Yang, Han and Yin, 2006). Fault detection and diagnosis of plant subsystems have also been attempted using PCA. Normal plant operation data decomposed through PCA was compared with PCA-decomposed faulty operation data to establish thresholds for corrective action. Real-time plant operation data was then compared against both data sets, with thresholds set through the Q statistic, in order to detect faults (Villegas, Fuente and Rodriguez, 2010). Vibration monitoring of helicopter transmissions has been attempted using tri-axial accelerometers and PCA processing of the acquired data: the three acceleration axes were reduced to a single dimension using PCA for simpler processing. This approach provides a simple and computationally robust technique for vibration monitoring in highly complex systems (Tumer and Huff, 2002). Standalone PCA models suffer from the control limits required for the Q and T2 statistics. Moreover, these limits are derived under the assumption that the process data is Gaussian, which may lead to complications when the data is not actually Gaussian. Probabilistic techniques have been combined with PCA (PPCA) to handle both Gaussian and non-Gaussian process data for fault detection and diagnosis in a process control environment. Results showed an improvement over simple PCA-based control schemes, although certain areas still required improvement under the PPCA-based scheme (He et al., 2012). PCA applications to process control continue to grow.

Polyester film process monitoring has been attempted using the Q and T2 statistics through a PCA approach for multivariate quality control (MQC). Compared with other techniques, PCA provided a more robust model for fault detection, although diagnosis was not highly reliable. It can be inferred that standalone PCA approaches are best suited to fault detection, since fault diagnosis requires other techniques for established reliability (Qin, 2003). A combined index of the Q and T2 statistics has been developed so that the index is minimised when faulty variables are isolated, providing a better solution than the conventional approach of applying the Q and T2 statistics separately (Chen, Lee and Liu, 2011). It must be noted that while PCA provides a simple reduction of dimensionality, it is not suited to data streams with a large number of outliers. A robust PCA (ROBPCA) method has been suggested for high-dimensional data, combining projection pursuit with robust estimation in lower dimensions; classification of outliers is made possible through diagnostic plots (Hubert and Engelen, 2004). ROBPCA has been applied to fault detection and isolation in various theoretical situations to demonstrate its advantage over PCA, and the findings show that ROBPCA gives better results owing to its inherent ability to process large data sets (Tharrault et al., 2008). PCA has also been applied together with acoustic emission testing (AET) to deal with vibration monitoring of
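The monitoring scheme discussed above (decompose normal operating data with PCA, set control limits on the T2 and Q statistics, then flag new samples that exceed either limit) can be sketched as follows. This is a minimal illustration, not any cited author's implementation: the function names are hypothetical, and the control limits are taken as empirical quantiles of the training data rather than the chi-squared or F-distribution limits used in the literature.

```python
import numpy as np

def fit_pca_monitor(X, n_components, alpha=0.99):
    """Fit a PCA model on normal operating data and derive simple
    empirical control limits for the T2 and Q (SPE) statistics."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # PCA via SVD of the centred data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                 # retained loadings
    lam = (S**2 / (len(X) - 1))[:n_components]  # component variances
    t2, q = _statistics(Xc, P, lam)
    # empirical limits from normal-operation data (an assumption here;
    # the literature typically uses chi-squared / F approximations)
    return dict(mean=mean, P=P, lam=lam,
                t2_lim=np.quantile(t2, alpha),
                q_lim=np.quantile(q, alpha))

def _statistics(Xc, P, lam):
    T = Xc @ P                              # scores in the PCA subspace
    t2 = np.sum(T**2 / lam, axis=1)         # Hotelling's T2
    residual = Xc - T @ P.T                 # variation outside the model
    q = np.sum(residual**2, axis=1)         # Q / squared prediction error
    return t2, q

def detect_faults(model, X_new):
    """Flag samples whose T2 or Q statistic exceeds its control limit."""
    Xc = X_new - model["mean"]
    t2, q = _statistics(Xc, model["P"], model["lam"])
    return (t2 > model["t2_lim"]) | (q > model["q_lim"])
```

A faulty sensor typically shifts a sample off the normal-operation subspace, so it is caught by the Q statistic even when its projection inside the subspace still looks ordinary; T2 catches the converse case of an unusually large excursion within the subspace.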