Please use this identifier to cite or link to this item: https://knowledgecommons.lakeheadu.ca/handle/2453/4673
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Yang, Yimin | -
dc.contributor.author | Paul, Adhri Nandini | -
dc.date.accessioned | 2020-07-07T16:49:35Z | -
dc.date.available | 2020-07-07T16:49:35Z | -
dc.date.created | 2020 | -
dc.date.issued | 2020 | -
dc.identifier.uri | http://knowledgecommons.lakeheadu.ca/handle/2453/4673 | -
dc.description.abstract | A neural network is trained by minimizing a cost function in order to improve its performance. The standard optimization procedure is gradient descent, a form of iterative learning that starts from a random point on the cost function and travels down its slope, in steps, until it reaches a minimum; this process is time-consuming and slow to converge. Over the last couple of decades, several non-iterative neural network training algorithms have been proposed, such as Random Forest and Quicknet. However, these non-iterative algorithms do not support online training, so for very large training datasets enormous computing resources are needed to train the network. In this thesis, a non-iterative learning strategy with online sequential training is exploited. In Chapter 3, a single-layer Online Sequential Sub-Network node (OS-SN) classifier is proposed that provides competitive accuracy by pulling the residual network error and feeding it back into the hidden layers. In Chapter 4, a multi-layer network is proposed whose first portion is built by transforming a multi-layer autoencoder into an Online Sequential Auto-Encoder (OS-AE), with OS-SN used for classification. In Chapter 5, OS-AE is used as a generative model that constructs new data from subspace features and performs better than conventional data augmentation techniques on real-world image and tabular datasets. | en_US
dc.language.iso | en_US | en_US
dc.subject | Neural networks | en_US
dc.subject | Network training algorithm | en_US
dc.subject | Machine learning | en_US
dc.subject | Online sequential learning | en_US
dc.subject | Autoencoder | en_US
dc.title | Online sequential learning with non-iterative strategy for feature extraction, classification and data augmentation | en_US
dc.type | Thesis | en_US
etd.degree.name | Master of Science | en_US
etd.degree.level | Master | en_US
etd.degree.discipline | Computer Science | en_US
etd.degree.grantor | Lakehead University | en_US
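
The abstract above contrasts iterative gradient-descent training with non-iterative training. As a point of reference only, the sketch below shows both on a toy problem: gradient descent on the squared-error cost versus a one-shot least-squares (pseudo-inverse) solve of the output weights of a network with a fixed random hidden layer. This is an assumed, generic ELM-style baseline for illustration; it is not the thesis's OS-SN or OS-AE algorithm, and the names and parameters (n_hidden, the learning rate, the toy sine data) are illustrative choices only.

# Illustrative sketch only (not the thesis's OS-SN / OS-AE methods):
# it contrasts iterative gradient descent with a non-iterative, closed-form
# least-squares solve of the output weights of a single-hidden-layer network
# whose hidden layer is fixed and random (a generic ELM-style baseline).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus noise.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

# Fixed random hidden layer with 50 sigmoid nodes.
n_hidden = 50
W_in = rng.normal(size=(1, n_hidden))
b_in = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W_in + b_in)))   # hidden activations, shape (200, 50)

# Iterative training: gradient descent on the squared-error cost,
# taking many small steps down the slope of the cost surface.
beta_gd = np.zeros((n_hidden, 1))
lr = 0.01
for _ in range(5000):
    grad = H.T @ (H @ beta_gd - y) / len(X)    # gradient of 0.5 * mean squared error
    beta_gd -= lr * grad

# Non-iterative training: one closed-form least-squares solve
# of the output weights via the Moore-Penrose pseudo-inverse.
beta_ls = np.linalg.pinv(H) @ y

print("gradient descent  training MSE:", float(np.mean((H @ beta_gd - y) ** 2)))
print("least squares     training MSE:", float(np.mean((H @ beta_ls - y) ** 2)))

In this sketch the hidden-layer weights W_in and b_in are never updated; only the linear output weights are learned, which is what makes the one-shot pseudo-inverse solution possible and is the usual setting in which non-iterative training is compared against gradient descent.
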
Appears in Collections: Electronic Theses and Dissertations from 2009

Files in This Item:
File | Description | Size | Format
PaulA2020m-1a.pdf |  | 5.15 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.