Please use this identifier to cite or link to this item: https://knowledgecommons.lakeheadu.ca/handle/2453/5182
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Yang, Yimin | -
dc.contributor.advisor | Wei, Ruizhong | -
dc.contributor.author | Liu, Weiting | -
dc.date.accessioned | 2023-06-27T17:41:12Z | -
dc.date.available | 2023-06-27T17:41:12Z | -
dc.date.created | 2023 | -
dc.date.issued | 2023 | -
dc.identifier.uri | https://knowledgecommons.lakeheadu.ca/handle/2453/5182 | -
dc.description.abstract | This thesis explores the potential of statistical concepts, specifically the Vapnik-Chervonenkis dimension (VCD) [33], for optimizing neural networks. As neural networks increasingly take over tasks once performed by people, ensuring the safety and reliability of these systems is a critical concern. The thesis examines how the safety of neural networks can be tested, and how the networks can be optimized, using accessible statistical concepts. Two case studies demonstrate the effectiveness of the VCD in this role. The first optimizes an autoencoder, a neural network consisting of an encoder and a decoder, by calculating its VCD; the results suggest that optimizing the activation function can improve the autoencoder's accuracy at the mathematical level. The second compares VGG16 with VGG19 in terms of their ability to process high-density data; with three additional hidden layers, VGG19 shows greater learning ability than VGG16, suggesting that adjusting the number of layers is an effective way to analyze the capacity of a neural network. Overall, the thesis argues that statistical concepts such as the VCD offer a promising avenue for analyzing neural networks and thus contribute to the development of more reliable and efficient machine learning systems. The long-term goal is to apply such mathematical models systematically to machine learning and to establish an idealized framework for building neural networks, allowing them to be used safely and effectively across industries. | en_US
dc.language.iso | en | en_US
dc.subject | Machine learning (medical decision-making) | en_US
dc.subject | Neural network | en_US
dc.title | Vapnik-Chervonenkis dimension in neural networks | en_US
dc.type | Thesis | en_US
etd.degree.name | Master of Science | en_US
etd.degree.level | Master | en_US
etd.degree.discipline | Computer Science | en_US
etd.degree.grantor | Lakehead University | en_US
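Note: the abstract's central quantity, the Vapnik-Chervonenkis dimension, is not defined on this record page. For reference, the standard textbook definition (not quoted from the thesis itself) is sketched below in LaTeX.

Let $\mathcal{H}$ be a class of binary hypotheses $h : X \to \{0, 1\}$. A finite set
$S = \{x_1, \dots, x_m\} \subseteq X$ is shattered by $\mathcal{H}$ if every labelling
of $S$ is realized by some hypothesis in $\mathcal{H}$:
\[
  \left|\{\, (h(x_1), \dots, h(x_m)) : h \in \mathcal{H} \,\}\right| = 2^{m}.
\]
The VC dimension of $\mathcal{H}$ is the size of the largest set it shatters:
\[
  \operatorname{VCdim}(\mathcal{H}) = \max \{\, m : \text{some } S \subseteq X \text{ with } |S| = m \text{ is shattered by } \mathcal{H} \,\}.
\]
For example, linear classifiers (halfspaces) in $\mathbb{R}^d$ have VC dimension $d + 1$.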
Appears in Collections: Electronic Theses and Dissertations from 2009
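Note: the following minimal Python sketch (not code from the thesis) makes the layer-count comparison in the abstract concrete by counting the learnable layers of the two architectures. It assumes torch and a recent torchvision (0.13 or later, where the weights argument is available) are installed.

# Minimal sketch (not from the thesis): count the learnable layers of
# VGG16 and VGG19 to show the three extra convolutional layers the
# abstract refers to. Assumes torch and torchvision >= 0.13.
import torch.nn as nn
from torchvision.models import vgg16, vgg19

def count_layers(model):
    # Count convolutional and fully connected (linear) modules.
    convs = sum(1 for m in model.modules() if isinstance(m, nn.Conv2d))
    fcs = sum(1 for m in model.modules() if isinstance(m, nn.Linear))
    return convs, fcs

for name, ctor in [("VGG16", vgg16), ("VGG19", vgg19)]:
    model = ctor(weights=None)  # architecture only, no pretrained weights
    convs, fcs = count_layers(model)
    print(f"{name}: {convs} conv layers + {fcs} fully connected layers "
          f"= {convs + fcs} weight layers")

# Expected output:
# VGG16: 13 conv layers + 3 fully connected layers = 16 weight layers
# VGG19: 16 conv layers + 3 fully connected layers = 19 weight layers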

Files in This Item:
File | Description | Size | Format
LiuW2022m-1a.pdf |  | 1.93 MB | Adobe PDF

