
dc.contributor.author: Perry, Brittany
dc.date.accessioned: 2023-05-16T17:30:19Z
dc.date.available: 2023-05-16T17:30:19Z
dc.identifier.uri: http://hdl.handle.net/10464/17817
dc.description.abstract: AdaBoost is an ensemble method that boosts the performance of machine learning algorithms by combining several weak learners into a single strong learner. The most popular weak learner is the decision stump (a low-depth decision tree). One limitation of AdaBoost is its effectiveness with small sample sizes. This work explores variants of the AdaBoost algorithm such as Real AdaBoost, LogitBoost, and Gentle AdaBoost. These variants all follow a gradient boosting procedure like AdaBoost, with modifications to the weak learners and weights used. We are specifically interested in the accuracy of these boosting algorithms when used with small sample sizes. As an application, we study the link between functional network connectivity (as measured by EEG recordings) and schizophrenia by testing whether the proposed methods can classify a participant as schizophrenic or healthy control based on quantities measured from their EEG recording. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Brock University [en_US]
dc.subject: AdaBoost [en_US]
dc.subject: decision trees [en_US]
dc.subject: small sample size [en_US]
dc.subject: gradient boosting [en_US]
dc.subject: Schizophrenia [en_US]
dc.title: AdaBoost And Its Variants: Boosting Methods For Classification With Small Sample Size And Brain Activity In Schizophrenia [en_US]
dc.type: Electronic Thesis or Dissertation [en_US]
dc.degree.name: M.Sc. Mathematics and Statistics [en_US]
dc.degree.level: Masters [en_US]
dc.contributor.department: Department of Mathematics [en_US]
dc.degree.discipline: Faculty of Mathematics and Science [en_US]
refterms.dateFOA: 2023-05-16T17:30:20Z


Files in this item

Name: Brock_Perry_Brittany_2023.pdf
Size: 525.0 KB
Format: PDF
