
    Exploring the utility of differing methodological approaches to measure meaningful change in treatment and intervention scenarios

Name: Brock_Ross_Bailey_2020.pdf
Size: 983.8 KB
Format: PDF
    Author
    Ross, Bailey
    Keyword
    single case design
    effect size
    between-case standardized mean difference effect size
Hedges’ g
    d-statistic
    
    URI
    http://hdl.handle.net/10464/15006
    Abstract
A large focus in social science research is establishing the effectiveness of treatments to help and support the population and to provide the best empirical data to researchers, practitioners, and policy makers. To support evidence-based practice and policy, findings from studies are synthesized in reviews and meta-analyses to identify the most effective treatments. However, because single-case designs (SCDs) describe effects with differing metrics, they have been excluded from such reviews and meta-analyses, hindering the dissemination of valuable findings from studies that use SCD methodologies. The present study employs a unique dataset to exemplify differing methodological approaches to measuring meaningful change from a treatment. The dataset, obtained from Vause and colleagues (2018), contained both a group-based design, in the form of a randomized controlled trial, and SCD methodologies on the same participants undergoing treatment. A d-statistic, also referred to as a between-case standardized mean difference effect size (ESBC), was calculated from the SCD methodology and compared with the group-based effect sizes originally reported by Vause and colleagues. The effect sizes corroborated each other: a large effect was found in both the SCD analysis (g = 1.22) and the average of the group-based analyses (g = 0.99). In addition, the ESBC was calculated per participant, allowing comparisons between individual effects and the overall outcome. Furthermore, this study explored how the acquired ESBC estimates complement traditional SCD methodologies, including visual analysis and overlap statistics. By using software implementations of these statistical techniques, many behaviors, participants, and data points can be analyzed simultaneously. Moreover, a forest plot can be generated from the results, providing a perspective not normally available to SCD researchers.
Finally, the most valuable consequence to note is that the ESBC is obtained in the form of Hedges’ g, which can be compared across SCDs and between-group experimental designs. This is the first known study to explore and compare the effect size estimates of a treatment on participants’ behaviors that was evaluated both as an SCD and as a group-based design.
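    The abstract's key quantity, an effect size reported as Hedges' g, can be illustrated for the simpler two-group (between-subjects) case. The sketch below is illustrative only: it is not the SCD-specific ESBC estimator used in the thesis (which models within-case data), and the function name and sample data are hypothetical.

    ```python
    import math

    def hedges_g(group1, group2):
        """Hedges' g for two independent groups: Cohen's d with the
        small-sample bias correction factor J applied."""
        n1, n2 = len(group1), len(group2)
        m1 = sum(group1) / n1
        m2 = sum(group2) / n2
        # Unbiased sample variances and pooled standard deviation
        v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
        v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
        s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
        d = (m1 - m2) / s_pooled          # Cohen's d
        j = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction factor
        return j * d
    ```

    Because g is expressed in pooled standard-deviation units, values from different studies (or, as in the thesis, from SCD and group-based analyses of the same participants) can be placed on a common scale.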
    Collections
    M.A. Applied Disability Studies


     
