    Exploring the Reliability of an Objective Severity Tool to Classify Severe Problem Behaviour

    Export formats
    • CSV
    • RefMan
    • EndNote
    • BibTex
    • RefWorks
    Name:
    Brock_Morgan_Marie-Chanel_2021.pdf
    Size:
    1.445 MB
    Format:
    PDF
    Author
    Morgan, Marie-Chanel
    Keywords
    severe problem behaviour
    intellectual and developmental disability
    reliability
    severity scale
    research tool
    
    URI
    http://hdl.handle.net/10464/15160
    Abstract
    The term ‘severe’ is a common descriptor for problem behaviour in research and practice. However, it is often applied inconsistently, and at times based on ill-defined or arbitrary criteria. Existing problem behaviour measurement tools often rely solely on caregiver recall (e.g., interviewing primary caregivers). This study explores the reliability of the first iteration of a severity tool employing direct measurement strategies (e.g., response rate, injury severity as evidenced by permanent product) to classify an individual’s problem behaviour severity. Nine Board Certified Behavior Analyst (BCBA) raters were recruited: five novice raters and four expert raters. They each experienced two conditions. In the first condition, raters classified the severity of 20 case scenarios without access to the tool. In the second condition, raters classified the severity of 20 novel scenarios after completing the tool for each case. All items of the tool (n=26) had good internal consistency (α=.831). Intraclass correlations showed a meaningful increase in reliability for both groups when they had access to the tool (novice r=0.860, expert r=0.912) compared to when they did not (novice r=0.781, expert r=0.803). Most raters either strongly agreed or agreed that the severity tool had good applicability across research and clinical settings. This suggests that inconsistencies that may exist in the classification of severe problem behaviour could be mitigated with the proposed tool.
    Collections
    M.A. Applied Disability Studies
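
    The reliability figures quoted in the abstract lend themselves to a brief worked example. The sketch below (Python) applies the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), to a made-up respondents-by-items score matrix; it is illustrative only and is not the analysis code used in the thesis. The intraclass correlations reported for novice and expert raters would analogously be computed from a cases-by-raters rating matrix using a two-way ICC model (not shown).

        import numpy as np

        # Hypothetical data: 20 rated cases (rows) x the tool's 26 items (columns).
        # Integer item scores 0-3 are made up purely for illustration.
        rng = np.random.default_rng(seed=0)
        scores = rng.integers(0, 4, size=(20, 26)).astype(float)

        def cronbach_alpha(items: np.ndarray) -> float:
            """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1)      # per-item variance
            total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
            return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

        print(f"Cronbach's alpha: {cronbach_alpha(scores):.3f}")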


    Export search results

    The export option allows you to export the current search results of the entered query to a file. Several formats are available for download. To export the items, click the button corresponding to the preferred download format.

    By default, clicking an export button downloads the allowed maximum number of items.

    To select a subset of the search results, click the "Selective Export" button and select the items you want to export. The number of items that can be exported at once is subject to the same limit as a full export.

    After making a selection, click one of the export format buttons. The number of items that will be exported is indicated in the bubble next to each export format.
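
    As an illustration of what these exports contain, a BibTex export of this record would produce roughly the following entry. The entry key and exact field set are assumptions (DSpace's BibTeX output varies by version and configuration), and the school and year are inferred from the collection and file name; the remaining values are taken from the record above.

        @mastersthesis{Morgan2021,
          author = {Morgan, Marie-Chanel},
          title  = {Exploring the Reliability of an Objective Severity Tool to
                    Classify Severe Problem Behaviour},
          school = {Brock University},
          year   = {2021},
          url    = {http://hdl.handle.net/10464/15160}
        }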