Research Group:
Behavior Analysis in Schools (BAiS)
My research interests are best summarized as school-based behavioral interventions implemented within a multi-tiered system of support (MTSS). I have specific interests in peer management interventions, class-wide behavioral assessment and intervention, implementation science, and visual analysis of single-case data. I investigate these topics through the lens of behavior analysis, with an emphasis on feasibility. I am also always interested in collaborating with students to develop new research ideas and projects that fall within the broader scope of my research agenda. Below are more detailed descriptions of my ongoing research strands, recent representative peer-reviewed publications, and a forecast of where I anticipate taking each strand. Bolded names in publications indicate student authorship:
- Class-wide Behavior Assessment and Intervention
Class-wide behavior interventions (e.g., the Good Behavior Game) are highly valued because they address the behavior of many students at once, are effective, and are viewed by teachers as acceptable classroom management strategies. However, the range of class-wide behavior interventions currently available is somewhat limited in scope, and the behaviors they target may not always be appropriate. Furthermore, the way we evaluate these interventions is under-researched, potentially leading to incorrect decisions about their effectiveness. My research in this strand identifies class-wide behavior assessment practices that produce accurate estimates of group behavior and broadens the types of behaviors these strategies target. I plan to continue refining class-wide assessments to determine whether alternative procedures are needed for behavioral targets measured as a duration (e.g., time on task) compared to those measured as a frequency (e.g., hand raising).
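One common way to obtain a class-wide estimate of group behavior is rotating momentary time sampling: at each interval, the observer scores whether the next student in a rotation is engaged, and the class-wide estimate is the percentage of sampled moments scored as on task. The sketch below is a hypothetical illustration of that arithmetic, not a specific procedure from my studies; the sample data are invented.

```python
# Hypothetical sketch: a class-wide on-task estimate from rotating
# momentary time sampling. Each entry is one sampled moment for one
# student in the rotation (True = on task). Data are illustrative only.

def classwide_on_task_estimate(samples: list[bool]) -> float:
    """Percentage of sampled moments in which the observed student was on task."""
    if not samples:
        raise ValueError("no samples recorded")
    return 100 * sum(samples) / len(samples)

# 20 momentary samples rotated across students in a class
samples = [True] * 14 + [False] * 6
print(f"Estimated class-wide on-task: {classwide_on_task_estimate(samples):.0f}%")  # 70%
```

The accuracy question my work addresses is how well an estimate like this, built from a rotating subset of students, reflects the true behavior of the whole group.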
Representative Publications
Dart, E. H., Radley, K. C., Briesch, A. M., Furlow, C. M., & Cavell, H. (2016). Assessing the accuracy of classwide direct observation methods: Two analyses utilizing simulated data and naturalistic data. Behavioral Disorders, 41(3), 148–160.
Dart, E. H., Radley, K. C., Battaglia, A., Dadakhodjaeva, K., Bates, K. E., & Wright, S. J. (2016). The classroom password: A class-wide intervention to increase academic engagement. Psychology in the Schools, 53(4), 416–431.
Radley, K. C., Dart, E. H., Battaglia, A. A., & Ford, W. B. (2019). A comparison of two procedures for assessing preference in a classroom setting. Behavior Analysis in Practice, 12, 95–104.
- Treatment Fidelity and Strategies to Promote Fidelity
Treatment fidelity is the degree to which an intervention is implemented as intended, often expressed as the percentage of treatment components implemented correctly. Assessing treatment fidelity is critical to making accurate treatment decisions because it allows us to differentiate intervention plans that were ineffective because they were not implemented from those that were ineffective for other reasons. My treatment fidelity research aims to identify the most accurate and feasible ways to assess fidelity, develop new strategies to promote fidelity, and design strategies that combine the assessment and improvement of fidelity. My future research in this area will continue to explore behavior-analytic methods for improving teachers' implementation of behavior interventions, with an emphasis on long-term implementation.
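The component-based fidelity metric described above reduces to a simple computation. The sketch below assumes a checklist-style observation in which each intervention component is scored as implemented or not; the component names are invented for illustration and do not come from a specific protocol.

```python
# Hypothetical sketch: treatment fidelity expressed as the percentage of
# intervention components implemented correctly during one observation.
# Component names are illustrative, not from an actual intervention plan.

def fidelity_percentage(components: dict[str, bool]) -> float:
    """Return the percentage of components implemented as intended."""
    if not components:
        raise ValueError("at least one component is required")
    implemented = sum(components.values())
    return 100 * implemented / len(components)

observation = {
    "stated_expectations": True,
    "delivered_reward": True,
    "recorded_points": False,
    "gave_feedback": True,
}

print(f"Fidelity: {fidelity_percentage(observation):.0f}%")  # 3 of 4 components = 75%
```

In practice, the harder questions are which assessment source (direct observation, permanent products, self-report) yields an accurate score and how to improve low scores, which is where this research strand focuses.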
Representative Publications
Dart, E. H., Collier-Meek, M. A., Chambers, C., & Murphy, A. (2020). Multi-informant assessment of treatment integrity in the classroom. Psychology in the Schools.
Gresham, F. M., Dart, E. H., & Collins, T. A. (2017). Generalizability of multiple measures of treatment integrity: Comparisons among direct observation, permanent products, and self-report. School Psychology Review, 46, 108–121.
Dart, E. H., Cook, C. R., Collins, T. A., Gresham, F. M., & Chenier, J. (2012). Test-driving interventions to increase treatment integrity and student outcomes. School Psychology Review, 41, 467–481.
- Linear Graph Construction and Visual Analysis
Linear graphs are used frequently in schools to make decisions about students' academic and behavioral progress in response to services implemented within MTSS frameworks; however, my research asks whether the way these graphs are constructed can alter the decisions people make about the presence and magnitude of treatment effects. I have found that manipulations of graph axes and scaling can have a profound impact on treatment-effect decisions when visual analysis is the analytic method. Thus, I have proposed a standard format for linear graph assembly that reduces the likelihood that treatment effects are overestimated. Future work in this area will continue to identify critical elements of linear graphs that, when modified, alter treatment decisions. I also hope to investigate the extent to which practitioner-made graphs adhere to the standard assembly and whether training in graph construction is sufficient to remedy construction errors.
Representative Publications
Dart, E. H., & Radley, K. C. (2018). Toward a standard assembly of linear graphs. School Psychology Quarterly, 33, 350–355.
Radley, K. C., Dart, E. H., & Wright, S. J. (2018). The effect of data points per x- to y-axis ratio on visual analysts' evaluation of single-case graphs. School Psychology Quarterly, 33, 314–322.
Dart, E. H., & Radley, K. C. (2017). The impact of ordinate scaling on the visual analysis of single-case data. Journal of School Psychology, 63, 105–118.