The Top 5 Experiences That Shaped My Approach to Conceptually Systematic Services

written by Olivia Harvey, M.S., BCBA; edited by Sofia Abuin, M.S., BCBA

I have worked as a consultant in public education for the past three years. My role is to conduct assessments, design interventions, and train teachers. As I wrap up my doctoral training, I’ve been reflecting on the experiences that have shaped my approach to providing behavior-analytic services. A handful of experiences stand out. I hope the takeaway messages help shape others’ approaches as well.

1.     Re-evaluating Whether Services Were Truly Behavior Analytic

Early in my role, I deviated from providing conceptually systematic services. I found myself designing structural rather than functional interventions. Structural interventions may appear behavior analytic on the surface, but when evaluated conceptually, they can be incongruent with behavior-analytic principles. For example, a token system might have awarded breaks from academics even though the reinforcer maintaining the behavior was teacher attention; in that case, awarding teacher-helper tasks would have better aligned with function. Similarly, reinforcement schedules were sometimes selected for convenience rather than for their likely impact on behavior. Although feasibility matters, swinging too far in that direction can lead to minimal behavior change, ultimately costing more time than a well-designed intervention would have taken from the start. Over time, I have learned to balance feasibility and efficacy, but doing so first required that I consistently think through interventions conceptually.

The shift from functional to structural interventions did not happen overnight; it was a gradual, subtle slip, and it took a reminder from my supervisor for me to recognize the drift. That conversation prompted me to pause, reflect, and recommit to critically evaluating whether my recommendations aligned conceptually with behavior-analytic principles.

Takeaway: Even well-intentioned practices can drift over time. Taking time to step back, think through recommendations conceptually, and seek another perspective can help ensure services remain grounded in the science we know works.

2.     Learning How to Conceptualize Through Idealistic Behavior Intervention Plans

After recognizing my drift toward structural interventions, my supervisor arranged opportunities for me to learn how to conceptually think through interventions. One contributor to my drift was placing too much emphasis on feasibility when making decisions. Thus, the task was to write behavior intervention plans in the most idealistic of circumstances, as if time, staffing, and materials were unlimited. Once the plan was conceptually sound, it was rewritten to meet the realities of the classroom.

Writing idealistic plans forced me to clearly articulate the function of behavior and the mechanisms by which an intervention was expected to work. Rewriting those idealistic plans required me to identify which components were essential and which could be adapted without undermining the integrity of the intervention. It taught me that there’s a difference between simplifying an intervention and watering it down. Now when I design interventions, efficacy is the first consideration and feasibility is the second.

Takeaway: Understanding the conceptual underpinnings of an intervention is essential for thoughtful adaptation. Transforming idealistic interventions to feasible interventions helps highlight the mechanisms critical for behavior change.

3.     Matching Recommendations to Reinforcement Schedules

Designing interventions and recommending them are two distinct skills. In my work, we generate multiple recommendations and then collaborate with teachers to identify which interventions are most feasible and appropriate for their classrooms. At first, I struggled to generate a diverse set of recommendations because I assumed there was a single right answer and that my role was to identify it.

My supervisor suggested I write one recommendation for each schedule of reinforcement. For example, the table below displays four recommendations I might provide to a teacher supporting a student with escape-maintained challenging behavior. Each recommendation aligns with a schedule of reinforcement at a density likely to facilitate meaningful behavior change, so the teacher can pick whichever option fits their classroom best while every option remains conceptually sound.

Schedule of Reinforcement | Recommendation
Fixed ratio | Honor the student's appropriate requests to take a break.
Variable ratio | For every two or three academic activities the student completes, offer a break with preferred items.
Fixed interval | Offer the student a break in the calm-down corner once every 30 minutes.
Variable interval | When you catch the student appropriately participating, chat with them for a few minutes about their interests to provide a momentary break.
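
For readers who think in code, here is a minimal sketch (in Python, with hypothetical parameter values chosen to mirror the table; none of this comes from the original plans) of the decision rules that separate the four schedule types: ratio schedules count responses, interval schedules count elapsed time, and variable schedules draw each new requirement around an average.

```python
import random

def ratio_met(responses_since_last_reinforcer: int, requirement: int) -> bool:
    """Ratio schedules: reinforcement depends on a count of responses."""
    return responses_since_last_reinforcer >= requirement

def interval_met(seconds_since_last_reinforcer: float, requirement: float) -> bool:
    """Interval schedules: the first response after enough time has elapsed is reinforced."""
    return seconds_since_last_reinforcer >= requirement

# Fixed schedules hold the requirement constant.
FR_REQUIREMENT = 1          # FR 1: honor each appropriate request for a break
FI_REQUIREMENT = 30 * 60    # FI 30 min: a break becomes available every 30 minutes

# Variable schedules draw a new requirement after each reinforcer is delivered.
def next_vr_requirement() -> int:
    """VR ~2.5: a break after every two or three completed activities."""
    return random.choice([2, 3])

def next_vi_requirement(mean_seconds: float = 300.0) -> float:
    """VI (hypothetical 5-minute mean; the table states no interval): availability
    recurs at unpredictable times averaging mean_seconds."""
    return random.uniform(0.5 * mean_seconds, 1.5 * mean_seconds)
```

The plain-language versions in the table are what teachers actually see; the sketch just makes the underlying contingencies explicit.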

In hindsight, this strategy feels so obvious, but it fundamentally changed how I approach recommendations. I moved away from “this is the right answer” and toward “here are several functionally sound options we can choose from together.” This strategy has stood the test of time and remains a practice I use consistently.

Takeaway: There is rarely only one right intervention. Using each reinforcement schedule as a starting point for developing recommendations helps create functionally sound options the team can pick from.

4.     Seeing the Assessment-to-Intervention Pipeline

To select which interventions to recommend, I conducted function-based assessments. In retrospect, however, I realize that I treated this step as a box to check rather than as an integral part of designing interventions. This perspective shifted after one particular case.

Figure: a graph showing more refusals during corrected sessions than during uncorrected sessions.

We conducted a functional analysis comparing two conditions: one in which the student's errors were corrected and one in which they were not. The student refused to work only during sessions in which errors were corrected. Identifying such a precise function made it easier to decide which components to include in the intervention, like errorless learning and a token system that reinforced accepting and applying corrections.
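
As a concrete illustration of that comparison, here is a small sketch (Python, with made-up refusal counts for illustration only, not the actual data from this case) of how session-by-session results can be summarized by condition:

```python
from statistics import mean

# Hypothetical refusal counts per session, for illustration only.
sessions = [
    ("corrected", 4), ("not corrected", 0),
    ("corrected", 6), ("not corrected", 0),
    ("corrected", 5), ("not corrected", 0),
]

# Group the counts by condition.
by_condition: dict[str, list[int]] = {}
for condition, refusals in sessions:
    by_condition.setdefault(condition, []).append(refusals)

# A differentiated pattern (refusals only when errors are corrected)
# points to escape from error correction as the likely function.
for condition, counts in by_condition.items():
    print(f"{condition}: mean refusals per session = {mean(counts):.1f}")
```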

Without the functional analysis, I might have recommended a broad token system that required implementation across all academic activities. Following the functional analysis, I was able to recommend a more targeted intervention that was less restrictive and more feasible for the teachers to implement. The student's behavior improved immediately. From that point on, it became easier to design assessments that genuinely clarified what was driving behavior and to design effective interventions.

Takeaway: Function-based assessments that precisely identify what is driving behavior make selecting and designing interventions easier. Assessment shouldn't be skipped or treated as a box to check; it should be an intentional process that supports intervention design.

5.     Identifying Influential Events and Collecting Data

The last experience that influenced my approach was my first experience. My supervisor and I went to conduct my first classroom observation. Within about five minutes, my supervisor drew a simple 2×2 grid in her notebook and began collecting data, tallying whether the teacher attended to or ignored hand raises and talk outs.

Figure: a two-row, two-column tally grid for recording whether hand raises and talk outs were attended to or ignored.

That moment demonstrated that there's a difference between watching and observing. Observing consists of identifying discrete responses and events, developing hypotheses about their likely function, and generating relevant measurement systems. Being intentional about observing, rather than merely watching, produces data instead of anecdotes, allowing us to make decisions earlier.

It also modeled that you don’t need a fancy data sheet to collect data, at least not initially. Early sketches and measures can serve as pilots before investing time in developing a formal measurement system that can be used to systematically assess trends in behavior over time.
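
A quick digital analogue of that notebook grid might look like the sketch below (Python; the response and teacher-behavior labels mirror the observation described above, and the recorded events are hypothetical):

```python
from collections import Counter

# Tally grid: rows are student responses, columns are teacher responses.
tally: Counter = Counter()

def record(student_response: str, teacher_response: str) -> None:
    """Add one tally mark for a (student response, teacher response) pair."""
    tally[(student_response, teacher_response)] += 1

# Hypothetical observation stream:
record("hand raise", "ignored")
record("talk out", "attended")
record("talk out", "attended")
record("hand raise", "attended")

# Print the grid as tallied pairs.
for (student, teacher), count in sorted(tally.items()):
    print(f"{student:>10} / {teacher:<8}: {count}")
```

Even something this rough can show, within a single observation, whether one response is drawing more teacher attention than another.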

Takeaway: Develop fluency in identifying responses and likely influential events, and start collecting data immediately. Early measures help you decide what is worth measuring.

Because of these experiences, I ground my work in behavior-analytic principles, use assessments to guide interventions, and collect high-quality data to inform decisions. Each of these experiences involved my supervisors, Dr. Claire St. Peter or Dr. Kathryn Kestner, in one way or another. I am grateful for their mentorship and the experiences they provided that shaped my approach to conceptually systematic services.

Olivia Harvey, M.S., BCBA

Olivia is a doctoral candidate at West Virginia University and is on track to graduate in May of 2026. She received her Bachelor of Science degree at Eastern Michigan University under the mentorship of Dr. Adam Briggs and her Master of Science degree at West Virginia University under the mentorship of Dr. Claire St. Peter. Her research interests include the assessment and treatment of severe challenging behavior, procedural fidelity, and data accuracy.
