Pampering Participants Virtually

This document may be printed, photocopied, and disseminated freely with attribution. All content is the property of the McCormick Center for Early Childhood Leadership.

Pamper: To treat with extreme or excessive care and attention.1


Do you enjoy being pampered? Maybe it means treating yourself to a service you do not often indulge in, such as a manicure or pedicure, a massage, a fine dining experience, or having your car detailed.


Pampering participants is a foundational principle at the McCormick Center for Early Childhood Leadership. Paula Jorde Bloom, the McCormick Center's founder, was passionate about ensuring that anyone who participated in our trainings left feeling cared for and pampered. Thirty-seven years later, we continue to carry this principle forward. Over time, our team came to refer to it as the “McCormick Center Experience.”


The “McCormick Center Experience” is about showing appreciation and respect through attention to all the details that make up a high-quality training experience. Hospitality, best practices in adult learning, activities that engage the five senses, reflection, and laughter are all a part of this experience.


Prior to the pandemic, trainings offered at the McCormick Center were in person. Upon arrival, participants experienced warm greetings, fresh flowers on the training and dining tables, and soothing music. The training tables were stocked with supplies such as markers, highlighters, and post-it notes. In addition, a bag of training materials, books, and other resources was neatly placed on each participant’s seat. A hospitality table was set up in the training room with items to meet physical needs, such as water to keep participants hydrated, along with blankets and handheld fans for those who needed to warm up or cool down. Hand lotion, hand sanitizer, tissues, mints, and snacks were also provided. Coffee and hot water for tea were ready when participants arrived. Participants enjoyed healthy, hot buffet lunches together in the dining room, with extra time to get to know one another. Encouraging and inspirational messages hung on the walls throughout the McCormick Center. Activities during the training allowed participants to connect and engage with one another and build professional learning communities. These were some of the standards of the “McCormick Center Experience.”


The Pandemic Challenge


Twenty-twenty brought a new challenge: the shift from in-person to virtual trainings. Our team rallied to brainstorm the question, “How do we continue to deliver the ‘McCormick Center Experience’ virtually?” The answer involved two primary components. The first was to deliver quality training content with engaging, creative activities through a virtual platform. The second was to send all materials, training resources, and pampering touches to each participant. This became known among our team as “the box.”


Pampering Inside of the Box


For example, for our Taking Charge of Change™ leadership academy, we assembled five different kits for the box, each focused on a unique feature of our traditional in-person McCormick Center experience.


  • Hospitality Kit. An inspirational message, package of tissues, mints, hand sanitizer, instant coffee packets, and creamers.
  • Break Time Kit. Salty and sweet treats to have on hand during virtual training days.
  • Training Supplies Kit. Fidget toys, post-it notes, pencil, pen, whiteboard paddle, eraser, and markers.
  • Content Kit. Binder, books, and journal.
  • Resources Kit. Each training session had an envelope with activity tools (e.g., special handouts, supplies) to be used during that session.


In most cases, the kits were clear plastic bags or large envelopes filled with materials. Colorful labels were placed on each kit, and we labeled each resource kit with the session topic and the date it was to be opened. The sealed box arrived with a friendly label reading “Do Not Open Until Instructed,” as we wanted to build suspense, excitement, and surprise.


Pictured: the box and kits from our Taking Charge of Change™ Leadership Academy.

Pampering Outside of the Box  


The pampering did not end once we shipped the boxes. We also mailed occasional cards or inspirational messages to those we were coaching throughout our leadership academies. In addition, we connected one-on-one with participants via Zoom for coaching sessions and touched base via email.


Results


Participants loved the way we created an element of surprise. Anticipation built as participants received their boxes weeks before the trainings but were cautioned not to open them until the appointed time. The “wow!” factor was achieved when we held a group “opening of the boxes” during the virtual training, and a screenshot was taken of participants holding up their favorite items from the box. They expressed feeling pampered and said they could tell we had attended to the details, reflecting our desire to care for them.


Pampering our participants today is as important as it was when our founder set the standard. In fact, it may be even more important now, as leaders in early childhood are feeling the mental, physical, and emotional strain of the pandemic. Participants repeatedly comment on how the pampering makes them feel cared for and appreciated. Not only do they recognize the effort it takes to create interactive virtual trainings and organize all of the boxes, but they have also mentioned how these special touches have lifted their spirits.


Come taste the “McCormick Center Experience” by applying for one of our Leadership Academies or other professional development experiences.


Reference:


1. Merriam-Webster, “pamper,” https://www.merriam-webster.com/dictionary/pamper


Marleen Barrett, M.S., serves as Events Coordinator for the McCormick Center for Early Childhood Leadership at National Louis University (NLU), where she coordinates the details of the annual Leadership Connections™ conference and the Taking Charge of Change™ leadership academy. Mrs. Barrett also serves as a coach for training participants and as the liaison with the Gateways Authorization Entity for the McCormick Center. She holds a master’s degree in training and development from Loyola University. Prior to working at NLU, she was the Director of Leadership Development for the American Farm Bureau Federation, where she conducted training programs on strategic planning, organizational skills, and team building throughout the United States.
