Meeting the Need for Intensive and Cohesive Professional Development During Challenging Times

by Robyn Kelton, Teri Talan, and Marina Magid

This document may be printed, photocopied, and disseminated freely with attribution. All content is the property of the McCormick Center for Early Childhood Leadership.


BACKGROUND



The pandemic has amplified and exacerbated many of the existing challenges facing the field of early childhood education and care (ECEC) and forced the field to innovate and change much of what had become the status quo (Hashikawa et al., 2020). One area in which the status quo has shifted significantly over the past several years is the delivery of professional development for ECEC program leaders. Social distancing, increased workloads, emotional and physical exhaustion, and the pervasive staffing crisis have posed significant barriers to engaging in the intensive and cohesive professional learning experiences that the research literature suggests are critical to the retention of program leaders, the well-being of staff, and positive outcomes for ECEC children and families (Arabella Advisors, 2018; Doherty et al., 2015; Sims et al., 2015; Talan et al., 2014).


As gatekeepers to quality (ensuring access to resources and supports), ECEC program leaders directly impact organizational climate, teaching practices, and family engagement in their programs; thus, role-specific support and professional development are critical to the success of the programs they operate (Bloom & Abel, 2015; Douglass, 2019). Specialized training in leadership strategies and program management is especially critical for new administrators. Administrators who receive support early in their role are more likely to remain in the field, improve the quality of their programs, and continue to grow professionally (Talan et al., 2014).


The McCormick Center has a 30-year history of providing intensive (6-12 months in duration), cohesive (curriculum content is coordinated and builds on previous learning and application), and cohort-based leadership academies. These leadership academies have been consistently evaluated, providing empirical evidence of success in supporting the unique needs of ECEC program leaders, increasing program quality, improving organizational health, and fostering a commitment to ongoing professional growth and achievement (McCormick Center, 2020; Talan et al., 2014).


Before the pandemic, the McCormick Center for Early Childhood Leadership received funding to develop and facilitate a seven-month leadership academy for newly hired center-based administrators. At first, the traditionally designed leadership academy (in-person, off-site, full-day seminar structure) was postponed in hopes of rescheduling when life “returned to normal” for center administrators. Ultimately, the academy was deconstructed and rebuilt to be delivered virtually.


The leadership academy evaluated in this study provided both the challenge and the opportunity to rethink the traditional delivery model in ways that accounted for an online, remote learning environment; a staffing crisis that forced administrators to cover classrooms to maintain ratios, costing them dedicated time for administrative tasks; and participants’ amplified feelings of stress and isolation. Was it possible to transform the traditional leadership academy model into a fully virtual one and still achieve the intended learning and professional outcomes? This brief provides an overview of the revised delivery model of a leadership academy for newly hired administrators and the empirical evidence regarding its success.


THE ACADEMY


This seven-month leadership academy was designed for a cohort of new administrators (less than five years of administrative experience). The theory of change undergirding this academy is that the professional development needs of program administrators differ by their developmental stage (McCormick Center, 2018). The professional learning for novice administrators should begin with leadership essentials—the foundational competencies needed to build or maintain a thriving organization and lead quality improvement efforts at the classroom, program, and organizational levels.


Foundational competencies include knowledge and application of the Whole Leadership Framework, reflective and culturally responsive practices, communication and leadership styles, time management strategies, and intentional leadership practices for a healthy organizational climate (Bloom, 2016; Masterson et al., 2019). These foundational competencies support the achievement of additional competencies in the administrative and pedagogical leadership areas. The ability to lead effective early childhood programs is directly related to a leader’s self-efficacy. A program leader with high levels of self-efficacy will be resilient and persistent in the face of challenges and setbacks—a critical component of successful leadership (Bandura, 1997; Bloom, 1984).


The original plan for the academy was to offer multiple, full-day, in-person professional development sessions. Those sessions would involve numerous opportunities for self-reflection, small group discussions, peer-to-peer support, and large group discussions. With the onset of the pandemic, the McCormick Center team was forced to rethink the delivery model. The adapted model of the academy consisted of the Ready to Lead Institute (16 half-day remote learning sessions focused on leadership essentials) followed by Continuing the Journey with Aim4Excellence™ (four remote sessions facilitating learning from the asynchronous online modules focused on either pedagogical or administrative leadership).


It was understood that participants would face many challenges navigating the pandemic and staffing shortages while developing their leadership and management skills. Concerns about how to adapt the academy around these challenges were numerous, but three overarching questions rose to the top:


  1. Would it be possible for participants to focus on and prioritize their learning while also navigating personal and professional challenges? Attendance and completion of all of the components of the academy are critical for success. Sessions, topics, and materials build on each other, and conversations and themes are carried throughout the full academy.
  2. Would participants value the knowledge and skills targeted in the academy? With so much additional stress and uncertainty facing administrators, it would be challenging for participants to experience the content as timely and meaningful. We worried about how to ensure topics we knew were critical for novice administrators still felt important to participants.
  3. Would the virtual academy effectively increase knowledge and skills, levels of self-efficacy, and a sense of belonging to a professional community? We wanted to know if we could deliver on our original goals for the academy, despite all of the adaptations we would need to make.


METHODOLOGY


PARTICIPANTS AND PROGRAMS


Participants included 11 program administrators located in a small state. Of the 11 participants, six (55%) identified their role as Director, four (36%) as Assistant Director, and one (9%) as Principal. Participants self-identified their gender and racial categories. All participants identified as female. The racial composition of participants included four (36%) individuals who were Asian, three (27%) who were Native Hawaiian or other Pacific Islander, three (27%) who were Multiracial, and one (9%) who was White or Caucasian. The two largest groups of participants (four individuals each, 36%) were between the ages of 40 and 49 and between the ages of 50 and 59; another three individuals (27%) were between the ages of 30 and 39. Participants indicated they had worked in the field of early childhood education for an average of 16 years, worked in an administrative position for an average of 4 years, and worked in their current administrative position for an average of 1.6 years.

Licensed capacity of the programs ranged between 61 and 120 children in four programs (36%), over 121 children in another four programs (36%), and between 1 and 60 children in three programs (27%). All programs served preschoolers. In addition, seven programs (64%) served toddlers, four programs (36%) served school-aged children, and two programs (18%) served infants.


FORMAL MEASURES


The Training Needs Assessment Survey


The Training Needs Assessment Survey (TNAS) is a 40-question survey of perceived knowledge and skill in areas deemed critical to successfully navigating a new leadership position. Respondents are asked to indicate their current level of knowledge or skill on a 5-point Likert scale (from 1 = I have no knowledge/skills to 5 = I am highly knowledgeable/skilled). Responses on the TNAS provide insight into the skills and knowledge participants gained over time. The TNAS is administered online and takes approximately seven minutes to complete.
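For readers who want to see the arithmetic behind the TNAS totals reported in the Findings section, a TNAS-style total can be sketched as a simple sum of Likert responses. The 40-item count and the 1–5 response scale come from the survey description above; the function name, validation, and sample responses below are illustrative assumptions, not the McCormick Center’s actual scoring code.

```python
# Illustrative sketch of TNAS-style total scoring (40 items rated 1-5,
# per the survey description; this function and the sample responses
# are hypothetical, not the instrument's published scoring procedure).
def tnas_total(responses):
    """Sum 40 Likert responses (1 = no knowledge/skills ... 5 = highly knowledgeable/skilled)."""
    if len(responses) != 40:
        raise ValueError("TNAS has 40 items")
    if not all(1 <= r <= 5 for r in responses):
        raise ValueError("responses must be on the 1-5 scale")
    return sum(responses)

# A respondent answering "3" to every item scores 120.
print(tnas_total([3] * 40))  # 120
```

A total therefore falls between 40 and 200, which is consistent with the pre- and post-training means reported in the Findings section.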


The Administrator Role Perception Survey


The Administrator Role Perception Survey (ARPS) is a 25-minute survey for center-based program administrators. Administrators complete the survey online and are later provided with an individual ARPS Profile. The ARPS Profile provides administrators with information about themselves as leaders. It identifies administrators’ developmental career stages based not on years of experience but rather on their perceptions of mastery of key early childhood program leadership competencies. The Profile incorporates the McCormick Center’s Whole Leadership Framework into the results, providing administrators with information on the amount of time they spend on administrative and pedagogical leadership functions as well as their strengths and areas for growth in each of the three interdependent leadership areas. The three areas of the Whole Leadership Framework—leadership essentials, pedagogical leadership, and administrative leadership—encompass everything the administrator, and often other staff members, do as early childhood program leaders.


Final Evaluation Survey


A final evaluation survey was administered online after the conclusion of the academy. Participants were asked to provide feedback regarding their experience with various components of the leadership academy, perceived areas of professional growth, and information regarding the role of the leadership academy in their professional journeys. Data from the final evaluations provided a reflection on the overall impact of the leadership academy and the identification of targeted areas for future improvement.


FINDINGS


Q1. Would it be possible for participants to focus on and prioritize their learning while also navigating personal and professional challenges?

The original design of the academy involved multiple, back-to-back days of in-person training. The adapted design shifted to remote learning sessions conducted by McCormick Center faculty in a seminar-like atmosphere over several months. Each participant received three books, a binder of handouts, a journal for self-reflection, and articles to guide their learning. Participants met virtually on a live platform from their home or work office for 16 half-days over four consecutive months. Rather than being scheduled consecutively, training days were clustered by content, allowing deep immersion in each topic with a continuous focus on overarching themes. The half-day format also freed up portions of each training day so participants could still address their programs’ daily needs. Participants continued their learning between clustered sessions with required readings and brief assignments related to the topics discussed.


Providing a leadership academy that fosters a professional learning community inclusive of critical inquiry, a safe learning environment, and trusting relationships remained a high priority. This meant faculty needed to consider new ways to engage participants with both the faculty and each other. Faculty worked to construct a virtual environment where participants could feel the same level of comfort and appreciation as they would in an in-person classroom. This began with packing and mailing each participant a hospitality box—a box including learning materials (e.g., books, a journal for reflection, and learning resources) as well as fun and functional items that would typically be in a training room (e.g., pens and pencils, fidget toys, snacks, tea bags and instant coffee, notes of inspiration, etc.). The hospitality box was delivered to each participant at their preferred address, work or home, with the instructions not to open it until the first virtual session. At the first session, everyone opened their boxes together—a professional Big Reveal that set the tone for all sessions.

During sessions, participants were encouraged to share passions and struggles, activities were designed to allow for the co-construction of knowledge in Zoom breakout rooms, and time was set aside for participants to use their journals to reflect on assumptions and beliefs about their roles and the vital work they do. A special focus was placed on acquiring strategies to help prioritize work. Participants were encouraged to rethink their previous practices and implement methods of shared decision-making and leadership across their programs. Through follow-up assignments and resources, administrators were challenged to refine their practices, acquire new competencies, gain insights, and become more confident and effective leaders.


Overall, 82% of participants (9 of 11) fully completed the academy, including the 48 hours of online learning and the assessments for learning embedded within the three modules they selected from Aim4Excellence, a national online director credential. The two participants who did not complete the academy reported changes in their roles and responsibilities, as well as significant personal issues. This suggests that, despite the leadership academy redesign and radical hospitality efforts, outside forces still deeply impacted some participants’ capacities to fully engage in the professional learning experience.


Q2. Would participants value the knowledge and skills targeted in the academy?

The overall format of the academy encouraged a collegial atmosphere with formal presentations, large-group discussions, small Zoom breakout room activities, role-playing, opportunities for reflection, short assignments, and time to develop intentions and next steps. The scenarios and case studies used during the sessions were drawn from participant responses on pre-data collection measures. This personalization of the curriculum content, based on the unique program characteristics and themes that emerged from the data, enhanced the training’s applicability and helped ensure the content felt relevant to participants.


After the 16 remote learning sessions focused on leadership essentials, participants used their pre-academy Administrator Role Perception Survey (ARPS) Profiles to individualize a plan for the next phase of the academy, selecting three modules from Aim4Excellence to achieve competencies in either administrative leadership or pedagogical leadership. This choice provided participants with a sense of learning autonomy as they continued their leadership journey.


During this second phase of the academy, participants completed the three self-selected Aim4Excellence modules and engaged in a monthly facilitated online cohort (either a pedagogical leadership cohort or an administrative leadership cohort) for an additional three months. Designed as an engaging and interactive online professional learning experience, Aim4Excellence explores the core leadership competencies that early childhood program leaders need. The facilitation provided by faculty supported participants in completing the modules and applying new knowledge and skills in the participants’ respective centers. The facilitated cohort model for the Aim4Excellence modules provided the opportunity for continued peer learning and support built on the trust established during the first phase of the academy, Ready to Lead.


The Final Evaluation Survey provided answers to question 2. Overall, participants found the academy useful to their professional growth, though challenging, especially in the face of ever-increasing personal and professional demands. On a scale of 1 = not at all to 5 = completely, participants rated the degree to which they agreed with general statements regarding different components of the academy. Table 1 provides means and standard deviations for each statement.

Table 1

Next, participants were asked to rate how useful (on a scale of 1 = not at all to 5 = very) various elements of the academy were and to share any additional information about their ratings. Results showed that all elements were perceived as useful, with the materials provided receiving the highest rating and follow-up assignments and resources the lowest. Means and standard deviations are provided in Table 2. Of particular interest were participants’ ratings of the usefulness of the ARPS Profile for informing their choice of Aim4Excellence modules: four participants rated it somewhat useful, three rated it a bit useful, and five rated it very useful. Taken together, these ratings may suggest the need for additional professional development in interpreting and using individual ARPS Profiles to guide choices in leadership development.

Table 2

Q3. Would the virtual academy be effective at improving knowledge and skills, levels of self-efficacy, and a sense of belonging to a professional community?

Formal data gathered from the TNAS, ARPS, and Final Evaluation allowed us to examine the impact of the academy on the desired learning and professional outcomes.



Training Needs Assessment Survey

Paired t-tests were employed to explore changes in participants’ TNAS scores across time. The results of the data analyses revealed increases in all 40 areas and statistically significant increases in mean scores for 35 of the 40 areas assessed. Additionally, total TNAS scores showed a significant mean increase in knowledge and skill from pre (M = 103.44, sd = 23.55) to post (M = 161.89, sd = 19.69), t(8) = 6.56, p < .001. These results suggest that the training had a strong, measurable impact on participants’ level of knowledge and skill across a wide array of program leadership areas. Table 3 provides a summary of TNAS pre- and post-means and the results of the t-tests showing the improvements over time.
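As a sketch of the analysis described above, the paired t statistic can be computed directly from the completers’ difference scores as t = mean(d) / (SD(d) / sqrt(n)), with df = n − 1. The pre/post totals below are invented for illustration, since the brief does not publish raw scores; only the formula and the degrees of freedom (df = 8 for nine completers) mirror the reported t(8).

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-test statistic: t = mean(d) / (stdev(d) / sqrt(n)), d = post - pre."""
    if len(pre) != len(post):
        raise ValueError("paired samples must have equal length")
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    t = mean(d) / (stdev(d) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Invented pre/post TNAS totals for nine completers (not the study's raw data)
pre = [80, 95, 100, 110, 120, 90, 105, 130, 101]
post = [150, 160, 155, 170, 175, 145, 165, 190, 147]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

With n = 9 this yields df = 8, matching the reported t(8); a significance test would then compare t against the t distribution with 8 degrees of freedom (for example, via scipy.stats where available).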

Table 3 | Pre- and Post-Training Knowledge and Skill Scores

Administrator Role Perception Survey



A total of 11 participants completed an ARPS prior to the start of the academy, and eight completed a post-ARPS upon completion of the Aim4Excellence modules. The results of the one-sample t-tests show that participants’ confidence significantly increased in all three Whole Leadership Framework domains, as well as in nine specific competencies. Table 4 shows the average scores and standard deviations for each competency (on a 4-point scale) pre- and post-training, the gain over time, and any statistically significant change from the beginning to the end of the training.

Table 4

Moreover, comparisons of pre- and post-ARPS data suggest that the percentages of participants who felt Not Confident to Somewhat Confident in each of the three whole leadership domains before the training decreased while the percentages of participants who felt Confident to Very Confident in each of the whole leadership domains increased after the training. These gains in confidence are most notable in the domains self-selected by the participants: administrative leadership and pedagogical leadership. Figure 1 compares the frequencies of participant confidence levels before and after the academy.



Figure 1 | Confidence Ranges by Whole Leadership Domain Across Time


The ARPS also asks respondents to identify the three words that best describe their current role. We were interested in knowing whether completion of the academy was associated with more favorable descriptions of the administrator role. Results of pre- and post-comparisons showed a 13% increase in the selection of the word “coach,” a 16% increase in “leader,” an 11% increase in “mentor,” and a 29% increase in “motivator.” Results also showed a 23% decrease in the selection of “crisis manager” and a 9% decrease in “referee.” Interestingly, there was also an 11% decrease in the selection of “decision maker,” which likely reflects participants’ increased understanding and use of shared decision-making. These changes reflect a more positive portrayal of the administrative role, with participants selecting terms associated with effective, confident leadership and a positive work climate. Figure 2 shows changes in all words selected across time.

Figure 2 | Role Descriptors Selected Across Time

Final Evaluation Survey



To learn more about increased internalization of leadership practices and improved program practices, we asked participants to rate, on a scale of 1 = not at all to 5 = completely, how much they agreed with statements regarding changes resulting from participation in the academy. All participants rated all statements positively, ranging from 3 = somewhat to 5 = completely, indicating that the academy resulted in growth in all areas. Table 5 provides the ranges and means for each statement. Taken as a whole, the data suggest that the academy increased the number of administrators who internalized fundamental leadership practices, increased administrators’ passion and commitment to the work and their programs, and improved program practices.

Table 5 | Changes as a Result of Participation in the Academy

While the data collected to evaluate the academy were overwhelmingly positive and suggestive of the achievement of the academy’s desired outcomes, a few notable areas for improvement and reflection were identified.



One specific area in which future leadership academies may seek to make changes is the outcome related to increasing program administrators’ access to resources that help them support the professional growth of their staff. While there was some evidence that participants benefited from specific resources and from using their ARPS Profiles to support their own professional growth, the data also suggest that novice administrators need more learning and support for their role in mentoring staff. Future leadership academies may need to consider adding curriculum items targeted toward building the capacity to seek out relevant professional resources.


When asked about suggestions for improvement to the academy, three themes emerged: no suggestions at this time (22%), additional information at the start of the academy regarding the time commitment (11%), and a desire to meet in person (44%).


DISCUSSION


This research brief offers a window into the changes made due to the pandemic to one of the McCormick Center’s traditional models of leadership development, the leadership academy. Further, it provides evaluation data from a fully remote (both synchronous and asynchronous) delivery model of a leadership academy, demonstrating many successful components and suggestions for improvements moving forward. Taken as a whole, the brief offers a springboard for future considerations on how to provide administrators with the targeted leadership development critical to the success of ECEC programs without compromising the elements we know to be vital in supporting learning and application of knowledge and skills.


While the pandemic significantly affected the ability of participants and faculty to travel and meet in person for this cohort of program leaders, the continuing staffing crisis makes a return to in-person professional development impractical in the near future. This study demonstrates that rigorous learning outcomes can be met via a fully virtual leadership academy. However, it is important to acknowledge that in-person learning is perceived by many participants as more desirable and has some distinct advantages for meeting the needs of adult learners (e.g., ease of creating a community of practice and engaging in effective small group activities). The long-term solution to the workforce crisis is not yet clear, but the need for flexible delivery models of coherent and intensive leadership development is well established.


The leadership academy—with time to reflect and apply new learning in a community of practice with peers—continues to be an effective and viable model of professional development for program administrators. Professional development entities must aim to deliver meaningful content that leads to mastery of competencies that impact professional practices. This high standard for professional development can be met through in-person, remote, or hybrid learning opportunities.


REFERENCES


By McCormick Center May 13, 2025
Leaders, policymakers, and systems developers seek to improve early childhood programs through data-driven decision-making. Data can be useful for informing continuous quality improvement efforts at the classroom and program level and for creating support for workforce development at the system level. Early childhood program leaders use assessments to help them understand their programs’ strengths and to draw attention to where supports are needed.  Assessment data is particularly useful in understanding the complexity of organizational climate and the organizational conditions that lead to successful outcomes for children and families. Several tools are available for program leaders to assess organizational structures, processes, and workplace conditions, including: Preschool Program Quality Assessment (PQA) 1 Program Administration Scale (PAS) 2 Child Care Worker Job Stress Inventory (ECWJSI) 3 Early Childhood Job Satisfaction Survey (ECJSS) 4 Early Childhood Work Environment Survey (ECWES) 5 Supportive Environmental Quality Underlying Adult Learning (SEQUAL) 6 The Early Education Essentials is a recently developed tool to examine program conditions that affect early childhood education instructional and emotional quality. It is patterned after the Five Essentials Framework, 7 which is widely used to measure instructional supports in K-12 schools. The Early Education Essentials measures six dimensions of quality in early childhood programs: Effective instructional leaders Collaborative teachers Supportive environment Ambitious instruction Involved families Parent voice A recently published validation study for the Early Education Essentials 8 demonstrates that it is a valid and reliable instrument that can be used to assess early childhood programs to improve teaching and learning outcomes. 
METHODOLOGY For this validation study, two sets of surveys were administered in one Midwestern city; one for teachers/staff in early childhood settings and one for parents/guardians of preschool-aged children. A stratified random sampling method was used to select sites with an oversampling for the percentage of children who spoke Spanish. The teacher surveys included 164 items within 26 scales and were made available online for a three-month period in the public schools. In community-based sites, data collectors administered the surveys to staff. Data collectors also administered the parent surveys in all sites. The parent survey was shorter, with 54 items within nine scales. Rasch analyses was used to combine items into scales. In addition to the surveys, administrative data were analyzed regarding school attendance. Classroom observational assessments were performed to measure teacher-child interactions. The Classroom Assessment Scoring System TM (CLASS) 9 was used to assess the interactions. Early Education Essentials surveys were analyzed from 81 early childhood program sites (41 school-based programs and 40 community-based programs), serving 3- and 4-year old children. Only publicly funded programs (e.g., state-funded preschool and/or Head Start) were included in the study. The average enrollment for the programs was 109 (sd = 64); 91% of the children were from minority backgrounds; and 38% came from non-English speaking homes. Of the 746 teacher surveys collected, 451 (61%) were from school-based sites and 294 (39%) were from community-based sites. There were 2,464 parent surveys collected (59% school; 41% community). About one-third of the parent surveys were conducted in Spanish. Data were analyzed to determine reliability, internal validity, group differences, and sensitivity across sites. 
Child outcome results were used to examine if positive scores on the surveys were related to desirable outcomes for children (attendance and teacher-child interactions). Hierarchical linear modeling (HLM) was used to compute average site-level CLASS scores to account for the shared variance among classrooms within the same school. Exploratory factor analysis was performed to group the scales. RESULTS The surveys performed well in the measurement characteristics of scale reliability, internal validity, differential item functioning, and sensitivity across sites . Reliability was measured for 25 scales with Rasch Person Reliability scores ranging from .73 to .92; with only two scales falling below the preferred .80 threshold. The Rasch analysis also provided assessment of internal validity showing that 97% of the items fell in an acceptable range of >0.7 to <1.3 (infit mean squares). The Teacher/Staff survey could detect differences across sites, however the Parent Survey was less effective in detecting differences across sites. Differential item functioning (DIF) was used to compare if individual responses differed for school- versus community-based settings and primary language (English versus Spanish speakers). Results showed that 18 scales had no or only one large DIF on the Teacher/Staff Survey related to setting. There were no large DIFs found related to setting on the Parent Survey and only one scale that had more than one large DIF related to primary language. The authors decided to leave the large DIF items in the scale because the number of large DIFs were minimal and they fit well with the various groups. The factor analysis aligned closely with the five essentials in the K-12 model . However, researchers also identified a sixth factor—parent voice—which factored differently from involved families on the Parent Survey. Therefore, the Early Education Essentials have an additional dimension in contrast to the K-12 Five Essentials Framework. 
Outcomes related to CLASS scores were found for two of the six essential supports . Positive associations were found for Effective Instructional Leaders and Collaborative Teachers and all three of the CLASS domains (Emotional Support, Classroom Organization, and Instructional Support). Significant associations with CLASS scores were not found for the Supportive Environment, Involved Families, or Parent Voice essentials. Ambitious Instruction was not associated with any of the three domains of the CLASS scores. Table 1. HLM Coefficients Relating Essential Scores to CLASS Scores (Model 1) shows the results of the analysis showing these associations. Outcomes related to student attendance were found for four of the six essential supports . Effective Instructional Leaders, Collaborative Teachers, Supportive Environment, and Involved Families were positively associated with student attendance. Ambitious Instruction and Parent Voice were not found to be associated with student attendance. The authors are continuing to examine and improve the tool to better measure developmentally appropriate instruction and to adapt the Parent Survey so that it will perform across sites. There are a few limitations to this study that should be considered. Since the research is based on correlations, the direction of the relationship between factors and organizational conditions is not evident. It is unknown whether the Early Education Essentials survey is detecting factors that affect outcomes (e.g., engaged families or positive teacher-child interactions) or whether the organizational conditions predict these outcomes. This study was limited to one large city and a specific set of early childhood education settings. It has not been tested with early childhood centers that do not receive Head Start or state pre-K funding. 
DISCUSSION

The Early Education Essentials survey expands the capacity of early childhood program leaders, policymakers, systems developers, and researchers to assess organizational conditions that specifically affect instructional quality. It is likely to be a useful tool for administrators seeking to evaluate the effects of their pedagogical leadership, one of the three domains of whole leadership. 10 When used with additional measures to assess whole leadership (administrative leadership and leadership essentials as well as pedagogical leadership), stakeholders will be able to understand the organizational conditions and supports that positively impact child and family outcomes.

Many quality initiatives focus on assessment at the classroom level, but examining quality with a wider lens at the site level expands the opportunity for sustainable change and improvement. Valid and reliable instruments for assessing the organizational structures, processes, and conditions within early childhood programs are necessary for data-driven program improvement as well as for systems development and applied research.

Findings from this validation study confirm that strong instructional leadership and teacher collaboration are good predictors of effective teaching and learning practices, as evidenced in supportive teacher-child interactions and student attendance. 11 This evidence is an important contribution to the growing body of knowledge informing embedded continuous quality improvement efforts. It also suggests that leadership supporting teacher collaboration, such as through professional learning communities (PLCs) and communities of practice (CoPs), may have an effect on outcomes for children.

This study raises questions for future research. The addition of the parent voice essential support should be further explored: if parent voice is an essential support, why was it not related to CLASS scores or student attendance?
With the introduction of the Early Education Essentials survey to the existing battery of program assessment tools (PQA, PAS, ECWJSI, ECWES, ECJSS, and SEQUAL), a concurrent validity study is needed to determine how these tools are related and how they can best be used to examine early childhood leadership from a whole leadership perspective.

ENDNOTES

1 High/Scope Educational Research Foundation, 2003
2 Talan & Bloom, 2011
3 Curbow, Spratt, Ungaretti, McDonnell, & Breckler, 2000
4 Bloom, 2016
5 Bloom, 2016
6 Whitebook & Ryan, 2012
7 Bryk, Sebring, Allensworth, Luppescu, & Easton, 2010
8 Ehrlich, Pacchiano, Stein, Wagner, Park, Frank, et al., 2018
9 Pianta, La Paro, & Hamre, 2008
10 Abel, Talan, & Masterson, 2017
11 Bloom, 2016; Lower & Cassidy, 2007

REFERENCES

Abel, M. B., Talan, T. N., & Masterson, M. (2017, Jan/Feb). Whole leadership: A framework for early childhood programs. Exchange, 39(233), 22-25.

Bloom, P. J. (2016). Measuring work attitudes in early childhood settings: Technical manual for the Early Childhood Job Satisfaction Survey (ECJSS) and the Early Childhood Work Environment Survey (ECWES) (3rd ed.). Lake Forest, IL: New Horizons.

Bryk, A. S., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2010). Organizing schools for improvement: Lessons from Chicago. Chicago, IL: The University of Chicago Press.

Curbow, B., Spratt, K., Ungaretti, A., McDonnell, K., & Breckler, S. (2000). Development of the Child Care Worker Job Stress Inventory. Early Childhood Research Quarterly, 15, 515-536. DOI: 10.1016/S0885-2006(01)00068-0

Ehrlich, S. B., Pacchiano, D., Stein, A. G., Wagner, M. R., Park, S., Frank, E., et al. (in press). Early Education Essentials: Validation of a new survey tool of early education organizational conditions. Early Education and Development.

High/Scope Educational Research Foundation. (2003). Preschool Program Quality Assessment, 2nd edition (PQA) administration manual. Ypsilanti, MI: High/Scope Press.

Lower, J. K., & Cassidy, D. J. (2007). Child care work environments: The relationship with learning environments. Journal of Research in Childhood Education, 22(2), 189-204. DOI: 10.1080/02568540709594621

Pianta, R. C., La Paro, K. M., & Hamre, B. K. (2008). Classroom Assessment Scoring System (CLASS). Baltimore, MD: Paul H. Brookes Publishing Co.

Talan, T. N., & Bloom, P. J. (2011). Program Administration Scale: Measuring early childhood leadership and management (2nd ed.). New York, NY: Teachers College Press.

Whitebook, M., & Ryan, S. (2012). Supportive Environmental Quality Underlying Adult Learning (SEQUAL). Berkeley, CA: Center for the Study of Child Care Employment, University of California.