Conversations with AI: How Artificial Intelligence Can Make Life and Work Easier

This document may be printed, photocopied, and disseminated freely with attribution. All content is the property of the McCormick Center for Early Childhood Leadership.

What do you picture when you hear the term Artificial Intelligence (AI)? Many people imagine futuristic robots, or even terrifying machines capable of doing everything we can, but better. This perception is often shaped by popular culture: movies, television shows, and books depict AI in forms ranging from helpful assistants, like Rosie the Robot from The Jetsons, to menacing rivals. Discussions around AI also often frame it as a threat, emphasizing potential negative impacts on jobs and society. As a result, many people associate AI with uncertainty about the future.


Feeling intimidated by new things is natural. But you know what helps? Embracing them! Once you understand a technology and start playing around with it, you might just find yourself loving it!


This is what happened to me with AI, particularly ChatGPT – a popular AI tool developed by OpenAI and trained to assist with a variety of tasks and to respond to our questions and statements in a conversational, human-like manner.


Initially, I brushed off AI, thinking of it as something that only concerned those in the technology world where teams of tech experts were working on futuristic concepts. However, the topic of AI and ChatGPT was getting so much attention that it piqued my curiosity. I discovered numerous videos of people demonstrating a number of ways they were using AI in their everyday work. Many also often shared how using ChatGPT to expedite tedious or repetitive tasks freed up more of their time to focus on other, more critical responsibilities. So, I thought, Why not give it a try?


Before I go much further, I thought it might help to give definitions of a few key terms:

  • Prompt: The input the user enters to initiate a conversation or request a specific response from ChatGPT. It can be a question, a statement, or any text the user provides to start the interaction. ChatGPT generates responses based on the prompt it receives, showcasing its ability to understand context and provide relevant information.
  • Prompt Expansion: Providing additional context or information in the prompt to guide ChatGPT’s responses. This enhances ChatGPT’s understanding of what the user is looking for.
  • Response: The output generated by ChatGPT in reply to the user’s prompt. It can vary in length and complexity depending on the input provided.
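For the technically curious, the same prompt-and-expansion pattern applies whether you type into the ChatGPT website or build on top of a chat-based AI service. The sketch below is a hypothetical illustration only (no AI service is actually called, and the helper function is invented for this example): it simply shows how a basic prompt can be expanded with extra context before it is sent.

```python
# Hypothetical illustration of a prompt vs. an expanded prompt,
# modeled as the message list a chat-based AI tool typically accepts.
# No network call is made; this only assembles the text.

def build_messages(prompt, expansions=None):
    """Combine a base prompt with optional expansion details into one user message."""
    text = prompt
    for extra in (expansions or []):
        text += " " + extra
    return [{"role": "user", "content": text}]

# A basic prompt, as in the retirement-email example:
basic = build_messages("Write an email for my supervisor who is retiring.")

# The same prompt with expansions that guide style and content:
expanded = build_messages(
    "Write an email for my supervisor who is retiring.",
    expansions=[
        "Express well wishes for her retirement and gratitude for her support.",
        "Keep the tone warm and professional.",
    ],
)
```

The expanded version gives the tool more to work with, which is why prompt expansion tends to produce responses closer to what you had in mind.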


My first encounter with ChatGPT involved creating an important email. I drafted a prompt – a few sentences to explain what I wanted – and then submitted it to ChatGPT: “Write an email for my supervisor who is retiring. Express well wishes for her retirement and gratitude for all her support over the years.”


I was thrilled by the initial response: ChatGPT took my prompt and returned a well-polished email—the flow, language, and tone were all spot-on! I also discovered that I could shape ChatGPT’s output. Whether specifying writing style, providing context, or explaining the task at hand, I realized that I could use prompt expansions to guide ChatGPT in generating the unique responses I needed. I was able to edit my prompts and ask follow-up questions, and ChatGPT kept building on our interaction. Ultimately, it felt like I had a full conversation with ChatGPT. We had back-and-forth written dialogue, and it kept adjusting its responses to better meet my requests. I thought it was amazing.


Since then, ChatGPT has been an ally. We have had numerous conversations, and it has assisted me with a variety of tasks, from answering quick questions to revising my resume. ChatGPT even assisted in writing this blog! As an early childhood leader, you can make AI your ally, too!


For example, AI can assist with:

  • Organizing staffing schedules and calendars
  • Generating and fine-tuning program communications like emails or memos to staff, letters to grant funders, or newsletters to families
  • Supporting menu planning and making sure that nutrition guidelines are met
  • Revising policies and procedures like crafting a checklist for opening and closing or wordsmithing a late pick-up policy
  • Finding resources to support staff training and development
  • Generating ideas for family resources
  • Brainstorming creative ways to enhance learning in the classroom
  • Creating marketing content such as an elevator pitch highlighting the benefits of your program
  • Simply serving as a thought partner


The potential of AI is truly endless! If you need to cite AI in your work, you can find information about how to do that here.


In our upcoming technology training Unlocking the Potential of AI in Child Care, my colleague Robyn Kelton and I plan to share a number of ways administrators can harness AI’s power in their own programs. Participants will learn how ChatGPT can provide instant answers, enhance lesson planning, facilitate communication, and assist in problem-solving.


Register here to join this free webinar—we’d love to help you tap into the power of AI.


Irina Tenis, Ph.D., is the Data and Evaluation Coordinator for the McCormick Center for Early Childhood Leadership at National Louis University (NLU). Irina was trained as a data analyst at Northwestern University, where she completed a Data Science and Visualization Boot Camp. She also holds a Ph.D. in Linguistics, along with bachelor’s and master’s degrees in Education and English as a Foreign Language. Irina has worked in education for more than 20 years; prior to joining the McCormick Center, she was a Senior ESL Academic Assistant at the College of DuPage, where she supervised the department’s work with non-native English-speaking students.

By McCormick Center May 13, 2025
Leaders, policymakers, and systems developers seek to improve early childhood programs through data-driven decision-making. Data can be useful for informing continuous quality improvement efforts at the classroom and program level and for creating support for workforce development at the system level. Early childhood program leaders use assessments to help them understand their programs’ strengths and to draw attention to where supports are needed. Assessment data is particularly useful in understanding the complexity of organizational climate and the organizational conditions that lead to successful outcomes for children and families. Several tools are available for program leaders to assess organizational structures, processes, and workplace conditions, including:

  • Preschool Program Quality Assessment (PQA) 1
  • Program Administration Scale (PAS) 2
  • Child Care Worker Job Stress Inventory (ECWJSI) 3
  • Early Childhood Job Satisfaction Survey (ECJSS) 4
  • Early Childhood Work Environment Survey (ECWES) 5
  • Supportive Environmental Quality Underlying Adult Learning (SEQUAL) 6

The Early Education Essentials is a recently developed tool to examine program conditions that affect early childhood education instructional and emotional quality. It is patterned after the Five Essentials Framework, 7 which is widely used to measure instructional supports in K-12 schools. The Early Education Essentials measures six dimensions of quality in early childhood programs:

  • Effective instructional leaders
  • Collaborative teachers
  • Supportive environment
  • Ambitious instruction
  • Involved families
  • Parent voice

A recently published validation study for the Early Education Essentials 8 demonstrates that it is a valid and reliable instrument that can be used to assess early childhood programs to improve teaching and learning outcomes.
METHODOLOGY

For this validation study, two sets of surveys were administered in one Midwestern city: one for teachers/staff in early childhood settings and one for parents/guardians of preschool-aged children. A stratified random sampling method was used to select sites, with an oversampling for the percentage of children who spoke Spanish. The teacher surveys included 164 items within 26 scales and were made available online for a three-month period in the public schools. In community-based sites, data collectors administered the surveys to staff. Data collectors also administered the parent surveys in all sites. The parent survey was shorter, with 54 items within nine scales. Rasch analysis was used to combine items into scales. In addition to the surveys, administrative data were analyzed regarding school attendance. Classroom observational assessments were performed to measure teacher-child interactions, using the Classroom Assessment Scoring System™ (CLASS). 9

Early Education Essentials surveys were analyzed from 81 early childhood program sites (41 school-based programs and 40 community-based programs) serving 3- and 4-year-old children. Only publicly funded programs (e.g., state-funded preschool and/or Head Start) were included in the study. The average enrollment for the programs was 109 (sd = 64); 91% of the children were from minority backgrounds; and 38% came from non-English-speaking homes. Of the 746 teacher surveys collected, 451 (61%) were from school-based sites and 294 (39%) were from community-based sites. There were 2,464 parent surveys collected (59% school; 41% community). About one-third of the parent surveys were conducted in Spanish. Data were analyzed to determine reliability, internal validity, group differences, and sensitivity across sites.
Child outcome results were used to examine whether positive scores on the surveys were related to desirable outcomes for children (attendance and teacher-child interactions). Hierarchical linear modeling (HLM) was used to compute average site-level CLASS scores to account for the shared variance among classrooms within the same school. Exploratory factor analysis was performed to group the scales.

RESULTS

The surveys performed well in the measurement characteristics of scale reliability, internal validity, differential item functioning, and sensitivity across sites. Reliability was measured for 25 scales, with Rasch Person Reliability scores ranging from .73 to .92; only two scales fell below the preferred .80 threshold. The Rasch analysis also provided an assessment of internal validity, showing that 97% of the items fell in an acceptable range of >0.7 to <1.3 (infit mean squares). The Teacher/Staff Survey could detect differences across sites; however, the Parent Survey was less effective in detecting differences across sites. Differential item functioning (DIF) was used to compare whether individual responses differed by setting (school- versus community-based) and by primary language (English versus Spanish speakers). Results showed that 18 scales had no or only one large DIF on the Teacher/Staff Survey related to setting. There were no large DIFs found related to setting on the Parent Survey, and only one scale had more than one large DIF related to primary language. The authors decided to leave the large DIF items in the scale because the number of large DIFs was minimal and they fit well with the various groups.

The factor analysis aligned closely with the five essentials in the K-12 model. However, researchers also identified a sixth factor—parent voice—which factored differently from involved families on the Parent Survey. Therefore, the Early Education Essentials have an additional dimension in contrast to the K-12 Five Essentials Framework.
Outcomes related to CLASS scores were found for two of the six essential supports. Positive associations were found between Effective Instructional Leaders and Collaborative Teachers and all three of the CLASS domains (Emotional Support, Classroom Organization, and Instructional Support). Significant associations with CLASS scores were not found for the Supportive Environment, Involved Families, or Parent Voice essentials. Ambitious Instruction was not associated with any of the three domains of the CLASS scores. Table 1 (HLM Coefficients Relating Essential Scores to CLASS Scores, Model 1) presents the results of the analysis showing these associations.

Outcomes related to student attendance were found for four of the six essential supports. Effective Instructional Leaders, Collaborative Teachers, Supportive Environment, and Involved Families were positively associated with student attendance. Ambitious Instruction and Parent Voice were not found to be associated with student attendance. The authors are continuing to examine and improve the tool to better measure developmentally appropriate instruction and to adapt the Parent Survey so that it will perform across sites.

There are a few limitations to this study that should be considered. Since the research is based on correlations, the direction of the relationship between factors and organizational conditions is not evident. It is unknown whether the Early Education Essentials survey is detecting factors that affect outcomes (e.g., engaged families or positive teacher-child interactions) or whether the organizational conditions predict these outcomes. The study was also limited to one large city and a specific set of early childhood education settings; the tool has not been tested with early childhood centers that do not receive Head Start or state pre-K funding.
DISCUSSION

The Early Education Essentials survey expands the capacity of early childhood program leaders, policymakers, systems developers, and researchers to assess organizational conditions that specifically affect instructional quality. It is likely to be a useful tool for administrators seeking to evaluate the effects of their pedagogical leadership—one of the three domains of whole leadership. 10 When used with additional measures to assess whole leadership—administrative leadership and leadership essentials, as well as pedagogical leadership—stakeholders will be able to understand the organizational conditions and supports that positively impact child and family outcomes. Many quality initiatives focus on assessment at the classroom level, but examining quality with a wider lens at the site level expands the opportunity for sustainable change and improvement. The availability of valid and reliable instruments to assess the organizational structures, processes, and conditions within early childhood programs is necessary for data-driven improvement of programs as well as for systems development and applied research.

Findings from this validation study confirm that strong instructional leadership and teacher collaboration are good predictors of effective teaching and learning practices, evidenced in supportive teacher-child interactions and student attendance. 11 This evidence is an important contribution to the growing body of knowledge informing embedded continuous quality improvement efforts. It also suggests that leadership that supports teacher collaboration, such as professional learning communities (PLCs) and communities of practice (CoPs), may have an effect on outcomes for children.

This study raises questions for future research. The addition of the “parent voice” essential support should be further explored. If parent voice is an essential support, why was it not related to CLASS scores or student attendance?
With the introduction of the Early Education Essentials survey to the existing battery of program assessment tools (PQA, PAS, ECWJSI, ECWES, ECJSS, and SEQUAL), a concurrent validity study is needed to determine how these tools are related and how they can best be used to examine early childhood leadership from a whole leadership perspective.

ENDNOTES

1 High/Scope Educational Research Foundation, 2003
2 Talan & Bloom, 2011
3 Curbow, Spratt, Ungaretti, McDonnell, & Breckler, 2000
4 Bloom, 2016
5 Bloom, 2016
6 Whitebook & Ryan, 2012
7 Bryk, Sebring, Allensworth, Luppescu, & Easton, 2010
8 Ehrlich, Pacchiano, Stein, Wagner, Park, Frank, et al., 2018
9 Pianta, La Paro, & Hamre, 2008
10 Abel, Talan, & Masterson, 2017
11 Bloom, 2016; Lower & Cassidy, 2007

REFERENCES

Abel, M. B., Talan, T. N., & Masterson, M. (2017, Jan/Feb). Whole leadership: A framework for early childhood programs. Exchange, 39(233), 22-25.

Bloom, P. J. (2016). Measuring work attitudes in early childhood settings: Technical manual for the Early Childhood Job Satisfaction Survey (ECJSS) and the Early Childhood Work Environment Survey (ECWES) (3rd ed.). Lake Forest, IL: New Horizons.

Bryk, A. S., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2010). Organizing schools for improvement: Lessons from Chicago. Chicago, IL: The University of Chicago Press.

Curbow, B., Spratt, K., Ungaretti, A., McDonnell, K., & Breckler, S. (2000). Development of the Child Care Worker Job Stress Inventory. Early Childhood Research Quarterly, 15, 515-536. DOI: 10.1016/S0885-2006(01)00068-0

Ehrlich, S. B., Pacchiano, D., Stein, A. G., Wagner, M. R., Park, S., Frank, E., et al. (in press). Early Education Essentials: Validation of a new survey tool of early education organizational conditions. Early Education and Development.

High/Scope Educational Research Foundation (2003). Preschool Program Quality Assessment, 2nd Edition (PQA) administration manual. Ypsilanti, MI: High/Scope Press.

Lower, J. K., & Cassidy, D. J. (2007). Child care work environments: The relationship with learning environments. Journal of Research in Childhood Education, 22(2), 189-204. DOI: 10.1080/02568540709594621

Pianta, R. C., La Paro, K. M., & Hamre, B. K. (2008). Classroom Assessment Scoring System (CLASS). Baltimore, MD: Paul H. Brookes Publishing Co.

Talan, T. N., & Bloom, P. J. (2011). Program Administration Scale: Measuring early childhood leadership and management (2nd ed.). New York, NY: Teachers College Press.

Whitebook, M., & Ryan, S. (2012). Supportive Environmental Quality Underlying Adult Learning (SEQUAL). Berkeley, CA: Center for the Study of Child Care Employment, University of California.