Professional Learning Program Development

Professional development and learning have evolved from sit-and-get, one-and-done programs into more distributed, longer-term series of shorter experiences built on relationships, interactions, and shared learning among peers. We help our clients transform their offerings to align with a future of collaborative communities of practice.

Getting there requires a clear set of principles that drive the work and spell out what the program will look like and the desired outcomes. These principles should:

  • Be written in a way that allows for program growth and evolution over time;
  • Provide a focus for the program; and
  • Drive the work and shed light on the opportunities that exist; the principles themselves are not the work.

We build professional learning programs on four key principles:
  • A program is built for its participants. Design begins with the “who.” Who will the program serve? How will participants be grouped? How will the program reach participants who serve in different roles within organizations? And most importantly, how can its participants help support the program and its growth?


    • It should be clear whom the program serves: educators, administrators, or a mix of people serving in various roles

    • The program should address the varying needs, skill levels, and learning goals of participants

    • There should be a clear entry point for participants to join and/or be accepted into the program

    • The program should invite participants to help define what the community looks like and how it functions


    Deliverables for this principle may include:

    • Application process outline and procedures

    • Participant application

    • Scoring rubric and criteria, etc. for application process

    • Program overview (one-pager, intro video, etc.)

    • FAQ

    • Informational email for each session

    • Spreadsheet of participant data (for mail merge)

    • Self-assessment for data to design session(s)

  • The structure of a program is a driver of engagement. How will opportunities be delivered and how does a program’s scaling fuel its delivery methods? What opportunities can be offered to promote continuous learning before, after, and between program sessions? How will the program’s structure support growth and evolution year to year?


    • Program structure should be clearly articulated: asynchronous, synchronous, blended, or hybrid

    • Program should be scaled appropriately, whether for large numbers or for a smaller cohort model

    • Program should offer other opportunities (outside of its core sessions) during and after its duration

      • Leadership and sharing opportunities

      • Field trips or on-site learning

      • Learning from experts

      • Opportunities to engage in communities of practice

    • Program should offer a fixed, clearly communicated number of sessions

    • Participants can collect, curate, or create a product along the way to show and reflect on their learning

    • Program should have a clear goal of growing year to year using its own human capital (community) to evolve

    • Participants should leave the program with both tangible and intangible experiences and learning


    Deliverables for this principle may include:

    • Scope and sequence of sessions

    • Schedule of sessions, events, etc.

    • Logistics of each session, event, etc.

      • What, when, where, who, why, how

    • Guidelines for both synchronous and asynchronous participation and facilitation

    • “Run of show” document for facilitators to outline important information, expectations, responsibilities, and deadlines

    • Training documents, videos, etc. for facilitators (if needed)

  • The content is the “what” of a program. How is the content specific and customized to the participants and their varying roles? How does content inform participants’ practice? How can it be designed so that it’s easily applicable and able to be reused and repurposed by participants? How can the content of the program promote reflection and growth?


    • Program should be content specific and customized to the audience

    • Program content can serve as a skeleton that participants can use to model similar programs in their own organizations (i.e., the learning and experiences should be delivered in such a way that they are applicable and can be augmented)

    • Content should be delivered through a website or LMS, or be curated in some way for participants to easily access during and after the program

    • Content should be made “just-in-time” and easily applicable to the various roles of participants

    • Participants should be able to take content and use it in other capacities or use it to inform their practice in their role

    • Participants should have a role in defining what content can look like based on their evolving needs

    • Content should include some type of coaching cycle or feedback loop to promote ongoing reflection and a recursive process for growth


    Deliverables for this principle may include:

    • Website, LMS, or other curated place where all information will live

    • Program foundation elements

    • Agendas, anchor docs, etc. for participants

    • “Run of show” document for facilitators to outline important information, expectations, responsibilities, and deadlines

    • Seating charts, group lists, etc.

    • Instructions for accessing content during and after session(s)

    • Collateral materials such as an action plan, action research documentation, etc.

    • CEU and clock hour certificates

  • To grow, enhance, and transform the program over time, there must be a means of measurement. What forms of data are collected, used, and shared with participants and stakeholders? How can the program’s goals be written to be measured or demonstrated? What opportunities are built in for participants to reflect and offer feedback on their experiences?


    • Data should be collected to inform program structure, content, etc. moving forward

    • Participants should be given the opportunity to reflect on and offer feedback for the program

    • Data should be collected, curated, and disseminated or shared with all stakeholders

    • Program goals should be written so they can be measured or demonstrated

    • There should be a balance of quantitative and qualitative data

    • There should be a clear schedule of how often data will be collected


    Deliverables for this principle may include:

    • Evaluations and surveys for participants (Google Forms, Qualtrics, SurveyMonkey, etc.)

      • First session

      • Last session

      • In-between sessions

    • Schedule and outlines for participant interviews