This course is being built for the National Alliance of Preservation Commissions (NAPC) to help new historic preservation commissioners, who are inexperienced with preservation terminology and central concepts, learn the terms they need to do their commission work effectively.
The process of building this course is outlined below.
ANALYSIS
I conducted a multi-method needs assessment, combining SME interviews, stakeholder input, and targeted surveys of 44 staff advisors to historic preservation commissions and 32 currently serving commission members, to identify performance gaps.
By comparing importance ratings against knowledge gaps and expert against novice perspectives, I identified critical deficits in both architectural vocabulary and the application of preservation concepts in real decision-making contexts.
A learner and contextual analysis revealed key barriers—time, scheduling, and access—which informed the decision to design a flexible, asynchronous, scenario-based experience.
These findings directly shaped performance-based objectives and assessments, ensuring learners can apply concepts in realistic commission scenarios.



DESIGN
I translated analysis findings into a performance-based design focused on moving learners from recognizing terminology to applying it in real preservation decisions.
The course follows an iterative concept → application sequence: learners first learn terms, then connect them to broader concepts, and finally apply them to realistic scenarios (e.g., evaluating infill in context). I incorporated SME input to ensure alignment with real-world practice.
Instruction combines multimedia, guided image analysis, and scenario-based practice to build conceptual understanding and decision-making skills.
UDL-informed supports (e.g., optional flashcards, consistent interaction patterns) ensure accessibility and reduce extraneous cognitive load while reinforcing learning.


DEVELOPMENT
The course is being developed in Articulate Storyline. We preferred it over Rise after testing interaction needs and determining that image-based learning and spatial interaction required greater flexibility.
The course translates design into practice through videos and interactive, image-centered experiences, including layered graphics, guided analysis, and scenario-based questions that mirror real commission tasks.
I am collaborating closely with the SME to develop authentic content, visuals, and scripts, aligning real-world examples with instructional best practices and cognitive load considerations.
To support accessibility and usability, I am implementing clear interaction patterns, multimodal content (video, text, visuals), and scaffolded feedback, ensuring learners can engage with and apply complex concepts effectively.
Currently, this course is in its development stage. I am conducting formative evaluations and revising the course based on feedback from colleagues, NAPC staff, and learners at different development points.




IMPLEMENTATION
While we are not yet at the implementation phase, I have developed a plan that I will hone with NAPC staff to ensure a smooth and efficient implementation process.
I will implement the course in partnership with NAPC, preparing learners through clear onboarding, access instructions, and technical guidance to ensure smooth entry into the self-paced experience.
We will support rollout by confirming technology requirements, access to materials, and alignment with learner context, while coordinating with stakeholders to ensure the course is integrated effectively into existing training structures.
Initial implementation will include a limited launch with a formative evaluation that gathers learner feedback and performance data to refine interactions, instructions, and assessments before a full launch.


EVALUATION
I am implementing a multi-layered evaluation strategy integrated throughout development and planned implementation, including pre-/post-assessment, formative testing, and long-term follow-up.
During development, I am conducting iterative formative evaluations with stakeholders, SMEs, and a pilot group of approximately 20 learners to assess content accuracy, usability, cognitive load, and learning effectiveness, refining the course based on feedback at each stage.
I am comparing pre-assessment and summative assessment results during pilot testing to evaluate learning gains and identify areas where instruction may need adjustment.
I am aligning assessment directly to learning objectives through scenario-based summative evaluation, requiring learners to both identify and apply concepts with defined performance thresholds.
I am also incorporating post-course follow-up (Kirkpatrick Level 3) to evaluate transfer of learning into commission work and inform ongoing course improvement.


