By Brant Wilkerson-New
October 21, 2025
Key Takeaways
- Dick and Carey is a systems approach that ties goals, objectives, assessments, and materials into one coherent model.
- A strong instructional analysis leads to precise performance objectives and better assessments.
- The Carey model builds quality through formative evaluation before wide release and summative checks after launch.
- Align instructional strategy, learning activities, and feedback directly to objectives for better transfer.
- Use data from tryouts to revise quickly, keeping the learner at the center of the process.
- The model fits high‑stakes training, certification programs, and any system that demands measurable results.
Many teams look for a reliable way to turn ambition into results. That is where a clear, validated structure helps. The Dick and Carey approach gives you a repeatable way to move from goals to measurable changes in learner performance, without guesswork. It is practical, testable, and friendly to both new and seasoned designers.
Below you will find a plain‑spoken guide that keeps the theory intact while giving you moves you can put to work right away.
The Dick and Carey system has shaped how professionals build training for decades. It remains popular with instructional designers because it treats instruction as a set of connected parts that can be planned, tested, and improved. While trends come and go, this model stays useful because it is grounded in clear analysis, explicit objectives, and strong evaluation.
What Is the Dick and Carey Model?
The Dick and Carey model, sometimes called the Systems Approach Model for the Design of Instruction, was first introduced by Walter Dick and Lou Carey. You will also see it shortened in conversation to the Carey model, or simply Dick and Carey. It is a systems approach that threads together analysis, design, development, implementation, and evaluation into one coherent process.
- Walter Dick and Lou Carey built the approach to help designers plan instruction that truly changes performance.
- Many teams use the phrase Carey model as a convenient shorthand, though the full lineage credits both Walter and Lou.
- At its core, Dick and Carey is a design model that treats instruction like an engineered system whose parts must work in sync.
What sets the Dick and Carey model apart is the disciplined link it creates between instructional goals, performance objectives, assessments, and instructional materials. Every decision is tied to evidence, and each component supports the next. The result is instruction that targets specific skills and knowledge, backed by criteria for success.
You will sometimes see the name written as the systems approach model or approach model. Both point to the same concept: the systematic design of instruction grounded in analysis, objectives, and evaluation. The model assumes that learners, materials, activities, and assessments are interdependent, and that quality improves through continuous feedback.
A quick note for context: Walter Dick and Lou Carey published multiple editions, refining terms along the way. You might also hear the names Walter and Lou mentioned independently in academic discussion, but it is still the same Dick and Carey model people use in practice.
When to Use the Dick and Carey Model
Use the Dick and Carey model when clarity, consistency, and accountability are nonnegotiable. It shines when you must document how goals translate into results, and when different stakeholders need to see the logic behind each instructional choice.
Good fits include:
- Large or complex training where multiple units must integrate into one system
- Compliance, safety, and certification programs with strict performance objectives
- Product, process, or software rollouts where assessments need to prove mastery
- K‑12 and higher education courses that require criterion‑referenced tests
- Workforce upskilling programs that must show improvement in performance metrics
It also helps when the team wants to reduce risk. Because the Dick and Carey model builds in formative evaluation, you can test, gather feedback, and revise before broad release. You can identify the right scope, verify learner needs, and fix weak spots early.
If your project is tiny or exploratory, lightweight models may suit better. But when stakes are high and the design process must hold up to scrutiny, Dick and Carey is a strong guide.
9 Steps of the Dick and Carey Model
Below are the nine steps framed for practical use. Some editions list ten components; here, revision is treated as continuous within formative evaluation, which keeps the count to nine while preserving intent.
1. Identify instructional goals
Start by describing what learners should be able to do at the end of instruction. These instructional goals define the destination. Aim for performance verbs tied to observable outcomes. Think in terms of real tasks in the workplace, classroom, or program.
- Ask stakeholders to name the tasks, not the topics.
- Verify that goals reflect job, academic, or compliance needs.
- Keep the focus on transfer of learning to performance.
Clear goals keep the later steps aligned, especially when time pressure tempts shortcuts.
2. Conduct instructional analysis
Break down each goal into the skills and knowledge learners must have to reach it. The instructional analysis maps the steps, decisions, and subordinate skills that make the goal reachable.
- Analyze tasks from the top level to component parts.
- Identify prerequisite knowledge and supporting concepts.
- Document conditions, tools, and criteria for correct performance.
A strong instructional analysis is the backbone of the system. It sets up objectives, assessments, and materials to work together.
3. Identify entry behaviors and learner characteristics
Describe what your learner already knows and can do. Look at prior knowledge, current skills, motivations, constraints, and context. When you match instruction to the learner, everything lands better.
- Capture demographics, technology access, and reading level.
- Interview learners to get first‑hand information about day‑to‑day use cases.
- Note environmental limits like time, devices, or supervision.
This step keeps the design based on reality rather than assumptions. It also helps you select appropriate activities and tools.
4. Write performance objectives
Translate your analysis into clear, testable performance objectives. Each objective states the behavior, conditions, and criteria. These guide content, activities, and assessments so everything lines up.
Example structure:
- Behavior: what the learner will do
- Conditions: the tools or context provided
- Criteria: how well the learner must perform
Make objectives specific enough to drive design decisions while remaining readable. Keep an eye on the right level of detail, then verify that each objective supports an instructional goal.
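For teams that track objectives in a spreadsheet or database, the three-part structure can be modeled directly. The sketch below is illustrative only; the class name, field names, and the sample objective are invented for this example and are not part of the Dick and Carey model itself:

```python
from dataclasses import dataclass


@dataclass
class PerformanceObjective:
    """One objective in the Dick and Carey sense:
    behavior + conditions + criteria, traced back to a goal."""
    goal: str        # the instructional goal this objective supports
    behavior: str    # observable action the learner will perform
    conditions: str  # tools or context provided during performance
    criteria: str    # standard the performance must meet

    def is_testable(self) -> bool:
        # A usable objective states all three parts explicitly.
        return all(part.strip() for part in (self.behavior, self.conditions, self.criteria))


objective = PerformanceObjective(
    goal="Process a customer refund",
    behavior="complete a refund transaction",
    conditions="given the point-of-sale system and a sample receipt",
    criteria="in under 3 minutes with zero data-entry errors",
)
print(objective.is_testable())  # True
```

Writing objectives in a structured form like this makes the review checklist in step 4 easy to automate: any objective missing a behavior, condition, or criterion is flagged before it reaches assessment design.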
5. Develop and select assessment instruments
Build assessments that match the objectives. In Dick and Carey, assessments are criterion referenced, which means they check whether each learner meets the standard you defined.
- Plan items that mirror the job or academic demand.
- Use a mix of selected response, performance tasks, and practical demonstrations where appropriate.
- Include both formal assessments and ongoing checks for learning.
Many teams say "develop and select assessment instruments" to reflect the reality that some items can be adapted from existing banks while others must be created fresh. Measure what matters, nothing more, nothing less.
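The defining trait of criterion-referenced assessment is that each learner is judged against the fixed standard from the objective, not ranked against the cohort. A minimal sketch of that idea, with an invented 80% cutoff and made-up score data:

```python
# Criterion-referenced scoring: every learner is compared to the same
# fixed standard, never curved against other learners.
CUTOFF = 0.80  # mastery threshold; an assumption for this example

scores = {"learner_a": 0.92, "learner_b": 0.75, "learner_c": 0.88}

# Each learner either meets the criterion or does not, independent
# of how everyone else performed.
results = {name: score >= CUTOFF for name, score in scores.items()}
print(results)  # {'learner_a': True, 'learner_b': False, 'learner_c': True}
```

Contrast this with norm-referenced grading, where learner_b might still "pass" simply by outscoring enough peers; in Dick and Carey, the criterion in the objective is the only bar that matters.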
6. Develop instructional strategy
Decide how learners will reach each objective. Your instructional strategy ties content sequence, learning activities, practice, and feedback to the objectives and assessments.
Include:
- Content organization based on your instructional analysis
- Activities that simulate real performance
- Practice with immediate, task‑focused feedback
- Supports that reduce cognitive load while building independence
This is where you plan the experience. Keep it evidence based. If an activity does not help a learner hit the objective, drop it.
7. Develop and select instructional materials
Turn the strategy into instructional materials. That might include facilitator guides, eLearning modules, slide decks, job aids, simulations, labs, and performance checklists.
- Create new assets where unique context matters.
- Select instructional materials from approved libraries where fit is solid.
- Document how each piece supports objectives and assessments.
Many teams maintain a design system so designers can develop instructional content quickly and maintain quality. Remember to select tools that match learner constraints and the system in which the training will live.
8. Design and conduct formative evaluation of instruction
Before full rollout, test the instruction with representative learners. Formative evaluation helps you find points of friction and misunderstanding early. The tradition in Dick and Carey includes one‑to‑one, small group, and field trials.
- Conduct instructional tryouts with real tasks and authentic materials.
- Gather data on errors, time on task, and learner comments.
- Iterate content, activities, and assessments based on evidence.
This is where you conduct formative evaluation, not to grade learners, but to strengthen the materials. Formative feedback improves clarity and reduces surprises at scale.
9. Conduct summative evaluation
Summative evaluation checks effectiveness at the program or course level after implementation. It looks at learner performance across the cohort, transfer on the job, and impact on business or academic metrics.
- Collect metrics aligned to the original goals.
- Compare performance before and after the instruction.
- Make recommendations for the next cycle of development.
Many teams run both summative and formative cycles. Summative evaluation gives stakeholders a clean look at results while formative evaluation keeps improvement moving during development.
Step‑to‑output map
A simple reference can help keep your team aligned during the design process.
| Step | Primary outputs | Typical tools |
| --- | --- | --- |
| Goals | Goal statements, scope notes | Stakeholder interviews, task inventories |
| Instructional analysis | Task map, prerequisites list | Process mapping tools, SMEs |
| Learner profile | Entry skills, constraints | Surveys, focus groups |
| Performance objectives | Objective bank with criteria | Objective templates, review checklists |
| Assessments | Items, rubrics, practical tests | Item banks, rubric builders |
| Strategy | Sequence, activities, feedback plan | Storyboards, design docs |
| Materials | Draft modules, guides, job aids | Authoring tools, style guides |
| Formative evaluation | Pilot results, revision log | Analytics, observation protocols |
| Summative evaluation | Impact report, recommendations | Dashboards, interviews |
The table is not a rulebook, but it gives designers a quick guide for who is doing what and when.
Making the Model Work on Real Projects
Real projects are messy. The reason the Dick and Carey model remains popular is that it holds up under pressure. Here are practical habits that keep the system productive.
- Keep scopes tight. One goal, one flow. Avoid overpacking objectives.
- Write to performance. If an objective cannot be observed, rework it.
- Test early with small samples. Let formative results drive changes.
- Prioritize activities that look and feel like the real task.
- Trim content that does not support the objectives.
The more complex the system, the more this approach helps. Decision logs, review checkpoints, and simple templates prevent drift and keep stakeholders aligned.
Common Pitfalls and How to Avoid Them
Even good models can be misused. Watch for these patterns:
- Weak objectives. If criteria are vague, assessments will be soft.
- Assessments that do not match objectives. Real tasks require performance checks, not only quizzes.
- Skipping learner analysis. Without it, materials miss the mark.
- Treating formative as optional. You save time later by testing now.
- Overloading materials. More slides do not equal more learning.
A small shift in process can correct most issues. Start by revisiting the original instructional goals and the instructional analysis. Then tighten performance objectives and update related assessments.
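That alignment pass can itself be partially automated when objectives and assessment items carry IDs. The toy audit below flags items that do not trace to any objective, and objectives that no item tests; all IDs are invented for illustration:

```python
# A toy alignment audit for the most common pitfall above:
# assessments that drift away from the objectives they claim to test.
objectives = {"OBJ-1", "OBJ-2", "OBJ-3"}

# Each assessment item records which objective it claims to measure.
items = {"Q1": "OBJ-1", "Q2": "OBJ-1", "Q3": "OBJ-4"}

# Items pointing at a nonexistent (or retired) objective.
orphan_items = {item for item, obj in items.items() if obj not in objectives}

# Objectives with no assessment item at all.
untested_objectives = objectives - set(items.values())

print(sorted(orphan_items))         # ['Q3']
print(sorted(untested_objectives))  # ['OBJ-2', 'OBJ-3']
```

Running a check like this at each review checkpoint catches misalignment before formative evaluation, when it is still cheap to fix.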
How Dick and Carey Relates to Other Models
Instructional design has many models. ADDIE, rapid prototyping, and agile‑based frameworks all offer value. The Dick and Carey system pairs well with these because it clarifies what must be true regardless of delivery speed.
- Use Dick and Carey to define goals, objectives, assessments, and materials.
- Use agile sprints to iterate assets quickly during development.
- Use analytics from your LMS to support both formative and summative cycles.
Many designers blend models while keeping the systems approach at the core. The Carey model remains the structure that keeps parts aligned.
Practical Tips for Assessment and Feedback
Because assessments sit at the center of Dick and Carey, treat them as design assets, not afterthoughts.
- Write items directly from objectives, using the same conditions and criteria.
- Add realistic constraints, including time, tools, and data the learner would have on the job.
- Provide targeted feedback that points to the exact step or concept that needs attention.
Remember, assessments include performance tasks and job‑like scenarios. This strengthens transfer of learning and builds confidence.
A Quick Note on Origin
It helps to honor the names behind the work. Walter Dick and Lou Carey advanced a systems approach that brought rigor to the design of instruction. Their efforts gave instructional designers a repeatable path from needs to results. The model is still taught widely because it works.
Where the Carey Model Shines in Measurement
Leaders often ask for proof. Dick and Carey provides it through aligned objectives, matched assessments, and built‑in evaluation. You can show exactly how a course supports a program metric, and how each item checks a skill.
- Tie each objective to a performance indicator the organization already tracks.
- Use summative reports to show aggregate gains and gaps.
- Feed results back into development so the next release hits even harder.
This is a systems approach at work. It is transparent, data based, and friendly to continuous improvement.
Examples of Effective Use
- Safety certification program: The team defined clear performance objectives, built scenario‑based assessments, ran formative rounds, then used summative evaluation to confirm a drop in incidents.
- Software onboarding: Designers aligned objectives to critical workflows, created short practice activities with immediate feedback, and showed improved time to proficiency.
- K‑12 science unit: The instructional strategy emphasized inquiry with hands‑on activities matched to objectives, and assessments captured both concepts and skills.
These examples show how the approach model adapts across contexts while keeping the logic intact.
Terms to Use with Stakeholders
Sometimes the fastest way to bring people along is to use clear words.
- Instructional goals: the big outcomes we promise
- Performance objectives: the specific behaviors we will measure
- Assessments: the tools and tasks we use to check if criteria are met
- Instructional materials: what we create to teach and practice
- Formative evaluation: early testing to improve the design
- Summative evaluation: after‑launch review to confirm impact
Keep this language in your kickoff decks and status updates. It sets the frame for thoughtful decisions.
Frequently Asked Questions
How detailed should objectives be?
- Enough to drive design choices and assessments. If two designers would build different tests from the same objective, it needs tightening.
Do we need all steps on small projects?
- Yes, but scale them. A short interview can stand in for a full learner study. A quick pilot can still count as formative evaluation.
Can we reuse materials across courses?
- Yes, when objectives match. Select instructional materials carefully to ensure fit with the new learner profile and criteria.
How do we handle fast changes?
- Keep the core process and shorten cycles. Rapid tools help, but do not skip alignment between objectives and assessments.
Ready to turn your training goals into measurable results? Explore our instructional design services and see how our experts apply proven models like Dick and Carey to build impactful learning experiences.
About the Author
I’m a storyteller!
Exactly how I’ve told stories has changed through the years, going from writing college basketball analysis in the pages of a newspaper to now, telling the stories of the people of TimelyText. Nowadays, that means helping a talented technical writer land a new gig by laying out their skills, or even a quick blog post about a neat project one of our instructional designers is finishing in pharma.