
New Directors' Orientation Tutorial

The New Directors' Orientation Tutorial is made up of 14 self-paced modules that cover basic program requirements. Each module stands on its own, and there is no set sequence for completing the tutorial; Directors are encouraged to jump to the module that best fits their needs. Select a module below to view or download the corresponding materials.

Module 8. Program Planning – Migrant Education Program Evaluation

Section 1: Getting Started

Getting Started
In This Section
Tutorial Objectives
How to Use the Tutorial
Icons to Guide You
Key Readings and Resources
3
Tutorial Objectives
Module 8 will enable state directors to
1. understand the legislative and regulatory requirements for the
Migrant Education Program (MEP) Evaluation,
2. understand the role of program evaluation as part of a continuous
improvement cycle,
3. develop a program evaluation plan that aligns with Measurable
Program Outcomes (MPOs) and state performance targets/Annual
Measurable Objectives (AMOs),
4. ensure that Priority for Services students (PFS) are featured,
5. develop implications for program improvement from evaluation
results, and
6. develop an action plan for the next MEP Evaluation.
4
How to Use the Tutorial
For optimal benefit from the tutorial, you should
allow sufficient time to read the slides, reflect on the information, and
complete all activities on the slides or on the Quick Reference and
Reflection Sheets (QRRS) that can be downloaded as worksheets;
read each slide as well as the information referenced in the slides;
engage with the “What Do You Think?” slides to facilitate interaction
with the information (Answers will be provided directly following each
of these slides.);
5
How to Use the Tutorial
For optimal benefit from the tutorial, you should (continued)
pause to reflect on your state program at the “Check-in” slides
(A QRRS document will typically accompany these.);
complete the “Pop Quiz!” slides to reinforce key concepts;
review your state’s MEP documents and reports as directed;
develop an action plan using the worksheets provided;
add actionable items to your MEP planning calendar (See QRRS 14.2);
and
contact your OME Program Officer for follow-up questions.
6
Icons to Guide You
The following icons will guide you in making the best use of this tutorial:
What Do You Think?
Check-in
Pop Quiz!
Quick Reference and Reflection Sheet (QRRS)
Action Planning
Calendar Item
7
Key Readings and Resources
You should have these documents readily available while completing
the module, as the module will refer to these documents for more
complete information on various topics.
MEP Guidance on the Education of Migratory Children under Title I,
Part C of the Elementary and Secondary Education Act of 1965,
Chapter VIII
Migrant Education Program Evaluation Toolkit developed by the Office
of Migrant Education (OME)
Your state's MEP Service Delivery Plan (which should include the
program evaluation plan)
The most recent state MEP Evaluation Report
8

Section 2: What is Required

What is Required
In This Section
Basic Requirements
What is Measured
Involvement of Local Operating
Agencies (LOAs) in the Evaluation
Connection to Program Improvement
9
Basic Requirements
States must plan, implement, and evaluate programs and projects that
ensure that the state and local operating agencies (LOAs) address the
special educational needs of migratory children, including preschool
migratory children.
Section 1304(b)(1) of the ESEA, as amended
10
Basic Requirements
The state must develop and update a written comprehensive state plan
that, at a minimum, has the following components:
1. Performance targets,
2. Needs assessment,
3. Measurable program outcomes,
4. Service delivery, and
5. Evaluation.
34 CFR § 200.83
11
Basic Requirements
Each state education agency (SEA) must determine the effectiveness of
the MEP through a written evaluation of the program that measures the
implementation and results of the program against the State's
performance targets, particularly for those students who have priority
for services (PFS).
34 CFR § 200.84
12
Basic Requirements
PFS children are those:
1. Who are failing, or most at risk of failing, to meet the state's
challenging state academic content and student achievement
standards; and
2. Whose education has been interrupted during the regular school
year.
Section 1304(d) of the ESEA
13
What is Measured
SEAs are required to evaluate the effectiveness of the MEP in terms of:
1. Program implementation and
2. Program results.
34 CFR § 200.84
14
What is Measured
Implementation and results of the MEP are measured against
performance targets/annual measurable objectives (AMOs) that the
state has established
For all children in reading and mathematics achievement, high school
graduation, the number of school dropouts, and school readiness (if
any), and
Any other performance targets that the state has identified for migrant
children.
34 CFR § 200.84
15
What is Measured
Thus, implementation and results of the MEP should be measured
against the:
Measurable program outcomes (MPOs) the state has established for
the MEP as part of its comprehensive Service Delivery Plan (SDP).
Those MPOs are what the state MEP will produce to meet the needs of
migratory children and help them achieve the state’s performance
targets/AMOs.
34 CFR § 200.83(a)(3)
16
Involvement of Local Operating Agencies in the
Evaluation
In evaluating the program, the SEA will need LOAs to collect data on
migratory children who receive services from the program
(implementation component) and the outcomes of these services
(using the program’s measurable outcomes, as specified in the SDP).
Evaluations of the program at the local level will also assist the state
in its subgranting process (see Module 4).
17
Involvement of Local Operating Agencies in the
Evaluation
The SEA must ensure that the LOA conducts the local evaluation
properly, and
The SEA should inform its LOAs in advance of any specific data that it
will need to evaluate the statewide program, and how the LOAs should
collect the data.
18
Connection to Program Improvement
SEAs and LOAs must use the results of the MEP evaluation to improve
services for migrant children.
34 CFR § 200.84 and 200.85
19
Pop Quiz!
Instructions: Note True or False for each of the following statements
related to the requirements for the MEP evaluation.
1. The MEP evaluation collects and analyzes state-level data only.
2. The focus of the evaluation is on program results, not implementation.
3. The MEP evaluates instructional services and educational support services.
4. The MEP evaluation should include the evaluation of services for preschool migratory children.
20
Pop Quiz! - Response
Number 1 is FALSE. The MEP evaluation includes data from both the
local and state levels.
Number 2 is FALSE. The evaluation focuses on the results and
implementation of the MEP.
Number 3 is TRUE. The MEP evaluates both instructional services and
educational support services that enable migrant children to
participate effectively in school.
Number 4 is TRUE. MEP programs and projects must address the
special educational needs of migratory children, including preschool
migratory children.
21

Section 3: Performance Results to be Included in the Evaluation

Performance Results to be Included in the Evaluation
In This Section
Performance Goals, Indicators,
and Targets
Government Performance and
Results Act (GPRA) Measures
22
Performance Goals, Indicators, and Targets
Familiarity with the following key terms will assist you in understanding
MEP evaluation requirements for reporting performance results:
Performance goals,
Performance indicators,
Performance targets, and
Measurable program outcomes (MPOs).
See QRRS 8.1 Reviewing Key Terms for Performance Assessment
23
Performance Goals, Indicators, and Targets
For purposes of program design and evaluation, the MEP will focus on
ESEA Performance Goals 1 and 5, along with the indicators for each
goal.
MEP Guidance, Chapter VIII, B3
24
State Performance Goals, Indicators, and Targets
Performance Goals and Performance Indicators that the State's MEP Evaluation Must Address
Performance Goal 1: By 2013-2014, all students will reach high standards, at a minimum attaining proficiency or better, in reading/language arts and math.
1.1 Performance Indicator: The percentage of students, in the aggregate and for each subgroup, who are at or above the proficient level in reading/language arts on the state's assessment.
1.2 Performance Indicator: The percentage of students, in the aggregate and in each subgroup, who are at or above the proficient level in math on the state's assessment.
Performance Goal 5: All students will graduate from high school.
5.1 Performance Indicator: The percentage of students who graduate from high school each year with a regular diploma.
5.2 Performance Indicator: The percentage of students who drop out of school.
25
Performance Goals, Indicators, and Targets
For Performance Goals 1 and 5, the U.S. Department of Education
required all SEAs to submit performance targets in their consolidated
state applications for each performance indicator and baseline data for
the targets.
States are not required to resubmit their performance targets during
the period for which the ESEA, as currently enacted, is authorized unless
the state makes a significant change in one or more of them. (See
Module 3.)
o Several states have revised their performance targets via
application for a waiver of certain ESEA requirements (also known
as “ESEA Flexibility”). These revised targets are also referred to as
revised Annual Measurable Objectives (AMOs).
26
Check-in
Review your most recent MEP Evaluation report and respond to the
following questions:
To what extent are migrant students reaching the state's performance
targets under ESEA Goals 1 and 5, or any other state performance
targets included in the MEP evaluation?
Does the evaluation show the performance of priority for services
(PFS) students relative to these targets?
See QRRS 8.2 State Performance Targets for ESEA Goals 1 and 5
27
Government Performance and Results Act
(GPRA) Measures
In compliance with the Government Performance and Results Act
(GPRA), ED/OME has adopted four GPRA measures for monitoring
progress and maintaining accountability for the MEP on a national level.
The GPRA measures present a national picture of the program, to
which each state contributes.
Therefore, GPRA measures should be part of the MEP evaluation.
28
Government Performance and Results Act
(GPRA) Measures
GPRA Measures:
1. Percentage of MEP students who scored at or above proficient on their state's annual reading/language arts assessments in grades 3-8 and high school.
2. Percentage of MEP students who scored at or above proficient on their state's annual mathematics assessments in grades 3-8 and high school.
3. Percentage of MEP students who were enrolled in grades 7-12 and graduated or were promoted to the next grade level.
4. Percentage of MEP students who entered 11th grade who had received full credit for Algebra I or a higher mathematics course.
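Each GPRA measure is a simple percentage computed from student-level records. As a rough illustration only (the file name and column names below are assumptions, not an OME or CSPR specification), a state's computation of GPRA Measure 1 might look like this:

```python
import csv

# Hypothetical student-level extract; file name and column names are
# illustrative assumptions, not an OME or CSPR specification.
TESTED_GRADES = {"3", "4", "5", "6", "7", "8", "HS"}

def gpra_measure_1(path):
    """Percent of MEP students at or above proficient on the state
    reading/language arts assessment in grades 3-8 and high school."""
    tested = proficient = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["grade"] not in TESTED_GRADES or row["tested_reading"] != "Y":
                continue  # count only students who took the assessment
            tested += 1
            if row["proficient_reading"] == "Y":
                proficient += 1
    return 100.0 * proficient / tested if tested else 0.0

# Example use: print(f"GPRA Measure 1: {gpra_measure_1('mep_students.csv'):.1f}%")
```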
29

Section 4: Program Evaluation as Part of the Continuous Improvement Cycle

Program Evaluation as Part of the Continuous Program Improvement Cycle
In This Section
Continuous Improvement Cycle
Making Connections in the Planning
Process
30
Continuous Improvement Cycle
[Diagram: the continuous cycle of needs assessment, service delivery planning, implementation, and evaluation]
31
Making Connections in the Planning Process
Program planning involves a continuous cycle of needs assessment,
planning services, implementation, and evaluation.
The MEP evaluation determines the degree to which the services
identified in the Service Delivery Plan (SDP) (1) are implemented as
planned, (2) truly meet the needs identified in the needs assessment,
and (3) result in improved performance of migrant students as
measured against the state’s performance targets/annual
measurable objectives (AMOs) and the GPRA measures.
The evaluation will inform updates of the needs assessment and
changes in service delivery to improve the state MEP and services it
provides to migrant children.
32

Section 5: Setting the Context for the Migrant Education Program Evaluation

Setting the Context for the MEP Evaluation
In This Section
Purpose of the MEP Evaluation
Evaluation for Implementation and
Results
Responsibilities of Local Operating
Agencies for the Migrant Education
Program Evaluation
Frequency of Evaluation
33
Purpose of the Migrant Education Program
Evaluation
Program evaluation involves systematically collecting information about
a program, or some aspect of a program, in order to determine the
effectiveness of the program and to improve it.
MEP Guidance, Chapter VIII, A1
34
Purpose of the Migrant Education Program
Evaluation
The MEP evaluation allows SEAs and LOAs to:
1. Determine whether the program is effective and document impact on
migrant children,
2. Improve program planning,
3. Determine the degree to which services are implemented as planned
and identify implementation problems, and
4. Identify areas in which migrant children may need different MEP
services.
MEP Guidance, Chapter VIII
35
Evaluation for Implementation and Results
States should develop methods of disaggregating state assessment
data and data on measurable outcomes in order to determine the
impact of the MEP on all migrant children and in particular those who
have a priority for services (PFS).
MEP Guidance, Chapter VII, C8
Data Sets:
Data on all migrant students
Data on priority for services (PFS) students
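One straightforward way to produce both data sets is to carry a PFS flag on each student record and report every outcome twice: once for all migrant students and once for the PFS subgroup. A minimal sketch, assuming simple record dictionaries with illustrative field names:

```python
# Illustrative records; the field names and values are assumptions for this sketch.
records = [
    {"student_id": "A1", "pfs": True,  "proficient_reading": True},
    {"student_id": "A2", "pfs": False, "proficient_reading": True},
    {"student_id": "A3", "pfs": True,  "proficient_reading": False},
]

def percent_proficient(rows):
    """Percent of the given records scoring at or above proficient in reading."""
    return 100.0 * sum(r["proficient_reading"] for r in rows) / len(rows) if rows else 0.0

all_migrant = percent_proficient(records)
pfs_only = percent_proficient([r for r in records if r["pfs"]])
print(f"All migrant students: {all_migrant:.1f}% | PFS students: {pfs_only:.1f}%")
```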
36
Responsibilities of Local Operating Agencies for the Migrant Education Program Evaluation
So that its evaluation of the MEP is statewide, each SEA must ensure
that its LOAs, as needed, conduct a local project evaluation that
measures the implementation of the project and student performance
against measurable outcomes (which are aligned to the state MEP’s
measurable outcomes, as specified in the service delivery plan).
MEP Guidance, Chapter VIII, C3
37
Frequency of Evaluation
Evaluation of Overall Implementation of the State MEP
An SEA should conduct an evaluation of the implementation of MEP
services on a 2-3 year cycle to determine whether any improvements
are needed.
Evaluation of Program Results
SEAs and LOAs should evaluate the results of the program (e.g., the
degree to which the program has met the measurable state and local
outcomes) on an annual basis.
MEP Guidance, Chapter VIII, C5
38
What Do You Think?
Why do you think there is a difference in the recommended time frame
for evaluating program implementation and program results?
39
What Do You Think? - Reflection
Did your response address the following points?
Overall Program Implementation Evaluation – 2-3 year cycle
o Implementation of new programs or strategies usually takes two to
three years to get staff up to speed, adapt to the context, and
make mid-course adjustments for full implementation.
Evaluation of Program Results annually
o A results-based evaluation is necessary for monitoring student
progress toward established goals; information that is collected
annually will help determine if academic progress is occurring and
will lead to exploration of interventions to foster progress in the
cycle of continuous program improvement.
40

Section 6: The Program Evaluation Plan

The Program
Evaluation Plan
In This Section
Measurable Program Outcomes
(MPOs)
Evaluation Questions
The Data Collection Plan
The Data Collection Task and
Timeline
The Evaluation Matrix
41
Measurable Program Outcomes (MPOs)
The MEP evaluation builds on the MPOs
developed in the process of developing the
SDP. (See Module 7.)
MPOs are the backbone of the SDP and MEP
evaluation.
42
Measurable Program Outcomes
Strong MPOs define:
What services will be provided,
What is expected to happen as a result of the MEP services,
Which students will directly benefit from the services, and
Time frame for the service.
43
Pop Quiz!
To what extent do you think the following is a strong MPO?
Migrant children in grades 3-5, whose parents attend two reading and
homework support workshops during SY 2012-2013, will increase their
performance on the state reading assessment by at least 10%.
44
Pop Quiz! - Response
You probably noticed that the sample MPO included each of the
components of a strong MPO:
Migrant children in grades 3-5 [which students], whose parents attend
two reading and homework support workshops [what services will be
provided] during SY 2012-2013 [time frame], will increase their
performance on the state reading assessment by at least 10% [what
will happen as a result of the services].
45
Evaluation Questions
Strong MPOs lead to evaluation questions that can:
o Measure the effectiveness of the strategies included in the SDP
and
o Maintain alignment with state performance goals, indicators, and
targets.
A Program Alignment Chart is a useful tool to show the connection
between all parts of the service delivery plan, culminating in
evaluation questions that will show the effectiveness of the program.
See QRRS 8.3 Example of a Program Alignment Chart, also included in the
Service Delivery Plan Toolkit, Section D.5.
46
Evaluation Questions
Review the Alignment Chart (QRRS 8.3).
Identify the following:
1. The state performance target related to reading and language
proficiency,
2. The concern regarding the barrier to migrant students meeting the
state performance target,
3. Strategies selected to address this concern,
4. MPOs for the strategies, and
5. Evaluation questions.
47
Evaluation Questions
In reviewing the Alignment Chart in QRRS 8.3, note:
The evaluation questions directly relate to the MPOs; they are
designed to determine to what extent each MPO’s depiction of
program success was achieved.
The MPOs relate to the proposed strategies and are measurable ways
to determine if the strategies produced the desired results and/or
were implemented as described.
The strategies address the needs identified in the Need Statements,
which were developed based on data to support the concern about
barriers to migrant students attaining the state performance target in
reading/language arts.
48
What Do You Think?
The Alignment Chart in QRRS 8.3 contains several evaluation questions
for each of the MPOs.
What are three benefits of having several evaluation questions related
to each MPO?
1.
2.
3.
49
What Do You Think? - Reflection
Did your responses include any of the following reasons for multiple
evaluation questions for each MPO?
Can capture possible impacts on results, such as student attrition or
skill level upon entering the program.
Can capture possible impacts on implementation, such as teacher
level of experience, type of instruction, or frequency of supplemental
instruction.
Can triangulate results for greater validity.
Can raise questions for further examination when discrepancies in
data exist.
50
The Data Collection Plan
A well thought-out data collection plan will ensure that data collection is
efficient, cost-effective, and systematic.
It should specify what type of data is to be collected, from what
sources, by whom, and in what time frame.
It will ensure that the type of data collected is appropriate to measure
the effectiveness of each MPO and to demonstrate migrant student
performance relative to state performance targets.
51
The Data Collection Plan
Good questions to ask:
1. What existing data can be utilized?
o CSPR data (summary data)
o Data from migrant student databases (child-specific data)
2. What data collection strategies can be utilized across MPOs?
3. If funds and resources are limited, what are the data collection
priorities?
52
The Data Collection Plan
Possible responses:
Collect all required data (e.g., data related to state performance
targets and GPRA measures).
Collect data on the most critical needs that the MEP must address
and data that will show effectiveness in addressing these needs.
When possible, collect data that are the easiest to obtain (e.g.,
existing data from the CSPR and migrant student databases).
Conduct data collection activities that can address several MPOs (e.g.,
parent interviews, teacher survey).
53
The Data Collection Plan
Good questions to ask (continued)
4. How will the evaluator ensure that data are collected consistently at
the state and local levels?
Possible responses:
Provide training and technical assistance to LOAs on their
responsibilities for the MEP Evaluation.
Develop a common Evaluation and Data Collection Plan for all LOAs.
Include program evaluation as a condition for a subgrant award.
Include a monitoring indicator for LOAs on program evaluation.
54
Data Collection Task and Timeline
A Data Collection Task and Timeline is a tool that ensures that:
Data collection is planned
for each evaluation
question,
Tasks are identified,
Evaluator and staff
responsibilities are defined,
and
Deadlines are clear.
55
Sample Data Collection Plan
MPO: LOAs that conduct a school enrollment fair for migrant parents of kindergarten-aged students for SY 2013 will show a 25% increase in migrant student kindergarten enrollment from SY 2012.
Evaluation Questions:
1. How many parents participated in the school enrollment fair?
2. How effective did participating parents find the fair in terms of information provided, friendliness, and convenience?
3. What was the percent change in migrant student kindergarten enrollment from SY 2012 to SY 2013?
Data Sources: Attendance records and parent interviews; school enrollment records for SY 2012 and SY 2013.
Person Responsible and Deadline: Each data collection task is assigned to a responsible person with a deadline (the sample deadlines are August 1, October 1, December 1, and October 1).
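The third evaluation question in the sample plan reduces to a percent-change calculation against the MPO's 25% target. A minimal sketch (the enrollment counts are invented for illustration):

```python
def percent_change(earlier, later):
    """Percent change from an earlier count to a later count."""
    return 100.0 * (later - earlier) / earlier

# Invented counts for illustration only.
kindergarten_sy2012 = 80
kindergarten_sy2013 = 104

change = percent_change(kindergarten_sy2012, kindergarten_sy2013)  # 30.0
print(f"Change in migrant kindergarten enrollment: {change:.1f}% "
      f"(MPO target of 25% met: {change >= 25})")
```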
56
The Evaluation Matrix
An evaluation matrix is a useful tool for keeping track of the methods
you are considering for collecting data.
An evaluation matrix displays evaluation questions in alignment with
the data collection strategies that will be used to answer them.
The Program Evaluation Toolkit developed by OME (Section D.5)
provides examples of an Evaluation Matrix.
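Kept in its simplest form, an evaluation matrix is just a mapping from each evaluation question to the data collection strategies that will answer it. A minimal sketch (the questions and strategies are illustrative, not drawn from the Toolkit):

```python
# Illustrative evaluation matrix; the questions and strategies are examples only.
evaluation_matrix = {
    "To what extent did migrant students meet the MPO's reading target?":
        ["state assessment results"],
    "Were supplemental reading services implemented as described in the SDP?":
        ["LOA service records", "teacher survey"],
    "How useful did parents find the reading workshops?":
        ["parent interviews"],
}

for question, strategies in evaluation_matrix.items():
    print(question)
    print("  data collection strategies:", ", ".join(strategies))
```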
57

Section 7: Conducting the Evaluation

Conducting the
Evaluation
In This Section
Evaluation Expertise
Data Analysis
Reviewing the Data and Discussing
Implications
58
Evaluation Expertise
The MEP Evaluation should be conducted by someone with specific
expertise in evaluation. Sources for expertise may include:
Internal Resources
SEA program evaluation staff
External Resources
Evaluation resources at colleges and universities
Other evaluation consultants
59
Evaluation Expertise
Tips on working with your evaluator:
Become familiar with the evaluator’s background, and ensure it is a
match for the skills required for the MEP Evaluation.
Discuss expectations, responsibilities, deadlines, and accountability.
Involve the evaluator in the development of the CNA and SDP, if
possible, especially in the development of the evaluation questions.
60
Evaluation Expertise
Ensure that the evaluator knows the audience with whom the MEP
Evaluation will be shared so that he/she may develop the written
report appropriately.
Interact frequently with the evaluator so that you can determine if the
goal of measuring MEP effectiveness will be achieved and so that you
can make recommendations for changes during the evaluation
process if needed.
Provide OME resources and guidelines to the evaluator.
61
Data Analysis
Once the data have been collected, the evaluator will need to organize
the data in a way that enables you and the program planning team to:
Determine the extent to which migrant students are achieving the
state performance targets, especially PFS students;
Answer specific questions about the effectiveness and quality of the
MEP;
Identify program accomplishments; and
Identify areas for program improvement.
62
Data Analysis
The data should be presented in ways that are:
Visual,
Succinct,
Comprehensive,
Organized, and
Targeted toward the audience's level of knowledge of data analysis.
The Program Evaluation Toolkit developed by OME, Section E, provides
background information on analyzing and interpreting data.
63
Reviewing the Data and Discussing
Implications
Reviewing the data and discussing their implications are critical
activities that connect the evaluation with program improvement.
Once the data are analyzed and compiled for review, the program
planning team should be convened to discuss the data.
Adding data analysis to the Alignment Chart is a good way to review
the connection between the data analysis, MPOs, strategies, needs,
and state performance targets.
64
Reviewing the Data and Discussing
Implications
The evaluator should be part of the discussion to answer questions
and ensure the team is interpreting the data correctly.
The evaluation should show the correlation between MPO results and
state performance targets; if a high correlation does not exist, the
team should discuss how to ensure that services are resulting in
improved migrant student performance.
The team should develop a set of recommendations that will be
included in the written report and can be incorporated into the CNA
and SDP.
65
What Do You Think? - Scenario
Review the following scenario and data analysis, and determine
program implications.
In order to increase the number of migrant students who applied to
attend college, an LOA offered a four-hour Saturday morning
information workshop to a group of 55 10th-12th grade migrant
students in two high schools to help them learn about preparing for
college.
66
What Do You Think?
Data Analysis
Migrant students who attended the workshop were given a pre-
workshop survey asking if they intended to apply to college; 53% said
they did.
In a post-workshop survey, 82% said they intended to apply to college;
72% of the seniors who attended said they intended to apply to
college.
A sample of migrant students in the 10th-12th grade who did not
attend the workshop were given a survey asking if they intended to
apply to college; 18% said they intended to apply to college.
67
What Do You Think?
Data Analysis
Of all the migrant students invited to attend the workshop, only 34%
attended.
School graduation data for the year in which the workshop was
conducted showed that of the 19 migrant students who graduated
and had attended the workshop, only 10 (52%) applied to college.
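Before drawing implications, it helps to recompute figures like these from the raw counts and to compare the survey results in percentage points rather than as a percent increase. A minimal sketch using the counts and percentages reported in the scenario:

```python
# Raw count reported in the scenario: 10 of the 19 graduating attendees applied.
applied, graduating_attendees = 10, 19
application_rate = 100.0 * applied / graduating_attendees  # about 52.6%

# Survey results reported as percentages of respondents.
pre_intent, post_intent = 53, 82
point_change = post_intent - pre_intent  # 29 percentage points

print(f"Application rate among graduating attendees: {application_rate:.1f}%")
print(f"Pre- to post-workshop change in stated intent: {point_change} percentage points")
```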
68
What Do You Think?
Data Analysis
Interviews with a sample of the students who did not attend the
workshop indicated the following reasons.
Reason for Not Attending (n=46)
Can't afford to go to college: 92%
Don't have good grades: 87%
Don't like school: 41%
Feel like I don't belong in college: 85%
Had to work on the day of the workshop: 15%
Other: 20%
69
What Do You Think?
Implications
Based on the data analysis related to the workshop for high school
students, what are three implications for improving the college
application rate for migrant students?
1.
2.
3.
70
What Do You Think? - Reflection
Following are some observations on the data:
Based on the increased percentage of students who indicated their
intent to apply to college in the pre- and post-surveys, there appeared
to be increased interest in college attendance.
o Note: You can identify a correlation between the workshop
attendance and increased interest in college attendance, but you
cannot assume that the increased interest in college attendance
is a direct result of the workshop. Other factors could have
impacted the post-survey results.
71
What Do You Think? - Reflection
More observations on the data:
Only a third (34%) of the migrant students who were eligible to attend
the workshop actually attended.
A small percentage (18%) of the sample of the migrant students who
did not attend the workshop indicated that they planned to apply to
college.
The most significant reasons why students in the sample indicated
that they would not apply to college were lack of finances, poor
grades, and a sense they did not belong.
A large discrepancy exists between the percentage of seniors who
attended the workshop and indicated an intention to apply to college
(72%), and the percentage of seniors who attended the workshop and
actually applied to college (52%).
72
What Do You Think? - Reflection
Implications:
Based on the percentage of invited students who did not attend the
workshop, there needs to be a more targeted effort to increase
attendance at a workshop of this kind.
Based on reasons for non-attendance, the program needs to help
migrant students become more aware of financial aid options,
improve their grades, and develop confidence that they can attend
college.
The discrepancy between the data on the percentage of migrant
seniors who attended the workshop and who indicated an intent to
apply to college and the percentage of those who actually applied to
college should be explored further. Additional data could help to
determine the merit and content of a future workshop, and/or the
need for additional follow-up.
73
Reviewing the Data and Discussing
Implications
Considerations:
Implications derive directly from the data.
Be cautious not to attribute causality to correlational data.
Multiple data sources (state and local student data, surveys,
interviews) can provide greater insight on the effectiveness of a
program or strategy.
Implications can lead to recommending strategies to improve progress
toward achieving MPOs and student results.
74

Section 8: The Written Evaluation Report

The Written
Evaluation Report
In This Section
Components of the Written Report
Utilizing the Report for Program
Improvement
75
Components of the Written Report
The MEP Guidance recommends the following components of the
written report:
1. Purpose: Why did the SEA conduct the evaluation?
o What is required?
o Who is the audience?
o How is the audience expected to utilize the report?
MEP Guidance, Chapter VIII, D2
76
Components of the Written Report
2. Methodology: What evaluation process was used?
o What were the evaluation questions?
o What basic evaluation designs and data collection methods did
the SEA use?
o What data were collected?
o What instruments were used?
o How were instruments tested for reliability and validity and
relevance to the program?
o What was the timeframe?
o What methodological limitations existed?
77
Components of the Written Report
3. Results: What were the findings of the evaluation?
o Program Implementation: Did the agency implement the program
as described in the program plan or application? Were there any
discrepancies? What implementation barriers were encountered?
o Program Results: How did migrant students perform in relation to
the state performance targets? How did priority for services
students perform? How did the results compare to what might
have been expected? What special factors were considered (e.g.,
student attrition)?
78
Components of the Written Report
4. Implications: What did the SEA and LOAs learn from the evaluation
and how will what they learned be applied to improving the program?
o How effective is the state’s MEP in improving student outcomes?
o Are there projects or particular strategies that the SEA should
continue, expand, or discontinue?
o What changes in MEP project implementation should take place?
o What new services should be considered?
o When will the SDP be revised to reflect these changes?
79
Check-in
Review the current MEP Evaluation report for your state and consider
the following.
1. On the whole, does the report provide a clear picture of the
effectiveness of the MEP?
o Does it show whether and how migrant students, and priority for services
(PFS) students in particular, are progressing toward meeting the state
performance targets?
o Does it show the extent to which specific strategies in the service delivery
plan are working to address the needs identified in the needs
assessment?
80
Check-in
2. Does the report meet the requirements in the statute and
regulations? If not, why not?
3. Does it include the components recommended in the MEP
Guidance? If not, why not?
See QRRS 8.4 Reviewing the MEP Evaluation Report
81
Utilizing the Report for Program Improvement
A program evaluation report is only useful to the extent that it is read
and used to improve outcomes.
The evaluation report should be:
Visually appealing,
Appropriate for multiple audiences,
Specific in its implications for program improvement,
Well-organized,
Clearly written, and
Used.
82
Utilizing the Report for Program Improvement
Communicating the Report
Consider to whom the report should be sent and for what purpose.
Disseminate the report to stakeholders with a customized cover letter
that explains what a particular stakeholder should learn or do as a
result of reading the report.
LOAs are key stakeholders; consider conducting a training session or
webinar to review the report and discuss ways LOAs can improve their
services.
83

Section 9: Wrapping Up

Wrapping Up
In This Section
Key Points
Action Planning
Resources
84
Key Points
The MEP Evaluation:
1. Is part of the cycle of continuous program improvement,
2. Measures both implementation and results of the MEP,
3. Builds on MPOs that define MEP success,
4. Must include particular focus on migrant students who have priority
for services (PFS), and
5. Must be used for program improvement.
85
Action Planning
Consider the following questions:
1. When should you conduct the next MEP Evaluation? (When was the
last Evaluation conducted?)
2. Does the SDP include an evaluation plan?
o Does the SDP include strong MPOs?
o Does the SDP include evaluation questions that address both
implementation and results?
o Is the evaluation plan aligned with the needs, strategies, and
MPOs?
86
Action Planning
3. Who will serve as the MEP evaluator? How will this person be
involved in the program planning process?
4. What processes are in place to obtain data from LOAs?
5. How will you customize the planning process for the size of your
state and number of migrant students?
Add any actionable items to your MEP planning calendar.
See QRRS 8.5 Program Evaluation Action Planning
87
Resources for the Migrant Education Program Evaluation
MEP Guidance on Education of Migratory Children under Title I, Part C, of the Elementary and Secondary Education Act of 1965: Explanation of guidelines to implement the laws and regulations related to the MEP
Program Evaluation Toolkit developed by OME: Suggested step-by-step guide with tools and templates to develop the Program Evaluation
MEP Officers: List of OME contact information (https://results.ed.gov/about/contact)
Glossary of Terms: Alphabetical listing of key terms applicable to migrant education (https://results.ed.gov/idr-manual/section/glossary/glossary)
88