Massachusetts Adult Proficiency Tests (MAPT)

Narrative Summary: 

The Massachusetts Adult Proficiency Tests (MAPT) were developed to measure the educational gains of adult education students in Massachusetts. The MAPT covers two subject areas: (a) Mathematics and Numeracy and (b) Reading. These criterion-referenced tests are aligned with the Massachusetts Adult Basic Education (ABE) Curriculum Framework for Math and Numeracy and with the Reading Strand of the Massachusetts ABE Curriculum Framework for English Language Arts, and they are aligned to the National Reporting System's (NRS) Educational Functioning Levels (EFLs). A primary purpose of the MAPT is to measure gain across the NRS EFLs. The MAPT assessments are multistage computerized-adaptive tests: students begin the test at a specific EFL entry point but can be routed to easier or more difficult sets of items based on how well they perform during the assessment. Thus there are no specific test "forms" for each EFL; rather, all EFL assessments are calibrated onto a single MAPT scale in each subject area. By using adaptive testing technology, the MAPT measures gains across all EFLs on a common scale and adapts the assessment to each student's proficiency level. The technical details of this adaptive testing technology can be found in the MAPT Technical Manual (Sireci et al., 2008).

Intended Population: 

The MAPT for Math and the MAPT for Reading can be used for adults who are at Beginning Basic through High Adult Secondary Educational Functioning Levels. The tests are intended for adult basic education students in Massachusetts.

Purpose: 

The purpose of the MAPT is to measure ABE learners' knowledge and skills in math and reading so that their progress toward educational goals can be evaluated. The MAPT is designed to measure learners' educational gains for alignment to content standards and for state monitoring and accountability under the NRS. The MAPT should be used for diagnostic evaluation only in conjunction with other measures, such as classroom assessments. The MAPT is not designed for placement in instructional courses, but it may be used to help confirm current placement decisions for students.

Skills Measured-Test Content: 

Mathematics and numeracy; reading

Range of Skill Levels: 

Beginning Basic through High Adult Secondary Educational Functioning Levels

Number of Items per Test: 

40

Types of Items: 

Multiple-choice items

Development of Items: 

To maximize the degree to which the test items would reflect instructional practice, an explicit goal of test development was to train teachers to write items for the new tests. This strategy encouraged teachers to be part of the test development process so that they could take some “ownership” of the new tests.

A series of item-writing workshops was held across the state, along with a two-week, full-time test construction course at UMass Amherst in August. The workshops and the course informed teachers about the characteristics of quality educational assessments (e.g., content validity) and the appropriate uses of test scores.

Through these efforts, over 200 ABE teachers and administrators were trained to write items targeted to the test specifications (and to specific objectives within the curriculum frameworks). These activities led to the production of over 2,000 test items that made up the initial pool of potential MAPT items, and they helped ensure that the test items would reflect the instructional needs of ABE learners in Massachusetts as much as possible. As the item pool for each test was created, item review committees were assembled to review items for content quality (degree of match to the intended objective, technical accuracy, appropriateness for adult learners, clarity, etc.). The review committees also rated how well each item measured the objective to which it was matched. Items were additionally reviewed for psychometric quality by UMass staff. Items were edited, and subsets of items were then selected for pilot testing. During pilot testing, items were completed by both native English speakers and non-native English speakers enrolled in ABE classes. Following pilot testing, the statistical functioning of the items was evaluated and the items considered most appropriate for operational use were selected. These items then underwent sensitivity (item bias) review and content validity review.

The end results of the item development, revision, and review processes were pools of items considered most appropriate for assembling the operational multistage tests.

Item writing, review, and revision activities are ongoing, to ensure the content of the MAPT remains current. More information regarding item development and evaluation activities can be found in Sireci et al. (2008).

Size of Item Bank: 

The operational item banks for Math and for Reading are drawn from a pool of about 1,000 items, with 400 items operational at any one time. Items are continuously pilot-tested and added to the pool of potential items once each year.

Subtests: 

None

Alternate Test Forms: 

The tests are adaptive in that the software tailors each sequence of items to specific learners by tracking their performance as they respond to items. Test-takers are given different items each time they test, so no separate forms of the test are needed. There are 6 stages involved in the multistage test, with students being routed to easier or more difficult sets of items based on their performance on the previous stage (see Sireci et al., 2008 for technical details regarding the multistage adaptive test algorithm).
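The routing logic described above can be sketched in a few lines. This is a minimal, hypothetical illustration only: the cut score, module names, and two-branch rule are assumptions for clarity, not the operational MAPT algorithm (which is documented in Sireci et al., 2008).

```python
# Illustrative sketch of multistage routing. The cut score (0.0), the
# module labels, and the simple easy/hard branch are all hypothetical
# assumptions -- not the actual MAPT routing rules.

def route(stage: int, interim_theta: float) -> str:
    """Pick the next stage's module from the interim ability estimate."""
    # Assumed cut score: above 0.0 on the theta scale -> harder module.
    if interim_theta > 0.0:
        return f"stage-{stage + 1}-hard"
    return f"stage-{stage + 1}-easy"

# A learner performing well after stage 1 is routed to harder items:
print(route(1, 0.8))   # stage-2-hard
print(route(1, -0.5))  # stage-2-easy
```

In the real test this decision is made at the end of each of the six stages, using the examinee's interim IRT ability estimate rather than a raw score.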

Required / Recommended Administrator Training: 

Only staff who have attended a three-hour ACLS/UMass or SABES training may administer the MAPT, and each program must maintain at least two ACLS/SABES-trained staff at all times. Trained MAPT administrators may train other staff at their programs, but that training must be as thorough as the original training to avoid real-time testing problems and unreliable score data.

Test Administration Procedures: 

Test administrators must follow these steps at their program to ensure effective testing and reliable results:

  • Learners must be in an ABE class two weeks prior to taking the MAPT.
  • Before taking the MAPT for the first time, test takers must first take a “Computer Basics” tutorial to familiarize themselves with how to use a computer and the basics of the MAPT test.
  • In addition, learners must respond to the four “Sample Questions” in the MAPT for Math and/or Reading to prepare them for sample question types.
  • The optional Practice Tests offer 20 practice items per NRS level, and test takers are encouraged to take one at their level to familiarize themselves with the approximate difficulty level of the MAPT.
  • The MAPT must be proctored by a trained MAPT test administrator. Technical support staff for the test platform are available from 9 am to 5 pm.
  • Provide a quiet area for testing, with enough time for completion in one sitting. The test is untimed and takes 60 minutes on average; allowing 90-120 minutes helps ensure test takers are uninterrupted.

Scoring - How: 

All items are scored dichotomously using the three-parameter logistic item response theory model and students’ scores are computed using the expected a posteriori (EAP) estimation method (see Sireci et al., 2008). The IRT theta scores are transformed onto the 200-700 MAPT standardized score scale using a linear transformation within each EFL as described in Sireci et al. (2008).
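The scoring approach described above (3PL response probabilities, EAP ability estimation, then a linear scale transformation) can be sketched as follows. The item parameters, quadrature grid, and transformation constants here are made-up illustrations; the MAPT's actual calibrated parameters and EFL-specific scaling constants are given in Sireci et al. (2008).

```python
import math

# Sketch of 3PL scoring with EAP estimation. All item parameters (a, b, c)
# below are invented for illustration -- not actual MAPT item parameters.

def p3pl(theta: float, a: float, b: float, c: float) -> float:
    """3PL probability of a correct response (D = 1.7 scaling constant)."""
    return c + (1.0 - c) / (1.0 + math.exp(-1.7 * a * (theta - b)))

def eap(responses, items, grid=None) -> float:
    """EAP ability estimate: posterior mean over a quadrature grid,
    with a standard normal prior on theta."""
    if grid is None:
        grid = [i / 10.0 for i in range(-40, 41)]  # theta from -4 to 4
    posterior = []
    for theta in grid:
        prior = math.exp(-theta * theta / 2.0)  # unnormalized N(0, 1)
        like = 1.0
        for u, (a, b, c) in zip(responses, items):
            p = p3pl(theta, a, b, c)
            like *= p if u == 1 else (1.0 - p)
        posterior.append(prior * like)
    total = sum(posterior)
    return sum(t * w for t, w in zip(grid, posterior)) / total

# Three hypothetical items and a response pattern (1 = correct, 0 = wrong):
items = [(1.2, -0.5, 0.2), (0.9, 0.0, 0.25), (1.5, 0.8, 0.2)]
theta_hat = eap([1, 1, 0], items)
# A within-EFL linear transformation then maps theta onto the 200-700
# MAPT scale, e.g. scale = slope * theta_hat + intercept, with
# EFL-specific slope/intercept constants (values assumed, not published here).
```

The key property is that the posterior mean increases as the response pattern improves, which is what lets a single theta scale support routing decisions and the final scale score.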

Scoring - Type: 

All items are scored dichotomously. The item response theory three-parameter logistic model is used to score examinees at the end of each stage for routing to the next stage and for computing the final MAPT scale score for each examinee.

Scoring - Who: 

Each item is scored right or wrong by the computer.

Scoring - How Long: 

The MAPT scale score is calculated by the program as soon as the learner is finished taking the test.

Reporting Procedures: 

When examinees complete the test, the onscreen score report includes their scale score, the MAPT subject, and their EFL. These scores are automatically uploaded to the SMARTT system and recorded for programs and state use. In addition, expanded score reports for the MAPT for Math detailing benchmarks associated with items answered correctly and incorrectly as well as mappings of items to difficulty levels on the MAPT scale are available as of February 2010.

Time Needed for Assessment: 

The MAPT is designed to be completed within one hour. However, it is recommended that two hours be allocated for students to take the test to ensure they have adequate time and do not feel the need to rush. The test is intended to be taken in one sitting, and should not be split into multiple sessions unless absolutely necessary. Most learners will complete the test in one hour or less.

Publisher / Company / Source: 

The MAPT is produced through a collaboration between the Massachusetts Department of Elementary and Secondary Education and the Center for Educational Assessment at the University of Massachusetts Amherst.

Versions & Publication Dates: 

First published July 2006; updated system software in August 2009.

Cost: 

TBD

Additional Comments: 

The MAPT Technical Manual (Version 2, April 2008) and the MAPT Technical Manual Supplement (July 2009) are available upon request. Please contact Jane Schwerdtfeger at the Massachusetts Department of Elementary and Secondary Education at janes@doe.mass.edu.