1
- The National Forum on College-Level Learning
- Sponsored by the Pew Charitable Trusts
- Margaret Miller, Project Director
- 2000-2004
8
- “The reputation of American higher education as ‘the best in the world’ is derived from that of a few elite institutions and from the research contributions of a small number of universities. This reputation has little to do with higher education as most Americans experience it.”
- Patrick Callan
- President
- National Center for Public Policy and Higher Ed.
9
- “Despite the major accomplishments of American higher education, geography, wealth, income, and ethnicity still play far too great a role in determining the opportunities that Americans have to prepare for, enroll in, afford, and complete college.”
- Governors Hunt, Edgar, and Carruthers
- Board members
- National Center for Public Policy and Higher Ed.
11
- Reputation of individuals and institutions
- Resources
- versus
- Results: what have they learned?
12
- Accreditors’ requirement of evidence regarding institutional effectiveness
- AAHE’s Assessment Forum
- Pew’s Quality of Undergraduate Education and writing assessment projects
- Association of American Colleges and Universities’ general education assessment project
- Council for Higher Education Accreditation’s project on institutional effectiveness
- National/Community College Surveys of Student Engagement
- Collegiate Results Inventory/Survey
- Collegiate Learning Assessment
13
- The National Education Goals
- National Postsecondary Education Cooperative’s common assessment language project
- Secretary's Commission on Achieving Necessary Skills (SCANS) skills
- National Skills Standards Board
- Equipped for the Future
- National Assessment of Adult Literacy
14
- Certification of individual students
  - E.g., Texas’s TASP, Florida’s CLAST
- Institutional assessment for improvement
  - E.g., campus-based assessment
  - Tennessee’s performance measures
  - Missouri’s accountability program
- Institutional assessment for accountability
  - E.g., S. Dakota and Arkansas
15
- Help institutions anchor their campus-based assessment results in a few key measures benchmarked against their peers.
- Enable states to analyze results by units other than institutions.
- Help states know what they are, and are not, collectively good at.
- Help states know whether their learning news is good or bad.
16
- State-level assessment should not replace campus-based assessment focused on improving programs.
- State-level indicators need to be meaningful to campuses.
- A state-level snapshot is only a beginning and should lead to a finer-grained analysis.
17
- How well do individual students perform? or
- How well do institutions in the state individually promote learning? or
- How well do institutions in the state collectively promote learning? or
- What are the intellectual skills of the college-educated in each state?
18
- What do all the state’s college-educated citizens know and what can they do that contributes to the social good? What kind of educational capital do they represent?
- and
19
- How well do the state’s public and private colleges and universities collectively contribute to that educational capital? What do those whom they educate know, and what can they do?
20
- Measuring Up 2002: model tested with incomplete data from Kentucky
- 2002-2004: Five-state pilot to test the assessment model: IL, KY, NV, OK, SC
- Measuring Up 2004: publish the results of the pilot
- Measuring Up 2006: if enough states adopt the model, grade states on learning
21
- Direct
  - National Assessment of Adult Literacy
  - Graduate admissions and licensure tests
  - Collegiate Learning Assessment (four-year colleges)
  - WorkKeys (two-year colleges)
- Indirect
  - National and Community College Surveys of Student Engagement
  - College Results Survey
22
- National Assessment of Adult Literacy
  - Levels of literacy of the college educated
  - Value added by attending college
- Graduate admissions and licensing exams
  - Number of graduates ready for advanced practice divided by number of applicable degrees
- General intellectual skills tests
  - Proportions of test-takers scoring above a certain level
- Graduates reporting high levels of ability
  - Proportions of respondents reporting performance above a certain level
- Good-practice surveys
  - Aggregate results weighted by institutional FTE
- Each benchmarked against national or five-state norms
23
- The model is workable.
- The data show consistent patterns that tell a valid story in each state.
- Aggressive state leadership is crucial to reliable results.
24
- Motivation: states, institutions, students
- Test and survey administration
- Instrument coverage
- Availability and representativeness of information
- Cost
25
- To address the accountability mandate
- To determine how to do so in a credible and responsible way
- To generate information useful to states, institutions, and students
- To promote state-level analysis and collaborations to serve under-achieving subpopulations or regions of the state
- To target state resources effectively
26
- Measuring Up: http://measuringup.highereducation.org/
- The National Forum on College-Level Learning: http://collegelevellearning.org