Assessment of Student Learning 101

Assessment Cycle

Assessment Cycle graphic depicting assessment of student learning as a continuous, ongoing process.

Learning Goals

Good assessment begins with well-defined and clearly articulated learning goals. The Middle States Commission's handbook on Student Learning Assessment contains useful information on developing learning goals that describe what we want students to learn.

View the handbook Student Learning Assessment: Options and Resources »

View a tool for drafting student learning outcomes »

View a list of student learning outcomes action verbs »

Learning Opportunities

Programs must ensure that students have ample opportunities to achieve the defined learning outcomes. Curriculum maps depict graphically which courses address particular student learning outcomes for a specific program. Curriculum maps can be a rich source of information; for example, they may help programs identify student learning outcomes that are not being addressed, highlight courses that are not connected to any student learning outcome, and identify course assignments, such as capstone assignments, that could be used as embedded assessments.

Assessment

Rubrics

Rubrics are scoring guides that list the expectations or criteria used to evaluate an assignment. Rubrics help ensure that assignments are graded consistently across students within the same course and across courses taught by the same or different instructors. Rubrics also generally provide students with more specific feedback on the strengths and weaknesses of their work.

The Association of American Colleges and Universities (AAC&U) has assembled a series of peer-developed and peer-reviewed Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics.

View more information and download the AAC&U VALUE Rubrics »

Surveys

Nationally and locally constructed surveys are effective ways to measure students' experiences, perceptions and engagement with the university. Nationally constructed instruments, such as the National Survey of Student Engagement (NSSE), often include norm groups for institutional comparison and have established reliability, validity and external credibility, whereas locally constructed surveys can be developed to fit institutional goals and information needs more precisely.

Other surveys and related instruments used to assess student learning outcomes include:

  • Current student survey
  • Graduating student survey / Senior exit survey
  • Alumni survey
  • Employer survey
  • Focus groups
  • Course and instructor evaluations

Visit the IRA Survey page to learn more about the surveys conducted at Temple University and to view selected survey data »

Portfolios

Portfolios offer students the opportunity to showcase and reflect on a collection of their work over a defined period of time, giving faculty a way to assess courses or programs in deeper, more meaningful ways. Typical portfolios include artifacts (e.g., papers, audio/video recordings, artwork, digital projects) along with reflections on those artifacts and on the overall experience in the course or program. Portfolios are valuable assessment tools because they assemble direct evidence of student learning, show student growth over time and demonstrate how the knowledge and skills learned in different courses intersect to build the intended program outcomes. Portfolios are, however, time-intensive both for the student constructing the portfolio and for the faculty/staff assessing it. Portfolios compiled over the span of a degree should include periodic portfolio checks along with a final review.

Other Tools

There are many other tools that can be used as either direct or indirect measures of student learning.

Direct Measures:

  • Practicum, internship or other field placement
  • Clinical evaluations
  • Student work in a capstone course
  • Final paper, thesis or dissertation
  • Licensure or board exam
  • Locally developed test or exam
  • Juried show, performance or critique
  • Oral presentation
  • Design project
  • Group project or demonstration
  • Journals or other reflective writing
  • Discussion boards
  • Research projects

Indirect Measures:

  • Surveys (see above)
  • Retention rate
  • GPA
  • Graduation rate
  • Job placement
  • Post graduate admission
  • Board scores
  • Records of publications or research activity

Using Results

Knowing your audience is critical in assessment planning. What are you being asked to do? Who will be using the results? Who will be interested in the results? Campus audiences might include program faculty, faculty in affiliated programs, institutional leaders, and program administrators and staff. Public audiences include prospective students and their families, employers and policy makers, among others.

Assessment can point to needed improvements. Does the curriculum address each learning goal? Are we using appropriate methods of instruction? Are courses taken in the proper sequence? Do students have the necessary foundation built in pre-requisites for advanced courses? Are there gaps in coverage? Can you improve the learning goals? Are there too many goals? Do the goals need to be clarified? Are the goals inappropriate or overly ambitious?

Assessment can also identify the need to enhance support programs such as tutoring, library services, academic advising, counseling, technology infrastructure and co-curricular opportunities.

Consider the need to improve the assessments themselves. Are the assessments poorly written or easily misinterpreted? Does the type of assessment match the learning goals? Are the assessments too easy or too difficult? Are the benefits of the assessment worth the resources invested?

Showcase your results. Consider sharing them at assessment committee meetings, department/school meetings, annual assessment events or conferences. Share highlights and pleasant surprises, but also address areas for improvement. Get feedback about next steps.