Professor and Associate Dean for Assessment and Evaluation; Director of Preparing Future Faculty

About the Conference

This program is designed to help teams of faculty members and administrators from a university work together to create a practical plan for implementing interprofessional practice and education. Expert faculty will lead discussions and activities highlighting well-published practices from around the world that are designed to ensure high-quality interprofessional practice. This information will be foundational to the team-based activities that occur throughout the workshop. Sessions will cover all elements of interprofessional education, with a focus on theoretical underpinnings as well as practical approaches to designing an IPE curriculum.

Call for Poster Abstracts
Call for Presentation Abstracts



Pre-assignment: Participants should bring the following items:

  • Objectives used for a lecture and the test questions used to assess that lecture, as well as any test statistics derived from the test questions
  • Copies of rubrics used to assess learners
  • A copy of your college’s program learning outcomes (PLOs)
  • A copy of program or course review forms


Arrival and Registration


Participants take seats in auditorium


Welcome, Introductions and Opening Remarks


Plenary Session 1

Writing measurable and specific lecture objectives

Description:  Objectives are among the most important assessment tools faculty members can use; however, faculty members may not use them, or may not write them in a way that facilitates assessment. This session will review 2 common mistakes faculty members make when writing objectives.

Session objectives:

  • Distinguish between low-level and high-level objectives
  • Describe 2 common mistakes made in objective writing
  • Identify observable and measurable verbs for each level of learning to use when writing objectives
  • State how to quantify your objectives


Coffee Break / Poster Displays*

* All schools attending the workshop will be invited to display a poster about assessment at their institution; more details to follow


Plenary Session 2

Instructional Alignment, Individual Tagging, and the Relationship to Programmatic Assessment

Description:  One of the factors that influences the validity of an assessment is aligning test items with objectives. This session will review how to triangulate objectives, test questions, and content delivery.

Session objectives:

  • Name 1 common mistake made when writing objectives and test questions
  • Align your objectives with your delivery and assessment
  • State 4 steps for tagging your test questions


Lunch and Prayer


Plenary Session 3

Interpreting exam statistics and using them to vet your current test and to revise your test questions

Description:  After students complete a test, faculty members can be confused about what to do with test questions that do not perform well. In addition, faculty members who assess the same topics each year should refresh at least 30% of their test questions annually. However, refreshing a test bank can be a daunting experience. This session will review how to use exam statistics to vet exam performance for a current test, as well as how to revise test questions for future years.

Session objectives:

  • Compare and contrast 8 assessment terms
  • State 7 factors that impact the validity of tests
  • Define the point-biserial correlation and the Kuder-Richardson formula (KR-20)
  • Interpret item statistics using a sample exam statistics report
  • List 3 steps for vetting test questions using exam statistics
  • Name 3 steps for revising test questions using the exam statistics report


Coffee Break / Poster Displays (Micro-Presentations* of half of the posters)

*Micro-presentations are 3-minute presentations of a poster's main points


Team Time 1

(Revise your objectives and test questions using exam statistics, and tag test questions using the tagging helper – see the specific activity slides)

  • Create or reconstruct your own objectives in accordance with guidelines discussed during Session 1
  • Evaluate your own objectives to ensure that they are aligned with your test questions.
  • Tag your test questions 


Regional Mini-Presentation* 1

*All schools attending the workshop will be invited to submit a proposal (abstract) for a 20-minute presentation (and 10 minutes of Q&A) about the assessment system at their institution. Three abstracts will be selected by the members of the Organizing Committee for presentation at the workshop; more details to follow.


Day’s Wrap-Up and Questions




Plenary Session 4

Using grading tools to deliver feedback

Description:  Providing feedback is an essential assessment activity faculty members need to conduct in order to help students enhance their performance and gain confidence and competence. However, not all faculty members are trained in how to deliver goal-oriented feedback. This session will review how to use grading tools to support a 6-step process for delivering feedback to students.

Session objectives:

  • Contrast the 3 types of grading tools faculty members can use to assess student performance
  • List 4 benefits and 3 weaknesses for using grading rubrics
  • Reflect on your ability to provide feedback
  • Distinguish formative vs. summative assessment and evaluative vs. descriptive feedback
  • Contrast 5 terms:  feedback, evaluation, constructive criticism, feeding, and self-reflection
  • Evaluate feedback structures and identify the 6-step process of providing feedback
  • State how to use a rubric to provide feedback


Coffee Break / Poster Displays (Micro-Presentations of half of the posters)


Regional Mini-Presentation 2
Regional Mini-Presentation 3


Lunch and Prayer


Plenary Session 5

Fitting All of the Pieces Together with Programmatic Assessment

Description:  While individual faculty members' efforts in teaching and assessment are important, programs also need to evaluate how all of the individual components fit together to shape what students look like upon graduation. This session will review best practices related to the importance of programmatic assessment, quantitative and qualitative data sources for programmatic assessment, and curricular review and mapping.

Session objectives:

  • State 2 reasons that programmatic assessment is important
  • Evaluate the sources of quantitative and qualitative programmatic assessment data
  • Identify 4 primary question areas to evaluate when reviewing courses within the curriculum
  • Define the role of all stakeholders in the course review process


Team Time 2 (coffee available)

Participants can choose from the following activities:

  • Complete work from team time 1
  • Revise existing rubrics
  • Discuss with colleagues which courses assess the same items, which rubrics are currently in use, and which rubrics could be shared


Putting it all Together: Closing Remarks and Wrap-up

Book now

Seats are limited!

1 - 3 Persons

Register now

4 Persons and Above

Register now

Contact Us

If you have any additional queries about this conference, please do not hesitate to contact us. We are available 24/7!

  • Contact Us


  • Mail us


  • Venue

    Al Ain University   – Abu Dhabi Campus

Location Map