2026 Pre-Institute Workshops
All Pre-Institute workshops will be held on Sunday, October 18, 2026. You may sign up and pay for any of the Pre-Institute workshops when completing the online registration form for the 2026 Assessment Institute in Indianapolis. Pre-Institute workshop fees ($150 for full-day; $75 for half-day) are in addition to the Institute registration fees.
Attending a Pre-Institute workshop can be a valuable experience for several reasons:
- In-depth learning: Pre-Institute workshops offer a more in-depth learning experience than regular conference sessions. Due to their longer duration, workshops allow for a more comprehensive exploration of the specific workshop topic.
- Practical experience: Pre-Institute workshops offer engaging exposure to the subject matter, giving attendees the opportunity to consider and apply what they learn in the workshop to their own contexts. This can be particularly helpful for attendees who prefer a more interactive learning experience.
- Networking: Pre-Institute workshops can be an excellent opportunity to connect with other attendees who share similar interests and goals. By attending a workshop, you can meet and network with other practitioners, potentially leading to new opportunities for collaboration.
- Access to thought-leaders: Pre-Institute workshops are facilitated by assessment and improvement thought-leaders. This provides attendees with a unique opportunity to learn directly from experienced practitioners and to ask questions in real-time.
Overall, attending a Pre-Institute workshop can be a valuable investment in your professional development. The workshops offer opportunities to learn in-depth, gain practical and applicable experience, network with like-minded colleagues, and access experienced practitioners in the field of assessment and improvement.
Full-Day Workshop | 9:00 a.m.–4:30 p.m.
Morning Half-Day Workshops | 9:00 a.m.–12:00 p.m.
Afternoon Half-Day Workshops | 1:30 p.m.–4:30 p.m.
Sunday, October 18, 2026
Full-Day Workshop | 9:00 a.m.–4:30 p.m. | $150
01A – Assessment 101
Assessment. Where to start? It’s here in Assessment 101. Participants will interact with each other and engage in hands-on activities throughout this full-day workshop. You will grapple with questions fundamental to higher education such as: What should students know, think, and be able to do when they graduate? And how would someone know if students succeeded? Fortunately, assessment can help us address these questions. Specifically, by the end of this workshop, you will be able to do the following: (1) explain the basic steps in the assessment process, (2) distinguish among beginning, developing, good, and advanced assessment reporting, (3) develop an assessment plan for one student learning outcome (SLO), and (4) discuss the fundamentals of applying interventions (e.g., pedagogy and curriculum) at the program level to improve student learning.
Keston H. Fulcher, James Madison University; and Associates
Audience Level: Beginner
Topic: Assessment Methods
Sunday, October 18, 2026
Morning Half-Day Workshops | 9:00 a.m.–12:00 p.m. | $75 each
02A – Jumpstarting General Education Program Review: A Systems Thinking Approach to the Self-Study
Often overlooked in discussions of general education program development and assessment is the issue of program review. The Association for General and Liberal Studies (AGLS) offers two resources, “The Gen Ed Leader’s Playbook” and a “Guide to Assessment and Program Review,” to stimulate a collaborative discussion for improving a general education program. The Playbook provides leaders with various tools to help address the tough questions, such as “Why do we need Gen Ed Programs?” At the heart of the “Guide” is a set of twenty systems analysis questions to improve program quality. This workshop focuses on the initial stage of the self-study and allows attendees to “test-drive” the tools and practice some basic general education program evaluation steps.
Jody DeKorte, Purdue Global; Christine Robinson, University of North Carolina at Charlotte; and Kevin Hermberg, Dominican University New York
Audience Level: Beginner
Topic: General Education
02B – Power BI Boot Camp: Dashboard Building for Assessment Professionals
Interactive dashboards are a powerful tool for visualizing data. As postsecondary campuses collect an increasing amount of data related to student learning, satisfaction, and success outcomes, dashboards have become a common method for visualizing that data. Dashboards are highly adaptable and can display many types of data, including data from learning management systems, student surveys, admissions records, demographics, and student success indicators. This workshop will focus on building dashboards that visualize student learning data using Power BI and is designed for participants with little or no prior experience. The workshop will be a hands-on learning experience, and participants will leave with a newly created dashboard. The emphasis will be on learning the basic steps of Power BI dashboard building so that participants can apply these skills in the future. Note: Participants will need to bring their own laptop with the desktop version of Power BI installed prior to the workshop. The laptop must run the Microsoft Windows operating system, as the free desktop version of Power BI does not run on Mac (macOS) laptops. A dataset will be provided to participants. More information and instructions will be communicated directly to registered participants prior to the start of the Institute. Please note that this session has been offered in prior years and will not significantly differ in content.
Shane Schellpfeffer, University of North Dakota; Caitlyn Jessee, Florida State University; and Jeremy Fry, Indiana University School of Dentistry
Audience Level: Beginner
Topic: Use of Technologies in Assessment
02C – Beyond Western Metrics: Re-Examining Assessment through Indigenous Ways of Knowing
Efforts to improve assessment in higher education often prioritize processes, methods, and tools while leaving the paradigms that shape them largely unexamined. This half-day workshop disrupts that pattern by centering Indigenous ways of knowing, paradigms, and research methods as foundations for rethinking what counts as knowledge, evidence, and success. Grounded in responsibility, relational accountability, community, and place, participants will examine how Indigenous paradigms challenge dominant Western assumptions about rigor, objectivity, and impact—and what this means for assessment practice. Through dialogue, reflection, and applied examples, the workshop supports assessment professionals in engaging Indigenous approaches ethically and contextually, while developing practical strategies to transform assessment in meaningful, respectful, and responsible ways.
Gavin Henning, New England College; Anne Lundquist, Temple University; and Benny Rieth, BelongingU
Audience Level: Beginner and Intermediate
Topic: Assessment Methods
02D – Survey Skills That Drive Decisions: From Design to Impact
Learn the art of building high-impact surveys that inform decision-making and strengthen student success and program improvement. In this interactive, half-day workshop, participants will gain practical, step-by-step experience to design, distribute, and analyze surveys that produce actionable feedback from students, faculty, staff, and program graduates. Geared to both beginning and seasoned assessment professionals, the workshop covers techniques for writing clear, engaging items; selecting the right question types; and using distribution strategies that reliably improve response rates. Participants will also learn approaches for summarizing results and creating compelling data visualizations that communicate key findings to stakeholders. Throughout the workshop, attendees can use the provided sample survey or work on their own survey projects, with time built in for refinement, peer discussion, and real-time feedback to ensure they leave with tools and templates they can apply immediately.
Joanna Z Boeing, Bridgewater State University
Audience Levels: Beginner and Intermediate
Topics: Analytics and Metrics, Assessment Methods, Institution-Wide Data Collection/Use, Use of Technologies in Assessment
02E – Strategic Framework: Crafting a Comprehensive Faculty Development Program Aligned with Assessment and Accreditation Standards
Faculty development that is timely and responsive to accreditation changes and feedback is critically important. This workshop aims to help participants develop a comprehensive faculty professional development program, including a robust orientation. Given the substantial investment of time, attention, effort, and financial resources in faculty development, a strategic approach is crucial. Orientation serves as a pivotal process for acquainting faculty with the institution, establishing institutional and faculty connections, communicating expectations clearly, and ensuring that accreditation needs become part of faculty development. During the workshop, participants will engage in interactive, hands-on activities to: (1) identify the essential components of faculty professional development, including initial orientation, and establish a comprehensive understanding of “must-have” elements, (2) develop a “just-in-time” model for faculty professional development, ensuring relevance and timeliness while considering feedback from accreditors, (3) evaluate the effectiveness of their institution’s faculty professional development program through practical assessment strategies, and (4) provide evidence demonstrating that the institution meets and responds to accreditation requirements, both institutional and programmatic, pertaining to faculty training.
Amy N. Morris and Nehad El-Sawi, Des Moines University
Audience Level: Beginner
Topic: Faculty/Professional Development
02F – ACCELERATE in Action: Adopting and Applying the Updated Assessment Principles
This workshop guides attendees in learning and implementing the new ten principles—ACCELERATE—a contemporary update to the foundational nine assessment principles that have guided higher education practice for decades. You will gain insights, strategies, and approaches for understanding these principles and enacting them across academic, co-curricular, and institutional contexts, through practitioner-informed examples and resources for meaningful decision-making and improvement. Designed for a broad audience, the workshop emphasizes actionable strategies for aligning assessment with meaningful action and improvement efforts in powerful ways to strengthen the assessment culture and practice on your campus. Participants will leave with a robust toolkit for approaching and strengthening assessment design and use, as well as cross-functional collaboration for informed decision-making and student success.
Divya Bheda, Santa Clara University; Constance Tucker, Oregon Health & Science University; and Daniel J. Kaczmarek, University at Buffalo
Audience Levels: Beginner and Intermediate
Topic: Future Trends
02G – From Compliance to Continuous Improvement: Practical Strategies for Making Assessment Matter
This workshop explores how a compliance mindset can undermine meaningful assessment efforts and limit assessment’s potential for improvement. Participants will examine how accreditation can be wielded in the wrong ways, identify common symptoms of a compliance-driven culture, and explore strategies for reframing assessment (and accreditation efforts) as the ongoing promotion of good practice. Using multiple institutional examples, presenters will engage attendees in reflection and activities to shift assessment practices toward betterment and continuous improvement aligned with institutional priorities.
Megan Good, James Madison University; Joe Levy, Excelsior University; and Will Miller, Embry-Riddle Aeronautical University
Audience Levels: Beginner and Intermediate
Topics: Accreditation, Future Trends, and Leadership for Assessment
02H – Student Partnership in Assessment: Ideas for Meaningful Student Engagement
In more than three decades of work, we have yet to regularly demonstrate a return on the investment in student outcomes assessment. Many adjustments to the process have been made (e.g., outcomes design, motivation, instrument design, faculty buy-in). Nevertheless, the key stakeholders—students—are rarely included in the process. Instead, we must—often incorrectly—assume the student experience and build our processes accordingly. To address this issue, participants will engage in a guided redesign of a specific assessment process at their own institution, intentionally planning for student partnership.
Nicholas Curtis, University of Wisconsin–Madison; and Robin Anderson, James Madison University
Audience Level: Intermediate
Topic: Student Partnership and Engagement in Assessment
02I – Process Education Through the Lens of Assessment: A Five-Level Journey
This half-day highly interactive workshop is designed for those new to Process Education (PE) and desiring to improve student learning outcomes. Participants move through five interactive activities aligned to PE’s “in a Nutshell” levels: learning, performing, assessing, growing, and self-mentoring. Each activity features a brief framing, an experience guided by the Strengths–Improvements–Insights (SII) assessment, and a short transfer debrief. Participants practice metacognition, provide and receive peer feedback on performance, critically analyze the assessment process, map their own growth, and engage in self-mentoring for improvement. Pre-conference preparation and a take-home toolkit support continued practice and adaptation in any context.
Tris Utschig, Kennesaw State University; and Josh Morrison, University of Indianapolis
Audience Level: Beginner
Topic: Assessment Methods
Sunday, October 18, 2026
Afternoon Half-Day Workshops | 1:30 p.m.–4:30 p.m. | $75 each
03A – Leveling Up Your General Education Program Using the Assessment Maturity Model
Ready to discover the true maturity of your general education assessment practices? Join us for a half-day workshop to explore our new Assessment Maturity Model, designed to help you evaluate and enhance your general education assessment program. You will learn how to apply the model’s five dimensions: catalyst(s) for change; competencies and/or program student learning outcomes; assessment purpose, planning, and resources; faculty engagement; and transparency. By the end of this workshop, you will be able to identify your program's level of maturity based on the model and have gathered ideas to create an actionable plan to advance the maturity of the dimensions you selected for enhancement.
Christine Robinson, Mitch Cottenoir, Erica Andrews, and Shameika Daye, University of North Carolina at Charlotte
Audience Level: Intermediate
Topic: General Education
03B – Designing for Justice: Building Equity-Centered Data Visualizations
Data visualization is an important part of the work of assessment professionals, yet common practices can perpetuate biases through design choices, color schemes, labeling, and accessibility barriers. This workshop explores strategies, provides hands-on experience, and offers resources for creating and identifying equity-centered visualizations that prioritize inclusivity and justice.
Laura Lambert, James Madison University; and Joanna Boeing, Bridgewater State University
Audience Level: Intermediate
Topics: Analytics and Metrics, Assessment Methods, Institution-Wide Data Collection/Use, Use of Technologies in Assessment
03C – Assessment Unlocked: Practical Tools to Overcome Barriers and Build Resilience
This workshop explores elevating and sustaining student affairs assessment practices within common constraints and trends in higher education today. Led by the Consortium of Organizations for Student Affairs Assessment (COSAA), participants will be provided with tips, tricks, and institutional examples of success to navigate barriers to practice. Presenters will provide practical and implementation-ready strategies from practitioner research and motivation theory to increase the staying power of student affairs assessment as a culture of continuous quality improvement.
Rebecca Gibbons, Embry-Riddle Aeronautical University (representing AALHE / SAAL); Crystal Cyr, University of Colorado Boulder (representing ACPA CAE); Sophie Tullier, University of Delaware (representing NASPA AER KC); and Shaun Boren, University of Florida (representing NASPA AER KC)
Audience Level: Beginner
Topic: Student Affairs and Co-Curricular Programs and Services
03D – ePortfolios: Maximizing High-Impact Practices (HIPs) in Teaching, Collaboration, and Assessment
As a High-Impact Practice (HIP), ePortfolios help students document, reflect on, curate, and showcase learning from a variety of contexts. Informed by the perspectives of the Association for Authentic, Experiential, and Evidence-Based Learning (AAEEBL), this workshop equips participants with knowledge, skills, and examples of how ePortfolios are making a positive difference in student learning, growth, and development. Participants will learn how to integrate ePortfolios in courses, programs, and co-curricular experiences, along with how to leverage their use to inform and influence instructional and assessment processes. The workshop will include small group consultations to support those who are new to ePortfolios, those optimizing ePortfolios in their courses, those working with ePortfolios at program/degree levels, and those wrestling with campus-level ePortfolio challenges. Faculty contributors will include seasoned experts at integrating ePortfolios into course curricula, faculty working with degree-oriented ePortfolios, and ePortfolio innovators supporting undergraduate research initiatives.
Debbie Oesch-Minor, Indiana University Indianapolis; and Associates
Audience Level: Beginner
Topic: HIPs in the States/High-Impact Practices
03E – Beyond the Numbers: Transforming Assessment Data into Compelling Stories
Assessment data is essential for institutional decision-making, but numbers alone don’t inspire action—stories do. This interactive workshop will explore how data storytelling bridges the gap between data and impact, making assessment results more accessible, engaging, and actionable. In this workshop, participants will craft evidence-based stories by framing problems, aligning evidence, building narratives, and storyboarding to engage audiences. Through real-world examples and practical strategies, attendees will develop a framework for integrating storytelling into assessment, including how to tailor narratives to different audiences, translate data into stories, incorporate narrative into visualizations, and present evidence in ways that drive engagement and decision-making. This workshop will provide participants with the confidence and tools needed to turn data into impactful stories that inform, persuade, and inspire action. Note: Participants will need to bring their own laptop/device to the workshop. More information and instructions will be communicated to registered participants prior to the start of the Institute.
Ryan Smith, Illinois State University; and Associates
Audience Level: Beginner
Topic: Institution-Wide Data Collection/Use
03F – Do It Without Talking About It: Liberatory Assessment, A Framework for Catalyzing and Inspiring Assessment Impact and Use in Your Organization
Assessment has often been perceived as a tool for compliance and accountability rather than a medium that fosters organizational learning, improvement, and transformation to advance student success. Attend this workshop to learn about the liberatory assessment framework, which can help you walk the walk by rethinking your assessment designs and approaches in powerful ways. You will learn practical strategies and application examples that empower you to be a change-agent-leader within your organization who can promote student success and institutional action in meaningful and empowering ways.
Divya Bheda, Santa Clara University
Audience Level: Intermediate
Topics: Emerging Voices in Assessment and Future Trends
03G – What Counts as Learning? Reframing Student Learning Outcomes as Observable Competence
For decades, colleges and universities have invested heavily in assessment systems designed to measure student learning. Yet a persistent challenge remains: learning itself is rarely defined in operational terms. In many institutional settings, grades, course completion, and student self-reports function as proxies for learning, leaving faculty and administrators without clear evidence of what students can actually do as a result of instruction. This extended workshop introduces a practical framework for defining student learning in terms of observable student behavior and demonstrable competencies. Drawing from assessment practice, behaviorist learning theory, and examples from multiple disciplines, the session explores how learning outcomes can be written, assessed, and documented through observable performance rather than inferred internal states. Participants will examine common problems in current outcome statements, redesign outcomes using observable action verbs, and explore methods for aligning assignments, rubrics, and program-level assessment with demonstrable student performance. The workshop will also address how this approach strengthens institutional accountability and credibility by allowing institutions to clearly articulate the competencies their graduates possess. Participants will leave the session with practical strategies for revising Student Learning Outcomes, designing assessments that capture observable evidence of learning, and communicating results in ways that are meaningful to faculty, accreditors, and the public.
Jarek Janio, Santa Ana College; and Associates
Audience Level: Intermediate
Topic: Learning Improvement
03H – Artificial Intelligence (AI) and Assessment: Opportunities and Challenges
Generative AI tools such as ChatGPT are increasingly visible in higher education, prompting assessment professionals to reconsider how they design, implement, and interpret assessment activities while addressing complex questions about ethics, equity, and integrity. This workshop moves beyond introductory “how-to” demonstrations to engage participants in critically and creatively integrating AI across the assessment cycle, from articulating learning outcomes and designing instruments to interpreting evidence and communicating results for improvement. Drawing on recent survey data about how the assessment community is—and is not—adopting generative AI, participants will compare emerging patterns of use with their own institutional contexts and constraints. Facilitators will also surface ethical frameworks and practical guidelines to help teams navigate issues such as bias, transparency, data privacy, and academic integrity in AI-supported assessment workflows. Through structured small-group activities and live interaction with AI tools, attendees will iteratively prototype prompts, workflows, and reporting approaches that align with their assessment goals, leaving with concrete examples, templates, and strategies they can adapt for annual assessment reporting and other institutional needs.
John Hathcoat and Yu Bao, James Madison University; and Ruth Slotnick, Bridgewater State University
Audience Level: Beginner
Topic: Use of Technologies in Assessment
03I – Who Are We and Who Do We Want to Be Together? Assessing and Deepening Community-Campus Partnerships through Collaborative Critical Reflection
Community-campus partnerships are—or can be—vehicles for meaningful learning, growth, and change on the part of everyone involved. However, for such aims to be fulfilled, questions around partnership dynamics and quality must be asked—for example, how can partnerships deepen over time, especially through intentional collaborative actions of the partners themselves? How do we all—as partners—get on the same page about whether and in what ways we intend our partnerships to be transformational? How do we know whether partnerships are deepening, and what processes contribute to—or hinder—that deepening? Beyond the growth of our particular partnerships, what questions might we pursue in order to continue advancing the field’s understanding of partnership processes? In this half-day workshop, an interdisciplinary, interinstitutional team of facilitators will share and engage participants with an expansive (and deep!) body of work connected to TRES (pronounced “trees”), the Transformational Relationship Evaluation Scale, and the research-grounded Reflection Framework of which it is a part. In particular, we invite partnerships as a whole (whether dyads, triads, or larger groups) to register for and participate in this workshop together. We will spend a large portion of the session using—not just discussing—the hot-off-the-presses TRES III Reflection Framework in real time, so that you not only gain familiarity with the tool but also explore your own partnership dynamics and, from them, co-generate concrete strategies for deepening the relationships, processes, and results that comprise your work together. Note: If it is not feasible for more than one member of your partnership to participate in the workshop, or if your goal is to build your own capacity to support other partnerships, you are welcome to join us as an individual.
Specifically, we will walk together through the TRES III Reflection Framework, first getting oriented to its conceptual underpinnings and then using it together, as partners, to determine what aspects of your partnerships you most want to deepen and what actions you will take to move forward accordingly. Facilitators will share research on the impact of the TRES Reflection Framework, which demonstrates that commitment to and clarity around partnerships increases and partnership gatherings, communication, and inclusivity are enhanced through collaborative use of TRES III. Participants will leave the workshop with concrete strategies for deepening partnerships (your own and/or those you support), ideas for future practice and inquiry, and an invitation to collaborate with the TRES team in ongoing research. Please join us for a highly interactive, enjoyable, and educational session as we explore together with TRES!
Robert G. Bringle, Indiana University Indianapolis; and Associates
Audience Level: Intermediate
Topics: Future Trends, Leadership for Assessment, and Assessment Methods
