Pearson VUE @ ATP

Session Details

Monday, March 6
9:30 AM – 10:30 AM

PLATINUM SPONSOR SESSION (Pearson VUE): Online Proctoring: Balancing the risks and rewards for your program


Presenters:
Peter Pascale, Pearson VUE
Joe Brutsche, Pearson VUE

Session Description:
Online proctoring is a proven way for testing programs to extend reach and provide candidates with an added level of convenience. But it does come with an inherent level of risk. The emergence and evolution of multiple offerings now make the process of evaluating online proctored solutions even more complicated. How do you identify the one that is best for your program? It’s important for test owners to fully understand how an online proctored solution aligns with the needs of their testing program.

Pearson VUE has partnered with the world’s leading IT certification and professional licensing programs to deliver online proctored exams successfully around the world. Our product, security, and industry experts will provide guidance critical to supporting online proctored deliveries that are reliable and scalable, as well as share best practices. We will also review market and technology trends that are influencing perspectives on traditional security models, the methods of proctoring, and the spectrum of assessments that fit online proctored delivery.

You will leave this session with a good understanding of the factors to consider when determining the best solution for your program and knowledge about what is coming next in expanded delivery opportunities.

 

Monday, March 6
9:30 AM – 10:30 AM

Measuring test taker competency: A cross-divisional session on the value of performance testing


Presenters:
Rory McCorkle, PSI Services LLC
John Weiner, PSI Services LLC
Rebecca Nelson, Tableau
Andy Stockinger, Pearson VUE

Session Description:
Performance testing has been used for decades in various testing programs. Its use, however, has increased rapidly in recent years with the growth of technology, the ability to score increasingly complex tasks automatically, and delivery methods that allow for easier distribution of such exams. And as more employers look for assurances that individuals can perform the tasks required of many job roles—and not simply have the underlying knowledge—performance testing gives programs the ability to assess these hands-on skills directly.

This moderated panel session will feature speakers from assessment organizations—including representatives from the Performance Testing Council (PTC)—who represent the Certification, Education, Industrial/Organizational, and Workforce Skills Credentialing divisions. Their discussion will explore how performance testing is used in these contexts, as well as other topics including: (1) how to assess whether a testing program may be a fit for performance testing, (2) common approaches to automating the performance testing process, and (3) common pitfalls encountered in performance testing.

 

Monday, March 6
11:00 AM – 12:00 PM

FEATURED SPEAKER SESSION: Is Gaming a Psychometrically Valid Option for High Stakes Testing?


Presenters:
Eric D'Astolfo, Pearson VUE
David Seelow, Revolutionary Learning & College of Saint Rose
Kristin DiCerbo, Pearson

Session Description:
Innovation in all industries is critical – and it is particularly critical that assessment tools keep pace with innovation to best reflect the industries in which they are used and to ensure ongoing enhancements in validity, accuracy, and usefulness. With the evolution of gaming technology, there is an opportunity for education/training methodologies, data collection, and assessment to become more engaging to students and candidates and to evaluate knowledge, skills, and abilities in ways that are substantially different from what traditional assessment methodologies have offered. With the wealth of data that comes from a gaming solution, however, there also come substantial psychometric considerations that must be addressed in order to ensure that the data collected, evaluated, and used for decision making are accurate and valid for the stated purpose. Are we at the point that gaming data can be used in a fair and valid manner? Or have we not yet evolved far enough? Participants in this session will hear two sides of the gaming debate and will have the opportunity to add their own thoughts and experiences about the value of gaming in assessment.

 

Monday, March 6
4:30 PM - 5:30 PM

Actionable Intelligence: Real world examples on how to move from reactive to proactive


Presenters:
Victoria Quinn-Stephens, Cisco
Bryan Friess, Pearson VUE

Session Description:
Actionable intelligence comes from determining which insights need to be harvested from data in order to take specific, risk-based, and decisive actions in varying situations and with different programs.

This session will discuss (1) how one exam security program works with their delivery partner to gather the data needed to develop actionable intelligence enabling their joint teams to defensibly act against rogue training centers, candidates, and brain dumping sites, (2) how the test security and legal teams worked together to identify specific evidence, take action, and streamline their process in protecting intellectual property, and (3) how the collective teams have identified, developed, and implemented specific criteria and processes that can be monitored on an ongoing and proactive basis.

 

Monday, March 6
4:30 PM - 5:30 PM

IBM, Microsoft and Cisco: The journey to badging technology


Presenters:
Vikas Wadhwani, Cisco
David Leaser, IBM
Jarin Schmidt, Acclaim
Selina Winter, Microsoft

Session Description:
Top credentialing, certification, and training programs have adopted digital badges as a secure, cost-effective way to issue verifiable recognition, and badging has evolved from new, unknown technology to a trusted method of managing a certification program. Alongside that evolution, a series of best practices to plan, launch, and grow a badging program have developed.

Join this session’s presenters as several top organizations in the credentialing industry share their journeys. Each of their experiences is unique, but all will ultimately address the following topics.

Planning, including: (1) defining the goals of a badging program and determining which metrics will be used to measure those goals, (2) creating meaningful metadata as the foundation of a badge, and (3) designing a badge visual that accurately represents a brand in a way that is mobile optimized.

Launch, including: (1) making sure a marketing plan is multi-channel and informs badge earners of the main benefits of accepting and sharing their digital credential, (2) soliciting feedback from badge earners so a program can be adapted as needed, and (3) gathering influential testimonials.

Growth, including: (1) checking in on goals initially set to make sure they are still relevant or if they need to be adjusted, (2) finding new ways to engage with earners through social media, special offers, or new learning opportunities related to badges issued, and (3) determining if there are new badges that could be offered which align with existing business goals and meet a need for badge earners.

Early adopters of badging have been in the business for several years now and have discovered what works and what doesn’t when it comes to building an effective badging program. In this session, attendees will leave with best practices, action items, and considerable insight—all of which will help them build their own badging programs.

 

Monday, March 6
4:30 PM - 5:30 PM

FEATURED SPEAKER SESSION: Opting Out: Does it Help or Hurt the Assessment Industry?


Presenters:
Rob Pedigo, Castle Worldwide, Inc.
Eric D'Astolfo, Pearson VUE
Amy Riker, Educational Testing Service

Session Description:
In the last several years, the Opt Out movement has gained attention, substantial momentum, and certainly a lot of press – largely due to criticism associated with standardized testing in education and with recertification requirements in various industries. Proponents of the Opt Out movement suggest that issues within standardized testing can be solved by having candidates and students “opt out” of testing – refusing to participate in the assessment process. This is not a surprising approach, as boycott movements have certainly aided social change at various times, so ignoring or dismissing the Opt Out movement is not likely the right answer. Should students and candidates be encouraged to Opt Out of the assessment process until their concerns can be addressed? Would growth of the Opt Out movement encourage change within the assessment industry? Or does Opting Out result in negative impacts that will not, in the end, assist in the evolution of assessment? If Opting Out is not the answer, what is? Join us in what is sure to be an interesting and passionate discussion about the Opt Out movement and its positive and negative implications for assessment.

 

Monday, March 6
4:30 PM - 5:30 PM

The Importance of Providing Relevant Feedback: How Data Analytics Can Provide Actionable Feedback to Improve Performance on High-Stakes Assessments


Presenters:
Ashok Sarathy, GMAC
Martin Kehe, GED Testing Service

Session Description:
Effective high-stakes assessments are valid measures of the skills acquired in high school or those needed to succeed in college or on the job. How good are they, though, at offering feedback or a diagnosis of a candidate’s results that can be analyzed and utilized to improve performance? At the grade school level, the prevalence of formative assessments helps children effectively learn the skills needed to thrive at each grade level. Adult learners and those seeking entry to college or the workforce need this feedback as well, yet very few assessments cater to these needs. This session will discuss how two different assessments have sought to offer test takers feedback on their performance on both practice exams and the real exam, enabling better preparation to maximize performance.

One assessment—a high school equivalency assessment—provides candidates taking either the practice test or the operational test detailed feedback on the competencies that they demonstrated strengths in as well as the specific skills they need to strengthen or attain to improve their performance. The skills and competencies are cross-referenced to educational materials aligned to the test from 20 approved publishers. Links to these materials help candidates immediately address skill deficits and prepare for their next testing opportunity. For test takers who pass the exam, the feedback provides guidance on skills that they will need to acquire to help them advance on their next educational or career goals.

The second—a graduate admissions assessment—offers customized feedback on specific question types, skills, and timing, providing invaluable insight into a candidate’s performance on the exam and actionable information that helps the candidate identify strengths and weaknesses on the skills tested in each section, as well as the time spent on each question. This feedback enables test takers to speak to admissions staff about their skills, helps them improve those skills before arriving in the classroom, or—should they decide to retest—helps them focus their preparation for a future sitting of the exam.

This session will focus on how examiners can delve deeper into the exam data and report on it effectively, providing actionable insights while at the same time preserving the reliability and defensibility of the information that is reported.

 

Tuesday, March 7
10:15 AM – 11:45 AM

Workshop: Improving Test Results Reporting


Presenters:
Joshua Goodman, NCCPA
Jim Masters, Pearson VUE
Brian Bontempo, Mountain Measurement
Fen Fan, NCCPA

Session Description:
When designed effectively, test results reports can provide extremely valuable information to a variety of stakeholders (e.g., candidates, education providers, regulators, and hiring managers). These reports include: (1) traditional reports to candidates about their performance on the test, (2) summary reports based on a group of test takers (e.g., item analysis reports or traditional technical reports), (3) educational program reports, (4) reports on tests administered over a fixed period, and (5) registration reports, among many others.

This session will offer practical advice on: (1) how to improve test results reporting to stakeholders, (2) what information to report and not to report, (3) the format of the reports, and (4) how to provide information that will help stakeholders effectively use the information contained in the reports.

Specifically, this session’s presenters will address five topics related to test result reports. The session will begin by presenting two contradictory approaches to test results reporting: (1) the ‘Game Day’ approach to reporting, where a snapshot of an examinee’s performance is summarized and displayed, and (2) a more technical report which documents the entire set of outcomes of an examination experiment. The second part of the session will discuss the different types of stakeholder groups that might use test results and the specific reporting needs of each of these groups. Third, the presenters will provide an overview of different varieties of quantitative information (e.g., scores, sub-scores, measurement error, and response time) and when best to include this information in test results reports. Fourth, the format of test results reports—including ideas of how to best use data visualization within a report—will be presented. Lastly, the presenters will discuss issues related to the deployment and delivery of reports.

This session will be a balanced mix of presentation and discussion to allow immediate application of knowledge gained during the session. Opportunities to review and evaluate examples of effective and ineffective reports will also be provided.

 

Tuesday, March 7
4:30 PM - 5:30 PM

New Innovations in Computer-based Tests: Best “Operational” Practices


Presenters:
K. Chris Han, GMAC
Huijuan Meng, GMAC
Jennifer Davis, National Association of Boards of Pharmacy
Kathleen A. Gialluca, Pearson

Session Description:
In this session, new, innovative designs used for four different test programs will be introduced. Two of them are already operational, whereas the others are in planning/development stages and will be launched in the near future. The presentations will cover (1) the goals and issues that each test program faced, (2) the research results and evidence showing the effectiveness of the new approaches, and (3) their practical implications and guidelines.

A Linear-On-The-Fly Testing (LOFT) design was adopted by the National Council of Examiners for Engineering and Surveying (NCEES) for their 8 Fundamentals exams. This session’s presentation describes guidelines for addressing operational challenges and evaluating the psychometric qualities of the LOFT forms from various perspectives. Specifically, the presentation describes a way to select items from the full item bank for use in the LOFT item pool—not only to ensure that the current LOFT exams are successfully deployed and administered but also to ensure that future LOFT item pools are equally effective while having minimal overlap with the preceding pool.

The Executive Assessment (EA) test was developed to evaluate a candidate’s readiness for executive MBA degree programs. The EA test is built on a new multistage testing (MST) design with cross-sectional routing (CSR). In MST with CSR, the correlational relationship among sectional scores is used only for routing and not for score estimation for each section. The results of the study suggest that employing MST with CSR can meaningfully improve overall measurement efficiency—especially when the test is short. The presentation also covers the real-world implementation of MST with CSR in the EA test.

The Pallet Assembly (PA) design utilizes a collection of equivalent test forms (a “pallet”) constructed using automated test assembly methods based on form- and pallet-wide objectives and constraints. The PA design will be implemented in Fall 2016 for the North American Pharmacy Licensure Examination (NAPLEX) offered by the National Association of Boards of Pharmacy (NABP).

A hybrid design (LOFT+CAT) with an item parameter imputation approach has been developed for a potential future version of NMAT-by-GMAC, a graduate business school admission test used in India, to address the challenges of using new items operationally in an adaptive test.

 

Wednesday, March 8
8:45 AM - 9:45 AM

The Basics of Blockchain: What it is and What it Could Mean for Your Credential Program


Presenters:
Vikas Wadhwani, Cisco
Mark Mercury, Pearson VUE

Session Description:
An Internet search for the term Blockchain will return a mix of articles: some go deep on how this buzzed-about technology works, while others carry titles such as “Blockchain: Over-hyped bandwagon or truly revolutionary technology?” For those who have heard rumblings of Blockchain, it can be somewhat confusing. For those who haven’t heard of it (yet), Blockchain is downright mystifying.

Blockchain is being explored for an assortment of uses, including: (1) content distribution, (2) storing evidence, (3) credential issuing, (4) validation, (5) badging, and even (6) certification programs related to Bitcoin.

In this session, the presenters will explore: (1) what Blockchain is, (2) what Blockchain is not, (3) common misconceptions, (4) what impact this technology could have on the credentialing industry, and (5) key challenges that will need to be addressed before Blockchain could be effectively implemented for credential management.

Attendees will leave the session with a basic understanding of what Blockchain is, how it works, and why it’s something to keep an eye on in the years ahead.

 

Pearson VUE

Visit us at ATP booth #26


For more information on ATP
Click here

Innovations in Testing
Exhibit Hall Hours & Events

Sunday, March 5

6:00 PM - 7:30 PM

Opening Reception

 

Monday, March 6

7:30 AM - 8:30 AM

Breakfast with Exhibitors

10:30 AM - 11:00 AM

Coffee Break with Exhibitors

12:00 PM - 1:15 PM

Lunch with Exhibitors & Product Demos

2:15 PM - 2:30 PM

Networking Coffee Break with Exhibitors

3:30 PM - 4:30 PM

Dessert with Exhibitors

5:30 PM - 7:00 PM

Reception with Exhibitors and ePoster Sessions

7:30 PM - 10:00 PM

Pearson VUE
“Rock the Country” party
Click here

 

Tuesday, March 7

7:30 AM - 8:30 AM

Breakfast with Exhibitors

10:00 AM - 10:15 AM

Break with Exhibitors

11:45 AM - 1:00 PM

Lunch with Exhibitors

2:00 PM - 4:30 PM

Dessert with Exhibitors/Ignite Table Discussions