Online Livestream
Human Factors Engineering & Usability Studies Congress
What To Expect

The Human Factors Engineering & Usability Studies Congress is the only event specifically focused on the needs of medical device and combination product professionals responsible for avoiding use errors and medication errors while building a positive user experience. As your combination product matures, are you ready to address more involved questions of practical design, troubleshooting, and regulator scrutiny? This unique event will empower HF professionals to grow more confident in their skills, guide their teams toward the right insource/outsource decisions, and set new strategies for device software.

Event Sponsors
New Topics
Featured Speakers
Ed Israelski
Technical Advisor, Human Factors
Gia Rozells
Director of User Experience
James Kleiss
Research Architect, User / Patient Experience
Jay Duhig
Director, Patient Integration, Pharmacovigilance and Patient Safety
Jenn Zuba
Design Control, Risk Management, and Usability Process Owner
Joe Cesa
Manager, Human Factors Engineering, Surgical Robotics
John Kruse
Senior Human Factors Specialist
John Staton
Director, Human Factors, User Research, and Experience Design
John Towns
Senior Research Fellow, Regulatory, Delivery Systems
Jonathan Amaya-Hodges
Associate Director, Regulatory Affairs CMC, Combination Products and Medical Devices
Joseph Purpura
Associate Vice President, Medical Device Safety
Mary Pat Cottengim
Principal Usability Engineer, Medical Devices
Natalie Abts
Head of Human Factors Engineering, Packaging and Device Development
Rachel Poker
Human Factors Engineering Manager, Device Development
Siddharth Desai
Tim Goldsmith
Staff Human Factors Engineer / User Experience Designer
Tressa Daniels
Global Director, Human Factors Engineering
Vera Shuman
Human Factors Product Development Specialist
Event Schedule

Your test protocols get judged by regulators who might not have specific HF training, requiring a careful touch as you explain what is commonplace. FDA wants to avoid any appearance of overtraining or bias – but what do they really mean by this, and what is necessary to avoid it?

  • Review past complaints about the use of spoken or written descriptions
  • Work through FDA expectations regarding overtraining and learning decay
  • Retrain your teams to stop using potential intervention trigger words such as “familiarize”

Whereas graphical user interfaces and vision-oriented products are easily tested remotely, other device aspects are far more challenging. New approaches in augmented reality, virtual reality, and puppet testing can all be achieved in the short term and can bridge the gaps created by COVID-19.

  • Review data collection through virtual reality and other new methods
  • Establish a visual brand language that fits with product iconography and workflow
  • Fuse the expertise of research groups and designers

HF professionals must find new ways to recruit staff, maintain regular schedules, and interact with users – especially users who are in high-risk groups for COVID-19. In addition to maintaining HF credibility, future test designs will need to address new concerns about liability, risk, and patient privacy.

  • Explore tests involving mobile labs and remote product shipping
  • Gather input from legal, regulatory, and compliance team members
  • Understand what novel study designs FDA will find credible and acceptable
Outsourced testing partners want sponsor companies to be happy – but what sponsors need is accuracy, even if the results are disappointing. The sooner you get bad results, the faster you can fix problems.

  • Firmly set expectations for clinical usability trial partnerships
  • Ensure studies are executed and interpreted with integrity
  • Look for warning signs in trial completion time

Getting answers to a presubmission can take over 2 months – for a study you’re actively trying to plan. Direct interactions with FDA can be used to help refine your questions and increase the likelihood of precise answers.

  • Limit the scope of questions to improve feedback speed and relevance
  • Focus on clarifying the acceptability of specific testing aspects
  • Compare and contrast different presubmission methodologies

Patients may prefer not to use a device even if they understand how to use it. It is critical both to gather data about patient preference and adoption, and to separate that data from usability itself.

  • Interpret use data about devices that may be confusing or embarrassing
  • Recognize that patients may be able to use a device but choose to cease using it – and why
  • Learn from both data sets while keeping them distinct

When using complicated hardware, medical professionals are just as prone to error as laypeople. Assuming that people with advanced degrees are better at learning than others is a serious, and often unrecognized, error that can plague design teams.

  • Recognize that learning is a potential burden for all users
  • Ensure a consistent approach to labeling and training regardless of user background
  • Interrogate your own assumptions

The patient experience is a sliding scale, not a good/bad binary, and removing problems doesn’t necessarily make device use more pleasant for patients. When designing the communications systems for large medical devices, you can gain a market advantage by focusing on a pleasant or even ideal patient experience.

  • Analyze the factors and impacts of acoustics in large imaging devices (e.g., MRI machines)
  • Recognize that there is more to patient perception than getting rid of complaints
  • Tally whether the market will pay for a better user experience

Writing the guides for screening and moderating is just the beginning of the oversight necessary for good relationships with outsourced partners. If you haven’t clearly defined what you expect regarding use errors and critical tasks, you’re just wasting your time and money.

  • Unify essential definitions
  • Highlight expectations for regulatory submission
  • Dedicate the time necessary to training

Identifying user groups and typical use environments is always a challenge for usability professionals – and the learning curve grows steeper when you are trying to develop a combination product that must satisfy two sectors of regulatory expectations at once. How can you make sure both sets of usability criteria are met?

  • Clearly map out validation and summative test designs for combination products
  • Review past challenges and successful solutions for user group selection
  • Instill proper appreciation in your teams for use environments

A key advantage of using the Internet of Things to connect multiple devices is the ability to create cheaper, disposable devices with simpler interfaces. But some devices can be too simple, causing user confusion and introducing risk of error.

  • Study the decision to withdraw wireless EKG devices due to usability challenges
  • Highlight the constraints of tiny disposable and reusable devices
  • Apply proper skepticism to the IoT promises of service providers
Register Now


Subscribe to Conference Updates
Join our e-newsletter list to stay up to date on all event news.
Event Partners
Contact Us