TTAC

Video Otoscope Assessment Guide

This document is intended to be used as a guide through the telehealth technology assessment process for video otoscopes. While it was written around the TTAC video otoscope toolkit assessment, parts of it can be adapted to any telehealth technology assessment. This document takes the reader from start to finish through the process that the TTAC team used to assess video otoscopes. It starts with a guide to help determine the technology need and walks the reader through exactly how to assess and test video otoscopes.

Determine the Minimum Requirements

Establishing a base set of requirements is critical when starting any technology assessment.  Determining the basic requirements of a video otoscope unit is initially a fairly straightforward process.  A few simple questions can do a great deal to limit the scope of the market review and the nature of the testing and product evaluations.  Much of the complexity of the assessment process will come in analyzing the quality of images produced by each product, the products’ ease of use, and clinician satisfaction with the performance of the devices.  These “soft” requirements, which are often subjectively measured, will require additional rigor in the testing phase to ensure that the best, most objective data is made available to those involved in the final assessment.

Define the Users

Knowing which users will be utilizing the video otoscope is an important first step in creating your program’s minimum requirements. It will help you determine the level of complexity that you can comfortably introduce with your selection. Questions to ask include:

  • Which specialties will use the video otoscope?
  • What is the level of expertise of the users?
  • How much time do the clinical users have to devote to an encounter utilizing the video otoscope?
  • On average, how many times per day will the video otoscope be used?
  • What other technologies do the users make use of on a daily basis?

Define the Required Functionality

Various user groups will need different functionality and performance from the video otoscope. If the video otoscopes will be shared by multiple disciplines, make sure that all needs are represented. Separating the clinically useful features and functionality from the clinically irrelevant ones will also aid in selecting the most appropriate technology. Some user groups may want the option of attaching a flexible endoscope to the device, which may greatly limit the number of devices suitable for your organization.

  • What is the clinical need for using the video otoscope?
  • What types of surfaces will the video otoscope be imaging?
    • How important is color accuracy?
    • At what distances will the video otoscopes be used?
  • What platform will the video otoscope be used with: asynchronous, synchronous, or both?
    • Is there a need to store images?
    • Is there a need to send images over a network connection?
  • Is there a need to integrate the data obtained with the video otoscope into an EMR?
  • Does the video otoscope need to be used with existing equipment or infrastructure?
    • What types of cables, connections, and/or converters may be necessary to connect the video otoscope to the existing equipment?
  • Does the equipment need to be easily moved without a cart?
  • Does the video otoscope need to be used with only one hand?

Define the Deployment

Establishing where the video otoscopes will be deployed will most likely be based on the required functionality and clinical need. The level of training and support the video otoscopes will require will largely be based on the user assessment. All of this information will be helpful when budgeting and planning for the integration of video otoscopes into your program. It is important to keep deployment and support at the forefront of your mind throughout the entire technology assessment process, because it affects all aspects of the assessment and will aid you in making the most appropriate technology decisions for your program. Refer to the “Deployment and Support” section of the toolkit for a more in-depth explanation. Knowing your site equipment and support needs will help define your budget and could ultimately be a very important factor in your equipment selection.

Assessment

With clear ideas about your intended user population and potential equipment use, you are now ready to continue along the assessment continuum and start actually evaluating the technology. You will start broadly with a complete market assessment based on your required functionality, determine which units to bring in for testing, gather all the information you can about those units, develop a testing plan based on the confirmed functionality of the equipment, test the video otoscopes you brought in, and then begin to make decisions based on your needs and the results obtained.

Market Review

With your minimum requirements in mind, start with a broad internet and manufacturer search for all possible technology types on the market. Initially, use the market review to familiarize yourself with what is available. Once you have determined what is available, start weighing those options against your minimum requirements, programmatic needs, and clinical needs to narrow down your choices.

Being methodical about your searches and documenting the options will make your market review much easier, especially if you are working with others or will need to be able to show your results to others. The process of elimination can begin once you have compiled a list of video otoscope options. Video otoscopes that do not meet your minimum requirements should be removed from the list. You may need to get creative to determine a device’s suspected functionality. Some useful resources are online product manuals and cut sheets, online reviews, manufacturer and reseller websites, and speaking directly with vendors. You may also find the “Resources & Standards” toolkit section helpful.
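
Because the elimination step is just a comparison of each candidate against your minimum requirements, some teams find it helpful to capture the list in a simple structured form and filter it automatically. The sketch below is a hypothetical Python example; the device names, requirement fields, and values are invented for illustration and are not part of the TTAC assessment data.

    # Hypothetical sketch: filtering a market-review list against minimum requirements.
    # Device names and requirement fields are illustrative placeholders.
    candidates = [
        {"model": "Otoscope A", "one_handed": True,  "stores_images": True},
        {"model": "Otoscope B", "one_handed": False, "stores_images": True},
        {"model": "Otoscope C", "one_handed": True,  "stores_images": False},
    ]

    # Minimum requirements drawn from the questions answered earlier.
    minimum_requirements = {"one_handed": True, "stores_images": True}

    def meets_minimums(device, requirements):
        """Return True if the device satisfies every required attribute."""
        return all(device.get(key) == value for key, value in requirements.items())

    shortlist = [d for d in candidates if meets_minimums(d, minimum_requirements)]
    for device in shortlist:
        print(device["model"])  # devices that survive the process of elimination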

Your list of video otoscopes should be manageable at this time. The actual number of cameras that you bring in for testing, and the amount of testing that you will be able to do, will depend entirely on the budget that you have set aside. Be very cautious about making large volume technology selections without first getting a chance to physically interact with the equipment and personally verify its functionality.

Once you have made your selections, order test units or sample units and have them delivered to begin the testing process.

Product Information Gathering

For TTAC, the product information gathering phase is really the first stage of testing. We begin to familiarize ourselves with the technology as we record information about it. You will inevitably have gathered some product information during your market review already. Once you have the equipment in-house, you can gather more information about the products from their packaging and manuals. Focus on gathering the information that will be important during testing, in line with the minimum requirements you have already established. The multiple formats of information can become overwhelming, which is why consolidating product information into your own consistent format is so useful. See the “Product Information” section of this toolkit for the information that TTAC focused on for the video otoscope assessment, as well as the table we used to summarize the information we gathered on all models.
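
If you want a single consistent format, one lightweight approach is to define a fixed record layout and write every product into the same summary table. The sketch below is a hypothetical Python example; the field names, models, and values are placeholders, not the actual table TTAC used.

    # Hypothetical sketch of one consistent product-information record layout.
    # Field names and values are illustrative; align them with your minimum requirements.
    import csv
    from dataclasses import dataclass, asdict, fields

    @dataclass
    class ProductInfo:
        model: str
        manufacturer: str
        resolution: str          # e.g. "640x480", as listed in the manual or cut sheet
        output_connections: str  # e.g. "USB, S-Video"
        light_source: str        # e.g. "LED" or "halogen"
        list_price_usd: float
        notes: str = ""

    records = [
        ProductInfo("Otoscope A", "Vendor X", "640x480", "USB", "LED", 1500.00),
        ProductInfo("Otoscope B", "Vendor Y", "720x480", "S-Video", "halogen", 2200.00, "requires capture card"),
    ]

    # Write every record into one summary table for side-by-side comparison.
    with open("product_summary.csv", "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=[f.name for f in fields(ProductInfo)])
        writer.writeheader()
        writer.writerows(asdict(r) for r in records)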

Test Planning

It is tempting to begin “testing” the equipment as soon as you receive it. However, planning your testing against your minimum requirements and your programmatic needs will make the labor-intensive testing process more manageable in the end. Without a plan for testing, you run the very real risk of selecting a model because it is “cool” but turns out to be of little use and/or inappropriate for your program’s needs. Also, without a testing plan, you will inevitably use too many resources (financial and personnel-related) and possibly jeopardize other aspects of your budget. Keep in mind that the planning is just that, planning. It gives you a chance to document what you think you might do during testing. Oftentimes, what you plan on doing may change a bit once you get in the room with your team and have hands on the equipment. You may also test things and/or gather data that you later determine to not be useful. The more assessments you do, the better your team will get at planning.

Actual Testing

Testing can be a time- and labor-intensive process. Roughly sticking to your plan will help, but you should expect to spend a full day, or potentially multiple days, with your team. Assigning different roles to your team members can also make testing run more smoothly. Having someone who can document the testing is extremely helpful. If you don’t have enough staff to dedicate someone to documentation, you might video- or voice-record the sessions to refer back to later. The goal of documenting is to work toward a reproducible process for when testing comes up again in the technology assessment cycle.

Reserve enough space to work comfortably in the same room with your team and all of the equipment that you purchased for testing/evaluation. Remember, if you are going to be obtaining sample media clips or images from volunteers, you also need a secure location that can provide dignity and privacy. Our team used a room that had a VTC unit, network connectivity, and a table large enough to lay out each video otoscope with its associated cables and attachments. Having all the necessary equipment and personnel in one place will help the process run smoothly.

It is helpful to introduce rigor into the testing process whenever you have the opportunity. It will add validity to your data and increase your overall confidence in your testing process. In the video otoscope evaluation, we introduced rigor in many ways. Most notably, we used a single video platform, we assessed the characteristics of each video otoscope in the same order, and we followed the same imaging guidelines while collecting clips.
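
One simple way to hold that order constant is to generate the full observation list ahead of time, so every otoscope is scored against the same characteristics in the same sequence. The Python sketch below is hypothetical; the device names and characteristics are placeholders, not the actual TTAC test script.

    # Hypothetical sketch: pre-building a fixed-order observation log.
    characteristics = ["color accuracy", "image detail", "ease of focus", "light quality", "handling"]
    otoscopes = ["Otoscope A", "Otoscope B", "Otoscope C"]  # illustrative names

    session_log = []
    for scope in otoscopes:
        for characteristic in characteristics:
            # A team member fills in the observation or score for each row during testing.
            session_log.append({"device": scope, "characteristic": characteristic, "notes": ""})

    print(f"{len(session_log)} observations planned")  # one row per device/characteristic pair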

Additional information on our testing process can be found in our testing overview, with high-level points discussed below.

Sample Media Acquisition

In order to compare media for color accuracy and image detail, the TTAC team gathered a set of clinical images with each video otoscope utilizing the same general setup, varying only the camera connections. The images that we obtained were both clinical and technical in nature. Clinical images included tympanic membrane images of multiple volunteers of various ages. The technical images covered a variety of non-flesh subjects, including various resolution charts and focusing targets. We attempted to frame the shots with as little variance as possible, utilizing measurement standards and standardized positioning for obtaining each image set.
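
A consistent naming scheme for the captured media makes it much easier to compare image sets from different devices later. The sketch below is a hypothetical Python example; the device names, subject identifiers, and target labels are invented for illustration and do not reflect TTAC’s actual file scheme.

    # Hypothetical sketch of a standardized file-naming convention for sample media.
    from pathlib import Path

    def media_filename(device, subject_id, target, take):
        """Build a consistent file name, e.g. 'otoscopeA_subj01_tympanic-membrane_take1.png'."""
        return f"{device}_{subject_id}_{target}_take{take}.png"

    # Clinical and technical subjects imaged with every device.
    targets = ["tympanic-membrane", "resolution-chart", "focus-target"]
    output_dir = Path("sample_media")
    output_dir.mkdir(exist_ok=True)

    # Reserve a slot for each device/target pair before the capture session.
    for device in ["otoscopeA", "otoscopeB"]:
        for target in targets:
            print(output_dir / media_filename(device, "subj01", target, 1))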

Conclusions

Once you obtain all of your data from testing, you will again need to compare it to your minimum requirements. Some models of video otoscope will be easy to eliminate because they don’t meet a requirement or have poor quality or functionality. Beyond that, how you weigh the results will depend on what you are trying to prove with your assessment and testing. You can use Likert-based ratings as tiebreakers, but be mindful that not all elements are created equal and some may deserve different weightings in your overall summing or averaging. It is good to have a wide range of data components from testing, but don’t allow too much data to sidetrack you from the important details. Be mindful of the whole testing process, and what it took to get each piece of equipment to do what you needed it to do. You may discover some unexpected or unanticipated results after your assessment.
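
If you do apply weights to Likert-based ratings, the calculation itself is straightforward. The sketch below is a hypothetical Python example; the elements, weights, and ratings are invented for illustration and should be replaced with the ones that matter to your program.

    # Hypothetical sketch: weighting average Likert ratings (1-5) for one device.
    # Weights reflect how much each element matters to the program and should sum to 1.0.
    weights = {"image quality": 0.4, "ease of use": 0.3, "connectivity": 0.2, "portability": 0.1}
    ratings = {"image quality": 4.5, "ease of use": 3.0, "connectivity": 4.0, "portability": 2.5}

    weighted_score = sum(weights[element] * ratings[element] for element in weights)
    print(f"Weighted score: {weighted_score:.2f} out of 5")  # 0.4*4.5 + 0.3*3.0 + 0.2*4.0 + 0.1*2.5 = 3.75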

You may have been using the technology assessment and testing process to see if a new type or model of video otoscope could meet your programmatic need. By the time testing concludes, you will know whether an alternative model is technologically sufficient to meet that need.

A definitive choice is not always clear, so you will most likely need to go back to your assessment results and make a decision regarding technology selection. It might be helpful to wait a short period of time to allow all of the details to sink in. Be careful not to allow too much time between testing and decision making, or you risk forgetting important testing details.