Introduction: Benign paroxysmal positional vertigo (BPPV) is a common cause of dizziness that is diagnosed by detecting nystagmus during positional maneuvers. Limited access to expert clinicians who can correctly perform and interpret the eye movement findings of positional tests can hamper diagnosis and delay treatment. We aimed to assess the usability of a smartphone-based eye-tracking application (EyePhone) for self-recording eye movements during positional testing. Methods: Healthy volunteers were enrolled and given instructions for performing the Dix-Hallpike and Supine Roll tests while recording themselves with the EyePhone application. A study team member observed the process without interfering, recording the time each step took and the accuracy with which the positional tests were performed. Usability was assessed using the mHealth App Usability Questionnaire (MAUQ), and expert evaluation of the recorded videos determined their quality. Results: All participants successfully performed the tests and recorded their eye movements. On average, after watching the instructions, it took participants 3 min 31 s to record the Dix-Hallpike test and 3 min 4 s to record the Supine Roll test. Nine participants completed the Dix-Hallpike test without major errors, and all completed the Supine Roll test successfully. Expert review found that 95% of the videos had clear eye visibility. Participants rated the app as easy to use and stated that they would use it again. Conclusion: We demonstrated the usability and feasibility of the EyePhone app for self-recording positional tests. This application offers the potential for remote BPPV diagnosis and improved patient access to care.

Benign paroxysmal positional vertigo (BPPV) is one of the most common causes of vertigo; it is caused by crystals from the otolith organs becoming displaced into the semicircular canals of the inner ear. The presence of characteristic nystagmus during positional maneuvers such as the Dix-Hallpike and Supine Roll tests is the mainstay of BPPV diagnosis [1]. Quantified eye tracking coupled with newly developed artificial intelligence algorithms can improve the detection of such movements and support earlier diagnosis and treatment with repositioning maneuvers [2]. While testing in clinical settings is ideal, the characteristic findings might not be consistently present despite the underlying pathology [3]. Moreover, patients in remote and underserved areas might face greater challenges in receiving an accurate diagnosis and treatment of this condition. To address these challenges, we assessed the usability of our previously described smartphone eye-tracking application (EyePhone) for self-recording eye movements during positional testing [4, 5].

We recruited healthy volunteers for this study. Recording took place in a standard clinic room with an examination table suitable for the positional (i.e., Dix-Hallpike and Supine Roll) tests. Each participant was given a cell phone with our smartphone eye-tracking app installed and a laptop containing a self-paced instructional PowerPoint presentation prepared specifically for this study. The presentation included step-by-step instructions on operating the application and on performing the Dix-Hallpike and Supine Roll tests while self-recording eye movements. It first walked participants through the components of the app and the general principles of recording eye movements of acceptable quality with our application. The rest of the presentation focused on performing the positional tests. First, each participant watched a video of how a test is performed without recording their eye movements; they could rewatch the video as many times as they wished and practice the test to improve their technique. Next, they watched videos showing how to use the app during positional testing to record their eyes. These videos, prepared by the study team, included a screen recording of a demonstration positional test and a video of the same test filmed from an angle that captured the full head, body, and phone. Participants could again attempt as many recordings as they wished until they were satisfied with their performance. After completing both positional tests, we asked participants to complete a modified version of the previously validated mHealth App Usability Questionnaire (MAUQ) [6].

Throughout the self-recording process, a trained study team member was present in the room as an observer and recorded the duration of each part of the testing, the accuracy of the positional tests performed both with and without recording, the number of trials it took each participant to complete each positional test, and the mistakes made while performing each positional test with and without the phone. The study team was instructed not to interfere with or assist the recording process. Afterward, an expert reviewed the videos obtained by the participants and assessed whether they were of sufficient quality for diagnostic purposes. We used descriptive statistics to describe the study population, the metrics above, and the participants' responses to the MAUQ.

In total, we recruited ten healthy volunteers; 50% (n = 5) were female, and the average age was 33.3 ± 10.4 years. Most participants (90%) had a graduate or professional degree, and one participant had no degree. When asked about prior exposure to positional tests, three said they had previously undergone at least one of the two tests. Nonetheless, none had undergone the tests regularly, and considerable time (at least 1 year) had passed since their last exposure.

All the participants were able to successfully perform the positional tests and complete the recording session. The average duration of the session, including the time participants spent watching videos and practicing each of the tests, was 17 min 43 s ± 3 min 22 s. Excluding the time participants spent watching the videos and practicing, it took on average 3 min 31 s to record the Dix-Hallpike test (SD = 1 min 25 s) and 3 min 4 s to record the Supine Roll test (SD = 1 min 14 s). Table 1 summarizes the participant responses to the MAUQ. All participants agreed (9 strongly agreed) that they were satisfied with the app. Participants also indicated that the app was easy to use and that it was easy for them to learn to use it. All participants agreed (with seven strongly agreeing) that they would use the app again.
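The timing results above are reported as a mean with a sample standard deviation, formatted in minutes and seconds. As an illustration of that computation, the sketch below summarizes a set of hypothetical per-participant recording times (not the study data) in the same style.

```python
import statistics

# Hypothetical per-participant Dix-Hallpike recording times in seconds;
# these are illustrative values, NOT the study's actual measurements.
dix_hallpike_seconds = [120, 150, 180, 200, 210, 240, 260, 280, 300, 170]

def summarize(times):
    """Return (mean, sample SD) of the timings, formatted as 'X min Y s'."""
    mean = statistics.mean(times)
    sd = statistics.stdev(times)  # sample SD (n - 1 denominator), as typically reported

    def fmt(seconds):
        return f"{int(seconds // 60)} min {round(seconds % 60)} s"

    return fmt(mean), fmt(sd)

mean_str, sd_str = summarize(dix_hallpike_seconds)
print(f"Dix-Hallpike recording time: {mean_str} (SD = {sd_str})")
```

This mirrors how a "3 min 31 s (SD = 1 min 25 s)" style summary is derived from raw per-participant durations.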

Table 1.

Participant responses to the mHealth App Usability Questionnaire (MAUQ)

| Question | Strongly agree | Somewhat agree | Neither agree nor disagree | Somewhat disagree |
| --- | --- | --- | --- | --- |
| The app was easy to use | 10 | | | |
| It was easy for me to learn to use the app | | | | |
| The navigation was consistent when moving between screens | | | | |
| The interface of the app allowed me to use all the functions offered | | | | |
| Whenever I made a mistake using the app, I could recover easily | | | | |
| I like the interface of the app | 10 | | | |
| The information in the app was well organized | | | | |
| The app acknowledged and provided information on the progress of action | | | | |
| I feel comfortable using this app in social settings | | | | |
| The amount of time involved in using this app has been fitting for me | | | | |
| I would use this app again | 7 | 3 | | |
| Overall, I am satisfied with this app | 9 | 1 | | |
| The app would be useful for my health and well-being | | | | |
| The app improved my access to healthcare services | | | | 1 |
| The app helped me manage my health effectively | | | | |
| This app has all the functions and capabilities I expected it to have | | | | |
| I could use the app even when the Internet connection was poor | | | | |

Performance Metrics for the Dix-Hallpike Test

Nine of the participants (90%) completed the Dix-Hallpike test with no major errors, producing recordings that could be reliably evaluated. The most common minor errors were performing the first part of the test more slowly than recommended (8 of 9) and not holding the position for the recommended 30 s (5 of 9). One participant performed the positional test incorrectly but still successfully captured the eye movements. Four of the participants (40%) watched the tutorial more than once before finalizing the recording, and five (50%) made multiple attempts before finalizing the recording.

Performance Metrics for the Supine Roll Test

All participants completed the Supine Roll test with no major errors. The most common minor errors observed by the study team were a lack of smooth rolling motion (5 participants, 50%) and not holding the position for the entire recommended 30 s. Two (20%) of the participants rewatched the tutorial videos before finalizing the tests, and two (20%) made multiple attempts before saving the test. One reattempt was due to an error during the maneuver that the participant noticed and corrected; the other was due to accidentally stopping the recording midway through the test.

Video Quality Assessment

Overall, an expert assessed 42 videos. Both eyes were clearly visible and could be assessed for the presence of abnormal movements in 40 (95%) of the recorded videos. The performance of the test and head motion could also be assessed in all 42 self-recorded videos.

Here, we present a short usability report of our smartphone eye-tracking application, EyePhone, for self-recording positional testing in healthy volunteers. We recruited healthy volunteers in order to focus primarily on factors associated with the app design, the self-recording instructions, and performance, and to control for the adverse effects that symptoms could have on participants' performance.

While video recording of eye movements during positional testing has been studied before [7, 8], to address the lack of a uniform approach to evaluating the feasibility and usability of such methods, we aimed to establish a standard framework for future studies that examine self-recording eye movements. To do so, we utilized both subjective (i.e., questionnaire) and objective (i.e., observation and recorded metrics) measures to evaluate the feasibility of performing positional tests while self-recording eye movements and the usability of our EyePhone application.

In a previous study of video recording of the eyes during positional tests, Shah et al. [7] showed high sensitivity (93%) and specificity (100%) for diagnosing BPPV when experts reviewed the recordings. However, that study did not include self-recording, and the eye movements were not quantified by the phone application [7]. Melliti et al. [8] took a similar approach of self-recording the eyes during positional testing and showed that eye movement abnormalities (i.e., positional nystagmus) can be captured by patients. Nonetheless, that approach poses challenges in addressing the main hurdles to expanding the reach and availability of such methods. One barrier to understanding, and therefore improving, the real-world performance of the app is that when participants are given instructions and advised to perform the tests at home, there is no way to verify that the tests are performed correctly. Similarly, it is essential to track the number of attempts each participant makes and the obstacles that hamper the testing process. We addressed these issues by conducting the first phase of our evaluation in a controlled environment where we could assess adherence to the instructions, accurate performance of the tests, timing, and the number of attempts. This gives us a better understanding of the variables we can change or improve before making the app available for recording in an unobserved environment (e.g., a patient's home).

When assessing the time it took participants to complete the testing (∼17.5 min), it is essential to consider the time spent viewing the instructional videos and on trial runs (∼11 min). Because most of the time is spent learning how to perform the tests and operate the app, repeated use – e.g., by patients who use the app for repeated follow-ups – should shorten this time and speed up the whole testing process. To put these numbers in perspective, consider the alternative: referral to a healthcare setting – either urgent or outpatient – which would, in the best-case scenario, take hours of a patient's time. Moreover, a population-based study by Kerber et al. [9] indicated that testing for and treatment of BPPV are infrequent in the emergency setting, with a high level of inter-provider variability. Studies in outpatient settings have shown an average of 93 weeks between the time patients first experienced symptoms of BPPV and appropriate treatment, often involving visits to multiple providers [10, 11]. A smartphone application would therefore be a much quicker tool for triaging patients with positional symptoms at home.

Overall, the participant-centered metrics we assessed through the MAUQ are encouraging and complement the video quality metrics, indicating that participants viewed their experience with the app favorably. It is important to note that because we recruited healthy volunteers, not all MAUQ questions applied to our study – e.g., the question regarding improving access to healthcare, which one participant disagreed with. Having evaluated the app's usability in a controlled environment, the next step will be to provide these instructions for in-home testing, which has multiple potential benefits for patients with BPPV. First, repositioning maneuvers are not always successful, and patients may remain symptomatic even after successful treatment [12, 13]; a smartphone application that can detect the recurrence of BPPV after treatment would help streamline their care. Also, not all patients presenting with positional vertigo have positional nystagmus when tested in the clinic [3]. Hence, clinicians could provide this app as a means of follow-up and diagnosis based on nystagmus detected by self-testing. The goal of this application is to improve access to diagnostic tools for patients with positional vertigo and to give them more control over the symptoms they experience.

This was a pilot study involving ten healthy volunteers, and as such it has certain limitations. The small sample size, along with the enrollment of a younger and more highly educated participant population, might inflate the favorability and usability metrics of the EyePhone app. By not including symptomatic patients, we did not account for the effects that dizziness might have on the quality of test performance and app usage. We also did not assess the accuracy of the app in detecting abnormal eye movements. Nonetheless, as highlighted above, the primary purpose of this study was to examine the usability of the EyePhone app and the feasibility of self-recording positional tests in a controlled environment, in order to identify any inherent deficits in the app design, the testing instructions, or the combination of recording with performing positional tests that could be addressed before testing the app in a real-world environment. With these baseline metrics from an "ideal" controlled environment, we can move on to assessing the app's usability among patients with positional symptoms in a less controlled setting, e.g., providing instructions for self-recording at home and assessing the quality and diagnostic yield of the recordings.

Prior exposure of the participants to positional testing might be viewed as a potential shortcoming of this study. While we did not ask participants whether their exposure was through experiencing similar symptoms, we confirmed that ample time (at least 1 year) had passed since their last exposure. Moreover, as previous studies have shown a recurrence rate of 26% within 3 years of treatment for BPPV [14], it is reasonable to expect that a portion of real-world patients would have prior exposure to the diagnostic maneuvers.

Another limitation of the study was the use of PowerPoint slides for instructing the participants. While this makes it easy to update the instructions, it also requires a separate device, and when instructions are displayed on a separate screen it is difficult to follow them in real time, creating a short gap between viewing the instructions and performing the tests. Now that we have metrics supporting the usability of our instructions, we aim to address this issue by integrating the instructions into the app, creating a more streamlined experience for users.

This study demonstrates that, with proper instructions, healthy volunteers can use the EyePhone app to accurately self-record their eye movements during positional testing. In addition, our usability questionnaire indicates that participants found the app easy to use. Furthermore, this study lays out a standard framework for future studies that aim to assess self-recording of eye movements by smartphone.

We would like to thank all the volunteers who helped us conduct this study.

This study protocol was reviewed and approved by the Johns Hopkins Institutional Review Board (IRB00258938). All participants were provided with the relevant information and signed a written informed consent before enrollment.

David E. Newman-Toker and Jorge Otero-Millan have a provisional patent application regarding the use of EyePhone in tracking eye and head position. David E. Newman-Toker, Ali Saber Tehrani, Jorge Otero-Millan, Hector Rieiro, and Pouya B. Bastani have a provisional patent application regarding using the EyePhone for recording saccades and smooth pursuit.

This study received no external funding.

P.B.B., D.E.N.-T., H.R., J.O.-M., D.S.Z., and A.S.T. contributed to the study’s conception and design. P.B.B., V.P., J.O.-M., H.R., and A.S.T. contributed to the acquisition and analysis of the data. P.B.B., V.P., and A.S.T. contributed to drafting a significant portion of the manuscript.

The data supporting the findings of this study are not publicly available to protect participant privacy and comply with institutional regulations. However, aggregate de-identified data can be obtained upon reasonable request from the corresponding author (A.S.T.).

1. Kim J-S, Zee DS. Clinical practice. Benign paroxysmal positional vertigo. N Engl J Med. 2014;370(12):1138–47.
2. Mun SB, Kim YJ, Lee JH, Han GC, Cho SH, Jin S, et al. Deep learning-based nystagmus detection for BPPV diagnosis. Sensors. 2024;24(11):3417.
3. Alvarenga GA, Barbosa MA, Porto CC. Benign paroxysmal positional vertigo without nystagmus: diagnosis and treatment. Braz J Otorhinolaryngol. 2011;77(6):799–804.
4. Bastani PB, Rieiro H, Badihian S, Otero-Millan J, Farrell N, Parker M, et al. Quantifying induced nystagmus using a smartphone eye tracking application (EyePhone). J Am Heart Assoc. 2024;13(2):e030927.
5. Barahim Bastani P, Saber Tehrani AS, Badihian S, Rieiro H, Rastall D, Farrell N, et al. Self-recording of eye movements in amyotrophic lateral sclerosis patients using a smartphone eye-tracking app. Digit Biomark. 2024;8(1):111–9.
6. Zhou L, Bao J, Setiawan IMA, Saptono A, Parmanto B. The mHealth app usability questionnaire (MAUQ): development and validation study. JMIR Mhealth Uhealth. 2019;7(4):e11500.
7. Shah MU, Lotterman S, Roberts D, Eisen M. Smartphone telemedical emergency department consults for screening of nonacute dizziness. Laryngoscope. 2019;129(2):466–9.
8. Melliti A, Van De Berg M, Van De Berg R. Capturing nystagmus during vertigo attacks using a smartphone: adherence, characteristics, pearls and pitfalls. J Neurol. 2023;270(12):6044–56.
9. Kerber KA, Burke JF, Skolarus LE, Meurer WJ, Callaghan BC, Brown DL, et al. Use of BPPV processes in emergency department dizziness presentations: a population-based study. Otolaryngol Head Neck Surg. 2013;148(3):425–30.
10. Fife D, FitzGerald JE. Do patients with benign paroxysmal positional vertigo receive prompt treatment? Analysis of waiting times and human and financial costs associated with current practice. Int J Audiol. 2005;44(1):50–7.
11. Wang H, Yu D, Song N, Su K, Yin S. Delayed diagnosis and treatment of benign paroxysmal positional vertigo associated with current practice. Eur Arch Otorhinolaryngol. 2014;271(2):261–4.
12. Kim H-J, Kim J-S, Choi K-D, Choi S-Y, Lee S-H, Jung I, et al. Effect of self-treatment of recurrent benign paroxysmal positional vertigo: a randomized clinical trial. JAMA Neurol. 2023;80(3):244–50.
13. Gordon CR, Gadoth N. Repeated vs single physical maneuver in benign paroxysmal positional vertigo. Acta Neurol Scand. 2004;110(3):166–9.
14. Nunez RA, Cass SP, Furman JM. Short- and long-term outcomes of canalith repositioning for benign paroxysmal positional vertigo. Otolaryngol Head Neck Surg. 2000;122(5):647–52.