
Diagnostic accuracy of eHealth literacy measurement tools in older adults: a systematic review

Abstract

Background

In Canada, virtual health care rapidly expanded during the COVID-19 pandemic. There is substantial variability between older adults in terms of digital literacy skills, which precludes equitable participation of some older adults in virtual care. Little is known about how to measure older adults’ electronic health (eHealth) literacy, which could help healthcare providers to support older adults in accessing virtual care. Our study objective was to examine the diagnostic accuracy of eHealth literacy tools in older adults.

Methods

We completed a systematic review examining the validity of eHealth literacy tools compared to a reference standard or another tool. We searched MEDLINE, EMBASE, CENTRAL/CDSR, PsycINFO and grey literature for articles published from inception until January 13, 2021. We included studies where the mean population age was at least 60 years old. Two reviewers independently completed article screening, data abstraction, and risk of bias assessment using the Quality Assessment for Diagnostic Accuracy Studies-2 tool. We implemented the PROGRESS-Plus framework to describe the reporting of social determinants of health.

Results

We identified 14,940 citations and included two studies. Included studies described three methods for assessing eHealth literacy: computer simulation, eHealth Literacy Scale (eHEALS), and Transactional Model of eHealth Literacy (TMeHL). eHEALS correlated moderately with participants’ computer simulation performance (r = 0.34) and TMeHL correlated moderately to highly with eHEALS (r = 0.47–0.66). Using the PROGRESS-Plus framework, we identified shortcomings in the reporting of study participants’ social determinants of health, including social capital and time-dependent relationships.

Conclusions

We found two tools to support clinicians in identifying older adults’ eHealth literacy. However, given the shortcomings highlighted in the validation of eHealth literacy tools in older adults, future primary research describing the diagnostic accuracy of tools for measuring eHealth literacy in this population and how social determinants of health impact the assessment of eHealth literacy is needed to strengthen tool implementation in clinical practice.

Protocol registration

We registered our systematic review of the literature a priori with PROSPERO (CRD42021238365).

Background

Older adults used virtual care 4.6 times more often during the COVID-19 pandemic than before it [1]. Virtual care rapidly expanded in most care sectors in Canada during the pandemic [2]. Despite this rise in virtual care use, older adults participate less in videoconference-based assessments than their younger counterparts and predominantly use telephone as opposed to videoconference-based assessments [3]. Videoconference-based virtual care is uniquely complex in comparison to telephone-based care, as it requires the patient to access and navigate webpages and webcam technology. Patients must possess a level of electronic health (eHealth) literacy to successfully navigate online healthcare videoconferencing platforms and communicate with their physician [4]. Not only are telephone-based assessments suboptimal because clinicians cannot see patients, but there is also greater diagnostic uncertainty associated with telephone-based than with videoconference-based cognitive assessments [4, 5]. Further, inexperience with technology has left many older adults unready to access healthcare via videoconferencing [6]. Reduced use of videoconferencing and barriers associated with its use among older adults suggest a digital divide and uncertainty about how rapidly evolving virtual care practices are addressing older patients’ needs and concerns [6,7,8,9].

To tackle the digital divide, we must be able to assess eHealth literacy. eHealth involves health information and services provided via the Internet and other technologies, including virtual care, forums, electronic health records, and smartphone applications to facilitate healthcare decision-making [10,11,12]. eHealth literacy consists of more than computer literacy because it also incorporates traditional medical and information literacy [13]. Higher eHealth literacy is associated with improved cognitive health and low eHealth literacy is associated with poor medication adherence and increased risk of cardiac events in older adults [14, 15].

Increased uptake of virtual care, and specifically the need to use videoconference-based assessments given our greater certainty in their diagnostic accuracy compared to telephone-based assessments, indicates an urgent need to evaluate eHealth literacy skills [16]. Clinicians and patients are concerned about the accuracy and effectiveness of virtual assessments and online health interventions, and about older adults’ eHealth literacy skills [17,18,19,20,21,22,23,24,25]. An accurate method for assessing eHealth literacy would enable providers to predict whether patients may have difficulty accessing virtual care and to provide appropriate support to help them access it. Given these concerns and the diagnostic uncertainty associated with how to assess older adults’ eHealth literacy skills, we completed a systematic review examining the diagnostic accuracy of eHealth literacy tools in older adults.

Methods

We reported our systematic review as per the Preferred Reporting Items for Systematic reviews and Meta-Analysis of Diagnostic Test Accuracy Studies (PRISMA-DTA) and Synthesis without meta-analysis (SWiM) guidance [26, 27]. This systematic review protocol was registered with PROSPERO (CRD42021238365) [28].

Data sources and search strategy

We searched MEDLINE, EMBASE, CENTRAL/CDSR, and PsycINFO for citations in any language. Our search was conducted from inception until January 13th, 2021. We used controlled vocabulary and keywords related to clusters of terms for eHealth Literacy and Older Adults (details in Supplementary File 1). Grey literature was identified using the Canadian Agency for Drugs and Technologies in Health (CADTH) Grey Matters Guide, following the Grey Literature Checklist and study authors’ content knowledge on July 26th, 2021 (Supplementary File 2) [29]. We searched references of included studies. Our search strategy was created and reviewed by authors (YQH, ZG, JAW) and a librarian experienced in developing systematic review literature searches (JM).

eHealth literacy reference standard

To date, there is no agreed-upon reference standard for measuring eHealth literacy [16, 30]. We considered computer simulation or direct observation of eHealth-related tasks as the reference standard, and we made an a priori decision to include studies comparing two electronic health literacy assessments in older adults. We included articles that described eHealth literacy using either a general or a disease-specific eHealth literacy tool. For articles that met our inclusion criteria but did not report diagnostic accuracy outcomes (e.g., sensitivity, specificity), we emailed authors to ask whether these data were available.

Study selection

All articles with data related to diagnostic accuracy comparing one eHealth literacy tool to a reference standard or another tool, where the mean population age was at least 60 years old, were eligible for inclusion. Upon completing our systematic review, we realized that our initial participant age inclusion criteria (enrolling participants aged 60 and older with a mean age of 65 and older, selected based on the Centers for Disease Control and Prevention definition) were too restrictive, and we revised them to include all studies where the mean population age was 60 years or older [31]. We chose a mean population age of 60 years or older because it is the threshold defined by the United Nations for “older persons” [32]. All abstracts were reviewed independently and in duplicate by four authors (YQH, LL, JAW, ZG), and any abstract included by either reviewer advanced to full-text review. Two authors (YQH, LL) independently reviewed all full-text articles; disagreements were resolved by discussion and, if needed, by a third author (JAW). We calculated Cohen’s kappa coefficient (κ) using SAS University Edition to determine inter-reviewer agreement on article selection [33].
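The authors computed κ in SAS; as an illustrative sketch only (hypothetical screening decisions, not study data), Cohen's κ can be computed from two reviewers' paired include/exclude decisions as observed agreement corrected for the agreement expected by chance:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' paired categorical decisions."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    # Observed agreement: proportion of items where the raters agree
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected (chance) agreement under independence of the two raters
    p_e = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical screening decisions for 10 full-text articles
reviewer_1 = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "exclude", "include", "exclude", "exclude"]
reviewer_2 = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "include", "include", "exclude", "exclude"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))
```

For these hypothetical decisions κ is about 0.78; values in the 0.8 to 1.0 range, such as the κ of 0.89 reported in this review, are conventionally interpreted as excellent agreement.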

Data abstraction and quality assessment

Two reviewers independently (YQH, LL) abstracted data from each included full-text article and appraised the risk of bias using the Quality Assessment for Diagnostic Accuracy Studies-2 (QUADAS-2) tool [34]. Discrepancies were resolved within reviewer pairs and adjudicated by a third reviewer (JAW). We abstracted aggregate-level data from included studies such as name of the first author, study design, year of publication, country where the study was conducted, sample size, study setting (e.g., geriatric medicine clinic, general practitioner clinic), names of tools compared, participant’s primary language, demographic characteristics and experience in Internet use, number of items on each tool, reported cut-offs on tools, and reference standard used for measuring eHealth literacy. We abstracted data as per the PROGRESS-Plus framework, which is suggested by the Cochrane Handbook to assess the inclusion of social determinants of health in included studies [35, 36].

Synthesis

We could not complete a meta-analysis of diagnostic accuracy outcomes because there were too few included studies.

Results

We screened 14,940 titles and abstracts and 99 full-text articles, which resulted in two included studies (365 participants) (Fig. 1). Agreement between reviewers who completed full-text article screening was excellent (κ = 0.89; 95% confidence interval 0.82–0.97). The corresponding author of one included article, Neter et al., provided further data specific to the group of adults who were at least 60 years old [37]. Of the 50 excluded studies, the most frequently used tool was the eHealth literacy scale (eHEALS) (n = 45) [13, 38]. Most articles were excluded because they did not include a comparator group.

Fig. 1 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram

Included articles

Neter et al.

This study enrolled 82 community-dwelling older adults living in Israel; 83% were of Jewish ethnicity (Table 1) [37]. The mean population age was 66.9 years, and the study population was predominantly female (60%), well-educated (72% graduated from high school) and earned above-average income (53%) [37]. Further ethnicity data, religion, occupation, social support, personal characteristics such as frailty or disability, and time-dependent relationships such as hospitalization or respite care were not reported [37]. Fifty-one of the 82 individuals who underwent eHEALS testing and computer simulation were aged 60 years or older, as per subgroup data provided by the study author [37]. The primary study outcome was the correlation between eHEALS score and computer simulation performance [37]. eHEALS is an 8-item self-report tool that assesses an individual’s perceived eHealth skills using a 5-point Likert scale for each question; response options range from “strongly agree” to “strongly disagree” (maximum score of 40) [38]. The cut-off value for eHEALS in this study was defined as the mean score of participants (Table 2) [37]. The computer simulation consisted of 15 tasks to be completed within an allotted time frame, which reflected participants’ operational, formal, information and strategic skills (total time of 108 min) [37]. Each participant received a rating on each task, ranging from “not completed” to “completed independently”, and the amount of assistance required was noted [37]. eHEALS scores (“perceived eHealth literacy”) correlated moderately with computer simulation results (“performed eHealth literacy”), with a correlation coefficient r of 0.34 (p < 0.01) [37].

Table 1 Study characteristics and diagnostic accuracy outcomes
Table 2 eHealth literacy tools

Risk of bias assessment of Neter et al.

There was high risk of bias for patient selection as study recruitment was completed by telephone and 90% of surveyed participants withdrew from the study before participating in the computer simulation component; additional participants were recruited by snowball sampling (Table 3) [37]. The study did not report whether the telephone interviews were standardized, such as through a formalized script or interviewer training, to mitigate bias from the administrator. The risk of bias from reference test (computer simulation) administration was unclear because the training and inter-rater reliability of the reference standard assessors were not reported. Index (eHEALS) and reference (computer simulation) tests had low applicability concerns [37].

Table 3 Quality assessment of diagnostic accuracy studies (QUADAS-2)

Paige et al.

This study included 283 community-dwelling older adults from a university-based research registry in the United States (Table 1) [39]. The mean population age was 64.3 years, and participants were predominantly White (90.1%), female (56.5%), well-educated (95% had an education level of high school and higher) and earned more than $50,000 annually (51.6%) [39]. Authors did not report social determinants of health such as religion, social capital and personal characteristics or time-dependent relationships [39]. The study’s primary outcome was the correlation between TMeHL and eHEALS [39].

TMeHL is an 18-item self-report tool with four to five items under each of functional, communicative, critical and translational literacy [39]. TMeHL uses a 5-point Likert scale for each item (maximum score of 90) [39]. No cut-off value was proposed to identify sufficient eHealth literacy for TMeHL (Table 2) [39]. The internal validity of TMeHL was determined via dimensionality and item analysis. The external validity of TMeHL was assessed through a comparison of scores to eHEALS (with no cut-off value added by the study author), other online health information-seeking styles, and a health literacy tool (the All Aspects of Health Literacy Scale [AAHLS]) [39,40,41,42]. TMeHL had a moderate-to-high positive correlation with eHEALS on all four components of eHealth literacy: functional (r = 0.47; p < 0.01), communicative (r = 0.63; p < 0.01), critical (r = 0.66; p < 0.01), and translational (r = 0.65; p < 0.01) scales [39].

Risk of bias assessment of Paige et al.

There was a high risk of bias related to patient selection and applicability because individuals from a university research registry were recruited into the study via an email survey, which would select participants with higher digital literacy (Table 3) [39]. It was unclear if the index and reference standards were interpreted independently [39]. Unclear risk of bias from flow and timing primarily reflected a lack of reporting of the time between administration of index and reference standards [39]. The eHEALS as a reference standard has a high risk of bias because it has not been externally validated [43]. It was unclear if index test results were interpreted without knowledge of the reference standard [39].

Discussion

We found two tools that may support clinicians in measuring older adults’ eHealth literacy [37, 39]. However, both studies had components of their risk of bias assessments rated as unclear or high risk of bias, and only one study assessed the external validity of eHEALS. Neter et al. found a moderate correlation between eHEALS and computer simulation [37]. TMeHL had a moderate-to-high correlation with eHEALS, but the authors did not compare TMeHL to a reference standard [39]. Further, important social determinants of health such as social capital were not reported, which limits our understanding of how health equity factors influence the diagnostic accuracy of tools measuring eHealth literacy in older adults. Although we highlight limitations in our understanding of how eHEALS and TMeHL can be used to assess older adults’ eHealth literacy, our systematic review is important because it is the first to report on the diagnostic accuracy of tools for measuring eHealth literacy in older adults, and our findings further a timely conversation about how we can equitably support older adults in accessing videoconference-based care, mobile health tools, and other digital health solutions.

eHEALS was the most widely used eHealth literacy tool identified in our systematic review. Studies have validated eHEALS’ internal consistency, but not its external validity [38, 44,45,46]. For example, one study evaluating the internal validity of eHEALS found a Cronbach’s coefficient of 0.94 when the tool was re-administered eight weeks later in a group of educated older adults with high internet use; construct validity was evaluated by relating eHEALS score to individual Internet use, which was gathered via surveys [47]. Studies that validate eHEALS solely through the construct of Internet use cannot capture the full breadth of eHealth literacy, such as the six literacies of the Lily Model [13]. This is further supported by a recent systematic review of studies assessing eHealth literacy tools’ ability to measure the competence areas of eHealth literacy against the European Commission’s Digital Competence (DigComp) framework [48, 49]: eHEALS covered only one of five criteria of the DigComp framework. Nor did we identify, within the two included studies, any eHealth literacy tool evaluated against all three subtypes of validity (that is, content, construct, and criterion), which are imperative to ensure the methodological quality of a tool’s measurement properties [50]. Further, Lee et al. showed that eHEALS had inconsistent low-quality evidence for relevance and insufficient very low-quality evidence for comprehensiveness [51]. On the other hand, eHEALS had moderate to high-quality evidence for structural validity, internal consistency, and measurement invariance [51]. Further research is needed to fill these gaps in our understanding of the validity of eHEALS as a tool for measuring eHealth literacy in older adults.

Not being able to assess older adults’ eHealth literacy represents a critical knowledge gap and barrier to the sustainability of digital health solutions, especially as virtual care is integrated into routine healthcare delivery [7]. Further, there is a burgeoning interest in interventions to improve older adults’ eHealth literacy, especially in terms of technology use and internet and mobile applications; however, how can these interventions be developed and tested if there is no agreed-upon reference standard for assessing eHealth literacy and the diagnostic accuracy of tools for assessing eHealth literacy has not been compared to this reference standard [43, 52,53,54]? Griebel et al. summarized multiple definitions of eHealth literacy and underlined the importance of agreeing on an updated definition of eHealth literacy [55, 56]. Despite global efforts to develop eHealth literacy tools, there is no eHealth literacy tool of reference, even in adults of other age groups (< 65 years of age) [51, 57]. Evidence suggests that tools may be excessively restrictive in scope (disease-specific or not accounting for the rise of social media and mobile web) [51, 57]. The implementation of tools for assessing older adults’ eHealth literacy will be strengthened by further research to standardize the definition of eHealth literacy and understand a tool’s external validity and the influence of social determinants of health (Fig. 2).

Fig. 2 Flow diagram illustrating implementation of eHealth literacy tools included in our review in a geriatric medicine clinic

Our systematic review has limitations. First, we may have missed relevant articles; however, we were inclusive in our database and grey literature searches. Second, there were too few studies to complete a meta-analysis of diagnostic accuracy estimates. Third, included studies had small sample sizes with limited recruitment strategies. Recruited participants were predominantly Israeli and White. As illustrated by our equity analyses following the PROGRESS-Plus framework, the applicability of these two eHealth literacy tools is limited because there was not diverse representation within the small samples [35]. Moreover, both studies’ participants were not patients requiring medical attention or intervention; they were drawn from a national telephone registry or a university-based research registry. Thus, these findings may not be applicable to the clinical setting. Lastly, there was no description of personal characteristics such as cognitive impairment or frailty, among other factors, for participants in included studies; hence, our findings may not be generalizable to a population of older adults attending a geriatric medicine clinic. To overcome this limitation, future validation studies will need to include more diverse populations of older adults seeking medical care and describe the potential impact of geriatric syndromes on the assessment of eHealth literacy.

Conclusions

In conclusion, we completed the first systematic review on the diagnostic accuracy of eHealth literacy tools in older adults. We identified two eHealth literacy tools that were compared to a reference standard or another tool (that is, eHEALS and TMeHL); however, study limitations such as incomplete reporting of diagnostic accuracy measures (e.g., lack of sensitivity or specificity for studied tools) and unclear to high risk of bias across multiple components of each study’s risk of bias assessment preclude us from recommending one tool over another. Future research describing the sensitivity and specificity of tools for measuring eHealth literacy in older adults and how social determinants of health impact the diagnostic accuracy of eHealth literacy tools would strengthen tool implementation in clinical practice.

Availability of data and materials

The data can be found in Table 1. The studies included in our systematic review were published in peer-reviewed manuscripts and available on MEDLINE.

Abbreviations

eHealth: Electronic health

eHEALS: eHealth Literacy Scale

TMeHL: Transactional Model of eHealth Literacy

PRISMA-DTA: Preferred Reporting Items for Systematic reviews and Meta-Analysis of Diagnostic Test Accuracy Studies

SWiM: Synthesis without meta-analysis guidance

CADTH: Canadian Agency for Drugs and Technologies in Health

κ: Cohen’s kappa coefficient

QUADAS-2: Quality Assessment for Diagnostic Accuracy Studies-2

AAHLS: All Aspects of Health Literacy Scale

r: Correlation coefficient

DigComp: European Commission’s Digital Competence

References

  1. Choi NG, DiNitto DM, Marti CN, Choi BY. Telehealth use among older adults during covid-19: associations with sociodemographic and health characteristics, technology device ownership, and technology learning. J Appl Gerontol. 2021;5:07334648211047347.

  2. Virtual care in Canada | CIHI. [cited 2023 Jan 27]. Available from: https://www.cihi.ca/en/virtual-care-in-canada.

  3. Liu L, Goodarzi Z, Jones A, Posno R, Straus SE, Watt JA. Factors associated with virtual care access in older adults: a cross-sectional study. Age Ageing. 2021;50(4):1412–5.

  4. Donaghy E, Atherton H, Hammersley V, McNeilly H, Bikker A, Robbins L, Campbell J, McKinstry B. Acceptability, benefits, and challenges of video consulting: a qualitative study in primary care. Br J Gen Pract. 2019;69(686):e586–94.

  5. Watt JA, Lane NE, Veroniki AA, Vyas MV, Williams C, Ramkissoon N, Thompson Y, Tricco AC, Straus SE, Goodarzi Z. Diagnostic accuracy of virtual cognitive assessment and testing: Systematic review and meta-analysis. J Am Geriatr Soc. 2021;69(6):1429–40.

  6. Lam K, Lu AD, Shi Y, Covinsky KE. Assessing telemedicine unreadiness among older adults in the united states during the COVID-19 pandemic. JAMA Intern Med. 2020;180(10):1389.

  7. Watt JA, Fahim C, Straus SE, Goodarzi Z. Barriers and facilitators to virtual care in a geriatric medicine clinic: a semi-structured interview study of patient, caregiver and healthcare provider perspectives. Age Ageing. 2022;51(1):afab218.

  8. Cuffaro L, Di Lorenzo F, Bonavita S, Tedeschi G, Leocani L, Lavorgna L. Dementia care and COVID-19 pandemic: a necessary digital revolution. Neurol Sci. 2020;41(8):1977–9.

  9. Mann DM, Chen J, Chunara R, Testa PA, Nov O. COVID-19 transforms health care through telemedicine: Evidence from the field. J Am Med Inform Assoc. 2020;27(7):1132–5.

  10. World Health Organization. Global diffusion of eHealth: making universal health coverage achievable. Report of the third global survey on eHealth. Geneva: World Health Organization; 2016. [cited 2021 Sep 1]. Available from: http://apps.who.int/iris/bitstream/handle/10665/252529/9789241511780-eng.pdf.

  11. Eysenbach G. What is e-health? J Med Internet Res. 2001;3(2):e20.

  12. Mosa ASM, Yoo I, Sheets L. A systematic review of healthcare applications for smartphones. BMC Med Inform Decis Mak. 2012;12(1):67.

  13. Norman CD, Skinner HA. eHealth literacy: essential skills for consumer health in a networked world. J Med Internet Res. 2006;8(2):e9.

  14. Li SJ, Yin YT, Cui GH, Xu HL. The associations among health-promoting lifestyle, eHealth literacy, and cognitive health in older Chinese adults: a cross-sectional study. IJERPH. 2020;17(7):2263.

  15. Lin CY, Ganji M, Griffiths MD, Bravell ME, Broström A, Pakpour AH. Mediated effects of insomnia, psychological distress and medication adherence in the association of eHealth literacy and cardiac events among Iranian older patients with heart failure: a longitudinal study. Eur J Cardiovasc Nurs. 2020;19(2):155–64.

  16. Centers for Disease Control and Prevention (CDC). eHealth Literacy. 2021. [cited 2021 Sep 1]. Available from: https://www.cdc.gov/healthliteracy/researchevaluate/eHealth.html.

  17. Ohannessian R, Duong TA, Odone A. Global telemedicine implementation and integration within health systems to fight the COVID-19 pandemic: a call to action. JMIR Public Health Surveill. 2020;6(2):e18810.

  18. Brørs G, Norman CD, Norekvål TM. Accelerated importance of eHealth literacy in the COVID-19 outbreak and beyond. Eur J Cardiovasc Nurs. 2020;19(6):458–61.

  19. Watkins I, Xie B. eHealth Literacy Interventions for Older Adults: A Systematic Review of the Literature. J Med Internet Res. 2014;16(11):e225.

  20. Kim H, Xie B. Health literacy in the eHealth era: a systematic review of the literature. Patient Educ Couns. 2017;100(6):1073–82.

  21. Cheng C, Beauchamp A, Elsworth GR, Osborne RH. Applying the electronic health literacy lens: systematic review of electronic health interventions targeted at socially disadvantaged groups. J Med Internet Res. 2020;22(8):e18476.

  22. Bokolo AJ. Use of telemedicine and virtual care for remote treatment in response to COVID-19 pandemic. J Med Syst. 2020;44(7):132.

  23. Itamura K, Rimell FL, Illing EA, Higgins TS, Ting JY, Lee MK, Wu AW. Assessment of patient experiences in otolaryngology virtual visits during the COVID-19 pandemic. OTO Open. 2020;4(2):2473974X2093357.

  24. Boehm K, Ziewers S, Brandt MP, Sparwasser P, Haack M, Willems F, Thomas A, Dotzauer R, Höfner T, Tsaur I, Haferkamp A, Borgmann H. Telemedicine online visits in urology during the COVID-19 pandemic—potential, risk factors, and patients’ perspective. Eur Urol. 2020;78(1):16–20.

  25. Talevi D, Socci V, Carai M, Carnaghi G, Faleri S, Trebbi E, di Bernardo A, Capelli F, Pacitti F. Mental health outcomes of the CoViD-19 pandemic. Riv Psichiatr. 2020;55(3):137–44.

  26. McInnes MDF, Moher D, Thombs BD, McGrath TA, Bossuyt PM, and the PRISMA-DTA Group, Clifford T, Cohen JF, Deeks JJ, Gatsonis C, Hooft L, Hunt HA, Hyde CJ, Korevaar DA, Leeflang MMG, Macaskill P, Reitsma JB, Rodin R, Rutjes AWS, Salameh JP, Stevens A, Takwoingi Y, Tonelli M, Weeks L, Whiting P, Willis BH. Preferred reporting items for a systematic review and meta-analysis of diagnostic test accuracy studies: the PRISMA-DTA statement. JAMA. 2018;319(4):388.

  27. Campbell M, McKenzie JE, Sowden A, Katikireddi SV, Brennan SE, Ellis S, Hartmann-Boyce J, Ryan R, Shepperd S, Thomas J, Welch V, Thomson H. Synthesis without meta-analysis (SWiM) in systematic reviews: reporting guideline. BMJ. 2020;16:l6890.

  28. International Prospective Register of Systematic Reviews Centre for Reviews and Dissemination. Guidance notes for registering a systematic review protocol with PROSPERO. National Institute for Health Research; 2013. https://www.crd.york.ac.uk/prospero/display_record.php?RecordID=238365.

  29. Grey Matters: a practical tool for searching health-related grey literature. [cited 2021 Jul 26]. Available from: www.cadth.ca/resources/finding-evidence.

  30. World Health Organization. WHO guideline: recommendations on digital interventions for health system strengthening. Geneva: World Health Organization; 2019. [cited 2021 Sep 1]. Available from: https://apps.who.int/iris/bitstream/handle/10665/311980/WHO-RHR-19.10-eng.pdf?ua=1.

  31. Indicator Definitions - Older Adults | CDI | DPH | CDC. 2019 [cited 2023 Feb 23]. Available from: https://www.cdc.gov/cdi/definitions/older-adults.html.

  32. United Nations Department of Economic and Social Affairs. World Population Prospects 2017 - Volume I: Comprehensive tables. United Nations; 2021 [cited 2023 Feb 23]. Available from: https://www.un-ilibrary.org/content/books/9789210001014.

  33. SAS Institute Inc. SAS/IML® 14.1 User’s Guide. Cary, NC: SAS Institute Inc; 2015.

  34. Whiting PF. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529.

  35. O’Neill J, Tabish H, Welch V, Petticrew M, Pottie K, Clarke M, Evans T, Pardo Pardo J, Waters E, White H, Tugwell P. Applying an equity lens to interventions: using PROGRESS ensures consideration of socially stratifying factors to illuminate inequities in health. J Clin Epidemiol. 2014;67(1):56–64.

  36. Welch VA, Petkovic J, Jull J, Hartling L, Klassen T, Kristjansson E, Pardo Pardo J, Petticrew M, Stott DJ, Thomson D, Ueffing E, Williams K, Young C, Tugwell P. Chapter 16: Equity and specific populations. In: Cochrane handbook for systematic reviews of interventions version 6.2 (updated February 2021). Cochrane; 2021. [cited 2022 May 20]. Available from: www.training.cochrane.org/handbook.

  37. Neter E, Brainin E. Perceived and performed eHealth literacy: survey and simulated performance test. JMIR Hum Factors. 2017;4(1):e2.

  38. Norman CD, Skinner HA. eHEALS: The eHealth Literacy Scale. J Med Internet Res. 2006;8(4):e27.

  39. Paige SR, Stellefson M, Krieger JL, Miller MD, Cheong J, Anderson-Lewis C. Transactional eHealth literacy: developing and testing a multi-dimensional instrument. J Health Commun. 2019;24(10):737–48.

  40. Eheman CR, Berkowitz Z, Lee J, Mohile S, Purnell J, Marie Rodriguez E, Roscoe J, Johnson D, Kirshner J, Morrow G. Information-seeking styles among cancer patients before and after treatment by demographics and use of information sources. J Health Commun. 2009;14(5):487–502.

  41. Ramirez A Jr, Walther JB, Burgoon JK, Sunnafrank M. Information-seeking strategies, uncertainty, and computer-mediated communication: toward a conceptual model. Hum Commun Res. 2002;28(2):213–28.

  42. Chinn D, McCarthy C. All Aspects of Health Literacy Scale (AAHLS): developing a tool to measure functional, communicative and critical health literacy in primary healthcare settings. Patient Educ Couns. 2013;90(2):247–53.

  43. van der Vaart R, van Deursen AJ, Drossaert CH, Taal E, van Dijk JA, van de Laar MA. Does the eHealth Literacy Scale (eHEALS) measure what it intends to measure? Validation of a Dutch version of the eHEALS in two adult populations. J Med Internet Res. 2011;13(4):e86.

  44. Watkins I, Xie B. The effects of jigsaw- and constructive controversy-based collaborative learning strategies on older adults’ eHealth literacy. Gerontechnology. 2013;12(1):44–54.

  45. Zibrik L, Khan S, Bangar N, Stacy E, Novak Lauscher H, Ho K. Patient and community centered eHealth: exploring eHealth barriers and facilitators for chronic disease self-management within British Columbia’s immigrant Chinese and Punjabi seniors. Health Policy Technol. 2015;4(4):348–56.

  46. Paige SR, Krieger JL, Stellefson M, Alber JM. eHealth literacy in chronic disease patients: an item response theory analysis of the eHealth literacy scale (eHEALS). Patient Educ Couns. 2017;100(2):320–6.

  47. Chung SY, Nahm ES. Testing Reliability and Validity of the eHealth Literacy Scale (eHEALS) for Older Adults Recruited Online. Comput Inform Nurs. 2015;33(4):150–6.

  48. Oh SS, Kim KA, Kim M, Oh J, Chu SH, Choi J. Measurement of digital literacy among older adults: systematic review. J Med Internet Res. 2021;23(2):e26145.

  49. Brecko BN. DIGCOMP: a Framework for Developing and Understanding Digital Competence in Europe. [cited 2022 May 18]. Available from: https://www.academia.edu/7132885/DIGCOMP_a_Framework_for_Developing_and_Understanding_Digital_Competence_in_Europe.

  50. Mokkink LB, Terwee CB, Patrick DL, Alonso J, Stratford PW, Knol DL, Bouter LM, de Vet HCW. The COSMIN study reached international consensus on taxonomy, terminology, and definitions of measurement properties for health-related patient-reported outcomes. J Clin Epidemiol. 2010;63(7):737–45.

  51. Lee J, Lee EH, Chae D. eHealth literacy instruments: systematic review of measurement properties. J Med Internet Res. 2021;23(11):e30644.

  52. LaMonica HM, English A, Hickie IB, Ip J, Ireland C, West S, Shaw T, Mowszowski L, Glozier N, Duffy S, Gibson AA, Naismith SL. Examining internet and eHealth practices and preferences: survey study of Australian older adults with subjective memory complaints, mild cognitive impairment, or dementia. J Med Internet Res. 2017;19(10):e358.

  53. Hoogland AI, Mansfield J, Lafranchise EA, Bulls HW, Johnstone PA, Jim HSL. eHealth literacy in older adults with cancer. J Geriatr Oncol. 2020;11(6):1020–2.

  54. Arcury TA, Sandberg JC, Melius KP, Quandt SA, Leng X, Latulipe C, Miller DP, Smith DA, Bertoni AG. Older Adult Internet Use and eHealth Literacy. J Appl Gerontol. 2020;39(2):141–50.

  55. Griebel L, Enwald H, Gilstad H, Pohl AL, Moreland J, Sedlmayr M. eHealth literacy research—Quo vadis? Inform Health Soc Care. 2018;43(4):427–42.

  56. Norman C. eHealth literacy 2.0: problems and opportunities with an evolving concept. J Med Internet Res. 2011;13(4):e125.

  57. Tavousi M, Mohammadi S, Sadighi J, Zarei F, Kermani RM, Rostami R, Montazeri A. Measuring health literacy: A systematic review and bibliometric analysis of instruments from 1993 to 2021. PLoS ONE. 2022;17(7):e0271524.

Acknowledgements

We would like to thank Dr. Jessie McGowan for reviewing our literature search strategy.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.

Author information

Authors and Affiliations

Authors

Contributions

YQH, ZG and JAW designed the study. All authors participated in the review process. YQH drafted the first version of the manuscript, and all authors contributed to its revision and approved its submission.

Corresponding author

Correspondence to Jennifer A. Watt.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Huang, Y.Q., Liu, L., Goodarzi, Z. et al. Diagnostic accuracy of eHealth literacy measurement tools in older adults: a systematic review. BMC Geriatr 23, 181 (2023). https://doi.org/10.1186/s12877-023-03899-x
