Investigating Citizens' Acceptance of Contact Tracing Apps: Quantitative Study of the Role of Trust and Privacy
Title | Investigating Citizens' Acceptance of Contact Tracing Apps: Quantitative Study of the Role of Trust and Privacy |
Publication Type | Journal Article |
Year of Publication | 2024 |
Authors | Fox G, van der Werff L, Rosati P, Lynn T |
Journal | JMIR Mhealth Uhealth |
Volume | 12 |
Pagination | e48700 |
Date Published | Jan |
ISSN | 2291-5222 |
Keywords | adoption, contact tracing, information disclosure, mobile apps, privacy, public health surveillance, trust |
Abstract | Background: The COVID-19 pandemic accelerated the need to understand citizen acceptance of health surveillance technologies such as contact tracing (CT) apps. Indeed, the success of these apps required widespread public acceptance and the alleviation of concerns about privacy, surveillance, and trust. Objective: This study aims to examine the factors that foster a sense of trust and a perception of privacy in CT apps. Our study also investigates how trust and perceived privacy influence citizens' willingness to adopt these apps, disclose personal data to them, and continue using them. Methods: Drawing on privacy calculus and procedural fairness theories, we developed a model of the antecedents and behavioral intentions related to trust and privacy perceptions. We used structural equation modeling to test our hypotheses on a data set collected at 2 time points (before and after the launch of a national CT app). The sample consisted of 405 Irish residents. Results: Trust in CT apps was positively influenced by propensity to trust technology (β=.074; P=.006), perceived need for surveillance (β=.119; P<.001), and perceptions of government motives (β=.671; P<.001) and negatively influenced by perceived invasion (β=−.224; P<.001). Perceived privacy was positively influenced by trust (β=.466; P<.001) and perceived control (β=.451; P<.001) and negatively influenced by perceived invasion (β=−.165; P<.001). Prelaunch adoption intentions were influenced by trust (β=.590; P<.001) and perceived privacy (β=.247; P<.001). Prelaunch intentions to disclose personal data to the app were influenced by trust (β=.215; P<.001) and perceived privacy (β=.208; P<.001) as well as by prelaunch adoption intentions (β=.550; P<.001). Postlaunch intentions to use the app, however, were directly influenced by prelaunch intentions (β=.530; P<.001); trust and perceived privacy had only an indirect influence. Finally, postlaunch intentions to disclose were directly influenced by postlaunch use intentions (β=.665; P<.001) and trust (β=.215; P<.001), whereas perceived privacy had only an indirect influence. The proposed model explained 74.4% of variance in trust, 91% of variance in perceived privacy, 66.6% of variance in prelaunch adoption intentions, 45.9% of variance in postlaunch use intentions, and 83.9% and 79.4% of variance in willingness to disclose before and after the launch, respectively. Conclusions: Positive perceptions of trust and privacy can be fostered through clear communication regarding the need and motives for CT apps, the level of control citizens maintain, and measures to limit invasive data practices. If these positive beliefs are engendered before launch and reinforced after launch, citizens may be more likely to accept and use CT apps. These insights are important for the launch of future apps and technologies that require mass acceptance and information disclosure. |
URL | http://www.ncbi.nlm.nih.gov/pubmed/38085914 |
DOI | 10.2196/48700 |