Digital health is playing a pivotal role in transforming how care is delivered during this pandemic. While its benefits may be vast, we must pay attention to the “unintended consequences it may introduce based on biased tools that could exacerbate health disparities and jeopardize public trust,” a BMC Medicine article warns.
Digital health represents, in many ways, a radical shift from the traditional delivery of healthcare. It democratizes access while offering a level of ease and convenience that patients are not accustomed to. Yet a critical look at digital health reveals gaps that need to be addressed to ensure it promotes fair provision of healthcare services to the whole population, especially the underserved. As such, proper ethical frameworks in the planning, development and deployment of digital health services are crucial. Key factors to consider when assessing the ethics of digital health services are as follows:
- Accessibility and affordability: In many ways, digital health has improved accessibility, as it enables access to services regardless of location. Nonetheless, it is important to consider patients’ economic ability to afford the equipment, tools and infrastructure needed to access these services, and to ensure patients have the technological literacy to engage with the required tools. For instance, the older population may be limited in its ability to use certain digital health tools. Therefore, access and affordability must be consistently taken into consideration during the planning and implementation of services.
- Appropriateness, effectiveness and safety: Digital health services must meet the needs of the whole population regardless of race, age, gender or any other factor. Recently, biased algorithms that unintentionally discriminate against specific populations have been identified. For example, software sold by Optum, a leading health services company, unintentionally but systematically discriminated against Black people: by relying on past healthcare spending as a proxy for health need, its algorithm consistently lowered chronic health problem risk scores for Black patients, gravely affecting the appropriateness of the care those patients received. Often, the data used for these algorithms are not representative of the diversity within the population. Currently, most genome-wide association studies (GWAS) are of people of European descent, thereby limiting how applicable the findings are to people of other genetic heritages. Beyond diverse representation in data sets, it is also integral to have diverse representation across all governance levels related to the policy, design, planning, implementation and evaluation of digital health services.
- Privacy and security: These are foundational pillars. It is critical that patient data is stored and secured in ways that meet all relevant standards and regulations. It is also imperative that patients provide their consent to how and for what purposes their data will be used. Increasingly, there are debates about data ownership rights as technology spreads. For example, who owns the data collected by wearables such as the Apple Watch and Fitbit – the consumers or the technology companies? These products are classified not as medical-grade devices but as wellness devices, limiting the regulations they must adhere to. Persistent questions about whether companies that collect, store, transfer, share and analyze data actually hold rights to the data they process are fuelled by the lack of a clearly established right to data.
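The algorithmic-bias pattern described above can be made concrete with a minimal sketch. The data, score ranges and function names below are entirely hypothetical, invented for illustration only; the underlying mechanism, however, mirrors what was reported in the Optum case – a model that infers health risk from past healthcare spending will systematically under-score patients who have equal disease burden but less access to (and therefore lower spending on) care:

```python
# Illustrative sketch with hypothetical data: two patients with an
# identical chronic-disease burden but unequal healthcare spending.
patients = [
    {"name": "patient_a", "chronic_conditions": 4, "annual_spend": 12000},
    {"name": "patient_b", "chronic_conditions": 4, "annual_spend": 4000},
]

def cost_proxy_risk_score(patient, max_spend=20000):
    # Biased proxy model: risk is inferred from spending, not health status,
    # so lower access to care reads as lower risk.
    return patient["annual_spend"] / max_spend

def need_based_risk_score(patient, max_conditions=8):
    # Alternative model: risk reflects actual disease burden.
    return patient["chronic_conditions"] / max_conditions

for p in patients:
    print(p["name"],
          round(cost_proxy_risk_score(p), 2),   # 0.6 vs 0.2 – diverges
          round(need_based_risk_score(p), 2))   # 0.5 vs 0.5 – identical
```

The cost-proxy model ranks patient_b as far lower risk despite identical clinical need – exactly the kind of unintended, systematic distortion that ethical review of training targets and proxy variables is meant to catch before deployment.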
Moving forward, it is essential that more comprehensive and progressive laws and regulations are established around data governance and ownership, privacy and security in ways that promote digital health efforts while ensuring that appropriate checks and balances are in place to safeguard the integrity of patient data usage.
As with most things in healthcare, the issues are complex and multilayered. However, all those involved in the delivery of digital health services – whether technology companies, health system administrators, clinicians, policymakers or others – must embed ethical considerations in their work and keep them at the forefront, to ensure the provision of equitable and appropriate digital health services to the population.