
Technology-enabled healthcare is all around us. But this modern healthcare is not accessible to all. The digital divide can create barriers to healthcare for those without reliable access to technology, transportation, or information. As part of an ongoing dialogue around health equity by design, Theresa Demeter, Tegria’s managing director of clinical transformation, shares how healthcare IT can help bridge barriers to more equitable, accessible healthcare for all. 

Tegria is a healthcare consulting and technology services company. What role does Tegria play in helping to advance health equity?  

Our role is to maximize technology to improve the delivery of healthcare, and we want to make sure that in delivering that care we're not leaving anybody out. Part of our role is to ensure equitable care is being delivered. Humanizing healthcare is how we like to think about it.

Our commitment in this space is reflected in our partnerships with the HLTH Foundation and with the Office of the National Coordinator for Health Information Technology (ONC), supporting their efforts to ensure that healthcare technology, data, and artificial intelligence are designed with intentionality, in a way that is inclusive of everybody.

In your recent article on healthcare inequities for the HLTH Foundation, you shared that healthcare must work to eliminate systemic health inequities impacting outcomes for some people. Can you give an example of this type of systemic inequity in health?

A clear example of this was the use of a medical device called a pulse oximeter. A pulse oximeter is something that goes on your finger and uses infrared light to measure the oxygenation of your blood. Patients who were sick but not yet sick enough to go into the hospital would be given these remote patient monitoring devices to use at home. When their oxygen level dropped below a certain threshold, that was their trigger to go into the hospital to get care.

The problem with the pulse oximeter is that in its development it was only conceived of and tested on light-skinned people, and it was inaccurate when used on people with darker skin. The consequence of that during Covid was that people with darker skin stayed at home longer, getting sicker. When they got to the hospital, they were sicker, they required more care, they stayed in the hospital longer, and they had worse outcomes. So that's an example of how bias, inequity, and a lack of intentional design in technology can have a clear negative impact on patient care.

How does system design perpetuate inequality in healthcare? Can you give an example of such a system or process?  

Everything we do is done through systems. Systems are components that work together to accomplish an objective. Systems perpetuate inequality by not intentionally considering all the people who will utilize them. So, think about the system of how we make a doctor's appointment. Traditionally, we had to pick up a telephone and probably sit on hold, and we had to call between 8:00 a.m. and 5:00 p.m., Monday through Friday. If you were a person who worked weekends, or you worked all week and only had weekends off, or you only had short fifteen-minute breaks during the day, that system didn't work for you. We've done a lot with technology to improve access. Now I can call a contact center that might be able to make an appointment for me more quickly, or I can do it online or through a patient portal, so that has improved the system.

There are still inequities in that system, and we need to do more, because if you don't have access to the internet, if English isn't your first language, or if you have arthritis and have difficulty using a computer, you may still be left out. Systems should be designed for all the people who will use them.

Another good example is our electronic health record systems, or EHRs. It wasn't long ago that the registration form listed male and female as the only gender options, and while this has mostly been corrected now, it didn't allow clinicians to adequately capture information about their patients. The same challenges exist with regard to race and ethnicity: people of Asian descent are often grouped together in a single bucket called Asian when it comes to population data, and that can impact the type of care they receive.

Can you talk about Health Equity by Design and Techquity, and how these concepts are shaping the future of health equity?  

There is such a proliferation of technology. We need it to take better care of patients, but it can either support or inhibit advancements in health equity. Both health equity by design and techquity seek to ensure that technology doesn't exacerbate existing biases and inequities. The problem we're running into is that we're leaving people behind: technology may work for large groups of people, but if we're not intentional, we're creating a greater digital divide and more problems with access.

How can technology help healthcare organizations screen and identify individuals and families affected by social determinants of health?  

Your social determinants of health are the nonmedical factors that influence your health outcomes. They include the conditions in which you are born, live, learn, work, and play. It's your age. It's your culture. It's your race and your ethnicity. It includes all of those things that make up who you are. Income, in fact, is the single largest determinant of health. People are really only as healthy as they can afford to be. It impacts the food you eat, whether you have insurance coverage, and how difficult it is for you to get to your doctor's office.

Race has a big impact on social determinants of health, and it drives inequities and bias in care. Technology can help us collect data and identify who is at risk of not getting the care they need and having worse outcomes based on their social determinants of health. Once we know who those people are, we can put other systems in place to proactively reach out to them and do better for them.

Artificial intelligence is another tool we're using to help reduce the impact of social determinants of health. Eighty percent of your health comes from outside the four walls of a hospital, so it's a really big piece of how healthy or unhealthy you are. AI is doing a lot to help us across the patient journey, providing information on who is at risk and then making recommendations for interventions.

Can technology help to support equity as people move through their care journey, from first encounters through follow-up care, financial services, and beyond? 

Absolutely, and it really all starts with the data. That's one of the big reasons efforts like techquity and health equity by design are focused on collecting social determinants of health data. It's important that as we collect information, whether through technology or through self-reporting to a provider, we're not introducing bias into that data. We need to be intentional about collecting the information that makes up who a person is: not only their height, weight, and diagnosis, but their full social determinants of health, so that we know who might be at risk and how to reach out, address that risk, keep people well, and treat them when they are sick. Part of it is developing trust.

Can you talk about how trust affects equity in healthcare?

We have a long history of inequity in healthcare, of truly not treating people well. That history has created distrust among many populations that have been underserved and marginalized. It's going to take a lot of work for us to build up trust again.

One of the things people don't trust is technology: giving their information to a computer without knowing what will be done with it, who's going to see it, and where it will go. One of the things we're doing now to mitigate that lack of trust is using healthcare advocates. These are people from that community, from that population, who have been educated about healthcare, the technology, and the benefits of vaccines and preventive care. They go into those communities and build trust, explaining in their language and with cultural sensitivity what to expect, what's going to happen, where their data will go, and how they will be cared for. That really improves that community's ability to trust the technology and trust the provider.

As we know, trust is something that's hard to grow and easy to lose. We have to ensure that the core strategies of healthcare organizations, from workforce to policies to means of treatment to addressing implicit bias, have health equity at their core in order to provide equitable care for everybody.
