[Image: a blue teapot pouring tea into a pink mug]

Illustrations by Alexander Coggin

Improving Shared Decision-Making for Doctors and Patients with Type 2 Diabetes

Jessecae Marsh investigates how people’s beliefs about causal relationships influence their thinking—and how an understanding of these beliefs might lead to better health outcomes.

Story by

Kelly Hochbein

What is making me sick? 

A doctor’s answer to this question depends on the patient’s ailment, and the cause of an illness—the what—often determines its treatment. A bacterial infection might call for an antibiotic; high blood pressure might necessitate a change in diet and increased exercise; seasonal allergies might require medication. In the case of Type 2 diabetes—in which the body becomes resistant to insulin or is unable to produce enough insulin—a doctor might list the patient’s weight, inactivity or genetics as possible causes, and recommend an appropriate treatment. 

Patients with Type 2 diabetes must be heavily involved in and are often burdened by their treatment plans, which involve daily decision-making. Patients must test their blood sugar, adhere to dietary restrictions, and monitor stress and activity levels. Some patients and providers work together to formulate a manageable treatment plan in an approach known as shared decision-making. But if a patient believes his or her illness is caused by something other than what the doctor has identified as the cause, or that a different treatment would be more effective, the patient might resist—or even ignore—the doctor’s advice.

“We all have experience with hearing things in the media about disease and treatment, like ‘drink this tea and it makes your diabetes go away,’” says Jessecae Marsh, an associate professor of psychology. “So if you’ve just been diagnosed with Type 2 diabetes, it’s not the first time you’ve heard of it. You have all this information out there, and there might be some beliefs you’ve formed about it. So now [your doctor is] going to try to give you all this correct information. What’s that going to do with all this information you already have?”

Marsh studies how people’s mental models, or beliefs, about causal relationships influence their thinking, and, in particular, how people reason and make decisions in the health domain. A disconnect between a patient’s mental model of their disease and the treatment a doctor recommends can be an obstacle to shared decision-making, she says.

“Doctors are often giving patients information that has all these causal links in it. And the question is, if you believe something else, what happens when you get this new information?” explains Marsh. “A patient’s mental model of how their disease might work might be very different than how it actually works.” This does not necessarily indicate that the patient is uneducated about the disease; patients have simply accumulated information over time and built belief systems around it, she says.

Marsh, a cognitive psychologist, has partnered with Samantha Kleinberg, assistant professor of computer science at Stevens Institute of Technology, and Onur Asan, associate professor in the School of Systems and Enterprises, also at Stevens Institute of Technology, in an effort to “close the gap from data to decisions,” which could improve shared decision-making for doctors and patients with Type 2 diabetes by better linking evidence to knowledge. 

The goal, says Marsh, is to “figure out a way to, for a given patient, elicit that model of what’s causing their diabetes, figure out what they think causes their diabetes, and then give that information to a doctor, so the doctor can help them make a better decision [in a clinical setting].” 

The project, “Uniting Causal and Mental Models for Shared Decision-Making in Diabetes,” is supported by funding from the National Science Foundation. 

Rather than providing group-level guidelines for patients with Type 2 diabetes, Kleinberg is working to develop machine learning methods to generate personalized causal models for individual patients based on health network data related to insulin dosage, blood sugar levels, activity levels and diet. For example, what causes a particular patient’s blood sugar to go up or down? 

[Image: pile of used teabags]

“If you have a continuous blood sugar monitoring system and we look at what you eat during the day, and every time you eat something, two hours later your blood sugar is out of whack, there’s something going on that you need to know about,” explains Marsh. The idea, she says, is for Kleinberg to mine this personalized data to provide patients with specific causal information about their diabetes.
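As a rough sketch of what that kind of per-patient mining could look like, the snippet below checks whether a patient’s blood sugar tends to run high about two hours after meals, given a log of meal times and continuous glucose readings. The column names, two-hour lag and 30-minute window are illustrative assumptions, not the project’s actual methods.

```python
# Illustrative only: a simple per-patient check for whether blood sugar tends to
# spike about two hours after meals. Column names ('time', 'mg_dl') are assumptions.
import pandas as pd

def post_meal_glucose_effect(glucose: pd.DataFrame, meals: pd.DataFrame,
                             lag_hours: float = 2.0, window_minutes: int = 30) -> float:
    """Return the difference between average glucose in a window ~lag_hours
    after each meal and the patient's overall average glucose."""
    baseline = glucose["mg_dl"].mean()
    post_meal_means = []
    for meal_time in meals["time"]:
        start = meal_time + pd.Timedelta(hours=lag_hours)
        end = start + pd.Timedelta(minutes=window_minutes)
        in_window = glucose[(glucose["time"] >= start) & (glucose["time"] <= end)]
        if not in_window.empty:
            post_meal_means.append(in_window["mg_dl"].mean())
    # A large positive difference suggests meals are driving this patient's spikes.
    return pd.Series(post_meal_means).mean() - baseline
```

Signals like this, computed from a single patient’s data rather than from group-level guidelines, are the kind of input a personalized causal model could build on.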

But what happens when a patient, who may have preexisting beliefs about his or her own health, receives that data-based causal information? The answer, it turns out, might be somewhat unexpected.

Causal Information & Decisions

Prior to beginning work on the Type 2 diabetes project, Marsh and Kleinberg, along with Min Zheng and Jeffrey V. Nickerson, both of Stevens Institute of Technology, conducted a study of how people use causal information to make everyday decisions about diet, health and personal finance. They detail their findings in a paper in the journal Cognitive Research: Principles and Implications.

Algorithms are effective at determining causal relationships, but the key to using them effectively with people is understanding how to marry existing beliefs with new information, and “how and when causal models can help people make decisions,” the team writes.

Through four large-scale online experiments on the Amazon Mechanical Turk platform, the team tested whether causal information improves decision-making; whether the impact of causal information differs for people with or without domain experience; how causal information at decision-time affects decision confidence; and how perceived and actual knowledge affect use of causal information for decision-making. Specific domains included weight management, Type 2 diabetes, and real-world personal finance.

The researchers discovered that when people have experience in a domain or even just believe they know about it, providing them with causal information leads them to make worse decisions with less confidence. Conversely, individuals without experience in a domain made more accurate decisions with more confidence when provided with causal models.

As an example, Marsh says, many variables impact blood sugar levels in patients with diabetes—these variables are considered causal information. In theory, learning about these variables should help patients make decisions. However, she says, “it looks like [the causal information] actually makes people way worse because it conflicts with their own knowledge [about what impacts their blood sugar] ... and they don’t know how to reconcile it.”

The team also attempted to get participants to realize how much they didn’t know about specific domains, then observed whether they could use causal information to help them make a decision. That didn’t work, either. “It just makes them not as bad,” says Marsh.

The authors conclude that “causal models can aid decision-making in unfamiliar situations, yet when individuals have prior beliefs about a domain, causal models can reduce confidence and lead to less accurate decisions.”

[Image: chicken soup, saltines and ginger ale]

Improving Shared Decision-Making

Informed by this work, and now focused specifically on Type 2 diabetes for the current project, Marsh is developing a comprehensive set of online and clinical experiments to elicit and expose gaps in individuals’ causal beliefs. She and her team plan to link those beliefs to inferred causal models in an effort to create shared understanding between clinicians and patients and also reconcile the mental models across individuals to reduce conflict.

“We’re trying to elicit from people how they think their own diabetes works,” she explains. “Then we have the actual model from the real data [mined by Kleinberg]” for comparison.
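One way to picture that comparison step, as a minimal sketch: treat both the patient’s elicited beliefs and the data-derived model as sets of cause-and-effect links, then flag where they agree and diverge. The variable names and links below are invented for illustration; they are not outputs of the project.

```python
# Hypothetical comparison of a patient's stated causal beliefs with links
# inferred from that patient's own data; every link shown here is made up.
patient_beliefs = {("stress", "blood_sugar"), ("sugary_drinks", "blood_sugar")}
inferred_model = {("carbohydrates", "blood_sugar"), ("sugary_drinks", "blood_sugar"),
                  ("inactivity", "blood_sugar")}

shared = patient_beliefs & inferred_model       # beliefs the data also supports
gaps = inferred_model - patient_beliefs         # data-supported links the patient lacks
unsupported = patient_beliefs - inferred_model  # beliefs the data does not support

print("Reinforce:", shared)
print("Explain:", gaps)
print("Discuss:", unsupported)
```

A side-by-side view along these lines is the sort of personalized information the “smart intake form” described below could surface for both doctor and patient.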

The team envisions a “smart intake form” that will gather patient data and the beliefs of the provider and patient, and then personalize information for both to review. “This will lead to more effective appointments and improved clinical decision support, without increasing workload or cognitive burden,” the team writes.

With an eye toward helping patients with Type 2 diabetes understand and navigate their own daily care, the team will then evaluate the methods it develops in both shared and individual health decisions, testing its approaches to improve shared decision-making in a clinical setting under the guidance of Asan, who specializes in patient-centered care and outcomes using health information technologies.

In addition, the researchers hope to reduce treatment disparities by creating training modules to educate clinicians about patient beliefs—and the demographic differences among them—as well as how these beliefs influence trust and decision-making.

Accounting for individual differences can enable more effective predictions and interventions, the researchers argue. The project, they write, “will lead to methods that can automatically learn personalized causal models that are specific to the decision-making situation and individual’s health, and communicated in the context of an individual’s knowledge in diabetic care.”

Better health decisions require accurate information. Finding a middle ground, says Marsh, could help doctors and patients have more effective shared decision-making experiences—and could lead to better health outcomes for patients. This project targets just that: better communication and shared understanding.

“I hope this is going to be a different way of looking at shared decision-making. … We can help both partners understand what one thinks to be true and what the other maybe knows to be true or believes to be true based on science,” Marsh explains.

She recalls drinking ginger ale and eating chicken noodle soup and saltine crackers when she was sick as a child. That experience, she says, influences her beliefs about how to treat an illness.

“Am I going to tell somebody to eat that while they’re sick? Yes, I am. Of course. Why would you have anything else? If I can weave a story of why those things are important—chicken noodle soup actually provides high levels of hydration and ginger can be calming in the stomach—now you can say, all right, this isn’t nonsense. So how do we get other things like causal information to fit?

“If you’re trying to just get rid of these beliefs people have, you’re not ever going to be successful in that,” Marsh says. “You’re trying to give them something else to believe [based on causal information]. And if we could actually interweave it with what they currently believe, then we can give them some more power in making decisions.”

