Dive into the holistic, analytical approach of systems thinking to help understand why certain health care gaps remain and what can be done as individuals and leaders to move toward the system of care we want and need.
I want to introduce Doctor John Lynch, our president of Barnes-Jewish Hospital. This is really a wonderful event, and I'm so pleased you all could join us today. The energy in the room when I walked in was palpable. This is a fun day of learning and sharing experiences as we all pursue excellence and exceptional care for our patients, our families, and our community. Since the inaugural 2011 symposium, and this is our 14th symposium, can you believe that? I remember planning the very first ones all those years ago. We took a few years off in between, one year off early on, and we went virtual during COVID, that's right. So it's really become part of our culture, and we look forward to it every year as we explore a variety of topics, including high reliability principles, safety culture, health care across the continuum, use of the electronic medical record to support quality initiatives, and organizational transformation, to name just a few. This year's symposium theme is Closing the Quality Divide: Providing Health Care That We Want, Need, and Deserve. We look forward to learning from each of the speakers, and it's a terrific lineup today, as they share their wisdom, experiences, and expertise and move us forward on our journey. Today we will reflect on the 2001 Crossing the Quality Chasm report, its lessons learned, and considerations for the future; learn how to act like a child by exploring lessons in pediatric patient safety, and there are those in this room who would say I do that very well (Doctor Henderson's not here; she certainly would); learn about artificial intelligence tools available in the electronic medical record to support clinical processes; identify opportunities to improve diagnostic excellence; and cover a variety of projects that are improving patient care, safety, and quality. I would like to thank all our presenters today and all of our symposium planning committee members. If you're a volunteer or on the committee, would you please stand for a quick round of applause. Thank you. As you can tell, the symposium is really well organized, and the committee will get one weekend off before they begin to plan for next year. So thank you all. I'd also like to thank our colleagues from Barnes-Jewish Hospital, WashU School of Medicine, BJC, Saint Louis Children's Hospital, and the Foundation for Barnes-Jewish Hospital for supporting the symposium, and thank you for attending and participating today. I'm sure you'll enjoy the day. Before we introduce our first speaker, please make sure you scan the Poll Everywhere QR code; you'll see it on the screen and at the end of each row. That will be the way, and the only way, we provide feedback throughout the symposium. With all that, it's my great pleasure to introduce our first speaker, Doctor Clay Dunagan. Of course, he's the namesake for our symposium. After a distinguished career of more than a quarter century at BJC, Doctor Dunagan now provides leadership training across the School of Medicine. And he holds a couple of firsts here.
Doctor Dunagan was our first chief quality officer at BJC, and I remember those early days of collaboration when I was in the Department of Medicine. For his last seven years at BJC, he was the senior vice president and our first chief clinical officer. During his tenure as chief clinical officer, Clay directed our COVID-19 pandemic response and also served as BJC's representative to the region's COVID advisory committee. Much of his career, of course, has been involved with developing and implementing initiatives that enhance patient care, safety, and quality. OK, so that's all the formal stuff. If your daily work involves patient safety, quality, or patient experience, would you raise your hand? OK. If you are participating in tiered huddles, raise your hand. If you observe lean daily management principles every day, or teach them, raise your hand. If you're a systems thinker, raise your hand. If you're involved in change management every day, raise your hand. If you believe in metric-driven improvement activities, raise your hand. Well, welcome. You are all part of the genealogy of Doctor Clay Dunagan at BJC Healthcare, and you are part of his family. So I'd like to welcome to the stage my dear friend and colleague, Doctor Clay Dunagan. Thank you. Thank you. Let's see if I'm live here. Thank you, John. That was very nice. Good morning. Nice to see all of you here. Thank you for coming. It's really a very deep honor to have this great day named in my honor. I came last year to make sure people knew it was in my honor and not my memory. I was asked to speak this year, and we'll see if it's still in my honor when I'm done. What I'm going to do today is talk about Crossing the Quality Chasm, which was a seminal moment in health care, not so much as a historical document but as a diagnosis of design issues that we still have to wrestle with. First, I have to disclose that I have no relevant financial interests. It did occur to me that at the age of 71, this may be a little disingenuous when talking about health care quality: I do have financial and other interests that are at stake here. I just don't have any conflicts that should preclude, hopefully, an unbiased talk. Here are the educational objectives. We'll talk about the report and what its impact was. We're going to talk about what's happened nationally to quality and safety since that report and some of the challenges we have, and I'd like to do that through the lens of systems thinking, because I think that is a key to getting better at what we do. And finally we'll talk a little bit about the actions that all of you, all of us, can take as individuals and leaders to try to help close the remaining gaps. So let me take you back to 1999. For some of us, this seems like just yesterday; some of you may have been in training; and for some people this is actually prehistory, I realize that. We didn't have smartphones. We had these cute little flip phones, and you were very privileged to have one. Almost all the urgent communication came across a pager like this one. They're still around, but they were much more important and impactful then. GPS was just starting to make the rounds; most people didn't have a device.
Some people still used the map from the gas station, and a lot of people printed out these MapQuest instructions; you can see there's a conveniently located advertisement to get ink for your printer after you printed them out. Computer technology, of course, was way better than what had preceded it, but still far short of the computing power we have today. And at that point medical records were still largely on paper. There were a few leading institutions that had digitized the record, at great local expense, through heroic homegrown initiatives, but by and large this is the way records were kept, and in a big hospital like Barnes-Jewish, that's what the file room for medical records looked like. And there was a quiet but pervasive notion in healthcare that things were pretty good: the system worked pretty well, medical error might occur, but it was largely unavoidable; there were complexities about patients. Show of hands, who knows who this is? Somebody say it. Marcus Welby. Now admittedly, Marcus Welby was a product of the 70s, not the 90s, but baby boomers sort of grew up with Marcus Welby as emblematic of what it meant to be good in health care, and in the 90s baby boomers were starting to get to the age where we were big consumers. He was the prototype, right? A soft-spoken, thoughtful, compassionate physician. He made house calls. He mentored his protege. He administered wisdom and medications, and you were in good hands. That's sort of how most of us thought of health care: we thought if we had a good, compassionate family physician, the rest of it just took care of itself. And then things changed. But to tell that story I want to take you back a little bit further in history, to 1863, when one of our truly great presidents, Abraham Lincoln, signed into law the creation of the National Academy of Sciences. The National Academy was a recognition that the government of the United States needed credible, independent, objective advice about science in order to make effective policy. That system persisted; in 1964, I think it was, they added engineering, and then in 1970, with growing recognition of the challenges of health care, the Institute of Medicine was formed. The Institute of Medicine very quickly started getting interested in quality issues, and ultimately that led to the creation of the Crossing the Quality Chasm report that we're going to talk about today. Now, that wasn't the first gambit by the Institute of Medicine. They published a report in 1990 on Medicare quality; not much press about that. It was a good report, but it was pretty arcane and people didn't see it. In 1998, even more alarmed, they published a report on the urgent need to improve health care quality in the United States, and likewise that report didn't really go very far; people didn't take much notice of it. So what was going on? There were signals that the Institute of Medicine and other experts saw that hadn't yet captured public attention. What did those signals look like? Well, here's one. This is a page from the Dartmouth Atlas. Maybe some of you remember the Dartmouth Atlas project; I heard recently that it's going to be revived. It took a look at the country and divided it up into geographic regions that were roughly based on how patients flow.
So within a region, most patients would stay within that cluster, and this map shows the intensity of utilization of different services for Medicare enrollees. The darker the color, the higher the intensity. The dot plot on the right shows the frequency distribution of the number of bypass procedures per 1,000 Medicare enrollees across the country, ranging from a low of about 2 per 1,000 up to nearly 9. That's an almost 5-fold difference, something that can't be explained at all by patient characteristics. And in fact, when you did the correlations, the analysis, the single biggest driver of those numbers was the number of practitioners with that skill in that region. There were lots of maps like this for all manner of diseases, and the same patterns were observed. The implication is that very likely the right answer is somewhere in the middle, and that some large proportion of Medicare enrollees were receiving care where the expected benefit wouldn't exceed the expected harm: clear overuse of resources. There was also data about underuse. This study, published a bit later, in 2003, actually used data from the late 1990s, and it typified a lot of studies showing that even where we knew evidence-based practices for preventive, acute, or chronic care, we weren't very good at deploying them. In Elizabeth McGlynn's study, it was about half the time that we would get that right. So, overuse and underuse. And perhaps most striking and disturbing was this paper from 1991. The author of the first paper listed on the left, under New York, was Troyen Brennan, who incidentally is from Saint Louis and a very nice guy, an MD-JD who wound up a professor at Harvard. He did a project really looking at malpractice, which was the primary goal, but they reviewed hospital charts across the state of New York from 1984. One of the startling findings was that nearly 4% of patients had evidence of some sort of unintended harm, that the majority of that unintended harm was preventable, and that it often led to either permanent disability or death. The study was repeated a few years later in Western states to see if maybe this was a geographic anomaly. The numbers are a little different, but they're in the same order of magnitude. These are very provocative numbers, but this study too went sort of under the radar. So we had the signals; we just hadn't put them together yet into a coherent picture that would make us take note. And then things changed. The Institute of Medicine, frustrated by what had happened before, convened a national committee on the quality of health care in America, and one of the first things it did was set to work assimilating all the information and pulling it together into a coherent picture. And they published two reports; I've listed them here together. My assignment was to talk about the one on your right, Crossing the Quality Chasm, but essentially they developed these simultaneously and could have released them in either order. They chose to release To Err Is Human first, because it had the headline: 44,000 to 98,000 Americans dying each year from medical error. That frequently got translated in the lay press into "this is the equivalent of a fully loaded 747 crashing every day."
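The arithmetic behind both of those headlines is easy to check. Here is a minimal Python sketch, using only the figures just quoted; the jumbo-jet seat count is an illustrative assumption, not a number from the report:

```python
# Back-of-the-envelope check of the two figures quoted above: the roughly
# 5-fold regional variation in bypass rates, and the "jumbo jet a day"
# translation of the To Err Is Human mortality estimate.

low_rate, high_rate = 2, 9  # bypass procedures per 1,000 Medicare enrollees
print(f"Regional variation: {high_rate / low_rate:.1f}-fold")  # about 4.5-fold

for deaths_per_year in (44_000, 98_000):  # IOM estimate, deaths/year from error
    print(f"{deaths_per_year:,} per year is about {deaths_per_year / 365:.0f} per day")
# 44,000/yr is ~121/day and 98,000/yr is ~268/day, on the order of one
# fully loaded jumbo jet (a few hundred seats, assumed here) every day.
```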
That was a number that people really had trouble ignoring. It put patient safety into the mainstream. It made it something people would actually talk about, and certainly the professions and health care organizations took note. It wasn't long after that that Crossing the Quality Chasm came out. Crossing the Quality Chasm said: you know what, it's not just about safety. We have lots of other quality problems. We simply have a system that can't reliably deliver the care that we would like and that we deserve. Further, both of these reports said this is not a problem with individuals. It's not that we don't have excellent providers and excellent people supporting them. This is a systems problem, and it can't be fixed by just trying harder. That's not going to save the day. So the Crossing the Quality Chasm report has a number of enduring legacies. One is the idea of the six aims of quality: safety, patient-centeredness, and so on. You can go around the rim and see them all, things you're probably familiar with. You've probably heard this because it's something that has lasted. The report also had 10 rules for what they called the new health system. We're not going to go through those because, frankly, if you read them you would wonder why we would ever need a rule that care should be transparent to the recipients, that safety should be a priority, or that we should use evidence-based practices. The rules themselves were of course totally accurate, and at this day and age not very provocative. But there were 13 additional recommendations, and this was important, because the reports before, which said how terrible things were and how many problems there were to fix, really didn't include a recipe for where to go. This report laid out, in a very systematic fashion, 13 things that the system ought to do, and we'll come back to those in just a minute. Did it make a difference? Well, there are a couple of graphs we're going to look at in a second, but it did capture people's attention. The narrative was powerful, right? A jumbo jet crashing every day, and that's just the tip of the iceberg. That got people's attention: policymakers, government, providers, academicians, you name it. Here's what happened in the academic world. The graph on your left shows how publications, whether letters to the editor, editorials, or original research, really blossomed after the To Err Is Human and Crossing the Quality Chasm reports, which are the spike you see there. The one on the right shows, in the open circles, research awards for patient safety and, in the dark circles, publications of original research. So we know for sure that the academic community saw this message and started thinking about it. What was it that really grabbed people's attention? Well, one, there was a big headline. Two, the committee the Institute of Medicine pulled together to write these reports was a blue-ribbon committee headed by William Richardson, with a lot of notable names you would know, Don Berwick and Lucian Leape and others, so it had real credibility and authority. It also had, as I mentioned, a plan: here are things we can do that would make things better. And it had multi-sector reach. It was really targeting the public, targeting the professions, targeting legislators and policy makers.
So, a lot of activity, a lot of noise, a lot of people taking note. The question is, what did it do? These are the 13 recommendations. Now, it would make for a pretty boring talk if I were to go through each one of these and what happened, so I'm not going to do that. But I did put them on a 0-to-5 scale, 0 being nothing happened and 5 being we've gotten where we need to be, and you can see in a few areas we've really done pretty well. We did adopt the six aims; that's what the quality conversations are about now. There is national healthcare quality and disparities monitoring, and we'll talk about data from one of those reports in a minute. We've built out an infrastructure; I'm not going to argue that the electronic health record is perfect, but we did digitize the patient record and at least set a foundation for what's going to come next. And of course there's been a lot of experimentation with payment reform. There's a whole host of things below that that are important but where deployment has been more erratic and uneven; we haven't gotten as far along, so there's been moderate progress. And there are a couple of things where cross-sector work, legislative, legal, and regulatory accountability, needs to change a bit before we can make progress. So, great that there are no zeros or ones; too bad that there are no fives; clearly some work to be done. OK, so we know people took note. We know there were actions taken. The infrastructure was built. That raises the question of what we actually see. I'm going to use one example just to illustrate the point I just made. This is what the adoption of EHRs looked like: in 2000, when these reports were published, just a few percent of patients in hospitals had electronic health records. Then, over the next decade and a half, a number of policy steps were taken, ending in meaningful use and HITECH, which really put us over the tipping point, and now virtually every hospitalized patient has an electronic health record and most doctors' offices are electronic in some way. But what did that do to actual quality of care? Here's an example from cardiology. It was known for a long time that the biggest problem with a heart attack is damaged heart muscle: it leads to heart failure and dysrhythmias, and it can be fatal. In the 90s, people started developing good technology to open coronary arteries after a heart attack. That was mind-blowing at the time; it was really amazing. First pharmacologic, and then catheters actually going in and opening the blockage. It also became apparent that the quicker you did that, the better. That's where the old adage "time is muscle" came from. By the turn of the century we knew that we had a short window for meaningful preservation of myocardium, and so the standard was to try to get it done within two hours. You can see that for BJC hospitals in 2002, when we started gearing up for public reporting, we were getting it right about 40% of the time. So more than half the time, patients with acute MIs were not getting their arteries opened in time, and a lot of those misses were substantial. Sometimes people came in before the weekend and didn't get anything done until the next week, way beyond any window where we could preserve myocardium. It took hard work, systematic efforts to redesign processes, and a lot of gnashing of teeth as people's roles and responsibilities were changed.
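To make the metric itself concrete, here is a minimal sketch of how a compliance number like that 40% gets computed: the share of cases whose door-to-balloon time falls within a target window. The case records and the 90-minute target below are hypothetical illustrations, not BJC data:

```python
from datetime import datetime, timedelta

# Share of acute MI cases whose door-to-balloon time meets a target window.
TARGET = timedelta(minutes=90)

cases = [  # (arrival time, artery-open time), hypothetical examples
    (datetime(2002, 3, 1, 8, 0),   datetime(2002, 3, 1, 9, 10)),   # 70 min: hit
    (datetime(2002, 3, 1, 22, 5),  datetime(2002, 3, 2, 1, 40)),   # 215 min: miss
    (datetime(2002, 3, 2, 14, 30), datetime(2002, 3, 2, 15, 55)),  # 85 min: hit
]

hits = sum((opened - arrived) <= TARGET for arrived, opened in cases)
print(f"Door-to-balloon within {TARGET.seconds // 60} min: "
      f"{hits}/{len(cases)} ({100 * hits / len(cases):.0f}%)")
```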
But we got to a different place, and now, if someone shows up with a heart attack, it is very unusual for us not to get the artery open within 90 minutes. That's a heroic change, to go from 40% to very close to 100%, and now the misses are usually off by just minutes, or they're the result of some reason we simply can't do it in a particular patient. That's really impressive work. Think about going from 40% to nearly 100%. And it wasn't just reperfusion. This is a whole family of process-of-care metrics that we studied and followed at the time. Now, some of these are probably more reflective of better documentation, although that's a good in its own right, but many of them represent real process changes and real improvements in care. So processes got better. What else happened? This is also heroic work. This is from the BJC/WashU infection prevention consortium led by Vicky Fraser and Denise Murphy. The two of them actually helped set standards for the whole country; the BJC infection prevention manual became a standard that was widely used as a model. This is looking at central-line-associated bloodstream infections over a little more than a decade, and the consortium, under Vicky and Denise's direction, drove rates down tenfold. Again, that's sort of astonishing. Think about the problems you may have worked on, and think about one of them being reduced tenfold; that is really impressive. And those lessons got applied to other areas. This is preventable harm data from the second decade of the 2000s, where we took those lessons learned and applied them to all manner of patient harm: dramatic reductions in pressure ulcers, falls with injury, and adverse drug events, a 75% reduction. And in fact, BJC Healthcare won an award for this work, for regional excellence in patient safety. Now, I'd love to tell you that we were the pioneers and did this better than anybody else, but virtually any good hospital in this country can show you graphics that look like this: spectacular improvement in processes of care, great reductions in harm. It doesn't mean there's not a lot left to do, and our next speaker will talk a lot about that, but we have made extraordinary progress. It would make for a really short talk if I could just say we did it for everything else and we're well on the way. Alas, that's not the case. Safety: we've made moderate progress. A lot of things have been improved; it's much safer to be a hospitalized patient now than it used to be. Effectiveness: we've made some gains, but it's uneven. We haven't even really started the conversation about diagnostic error; we don't even know how to talk about it, and it actually may be bigger than all the buckets we've tackled so far. Hopefully artificial intelligence may give us an opportunity to tackle that one in a way we haven't been able to. Patient-centeredness: things are better, and patient satisfaction scores are going up, but it's hard to argue that care isn't still fragmented or that there aren't significant headwinds for people trying to navigate the system. Timeliness and efficiency: not so good, efficiency in particular. We spend a lot of money in this country on health care and we don't get a huge bang for it. And finally, equity remains a stubborn problem.
The only major change there has been the Affordable Care Act, which really extended the opportunity for care to large segments of our population, but by and large equity remains our Achilles' heel. We'll see a little more about that in a minute. So the question is why. If we understand the issues, if we have made great strides in processes of care and mitigated safety events, why aren't we doing better? I know many of you know this joke about the hapless dude looking for his car keys under the street lamp, even though he lost them in the park, just because the light's better there. In some ways that's a metaphor. A lot of us tended to think of the Crossing the Quality Chasm report as a criticism of hospitals, or sometimes hospitals and providers. We had a very narrow focus; some of us even thought it was really just a safety report. But that's not what it was. It wasn't a critique of hospitals or individual providers, and it wasn't a narrow safety document. It was really about the larger health system, and the call was to redesign care, not to work around the edges improving what was there, with the recognition that structure drives performance, and that this would be among the many things we need to do to improve the determinants of health in our country. So it was a clear message that we had system changes to make, and that we couldn't do it through effort alone. OK, that leads us to the next part of this talk, which is about systems. Now, here's another story that probably many of you know. Who knows the parable of the blind men feeling the elephant? Just a show of hands. A few of you, OK. While this is a systems idea, it isn't a very new idea; the parable goes back at least to 500 BC in Buddhist texts. It begins with a group of blind men encountering an elephant for the first time, each feeling it and trying to describe what he's feeling. In the darkest versions of the tale they come to blows and actually kill each other, but setting aside that grim ending, the message is that you can't understand a system by looking at just one part of it. It's not that looking at parts of systems isn't important, because there is real hay to be made when you work on some piece of a system. But when you're looking at the piece, you're not looking at a small version of the system; you're looking at a component, and you're not seeing anything about the way those pieces interact. So what are some corollaries of that? Well, first of all, you can't build a great system just by building great parts, and here are four wonderful examples. The upper left is the US men's Olympic basketball team from 2004, heavily favored to win yet another gold medal for the US, loaded with NBA stars and recently graduated college stars on their way to fabulous careers. And yet they didn't win a gold medal; they barely won a bronze, and that was hard work for them. A system that clearly underperformed its parts. On the right at the top is the Challenger space shuttle, a product of NASA engineering, an agency famously known for precision and for pulling off incredible feats. No expense was spared in putting the components together, and yet failures in NASA's decision-making body doomed this flight. We'll come back to it in a minute.
On the lower left is the 737 MAX. Remember this story? Newly designed, with extra-powerful engines, a new automated flight-control system, new avionics. This was going to be the workhorse aircraft and was going to perform like a dream. And yet two flew straight into the ground, because how the systems interacted hadn't been fully appreciated and the pilots didn't know how to disengage the automation when it took over. And then finally, on the lower right, the Titanic: a marvel of engineering in its own time, so well built that they didn't even put enough lifeboats on it for the passengers. So great parts don't make great systems. You have to understand the system as a whole, and to do that we'll use the tools of systems thinking. Now, most of us live at the top of this pyramid. We live our lives, we see events, and we take actions. And actually, as a country and as individuals, we celebrate the people who take prompt action in response to the events we see. That's just the way life is; that's the way we are wired evolutionarily. But a lot of us have had the privilege of thinking about the patterns and trends underneath, and that's great too, because then you can start to say: if this happens, this patient, this group is at risk, and we can do something so that things may turn out differently, mitigating what might happen downstream by seeing the patterns, exposing them, and taking action. But those patterns and trends are all driven by the systems, structures, and drivers underneath. The way things are designed sets those patterns in motion and creates the events. And even more importantly, and more profoundly, it's the predominant social attitudes, the paradigms, the mental models, the worldviews at the very bottom that determine what systems we build. We have a very expensive healthcare system not because we're reacting to health care events at the top, and not because of trends in our population. We have it because we have a fragmented way of funding health care that's administratively very inefficient and creates disincentives; it costs way too much to get care in the US, and it's driven by the way we've set things up. One thing I've sort of already said: the top is visible, the bottom is rarely seen. The other thing about systems is the importance of unintended consequences. When we fail to see a system, sometimes we don't think about what's going to happen downstream, and I'd like to give you an example some of you may be familiar with: the Hospital Readmissions Reduction Program. How many people have had anything to do with that, have worked on projects? There's been great work on this. It's really been about finding some real holes in our care delivery system and repairing them. But it's a more complicated story than that. At the beginning, the idea was: hospitals are taking care of people, doing these common things, treating heart attacks, lung disease, heart failure, pneumonia, joint replacements, and so on, and when they get done with their work, we should expect those patients to be able to stay at home, right? That makes sense. You've taken care of the problem, and they shouldn't have to come back to the hospital. And yet 1 in 5 was coming back for additional care within 30 days.
So Congress, with the Centers for Medicare and Medicaid Services, set up this readmission reduction penalty program, which said that if your readmission rate was worse than average, you got a penalty. Now, you probably all recognize the Lake Wobegon problem: not all hospitals can be above average. But be that as it may, this was thought to be an appropriate incentive to get people to move, and it's real money. It's up to 3% of Medicare reimbursement, which for a system like BJC is tens of millions of dollars, and can be a substantial impact on the hospital's operating margin. So this policy had real teeth in it. What happened? Well, this is a graphic ballyhooing the success of the program. You can see the first inflection point is when the announcement of the program was made and institutions got busy trying to figure out how to solve this problem. Then the program was actually implemented; that's the date the penalties started to be assessed, and you can see further reductions. The top line, by the way, is the targeted conditions, the ones I listed a moment ago, and the red line is other conditions, so there was a spillover effect. This looks great, right? Mission accomplished. Maybe the penalty program isn't set up the most intelligently, penalizing "above average," but nevertheless we made progress. Let's take a closer look. This is the period after the penalties themselves actually started. The yellow graph on the bottom is single readmissions; the purple is all readmissions, since some people had more than one, and you can see that same downward slope, statistically significant improvements. Well, let's put it in context. I'm going to show you three other graphics. Starting at the top right, you can see emergency department treat-and-discharge visits; on the lower left, observation stays. The difference between an observation stay and an inpatient admission is two midnights, right? The two-midnight rule; I know many of you know that. And if you look at all hospital revisits in the upper left corner, you see that after the penalties started, the total number of re-contacts with the health system went up. So this program wasn't doing exactly what people thought. That doesn't mean there weren't some real, important problems being fixed; don't get me wrong, there's been some really heroic work done. But there are some indications here that people are working against a system that isn't modified by the changes we make in the hospital. What are the explanations? This is a hot-spot map for readmissions in the Saint Louis area. I'm sorry, it's just Missouri data; that's what we had access to. It's sort of self-explanatory: red means a high readmission rate, blue-green means a low readmission rate. You can see readmission is pretty concentrated in the city and North County. And this is the poverty map for Saint Louis. There is statistically no difference between these two maps. There are some subtle differences, mainly in low-population areas, but further work we did using statewide data showed that almost all the variability in hospitals' readmission rates can be explained by patients' sociodemographic factors. We're working in a system that's going to have a great deal of trouble overcoming structural differences. And this gets back to unintended consequences.
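Before we look at who actually pays, here is a minimal sketch of the penalty mechanics as just described: a rate worse than average draws a penalty, capped at 3% of Medicare reimbursement. The real CMS formula uses risk-adjusted excess readmission ratios rather than a raw comparison to the mean, and every hospital and figure below is hypothetical:

```python
# Simplified "worse than average means a penalty, capped at 3%" logic.
MAX_PENALTY = 0.03  # cap: 3% of Medicare reimbursement

hospitals = {  # name: (30-day readmission rate, annual Medicare revenue in $)
    "Hospital A": (0.17, 400_000_000),
    "Hospital B": (0.23, 250_000_000),
    "Hospital C": (0.20, 600_000_000),
}

average = sum(rate for rate, _ in hospitals.values()) / len(hospitals)

for name, (rate, revenue) in hospitals.items():
    # Scale the penalty with distance above the average, capped at 3%;
    # a crude proxy for CMS's payment-adjustment factor.
    penalty = min(MAX_PENALTY, max(0.0, rate - average))
    print(f"{name}: rate {rate:.0%} (avg {average:.0%}), "
          f"penalty {penalty:.1%} = ${penalty * revenue:,.0f}")
```

Even this toy version shows how quickly a few percentage points above the mean turn into tens of millions of dollars for a large system.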
Look who gets penalized under the readmissions reduction program: big hospitals, teaching hospitals, safety-net hospitals. The hospitals that take care of patients with more challenges from an economic perspective are the ones paying the bulk of the penalties in this program. So yes, we've made progress. Yes, we fixed some systems. But it's not because the penalties are doing what they're supposed to do; in fact, the penalties are working against the improvements we need. OK, with that lens, let's step back and look at the aggregate US health care system. This is data from the Commonwealth Fund, which issues regular reports comparing the health systems of 10 wealthy nations. You can see the overall rankings and the rankings in five categories: access, care process, administrative efficiency, equity, and health outcomes, with 1 being best and 10 being worst. I'll just draw your attention to the right. On care process, we did it: we're good, actually very near the top. But in the others we're laggards, and in some cases we are spectacularly far behind our peers. Here's another graphic that's sort of disappointing. This is life expectancy on the vertical axis versus how much we spend on healthcare on the horizontal axis. It's from the Organization for Economic Cooperation and Development, some 38 nations worldwide who collaborate. I've highlighted a few that are doing really well in life expectancy, just so you can see where some of our peers are. Each country's line starts, for most, in 1970, and as you move through time it moves up and to the right; inflation helps keep it moving in one direction, and you can see the cute little curlicue for COVID at the end of all these lines. But the United States is a standout in two ways. We spend a lot more on health care than most nations. Now, there's more to that story that we don't have time to get into, about the combination of health and social service spending, but be that as it may, we spend a lot on health care. We do underwrite profitability for a lot of healthcare industries: we have less than 4% of the people in the world, yet 75% of the profitability in the pharmaceutical industry is from American sales. So some of this is largesse, the way we support other countries' medical technology. But there is still this striking disparity, and the life expectancy part is a little more disconcerting. Our life expectancy at birth in this country is below 80 years, and in the best countries it's pushing 85 years: a big gap. And we're not monolithic in that way. This little inset shows a really disturbing factor, and you've probably seen information like this before. If you're in the highest income quintile, you have a life expectancy at birth very similar to the best European and Asian nations and Australia. If you're in the lowest, you have at least a 10-year shorter life expectancy. A real structural inequality, and one that the health care system is going to have trouble addressing. It isn't that health care is the problem; it's that unequal access to health care is the problem. OK. So I don't want to deflate people and say it's the system, the system is very hard to move, who are we as individuals, we should just go back to our holes and hang our heads.
That's not right. There are two very important things we can do. First of all, I think we all need to understand the complexity of the problem, and we need to be advocates for leaders who understand the complexity of problems. People who say "this is a simple issue to fix, all we have to do is X," you know, exclude these people or pay for those people, are not going to take care of what's going on. We have a more pervasive, more challenging road ahead. The second thing is that we still have plenty of opportunity in the health system we work in right now, and there are lots of things we need to do to make sure we are optimizing outcomes. How do we do that? Well, as John alluded to, I'm a real nut about process improvement, Lean Six Sigma, all those things that came from my engineering background before medical school. This is a graphic from Healthcare Performance Improvement, a group we've worked with that is now part of Press Ganey, and they're pointing out that with intelligent process design you can get orders-of-magnitude improvement in reliability. The scale on the left is essentially the defect rate, on a log scale, and it shows that intelligent process design might take you from one defect per 10 chances to one per 100 or one per 1,000 chances. That's great; we understand those things. If we put a reliability culture on top of that, we can amplify it, and we may get to one defect per 100,000 chances. And further, if you do those two things well in concert, you can see some synergy, and now we're starting to talk about one in a million chances. Impossible? No. This is very much the story of general anesthesia. In the sixties you had about 4 deaths per 10,000 cases, a pretty substantial number of people. Now, for a healthy person getting a minor procedure, the risk of death is way better than one in a million, and for all comers, even in hospital, it is much lower than one in 100,000. So it can be done, with systematic effort and the right thinking. How do we do that? Well, I'm not going to give you a dissertation on process design, but I would like to say a little about high reliability, something that's been discussed in this forum and others. High reliability organizations are ones that operate in complex, high-hazard environments and have significantly lower-than-expected rates of catastrophic failure, and they do it through how they organize and think. Karl Weick and Kathleen Sutcliffe have made their academic careers studying these organizations, and they gave us the five principles that you've heard and that we're going to go through. I'm not going to read them, but they're listed here, and they describe reliability as a way of operating. Let's talk for a minute about the five principles. So, who knows what this is? It's the deck of an aircraft carrier with a crew on it. Anybody recognize it? It's the foreign object debris walk, and it happens every day; in some periods it happens more than once a day, and I'm sure it's happening more than once a day right now. Why? I don't know if you can see it, but there's a little screw in this jet turbine.
Small objects getting into the turbines of these sophisticated aircraft can cause catastrophic damage, so the crew gets out and literally walks the deck looking for any little bit of debris that might get sucked into the intake of a jet and cause a problem. This typifies the idea of preoccupation with failure. What could go wrong? What are we missing? Think about your unit, where you work. How often in the last 30 days have you sat down to think about what almost went wrong? Where did you have a near miss? Where did things seem OK, but something wasn't quite right? Where was there a weak signal? Is that a regular part of our discipline? It should be. We should be thinking about the near misses and the good catches. Number two, reluctance to simplify. This is the unfortunate photo, shortly after the one I showed of Challenger taking off, of when Challenger blew up. At the time Challenger left the pad, there were a significant number of people inside NASA saying, you know, it's only going to be 30 degrees when we launch this, and we're worried about the O-rings at that cold temperature. The O-rings were seals that kept hot gases confined where they were supposed to be, out of areas where they were not, and they had noticed on previous shuttle flights, each of these red dots is a shuttle flight, evidence that gas had escaped around what they called erosions. And there was this pattern. On the horizontal axis, which is the launch temperature, you can see one point that had 3 O-ring erosions when the launch was at 52 degrees, but over at 75 degrees there were 2, and there's sort of not much pattern. And so, despite a lot of concern, there was a lot of confusion in the communication. This graphic went to launch control, and they said we're good to go; there's no pattern here; it doesn't matter what the temperature is. That was a gross oversimplification of what they had. The green dots are all the shuttle flights with no erosions. Every erosion-free flight was above 65 degrees. No flight below 65 degrees was free from erosions. And if you did the mathematics on this, what it showed is that at 30 degrees you could expect catastrophic O-ring failure, which is exactly what happened, and that's exactly what killed the astronauts on board that shuttle flight. The mindset here is that the easy explanation is often wrong; things are often more complex than they seem. It's resisting single-cause explanations. In your careers and your lives, think about the times you used the excuse of the moment: our staffing level wasn't quite right, so we made a mistake there, but if we're fully staffed it'll be fine. That might be true, but it's also a signal that you have a system that's vulnerable, and when the staffing isn't quite right, you'd better watch out. So reluctance to simplify is incredibly important. Sensitivity to operations. Here are three figures that sort of typify that. The first is a pilot doing the famous pre-flight walk-around, looking to see: are the tires OK? Is the landing gear intact? Are there any loose panels? The middle one is probably more familiar: people making rounds in the ICU, a senior clinician listening to the team talk about what's going on with the patients in that unit. The one on the right, another aircraft carrier example, might not be familiar. These guys look like they're playing with toys.
And in a way they are. This is the flight control area, and this is a six-foot scale model of the deck of the aircraft carrier that shows where every plane and asset is. It's mechanical, because if the computers go down or the power fails, this board still works: they still know where things are, and they can still launch and land planes. Sensitivity to operations is knowing how work actually happens, not how it was designed but what's actually going on. How often do we as leaders actually get out and see with our own eyes what's happening in the workspace, and look for the workarounds? Where is somebody doing something other than what they're supposed to do, because they're fixing a problem in the system? Number four is commitment to resilience, and I like this one because it was really easy to find health care examples. The first panel is a team training for a code. We have absolutely made a commitment to being resilient when there's a cardiac arrest. We train people, we drill them, you get certified. If the code bell goes off, people cluster; they know what they're supposed to do; and we do a very reliable job of trying to resuscitate patients. The one on the far right is the ED trauma room, and while there's no action here, you can see it is set up thoughtfully to deal with any contingency that arrives at those doors. The middle one is not as obvious, but that's a photo from COVID, when every day we were confronted with a new challenge. No PPE, a diagnostic test doesn't work, we're running out of ventilators, we're running out of space in the morgue. We had lots of things we had to adapt to quickly, and that's the essence of commitment to resilience: when things go wrong, we adapt. Failure is inevitable, but harm is not. In your units, in the areas where you work, how prepared are you for disruptions? How do you do with downtimes of the computer system? How would you do in a complete power failure? Who speaks up in a crisis, and who doesn't? Those are things we need to be thinking about. Finally, deference to expertise. Anybody know who this is? You can probably sort of guess from looking at it; the nickname is the yellow shirt. This is usually an enlisted man, not very high ranking, but when he is running the flight deck on an aircraft carrier, he has absolute authority. He can tell the admiral of the Navy to move his butt out of the way because he's not safe. This is deference to expertise: the idea that what you need is the person who knows the most about the problem to be in the fore. It's not always one person, it can be several, but it means you need to forget rank and think about knowledge and skills. Now, the last thing I'll say about this is that if we use intelligent process design and we empower it with this reliability culture, we create systems that are greater than the sum of their parts. This is the opposite of the slide I showed you earlier. The US men's Olympic hockey team in 1980, when a bunch of college kids beat essentially professional teams from a variety of countries to win the gold medal. The Wright brothers' flying machine: if you looked at its components, people said there was no way this thing was ever going to work, and yet there it is at Kitty Hawk, the first powered flight. Wikipedia: who would have thought that a volunteer effort like this would become essentially the best reference we have for anything you want to look up.
And this is Apollo 13 after the explosion that really should have killed three astronauts. And yet, with thousands of people on Earth and three astronauts working with really primitive stuff, we're talking cardboard and duct tape, they managed to engineer a system that got those astronauts all the way back from the moon safely. Systems can work better than the sum of their parts, and if you start with good parts and build good system work on top of them, it's unstoppable. So before I close, I have to show the, you know, obligatory slide of a family member. This is the youngest of my four kids, my son John. He's now 30, so this is a while back. He's in the upper left corner, reading a book, and that's something we tried to instill in our kids. Every night before we put him to bed, we would read a book, and, he being the fourth, we had a veritable library: picture books, storybooks, adventures, chapter books, science, you name it. Every night we would pull out a book and say, John, let's read this book, and when he was about three and a half, no matter what book we pulled out, he would say, Things That Go. He wanted to read Things That Go every night. He was a guy on a mission, and he clearly understood how the system worked. We would pull out Things That Go, and, you know, after you've read it through 100 times, you're looking for shortcuts, right? Try to turn two pages at once, skip over the long description of the backhoe. He was having none of it. You'd turn back to the page you skipped, and he would point to the thing you didn't read. You had to go through every piece of that book, start to finish. Sensitivity to operations, preoccupation with failure. Now, I will say that if we couldn't find the book, his resilience was not all the way where it should be. So, 25 years ago, Crossing the Quality Chasm gave us a message, and the message was that the challenges we face are systemic, that it's not about the excellence of individuals, and that trying harder was not going to fix the problem. And we listened. We built measurement systems. We started publishing outcomes to make them more transparent. We built infrastructure. We digitized the record. We experimented with payment reform. But we still lag behind our peers. And the lesson is not that we failed; it's that redesigning systems is hard, especially complex systems like the ones we have. I've charged you with two things that I think are our tools to combat that. One is to advocate for and support leaders who understand the complexity of the system, and who understand that the simple-minded fixes people often offer are not going to be good enough to tackle this problem. We need leaders who embrace the complexity and are willing to work on it. And then all of us inside the health system need to understand that even though we may not control the entire system, we have the opportunity to make the system we have operate better. Diagnostic error, maternal mortality: we have lots of problems where we can do more than we're doing now. That's where high reliability helps us, because it helps us think about the complexity of the system and starts to guide our actions, how we think about the systems we operate in and how they can do better. The chasm isn't crossed yet. But with systematic effort we can get there, and that job remains very much ours.
Thank you. Thank you, Doctor Dunagan. We have a couple of questions that were submitted online. Let's see: what are the best safety and quality changes you have seen? Well, think about where my career started in this. I became the chief quality officer in 1994, before Crossing the Quality Chasm came out, when we were still in that period of believing there were no problems with quality. So I would say the most important change has been cultural: switching from a profession and a society that thought everything was fine to one that recognized there were problems. The cultural change has been substantial. I think the advent of systems engineering tools was essential, but we've also learned that they're not enough by themselves, and so the ideas of change management and high reliability are the third thing I would put on that list. So those three things: awareness, systems engineering tools, and high reliability and the related cultural work. The biggest challenge for healthcare leaders today? I believe in the American economy, and I'm very entrepreneurial in my thinking, so I don't want the next statement to come out wrong, but we have a tremendous misalignment in the way we drive the health care system. It is too tightly coupled to financial performance. Pharmaceutical prices are high; we pay a lot for drugs in this country, and there's all kinds of controversy about it. It's not because pharmaceutical companies are evil; they're not. They're working on things to try to make people better. But as companies, private and public, they have a fiduciary responsibility to increase shareholder value, and they're doing really well at what they do, generating substantial profits. We now have venture capital buying up healthcare organizations, and venture capital isn't necessarily the best thing that happens to an organization; the goal there is to wring profits out of it and sell it. So I think we have a real danger that financial interests conflict with the overall goals of the health system, and we don't have a very good appetite for dealing with that. Neither political party has been particularly adept at recognizing that we're going to need to take actions so that health becomes the number one priority, not financial performance. What skills do healthcare leaders need to move systems toward closing the chasm? I think change management skills are at the top of the list. What we found in all the work we did is that even the best ideas weren't sustainable if the people working the systems weren't supportive, and you have to address that through the lens of change management. You have to get people aligned around the goals. So that's an essential skill set. Beyond that, people also need to develop skills for managing teams: dynamics often get complicated, so understanding how to resolve conflict, how to build on conflict where it's helpful and mitigate it when it's not. How can someone who considers themselves a big-picture thinker avoid principle two, oversimplification? I don't think those things are actually in conflict. I think it is big-picture thinking that allows you to see the complexity of the system.
If you get too focused in, it starts to feel very much like a mechanical system. Yes, when you're close to it, it's complicated, but we're talking about the difference between complicated, which means you can define it and run it, and complex, which means it's very hard to understand how it works, and sometimes the only way you can deal with complexity is to stand back from it and see what happens when you do different things: tests of change. How safe do I feel incorporating AI into healthcare? I honestly believe our challenge is going to be that we're perhaps too slow. Look at self-driving cars as a metaphor. I think self-driving car technology is actually pretty extraordinary, and if you look at it on a statistical basis, self-driving cars are way safer than human drivers. But when a self-driving car kills a pedestrian or is involved in an accident, it sets the whole thing way back. I think we're going to face that hurdle in healthcare. Inevitably, as we work AI into systems, there will be mistakes where AI contributes to an unfortunate outcome. Our challenge is going to be that we may let that keep us from deploying it in a way that can save thousands, even millions, of lives over the years, because of its extraordinary capabilities. How do I foresee those who are impoverished or in medical deserts getting safe and improved care? This is a real challenge; there are certainly threats to underwriting insurance for those populations. To deal with the impoverished part of it first: one thing policymakers and pundits often forget is that the people who earn a living in this world pay for care for everyone else, period. We just argue about how that money gets redistributed. Is the government going to do it, or are we going to do it through commercial plans, where we pay exorbitant premiums because that's what we need to underwrite care for people who don't have resources? We spend way too much time thinking that somehow we're getting cheated, and not enough recognizing that we simply need, as a country, to wrestle with the fact that we've got a population that has to be taken care of. We should do it in the most administratively efficient way possible. In terms of the health deserts, the rural areas, I do think that's an area where AI may be a godsend. Right now we talk about whether AI can replicate expert clinicians. That's probably the wrong test. Yes, it would be great if AI were better than the best clinician in every single area, but the real question is: is AI as good as the average clinician? I think we're quickly approaching a place where it is, and that would be a huge boon for remote areas that might not have access to the kind of resources that people in the city have. The post-COVID era? Well, I risk getting off on a diatribe about the degradation of our public health system. We learned to do some amazing things during COVID, and it seems like we've been doing our darndest to undo them. The ban on mRNA technology is absurd. The things that have happened to the public health system and the CDC are really counter to the best interests of society. And what's happening now with vaccine hesitancy is a real problem.
There will be kids dying of measles because of the change in attitudes, and there will be adults who later in life suffer from rare complications, like SSPE, that shouldn't happen. The HPV vaccine, the hepatitis B vaccine: these have been fantastic public health interventions, and they're under threat, under attack, by people who really don't know what they're talking about. Is investing in technology good, or in manpower? OK, I think this is the technology-versus-people-power trade-off. I said something a minute ago that applies not just to administrative efficiency, and it's probably worth making the point: every time we simplify something that people do now, we are eliminating jobs for people, and that's a problem. We don't want to exacerbate income inequality, so we have to figure out what to do with that. If I could change one thing about healthcare right now, I would get away from the 30% we waste on administrative activity that doesn't really advance the care of patients. Pre-authorization: I don't think I've ever seen anything showing that it does any good, and yet we've got a vast enterprise devoted to it. We have to; a hospital can't run without people in pre-cert making sure the T's are crossed and the I's dotted. But in terms of whether it adds value to the health care system, the answer is probably not. If you suddenly blew that up, what does everybody who works on pre-authorization do? So this is a real point of contention, but I think it means redeploying people from jobs that computers can do well, which unfortunately are often mind-numbing, repetitive tasks, to creative things that humans can do better than computers. I don't know how much more time I have here. I think that's about it. Is that it? OK, I think we'll call it there. Thank you very much. It's been wonderful to talk to you.