HIT Policy Certification/Adoption Workgroup morning TRANSCRIPT

 

 

 


Transcript from the morning session below:


I am John Glaser; my best buddy Charles is keeping me out of the light, helping me here. I would like to welcome all of you to the meeting of the work group on certification and adoption of the Healthcare IT Policy Committee. I have a couple of opening comments and will then turn it over to the chairs, Paul Egerman and Marc Probst.


I thank all of you here in the audience and on the phone; I appreciate your devoting time and intellect to this important undertaking.


The HITECH legislation has a few key attributes, but at the center is this: in order to receive the financial incentives, providers must engage in meaningful use of certified EHRs. The certification and adoption work group is chaired by my colleagues at my left and around the table. The committee asked them to spend time looking at the certification process, not with an immediate focus on criteria, which will come out of the standards work going on, but on the process by which electronic health records are certified for the HITECH law and payment mechanism: an examination and understanding of how the process might work for things such as open-source software or internally developed software. We asked them to do that; they have held some hearings and have scheduled today for overall thoughts.


We want to encourage you to evaluate all processes, but we want to make sure we thoroughly examine a wide range of possibilities and potentials, and we want you all to feel free to explore innovative ideas and new approaches. We want you to feel free, where appropriate, to be complimentary, and to be critical; we would like to hear that. We look forward to a candid, wide-open, thoughtful and constructive discussion, and again, I thank you all for taking the time to be here.


I want to get introductions of the work group members.


Steve?


From the Robert Wood Johnson Foundation.


I am [indiscernible] practicing gynecologist in Massachusetts, immediate past chair of the AMA.


I am John — Office of the National Coordinator.


My colleague and friend, Charles Kennedy with WellPoint.


I am Paul Egerman, software entrepreneur and co-chair.


Marc Probst with [indiscernible] healthcare.


Adam Clark, the Lance Armstrong Foundation.


Any work group members on the telephone?


Good morning, I am still Paul Egerman. We have a few work group members on the phone; Terry [indiscernible], the chief information officer of the State of California, which is a state on the West Coast of the United States, is going to be calling in. We are happy to have her; it's very early in the morning where she is. As John said, this is a meeting of the certification and adoption work group. At this meeting we are only talking about certification, not adoption. Adoption is critically important and there are a lot of issues, but that's not the focus of the meeting today, and we are not talking about meaningful use, only about certification. We are talking about that because, as John says, it's something that is critically important. We do have an exciting group of speakers and some very interesting panels. We have been meeting pretty much every week for about the last two months, gathering information. The purpose of today's and tomorrow's meetings is really information gathering. We are trying to gather more information about what is needed for certification, and what we are trying to do with many of our panels is to put forward people who have differing points of view on a lot of these subjects. We realize these are controversial; there are always controversies related to CCHIT, what purchasers need versus what vendors need, and we are trying to have discussions about some of those differences. The purpose of the public comment periods, at the end of the day today and tomorrow morning, is for you to tell us whether or not we have gotten the right information. As we go through this, we do not know what people are going to say, and it's possible not every viewpoint has been expressed. If there's a viewpoint that has not, we want to hear that. That's the purpose of the public comment period.


The work group will be meeting in closed session, executive session, this evening and also tomorrow afternoon, and we are hopeful that we will have some consensus for a recommendation on Thursday at the HIT Policy Committee meeting. If we are able to come to some consensus on what I would call a preliminary recommendation, it will be printed in the Federal Register several days later and there will be another public comment period. There will be ample time for people to make comments. Any recommendation we make would be preliminary, pending additional feedback.


Those are the things we are trying to accomplish. As we went around the room, the people on the HHS staff did not really identify themselves. I want to thank the people on the HHS staff who have been very supportive and helpful, besides John Glaser, the lead: Judy, with the contract office, am I saying that right? Something like that; she's a chief honcho at HHS. We very much appreciate your help, and Jodie's, and we also thank HHS, who has given us autonomy to make whatever decisions we want. Our first speaker is William Stead, who will be talking about a study performed at the National Research Council. So, Bill Stead.


William Stead: From 2007 to 2008 I was privileged to chair an NRC committee assessing the gap between healthcare IT as deployed in the best places it's being deployed today and what is required to support the Institute of Medicine's vision for quality healthcare. Today I want to share the observations and conclusions of that committee, and then add my interpretation of what it means for the task of this working group.


The ideas that I am going to talk about challenge the status quo, so I will go through the presentation relatively quickly, so we have about half of our time together to let you explore the pieces of it that are most interesting to you.


These are the central conclusions of the report. The first was that if we continue our current efforts without change, we will not solve the healthcare problem that's underpinning reform, and we actually could make it worse. That, as I will try to make clear as we go ahead, does not mean we could not and should not act quickly. We can; we just have to be careful about how we do it.


In particular, success will require much more attention to helping people think about what is in fact going on. In essence, we are interested in information technology because the healthcare world is faced with an insurmountable cognitive problem now. That is actually the root cause of a lot of our overuse, underuse and misuse. That's what we have to pay attention to. What the committee saw in the places using this technology a lot is that the patient is actually being lost in a sea of transaction-level detail.


Therefore, our recommendation of how one can move forward quickly, is to manage the effort, to focus on measurable improvement in healthcare. If we do that, and we don’t focus solely on the tool, then we will be able to adapt role, process, and our use of the tool to get the outcomes we want. It’s not possible if we simply are driven by the tool.


This is the committee that we put together. You will see there are bridge people like myself who bridge biomedical informatics and medicine; others who bridge to computer science, like Peter Szolovits; and experts on bioinformatics, human factors, databases, data spaces, embedded sensors and large-scale distributed architectures. That's the group that looked at this.


What we did was to start with the IOM's quality [indiscernible] recommendations. We basically said: what are the information-intensive capabilities that would be necessary to support a system that was safe, effective, patient-centered, timely and equitable? This slide summarizes them. The first is that we have to have comprehensive data about patients across locales.


The second is the idea of support for people thinking about problems. I will give examples of that. Tools to manage populations, rapid integration of technologies, and empowerment of patients and their families.


The places we visited are listed here: a mixture of government, for-profit, academic and community sites, and places with commercial systems and home-grown systems.


They were picked because they are amongst the leaders in use of this technology to improve quality, and we observed that. We saw many successes. But we also saw challenges. They are summarized on this slide; I gave you appendix C of the NRC report, which spells them out in more detail. The computer right now is increasing fragmentation. We made rounds and saw five different sub-teams of the care team, each with computers on wheels, standing in a circle, facing their machines, talking to each other, with the only integration taking place in the conversation, and none of them able to see whether they were collectively looking at all the important pieces of the record.


There's no consistent way, even within sites, in which people go look for information, and systems are often used after the fact. We think of checklists as important, but if you look at where checklists work, it's where roles have been changed, so you have a pilot and copilot, used in real time; if you don't do it that way, it does absolutely no good. Most of what we saw was, in fact, after the fact.


Very little support for evidence-based medicine.


The best of places are working on four to six major improvement projects at a time, each taking six to 24 months: very important work, but a small fraction of what we've got to do. Centralization is the primary method of standardization, and yet everywhere we found innovation, it was local, every single time.


The timelines are glacial. If you look at the time it takes people that have achieved "level six," it's over a decade. There were places where we saw as many as four generations of systems running in parallel, with people still running level one in some places while developing level four.


What we came away with is that maybe healthcare is doing 5% of what it needs to do to improve; the best places are four to five times better than the average, and yet the gap between that and what we actually need is simply insurmountable by current techniques. It's like trying to get into outer space with a jet airplane instead of a rocket. It just gets to a point where you can't get there.


Now, there were a couple of key epiphanies along the way of looking at this. The first was that there are four primary domains of computational techniques; Marc pointed this out. Automation is a technique that works when you want to do a well-defined process over and over the same way. The other techniques are connectivity, connecting people to each other and to systems; data mining, discovering relationships between disparate, non-regularized data sources; and decision support, helping people make choices. Those are quite different computational techniques. They work in the face of complex adaptive systems, which is what healthcare is, and they cost an order of magnitude less and take an order of magnitude less time than automation techniques. When we look at other industries, we discover they balance their portfolio amongst techniques. Healthcare is almost always automation, and when current vendors use other approaches they bolt them onto an automation core, so that becomes the rate-limiting step of our ability to handle complexity.


The second epiphany is the mismatch between how clinicians' heads work and the [indiscernible] of IT. Today's healthcare people are trained like I am; I am a nephrologist by background. As a new piece of data comes in, we compare it against a model. If it fits, we keep the model and understand what that piece of data tells us. If it doesn't fit, whoops, maybe the model is wrong; back up, re-think what's going on. Well, healthcare IT today doesn't let us work that way. It focuses on transaction-level data: the results and the orders. What little decision support today's healthcare IT provides, it provides at that transaction level.


What physicians would like to be able to do, and nurse practitioners and other people who are really responsible for understanding the patient: they would like support in working with an abstraction of what is going on with the patient. As new data comes in, they would actually like to see how it fits or doesn't fit, or get help with that, so they know what to worry about and what not to worry about.


If a person is infected and has a low white count, that means they are being overwhelmed by the infection. If a person is infected and has a high white count, that means the immune response is working. We don't have that basic level of support in most of today's systems.


To carry out the rest of the model, here is what will be needed. We need a way to deal with raw research data, biological observations, the research protocols that build the data, and the resulting medical knowledge; then we need to generalize that medical knowledge into medical logic, and then we need to make it specific to the patient context with decision support. That's actually the set of components we will someday have to have to support the kind of healthcare we are talking about. Even if we got full adoption of the best of what we have today, we would have about two-eighths of what we need. The message is to be extraordinarily careful that we move in ways that do not lock us in concrete.


The committee came up with a set of recommendations to support evolutionary change in the short term, and a set of recommendations to support radical change for the longer term, because what we are talking about in healthcare reform is radical change. Today's healthcare process cannot deliver the results we want. We are not talking about healthcare IT cementing today's healthcare process, but about radically reengineering the process. Focus on measurable improvements in care, so that if we focus on outcomes, our deployment of technology will be self-correcting and will not end up letting us go far down the path of unintended consequences. A statement was made yesterday about beginning to get on the first rung of the escalator; there's got to be a way of making progress incrementally. A key idea that I think is lost in today's healthcare IT, and that is quite do-able with today's technology, is to simply record all the data in whatever form it is. At Vanderbilt we keep — what is working, not working, what people understand, don't understand, users, et cetera.


In terms of principles for radical change, this idea of architecting systems so they can accommodate disruptive change. Anybody that has one of today’s IT systems and feels that it accelerates their ability to rapidly change work flow, raise your hand.


That's what we're talking about. It takes a different approach to the architecture. We should only employ automation at the component level, where we have a process perfectly designed to do what we want to do. When that process becomes obsolete, we swap out the process, the component technology, and the roles, and plug in something new. We can do that at the component level; we can't do that if it's the complete neurologic system of our healthcare system.


Manage data separately from applications. I will come back to that again and again. That is the key thing; frankly, if we only certify one thing, that's what we should certify: that the technology lets us manage the data separately from the applications. Then it will tolerate change in those applications as healthcare changes.


So that's all I am going to say about the committee report; we can come back to it in questions. As I have thought about this and worked with other groups since that report came out at the beginning of January, including the standards working group and the board of regents of the National [indiscernible] in January, an idea emerged. The first idea is that we need to redefine interoperability. Instead of interoperability being one version of truth that can be explicit, always the same, in all systems, which we believe simply does not reflect the nature of biological data, we might redefine interoperability as data that can always be correctly interpreted, in the light of today's knowledge or in the light of tomorrow's knowledge. How do we manage data if we want to handle the transition from 1979, when there were two types of diabetes, to 1997, when there were four, to genomics, where there are greater than 20 and we know the number is going to go higher? How do we handle an archive of data that lets us tolerate those kinds of dramatic changes in our understanding of how to interpret it? The second idea is separability of the data and the decision support content. Finally, narrowly limit our use of standard data, by which I mean data that can have only one interpretation, whose meaning is stable over time, such as [indiscernible] ingredients. If man builds it, I believe we can define it with standard data. If the Creator builds it, I believe we have to have more flexibility in how we interpret the data across time. This is just another way of looking at what the paradigm shift might be: moving from one integrated set of data, the single source of truth, to ready access to all the sets of relevant data from multiple sources; capturing raw signal and annotating it with standard terminology.


Within the Vanderbilt world, we collect information from any source and archive it in its original form; it may be the raw output of a monitor. We use parsers and concept-matching algorithms to attempt to interpret it, show it to humans, let them annotate it, and keep both the human and machine interpretations along with the raw signal, so that when knowledge changes, we re-run the algorithms and it's current again.
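The archive-and-reinterpret pattern described above can be sketched in a few lines. This is a minimal illustration with hypothetical names, not Vanderbilt's actual implementation; it shows only the core idea of keeping the raw signal immutable and re-running interpretation as knowledge changes.

```python
# Sketch of archive-and-reinterpret: the raw signal is never modified;
# each interpretation is appended alongside it, so old readings survive
# and new knowledge can be applied retroactively. All names hypothetical.
from dataclasses import dataclass, field

@dataclass
class ArchivedObservation:
    raw: str                                  # original signal, kept forever
    interpretations: list = field(default_factory=list)

    def reinterpret(self, algorithm, version):
        """Re-run the current algorithm; keep every prior reading too."""
        self.interpretations.append((version, algorithm(self.raw)))

    def current(self):
        """Most recent interpretation, or None if never interpreted."""
        return self.interpretations[-1][1] if self.interpretations else None

# 1979 knowledge: two types of diabetes.
def classify_1979(raw):
    return "juvenile" if "type 1" in raw else "adult-onset"

# 1997 knowledge: a finer classification.
def classify_1997(raw):
    for label in ("type 1", "type 2", "gestational"):
        if label in raw:
            return label
    return "other specific type"

obs = ArchivedObservation(raw="dx: diabetes mellitus type 2")
obs.reinterpret(classify_1979, "1979")
obs.reinterpret(classify_1997, "1997")   # knowledge changed: re-run; raw unchanged
```

Because the raw text is archived, switching from the 1979 to the 1997 classifier is just another `reinterpret` call; nothing about the stored data had to be migrated.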


This really gets to how we get a robust version of truth. As I was saying earlier, when I was an intern, if a potassium was high, I would do a rhythm strip before I would treat it, getting robustness in the signal by getting two different sources.


The final piece of the paradigm shift is rebalancing our use of the computational techniques. We still should use automation, but in our hands, at least, tackling the electronic health record much more like a secure Google problem than a data processing problem has made it easy to get all the information together. We did this for the Memphis RHIO, each hospital on a different vendor; it costs $2 per citizen per year and took 18 months. That's a secure Google approach versus a data processing approach. Similarly, our disease management dashboard will pull information from any system; we don't care if it's a payer, clinical system or ancillary system, we do care that it shows the status of the patient in real time. The rest of that is probably self-evident.


What does this mean for certification? This is the challenge: trying to put together function, the use of the function, and the effect we want to achieve. The function takes a combination of software, patient data and decision support content. Truth be known, except for something like the ability to keep the data separate, you cannot certify that the software performs consistently without the associated patient data and decision support. That's the reason for what the Leapfrog tool, which has been used to let people assess whether CCHIT-certified CPOE blocks medication errors, found; since the data is still in press, I will just say less than 50%, in that range, not greater than 50%. That's because we certified the software separate from these other things. That's a mistake. We also have to recognize that the roles, process and training are an integral part of whether this will achieve meaningful use.


I think what that says is that for certification we need to recognize that less is going to be more; we should only certify something where the definitions are precise. At all costs, we should avoid freezing work flow, content or technology, because of the magnitude of change that is still in front of us. That argues to me that we actually make this concept of liquidity the foundation of what we certify. That is not the way that we started. The certification process, and I am in the middle of it and may be interpreting it wrong, but my impression is that it started largely with what the best of today's systems could do. It did not necessarily start with what was, in fact, the essential ingredient that would allow a person who bought something to know it would work over time. I believe liquidity is that piece: liquidity of data, of decision support content, and of the audit trails of how something got to be.


I think then we need to certify on a very granular basis. Take e-prescribing: there are roughly nine decision points in e-prescribing, from the decision of what I want to do, to the task of formulating it into a prescription, translating it, interacting with the pharmacist, administering it, monitoring it, and refilling it. Each of those steps requires different decision support and different data about the patient, has different measures of what an effective completion of that step is, and has different dependencies.


I would argue we should certify the absolute minimum number of functional building blocks necessary for effective use, and we should do it with all these pieces.


Then what I would do is add another idea: what is really effective post-market surveillance. Think of the trouble we have gotten into in the drug industry, adding more and more stuff to the FDA pre-market approval process, because once stuff gets out through that process, we have very poor data about it; it takes us a long time to discover a Vioxx, for example.


If in fact we had much better post-market surveillance, we could let things out and then put a hold on them. We want to think that way about certification. What's the absolute minimum we have to do at the front end, and then what would happen if both providers and their vendors were required to report and roll up things like this: what's the actual ease of learning, for the set of functions a role has to do; how long does it take to learn; and how long until users are at peak efficiency?


Ease of use: what's the time to complete a task, the error rate of the task, and the specificity of information retrieval? Content support: with a deployed system, its decision support content, and patient data, what percent of users actually handle a piece of new information correctly?


Adaptation over time: from discovery of an urgent update to drug interaction content, how long is it until that is reflected in the decision support of all of a vendor's deployed operational systems? Right now the answer is never. What should it be? Effectiveness: what percent of alerts are overridden, what percent of [indiscernible] follow an override, and in the absence of an alert? We have to give people the data to help them understand the process and fix it. If you put together effective meaningful use as the goal, and we manage to that, with a foundation of granular certification and a middle tier of robust reporting, not punitive, but giving people the information to fix the technology and fix the process, I think we could really win this one and win it in the near term. Thank you.
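The kind of roll-up reporting described here is computationally simple; the hard part is collecting the data. A minimal sketch of one such measure, the alert override rate, assuming a hypothetical log format of (alert type, action) pairs:

```python
# Sketch of a post-market surveillance roll-up: given a log of fired
# alerts and what users did with them, compute the override rate per
# alert type. The log format here is hypothetical.
from collections import Counter

def override_report(alert_log):
    """alert_log: iterable of (alert_type, action) pairs, where action
    is 'accepted' or 'overridden'. Returns override rate per type."""
    fired = Counter()
    overridden = Counter()
    for alert_type, action in alert_log:
        fired[alert_type] += 1
        if action == "overridden":
            overridden[alert_type] += 1
    return {t: overridden[t] / fired[t] for t in fired}

log = [
    ("drug-drug", "overridden"),
    ("drug-drug", "overridden"),
    ("drug-drug", "accepted"),
    ("drug-allergy", "accepted"),
]
rates = override_report(log)
```

The point is not the arithmetic but the reporting requirement: if providers and vendors routinely rolled up measures like this, the data would exist to fix both the technology and the process.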


Thank you very much, that was a fascinating presentation. Questions?


Thank you for that very interesting presentation. The question I have for you is this: you are advocating a different approach, one that is less about automation and transactions and more about knowledge support. You mentioned we are flying a jet airplane when we need a rocket. Can you evolve the jet into the rocket, or do you need to literally scrap it and move in a different direction?




You can do that, but I would narrow down my use of automation and set beside it my data mining tools. Instead of trying to scale up an automation system to do everything, shrink it down to its components, give it an information foundation that is data mining, and then use decision support and social networking and other tools on top of it to let people work with it. I think what we have is actually a component of the rocket. Our trouble is we tried to make it the whole rocket. If we could narrow it down and put the other components on, I think we could get there, and do it now. This change could let us get to 2014, not stand in the way.


Thank you.


First of all, I didn't want to be the only person in the room to raise my hand, but I actually think I am better off with my technology than I was before. Let me just ask you about the exchange. I am not quite sure what it is you are getting for $2 a citizen. What is in the center there? I am trying to understand what it is you are exchanging.


I guess I would say what we are exchanging is this: we have the six competing hospitals in the three-county region around Memphis, 22% of the population of Tennessee. Those systems, the two safety-net networks, key reference labs, and one of the key payers all export data into a regional data bank, which has facility-specific vaults. They can control the data, and yet the region can then run algorithms across it that identify what person it relates to and what type of data it contains, and it constructs what looks like an electronic health record from a secure browser, and can —


Is it clinical data, like a patient's past history —


Yes, discharge, lab results.


Well, the lab results — but what I am trying to understand is, without interfaces, somehow something is all getting dumped into one place, even though there are six different vendors, or 10, whatever number of vendors there are, and even though those vendors are storing things in different ways; somehow it all gets into the center and everybody can use it. I am just —


Yes, and the reason is that I have been very careful about some words here. I said it's not an exchange; it's an aggregation, data mining and visualization capability, and that difference is important. We are not taking information from hospital A and mushing it around in a way to pump it into hospital B. That would be my definition of an exchange. That would require us doing the end-to-end work [indiscernible]. We have gotten to where we can do that for fewer than 100 labs, a small fragment of what we are working with in Memphis. We have everybody publish the information, as a text report, a tagged text report or as real data streams; it doesn't matter. Then we use parsers, simple semantic algorithms that recognize structure and can break something up into fragments, and concept matching algorithms that use the knowledge of the National Library of Medicine's thesaurus, to construct something that looks and feels a lot like an electronic health record, the way Google crawls across all websites and lets you answer questions without the people that set up the websites intending for you to be able to use them that way. Secure Google is the closest analog to what we are doing in Memphis that I can describe. Does that help?
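The parse-then-concept-match pipeline can be sketched as follows. This is a toy illustration with a stand-in thesaurus and made-up report formats, not the Memphis system; it shows only the idea of letting each source publish reports in whatever form it has, then using a parser plus concept matching to build a common view.

```python
# Toy sketch of the "secure Google" aggregation pattern: a parser breaks
# free-text reports into fragments, and a concept matcher tags each
# fragment against a reference vocabulary (a tiny stand-in for the UMLS
# Metathesaurus). All names and formats here are hypothetical.
import re

# Stand-in thesaurus: surface form -> normalized concept.
THESAURUS = {
    "k": "potassium",
    "potassium": "potassium",
    "hgb": "hemoglobin",
    "hemoglobin": "hemoglobin",
}

def parse_report(text):
    """Break a free-text lab report into (term, value) fragments."""
    return re.findall(r"([A-Za-z]+)\s*[:=]\s*([\d.]+)", text)

def concept_match(fragments):
    """Map each recognized term onto the reference vocabulary."""
    out = {}
    for term, value in fragments:
        concept = THESAURUS.get(term.lower())
        if concept:
            out[concept] = float(value)
    return out

# Two hospitals, two report formats, one queryable view:
hospital_a = "K: 5.9  Hgb: 11.2"
hospital_b = "potassium = 4.1"
view = [concept_match(parse_report(r)) for r in (hospital_a, hospital_b)]
```

Neither hospital changed its format or built an interface; the aggregation layer does the recognition, which is the contrast with the end-to-end "exchange" approach described above.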


Yeah.


Thank you very much for the presentation, and I should disclose that the Robert Wood Johnson Foundation was a partial supporter of the study. I say that for disclosure, not credit.


[indiscernible]


Steve Downs with the Robert Wood Johnson Foundation. I think it's easier for us to wrap our heads around certifying a soup-to-nuts system providing a full range of functions. As you start talking about the componentized approach, with this sort of base and multiple functions layered on top of it, could you perhaps help us understand better what might be in the base, the foundational components? And could you give an opinion on whether certification should extend to the different apps or functions that would run on top of that base?


William Stead: I would like for the certification to be that the data can be interpreted equally effectively outside of the system as within the system. We have got to change the incentives that have allowed the current industry to compete on making the parts of their own systems work better together, which by definition means they work less well with other things. We've got to break that. The key is separability —


Let me stop you there and say: it is data recording and capture, and a function to make the data available on an equal basis to applications that can use it.


William Stead: Yes. We are trying to solve getting the information and putting it in front of each person that needs to know it, according to their need to know it, as they are making decisions about what to do next. That's what's going to solve the overuse and underuse problem in healthcare. It will solve the misuse part, or part of that part; another part takes automation. Don't misunderstand me: I am not against automation; I am very much in favor of it at the component level. But that would be the base. Above that, I would actually work out the key definitions of each of the components of what we have now lumped together as e-prescribing. When I had discussions with a very well-informed leader in Tennessee, very interested in these efforts, he was very surprised to discover that the e-prescribing effort counted on [indiscernible] drug interactions, for example. It could, if everything else were there, but they were not counting the most essential thing for what they wanted to solve. That wasn't obvious to him. We need to get really clear: this is the component; if you use this component, this is the data you have to have. If we made that clear, people would know what choices they are making. Right now they don't, because we lump it together. If, in fact, you are a well-contained practice, and all of your patients are contained in your practice and all your data is in your practice, today's monolithic systems work. That is a vanishingly small part of our problem. In 2000, the median Medicare beneficiary saw two primary care physicians and five specialists across four practices. Any approach to healthcare IT that doesn't deal with that reality isn't going to get us there. We have to get that in at the start. So far we have left it as a to-follow. That's where we made the mistake, if we made a mistake.


I wonder if you would comment a little; Larry with Kindred Healthcare. Thank you for the challenge to how we are thinking about this. Would you comment on the apparent paradox: the focus on data and the ability to share and aggregate the data, versus your example of bringing together systems that didn't start with clean data definitions but were still able to create a useful integration, if that's the right word, a useful bringing together of information.


William Stead: My only point is that what is in those systems we brought stuff together from is actually unimportant, and therefore in theory shouldn't need to be certified, because what we did was mine it, regardless of form and source. You may want to certify the latter: can you construct a dashboard that shows all the management of a plan against real time? That is an important thing, and that is unlikely to be in any one system within one enterprise. Yes, they can do it within the enterprise if all the patients are in the enterprise, or if all of their automation data processing interfaces with everybody else are in place. But I am sorry, not by 2014.


Our challenge is how to certify something now that will get us to 2014.


My struggle is that you created an overarching system when you brought this system together.


William Stead: Just like Google brought an overarching system when you —


Google did that built on HTML and HTTP, a set of core standards, with huge variability in what's there. Some of that comes from the ambiguity of English and other natural languages, so that when they bring things together, as the recent Bing ads point out, you get interesting cross-threading.


William Stead: Amen. And the fact is we have gotten all the power of the Internet with three simple standards, how you locate, display and transport it, without any standardization of content. Look what that has done. If we could have the same effect on healthcare between 2009 and 2014 that the web, those three simple standards, had between 1994 and 2000, wouldn’t that be wonderful? I know it wouldn’t be where we want to land, but wouldn’t it be wonderful?


My sense is, to achieve that, we would back off on some of the more engineered aspects. Some decision support requires that you have common semantics across those data sources.


William Stead: Well, I would have common semantics where common semantics are in fact biologically — drug interactions, we can have common semantics. Let’s nail that one. If we in fact got to where all of today’s commercially available systems used an enhanced RxNorm, adding in drug class from the VA system, and we then made all systems tag whatever they did with that reference standard, we would have solved a huge problem. We could solve that one — when I was a member of the Commission on Systemic Interoperability, we laid out a road map showing how that could be solved by today. It's in print, endorsed by the — it’s do-able. I am not saying don’t do those things. I am saying focus them where that approach actually makes sense, given the structure and nature of the information, and use very different approaches for this big C of constantly evolving new and different sources, genomics, proteomics, things that will hit us much harder than what is —
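Stead's point about tagging locally coded drug data against a shared reference standard, so that cross-source checks like interaction screening become possible, can be sketched roughly as below. All codes, names, and the interaction table are invented placeholders for illustration, not actual RxNorm or VA content.

```python
# Hypothetical sketch: tagging locally coded drug records with a shared
# reference terminology (an RxNorm-like code plus a drug class), so that a
# class-level drug-drug interaction check works across differently coded
# sources. Every code and name here is an invented placeholder.

# A local system's private codes mapped to a reference code and drug class.
REFERENCE_MAP = {
    "LOC-001": {"ref_code": "RX-70618", "drug_class": "benzodiazepine"},
    "LOC-002": {"ref_code": "RX-7646",  "drug_class": "opioid"},
}

def tag_with_reference(local_record):
    """Attach reference-terminology tags to a locally coded drug record."""
    ref = REFERENCE_MAP.get(local_record["local_code"])
    tagged = dict(local_record)
    tagged["ref_code"] = ref["ref_code"] if ref else None
    tagged["drug_class"] = ref["drug_class"] if ref else None
    return tagged

def interacts(rec_a, rec_b, interacting_classes):
    """Class-level interaction check, usable across sources once tagged."""
    pair = frozenset([rec_a["drug_class"], rec_b["drug_class"]])
    return pair in interacting_classes

a = tag_with_reference({"local_code": "LOC-001", "name": "local drug A"})
b = tag_with_reference({"local_code": "LOC-002", "name": "local drug B"})
INTERACTING = {frozenset(["benzodiazepine", "opioid"])}
print(interacts(a, b, INTERACTING))  # True: flagged via the shared classes
```

The point of the sketch is that the interaction check never touches the local codes; once each source tags its records with the reference standard, the check is source-independent.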


[indiscernible]


Yes.


As an example of changing understanding of classification schemes we are using.


Very much so.


Thank you.


Great, Charles, last question?


As you put together this vision of how this will work, as I am interpreting your comments, it sounds like you are saying build something centrally held, through which a certain amount of transactional data is flowing; you want to build a component of something that fits, I will use the word centrally, that algorithms and other things can run against. My question is, I agree the cognitive decision support is the most important thing we can achieve out of this. In thinking about certification standards, other than liquidity of data, is there anything else that comes to mind you would do to support cognitive decision support —


First, I am not talking about a central model. I am talking about breaking the problem up into pieces, and having algorithms to aggregate and interpret whichever of those pieces you have the governance okay to use, and that you need for the task. It is very much a matter of breaking the problem up, instead of trying to make it holistic. That’s the clarification.


I think the — we need to start measuring; that is really the reason I put these things on this slide. We need to start measuring the cognitive side, the ability of our systems to in fact enhance decision making. The best of today’s systems actually work by getting key information to the right point in the work flow. That’s key, that’s important, but it is much less global than really making sure we understand what’s going on with the patient, which is going to be necessary for care coordination across the continuum. It’s going to be necessary for saying let’s not just make sure there is evidence for each drug we give the patient, but, given the evidence and the patient’s problems, what are the top X drugs that are the highest priority? We can’t take all the ones for which we have evidence; they can’t pay for them. We are so green there I can almost only suggest we need to start measuring. I really did struggle. That list wasn’t a casual thought. It was a good day and a half when I was asked to make this presentation. That’s where I would start.


This has been a terrific presentation. I want to say thank you very much, Dr. Stead, for coming.


Thank you very much.


[Applause ]


Moving right along, trying to run this meeting on time. I will have to tell you, as a former software vendor, doing anything on time is a very new concept, but I am doing my best. The next presentation is from the National Institute of Standards and Technology, and I believe Gordon Gillerman also, and they will present information about compliance testing, the certification processes they use for other products, and related topics. If you have been through the legislation, you know NIST is our partner in this process, funded as part of the process. We are very much looking forward to your presentation, thank you very much.


I like the term on-target, in-target, good phrase.


I want to thank you for the opportunity to talk about NIST and what we do. We are one institute, the National Institute of —


I apologize.


I thought you were more than one, you are so good.


We appreciate the compliment. I wanted to give you a few minutes of an overview of NIST, as there may be people in the room not as familiar with our activities, then turn it over to Gordon Gillerman, who leads the — capabilities at NIST.


I want to remind people of our overall mission. We really work to support U.S. innovation and industrial competitiveness. We call this the what, how and why slide. We are focused, within the Department of Commerce, on the economic strength of the United States; that is our mission.


We do this through measurement science, standards and technology, and that’s what we are here to talk about today. The breadth is to enhance economic security and improve quality of life. You are aware of many things NIST touches, though not necessarily because of us. Take the browser standards we talked about earlier: when you get that little lock on your browser telling you that you are secure, that’s one of our standards, one we dealt with many years ago, a key management capability.


When you get your time by synchronizing your computers to the atomic clocks, that’s receiving our signal. When you pump your gas, you rely on the standard the states put on the gas pumps; they are referencing back to our standards. When you weigh your groceries at the store, the states are referencing back to our standards. We provide reference materials, reference data, standards development and complementary conformance testing. One thing Gordon will do is go through examples of the various ways we touch this. But I wanted to give you the breadth of what NIST is. We were speaking to people yesterday who thought we were one slice of what we do. We are very broad; I appreciate the perception that there are several of us, but we touch a lot of things, in very accurate and precise measurement and standards ways. Let me go to the next slide.


I want to touch on our responsibilities in the ARRA. You all certainly have read these things; we are specifically — the funding we are in line for is for continued work on advancing healthcare information integration, certainly the topic we are talking about today. ARRA also directs NIST to update the federal strategic plan, provide input on voluntary certification programs, consult on assistance in health IT implementation, provide pilot testing of standards and implementation specifications, and run a grants program for healthcare information enterprise integration research centers. The one we are here today to discuss is providing understanding of voluntary certification programs.


I want to turn it over to my colleague, Gordon Gillerman, he will give you — tell you how things can be structured for the future.


Gordon Gillerman: Good morning. What we do at NIST is help federal agencies and industry understand how certification programs work, how they can be adapted to fit certain national needs, and how they can be implemented to be effective and efficient. We provide federal agencies with assistance on policies related to the development of standards and general conformity assessment systems and administrative infrastructures, and we also help design the programs themselves. Usually that is our role: to help build understanding with the agencies, design programs, and implement the programs, but not to run and operate the programs within NIST itself.


In this particular area we have been directed specifically to look at certification, with an eye toward making the program work and having a program that’s effective and functional and efficient. Our role in this is defined by the National Technology Transfer and Advancement Act: to coordinate federal, state and local activities to reduce redundancy and increase effectiveness for the nation’s good. One role we play is helping organizations and federal agencies developing conformity assessment programs to do so efficiently, and to integrate with other programs working in similar areas to reduce redundancy.


You notice I use the term conformity assessment, and not certification. It is a term defined in international standards; it covers the broad umbrella of tasks and systems relative to demonstrating that a product, service or process meets requirements. That is general conformity assessment. A lot of activities can be used. Certification is one, but not the only one. Certification has a specific meaning in the international norms for conformity assessment. You will notice the reference to ISO/IEC 17000, the international standard which has definitions for conformity assessment terminology. This series of standards produced by the ISO — group really defines the way conformity assessment is carried out, both in industry globally and, more and more, in governments across the world.


In conformity assessment activities it’s important to define the parties in the process, using classical business definitions: the first party is the seller or manufacturer, the second party is the purchaser or user, and the third party is a very special party, an independent organization with no interest in the transaction between the seller and buyer. The concept is that third parties can give a high-confidence, objective opinion about the conformity of certain objects that the first party or second party could not convey with the same level of confidence, because those organizations have a vested interest in making the transaction happen.


In conformity assessment in this particular area there are several things to look at when making a decision. One is testing. It looks inevitable that some of these products will have to be tested in order to demonstrate compliance with certain requirements. There are other activities, like supplier’s declaration of conformity, certification, and accreditation, which we will talk about in more detail. You will notice, on the slides where the definitions of terms come up, we define clearly which party can perform the conformity assessment. On this slide we note that testing can be performed by the first, second or third party, depending on what the system needs and expectations are. Testing is used when the characteristics to be evaluated can be measured under specified conditions; that describes what a test is. A special kind of testing in conformity assessment is carried out on samples of products intended to represent the production. In software that can be a challenge, because there are usually a lot of ongoing changes to software over time. The challenge that you often have is understanding how the tests that have been conducted can be representative of the software after it has gone through a couple of revisions.


Testing can be an element of either a system that uses supplier’s declaration or a certification program. The laboratories which conduct testing have a standard to meet in the system, ISO/IEC 17025, for operating the management system and driving technical competency in the laboratory environment.


Supplier’s declaration is first-party conformity assessment, used when the risks associated with noncompliance are reasonable, there are penalties in the marketplace for noncompliant products, and there are mechanisms for removing noncompliant products from the marketplace. What is the likelihood of noncompliant products being brought to the marketplace? The standards for [indiscernible] associated with supplier’s declarations of conformity are enumerated on the slide. Supplier’s declarations come in two forms: formal ones, with standards and requirements the first party is attesting conformity to for a population of products, and very informal processes like that used on a box of 35 mm film, where you may have a number in the corner saying 200. Well, the 200 number is really, in this case, the film manufacturer’s attestation that this roll of film meets a certain ISO standard for film speed sensitivity. The manufacturer is telling you: I declare this product you are about to purchase meets the ISO standard rating for film speed 200.


We see a couple of different flavors. Generally speaking, about 80% of transactions in the marketplace in the United States are based on some form of a supplier’s declaration of conformity.


Only third parties conduct certification activities. Certification is typically used when the risks associated with nonconformity are moderate or high. It includes several steps which are well-defined: evaluation of the evidence, some of which may be in the form of laboratory test data, but there are other forms of evidence, such as inspection of products and evaluation of people.


There’s a compliance decision based on an evaluation of that evidence versus the requirements for certification. Then there’s a public attestation of conformity by the third-party certification organization. The last step is surveillance or follow-up. This is to give the third party confidence in ongoing conformity. So the laboratory tested a product; you end up with a snapshot in time: this product has a certain set of test data associated with it, for the product tested in the laboratory, complying with a set of requirements. Now the vendor is going to manufacture 10,000 units for the marketplace starting tomorrow. The third party needs confidence that every single one of those conforms to the requirements. Surveillance is the third party’s mechanism for obtaining that confidence. There are standards for everything, including certification; it’s ISO/IEC Guide 65, currently being revised. The new name will be ISO/IEC 17065. I brought a couple of examples of certifications that are ubiquitous in the marketplace. Certifications take place for many reasons.
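The certification life cycle just described (evidence evaluation, a compliance decision, public attestation, then surveillance) can be sketched as a minimal workflow. The class and field names below are illustrative assumptions, not from any real certification program.

```python
# Sketch of the certification life cycle described above: evaluate evidence,
# make a compliance decision, publicly attest, then withdraw the attestation
# if surveillance finds nonconformity. Names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Evidence:
    kind: str        # e.g. "lab_test", "inspection", "personnel_evaluation"
    passed: bool

@dataclass
class Certification:
    product: str
    evidence: list = field(default_factory=list)
    attested: bool = False

    def evaluate(self):
        # Compliance decision: every piece of evidence must satisfy requirements.
        return all(e.passed for e in self.evidence)

    def attest(self):
        # Public attestation by the third party, only after a positive decision.
        self.attested = self.evaluate()
        return self.attested

    def surveil(self, sampled_unit_passed):
        # Surveillance: a failing follow-up check withdraws the attestation.
        if not sampled_unit_passed:
            self.attested = False
        return self.attested

cert = Certification("EHR Module X",
                     evidence=[Evidence("lab_test", True),
                               Evidence("inspection", True)])
print(cert.attest())        # True: initial certification granted
print(cert.surveil(False))  # False: surveillance failure withdraws it
```

The design point the sketch captures is that attestation is not a one-time event; the surveillance step is what lets the third party stand behind units it never tested.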


[Captioner transition]


This is a program to give the market confidence that all of the potatoes in the bag with that seal come from Idaho and not somewhere else. The Idaho potato industry wants you to be confident, and part of that process is an industry certification program. Another example has to do with motor oil. In this case we have another industry-driven certification program: the American Petroleum Institute has a certification program for motor oil according to a set of standards produced by another organization, formerly called the Society of Automotive Engineers, SAE, such as 10W-30. So you buy this motor oil, and if the motor oil is wrong you could end up with a $10,000 paperweight. You assume a lot of risk when you introduce new motor oil in your vehicle, and they want to give you confidence that it’s accurate so you are willing to take that risk. It comes in a lot of forms and flavors for a lot of different reasons. Accreditation is the conformity assessment of conformity assessment organizations and their programs. So in many ways this is who is watching the watchers. It’s used to give us confidence that laboratories, certification bodies and other conformity assessment organizations do their jobs according to the international standards, and with integrity and technical competence. We have some examples: one is operated by NIST, the National Voluntary Laboratory Accreditation Program, which accredits laboratories to ISO/IEC 17025. And you notice there is a standard for the accreditors to operate under as well, ISO/IEC 17011, so the laboratories and certifiers have their standards, and the accreditors have theirs. There are also private organizations. There is a private body that also provides some services for accreditation, and the American National Standards Institute has a suite of accreditation services which covers both laboratories and product certification.
As he said, surveillance is a key part in many of these programs. Both in accreditation and certification there are surveillance functions. In certification, it is to give the certification organization ongoing confidence that all of the products continue to meet those requirements. For accreditation organizations, it is there to make sure that the accredited laboratories and product certification bodies continue to do their jobs in accordance with the standards and operate with competence and integrity. So it sets up a system, a hierarchy, in conformity assessment, where we have accreditation bodies assessing certification organizations and laboratories. Those organizations in turn assess the compliance of products and other areas with requirements, and the certification bodies can certify based on those test results and their surveillance activities. So it sets up a hierarchy that we can use and depend on in order to have a system that can be functional. There are a couple of different models that we have seen used to put the government’s role in place in a certification program, which we will go through. Here we have one with a limited government role. We have healthcare IT products and the requirements from the selected standards, and these are the characteristics of the product that need to be assessed. We have laboratories and certifiers conforming to the ISO/IEC standards for testing and certification. You see a plus here: we may choose to develop sector-specific requirements. So we may want to create some additional requirements that focus on the technical competencies necessary to do the kind of testing that the healthcare IT standards will require. And you may find additional requirements beyond those would be necessary to have a system that we could have enough confidence in and that operates appropriately.
The private sector can be running all of these levels: we have private organizations at each level, and the products can come from private sector vendors. The government is over here, participating cooperatively and in a coordinated fashion with the accreditation body or bodies to make sure that their processes meet the needs. So this would be one potential model. It would limit the amount of government resources necessary to be deployed. One of the interesting things in conformity assessment is that it’s not necessary just to have funding for this. This would be an ongoing system, and it needs to be sustained, and sustained well. Generally speaking, these systems are designed so that the private sector pays for the services from the organizations it applies to, so the vendors of the products will pay for testing and certification; the laboratories and certification organizations get paid for this, and that helps sustain the system over time without long-term funding. A more significant government role would be very similar, but now we have some form of agreement, be it a memorandum of understanding or a contractual agreement, between the government and the accreditors, to make the government’s role in the process stronger. There are other ways of doing this, but these are two examples that could be deployed in an area like this. We have designed programs like this in other sectors, and we use that experience to guide what kinds of models we see working over the long term. One of the things that is valuable to consider as you look at this is: how much conformity assessment do we really need? We have a lot of government agencies and private sector organizations that come to us and ask for assistance in developing certification. A lot of times what they really need is something else. We have developed things like this, tools to help us understand that what we need to do is look at the perceived risk associated with nonconformity.
How bad will things be if these products do not meet the requirements? Can society tolerate the nonconformity? And over here we have the independence and rigor of the conformity assessment system itself, and down here we have the "we should do nothing" area: if the risk is low enough, we should not put any resources into conformity assessment. As we move up, we get to systems where we can probably rely on first-party declarations of conformity, and in here there is a lot of room for developing hybrid systems; maybe we would require third-party laboratory testing but not third-party certification, and allow the vendors to claim or declare compliance. So there are different ways that we can use the system to adapt to different needs. The thing we look at, again, is primarily risk: the amount of resources that we throw at these kinds of problems should be proportional to it, and this risk-based model tends to drive a lot of the thinking as we develop these systems. If we do too much, we will add cost and burden to the system. If we do not meet the confidence needs, we just fail. So we have to find the sweet spot between overdesign and a design that makes the system function efficiently and effectively, with the right cost burden for conformity assessment. As we said, if there are significant penalties for noncompliance, we can probably reduce the conformity assessment; there would be a tendency not to be motivated to bring noncompliant products into the marketplace. For other kinds of products, if we have a functioning recall ecosystem (the automotive industry is a great example of a functional recall system), we can recall nonconforming products before the societal impact is intolerable. So we have some examples of systems that we have built for other applications.
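The risk-based "how much conformity assessment" matrix described above can be sketched as a simple mapping from perceived risk to assessment approach. The risk levels and labels below are invented for illustration; the real decision, as the speaker notes, also weighs penalties, recall mechanisms, and cost.

```python
# Sketch of the risk-based selection described above: low risk gets no formal
# assessment (the "do nothing" area), moderate risk a first-party supplier's
# declaration (possibly a hybrid with accredited testing), high risk full
# third-party certification with surveillance. Thresholds are illustrative.

def assessment_level(risk):
    """Map perceived risk of nonconformity to a conformity assessment approach."""
    if risk == "low":
        return "none"                       # the "do nothing" area
    if risk == "moderate":
        return "suppliers_declaration"      # first-party SDoC, maybe hybrid
    if risk == "high":
        return "third_party_certification"  # certification plus surveillance
    raise ValueError(f"unknown risk level: {risk}")

for level in ("low", "moderate", "high"):
    print(level, "->", assessment_level(level))
```

The sweet-spot argument in the text is exactly this function's contract: choosing a heavier level than the risk warrants adds cost and burden; choosing a lighter one fails to deliver the confidence the market needs.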
These are examples of other areas where NIST has provided assistance in developing the structure and process of a conformity assessment program. This is a diagram of a system that is likely to be used for the rollout of the Internet Protocol version 6 technology; the U.S. government is likely to be an early adopter, and we are concerned that the systems that adopt this technology continue to work and serve the public and government well. We have built a system that is primarily based on supplier’s declaration of conformity, and the process works like this: we have laboratories who test the products. They test against requirements that have been determined. The labs are accredited for this testing, so we have an accreditation program, with a third-party accreditation body to give us confidence that the laboratories can run the tests and that the test results are high-integrity test results. The test results are returned to the vendors, and they evaluate the test results and make a determination whether these demonstrate that the product submitted for testing meets the requirements. If that is true, the vendor creates a declaration of conformity. The declaration of conformity is used in the transaction as the vendor’s evidence that the product meets the requirements. In this case it specifically ties back to the accredited testing laboratory and the test report for that product. And this is how the transaction happens within this conformity assessment system. Again, this is not a certification program, but a supplier’s declaration based on accredited test results, and one that is likely to be used successfully, though it has not been deployed yet, in an IT-related system. And to be timely, here is a system that we are helping the industry build for the safety of toys.
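The supplier's-declaration flow just described for the IPv6 program (accredited lab tests the product, vendor evaluates the report and issues a declaration traceable to that report) can be sketched as below. The vendor, lab, and report names are hypothetical.

```python
# Sketch of the SDoC flow described above: an accredited laboratory returns a
# test report; the vendor evaluates it and, only if it demonstrates compliance,
# issues a first-party declaration of conformity that cites the lab and the
# report. All names and identifiers are invented for illustration.

from dataclasses import dataclass

@dataclass
class TestReport:
    lab: str
    lab_accredited: bool
    report_id: str
    requirements_met: bool

def issue_declaration(vendor, product, report):
    """Vendor's first-party declaration, traceable to the accredited lab report."""
    if not report.lab_accredited:
        raise ValueError("report must come from an accredited laboratory")
    if not report.requirements_met:
        return None  # the vendor cannot declare conformity
    return {
        "vendor": vendor,
        "product": product,
        "lab": report.lab,
        "report_id": report.report_id,
        "declaration": "conforms to applicable requirements",
    }

report = TestReport("Acme Labs", True, "RPT-0042", True)
decl = issue_declaration("ExampleVendor", "ExampleRouter", report)
print(decl["report_id"])  # RPT-0042: the declaration ties back to the report
```

Note that the compliance decision sits with the first party here; the accreditation of the laboratory is what gives the declaration its integrity, which is the distinction the speaker draws from a certification program.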
Everybody remembers in 2007 there were highly publicized recalls of toys, some because of lead content in excess of requirements, and others because there were design issues with small magnets used in toys; the standards for safety did not anticipate the features and characteristics of those products and did not have requirements for the hazards associated with them. So we have a certification model. We have an accredited certification organization. This is a much more complex system because of the evidentiary inputs to the system. One is test results, initial type testing and periodic testing from accredited laboratories. We have a design hazard analysis, which is conducted by the manufacturers themselves, to look for emerging hazards. And we have a system to verify the quality systems in place at the production facilities. All three of these ingredients, or pieces of evidence, come to this accredited certification organization. The certification organization evaluates that evidence, makes a compliance decision, and issues a public attestation of the toy’s compliance with the system’s requirements. When this system is rolled out, that public attestation will be in the form of a mark on the product or the toy product package that will be on the store shelf. They own the certification mark; toy manufacturers enter licensing agreements with the certification organization so that it can authorize use of the mark for toy manufacturers whose products have been determined to comply with all the requirements of the system. It is a very complicated system to address a very complicated process and issue. We have manufacturing of certain products in some locations and design of those products in other locations.
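The toy-certification evidence flow described above, with three evidence streams feeding one accredited certifier that makes the compliance decision and licenses the mark, can be sketched as follows. The function and manufacturer names are illustrative placeholders.

```python
# Sketch of the toy-certification flow described above: accredited lab test
# results, the manufacturer's design hazard analysis, and the production
# quality-system assessment all feed the certifier's single compliance
# decision; only certified products may carry the licensed mark.
# All names are invented for illustration.

def certify_toy(test_results_pass, hazard_analysis_pass, quality_system_pass):
    """Certifier's decision: all three evidentiary inputs must be satisfied."""
    return all([test_results_pass, hazard_analysis_pass, quality_system_pass])

def license_mark(manufacturer, product, certified):
    # The certifier owns the mark and licenses its use only for certified toys.
    if not certified:
        return None
    return f"{manufacturer}:{product}:MARK-LICENSED"

ok = certify_toy(True, True, True)
print(license_mark("ToyCo", "MagnetSet", ok))  # ToyCo:MagnetSet:MARK-LICENSED
```

The structural point is that no single evidence stream is sufficient; a failed hazard analysis blocks certification even when all laboratory tests pass.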
We had a myriad of different test requirements that have to be tested against, and a lot of laboratory competency is necessary to do the bulk of the testing needed to demonstrate conformity with all the documents, or standards, for toy safety. So it was a very large-scale certification program. And that is the end of the presentation. Thank you.


Thank you, that was a great presentation. Any questions from the work group?


Do you have any examples of certification, or the process, where it has been driven by the government, where there is reimbursement sitting out there and a product has to be certified for meaningful use to get that reimbursement paid? A lot of the examples that you showed us were edging toward a seal of approval that this was okay. Do you do anything like that, or have any examples?


There are several programs that we have been working on with a variety of different agencies which affect how federal grant money is spent. I do a lot of work in my organization at NIST with the Justice Department and the Department of Homeland Security. One of the issues is the effective use of grant money for first responders. One example is body armor. There is a newly developed body armor testing and certification program set up by the Department of Justice, and there is also a grant program that provides money to state and local law enforcement agencies to purchase body armor for their officers. We want to make sure that this money is used effectively; we don’t want them spending federal grant dollars on ineffective body armor. So we work with the Department of Justice to test body armor in an accredited laboratory, and the Department of Justice in this case operates as the certification organization itself. Based on that certification, the manufacturer’s name and model of body armor go on a specific list that the Justice Department maintains. That is where a grant recipient can go and evaluate which models are acceptable for them to purchase for their officers. So maybe not exactly similar.


Was that a case where a certification program existed prior to that, or was there nothing to test that function?


The program itself was built before the industry had any program. The program was focused on the grant money and making sure it was used effectively. The original program put in place was a testing program and not a certification program. Over time that turned out not to meet the Justice Department’s needs, and what we did over a period of time was basically re-engineer that into a full-scale certification program.


Adam Clark, Lance Armstrong Foundation. We will be hearing later today about the best practices for adoption of HIT, and this is critical in my world of cancer, where people will be going to their primary care doctor and a specialist, and HIT can enable coordinated care. Looking at some of this logistically, though, is there a risk that the certification will cost too much or require too much work, so that smaller practices may decide it’s not worth my time, it’s not worth my effort, to become certified, and you create a division between small practices and cancer care centers, etc.?


Is there a risk? Yes, there is always that risk. As you saw in the presentation, a lot of the thinking in designing the system is geared toward finding the balance between cost and confidence. And the issue that you run into is that confidence is not free.


What I thought you were asking is whether the physicians could be certified.


What I am concerned about is the smaller practices, if it becomes too costly for smaller practices to adopt it.


It is the products being sold that are certified; it’s just that they are paying for the product. It is not the providers being certified. That is what I thought I heard you say.


I guess a clarification: the certification that you are discussing, is it a certification of the physician in the practice, or of the product he is buying and implementing for the practice?


A smaller-sized practice adopting a certified product: if the product is expensive because the certification is expensive to actually get, will there be a barrier?


There is always a cost associated with conformity assessment. There is no question about that. There are costs for running a test. If all you need to do is test a product, and you get to market a million of them based on one test, then the vendors get to divide the cost of the testing across the total population of copies that they sell. So I cannot answer the question straight out, because I do not know all of the factors involved. The design has not been developed here, but part of that design is the balance between cost and confidence. So we always look at what the confidence needs of the stakeholders are.


How much cost are we adding to the system, versus the confidence that is needed; that drives what the system looks like and how it gets implemented.


Bill Heiman. I wanted to make sure I understand what NIST’s role is in this particular process. The way I am thinking about this is that the Office of the National Coordinator will describe what needs to be certified, and then NIST will design the process. Is that what I’m hearing?


I think we will assist in the design of the process and help ONC make a more informed decision. I think that is the way it’s going to work.


ARRA actually calls for voluntary certification programs. So we are here to help.


There is currently a certification organization already for health information technology. So would your role be dealing with the Office of the National Coordinator, or the current certification organization, or both?


We have been doing both.


Okay.


Would you play a direct accreditation role, either as the contractor, or would you more set the terms for what an accreditation or certification will look like?


In certification right now, we do not have a certification or accreditation role. NIST operates the National Voluntary Laboratory Accreditation Program, and in some cases it is a good solution. We work closely with other organizations who are in the business of running accreditation programs for product certification. And that probably would be the primary role. We do not anticipate developing a product certification organization as part of our role.


To follow up on that, let’s say ONC wanted an accrediting organization, or perhaps a certification organization. ONC might have a contract for different accrediting organizations to bid on.


You could have several models: one where there was actually a formal agreement between ONC and the accreditor, and then a different model where ONC played a role in the accreditation body’s structure and policy and all of that, but really did not have an agreement. So there are a lot of opportunities for developing that relationship and seeing how it works.


Does NIST do any of those roles? Do you monitor their work? Is that something you don't want to do, or just do not do?


I think the answer is there are very narrow areas where NIST does those activities and similar things. We have generally found that working with organizations that do this as part of their daily work tends to be more efficient and effective. Not that it can never be done or it's not a possibility, but in our experience those generally are not the best solution.


We have an organization that already exists. What advice would you give us on how the organization should be monitored?


I think the important thing that we need to look at is to understand what the requirements are for a competent, high-integrity certification in this area, and we need to spell them out in a document. What we typically recommend is that those requirements are added onto the international norms for certification programs, in this case ISO/IEC Guide 65, and we would add those extra requirements and work with them to make sure that it includes both the requirements of ISO/IEC Guide 65 and these additional requirements, so that this particular sector did its job and met the needs of the sector.


Mark Perez. With new product certification you brought up the concept of surveillance. Is that done by the same third party, or can it be done by a different party?


The certification body could contract out the tasks associated with it. Let's say that we are certifying toys and we will use post-market surveillance. So we will go to the store shelf and retest them, and they could contract a laboratory or several laboratories to test them and report the results back. But the decision to maintain the certification or not maintain the certification must be undertaken by the certification body, according to the international standards. That seems to be an approach that works.


Larry Wolfe. You mentioned many different examples of certification testing. I sort of feel like the scope of what we're looking at here is a lot broader than the examples you have used. So I wanted you to comment on the scope of what we're trying to take on, and models that might support that scope.


It's interesting. If I look at the portfolio of the things that we are looking at with industry and government, and the different kinds of conformity assessment models: right now, from the technology perspective, the bottom end is toys and water-efficient toilets, and the top end is defense systems against shoulder-fired rockets for commercial aircraft. So we are not bound by how technical or technologically sophisticated products are. We can assist in the design of a program that can meet the need across that spectrum. If we are talking about certifying a person, or a product, or the implementation of a product on site, those are very different things. There are conformity assessment processes to deal with all of that. But that begins to expand and broaden in many ways. So if we are looking at individuals, asking whether they have the right knowledge, skills, and abilities, and have demonstrated them, to do this kind of practice, that is a different kind of conformity assessment exercise: a personnel certification, versus certifying software or an integrated hardware-software platform. They are all different applications, but there are systems in place for the most part, and there are international conformity assessments that address all of those.


I want to push similarly on industry examples. Two parts. Are there examples or models that were particularly apt? With electronic health records, you do not just put them out on a network. So, examples on that. And we talked earlier about breaking systems into components, and you painted an image that you might have a system that comprises components from multiple independent vendors that also work together. So if I am given a government grant to crunch numbers, and they want my spreadsheet to work, they want to certify both the spreadsheet and the computer on which the spreadsheet is run, and they might be made by different vendors. How do you certify the whole system?


I will answer the second one first. The second is a supply chain issue. If you look at the information technology sitting in this room, laptops or the projector: it has a plastic enclosure that is made of plastic material, made in some raw form and sent to somebody who molds the enclosure. It has printed wiring boards in there, sent to a different organization who puts the tracks on the printed wiring boards, and maybe another organization adds the components, before it comes to the organization that rolls this thing into an end device. In the world of electrical safety the device itself is certified to a standard. In this case there is a standard called UL 60950, the standard for the complete device, and there are also safety standards for the printed wiring boards themselves and the critical components such as cords and connectors and capacitors, and the plastic material itself as well as the enclosure material. All of these things have their own certification programs, and they are combined together by someone who makes the complete product, and the complete product is submitted for certification. Instead of having to reevaluate whether the printed wiring board is good, the certification body gets to look at the information from the certifications of the supply chain of the equipment and avoid a lot of redoing of work. It also provides the industry a lot of benefits, because the manufacturer of that laptop or projector can alternatively source printed wiring boards from a multitude of certified vendors who are in the system. So conceivably you could make a system that has several tiers to it. You could have component tiers and a tier for the compilation of the components into a system.
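
The tiered model described above can be sketched roughly as follows. This is only an illustrative model: the class names, the roll-up logic, and the component-standard pairings are invented for the example, not any real certification API (UL 60950 for the complete device is from the testimony; the component standard names below are hypothetical placeholders).

```python
# Sketch of tiered certification: pre-certified components roll up into a
# complete product, and the product certifier checks component certificates
# instead of re-testing each part.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    certified_to: set = field(default_factory=set)  # standards this part holds

@dataclass
class Product:
    name: str
    standard: str                 # the complete-device standard, e.g. UL 60950
    components: list = field(default_factory=list)

def certify_product(product: Product, required: dict) -> bool:
    """Accept the component tier if every critical component already holds
    its required certification; only the assembled device is tested anew."""
    for comp_name, comp_standard in required.items():
        comp = next((c for c in product.components if c.name == comp_name), None)
        if comp is None or comp_standard not in comp.certified_to:
            return False          # missing or uncertified component
    return True                   # component tier passes

# Hypothetical laptop whose board and cord were certified upstream.
laptop = Product("laptop", "UL 60950", [
    Component("wiring_board", {"BOARD-STD"}),
    Component("power_cord", {"CORD-STD"}),
])
required = {"wiring_board": "BOARD-STD", "power_cord": "CORD-STD"}
print(certify_product(laptop, required))  # prints True
```

The design point is the one made in the testimony: the top-tier certifier consumes certificates from the supply chain rather than redoing the component-level work.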


I want to make sure, because I think there might be a distinction in the example that I used. You are still talking about one vendor delivering the final product of all of these components together. And I am thinking of, say, I buy a Mac and I can load Excel or I can use Google Docs. But the combination of the spreadsheet and the computer is sort of my own system, tailored for my own use.


You are looking for unique integration.


It could well be.


Probably the best analogy that we have for that is the built environment. When you build a structure, there are building codes that give you the rules for the structure. Right? The contractors bring in conduit, rebar, concrete, and assemble it on-site into a structure that meets the code, and they have to use conduit that is certified to the conduit standard. So the people who come in and verify that integration of certified stock confirm that they got it right. That is a model that could be applied to something like this. The difficulty that we get into is that you have to do one of these conformity assessment exercises, an inspection, on every integration of the components to give you confidence that the integration was appropriate. But yes, there are models that can do it.


Joe Heiman again. Are there any accrediting organizations at the present time that could accredit a certification body for health information technology?


As long as they have the right set of requirements. As I mentioned, all accrediting organizations who are operating on a large scale are moving to, or have adopted, ISO/IEC Guide 65 as their standard of operation. So if we define what, in addition to this general operating certification program, we need in order to have a good healthcare IT certification program, then they could take those requirements and deploy them effectively. So the answer is yes, but we have to make sure that we develop what the requirements for that particular application of certification are.


And we gave some examples. Two of them right now are in the business of accrediting organizations that do product certification. So yes, there are at least a couple out there for us to draw on.


John Glaser. Also the ease of learning, ease of use, and adaptation to change. What has your experience been in the industry in getting your hands around both measuring and perhaps certifying those areas, or areas like that?


I guess when I looked at that question, I thought about it a lot. I think the difference is, what he was really talking about was monitoring the effectiveness of the products that have been put in place. So assume that all of the products meet the requirements and we are looking at the effectiveness of the system. What we should be doing is looking for data and information to help us enhance the requirements over time. What we are learning is: are the requirements that we put into the conformity assessment system the right set, and are they working for us? If they are not, we should be enhancing them. In the world of standards we do not write standards and walk away. We continually improve them. So what we want to do is build products that meet standards, put them into the appropriate use environment, and monitor the effectiveness of those products. Then go back when we have issues or see opportunities for improvement, improve the technical requirements for the product, and take those new and revised standards and put them in the assessment systems so the next generation is better. This is an evolutionary issue.


Some examples of what we have been working on in the same areas: usability, for instance. That is one of the things for which we have been designing tests and augmenting standards, working with the standards community for several years. We have applied it in the biometrics world and in voting systems, and what they have to reach to be declared effective and useful. And we have also been working with ONC on tests that determine whether the systems actually meet the standards. So we can put prototypes together to see if we meet the standards, and then the tests can be adopted by other certifying bodies. So on many of the things that Dr. Stead mentioned I was saying yes, because there is an understanding of separating the pieces and understanding how they need to be effective together, and being able to design the tests. Something like usability is not soft at all. It's testable, and there are ways that you can come to agreement on international standards and test against the standard.


One of the things I think will be very helpful is to compartmentalize a little bit the technical requirements of the product and the conformity with those requirements. That is what conformity assessment is. So we can use what we learn in the marketplace to improve the standard, or we can say that maybe the products are not conforming to the standard, so we have to fix the conformity assessment process; then improve the standard and implement it through conformity assessment.


Getting back to CCHIT: when we started, we wanted to work with the vendor community and raise the bar through a three-year roadmap. Have you seen that type of approach work effectively in other industries, or have you seen challenges associated with it? It's a moving target, and vendors are challenged to meet a moving target. What are your thoughts on that evolutionary strategy?


I will start with the voting system example. That is in my world, too. That is an effort where we are providing the government testing capability for government systems. So we are working with vendors and setting the bar, but the standard is there, and the bar is the standard. Being able to demonstrate that systems are meeting that standard is what we have been working on with the vendors. The vendors are actually supporting the laboratory that is doing the testing. They are paying for that. But it's a constant interaction with us and the community.


Did that standard stay the same, or did the bar rise over time?


The standard has been set for almost 20 years.


That is the point I’m getting at, is there a need to raise the bar?


I think that concept is a valuable one, and it works in a lot of areas very well. The world of safety is continually monitoring: are the standards that we put in place achieving implementations and products in the marketplace that are safe enough for our society? We have a lot of regulatory agencies that help us understand what is happening. An excellent example is electrical safety: the standards have improved over the years, and the cycle works pretty well. Where the cycle is difficult is when you have interoperability issues, because with interoperability you have to be careful as you are improving the standards that you do not lose the initial interoperability, or you have to find a way to enhance it.


So great care has to be taken in interoperability issues.


Larry Wolfe. One of the areas that has been cited as a success in safety is airlines. And I assume there are a lot of interacting components that make up that safety. Could you comment on how the standards process and the certification process work in that industry, and whether that is a good analog for us, especially over the decade?


I do not have enough interaction with or exposure to that system to comment.


Adam Clark. One of the issues, obviously, with health IT is ensuring that there is security behind it, and is surveillance going to be a part of that? Will you look at these systems to see whether there are breaches of security, so to speak, in patient identification? And if so, what is the recourse if these systems are out there and we find out that there are ways to obtain this information?


That is something again that I struggle with all the time, because we have the responsibility of setting the standards for the federal government on cyber security issues. So the balance is always in play with what the standards need to be and how the testing can be augmented to make sure that the appropriate things are being tested before these products are deployed. I am pleased that you bring it up, because it is certainly one of the most critical pieces that we will need to be paying attention to.


[ Captioners Transitioning ]


Is there a typical time to get that done?


Standards can take a while, the way the United States does it, and Gordon can amplify. We believe the standards process should be driven by the private sector. We do not dictate, as some other countries do, that the standards will be thus and so. It's a consensus-based, voluntary process that can take longer than anybody wants it to take. It can also move more expeditiously when the right people are in the room and the [indiscernible] is there. We have seen both extremes. It gets a lot of bad press because the process is so thorough, and good, but it can be accelerated by, again, providing the right kinds of testing tools, the tight loop between the results of testing and where the standards are developed early on, so there is clarity about what works and what doesn't. You can really accelerate the results, but it can get bad press because it can take a while to get consensus. The job we have in this space is to move as quickly as we can.


Terrific. Thank you for a terrific presentation and for making sure we stay on schedule, and Dr. Fall, we appreciate your presentation. The work group will now have a 15-minute break; we will resume at 11:15 a.m.


[Recess until 11:15 a.m.]


I will ask you to resume your seats. Dr. Joe — will be our next — sees patients every day. Could everybody find their seats so we can get started. We have five representatives from the vendor community here. The first thing I want to make sure everybody understands: the fact that these particular five vendors are here has nothing to do with the government endorsing any of them in any special way, only that they want to give us some information.


We have allotted about 15 minutes for each of the vendors. And we hope you will not only keep within the 15-minute time, but if you can shorten it a little so we have more time for questions, that's even better.


Without further ado, we will start off with Sheldon.


— Quality Systems is a healthcare IT vendor, a significant player in the IT space. We have fiscal-year revenues of $245 million and over 1,200 employees, and we are publicly traded on NASDAQ. We are an ambulatory HIT provider with over 45,000 physician licenses.


The products we provide are electronic practice management along with EHR. We do revenue cycle management and also community health systems. Probably one of the key features is integrated solutions for the healthcare community we serve. One question we were asked to talk about is what has been our experience with the certification process, and I would like to comment that we have been certified in years 2006, 2007, 2008, and for the pilot in 2009. I think it is relevant for this group that our experience with certification has been a very positive one. It has helped the physician groups we service, made us follow standards, and made us as a company invest significantly in R & D, so we can continually improve and ensure our future certification. We consider this investment in R & D a very good thing for IT, and very good for the community we service.


Another question we had was who should perform the certification, and whether there should be more than one group or other groups involved. We believe, as one of the vendors, that CCHIT should be the sole certifying body. You might ask why we believe that. We think they have the experience, they have been doing it for four years, they are up to speed, and we believe the process is working. They are doing a good job. They have a broad base of stakeholders: physicians, health plans, health systems, hospital and physician groups. They include the IT vendors, pharmacists, public health, medical records, and hospitals. And they have plans to expand into the specialties. My question would be why would we change a good functioning system? Also on that same subject is developing criteria. If we put in a new entity, it would be four years behind where we are now; I don't think that's a wise decision. The time required for any new entity to take public comments would also be large. One of the things we have — let me go back here — we strongly believe there should not be more than one certifying body. You might ask why that is our position. If you have more than one certifying body, with different criteria, this would confuse the market, and we don't think that's a good idea. We think on a going-forward basis CCHIT should be the sole certifying entity. The core system should have a high bar; we don't think you should lower the standards. We think you should continually, as one of the gentlemen here mentioned, raise the bar. We think that's a very good idea. We think the specialties should be an add-on after the core is met, so that if you are doing certification for a specific specialty, they should meet the core and then, on top of that, get additional certification for the specialties they are involved in.


We think also, with respect to open source, that all those people should be included in the same certification process; it should be no different. They should be able to certify systems just as anyone else does.


Another question was whether the certification should be [indiscernible] specific. We think again it should be both: in order to do it properly, it should be broad-based, with the specialties on top of the core. An analogy would be: you have a medical degree, fine; then you go and get board certified in a sub-specialty. That would be the analogy that we would support.


With respect to another question asked, should certification embody vendor fitness, we would like to weigh in on that subject too. We think absolutely. We think the vendor should be financially sound and should have significant R & D programs. Why is this the case? We want to be sure the business is going to be around to continually qualify and support its product. We think also this would avoid embarrassment to CCHIT for certifying somebody who is not around. In screening viability, you should examine recent profitability, cash, and balance sheets.


The next question was should certification be viewed as a seal of approval. Our position is no, and you might ask, that's a weird answer, what's your thinking here? We cannot certify the processes, such as vendor implementation and support, nor can we certify that the physicians will use it properly. Because we can't certify the process, we think it should not be read as a seal of approval.


With respect to how vendor systems, in-house systems, and open-source systems should be certified: again, we believe they should be certified in the same way as all the vendors are certified, using the same standards, and open-source should be site certified.


How should criteria relate to the privacy aspects of ARRA? We believe privacy should be an integral part, encompassing the social history piece and the privacy piece, and it obviously should be HIPAA compliant. We should be documenting any release of information or information access, and when we give out information and, because of privacy concerns, the record is incomplete, there obviously should be a notice on that information going out that we have incomplete data. Another characteristic is we should have what I would call a life-and-death, break-the-glass feature, in which case, [indiscernible] the patient is in dire need of help, we can break that privacy condition in terms of giving out life-saving information.


We believe the privacy settings should be driven by the patient or patient advocate. The system can provide software features, but how they are finally used should be up to the patient and custodian. Healthcare IT should provide the frameworks for these things, but the actual use should be in the hands of the patient and the custodian.


We would also say, as somebody mentioned earlier, that in order for privacy to be maintained, we believe there should be severe penalties associated with breaking that privacy.


With respect to security requirements, which are also integral to this, we think they should be directly related to the EHR and be a function of the EHR. The security should be built in along with the privacy, and the elements of process, how that is used, should be left to the users.


With respect to summary comments: we feel strongly there should be one certifying body, and we recommend that on the basis of clarity, clarity for the industry. We also believe the bar needs to be set high with respect to the core, and specialties should be added on top. Then, certification should not involve just checking the box. It should be verified that the features are really functional, operating at a live site. To do otherwise is to invite what happened in another industry, the real estate industry, where people had liar loans and lowered their standards, which collapsed the housing market. We are looking for very robust growth in healthcare IT. We think the opposite should happen: continually keep a high bar and verify information. It should be verified by the CCHIT certifying body, even to the extent of looking at live sites to make sure this information technology is indeed being used properly in live sites.


Thank you.


Thank you.


Thank you very much. That was an ambulatory EHR vendor. Now we will hear from David McCallie, vice-president of medical informatics, and John Travis.


Thank you for the chance to present this testimony. John and I will do this in tag-team form. John has directly supervised all of our current certification efforts and has insight into the details. I will try to provide you an overview. I am relieved to see William Stead has left the room; what I am proposing is considerably smaller in scope and more focused than the grand vision, and that's doubly true because Dr. Stead was responsible for my taking up this career, as my first boss right after college. He would probably be disappointed in what I have to say here for failing to meet the high standards he tried to instill in me at that time.


We think the sharper the focus, the better. I agree with Sheldon there. The market needs clarity, and needs to move quickly. The timetable set by the statute is aggressive. Regardless of where the bar is set, it's going to be a scramble. We think there are three areas that will come into focus. They all have something in common, which is that they are really focused around the goal of achieving meaningful use. The first is interoperability; Dr. Stead and I would agree, interoperability is probably the most important thing to focus on in the short run. The second broad area would be functional behaviors. These systems are going to be purchased by providers who expect them to be able to do certain things well enough to qualify for meaningful use, and those functional behaviors should be what we certify against. The third, less tangible but equally important, is product integrity. The purchasers expect the systems to be able to maintain intrinsic security, privacy, and data validity. That's not something easy for an end user to measure, so it should probably be part of the certification process.


This slide is more or less to repeat the point I tried to make at the beginning, which is that we think meaningful use should be the sole target of certification for at least the next four or five years: in the shorthand we have come to use, whatever is necessary to qualify for stimulus funding, and to avoid stimulus penalties in the future when the funding rewards expire. Focus certification on the stimulus, would be our recommendation, and the other existing certification efforts underway should be aligned as much as possible with the stimulus-driven certification criteria. In particular, the e-prescribing Stark exemption, the annual recertification effort, and the MIPPA e-prescribing incentive should be harmonized around the same criteria and cycle, to minimize confusion for the vendors who build systems and the purchasers, who have to ensure they qualify for these particular bonuses and incentives.


We agree with the numerous suggestions that certification should switch from an all-or-none model to a modular approach. I am sure this will be talked about in great detail. We would simply suggest that the definition of a module be based on subsets of meaningful use criteria. People could assemble a system if need be out of existing pieces and new parts, based on the pathway necessary to qualify. Someone may have an existing e-prescribing system, but need to buy a system that does CPOE, and a third that does quality reporting statistics, and put together a package that makes sense to them for meaningful use.
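
As a hedged illustration of that modular idea, each module would certify a subset of criteria, and an assembled package would qualify when the union of its modules covers every criterion. The criterion and module names below are invented for the sketch, not from any actual certification program.

```python
# Toy model of modular certification: a package qualifies when its
# modules jointly cover all of the (hypothetical) meaningful use criteria.

MEANINGFUL_USE = {"e_prescribing", "cpoe", "quality_reporting"}

# Each certified module attests to a subset of criteria (names invented).
MODULES = {
    "legacy_rx": {"e_prescribing"},
    "new_cpoe": {"cpoe"},
    "analytics_pkg": {"quality_reporting"},
}

def package_qualifies(module_names):
    """True if the chosen modules together cover every criterion."""
    covered = set()
    for m in module_names:
        covered |= MODULES[m]
    return MEANINGFUL_USE <= covered  # subset test: all criteria covered

print(package_qualifies(["legacy_rx", "new_cpoe", "analytics_pkg"]))  # True
print(package_qualifies(["legacy_rx", "new_cpoe"]))                   # False
```

The point of the sketch is simply that module boundaries defined by criteria subsets make mix-and-match assembly checkable.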


We think [indiscernible] is problematic, at least at this stage of our industry and the products. After hearing Bill's talk, I am beginning to believe maybe we should start all over. He really painted an unsettling picture of the maturity of our industry from a software point of view, though he and many of us in the room have spent our lives in it. Evaluation of a system would be difficult, if not impossible, to build into the certification process at this time. One reason for that is I don't think we know enough about the right way to solve some of the usability problems to lock them into received wisdom at this point. That might stifle innovation. The other problem, in our experience with clients, is that many of the problems of usability have to do with improperly configured systems. Improper by whose definition is where the rub comes in. Our clients choose how they wish to configure it. Sometimes those choices aren't the best choices. It would be difficult to certify a system for usability absent an actual implementation at a site.


We think the certification criteria should be based on objective standards. The ideal, as we heard from the excellent presentation from the folks at NIST, would be a machine-driven certification, where you have a harness, can hook up a system, run a test suite, and get an answer. Absent that, we think the criteria should be as objective as possible, and absent that there should be a human, multi-person panel involved so we don't run the risk of getting juror bias into the assessment of a system.
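
A minimal sketch of that machine-driven idea: a harness feeds test inputs to a system-under-test and compares outputs against expected results, yielding a reproducible per-test verdict with no human judgment. The system, the message format, and every name here are made up for illustration; nothing reflects a real certification harness or EHR interface.

```python
# Objective, machine-driven conformance check: run a suite of
# (input, expected_output) pairs against a system and record pass/fail.

def run_harness(system, suite):
    """Return a per-test verdict: did the system produce the expected output?"""
    return {inp: system(inp) == expected for inp, expected in suite}

# Toy system-under-test that acknowledges each message (illustrative only).
def toy_ehr(message):
    return "ACK|" + message

suite = [("ADT|A01", "ACK|ADT|A01"), ("ORM|O01", "ACK|ORM|O01")]
results = run_harness(toy_ehr, suite)
print(all(results.values()))  # prints True: the toy system passes this suite
```

Because the verdict depends only on the suite and the system's outputs, any accredited body, or a site doing continuous self-assessment, would get the same answer.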


Then, continuous availability. This process is going to happen quickly; the long timetables of the current CCHIT-style roadmap are probably too long for the next few years. We think that as much as possible there should be an ongoing process. You notice the second point there: if the test harnesses could be made available to the sites themselves, not just vendors, to perform continuous self-assessment and know they are maintaining themselves in a certified state, that would be a good idea.


Finally, watch out for hidden conflicts of interest. The certification should be based on the objective standard itself, not any particular network you have to hook up to in order to test the standard; any network-specific agreement should be separate from the certification process, to avoid inadvertently introducing essentially de facto requirements that are not really part of the standards process itself. We agree that an accreditation process, perhaps from NIST, makes sense, particularly if over time it's necessary to support more than one certification body; if the process requires more than one, accreditation is obviously necessary.


John, do you have other comments?


I think one last point on the question of more than one certification body. I agree with the point made earlier that the criteria development that may be put into place by those different bodies is very problematic, and that underscores the need for oversight if that emerges, so there is an assuredness that those certifications hold equivalent meaning. At the same time, if more than one emerges, vendors should be able to certify with one and have full standing for the purpose of the prerequisite requirements of meaningful use, and not be compelled to pursue certification by multiple authorities. They should be equivalent with each other, and an oversight process should assure validity and support market confidence.


I think we are ready to go to the next speaker.


Thank you very much


Now we will hear from two niche vendors, the first is Donald Deieso, chairman and CEO of PeriGen.


Thank you for allowing us to provide our comments today, and a special note for recognizing the importance of specialty solutions. We prefer specialty instead of niche, in that we track the therapeutic and clinical areas in which we serve. I would like to give a very brief presentation on the company, mostly as a representative of the many specialty solutions firms that are making such a significant contribution, and I wanted to highlight, too, advanced clinical decision support. Something that in the meaningful use definition today, as proposed, doesn’t really offer promise until 2013. I should like to give you evidence that it exists today, not only with us, but in many other companies and it’s functioning well.


We are a company based in Princeton, New Jersey; a U.S. corporation. We work exclusively in the area of OB/GYN. We have a prenatal system set up in the hospital systems we serve, to improve patient safety and clinical outcomes through advanced clinical decision support. In short, we connect our system to any host of, or any group of, ancillary systems in the hospital and at the clinic, including any of the enterprise EMR systems as well. The system is distinct, of course, in that at the bedside for the mother, in the in-patient setting, there is a screen, with clinicians interacting with the screen. The data give indication of the mother's condition changing minute by minute in real time, and proper alerts and notifications are given, far beyond those of a linear CPOE and drug-to-drug allergy check. We, like many firms, produce clinical and financial results; our clients have saved over $6 million a year in medical malpractice claims as a result of the system.


Let’s talk a little about the specialty solution and its role in the perspectives we offer today. First, there has not been a specialty certification from CCHIT. We are encouraged by several of the actions in recent months that suggest they are moving that way, and we encourage that direction. In the two categories of certification that exist, enterprise and ambulatory, a critical amount of function is missing for the [indiscernible] or the specialty. We should also share with you the consequences should specialty solutions not be considered in the meaningful use and/or certification criteria.


First, we believe strongly that the real powerful transformative advances occur in the specialty firms. Evidence of this can be found by simply studying the enterprise EMR companies to see how they have been created: by acquisition and merger of exactly those specialty firms over time.


Second, we believe that setting certification requirements based on the lowest common denominator of what’s available from the largest firms institutionalizes an average and discourages innovation. The most exciting opportunities, we believe, for driving improvements in healthcare reside at the fascinating intersection of clinical information and advances in technology. It’s important that those advances be recognized and encouraged, not disincentivized through the measures you take here today. We also suggest that the incentive built into HITECH is a powerful tool: not only will it be used to incentivize and drive this very important transformation of the U.S. healthcare industry, but it may become a de facto standard by which medical malpractice claims are judged, beyond what we are taking up in this room. I can be confident plaintiff’s attorneys will use it to segregate those who are delivering care.


Meaningful use as proposed today does not include entire populations, nor can it. We do suggest, however, that women’s health is a special population: given the 4 million births a year, those 4 million new children introduced into the population of the United States, if not properly measured, may not be given the start needed well into their adulthood. The certification requirements for specialty solutions must be encouraged. I mentioned the encouraging early signs from CCHIT in this regard. We have encouraged them, and we encourage this committee to be sure that we are not disenfranchised.


Defining the desired outcome, in summary: first, modularity, very, very important. Second, clear mechanisms for the disbursement of incentives for the specialty solution providers. For example, we have a number of hospitals that are regional, some rural; they don’t have enterprise EMRs, nor is one on their capital expense program, but they do have a system such as ours in OB, systems making huge contributions. If the incentives are defined so that it is hospital-only, much of those advances occurring today, department by department, will be lost.


We are not here with a quick solution, but we are very clear that it is an issue you hold in your hands, one that significantly changes the fate of these firms.


On vulnerable and protected populations, women’s health and pregnancy: only one measure in meaningful use is associated with women’s health, the percentage of women over 50 with mammograms. And the final bullet, one that is somewhat nebulous in its purpose but quite important to us: a national forum, a continuing forum that advances innovation and continues to share among those in HIT what is actually occurring, is very important, and we support it. Thank you for the opportunity to offer these thoughts.


Thank you very much. Our next specialty presenter will be Dow Wilson from Varian Medical Systems, oncology, rather. Dow.


Waiting for the slides. Thank you. I am Dow Wilson, executive vice president, president of oncology at Varian Medical Systems. Very briefly: a little overview of what we do as a specialty provider; a little about what we think the objectives of the initiative and its application to the oncology setting should be, the things we are talking about; and last, the case for specialty certification in the oncology setting, as opposed to applying general certification requirements to specialty electronic health records.


Who are Varian Medical Systems? A leading producer of radiotherapy, radiosurgery, and proton therapy systems, and we have a significant software business: practice management, treatment planning, and oncology informatics. We work with many centers around the world, and our oncology information systems interoperate with enterprise-level EHRs to ensure access to cancer patient data anytime, anywhere in the healthcare system.


We are installed in thousands of sites in the U.S. and over 1,000 internationally, and in addition we worked collaboratively with the other oncology vendors in developing our view today on the principles for EMR certification, and I think I can say with some confidence our comments represent a very high percentage of the marketplace for oncology-specific EMRs. Our product, called ARIA, is designed to support the ordering, preparation and delivery of chemotherapy and radiotherapy, encompassing a variety of workflow, decision support and charting; specifically, it provides a record-and-verify system for therapy treatments. We have written over 1,000 interfaces to exchange relevant patient information.


In terms of certification and healthcare objectives: Varian and our oncology vendor community fully support the concept of certification and meaningful use as a way to promote safety and efficiency in the delivery of care. Several organizations have put forth recommendations to the federal government for the implementation of HIT certification, including recommendations from the Institute of Medicine and the National Research Council, and they demonstrate the need for specialty-focused certification for oncology. The goals of general, ambulatory and specialty certification can remain the same; the criteria for implementation may be different.


In the few minutes I have, I would like to give a few examples. Recommendation one: electronic health records should contain comprehensive information relative to the patient’s condition, treatment plan, and outcomes. Oncology electronic health records should meet this goal. However, oncology requires specific terminology and data collection to support physicians in the selection, planning, and management of chemotherapy and radiation treatment. These are unique; as mentioned by Don, it is very important to us to develop practice guidelines, decision support, and the many things we talked about before that are germane to what we do in the electronic medical record setting for oncology. As a result, test cases for oncology EHRs should include oncology-specific elements.


Recommendation two: EHRs should have the capability to integrate evidence-based practice guidelines and research results into systems. Oncology-focused systems include practice guidelines and other content specifically for the treatment of cancer, meeting this criterion. However, if a test case or certification is designed with a general ambulatory system in mind, oncology-specific vendors would have to go back, and patients may have to wait for oncology-specific capabilities: cancer staging, chemotherapy cycles, tumor response assessment. Compliance with general ambulatory criteria will delay developments by vendors and could offer less value to the oncology specialist.


Recommendation three: all EHRs should allow physicians not only to manage their own patients and problems specific to the individual, but also what moves beyond the patient and impacts a larger population. Cancer measures, CMS’s oncology-specific elements for data and quality care: oncology systems should permit aggregation, data mining and decision support elements to support good medicine and advance what we are trying to do in oncology.


Data relevant in oncology, particularly trials, toxicity grading, and adverse events, is necessary for quality measurement. Without specialty certification, it would be difficult to determine whether the EMR is capturing the appropriate information to support values relevant to oncology.


On certification of specialty electronic medical records: Varian and other leading vendors believe the oncology department cannot be an information silo if we are to improve cancer care. Looking at the options, we believe it is essential that a separate, stand-alone certification exist for oncology-specific EMRs. Some criteria are universally relevant, but for safety and efficiency in the specialty disciplines, general criteria only provide part of the solution. If we agree that our objective is to improve the quality and efficiency of healthcare, and accept that there are specific and unique needs required to achieve those objectives in the specialties, we must agree that a sub-set of the general criteria, along with specialty criteria, would better serve the specialty community: a way to set a baseline specifically for the oncology community.


Varian encourages the HIT community to continue open dialogue with the oncology professional associations. This interaction is essential to the development of meaningful objectives for certification. It’s crucial that a separate certification for oncology EMRs be ready for the recovery act implementation. Should this not be rolled out soon, it would be extremely difficult for vendors to become compliant as a general ambulatory system, as just discussed.


We want to ensure all providers, specialty or not, are [indiscernible] forced into buying an EMR inappropriate for their practice, and we offer our company as a resource to promote the best outcome for the providers and patients they serve.


We are going to have questions now, if you don’t mind, I am going to ask mine first, because I have the microphone.


I have two questions. The first one is: in my community we have six different EMRs, and we have been trying to set up health information exchange for about five years, six vendors, plus a connectivity vendor. Despite the fact they are all [indiscernible] certified, we are having a hell of a time getting health information exchange started, because they don’t talk to each other, in spite of the fact they are certified to do so.


I would like a comment on that. The second question: one of the possibilities would be to have two separate kinds of certification. One is a CMS certification that the products meet meaningful use standards; that might apply to any kind of vendor. Then there might be some other kind of elective certification that doesn’t have to do with the stimulus package, that might be more similar to what CCHIT is doing now. I would like you to comment on both of those issues, and whoever wants to come in first, that would be great.


I would like to comment on your disparate systems not talking to each other, principally because our company, NextGen, developed exactly what you need, Community Health Systems, the ability for systems to talk to each other; the only requirement is that they be using standards. There are systems to help you do exactly what you are trying to do.


Your second question referred to —


Having two separate kinds of certifications. One that’s CMS, you meet the requirements of meaningful use for the stimulus funds, and the separate one more similar to what CCHIT does now, a more elective certification for any particular entity that wants to be certified.


I would like to comment on that, too. We think you hit the nail on the head, with certification for meaningful use meaning one pot of money, and maybe all the others in a different one. Yes, I think that’s a good idea.


Okay, other comments from anybody?


From the specialty point of view, I think the differentiated certification would make a lot of sense. If you look at medical oncology, 40% of practices are digitized; 70 to 75% of radiation oncology is digitized. One of the big barriers is the financial incentive to make that change. A lot of medical oncology practices are very diffuse, small practices outside the hospital, and providing that incentive would be a big deal for those organizations.


I would like to comment on both questions. First, on interoperability: the current certification effort focuses on just a tiny part of the interoperability equation. It’s like certifying the plug, but not what you plug it into. Until some of those health information exchange standards are settled and agreed upon, it’s difficult for the vendors to know exactly what they plug into. Of all the RHIOs we get requested to connect to, every one of them is different; that’s just not nailed down yet. You can’t stop with certifying one half of the process, the plug; you have to certify the circuit you plug it into. With e-prescribing, that’s much further along. We see proportionately greater success, and it connects disparate systems together.


Second question, certainly agree [indiscernible] further along —


Yes and no. It may be further along because the standards bodies did a better job of rounding up all the stakeholders, the NCPDP process, plus a clearer business model driving from the retail pharmacy side, where they had a clear benefit and reason to push it forward.


On the eligibility and payer side, formulary: likewise, a clear business model for them to make that available. In the RHIO model the long-term financial equation is still debatable. On the certification processes, certainly we were proposing the first of yours, meaningful use certification, so the purchaser has a high degree of confidence that if they purchase and install properly they can qualify for the incentives, and avoid the penalties after 2014. Elective certification I am less interested in; we feel pretty well engaged in the meaningful use issues.


Other questions from the committee?


John, you were in charge of certification for Cerner, is that what I heard?


Correct.


What was the overall impact for your organization?


For us, the acid test was whether there was explicit value beyond certification for certification’s sake; it has to be more than a good housekeeping seal of approval. It has to hold meaning, both in its ability to enable requirements and [indiscernible] something they recognize and can have confidence in. We participated in three different certification programs over the last three years with CCHIT, and I think we look at it as a very foundational element of what’s going on in the market. Early on there may have been a vendor reaction of another thing we have to do, expensive; now it’s ingrained. We know it’s there and have to deal with it. The comment was made by NextGen that vendors are driven to better embrace interoperability than they might on their own. That’s a value; it definitely helped promote that objective and reinforce security and privacy standards, which puts those in a new light. Left to our own devices we would probably pay more attention to security and privacy within the four walls of our client. Those are real values.


Where we have tried to take great care is in evaluating, not just defaulting to going for every certification that comes out, but asking whether it is an ongoing, continuous value. We mentioned in our presentation our concern about the relationship between certification requirements under the Stark safe harbor and under meaningful use. We are driven to an annual certification process with Stark; whatever we say about it, CCHIT certification periods have been effective for two years (it used to be three), and under meaningful use it is proposed to be two years. I think we need a better synchronization of the time period, effective across the federal regulatory requirements that point to certification. To me that’s a consistency that should be addressed. We recognize the value of it. I think it drives behavior that might not otherwise happen in the market, very positive for things that are not the function, feature, user-demanded requirements. It holds value around that.


This is a question targeted at the NextGen and Cerner folks. When we talk about certification for meaningful use, are you communicating a narrowing versus a full CCHIT accreditation, or is there in that comment a different emphasis, specific pieces of low-level functionality —


I will take the first; two things. We aren’t saying certification should only be meaningful use. Certifying bodies are free to develop what is appropriate. On the comment from the specialty vendors: there’s great value in certification for specialties. The Medicaid programs and state HIT initiatives, like Minnesota’s, are emphasizing specialty certifications. We are not negating that. I would turn it around and say that if a certification program is being represented as meaningful use, being geared and scoped for that, then that’s what it should be. It shouldn’t be saddled with de facto requirements. We have experienced situations where bodies participating in certification that CCHIT has farmed certain requirements out to have their own requirements that go beyond the certification requirements; specifically for electronic prescribing, where we have conformance requirements, connectivity-to-network requirements, data collection requirements, and provider participation in the networks that are embedded into the certification we have to attain as a subsidiary certification when we present evidence to CCHIT. We have to look at those things carefully. Our point on the impartiality and independence of the certifying body: the certifications need to take great care not to enable a business model or requirements for anything that goes beyond meaningful use demands.


That’s the gist of that point.


I would like to comment, please. I am somewhat in agreement that you need to have your standards for your core EHR systems associated with meaningful use, and that should be very clearly stated. Those requirements, that bar, should be continually raised as we go on in time. The specialties should be on top of that, as a kind of different, post-graduate degree on top of that certification. That’s how I will weigh in.


I would like to add one maybe slightly dissenting opinion. I want to go back, from my notes, to one of William Stead’s slides on touchstones: less is more; require precise definitions; avoid freezing workflow, content or technology; make data liquidity the foundation. I think that’s an incredibly important point. If we prematurely certify a particular workflow, a way of doing things, we will suppress innovation, and we shouldn’t do that.


We are still a very immature industry. If we had certified cell phones at, say, the Blackberry, we wouldn’t have the iPhone or whatever is coming next. We have to be careful to not aggressively —


Do you worry about introducing any opportunity — that the resulting system deployed might not be as coherent? Any risk there?


Certainly there’s a risk. I think assembling a large-scale system to automate the processes of care across a hospital is an incredibly complicated and difficult process. It takes years to do it; Bill said 10 years. If you build a system out of component parts you have more work than if you buy a packaged system, one of the selling points in the marketplace for years. Still, even if you bought everything, you have to assemble it; nobody supplies the whole system. Certification could make the problem easier than it is today by being aggressive on interoperability issues. Usability, work flow — I think to —


The reality, especially for 2011, respectfully, is that our clients have a variety of systems in use. Whether on a replacement strategy or mid-stream, they will look to their vendors to represent their constituent certifications and incentive requirements. They are not going to say to Cerner, I bought your catalog and it will take me eight years to implement; I need something in 2011. It is a very delicate balancing problem that will have to recognize organizations have invested in whatever their production systems are. They are more likely, especially in 2011, to look to current vendors for updates to meaningful use, not engage in replacement strategies in 15 months. Modular certification is an enabler to those organizations that have invested, for whatever reason, in the production portfolios they have. That’s their point of departure, and it’s a rate-limiting reality to what you do with certification. The bar needs to be set appropriately, but a lot of investment dollars have been spent, and they are what they are. Improvement should be made in systems as they are; the only path should not be replacement, new plumbing. At my house, I am not building a new house for the sake of a new electrical system.


Adam?


Adam Clark, Lance Armstrong Foundation. Oncology-specific issues, certainly, are near and dear to the heart at the Foundation. Throughout the months after this committee was formed, I received lots of calls from the cancer community continuing to emphasize the importance of research in meaningful use, in certification, in all aspects of electronic medical records, particularly as we look at the pediatric population, where 55 to 65% of those patients are on a clinical trial, or as we look beyond to doing comparative effectiveness research. We don’t know in radiation the number of — what will work best, or for prostate the procedures that will work best.


I was wondering if you could comment, particularly in specialty areas, on how clinical research and data collection can play into the certification process: things we need to be looking at, barriers to address, to really try to help communities like the cancer community that rely on this information.


It is a complex question. You mentioned the percentage of pediatric patients on research; if you look at the academic community and the large community hospital community, 30 to 40% of general cancer patients are on some kind of research protocol. There has to be some capability. I know that our system has capability for managing those protocols, and other competitor systems do; what the certification requirements for those research protocols might be, we really haven’t spent a lot of time on, but we would love to think about that with you, as I am sure [indiscernible] will as well.


It is a capability that should somehow be there. It’s just such a large percentage of the cancer population.


[Captioner transition.]


That capability has to be reflected in the requirement.


We need to be more aggressive about interoperability. Can you give us more specifics besides just being more aggressive? What should we do to fix this thing?


I can give you some opinions. Actually, John Glaser and I have exchanged some emails about this. One of the things that I am afraid of in the current timetable for specifying specific meaningful use milestones is the potential of getting out of sync between the interoperability milestones and community-based health information exchange. In particular, the states are aggressively pursuing the ARRA funds available to the states to build RHIOs and information exchange entities; they are not covered by the certification timetables or the certification process, and there is an indication that for, say, the 2011 milestone for the vendors, the EHRs bought haven’t specified what the [indiscernible] is, and the exchanges are going forward to try to build against an unspecified standard. So one thing would be to try to avoid that desynchronization. If we are going to have a 2011 interoperability standard, let’s let it apply equally to everyone and get that nailed down right away. If it will be 2013, [indiscernible] the vendors to whatever standard is de facto, which may or may not be the ones that last in the long run. So that is really a plea for synchronization in the interest of alignment across the communities.


As far as the actual techniques to do the interoperability, I think it will get pretty technical pretty quickly.


There are — and I probably should not go there, because I could spend the rest of the afternoon giving you opinions on exactly the right technology. I think there are standards out there that could be used fairly quickly, based on the document-sharing model called XDS, that could achieve a certain level of interoperability. Whether those are the right standards or not should be debated quickly, so we could either settle and go forward full speed with an XDS-based approach to interoperability or come up with something different. The model that Bill described, where they are sharing a lot of textual documents and finding smart ways to search those, is for example not an XDS model, although it is a very powerful model and may be a better place to start than a very structured model. Some of that uncertainty on the [indiscernible] vendor side makes it hard for us to go and commit substantial resources until we know where we are going to land.
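[For reference, the XDS document-sharing model mentioned by the speaker separates document metadata, held in a registry, from the documents themselves, held in one or more repositories: a consumer queries the registry by patient, then retrieves a chosen document from whichever repository holds it. The sketch below illustrates only that data-flow shape; all class and field names are invented for illustration, and a real XDS deployment uses ebXML/SOAP transactions rather than in-memory objects.]

```python
from dataclasses import dataclass

@dataclass
class DocumentEntry:
    """Registry metadata about a document -- not the document itself."""
    doc_id: str
    patient_id: str
    doc_type: str          # e.g. "discharge-summary"
    repository_id: str     # which repository holds the content

class Registry:
    """Analogue of the XDS Document Registry."""
    def __init__(self):
        self.entries = []

    def register(self, entry: DocumentEntry):
        self.entries.append(entry)

    def find_documents(self, patient_id: str):
        """Loose analogue of an XDS 'FindDocuments' stored query."""
        return [e for e in self.entries if e.patient_id == patient_id]

class Repository:
    """Analogue of an XDS Document Repository."""
    def __init__(self, repository_id: str):
        self.repository_id = repository_id
        self.documents = {}

    def store(self, doc_id: str, content: str):
        self.documents[doc_id] = content

    def retrieve(self, doc_id: str) -> str:
        return self.documents[doc_id]

# Provide-and-register, then query-and-retrieve.
registry = Registry()
repo = Repository("repo-A")
repo.store("doc-1", "Discharge summary text ...")
registry.register(DocumentEntry("doc-1", "patient-42", "discharge-summary", "repo-A"))

hits = registry.find_documents("patient-42")
content = repo.retrieve(hits[0].doc_id)
```

[The point of the split is the one debated in the testimony: certifying an EHR's "plug" says nothing about whether a community registry exists for it to query.]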


I probably did not answer your question.


Well, it was helpful, but as you were talking, I guess one of the things I don’t understand, or maybe I do understand, is: what went wrong? Why do we find ourselves where we have nothing that can talk to each other? Why isn’t it working?


Well, I think it is a bit of a misnomer to say that nothing talks to each other. Every one of us vendors has implemented thousands of interfaces and data flows to all sorts of systems. We have a whole crew of people at Cerner who do nothing but talk to other systems; that is their job, to build these interfaces. The vast majority of these interfaces start with an existing standard plus a small amount of customization on the client side, and that may take as little as a day or two’s worth of work, or it may be more, depending on how unusual the system is that we are interfacing with. There is a tremendous amount of data that flows today; it’s just not plug and play. For example, we present an HL7 2.5 interface out of the box. We have thousands of them, and they work quite well.
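[For readers unfamiliar with the HL7 v2.x interfaces the speaker refers to, the sketch below shows their basic shape: pipe-and-caret delimited segments that a standard parser can split, with the per-site customization coming from how each sender populates the fields. The message content is invented for illustration and is not from any real system.]

```python
# A minimal, invented HL7 v2.5 ADT (admit) message: one segment per line,
# fields separated by "|", components within a field separated by "^".
ADT_MESSAGE = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "200907140930||ADT^A01|MSG00001|P|2.5",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F",
    "PV1|1|I|WARD1^101^A",
])

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_name: [field lists]}."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

segments = parse_hl7(ADT_MESSAGE)
pid = segments["PID"][0]
# For non-MSH segments, field N of the standard is list index N here;
# PID-5 is the patient name, with ^-separated components (family^given).
family, given = pid[5].split("^")[:2]
```

[The parsing is trivial; the interface work the speaker describes is in agreeing, site by site, on which fields carry what.]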


What is not present is the community-level sharing, and that is partly the lack of agreement upon the right standard to use, but I would say it is more a lack of agreement on what it means to have documents and patients shared in the community. Does a regionally based sharing model make sense? Care is regional most of the time, but when it isn’t, like if you go to the Mayo Clinic or the Cleveland Clinic, it is not a regionally based model. Those arguments have been going on since the beginning of Dr. Brailer’s tenure at ONC, and until they are settled (this is the way we share data in the community, manage privacy and consent, and handle the rules across state lines, etc.), the vendors don’t have a lot of incentive to do the extra work to make that plug and play.


If you look at what was actually certified, and kind of get inside that for a moment: CCHIT certified three aspects of interoperability as to the types of data. So we had lab, the values being exchanged; electronic prescribing, for most of what the regulatory regime now refers to as advanced e-prescribing, the Medicare Part D requirements; and the CCD, with nothing in it for a care summary as to a transcribed physician’s document or anything of that nature, so you had a CCD with what would be moderate content. But as of yet, nothing in what CCHIT proposed in 2009 to be part of their comprehensive certification for advanced interoperability tests XDS. So there is truly as of yet no test of the ability to go approach an HIE, present and resolve what clinical documents exist, and request any specific instance of that document. So perhaps why that failure occurred relying on certified systems is that none of those systems were certified to connect to anything for the purpose of collecting clinical data for any patient, for resolving the identity, or to deal with consent. I do know that you will hear from HITSP later this afternoon, and as a part of that we have spent a lot of time on patient identity management, provider identity management, consent, and the things that facilitate that, but they are in the future relative to certification, even with the developments we have had to date.


There are standard mechanisms in place to pass trust from one system to another, but we haven’t agreed upon what you trust in the first place. What is a properly authenticated provider? What is a properly authenticated patient who is requesting information from a system? As John said, these have not been settled, so they are not in the products; therefore you do not get the plug and play that you would like to see.


So the standards have not been set yet, that is the problem?


Yes, in many cases they are not settled, or the parts are not sufficient to build an end-to-end system. The easy parts have been done, but the hard parts have not.


And therefore they are not yet in the certification.


I wanted to go back to David McCallie’s comment that data liquidity was an important point, and I wanted to see if I could take it a step further: does he think it makes sense to make that a fundamental requirement of certification, in particular in the sense of the separation of applications from data?


I think it is part of what we’re heading towards. The proposed requirement of the patient being able to get a copy of their data in the form of a CCD document, or whatever the standard turns out to be, that is the liquidity point right there. That is not 100% of the data in the system, but that is the liquid data, adjusted appropriately. So to the point that we can identify the core structured data elements and make those transferable between systems, I think that is the first step. And I think there should be some control, but I will admit I didn’t follow that. I think that doctor has a list, and I am sure everyone has heard him say what he thinks those elements are: medications, problems, allergies. It is a fairly small list, and I think all of us on the vendor side would agree that we will take the vocabularies with their limitations. ICD-9 is [indiscernible]; how do you describe an allergic reaction, and which vocabulary will we use for that, because there is not a de facto standard. If we settle a few of those things, we can interchange the structured data fairly readily. But the rest of the data that is in the database, the non-summary data, the data that describes the status of a patient in the middle of a multi-phase chemotherapy protocol: if you want to transfer that to another system across vendor boundaries, that is not possible yet; it is too complicated and the standards are not there. So I don’t know that we need to go and solve the most complicated data transfer problem; I think we can make a lot of progress with just the summary data. Many clinicians will take that summary and reassess the patient anyway; that is what they are engaged for. They probably don’t mind not having 100% of the data there; they are engaging in a second opinion with the new provider, to start with the same data and then find out what got missed before and what is seen differently.


We would be thrilled with a CCD that we could actually use to transfer data from one practice to another practice. Right now, we can’t do it, even though all of the vendors are supposedly capable of creating a CCD.


And that is an example of where the full scope of all of the steps is not well enough specified. We could put a CCD on a zip drive and hand it to the other provider, who may or may not be able to import that particular file into their system. You could wrap it into a secure e-mail message, but providers don’t use secure e-mail, because people don’t go through the trouble of getting security certificates in their e-mail systems. So there is a variety of ad hoc ways to solve the problem, but there is not a standards-based way to close the loop. It is doable; it just has not been done.
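[The "summary data" exchange discussed above can be sketched very loosely as exporting and re-importing a small XML document. This is NOT a conformant CCD — a real CCD is an HL7 CDA R2 document with templated sections and coded entries — it only shows the shape of round-tripping the core structured elements (problems, medications, allergies) between two systems. All element names and data are invented.]

```python
import xml.etree.ElementTree as ET

def build_summary(patient_name, problems, medications, allergies):
    """Export a toy patient summary as XML (loose stand-in for a CCD)."""
    root = ET.Element("PatientSummary")
    ET.SubElement(root, "Name").text = patient_name
    for tag, items in (("Problems", problems),
                       ("Medications", medications),
                       ("Allergies", allergies)):
        section = ET.SubElement(root, tag)
        for item in items:
            ET.SubElement(section, "Item").text = item
    return ET.tostring(root, encoding="unicode")

def import_summary(xml_text):
    """The receiving system's side: parse the sections back into lists."""
    root = ET.fromstring(xml_text)
    return {sec.tag: [i.text for i in sec.findall("Item")]
            for sec in root if sec.tag != "Name"}

doc = build_summary("Jane Doe",
                    problems=["Hypertension"],
                    medications=["Lisinopril 10mg"],
                    allergies=["Penicillin"])
imported = import_summary(doc)
```

[The mechanics are easy, which is the witnesses' point: what is missing is agreement on the vocabularies inside each item and a standards-based transport to "close the loop" between practices.]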


Marc Probst. I agree. Say the standards are now put out, defined, and given to you. What kind of level of effort is going to be put on your clients to make these modifications? What kind of time frame do you see, and what is the impact?


I think you have to factor in one more item: [indiscernible] organizations to adopt in the interoperability space. Not all of them, certainly, but a number of them function relatively within their own four walls relative to their health record systems. We have had a lot of discussion within Cerner about, if part of it is that you can connect to an information exchange, what kind of exchange is it that will satisfy meaningful use? Is it that the organization can connect to the medical staff in their offices with whatever system is in place, which doesn’t imply that a big [indiscernible] facilitates that exchange? Pragmatically, as you put it, that is probably what most of them are interested in: exchanging data with the primary care physicians in my community that represent most of the patient population I see. For the most part that will be either my community referral sources or the medical staff, especially in an academic center, and I think that is where the demand will start: that the systems enable exchange with the care settings that refer patients in and that are also the primary referral destinations post-discharge. So that is going to be their first motivation, and we are already hearing a lot of questions from them: will that be a proof point that will satisfy meaningful use, at least early on? Do we actually have to physically connect to some kind of regional HIE that may be in its infancy or planning stage in our geography? There are places where there are very functional information exchanges, but there are also places in the country where that is not true. Yet they are all held to the same standard and the same burden of proof, so we have to take great care about what is asked early and what promotes growth over that period of time: the interoperability point as to what they are connecting to, and when they are asked for the burden of proof for meaningful use, what is that measurement they are going to be required to meet really going to mean?


I am thinking about the time frame around certification; you talked about two years or whatever it might be. I am looking at Cerner or whoever: we have the standards out there, and you can write Release 6.8 that now has those standards in it, but practically, someone has to accept that. I don’t think it is as simple as just applying that code in the hospital or the physician’s office or wherever it might be. I am trying to get an idea of what you think that level of effort might be.


From our experience, we release an update to our software about twice a year that we consider a major update. Absent any other decision, most vendors’ practice is to certify a pretty contemporary version of their software. They are not certifying something released three years ago; they might take code back and certify their immediate prior release.


We try to encourage our client base to upgrade about every two years on an individual basis, to whatever is the currently available release. So historically they are on a cycle of upgrades about every 18 to 24 months, and each of those efforts is probably, in and of itself, a six-month effort at any rate to take an update and be fully converted from what they have from us. Some less, probably not a lot more. So say they sit here today and we certify our next generally available release: the version we took through our own CCHIT certification earlier this year is generally available just as we speak right now. For argument’s sake, if that is what we had for meaningful use, they should be able to take that code right now, barring any decision on our part to take anything back or make it available on some other basis. If there is a gap around meaningful use, and there could be, especially for some of the measurements that are expected to be EHR-based, I think our clients will expect that measurement content and reporting to be enabled by us so it can be reliably collected. They may choose not to use it, and they have the right to do something on their own, but they expect that capability to be there. So there will probably be some work to be done to make that available. So they are looking at between now and the hospital fiscal year starting in 2011 to take what we might be just now putting out as generally available. And I think that somewhere between [indiscernible] and 40% of that time would be spent doing nothing but the upgrade process. And you haven’t even begun to consider how they are already using our software compared to meaningful use. Have they implemented CPOE? Have they implemented e-prescribing? Where they are caught right now, that is an exercise, especially with our clients. I think it was NextGen that said it.
Well, we’re trying to help our clients plan, but we have got to know what we are planning against. It is hard to set the bar if it is not reflective of something that is barely within reach. If they are using the software for everything today, they would still face the upgrade, which would take them, arguably, to the end of the year to get in position using our current software. If there is anything additional on top of that, new development requirements, adoption on their part, net-new modules, that is additional work. So I think we can all look at what an entity that has not implemented CPOE faces; there are lots of studies that will take us there.


I would like to weigh in on that. With the expense to the company, you have to bifurcate it into two areas: one is getting the product out, and the other is how to get it to the clients; two separate issues. The time frame that John mentioned is pretty typical [indiscernible] as well. We have similar time frames to what he quoted, but it is important to bifurcate the two.


Relative to the expense, if you are improving your product, which you should be doing all the time, it is something you are doing anyway. So it is almost not an expense; the certification piece of it is minor compared to the product development you are doing anyway. The incentive for the people that already have our system but don’t have the latest version is the meaningful use piece. They want to be in on meaningful use and the money associated with it.


I am interested in this question for the specialty vendors. I understand from your comments that the current certification process both over-specifies, in the sense that it includes requirements that are not relevant to your specialties, and under-specifies, because it does not include things that are very relevant. I am wondering if the over-specification problem can be dealt with by ratcheting back the requirements to a core set of characteristics that are applicable across a wide range, and in terms of under-specification, can that be handled more in the meaningful use area rather than in certification?


I think the first proposal you outlined is the proposal CCHIT has made with its modular approach; that is, rather than forcing the specialty folks who are advancing that technology in so many deep ways in their areas to come back and replicate the base enterprise functionality, there could be a certain core, and we agree with that concept. As to the question of whether certification can acknowledge the advancements of the specialty firms, that clearly is not on the table today; the meaningful use definition is [indiscernible]. What concerns us, and I think John describes it, is the classic best-of-breed versus enterprise approach. Prior to the Recovery Act, we were prepared to go head-to-head in the marketplace as a best-of-breed technology. But as clients decide what to do, the Recovery Act creates a change in the entire equilibrium, the ecosystem, that is threatening at least to the specialty species.


To John’s point about managing the development team to the various releases of the software: I would be interested in your comments on a certification process where the bar is raised every few years versus maybe giving you the criteria all the way through 2015. Is that a significant value to you in terms of planning, or is it of marginal value?


I actually think it is of great value to us to know where we are headed beyond the year. One of the things that I think CCHIT has done well is their three-year roadmap. Without judging what is on it or in it, we have tended to do our gap assessment on a two-year rolling basis against that roadmap. Three years out there is high variability and it is greatly subject to change, but within the two-year period, especially with the criteria definitions settled, we have tended to lock in on the second draft. Once that is published for public comment, that is a meaningful point to get down to brass tacks about what it is that we need to do. If we wait longer than that, it is too late for our current development year, and if there is a high degree of change at CCHIT, honestly, we would have to hold our fire until we see the public comment weigh in. But that roadmap is very much appreciated, because we are typically working with detailed planning on a one- to two-year horizon and more objective-level planning on a two- to five-year horizon, and it helps inform prioritization. And I would like to think most organizations would view it that way; they are not just looking at what will get them through the next element.


I would like to echo John’s statement; all of the information you can give us in advance is very helpful. The more market information, and the faster you get it to us, the better.


One question: in Cerner’s presentation we had this concept of the test harness, which was interesting. Could you tell us more about that?


Yes. I believe the current CCHIT process supplies test scripts for the vendors to use to test the software before the formal certification, and the idea was to simply extend that to the implementations, so that our clients, the provider groups that are implementing systems, could use those same test harnesses to verify that not only was the software out of the box capable of working, but the implemented version that they are actually running could also pass the test harness process. So it is a simple idea: extend it deeper into the delivery cycle.
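The idea described here, one suite of checks that can run both against vendor software in the certification lab and against a provider's live, configured installation, could be sketched roughly as follows. Everything in this sketch is invented for illustration: the check names, the `ToySystem` stand-in, and its export format are assumptions, not anything from CCHIT's actual test scripts.

```python
# A "test harness" as a reusable list of named checks. The same list could be
# run against vendor code in a lab or against a deployed, configured system.

def check_exports_summary(system):
    # Hypothetical check: the system can produce a patient summary export.
    return "summary" in system.export()

def check_roundtrip(system):
    # Hypothetical check: the system can re-import its own export.
    doc = system.export()
    return system.import_doc(doc)

HARNESS = [
    ("exports patient summary", check_exports_summary),
    ("re-imports its own export", check_roundtrip),
]

def run_harness(system):
    """Run every check against the system under test; report pass/fail."""
    results = {}
    for name, check in HARNESS:
        try:
            results[name] = bool(check(system))
        except Exception:
            results[name] = False   # a crash counts as a failure
    return results

# A toy stand-in for an implemented EHR installation.
class ToySystem:
    def export(self):
        return "summary: meds, problems, allergies"
    def import_doc(self, doc):
        return doc.startswith("summary")

print(run_harness(ToySystem()))  # both checks pass
```

The design point matching the panelist's suggestion is that the harness is data, a list of checks, rather than code baked into the certification lab, so extending it "deeper into the delivery cycle" is just running the same list in a new setting.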


Is it difficult to use these test harnesses? Could a two-person medical group do this?


I think it certainly could be designed such that it would be trivial to use; it is probably not the case that it is today. In an ideal world, it should not be any more complicated than if you use Skype, one of those peer-to-peer Internet protocols for voice over IP, which is what I use to communicate with my kids overseas. All you do is install it, and it plays a message back to you so you know it is working properly. For interoperability testing, we ought to be able to achieve that after nailing down some specifications.


Whose responsibility is it to develop these?


I would think [indiscernible]. It could be NIST, although from what I heard this morning it is not likely to be them, though they could help guide whoever sets one of those up.


I would offer the comment that in CCHIT, a lot of the test tools they used to validate performance and interoperability specifications, part of it was to render a view to the end user, and I don’t know if that is specifically [indiscernible] to a doctor’s practice implementing it to ensure that something they receive is workable, but those tools were developed for certification. And I don’t know that their use is limited to certification; they were open source and tended to be available prior to any certification. I think what we are suggesting is extending that open availability to anybody that has the appetite or desire to make use of them for their own deployments. Now, whether or not they are fit for an end user with a fairly basic level of security or technical savvy is a step beyond where they are today. But the development of those tools was part and parcel of the certification process, because they were the means by which interoperability was certified, so it may be a matter of taking them a step further to help validate use.


Well, I think we are ready for lunch.


Let me say thank you to that vendor panel. I know that some of you came great distances, from Laguna Beach in one case, and Kansas, and some perhaps even further away, but I wanted to say thank you to all of the vendors.


We will break for lunch until 1:15 p.m. Thank you very much.
