I have just returned from my third IHE Connectathon. It was a satisfying week characterized by gaining experience with profiles new to me, collaborating with skilled vendor staff, and making new friends. Starting Monday, I immediately developed a schedule of early morning workouts at the hotel fitness center followed by a quick shower and then a stroll outside, across the Chicago River to the north, to find some sustenance. Then it was back to the hotel to start the morning testing session. Following this, we adjourned to the dining area for the proverbial free lunch. Fighting the postprandial slump, we went back to work until dark completing the afternoon testing. I finished up the day with a light dinner and some reading/writing. Lastly, it was off to bed for some rest and the beginning of another cycle.
Here are some observations garnered from my experiences this year. I performed a potpourri of testing in both the PCC (Patient Care Coordination) and QRPH (Quality, Research, and Public Health) domains.
PCC document creation and display: Vendors who have been to previous Connectathons have these tests down to a science. It feels great to test documents that fly through the NIST and content tests with no errors. On the other hand, some of the attendees seem to be poorly prepared. A few haven't even run their documents through the easily accessed tools to self-test ahead of the Connectathon. It makes one wonder what they are thinking! Here is a dilemma to consider concerning the Connectathon testing requirements.
A large number of tests involve what is called the "process document" procedure. Those of you familiar with CDA/C32/CCD will immediately understand what I describe below. Vendors pull documents produced by their fellow vendors from a repository and then demonstrate one of the following options: view document, import document, import section, or import discrete data. Everyone can demonstrate the view option. This would be only somewhat helpful in a clinical environment. It would be the same as having a patient bring an envelope to Dr. X with a copy of a clinic note from Dr. Y for Dr. X to take a look at, but then folding up the document and taking it home. In most cases, Dr. X would find the information more useful if it could be copied and attached to the patient's chart in the outside records section, where it would be available to review later if needed (the import document option). The import section option could be used to improve practice efficiency. For example, past history information could be cut and pasted from Dr. Y's records into a new history and physical form used by Dr. X. Dr. X would still need to verify the accuracy of the information but would not need to collect all the past history information de novo. Patients especially would appreciate this capability. Finally, I think the most useful option will be the ability to import discrete data and attach it to a patient's chart. There are many potential uses: automated reconciliation of information about one patient from different sources will depend on it, graphical display of information from different sources will require it, and many clinical decision support systems will require it as well.
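To make the escalating levels of integration concrete, here is a minimal sketch in Python. The element names follow the HL7 CDA schema (namespace `urn:hl7-org:v3`), but the sample document, the helper names, and the chart structure are my own illustrative assumptions, not any vendor's actual implementation; real CCDs are far richer than this fragment.

```python
# Illustrative sketch of three of the four "process document" options,
# in order of increasing integration. The sample CCD fragment and the
# function/field names are hypothetical; only the CDA namespace and the
# section/code/text element structure come from the HL7 CDA schema.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}

SAMPLE_CCD = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody>
    <component><section>
      <code code="11348-0" displayName="History of Past Illness"/>
      <title>Past Medical History</title>
      <text>Appendectomy 1998. Hypertension.</text>
    </section></component>
  </structuredBody></component>
</ClinicalDocument>"""

def view_document(doc_xml):
    # Option 1: render for a one-time look; nothing is retained
    # (the "folded up and taken home" scenario).
    return doc_xml

def import_document(chart, doc_xml):
    # Option 2: attach the whole document to the patient's chart,
    # e.g. in an outside-records section, for later review.
    chart.setdefault("outside_records", []).append(doc_xml)

def import_section(doc_xml, loinc_code):
    # Option 3: pull a single section (located here by its LOINC
    # section code) so it can be reused, e.g. carried into a new
    # history and physical form for verification by the physician.
    root = ET.fromstring(doc_xml)
    for section in root.iter("{urn:hl7-org:v3}section"):
        code = section.find("hl7:code", NS)
        if code is not None and code.get("code") == loinc_code:
            return section.find("hl7:text", NS).text
    return None

chart = {}
import_document(chart, SAMPLE_CCD)
past_history = import_section(SAMPLE_CCD, "11348-0")
print(past_history)  # Appendectomy 1998. Hypertension.
```

Option 4, importing discrete data, would go further still: rather than copying narrative text, it would map individual coded entries into the chart's own data model, which is what reconciliation, graphing, and decision support ultimately need.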
I think IHE should raise the bar on testing requirements, requiring the more advanced capabilities, in order to promote a vision in which health information technology systems are able to interoperate seamlessly to improve the safety, quality, and efficiency of patient care.
The astute will recognize that my recommendation anticipates rapid progress in the development of health information exchanges. We all recognize there are many unresolved issues concerning HIE, such as governance, policy, trust, and consent. I think the technical issues could be solved even while the thornier controversies await resolution.
CRD (Clinical Research Document) and DSC (Drug Safety Content): These are relatively new content profiles for Connectathon testing. The content creators seemed to have more difficulty delivering conforming documents for these profiles. Most completed the rework needed to finish testing during the Connectathon, though. Hopefully, testing of CRD and DSC will go more smoothly next year. Also, a NIST CDA tool option specific to the IHE DSC profile needs to be developed to make the testing more robust.
RFD (Retrieve Form for Data Capture): This is a really cool profile that was new to me this year. Even after completing Connectathon tests, I cannot say I fully understand how it works. I do know that RFD will turn out to be a terrific profile for use in public health, research, and possibly quality fields. Read through the use cases to gain an understanding of the utility of this profile.
I had fun doing the testing because success required actual, real-time interoperability involving two to four vendors, using the content requirements and available infrastructure. Test partners had to collaborate to solve infrastructure configuration challenges. These were dynamic demonstrations like you will see at the HIMSS Interoperability Showcase, not just mundane lab tests. I am especially interested in the usability/user interface challenge, which I have written about once or twice in the past year or so. I got a kick out of seeing the careful, thoughtful design incorporated into the displays by some vendors. They appeared so intuitive that even a technologically challenged MD could use them without needing to consider the technical machinery running on the back end. For other vendors' setups, one would need to be a computer programmer to get them to work.
Summing it up: I learned a lot about the challenges of interoperability again this year. Communication is almost always the major issue. Some of the profiles are not as specific or as clear as they could be. Optionality in standards always introduces elements of uncertainty in interpretation and implementation. We saw this demonstrated over and over.
As in all human endeavors, the personalities of those we work with can have a big influence. Some monitors were easier to work with than others. My satisfaction and enjoyment of the Connectathon came from the interaction with certain vendors' representatives and with a few special fellow monitors. I would like to specifically mention Lisa Nelson from IHE, who was instrumental in organizing the PCC and QRPH monitor efforts, and Steve Moore, our fearless overall monitor leader. Fellow monitors Monique Speight, Andrew McCaffrey, Didi Davis, and Philip DePalo helped make this year's Connectathon memorable.
Saturday, January 22, 2011
NA Connectathon 2011