Friday, October 28, 2011

Performance, Human Factors and EMR

This weekend my classmates and I are working on various projects and studying for pending exams. One project coming due is development of a Request for Proposal for acquisition, installation and support of an electronic medical records product. Some of my communications with project teammates concerned hardware configurations and what we should assume. This left me thinking about frameworks for performance benchmarking of EMR systems and metrics for usability.

Much work has been done developing performance benchmarks for high-end computing systems. The most widely used framework is Linpack. This de facto standard has several flaws, but serves as the one widely understood performance benchmark. Vendors tend both to tout their scores as part of marketing information and to optimize their codes to maximize them.

I went to the NIST and CMS web sites in search of information on their EMR certification programs and scanned the test outline descriptions. All were focused on basic functionality, and none that I found spoke to performance or usability issues. At this stage of the game, the focus seems to be entirely on completing transactions at all; efficiency will follow later.

A few blog postings popped up when searching the web. Generally they involved discussions of contributors to poor performance. Basic issues tended to center on inadequate infrastructure such as slow or undersized disks, network choke points, under-provisioned system DRAM, and undersized CPUs. There was no discussion of benchmarking.

This is not an exhaustive search, but it suggests a void that needs to be filled. I can imagine a framework of transactions segmented by clinic types. A survey process could identify the basic statistical profile of transaction types. Various clinicians (MD, NP, RN, etc.) could be interviewed and perhaps shadowed to find the types and frequencies of transactions. Time required to complete the transactions, along with keystrokes and mouse clicks, could be recorded. With a sufficiently large library of user profiles, model practices could be defined, ranging from small single-physician clinics to large academic medical centers. In principle it would be possible to use this data to create stochastic models of traffic flows between various nodes, such as data entry stations, servers, backup storage, remote sites, etc. Similar techniques have been used for decades to model network systems and to predict performance in normal operations and in the face of traffic surges and outages.
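The stochastic-modeling idea above can be sketched with a small Monte Carlo simulation. The transaction mix, service times, and clinic arrival rates below are purely illustrative assumptions, not measured data; a real benchmark would draw them from the clinician surveys and shadowing described above.

```python
import random

# Illustrative transaction mix for a hypothetical small clinic.
# Each entry: (transaction type, probability, mean service time in seconds).
# These values are assumptions for the sketch, not a measured profile.
TRANSACTION_MIX = [
    ("chart_lookup",  0.40, 0.20),
    ("order_entry",   0.25, 0.50),
    ("note_entry",    0.20, 0.80),
    ("med_reconcile", 0.15, 1.20),
]

def simulate(arrival_rate_per_sec, n_transactions=100_000, seed=1):
    """Single-server queue fed by a Poisson stream of mixed transactions.

    Returns the mean response time (queueing wait + service) in seconds.
    """
    rng = random.Random(seed)
    _, probs, mean_services = zip(*TRANSACTION_MIX)
    clock = 0.0            # time of the current arrival
    server_free_at = 0.0   # time the server next becomes idle
    total_response = 0.0
    for _ in range(n_transactions):
        clock += rng.expovariate(arrival_rate_per_sec)     # next arrival
        mean_svc = rng.choices(mean_services, weights=probs)[0]
        service = rng.expovariate(1.0 / mean_svc)          # draw service time
        start = max(clock, server_free_at)                 # wait if server busy
        server_free_at = start + service
        total_response += server_free_at - clock
    return total_response / n_transactions
```

Running the model at a light load versus a heavy load shows how response time degrades as the server approaches saturation, which is exactly the kind of prediction a vendor could use to right-size a configuration before installation.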

System vendors would benefit from such models. They would allow vendors to evaluate hardware configuration options for customers and predict performance prior to first installations and planned upgrades. This should enable them to right-size configurations and avoid the frustrations of under-provisioned systems. It would also allow them to optimize their software implementations.


Tuesday, October 25, 2011

Exploitation of ignorance
Recently, I read a very interesting Rolling Stone article, The Truth About the Tea Party, by Matt Taibbi: http://www.rollingstone.com/politics/news/matt-taibbi-on-the-tea-party-20100928?page=1

The article covered some of the history of the party and, at greater length, politicians' cynical manipulation of the ignorance and intellectual laziness of the electorate. It made the statement: “And those people really don't pay attention to specifics too much. Like dogs, they listen to tone of voice and emotional attitude.” For some reason it got me thinking about a speech I once heard at Stetson by Claude Pepper, and an anecdote he told about this being a factor in one of his early losses. With the help of Google I found the following from a 2005 book by Richard Grayson and a 2007 book on FDR by Jean Edward Smith:

In 1950, George Smathers impugned the Senator's morals with North Florida voters by calling Pepper a shameless extrovert whose sister was a thespian and whose brother was a Homo sapiens.

Smathers also charged that Pepper matriculated with co-eds, practiced celibacy before marriage and monogamy afterwards, and vacillated one night on the Senate floor.

http://books.google.com/books?id=P4CJoskzWicC&pg=PA87&dq=claude+pepper+monogamy&hl=en&ei=rrsDToXSB8HSgQexuYCIDg&sa=X&oi=book_result&ct=result&resnum=5&ved=0CDsQ6AEwBA#v=onepage&q=claude%20pepper%20monogamy&f=true

Smathers of course won the election and Pepper learned a hard life lesson.

Wednesday, October 19, 2011

TV Electrical Interlock
Recently I read The Design of Everyday Things, by Donald Norman, which had a reference to the electrical interlocks on TVs. It got me thinking: these are no longer made. Even for CRT-based TVs, end-user repairs are long forgotten. As a child I recall pulling the back off the TV and removing the vacuum tubes with my dad. We carted these over to a local hardware store and plugged them into a test set about the size of a pinball machine. Various settings were made and tube function could be tested; replacements were available for bad tubes. The interlock prevented powering up the TV with the back off, eliminating the electric shock risk from the high voltages in some of the internal electronics. Today most TVs are flat screens, with no tubes and no customer-replaceable parts.

I found myself agreeing with Norman’s message about human factors considerations in design. When I was at Bell Labs, working on forward-looking technologies, we had a human factors team as part of the Lab. The team included experimental psychologists. Their purpose was to interact with the technology developers and inventors to help identify problems and solutions with user interfaces for the new communication services we were coming up with. This sort of design approach is used extensively now in industries like nuclear power and aviation, where safety and minimization of human error are critical. Interestingly, according to one of our MMCi professors, Constance Johnson, this approach is only now getting consideration for Electronic Medical Record and decision support software.


Sunday, October 16, 2011

VistaNet

A couple of weeks back I was chatting with a colleague about the VistaNet project and some of the lessons learned from it. In 1989, I was looking for driving applications that could demonstrate the functionality of an emerging technology, Asynchronous Transfer Mode (ATM), coming out of the telecom industry. I became aware of a funding opportunity: the NSF and DARPA were cooperating to establish a set of projects that would address the challenges of making networks, and the applications using them, operate at gigabit-per-second rates. I’d established contacts at MCNC and the UNC Computer Science department. MCNC had recently acquired a Cray YMP supercomputer. UNC had an advanced graphics engine, called Pixel Planes 5. We approached UNC and MCNC about interconnecting these devices. UNC was interested and pulled in a radiation oncologist from the medical school. He wanted to improve the way radiation therapy treatment was planned for cancer patients. The state of the art involved approximating the patient anatomy using an MRI slice that went through the tumor. He wanted to use 3D representations of the anatomy to do a better job of minimizing radiation exposure to the normal tissue near the tumor site. This became the vision behind our testbed proposal, called VistaNet: connect the Cray and Pixel Planes 5 with a 2.5 Gbps link and deliver, in real time, 3D images of patient anatomy with radiation dose as an overlay to a physician workstation. Change the dose parameters, rotate the anatomy, get new images in a fraction of a second. It was a lot of work and many people were involved, but over the course of 4 years we were able to bring about this vision. I was the project manager and coordinator for the effort, which ultimately came to also include BellSouth, GTE Labs, Fujitsu and NCSU. The politics and intrigue became complex; it was truly a cat-herding experience. The 5 projects selected for funding received a great deal of attention.
There were many publication and presentation opportunities. The press was very interested, although attention always went to the medical application and never to the technology behind the scenes. I learned a great deal about myself in this effort, and came to have a great sense of accomplishment. We overcame many technical and organizational obstacles, and in the process we enabled the development of medical insights that changed the standard of treatment for radiation therapy planning. There were many times at the beginning when it seemed that the technical vision had become overburdened with organizational conflicts and that the team would fall apart. At one of these moments, I changed employers to stay with the project and hold it together.

Thursday, October 13, 2011

Afternoon session at Duke

Joe Smith (West Wireless Health Institute) gave a spirited and delightful post-lunch presentation on the risk aversion of the FDA and how it is a threat to innovation. In particular, he noted that a current point of discussion at the FDA involves data delivery assurance over wireless networks, and how regulatory unwillingness to accept any level of perceivable risk trumps the overwhelming benefit that can accrue from accepting some risk. Joe also contrasted this caution with the near-absence of regulation of tobacco, which carries quite a few well-documented risks. Paraphrasing his point: individual choice is the standard for tobacco use, yet we must be protected from the low possibility of data loss over a wireless connection even when the benefit is substantial (see Joe's article in IEEE Spectrum http://spectrum.ieee.org/health1011). What's wrong with this picture?

I’m reminded of the discussion of this aspect of human nature from the evolutionary biology perspective in How Risky Is It? by David Ropeik. The argument is that we are wired, by the process of natural selection, to overreact to the possibility of risk. The cited example is that of two early humans walking through the savannah who hear a rustling in the grass. One thinks it might be a lion and flees. The other, more rational one, reasons that the odds are the sound is not from a lion, does not flee, and is eaten. Moral: only the overreacting early human lives to pass on his genes. So here we are.

Wednesday, October 12, 2011

Duke University's 3rd Annual Medical Innovation and Strategies Conference

Running all day today at the Fuqua School of Business

A majority of the audience agrees that smart phones are enablers for paradigm shifts in health care. Critical issues include the payment system: if clinicians aren't reimbursed for a transaction, it won't happen, no matter how cool the technology. Also, sensors are just toys unless they are connected to back-end systems that analyze and interpret the collected data.

David Watson, who is a Board Member for Sotera, talked about a wrist-mounted wireless sensor that takes the place of multiple traditional monitoring systems. User-oriented design is critical to success, not just connection to the EMR.

The panel sessions made multiple references to speech recognition and natural language processing as critically needed technologies for EMR to meet its full potential.

Applications based on the iPhone were referenced multiple times. Notable points: the iPhone is a great example of the superiority of design for use over raw technology. At the other end of the spectrum: an iPhone application that claims to cure acne by tuning the display to a curative color. Apparently this application was pulled after intervention by the FDA.

Monday, October 10, 2011

Improved patient management


While in a discussion last week that was focused on the post-EHR-adoption era, one of the participants mentioned the desire for improved patient management by accessing and utilizing data that resides outside of the institution providing care.

His comment brought to mind a compelling example where this is being done. In July, I watched a PBS Frontline episode, which featured the Camden Coalition and the work of Dr. Jeffrey Brenner. A summary of the program can be found at:


Here's a link to the online version of the program:

Dr. Brenner was focused on Emergency Room visits. These events are very costly, and patient follow-up is generally weak to non-existent. Brenner used Medicare payment data and GIS to identify frequent ER users and geographic clusters of them. Once these were identified, he used RNs and social workers for in-home contacts focused on preventive care and education. The results were significant reductions in repeat ER visits, reduced medical costs, and improved outcomes.
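The first analytic step of that approach, flagging high utilizers and geographic hot spots from visit records, can be sketched in a few lines. The record fields, sample data, and thresholds below are hypothetical illustrations, not the Camden Coalition's actual data or methods.

```python
from collections import Counter

# Hypothetical ER visit records: (patient_id, census_block).
# Both field names and values are made up for illustration.
visits = [
    ("p01", "block-7"), ("p01", "block-7"), ("p01", "block-7"),
    ("p02", "block-7"), ("p02", "block-7"),
    ("p03", "block-2"),
    ("p04", "block-7"), ("p04", "block-7"),
    ("p04", "block-7"), ("p04", "block-7"),
]

def frequent_users(visits, min_visits=3):
    """Return patients whose ER visit count meets the threshold."""
    counts = Counter(pid for pid, _ in visits)
    return {pid for pid, n in counts.items() if n >= min_visits}

def hot_spots(visits, min_visits=4):
    """Return census blocks generating a disproportionate share of visits."""
    counts = Counter(block for _, block in visits)
    return {block for block, n in counts.items() if n >= min_visits}
```

In the sample data, patients p01 and p04 would be flagged for in-home follow-up, and block-7 would surface as a cluster; a real analysis would map such blocks with GIS tools rather than simple counts.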

Thursday, October 06, 2011

Why MMCi at Duke

The dialogue with Stephen, posted earlier, left me thinking about recent career decisions. The paragraphs below are taken from the essay I wrote as part of the application to the MMCi program at Fuqua. They relate directly to these decisions. So far, the program has been living up to my expectations as stated in the last paragraph below.

********

There have been a few times earlier in my career when I found it necessary to “remake myself” professionally. The first was as a grad student in a PhD program studying nuclear physics, when I came to realize that career opportunities would be limited. For a variety of reasons I resolved to stop my education after completing an MS and found employment at Bell Laboratories doing software engineering. Later I managed to switch from product development to applied research, which satisfied a yearning otherwise unmet, but required developing many new skills. After the telecom meltdown and the onset of the wars in Iraq and Afghanistan, DoD research funds (which had been my bread and butter) tightened, and it became increasingly difficult to find the external support required to continue what I’d done previously. I found another project at RTI that looked promising, and learned a new set of technologies associated with semiconductor fabrication and packaging. But after working in this area for a couple of years, it’s now clear this path does not have enough market traction to have much of a future for me. It’s time to rethink directions and remake myself yet again.

I’ve given much thought to encore career options. Generally I’ve concluded that I need to do something that can support a reasonable income, providing a financial cushion to hedge against getting caught in the changes needed to restore fiscal order to the Federal government. The work needs to have the right kind of challenges to keep me intellectually and emotionally engaged. This means it must have opportunities, if not a requirement, for continued learning. It needs to present technical problems that need to be solved, preferably at a conceptual level. It needs to have some aspect of giving back to my community (professional, generational, and/or geographical). My thoughts about options to meet these goals have tended along these lines:

Start a boutique/lifestyle business: I’ve done a couple of venture capital backed startups and feel confident that starting a small business is a realistic possibility. There are several issues with this direction, but perhaps most significantly, nothing I’ve identified as a specific business has a service to community component or involves work that I think would hold my interest over time.

Teaching physics and math: I spent 7 years of my life studying math and physics, and then followed a profession where those skills atrophied from disuse. The current buzz about science, engineering and math education suggests a ready demand for these old skills. Re-energizing those physics and math synapses appeals to me and meets many, but not all, of my encore goals.

Become program manager at a research granting government agency: I have often toyed with the idea of going to DARPA, the ARO or the NSF to develop and manage research contracts and programs. The idea of sitting on the other side of the table is appealing. It would be challenging work that would meet my encore career objectives. While I’ve started down this path several times in past years, I have never followed it to the end.

Medical informatics looks like an interesting career direction. With two medical schools, SAS and multiple pharmaceutical companies in the Triangle, it would seem that there will be many job opportunities. Also, the rules being established by ONC and CMS under HITECH, if not unwound by Tea Party Republicans, strongly suggest that medical informatics should remain a growing field. With respect to service to my community: as a baby boomer, I sit at ground zero of two ticking fiscal time bombs, the social security trust fund and the cost of medical care for an age-skewed population. Continuing to work is about the only contribution that, at my pay grade, can influence the social security situation. However, a career in medical informatics seems like a meaningful way to contribute to the solution of an important problem that our nation faces. My current judgment is that this direction represents the best option for meeting all of my encore objectives.

Two aspects of medical informatics appeal to me in particular: 1) the policies and methods involved in aggregation and analysis of data from large population groups, and 2) the policy and technical challenges of defining and meeting the requirements for information security and privacy when aggregating data across institutional boundaries. As indicated in my cost and quality impact essay, I see great potential for reductions in medical costs and improved medical intervention outcomes through data aggregation and analysis. There is a complex tension between the goals of data aggregation, information security, and individual privacy that must be dealt with in order to realize this potential. The problems span technology and policy. Diving into these issues appears to be both challenging and rewarding.


Tuesday, October 04, 2011

Dialogue with Stephen

In July this summer I separated from my employer and enrolled in the Master of Management in Clinical Informatics (MMCi) program at the Duke Fuqua School of Business.

Stephen, who was the VP of Marketing and Sales at one of my startup companies, noticed this change and dropped a note recently. The communication (edited for brevity and relevance) follows.

Dan,

Normally people go to grad school to become what you are, not the other way around. What was your plan here? What do you do next and why that?

Stephen,

The program I'm in runs for 12 months. It's not an MBA program, but a mix of business classes and classes on medical informatics. What's medical informatics, you ask? It's the set of techniques involved in the keeping and use of electronic health records. Lots of privacy issues, lots of opportunities for data mining that yields insights about cost reduction and better outcomes.

Dan,

Interesting plan...what made you decide to go that route? You were/are a distinguished network security and Grid computing man...is this about Obama's interest in digitizing medical data? Privacy and compliance is huge. Just where are you inserting yourself? Consulting for PwC?

Stephen,

Obama's stimulus has created a surge in consulting opportunities. I have some interest in going that route, but don’t yet know if it’s my direction. The larger motivation is that local opportunities are much greater in the new field: two academic medical centers, all the pharma in RTP, SAS, Quintiles, and several others. RTI has a large group that has been working in this field for 10+ years. Networking has greatly diminished as a local option; lots of former Nortel and other employees are competing for a few local jobs. My goals are to keep learning and working, but stay challenged and local.
