Supporting Psychiatric Assessments in Emergency Rooms
Wed, 01 Sep 1995
An inappropriate decision to admit or discharge a psychiatric patient from an emergency room is often the starting point for a series of undesirable results.
The project that the New York State Office of Mental Health (OMH) worked on with CTG was designed to address this issue through the development of a computer-assisted decision model to support psychiatric assessments in emergency rooms. The model was developed with the assistance of an expert panel representing both practitioners and consumers of mental health services.
This report discusses the decision support model and software developed to support the practitioner in gathering and considering all information relevant to an admissions decision.
Report of the Field Test to Evaluate a Decision Support Tool for Psychiatric Assessments in Emergency Rooms
Tue, 01 Aug 1995
This project represented one approach to improving emergency psychiatric services by reducing the number of inappropriate admissions and by avoiding inappropriate releases, which can result in violent episodes in the community. This report describes the field test conducted with practicing clinicians, including the advantages and disadvantages that clinicians found with the decision support system. Recommendations made to the Office of Mental Health at the conclusion of the project are elaborated on here. Details of the prototype system are given, and screen display images are printed in the report.
The New York Office of Mental Health sought a project at CTG in order to improve emergency psychiatric decision making. The primary goals of the project were to reduce inappropriate admissions and discharges, improve client and system outcomes, and reduce inconsistencies in emergency room decisions. To achieve these goals, the project developed prototype decision support software and sought to apply this improved technology in a very harried, complex, and significant decision environment--an environment that deprives individuals of their liberty and consumes significant government resources.
The prototype decision support software was designed to ensure that physicians ask all the appropriate questions needed to make an admission decision, and to help them sort and weigh the relative importance of the answers. The admission decision, however, remains the province of the physician. The software was not intended to, nor can it, replace physician judgment.
Expectations for the CTG project were ambitious. OMH hoped ultimately for a psychiatric assessment product that could be sent to the 166 hospitals in New York for potential use in their emergency rooms. The complexity of the tool and the policies it represents, however, point to the need for much additional testing and revision of the prototype. Nevertheless, while still short of readiness for implementation, the project made much progress toward this ultimate goal. Specifically, OMH achieved:
A better understanding of the possibilities and limitations of technology use within the emergency room
The project enhanced understanding of the emergency room process, including how it differs from setting to setting. This wide variability among emergency rooms highlighted the need for ER assessment protocols to improve consistency across these settings. In addition, much was learned about the possibilities and limitations of computer software in a psychiatric emergency environment. Very important to OMH, the project demonstrated the feasibility of software use by physicians. This knowledge has important value as a guide for further efforts to improve emergency psychiatry, which may include the use of information technology.
Significant progress toward a decision tool for use in emergency psychiatric assessment
Consensus was achieved by a national expert panel on the basic structure of an instrument. Agreement was reached concerning the major areas (modules) to be assessed (such as danger to self) and the core questions within each area. The instrument included areas identified as important by consumers of mental health services and their families. Further, important headway was made in developing consensus about which areas and which questions within areas are related to other modules. For example, responses to questions about drug use and the presence of environmental stressors were recognized as ingredients in the danger to self module score. Although both of these areas need further refinement, they form a good foundation for additional development. Progress was also achieved in establishing the relative importance that should be associated with answers to various questions. Finally, the project evaluation outlined the important next steps for OMH to take in order to finalize the software. This instrument has great value to OMH. It is a product that is ready for further field testing and eventual use as a training device and/or a decision aid in ER settings.
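The scoring scheme described above can be sketched in code. This is a minimal illustration only, assuming a simple weighted-sum model; the module name, questions, and weights below are hypothetical and are not taken from the actual OMH instrument, which combined responses across modules in ways the report does not fully specify.

```python
# Hypothetical sketch of a weighted module score, in the spirit of the
# "danger to self" module described above. All names and weights are
# illustrative assumptions, not the real instrument.

def module_score(responses, weights):
    """Combine question responses (each on a 0-1 scale) into a single
    weighted module score, normalized to the 0-1 range."""
    total_weight = sum(weights[q] for q in responses)
    weighted_sum = sum(weights[q] * r for q, r in responses.items())
    return weighted_sum / total_weight

# Core questions plus cross-module "ingredients" (drug use and
# environmental stressors feed into this module's score, as the
# panel's consensus model recognized).
weights = {
    "suicidal_ideation": 3.0,
    "prior_attempts": 2.0,
    "drug_use": 1.0,                 # input from the substance-use module
    "environmental_stressors": 1.0,  # input from the stressors module
}

responses = {
    "suicidal_ideation": 1.0,
    "prior_attempts": 0.0,
    "drug_use": 0.5,
    "environmental_stressors": 1.0,
}

print(round(module_score(responses, weights), 2))  # prints 0.64
```

Establishing the relative weights, and which cross-module responses count as ingredients in which scores, was precisely the consensus work the expert panel performed.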
Use of an expert panel to achieve both consensus and legitimacy
The dialog between emergency room practitioners and consumers of emergency psychiatric services and their family members allowed the airing of various and divergent perspectives in the expert panel meetings. These discussions enhanced understanding and empathy among the panel members and helped move the group toward consensus. Moreover, since the decision model was defined by a group of experts including practitioners, consumers, and officers of the American Association of Emergency Psychiatry, it has a level of authority and legitimacy well beyond what any one stakeholder could achieve alone. This group also represents a ready panel to be drawn upon in future work.
A basis for future agency-university collaboration
The project initiated and strengthened important working relationships between OMH and the University at Albany that are likely to lead to future collaborative work.
Many public agencies are responsible for programs that try to meet the needs of a diverse set of stakeholders. This project illustrates some ways to address that diversity and to seek consensus on both policies and actions. It also gives some guidance on the value of information technology as a way to bring needed expertise to decision situations.
The use of expert judgment panels is an effective way to identify differences, build credibility, and work toward consensus about complex issues.
While it is feasible to develop a decision model based on the judgment of a single expert, a decision model that would be acceptable in emergency rooms across the state and useful to both specialists and generalists required a consensus among experts from various fields. Achieving a consensus under these circumstances was not a trivial matter. Group decision support techniques developed at the University at Albany and elsewhere were used in this project to facilitate consensus among a diverse group of experts. Although this technology has been applied successfully in a variety of fields, this project marked the first time it had been used in psychiatric assessments.
The method works because it focuses the panelists' attention on the task, makes explicit the reasons for disagreement that are usually difficult to uncover, separates false disagreement from real disagreement, and gives the participants the tools necessary to overcome some of the limitations that prevent agreement.
Prototyping encourages stakeholders to confront issues and make explicit choices.
During the design stage, the most important outcome of the prototype was the way it forced the panel to engage basic issues that had previously been avoided or treated superficially. These issues were (1) what is the specific purpose of the tool? and (2) in what part of the intake process is it to be used? Before the prototype was presented, the panel had discussed but not settled on one of several possible uses for the tool: an aid in conducting an interview, an information recording tool to be used following the patient interview, or a training device. Neither had they clarified the characteristics of the user for whom the tool was to be designed. The examination of the prototype brought these issues back to the surface, which led to more clarification and clearer direction for further development.
Once the prototype was taken to the field, these issues became even more obvious. The physicians who tested the system were able to make very precise comments about which features worked well and which did not, and they were able to recommend changes that were far more specific than would have been possible in any other circumstances.
This clarifying effect of prototyping appears to be an antidote to the common tendency to avoid making unambiguous decisions, especially in a group with conflicting interests and perspectives. Conflict within the group is avoided either by passing over tough issues or by dealing with them in overly general or superficial ways. A prototype necessarily embodies decisions on these issues. Therefore, confronting these decisions in the prototype forces the group and the user to deal with the implications and consequences of one choice or another. This can lead, in turn, to a more realistic and focused discussion of the issues, and to clearer, more detailed specifications for a full system.
Policy advisors can play a useful framing role in software design.
This project rested on contested policies as well as presumably problematic practices. The expert group convened to design the decision model was not a typical software design team. It embodied competing perspectives on the underlying policy problems of emergency mental health services. In this case, the data set to be collected by the system represents a group policy about what is important in or required by an emergency psychiatric assessment.
The public policy principles of openness, participation, and legitimacy are critical to the eventual acceptance of a decision support tool for emergency psychiatric assessments. Groups like the expert panel assembled for this project are often consulted for policy advice and they lend accountability, legitimacy, and political and substantive credibility to public deliberations and decision making.
The use of expert panels and consensus-driven models to design unstructured software applications is a process that still needs refinement.
In most cases, a certain amount of vagueness in a policy statement is acceptable and sometimes even desirable. In this project, however, it produced an ambiguous model whose residual vagueness about purpose and intended user resulted in clear weaknesses in the software application.
An alternative approach would have been to use the diverse expert panel to first create a policy framework, which would set the boundaries for a more traditional system development phase. In this first phase, the expert panel would define the purpose, the user, the categories of necessary data, and the expected results of the system. This expert consensus about the key factors in the ER decision-making process would then have guided a small group of system designers and actual users in creating and testing a prototype reflecting both the panel's consensus on policies and the practical complexities of a working ER. The design team would be responsible for detailed specifications of how the system works and how a user interacts with it in the context of a real-life setting.
The prototype could then have been presented to the expert panel for further review and assessment of how well the prototype performs against their policy framework. Panel reactions and recommendations would become specifications for refinements in later versions of the prototype. The several iterations that would be necessary between the expert panel and the more traditional software design team would probably not have taken more time, would have relied more on the specific strengths of each group, and might have produced a more refined product for the field test.