ON THE PLAUSIBILITY AND SCOPE OF EXPERT SYSTEMS IN MANAGEMENT

Vasant Dhar
July 1985

Center for Research on Information Systems
Information Systems Area
Graduate School of Business Administration
New York University

Working Paper Series CRIS #98, GBA #85-61

This paper appears in the Journal of Management Information Systems, Vol. 3, No. 5, 1987.

Abstract

Over the last decade there have been several efforts at building knowledge based "expert systems", mostly in the scientific and medical arenas. Despite the fact that almost all such systems are in their experimental stages, designers are optimistic about their eventual success. In the last few years, there have been many references to the possibility of expert systems in the management literature. However, what is lacking is a clear theoretical perspective on how various management problems differ in nature from problems in other domains, and the implications of these differences for knowledge based decision support systems for management. In this paper, I examine some of these differences, what they suggest in terms of the functionality that a computer based system must have in order to support organizational decision making, and the scope of such a system as a decision aid. The discussion is grounded in the context of a computer based system called PLANET that exhibits some of the desired functionality.

1. Introduction

Over the last several years, computer-based modeling systems have made it relatively easy for end users to develop powerful decision support systems in many application areas. Yet, there is a growing recognition that unless such systems are augmented with representational frameworks and inference mechanisms that take explicit cognizance of the intellectual component of managerial decision making, their utility as decision aids is limited. In parallel efforts in the field of Artificial Intelligence (AI), researchers have been concerned with similar issues, although in problem areas that would probably be regarded as more "structured" than those encountered in management. Some of the programs that have resulted from this research, commonly referred to as "expert systems", have received considerable attention because of their ability to engage in judgmental reasoning similar to that of domain experts, and to exhibit comparable levels of performance. It seems natural to ask whether similar systems might be built to support decision making in the management arena, where many of the more challenging problems tend to be fairly open-ended, non-repetitive, and not amenable to analytical solutions. Answering this question requires addressing four more fundamental questions:

1. What is the nature of expertise in domains where knowledge based support systems^1 have heretofore been developed?

2. What is the nature of complex managerial problems that distinguishes them from the above class of problems?

3. Given these differences, what problematic aspects of management problems might knowledge based systems be used to support?

4. What system functionality and architecture are needed in order to alleviate such problems?

An answer to the first of these is based on a summary of existing literature in cognitive science and expert systems.

^1 I use the term "knowledge based system" to refer to a program where domain-specific knowledge plays a major role in inference. This is in contrast to programs that use "weak methods" (Newell and Simon, 1972), that is, syntactic, domain-independent methods to guide inference.
In order to keep the discussion on the last three questions in focus, I restrict the discussion to managerial problems that require modeling problem situations for decision-making purposes. Specifically, I shall ground the discussion in the context of a resource planning problem for which I have attempted to develop a "planner's assistant" (called PLANET) to help planning managers with the formulation and maintenance of planning models to support decision making. The investigation was initiated by planning managers in a large computer manufacturing company (CMC) who expressed concern over the inadequacies of existing computer based support tools and the need for a knowledge based tool to support the planning function. This effort has brought into focus some of the problematic aspects of managerial problems such as planning, sharpened the distinction between such problems and those encountered in other domains, and clarified the implications of the differences for knowledge based support system architectures.

2. The Relation Between Expertise and Problem Type

The type of knowledge required to solve a problem is influenced by the degree to which the task has been formalized [37]. As a domain becomes better understood, formal theories or normative models are articulated. These provide a basis for understanding and solving problems within that domain. In the absence of this formalization, problem solving and understanding are more likely to depend on informal, intuitive, possibly unarticulated models. In this section, I consider the nature of problem solving in domains that lie at three different points of this "structuredness" spectrum: highly formalized domains where clearly identifiable bodies of knowledge exist, less structured domains where expertise is more implicit but nevertheless identifiable, and unstructured problems where the knowledge brought to bear in solving problems is evolutionary and often "distributed" across several individuals. The last of these is characteristic of managerial planning, where the information that is used to construct models for decision-making is continually changing.

2.1. Expertise in Structured Problem Domains

There have been many psychological studies of human problem solving, mostly in problem domains that would generally be considered "well structured". Broadly speaking, the problems studied have either involved "common sense" reasoning pertaining to everyday physical phenomena [16, 24, 8, 20, 12],^2 or specialized knowledge from highly formalized domains such as physics or algebra [22, 38, 30, 6, 5].

^2 Sometimes referred to as "naive physics".

Although humans are generally competent with naive physics problems, competence in solving real physics or other scientific problems is less common.
Larkin [22] explains this phenomenon as follows: "...the process of mentally simulating events so as to predict their outcome, a facility possessed by most people for common contexts, is extended and refined in a skilled scientist to become a sharp and crucial intuition that can be used in solving difficult, complex or extraordinary problems. Novices, lacking this extended intuition, find such problems difficult" (Larkin, 1983, p. 75).

Several studies of problem solving in these areas have contrasted expert and novice behavior in order to understand the nature of this extended intuition. A common finding has been that the quality and speed of solution is influenced by the nature of the representation adopted. Experts appear to possess the functional equivalent of a large set of perceptual patterns and an "indexing scheme" that enables them to perceive the important features of a problem. If the problem is not exceptionally difficult, they often work "forward" without trial and error (i.e. without the need for backtracking) from general principles toward results that "include" the solution. Chi et al. [6] explain this in terms of the ability of the expert to rapidly categorize the problem into an appropriate "principle-oriented" schema. Once correctly classified, axiomatic knowledge can be used to solve the problem in a primarily top-down manner.

Many studies of human problem solving behavior have involved the design of simulation programs. Several of these programs have been used for theory development and validation in domains such as statics [30], dynamics [26], and electronics [4]. Evidence gained from observations of human problem solving is typically used to judge the validity of these computational models. An understanding and measurement of the "quality" of expertise is facilitated considerably because of the existence of a stable, clearly identifiable body of knowledge in the form of theoretical principles or normative models. Not surprisingly, expertise in these areas appears to be highly correlated with individuals' abilities to recognize and apply the appropriate physical principles involved. The major usefulness of computer based systems as support tools in these domains appears to be as intelligent tutoring systems that can take cognizance of students' naive concepts about scientific domains, and facilitate the transfer of a principled body of knowledge to novices. Several experimental systems along these lines have been built for symbolic integration (Kimball, 1983), electronic troubleshooting [4], axiomatically based mathematics [39], probability theory [1], and a consultative system for MACSYMA.

2.2. Expertise in Expert Systems

Expert systems research has been influenced by a growing recognition that high performance programs are not likely to emerge through the clever use of a few powerful domain-independent techniques, but through a systematic formalization and use of large amounts of domain-specific knowledge. The implications of this shift toward a "knowledge based" approach are well summarized by Goldstein and Papert [18]: "The fundamental problem of understanding intelligence is not the identification of a few powerful techniques, but rather the question of how to represent large amounts of knowledge in a fashion that permits their effective use and interaction.
The current view is that the problem solver (whether man or machine) must know explicitly how to use its knowledge - with general techniques supplemented by domain-specific pragmatic know-how. Thus we see AI as having shifted from a power based strategy for achieving intelligence to a knowledge based approach" [18].

Most AI research in expert systems has involved the development of large knowledge based systems in problem areas where consultative decision support is a practical necessity for solving difficult problems. Major efforts have been in medicine [31, 36, 41],^3 geological exploration [15, 10], mass spectroscopy interpretation [23], and computer layout [25]. In contrast to physics-like domains, these areas are less well understood. Because of this, it is much harder to measure expertise against a formal, axiomatized body of knowledge. Rather, expertise tends to be implicit, manifested by consistently high performance with difficult problems. These problems typically involve uncertain, ambiguous, and fragmentary data. An expert must therefore judge the reliability of facts in order to clarify the problem, and acquire additional evidence in such a way as to discriminate among competing conceptualizations of a situation. In effect, "noisy" data coupled with an inherently large search space requires the use of intelligent heuristics, typically refined through experience, in order to impose pragmatic constraints on complex, open-ended problems. A major reason for the impressive performance levels of expert systems has been the extensive efforts by system designers at formalizing this mostly experiential, often subjective knowledge extracted systematically through experts.^4 In fact, an important benefit of this knowledge extraction exercise is the systematization of previously unrecorded or unexpressed knowledge. Some researchers consider the primary contribution of constructing expert systems in such domains to be one of theory formation, thereby moving such problems into the category of "structured" problems.

^3 CASNET [41] specializes in glaucoma assessment and therapy, MYCIN [36] in antimicrobial therapy, whereas CADUCEUS [31] deals with the whole of internal medicine.

^4 By the same token, a continuing problem with such systems has been that their carefully crafted knowledge bases tend to be extremely fragile - system behavior often changes in unforeseen and undesirable ways when knowledge is added.

2.3. Expertise in Managerial Problems

In the types of problems discussed above, the expertise involved is typically individual. For managerial problems, however, it is useful to distinguish between problems where individual expertise or normative models are involved, and organizational level problems involving inputs from multiple individuals. Many attempts at developing models of expertise for administrative problems have focused on the individual. Such models, some of which are embodied in computer based systems, have been designed in domains such as loan assessment and trust management [7], portfolio management [9], financial diagnosis [3], capital budgeting [2], and welfare eligibility [40]. In contrast to individual problem solving, organizational level problems introduce several types of complexity into the modeling process. These complexities are well chronicled in articles describing the early attempts at building large corporate simulation models. In these efforts, detailed mathematical models of organizations were constructed for complex problems where closed form solutions were infeasible [19, 29, 34, 35].
A major motivation for developing such systems was that they made it possible to evaluate the impacts of alternative policies, opportunities, and external events (all operationalized as parameters of the simulation model) at the level of the firm. The major knowledge inputs into such models consisted of assumptions about the organization and its external conditions, obtained from multiple sources in the organization. These were then translated into detailed mathematical models for decision making. The essential features underlying this type of modeling activity are summarized as follows:

1. Model Formulation as Assumption Synthesis: formulating models is an inherently underconstrained exercise involving the generation of alternatives for various parts of the task environment and the making of choices from among them. Since these choices are often tentative, they can be viewed as assumptions or premises on which expectations and projections are based. Any quantitative model must be understood to be conditioned on one such set of symbolic assumptions. Model formulation as assumption synthesis is discussed more formally in section 3.2.

2. Distributed Expertise: formulating models for decision-making involves many individuals from different levels and functional areas of an organization. There are seldom individual experts for broad-based organizational modeling; instead, knowledge about the alternatives in various parts of the task environment is contributed by several individuals. At higher levels, policy issues shape top level decisions. These provide the context for lower level strategies and decisions, which can be expressed in terms of an algebraic/mathematical model. The form and implications of distributed expertise are discussed more formally in section 3.1.

3. The evolutionary nature of models: decisions are not "one shot" affairs. This contrasts with problem solving in expert systems and instructional systems in structured problem domains, where solutions are typically "one shot", that is, the decision maker obtains case data, engages in a consultative dialogue (with colleagues or a system), and a solution is obtained. Rather, in an ongoing enterprise, decisions are made in a context established by previous choices. New information is evaluated in light of existing assumptions and expectations. In some cases, the new information may be assimilated cleanly into the existing conceptual framework, perhaps resolving certain ambiguities or uncertainties in the prior assessment. In many cases, however, the new information can be accommodated only if prior assumptions are appropriately modified, perhaps leading to radical restructuring of all or part of the situation model. Mechanisms for managing evolutionary models are described in section 3.3.

It should be noted that our use of the term "distributed expertise" is qualitatively different from expertise in other domains in that it is neither anchored by a stable body of knowledge as in physics, nor based on consistent virtuoso performance in some area such as medicine.
Rather, it is a consequence of the necessary diffusion of responsibility across multiple departments or individuals in an organization. The discussion so far is summarized in table 1, which draws out the essential features among the problem types in terms of five key features. In the following subsection, we discuss the implications of these differences for knowledge based decision support for management.

[Table 1: The three problem types compared along five key features.]

2.3.1. The Role and Scope of Knowledge Based Support

Unfortunately, a fundamental problem with large scale organizational modeling is that the richness of the modeling activity -- the problem solving involved in formulating the algebraic model itself -- is not preserved systematically. The formulation or synthesis activity is in fact the most challenging and creative part of the modeling exercise, the part that shapes the structure of the mathematical model. Yet, if the relationships between a large algebraic model and the symbolic assumptions underlying it are not faithfully preserved, the interpretation of the model becomes difficult, and modification of such models in light of a changing reality can be time consuming, ad hoc, and error prone.

Their pragmatic problems notwithstanding, the basic objectives of such modeling efforts were reasonable and are still worth pursuing. If we recognize that it is actually the formulation and reformulation of the model based on changing assumptions that is most problematic for a manager and his support staff, it is in this activity that knowledge based support is most needed. Conceptually, this can be achieved by representing the ancillary symbolic knowledge about models, which includes knowledge about the assumptions underlying the various model components. With such knowledge, the system can become an active partner in reasoning about changes to the model instead of burdening the user with the complete responsibility of maintaining and exploring models. From a design standpoint, what is needed are structures and mechanisms for representing and manipulating the qualitative data that forms the basis for the quantitative model.
This emphasis on the design of the quantitative model from fragmentary qualitative data (as opposed to selection from a predefined set of models) requires a computer-based architecture that is capable of representing knowledge that lies outside the scope of current day modeling systems. In the following section, I describe such an architecture, which has been shaped by the concerns articulated above. I limit the discussion to the synthesis and maintenance of qualitative models only; it is assumed that if such a model is maintained, an algebraic model corresponding to it can be formulated.

3. Knowledge Based Decision Support for Planning

Planning is an important activity in most large organizations. Considerable time and effort of individuals from different parts of the organization can go into building and maintaining models for planning. Several types of qualitative knowledge are involved in developing such models. However, most current day modeling systems do not adequately represent such knowledge, thereby placing a heavy burden on the decision maker to maintain the correspondence between the knowledge that can be represented within the system and that which cannot. In this section, I describe a system designed to represent and manipulate the diversity of knowledge involved in problems such as planning. With this functionality, the system can play a more complete role in supporting decision makers.

3.1. Knowledge About Alternatives/Assumptions -- Distributed Expertise

Conceptually, an existing model can be viewed as being the end result of a process involving consideration of a range of alternatives (assumptions) from various parts of the task environment. These alternatives may pertain to decisions at various levels of abstraction. For example, in the CMC manufacturing environment, these assumptions pertain to the computer technology to be used in the product and the processes to be employed in manufacturing it. Figure 1 shows a small set of alternatives about technology and testing processes that might be considered in such a context.

In PLANET, knowledge about these different parts of the task environment has been partitioned across a "society of agents" designed to represent standard areas of the planning activity or individual specialists in the different functional areas of the organization who have responsibility in the planning process. These specialists are represented as "objects" in HOUSE [32], a Franz Lisp object oriented programming environment that is similar in spirit to the FLAVORS package [27]. The objects correspond to the real world entities in the domain under consideration. Referring to figure 1, each of the alternatives corresponds to an object that contains knowledge about a local part of the task environment. Responsibilities of an object (which corresponds to a domain specialist) include responding to decisions being taken in other parts of the manufacturing environment and communicating its decisions so that other specialists may also make appropriate adjustments to their parts of the task environment. These "adjustments" are carried out using "action oriented knowledge", which we describe shortly.
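To make this representation concrete, the sketch below shows one way such a specialist might be organized as a data structure. The actual implementation consists of HOUSE objects in Franz Lisp; the Python rendering here, and every name in it (Alternative, Specialist, respond_to, announce, and the planner interface they assume), is an illustrative assumption rather than PLANET's code.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Alternative:
    """One option in a specialist's part of the task environment."""
    name: str
    resources: dict  # demands on the objectives vector, e.g. {"capital": ..., "space": ..., "labor": ...}

@dataclass
class Specialist:
    """One agent in the 'society of agents' -- a functional area of the plan."""
    area: str                                     # e.g. "board technology", "test process"
    alternatives: list = field(default_factory=list)
    current_choice: Optional[Alternative] = None  # the tentative assumption currently in force
    reasons: list = field(default_factory=list)   # why the current choice was made

    def respond_to(self, decision, planner):
        # React to a decision announced elsewhere, e.g. by dropping locally
        # incompatible alternatives (constraint knowledge is assumed to be
        # reachable through the planner in this sketch).
        for constraint in planner.constraints_for(self.area):
            constraint.apply(decision, self)

    def announce(self, choice, reason, planner):
        # Commit, tentatively, to a choice and broadcast it so that other
        # specialists can adjust their own parts of the task environment.
        self.current_choice, self.reasons = choice, [reason]
        planner.broadcast(self.area, choice)
```

The bookkeeping slots shown here (the current choice, the reasons for it, and the remaining alternatives) correspond to the responsibilities described next.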
Other, bookkeeping oriented responsibilities of a specialist include keeping track of its current choice (with respect to whatever decision(s) it is responsible for), the reasons for it, and possible alternatives to the existing choice. The implementation details of this are described in Dhar [13].

[Figure 1: A small set of alternatives considered in the course of formulating a plan in CMC's timbaktoo division, spanning corporate divisions, policy alternatives, alternative methods, and alternative diagnostic processes. Choices at the lower end of a line indicate alternative ways of accomplishing those indicated at the top end of the line.]

3.2. Assumption Synthesis as State Space Search

There are two sources of "action oriented" knowledge that are important in assumption synthesis. First, the problem domain itself provides constraints that reflect certain relationships among different parts of the task environment that must be realized. For example, in the computer manufacturing environment, two such domain-specific constraints (which we illustrate via an example shortly) are:

1. "A decision to employ embedded etch board technology rules out using test processes designed for surface etch technology."^5

2. "Using Hitech's etch process requires using Hitech's heatsink technology too."

Both of these constraints are indicated in the search space shown in figure 2. As long as such constraints are applicable, the problem solver is in a "constrained mode". Thus, a choice on what technology to use would rule out certain testing processes. This could in turn trigger other similar rules, setting off a chain of choices. As long as there are such choices to be made -- either due to a constraint or because there is only a single alternative with respect to some decision -- the program remains in this constrained mode.

There is also a second, quite different way by which choices are made. This is when all possible ramifications of a choice have been propagated and the problem is not yet fully solved, leaving the program in a "quiescent" state. In such situations, a "forced choice" is necessary in order to continue with the formulation process. This is a characteristic of problems that are inherently underconstrained, that is, the constraint relationships alone are not sufficient to make choices in all the required parts of the task environment. A forced choice requires the program to focus on some area of the task, and evaluate the set of alternatives available there. PLANET assesses the desirability of available alternatives on the basis of how they contribute toward the goals and objectives of the organization. This is operationalized as a pairwise comparison of alternatives on an "objectives vector" consisting of resources such as capital, space, and labor.^6 The choice is determined on the basis of the resources required by the alternatives in light of the organization's resource availability picture.

^5 Embedded etch board technology refers to boards where signals travel through the body of the board as opposed to its surface only. For a computer manufacturing company, the decision to use such a technology is a strategic one and has important ramifications for decisions in related parts of the task environment.

^6 Because the program must also sometimes compare high level alternatives for which detailed resource tradeoffs are impossible to assess before the details about these alternatives have been specified, "macro level" knowledge is used in such situations. Basically, this heuristic knowledge consists of high-level associations about how the various alternatives typically compare across the various resources.
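The cycle just described can be summarized in a short sketch. This is an illustration of the search behavior rather than a transcription of PLANET: the prune method on constraints, the score function, and the slot names carried over from the earlier sketch are all assumptions.

```python
def synthesize_plan(specialists, constraints, available):
    """Alternate between constrained propagation and forced choices until
    every part of the task environment has a selection (a 'complete plan')."""
    plan = {}                                          # area -> chosen alternative
    while any(s.current_choice is None for s in specialists):
        # Constrained mode: apply domain constraints (e.g. "embedded etch rules
        # out surface-etch test processes") and take any decision for which
        # only a single alternative remains, until nothing more fires.
        changed = True
        while changed:
            changed = False
            for c in constraints:
                changed |= c.prune(plan, specialists)  # drop ruled-out alternatives
            for s in specialists:
                if s.current_choice is None and len(s.alternatives) == 1:
                    s.current_choice = plan[s.area] = s.alternatives[0]
                    changed = True

        # Quiescent state: nothing is forced, so focus on one open area and make
        # a forced choice by comparing alternatives on the objectives vector
        # (capital, space, labor, ...) against resource availability.
        open_areas = [s for s in specialists if s.current_choice is None]
        if not open_areas:
            break
        focus = open_areas[0]
        best = min(focus.alternatives, key=lambda a: score(a.resources, available))
        focus.current_choice = plan[focus.area] = best
    return plan


def score(required, available):
    # Toy evaluation function: how heavily an alternative's resource demands
    # press on what the organization has available (lower is more desirable).
    return sum(required.get(r, 0.0) / max(available.get(r, 1.0), 1.0) for r in required)
```

In PLANET, the focal area for a forced choice is a critical part of the task rather than simply the first open area, and "macro level" knowledge (footnote 6) substitutes for detailed resource figures when high level alternatives are compared; the sketch glosses over both, and over the case where pruning eliminates every alternative in some area.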
[Figure 2: A small section of a search space indicating a sequence of choices. A terminal node in this space would represent a fully formulated plan incorporating the trajectory of choices indicated by the nodes C1, C2, C3, and C4. The legend distinguishes links where a choice of A rules out B, where B is a possible choice given A, and where a choice of A implies B.]

A "choice point" involving such a comparison is shown in figure 2. The process of successively making decisions pertaining to various aspects of the task, as shown in figure 2, can be viewed as a state space search. States toward the right represent choices being made in increasingly complete plans.

To summarize, two types of "action oriented knowledge" are brought to bear in assumption synthesis. First, domain-specific constraint relationships among different parts of the task environment must be taken into account. Once the choices resulting from these constraints have been exhausted, it is necessary to make forced choices. This requires the program to focus on a critical part of the task, and make a choice based on a heuristic evaluation function that compares alternatives based on their resource requirements and the existing availability of resources. This can in turn lead to further choices based on constraint relationships. This cycle continues until selections have been made from all parts of the task environment.

3.3. Preserved Process Knowledge

The formulation process described above can be viewed as the result of a trajectory of choices in a state space, with the terminal nodes, if generated, representing "complete plans" from which algebraic models can be derived. This includes choices made by the program in its constrained mode, and the forced choices where alternatives are compared across the vector of objectives. Since some of these choices may have the effect of influencing others, the complete plan consists of "clusters of dependencies" in the state space. One such cluster is shown in figure 3. Comparing figures 2 and 3, we can see that a choice is not necessarily dependent on all chronologically earlier decisions, but only on those that directly or indirectly led to it. This "dependency information" can play an important role in the incremental modification of a large plan. Referring to figure 3, we can see that if the choice "3 dimensional testers" is retracted, choices dependent on it, and all of their dependents if any, need to be undone. Revised choices in the affected areas are then made from the available alternatives, as the sketch below illustrates.
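A minimal sketch of this dependency-directed revision follows. The dependents map (recording which choices were made because of which others) and the reinstatement helper are assumed structures standing in for the dependency records PLANET actually keeps; only the recursion mirrors the behavior just described.

```python
def retract(choice, dependents, plan, specialists):
    """Undo a retracted choice and, recursively, every choice that directly
    or indirectly depended on it, reopening only the affected areas.

    'dependents' maps each choice to the choices made because of it, i.e. the
    dependency network of figure 3 (an illustrative structure, not PLANET's)."""
    owner = next((s for s in specialists if plan.get(s.area) is choice), None)
    if owner is None:                          # already undone via another path
        return
    for downstream in dependents.get(choice, []):
        retract(downstream, dependents, plan, specialists)
    del plan[owner.area]                       # the area is open again
    owner.current_choice = None
    owner.reinstate_eliminated_alternatives()  # previously ruled-out options
                                               # come back into contention
```

Because a choice depends only on the choices that directly or indirectly led to it, the recursion touches only the affected cluster of the plan, which is what makes incremental revision of plans containing thousands of choices feasible.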
[Figure 3: A dependency network corresponding to the choices indicated in figure 2. An arrow from A to B indicates that the choice of B depends on the choice of A.]

In this example, retracting the embedded etch boards decision would also bring back into contention previously eliminated alternatives pertaining to the surface etch processes. The revised choice for test processes would then be made from among the previously passed over alternatives (indicated in figure 2), plus any others that might have become available since a choice was last made in that part of the plan. For plans containing thousands of choices, this process of incremental plan evolution can serve an important attention focusing role by highlighting only the affected areas of a plan, and suggesting revised choices in these areas.

Maintaining the state space associated with an existing plan can also be useful for carrying out qualitative "what if" analyses of choices. For example, a query of the form "what if I use the surface etch board technology" boils down to undoing the dependencies of the existing assumption (namely, the embedded etch technology), making revised choices for these parts of the task environment, and generating the resource requirements for the hypothesized scenario. This elevates the what-if analysis from the level of a quantitative model to one allowing for perturbations of the symbolic assumptions underlying such a model.

More generally, this functionality is a statement about the role of a user in such man-machine interactions. Most DSS literature uses the ambiguous term "judgement" to account for the gap between the symbolic reasoning process of a decision maker and the outputs from a model underlying a system. Unfortunately, this view of decision support does not address whether it is reasonable to expect the user to make all the right "adjustments" in translating qualitative reasoning into a form expressible in the quantitative model. In contrast, elevating the system functionality to a level where the symbolic real-world assumptions can be manipulated relieves the user from making possibly unrealistic transitions between the two levels.

3.4. Summary of the Main Points

It is worth summarizing the discussion so far in light of the four questions raised at the beginning of this paper, in particular the last three. A fundamental characteristic of much of managerial problem solving is that the symbolic knowledge about a problem domain is distributed and evolutionary, and must be maintained. A major problem facing planning managers is one of orchestrating the synthesis of assumptions into a coherent model, and, because of the instability of such assumptions, one of maintaining the integrity of this model over time. My approach to decision support emphasizes the design and maintenance of a qualitative model on which a quantitative model can be based. In contrast, most DSS approaches have viewed support as the selection from a predesigned set of models.
Similarly, the design emphasis here also contrasts with most knowledge based systems to date, which have been concerned mainly with classification problems that involve mapping "facts" to "conclusions", given a stable model of the domain. In contrast, I have argued that it is the formulation and maintenance of the model of the domain itself that is particularly problematic in organizations, and that this is the activity knowledge based systems can support.

From a functionality standpoint, modeling systems or "DSS generators" form one component of such a support system. They are appropriate for representing algebraic models and performing parametric explorations within a given algebraic model structure. However, much of the problematic aspect of managerial decision making involves "getting the model right", an exercise that must make use of symbolic knowledge not expressible within modeling systems. For a system to be sensitive to the context of the decision making process, it must be able to explicitly maintain and reason in terms of this process knowledge, and tie the outputs of this process to a modeling system. Basically, this requires a level of intelligence over and above the knowledge expressed in an algebraic modeling system. In order to maintain the context surrounding its models, a decision support system must therefore maintain knowledge about alternatives, general domain-specific constraints, resource availability information, and dependencies among prior decisions. These constitute the qualitative knowledge components required in order to synthesize and maintain evolving models. Equipped with this functionality, knowledge based systems can play an important role in facilitating an incremental evolution of models, and provide a continuity perspective that is crucial to managerial decision making, but lacking in the support provided by current day systems.

4. Summary

Much of the power of the PLANET architecture derives from its ability to collect, preserve, and manipulate a store of domain specific knowledge in order to reason about a problem situation. This knowledge must be provided to the system by the user. However, an important part of a manager's job is to create the alternatives and recognize their interrelationships. Reitman [33] suggests that the process of generating good moves or actions, particularly in the game playing context, is similar in spirit to heuristic search. While this approach may be reasonable for domains where the entire set of alternatives, however large, can be generated a priori (i.e. the search space has a definite size), it is of little value in a managerial planning situation where actions are not defined a priori, but are continually generated or "recognized". In fact, an important function of a human support staff is one of creating a set of "good" actions to be examined by a decision maker [33]. The PLANET formalism is limited from this standpoint in that it is a reactive support tool; the inputs that enable it to modify a plan must always come from the user. The realization of good actions must also come from the user. It is probably accurate to say that these creative aspects of decision making are likely to remain outside the scope of computer based support in the near future.
In conclusion, while computer based decision support systems will continue to have certain limitations as decision aids, there is nevertheless considerable support potential above and beyond what is available with current day systems. In this paper, I have attempted to address what I consider to be important issues that must be confronted if we are to develop knowledge based systems that exhibit some of the intelligence associated with managerial decision-making. Specifically, I have argued that since models used to support decision-making are based on evolving knowledge, such systems must be able to represent and maintain such knowledge. Although such systems are not "expert systems" of the kind found in the scientific and medical arenas, they can nevertheless use knowledge about a problem situation in supporting a decision maker with the formulation and maintenance of assumption-based models relevant to the problem situation. The architecture described in this paper has been designed to support this activity.

Acknowledgements

I am grateful to Herb Simon, Michael Ginzberg, Barry D. Floyd, and Joseph Davis for helpful comments on earlier drafts of this paper.

REFERENCES

1. Barzilay, Amos, SPIRIT: An Intelligent Tutoring System for Probability Theory, Ph.D. Thesis, University of Pittsburgh, 1984.

2. Bohanek, M., Bratko, I., and Rajkovic, V., An Expert System for Decision Making, in Processes and Tools for Decision Support, Henk Sol (ed.), North-Holland, 1983.

3. Bouwman, M., Human Diagnostic Reasoning by Computer: An Illustration From Financial Analysis, Management Science, vol. 29, no. 6, June 1983.

4. Brown, J.S., Burton, R., & de Kleer, J., Pedagogical, Natural Language and Knowledge Engineering Techniques in SOPHIE I, II and III, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

5. Bundy, A., & Bird, L., Using the Method of Fibres in MECHO to Calculate Radii of Gyration, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

6. Chi, M.T.H., Feltovich, P., & Glaser, R., Categorization and Representation of Physics Problems by Experts and Novices, Cognitive Science, 5, 1981.

7. Clarkson, G.P.E., A Model of the Trust Investment Process, in Computers and Thought, Feigenbaum and Feldman (eds.), McGraw-Hill, 1963.

8. Clement, J., Students' Preconceptions in Introductory Mechanics, American Journal of Physics, 50, 1982.

9. Cohen, P., & Lieberman, M., A Report on FOLIO: An Expert Assistant for Portfolio Managers, Proceedings of the Eighth International Joint Conference on Artificial Intelligence, 1983.

10. Davis, R., Austin, H., Carlbom, I., Frawley, B., Pruchnik, P., Sneiderman, R., & Gilreath, A., The Dipmeter Advisor: Interpretation of Geological Signals, IJCAI, 1981.

11. Dearborn, D.C. and Simon, H.A., Selective Perception: A Note on the Departmental Identifications of Executives, Sociometry, volume 21, 1958.

12. de Kleer, J., and Brown, J.S., Assumptions and Ambiguities in Mechanistic Mental Models, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

13. Dhar, Vasant, PLANET: An Intelligent Decision Support System for the Formulation and Investigation of Formal Planning Models, Ph.D. Thesis, University of Pittsburgh, 1984.
14. Dhar, Vasant, and Quayle, Casey, An Approach to Dependency Directed Backtracking Using Domain Specific Knowledge, Proceedings of the Ninth International Joint Conference on Artificial Intelligence (IJCAI), 1985.

15. Duda, R.O., Gaschnig, J., & Hart, P., A Computer-Based System for Mineral Exploration, in Expert Systems in the Microelectronic Age, Michie (ed), Edinburgh University Press, 1979.

16. Forbus, K., Qualitative Reasoning About Space and Motion, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

17. Genesereth, M.R., The Role of Plans in Intelligent Tutoring Systems, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

18. Goldstein, I., & Papert, S., Artificial Intelligence, Language, and the Study of Knowledge, Cognitive Science, 1, 1977.

19. Gordon, R.J., Financial Modeling on Small Systems, IBM Systems Journal, volume 23, May-June, 1973.

20. Kuipers, B., Modeling Spatial Knowledge, Cognitive Science, 2, 1978.

21. Larkin, J., Problem Representation in Physics, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

22. Larkin, J., McDermott, J., Simon, D.P., & Simon, H.A., Models of Competence in Solving Physics Problems, Cognitive Science, 1980.

23. Lindsay, R., Buchanan, B., Feigenbaum, E.A., & Lederberg, J., Applications of Artificial Intelligence for Chemical Inference: The DENDRAL Project, McGraw-Hill, 1980.

24. McCloskey, M., Naive Theories of Motion, in Intelligent Tutoring Systems, Sleeman and Brown (eds), Academic Press, 1983.

25. McDermott, J., R1: A Rule-Based Configurer of Computer Systems, Artificial Intelligence, vol. 19, no. 1, 1982.

26. McDermott, J., & Larkin, J., in Proceedings of the 2nd Conference of the Canadian Society for Computational Studies of Intelligence, 1978.

27. Moon, David, and Weinreb, Daniel, Lisp Machine Manual, MIT, 1981.

28. Naylor, T., The Politics of Corporate Model Building, Planning Review, no. 13, January 1975.

29. Naylor, T., and Schauland, H., A Survey of Users of Corporate Planning Models, Management Science, May 1976.

30. Novak, G., Computer Understanding of Physics Problems Stated in Natural Language, Tech. Report NL-30, Department of Computer Science, University of Texas at Austin, 1976.

31. Pople, Harry E., Heuristic Methods for Imposing Structure on Ill-Structured Problems: The Structuring of Medical Diagnostics, in Artificial Intelligence in Medicine, Peter Szolovits (ed), Westview Press, Boulder, Colorado, 1982.

32. Quayle, Casey, Object Oriented Programming in Franz Lisp, Working Paper, Decision Systems Laboratory, University of Pittsburgh, 1983.

33. Reitman, Walter, Applying Artificial Intelligence to Decision Support: Where Do Good Alternatives Come From?, in Decision Support Systems, Ginzberg, Reitman and Stohr (eds), North-Holland, 1982.

34. Rosenkranz, F., An Introduction to Corporate Modeling, Unpublished Ph.D. Dissertation, University of Basel, Switzerland, 1975.

35. Shannon, R., Systems Simulation: The Art and Science, Prentice-Hall, 1975.

36. Shortliffe, E., MYCIN: Computer Based Medical Consultation, American Elsevier, 1976.

37. Simon, Herbert A., The New Science of Management Decision, Prentice Hall, 1960.

38. Simon, H.A., & Simon, D.P., Individual Differences in Solving Physics Problems, in Children's Thinking: What Develops?, Siegler (ed), Erlbaum, 1978.
39. Smith, R.L., Graves, H., Blaine, L.H., & Marinov, V.G., Computer-Assisted Axiomatic Mathematics: Informal Rigor, in Computers in Education, Lecarme and Lewis (eds), North-Holland, 1975.

40. Subrahmanian, Eswaran, WELDS: A Welfare Eligibility Determination System - An Expert System in an Administrative Context, Expert Systems in Government Symposium, Washington D.C., 1985.

41. Weiss, S., Kulikowski, C.A., Amarel, S., & Safir, A., A Model-Based Method for Computer-Aided Medical Decision Making, Artificial Intelligence, vol. 11, nos. 1-2, 1978.