key: cord-0033205-emtww5ga authors: Joubert, I; Zeippen, B; O'Reilly, C title: Malaria and the HIV virus: is there any interaction? date: 2001-03-02 journal: Crit Care DOI: 10.1186/cc1164 sha: f833ad7365db7abed47cf34056bb26d6118906c1 doc_id: 33205 cord_uid: emtww5ga

In conclusion, in the preliminary analysis of this study the rate of colonization tended to be increased in the closed aspiration system group when compared with the open aspiration system group. However, there was no difference between the groups in terms of the development of VAP and mortality in the ICU.

Results: One hundred and forty-eight children (56%) underwent microlaryngoscopy (Storz 3.0 rigid telescope), usually at the time of airway intervention for failed medical treatment of severe croup (n = 147). Laryngeal ulceration, with or without exudation, oedema and erythema, was documented in 15 of these children (10%), median age 14 months (10-36) and median weight 10 kg (6-12). Twenty-seven of the children who underwent microlaryngoscopy (18%) also had ulcerative gingivostomatitis consistent with Herpes simplex virus. Ulcerative laryngitis was documented in 9 of 27 (33%) children with, and in 6 of 121 (5%) children without, co-existent ulcerative gingivostomatitis (P < 0.002). The presence of oral ulcers predicted ulcerative laryngitis with sensitivity and specificity of 80% and 86%, and positive and negative predictive values of 33% and 95% (a worked sketch of these indices follows below). Viral culture was available in 6 of the 15 children with ulcerative laryngitis, confirming Herpes simplex (n = 3) and cytomegalovirus (n = 1). All children with oral or laryngeal ulceration received acyclovir therapy. One of the 15 children did not require airway intervention. Nine children required nasotracheal intubation for a median of 4 days (3-11) and a median ICU stay of 6 days (4-14). Five children required tracheostomy ab initio, with a median ICU stay of 30 days (20-36), and duration of tracheostomy in situ for a median of 19 days (15-253). All 15 children survived. Conclusion: Ulcerative laryngitis is more common in our patient population than the few reports suggest. Early microlaryngoscopy is recommended in children with severe croup who follow an atypical course.

The high frequency of severe exacerbation of bronchial asthma (BA) is still one of the main problems for pulmonologists in Ukraine. Ninety-eight of 620 asthmatics, surveyed in the diagnostic center 'Pulmis' in Dniepropetrovsk during 10 months of 2000, were referred to the intensive care department because of severe exacerbation of the disease. The aim of the present study was to examine the reasons for hospitalization of patients with BA in the intensive care department. Ninety-eight patients of the intensive care department (63 men, mean age 45.5 ± 3.2 years, mean duration of disease 8.3 ± 1.7 years) with severe BA (according to the GINA classification) were enrolled into the study. We evaluated the patients' educational level, their medication and their compliance. According to the results of our research, 34 patients (34.7%) did not receive any anti-inflammatory medicine and 17 (17.3%) used only systemic corticosteroids. The medication of 38 (38.8%) patients consisted of short-acting β2-agonists only. Eighty-three (84.7%) patients had never used long-acting β2-agonists. Thirteen (13.2%) of them had attended 'Asthma school', 4 (4.1%) patients monitored their peak expiratory flow every day, and 12 (12.2%) used additional methods of drug delivery (spacers, etc.).
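As an aside to the ulcerative laryngitis abstract above, the sketch below shows how sensitivity, specificity and the positive and negative predictive values quoted there are derived from a 2x2 contingency table. The counts used are illustrative placeholders, not the study data.

```python
# Illustrative calculation of sensitivity, specificity, PPV and NPV from a
# 2x2 contingency table. The counts below are hypothetical placeholders and
# are NOT the study data from the abstract above.

def diagnostic_indices(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return the four standard test-performance indices."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives / all diseased
        "specificity": tn / (tn + fp),   # true negatives / all non-diseased
        "ppv":         tp / (tp + fp),   # post-test probability given a positive test
        "npv":         tn / (tn + fn),   # post-test probability given a negative test
    }

if __name__ == "__main__":
    # Hypothetical example: 12 true positives, 20 false positives,
    # 3 false negatives, 115 true negatives.
    for name, value in diagnostic_indices(tp=12, fp=20, fn=3, tn=115).items():
        print(f"{name}: {value:.2f}")
```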
The results show that inadequate anti-inflammatory and bronchodilator therapy, low educational level of the patients, absence of compliance have significant importance in the development of BA severe exacerbation. outcome and follow up. Patient demographics, clinical, radiographic and laboratory features, necessity for mechanical ventilation, therapy, associations and hospital stay were also reviewed. The median age of the study group was 41.5 years (range 7-62) and included five males and five females. Five (50%) required mechanical ventilation. Three (30%) demised in hospital and one died following hospital discharge. Overall mortality was 40%. The median PaO 2 /FiO 2 ratio for the whole group was 139 (range 61-250) with the median LDH level, on admission, 612 µ/l (range 375-2638). Those patients who were discharged from hospital had a median LDH level of 521 µ/l (range 375-1599) versus a median level of 2154 µ/l (range 2100-2638) for those who died in hospital (P = 0.016). Patients who required mechanical ventilation had a median LDH level of 2100 µ/l (range 691-2638) versus a median LDH level of 504 µ/l (range 375-534) for non-ventilated patients (P = 0.008). The most common chest radiographic appearance was a mixed alveolar-interstitial pattern in six patients (60%). In addition to the characteristic histopathological features of BOOP, this group of patients universally demonstrated alveolar septal expansion. Increased reticulin was also noted. The median hospital stay was 25 days (range 9-127) with follow up ranging from one to 28 months. Conclusion: BOOP may manifest as a severe illness with an associated significant morbidity and mortality. LDH may be useful as a marker of disease severity and outcome. Introduction: Recent studies outside the UK suggest that patients with acute cardiogenic pulmonary oedema (CPO) may benefit from the application of facial continuous positive airway pressure (CPAP) support in emergency departments. The aim of this pilot study was to assess the impact of facial CPAP on patients with CPO within a UK Emergency department and to determine the sample size needed for a definitive randomised controlled study. Methods: A prospective powered study comparing CPAP to supplemental oxygen at ambient pressure, using historical controls matched for CPO severity. Forty patients with internationally accepted criteria for CPO were included. Twenty patients received 20 min of facial CPAP using a dedicated Drager CPAP system with a 5 cmH 2 O positive end-expiratory pressure valve and maximal inspired oxygen (Group CPAP). Twenty controls received identical therapy but received maximal inspired oxygen via a nonrebreathing mask (Group C). Outcome measures compared were arterial partial pressures of oxygen and carbon dioxide, objective clinical signs, intubation rate and death. Statistical analysis was by a t-test and 95% Confidence intervals (CI). Patients receiving CPAP had a better improvement in arterial blood gas indices compared to controls, with higher mean oxygenation (CPAP = 8.4 kPa & C = 3.5kPa, P = 0.017, CI = 0.92-8.77), and better mean excretion of CO 2 (CPAP = -0.9kPa & C = +0.9 kPa, P = 0.011, CI = -3.27 to -0.45). CPAP produced a higher mean reduction in respiratory rate, and lower mean reductions in blood pressure/pulse rate, although these were not statistically significant. The median length of in-patient stay (CPAP = 4, C = 5 days) and intubation rates (CPAP = 0, C = 3) differed between groups, but were not statistically significant. 
There was no difference in hospital mortality between groups (CPAP = 3, C = 3). Conclusions: CPAP utilisation in the emergency department impacts favourably on the physiological manifestations of CPO. Consequently, to achieve a statistical power of 90% in a future randomised study, a sample size of 913 patients would be required to show a 5% difference in mortality (a worked sketch of this type of calculation follows below).

The mortality associated with AHRF after lung resection may reach 50%, mainly in relation to complications of endotracheal intubation and mechanical ventilation. We compared NPPV and conventional therapy in avoiding endotracheal intubation in patients with AHRF after lung resection. Methods: Of 2280 patients who had undergone thoracic surgery between May 1999 and July 2000, those who had pulmonary resection and experienced AHRF were prospectively recruited. Patients were enrolled if they met at least three of the following criteria: dyspnea at rest defined by a respiratory rate of 25

Background and objective: Noninvasive positive pressure ventilation (NPPV) has been reported to be beneficial in the treatment of acute exacerbation of chronic obstructive pulmonary disease (COPD), and to facilitate weaning. In this trial we assessed the possible benefit of early NPPV in patients with blunt chest trauma and acute respiratory failure. Eighteen patients admitted to the ICU were enrolled in this prospective randomized study. Inclusion criteria were isolated blunt chest trauma with respiratory failure and an ICU stay of more than 7 days. Exclusion criteria were a history of COPD and conditions in which NPPV was contraindicated. The patients were randomized into two groups. Group 1 (n = 9) received standard therapy (oxygen, regional analgesia, fluid and nutritional support, pulmonary physiotherapy/rehabilitation) including tracheal intubation and mechanical ventilation when indicated. Group 2 (n = 9) received standard therapy along with NPPV. In Group 2 we used NPPV with a face mask and Pressure Support (7-21 cmH2O)/CPAP (3-10 cmH2O) ventilation. The need for tracheal intubation was assessed and the number of intubated patients in both groups was recorded at 12, 24, 48 and 96 hours and on day 7. The effect of the therapy was assessed at 1, 6 and 12 hours using the PO2/FiO2 index, frequency/tidal volume index (f/Vt), dyspnea score, hemodynamics and the tolerance to pulmonary physiotherapy/rehabilitation. The main results suggest a possible beneficial effect of NPPV in decreasing the need for tracheal intubation and mechanical ventilation (Group 1 - intubated 7 [78%], Group 2 - intubated 3 [34%]). We found a statistically significant improvement in all parameters (PO2/FiO2 index, frequency/tidal volume index (f/Vt), dyspnea score, hemodynamics and the tolerance to pulmonary physiotherapy/rehabilitation) in the NPPV group. The results show that NPPV should be considered as a systematic approach in the management of all patients with blunt chest trauma and acute respiratory failure.

Intensive Care Unit, 1st Floor, St James' Wing, St. George's Hospital, Blackshaw Road, London SW17 0QT, UK. Introduction: Helium is eight times less dense than nitrogen and only 10% more viscous. As a result of these physical properties it produces significantly higher gas flows for the same differential pressure gradient. This, coupled with the fact that as a carrier gas He facilitates faster diffusion, makes it a potentially useful adjunct in the ventilatory support of patients with acute respiratory failure.
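The power statement in the CPAP abstract above can be illustrated with a standard two-proportion sample-size calculation. The sketch below is a minimal version of that calculation; the baseline mortality is not given in the abstract, so the value used is an assumption and the output is not expected to reproduce the quoted figure of 913 exactly.

```python
# Approximate sample size per arm for detecting an absolute difference between
# two proportions (two-sided test). Baseline mortality p1 is a placeholder
# assumption, not a figure taken from the abstract.
import math
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.90) -> int:
    """Patients per arm for a two-sided comparison of two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~1.28 for 90% power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

if __name__ == "__main__":
    # Placeholder assumption: 15% mortality with standard therapy vs 10% with CPAP.
    print(n_per_group(0.15, 0.10))   # patients per arm
```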
Substituting He for nitrogen has been shown to be of considerable benefit in the management of acute upper airway obstruction from a wide variety of causes. There is also a growing body of evidence for its use in acute severe asthma and decompensated COPD. We previously conducted a pilot study of He-O2 in patients with acute respiratory distress syndrome (ARDS) and found that it led to a significant improvement in gas exchange in the majority of subjects. Having resolved a number of technical problems regarding the use of He-O2, we have gone on to perform a larger crossover study in a wider variety of patients and present our preliminary findings here. Methods: All patients who were mechanically ventilated on our unit were eligible. Exclusion criteria included haemodynamic instability, active weaning of respiratory support and imminent deterioration. All patients were ventilated in a pressure control mode. Patients were observed for a 15 min period on their established ventilatory regime of N2-O2. They were then switched to He-O2 and observations repeated after 15 and 60 min and then every 60 min for a maximum of 360 min. The trial was terminated when no further change in the partial pressures of arterial oxygen (PaO2) and carbon dioxide (PaCO2) was seen. Patients were then re-established on N2-O2 and observed for a further 60-120 min. No alterations in ventilatory parameters were made unless warranted by changes in arterial blood gases. Ventilatory and haemodynamic parameters were continuously monitored throughout. The ventilator flowmeter was calibrated for use with He-O2 as previously described. Results: Six out of eight patients showed a significant improvement in PaO2 and PaCO2 within 15 min. Most of those studied showed further improvements at the successive observation time points. There were small improvements in respiratory mechanics, but these were insufficient to explain the improvements in gas exchange. There were no significant haemodynamic changes seen. The worse the derangement of gas exchange at study outset, the greater the magnitude of improvement seen on He-O2. Conclusions: This study adds to the growing body of evidence that He-O2 may be a useful adjunct to mechanical ventilation, especially in the most severe cases of respiratory failure.

breaths/min or more, active contraction of the accessory respiratory muscles or abdominal paradox, a PaO2/FiO2 ratio < 200 and radiologic lesions on the chest radiograph. They were randomly assigned to receive either conventional therapy or conventional therapy and noninvasive positive pressure ventilation (NPPV) through a nasal mask. NPPV was provided with the BiPAP® Vision Ventilator System (Respironics Inc., Murrysville, PA, USA). The primary end point of the study was 'need for endotracheal intubation'. Secondary endpoints included: in-hospital mortality, the length of stay in the ICU, the length of stay in the hospital, and the need for fiberoptic bronchoscopy. An interim analysis was planned at the midpoint of the study. Results: Over this 16-month period, 912 patients were admitted to the Intensive Care Unit. Forty-eight patients were enrolled. Because endotracheal intubation is the most important predisposing factor for ventilator-associated pneumonia, bronchial stump disruption and bronchopleural fistula, postoperative re-intubation must be avoided.
This is the first prospective, randomized study which demonstrates an improvement in survival and in the avoidance of endotracheal intubation in the postoperative care of patients undergoing lung resection surgery.

In mechanically ventilated patients, reducing the density of the inspired gas by substituting helium (He) for nitrogen offers several theoretical benefits. However, accurately monitoring tidal volumes of He-oxygen (O2) mixtures with conventional flowmeters is problematic, as all commonly employed devices are adversely affected by changes in gas density. We tested two widely available flowmeters to ascertain whether reliable and reproducible correction factors could be obtained. We used an unadapted Galileo ventilator (Hamilton Medical, Switzerland), to which heliox (21% O2/79% He) was connected via the air inlet. This ventilator measures flow/volume utilising a variable orifice pneumotachograph (VOP). The circuit was then connected to a Pitot tube flowmeter (PT) (MCOVX, Datex-Ohmeda, Finland) which, in addition, contains a side stream gas sampler to monitor gas composition. In place of a patient, a mechanical lung (BIO-TEK Instruments Inc., USA) was employed, which accurately measures delivered gas volumes independent of gas density. Carbon dioxide (CO2) was entrained, distal to the flowmeters, just proximal to the mechanical lung, to simulate the clinical setting. Both flowmeters were repeatedly tested across a range of tidal volumes (Vt) (200-1200 ml). This method proved to be reliable and reproducible. Inspired and expired Vt as measured by the two devices, when plotted against that measured by the mechanical lung, showed excellent linear correlation (R2 = 0.99 for all). From the equations of these lines we calculated a set of correction factors (CF) and ordinates (ord); a sketch of such a fit follows below. All measurements were repeated five times and means (Av) and standard deviations (SD) derived; results are tabulated. We conclude that both of these devices can be reliably employed to monitor respiratory variables in patients mechanically ventilated with He-O2 provided gas composition, temperature and relative humidity are accounted for. However, the consistency of ord values for the VOP is advantageous and makes this the preferred device.

Objective: To review a single institution's experience with APRV with respect to safety, complication detection, and efficacy at correcting hypercarbia and hypoxemia. Methods: Consecutive patients transitioned from either volume- or pressure-targeted ventilation to APRV in a University hospital ICU were retrospectively reviewed. Patients initially ventilated with APRV were excluded. Initial APRV settings for correction of hypoxemia (pO2 ≤ 60 on FIO2 ≥ 0.9) were a Phigh at the prior plateau pressure, a Thigh of 6.0 s and a Tlow of 0.8 s. Hypercarbic (pCO2 ≥ 55 and pH ≤ 7.3) patients were set at a Thigh of 5.0 s with a Tlow of 1.0 s. IRB-approved data included principal diagnoses, ventilation parameters, laboratory values, and ventilator-associated complications. Data before and after APRV were compared using an unpaired two-tailed t-test; significance at P < 0.05 (*). Results: Patient mix was 43% trauma, 32% sepsis, 8% cardiac surgery, 12% vascular surgery and 5% other. Transitioning to APRV was most frequent for hypoxemia (88%) and hypercarbia less often (12%). The mean time to correct hypoxemia (SaO2 ≥ 92%) was 7 ± 4 min while the mean time to correct pCO2 (pCO2 ≤ 40 Torr) was 42 ± 7 min. Maximal CO2 clearance was achieved by 76 ± 12 min.
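The correction factors in the flowmeter abstract above come from fitting a straight line of reference volume (mechanical lung) against the device reading, so that future readings can be corrected as Vt_corrected = CF × Vt_device + ord. The sketch below shows one way such a fit could be done; the paired volumes are made-up illustrative numbers, not the study measurements.

```python
# Sketch of deriving a correction factor (CF) and ordinate (ord) for a
# flowmeter from paired measurements, as described in the heliox flowmeter
# abstract above. All data points below are hypothetical.

def least_squares(x: list[float], y: list[float]) -> tuple[float, float]:
    """Ordinary least-squares slope (CF) and intercept (ord) of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

if __name__ == "__main__":
    vt_device = [200, 400, 600, 800, 1000, 1200]       # ml, device reading (hypothetical)
    vt_reference = [252, 508, 760, 1015, 1262, 1518]    # ml, mechanical lung (hypothetical)
    cf, ord_ = least_squares(vt_device, vt_reference)
    print(f"CF = {cf:.3f}, ord = {ord_:.1f} ml")
    print(f"corrected 700 ml reading -> {cf * 700 + ord_:.0f} ml")
```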
The mean minute ventilation on APRV decreased by 3.3 ± 0.9 l/min (*) but achieved superior CO2 clearance and oxygenation. The mean time to achieve FIO2 ≤ 0.6 was 5.2 ± 0.9 hours. Four of 38 patients developed a pneumothorax although none developed hypotension; one had bilateral pneumothoraces. All four patients evidenced decreased CO2 clearance and decreased release-phase volumes as their only manifestation of a pneumothorax. 97% of patients on APRV with a Phigh ≥ 20 cmH2O pressure who were transported out of the ICU using bag-valve ventilation developed hypoxemia within 5 min. 100% of patients with a Phigh ≤ 20 cmH2O pressure were safely hand ventilated during transport without developing hypoxemia. Conclusion: APRV is a safe rescue mode of ventilation for hypoxemic or hypercarbic respiratory failure and requires a lower minute ventilation than do conventional modes. Decreasing release-phase volumes and a rising pCO2 are excellent clues to a pneumothorax in a patient on APRV. Thus, routine end-tidal CO2 monitoring is recommended for patients on APRV. Preparations for safe intra-hospital transport may be keyed to the Phigh required for adequate ventilation and oxygenation.

Introduction: Pressure support ventilation is mostly used in weaning from mechanical ventilation in acute respiratory failure. There are no data regarding the optimal level of assistance for each patient in different clinical conditions. We assessed a new method that allows patients to set their own PSV level by a remote control connected to the ventilator. In 9 awake intubated patients (age 57 ± 17 years, BMI 24 ± 3 kg/m2, PaO2/FiO2 283 ± 51, Ramsay 1.9 ± 0.4) we measured the breathing pattern (VT, RR), the work of breathing (WOB, J/min, estimated from a modified Campbell diagram [1]) and the dyspnea sensation using the Borg visual scale [2]. Patients were studied at three fixed levels of PSV (5-15-25 cmH2O). At the end of each step we gave the patient the possibility to change the level of PSV using the remote control. Patients were previously instructed by the attending physician in the use of the remote control. Results: See Table and Figure. It appears that, as the pressure support level increases, the patient's work of breathing decreases while the Borg dyspnea scale shows no significant differences. Borg (cm): 3.3 ± 2, 2 ± 1.6 and 3 ± 1.4 (one-way RM ANOVA; * P < 0.05 vs others; † P < 0.05 vs PSV 25). (Figure: the dyspnea sensation when the patient is allowed to set the pressure support level.) Conclusions: Patient-controlled PSV could be a useful technique in the ventilatory management of critically ill awake patients.

Introduction: The weaning period is critical in the evolution of patients with acute respiratory failure (ARF) and mechanical ventilation (MV). Weaning failure has been associated with increased morbidity and mortality. We evaluated the impact of weaning failure on mortality, and on MV and ICU length of stay. Methods: Patients who were admitted to our 8-bed surgical ICU and stayed more than 24 hours on MV were prospectively evaluated from June 1999 to June 2000. Demographics, ARF etiology, APACHE II, and gas exchange and mechanical parameters were assessed. Weaning failure was defined as reintubation within 48 hours after extubation. Weaning failure (WF) patients were compared with those who were successfully extubated (SE) and the total group (TG = SE + WF). Outcome measures were mortality, MV and ICU length of stay and MV-free days.
Results: One hundred and fifty-five patients required MV for more than 24 hours, of which 103 (66%) were successfully extubated, 19 (12%) had weaning failure, and 33 (21%) died before weaning could be attempted. There were no differences in age, sex, APACHE II scores, or etiology between WF and the other groups. However, WF patients had longer ICU and MV lengths of stay than TG (14 ± 6.2 vs. 9.2 ± 7.4 days, P = 0.005; and 9.1 ± 4.8 vs. 5.5 ± 5.6, P = 0.002, respectively) and SE patients (8.8 ± 6.4, P = 0.003; and 4.6 ± 4.3, P = 0.001). WF patients also had fewer MV-free days than SE patients (15 ± 11.2 vs. 25 ± 5.5, P = 0.001). There was no difference in mortality between WF and TG patients (32% vs. 26%, P = NS). However, SE patients had 2% mortality, which was lower than that of WF and TG patients (P < 0.05).

We studied 18 patients with severe ARDS of varying etiology admitted to the intensive care unit of Istanbul University, Istanbul Medical Faculty. The diagnosis of ARDS was based on the criteria proposed by the American-European Consensus Conference on ARDS. The patients who demonstrated impaired oxygenation during pressure-regulated volume-controlled ventilation with a 10 ml/kg tidal volume and a frequency of 12/min comprised our study population. The patients were presented to four members of the ICU team individually and assigned to primary (ARDSp) and secondary (ARDSs) ARDS groups according to the patient's history and clinical presentation, along with the results of microbiological tests. During the recruitment maneuver, we applied sustained inflation (SI), which consisted of 40 cmH2O CPAP. After 30 s we returned to the previous tidal volume with 20 cmH2O PEEP. Keeping peak airway pressure below 45 cmH2O, we adjusted the optimal PEEP level, between 10-20 cmH2O, by matching the pulse oximeter saturation with the lowest PEEP. Following the setting of the PEEP level, we recorded arterial blood gas values, ventilatory parameters and hemodynamic measurements at 15 min and at 1, 4 and 6 hours after RM. The clinical defining scores of the patients are summarized in Table 1. Over the 6-hour period following RM there were no statistically significant changes in either hemodynamic or ventilatory measurements, except for the PaO2/FiO2 ratios (Table 2). Mean values of PEEP before and after RM were 10 ± 3.4 cmH2O and 14.6 ± 2.8 cmH2O in ARDSp (P = 0.04), and 12 ± 5 and 17 ± 2.8 in ARDSs (P = 0.01). The mean PaO2/FiO2 ratio difference between the baseline and the value at the end of the study was 41 ± 30% in ARDSp and 89 ± 111% in ARDSs (P = 0.6). PaO2/FiO2 ratios started to increase significantly mainly 15 min following RM in ARDSs, and 4 hours after RM in ARDSp. The response of patients with secondary ARDS to RM was observed earlier than the response of patients with primary ARDS.

The pressure-volume (P-V) curve of the total respiratory system is drawn assuming that the change in chest wall volume (∆Vcw) equals the volume displaced from the syringe (∆Vgas). We compared ∆Vgas and ∆Vcw during P-V curves obtained by supersyringe and optoelectronic plethysmography [1]. Eight sedated, paralysed, intubated ALI/ARDS patients (5 M/3 F, age 70 ± 13 years, BMI 25.6 ± 3 kg/m2, PaO2/FiO2 220 ± 76) were studied. During the manoeuvre ∆Vcw was recorded by optoelectronic plethysmography. The volume injected and withdrawn by the supersyringe step by step (100 ml) was corrected for temperature, humidity, pressure and gas exchange [2].
The discrepancy was computed as the difference between the volume of air inflated and the chest volume measured. The compliance of the total respiratory system was measured between zero and maximum airway pressure values on the inflation (Crs inf) and deflation (Crs def) limbs of the P-V curves. Hysteresis loops of the corrected P-V gas and P-V cw curves were calculated as the percentage ratio between the area of the P-V curves and the product of maximal volume by maximal pressure. Even considering the thermodynamic and gas exchange corrections, ∆Vgas values were systematically higher than ∆Vcw, probably due to blood shifts from the thorax to the extremities. As a consequence, the standard supersyringe method provides an overestimation of the inspiratory and expiratory compliance of the total respiratory system on the inflation limb and an overestimation of the hysteresis area. Data are expressed as mean ± SD; * paired t-test, P < 0.05 vs P-V cw.

Introduction: Different techniques to determine the optimal PEEP setting in patients with ARDS have been suggested. The aim of our study was to evaluate the assessment of static respiratory system compliance in combination with estimated lung recruitment [1] for selection of the PEEP setting with minimal Qs/Qt. Seven patients with early ARDS (intubation time < 7 days) with a pulmonary artery catheter in place were studied. During the study all patients were ventilated at different PEEP levels using CMV with constant inspiratory flow; tidal volume was 5-7 ml/kg IBW, and the inspiratory/expiratory time and respiratory rate were adjusted to avoid the presence of intrinsic PEEP. At each level of PEEP, arterial and mixed venous blood gases, hemodynamic parameters, static compliance of the respiratory system (Crs) and the difference in end-expiratory lung volume (∆EELV PEEPtest) between the tested and the baseline level of PEEP (minimal level of PEEP used during the study, usually 8 cmH2O) were recorded. Volumes and pressures were measured using a CP-100 pulmonary monitor (BICORE Monitoring Systems, USA) at the airway opening. Estimated lung recruitment, ELR (ELR PEEPtest = ∆EELV PEEPtest - Crs PEEPtest x [PEEPtest - PEEPbaseline]), was calculated for each tested level of PEEP (a worked sketch follows below). The ability to predict the PEEP level with minimal shunt was tested for minimal PEEP with maximal Crs, for maximal PEEP with maximal Crs and for an algorithm based on static compliance and the amount of estimated lung recruitment. Sensitivity, specificity and likelihood ratio (LR) for prediction of the PEEP level with minimal shunt were calculated; the Fisher exact test was used for statistical analysis, with P < 0.05* considered statistically significant.
Maximal PEEP with maximal Crs: sensitivity 0.143, specificity 0.7, LR 0.4.
Minimal PEEP with maximal Crs: sensitivity 0.571, specificity 0.850, LR 3.8.
Minimal PEEP with ELR > ELRmax - 150 ml and maximal Crs: sensitivity 0.857*, specificity 0.95, LR 17.14.
Conclusion: Despite the limited number of patients and the possible influence of the equipment used on the critical value of ELR, we found that combined assessment of compliance and recruited lung volume enables better prediction of the PEEP setting with minimal Qs/Qt.

Introduction: Mechanical ventilation with PEEP is the cornerstone of treatment of patients with ALI and ARDS, but it is not free of adverse effects. This study aims to examine the effect of varying levels of PEEP on the intraocular pressure in critically ill patients with intracranial pathology.
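As a worked illustration of the PEEP-titration abstract above, the sketch below evaluates the stated ELR formula and the positive likelihood ratio, LR+ = sensitivity / (1 - specificity), that underlies its results table. The numerical inputs are illustrative assumptions, not patient data.

```python
# Worked sketch of the estimated-lung-recruitment (ELR) formula quoted above:
#     ELR_test = dEELV_test - Crs_test * (PEEP_test - PEEP_baseline)
# and of the positive likelihood ratio LR+ = sensitivity / (1 - specificity).
# All inputs below are illustrative assumptions.

def estimated_lung_recruitment(d_eelv_ml: float, crs_ml_per_cmh2o: float,
                               peep_test: float, peep_baseline: float) -> float:
    """Volume gained beyond what the measured compliance alone would predict."""
    return d_eelv_ml - crs_ml_per_cmh2o * (peep_test - peep_baseline)

def positive_lr(sensitivity: float, specificity: float) -> float:
    return sensitivity / (1 - specificity)

if __name__ == "__main__":
    # Hypothetical step from 8 to 14 cmH2O PEEP with a 450 ml rise in EELV
    # and a static compliance of 40 ml/cmH2O.
    print(estimated_lung_recruitment(450, 40, 14, 8))   # -> 210 ml recruited
    # Consistent with the table above: 0.857 / (1 - 0.95) ~= 17
    print(round(positive_lr(0.857, 0.95), 2))
```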
We studied 40 patients with intracranial pathology and respiratory failure, without a history of glaucoma and not receiving drugs known to affect intraocular pressure. Twenty-one patients had head injury (GCS ≤ 8 on admission), 11 had subarachnoid hemorrhage (III-IV Hunt and Hess) and 8 had intracerebral hemorrhage. Measurements of intraocular pressure were done while the patients were mechanically ventilated with different levels of PEEP. These patients were divided into four groups (A, B, C, D) of 10 patients. Each group had a different PEEP value, set by the attending physician, for at least 24 hours (see Table). Mean systemic arterial pressure, peak airway pressure, central venous pressure and arterial oxygen saturation were recorded (see Table). Results: In all groups there was no difference in intraocular pressure values between the right and left eye. In patients subjected to different levels of PEEP, no effect on intraocular pressure was observed.

Introduction: Impaired lung function is common after cardiopulmonary bypass (CPB). We hypothesized that, owing to the tendency of the lung to collapse, 1) the effect of a lung recruitment maneuver (LR) on a pressure-absolute lung volume (P-V) curve would be minimal, but 2) if LR is followed immediately by PEEP the curve would shift upwards and the slope would be steeper. Sixteen patients (48-75 years) after CABG on CPB were studied postoperatively every 0.5 h during 3 h while mechanically ventilated (FiO2 1.0), with measurements of end-expiratory lung volume (EELV) and blood gases. Eight patients were randomized to LR (45 cmH2O airway pressure, 2 x 20 s) after which PEEP was set 1 cmH2O above the LIP obtained from a static P-V curve (PEEP group), while the other 8 were randomized to LR only (ZEEP group). Three inspiratory P-V (including EELV) curves were obtained in both groups. In the PEEP group, the first curve (A) was obtained before LR and PEEP, the second (B) during PEEP (14 ± 3 cmH2O, mean ± SD) 2.5 h after LR, and the third (C) 0.5 h after removal of PEEP, i.e. 3 h after LR. In the ZEEP group, P-V curves were obtained at similar times. Statistics: ANOVA and Wilcoxon signed rank test. In the ZEEP group, no change in PaO2, lung volume or P-V relations occurred during the study. In the PEEP group, PaO2 increased by 16 ± 15 kPa (P < 0.002) after LR and PEEP and then remained unchanged during the study. EELV increased by 1120 ± 235 ml (P < 0.0001) and remained stable until removal of PEEP. The 3 P-V curves are shown in the figure (A ≠ B or C, P < 0.03). Discussion: In patients after CPB, LR without subsequent PEEP had no effect. However, when LR was followed by PEEP, EELV increased and the P-V curve became steeper and shifted upwards. Furthermore, the curve remained the same 0.5 h after PEEP removal. This and the unchanged PaO2 indicate that no new lung collapse occurred after removal of PEEP and suggest that PEEP might have a sustained stabilizing effect on lung structures in these patients.

Our aim was to compare the effects of LV + ZEEP, LV + Sigh and LV + PEEP on histopathological changes in healthy rabbit lungs. Fifteen New Zealand rabbits were randomly divided into three groups (n = 5). Animals were ventilated for 3 hours with low volume (5 ml/kg) ventilation plus ZEEP, Sigh or PEEP according to group. At the end of 3 hours the animals were sacrificed for histopathological evaluation. Lungs were removed and fixed in 10% buffered formaldehyde.
Tissue sections were processed in the usual manner for light microscopic examination with Hematoxylin-Eosin stain. The pathological lesions were graded from 0 to ++++. There were statistically significant differences between the ZEEP and PEEP groups and between the ZEEP and Sigh groups (P < 0.05). The damage was most prominent in the ZEEP group for alveolar hemorrhage and congestion. There were no statistically significant differences among the groups for neutrophil infiltration and density, but the damage was again greatest in the ZEEP group. We saw atelectasis only in the Sigh group. We conclude that adding Sigh and PEEP to low volume ventilation may decrease the damage, and PEEP may be more effective in preventing atelectasis.

We previously found that optoelectronic plethysmography can be used to measure the chest wall volume and its compartments: rib cage and abdomen [1]. We evaluated the breathing pattern and the chest wall displacement during pressure support (PSV) in patients with acute respiratory failure. Nine intubated patients (age 57 ± 12 years, BMI 26 ± 5 kg/m2, PaO2/FiO2 293 ± 67) were studied first at four levels of PSV (5, 10, 15, 25 cmH2O) at 10 cmH2O of PEEP and then at three levels of PEEP (5, 10, 15 cmH2O) at 10 cmH2O of PSV. We measured the breathing pattern, the rib cage contribution to tidal volume (RC/VT) and the inspiratory asynchrony (IA) [2]. IA was calculated as the area enclosed by the inspiratory portion of the rib cage-abdomen loop and the line connecting the commencement and the end of inspiration. Our results suggest that only the level of PSV affects the breathing pattern. IA (ml/ml2): 10 ± 12, 11 ± 9, 11 ± 7 (* P < 0.05 vs PSV 25, † P < 0.05 vs PSV 5, ‡ P < 0.05 vs the other PSV levels).

In early acute respiratory distress syndrome (ARDS), alteration in the pulmonary capillary permeability is associated with outcome [1]. Alteration in the relation of extravascular lung water to intrathoracic blood volume (EVLW/ITBV) derived from thermal-dye dilution curves indicates changes in pulmonary vascular permeability. Prone positioning improves gas exchange in most patients with ARDS; however, whether this improvement is related to effects on pulmonary vascular permeability has not been evaluated. This prospective pilot study was designed to investigate whether prone positioning would alter EVLW/ITBV as a measure of pulmonary vascular permeability. Patients with ARDS on inverse ratio pressure-controlled ventilation with PEEP > 10 cmH2O for at least 24 hours were recruited. Patients were turned prone for 18 hours. Except for FiO2, ventilatory settings remained unchanged during the study period. Values of EVLW and ITBV were obtained using a single transpulmonary arterial thermodilution technique with a 5F fibreoptic thermistor femoral artery catheter. Measurements of EVLW and ITBV were taken pre-prone, at 1, 2, 6, 12 and 18 hours after proning, and 1 hour after return to supine. EVLW/ITBV while prone was normalised to the pre-prone value as a baseline to illustrate differences between prone and supine. Data were expressed as mean (SEM). Repeated measures ANOVA was used for statistical analysis. Twelve episodes of proning in 11 patients were studied.
Although mean PaO2/FiO2 improved within 1 hour, it continued to improve during the period studied and only reached significance 12 hours after proning (17.9 ± 2.9 v 35.1 ± 4.2, P < 0.05). EVLW/ITBV was 0.015 ± 0.0024 pre-prone, 0.015 ± 0.0016 at 1 h, 0.014 ± 0.0017 at 2 h, 0.014 ± 0.0017 at 6 h, 0.013 ± 0.0012 at 12 h, 0.013 ± 0.0017 at 18 h, and 0.012 ± 0.0013 after return to supine. Mean EVLW/ITBV did not change significantly. At least 12 hours may be needed for maximal benefit with prone positioning. Changes in pulmonary vascular permeability in ARDS do not appear to be an important mechanism to account for the improvement in gas exchange seen following prone positioning.

Introduction: Acute Respiratory Distress Syndrome (ARDS) is one of the most common, potentially lethal disease processes encountered in critical care, with an extremely high mortality of about 60%. Researchers have found that a significant improvement in gas exchange often occurs when ARDS patients are turned from the supine to the prone position. However, nursing these patients presents difficulties that depend on the kind of kinetic therapy used, so the major goal is to find the best kind of kinetic therapy. The present study was designed by physicians and nurses. Besides the question of patient benefit in terms of gas exchange, the handling, the acceptance by nurses and the economic consequences were assessed. Methods and materials: Patients with ARDS, or patients identified as requiring nursing in the prone position, with a Horowitz quotient (PaO2/FiO2) < 250 were treated with different kinds of kinetic therapy in order to compare efficiency in terms of clinical outcome, personnel and material resources, and the incidence of complications. Patients were turned into the face-down prone position or the near-side prone position, or were treated in a rotation bed. Gas analysis defined the clinical effect of the position on gas exchange. Changes in skin integrity, skin status and the clinical outcome of prone positioning were also documented. Additionally, the number of nurses/physicians involved in positioning the patient and the time required were documented. Results: Ten patients were positioned in the near-side prone position (NSPP). Eight patients were positioned in the 180° position and 10 patients were treated in a rotation bed. In terms of handling, nursing and observation, the NSPP is the preferred kind of kinetic therapy. The pulmonary outcome is comparable to the other forms of kinetic therapy, but the lower need for personnel and material resources and the very small risk of complications (skin damage, oedema formation, loss of catheter or tube) are advantages of the NSPP, and position changes can be made at any time. The patients treated in the rotation bed also showed a similar clinical outcome in terms of gas exchange, but the need for material resources is disproportionately high and availability is very limited.

Objective: Ventilation in the prone position (PP) is a common method to improve gas exchange in patients with acute lung injury (ALI). Although no significant changes in systemic hemodynamics were observed [1], the question remains as to whether PP may alter hepato-splanchnic perfusion and oxygen exchange. Therefore, we studied the effects of PP on hepato-splanchnic hemodynamics in patients with ALI. To date, five patients with ALI fulfilling criteria for PP (PaO2/FiO2 < 250, FiO2 > 0.55, PEEP > 10) have been studied.
In addition to systemic hemodynamics (radial and PA catheters) and gas exchange, hepatosplanchnic blood flow (HSBF, steady state indocyanine green technique) and hepatic venous pressure (HVP) were measured using hepatic vein catheter. Gastric intramucosal-arterial PCO 2 gap was determined using air tonometry (Tonocap) and intravesical pressure was recorded to monitor intra-abdominal pressure (IAP). Data collection was performed at the supine position (SP1), after 90 min of PP, and after 90 min of supine repositioning (SP2). Results: Data are median and interquartile range (RM-ANOVA). Our preliminary results suggest that the prone position, provided that IAP remains unaffected [2] , compromises neither the perfusion of the hepato-splanchnic area nor the gut mucosal energy status in patients with ALI [3] . The fulminant hepatic failure (FHF) could be a risk factor for acute lung injury (ALI). However, there is little information on the pathophysiology of pulmonary involvement during the early course of acute liver failure. Alterations in lung permeability, surfactant metabolism and local inflammation are significant markers of acute lung injury. The objective of this study was to evaluate alterations of protein as a marker of permeability, phospholipid as a marker of surfactant alterations and PAF-acetylhydrolase (PAF-AcH) as inflammatory marker in bronchoalveolar lavage (BAL) fluid from pigs in an experimental surgical model of FHF. Methods: Twelve anesthetized, intubated and mechanically ventilated pigs weighting 20-25 kg, with a pulmonary artery in place were studied. In seven pigs FHF was induced by surgical devascularization of the liver and the other five animals were shamoperated. BAL was performed by fiberoptic bronchoscopy before, 4 hours and 7 hours after the surgical procedure respectively using 4 × 20 aliquots of normal saline after the first aspi-rated fluid. BAL fluids were collected in ice-cold tubes (4°C). Total protein concentration, phospholipids and PAF-AcH were subsequently measured. Results: Total protein was found significantly higher in animals with FHF, while the protein concentration of the BAL fluid in shamoperated animals was not changed. Reduction of total BAL phospholipids was observed in animals with FHF. In contrast, pospholipids increased in sham animals during mechanical ventilation. PAF-AcH was also reduced in FHF (Table 1) . Pulmonary wedge pressure remained low in both groups during the study. Conclusion: Experimental FHF is associated with an increase of pulmonary permeability as indicated by increased proteins in BAL fluid, and surfactant alteration as indicated by the change in total BAL phospholipids. The decreased PAF-AcH, an anti-inflammatory factor, implies an inflammatory process. These findings suggest that lung injury occurs early in the course of FHF. [1, 2] . Because pulmonary PDE activity is increased in rats exposed to endotoxin [3] , we hypothesised that inhibition of PDE type 5 with zaprinast or sildenafil improves responsiveness to inhaled NO in isolated-perfused lungs from rats challenged with endotoxin. Methods: Adult Sprague-Dawley rats (400-450 g BW) were injected intraperitoneally with 0.5 mg/kg E. coli 0111:B4 endotoxin. Sixteen to 20 hours later, lungs were isolated, ventilated and perfused in situ. Lungs were perfused using a modified Hank's balanced salt solution containing 5% dextran, 5% bovine albumine, 30 µM indomethacin, and 580 µM L-NAME to inhibit endogenous nitric oxide synthesis. 
Then, U46619 was employed to increase PAP by 5 to 8 mmHg. After a stable elevated PAP was reached, 4 or 10 ppm NO with or without zaprinast (50 µg) or sildenafil (10 ng) was administered for 3 min and the decrease in PAP was measured. Decrease in PAP (mmHg): zaprinast (50 µg), -1.9 ± 1.8* (n = 14) with NO 4 ppm and -2.9 ± 1.3* (n = 4) with NO 10 ppm; sildenafil (10 ng), -1.8 ± 1.1* (n = 6) with NO 4 ppm and -2.4 ± 0.5* (n = 5) with NO 10 ppm. († P < 0.05 vs NO ppm and * P < 0.05 vs NO 4 ppm and NO 10 ppm, respectively; ANOVA.)

Objective: To test the hypothesis that PLV combined with a high PEEP and a moderate Vt results in improved gas exchange and lung mechanics compared to CMV in acute lung injury in piglets. ALI was induced in 12 piglets weighing 9.0 ± 2.4 kg by repeated intravenous injections of oleic acid and repeated lung lavages. Thereafter the animals were randomly assigned either to PLV (n = 6) or CMV (n = 6) at an FiO2 of 1.0, a PEEP of 1.2 kPa, a tidal volume < 10 ml/kg, a respiratory rate of 24 breaths/min, and an I:E ratio of 1:2. Perfluorocarbon liquid (30 ml/kg b.w.) was instilled into the endotracheal tube over 10 min followed by 5 ml/kg b.w./h. Cardiorespiratory monitoring was done at baseline, after induction of ALI, and every 30 min up to 120 min. When compared with control animals, PLV resulted in significantly better oxygenation, significantly lower dead space ventilation, and significantly better CO and DO2. Conclusions: PLV combined with high PEEP and moderate tidal volume significantly improves oxygenation, dead space ventilation, cardiac output, and oxygen delivery in piglets with ALI, but has no significant influence on lung mechanics.

Results: Acute HS with volume resuscitation was not associated with significant alterations in any of the measured variables of the perfused lung. In contrast, acute HS without volume resuscitation resulted in the most striking alterations in lung mechanics and haemodynamics, as well as weight changes. Subacute HS, independently of volume resuscitation, showed only moderate and non-significant changes. See Figure 1. Our results demonstrate that pulmonary injury following non-resuscitated HS depends principally on the intensity and not on the volume of blood loss, and that during subacute HS, even of grade IV, volume resuscitation is of minimal benefit. Figure 1 legend: Mean ± SE values of data measured at 60 min after reperfusion of the isolated lungs of control rats (black bars), acute HS with resuscitation (white bars), acute HS without resuscitation (light grey bars), subacute HS with resuscitation (dark grey bars) and subacute HS without resuscitation (hatched bars). VPEF/VT = volume of peak expiratory flow over tidal volume, an index of airway resistance. *P < 0.05.

Introduction: Enteral feeding is a well known risk factor for developing Ventilator-Associated Pneumonia (VAP) [1, 2]. In this study we compared the incidence of VAP in trauma and non-trauma patients. Both groups were mechanically ventilated and enterally fed, in a 12-bed adult medical-surgical ICU in a tertiary/trauma center in Riyadh, Saudi Arabia. In this prospective study, all patients intubated and ventilated for more than 48 hours and receiving enteral feeding by an oral or nasal gastric tube were included over a period of 1 year. Diagnosis of VAP was made according to CDC criteria (Atlanta). In this study we showed that enteral feeding could be a risk factor for developing VAP in trauma patients in spite of their being less critically ill than other medical or surgical (non-trauma) patients in the ICU.
However, further studies are needed to elaborate further on the mechanism of VAP in trauma patients.

Introduction: Lactulose favourably influences microbial colonisation of the colon and may therefore prevent ascending colonisation of the gut and consequently decrease the incidence of VAP. A double-blind placebo-controlled study using lactulose 4 x 15 ml daily or saline, given into the stomach for prevention of VAP, was performed in our ICU during 1999-2000. Forty-eight patients without pulmonary infiltrates and mechanically ventilated for < 24 hours entered the study. Gastric, oral and tracheal aspirates and rectal swabs were taken for semiquantitative analysis within 24 hours and at 3-day intervals thereafter up to 21 days of ICU stay. For aspirates, an amount of pathogens > 10^5/ml was considered significant. Metoclopramide and sucralfate were used in all patients. Cyclic enteral nutrition was given into the stomach according to a standard protocol. Statistics: Chi-squared and Fisher exact tests were used; P < 0.05 was considered significant. Results: Thirty-eight patients (15 lactulose - L and 23 placebo - P) stayed in the ICU > 4 days and were further analysed. The L and P groups did not differ in age, APACHE II, ICU survival, LOS, or ventilatory and antibiotic days. The frequency of stools was significantly higher in the L group. On day 1 gastric aspirates were positive in 1 L and 2 P patients. On day 4 gastric colonisation increased to 4 (27%) and 7 (30%), respectively (NS). During the further ICU stay the colonisation remained in the range of 20-35% and did not differ between the groups. Candida albicans was the most frequently isolated pathogen. On day 1 tracheal aspirates were positive in 1 (L group) and 3 (P group) patients. Later the frequency in both groups increased to, and remained in, the range of 40-75% (NS). VAP was diagnosed in 6 (40%) and 8 (35%) in the L and P groups, respectively (NS). In 9 (L) and 12 (P) patients the pathogen was primarily isolated from the oral cavity, with Pseudomonas aeruginosa and coagulase-negative Staphylococcus being the most frequent pathogens. The consecutive appearance of the pathogen in the stomach, oral cavity and trachea was clearly present in 2 patients. In a further 5 patients the pathogens were isolated on the same day from all three sites.

Aim: To analyse the utility and impact of fiberoptic bronchoscopy (FOB) and bronchoalveolar lavage (BAL) in patients with haematological malignancy and bilateral pulmonary infiltrates. FOB with BAL was performed using three 50 ml aliquots of 0.9% sterile saline solution and aspiration under continuous monitoring of the saturation, over a 24-month period. Results: Data as median (range). We studied 35 patients with haematological malignancy with evolutive pneumonia despite treatment with appropriate empirical antimicrobial therapy. The overall diagnostic yield was 80%, which was identical in both neutropenic and non-neutropenic patients. This resulted in a change or rationalisation of antimicrobial therapy for both groups of patients. Conclusion: FOB with BAL is a beneficial investigative tool in patients with haematological malignancy with an evolutive pneumonia manifested by bilateral pulmonary infiltrates.

Conclusions: HARI was diagnosed in 20.3% of the patients admitted to our unit. The infected patients were significantly older than those without infection (P = 0.008) and had more comorbidities (especially chronic restrictive pulmonary illness - P = 0.041).
In these patients there were more non-elective OTE and subsequent invasive procedures of the airway. The occurrence of HARI prolonged significantly ICU-hospitalisation (P = 0.024), but did not cause a significant difference in the mortality rate. Conclusions: EORI has a high prevalence (45.1%) in our unit. Both EORI and LORI are associated with prolonged ICU lengths of stay. The presence of more than one risk factor has a strong association with EORI but depressed level of consciousness and documented aspiration were the only risk factors independently related to EORI. The existence of two or more conditions that disrupt ventilatory mechanics is a significant risk factor for EORI. Emergency intubation was also identified as a significant risk factor for EORI. Intervention: At admission to the ICU, patients were randomly assigned to one of two groups. In one group, patients were ventilated with bacterial filter, in the other group, were ventilated without bacterial filter. We collected data on surveillance samples from throat on admission and afterwards twice weekly, and respiratory infections during the ICU stay. Infections were diagnosed according to the criteria of the CDC and classified bassed on throat flora [1] in: primary endogenous, secondary endogenous and exogenous (Ex). . Eighty-two resulted in a positive culture. In 56 of these microbiologically confirmed VAP, at least one sTA less than 8 days old was available, for a total of 65 sTA. The specific diagnostic specimen and sTA were concordant in 35/65 cases (54%). When negative or contaminated sTA were omitted concordance reached 78% (in order to consider the value of sTA to target therapy). In sTA taken ≤ 4 days before the clinical suspicion of VAP, concordance was better than for sTA taken more than 4 days before the specific tests (83% vs 55.6%; P = 0.07). In addition, mean colony counts ≥ 10 6 cfu/ml (96%) were more fre-quently concordant than colony counts < 10 6 cfu/ml (50%) (P = 0.0003). Concordance rate was not significantly different in Early Onset Pneumonia (within 4 days after intubation-EOP) and Late Onset Pneumonia (that occurs more than 4 days after intubation-LOP). In EOP, a mean colony counts ≥ 10 6 was associated with high rate of concordance. In LOP, even low colony counts specimens identified the correct pathogen in 56% of cases. The concordance between sTA and diagnostic test was function of mean colony count and of time interval between the sTA and VAP onset. In LOP concordance was acceptable (56%) even with a low bacterial count. Surveillance TA may be a useful guide to improve early empiric therapy in suspected VAP. A multigenotypic and sequential molecular identification (Mol-id) was applied on bronchoalveolar lavages (BAL) from 12 pts with clinical-radiological evidence of infection with (n = 9)/without (n = 3) positive Ba-cul. DNA extraction and duplicate specific amplification of 16S rDNA from BAL was first used to identify signals corresponding to the presence of Gram + (G+), Gram -(G-), or mixed G+/G-. Any G+ signal was followed by femA and mecA multiplex (mpx)-PCR for species-specific identification of staphylococci and methicillin resistance, and by mpx-PCR for S. pneumoniae. Any G-signal was followed by mpx-PCR for species-specific identification of Pseudomonads (aeruginosa, cepacia, maltophilia) vs other G-for which the 16S rDNA amplicon was sequenced. Mol-id identified mixed G+/G-in 2/3 negative Ba-cul. Ba-cul gave G+ or G-only in 3 and 4/12 pts, respectively. 
Among these, Mol-id confirmed either G+ (2/3) or G-(2/4), and found a mixed signal in 3/7 pts. Mixed G+/G-was found by Ba-cul and Mol-id in 2/12 BAL, but in one case, S. epidermidis was found by Mol-id and enterococci by Ba-cul. Staphylococci (MRSA, other) and Ps. aeruginosa were found by Ba-cul and Mol-id in 2 and 3 pts, respectively. Current results suggest that Mol-id is a useful adjunct bringing more insight to standard criteria of nosocomial pneumoniae in ICU and can be a timely relevant guide for antibiotherapy. Design: Retrospective cohort study. Setting: A 6-bed medical ICU in a university hospital. Methods: Patients with first NP (bronchoalveolar lavage culture ≥ 10 4 CFU/ml or protected specimen brush culture ≥ 10 3 CFU/ml) or first NBSI (CDC definition) were included between May 1998 and September 1999. The organ failure score (Fagon criteria) at the time of sampling (day 0) and the interval between sampling and the start of AAT were recorded. Antibiotic treatment was considered to be adequate when all etiologic organisms isolated from the culture specimen were found to be sensitive to the initial empiric antibiotics. Mortality was compared according to the time of AAT and the organ failure score on day 0. Results: A total of 25 patients (mean SAPSII = 44) were included in the study. Seventeen of them presented with a first NP and eight with a first NBSI. The infection occurred 6.5 ± 4.6 days after ICU admission, 23 patients were receiving mechanical ventilation on day 0. The ICU mortality was 48% (12/25) and was not different between NP patients and NBSI patients: 9/17 vs 3/8 (P = 0.47). Mortality increased with the duration without AAT (P = 0.011) and was reduced when AAT was started on day 0 (P = 0.016) or day 1 (P = 0.036). A subsequent change from inadequate to adequate antibiotic treatment had no impact on survival. Mortality was also associated with the number of organ failures on day 0 (P = 0.017). The mortality rate in patients developing NP or NBSI can be reduced when AAT is started before day 2. When the results of the bronchoscopy specimen and blood cultures are obtained early, they can therefore be helpful to start AAT and influence survival. Objective: To examine whether measurement of procalcitonin (PCT) in comparison to Interleukin 6 (IL-6) is a reliable marker to score the extent of lung contusion in bronchoalveolar lavage (BAL) fluids in polytrauma patients. Design: Prospective, non-randomized observational study. Setting: Twelve-bed ICU in a 1100-bed primary care university hospital. Patients: n = 14 trauma victims presenting with severe lung contusion and acute lung injury (ALI) or acute respiratory distress syndrome (ARDS) were enrolled in the study. Interventions: Bronchoscopy with collection of lavage fluid and serum blood samples. Samples were obtained on day 1 and day 2 after severe chest trauma, and lung contusion was assessed by CT scan. Measurements and main results: PCT was detectable in bronchoalveolar lavage (BAL) fluids of all 14 patients. A significant correlation for PCT serum and BAL levels was found on day 2 (P = 0.0063). For PCT no significant correlations (Spearman rank) were found to the lung injury score (LIS) (P = 0.93), the abbreviated injury scale (AIS-lung) (P = 0.33) and the sepsis-related organ failure assessment score (SOFA-lung) (P = 0.38). Also for IL-6 there was no significant correlation to the LIS score (P = 0.62), AIS-score lung (P = 0.45) and to the SOFA-score lung (P = 0.54). 
Methods: Ten patients with systemic autoimmune disease (rheumatoid arthritis, systemic lupus erythematosus, scleroderma) and sepsis, treated in an internal medicine intensive care unit of a university hospital, were included. The severity of disease was assessed using the APACHE II and SOFA scores. To assess systemic inflammation, the serum concentrations of C-reactive protein, procalcitonin, TNF-alpha and interleukin-6 and the leukocyte count were measured during the septic process. There was a significant difference in TNF-alpha and interleukin-6 levels and in procalcitonin during systemic bacterial inflammation (P < 0.05). Only the marker procalcitonin was related to the clinical signs and the severity of disease. Compared with C-reactive protein and the leucocyte count, PCT also showed a better association with the duration of sepsis. These results indicate that, in autoimmune diseases, procalcitonin is an important marker for discriminating sepsis from underlying disease activity. The measurement of TNF-alpha and interleukin-6 during sepsis in these diseases reflected the systemic inflammation but, in contrast to PCT, was not specific for bacterial infection.

Serum endotoxin was formerly measured by the Limulus amebocyte lysate (LAL) assay. A major step forward after this assay became available was the introduction of chromogenic substrates that are converted by LAL clotting enzymes. Further improvements could not prevent the results from being non-specific and, in general, unsatisfactory. An alternative approach by our group for the detection of endotoxin was the use of a cross-reactive monoclonal antibody against endotoxin (WN1-222/5) in combination with flow cytometry, measuring the subsequent light emission of a second antibody directed against WN1-222/5 in peripheral mononuclear cells (MØs). In our porcine endotoxin shock model we investigated 10 pigs under analgosedation receiving 250 ng/kg/hour endotoxin from Salmonella friedeman. Separation of mononuclear cells every 4 h and determination of the concentration of endotoxin revealed the results shown in the Table. In this preliminary study we could not find LPS at the cell surface in vivo, but we could in vitro (unpublished data). The current results indicate that the processes underlying LPS internalization are very complex. During the course of endotoxin shock with continuous endotoxin infusion, endotoxin internalization increased gradually. Initially the PMNs showed the highest activity, with only a small increase thereafter. In contrast, the MØs revealed a more than 10-fold higher activity after 4 hours of the experiment. With our new specific endotoxin test protocol it might well be possible in the future to evaluate the different responses to LPS as it finds its way to different surface domains or intracellular components in different cell populations.

The systemic response to infection is defined as the septic phenomenon. By definition it is a generalized inflammatory process, and during its progression every organ and system can potentially be impaired. Its progression is associated with and mediated by the activation of a number of host defense mechanisms (cytokine networks, activation of leukocytes, etc) and is characterized by damage to many organs mediated by this 'whole body inflammation'. Procollagen type III peptide (PIIIP), as a marker of collagen type III biosynthesis and turnover, directly indicates collagen synthesis and seems to be a good marker of many fibrosing, destructive or healing processes.
However, serum concentrations of PIIIP have never been systematically measured in patients with graded sepsis. We hypothesized that procollagen type III peptide serum levels might also be of value in estimating the 'whole body inflammation and damage' that appears in sepsis. This study was undertaken to test this hypothesis. We measured, with a commercially available immunoassay (ELISA) technique, the serum procollagen type III peptide levels of 51 septic patients (pts) (22 pts with sepsis [group G1], 12 pts with severe sepsis [group G2], 17 pts with septic shock [group G3]) and compared them with the findings in 12 healthy controls (group H). The definition of the stages of sepsis followed the criteria established by the ACCP/SCCM consensus conference (August 1992). We used one-way ANOVA to compare the results from sepsis, severe sepsis and septic shock patients with those from healthy controls. Procollagen type III peptide serum levels were markedly increased during the septic process: group H 3.7 ± 0.2 µg/ml, group G1 10.1 ± 1 µg/ml, group G2 30 ± 6.2 µg/ml, group G3 34 ± 8.1 µg/ml (P < 0.005, one-way ANOVA), and the differences were statistically significant when groups H and G1 were compared with group G3 (P < 0.05 and P < 0.005, respectively; Scheffé test for post hoc comparisons of means). We conclude that PIIIP serum levels increased in parallel with the increasing severity of the septic process, probably being a good indicator of tissue inflammation, damage and fibrogenesis. body and sacrificed 18 hours after LPS for functional and immunohistochemical studies (N = 6 each). Semi-quantitative RT-PCR was also performed to delineate the time course of MCP-1 mRNA expression (0, 1, 3, 6, 24, and 48 hours). MCP-1 mRNA peak expression was confirmed and quantified by SYBR Green two-step real-time RT-PCR. Cellular infiltration (polymorphonuclear neutrophils, monocytes) and MCP-1 protein expression were determined by immunohistochemistry. Spontaneous and bethanechol-stimulated circular muscle strip contractility was assessed using a standard organ bath. Statistical analysis was performed using the unpaired Student t-test. Data were considered statistically significant at P < 0.05. Endotoxemia caused a significant increase in MCP-1 mRNA expression in the intestinal muscularis, peaking at 3 hours and returning to near control levels after 48 hours. SYBR Green real-time PCR revealed a 280-fold increase in MCP-1 mRNA expression 3 hours after LPS compared to control tissue (relative quantification: ∆∆CT = -8.1 ± 0.25). MCP-1 protein was immunohistochemically located in resident muscularis macrophages. LPS application caused a significant infiltration of leucocytes into the intestinal muscularis and a 51% decrease in muscle contractility, which was significantly blocked by MCP-1 antibody treatment (8% decrease), but not by the non-specific antibody (38% decrease, at bethanechol 300 µM). The results suggest that locally derived MCP-1 plays a major role in the recruitment of monocytes during endotoxemia, from which kinetically active substances contribute to ileus. Patients: Ten patients with severe sepsis and septic shock, and 10 healthy control volunteers. Interventions: Blood samples were drawn to evaluate neutrophil apoptosis and neutrophil function by flow cytometry. Measurements and statistical analysis: Neutrophil apoptosis was assessed by propidium iodide (PI) DNA and annexin V staining. Neutrophil phagocytic capacity in response to Staphylococcus aureus (S.
aureus) stimulation was analyzed. Reactive oxygen species formation at baseline and in response to phorbol myristate acetate (PMA), N-formyl-methionyl-leucyl-phenylalanine (FMLP), lipopolysaccharide (LPS) and S. aureus stimulation was assessed by flow cytometry using dichlorofluorescein (DCFH). Differences in the extent of apoptosis, phagocytosis and oxidative metabolism in septic patients vs healthy volunteers were analyzed using a Mann-Whitney test. Results: Neutrophil apoptosis, neutrophil phagocytic ability and reactive oxygen species formation were enhanced in septic patients compared to healthy volunteers. Circulating neutrophils from septic patients presented increased apoptosis and phagocytic ability. Reactive oxygen species formation was also enhanced in these patients. Although the examination of organ system functions such as liver, kidney and lung is well established, monitoring of the immune system is still poorly developed. Immunosuppression and hyperinflammation are frequently encountered in ICU patients. The establishment of objective measures of immunological parameters is therefore essential for controlling the hyperinflammatory stage and the infections that result from immunosuppression. We measured the Th1/Th2 balance and monocytic HLA-DR expression. Intracytoplasmic Th1 and Th2 cytokine production in isolated PBMCs was assessed by flow cytometry following in vitro activation with PMA plus ionomycin. Monocytic HLA-DR expression was also measured. The percentage of Th1 cells in PBMCs from patients with severe burn injury, septicemia and multiple injuries was 4%, 9% and 13%, respectively. The percentage of Th1 cells from healthy subjects was 22%. The percentage of Th2 cells, in contrast, did not change significantly in any patient group. The significant reduction of Th1 cells and monocytic HLA-DR expression in ICU patients, except for those with acute interstitial pneumonia (AIP), indicates that major stress results in immunoparalysis. In two AIP patients with more than 40% Th1 cells, immunomodulation with methylprednisolone was attempted. Y Imamura*, T Yokoyama†, E Hiyama†, Y Takesue*, T Sueda*, S Zedler‡, E Faist‡ *First Department of Surgery, and †Department of General Medicine, Hiroshima University, Japan; ‡Department of Surgery, University of München, Germany Objective: Under systemic inflammatory response syndrome (SIRS), a change in the Th1/Th2 response of T cells and a defect in monocyte function are observed, which may lead to a derangement of immunological homeostasis, associated with immunosuppression and susceptibility to sepsis. We have recently developed an immune-inflammatory monitoring system that can detect constitutional changes in the Th1/Th2 population of T-cell subsets and in monokine production by monocytes, via multi-parameter flow cytometry. The aim of this study was to investigate whether these tests are useful in establishing the difference between septic SIRS and non-septic SIRS. In the septic SIRS group, 11 patients with sepsis followed by organ dysfunction were studied on admission to the ICU in our hospital. In the non-septic SIRS group, 10 patients who underwent major elective surgery were studied on the first and third days after operation. We investigated the cytokine expression of T cells after activation with ionomycin and PMA, and the expression of monokines and HLA-DR antigen by monocytes after stimulation with LPS and/or IFN-gamma, using whole blood culture (6 hours).
After staining for cell-surface phenotypes, the cells were fixed and permeabilized, then fluoro-immunostained for intracellular IFN-gamma and IL-4 in T cells, and TNF, IL-6 and IL-12 in monocytes. The frequencies of these cytokine-producing cells were estimated with multicolor flow cytometric analysis. 1) The number of IL-4 producing cells (Th2) among T cells increased significantly in both the CD4+ and CD8+ subsets in patients with sepsis, but not in patients with non-septic SIRS, whereas IFN-gamma producing cells (Th1) increased only slightly in patients with sepsis and non-septic SIRS. 2) The production of IL-6, TNF and IL-12 by monocytes from patients with sepsis and non-septic SIRS was significantly decreased, together with a reduction of HLA-DR expression. The defect in TNF and IL-12 production by monocytes from non-septic SIRS patients subsequently recovered by the third postoperative day. These findings show a significant shift toward a Th2 response in T-cell subsets and a prolonged reduction of TNF and IL-12 production, with reduced HLA-DR expression by monocytes, in sepsis compared with non-septic SIRS. These tests may be useful for differentiating the immunosuppression that follows sepsis from SIRS with a dominant pro-inflammatory state. Methods: Ten patients with sepsis from an internal intensive care unit of a university hospital were included and compared with 10 healthy controls. Severity of disease was assessed with the APACHE II and SOFA scores. Serum concentrations of C-reactive protein, procalcitonin, TNF-alpha and interleukin-6 were also measured. CD4 and CD8 expression was analysed by flow cytometry ex vivo and after stimulation with PMA on days 1, 7 and 14 in culture (proliferation index). There was a significant difference in the CD4/CD8 ratio between septic patients and healthy controls (P < 0.05) ex vivo and on day 7. The severity of disease also showed a linear correlation with the changes in the CD4/CD8 ratio after stimulation with PMA. These results indicate that changes in the CD4/CD8 ratio could be an important part of the immunological response to pathogens in the septic process. T-cell subpopulations are therefore probably involved in the development of immune system dysfunction. Table. One animal in the LVZP group (1 h) and three in the HVP group (2, 3 and 2 h) died before the end of the experiment. Conclusion: Different injurious ventilatory strategies may induce a TNF-α and IL-10 response from the alveolar space that may lead to an increase in serum cytokine levels in the normal swine lung. TNF-α (pg/ml): 5.1 ± 0.9, 20.9 ± 3.1, 3.4 ± 1.1, 35.4 ± 9.9*, 16.7 ± 0.8*, 33.9 ± 7.6*; IL-10 (pg/ml): 2.8 ± 0.9, 2.9 ± 0.6, 1.5 ± 0.7, 27.2 ± 22.9*, 2.8 ± 0.7*, 16.2 ± 2.5*. * Statistically significant difference (P < 0.01). The aim of this study was to show the effect of intratracheal (IT) and intravenous (IV) lidocaine on haemodynamic and arterial blood gas values in hydrochloric acid (HCl)-induced acute lung injury (ALI) in rabbits. Twenty New Zealand rabbits were randomly divided into four groups of five. An endotracheal tube was placed through a tracheostomy in all animals after induction of anaesthesia with ketamine hydrochloride (50 mg/kg, intramuscularly). Maintenance of anaesthesia was achieved with ketamine hydrochloride 10 mg/kg/h and atracurium besylate 1 mg/kg/h. The animals were ventilated in pressure control mode for 3 hours with the following ventilation parameters: FiO2 1.0, RR 80/min, Vt 8 ml/kg, I/E 1/2, PEEP 5 cmH2O.
Intratracheal HCl (2 ml/kg) was given following tracheostomy in all animals. Five minutes after the application of HCl, group 1 received IV lidocaine (2 mg/kg), group 2 received IT lidocaine (2 mg/kg), group 3 received IV lidocaine (4 mg/kg) and group 4 received IT lidocaine (4 mg/kg). Mean arterial blood pressure (MAP) and heart rate (HR) were recorded every 60 min. Arterial blood gas values (PaO2, PaCO2, pH) were measured at the beginning of the study and at the 90th and 180th minutes. Data were compared by Mann-Whitney U-test. A P value of less than 0.05 was considered to indicate statistical significance. MAP in groups 2 and 4 was higher than in groups 1 and 3 at the 120th and 180th minutes (P < 0.05). HR in group 2 was lower than in the other groups at the 120th and 180th minutes (P < 0.05). PaO2 in group 1 was higher than in the other groups at the 180th minute (P < 0.001). There was no significant difference between the four groups regarding PaCO2 and pH. In conclusion, IT and IV lidocaine (2 mg/kg) given 5 min after HCl application had a beneficial effect on haemodynamic and arterial blood gas values in this rabbit model of acute lung injury. Introduction: Fibrin formation in the airway is a common occurrence in acute lung injury. Mucous plugging of the airway prevents alveolar expansion and may increase shunt blood flow. The aim of this study was to investigate the effect of heparin nebulization in acute lung injury (ALI) with sepsis or burn after smoke inhalation in sheep. Female sheep (n = 20) were used. The animals were divided into two groups: one was an ALI model induced by combined injury with smoke inhalation and severe pneumonia (n = 10); the other was an ALI model induced by combined injury with smoke inhalation and a third-degree, 40% body surface area flame burn (n = 10). Both injury groups received 48 breaths of cotton smoke (< 40°C). The sham control animals were not injured. Ps. aeruginosa (5 × 10^11 CFU) was inoculated into the airway using a bronchoscope. All the animals were mechanically ventilated after the injury. Burned animals were resuscitated with lactated Ringer's solution following the Parkland formula. Each group was divided into two: one half was treated with heparin nebulization (n = 5; 10,000 U 1 h after the injury and every 4 hours thereafter) and the other was treated with 0.9% NaCl in the same manner as a control. Lung histology was scored by a pathologist who was blinded to the animal grouping. Congestion, edema, inflammation and hemorrhage were each scored from 0 (none) to 4 (severe) and a total score was calculated (maximum score = 16). In the sepsis study, the drop in PaO2/FiO2 was significantly attenuated by heparin, whereas it was not in the burn study. The lung histology score was also attenuated by heparin in the sepsis model but not in the burn model. Heparin nebulization was effective in reducing acute lung injury induced by severe pneumonia and smoke inhalation, but not that induced by burn and smoke inhalation. Since heparin does not inhibit thrombin without antithrombin, the result suggests that the antithrombin level in the alveolar space, which is exuded from the bronchial blood flow, may differ between the two models. Background: Superoxide (O2-), a key antimicrobial agent in phagocytes, is produced by the activity of NADPH oxidase. High concentrations of glucose may reduce O2- production through inhibition of glucose-6-phosphate dehydrogenase (G6PD) [1], which catalyzes the formation of NADPH.
The aim was to measure the acute effects of high glucose or the G6PD inhibitor dehydroepiandrosterone (DHEA) on the release of O2- from isolated human neutrophils. Methods: Neutrophils were isolated from the peripheral blood of healthy subjects by gradient centrifugation and incubated for 1 hour at 37°C in Krebs-Ringer buffer containing 5, 10 or 25 mM glucose, 5 mM glucose with 0, 5 or 20 mM mannitol, or 5 mM glucose with 1, 10 or 100 µM DHEA. N-Formyl-methionyl-leucyl-phenylalanine (fMLP)-induced O2- release was measured by superoxide dismutase-inhibitable reduction of cytochrome c or by luminol-enhanced luminescence. Scavenging of O2- by glucose or DHEA was assessed by the pyrogallol assay [2]. Results: Incubation with glucose or DHEA, but not glucose/mannitol, dose-dependently reduced fMLP-induced release of O2- as detected by either method. In a cell-free system, neither glucose nor DHEA scavenged O2-. Conclusions: Inhibition of G6PD may be the cause of acutely reduced O2- release from activated neutrophils in response to high glucose. Methods: Over a 6 month period, 33 patients with severe sepsis or septic shock were studied in the intensive care unit. Annexin V binding by leukocytes was determined daily using flow cytometry and FITC-labeled Annexin V. Transient leukocytosis, or peaks in leukocyte counts, was defined as an individual increase of at least 30% within 2 days, followed by a decrease of at least 30% within the following 2 days. Results: Nine, 14 and 10 peaks in neutrophil, monocyte and lymphocyte counts, respectively, were observed in 6, 9 and 10 patients; in all of these patients, increases in Annexin V binding by neutrophils (69% up to 809%, median 215%), monocytes (32% up to 973%, median 330%) and lymphocytes (32% up to 4713%, median 224%) paralleled the corresponding peaks in neutrophil, monocyte and lymphocyte counts (in 13/14 episodes). During periods in which neutrophil, monocyte and lymphocyte numbers were stable, Annexin V binding was constant as well. In conclusion, mobilization of leukocytes during severe sepsis and septic shock in critically ill patients is associated with increased apoptosis, as determined by Annexin V. Background: Animal studies have recently been presented suggesting that complement factor C5a blockade might be of benefit in patients with severe sepsis/septic shock. In one study, the expression of the C5a receptor (CD88) on granulocytes from septic animals was increased. The aim of the present investigation was to study CD88 expression on leukocytes in human sepsis. Twelve ICU patients fulfilling the ACCP/SCCM criteria for severe sepsis/septic shock were prospectively included in the study as early as possible in their septic course. Blood samples for analyses of leukocyte receptor expression and complement factors were taken on days 1, 3 and 15. The leukocytes were isolated from heparinised whole blood and labelled with CD88 antibodies. As controls, leukocytes from 20 healthy individuals were used. The samples were analysed by flow cytometry and the results presented as mean fluorescence intensity (MFI). The complement proteins C3a and terminal complement complex (TCC) were analysed in EDTA plasma by capture ELISA techniques. Levels of TNF-α, IL-6, IL-8, IL-10, IL-1ra and MCP-1 were analysed in EDTA plasma by ELISA. Results: On day 1, 10/11 patients had increased levels of C3a (1144 ± 138 ng/ml, mean ± SE; normal range: 92-268) and 11/11 had increased TCC (146 ± 46 AU/ml; normal range: 12-56).
CD88 expression on the granulocytes in the control group was 63 ± 4. In comparison with the controls, the patients with severe sepsis/septic shock had significantly lowered values: on day 1 37 ± 5 (P < 0.001), on day 3 45 ± 8 (P < 0.05), and on day 15 51 ± 8 (P < 0.05). The granulocyte expression of CD88 on day 1 correlated negatively to APACHE-II score at inclusion (r = -0.59, P < 0.05). Besides a weak correlation to IL-1ra, there were no significant correlations to the other cytokines. In the patient group, the CD88 expression on the monocytes did not change during the observation time and did not differ from that in the control group. Methods: Ten anaesthetised, and multi-catheterised pigs (20.6 ± 1.3 kg) were investigated over a period of 8 h. Sepsis was induced by fecal peritonitis. Animals were infused using 6% hydroxyethyl starch 200/0.5 to maintain a CVP of 12 mmHg. In kidneys biopsies TCC deposition was detected immunohistologically. Plasma levels of TCC were measured in a double antibody EIA using the neoepitope-specific MoAb aE11 as catching antibody. Albumin escape rate (AER; tc 99m-labeled albumin), serum protein (S-Protein), and hematocrit (Hct) were determined. After verifying normal data distribution (skewness < 1.5) Student's t-test was performed by rankordered stepwise testing. Data are mean ± SD. Results: Septic animals showed marked renal deposition of TCC. Other results, see Table. Conclusion: Although plasma levels of TCC declined over study period, in septic animals marked renal depositions of TCC indicated complement activation. Since AER increased and serum protein levels decreased, capillary loss of TCC into organ tissue may explain our findings in part. We conclude that in septic shock with substantial CLS plasma levels of TCC may not reflect degree of complement activation. Methods: Over a 6 month period 35 patients with proven infection and severe sepsis or septic shock for at least 3 days' duration were monitored on a daily basis during their stay in the intensive care unit (ICU) until discharge from the ICU or death. In 19 out of these 35 patients one or more peaks in G-CSF serum concentrations occurred. Eleven of these 19 patients survived, eight patients died. A longitudinal analysis of G-CSF serum concentrations, phagocytotic activity of granulocytes and surface expression of monomeric Fc receptor type I (CD64, FcγRI) on granulocytes was performed by ELISA technique (R&D Systems, Minneapolis, MN, USA) and flow cytometry (Phagotest™; Orpegen, Heidelberg, Germany) and CD64 (clone 22; Immunotech, Krefeld, Germany), respectively on a daily basis. Results: A G-CSF peak was defined as an increase of at least 30% from one day to the other, followed by a decrease of at least 15% on the next day. The following results are expressed as median (min -max) values. In seven episodes there was a parallel course of the G-CSF peak and phagocytosis with an increase in phagocytosis by 37% (6-50%). In 11 episodes, phagocytosis continuously increased and remained on a higher level after the increase of 10% (1-164%) from day 1 up to day 2. In 10 episodes, there was a decrease by 40% (17-76%) at the day of the G-CSF peak, followed by an increase by 58% (6-322%) on the next day. In 12 episodes, there was no increase (n = 4) or even a decrease (n = 8) by 24% (3-46%) over all days. 
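The peak definition used above (a rise of at least 30% from one day to the next, followed by a fall of at least 15% on the following day) amounts to a simple rule over the daily series; a minimal sketch, applied to a hypothetical daily G-CSF series rather than patient data:

# Minimal sketch of the G-CSF peak definition described above:
# a >=30% day-to-day rise followed by a >=15% fall on the next day.
# The daily series is a hypothetical placeholder, not patient data.
def find_peaks(daily_values, rise=0.30, fall=0.15):
    peaks = []
    for i in range(1, len(daily_values) - 1):
        prev, cur, nxt = daily_values[i - 1], daily_values[i], daily_values[i + 1]
        if cur >= prev * (1 + rise) and nxt <= cur * (1 - fall):
            peaks.append(i)  # index of the peak day
    return peaks

gcsf = [120, 150, 310, 220, 240, 400, 390]  # pg/ml per day, hypothetical
print(find_peaks(gcsf))  # -> [2]  (day 2: 150 -> 310 is +107%, 310 -> 220 is -29%)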
A peak in G-CSF serum concentration was followed by a continuous increase in phagocytosis at the same day in 7, and a delayed increase in 21 out of 40 episodes, but no increase or even an decrease in 12 out of 40 episodes. Thus, phagocytotic activity is increased when G-CSF peaks endogenously, in most patients with severe sepsis or septic shock. There was a statistically significant increase in WBC count between the 1st and 2nd, 3rd and 4th days (P < 0.01). There was no statistically significant PaO 2 /FiO 2 index improvement between the 1st, 2nd and 4th days (P < 0.251 and 0.392) and index deterioration between the 1st and 3rd day (P < 0.478). Conclusions: Despite significant WBC count increase after rHu GM-CSF supplementation, the PaO 2 /FiO 2 index was not significantly altered. Molgramostin, when given in septic patients may improve the PaO 2 /FiO 2 index except during the 3rd day of the major WBC count increase during which there is a nonstatistically significant decrease in this index by 3.1%. GM-CSF supplementation can be safely used without deteriorating pulmonary gas exchange function. 1 Background: Translocation of microorganisms from the gut lumen may contribute to morbidity in sepsis. Conservation of normal barrier function should, therefore, be a goal in critical illness, but its components is only partly defined. Beyond mechanical contribution, the colonic epithelium may possess antimicrobial activity via yet undefined mediators [1] . Superoxide (O 2 -), generated by membrane-bound NADPH oxidase (Nox), is a key antimicrobial agent of phagocytes, but output from human colonic epithelial cells has not been demonstrated. To measure O 2 output and expression of the non-phagocytic NADPH oxidase, Nox1, in short term cultures of primary epithelial cells from normal human colon and in Caco-2 colonic epithelial cell line. Colonic epithelial cells were isolated from biopsies of subjects with uninflamed bowel and Caco-2 cells were grown in 24-well plates [2] . Output of O 2 was measured by the cytochrome c assay or luminol-enhanced luminescence and localised by the nitrobluetetrazolium (NBT) assay [3] . Effects of 1 mM NADPH, oxidoreductase inhibition by 10 µM diphenylene iodonium (DPI) and protein kinase C-activation by 0.5 µg/ml phorbol myristate ester (PMA) were assessed. mRNA expression of Nox1 was analysed by RT-PCR. : Identical results were obtained in primary epithelial cells (see Figs) and Caco-2 cells. NBT-reduction was observed at the outer cell membrane and Nox1 mRNA was detected in all cell cultures. NADPH increased output of O 2 within seconds although cell membranes are impermeable to NADPH. These results show that cultures of human colonic epithelial cells produce extracellular O 2 possibly through membrane-bound Nox1, which appears to independent of protein kinase C activation. O 2 from epithelial cells may possess antimicrobial activity and contribute to colonic barrier function. In pilot studies to calibrate the murine HSR, 20-25 g ND4 mice were anesthetized and immersed in a water bath for a total of 20 min to raise core body temperature to 37, 40 or 41.5°C (n = 3 per group). Livers were harvested 24 hours later. Western blot analyses for Heat Shock Protein-72 (HSP-72, a widely-accepted marker of HSR) showed the expression of HSP-72 at 41.5°C for 20 min but not at or below 40°C. This pattern is strain independent. Next, the effect of HSR prior to or subsequent to cecal ligation and puncture via halothane anesthetic (CLP) upon survival was tested. 
20-25 g male inbred C57-BL6 mice were randomized to one of six groups (n = 15-20 per group) and heated for 20 min to either 37 or 41.5°C, alone or in combination with CLP. Survival was 70% for HSR induction prior to CLP and 15% for HSR induction subsequent to CLP (P = 0.001). To exclude the possibility that this order-dependent response was strain specific, the study was repeated with outbred ND4 mice (n = 11-13 per group). In the ND4 mice, survival for HSR induction prior to CLP was 25%, but following CLP it was nil (P = 0.0001, Fig. 1). Though beneficial and apparently protective when induced prior to the insult, the heat shock response paradoxically increases mortality when activated after severe stress. This paradoxical potentiation of injury also appears to be independent of the specific strain. The susceptibility of infected animals to a devastating HSR at 41.5°C may explain, at least in part, why human fevers are generally self-limited to 40°C or less. Objective: The ORL-1 receptor (orphan opioid receptor) has been discovered recently and is involved in pain perception and immune function [1][2][3]. The regulation of the ORL-1 receptor in patients with systemic inflammation has not yet been elucidated. This study investigates the influence of different doses of LPS on ORL-1 expression in peripheral blood cells ex vivo. Human whole blood from healthy volunteers was cultured at 37°C and 5% CO2 without LPS or with LPS 0.1 ng, 10 ng or 100 ng/ml for 3, 6, 12 and 24 hours. Reverse transcriptase polymerase chain reaction (rt-PCR) using specific primers was performed and RNA content was estimated by semiquantitative analysis employing a housekeeping gene as internal standard. Southern blot analysis and hybridisation to a specific DIG-labeled probe confirmed the identity of the ORL-1 transcripts. Statistics: Mean ± SEM, repeated measures ANOVA. Results: The ORL-1 receptor was expressed constitutively in human peripheral blood cells. Mean baseline expression resulted in an ORL/GAPDH ratio of 1.0 ± 0.07. Semiquantitative rt-PCR revealed a dose- and time-dependent downregulation of ORL-1 expression (P < 0.05). Incubation with LPS 0.1 ng/ml decreased the ORL/GAPDH ratio from 0.95 ± 0.06 at 3 hours to 0.19 ± 0.02 at 24 hours. In contrast, incubation with LPS 100 ng/ml suppressed the ORL-1 message after only 3 hours of incubation. Southern blot analysis and hybridisation proved the specificity of the amplified PCR products for ORL-1 transcripts. The influence on immunity of direct hemoperfusion with a polymyxin B-immobilized fiber column (PMX) was studied in severely septic patients with organ dysfunction. Thirty-four patients, with a mean age of 59 years and a mean APACHE II score of 23, received a total of 50 PMX treatments. They were divided into detectable (≥ 10 pg/ml, n = 27) and non-detectable (n = 23) endotoxin groups, based on analysis before PMX with a colorimetric limulus test with chromogenic substrate (Toxicolor). After PMX, mean arterial pressure and systemic vascular resistance index were significantly increased in both groups. In the detectable group, endotoxin, neutrophils and monocytes were significantly decreased, from 76.4 ± 19.5 to 64.8 ± 17.8 pg/ml (P = 0.0158), from 14810 ± 2020 to 9990 ± 1660/mm3 (P = 0.0002), and from 688 ± 103 to 512 ± 99/mm3 (P = 0.0087), respectively, while lymphocytes decreased non-significantly in both groups.
Furthermore, IL-6 decreased significantly from 958 ± 437 to 722 ± 296 pg/ml (n = 37, P = 0.0495), and IL-8 and CD4 increased significantly from 162 ± 80 to 195 ± 114 pg/ml (n = 31, P = 0.0456) and from 31 ± 4 to 34 ± 3% (n = 14, P = 0.0091), respectively, but TNFα and IL-1β did not change significantly. These results indicate that PMX therapy in severe sepsis may terminate the innate immune response and induce adaptive immunity, and may thereby be helpful for hemodynamic stability. Introduction: Endotoxemia is characterized by increased microvascular permeability. Endothelial factors and mast cell activation seem to promote microvascular permeability independently of leukocyte adherence [1]. The aim of our study was to investigate the influence of mast cells on leukocyte-independent changes in microvascular permeability. Microvascular permeability was therefore determined during endotoxemia after inhibition of the L-selectin-mediated leukocyte-endothelium interaction with fucoidin. Mast cells were either degranulated with compound 48/80 (CMP48/80) prior to the experiment or stabilized by pre-treatment with cromolyn. In male Wistar rats, microvascular permeability (MP), leukocyte adherence (LA) and mast cell activation (MCA) were determined in mesenteric postcapillary venules using intravital microscopy at baseline and at 60 and 120 min after the start of a continuous infusion of endotoxin (groups A-C, n = 8 each). Animals underwent laparotomy and the mesentery was exposed under an in vivo videomicroscope. MCA was determined in vivo by superfusion of the mesentery with ruthenium red. MP was measured using fluorescein isothiocyanate (FITC)-labeled albumin. Leukocyte-endothelial interaction was blocked with fucoidin 10 min before laparotomy in groups A, B and C. Animals in group B additionally received CMP48/80 (1 mg/kg b.w. i.p.) 48 h before the start of the experiment. Animals in group C received cromolyn (20 mg/kg b.w. i.v.) prior to the baseline measurement, followed by continuous superfusion of the mesentery with cromolyn. Group D (control, n = 8) received only equivalent volumes of 0.9% NaCl. Statistical analysis was performed using Student's t-test. A P value < 0.05 was considered significant. In the endotoxin- and fucoidin-treated groups (A-C), LA was attenuated to levels similar to those in the control group (D). Endotoxin-induced MCA was prevented in both the CMP48/80- and the cromolyn-treated animals (groups B + C) (P < 0.05 vs group A). However, in the CMP48/80- and cromolyn-treated groups the endotoxin-induced increase in microvascular permeability also tended to be reduced, but not significantly (P > 0.05 vs group A). Differences in MP between group A and the control group were significant at 120 min. The results of this study suggest that mast cells play only a minor role in the pathophysiology of leukocyte-independent plasma extravasation during endotoxemia. Other pathomechanisms must be considered. Reactive haemophagocytic lymphohistiocytosis (HLH) is a highly underdiagnosed condition in adult intensive care. It is characterized by a non-malignant proliferation of histiocytes with phagocytosis of haematopoietic cells, resulting in cytopenia of at least two cell lines. The acquired form of HLH has been associated with infections, neoplasms, autoimmune diseases and immunosuppression.
Methods: Retrospective review of 15 patients who developed reactive HLH during their stay in the Liver Intensive Care Unit between January and November 2000. Predictors for survival were calculated using a non-parametric Mann-Whitney U-test. Results: Admission diagnosis was fulminant hepatic failure in six patients, two patients presented with sepsis and chronic liver disease and seven patients following liver transplantation. All patients were in multiple organ failure (MOF) and severely thrombocytopenic (median 32 × 10 5 ), despite platelet support in 13 patients, at the time of diagnosis. Twelve patients received immunosuppressive medication. Among the post transplant patients, six tested positive for CMV DNA and were treated with Ganciclovir. Ten patients died, two are still requiring intensive care treatment and three have been discharged from ITU (two from hospital). Survivors had significant lower SOFA scores at the time of HLH diagnosis, a significant higher, unsupported platelet count 7 days after diagnosis and high dose gamma-globulin therapy and a trend towards a shorter stay on mechanical ventilation (Table -all parameters median and range). (FasL) . FasL also has a proinflammatory role and is released following ischaemia-reperfusion. We wished to investigate the time course of release of the soluble forms of Fas and FasL (sFas, sFasL) post CPB, and whether steroid pre-treatment altered the response. Method: Twenty-seven children with congenital heart disease were studied, median (IQ) age 7 (0.4-10) months. Patients were given 0.25 mg/kg dexamethasone (DEX) (n = 13) or no DEX (n = 14) at induction of anaesthesia. Groups were well matched in terms of age, type of operation, length of CPB, cross clamp, and circulatory arrest (all P > 0.15). sFas, sFasL and interleukin (IL) 6 (a marker of cytokine response) were measured over 24 hours by double sandwich ELISA. Results: DEX significantly blunted the release of IL6 and sFas, but not sFasL. The DEX group exhibited a decreased clinical inflammatory response post CPB as evidenced by a lower temperature, less colloid requirement, chest drain loss, acidosis, hyperlactataemia and coagulopathy (all P < 0.05). Conclusion: DEX blunts IL6 and sFas but not sFasL release following CPB, attenuating clinical inflammatory response. The significance of the sFas response is unclear; this may be a passive marker of a decreased inflammatory response but decreased levels may also negatively influence apoptosis/inflammation by being less able to 'mop-up' excess membrane and soluble FasL. Methods: Pigs were assigned to a temperature (T°) group during standardized cardiopulmonary bypass (CPB): normothermia (T°, 37°C; n = 8) and moderate hypothermia (T°, 28°C; n = 8). Liver probes were taken before and 6 hours after CPB for standardand immunohistological examinations. Apoptotic cells were detected by TUNEL-staining. Intrahepatic gene expression of TNFα, IL10 and of apoptosis regulating proteins were examined by competitive RT-PCR. Results: Gene expression of cytokine and apoptosis regulating proteins was not detected before but 6 hours after CPB. Pigs operated on under 28°C showed lower TNFα-mRNA and higher IL10-mRNA than those operated on under 37°C (P < 0.05). While expression of apoptosis regulatory proteins and percentage of apoptotic hepatocytes were similar in both groups, percentage of necrotic hepatocytes was lower in 28°C than in 37°C group (P < 0.05). TNFα-mRNA after CPB was correlated with the percentage of necrotic hepatocytes (P < 0.05). 
Moderate hypothermia during CPB provides hepatic protection by increasing IL10 and decreasing intrahepatic TNFα gene expression, without affecting gene expression of apoptosis regulatory proteins. Aim: To study whether immune competence, as assessed by HLA-DR expression on monocytes and by ex vivo production of TNFα, allows prediction of complications after cardiac surgery, and to analyze the influence of cardiac surgery on immune competence. Methods: Forty patients aged 1 to 188 months undergoing cardiac surgery were enrolled. Whole blood was collected before surgery and 1, 3 and 5 days after surgery. HLA-DR expression (expressed as antibody binding capacity, Abc) was determined using a flow cytometric assay. TNFα production was assessed in the supernatant after whole blood stimulation with LPS (500 pg/ml; incubation: 4 h) with the Immulite (DPC Biermann GmbH). Severity of the operation and postop clinical condition were assessed by score. Results: Ten patients developed postop complications (group 1) and 30 did not (group 2). Preop HLA-DR (32,000 Abc) was not significantly different between groups. In contrast, preop ex vivo production tended to be lower in group 1 than in group 2 (TNFα: 363 vs 628 pg/ml, P = 0.06). In both groups, HLA-DR significantly decreased on postop day 1 (10,000 Abc, P < 0.002 vs preop) and remained significantly lower than the preop value up to postop day 5. Group 1 had a significantly greater and more persistent decrease in HLA-DR over the postop period than group 2. Severity of operation and the clinical score 4 and 24 h postop correlated negatively with postop HLA-DR but not with ex vivo TNFα production. Conclusion: Cardiac operations in children are associated with impairment of immune competence, which is more severe and persistent in patients with postop complications. Our results suggest that preop ex vivo TNFα production, but not HLA-DR expression, could be predictive of the development of postop complications. In this regard, additional study of the TNFα gene polymorphism (locus 308) is expected to allow further preoperative risk stratification. Patients having uncomplicated coronary artery bypass surgery (CABG) are extubated within 2-3 hours of surgery. Despite warming to core temperatures of 37°C post bypass, extubation is often limited by postoperative hypothermia caused by redistribution of thermal energy from core to periphery (afterdrop). We hypothesised that rewarming using axillary temperature as an endpoint may eliminate post-bypass afterdrop, thus potentially allowing earlier extubation. Methods: Following ethics committee approval, 39 patients undergoing CABG or aortic valve replacement were randomised to be rewarmed to either a nasopharyngeal temperature (NPT) of 37°C (Group A) or an axillary temperature of 35.5°C (Group B) following hypothermic cardiopulmonary bypass (CPB). Nasopharyngeal and axillary temperatures were recorded at 1 min intervals in all patients until extubation. Extubation occurred when a patient achieved cardiorespiratory stability, blood loss < 1.5 ml/kg/hour and an NPT ≥ 36.5°C. The data were statistically analysed using the Mann-Whitney U-test. Total CPB times and the lowest temperature on CPB were similar in both groups. An uncontrolled abdominal abscess after major trauma or surgery readily renders a patient septic. It is important, but difficult, to aspirate mucinous purulent abdominal fluid effectively and to keep the abscess cavity dry in order to prevent abdominal sepsis.
Formerly, we used a double-lumen tube, of the kind usually employed as a nasogastric tube, with low negative pressure. However, we could not keep the infectious space dry with this technique. Patients with abdominal infection or abscess after major trauma or major surgery were examined. We used an overcoated double-lumen drain. The tube consisted of a large outer drain with many side pores containing a small inner drain, and the tip of the inner drain was positioned so that it never extended beyond the tip of the outer drain. We aspirated this overcoated drain with the maximum negative pressure of the central suction system. Mucinous infectious fluid was aspirated together with air. We evaluated the clinical course of the patients, the condition of the infectious space, the volume of aspirate and the number of dressing changes. Results and discussion: Fourteen patients were examined. We could (1) keep the infectious space dry, (2) keep the skin around the infectious space intact, resulting in good and rapid healing, (3) accurately evaluate the volume of aspirated fluid, which made it easy to follow the healing course, and (4) reduce the number of dressing changes, resulting in cost savings. Overcoated double-lumen drainage is useful for aspirating mucinous infectious fluid effectively, for keeping the infectious space dry, for reducing the infectious space and, consequently, for preventing abdominal sepsis. Table: 4.2 ± 0.5; 7.9 ± 0.5; 100. * P < 0.05, mean ± SD. Purpose: In this study, we evaluated the effect of FR167653, a potent suppressant of TNF-α and IL-1 production, on lipopolysaccharide (LPS)-induced lung injury and lethality in rats. Male Sprague-Dawley rats weighing 200 to 270 g were used. Animals in the LPS-only and LPS/FR groups received 6 mg/kg of LPS intravenously. The animals in the LPS/FR group also received an infusion of FR167653 at 0.2 mg/kg/hour, commencing 30 min prior to the LPS injection and continuing for 5.5 hours. LPS significantly induced the accumulation of pulmonary neutrophils and lung edema, both of which were significantly attenuated by treatment with FR167653. FR167653 also significantly decreased LPS-induced lethality. Histologically, tissue damage was milder in the LPS/FR group than in the LPS-only group. Serum levels of TNF-α and IL-1β were suppressed in the LPS/FR group compared with the LPS-only group. Western blot analysis revealed that FR167653 inhibited the phosphorylation of p38 MAP kinase in lung tissues. Conclusions: FR167653 administration resulted in a decrease in serum TNF-α and IL-1β levels that was associated with decreased lung injury and lethality. The mechanism responsible for the decreased TNF-α and IL-1 may be related to the inhibitory effect of FR167653 on p38 MAP kinase activation. Dedicated organ dysfunction scores such as the Multiple Organ Dysfunction (MOD) score [1] objectively measure morbidity in critically ill patients. The MOD score is constructed using simple physiologic measures of dysfunction in six organ systems and correlates in a graded fashion with the ICU mortality rate. Serum interleukin (IL)-6 levels have been proposed as a biochemical marker of the severity of the multiple organ dysfunction syndrome. Purpose: To assess the agreement of the MOD score with IL-6 levels as baseline prognostic indicators in patients enrolled in the MONARCS trial.
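As a rough illustration of how an aggregate score of this kind is assembled, the sketch below sums six organ-system components (each graded 0-4) and flags the ≥ 9 threshold used in the analysis that follows; the component values are hypothetical and the published MOD cut-offs for grading each system are not reproduced here.

# Minimal sketch of an aggregate organ-dysfunction score: six components
# graded 0-4 are summed, and an aggregate score >= 9 flags significant
# multiorgan dysfunction. Component values below are hypothetical; the
# actual MOD component cut-offs are not shown.
ORGAN_SYSTEMS = ["respiratory", "renal", "hepatic",
                 "cardiovascular", "hematologic", "neurologic"]

def aggregate_mod(component_scores):
    assert set(component_scores) == set(ORGAN_SYSTEMS)
    assert all(0 <= s <= 4 for s in component_scores.values())
    return sum(component_scores.values())

patient = {"respiratory": 3, "renal": 2, "hepatic": 1,
           "cardiovascular": 2, "hematologic": 1, "neurologic": 0}
score = aggregate_mod(patient)
print(score, "significant MODS" if score >= 9 else "below threshold")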
The MONARCS trial was a double-blind, placebo-controlled trial designed to evaluate the safety and efficacy of afelimomab, an anti-tumor necrosis factor (TNF)α antibody. Before randomization, patients were stratified by the results of a rapid semiqualitative test (Septest) measuring serum IL-6 levels. A positive (+) Septest indicates elevated IL-6 levels. An aggregate MOD score ≥ 9 defines significant multiorgan dysfunction syndrome. The mean baseline score, the percentage of patients with a score ≥ 9, and the strength of agreement of each component of the aggregate MOD score were compared between patients who were Septest (+) and Septest (-) at baseline. The difference (delta) between the percentage of Septest (+) and (-) patients with the highest scores for each component of the MOD score was used to test the strength of agreement between IL-6 levels and each organ system component of the aggregate MOD score. Results: A total of 2634 patients were enrolled. Of these, 998 (37.9%) had elevated IL-6 levels and 1636 (62.1%) did not. The baseline mean MOD score was higher in Septest (+) patients than in Septest (-) patients, 8.77 vs 6.64 (P < 0.001). A baseline aggregate MOD score ≥ 9 was found in 52% of Septest (+) versus 28% of Septest (-) patients. The cardiovascular, neurologic and respiratory components of the MOD score appeared to show the best agreement with baseline IL-6 levels. Conclusion: Despite the lack of strong agreement between certain organ system components of the MOD score, the aggregate MOD score and Septest strongly agree at baseline. By inference, therefore, the results suggest that the Septest might also be a useful marker for ICU mortality. Purpose: 1) To determine if differences in baseline characteristics exist between patients with and without hypercoagulable sepsis, and 2) to explore the effect of afelimomab (an anti-TNFα antibody) on mortality of hypercoagulable and non-hypercoagulable septic patients. A post hoc analysis identified patients with a hypercoagulable state in the overall population enrolled in a large placebo-controlled sepsis trial investigating the safety and efficacy of afelimomab. A hypercoagulable state was defined by (1) platelet count below 140,000 per µl plus (2) D-dimer > 250 ng/ml. Patients were randomized to receive placebo or afelimomab. Results: A total of 2634 patients were enrolled, and 1313 (49.8%) had baseline determinations of both platelets and D-dimer. The table describes the baseline characteristics of the two patient groups. Compared to those without laboratory evidence of consumption coagulopathy, patients with a hypercoagulable state had higher organ dysfunction scores, higher interleukin-6 levels and higher frequencies of positive blood cultures. Among the patients with evidence of consumption coagulopathy, 283 were treated with placebo and 311 with afelimomab. Mortality at 28 days was 43.8% vs 38.6% in placebo and afelimomab patients, respectively. In patients not meeting the definition of hypercoagulability, 376 and 343 received placebo and afelimomab; mortality was 35.9% and 26.5%, respectively. Septic patients with evidence of consumption coagulopathy had more organ dysfunction and a higher rate of positive blood cultures. Afelimomab appears to be beneficial in reducing the mortality of septic patients with a hypercoagulable state. PMX-DHP (endotoxin adsorption method) is used for the treatment of patients with sepsis and septic shock primarily caused by Gram-negative infections, and its effectiveness has been evaluated in Japan.
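Returning to the hypercoagulable-state definition used in the afelimomab analysis above (platelet count below 140,000/µl together with D-dimer > 250 ng/ml), the classification reduces to a two-condition rule; a minimal sketch with illustrative values:

# Minimal sketch of the hypercoagulable-state rule from the afelimomab
# analysis above: platelets < 140,000/µl AND D-dimer > 250 ng/ml.
# The example values are illustrative only.
def hypercoagulable(platelets_per_ul, d_dimer_ng_ml):
    return platelets_per_ul < 140_000 and d_dimer_ng_ml > 250

print(hypercoagulable(95_000, 410))   # True  -> meets the definition
print(hypercoagulable(180_000, 410))  # False -> platelet criterion not met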
In this study, patients with septic shock caused by intraperitoneal infections were classified as a group in which septic shock was caused by biliary tract infections (Group C) or a group in which septic shock was caused by perforation of the lower digestive tract (Group P). Both groups of patients were assessed for the clinical effect of PMX-DHP, inflammatory mediators and markers related to vascular endothelial cell function. Cases in which PMX-DHP was performed consisted of seven in Group C (age: 69 ± 11 years; mean no of treatments: 1.6) and 18 in Group P (age: 73 ± 24 years; mean no of treatments: 1.5). Results: There were no significant differences observed in APACHE-II scores with Group C having a score of 28.9 ± 8.5 and Group P having a score of 27.1 ± 8.2. However, Goris MOF scores were obviously higher in Group C, with a score of 8.5 ± 1.5 for Group C and a score of 4.9 ± 2.0 for group P. The number of days from the onset of shock until PMX-DHP was performed was also greater in Group C (1.6 ± 0.8 days versus 0.9 ± 0.8 days). Although blood pressure elevating effects were observed in both groups, the magnitude of those effects was higher in Group P. The amount of catecholamine use was also obviously greater in Group C than in Group P. Although Group C exhibited higher endotoxin (ET) values than Group P, there were no significant differences observed. The values of vascular endothelium markers (ICAM-1, ELAM-1, thrombomodulin, PAI-1 and NOx) before the start of PMX-DHP were higher in Group C. Although IL-6 tended to decrease before and after PMX-DHP in both groups, there were no significant differences observed. However, IL-6 in Group P demonstrated a significant decrease at 24 hours after PMX-DHP. IL-1ra tended to decrease before and after PMX-DHP in both groups, and values before and after PMX-DHP in Group P were roughly twice that of those in Group C. Conclusion: Vascular endothelial cell activation and the degree of its impairment were considered to be greater in Group C than in Group P. Background: First intention treatment for early acquired respiratory infections diagnosed in an ICU is usually made without microbiological identification. Microbiological profile is important in supporting therapeutic options. Objectives: Study of the microbiological profile and effectiveness of first intention antibiotic therapy in early respiratory infections. Retrospective clinical study of 385 patients admitted to our ICU, from January to September 2000; selection of those with respiratory infection diagnosed in the first 72 hours of admission. Tracheobronchitis was defined by 'new' purulent bronchial secretions, with fever (> 38.5°C, axilar) and leucocytosis (> 10,000), and pneumonia by the three mentioned criteria plus 'new' pulmonary infiltrate. For these patients we reviewed the microbiological profile of specimens collected in the first 72 hours and the antimicrobial therapy. First intention treatment was in the vast majority amoxicilin + clavulanate (72 patients -68.6%). Initial therapy was changed according to antibiogram in 5 patients and in 18 patients (17.1%) it was necessary to change the therapy without microbiological guidance. Early respiratory infection is a frequent diagnosis in our ICU (27.3%). The microbiological profile in our patients shows, in 93% of the cases, organisms usually acquired in the community. 
Amoxicilin and clavulanate as first intention therapy appeared to be a good option (it failed only in 7% of the patients in whom it was used), and according to our microbiological profile it seems to be a good choice. The use of burn clinical pathways has helped to improve quality of care and decrease costs in the Burn Center. However, despite implementation of these pathways, certain 'ritual care' persists as part of the 'burn culture'. Specifically, the collection of wound cultures of acute burns at admission and the collection urine, blood, sputum, and wound cultures ('pan-culturing') on patients with temperatures greater than 38.5°C in the first 24 hours of admission are considered standard of care despite the lack of scientific data supporting these practices. The objectives of this study are to (1) establish the proportion and cost of cultures obtained in the first 24 hours after acute burn injury that yield positive microbiological cultures and (2) determine the utility of pan-culturing for temperatures greater than 38.5°C in the first 24 hours following burn injury. Design: Retrospective, computer-assisted chart review. Setting: University-based burn center. The records all burn injuries evaluated at the Burn Center or in the Emergency Department between 1/1997 and 1/1998 were retrospectively identified by ICD-9 release codes. Patients presenting with evidence of infection and pediatric burn patients were excluded. Data evaluated included: extent of burn injury, length of stay (LOS), documentation of initial cultures, culture results, and intervention/treatment. Design: Retrospective, computer-assisted chart review. Setting: University-based burn center. The records of all burn injuries admitted to the Burn Service between Jan 1997 and Jan 1998 were retrospectively identified by ICD-9 release codes. Patients with length of stay less than 1 day, or who had a bed assignment other than in the burn center were excluded. Patients with at least one surveillance culture were included. Surveillance cultures were defined as cultures obtained on Wednesdays and Saturdays per Burn Center protocol. Data collected included: length of stay, extent of burn injury, documentation of surveillance culture results, documentation of signs and/or symptoms of cellulitis and intervention and treatment. A P value < 0.05 was considered statistically significant. Results: 151 patients were identified. Eighty patients met the inclusion criteria. A total of 179 surveillance cultures were col-lected. 89% (71/80) of study patients received antimicrobials during their hospital course and 82% (58/71) had clinical signs of cellulitis. 91% (53/58) of patients with clinical signs of cellulitis were treated with antimicrobials. Most of these patients (86%) received empirical antimicrobials for cellulitis, based solely on clinical judgment not on culture results. In only three cases (1.6%) were orders for antimicrobials initiated or changed on the basis of wound surveillance cultures. Patients with surveillance culture were significantly more likely to receive antimicrobials than those who were not cultured (39.6% vs 1.7% P = 0.001). However, among those patients with surveillance cultures (n = 179), there were significantly more patients who did not receive any antimicrobials (60.3% vs 39.7% P = 0.001) There was no significant relationship between a positive or negative surveillance culture and orders for antimicrobials (P = 0.097). 
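The 2 × 2 proportion comparisons reported above (for example, antimicrobial orders in patients with versus without surveillance cultures) can be checked with a chi-squared test; the counts below are hypothetical placeholders chosen only to give percentages of roughly the reported size, since the abstract reports percentages rather than raw counts.

# Minimal sketch of a 2x2 proportion comparison of the kind reported above
# (antimicrobial orders vs surveillance-culture status). Counts are
# hypothetical placeholders, not the study data.
from scipy.stats import chi2_contingency

#              antimicrobials  no antimicrobials
table = [[40, 61],   # surveillance culture performed (hypothetical)
         [ 1, 58]]   # no surveillance culture        (hypothetical)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, P={p:.4f}")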
The clinical management of burn wounds is not significantly altered by the results of routine surveillance wound cultures. The diagnosis of cellulitis is based on clinical judgment and treatment is initiated empirically. Omitting twice-weekly routine surveillance cultures would result in potential savings of $25,550.00 (US) and would not compromise the quality of patient care. Methods: In a retrospective study (01/1992-12/1998), attributable mortality for MSSA and MRSA bacteremia was investigated and compared in critically ill patients. Two independent casecontrol studies were performed. Matching (1:2-ratio) was based upon APACHE II-score and admission diagnosis. As expected mortality can be derived from these two variables, this matching procedure resulted in an equal expected mortality rate for cases and controls. Attributable mortality is determined by subtracting the in-hospital mortality rate of the controls from the in-hospital mortality rate of the cases. Results: During the study period 22,431 patients were admitted to the ICU. In 85 patients a microbiologically documented S. aureus bacteremia was diagnosed. In the MSSA case-control study, an attributable mortality of 1.3% was found: mortality in cases (n = 38) and controls (n = 76) was respectively 23.7% and 22.4% (P = 0.937). In the MRSA case-control study an attributable mortality of 23.4% was found: mortality rates for cases (n = 47) and controls (n = 94) were respectively 63.7% and 40.4% (P = 0.017). The difference between both attributable mortality rates (22.1%) was statistically significant (95% CI: 8.8-35.3%). In critically ill patients, MRSA bacteremia have a significantly higher attributable mortality than MSSA bacteremia. Aim: To study the use of antibiotics in a neonatal intensive care unit. All neonates receiving antibiotic treatment over a 6 month period were included in this prospective study. The information gathered included the indications and criteria for initiating antibiotic therapy (clinical, biological and bacteriological), duration of treatment and the reasons for withdrawal of treatment. The prescription/written procedures correlation was judged a posteriori by a non-prescribing physician. One hundred and thirty-five infants were included, of whom eight received several courses (2-5 courses) at different times. Mean gestational age was 33.5 weeks (24.5-41.5 weeks). In 82% of cases admission was for respiratory distress. The main indications for antibiotic treatment were 1) primary infection (78%, of which 92.4% were for suspected feto-maternal infection), 2) nosocomial infection (17% of which occurred in neonates born at less than 28 weeks' GA and hospitalized in the unit for more than 21 days) and 3) post-operative prophylaxis (5%). Clinical criteria predominated in the primary infections but in 62% of cases fetomaternal infection was not confirmed, justifying withdrawal of treatment after 3 days. Bacteriological criteria played a part in the treatment decision only in nosocomial infections (14/23 newborns). Infringement of treatment guidelines represented 9% of infants treated and mainly involved length of treatment. 
These results show that 1) the majority of antibiotic treatments for suspected feto-maternal infection were for unconfirmed infection, demonstrating the need to refine the diagnostic criteria, 2) the level of nosocomial infection was low (7-30% in the literature) and occurred mainly in very premature infants, and 3) the rate of guideline infringements of fewer than 10% emphasizes the value of continuous surveillance of antibiotic use in neonatal intensive care units. Objective: To determine ICU-acquired infections and antibiotic susceptibility under an antibiotic therapy policy with patient-to-patient rotation. We compared our data with the EPIC study [1]. Setting: A 20-bed medical-surgical intensive care unit (ICU). Patients: All patients admitted to the ICU from 1-1-1999 to 30-7-1999 and from 16-6-2000 to 16-10-2000, with a length of stay in the ICU longer than 24 hours. Infections were diagnosed according to the criteria of the CDC. Table. Conclusions: An antibiotic therapy policy with patient-to-patient rotation can be useful for controlling the infection map in the ICU. The prospective cross-over study was carried out in 34 patients with severe pneumonia (n = 26) or bacteremia (n = 8). They were randomized to receive cefepime 4 g per day either as a continuous infusion (Group 1, n = 17) or as intermittent administration of 2 g twice daily (Group 2, n = 17), in combination with amikacin 15 mg/kg/day in both groups. Patients were comparable in terms of age, sex, initial infectious disease, IGS II score and MIC of the Gram-negative bacilli isolated. Clinical outcomes (mechanical ventilation, duration of ICU stay and clinical recovery) were assessed along with pharmacokinetic (24-hour AUIC, 12-hour AUIC) and pharmacodynamic (T > MIC and T > 5 MICs) parameters in both groups and compared (chi-squared and Mann-Whitney U-tests). Results with P < 0.05 were considered significant. Results: Mechanical ventilation, clinical recovery (13 vs 11), bacteriologic eradication (12 vs 10) and duration of ICU stay (35 vs 38 days) were better in Group 1 but did not differ significantly between the two groups. Neither did the 24-hour AUIC (569 vs 414) nor the 12-hour AUIC (218 vs 202). However, T > MIC in Group 1 (23.8 ± 0.2) was significantly higher (P < 0.05) than in Group 2 (20.4 ± 3). T > 5 MICs in Group 1 (23.6 ± 0.6) was also very significantly higher (P < 0.01) than in Group 2 (16.7 ± 6). Introduction: Leptospirosis is generally found in tropical regions but can also occur in temperate regions. It is caused by all kinds of leptospires and is, in general, a self-limited disease. However, reports of important complications such as acute respiratory failure (ARF), associated or not with other organ dysfunction, have increased in recent years and have been associated with a high mortality rate [1]. Objective: The goal of this paper is to evaluate the clinical characteristics, morbidity and mortality of severe leptospirosis associated with ARF in two general ICUs of two general hospitals. Methods: All cases with a diagnosis of leptospirosis, confirmed by blood macroagglutination test, associated with ARF and admitted from January 1990 to October 2000 to two general ICUs of two general hospitals were studied. The clinical and laboratory characteristics, the associated organ dysfunctions and the mortality rate were analyzed. Survivors were compared with non-survivors. Quantitative variables were compared by unpaired t-test and qualitative variables by a chi-squared test. The level of significance was P < 0.05.
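The survivor versus non-survivor comparisons described above use an unpaired t-test for the quantitative variables; a minimal sketch with hypothetical values rather than the study data:

# Minimal sketch of the unpaired t-test used above for quantitative
# variables (e.g. age, survivors vs non-survivors). Values are hypothetical
# placeholders, not the study data.
from scipy.stats import ttest_ind

age_survivors     = [28, 35, 41, 30, 52, 37, 29, 44]
age_non_survivors = [45, 58, 39, 61, 50, 47, 55, 42]

t, p = ttest_ind(age_survivors, age_non_survivors)  # assumes equal variances
print(f"t={t:.2f}, P={p:.4f}")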
We describe 34 adult patients, aged 39 ± 16 years, 28 men and 6 women. The most frequent clinical manifestations were dyspnea (n = 32), fever (n = 31), myalgias (n = 29), jaundice (n = 28), hemoptysis (n = 25) and cough (n = 25). All patients showed ARF requiring mechanical ventilation (PaO2/FIO2 = 169 ± 73, with diffuse pulmonary infiltrates in all cases) as well as some degree of other organ dysfunction, such as hepatic (n = 26), renal (n = 25), cardiovascular (n = 22), hematological (n = 20) and neurological (n = 11). The mortality rate was 53% (n = 18). The comparison between non-survivors and survivors showed that non-survivors were older (P < 0.05) and had: 1) a higher number of organ dysfunctions, principally a higher incidence of renal, cardiovascular and neurological failure (P < 0.05); 2) higher levels of acidosis (P < 0.05); 3) higher use of invasive mechanical ventilation and positive end-expiratory pressure (P < 0.05). In endemic regions leptospirosis has to be considered as a cause of ARF as well as a cause of other associated organ dysfunction. Leptospirosis associated with ARF has a high mortality rate, mainly when associated with other organ failures. and presented more chronic obstructive pulmonary disease (P < 0.05) and alcoholism (P < 0.05) than group 1. There was no significant difference between groups considering other concomitant diseases (ischemic cardiopathy, diabetes mellitus, systemic arterial hypertension). Group 2 presented longer PI (P = 0.04), OP (P = 0.02), SP (P < 0.001), PICU (P < 0.001), Pbenzo (P < 0.001), Pcur (P < 0.001), and PMV (P < 0.001) than group 1. During the clinical evolution, group 2 presented a higher incidence of arrhythmias and of respiratory and urinary infections than group 1 (P < 0.001). Group 1 presented more pneumothorax and cardiopulmonary arrest (P < 0.001) than group 2, probably secondary to problems with mechanical ventilators. There were no differences between the groups regarding hypertension, hypotension and renal failure. The mortality rate was 36.5% in group 1 and 16.8% in group 2 (P < 0.005). 1) The reduced mortality in group 2 (P < 0.005) is probably related to advances in ICU resources (advanced ventilators and early physiotherapy). 2) The higher incidence of infections in group 2 is probably related to the longer ICU stay, longer period of mechanical ventilation and use of invasive procedures. The time of treatment in the ICU was 7.7, 10.6 and 5.5 days in the candidemia, OI and endophthalmitis groups, respectively. The kind and duration of antifungal treatment differed between the groups analyzed. Sepsis is one of the main causes of death in developed countries. It is commonly associated with disseminated intravascular coagulation (DIC), caused by consumption of coagulation factors and inhibitors such as antithrombin (AT), and production of interleukin-6 (IL-6) and interleukin-8 (IL-8) has been shown to correlate positively with the severity of disease. Large doses of antithrombin (AT) reduce mortality and morbidity in septic patients and there is increasing evidence to suggest that AT has anti-inflammatory properties in addition to its anticoagulant properties. In a previous in vitro study, we found that AT inhibits tissue factor and interleukin-6 production induced by lipopolysaccharide (LPS). In the present investigation, we studied the effects of AT on LPS-induced IL-8 production in three in vitro cellular systems.
Citrated whole blood, human umbilical vein endothelial cells (HUVECs) and mononuclear cells (MNCs) were stimulated with LPS for 4-6 hours in the presence and absence of 0-40 IU/ml AT. IL-8 was measured by ELISA. In all three systems, AT dose-dependently inhibited IL-8 production, with the greatest inhibition (98.7 ± 5.2% at 40 IU/ml) observed for the whole blood system and the least inhibition seen with MNCs (8.7 ± 21.7% at 40 IU/ml). RNA extraction from a time-course whole-blood experiment, followed by detection of IL-8-specific mRNA, showed that in the absence of AT, mRNA for IL-8 was apparent after 30 min incubation with LPS. However, the level of IL-8 mRNA was found to decrease with increasing concentrations of AT. These results imply that the inhibition of IL-8 antigen production by AT is due to suppression of mRNA and indicate that the anti-inflammatory activity of AT also extends to inhibition of IL-8, an important cytokine implicated in neutrophil migration. Background: Neutrophil activation is a crucial step in the pathogenesis of sepsis and the subsequent development of multiple organ failure. Antithrombin III (ATIII) exerts direct effects on neutrophils by inhibiting chemokine-induced migration. The positive outcome of animals in models of severe sepsis treated with ATIII may be due to this neutrophil-dependent action. The aim of the present study was to determine the potency of different ATIII preparations in inhibiting neutrophil chemotaxis compared to monoclonal antibody-purified ATIII. Methods: Human neutrophils were isolated using standard preparation methods. Cell migration was tested in modified Boyden microchemotaxis chambers bearing nitrocellulose filters in the leading front assay. Human neutrophils were incubated with seven different ATIII preparations at various concentrations (1 µIU/ml to 5 IU/ml) for 20 min. Immuno-purified ATIII served as a positive control. After washing twice, neutrophils migrated toward interleukin-8 (1 nM) for 30 min in a humidified atmosphere at 37°C. After staining of the cells, migration depth was measured microscopically. Results: At concentrations below 10 mIU, neutrophil chemotaxis toward interleukin-8 was decreased by the ATIII preparations with different potencies, whereas at higher concentrations (1 IU and 5 IU) no significant differences could be observed. Deactivation of neutrophil chemotaxis was most pronounced by Kybernin® P (Aventis Behring, Marburg, Germany) at 100 µIU and was comparable in potency to homologous deactivation with IL-8. The purified ATIII inhibited interleukin-8-induced chemotaxis at all concentrations tested (1 µIU to 5 IU). We suggest that the anti-inflammatory activity of ATIII may be due to deactivation of chemokine-induced leukocyte migration. Commercially available ATIII preparations at distinct concentrations show significant differences in their ability to deactivate neutrophil chemotaxis toward interleukin-8. This may also suggest different activities in vivo, depending on the preparation procedure. Background: Antithrombin III exerts direct effects on neutrophils by inhibiting chemokine-induced migration [1]. Whether ATIII directly affects the migratory behaviour of other types of leukocytes is unknown. We investigated the effect of ATIII on spontaneous and chemokine-triggered migration using RANTES and interleukin-8 as attractants of lymphocytes, and RANTES and monocyte chemotactic peptide-3 as attractants of monocytes, in modified Boyden chamber micropore filter assays.
Lymphocyte and monocyte populations were purified from human peripheral blood. ATIII signaling in leukocyte migration was studied by blocking signaling enzymes with staurosporine, GFX, wortmannin and rolipram. As ATIII, the concentrate Kybernin® P and antibody-purified ATIII derived from it were used. Results: Pretreatment of lymphocytes with ATIII slightly augmented random locomotion, whereas chemotaxis toward optimal concentrations of RANTES or IL-8 was significantly inhibited by pretreatment of the cells with ATIII followed by washing. Significant inhibition of chemotaxis was seen at ATIII concentrations as low as 10 nU/ml. Exposure of lymphocytes to gradients of ATIII stimulated migration in the absence of additional chemokines. Pretreatment of monocytes with ATIII before triggering of directed migration revealed similar findings, with ATIII again being active at low concentrations. In the absence of chemokines, ATIII again activated monocytes' directed migration. This ATIII-induced augmentation of migration was used to investigate signaling events induced in the cells by preincubation with various enzyme blockers: in contrast to neutrophils, where ATIII effects are mediated by protein kinase C and cAMP, responses of monocytes were wortmannin- and rolipram-sensitive; lymphocytes were additionally affected by GFX. Conclusion: ATIII directly affects monocyte and lymphocyte functions in vitro. ATIII inhibits chemokine-stimulated migration of the two peripheral blood mononuclear cell populations. Thus, cellular effects of ATIII may occur not only in neutrophils but also in other immune cell populations. Signal transduction may be cell type-dependent, as it differs between neutrophils, lymphocytes and monocytes. A specific pathway for direct cellular activation by ATIII is postulated. 1. Dunzendorfer et al: Cell-surface heparan sulfate proteoglycan mediated regulation of human neutrophil migration by the serpin antithrombin III. Blood (in press). Introduction: Coagulopathy is frequently seen in ICU patients. These patients may increase ICU costs considerably through their greater need for blood products. Detailed information about the incidence and consequences of coagulopathy would be useful in predicting prognosis and resource consumption in this cohort. Objective: To determine the incidence, severity, prognosis and therapeutic implications of coagulopathy in a 12-bed medical (non-coronary) ICU. In a prospective observational study over 13 months (1.11.1997-30.11.1998) we evaluated all patients (pts) who stayed longer than 48 h in the ICU. Coagulopathy was defined as an abnormal prothrombin time (PT < 70%, Quick's method, corresponding to an INR > 1.26). Pts with thrombolytic therapy or preexisting therapeutic anticoagulation were excluded. Results: Defined by PT (INR), coagulopathy was diagnosed in 187 of 231 cases (81%). Of these pts, 55 had ICU-acquired coagulopathy (24%). Coagulopathy was classified as mild in 78 (34%), moderate in 72 (31%) and severe in 37 (16%) (Table). PT values normalized by ICU discharge or death in 77 (41%) pts with coagulopathy. In patients with a relative drop in PT of ten or more percent between days 1 and 3, mortality was significantly higher than in patients with a smaller or no decrease, irrespective of absolute PT values (mortality: 28 of 66 vs 33 of 165, P < 0.001, OR 3.0, 95% CI 1.6-5.5; χ2 test). The incidence of coagulopathy in our pts is high (81%). Moderate and severe, but not mild, coagulopathy are associated with increased mortality.
Even a slight decrease in PT during the first 3 days is associated with higher mortality and may be regarded as an early warning sign. Haemostatic failure, secondary to large volume fluid replacement, is a major contributor to the mortality and morbidity associated with blunt trauma. Progressive bleeding in multiply injured patients is due to both dilution effects and specific inhibitory effects of the colloids used on platelet function. Recombinant factor VIIa (rVIIa) is seen increasingly as a possible universal haemostatic agent that could act to reverse or prevent the haemostatic failure associated with dilution and the direct effects of the colloids within the 'golden' hour of haemorrhagic shock. We have conducted a pilot preclinical study to evaluate the potential role of rVIIa as a universal haemostatic agent in a model of large volume fluid replacement using thrombelastography (TEG). TEG is a method of global haemostasis assessment, providing information on the rate of clot formation, clot strength and durability. Whole blood samples from normal donors were tested undiluted (100%) or diluted (50% and 80%) with standard colloid replacement solutions (Haemacel, albumin, Gelofusine, hydroxyethyl starch) and normal saline. Global haemostasis was assessed in the TEG, with or without 90 µg/kg rVIIa added. In undiluted blood (100%) there were no statistically significant changes in any TEG parameter when rVIIa was added. At dilutions of > 50%, addition of rVIIa significantly improved the kinetics of clot formation and the rate of platelet reactivity (P < 0.05), although the time to the start of coagulation and final clot strength were not significantly different. The beneficial effects of addition of rVIIa did not differ between the different fluid replacement solutions. Addition of rVIIa therefore appears to improve markers of global haemostasis in this model of large volume fluid replacement. Further work is required to assess its potential value as a universal haemostatic agent in the setting of blunt trauma and large volume fluid replacement. Based on the above data, normal transferrin receptor levels in anemic septic ICU patients indicate a minor role of iron deficiency in the aetiology of anemia in these patients. Measurement of sTfR usually provides information on the etiology of anemia; a definitive distinction between iron deficiency anemia and the anemia of chronic disease otherwise requires a bone marrow examination to determine iron status, and to avoid the cost and discomfort of this method, measurement of sTfR can be used as a reliable index. In this study we observed that iron deficiency anemia has a low incidence in critically ill patients. Background: Allogenic blood transfusion is associated with many risks. To reduce its use, acute normovolemic haemodilution (ANH) and intraoperative blood salvage have been recommended. Aim: The reduction of allogenic blood transfusion during operations on the ascending aorta by the use of ANH, intraoperative blood salvage and the antifibrinolytic drug aprotinin. During the observation period (Jan 1997-Oct 1999) there were 83 patients (13 female and 70 male), aged 56.2 ± 9.8 years, who underwent ascending aorta surgery. Exclusion criteria for ANH were a haemoglobin concentration < 11 g/dl and haemodynamic instability. All patients were premedicated with midazolam 5 mg and meperidine 50 mg i.m. Anaesthesia: midazolam 0.1 mg/kg, fentanyl 5 µg/kg, and pancuronium 0.1 mg/kg; maintenance: the same drugs plus isoflurane 0.5-1 vol%.
ANH was started after induction of anaesthesia and completed before the beginning of the operation. Autologous blood, 15 ml/kg, was withdrawn from the radial artery into standard autologous blood collection sets containing CPDA-1. Gelatin and Hartmann's solutions were infused to maintain the baseline central venous pressure (CVP). Aprotinin 10^6 IU was given i.v. and 2 x 10^6 IU was added to the solution for extracorporeal circulation. Autologous blood was processed into packed red blood cells (PRBC), plasma and platelets. Arterial pressure, heart rate and CVP were continuously measured. A cell saver was installed in all patients. Arterial blood samples were obtained for analysis of blood gases, haemoglobin concentration, platelet count, electrolytes, acid-base status and activated coagulation time. PRBC were given toward the end of the operation and in the postoperative period when the haemoglobin concentration was < 8.0 g/dl. Statistical analysis: All data are expressed as mean values (SD). Results: The ANH procedure lasted 20 ± 5 min and 1350 ± 150 ml of autologous blood was withdrawn. Three units of allogenic PRBC were given to 59 (71.1%) of the 83 patients during their hospital stay, while no allogenic blood had to be given to the remainder (24 patients, 28.9%). The efficacy of ANH in reducing allogenic blood transfusion is still a debated issue [1]. Our data confirm that ANH has a beneficial effect on the reduction of allogenic blood transfusion. Conclusion: ANH is a useful method for saving allogenic blood in major surgery. The hyperdynamic patients had statistically significant reductions in hematocrit (P < 0.001) and platelets (P < 0.001); this was associated with an increased incidence of return to the OR for bleeding/tamponade (9.4 vs 2%, P < 0.001). Although overall morbidity was increased (5.7 vs 2.5%, P < 0.001), this was not accompanied by significant increases in mediastinitis or blood stream infections (P = 0.09), ARDS (P = 0.25) or MODS (P = 0.09). Conclusions: Hyperdynamic circulation, presumably secondary to a CPB-induced inflammatory response, is associated with increased postoperative hemorrhage and morbidity but not with increased susceptibility to MODS. Elucidation of the unique mechanisms regulating systemic vasodilatation following CPB may point to novel strategies that attenuate CPB-mediated inflammation. Background: Septic shock is characterised by increased systemic fibrinolytic activity. This study in endotoxemic pigs was designed to: 1) describe regional tPA (tissue-type plasminogen activator) activity; 2) assess changes in tPA activity following aggressive volume resuscitation. Results: CO decreased from 4.1 ± 0.3 l/min to 2.7 ± 0.1* during sepsis and was restored to 4.7 ± 0.1 by volume resuscitation. Parallel changes were observed in QMES (from 0.9 ± 0.1 l/min to 0.5 ± 0.1*, and back to 1.1 ± 0.2) and QREN (from 135 ± 15 ml/min, to 62 ± 20*, and back to 124 ± 25). QHEP was maintained at 1.1 ± 0.1 to 1.3 ± 0.1 l/min by an effective hepatic arterial buffer response. Net regional plasma fluxes of tPA are shown in the Table. Conclusions: Fibrinolytic activity increases early in sepsis, as shown by a net pulmonary and mesenteric release of tPA. Increased net hepatic uptake would mask the pre-hepatic changes in tPA. Early volume resuscitation in sepsis is able to completely reverse all these changes. The findings demonstrate the dynamics of hemostasis during sepsis.
Early interventions to restore hemodynamic stability are important to maintain normal fibrinolytic activity. Conclusions: Hextend may be used as part of a large volume resuscitation strategy in diverse patient populations, including those with traumatic brain injury. Hextend helps minimize the hyperchloremia associated with massive volume resuscitation without significantly perturbing the serine protease-dependent coagulation profile. Results: A total of 390 g HES per patient was administered and well tolerated by all patients in the study group. Only one serious adverse event was noted, in the crystalloid group, for which a relation with the study medication was unlikely. The most frequent adverse event was itching, reported by two patients in the crystalloid group and three patients in the HES group. At baseline all hemostatic parameters were within the normal range, and no clinically relevant changes were observed with respect to any parameter between the two treatment groups. Purpose: To determine if ultrasound can detect muscle atrophy, measured as a decrease in muscle thickness, in muscles that are known to undergo atrophy in ICU patients. Twelve women, mean age 55 ± 14 years, who entered the ICU for diverse reasons, were not under mechanical ventilation, did not receive corticosteroids and/or muscle relaxants, and received either total parenteral nutrition or enteral nutrition at 30 kcal/kg/day with 1 g protein/kg/day, were chosen. All patients had 1 hour of bedside passive physiotherapy. Ultrasound measurements were taken at the moment of admission and at the moment of discharge from the unit. The intercostal muscle group thickness was measured at the 6th intercostal space, and the biceps brachialis muscle thickness at its point of maximum width. Both sets of measurements were recorded and analyzed with the unpaired t-test. The P value for the intercostal muscle group did not show any statistically significant difference, while the P value for the biceps brachialis showed a statistically significant difference. Ultrasound can detect a decrease in muscle thickness in muscle groups known to show muscle atrophy in patients who enter the ICU. This decrease in thickness is not seen in muscle groups that do not undergo muscle atrophy. Patients who were fed parenterally for more than 2 weeks without enteral feeding because of their disease (mostly uncomplicated intestinal fistula after surgical intervention, with the possibility to start enteral feeding) were studied after obtaining informed consent. The parenteral nutrition was fortified by 4200 kJ given parenterally on days 0-2 and then enterally with the same content (Nutrison 1000 ml) on days 7-14. The local Ethics committee approved this research project. Comparisons between the parenteral + parenteral period and the parenteral + enteral period in several serologic and urine parameters were calculated. Wilcoxon's paired test was used for statistical analysis. Results: HDL cholesterol (0.61 vs 0.72 mmol/l), apoprotein A (0.63 vs 0.71 g/l) and insulin-like growth factor (IGF-1) (291.7 vs 321.4 ng/ml) were significantly higher in the enteral period (P < 0.05). The urinary outputs of urea (551 vs 489 mmol/day), P (31 vs 24 mmol/day) and Na (418 vs 220 mmol/day) were significantly lower during the enteral period (P < 0.05). Enteral nutrition is thus also associated with some anabolic effects in comparison with parenteral nutrition alone. The higher level of IGF-1 is the main anabolic marker of enteral nutrition in our study.
The decline in urea output suggests enhanced protein synthesis (probably mostly in the intestinal mass). We explain the higher levels of HDL cholesterol and apo A during enteral feeding by enhanced synthesis of cholesterol in the intestine. The metabolic function of the intestine may be important for intensive care patients. Because tolerance of enteral feeding changes quickly during the critical situation, we prefer a combination of parenteral and enteral nutrition in unstable intensive care patients. Supported by Grant IGA-MZ-CR-4788-3/98. Critical Care Directorate, University Hospital of Wales, Cardiff, UK. The preferred method of nutritional support in intensive care patients is via the enteral feeding route, due to its favourable trophic effects on the intestinal mucosa, reduced rate of complications and lower costs when compared with parenteral nutrition. Impaired gastric emptying, commonly caused by critical illness, can be a limiting factor in providing enteral nutrition, and naso-jejunal tube feeding can be useful in patients who fail to tolerate naso-gastric tube feeding. Endoscopic placement of naso-jejunal tubes can be performed at the bedside, is highly successful (an 85-90% success rate can be expected) and enteral feeding can start immediately following the procedure. This retrospective review evaluated the use of feeding tubes (8 Fr, 240 cm; Wilson-Cook Medical Inc.) endoscopically placed in ICU patients over a 15 month period. A total of 27 patients had 36 naso-jejunal feeding tubes placed endoscopically (five patients had more than one placement). The patients were typically male and admitted to the ICU following an emergency procedure. All tubes were placed successfully and used for a mean of 5.7 days per tube (range 1-27 days). Avoidable complications such as blockages or accidental misplacement of the tube occurred in 28% of the tubes. A review of these results suggests that the majority of patients would have benefited from placement of an enteral feeding tube in theatre, and that the success of naso-jejunal feeding could be improved by attention to the care of the feeding tubes once placed. Introduction: Although enteral nutrition is considered superior to parenteral nutrition in critically ill patients, it is frequently delayed or prevented by gastroparesis. A variety of approaches have been developed to bypass the stomach, but none has proved entirely satisfactory. The Cathlocator™ is a novel device that permits real-time localization of the end of feeding tubes by detection of a magnetic field generated by a small electric current in a coil in the tip of the tube. It is portable, can be used at the bedside and uses no ionizing radiation. To evaluate placement of tubes for (i) nasoduodenal feeding and (ii) nasogastric drainage in critically ill patients using the Cathlocator™. Methods: Ten nasoduodenal tube placements were attempted in nine critically ill patients. The Cathlocator™ was used to guide positioning of the tube beyond the pylorus and also to determine whether a separate nasogastric tube was placed correctly. Tube tip position was confirmed by plain abdominal X-ray. Data are median and range. The Cathlocator™ accurately gives the location of an enteral tube in real time. It thus provides an effective bedside technique for nasoduodenal tube placement to facilitate enteral feeding in critically ill patients. This study was partially funded by Micronix Pty Ltd.
The present study assessed the effects of feeding glutamine enterally in 14 intensive care patients who were not immunosuppressed. Blood samples were analysed for plasma glutamine concentration (enzymatically), lymphocyte proliferation via 3H-thymidine incorporation into DNA (counts per minute, CPM) with and without mitogenic stimulation (Concanavalin-A), neutrophil activity (using an oxidative burst technique) and cytokine production in culture medium (IL-2 and IL-8). Patients were randomly divided into two groups and fed 15-g bolus doses in water of either glutamine (n = 7; 60 ± 8 years) or placebo (n = 7; 60 ± 6 years) for 5 days. Nasogastric nutrition was discontinued daily at midday and a blood sample was taken; 1 hour later, a bolus dose of glutamine/placebo was administered via the tube; 2 hours later total nutrition was started again. Lymphocyte proliferation increased in the glutamine group (P < 0.04) compared with the placebo group, as measured by CPM and the stimulation index. IL-2 production in culture medium increased from Day 0 to Day 5 for patients receiving glutamine (P < 0.02) compared with the placebo group. IL-8 production, neutrophil numbers or activity, and plasma glutamine did not change. Our results suggest that glutamine is able to induce an IL-2 mediated increase in the proliferative ability of T-cells, and we hypothesise that these changes may underlie previously observed clinical benefits from glutamine administration in critically ill patients. The need for glutamine supplementation in catabolic states is well established. In ICU patients the majority are fed by the enteral route. However, the documentation and control of glutamine administered enterally is still insufficient, while parenterally provided glutamine is well documented. Concentrated dipeptide solution may be given in a central venous line as a nutritional adjunct, but as central venous lines are associated with complications and risks, administration via a peripheral vein would be preferable. The osmolarity of a 20% glutamine dipeptide solution is 920 mosmol/l, which exceeds the recommended limit of 800 mosmol/l. Therefore we systematically evaluated local tolerance in a group of ICU patients. Patients and methods: ICU patients (n = 20) on a ventilator, but with limited liver and kidney insufficiency and with an available peripheral vein, were randomised to receive a 20% alanyl-glutamine (Dipeptiven, Fresenius-Kabi) infusion of 0.5 g/kg or the same volume of saline over 4 h in a peripheral vein on 3 consecutive days. The venous lines (BD Venflon® i.v.; 1.0 mm diameter) were inserted especially for this purpose, and removed after each infusion. Different veins in the same arm were used. Local tolerance was evaluated clinically by Maddox score, and ultrasonically (Aspen™ Ultrasound System equipped with an L10 transducer with frequency 11-6 MHz) before the infusion and on days 1, 4 and 8 after the infusion. In the study 56 out of 60 (93%) planned infusions were administered and 157 out of 168 (93%) clinical evaluations were successfully performed. The ultrasonic evaluation revealed that the utilized veins had a diameter of 2.15 ± 0.8 mm (mean ± SD; range 1.0-4.4 mm; n = 56) 3 cm proximal to the insertion site. Using the protocol described above, there were no signs of thrombophlebitis in any single patient by either Maddox score or ultrasound.
Administration of glutamine-containing dipeptide concentrate (20%) by peripheral veins is safe in terms of local tolerance if a strict protocol is adopted for this purpose, involving a separate line for the infusion that is removed immediately afterwards. Delayed gastric emptying is common in critically ill patients. Amylin is a novel 37 amino acid polypeptide which is co-localised and co-secreted with insulin by pancreatic beta cells [1]. In conjunction with its role in glucose homeostasis it is a potent inhibitor of gastric motility [2]. Thus we hypothesised that high circulating levels of amylin may be associated with delayed gastric emptying in critically ill children. Method: Nineteen children were enrolled within 48 hours of ICU admission. Exclusion criteria included: liver disease, gastrointestinal abnormalities and use of prokinetic agents. All patients were N.P.O. and maintained on an i.v. glucose infusion (5-8 mg/kg/min). Gastric emptying (GE) was assessed clinically by feed intolerance and using a paracetamol absorption technique (PTA). Feed intolerance was defined as a residual gastric volume > 0.25 ml/kg after 4 hours of a bolus 2 ml/kg test milk feed. At this point (T0), a single 15 mg/kg dose of paracetamol was administered nasogastrically, and serial blood samples were taken for paracetamol assay at 0, 15, 30, 60, 120, 240 and 360 min. GE was calculated using the gastric emptying ratio (GER), which is the time to reach the peak paracetamol level divided by the peak concentration, with high values reflecting delayed gastric emptying [3]. Blood amylin and insulin samples were taken at T0 and T360, with the mean of these two values used to reflect the average level over the study period. Amylin was measured by radioimmunoassay. Data were assumed to be non-parametric, thus Spearman's correlation coefficient and Mann-Whitney tests were used. Data are shown as median (interquartile range). Results: Nineteen patients were enrolled, with a median age of 6 years (1.7-8.5) and weight of 20 kg (11.5-31.5). Diagnoses included sepsis (n = 8), respiratory (n = 5), head injury (n = 2), neurology (n = 2) and other (n = 2). Four patients did not tolerate enteral feeds (median residual volume 4.4 ml/kg). Factors associated with impaired gastric emptying, in particular opiate infusions, were not different between the two groups. Introduction: Lactulose was reported to inhibit gastric tone and motility in healthy volunteers [1]. A double-blind placebo-controlled study using lactulose 15 ml four times daily (i.e. 40 g) or saline for prevention of ventilator-associated pneumonia was performed in our ICU during 1999-2000. In this report the impact of lactulose given into the stomach on gastric feeding tolerance is evaluated. Forty-eight patients without pneumonia and mechanically ventilated for < 24 hours entered the study. From day 2, EN was given into the stomach by a standardised protocol. The daily amount of enteral nutrition (EN) was recorded. Failure was defined as EN interruption > 24 hours because of large residuals. Metoclopramide was used in all patients. Chi-square and Mann-Whitney U-tests were used when appropriate. Data are presented as means ± SD. P < 0.05 was considered significant. Results: Thirty-eight patients in whom EN was not contraindicated and who stayed in the ICU > 3 days were analysed. Fifteen patients received lactulose (L) and 23 were given placebo (P). The L and P groups did not differ in age (50.3 ± 18.2 and 52.7 ± 16.3; NS) or APACHE II on admission (24.7 ± 8.7 and 26.0 ± 7.3; NS).
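To make the gastric emptying ratio (GER) used in the amylin abstract above concrete, the sketch below computes it from a hypothetical series of paracetamol levels; the function name and the illustrative values are assumptions, not study data.

```python
# Minimal sketch of the gastric emptying ratio (GER) described above:
# GER = time to reach the peak paracetamol level divided by the peak
# concentration, so higher values reflect slower gastric emptying.

def gastric_emptying_ratio(times_min, paracetamol_mg_l):
    """Return time-to-peak (min) divided by peak concentration (mg/l)."""
    peak = max(paracetamol_mg_l)
    time_to_peak = times_min[paracetamol_mg_l.index(peak)]
    return time_to_peak / peak

# Hypothetical levels at the sampling times used in the study (min)
times = [0, 15, 30, 60, 120, 240, 360]
levels = [0.0, 4.2, 9.8, 14.1, 11.5, 6.3, 2.9]  # mg/l, illustrative only
print(gastric_emptying_ratio(times, levels))     # 60 / 14.1, approx. 4.3
```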
There was also no difference between the L and P groups in mortality (6 ICU survivors and 9 non-survivors in the lactulose group, and 10 survivors and 13 non-survivors in the placebo group; NS), length of ICU stay (LOS) (13.5 ± 7.4 and 15.2 ± 11.0 days; NS) or ventilatory days (12.5 ± 6.8 and 12.0 ± 7.8; NS). Patients receiving L did not differ from P in daily tolerance of gastric feeding (604 ± 438 and 425 ± 264 ml/day; P = 0.16) or number of EN intolerances per hospitalisation (0.51 ± 0.26 and 0.50 ± 0.35 intolerances/LOS; NS). This was also true when ICU survivors were analysed separately. Lactulose administered into the stomach has no impact on poor tolerance of gastric feeding in mechanically ventilated long-term ICU patients. diabetes mellitus (DM), (4) degree of organ dysfunction/failure (MOF score: calculated from the MOF criteria of the Japanese Association for Critical Care Medicine), (5) cardiac output/body weight (C/B), (6) serum fat (triglyceride, free fatty acid, total cholesterol) levels (SF), and (7) blood stress hormone levels (SH). The results are as follows (mean ± SD): 1) C1-C3: 0.52 ± 0.88 ng/ml (n = 40); 2) I1: 53 ± 26, I3: 191 ± 75 µU/ml, IC: 20 ± 10 ml/kg/min (n = 40); 3) IC compared between patients (1) with liver dysfunction (-)/(+): 23 ± 11 ml/kg/min (n = 25)/15 ± 5 ml/kg/min (n = 15) (P < 0.025), and (2) with a MOF score of 0 or 1/more than 5: 24 ± 14 ml/kg/min (n = 15)/16 ± 4 ml/kg/min (n = 14) (P < 0.05); 4) there was a positive correlation between IC and C/B (Y = 0.11X + 5.0, n = 23, r = 0.59, P < 0.003); 5) there was no definite relationship between IC and organ dysfunction other than liver dysfunction, M value, BGm, DM, SF or SH. (1) IC calculated by our method seemed to be reliable because there appeared to be no significant influence of intrinsic insulin, as was apparent from the small (C1-C3) value. (2) Insulin clearance was elevated in most patients with a hyperdynamic state, although the degree of elevation was suppressed in patients with liver dysfunction and multiple organ dysfunction. (3) The results suggest the justification of sufficient insulin therapy. Background: The importance of parenteral nutritional therapy for patients in whom gut feeding is not feasible has been well documented. The main goal in such cases is to correct, maintain and improve the patient's nutritional status by choosing an optimal TPN regimen [1]. Objective: In this prospective randomized trial we compared the average cost of two different TPN delivery systems, the Three Bottle System (TBS) and All In One (AIO), using three-compartment NuTRIflex® Lipid bags provided by B|Braun Ltd. We also studied the average time spent on prescription, transcription, preparation and handling of the two regimens. Twenty-four patients admitted to our general ICU at the 'PIROGOV' Emergency Institute (1050 beds) and requiring TPN were enrolled in this prospective randomized study. The patients were randomized into two groups. Group 1 (n = 12) received TPN with the standard three bottle system with fat, glucose and amino acids. Group 2 (n = 12) received TPN with the NuTRIflex® Lipid three-compartment bag system. A record of all solutions and disposables was kept, and a stopwatch was kept at each bed and used each time the TPN needed attention. In addition we interviewed the nurses about their impressions of working with the two systems. The results suggest a substantial difference in average nursing time (10 min) and cost (1.0-1.17 units) favouring the NuTRIflex® Lipid system.
The possibility of fewer working errors and the reduced risk of infection also make it worthwhile. We concluded that the cost/day difference, the ergonomic and time-saving advantages and the well-known better tolerance and lower complication risk of the three-compartment bag system make it an acceptable and safer TPN regimen. In contrast to ω-6 lipids, fish oil downregulated CD62E, ICAM-1 and VCAM-1 in a dose-dependent manner. CD62P, however, remained unchanged. The changes in adhesion molecule expression were accompanied by a significant reduction of firm adhesion to 54%, whereas rolling interactions remained unchanged. In contrast to monocyte adhesion, fish oil had no effect on TF expression of cocultured monocytes. Conclusion: Intravenous fish oil emulsions reduce both endothelial cell adhesion molecule expression and monocyte adhesion. However, under flow conditions, rolling interactions via P-selectin remain unaltered. The functional importance of this effect is illustrated by the corresponding upregulation of TF in response to residual monocyte-endothelial interactions. Critical illness is associated with increased protein catabolism that is resistant to nutritional support. There is some evidence that an acquired resistance to the anabolic action of GH in these situations may be partly responsible. Moreover, in the same conditions, serum IGF-I and IGFBP-3 concentrations are low despite high circulating concentrations of GH. Among critical illnesses, sepsis is one in which the GH/IGF-I axis has never been systematically studied, particularly with regard to its progression from uncomplicated sepsis to severe sepsis and/or septic shock. Thus, the aim of this study was to investigate the GH/IGF-I axis in the septic process, especially with regard to its evolution. We measured, by commercially available radioimmunoassays (ELISA), serum GH, IGF-I and IGFBP-3. We conclude that GH/IGF-1 axis impairment increases in parallel with the increasing severity of the septic process. Aims: Immunoglobulin production is influenced by the functional integrity of the intestinal mucosa, since as much as 50% of secretory IgA originates from it. Therefore, the type of nutritional schedule could interfere with the immune response of the respiratory tract. The aim of our study was to compare the changes of the immunoglobulins (IgA, IgG and IgM) in the blood and the bronchial secretions in intensive care unit (ICU) patients under enteral (EN) and total parenteral (TPN) nutrition. Twenty ICU patients were included in the study. Ten of them received EN and the other 10 TPN. Immunoglobulins (IgA, IgG and IgM) were measured in the blood and the bronchial secretions (samples obtained during bronchoscopy) on the 1st and 5th hospitalization day. White blood cells were also measured and cultures of bronchial secretions were obtained. Our results concerning immunoglobulin levels on the 1st and 5th day (mean values ± SD) are presented in the following table (mg/dl). During the 1st day of the study immunoglobulins did not differ significantly either in blood or in bronchial secretions in both groups. During the 5th day of the study we did not detect significant differences (P > 0.05) for immunoglobulins in the blood in both groups. Between the 1st and 5th day in the EN group, there were no significant differences for immunoglobulins either in blood or in bronchial secretions.
Between the 1st and 5th day in the TPN group, there were no significant differences for immunoglobulins in blood, but there were significant differences for IgA and IgG in bronchial secretions. TPN results in atrophy of the intestinal mucosa and, therefore, in deficiency of the gut-associated lymphoid tissue, which would explain the decrease in respiratory tract immunoglobulins. Patients with a pH of 7.33 or less, evidence of hyperglycemia and ketosis, and aged between 1 and 18 years were enrolled. ETCO2 measurements were obtained by continuous oral/nasal side-stream capnometry and vital signs were recorded every 15 min. Venous blood gas analyses were obtained every 1-2 hours with simultaneously recorded ETCO2 and respiratory rate (RR). ETCO2 measurements were compared with venous pCO2 and changes in respiratory rate. A total of 121 patients were monitored for a mean of 5.9 hours. Age ranged from 1.8 to 18 years, with a mean of 10.9 ± 5.2 years. Patient dispositions were admission to the PICU or wards, or discharge home (Table). The mean coefficient of variation (CV) among daily measurements of TDEE was < 10% in 14 out of 17 patients (82%); in the other three patients the CV was 10%, 11% and 14%. Including all measurements, there was a strong positive correlation between the ratio energy intake/TDEE and RQ (rs = 0.65, P < 0.01). There was a significant difference in RQ between patients with a ratio energy intake/TDEE < 1 or ≥ 1 (0.82 and 0.90, respectively, P < 0.01). On the last day of measurement, nine patients with a positive cumulative energy balance (PCEB) (89 kcal/kg, range: 3-446) had a significantly higher RQ than eight patients with a negative cumulative energy balance (NCEB) (-21 kcal/kg, range: -53 to -4) (0.89 vs 0.83, respectively, P < 0.01), whereas there was no significant difference in RQ between these groups on the first day of measurement. All values are expressed as medians; days 5-7 are not shown because of limited data. During mechanical ventilation of critically ill children, TDEE can be predicted in 82% of the patients by performing only one measurement, despite individual differences. RQ is strongly influenced by the ratio energy intake/TDEE and by the cumulative energy balance. We advocate feeding critically ill, mechanically ventilated children according to or in excess of their TDEE as soon as possible during admission, in order to optimize nutritional therapy. .7 ml/min/m2, respectively, n = 17) and in sepsis and septic shock (161.9 ± 35.3 vs 170.0 ± 39.9 ml/min/m2, respectively, n = 24). However, correlation was present only in the whole population (r = 0.3, P < 0.05) and in patients with congestive heart failure (r = 0.4, P < 0.05). Gas mixing chamber indirect calorimetry is a noninvasive, safe method for measuring oxygen consumption in critically ill patients. Oxygen consumption measured by Fick's method is comparable with that measured using gas mixing chamber indirect calorimetry; however, correlation was absent in patients with respiratory failure, sepsis and septic shock. Introduction: The anion gap (AG) and base excess (BE) are used to identify the presence of a metabolic acidosis. A method of analysis using physico-chemical principles has been developed by Stewart and refined by Fencl. This technique has recently been demonstrated to detect unmeasured anions, due to metabolic derangement, more readily than the traditional markers of AG and BE. This method may be a more sensitive indicator of patients with circulatory inadequacy or organ dysfunction.
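(As a point of reference for the Fick comparison in the calorimetry abstract above, oxygen consumption by the Fick principle is conventionally calculated as VO2 = CO x (CaO2 - CvO2), where CaO2 is approximately (1.34 x Hb x SaO2) + (0.003 x PaO2) and CvO2 is the analogous mixed venous oxygen content; a factor of 10 is needed when CO is in l/min and the contents in ml/dl, and the result is indexed to body surface area to give ml/min/m2. This is the standard formulation and not necessarily the exact calculation used by the authors.)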
The Fencl-Stewart method of assessing unmeasured anions, resulting in the strong ion gap (SIG), has been shown to be more strongly associated with mortality in paediatric patients than BE or AG [1]. This study examines the predictive value of these measurements in an adult population of critically ill patients. Methods: 100 consecutive patients admitted to an adult intensive care unit had electrolyte and blood gas analysis performed on admission. The AG and SIG were calculated on admission and the base excess (BE) was measured by blood gas analyser. APACHE II data and 28 day mortality were recorded. Results: 100 mixed medical and surgical patients (48 and 52, respectively) with a mean age of 60.5 years (range 18-97) and a mean APACHE II score of 20.4 (range 5-40) were enrolled into the study. Twenty-eight day mortality was 31%. Logistic regression analysis showed that the APACHE II score was the best predictor of outcome (OR 1.17, 95% CI 1.07-1.26, P < 0.001). Predictions did not significantly improve when BE, AG, SIG or lactate were included. From ROC analysis, the best of the acid-base variables for prediction of mortality was the BE (area under curve 0.70), followed by AG (AUC 0.66), lactate (AUC 0.66) and then SIG (AUC 0.57). These data indicate that SIG does not have a useful predictive value in the adult patients in our ICU practice. Further studies are required to determine whether the application of SIG differs between adult and paediatric ICU patients. Background: An elevated blood lactate concentration is a classic marker of hemodynamic instability and tissue hypoperfusion. It is considered by most physicians as the hallmark and even signature of a life-threatening underlying condition. After the sporadic observation of an elevated lactate value in patients with typical hyperventilation, we decided to study blood lactate concentration systematically in these patients. We included consecutive patients admitted to the emergency department with a history and clinical findings suggesting hyperventilation, either primary (psychogenic, panic attack) or secondary to a renal colic. In addition to standard history taking, complete physical examination, laboratory screening, ECG and chest radiograph, arterial blood gases and venous lactate were measured by standard clinical laboratory methods. Results: Twenty patients, 10 females and 10 males, mean age 36 years, were studied. Sixteen had psychogenic hyperventilation and four hyperventilated as a reaction to a typical renal colic. Lactate concentration was increased in 12 cases (> 2 mmol/l); in three of them a value of more than 4 mmol/l was found. The lactate value correlated with neither pH nor PaCO2. An increased blood lactate concentration may be caused by hyperventilation, either primary psychogenic or secondary to intense pain. This latter finding is of particular importance for the clinical assessment of the patient presenting with acute abdominal pain, in whom most physicians consider an increased lactate concentration a sign of mesenteric ischemia. Introduction: Elderly patients tolerate trauma less well than younger patients. Early monitoring of the cardiac output to improve peripheral perfusion can diminish mortality and complications in elderly patients. Base deficit can be used as a marker of significant injury and to predict resource utilization and mortality. The purpose of this study is to determine which patients will require intervention. Our Trauma Service has established a trauma in the elderly protocol.
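For readers unfamiliar with the Fencl-Stewart approach discussed in the strong ion gap abstract above, the sketch below shows one common formulation of the SIG calculation; the coefficients follow the widely used Figge/Fencl equations, and the function name and example values are illustrative assumptions, not the authors' exact implementation or study data.

```python
# One common Fencl-Stewart formulation of the strong ion gap (SIG):
# SIG = apparent strong ion difference (SIDa) - effective SID (SIDe).
# Albumin and phosphate charge coefficients follow the Figge/Fencl
# equations; the authors' exact method may differ.

def strong_ion_gap(na, k, i_ca, i_mg, cl, lactate,
                   hco3, albumin_g_l, phosphate_mmol_l, ph):
    """Concentrations in mmol/l (ionized Ca/Mg); returns SIG in mEq/l."""
    sid_apparent = (na + k + 2 * i_ca + 2 * i_mg) - (cl + lactate)
    albumin_charge = albumin_g_l * (0.123 * ph - 0.631)
    phosphate_charge = phosphate_mmol_l * (0.309 * ph - 0.469)
    sid_effective = hco3 + albumin_charge + phosphate_charge
    return sid_apparent - sid_effective

# Illustrative admission values only (not study data)
print(round(strong_ion_gap(na=140, k=4.0, i_ca=1.2, i_mg=0.6, cl=104,
                           lactate=1.5, hco3=22, albumin_g_l=30,
                           phosphate_mmol_l=1.2, ph=7.35), 1))  # approx. 9.7
```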
Criteria for activation include age 60 years or older and multisystem injuries or single-system injuries with significant co-morbid factors. Patients meeting the criteria are transferred to the intensive care unit and placed on cardiac output monitoring. Intervention, consisting of packed red blood cells, vasopressors and/or inotropes, is indicated if the cardiac index is less than 3 l/min/m2 or the patient has an abnormal base deficit. Results: From April 1995 to July 2000, 76 patients met the criteria. The patients were divided into three groups according to their initial base deficit: Group I, normal base deficit (+3 to -2); Group II, moderate base deficit (-2.1 to -6); Group III, severe base deficit (-6.1 or greater). The results are summarized in the Table. Group I patients required significantly less intervention to improve cardiac output than Group II and Group III (15% versus 42% and 94%, respectively, P < 0.05). Overall, 22 patients (29%) died, 31 (41%) were discharged home and 23 (30%) to an extended care facility. Conclusions: Base deficit can be used to determine which patients are most likely to require intervention. Elderly trauma patients with a normal base deficit on admission seldom require cardiac output monitoring. Elderly patients with a severe base deficit have a high mortality. Results: Mean age was 57 ± 17 years, and was higher among patients with ARDS than those without ARDS. Mean APACHE II and SOFA scores were 21.5 ± 6.6 and 9.9 ± 2.5, respectively. The lactate flux tended to be greater in non-surviving than in surviving septic patients (P = 0.06). Regardless of the severity of respiratory failure, all septic patients showed pulmonary lactate production, which was significantly greater in patients with ARDS. Introduction: Many studies have shown a relationship between the outcome of critically ill surgical patients and the oxygen availability of the body. Many factors cause insufficient oxygen availability: tissue trauma with consequent initiation of the inflammatory cascade, anemia, hypotension due to blood loss, and anesthesia. Shoemaker called this alteration 'oxygen debt' and demonstrated that patient outcome is directly correlated with it and that this oxygen debt must be extinguished as soon as possible. The higher the oxygen debt, the higher the risk of multiple organ failure and death. The gut certainly has a prominent role in determining these last two events, and previous studies, measuring intramucosal pH with a gastric tonometer, have shown the importance of low gastric perfusion as a cause of death. A previous study showed a correlation between IL-6 levels and outcome in major abdominal surgery. We have now performed a study to investigate the correlation between gut hypoxia, measured with sigmoid tonometry during aortic clamping for abdominal aortic aneurysm surgery, and patient outcome. This was a prospective study of a series of 13 patients operated on for abdominal aortic aneurysm. These patients were monitored with a sigmoid tonometer, and pHi was measured at defined time points, together with arterial blood lactate and portal vein blood lactate. These parameters were measured at the following times: at the beginning of anesthesia (t0), before aortic clamping (t1), 30 min after aortic clamping (t2) and just after the operation (t4). Portal blood was taken by the surgeon only at the first two time points. Results: Data were divided into two groups according to whether organ failure (OF) developed in the postoperative period.
Seven patients had no organ failure, while among the other six patients one died of MOF and the other five developed organ failures: three patients had acute renal failure and two cardiac failure. The arterial blood lactate trend was not significantly different between the two groups. Tables 1 and 2 show the portal blood lactate and pHi trends. Patients with organ failure had a drop in sigmoid pHi with an increase in portal blood lactate after aortic declamping. At t4, pHi was still significantly lower in patients with OF. Fisher's exact test showed a significant relationship between pHi < 7.15 at 30 min after declamping and outcome (P < 0.05). Patients operated on for abdominal aortic aneurysm often have organ failures in the postoperative period. The development of organ failure is correlated with gut ischemia occurring in the intraoperative period. Indeed, in these preliminary data, patients with a persistent drop of pHi to < 7.15 had organ failure in the postoperative period. Objective: To determine the prevalence of Helicobacter pylori in critically ill patients and its relation to bleeding, the grade of stress ulceration and the APACHE score. The study included 40 critically ill patients admitted to the critical care department with different medical emergencies (21 males; mean age 49 ± 17 years). Thirty patients had stress ulceration of different grades and 10 patients did not have stress ulcers (control group). On admission, all patients were subjected to full clinical assessment, routine laboratory investigations, testing for occult blood in stool and upper GIT endoscopy (done in the first 24 hours), with multiple biopsies taken from the gastro-duodenal mucosa for culture of Helicobacter pylori (HP). The APACHE II score for each patient was calculated. Endoscopic grading of stress ulceration was done using the grading system of Martin. The patient group was classified into bleeders (73%) and non-bleeders (27%). Helicobacter pylori was positive in 9 out of 40 patients (22.5%): 8 in the patient group (26%) and 1 in the control group (10%), P = NS. HP-positive cultures were encountered in 7 bleeders (32%) and 1 non-bleeder (12.5%). The mean grade of stress ulcer in the HP-positive group was 3.3 ± 0.7 vs 2.5 ± 0.6 in the HP-negative group (P = 0.026). The stress ulcer grade was higher in bleeders than in non-bleeders (mean 2.9 ± 0.7 and 2.3 ± 0.5, respectively, P = 0.01) and higher in overt bleeders than in occult bleeders (mean 3.4 ± 0.5 and 2.4 ± 0.5, respectively, P = 0.000). The APACHE II score was higher in patients with grade 3 and 4 stress ulceration than in those with grades 1 and 2 (mean 17.8 ± 5.6 and 12.6 ± 4.8, respectively, P = 0.004), and higher in bleeders than in non-bleeders (mean 17.0 ± 5.4 and 12.9 ± 2.2, respectively, P = 0.006). The presence of HP was correlated with higher grades of stress ulceration and a higher incidence of bleeding. As the APACHE score was directly proportional to the ulcer grading and to the incidence of bleeding, it can predict the occurrence of bleeding from stress ulceration and also its severity. Background: In Slovenia the annual incidence of peptic ulcer hemorrhage is 118/100,000 inhabitants, with mortality up to 14%. Interventional endoscopy has largely reduced mortality in these patients. Aim: To compare the efficacy of argon plasma coagulation (APC) and injection sclerotherapy (IS) in endoscopic hemostasis and to evaluate mortality in this potentially lethal medical emergency. Conclusions: APC seems to be an effective treatment modality in peptic ulcer bleeding.
It is likely that improved endoscopic therapies, the increased sophistication of ICU care and better-trained personnel are responsible for the decline in mortality from bleeding peptic ulcer in the last decade. Departments of Surgery, University of Bonn, Germany and University of Pittsburgh, PA, USA. We have previously demonstrated that manipulation of the rat intestine leads to a local inflammatory response within the intestinal muscularis followed by a decrease in gastrointestinal motility. Furthermore, locally produced mediators play a major role in systemic inflammatory syndromes. The aim of this study was to delineate the initial steps of the local inflammatory cascade within the gut wall with respect to the known prototypic inflammatory cytokine interleukin-6. Methods: ACI rats, and IL-6 -/- and IL-6 +/+ mice, underwent a standardized intestinal manipulation (IM) and were sacrificed at various time points postoperatively. One group of rats received repeated i.v. doses of blocking antibodies against the adhesion molecules (1A29 + WT.3) pre- and postoperatively. The small bowel was separated into mucosal and muscularis layers and specimens were used for RNA and protein extraction. STAT protein was quantified using EMSA. IL-6 protein was measured by ELISA in tissue culture and by immunohistochemistry in muscularis whole-mounts. Muscularis extracts demonstrated an early upregulation (12.3-fold at 3 hours) in IL-6 mRNA by RT-PCR. This was in contrast to mucosal extracts, which did not show significant changes. Adhesion molecule blockade resulted in a significant decrease in infiltrating cells, but did not change mRNA expression. IL-6 immunohistochemistry stained resident muscularis macrophages, smooth muscle cells and infiltrated leukocytes. Muscularis tissue culture after IM demonstrated a significant increase in IL-6 protein compared with untreated control cultures (420 vs 230 pg/100 mg tissue). Postoperatively, STAT proteins showed a significant increase in activation (32-fold at 30 min) with a prototypic IL-6 supershift profile (Stat3α). IL-6 -/- mice demonstrated significantly lower STAT activation following IM compared with IL-6 +/+ mice. These results demonstrate for the first time that operative trauma leads to an early and significant production of IL-6 within the intestinal muscularis. IL-6 is mainly produced by resident cells within the muscularis and has functional activity. Therefore, the intestinal muscularis plays a role in the postoperative production of the proinflammatory mediator IL-6. Results: Average blood loss was 48 ± 11%. All tissue sites were found to respond significantly to shock and resuscitation. Both SMpH and SQpH remained significantly lower than baseline until 90 min of recovery, whereas BW returned to normal by 30 min. PCO2 was significantly elevated at decompensation in all tissues, but returned to baseline by the end of resuscitation. Only SM and BW were found to decrease significantly at the end of decompensation, but returned to normal with resuscitation. Overall, SM afforded the greatest measurable change with the smallest relative variance at each time point. Continuous multi-parameter monitoring of SM, SQ and BW potentially provides a minimally invasive method of assessing shock and resuscitation. Of the tissue sites investigated, SM provides the most sensitive means of monitoring hemorrhagic shock with the least amount of inter-subject variance. Introduction: Measurements of Pivc have been reported to allow a fairly reliable assessment of CVP.
On the other hand, in patients with abdominal hypertension, Pivc has been reported to reflect Pcyst. We studied the relationship between Pivc and a) Psvc, and b) intra-abdominal pressure measured in the urinary bladder (Pcyst). We obtained simultaneous measurements of Psvc, Pivc and Pcyst in 27 critically ill, hemodynamically stable patients under mechanical ventilation (81 sets of measurements). The phlebostatic zero and the pubic symphysis were at the same level. Measurements were divided into two groups: Group A (Pcyst ≥ Psvc, n = 39) and Group B (Pcyst < Psvc, n = 42). Statistics were performed with the paired t-test and Pearson correlation. Results: a) Pivc was significantly higher than Psvc (15.7 ± 0.5 vs 14.9 ± 0.6, P = 0.008), with r = 0.87. Pivc was significantly higher than Pcyst (15.7 ± 0.5 vs 13.6 ± 0.7, P = 0.000), with r = 0.61. b) In Group A, Pivc was significantly higher than Psvc (15.8 ± 0.9 vs 13.4 ± 0.9, P = 0.000), with r = 0.89. There was a significant difference between Pivc and Pcyst (15.8 ± 0.9 vs 17.3 ± 0.8, P = 0.030), with r = 0.85. c) In Group B there was no significant difference between Pivc and Psvc (15.7 ± 0.7 vs 15.8 ± 0.8, P = 0.877), r = 0.91. Pivc was significantly higher than Pcyst (15.7 ± 0.7 vs 10.2 ± 0.7, P = 0.000), with r = 0.62. When Pcyst > Psvc, Pivc does not allow an accurate assessment of either Pcyst or Psvc but becomes highly correlated with Pcyst. Background and goal of study: The use of alpha-adrenergic substances for the treatment of hypotension in sepsis might impair splanchnic regional and microcirculatory blood flow. The aim of this study was to measure microcirculatory blood flow (MBF) continuously and simultaneously in multiple abdominal organs during administration of epinephrine, norepinephrine and phenylephrine in peritonitis-induced sepsis. Pigs (20-25 kg, n = 9) were anaesthetised and ventilated. Cardiac index (CI) was measured with thermodilution. Superior mesenteric artery (SMA) flow was measured using ultrasound transit-time flowmetry. MBF was measured in the gastric, colonic and jejunal mucosa, the jejunal muscularis, the pancreas, the liver and the kidney using a multi-channel laser Doppler flowmeter. Peritonitis was induced by instillation of autologous feces into the peritoneal cavity. After 240 min of peritonitis, intravenous colloids were given to transform hypodynamic shock into normodynamic shock. Each animal received a continuous infusion of epinephrine (Epi), norepinephrine (Nor) and phenylephrine (Phe) in a random order. A separate baseline was taken before administration of each drug. The infusion rate was adjusted to obtain a MAP between 20% and 30% above baseline for at least 10 min. After stopping the infusion, MAP and CI were allowed to return to within 10% of baseline. Additional intravenous fluids were given when needed. Results: MAP was 66 ± 5 mmHg before epinephrine, 70 ± 4 mmHg before norepinephrine and 69 ± 5 mmHg before phenylephrine. Epinephrine and norepinephrine appeared to divert blood flow away from the splanchnic circulation, in particular from the small intestine. In contrast, SMA flow and MBF of the intestine remained unchanged during administration of phenylephrine. As phenylephrine is a pure alpha-adrenergic agonist, the beta-adrenergic properties of epinephrine and norepinephrine might play a role in the redistribution of blood flow when administered in sepsis. Values are percent of baseline ± SEM. *P < 0.05, **P < 0.01 vs baseline. Data processing: best statistical curve fit (Fig. 1).
Results: Two linear curves for MAP, SVRen variations were observed ( Fig. 1) Introduction: Corticosteroids improve haemodynamic profile in patients with septic shock [1, 2] . Adrenal dysfunction is common in patients with acute hepatic necrosis (AHN) and haemodynamic instability [3] . We studied 13 patients aged between 19 and 63 years with liver failure defined by the presence of encephalopathy and hypotension requiring noradrenaline (NA) support despite adequate fluid resuscitation. Eleven patients had AHN and two decompensated alcoholic cirrhosis. Treatment with 300 mg hydrocortisone daily by infusion was started when BP required NA support. Baseline, incremental rise and peak cortisols following 250 µg intravenous synacthen were recorded, as were NA requirements and mean arterial BP for 24 hours before and 24 and 48 hours after the start of corticosteroids. Results: NA was given for a median of 2 days prior to steroid therapy. Three patients survived, eight died and one underwent liver transplantation (OLT). Of those who came off NA (seven), three required further NA and all died; four required no further inotropes of which three survived and one underwent OLT. There were no differences between baseline, increment and peak cortisols between those who became inotrope independent and those who did not using the Mann-Whitney test. Conclusion: Low dose corticosteroids improve the haemodynamic profile in patients with hypotensive liver failure and may improve survival. This may not be dependent upon endogenous adrenal function. Introduction: Splanchnic blood flow is impaired in sepsis [1] . Therefore, beta-adrenergic agents are frequently administered in sepsis to increase global oxygen delivery in an attempt to increase splanchnic blood flow as well. However, changes in microcirculation can not be predicted by changes in systemic blood flow [2] and the effects of beta-adrenergic drugs on the splanchnic and gut mucosal circulation in sepsis is still not well understood. The aim of this study was to measure the effects of dopexamine and dobutamine, on systemic flow, regional flow and microcirculatory blood flow in the intestinal mucosa in septic shock. Systemic flow (CI), regional flow (superior mesenteric artery) and microcirculatory blood flow (MBF) were measured in nine sedated (midazolam & fentanyl) and ventilated pigs (20-24 kg). MBF in the mucosa of the stomach, jejunum and colon was measured with multichannel laser Doppler flowmetry. Septic shock was induced by fecal peritonitis at 0 min. After 240 min, i.v. fluids were administered to alter hypodynamic shock to hyperdynamic septic shock. At 360 min each animal received either i.v. infusion of dopexamine (1.0 µg/kg/min, increased after 30 min to 2.0 µg/kg/min for another 30 min) or dobutamine (5 µg/kg/min, increased after 30 min to 10 µg/kg/min for another 30 min) in a random order. After 30-60 min recovery period a new baseline was taken before infusion with the other test drug was started. Results and discussion: Results are presented in the Table as percent of baseline ± SEM. Changes within each parameter were tested with ANOVA for repeated measurements. *P < 0.05; † P < 0.01; ‡ P < 0.001. Both the tested beta-adrenergic drugs increased cardiac index significantly. However, regional blood flow was only increased with dopexamine, but not with dobutamine. Microcirculatory blood flow in the gastric mucosa was also significantly increased with dopexamine, but not with dobutamine. 
Neither drug appeared to influence microcirculatory flow in the intestinal mucosa. Although the beta-adrenergic agents significantly increased cardiac output, the mucosa of the gastrointestinal tract appeared to profit very little from the increased systemic flow. Dobutamine influenced neither regional flow nor microcirculatory flow. Dopexamine improved both regional and gastric mucosal flows, but had no effect on mucosal flow in the small and large bowel. Introduction: In the presence of severe complications associated with invasive measurement of cardiac output, a noninvasive technique is desirable. This study was designed to evaluate a rebreathing method to determine cardiac output in mechanically ventilated patients. The noninvasive method is based on a lung model consisting of a ventilated and a non-ventilated compartment, the first leading to effective pulmonary blood flow (PBF), the latter to shunt perfusion (QVA/Qt) [2]. Results: Cementation of the stem caused a cascade of fine emboli of less than 5 mm with opacification of the right atrium and ventricle. In the same set of patients, reduction of the hip joint was followed by macroemboli of up to 3 cm (49 patients, 75%). No important embolic phenomena were observed during other surgical steps. Both embolic events were followed by changes in hemodynamics (increase in heart rate in 18%, P < 0.05; hypotension of more than 20 mmHg in 62%) and blood gas parameters (PaO2 decreased by 7.7%, 41.4 mmHg; P < 0.05). PetCO2 decreased by a mean of 2.9 mmHg (P < 0.05). Pulmonary shunt values increased after embolisation by a mean of 30.5% (P < 0.05). They did not return to baseline values in the postoperative period in patients classified ASA III and IV. A significant correlation (P < 0.05) was found between the clinical state before surgery and the duration and intensity of the blood gas changes after insertion of the stem. There is a correlation between embolic events quantified by transesophageal echocardiography and the grade of hemodynamic and blood gas changes in patients during cemented total hip arthroplasty. Patients with high anaesthesiological risk can suffer severe cardiopulmonary complications from fat and bone marrow embolisation that last even into the postoperative period and may necessitate intensive care treatment. Introduction: During postsurgical treatment in our ICU, six patients developed acute pulmonary embolism grade III/IV. Immediately after resuscitation, and prior to angiography, we placed a Swan-Ganz catheter with SvO2 monitoring in a pulmonary artery. Typical signs of pulmonary embolism, such as increased PAP and PCWP and decreased SvO2 and CO, were observed. We used right-ventricular monitoring during lysis with rt-PA. Six patients with an average age of 68.3 years were monitored. Prior to embolism, none of the patients had insufficient heart function or irregular heart rhythms. After resuscitation all patients showed a systolic PAP higher than 50 mmHg, a CI lower than 2.5 and an SvO2 lower than 75%. All patients were treated with rt-PA in fractionated doses of 5 mg every 90 s. Success of lysis was shown by angiography and right-ventricular monitoring. Results: In four patients there was rapid normalisation after administration of at most 15 mg rt-PA. The result was seen first on angiography and then on SvO2. Cardiac parameters took 5-8 min to normalize. One patient needed 40 mg rt-PA for normalization. One patient died during lysis, without any remarkable normalization, after unsuccessful resuscitation.
Discussion: It was shown that continuous right-ventricular monitoring is a useful monitoring tool during lysis in the critically ill patient. Even though angiography is more sensitive, SvO2 monitoring provides useful information. If angiography is not available, right-ventricular monitoring is a valuable system. Placement of the catheter can easily be performed and is practicable even in unstable patients. The pulse oximeter waveform (POW) is derived from variation in the intensity of transmitted light through tissues. It may be used to provide a non-invasive surrogate for the intra-arterial waveform, and represents an accessible measure of cardiovascular status. We examined the effects of age, blood pressure and smoking on the waveform patterns. These factors are known to decrease arterial compliance, increase peripheral wave reflection and alter the contour of the pulse pressure waveform. Supine and standing POW recordings were taken from 200 healthy volunteers. The resting waveform patterns were categorised into four distinct classes as described by Murgo and Nichols for intra-arterial pressure waveforms, and compared to age and systolic blood pressure. Post hoc ANOVA revealed a significant relationship between waveform class and age (P < 0.001), blood pressure (P < 0.001), and smoking status (P = 0.021). These effects appear to represent changes in the peripheral arterial system resulting from altered wave reflection. We also analysed the POW in both time and frequency domains. On standing, time domain analysis showed a decrease in total variability, as measured by standard deviation and variance. Frequency domain analysis demonstrated an increase in power at a frequency (0.1 Hz) associated with sympathetic nervous system output. These results are consistent with changes in the microcirculation related to autonomic control mechanisms. These patterns of change, however, were only apparent in a proportion of the subjects. We believe that the responses elicited are dependent on factors affecting the long-term compliance of the arterial tree, and are thus related to the pulse oximeter waveform classification described. We have shown that POW analysis yields consistent and reproducible patterns. Further research may lead to the use of these techniques as a measure of peripheral vascular responsiveness, with potential utility in the evaluation of cardiovascular disease. Introduction: We have reported that the initial distribution volume of glucose (IDVG) reflects the central extracellular fluid volume status and correlates with cardiac output following major surgery. In clinical practice we use IDVG as an indicator of cardiac preload to predict hypovolemia or the fluid-refilling phase following major surgery. IDVG is an intermittent measurement, and its rather time-consuming procedure limits frequent measurement. In this study, to investigate whether continuous cardiac output (CO) monitoring could be used as a clinical alternative to IDVG for estimating changes in cardiac preload, we examined the correlation between changes in CO and IDVG in sedated, mechanically ventilated postoperative patients. Thirty-one patients with written informed consent managed in our ICU following subtotal esophagectomy were included. IDVG was measured once a day from the day of admission to our ICU until the third postoperative day by injecting 5 g of glucose and was calculated using a one-compartment model.
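As an illustration of the one-compartment IDVG calculation just described, the sketch below back-extrapolates the incremental plasma glucose to the time of injection and divides the injected dose by that concentration. The sampling times, glucose increments and function names are illustrative assumptions, not data or code from the study.

```python
# Minimal sketch of a one-compartment IDVG calculation (assumed workflow).
# Assumption: the incremental plasma glucose above the pre-injection baseline
# decays mono-exponentially; the example numbers are purely illustrative.
import numpy as np

def idvg_one_compartment(dose_mg, t_min, delta_glucose_mg_dl):
    """Return IDVG in litres after a glucose bolus.

    dose_mg              -- injected glucose dose (5 g = 5000 mg in the study)
    t_min                -- sampling times after injection (minutes)
    delta_glucose_mg_dl  -- incremental plasma glucose above baseline (mg/dl)
    """
    t = np.asarray(t_min, dtype=float)
    dc = np.asarray(delta_glucose_mg_dl, dtype=float)
    # Linear fit of ln(increment) vs time gives the mono-exponential decay;
    # the intercept is ln(C0), the back-extrapolated increment at t = 0.
    slope, intercept = np.polyfit(t, np.log(dc), 1)
    c0_mg_per_l = np.exp(intercept) * 10.0      # mg/dl -> mg/l
    return dose_mg / c0_mg_per_l                 # litres

# Illustrative values only (a 5 g bolus sampled at 3-7 min):
print(idvg_one_compartment(5000, [3, 4, 5, 7], [68, 64, 60, 54]))
```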
CO was measured using a pulmonary artery catheter equipped with a continuous CO calculation computer. Regression analysis was used to determine the relationship between daily differences in IDVG and CO. Results: Trends of the mean absolute values of IDVG and CO indicate parallel changes of the curves. Daily differences in IDVG and CO had a statistically significant correlation (r = 0.59, n = 93, P < 0.0001). In sedated, mechanically ventilated patients following subtotal esophagectomy, changes of IDVG and CO had a good correlation. In this limited clinical setting, continuous CO monitoring could be used as an alternative to IDVG as a preload indicator, although a false-positive error of changes in CO during the period of each IDVG measurement cannot be ruled out in our study design. Aim: Volume expansion is usually guided by central venous pressure (CVP) [1]. The PiCCO® system, which evaluates intrathoracic blood volume (ITBV), may be a useful guide for volume expansion. The aim of this study was, using a PiCCO® system, to measure the changes induced by volume expansion in cardiac index (CI), systolic index (SI), mean systemic arterial pressure (MSAP), CVP and ITBV. Eighteen patients with acute circulatory distress syndrome were prospectively studied. All the patients were sedated and mechanically ventilated and received 50 or 100 ml (group 1: CVP > 14 and 8 ≤ CVP ≤ 14 cmH2O, respectively) or 200 ml (group 2: CVP < 8 cmH2O) of HEAfusine 6%® over a 10 min period. Hemodynamic measurements were performed using a transpulmonary indicator dilution technique (PiCCO®, Pulsion, Munich) before (T0), after fluid administration (T10) and 10 min later (T20). The acute circulatory distress syndrome was due to septic shock (n = 15), cardiogenic shock (n = 2) and hypovolemic shock (n = 1). The Table shows the changes observed from the baseline values at T10 (T0-T10) and from the end of volume expansion to 10 min later (T10-T20) in each group. In the 200 ml group, 10 min after the end of volume expansion, the increase in MSAP and CVP disappears, whereas the ITBV, CI and SI increases remain unchanged, with ITBV even showing a trend toward a further increase. This suggests a post-expansion effect which is not observed on CVP values. In the group with a higher baseline CVP and a lower volume expansion, the increases in MSAP, CVP, ITBV and SI observed at the end of volume expansion disappear 10 min later, suggesting the absence of a post-expansion effect when the baseline CVP is high and/or the volume infused is low. Conclusion: ITBV measurement may be a useful guide for the evaluation of volume expansion and the post-volume expansion effect when baseline CVP is low and/or volume expansion is 200 ml. 1. Weil MH, Henning RJ: New concepts in the diagnosis and fluid treatment of circulatory shock. Anesth Analg 1979, 58:2. Table (extract): CI (l/min/m2) at T0, T0-T10 and T10-T20 in the two groups: 3.86 ± 0.9, 0.3 ± 0.09, 0.11 ± 0.07 and 3.9 ± 0.9, 0.12 ± 0. In an earlier study, we clinically validated an expert computer program (Hemodyn™) designed to assist in interpreting pulmonary artery catheterization data [1]. The present study in seven European centers assessed the influence of Hemodyn™ on the therapeutic strategies of experienced residents. Patients with a pulmonary artery catheter inserted were included in the study if they had hemodynamic disorders unresponsive to standard therapy and/or if their condition raised a therapeutic problem that PAC was expected to solve.
Each resident examined and collected PAC data from the study patients in their center, under the supervision of the local study coordinator. Then, the resident completed a data form and made one or more choices among the diagnostic possibilities listed on the form. The same PAC data were then entered into the software. Based on the software's diagnostic evaluation, the resident could either maintain or change his or her diagnosis and treatment. Finally, a senior intensivist accepted or rejected the resident's final diagnosis and treatment plan. Agreement between the residents' initial evaluation and the software's evaluation was poor (kappa < 0.6). Sixty-four hemodynamic profiles from 44 patients were used for the study, covering a broad spectrum of critical situations. Before computer assistance, the residents suggested at least one treatment change (to improve hemodynamics) in 83% of patients and a mean of 2.2 treatment changes per patient. After computer assistance and evaluation by the local study coordinator, the residents changed their treatment suggestions in 94% of patients, making a mean of 1.9 changes per patient. Therapeutic agreements before and after computer assistance are shown in the Table (* P < 0.05 vs other comparisons). After computer assistance, agreement was very good between residents and seniors and between seniors and the software. Computer assistance led the residents to change at least one suggested treatment in 63% of cases; in 55% of cases, the change was not minor. Analysis of the points of disagreement showed that the residents often underused fluids and vasodilators: these two points contributed 42% of changes after computer assistance. In only 20% of the cases in which the patient received a vasodilator was this treatment suggested initially by the resident. For the other treatments, the proportions of inappropriate use and inappropriate absence of use were similar. Expert software capable of helping residents to interpret PAC data properly may improve the quality of care given to critically ill patients. Results: Descriptive statistics and correlation coefficients are shown in the Table. The difference between measured and calculated SvO2 was statistically significant (P < 0.001). Conclusions: SvO2 calculated using BGA technology is consistently higher than SvO2 measured directly with the PAOC, by 1.6%. Although this difference is statistically significant (P < 0.001), the correlation between the two methods is quite high (r = 0.828, P < 0.01). BGA significantly overestimates SvO2 in comparison with PAOC. These results suggest that calculated SvO2 may affect therapeutic decisions in comparison with directly measured SvO2, because the slope of the oxyhemoglobin dissociation curve is very steep in the usual SvO2 range and thus small changes in the determination of PvO2 will result in relatively large changes in calculated saturation [1]. Also, minor calculated hemoglobin saturation differences in this steep part of the curve represent major differences in hemoglobin O2 carrying capacity. Objective: In the postoperative care of patients with severe aneurysmal subarachnoid hemorrhage, a pulmonary artery (PA) catheter is highly recommended for guiding appropriate hyperdynamic volume management. We prospectively evaluated the accuracy of cardiac output (CO) measurements of a new device for continuous CO monitoring, based on transpulmonary thermodilution detected in a femoral artery line, against the known gold standard of a PA catheter.
Methods: Ten patients presenting with high-grade aneurysmal subarachnoid hemorrhage were monitored in their postoperative course. CO measurements were obtained simultaneously via a PA catheter and the transpulmonary thermodilution CO device. After calculating the accuracy of the new device, the obtained data on CO and cardiac preload values, wedge pressure and intrathoracic blood volume index (ITBI), were related to the patients' final outcome. In 183 parallel measurements of CO, the correlation coefficient R2 of the two devices for CO and systemic vascular resistance was 0.88. The mean difference between the two devices was 0.02 ± 0.78 l/min. Absolute values of CO ranged from 3.1 to 15.6 l/min. Comparing the ITBI obtained from the new device (mean 912 ± 136 ml/m2, within the normal range) with the known pulmonary capillary wedge pressure values (mean 14.4 ± 5.2 mmHg, above the normal range) suggested that preload was overestimated when relying solely on wedge pressure. There was no correlation between the two preload parameters (R2 = 0.001). Despite the small sample size, both mean CO and wedge pressure correlated significantly with the patients' final outcome. However, the best differentiation of final outcome states was obtained with mean ITBI. The reliability of CO measurements with the new transpulmonary thermodilution device is high when compared with the gold standard of a PA catheter. The measuring device is far less invasive than a PA catheter and simple to handle. Although still to be validated with a larger database, guiding hyperdynamic volume therapy with filling volumes rather than filling pressures seems promising. Introduction: Cardiac output (CO) measured by thermodilution can show great variability. The esophageal echo-Doppler estimates CO from measurement of blood flow in the aorta. In spite of this potential advantage, its clinical use is limited by lack of experience. The objective was to describe the correlation between CO obtained by Swan-Ganz catheter and the continuous aortic blood flow and estimated CO obtained by esophageal echo-Doppler. Placement of the Doppler probe was considered a minimally invasive procedure without major complications. We prospectively and non-randomly included all patients older than 18 years who required invasive monitoring for their treatment. A Swan-Ganz catheter (Baxter Healthcare 131F7) was placed in all patients in the usual manner. The esophageal echo-Doppler probe (Hemosonic 100, Arrow International) was placed via the oral/nasal route, and simultaneous measurements of CO were taken at intervals of 3 min. The investigator was blinded to the Swan-Ganz CO. Therapeutic decisions were guided by the value from the SG catheter. For the statistical analysis we paired the samples by patient and time of measurement. We used SPSS 8.0 software. We performed Pearson correlation analysis and compared groups with the Mann-Whitney U test, taking P < 0.05 as statistically significant. Differences between measurements were summarised with descriptive statistics, and the inferential analysis was performed with 95% confidence intervals. We studied 12 patients; in 3 of them it was impossible to insert the probe, and in 4 patients it was difficult to obtain a good measurement because the aortic diameter and correct flow curves could not be visualised. In the remaining 5 patients we made a total of 38 measurements with each of the described methods, for a total of 76 measurements.
With the SG catheter the mean CO was 6.94 ± 2.1 l/min (95% CI 6.2-7.2, range 3.1-11.2). The measurements of aortic blood flow (ABF) with the esophageal echo-Doppler were 5.14 ± 1.9 l/min (95% CI 4.5-5.7, range 1.8-8.5). The measurements of CO with the esophageal echo-Doppler were 6.90 ± 2.2 l/min (95% CI 6.17-7.63, range 2.9-10.5). The comparison between groups was not significant. The Pearson correlation between SG and echo-Doppler ABF was 0.838 (r2 = 0.70), and between SG and CO by echo-Doppler it was 0.819 (r2 = 0.67). This preliminary report shows that the correlation between the SG measurements and ABF measured with the esophageal echo-Doppler is good. This new method offers the advantages of being minimally invasive and of providing results quickly; however, it requires repositioning the esophageal transducer every time the patient is mobilized, and it could not be used in all patients. Objectives: To compare the accuracy of an integrated fiberoptic monitoring system in measuring blood volume (BV) with the standard method using chromium-51-tagged erythrocytes in septic shock. Design: Prospective animal laboratory study. Twenty anaesthetised and mechanically ventilated pigs (20.9 ± 1.9 kg) were investigated over a period of 8 h. Septic shock was induced with faecal peritonitis (1 g/kg body weight autologous faeces). A central venous catheter was used for the injection of the indicator dyes. BV was measured by detecting indocyanine green by reflection densitometry using a fiberoptic thermistor-tipped catheter inserted into the right carotid artery (4F PV 2024, Pulsion Medical Systems). The haemodynamic treatment scheme aimed at maintaining a central venous pressure of 12 mmHg. Data were analysed using Bland-Altman analyses, linear regression and correlation. Forty data pairs of simultaneous BV measurements were obtained during haemodynamic stability, with a mean BV measured by the integrated fiberoptic monitoring system of 66.6 ± 20.3 ml/kg (range: 24.5-122.6 ml/kg). Mean BV measured by chromium-51-tagged erythrocytes was 76.1 ± 17.9 ml/kg (range: 49.7-121.6 ml/kg). The linear regression equation was: BV (integrated fiberoptic monitoring system) = 0.65 × BV (chromium-51-tagged erythrocytes) + 17.6; r = 0.57, P < 0.01. The mean bias was 9.6 ml/kg (95% confidence interval: 3.7-15.4 ml/kg), with limits of agreement of -26.5 to 45.6 ml/kg and a precision of 16.8 ml/kg. In this model of porcine septic shock we could show a significant correlation in blood volume measurement between the fiberoptic monitoring system and chromium-51-tagged erythrocytes. The relatively wide limits of agreement might be due to pronounced circulatory alterations, including slow mixing compartments, prolonged equilibration and sequestration in septic shock. Conclusions: Patterns of release of cTnI and cTnT after CABG are different: cTnI reaches its postoperative peak value earlier and declines more quickly than cTnT. After uncomplicated CABG surgery both cardiac troponins remain continuously low. Elevated concentrations reflect perioperative myocardial injury. Reperfusion of the infarct-related artery is mandatory to salvage the myocardium, and the best way nowadays to achieve this goal is percutaneous transluminal coronary angioplasty (PTCA). Although important to preserve the myocardium, reperfusion leads to what is called reperfusion injury, which decreases the benefits of reperfusion.
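The Bland-Altman agreement analysis and correlation used in the blood-volume comparison above can be sketched as follows; the input values and function name are illustrative assumptions rather than study data.

```python
# Minimal sketch of a Bland-Altman agreement analysis between two methods
# measuring the same quantity (here, blood volume in ml/kg). Example arrays
# are illustrative placeholders, not data from the study.
import numpy as np

def bland_altman(reference, test):
    """Return mean bias, 95% limits of agreement, SD of differences and Pearson r."""
    a = np.asarray(reference, dtype=float)
    b = np.asarray(test, dtype=float)
    diff = a - b                                  # per-pair differences
    bias = diff.mean()                            # mean bias
    sd = diff.std(ddof=1)                         # "precision" (SD of differences)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
    r = np.corrcoef(a, b)[0, 1]                   # Pearson correlation of the methods
    return bias, loa, sd, r

# Illustrative paired blood-volume estimates (ml/kg):
ref = [76, 81, 70, 90, 65, 72]
fib = [67, 75, 60, 95, 55, 80]
print(bland_altman(ref, fib))
```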
The process of reperfusion is known to generate oxygen free radicals and to activate apoptosis (programmed cell death) through different mechanisms; both may participate in the pathophysiology of reperfusion injury. This study was intended to evaluate the effects of reperfusion of the infarct-related artery by PTCA on both the generation of oxygen free radicals (assessed indirectly by measuring the oxygen free radical scavenger glutathione peroxidase, GPx) and the induction of apoptosis (assessed by measuring soluble Fas using APO-1/Fas), in correlation with the clinical outcome. This study comprised 32 patients admitted with acute MI to the Critical Care Department of Cairo University from December 1998 to June 2000, including 28 males and 4 females (mean age 52.09 ± 9.6 years). Peripheral and coronary arterial samples were withdrawn before and after interventional restoration of coronary arterial patency, as well as 24 hours later. Our study has shown that following interventional restoration of coronary arterial patency there were significantly decreased GPx levels in the coronary vessels (expressed as mean rank 16.1 vs 15.8, P < 0.05), possibly due to the liberation of excessive amounts of oxygen free radicals after reperfusion, consuming the GPx scavengers. There was a further significant decrease in the GPx scavenger levels 24 hours after reperfusion (expressed as mean rank 16.44 vs 15.1, P < 0.05), probably due to a reactionary increase in oxygen free radical levels. A 15% increase in sFas levels in the coronary arteries and peripheral circulation after reperfusion (4.83 to 5.58 ng/ml) points to increased apoptosis after reperfusion. Stratified into 27 survivors and 5 non-survivors, the non-survivors exhibited a significant decrease in GPx levels in coronary samples after reperfusion (66%) compared with an insignificant increase in survivors (7%), suggesting that the high levels of oxygen free radicals produced after reperfusion may directly affect patient mortality; this was accompanied by a significant increase in coronary arterial sFas levels after reperfusion (44%) in non-survivors versus an insignificant increase in survivors (12%). Conclusion: Increased levels of oxygen free radicals (indirectly assessed by peripheral and coronary arterial levels of the GPx scavenger) and increased apoptosis (indirectly assessed by the apoptotic marker sFas) characterise the reperfusion phase of ischemic myocardium following interventional PTCA. Non-survivors were found to have higher levels of oxygen free radicals and a higher incidence of apoptosis than survivors, indicating a direct relationship between their levels and the ultimate patient prognosis. Modern technology of molecular markers thus permits elucidation of the potential mechanisms of reperfusion injury. Background: Platelet activation and aggregation are central mechanisms in the pathogenesis of coronary thrombus formation in acute coronary syndromes (ACS). One final obligatory step for platelet aggregation is binding to the GP IIb/IIIa receptor. In this study an attempt has been made to assess the effects of the glycoprotein IIb/IIIa receptor antagonist tirofiban on angiographic patency, scintigraphic perfusion and GP IIb/IIIa receptor activity in ACS.
This dramatic effect on platelet function was not paralleled by coronary angiographic patency (TIMI flow ≥ 2 in 70% of both groups, P > 0.05) but was expressed in significantly better myocardial perfusion, as evidenced by regression of myocardium at risk from a mean of 20.3 ± 5.7% to a mean of 8.1 ± 4.9% in Gp2 vs a mean of 20.4 ± 8.59% to a mean of 14.4 ± 7.05% in Gp1 (P < 0.03), and improved perfusion scoring from a mean of 12.3 ± 3.5 to a mean of 4.6 ± 3 in Gp2 vs a mean of 12.4 ± 5.1 to a mean of 8.7 ± 4.2 in Gp1 (P < 0.03), with a salvage index exceeding 30% in 80% of patients in Gp2 vs 50% in Gp1 (P < 0.05). In acute coronary syndromes, the use of the platelet glycoprotein IIb/IIIa blocker tirofiban could effectively reduce GP IIb/IIIa receptor activity, enhance myocardial perfusion, improve salvage of myocardium, reduce residual ischemia and improve short-term outcome. We are conducting a multicenter, randomized, prospective, controlled trial to compare the strategy of early coronary stenting (group A) with conservative treatment (group B) following T within 12 hours after onset of AMI. Patients of group A are transferred to the interventional center within 6 hours after T for coronary angiography including stenting of the infarct-related artery. Group B has elective coronary angiography after 2 weeks. The primary endpoint is a combined endpoint of death, reinfarction, and target lesion revascularization. Conclusion: Early stenting after T in AMI is safe. These preliminary data indicate a clinical benefit of this approach compared with conservative treatment after T in AMI. Objective: To analyze differences in epidemiology and clinical course between the sexes in patients older than 80 years with myocardial infarction. This is a retrospective analysis. All patients older than 80 years admitted from 1-1-1990 to 31-10-1998 with a myocardial infarction were included. We used the χ2 test and P < 0.05 to establish a statistically significant association. Conclusions: Women have a higher mortality rate and more frequent complete atrioventricular block. Background: Studies have revealed sex-specific differences in electrophysiology with respect to rate, sinus node function, refractory periods, conduction time of accessory pathways and the type and duration of arrhythmias. However, data on sex-specific differences in arrhythmias are lacking in critically ill patients. Aim of the study: From November 1996 until July 1999 a prospective study was conducted in the ICU of a university hospital to assess sex-specific differences in all consecutive tachyarrhythmias. Aim: Ventricular tachycardia (VT) and atrial fibrillation (AF) are the single most frequent arrhythmias (ARRHY) in our ICU. The present study investigated whether there is a circadian variation in the onset of these ARRHY in critically ill patients in a medical cardiologic ICU which also admits cardiac surgery patients. Between 11/96 and 7/99 there were 98 consecutive patients with VT (43 pts) or AF (55 pts). There were a total of 218 ARRHY episodes (AF, n = 83; VT, n = 135). The time of onset of these ARRHY was not evenly distributed across the 24-hour day. Both VT and AF showed a nadir during the night. The circadian variation in the occurrence of VT, but not AF, could be well modeled by a sine wave function. For VT, but not AF, there was a significant correlation between the observed frequency distribution and a sine wave function (VT r = 0.71, P < 0.001; AF r = 0.29, P > 0.05).
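The sine-wave modelling of the circadian onset distribution described above can be sketched as a simple least-squares fit of the hourly counts to a 24-hour sinusoid, with the correlation between fit and observation reported as r. The hourly counts and function names below are illustrative assumptions, not the study data.

```python
# Minimal sketch: fit hourly arrhythmia-onset counts to a 24-h sine wave and
# correlate the fitted curve with the observed distribution. Counts are
# illustrative placeholders only.
import numpy as np

def circadian_sine_fit(hourly_counts):
    """Least-squares fit of counts to A*sin + B*cos (24-h period) + mean level."""
    y = np.asarray(hourly_counts, dtype=float)
    h = np.arange(len(y))                         # hour of day, 0..23
    w = 2.0 * np.pi * h / 24.0
    design = np.column_stack([np.sin(w), np.cos(w), np.ones_like(w)])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    fit = design @ coef
    r = np.corrcoef(fit, y)[0, 1]                 # agreement between fit and data
    return fit, r

# Illustrative counts with an early-afternoon peak and a nocturnal nadir:
counts = [2, 1, 1, 0, 1, 2, 3, 5, 6, 7, 8, 9, 11, 12, 10, 8, 7, 6, 5, 4, 3, 3, 2, 2]
fit, r = circadian_sine_fit(counts)
print(round(r, 2))
```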
The circadian variation with/without analgosedation was not significantly different for VT episodes (n = 86 and n = 49; χ 2 = 20.3, P = 0.56). For the distribution of VT episodes, there was a significant correlation to a sine wave function irrespective of the presence of analgosedation (no analgosedation, r = 0.59, P < 0.05; analgosedation, r = 0.9, P < 0.001; Fig.) . VT occurrence showed a peak in the early afternoon between 1 and 2 PM. As for VT, the distribution of AF episodes was not different during periods with (n = 47) or without analgosedation (n = 36; χ 2 = 19.6, P = 0.6). The distribution of VT and AF episodes across the day in critically ill patients is nonuniform. The distribution of VT episodes follows a sine wave function showing a peak in the early afternoon between 1 and 2 pm. The circadian variation and its representation by a sine wave function is less clear for the onset of AF episodes. Most striking, the circadian variation is present even during periods of analgosedation. Introduction: The survival from cardiac arrest is strongly influenced by the delay and appropriateness of interventions as CPR and defibrillation. For intra-hospital cardiac arrests which occur in general wards, the ALS rescuers are usually non-readily available at bedside and rescue depends on an emergency team moving inside the hospital. We investigated how the response to cardiorespiratory emergency is organised inside the hospitals in Rome. Twenty-five secondary and tertiary level hospitals of the Rome area (2,840,000 inhabitants) have been considered. Among them 22 included an ITU. Data were collected by means of interviews with ALS team members and direct overview on site. The time interval to reach the farthest ward from the starting point of the crash team was also measured. Results: Only 8% of hospitals have a dedicated telephone number for the emergency team. In 44% of cases a beeper is used, in 40% of cases the personnel call directly the ICU. In the remaining 8% of cases the Anaesthesiologist on duty is called. Only in 27% of cases the beeper allows bi-directional communication. In 42% of cases the number corresponding to the beeper is always the same, while in the remaining cases it may change even daily or from day to night. The ALS team includes one Anaesthesiologist and one nurse in 20% of cases, one Anaesthesiologist and one optional nurse in 40% of cases and only one Anaesthesiologist in 40% of cases. The ALS equipment carried by the crash team includes a monitordefibrillator, intubation equipment and drugs in 12% of cases, intubation equipment with drugs in 52% of cases and no equipment at all in 36% of hospitals. In these last cases, the crash team relies on the equipment available in the wards, but only 75% of them have protocols to check it regularly. In 20% of hospitals there is a defibrillator in every ward, in 24% there is one defibrillator per floor and in 56% less than one per floor. In 79% of these last cases, the crash team does not carry the defibrillator, which has to be found by the personnel in other ward of the same floor or in other floors of the hospital, and has to be carried by hand or by elevator. In two hospitals a transthoracic emergency pacemaker is not available; in only one hospital the pacemaker is included in the crash cart carried by the emergency team; in the remaining cases, it is available, but it has to be found in CCU, ITU or OR. 
Fifty-six percent of hospitals do not have standard ALS protocols; 28% have them, while in 16% of hospitals some providers use them and others do not. Only two hospitals use the Utstein style for reporting cardiac arrest. Despite the fact that the majority of hospitals have a regular BLS training programme for the personnel, in 88% of hospitals the emergency team members complained of insufficient CPR training of general ward personnel, on whom they rely to perform ALS on arrival. The time for arrival in the farthest ward of the hospitals ranges from 30 s to more than 15 min (average 3 min 30 s ± 4 min 27 s SD). The majority of hospitals are in a single building with fewer than six floors, but seven of them have separate buildings, and two have 11 floors. The majority of hospitals in Rome do not have a dedicated telephone number for emergencies and do not use international ALS protocols for cardiac arrest treatment. In only 12% of cases does the crash team carry complete ALS equipment to the scene, while the majority of ALS teams prefer to move with limited equipment, relying on materials available on the wards. However, only 20% of hospitals have a defibrillator in every ward. In the majority of hospitals there is regular training in CPR for general ward personnel, but the majority of emergency team members complain of insufficient CPR training of general ward personnel. Between January 1999 and June 2000, we prospectively analysed patients admitted for sudden death with successful resuscitation (SD) after acute myocardial infarction (AMI). All patients were assigned to undergo immediate PTCA. A total of 588 AMI patients were recruited during this period, and 38 patients (6.46%) presented with SD. Objectives: To evaluate the health-related quality of life (HR-QOL) of cardiac arrest survivors with EQ-5D, a generic instrument developed by the EuroQol group. From April 1997 to December 1999, all adult cardiac arrest patients admitted to a six-bed medical/surgical ICU of a tertiary care hospital were enrolled. At six months after ICU discharge, survivors attended a follow-up interview and answered the EQ-5D questionnaire. A matched control group was created by choosing, for each cardiac arrest survivor, two controls with a similar age (± 3 years) and similar APACHE II score (± 2 points), randomly selected among other ICU patients. Results: From a total of 834 patients, 67 patients were admitted after cardiac arrest. Of these, 33 (49%) were discharged from the ICU. Of these, five patients died on the ward. Twenty-eight (42%) patients were discharged from hospital. Ten patients died after hospital discharge but before the 6-month evaluation. Seven patients were not evaluated: five because they were living in distant locations, one because he was in prison and another one for unknown reasons. Eleven patients attended the follow-up consultation. Two patients presented with anoxic encephalopathy with severe neurological dysfunction. A third one presented with less severe anoxic neurological sequelae. Cardiac arrest survivors exhibited significantly worse quality of life only in the self-care dimension. Conclusions: When evaluated with EQ-5D at 6 months after ICU discharge, survivors of cardiac arrest exhibit an HR-QOL similar to that of other ICU survivors, presenting only more difficulties in the self-care dimension. These results agree with previous reports stating that CPR is frequently unsuccessful but, if survival is achieved, a fairly good quality of life can be expected.
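The matched-control selection described in the EQ-5D study above (two controls per survivor, age within ± 3 years and APACHE II within ± 2 points, drawn at random from the remaining ICU patients) can be sketched as follows; the patient records and field names are illustrative assumptions, not the study database.

```python
# Minimal sketch of matched-control selection under the stated matching rules.
# Record structure and field names are assumptions for illustration only.
import random

def pick_controls(survivor, pool, n_controls=2, age_window=3, apache_window=2):
    """Return up to n_controls randomly chosen matching patients from pool."""
    candidates = [p for p in pool
                  if abs(p["age"] - survivor["age"]) <= age_window
                  and abs(p["apache2"] - survivor["apache2"]) <= apache_window]
    random.shuffle(candidates)          # random selection among eligible matches
    return candidates[:n_controls]

survivor = {"id": "S1", "age": 62, "apache2": 18}
other_icu_patients = [
    {"id": "C1", "age": 60, "apache2": 17},
    {"id": "C2", "age": 64, "apache2": 20},
    {"id": "C3", "age": 45, "apache2": 10},   # excluded: too far in age and APACHE II
]
print(pick_controls(survivor, other_icu_patients))
```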
Introduction: Burn injuries still produce significant morbidity and mortality in Iran. This study was carried out to analyze the epidemiology, mortality, and current etiological factors of burn injuries in Kurdistan province in order to develop effective burn prevention programs. Methods: During 6 years (1994-2000), 874 patients were admitted to the burn unit of Tohid hospital, affiliated to the Kurdistan University of Medical Sciences. Data were obtained by analysis of the medical records of the hospitalized patients and included age, sex, percentage of body surface burned, etiology, and the outcome of treatment. The overall incidence rates of hospitalization and death were 10.8 and 3.6 per 100,000 person-years. The mean age was 20.6 years, and 58% of patients were younger than 20 years. The highest rates of hospitalization were observed in the age groups 16-20 years (19.3 per 100,000 person-years) and ≤ 5 years (17.8 per 100,000 person-years). Overall, 53% of the patients had a burned body surface (BBS) of less than 40%. Burn injuries were more frequent and larger, with higher mortality, in females than in males (P < 0.00001). There was also a statistically significant association of age group, gender, and BBS with the mortality rate (P < 0.00001). Flame was the most common etiology of burns (63.7%). There was also a significant association between age group and type of burn (P < 0.00001). Among all patients ≥ 13 years, suicide attempts were the cause of 12% (59/468) of the burns involving women and of 4% (7/151) of the burns involving men. The overall case fatality rate was 33.4% (292/874). The mortality rate was significantly higher for self-inflicted burns (72%, 48/66) than for accidental burns (30%, 244/808). Most of the lesions requiring hospital admission occurred during the winter months. A large number of burn injuries, which affect children and females, occur in the domestic setting and could have been prevented. Therefore, it is necessary to implement health education programs relating to the prevention of burn injuries, focusing on the domestic setting. Most of the burn injuries were caused by domestic accidents and were therefore preventable. In Jordan, pre-hospital care is in general provided by the civil defense department, but there is a wide, underpopulated area in the eastern and southern parts of the country, plus many remote small villages and communities, which are underserved by clinics, hospitals and civil defense units, and the evacuation routes are very long, so the emergency care that is needed is provided by general physicians and family practitioners with little if any training in emergency care. Ten years ago we started to train all physicians who would practice in such areas in providing emergency care. All 220 physicians who graduated from the family medicine residency programs in Jordan had to rotate through the emergency departments for a basic 3-month rotation in their family medicine training program and were required to master all life-saving skills before finishing this rotation. All of them were also tested to make sure that they were competent to provide emergency care. A questionnaire was designed for both the receiving hospitals and the providing field physicians to assess the outcome of this training scheme. Feedback from the hospitals on the quality of care given to patients by those physicians in the pre-hospital setting indicated great improvement in evaluating, stabilizing, and providing the necessary care, mainly to trauma and cardiac patients.
All providers felt more satisfied with their practice and less stressed when facing an emergency, and reported that pre-hospital care had improved dramatically in those remote areas. In the urban area of Prato (population 172,473 as of 01.01.2000), the Emergency Department runs three ambulances with a doctor on board, coordinated by the emergency number 118. Here we analysed the typology of 798 consecutive services carried out by our ambulances from 01.08.1998 to 31.01.1999. Twenty-eight services (3.5%), cancelled for various reasons, were excluded from the statistical analysis. In total, 770 patients were attended; 402 (52.2%) were males and 368 (47.8%) females; males were significantly younger than females (53.8 ± 24.2 vs 59.9 ± 17 years, P = 0.001). Of the services, 75.5% concerned non-traumatic cases, 17.9% traumatic cases and 6.6% transfers between hospitals. The services for the non-trauma group concerned dyspnoeic symptoms (15.7%), cases of lipothymia (15.2%), chest pain (11.1%), mental disorders (7.9%), cerebrovascular pathology (6.8%), abdominal pain (4.4%), use of psychotropic substances (4.3%), and epileptic and non-epileptic convulsions (3.5%). According to Sonsin et al [1], the most frequent services were related to cardiorespiratory pathologies, cases of lipothymia and trauma, as was the percentage of non-hospitalised patients (24.2% vs 23%). The percentage of our traumatic cases is similar to that found by Brismar et al [2] in Swedish urban areas (17.9% vs 20%). Cases of cardiopulmonary resuscitation were 1.9%, similar to the 1.8% found by Hu et al [3]. Further studies are in progress. All SICU admissions were compared with interhospital transfers for the last 3 fiscal years. Interhospital transfers account for 5% of ICU admissions but 10% of total costs. These patients generate an increased cost per case of over $11,000. The ICU LOS is significantly increased, from 3.3 days for all patients to 7.5 days for transfers. Likewise, the SICU mortality of these cases is significantly increased, from 7.3% to 28.6%, and hospital mortality from 9.6% to 33.5%. We continually review our practices to dedicate our resources where they do the most good. We must continue to take salvageable, critically ill patients in transfer early in the course of their illness, when appropriate SICU management can favorably influence outcome. In our experience, interhospital transfer of critically ill patients identifies a group with an overall poor prognosis. There is a need for a means to evaluate and appropriately triage outside referrals in order to maximize clinical outcomes. Analysis of these transfers is underway to identify prospective predictors of potentially futile care to allow better utilization of available resources. Introduction: Hypothermia after massive resuscitation is known to lead to coagulopathy, myocardial depression, and a depressed immune response. Attempts at prevention or correction of hypothermia in the perioperative period frequently fail in spite of utilizing aggressive rewarming modalities. We hypothesized that the response to rewarming is directly correlated with control of bleeding and adequacy of resuscitation. Methods: Retrospective review of injured patients admitted to a level 1 trauma center who: 1. underwent emergent celiotomy and/or thoracotomy, 2. received six or more units of blood within 12 hours of operation, 3. arrived at the ED normothermic (temperature > 96°F), and 4. developed perioperative hypothermia. Hypothermia was defined as mild (temperature 92-94°F), moderate (temperature 90-92°F), or severe (temperature < 90°F).
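The temperature grading just defined can be expressed as a small helper; the handling of the shared band edges (exactly 92°F and 94°F) is an assumption, since the abstract's ranges overlap at those points.

```python
# Minimal sketch of the hypothermia grading used above (temperatures in degrees
# Fahrenheit). Boundary handling at exactly 92 and 94 degrees is an assumption.
def grade_hypothermia(temp_f):
    if temp_f < 90.0:
        return "severe"
    if temp_f < 92.0:
        return "moderate"
    if temp_f <= 94.0:
        return "mild"
    return "not hypothermic by this scheme"

for t in (96.2, 93.5, 91.0, 88.4):
    print(t, grade_hypothermia(t))
```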
pH and base excess were measured sporadically. All patients were managed in one institution, and temperature control procedures were standardized. Results: Thirty-eight patients met inclusion criteria. 28 patients sustained penetrating injury of which 24 survived. Ten patients sustained blunt injury of which six survived. All deaths occurred within 24 hours of injury. The Table represents degree of hypothermia in survivors and non-survivors. Two non-survivors had initial control of surgical bleeding and a transient partial correction of temperature, but had progressive hypothermia and death related to bleeding recurrence. Discussion: Failure of correction of hypothermia indicates inadequate resuscitation or failure to control bleeding. In survivors and non-survivors, the pH response tended to lag behind temperature fluctuations, implying hypothermia may be superior to pH in reflecting correction of hypovolemia. At 8 hours, survivors achieved temperatures over 96°F, while non-survivors averaged < 90°F. Non-survivors failed to correct hypothermia, probably due to inadequacy of resuscitation. Hypothermia is a marker for the adequacy of resuscitation in patients with severe truncal injury. Failure to correct a hypothermic trend should prompt a search for ongoing bleeding. Three children ages 5, 7 and 12, fell through the ice on a pond December 23, 1998. The oldest child was able to pull himself to safety and call for help. The water temperature was 32°F. The first responders were city police followed by EMS. A call was placed to the Trauma Team by prehospital personnel shortly after their arrival at the scene. The Trauma Attending notified the Operating Room and the Cardiac Surgery Attending to prepare for cardiopulmonary bypass. The 7-year-old was finally rescued 45 min after initial immersion and was intubated at the scene. He arrived in the Trauma Room at 12 noon, asystolic with a temperature of 81°F. He was taken directly to the OR for rewarming via cardiopulmonary bypass. He was on bypass for approximately 4 hours and was successfully resuscitated. He was admitted to the Pediatric Intensive Care Unit for 34 days and then discharged to inpatient rehabilitation. He has made a full functional and neurological recovery. A similar treatment was pursued for the 5 year old child when recovered but resuscitation of vital signs could not be achieved. This institution experienced many 'firsts' with these cases. We had not previously used cardiopulmonary bypass in children. For future cases of hypothermic immersion injury, we needed a method to assure adequate communication and a systematic way of mobilizing appropriate personnel in a timely manner. Pediatric cardiopulmonary bypass equipment needed to be available and ready. When literature search did not yield specific guidelines for hypothermia, a multi-departmental task force was formed and a new hypothermia management protocol was created. This protocol insured maintenance of necessary equipment, availability of specialty personnel and rapid sequential mobilization and coordination of staff from many disciplines. Development of a multi-departmental protocol is the key to successful coordination of multiple departments toward a favorable outcome in such time-critical cases as hypothermic immersion injury. Institutional education and the publicizing of favorable outcomes will help to motivate staff in maintaining preparedness for these cases. 
Introduction: Myocardial contusion can result from blunt thoracic trauma and is commonly suspected in deceleration injuries. Unfortunately, traumatic heart disease is frequently overlooked and the diagnosis of myocardial contusion often goes unrecognized. The frequency and prognostic influence of cardiac injury in patients with blunt trauma are also controversial. Myocardial contusion is reported to be present in as few as 0% and as many as 76% of patients with blunt chest trauma, depending on the criteria used for establishing the diagnosis. Diagnosing myocardial damage as a result of trauma may be a problem because there is no gold standard for establishing the diagnosis. Most articles have brought no clear results regarding the specificity and significance of these methods. Conventional CKMB activity and the CKMB/CK-total ratio are still used in our country, even though the low specificity of these markers has been demonstrated, especially in the American literature. In contrast, echocardiography (and particularly transesophageal echocardiography) is still not fully used and in many hospitals is not accessible at all. We studied a group of 101 patients hospitalized in traumatology and general ICUs (1998-2000) with a diagnosis of multiple trauma (including chest trauma) or isolated thoracic trauma. The only inclusion criterion was admission to our ICU within 24 hours of the injury. We tried to determine the incidence of heart contusion and to compare all accessible diagnostic methods and their importance. The aim of this study was to assess the validity of the diagnostic procedures and the possibility of simplifying them. The incidence of heart contusion in our group was 11.8%. See the Figure. We investigated the prognosis of patients with multiple rib fractures for 2 years after discharge from the hospital, and whether the pain control management during the initial respiratory care influences respiratory function or neurological findings. The ethics committee of Fukushima Medical University approved this study. A total of 221 patients with blunt chest trauma were admitted to our hospital from 1990 to 1996. We found 61 cases that had more than 4 rib fractures with or without pulmonary contusion. The subjects were 51 patients, injured in traffic or other accidents, who could be followed up by phone and investigated. Forty-one cases received mechanical ventilation for 5 to 18 days. All 51 cases were given oxygen and drugs to control the chest pain. Nineteen of the 51 patients (EPI group) underwent epidural anesthesia with xylocaine for the first 7 days (5-9 days). The other patients (non-EPI group) were given narcotics or non-steroidal anti-inflammatory agents. Measurements comprised arterial blood gases, respiratory function, chest X-ray, and physical and neurological findings. The subjects were 48 patients who had traffic or other accidents. Respiratory care was required in 20 cases for 5-18 days. PaO2 and PEFR (peak expiratory flow rate) gradually increased after discharge. The improvement did not differ between the two pain-control regimens. Chest deformities were seen in 12 of 51 cases. Rales were audible in three cases. Neurological findings (spontaneous pain, tenderness, hyperesthesia, sensory disturbance) were found less frequently in patients treated with epidural anesthesia. Epidural anesthesia sometimes induced hypotension, which, however, improved with fluid resuscitation.
Epidural anesthesia was better than narcotics or anti-inflammatory drugs in terms of neurological prognosis. Sensory disturbance was associated not with chest deformities but with pain control. Epidural anesthesia was a useful pain-management option for avoiding long-term neurological complications of multiple rib fractures after chest trauma. Patients and methods: Seven patients with severe head injuries, 5 girls (8-18 years) and 2 boys (11-16 years), with increased intracranial pressure not responding to conventional treatment were observed between 1993 and 2000. Due to their low Glasgow Coma Scale scores (< 9 points), all these patients received epidural monitoring of intracranial pressure (ICP). Increased ICP (> 30 mmHg), decreased cerebral perfusion pressure (< 45 mmHg), clinical deterioration, and signs of raised ICP on CT scans were the indications for decompressive craniectomy. Fronto-temporoparietal decompressive craniectomy (5 × unilateral, 2 × bilateral) was performed 1-7 days after the trauma. The bone flap was preserved in the abdominal wall or stored under sterile conditions at -80°C. Reimplantation was carried out 6 weeks to 3 months later. Results: After acute decompression 6/7 patients (4 girls, 2 boys) survived and had a good outcome (Glasgow Outcome Scale: 4-5 points). Two patients had bone flap infections. Conclusions: Increasing traumatic brain edema requires intensive care, good monitoring and well-timed decompression for the prevention of irreversible brain damage. Background and purpose: In a model of human endotoxaemia, we have previously shown that the blood concentration of tumour necrosis factor alpha (TNF-α) peaks at 90 min after an intravenous bolus of endotoxin (ETX) [1]. At this time (peak TNF-α), subjective symptoms are marked. We measured cerebral blood flow (CBF) and the cerebral metabolic rates (CMR) of oxygen (O2), glucose (glu), and lactate (lac) at peak TNF-α after ETX. Eight healthy young volunteers (median age, 25 [range, 21-28] years) were studied. Informed consent was obtained from the subjects after approval by the Scientific-Ethics Committee of Copenhagen. After an overnight fast, catheters were placed in the left radial artery, the right internal jugular bulb, and bilaterally in the antecubital veins. Isotonic glucose was infused at 100 ml/hour. Mean arterial pressure (MAP), heart rate, peripheral saturation, and rectal temperature (Tp rect) were continuously monitored. CBF and CMR were measured by the Kety-Schmidt technique [2] at baseline, during normoventilation and voluntary hyperventilation (to measure subject-specific CO2 reactivity), and 90 min after an intravenous bolus (2 ng/kg) of a standard E. coli endotoxin (ETX). Subjective symptoms were headache, nausea, chills, and shivering, but not overt encephalopathy. Compared to baseline, CBF was significantly decreased; however, PaCO2 also decreased, and the CBF decrease was sufficiently explained by hyperventilation, as calculated from individual CO2 reactivities. A trend occurred towards decreased CMRlac, i.e. increased lactate efflux, similarly explained by hyperventilation. CMRO2 remained unchanged after ETX, whereas we observed a trend towards decreased CMRglu associated with decreasing blood glucose levels. All subjects were alert without signs of cerebral dysfunction throughout the study.
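The correction of the CBF fall for the lower PaCO2, using each subject's own CO2 reactivity as described above, can be sketched as a simple proportional calculation; all numbers and function names below are illustrative assumptions, not study data.

```python
# Minimal sketch of a subject-specific CO2-reactivity correction (assumed form):
# the reactivity derived from normo- vs hyperventilation predicts how much of a
# later CBF change is attributable to the change in PaCO2 alone.
def co2_reactivity(cbf_normo, cbf_hyper, paco2_normo, paco2_hyper):
    """Fractional CBF change per kPa change in PaCO2."""
    return ((cbf_hyper - cbf_normo) / cbf_normo) / (paco2_hyper - paco2_normo)

def predicted_cbf(cbf_baseline, reactivity, delta_paco2):
    """CBF expected from the PaCO2 change alone."""
    return cbf_baseline * (1.0 + reactivity * delta_paco2)

# Illustrative subject: baseline CBF 50 ml/100 g/min at PaCO2 5.3 kPa,
# 38 ml/100 g/min during voluntary hyperventilation at PaCO2 4.0 kPa.
react = co2_reactivity(50.0, 38.0, 5.3, 4.0)
# After endotoxin, PaCO2 fell by 0.8 kPa; compare this prediction with the
# observed post-endotoxin CBF to judge whether hyperventilation explains it.
print(predicted_cbf(50.0, react, -0.8))
```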
In this human model of early sepsis, the high levels of TNF-α were associated with spontaneous hyperventilation, which decreased CBF and increased cerebral lactate efflux, but did not affect the cerebral metabolic rate of oxygen. Thus, high circulating levels of TNF-α during endotoxaemia and sepsis appear not to be responsible for the development of encephalopathy by a direct reduction in global cerebral oxidative metabolism. Encephalopathy is a common complication of systemic sepsis, but its pathogenesis has received little attention. The adrenergic system has been implicated in the inflammatory response to sepsis and therefore the effects of adrenergic agents on the brain were investigated in an animal model of systemic sepsis. Thirty-two pigs (25-30 kg) were anaesthetised, intubated and a thermodilution catheter placed in the pulmonary artery. Peritonitis was induced by spreading caecal contents around the peritoneum. Non-septic pigs underwent laparotomy without caecotomy. The effects of the predominantly β 2 adrenoceptor agonist dopexamine (0.6 mg/kg/h; i.v.), the α 1 -agonist methoxamine (80 mg/h; i.v.) and the β 2 -blocker ICI 118,551 (a 20 µg/kg bolus given hourly; i.v.) were investigated. Samples of frontal cortex were obtained 8 h after induction of sepsis, processed for electron microscopy and subjected to quantitative morphometry. Septic pigs showed significantly more (P < 0.0004) peri-microvessel (PMV) oedema than non-septic pigs. However, there was no difference between the amount of PMV oedema in septic pigs that received dopexamine and in non-septic pigs. Neither was there any significant difference between the amount of PMV oedema in septic pigs treated with dopexamine + methoxamine and in nonseptic pigs. However, non-septic pigs treated with methoxamine showed significantly more PMV oedema than non-septic pigs (P = 0.0044). There was significantly more PMV oedema in septic pigs treated with ICI 118,551 than in non-septic pigs (P = 0.0028). The mean cross-sectional area of cerebral microvessel endothelial cells in methoxamine-treated septic pigs was significantly greater than that in pigs with sepsis (P = 0.0036) and in non-septic pigs (P = 0.0004). The mean endothelial cell area was also significantly greater in methoxamine-treated non-septic pigs (P = 0.0036) and in dopexamine + methoxamine septic pigs (P = 0.0036) than in non-septic pigs. The mean cerebral microvessel lumen area was significantly larger in septic than in non-septic pigs (P = 0.012). None of the drug treatments used resulted in a mean lumen area significantly different from that of non-septic pigs. Therefore, sepsis resulted in PMV oedema, which was protected against by dopexamine treatment. Conjoint methoxamine treatment did not impair this protective effect of dopexamine in septic pigs, but methoxamine alone caused PMV oedema formation in nonseptic pigs. β 2 adrenoceptor blockade did not affect the formation of PMV oedema in sepsis. Methoxamine treatment resulted in the swelling of microvessel endothelial cells in both septic and nonseptic pigs, but conjoint dopexamine treatment did not prevent this swelling. These results suggest that β 2 adrenoceptor stimulation is beneficial and α 1 adrenoceptor stimulation is detrimental to the treatment of septic encephalopathy. Introduction: Aneurysmal subarachnoid haemorrage (SAH) is a challenging pathology which remains a cause of considerable morbidity and mortality. 
We attempted to classify patients with aneurysmal SAH according to prognostic estimates, identifying cases with a certain risk profile. Retrospective chart review of 80 SAH patients admitted to our ICU following surgical ligation of cerebral aneurysm. The following clinical details were recorded: age, sex, ASA physical status (ASA-PS), timing of angiography and surgical treatment, pre-operative Glasgow Coma Scale, Hunt-Hess grade, admission angiographic features, length of sedation and ICU stay. Functional outcome was assessed by Glasgow Outcome Scale (GOS) upon ICU discharge. For statistical purposes, two subgroups according to outcome were identified: a) GOS 1-3 (unfavorable outcome) and b) GOS 4-5 (good recovery). Statistical analysis was performed using Student's t-test and χ2 analysis. Results: A history of longstanding hypertension was recorded in 45% (n = 36) of the cases, while cigarette smoking was recorded in 48.7% (n = 39). The male to female ratio was 1.3:1. Early operation (within the first 3 days after the bleed) was performed in 8.7% (n = 7) of the cases, with a mortality rate of 43% (n = 3). Calcium antagonists (nimodipine) were administered to 90% of the patients in the study group. In addition, early indicators of poor neurological outcome were ASA-PS, pre-operative Hunt-Hess grade, evidence of angiographic spasm and the presence of severe brain swelling intraoperatively (χ2-test, P < 0.001). Overall mortality rate was 5% (n = 4). Our results indicate that age, neurological status upon admission, pre-existing medical condition, evidence of vasospasm and cerebral oedema are the most important determinants of outcome for patients with aneurysmal SAH, regardless of treatment utilized. Objective: This study describes the occurrence of intracranial hypertension; its correlation with clinical presentation; its relationship with cerebral ischemia and its association with outcome. Methods: This prospective study was performed in patients who suffered aneurysmal SAH, admitted to our Intensive Care Unit (ICU) from 1998 to 2000. Hemodynamic parameters, ICP, cerebral perfusion pressure (CPP) and jugular bulb O2 saturation (SjO2) were monitored and collected up to 7 days after admission. Continuous ICP and CPP were summarised by 24-h mean values. In addition, patients were classified by their 'worst mean data', ie the worst data of the daily means; this allowed us to identify three groups based on their 'worst mean ICP' (wmICP): < 20, 20-29 and ≥ 30 mmHg. Hypodense lesions were detected on serial CT scans. Outcome at ICU discharge was classified as: obeying orders, alive but not obeying, died. Results: Fifty-four patients (64% female, mean age 54 ± 12 years) were studied. Patients were classified according to the Hunt-Hess (HH) scale: I 25%, II 30%, III 20%, IV 14% and V 11%, respectively. Thirty-two (60%) of the patients had ICP monitoring, performed in HH classes as follows: HH1 9%, HH2 35%, HH3 28%, HH4 19%, HH5 9% (P = 0.02). Elevated ICP (ICP ≥ 20 mmHg - HICP) was recorded in 27 patients (84.3% of monitored patients). Worst mean ICP was 21.7 ± 17.8 (range 3-85). HICP was detected in HH1-2 (28.6%), in HH3 (11.1%) and HH4-5 (55.6%) (P = 0.12). Ischemia was more frequent in the wmICP ≥ 20 than in the wmICP < 20 group (62% vs 11% - P = 0.003). Patients with higher wmICP suffered a poorer outcome: 4.5% of patients with wmICP < 20, 33.3% with wmICP 20-29 and 75% with wmICP ≥ 30 died (P = 0.006).
Conclusion: In our sample of selected patients, HICP occurred frequently, in all HH grades, and an association with low-density changes on CT scans was observed. Persistent HICP was associated with poor clinical outcome even in good HH classes. The [Ca2+]o sensor was also modulated by Mg2+, Gd3+, and spermidine, consistent with properties of identified extracellular Ca2+ receptors (CaRs), and regulated the NSCC via a diffusible second messenger. We predict that the NSCC may act to counter the fall in release probability produced by physiological decreases in [Ca2+]o at the synaptic cleft, by favoring Ca2+ delivery via broadening of presynaptic action potentials. If this novel signaling pathway acts to ensure synaptic efficacy at times of excessive synaptic activity, inactivation of the pathway may provide a new route through which we can reduce glutamate excitotoxicity and so reduce neuronal death. Introduction: Even moderate fever soon after head trauma or cerebral ischemia may markedly worsen brain injury. These effects may justify aggressive antipyretic treatment. Objectives: In this study we compared two modalities to treat fever and to maintain normothermia and examined their influence on ICP, CPP, SjvO2, length of ICU stay (LOS) and the clinical course of existing infections. We prospectively randomised patients with severe head injury (TBI) and subarachnoid hemorrhage (SAH) into two groups: low-dose diclofenac sodium infusion (0.01-0.08 mg/kg/h) [DCF] and extemporaneous antipyretic boluses by unrestricted physician prescription [NSAIDs]. Randomisation took place when internal temperature was ≥ 38°C for longer than 30 min. Drugs were used for as long as clinically necessary. Internal temperature (T°) and cerebral and systemic hemodynamic data were continuously recorded. Results: Eight TBI and three SAH patients (mean age 40 ± 17; 64% male) were studied. Median TBI GCS motor score was 5 and all SAH patients were Hunt Hess IV. Five patients were enrolled in the NSAIDs group and six in the DCF group. Age, admission GCS, cerebral damage, maximum temperature before randomisation and outcome were not significantly different in the two groups. During treatment, normothermia was more easily maintained with DCF than with NSAIDs: mean T°, max T°, % of minutes with T° ≥ 38°C, and the incidence of febrile episodes were lower in the DCF than in the NSAIDs group, with less additional antipyretic use (P < 0.05). During treatment, mean ICP and minutes of ICP > 20 mmHg were not significantly different between groups, whereas mean CPP, minutes of CPP < 70 mmHg and mean SjvO2 were (P < 0.05). Incidence of infections and LOS were not different between the two groups. Conclusions: During the infusion period, better stability of hemodynamic and cerebral parameters demonstrated the superior antipyretic effect of DCF. NSAIDs were associated with deeper CPP reductions and not all febrile episodes were completely managed. We preliminarily concluded that low-dose DCF was advantageous and effective. Intensive Care Unit, Royal North Shore Hospital, St Leonards 2065, Australia Introduction: Following SAH a massive diuresis, natriuresis and hyponatraemia may occur, termed 'cerebral salt wasting'. The aetiology of this condition is unclear and its existence has been questioned [1,2]. This study aimed to document sodium balance in a patient cohort treated by the same protocol following SAH, hoping to further elucidate the mechanism of this condition.
Prospectively entered data from a computerised database of SAH patients admitted to an eight-bed neurosurgical ICU were analysed to assess the correlation of sodium flux with the course of the disease. For up to 18 consecutive days plasma sodium was measured, 24-hour urinary collections were analysed for sodium loss and the daily intravenous sodium intake was calculated from the charted intravenous fluids. Patients underwent check cerebral angiography on day 5-7 following admission, or earlier if there was clinical evidence of vasospasm. Papaverine was administered to patients with vasospasm, who then underwent further angiography and hypertensive therapy with noradrenaline. The total doses of papaverine and noradrenaline administered were used as markers of the severity of spasm. Results: Data from 39 patients were analysed. In 15 patients urinary sodium excretion, and hence sodium balance, could be calculated for 8 or more days. The median sodium input was 360 mmol/day (range 15-4868); see the Figures for daily plasma sodium and sodium balances. Measured sodium balance was consistently negative in this population. There were no significant differences between high and low papaverine and noradrenaline groups. In this limited dataset, following SAH and saline loading, plasma sodium increased in these patients but there was no obvious natriuretic phase. Cerebral salt wasting was not clearly demonstrated by these data. Introduction: The circadian rhythm of melatonin secretion is a most reliable biochemical marker of the endogenous biological rhythm of the organism. Plasma melatonin levels are low during the day (1-5 pg/ml) and normally increase to about 50-100 pg/ml at night. The objective of the present study was to investigate any potential disturbances of melatonin's secretion pattern after head injury. The sample consisted of eight subjects (seven males and one female) admitted to the Intensive Care Unit after head injury. Mean age ± SD of subjects was 41 ± 17.91 years, mean APACHE II Score was 15.5 ± 3.75, mean Glasgow Coma Score was 7.11 ± 3.14, mean duration of stay in the ICU was 23.44 ± 12.23 days, and mortality rate was 11.1%. All patients were under sedation (Fentanyl and Propofol) and mechanical ventilation. Various medications were administered as appropriate. Blood samples for melatonin determination were collected via an arterial line at 08.00, 12.00, 16.00, 18.00, 21.00, 24.00, 03.00 and 06.00 for the first 2 days after admission. Core body temperature was also recorded through an esophageal thermometer at hourly intervals. All patients were under constant ambient light (< 600 lux) and none of them received β-adrenergic blockers. Plasma melatonin was assayed using a RIA method. The peak mean melatonin concentrations were recorded during the dark photoperiod and the lowest mean concentrations were recorded during daylight hours (Table 1). Overall, a clear circadian secretion pattern of melatonin was observed during the first two post-admission days, with higher melatonin levels during the dark period (24.00-06.00) and lower levels during daylight hours (08.00-18.00). However, only four patients exhibited a typical diurnal variation of plasma melatonin, whereas the other four did not. No consistent core body temperature pattern was observed during the study period and individual temperature fluctuations were not related to the corresponding melatonin changes. Melatonin secretion appeared reduced in subjects treated in the ICU after head injury.
However, its circadian secretion pattern was not disturbed in half of them during the first two post-admission days, indicating an intact endogenous circadian generating system. Furthermore, the diurnal secretion pattern of melatonin appeared to be desynchronized with the rhythm of core body temperature. Aim: To assess the influences of different bypass modalities and core temperatures during ECC on brain pathomorphology and immunohistochemistry in an animal model of cardiac surgery. Methods: Twenty-one pigs, aged 4 months, were assigned to the following temperature and flow groups: I: 37°C, 2.7 l/min/m2 (n = 7); II: 28°C, 1.6 l/min/m2 (n = 9); III: 20°C, 1.3 l/min/m2 (n = 5). Duration of ECC was 120 min, including 30 min each of cooling and rewarming. Six hours after ECC, brains were harvested for histology and immunohistochemistry. Coronal sections from different brain regions were stained with hematoxylin/eosin and antibodies against glial fibrillary acidic protein (GFAP), astroglial cell protein S-100 (PS-100) and neuron-specific enolase (NSE) for standard light microscopy. Ischemic lesions were evaluated by a quantitative histological score with respect to tissue stratification and vacuolization, ganglion and glial cell density and glial fiber reaction. Apoptosis was identified by in situ DNA fragmentation (TUNEL). In addition, perioperative PS-100 serum kinetics from jugular venous bulb blood were evaluated. In the frontal lobe, the most severe ischemic lesions were observed in group III, showing frequent and extensive loss of histologic stratification between the outer molecular and pyramidal cell layers, caused by necrosis of pyramidal cells and vacuolization within the outer molecular cell layer. In group II these lesions were milder and only focal; in group I they were hardly present. In all groups, apoptosis was not apparent in the neocortex. In the hippocampus, pyramidal cell damage was severe in III, moderate in II and slight in I. In the cerebellum, Purkinje cells were markedly reduced in III, moderately reduced in II, and slightly reduced in I. The striate body and medulla oblongata were free from ischemic lesions in all groups. Signs of inflammation or edema were not observed. In addition, marked expression of PS-100 in perivascular glial cells, as well as significant elevation of serum PS-100 values at the end of ECC and return to preoperative values 6 h after ECC, were observed in all groups. Ischemic lesions with loss of ganglion cells in sensitive brain regions such as the cortex, hippocampus and cerebellum were most marked after deep-hypothermic low-flow bypass, least marked after normothermic full-flow bypass, and intermediate after moderately hypothermic low-flow bypass. Reduction of flow rate and hypoxia during ECC, rather than inflammation, are assumed to contribute to the histopathological findings. Serum PS-100 elevation after ECC, independent of the bypass modality, is presumed to result from a temporary disturbance of the blood-brain barrier, given the tight relationship between astroglial and vascular endothelial cells. Introduction: Clinical examination criteria have been established to determine brain death, and subsidiary investigations such as electroencephalogram (EEG), arteriography and radionuclide scans have been used to determine the arrest of the cerebral circulation and to confirm the diagnosis of brain death. We examined in this study the clinical utility and reliability of transcranial Doppler ultrasonography (TCD) as a confirmatory test for brain death.
A total of 30 consecutive comatose patients (20 male, 10 female; age range 2-80 years) were studied by TCD, with EEG recording performed at the same time. Twenty-five of the 30 patients presented clinically with brain death. Undetectable flow despite an adequate bone window, or the demonstration of isolated systolic spikes or diastolic reverse flow without forward flow on TCD examination, together with isoelectricity on EEG recording, were accepted as confirmation of brain death. TCD examination was repeated in clinically brain-dead patients in whom TCD initially demonstrated systolodiastolic forward flow, or diastolic forward flow and reverse flow, in the middle cerebral artery (MCA). Five of the 25 patients who were brain dead were excluded from the study because of the lack of an adequate bone window on TCD examination. In only 9 (45%) of the 20 patients who were clinically brain dead were the three wave-form patterns that confirm brain death seen at the initial TCD examination. Various wave-form patterns were detected in 11 patients in the first examination. Repeated TCD examination of 9 of these 11 patients who initially had forward flow patterns later demonstrated flow patterns which confirmed brain death. In 2 of the 11 patients TCD examination could not be repeated because of sudden cardiopulmonary arrest. In one of the 5 patients who were not clinically brain dead, TCD showed systolic spikes and no diastolic flow. The sensitivity and specificity of TCD for brain death were found to be 45% and 80%, respectively. Discussion: TCD is a simple and noninvasive imaging modality that is easily performed at the patient's bedside in order to evaluate cerebral perfusion. A number of authors have presented very similar results concerning the specificity (100%) and sensitivity (91-100%) of this method [1,2]. However, we found 11 (55%) false negative and one (20%) false positive results, which are higher rates than those of previous studies [1,2]. Finally, in our study we interestingly found that the specificity and sensitivity of TCD examination in confirming brain death were lower than in previous studies. Objectives: Although the incidence of electrophysiological and muscle histological abnormalities in ICU patients has been widely described, the clinical incidence of ICU-acquired paralysis (ICUAP) remains poorly explored. The objective of this study was to assess the clinical incidence, risk factors and outcome of ICUAP. Method: All consecutive patients without pre-existent neuromuscular disease were screened daily for wakefulness in the 5 participating ICUs after 7 days of mechanical ventilation (MV). The first day on which the patient was considered awake (based on a specific wakefulness scale) was defined as Day 1. Patients with a neuromuscular score (NMS) < 48 (on a scale ranging from 0, totally paralyzed, to 60, normal muscle strength) on Day 7 were considered as having ICUAP. These patients underwent an electrophysiologic (EP) examination within the next 72 hours. Patients with persistent paralysis (NMS < 48) on Day 14 underwent a muscle biopsy. Potential risk factors (including demographic, metabolic, drug-related and organ failure-related variables) were recorded between ICU admission and Day 1. Odds ratios (OR) with 95% CI were separately computed for each potential risk factor. Then, significant factors were simultaneously included in a multivariate logistic regression model (BMDP software). Results: Out of the 12 patients, 4 had to be excluded due to concomitant focal cerebral lesions impairing motor function.
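The sensitivity and specificity quoted above for TCD follow directly from the counts reported: 9 of the 20 clinically brain-dead patients with an adequate bone window had a confirmatory pattern at the initial TCD, and 4 of the 5 patients who were not brain dead had no confirmatory pattern. A minimal worked sketch (not the authors' code) is:

# Illustrative sketch of how the reported TCD sensitivity and specificity follow
# from the counts given in the abstract (not the authors' code).

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# 20 clinically brain-dead patients with an adequate bone window:
#   9 showed a confirmatory flow pattern at the initial TCD (true positives),
#   11 did not (false negatives).
# 5 patients who were not clinically brain dead:
#   1 showed a confirmatory pattern (false positive), 4 did not (true negatives).
sens, spec = sensitivity_specificity(tp=9, fn=11, tn=4, fp=1)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 45%, 80%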
Two patients had to be referred to another hospital and thus were lost to the study. In the remaining six patients, a combination of axonal sensory-motor polyneuropathy on electrophysiology and myopathy in the histological specimens was found. The mortality in these patients was 50% (3 of 6). The complete muscle strength examination was performed in three survivors. The clinical course of these patients is summarized in the Table. Conclusions: Neuromuscular failure during critical illness is a serious medical and economic problem. All the surviving patients with acute quadriplegia of intensive care received mechanical ventilation for more than 3 months, requiring a very long ICU stay and prolonged rehabilitation. On discharge, peroneal paresis was present in all survivors. In our ICU, more than one bed is permanently occupied by patients with severe neuromuscular failure. These neuromuscular disorders developed in patients who were not treated with neuromuscular blocking agents or corticosteroids. Supported by grant IGA No. NB 5197-3. Background: Sedation and analgesia are common techniques widely used in the intensive care unit. Since complications such as prolonged sedative effects and associated long-term mechanical ventilation are common, a careful assessment of the level of sedation is warranted to avoid such complications. Nowadays, noninvasive devices like Bispectral Analysis (BIS) recorders are commercially available and are currently considered gold standard tests in sedation level assessment; however, clinical scales (the Ramsay and the Observer's Assessment of Alertness/Sedation [OAA/S] scales) are cheaper and more widely used in critically ill patients. In the present study we compared clinical assessment scales with BIS recordings. Methods: We prospectively analyzed nine mechanically ventilated patients under deep sedation. We excluded patients receiving muscle relaxants, or with metabolic encephalopathy or primary central nervous system disease. All patients were simultaneously evaluated with a BIS device and with clinical assessment scales (Ramsay and OAA/S). At least 9 periodic measurements in 24 hours were performed in each patient, recording the neuronal activity level and the signal quality index (SQI) with the BIS device, excluding all measurements with an SQI greater than or equal to 50%. Comparisons between BIS recordings and each clinical scale were analyzed with Pearson's and Spearman's correlation and determination coefficients, considering values of 0.75 and 0.11 significant for each category (according to Colton), with the significance level set at 0.05. Results: Eighty-five measurements were analyzed. Correlation and determination coefficients are shown in the following Table. Patients who were receiving mechanical respiratory support were investigated. Patients with neurological damage or those who had received muscle relaxants were excluded from the study. During evaluation and measurements, all patients were under stable sedation and analgesia (fentanyl, propofol or midazolam) by continuous intravenous infusion. The evaluation of the level of sedation was carried out during the same time period by means of two different scales, the Ramsay (0-6) and the Cook (4-19), and by BIS (0-100). The BIS registration lasted 60 min and the final value was calculated from the average of the recorded 10-min values. Measurements were taken into consideration only if the SQI (signal quality index) was higher than 80%.
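As a purely illustrative sketch of the kind of analysis described in the first of these two studies (Pearson and Spearman correlation between BIS readings and an ordinal clinical sedation score), with invented numbers that bear no relation to the reported data:

# Minimal sketch (not the study's code) of correlating BIS values with an ordinal
# clinical sedation score. All values are invented.
from scipy.stats import pearsonr, spearmanr

bis = [42, 55, 61, 38, 70, 48, 65, 35, 58]      # hypothetical BIS readings
ramsay = [5, 4, 3, 6, 2, 5, 3, 6, 4]            # paired hypothetical Ramsay scores

r_p, p_p = pearsonr(bis, ramsay)                # linear correlation
r_s, p_s = spearmanr(bis, ramsay)               # rank (ordinal) correlation
print(f"Pearson r={r_p:.2f} (P={p_p:.3f}), Spearman rho={r_s:.2f} (P={p_s:.3f})")
print(f"coefficient of determination r^2={r_p**2:.2f}")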
The statistical analysis was carried out by the Jonckheere-Terpstra test. The results are depicted analytically in Table 1. A statistically significant correlation (P = 0.012) was observed between BIS and the Ramsay Scale. The correlation with the Cook Scale was not statistically significant (P = 0.091). Conclusions: BIS is satisfactorily correlated with the Ramsay scale and its readings correspond to the clinical condition of the patient as far as the sedation level is concerned, in contrast to the Cook scale, for which no significant correlation was established. The use of BIS values in the ICU and the corroboration of our results require further study. Entropy quantifies the amount of disorder in a system and characterizes chaotic behaviour. The complexity of a signal can be characterized by spectral entropy, which gives the amount of disorder in frequency space [1]. If an EEG signal includes a wide spectrum of frequencies, its spectral entropy has a high value (near one), and in the case of few relevant frequencies spectral entropy is low (near zero). Spectral entropy has been shown to be an effective tool in measuring depth of anaesthesia [2]. In this study, we investigated whether spectral entropy can distinguish between the different sedation levels corresponding to Ramsay Scores 2, 4, and 6. In order to study spectral entropy during different sedation levels, EEG was recorded from 26 patients scheduled for an elective cardiopulmonary bypass operation with propofol/alfentanil/isoflurane/pancuronium anaesthesia [3]. Postoperative sedation was maintained with propofol to keep the sedation level at Ramsay Score 6 (not responding to any commands) until the patients were hemodynamically stable. EEG was recorded 5 times for each patient: 1 day before the operation (Ramsay Score 2), after premedication 1 hour before the operation (Ramsay Score 2 or 3), immediately after the operation (Ramsay Score 6), after the patient had opened his eyes for the first time (Ramsay Score 4), and the following morning (Ramsay Score 2 or 3). The EEG signal was recorded bipolarly between electrodes Fz-M1, Cz-M2, C3-P3 and C4-P4. It was amplified and digitized continuously at 100 Hz using the Datex-Ohmeda EEG module and stored on a PC for off-line analysis. Spectral entropy values were evaluated for 5 s epochs in two frequency bands: 0.5-32 Hz and 7-32 Hz. Epochs including artefacts were removed from the data before the calculation. Spectral entropy for the range 0.5-32 Hz distinguished, with statistical significance, whether the patient was awake (Score 2) or asleep (Scores 4-6) (P < 0.05). Spectral entropy for 7-32 Hz was able to differentiate sedation levels 4 and 6 (P < 0.001). Sedation levels 2, 4, and 6 could thus be distinguished by using spectral entropy. For comparison, we analyzed whether spectral edge frequency or auditory evoked potentials can distinguish between these levels. These methods failed to separate levels 6 and 4. There was considerable variation in spectral entropy values between patients having the same Ramsay Score. This may be due to the physiological variation of different EEG patterns between individuals. We divided the patients into two groups according to how far the EEG measured on the morning after the operation had recovered compared with the EEG recorded 1 day before the operation. In both recordings the patients were awake.
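A minimal sketch of band-limited spectral entropy as described above, ie the normalized Shannon entropy of the power spectrum within a frequency band, is shown below; this is a generic illustration under those assumptions, not the Datex-Ohmeda implementation used in the study.

# Hedged sketch of band-limited spectral entropy (normalized Shannon entropy of
# the power spectrum); a generic illustration, not the study's implementation.
import numpy as np

def spectral_entropy(signal, fs, f_lo, f_hi):
    """Spectral entropy of `signal` within the band [f_lo, f_hi] Hz,
    normalized to [0, 1]: ~1 for a broad spectrum, ~0 for a few dominant peaks."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    p = power[band] / power[band].sum()          # normalize to a probability mass
    h = -np.sum(p * np.log(p + 1e-12))           # Shannon entropy
    return h / np.log(len(p))                    # scale so the maximum is 1

# Example on a 5-s epoch sampled at 100 Hz (as in the study protocol):
fs, t = 100, np.arange(0, 5, 1 / 100)
broadband = np.random.randn(len(t))              # wide spectrum -> entropy near 1
narrowband = np.sin(2 * np.pi * 10 * t)          # single frequency -> entropy near 0
print(spectral_entropy(broadband, fs, 0.5, 32), spectral_entropy(narrowband, fs, 0.5, 32))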
The spectral entropy values 1 day after the operation were significantly lower in the group in which the EEG had not recovered at all than in the group in which the EEG had almost recovered. Our results indicate that spectral entropy can be a useful tool for assessment of the sedation level of a patient. The performance of spectral entropy in distinguishing Ramsay Score levels 2, 4, and 6 was superior in comparison to spectral edge frequency and auditory evoked potentials. However, in patients with a postoperative EEG significantly slower than the preoperative EEG, spectral entropy remained at a low level and was not able to indicate whether the patient had woken up. A relation between postoperative EEG slowing and mild subclinical cerebral injuries has been discussed by Vanninen et al [4]. Our results suggest that spectral entropy might provide diagnostic information on such a state. Methods: Twenty adult patients expected to require at least 8 h of artificial ventilation following major abdominal/pelvic surgery were randomised to receive sedation with either dexmedetomidine or propofol. Additional analgesia was provided by alfentanil infusions if clinically indicated. Patients received a loading dose of dexmedetomidine 2.5 µg/kg/h over 10 min followed by a maintenance infusion of 0.2-2.5 µg/kg/h. Propofol and alfentanil were infused according to the manufacturers' recommendations. Patients were maintained at a Ramsay Sedation Score > 2 by adjustments to the sedative regimen. Heart rate, arterial and central venous pressures were monitored continuously and recorded at 10 min intervals for the first 30 min and then hourly. Venous samples were taken for haematological and biochemical profiles immediately on arrival in the ICU, and then at 24 h and 48 h. Extubation was performed when clinically indicated and the time from cessation of the sedation infusion to extubation was recorded. Patients completed a questionnaire [1] 48-72 h following discharge from the ICU. Results: There were no differences in patient and ICU characteristics between the two groups. Heart rates were significantly lower (P = 0.034) in the dexmedetomidine group. There were no differences in arterial pressure, central venous pressure and haematological/biochemical profiles between the groups. Conclusion: Dexmedetomidine appears to be safe and acceptable to both clinician and patient in the ICU. Depth of sedation is comparable to propofol and extubation time equally rapid. Furthermore, dexmedetomidine provides analgesia and attenuation of the cardiovascular responses to stress, with the potential to minimise ischaemic events. Supported by Abbott UK. Introduction: To date the pharmacokinetics of the sedative agent dexmedetomidine have only been reported in volunteers [1,2]. This study investigated the pharmacokinetic profile of dexmedetomidine infusions in human patients requiring postoperative sedation and ventilation in the ICU. Methods: Ten adult patients who were expected to require a minimum of 6 hours of postoperative sedation and ventilation were studied. Patients received a loading dose of dexmedetomidine (DEX) of 2.5 µg/kg/h over 10 min (approx 0.42 µg/kg) followed by a maintenance infusion rate of 0.7 µg/kg/h. An alfentanil infusion was commenced if additional analgesia was required. The Ramsay sedation score and Bispectral Index (BIS) were used to measure depth of sedation.
Blood samples were obtained for measurement of plasma concentrations of DEX immediately prior to the start of infusion (t = 0), at 5, 10, 20, 30, 45 and 60 min, and at 2, 3.5, 6, 10, 14, 19 and 24 h if the patient was still receiving a DEX infusion. Samples were also obtained at 0, 10, 25, 40, 60 and 90 min and 2, 3, 4, 5, 6, 12 and 24 h after infusion termination. Plasma DEX concentrations were measured using a gas chromatographic-mass spectrometric method (Oneida Research Services Inc, Whitesboro, NY, USA). Pharmacokinetic parameters were estimated by noncompartmental methods [3]. Introduction: Morphine (MO) was shown to depress immune function in animal models and cell cultures [1]. Virtually no data exist regarding the in vivo effect of MO in human beings. MO is often used for sedation of ventilated ICU patients. We therefore evaluated the effect of MO on the immune system of these patients. The project was a prospective, self-controlled study. Ventilated ICU patients who were in the Unit > 24 hours, demonstrated no signs of acute infection and were considered clinically stable were included. Following exclusive sedation with continuous Midazolam infusion, the first blood sample was taken and MO was added to the regimen, keeping sedation at level 3-4 of the Ramsay scale. Twenty-four hours after the addition of MO, the second blood sample was collected. Leukocytes were analyzed for phagocytosis, oxidative burst and the presence of membrane markers of activation by flow cytometry. Forty-eight and 72-hour blood samples under MO sedation were also analyzed in some patients. Results: Thirteen patients met the inclusion criteria. The leukocyte membrane activation markers CD11b and CD11c showed a significant decrease after 24 hours of MO infusion, 26 ± 13.6% (P = 0.05) and 27 ± 10.0 (P = 0.025), respectively. Other membrane activation markers, CD14, CD18 and oxidative burst, demonstrated a non-significant trend toward a decrease in 6 patients with 72 hours of exposure to MO. These human data suggest that MO, which is a standard sedative agent in the ICU, might compromise the immune efficacy of these patients. Patients (Table 1) without significant renal dysfunction received an initial infusion of either remifentanil (9 µg/kg/h) or fentanyl (1 µg/kg bolus + 1.5 µg/kg/h) in a double-blind manner. Optimal sedation, defined as a Sedation Agitation Scale (SAS) score of 4 (patient calm and easily arousable), with no or mild pain, was achieved by initial titration of the opioid infusion (fentanyl patients also received bolus doses of 1 µg/kg) followed by administration of propofol (0.5 mg/kg/h), if required, according to a pre-defined dosing algorithm specifically designed to reflect the advantages that the study drug offered. Assessments of SAS, pain intensity, mean arterial pressure and heart rate were made every 20 min for the first 6 hours after starting the opioid infusion and then hourly. Results: See Table. Conclusions: The dosing algorithm allowed effective provision of optimal sedation with remifentanil without the addition of propofol in the majority of patients. The similarity of the results for fentanyl probably reflects the stringent conditions of the dosing algorithm, which demanded frequent monitoring and adjustment of the level of sedation to ensure that a SAS score of 4 was maintained. The remifentanil regimen was well tolerated and the safety profile was similar to that of fentanyl.
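For the dexmedetomidine study above, "noncompartmental methods" typically means deriving parameters such as the area under the concentration-time curve and the terminal half-life directly from the measured concentrations. A generic hedged sketch with invented numbers (not the study's analysis or data):

# Illustrative noncompartmental PK sketch (trapezoidal AUC, terminal half-life);
# a generic example with invented numbers, not the study's analysis or data.
import numpy as np

t = np.array([0, 10/60, 25/60, 40/60, 1.0, 1.5, 2, 3, 4, 6])   # hours after stopping the infusion
c = np.array([1.2, 1.05, 0.90, 0.78, 0.66, 0.50, 0.39, 0.23, 0.14, 0.05])  # ng/ml (hypothetical)

auc = np.trapz(c, t)                          # AUC by the linear trapezoidal rule (ng*h/ml)

# Terminal elimination rate constant from log-linear regression over the last points
lz, _ = np.polyfit(t[-4:], np.log(c[-4:]), 1)
half_life = np.log(2) / -lz                   # terminal half-life in hours

print(f"AUC(0-6h) = {auc:.2f} ng*h/ml, terminal t1/2 = {half_life:.2f} h")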
The titratability and predictable duration of action make remifentanil a very effective opioid for the provision of analgesia/sedation in ICU patients. This study (USA30206) was supported by a grant from Glaxo Wellcome. Even after prolonged infusion the effects of remifentanil (R) do not accumulate. Its major metabolite, remifentanil acid (RA), is eliminated by the kidneys and its elimination is prolonged with increasing renal dysfunction. However, RA has been reported to have 1/4600 of the mu-opioid potency of the parent compound. This study assessed the offset of the pharmacodynamic (PD) effects of R in ICU patients with varying degrees of renal dysfunction receiving R for provision of sedation and analgesia. Methods: R (starting rate 6-9 µg/kg/h) was administered as a continuous infusion for up to 72 hours in 40 ICU patients (10 with normal renal function/mild impairment, creatinine clearance ≥ 50 ml/min; 30 with moderate/severe renal impairment, creatinine clearance < 50 ml/min) who required sedation and analgesia. At scheduled times (8, 24, 48 and 72 hours) R was down-titrated until the offset of PD effects (eg changes in sedation, pain intensity, respiratory function or haemodynamic variables) was seen. On confirmation of the offset of PD effects, R was continued at the original rate. Results: See Table. Conclusions: The offset of the PD effects of R was consistent and independent of the duration of infusion, even in patients with a significant degree of renal dysfunction. R was well tolerated in these patients. R may therefore be an ideal agent for provision of sedation and analgesia to patients with varying degrees of renal dysfunction in the ICU. This study (USA30212) was supported by a grant from Glaxo Wellcome. Sedation is an essential part of intensive care medicine. The most common agents used for sedation in intensive care units are midazolam or propofol in combination with opioids. These sedatives have short half-lives, a rapid onset after administration and a short duration of action depending on the dose given. A small proportion of patients in our intensive care unit need long-term sedation. Most of these patients have multiple organ failure caused by septic or cardiogenic shock. Short-acting sedatives are very expensive and a relevant economic factor in intensive care units. Drugs used for long-term sedation should be as safe and as comfortable for the patient as continuously given midazolam or propofol. Patients ventilated for two or more weeks often need several days to awaken after discontinuation of sedation, and they awaken with discomfort after sedation with midazolam. For long-term sedation we have used phenobarbital in combination with fentanyl for more than 5 years in about 40 adult patients. After stabilisation of the patient we switch from continuous application of midazolam to bolus application of phenobarbital 200 mg given every 6-8 hours. A patient with a bodyweight of 70 kg needs about 1000 mg of phenobarbital per day for deep sedation. The level of sedation can be controlled easily. The Ramsay sedation scale should be used to measure the level of sedation. Plasma levels can be monitored. Phenobarbital is a safe drug and patients are comfortably sedated. There is no significant decrease in blood pressure after bolus application. There is no prolongation of sedation, in contrast to continuously given midazolam. All patients woke up comfortably. Objectives: Repetition of self-poisoning and self-injury presents serious problems for several important reasons.
Not only may it represent the establishment of a maladaptive pattern of coping, but it also increases the probability of fatal suicide. The purpose of this study was to investigate the characteristics of patients who attempted suicide repeatedly. This study was undertaken in the emergency department of Fukushima Medical University Hospital. We conducted a retrospective cohort study of consecutive patients attempting suicide over a 4-year period between April 1996 and March 2000. Patients who attempted suicide were identified from the emergency room records, and all records in which the attending doctor diagnosed self-poisoning or self-harm were identified. The total number of suicide attempters was 215 (0.38%), of whom 74 were men and 141 women (male to female ratio 1:1.9). Of these, the number of repeaters was 41, of whom 5 were men and 36 women. Repeaters tended to be younger; in particular, 64% of female repeaters were under 30 years old. Overdoses of drugs and poisons were most frequent in single presenters. In repeaters, repeated overdose was the most frequent pattern; in particular, 63.9% of female repeaters attempted drug overdose. Significantly more repeat presentations in women were triaged to the low-priority categories, and the survival ratio of this group was the highest. There were no consistent trends in the psychiatric diagnoses of repeaters; however, young women were frequently diagnosed with personality disorder. For half of the patients, the first repeat occurred within the 3 months following the original attempt. This study concludes that there are some significant differences between patients who attempt suicide repeatedly and those who attempt on one occasion only. To arrange suitable aftercare, doctors in charge of the emergency department should be trained to assess and manage patients with self-poisoning or self-harm in collaboration with the psychiatric services. Four hours after ingestion, a bolus dose of enoximone 1 mg/kg was administered, followed by a continuous infusion at 0.5 mg/kg/h. During the following hours the patient's clinical condition markedly improved: CO increased up to 6 l/min, HR stabilized at 80 bpm, PCWP decreased to 11 mmHg and CVP to 8 mmHg, blood pressure rose to 135/75 mmHg, SVR decreased to 1045 dyne·s·cm-5, and urine output was stable at 150 ml/h. Adrenaline infusion was progressively reduced and stopped after 24 hours, while enoximone was discontinued on the 5th day. Dopamine infusion was maintained at low rates for 10 days after the ingestion. The patient recovered a normal neurological and hemodynamic status, was extubated and was discharged from the ICU on the 14th day. The most common features of acute beta-blocker poisoning are hypotension, bradycardia and depression of the level of consciousness. Calcium channel blockers, particularly verapamil, have an additive effect with beta-blockers and therefore enhance their cardiovascular toxicity, causing severe cardiogenic shock, which can be, as in this case, unresponsive to even high doses of catecholamines [1,2]. The use of a phosphodiesterase III inhibitor, such as enoximone, can bypass the effect of beta-blockers and restore a sufficient cardiac output without increasing myocardial oxygen consumption, which should be avoided when the past history includes angina pectoris or myocardial infarction, as in this case [3,4]. This report strongly supports the use of enoximone in combined beta-blocker and CCB intoxication and suggests its administration particularly when catecholamines have been shown to be ineffective.
Results: Fifty-four per cent (n = 29) were from the Cape Town urban area, and 46% (n = 25) from rural areas, significantly more than expected for our PICU referral pattern (25% rural, P = 0.0075). Sixty-nine percent (n = 37) were boys and 31% (n = 17) girls (P = 0.039). There was no seasonal variation (winter, n = 25 vs summer, n = 29). The routes of poisoning were ingestion (n = 27, 50%), topical skin contamination (n = 8, 15%), a combination of the above (n = 5, 9%), and unknown (n = 14, 26%). Presenting clinical features included bronchorrhoea (n = 31, 57%), miosis (n = 30, 56%), seizures (n = 16, 30%), and sinus bradycardia (n = 2, 4%). Complications included acute respiratory distress syndrome (ARDS) (n = 2, 4%) and tachyarrhythmia (n = 9, 17%). Patients were treated with atropine (median total dose 0.3 mg/kg, range 0.03-16.7) and obidoxime 4-8 mg/kg. Twenty-nine children (53%) required mechanical ventilation for a median duration of 2 days (1-39). The duration of PICU stay (survivors) was a median of 3 days (1-83). Four children (7%) died. Decontamination prior to PICU admission was associated with a shorter hospital stay (median 3 vs 5 days) for survivors (P = 0.028), but not with a lower rate of complications (P = 0.73) or mortality (P = 0.34). The presence of a tachyarrhythmia was associated with an increased mortality (n = 4/9; 44% vs n = 0/45; 0%) (P = 0.0004). Results: See Table. Conclusion: In the initial nursing of patients with organophosphate poisoning, monitoring (ECG, capnometry) and observation of the QTc interval and EtCO2 are essential, since they help in assessing the patient's prognosis and indicate the need for precaution because of the danger of complications (respiratory failure). Aim: To assess the incidence of myocardial toxicity in paracetamol-induced fulminant hepatic failure in a prospective, controlled trial. Patients admitted with paracetamol (POD) and nonparacetamol acute hepatic failure were studied (subjects and positive controls). Healthy volunteers were enrolled as negative controls for serum sampling. Patients with pre-existing cardiac disease, chronic liver disease or chronic alcoholism were excluded from the study. Cardiac investigations included baseline ECG and transthoracic echo (TTE) on admission, and invasive haemodynamic monitoring and daily cardiac output studies in those in whom a pulmonary artery flotation catheter was clinically indicated. Serum creatine kinase MB isoenzymes and cTnI were followed for the first week of admission. Results: Nineteen patients were enrolled from September 1999 to October 2000. Eleven had paracetamol-induced liver failure and eight had other aetiologies. Thirteen were female and the mean age was 35.79 years (range 17-59). Fourteen patients were admitted to ICU; 12 were ventilated and underwent invasive haemodynamic monitoring. Eleven had intracranial pressure (ICP) monitoring. None of the 13 patients who had TTEs showed evidence of myocardial dysfunction. ECG was normal in 11 patients and showed sinus tachycardia in 7 patients. One patient had terminal ventricular tachycardia. 53% of patients (subjects and positive controls) had cTnI above the upper limit of the laboratory normal range on day 1 and subsequent days of admission (0.67 ± 1.01). cTnI was higher in the paracetamol group than in the positive controls, but this difference was not statistically significant, 1.02 ± 1.2 (POD) versus 0.23 ± 0.24 (non-POD). Negative control values fell within the normal range.
By multivariate analysis there was an independent significant correlation between noradrenaline requirements and cTnI (P = 0.004). In addition, cTnI levels above the normal range were associated with a low LVSWI (P < 0.01), an increased heart rate (P < 0.05) and CVP (P < 0.05). In this study, we have demonstrated that previously fit young patients with acute hepatic failure developed myocardial injury. This was more severe in those treated with noradrenaline. There was a trend towards worse myocardial injury, as evidenced by raised cTnI but not by TTE, in patients with paracetamol-induced hepatic failure. A larger study is required to establish whether the myocardial damage seen in acute liver failure is a direct effect of paracetamol. Within 45 min, the average heart rate came down to 108 beats/min (Figures: average heart rate; serum creatinine after Mg therapy over 5 days). Introduction: Ionized calcium is essential for maintenance of myocardial function and vascular tone. Ionized hypocalcemia is seen frequently in critically ill patients and is associated with a poorer prognosis. It is unclear whether calcium administration is beneficial in these patients. To evaluate the incidence of ionized hypocalcemia (Ca2+ level < 1.0 mmol/l), the time course of serum ionized calcium (Ca2+) in children with septic shock, the effect of calcium administration and the relationship with survival. We retrospectively studied 37 children with septic shock. We analyzed Ca2+ levels on admission and after 8 and 24 hours, the dose of intravenous calcium administration in the first 24 hours and survival. To investigate the contribution of acute renal failure (ARF) to hypercalcaemia in the critically ill. Methods: Twenty-three hypercalcaemic (Gp 1) were compared with six normocalcaemic (Gp 2) mechanically ventilated, critically ill adults. Urinary pyridinoline (Pyr), deoxypyridinoline (Dpyr) and plasma carboxyterminal cross-linked telopeptide of type 1 collagen (ICTP) were measured as markers of bone resorption. Plasma carboxyterminal propeptide of type 1 procollagen (P1CP), bone-specific alkaline phosphatase (BAP) and osteocalcin were measured as markers of bone formation. ARF was defined as the need for renal replacement therapy. For analysis the Mann-Whitney U and Fisher's exact tests were used (significance = P < 0.05). Results: Medians (ranges) for indices of bone resorption and formation, together with the prevalence of acute renal failure, are presented in the Table. Increased bone resorption without an increase in bone formation was demonstrated in both groups, with no significant difference between the groups. ARF was significantly more prevalent in Gp 1. Increased bone resorption leads to efflux of calcium from bone into the plasma pool in hypercalcaemic and normocalcaemic critically ill patients. Our results suggest that ARF may be a contributory factor to the development of hypercalcaemia. Introduction: Several scores have been described to establish the prognosis of patients with acute pancreatitis. The Balthasar CT classification (1) allows mortality to be estimated with considerable certainty according to the glandular compromise seen on CT; however, prognostic certainty is improved if it is combined with other severity criteria such as the Ranson criteria. These associations could show that the degree of SIRS has a closer direct relation with prognosis than the lesion seen on CT.
Objective: To evaluate whether the Balthasar classification has high prognostic power in cases with a CT classification of C, D or E. In addition, other factors that could be associated with morbidity and mortality in patients with acute pancreatitis were evaluated. The records of 49 patients with acute pancreatitis were reviewed. These patients were admitted to our unit between July 1, 1999 and June 30, 2000. Every CT was reviewed by two radiologists to determine the Balthasar classification, and in case of disagreement a third evaluation was requested. The kappa value was established for the CT evaluation. The variables shock, renal failure, MODS and Balthasar grade were evaluated for the prediction of mortality (Table). The review of the Balthasar score had a kappa of 0.88. Our results show a low specificity and positive predictive value of Balthasar scores C, D and E for predicting the probability of mortality. The variables shock, renal failure and MODS showed greater certainty in the prediction of mortality. The Balthasar score by itself does not predict the probability of mortality with certainty. Other variables related to the severity of SIRS, such as shock, predict mortality with greater certainty in patients with acute pancreatitis. Results: Sixty-one patients were admitted to our ICU with ARF caused by crush injury (25 patients) or prolonged positional compression of a muscle group (36 patients). These patients consisted of 55 men and 6 women with a mean age of 40.9 ± 13.4 years, ranging from 19 to 85. All the patients demonstrated kidney failure with increased concentrations of serum urea (13.22-79.40 mmol/l) and creatinine (172-1398 µmol/l). ARF was highly associated with massive muscle damage and insufficient initial fluid resuscitation. The period from the onset of symptoms and signs of the injury to the commencement of treatment with hemodialysis varied from 4 hours to 9 days. Fifty-nine (97%) patients were oliguric. Fifty-eight (95%) of these patients were treated with hemodialysis for 1 to 21 days. Hyperkalemia (5.6-8.1 mmol/l) was present in 38 (62%) patients. In more than half of the cases hyperkalemia was diagnosed before azotemia. Six (9.8%) patients underwent fasciotomies and 6 (9.8%) patients underwent amputations. The outcome was favorable in 43 (70%) patients; 18 (29.5%) patients died. Infection and sepsis accounted for half of the deaths. Conclusions: 1. Hyperkalemia and metabolic acidosis appear before azotemia and within hours of the rescue of casualties with traumatic rhabdomyolysis. 2. Very early, aggressive volume replacement followed by forced solute-alkaline diuresis therapy may protect the kidney against acute renal failure. Results: Eight prospective studies were included (n = 857) and an improvement in survival with biocompatible membranes was demonstrated (OR = 1.37, 95% CI = 1.03-1.84, P = 0.03). Use of biocompatible membranes for dialysis in ARF is associated with improved patient survival compared to the use of bioincompatible dialysis membranes. In an open, randomized, multicenter study, we investigated the effects of RF-bic and RF-lac on cardiovascular outcome in patients requiring CVVH following acute renal failure. Methods: 117 patients between the ages of 18 and 80 years were randomized to CVVH either with RF-bic (N = 61) or RF-lac (N = 56). Patients were treated with CVVH for 5 days or until either renal function was restored or the patient was removed from the study. Data were analyzed on day 5 or according to the 'last observation carried forward' (LOCF) option.
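The inter-observer agreement quoted above for the Balthasar CT grading (kappa = 0.88) is Cohen's kappa. A hedged, self-contained sketch with invented ratings (not the study's data or code):

# Hedged sketch of Cohen's kappa for inter-observer agreement on a CT grade
# (e.g. Balthasar A-E), with invented ratings; not the study's data or code.
from collections import Counter

def cohens_kappa(rater1, rater2):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (po - pe) / (1 - pe)

r1 = list("AABCCDDEEEBCD")   # hypothetical grades from radiologist 1
r2 = list("AABCCDDEEDBCD")   # hypothetical grades from radiologist 2
print(f"kappa = {cohens_kappa(r1, r2):.2f}")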
Results: Blood lactate levels were significantly lower and blood bicarbonate levels were significantly higher in patients treated with RF-bic than in those treated with RF-lac (lactate, 17.4 ± 8.5 vs 28.7 ± 10.4 mg/dl, P < 0.05; bicarbonate, 23.7 ± 0.4 vs 21.8 ± 0.5 mmol/l, P < 0.01). The number of hypotensive crises was lower in RF-bic-treated patients than in RF-lac-treated patients (RF-bic in 14 out of 61 patients, RF-lac in 29 out of 56 patients; 0.26 ± 0.09 vs 0.60 ± 0.31 episodes per 24 h, P < 0.05). Nine out of 61 patients (15%) treated with RF-bic and 21 out of 56 patients (38%) treated with RF-lac developed cardiovascular events during CVVH therapy (P < 0.01). A multiple regression analysis showed that the occurrence of cardiovascular events was dependent on the replacement fluid and previous cardiovascular disease and not on age or blood pressure. Patients with cardiac failure died less frequently in the group treated with RF-bic (7 out of 24, 29%) than in the group treated with RF-lac (12 out of 21, 57%, P = 0.058). In patients with septic shock, lethality was comparable in both groups (RF-bic, 10 out of 27, 37%; RF-lac, 7 out of 20, 35%, P = NS). The results show that the administration of RF-bic solution was superior in normalizing metabolic acidosis without the risk of alkalosis. The use of RF-bic during CVVH reduced cardiovascular events in critically ill patients with acute renal failure, particularly in those with pre-existing cardiovascular disease or heart failure. Infra-renal clamping of the abdominal aorta during aneurysm repair is associated with a significant reduction in renal blood flow. Adenosine has been implicated as a mediator of renal ischemia in a number of human and animal models, and its antagonism with theophylline has led to the attenuation of these effects. In order to investigate the possible attenuation of renal arterial vasoconstriction associated with infrarenal cross-clamping of the aorta, we prospectively randomised 8 consecutive patients undergoing abdominal aortic aneurysm repair to receive aminophylline 5 mg/kg (n = 5) in 500 ml normal saline or placebo (n = 3) on the morning of surgery, followed by an infusion of 6 mg/hour of aminophylline or placebo for 24 hours. Staff, investigators and patients were blinded. Inulin clearance as an indicator of glomerular filtration rate (GFR) was measured pre-operatively and again on the 2nd post-operative day. N-Acetyl-β-glucosaminidase (NAG) activity (mmol PNP/hour/mol creat) and Retinol Binding Protein (RBP) excretion (mg/mol creat) were measured to assess renal tubular damage, and albumin excretion (mg/mol creat) as a marker of glomerular injury. Introduction: The precise pathophysiological role of the natriuretic and diuretic peptide urodilatin remains to be defined. We hypothesized that there is a relationship between renal function and the course of renal urodilatin excretion (UUROV) immediately after cardiac surgery (CS) and - specifically - that a lower UUROV may be observed in patients (pts) showing an increase in PCREA. We determined UUROV, urine flow (UV), plasma creatinine (PCREA) and creatinine clearance (CLCREA) in 61 consecutive pts immediately after cardiac surgery. Blood samples were taken after arrival at the ICU (P0) and after 4-8 h (P1) and 12-16 h (P2); urine was sampled from P0 to P1 and from P1 to P2, respectively. URO was measured with a commercially available RIA (Immundiagnostik, Germany).
Pts were divided into subgroups showing an increase in PCREA of more than 25% (PCREA+ group: n = 10) or not (PCREA- group: n = 51). Results: Baseline variables at P0 were not different between the groups. UUROV was markedly increased in comparison with historical data from healthy volunteers and did not differ between the groups during the observation period, despite a significant decrease of CLCREA in the PCREA+ group (Table 1). Correlation analysis of all pts revealed a significant relationship between the decrease in CLCREA and the decrease in UUROV (r = 0.5, P = 0.006). Conclusions: UUROV is markedly increased immediately after cardiac surgery. The relationship between the decrease in UUROV and CLCREA suggests that URO might play a role in the fluid homeostatic adaptations after cardiac surgery and cardiopulmonary bypass. However, pts showing an increase in PCREA and a decrease in CLCREA cannot be identified by a single determination of UUROV. Renal failure requiring renal replacement therapy (RRT) increases mortality after any operation. Cardiothoracic surgery carries a high risk of post-operative renal failure, but the effect of the management of RRT on mortality is unclear in these patients. The aim of this study was to determine the changes in mortality over a 15-year period in patients requiring RRT following cardiothoracic transplantation. We performed a retrospective review of all 406 patients who received a heart (n = 359) or lung (n = 49) transplant in a single tertiary care centre from November 1986 to October 1999. Two patients underwent a second heart transplantation and one patient received a combined heart/renal transplant during this period. The requirement for RRT has not significantly altered over the time period of the study. However, there has been a dramatic reduction in the 30-day mortality in patients treated with RRT, which is sustained through to the end of the first year. This is likely to be due to earlier implementation of RRT with its associated improvements in nutrition and homeostasis. Conclusion: The SOFA score is an excellent tool to describe the extent of organ dysfunction in critically ill cardiovascular pts. Moreover, the degree of organ dysfunction is associated with ICU-LOS and mortality. Survival rates were higher in pts with TMS ≤ 6; pts with a TMS > 6 were 13.2 times more likely to die. Therefore the SOFA score may be utilised for quality assessment or appraisal of new therapeutic strategies. Introduction: In patients with liver cirrhosis who develop extrahepatic organ failure, hospital mortality rates of 63-100% have been reported [1]. For ethical reasons, but also because of limited resources, physicians need early and reliable outcome predictors to identify cases where aggressive treatment for cure or potential liver transplantation is merited, as well as those where such care is likely futile. We therefore analysed the prognostic accuracy of the Child-Pugh (CP) classification, the Acute Physiology and Chronic Health Evaluation (APACHE) II prognostic system and the Sequential Organ Failure Assessment (SOFA) [2] in predicting the hospital mortality of cirrhotic patients on the first day after admission to a medical ICU. All patients with hepatic cirrhosis admitted to our medical ICU were eligible. Prospectively collected data included demographics, reason for ICU admission, acute diagnosis and mortality rates. Prognostic data were assessed 24 hours after ICU admission.
Discriminative power of the scores was evaluated using the area under the receiver operating characteristic (AUROC) curve. Results: 143 consecutive patients with hepatic cirrhosis were enrolled. 62% were male, median age was 53 years. Hospital mortality was 46%. CP category (A/B/C; n) was 6/40/97, mean CP points 10.1 ± 2, mean APACHE II 20.6 ± 10.7, mean SOFA 8.6 ± 4.7. The total SOFA score on the first ICU day had the best predictive ability (AUROC 0.94, standard error (SE) 0.02). No significant differences were seen between APACHE II (AUROC 0.79, SE 0.04) and CP points (AUROC 0.74, SE 0.04). A cut-off of 8 SOFA points had an overall correctness of 91%, a positive predictive value (PV) of 87% and a negative PV of 96% with regard to hospital mortality. In our population of critically ill patients with cirrhosis the total SOFA score on the first ICU day was found to be a very reliable scoring system to discriminate between hospital survivors and non-survivors. Introduction: Septic shock (SS) is associated with 50% mortality. Severity is usually estimated from indexes of multiorgan dysfunction, but hemodynamic dysfunction, despite its main role, has traditionally been given too little weight. The aim of this study was to test a severity classification for SS according to noradrenaline (NA) requirements. Methods: An algorithm for hemodynamic treatment in SS, which established NA as the first drug (followed by dobutamine or adrenaline as required), was followed prospectively in all SS patients from December 1999 to August 2000. We evaluated APACHE II and SOFA scores, maximum values for C-reactive protein (CRP) and lactate, hemodynamic profiles, and renal, respiratory and hepatic dysfunction. Patients were classified into three groups according to the maximum NA requirement: G1 (mild shock), NA < 0.1 µg/kg/min; G2 (moderate shock), NA from 0.1 to 0.3 µg/kg/min; and G3 (severe shock), NA > 0.3 µg/kg/min. Results are shown in the Table, expressed as mean ± standard deviation. Conclusion: Septic shock severity as assessed by noradrenaline requirement could aid in selecting patients for future trials. A NA requirement higher than 0.3 µg/kg/min is associated with high mortality. Introduction: Organ failure scores were designed to describe organ dysfunction more than to predict outcome. The main difference between these systems is how they evaluate cardiovascular dysfunction, which is the main cause of mortality in septic patients. Objectives: a) to compare the effectiveness of the Sequential Organ Failure Assessment (SOFA) score and of the Logistic Organ Dysfunction System (LODS) score to discriminate outcome in septic patients; b) to determine the best cut-off value for both scores. Design: Prospective, observational study. Setting: Two large general ICUs. Patients: Forty-seven adult septic patients.

Table 1. Time course of the median (interquartile range) SOFA and LODS scores in survivor and non-survivor septic patients

            Survivors     Non-survivors    P
SOFA
Day 0       8 (7-11)      11 (8.5-12)      < 0.05
Day 1       9 (7-10)      13 (9-13)        < 0.01
Day 2       8 (6.5-9)     12 (10-14.5)     < 0.001
Day 3       8 (6-10)      12 (11-14)       < 0.001
Day 10      3 (3-8)       11 (8-11)        < 0.01
LODS
Day 0       6 (4.5-7)     8 (5-9)          < 0.05
Day 1       4 (2.5-7)     7 (6-9)          < 0.01
Day 2       4 (2-7)       6 (3-9)          < 0.01
Day 3       4 (2-6)       7 (6-8)          < 0.01
Day 10      2 (1-6)       7 (6-8)          < 0.01

We measured SOFA and LODS scores at ICU admission and daily. The median scores of survivors and non-survivors for both descriptors were compared using the Mann-Whitney U-test; the relative risk (RR) was also calculated.
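As a generic illustration of the two quantities reported above for the SOFA score, discrimination (AUROC) and the predictive values at a cut-off, the following sketch uses invented data and the scikit-learn AUROC helper; it is not the study's analysis.

# Hedged sketch (invented data, not the study's) of how a score's discrimination
# (AUROC) and the predictive values at a chosen cut-off can be computed.
from sklearn.metrics import roc_auc_score

died = [0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0]        # hospital mortality (1 = died)
sofa = [3, 5, 6, 12, 4, 9, 14, 7, 10, 2, 11, 8]     # hypothetical day-1 SOFA scores

print("AUROC =", round(roc_auc_score(died, sofa), 2))

cutoff = 8                                          # e.g. the 8-point cut-off discussed above
pred = [s >= cutoff for s in sofa]
tp = sum(p and d for p, d in zip(pred, died))
fp = sum(p and not d for p, d in zip(pred, died))
tn = sum(not p and not d for p, d in zip(pred, died))
fn = sum(not p and d for p, d in zip(pred, died))
print("PPV =", tp / (tp + fp), "NPV =", tn / (tn + fn),
      "overall correctness =", (tp + tn) / len(died))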
The mean age was 51 ± 18 years and mean APACHE II was 19.8 ± 6. The best cut-off values for the SOFA and LODS scores were 11 and 6, respectively. The overall mortality rate was 48.9%. Both the SOFA and the LODS score adequately discriminated between survivor and non-survivor septic patients. Objective: To determine predictors of ICU mortality in end-stage renal disease (ESRD) patients treated with haemodialysis or peritoneal dialysis and requiring ICU admission. Design and setting: Retrospective/prospective cohort study in an adult 10-bed medical ICU in a university hospital. Over a 4-year period, out of 104 ICU admissions, 92 ESRD patients were studied. The etiologic diagnosis of ESRD was diabetes mellitus (n = 23), glomerulonephritis (n = 19), hypertension (n = 13), polycystic kidney disease (n = 11), pyelonephritis and obstructive uropathy (n = 6), interstitial nephritis (n = 5), congenital abnormalities (n = 5), others or unknown (n = 10). The prior mean duration of dialysis was 68 ± 83 months; 86 patients were on hemodialysis and 18 on peritoneal dialysis. The admission diagnosis was sepsis (n = 32), cardiac failure/fluid overload (n = 19), hemorrhage (n = 12), postoperative (n = 10), mesenteric ischemia and peripheral arterial thrombosis (n = 7), stroke (n = 6), cardiac arrest (n = 6), hyperkalemia (n = 5), others (n = 7). The mean length of stay in ICU was 6 ± 9 days. The overall ICU mortality was 29.8% (31/104). The survival rate for patients requiring mechanical ventilation was significantly lower than for those not mechanically ventilated: 13/36 (36%) vs 8/68 (12%), respectively (P < 0.0001). There was no significant difference between ICU survivors and non-survivors according to prior duration of dialysis, type of dialysis, and etiology of ESRD. In this target population, the mean SAPS II and APACHE II were 50.3 ± 20.9 and 24.9 ± 9.1, respectively. The discrimination as determined by the area under the receiver operating characteristic curve was not different between SAPS II and APACHE II: 0.859 vs 0.878, respectively (P = 0.62). For both models, the Hosmer-Lemeshow goodness-of-fit test revealed a poor performance. The H test result was P = 0.013 for SAPS II and P = 0.0006 for APACHE II. The C test result was P = 0.008 for SAPS II, and P = 0.005 for APACHE II. The ICU mortality among ESRD patients tended to be higher than that of other ICU-admitted patients during the same period (21.8%), P = 0.063. Conclusions: SAPS II and APACHE II were not well calibrated in ESRD patients. These models probably need to be customized to accurately predict mortality and analyse quality of care or performance among ICUs when applied to this target population. Introduction: There is much interest in outcome prediction for ICU patients and many studies have evaluated the accuracy of predictions made by ICU medical staff [1]. There are few data on the accuracy of outcome prediction made by the ward-based doctors who refer these patients for ICU admission. As part of an ongoing prospective study comparing the accuracy of prediction of hospital mortality by ICU medical staff and referring parent team doctors, we analysed the first 100 completed data sets. The most senior doctor from both the referring parent team and the receiving ICU team was asked to give their prediction of the likely hospital mortality for all emergency adult admissions to our 22-bed ICU.
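Best cut-off values such as those reported above for SOFA (11 points) and LODS (6 points) are often chosen where Youden's index (sensitivity + specificity − 1) is maximal; whether the authors used that criterion is not stated, so the sketch below (synthetic data, scikit-learn assumed available) is illustrative only.

import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
died = np.array([1] * 23 + [0] * 24)                               # 47 septic patients, synthetic outcome
score = np.clip(rng.normal(7 + 4 * died, 3), 0, 22).round()        # synthetic daily organ-failure score

fpr, tpr, thresholds = roc_curve(died, score)
youden = tpr - fpr                                                 # sensitivity + specificity - 1
best_idx = int(np.argmax(youden))
print(f"best cut-off = {thresholds[best_idx]}, "
      f"sensitivity = {tpr[best_idx]:.2f}, specificity = {1 - fpr[best_idx]:.2f}")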
Predictions, as a percentage chance of dying in hospital, were made at the time of referral to ICU; there was no conferring between the ICU and parent teams and all results were confidential. MPM II 0 was scored in all patients. Results: Data were analysed on all patients who had either died in hospital or been discharged. The mean age was 59.6 years, 59% were males, and the overall mortality rate was 38%. The case mix was medical 55%, neurosurgical 18%, and other surgical specialties 28%. Data were ranked in order of predicted mortality and divided into equal deciles. Average predicted mortality (%) and observed mortality (%) were calculated for patients in each decile. Table 1 shows that ICU doctors were better able to predict outcome over a wide range of mortalities compared with referring doctors (Table 2). The latter tended to underpredict mortality except for the patients most at risk of death. The Hosmer-Lemeshow goodness-of-fit test showed a good fit for ICU doctors' predictions (χ2 = 10.24, P = 0.25), but not for referring doctors (χ2 = 31.32, P < 0.001). MPM II 0 predicted mortality poorly in this cohort (χ2 = 18.62, P = 0.02). Conclusion: Doctors who refer patients to ICU for emergency admission are not able to predict accurately the hospital mortality of these patients and tend to underestimate mortality when compared with ICU doctors. Department of Intensive Care, University Hospital of Wales, Heath Park, Cardiff CF14 4XW, UK. Limitations of study entry criteria may be a crucial factor in the failure of immunomodulatory trials in sepsis. Several studies have used very broad inclusion criteria such as septic shock < 12 hours, sepsis syndrome, severe sepsis or, more recently, various combinations of SIRS criteria with at least one organ dysfunction. These criteria may include patients with a wide range of illness severity, reducing the sensitivity of studies using them. Methods: Data analysis was performed for 5400 patients admitted to the Intensive Care Unit from 1994-1999. Patients fulfilling SIRS criteria were stratified according to the number of organ systems that had failed [1] and the duration of failure of those systems on the first day of SIRS. Mortality within the groups was compared using the chi-square test. Numbers of organs failed (and duration of those organ failures) at the onset of SIRS were compared in survivors and non-survivors of ICU using the Mann-Whitney U-test. Results: 3259 (60.4%) patients developed SIRS; 1077 (33%) died. Presence of temperature, WBC count, heart rate or respiratory criteria alone or in combination did not influence ICU outcome. Of the 1192 patients manifesting with no organ system failure (OSF) on the day of onset of SIRS, 140 (11.7%) died. Organ dysfunction evolved frequently in association with SIRS (63.4%). Non-survivors exhibited increased numbers of organ systems failed (1.7 ± 1.0 vs 0.7 ± 0.8, P < 0.001 [mean ± standard deviation]) and numbers of days of organ failure (0.87 ± 0.3 vs 0.5 ± 0.5 days, P < 0.001). Although individual organ system dysfunction had low specificity and sensitivity, the extent of OSF had a significant impact on outcome (Table). Organ dysfunction occurs commonly with SIRS and in close temporal proximity with it; its duration and magnitude are strongly associated with increased mortality. Inclusion criteria employing the severity of organ dysfunction may help limit selection bias in sepsis trials.
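The decile-based Hosmer-Lemeshow calculation used for the mortality-prediction comparison above can be sketched as follows; the helper function, predictions and outcomes below are illustrative, not the study's data.

import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(predicted, observed, n_groups=10):
    """Hosmer-Lemeshow chi-square and P value for predicted probabilities (0-1)
    against observed binary outcomes, using risk deciles; df = n_groups - 2 by convention."""
    order = np.argsort(predicted)
    groups = np.array_split(order, n_groups)
    stat = 0.0
    for g in groups:
        exp_events = predicted[g].sum()
        exp_nonevents = len(g) - exp_events
        obs_events = observed[g].sum()
        obs_nonevents = len(g) - obs_events
        stat += (obs_events - exp_events) ** 2 / max(exp_events, 1e-9)
        stat += (obs_nonevents - exp_nonevents) ** 2 / max(exp_nonevents, 1e-9)
    dof = n_groups - 2
    return stat, chi2.sf(stat, dof)

rng = np.random.default_rng(3)
predicted = rng.uniform(0.05, 0.95, 100)                            # predicted % chance of death / 100 (synthetic)
observed = (rng.uniform(size=100) < predicted).astype(int)          # synthetic hospital outcomes
print(hosmer_lemeshow(predicted, observed))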
Previous clinical and experimental studies examining gender-related differences in infectious complications and mortality have provided inconsistent results. Some have shown a better, some a worse prognosis in women. In a small study an increase in β-estradiol has been demonstrated in men. Therefore our purpose was to study gender hormones in patients with severe infection with regard to 28-day mortality in a prospective study. We included 145 patients (39.3% women) with a median age of 60 years, an APACHE II score of 19 and a survival rate of 64.7% in men and 62.1% in women. At admission, there was no significant difference in demographic data, laboratory signs of infection (temperature, C-reactive protein, procalcitonin, leukocyte count, platelets), location of infection or incidence of chronic immunosuppression, APACHE score or outcome between men and women. In addition, hormonal analysis revealed no difference between sexes in β-estradiol, testosterone, cortisol or thyroid-stimulating hormone, whereas the gonadotropins were significantly higher in women, indicating that they were postmenopausal (67% older than 55 years). In a univariate analysis we found a significant difference between survivors and non-survivors in β-estradiol (P < 0.001), without difference between genders, and in temperature, C-reactive protein and procalcitonin at the day of admission, whereas testosterone, cortisone, dehydroepiandrosterone, thyroid-stimulating hormone, gonadotropins and prolactin were not different between groups. We compared the results of the radioimmunoassay with HPLC and found perfectly matched results. β-Estradiol was elevated up to 25 times the normal range even at the day of admission. β-Estradiol increased further in the next days, reaching a maximum at day 4-7 and decreasing thereafter. Patients on chronic steroid therapy had no different β-estradiol level than untreated patients. β-Estradiol was not influenced by body weight. Even women after hysterectomy and ovariectomy had elevated β-estradiol levels in plasma, indicating that the gonads are not the source of β-estradiol. In the Kaplan-Meier survival analysis, increasing β-estradiol was highly associated with increasing mortality for both sexes, men and women (P < 0.001). Conclusion: Men and women had similar survival rates. In both men and women β-estradiol influenced outcome. Objectives: To determine the incidence density and annual point prevalence, mortality and main factors for death in acute respiratory distress syndrome (ARDS) in our patients. Cohort study, retrolective and prolective. Study period: all admissions to the intensive care unit (ICU) in 1 year (March 1, 1999 to February 29, 2000). Zero point was considered the admission to the ICU, continuing follow-up until the outcome: discharge from the ICU or death. We included all those patients who fulfilled criteria for ARDS according to the American-European consensus (1994). Risk factors were defined as all factors associated with ARDS during the first 24 hours and were classified as direct or indirect. The following variables were defined: age, gender, number of days in ICU and hospital, comorbidity, APACHE II (admission), factors associated with the development of ARDS, time between risk factors and start of mechanical ventilation and duration of the same with complications, lowest PaO2/FiO2, maximum PEEP, use of Swan-Ganz catheter and amines (dopamine > 5 µg/kg/min, norepinephrine and epinephrine), type of nutrition and causes of death.
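The Kaplan-Meier comparison described in the gender-hormone abstract above could be set up as below. The sketch assumes the lifelines package is available and uses entirely synthetic survival times stratified by an arbitrary β-estradiol cut-off; it is not the authors' analysis.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
high_estradiol = np.array([True] * 70 + [False] * 75)               # above vs below an arbitrary cut-off (synthetic)
time_to_event = np.minimum(rng.exponential(np.where(high_estradiol, 15.0, 45.0)), 28)   # days, censored at day 28
died = (time_to_event < 28).astype(int)

kmf = KaplanMeierFitter()
for label, mask in [("high estradiol", high_estradiol), ("low estradiol", ~high_estradiol)]:
    kmf.fit(time_to_event[mask], event_observed=died[mask], label=label)
    print(label, "28-day survival:", round(float(kmf.survival_function_.iloc[-1, 0]), 2))

result = logrank_test(time_to_event[high_estradiol], time_to_event[~high_estradiol],
                      event_observed_A=died[high_estradiol],
                      event_observed_B=died[~high_estradiol])
print("log-rank P =", result.p_value)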
Statistical analysis: SPSS 9.0 was used; univariate analysis with χ2 and Fisher's exact tests. P < 0.05 was considered significant. Results: 550 cases were reviewed. Forty fulfilled criteria for ARDS (18 men and 22 women). The point prevalence was 8.8% and the incidence was 19 per 100 annually. The mortality was 55%. Of the evaluated variables, P < 0.05 was found between survivors and non-survivors for: the number of days in ICU and hospital stay, APACHE II (admission) and lowest PaO2/FiO2. The associated risk factors found were direct (30%), indirect (45%) and both (25%). The point prevalence was 8.8%. The incidence was 19 per 100 annually. ARDS is an important cause of death (55%) and the independent variables for mortality found were APACHE II > 18 and lowest PaO2/FiO2 < 100, with OR 50.66 and 95% CI 7.51-341. Introduction: Acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) carry a high morbidity and mortality (10-90%). ALI is characterised by non-cardiogenic pulmonary oedema and refractory hypoxaemia of multifactorial aetiology [1]. There are limited data about outcome, particularly in children. Methods: This retrospective cohort study of 85 randomly selected patients with respiratory failure recruited from a prospectively collected database represents 7.1% of 1187 admissions. They include those treated with high frequency oscillation ventilation (HFOV). The patients were admitted between 1 November 1998 and 31 October 2000. A crude mortality of 14.3% compares favourably to published data. The A-a gradient and PaO2/FiO2 ratio may be of help in morbidity scoring in paediatric ARDS. Use of nitric oxide and HFOV is associated with increased mortality, which probably relates to the severity of disease. Multiple organ failure, particularly respiratory and cardiac disease, is associated with increased mortality. ARDS with isolated respiratory failure carries a good prognosis in children. Introduction: The pathophysiology of respiratory failure in COPD is highly different from that of other diseases causing chronic respiratory failure. Respiratory muscle dysfunction, worsened by dynamic hyperinflation, malnutrition and electrolyte disturbances, is a major contributor to weaning failure and prolonged mechanical ventilation in these patients. The aim of this study was to investigate whether there is any difference in patients' characteristics in the ICU between COPD and non-COPD causes of chronic respiratory failure requiring mechanical ventilation during acute exacerbations. Method: Forty-six patients with chronic respiratory failure and acute exacerbation were included in the study. Twenty-eight of them had COPD (group 1) and 18 patients had chronic respiratory failure caused by non-COPD diseases (group 2) (bronchiectasis, n = 3; asthma, n = 3; diffuse interstitial fibrosis, n = 6; neuromuscular diseases, n = 6). All patients required mechanical ventilation. Patients' characteristics assessed in this study are given in the Table. Noninvasive mechanical ventilation (NIMV) was tried in 56% of patients in group 1 and 48% of patients in group 2. 58% of patients in group 1 and 40% of patients in group 2 were taking long-term O2 therapy. Patients with COPD were significantly older and had significantly higher APACHE II scores than group 2.
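The univariate analysis described for the ARDS cohort above (χ2 and Fisher's exact tests, plus an odds ratio with a 95% CI for a dichotomised predictor such as APACHE II > 18) can be sketched as follows; the 2 x 2 counts are illustrative only and are not the study's data.

import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

#                 died  survived
table = np.array([[18,    4],       # e.g. APACHE II > 18   (illustrative counts)
                  [ 4,   14]])      # e.g. APACHE II <= 18

chi2_stat, p_chi2, dof, _ = chi2_contingency(table)                 # chi-square test (with Yates correction)
odds_ratio, p_fisher = fisher_exact(table)                          # Fisher's exact test and odds ratio

# Woolf (log) method for an approximate 95% confidence interval of the odds ratio
log_or = np.log(odds_ratio)
se_log_or = np.sqrt((1 / table.astype(float)).sum())
ci = np.exp(log_or + np.array([-1.96, 1.96]) * se_log_or)
print(f"chi2 P = {p_chi2:.3f}, Fisher P = {p_fisher:.3f}, "
      f"OR = {odds_ratio:.1f} (95% CI {ci[0]:.1f}-{ci[1]:.1f})")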
Their blood phosphorus levels were significantly lower than in group 2. In COPD patients the duration of mechanical ventilation and the requirement for tracheostomy were significantly lower than in group 2. While there was no significant difference in Pseudomonas infections between the groups, ventilator-associated pneumonia (VAP) caused by Acinetobacter species was significantly less frequent in group 1 than in group 2. There were no significant differences between the groups in hospital mortality rates, length of ICU and hospital stay, and sepsis rates. These results showed that there might be some differences in the weaning process and ICU infections between COPD and non-COPD diseases in the ICU, but they do not cause any significant difference in mortality, sepsis rates, or length of ICU and hospital stay. Aim: The aim of our study was to evaluate factors that would help predict survival in patients with haematological malignancy who required admission to the intensive care unit. We retrospectively analysed data on patients with haematological malignancy admitted to intensive care over a 5-year period. We identified 65 haematological patients admitted for > 1 day to intensive care (Table). Discussion: There were no differences in terms of the demographic data or severity of illness scores on day 1, i.e. APACHE II and organ failure scores, organ failure days and the P/F ratios, between the two groups, survivors and non-survivors. However, by day 3 there was a significant difference between the survivors and the non-survivors in terms of improvement of the APACHE II and organ failure scores, organ failure days and the P/F ratios. Conclusion: An improved outcome was associated with an improvement in the day 3 APACHE II and organ failure scores, organ failure days and the P/F ratios; failure to improve was associated with a poorer outcome. All survivors could be identified by day 3 of ICU admission. Introduction: Congenital diaphragmatic hernia (CDH) is a severe disorder in neonates. The prognosis has been improved in the past 10 years by a combination of HFO ventilation, sedation and analgesia, nitric oxide and delayed surgery. Extracorporeal membrane oxygenation (ECMO) has been proposed by certain teams. A decrease in mortality from 70% to approximately 40% has recently been reported [1,2]. However, little is known about the outcome of such patients. We report the outcome of a group of patients after 10 years' follow-up. A retrospective review was undertaken of neonates admitted to the paediatric intensive care unit after 1 January 1992 (the date at which new treatment methods were introduced in the department) and operated on for congenital diaphragmatic hernia. Results: Nineteen neonates were reviewed, of whom seven died (36%); 74% had a left-sided hernia. Two of the 12 remaining survivors had other malformations (cardiac and urogenital). Mean age at the time of surgery was 17 hours. Median ventilation duration was 6 days. Median duration of intensive care was 12.5 days. 58% of children were discharged from the intensive care unit with oxygen therapy and the median hospitalization duration was 31 days. Mean age at the last consultation was 18 months (SD 18). Six infants required readmission and four had severe respiratory disorders (one with severe bronchiolitis, one with asthmatiform bronchitis and one with chronic clinical respiratory insufficiency). One infant was hospitalized for more than 12 months and required left pneumonectomy and tracheotomy. Six infants had gastro-esophageal reflux, of whom one required surgery.
This infant also had residual hiatus hernia and scoliosis. Three infants had neurological sequelae (one with psychomotor retardation and two with overall hypotonia, one of whom had problems with swallowing). Our results showed that 1) the mortality was similar to that reported by the other two French series [1,2] (36%) despite the lack of use of ECMO, and 2) medium-term morbidity was low compared to that reported after ECMO [1]. ± 14.9 years. Sixteen (34%) patients died. Of the 16, 11 (68.8%) were admitted with a GCS ≤ 4. 11 (35.5%) of the survivors had an admission GCS ≤ 4. MVAs accounted for 66% of head injuries, followed by assault injuries at 25.5%. 81.25% of the patients died as a result of the primary brain damage. No association could be established between poor outcome and the presence of concomitant injuries, non-operative management or the number of brain lesions. Mortality from head trauma is high. An initial low GCS (≤ 4) is associated with poor outcome. A few patients with an initial low GCS do recover fully. Results: Total mortality was 38.4%. There was a statistically significant difference (one-way analysis of variance, ANOVA) regarding: 1) age: groups A, B and C versus E; 2) GCS: group A versus E; 3) CT-scan grade: group A versus D and E; 4) ISS: group C versus D and E; and 5) APACHE II score: group C versus E (Table). Patients with head injury or multiple trauma had a better outcome than patients with cerebral hemorrhage (Fig.). Age, GCS and CT-scan grade were related to patient outcome regarding life or death, whereas diagnosis, age, ISS and APACHE II determined the severity of disability. (Figure: relation of diagnosis to GOS.) Background: A difficulty in modeling survival after sepsis is that hazards may not be proportional, thus violating a key assumption of traditional Cox survival models. We modeled survival after sepsis using Gray's approach, a new spline-based technique that does not rely on the proportional hazards assumption. We then compared hazard ratios over time between Gray's and Cox models. Hypothesis: Gray's model will yield different estimates of hazards over time in sepsis when compared to Cox. We analyzed 1090 patients recently enrolled in a US multicenter sepsis trial. We considered 26 potential baseline demographic and clinical risk factors and modeled survival over the first 28 days from the onset of sepsis. We tested proportionality in univariate Cox analysis using Schoenfeld residuals and log-log plots. We then constructed a standard multivariate Cox model and a Gray's model. We evaluated the validity of the proportional hazards assumption in the predictors selected by the Cox model. We compared the selection of predictors by both models. Results: Twenty-eight day Cox univariate analysis demonstrated that 9 of 26 factors had non-proportional hazards. A multivariate Cox model identified 7 significant predictors: 4 predictors with non-proportional hazards (presence of comorbidity, hypotension, acute renal failure, and chronic liver disease) and 3 predictors with proportional hazards (Pseudomonas etiology, no identified etiology and pulmonary site of infection). Gray's model also identified seven risk factors. Age was a significant predictor, while a urinary site of infection portended a significantly better prognosis. Three of the common risk factors between the two models had non-proportional hazards (presence of comorbidity, hypotension, and acute renal failure [ARF]).
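Gray's spline-based model has no standard Python implementation, so the sketch below covers only the Cox side of the comparison described above: fitting a multivariable Cox model and testing the proportional-hazards assumption with scaled Schoenfeld residuals. The lifelines package is assumed to be available, and the data and covariate names are synthetic.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(5)
n = 1090
df = pd.DataFrame({
    "hypotension": rng.integers(0, 2, n),                           # synthetic binary baseline covariates
    "acute_renal_failure": rng.integers(0, 2, n),
    "comorbidity": rng.integers(0, 2, n),
})
risk = 0.5 * df.sum(axis=1)                                         # simple synthetic risk index
df["time"] = np.minimum(rng.exponential(30 / np.exp(risk)), 28)     # days to death, censored at day 28
df["died"] = (df["time"] < 28).astype(int)

cph = CoxPHFitter().fit(df, duration_col="time", event_col="died")  # multivariable Cox model
print(cph.summary[["exp(coef)", "p"]])                              # hazard ratios and P values

# Schoenfeld-residual test of the proportional-hazards assumption, per covariate
ph_test = proportional_hazard_test(cph, df, time_transform="rank")
print(ph_test.summary)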
The figure demonstrates that Gray's model captured the large variation (i.e. non-proportionality) of the hazard ratio for ARF over time. Conclusion: Accurate survival models must take into account the observation that mortality risk factors have non-proportional hazards. Of several alternatives to a standard Cox model, Gray's model appears particularly promising. The aim of the study was to compare the outcome of geriatric patients (≥ 65 years of age; Group 1) with young ones (< 65 years of age; Group 2), and to compare the outcome of elderly patients (65-74 years of age; Group A) with the very old (≥ 75 years of age; Group B) in the intensive care unit (ICU). In a 12-month period, data on the 245 patients who were hospitalized for more than 24 hours in the ICU were prospectively collected. The patients were followed until death or discharge from the hospital. Disease severity was assessed using the APACHE II score from which the age factor was subtracted (APACHE II - age), and the predicted mortality was calculated using the original APACHE II score. The comparison of Groups 1 and 2 is shown in the Table, with the results as mean ± SE or as n (%). Although Groups A (N = 69) and B (N = 50) had similar APACHE II scores (22.1 ± 0.9 vs 23.5 ± 1.1), Group B tended to have an increased ICU mortality rate and had a higher hospital mortality rate when compared with Group A (48% vs 32%, P = 0.06; 60% vs 33%, P = 0.003). In conclusion, geriatric patients, in general, had ICU and hospital mortality rates similar to those of the young patients with comparable disease severity. However, the ICU and hospital mortality rates of the patients more than 74 years of age were found to be higher than those of the patients who were 65-74 years of age. Introduction: Cardiac failure is a potential driver of multiple organ failure in long-term ICU patients [1]. The aim of our study was to monitor the incidence of cardiac failure on admission in long-term ICU patients. The early course (first 72 hours) of cardiac failure was also evaluated. Materials and methods: The ICU database was used for data acquisition. Long-term patients were defined as those who survived > 3 days in the ICU. Cardiac failure was defined as a cardiac SOFA score ≥ 3. The chi-square test, Fisher's exact test, Mann-Whitney U-test and MANOVA for repeated measures were used when appropriate. Data are presented as means ± SD. P < 0.05 was considered significant. Out of 110 patients admitted from January 1 to October 15, 72 (65%) stayed in the ICU > 3 days. Forty-six patients (65%) survived and 26 died. Survivors (S) and non-survivors (NS) did not differ in age (55.2 ± 15.9 and 60.5 ± 15.9 years, respectively; P = 0.18). S had a significantly lower APACHE II score on admission than NS (24.2 ± 7.2 and 29.2 ± 7.0, respectively; P < 0.01). S had a significantly lower incidence of cardiac failure on admission (< 24 hours) compared with NS (13 [28%] and 17 [65%], respectively, P < 0.001). This difference was attenuated but remained significant by day 2, when an additional 5 S developed cardiac failure (8 developed, 3 recovered) and there was no change in NS (P < 0.05). Non-survivors had a trend towards more severe forms of cardiac failure (cardiac SOFA score of 4) in the first two days of hospitalisation (P = 0.1). The course of cardiac failure during the first 3 days of ICU stay did not differ between S and NS (MANOVA time effect P = 0.23).
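A sketch of the non-parametric comparison used in the cardiac-failure abstract above (Mann-Whitney U-test of admission APACHE II scores in survivors vs non-survivors); the scores below are synthetic, drawn only to mirror the reported group means, not the study's data.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(6)
apache_survivors = np.clip(rng.normal(24, 7, 46), 0, 71).round()        # 46 survivors (synthetic)
apache_nonsurvivors = np.clip(rng.normal(29, 7, 26), 0, 71).round()     # 26 non-survivors (synthetic)

u_stat, p_value = mannwhitneyu(apache_survivors, apache_nonsurvivors, alternative="two-sided")
print(f"U = {u_stat:.0f}, P = {p_value:.3f}")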
Only 2 survivors without cardiac failure during the first 48 hours (n = 25) developed cardiac failure during their further ICU stay (both later than 72 hours after admission). Long-term ICU patients who do not survive have a greater incidence of cardiac failure on admission. Nevertheless, the course of cardiac failure during the first 3 ICU days does not differentiate between survivors and non-survivors. Liver transplantation (LT) patients have problems involving different organ systems which have a critical role in the success of the procedure. The purpose of this study was to evaluate the factors contributing to the duration of mechanical ventilation (MV) and the length of stay in ICU after LT. We evaluated 53 cadaveric liver transplantations performed using a simple cross-clamping technique without venovenous bypass. All the patients were transferred to the multidisciplinary ICU for early postoperative care, where all vital organ systems were monitored. Twenty-one different factors were evaluated by the Mann-Whitney U-test, Student's t-test and regression analysis. All values are given as mean ± SD. The mean age of the patients was 36.6 ± 15.4 years (F:M 18/35). The preoperative mean Child-Pugh score of the patients was 10.9 ± 2.3; the perioperative course had a mean operation time of 342 ± 113 min and a mean anhepatic period of 80 ± 36 min. Mean duration of MV and length of stay in ICU were 22 ± 22 hours and 50 ± 39 hours, respectively. The factors in Table 1 prolonged the duration of MV and the length of ICU stay. Regression analysis showed no correlation of the duration of MV and/or the length of ICU stay with age, Child-Pugh scores, duration of operations and anhepatic periods, total protein, glucose and bilirubin levels, prothrombin times or the need for transfusion, except for the SGOT (r2 = 0.18, P < 0.001) and SGPT (r2 = 0.22, P < 0.001) levels, which were associated with longer durations of MV. During the early postoperative course of OLT patients, clinical factors such as the presence of a hypoxemic and/or hypercapnic period, difficulties in weaning from MV, hemodynamic instability, increase in plasma urea and/or creatinine levels, decrease in plasma calcium levels, and postoperative fever might be indicative of a prolonged ICU stay and a possible increase in morbidity. (Table 1: factors that affect the duration of MV and the length of ICU stay (hours, mean ± SD).) APACHE II scoring: 17.78 ± 0.53 (mean ± SE), multisystem organ failure (MSOF): 1 (0-2) (median and 25-75% interquartile range), a simplified organ failure index (s-OFI): 1 (1-2) (median and 25-75% interquartile range).
ICU mortality: 25.4% (52), hospital mortality: 33.7% (69); body mass index (HR 0.97; 95% CI: 0.93-0.99), sepsis with or without associated pneumonia (95% CI: 1.1-2.56), and the presence of chronic respiratory disease. Female gender, with an attributable risk of hospital mortality of 14.8% (95% CI: 1.0-28.7%), an attributable fraction in the exposed population of 34 and an adjusted odds ratio (OR) of 2.80 (95% CI: 1.22-6.41), and degree of malnutrition (OR 2.80 48) increased hospital mortality risk. Objective: To determine accidental withdrawal (AW) of tubes, sounds and catheters. Design: Prospective observational study. Setting: A 20-bed medical-surgical Intensive Care Unit (ICU). Mean age was 57.64 ± 16.71 years. APACHE II was 12.48 ± 5.52. Mortality was 15.81%. Patient distribution was: 48% cardiac surgery, 14% cardiologic, 10% neurologic, 8% traumatology, 7% pulmonary, 6% digestive and 7% others. Introduction: Propofol (Diprivan®, AstraZeneca) is a surgical anesthetic and an intensive care sedative that contains 0.005% disodium edetate (EDTA) as an antimicrobial agent. EDTA is also a chelating agent that may affect the function of the parathyroid-calcium axis, predisposing patients to the development of hypocalcemia. Purpose: To compare the effect of propofol with and without EDTA on the parathyroid-calcium axis in normal healthy volunteers. In a randomized, double-blind, age-stratified, crossover trial, 50 normal subjects were randomly treated with propofol or propofol EDTA as a bolus containing 2 mg propofol/kg iv (1 mg/kg if aged > 65 years), followed by randomly selected infusions (25, 50, 100, or 200 µg/kg per min). The alternate treatment was given 15 to 29 days later. Changes in ionized Ca, total Mg, and intact parathyroid hormone (PTH) levels were measured. The normal range for PTH is 9 to 46 pg/ml. Eighteen women and 32 men were equally distributed among 3 age groups (19-34, 35-65, > 65 years). Ionized Ca and total Mg remained within the normal range for both treatments throughout the study. However, PTH levels significantly increased from baseline (40.7 ± 19.8 pg/ml and 40.4 ± 16.7 pg/ml for propofol and propofol EDTA, respectively) to 54.3 ± 24.7 pg/ml and 55.8 ± 23.0 pg/ml, respectively (P < 0.05), 4 min after the bolus injection and returned to baseline within 60 min. Propofol infusions significantly (P < 0.05) increased PTH levels in a stepwise fashion. PTH levels increased 31% and 43% for the 100 and 200 µg/kg per min infusions of propofol, respectively.
These PTH levels are similar to those seen in hyperparathyroidism and hypocalcemia. Age did not affect PTH responses. Propofol was associated with a dose-dependent increase in PTH levels that was not related to changes in ionized Ca, total Mg, or EDTA. Conclusions: ICU patients undergoing prolonged MV represented an important subset of the whole ICU population (one third of total admissions, half of them medical category). Prolonged MV was associated with a relatively high short-term (ICU and hospital) mortality and prolonged ICU and hospital LOS. Gender category, chronic health condition and several early-acquired clinical data successfully predicted both the duration of MV and short-term mortality. Severity scoring indexes behaved as useful tools to predict the duration of MV (APACHE II) and the risk of mortality (SAPS II). Severity-of-illness scores, use and duration of mechanical ventilation, extrarenal replacement (ERR) and amines were significantly associated with mortality. Interestingly, only hepatic, neurologic and circulatory failures at admission were also significantly associated with a poor outcome. Mortality rates in patients with no organ support and with 1, 2 and 3 supported organs were 0%, 66.66%, 86.66% and 92.59%, respectively. Type of disease was not associated with an increased mortality rate. Performing chemotherapy in the ICU, with ERR if necessary, seems possible with no harmful consequences. These results could suggest that the underlying haematological disease has no major influence on early outcome, but that prognosis is mainly determined by acute physiologic changes induced by sepsis, as reflected in severity-of-illness scores [1]. New therapeutic strategies based on earlier referral to ICU, reliable markers of organ dysfunction and aggressive treatment should be tested prospectively to ensure optimal management and a better prognosis for these patients. Intensive Care Unit, Royal Marsden Hospital, London SW3 6JJ, UK. Introduction: Patients with haematological malignancy who develop respiratory failure have a very poor prognosis [1,2]. The presence of leukopenia at the time of admission has long been suspected to be associated with worse outcome, but no published study has convincingly demonstrated this. Prospectively entered admission and 12-month follow-up data from 1222 admissions to an oncological ICU over a 7-year period ending June 1998 were reviewed. We identified 231 haematological (leukaemia, lymphoma or myeloma) admissions. Patients who had received stem cell transplants (SCT) were identified from a separate database and sub-set analyses were performed. A cross-query of the central haematology laboratory database was set up to determine the total white cell count of each patient prior to and on admission. Leukopenia was defined as an absolute white cell count of < 1.5 × 10⁶/ml. Statistical analyses were by Fisher's exact test. Results: 152 (66%) of the 231 patients died on the unit; a further 26 died within 30 days of leaving the ICU, a total hospital mortality of 77%. 50/107 (48%) of the neutropenic patients were SCT recipients. Excluding 20 patients with acute leukaemia (white cell counts 49.5-850 × 10⁶/ml), 37/100 (37%) of non-neutropenic patients were SCT recipients (P = 0.16). Haemato-oncological patients have a significantly greater mortality if they are leukopenic at the time of ICU admission: this is not solely attributable to the increased proportion of stem cell transplant recipients in this group.
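The Fisher's exact comparison of mortality by leukopenia status described above can be sketched as below; the 2 x 2 split is illustrative (the abstract does not give the full table), so the printed values are not the study's results.

from scipy.stats import fisher_exact

#                  died  survived
leukopenic     = [   90,      17]        # illustrative split of the leukopenic admissions
non_leukopenic = [   68,      36]        # illustrative split of the remaining admissions

odds_ratio, p_value = fisher_exact([leukopenic, non_leukopenic])
mortality_leuko = leukopenic[0] / sum(leukopenic)
mortality_non = non_leukopenic[0] / sum(non_leukopenic)
print(f"mortality {mortality_leuko:.0%} vs {mortality_non:.0%}, OR {odds_ratio:.1f}, P = {p_value:.4f}")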
There is an extremely poor prognosis and a strong trend toward increased mortality in patients with pre-admission leukopenia of more than 7 days' duration. We report the results of a retrospective study of the intensive care outcome of 196 children admitted with malignancy at a tertiary referral institution over a period of 10 years, from 1988 to 1997. A total of 165 children required 196 admissions for a median of 3 days. Their mean age was 5.82 years and there were 100 boys (51%) and 96 girls (49%). Their median admission APACHE II score was 18. A total of 150 children (76.5%) survived intensive care. Non-survivors had a higher APACHE II score than survivors (23 vs 15, respectively, P < 0.001). Patients were divided into those needing postoperative care (n = 55), patients with respiratory infection (n = 39), systemic infection (n = 30), neurological complications (n = 20), respiratory failure with no evidence of infection (n = 17), metabolic effect (n = 13), tumour mass effect (n = 9), GI bleed (n = 5), cardiac failure (n = 4) and post cardiac arrest (n = 3). The overall survival, defined as those who survived 1 week after discharge from PICU, was 73.4%. Invasive monitoring, including arterial and central venous pressure lines, was used in 143 (72.9%), mechanical ventilation was required for 133 (67.8%), inotropic support for 66 (33%), pulmonary artery flotation catheter insertion and monitoring in 23 (11.7%) and renal replacement therapy for 13 (6.6%) children. The profile of diseases in children admitted to the PICU appears to have changed since the last report from this unit in 1992 [1]. The most common reason for admission was the need for postoperative care (28%) and survival in this group was 100%. There is also a significant improvement in the survival rate of patients with systemic infections (63%) needing ventilatory support and of children with respiratory failure (with or without infection) (67%) [2]. Objective: To predict cerebral outcome after cardiac arrest (CA) by clinical neurological examination. We conducted a prospective study, started in August 2000, including all patients who had a return of spontaneous circulation (ROSC) after CA. All the resuscitation attempts were registered using a form based on the Utstein style template. A neurological evaluation was performed using a protocol that included GCS, brain stem reflexes, spontaneous eye movements, spontaneous mobility, breathing and seizures immediately after CPR; at 12, 24, 48 and 72 hours; 8 days later; and at discharge. Twenty-three patients were included in the protocol, corresponding to 68 resuscitation attempts (a ROSC rate of 47%). We found 72 hours to be the best time for evaluation because before that most of the patients were sedated. All patients with either GCS < 5 or absence of one or more brain stem reflexes at any time of evaluation died without any neurological recovery. The patients who had, at the third day, a GCS > 9 and/or oriented eye movements and/or oculocephalic reflexes were discharged from hospital in 63.6% of cases. Seven patients, all from the last group of patients, had a complete neurological recovery at discharge. The mean duration of cardiac arrest was 4.7 min in the group with a GCS score above 13, 11 min in the group with a GCS score between 5 and 13, and 12 min in the group with a GCS below 5. Conclusion: Neurological prognosis depends on cardiac arrest duration.
In our study, the existence of a GCS < 5, absence of spontaneous eye movements or absence of brain stem reflexes at any time after cardiopulmonary resuscitation was indicative of poor neurological and overall outcome. The pressure on intensive care beds has led to the discharge of patients before clinically indicated. It has been shown that discharging patients with high TISS scores increases mortality [1], and it has been stated that 'premature' discharge is likely to worsen outcome [2]. We postulated that premature discharge would increase mortality, and that this would be independent of the different TISS scores that were likely to occur. All ICU patients discharged alive to the ward during 1997-1999, whose reason for discharge had been recorded by the ICU consultant as fully fit or premature, were studied. We excluded those discharged for palliative care. We compared the groups' APACHE II, admission risk of death, length of ICU stay and last TISS using ANOVA. We then assessed the relative risk of hospital mortality and investigated the association between TISS on the day of ICU discharge, reason for discharge and mortality using ordinal regression analysis. 552 patients were identified (Premature Group 145, Fully Fit Group 407). ANOVA identified a significant difference in last TISS, but no significant difference in APACHE II, risk of death or ICU length of stay (Table 1). The hospital mortality was greater in the premature group (relative risk 2.1, 95% CI 1.3-3.5). Both last TISS and discharge reason were found to be independent indicators of hospital mortality (Table 2). Patients who leave ICU before they are considered fit for discharge are twice as likely to die, despite minimal difference in their risk of death on ICU admission. Although they are receiving more care when they are discharged, the excess mortality in those discharged prematurely is equivalent to that associated with a further increase in last TISS of 14 points. Society recommendations [2]) and numbers of nurses on duty were recorded for every shift. From this, the average and peak nursing dependency:nurses on duty ratios during the time each patient was in the ICU were calculated. Logistic regression analysis was performed using hospital mortality as the dependent variable and age, sex, MPM II 0, ICU admission on the day of hospital admission, average nursing dependency:nurses on duty ratio, peak nursing dependency:nurses on duty ratio and interactions between the latter two and MPM II 0 as potential independent variables. The withdrawal of mechanical ventilation as a terminal care process occurs with increasing frequency. The aim of the study was to evaluate patients undergoing terminal weaning (TW) with or without severe brain damage. A prospective, descriptive study of all patients who underwent TW during a 2-year period was conducted. APACHE II, SOFA, length of ICU stay (days) before the decision of TW (LOS), method (step-wise reduction or withdrawal of ventilatory support), use of analgesia/sedation during the TW procedure and length of TW (LTW) in minutes were recorded. Data are given as mean (SD) or median (25-75%); the t-test and Mann-Whitney rank sum test (SigmaStat statistical software) were used; P < 0.05 was considered statistically significant. Results: Sixteen patients were studied; APACHE II and SOFA scores were 32 (6.9) and 12.9 (3.7), respectively. Eleven patients had severe brain damage (group BD) and five patients were without brain damage (group NBD). All patients died during TW. The LOS was shorter in the BD group compared with the NBD group: 2.9 (1.9) vs
17 (9.3) days, P < 0.0001. The TW procedure was a step-wise reduction of ventilatory support in 5 patients and ventilator withdrawal in 11 patients. The length of TW was 17 (12-87) min in the BD group and 187 (16-605) min in the NBD group. Analgesia/sedation was employed in eight patients; there were no statistically significant differences in LTW between patients with or without analgesia/sedation (223 [13-662] vs 15 [12-43] min, P = 0.232). Discussion: LOS before the decision of TW was significantly longer in patients without brain damage. There were no significant differences in the length of TW either between groups BD and NBD or between patients with or without analgesia. Supported by IGA MZ CR 4530-3. Introduction and aim: Despite modern intensive therapy, 10-20% of patients admitted to ICU will not survive [1,2]. Dealing with family issues surrounding death is therefore an important aspect of ICU care. The purpose of this study was to record aspects of the experience of families whose relative had undergone limitation of treatment (LOT) and to identify the views of the family regarding who should be involved in the process of LOT. Methods: Consent for a telephone interview and demographic data were obtained from the representative of all families whose relative had died in the ICU. Four weeks later, the representative was contacted for a structured telephone interview. Questions explored the respondents' experience of anxiety related to the process of LOT, their understanding of the explanation and the reasons for LOT, and the adequacy of time to participate in the decision. Respondents were asked who should be involved in the decision process. The relatives of 88 patients who died were interviewed. Sixty-six (75%) patients had undergone LOT. The majority of respondents (90%) expressed that the explanation of LOT was clear and understood, but 18% felt pressurized into decision making, and 16% felt that inadequate time was allowed for discussion before the decision was made. Participating in the LOT decision provoked anxiety in 45% of respondents. However, when compared with respondents whose relatives did not undergo LOT, the expression of anxiety was lower (P < 0.01). Respondents indicated that LOT decisions should be made by the doctor and patient and/or family group (41%), family and/or patient alone (32%), or doctor alone (22%). While most family members understand the process of LOT, it is still associated with significant anxiety. Allocating more time to the decision making process and improving communication techniques may be important. Family members believe that they should be part of the decision making process. (Table: sequence of withholding (WH) and withdrawal (WD) of ICU therapy in patients with poor prognosis, from 1 (first) to 8 (last); data are mean values.) Methods: Patients were identified as early sepsis based on one of the following criteria: 1) primary acute ICU admission diagnosis of SIRS, septic shock, or multiple organ dysfunction syndrome (MODS); 2) notation in the ICU log book of sepsis on admission; 3) infection at 24 hours plus evidence of hypotension, hypoperfusion or multiple organ dysfunction. Resource use measures computed for each patient included: a) total hospital costs incurred during the ICU stay, b) mean costs per ICU day, c) total costs incurred from ICU admission to hospital discharge, d) costs per day during ICU stay for specific cost categories (pharmacy, lab, imaging, respiratory therapy), e) ICU and total hospital (from ICU admission) length of stay (LOS). Methods: 1368 patients were prospectively studied.
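A sketch of the multivariable logistic regression described in the nursing-workload abstract above (hospital mortality modelled on MPM II 0 and the dependency:nurse ratios), using statsmodels on entirely synthetic data; the predictor names, coefficients and printed results are assumptions for illustration only.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
X = pd.DataFrame({
    "age": rng.normal(60, 15, n),
    "mpm2_0": rng.uniform(0.05, 0.9, n),                  # MPM II 0 predicted risk of death (synthetic)
    "avg_dependency_ratio": rng.uniform(0.5, 2.0, n),     # average dependency : nurses on duty (synthetic)
    "peak_dependency_ratio": rng.uniform(0.8, 3.0, n),
})
logit_p = -3 + 3 * X["mpm2_0"] + 0.4 * X["peak_dependency_ratio"]   # synthetic true model
died = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = sm.Logit(died, sm.add_constant(X)).fit(disp=0)    # hospital mortality as dependent variable
print(np.exp(model.params))                               # odds ratios per unit change in each predictor
print(model.pvalues)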
Demographic data, APACHE II and SOFA score, diagnostic group (TR = trauma, TBI = traumatic brain injury, COPD = chronic obstructive pulmonary disease, CPR = cardiac arrest, ARDS = acute respiratory distress syndrome, INTOX = intoxication), length of ICU stay (LOS), clinical outcome and cost of care in CZK were recorded. Relationships among cost, diagnostic group and severity scores were evaluated. Data are given as mean (SD) or median (25-75%); the t-test, Mann-Whitney rank sum test, z-test, ANOVA and linear regression (SigmaStat statistical software) were used; P < 0.05 was considered statistically significant. Methods: Pre-mortem diagnoses, length of stay and presence of chronic disease were determined from medical records. Medical diagnoses were rated into three levels of clinical diagnostic certainty: complete ('gold standard') certainty (Group L1), minor diagnostic uncertainty (Group L2), and major diagnostic uncertainty (Group L3). Autopsy results were obtained from the final pathology report. A panel of three intensivists reviewed the findings and distributed patients into three error groups: Group A, the autopsy confirmed the clinical diagnosis (fully correct diagnosis); Group B, the autopsy demonstrated a new active diagnosis which would probably not have influenced therapy (non-fatal diagnostic error); Group C, the autopsy demonstrated a new active diagnosis which would probably have changed therapy (fatal, but potentially treatable error). The overall mortality in the treated population was 20.3% (270/1331 patients). Autopsies were performed in 126 patients (46.9% of deaths), more often in younger patients (66.6 ± 13.9 years vs 72.7 ± 12.0 years, P < 0.001), in patients with a shorter ICU stay (4.7 ± 5.6 vs 6.7 ± 8.7 days, P = 0.0549) and in patients in Group L3 without chronic diseases (15 vs 1, P < 0.001). According to the pre-mortem level of clinical diagnostic certainty, 46.0%, 24.6% and 29.4% of patients were in groups L1, L2 and L3, respectively (ns between groups). After analysis of the pathological findings, a fully correct diagnosis (group A, 60 patients [47.6%]) was found in 60.3%, 40.0% and 34.2% of patients in groups L1, L2 and L3, respectively (ns between groups). Non-fatal diagnostic errors (group B, 54 patients [42.9%]) were found in 31.0%, 50% and 55.3% of patients in groups L1, L2 and L3, respectively (ns between groups). Fatal, but potentially treatable errors (group C, 12 patients [9.5%]) were found in 8.7%, 10.0% and 10.5% of patients in groups L1, L2 and L3, respectively (ns between groups). An ICU length of stay shorter than 24 h was not related to the frequency of group C errors. Autopsies were performed more often in younger patients without chronic disease, in patients with a shorter ICU stay and in cases with low clinical diagnostic certainty. No level of clinical diagnostic certainty could predict the pathological findings. Autopsy remains an essential verification of clinical diagnostic certainty in critically ill patients. This study allows us to know our current rate of accidental withdrawals and to compare it with other ICUs and with our own results in the future. Introduction: In critical care, accurate assessment of daily fluid balance is both necessary and important. We evaluated the accuracy of the calculated daily fluid balance during continuous hemodialysis and filtration (CHDF) by checking the relationship between two values: (a) daily fluid balance calculated from the balance sheet, and (b) daily body-weight change, a standard way of evaluating daily fluid balance.
We studied data obtained from patients who underwent CHDF using one of two machines, CHF-1 or JUN-500 (Ube Medical Corporation, Tokyo, Japan). CHDF patients were randomly assigned to one or other of the machines: Group A (14 patients) to CHF-1 and Group B (7 patients) to JUN-500. We also studied the relationship between the two values, (a) and (b), above in 10 patients (Group C) not undergoing CHDF. Within each group, the correlation between values (a) and (b) was studied by regression analysis. Significance was defined as P < 0.05. The number of time-points studied was 32 in Group A, 22 in Group B, and 45 in Group C. Within each group, we saw a significant relationship for (a) versus (b), the coefficients (r2) being 0.400 in Group A, 0.663 in Group B and 0.757 in Group C. Discussion: JUN-500 has three pumps, providing a stricter regulation of the rates of infusion and removal of fluids; this may have given more accurate management under CHDF than that achieved with CHF-1. During CHDF, a large amount of water may be infused and/or removed, and so a slight error in pump calibration can lead to a considerable inaccuracy in the daily fluid balance calculated from the balance sheet. This would result in a fairly low correlation value for (a) versus (b). In our Groups A and B, although we found significant relationships between the two values, the r2 values were not particularly high. Therefore, using the above reasoning, we would have been unwise to draw conclusions about changes in daily fluid balance using balance-sheet data alone. Conclusion: During CHDF, daily fluid balance still needs to be based on data obtained by measurement of daily weight change, not solely on data obtained from the fluid-balance sheet.
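The within-group regression of balance-sheet fluid balance against body-weight change described above can be sketched as follows; the data are synthetic, so the printed r2 is illustrative and the r2 values reported in the abstract are the study's own.

import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(8)
weight_change_kg = rng.normal(0, 1.0, 32)                           # (b) daily body-weight change, 32 time-points (synthetic)
sheet_balance_l = weight_change_kg + rng.normal(0, 0.8, 32)         # (a) balance-sheet fluid balance with measurement error (synthetic)

fit = linregress(weight_change_kg, sheet_balance_l)                 # simple linear regression of (a) on (b)
print(f"r2 = {fit.rvalue ** 2:.3f}, slope = {fit.slope:.2f}, P = {fit.pvalue:.4f}")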