title: OP 262-338 date: 2005-09-10 journal: Intensive Care Med DOI: 10.1007/s00134-005-2779-x

We have identified significant differences in duration of ventilation and ICU LOS between the pre-bundle and bundle groups. We have been unable to identify confounding variables which might otherwise have explained the differences.

Toft P 1, Nielsen B 1, Lillevang S 2, Bollen P 3, Olsen K 4, Andersen S 5 1 Intensive Care, 2 Immunology, 3 Biomedical Lab., 4 Pathology, Odense University Hospital, Odense, 5 Intensive Care, Aarhus University Hospital, Aarhus, Denmark

Previous randomized multicenter trials failed to demonstrate a decrease in the mortality of patients with Acute Respiratory Distress Syndrome or Acute Lung Injury (ARDS/ALI) treated with surfactant [1]. However, the effect of surfactant on re-aeration of the lung has never been assessed. The aim of the study was to evaluate the effect of porcine-derived lung surfactant (HL-10), administered by the intratracheal route, on lung re-aeration in patients with ARDS/ALI. After ethics committee approval and informed consent were obtained, 19 patients with ARDS/ALI included in the HL-10 randomized multicenter trial (9 HL-10 and 10 controls) underwent spiral thoracic CT at PEEP 10 cmH2O before and 7 days after the first HL-10 administration. In the HL-10 group, three doses (200 mg/kg/dose) of HL-10 were instilled into the right and left lungs at H0, H12 and H36. Each instillation was followed by recruitment maneuvers. Volumes of gas and tissue were measured separately in poorly aerated, nonaerated and normally aerated lung areas using the Lungview® software. HL-10-induced lung re-aeration was defined as the increase in gas volume in poorly aerated and nonaerated lung areas at day 7 following HL-10 administration, and was compared to the control group. The clinical characteristics and cardiorespiratory parameters at inclusion did not differ between the control and HL-10 groups (PaO2/FiO2: 200±63 versus 194±64 mmHg). At day 7, HL-10 induced a significant increase in the volume of gas in poorly aerated and nonaerated lung areas (320±125 versus 135±161 ml, p=0.01) and a significant increase in the volume of tissue in normally aerated lung areas (189±179 versus -15±105 ml, p<0.01) compared to the control group. CONCLUSION. In patients with ARDS/ALI, intratracheal surfactant administration induces re-aeration of poorly aerated and nonaerated lung areas; the associated increase in lung tissue in normally aerated areas may reflect an inflammatory reaction.

Richard J 1, Janier M 2, Lavenne F 2, Costes N 2, Le Bars D 2, Bregeon F 3, Guerin C 1 1 Service de réanimation médicale, Hôpital de la Croix-Rousse, 2 CERMEP, Hospices Civils, Lyon, 3 Laboratoire de Physiopathologie Respiratoire, Faculté de Médecine, Marseille, France

Prone positioning (PP) is frequently used in ARDS patients as a rescue therapy to improve arterial oxygenation. However, the mechanisms by which PP improves oxygenation remain only partially understood, but seem unrelated to significant gravity-dependent perfusion redistribution. Therefore, this study aimed to evaluate, with positron emission tomography (PET), posture- and positive end-expiratory pressure (PEEP)-related changes in regional ventilation (V'A) during experimental lung injury.
After oleic acid-induced lung injury, five mechanically ventilated pigs were studied in random order in the supine (SP) and prone (PP) positions, at two PEEP levels (0 and 10 cmH2O). In each experimental condition, regional alveolar gas volume (VA) and specific ventilation (ventilation per unit of alveolar volume, V'A/VA) were measured with PET imaging during a nitrogen-13 washout. In each tomographic slice, measurements were performed in 20 regions (bins) distributed along the ventral-to-dorsal axis. Regional recruitment was computed as the difference in regional alveolar volume between 2 experimental conditions, and global recruitment as the sum of regional recruitment. Unlike PEEP, PP did not significantly modify the level of global alveolar recruitment. In SP at PEEP 0, non-dependent regions were significantly more aerated than dependent regions, but with a lower V'A/VA. PP at PEEP 0 induced regional alveolar recruitment in dorsal regions and derecruitment in ventral regions, probably through oedema redistribution. Finally, V'A/VA was less heterogeneous in PP than in SP at either PEEP level, resulting in better agreement between regional alveolar recruitment and ventilation. CONCLUSION. Relative to SP, PP homogenizes alveolar ventilation, with no significant modification of the global level of alveolar recruitment. The beneficial effects of PP on oxygenation may partly result from preferential recruitment of lung regions with the potential to achieve high levels of regional ventilation. Grant acknowledgement. HCL, Equipe d'accueil 1896 and Claude Bernard Lyon I University (BQR).

High vascular flow can increase lung damage in a ventilator-induced lung injury (VILI) model. Positive end-expiratory pressure (PEEP) might attenuate lung damage, but its influence on the "vascular site" of VILI remains poorly understood. We hypothesized that PEEP might protect the lung by decreasing the trans-alveolar-capillary pressure gradient during graded VILI: low and high vascular flow, with or without oleic acid (OA) added to the perfusate. Two series of experiments were performed. First, 15 sets of isolated rabbit lungs were randomized into 3 groups (n=5): low vascular flow/low PEEP (LFLP), high vascular flow/low PEEP (HFLP), and high vascular flow/high PEEP (HFHP). Second, the same protocol was applied in another 15 sets of isolated rabbit lungs with 0.1 ml of OA added to the perfusate solution. All lungs were ventilated with a peak airway pressure of 30 cmH2O for 30 minutes. Weight gain, changes in flow across the lungs and in pulmonary vascular resistance, and extent of hemorrhage (scored by histology) were compared with ANOVA. (Table 1; data are expressed as mean±SD.)

Larché J 1, Attoui A 1, Greciano S 1, Maurer P 1, Gérard A 1 1 Medical Intensive Care Unit, CHU Brabois Nancy, Vandoeuvre-lès-Nancy, France

Cardiogenic shock in the ICU still carries a high mortality rate. We aimed to identify early prognostic factors associated with a high risk of death in patients with cardiogenic shock. A retrospective study was conducted in an 11-bed medical ICU in a tertiary university hospital. We reviewed all patients admitted for cardiogenic shock (including for acute myocardial infarction) from January 2002 to February 2005, excluding post-operative cardiogenic shock and infectious endocarditis. 47 patients were included in the study. A physiologic severity score (SAPS II) and organ dysfunction scores (OSF and LOD, both at H0 and H48) were assessed.
Survival rate, demographic data, use of catecholamines (number of days), initial left ventricular ejection fraction (LVEF) and biological markers were also extracted from the medical charts. Mean age was 66.5±11.3 years. Respective means of SAPS II, OSF at H0 and H48, and LOD at H0 and H48 were 56.8±17.4, 2.7±1 and 2.7±1.1, and 9.4±4.4 and 8.7±4.8. Mean LVEF was 25.2±11.8%. The mortality rate at 36 days was 61.7% (29 patients). Persistence of shock (p<0.0001), use of epinephrine (p=0.004), higher LOD score at H0 (p=0.0013) and H48 (p<0.0001), higher OSF score at H48 (p<0.0001), higher lactate at H0 (p=0.0097) and H48 (p=0.0008), and higher white blood cell count (p=0.014) were significantly associated with a fatal outcome. Interestingly, a LOD score at H48 >7 identified patients who subsequently died, with a sensitivity of 86.2%, a specificity of 83.3%, a positive predictive value of 89.2% and a negative predictive value of 78.9%. Moreover, survival at 36 days was significantly better in patients with a LOD score at H48 ≤7 than in those with a LOD score >7 (45.9% versus 5.7%; p=0.001). CONCLUSION. 1) Risk factors associated with a fatal outcome in patients with cardiogenic shock seem mainly related to acute physiologic disturbances rather than to the initial cardiologic presentation. 2) A LOD score >7 at 48 hours of hospitalization could allow early identification, with acceptable positive and negative predictive values, of patients with cardiogenic shock at high risk of death. Alternative and early therapeutic interventions, such as percutaneous cardiopulmonary support, could then be assessed in these "high-risk" patients. However, prospective assessment is mandatory to confirm these preliminary data.

We compared the F group of the recent RIFLE classification [1], which represents patients with ARF, with the Fc subgroup, which represents acute-on-chronic renal failure (CRF), for morbidity and mortality in the ICU. We also evaluated RIFLE-F as an independent risk factor for death in the ICU. A retrospective analysis was performed of a cohort of all patients admitted to two major ICUs in the UK from 2002 to 2004. 448 ARF patients with sCr >150 µmol/L were selected and then classified by RIFLE criteria into F = 112/448 and Fc = 115/448. Epidemiological data, survival, mortality and morbidity scores were compared. Regression analysis was performed to identify independent risk factors for death. Demographics showed no significant differences apart from age and APACHE II (p=0.032 and 0.038, respectively). The mortality rate in the F group was significantly higher than in the Fc group (45% vs. 25%, p<0.0001). In the F group, deterioration in MAP, HR, pH, ventilatory parameters and GCS was more severe than in the Fc group (p=0.018, 0.003, 0.000, 0.007 and 0.019, respectively). The RIFLE-F criteria for death prediction had the highest significant odds ratio (OR 0.337, CI 0.191 to 0.596, p=0.000), with no significant change in OR for the cause of ARF. CONCLUSION. RIFLE-F (ARF) patients show worse prognosis and outcome compared to RIFLE-Fc (acute on CRF) patients in the ICU. The RIFLE-F criteria were found to be more predictive of mortality in ARF patients in the ICU.

To determine the effectiveness of different artificial intelligence methods in mortality prediction and compare their effectiveness with conventional prognostic scoring systems. Using precollected data from 197 patients who stayed in our intensive care unit for more than 24 hours between February and June 2004, we developed and tested multilayer perceptron and Jordan/Elman artificial neural networks.
The data were presented to the networks either directly or after preprocessing with fuzzy logic interpreters. A new prognostic system was developed using expert-opinion-based parameters and fuzzy logic interpreters to present data to the artificial neural networks. Patients were divided into training, cross-validation and testing groups, which were varied across different training and testing runs. For both artificial neural networks tested, direct presentation of physiologic data resulted in calibration and discrimination characteristics comparable to conventional prognostic scoring systems. The addition of fuzzy logic interpreters improved the performance of the artificial neural networks. The new prognostic system based on expert opinion and fuzzy logic achieved the best performance among the artificial intelligence and conventional prognostic systems. Although affected by changes in the training and testing subsets, the average areas under the receiver operating characteristic curves were 0.82 for direct data presentation, 0.85 for preprocessing by fuzzy logic interpreters, and 0.94 for the expert opinion + fuzzy-neuro hybrid system. Artificial intelligence is a reliable alternative for mortality prediction in the intensive care unit. The data presentation approach and data preprocessing algorithms affect the performance of artificial neural networks. Adding expert-opinion-based parameters to a neuro-fuzzy hybrid system, besides increasing performance, allows customization of prognostic calculations for individual intensive care units.

Simeone F 1, Scolletta S 1, Maglioni E 1, Franchi F 1, Pastorino A 1, Vaiti A 1, Giomarelli P 1, Biagioli B 1 1 Department of Surgery and Bioengineering, Unit of Cardiothoracic Surgery, University of Siena, Siena, Italy

The ability to predict excessive bleeding after cardiac surgery could improve patient management in the intensive care unit (ICU). We developed a risk model able to predict excessive bleeding in our postoperative ICU. We analysed 2297 patients who underwent cardiac surgery from January 2001 to May 2004. Coronary surgery was performed in 1,175 patients, 532 were valvular surgical patients, and 590 underwent combined operations. More than 80 perioperative risk variables were assessed. Excessive bleeding was defined as re-exploration and/or blood loss of more than 800 ml within 12 hours postoperatively. A bleeding risk score (BRS) was constructed using the logistic regression coefficient values of the independent risk factors. The predictive model was validated on 246 patients who underwent cardiac surgery from June 2004 to December 2004. Logistic regression and receiver operating characteristic (ROC) curves were applied; SPSS software was used. Emergency and urgent operations were performed in 45 (1.9%) and 136 (5.9%) patients, respectively. Excessive bleeding was observed in 181 (7.9%) patients, and 973 (42.3%) required transfusions. Twelve predictors of excessive bleeding were found: body surface area, age >70, preoperative hypertension on medication, preoperative creatinine, preoperative bilirubin, preoperative albumin, preoperative use of anti-aggregants or heparin or warfarin, cardiopulmonary bypass (CPB) and aortic clamp times, congestive heart failure, lowest temperature during CPB, and emergent/urgent operation. The BRS ranged from 0 to 14; scores of 5 points or less indicated low or medium risk (low risk = 0 to 2, medium risk = 3 to 5). Patients with a BRS higher than 6 points showed an incidence of bleeding of 95%.
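For illustration, an additive score of this kind is typically built by scaling and rounding the fitted logistic regression coefficients into integer points. The sketch below shows the general approach only; the variable names and weights are hypothetical, since the abstract does not report the fitted coefficients.

# Hedged sketch: turning logistic regression coefficients into an additive
# bedside score, as described for the BRS. Names and weights are illustrative.
from math import exp

# Hypothetical fitted coefficients (log-odds per risk factor present)
coefficients = {
    "age_over_70": 0.62,
    "preop_creatinine_high": 0.48,
    "cpb_time_long": 0.95,
    "emergent_operation": 1.30,
}

def to_points(coef, scale=1.5):
    # Round each coefficient to the nearest integer after scaling, a common
    # way to derive an additive score from a logistic model.
    return max(0, round(coef * scale))

score_weights = {name: to_points(c) for name, c in coefficients.items()}

def bleeding_risk_score(patient):
    # Sum the points of the risk factors present in a patient record.
    return sum(w for name, w in score_weights.items() if patient.get(name))

def predicted_probability(intercept, patient_coefs):
    # The underlying logistic model: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
    logit = intercept + sum(patient_coefs)
    return 1.0 / (1.0 + exp(-logit))

# Example: a patient with long CPB time and an emergent operation
print(bleeding_risk_score({"cpb_time_long": True, "emergent_operation": True}))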
The area under the ROC curve for the BRS applied to the validation group was 0.75 (95% CI 0.640-0.830; p<0.01). CONCLUSION. Our study identified 12 variables able to predict patients at risk of bleeding. These parameters were included in the BRS, which appears to be a useful tool for classifying patients at risk of bleeding. The ability to identify patients with a high BRS (>6 points) may allow physicians to use strategies to reduce postoperative bleeding, which could shorten ICU stay and improve postoperative outcome.

Abizanda R 1, Vidal-Tegedor B 1, Micó-Gómez M 1, Alvaro-Sánchez R 1, Belenguer-Muncharaz A 1, Madero-Pérez J 1, Mateu-Campos L 1, Bisbal-Andrés E 1 1 Intensive Care Department, Hospital Universitario Asociado General de Castelló, Castelló, Spain

INTRODUCTION. Delay in ICU admission has been identified as a reason for discrepancies between real and expected deaths of ICU patients ("lead-time bias") [1]. Nevertheless, none of the available systems for predicting risk of death systematically includes this variable, except for EPEC [2]. Our aim was to assess the influence on outcome of the time delay detected by EPEC. METHODS. During six months of 2003, EPEC was used for outcome prediction assessment in all consecutive admissions to our 19-bed ICU. Simultaneously, predictions by means of MPM II 0 and SAPS 2, together with sociodemographic data, were also collected. Mortality was assessed during the hospital stay. Time delay in EPEC is defined as the delay in actual ICU admission once the ICU physician is aware of the patient's situation, regardless of where the patient was beforehand (ED, wards, RR, OR). This means that real delay (an undetected risk situation) was not captured by the system. During the study, 481 patients were enrolled, of whom 44 died before leaving the hospital. The average time delay before ICU admission was 0.7±1.98 hours (range 0-20); when zero-delay times were excluded, the detected delay was 2.96±3.2 hours (108 patients, range 0.25-20). This delay was greater in patients admitted from the ED or from other hospitals without ICU facilities. There were no differences in delay between patients who died and those who survived to hospital discharge. As already known, the concordance between the risks of death established by MPM II 0, SAPS 2 and EPEC was very poor. Two different kinds of delay in ICU admission can be distinguished: 1) delay due to non-detection of the risk situation, and 2) "logistic time delay", when the patient's situation is detected but no ICU bed is available. The latter is the type we could explore. It does not seem to influence patient outcome much in this study, and there is little scope for acting on it under the current performance pattern of our unit.

Luyt C E 1, Nieszkowska A 1, Aubriot-Lorton M H 2, Deback C 3, Combes A 1, Capron F 2, Agut H 3, Gibert C 1, Chastre J 1 1 Service de Réanimation Médicale, 2 Service d'anatomo-pathologie, 3 Service de virologie, Groupe Hospitalier Pitié-Salpêtrière, Paris, France

Herpes Simplex Virus (HSV) has recently been detected in the lower respiratory tract (LRT) of a large proportion of ICU patients, but its clinical importance in such situations remains unclear (Bruynseels et al., Lancet 2003). It is not known whether the isolation of HSV from LRT samples corresponds to true HSV pneumonia or to bronchial contamination from the mouth and/or throat.
We conducted a prospective cohort study to determine the prevalence of and risk factors for HSV pneumonia in ICU patients undergoing prolonged mechanical ventilation (MV). METHODS. 88 consecutive non-immunocompromised patients who were receiving MV for >5 days and were clinically suspected of having ventilator-associated pneumonia were studied using fiberoptic bronchoscopy with BAL. HSV pneumonia was diagnosed when all of the following criteria were met: 1) a new and persistent pulmonary infiltrate on chest radiograph associated with at least 1 of the following: temperature ≥38°C, leukocytosis ≥10,000/mm3 or purulent tracheal secretions; 2) presence of HSV in the BAL fluid as determined by Vero cell culture and/or PCR; and 3) presence of specific nuclear inclusions in cells collected by BAL, as determined by cytological examination. HSV was detected by culture and/or PCR in the LRT of 54 (61%) of the 88 patients, whereas HSV pneumonia was diagnosed in only 23 (26%). Mean duration of MV before the diagnosis of HSV pneumonia was 14±7 days in these individuals. There were no differences between patients with and without HSV pneumonia with regard to age, SAPS II, reason for and duration of MV, ICU mortality or length of stay. However, patients with HSV pneumonia more frequently had HSV skin or oral lesions (57 vs. 18%, p<0.001), as well as HSV in the throat (87 vs. 43%, p<0.001) and in the LRT (100 vs. 48%, p<0.001), than patients without HSV pneumonia. Our data suggest that HSV pneumonia is frequent among patients ventilated for more than 5 days and is associated with HSV reactivation or infection of the mouth and/or throat. However, it is not yet known whether this is of clinical relevance. Grant acknowledgement. This study was partially supported by a grant from the Société de Réanimation de Langue Française.

Arvaniti K 1, Lathyris D 1, Synnefaki E 1, Liatsi D 1, Adamopoulos C 1, Matamis D 1 1 Intensive Care Unit, Papageorgiou General Hospital, Thessaloniki, Greece

INTRODUCTION. VAP is the leading nosocomial infection in our ICU. Surveillance cultures (SC) for the detection of multi-drug-resistant bacteria (MDR-B) have been used as predictors of the microbial flora in VAP. In a 1-year prospective study we collected pharyngeal, cutaneous and rectal swabs (SC) at admission and then weekly from every patient admitted to the ICU, in order to detect carriage of A. baumannii (Ab), P. aeruginosa (Pa) and methicillin-resistant Staphylococcus aureus (MRSA). Diagnosis of VAP was based on quantitative culture of a protected telescoping catheter (PTC) sample, performed blindly whenever VAP was clinically suspected (positive culture if PTC ≥10^3 cfu/ml). Patients were considered to have positive SC (SC+) if at least one specimen (pharyngeal, cutaneous or rectal) revealed Ab, Pa or MRSA within the week preceding the diagnosis of VAP. For SC, sensitivity, specificity, PPV and NPV were calculated. 303 patients were admitted during the study period; 9 patients were excluded from analysis as they had SC+ only after the diagnosis of VAP. 25 episodes of VAP were diagnosed in 21 patients (14.4%; 20.66/1000 days of mechanical ventilation). Among them, 17 were caused by Ab, Pa or MRSA. 7/17 had SC+ for the same pathogens as those found in VAP (sensitivity for SC: 87.5%). 60 SC+ patients had no VAP, while 226 patients had neither VAP nor SC+ (specificity for SC: 79%). PPV and NPV for SC were 10% and 99%, respectively. A patient with VAP was 4.16 times more likely to have a positive SC than a patient without VAP (LR: 4.16).
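For reference, the reported likelihood ratio and positive predictive value follow from the stated test characteristics and counts by the standard identities (not spelled out in the abstract):

\mathrm{LR}_{+} = \frac{\mathrm{sensitivity}}{1-\mathrm{specificity}} = \frac{0.875}{1-0.79} \approx 4.2, \qquad \mathrm{PPV} = \frac{TP}{TP+FP} = \frac{7}{7+60} \approx 10\%.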
CONCLUSION. SC could reliably predict VAP pathogens. Naturally, the diagnosis of VAP cannot be based on SC; however, systematic surveillance of MDR-B carriage may help ICU physicians with empirical antibiotic prescription for VAP.

Berger M M 1, Eggimann P 2, Revelly J P 2, Raffoul W 3, Shenkin A 4, Chioléro R 2 1 Surgical ICU Burn Center, 2 Surgical ICU, 3 Surgical ICU Burn Center, CHUV, Lausanne, Switzerland, 4 Clinical Chemistry, Liverpool University, Liverpool, United Kingdom

Nosocomial infections, and particularly pneumonia, remain a leading cause of morbidity and mortality after major burns. A previous trial in 20 patients providing selenium, copper and zinc supplements for 8 days showed a reduction in pneumonia; the present analysis aimed at assessing the same aspects in a larger group of critically ill burned patients. METHODS. Aggregation of 2 consecutive prospective randomised placebo-controlled supplementation trials delivering Cu 59 µmol, Se 4.8 µmol (370 µg) and Zn 574 µmol IV per day (group TE) or vehicle (group V) for 8 to 21 days, depending on the surface burned. Infectious complications were collected prospectively over 30 days: number of infections (pulmonary, cutaneous, urinary, bacteremia, other), timing, location, antibiotic therapy, and length of mechanical ventilation and ICU stay. Community-acquired pneumonias (first 48 hours) were counted separately. Blood samples were taken on days 0, 1, 5, 10, 15 and 20 for selenium and glutathione peroxidase (GSHPx). RESULTS. Forty-one patients, aged 42±15 years, with burns covering 46±19% of body surface, were studied (n=21 TE, n=20 V); 32 patients required mechanical ventilation (2-24 days); 16 patients suffered inhalation injury. Demographic characteristics did not differ between groups. Length of mechanical ventilation was 8.0±7.8 days in TE (11.8±8.0 in V, p=0.12). Plasma selenium and GSHPx concentrations were significantly higher in the TE group after day 5. The total number of infectious complications was lower in the TE group (2.0±1.0 episodes versus 3.5±1.2 in V, p=0.0002), as was the number of nosocomial pneumonias (0.6±0.7 versus 1.7±1.1, p=0.0005). There were 6 early pneumonias (ns between groups). Cutaneous and urinary infections were unchanged. Length of ICU stay per %BSA was shorter with TE (0.64±1.4 versus 1.03±1.5 days/%; p=0.008). CONCLUSION. Trace element supplements including selenium were associated with a significant reduction in nosocomial pneumonia and in length of ICU stay. Reinforcement of antioxidant defenses is a likely mechanism.

Frantzeskaki F G 1, Xanthaki A 2, Adamidis V 1, Tsimogianni A 1, Balla M 1, Charalambous A 1, Betrosian A B 1 1 Intensive Care Unit, 2 Microbiologic Laboratory, Ippokrateion General Hospital of Athens, Athens, Greece

INTRODUCTION. Acinetobacter baumannii is a common cause of late-onset ventilator-associated pneumonia (VAP). We examined the microbiologic and clinical efficacy of two dose regimens of ampicillin/sulbactam (A/S) in mechanically ventilated patients with VAP caused by multiresistant Acinetobacter baumannii. Eleven patients (mean age 70±10) with nosocomial pneumonia according to CDC criteria, caused by multiresistant Acinetobacter baumannii (quantitative cultures of a BAL specimen), were treated with A/S. Seven of them received 27 g A/S (9 g sulbactam, group A) and four of them 36 g (12 g sulbactam, group B). Follow-up BAL and clinical evaluation of all patients were performed four days after the initiation of therapy.
Clinical success was defined as a lessening of the signs and symptoms of Acinetobacter VAP, while microbiologic success was defined as suppression of Acinetobacter organisms to <10,000 cfu/ml (BAL culture). Follow-up BAL revealed microbiologic success in culture specimens from five patients in group A (71%) and all patients in group B (100%), but the difference was not statistically significant. Clinical improvement followed microbiologic success in four patients in group A (57%) and three patients in group B (75%), a difference that was also not statistically significant. Three of the remaining patients had blood cultures positive for Acinetobacter baumannii, while two had a subsequent infection caused by a different pathogenic organism. These preliminary results demonstrate that high-dose A/S is a well-tolerated, effective treatment for VAP caused by multidrug-resistant Acinetobacter. The 36 g ampicillin/sulbactam regimen seemed more effective than the 27 g regimen, but the difference was not significant because of the limited number of patients. Positive blood cultures influence the efficacy of treatment. Further data are needed to determine the role of A/S in the treatment of VAP.

Hemorrhagic shock (HS) is a severe pathological condition associated with high mortality. It is known that reperfusion of ischemic tissue causes DNA damage, which leads to the activation of poly-(ADP-ribose) polymerase (PARP) [1]. PARP repairs damaged DNA in energy-consuming steps, which might lead to cellular energy depletion and cell death. In our experiments we inhibited PARP with the PARP inhibitor 5-AIQ in HS prior to reperfusion and investigated the impact of PARP inhibition on the microcirculation of the rat liver. Briefly, HS was induced to a mean arterial pressure of 40 mmHg for one hour [2]. Five minutes prior to reperfusion, the animals in the treatment group received the water-soluble PARP inhibitor 5-AIQ (3 mg/kg) intravenously. Intravital fluorescence microscopy was performed at baseline, at the end of HS, and 1 hour and 5 hours after reperfusion. We observed changes in macrocirculation, microcirculation and liver function. Statistical analysis: SigmaStat 3.0, ANOVA, groups n=5, p<0.05 significant. RESULTS. Animals treated with the PARP inhibitor 5-AIQ showed significantly reduced leukocyte stasis in liver sinusoids, fewer adherent leukocytes in post-sinusoidal venules, improved sinusoidal perfusion, reduced tissue hypoxia and restored bile excretion (table values 1.88±0.1, 1.01±0.2, 1.69±0.14*; the accompanying table headers were not preserved). CONCLUSION. Our findings show a protective effect of the PARP inhibitor 5-AIQ, given to rats in HS prior to reperfusion, on the microcirculation and function of the liver.

Panzeri M 1, Marelli C 1, Brambillasca P 1, Aliprandi L 1, Villa F 1, Berra L 2, Kolobow T 3, Pesenti A 1 1 Anestesia e Rianimazione, Ospedale San Gerardo, Monza, Italy, 2 Anesthesia and Critical Care, MGH, Boston, 3 PCCMB, NIH, Bethesda, United States

Bacterial growth within the endotracheal tube (ETT) is a source of lower respiratory tract bacterial colonization. Within the ETT, bacterial biofilm develops rapidly and cannot be prevented by antibiotic therapy. We recently developed an ETT internally coated with silver and sulfadiazine in polyurethane (SSD-ETT) and showed that it can prevent bacterial colonization of the lower respiratory tract in an animal model. To evaluate the safety and effectiveness of SSD-ETTs, we conducted a randomized trial in cardiosurgical patients comparing standard ETTs (St-ETT) to SSD-ETTs. METHODS.
46 patients intubated for 12-24 hours were randomized to receive a St-ETT or an SSD-ETT. The trachea and bronchi were brushed for bacterial growth before extubation. Each ETT was sampled for bacterial growth, and its lumen analyzed by scanning electron microscopy (SEM) and confocal scanning laser microscopy (CSLM). No adverse events were recorded. SSD-ETT: no bacterial colonization of the lower respiratory tract or ETT was observed; SEM and CSLM showed absence of bacterial biofilm, and the thickness of secretion deposits was 0-200 µm. St-ETT: 35% of ETT samples and 9% of bronchial brushes were colonized; SEM and CSLM showed the presence of bacteria and red and white cells, although organized bacterial biofilm could not be identified; the thickness of accumulated material ranged between 50 and 700 µm.

Tight glycaemic control in critically ill patients may improve morbidity and mortality. The mechanism(s) remain unclear but may involve an amelioration of catabolism. This study assessed the contributions of the insulin dose and of the level of glycaemia achieved to the catabolic response. 16 critically ill patients were studied on 2 occasions, 48 hours apart. Study 1 was performed within 36 hours of ICU admission (glucose 7-9 mmol/L). Patients were then randomised to one of 4 groups: a) variable insulin, glucose 4-6 mmol/L (LILG); b) variable insulin, glucose 7-9 mmol/L (LIHG); c) high-dose insulin (2 mU/kg/min plus the requirement from study 1), blood glucose 4-6 mmol/L (HILG); d) high-dose insulin, blood glucose 7-9 mmol/L (HIHG). CONCLUSION. Amongst non-surgical ICU admissions, insulin administered to achieve less stringent glycaemic targets suppressed glucose Ra and increased glucose uptake, but leucine Ra remained higher than in control subjects. Pharmacological doses of insulin failed to reduce leucine Ra, whereas glucose Rd was significantly increased. There is no evidence of beneficial metabolic effects with stringent glycaemic targets.

In septic shock (SS), absolute adrenal insufficiency (AI) may be defined by a basal cortisol (F) ≤3 µg/dl; relative AI is defined by a basal F between 3 and 15 µg/dl with any Δmax, or ≥15 µg/dl with a Δmax <9 µg/dl. Financial support from the Commission of the European Communities, QLRT-1999-30589 Corticus. Financial support from the ESICM and the ISF.

Ellger B 1, Debaveye Y 1, Herijgers P 2, Van den Berghe G 1 1 Intensive Care Medicine, 2 Center for Experimental Surgery and Anesthesiology, Catholic University of Leuven, Leuven, Belgium

Tight glycemic control by intensive insulin therapy (IIT) reduces the morbidity and mortality of critically ill patients. The relative impact of maintaining normoglycemia and of glycemia-independent effects of insulin in explaining these clinical benefits remains unknown. In a TPN-fed rabbit model of prolonged (7 days) critical illness we assessed the impact of normoglycemia/normoinsulinemia (NG/NI), normoglycemia/hyperinsulinemia (NG/HI), hyperglycemia/normoinsulinemia (HG/NI) and hyperglycemia/hyperinsulinemia (HG/HI) on survival, endothelial function and myocardial contractility (n=8 survivors per group, compared with healthy controls). Assessment of myocardial function (dp/dt max) was performed under sedation and mechanical ventilation on day 7. Aortic rings were isolated to quantify endothelium-dependent relaxation of norepinephrine (NE)-preconstricted rings in response to cumulative doses of a) acetylcholine (ACh), b) ACh + L-nitro-arginine methyl ester (L-NAME) and c) nitroprusside. Both normoglycemic groups had a mortality rate of 11%, whereas mortality was 36% in the HG/NI group and 47% in the HG/HI group.
Left ventricular dp/dt max was higher only in the NG/HI group (P<0.05). Relaxation to ACh after NE-induced contraction was impaired in all 4 critically ill groups (P<0.05). The 2 normoglycemic groups showed better relaxation to ACh than the two hyperglycemic groups (P<0.05), independently of insulinemia. Nitroprusside-induced relaxation was not affected. CONCLUSION. 1) The survival benefit of intensive insulin therapy depends on glycemic control rather than on glycemia-independent effects of insulin. 2) For improving myocardial contractility, both direct insulin effects and maintenance of normoglycemia are necessary. 3) Endothelial function is altered in prolonged critical illness, as shown by reduced activity of NO synthase (NOS), whereas NOS-independent NO-mediated vasorelaxation is maintained. 4) The endothelium is protected by maintaining normoglycemia, independent of insulinemia. Hence, the clinical benefits of IIT require both maintenance of normoglycemia and glycemia-independent effects of insulin, and the impact of each is organ/tissue-specific.

Fredriksson K 1, Klaude M 1, Strigård K 2, Hammarqvist F 2, Wernerman J 1, Rooyackers O 1 1 Anaesthesiology and Intensive Care, 2 Surgery, Karolinska Institutet, Huddinge, Sweden

Critical illness leads to a characteristic loss of skeletal muscle protein and changes in intramuscular amino acid concentrations. However, most studies have been performed on muscle biopsies obtained from the leg, and it is not known to what extent other muscle groups are affected. Muscles involved in respiration are of particular interest, given the frequent ventilatory problems in the ICU that might be related to respiratory muscle dysfunction. Here we compare leg skeletal muscle and intercostal muscle of ICU patients with multiple organ failure. ICU patients (n=10) dependent on mechanical ventilation were included. Comparisons were made with metabolically healthy, age- and sex-matched controls undergoing elective surgery (n=10). Biopsies were obtained from the vastus lateralis (leg) and intercostal muscle. Samples were analysed for protein content, activities of the 26S proteasome system (the main enzyme system involved in skeletal muscle protein degradation) and intracellular amino acid concentrations. In general, the changes seen in the leg muscle of the ICU patients were also present in the intercostal muscle, except for the branched-chain amino acids (BCAA: valine, leucine and isoleucine), which were unchanged in the intercostal muscle.

Nutritional support is an important therapeutic intervention in critically ill patients. However, it is often considered a second-order treatment compared to other therapies. ICU physicians focus on managing emergencies in other organ systems and start nutritional support late or underestimate patients' metabolic needs. We conducted a prospective study to evaluate the real metabolic needs of ICU patients and to detect discrepancies between the measured metabolic needs and the prescribed nutritional support. METHODS. 37 consecutive patients with a mean age of 52 years and an APACHE score of 18±9, admitted with an expected ICU stay longer than 4 days, were included in this study. Starting on the third day after admission, the metabolic needs of each patient were measured by indirect calorimetry every day for a period of one week. The physician in charge was informed of each patient's daily metabolic needs.
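The abstract does not state how energy expenditure was derived from the gas-exchange measurements; indirect calorimeters conventionally apply the abbreviated Weir equation to the measured oxygen consumption and carbon dioxide production:

\mathrm{REE}\ (\mathrm{kcal/day}) = \left(3.94\,\dot V_{\mathrm{O_2}} + 1.11\,\dot V_{\mathrm{CO_2}}\right) \times 1440

with V̇O2 and V̇CO2 in L/min and 1440 min per day.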
We recorded the day nutrition was initiated, the differences between the metabolic needs measured by indirect calorimetry, the nutritional support prescribed by the physician in charge, and the total amount of calories the patient actually received daily. The reasons for temporary interruption were also noted. A paired t-test with Bonferroni correction was used for statistical analysis. Despite daily information to the physician in charge concerning each patient's metabolic needs, nutrition was started on day 7±6 after admission. The metabolic needs measured by indirect calorimetry were 1826±407 kcal/day, while the nutritional support prescribed was 1327±55 kcal/day. The amount of calories actually received was 1020±650 kcal/day (p<0.05). The reasons for temporary daily interruption of nutrition, in order of frequency, were: diarrhea, vomiting, transport to the Radiology Department, failure to restart nutrition after the 4-hourly tolerance test, interruption 4 hours before tracheotomy, and abdominal distension without vomiting. Our study demonstrates that nutritional support is, surprisingly, not a first-order therapy in the minds of ICU physicians. It is often started late, and the prescribed amount of calories is less than the real metabolic needs of the patients. Even worse, the amount of calories administered is less than that prescribed. Most of the reasons for temporary interruption are unjustified.

Mentec H 1, Plantefeve G 2, Boulet E 3, Le Gall C 4, Annane D 5, Van de Louw A 6, Antoun S 7, Bleichner G 1 1 ICU, CH Dupouy, Argenteuil, 2 ICU, CHU Bichat, Paris, 3 ICU, CH Dubos, Pontoise, 4 ICU, CH de Corbeil, Corbeil, 5 ICU, CHU Poincarre, Garches, 6 ICU, CH Michel, Evry, 7 ICU, Institut Roussy, Villejuif, France

During enteral nutrition (EN) in critically ill patients, upper digestive intolerance is frequent. It can be screened for by monitoring gastric aspirate volume (GAV). However, the GAV cut-off at which prokinetics should be given to achieve the best compromise between achieving feeding goals and avoiding pneumonia is not known. Multicentre prospective randomised study comparing early (GAV 100 ml) and classical (GAV 500 ml) strategies for the administration of erythromycin as a prokinetic agent during nasogastric-tube enteral feeding in critically ill patients. The feeding goal was 30 kcal/kg theoretical weight. When GAV exceeded the group cut-off, or when vomiting occurred, the patient was given erythromycin 200 mg IV. Pneumonia was diagnosed on clinical, biological and radiological suspicion criteria plus a positive quantitative distal bronchial sample culture. The Ethics Committee approved the study.

Reduced heart rate variability (HRV) in patients with MODS can identify a subgroup with a worse prognosis. Parasympathetic stimulation can depress inflammation and might thus improve survival. The aim of the present study was to detect whether ß-blockers, as indirect parasympathetic modulators, have a positive impact on prognosis. We retrospectively analysed the data of 120 consecutively admitted ICU patients with MODS. HRV was measured according to international standards using a 24-hour ECG. All patients were checked for ß-blocker treatment and followed up for 28-day survival. We calculated the cutpoint (maximum of sensitivity × specificity in ROC analysis) for the HRV parameter lnVLF that best predicted 28-day survival. The APACHE II score (APII) was calculated to characterize the severity of illness; MODS was defined as an APII of 20 or more points.
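A minimal sketch of the cutpoint search described in the methods, i.e. scanning candidate lnVLF thresholds and keeping the one that maximises sensitivity × specificity; the data shown are illustrative, not the study's measurements.

# Hedged sketch of ROC-based cutpoint selection as described above.
# Values and outcomes below are made up for illustration only.

def sensitivity_specificity(values, survived, threshold):
    # Treat lnVLF >= threshold as predicting 28-day survival.
    tp = sum(1 for v, s in zip(values, survived) if v >= threshold and s)
    fn = sum(1 for v, s in zip(values, survived) if v < threshold and s)
    tn = sum(1 for v, s in zip(values, survived) if v < threshold and not s)
    fp = sum(1 for v, s in zip(values, survived) if v >= threshold and not s)
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec

def best_cutpoint(values, survived):
    # Scan every observed value as a candidate threshold and keep the one
    # with the highest sensitivity x specificity product.
    best, best_score = None, -1.0
    for threshold in sorted(set(values)):
        sens, spec = sensitivity_specificity(values, survived, threshold)
        if sens * spec > best_score:
            best, best_score = threshold, sens * spec
    return best, best_score

# Illustrative usage with hypothetical lnVLF values and 28-day outcomes
ln_vlf = [2.1, 3.4, 4.0, 4.8, 5.2, 2.9, 3.8, 4.5]
alive = [False, False, True, True, True, False, True, True]
print(best_cutpoint(ln_vlf, alive))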
The demographic data of the patients were as follows: age 59.8±13 years, weight 76.5±14.9 kg, height 170.4±10.0 cm, APACHE II score 26.9±7.6. 56 of the 120 included patients received ß-blockers during the ICU stay. Patients on ß-blockers had a significantly higher HRV at admission than patients without ß-blockers (3.4 vs. 4.5 ln ms², p<0.0001). Dividing the cohort into four subgroups, we found that patients with ß-blocker treatment and a high HRV on admission had the best survival compared with 1) patients with low HRV and ß-blocker treatment (log rank [LR] of Kaplan-Meier analysis = 3.9, p=0.047), 2) patients with high HRV but without ß-blockers (LR=4.6, p=0.03), and 3) patients with low HRV and without ß-blockers (LR=13.4, p=0.0003). CONCLUSION. ß-blocker treatment could improve survival in MODS patients. This favorable effect might be mediated by a restoration of blunted HRV, which could dampen the overwhelming inflammation seen in MODS. Grant acknowledgement. HS and UMW: DFG SCHM 1398/3-1,-2; MB: DFG BU 859/3-1,-2,3.

INTRODUCTION. The potential impact of GM-CSF plasma levels on the immune response and mortality of septic patients remains unclear [1]. We recently conducted a randomised trial to investigate the effects of GM-CSF on monocytic HLA-DR expression in severe sepsis [2]. Here, the influence of GM-CSF on pro- and anti-inflammatory cytokines and the association with outcome in patients with severe sepsis were studied. After ethical approval, 2x16 patients with severe sepsis and HLA-DR expression below 200 mean fluorescence intensity (MFI) over 48 h were included in a double-blind RCT. GM-CSF (4 µg/kg sc) was continued until HLA-DR expression increased above 500 MFI for >48 h, and restarted when it decreased below 200 MFI. The observation period was 21 days. Plasma levels of GM-CSF (Quantikine®, R&D Systems, USA), interleukin (IL)-6, IL-10, tumor necrosis factor α (TNF) (all Immulite®, DPC Biermann, Germany) and procalcitonin (PCT) (LUMItest®, Brahms Diagnostica, Germany) were assessed daily. The outcome scores APACHE and SAPS and the longitudinal scores MODS and SOFA were calculated, and patients were followed up for ICU mortality. Overall mortality was 12/32 (38%). Application of GM-CSF had no influence on outcome (verum 7/16 vs. placebo 5/16, respectively). GM-CSF plasma levels increased following treatment and remained elevated in the verum group. Proinflammatory markers (IL-6 and PCT) had comparable baselines but increased to higher values in the verum group. TNF plasma levels increased during the first days in the verum group compared to the placebo group. IL-10 plasma levels were persistently higher in the verum group. Plasma levels of proinflammatory cytokines and GM-CSF did not correlate with outcome. In nonsurviving patients, IL-10 plasma levels were elevated during the first week compared with survivors and decreased steadily over the last treatment week. Nonsurvivors in the verum group showed consistently higher IL-10 levels than the rest of the population. CONCLUSION. In our population of severely septic patients, high IL-10 plasma levels were generally associated with an adverse outcome. A potentially beneficial effect of GM-CSF on acquired immunoparalysis might be counteracted by high IL-10 levels.

Patient-based surveillance of nosocomial infections in the ICU is the level 2 component of HELICS (Hospitals in Europe Link for Infection Control through Surveillance). The ICU protocol was published in October 2004 (http://helics.univ-lyon1.fr).
The goal is to generate a harmonised database to improve knowledge about the variability in incidence rates of nosocomial infections and to allow standardised benchmarking. Surveillance covered bloodstream infections, ventilator-associated pneumonia, central venous catheter infection/colonisation and urinary tract infections. The aim of this investigation was to compare the level 2 data from 7 European countries. In 2004, data from 7 countries (Austria, Belgium, Spain, France, Luxembourg, Netherlands, Portugal) were transferred to the HELICS database in Brussels, checked for consistency and analysed. A total of 83488 (832-42544) patients with a length of stay >2 days was available for analysis, representing a total of 622289 patient days. Demographic characteristics are given in the table. CONCLUSION. SPAD is an easily applicable method of detoxification in severe liver disease and organ dysfunction. Grant acknowledgement. Neutec Pharma.

Brivet F G 1, Aegerter P 2, Jacobs F M 1, Guidet B 3 1 Medical Intensive Care Unit, Hôpital Antoine-Béclère - AP-HP, Clamart, 2 Department of Biostatistics, Hôpital Ambroise Paré - AP-HP, Boulogne, 3 Medical Intensive Care Unit, Hôpital Saint Antoine - AP-HP, Paris, France

INTRODUCTION. Acute renal failure (ARF) is associated with an excess risk of hospital mortality (Metnitz et al., Crit Care Med 2002;30:2051). Recent studies suggest that the dose of renal replacement therapy (RRT) may improve hospital outcome. Objective: to evaluate and update the epidemiology of ARF requiring RRT. Patients and setting: 1) a total of 172288 consecutive admissions to 36 ICUs from 1993 to 2002; 2) patients requiring RRT who had chronic renal failure were excluded; 3) data were extracted for the 127298 admissions to the 22 ICUs participating in the database since 1993. Among the 21201 ARF patients admitted to the 36 ICUs during the 10-year period (incidence 12.8%), 7079 required RRT (incidence 4.28%, i.e. 33.4% of ARF). During this period, median age (52 years in 1993, 56 in 2002, p<0.001), severity (SAPS II: 27 in 1993 vs 34 in 2002, p<0.001) and ARF incidence (8.4% in 1993 vs 18.3% in 2002, p<0.001) all increased. When considering only admissions to the 22 ICUs participating in the database since 1993, 5391 ARF patients required RRT. Over the 10-year study period, whereas urea concentration and urine output at day 1 after admission did not change, the number of patients requiring RRT increased (382 in 1993, 720 in 2002); patients got older and severity increased [SAPS II (median): 53 in 1995, 57 in 2002]. Mechanical ventilation and vasopressor therapy were used more frequently in 2002 than in 1993 (respectively 73.9% versus 60.7%, p<0.02; 74.7% versus 67.8%, p<0.05). In 2002, 38.1% of ARF patients requiring RRT were treated with continuous methods, compared with 28% in 1993 (p<0.001), but the number of sessions (hemodialysis: 2) and the duration (3 days) remained unchanged. The crude ICU mortality rate (54.6%) has not decreased significantly since 1993; hospital mortality in RRT patients remained globally unchanged. In this series: 1) the incidence of ARF and the number of patients treated with RRT increased, whereas urea and diuresis remained constant; 2) despite the high number of patients requiring mechanical ventilation and vasopressor therapy, intermittent hemodialysis is still the most frequently used technique; 3) in recent years the intensity of RRT did not change; 4) mortality remained elevated and unchanged, whereas admission severity increased.
Joannes-Boyau O 1, Lafargue M 1, Honoré P M 2, Gauche B 3, Fleureau C 1, Janvier G 1 1 Anesthesiology and Critical Care 2, Haut-Lévêque University Hospital of Bordeaux, Pessac, France, 2 Critical Care Medicine, St-Pierre Para-University Hospital, Ottignies Louvain-la-Neuve, Belgium, 3 Critical Care Medicine, Robert Boulin Hospital, Libourne, France

INTRODUCTION. Continuous renal replacement therapy (CRRT) is widely used in the management of patients with severe sepsis and acute renal failure. Nevertheless, short filter lifespan (<24 hours) is still a major concern, and could be the result of a procoagulant state secondary to sepsis. The aim of this work was to study the possible relationship between antithrombin (AT) deficiency and early filter clotting, and whether AT supplementation could increase filter lifespan. We also compared two different methods of supplementation: bolus versus continuous infusion. A two-center prospective study, conducted from March 2003 to May 2004 in three different intensive care units. Twenty-eight patients in septic shock with acute renal failure were included. All were treated with CRRT, and unfractionated heparin (UFH) was used for anticoagulation. The AT level was measured before CRRT and every day thereafter, and supplementation was started after two episodes of early filter clotting, using two different methods: half of the patients were supplemented with AT boluses, the other half with a continuous infusion. The risk of filter thrombosis increased dramatically when the AT level fell below 60%. AT supplementation was able to restore a level above 60% and increased the filter lifespan from 15.2 to 33.2 hours. Patients treated with the continuous infusion method maintained an AT level above 60% for a longer period of time than with the bolus method, and their filter lifespan increased further, from 27.8 hours (bolus) to 48.5 hours. The log-linear regression with frailty was statistically significant and showed a high correlation between the logarithm of filter lifespan and the AT activity level (p<0.001). CONCLUSION. This study shows that an AT level below 60% increases the risk of early filter clotting, and that filter lifespan is clearly correlated with the AT activity level. AT supplementation could increase filter lifespan by more than 100%. Continuous infusion is the better method to maintain an AT level above 60% and increases filter lifespan more markedly. Cost-effectiveness should be evaluated shortly.

Herrera-Gutiérrez M E 1, Seller-Pérez G 1, Lebrón-Gallardo M 1, Fernández-Ortega J 1, Hernández-Sierra B 1, Quesada-García G 1 1 Critical Care and Urgencies, Complejo Hospitalario Universitario Carlos Haya, Málaga, Spain

Hemodynamic improvement after the initiation of high-flow hemofiltration carries a better prognosis. We intended to demonstrate that early hemodynamic improvement after standard continuous renal replacement therapy (CRRT) is also related to survival. Prospective registry of all patients treated with CRRT in our ICU from January 2003 to April 2005. We analysed mean arterial pressure (MAP) and vasopressor (VP) dosage before and after 24 hours of CRRT use, and defined as responders (Rs) patients with a 20% fall in VP dose or a 20% rise in MAP, and as non-responders (NRs) those who did not improve. We defined as unstable those patients on VP or with a MAP lower than 60 mmHg when initiating CRRT. We evaluated sex, age, APACHE II at admission and at initiation of CRRT, number of failing organs at CRRT, and CRRT dose (ultrafiltrate+dialysate).
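The responder rule above is a simple threshold test; a sketch of how it could be encoded is shown below. The function and field names, and the example values, are hypothetical rather than taken from the study registry.

# Hedged sketch of the responder (Rs) / non-responder (NRs) rule described
# above: a 20% fall in vasopressor dose or a 20% rise in MAP over the first
# 24 h of CRRT, plus the instability criterion at CRRT initiation.

def is_responder(vp_dose_0h, vp_dose_24h, map_0h, map_24h):
    # True if vasopressor dose fell by >= 20% or MAP rose by >= 20%.
    vp_fall = vp_dose_0h > 0 and (vp_dose_0h - vp_dose_24h) / vp_dose_0h >= 0.20
    map_rise = map_0h > 0 and (map_24h - map_0h) / map_0h >= 0.20
    return vp_fall or map_rise

def is_unstable(on_vasopressor, map_at_start):
    # Unstable at CRRT initiation: on vasopressors or MAP below 60 mmHg.
    return on_vasopressor or map_at_start < 60

# Example: vasopressor dose falls from 0.5 to 0.3 (40% fall), MAP 55 -> 62 mmHg
print(is_responder(0.5, 0.3, 55, 62), is_unstable(True, 55))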
We used the chi-square and t-tests for univariate analysis and logistic regression for multivariate analysis of mortality. 94 patients (mean age 54±13.8 years, APACHE II on admission 20.9±6.1 and at CRRT initiation 22.1±5.7, organs failing at CRRT start 3.7±1.3, clearance dose 2.45±0.5 L/h) were included. 72.3% were unstable before treatment. Global mortality was 54.3% (57.7% stable vs 52.9% unstable, p ns). 42 patients (44.7%) were classified as Rs and 52 (55.3%) as NRs. Apart from mortality, no differences were detected between these groups in any of the variables studied: Rs 51.9±12.6 years, APACHE II at CRRT initiation 21.6±4.9, organs failing 3.8±1.2, CRRT dose 2.5±0.5 L/h; NRs 55.8±14.5 years, APACHE II at CRRT 22.5±6.3, organs failing 3.7±1.4, dose 2.35±0.5 L/h. Rs mortality was 28.6% vs 75% in NRs (p<0.001). On logistic regression analysis, age (OR 1.05, p<0.05), APACHE II at CRRT initiation (OR 1.14, p<0.05), number of organs failing at initiation (OR 1.98, p<0.05) and NR status (OR 11.54, p<0.001) were related to mortality. Selecting only unstable patients (N=68), Rs mortality was 23.7% (9 of 38) vs 97% (27 of 30) in NRs. On logistic regression analysis, APACHE II at CRRT (OR 1.18, p<0.05) and NR status (OR 26.5, p<0.001) were the only variables related to mortality. CONCLUSION. Early hemodynamic improvement after CRRT is closely related to survival, and this relationship is even stronger when considering unstable patients. This response is a better predictor of survival than APACHE II or the number of failing organs.

Van der Voort P H J 1, Postma S R 1, Kingma W P 1, Bakker A J 2, Boerma E C 1 1 Intensive Care, 2 Clinical Chemistry, Medical Center Leeuwarden, Leeuwarden, Netherlands

Citrate is an anticoagulant for continuous veno-venous hemofiltration (CVVH) through its ability to chelate calcium (Ca). This might result in potentially lethal hypocalcaemia, for which Ca supplementation is obligatory. We hypothesized that citrate CVVH results in an endocrine response despite Ca supplementation up to normal levels. We included 19 consecutive ICU patients treated with CVVH in an observational study. In 11 patients citrate was the anticoagulant and in 8 patients nadroparin. Citrate was chosen for patients with active bleeding or at risk of bleeding. In addition, we observed 10 control ICU patients. Total Ca, ionised calcium (iCa), parathyroid hormone (PTH) and vitamin D (VD) were measured at 24 and 48 hours. Total Ca input and output were calculated to obtain the Ca balance.

[PICU] using a policy of enteral vancomycin with no secondary endogenous infections. There were 3 lower airway and 3 bloodstream infections. 5 infections were primary endogenous, due to the outbreak strain being present in the admission flora, and 18 (9 secondary endogenous, 9 exogenous) were due to the strain being acquired. 4 children developed 6 EMRSA-15 infections (4 exogenous, 2 secondary endogenous). In August, when a different patient acquired an infection each week, the incident team was convened and an infection control policy rigorously implemented. Acquired carriage/infection due to the outbreak strain subsided following this reinforcement. Transmission decreased and was abolished after four months. Endemicity of the outbreak strain did not ensue, and this remains the case. 2. Despite similar efficacy in eradicating EMRSA-15 and the outbreak strain, the latter escaped infection control, suggesting the emergence of a new, fitter clone that disseminated more quickly and easily amongst the same high-risk patients.
The new clone was intrinsically more pathogenic, as half the carriers developed infections, compared with 1 in 10 patients with gentamicin-sensitive EMRSA-15. No one with EMRSA-15 died, whilst mortality with the outbreak strain was 22%; 1 patient died of necrotising pneumonia. There are 2 hypotheses: EMRSA-15 already carrying SCCmec acquired PVL, or a methicillin-sensitive S. aureus with PVL acquired SCCmec. CONCLUSION. The presence of PVL and SCCmec in a genetic background of proven epidemicity, such as EMRSA-15, could prove highly transmissible and cause serious disease in a hospital setting. Reinforcement of enteral vancomycin cleared the new, more virulent clone.

Silvestri L 1, Milanese M 1, Durì D 2, Gregori D 3, Gullo A 2, Van Saene H K F 4 1 Anesthesia and Intensive Care, Presidio Ospedaliero di Gorizia, Gorizia, 2 Perioperative Medicine, Intensive Care and Emergency, University of Trieste, Trieste, 3 Public Health and Microbiology, University of Torino, Torino, Italy, 4 Medical Microbiology, University of Liverpool, Liverpool, United Kingdom

Selective decontamination of the digestive tract (SDD) is an infection control protocol that uses parenteral and enteral non-absorbable antimicrobials, usually polymyxins, aminoglycosides and polyenes. The aim is the prevention and/or eradication of oropharyngeal and gut carriage to control endogenous infections. Nine meta-analyses of randomized controlled trials (RCTs) have evaluated SDD in critically ill patients: SDD reduces lower airway infections and mortality. However, no meta-analysis assessing the impact of SDD on bloodstream infections (BSIs) has been performed. A systematic review of SDD RCTs was therefore performed to assess whether the manoeuvre of SDD controls BSIs. A systematic review and meta-analysis of RCTs on SDD was undertaken. Data sources were Medline, Embase, the Cochrane Register of Controlled Trials, previous meta-analyses, personal communications, and conference proceedings, without restriction of language or publication status. The main outcome measures were patients with bloodstream infection and the type of microorganism involved. Fifty-five RCTs conducted between 1987 and 2005, comprising a total of 9,230 critically ill patients, were included in the review; 4,659 patients received SDD and 4,571 were controls. Of those trials, 28 studies including 4,280 patients (2,213 SDD, 2,067 control) reported data on patients with bloodstream infections. There were 233 patients with bloodstream infections in the SDD group (10.5%) and 296 in the control group (14.3%). Thirteen studies, including 1,817 patients (893 SDD, 924 control), reported data on the type of microorganism causing bloodstream infection. SDD significantly reduced the number of patients with bloodstream infections (common odds ratio 0.69, 95% confidence interval 0.55 to 0.88) and the number with BSIs due to Gram-negative bacteria (OR 0.45; 95% CI 0.25 to 0.78). Gram-positive BSIs and fungemia were not significantly different between the two groups. CONCLUSION. SDD reduces bloodstream infections, and in particular Gram-negative bloodstream infections, with no significant effect on infections due to Gram-positive microorganisms and fungi.

Malbrain M L N G 1, for the Mycograb Invasive Candidiasis Study Group 1 1 Intensive Care Unit, ZiekenhuisNetwerk Antwerpen, Campus Stuivenberg, Antwerp, Belgium

Candida infections carry a high mortality. Monotherapy is associated with persistent candidaemia in 12-17% and a candida-attributable mortality of 10-19% [1,2].
Combination studies using existing agents have failed to show improvements in clinical efficacy. The development of antibodies against HSP90, a molecular chaperone present in the yeast cell wall, is associated with recovery from candida infections. Mycograb®, a human antibody against HSP90, enhances the immune response to candida infections and can be combined with lipid amphotericin B (L-AM-B). Mycograb® shows synergy with L-AM-B in vitro and in animal models of invasive candidiasis. The current pivotal study was undertaken in adult hospitalised patients with culture-confirmed deep-seated candidiasis. This randomised, prospective, double-blind, placebo-controlled study examined the efficacy and safety of a 5-day course of Mycograb® (1 mg/kg IV bid) combined with L-AM-B versus placebo (saline IV) combined with L-AM-B. Patients were stratified on the basis of their germ tube test result. Efficacy was assessed on the basis of clinical and mycological response at day 10, Candida-attributable mortality (28 days after the last dose of study drug) and speed of culture-confirmed resolution of the infection. The primary efficacy variable was the overall response at day 10, i.e. clinical and mycological resolution of the infection. There were 117 patients (from 12 centres) in the modified ITT population. A complete overall response was obtained in 48% (29/61) of the placebo group compared with 84% (47/56) of the Mycograb®-treated group (P<0.001). The following secondary efficacy criteria were also met: clinical response (52% versus 86%, P<0.001), mycological response (54% versus 89%, P<0.001), Candida-attributable mortality (18% versus 4%, P=0.03), and rate of culture-confirmed clearance of the infection, which was over twice as fast in the Mycograb®-treated group (P=0.001). Mycograb® was well tolerated. CONCLUSION. This is the first double-blind, placebo-controlled trial showing synergy between 2 antifungals in the treatment of invasive candidiasis. The combination of Mycograb® with L-AM-B produced a highly statistically significant improvement in outcome in patients with culture-confirmed invasive candidiasis and should therefore become the first-choice treatment.

This study investigates long-term outcomes after severe trauma in terms of recovery, functional status and health-related quality of life (HRQoL). Cohort study of all 208 severely traumatized patients (>18 years) admitted to our general ICU from 1998 to 2001. Data concerning the ICU stay were retrieved from our ICU database. During March and April 2005, semi-structured telephone interviews were performed with 139 (86%) of the 161 patients still alive three to seven years after the injury. The interview included evaluation of HRQoL using the EQ-5D questionnaire (present status and status prior to trauma), the Glasgow Outcome Scale (recovery) and the Karnofsky Index (functional status). Mean age was 40±17 years, mean SAPS II 31 and median Injury Severity Score 25. According to the Glasgow Outcome Scale, 84% experienced good or moderate recovery. A Karnofsky Index below 60 (i.e. requiring significant help in daily life) was obtained by 13%. A total of 66% of those working full-time before the trauma were still employed. Results from the EQ-5D evaluation are shown in the table.
This study investigates long-term outcome after severe trauma in terms of recovery, functional status and health-related quality of life (HRQoL). Cohort study of all 208 severely traumatized patients (>18 years) admitted to our general ICU from 1998 to 2001. Data concerning the ICU stay were retrieved from our ICU database. During March and April 2005, semi-structured telephone interviews were performed with 139 (86%) of the 161 patients still alive three to seven years after the injury. The interview included evaluation of HRQoL using the EQ-5D questionnaire (present status and status prior to trauma), the Glasgow Outcome Scale (recovery) and the Karnofsky Index (functional status). Mean age was 40 ± 17 years, mean SAPS II was 31, and median Injury Severity Score was 25. According to the Glasgow Outcome Scale, 84% experienced good or moderate recovery. A Karnofsky Index below 60 (i.e. requiring significant help in daily life) was recorded in 13%. A total of 66% of those who worked full-time before the trauma were still employed. Results from the EQ-5D evaluation are shown in the table.

Kopp R 1 , Bensberg R 2 , Henzler D 2 , Wardeh M N 3 , Niewels A 3 , Rossaint R 2 , Kuhlen R 3 1 IZKF Biomat and Operative Intensive Care, 2 Anesthesiology, 3 Operative Intensive Care, University Hospital Aachen, Aachen, Germany

Miniaturized veno-venous Extracorporeal Lung Assist (Mini-ECLA) and pumpless arterio-venous Interventional Lung Assist (ILA) are newly developed devices for patients with severe Acute Respiratory Distress Syndrome. Pulsatile blood flow (ILA), modified blood pumps and oxygenators, as well as reduced filling volumes, could affect biocompatibility, so evaluation seems necessary.

De Rooij S E 1 , Govers A 1 , Korevaar J C 2 , Giesbers A 1 , Levi M 1 , De Jonge E 3 1 Internal Medicine, 2 Clinical Epidemiology and Biostatistics, 3 Intensive Care, AMC, Amsterdam, Netherlands

Few studies have investigated the long-term outcome of very elderly patients surviving the ICU. This may be partly because of the high ICU and hospital mortality among these patients, which is often reported as the short-term outcome of ICU treatment. However, mortality is not the only outcome of ICU treatment. We aimed to study long-term cognitive and functional outcome and health-related quality of life in very elderly patients surviving the ICU. In this retrospective observational cohort study we interviewed the survivors of a cohort of 578 patients aged 80 years and older, admitted (planned or unplanned) to a medical and surgical ICU of a general tertiary university teaching hospital between 1 January 1997 and 1 December 2002. Data were collected on preoperative conditions, demographic and social background, and current medical consumption. Cognitive screening was performed with the IQCODE and activities of daily living were assessed with the Katz ADL Index. Health-related quality of life was measured with the EuroQol and a VAS score; all are validated quantitative measurement instruments. Patients and relatives were interviewed concerning their ICU experiences. A total of 578 patients were studied. ICU and hospital mortality for this cohort was 31.7%, and a further 28.2% died following hospital discharge. One-year survival of all patients was 37.9%. A total of 204 survivors (mean follow-up 44.2 months, SD 19.9 months) and 155 relatives were interviewed. Although independence in activities of daily living was decreased after the ICU stay, 74.3% of the patients who lived at home before ICU admission were still living at home, with or without assistance.

To study the feasibility and toxicity of induced whole-body hyperthermia (WBH) for the treatment of cancer patients, we measured the indocyanine green plasma disappearance rate (PDR) as a parameter of liver function. METHODS. 35 treatments with whole-body hyperthermia (body core temperature up to 42.2°C for 1 hour) in combination with hyperoxemia (paO2 > 250 mmHg), hyperglycemia (blood glucose > 400 mg/dl) and cytostatic drugs were performed in 25 patients with disseminated malignancies. For the guidance of volume therapy, goal parameters were an intrathoracic blood volume index (ITBVI) of 800-1100 ml/m2, a cardiac index > 3.5 l/min/m2, a hemoglobin concentration > 10 g/dl, and a mean arterial pressure (MAP) > 55 mmHg. MAP was maintained using noradrenaline in all patients. Measurements included PDR, cardiac index, ITBVI and blood lactate levels. All parameters were measured at three defined time points: after induction of anesthesia at 37°C, 30 minutes after reaching the plateau phase at 42.2°C, and at 39°C during the cooling phase.
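The ICG plasma disappearance rate used here is conventionally derived from the mono-exponential decay of indocyanine green in plasma, c(t) = c0·exp(−kt), with PDR = 100·k expressed in %/min (normal values are roughly 18-25 %/min). The abstract does not describe the measurement technique, so the sketch below, with invented sample values, simply shows how a PDR estimate follows from timed concentration measurements via a log-linear fit:

```python
import numpy as np

# Hypothetical ICG concentrations (mg/l) sampled after a bolus injection.
# Mono-exponential model: c(t) = c0 * exp(-k * t), so PDR = 100 * k (%/min).
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # minutes after injection
c = np.array([4.8, 3.9, 3.2, 2.6, 2.1])   # measured plasma concentrations

# Log-linear least-squares fit: ln c = ln c0 - k * t
slope, intercept = np.polyfit(t, np.log(c), 1)
pdr = -slope * 100.0
print(f"PDR ≈ {pdr:.1f} %/min")  # ≈ 21 %/min here, i.e. within the normal range
```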
For statistical analyses, a paired Wilcoxon test was used. Statistical significance was accepted at p < 0.05. Data are presented as mean ± standard error of the mean. Under WBH, PDR did not increase and stayed within the normal range, whereas cardiac index (CI) and blood lactate levels increased significantly (see table).

A review of the current state of the art reveals that fuzzy logic has been used in a number of medical device applications. The aim of this study was to demonstrate that a fuzzy controller of the norepinephrine infusion rate can reduce septic shock duration. This was a prospective, randomized study approved by the "Comité Consultatif de Protection des Personnes dans la Recherche Biomédicale". Between the 6th and 24th hour after introduction of vasopressor therapy, patients with septic shock whose hemodynamic status was controlled by norepinephrine (NOR) were included. The NOR infusion rate in the control group (SURV) was managed by the doctors of the intensive care unit, whereas NOR delivery in the tested group (CATHECO) was controlled by a fuzzy controller driven by the variations in mean arterial pressure (MAP). The fuzzy-logic-based automated drug delivery system was able to increase or reduce the norepinephrine infusion rate every 7 minutes after analysis of the variations in MAP; a MAP between 70 and 75 mmHg was the target set point (see the illustrative sketch below). The main outcome criterion was shock duration, defined as the duration of vasopressor support. There was no significant difference between the two groups with regard to demographic data, severity scores at inclusion, fluid resuscitation or initial norepinephrine infusion rate.
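The abstract specifies the controller only at this level: an adjustment of the NOR infusion rate every 7 minutes, driven by MAP variations, with a 70-75 mmHg target band; the actual rule base is not given. Purely as an illustration of the technique, the following minimal fuzzy control step uses hypothetical membership functions and gains (zero-order Sugeno-style defuzzification); it is not the CATHECO controller.

```python
# Minimal illustrative fuzzy step for a norepinephrine (NOR) rate controller.
# All membership functions, labels and gains are hypothetical; the study's
# actual CATHECO rule base is not described in the abstract.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_nor_adjustment(map_mmhg, rate_ug_min):
    """Return the new NOR rate (µg/min) for one 7-minute control step."""
    # Fuzzify the MAP error relative to the middle of the 70-75 mmHg band.
    err = map_mmhg - 72.5
    low = tri(err, -30.0, -15.0, 0.0)        # MAP well below target
    on_target = tri(err, -2.5, 0.0, 2.5)     # MAP inside the band
    high = tri(err, 0.0, 15.0, 30.0)         # MAP well above target

    # Rule consequents: relative change of the infusion rate (hypothetical).
    # IF low THEN increase (+20%); IF on target THEN hold; IF high THEN decrease (-20%).
    weights = (low, on_target, high)
    actions = (+0.20, 0.0, -0.20)

    # Weighted-average defuzzification (zero-order Sugeno style).
    total = sum(weights)
    change = sum(w * a for w, a in zip(weights, actions)) / total if total else 0.0
    return max(0.0, rate_ug_min * (1.0 + change))

# Example: MAP of 62 mmHg on 8 µg/min -> the controller raises the rate.
print(f"{fuzzy_nor_adjustment(62.0, 8.0):.2f} µg/min")  # 9.60 µg/min
```

In a real closed-loop system the rules would also weigh the trend of MAP and bound the step size for safety; the point here is only the fuzzify-rules-defuzzify structure of such a controller.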
Peritoneal dialysis with oxygenated perfluorocarbons (PFCs) reduces the effects of intestinal ischemia in small animals. We studied this therapeutic concept in a near-human animal model. Following institutional ethical approval, 18 pigs of 75 kg were anesthetized, and their hemodynamics and regional perfusion were monitored. Superior mesenteric artery blood flow was reduced to 20% (ISCH, n=6) for 4 hrs, followed by 1 hr of reperfusion. In the THER group (n=6), oxygenated PFC was simultaneously circulated through the abdomen. Six animals underwent SHAM surgery only. Bowel tissue was examined for mucosal damage. Histological examination and scoring showed no damage in SHAM and equally extensive damage in ISCH and THER. Intra-abdominal application of oxygenated PFC could not bridge intestinal ischemia in a near-human model. Possible explanations are limited O2 diffusion capacity, O2 mismatch and the high resistance of the abdominal cavity. Grant acknowledgement. This study was supported by the European Union's Interreg IIIA initiative.

O'Beirne J P 1 , Auzinger G 1 , Sizer E 1 , Bernal W 1 , Wendon J A 1 1 Liver Intensive Care Unit, Kings College Hospital, London, United Kingdom

Albumin dialysis has been proposed as a method for removing protein-bound toxins in liver dysfunction. Single-pass albumin dialysis (SPAD) employs standard haemodiafiltration with an albumin solution, which is then discarded. As opposed to other methods of albumin dialysis, SPAD is simple to perform and does not require the purchase of specialised equipment. We have evaluated the clinical use of SPAD in patients with severe liver dysfunction. METHODS. 25 patients with severe liver dysfunction requiring intensive care underwent 48 sessions of SPAD. Median SOFA score was 13 (range 6-21); prior to treatment, mean serum total bilirubin was 466 µmol/L and mean serum creatinine was 196 µmol/L. Mean arterial pressure, SOFA score, creatinine and lactate did not change during treatment. There were highly significant reductions in total bilirubin, conjugated bilirubin and serum bile acids after one session of SPAD.