Abstracts Oral. Am J Transplant, 23 April 2008. DOI: 10.1111/j.1600-6143.2008.02254.x

Abstracts

evl (target 3-8 ng/mL), AZA or MMF with standard (SD) or reduced (RD) CsA. Data at 6 months post-tx are presented here. Results. The proportion of patients with CMV events, including serious adverse events and laboratory evidence of CMV infection, has remained low and broadly similar following introduction of CC administration of evl and use of concomitant RD-CsA. CMV syndrome and CMV organ involvement occurred in <3% of recipients receiving evl in each of the studies. The highest rates of all CMV parameters occurred with MMF + SD-CsA or AZA + SD-CsA. Conclusion. The low incidence of CMV infection and CMV-related events observed in heart tx patients receiving FD-evl and SD-CsA as de novo immunosuppression has been maintained in newer regimens of CC-evl with RD-CsA. CMV syndrome and CMV organ involvement are rare with evl-based immunosuppression after cardiac tx. The low rate of CMV infections in evl-treated patients may contribute to improved long-term outcome and a lower incidence of CAV; confirmatory data are awaited.

Fibronectin-α4β1 Interactions Enhance p38 MAPK Phosphorylation and Metalloproteinase-9 Expression in Cold Liver Ischemia/Reperfusion Injury. Sergio Duarte, 1 Xiu-Da Shen, 1 Takashi Hamada, 1 Constantino Fondevila, 1 Ronald Busuttil, 1 Ana J. Coito. 1 1 The Dumont UCLA Transplant Center. Expression of endothelial fibronectin (FN) is an early event in liver ischemia/reperfusion (I/R) injury. We have recently shown that CS1 peptide-facilitated blockade of FN-α4β1 leukocyte interactions regulates metalloproteinase-9 (MMP-9) expression in steatotic orthotopic liver transplants (OLT). This study tests the effects of CS1 peptide therapy on MMP-9, and further dissects putative mechanisms, in an alternate model of cold I/R injury. Methods and Results: CS1 peptides were administered through the portal vein of Sprague-Dawley (SD) rat livers before and after 24 h cold storage (500 µg/rat). SD recipients of OLTs received an additional dose of CS1 peptides 1h post-OLT. CS1 therapy significantly increased the 14d OLT survival rate (100% vs. 50%, n=8/gr, p<0.005). CS1 peptides reduced sGOT levels (U/L) at 6h (1413±420 vs. 2866±864, p<0.008) and 24h (1350±142 vs. 4000±1358, p<0.006) post-OLT. CS1-treated OLTs showed good preservation of lobular architecture, contrasting with severe necrosis and sinusoidal congestion in controls. Moreover, CS1-treated livers were characterized by a profound decrease in T (31±3 vs. 64±8, p<0.0002), NK (19±2 vs. 41±3, p<0.0003) and ED1 (21±7 vs. 57±16, p<0.008) cells as early as 6h after I/R. Neutrophils, as indicated by MPO activity (0.48±0.02 vs. 3.18±0.94, p<0.02), were depressed by CS1 therapy. This correlated with decreased mRNA expression of TNF-α (0.032±0.04 vs. 0.83±0.26, p<0.005) and cyclooxygenase-2 (0.8±0.1 vs. 2.2±0.6, p<0.05). Leukocyte transmigration is dependent upon adhesive and focal matrix degradation mechanisms. MMP-9, which is inducible and expressed by infiltrating leukocytes, was profoundly depressed in 6h CS1 peptide-treated OLTs, at both the mRNA (0.1±0.1 vs. 0.5±0.2, p<0.03) and protein (0.15±0.07 vs. 0.7±0.14, p<0.03) levels. Moreover, MMP-9 activity evaluated by zymography was reduced by ∼3-fold in the CS1 group.
Interestingly, phosphorylation of mitogen-activated protein (MAP) kinase p38 (0.05±0.01 vs. 0.14±0.06, p<0.03) was selectively downregulated in the 6h CS1-treated OLTs. In conclusion, this work supports a broad regulatory role for FN-α4β1 interactions on MMP-9 expression by leukocytes, likely mediated through activation of p38 MAPK, and on the pathological matrix breakdown associated with leukocyte infiltration. These data provide the rationale for the development of novel therapeutic approaches in cold liver I/R injury.

Ischemia reperfusion injury is the most common cause of acute kidney injury in both native and allograft kidneys. The pathogenic mechanisms of renal ischemia reperfusion injury include changes at the level of the microvasculature as well as the tubules. Within the microvasculature, leukocyte-endothelial interactions likely contribute to the well-established microvascular dysfunction (e.g., increased endothelial permeability) that occurs during ischemic acute kidney injury. Because our previous studies demonstrated that T cells modulate renal ischemia reperfusion injury in a murine model, we hypothesized that T cells could mediate changes in renal vascular permeability (RVP) during ischemia reperfusion injury. We performed 30 min of bilateral renal ischemia followed by reperfusion in C57BL6 wild-type mice and in T cell-deficient (nu/nu) mice, with or without T-cell adoptive transfer from their wild-type littermates, and evaluated RVP by Evans blue dye extravasation (EBDE). Time course studies of RVP showed marked increases in renal EBDE within the early 3-6 hrs after ischemia. CD3-positive pan-T cells, but neither CD4 nor CD8 T cells, were found to infiltrate the post-ischemic kidney within 6 hrs; comparison was made with other leukocytes using immunohistochemistry. Gene microarray analysis demonstrated that the gene for TNF-α (Tnfaip1), a potent mediator of microvascular permeability as well as a T cell product, was increased in the kidney early after ischemia. TNF-α, IFN-γ and IL-4 protein, assessed by intracellular cytokine staining, was found to be increased early in peripheral circulating T cells and later in renal T cells after renal ischemia. The rise in RVP was significantly attenuated in T cell-deficient (nu/nu) mice at 6 hrs compared to their wild-type littermates, and this attenuated RVP was restored after adoptive transfer of splenic T cells from wild-type littermates into the nude mice. These data demonstrate that T cells traffic early into the post-ischemic kidney, produce TNF-α and other cytokines that can increase RVP, and directly participate in the increased RVP during acute ischemia reperfusion injury. T cell-endothelial interactions are a likely mechanism underlying the pathophysiologic role of T cells in renal ischemia reperfusion injury.

Background: Ischemia/reperfusion injury (IRI) is a major cause of organ dysfunction after intestinal injury and transplantation. The interaction between the innate and adaptive immune systems has become a major area of focus in this field. Our preliminary work showed that mice undergoing intestinal IRI experienced worse survival and tissue injury in conjunction with increased infiltration of PMN and CD3+ cells as compared to sham mice. The purpose of this study was to investigate the role of the T cell in intestinal IRI through the use of genetically deficient mice.
Methods: Under anesthesia, male C57BL6 wild-type mice (WT) and CD4 knockout mice (CD4KO) underwent 100 min of warm intestinal IRI by clamping of the SMA. Separate survival and analysis groups were studied. Intestinal tissue was harvested at 4h and 24h. Tissue was analyzed by histology, CD3 immunostaining, myeloperoxidase activity (MPO), and semi-quantitative PCR for several cytokines/chemokines. Results: WT had significantly worse survival compared to CD4KO (30% vs. 100%, p=0.002) and worse histopathological injury, mostly involving mucosal sloughing. WT had higher MPO activity than CD4KO (1.9±0.88 vs. 0.70±0.73 at 4h, p=0.059, and 1.5±0.05 vs. 0.22±0.065 at 24h, p=0.004) and increased CD3+ cell infiltration. There was also increased mRNA production of cytokines/chemokines in WT vs. CD4KO at both timepoints; specifically, the data normalized to β-actin at 24h were: IL-2 (0. Conclusion: This study demonstrates for the first time in the intestine that CD4+ T cells are important mediators of IRI. In the absence of CD4+ T cells, better outcomes were associated with less PMN infiltration, implying a link between the innate and adaptive immune systems. An alteration in chemotactic signaling potentially effected by CD4+ T cells may play a role in this process. These results confirm the important function of CD4+ T cells in intestinal IRI and justify further investigation into their complex role in this process.

Messenger RNA assessment in urine sediments from renal transplant patients is rapidly evolving as a non-invasive diagnostic tool. T cells, macrophages, and often B cells are present in rejecting kidney grafts. We questioned whether urinary mRNA levels of markers representing these cell types ((regulatory) T cells: CD3ε, FoxP3, CD25, TGF-ß; macrophages: CD68, S100A9; B cells: CD20) are associated with rejection. MATERIALS: In our institute, 520 urine samples a year from over 100 renal transplant patients are collected for mRNA quantitation purposes. Urine sediments are pelleted and stored in RNA-preserving solution. We initially investigated the effect of incubation time of urine on mRNA integrity. Next, in a case-control study, 17 urine samples taken at the time of biopsy-proven rejection were compared to 17 samples taken during stable graft function. Median time of sampling (55 days versus 43 days posttransplant) did not significantly differ between groups. RNA (0.90 ± 1.18 µg) was extracted using RNeasy spin columns. Message of the markers mentioned above was quantified with Q-PCR and normalized. RESULTS: Incubation of urine for 7h or longer at room temperature after acquisition resulted in a 75% decrease in 18S rRNA signals (P<0.05). However, storage of urine for up to 24h at 4°C did not result in decreased mRNA expression. In the case-control study, all markers tested showed the highest expression levels in urine rejection samples. When corrected for multiple comparisons, only TGF-ß (26-fold, P=0.007) and FoxP3 (21-fold, P=0.002) expression was significantly increased in rejection samples compared to non-rejection samples. TGF-ß levels highly correlated with FoxP3 levels (r = 0.90, P<0.00001), suggesting a mechanistic relationship in vivo between the two molecules. CONCLUSION: RNA integrity in urine samples stored at 4°C is maintained for at least 24h. Detection of increased TGF-ß and FoxP3 message in urine from patients with a kidney transplant represents a means for indicating occurrence of graft rejection. This implies that mRNA urinalysis provides a suitable molecular tool for non-invasive patient monitoring in clinical practice. The data furthermore suggest that rejection is associated with TGF-ß-mediated immune mechanisms in which regulatory T cells play a role.
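The fold changes quoted above (26-fold TGF-ß, 21-fold FoxP3) are the kind of output produced by relative Q-PCR quantitation. As a minimal sketch, assuming the common 2^-ΔΔCt method with 18S as the reference signal (the abstract does not name its normalizer) and invented Ct values:

```python
# Relative Q-PCR quantitation via the 2^-ddCt method: a minimal sketch.
# Ct values below are invented for illustration; 18S is assumed as the
# reference signal (the abstract does not specify its normalizer).

def fold_change(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of the target gene in case vs. control samples,
    each normalized to the reference gene (2^-ddCt)."""
    d_ct_case = ct_target_case - ct_ref_case      # normalize case sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl      # normalize control sample
    dd_ct = d_ct_case - d_ct_ctrl
    return 2.0 ** (-dd_ct)

# Hypothetical example: FoxP3 in a rejection vs. a stable-function sample.
print(fold_change(ct_target_case=26.0, ct_ref_case=12.0,
                  ct_target_ctrl=30.4, ct_ref_ctrl=12.0))  # ~21-fold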
The significance of B cell and plasma cell infiltration in renal allografts remains controversial. We previously established that transcript sets associated with Ifng effects or T cells reflect the inflammatory burden in renal allografts. In the present study, we identified B cell-associated transcripts (BATs) and immunoglobulin transcripts (IGTs), reflecting B cell and plasma cell infiltration, respectively. Using microarrays, we analyzed BAT and IGT expression in relationship to histologic lesions, diagnosis, and renal function in 177 renal allograft biopsies. Immunostaining confirmed that BAT and IGT expression was associated with B cells and plasma cells in the graft. Expression of BATs and IGTs was increased in biopsies with rejection (1.2 ± 0.2, 1.6 ± 0.9) compared to non-rejection (1.1 ± 0.2, 1.1 ± 0.5), but was not different between T cell-mediated (1.3 ± 0.3, 1.5 ± 1.1) and antibody-mediated rejection (1.2 ± 0.2, 1.5 ± 0.9), and also occurred in some non-rejecting biopsies (recurrent GN). BAT and IGT scores correlated strongly with time post-transplant (Fig. 1), which was best modeled as a dichotomous relationship: biopsies ≤ 5 months did not express BATs or IGTs above the level of control kidneys. Other inflammatory markers were not time dependent. In biopsies ≥ 5 months, BAT and IGT expression correlated with interstitial inflammation, tubular atrophy, and interstitial fibrosis. In a multiple regression analysis, only time post-transplant and interstitial inflammation were independently related to BAT and IGT scores. When correcting for time post-transplant, BAT and IGT scores did not correlate with renal function at the time of biopsy or future function 6 months post-biopsy. BATs and particularly IGTs are a time-dependent feature of injured and inflamed renal allografts. Corrected for the effect of time, BATs and IGTs do not correlate with outcomes, indicating that B cell and plasma cell infiltrates have no specific role in the mechanism of injury independent of the inflammatory burden. Their accumulation in late allografts may indicate emergence of specialized lymphoid compartments in tissues with long-standing low-level inflammation.

Pearson correlation coefficients between PBT expression and renal function:

PBTs | eGFR at biopsy | eGFR 6 months after biopsy | % change in eGFR, baseline to biopsy | % change in eGFR, biopsy to 6 months after
QCATs | -0.17* | -0.16* | -0.19* | -0.05
GRITs | -0.23** | -0.25** | -0.20** | -0.10
IRIT_D1 | -0.02 | -0.01 | -0.09 | 0.13
IRIT_D3 | -0.51** | -0.29** | -0.36** | 0.16*
IRIT_D5 | -0.28** | -0.22** | -0.19* | -0.02
KT2 | 0.27** | 0.13 | 0.22** | -0.02
(* p<0.05, ** p<0.01)

Thus the PBTs have prognostic value. Surprisingly, the transcripts in the biopsy with the greatest prognostic value for future GFR or recovery of GFR were not related to rejection (cytotoxic T cell- or IFNG-associated) but to the degree of injury response. We propose that the common pathway linking various injuries (immune and non-immune) to GFR is through the degree of injury response these events induce in the epithelium, particularly those in the IRIT_D3 set.
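Correlation coefficients such as those tabulated above can be reproduced with a standard Pearson test. A minimal sketch, using scipy on invented transcript-score and eGFR vectors rather than the study data:

```python
# Pearson correlation between a transcript-set score and eGFR: minimal sketch.
# The vectors are invented; the study's actual data are not reproduced here.
import numpy as np
from scipy import stats

pbt_score = np.array([0.8, 1.1, 1.9, 2.4, 3.0, 3.6])  # hypothetical IRIT_D3-like scores
egfr      = np.array([78., 65., 60., 48., 41., 35.])   # hypothetical eGFR (ml/min/1.73m2)

r, p = stats.pearsonr(pbt_score, egfr)
print(f"r = {r:.2f}, p = {p:.4f}")  # a strong negative correlation, as in the table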
Early Rejection from Anamnestic Response in Offspring-to-Mother or Husband-to-Wife Kidney Transplant. Kwan Tae Park, 1 Song Chul Kim, 1 Duck Jong Han. 1 1 Surgery, Asan Medical Center, Seoul, Korea. Accelerated rejection can develop in the immediate post-transplant period as an anamnestic response resulting from exposure to fetal HLA antigens during previous pregnancies, in the case of offspring-to-mother or husband-to-wife transplantation. However, accelerated rejections in these groups have rarely been reported. 81 cases of offspring-to-mother transplantation (offspring group) and 53 cases of husband-to-wife transplantation (spouse group), in which the recipient had been sensitized to the spouse via their children, were performed between January 1997 and August 2007 at our institution and were retrospectively reviewed. The control group comprised female kidney recipients transplanted during the same period from living related donors other than offspring and from living unrelated donors other than the husband. Acute rejection (AR) rates within 3 months after transplant were 19.7% (16/81) in the offspring group and 18.8% (10/53) in the spouse group. The AR rates were not different between the two groups; however, both were significantly higher than in the respective control groups, namely 10.2% (24/235) for offspring controls and 13.5% (10/74) for spouse controls. The mean onset of AR was 7.5 days (range 0-29), significantly earlier than in the control groups (16.5 days, p<0.05), and 54% of ARs were accelerated rejections occurring within 7 days after transplant. The proportion of acute humoral rejection was significantly higher in both groups than in controls (37.5% vs. 8.3% in the offspring group, 60% vs. 10% in the spouse group). Most ARs were successfully reversed by steroid pulse and/or plasmapheresis, IVIG, and rituximab. No risk factors for AR, such as number of pregnancies, preoperative CDC crossmatch, PRA, immunosuppressant, or antibody induction, could be identified. Serum creatinine levels in AR patients were higher than in AR-free patients at postoperative month 1, but creatinine at last follow-up did not differ (1.43 mg/dl in AR vs. 1.14 mg/dl in AR-free). Five-year graft/patient survival rates were not different between AR and AR-free groups or between study and control groups. A higher rejection rate, with more accelerated and humoral rejection, was identified in female kidney recipients of offspring or spousal donor kidneys in comparison with controls. Considering the anamnestic response of this cohort of female recipients, more prudent preoperative screening and careful immediate postoperative immune monitoring are required for the avoidance of rejection and impaired graft survival.

Kaplan-Meier survival curves showed that ∆UPC > 0.2 was associated with decreased patient and allograft survival. These data show for the first time that regardless of the histopathology, ∆UPC > 0.2 one month after rejection is associated with poor patient/allograft outcomes.

From Histology to Microarrays. The histopathological hallmark of T cell-mediated rejection (TCMR) is interstitial infiltration. Assessing infiltration by histology is arbitrary, limited in reproducibility, and has never been assessed against independent standards. We recently reported that a set of cytotoxic T cell-associated transcripts (QCATs) could quantitatively assess the T cell burden in tissue. The objective of the present study was to re-examine the current diagnostic criteria in relationship to the QCATs. In 129 renal allograft biopsies, we assessed how histology predicts the QCAT burden. An independent diagnostic threshold for QCATs was established in control samples.
Applying this QCAT threshold revealed current diagnostic criteria for histology to be flawed: the threshold for interstitial infiltration is too high (100% specificity, 37% sensitivity), the types of infiltration considered (i.e., i-Banff) are wrongly defined, and tubulitis does not increase diagnostic accuracy. Changing the criteria by lowering the histological threshold for cortical infiltration from 25% to 10%, taking into account all interstitial cellular infiltration (= i-total), and ignoring tubulitis increased sensitivity to 91% with a decreased specificity of 50%. However, this includes biopsies with interstitial infiltration but without QCAT expression, indicating that histology might not discriminate between 'active' and 'inactive' infiltration. Both the refined histological and the QCAT thresholds had prognostic value in terms of future renal allograft function. We propose a refined histological scoring system that better predicts the active T cell burden and outcome than the current Banff criteria for renal allograft rejection.

There has been an increasing and well-justified interest regarding the long-term renal consequences of kidney donation. The collective evidence suggests, however, that kidney donors enjoy a normal life span and that their lifetime risk of ESRD may not be different from that of non-donors. Assessing kidney function utilizing serum creatinine is not without limitations. Therefore, to better assess kidney function, 5 years ago we began a large effort to measure GFR using the plasma disappearance of iohexol and the first-void urinary albumin/creatinine ratio (ACR) in randomly selected kidney donors who donated at our institution. Results: We have performed over 3600 donor uninephrectomies since the inception of our program in 1963. Of these, 242 donors underwent iohexol GFR at our General Clinical Research Center. Microalbuminuria is defined as ACR between 30-300 mg/g and macroalbuminuria as ACR >300 mg/g. The mean age was 53.0±9.6 years, 11.9±8.8 years had elapsed since donation, hemoglobin was 13.7±1.23 g/dL, systolic blood pressure (SBP) was 121.7±14.7 mmHg and diastolic blood pressure (DBP) was 73.2±9.1 mmHg. 82.6% of donors had a GFR greater than 60 ml/min/1.73m² and 17.4% had a GFR between 30-60 ml/min/1.73m². The detailed renal profile of these donors is shown in the table below. Multivariate analysis that adjusted for age, gender, ethnicity, time from donation, SBP, DBP and body mass index (BMI) identified age at donation, female gender and BMI as independent predictors of GFR <60 mL/min/1.73m², and time from donation and systolic blood pressure as independent predictors of micro- and macroalbuminuria. Conclusion: This is the largest effort describing measured GFR in previous kidney donors. It is reassuring that the majority have a preserved GFR and only a minority have albuminuria. The risk factors for reduced GFR and albuminuria are analogous to those described in the general population.
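Plasma-disappearance GFR of the kind measured in the donor study reduces, in its simplest one-compartment form, to clearance = dose / AUC, with the AUC obtained from a mono-exponential fit to timed plasma samples. The sketch below uses invented numbers; clinical iohexol protocols apply multi-compartment handling or slope-intercept corrections (e.g., Bröchner-Mortensen), so this is illustrative only:

```python
# One-compartment plasma-disappearance clearance: GFR = dose / AUC,
# with AUC = C0 / k from a mono-exponential fit. All numbers invented;
# clinical iohexol GFR uses corrected slope-intercept methods.
import numpy as np

dose_mg = 3235.0                               # hypothetical iohexol dose
t_min   = np.array([120., 180., 240., 300.])   # late sampling times (min)
c_mg_l  = np.array([99.3, 73.1, 53.8, 39.6])   # plasma concentrations (mg/L)

k, log_c0 = np.polyfit(t_min, np.log(c_mg_l), 1)  # slope, intercept of log-linear fit
k = -k                                            # elimination rate constant (1/min)
auc = np.exp(log_c0) / k                          # area under the curve (mg*min/L)
gfr_ml_min = dose_mg / auc * 1000.0               # L/min -> mL/min
print(f"GFR ~ {gfr_ml_min:.0f} mL/min (uncorrected)")  # ~90 mL/min here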
Cystatin C Levels and Long-Term Donor Health.

Psychosocial and physical health of living kidney donors (LKDs) following donation is of utmost importance. Unfortunately, this issue has been neglected to a large extent. We sought to examine the current practices regarding psychosocial follow-up of LKDs after surgery across transplant (Tx) centers. We conducted a 68-question online survey regarding this practice and issues related to LKDs. The survey was e-mailed via listservs of 2 professional Tx societies. Several questions allowed for more than one response. Characteristics of the 69 Tx centers that participated in the survey are found in Table 1. Only 38.3% actively follow donors regarding mental health, substance abuse, or quality-of-life issues after donation. The professionals most often involved are social workers (72%) and nurse coordinators (69%). The majority of follow-up is conducted via phone (83.3%), though appointments (69.4%) and questionnaires (11.1%) are also utilized. 65% initiate follow-up with donors 1-4 weeks post-operatively, whereas only 27% of programs maintain contact at both 3-6 months and 12 months. 83.3% offer post-operative psychosocial support; 62% only under certain circumstances. This support is provided by social workers (90%), psychologists (30%), and/or psychiatrists (28%). Support is available indefinitely in 68% of programs. 30.5% assume the costs of post-operative medical and psychosocial care indefinitely; 50.8% for a specified period of time. 40.7% bill the recipient's insurance and 15.2% bill the donor's insurance. 83.1% of centers accept donors without health insurance and only 5.1% purchase insurance on behalf of donors to cover post-operative health care needs. The results of this survey clearly demonstrate that post-operative psychosocial follow-up of LKDs is uncommon and that current practices are widely variable. Without routine follow-up of donors, Tx centers are less likely to capture post-operative psychosocial issues that may result from organ donation. Standardized post-operative follow-up of LKDs should become a mandatory part of their care, which will require increased support from health care policy makers.

The Role of Hepatitis C and Race in Patient and Graft Survival in Combined Kidney and Liver Transplantation. Dilip Moonka, 1 Ravi K. Parasuraman, 2 Kim A. Brown, 1 Alissa Kapke, 3 Dean Y. Kim. 4 1 Gastroenterology, Henry Ford Health Systems, Detroit, MI; 2 Nephrology, Henry Ford Health Systems, Detroit, MI; 3 Biostatistics, Henry Ford Health Systems, Detroit, MI; 4 Transplant Institute, Henry Ford Health Systems, Detroit, MI. The influence of hepatitis C (HCV) and race is not well understood in combined kidney-liver transplantation (KLT). HCV has a negative impact on patient and graft survival in liver recipients, whereas African-American (AA) race is a negative prognostic factor in kidney recipients. AIM: To determine the influence of HCV and AA race on patient and graft survival in KLT. METHODS: The UNOS Public Use Database was used to identify 2296 patients undergoing KLT who had known HCV Ab status. 13% were AA and 68% were non-Hispanic white (NHW). 39.5% were HCV Ab positive. Groups were assessed for patient and graft survival. RESULTS: There is a significant gradient in patient survival, and in both kidney and liver graft survival, from NHW patients without HCV to AA patients with HCV (Table). Patient survival at five years drops from 72% to 54% along this gradient. There is also a 14% drop in liver and an 18% drop in kidney graft survival. On pairwise testing, patient survival at 5 yr dropped from 72% in all patients without HCV to 60% in those HCV Ab positive (p=0.0003). Five-yr survival dropped from 68% in all NHW patients to 57% in AA patients regardless of HCV status (p=0.221). On multivariate analysis, HCV and the combination of HCV and AA race remained associated with diminished survival, but race alone did not. CONCLUSIONS: Patients with HCV who undergo KLT are at increased risk for poor overall survival and poor survival of kidney and liver grafts. This effect appears even more pronounced in AA patients. This knowledge is critical to individual centers in assessing risk and benefit from KLT in these groups, and these groups represent an opportunity for improved interventions.
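The 5-year survival figures compared above come from standard Kaplan-Meier estimation. A minimal hand-rolled product-limit estimator on invented follow-up data (a survival library such as lifelines would normally be used):

```python
# Product-limit (Kaplan-Meier) survival estimate: minimal sketch on
# invented follow-up data (months; event=1 death, 0 censored).
import numpy as np

time  = np.array([6., 14., 23., 32., 41., 55., 60., 60., 60., 60.])
event = np.array([1,   0,   1,   1,   0,   1,   0,   0,   0,   0])

surv = 1.0
for t in np.unique(time[event == 1]):      # distinct event times
    at_risk = np.sum(time >= t)            # subjects still under observation
    deaths  = np.sum((time == t) & (event == 1))
    surv *= 1.0 - deaths / at_risk         # product-limit update
print(f"S(60 months) ~ {surv:.2f}")        # 0.54 for these toy data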
Kidney: Pediatrics

Background: Pediatric renal transplant recipients have excellent short-term outcomes, but long-term success is compromised by complications of chronic immunosuppressive medications and chronic allograft nephropathy. Studies show that calcineurin inhibitors and steroids can be individually avoided in pediatric renal transplantation. Building on that experience, we designed this study to optimize short- and long-term renal allograft function with minimal chronic immunosuppression using a steroid-free, calcineurin-inhibitor withdrawal protocol in low-risk pediatric renal transplant recipients. Methods: Unsensitized pediatric recipients of a first living donor kidney transplant received 2 doses of Campath-1H® (0.3 mg/kg), 1 day pre- and post-transplant. Subjects received tacrolimus and MMF immediately post-transplant until week 8-12, when they underwent protocol renal biopsy and were changed to sirolimus and MMF if rejection-free. The planned 35 subjects have been enrolled; this report describes the clinical outcomes of the 23 with 1 year of follow-up. Results: The mean subject age is 12.9 yrs; 59.3% are female and 70.4% Caucasian. At transplant, 16/23 were CMV seronegative and 16/23 were EBV seronegative. Protocol therapy was discontinued in eight subjects due to: rejection (3), mouth ulcers (2), leukopenia (1), unrelated causes (2). Clinical acute rejection (AR) occurred in 4 subjects (17%) and 2 had subclinical AR; AR was cellular in 5 subjects and humoral in 1 subject at 4 days post-transplant, who had an undetected positive crossmatch to Class II HLA. There were two graft losses, one due to recurrent FSGS and one due to medication non-adherence. There were no cases of PTLD and no deaths. Leukopenia occurred in 11 subjects (47.8%). There were 9 infections, of which 7 were urinary tract infections (34.7%) and 2 were pneumonias (8.7%). Conclusions: Minimization of immunosuppression using a steroid-free, calcineurin-withdrawal protocol in low-risk pediatric renal transplant recipients appears to be well tolerated, with acceptable rates of clinical AR and no serious infections 12 months after transplantation.

male; 57% white). Substantial increases in BMIz were observed within the first 6 mo.; no changes were seen from 6 to 48 mo. Baseline BMIz category (<-2.0, -2.0 to +2.0, >+2.0) influenced the pattern of change. Subjects with low BMIz (<-2.0) at baseline experienced the greatest increases in BMIz, but overweight was rare; increases tended to result in a BMIz in the healthy range. Those with high BMIz (>+2.0) at baseline demonstrated no significant change in BMIz post-Tx. Younger age at Tx (highest risk for those 2 to 5 y at Tx), more remote date of Tx, and baseline BMI between the 25th and 75th percentiles were significant independent risk factors for unhealthy weight gain, both at 12 mo. and persisting at 48 mo. post-Tx. Weight gains occur early after Tx and tend to persist. Counseling focused on prevention of weight gain should be a routine part of post-Tx care, with the most intense efforts concentrated on the highest-risk patients: young, healthy-weight children. Avoidance of early weight gains may have an important impact on BMI over the long term.
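BMIz values like those categorized above are conventionally derived from reference LMS parameters, as in the CDC growth charts: Z = ((BMI/M)^L - 1)/(L·S) when L ≠ 0. A minimal sketch; the L, M, S values below are invented placeholders, not actual chart entries:

```python
# BMI-for-age z-score via the LMS method: Z = ((X/M)**L - 1) / (L*S) for L != 0,
# and Z = ln(X/M) / S for L == 0. The L/M/S values are invented placeholders,
# not actual CDC growth-chart parameters.
import math

def bmi_z(bmi, L, M, S):
    if L == 0:
        return math.log(bmi / M) / S
    return ((bmi / M) ** L - 1.0) / (L * S)

# Hypothetical child: BMI 21.0 against placeholder reference parameters.
z = bmi_z(21.0, L=-1.6, M=16.5, S=0.12)
print(f"BMIz = {z:.2f}")  # compare against the <-2.0 and >+2.0 cutpoints above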
Referral. Lindsey A. Pote, 1 Jennifer Trofe, 1 Erin H. Wade, 1 Jorge Baluarte, 3 Alden Doyle, 2 Simin Goral, 2 Karen Warburton, 2 Robert Grossman, 2 Jo Ann Palmer, 3 Roy D. Bloom. 2 1 Pharmacy, Hosp of the Univ of Penn; 2 Nephrology, Univ of Penn; 3 Transplant Surgery, Children's Hospital of Philadelphia. Intro: Few data exist on outcomes in pediatric kidney recipients who transition to an adult transplant program. Purpose: To examine outcomes in pediatric kidney recipients who transition to an adult transplant program and to identify characteristics associated with non-adherence-related graft loss. Methods: Retrospective, single-center analysis of 41 pediatric kidney recipients who transitioned to an adult program. Results: For the cohort overall, transition to the adult program occurred a mean of 82 ± 8.7 months following transplantation. Mean serum creatinine at transition was 2.6±3.2 mg/dL and mean patient age 21.2 ± 0.3 years. 61% of patients received living donor kidneys. Within 31±3.5 months of transition, 13 (31.5%) patients experienced graft loss. Causes of graft loss included admitted non-adherence (n=6), recurrent disease (n=3), chronic progressive graft dysfunction (n=3) and BK nephropathy (n=1). Graft loss occurred a mean of 16±7 months post-transition for non-adherent patients and 20.5±3 months for adherent patients (P=ns). The characteristics of the cohort are shown in the table according to whether or not patients had documented non-adherence-related graft loss. Conclusions: 1) Graft loss commonly occurs within 3 years following transition and is attributable to both patient non-adherence and late referral by the pediatric transplant program; 2) non-adherence-related graft loss was more common in males; 3) factors unassociated with non-adherence include ethnicity, prior transplantation, age at transplant, and duration of dialysis; 4) the development of collaborative pediatric-to-adult transition clinics may enhance adherence and lead to improved graft outcomes in this population.

This 2-year, prospective, randomized pilot study compares the effect of conversion to sirolimus (SRL) vs. continued mycophenolate mofetil (MMF) on histological progression in patients with chronic allograft nephropathy (CAN). We present 2-year data on safety and renal function. Participants >1 year post-transplant with CAN (Banff ≥ci1, ct1) on tacrolimus, MMF and prednisone were randomized to continue MMF or convert to SRL (target 8-12 ng/ml). TAC dose was minimized (target 3-5 µg/L), and renal biopsy was performed at baseline, 1, and 2 years. 3-month interval monitoring included adverse event (AE) reporting, eGFR (Schwartz) and immunosuppressant levels. 16/20 (MMF=10, SRL=10) have completed 2-year follow-up. Baseline gender, ethnicity, previous acute rejection episodes, CAN grade, and proteinuria (uPCR) were similar. SRL had lower baseline eGFR (79±17 vs. 109±49 ml/min/1.73m², p=0.22). 285 AEs were reported (MMF=117, SRL=168), including 43 serious AEs (SAE: MMF=25, SRL=28). Common SAEs were similar: dehydration and elevated creatinine (30), gastroenteritis (10) and rejection (AR). Two episodes (1 patient) of AR occurred in the MMF group and 5 (2 patients) in the SRL group, all attributable to non-adherence. There were no sustained changes in electrolytes, hemoglobin or total WBC counts in the first 2 years, except that neutrophil counts were lower in the SRL group (24 mo: mean 4.7 vs. 2.8, p<0.05).
Platelets were significantly lower at 3 months (SRL) but not thereafter (24 mo: mean 271±71 vs. 196±61 ×10⁹/L). Cholesterol and TG increased early (p<0.05), but only the cholesterol elevation persisted after 2 years (mean 5.2 vs. 3.7 mmol/L, p<0.05). In the SRL group only, uPCR increased over 2 years (∆uPCR 94±87 mg/mmol, p<0.05). This was not associated with hypoalbuminemia at 2 years. eGFR was lower in SRL vs. MMF after 18 months (p<0.01), but the rate of change in eGFR from baseline was similar between groups (-28±23 vs. -30±39 ml/min/1.73m², p=NS). Serious adverse events did not differ significantly between the SRL and MMF groups. Sustained SRL treatment was associated with mild neutropenia, hypercholesterolemia and proteinuria after 2 years. eGFR was reduced at baseline compared with MMF, and both groups had similar rates of decline in GFR over time.

Allograft Recipients. Maarten Naesens, 1 Oscar Salvatierra, 1 Li Li, 1 Minnie Sarwal. 1 1 Department of Pediatrics, Stanford University School of Medicine, Stanford, CA. Background: In contrast to adult kidney recipients, little is known about the long-term evolution of tacrolimus pharmacokinetics in pediatric kidney transplant recipients. Methods: One hundred five pediatric recipients of a kidney allograft, all treated with a corticosteroid-free immunosuppressive protocol, were included. The evolution of tacrolimus doses and exposure was recorded at 3, 6, 9, 12, 18 and 24 months after transplantation, as were all pre-dose trough levels (C0; N=9376) obtained in the first 2 years after transplantation. Results: Dose-corrected tacrolimus exposure (C0/dose/kg) increased in the first 2 years after kidney transplantation in pediatric recipients (Table 1, Figure 1). This decrease in dose requirement over time was only significant in children older than 5 years at the time of transplantation (Figure 1). In addition, younger patients had significantly higher dose requirements compared to older recipients, which translated into marked underexposure in 72-79% of patients <12 years of age in the first days after transplantation. Conclusion: Pediatric kidney transplant recipients exhibit maturation of tacrolimus pharmacokinetics with time after transplantation. This cannot be explained by differences in corticosteroid use, as all patients were treated with a corticosteroid-free protocol. The higher dose requirements for younger recipients and the absence of tacrolimus maturation in the youngest recipients suggest that age-dependent changes in tacrolimus intestinal first-pass effect, metabolism or distribution play a role. Whether age-specific tacrolimus dosing algorithms will improve outcome needs further study. Determinants of dose-corrected pre-dose trough levels (C0/dose/kg)
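The exposure metric in this abstract, the dose-corrected trough (C0/dose/kg), is a simple ratio of trough concentration to weight-normalized daily dose. A minimal sketch with invented patient values, showing why a younger child with higher dose requirements shows lower dose-corrected exposure:

```python
# Dose-corrected tacrolimus trough (C0 per weight-normalized daily dose):
# all values below are invented for illustration.
def dose_corrected_trough(c0_ng_ml, daily_dose_mg, weight_kg):
    """Return C0 / (dose/kg), in (ng/mL) per (mg/kg/day)."""
    return c0_ng_ml / (daily_dose_mg / weight_kg)

# Hypothetical young child vs. adolescent reaching the same absolute trough:
print(dose_corrected_trough(c0_ng_ml=6.0, daily_dose_mg=8.0, weight_kg=20.0))  # 15.0
print(dose_corrected_trough(c0_ng_ml=6.0, daily_dose_mg=4.0, weight_kg=50.0))  # 75.0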
Autoimmune hepatitis (AIH) may present as acute hepatitis in 25% of patients (pts) and result in acute liver failure (ALF). Early administration of corticosteroids may obviate the need for liver transplantation (LT), but features which identify AIH in pts with ALF have not been determined. AIM: To identify clinical and histological features which distinguish ALF due to AIH. METHODS: 197/1033 pts in the ALF Study Group Registry had no evidence of viral, metabolic, vascular, or drug/toxic liver injury. All had ALF defined by acute disease, encephalopathy, and coagulopathy. Based upon admission clinical features, 52/197 had probable AIH, and 145 were considered "indeterminate." Liver biopsies (LBx) available from 56 pts were reviewed by a blinded expert hepatopathologist. Clinicopathologic correlations were analyzed retrospectively. RESULTS: All LBx available from pts with suspected AIH had classical histologic features of AIH. Moreover, 23/46 (50%) LBx from pts with indeterminate ALF also had AIH features: extensive necrosis (100%), fibrosis (84%), cirrhosis (22%), interface hepatitis (75%), and plasma cell-rich inflammation (69%). Of all 33 LBx with AIH features, centrilobular lesions associated with acute, severe AIH (Am J Surg Path 2004;28:471) were frequent: plasma cell-rich central venulitis (97%), exclusive centrilobular necroinflammation (19%), and pericentral dilatation/congestion (46%). Clinical characteristics and outcomes of the 75 pts with AIH based upon laboratory and histologic findings differed significantly from pts whose ALF remained indeterminate: AIH pts were older (44 v 38 y), predominantly female (75 v 56%), and had a longer jaundice-encephalopathy interval (20 v 12 d), lower ALT (660 v 1949 U/L), higher globulins (3.7 v 2.7 g/dL), lower creatinine (1.7 v 2.3 mg/dL), and a higher prevalence of ANA (68 v 24%) and ASMA (51 v 29%) (all P<.01). Although spontaneous survival did not differ, more AIH pts underwent LT (63 v 34%), and more survived 1 month after enrollment (75 v 59%; P<.0001). CONCLUSIONS: Using histologic criteria to classify pts with indeterminate ALF, AIH accounted for at least 7% of the ALF Study Group registry, and represented 50% of indeterminate ALF. LBx should be performed in all patients with ALF of obscure etiology, in particular to identify centrilobular necroinflammatory lesions, which appear to be specific indicators of acute and severe AIH. (U-01 58369 from NIDDK.)

(2002-2006). Mikel Gastaca, 1 Miguel Montejo, 1 Lluis Castells, 2 Antonio Rafecas, 3 Antonio Rimola, 4 Ramon Barcena, 5 Federico Pulido, 6 Magdalena Salcedo, 7 Martin Prieto, 8 Manuel de la Mata, 9 Jose R. Fernandez, 1 Jose M. Miro, 4 The Spanish LT in HIV-Infected Patients Working Group. 1 Hospital de Cruces, Bilbao, Spain; 2 Hospital Vall d'Hebron, Barcelona, Spain; Barcelona, Spain; Univ. of Barcelona, Barcelona, Spain; 5 Hospital Ramon y Cajal, Madrid, Spain; 6 Hospital 12 de Octubre, Madrid, Spain; 7 Hospital Gregorio Marañon, Madrid, Spain; 8 Hospital La Fe, Valencia, Spain; 9 Hospital Univ. Reina Sofia, Cordoba, Spain. Background and Aim: We report the preliminary results of the prospective multicenter Spanish study of HIV-1 infected patients who underwent OLT. Methods: The prospective multicenter Spanish study FIPSE-OLT-HIV 05-GESIDA 45-05 was initiated in January 2002. Inclusion criteria followed the rules previously described in the Spanish Consensus Document. HIV-infected patients transplanted between January 2002 and December 2006 were included in this study. Results: 89 OLTs were consecutively performed in 85 HIV-1 infected patients during the study period. Median (IQR) follow-up was 16 (8-29) months. Median (IQR) age was 42 years (39-46), 74% of the recipients were male, and former drug abuse was the most common HIV-1 risk factor (78%). 83% of the patients were transplanted due to HCV-related cirrhosis and 6% due to HBV cirrhosis. Median (IQR) MELD score was 14 (10-19). Pre-OLT median (IQR) CD4 cell count was 288 (165-486) cells/mm3 and 64% of patients had undetectable plasma HIV viral load. Immunosuppression was based on tacrolimus in 70% of the patients. HAART was re-started a median (IQR) of 10 (5-21) days after OLT and was based on efavirenz in 48% of the cases.
Acute rejection occurred in 38 patients (43%). Twenty patients (22.5%) died, mainly due to HCV recurrence (8%). Antiviral therapy with peg-interferon plus ribavirin was initiated in 18 patients, with a sustained viral response obtained in 6 of them (33%). Patient survival (95% confidence intervals) at 1, 2, 3 and 4 years was 90% (81-95%), 74% (60-84%), 67% (52-79%) and 67% (52-79%), respectively. Conclusion: For selected HIV-1 infected patients under HAART therapy, OLT is a safe and effective procedure at mid-term. As in the HIV-negative population, HCV recurrence is the major cause of concern, and response to antiviral therapy is still disappointing.

Hepatitis C Virus and Objectives: Describe the 12-year experience of a University Hospital treating chronic hepatitis C relapse, histologically diagnosed after liver transplant. Material & Methods: From September 1991 through July 2006, all patients who underwent liver transplantation and had a histological diagnosis of chronic hepatitis C relapse received antiviral treatment. Treatment was suspended if there was a severe adverse reaction to the drugs used, non-adherence, severe rejection, or no response after at least 12 months. Patients with positive HCV serology, ALT elevation (1.5× normal), and hepatic biopsy with METAVIR scores showing fibrosis stage ≥1 and/or periseptal or parenchymal portal inflammatory activity ≥2 were included. From 1995 to 2003, all patients were treated with conventional interferon and ribavirin, regardless of the genotype. From October 2003 on, patients with genotype 1 were treated with pegylated interferon and ribavirin, and patients with genotype 3 with conventional interferon and ribavirin. Results: 47 patients were treated during this period, 37 male (78.7%). Their average age was 48. 51.1% were genotype 1, 25.5% were genotype 3. Average time between transplant and beginning of treatment was 30 months. Average treatment duration was 16 (range 4-36) months. 74.5% received conventional interferon, 6.4% pegylated alfa-2a, and 19.1% pegylated alfa-2b. 48.9% had one or more cellular rejection episodes before HCV relapse diagnosis and were treated with corticosteroids. Sustained virological response (SVR) occurred in 48.8%. Three patients had a virological response at the end of treatment (EVR) and have not yet completed six months post-treatment. Of the 21 patients with SVR, 20.9% were genotype 3, 9.3% genotype 1 and 18.6% were not genotyped. Considering genotypes, SVR was detected in 16.7% with genotype 1 and in 75% with genotype 3. By hepatic biopsy, SVR was detected in 85.7% of F1/F2 patients (18 of 21), 9.5% of F3 patients (2), and 4.8% of F4 patients (1). Five-year survival was 94% for SVR patients (20 of 21) and 68.2% for patients without SVR (p = 0.03, Kaplan-Meier). Five-year survival was 79.2% for patients with genotype 1 and 100% for patients with genotype 3. Conclusions: Our results show it is possible to achieve good five-year survival rates for treated patients. However, handling adverse reactions and long-term treatment still pose difficulties in pursuing better SVR rates.

The Impact of Alcoholic Liver Disease (ALD) and Background: Although liver transplant survival benefit has been shown to be associated with MELD score, disease-specific analyses have not been reported. We evaluated, using SRTR data, the effect of ALD and/or HCV infection on transplant waitlist and post-transplant mortality, and the survival benefit of deceased donor liver transplants.
Methods: 38,889 patients age ≥18, who were listed between Sept 2001 and Dec 2006 and followed until Dec 31, 2006, were classified into 4 cells according to HCV and ALD status (HCV+/ALD-: 12,322; HCV+/ALD+: 4,220; HCV-/ALD+: 6,581; HCV-/ALD-: 15,776). Cox regression was used to estimate waiting list mortality and post-transplant mortality separately. Survival benefit, which encompasses both pre- and post-transplant events, was assessed using sequential stratification, an extension of Cox regression which matches transplant recipients by MELD and organ procurement organization to patients active on the waitlist at the time of transplant. Results: HCV significantly increased waitlist mortality, with a covariate-adjusted hazard ratio (HR) for HCV+ vs. HCV- of 1.19 (p=0.0001). The impact of HCV+ was greater among ALD+ candidates (HR=1.36; p<0.0001), but was also significant among ALD- candidates (HR=1.11; p=0.02). The contrast between ALD+ and ALD- waitlist mortality was only significant among HCV+ candidates (HR=1.14; p=0.006). Post-transplant mortality was significantly higher among HCV+ vs. HCV- recipients (HR=1.26; p=0.0009); there was no difference between ALD+ and ALD- recipients. Survival benefit of liver transplantation was significantly lower among HCV+ compared to HCV- recipients with MELD 9-29, but significantly higher for HCV+ recipients with MELD scores ≥30. A diagnosis of ALD did not influence the survival benefit of transplantation at any MELD score. Conclusion: Despite higher waitlist mortality, HCV+ recipients had significantly lower liver transplant survival benefit than HCV- recipients within categories defined by MELD score. In contrast, transplant survival benefit was not influenced by ALD.

Transplanted for HBV-Related Cirrhosis. Giuseppe Tisone, 1 Daniele Di Paolo, 1 Ilaria Lenci, 1 Laura Tariciotti, 1 Andrea Monaco, 1 Manuele Berlanda, 1 Linda De Luca, 1 Giuseppe Iaria, 1 Alessandro Anselmo, 1 Irene Bellini, 1 Mario Angelico. 1 1 Hepatic Surgery and Transplantation, University of Rome Tor Vergata, Rome, Italy. Background and Aim: Post-transplant active immunization with HBsAg vaccine is a potential prophylaxis strategy against HBV recurrence after liver transplantation for HBV-related disease. Previous studies showed conflicting results using standard vaccines, whereas the use of the new adjuvant 3-deacylated monophosphoryl-lipid-A (MPL) significantly increased patients' immunization rates. We investigated the efficacy of a long-term (12 months), accelerated (monthly doses) vaccination schedule using the MPL-adjuvanted vaccine administered with and without concomitant HBIg. Methods: 18 patients (M/F: 13/5) transplanted for HBV-related cirrhosis 73±38 months earlier were recruited. All were HBsAg and HBV DNA negative in serum and cccDNA negative in liver tissue; 5 (27.7%) were co-infected with HCV and 5 (27.7%) with HDV. The study protocol consisted of 12 consecutive monthly intramuscular vaccine doses (HBsAg 20 µg plus MPL 50 µg) given together with lamivudine (100 mg daily). Each of the initial 6 doses (first cycle) was administered within 7 days after a 2000 IU HBIg i.v. infusion, while the last 6 doses were given after complete HBIg withdrawal (second cycle). Anti-HBs titre was determined before each vaccine dose and during follow-up. All patients were maintained on low-level immunosuppression.
Preliminary results: All patients completed the first vaccination cycle; 14 (77.7%) patients received all 12 adjuvanted vaccine doses (first and second cycles) and were monitored for 3 months of follow-up after the end of vaccination. No side effects occurred, and there was no evidence of HBV recurrence. At the end of the first cycle, all patients achieved an anti-HBs titre >100 IU/L (mean 343±209 IU/L) and 3 (16.6%) achieved a titre greater than 500 IU/L. At the end of follow-up, 8/14 (57.1%) and 4/14 (28.5%) had an anti-HBs titre greater than 100 IU/L (mean 411±599 IU/L) and 500 IU/L, respectively. Conclusions: Nine months after HBIg withdrawal, more than half of the patients reached and maintained a protective anti-HBs titre (>100 IU/L). This intensive schedule using the MPL-adjuvanted vaccine, given in combination with HBIg and lamivudine, seems to be more effective than previous HBV vaccination protocols, although longer follow-up is needed to assess its final effectiveness.

Heterologous Immunity Via Alloimmune Responses to Hepatitis C Virus Replication after Liver Transplantation. Hideki Ohdan, 1 Masahiro Ohira, 1 Yuka Tanaka, 1 Kohei Ishiyama, 1 Nobuhiko Hiraga, 2 Michio Imamura, 2 Kazuaki Chayama, 2 Toshimasa Asahara. 1 1 Surgery, Hiroshima University, Hiroshima, Japan; 2 Internal Medicine, Hiroshima University, Hiroshima, Japan. The immunosuppressive environment in liver transplantation (LT) recipients infected with HCV is believed to be associated with the progression of HCV reinfection. In the present study, we simultaneously monitored HCV levels and alloimmune status in HCV-infected LT recipients. To evaluate the immune status of 33 HCV-infected LT recipients, we employed a mixed lymphocyte reaction (MLR) assay using a CFSE-labeling technique. The kinetics of the stimulation index of anti-donor reactive CD4+ T cells clearly mirrored that of the HCV RNA titer, and a significant inverse correlation was observed between the 2 parameters (R² = 0.67). We did not observe a similar relationship between the stimulation index of anti-third-party CD4+ T cells and the HCV RNA titer (R² = 0.03). One possible explanation for this phenomenon might be that cytokines produced by T cells responding to allostimulation display anti-HCV activity either directly or indirectly through the activation of bystander immunocytes. To investigate such possibilities, we performed a transwell culture assay comprising an MLR culture in the upper chamber and a genomic HCV replicon cell culture in the lower chamber, to mimic the anatomical features of the interaction between HCV-infected hepatocytes and alloreactive T cells that infiltrate the portal area in the liver. When one-way MLR was performed using a one-haploidentical combination in the upper chamber, HCV replication was significantly suppressed in the lower chamber containing HCV replicon cells. HCV replication was further suppressed when MLR was carried out with a complete allogeneic combination. This inhibitory effect was dependent on IFN-γ-secreting activity. In addition, a similar result was obtained when IFN-γ-secreting NK/NKT cells stimulated with IL-2 were cultured in the upper chamber. In conclusion, there was a close relationship between the anti-donor and the anti-HCV immune status in LT recipients infected with HCV. Cytokines such as IL-2 and IFN-γ may be produced in response to allostimulation and, even when these cytokines do not cause graft rejection, they display anti-HCV activity.
The elucidation of such a possible heterologous immunity via alloimmune responses to HCV replication might lead to the establishment of a novel method to prevent the progression of HCV reinfection in HCV-infected LT recipients.

Hepatitis C E2 Region Diversity after Liver Transplantation: A Report from the Hepatitis C III Trial. Juan F. Gallegos-Orozco, 1 Hugo E. Vargas, 1 Georges Netto, 2 Gary L. Davis, 3

, and 7.3% are morbidly obese. The goal of this study was to quantify the effect of BMI on access to transplantation (ATT), likelihood of receiving a transplant (LRT), turndown rates (TR), and survival benefit (SB). Methods: ATT was defined as either registering for the deceased donor waiting list or receiving a live donor transplant, and was analyzed based on the USRDS registry using logistic regression. LRT was based on time spent in active status on the waiting list before receiving a transplant, and was analyzed based on the UNOS waiting list dataset using Cox proportional hazards time-to-event analysis. TR was defined as the relative likelihood of being turned down for an organ offer by a provider other than the patient, and was analyzed based on the UNOS organ turndown dataset using negative binomial regression. SB was defined as survival after kidney transplantation versus survival on the deceased donor waiting list, and was analyzed based on the UNOS waiting list dataset using a time-dependent Cox survival benefit model. All models were adjusted for known factors influencing those outcomes.

Measure | LGBP | LSG | LSG
Mean follow-up (months) | 15.4 (range 3-24) | 9 (range 3-18) | 12.5 (12, 13)
Patients >9 months since operation (n) | 6/7 | 4/6 | 2/2
Mean % EWL at >9 months | 61 (range 41-75) | 33 (range 24-40) | 62 (50, 73)
Transplant candidate at >3 months | 7/7 | 5/6 | 2/2
Underwent transplant | 0/7 | 1/6 | 1/2

Complications developed in two patients (both with cirrhosis) and there was no mortality. Mean follow-up was 12.3 months, and mean EWL at 9 months or later was 61% (ESRD), 33% (cirrhosis) and 61.5% (ESLD). Obesity-associated comorbidities were improved or resolved in all patients. Serum albumin and other nutritional parameters 9 months or later after surgery were similar to preoperative levels in all 3 groups. At most recent follow-up, 14 out of 15 patients (82%) have reached our institution's body mass index (BMI) limit for transplantation and are awaiting transplant; one patient with ESLD has undergone a successful lung transplant and one patient with cirrhosis has undergone a successful liver transplant.

Factors that contribute to inequitable access to the transplantation network (UNOS/OPTN) include socioeconomic status, geographic location, and delayed referral. The goal of the MELD allocation system was to assign priority to the sickest candidates and assure equitable access to liver transplantation (LT). The MELD has been validated as a reliable marker of liver disease severity. At the time of referral to the transplant center or at listing, a high MELD score (HM) is an indirect measure of delayed referral. Therefore, the aim of this study was to identify factors associated with a HM at the time of listing for LT. Method: Using the UNOS database, we identified all adult candidates listed for LT from 2002 to 2006. Patients who received a MELD upgrade (e.g., HCC, FHF) and those listed for multi-organ transplant were excluded. The data collected included demographics, insurance payor, diagnosis, and MELD score. MELD score at the time of listing was categorized as low (<20) or high (>20), and insurance type as private, Medicaid, or government (excluding Medicaid).
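For reference, the laboratory MELD score behind the low/high categorization follows the standard UNOS formula, 3.78·ln(bilirubin) + 11.2·ln(INR) + 9.57·ln(creatinine) + 6.43, with inputs floored at 1.0 and creatinine capped at 4.0. A minimal sketch on invented labs; production allocation systems add dialysis handling and 6-40 bounds:

```python
# Laboratory MELD (UNOS variant): floors inputs at 1.0, caps creatinine at 4.0.
# Production allocation adds dialysis rules and 6-40 bounds; labs below are invented.
import math

def meld(bilirubin, inr, creatinine, on_dialysis=False):
    bili = max(bilirubin, 1.0)
    inr_ = max(inr, 1.0)
    cr   = 4.0 if on_dialysis else min(max(creatinine, 1.0), 4.0)
    score = 3.78 * math.log(bili) + 11.2 * math.log(inr_) + 9.57 * math.log(cr) + 6.43
    return round(score)

score = meld(bilirubin=4.1, inr=1.8, creatinine=1.9)
print(score, "-> high MELD (>20)" if score > 20 else "-> low MELD (<20)")  # 24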
Results: During the study period, 15,323 candidates were added to the LT waiting list. Of these, 10,302 (67.2%) were male, with a median age of 53 (range 18-83). Caucasians accounted for 11,511 (75.1%), Hispanics 11.6%, African Americans 8.9%, and others 4.4%. The underlying liver disease was hepatitis C (31.9%), alcohol (15.1%), alcohol/hepatitis C (7.2%), PBC/PSC (9.9%), cryptogenic (7.5%) and others (28.4%). Age, gender and liver disease etiology were not associated with a HM at listing. A HM was associated with ethnicity (AA 49.5%, Hispanic 44.2%, Caucasian 36.6%; p<0.001) and insurance type (Medicaid 46.3%, government 38.8%, private 36.5%; p<0.001). There was no strong interaction between ethnicity/race and insurance in combination as predictors of HM, i.e., both were independent. Conclusion: (1) AA candidates and LT candidates with Medicaid insurance are more likely to have a HM score at initial listing; (2) Hispanics had the lowest rate of private insurance and Caucasians the highest. These results suggest that type of insurance and ethnicity are independently associated with a HM (i.e., sicker patients) at listing. Since a HM score is associated with increased mortality, implementation of strategies that result in timely and equitable access to the transplantation network, regardless of the insurance type or ethnicity of the candidate(s), should be a priority.

Marked Increase in the Use of Inactive Status on the Kidney Transplant Waiting List. Kim Nguyen, 1 Valarie Ashby, 1, 2 further wait time accrued only after active status was restored. On 11/5/03, the OPTN implemented a change in policy that provides for the accrual of waiting time during the entire interval that WL candidates are designated as S7. Methods: We investigated the impact of this policy change on the patterns of S7 designation using the SRTR/OPTN database. Initial and subsequent status on the WL were determined for 130,986 candidates placed on the KI Tx WL from 11/5/00-11/4/06. The probability of becoming active (including receiving a living donor Tx) on the WL was calculated for those who were S7 at listing, before and after the policy change. Results: Table 1 shows trends in the use of S7 before and after the policy change. Of the 1,396 candidates listed as S7 before the policy change, 76% became active within 6 months and 83% within 1 year of WL. For the 11,248 candidates listed as S7 after the policy change, the corresponding figures are 48% and 60%, respectively. Figure 1 demonstrates that, since the implementation of the new policy, the % of patients initially WL in S7 at most US Tx centers has increased. Conclusion: Since the implementation of this policy, the number and % of pts WL as S7 at most KI Tx centers, and the duration of S7 for those initially WL as S7, have increased dramatically. The implications of this practice for access to KI Tx and survival warrant further investigation.

(11%) patients were retransplanted, and 13 (9%) patients had panel reactive antibody >30. The mean cumulative Thymoglobulin dose administered was 5.7 mg/kg, and the primary maintenance immunosuppression started prior to discharge was a sirolimus-containing regimen (59%). At the end of the follow-up period, 98 (65%) patients had functioning grafts and 52 (35%) patients experienced graft loss.
Of the 52 patients with graft loss, 15 (10%) experienced graft loss secondary to death. No pertinent differences were identified between groups. Graft survival for African American recipients with African American or Caucasian donors was 95% vs. 91% at 1 year and 83% vs. 72% at 2 years. Conclusions: Our results suggest that donor race does not adversely affect graft outcomes in African American cadaveric kidney recipients with modern immunosuppression. Donor racial differences may not play a primary role in the inferior graft outcomes of African American cadaveric kidney recipients.

Cardiac Evaluation before Kidney Transplantation: Are We Screening Too Often or Not Enough? Krista L. Lentine, 1 M. A. Schnitzler, 1 J. J. Snyder, 2 D. C. Brennan, 3 P. Hauptman, 1 P. R. Salvalaggio, 1 B. Kasiske. 4 Evaluation for ischemic heart disease (IHD) is a common but non-standardized practice before kidney transplant. We retrospectively studied pre-transplant cardiac evaluation (CE) practices among a national sample of renal allograft recipients. Methods: We examined USRDS data for Medicare beneficiaries transplanted in 1991-2004 with Part A and B benefits from dialysis initiation through transplant. "High" expected IHD risk was defined by clinical traits: diabetes, prior IHD, or >2 other coronary disease risk factors. Pre-transplant CEs were identified by billing claims for noninvasive stress tests and angiography. We quantified individuals with claims for coronary revascularization procedures between CE and transplant, and abstracted post-transplant acute myocardial infarction (AMI) events from claims and death records. Results: Among 27,786 eligible patients, 46.3% (65.4% of high-risk and 20.4% of lower-risk) underwent CE before transplant. Overall, 9.5% of patients who received CE also received pre-transplant revascularization, including only 0.3% of lower-risk patients studied by CE (Table A). The adjusted odds of transplant without CE (higher OR for no CE, Table B) increased sharply with younger age and shorter dialysis duration. Increased likelihood of transplant without CE also correlated with black race, female sex, and certain geographic regions. Post-transplant AMI rates in patients transplanted without CE allow assessment of whether CE was appropriately deferred in those who indeed face low IHD risk after transplant. Among patients transplanted without CE, the 3-yr incidence of post-transplant AMI was 3% and 10% in lower- and high-clinical-risk groups, respectively, but varied by clinical traits within these groups. In lower-risk patients transplanted without CE, black patients faced increased AMI risk compared to whites (adjusted HR 1.52, 95% CI 1.16-2.00). Conclusions: Observed CE practices demonstrate a low yield of pre-transplant revascularization but also raise concern for socio-demographic barriers to evaluation access.
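Adjusted hazard ratios of the kind reported above (e.g., HR 1.52 for AMI risk) come from Cox proportional hazards models. A minimal sketch using the lifelines library on an invented toy dataset; the registry models adjust for many more covariates:

```python
# Cox proportional hazards fit with lifelines: a toy sketch on invented data.
# exp(coef) in the printed summary is the adjusted hazard ratio per covariate.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followed": [1.2, 3.4, 0.8, 5.0, 2.1, 4.4, 0.5, 3.9, 2.7, 1.6],
    "event":          [1,   0,   1,   1,   0,   0,   1,   0,   1,   0],  # 1 = AMI
    "age":            [61,  45,  58,  39,  66,  50,  70,  42,  55,  48],
    "risk_group":     [1,   0,   0,   1,   1,   0,   1,   0,   1,   0],  # 1 = high risk
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="event")
cph.print_summary()  # exp(coef) column = adjusted HR for age and risk_group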
Using Medicare claims files from the USRDS (1990-2003), we examined pregnancies (defined by presence of an inpatient ICD-9 billing code for pregnancy or pregnancy-related complication) in the first 3 years after kidney transplantation (KTX) among women aged 15-45 years whose primary insurance payor was Medicare (N=16,195). Patients were followed until graft failure, death or December 31, 2003. There were 530 pregnancies identified in 483 women during the first 3 post-transplant years. Women who became pregnant during the 1st or 2nd post-transplant year had a shorter time to graft loss than women who became pregnant during the 3rd year (figure). To minimize the survivor bias among women who became pregnant during the 3rd post-transplant year, we determined the association between the timing of pregnancy and graft survival among the subset of women (n=455) who had graft survival of at least two years, using a Cox multivariate regression (sketched below) adjusted for age, race, cause of ESRD, donor source, calendar year of transplant, dialysis vintage, maintenance immunosuppression, and GFR at 6 months after KTX. Pregnancy in the 1st year (HR 2.01, 95% CI 1.35-2.99) and in the 2nd year (HR 1.79, 95% CI 1.23-2.62) was associated with an increased risk of graft loss compared to pregnancy during the 3rd post-transplant year. We concluded that women who are able to wait should be counseled to become pregnant after the second post-transplant year. The aging donor and recipient populations have led to new challenges in simultaneous kidney-pancreas transplantation (SKPT). The purpose of this study was to retrospectively review our single-center experience in SKPT with respect to extended (EX) donor (D) and recipient (R) criteria. Methods: Over a 65-month period, we performed 83 SKPTs with enteric drainage (79 portal venous drainage). EX Ds were defined as age <10 (n=4), >45 (n=12, mean age 50.2 yrs), or donation after cardiac death (DCD, n=4). All DCD donors were managed with extracorporeal support. EX Rs were defined as age >50 (n=20, mean age 55.8 yrs) or those with a pretransplant serum C-peptide level >2.0 ng/ml (n=7, mean 5.7 ng/ml). All Rs received depleting antibody induction (57 rATG, 26 alemtuzumab) with tacrolimus, MMF, and tapered steroids (39 steroid-free). Results: A total of 20 Ds (24%) and 23 Rs (28%) met the above EX criteria. Median waiting time was 10 months, mean pancreas preservation time was 17 hours, and median length of stay was 10 days. With a mean follow-up (f/u) of 28 months, patient (pt) (95% EX D vs 94% non-EX D), kidney (90% EX D vs 89% non-EX D) and pancreas graft survival (85% EX D vs 81% non-EX D) rates were similar between D groups (all p=NS). The incidences of delayed kidney graft function (5% in each group) and early pancreas graft loss due to thrombosis (5% EX D vs 8% non-EX D) were also comparable between D groups. With regard to R groups, pt (87% vs 97%, p=0.13) and kidney (87% vs 92%, p=NS) graft survival (GS) rates were slightly lower in the EX R group compared to the non-EX R group, respectively. However, death-censored kidney GS rates (100% EX R vs 95% non-EX R) were comparable between groups. Uncensored pancreas GS rates (83% EX R vs 82% non-EX R) were similar. The incidences of acute rejection, surgical complications, infection, and other morbidity were comparable regardless of D or R group. At 1-year (or latest) follow-up, renal and pancreas functional parameters were similar between D groups.
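The landmark Cox analysis described in the pregnancy abstract above can be sketched as follows; the data file and column names are illustrative assumptions, not the USRDS extract itself, and categorical adjusters are assumed to be numerically encoded already.

# Minimal sketch of a landmark Cox analysis, assuming hypothetical column
# names; pregnancy in post-transplant year 3 serves as the reference group.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ktx_pregnancy_cohort.csv")           # hypothetical extract

# Landmark at 2 years to limit survivor bias: keep women whose grafts
# survived at least two years, as in the abstract's subset (n=455).
landmark = df[df["graft_survival_years"] >= 2].copy()

covariates = ["time_to_graft_loss", "graft_loss",      # outcome
              "pregnancy_year1", "pregnancy_year2",    # timing indicators
              "age", "donor_living",
              "dialysis_vintage", "gfr_6mo"]           # adjusters (subset, pre-encoded)

cph = CoxPHFitter()
cph.fit(landmark[covariates],
        duration_col="time_to_graft_loss",
        event_col="graft_loss")
cph.print_summary()  # exp(coef) gives HRs comparable to the reported 2.01 / 1.79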
However, the EX R group demonstrated slightly compromised renal and pancreas allograft function and a greater need for oral hypoglycemic agents. Conclusion: Intermediate-term outcomes in SKPT from selected EX Ds or Rs are comparable, although EX R criteria may represent a risk factor for pt survival and functional outcomes. Conclusion: SPK recipients with functioning pancreas grafts have significantly improved kidney and patient survival compared to LD KA and DD KA. However, early pancreas graft failure results in kidney and patient survival rates similar to KA recipients. SPK provides optimal outcomes for patients with DM1, but long-term risk associated with early pancreas loss may be a consideration when selecting SPK vs KA transplantation. The Impact of Long-Term Metabolic Control on Renal Allograft and Patient Survival in Type 1 Diabetes. Christian Morath,1 Martin Zeier,1 Bernd Dohler,2 Jan Schmidt,3 Peter P. Nawroth,4 Gerhard Opelz.2 1 Nephrology, University of Heidelberg; 2 Transplantation Immunology, University of Heidelberg; 3 Transplantation Surgery, University of Heidelberg; 4 Endocrinology, University of Heidelberg. It is a matter of debate whether pancreas allografts independently contribute to renal transplant and patient survival in type 1 diabetics who received a simultaneous pancreas-kidney transplant (SPK). Using the data of the Collaborative Transplant Study (CTS), we studied type 1 diabetic recipients of deceased donor kidneys (DDK), living donor kidneys (LDK), or SPK performed during two time periods: 1984 to 1990 and 1991 to 2000. We analyzed graft and patient survival rates for a maximum of 18 years. DDK recipients showed inferior graft and patient survival compared to LDK and SPK recipients in both time periods. LDK recipients had a superior graft survival rate initially, but the survival rate of kidneys in SPK recipients reached the level of LDK toward the end of the follow-up period. The results of patient survival paralleled those of kidney graft survival: an early advantage of LDK as compared to SPK faded away during follow-up. Multivariate analysis, in which pretransplant cardiovascular risk assessment was appropriately considered, showed that patient survival of SPK recipients was superior to that of LDK recipients beyond the tenth year after transplantation (hazard ratio HR = 0.55, p = 0.005). This was reflected by a lower cumulative cardiovascular death rate in recipients of SPK (37.0%) compared to recipients of DDK (45.8%) or LDK (49.3%). The early survival benefit of LDK compared to SPK is lost during long-term follow-up, probably related to improved glycemic control in SPK recipients. Proposal for a Grading Rejection Schema in Pancreas Allograft Biopsies from a Multidisciplinary Panel of Pathologists, Surgeons and Nephrologists.
1. Normal
2. Indeterminate: Septal inflammation that appears active but the overall features do not fulfill the criteria for mild cell-mediated acute rejection.
3. Acute cell-mediated rejection
- Grade I / Mild acute cell-mediated rejection: Active septal inflammation involving septal structures and/or focal acinar inflammation.
- Grade II / Moderate acute cell-mediated rejection: Multifocal (but not confluent or diffuse) acinar inflammation with spotty acinar cell injury and drop-out, and/or minimal intimal arteritis.
- Grade III / Severe acute cell-mediated rejection: Diffuse (widespread, extensive) acinar inflammation with focal or diffuse multicellular/confluent acinar cell necrosis, and/or moderate or severe intimal arteritis, and/or transmural inflammation (necrotizing arteritis).
- Chronic active cell-mediated rejection: Chronic allograft arteriopathy (arterial intimal fibrosis with mononuclear cell infiltration in fibrosis, formation of neo-intima).
4. Antibody-mediated rejection (= C4d positivity + confirmed donor-specific antibodies + graft dysfunction)
- Hyperacute rejection
- Accelerated antibody-mediated rejection: Severe, fulminant form of antibody-mediated rejection with morphological similarities to hyperacute rejection but occurring later (within hours or days of transplantation).
- Acute antibody-mediated rejection: Specify percentage of biopsy surface with interacinar capillaries positive for C4d.
5. Chronic allograft rejection/graft sclerosis
- Stage I (mild graft sclerosis): <30% of the core surface
- Stage II (moderate graft sclerosis): 30-60% of the core surface
- Stage III (severe graft sclerosis): >60% of the core surface
6. Other histological diagnoses, e.g. CMV pancreatitis, PTLD, etc.
A simple, reproducible, clinically relevant and internationally accepted schema for grading rejection should improve the level of diagnostic accuracy and positively affect patient care. … (log-rank test). Similar results were obtained when death was included as a cause of graft loss (data not shown). Conclusion: PAR is associated with worse long-term kidney graft survival than KAR (a survival-comparison sketch follows below). It is likely that patients who experience PAR within the first year have also experienced undiagnosed sub-clinical KAR. This adds associative data to the argument that the likelihood of concordance between PAR and KAR is high; increased monitoring of kidney function after PAR is therefore warranted. The most efficacious immunosuppressive (immuno) regimen for SPKT is debated, and information on the best regimen to prevent AMR is particularly scarce. We performed a retrospective comparative cohort study that included 136 SPKT patients (Pts) transplanted in 2002-2005, who received maintenance immuno with TAC, MMF and steroids. Two groups were compared: Pts who received induction with alemtuzumab (Alem) (n=97) and Pts who were induced with basiliximab (Basilmab) (n=39). Donor and recipient characteristics were similar. KT acute cellular rejection (ACR) was more frequent with Basilmab (2-yr 12.8% vs 3.1%, p=0.04), but the incidence of biopsy-proven KT AMR was similar (2-yr 18% with Basilmab vs 13.8% with Alem, p=NS). No differences in prevalence were detected considering early AMR (<90 days) or late AMR (>90 days) separately. Multivariate analyses showed that the only significant risk factor for AMR was female gender (adjusted HR 2.97, p=0.016). Biopsy-proven KT rejection was associated with clinical PT rejection in 13 out of the 21 AMR cases (62%), without differences between the groups. Post-rejection KT graft survival was similar in both groups (2-yr Basilmab/Alem 94.7/91.2%), but death-censored KT survival was lower with Alem (100/91.2%, p=0.05). The predominant cause of KT loss in Pts induced with Basilmab was death-with-function (n=3), while in Pts induced with Alem acute or chronic AMR was the most common cause of KT attrition (n=5).
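The source is truncated just before the log-rank result quoted above; a comparison of that shape is typically set up as in the sketch below. All durations and event flags are invented placeholders, not the study's data.

# Minimal sketch: Kaplan-Meier curves and a log-rank test comparing kidney
# graft survival after pancreas acute rejection (PAR) vs kidney acute
# rejection (KAR). All durations/events below are invented placeholders.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

par_years  = [1.5, 2.0, 3.2, 4.8, 6.0, 7.5]   # years to graft loss or censoring
par_events = [1,   1,   1,   0,   1,   0]     # 1 = graft loss, 0 = censored
kar_years  = [2.5, 4.0, 5.5, 7.0, 8.0, 9.1]
kar_events = [1,   0,   0,   1,   0,   0]

kmf = KaplanMeierFitter()
kmf.fit(par_years, event_observed=par_events, label="PAR")
print(kmf.median_survival_time_)

result = logrank_test(par_years, kar_years,
                      event_observed_A=par_events,
                      event_observed_B=kar_events)
print(f"log-rank p = {result.p_value:.3f}")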
Pancreas survival was similar between both groups. More PT losses due to AR occurred in the Alem-treated group than in the Basilmab group (4 vs 1). Nearly all early AMR episodes resolved with treatment and were not associated with worse graft survival. Conversely, late AMR episodes carried a worse prognosis, with decreased 2-yr KT (61.5% vs 96.2%, p<0.0001) and PT graft survival (47.7% vs 92.3%, p<0.0001). Late AMR was associated with graft loss in multivariate Cox models (kidney loss HR 3.34, p=0.05; pancreas loss HR 3.22, p=0.049). AMR is common in SPKT recipients. ACR is better prevented with Alem than with Basilmab, but no relevant difference is found between Alem and Basilmab in AMR. Late (as opposed to early) AMR episodes are associated with significant reductions in SPKT survival rates despite treatment. Preventive and/or different treatment strategies are required to address late AMR in SPKT. Intraoperative Fluorescence Imaging (IFI) in Pancreas Transplantation (PT) To Determine Vascular Patency and Allograft Perfusion. Edmund Q. Sanchez,1 Srinath Chinnakotla,1 Marlon F. Levy,1 Robert M. Goldstein,1 Goran B. Klintmalm.1 1 Baylor Regional Transplant Institute, Dallas/Ft. Worth, TX. Thrombotic complications of PT are well known. We report IFI using the SPY device to assess immediate vascular patency and allograft perfusion. Our controlled experience is presented here. Methods: PTs were imaged intraoperatively using the SPY device under our IRB-approved study. Indocyanine green solution (2.5 mg/ml) was injected into central venous catheters after completion of the vascular anastomoses of whole pancreaticoduodenal allografts. The pancreas transplants were performed with the retroperitoneal, portal- and enteric-drained technique described by Boggi et al. Imaging of the allograft vasculature, perfusion of the pancreas, and perfusion of the duodenal segments was performed and recorded intraoperatively. All video sequences were archived for later review. Results: IFI on 7 pancreas transplants (6 Simultaneous Pancreas-Kidney and 1 Pancreas after Kidney) was performed and video sequences were recorded. All pancreas allografts demonstrated intraoperative vascular patency and complete pancreatic and duodenal perfusion. No side effects were seen. All pancreas transplants had immediate graft function. One patient was re-explored on postop day 0 due to persistent acidosis and hypotension. Repeat IFI demonstrated vascular patency and perfusion, despite an ischemic external physical appearance. There were no vascular complications that required reoperation, nor were there any graft thromboses. Characteristic slow venous outflow was seen in each case. Conclusion: IFI with the SPY device is a simple and effective method to determine immediate patency and perfusion of the whole pancreaticoduodenal allograft. Its use is beneficial in re-exploration to rule out infarcted pancreas allografts. Further development in quantification of vascular flow rates and allograft perfusion indices by this device is in progress. Once these are established, this information will be studied in a controlled fashion and will be compared with clinical outcomes. We have demonstrated safety and developed a protocol for using the SPY device in IFI of pancreas transplants. Tolerance/Immune Deviation I. Tolerance without Immunosuppression: Exploiting Epigenetic Regulation as a New Approach To Achieving Donor-Specific Allograft Tolerance. Liqing Wang,1 Ran Tao,1 Joel A. Friedlander,1 Wayne W. Hancock.1
1 Pathology & Immunology, Children's Hospital of Philadelphia & UPenn, Philadelphia, PA. Given the toxicities of all current immunosuppressive agents, plus complications of malignancy, infection and chronic rejection, maintaining the search for new approaches to tolerance induction is essential. While weaning of immunosuppression is fraught with risks and costimulation blockade has not yielded the expected boon, use of a broad histone deacetylase inhibitor (HDACi), such as trichostatin A (TsA) or suberoylanilide hydroxamic acid (SAHA), promotes acetylation of Foxp3 and its binding to target genes, leading to enhanced Treg suppressive function, and 2 wks of therapy with HDACi and low-dose rapamycin can induce allograft tolerance. We now show that, in addition to acetylation, modulation of a second major epigenetic mechanism, DNA methylation, has salutary effects, such that combined use of an HDACi and a DNA methyltransferase inhibitor (DNMTi, e.g. 5-aza-2'-deoxycytidine) is particularly potent. Microarray analysis of the effects of HDACi on Tregs showed down-regulation of multiple genes associated with DNA methylation, including DNMT, methyl-CpG-binding domain and associated proteins, leading us to test the effects of DNMTi use on Treg function. DNMTi administration decreased Foxp3 gene methylation, enhanced Treg gene expression, and increased Treg suppression in vitro, and led to a dose-dependent prolongation of cardiac allograft survival (BALB/c->C57BL/6) in vivo (p<0.01). The combination of HDACi and lower doses of DNMTi, administered for just 2 wks, led to permanent engraftment (>100 d, p<0.001), and the permanent acceptance of second donor allografts but acute rejection of third-party (CBA) cardiac allografts indicated induction of donor-specific tolerance post-epigenetic therapy. Analysis of long-surviving cardiac allografts showed a minor infiltrate consisting primarily of Foxp3+ Tregs and an absence of chronic rejection (transplant arteriosclerosis, myocardial fibrosis), whereas combined epigenetic therapy led to only minor prolongation of allograft survival in Treg-depleted recipients. In summary, brief use of 2 clinically approved agents (an HDACi plus a DNMTi) can synergistically enhance Treg function and induce Treg-dependent donor-specific allograft tolerance without use of any immunosuppression. We conclude that insights gained through epigenetic targeting may provide completely new approaches to the taming and regulation of otherwise powerful immune responses post-transplantation. A Novel Epigenetic Approach To Generate Mouse and Human CD4+CD25+Foxp3+ Regulatory T Cells. Girdhari Lal,1 Nan Zhang,1 William van der Touw,1 Yaozhong Ding,1 Jonathan S. Bromberg.1 1 Gene and Cell Medicine, Mount Sinai School of Medicine, New York City, NY. Background: Constitutive Foxp3 expression is required for stable and suppressive CD4+CD25+ regulatory T cells (Treg). We previously showed that demethylation of an upstream CpG island of the Foxp3 promoter is characteristic of stable and suppressive Treg. Using DNA methyltransferase inhibitors, we present a novel method to generate antigen-specific Treg from mouse and human CD4+CD25- T cells. Methods: Naïve CD4+CD25- T cells and natural Treg (nTreg) were purified from wild type or Foxp3-GFP transgenic mice. Human naïve CD4+CD25- T cells were purified from PBMC.
T cells were cultured with irradiated antigen presenting cells in the presence of IL-2, anti-CD3ε mAb, TGFβ or the DNA methyltransferase inhibitor 5-aza-2'-deoxycytidine (ZdCyd), which demethylates the upstream promoter. Foxp3 expression was determined by flow cytometry and qRT-PCR. Results: Naïve T cells cultured under stimulatory conditions with ZdCyd express increased Foxp3 mRNA (30-fold) and intracellular Foxp3 protein (35% of cells vs 2% without ZdCyd). Foxp3 expression is synergistically enhanced with TGFβ (76±4.6% vs 38.8±5.3% ZdCyd alone or 44.6±1.4% TGFβ alone), similar to nTreg (92.4±2.5%). TGFβ in combination with other DNA methyltransferase inhibitors (procainamide, hydralazine, RG108) also induces Foxp3. ZdCyd plus TGFβ-induced Treg stably express Foxp3 and similar surface markers and cytokine mRNA as nTreg. ZdCyd plus TGFβ-induced Treg suppress proliferation of CD4+CD25- T cells, prevent CD4+CD25-CD45RBhi-induced colitis in SCID mice, and enhance islet allograft survival. ZdCyd plus TGFβ induce Foxp3 expression in antigen-specific wild type or T cell receptor transgenic CD4+ T cells stimulated with alloantigen or peptides, and in naïve CD8+CD25- T cells. In vivo administration of ZdCyd preferentially preserves thymic CD4+CD25+Foxp3+ Treg generation. ZdCyd alone or ZdCyd plus TGFβ induce strong Foxp3 expression in human CD4+CD25- T cells compared to TGFβ alone. ZdCyd plus TGFβ-induced human Treg suppress the proliferation of naïve CD4+CD25- T cells, whereas TGFβ-induced Treg do not. Conclusion: DNA methyltransferase inhibitors induce stable and suppressive Foxp3+ Treg from peripheral CD4+CD25- T cells. These findings have important implications for understanding T cell development and differentiation, and provide a clinically applicable technique for manipulating epigenetic regulation for the generation of Treg for tolerance. Macrophages Driven to a Novel State of Activation Can Promote Tolerance. Katharina Kronenberg,1 Seiichiro Inoue,1 James A. Hutchinson,2 Beate G. Brem-Exner,2 Gudrun E. Koehl,1 Hans J. Schlitt,1 Fred Fandrich,2 Edward K. Geissler.1 1 Surgery, University of Regensburg, Regensburg, Germany; 2 Surgery, University Hospital Schleswig Holstein, Kiel, Germany. Background: The use of immunomodulatory cells in organ transplantation (Tx) is a promising tolerance-induction strategy. Here we describe a novel macrophage population capable of enriching T regulatory cells (Treg) and promoting tolerance. Methods: BALB/c bone marrow, spleen and blood mononuclear cells were cultured with M-CSF for 5 days, with a 24-hr pulse of IFN-γ on day 4; the resultant adherent cells are referred to as IFNγ-induced monocyte-derived cells (IFNγ-MdC). Residual lymphocytes from these cultures were recultured with the adherent IFNγ-MdC for an additional 3 days. IFNγ-MdC were phenotyped by FACS, and immunomodulation of lymphocytes was determined by cell counting and FACS analysis for Tregs. Mouse heterotopic heart Tx was used for in vivo testing. Results: IFNγ-MdC highly express CD11b/c, F4/80 and PD-L1. Compared to classically-activated (M1) macrophages, IFNγ-MdC express less CD40/CD86, but higher CD11c. IFNγ-MdC profoundly delete activated, but not resting, lymphocytes (>60%), with cell contact (via transwell system) and caspases (via inhibitors) shown to be essential. Most interestingly, IFNγ-MdC highly enrich lymphocytes for CD4+CD25highFoxp3+ Treg (41±2%; n=27).
The same, or higher, proportions of Treg developed when IFNγ-MdC were produced from MACS-purified CD11b+ and CD4+ cells (1:1). IFNγ-MdC culture supernatant showed increased IL-10 (ELISA) vs. monocyte control cultures (183±42 pg/ml vs. <30 pg/ml, respectively; n=6), suggesting IL-10 as one potential mediator. IFNγ receptor signaling and CD40 interactions are required for Treg enrichment, as confirmed using IFNγ receptor and CD40 knock-out mice (C57BL/6 mice). IFNγ-MdC derived from CD11b+ cells and lymphocytes of OVA-transgenic OTII mice showed enrichment of Treg by 10-fold with high-affinity OVA peptide (323-339) vs. Treg levels using a low-affinity OVA peptide (OVA 323-334), suggesting Ag-specific Treg cells can be enriched with IFNγ-MdC. Finally, a single post-heart Tx (d+1) i.v. injection of 5x10^6 donor (C3H)-derived IFNγ-MdC prolonged allograft survival in BALB/c recipients from 8.5±0.8d in controls (n=5) to 21.4±8.3d (n=11; p=0.001). Conclusions: IFNγ-MdC are a novel macrophage population capable of deleting activated T cells and highly enriching remaining lymphocytes for Tregs. Therefore, IFNγ-MdC could be useful in a clinical setting for promoting transplant tolerance. Plasmacytoid DC Therapeutic Potential Is Independent of IDO Induction Due to Elevated DAP12 Expression. Tina L. Sumpter,1 Bridget L. Colvin,1 Zhiliang Wang,1 Andrew L. Mellor,3 Angus W. Thomson.1,2 1 Starzl Transplantation Institute, Dept. of Surgery, University of Pittsburgh, Pittsburgh, PA; 2 Immunology, University of Pittsburgh, Pittsburgh, PA; 3 Immunotherapy Center, Medical College of Georgia, Augusta, GA. Optimizing the mechanisms of pDC tolerogenicity may facilitate their use in the development of cell-based tolerance induction strategies. pDC may undermine T cell function via inducible expression of IDO, a tryptophan catabolizer that enhances T cell apoptosis. IDO is negatively regulated by DAP12. In some systems, loss of DAP12 correlates with immunostimulatory activation of pDC, while in others, its absence correlates with enhancement of pDC tolerogenicity through induction of IDO. In these studies, we characterized the ability of WT and IDO-deficient pDC to attenuate murine heart allograft rejection in order to define the contribution of IDO and its regulation by DAP12 relative to the immunoregulatory potential of pDC. Methods: pDC and control myeloid (m) DC were generated from BM cultures of WT or IDO KO C57BL/6 (B6; H2b) mice. WT mDC, pDC or IDO KO pDC were pulsed with alloAg (BALB/c; H2d), stimulated with CpG, and then injected i.v. into syngeneic (B6) recipients 7d before heart transplant. Donor Ag-specific activation of T cells was analyzed by MLR and ELISA. DAP12 expression was evaluated by RT-PCR and abrogated with siRNA. Results: Administration of IDO KO pDC 7d before transplant prolonged graft survival beyond that seen with control mDC, but not to the extent seen with WT pDC, suggesting involvement of IDO. pDC cultures from IDO KO mice exhibited a more mature phenotype than their WT counterparts, with reduced expression of co-regulatory molecules (e.g., ICOSL). Although IDO KO pDC induced enhanced T cell alloactivation compared to WT pDC, use of the IDO inhibitor 1-MT indicated that WT pDC do not produce functional IDO. Further analyses showed WT pDC exhibited high levels of DAP12 mRNA expression. Silencing DAP12 had no effect on the ability of WT pDC to activate allogeneic T cell proliferation, but significantly enhanced IFNγ secretion.
Addition of 1-MT to MLR with DAP12-silenced pDC increased T cell proliferative responses to alloAg, verifying IDO induction in DAP12-silenced pDC. Conclusions: These data underscore the prophylactic potential of pDC for cell-based therapy that may be independent of IDO due, in part, to elevated DAP12 expression. Additionally, our data support a role for DAP12 in both immunostimulation and immunoregulation by pDC. Rapamycin Preferentially Blocks the Expansion of Potentially Tolerogenic Plasmacytoid Dendritic Cells In Vivo. Heth R. Turnquist,1 Angus W. Thomson.1,2 1 Starzl Transpl Inst and Dept of Surgery; 2 Dept of Immunol, Univ of Pittsburgh Sch of Med, Pittsburgh, PA. Rapamycin (RAPA), a 'tolerance-sparing' immunosuppressant with anti-proliferative properties, inhibits myeloid (m) dendritic cell (DC) differentiation/maturation in vitro. RAPA decreases CD11c+ DCs in the mouse spleen and suppresses the expansion of total CD11c+ DC following administration of the DC growth factor, fms-like tyrosine kinase 3 ligand (Flt3L). However, the influence of RAPA on plasmacytoid (p) DC, the principal type-1 IFN producers in the body, known to regulate innate and adaptive immune responses and reported to promote experimental transplant tolerance, has not been evaluated. Methods: DC were propagated from bone marrow for 10 days (d) in Flt3L, in the absence or presence of a clinically relevant dose of RAPA (10 ng/ml). In addition, DCs were mobilized in C57BL/10 mice by i.p. Flt3L (10 mg/d for 10 d; d1-10). Untreated and Flt3L-treated groups also received RAPA (0.5 mg/kg/d i.p.; d3-10). On d11, DC were isolated by density gradient centrifugation and/or CD11c+ positive selection. mDC (CD11c+CD11b+) and pDC (CD11cloCD11b-B220+) were identified by flow cytometry. Results: RAPA suppressed pDC and mDC generation in vitro; RAPA-treated cultures had only 22% of the pDC and 40% of the mDC found in control conditions. In normal mice, both mDC (54% of control) and pDC (24.5%) numbers were reduced significantly by RAPA administration. RAPA, when given concurrently with Flt3L, blunted the typical profound expansion of mDC. Specifically, Flt3L- and RAPA-treated mice displayed a 46±8X increase over control (steady state) numbers compared to a 145±39X increase in mice treated with Flt3L alone. Flt3L-induced expansion of pDC was impacted by RAPA to a much greater extent, as absolute numbers of pDC increased only 6.5±2.2X over control numbers compared to a 51±3X increase in absolute splenic pDC in Flt3L-treated mice. Conclusion: These data identify RAPA as a selective suppressor of pDC generation, as described for corticosteroids. This has significant implications, given the use of RAPA following organ transplantation and the suggested importance of secondary lymphoid tissue pDC for promotion of transplant tolerance mediated by Treg. Due attention to these disparate effects of RAPA on regulating cell populations will be necessary to optimize therapeutic regimens for safe promotion of tolerance. The liver is tolerated better than other transplanted organs. This may reflect the inherent tolerogenicity of liver dendritic cells (DC) and interactions with Foxp3+ regulatory T cells (Treg). Recent studies have shown that CD103 expression on DC correlates with induction of Foxp3+ Treg, and can be enhanced in the presence of transforming growth factor-β (TGF-β) and retinoic acid (RA), both of which are produced in the liver.
This study evaluates CD103 expression on liver plasmacytoid (p) and myeloid (m) DC and the potential of liver DC subsets to induce Tregs. Methods: pDC and mDC were magnetically isolated from livers and spleens of C57BL/10 mice and co-cultured with CFSE-labeled allogeneic (BALB/c) CD4+CD25- T cells. After 4 or 5d of co-culture, T cells were assessed for CFSE dilution and intracellular Foxp3 expression. rhTGF-β and either all-trans or cis-RA were added to co-cultures. CD103 expression was evaluated by flow cytometry. Results: CD103 expression was elevated on liver pDC and mDC compared to spleen pDC or mDC, with higher expression on liver mDC. pDC from the liver and the spleen were poor inducers of Foxp3 in CD4+CD25- T cells compared to mDC. Foxp3 induction was enhanced with TGF-β when splenic pDC were used as stimulators. However, when liver pDC were used as stimulators, TGF-β had no effect on Foxp3 induction. RA (either all-trans or cis) enhanced induction of Foxp3+ cells in CD4+CD25- T cells in the presence of TGF-β when splenic pDC but not liver pDC were used as stimulators. TGF-β, and TGF-β with RA, also enhanced Foxp3 induction in CD4+CD25- T cells when either spleen or liver mDC were used as stimulators, though splenic mDC were superior inducers of Foxp3. Conclusions: Many studies have focused on naturally-occurring Foxp3+ Tregs in systemic tolerance. Our data show that liver mDC and pDC are poor inducers of Foxp3 in T cells, even in the presence of TGF-β and RA. These data suggest that the inherent tolerogenicity of liver DC subsets may be independent of Treg induction, or that liver DC interact with Treg through alternative mechanisms. Background: T cell-mediated immune rejection occurs in organ transplantation. In addition to MHC-TCR signaling, T cell activation requires costimulation from antigen presenting cells. B7 molecules (CD80/CD86) and CD40 are critical costimulatory molecules in T cell activation. Insufficient or absent costimulation results in inactivation or tolerance. We hypothesized that blocking the costimulation pathway using small interfering RNA (siRNA) expression vectors could prolong allogeneic heart graft survival. Method: Vectors that express hairpin siRNAs specifically targeting CD40 and CD80 were prepared. Recipients (BALB/c mice) were treated with CD40 and/or CD80 siRNA vectors 3 and 7 days prior to heart transplantation. Control groups were injected with a blank vector or sham treatment (PBS). After siRNA treatment, a fully MHC-mismatched (BALB/c to C57BL/6) heart transplantation was performed. Result: Allogeneic heart graft survival (>100 days) was approximately 70% in the mice treated simultaneously with CD40 and CD80 siRNA vectors. In contrast, allogeneic hearts transplanted into recipients treated with blank vector or PBS stopped beating within 10 days. Hearts transplanted into CD40 or CD80 siRNA vector-treated recipients had an increased graft survival time compared to negative control groups, but did not survive longer than 40 days. Real-time PCR and flow cytometric analysis showed an upregulation of FoxP3 expression in spleen lymphocytes and a concurrent downregulation of CD40 and CD80 expression in splenic dendritic cells of siRNA-treated mice. An MLR, using splenic dendritic cells (DCs) isolated from tolerant recipients, showed a significantly lower T cell proliferation capacity in CD40- and CD80-siRNA vector-treated mice, compared to control groups.
Tolerant DCs from CD40- and CD80-treated recipients promoted CD4+CD25+FoxP3+ regulatory T cell differentiation. Finally, tissue histopathology demonstrated an overall reduction in interstitial lymphocyte infiltration, vascular obstruction, and edema in mice treated with CD40 and CD80 siRNA vectors. Conclusion: This study demonstrates that the simultaneous silencing of CD40 and CD80 genes has synergistic effects in preventing allograft rejection, and may therefore have therapeutic potential in clinical transplantation. Costimulation blockade (CoB) of the CD28/B7 pathway has long been suggested as a means of attaining indefinite, well-tolerated, antigen-specific prophylaxis from allograft rejection. Although effective in rodent models, CD28/B7 blockade alone is not sufficient to prevent rejection in more robust primate models. The relative presence of allo-crossreactive memory T cells is one mechanism of CoB-resistant rejection. LFA3-Ig is an approved agent for the treatment of psoriasis, shown clinically to deplete TEM cells in psoriatic lesions. We hypothesize that LFA3-Ig specifically targets CoB-resistant cells, including heterologous alloreactive T cell memory, while avoiding interference with peripheral mechanisms that foster CoB-mediated allograft acceptance. Rhesus monkeys underwent MHC-mismatched renal allotransplantation. CTLA4-Ig was given at 20 mg/kg IV on days -1, 0, 3, 7, 14, 21, 28 and 10 mg/kg IV on days 35, 42, 49, 56. LFA3-Ig (0.3 mg/kg IV) was given on days -1, 0, 3, 7, then weekly (8 wks). Sirolimus was given orally (1 mg/kg) on days 0-90, and donor-specific transfusion (7 ml/kg IV) on day -1. Animals were followed by polychromatic flow cytometry to quantify T cell subsets. LFA3-Ig synergized with CTLA4-Ig, successfully preventing renal allograft rejection in this model and leading to indefinite survival in 2/5 animals (Table). This enhanced survival was associated with a significant reduction in TEM cells in animals treated with LFA3-Ig compared to animals without LFA3-Ig treatment (figure). LFA3-Ig withdrawal led to TEM re-population and rejection. This regimen, composed entirely of clinically available agents, promotes CoB-mediated graft acceptance without the use of calcineurin inhibitors, steroids, or gross T cell depletion. The aim of this study was to investigate the immunological pathways that regulate T cell-dependent allograft responses in the hope of finding novel immunomodulatory compounds. Islet (and skin) allografts were transplanted into fully MHC-mismatched Wild Type (WT) and BAFF-Transgenic (BAFF-Tg) recipient mice. Allograft function/rejection was monitored by blood glucose levels and/or confirmed by nephrectomy. Histology, splenocyte populations & T cell functions were analyzed. The B cell activation factor from the TNF family (BAFF) is critical for B cell survival. T cells express BAFF receptors, suggesting that BAFF may also play a role in T cell responses. Contrary to expectations, we found that BAFF-Tg mice accepted a fully MHC-mismatched islet allograft for >100 days. In addition, BAFF-Tg mice also showed delayed skin graft rejection. This was due to a T cell-intrinsic change in allo-responsiveness, as shown by the failure of purified BAFF-Tg T cells to reject an allograft in adoptive transfer experiments. However, BAFF-Tg T cells were not anergic per se, as they proliferated normally to mitogens in vitro and to antigenic challenge in vivo. Intriguingly, BAFF-Tg mice harbored an increased frequency of T regulatory cells in the periphery as compared to WT mice.
Elimination of CD25+ T cells restored normal allograft rejection in BAFF-Tg mice, demonstrating that the increased number of Treg cells was responsible for their altered allo-immunity. In a second approach, BAFF-Tg T cells depleted of CD4+CD25+ T cells were adoptively transferred to transplanted RAG recipients, and in this case the BAFF-Tg CD4+CD25- T cells rejected their allograft. Proliferation assays showed that excessive BAFF was not promoting Treg expansion by driving proliferation in the periphery. Adoptively transferred GFP-expressing syngeneic splenocytes did not exhibit enhanced survival in the presence of excessive BAFF. However, analysis of Treg cells in the thymus of BAFF-Tg mice showed an increased intrathymic frequency. Together these results demonstrate that BAFF-Tg mice harbor an expanded number of Treg cells that prevent Th1-type immune responses, including allograft rejection. Furthermore, we demonstrate that manipulating BAFF levels may provide a means to generate immunosuppression-free dominant allograft tolerance. We have previously reported the outcome of pig-to-baboon thymokidney transplants from GalT-KO miniature swine using an immunosuppressive regimen designed to facilitate the induction of tolerance. Although the results were superior to results using hDAF thymokidneys, there was a high rate of early post-operative complications. We investigated whether the elimination of steroids and whole-body irradiation (WBI) from the treatment protocol would decrease the complication rate in the perioperative period. Methods: 7 baboons received thymokidney transplants from GalT-KO miniature swine. The immunosuppressive regimen was based on the previously published protocol, but eliminated steroids and substituted Rituximab for WBI. It consisted of thymectomy on day -21, splenectomy on day 0, ATG, Rituximab, MMF and anti-CD154 mAb. Renal function was assessed by serum creatinine levels. Evidence for baboon thymopoiesis in the pig thymus was assessed by FACS and immunohistochemistry. Results: One animal died due to a drug reaction on POD 18, with normal renal function and no evidence of infection. The remaining 6 baboons showed no signs of rejection, although IgM deposition was observed, likely due to preformed non-Gal nAb. There were no deaths due to rejection. Although early infectious complications were not seen, late complications leading to mortality were still observed, including systemic CMV infection (POD 28), pleural effusions (POD 40, 57), ARDS with pulmonary hemorrhage (POD 49, 81) and acute myocardial infarction (POD 83). Two animals showed evidence for early thymopoiesis in the pig thymus by FACS and immunohistochemistry. CD4+/CD8+ thymocytes were seen in the thymic grafts, indicating that T cell development was supported by the pig thymus. The average survival of the steroid-free group was 51 days including the animal with the drug reaction, and 56 days excluding it, compared to 34.6 days in the regimen that included steroids/WBI. Conclusions: A steroid-free and WBI-free regimen designed for the induction of tolerance across the pig-to-baboon xenogeneic barrier had fewer early post-operative complications than previous regimens. Greater than 50-day average survival of recipients of life-supporting xenogeneic thymokidneys was observed without rejection. This regimen also appears to permit baboon thymopoiesis in the pig thymus.
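The thymokidney abstract above reports average survival without naming a statistical comparison; for cohorts this small a nonparametric test is one defensible choice. In the sketch below, the steroid-free survival days are taken from the events listed in the abstract, while the historical steroids/WBI values are invented to approximate the reported 34.6-day mean and make the sketch runnable.

# Hedged sketch: nonparametric comparison of survival days between the
# steroid-free/WBI-free regimen and the historical steroids/WBI regimen.
# The abstract does not name a test; Mann-Whitney U is an assumption here.
from scipy.stats import mannwhitneyu

steroid_free = [18, 28, 40, 49, 57, 81, 83]  # PODs listed above (mean ~51 d)
steroids_wbi = [19, 25, 31, 36, 44, 52]      # invented values (mean ~34.5 d)

stat, p = mannwhitneyu(steroid_free, steroids_wbi, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")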
Purpose: The effects of adding human decay accelerating factor (hDAF) to the galactosyl transferase knock-out (GalT-KO) background in transgenic pig kidney xenotransplantation in baboons have not been previously studied. Methods: Baboon recipients of GalT-KO (n=4) or GalT-KO/hDAF (n=2) pig kidneys received Flolan, heparin and/or aspirin. Steroids (S), cobra venom factor (V), ATG (A), MMF (M), and/or a low-dose CD154 blockade (C) regimen were given. Results: Pre-transplant (tx) anti-non-Gal antibody (Ab) titers were inconsistently associated with early GalT-KO kidney xenograft failure. High D-dimer levels 4 hours post-tx were closely associated with early GalT-KO graft loss, whereas levels of C3a, BTG, and CD62P expression on circulating platelets were not. Regimens lacking MMF and/or CD154 blockade elicited large anti-donor non-Gal Ab responses by day 7-14. Post-operative creatinine levels for the GalT-KO/hDAF recipients were lower, along with superior graft survival, compared to GalT-KO; D-dimer, anti-donor non-Gal Ab, C3a, BTG, and F1+2 measurements are in progress for the GalT-KO/hDAF recipients. [Table excerpt] GalT-KO/hDAF: regimen V,A,M,S,C; graft survival >10 days*; D-dimer TBP; 239; creatinine 1.9; 1.5. (* Graft still functioning at time of abstract submission; TBP, to be processed; !, anuric: non-life-supporting but appeared viable, pink, and well-perfused until day 7.) Conclusion: Early failure of vascularized GalT-KO organs is closely associated with activation of coagulation pathways; whether this is triggered primarily by anti-non-Gal Ab or by other mechanisms is not addressed by our study. Preliminary results using GalT-KO/hDAF pig kidneys show promise in attenuating or possibly preventing these and/or other such mechanisms. The established efficacy of CD154-based costimulation blockade with MMF to prevent induced anti-non-Gal antibody elaboration is confirmed by our preliminary findings. Recovery of Cardiac Function after Pig-to-Primate Orthotopic Heart Transplant. Christopher G. A. McGregor,1 William R. Davies,1 Keiji Oi,1 Henry D. Tazelaar,1 Randall C. Walker,1 Krishnaswamy Chandrasekaran,1 Guerard W. Byrne.1 1 Mayo Clinic, Rochester, MN. Xenotransplantation has been proposed as a method to alleviate the shortage of donor organs. We report survival of up to 8 weeks in three orthotopic pig-to-baboon cardiac xenotransplants. Pig-to-baboon orthotopic transplantation was performed using adult baboon recipients and CD46 transgenic pigs. The recipients were treated with an α-Gal polymer to block the effects of anti-Gal antibody, and with ATG induction and tacrolimus- and sirolimus-based maintenance immunosuppression. Heart function was continuously monitored by intramyocardial electrocardiography and every 3-4 days by serial echocardiography. Standard hematological, serum chemistry and cardiac troponin levels were monitored every 2-3 days. The animals survived 34, 40, and 57 days in a healthy condition. Mortality resulted from pneumonitis, respiratory failure and bowel infarction, respectively. One recipient (survival 40 days) showed impaired perioperative left ventricular function, with an ejection fraction of 25% on echo. Over the next two weeks, the ejection fraction in this animal returned to normal (60%) and troponin levels normalized. This study shows the longest survival of orthotopic cardiac xenografts to date, with recovery of impaired heart function after early ischemia reperfusion injury.
This significant improvement in ventricular function suggests that normal reparative processes can operate across the xenotransplantation barrier, supporting the clinical potential of cardiac xenotransplantation. Purpose: To determine whether the galactosyl transferase gene knock-out (GalT-KO) protects pig lungs from hyperacute lung rejection (HALR) by human blood. The effects of adding human decay accelerating factor (hDAF) to the GalT-KO background will also be examined for the first time. Methods: Heparinized fresh human blood was used to perfuse GalT-KO pig lungs (n=10). Lung function was assessed by flow, pulmonary vascular resistance (PVR), oxygen transfer, and tracheal edema using pre-defined survival endpoints. Historical wild-type (WT) pig lungs (n=15) provide context for analysis. Additionally, two pig lungs expressing GalT-KO/hDAF were recently examined, one of which received treatment with Xigris. Results: Median lung survival in GalT-KO lungs was 114 minutes (range 15 min to >240 min) vs. 10 minutes in WT lungs (range 4 min to 36 min; p=0.01); 2/2 GalT-KO/hDAF pig lungs survived >240 min. PVR at 5 minutes was 62±35 mmHg·min/L for GalT-KO lungs vs. 308±178 mmHg·min/L for WT lungs and 23 mmHg·min/L for each GalT-KO/hDAF lung. Complement activation (∆C3a) at 15 minutes was significantly lower with GalT-KO lungs (133±243 ng/mL) than with WT lungs (2080±1332 ng/mL, p=0.039). GalT-KO was also associated with reduced platelet activation: only 4±4% of circulating platelets expressed CD62P at 15 min vs. 16±13% in the WT group (p=0.05); ∆βTG at 15 min was 223±151 IU/mL (GalT-KO) vs. 1196±1186 IU/mL (WT), with less thrombin formation (∆F1+2) at 15 minutes (GalT-KO: 5±7.9 nM vs. WT: 22±39 nM). ∆C3a, % of circulating platelets expressing CD62P, ∆βTG, and ∆F1+2 measurements are in progress for GalT-KO/hDAF. Platelet sequestration was delayed, as the mean % of initial platelets remaining in the GalT-KO lung perfusate was 52±19% at 15 min vs. 27±9% in the WT lung group (p=0.02). Neutrophil sequestration was also diminished in GalT-KO (58±15% residual) vs. WT lungs (5±3% residual, p=0.004). Conclusion: GalT-KO pig lungs are significantly protected from several facets of HALR: complement and coagulation cascade activation, neutrophil sequestration, and platelet activation are all partially attenuated by this modification alone. Preliminary results suggest that GalT-KO/hDAF lungs offer even further protection. Disseminated Intravascular Coagulation Is Associated with Tissue Factor Expression on Recipient Platelets and Monocytes. Chih Che Lin,1,3,4 Mohamed Ezzelarab,1 Corin Torres,1 David Ayares,2 Anthony Dorling,3 David K. C. Cooper.1 1 Thomas E. Starzl Transplantation Institute, University of Pittsburgh, Pittsburgh, PA; 2 Revivicor Inc., Blacksburg, VA; 3 Department of Immunology, Imperial College London, Hammersmith Hospital, London, United Kingdom; 4 Department of Surgery, Chang Gung Memorial Hospital, Kaohsiung, Taiwan. Purpose: Acute humoral xenograft rejection (AHXR), frequently associated with disseminated intravascular coagulation (DIC), remains a challenge in pig-to-primate xenotransplantation (Tx). A previous in vitro study showed that recipient platelets and monocytes were induced to express tissue factor (TF) after incubation with porcine endothelial cells, and we speculated this may contribute to thrombosis.
The present study investigated whether circulating (extragraft) monocytes and platelets express TF, and whether this relates to the development of DIC after pig Tx. Methods: Baboons (n=7) received an aortic patch or kidney from either a wild-type (WT) or α1,3-galactosyltransferase gene-knockout (GTKO) pig, with or without immunosuppressive therapy. Baboon monocytes and platelets were isolated from blood before and after Tx. Surface TF phenotype was determined by flow cytometry. Functional TF activity was determined by clotting time assays after mixing monocytes and platelets with recalcified Factor VII (FVII)-deficient plasma, with or without added FVII. Results: Before Tx, monocytes and platelets did not express TF or display TF-dependent procoagulant activity. After GTKO aortic patch Tx, the baboons were euthanized by day 35 without developing DIC. However, by day 28, circulating platelets (but not monocytes) expressed TF and promoted TF-dependent clotting in recalcified plasma. In the absence of immunosuppression, WT kidneys survived <1 day before the onset of DIC, at which time TF activity was detected on both platelets and monocytes. After GTKO kidney Tx in immunosuppressed baboons, platelets expressed TF as early as day 3. In contrast, monocytes began to express TF only at the onset of DIC. Conclusions: This study links expression of TF on recipient monocytes and platelets with the development of DIC. Expression on platelets appeared to predict the subsequent development of DIC, whereas that on monocytes was associated with the onset of DIC. Whilst our observations do not establish a causal relationship, they provide the basis for further study. International Human Xenotransplantation Inventory. Antonino Sgroi,1 Leo Buhler,1 Megan Sykes,2 Luc Noel.3 1 Surgical Research Unit, Department of Surgery, Geneva University Hospital, Geneva, Switzerland; 2 Transplantation Biology Research Center, Massachusetts General Hospital, Boston, MA; 3 World Health Organization, Geneva, Switzerland. BACKGROUND: Xenotransplantation carries inherent risks of infectious disease transmission to the recipient, and even to society at large, and should only be carried out with tight regulation and oversight. A collaboration between the International Xenotransplantation Association, the University Hospital Geneva and the World Health Organization has established an international inventory (www.humanxenotransplant.org) aiming to collect basic data on all types of xenotransplantation practices on humans that are currently ongoing or have been recently performed. METHODS: We collected information using publications in scientific journals, presentations at international congresses, internet-based information, and declarations of IXA members. An electronic questionnaire is available on the website www.humanxenotransplant.org, which can be filled out and sent to the office in Geneva. RESULTS: We identified a total of 16 recent or current human applications of xenotransplantation; eight were currently ongoing and one will start soon. The source animal was: pig (n=7), sheep (n=3), calf (n=1), rabbit (n=2), blue shark (n=1), hamster (n=1), and unknown (n=1). All trials transplanted xenogeneic cells, i.e. islets of Langerhans (n=6), hepatocytes (n=1), kidney cells (n=1), chromaffin cells (n=1), embryonic stem cells (n=2), and fetal (n=4) and adult cells (n=1) of various organs. The treatments were performed in 10 different countries: 7 in Europe, 2 in Russia, 2 in Asia, 2 in Mexico, 1 in the USA and 1 in Africa.
Six countries had no national regulation on xenotransplantation. CONCLUSION: Several clinical applications of cell xenotransplantation are ongoing around the world, often without any clear governmental regulation. This information should be used to inform national health authorities, health care staff and the public, with the objective of encouraging good practices, with internationally harmonized guidelines and regulation of xenotransplantation. Cells from α1,3-Galactosyltransferase Gene-Knockout Pigs Additionally Transgenic for Human Membrane Cofactor Protein Demonstrate Resistance to Human Complement-Mediated Cytotoxicity. Hidetaka Hara,1 Cassandra Long,1 Mohamed Ezzelarab,1 Peter Yeh,1 Carol Phelps,2 David Ayares,2 David K. C. Cooper.1 1 Surgery, Thomas E. Starzl Transplantation Institute, University of Pittsburgh, Pittsburgh, PA; 2 Revivicor, Inc., Blacksburg, VA. Purpose: (1) To compare the antibody binding and complement-mediated cytotoxicity (CDC) of human sera to pig peripheral blood mononuclear cells (PBMC) and porcine aortic endothelial cells (PAEC) from (i) wild-type (WT), (ii) α1,3-galactosyltransferase gene-knockout (GTKO), (iii) human membrane cofactor protein transgenic (MCP) and (iv) GTKO/MCP pigs. (2) To investigate the effect on binding and CDC of human sera following activation of PAEC. Methods: Pooled human serum was tested by flow cytometry for binding of IgM and IgG (5% serum concentration) and CDC (25% serum concentration) to PBMC and PAEC from WT, GTKO, MCP, and GTKO/MCP pigs. PAEC from all 4 pig types were activated by IFN-γ and again tested for antibody binding and CDC. Results: There was higher binding of IgM and IgG to WT and MCP than to GTKO and GTKO/MCP PBMC and PAEC, but there were no differences in binding (i) between WT and MCP cells or (ii) between GTKO and GTKO/MCP cells. CDC of WT PBMC and PAEC was significantly greater than that of GTKO, MCP, and GTKO/MCP PBMC (WT 73%; GTKO 37%; MCP 30%; GTKO/MCP 4%) and PAEC (WT 51%; GTKO 18%; MCP 1%; GTKO/MCP 0%) (both p<0.05). Importantly, there was no lysis of GTKO/MCP PAEC. After activation of PAEC, although CDC was increased against WT and GTKO PAEC, MCP and GTKO/MCP PAEC demonstrated significant resistance to lysis (WT 73%; GTKO 41%; MCP 5%; GTKO/MCP 0%). Conclusions: (1) CDC of all 3 types of genetically engineered PBMC and PAEC was significantly lower than that of WT PBMC and PAEC, with lysis of GTKO/MCP PAEC being significantly lower than that of GTKO or MCP cells. (2) PAEC from both MCP and GTKO/MCP pigs showed resistance to lysis even after activation of the cells. (3) Organs from GTKO/MCP pigs should provide considerable protection against CDC. Role for CD47-SIRPα Signaling in Human T Cell Proliferation in Response to Stimulation with Porcine Antigen Presenting Cells. Hiroyuki Tahara,1 Hideki Ohdan,1 Kentaro Ide,1 Toshimasa Asahara.1 1 Department of Surgery, Hiroshima University, Hiroshima, Japan. We have previously shown that genetic induction of human CD47 on porcine cells provides inhibitory signaling to signal regulatory protein (SIRP)α on human macrophages; this provides a novel approach to preventing macrophage-mediated xenograft rejection. A recent report indicated that a similar CD47-SIRP system negatively regulates the functions of both T cells and antigen presenting cells (APCs) in humans. We hypothesize that the interspecies incompatibility of CD47 may also contribute to T cell-mediated xenograft rejection.
We have analyzed the frequency and proliferative activity of human T cells responding to either porcine or allogeneic APCs using an in vitro mixed lymphocyte reaction (MLR) assay with a CFSE-labeling technique. Irradiated stimulator porcine or human peripheral blood mononuclear cells (PBMCs) were cultured with CFSE-labeled responder human PBMCs. By FCM analysis, the number of dividing precursors was extrapolated from the number of daughter cells in each division and from the mitotic events, yielding precursor frequencies in the CD4+ and CD8+ T cell subsets (a worked sketch of this extrapolation follows below). Using these values, the stimulation index (SI) was calculated. The frequencies of alloreactive and xenoreactive CD4+ T cell precursors were almost identical, 4.5±0.8% and 4.4±0.9%, respectively (n=10-12 for each type of precursor). However, the SI of xenoreactive CD4+ T cells was significantly higher than that of alloreactive CD4+ T cells, indicating a stronger reaction by a single xenoreactive CD4+ T cell. In the presence of human CD47-Fc (containing the extracellular domain of human CD47 fused to the Fc portion of human Ig, 10 µg/ml), the SI of xenoreactive CD4+ T cells was significantly reduced to a level similar to that of alloreactive CD4+ T cells (n=12, P<0.05), although the frequencies of their precursors were unaffected. The frequencies of alloreactive and xenoreactive CD8+ T cell precursors were also similar, i.e. 3.5±0.3% and 3.9±0.9%, respectively (n=10-12). The SI of alloreactive and xenoreactive CD8+ T cells did not differ; however, that of xenoreactive CD8+ T cells was significantly suppressed in the presence of human CD47-Fc (n=12, P<0.05). These results suggest that the T cell responses to porcine cells are stronger than those to allogeneic cells because of the interspecies incompatibility of CD47. Moreover, genetic manipulation of porcine APCs to induce human CD47 expression might attenuate human T cell-mediated xenograft rejection. Reactivation of human cytomegalovirus (hCMV) is a potential risk following the clinical application of pig-to-human xenotransplantation. Since little is known about undesirable side-effects arising from hCMV infection of porcine organs, we investigated the capabilities of various hCMV strains to infect porcine endothelial cells (pEC) and the putative immunological consequences thereof. pEC from different anatomical origins were incubated with the hCMV laboratory strains AD169 and TB40/E or a clinical isolate at a multiplicity of infection (MOI) ranging from 0.1 to 100. Viral replication kinetics, evolution of cytopathology, and lytic end points were analyzed. Consequences of pEC infection for human NK cell activation were evaluated using a xenostimulation assay assessing NK cell IFNγ secretion and cytotoxicity. Infection was evident, and a maximum percentage of infected cells was reached, at an MOI of 10. hCMV replicated in all tested pEC types, with a fraction of infected cells ranging from 1% to 50%. AD169 infection of 2A2 (microvascular EC) resulted in cytopathic effect (CPE) development by 3 dpi and in lysis of 70% of the cells at 7 dpi. In contrast, no CPE was observed in PED (aortic EC) up to 15 dpi. Infection of 2A2 with TB40/E was non-lytic and resulted in accumulation of both extra- and intracellular virus. Virus titers reached a maximum at 5 dpi, peaking at levels 20-fold higher than residual input virus. We then determined whether infected pEC supported a complete replication cycle and produced viral progeny.
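The CFSE precursor-frequency extrapolation referenced in the MLR methods above rests on simple arithmetic: cells observed in division generation i descend from n_i / 2^i original precursors. The sketch below applies that rule to invented generation counts; the proliferation-index line shows one common stimulation-index convention, which is an assumption rather than the authors' exact formula.

# Minimal sketch of CFSE precursor back-extrapolation. Counts per division
# generation are invented; generation 0 = undivided cells.
import numpy as np

counts = np.array([5000.0, 900, 800, 700, 500, 300])  # cells in generations 0..5
gens = np.arange(len(counts))

precursors = counts / 2.0**gens            # each generation-i cell ~ 1/2^i precursor
divided = precursors[1:].sum()             # precursors that divided at least once
freq = divided / precursors.sum()          # responding precursor frequency
prolif_index = counts[1:].sum() / divided  # daughters per responding precursor

print(f"precursor frequency = {freq:.1%}, proliferation index = {prolif_index:.2f}")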
After 5 dpi, pEC supernatants and lysates revealed the production of significant amounts of virus. Preliminary results showed that coculture of human NK cells and infected pEC resulted in increased NK killing and IFNγ production. Altogether, these findings provide evidence that pEC are permissive and support the complete productive replication cycle of hCMV. However, CPE are cell-type and strain dependent. Moreover, pEC infection leads to modification of the xenogeneic cytotoxicity mediated by human NK cells. Our findings allow a better estimation of the potential role of hCMV cross-species infection following xenotransplantation and may be crucial to guide future clinical trials. Chronic Rejection/Injury: Innate and Adaptive Immunity I. Testing for Donor-Specificity of Antibodies Post-Transplantation Increases the Predictability of Chronic Rejection. Mikki Ozawa,1 Arundhati Panigrahi,2 Paul I. Terasaki,3 Narindar Mehra.2 1 One Lambda, Inc., Canoga Park, CA; 2 New Delhi, India; 3 Terasaki Foundation Laboratory, Los Angeles, CA. Purpose: Post-transplant HLA antibodies have been shown to be linked to chronic graft failure, yet some recipients do well in the presence of antibodies. We investigated whether testing for donor-specificity of antibodies improves the predictive value of antibodies in regards to chronic rejection. Methods: In a prospective study of 400 renal transplant recipients, post-transplant sera were tested for HLA and MICA antibodies, and positive sera were tested by single antigen beads to determine whether the antibodies were donor-specific. One year after testing, biopsy and clinical data on the graft status were collected. Patients who did not have follow-up or died with function were excluded from analysis. Results: Of the 350 patients with follow-up and ABDR typing information, 102 (29%) were found to have antibodies. Sixteen of those had donor-specific A, B, or DR antibodies (DSA), while 26 had non-donor-specific antibodies (NDSA). Sixty patients had only DP, DQ, Cw, or MICA antibodies and were grouped as "donor-specificity unknown", as donor typing for these loci was not available at the time of this report. Table 1 summarizes the antibody groups and transplant outcome one year later. Interestingly, the mean serum creatinine values at the time of antibody testing were very similar among the groups, regardless of the presence of antibody or its type (range 1.34 to 1.73 mg/dL). One year later, however, a striking 63% of DSA patients had either returned to hemodialysis, were regrafted, had died, or had biopsy-proven chronic rejection, compared to only 23% of NDSA patients and 13% of those who were negative (DSA vs. Neg, p=0.00009). MICA antibodies also showed a significant association with graft failure or CAN (p=0.03), and donor MICA typing is underway. Conclusion: In this prospective study of 350 renal patients, DSA detected post-transplantation were highly correlated with chronic rejection. Inclusion of specificity analysis in post-transplant antibody monitoring would significantly improve the predictability of chronic rejection. Donor CD4 T Cells within Solid Organ Transplants Contribute to Graft Rejection. Thet Su Win,1 Sylvia Rehakova,1 Margaret Negus,1 Kourosh Saeb-Parsy,1 Martin Goddard,2 Eleanor Bolton,1 Andrew Bradley,1 Gavin Pettigrew.1 1 University of Cambridge; 2 Papworth Hospital, United Kingdom.
This study examines the contribution of donor CD4 T cells within heart allografts to the development of graft vasculopathy through the provision of help to recipient B cells for generating effector autoantibody responses. B6 mice were transplanted with MHC class II-disparate bm12 hearts. The allo- and autoantibody responses were quantified, and their contribution to allograft vasculopathy was assessed by incorporating B cell-deficient mice as recipients. The role of donor CD4 T cells in providing help for autoantibody production was examined by removing CD4 T cells from donor hearts before transplantation by three approaches: treating with anti-CD4 mAb, administering 1300 cGy lethal irradiation, and using bm12 donors genetically deficient in T cells. The contribution of donor CD4 T cells to autoantibody development was further assessed by challenge with bm12 CD4 T cells two weeks prior to transplantation. Bm12 heart grafts developed progressive vasculopathy and were rejected slowly (MST=95 days, n=17). The contribution of antibody-mediated effector mechanisms was confirmed by histopathological evidence (fibrinoid necrosis and vascular proliferation), by complement C4d staining, and by a reduction in the severity of vasculopathy and long-term graft survival in B cell-deficient recipients. Surprisingly, no alloantibody was detected; recipients instead developed autoantibody. The autoantibody response was completely dependent upon the provision of help from donor CD4 T cells: it was abrogated by transplanting CD4 T cell-deficient hearts. To further confirm an effector role for autoantibody in vasculopathy development, B6 mice were primed for humoral autoimmunity, but not for alloimmunity, by injection of highly purified bm12 CD4 T cells prior to transplantation. Heart grafts were then rejected more rapidly (MST=29 days, n=4, p<0.0001) and demonstrated severe vasculopathy. Finally, BM-chimeric recipients lacking MHC class II expression only on B cells did not develop autoantibody, suggesting that cognate interaction between donor CD4 T cells and MHC class II on recipient B cells is crucial for post-transplant humoral autoimmunity. Our results demonstrate the novel finding that help for autoantibody production is provided by graft-versus-host cognate recognition of recipient B cell MHC class II by donor CD4 T cells. This autoantibody contributes to the development of vasculopathy independently of alloantibody.

… Allograft Rejection. Gang Chen, 1 Huiling Wu, 1 Jainlin Yin, 1 Josette M. Eris, 1 Steven J. Chadban. 1 1 Collaborative Transplantation Research Group/Renal Medicine, University of Sydney/RPAH, Sydney, NSW, Australia. Toll-like receptors (TLRs) are innate immune receptors that play an important role in innate immunity but also provide a link between innate and adaptive immunity. MyD88 is a key TLR signaling adaptor. A recent clinical study reported that activation of innate immunity through TLR4 in heart transplant recipients contributes to the development of chronic rejection after cardiac transplantation. In this study, we aimed to determine whether TLR4-MyD88 signaling is required for the development of chronic allograft damage, using TLR4-/- and MyD88-/- mice in a murine model of chronic cardiac allograft rejection. Methods: Cardiac transplants were performed: B6.C-H-2bm12 (B6.H-2bm12) hearts to WT, MyD88-/- and TLR4-/- mice (all C57BL/6, H-2b; a single MHC class II mismatch) as allografts (n=6-8/group), and C57BL/6 to C57BL/6 isografts (n=5).
Blood and tissue were harvested at day 49 after transplantation. Cell infiltration, fibrosis and vasculopathy were assessed histologically (grade 0-5). CD4+Foxp3+ cells in blood and spleen were measured by flow cytometry. Results: All hearts remained pulsatile until sacrifice. Histology of TLR4-/- and MyD88-/- recipients showed protection from leukocyte infiltration (1.2±1.1 and 1.0±1.1 vs. 3.2±1.4, p<0.01), fibrosis (0.75±1.0 and 0.6±1.2 vs. 2.6±2.0, p<0.05) and vasculopathy (0.28±0.6 and 0.04±0.1 vs. 1.9±2.2) at day 49 post-transplantation compared with WT recipients. By FACS, the ratio of CD4+FoxP3+ regulatory T (Treg) cells to total CD4+ cells in spleen at day 49 post-transplantation was significantly increased in TLR4-/- and MyD88-/- recipients versus WT allograft and isograft mice (18.1±0.2 and 19±0.9% vs. 12.9±1.1 and 11.8±0.2%, P<0.01). Conclusion: Absence of TLR4 and MyD88 signaling reduces chronic allograft damage in a murine model of chronic cardiac rejection. One mechanism of protection may be enhancement of regulatory T cell function. Promotion of Treg generation via the TLR4/MyD88 pathway may be one important consequence of cross-talk between innate and adaptive immunity.

… Pathway: Implication in Transplant Arteriosclerosis. Thibaut Quillard, 1 Stéphanie Coupel, 1 Flora Coulon, 1 Juliette Fitau, 1 Maria-Cristina Cuturi, 1 Elise Chiffoleau, 1 Béatrice Charreau. 1 1 INSERM U643, ITERT, Nantes, France. Endothelial cell (EC) activation, injury and apoptosis are key events associated with transplant arteriosclerosis (TA) and chronic allograft rejection (CR). Notch is a major signalling pathway controlling vascular function and injury. Nonetheless, the involvement of Notch, at the endothelial level, in TA initiation and progression remains unknown. Using a fully MHC-mismatched rat cardiac allograft model of CR, we found that TA at day 100 correlates with a strong decrease in both expression and activity of the Notch pathway in transplants as compared with tolerant and syngeneic controls. The present study investigates the contribution of Notch activity to EC survival. For this purpose, a recombinant adenovirus encoding the Notch2 intracellular domain (NICD) and the reporter GFP cDNAs was constructed and used to maintain Notch activity in cultured primary human arterial EC. Our results demonstrate that NICD protects EC from cell death induced by TNFα in the presence of CHX, or by anoikis, as measured by Annexin V and DNA content staining. NICD mediates cytoprotection through the regulation of key apoptosis molecules. Using an apoptosis-dedicated qPCR array, we found that 14 of a panel of 88 apoptosis-related genes were significantly regulated. NICD upregulated Bcl2, Bcl2A1 and Bcl-xl (2.5-, 8.2- and 2.8-fold compared with noninfected cells, respectively). In addition, the pro-apoptotic genes Bim, Drp1, Bok and CD40 were markedly downregulated in transduced cells (6.7-, 11.4-, 3.9- and 14.5-fold decrease, respectively). Western blotting confirmed the induction of protective molecules (Bcl2 and Bcl-xl) and the inhibition of pro-apoptotic proteins (Bad, Bak and Bax). Consistent with these data, NICD induced phosphorylation of Akt as well as a reduction of PTEN expression, suggesting that the cytoprotective activity of NICD may be mediated by recruitment of the PI3K survival pathway.
To conclude, our findings indicate that TA correlates with decreased Notch signaling in the transplant, and that activation of the Notch pathway in vascular EC prevents apoptosis by promoting protective gene expression and survival pathways. These data suggest that controlling Notch activity may prevent the EC dysfunction associated with TA and CR.

Donor Age Intensifies the Early Immune Response. Christian Denecke, 1 Xupeng Ge, 1 Irene Kim, 1 Daman Bedi, 1 Anne Weiland, 1 Anke Jurisch, 1 Steven McGuire, 1 Johann Pratschke, 2 Stefan G. Tullius. 1 1 Div. of Transplant Surg, BWH, Transplant Surg Res Lab, Boston; 2 Dept. of Surgery, Charite Berlin, Germany. Increasing numbers of organs from elderly donors are currently utilized for transplantation. Advanced donor age may be associated not only with physiological impairments but also with a modified immune response in the recipient. We hypothesized a more potent early immune response following transplantation of elderly donor organs, and we analyzed the immune response in a mouse heart transplant model. Young B6 mice received heart allografts from 3-, 12- and 18-month-old bm12 donors. The recipients' immune response and intragraft changes were analyzed. Elderly, non-manipulated hearts contained significantly elevated overall frequencies of CD4+ and CD8+ T cells and DCs (CD11c+) (18 months vs. 3 months: CD4+: 2.77% vs. 1.35%; CD8+: 3.90% vs. 1.71%; CD11c+: 46.1% vs. 11.8%; p<0.05). Following engraftment of 18-month-old heart grafts, numbers of activated DCs (CD11c+I-Ab+ and CD11c+CD40+) increased significantly in recipient spleens (day 14: p<0.05). In parallel, frequencies of effector/memory-phenotype T cells (CD4+CD44highCD62Llow and CD8+CD44highCD62Llow) were significantly elevated with increasing donor age (3 vs. 12 vs. 18 months: CD8+CD44highCD62Llow: 6.7% vs. 8.4% vs. 12.2%, respectively, p<0.01). In addition, Tregs (CD4+CD25+FoxP3+) were also elevated (3 vs. 12 vs. 18 months: 2.9% vs. 9.6% vs. 11%, p<0.05). T cell alloreactivity, as measured by IFNγ production, increased with donor age (3 vs. 12 vs. 18 months: 24.4±1.5 vs. 74.4±21.9 vs. 73.2±7.8 IFNγ-producing spots per 5×10^6 cells, p<0.05). Mixed lymphocyte reaction (MLR) at day 14 revealed a gradual increase in splenocyte proliferation with advancing donor age (p=0.0002), indicating an enhanced immunogenicity of older organs. Immunohistochemical staining confirmed augmented CD4+ and CD8+ T cell infiltrates and intense Ki67 positivity of graft-infiltrating cells (GICs) in 18-month-old heart grafts, emphasizing an intensified immune response towards organs from old donors. In summary, old native heart transplants contain an overall higher number of passenger leukocytes, contributing to the increased immunogenicity of these organs. After transplantation, more potent DC and T cell activation and intragraft T cell infiltration were observed with elderly organs.

Nox and Chronic Kidney Allograft Interstitial Fibrosis. Shannon Reese, 1 Madhu Adulla, 1 Surmeet Bedi, 1 Jose Torrealba, 1 Deb Hullett, 1 Arjang Djamali. 1 1 Nephrology, UW Madison, Madison, WI. To determine the role of oxidative stress (OS) in kidney allograft fibrosis, we are developing strategies to decrease the generation of reactive oxygen species (ROS) rather than using ROS scavengers, the standard approach to antioxidant therapy thus far.
We therefore began to examine the expression of NADPH oxidase (Nox) subunits (Nox2, Nox4, p22phox and p47phox) and their distribution in human and rat kidney allografts with chronic interstitial fibrosis and tubular atrophy, not otherwise specified (IF/TA-NOS). Using double-staining immunofluorescence in human allografts (n=16), we showed that Nox2 was present in injured tubules costained with αSMA (Figure 1.1). Similarly, interstitial macrophages (CD68+Nox2+) and myofibroblasts (αSMA+Nox2+), but not CD3+ T cells or S100A4+ fibroblasts, expressed high Nox2 levels, suggesting that Nox is involved in the pathogenesis of allograft fibrosis via epithelial-to-mesenchymal transition (EMT) and macrophage and myofibroblast activation. We then examined the coexpression of Nox2, Nox4 and p22phox in normal kidneys and in transplant kidneys with IF/TA-NOS and showed that these molecules were upregulated in the latter group and that Nox2 and Nox4 were associated with p22phox expression. These results were confirmed in the Fisher-to-Lewis rat kidney transplant model (Figure 1.2). Immunoblot analyses at 3 weeks and 6 months showed that Nox4 and Nox2 levels were increased in allogeneic (n=7) compared with syngeneic transplants (n=3). Greater Nox levels were …

Composite tissue allotransplantation (CTA) is a recently introduced option for limb replacement and reconstruction of tissue defects. Like other allografts, a CTA graft can undergo immune-mediated rejection, and standardized criteria are required for characterizing and reporting the severity and types of rejection. This manuscript documents the conclusions of a symposium on CTA rejection held at the Ninth Banff Conference on Allograft Pathology in La Coruna, Spain on June 26, 2007, and proposes a working classification scheme, the Banff CTA-07, for the categorization of CTA rejection. This classification was derived from public international consensus discussions of all published scoring systems for CTA rejection. Given the currently limited clinical experience in CTA, a formal histological classification was established for acute skin rejection, with the understanding that classifications for other types of rejection involving other tissues will be developed with periodic review of this emerging field. It was agreed that the defining features for diagnosing acute skin rejection would include inflammatory cell infiltration with epidermal and/or adnexal structure involvement, epithelial apoptosis, dyskeratosis, and necrosis, and that severity of rejection will be graded in five categories. This classification refines previously proposed schemas, represents international consensus on this topic, and establishes a working collective classification system for CTA reporting.

The Role of Skin Biopsies in Diagnosing Clinical Rejection in Hand Composite Tissue Allotransplantation (CTA). Christina L. Kaufman, 1, 4 Ruben N. Gonzales, 1 Kadiyala V. Ravindra, 3, 4 Brenda W. Blair, 1 Joseph F. Buell, 3, 4 Warren C. Breidenbach. 1, 2, 4 1 Christine M. Kleinert Institute, Louisville, KY; 2 Kleinert Kutz and Associates, Louisville, KY; 3 Jewish Hospital Transplant Center, Louisville, KY; 4 Surgery, University of Louisville, Louisville, KY. Aim: Traditionally, allograft biopsy has dictated the treatment of allograft rejection. We hypothesize that the histological grade of the biopsy may overdiagnose rejection in CTA. Hand CTA is unique in that the organ is external and early signs of rejection can be viewed directly; skin is the first tissue in the CTA graft to show rejection. Methods: For this study, rejection was defined by the requirement for treatment.
These episodes were compared with histological grade and hand appearance. We reviewed 127 skin biopsies from three patients taken during 9, 7 and 1 years of follow-up, respectively. Results: Three observations surfaced. First, a rejection grading scale for CTA is still evolving: a review of the literature showed at least four different criteria in use, and the criteria used to grade biopsies at our center changed over the 9-year follow-up. A bias towards calling higher grades of rejection, and towards more aggressive treatment, occurred in the first two patients; a re-read of random biopsies taken from these patients showed a downgrade of rejection to grade II or I in five cases. Second, concomitant CMV infection in the last patient prevented aggressive treatment of rejection. Despite high-grade histology, the swelling, rash and redness responded to topical immunosuppression, and systemic treatment was not necessary. Analysis indicated that swelling and/or rash and the percentage of skin involvement were more informative markers of existing (or resolving) alloreactivity than was histology; biopsies taken from one such graft showed grade III histology. Third, changes in hand function were not associated with changes in biopsy grade. Conclusion: The histologic grade of skin biopsies from hand allografts appears to overestimate rejection. Alterations in immunosuppression should be based on the appearance of the allograft and the clinical course as much as on the histologic grade of the biopsy.

Expression of Molecular Mechanisms of Lymphocyte Trafficking Correlates Closely with Skin Rejection in Human Hand Transplantation. Theresa Hautz, 1 Bettina Zelger, 2 Gerald Brandacher, 1 Hans G. Mueller, 3 Andrew W. P. Lee, 4 Raimund Margreiter, 1 Stefan Schneeberger. 1 1 Dept. of General and Transplant Surgery, Innsbruck Medical University, Austria; 2 Dept. of Pathology, Innsbruck Medical University, Austria; 3 Dept. of Dermatology, Innsbruck Medical University, Austria; 4 Div. of Plastic Surgery, University of Pittsburgh. Introduction: To understand in greater depth the molecular mechanisms involved in skin rejection in hand transplantation, we investigated 8 key molecular markers of lymphocyte trafficking, cellular rejection and antibody-mediated rejection in human hand transplantation. Methods: A total of 130 skin biopsies taken from three bilateral hand transplant recipients were assessed by H&E histology (graded as per a previously published classification, 0-4b) as well as by immunohistochemistry using antibodies against the following markers: Lymphocyte Function-Associated Antigen (LFA)-1 = CD11a, Intercellular Adhesion Molecule (ICAM)-1 = CD54, Selectin E = CD62E, Selectin P = CD62P, VE-Cadherin = CD144, Human Leukocyte Antigen (HLA) II (DP, DQ, DR), Psoriasin = S100A7, and C4d. Levels of expression were assessed (0, +, ++, +++) and interpreted in light of rejection grade and time after transplantation. Results: Rejection ranged between grade 0 and 4a with an average score of 0.9. In healthy skin, none of the markers investigated was consistently up-regulated. Upon rejection, CD54, CD62E and CD62P staining in endothelial cells was significantly increased, and expression of CD62E and CD62P correlated well with severity of rejection. The majority of infiltrating lymphocytes stained positive for CD11a. Interestingly, keratinocytes were also highly positive for CD11a at the onset of rejection. CD144 was detected on endothelial cells, but its occurrence did not correlate with rejection.
Psoriasin expression was observed in keratinocytes in a basal and focal pattern and correlated well with rejection. For C4d, no consistent staining pattern was observed, indicating that antibody-mediated rejection did not play a role in these patients. Conclusion: Molecular markers involved in lymphocyte trafficking are up-regulated upon skin rejection after hand transplantation and represent promising targets for prophylaxis and treatment of rejection in composite tissue allotransplantation.

Investigation of the Immunomodulatory Phenotype of Infiltrating Lymphocytes in Skin Rejection of Human Hand Allografts. Theresa Hautz, 1 Gerald Brandacher, 1 Bettina Zelger, 2 Hans G. Mueller, 3 Andrew W. P. Lee, 4 Raimund Margreiter, 1 Stefan Schneeberger. 1 1 Dept. of General and Transplant Surgery, Innsbruck Medical University, Innsbruck, Austria; 2 Dept. of Pathology, Innsbruck Medical University, Austria; 3 Dept. of Dermatology, Innsbruck Medical University, Austria; 4 Div. of Plastic Surgery, University of Pittsburgh. Introduction: Skin rejection has complicated the postoperative course in human hand transplantation. To better define the characteristics of the lymphocytic infiltrate, human hand transplant biopsies were investigated for expression of FoxP3 and indoleamine 2,3-dioxygenase (IDO), a key regulatory enzyme in the induction of T lymphocyte unresponsiveness. Methods: A total of 104 skin biopsies taken from three bilateral hand transplant recipients during the first 6 years after transplantation were assessed by H&E histology (graded as per a previously published classification, 0-4b) as well as by immunohistochemistry for IDO and FoxP3. Levels of expression were assessed (0, +, ++, +++) and interpreted in light of the clinical course, time after transplantation, severity of rejection, and markers of lymphocyte migration. Results: Overall, rejection ranged between grade 0 and 4a with an average score of 0.94. IDO was constitutively expressed in the endothelium, independent of rejection. IDO expression in the cellular infiltrate was significantly increased upon rejection and correlated well with its severity (rejection grade 1: 0.91±0.85; rejection grade 3: 2.30±0.82; p<0.001). FoxP3-positive T cells were found mainly in severe rejection (rejection grade 1: 0.14±0.35; rejection grade 3: 0.75±0.75; p=0.019). IDO expression correlated well with FoxP3 expression, although the overall staining intensity for FoxP3 was lower. A strong trend towards higher expression of both IDO and FoxP3 at later time-points after transplantation was observed (year 1: FoxP3 0.07±0.26, IDO 0.75±0.92 [n=68]; year 3: FoxP3 0.45±0.74; year 5: FoxP3 0.25±0.45). Expression of IDO correlated closely with expression of E-Selectin, P-Selectin, ICAM-1 and LFA-1. Conclusion: The characteristics of the cellular infiltrate indicate a strong tendency towards self-limitation of the alloimmune response against the skin, both with time after transplantation and with severity of rejection, in human hand transplantation. Further studies are warranted to clarify the clinical relevance of these findings.

Prospective Analysis of the Immunologic Profile of a Hand Transplant Recipient in the First Year. Kadiyala V. Ravindra, 1 Warren C. Breidenbach, 2 Joseph Buell, 1 Suzanne T. Ildstad. 3 1 Department of Surgery, University of Louisville, Louisville, KY; 2 Kleinert, Kutz, and Associates, Louisville, KY; 3 Institute for Cellular Therapeutics, University of Louisville, Louisville, KY.
Introduction: A major obstacle to wider application of hand transplantation is the long-term complications associated with immunosuppression, and minimization of immunosuppression is an important goal in all transplant recipients. Currently there are no accurate tools to evaluate immunological responsiveness that might help tailor the level of immunosuppression for an individual patient. The response of recipient lymphocytes to PHA, Candida, and alloantigen may represent a laboratory tool towards this end: it has been reported that these 3 responses are hierarchical, with the response to alloantigen the first to be lost, followed by Candida and finally PHA. A 54-year-old male received a proximal forearm transplant in November 2006. Immunosuppression included induction with a single 30 mg dose of alemtuzumab and maintenance with tacrolimus and mycophenolate mofetil. The patient developed an episode of cytomegalovirus infection followed by acute rejection after reduction of his immunosuppression during the 3rd post-operative month; these were successfully treated with ganciclovir and with topical tacrolimus and steroids, respectively. Blood samples were drawn at selected time points and subjected to phenotyping of lymphocyte subsets and immune monitoring for circulating peripheral blood regulatory T cells (Treg) and for proliferative responses to phytohemagglutinin (PHA), Candida, and alloantigen. Results: Alemtuzumab induction resulted in profound lymphopenia. At week 2 and 1 month, the response to PHA was intact (stimulation index 110 and 24, respectively), but responses to alloantigen and Candida were suppressed (SI<3). A similar immunologic profile persisted through 6 months. At 1 year, the PHA and Candida responses were robust (SI 69 and 38, respectively), but alloresponses had not returned. Current immunosuppression consists of tacrolimus (6-9 ng/ml) and mycophenolate mofetil (500 mg b.i.d.). There is no gross evidence of acute or chronic rejection. Conclusions: Induction with alemtuzumab alters the recovery of the immune response: recovery of the Candida response was delayed beyond 6 months, and of alloresponses beyond a year. In light of this, further reduction of immunosuppression may be contemplated in future hand transplants without undue risk of rejection.

Composite tissue allotransplantation has achieved significant clinical advances despite the lack of an adequate preclinical model in which to study technical and immunosuppressive strategies. We have developed a non-human primate model of facial composite tissue allografts (CTA). Unilateral lower hemi-facial CTA (bone, muscle, skin) transplants were performed between mismatched cynomolgus monkeys. Immunosuppression consisted of 28 days of continuous IV tacrolimus monotherapy followed by tapered daily IM doses. Six animals received prophylactic ganciclovir. All animals had serial transplant biopsies. Ten transplants have been performed, with one loss secondary to line infection on day 27 without evidence of rejection. Two CTAs showed evidence of chronic rejection (days 56 and 99), with development of alloantibody after 30 days. Five CTAs had prolonged survival (days 60, 86, 108, 150, 177) but developed PTLDs, resulting in experimental endpoints. All animals had clinically normal grafts, but 3 animals showed histological evidence of mild rejection that was not treated with any additional immunosuppressive therapy. PTLD tumors were analyzed using short tandem repeats (STR) to define donor or recipient origin.
STR analysis demonstrated donor origin of the PTLD tumors in 4 animals and recipient origin in 1. None of the PTLD animals had clinical evidence of rejection of skin, bone, or muscle. EBV was not detected in the serum of the 2 animals tested, and ganciclovir therapy had no effect on the development of tumors. Tacrolimus levels in PTLD animals were higher than in animals with rejected grafts (45 vs. 34 ng/mL; p=0.02). Two additional animals have healthy grafts (days 24+ and 52+) without evidence of rejection or PTLD and have been converted to rapamycin after day 28. We have thus developed a preclinical model of facial CTA transplantation that achieves prolonged graft survival with tacrolimus monotherapy. The high incidence of PTLD tumors of donor origin represents an outcome similar to bone marrow transplantation, in contrast to the rarity of donor-origin PTLD in solid organ transplantation. Our findings are a cautionary note regarding CTAs that include vascularized bone elements. Immunosuppressive protocol modifications have been made in an effort to decrease the incidence of these donor-derived PTLDs.

Heterotopic Heart Transplantation: The United States Experience. Jama Jahanyar, 1 Tarek A. Sibai, 1 Matthias Loebe, 1 Michael M. Koerner, 1 Guillermo Torre-Amione, 1 George P. Noon. 1 1 Dept. of Surgery, Baylor College of Medicine, Houston, TX. Heterotopic heart transplantation (HHT) is utilized in patients (pts) who do not qualify for standard orthotopic heart transplantation (OHT); specific indications include refractory pulmonary hypertension and donor-recipient size mismatch. The objective of this study was to analyze the UNOS database and compare outcomes of HHT with OHT. The UNOS database, comprising more than 58,000 pts who underwent thoracic organ transplantation in the U.S. between 1987 and 2007, was reviewed (based on OPTN data as of May 1, 2007). The primary endpoint of this study was overall survival and subgroup survival [pts with a transpulmonary gradient (TPG)>15, ischemic (ICM) and dilated cardiomyopathy (DCM)]. The secondary endpoint was assessment of pre-transplant criteria. Exclusion criteria were retransplantation and missing transplant dates. Of 41,379 pts who underwent OHT and 178 who underwent HHT, 32,361 and 111, respectively, were enrolled in this study. Survival after OHT is superior to that after HHT; this survival benefit, however, disappears in pts with a TPG>15. Overall, survival after HHT is superior to the reported survival of pts who undergo LVAD implantation as destination therapy (REMATCH trial: 1-year survival of 52%). Thus, in selected pts, especially those with elevated TPGs, HHT should be considered a viable option with overall good results.

Ventricular Assist Device as a Bridge to Heart Transplantation in Children: Can We Afford It? William T. Mahle, Glen Ianucci, Robert N. Vincent, Kirk R. Kanter. Sibley Heart Center Cardiology, Emory University School of Medicine, Atlanta, GA. Ventricular assist devices (VADs) allow children with severe heart failure to be bridged to successful heart transplantation (HT). VADs are being used with increasing frequency in the pediatric population, and newer devices allow even young infants to be supported. VAD implantation and maintenance, however, are quite expensive, and the cost-effectiveness of VAD use in adults has been questioned. To date, an economic analysis of VAD support in children has not been undertaken.
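The cost-utility metric applied in the analysis that follows is the standard discounted cost-per-QALY ratio. As a reference sketch only (the abstract reports the final figures but not the year-by-year inputs, so the incremental cost Δc_t and QALY weight Δu_t below are placeholders), with survival projected over T years relative to the no-VAD alternative and a 3% discount rate:

\[
\text{cost per QALY saved} \;=\; \frac{\sum_{t=0}^{T} \Delta c_t\,(1+r)^{-t}}{\sum_{t=0}^{T} \Delta u_t\,(1+r)^{-t}}, \qquad r = 0.03 .
\]

On this scale, the $123,232/QALY reported below sits above the commonly cited $100,000/QALY benchmark invoked in the conclusion.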
Methods: We used the Pediatric Health Information System (PHIS), an administrative database of the Child Health Corporation of America (a consortium of children's hospitals in North America), to determine the costs related to VAD use in children. Data on subjects <18 yrs of age from 2002-2007 were reviewed. Hospital charges were converted to costs based on hospital-specific cost-to-charge ratios. Projected survival for subjects who were successfully bridged to HT was derived from published data. Cost-utility was expressed as cost per quality-adjusted life-year (QALY) saved, in 2006 US dollars. All future costs and benefits were discounted at 3%. Results: The median age at implantation in the PHIS database was 11.2 years (range 2 days to 17 yrs). The mean hospital cost per patient was $522,489. Estimated survival to heart transplantation was 77%, and estimated successful explantation without transplantation was 4%. The calculated cost-utility for VAD as a bridge to transplantation was $123,232/QALY saved. If one assumes that the children who survived to VAD explantation would otherwise have had a high risk of hospital death (65%) without VAD support, the calculated cost-utility would be $117,235/QALY saved. Even if survival to transplantation exceeds 90%, VAD implantation does not achieve a favorable cost-utility ratio; VAD support in pediatric heart failure becomes cost-effective only if recovery and explantation can be achieved in over 22% of subjects. Conclusions: VAD support serves as an effective bridge to heart transplantation in children. However, the cost-utility of this strategy is above the generally accepted threshold for cost-effectiveness ($100,000/QALY). In a setting of limited healthcare resources, VAD as a bridge to transplantation may not be justified.

Purpose: Heart transplantation (tx) in patients with HLA sensitization presents challenges in organ allocation. Virtual crossmatch (vXM), in which recipient HLA antibodies identified by LabScreen PRA beads are compared with the prospective donor's HLA type, could increase the use of allografts from distant donors (DD). The accuracy of vXM and the outcomes of this approach in heart tx are not known. Methods and Materials: To increase time-efficiency at the time of organ allocation, crossmatch testing is frequently initiated on pre-set trays containing sera of multiple prospective sensitized recipients. We used results from these studies to determine the expected accuracy of vXM, and we assessed outcomes of an allocation algorithm implemented in 2001 for sensitized patients: conventional prospective crossmatch was done when allografts were procured from local donors (LD), while vXM was utilized when allografts were offered from DD. There were 257 direct T-cell AHG crossmatch tests done with sera of 12 potential allograft recipients who had preformed HLA antibodies of known class I HLA specificity. As shown in Table 1, the positive and negative predictive values (PPV, NPV) of vXM were 92% and 79%. Table 2 shows outcomes in 30 sensitized patients who were eligible for the vXM approach: 14 received allografts from LD with a negative prospective crossmatch, while 16 (53%) received allografts from DD with a negative vXM. Three DD patients had a positive retrospective crossmatch; the NPV of vXM was 81% in this cohort. vXM has high negative and positive predictive values, accurately predicting the results of standard direct crossmatch in most patients.
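For reference, the predictive values quoted here are the standard 2×2 diagnostic quantities, with the direct T-cell AHG crossmatch serving as the reference standard (the underlying true/false-positive counts are not reported in the abstract):

\[
\mathrm{PPV}=\frac{TP}{TP+FP}=0.92, \qquad \mathrm{NPV}=\frac{TN}{TN+FN}=0.79 .
\]

The 81% NPV in the distant-donor cohort is the same quantity computed on the 16 recipients transplanted with a negative vXM, of whom 13 had the prediction confirmed retrospectively: 13/16 ≈ 81%.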
vXM allows the use of allografts from DD and is likely to improve organ allocation for the disadvantaged group of sensitized patients.

Single Center Experience with the New Heart Allocation System Implemented by the United Network for Organ Sharing. Biljana Pavlovic-Surjancev, 1 Nilamkumar Patel, 1 Linda Dusek, 1 James Sinacore, 1 Jennifer Johnson, 1 Cassie Bessert, 1 Alain Heroux. 1 1 Heart Failure/Heart Transplant Program, Loyola University Medical Center, Maywood, IL. Purpose: In July 2006, the United Network for Organ Sharing (UNOS) implemented a new allocation system for adult heart transplant (tx) candidates with the following sequence: Local Status 1A → Local Status 1B → Zone A Status 1A → Zone A Status 1B → Local Status 2. The purpose of this study was to evaluate the impact of the new system on heart tx patients at a single center in Region 7. Methods: Patients transplanted during the year prior to the new system (7-11-05 to 7-11-06; Group 1, N=19) and the year following it (7-12-06 to 7-12-07; Group 2, N=26) were compared for UNOS status at the time of transplant, waiting time, ischemia time, length of hospital stay (LOS) before and after transplant, donor age, and procurement-team travel distance and cost. Results: The new system significantly decreased median waiting time but increased median ischemia time, without affecting short-term survival: 1 patient died in each group. The number of transplants increased 37% in Group 2, mostly due to an increased number of Status 1A patients supported by IABP, without change in the number of Status 1B and 2 patients. Median pre-transplant LOS increased 16-fold in Group 2 (p=0.072), and mean pre-transplant LOS increased by 5 days. In Group 2, thirteen donor hearts were procured in Zone A and thirteen locally, whereas in Group 1 all donor hearts were procured locally. Median procurement-team travel distance increased 15-fold and median travel cost 10-fold in Group 2 (p<0.05).

Clinical Outcomes Associated with Simultaneous Heart-Kidney Transplantation. Tariq Shah, 1, 3, 4, 5 Suphamai Bunnapradist, 2 Jagbir Gill, 2 Steven K. Takemoto. 1 The figure indicates that recipients of simultaneous heart-kidney (SHK) transplants (open symbols) had lower rates of rejection when compared with heart (square) or kidney (diamond) recipients. Survival rates were initially higher for kidney recipients, with rates for SHK similar to those for heart recipients. The hazard ratio (HR) for graft loss was higher for SHK recipients compared with kidney recipients, but lower compared with heart recipients. The lower hazard for SHK in heart allografts might be attributed to the risk associated with pre-transplant dialysis (HR=4.09 [3.81-4.39], P<0.001); approximately 10% (2,122) of cardiac transplant recipients initiated dialysis prior to transplantation. The differing rates of SHK rejection observed in the heart and kidney analytic files might be attributed to differences in the susceptibility or treatment of heart and kidney allografts, or in rejection monitoring or reporting. Conclusion: Retrospective examination of data provided to the OPTN indicates that simultaneous heart-kidney transplantation seems to be effective for cardiac transplant candidates who require dialysis.

Abstract# 123 Objective: KRP203, a novel structural analog of FTY720, has been documented to display 5-fold greater selectivity of agonism for the sphingosine-1-phosphate (S1P) type 1 receptor (S1P1) compared with the S1P3 receptor. Clinical trials have shown that FTY720 produces a dose-limiting toxicity, bradycardia, attributed to its effect on the S1P3 receptor.
We have tested the effect of KRP203 on the survival of islet allografts, either alone or combined with local delivery of CD4+CD25+Foxp3+ T regulatory (Treg) cells.

Previous studies using knockout mice have documented a key role for the integrin CD103 in promoting allograft rejection, consistent with a critical role for CD103-expressing cells in this process. However, a direct test of this hypothesis has proven problematic owing to the lack of mAbs that efficiently deplete CD103+ cells in wild-type hosts. To circumvent this problem, we conjugated the non-depleting anti-CD103 mAb M290 to the toxin saporin (SAP) to produce an immunotoxin (M290-SAP) that selectively depletes CD103-expressing cells in vivo. Treatment of naive mice with M290-SAP selectively depleted CD103+CD8+ cells and dramatically reduced the overall frequency of CD8+ lymphocytes in diverse compartments, including intestinal intraepithelial lymphocytes, spleen and mesenteric lymph node. M290-SAP also depleted CD103+ dendritic cells (CD11c+) and T regulatory cells (Tregs, CD4+CD25+) in the same compartments. In the thymus, M290-SAP depleted CD103-expressing cells in both the CD4-CD8- and CD4-CD8+ subpopulations, both of which express CD103, leading to a dramatic reduction in the number of thymocytes (77.08±5.74×10^6 vs. 2.71±0.82×10^6 in control vs. treated mice, respectively; P<0.0001). We next assessed the effect of M290-SAP in a fully allogeneic islet transplantation model (Balb/c→C57BL/6). M290-SAP produced long-term (LT) graft survival (>100 d, n=9, vs. a median survival time of 13 d in untreated mice, n=8). Unconjugated M290 and isotype control (IgG-SAP) did not significantly prolong islet allograft survival. Graft histology showed little, if any, lymphocyte infiltration surrounding islet transplants in LT mice; in contrast, intense lymphocyte infiltration with disruption of islet morphology was observed in untreated mice. Pretreatment of donor islets with M290-SAP did not significantly prolong allograft survival, indicating that the immunosuppressive effect of M290-SAP resides at the level of the host. Interestingly, we found a 3-4-fold increase in both the percentage and number of FoxP3+CD4+CD25+ cells in the spleen and draining lymph node of LT mice, though it remains to be determined whether such cells account for graft acceptance in M290-SAP-treated recipients. In summary, these data document that depletion of CD103-expressing cells promotes long-term islet allograft survival and point to a novel strategy for therapeutic intervention in islet allograft rejection.

Pancreatic … Objective: To engineer pancreatic islets in a rapid and efficient manner with a novel form of FasL protein chimeric with core streptavidin, and to test the efficacy of engineered islets for long-term survival in allogeneic hosts. Methods: BALB/c pancreatic islets were engineered by cell-surface modification with biotin followed by the display of a chimeric form of FasL consisting of the extracellular domain of FasL fused to core streptavidin (SA-FasL). SA-FasL-engineered islets were transplanted into streptozotocin-diabetic C57BL/6 mice under transient cover of rapamycin. Unmodified islets and islets engineered with streptavidin protein alone (SA) served as controls. Results: All islets showed effective engineering with SA-FasL, which persisted on the surface of islets for weeks in vitro as assessed by confocal microscopy.
All islets (n=23) engineered with SA-FasL survived over the observation period of 100-400 days without detectable signs of rejection. In marked contrast, all unmodified (n=9) and SA-engineered (n=14) islets underwent acute rejection within 40 days. The observed tolerance was localized to the engineered islets: a second set of unmodified islets transplanted under the contralateral kidney of long-term (>90 days) graft recipients was rejected with normal tempo (MST=17±9 days), without any effect on the survival of the primary islets. Conclusions: Engineering pancreatic islets with exogenous immunomodulatory molecules such as SA-FasL in a rapid (∼2 hrs) and efficient (100% of targeted islets) manner represents a novel means of immunomodulation with considerable therapeutic potential for the treatment of type 1 diabetes. Supported in part by NIH (R21 DK61333, R01 AI47864, R21 AI057903, R21 HL080108) and JDRF (1-2001-328).

The ability of embryonic stem (ES) cells to form cells and tissues from all three germ layers can be exploited for the generation of cells to treat disease. In particular, successful generation of hematopoietic cells from ES cells could provide safer and less immunogenic cells than bone marrow cells, which require severe host preconditioning when transplanted across MHC barriers. In the past, it has been difficult to derive hematopoietic cells from ES cells; it has now become clear that this was due to the lack of self-renewal properties in the newly developed progenitor cells. Here, we exploited the self-renewal properties of ectopically expressed HOXB4, a homeobox transcription factor, to generate hematopoietic progenitor cells (HPCs) that successfully induce high-level mixed chimerism and long-term engraftment in recipient mice. HOXB4-transduced 129SvJ ES cells (H2b) were allowed to form embryoid bodies; these were dissociated after 5 days and the cells treated with a cocktail of hematopoietic cytokines. By day 26, the ES cells had formed HPCs. These newly generated HPCs were CD45+, CD34+ and CD117+, but expressed MHC class I molecules poorly and class II not at all. The HPCs fully restored splenic architecture in Rag2-/-γc-/- immunodeficient mice, comparably to bone marrow. Additionally, newly generated HPC-derived T cells were able to mount a peptide-specific response to lymphocytic choriomeningitis virus (LCMV) and specifically secreted IL-2 and IFN-γ upon CD3 stimulation. Further, HPC-derived antigen-presenting cells (APCs) in chimeric mice efficiently presented viral antigen to wild-type (WT) T cells. In syngeneic recipient mice, HPCs engrafted and formed more robust T and B cell populations. The majority of the HPC-derived cells were, however, Gr-1+, suggesting a bias towards myeloid cells imparted by HOXB4. Interestingly, these cells successfully engrafted in allogeneic MRL (H2k) and Balb/c (H2d) recipients without the need for immunosuppression. This ability to form mixed chimerism across MHC barriers is likely a consequence of their lack of MHC, CD80 and CD86 expression. Our results demonstrate for the first time that leukocytes derived from ES cells ectopically expressing HOXB4 are immunologically functional and escape immunological rejection when transplanted across MHC barriers, allowing the induction of mixed chimerism.
Normalization … The definitive endoderm (DE) is derived from the anterior segment of the primitive streak, which corresponds to the early and mid-gastrula organizer during early embryonic development and from which many of the major visceral organs, including the liver, pancreas, lung, thyroid and intestines, are derived. ES cells were cultured in a serum-free medium containing Activin A and bFGF for 6-8 days; the differentiated cells developed into an epithelial monolayer yielding more than 70% CXCR4-expressing cells. CXCR4 has been reported as a cell-surface marker of the definitive endoderm. Molecularly, the differentiated ES cells expressed typical definitive endodermal genes, in particular FoxA2, Sox-17, GSC, and HNF4α. The CXCR4+ definitive endodermal cells were further purified by immunomagnetic bead separation to more than 99% purity in order to eliminate teratoma-forming cells. To study their engraftment and regenerative capacity, the newly differentiated cells were infused intravenously into mice with carbon tetrachloride-induced liver injury. Harvested livers from these animals showed large DE-derived cells positive for albumin, suggesting that they were de novo-generated hepatocytes; a second cell type expressed CK-19, suggesting that the engrafted cells also differentiated into cholangiocytes. We therefore transplanted these DE cells into factor VIII-null mice and asked whether they could correct factor VIII activity. Plasma factor VIII activity (Coamatic assay) fully normalized to that of wild-type mice and remained stable over 60 days. These data suggest that ES cell-derived DE progenitor cells can restore factor VIII activity in a hemophilia A mouse model, presumably through protein production in de novo-generated hepatocytes. Importantly, none of these animals developed teratomas. Thus, ES cell-derived cells can potentially be coaxed to form cellular transplants with curative capabilities.

Objective: We have established a novel approach, ProtEx, to rapidly and efficiently engineer primary cells, tissues, or organs to display exogenous proteins of interest on their surface for immunomodulation. This approach involves generation of chimeric proteins with core streptavidin, biotinylation of cells, and transient display of the chimeric proteins on the cell surface. In this study, we displayed a chimeric form of FasL (SA-FasL) on the surface of bone marrow cells and tested the efficacy of these cells in establishing mixed chimerism in allogeneic hosts under nonmyeloablative conditions. Methods: BALB/c bone marrow cells were engineered with SA-FasL, and 30 million of these cells were transplanted into C57BL/6 mice subjected to various doses of total body irradiation 2 days earlier. A short course of rapamycin was used to enhance the tolerogenic effect of SA-FasL. Bone marrow cell recipients were typed for multilineage chimerism at various times post-transplantation and tested for donor-specific tolerance using skin grafts. Results: All animals (n=8) treated with 300 cGy total body irradiation and transplanted with SA-FasL-engineered donor cells showed significant levels of chimerism (10-60%) on day 14 post-transplantation, which increased steadily over time and reached 40-90% by day 60. In marked contrast, none of the control animals (n=11) receiving bone marrow cells and a short course of rapamycin showed detectable chimerism.
Chimerism was multilineage and was associated with donor-specific tolerance, since chimeric animals accepted donor but rejected third-party skin grafts. Conclusions: Engineering bone marrow cells in a rapid (∼2 hrs) and efficient (100% of targeted cells) manner with exogenous proteins having immunoregulatory functions provides a new and effective means of immunomodulation to establish mixed allogeneic chimerism under nonmyeloablative conditioning, with significant potential in clinical bone marrow transplantation. Funded in part by NIH (R21 DK61333, R01 AI47864, R21 AI057903, R21 HL080108), JDRF (1-2001-328), ADA, and AHA Fellowship 0725348B.

Chronic allograft dysfunction is still a major clinical problem in organ transplantation. Morphologically, it is characterized by changes suggestive of an alloantibody-mediated mechanism, such as glomerulopathy and vasculopathy, or by non-specific changes such as interstitial fibrosis and tubular atrophy. Both alloantigen-dependent and -independent factors contribute to the pathogenesis of these changes. In this study we analysed allospecific T cells from the peripheral blood of kidney transplant patients, with or without chronic allograft dysfunction, under different immunosuppressive protocols. 107 renal allograft recipients of our renal transplant clinic were screened at least six months after transplantation. All patients were on a calcineurin inhibitor-based immunosuppressive protocol consisting of cyclosporine (CsA)/mycophenolate mofetil (MMF)/steroid, tacrolimus (Tac)/MMF/steroid, or CsA/steroid. Patients had to be mismatched for one or more of the five candidate HLA-DR antigens for which synthetic peptides were available (DR1, DR2, DR3, DR4, and DR7). Patients with biopsy-proven chronic allograft nephropathy (CAN) and an elevated serum creatinine level of ≥1.6 mg/dl were compared with patients with stable allograft function (serum creatinine <1.6 mg/dl). T cell lines were generated from the peripheral blood lymphocytes of renal transplant recipients against donor-derived HLA-DR peptides presented by self APC. T cell lines generated from patients with CAN produced significantly more IFN-γ, while those generated from stable patients produced IL-10, associated with a low proliferation index, in response to the donor-derived mismatched HLA-DR allopeptide in vitro. Moreover, significantly more CD4+CD25+Foxp3+ T cells were found in stable patients on Tac and MMF than in patients on CsA and MMF. Interestingly, higher gene expression of CD4, CD25, CTLA-4, Foxp3, and IL-10 was observed in those patients. Taken together, a Th1 (IFN-γ) alloimmune response is deleterious and promotes chronic graft damage, while a Th2 (IL-10) response seems to be associated with a lower incidence of chronic allograft nephropathy. Immunosuppression based on Tac and MMF seems to favour CD4+CD25+Foxp3+ T cells and to allow long-term engraftment with stable renal function.

Cyclosporin Induces Epithelial-to-Mesenchymal Transition in Renal Grafts. The expression of epithelial-to-mesenchymal transition (EMT) markers is a reliable predictor of progression towards interstitial fibrosis and tubular atrophy in renal grafts. In vitro experiments suggest that calcineurin inhibitors (CNI) can induce EMT of tubular epithelial cells. Although no in vivo evidence has yet been provided, this suggests that EMT could be involved in the pathogenesis of CNI-induced renal fibrosis.
We have previously reported the results of a prospective randomized trial comparing the elimination at month 3 of either cyclosporine (CsA, n=54) or mycophenolate (MMF, n=54) from a triple-drug regimen in 108 de novo renal transplant patients. All of them had 2 systematic graft biopsies, at months 3 and 12 post-engraftment. In the leftover material, we retrospectively detected in tubular cells, by immunohistochemistry, the expression of two validated markers of EMT: de novo expression of vimentin (Vim) and cytoplasmic translocation of β-catenin (Cat). We were able to measure the EMT score at both months 3 and 12 in a total of 68 patients (34 in each group). In the CsA group, the Vim and Cat scores had progressed between 3 and 12 months from 1…

Calcineurin inhibitors (CNI) are efficacious but nephrotoxic immunosuppressives. Arteriolar hyalinosis is one of the characteristic histological correlates of this toxicity; yet the time course of this lesion, its reversibility, its dose-dependency, and the discrimination between drug-related effects and diabetic or hypertensive vasculopathy remain unclear. The aim of this study was to evaluate the prevalence and time course of CNI-related vascular changes after renal transplantation (tx) in protocol biopsies (pBx) as well as indication biopsies (iBx), and to correlate these with CNI blood levels, blood pressure and diabetes. From 491 patients, a total of 1239 pBx, taken at 6 weeks (n=380), 3 months (n=420) and 6 months (n=439) after tx, and 360 iBx were classified according to Banff criteria. Assessment of CNI toxicity included isometric vacuolisation of tubules (ISOVAC), intimal arteriolar hyalinosis (IAH), nodular arteriolar hyalinosis (NAH), and vacuolisation of small-vessel smooth muscle cells (VSM). 96% of patients received either cyclosporine or tacrolimus. In pBx, ISOVAC was present in 15%, 17% and 15% of patients at 6 weeks, 3 and 6 months post-tx, respectively; IAH in 11%, 9% and 10%; NAH in 5%, 5% and 7%; and VSM in 19%, 19% and 15%. In late iBx (>2 years post-tx), the prevalence of the analyzed parameters, apart from ISOVAC (9%), was markedly higher: 42% for IAH, 22% for NAH and 38% for VSM. In pBx, VSM, IAH and NAH were associated with each other but were not significantly dependent on blood pressure, rejection episodes or diabetes. Trough levels of CNIs did not differ between patients with and without vascular hyalinosis. Of patients with VSM and IAH in late iBx, one third already had these lesions in earlier biopsies. Conclusion: The prevalence of presumed morphological signs of vascular CNI toxicity in pBx is low and constant and is apparently not associated with CNI blood levels, hypertension or diabetes. In contrast, in iBx taken later than two years after transplantation, the prevalence of vascular CNI-toxicity signs is much higher. This emphasises that CNI reduction protocols should be considered within the first six months after transplantation, when vascular changes are still marginal. Further elucidation of precursor lesions in pBx would help identify patients at risk for CNI-induced vascular changes.

Donor … Introduction: Achieving donor-specific tolerance has long been the goal of the transplant community. Success has been reported with donor stem cell transfusion in animal studies, and with prevention of chronic rejection in cardiac allograft recipients in the clinic. This report details the results of the initial phase of a study in humans.
Methods: A prospective phase I/II FDA-approved pilot protocol was initiated to evaluate the effects of donor graft facilitating cell (FC)/stem cell infusion in kidney transplant recipients. Conditioning was performed with 200 cGy of total body irradiation. Bone marrow, processed to remove GVHD-producing cells but retain CD8+/TCR- FC and stem cells, was infused 24 hours post-operatively. The stem cell dose was limited by the T cell dose: the starting dose was 1×10^5 T cells, increased in steps of 2×10^5 per patient. As the study spans a 7-year period, the immunosuppression changed: 3 patients received cyclosporine (CyA), MMF and prednisone; 3 were induced with basiliximab and maintained on CyA (2) or FK (1), Cellcept and prednisone; and 3 received alemtuzumab induction and maintenance with FK and MMF. All patients underwent tolerance testing and immunoprofiling studies. Results: Of the 9 patients, 6 received live-donor kidney transplants. One graft was lost to arterial thrombosis on day 2. Delayed graft function was seen in 3 patients; good long-term graft function was seen in the other 8. Acute rejection was noted in only 1 patient, and infectious complications (CMV in 1, Histoplasma in 1) occurred in 2 patients. Two patients died with functioning grafts: one at 4 years from lung cancer, and one at 6 years from complications of diabetes. Six patients are alive with functioning grafts at a mean follow-up of 4.3 years, with a mean creatinine of 1.3 mg/dl. No GVHD was detected in any patient. Macrochimerism was not detected in any patient at any point. Notably, although durable engraftment was not achieved, none of the patients became sensitized as a result, nor did they experience immunologic sequelae. Conclusions: In our patients there were no untoward sequelae related to either the conditioning regimen or the marrow infusion, and the incidence of acute rejection, even on long-term follow-up, has been low. We currently propose to reduce immunosuppression further in these patients.

A pilot study was performed to evaluate whether immune cell depletion with alemtuzumab would permit post-transplant weaning of maintenance immunosuppression in well-matched renal transplant recipients. Patients received alemtuzumab 30 mg intravenously on the day of transplant and on the subsequent 2 days, while sirolimus and tacrolimus were started on day 1. Tacrolimus was discontinued at day 60 in all patients. Extensive immune monitoring was performed at 1 year. At current follow-up (22 to 34 months), all patients are alive with a functioning graft (median MDRD GFR=46 ml/min). One patient experienced clinical and biopsy-proven rejection at 9 months; all other patients remain on sirolimus monotherapy. Four patients have been weaned to 1 mg of sirolimus daily as their sole immunosuppressive agent, with resulting blood levels of 3-4 ng/ml. These 4 patients have no evidence of donor-specific alloantibody, are unresponsive or hyporesponsive to donor cells by the cytokine kinetics test, and show a regulatory phenotype to soluble donor antigens by trans-vivo DTH. Flow cytometry of peripheral blood demonstrated increased Foxp3 expression in the CD3+CD4+ population (p=0.002). Naive B cells (CD19+/CD27-) increased in 9 of 10 patients (p=0.02), and memory B cells increased in all 10 recipients (p=0.0005), comparing pre-transplant to 1-year timepoints.
Other than in the patient with rejection, 12-month protocol biopsies did not show any evidence of rejection, although 2 of 9 showed focal C4d positivity and 1 showed diffuse positivity; these 3 patients also had evidence of alloantibody by Luminex xMAP testing. In conclusion, the cytokine kinetics test, alloantibody testing, and trans-vivo DTH assay results correlated with the clinical evolution of patients who successfully weaned both tacrolimus and sirolimus without rejection or alloantibody. The flow cytometry findings described occurred regardless of clinical evolution and may represent alterations of the immune system inherent to the treatment protocol, independent of individual patient responses to the graft. However, the functional cytokine kinetics and trans-vivo DTH assays may prove useful correlates of the clinical immune status of the kidney transplant recipient.

We have previously reported the short-term results of alemtuzumab (Campath-1H) pre-conditioning with tacrolimus monotherapy and subsequent spaced weaning in living donor kidney transplantation (LDKT). We report here our 5-year experience. Methods: We performed 411 consecutive unselected LDKT (donor kidneys were removed laparoscopically) from 12/11/2002 to 11/26/2007 using 30 mg (0.5 mg/kg) alemtuzumab and tacrolimus monotherapy. At 6 months post-transplant, and at 2- to 6-month intervals thereafter, we used clinical data (including ELISA antibody titers, the Cylex T-cell activation assay, and identification of donor-specific antibodies) to wean tacrolimus when possible (bid → qd → qod → tiw → biw → qwk). The recipients included 5 HIV+ patients, 35 pediatric recipients, and 60 re-transplants. The mean follow-up was 843.1±478.7 days. Results: Actuarial recipient survival at 1, 2, and 3 years was 98.4%, 95.6%, and 92.7%, respectively. Graft survival at 1, 2, and 3 years was 97.6%, 90.4%, and 85.4%, respectively. The mean creatinine (mg/dL) at 1, 2, and 3 years was 1.46±0.62, 1.56±1.08, and 1.54±0.89, respectively. The mean GFR (mL/min/1.73 m^2) at 1, 2, and 3 years was 73.0±28.5, 71.8±28.8, and 68.9±28.6, respectively. The cumulative incidence of acute cellular rejection (ACR) at 6, 12, 18, 24, 30, 36, 42, and >42 …

Conclusions: In the current era, low-risk patients infrequently have AR and have excellent short- and long-term graft survival without the use of depleting antibodies. Given the increased costs of these drugs, the indications for using depleting antibodies in low-risk KTRs receiving SCD kidneys should be further clarified.

Kidney transplantation prolongs survival in hepatitis C virus-positive (HCV+) patients with end-stage renal disease (ESRD). However, the effects of induction therapy and chronic immunosuppression on the course of HCV infection and the potential for cirrhosis in renal transplant (RTx) recipients are unknown. We retrospectively assessed parameters of liver function and Child-Pugh (CP) and MELD scores in HCV+ ESRD patients who received induction therapy with T cell depletion (Group 1: Thymoglobulin, n=30) or an IL-2 receptor inhibitor (Group 2: Basiliximab, n=46). Pre-RTx liver biopsies were similar in Groups 1 and 2. Patients were followed for a mean of 826 days (range 47 to 2679 days) following RTx and received tacrolimus, mycophenolate mofetil, and in some cases steroids post-transplant. Overall graft survival was 84% in Group 1 and 85% in Group 2 (p>0.05). Data were analyzed pre-RTx, at 30 and 365 days, and at the time of last follow-up.
Serum AST, ALT, platelets, INR, albumin and bilirubin did not change following RTx in either group. CP scores in Group 1 were not significantly changed after RTx (5.5 ± 0.9 to 5.7 ± 0.9 at last follow-up, p=0.53). Group 1 patients on steroid-free protocols (n=8) demonstrated declining CP scores, from 6.0 ± 1.0 to 5.3 ± 0.5 at last follow-up, that were not statistically significant (p=0.31). Group 1 patients on steroids showed the opposite trend in CP score: 5.3 ± 0.6 pre-RTx vs. 5.8 ± 1.1 at last follow-up, which similarly did not reach statistical significance (p=0.18). Group 2 CP scores declined from 5.8 ± 0.8 to 5.4 ± 0.9 at last follow-up (p=0.04). There was no difference between CP or MELD scores at any point between the groups. As expected, MELD scores improved significantly following RTx and remained low up until the final visit (p=0.001); this was attributed to the drop in serum creatinine post-RTx. Neither Thymoglobulin nor Basiliximab use resulted in resurgence of acute hepatitis or the development of cirrhosis post-transplant. We have not identified any association of choice of induction agent or maintenance immunosuppression regimen, including steroid withdrawal, with impaired hepatic function or progression to liver cirrhosis in HCV+ RTx patients. T-cell depletion was well tolerated by HCV+ RTx patients and resulted in good graft outcomes. Interestingly, CP scores declined after renal transplantation in the Basiliximab induction group.

Kidney: Complications I

Background: A few years ago we observed an expansion of blood γδ T cells following cytomegalovirus (CMV) infection in kidney transplant recipients (KTR). We recently demonstrated that these cells share a strong reactivity against CMV-infected cells and tumor epithelial cells in vitro. An implication of γδ T cells in immune surveillance against cancer has been demonstrated in mice and strongly suggested in humans. We tested here the hypothesis of a protective role of CMV-induced γδ T cells against neoplasia in KTR through: (1) a longitudinal case-control study (KTR with cancer vs. KTR without cancer) in which γδ T-cell percentages were determined before and after cancer diagnosis (n=63); and (2) a retrospective follow-up of 131 KTR for 8.23 years looking for risk factors for malignancy. Results: The median γδ T-cell percentage in patients with malignancies was significantly lower than in control patients at 18, 12 and 6 months before the diagnosis of the cancer (p<0.005). Using a conditional logistic model, we determined that patients with a γδ T-cell percentage above 4% were protected from cancer (p<0.008). A significant association between an increase in the Vδ2-negative γδ T-cell subset and lower cancer occurrence was retrieved only in the KTR who experienced pre- or post-graft CMV infection. Finally, using univariate and multivariable analyses, absence of pre- or post-graft CMV infection in KTR was associated with a 5.69-fold higher risk of cancer (p=0.006). This study reveals an unexpected protective role of CMV against cancer in KTR, most probably via the expansion of γδ T cells cross-reactive against CMV-infected and tumor cells.

Background: Viral infection (VI) is a morbidity factor in transplant recipients (Tx pts). Induction therapy (Ind-Rx) is a known risk factor for VI. Although CAM is thought to be a more potent Ind-Rx than ZEN, we have previously shown similar CMV infection rates with each.
We have also shown that CMV-specific T cells (CMV-Tc) analyzed by cytokine flow cytometry (CFC) are consistently detectable in CMV-sero(+), but not sero(−), individuals, and that absence of CMV-Tc was associated with persistence of CMV infection in Tx pts. Here, we report on the effect of Ind-Rx on CMV-Tc in kidney Tx pts. Methods: 58 pre-Tx samples from 39 CMV-sero(+) pts and 62 post-Tx samples from 44 pts were submitted for CMV Tc-CFC. Whole blood was incubated with a pooled overlapping peptide mixture consisting of 138 peptides from CMV pp65, plus Brefeldin A, at 37°C for 6 hours and at room temperature overnight. IFNγ+CD8+ cells were enumerated by CFC and results were expressed as the percentage of IFNγ+CD8+ cells; results >0.2% were considered positive. …

Infection-associated graft loss during the entire study period is shown in Figure 1. Infections contributing to renal allograft loss increased significantly from 1990 to 2006. This may be due to increased use of both induction agents and potent maintenance regimens. This is an important cause of poor long-term graft outcome despite decreasing rejection rates, and a balance has to be maintained between prevention of rejection and avoidance of infection.

Serum creatinine (Scr) at procurement was 101 ± 51 µmol/L. The incidence of donor hypertension, diabetes, and death of cerebrovascular origin was 31%, 15%, and 55%, respectively. Multivariate analysis showed that the only clinical parameters associated with a low eGFR were donor Scr and donor hypertension. Nyberg and Pessione scores were not significantly associated with a low eGFR. Regarding day-0 (D0) biopsies, univariate analysis showed that the percentage of sclerotic glomeruli (SG, p=0.02), arteriolar hyalinosis (p=0.03), mean Remuzzi score (p=0.03) and mean CADI score (p=0.04) were all significantly associated with a low eGFR. Logistic regression showed that an integrated score including (i) donor Scr (>150 µmol/L), (ii) hypertension, and (iii) SG (>10%) had the highest performance in predicting a low eGFR at 1 year compared with clinical or histological parameters alone. Using this composite score, the adjusted OR for the prediction of a low 1-year eGFR ranged from 1 if none of the 3 factors was present, to 7.1 (if SG >10% was associated with one of the 2 clinical factors), and to 27.5 (if all 3 factors were present, p=0.0003). Conclusion: This study highlights that D0 biopsies are useful to predict graft outcome, particularly in the marginal donor population, and may perform better than clinical scores alone. In this population, a simple and routinely applicable integrated score strongly predicts a poor graft outcome, which may allow optimized allocation of marginal donors.

Kidney transplantation from small pediatric donors is increasingly being utilized as a means to optimize the organ supply; however, the single most common specified reason for the discard of pediatric kidneys is vascular damage, such as shortening of the suprarenal aorta or injury to the renal artery orifices, which often precludes en bloc transplantation (EBK). At our center, damaged kidneys were salvaged by transplantation as singles (SK) …

BACKGROUND: In an effort to maximize the number of recipients transplanted per donor, transplant centers in our donor service area (DSA) voted to preferentially allocate local and imported en-bloc pediatric donor renal allografts to centers willing to transplant two individuals with single allografts. After 1 year of implementing this policy, we report on our initial experience.
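The integrated D0-biopsy score described above combines three dichotomized donor factors. A small sketch of how the score and the reported adjusted odds ratios line up; the function and input encoding are ours, and the abstract reports ORs only for the combinations listed in its Results:

```python
# Composite D0-biopsy donor score: one point each for donor Scr >150 µmol/L,
# donor hypertension, and >10% sclerotic glomeruli. The odds ratios for a
# low 1-year eGFR are those quoted in the abstract; 7.1 corresponds to
# SG >10% plus one clinical factor, and other combinations are not reported.

def donor_score(scr_umol_l: float, hypertension: bool, sclerotic_glomeruli_pct: float) -> int:
    factors = [scr_umol_l > 150, hypertension, sclerotic_glomeruli_pct > 10]
    return sum(factors)

REPORTED_OR = {0: 1.0, 2: 7.1, 3: 27.5}  # adjusted OR for low 1-yr eGFR

score = donor_score(scr_umol_l=180, hypertension=True, sclerotic_glomeruli_pct=12)
print(score, REPORTED_OR.get(score, "not reported"))
```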
METHODS: From July 2006 to June 2007 we reviewed our experience with 12 adult single-allograft recipients of pediatric donors less than 60 months of age. There were no recipient exclusions based on age or size; donor exclusion criteria consisted of donor age <2 months and single allograft size <5 cm. All but 1 recipient received rabbit anti-thymocyte globulin induction, tacrolimus, mycophenolate mofetil, and rapid steroid withdrawal. Results from this cohort were compared with 86 consecutive recipients of adult single allografts from standard criteria donors, treated with the same immunosuppression protocol, as historical controls. RESULTS: 12 pediatric single allografts with a median donor age of 24 months (range 8-58) were transplanted into 12 adults with a median age of 46 years (range 24-67). …

… showed that non-white race was associated with an increased risk of death (p<0.01). This effect of race was attenuated when LTx center was taken into account [B]. Finally, with the addition of MELD to the model, the effect of race on waitlist mortality all but disappeared [C]. Conclusions: On the surface, minority patients may appear to have higher mortality on the LTx waitlist compared with their Caucasian counterparts. However, this association is predominantly a result of minority patients having a higher MELD score, although LTx center-specific mortality may contribute. These data suggest that waitlist outcome may be improved by optimizing referral of minority patients.

Background: In cirrhotic patients awaiting liver transplantation (LT), low serum sodium (Na) predicts short-term pre-LT mortality, independently of MELD. Incorporation of Na into MELD has been recommended to improve prognostic accuracy (MELD-Na; Gastroenterology 130:1652, 2006). However, short-term interventions such as water restriction that improve Na may have little effect on prognosis. Hypothesis: the lowest level of serum sodium in the preceding 30 (Na30), 90 (Na90) or 180 days (Na180) may be better than the current serum sodium (NaC) for predicting pre-LT cirrhotic mortality. Methods: We reviewed electronic records of 764 cirrhotic veterans referred for consideration of LT from 2/28/02 to 6/30/07. The date of the most recent Na at referral was chosen as time zero for determining NaC, Na30, Na90, and Na180 and for assessing subsequent survival. Findings: Within 90 days, 94 patients died pre-LT (12%) and 27 underwent LT (3%). Na at all time points was strongly associated (p<0.001) with pre-LT death (censored at LT). Areas under receiver operating characteristic curves (AUROCs) for Na30, Na90 and Na180 as predictors of 90-day pre-LT mortality (mean ± SE) were 0.808 ± 0.026, 0.807 ± 0.026 and 0.783 ± 0.025, respectively, compared with 0.714 ± 0.032 for NaC (all p<0.05 vs. NaC). On multivariable logistic regression analysis, MELD and Na90 were independent predictors of 90-day pre-LT mortality, with the best discrimination given by the following model: MELD-Na90 = MELD + (135 − Na90) × 1.04, with the value of Na90 capped at 135 (a computational sketch of this score follows below). AUROCs for MELD-Na90, MELD-Na, and MELD were 0.875 ± 0.027, 0.865 ± 0.025 and 0.838 ± 0.027, respectively. Findings were similar when patients with HCC at referral (n=155) were excluded (AUROCs 0.884 ± 0.025, 0.876 ± 0.026 and 0.840 ± 0.031, respectively), and when the 90-day survival endpoint was changed from "death censored at LT" to "death or LT" (AUROCs 0.890 ± 0.021, 0.882 ± 0.022, and 0.855 ± 0.026, respectively). Conclusion: Short-term improvement in Na may mask true pre-LT mortality risk. The lowest Na in the preceding 90 days is a better prognostic indicator than the current sodium.
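The MELD-Na90 model just defined is simple to compute. A minimal sketch, directly transcribing the published formula:

```python
# MELD-Na90 as defined in the abstract:
#   MELD-Na90 = MELD + (135 - Na90) * 1.04, with Na90 capped at 135.
# Na90 is the lowest serum sodium (mEq/L) in the preceding 90 days.

def meld_na90(meld: float, na90_meq_l: float) -> float:
    na = min(na90_meq_l, 135.0)   # sodium above 135 adds no points
    return meld + (135.0 - na) * 1.04

print(meld_na90(meld=18, na90_meq_l=128))  # illustrative: 18 + 7*1.04 = 25.28
```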
Substitution of MELD-Na90 for MELD would permit more accurate "sickest first" organ allocation, while at the same time allowing pre-LT correction of Na without loss of priority. Prospective validation of MELD-Na90, in comparison to MELD-Na and MELD, is warranted.

Hyponatremia Does Not Affect Survival Following Liver Transplantation. Byung Cheol Yun, W. Ray Kim, Y. S. Lim, Joanne T. Benson, Walter K. Kremers, Terry M. Therneau. Gastroenterology and Hepatology, Mayo Clinic College of Medicine, Rochester, MN.

Background: Hyponatremia is a common yet important complication of cirrhosis. Serum sodium (Na) has been found to be an important predictor of survival in patients with cirrhosis, and models incorporating Na have been proposed for liver allocation. Concerns have been raised, however, that liver transplantation (LTx) in hyponatremic patients will adversely affect the outcome. In this work, we assessed the effect of pre-LTx Na on short-term survival following LTx. Methods: Patient-level data on all waitlist registrants in the US for 2005 and 2006 were obtained from the Organ Procurement and Transplantation Network. Demographic, clinical and laboratory data at the time of LTx and outcomes following LTx were extracted. The relationship between pre-LTx Na and post-LTx survival was analyzed using multivariable regression analyses. Results: There were 7411 primary transplants that met the inclusion criteria between 2005 and 2006. The median Na at the time of LTx was 136 mEq/L (interquartile range, IQR: 132-139). There were 1255 patients who had a Na ≤130 mEq/L. The mean MELD score was 22.5 (SD 9.2). Median follow-up was 279 (IQR: 155-371) days. Overall 30- and 90-day survival post-OLT was 96% and 93%, respectively. In a multivariable logistic regression model, each MELD point was associated with a 1.04-fold increase in 90-day mortality (95% confidence interval: 1.03-1.05), while Na had no impact on survival (OR=0.99, 95% CI: 0.98-1.02). The figure represents the risk of 90-day mortality according to Na after adjustment for MELD, and clearly shows the absence of any increase in mortality over a wide range of Na. Conclusions: Hyponatremia at the time of LTx has no detrimental impact on short-term patient survival following LTx. Although these data do not address morbidity (e.g., central pontine myelinolysis), there is no evidence that incorporation of Na in organ allocation will lead to diminished survival.

Objective. This study examined the relationship between MELD at liver transplantation (LT) and post-LT quality of life (QOL). Methods. Adult LT recipients (n=247) at two centers completed the SF-36 and the Transplant Symptom Frequency Questionnaire (TSFQ) 1 year post-LT. High SF-36 scores indicate better QOL; high TSFQ scores indicate more symptomatology. Clinical (laboratory) MELD at LT, demographic characteristics, presence of ascites, encephalopathy, and variceal bleeding pre-LT, current employment status, presence of co-morbid medical conditions, and BMI were collected from medical records. Results. The primary LT indication was viral hepatitis (57%), cholestatic liver disease (17%), or hepatocellular disease (27%); 63% had ascites, 51% encephalopathy, and 29% gastroesophageal bleeding. Mean MELD at LT was 20 ± 9. There was almost no correlation between MELD and SF-36 Physical (r = 0.11) and Mental (r = 0.001) functioning. Statistically significant yet weak correlations were found between MELD and Physical Functioning (r = −0.15) and Role Functioning-Physical (r = −0.15).
MELD was not significantly correlated with any other SF-36 scale (r = −0.11 to −0.01). MELD was also not significantly correlated with any TSFQ domain: affective distress (r = 0.08), neurocognitive symptoms (r = 0.05), gastrointestinal distress (r = −0.02), physical appearance changes (r = 0.09), appetite and weight changes (r = 0.09), and miscellaneous symptoms (r = 0.08). Older age (β = −0.28), female sex (β = −0.26), viral hepatitis (β = 0.67) or cholestatic disease (β = 0.45), higher BMI (β = −0.62), and >1 medical co-morbidity (β = 0.57) were significant predictors of lower QOL as measured by the SF-36 (adj. R² = 0.12, F = 4.4, p < 0.001). Older age (β = 0.27), female sex (β = 0.29), higher BMI (β = 0.71), history of variceal bleeding (β = 0.19), and >1 medical co-morbidity (β = 0.17) were predictive of more symptoms on the TSFQ (adj. R² = 0.10, F = 4.1, p < 0.001). MELD was not predictive of QOL. Conclusions. Higher disease severity at LT, as measured by MELD, does not portend a worse QOL outcome for patients 1 year after transplantation. Other pre-LT indicators of decompensation also do not predict post-LT QOL. Post-LT QOL is affected more by other variables, including age, sex, BMI, and medical co-morbidities.

Introduction: Racial disparities in access to cadaveric renal allografts have been well described for renal transplantation. However, little is known about differences in orthotopic liver transplantation (OLT) rates for patients of minority racial groups following listing. The purpose of the current study was to determine whether the rate of transplantation differs among racial groups and to examine potential reasons for any disparity. Methods: The United Network for Organ Sharing (UNOS) database was obtained, and data were extracted for OLTs performed in adults (greater than 18 years of age) from 2/2002 to 12/2005. Transplants for which recipient race or Model for End-Stage Liver Disease (MELD) score at listing was unknown, as well as patients still active on the list, were excluded. Rates of transplantation, as well as differences in reasons for de-listing (transplantation, death/deterioration, and improvement), were examined. In an effort to examine only patients with chronic liver disease, further analysis was performed excluding patients with acute fulminant liver failure and retransplants. Results: The database contained complete MELD and race information on 17,916 OLTs. Seventy-four percent of patients were Caucasian, 12% Hispanic, 9% African-American, and 4% Asian. As seen in Table 1, laboratory MELD score at removal differed between racial groups. Examining pair-wise comparisons of the three minority groups to Caucasians, only Hispanics differed in reason for de-listing (Table 1). Subgroup analysis excluding acute hepatic failure patients and retransplants showed similar results, with Hispanic patients being more likely to die/deteriorate compared with other racial groups (31% deaths vs. 24% deaths for Caucasians) and less likely to receive a transplant (69% of Hispanics vs. 75% of Caucasians, p<0.001). Conclusion: Hispanic patients, although listed with higher MELD scores, are transplanted less often than Caucasian patients and are more likely to die/deteriorate while awaiting OLT. Reasons for this discrepancy are unclear and merit further attention.

Background: Since the implementation of the Model for End-Stage Liver Disease (MELD) for liver allocation, an increasing number of candidates with renal insufficiency have undergone orthotopic liver transplantation (OLT).
Since candidates with renal insufficiency have higher post-transplant morbidity and mortality, MELD-based allocation may be shifting some waiting-list mortality to the post-transplant period in these candidates. The objective of this study was to evaluate the survival benefit among candidates with renal insufficiency who underwent OLT. Methods: Scientific Registry of Transplant Recipients data for adult candidates (age ≥18) initially listed for OLT between 9/1/01 and 12/31/2006 (n=38,899) were analyzed. The effect of serum creatinine on the survival benefit (the contrast between waiting-list and post-transplant mortality) was assessed by sequential stratification, an extension of Cox regression. Each recipient was matched with candidates active on the waiting list in the same organ procurement organization with the same MELD score. Results: For MELD scores 12-40, the survival benefit of OLT significantly decreased as serum creatinine increased. Among candidates transplanted at MELD 12-14, the 23% with serum creatinine >1.1 mg/dl experienced no significant survival benefit (Figure). Candidates transplanted at MELD ≥15 experienced significant OLT benefit irrespective of serum creatinine level. Conclusions: Comparing two patients with MELD ≥12, the patient with the higher creatinine experiences significantly less survival benefit from liver transplantation. Almost one-quarter of patients transplanted at MELD 12-14 experienced no survival benefit from OLT based on 5 years of follow-up. Therefore, more careful assessment of candidates is required in order to maximize the survival benefit gained by the wait-listed end-stage liver disease population as a whole.

Liver: Living Donors and Partial Grafts I

Introduction: Consideration of the risks and benefits of a procedure is critical in medical decision making. However, relatively little is known about risk tolerance amongst donors and transplant professionals in live-donor liver transplantation (LDLT). We conducted confidential semi-structured interviews in a convenience sample of donors, non-donors (individuals who had been assessed for donation but did not donate) and transplant team members. In addition to examining issues surrounding decision making for LDLT donation, we sought to assess the level of risk, above which participants would no longer contemplate donation, for a number of potential outcomes following LDLT. The outcomes that participants were asked to consider included their tolerance for the risk of donor death and of serious donor complication, as well as the risk of recipient death following transplantation. The interviews were conducted sequentially, data were coded quantitatively, and the study was terminated once saturation was reached.

… (pre-donation and at weeks 1, 4, 12, 26 and 52 post-donation). Ambivalence detected by staff or described by the donor was recorded. Donor and recipient characteristics were examined and compared between ambivalent and non-ambivalent groups. Results: Staff-identified and self-identified ambivalent donors were not equivalent. Staff assessments indicated 20 ambivalent donors (16 male, 4 female); 18 donors self-identified as ambivalent (11 male, 7 female); 7 donors were on both lists (5 male, 2 female). Brother-to-brother and son-to-father combinations were the most common pairs among ambivalent donors and were more common than in the total donor cohort. A recipient diagnosis of alcohol- or hepatitis C-related liver disease was more common in ambivalent donors.
Ambivalent donors were more likely than the total right hepatic lobe (RHL) donor group to be college educated and to express significant religious affiliations. All but 1 ambivalent donor indicated on the 1-year QOL survey that they would donate again. Conclusions: Ambivalence about RHL donation is present in approximately 20% of candidates who complete donation. Staff-identified and self-identified groups showed only 20% overlap; however, both groups showed similar characteristics. Brother-to-brother and son-to-father pairings, and recipients with perceived self-induced liver failure, were more common in both groups compared with the total donor cohort. Ambivalent donors had more education and stronger religious or spiritual identification than the entire cohort. Only 1 donor indicated persistent doubt about donation. These results suggest that expressed or perceived donor-candidate ambivalence may represent a process of careful consideration and should not be used as the sole basis for donor disqualification.

The impact of donor age on recipient outcome in adult right-lobe living donor liver transplantation (RLDLT) is unclear. Aim: To analyze the effect of donor age on recipient outcome following RLDLT. Methods: Since 2000 we have performed 226 RLDLT (mean donor age 37 years, range 18-60 years), including 20 with donors aged 55 years or older. We analyzed the effects of donor age, as a continuous or categorical (≤54 vs. ≥55 years) variable, on recipient outcome. Recipient outcome measures included biochemical markers of hepatocyte injury (AST, ALT) and graft function (INR, bilirubin), postoperative infections, bleeding, biliary complications, acute cellular rejection, and patient and graft survival. Analyses were carried out stratified for higher recipient MELD score (≤ vs. >25), recipient age (≤ vs. >60 years), and hepatitis C virus (HCV) infection (presence vs. absence). Results: 5-year patient and graft survival after RLDLT was 80% and 78%, respectively. Donor age as a continuous variable was associated with increased AST (p=0.04) and ALT (p=0.022) release after transplantation, while no effect was observed on INR or bilirubin. RLDLT using donors above 55 years of age resulted in an increased incidence of biliary strictures (20% vs. 6%, p=0.028) and postoperative cholangitis (25% vs. 9%, p=0.023). No effect of donor age was found for the following recipient outcome measures: number of bile ducts supplying the graft, type of biliary reconstruction required, rejection, hemorrhage, pulmonary or urinary tract infections, renal failure, or length of hospital stay. 5-year patient survival was comparable for patients receiving grafts from donors below or above 55 years of age (81% vs. 77%, p=0.86), as was 5-year graft survival for young and old grafts (78% vs. 77%, p=0.71). Recipient age (≤ vs. >60 years), recipient MELD score (≤ vs. >25), and recipient hepatitis C status did not modify the effect of donor age on patient or graft survival. Conclusion: In this single-center series of 226 RLDLT, the use of selected older donors did not impair graft or patient survival, but was associated with an increased rate of biliary strictures.

Background: The biliary stricture rate after living donor liver transplantation (LDLT) in adults remains relatively high in comparison with the stricture rate after adult cadaveric liver transplantation or LDLT in pediatric patients. The etiology of, and risk factors for, biliary stricture development are at present uncertain.
Purpose: To determine the risk factors for biliary stricture after right-lobe (RL) LDLT. Methods: From 5/99 to 12/07, 109 LDLT procedures were performed in 109 adult recipients. Eleven patients were excluded from analysis due to <90 days of follow-up or the need for retransplantation. The following data were prospectively collected: (1) demographics, (2) acuity of illness, (3) number of bile ducts, (4) type of biliary reconstruction, (5) graft-to-recipient weight ratio (GRWR), (6) hemodynamic parameters, and (7) outcomes. These parameters were compared in patients with and without strictures. Results: Mean follow-up for the 98 patients was 1642 days (range: 120-3052). Eight patients died during follow-up and 3 required whole-liver retransplantation. Thirty-three patients (34%) developed a biliary stricture during the follow-up period. Comparison of risk factors in patients with and without strictures considered mean MELD, >1 bile duct, and GRWR <1 [comparison table not reproduced]. Neither MELD score, number of bile ducts, nor type of biliary reconstruction appeared to be a contributing factor to the development of bile duct stricture following RL LDLT. The biliary stricture rate was related to the volume of the transplanted liver and to post-transplant graft recovery. Therefore, the development of biliary strictures in some patients may represent yet another feature of small-for-size syndrome.

Background: OX40 and CD154 can be expressed by both Foxp3+ Tregs and activated T effector cells. However, how OX40 and CD154 function, individually or collectively, in regulating such functionally different T-cell subsets in transplant models remains poorly understood. In some models, blocking CD154 costimulation is remarkably effective in prolonging graft survival, but targeting CD154 alone rarely creates tolerance, and the role of OX40 in regulating CD154 blockade-induced tolerance is completely unknown. In the present study we critically examined the role of OX40 in the activation of CD154-deficient T effector cells as well as in the regulatory function of Foxp3+ Tregs. We also examined the effect of OX40 on the induction of new Foxp3+ Tregs/Th17 cells from activated CD154-deficient T effector cells. The impact of OX40 on the induction of allograft tolerance was examined using an islet transplant model. We found that CD154-deficient Foxp3+ Tregs constitutively expressed OX40 on the cell surface, but CD154-deficient T effector cells did not. However, when the T effector cells were sorted and stimulated in vitro, OX40 expression could be readily induced on them. Examining further how OX40 regulates these functionally different T-cell subsets, we found that OX40 delivers potent costimulatory signals to T effector cells, which prevent the induction of new Foxp3+ Tregs from activated T effector cells and promote their differentiation to Th1, but not Th17, cells. Surprisingly, OX40 costimulation of CD154-deficient Foxp3+ Tregs completely inhibited their regulatory functions. In an islet transplant model, we showed that CD154-deficient mice can reject DBA/2 islet allografts, but that blocking OX40 costimulation readily induced donor-specific tolerance (MST >150 days), and this tolerant status was critically dependent on the induction of Foxp3+ Tregs. In contrast, treatment of CD154-deficient recipients with an agonist anti-OX40 mAb precipitated rapid islet allograft rejection, suggesting that control of OX40 costimulation is critically important in the induction of transplant tolerance.
Conclusions: Our data suggest that OX40 is a costimulatory molecule for T effector cells but a powerful negative regulator of Foxp3+ Tregs. Thus, a key role for OX40 in the induction of transplant tolerance is the control of T cell-mediated regulation.

Background: Foxp3 is a winged-helix family transcription factor that is the master regulator of the development and function of regulatory T cells (Treg). We investigated the molecular mechanisms important for the regulation of Foxp3 expression, and defined the structure of the active Foxp3 promoter in CD4+ T-cell lineages. Methods: Purified CD4+CD25−Foxp3−GFP− T cells (naïve) and CD4+CD25+Foxp3+GFP+ Treg were cultured with antigen-presenting cells in the presence of IL-2, anti-CD3ε mAb, TGFβ, or the DNA methyltransferase inhibitor 5-aza-2'-deoxycytidine (ZdCyd). Foxp3 promoter structure and activity were monitored with methylation-specific PCR, bisulfite sequencing, chromatin immunoprecipitation (ChIP) assays, electrophoretic mobility shift assay (EMSA) and luciferase promoter assays. Results: The Foxp3 promoter has an upstream CpG island ~5 kb from the transcriptional start site. Bisulfite sequencing and methylation-specific PCR analysis showed that this region is heavily methylated in naïve CD4+ T cells and TGFβ-induced peripheral Treg, but demethylated in thymic-derived natural Treg (nTreg). ChIP analysis showed that the methylated CpG island is bound specifically by the DNA methyltransferases 1 and 3b. ZdCyd causes demethylation of the CpG island and, in combination with TGFβ, synergistically induces Foxp3 expression. ChIP assays for acetylated histone 3 and Sp1, both markers of gene activation, showed that the CpG island is acetylated and bound by Sp1 in nTreg and in ZdCyd plus TGFβ-induced Treg, but not in activated CD4+ T cells or TGFβ-induced Treg. EMSA likewise showed that the CpG island binds Sp1. In contrast to the upstream promoter, the structure of the first intronic promoter differs markedly between nTreg and TGFβ-induced Treg, but is not affected by ZdCyd. The upstream CpG island also possesses enhancer activity that is repressed by DNA methyltransferases. ZdCyd plus TGFβ-induced Treg have stable Foxp3 expression and enhanced suppressive function in vitro and in vivo. Conclusion: These results demonstrate that nTreg and TGFβ-induced Treg are distinguished from each other by the epigenetic structure of a unique upstream CpG island of the Foxp3 promoter. The function of this region is regulated by DNA methylation and histone acetylation. ZdCyd demethylates the promoter, leading to enhanced and stable expression of Foxp3 and suppressor activity, similar to nTreg. This has important implications for Treg biology and for generating Treg for tolerance.

Background: Trafficking of lymphocytes through lymphatics to secondary lymphoid organs is crucial for immune responses. We previously showed that regulatory T cell (Treg) function required trafficking from the inflammatory graft site to the local draining lymph node (dLN). Since the mechanisms that regulate migration through afferent lymphatics are poorly understood, we explored the role of chemokine receptors on Treg in afferent lymphatic migration in an islet transplantation model. Methods: Islets were transplanted from BALB/c mice into Foxp3-GFP C57BL/6 mice. Treg from wild-type, CCR2−/−, CCR4−/−, CCR5−/−, or CCR7−/− C57BL/6 mice were isolated, labeled with the red dye PKH26, and transferred intravenously or locally into the islet allograft.
Treg migration to islet grafts and dLN was determined by flow cytometry and immunohistochemistry. Endogenous Foxp3-GFP+ Treg and transferred PKH26-labeled Treg were sorted from the islet grafts and the dLN, and chemokine receptor and sphingosine 1-phosphate receptor 1 (S1P1) expression was determined by RT-PCR. Islet allograft survival was determined by measurement of blood glucose. Results: Freshly isolated Treg expressed S1P1 and the chemokine receptors CCR2, CCR4, CCR5, and CCR7. Endogenous Treg, and both intravenously and locally transferred Treg, recovered from islet allografts and dLN expressed similar levels of CCR2 and CCR5. CCR4 was expressed preferentially on islet-migrating Treg, while S1P1 and CCR7 were expressed preferentially on dLN-migrating Treg. Locally transferred Treg migrated to the dLN, but CCR7−/− Treg were not able to migrate to the dLN, and CCR2−/− and CCR5−/− Treg were impaired in their ability to do so. This suggested that these three chemokine receptors all regulate Treg entry into afferent lymphatics and migration from the graft to the dLN. In contrast, CCR4−/− Treg migrated normally from the islet to the dLN. Importantly, CCR2−/−, CCR5−/− and CCR7−/−, but not CCR4−/−, Treg were impaired in their ability to prolong islet allograft survival when transferred locally into the islet allograft. Conclusion: Treg migrate from the inflammatory site of the allograft to draining secondary lymphoid tissue through afferent lymphatics. This process depends on CCR2, CCR5, and CCR7, and is crucial for full Treg function in vivo. These results demonstrate a novel role for sequential migration from the graft to the dLN in Treg function and suppression.

Epigenetic regulation of gene expression provides a major, and, especially beyond oncology, largely unexplored means of regulating host immune cell function. Our ongoing analysis of histone deacetylase (HDAC) expression by Foxp3+ naturally occurring murine regulatory T (Treg) cells showed that TCR-activated Tregs had 5- to 6-fold more HDAC6 mRNA than corresponding resting Treg or non-Treg cells. In various cell types, HDAC6 deacetylates alpha-tubulin, cortactin, and HSP90, abrogates formation of the aggresome, and blocks the unfolded protein response, though nothing is known regarding these pathways in Tregs. We found that an HDAC6-specific inhibitor, tubacin (but not the control compound, niltubacin), increased Treg suppressive function in vitro (p<0.01), in association with increased expression of CTLA4, IL-10, GITR, PD-1 and other Treg-associated genes (p<0.05), and increased Treg Foxp3 protein (though not mRNA) expression. Tubacin enhanced the conversion of CD4+CD25− cells into CD4+Foxp3+ Treg in vitro, and globally decreased cytokine production, with the exception of IL-10 and IL-17 mRNA. Comparable and dose-dependent effects were seen using the HSP90 inhibitor geldanamycin, suggesting that the effects of HDAC6 inhibition were mediated, at least in part, by blocking the chaperone function of HSP90. Use of tubacin in vivo significantly decreased the severity of colitis in two murine inflammatory bowel disease models (p<0.05), dextran sodium sulfate-induced colitis and the CD4+CD62L-high adoptive transfer model of colitis, as assessed by standard clinical and histologic criteria. In addition, 14 days of combined use of tubacin and a subtherapeutic dosage of rapamycin led to significantly prolonged cardiac allograft survival (BALB/c→C57BL/6) compared with use of either agent alone (p<0.05).
Our data show that the use of the first known small-molecule inhibitor of one specific HDAC has important therapeutic effects, including enhancing the production and suppressive function of Tregs. While ongoing studies are directed towards unraveling the interactions of HDAC6-dependent pathways and Treg functions, the current data indicate that understanding the functions of HDACs is important to the development of entirely new ways to regulate host immune responses.

Dendritic Cells Supply Paracrine IL-2 for Treg Cell Functional Activity.

Regulatory CD4+CD25+ T cells (Tregs) are important for the maintenance of immune tolerance, and immunotherapy with Tregs is being explored for organ and cell transplantation. Treg development, expansion and function depend on IL-2. Because Tregs do not make IL-2, they must obtain IL-2 from another cell. Although CD4+ T effectors are a logical candidate, the identity of the paracrine source of IL-2 for Tregs is not substantiated. We explored whether dendritic cells (DCs) could serve as the paracrine source of IL-2 for Tregs and for RD6, a CD4+CD25+ regulatory hybridoma. Using four-dimensional live-cell imaging, we demonstrate that Treg and RD6 cells establish tight contact with DCs, and that CD25 is localized at these contacts. Using an IL-2 ELISPOT and real-time RT-PCR, we found that splenic DCs and the JAWSII DC cell line constitutively make IL-2, and that LPS and CpG increase DC production of IL-2. Co-culture with the JAWSII DC cell line significantly upregulates CD25 expression on alloreactive DO11.10 Tregs and RD6 cells, but not on DO11.10 CD4+ T effector cells. Tregs and RD6 cells are functionally suppressive after activation by wild-type but not IL-2 knockout allogeneic DCs, and anti-CD25 inhibits the function of Treg and RD6 cells in a dose-dependent fashion. In contrast, wild-type and IL-2 knockout DCs are equally able to activate alloreactive CD4+CD25− cells. Supplemental IL-2 at high (1000 U/ml) but not low (100 U/ml) doses restores the function of alloreactive Tregs and RD6 cells that were activated by IL-2 KO DCs. These data indicate that Treg cells acquire IL-2 from dendritic cells for their gain of function, and validate dendritic cells as a paracrine source of IL-2 for Treg.

Introduction: It has previously been demonstrated that FOXP3, a gene required for the development and function of regulatory T cells, is highly expressed in the graft during cardiac rejection, suggesting infiltration of regulatory T cells into the transplanted organ during an allogeneic response. In this study, we investigated whether graft-infiltrating T cells expanded from rejecting human cardiac allografts exhibit immune regulatory activities. Methods: Graft-infiltrating lymphocytes (GILs) cultured from endomyocardial biopsies (EMB; n=13) with histological signs of acute cellular rejection were expanded in the presence of donor antigens in IL-2/IL-15-enriched medium for 2-3 weeks. Flow cytometry was used to analyze the expression of CD3, CD4, CD8 and FoxP3. To analyze immune regulatory function, we performed MLRs with peripheral blood mononuclear cells (PBMC) of the patients and irradiated donor or third-party spleen cells, in the absence and presence of GILs (ratio 5:1). Results: Of the CD3+ GILs, 9% (median; range: 1-21%) stained positive for FoxP3. This FoxP3 expression was detected in both the CD4+ and CD8+ T-cell populations (median: 8% and 9%, respectively). Functional analysis demonstrated that GILs suppressed the anti-donor proliferation of responder T cells (% inhibition range: 55-81%).
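The percent-inhibition values quoted here follow the usual suppression-assay convention: responder proliferation with GILs present relative to proliferation without them. A sketch under that assumption, with invented counts:

```python
# Percent inhibition in an MLR suppression assay, computed the usual way:
# inhibition = (1 - proliferation_with_GILs / proliferation_without) * 100.
# The cpm values are invented placeholders, not data from the study.

def percent_inhibition(cpm_without_gils: float, cpm_with_gils: float) -> float:
    return (1.0 - cpm_with_gils / cpm_without_gils) * 100.0

print(f"{percent_inhibition(cpm_without_gils=52_000, cpm_with_gils=14_000):.0f}% inhibition")
```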
Interestingly, this suppression was predominantly achieved by CD8+ GILs: depletion of CD8+ cells from the GIL population diminished the inhibitory effect, whereas addition of CD8+ GILs alone to the MLR strongly suppressed the anti-donor response (% inhibition range: 62-77%). In contrast, GILs did not inhibit the proliferation of T cells stimulated with third-party antigens. The figure below depicts a representative example. Graft-infiltrating lymphocytes expanded from rejecting cardiac allografts thus exhibit donor-specific immunosuppressive activity. These results suggest that during acute cellular rejection, graft-infiltrating lymphocytes not only consist of graft-destructive effector T cells, but may also comprise immune regulatory cells of the CD8+ phenotype.

Context: It has previously been suggested that a liver allograft is immunoprotective and able to decrease the rate of rejection of a donor-specific allograft of another organ. It has recently been proposed that allografts other than the liver may also be immunoprotective. Objective: The aim of this analysis was to examine the one-year rejection rate and the incidence of rejection-free survival of all combined transplants in the collective US experience, to gain insight into any possible protective effect of one organ for another. Methods: The United Network for Organ Sharing (UNOS) provided de-identified patient-level data. Analysis included all recipients transplanted between January 1, 1994 and October 6, 2005 who were 18 years or older (except intestinal transplants). Rejection at one year was defined as treatment for one or more episodes of rejection. Results: Analysis included a total of 83,424 patients who received either one organ, or combined, simultaneous or sequential, organ transplants in all possible combinations. Results are summarized in Figure 1 (one-year organ allograft rejection rate). The collected data demonstrate that the rejection rate of donor-specific organ allografts accompanying primary liver, kidney, and heart transplants was significantly lower in combined transplants than that of the primary allograft transplanted alone. This was not true, however, for intestinal and pancreatic allografts, where protection of the accompanying organ was not observed. We further demonstrate that transplantation of two organs of the same type (double kidneys or double lungs), i.e., an increase in antigen load, also leads to decreased rates of rejection of the allografted organs. Conclusions: In combined simultaneous transplants, heart, liver, and kidney allografts appear both to be protected themselves and to protect the other organ from rejection. An increased antigenic load of identical antigens, in the case of double lung and double kidney transplants, also appears to offer immunologic protection against rejection, perhaps by different mechanisms.

Background: A2ALL (the 9-center Adult-to-Adult Living Donor Liver Transplantation Cohort Study) has identified risk factors for mortality after adult-to-adult living donor liver transplantation (AALDLT), including center experience. The aim of this study was to determine whether the A2ALL findings are reflected in the national experience. Methods: AALDLT at A2ALL (n=681) and non-A2ALL centers (n=1598) from 1/1/98 to 8/31/07 in the Scientific Registry of Transplant Recipients database were analyzed. Cox regression models adjusted for recipient and donor characteristics were fitted to test associations with mortality risk after AALDLT, including center type (A2ALL vs. non-A2ALL) and case number (for each AALDLT at each center).
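The Cox models described in these Methods can be sketched as follows. The column names and toy records are invented, and lifelines is merely one common Python implementation of proportional-hazards regression, not the software used in the study:

```python
# Sketch of a Cox proportional-hazards model of post-AALDLT mortality,
# with covariates of the kind named in the Methods. The data frame is a
# toy example; real analyses would use the registry data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followed": [1.2, 3.4, 0.8, 5.0, 2.1, 4.2, 2.9, 3.8],
    "died":           [1,   0,   1,   0,   1,   0,   1,   0],
    "donor_age":      [52,  45,  58,  33,  40,  28,  50,  36],
    "early_case":     [1,   0,   1,   0,   0,   1,   1,   0],  # center case number <= 15
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
cph.print_summary()  # exponentiated coefficients are hazard ratios, as quoted in Results
```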
Results: AALDLT were performed at 9 A2ALL and 63 non-A2ALL centers. There was no significant difference in overall mortality risk between A2ALL and non-A2ALL centers. Significant predictors of death (both groups combined) included donor age (hazard ratio (HR)=1.14 per 10 years, P=0.003), recipient age (HR=1.25 per 10 years, P<0.001), diagnosis of HCV (HR=1.22, P=0.04) or HCC (HR=1.99, P<0.001), and earlier center experience (AALDLT case number ≤15, HR=1.58, P<0.001). There was no significant effect of transplant year after adjusting for experience. Cold ischemia time >4.5 hours was associated with higher mortality (HR=1.83, P=0.004); this effect was similar in A2ALL and non-A2ALL centers. There were no significant interactions between center type and any predictor except center experience (Figure). Compared with later experience, earlier center experience was associated with significantly higher mortality risk at both A2ALL (HR=2.24, P<0.0001) and non-A2ALL centers (HR=1.38, P<0.004). Survival during early experience was significantly worse at A2ALL vs. non-A2ALL centers (HR=1.42, P=0.024), but survival in later experience was similar. Conclusions: After the first 15 cases, AALDLT survival was similar at A2ALL and non-A2ALL centers, and similar significant mortality risk factors were identified, including center experience. These analyses support the generalization of findings from A2ALL centers to others performing AALDLT.

Rejection with hemodynamic compromise (HC) and chronic allograft vasculopathy (CAV) impact survival in pediatric heart transplantation (PHTx). We have shown that a high pro-inflammatory/low regulatory cytokine gene polymorphism (GP) profile increases the risk of acute rejection. In this analysis, we assessed the effect of genetic factors on HC and CAV. Methods: 406 PHTx with clinical and GP data for cytokines (TNF-α A-308G; IFN-γ T+874A; IL-10 G-1082A, C-819T, C-592A; IL-4 C-590T; IL-5 T-746C; IL-6 G-174C), growth factors (TGFβ-1 T+869C, C+915G; VEGF A-2578C, C-460T, G+405C), effector molecules (Fas A-670G; FasL C-843T) and pharmacogenomic markers (ABCB1 C3435T, G2677T/A) were analyzed with respect to HC and CAV. Results: Adjusting for recipient black race and age with Cox regression models, we identified the following risk factors: the IL-10 high profile was associated with lower rates of HC, and low Th1 (IFN-γ, TNF-α) with high Th2 (IL-4, IL-5) cytokine GP profiles were protective against HC in combination with IL-10 high. Carriers of the Fas high profile experienced higher rates of HC and CAV, and the high Fas/FasL combination doubled the relative risk of CAV. ABCB1 3435CC/2677GG genotypes were also associated with lower rates of HC (Table 1). Conclusion: In this large multi-center study, GPs with higher regulatory profiles and increased drug transport were associated with a lower incidence of HC. A genetic pro-apoptotic profile might contribute to the pathogenesis of CAV. Sponsorship: This work was supported by 5P50 HL 074 732-03 from the National Heart, Lung and Blood Institute, National Institutes of Health.

It has recently been reported that CD1d-restricted NKT cells that express an invariant TCR (iNKT cells) play an important role in the production of autoantibodies through interaction with B-1 cells. This observation prompted us to investigate the possible role of iNKT cells in the production of antibodies (Abs) against transplant-related antigens, such as ABO blood group carbohydrates and histocompatibility complex allopeptides, in a mouse model.
We have previously demonstrated that B cells with receptors for blood group A carbohydrates are found exclusively in a CD11b+CD5+ B-1 subpopulation in mice, resembling humans with blood group O or B. Immunization with human blood group A red blood cells (A-RBCs) elicited extensive production of anti-A IgM and IgG. Furthermore, the number of B-1 cells with receptors for A carbohydrates increased in the peritoneal cavity. In CD1d−/− and Vα14−/− BALB/c mice, which lack iNKT cells, such elicited production of anti-A IgM was not observed, even after immunization with human A-RBCs. However, class II−/− BALB/c mice, which lack CD4+ T cells but maintain normal levels of iNKT cells, exhibited levels of anti-A IgM production comparable to those in wild-type (WT) BALB/c mice. Moreover, anti-A IgG production was absent in CD1d−/− BALB/c mice even after immunization, indicating that although iNKT cells crucially contribute to anti-A IgM production and IgG class switching, helper T cells do not. Notably, the proportion of B-1 cells in the livers of CD1d−/− BALB/c mice was significantly reduced (2.66 ± 0.43%, n=4) compared with that in WT mice (4.76 ± 1.93%, n=4). We next immunized CD1d−/− and WT BALB/c mice twice with 2 × 10⁶ allogeneic B6 mouse thymocytes, and thereafter detected anti-B6 (allopeptide) Abs by flow cytometry. In the CD1d−/− mice, anti-B6 IgM production was comparable to that of WT mice, and IgG class switching also occurred normally. These findings indicate that iNKT cells play a pivotal role in the production of Abs specific for blood group carbohydrate determinants, which are believed to be T-cell independent, but are not required for the production of Abs against allopeptides, which are believed to be T-cell dependent. Depletion of iNKT cells or suppression of their function might constitute a novel approach to preventing antibody-mediated rejection in ABO-incompatible transplantation, or in xenotransplantation, which involves similar carbohydrate antigens.

Background: Static cold storage (CS) is the most widely used organ preservation method for deceased donor kidney grafts. Retrospective analyses have indicated that preservation by hypothermic machine perfusion (MP) may lead to improved outcome after renal transplantation. However, there is a lack of sufficiently powered prospective studies to test the presumed superiority of MP. In an international prospective randomized controlled trial we enrolled kidney pairs from 336 consecutive deceased donors and randomly assigned one organ to MP and the contralateral kidney to CS preservation. Follow-up was directed at all 672 recipients of these grafts. The primary endpoint was delayed graft function (DGF). MP significantly reduced the risk of DGF (OR 0.63; P=0.02) and more than halved the incidence of primary non-function after transplantation when compared with CS (2.1 vs. 4.8%; P=0.04). Furthermore, MP significantly reduced the risk of graft failure in the first 6 months post-transplant (HR 0.46; P=0.05). In recipients who developed DGF, 6-month graft survival was better if their transplanted kidney had been machine perfused (87 vs. 76%; P=0.05). Hypothermic machine perfusion thus reduces the risk of delayed graft function, primary non-function, and graft failure in deceased donor kidney transplantation when compared with static cold storage, and alleviates the deleterious effect of DGF on graft survival.
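For orientation, the trial's headline effect on DGF (OR 0.63) is the kind of estimate obtainable from a 2×2 table. A sketch with invented counts chosen to land near the reported value; note that the paired contralateral-kidney design would properly call for a conditional analysis:

```python
# Unadjusted odds ratio for DGF with a Wald 95% CI from a 2x2 table.
# Counts are invented placeholders, not the trial's data.
import math

dgf_mp, no_dgf_mp = 70, 266   # machine perfusion arm (hypothetical counts)
dgf_cs, no_dgf_cs = 98, 238   # cold storage arm (hypothetical counts)

or_ = (dgf_mp * no_dgf_cs) / (dgf_cs * no_dgf_mp)
se = math.sqrt(1/dgf_mp + 1/no_dgf_mp + 1/dgf_cs + 1/no_dgf_cs)
lo, hi = math.exp(math.log(or_) - 1.96*se), math.exp(math.log(or_) + 1.96*se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```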
We investigated the trafficking of cells after skin and heart transplantation in a dynamic fashion through the use of in vivo microscopy. Antigen-presenting cells were followed using MHC class II-GFP and CD11c-GFP transgenic mice. Vascularized and non-vascularized skin grafts, as well as heart transplants, were used in both syngeneic and allogeneic settings. After syngeneic non-vascularized skin transplantation, we observed an early and massive infiltration of host cells into the graft, as early as 3 hours post-transplant, with a gradual accumulation in the dermis. The accumulation of host-derived cells was accelerated after graft vascularization at day 4/5 post-transplantation. This graft infiltration by recipient cells was more pronounced with vascularized skin grafts, and to a higher degree in heart transplants. Recipient cells similarly infiltrated allogeneic grafts early on, and in larger numbers than syngeneic grafts, by day 4/5 post-transplantation. When visualizing MHC class II-GFP recipient cells in a syngeneic skin transplant, recipient DCs invaded the graft early on and, by 3 weeks post-transplant, had gradually replaced graft DCs in the dermis (dermal DCs) and the epidermis (Langerhans cells). Donor DCs could still be seen in the graft up to 8 days post-transplant. However, virtually all donor Langerhans cells were eventually replaced by recipient ones in a concentric fashion, suggesting that the new Langerhans cells originate from the recipient skin adjacent to the graft and not from centrally derived precursor cells. The vascular endothelium of a syngeneic transplant was partially replaced by recipient vascular endothelial cells in a centripetal fashion, with more recipient-derived vascular endothelium present at the periphery of the graft and more donor-derived endothelium remaining in the center. Therefore, the graft can be seen as a "chimera" of cells of donor and recipient origin. The presence of recipient vascular endothelial cells and DCs within the graft may be important for maintaining the indirect response thought to be responsible for chronic rejection.

Objective: Maturation resistance and tolerogenicity can be conferred on dendritic cells (DC), crucial regulators of T cells, by exposure to rapamycin (RAPA), a tolerance-sparing immunosuppressant. The mechanisms underlying this acquired unresponsiveness, typified by diminished responses to Toll-like receptor (TLR) or CD40 ligation, have not been identified. Thus, our objective was to elucidate a molecular basis for RAPA-induced DC maturation resistance. Methods: RAPA administration was used to condition splenic DC in vivo and bone marrow-derived DC in vitro. DC maturation was monitored by assessment of costimulatory molecule expression, cytokine production, and T-cell allostimulatory capacity. To identify negative regulators of maturation, microarray analysis and quantitative RT-PCR were completed, and findings were confirmed by Western blot and flow cytometric analyses. Results: In vitro or in vivo exposure of myeloid DC to RAPA elicited de novo production of IL-1β by otherwise immature DC (CD86-lo). Interestingly, DC IL-1β production, acting in an autocrine/paracrine fashion, promoted DC overexpression of the IL-1 receptor (IL-1R) family member ST2L and enhanced its surface expression. ST2L is the receptor for IL-33, an IL-1 family member, and has also been implicated as a negative regulator of TLR signaling.
Consistent with this regulatory function, IL-1β-induced ST2L expression suppressed the responsiveness of RAPA-conditioned DC to TLR or CD40 ligation. Conclusion: RAPA causes de novo production of IL-1β by immature DC, upregulating ST2L and establishing a barrier to DC maturation following exposure to TLR or CD40 ligation. As such, this work identifies a novel mechanism by which a clinically important immunosuppressant impedes the capacity of DC to mature and consequently to stimulate effector/adaptive T-cell responses. These findings are particularly relevant to the potential use of RAPA-conditioned DC as "negative" cellular vaccines to block alloAg-specific responses, as exposure to endogenous and exogenous inflammatory stimuli can induce DC maturation and negate the tolerogenic properties of immature DC.

Exosomes are nanovesicles (50-100 nm) released into the extracellular milieu by different cell types. Exosomes secreted by dendritic cells (DCs) and other APCs express MHC antigens, adhesion molecules and costimulatory molecules oriented on the membrane surface with their binding domains facing outwards. Thus, exosomes released by graft-infiltrating leukocytes (GILs) could function as "Ag-presenting vesicles" or as vehicles that transfer alloantigen (alloAg) to recipient APCs during elicitation of T-cell alloimmunity. Aims: to test whether (i) GILs activate anti-donor T cells in secondary lymphoid organs by releasing alloAg-bearing exosomes into the systemic circulation; or (ii) GILs that traffic to the spleen as passenger leukocytes use exosomes as a local mechanism to transfer alloAg to recipient DCs. Methods: Exosomes were isolated from supernatants of BM-derived [C57BL/6 (B6), IAb] DCs pulsed with the BALB/c IEα52-68 allopeptide and purified by ultrafiltration and ultracentrifugation on a 30% sucrose/D₂O gradient. We used PKH67+ exosomes and CD45.1-congenic B6 mice for trafficking studies; heart (heterotopic) and skin transplantation models (BALB/c→B6, Thy1.2+); and CFSE-labeled 1H3.1 TCR-transgenic CD4 T cells (Thy1.1+) specific for IAb (B6) loaded with IEα52-68 (BALB/c). DCs were genetically engineered to release exosomes expressing green fluorescent protein (GFP). We have previously shown that blood-borne exosomes carrying BALB/c alloAg are reprocessed by different subsets of splenic DCs for presentation to indirect-pathway 1H3.1 CD4 T cells. Here, we demonstrated that although GILs from cardiac and skin allografts release exosomes ex vivo, they did not secrete sufficient concentrations of alloAg-bearing exosomes into the circulation to stimulate donor-reactive T cells in lymphoid organs. Instead, our findings indicate that migrating DCs (generated in vitro or isolated from GILs), once homed to the spleen, transfer GFP-expressing, allopeptide-carrying exosomes to spleen-resident DCs of the recipient, identified by the congenic marker CD45.1. Conclusion: Exchange of exosomes between DCs in lymphoid organs might be a mechanism by which passenger leukocytes transfer alloAg to recipient APCs in secondary lymphoid organs.

T-cell activation is critical in initiating adaptive immunity, and PKCθ, a novel member of the PKC family, mediates non-redundant functions in T-cell receptor signaling; however, its role in the mediation of allograft rejection remains unclear. This study aimed to investigate whether the alloimmune response can be alleviated by deficiency of the PKCθ molecule, and whether transgenic expression of anti-apoptotic Bcl-xL can restore it.
Methods: Wild-type (WT) cardiac allografts were transplanted into PKCθ−/− mice, with or without subtherapeutic anti-CD154 mAb. Purified PKCθ−/− or PKCθ−/−/Bcl-xL T cells were adoptively transferred into Rag2−/− mice engrafted with cardiac allografts. Lymphocyte proliferation assays were performed (CFSE). NF-κB activation was assessed by bioluminescence imaging (BLI) using transgenic mice in which luciferase is under the control of an NF-κB promoter. Results: Cardiac allografts were rejected in a delayed fashion in PKCθ−/− mice, with increased NF-κB activation; however, subtherapeutic anti-CD154 mAb (which normally merely delays rejection of cardiac allografts) induced long-term survival of cardiac allografts. Cardiac allografts were permanently accepted in Rag2−/− mice after adoptive transfer of PKCθ−/− T cells, and rejection could be elicited by transfer of PKCθ−/−/Bcl-xL T cells. In a lymphocyte proliferation assay, PKCθ−/− T cells displayed greatly reduced proliferation. In response to CD3 and CD28 stimulation, PKCθ−/− T cells underwent accelerated apoptosis, with reduced Th1, Th17, and Treg subsets compared with WT T cells. Bcl-xL restored the survival of the PKCθ−/− T cells. Conclusions: The results suggest that PKCθ mediates the alloimmune response; the Bcl-xL transgene prevents PKCθ−/− T-cell apoptosis and re-elicits allograft rejection.

Tolerogenic dendritic cells (DC) are immature, maturation-resistant (MR), or alternatively activated DC that express MHC molecules with low or absent costimulatory signals. Although MR-DC administration has successfully prolonged allograft survival in murine models, the mechanism of action in vivo remains unknown. Aim: to test in vivo whether the downregulation of the anti-donor response induced by donor-derived tolerogenic DC is due to (i) direct interaction of the tolerogenic DC with donor-reactive T cells, or (ii) reprocessing of the tolerogenic DC into alloantigen (alloAg) by recipient APC for interaction with indirect-pathway T cells. Methods: DC were generated in vitro by culturing BALB/c bone marrow cells for 6-8 days in medium with GM-CSF + IL-4 supplemented with 10 nM 1α,25-(OH)₂ vitamin D₃ (VD3). We used a model of heterotopic vascularized allogeneic heart transplantation [BALB/c into C57BL/6 (B6)] and CD4 T cells from 1H3.1 TCR-transgenic mice that recognize B6 IAb loaded with the BALB/c allopeptide IEα52-68 (indirect pathway). Results: We demonstrated that VD3 renders DC maturation-resistant (VD3-MRDC): VD3-MRDC fail to upregulate costimulatory molecule expression, release IL-12p70, or stimulate alloresponsive T cells after challenge with potent DC-maturation stimuli. Adoptive transfer (i.v.) of BALB/c VD3-MRDC (day −7) significantly prolonged survival of BALB/c heart grafts in B6 mice in the absence of immunosuppressive therapy. Interestingly, we found that in vivo, BALB/c VD3-MRDC induced proliferation of indirect-pathway 1H3.1 CD4 T cells in the spleens of B6 recipient mice, indicating that reprocessing of the BALB/c DC by host (B6) APC does occur. Proliferation of 1H3.1 CD4 T cells in response to BALB/c VD3-MRDC resulted in defective activation (CD62L-high, CD69-low) of the 1H3.1 T cells, leading to their peripheral deletion and to outgrowth of CD4+Foxp3+ Treg cells. Reprocessing of BALB/c VD3-MRDC was performed by recipient splenic CD11c-high CD8α-negative DC, and donor alloAg continued to be presented through the indirect pathway for 3 days after donor DC administration.
Conclusion: These results suggest that DC-based therapies downregulate T cell alloimmunity and prolong allograft survival, at least in part, through reprocessing of the tolerogenic DC into alloAg by recipient APC.

Introduction: Alloreactive memory T cells are present in all transplant recipients due to prior direct sensitization or heterologous immunity. These cells are known to circumvent tolerance induction and/or prevent indefinite graft survival in several models, but the mechanistic details of their function are unknown. The goal of this study was to test the hypothesis that CD8 memory T cells initiate allorecognition and express effector functions within hours of reperfusion. Methods: Syngeneic or A/J (H-2a) hearts were transplanted into WT C57BL/6 (H-2b), CD4-/-, CD8-/-, or Rag1-/- recipients. RNA and protein were prepared from total graft homogenates and analyzed by qRT-PCR and ELISA. Rag1-/- mice received 1x10^6 WT or IFN-γ-/- 2C cells and were used as recipients 10 weeks after reconstitution. Donor-specific CD8 memory cells were purified from WT spleens 8 weeks after A/J skin grafting, and donor-specific effector CD4 cells were purified from spleens of CD90.1 mice 8 days after A/J heart transplantation. Flow cytometry was used to quantify graft infiltrates. Results: Allografts contained elevated levels of IFN-γ and CXCL9 mRNA at 24, 48, and 72 hrs post-transplant vs. isografts. Detectable CXCL9 protein was produced in allografts from WT and CD4-/- recipients but not in isografts or allografts from CD8-/- or Rag1-/- recipients. Treatment with CTLA4-Ig and MR1 failed to reduce CXCL9 production. Reconstitution of Rag1-/- mice with IFN-γ-sufficient or -deficient 2C TCR-transgenic CD8 cells indicated that early allospecific CXCL9 production absolutely requires IFN-γ made by recipient CD8 cells. Although donor-specific IFN-γ production was undetectable in splenocytes until day 4-5 post-transplant, graft-infiltrating CD44hi CD62Llo CD8 T cells were present as early as 24 hrs post-transplant. In adoptive transfer studies, effector-memory CD8 T cells reconstituted early allospecific CXCL9 production in CD8-/- mice. Lastly, primed CD4 T cells adoptively transferred at day 2 post-transplant readily infiltrated allografts in control but not CD8-depleted recipients. Conclusions: CD8 memory T cells infiltrate allografts rapidly post-transplant, produce IFN-γ, and propagate an inflammatory environment that optimizes recruitment of primed effector T cells. Successful neutralization of this early allorecognition pathway should provide valuable adjunctive therapy to improve graft function and survival.

Background: Allogeneic T cell stimulation requires not only antigen-specific signals but also costimulatory signals, most importantly between CD80/86 on the antigen-presenting cell (APC) and CD28 and CTLA4 on the T cell. Engagement of the T cell receptor without costimulation can lead to anergy and the induction of regulatory T cells (Tregs). T cell activation is also controlled by expression of the tryptophan-catabolising enzyme indoleamine 2,3-dioxygenase (IDO). Depletion of this essential amino acid, and/or the production of tryptophan metabolites, inhibits T cell proliferation. Methods: A genetic approach to confer tolerogenic properties on murine dendritic cells (DCs) has been explored using lentiviral vectors based on the Equine Infectious Anaemia Virus.
Firstly, an intracellular method that prevents costimulation was developed: a fusion protein consisting of CTLA4 and KDEL [an endoplasmic reticulum (ER) retention signal] is expressed in DCs. CTLA4-KDEL binds to CD80/86 in the ER and prevents expression of these proteins on the DC surface. A second approach uses elevated expression of the IDO enzyme by transduced DCs. Results: CTLA4-KDEL- or IDO-transduced DCs were unable to induce allogeneic T cell proliferation. However, using two-stage DC:T cell co-culture assays, it was shown that CTLA4-KDEL-, but not IDO-transduced, DCs can induce donor-specific T cell anergy in vitro and in vivo. Tolerance to both the direct and indirect pathways was shown using CTLA4-KDEL-transduced DCs. Linked suppression was mediated by the generation of donor-specific Tregs. IDO-transduced DCs did not generate Tregs. Furthermore, it was shown separately that DCs expressing IDO whilst lacking CD80/86 expression for potential ligation by CTLA4 (although CTLA4-CD80/86 ligation upregulates IDO, it downregulates T cell activation) failed to generate or even sustain FoxP3+ Treg populations. The ability of the transduced DCs to induce tolerance to allografts was assessed in a complete-mismatch and a CBK→CBA (indirect pathway) corneal graft model. These results support a clinical strategy to induce Treg-mediated, donor-specific transplantation tolerance using CTLA4-KDEL- rather than IDO-expressing DCs.

Indirect CD4 T cells that recognise processed alloantigen on recipient APC can provide help to alloreactive cytotoxic CD8 T cells that recognise intact MHC class I alloantigen on donor APC, but exactly how such 'un-linked' help is provided is not clear. The respective abilities of direct- and indirect-pathway CD4 T cells to provide help for cytotoxic CD8 alloimmunity were examined in a mouse model of heart graft rejection in which the recipients contain only monoclonal helper CD4 T cells specific for self-restricted H-Y antigen (female B6 Mar/RAG1-/- mice). Mice were additionally reconstituted with 10^6 B6 CD8 T cells and then challenged with female BALB/c (no CD4 T cell help), male BALB/c (indirect-pathway help), or male B6xBALB/c F1 hearts (direct-pathway help). Un-reconstituted Mar/RAG1-/- mice lack effector B and CD8 T lymphocytes, and consequently all heart grafts survived indefinitely. In contrast, reconstituted Mar/RAG1-/- mice rejected male F1 grafts rapidly (MST 8d), whereas female BALB/c grafts survived indefinitely, confirming a CD4-dependent effector role for the transferred CD8 T cells. CD4 T cell help through the indirect pathway, although sufficient to elicit graft rejection, was less efficient than direct-pathway help, because male BALB/c grafts were rejected more slowly than the F1 grafts (MST 12d, p<0.001). We next considered whether indirect-pathway CD4 T cells provide help through recognition of MHC class II complexes on the surface of alloreactive CD8 T cells, in analogous fashion to the cognate interaction between B and T lymphocytes. In support, reconstitution of Mar/RAG1-/- recipients with MHC class II-deficient CD8 T cells instead resulted in slower rejection of male BALB/c hearts (MST 21d, p<0.01), whereas male F1 grafts, which still permit provision of linked help, were rejected at the same tempo. Most tellingly, Mar/RAG1-/- mice that simultaneously received a female BALB/c heart and male B6 APC (to activate Mar CD4 T cells) rejected their grafts rapidly when reconstituted with male CD8 T cells (MST 9d).
In contrast, grafts survived indefinitely when female CD8 T cells were transferred. Flow cytometric analysis of mitogen-stimulated CD8 T cells revealed surface MHC class II expression. Indirect allorecognition can provide help for generating cytotoxic alloimmunity, but not as effectively as through the direct pathway. Indirect-pathway help is potentiated by linkage through recognition of MHC class II on CD8 T cells.

PURPOSE: Tolerogenic properties of dendritic cells (DC) are supported and preserved by conditioning with the immunosuppressant rapamycin (RAPA). The ability of RAPA-conditioned, recipient-derived DC pulsed with alloantigen (alloAg) to suppress both direct- and indirect-pathway alloAg-specific T cells in the absence of immunosuppression has been demonstrated in a murine allograft model. DC can acquire intact MHC from cells or cell lysates. However, the ability of alloAg-pulsed RAPA-DC to immunomodulate directly reactive alloAg-specific T cells has not been formally demonstrated. METHODS: DC were generated from C57BL/6 (B6; H2b) bone marrow cells in GM-CSF and IL-4. RAPA was added to indicated cultures (RAPA-DC) beginning on day (d) 2. On d7, CD11c+ bead-purified RAPA-DC or non-treated control DC (CTR-DC) were incubated with BALB/c (H2d) splenocyte lysates ("alloAg pulsing"). Following incubation, the DC were harvested and the levels of donor and recipient MHC molecules on CD11c+ cells determined by flow cytometry and immunofluorescence imaging. Surface levels of CD86, B7-H1 (Programmed Death Ligand-1; PD-L1), and Fas-L were compared. Pulsed DC were also incubated with CD8+ T cells from Rag-/- 2C mice for 3d. 2C CD8+ cells express T cell receptors specific for H2-Ld, an MHC class I molecule of BALB/c. Following incubation, 2C cell proliferation and apoptosis were both assessed. RESULTS: CTR- and RAPA-DC presented detectable levels of directly transferred MHC class I and II on their surface after incubation with allogeneic BALB/c cell lysate. Donor MHC presented by "pulsed" recipient DC stimulated directly reactive, alloAg-specific 2C T cell proliferation. However, only RAPA-DC induced apoptosis in the overwhelming majority of these cells responding via the direct pathway. Induction of apoptosis correlated with an increased level of surface Fas-L on RAPA-DC and their comparatively low level of CD86 relative to PD-L1. CONCLUSIONS: RAPA-conditioned DC can present intact MHC molecules acquired from lysates of allogeneic splenocytes and concurrently induce apoptosis of directly reactive alloAg-specific CD8+ T cells. As such, we provide insight into one mechanism by which alloAg-pulsed, recipient-derived RAPA-DC may facilitate allograft tolerance.

Background: Tregs actively regulate alloimmune responses and promote transplant tolerance. ATG, a widely used induction therapy in organ transplantation, depletes peripheral T cells but may preferentially spare Tregs. Sirolimus is thought to expand natural Tregs. B7 T cell costimulatory blockade inhibits effector T cell (Teff) expansion and may promote regulation.
We investigated the effect of combining mouse ATG (mATG), CTLA4Ig, and Sirolimus on stringent skin allograft survival, and studied the mechanisms by determining the Treg/Teff balance in vivo using a unique model (ABM-TCRtg-Foxp3/gfp reporter mouse). Conclusion: This is the first report to establish that T cell depletion with mATG combined with CTLA4Ig and Sirolimus synergizes to prolong stringent, fully allogeneic skin allograft survival by promoting regulation and tipping the Treg/Teff balance, both preserving existing Tregs and facilitating generation of new Tregs by a conversion mechanism. These results provide the rationale for translating such a novel therapeutic combination to promote regulation and tolerance in primates and human organ transplantation.

Expansion of Cynomolgus CD4+CD25+FoxP3+ Regulatory T Cells Using Low Dose Anti-Thymocyte Globulin. To test low-dose ATG in vivo, 2 mg/kg (10% of the depleting dose) was administered three times (days 0, 1, and 2) to a naïve monkey and to a monkey treated concurrently with sirolimus (trough 15-20 ng/ml) after heart transplantation. In the naïve monkey, low-dose ATG led to expansion of CD4+CD25+FoxP3+ Tregs in peripheral blood (baseline 0.57%, day 7 = 4.09%, day 12 = 2.7%) and in lymph nodes (baseline 2.47%, day 7 = 9.35%, day 12 = 7.08%) without causing T cell depletion. Similarly, in the transplanted monkey, peripheral blood Tregs expanded from 1.4% at baseline to 4.64% on day 6. Low-dose ATG is not only able to expand Tregs ex vivo by proliferation of natural CD4+CD25+ cells, but can equally induce Tregs in vivo without lymphodepletion. These findings provide the rationale for development of tolerance-inducing strategies based on enhancing regulatory mechanisms in human transplant recipients.

Background/Aim: Previously, we have shown that the combination of a human anti-CD40 mAb, 4D11, and tacrolimus exerts an additive immunosuppressive effect and markedly prolongs renal allograft survival in cynomolgus monkeys. In this study, we further evaluated the immunological aspects of these transplant recipients. Method: Kidney transplantations were performed across MHC-mismatched cynomolgus monkeys. Transplant recipients were given either no treatment, tacrolimus (1 mg/kg/day, po), 4D11 (5 mg/kg, iv), or tacrolimus + 4D11 (n=3/group). Peripheral lymphocyte populations, MLR, serum anti-donor antibody levels, and graft histology were assessed. Results: Mean graft survival for the no-treatment, tacrolimus, 4D11, and tacrolimus+4D11 treatment groups was 6.0±1.0, 27.7±3.2, 120.3±90.6, and 213.7±28.0 days, respectively. Peripheral CD20+ cells partially declined in both the 4D11-alone and 4D11+tacrolimus animals in the early postoperative period, although the numbers recovered thereafter. CD4+ and CD8+ cells were unaffected. The CD4+ effector memory population was reduced by addition of tacrolimus to 4D11 (Fig. 1A). MLR against donor and 3rd-party antigens was suppressed in both the 4D11 and tacrolimus+4D11 groups (Fig. 1B). Addition of tacrolimus further reduced graft CD4+, CD8+, and CD20+ cellular infiltration (Fig. 1C). Anti-donor antibodies were detected in sera during the treatment course of 4D11; however, they did not develop under tacrolimus+4D11 treatment. Graft C4d deposition correlated with serum anti-donor antibody levels. 4D11 inhibits both cellular and humoral responses against donor antigens. Addition of tacrolimus strengthens these immunosuppressive effects of 4D11, leading to further prolongation of graft survival.
Objectives: Allogeneic islet transplantation offers the potential for cure of diabetes. Application of this therapy, however, is limited by immunologic mechanisms requiring medical therapy to prevent rejection of the islets. Costimulatory blockade of the CD28/CD80/CD86 and the CD40/CD154 pathways has shown promise in ameliorating the immune response to allow engraftment and function of islets. We have evaluated a new drug regimen consisting of induction therapy with 3A8, a murine anti-CD40 antibody, and basiliximab, and maintenance treatment with CTLA4Ig and sirolimus in diabetic rhesus macaques receiving allogeneic islets. Methods: Allogeneic rhesus macaque islets (14,100 ± 1,802 IE/kg) were transplanted intraportally into diabetic rhesus macaques (n=4) under the following immunosuppressive regimen: short-term administration of anti-IL-2 receptor (basiliximab) and anti-CD40 (3A8), with maintenance immunosuppression using sirolimus for 134 days and abatacept (CTLA4Ig) for long-term therapy. Weekly peripheral blood flow cytometric and CMV viral load monitoring was performed. Results: Recipients treated with this immunosuppressive regimen had immediate return to normoglycemia following islet transplant. Graft survival in the first three animals was 141, 283, and 297 days. The fourth animal continues to exhibit good glycemic control at its current postoperative day 30. Each of these animals had monthly intravenous glucose tolerance tests with monitoring of blood glucose and C-peptide, with further evidence of glycemic response and C-peptide generation. Flow cytometry confirms CD40 blockade during the administration of 3A8 and return of CD40 after cessation of therapy. The treatment was well tolerated, with minimal evidence of CMV reactivation and no evidence of thrombocytopenia or thromboembolism. Conclusions: These preliminary results indicate that CD28/CD40 costimulation-based immunosuppressive regimens can protect allogeneic islets from rejection. Furthermore, 3A8 appears to adequately block CD40 to facilitate engraftment and function, as demonstrated by flow cytometry.

Iwami, 1,2 Qi Zhang, 1 Osamu Aramaki, 1 Nozomu Shirasugi, 1 Katsuya Nonomura, 2 Masanori Niimi. 1 1 Surgery, Teikyo University, Tokyo, Japan; 2 Renal and Genitourinary Surgery, Hokkaido University, Sapporo, Japan. Many studies have shown immunosuppressive effects of dietary intake of fish oil containing eicosapentaenoic acid (EPA) in various models such as autoimmune disease and transplantation. However, its mechanisms remain uncertain. Furthermore, there have been no studies examining the effect of purified EPA. Here we determined the ability of purified EPA to inhibit the alloimmune response in a mouse cardiac transplantation model. Methods: CBA recipients (H-2k) were given a single intraperitoneal injection of purified EPA on the same day as transplantation of a heart from C57BL/10 donors (H-2b). Mixed leukocyte reaction (MLR) and enzyme-linked immunosorbent assays (ELISA) were also performed to evaluate the effect of purified EPA on cell proliferation and cytokine production. To determine the presence of regulatory cells, an adoptive transfer study was conducted. Results: Untreated CBA recipients rejected C57BL/10 cardiac allografts with a median survival time (MST) of 8 days. In contrast, CBA recipients treated with purified EPA (1.0 g/kg) had significant prolongation of allograft survival (MST >100 days). CBA recipients treated with 0.1 g/kg purified EPA eventually rejected their allografts (MST, 13 days).
In MLR assays, treatment with 1.0 g/kg purified EPA suppressed alloproliferation of recipient splenocytes. The treatment also inhibited production of IL-2, IL-12, and IFN-γ by recipient splenocytes. When splenocytes were harvested from recipients treated with 1.0 g/kg purified EPA 50 days after cardiac allografting and adoptively transferred into naïve secondary recipients, the transfer induced significant prolongation of cardiac allograft survival in the naïve secondary recipients (MST >30 days, compared with an MST of 10 days in recipients receiving adoptive transfer of naïve splenocytes). Conclusions: Purified EPA induced significantly prolonged survival of fully mismatched cardiac allografts and generated regulatory cells.

Background: Chronic allograft nephropathy (CAN), the most common cause of late kidney allograft failure, is not effectively prevented by current regimens. Activation of extracellular signal-regulated kinases 1/2 (ERK1/2), which mediate intracellular signal transduction from various growth factor stimuli, is required for TGF-β production, which plays a key role in the development of CAN. Hence, the therapeutic potential of disrupting ERK1/2 signaling to prevent CAN was examined in an experimental model. Methods: Donor kidneys from C57BL/6J mice (H-2b) were transplanted into bilaterally nephrectomized Balb/c recipient mice (H-2d). The recipients were treated with CI1040 (a MEK-ERK1/2 inhibitor) or vehicle beginning 14 days post-transplantation for 28 days. CAN was evaluated with the Banff 97 working classification. Results: All six allografts receiving CI1040 treatment survived, while two out of seven grafts were lost in the vehicle-treated group. At the end of the experiment, graft function in CI1040-treated recipients had been maintained, indicated by lower levels of serum creatinine and BUN (30±6 µM and 22±8 mM, n=6) as compared to those (94±39 µM and 56±25 mM, n=5) in the vehicle group (creatinine, p=0.0015; BUN, p=0.0054). Pathological evaluation indicated that CI1040 reduced CAN, reflected by a lower CAN score in the CI1040-treated group (3.93, n=4) as compared to that (9.96, n=5) in vehicle controls (p=0.0234). Further examination showed that CI1040 treatment resulted in inhibition of ERK1/2 phosphorylation and reduction of TGF-β levels in grafts. In vitro, CI1040 potently suppressed not only growth factor-stimulated ERK1/2 activation and TGF-β biosynthesis in renal tubular epithelial cells, but also attenuated alloantigen-stimulated T cell proliferation. Conclusion: Our data suggest that interference with ERK1/2 signaling by a pharmacological agent (i.e., CI1040) has therapeutic potential to prevent CAN in kidney transplantation.

Objective: This is the first study to investigate the role of a novel JAK3 and Syk inhibitor, R348, in the prevention of obliterative airway disease (OAD), the major obstacle after lung transplantation. Methods: Tracheas from Brown-Norway (BN) donors were heterotopically transplanted into the greater omentum of Lewis (Lew) rats. Recipients were treated for 28 days with R348 (10, 20, 40, or 80 mg/kg), rapamycin (0.75 or 3 mg/kg), or left untreated. Allografts were recovered and processed for histological evaluation to determine the degree of luminal obliteration, percentage of respiratory epithelial coverage, and mononuclear cell infiltration. Donor-reactive (IgG) antibodies in the recipients' serum were determined using flow cytometry.
Results: R348 at 20, 40, and 80 mg/kg significantly inhibited luminal obliteration in a dose-dependent manner (69±20%, 20±13%, 15±7%; p=0.003 vs. no medication). Rapamycin at both concentrations significantly inhibited luminal obliteration (37±15%, 11±6%; p<0.001 vs. no medication), similarly to R348 at 40 and 80 mg/kg. R348 at 40 and 80 mg/kg significantly preserved respiratory epithelium compared to R348 at 10 and 20 mg/kg (49±35%, 76±27% vs. 0±0%, 3±7%; p=0.004) and was superior to rapamycin in epithelial preservation (49±35%, 76±27% vs. 27±17%, 36±15%; p=0.01). All R348- and rapamycin-treated recipients showed decreased numbers of peritracheal mononuclear cells in a dose-dependent manner (p<0.0001). Recipients treated with R348 at 20, 40, and 80 mg/kg had significantly reduced IgG levels versus untreated recipients (381±136, 371±61, 230±56 vs. 853±207; p<0.03). Thymus and spleen weights of all R348-treated recipients were significantly lower compared to the untreated group (p=0.001). BUN, Cr, and cholesterol levels were unaffected in R348-treated recipients. Conclusion: R348 appears to exert its inhibitory effect by preserving the respiratory epithelium, rather than through rapamycin's mechanism of reduced smooth muscle cell (SMC) proliferation. R348 has a favorable pharmacokinetic profile, lacks nephrotoxic and atherogenic properties, and provides a promising alternative to rapamycin in the treatment of chronic rejection in lung transplant recipients.

Genz-29155 is a novel, oral immune-modulatory agent identified in a high-throughput screen designed to find inhibitors of TNFα-induced apoptosis. The molecular target of the compound remains under investigation but is likely downstream of the TNFα cell surface receptor. In vitro studies have shown Genz-29155 to be an effective inhibitor of the TNFα-triggered caspase cascade but not of anti-CD3- or Fas-mediated apoptosis; it may thus act by inducing allograft resistance to immune attack rather than suppressing the alloimmune response per se. It has been shown to synergize with sirolimus in murine heterotopic cardiac allotransplant models. In order to test this promising new agent in a more clinically relevant model of solid organ transplantation, we studied Genz-29155 in a mismatched NHP (rhesus macaque) renal transplant model. Genz-29155 (n=3) was administered (3 mg/kg, IV, days 0-14) with sirolimus (1 mg/kg, PO, days 0-30). Five control animals received only sirolimus and vehicle (1 mg/kg, PO, days 0-30). All animals were followed serially by polychromatic flow cytometry to determine the relative and absolute numbers of CD4+ and CD8+ T cell subsets. Time to allograft rejection, the primary end point, was determined by a significant rise in serum creatinine and BUN, as identified by biweekly monitoring. After diagnosis of rejection, allografts were removed for histological and transcriptional studies, along with splenocytes for immune function assays. In this pilot study, rejection-free survival was significantly prolonged with Genz-29155 plus sirolimus vs. sirolimus alone (42.67 days vs. 17.20 days, respectively; p=0.05). Given these initial results, we have initiated a larger study (n=12) to optimize the dose and duration of Genz-29155. Five animals, transplanted within the past month, remain alive and well in this study.
Further investigation of this agent will allow us to better understand the benefit of inhibiting TNFα-mediated apoptotic effects on both the cellular alloimmune response and allograft injury in solid organ transplantation.

Purpose: Allospecific memory T cell responses are present in transplant recipients from exposure to cross-reacting antigens. We have previously reported that LFA-1 inhibition suppresses primary CD8-dependent rejection responses that are not controlled by any conventional immunosuppressive strategy. These studies were conducted to analyze the efficacy of this anti-LFA-1 Ab for control of CD8-dependent responses in sensitized hosts. Methods: FVB/N (H-2q) donor hepatocytes were transplanted into C57BL/6 (H-2b) or CD4 KO (H-2b) recipients. Memory responses were analyzed by retransplantation with a second FVB/N allogeneic hepatocyte transplant. Cohorts of mice were treated with anti-LFA-1 mAb and observed for hepatocyte survival or the magnitude of CD8+ T cell-mediated allospecific cytolytic activity. Results: Untreated secondary CD4 KO and C57BL/6 recipients rejected hepatocyte allografts with enhanced kinetics in comparison to the primary graft (MST day 10 vs. day 14, and MST day 7 vs. day 10, respectively; p<0.05). Anti-LFA-1 mAb-treated CD4 KO recipients demonstrated delayed rejection (MST day 35 vs. day 10; p=0.001) compared to secondary rejection in untreated CD4 KO hosts. Anti-LFA-1 mAb treatment did not delay rejection in sensitized C57BL/6 recipients (MST day 7) but did significantly reduce in vivo allospecific cytotoxic effector function in C57BL/6 secondary recipients (20.6±4.1%; p=0.001) as compared to untreated controls (97±1.3%). The residual cytotoxicity observed in anti-LFA-1 mAb-treated C57BL/6 recipients is comparable to the in vivo cytotoxicity of CD8-depleted C57BL/6 secondary recipients (12.1±8.9%) and is likely mediated by alloantibody. In fact, the level of allospecific cytotoxicity in anti-LFA-1 mAb-treated sensitized C57BL/6 recipients correlated with the amount of alloantibody present in recipient serum. Conclusion: Treatment with anti-LFA-1 mAb delayed (CD4-independent) CD8-dependent rejection in sensitized CD4 KO recipients but did not delay rejection in sensitized CD4-sufficient C57BL/6 recipients. Although anti-LFA-1 mAb treatment significantly reduced in vivo allospecific cytotoxic effector function in sensitized C57BL/6 mice, it did not delay rejection, likely because of alloantibody-mediated rejection in sensitized C57BL/6 (but not CD4 KO) recipients, which is not suppressed by anti-LFA-1 mAb.

Cytomegalovirus (CMV) represents a major cause of infectious complications after transplantation. Recently, chronic infections with LCMV, HIV, or HCV were shown to be associated with functionally anergic T cells characterized by high expression of the programmed death (PD)-1 molecule. This study was carried out to characterize functional exhaustion of CMV-specific CD4 T cells as a determinant of impaired CMV control, and to elucidate whether the PD-1 pathway may be operative in active CMV infection after renal transplantation. CMV-specific CD4 T cells from 13 controls, 31 hemodialysis patients, and 51 renal transplant patients were quantified using flow cytometry and analysed for their expression of PD-1 and the cytokines IFNγ and IL-2. CMV-specific proliferation was analysed by CFDA-SE dilution.
In viremic transplant recipients, a significantly higher proportion of CMV-specific CD4 T cells was PD-1 positive (median 40.9%) as compared to non-viremic transplant patients (8.8%), dialysis patients (8.8%), or controls (3.1%, p<0.0001). In line with functional impairment, PD-1-positive T cells produced significantly less IFNγ per single cell as compared to PD-1-negative T cells (mean fluorescence intensity 129.4±52.6 versus 256.6±96.1, p<0.0001). Moreover, unlike controls or non-viremic patients, the majority of CMV-specific T cells from viremic patients showed a long-term loss of IL-2 production. Interestingly, functional anergy of PD-1-positive CMV-specific CD4 T cells was reversible, in that antibody-mediated blockade of PD-1 signaling through its ligands PD-L1/PD-L2 led to a 10-fold increase in CMV-specific proliferation. In conclusion, expression of PD-1 defines a reversible defect of CMV-specific CD4 T cells, and blocking PD-1 signaling may provide a potential target for enhancing the function of exhausted T cells in chronic CMV infection.

Background: Some patients with CMV disease may be simultaneously infected with multiple viral strains. It is unknown whether different strains clear differently once antiviral therapy commences. We assessed response to antiviral therapy in patients with simultaneous co-infection with multiple strains of CMV. Methods: PCR-based strain typing of CMV was performed using the glycoprotein B gene of CMV (gB1-4) in a cohort of organ transplant recipients with CMV disease. From this, 89 patients were identified who had simultaneous infection with ≥2 CMV strains. Quantitative assessment of each of the strain types was performed at regular intervals after starting antiviral therapy. Results: The different types of multi-strain infection were gB1+gB2 (11/89, 12%), gB1+gB3 (22/89, 25%), gB1+gB4 (7/89, 8%), gB2+gB3 (10/89, 11%), gB2+gB4 (6/89, 7%), and gB3+gB4 (8/89, 9%). 25/89 (28%) were simultaneously infected with 3 or 4 different genotypes. Within individual patients, there was a trend for gB1 CMV load (4.27 log genomes) to be lower than the other genotypes (p=0.05-0.07) at the onset of disease. Decay kinetics for all genotypes showed a biphasic response, with a 1st-phase decline of ∼0.7 days and a 2nd-phase decline of ∼3 days. 1st-phase declines were fastest for gB1 (p=0.0001 vs. gB2 and gB4), while gB4 declined more slowly than gB3 during the 1st phase. 2nd-phase declines were similar between gB1 and gB3 (3 days and 2.9 days) but were slower for gB2 and gB4 (3.45 days; p=0.01). There was a significant correlation between the 1st-phase decline and the log decline from baseline by day 21 (r=0.37; p=0.002). Relative fitness calculations revealed complex fitness dynamics between genotypes, although gB3 was always less fit than gB1, gB2, and gB4, and gB4 and gB2 were always less fit than gB1. Conclusion: In patients with CMV disease who have simultaneous co-infection with multiple strains, the 1st- and 2nd-phase declines of gB2 and gB4 are significantly slower than those of gB1, with a lower log decline from baseline by day 21. These data indicate either that a significant fitness difference exists between CMV strains or that antiviral control of replication may be linked to CMV gB genotype, and should aid our understanding of treatment success and failure.
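The biphasic decay described above can be made concrete with a simple two-compartment model. The sketch below assumes the reported 1st- and 2nd-phase declines behave as half-lives of a fast- and a slow-clearing compartment; the baseline load and the compartment split are hypothetical illustration values, not the study's fitted parameters.

```python
import numpy as np

def biphasic_load(t, v0, frac_fast, t_half_1, t_half_2):
    """Viral load at time t (days) for a two-compartment decay model.

    frac_fast is the fraction of baseline load cleared with the fast
    (1st-phase) half-life; the remainder decays with the slow half-life.
    """
    lam1 = np.log(2) / t_half_1
    lam2 = np.log(2) / t_half_2
    return v0 * (frac_fast * np.exp(-lam1 * t) + (1 - frac_fast) * np.exp(-lam2 * t))

# Half-lives of ~0.7 d (1st phase) and ~3 d (2nd phase) as reported above;
# v0 (~4.27 log genomes) and frac_fast are hypothetical.
t = np.linspace(0, 21, 22)
load = biphasic_load(t, v0=10**4.27, frac_fast=0.9, t_half_1=0.7, t_half_2=3.0)
print(f"log10 decline by day 21: {np.log10(load[0] / load[-1]):.2f}")
```

Under such a model, a slower 1st-phase half-life mechanically produces a smaller log decline from baseline by day 21, consistent with the correlation reported above.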
Introduction: Parvovirus B19 (PVB19) is a single-stranded DNA virus that was first reported to affect transplant (tx) recipients around 20 years ago. In the kidney tx setting, PVB19 has been reported to cause anemia and proteinuria. The reported incidence ranges from 0% to 12% in general kidney transplant cohorts and is as high as 38% in an anemic kidney tx population. Here we report our incidence of PVB19 infection over a 12-year span. Patients and Methods: All records of kidney tx recipients from 1996 until 2007 were reviewed for the presence of PVB19 infection. There were 834 kidney tx performed during this period. Diagnosis of PVB19 infection was made either by detection of PVB19 via PCR in a blood/tissue sample or by detection of virus in renal tissue by immunostain. In patients found to have PVB19 infection, the presence of anemia, proteinuria, concurrent infection, and acute rejection was examined. Response to treatment with IVIG was also evaluated. Results: The incidence of infection was 2.4%, with 20 patients found to have evidence of infection. Average time from tx to diagnosis of infection was 21.6 months (range 3 days-86 months). Average creatinine at diagnosis was 2.32 mg/dl. Anemia was present in 85% of patients, with an average hematocrit of 30.8%. Proteinuria was present in 45% of patients with evidence of PVB19 infection. Co-infection was noted in 5 patients (4 CMV, 1 EBV), and acute rejection was noted in 30% of individuals within 3 months of diagnosis. Collapsing glomerulopathy (CG) was present in 5 patients, all of whom had subsequent graft loss at an average of 6 months after diagnosis. 4 of the 5 patients with CG had proteinuria along with anemia and were Caucasian. 90% (18/20) of all patients with evidence of PVB19 infection received IVIG and cleared their infection. One of the remaining 2 patients without IVIG spontaneously cleared the virus. Conclusion: Although the incidence of PVB19 infection in our kidney tx cohort was very low, its presence portends an unfavorable outcome. CG associated with PVB19 is an especially devastating lesion with very poor outcomes. Response to treatment with IVIG and reduction of immunosuppression is variable. Based on our data, it seems reasonable to screen all tx patients with unexplained anemia and concurrent proteinuria, as early detection of PVB19 may be crucial.

Background: Prior to transplant, screening for latent tuberculosis infection (LTBI) by tuberculin skin test (TST) is recommended. The accuracy of TST in end-stage renal disease, however, may be limited. The QuantiFERON®-TB Gold assay (QFT) detects interferon-γ produced by peripheral blood T cells in response to TB-specific antigens and may be more accurate for diagnosis of LTBI. Methods: This prospective single-center study compared the TST to QFT for the diagnosis of LTBI in a cohort of adult patients listed or undergoing workup for renal transplantation. All patients had both TST and QFT performed. Additional data collected included demographics, TB risk history, and chest x-ray results. Based on demographic and radiographic findings, patients were classified as high or low risk for LTBI. A positive TST was defined as ≥5 mm and a positive QFT as ≥0.35 IU/ml. Results: A total of 45 patients were enrolled. Complete data were available for 38 subjects (5 did not return to have the TST read). The mean age was 50.4 ± 12.2 years, with 21 (55.3%) males and 17 (44.7%) females. The most common etiologies of renal disease were diabetes (52.6%) and glomerulonephritis (26.3%). Most subjects (35 of 38) were on renal replacement therapy (hemodialysis in 73.7% and peritoneal dialysis in 18.4%).
Twenty (52.6%) subjects had received BCG, and 10 (26.3%) were born in or had lived in a country with a TB prevalence rate >20/100,000 population. Fifteen (39.5%) subjects were considered to be at high risk for LTBI. Overall, 5 (13.2%) had a positive TST and 3 (7.9%) had a positive QFT. The QFT was indeterminate in 1 subject due to a low mitogen response. Agreement between the tests was 92% (κ=0.72, p<0.0001). In low-risk subjects (n=23), the TST was negative in all, and the QFT was negative in 22 and indeterminate in 1. In clinically high-risk subjects, 5 (33%) had a positive TST and 3 (20%) had a positive QFT. The 2 subjects with discordant results, both from TB-endemic countries, had completed treatment for LTBI 4 years prior and remained TST positive but were QFT negative. In renal transplant candidates, the TST and QFT are comparable for the diagnosis of LTBI. The QFT has the advantage of being completed in a single visit, and in our cohort indeterminate results were uncommon.
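For readers unfamiliar with the kappa statistic quoted above, the sketch below shows how Cohen's kappa is computed from a 2x2 concordance table. The counts used are our reconstruction from the reported totals (5 TST+, 3 QFT+, 2 discordant subjects, 1 indeterminate excluded), not published raw data, although they do reproduce κ≈0.72.

```python
def cohens_kappa(both_pos, pos_neg, neg_pos, both_neg):
    """Cohen's kappa for agreement between two binary tests,
    given the four cells of a 2x2 concordance table."""
    n = both_pos + pos_neg + neg_pos + both_neg
    p_observed = (both_pos + both_neg) / n
    # Chance agreement from each test's marginal positive/negative rates
    p_test1_pos = (both_pos + pos_neg) / n
    p_test2_pos = (both_pos + neg_pos) / n
    p_expected = (p_test1_pos * p_test2_pos
                  + (1 - p_test1_pos) * (1 - p_test2_pos))
    return (p_observed - p_expected) / (1 - p_expected)

# Reconstructed table: 3 TST+/QFT+, 2 TST+/QFT-, 0 TST-/QFT+, 32 TST-/QFT-
print(round(cohens_kappa(3, 2, 0, 32), 2))  # -> 0.72
```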
Optimal Utilization of HTLV I/II Positive Organs - A Nationwide Survey. Objective: We recently presented data from the UNOS database demonstrating no significant difference in graft or patient survival between HTLV I/II (+) and (-) liver recipients. Several Organ Procurement Organizations (OPOs), including our own, do not offer HTLV I/II-positive organs, while many others find it difficult to place them. Despite this, the number of HTLV (+) organs is increasing, with 31 utilized in 2006 alone. This prompted us to evaluate the practical difficulties in placing these organs so as to improve utilization of these "high-risk" life-saving organs. Method: A telephone/email survey of all 65 OPOs in the USA was conducted over a 2-month period from October to November 2007. Results: Of the 65 OPOs, 42 responded. All screen for HTLV I/II with ELISA. 38 centers confirm with repeat ELISA, 27 confirm with Western blot, and 6 centers do not pursue confirmation further. 36 of the 42 centers offer HTLV I/II-positive organs. There were a total of 231 positive donors in the past 5 years, of which organs from 109 donors (47.2%) were placed. 27 centers offer all the organs, while 15 offer one or more organs selectively based on accepting centers. None have been able to place the pancreas. Only liver and kidney were commonly accepted. Several centers noted a high false-positive rate. Based on UNOS regional analysis data, 43% of the organs are utilized in New York state alone. Many OPOs did not know which particular centers accept these organs and consequently spend substantial time and effort placing them. A majority wanted a list of transplant centers that accept these organs. Conclusions: HTLV I/II organs are being underutilized. Moreover, our prior analysis of UNOS data shows that these life-saving organs are shared more nationally than loco-regionally, which is associated with poorer outcomes. Increased knowledge of successful HTLV (+) donation and of the centers willing to utilize these organs in the appropriate setting will help expand the donor pool and decrease mortality on the waiting list.

Traditional two-drug chronic immunosuppression (IS) used in organ transplantation (Tx) is associated with development of EBV-driven complications because of impairment of antiviral CD8+ T cell surveillance. Since the long-term impact of alemtuzumab preconditioning combined with tacrolimus monotherapy on EBV immunity after Tx has not been studied, we aimed here to analyze the frequency and function of peripheral blood EBV-specific CD8+ T cells. Thirteen EBV+ stable kidney transplant (KTx) recipients and 12 EBV+ healthy controls were recruited to this cross-sectional study. All patients received alemtuzumab preconditioning, followed by tacrolimus monotherapy. Blood samples were collected at least 1 year post-Tx to allow immune reconstitution. The EBV-specific CD8+ T cell phenotype and function were screened by flow cytometry and IFN-γ ELISPOT assay. HLA-A*0201-restricted EBV lytic (BMLF1) and latent (LMP2a) peptides were used to generate tetramer (TMR) probes and for functional screening by ELISPOT. Circulating CD8+ T cells from KTx patients had recovered by 1 year and were comparable to those of healthy controls (28.6%±12.9 vs. 23%±6.6, p=0.2). Moreover, the memory distribution and the frequency of EBV-specific CD8+ T cells detected in patients and controls (BMLF1-specific: 0.56%±0.96 vs. 0.88%±1.08, p=0.46; LMP2-specific: 0.05%±0.06 vs. 0.24%±0.40, p=0.09) were similar. In contrast, the frequency of functional Type-1 (IFN-γ-producing) EBV-specific CD8+ T cells was significantly lower in KTx patients than in healthy controls (BMLF1: 47±24 spots/10^5 CD8 T cells vs. 164±134, p=0.06; LMP2: 9±8 vs. 64±54, p=0.03). Accordingly, on average, only 8-12% of circulating EBV-specific CD8+ T cells from KTx patients produced IFN-γ, while 20-30% of effector cells were functional in healthy controls. Addition of IL-2 (20 IU/ml) during the ELISPOT assay reversed the hypo-responsiveness of Type-1 (IFN-γ) EBV-specific CD8+ T cells in patients (range 3-8-fold increase), suggesting that these effector cells were anergic. These results support the notion that alemtuzumab-induced lymphocyte depletion followed by tacrolimus monotherapy renders EBV-specific CD8+ T cells anergic in vivo, a state that can be readily reversed by cytokines such as IL-2, which are commonly released during immune activation.
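The functional fractions quoted above (8-12% in patients vs. 20-30% in controls) follow from dividing the ELISPOT response frequency by the tetramer-defined precursor frequency. A minimal sketch of that arithmetic, using the reported BMLF1 means (the input values come from the abstract; the calculation itself is our illustration and approximates the quoted ranges):

```python
# Fraction of Ag-specific CD8+ T cells that secrete IFN-g:
# (ELISPOT spots per 1e5 CD8 T cells) / (tetramer+ cells per 1e5 CD8 T cells)
def functional_fraction(spots_per_1e5, tetramer_pct):
    specific_per_1e5 = tetramer_pct / 100 * 1e5  # tetramer+ cells per 1e5 CD8
    return spots_per_1e5 / specific_per_1e5

# Mean BMLF1 (lytic) values reported above
print(f"KTx:     {functional_fraction(47, 0.56):.0%}")   # ~8%
print(f"Control: {functional_fraction(164, 0.88):.0%}")  # ~19%
```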
Background: The epidemiology of the transmission of cytomegalovirus (CMV) from organ donors to recipients is not completely understood. We studied donor-to-recipient transmission patterns by analyzing viral genomic variants through CMV glycoprotein B (gB) genotyping by real-time PCR. Polymorphisms in gB UL55 allow discrimination of 4 distinct genomic variants (gB1-4). Methods: Organ transplant recipient pairs or triplets were included in the study if: a) they had CMV infection, b) they received an organ from a CMV-seropositive donor, and c) there was at least one other recipient from the same donor who also developed CMV infection. Genotyping (gB1-4) was performed by quantitative real-time PCR on stored blood samples. Clinical charts were reviewed to evaluate the clinical characteristics and outcome of CMV infection. Results: Of the 78 CMV-seropositive donors screened, 21 were multiple-organ donors for whom 2 or more recipients developed CMV infection. The total number of recipients from these 21 donors was 86 (median of 4 recipients per donor). Of these recipients, 47 (55%) had CMV infection (16 recipient pairs and 5 recipient triplets). The prevalence of genotypes was gB1 (n=22; 47%), gB2 (n=11; 23%), gB3 (n=4; 9%), and gB4 (n=0). Mixed infection with two concurrent genotypes was present in 10 patients (21%). Overall concordance between CMV gB genotypes in recipient pairs was 62.5% (10/16). When both recipients were CMV-seronegative (D+/R-), gB concordance was 67% (2/3 pairs). gB concordance was 70% (7/10 pairs) when one recipient was seronegative and the other seropositive, and 33% (1/3 pairs) when both recipients were seropositive. Concordance between genotypes was seen in 2/5 (40%) recipient triplets. In seropositive recipients with CMV viremia, the origin of the CMV strain was thought to be donor-derived in 11/16 (69%) and reactivation of the recipient's own virus in 5/16 (31%) of cases. No difference in clinical outcome or organ tropism was seen between genotypes. Based on an analysis of strain concordance among recipients from common donors, transmission patterns of CMV can be assessed. In D+/R+ transplant patients, donor-strain superinfection accounts for approximately two-thirds of CMV infections.

Background: Although MAP kinases have been implicated in the pathophysiology of liver IRI, their functional significance in the mechanism of TLR4-mediated pro-inflammatory immune regulation remains to be elucidated. Methods: MAP kinase activation in a murine model of liver warm IRI (90 min ischemia, 6 h reperfusion) was determined by Western blot. Chemical inhibitors of Erk (U0126; 10 µM in vitro or 160 mg/kg in vivo), JNK (SP600125; 50 µM or 30 mg/kg), and p38 (SB203580; 10 µM or 100 mg/kg) MAP kinases were utilized in vitro in primary BM-derived macrophage cultures stimulated with LPS (10 ng/ml), or in vivo in liver pro-inflammatory immune responses induced by LPS (1 µg/mouse, i.p.) or IRI. Results: Erk and JNK, but not p38, MAP kinase activation was readily detected in liver IRI. In primary macrophage cultures, LPS induced pro- and anti-inflammatory genes, including TNF-α, IL-1β, IL-6, IL-10, iNOS, and CXCL10. The Erk inhibitor mainly suppressed IL-1β and IL-10 (80% and 90%, respectively), whereas the JNK inhibitor suppressed the majority of genes. In LPS-induced liver inflammation, the Erk inhibitor suppressed IL-6, IL-1β, iNOS, and IL-10 by >50%, but failed to affect TNF-α/CXCL10. The JNK inhibitor, on the other hand, preferentially inhibited pro-inflammatory genes but only marginally affected IL-10 (<30%), and produced comparable suppression of pro-/anti-inflammatory genes (Figure 1). Interestingly, TNF-α was the gene least responsive to MAP kinase regulation. Conclusion: Erk and JNK MAP kinase activation 1) is required for TLR4 activation-induced pro-inflammatory gene induction, and 2) plays a critical role in the development of the IR-mediated liver immune response/tissue injury.

Background: JAK/STAT signaling is one of the major pathways for cytokine signal transduction. Signal transducer and activator of transcription 1 (STAT1) is mainly activated by IFN-α/β and IFN-γ. The activation of STAT1 by IFN-γ has been implicated in hepatic inflammation. We have shown that activation of the Toll-like receptor (TLR) 4 complex initiates a pro-inflammatory response leading to liver ischemia/reperfusion injury (IRI). Indeed, TLR4 signaling in vitro activates STAT1, which in turn triggers production of type-1 IFN-dependent CXCL10 (IP-10). This study was designed to analyze the cross-talk between STAT1 and the MAP kinase (ERK) downstream of JAK/STAT signaling pathways. Methods & Results: We used a mouse liver model of partial warm ischemia (90 min) followed by reperfusion (6 h). First, we employed STAT1 KO (n=17) and control WT (n=16) mice.
Hepatocellular damage, as measured by sALT levels (IU/L), was significantly decreased in 10/17 STAT1 KO mice (p<0.005); the remaining 7/17 STAT1 KO mice showed sALT levels comparable with WT. Hence, we distinguished two groups of STAT1 KO recipients: "protected" vs. "non-protected." Histology revealed minimal sinusoidal congestion without edema/vacuolization or necrosis in the STAT1 KO "protected" group. The induction of mRNA coding for TNF-α/IL-6 was higher in STAT1 KO "non-protected" livers. The expression of CXCL10, the product of STAT1 activation downstream of TLR4 in the type I IFN pathway, was profoundly and selectively depressed in livers from STAT1 KO "protected" mice, as compared to IRI-susceptible livers. Similarly, Western blot-assisted phospho-ERK expression was upregulated selectively in the STAT1 KO "protected" group. In the second series, C57BL/6 mice were treated 1 h prior to the liver ischemic insult with a JAK2 inhibitor (tyrphostin AG490; 40 mg/kg, i.p.; n=5) or vehicle (n=5). Hepatocellular damage, as measured by sALT levels (IU/L) and histology, was significantly decreased in the AG490 group as compared with controls (mean 12770 vs. 28770; p<0.05). The disruption of JAK/STAT signaling by inhibiting JAK2 uniformly ameliorates the inflammatory immune response in liver IRI. However, blockade of STAT1 alone is insufficient to reproducibly exert cytoprotection. As JAK2 is upstream of STAT1 as well as of the MAP kinase (ERK), this study highlights the role of both signaling pathways in hepatic IRI.

Purpose: TLR4 is required for maximal ischemic injury of the heart, liver, lung, and kidney. To better understand the mechanisms of TLR4 action, we investigated a murine model of ischemic kidney disease and examined endothelial TLR4 expression. Methods: 1. Animal ischemia reperfusion injury (IRI): the right kidney of wild-type (WT) C57BL/10 or TLR4-deficient (KO) C57BL/10ScN mice was removed. The left pedicle was clamped for 23 min, followed by 4 hr reperfusion. Sham animals served as controls. 2. Bone-marrow chimeras: four groups of BM chimeras were created: WT→WT, KO→WT, WT→KO, and KO→KO (8 recipients/group). Eight weeks later, chimerism was confirmed by tail and blood genotyping, and mice were subjected to renal IRI. 3. TLR4 mRNA was detected by DIG-labeled antisense probes; TLR4 on endothelium by anti-TLR4 and anti-CD31. 4. Total genome mRNA expression in ischemic vs. sham WT mice and ischemic vs. sham KO mice (4 mice/group) was determined using Affymetrix Mouse Genome 430 2.0 GeneChips followed by GeneSifter analysis. Quantitative real-time RT-PCR confirmed candidate genes. 5. MS1 endothelial cells were treated with H2O2 (500 µM) for 30 min, cultured for 4 hr, and then expression of TLR4 and endothelial genes was determined. Results: TLR4-deficient mice had less renal injury as assessed by pathology and function. Radiation chimeras showed that radioresistant parenchymal cells and radiosensitive leukocytes were both required for maximal injury. Immunohistology and in situ hybridization identified endothelia in the outer medulla as a major TLR4-expressing cell type at 4 hr post-reperfusion. GeneChip analysis revealed a panel of cytokine, chemokine, proinflammatory, and cell cycle-related genes that were differentially expressed in IRI versus sham kidneys. Real-time RT-PCR confirmed that the following pro-inflammatory endothelial genes increased after ischemia in wild-type but not TLR4 KO mice: TLR4, pentraxin-related gene (PTX3), and endothelial cell-specific molecule 1 (ESM1).
Real-time RT-PCR showed that in vitro H2O2 treatment of endothelial cells induced expression of TLR4 (6.6-fold), PTX3 (4.6-fold), and ESM1 (69-fold). We found that endothelia in the outer medulla increase their expression of TLR4 after ischemia, and that the endothelial genes PTX3 and ESM1 increase only in WT ischemic kidneys. We also found that H2O2, which mimics reactive oxygen species generated during IRI, directly increases these same endothelial genes in vitro.

NKG2D is an activating receptor expressed on NK cells and CD8+ T cells. Ligands for NKG2D, including RAE-1, MULT-1, and H60 in the mouse, may be upregulated by tissues in response to stress. A study in mouse macrophages described upregulation of RAE-1 in response to stimulation through TLR4 by LPS. We have shown that TLR4 mediates kidney ischemia reperfusion injury (IRI) and have also observed RAE-1 upregulation in IRI kidneys. We now determined whether: 1) kidney IRI could induce the expression of NKG2D ligands; 2) expression of NKG2D ligands is TLR4-dependent; and 3) bone marrow (BM)-derived cells or parenchymal kidney cells express NKG2D ligands. Methods. Kidney ischemia was induced in TLR4-/-, MyD88-/-, and WT mice for 22 min. Blood and tissue were harvested at days 1, 3, and 5. Primary cultures of mouse tubular cells (TECs) were also subjected to ischemia (mineral oil overlay for 1 hr) or TLR4 activation (LPS stimulation). BM chimeric mice were generated by transplanting BM into irradiated recipient mice before IRI was induced. Results. RAE-1 mRNA levels in IRI kidneys were increased from day 1 to day 5, peaking at day 3, compared to sham-operated kidneys (14-48-fold increase, p<0.001), as measured by real-time PCR. RAE-1 protein was detected by flow cytometry in renal tubular cells from ischemic kidneys but not from sham-operated controls. MULT-1 mRNA expression was also increased from day 1 to day 5 (15-51-fold increase, p<0.005). Both RAE-1 and MULT-1 mRNA levels were reduced in TLR4-/- and MyD88-/- IRI kidneys (2-8-fold reduction) versus WT controls (p<0.01). TLR4-/- and MyD88-/- primary cultured TECs subjected to ischemia in vitro also showed less RAE-1 and MULT-1 mRNA expression than WT controls (p<0.01). LPS stimulated RAE-1 and MULT-1 expression in WT but not in TLR4-/- TECs. TLR4-/- mice bearing WT hemopoietic cells had significantly lower kidney RAE-1 and MULT-1 mRNA expression after IRI versus WT mice with TLR4-/- BM (p<0.05). Conclusion. Kidney IRI induces RAE-1 and MULT-1 expression, and kidney parenchymal cells are the dominant source. TECs can be stimulated to express RAE-1 and MULT-1 by ischemia or LPS via the TLR4 pathway in vitro, and deficiencies in this pathway protect against IRI and against RAE-1 and MULT-1 expression in vitro and in vivo. Thus, kidney IRI causes upregulation of NKG2D ligands on parenchymal kidney cells via TLR4.
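The fold-change values reported in this and the surrounding abstracts come from quantitative real-time PCR. One standard way such values are derived is the 2^-ΔΔCt method, sketched below as a generic illustration with hypothetical Ct values; this is not necessarily the exact normalization pipeline these groups used.

```python
# Illustrative 2^-ddCt fold-change calculation for real-time RT-PCR data.
def fold_change(ct_gene_treated, ct_ref_treated, ct_gene_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method.

    Each sample's target-gene Ct is first normalized to a reference
    (housekeeping) gene; the treated sample is then compared with control.
    """
    d_ct_treated = ct_gene_treated - ct_ref_treated
    d_ct_control = ct_gene_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical example: a target gene in an ischemic vs. sham kidney,
# normalized to a housekeeping gene such as GAPDH.
print(f"fold change: {fold_change(22.0, 18.0, 27.0, 18.2):.1f}x")
```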
Background: Intestinal ischemia/reperfusion injury (IRI) is a major clinical problem. Although Toll-like receptor 4 (TLR4) has been implicated as a potential link between innate and adaptive immunity, little is known about its role in intestinal IRI. Our preliminary research in intestinal IRI has shown that, compared to sham controls, WT mice had decreased survival and worse tissue injury/apoptosis, in conjunction with increased infiltration of PMN and CD3+ cells and increased production of TLR4, chemokines, and adhesion molecules. Here we used TLR4 KO mice to further investigate the role of TLR4 in intestinal IRI and its effects on cytokine/chemokine programs and apoptotic signaling. Methods: C57BL/10 WT and TLR4 KO mice underwent 100 min of total jejunoileal warm IRI by clamping of the SMA. Separate survival and analysis groups were studied. Intestinal tissue was harvested at 4 h and 24 h. Tissue analysis included histopathology, CD3 immunostaining, myeloperoxidase (MPO) activity, RT-PCR for chemokines/cytokines, and Western blots for apoptotic and HO-1 protein expression. Results: TLR4 KO mice had superior survival compared to WT (100% vs. 33%, p<0.02). On histopathology, TLR4 KO mice had near normal-appearing villous architecture, while in contrast WT showed mucosal erosions and villous congestion/hemorrhage. TLR4 KO mice had reduced CD3+ cell infiltration compared to WT (3.9±0.7 vs. 6.2±1.4 per hpf at 4 h, p<0.001; and 4.8±1.1 vs. 7.0±2.3 per hpf at 24 h, p<0.02). Early MPO activity was also reduced in TLR4 KO mice (0.11±0.06 vs. 0.88±0.5 U/g at 4 h, p<0.005). RT-PCR analysis demonstrated decreased production of mRNA for IP-10, MCP-1, RANTES, and IFN-γ and increased production of IL-13 in TLR4 KO mice. There was decreased protein expression of caspase-3 and increased expression of Bcl-2 and HO-1 in TLR4 KO mice. Conclusion: The genetic absence of TLR4 protects against intestinal IRI, demonstrating for the first time that TLR4 is required for intestinal IRI. The absence of TLR4 signaling reduces IRI through reduced neutrophil and T cell chemotaxis and upregulation of protective molecules. These results support data indicating that TLR4 is a mechanistic link between innate and adaptive immunity, implicating TLR4 as a potential therapeutic target for the prevention of intestinal IRI.

It is well known that liver steatosis increases hepatic vulnerability to ischemia/reperfusion (I/R) injury as part of the transplantation process. Endotoxin (LPS) is thought to be a major contributing factor to the pathogenesis of I/R injury. During portal occlusion, LPS is translocated across the mesenteric tissue barrier into the portal circulation and is delivered as a large bolus to the liver at the point of reperfusion. There, LPS is recognized mainly by Toll-like receptor 4 (TLR4), leading to downstream signaling and the production of proinflammatory products that ultimately cause cellular inflammation, necrosis, and apoptosis. Steatotic livers are highly sensitive to endotoxin post-I/R compared to their lean counterparts, and we have previously seen that monoclonal antibody blockade of endotoxin dramatically improves animal survival after I/R. Therefore, we propose the novel hypothesis that TLR4 signaling is a major contributor to cellular damage after steatotic hepatic I/R. To test this hypothesis, we fed male 4-week-old C57BL/10J (control) or C57BL/10ScN (TLR4-deficient, TLR4 KO) mice a high-fat diet (HFD) for four weeks. We then subjected the animals to 35 minutes of total hepatic ischemia and 1 or 24 hours of reperfusion. There was a dramatic improvement in animal survival at 24 hours in the HFD TLR4 KO animals versus control HFD animals (74% vs. 31%, p<0.05). There was significantly more liver necrosis (as measured by a 0-3 grading scale) in the control HFD animals than in the TLR4 KO HFD animals (1.4±0.3 vs. 0.4±0.2, p<0.05). In addition, we observed significant increases in the message levels of the proinflammatory cytokines IL-6, IL-12, and IFN-γ at one hour in the control HFD animals that were dramatically abrogated in the TLR4 KO HFD animals.
We do not see these dramatic changes in control animals fed a normal diet. Despite the significant increase in inflammation in control HFD animals versus TLR4 KO HFD and normal-diet control animals, we see no change in TLR4 message levels or endotoxin boluses, implying increased sensitivity in the absence of an increased number of receptors. TLR4 is a critical molecule in the pathogenesis of steatotic liver ischemia/reperfusion injury and represents a potential therapeutic target for expansion of the donor pool.

Background: Neutrophils are considered crucial effector cells in the pathophysiology of organ ischemia and reperfusion injury (IRI). In particular, neutrophil elastase (NE) accounts for a substantial portion of neutrophil function. This study was designed to explore the role of, and the mechanism by which, NE exerts its function in a mouse model of liver warm IRI. Methods: Partial warm ischemia was produced in the left and middle hepatic lobes of C57BL/6 mice for 90 min, followed by 6-24 h of reperfusion. Mice were treated with an NE inhibitor (NEI; GW311616A, 2 mg/kg p.o.; n=6) or control (n=6) 60 min prior to the ischemic insult. After 6 h or 24 h of reperfusion, sAST/sALT levels and intrahepatic neutrophil accumulation (myeloperoxidase [MPO] activity) were assessed. Pro-inflammatory cytokine (TNF-α, IL-6), chemokine (CXCL-1, CXCL-2, IP-10), and Toll-like receptor (TLR) 4 gene expression profiles were screened by RT-PCR. Liver samples were collected for histological grading and for detection of neutrophil infiltration by naphthol AS-D chloroacetate esterase staining. Results: NEI treatment significantly reduced sAST/sALT levels as compared with controls (17695±3413 vs. 10990±2411, p<0.05 / 33650±3793 vs. 13510±6809, p<0.01 at 6 h; and 11778±3113 / 14483±3972 vs. 3565±957 / 4810±1063, p<0.01 at 24 h). The expression of pro-inflammatory cytokines and chemokines was significantly reduced in the NEI treatment group (TNF-α at 6 h, p<0.05; IL-6 at 6 h and 24 h, p<0.01 and p<0.05; CXCL-1 at 24 h, p<0.01; CXCL-2 at 6 h and 24 h, p<0.01 and p<0.05; IP-10 at 24 h, p<0.01). MPO activity (U/g) was also significantly reduced following NEI treatment (14.1±7.7 vs. 4.0±1.2, p<0.05). TLR4 expression was selectively diminished in NEI-pretreated livers (6 h, p<0.01). Histological examination of liver sections revealed that, unlike controls, NEI-treated livers showed markedly reduced edema, diminished centrilobular ballooning/sinusoidal congestion, ameliorated hepatocellular necrosis, and decreased local neutrophil infiltration. Conclusion: Inhibition of NE ameliorated hepatocellular damage and reduced local inflammatory responses and neutrophil activity/infiltration in a stringent mouse liver model of warm IRI. Interestingly, it also downregulated innate TLR4 signaling. This study documents previously unrecognized NE-TLR4 cross-talk and implicates neutrophil elastase in the signal transduction pathway instrumental to liver IRI.

Lymphocytes are involved in the early pathogenesis of ischemia-reperfusion injury (IRI) in the kidney; however, their role during healing is unknown. This has direct clinical consequences, since lymphocyte-targeting agents are currently administered to prevent rejection during recovery from IRI in renal transplants. C57BL/6 mice underwent unilateral clamping of the renal pedicle for 45 min, followed by reperfusion, and were sacrificed at day 10. Mice were treated with saline (C), methylprednisolone (Pred), or mycophenolate mofetil (MMF) i.p.
daily from day 2 until sacrifice (n=12/group). Lymphocytes were isolated from the kidneys, counted, and stained with monoclonal antibodies. Kidney damage (% damaged tubules) and proliferation (Ki67 assay) were assessed. Flow cytometry demonstrated increased numbers of TCRβ+CD4+ and TCRβ+CD8+ T cells and TCRβ−NK1.1+ NK cells, but not CD19+ B cells, at day 10 in the ischemic (IR) kidneys compared to the contralateral kidneys. Regulatory T cells (TCRβ+CD4+CD25+Foxp3+) and the T cell subsets TCRβ+CD8+CD122+ and TCRβ+NK1.1+ also increased. Moderate tubular damage in the cortex, severe injury in the outer medulla, and increased proliferation in both compartments characterized the repair phase. Pred improved histological damage in IR kidneys, while MMF worsened it. Proliferative index correlated with histology in the outer medulla. Pred reduced the total counts and activation of TCRβ+CD4+ and TCRβ+CD8+ T cells in IR kidneys and increased the percentage of TCRβ+CD8+CD122+ cells among total TCRβ+CD8+ T cells. MMF reduced all lymphocyte subsets, decreased the percentage of TCRβ+CD4+CD25+Foxp3+ cells among total TCRβ+CD4+ T cells, and lowered IL-10 tissue levels. IL-6 and platelet-derived growth factor-BB protein levels were also decreased in IR kidneys from MMF-treated mice. In conclusion, specific trafficking and phenotypic changes of kidney-infiltrating lymphocytes occur during recovery from renal IRI, and the lymphocyte-targeting agents Pred and MMF alter tubular cell structure, proliferation, and the inflammatory response in the repair phase.

Background: HO-1 plays an important cytoprotective role in a variety of organ injury models. We have shown that HO-1 exhibits potent cytoprotective effects against liver I/R injury. This study explores the function and mechanism of HO-1 in liver I/R injury by using siRNA to suppress HO-1 expression both in vitro and in vivo. Methods: Using a partial liver warm ischemia model, C57BL/6 wild-type (WT) mice (n=6/gr) were injected with HO-1 siRNA/nonspecific control siRNA (2 mg/kg, i.v. at day -1) or Ad-HO-1/Ad-β-gal (2.5×10⁹ pfu, i.v. at day -2). Sham-control WT mice underwent the same procedures but without vascular occlusion. Mice were sacrificed at 6 h of reperfusion; liver tissue and blood samples were collected for further analysis. In the in vitro studies, YPEN-1 endothelial cells were transfected with HO-1 siRNA (100 nM) or Ad-HO-1/Ad-β-gal. Results: HO-1 siRNA-treated mice showed significantly increased sGOT levels (IU/L) as compared with nonspecific control siRNA or Ad-HO-1 (7593±2859 vs. 3104±1777 and 211.5±40, respectively; p<0.05). These findings correlated with Suzuki's histologic grading of liver I/R injury: HO-1 siRNA produced significant edema, sinusoidal congestion/cytoplasmic vacuolization, and severe hepatocellular necrosis; nonspecific control siRNA produced moderate edema and sinusoidal congestion/cytoplasmic vacuolization; in contrast, Ad-HO-1 revealed only minimal sinusoidal congestion without edema or necrosis. HO-1 siRNA significantly increased local neutrophil accumulation and caspase-3 activity, and increased the frequency of apoptotic cells (31.8±5.5 vs. 18.5±5.4 and 3.5±2.2, respectively; p<0.005), as compared with nonspecific control siRNA or Ad-HO-1. Both YPEN-1 endothelial cells and WT mice treated with HO-1 siRNA revealed markedly increased caspase-3 activity and reduced HO-1 expression. In contrast, Ad-HO-1 significantly decreased caspase-3 activity and increased HO-1 and anti-apoptotic Bcl-2/Bcl-xL expression. Conclusion: This study provides evidence that HO-1 exerts cytoprotection against I/R injury by regulating liver apoptosis and inhibiting the caspase-3 activation pathway. Organ-specific siRNA is not only a powerful tool to study local gene function, but may also provide novel therapeutic applications in transplant recipients.
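Group comparisons above are reported as mean ± scatter with n=6 per group (e.g., the sGOT levels in HO-1 siRNA vs. control siRNA mice). Such tests can be reproduced from summary statistics alone. A minimal sketch follows; treating the ± values as standard deviations and using an unpaired Welch test are both assumptions, since the abstract specifies neither.

```python
# Reproduce a two-group comparison from published summary statistics.
# Assumptions: the +/- values are SDs (not SEMs) and a Welch (unequal-variance)
# t-test is appropriate; the abstract does not state either.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=7593, std1=2859, nobs1=6,   # HO-1 siRNA group
                            mean2=3104, std2=1777, nobs2=6,   # control siRNA group
                            equal_var=False)                  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```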
Islet cell transplantation has recently emerged as one of the most promising therapeutic approaches for diabetic patients to improve glycometabolic control. One major problem of the procedure is the requirement for an immunosuppression protocol capable of counteracting both the auto- and alloimmune responses. Recent data suggest that anti-thymocyte globulin (ATG) can efficiently halt the mounting of an alloresponse and the recurrence of autoimmunity in NOD mice by expanding antigen-specific T-regulatory cells. We retrospectively reviewed our series of type 1 diabetic kidney-transplanted patients who underwent islet transplantation using an immunosuppressive protocol based on ATG or daclizumab (as induction treatment) plus cyclosporine and MMF as maintenance therapy. 45 patients underwent islet-after-kidney transplantation in our center. Thirty-four patients received a time course of ATG as induction (125 mg per day for 7-10 days; number of islets infused = 499,596±28,780), and 9 patients received daclizumab at induction (1 mg/kg every 2 weeks for 10 weeks; number of islets = 486,108±70,644). No major adverse events were recorded in our center; no malignancies or infection outbreaks were evident. Patients in the ATG induction group showed a better islet survival rate compared to daclizumab (p=0.01), according to C-peptide >1ng/ml. A sustained and prolonged C-peptide secretion was evident in the ATG group, while in the daclizumab group only 1 patient had a functioning graft at 1 year. Interestingly, in the ATG group, which reached a longer follow-up, we did not observe a loss of beta cell mass according to our metabolic tests, suggesting preservation of islet mass. In conclusion, ATG can provide good protection against both the allo- and autoimmune responses. The next step will be to use ATG in a calcineurin-free protocol, allowing a better expansion of T-regs.

Introduction: Despite consistent achievement of insulin independence, recent data indicate that the long-term success of islet transplants using the Edmonton protocol is <10% at 5 years. The cause of the nearly universal late islet allograft failure remains unknown, but hypotheses include allo- or autoimmune injury, marginal mass exhaustion, hepatic site-related dysfunction, and immunosuppression toxicity. We examined the series of islet transplants performed at our institution and noted marked differences in outcome and complications between the islet-alone (IA) and islet-after-kidney (IAK) groups. Our results may provide insight as to the cause of chronic islet loss. Methods: Thirty-one islet infusions were administered to IA (n=9) and IAK (n=8) type 1 diabetics between 9/2001 and 11/2007. IA and IAK had similar demographics and transplanted islet mass (13,838 vs 13,411 IEq/kg). IA received Edmonton-like immunosuppression with Zenapax induction and CNI/SRL, whereas IAK patients received Zenapax and CNI/MMF/pred (6), CNI/MMF (1), or CNI/SRL (1). Results: Insulin independence was achieved in all but 2 patients who completed therapy (2 others withdrew).
Compared with IA, IAK exhibited better glycemic control (6-month mean HbA1c 6.5 vs 6.1; stimulated C-peptide 2.3 vs 6.0) and improved islet survival: all IA grafts eventually failed and only 1/8 were insulin-free for >2 years, whereas only 3/7 IAK grafts have failed, with 4 exhibiting continued robust function at >38, >38, >46, and >58 months and 3/4 fully insulin-independent. In addition, all IA patients demonstrated immune sensitization after graft failure versus 0/8 IAK. All 9 IA patients developed mouth ulcers versus only 1/8 IAK. Conclusions: IA and IAK exhibit striking differences in outcome and complications. The absence of mouth ulcers and the lack of sensitization in IAK may relate to steroid use and to continued immunosuppression for the kidney graft, respectively. The superior outcome of the IAK cohort may be a result of differences between the two groups, including the use of maintenance steroids, prior exposure to Thymo, or the chronically immunosuppressed state of the IAK recipient. Perhaps the most interesting correlation with outcome is the absence of the anti-proliferative agent sirolimus in the IAK group. Our results provide clues to the cause of chronic islet transplant failure and may lead to novel approaches to avoid it.

Background: Although islet transplantation has become an option for the treatment of type 1 diabetes, all currently used immunosuppressive protocols have significant renal and islet toxicity. We describe a novel immunosuppressive protocol using sirolimus and the anti-LFA-1 antibody efalizumab that permits prolonged islet allograft survival without the need for steroids or calcineurin inhibitors (CI). Methods: Between February and August 2007, 4 consecutive type 1 diabetic patients with hypoglycemic unawareness and normal renal function received allogeneic pancreatic islet transplants. Induction immunosuppression consisted of 2 doses of thymoglobulin given on pre-transplant days -2 and -1, efalizumab (5 mg/kg SQ/week starting on day -1), and sirolimus. Maintenance immunosuppression consisted of sirolimus and efalizumab. Results: All patients achieved insulin independence after single islet infusions (mean IEQ/kg = 8,905). Three of 4 remain insulin-independent 4 or more months after transplant (Table 1). Patient 1 resumed low-dose insulin (approximately 15% of the original dose) 6 weeks after transplant and is awaiting a second islet infusion. Her blood glucose control is markedly improved and she has not experienced any hypoglycemic episodes. All patients show persistent C-peptide secretion and have stable renal function. Side effects due to efalizumab were limited to transient irritation at the injection site. Conclusions: Thymoglobulin induction followed by sirolimus and efalizumab maintenance is well tolerated and allows prolonged islet allograft survival. This protocol is the first CI/steroid-free islet regimen resulting in insulin independence with a single-donor islet infusion. By eliminating CI, this protocol minimizes renal and islet toxicity and may thus improve long-term islet survival and function.

… Clinical and metabolic profiles were assessed every 3 months for 18 months. Results: The SI-EXN group was on this drug for a median time of 171 days pre-SI. Both groups were similar except for the duration from post-completion to graft dysfunction (648±34 vs 221±58 in the SI-EXN and SI-C groups, respectively; p<0.05) and the duration of graft dysfunction before SI (664±83 vs 326±93, p=0.03). The SI-C and SI-EXN groups received a mean of 8713±2123 and 5613±485 IEQ/kg, respectively (NS). Only 3/5 SI-C patients achieved insulin independence, for 303, 403 and >1670 days after SI. All subjects in the SI-EXN group achieved insulin independence, for more than 443, 621, 641, and 654 days. At 18 months, insulin independence was 20% in the SI-C and 100% in the SI-EXN group. Comparing pre- and post-SI, the SI-EXN group had significantly lower HbA1c at 3-18 months and lower AUC glucagon at 6 and 15 months (p<0.05). AUC C-peptide in the SI-EXN group was significantly higher than in SI-C at 3 and 9 months. Intravenous glucose tolerance testing showed significantly increased acute insulin responses to glucose at 3-18 months in SI-EXN and at 3 and 6 months in the SI-C group. The SI-EXN group had a greater acute C-peptide response to glucose than SI-C at 3 and 6 months (p<0.05). Acute exenatide administration during the intravenous glucose tolerance test at 15 months in SI-EXN revealed significantly increased acute insulin and C-peptide responses to glucose, indicating improved first-phase insulin release. Conclusion: Supplemental islet infusions under exenatide lead to insulin independence, restore first-phase insulin secretion, and result in long-term insulin independence.
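The AUC endpoints reported above (AUC C-peptide, AUC glucagon) are conventionally obtained by trapezoidal integration of timed hormone samples during a challenge test. A minimal sketch follows; the sampling schedule and concentrations are hypothetical, since the abstract does not reproduce the raw curves.

```python
# Trapezoidal area-under-the-curve for a timed hormone profile, the usual way
# AUC C-peptide / AUC glucagon endpoints are computed. Times and values below
# are fabricated placeholders, not study data.
import numpy as np

t = np.array([0, 15, 30, 60, 90, 120])                 # minutes post-challenge
c_peptide = np.array([0.8, 1.6, 2.4, 2.9, 2.2, 1.5])   # ng/mL, hypothetical

auc = np.trapz(c_peptide, t)                           # units: ng/mL * min
print(f"AUC C-peptide over 120 min: {auc:.0f} ng/mL*min")
```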
This prospective phase 1/2 trial aimed to demonstrate the safety and reproducibility of allogeneic islet transplantation (Tx) in type 1 diabetic (T1DM) patients and to implement a strategy to achieve and maintain insulin independence with minimal islets. Ten C-peptide-negative T1DM subjects with hypoglycemic unawareness received 1-3 intraportal allogeneic islet Tx. Four subjects (Group 1) received the Edmonton immunosuppression regimen (daclizumab, sirolimus, tacrolimus). The next 6 subjects (Group 2) received etanercept, exenatide and the Edmonton regimen. We followed all subjects for 15 months after the first Tx. The primary efficacy endpoint was insulin independence. Secondary endpoints were HbA1c, fructosamine, OGTT, mixed meal test, glucagon stimulation test, IVGTT and hypoglycemia. To study the effect of exenatide, we compared frequently sampled IVGTT, C-peptide, proinsulin, amylin and glucagon with and without exenatide. Two self-limiting bleeds occurred in 18 infusions. All subjects became insulin-independent. Group 1 received a mean total islet number (EIN) of 1,460,080±418,330 in 2 (n=2) or 3 (n=2) Tx, whereas Group 2 became insulin-independent after 1 Tx (537,495±190,968 EIN, p=0.028). All Group 1 subjects remained insulin-free through the 15-month follow-up. Two Group 2 subjects resumed insulin: one after immunosuppression reduction during an infectious complication, the other with severe gastroparesis and exenatide intolerance. HbA1c reached the normal range in both groups (6.5±0.6 at baseline to 5.6±0.5 after 2-3 Tx in Group 1 vs. 7.8±1.1 to 5.8±0.3 after 1 Tx in Group 2). Baseline HbA1c was significantly higher in Group 2 than in Group 1 (p=0.046). Pre- and post-Tx HYPO scores were 841.8±1217.9 and 0 in Group 1 vs 812.8±941.7 and 33.5±50.0 in Group 2. Glucagon levels decreased significantly in all subjects; in Group 2, the decrease in glucagon levels after challenge tests was 18.7-fold greater with exenatide than without in all subjects. …

Background: IsTx has been investigated as a treatment for type 1 DM. However, the hope that this approach would result in long-term freedom from exogenous insulin has failed in practice.
Techniques for isolating islets have advanced, and with the availability of new IS agents, strategies can now be developed specifically for IsTx that will provide greater immunologic protection without diabetogenic side effects. Rejection and vascularization still remain major limitations to the success of IsTx. We hypothesize that BM is an accessible, immunologically privileged space with a natural, well-developed vasculature, and may be a suitable site for IsTx. Method: Wistar rats were used as donors and recipients. DM was induced by i.v. streptozotocin. Rats with morning glucose levels >300 mg/dl on two separate occasions were used as recipients. PTx was performed by the standard rodent technique. Islets were isolated from the pancreas by distending the pancreatic duct with Liberase, separated on a discontinuous Histopaque density gradient, further purified, counted, divided into aliquots, and transplanted into different sites (liver, BM). Body weight, glucose and C-peptide were measured in all groups before and after Tx.

Background: In February 2007, the islet community was notified of a possible bovine product contamination in the collagenase enzyme (Liberase HI) used for human islet isolations. To eliminate the potential hazard of bovine spongiform encephalopathy, we successfully adapted our human islet processing procedure to utilize a different GMP collagenase and neutral protease (Nordmark/Serva). Here we describe what we consider the most important factors for achieving reproducible and clinically useable islet isolations. Methods/Results: A standard isolation protocol involving controlled enzymatic digestion followed by density gradient purification was used. Seventeen donor pancreata were processed and ten were ultimately used for clinical transplantation. Eight of these successful isolations were performed using the Nordmark/Serva enzymes (Table 1). The following factors were identified as important for ensuring successful islet yields: 1) donor age and size: male donors 17-45 years old who were tall (>180cm) and heavy (>100kg). … Conclusions: Incorporation of several important modifications into our existing islet isolation protocol has allowed us to routinely obtain high-quality islet isolations using an alternative, bovine product-free GMP enzyme.

… (n=187), and in 121 Pts immunosuppression was CNI-free. TAC-treated Pts were younger, more sensitized and more frequently re-KT. DGF was more frequent in Pts receiving CNI-free immunosuppression; as a group, these Pts were older, more frequently received a KT from a donor after cardiac death and less frequently from a living donor. Acute cellular rejection (ACR) was diagnosed in 41 Pts (6.8%), 13 occurring before day 90 (E-ACR) and 28 after this date (L-ACR). C4d-positive AMR was diagnosed in 128 Pts (21.2%), of which 58 were early (E-AMR) and 70 were late (L-AMR) episodes.

Rates of Acute Rejection (AR) and … Background: An increasing number of highly sensitized (HS) patients are being transplanted using desensitization protocols. These patients are considered high-risk for AR, particularly antibody-mediated rejection (AMR). Here we examined our AR rates and treatment outcomes for our HS kidney transplant (KT) recipients using our most current desensitization strategies. Methods: Between June 2004 (when rituximab was introduced to our desensitization and treatment protocols) and July 2007, 130 HS KT patients were transplanted using combinations of high-dose IVIG, rituximab, and plasmapheresis (PP) for desensitization. We examined the overall AR, C4d− cell-mediated (CMR), and C4d+ AMR rates.
AMR episodes were treated with steroids, high-dose IVIG, and rituximab. Refractory and rapidly progressive AMR was treated with PP. Treatment outcomes and differences in AR rates between deceased donors (DD) and living donors (LD) were examined.

Recent reports suggest that treatment with the monoclonal anti-CD20 antibody rituximab (Rtx) may improve renal graft survival in antibody-mediated rejection (AMR); however, optimal dosing for this indication is unknown. We examined the efficacy of a single low dose (500mg) of Rtx for treatment of refractory AMR, in order to limit the significant infective complications associated with conventional dosing regimens (375mg/m²). Rtx was used in seven consecutive patients who had refractory AMR, as judged by ongoing biopsy evidence of AMR after 4 weeks of standard therapy consisting of pulse methylprednisolone and plasma exchange with low-dose IVIg (100mg/kg) (PE/IVIg). All patients received tacrolimus and mycophenolate mofetil. AMR was defined as: 1) characteristic histology on biopsy (Banff criteria), 2) graft dysfunction, and 3) presence of a donor-specific anti-HLA antibody (DSAb). B-cell counts (CD19) and serum creatinine (SCr) were monitored. PE/IVIg was ceased in all cases after Rtx dosing. The average follow-up since Rtx dosing is 16.8 months (range 5.3-28.9 months). All patients still have functioning grafts (100% 12-month graft and patient survival), with current mean SCr levels (156±17µmol/L) significantly lower than mean peak rejection levels (514±355µmol/L), p=0.05. CD19 counts fell to zero and remained <5×10⁶/L for >6 months in all patients. DSAbs remain detectable by Luminex flow beads in all patients despite stabilization of SCr. Two of 7 patients developed an infective complication after Rtx dosing: one developed bacterial pneumonia requiring hospital admission, while a second developed CMV viraemia and later BK nephropathy, which has not led to significant graft dysfunction. Hence, infectious complications are far fewer than those reported for multiple standard-dose regimens in similar patient groups. Large clinical trials are required to confirm the efficacy of Rtx in treating AMR, as well as optimal dosing. Standard dosing is based on oncology treatment regimens, which are likely to be excessive for AMR in patients who are already markedly immunosuppressed. Our data suggest that single low-dose Rtx for refractory AMR results in excellent patient and graft survival and leads to low rates of serious infective complications in the short term.

The demographic and clinical data were similar between the positive crossmatch (+CXM) and negative CXM groups. The rejection rate was significantly higher and the length of stay significantly longer in the +CXM group, as shown in the table. The renal allograft (RA) survival rates were 8%, 7% and 6% lower at 1, 3 and 5 years post-transplant, respectively, among +CXM recipients; however, this did not reach significance, mostly due to the small sample size (p=0.11). The inferior RA outcome is seen during the initial few years after transplantation and disappears after 9 years. In CLKT, a pre-transplant +CXM can result in a higher RA rejection rate and a longer length of stay in hospital. The liver allograft (LA) may not always confer immunological protection to the RA in +CXM CLKT, and its true burden may be under-recognized, since CXM is not routinely performed in all CLKT. Study of antibody specificities or LA volume to determine the effect of +CXM on RA outcome is essential.
SKPT in patients with a positive CDC B-cell and/or flow cytometry crossmatch is associated with a high AMR rate despite low-dose IVIG and r-ATG/alemtuzumab induction. Pancreas graft survival is inferior in patients with a positive crossmatch, while kidney graft and patient survival are similar to those of negative-crossmatch recipients. The majority of AMR can be reversed with treatment, but further long-term follow-up is needed to determine the impact on graft survival.

The first European phase III trial with tacrolimus in the early 1990s clearly showed advantages for tacrolimus in terms of acute rejection (AR) vs the original cyclosporine formulation. However, no advantages were seen in survival rates. The present investigator-initiated, observational follow-up study collected data on patients included in the original cohort of 448 patients who participated in this large study and who were randomised to a triple immunosuppressive regimen consisting of tacrolimus, azathioprine and steroids (TAC/AZA/STER, n=303) or cyclosporine, AZA and steroids (CSA/AZA/STER, n=145). Efficacy and safety parameters assessed at follow-up included acute rejection, patient and graft survival, renal function, vital signs, basic laboratory results, and the immunosuppressive regimen of the patients 10 years after completion of the original study. Results: Currently, data are available from 75% of the patients. All assessments were conducted following intent-to-treat principles. More patients in the TAC than in the CSA group remained on the randomised treatment. While Kaplan-Meier (K-M) estimates for patient survival at year 11 show comparable results (72% TAC vs 72% CSA), K-M estimates for graft survival demonstrate an advantage for the tacrolimus cohort (51% TAC vs 46% CSA). Graft half-life estimates (Gjertson and Terasaki method) yielded overall results of 14.2 years (TAC) and 11.4 years (CSA). In both treatment groups, AR was associated with inferior long-term results. For patients with AR, half-life estimates were 11.1 and 7.8 years for TAC and CSA, respectively, while in patients without AR half-lives were 19.2 (TAC) and 15.8 (CSA) years. Mean serum creatinine after 10 years was significantly lower in the TAC cohort vs the CSA group (1.65mg/dL TAC vs 2.01mg/dL CSA, p<0.05). Mean glomerular filtration rates (MDRD4 estimate) were 48.7 mL/min in TAC and 41.9 mL/min in CSA. After ten years of follow-up, the mean trough (C-0) levels were 7.6 ng/mL (TAC) and 120 ng/mL (CSA). Conclusion: Analysis of this long-term follow-up of a large study confirms the benefits of tacrolimus-based therapy. It also confirms the concept that freedom from early acute rejection is associated with superior long-term renal function; long-term renal allograft function is remarkably good in TAC patients.

Cyclophosphamide, a Novel Therapy for IVIG/PLEX-Resistant Antibody-Mediated Rejection. …

… (C). Moreover, subgroup analyses within the different groups were performed to identify potential differences. Groups were analyzed for survival, graft rejection and graft vasculopathy (CAD). Kaplan-Meier analysis was used, and the log-rank test was performed to detect differences. Results: A total of 81 transplants (8.9%) were blood-group compatible. The majority (n=32) were 0A transplants (40%), followed by 0B (n=19; 23%), AAB (n=17; 21%), BAB (n=7; 9%) and 0AB (n=6; 7%). Overall survival comparison showed no significant difference in long-term (10-year) survival. …
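The heart-transplant comparison above states its methods explicitly: Kaplan-Meier estimates compared with a log-rank test. A minimal sketch with the lifelines package follows; the durations and event flags are fabricated placeholders, not study data.

```python
# Kaplan-Meier curves plus a log-rank test for two transplant groups, the
# analysis named in the abstract above. All numbers are hypothetical.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# years to death or censoring, and event indicators (1 = died), hypothetical
t_compat, e_compat = [1.2, 4.5, 8.0, 10.0, 10.0], [1, 1, 0, 0, 0]
t_ident,  e_ident  = [2.0, 6.3, 9.1, 10.0, 10.0], [1, 0, 1, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(t_compat, e_compat, label="blood-group compatible")
print(kmf.survival_function_)           # stepwise survival estimate

res = logrank_test(t_compat, t_ident,
                   event_observed_A=e_compat, event_observed_B=e_ident)
print(f"log-rank p = {res.p_value:.3f}")
```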
Recent data have shown the utility of urinary biomarkers to predict delayed graft function early following renal transplantation, but their ability to predict longer-term allograft function is less clear. We evaluated whether urinary biomarkers measured in the early post-transplant period would predict allograft function at 6 and 12 months after renal transplantation. Urinary biomarkers including N-acetyl-β-D-glucosaminidase (NAG), kidney injury molecule-1 (KIM-1) and matrix metalloproteinase-9 (MMP-9) were measured in 20 patients at hours 0, 6, 12 and 24 and days 2 and 3 following renal transplantation. Glomerular filtration rate (GFR) at 6 and 12 months was calculated using the MDRD equation. All 20 patients had functioning allografts at 6 months; 2 patients lost their allografts after 6 months. Levels of individual biomarkers at the different post-transplant time points were correlated with 6- and 12-month GFR using Spearman's correlation (rho) and are shown in the table. Urinary NAG levels at post-transplant hours 0, 6 and 24 and days 2 and 3 showed a significant negative correlation with 6-month GFR; NAG levels at hour 24 and days 2 and 3 maintained the significant negative correlation with 12-month GFR. Urinary KIM-1 levels at the 24-hour point showed a significant negative correlation with 6-month GFR, and KIM-1 levels at hour 24 and day 2 displayed a significant negative correlation with 12-month GFR. Urinary MMP-9 levels at no time point had a significant correlation with either 6-month or 12-month GFR. Urinary biomarkers, particularly NAG, show promise as tools for the early prediction of longer-term allograft function following renal transplantation.

In kidney disease, chemokines are involved in the recruitment of leukocytes, causing kidney damage and progression to end-stage renal failure. Similar mechanisms are proposed for chronic allograft failure in renal transplant recipients. The chemokine-attracted leukocytes produce proinflammatory and profibrotic cytokines contributing to fibroblast proliferation, matrix production and tubular atrophy. The role of chemokines in acute post-ischemic tubular necrosis early after renal transplantation, and their impact on long-term allograft outcome, has not yet been investigated. Methods: 30 patients (23 M, 7 F, mean age 49.9 yrs) with DGF (median of 6.4 dialysis sessions) developed biopsy-proven acute tubular necrosis during the first three weeks after transplantation. The biopsies were studied by immunostaining for Ki67, CCR1, CCR2, RANTES, MCP-1, CD20 and CD68. The patients were followed for a mean time of 4.2 yrs; none of them developed a rejection episode during the follow-up period. At the end of follow-up, the mean serum creatinine was 2.4 mg/dl (0.8-6.…).

The goal of a tx is for the recipient to achieve long-term survival, with continued graft function, equivalent to that of the general population. We studied subsequent outcome in 2202 10-yr kidney tx survivors (tx 1963-1997); 62% living donor (LD); 87% first tx; 41% female; 28% type 1 diabetes; mean age at tx (±SE) 33±15 yrs. Actuarial 25-yr survival is shown in Table 1; LD patient, graft, and death-censored graft survival are significantly higher vs. deceased donor (p<0.0001). Mean creatinine (±SE) at 10 yrs was 1.7±0.8; at 15 yrs, 1.7±0.9; at 20 yrs, 1.5±1; at 25 yrs, 1.6±1. The 2 major causes of late graft loss (GL) were death with function (DwF) and chronic rejection (CAN).
By multivariate analysis, risk factors for GL after 10 yrs were age ≥50 yrs at tx, type 1 diabetes, retx, HLA mismatch ≥3, ≥1 acute rejection episode, and pretx cardiac, peripheral vascular, or liver disease (p<0.05). The most common cause of DwF was cardiovascular disease (CVD); however, >20% of deaths were due to malignancy.

The use of kidneys from donors with positive HCV serology in HCV(+) recipients still remains controversial. In 1990, our units adopted this policy, subsequently modified in 1993 so that these kidneys were limited to recipients with positive HCV RNA before transplantation. The aim of the present analysis was to review the long-term safety of our policy. From January 1990 to February 2007, 474 HCV(+) patients with negative HBsAg and not treated with interferon received a kidney transplant in our units: 313 patients were transplanted from an HCV(-) donor (Group 1) and 161 from an HCV(+) donor (Group 2). Median follow-up time was 65 (IQR 24-117) months. Remarkably, Group 2 showed a significantly higher donor age (46.6±13.6 versus 41.1±18.8 years; p<0.0001) and recipient age (50.4±13.1 versus 44.9±14.6 years; p<0.0001), as well as worse HLA compatibility, than Group 1. Immunosuppressive therapy did not significantly differ between the two groups. … Notably, only 5 patients died because of liver disease (3 in Group 1 versus 2 in Group 2) and only 2 patients developed a hepatocarcinoma (both in Group 1). In summary, no significant differences were observed among HCV(+) recipients according to the HCV serology of the donor in terms of death-censored graft survival, patient survival or evolution of liver disease. Our experience therefore demonstrates that the use of kidneys from HCV(+) donors in HCV(+) recipients is a safe strategy in the long term and a wise way of using kidneys that would otherwise be lost at a moment of shortage.

Introduction: Deceased donor kidneys are allocated to adult candidates in the U.S. primarily by waiting time and HLA similarity. We investigated two alternative allocation systems, LYFT-SCD and LYFT-DY, that prioritize offers of kidneys to adults based in part on Life Years From Transplant (LYFT). Methods: Based on the characteristics of each candidate and donor, LYFT was calculated as the difference between expected median years of life for the candidate with that kidney donor transplant (tx) versus without a kidney. Years on dialysis were weighted at 0.8 of years with a functioning graft. The relative risk of graft failure for each donor was calculated from donor factors using a Cox model, along with the Donor Profile Index (DPI), the percentile score of that risk among kidneys used in 2003 (100% = greatest risk of graft failure). Using actual candidates, deceased donors and acceptance patterns from 2003, we modeled allocation with the Kidney-Pancreas Simulated Allocation Model (KPSAM). KPSAM sequentially offers kidneys to candidates according to user-specified allocation rules and models the probability of kidney placement and post-tx survival. The LYFT-SCD model allocated SCD kidneys by LYFT and ECD kidneys by dialysis years (DY). The LYFT-DY model allocated organs to adult kidney tx candidates according to the score 0.8·LYFT·(1−DPI) + DY·(0.8·DPI + 0.2) + 4·PRA/100, transcribed directly in the sketch after this abstract. Results: The table compares a year's worth of kidney-alone recipient characteristics, lifespan after tx and additional lifespan (versus dialysis) due to tx among all kidney (including SPK) recipients, and the donor/recipient age correlation (R) under current national kidney allocation, allocation using LYFT for SCD, and allocation using LYFT-DY within each donation service area. Conclusions: Allocation systems incorporating LYFT have the potential to increase the number of years of life attainable from tx and are being considered for the allocation of kidneys within the U.S.
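The LYFT-DY rule above is an explicit scoring formula, so it can be transcribed directly. The candidate values in the example are hypothetical; only the formula itself comes from the abstract.

```python
# Direct transcription of the LYFT-DY priority score from the abstract:
#   score = 0.8*LYFT*(1 - DPI) + DY*(0.8*DPI + 0.2) + 4*PRA/100

def lyft_dy_score(lyft_years, dialysis_years, dpi, pra):
    """Priority score for a candidate-kidney pair under the LYFT-DY model.

    lyft_years:     life-years-from-transplant estimate for this pair
    dialysis_years: candidate's accrued dialysis years (DY)
    dpi:            Donor Profile Index, 0..1 (1 = greatest failure risk)
    pra:            panel-reactive antibody, percent (0..100)
    """
    return (0.8 * lyft_years * (1 - dpi)
            + dialysis_years * (0.8 * dpi + 0.2)
            + 4 * pra / 100)

# A low-risk donor (low DPI) weights LYFT heavily; a high-risk donor shifts
# the weight onto accrued dialysis years instead. Values are hypothetical.
print(lyft_dy_score(lyft_years=6.0, dialysis_years=3.0, dpi=0.2, pra=10))
print(lyft_dy_score(lyft_years=6.0, dialysis_years=3.0, dpi=0.9, pra=10))
```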
Introduction: Organ shortage for transplantation has led to the use of marginal kidneys, which has been associated with poorer but still acceptable results. In our center, transplantation of two very marginal kidneys into a single recipient was used as a strategy to increase the number of organs available. The present study reports the renal function and survival of dual-kidney transplants (DKT) performed at our institution. Methods: From October 1999 to June 2007, 392 transplants were performed at our center, of which 63 were DKT. Kidney selection for DKT was based on refusal of the kidneys for single-kidney transplantation (SKT), donor age ≥60 years and ≥15% glomerulosclerosis on pre-implantation biopsy. The calculated creatinine clearance (crcl, ml/min) and graft and recipient survival of DKT were compared to those of SKT from ECD (based on UNOS criteria, n=66) and ideal donors (ID, aged 10 to 39 y, n=63) over a seven-year period. Results: DKT donors were significantly older than those of the two other groups (DKT: 69 years old, ECD: 62, ID: 24). Dual transplants were offered to older recipients by design (60, 50, 44). Delayed graft function was more prevalent for DKT and ECD than for ID (27%, 29%, 11%). Twelve-, 36- and 84-month crcl were similar for DKT and ECD but lower than for ID (12 months: 58, 59, 81; 36 months: 54, 60, 82; 84 months: 62, 51, 86, respectively). Patient survival, actuarial graft survival and actuarial graft survival censored for death were similar among the 3 groups. Conclusion: To our knowledge, this study reports the short- and long-term outcome of the largest cohort of DKT performed at a single institution. It shows that dual-kidney transplantation is a more complex intervention with more risk of surgical and post-operative complications (data not shown). However, with adequate surgical expertise, DKT patients can expect long-term results comparable to, or even better than, those of ECD. In our center, the use of DKT increased the number of transplantations by 19% over the study period. It allowed older patients to receive a graft within shorter delays. Moreover, it permitted a more judicious allocation of a resource that is still limited.

Results: Very few differences resulting from DonorNet were seen. For regionally exported kidneys, cold ischemia time (CIT) was 22.9h before DonorNet (BD) and 23.3h after DonorNet (AD); 6.4% of regionally exported kidneys had CIT >36h BD, as compared with 9.0% AD. For nationally exported kidneys, CIT was 27.3h BD and 26.8h AD; 17.9% of nationally exported kidneys had CIT >36h BD, as compared with 16.1% AD. The same HUCs for exported kidneys BD were also HUCs AD. Of the top 10 centers utilizing exported kidneys BD, only one was not within the top 10 centers utilizing exported kidneys AD. When we looked at CIT for kidneys exported to HUCs, again there was little difference before or after DonorNet, with a trend to possibly longer CIT for HUCs.
For kidneys regionally exported to HUCs, cold ischemia time (CIT) was 23.1h before DonorNet (BD) and 23.4h after DonorNet (AD); 7% of regionally exported kidneys had CIT >36h BD, as compared with 10.4% AD. For kidneys nationally exported to HUCs, CIT was 27.2h BD and 27.4h AD; 17.5% of nationally exported kidneys had CIT >36h BD, as compared with 16.8% AD. Conclusion: The first 6 months of national implementation of DonorNet did not succeed in decreasing CIT for regionally or nationally exported kidneys. Furthermore, the centers to which a high proportion of these kidneys were exported did not change as a result of DonorNet, and for these centers allocation efficiency is no better and might even be worse than before.

Cold machine preservation has been shown to improve outcome following transplantation of deceased heart-beating donor kidneys. To determine whether cold machine preservation also improves the outcome of kidneys donated after cardiac death (DCD), we undertook a multicentre randomized controlled trial in the UK comparing machine perfusion with simple cold storage. One kidney from each DCD donor was randomized to machine perfusion (Organ Recovery Systems LifePort device with KPS-1 solution), the other to perfusion with UW (ViaSpan) solution followed by simple cold storage. The primary outcome measure was the need for dialysis in the first 7 days (delayed graft function, DGF); secondary outcome measures included GFR at 1 week (MDRD technique) and 3- and 12-month graft and patient survival. At the time of writing, one-week outcome data are available for all patients. A sequential design was used, with the data inspected after 60 transplants and then every 20. All recipients received basiliximab, mycophenolate sodium, tacrolimus and prednisolone. The significance of the initial results was tested using paired t-tests and McNemar's exact tests. Recruitment was stopped after 46 donors, when the unadjusted sequential analysis concluded that there was no difference in the incidence of DGF. 91 of the 92 kidneys were transplanted; one was not used for anatomical reasons. One kidney suffered primary non-function; it had been machine preserved. The results shown below represent intention to treat.

Introduction: Split liver transplantation has been adopted as an alternative to expand the organ donor pool. While the adult/child split liver (A/CSL), in which an extended right graft (ERG) is transplanted into an adult and a left lateral segment (LLS) into a child, has gained wide acceptance within the transplantation community, the adult/adult split liver (A/ASL), in which the liver is split into a full right (FR) and a full left (FL) graft, is still performed very rarely. We analyzed the experience at our center with split liver grafts over a period of 10 years. Material and Methods: Between October 1997 and October 2007 we performed a total of 676 liver transplants in 625 recipients (370 transplants in 328 children and 306 in 297 adults). Among the 593 recipients of a primary isolated liver transplant (310 children and 283 adults), 281 (247 children and 34 adults) were transplanted with an A/C or A/A SL. The recipients of a whole-size graft (249 adults and 63 children) were used as controls. Results: Adults: 34 patients received a split liver graft (26 ERG grafts and 8 FL/FR grafts) and 249 a whole liver graft.
Overall, the incidence of biliary complications was 29% with a split liver graft and 15% with a whole liver graft (p=0.0506). For the recipients of a split liver graft, 1- and 5-year patient/graft survival was 91%/88% and 88%/85%, respectively. Recipients of a whole liver graft had 1- and 5-year patient/graft survival of 85%/84% and 80%/78%, respectively. Children: 247 children received a split graft (230 LLS, 12 ERG and 5 FL) and 63 a whole-size graft. Biliary complications occurred in 42% of the recipients of a split liver graft and 5% of the recipients of a whole-size graft (p<0.0001). Among the recipients of a split graft, 1- and 5-year patient/graft survival was 91%/84% and 87%/80%, respectively. The recipients of a whole-size graft had 1- and 5-year patient/graft survival of 84%/78% and 83%/74%. Conclusion: The incidence of biliary complications is significantly higher using a split graft. At a high-volume center, the use of split liver grafts achieved results comparable to, and even better than, those of whole liver grafts in terms of patient and graft survival. Thus, the use of split grafts should be strongly encouraged to expand the donor pool.

Groups of primary LDLT and DDLT recipients were identified by matching diagnosis, MELD score, and recipient age. Cost data, acquired from the hospital financial database and inflation-adjusted, and clinical outcomes were compared between the 2 groups. …

Background: Liver transplant recipients with a graft-to-recipient weight ratio (GW/RW) <0.8% are thought to have a higher incidence of post-operative complications, including small-for-size syndrome (SFSS), and overall inferior outcomes. We analyzed a cohort of partial liver graft recipients and compared those with GW/RW <0.8% to those with GW/RW >0.8%. Results: Between 1999 and 2007, 107 adult patients underwent partial-graft liver transplantation. Seventy-six grafts were from live donors (LDLT) and the remaining 31 were from deceased donor split-liver transplants (SLT). Of these, 22 patients had GW/RW <0.8% (12 LDLT, 10 SLT), and 85 had GW/RW >0.8% (64 LDLT, 21 SLT). Median follow-up was 46.1 months and did not differ between the two groups. Baseline demographics, including donor and recipient age and MELD at the time of transplant, also did not differ between groups (Table). Three-month and 1-year graft survival for the two groups was comparable. Three recipients with GW/RW <0.8% developed SFSS (13.6%); all three had inflow modification with splenic artery ligation (SAL), and 1 of the 3 grafts was lost at one year. Eight recipients with GW/RW >0.8% developed SFSS (9.4%, p=ns); six underwent SAL and none had graft loss at one year. There was a trend toward increased hepatic artery thrombosis with the smaller grafts that did not reach statistical significance (p=0.10). The incidence of other surgical complications was similar between the two groups (Table).

The diagnosis of rejection in post-transplant kidney disease depends largely on the assessment of biopsy pathology, which suffers from arbitrary scoring, subjectivity, and sampling variance. With microarray gene expression data from 186 biopsies for cause, we used Predictive Analysis of Microarrays (PAM) to build a rejection classifier. Because of the imperfect gold standard, the classifier cannot and should not have extremely high predictive accuracy. The goal was to see whether microarrays could detect a robust rejection signature despite this uncertainty, and to identify the top genes distinguishing rejecting from non-rejecting biopsies. By examining discrepancies between pathology diagnoses and PAM predictions, and using clinical follow-up information, we combined the strengths of each method to produce a more accurate diagnostic system. Biopsies with rejection plus clinical episodes (Fig. 1) have a higher positive predictive value than those lacking episodes. The majority of episodes occur in the top part of Fig. 1, indicating that a valid rejection signature is being detected. Of the exceptions, most had been treated with steroids before the biopsy, suppressing their gene expression patterns. Samples called borderline rejection separated into two distinct high/low-probability groups using PAM, with all the cases classified as clinical rejection episodes occurring in the high-probability group. Of the 39 genes chosen by the classifier, 36 had previously been annotated in our studies of mouse kidney graft rejection. GZMA, GZMB, and PRF1 ranked 26th, 32nd, and 38th, respectively. The top five genes were CXCL11, GBP1, CXCL9, INDO, and CXCL10, all previously annotated in rejecting mouse kidneys as interferon-γ-induced genes. In summary, gene expression-based classifiers, combined with histopathology, promise more accurate diagnostic assessment of the state of kidney transplants at the time of biopsy. Moreover, our data-driven classifier independently identifies the genes previously annotated in experimental studies.
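PAM is the nearest-shrunken-centroid classifier of Tibshirani and colleagues; scikit-learn exposes the same core idea as NearestCentroid with a shrink_threshold, which zeroes out uninformative genes. The sketch below uses a random stand-in for the 186-biopsy expression matrix, which is not reproduced here, and omits PAM's class-probability output; it illustrates the technique, not the authors' exact software.

```python
# Nearest-shrunken-centroid classification (the method behind PAM), sketched
# on a simulated expression matrix. Shrinkage drives most gene contributions
# to zero, which is how PAM selects a compact gene list (39 genes above).
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(0)
X = rng.normal(size=(186, 500))     # 186 biopsies x 500 probe sets (toy data)
y = rng.integers(0, 2, size=186)    # 1 = rejection per pathology, 0 = not
X[y == 1, :40] += 1.0               # implant a rejection signature in 40 genes

clf = NearestCentroid(shrink_threshold=0.5)  # larger threshold = fewer genes
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```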
Influence of Donor … Background: Renal cell-associated TLR4 activation was found to be important in mediating ischemia-reperfusion injury in murine kidneys. We hypothesized that genetic variation within the TLR4 gene of the kidney donor affects the rate of delayed graft function (DGF). Methods: We genotyped the functional TLR4 polymorphisms (SNPs) D299G (rs4986790) and T399I (rs4986791) in 268 kidney donors from 2 centers and correlated them with the occurrence of DGF. DGF was defined as the need for dialysis within 7 days post-transplant or a less than 25% drop in creatinine within the first 24 hours after transplant. In a sample of 67 patients from one center, we analysed intragraft HMGB-1 (high-mobility group box protein-1) gene expression in pre- and post-implantation biopsies. Statistical analyses were performed using the SPSS statistical package. Results: Both TLR4 SNPs were in Hardy-Weinberg equilibrium and were combined for further analysis. Recipients of a TLR4-mutated kidney showed a significantly lower rate of DGF (p=0.005), which persisted after correction for known donor-derived risk factors (donor age, ECD vs. DCD kidney, cold ischemia time; p=0.007, HR 4.42, CI 1.51-12.94). Additionally, we confirmed the presence of a possible TLR4 ligand, HMGB-1, in pre- and post-implantation biopsies. We also confirmed the presence of a functional TLR4 receptor in human proximal tubular cells, which expressed MCP-1, IL-1β, IL-6 and TNF-α after specific TLR4 agonist treatment. Conclusion: Human renal tubular epithelial cells express TLR4, and a functional TLR4 SNP in the donor kidney is associated with a lower rate of delayed graft function.

Background: Methylation of promoter CpG islands has been associated with gene silencing and demonstrated to lead to chromosomal instability. We postulate that differences in methylation patterns observed in tissues ranging from normal to cirrhosis to HCC may lead to the discovery of direct precipitating events that contribute to tumorigenesis in HCV-HCC patients.
Methods: DNA from 20 HCV-HCC tumors and corresponding non-tumor HCV cirrhotic tissues, as well as 20 independent HCV cirrhotic tissues and 20 normal liver tissues, was bisulfite-treated and hybridized to the Illumina GoldenGate Methylation BeadArray Cancer Panel I, interrogating CpG sites in the promoter regions of 808 distinct genes. For each CpG site, a summary statistic representing "percent methylated" was estimated, and a Jonckheere-Terpstra test was applied to identify whether there was a significant monotonic trend in percent methylated across the independent normal, cirrhotic, and HCC tissues. In addition, the paired HCC and non-tumor cirrhotic tissues were analyzed using a paired t-test. The q-value method for estimating gene-wise false discovery rates (FDR) was used to adjust for multiple comparisons; CpG sites with an FDR <0.05 were considered significant (a minimal FDR sketch appears after this group of abstracts).

AIM: To identify whether serological responses to allogeneic non-HLA renal compartment-specific antigens can be detected after renal transplantation (txp). METHODS: 36 paired pre- and post-transplant (mean time 25 months) serum samples from 18 pediatric kidney allograft recipients were used for identification of de novo non-HLA antibody formation, assessed using the Invitrogen ProtoArray (v3) and measuring post- minus pre-transplant signal. Probes from the ProtoArray and cDNA microarray platforms were re-annotated to current NCBI Entrez gene identifiers using AILUN. 34 cDNA arrays (GSE:3931) from 7 normal kidney regions (inner and outer cortex, inner and outer medulla, papillary tips, renal pelvis and glomeruli; see Table) were analyzed for compartment-specific genes by SAM, and the highest-ranked genes in each compartment (KS two-sample test p<0.001) were ranked against each individual patient's numerical antibody response across the 5057 proteins on the ProtoArray. RESULTS: In 83% of patients (15/18), kidney-specific serological responses against the renal pelvis were the first to be detected, followed by responses against the renal cortex. Control gene expression data sets from other solid organs (GSE:1133; lung, heart, pancreas) did not demonstrate immune-response enrichment. The presence of these antibodies was not associated with donor-specific HLA class I or II antibody levels. CONCLUSION: We demonstrated that circulating non-HLA, non-ABO antibodies form de novo after txp, irrespective of HLA antibodies, and are directed against kidney-specific protein targets. As these antibodies are not seen against other solid organs, the response is likely allogeneic. The renal pelvis and renal cortex have the highest immunogenic potential.

… We conclude that urinary cell mRNA profiles that include levels of mRNA encoding tubular proteins and mRNA encoding proteins implicated in EMT offer a noninvasive means of ascertaining renal allograft status with respect to the presence or absence of CAN.

… Functional annotation of these genes identified the cell cycle and carboxylic acid metabolism pathways as potentially relevant to both datasets. Altogether, these data suggest that, as compared with recipients requiring ongoing immunosuppression, operationally tolerant liver recipients exhibit traits of immunological quiescence and activation of natural killer-related pathways. Liver and kidney tolerant states appear to be fundamentally different biological processes, although some common pathways could be relevant to both settings.
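The methylation analysis above adjusts for multiple comparisons with the q-value method; the closely related Benjamini-Hochberg procedure conveys the same idea and is simple enough to show directly. The p-values below are fabricated placeholders, and the assumption of roughly independent tests is the sketch's, not the abstract's.

```python
# Benjamini-Hochberg FDR adjustment, a close relative of the q-value method
# used in the methylation analysis above. Input p-values are placeholders.
import numpy as np

def bh_fdr(pvals):
    """Return BH-adjusted p-values (q-value-like), in the input order."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    ranked = p[order] * n / np.arange(1, n + 1)      # p_(i) * n / i
    # enforce monotonicity, stepping down from the largest rank
    q = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(n)
    out[order] = np.clip(q, 0, 1)
    return out

pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(bh_fdr(pvals))   # sites with adjusted values < 0.05 would be called significant
```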
Introduction: Both normal and fibrotic renal allografts develop a large number of persistent changes in pro-inflammatory gene transcripts early after transplantation (1). The aim of the current study was to determine whether this "transplant effect" is attenuated by immunosuppression, fibrosis or HLA match. Methods: We studied 41 histologically normal implantation biopsies (T0) and 45 12-month protocol biopsies (T12) from living donor kidney transplant recipients who had no identifiable post-transplant complication (no acute rejection, polyoma virus, etc.). Patients were stratified into 4 distinct clinicopathologic groups. The first three groups had normal histology at both T0 and T12 and included: 1) HLA-identical, Tac-treated (n=10); 2) non-HLA-identical, Tac (n=19); 3) non-HLA-identical, Srl (Tac-free, n=8). A fourth group was histologically normal at T0 but developed mild fibrosis by T12 (n=7). mRNA expression in these biopsies was measured using the U133Plus2.0 microarray and custom TaqMan Low Density Arrays (TLDA). Data for each group were tested using a generalized linear model, and changes common to each dataset were identified (p<0.05). Results: In addition to transcript changes specific for each clinicopathologic condition, a large number of transcripts (n=1152) were altered in all groups. 68% (n=778) had higher expression in all comparisons at T12 versus T0, including fibronectin and collagen IV. 32% (n=374) of transcripts were considered down-regulated, including interleukins 6 and 1β. Gene Ontology and signaling pathway analyses showed many of the transcripts to be involved in pro-inflammatory and immune response signaling pathways (e.g., TLR, IL-10/-2). Custom TLDAs were used to study several genes considered not detected by microarray, including TGF-β1, CD-3E/-28/-74 and VEGF, all of which were significantly up-regulated in multiple comparisons. Conclusions: Persistent changes in pro-inflammatory transcripts occur in all renal transplants. This "transplant effect" cannot be explained by calcineurin inhibitors (it occurs with Tac-free immunosuppression), increased alloreactivity (it is similar in HLA-identical and non-identical pairs) or fibrosis. A likely cause is persistent change from IR injury. The role of the "transplant effect" in long-term allograft survival is unclear, but it likely interacts with other forms of graft injury.

BACKGROUND: A critical 26-gene set for acute renal rejection (AR) diagnosis in peripheral blood has previously been identified by our group using cross-platform hybridization of 91 unique peripheral blood samples with matched biopsy diagnoses, with AR (n=39) and without AR (STA; n=52), across 3 human microarray platforms: 30k cDNA Lymphochip, 44k Agilent and 54k Affymetrix HU133plus. In this study we proposed to verify and validate this gene set for AR diagnosis and prediction by peripheral blood PCR-based analysis. METHOD: 57/91 peripheral blood samples from the microarray set were selected for 384-well format multiplex ABI TaqMan qPCR validation of AR diagnosis across the 26-gene set. An independent set of 66 new, time- and immunosuppression-matched peripheral blood samples (30 STA, 36 AR) was used as a test set for blinded prediction of AR diagnosis using qPCR across the 8 most informative genes. 42 sequential samples from 14 AR patients (at AR, and 1-3 months prior to and after AR) were also examined by qPCR for these 8 genes for blinded prediction of AR prior to biopsy-proven AR. Prediction models were generated using logistic regression analysis, with p<0.05 considered significant.
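The prediction step just described fits a logistic-regression model over the 8-gene qPCR panel and applies it blind to a held-out test set. A minimal sketch with scikit-learn follows; the expression matrix is simulated, since the patient-level qPCR values are not reproduced here, and the split sizes merely echo the 57/66-sample design.

```python
# Logistic-regression classifier over an 8-gene panel, mirroring the blinded
# train/test design described above. All expression values are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(123, 8))        # samples x 8 genes (toy expression values)
y = rng.integers(0, 2, size=123)     # 1 = AR, 0 = stable (STA)
X[y == 1] += 0.8                     # implant an AR-associated expression shift

# hold out 66 samples for blinded testing, train on the remaining 57
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=66, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
prob_ar = model.predict_proba(X_te)[:, 1]        # per-sample AR probability
print("test accuracy:", model.score(X_te, y_te))
print("first 5 AR probabilities:", prob_ar[:5].round(2))
```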
RESULT: In the validation set of 57 samples, 8/26 genes were highly significant for confirming peripheral blood AR diagnosis (p<0.03). In the test group of 66 samples, AR was predicted with a very high level of sensitivity and specificity (PPV and NPV >90%). In the group of 14 AR patients with sequential samples pre- and post-AR, expression of these 8 genes was significantly elevated in the pre-AR samples (vs. STA, p<0.01) but did not differ between pre-AR and AR values. CONCLUSION: A highly specific and sensitive biomarker panel of 8 genes has been derived and tested by cross-platform microarray analysis. This gene set has now been validated and further verified by multiplex PCR for AR diagnosis and prediction, even 3 months prior to the rejection event. The utility of this gene set should be further validated in real time by serial longitudinal peripheral blood testing, for more sensitive and specific minimally invasive monitoring for graft rejection.

Tolerance/Immune Deviation II

Background: Our previous studies showed that regulatory T cells (Treg) execute suppressive function in both the inflammatory site of the allograft and the draining lymph node (dLN) to suppress allograft rejection. We explored how Treg influence the migration and function of antigen-specific effector T cells and dendritic cells (DC) at these two sites in an islet transplant model. Methods: Treg were sorted from wild-type or CCR7−/− C57BL/6 mice, labeled with PKH26, and transferred directly into islet allografts (BALB/c into C57BL/6). Effector CD4 T cells (Te) from alloantigen-specific T cell receptor transgenic mice were labeled with CFSE and transferred intravenously. Treg and Te migration to islet grafts and dLN was analyzed by flow cytometry and immunohistochemistry. Chemokine expression was determined by RT-PCR. CX3CR1-GFP mice, in which DC are internally labeled with GFP, served as islet donors, and donor DC migration was tracked with fluorescence microscopy. Results: During priming and rejection, islet allografts expressed a panel of inflammatory chemokines shortly after transplantation that recruited Te to the islets. Te accumulated and proliferated in both the islet allograft and the dLN. Concomitantly, donor-derived DC migrated from the islet to the dLN as early as 1 day after transplantation. Local transfer of Treg into the islet allograft inhibited islet chemokine expression, inhibited donor DC migration in an IL-10- and TGF-β-dependent fashion, and reduced Te migration to and proliferation in the graft. The transferred Treg also migrated from the islet allograft to the dLN, where they directly inhibited Te accumulation and proliferation as well as Te migration to the islet. CCR7−/− Treg, which could not migrate to the dLN but were retained within the graft, were far less effective than wild-type Treg in inhibiting Te migration and proliferation in both the islets and the dLN. Conclusions: Treg migrate to the islet and suppress parenchymal cell chemokine expression, DC migration, and Te responses in the islet allograft. Treg also migrate to the dLN to suppress Te proliferation and migration. Treg trafficking from the allograft to the dLN is crucial for suppressive function in order to target the separate effector mechanisms. These results demonstrate novel and important functions for migration in Treg-induced suppression and graft survival.
CD4+Foxp3+ regulatory T cells (Tregs) play an important role in transplant tolerance, yet basic aspects of Treg biology, including the mechanisms involved in their induction, remain unclear. α-CD45RB is a potent tolerogenic agent that induces a 2× increase in the number of Tregs in wt mice, even in the absence of allo-antigen. Here we used Foxp3-red fluorescent protein reporter knock-in mice to study Treg homeostasis and identify the mechanisms by which α-CD45RB induces Tregs. CFSE-stained, highly sort-purified Foxp3+ or Foxp3− cells were adoptively transferred into fully replete naive wt congenic mice. Whereas only 5-10% of transferred Foxp3− cells underwent homeostatic proliferation (HP) over 10d, 50-70% of Foxp3+ cells proliferated, a surprisingly high rate of HP. Moreover, treatment with α-CD45RB markedly enhanced HP by transferred Foxp3+ cells, resulting in a 10× increase in cell number. α-CD45RB induced de novo Foxp3 expression in 13-25% of transferred Foxp3− cells (many of which subsequently underwent HP). Thus, α-CD45RB both induces conversion of Foxp3− to Foxp3+ cells and promotes HP in Foxp3+ cells in the absence of exogenous antigen. We then addressed the signals controlling basal HP of Tregs and both the HP and the conversion mediated by α-CD45RB. CFSE-stained congenic CD4+ cells adoptively transferred into naive wt mice were untreated or received CsA, α-CD45RB, or both, and Foxp3+ and Foxp3− CD4 cells were assessed on d10. Calcineurin inhibition greatly reduced basal HP of transferred Foxp3+ cells compared to untreated mice. Moreover, CsA completely abrogated the increased Treg HP induced by α-CD45RB, although conversion still occurred. Next we assessed the role of TGF-β. We found that blocking TGF-β signaling with α-TGF-β shortened allograft survival but had no effect on basal Treg HP, or on Treg HP or conversion induced by α-CD45RB. Finally, although exogenous antigen is not required, basal and α-CD45RB-induced HP may still require recognition of self-Ag by Tregs. Indeed, preliminary results reveal that when CFSE-labeled CD4+ cells are transferred into MHC II KO mice, basal HP by Tregs is reduced and is not restored by α-CD45RB. Thus, α-CD45RB alters the normal controls regulating HP by Tregs. Moreover, both basal and α-CD45RB-mediated HP require intact calcineurin activity and TCR signaling, but not TGF-β. Understanding the signals controlling HP in Tregs may reveal new therapeutic targets for tolerance induction.

We report the first demonstration of a tolerance-inducing strategy based on post-transplant administration of allo-Ag-specific Treg (AAsTreg) and rapamycin (Rapa) in a fully allogeneic, unmanipulated mouse heart transplant model. Enrichment of naturally occurring AAsTreg represented the first step to counterbalance the high frequency of alloreactive T cells. Selection was achieved by co-incubation of freshly isolated CD4+CD25+ T cells with donor bone marrow-derived dendritic cells (DC). The source of the stimulatory factors necessary to sustain Treg proliferation was the supernatant of CD4+CD25− T cells co-cultured with allogeneic DC (MLRsup). Use of mature (CD86-high) DC favored extensive expansion of Foxp3− cells capable of IL-17 production (TH17). Co-culture of Treg with immature DC in the presence of MLRsup yielded a T cell population with a regulatory phenotype: Foxp3+, GITR+, CTLA-4+, CCR7+, CD62L−. These cells inhibited in vitro effector T cell proliferation at 1:20 and 1:40 Treg:T cell ratios.
The suppressive activity was Ag-specific; no significant inhibition was evident when effector T cells were stimulated by third-party DC. AAsTreg were then tested for their ability to induce transplant tolerance in a mouse heterotopic heart allograft model (BALB/c to C57BL/10; unmanipulated). Rapa was used: i) to inhibit effector T cell proliferation while sparing Treg activity, thus enhancing the in vivo Treg:effector T cell ratio; and ii) to promote resolution of the inflammatory state and preserve the susceptibility of conventional T cells to Treg suppression. Graft recipients received Rapa (1 mg/kg/d; d0-9) and 2x10^6 AAsTreg i.v. on d7. In comparison to the untreated group (median survival time, MST=11d; n=14), Rapa alone extended MST to 30d (n=7), with no long-term graft survival. Under cover of Rapa, AAsTreg exerted a profound tolerogenic effect: >80% of recipients exhibited long-term graft survival (MST>150d; n=6). This effect was stronger than that of polyclonal Treg administration: 40% long-term survivors (MST>50d; n=5). Moreover, the tolerogenic effect was Ag-specific, as AAsTreg selected against third-party DC (C3H/HeJ) did not prolong graft survival in comparison to the Rapa-only control group. These results indicate the feasibility and therapeutic potential of Ag-specific tolerogenic cell therapy based on post-transplant administration of selected AAsTreg.

Intravenous immunoglobulin (IVIg) is an effective treatment for T-cell-mediated graft rejection and autoimmune diseases. IVIg treatment is associated with rapid clinical improvement without the side effects of global immunosuppression. This prompted us to ask whether IVIg might directly enhance the suppressive function of CD4+CD25+Foxp3+ regulatory T cells in vitro and in vivo. In vitro, mouse CBA/Ca (H2k) total CD4+ or CD4+CD25− responder cells were stimulated with C57BL/10 (H2b) splenocytes, and human CD4+ or CD4+CD25− cells were stimulated with allogeneic antigen-presenting cells. In vivo, 1x10^5 total CD4+ or CD4+CD25− T cells from CBA/Ca mice were adoptively transferred into CBA/RAG1−/− mice, which were transplanted one day later with a tail skin graft from C57BL/10 mice. IVIg was administered i.v. on days 1, 3, 7, 10 and 14. Human serum albumin (HSA) was used as a control. Binding of IVIg and the activation status of Foxp3+CD4+ T cells were determined. In vivo, IVIg protected against T-cell-mediated rejection of the fully mismatched skin graft only when total CD4+ T cells, and not CD25-depleted T cells, were adoptively transferred (MST: IVIg >100 vs HSA 16 days, p<0.01). IVIg bound to 16±2% of Foxp3+ T cells, while binding to Foxp3− T cells was minimal. This binding was partially Fcγ-receptor mediated. Furthermore, IVIg treatment resulted in activation of CD25+ T cells, as detected by increased expression of phosphorylated ZAP70/Syk, which could be abrogated by specific tyrosine kinase inhibition. In vitro, IVIg inhibited the alloproliferative response of murine CD4+ T cells by 75±25%, but after depletion of CD25+ T cells this inhibition decreased to 17±12% (N=6, p<0.01). Similar results were found in human MLR, where depletion of CD25+ T cells resulted in a twofold reduction in IVIg-mediated suppression. Significantly, incubation of human CD4+CD25+ cells with IVIg enhanced their ability to suppress allogeneic T-cell proliferation (IVIg 63±8% vs HSA 37±10% inhibition, N=5, p<0.05). Our results identify a novel pathway through which IVIg treatment induces direct functional activation of both mouse and human regulatory T cells.
Immediate binding and activation of regulatory T cells is one of the mechanisms in the immunomodulatory repertoire of IVIg; it allows rapid inhibition of allogeneic responses and can therefore be a valuable tool after organ transplantation.

Foxp3 is a DNA-binding protein that is necessary for regulatory T cell (Treg) function, though recent data indicate that detection of Foxp3 mRNA or protein alone does not establish whether the associated Foxp3+ cell is functional. In this regard, our analysis of the Foxp3 sequence showed the presence of an RXXR motif, conserved between mice and humans, that is located 12 amino acids (aa) from the carboxy terminus of Foxp3 and is a potential recognition sequence for cleavage by proprotein convertase enzymes. We found that several proprotein convertases are expressed by naturally occurring Tregs, with further upregulation upon TCR activation. We generated an antibody against the 12-aa carboxyl peptide of Foxp3 and, by Western blotting, detected a peptide of about 1.3 kDa, consistent with a 12-aa carboxy-terminal Foxp3 cleavage product. Both cleaved and uncleaved forms of Foxp3 were resolved by SDS-PAGE, and the cleaved form was found only in the DNA-bound fraction. The requirement of proteolytic cleavage at the intact tetrabasic RXXR motif (414RKKR417) for full Treg function was shown by abolishing the motif through mutagenesis, followed by retroviral expression of the mutants and analysis of proteolytic cleavage by Western blotting. Assays of Treg function by cells expressing either long- or short-Foxp3 mutants showed that short-Foxp3 (missing the carboxy-terminal 12 aa) suppressed Teff cell proliferation significantly more effectively than WT Foxp3 or an engineered, cleavage-resistant mutant termed long-Foxp3 (p<0.01). Moreover, adoptive transfer of cells expressing short-Foxp3 prevented experimental colitis more effectively than WT-Foxp3 (p<0.01), and both were more effective than cells expressing long-Foxp3. Animals that received cells expressing short-Foxp3 gained weight and were protected from disease. Thus, the function of Foxp3 is regulated at a post-translational level by proteolytic cleavage, and the short form of Foxp3 represents the active form. Our data show that proteolytic cleavage of Foxp3 is key to the generation of functional Tregs, with enzyme(s) of the proprotein convertase family likely playing an important role in this process and providing an additional, hitherto unrecognized level of regulation for Foxp3.

TGF-β is secreted in a latent form, and active TGF-β is rapidly cleared from the circulation. To create a long-lasting active form of TGF-β, a mutant TGF-β was fused to a human IgG Fc component. The secreted human mutant TGF-β/Fc (hmTGF-β/Fc) fusion protein stained with both anti-human TGF-β and anti-human IgG antibodies, confirming the cytokine and isotype specificity of the TGF-β moiety and Fc domain. Moreover, in an in vitro bioassay, hmTGF-β/Fc inhibited IL-4-dependent HT-2 cell proliferation in a dose-dependent manner and had a circulating half-life of 36 hours in mice. In vitro, anti-CD3- and anti-CD28-triggered CD4+ or CD8+ T cell proliferation, measured by 3H-thymidine uptake, could be synergistically inhibited by TGF-β and rapamycin. At the same time, the two reagents added together synergistically promoted the induction of CD4+Foxp3+ T cells from peripheral naïve CD4+Foxp3− T cells.
In a pancreas islet transplantation model in which fully MHC-mismatched DBA/2 islets were transplanted into C57BL/6 mice rendered diabetic by streptozotocin, the mean survival time (MST) of islets was 19 days in control recipients (n=6). Administration of hmTGF-β/Fc resulted in delayed islet allograft rejection (MST 38 days, n=5). Treatment with rapamycin prolonged islet graft survival in 63% of recipients. In contrast, combined treatment with hmTGF-β/Fc and rapamycin, given as four doses, produced indefinite islet allograft survival in 83% of cases. We have produced a long-lasting, active hmTGF-β/Fc fusion protein. hmTGF-β/Fc is synergistic with rapamycin in converting naïve CD4+Foxp3− T cells into CD4+Foxp3+ Tregs and promoting long-term islet allograft engraftment.

The recognition of microbial motifs by the innate immune system, leading to the stimulation of adaptive immune responses, may explain the relationship between infections and susceptibility to graft rejection. A number of infectious agents have been reported to prevent the induction of allograft tolerance; however, none of these has been shown to reverse established tolerance, a situation that is highly relevant to clinical transplantation. We here report that established allograft tolerance (induced with anti-CD154 + donor-specific transfusion) can be reversed in a cardiac transplantation mouse model following infection with Listeria monocytogenes. This reversal of tolerance was dependent on the presence of both CD4+ and CD8+ cells, as well as on the TLR/IL-1R adaptor molecule MyD88. We hypothesized that the reversal of tolerance requires that alloreactive conventional T cells (Tconv) escape dominant regulation by pre-existing Tregs; these alloreactive Tconv can then proliferate, resulting in an increased Tconv:Treg ratio favoring the reversal of established tolerance. Indeed, we observed that tolerant grafts harbored low numbers of infiltrating CD4+ and CD8+ T cells (1-1.4x10^4/heart), with 10-43% of the infiltrating CD4+ cells co-expressing FoxP3 (N=10). Following the reversal of tolerance and allograft rejection, a 7-10-fold increase in CD4+FoxP3− and CD8+FoxP3− Tconv was observed in the graft, but no increase in the FoxP3+ subset. In contrast, we observed no significant changes in the splenic T cell subsets, raising the possibility that the escape from Tregs occurs locally within the graft. Infection of tolerant IL-6- or IFNaR1-deficient recipients with Listeria monocytogenes failed to reverse tolerance, consistent with the hypothesis that the initial escape of Tconv from regulation may depend on IL-6, while their proliferation may be type I IFN-dependent. In summary, the conditions for the reversal of tolerance are more stringent than those for its prevention, requiring MyD88 signaling and the presence of both CD4+ and CD8+ cells, as well as IL-6 and type I IFN signaling. These studies point to the potential impact of bacterial infections on established tolerance, and to novel strategies to monitor and facilitate the maintenance of tolerance in the clinic.

Our laboratory has identified Kα1 tubulin as an epithelial autoantigen to which an immune response occurs following human LTx. Further, we have shown a strong correlation between the presence of antibodies (Abs) to Kα1 tubulin and the development of chronic rejection, i.e., bronchiolitis obliterans syndrome (BOS).
The goal of our study was to test the hypothesis that epithelial damage due to alloimmunity results in remodeling that exposes otherwise cryptic self-antigens, including Kα1 tubulin and collagen type V (coll V), leading to cellular immune reactivity and Ab production against these auto-antigens, which may play a role in the pathogenesis of BOS. 10 patients who developed anti-HLA Abs and 10 patients with no detectable anti-donor HLA Abs post-LTx by Luminex assay were analyzed for the development of Abs to Kα1 tubulin and coll V using an ELISA developed in our laboratory. The ELISA for Kα1 tubulin used recombinant protein (1 µg/ml) purified on affinity resin; the ELISA for coll V used commercially available human collagen V (1 µg/ml). Elispot assays for IFNγ and IL-10 were performed on PBLs from 5 BOS− and 5 BOS+ patients (at the time of BOS diagnosis), after 3 stimulations with Kα1 tubulin protein. The levels of anti-tubulin Abs and anti-coll V Abs were significantly increased in BOS+ LTx recipients with anti-HLA compared to those with no anti-HLA (Table 1, p<0.05).

Surprisingly, both CD4 and CD8 subsets induced long-term survival of secondary test grafts (>100 days). However, only the CD4+ T-reg prevented TVS (NI=19±5, in 21±6% of vessels), as compared to CD8+ T-reg (NI=40±9, in 50±15% of vessels). The cytokine profile indicated a dominant IL-10 response. Allo-antibody analysis showed up-regulation of IL-4/IL-12-dependent IgG1 and IgG2c. Conclusion: Allochimeric protein therapy and CsA treatment generate T-reg that prolong graft survival, but only the CD4+ T-reg generated by allochimeric protein therapy prevent TVS. These CD4+ T-reg may be responsible for long-term graft maintenance through their unique ability to control anti-inflammatory and alloantibody responses.

Immunodominant H60. Background: Minor histocompatibility Ags (miHAs) are self-peptides derived from proteolytic processing of normal cellular proteins. miHA incompatibility can induce T cell responses to dominant miHAs and facilitate expansion of CTLs and graft rejection. However, the contribution of CTLs recognizing these miHAs in solid organ transplantation has not been fully evaluated. Methods: BALB.B (H-2b) donor hearts were transplanted heterotopically into C57BL/6 (H-2b) recipients. In vivo alloreactive CD8 T cells were monitored with peptide/MHC multimers. α-LFA-1 mAb was given to recipients to prevent rejection. Various numbers of H60-specific CD8 T cells, or CD8 T cells lacking H60 specificity, were adoptively transferred into BALB.B graft-bearing B6.SCID recipients. Results: 50% of transplant recipients developed acute rejection and showed markedly increased numbers of H60-specific CD8 T cells in the spleen at days 8-12. Abundant CD8 infiltration was found, and selective infiltration of H60-specific CD8 T cells into the graft was confirmed by flow cytometry and in situ tetramer staining. α-LFA-1 mAb treatment prevented acute rejection (MST>100) and allogeneic T cell expansion, and it profoundly attenuated neointimal hyperplasia. In graft-bearing B6.SCID mice, we found that the degree of acute rejection positively correlated with the number of H60-specific CD8 T cells transferred. However, transferred H60-specific CD8 T cells did not cause chronic rejection. Interestingly, greater numbers of CD8 T cells with H4 immunodominance were found in spleen, blood and graft at 50 days after adoptive transfer of CD8 cells lacking H60 specificity, compared to H60-specific CD8 T cell transfer.
Conclusion: H60 responses dominate other miHA immune responses during acute rejection of BALB.B-to-B6 cardiac allografts. We confirmed a role for immunodominant H60-specific CD8 T cells as pathological effector T cells in acute rejection but not in chronic rejection. Maintaining an H60 response may be beneficial for allotolerance, by suppressing miHA responses that could induce chronic rejection in long-term grafts.

Chronic lung allograft rejection, bronchiolitis obliterans syndrome (BOS), affects up to 60% of transplant survivors 5 years post-LTx. Human neutrophil peptides (HNP1-3/α-defensins) have been identified by proteomics in the lavage fluids of patients with chronic rejection. The goals of our study were to determine the role of anti-HLA Abs and defensins in BOS, and to define the interactions between α1-antitrypsin (AAT) and defensins in regulating the inflammation that leads to epithelial cell proliferation and BOS pathogenesis. BAL and serum samples (post-LTx and pre-BOS) from 21 BOS+ patients, 15 BOS− patients and 12 normals were analyzed by ELISA for α-defensins (HNP1-3), human β-defensin-2 (HBD2) and AAT. Small airway epithelial cells (SAEC) were treated with HNP1 or HNP2, with or without equimolar AAT, or with anti-HLA Abs, and analyzed for HBD2 production (ELISA), cytokines and chemokines (Luminex), and cell-surface adhesion molecules (ICAM, VCAM) by FACS. BAL and serum from BOS+ patients had high levels of HNP1-3 and HBD2 compared to BOS− recipients or normal sera (p<0.05) (Table 1). There was also a significant decrease in AAT levels in BOS+ compared to BOS− or normal serum (p=0.01) (Table 1). SAEC produced human β-defensins following anti-HLA Ab stimulation or HNP treatment (Table 2). Compared to untreated SAEC, treatment with HNP1 or HNP2 increased adhesion molecules (ICAM, VCAM; 2-fold), cytokines (IL-6; IL-1Rα, 2-fold; IL-1β; IL-13, 20-fold), chemokines (IL-8 and MCP-1, 2-4-fold) and growth factors (EGF and VEGF, 2.5-fold), and decreased IL-10 (2-fold); these changes were inhibited by AAT. Increased defensin and decreased AAT levels were seen in the lavage and serum of LTx recipients with BOS, and anti-HLA Abs further stimulated defensin production by SAEC. We conclude that chronic stimulation of epithelial cells by both defensins and anti-HLA Abs can lead to increased growth factor production, contributing to the pathogenesis of BOS.

Tubular atrophy/interstitial fibrosis (TA/IF) remains a major cause of late kidney allograft loss, and epithelial-mesenchymal transformation (EMT) is now appreciated as a key feature of this process. We hypothesize that macrophages play a direct role in the development of TA/IF by creating a profibrotic microenvironment and stimulating EMT within the allograft. In human kidney allografts with TA/IF (n=128), macrophage infiltration detected by immunostaining was significantly upregulated (mean score 2.8±0.5) compared to biopsies without corresponding pathological changes from recipients with stable function (n=51; 2.0±0.1; p<0.0001), and the extent of this staining correlated with the magnitude of graft dysfunction (p<0.0001). RT-PCR in TA/IF grafts also showed marked upregulation of the EMT markers αSMA (3.0±1.1-fold; p<0.05), S100A4 (8.7±2.7-fold; p<0.05), and vimentin (4.2±2.3-fold; p<0.01) compared to stable-function allografts.
To explore the macrophage-EMT relationship, we co-cultured freshly isolated human monocytes over a primary culture of human proximal tubular epithelial cells (PTECs), using culture inserts that allow media exchange but prevent contact between the two cell populations. By RT-PCR, co-cultured epithelial cells showed significant downregulation of BMP7 (0.45±0.09-fold; p<0.001), a negative regulator of EMT, and of the epithelial marker E-cadherin (0.001±0.001-fold; p<0.01), with marked upregulation of the EMT genes S100A4 (2.43±0.40-fold; p<0.001) and αSMA (5.31±2.23-fold; p<0.05), compared to PTECs cultured in media alone. To investigate the mechanism of monocyte-driven EMT, we evaluated the effects of Am80, a synthetic retinoid that also inhibits IL-6 and VEGF signaling. Am80 markedly reversed the transcriptional profile, with reduction of the EMT markers S100A4 (0.34±0.10-fold; p<0.01), αSMA (0.29±0.06-fold; p<0.05) and vimentin (0.45±0.13-fold; p<0.01) and a simultaneous increase in expression of BMP7 (151.7±53.9-fold; p<0.01), thus reversing the pro-EMT milieu. These results indicate that human monocytes induce transcription of EMT-related genes in PTECs, even in the absence of physical contact. Moreover, this phenomenon appears to be mediated in part by IL-6, VEGF, and other pathways acting through the retinoic acid receptor. Thus, in addition to their typical immune functions, macrophages support a milieu within allografts that promotes TA/IF in humans. Further investigation of this novel pro-TA/IF pathway may identify targets to ameliorate chronic injury and improve graft survival.

Old donor kidneys are more likely to develop long-term graft failure, especially if they are exposed to stresses (e.g. rejection). This limited ability of old tissue to withstand stress may be due to its reduced capacity for replication and regeneration. Telomere shortening determines lifespan and regenerative capacity, and we have shown that telomere shortening occurs in old human kidneys. Late-generation TERC KO mice with critically short telomeres have a reduced lifespan and show an accelerated aging phenotype, making them an ideal model of the situation in old human donors. This study addressed whether IRI causes greater damage in kidneys from late-generation TERC KO mice. We studied TERC wildtype (WT), early-generation (G1) and late-generation (G4) TERC KO mice at 1 day (WT: n=7; G1: n=8; G4: n=8), 3 days (WT: n=8; G1: n=8; G4: n=5), 7 days (WT: n=6; G1: n=8; G4: n=8) and 30 days (WT: n=8; G1: n=9; G4: n=6) after IRI. Acute tubular necrosis was found mainly on days 1 and 3 and was significantly greater in TERC KO G4 kidneys. Tubular atrophy (TA) and interstitial fibrosis (IF), reflecting chronic damage, were first detected at day 7 and increased in all mice, with a significantly greater extent of TA/IF in TERC KO G4 kidneys at day 30 (Fig.). The cell cycle inhibitor p21, a downstream mediator of senescence induced by telomere shortening, was significantly upregulated in all groups after IRI; p21 levels were highest in TERC KO G4 kidneys at day 30 (Fig.). Proliferative capacity, as measured by Ki-67 immunostaining at day 30, was significantly lower in tubular, glomerular and interstitial cells of kidneys from TERC KO G4 mice. We show that late-generation TERC KO mice with critically short telomeres have a greater susceptibility to acute injury and develop more chronic renal damage, likely due to the reduced capacity of these kidneys to proliferate and thereby regenerate.
Our data strongly suggest a pathogenetic role for telomere shortening, an important senescence mechanism, in the development of long-term graft failure.

Results: Belatacept treatment had no effect on the number of peripheral blood Treg cells or on their function in T cell suppression assays. The percentage of FoxP3+ cells was significantly elevated in rejecting kidney allografts from belatacept-treated patients compared to CNI-treated patients. Conclusions: Following chronic belatacept therapy, the number and function of peripheral blood Treg cells are maintained in kidney transplant patients. Belatacept enhances the Treg population in the allograft in patients with acute rejection. Therefore, our data suggest that co-stimulation blockade with belatacept does not affect Treg homeostasis. The increased number of Treg cells in rejecting allografts of belatacept-treated patients may provide a novel mechanism whereby belatacept can mitigate the severity of acute rejection and improve graft survival.

The steady-state AUC of N-desmethyl-AEB071 was minor in comparison to that of AEB071 and similar between the patient groups: 200 ± 90 ng.h/ml in transplantation vs 393 ± 220 ng.h/ml in psoriasis (p=0.08). Demographic covariates: In the first week posttransplant, with patients receiving fixed-dose AEB071, intersubject variability was 78% for C0 and 49% for AUC. AUCs were similar in men vs women (8350 ± 4163 vs 10784 ± 5369, p=0.26). Age, which ranged from 18-64 years, did not influence AUC based on regression analysis (r2 = 0.005, p=0.65). There was a borderline-significant negative correlation between weight (range, 51-110 kg) and AUC (p = 0.07); however, its clinical relevance was low in that it could explain <9% of the variability in AUC (r2 = 0.086). There was a significant positive correlation between AEB071 C0 and AUC (r2 = 0.724, p<0.001). Conclusions: (1) In the first week posttransplant, patients achieved the AEB071 blood levels anticipated for this regimen. (2) There was notable intersubject pharmacokinetic variability at this time, but it was not attributable to standard demographic factors such as sex, age, or weight. (3) A good correlation was noted between C0 and AUC, suggesting that C0 might serve as a marker for total drug exposure.
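The pharmacokinetic abstract above relates trough concentration (C0) to total exposure (AUC). For reference, AUC from a sparse concentration-time profile is conventionally obtained with the linear trapezoidal rule; the sketch below uses hypothetical sampling times and concentrations, not study data.

```python
# Linear trapezoidal-rule AUC over one dosing interval.
# Times (h) and concentrations (ng/mL) are hypothetical illustrations.
times = [0, 0.5, 1, 2, 4, 6, 8, 12]                  # hours post-dose
conc = [800, 1400, 1600, 1300, 900, 600, 400, 250]   # ng/mL

auc = 0.0
for i in range(1, len(times)):
    # each trapezoid spans two consecutive samples
    auc += 0.5 * (conc[i] + conc[i - 1]) * (times[i] - times[i - 1])

print(f"AUC(0-12 h) = {auc:.0f} ng*h/mL")
```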
We studied which IS protocol would be best after rapid discontinuation of P. Between 9/01 and 4/06, 440 1st and 2nd kidney tx recips were randomized: CSA-MMF (n=151) vs. high TAC-low SRL (n=149) vs. low TAC-high SRL (n=140) (for TAC and SRL levels, high = 8-12, low = 3-7). All received thymoglobulin (TMG) x 5 doses and P for 5 days; TMG was continued in DGF. Min f/u = 1 yr; mean = 42±19 mos. There was no diff between groups in recip age, gender, ethnicity, prim dis, donor source, % retx, or PRA, or in donor age, gender, or ethnicity. There was no signif diff between groups (intention to treat) in actuarial patient, graft, death-censored (DC) graft, acute rejection (AR), or biopsy-proven chronic rejection (CR) rates, in serum Cr level or calculated (MDRD) GFR, or in studied side effects. Cr (SD) at successive follow-up points: CSA-MMF 1.6(0.7), 1.7(0.9), 1.8(1.3), 1.9(2), 1.6(0.7); high TAC-low SRL 1.5(0.4), 1.7(1), 2(2.5), 1.7(1.1), 1.6(0.6); low TAC-high SRL 1.5(0.5), 1.5(0.7), 1.6(0.6), 1.6(0.6), 1.7(0.7). MDRD GFR (SD): CSA-MMF 63(24), 58(21), 60(26), 61(26), 62(20); high TAC-low SRL 64(20), 65(24), 61(27), 56(23), 63(28); low TAC-high SRL 65(20). The most common cause of graft loss was death with function (CSA 40%, high TAC 35%, low TAC 57%). The majority of recips in each group remained P-free (CSA 72%, high TAC 86%, low TAC 74%), but a large % were not on the medications to which they were randomized (CSA 16%, high TAC 51%, low TAC 42%). We found a signif ↑ of new-onset diabetes (NODM) (p=.03) in the TAC-SRL groups: CSA 2%, high TAC 11%, low TAC 5%. There was a trend toward more use of lipid-lowering medications in the TAC/SRL groups. In summary, all 3 IS protocols were effective; with min f/u of 1 yr for each group, patient and graft survival and AR rates are similar. We found increased NODM and possibly hyperlipidemia in the TAC/SRL groups.

Purpose: We present preliminary 2-year results from a worldwide trial comparing 2 SRL regimens with TAC+MMF. Methods: Renal transplant recipients (N=451) were randomly assigned to the following treatment regimens: Group 1: SRL (8-15 ng/mL, then 12-20 ng/mL after week 13) + TAC (6-15 ng/mL) with elimination at 13 weeks (n=155); Group 2: SRL (10-15 ng/mL through week 26, 8-15 ng/mL thereafter) + MMF (up to 2 gm/day) (n=155); or Group 3: TAC (8-15 ng/mL through week 26, 5-15 ng/mL thereafter) + MMF (up to 2 gm/day) (n=141). All patients received corticosteroids and daclizumab. In June 2006, Group 2 was terminated (after all patients were accrued) because of increased acute rejection (AR) rates. Results: Demographic characteristics were similar between groups except for more females in Group 3. Patient and graft survival were also similar among groups (see Table). Biopsy-confirmed AR (BCAR) was significantly greater in Group 2 compared with Groups 1 and 3 (p<0.001); most episodes occurred in the first 6 months posttransplant, with a preponderance during the first 3 months. Subtherapeutic SRL trough concentrations were reported in a large number of rejectors in Group 2. Most of the rejections in Group 1 occurred within the first 3 months, before TAC was eliminated. All ARs were mild to moderate; grade II ARs were proportionally greater in Group 3. Mean Nankivell GFR was numerically higher in Group 2. Preliminary results at 2 years show excellent patient and graft survival and similar renal function among treatment groups, despite higher AR rates in Group 2. Early adequate exposure to sirolimus is mandatory to achieve the desired results (low BCAR rates).

The goal of rapid discontinuation of prednisone (RDP) after kidney transplantation is to minimize prednisone-related side effects without increasing acute rejection (AR) rates or decreasing long-term graft survival. To date, studies have shown that RDP is associated with decreased prednisone (P)-related side effects, and randomized trials of RDP (vs. long-term P) have shown little or no ↑ in AR rates. However, concern remains that long-term graft survival will be worse with RDP protocols. We studied t½ (the time it takes for ½ of the grafts surviving at 1 year to subsequently fail) for 1st tx recipients treated with RDP (1999-2007) (n = 652) (antibody, CNI, antimetabolite [MMF or SRL], and RDP) vs. 1st tx historical controls (1995-2000) (n = 388) treated with a protocol of antibody, CNI, antimetabolite [MMF], and long-term P. Table 1 shows the characteristics of the 2 groups; RDP recipients were more likely to receive a living unrelated donor transplant. t½ is shown separately for living (LD) and deceased (DD) donor transplants in Table 2. There was no significant difference in t½ between groups. We conclude that, compared to historical controls, rapid discontinuation of prednisone can be done without any detrimental impact on long-term outcome. A prospective, randomized study to confirm this observation is necessary.
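The abstract above defines graft t½ as the time for half of the grafts functioning at 1 year to subsequently fail. The estimation method is not stated there; under a simplifying constant-hazard assumption, it can be sketched from the fraction of 1-year survivors still functioning at a later follow-up (the numbers below are hypothetical):

```python
import math

def graft_half_life(surviving_fraction: float, years_beyond_year1: float) -> float:
    """t1/2 (in years beyond year 1) assuming a constant hazard, i.e.
    S(t) = exp(-lam * t) among grafts still functioning at 1 year."""
    lam = -math.log(surviving_fraction) / years_beyond_year1  # hazard per year
    return math.log(2) / lam

# Hypothetical example: 80% of 1-year survivors still functioning 5 years later.
print(round(graft_half_life(0.80, 5.0), 1), "years")  # ~15.5
```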
We enrolled 152 patients (Table 1), with an average follow-up of 19.7 ± 11.7 months. Subjects 19-65 years old included primary and re-transplants, with primary endpoints of renal function and CAN. Follow-up averaged 16.6 ± 9.5 months in the 47 patients withdrawn from CI (26 from Group 3, 21 from Group 4). We found no increased rejection after CI discontinuation, and both the rATG induction dosing regimen and CI discontinuation significantly impacted renal function and the development of CAN. We successfully discontinued both calcineurin inhibitors and steroids with either single- or divided-dose rATG induction, and single-dose rATG induction was independently associated with improved renal function and reduced CAN. Study demographics: single-dose rATG (6 mg/kg x 1), Groups 1 (n = 38) and 3 (n = 38); divided-dose rATG (1.5 mg/kg x 4), Groups 2 (n = 37) and 4 (n = 39).

In subset analysis, we also find an enhanced negative impact of Srl when combined with steroids (Str) in patients without DGF. Conclusion: These data support previous findings that Srl may be associated with a sustained survival disadvantage apparent early post transplant, and this effect appears exacerbated when combined with DGF or Str. Potential explanations for this effect include Srl-associated prolonged early graft dysfunction, hyperlipidemia, or proteinuria. However, patients initiating Srl after discharge may not evidence this survival disadvantage, and this question was not addressed here. These findings may be of particular importance to centers employing combined Srl and Str therapy, as well as for defining the best mode of support during recovery from DGF. Prospective randomized studies would be necessary to evaluate these questions.

Table 1 shows CAI in both groups from 1 to 5 years. In the SRL group the doses of CNI were significantly lower from one through 5 years (p=0.05). CAI due to interstitial fibrosis/tubular atrophy was significantly lower in the SRL group at 5 years. Our data show that 5-year patient and graft survival, graft function, BPAR and SCAR were comparable between the MMF and SRL groups, despite the lower prevalence of interstitial fibrosis/tubular atrophy in the SRL group.

Registry analyses suggest that TAC/MPA immunosuppression is associated with superior kidney graft survival vs. TAC/SRL. Large single-center experience may assist in clarifying these findings by examining outcomes related to specific utilization practice. We retrospectively examined the outcomes of 529 consecutive first renal transplants (55% deceased donor, 45% living donor) at a single center, treated with TAC/SRL or TAC/MPA. Graft and patient survival, acute rejection rates, and 1-yr eGFR were analyzed by era of transplant (2000-2002 vs. 2003-2006). Changes in TAC/SRL utilization between eras included elimination of the SRL loading dose and a reduction in TAC target trough concentrations.

Summary: Compared to TAC, SRL+CYA was associated with a statistically significant 55% decreased risk of skin CA. Although SRL+CYA was associated with a 32% decreased risk of de novo solid CA, this did not reach statistical significance. The reduced risk may not reflect unique aspects of SRL itself, but may be a reflection of practice pattern, patient selection or center effect.
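For the registry comparison above, a "55% decreased risk" is the usual reading of a hazard (or risk) ratio, assuming the percentages were derived from such ratios; the conversion is simply

\[
\text{relative risk reduction} = 1 - \mathrm{HR}, \qquad 1 - 0.45 = 0.55 \;(55\%),
\]

and likewise a 32% reduction corresponds to a ratio of about 0.68, whose confidence interval evidently crossed 1 (hence "did not reach statistical significance").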
Background: A number of studies have observed an increase in malignancies following renal transplantation; however, the incidence and sites of malignancy differ considerably by follow-up time, era and region. Methods: We reviewed the records of 694 renal transplant recipients at our institute between 1970 and 2007 and recorded the incidence and types of de novo malignancies. Recipients were divided into two groups by immunosuppressive era: the azathioprine (AZA) era (1970.4-1982.3: n=172) and the calcineurin inhibitor (CNI) era (1982.4-: n=522). Results: A total of 57 kidney recipients (27 in the AZA era and 30 in the CNI era) out of 694 developed 60 malignancies. The tumors included 10 GI-tract cancers, 9 liver cancers, 12 skin cancers, 4 tongue cancers, 6 breast cancers, 6 renal cell carcinomas, 2 thyroid cancers, 5 leukemias, 3 lymphomas, one lung cancer, one uterine cancer and one Kaposi's sarcoma (KS). The average interval between transplantation and development of malignancy was 134±86 (range 8-340) months. Mortality was high for liver cancer (89%) and leukemia (100%). The cumulative incidence of malignancy among all 694 recipients at 5, 10, 20 and 30 years was 2.3%, 4.5%, 10.7% and 14.3%, respectively. The graft-loss-censored cumulative incidence, calculated to capture the incidence among graft survivors under continuing immunosuppression, was 2.8%, 6.3%, 16.3% and 26.2% at the same time points (a schematic calculation follows the next abstract). At 5, 10 and 20 years this was 3.0%, 6.8% and 13.9% in the CNI era vs. 1.8%, 4.9% and 19.5% in the AZA era, i.e., the early incidence was higher in the CNI era but was outstripped by the AZA era by 12 years. Malignancies occurring within 3 years in the CNI era, never observed in the AZA era, were concentrated in liver cancer, leukemia (including ATL), KS and PTLD. Discussion: Our results demonstrate that recent potent immunosuppressive regimens have shortened the interval between transplantation and virus-related malignancies. However, the long-term incidence of all malignancies has been decreasing at our institute with minimization of chronic immunosuppression.

The Impact of Transplant Center Practice on the Association between Pulsatile Perfusion and Delayed Graft Function. Jagbir Gill,1 David Gjertson,1 Suphamai Bunnapradist,1 Michael Cecka.1 1UCLA, LA, CA. The use of pulsatile perfusion (PP) is increasing in the US, but practice varies widely among transplant (Tx) centers. We describe the variability of PP use and its impact on post-transplant outcomes. Methods: We identified all cadaveric kidney Tx from 2000-2005 using OPTN/UNOS data. The cohort was stratified by Tx center PP use as follows: low PP centers (0-10% PP use), med PP centers (10-30% PP use), and high PP centers (>30% PP use). Donor characteristics and the incidence of DGF were compared between and within each stratum. Results: PP was used by 97% of centers; however, most centers (70.7%) used PP <10% of the time. Compared to low and med PP use centers, kidneys pumped in high PP centers were from donors that were younger, had a lower mean terminal serum creatinine, and had a lower incidence of CVA and hypertension. The overall incidence of DGF was lowest in the high PP centers (13.3%), compared to the med (24.7%) and low PP (24.8%) centers. The rates of DGF within each stratum (high, med, and low PP centers) for Tx performed using PP versus cold storage (CS) are outlined in the Table below. Within each stratum, the rate of DGF did not differ between Tx performed with and without PP. In ECD transplants, PP was associated with lower rates of DGF across all center groups. However, the impact of PP on SCD and DCD transplants was less significant and varied across centers by PP use.
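The graft-loss-censored cumulative incidence reported in the malignancy abstract above is, in survival-analysis terms, the complement of a Kaplan-Meier survivor function in which follow-up is censored at graft loss. A minimal sketch with fabricated durations, assuming the lifelines package:

```python
# Graft-loss-censored cumulative incidence of de novo malignancy as 1 - S(t).
# Durations (years to first malignancy, event=1, or to censoring at graft
# loss/end of follow-up, event=0) are fabricated for illustration.
from lifelines import KaplanMeierFitter

durations = [2.1, 5.3, 7.8, 10.2, 12.5, 15.0, 20.4, 25.1, 28.0, 30.0]
events = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)
cumulative_incidence = 1 - kmf.survival_function_["KM_estimate"]
print(cumulative_incidence)
```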
HLA-sensitized patients (PRA 20-100%) in our DSA are now transplanted at a rate of 40%, which is significantly higher (p = 0.02) than the 23% rate before UA were entered into UNet. Furthermore, of 9 kidneys imported into our DSA and allocated to the DSA-wide renal candidate list (since UA entry started), 63% (5/9; PRA 85% to 27%) were transplanted into sensitized candidates, each of whom had a negative flow or AHG T cell IgG crossmatch. Conclusion: The higher transplantation rate for HLA-sensitized patients in our DSA shows that virtual A, B and C crossmatching yields a DSA-wide ranked list of sensitized candidates likely to have a negative final class I (T cell) crossmatch and be transplanted. The data also lend support to the notion that sharing kidneys across DSA boundaries for HLA-sensitized candidates, based on a negative virtual HLA class I crossmatch, has merit.

Introduction: Predicting graft outcome after renal transplantation based on donor histological features has remained elusive and is subject to institutional variability. We propose a pre-transplant donor pathology scoring system that reliably predicts graft outcome regardless of recipient characteristics. Methods: We retrospectively analyzed 286 imported cadaveric renal transplants, performed between 1/00 and 6/05, that had initially been rejected by other centers because of donor parameters. All kidneys were re-biopsied at our center prior to implantation. Morphometric analysis consisted of measuring glomerular-size arterioles, interlobular arteries, and arcuate/interlobar arteries, with calculation of the wall-to-lumen ratio (WLR); calculating % glomerulosclerosis (GS); and assessing the presence of arteriolar hyalinosis (AH), scar, periglomerular fibrosis (PGF), and acute tubular necrosis in the biopsies. The patients were followed for a mean of 33 months. Multivariate Cox analysis was done to evaluate the predictive value of these pathology variables for graft outcome. Results: AH, GS>15%, WLR>0.5, PGF, and scar were found to independently predict graft outcome.

The UNOS Board of Directors has approved the change from using Panel Reactive Antibodies (PRA) to calculated PRA (CPRA). The CPRA is a formulated PRA based upon the frequencies of the HLA antigen specificities found in the donor pool and is expected to standardize the degree of patient sensitization (a schematic calculation follows the next abstract). Donor organs expressing unacceptable antigens will not be offered to a recipient with donor HLA antigen-specific antibodies (DSA). Highly sensitive single-antigen bead and solid-phase assays (Flow PRA and Luminex) are used to identify these HLA antibodies (Abs). Each transplant center can determine the criteria used for identifying an unacceptable antigen, for example based upon DSA titer or the fluorescence intensity (FI) of the donor-specific single-antigen bead. It is unclear whether Abs identified by these techniques are clinically relevant for organ allocation. We retrospectively evaluated Flow-PRA, flow cytometry crossmatching (FCXM), and HLA Ab specificities and titers in 300 pre-transplant (Tx) sera from recipients of deceased-donor renal allografts transplanted following a negative cytotoxic anti-human globulin crossmatch.
The two-year graft survival of 91% for the 81% of recipients (44/54) with low-titer (≤1:16) donor-specific HLA Ab and a negative (-) FCXM was significantly better than the 60% two-year graft survival of the 19% (10/54) of recipients presenting with (+) DSA and a (+) FCXM (p<0.001). Recipients (55/110, 50%) with non-donor-specific HLA Abs (high or low titer) and a (-) FCXM also experienced a better two-year graft survival, 89%, compared to 74% for the other 50% of recipients with (+) FCXMs (p<0.001). These data suggest that in the presence of donor-specific or non-donor-specific HLA Abs, one cannot predict the crossmatch outcome without actually performing the crossmatch, which will then influence donor organ allocation and graft survival outcome. In the face of low-titer DSA and a (-) FCXM, recipients experienced excellent graft outcomes compared to recipients with (+) FCXMs. Therefore, donor organ allocation based on CPRA (Ab specificity and unacceptable Ags) utilizing highly sensitive single-antigen bead and solid-phase Luminex assays may disadvantage recipients (no donor crossmatch) who could otherwise be successfully transplanted.

Aim: To assess VEGFr2 expression in HCC and the adjacent benign cirrhotic parenchyma and its correlation with tumor differentiation, vascular invasion, and tumor morphologic parameters. Background: HCC is a highly vascular tumor in which angiogenesis is mediated in part by VEGF. VEGF is highly expressed in HCC and mediates its effects through multiple receptors, including VEGFr2. The tyrosine kinase inhibitor sorafenib inactivates the VEGFr2 receptor and exhibits anti-tumor effects in HCC. The clinical significance of VEGFr2 expression with respect to tumor parameters has not been evaluated. Patients and Methods: Immunohistochemical staining for VEGFr2 was performed in HCC and the corresponding adjacent cirrhotic liver from 79 patients undergoing liver transplant; 57 patients had HCC within MC. Stains were scored by estimating the % of positive surface area in veins, arteries, and sinusoidal lining cells. Data are presented as median [P25, P75]; Wilcoxon signed rank and Wilcoxon rank sum tests were used. Results: VEGFr2 levels in HCC were significantly correlated with levels in adjacent non-tumorous liver: higher levels of VEGFr2 in non-tumorous liver were associated with higher levels in HCC from the same patient. VEGFr2 levels were significantly higher in HCC compared to adjacent areas (P<0.05). VEGFr2 levels in HCC were not significantly different between patients who fell within MC and those beyond MC. However, VEGFr2 levels were significantly higher in the adjacent arteries of non-tumorous liver (15.8 [0, 60] vs. 0 [0, 35]; P=0.03) in those patients with HCC beyond MC. Subjects with moderate or poor differentiation had significantly higher levels of VEGFr2 in the sinusoids and veins of HCC and in the sinusoids of adjacent non-tumorous liver. There was no correlation with vascular invasion. Conclusions: Elevated VEGFr2 in HCC correlates with elevated VEGFr2 in adjacent cirrhosis, suggesting that high-expressing HCC arise in a high-VEGFr2, pro-angiogenic environment. Moreover, higher VEGFr2 expression in background cirrhosis correlates with advanced HCC (beyond MC), suggesting that anti-angiogenic agents may prevent tumor formation or progression in cirrhotic patients. This novel concept warrants further study.
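The CPRA described two abstracts above is, conceptually, the probability that a randomly drawn donor expresses at least one of a candidate's unacceptable antigens. The toy sketch below assumes independent antigen frequencies and invented numbers; the actual UNOS calculation is based on HLA haplotype frequencies in the donor pool.

```python
# Toy CPRA-style estimate: chance a random donor carries >=1 unacceptable antigen.
# Antigen frequencies are invented; the real CPRA uses haplotype-frequency tables.
unacceptable_freqs = {"A2": 0.28, "B44": 0.12, "DR4": 0.15}

p_donor_clear = 1.0
for freq in unacceptable_freqs.values():
    p_donor_clear *= 1.0 - freq  # donor lacks this antigen (independence assumed)

cpra = 1.0 - p_donor_clear
print(f"CPRA ~ {cpra:.0%}")      # about 46% with these made-up frequencies
```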
We further attempted to find a subgroup of patients, combining tumor size, number of nodules and various levels of AFP, which together would correlate strongly with PDIFF tumors. However, the distribution was erratic, and even among extreme outliers (>4 nodules, >8 cm in maximum size, with or without high AFP), where transplantation is currently not indicated by any current expanded tumor inclusion criteria, the incidence of PDIFF did not exceed 42%. Conclusions: PDIFF is most highly associated with tumors that exceed the Milan criteria and the expanded criteria currently used for liver transplantation. Thus, tumor biopsy would not appear to be of benefit, except perhaps in those patients with a high AFP level. However, if tumor criteria are further extended in the future, tumor biopsy may be justified in those with higher tumor burdens.

… at LS=T1 (1 tumor < 2 cm), 2,511 (85.7%) at LS=T2 (1 tumor < 5 cm or 3 tumors ≤ 2 cm), and 111 (3.8%) with LS>2. Results: Overall survival at 36 months was 76.5%, with significant differences by listing stage (LS 1 = 82.2%, LS 2 = 75.9%, LS>2 = 73.5%; p=0.0395) and by histologic stage as determined by pathology (p<0.0001). Survival for patients with histologic stage 4b HCC was only 29% at 36 months, versus 83% for stage 1. Patients with tumors greater than 3 cm, both by listing stage and by pathology, fared poorly compared to those with smaller tumors (p=0.002). Patients who received ablation treatment (AT) preoperatively and whose tumors were "down-staged" had better results (p=0.0061). We observed no difference between AT types (TACE vs RFA). Those with micro- or macrovascular invasion had lower survival rates, at 66.8% and 30.5%, respectively (p<0.0001). AFP >500 continues to be a significant predictor of lower post-transplant survival, with a survival rate of 57.9% at 36 months (p<0.0001). Conclusions: We conclude that listing and histologic tumor size, the presence of micro/macrovascular invasion, and high AFP are associated with poorer LT results. AT is emerging as a potentially effective treatment for improving LT outcomes in HCC recipients.

Introduction: To date, the Milan criteria have been used to select patients with hepatocellular carcinoma (HCC) for liver transplantation (LT). Herein we compare prognostic markers in patients pretreated by transarterial chemoembolization (TACE) with a second cohort transplanted without TACE pretreatment. Patients and methods: Between September 1997 and October 2007, 134 patients with HCC underwent LT at our institution. Eighty-two patients were pretreated by repeatedly performed TACE, whereas 52 patients received either no pretreatment or other forms of pretreatment (non-TACE group). TACE was performed using lipiodol and mitomycin and was repeated every 6 weeks until transplantation. Tumor response was assessed by CT scans at 6-week intervals. Results: Sixty-seven percent of the patients transplanted after TACE pretreatment exceeded the Milan criteria, compared to 37% of non-TACE patients. The proportion of recurrence-free patients was comparable in the TACE and non-TACE groups (77.32% and 78.45%, respectively). In univariate analysis, grading, angioinvasion and progression-free TACE were significant predictors of recurrence after LT in patients with TACE pretreatment; progression-free TACE was the only significant predictor of recurrence in multivariate analysis. In the non-TACE group, T classification, number of nodules, grading, angioinvasion, Milan criteria and underlying disease (HCV versus all other diseases) were significant.
After TACE pretreatment, freedom from recurrence was 92.3% in patients with stable disease or regression, but only 38.4% in patients with progression during TACE. Conclusions: TACE pretreatment is capable of selecting a biological entity of tumors that differs significantly from untreated tumors. This statement is deduced from the remarkable differences in predictors of tumor recurrence between patients with and without TACE pretreatment. Moreover, TACE patients can be separated into two groups: those who experienced tumor progression during repeatedly performed TACE and those who did not. Stable disease during continued TACE before transplantation resulted in remarkably low tumor recurrence.

With a median follow-up of 30 months, there was no statistical difference in 5-year overall survival between the M (76%) and M+ (69%) groups (p=0.40). The 5-year disease-free survival was significantly higher in the M (80%) than in the M+ (61%) group (p=0.02). When stratifying by UCSF criteria, the 5-year disease-free survival was 80% for Milan vs. 72% for UCSF vs. 54% for beyond-UCSF tumors (p=0.01). Univariate analysis demonstrated the following factors to be associated with disease recurrence: intermediate waiting time of 2-4 months (p=0.01), preoperative TACE (p=0.002) or resection (p=0.04), more than 3 tumors (p=0.03), and maximum tumor size over 5 cm (p=0.03). Multivariate analysis controlling for age, gender, and waiting time demonstrated that preoperative resection (HR 2.78; 95% CI 1.1-7.2), more than 3 tumors (HR 2.3; 95% CI 1.1-4.7) and maximum tumor size greater than 5 cm (HR 2.68; 95% CI 1.2-5.9) were independently associated with disease recurrence. Conclusions: Overall survival after liver transplantation is excellent in both the M and M+ groups, far exceeding survival rates that can be obtained via any other modality. The current UNOS HCC algorithm should be reconsidered, to allow extra listing points for selected patients with HCC that exceed both the Milan and UCSF criteria.

Postoperative Use of Intense Insulin Therapy in Liver Transplant Recipients. Lama M. Hsaiky,1 Iman E. Bajjoka,1 Dhaval Patel,1 Marwan S. Abouljoud.1 1Transplant Institute, Henry Ford Hospital, Detroit, MI. Hyperglycemia and insulin resistance are common post liver transplant, even in patients with no history of diabetes. In the general surgical intensive care unit (SICU) population, intensive control of blood glucose has recently been shown to reduce both morbidity and mortality; thus far, limited data exist for the liver transplant population. Purpose: To assess the impact of intensive insulin therapy to maintain blood glucose at or below 110 mg/dL immediately post liver transplantation in the SICU. Methods: A retrospective evaluation of liver transplant recipients who received two different insulin protocols in the SICU was performed. Prior to January 2003, patients were assigned to receive sliding-scale insulin therapy (SSIT) to maintain blood glucose (BG) below 180 mg/dL. Intensive insulin therapy (IIT) was implemented in August 2005, with a goal BG of 80-110 mg/dL. The following data were analyzed: BG ranges, need for mechanical ventilation, blood transfusions, infection rate in the SICU, rejection episodes and patient survival. Results: A total of 97 liver transplant patients were evaluated, of whom 47 were in the SSIT group and 50 in the IIT group. Demographic characteristics were comparable between the two groups.
In the IIT group, 24% of the BG readings were maintained at <110 mg/dL, versus 5% in the SSIT group. The incidence of hypoglycemia (BG <60 mg/dL) was less than 1% in both groups. The mean duration of mechanical ventilation was 3.0 vs. 2.3 days, and the mean number of blood transfusions was 5.0 vs. 2 units (P<0.05), in the SSIT vs. IIT groups, respectively. IIT also reduced the overall SICU infection rate by 15% (P=0.01). The rate of acute cellular rejection at 3 months post transplant was lower in the IIT group, 14% vs. 25% in the SSIT group (P=0.02). Moreover, mortality during the hospital stay was reduced from 5% in the SSIT group to 2% in the IIT group. Conclusions: The use of intense insulin therapy immediately post liver transplantation reduced infection rates and rejection episodes, with a trend toward reduced morbidity and mortality among post-surgical liver transplant recipients, without the adverse effects of hypoglycemia.

Medical Epidemiology of Patients Surviving Ten Years after Liver Transplantation. Kerri A. Simo,1 Stephanie E. Sereika,1 David A. Gerber.1 1Abdominal Transplantation Division, Department of General Surgery, University of North Carolina, Chapel Hill, NC. Background: As the population of long-term survivors of liver transplantation (OLT) grows, their medical epidemiology has become increasingly important. The goals of this study were to define a collective profile of liver transplant recipients ≥10 years post OLT and to compare their co-morbidities with those of the general population. In 2005, the National Health Survey reported that 22% of the US population had hypertension, 14.4% had diabetes/impaired fasting glucose, and 16.8% had chronic kidney disease. Methods: A retrospective review of a prospectively collected database of 125 adult patients who underwent OLT at a single transplant center from September 30, 1991 to October 1, 1997 was performed. Inclusion criteria consisted of survival ≥10 years post OLT with >1 year of follow-up. Results: Seventy-one patients met the inclusion criteria; ninety percent had ≥10 years of follow-up. The mean age at transplant was 43 (range 18-67). The mean calculated MELD score was 22 (range 9-62, median 22). Indications for OLT were HCV (32%), alcohol (28%), cryptogenic (18%), autoimmune (10%), HBV (7%), PSC (7%), PBC (7%), and other (10%). Seven patients required retransplant during the first 10 years. An additional 38 patients underwent other operations: 10 arterial or biliary revisions, 13 hernia repairs and 19 non-transplant-related procedures (7 abdominal). On analysis, the following medical co-morbidities were found: 51 patients (72%) had hypertension (41 new onset); 20 (28%) diabetes (10 new onset); 33 (46%) renal insufficiency and 8 renal failure (28 new onset); and 10 (14%) cardiovascular disease (all new onset). Nine patients (13%) were diagnosed with de novo cancer. Medications for chronic health problems included diabetic medications in 18 patients, antihypertensives in 46, and lipid-lowering agents in 18. Initial immunosuppression consisted of steroids in 100%, cyclosporine in 61%, tacrolimus in 40%, mycophenolate mofetil (MMF) in 24% and azathioprine in 23%. Immunosuppression at 10 years consisted of cyclosporine in 42%, tacrolimus in 41%, steroids in 18%, MMF in 13%, sirolimus in 8%, mycophenolate sodium in 1% and azathioprine in 1%. No patients were on triple therapy; 29 were on dual therapy and 42 on monotherapy.
Summary: Patients alive 10 years post OLT have a significantly higher incidence of hypertension, diabetes, and renal disease than the general population. This study supports conscientious medical follow-up to ensure continued meaningful survival.

Liver Transplantation in the Morbidly Obese (BMI>40). C. Quintini,1 L. Kauzman,1 K. Hashimoto,1 P. Ding,1 T. Doago Uso',1 N. Sopko,1 J. Rosenblum,1 F. Aucejo,1 C. Winans,1 D. Kelly,1 B. Eghtesad,1 D. Vogt,1 J. J. Fung,1 C. Miller.1 1General Surgery - Liver Transplant Service, Cleveland Clinic, OH. Morbid obesity (MO) is a problem seen with increasing frequency among candidates for OLT, and MO is considered a contraindication to OLT in some centers without clear evidence to support such a practice. Our aim is to describe outcomes of OLT in patients with a BMI >40 kg/m2 at a single center. METHODS: Between 1/2004 and 4/2007, 364 OLTs were performed in 347 patients. The 20/347 patients with a BMI >40 were compared to all other OLT patients with a BMI <40. We analyzed patient and graft survival, operative time, blood transfusion requirements, and post-operative events (ICU and overall length of stay [LOS], surgical complications and infections). We also analyzed the post-transplant weight records of our study group at 1, 3 and 6 months. RESULTS: Results are summarized in Table 1. Outcomes of OLT in the morbidly obese are no worse than those of other patients undergoing OLT. BMI is often artificially elevated in end-stage cirrhotic patients due to severe fluid retention; these patients exhibit rapid weight loss following transplantation that is likely due to extracellular fluid loss from the improved homeostatic milieu provided by the new liver and possible improvement in renal function.

The presence of obstructive CAD was not associated with increased peri-operative morbidity or mortality. These patients experienced similar patient and graft survival irrespective of the degree of CAD. Significant unmodified CAD should not represent an absolute contraindication to liver transplantation.

The effect of tricuspid regurgitant jet (TRJ) velocity on survival after OLT was also examined. We found that using a TRJ of 3.0 m/s as a cut-off created two age-matched groups with significantly different survival curves at one year (p = 0.006), with a relative risk of mortality of 3.98 for TRJ ≥3.0 m/s. This study underscores the importance of screening TTE for the presence of PPH in the evaluation of patients for OLT, suggesting that clinically significant PPH may be present at lower TRJ velocities than those that typically prompt right heart catheterization. Limitations of the study include the variability of the time from pre-OLT echo to transplant and the assumption of RAP = 5 without assessment of IVC size. We are using these data as a framework for further prospective trials to better assess the clinical applications of the findings.
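The TRJ analysis above rests on the standard modified Bernoulli estimate of right ventricular systolic pressure from the regurgitant jet velocity, with the stated assumption of a right atrial pressure (RAP) of 5 mmHg. At the study's cut-off of v = 3.0 m/s:

\[
\mathrm{RVSP} \approx 4v^{2} + \mathrm{RAP} = 4\,(3.0)^{2} + 5 = 41\ \text{mmHg}.
\]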
Hyperlipidemia has been shown to predict faster chronic kidney disease (CKD) progression over the long term in lung allograft recipients. It is unknown whether disordered lipid metabolism may also aggravate the early loss of renal function often seen in this patient population in the immediate post-operative period. We studied 230 lung allograft recipients transplanted between January 1997 and December 2003. Pertinent demographic and clinical variables were recorded at baseline and one month post-transplant, including creatinine levels and fasting lipid panels. Logistic regression models were created to investigate an independent association between lipid levels and the change in renal function by one month post-transplant. Mean ± SD baseline creatinine was 0.8 ± 0.2 mg/dl and low-density lipoprotein (LDL) was 110 ± 35 mg/dl, the latter remaining unchanged at one month. In contrast, by one month post-transplant the mean creatinine level of survivors increased significantly to 1.2 ± 0.6 mg/dl (p < 0.001), an overall mean increase of 52% above baseline. Patients in the highest quartile, who fared the worst, experienced a rise in creatinine >72% above baseline. On univariate analysis, there was a strong trend toward increased odds of a creatinine rise >72% above baseline in those with one-month LDL values in the highest quartile (i.e., >140 mg/dl) (OR 1.8, p=0.09). After controlling for age, gender, pre-transplant creatinine, BMI, and the presence of diabetes prior to transplant, an LDL value in the highest quartile by one month post-transplant was the only variable independently predictive of a rise in creatinine >72% above baseline at one month (OR 2.5, p=0.05). In summary, hyperlipidemia occurring early post-lung transplant predicts faster loss of renal function soon after surgery. Though speculative, given the known beneficial effects of lipid-lowering agents such as statins on the rate of CKD progression, timely initiation of these medications after surgery may perhaps also attenuate the early decline in renal function often observed in this patient population.

The risk of acute cellular rejection (ACR) following lung transplantation (LTx) remains a problem despite heavy multidrug immunosuppression. Induction therapy with potent T-cell-depleting agents has facilitated the implementation of minimal post-transplant immunosuppression. The impact of this protocol on the activation of proinflammatory cytokines and effector molecules that affect the cellular rejection process is not well determined. In this study, we evaluated the relationship between the upregulation of T-cell- and macrophage-dependent inflammatory cytokines detected by molecular methods and the clinical status of LTx patients. We studied 28 LTx patients who received anti-lymphocyte induction therapy (Thymoglobulin, n=8, or Campath-1H, n=20) followed by maintenance immunosuppression with tacrolimus and low-dose steroids. We analyzed 137 bronchoalveolar lavage (BAL) mRNA samples (3-8 per LTx patient) by real-time PCR for GZMB, IFN-γ, IL-15, MCP-1, RANTES, TNF-α, and GAPDH (control). The ABI Prism 7700 SDS and 7500 Fast Real-Time PCR Systems were used, and data were analyzed by the ΔCt and 2^−ΔΔCt methods. We determined the relationship of categorical outcomes (rejection vs. no rejection) with continuous variables (number of cycles) by ANOVA. Early ACR, occurring within the first 60 days post-LTx, was associated with an increase (>3-6-fold) of macrophage-specific mRNA (MCP-1, RANTES, TNF-α). 7/8 Thymoglobulin-treated LTx patients experienced early ACR, while only 5/20 Campath-1H-treated patients had early ACR (p<0.005). In contrast, late ACR, occurring more than 60 days post-LTx, was associated with significant upregulation (4-64-fold) of T-cell-dependent mRNA (GZMB and IFN-γ), in addition to increases in RANTES and MCP-1 mRNA. Overall, LTx patients who experienced early or late ACR (n=20) exhibited higher mRNA levels for the above mediators compared to stable patients (n=8). Our data indicated that in the early period post-T-cell depletion, the ACR phenotype was characterized mainly by macrophage activation. In contrast, greater than 3 months post-LTx, with the recovery of CD8+ T cells, the ACR phenotype was associated with cytotoxic T-cell activation. Furthermore, sensitive molecular methods may detect the activation of pro-inflammatory mediators within the allograft prior to the diagnosis of rejection by transbronchial biopsies, and may impact optimal patient management.
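The fold-change values in the abstract above come from the comparative threshold-cycle method it names, with GAPDH as the reference gene. For reference, the standard Livak formulation is:

\[
\Delta C_t = C_t^{\text{target}} - C_t^{\text{GAPDH}}, \qquad
\Delta\Delta C_t = \Delta C_t^{\text{sample}} - \Delta C_t^{\text{calibrator}}, \qquad
\text{fold change} = 2^{-\Delta\Delta C_t}.
\]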
In contrast, greater than 3 months post-LTx, with the recovery of CD8+ T-cells, the ACR phenotype was associated with cytotoxic T-cell activation. Furthermore, sensitive molecular methods may detect the activation of pro-inflammatory mediators within the allograft prior to the diagnosis of rejection by transbronchial biopsies and may impact optimal patient management. Antibody-mediated rejection (AMR) is defined by the presence of donor-specific alloantibodies, markers of complement activation and the clinical phenotype of organ dysfunction. Compared to renal and cardiac allografts, very few AMR cases are documented in lung transplantation (LTX). Methods. We report here on seventeen LTX patients exhibiting: 1) donor-specific HLA antibodies (DSA); 2) linear, continuous subendothelial C4d deposition in the lung allograft; 3) lung allograft dysfunction. The presence and specificity of DSA were determined by ELISA and/or Luminex. C4d deposition was assessed by immunohistochemistry in transbronchial biopsy paraffin blocks. Allograft dysfunction was considered when either biopsy-proven acute cellular rejection (ACR, ≥ ISHLT A2) or bronchiolitis obliterans (BOS) was diagnosed. DSA were detected on average within the first year post-LTX, on postoperative day (POD) 283±218, range 14 to 734 days. Thirteen (76%) of the AMR patients were female. An anamnestic humoral response was encountered in 4 cases (one pregnancy-related and three re-transplant patients), while de novo DSA were detected in 14 cases (82%). The panel-reactive antibody (PRA) was lower in de novo cases (24%) when compared to memory responses (71%, p<0.05). Anti-class I DSA were found in 7 cases, anti-class II in 5 cases, while 5 patients exhibited both class I and class II DSA. Specific vascular C4d deposition was detected in 16 (94%) patients. Lung allograft dysfunction was considered in 14 (82%) cases, while three patients with DSA and specific C4d deposition fulfilled the criteria of sub-clinical AMR. In patients where plasma exchange/IVIg were applied, antibody titers dropped from 1:32 to 1:4 or 1:2 in two cases; in a third case, the antibody titer remained high post-pheresis (1:512) and the graft failed. Conclusions: Both anti-class I and anti-class II, low-PRA or high-PRA, pre-formed or de novo DSA can be detrimental to the lung allograft. The presence of donor-specific alloantibodies, vascular C4d deposition and allograft dysfunction shows that AMR criteria can also be met in LTX. Rationale: Recently, inhaled cyclosporine has been shown to reduce mortality and BOS. Previously, we demonstrated that apically dosed cyclosporine poorly transmigrated differentiated human airway epithelial cells in vitro (using a transwell system). Only ∼5% of the apically deposited cyclosporine passed through the epithelial layer, whereas most of the drug accumulated within the epithelium. Thus inhaled cyclosporine may not be capable of inactivating airway allogeneic T cells since it may not reach the airway wall in high concentration. In this study, we hypothesized that inhaled cyclosporine alters airway epithelial signaling cascades that may ultimately result in reduced inflammatory cell recruitment to the lung. Methods: Human tracheobronchial epithelial (hTBE) cells from healthy donors were grown in ALI media to confluence at an air-liquid interface in millicells. Differentiated hTBEs were treated on their apical surface with cyclosporine concentrations of 1000 and 10,000 ng/ml or vehicle for 24 hr to mimic the effects of inhaled cyclosporine.
The basilar and apical compartments (n=8-12 for each) were then assayed for cytokine secretion (IL-8, IL-6, IL-1, IL-12p40, TNF, eotaxin, MCP-1, GM-CSF, RANTES and EGF) by Luminex assays in pg/ml. Results: 10,000 ng/ml of cyclosporine markedly blunted the basilar secretion of IL-6 (250±142 vs 71±24, p=0.002), RANTES (22±8 vs 9±3, p=0.003), GM-CSF (21±16 vs 3±3, p=0.008) and MCP-1 (181±112 vs 28±19, p=0.002). IL-8 and eotaxin were unaffected; TNF-α, IL-1β and IL-12p40 were undetectable. At 1000 ng/ml cyclosporine, cytokine secretion was decreased to a lesser extent. A similar pattern of diminished cytokine secretion was seen in the apical compartment of this system. Last, apical, but not basal, EGF secretion was augmented by cyclosporine (2231±817 vs 1114±614 pg/ml, p=0.02). Conclusion: Cyclosporine decreased the secretion of critical cytokines and chemokines from human airway epithelial cells. These mediators are known to enhance mononuclear and T cell recruitment in a large variety of animal models of disease as well as in clinical studies. Inhaled cyclosporine may work by reducing cell recruitment. This hypothesis will require in vivo testing. Funded by: CF Foundation. Background: Cytomegalovirus (CMV) and human herpesvirus-6 and -7 (HHV-6 and HHV-7) are β-herpesviruses that commonly reactivate and have been proposed to trigger acute rejection and chronic allograft injury. The role of these viruses in the development of BOS after lung transplantation remains unclear. We assessed the contribution of β-herpesvirus infection in the allograft by a prospective molecular assessment of serial bronchoalveolar lavage (BAL) samples in lung transplant recipients. Methods: Quantitative real-time PCR of BAL samples was performed for CMV, HHV-6 and HHV-7 in a prospective cohort of lung transplant recipients. A time-dependent Cox regression analysis was used to correlate the risk of BOS and acute rejection in patients with and without β-herpesvirus infection. Results: 93 patients were included in the study over a period of 3 years. A total of 581 BAL samples were obtained (median 6 per patient). 61/93 patients (66%) had at least one positive result for one of the β-herpesviruses: 48 patients (52%) for CMV, 19 patients (20%) for HHV-6, and 19 patients (20%) for HHV-7. Median time to detection was 152 days (range 8-839) for CMV, 309 days (range 28-928) for HHV-6, and 286 days (range 17-928) for HHV-7. Median peak viral load was 3,419 copies/ml (range 102-41,600,000) for CMV, 258 copies/ml (range 106-131,075) for HHV-6, and 665 copies/ml (range 103-66,100) for HHV-7. Acute rejection (≥ grade 2) occurred in 46% and BOS (≥ stage 2) in 19%. In the time-dependent Cox regression model, the relative risk of BOS or acute rejection was not increased in patients with CMV, HHV-6, or HHV-7 reactivation. For example, the hazard ratio of CMV and BOS was 1.04 (95% CI 0.62-1.73, p=0.9) and for CMV and acute rejection was 0.81 (95% CI 0.55-1.20, p=0.3). In many of the patients, β-herpesvirus reactivation occurred after the acute rejection episode, likely reflecting augmented immunosuppression. In this large cohort of lung transplant recipients, local reactivation of CMV, HHV-6 and HHV-7 in the allograft was very common. However, despite high viral loads in many patients, infection was not significantly associated with the development of acute rejection or BOS. Introduction: The role of chemokine receptors in regulating donor-specific responses to allografts is poorly understood.
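Returning to the β-herpesvirus study above: its time-dependent Cox analysis treats infection as a covariate that switches on at the date of first viral detection. A minimal sketch, assuming the lifelines library and toy interval data (the real analysis used 93 patients and multiple covariates):

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per follow-up interval per patient;
# cmv_pos flips from 0 to 1 at the BAL date on which CMV was first detected,
# making infection a time-dependent covariate. "bos" marks the event at the
# end of the interval.
intervals = pd.DataFrame({
    "id":      [1, 1, 2, 2, 3],
    "start":   [0, 152, 0, 300, 0],
    "stop":    [152, 400, 300, 700, 500],
    "cmv_pos": [0, 1, 0, 1, 0],
    "bos":     [0, 1, 0, 0, 0],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(intervals, id_col="id", event_col="bos",
        start_col="start", stop_col="stop")
print(ctv.hazard_ratios_)  # analogous to the reported HR 1.04 for CMV/BOS
```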
CD4+CD25+ T cells regulate alloreactive CD4 T cell responses and acute rejection of single class II MHC-disparate cardiac allografts in C57BL/6 mice. CCR5 is expressed by a small proportion of CD4+CD25+ T cells, but the requirement for these cells in regulating alloreactive T cell responses remains poorly understood. The goal of this study was to investigate the role of CCR5+ T regulatory cells in acute rejection of single class II MHC-disparate cardiac allografts. Methods: Wild-type C57BL/6 (H-2b) and B6.CCR5-/- mice received heterotopically transplanted B6.H-2bm12 heart grafts. The presence of CD4+Foxp3+ T cells in the recipient spleen and in heart allografts was determined by flow cytometry. Foxp3 mRNA expression in the heart grafts was analyzed by qRT-PCR. Donor-specific CD4+ T cells producing IFN-γ or IL-4 in allograft recipient spleens were enumerated by ELISPOT assay. Cell-sorted naïve wild-type CD4+CD25+ T cells and CD4+CD25- T cells were adoptively transferred to wild-type and CCR5-/- mice before the cardiac transplant. Results: In wild-type recipients, >80% of B6.H-2bm12 cardiac grafts survived more than 100 days, whereas CCR5-/- recipients rejected the allografts within 24 days (18 days mean survival) with intense CD4+ T cell infiltration in the graft. Donor-reactive IFN-γ- and IL-4-producing CD4+ T cell numbers were increased 2-fold in the spleens of CCR5-/- vs. wild-type recipients at day 7 post-transplant, and in contrast to wild-type recipients these numbers were sustained for at least 7 more days. Allograft-infiltrating CD4+CD25+Foxp3+ cells and intra-graft Foxp3 mRNA expression were clearly present in allografts from wild-type recipients and were virtually absent in allografts from CCR5-/- recipients. Transfer of purified wild-type CD4+CD25+ T cells to CCR5-deficient mice resulted in the long-term survival of 60% of B6.H-2bm12 cardiac allografts. Conclusion: CCR5+ regulatory T cells control the magnitude and function of the alloreactive T cell immune response to single class II MHC-disparate cardiac allografts. We have previously shown that converted DN T-cells display a phenotype and cytokine profile distinct from those of CD4+CD25+ Tregs, naïve CD4+CD25- T-cells, and activated CD4+ T-cells. Furthermore, the CD4+-converted DN T-cells were highly potent in suppressing antigen-specific alloimmune responses in vitro. In this study, we further characterized and tested the functional potential of the converted DN T-cells in vivo. We showed that the converted DN T-cells retained a stable phenotype after re-stimulation in vitro and in vivo. IL-2 was capable of breaking the anergic status and preserving the suppressive function of DN T-cells. In an immunocompetent, completely MHC-mismatched islet transplant model, the transfer of 13 x 10^6 DN T-cells (converted from CD4+CD25- T-cells of naïve C57BL/6 mice by co-culture with mature DBA/2 DC plus rIL-15 in MLR for 6 days) resulted in a statistically significant prolongation of alloantigen-specific DBA/2-strain, but not third-party C3H-strain, islet allograft survival in C57BL/6 recipients in comparison with the untreated control group. As IL-2 was capable of breaking the anergic status and preserving the suppressive function of DN T-cells, we added IL-2/Fc, a long-lasting form of IL-2, and low-dose rapamycin with DN T-cells in an MHC-mismatched skin allograft model. The single transfer of 5 x 10^6 DN T-cells plus 28 days of IL-2/Fc and low-dose rapamycin treatment significantly prolonged DBA/2 skin allograft survival in C57BL/6 recipients in comparison with the untreated group (MST 71 days vs.
11 days, p=0.0049) and the IL-2/Fc plus rapamycin-treated group (MST 71 days vs. 34 days, p=0.0091). The results of using ex vivo-converted DN T-cells (from CD4+ T-cells) in skin and islet transplantation models support the concept and the feasibility of potentially utilizing this novel cell-based therapeutic approach clinically for the prevention of allograft rejection. Jessamyn Bagley, 1 Jonathan G. Godwin, 1 Joren Madsen, 2 John Iacomini. 1 Introduction: It has been suggested that natural killer (NK) cells are critical mediators that connect the innate and adaptive immune responses. Cytokine production by NK cells contributes to the polarization of immune responses to T helper 1, and NK cells express co-stimulatory molecules that may affect T cell proliferation. Recent work has shown that NK cells are involved in the chronic rejection of parental cardiac grafts by F1 recipients. We hypothesized that given the role of NK cells in T cell activation and proliferation, NK cells may play a role in the development and function of T regulatory cells (Treg) which control alloreactive responses. Methods: NK cells, CD4+ T cells and CD4+CD25+ Treg were purified from the spleens of C57BL/6J mice using MACS bead separation followed by fluorescence-activated cell sorting. Naïve CD4+CD25- T cells from C57BL/6J mice (H-2b) were placed in culture with TGF-beta in the presence or absence of NK cells, and the development of Treg was monitored by assessing FoxP3 expression by intracellular staining and flow cytometry. In addition, the effect of NK cells on the function of Treg was measured by ELISPOT assay. Results: Following 4 days of culture with TGF-β and anti-CD3 antibody, CD4+CD25- T cells acquired FoxP3 expression. The addition of activated NK cells to cultures with CD4+CD25- T cells and TGF-β prevented the acquisition of FoxP3 expression by CD4 cells. Tregs induced by stimulation of CD4+CD25- T cells with anti-CD3 in the presence of TGF-β were capable of suppressing the production of IL-2, IFN-γ and IL-4 by CD4 T cells in response to fully allogeneic BALB/c stimulators. However, in the presence of activated NK cells, induced Tregs failed to function, and production of IL-2, IFN-γ and IL-4 by CD4 T cells was restored. These experiments were repeated with natural CD4+CD25+ Tregs isolated directly from mice without induction, and we found that the ability of natural Treg to suppress a CD4 T cell response to alloantigen was impaired in the presence of activated NK cells. Conclusions: The presence of activated NK cells in culture can prevent the development of induced Treg. Furthermore, the presence of activated NK cells interferes with the function of mature Treg in culture and allows a productive cytokine response by CD4 effector cells in response to alloantigen. Results: Both the syngeneic B6 cells and allogeneic DBA/2 cells survived well in the Rag-/- IL-2Rγ-/- mice. However, the allogeneic DBA/2 cells, but not the syngeneic B6 cells, were readily killed by the NK cells in the Rag-/- mice. Nevertheless, both Rag-/- and Rag-/- IL-2Rγ-/- mice accepted the DBA/2 skin allografts long term without any sign of rejection (>100 days). Thus, NK cells by themselves, though cytolytic to DBA/2 cells, failed to reject the DBA/2 skin allografts. To test the hypothesis that the activation status of NK cells may dictate their alloreactive potential, we treated the Rag-/- mice with IL-15/IL-15Rα complexes to maximally stimulate the NK cells in vivo.
We found that IL-15 is remarkably potent in stimulating NK cells in vivo: NK cells stimulated by IL-15 expressed an activated phenotype and were surprisingly potent in mediating acute skin allograft rejection in the absence of any adaptive immune cells, as IL-15-treated Rag-/- mice, but not Rag-/- IL-2Rγ-/- mice, readily rejected the DBA/2 skin allografts (MST=18 days). NK cell-mediated graft rejection did not show features of memory responses. These findings suggest that the fate of the allografts may depend on the activation status of NK cells and the availability of NK-stimulating cytokines. Background: T-bet is a transcription factor that promotes Th1 development. Both T-bet and the cytokines IFN-γ and IL-4 have been implicated as negative regulators of Th17. In contrast, IL-6 promotes Th17 development. IL-17 production is associated with granulocytic pathologies in several disease states. Hence, this study assessed the relationship between T-bet, IFN-γ, IL-4 and IL-6 in Th17 induction and granulocytic infiltration of cardiac allografts. Methods: Wild-type (WT), T-bet-/- and IFN-γ-deficient (IFN-γ-/-) mice were transplanted with BALB/c cardiac allografts. Recipients were left untreated, depleted of CD4+ or CD8+ cells, treated with anti-CD40L mAb, and/or treated with neutralizing anti-IL-4 or anti-IL-6 mAb. Graft histology was assessed and primed donor-reactive Th responses were quantified by ELISPOT. Intragraft expression of IL-17 and the Th17 transcription factor RORγt was quantified by real-time RT-PCR. Results: WT and T-bet-/- mice rejected their allografts at a similar tempo but with distinct pathologies: allografts in T-bet-/- recipients were heavily infiltrated with granulocytes while graft-infiltrating cells in WT recipients were primarily mononuclear. While Th1 and Th17 responses were readily detectable in T-bet-/- recipients, Th1 dominated the response in WT mice and Th17 were not detectable. Depletion of CD4+ cells prolonged graft survival in WT, but not in T-bet-/- recipients, suggesting that CD8+ cells mediated rejection independent of CD4+ help in T-bet-/- mice. CD8+ cells were the source of IL-17 in T-bet-/- recipients. Anti-CD40L therapy promoted long-term allograft survival in WT recipients, but not T-bet-/- mice unless CD8+ cells were depleted. Additionally, anti-CD40L therapy inhibited Th1 responses in both WT and T-bet-/- recipients, but not the CD8+ Th17 response in T-bet-/- mice. Eliminating IFN-γ and IL-4 failed to induce IL-17 production, while neutralizing IL-6 reduced the Th17 response in T-bet-/- mice. Conclusions: While CD4+ Th17 have been described in detail, CD8+ Th17 have received less attention. In T-bet-/- allograft recipients, CD8+ Th17 emerge independent of CD4+ help. CD8+ Th17 are resistant to anti-CD40L therapy and are associated with granulocyte infiltration of the graft. These data implicate T-bet, as opposed to IFN-γ and IL-4, as a negative regulator of the Th17 response, while IL-6 is required for CD8+ Th17 induction. Nan Zhang, 1 Bernd Schroppel, 1 Girdari Lal, 1 Jordi C. Ochando, 1 Jonathan S. Bromberg. 1 1 Background: CD4+CD25+Foxp3+ regulatory T cells (Treg) are important in suppressing immunity to prolong allograft survival. Treg migration and its effects on suppressive activity are poorly understood. We determined Treg migration patterns and effector function in an islet allograft model. CCR2-/-, CCR4-/-, CCR5-/-, CCR7-/-, L-selectin (CD62L)-/-, and fucosyltransferase (FucT) IV-VII-/- C57BL/6 mice were used to generate Treg.
Treg were transferred intravenously, or locally to islet allografts, following islet transplantation (BALB/c into C57BL/6). Transferred Treg were labeled with the red dye PKH26, and migration to islet grafts, draining LNs (dLN), peripheral LNs and spleen was tracked with fluorescence microscopy and flow cytometry. Islet allograft survival was determined by measurement of blood glucose. Results: Treg expressed P-selectin ligand, CD62L, and a panel of chemokine receptors similar to other T cell subsets. Intravenously transferred wild-type Treg migrated to both islet grafts and dLN, and prolonged allograft survival. CD62L-/- or CCR7-/- Treg, which migrated to islets but not dLNs, prolonged allograft survival as potently as wild-type Treg. FucT IV-VII-/- Treg, which lack E-selectin and P-selectin ligands, migrated to dLN, but not islets, and did not prolong graft survival. Similarly, CCR2-/-, CCR4-/-, and CCR5-/- Treg migrated to dLN, but not islets, and did not prolong graft survival. When locally transferred to the islet graft, Treg also migrated from the allograft to the dLN, and prolonged graft survival even longer than after intravenous transfer. Locally transferred CCR2-/-, CCR5-/-, or CCR7-/- Treg were not able to migrate from the islet to the dLN, and were impaired in their ability to prolong islet survival. Conclusion: Treg migration to allografts is essential for their suppressive function; migration to lymphoid tissues alone is not sufficient to prolong graft survival. Treg migration from the islet allografts to the dLNs, via afferent lymphatics, is also required for optimal suppressive function and graft survival. The sequential migration to the site of inflammation and then to dLNs is necessary for Treg to fully execute their suppressive program. These results demonstrate a novel and important aspect of migration in Treg suppression and tolerance. Whether renal failure in liver transplant candidates will recover after transplantation is unknown. Methods: 40 patients with iothalamate GFR <40 ml/min (n=34) or on dialysis (n=6) at the time of liver transplant evaluation had undergone a percutaneous CT-guided renal biopsy. Prior to the biopsy, an INR ≤1.5 and a platelet count ≥50,000/µl were achieved in the majority of cases. All patients were monitored overnight for complications. Candidates were listed for SLK if pathology showed ≥40% glomerulosclerosis (GS) or ≥30% interstitial fibrosis (IF). Results: 13 patients were eligible for SLK and 27 for LTA. Creatinine was higher in SLK candidates but not clinically different. Background - It is well known that delayed graft function (DGF) is costly for those who receive a renal transplant. However, the true cost of DGF is unknown. Methods - We estimated the cost of DGF for adult cadaveric renal recipients in the USRDS 1995-2004 who had Medicare as their primary payer. Those included were restricted to single-organ recipients as well as those who had no previous transplants. Cost was defined as the accumulated average cost per day for everyone with a functioning graft on that day. We examined the total cost of DGF, cost associated with dialysis, and non-dialysis cost for all patients combined and separately by donor type (standard criteria donors [SCD], expanded criteria donors [ECD], and non-heart-beating donors [DCD]), as well as by time to graft failure or no graft failure. Background: Based on adverse outcomes during the first year post-transplant, the OPTN Membership and Professional Standards Committee (MPSC) peer-review process flags transplant programs for further review using one of two methods.
Programs performing at least 10 transplants during a 2.5-year cohort period are flagged based on the comparison of observed to expected event counts (death or graft failure) and the corresponding p-value (<0.05). Programs performing 9 or fewer transplants are flagged if they have any adverse events. This leads to different flagging rates for programs of different sizes. The p-value has low sensitivity to identify poorly performing programs with a "moderate" number of transplants (10 to 20), whereas flagging every event results in a high false-positive rate for "small" centers with <10 transplants. During the July 2007 review, 27% of "small" centers and 7% of centers with 10+ transplants were flagged for review (all organs). The Scientific Registry of Transplant Recipients (SRTR) has developed alternative approaches for flagging centers with more consistent flagging rates. Methods: The new method would allow for different choices of sensitivity (RR) and specificity (p-value) for different purposes and would flag a program if either {p-value < 0.05 (a different value could be chosen)} or {observed/expected > RR}. The resulting false-negative rate is less than 50% for centers with the given RR. This approach avoids an arbitrary definition of small versus large programs, and has sensitivity (or power) >50% to flag programs with the selected RR, regardless of size. Results and Discussion: This approach gives a more balanced distribution of flagging across programs of different sizes and has sensitivity >50% for transplant programs of all sizes. With the choice of RR=2.5, the overall number of centers flagged would be nearly unchanged from current methods; a minimal sketch of this flag rule is shown below. Center performance ratings are of increasing importance to the transplant community with the introduction of the CMS final rule. Ongoing debate exists regarding how much center ratings directly reflect quality of care, and whether ratings can be substantially influenced by exogenous factors. This study examined data from the national SRTR database from 2000-2007. Centers' semi-annual graft and patient survival rates were calculated, along with a comparison with expected outcomes adjusted for the covariates used by the SRTR. Centers meeting three criteria for poor performance were categorized within each cohort. Patient characteristics at each center were compared with performance evaluations. Overall, half of transplant centers met criteria for low performance in at least one of the semi-annual intervals. Several center factors investigated were not significantly associated with the likelihood of meeting low-performance criteria, including the proportion of older, obese and privately insured patients. In contrast, centers with higher levels of ECD transplants (most ECDs=70% vs least ECDs=40%), African American recipients (most AAs=75% vs least AAs=35%) and patients with low albumin levels (lowest albumin=60% vs highest albumin=32%) were more likely to meet low-performance criteria. Approximately half the centers that initially met criteria for low performance no longer met criteria when excluding ECD transplants and African American recipients from the performance assessment. Conclusions: Given the extreme implications of performance ratings for transplant centers, including possible loss of funding to centers with low performance, it is critical that we recognize potential weaknesses and biases of performance ratings.
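A minimal sketch of the SRTR flag rule quoted above (flag if p-value < 0.05 or observed/expected > RR), assuming a one-sided Poisson model for the event-count p-value; the abstract does not specify the exact p-value computation:

```python
from scipy.stats import poisson

def flag_program(observed, expected, rr=2.5, alpha=0.05):
    """Flag if the one-sided Poisson p-value for observing at least
    `observed` events is < alpha, OR if observed/expected exceeds rr."""
    p_value = poisson.sf(observed - 1, expected)  # P(X >= observed)
    return p_value < alpha or (observed / expected) > rr

# A small program: 3 deaths/graft failures where 1.0 was expected.
print(flag_program(3, 1.0))    # True: O/E = 3.0 > 2.5 even though p ~ 0.08
# A large program: 30 events where 25 were expected.
print(flag_program(30, 25.0))  # False: O/E = 1.2 and p ~ 0.18
```

Under this rule, small centers are caught by the O/E criterion and large centers by the p-value criterion, which is how the approach keeps sensitivity above 50% regardless of program size.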
Our results are important for understanding factors related to performance ratings and raise questions as to whether risk-adjustment techniques are adequate for fair transplant center performance evaluations. Rather than relying on risk adjustment alone, stratified evaluations may be a partial solution to remove disincentives to performing higher-risk transplants. It is also important to recognize factors not associated with low performance, so that centers do not unnecessarily limit access for groups based on a perceived deleterious impact on ratings. Model. Olaf Boenisch, 1 Takaya Muramatsu, 1 Francesca D'Addio, 1 Robert Padera, 1 Hideo Yagita, 2 Nader Najafian. 1 1 Transplantation Research Center, Brigham and Women's Hospital and Children's Hospital, Boston, MA; 2 Immunology, Juntendo University School of Medicine, Tokyo, Japan. T cell immunoglobulin and mucin domain (TIM)-3, a molecule expressed on terminally differentiated Th1 cells, is an important regulator of Th1 autoimmunity and of the induction of transplantation tolerance. Its functions in alloimmune responses during acute and chronic rejection are unknown. TIM 3-23, a novel blocking antibody against TIM-3, was administered to recipients in various donor-recipient strain combinations on days 0 (500 µg), 2, 4, 6, 8 and 10 (250 µg) after transplantation. The frequency of IFN-γ-, IL-4-, and granzyme B-producing splenocytes was measured by ELISPOT. These data establish the regulatory functions of TIM-3 in alloimmune responses in solid organ transplantation models. The inhibitory actions may be secondary to modulation of effector or regulatory T cells, and appear to dominate in conditions of low levels of T cell activation due to a restricted degree of allogeneic mismatch or the absence of CD28 costimulation. I-R injury in the transplanted kidney is a major cause of DGF, an event associated with an increased risk of acute rejection. Adaptive immunity has been suggested to play a role in the pathogenesis of renal I-R injury, although the influence of the Th1/Th2 bias in this scenario is still debated. Thus, the aim of the present study was to evaluate the features of the T cell response during I-R injury at the peripheral and tissue level in renal graft recipients with DGF. The mRNA levels of the specific Th1 (T-bet) and Th2 (GATA-3) transcription factors were evaluated in circulating lymphomonocytes of kidney transplant recipients with early graft function (EGF) (n=10) and DGF (n=10), before (T0) and 24 hours after transplantation (T24), by real-time PCR. Infiltrating lymphocytes were characterized by immunohistochemistry in graft biopsies of patients with DGF (n=40) and in a control group of patients with tubular damage from acute CNI toxicity (n=10). In addition, we evaluated the Th1/Th2 bias at the renal level in a pig model of I-R injury. The T-bet/GATA-3 mRNA ratio was similar in the 2 groups of patients at T0. At T24 the DGF group presented a significantly higher increase of the T-bet/GATA-3 ratio compared with the EGF group (798±346 vs 288±147% of T0, p<0.001). At the tissue level, DGF patients presented significantly higher numbers of interstitial CD4+ (8.0±5.1 vs 2.6±2.1, p=0.04) and CD8+ (10.8±6.7 vs 4±2, p=0.02) T cells compared to the control group, while no significant differences were observed in CD20+ cell numbers between the two groups. Also at the tissue level, the ratio between T-bet+ and GATA-3+ cells was significantly higher in the DGF group compared with the control group (3.5±1.8 vs 1.1±0.9, respectively, p=0.005).
To confirm that these changes were due to I-R, we investigated the presence of T-bet+ and GATA-3+ cells in a pig model of I-R injury. Interestingly, the ratio was significantly increased after 24 hours of reperfusion (basal 2.5±0.9 vs 24 hours 8.1±3.6; p=0.02). In conclusion, our results suggest that kidney transplant recipients with DGF present a bias toward a Th1-driven immune response both at the peripheral and at the tissue level. This event, due to the I-R process, as suggested by the animal model, may represent a link between DGF and acute graft rejection. As expected, a strong negative association between duration of dialysis and patient survival was seen in Caucasian recipients. In contrast, the relationship between patient survival and duration of dialysis was U-shaped in minorities, with the worst patient survival seen among preemptive transplants and those patients with over 60 months of dialysis. The difference in outcomes between Caucasians and minorities could be related to biologic differences in the effect of dialysis on subsequent transplantation or to a differential selection bias introduced by duration of dialysis in the two populations. Anti-Carbohydrate Natural Antibodies. Lorenzo Benatuil, 1 Jonathan G. Godwin, 1 Shamik Gosh, 1 John Iacomini. 1 1 Transplantation Research Center, Boston, MA. Background. We constructed immunoglobulin gene knock-in mice lacking expression of the αGal epitope that carry rearranged VH and VL genes encoding antibodies specific for the carbohydrate antigen Galα1-3Galβ1-4GlcNAc-R (αGal). Here, we describe two novel populations of B cells in the spleen of these M86VHVLGT0 mice and their role in the production of αGal-specific antibodies. Methods. M86VHVLGT0 mice were sacrificed and single-cell suspensions from spleen, bone marrow, peripheral lymph nodes and peritoneal cavity were prepared and stained with different antibodies and with fluorescently labeled αGal-BSA for flow cytometric analysis. Using multiparameter cell sorting, we purified marginal zone (MZ) and follicular (FO) B cells, in addition to two novel splenic B cell populations. Cells were adoptively transferred into B cell-deficient µMT/GT0 double-knockout mice. Serum samples were collected, and production of αGal-specific IgM antibodies was assessed by ELISA. To investigate these B cell tolerance mechanisms, we employed mice with naturally occurring antibodies (Abs) against human blood group A carbohydrates in their sera and possessing B cells with receptors for blood group A determinants. B cells with receptors for A carbohydrates in mice belong to the CD5+ B-1 subset, with phenotypic properties similar to those of human B cells. When these cells were temporarily eliminated by injecting synthetic A carbohydrates, subsequent treatment with cyclosporin A or tacrolimus, which block B-1 cell differentiation, completely inhibited the reappearance of B cells with receptors for A carbohydrates in mice. It is probable that calcineurin inhibitors used for preventing T cell-mediated rejection simultaneously suppress B-1 cell differentiation. However, despite a very limited dose of calcineurin inhibitors, B cell tolerance toward blood group A antigens was persistently maintained in the blood group A-to-O liver transplant recipient. B cell tolerance after ABO-incompatible transplantation might be a consequence of presentation of blood group carbohydrate antigens by cells in the engrafted liver.
Immunofluorescence staining of the human liver reveals that blood group antigens are predominantly expressed on the liver sinusoidal endothelial cells (LSECs). We have previously shown that LSECs, which constitutively express Fas-L and PD-L1, have the capacity to tolerize alloreactive T cells. Taken together, we hypothesize that blood group antigen-reactive B cells are also tolerized through interaction with the LSECs. In order to address this possibility, we used α-1,3-galactosyltransferase-deficient mice. When α-Gal-expressing LSECs isolated from wild-type mice were adoptively transferred via the portal vein into splenectomized congenic α-Gal-deficient mice, these mice lost the ability to produce anti-α-Gal Abs (n = 4). This finding suggests that LSECs expressing blood group carbohydrates play a pivotal role in the tolerization of newly developed B cells specific for the corresponding carbohydrate antigens after ABO-incompatible liver transplantation. Success in kidney transplantation has resulted from control of T-cell-mediated acute rejection. However, little has been done to improve the fate of patients who possess pretransplant donor-specific antibody (DSA), and no proven therapies exist to specifically prevent DSA formation post-transplant. While methods to detect and characterize DSAs are clearly useful, the B-cell subsets that produce DSAs or, more importantly, sustain their production are poorly characterized. This study set out to investigate the fundamental mechanisms determining the formation of donor-specific memory B cells and plasma cells in novel mouse models of DSA formation. We have developed two complementary and novel systems to track the phenotypic and functional properties of polyclonal donor-specific B cells. The first system involves allosensitization between normal C57BL/6 and Balb/c mice. We used advanced flow cytometric methods to track donor-specific B cells with H-2Kb or H-2Kd tetramers. Tetramers are comprised of four identical H-2 molecules bound to a fluorescently labeled streptavidin molecule that is able to bind the donor-specific B-cell antigen receptor (surface immunoglobulin). In our second system, we use donor mice that constitutively express full-length membrane-bound chicken ovalbumin (mOVA) protein in all tissues. Analogously, Ova-specific B cells can be analyzed flow cytometrically with the use of fluorescently labeled ovalbumin. Both systems allow us to identify donor-specific memory B cells (7AAD- CD4- CD8- Ova/tetramer+ IgD- B220+ CD138-) and plasma cells (7AAD- CD4- CD8- Ova/tetramer+ IgD- B220lo CD138+) during skin graft rejection (day 14 post-transplant). We were also able to track the development of a stable memory cell population in the bone marrow and spleen for >90 days. For the Ova system, quantitative ELISA and ELISPOT measurements for serum DSA and DSA-secreting cells, respectively, correlated with the development of donor-specific memory B cells. Using these assays we will evaluate polyclonal donor-specific memory B-cell subsets under multiple conditions of immunity and tolerance and, for the first time, characterize their functional properties and migration patterns. These experiments will provide new information on the basic biology of the memory B-cell response to allografts in mice and facilitate the development of these methods in sensitized patients, which may lead to critical therapeutic opportunities targeting DSA production.
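The multiparameter definitions of memory B cells and plasma cells above amount to Boolean gates on marker positivity; a minimal sketch over a hypothetical per-cell table (actual gating is performed on fluorescence intensities, not pre-computed flags):

```python
import pandas as pd

# Hypothetical per-cell gate results (True = marker positive); b220 is
# recorded as "hi"/"lo" to capture the B220-low phenotype of plasma cells.
cells = pd.DataFrame({
    "viable_7aad_neg": [True, True, True],
    "cd4":      [False, False, False],
    "cd8":      [False, False, False],
    "tetramer": [True, True, False],   # Ova or H-2 tetramer binding
    "igd":      [False, False, True],
    "b220":     ["hi", "lo", "hi"],
    "cd138":    [False, True, False],
})

# 7AAD- CD4- CD8- tetramer+ IgD- is common to both subsets.
donor_specific = (cells.viable_7aad_neg & ~cells.cd4 & ~cells.cd8
                  & cells.tetramer & ~cells.igd)
memory_b = donor_specific & (cells.b220 == "hi") & ~cells.cd138
plasma   = donor_specific & (cells.b220 == "lo") & cells.cd138

print(cells[memory_b].shape[0], "memory B;", cells[plasma].shape[0], "plasma")
```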
The TNF-related B-lymphocyte survival factor BLyS/BAFF is critical for primary follicular (FO) and marginal zone (MZ) B-cell survival. In vivo neutralization of BLyS/BAFF, using a newly developed monoclonal antibody, depletes the FO and MZ B-cell compartments. Here, we hypothesized that targeting B-lymphocytes via the BLyS/BAFF pathway could promote humoral tolerance to islet allografts. Cohorts of STZ-diabetic C57BL/6 mice were transplanted with isolated islets from BALB/c donors. Treatment with anti-BLyS/BAFF alone (100 mcg/mouse x2 doses + 15 mcg/wk/mouse) did not protect islet allografts from acute rejection (MST=14d; n=4). On the other hand, a 14-day course of rapamycin (1 mg/kg) prevented acute allograft rejection (MST=35d; n=9). When rapamycin was combined with the anti-BLyS regimen, islet allograft survival was markedly prolonged (MST>150d; n=4). Importantly, islet allograft survival was coincident with the absence of detectable serum alloantibodies and blunted donor-specific T-cell responses. Following treatment with anti-BLyS, B-lymphocyte compartment reconstitution was detectable at 30 days following treatment and was characterized by stringent selection at the transitional→FO B-cell tolerance checkpoint. Overall, these data indicate that the BLyS/BAFF pathway may be a logical target of immunotherapy for the achievement of humoral transplantation tolerance. Reza Tavana. Background: The ability of highly sensitized patients to be transplanted is severely compromised because of high levels of antibodies against various HLA antigens. Interleukin-21 is a type I cytokine that signals through a receptor composed of the IL-21R and the common cytokine receptor γ-chain (γc). It is produced by T-cells and has been shown to be a contributing factor in the terminal differentiation of memory B-cells into antibody-producing B-cells and plasma cells. Objective: We evaluated the effect of IL-21 co-stimulation on the antibody production capacity of B-cells and also measured the expression of IL-21R on B-lymphocytes of highly sensitized patients compared to non-sensitized patients on the kidney transplant list. Methods: Patients with a PRA level >20% (sensitized) were compared to non-sensitized patients from the transplant waiting list. After consent was obtained, peripheral blood was taken from patients before initiating hemodialysis. Leukocytes were labelled with anti-CD19 and anti-IL-21R antibodies and paraffin-fixed after RBC lysis. IL-21R expression was measured the same day by flow cytometry on a FACScan machine. Results: The expression of IL-21R was significantly higher on B-lymphocytes of highly sensitized patients (HSP) compared with non-sensitized patients (NSP) (Fig. 1, p<0.05). In vitro co-stimulation of isolated peripheral B-cells with IL-21 and anti-CD40 resulted in higher IgG1 production compared with anti-CD40 or IL-21 stimulation alone (Fig. 2). Conclusions: IL-21 is an important cytokine in B-lymphocyte stimulation and increases IgG1 production. The IL-21 receptor on B-lymphocytes is up-regulated in sensitized kidney transplant recipients. The IL-21/IL-21R interaction between T- and B-lymphocytes may be an important pathway in antibody production in highly sensitized renal transplant recipients. We created mixed bone marrow chimeras in irradiated µMT (B cell-deficient) recipients by transplantation of syngeneic bone marrow from µMT, wild-type (wt) and MHCko (lacking MHC I and II expression on all cells) mice.
µMT+MHCko chimeras lack expression of MHC I and II on B cells but not on other professional APCs such as DCs; hence antigen presentation specifically by B cells is disrupted. Allograft rejection and the development of alloreactive memory T cells in µMT+MHCko chimeras were compared to µMT+wt chimeras that had intact antigen presentation by both B cells and other APCs. Skin allograft rejection was comparable between µMT+MHCko and µMT+wt chimeras (MST = 17 and 16 days, respectively, p = 0.6, n = 5/grp). However, heart allograft rejection was significantly delayed in the µMT+MHCko chimeras compared to µMT+wt chimeras (MST = 23 and 15 days, respectively, p = 0.03, n = 5/grp). Development of alloreactive memory T cells was assessed in chimeras at 8 weeks after skin allograft rejection by quantitation of antigen-specific IFN-γ-producing T cells. Alloreactive CD4 and CD8 memory T cells were significantly fewer in µMT+MHCko chimeras (7-fold fewer CD4, p = 0.02, and 5-fold fewer CD8, p = 0.003, n = 5/grp) than in µMT+wt chimeras. These results show that disruption of antigen presentation by B cells significantly delays heart but not skin allograft rejection, and that development of alloreactive CD4 and CD8 memory T cells is significantly impaired in the absence of antigen presentation by B cells. Conclusions: Antigen presentation by B cells accelerates heart allograft rejection and leads to the development of alloreactive memory T cells. These findings emphasize antibody-independent functions of B cells in promoting alloimmune responses and highlight the need for B-cell-targeted therapies to improve long-term allograft survival. Purpose: To test whether depletion of CD20+ B cells at the time of engraftment alters the prevalence of anti-donor alloantibody (Ab) or the severity of CAV in the context of therapeutic immunosuppression with cyclosporine (CsA) or high-dose CD154 inhibition. Methods: Forty-five MLR-mismatched heterotopic cardiac cynomolgus allograft recipients were treated with high-dose anti-CD154 monotherapy (αCD154; n=14, 5 with ATG induction) or αCD154 with additional anti-CD20 therapy (rituximab 20 mg/kg q wk for 4 weeks: αCD154 + αCD20; n=18, 11 with ATG). Thirteen other animals received therapeutic CsA (target trough >500 ng/ml), five of which received additional αCD20. Graft survival was censored at 90 days. Ab was deemed (+) if present (by flow cytometry) around the time of explant. Results: 12 animals died with beating grafts, mainly with ATG-associated lung pathology or infectious etiologies, and are excluded from this analysis. 3 animals that had sustained αCD154 levels <100 µg/ml after protocol day 30 were considered "subtherapeutic" for this reagent and also excluded from further consideration. Graft survival with αCD154 + αCD20 (median >90d) and the proportion of grafts surviving to 90 days (6/6) were significantly increased relative to αCD154 alone (median 43d, p=0.007; 4/14 >90d). Graft survival with CsA + αCD20 (median >90d) and the proportion of grafts surviving to 90 days (4/4) were increased relative to CsA alone (median 66d, 3/6 >90d). With therapeutic αCD154 (trough level >100 µg/ml until graft explant), 12/13 (92%) developed Ab vs. 1/6 with αCD154 + αCD20 (17%) (p=0.006). The average CAV score for the αCD154 group was 2.44±0.44 vs. 0.7±0.8 in the αCD154 + αCD20 group (p=0.006 using unpaired t test). Preliminary CAV scoring suggests that added αCD20 inhibited CAV (scores ranging 0.0-0.1) relative to CsA alone (median 2.1; range 1.5-2.4); Ab analysis is in progress for these groups.
Conclusions: αCD20 therapy is associated with significant attenuation of CAV when used with either "therapeutic" αCD154 or CsA. Our findings demonstrate for the first time that αCD20 reduces the severity of CAV in conjunction with both conventional immunosuppression and costimulation pathway blockade. Mechanisms include inhibition of Ab production, and perhaps others that remain to be defined. Subsaturating Concentrations of Anti-HLA Antibody. Sensitized recipients with donor-specific anti-HLA Abs have been successfully transplanted following a conditioning regimen employing plasmapheresis with or without pooled human immunoglobulin. In vitro studies have shown that exposure of ECs to subsaturating concentrations (SSC) of anti-HLA Ab (priming) followed by subsequent exposure to high concentrations (HC) of anti-HLA Ab induces the expression of the protective genes Bcl-2 and HO-1 and confers protection against complement-mediated lysis of ECs. However, the molecular events following priming with exposure to SSC of anti-HLA Ab remain undefined. To determine the priming events and to define the kinetics of protective gene expression, we exposed human aortic ECs to SSC of anti-HLA Ab for 72 hrs and re-exposed them to saturating concentrations of anti-HLA Abs. ECs were collected at 24, 48, and 72 hrs after exposure to SSC as well as 24 hrs after exposure to HC. Expression profiles of signaling intermediates (MAPK, WNT, NF-κB, Hedgehog, PI3-kinase, stress pathway, TNF family) in the ECs were analyzed by gene array. Analysis of Bcl-2 and HO-1 expression showed no significant increase in expression following exposure to SSC of w6/32 (priming) or control Ab alone at any of the time points. Background/aims: Microcirculation disturbance, endothelial injury and cytokine overproduction are implicated in the pathophysiology of hepatic ischemia-reperfusion injury (IRI). Thrombomodulin (TM) is a membrane-bound endothelial thrombin receptor that accelerates thrombin-catalyzed protein C activation, inhibits thrombin-induced fibrin formation, and also regulates inflammation. In the present study, we investigated the effects of recombinant human soluble TM (rhsTM) on hepatic IRI in the rat. Methods: Wistar Hannover rats were used to prepare the ischemia/reperfusion model. Hepatic IRI was induced by subjecting 70% of the rat liver to 90 minutes of ischemia followed by 6 h of reperfusion. The rats were randomly assigned to a group receiving an intravenous injection of rhsTM (3 mg/kg body weight) or a group treated with saline 60 minutes prior to the beginning of reperfusion. Sinusoidal endothelial cells (SECs) and Kupffer cells (KCs) were isolated using centrifugal elutriation. The plasma levels of IL-6 and TNF-α, and their concentrations in cultured cell supernatants, were measured by specific enzyme immunoassays. Results: The plasma ALT, AST and hyaluronic acid levels were significantly decreased, and the histological damage of the liver was attenuated, in rhsTM-treated rats as compared to control rats. Using a laser Doppler flowmeter, we found that rhsTM treatment improved hepatic microcirculation. Intrasinusoidal fibrin deposition, injury of SECs and liver dysfunction during hepatic IRI were weaker in rhsTM-treated rats than in control rats. TM activity in SECs was significantly recovered, and plasma IL-6 and TNF-α were significantly decreased, in rhsTM-treated rats as compared to control rats.
Further, IL-6 and TNF-α production in isolated KCs was also significantly decreased in rhsTM-treated rats as compared to control rats. Conclusion: The present results suggest that rhsTM is useful for the prevention of SEC damage and KC activation induced by IRI. Our present study also suggests that disturbance of hepatic microcirculation is induced in part by intrasinusoidal microthrombus formation and by inflammatory cytokines locally released from KCs. Protective Effects of Preservation Solution Including Activated Protein C in Small-for-Size Liver Transplantation in Rats. Background: The small-for-size liver graft is a serious obstacle in partial orthotopic liver transplantation (OLT). However, effective therapeutic strategies, including surgical innovations, pharmacological agents and gene therapies, to protect small-for-size liver grafts have not yet been developed. Aims: Activated protein C (APC) is known to have cell-protective properties via its anti-inflammatory and anti-apoptotic activities. This study aimed to examine the cytoprotective effects of a preservation solution containing APC on OLT using small-for-size rat liver grafts (20% partial liver). Methods: Liver grafts were assigned to two groups: in the control group, the grafts were flushed and stored in histidine-tryptophan-ketoglutarate (HTK) solution alone for 6 h; in the APC group, in HTK solution containing APC for 6 h. Results: The APC group showed a significant increase in 7-day graft survival from 60% to 100%, decreased transaminase levels, and improved histological features of hepatic IRI compared to the control group. Myeloperoxidase activity demonstrated that neutrophil infiltration was markedly suppressed in the APC group. Hepatic expression of tumor necrosis factor-α and IL-6 in the APC group was remarkably decreased. The APC group also had significantly reduced serum hyaluronic acid levels, indicating attenuated sinusoidal endothelial cell injury. Moreover, the APC group showed markedly increased hepatic levels of nitric oxide due to upregulated endothelial nitric oxide synthase (eNOS) together with downregulated inducible NOS, and decreased hepatic levels of endothelin-1. Finally, hepatocellular apoptosis in the APC group was remarkably suppressed, with downregulated hepatic caspase-8 and caspase-3 activities. Conclusions: Preservation solution containing APC inhibited pro-inflammatory cytokine synthesis, which leads to hepatocellular apoptosis and liver injury. One of the cytoprotective effects of the APC treatment was to upregulate hepatic eNOS, followed by increased hepatic NO, and to decrease hepatic ET-1 expression, resulting in the prevention of microcirculatory disturbance. Preservation solution containing APC is a potential novel and safe product for small-for-size liver transplantation to improve liver graft function and animal survival. Deletion of CD39 on Natural Killer Cells Attenuates Hepatic Ischemia/Reperfusion Injury. Living donor liver transplantation (LDLT) has emerged as a solution to ease organ shortage in orthotopic liver transplantation. However, LDLT is often complicated by a small-for-size liver graft that is highly susceptible to injury and shows decreased liver regeneration. Suppression of liver regeneration in small-for-size grafts correlates with impaired priming as a result of limited NF-κB activation and decreased production of the priming cytokines TNF and IL-6.
We have shown that the hepatoprotective protein A20 promotes liver regeneration, partly through blockade of the cyclin-dependent kinase inhibitor p21. However, the impact of A20 expression in livers on IL-6 production and/or signaling was still unknown. In this study we demonstrate that secretion of IL-6 (ELISA) following treatment with LPS or TNF was moderately lower in hepatocytes transduced with a recombinant A20 adenovirus (rAd) as compared to non-transduced or rAd.β-galactosidase-transduced cells. This indicates that IL-6 production in hepatocytes is not solely NF-κB dependent (i.e., not totally blocked by the NF-κB inhibitor A20). Despite similar or lower IL-6 levels, IL-6 signaling, as evaluated by phosphorylation of STAT3 (Western blot; WB), was enhanced in A20-expressing hepatocytes. This was confirmed in experiments showing that A20 increases STAT3 phosphorylation in response to exogenous human IL-6. Accordingly, hepatocyte proliferation was significantly higher in rAd.A20-transduced hepatocytes as opposed to controls. The pro-proliferative function of A20 mapped to the 7Zn domain. Since the balance of IL-6 signaling in hepatocytes is finely regulated through a negative feedback loop provided by the IL-6/STAT3-dependent induction of suppressor of cytokine signaling-3 (SOCS3), we investigated whether A20 affects IL-6 signaling by modulating SOCS3 expression. Our results indicate that A20 indeed decreased IL-6-mediated upregulation of SOCS3 (WB). This latter result was confirmed in vivo. Improved regeneration and survival in A20-treated livers correlated with a substantial decrease in SOCS3 levels before and 24 hours following extended (78%) liver resection in mice. These results suggest that A20 enhances priming of hepatocytes by IL-6, likely through down-regulation of SOCS3. This, added to the effect of A20 on p21, would further enhance its pro-proliferative function in hepatocytes to benefit survival and function of small-for-size liver grafts. Background: We have previously demonstrated that silencing inflammatory, apoptosis and complement genes can prevent the ischemia-reperfusion (I/R) injury occurring in heart transplantation. However, a method for efficiently delivering siRNA into the donor organ has not been established. This study was designed to develop a new method to induce gene silencing by coronary artery infusion of siRNA solution for the prevention of I/R injury in heart transplantation. Methods: Multiple siRNAs that specifically target the TNF-α, caspase-8, and C5a receptor (C5aR) genes were generated and selected. siRNA protection of donor organs was evaluated in a rat heart transplantation model. Heart grafts from Lewis rats were infused with siRNA solution via the coronary artery, preserved in HTK solution at 4°C for 18 hrs, and subsequently transplanted into syngeneic Lewis rats. Cardiac function was assessed by heartbeat rate. Gene expression at the mRNA level was determined by qPCR. The I/R injury was assessed by immunohistochemistry. Results: After donor heart perfusion with siRNA solution, siRNA was found to enter the myocardial cells, as indicated by the fluorescence emitted from the dye-labeled siRNA. The levels of the TNF-α, caspase-8 and C5aR genes were significantly up-regulated in the grafts after ex vivo preservation for 18 hrs. These up-regulated TNF-α, caspase-8, and C5aR genes were significantly knocked down by siRNA infusion. Using a siRNA-infused organ as a donor, graft survival was significantly prolonged in heart transplantation.
While siRNA solution-treated heart grafts retained a strong heartbeat up to the endpoint of observation (>10 days), the control grafts lost function within 3 days. In addition, improved cardiac function was observed in grafts preserved in siRNA solution. The protection of the graft by siRNA solution was associated with prevention of I/R injury: siRNA solution-treated organs exhibited almost normal histological structures as well as less neutrophil and lymphocyte infiltration compared with control solution-treated organs. Conclusions: This study developed a novel ex vivo siRNA delivery system using coronary artery infusion, which can effectively silence genes in donor hearts and prevent cardiac I/R injury. Background: Ischemia/reperfusion (I/R) insult is a prime factor leading to liver dysfunction. Apoptosis plays a key role in early graft loss following orthotopic liver transplantation (OLT). Bcl-xL has been shown to exert an anti-apoptotic function both in vitro and in vivo. This study was designed to evaluate the potential cytoprotective effects and mechanisms of Bcl-xL in liver I/R injury by Ad-Bcl-xL gene transfer. Methods: A mouse model of 90 min of partial warm hepatic ischemia followed by 6 h of reperfusion was used. Balb/c wild-type (WT) mice (n=6/gr) were injected with Ad-Bcl-xL or the Ad-β-gal reporter gene (2.5x10^9 pfu, i.p. at day -2). Sham-control WT mice underwent the same procedure, but without vascular occlusion. Mice were sacrificed after 6 h of reperfusion; liver tissue and blood samples were collected for analysis. Results: Ad-Bcl-xL-treated mice showed significantly lower sGOT levels (IU/L) as compared with Ad-β-gal or WT controls (509±361 vs. 2819±706 and 2212±841, respectively; p<0.005). These correlated with histologic Suzuki grading of hepatic I/R injury, with WT/Ad-β-gal controls showing significant edema, sinusoidal congestion/cytoplasmic vacuolization, and severe hepatocellular necrosis. In contrast, WT mice treated with Ad-Bcl-xL revealed minimal sinusoidal congestion without edema/vacuolization or necrosis. Ad-Bcl-xL gene transfer significantly reduced local neutrophil accumulation and apoptosis (3.8±2.9 TUNEL+ cells in Ad-Bcl-xL vs. 21.5±5.3 and 23.5±6.5 TUNEL+ cells in WT or Ad-β-gal-treated mice; p<0.01). Unlike in controls, intragraft expression of mRNA coding for TNF-α, E-selectin/ICAM-1, and IP-10/MCP-1 remained depressed in the Ad-Bcl-xL group. Ad-Bcl-xL gene transfer markedly depressed the activation of NF-κB and caspase-3, and increased HO-1, A20, and Bcl-2/Bcl-xL expression, as compared with WT/Ad-β-gal controls. Conclusion: This study demonstrates that inhibition of NF-κB activation contributes to the cytoprotective effects of Bcl-xL gene transfer in hepatic I/R injury. The induction of the anti-oxidant HO-1 and the anti-apoptotic A20 and Bcl-2 by Bcl-xL gene transfer exerts a synergistic cytoprotective effect against the antigen-independent hepatic inflammatory injury induced by I/R. Hepatocyte Growth Factor. Background: Primary graft non-function (PNF) affects survival and function of renal allografts. PNF, secondary to the ischemic and inflammatory injury of the peri-transplant period, leads to acute tubular necrosis and predisposes to acute rejection. Defining new preconditioning regimens to reduce PNF is desirable. A20 is part of a negative anti-inflammatory loop aimed at inhibiting NF-κB in renal proximal tubular epithelial cells (RPTEC).
Hepatocyte growth factor (HGF) is a pleiotropic growth factor upregulated in acute kidney injury and acute rejection that is likely to modulate inflammation and promote repair. In the present study we evaluated the effect of HGF on RPTEC and hypothesized that some of its protective functions may relate to the upregulation of A20. Methods and Results: Treatment of RPTEC with HGF (50 ng/ml) led to a 2.4±0.8-fold (n=4; p=0.04) increase in A20 mRNA (real-time PCR), which translated into a significant 3.0±1.5-fold increase (n=5; p=0.04) in A20 protein by 6 h, as shown by Western blot (WB). Two lines of evidence suggested that upregulation of A20 by HGF was NF-κB-independent. HGF neither degraded IκBα in RPTEC (WB) nor upregulated the NF-κB-dependent molecule ICAM-1, as shown by flow cytometry analysis (FACS). Further, A20 was still upregulated in RPTEC expressing the NF-κB inhibitor IκBα, at both the mRNA and protein levels. Upregulation of A20 by HGF protected RPTEC from a subsequent inflammatory insult, here mimicked by the addition of TNF. Pretreatment of RPTEC with HGF for 6 hours blunted TNF-induced (10 U and 25 U/ml) upregulation of ICAM-1, as analyzed by FACS. Conclusion: To our knowledge this is the first demonstration that A20 can be upregulated in RPTEC in a non-NF-κB-dependent manner. Further studies are being carried out to elucidate the transcription factors involved in HGF-induced upregulation of A20. From a clinical standpoint, these results highlight the unique ability of HGF to protect RPTEC from inflammation by inducing the anti-inflammatory protein A20, remarkably without triggering other pro-inflammatory signals. We propose that HGF-based therapies could serve in preconditioning regimens to prevent ischemia/reperfusion injury and reduce PNF and acute rejection in renal transplantation. Background. Liver ischemia-reperfusion injury (IRI) is one of the main causes of graft dysfunction and rejection in liver transplantation. It has been documented that IRI is associated with inflammatory and complement pathway activation. This study was designed to investigate the efficacy of small interfering RNA expression vectors (shRNA) targeting the TNF-α and complement 3 (C3) genes in protection against mouse liver IRI. Methods. shRNA expression vectors were constructed for the TNF-α and C3 genes. Mice received shRNA by hydrodynamic injection prior to IRI, which consisted of interrupting the blood supply to the left lateral and median lobes of the liver for 45 minutes followed by reperfusion. IRI was evaluated using liver histopathology, as well as levels of serum alanine transferase (ALT) and aspartate transaminase (AST). Neutrophil accumulation was determined by a myeloperoxidase (MPO) assay. Lipid peroxidation was assessed by malondialdehyde (MDA) levels. Real-time PCR was used to test gene silencing efficacy in vitro and in vivo. Results. We demonstrated that IRI is associated with an increase in TNF-α and C3 mRNA levels in liver tissue 6 hours after reperfusion. shRNA treatment effectively down-regulated TNF-α and C3 expression in IRI livers. In comparison with the vehicle control, the serum levels of ALT (9843.31±2610.66 U/L vs 1960.00±1361.98 U/L) and AST (7188.25±3295.99 U/L vs 1405.13±774.65 U/L) were significantly reduced in mice treated with TNF-α and C3 shRNA. Additionally, the neutrophil accumulation and lipid peroxidation-mediated tissue injury, detected by MPO and MDA respectively, were improved after shRNA treatment. Tissue histopathology showed an overall reduction of injury area in shRNA-treated mice.
Conclusion. This is the first demonstration that liver IRI can be prevented through gene silencing of inflammatory and complement genes, showing potential for shRNA-based clinical therapy.

Kidney - Acute Rejection: Antibody-Mediated Rejection

(2) In the first three days after transplantation, a temporary decrease in DSA was observed in all AMR cases, and DSA quickly rebounded thereafter; (3) C4d can be detected very early (it can be seen on day 1, in 89% of day-4 protocol biopsies, frequently in the absence of graft dysfunction, and in 100% of index biopsies); (4) the pathologic changes observed in sequential biopsies were C4d deposition followed by acute tubular injury, then interstitial inflammation and peritubular capillary margination seen in index biopsies; (5) pure AMR occurred early (100% at day 3), usually evolving into mixed AMR with accompanying cellular rejection (87% in index biopsies); (6) most recipients (87%) had initial graft function before developing AMR.

Background: Antibody-mediated rejection (AMR) has been recognized as a major problem in ABO-incompatible (ABO-I) renal transplantation (RTX). However, little is known about the long-term impact of AMR in the ABO-I renal transplant setting, especially after the introduction of a tacrolimus (FK)-based immunosuppressive regimen. The aim of this study was to assess the long-term impact of AMR on the clinical and pathological outcomes in ABO-I RTX. Methods: Fifty-eight patients who underwent ABO-I RTX at our institution between March 1999 and December 2004 under an FK-based immunosuppressive regimen were enrolled in this study. Protocol biopsies were performed regardless of renal function at one month and one year after RTX. Fifty-six of the 58 patients underwent biopsy at one month and 43 of 56 underwent biopsy at one year posttransplant. AMR was diagnosed by morphological features based on the Banff '05 update and other characteristic findings for AMR previously reported, such as mesangiolysis, interstitial hemorrhage, and cortical infarction. We evaluated graft survival, the incidence of chronic rejection characterized by transplant glomerulopathy (TGP) at one year posttransplant, and renal function using serum creatinine at three years posttransplant, according to the incidence of AMR at one month posttransplant. The overall graft survival rate at 3, 5, and 8 years after RTX was 93%, 87%, and 63%, respectively. The incidence of AMR at one month was 34% (19/56). The graft survival rate of the patients with AMR was significantly lower than that of the patients without AMR (p<0.01; 3 years: 79% vs 100%, 8 years: 38% vs 94%). The incidence of TGP in the patients with AMR was significantly higher than that in the patients without AMR (p<0.001, 57% vs 7%). The serum creatinine concentration at three years after RTX was significantly higher in the patients with AMR than in those without AMR (p<0.001, 1.95 mg/dl vs 1.32 mg/dl). In this study, we revealed that AMR in ABO-I RTX is associated with not only graft loss but also the progression of chronic renal impairment, functionally as well as pathologically, even after the introduction of an FK-based immunosuppressive regimen. Further studies are needed to establish a more effective immunosuppressive regimen, such as rituximab induction therapy, against AMR in ABO-I RTX.

Conclusions: De novo DSA in AR is an independent predictor of graft loss and its degree of influence is comparable to other established risk factors (AA race, DGF, increased baseline creatinine).
Additional studies are warranted to: 1) confirm the predictive ability of DSA and 2) determine whether reduction/elimination of DSA will allow improvements in graft survival.

Flow-PRA analysis was performed in all recipients before and six months after LRKT. Graft biopsies were performed as well within and after six months of the transplantation (Tx). Data for all 87 recipients were collected prospectively during the period of follow-up. Humoral rejection rate, donor specificity, and time of appearance of the de novo Abs were studied retrospectively. Results: Among the 87 LRKT recipients, 47 (54%) showed negative/negative results, 15 (17%) showed positive/positive results, 12 (14%) showed positive/negative results, and 13 (15%) showed negative/positive results (de novo Abs) in the pre-/post-transplant Flow-PRA analysis. Among the 13 cases with de novo Abs, 5 (38%) had donor-specific Abs (DSA) and the remaining 8 (62%) had non-donor-specific Abs (NDSA) as determined by LAB single antigen analysis. Four of the five recipients (80%) with DSA showed evidence of both vascular and humoral rejection in the graft biopsies performed within 6 months of the transplantation, while one of the eight recipients (13%) with NDSA showed evidence of cellular rejection during the same period. The 5-year graft survival rate of the recipients with de novo Abs was 69%, compared with 96%, 88%, and 93% in the other groups without de novo Abs (P=0.009). Conclusions: LRKT recipients who develop de novo Abs have a much higher incidence of humoral rejection and a worse prognosis, especially those with donor-specific de novo Abs. Cautious monitoring for the appearance of anti-HLA antibodies should be adopted after transplantation, even in patients without anti-HLA Abs prior to the transplantation.

Despite a significantly higher response than the 4 males w/o AMR (p<0.03), the other 5 females did not experience AMR. Conclusions: 1) CFC is a novel assay to measure allo/DO CD3-cell responses, assess the degree of sensitization, and predict AMR in HS; 2) allo/DO CD3-cell numbers are elevated in many HS, but not NC; 3) HS w/ high(+) CFC are at increased risk for AMR and may need additional pre-Tx desensitization; 4) allo/DO reactivity is higher in HS females, which may explain their higher rate of AMR; 5) CFC cut-off levels for AMR prediction may be higher in females than males; 6) monitoring HS using the CFC pre- and post-desensitization may help determine the efficacy of desensitization and the risk for AMR.

Pts with E-AMR were treated with plasmapheresis, IVIg and rituximab, and Pts with L-AMR received IVIg and rituximab. The 2-year GS post-AMR in Pts with E-AMR and focal C4d staining was 33% vs. 50% in Pts with diffuse staining, while cases of L-AMR with focal C4d deposition had a GS of 55% vs. 60% in cases with L-AMR and diffuse staining. The number of cases with focal staining was low, and the numerically evident differences were not statistically significant (log-rank p=NS). Notably, when losses due to death with a functional graft were censored, post-AMR GS was significantly lower in Pts with E-AMR and focal staining than in their counterparts with diffuse C4d deposition (41% vs. 67%, log-rank p=0.04). In this retrospective single-center study, focally positive C4d AMR carries a worse prognosis than previously thought, and causes a significant reduction of GS. Whether any degree of C4d staining in the context of KT dysfunction should be treated as AMR remains a pending question.

Time of biopsy was 29.1±43.0 mo after KT.
However, 172 cases were biopsied in the 1st year posttransplant. The extent of C4d staining was graded as <10% (0), 10-50% (1), and ≥50% (2) of PTC, and the intensity was graded as none (0), light (1), and strong (2) staining. These findings demonstrate the significant discordance between detection of DSA and C4d, which is relatively specific histological evidence of Ab-mediated injury. This factor should be taken into account when clinical decisions for treatment of patients with either C4d or DSA positivity are made. The observed discrepancy could be partially due to the technique used for staining of the biopsy specimens, inability to detect anti-HLA DSA with the available technology, or non-HLA DSA. Long-term follow-up data are needed to evaluate the impact of these markers on graft outcomes.

Introduction: Epithelial-to-mesenchymal transition (EMT) is a potential mechanism of tissue fibrogenesis. In a previous study, we reported that the early expression of EMT markers was associated with the progression of renal grafts towards interstitial fibrosis and tubular atrophy (IF/TA). Here, we report the long-term follow-up of this cohort, paying special attention to the evolution of graft function. Patients and Methods: 83 patients engrafted with a kidney from a cadaveric (n=63) or a living (n=20) donor, and in whom sequential protocol biopsies had been performed at 3 and 12 months, were included. The phenotype of epithelial cells was studied at three months according to the expression of vimentin (an intermediate filament normally expressed by fibroblast-like cells) and to the cellular localization of β-catenin. Grafts in which these two markers were abnormally expressed by more than 10% of tubular cells were considered EMT+ grafts. Serum creatinine and creatinine clearance (estimated by the Cockcroft-Gault index) were collected from 12 to 24 months post-transplant and compared according to the EMT status of the graft. Results: Multivariate analysis demonstrated that the early expression of EMT markers was an independent risk factor for the progression of graft fibrosis between 3 and 12 months. More importantly, these early phenotypic changes were associated with a progressive and sustained deterioration of graft function: EMT+ patients had a statistically higher serum creatinine from twelve months after transplantation, and a significantly lower creatinine clearance from 18 months after transplantation (EMT+ 49.4±4.5 ml/min vs EMT- 61.1±2.3 ml/min, p=0.01). The difference persisted at 24 months. Conclusion: The expression of EMT markers by tubular epithelial cells at an early time point post-transplant (three months) is highly suggestive of an ongoing fibrogenic process, and has repercussions on long-term graft function. Therefore, these epithelial phenotypic changes are relevant and promising biomarkers for early detection of IF/TA.

We recently reported that 73% of transplant glomerulopathy (TG) cases have evidence of alloantibody-mediated injury in biopsies for cause. We found that 1/3 of TG is C4d+Ab+ and 1/3 is C4d-Ab+, suggesting that C4d staining is not sensitive enough to detect all biopsies with antibody-mediated injury. We aimed to develop a new laboratory test to detect biopsies with antibody-mediated rejection (ABMR) that are missed by C4d. Using Affymetrix microarrays, we analyzed gene expression in 173 renal allograft biopsies for cause.
We previously reported that both ABMR and T cell-mediated rejection (TCMR) biopsies show increased expression of transcript sets associated with cytotoxic T cells (CATs) and gamma-interferon effects (GRITs) compared to biopsies without rejection (p<0.05). However, ABMR biopsies were discriminated by a selective increased expression of 25 "endothelial cell-associated transcripts" (ENDATs). These genes included established endothelial markers such as VWF, PECAM1, SELE, CD34, and cadherin 5, which are involved in endothelial-cell activation. Hierarchical clustering of 82 Ab+ biopsies using ENDATs identified a group of C4d-Ab+ biopsies (n=20) that clustered with C4d+Ab+ biopsies (n=18). Thus 25% of biopsies with antibody (20 of 82) had increased ENDAT scores despite being negative for C4d. These C4d-Ab+ biopsies with high ENDATs had higher scores for CATs and GRITs, increased incidence/severity of TG and tubular atrophy/interstitial fibrosis, and worse future graft function (p<0.05), but a similar incidence of TCMR or borderline lesions, in comparison to C4d-Ab+ biopsies with no increase in ENDATs. The C4d-Ab+ cases with endothelial activation show extensive inflammation in the allograft, as measured by the gene sets, similar to C4d+ ABMR. There are a significant number of cases with alloantibody and no C4d that show increased expression of endothelial genes. Thus transcriptomics detects deteriorating C4d- allografts with ongoing alloantibody-mediated injury. We conclude that increased expression of endothelial genes provides a new feature of ABMR, and can be used as a new diagnostic test to detect and treat C4d- ABMR.

The Bayesian network model was analyzed and, interestingly, CD86 was critically related to TG, suggesting a B-cell-mediated process. TG was also predicted by upregulation of CCL2, CCL3, CCL5, CXCL9, IL-8, IL-10, and ICAM gene expression. Ten percent of the samples were excluded randomly from the initial model and subsequently used for cross-validation. In the validation analysis, the model effectively predicted TG (AUC of 0.92, 90% PPV) and SF (AUC of 0.866, 83% PPV). This study provides a compelling and clinically relevant example of the combination of quantitative gene expression with probabilistic Bayesian modeling to predict renal allograft pathology. Potentially important molecular pathways associated with transplant glomerulopathy were also identified. The application of this integrated approach has broad implications in the field of transplant diagnostics and interpretation of large data sets.

1, 6, 12, 24, 36, 48, and 60 months, respectively. The daily dose and blood levels of TAC were significantly lower in the TAC/SLR group compared to the TAC/MMF group. Renal function is shown. Despite the lower prevalence of CAI in the TAC/SLR group, long-term graft function and patient and graft survival are comparable between the TAC/MMF and TAC/SLR groups.

Objective: The aim of this study was to determine if ethnicity impacts graft outcomes in kidney transplant patients converted to sirolimus (SRL) and either maintained on calcineurin inhibitors (CI) or mycophenolate (MMF) with steroids. Methods: This was a retrospective analysis of all kidney transplants converted to SRL and transplanted from 7/91 to 4/07. Patients were divided into 4 groups: group 1: AAs converted to SRL + continued on CI; group 2: non-AAs converted to SRL + continued on CI; group 3: AAs converted to SRL + continued on MMF; group 4: non-AAs converted to SRL + continued on MMF.
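The ENDAT analysis described above rests on hierarchical clustering of biopsies by endothelial transcript scores. The following is a minimal sketch of that kind of clustering, not the authors' actual pipeline: the gene set, score values, and two-cluster cut are all invented for illustration.

```python
# Illustrative hierarchical clustering of biopsies by endothelial
# transcript (ENDAT-like) scores. All values below are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Rows: biopsies; columns: per-transcript scores for an endothelial
# gene set (e.g., VWF, PECAM1, SELE, CD34, CDH5).
quiet = rng.normal(0.0, 0.3, size=(40, 5))    # quiescent endothelium
active = rng.normal(1.5, 0.3, size=(20, 5))   # activated endothelium
scores = np.vstack([quiet, active])

# Average-linkage clustering on Euclidean distance, cut into 2 clusters.
tree = linkage(scores, method="average", metric="euclidean")
groups = fcluster(tree, t=2, criterion="maxclust")

# Biopsies in the high-ENDAT cluster would be flagged for possible
# antibody-mediated injury even when C4d staining is negative.
for g in sorted(set(groups)):
    members = np.where(groups == g)[0]
    print(f"cluster {g}: n={len(members)}, mean score {scores[members].mean():.2f}")
```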
Pediatric and multiorgan transplants were excluded. Results: A total of 257 patients were included (61% AA). Demographics, baseline immunosuppression, and reason for SRL conversion were similar between groups. Table 1 displays characteristics and outcomes. Patients converted to SRL+CI regimens had higher rates of acute rejection before SRL conversion (p<0.02), but equal rates after conversion. Development of proteinuria was similar across groups. Figure 1 displays the graft survival rates for each group. AA patients converted to SRL+MMF tended to have poorer outcomes compared to AA patients converted to SRL+CI. Non-AA patients converted to SRL+MMF tended to have better graft outcomes compared to non-AA patients converted to SRL+CI, although this did not reach statistical significance (p=0.186). Conclusion: AAs converted to SRL may benefit from continued CI, while non-AAs converted to SRL appear to have better outcomes with MMF. Further prospective studies are warranted to confirm these findings.

There are no large registry studies evaluating the correlation of allograft failure for recipients of kidneys from the same deceased donor. We examined outcomes in such recipient pairs using data from the United States Renal Data System. Methods: We studied the correlation of graft failure events within 19,461 pairs of same-donor recipients transplanted during 1995 through 2003. Analyses were limited to patients with functioning grafts 3 months post-transplant (tx) and adjusted for known donor, recipient, and tx management factors. We estimated odds ratios to measure the increased risk of 1-, 2-, and 3-year graft failure and death-censored graft failure when the contralateral kidney had such an event. We also evaluated the effect of recipient pairs transplanted at the same center vs different centers. Results: There is a strong correlation in outcomes for 2 recipients with the same donor (table). The correlation was stronger within pairs transplanted at the same center than for those transplanted at different centers. Differences in the correlation of graft failures within pairs transplanted at the same versus separate centers diminished over 2 years and were absent by 3 years post-tx. Results for death-censored allograft failure were similar. Conclusion: Unmeasured donor factors contribute significantly to the correlated graft failure outcomes in paired recipients of deceased donor kidneys and need further study. Kidney tx outcomes in the first year may be affected by differences in management between transplant centers, more so than in subsequent years post-tx. (Table: odds ratio of death-censored graft failure and graft failure in recipients of a donor pair, given that the outcome occurred in the recipient of the contralateral kidney, with both recipients transplanted at the same (S) center or at different (D) centers; all odds ratios significant with p<0.001.)

The Deterioration of Kidney Allograft Function (DeKAF) study is an NIH-funded multicenter observational study of late allograft (ktx) loss. The study examines two cohorts: a "long-term cohort" (LTC) of prevalent ktx with SCr < 2.0 mg/dL and deterioration of function (25% increase in SCr or proteinuria), and a "prospective cohort" (PC) of incident ktx developing a persistent >25% increase in SCr. We examined the pathologic features of the first renal biopsy (bx) obtained for new-onset deterioration in each cohort. All bx were read and scored centrally using a modified Banff schema.
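The same-donor pair analysis above summarizes correlated graft failure as an odds ratio. A minimal sketch of that computation on an invented 2×2 table (the counts below are not the study's data), with a Wald 95% confidence interval:

```python
# Odds ratio for graft failure given contralateral-kidney failure,
# with a Wald 95% CI. All counts are hypothetical.
import math

#                       contralateral failed   contralateral survived
a, b = 120, 480     # index graft failed
c, d = 600, 18261   # index graft survived

or_ = (a * d) / (b * c)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```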
On average, bx were obtained at 6 (PC) and 75 (LTC) months post-tx. Mean SCr was similar, but more patients (pts) in the LTC had proteinuria. Moderate to severe interstitial fibrosis and tubular atrophy (TA) were more prevalent in the LTC than the PC (33 vs 13% with ci score ≥2; 33 vs 9% for TA). The rate of vascular sclerosis >25% was similar (10.87% PC vs 15.46% LTC); however, hyaline arteriolar sclerosis >25% was more common in the LTC (30 vs 4.35%). Interstitial inflammation (i) and tubulitis (t) scores were similar in both cohorts. However, more pts in the LTC had peritubular capillary infiltrates (>5 cells) and evidence of tx glomerulopathy. While rates of interstitial inflammation and tubulitis sufficient to warrant a diagnosis of acute rejection are similar in the prospective and long-term cohorts, long-term cohort pts had more proteinuria, interstitial fibrosis, tubular atrophy, hyaline arteriolar sclerosis, and transplant glomerulopathy. Analyses of histologic findings and renal outcome are ongoing.

The term chronic allograft nephropathy (CAN) has been abolished by the last Banff Meeting Report (Am J Transplant, 2007) and 2 categories have been introduced for chronic changes: chronic active T cell-mediated rejection and chronic active humoral rejection (CAHR). The aim of the study was to review all cases of CAN diagnosed in the last 4 years and to identify immunohistochemical markers of chronic rejection. A cohort of 79 CAD pts with biopsy-proven CAN was analyzed. Each case was reviewed and assigned to 1 of 3 groups according to Banff 2005 criteria: chronic rejection (CR), chronic calcineurin toxicity (CNIT), or chronic lesions not otherwise specified (NOS). CD4+, CD8+, CD20+, and CD68+ cells and C4d deposits were assessed by immunohistochemistry. Twenty-eight pts were classified as CNIT, 34 as CR (19 of which were CAHR), and 17 as NOS. Serum creatinine and 24-h proteinuria at renal biopsy, and the extent of interstitial fibrosis and glomerulosclerosis, were not significantly different among the 3 groups (Table 1). The number of CD4+ cells was higher at the TI level in CR compared to CNIT (Table 1; *p=.05). CD8+ cells were higher at the TI and G levels in CR compared to CNIT (Table 1; *p=.05). TI and G CD20+ cells were not different among the 3 groups (Table 1). The number of G CD68+ cells was increased in CR compared to CNIT and NOS (Table 1; *p=.05). No significant difference in CD20, CD4, CD8, or CD68 expression was found at the TI and G levels between C4d+ and C4d- cases of CR. CD68, CD8, and CD4, but not CD20, expression at the TI level correlated with TI fibrosis (R²=.105, .077, .156, respectively; p<.05) on univariate analysis. Only TI CD4+ cells independently correlated with fibrosis on multiple regression analysis. In conclusion, our data suggest that:

Morbid obesity limits access to kidney transplantation and predicts adverse transplant outcomes. There are limited data on the safety and efficacy of gastric bypass (GB) as a weight reduction therapy among transplant candidates and recipients. Methods: We examined USRDS registry data to identify Medicare-insured kidney transplant candidates and recipients with billing claims for GB procedures. GB procedures were categorized according to occurrence before listing, on the waitlist, or after transplant. We studied the clinical characteristics of GB-treated patients, and subsequent outcomes including progression from listing to transplant and 30-day mortality.
USRDS surveys BMI data at dialysis start, waitlist entry, transplant, and transplant anniversaries. We computed changes between the most recently reported body mass index (BMI) values preceding and following GB, when available. Results: We identified 72 transplant candidates treated with GB before listing, 29 who underwent GB on the waitlist, and 87 GB cases after transplant. Patients treated with GB were most commonly female, of white race, and without diabetic or hypertensive renal failure (Table). 30-day mortality after GB, calculable for listed and transplanted patients, was 3.4% and 3.4%, respectively. One transplant recipient experienced graft failure within 30 days of GB. Of 29 patients treated with GB on the waitlist, 20 proceeded to transplant. Post-GB weight loss was detected in 60% with GB pre-listing, 70% with GB on the waitlist, and 90.7% with GB after transplant. Among patients listed for transplant in the same era with BMI >35 at first dialysis who were not treated with GB, 22% had lost weight between dialysis start and listing. Conclusions: GB has been performed in small numbers of kidney transplant candidates and recipients, and is followed by weight loss in the majority of cases. Peri-operative mortality is comparable to reports in patients without kidney disease. GB warrants prospective study as a strategy for reducing complications of obesity in ESRD.

INTRODUCTION: The present study investigated the incidence of posttransplant diabetes mellitus (PTDM) and calculated the risk of developing PTDM under tacrolimus- and mycophenolate mofetil (MMF)-based immunosuppression, based on clinical characteristics, tacrolimus pharmacokinetics, and genetic polymorphisms related to tacrolimus pharmacokinetics, cytokines, and diabetes mellitus. METHODS: Seventy-one non-diabetic adult kidney recipients (37 male, 34 female) were studied. Patients with continuously high plasma glucose levels, hemoglobin A1c over 6.5%, or a requirement for insulin and/or oral anti-diabetic agents for more than 3 months, assessed at 1 year after transplantation, were diagnosed as having PTDM. Fifteen genomic polymorphisms were assessed. RESULTS: One year after transplantation, 21 recipients (29.6%) had developed PTDM. Positive risk factors were age (p=0.028) and body mass index (p=0.048). There were no significant differences in acute rejection rate, total steroid doses, tacrolimus pharmacokinetics, or its related genetic polymorphisms between the two groups. The frequencies of PTDM were significantly higher in patients with the adiponectin T45G TT genotype than in those with the G allele (p=0.034), and in patients with the glucocorticoid receptor (NR3C1) BclI CC genotype than in those with the G allele (p=0.004). CONCLUSIONS: The incidence of PTDM at 1 year after transplantation was 29.6% in our cohort. Older or obese patients were at higher risk of developing PTDM. The presence of the adiponectin T45G TT or NR3C1 BclI CC genotype may also be a risk factor for PTDM, suggesting that insulin and glucocorticoid sensitivity-related genes are associated with the development of PTDM. Analysis of these genotypes is a possible method of predicting a patient's risk of developing PTDM and would be a valuable asset in selecting appropriate immunosuppressive regimens for individuals.

Persistent hyperparathyroidism (HPT) with hypercalcemia is common after renal transplantation. Studies have shown that treatment with cinacalcet corrects hypercalcemia and lowers PTH levels in these patients.
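The genotype comparison in the PTDM abstract above (PTDM frequency in adiponectin T45G TT carriers vs. G-allele carriers) is the kind of small 2×2 contrast often tested with Fisher's exact test; the abstract does not name its test, and the counts below are invented, so this is purely a sketch of the approach:

```python
# Hypothetical 2x2 genotype-vs-PTDM comparison using Fisher's exact test.
from scipy.stats import fisher_exact

#            PTDM   no PTDM
table = [[12, 18],   # TT genotype (invented counts)
         [ 9, 32]]   # G-allele carriers (invented counts)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```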
So far, cinacalcet's steady-state pharmacokinetics and their correlation with pharmacodynamics (PK/PD) have only been studied in hemodialysis patients, not in renal transplant recipients with persistent HPT. To gain further insight into cinacalcet's effects on calcium-phosphate homeostasis, we determined its steady-state pharmacokinetics and pharmacodynamic effects in these patients. In a prospective, single-center, open-label study we examined the effect of a 2-week treatment with 30 mg and a subsequent 2-week treatment with 60 mg cinacalcet daily on calcium-phosphate homeostasis over 24 hours, and determined the steady-state pharmacokinetics of cinacalcet in stable renal allograft recipients. Urinary calcium excretion was determined in timed urine samples. Median AUC(0-24) was 784.8 ng·h/ml and Cmax was 68.5 ng/ml for 60 mg cinacalcet, which are higher, and oral clearance (CL/F) was 76.9 l/h, which is lower, in renal transplant recipients compared with previously published data from hemodialysis patients (50 mg cinacalcet: AUC(0-24) 179 ng·h/ml, Cmax 17.2 ng/ml, CL/F 279 l/h). We also observed a non-proportional increase of AUC(0-24) after doubling of the cinacalcet dose. Once-daily administration of cinacalcet dose-dependently reduced iPTH and serum calcium. Cinacalcet and parathyroid hormone (PTH) concentrations showed an inverse correlation and were fitted to a simple Emax model (Emax = 80% reduction vs. baseline, EC50 = 13 ng/ml). The 8-hour fractional urinary excretion of calcium was increased after 60 mg cinacalcet (baseline 0.85±0.17%, 30 mg 1.53±0.35%, 60 mg 1.92±0.37%). Renal function remained stable. Cinacalcet's higher and non-proportional increase of AUC(0-24) in transplant recipients compared to hemodialysis patients raises the possibility of a pharmacokinetic interaction with concomitant cyclosporine treatment. Cinacalcet effectively corrected the biochemical abnormalities of persistent HPT. The transient calciuria could potentially favor nephrocalcinosis and reduce bone mineral density, suggesting that higher doses of cinacalcet should be used with caution in renal transplant recipients with severe persistent hyperparathyroidism.

Screening for Proteinuria in the Kidney Transplant Clinic. Bryce A. Kiberd,¹ Romuald Panek.¹ ¹Dalhousie University, Halifax, NS, Canada.

Proteinuria is a predictor of progression in kidney disease. It is not clear whether measuring albuminuria has greater clinical utility than measurement by dipstick or total proteinuria in kidney transplant recipients. There has also been a trend away from using 24-hour collections toward spot urine albumin/creatinine (AC) and protein/creatinine (PC) ratios. We compared the prevalence of proteinuria estimated by dipstick, AC, and PC in prevalent patients (>6 months post-transplant) in the kidney transplant clinic. Significant albuminuria, defined as ≥30 mg/g, was present in 52% (211/403). Albuminuria was seen in 29% (70/245) of patients with negative and 67% (34/51) of patients with trace dipstick proteinuria. Significant predictors of albuminuria in a multivariate logistic analysis were eGFR (OR 0.977 per ml/min/1.73 m², 95% CI 0.965-0.989, p=0.001), diastolic BP (OR 1.028 per mmHg, 95% CI 1.008-1.048, p=0.007), and MMF use (OR 0.50, 95% CI 0.32-0.77, p=0.002). Macroalbuminuria (AC >299 mg/g) was seen in 17.4% (70/403), and significant predictors in a multivariate logistic analysis were lower eGFR and higher systolic BP. Sirolimus use was associated with more macroalbuminuria and MMF use with less macroalbuminuria.
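The cinacalcet abstract above fits PTH suppression to a simple Emax model with Emax = 80% reduction vs. baseline and EC50 = 13 ng/ml. A worked sketch of that concentration-effect relationship (the example concentrations are illustrative; 68.5 ng/ml is the reported Cmax at 60 mg):

```python
# Simple hyperbolic Emax model for cinacalcet-induced PTH suppression,
# using the parameters reported above (Emax 80%, EC50 13 ng/ml).
def pth_reduction(conc_ng_ml: float, emax: float = 80.0, ec50: float = 13.0) -> float:
    """Percent reduction in PTH vs. baseline at a given concentration."""
    return emax * conc_ng_ml / (ec50 + conc_ng_ml)

for c in (5.0, 13.0, 68.5):  # at EC50 the effect is half of Emax (40%)
    print(f"{c:5.1f} ng/ml -> {pth_reduction(c):4.1f}% PTH reduction")
```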
In a subset of patients followed for >2 years, prior GFR loss was considerably greater (p=0.021) in patients with albuminuria (-0.88 ml/min/1.73 m²/year) compared to those without (-0.15 ml/min/1.73 m²/year). However, other measures of proteinuria were also significantly associated with prior GFR loss (ml/min/1.73 m²/year), as shown below (p for trend = 0.02):

PC (g/g): <0.13 (n=185), 0.13-0.49 (n=109), >0.50 (n=80)
ΔeGFR/year: -0.33, -0.26, -1.40

A PC cut point of 0.129 g/g had a sensitivity and specificity for albuminuria >30 mg/g of 87% and 88%, respectively (c=0.92, 95% CI 0.89-0.95), and a PC cut point of 0.49 g/g had a sensitivity and specificity for macroalbuminuria >300 mg/g of 100% and 94%, respectively (c=0.98, 95% CI 0.97-0.99). AC may be more sensitive and therefore have more clinical utility than other measures of proteinuria for progression. However, prospective follow-up of renal function change and CV outcomes is required.

Serum creatinine is a crude marker of GFR in renal transplant recipients, and changes in GFR are frequently not accompanied by commensurate changes in serum creatinine concentration. Serum cystatin C and estimates of GFR (eGFR) based on cystatin C have been shown to be more accurate than serum creatinine and creatinine-based eGFR in renal transplant recipients. The purpose of this study was to determine whether the Filler, Le Bricon, and Rule cystatin C-based eGFR equations were better able to detect changes in true GFR than the MDRD and Cockcroft-Gault creatinine-based eGFR equations. We performed two measurements of 99mTc-DTPA GFR, serum creatinine, and serum cystatin C in each of 183 stable renal transplant recipients at least 6 months apart. We calculated and compared the percent annual change in the measured GFR and the estimated GFR using the various GFR estimation equations. We also determined the sensitivity, specificity, positive predictive value, and negative predictive value of each prediction equation for the detection of a decline in measured GFR. The cystatin C- and creatinine-based eGFR equations all demonstrated poor sensitivity and diagnostic performance for detecting a decline in GFR. Novel equations derived and validated in the transplant population are needed to accurately assess kidney function over time.

Background: Hypercalcemia, hypophosphatemia, and renal phosphate wasting are common after kidney transplantation and are related to persistent hyperparathyroidism and hyperphosphatoninism. Animal data suggest that these alterations in mineral metabolism may contribute to nephrocalcinosis and progressive graft dysfunction. Supporting clinical data are limited. Aim: To test the hypothesis that nephrocalcinosis is highly prevalent in the early posttransplant period and is related to a disturbed mineral metabolism. Methods: Biomarkers of mineral metabolism (including albumin-corrected serum calcium [Cac], serum phosphorus [P], biointact PTH, calcidiol, calcitriol, and alkaline phosphatase) and renal calcium and phosphorus excretion parameters were prospectively assessed in 201 renal transplant recipients (62% male, mean age 55±14 yrs) at the time of their 3-month protocol biopsy. These protocol biopsies were screened for the presence of microcalcifications. Intratubular, interstitial, and/or cytoplasmic microcalcifications were observed in 30.4% of biopsies. Calcifications were more prevalent in recipients of a living related donor as compared with a cadaveric donor.
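The PC cut-point performance figures in the proteinuria-screening abstract above are sensitivities and specificities against an albuminuria reference standard. A minimal sketch of that computation, on invented paired measurements (the cut points follow the abstract; the data do not):

```python
# Sensitivity/specificity of a protein/creatinine (PC) cut point for
# detecting albuminuria (AC >= 30 mg/g). Measurements are invented.
def sens_spec(pc_values, ac_values, pc_cut=0.13, ac_cut=30.0):
    tp = fp = tn = fn = 0
    for pc, ac in zip(pc_values, ac_values):
        if ac >= ac_cut:                       # albuminuric by reference
            tp, fn = (tp + 1, fn) if pc >= pc_cut else (tp, fn + 1)
        else:                                  # non-albuminuric
            fp, tn = (fp + 1, tn) if pc >= pc_cut else (fp, tn + 1)
    return tp / (tp + fn), tn / (tn + fp)

pc = [0.05, 0.10, 0.20, 0.45, 0.60, 0.16, 0.10, 0.70]   # g/g
ac = [10, 25, 80, 250, 400, 15, 40, 500]                # mg/g
sens, spec = sens_spec(pc, ac)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```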
High serum Cac levels, high serum PTH levels, a high urinary Ca×P product, high fractional excretion of P, and low serum P levels were significantly associated with renal microcalcifications (see figure). Microcalcifications were not related to the fractional excretion of Ca, use of diuretics, immunosuppressive regimen, serum alkaline phosphatase level, or history of delayed graft function. The extent of microcalcifications correlated significantly with the severity of mineral metabolism disturbances. Conclusion: Our data demonstrate that nephrocalcinosis is highly prevalent in the early posttransplant period and suggest that a disordered mineral metabolism is implicated in its pathogenesis.

Polymorphism in ABCB1, the Gene Encoding P-Glycoprotein, Predicts Recovery of Graft Function Early after Kidney Transplantation.

The pharmacokinetics of cyclosporine (CsA) is characterized by wide inter-individual variability. This might be particularly relevant in the early post-transplant period, due to the detrimental effects of the drug on the kidney. P-glycoprotein (P-gp), the product of the ABCB1 gene, plays a key role in the distribution of CsA at the cellular level. Single nucleotide polymorphisms (SNPs) of ABCB1 might potentially influence the response of patients to CsA. In particular, the SNP at position 3435 of exon 26, despite its silent nature, has recently been associated with altered specificity for its ligands, such as CsA (Kimchi-Sarfaty et al., Science 2007;315:525). Whether P-gp pharmacogenetics would help to guide CsA treatment early post-transplant remains ill-defined. We sought to evaluate the effects of the SNPs in exon 26 on the rate of recovery of graft function, as estimated GFR early postoperatively, in 150 kidney transplant patients given CsA as part of their immunosuppressive regimen. The frequency of DGF (defined as the need for post-operative dialysis) among the different ABCB1 genotypes was also estimated. Of the 150 kidney transplant recipients, 35% had the ABCB1 wild-type (C/C) genotype in exon 26, 40% were heterozygous (C/T), and 25% were homozygous (T/T) for the polymorphic variant at position 3435. GFR values were significantly lower in patients carrying one of the two mutant alleles than in the wild type (Figure). The frequency of DGF was 15%, 28%, and 30% in patients with the CC, CT, and TT genotypes, respectively. These findings demonstrate that in patients carrying the CT or TT mutant alleles in exon 26 of the ABCB1 gene and given CsA, the recovery of graft function is less prompt, and the risk of developing DGF higher, than with the wild-type CC genotype. Pre-transplant screening for ABCB1 polymorphism would help to identify patients who may safely receive CsA early post kidney transplantation.

Cigarette smoking has been shown to reduce graft and patient survival in renal transplant patients. However, whether it can directly produce allograft dysfunction has not been investigated. The aim of this study was to assess the influence of smoking on renal graft function. We studied a cohort of 827 adult renal transplant patients, transplanted from Jan/96 to Dec/05 and followed until Dec/06. Smoking habits were recorded at the time of transplant (never, former, current smoker). During the summer of 2007, a telephone survey allowed us to obtain complete information about smoking habits in 642 patients (aged 47.6±13 years, 65% male): status (never, former, current), years of habit, years since quitting, and number of cigarettes smoked per day. The number of "pack-years" was calculated.
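Pack-years, the exposure measure computed in the smoking abstract above, has a standard definition: cigarettes per day divided by 20 (one pack), multiplied by years of smoking. A one-line sketch:

```python
# Pack-years = (cigarettes per day / 20 per pack) x years smoked.
def pack_years(cigarettes_per_day: float, years_smoked: float) -> float:
    return cigarettes_per_day / 20.0 * years_smoked

print(pack_years(15, 20))  # 15 cigarettes/day for 20 years -> 15.0 pack-years
```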
Renal function was measured by inverse serum creatinine at the 3rd month and then annually. The time to a thirty percent decline in inverse serum creatinine was recorded. Patients were divided into two groups: those who smoked throughout the entire transplant period (smokers, n=94) and those who did not (n=546).

Renal insufficiency occurs frequently after extrarenal transplantation as a result of acute tubular necrosis at transplantation, high blood pressure, and CNI toxicity. We performed 56 renal biopsies in 54 patients after heart (4), lung (20), liver (22), bone marrow (9), and cornea (1) transplantation since 2000. The time from transplantation to biopsy was 49±53 months overall and was longest after liver (60±61 months) and shortest after bone marrow transplantation (37±43 months). The histologic changes were: tubular atrophy/interstitial fibrosis of ≥20% in 52% of biopsies (heart biopsies 75%, lung 70%, liver 46%, bone marrow 22%); acute tubular changes in 54% (heart 25%, lung 65%, liver 46%, bone marrow 56%); arteriolar hyalinosis in 41% (heart 75%, lung 65%, liver 23%, bone marrow 22%); arterionephrosclerosis in 41% (heart 50%, lung 50%, liver 36%, bone marrow 22%); glomerular sclerosis of ≥20% in 21% (heart 50%, lung 25%, liver 18%, bone marrow 22%); glomerulonephritis in 13% (heart 25%, liver 18%, bone marrow 22%; namely IgA nephropathy after heart and liver, immune complex nephritis twice after liver, MPGN after liver, and membranous glomerulonephritis and minimal change disease after bone marrow transplantation); thrombotic microangiopathy in 7% (lung 10%, liver 5%, bone marrow 11%); and, finally, one case of polyoma nephritis after lung transplantation. Among 1702 heart and lung transplantations, 13 patients needed kidney transplantation (0.8%) after 96±40 months; and among 2221 liver transplantations, 11 needed kidney transplantation (0.5%) after 112±65 months (significantly later, p=0.001). Conclusion: As expected, most histologic changes were those of CNI toxicity and hypertension. Surprising is the high number of glomerulonephritis cases under immunosuppression. Thrombotic microangiopathy without the typical clinical signs was interpreted as CNI-related toxicity and seems to occur more often after extrarenal than after renal transplantation. Patients with heart and lung transplantation reach end-stage renal failure more often and earlier than patients with liver transplantation.

Context: Living kidney transplantation, a superior therapy to deceased donor kidney transplantation, is underutilized. States have enacted legislation and the federal government has launched initiatives to compensate living organ donors, but the effect of policy on improving living kidney donation rates in the United States is unknown. Objective: To determine whether public policies are associated with changes in living kidney donation rates in the continental U.S. Design, setting, and study subjects: Series of cross-sectional analyses using records of state legislatures in 48 continental states and living kidney donation rates from the United Network for Organ Sharing. Main Outcome Measures: Living kidney donation rate during each year from 1988-2006 and change in donation rates before and after legislation enactment in each state and launch of federal initiatives. Results: From January 1990 through December 2005, 28 states enacted legislation for living donors (24 mandating paid leave, 8 tax deductions, 3 unpaid leave, 2 encouraging paid leave). Few states (n=5) enacted legislation prior to 1999.
There was a steady increase in the mean living kidney donation rate in the continental U.S. during the study period (mean (standard deviation) annual increase of 1.5 (0.04) donations per 1,000,000 population). In analyses accounting for the length of time state legislation had been enacted, the types of legislation enacted, and the incidence and prevalence of ESRD in each state, there was a slightly (but not statistically significantly) greater average annual increase in donations after compared to before state legislation enactment (annual increase in donations per 1,000,000 population [95% confidence interval (

Introduction: Accurate and precise renal function assessment is essential in the evaluation of prospective kidney donors. While direct measurement of GFR is the "gold standard", it is not widely available. Moreover, creatinine (SCr)-based estimation equations are suboptimal for assessing kidney function in this setting. CT scans are increasingly being used to study renovascular anatomy in donors and have replaced angiographic exams in many institutions. 3D imaging reconstruction allows for kidney volume (KV) measurements, which have been shown to correlate highly with measured GFR in this population. Thus, the purpose of this study was to develop a model to estimate measured GFR that incorporates not only SCr and demographic data but also KV as measured by 3D CT scans. Methods: 244 individuals who underwent donor evaluation were identified. An automated segmentation algorithm was used to measure renal parenchymal volume from preoperative abdominal CTs. Patient demographics and SCr values were obtained from the medical records. GFR (normalized for BSA) was measured by 125I-iothalamate renal clearance (iGFR). An analysis of covariance model was created to correlate measured iGFR with KV, patient age, sex, race, weight, height, and SCr. Pearson's correlation coefficient was calculated for each variable. Results: KV (p<0.001), age (p<0.001), SCr (p<0.001), and weight (p<0.001) significantly correlated with iGFR. Sex (p=0.60), race (p=0.90), and height (p=0.76) were not statistically significant. The new fitted regression model is: KV-eGFR (ml/min/1.73 m²) = 70.77 - (0.444 × age) + (0.366 × weight) + (0.200 × volume) - (37.317 × SCr). We then compared the performance of the KV-eGFR model to the re-expressed MDRD equation using a calibrated SCr assay. The R² was 0.61 vs 0.50, respectively; the signed median % difference (% difference between estimated GFR and iGFR) was +1.8% vs -12.7%, respectively; and accuracy within 30% (% of estimated GFR values that fall within 30% of iGFR) was 96.3% vs 89.8%, respectively. Finally, the KV-eGFR model was closer to iGFR (in absolute values) than the MDRD equation in 177/244 (70.5%) cases vs 72/244 (29.5%) cases, respectively. Conclusions: Kidney volume correlates highly with iGFR, and the proposed GFR estimation model outperforms the MDRD equation in potential living kidney donors. The KV-eGFR model could be used to estimate donor GFR in lieu of 125I-iothalamate GFR, which is less widely available in clinical practice.

for donor selection, current reports identify unsuspected renal pathology on time-0 biopsy. Aim: To explore whether the findings on time-0 renal biopsy (bx) correlate with pre-donation clinical data, including renal function. Methods: KT databases from 2 institutions were reviewed.
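The fitted KV-eGFR regression reported in the donor-evaluation abstract above translates directly into code. Units for weight (kg), volume (ml), and SCr (mg/dl) are assumed, since the abstract reports only the coefficients; the example donor values are invented:

```python
# KV-eGFR model from the abstract above:
# 70.77 - 0.444*age + 0.366*weight + 0.200*volume - 37.317*SCr
def kv_egfr(age_yr: float, weight_kg: float, volume_ml: float, scr_mg_dl: float) -> float:
    """Estimated GFR (ml/min/1.73 m^2) from kidney volume, demographics, and SCr."""
    return 70.77 - 0.444 * age_yr + 0.366 * weight_kg + 0.200 * volume_ml - 37.317 * scr_mg_dl

# Hypothetical 40-year-old, 75 kg donor with 300 ml parenchymal volume
# and SCr 0.9 mg/dl:
print(f"{kv_egfr(40, 75, 300, 0.9):.1f} ml/min/1.73 m^2")  # ~106.9
```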
Time-0 renal bx are routinely performed from the upper pole during back-table preparation and evaluated by 2 nephropathologists for interstitial fibrosis (IF), tubular atrophy (TA), arteriolar hyalinosis (AH), mesangial increase (MI), and glomerulosclerosis (GS). Pre-donation data gathered from the donors were demography, body weight, BMI, systolic/diastolic BP, SCr, proteinuria, and eGFR by the Levey equation. Clinical data are summarized in the table. GS showed no correlation. Multivariate analysis failed to sustain the significant associations found on bivariate analysis, most likely due to a low event/parameter ratio. Conclusions: A significant correlation was observed between time-0 bx findings and clinical pre-donation parameters. Whether these histological findings at the time of kidney donation represent a higher burden/risk for the remaining kidney ought to be evaluated during follow-up. In an era where living donation is increasing, we should advise closer surveillance of these donors in order to modify risk factors that contribute to the progression of kidney damage.

Predictors of Poor Early Graft Function Following Laparoscopic Donor Nephrectomy (LDN). Matthew Cooper,¹ Abdolreza Haririan,¹ Stephen Jacobs,¹ Michael Phelan,¹ Benjamin Philosophe,¹ Stephen Bartlett,¹ Joseph Nogueira.¹ ¹Dept of Surgery, Urology, and Medicine, University of Maryland, Baltimore, MD.

LDN has become the standard of care in many transplant centers. Poor early graft function remains an important complication. We conducted a retrospective study to evaluate the risk factors for slow or delayed graft function following LDN. METHODS: Donor and recipient records from the first 1000 LDN (1996-2005) were reviewed. RESULTS: Slow graft function (SGF) was defined as Cr >3.0 mg/dl at POD5 and DGF as the need for dialysis within the first week following transplantation. Donor variables examined included age, sex, race, BMI, eGFR, and number of renal arteries. Recipient variables included age, sex, race, BMI, prior tx, pre-tx DM, and history of smoking and drug use. Additional variables evaluated included degree of relationship (LURT), HLA mismatch, antilymphocyte induction, de novo CNI usage, right vs. left nephrectomy, WIT, total OR time, and performance of simultaneous deceased donor pancreas tx (SPLK). Univariate analysis was performed with significance defined as p<0.05. Significant variables were then included in the multivariate analysis.

Background: A kidney exchange program is the logistic solution for patients with a positive crossmatch (X+) or ABO-incompatible donor. A major problem in all kidney transplantation programs is the sensitized recipient. We analyzed the success rate for immunized recipients in the Dutch kidney exchange program. Methods: From January 2004 till December 2007, 242 donor-recipient pairs were registered. There were 117 couples in the X+ group, while 125 pairs were ABO-incompatible. In the X+ group the median PRA was 46% (2-100%). To create new combinations, a match program was run 16 times (every 3 months) with a median of 46 (16-66) participating couples. Allocation criteria included blood type (first identical, then compatible), HLA match probability within the actual exchange donor pool (to ensure that highly sensitized recipients have the best chance to receive a kidney), and wait time on dialysis. Crossmatches between new donors and recipients were performed centrally in our reference laboratory with CDC tests.
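The Dutch match run described above allocates by blood-group compatibility and gives priority to recipients with the fewest options in the pool. The following is a greatly simplified two-way-exchange sketch of that idea; real match runs also use HLA typing, PRA, wait time, and central crossmatching, and all pairs below are invented:

```python
# Toy two-way kidney-exchange match run: blood-group compatibility plus
# priority for recipients with the fewest compatible donors in the pool.
from itertools import combinations

COMPATIBLE = {"O": {"O", "A", "B", "AB"}, "A": {"A", "AB"},
              "B": {"B", "AB"}, "AB": {"AB"}}  # donor -> acceptable recipients

# (pair id, donor blood group, recipient blood group) -- invented pool.
pairs = [(1, "A", "O"), (2, "O", "A"), (3, "B", "A"),
         (4, "A", "B"), (5, "AB", "O"), (6, "O", "AB")]

def ok(donor_bg, recip_bg):
    return recip_bg in COMPATIBLE[donor_bg]

# For each pair, count how many other donors in the pool could serve its
# recipient; fewer options = higher priority (a crude proxy for sensitization).
options = {p[0]: sum(ok(q[1], p[2]) for q in pairs if q[0] != p[0]) for p in pairs}

matched, used = [], set()
# Consider hardest-to-match recipients first.
for p, q in sorted(combinations(pairs, 2),
                   key=lambda pq: min(options[pq[0][0]], options[pq[1][0]])):
    if p[0] in used or q[0] in used:
        continue
    if ok(p[1], q[2]) and ok(q[1], p[2]):   # mutual compatibility = 2-way swap
        matched.append((p[0], q[0]))
        used.update({p[0], q[0]})

print("two-way exchanges:", matched)
```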
Results: After 16 match runs, we found matching couples for 83/117 (71%) of X+ pairs, for 21/87 (24%) of ABO-incompatible pairs with O recipients, and for 29/38 (76%) of ABO-incompatible pairs with non-O recipients. The median PRA of the 83 recipients in the X+ group was 42% (2-100%). After 3 match runs, the chances of success became small. The overall success rate for ABO-incompatible and X+ pairs in the Dutch kidney exchange program after 4 years is 55%. However, the success rate for immunized patients in the X+ group is significantly higher (71%), as our match program gives priority to those recipients with the smallest chance of finding a compatible donor in each match run. Thus our kidney exchange program is especially suited to immunized patients.

Although a paired kidney donation (PKD) program is an established method to overcome incompatibilities between kidney donor-recipient pairs (DRP), a significant proportion of the incompatible DRP participating in such a program can remain unmatched. Domino kidney transplantation (KT), in which an altruistic living non-directed donor kidney (LNDK) is offered to a pool of incompatible DRP and used to initiate a chain of PKD transplants, could provide more transplantation opportunities for DRP in a PKD program. We present our experience of multicenter domino KT over the last 7 years. Sixteen hospitals participated in domino kidney transplantation between February 2001 and July 2007. 181 domino kidney transplants were performed in 69 domino-KT chains initiated by an altruistic LNDK: 43 two-pair chains, 14 three-pair chains, 9 four-pair chains, 2 five-pair chains, and 1 seven-pair chain.

The Development of a Multi-Regional Kidney Paired Donation Program.

Using an optimization matching algorithm, the New England Program for Kidney Exchange (NEPKE) efficiently matches incompatible donor/recipient pairs. In 2006, the Mid-Atlantic Paired Exchange Program (MAPEP) and other individual centers began sharing data with NEPKE. This is a report of the success of two regional programs working together to increase the probability of participants in both programs finding a compatible match. Methods: Incompatible pairs and non-directed donors (NDD) are referred to NEPKE through transplant centers. Donor and recipient ABO, HLA, and recipient HLA antibody screening results are entered into the computer database. Utilizing the optimization program, searches for compatible matches are conducted every 30 to 45 days. The program identifies potential 2- and 3-way matches, NDD chains, and list exchange chains. Following the determination of compatibility, NEPKE notifies the transplant centers involved of the potential match, and centers accept or decline the offer. Transplant centers notify their pairs and preliminary crossmatches are performed. A conference call is scheduled to coordinate simultaneous donor nephrectomies and recipient transplants. Surgeons speak prior to incision to ensure simultaneous donation. Results: From July 2005 to December 2007, 186 pairs and 10 NDDs entered NEPKE. During this time frame over one thousand possible matches were identified, with the majority of these matches involving the same pairs in multiple matches. After "optimizing" and eliminating multiple matches, 13 two-way exchanges, 15 three-way exchanges, 7 three-way list chain exchanges, and 25 NDD chain matches were offered to transplant centers as possible matches. The most common reasons offers were declined included positive crossmatch (21.6%), donor factors (20%), and recipient inactive or transplanted (20%).
Four offers occurred in MAPEP-alone pairs, 47 in NEPKE, and 14 offers were cross-regional. Three matches are pending for 9 additional transplants. One previous and one pending transplant are the result of cross-regional exchanges. Five transplants performed and 6 pending involve NDD chains. One pending match involves a list exchange chain. Conclusion: Using a computerized optimization algorithm to match 2- and 3-way exchanges, NDD chains, and list exchange chains has led to a substantial increase in the number of KPD matches and transplants performed. Cross-regional coordination is feasible and expands the number of transplants performed beyond the ability of individual exchange programs.

As living donors become an increasingly important source of life-saving organs, there is growing concern about the lack of comprehensive research on donor outcomes. In addition to possible long-term consequences, there is a risk that donors will experience complications during or following surgery. Of the 13,000 living kidney donors in 2005-2006, none died during surgery, and 0.5% (n=68) needed blood transfusions during surgery. In the six weeks following donation, 3.9% (n=512) had at least one serious adverse event (SAE): 1.7% (n=220) needed readmission following initial discharge, 0.6% (n=72) needed an interventional procedure, 0.5% (n=61) needed re-operation, 0.3% (n=41) had vascular complications, and 2.1% (n=274) had other complications. When all SAEs were considered in combination, the rate was 3.9%. Because over 50% of LDR forms were submitted by transplant centers fewer than 6 weeks post-donation, all complication rates should be considered minimum estimates. One donor was reported to have died from donation-related causes within 6 weeks of donation. The number of living donor kidney transplants performed by a transplant center in 2005-2006 ranged from 1 to 362. Additionally, the risk of donor complications is not equal across transplant centers. For example, there was a significant correlation between the number of living donor kidney transplants performed at a transplant center and the percentage of that center's patients who were readmitted within 6 weeks of donation, with greater donor volume associated with a lower rate of readmission. Living kidney donation is relatively safe, but prospective donors should be made aware that there is a non-trivial risk (3.9%) of short-term complications post-donation. As with many other major surgical procedures, complication rates are lower, on average, at institutions that perform a larger number of these procedures.

We sought to determine whether intensive screening improves detection of polyomaviral reactivation in asymptomatic patients and whether pre-emptive stepwise modification of immunosuppression can improve the outcome of polyomavirus nephropathy (PVN). Methods: This is a prospective single-center study. We randomly assigned de novo KT recipients (cluster randomization) to Intensive Screening (IS: n=648) or Routine Care (RC: n=260) for the detection of decoy cells. IS was initiated at week 4 post-KT and RC at the time of an increase in serum creatinine. This was complemented with urine and blood nucleic acid testing. All patients had biopsies performed for detection of PVN. Both groups were treated with pre-specified stepwise modification of IT based on cell cytology and viremia (Step 1: decrease the dose of CellCept by 50%; Step 2: decrease the dose of tacrolimus by 50%, or switch to sirolimus therapy; Step 3: discontinue CellCept).
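The pre-specified stepwise reduction of immunosuppression in the screening study above can be read as a small decision procedure. A sketch of that logic, with the three steps taken from the abstract and the persistence check simplified to booleans (an assumption):

```python
# Sketch of the stepwise immunosuppression-modification protocol above:
# Step 1: halve mycophenolate (CellCept); Step 2: halve tacrolimus or
# switch to sirolimus; Step 3: discontinue mycophenolate. The boolean
# persistence test below is a simplification for illustration.
STEPS = [
    "halve mycophenolate dose",
    "halve tacrolimus dose (or switch to sirolimus)",
    "discontinue mycophenolate",
]

def next_action(step_index: int, decoy_cells: bool, viremia: bool) -> str:
    """Return the next protocol step while viral activity persists."""
    if not (decoy_cells or viremia):
        return "resolved: continue routine monitoring"
    if step_index >= len(STEPS):
        return "all steps exhausted: biopsy and re-evaluate"
    return f"Step {step_index + 1}: {STEPS[step_index]}"

# Example three-month reviews for one hypothetical patient.
for i, (decoy, viremia) in enumerate([(True, True), (True, False), (False, False)]):
    print(next_action(i, decoy, viremia))
```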
The primary outcome was persistence of decoy cells/viremia following each step of IT modification, assessed every three months; secondary outcomes included acute rejection, graft function, and graft loss. Results: Polyomaviral reactivation developed in 11.7% of the IS group, and 40% of these had PVN without changes in serum creatinine. The estimated cumulative rate of the primary outcome in the IS versus RC groups was, at 3 months, 51% vs. 79% (relative risk [RR] 0.47; 95% CI 0.20-1.08; p=0.051); at 6 months, 20% vs. 52% (RR 0.67; 95% CI 0.38-1.20; p=0.009); and at 12 months, 2% vs. 33% (RR 0.88; 95% CI 0.41-1.43; p=0.003). Secondary outcomes: despite similar degrees of stepwise IT modification, the difference in the rate of acute rejection was non-significant (p=0.25), but 25% of patients in the RC group lost the graft versus no graft loss in the IS group. Patients who remained on tacrolimus were more likely to have persistent viremia than those switched to sirolimus therapy (OR 10.25; 95% CI 1.2-92.0; p=0.003). Conclusion: IS for polyomaviral reactivation allows early detection of PVN in the presence of stable graft function. Stepwise modification of IT resulted in early resolution of decoy cells and viremia in both groups, albeit more slowly in the RC group, and it did not prevent graft loss in the RC group.

BACKGROUND: DNA sequencing of the BK virus (BKV) genome Non-Coding Control Region (NCCR) from individual patient isolates demonstrates divergent sequences and alterations in the arrangement of modularly conserved sequence blocks (P-Q-R-S) ranging in size from 39 to 68 base pairs. AIM: Our primary aim was to molecularly clone and analyze patient-derived BK virus NCCR sequence variants. Our secondary aim was to determine whether reporter gene constructs of patient-derived NCCR variants differ in their promoter activity upon transfection into a mammalian cell line (Vero) and a human primary tubular epithelial cell line. METHODS: BKV DNA was amplified and sequenced from blood and urine samples of 18 renal transplant recipients. Via sequence alignment, unique NCCRs were identified. These sequences were PCR-amplified and cloned into reporter plasmids containing the Renilla luciferase gene. Promoter activity was measured via luminometer 24 hours after transfection into Vero cells and human tubular epithelial cells. RESULTS: Variation in naturally occurring BKV NCCR promoter regions exists as single-basepair insertions and deletions, and insertions or deletions of partial sequence blocks (P-Q-R-S). Single-basepair substitutions were most commonly seen (46% of analyzed samples). Promoter activity within Vero cells ranged from 188% to 39% of the NCCR activity of an archetypal strain (WWb). Low promoter activity (<50%) was seen in isolates with duplications of the P block and large deletions of the R block. CONCLUSION: These sequence blocks are rich in regulatory elements and control the expression of both BKV structural and regulatory genes. Variation in these cis-acting eukaryotic transcriptional promoter binding sites corresponds to differential promoter activity in naturally occurring BKV isolates. Current plans include repeating the promoter activity studies in human primary tubular epithelial cells.

Humoral and Cellular Immunity to Polyomavirus BK Large T and VP1 Antigens after Pediatric Kidney Transplantation.

Polyomavirus BK-associated nephropathy (BKVN) has emerged as a cause of graft failure after kidney transplantation (KTx).
In a cohort of 37 pediatric renal recipients undergoing prospective three-monthly monitoring for BK by blood and urine Q-PCR, we evaluated the antibody response, measured by enzyme immunoassay using BK VLP, and the cellular immune response, reported as the frequency of IFNγ-secreting cells in an ELISPOT assay after 9-day stimulation with BKV large T (LT) and VP1 peptides. We could not observe any influence of recipient pre-transplant BKV-specific IgG or T-cell levels, which were generally low, on BKV infection after allografting. After transplantation, both specific IgG levels and the frequency of BKV-specific T cells increased according to the degree of viral exposure. In detail, BKV-seropositive patients who never reactivated the virus (group 1, n=10) did not show a significant increase in IgG levels (from a median OD of 0.37 at month +1 to 0.39 at the end of follow-up), while patients with urinary shedding alone (group 2, n=14) or with viremia (group 3, n=13) increased from a median OD of 0.3 to 1.2 (p=0.09), and from 0.26 to 2.8 (p<0.0005), respectively. In the case of cellular immunity, VP1-specific T cells increased in all three groups. Conversely, LT-specific T cells, which were high (median 71 SFU/10⁵ cells) and remained unchanged throughout the follow-up period in group 1 patients, increased significantly in recipients belonging to both groups 2 and 3. Interestingly, patients with urinary shedding who did not progress to viremia showed a median 3-fold increase in LT-specific T-cell levels at peak viruria, compared to no increase observed in patients who developed viremia. The latter group mounted a significant response to LT only after therapeutic reduction of immunosuppression. At peak viruria, viremic patients already showed an 8-fold rise in specific IgG compared to the 1.5-fold increase observed in group 2 recipients. Our data suggest that an inability to reach protective levels of BKV LT-directed T cells, rather than specific IgG, predisposes KTx recipients to BKV replication.

Introduction: The antibody response to human BK virus (BKV) is incompletely characterized. Antibody responses to the VP-1 protein have been detected in kidney transplant patients, but it is not known whether these have virus-neutralizing activity. Methods: Recombinant BK, JC, and SV40 virus-like particles were used to produce a panel of 20 monoclonal antibodies. These antibodies were characterized for isotype and for the ability to bind the respective antigens in ELISA assays. To test neutralizing activity, BKV Gardner strain (ATCC# VR837) viral particles were incubated with the corresponding antibodies for 2 hours at 37°C and used to infect WI-38 cells. BKV infection was monitored by quantitative real-time PCR using primers directed against the VP-1 gene. Neutralizing activity was defined as greater than 95% inhibition of viral yield. Results: The monoclonal antibodies were of the IgG2a or IgG2b class, with the exception of one IgG1 and one IgM antibody. All 12 anti-BKV monoclonal antibodies bound to BKV capsids in vitro in ELISA assays. This binding was species-specific, as only 1 antibody showed weak binding activity to JCV and SV40 capsids. Neutralization of infectious BK virus was shown for 10/12 antibodies. Denaturation of capsid proteins indicated that the monoclonal antibodies recognized primarily conformational epitopes, with only one monoclonal antibody appearing to have a linear component.
Paucity of linear epitopes was further suggested by the lack of reactivity of sera from BKV-seropositive subjects in ELISA assays based on genotype-specific short peptide sequences derived from the BKV VP-1 loop region. Four monoclonal antibodies each generated from JCV capsids and SV40 capsids did not show any BKV-neutralizing activity. Conclusions: BKV VP-1 protein capsids contain species-specific conformational epitopes which can elicit virus-neutralizing and non-neutralizing antibody responses. Measurement of these antibodies in renal transplant recipients may have diagnostic and prognostic applications. BKV-specific monoclonal antibodies deserve further study as potential therapy of acute infections in the viremic phase.
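The neutralization call in the abstract above rests on a single arithmetic definition: greater than 95% inhibition of viral yield relative to a no-antibody control, with yield measured by VP-1 qPCR. A minimal sketch of that cutoff, with hypothetical copy numbers:

```python
def percent_inhibition(copies_with_mab: float, copies_control: float) -> float:
    """Percent inhibition of viral yield relative to a no-antibody control."""
    return 100.0 * (1.0 - copies_with_mab / copies_control)

def is_neutralizing(copies_with_mab: float, copies_control: float,
                    threshold: float = 95.0) -> bool:
    """Apply the >95% inhibition cutoff used in the abstract."""
    return percent_inhibition(copies_with_mab, copies_control) > threshold

# Hypothetical VP-1 qPCR yields (copies/mL), for illustration only:
print(is_neutralizing(2.0e4, 1.0e6))   # 98% inhibition -> True
print(is_neutralizing(2.0e5, 1.0e6))   # 80% inhibition -> False
```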
Impact of Immunosuppression Reduction in BK Viremic Patients: 3 Year Follow-Up. S. Kuppachi, 1 A. Guasch, 1 C. P. Larsen, 1 K. E. Kokko. 1 1 Transplant Center, Emory University, Atlanta, GA. Background: Development of BK nephropathy is a risk factor for allograft loss. BK viremia (BKV) precedes the development of BK nephropathy. It has been reported, with 1-year follow-up, that reduction of immunosuppression leads to control of BKV. Here, we report our 3-year follow-up experience with BKV patients identified by a prospective screening protocol and managed by sequential reduction of immunosuppression. Methods: All kidney or kidney-pancreas transplant recipients at Emory University between 5/03 and 5/04 were screened prospectively for BKV by real-time PCR during follow-up visits (months 1-6, 9, 12, 24, and 36). Patients with a BK viral load greater than 10,000 copies/ml blood received a kidney biopsy to screen for BK virus nephropathy by immunohistochemistry. BKV without nephropathy was managed with a 50% reduction in mycophenolate dose; BKV with nephropathy resulted in discontinuation of mycophenolate. All identified BK patients were monitored every 2-4 weeks until the viral load was below 10,000 copies/ml. Immunosuppression was further reduced if viral loads failed to decrease. Results: 135 recipients were followed over a 36-month period. 24 patients had received a simultaneous kidney and pancreas, 2 a liver and kidney, and the rest a kidney alone. 27 of 135 (20%) patients developed BKV within a year of transplantation. The average time to diagnosis of BKV was 4.7 months. The average dose of mycophenolate at 3 years was 1.3 g/d in the BKV-negative population compared to 0.56 g/d in the BKV-positive population. The average 12-hour trough blood level of tacrolimus at 3 years was 9.1 ng/ml in the BKV-negative population compared to 7.4 ng/ml in the BKV-positive population. Patient and organ survival rates at 3 years were comparable in BK viremic vs. BK-negative patients (patient: 100% vs. 92%; organ: 88% vs. 83%). While BK viremia is a historic risk factor for organ loss, prospective monitoring and reduction of immunosuppression were associated with 3-year patient and organ survival comparable to that of patients who never developed BK viremia.

Background: There are currently no BK virus (BKV)-specific therapies available for clinical use. This study evaluates the viral large T antigen as a potential target for drug development, since (a) it is a key molecule that participates in several different stages of viral replication, (b) it has no homologous human protein, and (c) it offers multiple functional domains for chemical binding, particularly the ATP binding site, the DNA binding site, the hexamerization surfaces, and the hinge region. Methods: Virtual screening and protein modeling techniques were applied to BKV large T antigen using a model developed from the known crystal structure of SV40 large T antigen. Two structural states of large T antigen (monomer and dimer) were each evaluated in three nucleotide-pocket states (empty, ADP-bound, and ATP-bound), for a total of 6 large T antigen receptor conformations. Results: A computational solvent mapping analysis of small molecular probes allowed identification of multiple functional sites, which represent potential drug binding pockets on the large T antigen molecule. It was possible to classify 1200 molecular conformations centered on the ATP binding site, hexamerization surface, and hinge region of the viral protein. We docked 1209 medium-sized fragments (<100 Da) to confirm the results obtained by computational solvent mapping and to further characterize the chemical properties of the ATP binding site. In another approach, known chemical structures of Hsp90 and Rho-kinase inhibitors were used to search compound databases and obtain a subset of 8294 compounds (mean size 350 Da) capable of docking to large T antigen. By cross-referencing the top solutions obtained by energy ranking, we identified 50 compounds that bind large T antigen in all 6 conformational states (a toy sketch of this cross-referencing step follows below). A subset of 5 compounds simultaneously binds the ATP binding site, hexamerization surface, and hinge region of BKV large T antigen. Conclusions: Virtual screening and three-dimensional homology modeling have allowed us to identify compounds that can bind multiple sites on the large T antigen. These compounds are predicted to preferentially inhibit viral replication without the toxicity expected from simultaneous inhibition of host cell kinases.
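The cross-referencing step in the virtual-screening abstract above (keeping only compounds that rank well against every one of the 6 receptor conformations) amounts to intersecting top-ranked hit lists. A minimal sketch under that reading; the compound IDs, rankings, and conformation labels are hypothetical placeholders:

```python
from functools import reduce

# Hypothetical docking results: for each of the 6 large T antigen receptor
# conformations, a list of compound IDs sorted by docking energy (best first).
ranked_hits = {
    "monomer_empty": ["c042", "c117", "c007", "c512"],
    "monomer_ADP":   ["c117", "c042", "c300", "c007"],
    "monomer_ATP":   ["c042", "c007", "c117", "c664"],
    "dimer_empty":   ["c007", "c042", "c117", "c888"],
    "dimer_ADP":     ["c042", "c117", "c007", "c219"],
    "dimer_ATP":     ["c117", "c042", "c951", "c007"],
}

def compounds_binding_all(ranked: dict, top_n: int) -> set:
    """Intersect the top-N ranked compounds across every receptor conformation."""
    top_sets = (set(hits[:top_n]) for hits in ranked.values())
    return reduce(set.__and__, top_sets)

print(compounds_binding_all(ranked_hits, top_n=3))  # e.g. {'c042', 'c117'}
```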
BK virus (BKV), a human polyomavirus, causes BKV nephritis, which often leads to graft loss after renal transplantation. Currently, the only effective therapy against BKV nephritis appears to be a reduction or change of immunosuppressive agents, and this may increase the inherent risk of rejection. Since human renal proximal tubular epithelial cells (HRPTEC) represent a main natural target of BKV nephropathy, the analysis of BKV infection of HRPTEC is likely to provide necessary additional insight into BKV biology and to contribute to the development of strategies for the treatment of BKV nephritis. Here we report the ability of the 3-hydroxy-3-methyl-glutaryl coenzyme A (HMG-CoA) reductase inhibitor pravastatin, which is routinely used to treat hypercholesterolemia, to repress BK virus entry pathways in HRPTEC and, correspondingly, to prevent BKV infection. The percentage of HRPTEC infected with BKV was assessed by immunofluorescence analysis in the absence and presence of pravastatin. Both the percentage of BKV-infected cells and the intensity of BKV infection, assessed by western blotting using antibodies against large T antigen, were significantly decreased in HRPTEC treated with pravastatin. Pravastatin's inhibitory effect is likely explained by depletion of caveolin-1, a critical element of caveolae. We demonstrate that BKV enters HRPTEC by caveolar-mediated endocytosis and that disruption of caveolin-1 mRNA and protein inhibits BKV infection of HRPTEC. We provide evidence that pravastatin dramatically decreased caveolin-1 expression in HRPTEC and interfered with the internalization of labeled BKV particles. Our data suggest that pravastatin, acting via depletion of caveolin-1, prevented caveolar-dependent BKV internalization and repressed BKV infection of HRPTEC. Our data represent the first report of an inhibitory action of statins upon BKV infection.

Results: 74.6% of patients were Caucasian, 12.2% Hispanic, 8.9% African-American, and 4.4% Asian. Overall 1- and 3-year patient survivals were 85% and 77%. Stratified by race, only African-American survival differed from Caucasian (3-year survival 71% vs. 77%; Figure 1). Compared to Caucasians, African-American patients were younger (48 ± 11.5 vs. 52 ± 9.9 years) and were more likely to be Status 1 (10.3% vs. 5.1%), to have a serum creatinine ≥1.5 mg/dL (37% vs. 31%), to receive a multi-organ transplant (8% vs. 5%), to have fulminant hepatic failure (12% vs. 7%), to have a higher MELD score (22.9 ± 10.4 vs. 20.1 ± 9.4), to be in the ICU (16% vs. 11%), and to be ventilated pre-transplant (8% vs. 5%); all p-values <0.001. After adjustment for each of these variables, race was still an independent predictor of mortality (p=0.01; HR 1.22, CI 1.048-1.425). Multivariate analysis of the African-American group (n=1481) alone revealed that a BMI greater than 40 (p=0.001; HR 2.5, CI 1.49-4.17), a creatinine greater than 1.5 mg/dL (p=0.002; HR 1.6, CI 1.18-2.10), and ICU admission pre-transplant (p=0.018; HR 1.5, CI 1.07-2.17) are independent predictors of mortality in this subset of patients. Conclusion: In the current MELD era, race is still a predictor of worse outcomes, even after adjustment for multiple clinical variables. Further work is necessary to elucidate why this disparity exists.

The purpose of this study was to analyze the effects of successive pregnancies in female liver transplant recipients on newborn and maternal outcomes. Data were collected from the National Transplantation Pregnancy Registry via questionnaires, phone interviews, and hospital records. Analyses for linear trends (proportions and continuous variables) were done by chi-square and least-squares regression. There were 217 outcomes of 213 pregnancies, including twins. Of the 125 liver recipients who had a first pregnancy, 61 had between one and four subsequent pregnancies. There were no significant differences in the variables analyzed. CONCLUSIONS: Successive pregnancies in liver transplant recipients are not associated with adverse fetal outcomes and/or increased maternal graft loss. Female liver recipients with excellent allograft function, without significant recurrent disease or chronic rejection, who wish to have more than one pregnancy should not be discouraged from conceiving.

Complications were recorded and categorized in a blinded fashion. Univariate, multivariate, and survival analyses were performed. Over a 4.5-year period (2002-2007), 148 OLT recipients were randomized to receive a CCA with a biliary stent (n=76) or CCA alone (n=72). Patients with hepatic artery thrombosis (n=8; 5.7%) were excluded. There was no significant difference in demographic or graft-related variables. The mean age at transplant was 55 years, 82% were male, the mean MELD was 21, and 23% had hepatitis C. The mean donor age was 46 years, 59% of donors were male, and the mean cold ischemia time was 8.7 hrs. The only complication related to the biliary stent was one occlusion. The rate of overall BC in stented patients was 24.7% vs. 32.8% in non-stented patients (p=NS).
However, stented patients had significantly fewer BC in the first 60 days post-OLT (6.8% vs. 21.0%, p<0.02) and significantly fewer anastomotic leaks (2.4% vs. 9.6%, p<0.05). Over the year following OLT, stented patients also required fewer biliary therapeutic interventions (mean 1.8 vs. 0.9 interventions/patient, p<0.04) and fewer readmissions (mean 2.8 vs. 1.6 readmissions/patient, p<0.01). We also observed improved late (>6 mo) graft survival in the stented group. Intraoperative stenting of the CCA at OLT does not appear to reduce the long-term rate of BC, but it decreases the incidence of biliary leaks and significantly improves many facets of early patient management.

Background: Long-term outcome after retransplantation of the liver (re-OLT) is inferior to that of primary OLT. However, the survival benefit of re-OLT based on the Model for End-Stage Liver Disease (MELD) is not known. A single-center analysis of 421 adult patients who underwent re-OLT between February 1984 and February 2007 was performed. Survival benefit, at a given MELD score, was calculated by comparing re-OLT survival at 3 months to the expected 3-month survival without retransplantation. Results: Of 421 pts, 380 underwent re-OLT and 41 received 3 transplants. The figure shows a re-OLT survival benefit at any MELD score, with increased significance in patients with MELD scores >18. Although MELD scores of 30-40 predicted the highest mortality after re-OLT, they also demonstrated a survival benefit. Multivariate Cox regression identified cold ischemia time >10 hrs (RR 1.8, P=0.001), MELD 30-40 (RR 2.0, P=0.04), time from first OLT of 8 days-1 year (RR 1.8, P=0.02), and third transplant (RR 1.7, P=0.03) as independent predictors of mortality following re-OLT. Conclusions: Re-OLT should be considered in all patients, even those with low MELD scores. Although patients with high MELD scores (30-40) exhibit poor survival outcomes, a survival benefit is achieved. Survival benefit, which considers both the probability of death without re-OLT and the expected survival with re-OLT, should be used for the selection of retransplantation candidates.

This study analyzed posttransplant complications, MELD score, and donor age and their effect on length of stay (LOS). Methods: This IRB-approved retrospective review of our prospectively maintained database included 1427 liver transplant recipients transplanted between 1999 and 2006. LOS <14 days, 14-30 days, and >30 days were analyzed. The primary analysis of LOS, MELD, and donor age was performed using the Wilcoxon two-sample test. Complication groups were analyzed using logistic regression, and odds ratio estimates were determined (a minimal sketch of this kind of analysis follows below). Kaplan-Meier analysis of patient survival for the LOS groups was performed. Univariate analysis: allograft dysfunction, vascular complications of the liver allograft, intra-abdominal (other than liver), biliary, cardiac, pulmonary, neurologic, sepsis, renal, and endocrine complications were statistically significant (p<0.001) by Fisher exact test. Multivariate analysis: using logistic regression, significant complications were analyzed to determine those linked with LOS >14 days and >30 days. Analysis of 1427 liver transplants demonstrated increasing LOS with increasing MELD in each donor age group (50-60 and 61-70 years; p<0.0001). For donor age >70, this relationship was not significant (p=0.2206). Multivariate analysis using logistic regression determined odds ratio estimates for each complication resulting in LOS >14 days and >30 days. LOS >14 days and LOS >30 days were associated with different complications (as shown above). Allograft dysfunction (including PNF) and renal complications were not significant factors in multivariate analysis. Patient survival significantly decreased (p<0.001) with LOS >14 days.
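The LOS abstract above derives odds ratio estimates for complication categories from a multivariate logistic regression. A minimal sketch of that kind of model using statsmodels; the predictor names, coefficients, and synthetic data are hypothetical, not the study's:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Synthetic binary predictors standing in for complication categories.
biliary = rng.integers(0, 2, n)
cardiac = rng.integers(0, 2, n)
sepsis  = rng.integers(0, 2, n)

# Synthetic outcome: probability of LOS > 14 days rises with complications.
logit = -1.5 + 0.8 * biliary + 0.6 * cardiac + 1.2 * sepsis
los_gt_14 = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([biliary, cardiac, sepsis]))
fit = sm.Logit(los_gt_14, X).fit(disp=False)

# Odds ratio estimates with 95% CIs, as in the abstract's multivariate analysis.
print(np.exp(fit.params))      # ORs (first entry is the intercept)
print(np.exp(fit.conf_int()))  # 95% CIs
```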
Introduction: Some patients with primary biliary cirrhosis (PBC) may require long-term corticosteroid (CS) therapy following liver transplantation (OLT) due to recurrent inflammation in the graft. Our center has attempted to minimize CS use in all of our OLT recipients. We reviewed our experience in this cohort of patients to determine 1) patient outcome, including recurrent disease, and 2) the long-term requirement for CS use in PBC patients. Methods: From 1988 to 2006, 1,102 OLTs were performed in 1,032 adults at the University of Colorado, of which 70 patients (6.8%) with PBC received 74 allografts. Recurrence was defined by characteristic histologic changes on biopsy. Bivariate and multivariate analyses were used to evaluate predictors of CS withdrawal. Thirteen potential predictors of CS discontinuation were considered, including age, gender, BMI, race, presence of inflammatory bowel disease (IBD), type of graft (cadaver or living donor (LD)), recurrence of AIH, warm ischemia time, follow-up time (time since transplant), and immunosuppressant (IS). Results: Overall survival at 5 years was 85%. The 1-, 5-, and 10-year recurrence-free survival was 90%, 72%, and 54%, respectively. Disease recurred in 18 patients (25.7%); none of these 18 patients received a second transplant because of recurrent disease. CS had been withdrawn in 73% of patients at the time of review. Independent predictors of CS discontinuation were age above the median (p=0.0029) and LD graft type (p=0.0018). Conversely, cyclosporine (CSA) (p=0.0012), female gender (p=0.0197), and BMI >31 (p=0.0306) were negatively associated with CS withdrawal. Interestingly, CS withdrawal did not influence PBC recurrence. Conclusions: 1) Long-term outcomes in PBC patients are favorable, and disease recurrence can be managed medically without re-transplantation. 2) Using an aggressive CS minimization approach, almost 3/4 of patients were CS-free at the time of last follow-up. 3) Increasing age and LD grafts were associated with successful CS withdrawal, while CSA, female gender, and increasing BMI were associated with unsuccessful CS withdrawal.

Cholestatic disease (CD), either chronic rejection or recurrent primary sclerosing cholangitis (PSC), occurs post liver transplantation (LT) in 5-35% of PSC grafts. The study objectives were to evaluate the incidence and long-term outcome of CD. From 1985 to 2006, 141 grafts in 125 consecutive PSC patients and 85 grafts in 79 concurrent alcoholic liver disease (ALD) patients were compared. CD was diagnosed by biliary imaging and/or histology. Median follow-up was 57 months. The groups were similar, including MELD score and cold ischemia time; however, Roux-en-Y biliary anastomosis was more common in the PSC group (95% vs. 13%, p<0.01). The PSC group had more CMV hepatitis (22% vs. 6%, p<0.01) and acute rejection (60% vs. 30%, p<0.01), and fewer biliary anastomotic strictures (7% vs. 16%, p<0.01). CD occurred in 38 PSC grafts and 7 ALD grafts (p<0.01). The incidence of CD was greater in the PSC group (p=0.02, Fig. 1). No significant risk factors for CD were identified. In the PSC group, graft survival was lower in the CD group (p=0.04, Fig. 2); however, patient survival at 10 and 15 years was not affected by CD (73% and 67% vs. 75% and 60%), because the re-LT rate was greater in this group (26% vs. 6%, p<0.01).
At 15 years, patient survival in the PSC group was better than in the ALD group (64% vs. 22%, p<0.01). Long-term outcome post-LT for PSC is good, and better than for ALD; however, CD continues to develop beyond the first 5 years, with a prevalence of 43% at 15 years. Graft loss in patients with CD is high, but with re-LT, patient survival is similar to that of non-CD patients (a Kaplan-Meier sketch of this kind of comparison follows below). Further studies to distinguish chronic rejection from recurrent PSC may provide insight into the prevention of CD following LT for PSC.

Discussion: Our analysis showed that patients with AIH have worse long-term survival compared to PBC and PSC after DDLT. This may be explained by possibly more advanced disease at the time of presentation, or by more septic complications due to pre-transplant salvage therapy with immunosuppressants. Also, PBC had the worst relative outcome after LDLT in this group, possibly explained by the older age of the recipients. Overall, LDLT offered better outcomes than DDLT in terms of survival in patients with AIH, PBC, and PSC. This study highlights an important and previously unexamined aspect of transplantation for autoimmune and cholestatic liver diseases.

Inflammatory Bowel Disease Course in Patients Transplanted for Primary Sclerosing Cholangitis. Ariana Wallack, 1 Joel S. Levine, 2 Lisa Forman. 2 1 Internal Medicine, Univ. of Colorado HSC, Aurora, CO; 2 Gastroenterology and Hepatology, Univ. of Colorado HSC, Aurora, CO. The natural history of IBD following liver transplant (LT) for PSC is unknown. Prior studies have not shown factors consistently associated with disease activity, but were limited by small sample sizes. The aim of this study is to describe our experience with IBD post-LT in patients with PSC and to determine factors predictive of disease activity. A survey was mailed to 115 liver recipients transplanted for PSC between 1988 and 2006, asking about medications, IBD activity, and quality of life after LT. Responses were linked to our LT database. Results: 88 (77%) recipients responded. 76% were male, with a median of 82 (range 7-215) months since transplant. 93% were Caucasian, with a median transplant age of 46 (range 16-71) years. 16% developed recurrent PSC post-LT. 74% had a diagnosis of IBD (63% UC). Immunosuppression included tacrolimus in 84% and cyclosporine in 11%. 71% experienced at least one episode of acute rejection. 73% rated their quality of health 4-5 on a scale of 1-5. There was no significant difference in demographic variables between the IBD and non-IBD cohorts. 66% of respondents had IBD pre-LT. Of the 30 patients without pre-existing IBD, 7 developed IBD post-LT. Of the de novo IBD cohort, 43% were male, the median transplant age was 45 years (range 39-57), and 86% were Caucasian. IBD developed more than 3 years post-LT in 71%. There was no statistical difference between the de novo IBD cohort and those with pre-existing IBD except for IBD type (57% UC vs. 88%, p=0.008). Significantly more patients were not on any IBD medications post-LT compared to pre-LT (75% vs. 91%, p=0.02). Fewer post-LT patients were on aminosalicylates (63% vs. 79%, p=0.03) and prednisone (14% vs. 31%, p=0.03) compared to pre-LT. Post-LT recipients developed fewer IBD flares requiring hospitalization (27% vs. 45%, p=0.004). 38% reported improvement in IBD activity post-LT; 23% and 33% reported no change and worsening activity, respectively. There was no significant difference between reported disease activity and immunosuppression, CMV status, rejection rate, recurrent PSC, transplant age, gender, or race.
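Several abstracts above compare graft or patient survival between groups via Kaplan-Meier curves (e.g., CD vs. non-CD grafts in the PSC abstract). A minimal sketch of that comparison, assuming the lifelines library is available; the follow-up times and event indicators below are synthetic stand-ins, not the study's data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

# Synthetic follow-up (months) and graft-loss indicators for two groups,
# standing in for the CD vs. no-CD comparison.
t_cd, e_cd = rng.exponential(90, 38), rng.random(38) < 0.6
t_no, e_no = rng.exponential(150, 103), rng.random(103) < 0.4

kmf = KaplanMeierFitter()
kmf.fit(t_cd, event_observed=e_cd, label="CD")
print(kmf.survival_function_.tail())  # estimated survival curve for the CD group

result = logrank_test(t_cd, t_no, event_observed_A=e_cd, event_observed_B=e_no)
print(f"log-rank p = {result.p_value:.3f}")
```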
Background: Symptomatic CMV continues to be a significant problem. The costs associated with its development remain high, without an optimal method for prevention. The aim of this study was to measure the pharmacoeconomic impact of implementing an abbreviated pre-emptive monitoring strategy versus valganciclovir (VGC) prophylaxis in a large teaching hospital. Methods: Costs for this analysis were based on a societal perspective, including drug, personnel, hospital, outpatient infusion, and monitoring costs related to the resources needed to perform pre-emptive monitoring and treatment of CMV-related events. Hourly personnel costs were $45 for a nurse/data coordinator, $200 for a physician, and $52 for a PharmD. The CMV PCR cost was $203 per assay. VGC cost per mg ($0.09, based on AWP) was used to calculate the costs of prophylaxis, treatment, and 30 days of consolidation as needed. The cost of treating CMV syndrome, based on PICC line placement plus drug, personnel, and supply costs for 21 days of therapy, was estimated at $11,885.50. Inpatient admission cost was $3,727 per day (a cost-model sketch using these figures follows below). Results: A total of 119 patients were included in this analysis. Baseline and transplant demographics were well matched. Table 1 displays the total direct and indirect costs accumulated by each group. CMV syndrome occurred in three patients in each group. There was no CMV disease in either group. 27% of patients in the pre-emptive group had DNAemia, but only 4 patients required oral antiviral therapy. Provider time and laboratory monitoring costs were significantly higher in the pre-emptive group, while direct medication cost was significantly higher in the prophylaxis group. Conclusions: The frequency of disease severity and outcomes were equal in each group. Although overall costs were comparable between strategies, allocation of resources to provide pre-emptive monitoring places the burden of disease prevention on the health care system rather than the patient, necessitating further abbreviation of these strategies.
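The unit costs in the abstract above fully specify a simple cost model for each strategy. The sketch below uses the abstract's stated unit costs; the utilization figures (PCR counts, staff hours, total VGC milligrams) are hypothetical placeholders for illustration, not the study's data:

```python
# Unit costs taken from the abstract; utilization figures below are
# hypothetical placeholders, not the study's data.
COST_PER_HOUR = {"nurse": 45, "physician": 200, "pharmd": 52}
CMV_PCR = 203            # per assay
VGC_PER_MG = 0.09        # AWP
CMV_SYNDROME = 11885.50  # 21 days of therapy, PICC, drug, personnel, supplies
INPATIENT_DAY = 3727     # per inpatient day, if admission is required

def preemptive_cost(n_pcr, nurse_hrs, physician_hrs, pharmd_hrs, n_syndrome):
    monitoring = (n_pcr * CMV_PCR
                  + nurse_hrs * COST_PER_HOUR["nurse"]
                  + physician_hrs * COST_PER_HOUR["physician"]
                  + pharmd_hrs * COST_PER_HOUR["pharmd"])
    return monitoring + n_syndrome * CMV_SYNDROME

def prophylaxis_cost(mg_vgc_total, n_syndrome):
    return mg_vgc_total * VGC_PER_MG + n_syndrome * CMV_SYNDROME

# Hypothetical per-cohort utilization:
print(preemptive_cost(n_pcr=600, nurse_hrs=120, physician_hrs=20,
                      pharmd_hrs=40, n_syndrome=3))
print(prophylaxis_cost(mg_vgc_total=900 * 90 * 60, n_syndrome=3))  # 900 mg/d, 90 d, 60 pts
```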
We propose a non-simultaneous form of kidney paired donation that starts with a living non-directed donor (LND). A paired donation matching algorithm was developed to allow LNDs to start potentially Never-Ending Altruistic Donor (NEAD) chains in addition to closed loops of 2-, 3-, and 4-way exchanges (a toy sketch of the chain-extension idea follows below). RESULTS: In July 2007, an LND from Michigan traveled 1500 miles to donate a kidney to a woman in Arizona whom he had never met. The following week, the Arizona recipient's husband donated one of his kidneys to a 32-year-old woman in Ohio. The following month, the mother of this recipient traveled to a city 3 hours away to donate her kidney to a patient whose incompatible donor simultaneously gave a kidney to the fourth patient in the chain. The incompatible donor for this fourth recipient is now slated to give her kidney to a patient in Maryland. Over the past 9 months, 60 transplant programs have partnered to perform 10 paired donation transplants. Demonstrating the advantage of NEAD chains over classic paired donation, 8 of the 10 transplants resulted from altruistic donor chains. CONCLUSION: In order to fully realize the potential of the above approach, one must be willing to supplant two prevalent ideas: 1) that kidneys from altruistic donors should be given to the top candidate on the deceased donor waiting list, and 2) that paired exchanges must be done simultaneously. While there are certain pitfalls to using chains of donors as opposed to traditional "swaps" (i.e., the possibility that a donor could renege, or the accumulation of type AB donors who are not likely to be able to begin another chain), this proposed paradigm shift could result in a very significant increase in both the number and quality of paired donation kidney transplants.
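The chain-extension idea in the NEAD abstract above can be illustrated with a toy sketch: an altruistic donor's kidney goes to a recipient whose incompatible donor then becomes the "bridge" donor for the next pair. The sketch below is not the authors' algorithm; it uses plain ABO matching and a greedy extension rule purely for illustration, whereas the real algorithm also handles HLA sensitization and closed 2-, 3-, and 4-way exchange loops:

```python
# Toy sketch of a Never-Ending Altruistic Donor (NEAD) chain. Each incompatible
# pair is (donor_blood_type, recipient_blood_type); a chain starts with a
# non-directed donor and greedily extends to any pair whose recipient is
# compatible with the current donor.

def abo_compatible(donor_type: str, recipient_type: str) -> bool:
    compatible = {"O": {"O", "A", "B", "AB"}, "A": {"A", "AB"},
                  "B": {"B", "AB"}, "AB": {"AB"}}
    return recipient_type in compatible[donor_type]

def build_chain(altruistic_donor: str, pairs: list) -> list:
    """Return indices of pairs transplanted, in chain order."""
    chain, current_donor, remaining = [], altruistic_donor, set(range(len(pairs)))
    while True:
        match = next((i for i in remaining
                      if abo_compatible(current_donor, pairs[i][1])), None)
        if match is None:
            return chain
        chain.append(match)
        current_donor = pairs[match][0]   # bridge donor continues the chain
        remaining.discard(match)

pairs = [("A", "B"), ("B", "O"), ("AB", "A")]  # hypothetical incompatible pairs
print(build_chain("O", pairs))  # e.g. [0, 2]; stalls once an AB bridge donor remains
```

The stalled AB bridge donor in the example mirrors the "accumulation of type AB donors" pitfall the abstract names.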
Purpose: Medical literature and national best practices correlate families' understanding of brain death with organ donation rates. Analysis of hospital data demonstrated variability in family communication. Although surgical residents frequently interact with family members of potential donors and play a critical role in the donation process, they receive no formal communication skills training (CST). We sought to determine whether pre-training residents improved performance in CST involving explanation and notification of brain death to family members, and whether it aided efforts to achieve the national donation rate goal of 75%. Methods: In collaboration with a regional organ procurement organization (OPO), an educational model for end-of-life CST was developed. Seventeen surgical residents were divided into 2 groups. The first group (n=9) attended a 3-hour didactic session at the OPO that included role-playing exercises explaining brain death to families. OPO staff, trained to serve as family role-players and skills-station coaches, debriefed residents after each simulation. The second group (n=8) received no specialized training. Six weeks later, both resident groups participated in formal videotaped family communication simulations. Independent observers (hospital faculty and senior OPO staff), blinded to training status, evaluated residents' communication skills using a 12-parameter assessment tool. All residents reviewed their videotaped performances and evaluations, then repeated the simulations after six months. Results: During the study period, the pre-trained resident group's assessment scores increased by 25% (p=0.0001), while the untrained group's increased by 39% (p<0.0001). While evidence of improvement existed in both groups, pre-trained residents consistently scored higher than untrained residents (80% vs. 71%, p=0.037). During this same time period, donation rates in the Surgical Intensive Care Unit (SICU) increased by 21%. Conclusion: Our educational model provided effective training in communication skills for end-of-life discussions. Although a direct relationship cannot be established, donation rates in the SICU increased after resident training. Incorporation of this training program into resident education has the potential to improve donation rates and increase the number of organs available for transplantation.

Saverio Mirarchi, 1 Graeme N. Forrest, 1 Benjamin Philosophe. 2 1 Medicine, University of Maryland, Baltimore, MD; 2 Surgery, University of Maryland, Baltimore, MD. Objective: To improve the efficiency of care for solid organ transplant patients by having internists work in conjunction with transplant surgeons to manage transplant patients admitted to the hospital more than 30 days post organ transplant. This includes patients who have undergone kidney, pancreas, and liver transplants; our center transplants over 300 of these organs per year. In 2002, the organ transplant program was divided into a surgical transplant service (STS) and a medical transplant hospitalist service (MTHS). The MTHS consists of four full-time physicians, one part-time physician, and one nurse practitioner, with daily rounds from a transplant surgeon on a weekly rotating schedule. Methods: We analyzed our data from the five years since the inception of the MTHS at a large university medical center. The lengths of stay (LOS) for the MTHS and STS were compared to data from the previous combined program as well as data from the University Health Consortium (UHC). In addition, we examined the cost of care to determine whether there were any savings related to reduced LOS, adjusted for the combined salary of the MTHS. Results: In the review period, the total admissions for the MTHS and STS together averaged 1450 admissions/year. Over the past five years, the MTHS showed a major decrease in the LOS index, defined as the ratio between the observed LOS and the expected LOS based on UHC data. This index dropped from 1.27 to 0.91 for patients on the MTHS. There was also a parallel drop in the STS LOS index, from 1.53 to 1.05. This was associated with significant cost savings for the hospital: on average, the program has realized savings of approximately $1,000,000 per year. After accounting for the combined salary of the MTHS group, which is $625,000/year, the total savings over the 5-year period amount to $1.9 million (the arithmetic is sketched below). Conclusion: The creation of an MTHS working within an organ transplant program at a large university center has demonstrated a major decrease in LOS for transplant patients, which has resulted in decreased costs and improved efficiency. It has also allowed more focused care for a complex medical population. We believe that our program can serve as a model for similar programs at other busy transplant sites.

With increasing demand for kidney transplants (Tx), more patients are opting to travel outside the US to obtain transplantation. We describe the characteristics and outcomes of 32 kidney Tx recipients followed at our center who traveled abroad for a kidney Tx between 1995 and 2007. Methods: Data were obtained via chart review. We compared demographics to those of all Tx recipients at our center during the same period, and compared post-Tx outcomes to a cohort of patients transplanted at our center matched for age, race, Tx year, dialysis time, prior Tx, and donor type. Median follow-up time was 487 days (range 31-3056). Results: Demographics are outlined in the Table. Patients transplanted abroad were more likely to be Asian and had shorter dialysis times. Most patients were transplanted in China (44%), followed by Iran (16%), the Philippines (13%), and India (9%). Living unrelated Tx were most common. All patients were discharged on a calcineurin inhibitor, prednisone, and either MMF (90%), AZA (7%), or rapamycin (3%). Seven patients received induction. Only 4 patients received CMV prophylaxis. The median duration of hospitalization was 15 days. The median time post-Tx to the initial visit at our center was 35 days. Four patients required urgent admission to hospital, 3 of whom lost their grafts. 17 patients (…). Graft survival at 1 year was 96% for both "tourists" and matched recipients at our center. Median time to graft loss was 500 (range 31-2832) days. Mean SCr and rejection rates 1 year post-Tx were not significantly different between recipients transplanted abroad and at our center. Conclusion: Compared to patients transplanted locally, patients transplanted abroad had similar graft survival and renal function post-Tx, but a high incidence of infectious complications.
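The savings figure in the hospitalist-service abstract above follows directly from the numbers it reports: roughly $1,000,000/year in gross savings against $625,000/year in combined MTHS salaries, over five years. A minimal sketch of that arithmetic, plus the LOS index it uses (observed LOS divided by UHC-expected LOS); the LOS example values are hypothetical:

```python
# Reproducing the net-savings arithmetic reported in the hospitalist abstract.
gross_savings_per_year = 1_000_000   # average hospital savings from reduced LOS
mths_salaries_per_year = 625_000     # combined MTHS physician/NP salaries
years = 5

net_savings = years * (gross_savings_per_year - mths_salaries_per_year)
print(f"Net 5-year savings: ${net_savings:,}")  # $1,875,000, i.e. ~$1.9M as reported

# The LOS index used in the abstract: observed LOS / expected LOS (UHC benchmark).
def los_index(observed_days: float, expected_days: float) -> float:
    return observed_days / expected_days

print(round(los_index(9.1, 10.0), 2))  # hypothetical example -> 0.91
```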
[Background] The primary benefits anticipated following successful induction of allograft tolerance are avoidance of the complications of long-term immunosuppression and prevention of chronic rejection. We have previously reported successful induction of renal allograft tolerance in recipients of HLA-mismatched combined kidney and bone marrow transplantation (CKBMT). No evidence of chronic rejection has been observed in these recipients after immunosuppression-free periods of 1 to 4 years. In the current study, we evaluated the longer-term economic impact of this approach. [Method] The conditioning regimen for CKBMT included cyclophosphamide, thymic irradiation, anti-CD2 mAb, and a calcineurin inhibitor, which was discontinued after 9-12 months. Eighteen stable renal transplant recipients receiving ongoing triple or double drug immunosuppressive therapy, with comparable follow-up times, were compared with the four tolerant recipients. Tolerant patients and stable patients on maintenance immunosuppression were also comparable with respect to age, original disease, and donor-recipient histocompatibility. [Results] The perioperative charges for CKBMT were approximately $90,000 higher than those for conventional living donor kidney transplantation. After 9-12 months, continuing medications in CKBMT recipients included only occasional over-the-counter analgesics. In contrast, the stable conventionally treated recipients were taking an average of 8 pills daily. These included treatments required for de novo diabetes (11%), hypertension (67%), hyperlipidemia (22%), and gastrointestinal symptoms/prophylaxis (78%), in addition to their maintenance immunosuppression. These needs resulted in annual maintenance charges of over $15,000 per allograft recipient, not including unmeasured costs related to issues such as quality of life or absence from employment. [Conclusion] Even with this admittedly expensive approach to tolerance induction, the overall medical costs of conventional kidney transplantation, with ongoing immunosuppression and treatment for complications, will exceed the cost of tolerance after approximately 5 years (the break-even arithmetic is sketched below).
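The break-even claim in the CKBMT abstract above can be checked against its two quoted figures. On the $15,000/year maintenance charge alone, the $90,000 perioperative premium is recovered in 6 years; the abstract's "approximately 5 years" presumably also counts complication-related costs, which are not itemized. A minimal sketch of the arithmetic:

```python
# Break-even arithmetic for tolerance induction vs. conventional maintenance,
# using the charges quoted in the abstract. The $15,000/yr figure is a floor;
# complication-related costs (not itemized) shorten the break-even time.
extra_periop_charges = 90_000   # CKBMT premium over conventional living-donor Tx
maintenance_per_year = 15_000   # annual charges per conventionally treated recipient

break_even_years = extra_periop_charges / maintenance_per_year
print(break_even_years)  # 6.0 years on maintenance charges alone; ~5 years once
                         # complication costs are included, per the abstract
```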
Increasing African American Donation Rates in a Midwest Metropolitan Community. Susan Gunderson, 1 Susan Mau Larson, 1 David M. Radosevich, 2 Clarence Jones, 3 Bill Tendle, 3 Tiffany Scott. 1 1 LifeSource, St. Paul, MN; 2 Transplant Information Services, University of Minnesota, Minneapolis, MN; 3 Southside Community Health Services, Minneapolis, MN. Purpose: African Americans are underrepresented in deceased organ donation in this community of more than 2 million. Historically, minimal educational outreach had occurred, and this study was designed to understand the community's disposition toward donation and to increase support for donation. Methods: Television, newspaper, and radio advertising aired over a 24-month period beginning in 2004, using the nationally produced Donate Life-African American campaign as the primary intervention. Other components included related faith-based and community outreach. The OPO partnered with a minority-focused community health clinic to survey African Americans pre- and post-intervention. A mailed, self-administered survey was sent to a sample drawn from organizational lists with a high likelihood of including African American households. For the evaluation, the sample was separated into 1) community members and 2) church members exposed to faith-based campaigns. Results: African Americans in this community have a sophisticated understanding of the importance of donation (an average of 12 of 15 knowledge questions answered correctly in the pre-intervention survey). In both the community and church groups, media exposure and donation knowledge increased after the media campaign (p<0.001 and p=0.004, respectively). Similarly, driver's license donor designation rates for the entire population increased (33.0% versus 40.1%, p=0.119). Among all African Americans, however, the propensity to donate was unchanged following the campaign. The propensity to donate increased (p=0.034) in the community sample, whereas there was a reduced propensity to donate (p=0.092) in the church sample. During the study period, the OPO also experienced significant increases in donation authorization rates among African Americans, from 37% to 75%. Summary: A media-based campaign combined with grassroots outreach is an effective tool to increase knowledge and awareness. The partnership between the OPO and community and faith-based leadership was a significant positive byproduct of the project and should be included in future outreach efforts.

Background: It has been hypothesized that the clinical benefits of anti-thymocyte globulin (ATG) induction therapy do not result entirely from immunodepletion, but may also result from the induction of immunoregulatory T cells (Treg). In this prospective, controlled study we investigated the effect of ATG induction therapy on the frequency and phenotype of peripheral CD4+FoxP3+CD127-/low T cells in kidney transplant patients. Methods: After transplantation, 16 patients received ATG induction therapy (Thymoglobulin®) and triple therapy consisting of tacrolimus, MMF, and steroids. The control group (n=18) received triple therapy only. By flow cytometry, T cells were analyzed for markers associated with immune regulation: CD25, FoxP3, and CD127. Within the FoxP3+ T-cell population, the CD45RO (memory) and CCR7 (homing receptor) markers were characterized. Results: Pre-transplant levels of CD4+FoxP3+CD127-/low T cells in all patients were 3-17% (median 6%) of CD4+ T cells. One week post ATG induction therapy, no measurable numbers of Treg were present. At 12 weeks post ATG induction, a higher proportion of the first detectable T cells expressed FoxP3 compared to the control group (median 7% vs. 4%, p=0.03), returning to pre-transplant levels at 26 weeks. This increased proportion of CD4+FoxP3+CD127-/low T cells resulted in a significantly higher FoxP3+/FoxP3neg ratio than in the control group at 12 weeks (median 0.074 vs. 0.046, p=0.02). At 26 weeks, we found a decline in the proportion of naive FoxP3+ Treg (CD45ROnegCCR7+; pre-transplant vs. post-transplant, median 21% vs. 9%, p=0.02), which was associated with a rise in the proportion of memory FoxP3+ Treg (CD45RO+; pre-transplant vs. 26 weeks post-transplant, median 65% vs. 82%, p=0.02). Moreover, the proportion of memory T cells exceeded that in the control group at 26 weeks (median 82% vs. 67%, p=0.05), mainly due to an increase in the proportion of effector memory FoxP3+ Treg (CD45RO+CCR7neg). Conclusion: After ATG-induced immune cell depletion, a shift towards CD4+FoxP3+CD127-/low peripheral regulatory T cells with a memory phenotype was measured in kidney transplant patients.
This finding suggests that ATG treatment triggers the generation of de novo peripheral regulatory T cells by homeostatic proliferation.

Introduction: In a prospective study, we investigated whether donor-specific regulatory CD4+CD25bright+FoxP3+ T cells develop in kidney transplant patients. Methods: We analyzed the percentage and function of peripheral regulatory CD4+CD25bright+FoxP3+ T cells of 51 patients before and 3, 6, and 12 months after kidney transplantation. The immunoregulatory capacities of CD4+CD25bright+FoxP3+ T cells were assessed by their depletion from PBMC and their reconstitution to CD25neg/dim responder T cells at a 1:10 ratio in the MLR (a sketch of the suppression calculation follows below). In the first year after transplantation, peripheral CD4+CD25bright+FoxP3+ T cells decreased from 8.7% ± 3.0 pre-transplantation to 6.1% ± 1.8 at 12 months (p<0.001). While MLR reactivity to 3rd-party antigens (3rdP) significantly improved (p<0.05), reactivity against donor antigens remained low. Functional analysis demonstrated potent donor-specific regulatory activity by CD4+CD25bright+FoxP3+ T cells after transplantation. Depletion of CD4+CD25bright+FoxP3+ T cells from PBMC resulted in increased proliferation upon stimulation by donor antigens (p<0.01). Upon reconstitution, the capacity of CD4+CD25bright+FoxP3+ T cells to control the proliferation of anti-donor reactive CD25neg/dim T cells increased over time, from 58% (median) pre-transplant to 85% at post-transplant month 12 (p<0.01). Moreover, the anti-donor regulatory activities of the CD4+CD25bright+FoxP3+ T cells were significantly more vigorous than those controlling 3rdP-stimulated responder T cells (65%, p<0.05). The generation of potent donor-specific regulatory CD4+CD25bright+FoxP3+ T cells in the periphery of kidney transplant patients prevents the development of adequate alloreactivity.

Conclusions: These results indicated that anti-donor responses could be detected even in a significant proportion of "stable" long-term HLA-identical kidney transplant recipients. Speculatively, this may be the cause of late graft failures in this group of patients.
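The percent-suppression values in the Treg abstract above come from comparing responder proliferation in the MLR with and without reconstituted Treg. A minimal sketch of that standard calculation; the thymidine counts below are hypothetical placeholders chosen to reproduce the 85% and 65% figures:

```python
def percent_suppression(cpm_with_treg: float, cpm_without_treg: float) -> float:
    """Percent suppression of responder proliferation in a reconstituted MLR,
    relative to CD25neg/dim responders stimulated without added Treg."""
    return 100.0 * (1.0 - cpm_with_treg / cpm_without_treg)

# Hypothetical 3H-thymidine counts (cpm) at the 1:10 Treg:responder ratio:
print(percent_suppression(6_000, 40_000))   # 85.0 -> month-12 anti-donor level
print(percent_suppression(14_000, 40_000))  # 65.0 -> 3rd-party level
```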
Klotho is a gene almost exclusively expressed in renal distal tubules, and loss of expression is associated with accelerated aging. In various models of acute and chronic renal injury, and with normal aging, renal klotho expression has been found to decrease. Increased donor age is associated with poorer long-term allograft function. We hypothesized that klotho mRNA expression in renal implant biopsies would correlate with donor kidney quality, as determined by donor chronologic age and peri-transplant renal injury. Our recent unsupervised microarray analysis of 87 renal implant biopsies revealed a continuum of organ quality across all samples, ranging from the best living donor (LD) kidneys at one end to the worst-performing deceased donor (DD) kidneys at the other (Am J Transplant 2007, Nov 16, epub). Three predominant groups were identified: LD kidneys, DD1 kidneys with a low risk (9.5%) of delayed graft function (DGF), and DD2 kidneys with a high risk (35%) of DGF (p<0.05). Analysis of klotho gene expression among these groups also revealed a spectrum of expression, from the highest levels in LD kidneys to the lowest in DD2 kidneys, especially those that developed DGF. Differences were highly significant between LD and DD kidneys (p<0.001), although donor age was not different between these groups. Klotho transcript levels were significantly different between kidneys that developed DGF and those with immediate graft function (IGF) (p<0.05). In conclusion, reduced klotho gene expression is associated with grafts at risk of poorer function and appears to be independent of donor age. Decreases in klotho expression likely reflect other factors impacting the renal tissue, which may affect the potential for repair and, ultimately, allograft function.

Renal allograft rejection episodes produce a stereotypical response in the allograft, characterized by infiltration by cytotoxic T lymphocytes (CTL), a potent Ifng response by donor and recipient cells, and decreased transcripts associated with the epithelium. We previously identified pathogenesis-based transcript sets (PBTs) that reflect the disturbance in rejecting allografts (QCATs: CTL; GRITs: Ifng response; KTs: decreased function; AJT 7:2712, 2007). We hypothesized that anti-rejection treatment would reverse the transcriptome changes of rejection more than the histopathologic lesions. Using microarray analysis, we measured expression of these PBTs in 18 antibody-mediated rejection (ABMR) biopsies (11 untreated, 7 treated) and 31 T-cell-mediated rejection (TCMR) biopsies (25 untreated, 6 treated), normalized to nephrectomy samples. Histopathology scores of the primary rejection lesions were analyzed: interstitial inflammation (i), tubulitis (t), intimal arteritis (v), and glomerulitis (g) (Fig. 1A). Of these, only i and t differentiated between ABMR and treated ABMR (p<0.05), yet none differentiated between TCMR and treated TCMR. PBTs, on the other hand, differentiated treated from untreated cases for both ABMR (p<0.04) and TCMR (p<0.005), with all 3 PBTs (Fig. 1B/C). The changes in transcript expression in treated cases were dramatic. Particularly impressive was the consistent correction of the disturbances in PBT expression despite the heterogeneous treatment following ABMR episodes. Unlike ABMR treatment, TCMR treatment always included steroids, which may contribute to the more pronounced changes compared to ABMR. The time of treatment before the biopsy, which also varied within the groups, did not affect the changes in PBT expression, since values in treated cases approached those of nephrectomy samples. Thus, histologic rejection lesions persist following anti-rejection treatment, whereas transcript disturbances of rejection are greatly reduced compared to untreated cases. This study suggests that the effect of anti-rejection treatment is more sensitively monitored by transcript expression than by histopathology.

Deciphering the mechanisms of tolerance and chronic immune-mediated rejection remains a major goal in transplantation. Data in rodents suggest that Toll-like receptors (TLR), regulators of innate immune responses, play a role in determining graft outcome. However, few studies have focused on TLR in human kidney transplant recipients. We addressed this issue by analyzing the peripheral blood (n=75) and graft biopsies (n=26) of renal transplant patients and healthy volunteers. We analyzed, for the first time, the expression of TLR4 in PBMC from kidney recipients in contrasting situations, operational tolerance and chronic immune-mediated rejection (Banff 2005), compared to patients with normal histology and stable graft function, non-transplant patients with renal failure, and healthy volunteers.
We found that MyD88 and TLR4 expression differed significantly in the PBMC, and in particular in the monocytes, of patients with chronic immune-mediated rejection vs. operational tolerance. Chronic rejection patients had significantly increased TLR4 and MyD88 compared to operationally tolerant patients, who resembled healthy volunteers and non-transplant patients with renal failure. Interestingly, analysis of TLR4 transcripts in graft biopsies from patients with normal histology or chronic immune-mediated rejection reflected the blood findings, with a significant increase of TLR4 in chronic immune-mediated rejection. Thus, we provide data to support a link between TLR4 expression and long-term graft outcome. The role of TLR and their endogenous ligands in mediating allograft rejection or acceptance therefore warrants further investigation and could give rise to new strategies for therapeutic intervention. Our results suggest that peripheral blood TLR4 shows potential as a biomarker of chronic immune-mediated rejection, and that measuring blood TLR4 levels may help to identify patients experiencing chronic rejection who require a biopsy. Moreover, our data suggest that absence of TLR signaling may be a feature of operational tolerance to kidney grafts.

Identification of immunological tolerance is an important prerequisite for establishing an individually tailored approach to the post-transplant management of allograft recipients. It will also provide new insight into the mechanisms underlying the balance between tolerance and rejection. Here we present data from a multi-centre study aimed at identifying tolerance to renal allografts. We collected samples from five selected groups of renal transplant recipients: drug-free tolerant patients who were functionally stable despite remaining immunosuppression-free for more than one year; functionally stable patients on minimal immunosuppression (<10 mg/day prednisone); stable patients maintained on calcineurin inhibitors (CNI); stable patients maintained on a CNI-free immunosuppression regimen; and patients showing signs of chronic rejection. A group of age- and sex-matched healthy volunteers was also included as a control. Several biomarkers and bioassays were combined to provide an immunological 'fingerprint' of the tolerant state. Immunophenotyping showed a selective expansion of peripheral blood B and NK lymphocytes in drug-free tolerant patients. This group of patients was also characterized by the absence of anti-donor specific antibodies. Differential expression of several immune-relevant genes and a high ratio of FoxP3/α-1,2-mannosidase expression were observed in these patients. TCR landscape analysis highlighted differences between the Vβ repertoires of drug-free tolerant recipients and chronic rejection patients. Additionally, direct-pathway donor-specific hyporesponsiveness by IFNγ ELISpot and a lack of indirect-pathway anti-donor responses assessed by trans-vivo DTH were detected in drug-free patients. The diagnostic capabilities of the combined results of several of the above-mentioned biomarkers and bioassays are as follows: specificity of 0.964, sensitivity of 0.933, and a positive predictive value of 82.4% (a sketch relating these quantities follows below). These biomarkers could be used to inform drug-weaning protocols for kidney transplant recipients.
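The three diagnostic figures quoted in the biomarker abstract above are linked through Bayes' rule: for a given sensitivity and specificity, the positive predictive value depends on the prevalence of the tolerant state in the tested cohort. A minimal sketch; the 15% prevalence below is an assumed value chosen because it roughly reproduces the reported PPV, not a figure from the study:

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    return tp / (tp + fp)

# With the reported sensitivity (0.933) and specificity (0.964), a PPV near
# 82% corresponds to a tolerant-state prevalence of roughly 15% (assumed):
print(round(ppv(0.933, 0.964, 0.15), 3))  # ~0.82
```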
Bile Acid Aspiration Stimulates Lung Allograft Immunity. Bile acids detected in bronchoalveolar lavage (BAL) as a marker of aspiration have been associated, in a dose-dependent fashion, with earlier development of bronchiolitis obliterans syndrome. We sought to study the relationship between bile acids and active immune molecules detected in the BAL. Methods: BAL collected prospectively from lung transplant recipients at routine surveillance bronchoscopies was assayed for bile acids. Samples were then assayed by Luminex for cytokines, and by ELISA for pulmonary collectins (SP-A, SP-D). Results were analyzed according to bile acid levels, using the ROC-derived cutoff for accuracy of bronchiolitis obliterans syndrome diagnosis (high levels ≥3.5 µmol/L). Results: We prospectively examined 120 lung transplant recipients, and a total of 271 BAL samples were collected. In 114 samples no bile acids were detected, low levels were present in 110, and high levels were detected in 47. Samples with high bile acids had significantly greater innate (TNF-α, IL-1β, IL-12, IL-10, IL-6) and adaptive (IFN-γ, IL-2) cytokines, as well as greater chemokines (MCP-1, IL-8), compared to the other samples. In contrast, the pulmonary collectins SP-A and SP-D were significantly reduced in samples with high bile acids. The figure shows the median and interquartile range for each molecule according to bile acid level. Conclusion: Bile acids detected in the BAL as markers of aspiration stimulate lung allograft immunity in a dose-dependent fashion. In particular, high levels of bile acids are associated with an impaired lung-specific innate defense system provided by the pulmonary collectins, and with a bronchoalveolar "cytokine storm".

Tolerance/Immune Deviation III

The results of using DN T cells converted ex vivo from CD4+ T cells in NOD mouse models support the concept and feasibility of potentially utilizing this novel cell-based therapeutic approach clinically for the treatment of autoimmune type 1 diabetes.

Horng-Ren Yang, 1 Gouping Jiang, 1 John J. Fung, 1 Shiguang Qian, 1 Lina Lu. 1 1 Immunology and General Surgery, Cleveland Clinic, Cleveland, OH. Liver transplant tolerance is evidenced by the spontaneous acceptance of liver allografts in many species. The underlying mechanism remains unclear. Interestingly, although liver allografts are accepted, hepatocyte transplants in the same strain combination are promptly rejected, indicating a crucial role for liver tissue cells in immune suppression. We have demonstrated profound T-cell inhibitory activity of hepatic stellate cells (HpSC), which are known to participate in repair and fibrosis during liver injury. Addition of activated (a)HpSC, but not quiescent HpSC, significantly inhibited allo-DC-induced T-cell proliferative responses (MLR) in a dose-dependent manner, which was associated with enhanced T-cell apoptosis (TUNEL). Neutralization of B7-H1 by anti-B7-H1 mAb significantly reduced the HpSC-induced T-cell apoptosis and reversed the inhibition of T-cell proliferation, suggesting a key role for B7-H1. To evaluate this in vivo, BALB/c islets (300) were co-transplanted with 3 x 10^5 activated HpSC (B6) into STZ-induced diabetic B6 recipients. Co-transplantation with HpSC effectively protected islet allografts from rejection. This was associated with a reduction in graft-infiltrating T cells and enhanced apoptotic activity. Co-transplanted HpSC from B7-H1-/- livers markedly lost their islet graft-protective capacity, associated with less apoptosis of infiltrating cells.
To determine the subsets of apoptotic T cells, T cells were isolated from spleen, draining lymph nodes, and grafts for phenotypic and functional analyses. On POD 8, a marked reduction of graft-infiltrating CD4+ (30.7%) and CD8+ T cells (31.3%) was seen in the HpSC co-transplant group compared to islets alone, which progressed further thereafter. CD8+ T cells dropped ∼7-fold by POD 140. The CD4+/CD8+ ratio increased from 0.5 in the early postoperative period to 2.0 in long-term surviving grafts. Adoptive transfer of CFSE-labeled DES transgenic T cells was used to track the response and fate of antigen-specific CD8+ T cells. The results showed active division of DES+ cells within the allograft in both the islet-only and HpSC co-transplantation groups. However, accumulation of these DES+ cells was significantly lower in the HpSC co-transplantation group than in the islet-only group (1.9×10^3 vs. 10×10^3 cells per graft). These findings suggest that HpSC induce antigen-specific CD8+ T-cell death, and may not inhibit their activation.

Donor-specific memory T cells are potent mediators of allograft rejection due to their ability to proliferate and give rise to cytotoxic and inflammatory cytokine-secreting effectors within hours of stimulation. Furthermore, memory T cells have been shown to be relatively resistant to the effects of many tolerance-induction protocols, including blockade of the CD28 and CD154 pathways. While seminal studies have shown that donor-reactive memory cells, generated through pre-sensitization with donor tissue or infection with pathogens bearing cross-reactive epitopes, contribute to costimulation blockade-resistant rejection of fully MHC-disparate allografts, the role of memory T cells specific for minor antigens in costimulation blockade-resistant rejection has not been well studied. We addressed the ability of memory T cells specific for a single donor-derived class I epitope to mediate this process. TCR-transgenic T cells (OT-I) specific for SIINFEKL/Kb were adoptively transferred into naive B6 recipients, which were then infected with ovalbumin (SIINFEKL)-expressing Listeria monocytogenes (LM-OVA). At the memory stage, mice received a skin graft expressing ovalbumin (mOVA), and therefore the SIINFEKL epitope. Results showed that the transferred OT-I T cells proliferated in response to LM-OVA, but not in response to a control infection with wild-type Listeria (LM). At day 9, OT-I T cells comprised ∼15% of the total CD8+ T-cell compartment; at memory, the SIINFEKL-specific cells comprised ∼1-2%. Engraftment of mOVA skin onto LM-OVA memory recipients resulted in rejection with accelerated kinetics relative to LM-infected controls (MST 13d vs. 20d). Following treatment with CTLA-4 Ig and anti-CD154, 9/10 LM-OVA-infected recipients experienced costimulation blockade-resistant rejection, while LM-infected controls went on to long-term graft survival (p<0.001). LM-OVA-infected recipients also resisted engraftment of mOVA-expressing donor bone marrow following a tolerance induction protocol containing busulfan, CTLA-4 Ig, and anti-CD154. These results suggest that memory T cells specific for a single surrogate minor antigen are sufficient to induce costimulation blockade-resistant rejection, and support the feasibility of using this model to study the specific requirements for donor-reactive memory T-cell activation and tolerance induction during transplantation.
During an immune response, CD4 helper T cells can be instructed by non-antigen-specific signals to differentiate into functionally distinct subsets with mutually exclusive patterns of cytokine production. We find that Ikaros, a zinc finger transcription factor required for lymphocyte development, is crucial for the development of polarized T helper subsets. In the absence of Ikaros DNA binding activity, CD4 T cells induced to undergo Th2 differentiation in vitro or in vivo produce high levels of IL-4 but fail to silence expression of IFN-gamma and IL-17, cytokines that contribute to inflammatory disease processes such as autoimmunity and organ transplant rejection. Similarly, Ikaros is required for repression of IL-4 and IL-17 expression by Th1 cells, and for inhibition of IFN-gamma and IL-2 production by Th17 cells. Our results show that Ikaros controls the expression of transcription factors such as GATA-3, c-Maf, Stat-6, and Runx3, and that in polarized Th2 cells, Ikaros inhibits IFN-gamma gene expression through direct repression of the T-bet locus. These studies establish Ikaros, a DNA binding protein previously recognized only as a regulator of lymphocyte development, as a master regulator of peripheral T-cell differentiation and function.

Introduction: As the fastest-growing subpopulation seeking organ transplantation is >65 years of age, it will be imperative to discern how aging impacts the acquisition of transplantation tolerance. Prior work has demonstrated that viral infections induce the development of alloreactive T cells, which impede the induction of transplantation tolerance. In this study, we tested the hypothesis that aging alters host defense against viruses, leading to the development of cross-reactive T cells that impair transplantation tolerance induction. We first examined how aging modifies the function of plasmacytoid DCs (pDCs), as IFNα production by pDCs is essential for the control of viral infections. Using both in vitro and in vivo murine systems, we found that aged pDCs produced lower levels of IFNα in response to TLR9 activation with CpG sequences or herpes simplex virus (HSV)-2 (ELISA). Aged mice (18-20 months) failed to clear this virus as effectively as young (2-4 months) mice. This was associated with increased liver inflammation, the release of the systemic Th17-skewing cytokines IL-6 and IL-21, and augmented splenic IL-17 levels in aged mice (ELISA). Prior to transplantation, unmanipulated aged mice (B6 or CBA background) produced more donor-specific IL-17 effector-memory T cells than unmanipulated young mice (ELISPOT). Furthermore, MLR assays demonstrated that aged memory CD4+ T cells from unmanipulated mice produced significantly more IL-17 in response to donor antigen than young T cells (ELISA). To determine whether aged recipients manifest an altered response to therapies that prolong allograft survival, aged and young B6 mice received BALB/c skin allografts and perioperative anti-CD45 and anti-CD154. We found that aged mice rejected their allografts significantly faster (median survival 20 days) than young mice (MST 76 days, p<0.01). Similar results were noted when we employed perioperative treatment with anti-CD154 + DST, and when we altered the donor-recipient strain combination (B6 to CBA). Pre-treating aged mice with an anti-IL-17 mAb improved the efficacy of the graft-prolonging therapy compared to aged mice that received a control mAb (p=0.04).
Conclusion. Our results suggest that impaired control of viral infections with aging leads to the generation of alloreactive IL-17-producing T cells that impair therapies that may otherwise induce transplantation tolerance.

Prevention of Type 1 Diabetes by Thymus Genetic Modification with Protective MHC Class II Molecules. Jesus R. Paez-Cortez,1 Michela Donnarumma,1 Chaorui Tian,1 John Iacomini.1 1Transplantation Research Center, Boston, MA. Introduction. Susceptibility to type 1 diabetes is determined by multiple genetic factors, among the strongest of which is the inheritance of at-risk genes that lead to disease development. Here we examined whether diabetes can be prevented by providing protective MHC class II genes through direct infection of the thymus of diabetes-prone NOD mice. Methods. After surgical exposure of the thymus, lentiviruses encoding control (pHAGE-CMV-DsRed-IRES-ZsGreen-W) or protective MHC class II IAβd (pHAGE-CMV-IAβd-IRES-ZsGreen-W) genes were injected into a single thymic lobe of 3- to 4-week-old female euglycemic NOD mice. Gene expression was determined by microscopic examination at different time points after injection, and blood glucose levels were monitored weekly to examine whether this approach prevents the development of diabetes. The presence of diabetogenic CD8+ T cells was detected by flow cytometry via tetramer staining of splenocytes. Pancreatic islet integrity and insulin production were assessed by immunohistochemistry. Results. Viral gene expression was observed exclusively in thymic epithelial cells beginning at 5 days post-injection in both groups. NOD mice injected with control lentivirus developed diabetes by 22 weeks post-injection, similar to non-treated animals (18-20 weeks). In contrast, pHAGE-CMV-IAβd-IRES-ZsGreen-W-injected animals remained normoglycemic at 12 months post-injection. Diabetogenic CD8+ T cell populations were not detected in splenocytes of IAβd-injected animals using MHC class I tetramers. Islet integrity and insulin production were preserved in the treated group, in marked contrast to controls, which exhibited characteristic lymphocytic infiltration of islet cells and low insulin storage. Conclusions. Thymic genetic modification with a protective MHC class II IAβd molecule can be used to prevent diabetes in NOD mice. Central deletion of diabetogenic T cell populations may be involved in the prevention of autoimmunity in these animals.

Role of Invariant NKT Cells in Liver Sinusoidal Endothelial Cell-Induced Immunosuppression of T Cells with Indirect Allospecificity. Masayuki Shishida, Hideki Ohdan, Yuka Tanaka, Masataka Banshodani, Yuka Igarashi, Toshimasa Asahara. Surgery, Hiroshima University, Hiroshima, Japan. We have reported that liver sinusoidal endothelial cells (LSECs) endocytose portally injected allogeneic splenocytes and can negatively regulate T cells with indirect allospecificity via the Fas/FasL pathway. After in vitro transmigration across LSECs from BALB/c mice treated with a portal injection (PI) of B6 MHC class II-deficient (C2D) splenocytes, naive BALB/c CD4+ T cells lost their responsiveness to stimulation by BALB/c splenic antigen-presenting cells (APCs) that had endocytosed the donor-type alloantigens, but maintained a normal response to stimulation by BALB/c APCs that had endocytosed third-party C3H alloantigens. In the present study, we examined whether invariant NKT (iNKT) cells influence the ability of LSECs to endocytose irradiated allogeneic cells.
BALB/c wild-type (WT) mice or BALB/c CD1d-deficient (CD1d−/−) mice, which lack iNKT cells, were portally injected with 30×10⁶ irradiated B6 C2D splenocytes labeled with PKH-26. Only 3.56±3.19% of LSECs endocytosed the labeled splenocytes in the BALB/c CD1d−/− mice at 12 h after PI, whereas 16.82±2.69% of LSECs did so in the WT control mice (P<0.001, n=4 each). When BALB/c WT mice received 4 µg α-GalCer intraperitoneally before PI of B6 C2D splenocytes, MHC class II expression on LSECs and the endocytic activity of LSECs were enhanced. Thus, the endocytic activity of LSECs was regulated by iNKT cells. Intraportal adoptive transfer of LSECs isolated from BALB/c WT mice treated with a PI of B6 C2D splenocytes into BALB/c mice significantly prolonged the survival of subsequently transplanted heart allografts (n=4) as compared to the adoptive transfer of similarly treated LSECs isolated from BALB/c CD1d−/− mice (n=4). However, intraportal adoptive transfer of LSECs isolated from BALB/c WT mice that had received 4 µg α-GalCer intraperitoneally prior to PI of B6 C2D splenocytes did not further prolong this effect (n=4). These findings indicate that iNKT cells are required for LSEC-induced immunosuppression of T cells with indirect allospecificity, but that α-GalCer-induced activation of iNKT cells does not promote these suppressive effects. In conclusion, naive iNKT cells play a pivotal role in the LSEC-induced immunosuppression of T cells with indirect allospecificity.

Notwithstanding the considerable amount of in vitro data supporting the non-immunogenicity and immunomodulatory effects of mesenchymal stem cells (MSC), scant and conflicting data are available on their in vivo immunomodulatory capacities. In this study we formally investigated whether MSC have immunomodulatory properties in solid organ transplantation, using a semi-allogeneic heterotopic heart transplant mouse model, and studied the underlying mechanism(s). MSC, isolated from bone marrow by adherence, were depleted of CD45+CD11b+ cells before injection.

Bone marrow-induced hematopoietic mixed chimerism is the most robust mechanism for the induction of transplantation tolerance. However, the immunogenicity of bone marrow cells requires harsh immunosuppressive regimens that can lead to severe side effects, including death. Here, we examined whether embryonic stem (ES) cells can be coaxed to form hematopoietic progenitor cells (HPC), which could potentially be less immunogenic than bone marrow cells. We transduced ES cells with HOXB4, a hematopoietic transcription factor that confers self-renewal properties to hematopoietic cells, and differentiated them into hematopoietic cells. Transduced cells had a 100-1000-fold greater proliferation capacity than controls. At the end of the differentiation procedure, most cultures were >80% CD45+. HPCs were purified using immunomagnetic bead separation. The separated cells were further characterized for leukocyte markers and showed a high percentage of CD34, CD117 and CD31, low class I, but no class II expression. Further, they poorly expressed co-stimulatory molecules such as CD80 and CD86. When transplanted into Rag2−/−γc−/− mice, HPCs fully reconstituted bone marrow, forming multi-lineage hematopoietic cells. To determine whether these cells engraft in allogeneic recipients, the cells were transplanted into syngeneic and allogeneic MRL mice.
All transplanted animals became chimeric (n>30), reaching 20-30% chimerism after 28 days. Thereafter, donor cells declined as a result of out-competition by resident bone marrow cells. This pattern was identical in syngeneic and allogeneic recipients. Unexpectedly, allogeneic chimeric mice became tolerant to donor-type cardiac allografts as monitored over 100 days. Grafts showed no mononuclear cell infiltration or signs of chronic rejection. Interestingly, T cells in tolerant animals showed responses to MRL alloantigen similar to those of controls, suggesting that our protocol was likely non-deletional but could involve regulatory T cells. Indeed, when stained for CD25+FoxP3+ cells, the allografts showed a high percentage of these cells, which were absent in controls, supporting this hypothesis. Thus, these data show for the first time the potential of ES-derived HPCs to regulate engraftment of allografts, providing an alternative approach for the induction of transplantation tolerance.

We had previously shown that A20 is part of the regulatory atheroprotective response of endothelial cells (EC) and smooth muscle cells (SMC) to injury. A20 is an NF-κB-dependent gene with potent anti-inflammatory effects in EC and SMC through blockade of NF-κB. A20 also serves an anti-proliferative function in SMC and opposite anti-apoptotic or pro-apoptotic functions in EC and neointimal SMC. Based on these functions, A20 is a good candidate to prevent transplant arteriosclerosis (TA) and chronic rejection in vascularized organ grafts. This is supported by A20 expression in EC and SMC correlating with the absence of TA in rat kidney allografts and in long-term functioning human kidney allografts. Fully mismatched C57BL/6 (H2b) and BALB/c (H2d) mice were used as donors and recipients of an aortic-to-carotid allograft. In this combination, TA lesions start at 4 weeks and become occlusive by 8 weeks when left without immunosuppression. A20 expression in the graft was achieved by recombinant adenovirus (rAd)-mediated gene transfer prior to retrieval. Control mice were infused with saline or control rAd beta-galactosidase. The grafts were harvested at 4 weeks and analyzed for TA lesions by measuring intima-to-media ratios (I/M), and for markers of inflammation and of the immune response by immunohistochemistry. A20-expressing vessels were significantly protected from intimal hyperplasia, with I/M reaching 0.4±0.09 as compared to saline- (1.62±0.17) and beta-Gal-treated (1.86±0.06) vessels. This effect of A20 was not associated with a decrease in infiltrating CD3, CD4 or CD8 T cells in A20 vessels as compared to controls; a possible modification of the phenotype of these T cells (Tregs vs. effector T cells) is being explored. Rather, protection from TA correlated with increased expression of endothelial and inducible nitric oxide synthases (NOS) in EC and SMC of A20-expressing vessels, suggesting that this effect was, at least in part, related to increased in situ production of nitric oxide. In conclusion, we present the first direct evidence that expression of the anti-inflammatory and atheroprotective protein A20 in the vessel wall prevents transplant arteriosclerosis through a mechanism implicating increased expression of NOS.

Carbon Monoxide Inhalation Reverses Established Chronic Allograft Nephropathy through the NO Pathway. G. Faleo, A. Nakao, J. Kohmoto, R. Sugimoto, K. Tomiyama, A. Ikeda, M. A. Nalesnik, D. B. Stolz, N. Murase. Thomas E Starzl Transplantation Institute, University of Pittsburgh, Pittsburgh, PA.
Chronic allograft nephropathy (CAN) is the most common cause of graft loss; however, no established therapeutic strategy for preventing or treating CAN is yet available. We have previously shown that carbon monoxide (CO) effectively inhibits CAN development. Here, we examine the mechanisms by which CO overturns CAN through vascular endothelial cell protection. Methods: Orthotopic kidney transplantation (KTx) was performed from Lewis to binephrectomized BN rats under brief tacrolimus (0.5 mg/kg, d0-6, im). By d60 after KTx, BN recipients developed CAN with decreased creatinine clearance (CCr, 0.58±0.1 ml/min), significant proteinuria (92.8±30.5 mg/24 h), and increased Banff scores for intimal arteritis, interstitial fibrosis, and tubular atrophy. Recipients were then treated with 20 ppm inhaled CO from d60 to d150. Results: Inhaled CO effectively reversed the severity of CAN, markedly improving renal function at d90 (table) and recipient survival (>150 d vs. 81 d in air controls). CO treatment resulted in reduced cytokine mRNA levels (TNF-α, IFN-γ) and improved Banff scores compared to untreated controls. In untreated allografts, CD31 expression on peritubular capillaries (PTC) was markedly diminished, while CO-treated grafts showed normal CD31 expression, suggesting significant improvement in maintaining PTC integrity with CO. Interestingly, eNOS and iNOS protein expression was significantly upregulated in untreated grafts, while it was maintained at steady levels in CO-treated grafts. Immunohistochemistry revealed eNOS expression on vascular endothelial cells, while iNOS localized to infiltrating cells. Further, serum nitrate/nitrite levels were significantly higher in untreated than in CO-treated recipients. Elevated levels of MDA, a marker of oxidative stress, in the air control group were accordingly reduced in CO-treated grafts. Renal cortical blood flow data showed better perfusion in the CO-treated group at d90.

Chronic allograft vasculopathy (CAV) is a component of chronic rejection and a major cause of graft loss. Non-immunologic factors and indirect allorecognition participate in the pathogenesis of CAV. New therapies with tolerogenic dendritic cells (DCs) are based on in situ delivery of alloAg to "quiescent" DCs of the recipient's lymphoid organs via apoptotic cells, vesicles or particles. We have shown that the ability of apoptotic cells to deliver alloAg and an inhibitory signal to DCs down-regulates the indirect alloresponse and prolongs allograft survival in mice. Aims: to test whether in situ targeting of the recipient's DCs with donor apoptotic cells ameliorates CAV by down-regulating indirect-pathway allo-immunity. Methods: We performed functional aortic (abdominal) transplantation in mice [BALB/c→C57BL/6 (B6)]. B6 mice were injected i.v. with 10⁷ BALB/c UVB-induced early apoptotic splenocytes (d-7). Sixty days later, grafts were evaluated in sections with H&E, van Gieson's (elastic fibers) and Masson's (collagen) techniques. CFSE-labeled 1H3.1 TCR-tg CD4 T cells specific for IAb (B6) loaded with IEα52-68 (BALB/c) were used to evaluate the indirect-pathway T cell response. Results: PKH67+ BALB/c apoptotic cells injected i.v. into B6 mice were captured by splenic CD8α− and CD8α+ DCs, but not plasmacytoid DCs. Splenic DCs bearing apoptotic cells remained quiescent in vivo (MHC-I/II lo, CD80/86 lo, ICOSL+, PD-L1/2+) and were unable to up-regulate MHC-II and CD86 upon culture with GM-CSF. Injection of donor apoptotic cells induced defective activation and deletion of indirect-pathway 1H3.1 CD4 T cells.
Therapy with donor apoptotic splenocytes reduced intimal thickness in aortic allografts (100±21 vs. 196±28 µm in controls; p<0.001), as well as proliferation of α-smooth muscle cells and collagen deposition. The effect of apoptotic cells was allospecific and superior to that of live cells, and depended on the physical properties of apoptotic cells, since necrotic cells did not achieve the effect. Treatment with donor apoptotic cells significantly decreased the indirect-pathway T cell response (assessed by ELISPOT for IFN-γ) and reduced the level of circulating alloAb. Conclusion: In situ targeting of the recipient's DCs with (early) apoptotic cells carrying donor alloAg is a novel approach to prevent CAV by down-regulating the indirect-pathway alloresponse.

The Antifibrotic Agent, Pirfenidone, Has Direct Inhibitory Effects on T Cell Activation, Proliferation, and Cytokine and Chemokine Production, Leading to Suppression of Host Alloresponses. Gary A. Visner,1 Fengzhi Liu,1 Hanzhong Liu,1 Liqing Wang,2 Wayne W. Hancock.2 1Medicine, Children's Hospital Boston, Boston, MA; 2Pathology, Children's Hospital of Philadelphia, Philadelphia, PA. There is an urgent need to develop new therapies effective against the fibrotic and other complications of chronic allograft rejection. While pirfenidone (PFD) is an established anti-fibrotic agent, we previously showed that PFD treatment reduced acute rejection in a rat lung transplant model, suggesting that it might have direct immune-modulating properties. Accordingly, in this study we tested the effects of PFD on T cell responses. We first evaluated whether PFD alters T cell proliferation and cytokine release in response to T cell receptor (TCR) activation in vitro. Since PFD can inhibit TGF-β production by mononuclear cell fractions, we also examined whether PFD affects the suppressive effects of regulatory T cells (CD4+CD25+). The effects of PFD on alloantigen-induced T cell proliferation in vivo were then assessed by adoptive transfer of CFSE-labeled T cells across a parent→F1 MHC mismatch, as well as by using a murine heterotopic cardiac allograft model (BALB/c→C57BL/6). PFD was found to significantly inhibit TCR-stimulated CD4+ T cell proliferation in vitro (p<0.01), whereas CD8+ T cell proliferation was not significantly affected. While the beneficial effects of PFD were not associated with increased CD4+ T cell apoptosis, PFD use inhibited TCR-induced production of multiple cytokines and chemokines. Interestingly, there was no change in TGF-β production by purified T cells, and PFD also had no effect on the suppressive properties of naturally occurring regulatory T cells. Consistent with the in vitro studies, PFD inhibited alloantigen-induced T cell proliferation in vivo (parent→F1 model) and showed synergistic effects with low-dose rapamycin in this model. Lastly, though PFD alone did not affect the tempo of acute cardiac allograft rejection across a full MHC mismatch, use of PFD plus a subtherapeutic regimen of rapamycin significantly prolonged allograft survival (p<0.05), decreased mononuclear cell infiltration and prevented progression to chronic rejection, including arteriosclerosis and myocardial fibrosis. We conclude that PFD may be an important new agent in transplantation, with particular relevance to combating chronic rejection by inhibiting both fibroproliferative and alloimmune responses.
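For readers unfamiliar with how CFSE-dilution readouts, such as the adoptive-transfer experiments above, are quantified, the following minimal sketch reduces per-generation cell counts to standard proliferation metrics. The counts and the specific metrics chosen (division index, percent divided) are illustrative assumptions, not data from the abstract.

    # Minimal sketch: deriving proliferation metrics from CFSE generation counts.
    # Each halving of CFSE fluorescence marks one division; generation 0 = undivided.
    import numpy as np

    counts = np.array([5000, 2400, 2000, 1600, 800])  # hypothetical cells per generation
    gens = np.arange(len(counts))

    precursors = counts / 2.0**gens                    # back-calculate input cells per generation
    division_index = (gens * precursors).sum() / precursors.sum()    # divisions per input cell
    percent_divided = 100 * precursors[1:].sum() / precursors.sum()  # input cells that divided

    print(f"division index {division_index:.2f}, divided {percent_divided:.1f}%")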
We have previously reported studies in miniature swine showing that transplantation (Tx) of prevascularized donor islets as part of a composite islet-kidney (I-K) graft reversed diabetic hyperglycemia across fully allogeneic barriers, while free islets did not. In order to test the potential clinical applicability of this strategy, we have extended it to a fully allogeneic nonhuman primate model. Methods: Two diabetic baboons received composite I-Ks and one diabetic baboon received free islets across fully allogeneic barriers. (1) I-K preparation in donors: Two I-Ks were prepared by isolating islets from 60% partial pancreatectomies and injecting them under the autologous renal capsule, allowing for vascularization before allogeneic Tx. These I-Ks were harvested at days 203 and 102 for allogeneic I-K Tx. (2) Induction of insulin-dependent diabetes (IDDM) and allogeneic I-K or free islet Tx: All recipients received streptozotocin at 2000 mg/m² body surface area. IDDM was induced successfully in two animals, while one animal required total pancreatectomy to induce IDDM. After confirming IDDM by fasting blood sugar (FBS) >300 mg/dl for 3 consecutive days, either I-Ks or free islets were transplanted (single donor to each recipient) with ATG (day -3) followed by MMF and low-dose tacrolimus. Free islets were injected into the liver through the ileocolic vein. Islet function was assessed by FBS and renal function by serum creatinine. Immunologic status was examined by CML/MLR assays. Results: All three recipients had strong CTL/MLR responses to their donors pre-Tx, indicative of a fully allogeneic combination. FBS decreased immediately after I-K Tx, and no insulin therapy was required throughout the experimental period (days 225 and 279). BS levels averaged 82.2±15.6 mg/dl in the first baboon and 125.4±42.7 mg/dl in the second. Normal creatinine levels (<1.0 mg/dl) were maintained by these life-supporting I-K grafts. In contrast, the recipient of allogeneic free islets had unstable BS levels and required insulin from day 60 (BS 400 at day 60). Conclusions: Life-supporting I-Ks from single donors achieved glucose regulation without insulin therapy and maintained normal renal function. These results demonstrate the feasibility of composite I-K Tx in a nonhuman primate allogeneic model, with possible clinical applicability for the cure of diabetic nephropathy.

Donor Cell Infusion without Immunosuppression as a Novel Therapy for Induction of Donor-Specific Tolerance in Islet Cell Transplantation. Xunrong Luo,1 Kathryn Pothoven,1 Derrick McCarthy,2 Matthew DeGutes,2 Aaron Martin,2 Xiaomin Zhang,3 Guliang Xia,3 Dixon Kaufman,3 Stephen Miller.2 1Medicine, Northwestern University; 2Microbiology and Immunology, Northwestern University; 3Surgery, Northwestern University, Chicago, IL. Background. In autoimmune models, peptide-pulsed splenic antigen-presenting cells chemically fixed with ethylcarbodiimide (ECDI) have been used as a powerful and safe method to induce antigen-specific T cell tolerance. The use of ECDI-fixed donor cells for allo-antigen-specific transplant tolerance has not been well studied. Materials and Methods. C57BL/6 mice were rendered diabetic by STZ. Kidney subcapsular allogeneic islet transplantation was performed 10 days after diabetes stabilization (day 0). 1×10⁸ ECDI-treated donor splenocytes were injected i.v. either once (day -7) or twice (day -7 and day +1). Animals were analyzed for graft outcome.
Results. Control mice receiving an islet graft alone rejected the graft between days 11 and 18 (MST = 14 days, N=8). Mice receiving one dose of ECDI-treated donor splenocytes (on day -7) showed graft survival similar to controls (MST = 15 days, N=5). In contrast, mice receiving 2 doses of ECDI-treated donor splenocytes (on day -7 and day +1) showed significant prolongation of graft survival, with 72.7% functional grafts at day 60 (N=11), and some remained functional >100 days. This protection is donor-specific, as mice receiving ECDI-treated SJL cells rejected BALB/c islet grafts as controls did (MST = 18 days, N=4). Immunohistochemistry of protected grafts showed positive insulin staining within well-defined islet architecture. Peri-islet infiltrates were composed of CD4+, CD8+, and CD11c+ cells, with occasional Foxp3+ cells. Anti-donor antibody production (IgG1, IgG2a/b, IgG3) was completely abolished in long-term graft survivors. This tolerance was undisturbed by anti-CD25 antibody treatment during the maintenance stage, but could not be established if treatment was given around the time of the first donor cell infusion. In addition, lack of PD-L1 also impaired tolerance induction, as evidenced by using PD-L1−/− recipients. Conclusion. 1. Multiple infusions of ECDI-treated donor splenocytes significantly prolonged allograft survival in the islet transplant model. 2. The protective effect is donor-specific and is dependent on regulatory T cells as well as the PD-L1 signaling pathway. Therapy with ECDI-treated donor cells may emerge as a novel and potent approach for induction of donor-specific transplant tolerance.

Mice. Rebecca Stokes,1 K. Cheng,1 C. Scott,1 W. Hawthorne,2 P. O'Connell,2 J. E. Gunton.1 1Garvan Institute, Darlinghurst, NSW, Australia; 2NPTU, Westmead, NSW, Australia. The aim was to investigate the effects of increasing HIF-1α protein in human islets upon islet transplant outcomes. HIF-1α is a transcription factor that coordinates a program of cellular responses to stressors including hypoxia. In other cell types, HIF-1α improves survival following hypoxic challenge. HIF-1α functions as a heterodimer with ARNT, which we have shown to be important for normal β-cell function (1). Islet transplantation subjects islets to hypoxia; it is thought that up to 70% of islets die within 1 week of transplantation, at least partly because of hypoxia. The role of HIF-1α in β-cell function is unknown. We hypothesized that increasing levels of the protective factor HIF-1α in islets before transplantation would improve survival and engraftment and thus improve islet transplant outcomes. Isolated human pancreatic islets from 9 separate donors were cultured overnight in control media or media supplemented with desferrioxamine (DFO), a small-molecule stimulator of HIF-1α protein. Islets were transplanted into diabetic SCID mice in 3 transplant groups: 1. supra-physiological-mass transplant of 2000 control-cultured IEQ (islet equivalents); 2. minimal-mass transplant of 600 control-cultured IEQ; or 3. minimal-mass transplant of 600 IEQ cultured with DFO. For each human donor, at least 1 of each of the 3 transplant groups was performed to avoid the confounder of inter-donor variability. Recipients of 2000 control IEQ were cured in 42% of cases. Minimal-mass transplantation of control islets was ineffective: 600 IEQ cured 0% of mice at 28 days. However, minimal-mass transplants of DFO-treated islets had 53% success (p<0.001 vs. group 2 and p=ns vs. 2000 control IEQ).
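A cure-rate contrast like the 0% vs. 53% result just reported is commonly tested with Fisher's exact test; the sketch below illustrates that computation. The group sizes are assumed (the abstract reports only percentages), so the printed p-value is illustrative rather than the study's.

    # Minimal sketch: Fisher's exact test on a 2x2 cure-rate table.
    # Group sizes are assumed; only the percentages come from the abstract.
    from scipy.stats import fisher_exact

    cured_dfo, failed_dfo = 8, 7      # assumed 15 minimal-mass DFO transplants (~53% cured)
    cured_ctrl, failed_ctrl = 0, 15   # assumed 15 minimal-mass controls (0% cured)

    odds_ratio, p = fisher_exact([[cured_dfo, failed_dfo],
                                  [cured_ctrl, failed_ctrl]])
    print(p)  # ~0.002 for these assumed counts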
Blood glucose levels were markedly improved in the 600-IEQ DFO-treated group compared to the 600-IEQ control group (p<0.0001) and were equivalent to 2000 control IEQ transplants (p=ns). These data demonstrate that increasing HIF-1α in human islets prior to transplantation markedly improves islet transplant outcomes. HIF-1α and DFO may have a therapeutic role in human islet transplantation.

Long-Term Disappearance of Neovascularization of Transplanted Islets. Eba Hathout, Nathaniel Chan, Annie Tan, John Chrisler, John Hough, Naoaki Sakata, John Mace, Ricardo Peverini, Richard Chinnock, Lawrence Sowers, Andre Obenaus. Loma Linda University, Loma Linda, CA. We recently reported an in vivo timeline for neovascularization of transplanted islets using dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) over a 14-day period. However, vascularization of transplanted islets must be maintained for extended periods to provide long-term function. In this dataset, we investigated whether vascularization was maintained in transplanted Feridex-labeled syngeneic murine subcapsular islets (400 IEQ per kidney) using DCE imaging on an 11.7 T MR scanner and subsequent immunohistochemistry over 180 days. Subcapsular transplants could be visualized at post-transplant days 3 and 14 using T2-weighted imaging; however, the islets could not be seen on MRI at post-transplant day 180. Injection of the contrast agent gadolinium (Gd)-DTPA for DCE at 3, 14 and 28 days showed increased signal in the transplant area. At 180 days, there was no change in signal intensity after contrast injection during DCE. Immunohistochemistry confirmed the MRI and DCE findings. These results suggest that islet neovascularization occurs early after transplantation but is likely not maintained for the 180-day duration of our experiments. This work was supported by NIH/NIDDK Grant # 1R01DK077541. Figure: A) T2 imaging at day 3 clearly identifies iron-labeled islets (arrows) in the subcapsular region; no iron-labeled islets are observed at day 180. B) DCE imaging for neovascularization of transplanted islets in the subcapsular region demonstrates a temporal decline in signal intensity.

Oleanolic Acid, a Natural Triterpenoid, Significantly Improves Islet Survival and Function Following Transplantation. N. Angaswamy,1 D. Saini,1 S. Ramachandran,1 N. Benshoff,1 W. Liu,1 N. Desai,1 W. Chapman,1 T. Mohanakumar.1,2 1Surg, WUSM; 2Path & Immunol, Washington Univ Sch Med, St. Louis, MO. Oleanolic acid (OA), a triterpenoid found in medicinal herbs, is an integral part of the normal human diet. OA has anti-oxidant and anti-inflammatory properties (it inhibits iNOS and COX2) and lowers plasma glucose levels. We hypothesized that these properties of OA would prevent early islet cell loss following transplantation and also benefit long-term allograft function. C57BL/6 mice made diabetic by streptozotocin (200 mg/kg) were transplanted with 500 BALB/c islets (isolated by collagenase digestion) under the kidney capsule. OA (0.5 mg/day) was administered i.p. in 100 µl of PBS (with 6 M DMSO), or PBS-DMSO as vehicle control, daily from day -1 onwards. Blood glucose was monitored daily. Immunohistochemical analyses of grafts were performed for CD4 and CD8 markers. Cellular immune responses to donor antigens, and cytokines produced by cells and in sera, were measured using ELISPOT and Luminex assays. The effect of OA on transplant function with suboptimal islet doses (100-250) was also analyzed.
An optimal dose of islets (500) transplanted into diabetic BL/6 mice given OA reversed diabetes significantly faster after transplantation (<2±1 vs. 4±2 days, p=0.003). Further, OA treatment reversed diabetes even with a suboptimal dose (200) of islets, while untreated animals did not achieve normoglycemia. As expected, control diabetic mice rejected at 6±2 days, whereas OA administration alone prolonged islet allograft survival to 23±3 days (p<0.0001). OA treatment resulted in a >3-fold increase in serum KC, IL-10 and VEGF (p<0.0003) and a 2-fold decrease in MCP-1, IP-10 and IL-4 (p<0.005) by Luminex assay. Stimulation of splenocytes from OA-treated mice with donor BALB/c cells resulted in significantly reduced IFN-γ (4.5-fold), IL-4 (3.5-fold), IL-2 (2.3-fold) and IL-17 (4-fold). In addition, proliferation in MLR was reduced 2.5-fold. Immunohistochemical analysis of grafts showed a significant reduction in cellular infiltration in OA-treated animals, with reductions in both CD4 and CD8 T cells. Daily administration of OA markedly improved islet engraftment and function, with reversal of diabetes even when suboptimal islet doses were transplanted. Further, OA treatment allowed significant long-term allograft survival with no other immunosuppression. We demonstrate that prevention of inflammatory signaling cascades by OA markedly reduced cellular infiltration into the graft, allowing long-term allograft function.

Endoplasmic Reticulum Stress May Be an Important Cause of Cell Loss after Human Islet Isolation. Soon Hyang Park,1 Michel Tremblay,2 Steven Paraskevas.1 1Surgery, McGill University Health Center, Montreal, QC, Canada; 2McGill Cancer Center, McGill University, Montreal, QC, Canada. Purpose: To evaluate the presence of endoplasmic reticulum (ER) stress in human islets, induced by conditions to which islets are subjected during isolation (ischemia, nutrient deprivation, thermal stress and cytokine release), and to determine whether this leads to the Unfolded Protein Response (UPR), which could alter cell survival. Methods: Human islets were purified from cadaveric pancreata by collagenase dissociation and continuous density gradient purification. Islet preparations were cultured in serum-free medium and sampled at the end of isolation and daily thereafter. Total mRNA was purified and gene expression evaluated by RT-PCR. Activity in UPR signaling pathways was evaluated by immunoblot. Apoptosis was measured by a caspase-3 activity assay. Representative trends observed in >5 isolations are described. Results: Following isolation, a rapid increase in UPR signaling was observed in the PERK and IRE-1 modules of the UPR, including phosphorylation of the PERK target eIF2α and splicing of the mRNA for the transcription factor XBP-1. These changes occurred concurrently with a rapid spike in JNK activity and a rise in expression of the UPR target gene CHOP. After these signals peaked, caspase-3 activity increased with time (apoptotic cells), as did expression of the ER chaperone BiP (surviving cells). Conclusion: We consistently observed UPR activation in human islets. ER stress and the UPR may be an important and unrecognized cause of apoptosis in this context. Current investigations focus on UPR modification and determination of a causal relationship with apoptotic cell death.

Immunosuppression and the Risk of Renal Transplant Failure Due to Recurrent Glomerulonephritis. Atul Mulay,1 Carl van Walraven,1 Greg Knoll.1 1The Ottawa Hospital, Ottawa, ON, Canada.
Glomerulonephritis (GN) is the most common cause of end-stage renal disease among those who undergo kidney transplantation, and recurrent GN is a major cause of kidney transplant failure. Immunosuppressive medication is used to treat GN in the native kidney prior to the development of end-stage renal disease, but the impact of different immunosuppressants on recurrent GN post-transplantation is unknown. We used the United States Renal Data System to determine the association of routine post-transplantation immunosuppressant use with time to renal allograft failure due to recurrent GN. Immunosuppressants were treated as time-varying covariates (a minimal sketch of this modeling approach appears below). The study cohort comprised 41,272 patients with kidney failure due to GN who received a first kidney transplant between 1990 and 2003, with a median follow-up of 51 months. Ten-year overall graft survival (counting death as graft loss) and death-censored graft survival were 56.2% and 70.5%, respectively. Use of cyclosporine (hazard ratio 0.93; 95% CI 0.62-1.41), tacrolimus (hazard ratio 1.03; 95% CI 0.67-1.60), azathioprine (hazard ratio 0.88; 95% CI 0.68-1.13) or mycophenolate mofetil (hazard ratio 1.10; 95% CI 0.85-1.40) was not associated with the risk of graft failure due to recurrent GN after adjusting for important covariates. There was no difference in graft failure due to recurrent GN between cyclosporine and tacrolimus (P=0.4) or between azathioprine and mycophenolate mofetil (P=0.1). However, a change in any immunosuppressant during follow-up was independently associated with graft loss due to recurrence (HR 1.31, 95% CI 1.07-1.60, P=0.01). When we restricted the analysis to patients who had no change in immunosuppression during follow-up, we again found no association between any of the immunosuppressive medications and the risk of graft loss due to recurrent GN. Despite the increased use of tacrolimus, cyclosporine and mycophenolate mofetil to treat GN in native kidney disease, the use of these medications following kidney transplantation had no impact on the risk of graft loss due to recurrent GN.

Glomerulosclerosis. Junichiro Sageshima,1 Gaetano Ciancio,1 Alessia Fornoni,1 Linda Chen,1 Carolyn Abitbol,1 Jayanthi Chandar,1 Warren Kupin,1 Giselle Guerra,1 David Roth,1 Sherry Shariatmadar,1 Gaston Zilleruelo,1 George W. Burke III.1 1University of Miami Miller School of Medicine, Miami, FL. BACKGROUND: Disease recurrence is a major obstacle to kidney transplantation for focal segmental glomerulosclerosis (FSGS). Anti-CD20 antibody (rituximab) has been used for nephrotic syndrome of the native kidney, and a significant reduction of proteinuria in transplant recipients with FSGS recurrence has also been reported after rituximab use for posttransplant lymphoma. We hypothesized that rituximab induction could alter the posttransplant course of FSGS recipients, particularly in those patients with rapid progression to end-stage renal disease, who are at higher risk of recurrence. METHODS: We compared the outcomes of transplants for primary FSGS treated with and without rituximab. From Jan. 2000 to Dec. 2003, recipients received renal allografts with our "standard" immunosuppressive protocol, consisting of tacrolimus, mycophenolate, corticosteroids, and antithymocyte globulin and/or daclizumab. From Jan. 2004 to Dec. 2007, recipients received rituximab in addition to the "standard" immunosuppression. Posttransplant proteinuria was treated with plasmapheresis (PP) and maintenance angiotensin blockade (AB).
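The following minimal sketch illustrates the time-varying-covariate Cox approach described in the USRDS analysis above: each patient contributes one row per interval of constant drug exposure. The long-format rows and the use of the lifelines package are illustrative assumptions, not the study's actual data or software.

    # Minimal sketch: Cox regression with a time-varying immunosuppressant covariate.
    # Each row covers an interval (start, stop] of constant exposure for one patient.
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    df = pd.DataFrame({
        "id":    [1, 1, 2, 3, 4, 5, 6, 6],
        "start": [0, 24, 0, 0, 0, 0, 0, 18],
        "stop":  [24, 60, 51, 48, 30, 72, 18, 90],
        "mmf":   [0, 1, 1, 0, 1, 0, 0, 1],   # on mycophenolate during the interval?
        "event": [0, 1, 0, 1, 1, 0, 0, 0],   # graft failure from recurrent GN at `stop`?
    })

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()  # exp(coef) is the hazard ratio for the exposure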
RESULTS: There were no adverse events related to rituximab infusion. The overall incidence of posttransplant proteinuria was significantly lower in recipients with rituximab induction (p<0.05). Four recipients treated with "standard" immunosuppression developed massive proteinuria (urine protein/creatinine >10) immediately following transplantation and responded poorly to PP and AB; four other recipients had moderate proteinuria. In contrast, of the 18 patients induced with rituximab, only 2 had massive proteinuria and 4 had mild to moderate proteinuria, which responded well to PP and AB. With a median follow-up of 26 months, there was no significant difference in graft survival between the 2 groups (2-year survival: 81% without rituximab vs. 84% with rituximab). Half of the graft losses were related to non-compliance. CONCLUSION: While the mechanism of action is unclear, our observation indicates that rituximab induction may decrease the incidence and severity of recurrence of FSGS following kidney transplantation. A larger-scale study is desirable to confirm this observation.

Mesangial Chimerism in Recurrent IgA Nephropathy. Geoffrey Talmon,1 Dylan Miller.1 1Department of Pathology and Laboratory Medicine, Mayo Clinic, Rochester, MN. Background. IgA nephropathy (IN) is the most common primary glomerulonephritis, and disease recurs in nearly 50% of patients who undergo a renal transplant for IN. Data support that bone marrow-derived cells are capable of differentiating into various mesenchymal cells within the kidney. The extent to which this phenomenon, versus proliferation of resident mesenchymal cells, is involved in populating the mesangium is not well understood. The mesangial injury and/or hypercellularity seen in IN provides a robust in vivo model for determining whether this phenomenon is prevalent in human kidneys. Design. Follow-up biopsies showing recurrent disease from male patients who received female renal allografts for IN were selected. Fluorescence in situ hybridization and immunofluorescent staining for smooth muscle actin and X and Y chromosome centromeres were performed on unstained slides from the paraffin-embedded tissue. Cells within nonsclerotic glomeruli with triple positivity (Y+) were assumed to be mesangial cells derived from the recipient. Results. Four cases of recurrent IN with nonsclerotic glomeruli were obtained, each displaying at least minimal mesangial proliferation by light microscopy (one "minimal", two "mild", one "moderate"). Mesangial cells with Y chromosome centromeric material were observed in each case (100%). Between 6 and 34 mesangial cells were present in each glomerulus (mean 16.45), with zero to three Y+ cells seen (mean 1.8). These accounted for between 8% and 16% of mesangial cells in individual glomeruli. The ratio of Y+ to total mesangial cells in each case ranged from 1:7.8 to 1:11.3 (mean 1:9.35). The case exhibiting minimal mesangial hypercellularity had a ratio of 1:9.6, those with mild had ratios of 1:7.8 and 1:8.7, and that with moderate 1:11.3. Conclusions. Recipient-derived mesangial cells make up a fraction of the population of glomerular cells in renal allografts affected by recurrent IN. Although the number of cases is small, the number of recipient-derived cells does not seem to be directly related to the degree of mesangial hypercellularity seen by light microscopy.
The consistent presence of these "colonizing" cells in patients with recurrent IN does, however, suggest that there may be a role for targeted therapy directed against circulating recipient cells.

In the mid-1970s, patients developing ESRD secondary to systemic lupus erythematosus (SLE) were deemed poor transplant candidates because of concern that early recurrent lupus nephritis (RLN) would lead to allograft loss. Subsequently, RLN was considered an unusual complication of kidney transplantation, occurring in <4% of allografts. However, over the last decade, several reports have shown the frequency of RLN to range from 8-30%. We sought to determine the frequency of RLN at our center and to identify any clinical variables associated with RLN. Between 6/1977 and 11/2005, allografts in 166 patients with ESRD due to SLE functioned for more than 90 days after engraftment. Immunosuppression consisted of azathioprine (AZA), or cyclosporine (CSA) and AZA, or mycophenolate (MMF) and CSA, or tacrolimus and MMF, depending on the date of transplant. All received steroids. Proteinuria was defined as 2+ on dipstick or a urine protein/creatinine ratio >0.5. Medical charts were reviewed. We found pathologic evidence of RLN in 18 (24%) of 75 patients who underwent biopsy due to allograft dysfunction or proteinuria, comprising 10% of all patients transplanted for SLE. Characteristics of these patients are shown in the accompanying table (not reproduced here).

Randomized studies have shown little or no increase in AR in kidney tx recipients on prednisone (P)-free immunosuppression (IS), but concern remains about long-term outcomes. We present 8-yr follow-up of a protocol incorporating rapid (<6 days) discontinuation of P, with now over 1000 patients transplanted using this protocol. Between 1/1999 and 10/2007, 1003 adult tx recipients were treated with thymoglobulin (5 doses, extended in DGF), P (5 days), a CNI, and either MMF or SRL. Of these, 658 were LD (260 LURD) and 345 DD. Of the 1003, 41% were female and 88% white; mean recipient age was 47±14 years and mean donor age was 39.7±13.2 years. Diabetes was present in 34.3% of the recipients, and 12.5% of the transplants were retransplants. The peak PRA was >10% in 20% of recipients; 14% had tx PRA >10%. Table 1 shows actuarial survival rates. Graft survival rates were significantly better in LD vs. DD transplants (p=0.002) and acute rejection rates lower (p=0.04). Compared to national data from the SRTR, overall outcomes were not significantly different. With a mean follow-up of 3.5 years, a total of 110 (11%) recipients have died, the most common cause being cerebrovascular accident (14%) followed by malignancy (9%); there were only 5 (4.5%) patient deaths due to cardiac causes. Of 195 (19.4%) graft losses, 86 (44%) were from DWF (death with function) and 43 (10%) from CR/CAN. Renal function has been stable, with mean serum Cr of 1.7 mg/dl at 3 and 5 years posttransplant and creatinine clearances of 63 and 60 ml/min, respectively. At 5 years post-tx, compared to pretransplant values, recipients showed a 9.6% increase in weight, a 3% decrease in serum cholesterol, and a 10.8% decrease in serum lipid values. 84% of the kidney recipients remain P-free; the most common reason for restarting P was acute rejection (AR). Conclusion: Short-term data suggest kidney tx recipients do well with rapid discontinuation of P. Our intermediate-term data suggest that patient and graft survival rates remain good and renal function remains stable. Ongoing long-term follow-up is necessary.
Background. Since 4/02 our program has employed a steroid-free, rapamycin- and Neoral-based maintenance immunosuppression regimen for kidney transplant recipients. Prior to that time, recipients were treated with prednisone, MMF and Neoral. We noted a significant reduction in acute cellular rejection (ACR) after implementing this regimen. This retrospective analysis was performed to examine the impact, if any, on the incidence, character, and outcomes of early acute humoral rejection (AHR). Results. This study includes 1611 consecutive kidney recipients transplanted between 1/99 and 12/06: 717 recipients in the prednisone, MMF and Neoral era (Grp 1) and 894 in the steroid-free, rapamycin and Neoral era (Grp 2). Recipient age, gender, African-American race, AB and DR mismatch, and frequency of PRA >10% were not statistically significantly different between the 2 groups. However, 43% of Grp 1 pts vs. 63% of Grp 2 pts received a living donor kidney, owing to a more recent volume increase in this procedure (p<0.001). A total of 722 kidney biopsies in 466 pts were performed in the first 6 months post-transplant; 478 in 287 pts when excluding pre-perfusion biopsies. Nineteen percent (54/287) of these showed >50% peritubular capillary (PTC) C4d deposition. Comparison of Grp 1 to Grp 2 demonstrated the following: 1) the incidence of clinical acute rejection in the first 6 months was 12.7% (91/717) in Grp 1 and 5.6% (50/894) in Grp 2 (p<0.001); 2) the overall incidence of a C4d+ biopsy was similar in the 2 groups. Conclusions. Despite a significant reduction in the incidence of acute rejection using our newer, steroid-free immunosuppression protocol, there has been no reduction in the incidence of early (0-6 months) AHR as evidenced by C4d+ kidney biopsy. However, the percentage of AHR unassociated with ACR has significantly increased. The poor graft survival in pts with early AHR has not improved with our newer immunosuppression regimen.

Conclusions: Four risk factors for AR were identified in the RDP study population: retx, AA race, age 18-50 (vs. >50) and PRA >50 (vs. <50). Four risk factors for GL were identified: pre-tx T1 DM, AR, DGF and DD tx (when AR and DGF were omitted). These risk factors for AR and GL are the same as we observed in prednisone-containing protocols. Additionally, many of these factors are not modifiable. Identification of high-risk groups allows for individualization of IS. Increasing LDs and utilizing IS protocols that decrease or minimize DGF and AR are goals for improving graft outcome.

With the advent of more potent immunosuppression, HLA matching has been de-emphasized in the allocation of deceased donor kidneys due to its limited impact on acute rejection and graft survival. An unforeseen consequence of poorer matching could be an increase in sensitization of patients in need of a repeat transplant. Our study examined candidates listed in the US from 1988-2007 in the SRTR database who were re-listed following loss of a primary kidney transplant (n=19,827). The primary outcome of the analysis was the change in PRA from the listing prior to the recipient's initial transplant to the subsequent listing. Absolute changes in peak and current PRA levels were examined in general linear models, and the proportion of patients with a rise in PRA level was examined in a logistic model (a minimal sketch of such a model appears below). Results: HLA (A, B, DR) matching in the primary transplant was strongly associated with change in PRA level (p<0.001, Figure 1). Among recipients with 6 HLA mismatches (MM), over 50% had a rise in PRA at re-listing, as compared to 25% of 0-HLA-MM recipients.
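As a minimal sketch of the kind of logistic model just described, the code below regresses an indicator of PRA rise on the HLA mismatch count of the first transplant. The simulated data, effect size, and use of statsmodels are illustrative assumptions only.

    # Minimal sketch: logistic model of PRA rise at re-listing vs. HLA mismatches.
    # Data are simulated; only the general shape mirrors the abstract's analysis.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    hla_mm = rng.integers(0, 7, size=500).astype(float)  # 0-6 A/B/DR mismatches
    p_rise = 0.25 + 0.05 * hla_mm                        # toy trend: ~25% at 0 MM, ~55% at 6 MM
    pra_rise = rng.binomial(1, p_rise)                   # 1 = PRA rose at re-listing

    X = sm.add_constant(hla_mm)
    fit = sm.Logit(pra_rise, X).fit(disp=0)
    print(np.exp(fit.params[1]))                         # odds ratio per additional mismatch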
Younger recipient and donor age, male sex, deceased donor transplants and African American recipient race were also significantly associated with elevation in PRA. In addition, the effect was apparent when stratified by primary donor type. While the impact of HLA matching on graft survival may be limited, many patients may be negatively impacted by poor HLA matching in their first transplant when they need a second transplant. As high PRA is one of the strongest risk factors for not getting transplanted, this should be taken into account when evaluating the impact of HLA matching in kidney transplantation. This might be particularly important in younger patients, and in patients with a long life expectancy in general, because of the high likelihood of needing a second transplant during their lifetime.

Younger recipients were about 25% less likely to die or lose an ECD kidney from a donor aged 50-59 and 70% more likely with kidneys from donors older than 69. A similar trend was seen with older recipients, but the risks seemed to be lower. Logistic regression indicates that recipients older than age 65 were 5 times more likely to have a donor older than age 70 than recipients younger than 50; hypertension and creatinine >1.5 were less likely in older donor kidneys. Interestingly, the use of these kidneys relative to the total number of ECD kidneys has decreased since 2002 (odds ratio 0.63, 95% CI 0.53-0.76, P<0.001). Conclusion: An increase in the supply of kidneys might be achieved with increased utilization of kidneys from deceased donors older than age 70; outcomes were similar to those from donors aged 60-69 in recipients older than age 50.

Early DCD graft survival is inferior, but approximates non-DCD survival thereafter. DCD listing for retransplantation and graft failure progressed continuously over 180 days, versus 20 days in non-DCD. When retransplanted, DCD recipients waited longer and received higher-risk allografts (p=0.039), more often from another region. More DCD recipients remain waiting for retransplantation, with fewer removed for death, clinical deterioration, or improvement. CONCLUSIONS: DCD utilization is impeded by early outcomes and a temporally different failure pattern that limits access to retransplantation. Allocation policy that recognizes these limitations and increases access to retransplantation is necessary for expansion of this donor population.

Orthotopic Liver Transplantation with Allografts from DCD Donors. Roberto C. Lopez-Solis.1 BACKGROUND: In the current era of liver transplantation, organ shortage continues to be a significant problem, and the use of extended criteria allografts from donation after cardiac death (DCD) donors to increase transplantation rates is widely practiced. This study reviews one of the largest single-center experiences in the world utilizing DCD donors, with a follow-up of almost 15 years. METHODS: From 03/01/1993 to 10/31/2007, 3,431 liver transplants were performed at our institution, 146 (4.2%) of which used liver allografts from DCD donors. Patient and donor demographics, recipient and graft survival, and the incidence of primary non-function, hepatic artery thrombosis, retransplantation, and bile duct complications were analyzed for this subset of recipients. RESULTS: Kaplan-Meier analysis showed 1- and 5-year patient survival of 81% and 71%, and graft survival of 71% and 59%, respectively. The mean age of recipients was 52±10 years with an average MELD score of 18.7±9.3 (range 6 to 40), and there were 101 male patients (69%).
Donor mean age was 36±16 years and cold ischemia time was 646±173 minutes. One hundred and three patients (70.5%) are alive, and 24 (16.4%) underwent retransplantation. The incidence of primary non-function was 11.6% (17 patients) and of hepatic artery thrombosis 4.1% (6 subjects).

The FDA warns against using sirolimus (SRL) in liver transplants, reporting increased hepatic artery thrombosis (HAT), excess mortality and graft loss when SRL is used as initial immunosuppression (IS) with calcineurin inhibitors. We report the largest experience to date of patients with SRL used as initial IS, assessing hepatic artery complications and survival outcomes. Materials and Methods: All 1554 OLT pts from 1998-2007 were reviewed. Those using SRL as initial IS were identified, and the remaining OLT pts from that time period were used as controls. Ultrasound assessed graft vascular status, and any issues were verified by angiogram.

There were no significant differences in demographic variables, LT indication or pre-LT MELD score between the two groups (gr). Mean follow-up was 6.8±3.9 months. All enrolled pts were treated with an initial cyclosporine (Cs) dose of 2 mg/kg/day, to a target of 100 ng/ml for the first 10 days. Pts were randomized on day 10 into one of the two following gr on a 2:1 basis. Everolimus (Ev) gr: the initial dose of Ev was 2 mg/day, to reach a blood level of 8 ng/mL; the dose was increased on day 30, when Cs was discontinued, in order to reach an Ev blood level between 10 and 12 ng/mL. Cs gr: after the 10th post-operative day, the dose of Cs was adjusted to a target level of 250 ng/ml until day 30, then to 200 ng/ml until the end of month 6. All pts received basiliximab induction on days 0 and 5 after LT. Pts were weaned off prednisone by 5 weeks. Pt survival at 6 and 12 months was similar in the Ev and Cs gr (95.2% and 86.6% vs. 87.5% and 87.5%, respectively; p=ns). Causes of death were sepsis (1), HCV recurrence (1) and pulmonary embolism (1) in the Ev gr, and sepsis (1) and rupture of a splenic artery aneurysm (1) in the Cs gr. The overall incidence of infection episodes was comparable between the two gr (5.9% Ev gr vs. 12.5% Cs gr; p=ns). Cholesterol, but not triglycerides, increased in the Ev gr compared with the Cs gr (p<0.05); Ev dose reduction decreased these parameters without the need for statin therapy. CONCLUSION: Ev monotherapy in de novo LT showed patient survival and incidence of morbidity similar to a Cs-based immunosuppressive protocol. The primary endpoint was achieved inasmuch as renal function was statistically better in the Ev gr.

Background: SIR is a potent immunosuppressive agent that inhibits T-cell activation and proliferation. In LT recipients, SIR has primarily been used as a renal-sparing agent, but its toxicity and tolerability in this population have not been well defined. Aims: To identify the adverse effects of SIR in LT recipients and predictors of its discontinuation. Methods: Records from 327 adult LT recipients transplanted between 1/2000 and 12/2006 were reviewed. Reasons for starting and discontinuing SIR were captured, as were all significant adverse effects and laboratory abnormalities. Factors predicting SIR discontinuation in univariate analysis were further analyzed by multivariable logistic regression (MLR). Results: Mean age of the study group was 50±11 years, and 75% were male. Underlying liver disease was HCV ± alcohol in 56%, 24% had hepatocellular carcinoma, and 14% received living donor grafts.
Calcineurin inhibitors (CNI) were started post-operatively in 91% (85% tacrolimus / 15% cyclosporine), with or without mycophenolate and prednisone. 179 patients (54%) started SIR a median of 15 days (IQR: 4-138) post-LT, primarily for renal insufficiency (76%) or CNI neurotoxicity (7%). SIR was overlapped with tacrolimus and cyclosporine in 65% and 16%, respectively. Prior to starting SIR, total cholesterol was 139±69 mg/dl, LDL-cholesterol 90±49, and triglycerides 193±126. Peak lipids after SIR were 263±97, 138±81, and 475±456 mg/dl, respectively, despite lipid-lowering therapy. Serum creatinine was 2.05±1.04 and 1.08±0.5 mg/dl before and after SIR, respectively. Before SIR, 70% of patients had no proteinuria, but only 36% had no proteinuria after SIR; high-range proteinuria (>300 mg/dl) was noted in 3% before and 16% after SIR. Finally, SIR was discontinued in a total of 81 (45%) patients, for indications of cytopenias (20%), hyperlipidemia (19%), mouth ulcers (11%), sepsis (7.5%), skin reactions (7.5%), nephrotic syndrome (6%), GI intolerance (4%), pneumonitis/BOOP (2.5%), myopathy (2.5%), and combinations of the above (20%). MLR failed to identify any pretreatment predictors of discontinuation. Conclusions: Immunosuppression with SIR improves azotemia at the expense of considerable hematologic, metabolic, dermatologic, renal, pulmonary and muscle toxicity. Considering the high incidence of proteinuria after SIR treatment, the use of SIR as a less nephrotoxic agent must be reconsidered.

and to identify the most effective protocol. Peripheral blood was obtained from 4 EBV-seronegative and 4 EBV-seropositive pediatric heart (H) Tx patients. LCL- vs. DC-based methods were compared as follows: (i) LCL; (ii) LCL + IL-12; (iii) Type-1-polarized DC (treated with IL-1β, TNF-α, IL-6 and IFN-γ) loaded with an MHC class I-restricted EBV peptide pool (DC/pep.); and (iv) DC/pep. + IL-12. The EBV-specific CD8+ T cell phenotype and function were screened using flow cytometry, IFN-γ ELISPOT and cytotoxicity assays. The yields and functional activities of the in vitro co-cultures differed based on the induction method employed and on the EBV status of the patients tested. For the EBV-seropositive pediatric HTx patients, all four methods resulted in successful expansion of functional Type-1 EBV-specific CD8+ T cells, suggesting that memory CD8+ T cells are readily reactivated in vitro. For the EBV-seronegative pediatric Tx patients, however, only the LCL + IL-12 approach resulted in significant augmentation of Type-1 EBV-specific CTLs competent to secrete IFN-γ (400±50 per 10⁵ cells) and to kill (300±170 LU/10⁷ cells) EBV+ targets. We found that IL-27 secreted by LCL (and not by DC) was critical in triggering expression of IL-12Rβ2 on naive CD8+ T cells and rendering these cells responsive to IL-12p70. Further addition of exogenous IL-12p70 (which is generally not produced by LCL) proved essential for effective Type-1 priming, whereas blocking IL-27 during EBV priming abolished IL-12Rβ2 expression and subsequent IFN-γ production. These results demonstrate that the inducible expression of IL-12Rβ2 on naive CD8+ T cells was dependent on IL-27, and support the critical early role of EBV-infected B cells in the in vivo priming of naive precursors into potent EBV-specific Type-1 CD8+ T cells in children.
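ELISPOT frequencies like the 400±50 IFN-γ spot-forming cells per 10⁵ value above are derived by counting spots in stimulated wells, subtracting background wells, and scaling to a fixed cell number. The sketch below illustrates that arithmetic; all well counts and the cells-per-well figure are invented for illustration.

    # Minimal sketch: background-subtracted ELISPOT frequency per 1e5 cells.
    # Well counts and cells-per-well are invented for illustration.
    import numpy as np

    cells_per_well = 2e5
    stim_wells = np.array([820, 760, 845])  # spots, EBV-peptide-stimulated wells
    bg_wells = np.array([22, 30, 25])       # spots, no-antigen background wells

    sfc_per_1e5 = (stim_wells.mean() - bg_wells.mean()) / cells_per_well * 1e5
    print(f"{sfc_per_1e5:.0f} IFN-g spot-forming cells per 1e5 cells")  # ~391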
Serial EBV load monitoring in pediatric heart transplant (PHTx) patients has identified a group of asymptomatic children who exhibit persistently high EBV loads in peripheral blood (≥16,000 copies/ml in at least 50% of samples over a period of at least 6 months; a sketch of this classification rule appears below). These patients have a high rate of progression to late PTLD. Our goal is to characterize the deficiency of EBV-specific CD8+ T cell immunity that allows this state to occur and be maintained. Twenty-one stable EBV+ PHTx patients were categorized as follows: Group 1 (n=6), no detectable viral load; Group 2 (n=12), low viral load (≤16,000 copies/ml); Group 3 (n=4), high viral load (≥16,000 copies/ml). Twelve healthy subjects were recruited as controls. Flow cytometric analysis with HLA-A2 EBV-tetramer (TMR) probes in conjunction with mAbs against memory/activation markers was performed on peripheral blood CD8+ T cells, and their EBV-specific IFN-γ production was measured by ELISPOT. EBV-latent-specific CD8+ T cells in G2 patients were mostly CD62L+/CD45RO+ (central memory) and expressed heterogeneous levels of PD-1 and high CD127 (IL-7 receptor α); the EBV-lytic-specific CD8+ T cells were more frequent and biased toward CD62L−/CD45RO+ (effector memory) and CD62L−/CD45RA+ (stable effector memory), corresponding to terminally differentiated memory compartments. This cell population also expressed heterogeneous levels of PD-1 and down-regulated CD127. In contrast, both EBV-lytic- and -latent-specific TMR+ CD8+ T cells from G3 patients were homogeneously CD62L+/CD45RO+ (effector memory), CD38+ and CD127−, suggestive of a "recently activated" phenotype. Interestingly, although patients in groups G2 and G3 had high frequencies of EBV-specific TMR+ CD8+ T cells (G2 0.65%±1.5, G3 1.9%±3.5, p=0.3), only G2 patients exhibited a direct correlation between TMR+ CD8+ T cells and EBV-specific IFN-γ production. These results demonstrate that different levels of chronic EBV antigenic pressure trigger significant differences in the phenotypic and functional features of EBV-specific CD8+ T cells from PHTx patients, suggesting that the immunologic signature of the high EBV load carrier state is a combined "activated" phenotype with "exhausted" function of EBV-specific CD8+ T cells.

EBV-Encoded LMP1 Indirectly Activates the Jak/STAT Pathway through Induction of IFN-γ.

evidence of PTLD. 15 children were enrolled at 5 sites. Mean age at transplant was 7.1 years and mean time to PTLD 43 months. Organs transplanted were lung (6), heart (5) and kidney (4). All PTLD were of B cell origin, expressed CD20, and were EBV-positive. Histology was polymorphic in 11, monomorphic in 3 and Hodgkin's-like in 1. Seven patients received 4 doses, 7 patients 8 doses and 1 patient 7 doses. Treatment was associated with minimal side effects in 12, mild-moderate infusion-related reactions in 2 and a moderate reaction in 1. No patient had treatment discontinued because of side effects. Twelve patients (80%) showed a complete response after 4-8 doses, 2 had progressive disease and one had stable disease. At 24 months, 10 (67%) were alive, with one graft loss (kidney) and none with residual disease. At latest follow-up (mean 60.4 months, range 42-82), 10 remain alive, with 1 further graft loss (lung) and no PTLD. The 5 deaths occurred between 2.5 and 13 months and were associated with progressive disease (2), chronic rejection (2), and complications of elective surgery (1).
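The following minimal sketch encodes the persistent high-load rule referenced in the load-monitoring abstract above (≥16,000 copies/ml in at least 50% of samples spanning at least 6 months). The function name, data layout and month-length convention are hypothetical, not from the abstract.

    # Minimal sketch: classifying a chronic high EBV-load carrier from serial loads.
    # Rule from the abstract: >=16,000 copies/ml in at least 50% of samples over
    # a span of at least 6 months. Function and data layout are hypothetical.
    from datetime import date

    def is_chronic_high_load(samples, threshold=16_000, min_frac=0.5, min_months=6):
        """samples: chronologically ordered list of (date, copies/ml) tuples."""
        if len(samples) < 2:
            return False
        span_months = (samples[-1][0] - samples[0][0]).days / 30.44
        frac_high = sum(load >= threshold for _, load in samples) / len(samples)
        return span_months >= min_months and frac_high >= min_frac

    loads = [(date(2007, 1, 10), 21_000), (date(2007, 4, 2), 15_000),
             (date(2007, 7, 20), 30_000), (date(2007, 10, 5), 18_000)]
    print(is_chronic_high_load(loads))  # True: 3/4 samples high over ~9 months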
Conclusions: These findings support prior registry data and suggest that rituximab without chemotherapy is a successful second-line treatment in approximately two-thirds of children with refractory PTLD.

Cancer after Organ Transplantation in France. Jean Michel Rebibou, 1 Fabienne Pessione, 1 Francois Aubin, 1 Bernard Loty. 1 1 Agence de la Biomedecine, France. Cancer prevention appears as a major challenge in transplantation. Estimating cancer frequency is a major step toward designing prevention policy. We report on patterns of cancer incidence among 47,000 transplantations registered in the French database. The risk of lung cancer is higher for recipients of a thoracic organ, and the risk of kidney cancer appeared higher for kidney recipients. Ten-year cumulative incidence was 8.6% for all cancers and transplantation types, and 1.7% for NHL. Multivariate analysis demonstrated that cancer risk increased with recipient age (p<10^-3); 10-year cumulative incidence was 15% for recipients older than 60 years. It was higher in males (p<10^-3) and in thoracic organ recipients when compared with kidney recipients (p<10^-3). Cancer incidence did not vary according to the transplantation period (1995-1999 vs. 2000-2005, RR=0.95, p=0.34), and NHL risk was significantly lower during the period 2000-2005 (RR=0.8, p=0.02). This work does not report any increase in cancer incidence among transplant recipients, while cancer incidence increased in the general population. The observed decrease of NHL risk is of particular interest.

Both costimulatory and co-inhibitory signals are delivered by B7 ligands through the CD28 family of receptors on T lymphocytes, determining the ultimate immune response. Although B7-H4, a recently discovered member of the B7 family, is known to negatively regulate T cell immunity in autoimmunity and cancer, its role in transplantation rejection and tolerance has not been established. To study its role in physiologic rejection processes, we first treated B6 WT recipients of BALB/c hearts with a blocking mAb against B7-H4 or isotype IgG control and found no difference in graft survival (MST 7 days, n=8 vs. 7, n=10). However, B7-H4 blockade resulted in accelerated allograft rejection in CD28-deficient B6 recipients (MST 9.5 vs. 19, n=8, p=0.003), indicating that B7-H4 signaling can mediate negative regulation in the absence of CD28 costimulation. We next studied B7-1/B7-2 double-deficient (DKO) B6 recipients of BALB/c heart allografts, as these mice are truly independent of CD28/CTLA-4:B7 signals. While cardiac allografts were accepted in control DKO recipients (MST>100, n=5), blocking B7-H4 precipitated rejection (MST 45, n=5, p=0.01), demonstrating non-redundant functions of these two negative pathways. Based on these results, we next evaluated the role of B7-H4 in acquired transplantation tolerance by blocking CD28:B7 using CTLA4-Ig. B7-H4 blockade abrogated the prolongation of allograft survival by CTLA4-Ig (250 µg on day 2) in the fully MHC-mismatched cardiac allograft model (MST 13 vs. 46.5, n=8, p=0.0002). We conclude that the novel B7-H4 molecule can regulate alloimmune responses independent of an intact CD28/CTLA-4:B7 costimulatory pathway. The interplay between positive (stimulatory) and negative (regulatory) costimulatory signals is an important determinant of the outcome of the alloimmune response and could be exploited to induce tolerance.
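The survival comparisons above are reported as MST with p values; in practice such comparisons are usually made with a log-rank test over per-animal survival times. A minimal sketch, assuming the lifelines package and illustrative times chosen to echo the reported MSTs (these are not the study's raw data):

```python
# Minimal sketch: log-rank comparison of two graft-survival groups.
# Times are hypothetical, loosely echoing the MST 19 vs. 9.5 days reported above.
from lifelines.statistics import logrank_test

control = [17, 18, 19, 19, 21, 22, 24, 26]   # isotype IgG, days (hypothetical)
anti_b7h4 = [8, 9, 9, 10, 10, 11, 12, 13]    # anti-B7-H4, days (hypothetical)

# All grafts were rejected, so every observation is an event (no censoring).
result = logrank_test(control, anti_b7h4)
print(f"log-rank p = {result.p_value:.4f}")
```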
In a model of MHC-mismatched kidney allograft in the rat, treatment with anti-CD28 antibodies induced a form of tolerance independent of Treg cells but associated with a two-fold accumulation of myeloid-derived suppressor cells (MDSC) in the blood. To further characterize these cells, we analyzed their phenotype and mechanism of action by flow cytometry, western blotting and suppression assays. MDSC expressed CD80 and CD86, NKRP-1, CD172a (SIRPα), CD11a, CD11b, HIS48 and, for a fraction of them, CD4, but did not express MHC class II molecules. MDSC dose-dependently suppressed the proliferation of T cells in MLR and after stimulation with anti-CD3 + anti-CD28 antibodies. Although detected in blood, bone marrow, spleen and lymph nodes, MDSC were only suppressive in blood and bone marrow. This suppression was lost after physical separation from the responding T cells by semi-permeable membranes in transwell assays, as well as after addition of L-NMMA, an inhibitor of inducible nitric oxide synthase (iNOS), suggesting a role for NO in the suppression. Western blot analyses revealed that iNOS was expressed only after contact between MDSC and activated CD4 + CD25 - effector T cells, and to a much lesser extent after contact with activated CD4 + CD25 high Treg cells. MDSC affected stimulated CFSE-labeled effector T cells by blocking their proliferation but not their activation. In contrast, MDSC did not block the response of Treg cells stimulated by anti-CD3 + anti-CD28 antibodies. This selective suppression of effector but not of regulatory T cells was confirmed by cytokine profile analyses. In vivo, the expression of iNOS was higher in the blood of tolerant recipients, as well as in the graft, as compared with isografted recipients. In addition, the injection of aminoguanidine, an iNOS inhibitor, into stable tolerant animals induced graft rejection within 3 weeks. In conclusion, these results suggest that MDSC, accumulated in the blood of tolerant recipients of kidney allografts, release high levels of NO after contact with activated effector T cells and specifically control their proliferative response.

Liver Non-Parenchymal Components Inhibit Dendritic Cell Differentiation and Maturation. Ching-Chun Hsieh, 1 Horng-Ren Yang, 1 Guoping Jiang, 1 John J. Fung, 1 Shiguang Qian, 1 Lina Lu. 1 1 Immunology and General Surgery, Cleveland Clinic, Cleveland, OH. The inherent tolerogenicity of liver allografts could be due to comparatively large numbers of potentially tolerogenic antigen-presenting cells, in particular dendritic cells (DC). It is not clear whether the unique antigen-presenting function of liver DC is intrinsic or is altered by microenvironmental factors within the liver. In the present study, we investigated the effect of hepatic stellate cells (HpSC), a unique tissue cell population in the liver that is actively expanded in liver allografts, on the generation and function of bone marrow-derived DC (BM DC). We hypothesize that liver HpSC may modulate the immune response via inhibition of liver DC, which are known to be of bone marrow (BM) origin. In this study, DC were propagated from B6 BM cells with GM-CSF for 5 days. Irradiated HpSC isolated from B6 mouse liver were added at the start of culture at an HpSC:BM DC progenitor ratio of 1:20. The differentiation, maturation and function of the propagated DC were determined by characterizing their surface molecule expression and their capacity to instruct T cell activation/differentiation.
The results showed that addition of HpSC markedly blocked the differentiation of DC from BM precursors (most cells remained at the CD11b + CD11c - precursor stage). The incidence of CD11c + cells was 7% vs. 41.6% in normal BM DC culture (without HpSC). The presence of HpSC also prevented maturation of CD11c + DC, as evidenced by low expression of CD40, CD80 and CD86, but high expression of B7-H1. The inhibitory effect appeared to be mediated by soluble factor(s) produced by HpSC, since addition of the HpSC culture supernatant or transwell culture provided comparable inhibitory activity. Culture of allogeneic CD4 + T cells with HpSC-DC elicited a poor proliferative response in a 3-day MLR assay, with low IL-2 and IFN-γ production. Three-color staining of T cells stimulated by HpSC-DC showed that CD25 + Foxp3 + T cells were preferentially expanded, suggesting that HpSC-DC were capable of inducing Treg. In contrast, BM DC induced vigorous CD4 + T cell proliferation, and most of the activated CD4 + T cells were CD25 + Foxp3 - , associated with high levels of IFN-γ and IL-2 in the culture, indicating induction of Th1 cells. In conclusion, liver tissue cells such as HpSC markedly inhibit DC differentiation and maturation, suggesting that the tolerogenic property of liver DC may not be intrinsic, but is altered by microenvironmental factors in the liver.

Allograft rejection was also significantly accelerated when bm12 hearts were transplanted into CD28KO recipients (MST 14 days). To investigate the mechanisms of these findings, lymphocytes harvested at day 10 post-Tx from spleens and regional lymph nodes were assessed by flow cytometry for phenotypic differentiation of activation status, regulatory T cell markers and effector cell generation. The effector-to-regulatory cell ratio was 1.4, 3.8 and 5.2 in WT, CD28KO and B7DKO recipients, respectively. Cytokine production was evaluated by ELISPOT using donor-specific antigen stimulation of recipient splenocytes. The mean IFN-γ production was 173 spots per 500,000 splenocytes in WT, 206 in CD28KO and 702 in B7DKO. This study for the first time demonstrates the paradoxical role of B7-CD28 co-stimulation in fully allogeneic and class II-mismatched grafts and proposes a possible mechanism through the re-alignment of the effector-to-regulatory T cell ratio in favor of a highly alloreactive immune response.

The major concern regarding kidney donation has been whether the occurrence of hyperfiltration, on the background of the increasing prevalence of hypertension with aging and the decline in glomerular filtration rate (GFR) noted in some people as they get older, may put donors at a higher risk for progressive kidney disease. We have previously reported on donors >20 years after donation and found them to incur no excessive risk of hypertension or kidney disease. Herein, we report on renal and non-renal outcomes in the world's largest experience of kidney donors to date (n=302) who donated more than 35 years ago. Methods: Kidney donors were asked to fill out a survey detailing their medical history since donation and to obtain a physical exam and laboratory testing by their local physician. Results are expressed as mean±standard deviation (SD). The graph below depicts the cross-sectional distribution of the last serum creatinine available 35 years or more after donation, regardless of whether the donor is presently alive, though the majority are from currently alive donors.
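The original figure is not reproduced in this text-only version. As a stand-in, here is a minimal sketch of how such a cross-sectional serum creatinine distribution could be summarized; the values are illustrative, not the donor cohort's data.

```python
# Minimal sketch: summarizing a cross-sectional serum creatinine distribution
# as mean ± SD plus simple bins. Values are hypothetical placeholders.
import statistics

creatinine = [0.8, 0.9, 1.0, 1.1, 1.1, 1.2, 1.3, 1.4, 1.6, 2.1]  # mg/dl

mean = statistics.mean(creatinine)
sd = statistics.stdev(creatinine)
print(f"last serum creatinine: {mean:.2f} ± {sd:.2f} mg/dl (mean ± SD)")

bins = {"<1.0": 0, "1.0-1.4": 0, ">1.4": 0}
for value in creatinine:
    if value < 1.0:
        bins["<1.0"] += 1
    elif value <= 1.4:
        bins["1.0-1.4"] += 1
    else:
        bins[">1.4"] += 1
print(bins)
```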
Conclusion: In the longest follow-up of kidney donors to date, these data indicate that 4-5 decades of living with one kidney has no serious adverse renal effects and a prevalence of hypertension that is probably similar to that of the general population, considering the age of these donors.

… Before and after adjustment for comorbidity, 19 (7.8%) and 18 (7.5%) centers, respectively, met all criteria for review, but only 16 met criteria for review both before and after comorbidity adjustment. We conclude that failure to adjust for pre-existing recipient comorbidity results in grossly inaccurate estimation of expected GF per KTx center, more often resulting in expected GF that is too low. Using data that are not adjusted for comorbidity to judge the quality of Tx programs could encourage denial of access to high-risk patients.

Introduction: The purpose of our study was to examine temporal trends in the regionalization patterns and the center volume-outcomes relationship for lung transplantation in the United States over the past decade. A retrospective analysis of all adult single-organ lung transplants included in the Scientific Registry of Transplant Recipients for three consecutive time periods between 1997 and 2006 was performed. For each time period, lung transplant centers were divided into three groups based on each center's annual volume of the procedure (low-volume group = 1-17 procedures per year, medium-volume group = 18-30 procedures per year, high-volume group = greater than 30 procedures per year). One-year observed-to-expected patient death ratios were then calculated and compared for each group in each time period. A temporal analysis of the percentage of transplants being performed relative to center volume was also performed. Statistical comparisons were made using chi-square testing. Results: A total of 12,603 lung transplant procedures were included in the analysis. In Period 1, there was no significant difference in the one-year observed-to-expected patient death ratio of low-volume lung transplant centers when compared to high-volume centers (ratio 1.15 for low-volume centers vs. 0.79 for high-volume centers, p = 0.07). By Period 3, however, a significant relationship between center volume and outcomes had emerged (ratio 1.30 for low-volume centers vs. 0.79 for high-volume centers, p = 0.006). Over this same time period, the percentage of lung transplants within the United States performed at low-volume centers decreased significantly (from 43.3% of all lung transplants in Period 1 to 22.7% in Period 3, p < 0.0001), while the percentage performed at high-volume centers increased significantly (from 20.5% of all lung transplants in Period 1 to 51.3% in Period 3, p<0.0001). Conclusions: A significant relationship between center volume and patient outcomes has emerged for lung transplantation over the past decade. At the same time, the percentage of these procedures being performed at high-volume centers has increased. These findings suggest that regionalization patterns for a given procedure may be influenced by the presence or absence of a volume-outcomes relationship for that procedure (see the sketch below for how observed-to-expected ratios are computed).

Liver transplantation (LT) has emerged as one of the few curative treatment modalities for patients with hepatocellular carcinoma (HCC). However, the increase in the incidence of HCC recurrence due to immunosuppressants administered after LT is a serious issue.
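Referring back to the center-volume analysis above: an observed-to-expected (O/E) death ratio is simply observed deaths divided by risk-adjusted expected deaths, and differences in observed outcomes between volume groups can be tested with a chi-square test. A minimal sketch with hypothetical counts (the registry's actual numbers are not reproduced here):

```python
# Minimal sketch: O/E patient-death ratios per center-volume group plus a
# chi-square test on a 2x2 table of deaths vs. survivors. Counts are hypothetical.
from scipy.stats import chi2_contingency

groups = {
    "low-volume":  {"observed": 65, "expected": 50.0, "n": 400},
    "high-volume": {"observed": 79, "expected": 100.0, "n": 900},
}
for name, g in groups.items():
    print(f"{name}: O/E = {g['observed'] / g['expected']:.2f}")

low, high = groups["low-volume"], groups["high-volume"]
table = [
    [low["observed"], low["n"] - low["observed"]],
    [high["observed"], high["n"] - high["observed"]],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```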
We have recently proposed a novel strategy of adjuvant immunotherapy for preventing the recurrence of HCC after LT: intravenous administration of IL-2-stimulated natural killer (NK) cells extracted from the donor liver graft to liver transplant recipients. Since the immunosuppressive regimen currently used after LT reduces the adaptive immune components but largely maintains the innate components of cellular immunity, augmentation of the NK cell response might be a promising immunotherapeutic approach. We confirmed that IL-2-stimulated donor liver NK cells exhibited a significantly high level of TRAIL and showed vigorous cytotoxicity against an HCC cell line without cytotoxicity against normal cells. After obtaining approval from the ethical committee of our institute, we successfully applied this therapy to 13 cirrhotic patients with HCC from January 2006. The average number of NK cells administered to LT recipients at 4 days after LT was 304.2 ± 225.5 × 10^6 cells/body. The LT recipients were categorized as follows: (1) based on the Milan criteria, 8 recipients met the criteria while 5 did not, and (2) based on TNM stage, 2 recipients were categorized as pathological TNM stage I; 6, stage II; 4, stage III; and 1, stage IV. In our institute, the 2-year recurrence-free survival rates of LT recipients treated with and without this therapy were 100% and 71.3%, respectively. Kinetic studies revealed that in the early postoperative period, peripheral blood obtained from the treated LT recipients exhibited a significant improvement in cytotoxicity against an HCC cell line as compared to the untreated LT recipients (p < 0.01). Furthermore, flow cytometric analyses revealed that the frequency of TRAIL + NK cells increased remarkably in the peripheral blood of the treated LT recipients (p < 0.05). In conclusion, the administration of IL-2-stimulated donor liver NK cells contributes to the promotion of host anticancer activity and has the potential to regulate HCC recurrence after LT.

Abstract# 500 VEGF, a well-established angiogenesis factor, is expressed within allografts at high levels in association with acute and chronic rejection. In previous studies, we have reported that VEGF possesses potent proinflammatory properties, in part via its ability to mediate leukocyte trafficking into allografts. Recently, we discovered that VEGF mediates CD4+ and CD8+ T cell migration via interaction with its receptor KDR (also called VEGFR2). Blockade of T cell KDR significantly inhibits transendothelial migration. These observations suggest that KDR may be a novel T cell receptor for allogeneic lymphocyte recruitment. Here, we first examined the expression of KDR on peripheral human CD4+ and CD8+ T cells by FACS analysis. We found that ≤1% of circulating T cells express KDR, and its expression was at low levels on individual unactivated T cells. Further, we found that the expression of KDR on T cells increased markedly following activation with mitogen and following interactions with activated allogeneic endothelial cells. Induced expression of KDR on CD4+ and CD8+ T cells was at a similar level to that observed on endothelial cells. Therefore, KDR appears to be selectively expressed on T cells that traffic into allografts. To test the pathophysiological significance of these observations, we analyzed the expression of KDR in a total of 19 cardiac and 5 renal human allograft biopsies.
We correlated the expression of KDR with CD3+ T cell infiltrates and, by double immunofluorescence staining, determined co-expression of KDR on individual CD3+ T cell infiltrates. In cardiac allografts we found that KDR was expressed throughout the endomyocardium and was most notable on endothelial cells in all biopsies examined. By grid counting of 3-4 areas of each biopsy, we found that the mean number of CD3+ T cell infiltrates ranged from 4 to 79 cells/hpf (×600 magnification). By double staining, we noted that KDR was expressed on 29 ± 4% (mean ± SEM) of these CD3+ T cell infiltrates. Similarly, we found that KDR was co-expressed on CD3+ T cells within renal allografts. While infiltrates were more focal, again 30 ± 2% (mean ± SEM) of graft-infiltrating T cells expressed KDR. Collectively, these observations for the first time identify KDR as a novel receptor on allogeneic T cells. We suggest that intragraft VEGF may interact with T cell KDR to facilitate homing and recruitment of allospecific lymphocytes into allografts.

Interaction of Infiltrating CD8 + T Cells and Tissue Cells in Tolerant Liver Allografts: Using a TCR Transgene Approach. Guoping Jiang, 1 Qiwei Zhang, 1 Horng-Ren Yang, 1 Kathleen Brown, 1 John J. Fung, 1 Lina Lu, 1 Shiguang Qian. 1 1 Immunology and General Surgery, Cleveland Clinic, Cleveland, OH. Liver allografts are accepted in mice without a requirement for immunosuppression. The underlying mechanisms are not completely understood. We hypothesized that acceptance results from an abortive T cell response within the liver due to hyporesponsiveness or apoptosis. To test this, we examined the activation and fate of allo-Ag-specific CD8 + T cells following liver transplants (LTX) compared with heart transplants (HTX), which were acutely rejected. Following transplantation [B10 (H2 b )→C3H (H2 k )], CFSE-labeled CD8 + T cells (10×10^6) from Des TCR Tg mice (H2K b -specific TCR) were adoptively transferred into recipients. Animals were sacrificed two days following Des T cell administration for analyses of T cells in the grafts or draining lymph nodes (D-LN). Host CD45 + leukocytes rapidly infiltrated the grafts following LTX. Among the CD3 + population, ∼48% were CD4 + and ∼52% CD8 + . CD8 + T cells further increased thereafter in grafts and D-LN, associated with high IFN-γ production. CD8 + T cells in the liver grafts then rapidly declined to 36% by POD 14, and to 12% by POD 40. However, the incidence of CD4 + T cells remained high. CFSE dilution assays and ELISPOT showed active division of Des + T cells in liver allografts on both POD 7 and POD 40. These Ag-specific CD8 + T cells functioned well, as evidenced by IFN-γ production in response to allo-Ag. However, compared to HTX, the accumulation of Des + cells in grafts was significantly lower in LTX [9.6% (17×10^4/heart) vs. 0.59% (6.14×10^4/liver)] on POD 7, and dropped further to 0.14% (3.14×10^4/liver) by POD 40. The expansion of the adoptively transferred cohorts followed by their disappearance suggested elimination of Ag-specific CD8 + cells. To examine the role of the liver environment, graft CD45 - non-parenchymal cells (NPC) were isolated and tested for a regulatory effect on the Des + T cell response. Liver allografts showed significant expansion of CD45 - NPC, which were donor MHC class I + (H2 b+ ), B7-H1 + , TRAIL + and low for CD40 and CD86. These cells did not inhibit Des + T cell proliferation in response to B10 spleen stimulation, but significantly enhanced their death, which was dependent on B7-H1/TRAIL.
This was confirmed by using CD45 - NPC from B7-H1 -/- or TRAIL -/- mice. In conclusion, activated T cells in liver grafts may stimulate tissue cells to express inhibitory and death-inducing molecules, resulting in T cell death and graft acceptance.

FOXP3 + graft-infiltrating lymphocytes (GIL) have been detected in the rejecting allografts of transplant patients. FOXP3 is a marker for regulatory T cells (Treg). Published reports suggest that human FOXP3 can be upregulated following TCR stimulation without induction of regulatory function. To investigate whether FOXP3 + GIL during acute rejection are Treg, we analyzed the phenotype of GIL harvested from acutely rejected non-human primate kidney allografts. METHODS: Renal allografts with histologically confirmed acute rejection were harvested at the time of necropsy from macaques that had undergone experimental transplantation. Following digestion of kidney fragments using collagenase, GIL were isolated using Lymphocyte Separation Media and cryopreserved. Axillary lymph nodes (Ax LN) were also isolated, either at the time of necropsy or by biopsy. For SEB-stimulated lymphocytes, Ax LN cells were stimulated in the presence of 25 ng/ml SEB for 5-6 days. To perform intracellular interferon-gamma (IFN-γ) analysis, cells were stimulated with PMA and ionomycin in the presence of Brefeldin A for 5-6 hours at 37°C. Samples were prepared for flow cytometry by first staining for extracellular antigens. Cells were then fixed and permeabilized (kit from eBioscience) prior to intracellular staining for FOXP3, Ki67 and IFN-γ. RESULTS: The percentage of FOXP3 + cells among CD3 + GIL was similar to that found in Ax LN (Table 1). The majority of these cells were CD4 + . While 80% of the CD3 + /CD4 + /FOXP3 + population of both Ax LN and GIL were CD25 + , a significantly higher frequency of CD39 + and Ki67 + cells was found in GIL (p < 0.01; Student's t-test). Similar levels of CD39 and Ki67 expression were found in SEB-stimulated lymphocytes. Unlike the SEB-stimulated lymphocytes, few IFN-γ-producing cells were demonstrated following PMA/ionomycin stimulation of GIL. CONCLUSION: Our current data indicate that the majority of FOXP3 + GIL from acutely rejecting renal allografts are recently activated CD4 T cells that lack effector function. This suggests that FOXP3 + T cells within rejecting allografts may indeed be Treg.

Findings in mouse models of transplantation often fail to translate well to humans. Three variables may account for the discrepancy: (1) evolutionary divergence between mice and humans, (2) influence of infection history on alloimmunity, and (3) use of highly inbred strains of laboratory mice. Here, we investigated whether the use of inbred mouse strains skews the rejection phenotypes and their response to treatment due to decreased genetic diversity and/or fixation of undesirable genetic loci, known as inbreeding depression. We examined heterotopic cardiac allograft survival in outbred and inbred mouse populations in the presence or absence of immunosuppression. In the absence of immunosuppression, heart transplantation within or between outbred stocks of mice (n = 35) resulted in three distinct rejection phenotypes that resemble accelerated (1-4 days), acute (8-25 days), and chronic rejection (>75 days), respectively (the survival-time windows are encoded in the sketch below). In contrast, all fully allogeneic grafts transplanted between inbred mice (n = 12) were rejected acutely (7-13 days), as were historical controls (n > 50).
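The three survival-time windows just defined can be written down directly; a minimal sketch in which the cut-offs are exactly those given above, and days falling between the named windows are left unclassified because the abstract does not assign them:

```python
# Minimal sketch: classifying graft survival times into the rejection
# phenotypes defined above (accelerated 1-4 d, acute 8-25 d, chronic >75 d).
def rejection_phenotype(days: int) -> str:
    if 1 <= days <= 4:
        return "accelerated"
    if 8 <= days <= 25:
        return "acute"
    if days > 75:
        return "chronic"
    return "unclassified"  # gaps (5-7, 26-75) are not assigned by the abstract

for day in (3, 12, 90, 30):
    print(day, rejection_phenotype(day))
```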
The accelerated phenotype, present in 29% of outbred-to-outbred transplantations, was characterized by extensive hemorrhagic necrosis of the heart with thrombosis, neutrophil margination and neutrophilic arteritis, and did not correlate with donor:recipient MHC II disparity. Immunosuppression with T cell costimulation blockade did not prevent accelerated rejection (incidence = 27% in the treated group, n = 26) but did convert the acute rejection phenotype into long-term allograft survival in all groups studied. The same accelerated phenotype was observed if transplantation was performed from outbred to inbred mice (incidence = 39%; n = 23) but could not be duplicated if inbred-to-outbred transplantation was performed (n = 20). Finally, C3 depletion with cobra venom factor abrogated the accelerated rejection phenotype in outbred-to-outbred transplantations (n = 16), suggesting a role for complement in the pathogenesis of this phenotype. In summary, our data (1) indicate that the use of outbred mouse stocks may uncover clinically relevant rejection phenotypes not observed in inbred mouse strains, and (2) underscore the importance of the donor background in determining the phenotype of rejection. Outbred mouse stocks may provide a platform to uncover MHC-unlinked genetic loci that play an important role in the outcome of solid organ transplantation.

Th17 Are Limited in Their Ability To Reject Allografts.

Elderly recipients represent the most rapidly growing segment of patients on the waiting list. However, little is known about age-dependent alterations of the immune response in organ transplantation. We examined age-dependent T-cell functions in a transgenic mouse transplant model. Effector T-cell phenotype, function, cytokine production and regulatory T-cell function were analyzed in 3- and 18-month-old B6 mice. In an in vivo transplant model, BL/6 nude mice were reconstituted with 2×10^6 young or old transgenic alloantigen-specific CD4 + T cells and engrafted with bm12 skin grafts. T-cell phenotype and cytokine secretion were sequentially analyzed in all lymphatic compartments. Splenocytes of naïve old B6 mice contained significantly higher frequencies of T-cells with an effector/memory phenotype (CD4 + CD44 high CD62L low and CD8 + CD44 high CD62L low ; p<0.005). In vitro proliferation and IFN-γ production were significantly reduced in aged mice, indicating an impaired T-cell response with increasing age as assessed by MLR (p<0.005) and ELISPOT (p<0.001). In parallel, regulatory functions remained age-independent, as alloantigen-specific CD4 + CD25 + FoxP3 + T-cells isolated from sensitized old mice demonstrated a dose-dependent, well-preserved suppressor function. Next, we tested the age-dependent alloantigen-specific CD4 + T-cell function in a transgenic skin transplant model: age did not significantly impact rejection kinetics (young vs. old: 10.1 vs. 14.3 days, n.s.). However, T-cell migration and activation were significantly different: fewer activated CD4 + CD25 + and effector/memory phenotype T-cells (CD4 + CD44 high CD62L low ) were found in recipient spleens (p<0.05) and draining lymph nodes (dln) (p<0.05) after transfer of old T-cells. Chemokine receptor staining revealed fewer CXCR3 + and CCR7 + T-cells in dln following the transfer of old T-cells (total cell numbers ×10^4: CXCR3 + : 10.9±4.2 vs. 0.95±0.2; CCR7 + : 4.7±1.0 vs. 0.35±0.2; p<0.05). This was paralleled by reduced intragraft T-cell infiltration as observed by immunohistochemistry.
In summary, naïve elderly mice showed an increased frequency of effector memory T-cells but an overall impaired T-cell response. Regulatory T-cell function remained preserved. In vivo, allospecific CD4+ T-cell activation and migration were impaired in elderly transplant recipients.

Background: Sensitized transplant recipients may undergo an "accelerated" form of rejection, which is mediated by T cell-dependent mechanisms. These patients often experience an increased rate of early rejection episodes, which are difficult to control with currently used immunosuppressive agents. Methods: In our model of cardiac graft rejection in sensitized recipients, B6 mice are first challenged with B/c skin, followed 40-60 days later by a B/c heart transplant (HTx). Unlike in naive hosts, HTx rejection and alloreactive CD8 activation in this model are CD154 blockade-resistant. We first performed a systematic analysis at the intragraft transcriptional level by microarray to identify disparities in local immune responses in naive vs. sensitized hosts. Aiming to improve the efficacy of costimulation blockade in the sensitization setting, we then determined the role of CD4 T cells in the costimulation blockade-resistant alloimmune response by using CD4-depleting (GK1.5) vs. CD4-blocking (YTS177) Ab, in conjunction with CD154 blockade (MR1). Results: HTx harvested from groups of naïve (day 4-6), sensitized (day 2-4), control Ig- or MR1 Ab-treated mice (n=3/gr) were subjected to microarray analysis. MR1 treatment suppressed HTx expression of proinflammatory genes (IL-1β, IL-6, TNF-α) and T cell-targeted chemokines (RANTES, Mig, CXCL10) early after HTx in naïve, but not sensitized, recipients. Five groups of sensitized mice were treated at the time of HTx with: (1) … (5). CTL activation was determined by FACS phenotyping at days 10 and 30. Simultaneous blockade of CD154 costimulation and CD4 help, but not single blockade with MR1 Ab or anti-CD4 Ab, was required to inhibit peripheral alloreactive CD8 activation in sensitized mice. Additionally, CD8 activation in the absence of CD4 help showed a defective cytotoxic molecule profile, with suppressed perforin but upregulated granzyme B expression at the graft site. Conclusion: CD154 blockade-resistant CD8 activation is critically dependent on CD4 T cells. This study provides a novel immunological basis for studying the potential synergy between adjunctive CD4- and CD154-targeted therapies to control accelerated graft rejection in sensitized hosts.

… Mediators. G. Einecke, L. G. Hidalgo, P. F. Halloran. Department of Medicine, University of Alberta, Edmonton, Canada. The hallmarks of T cell-mediated rejection (TCMR) are interstitial inflammation and tubulitis. The mechanisms of tubulitis and epithelial deterioration during TCMR are unknown. We previously showed that tubulitis in mouse allografts is independent of cytotoxic molecules (GzmA/B, Prf) and is preceded by molecular changes with loss of epithelial genes, reflecting epithelial dedifferentiation. Human TCMR is associated with loss of the same epithelial genes and re-expression of embryonic pathways (Wnt, Notch). We hypothesized that TCMR is mediated through soluble factors released by effector T cells or macrophages in the interstitium, and that supernatants of effector T cells would simulate these changes in epithelial cultures. We established an in vitro model in which cultured primary human renal epithelial cells are incubated with supernatants from effector T cell/monocyte co-cultures.
The transcript changes in this model, analyzed by microarrays, closely simulated those in human and mouse TCMR (Fig. 1), with loss of epithelial transcripts, activation of Wnt/Notch pathways, and increased expression of Ifng-inducible and injury-related transcripts previously defined in our mouse model. Some of these changes were reproduced by incubation of epithelial cells with Ifng or Tgfb. The in vitro model identified additional epithelial transcript changes not previously identified in vivo (not affected by Ifng or ischemic injury, not expressed in T cells, macrophages, B cells, or NK cells). Expression of these transcripts (n = 305) was highly altered in human TCMR compared to nonrejecting biopsies among 177 human renal allograft biopsies and distinguished TCMR from antibody-mediated rejection in a hierarchical cluster analysis. Thus we have established an in vitro model that closely simulates the epithelial events during human TCMR and confirms that these changes are independent of direct contact with inflammatory cells, supporting the hypothesis that interstitial effector T cells mediate allograft deterioration by soluble mediators. Together with previous mouse and human data, these results provide the first in vitro model of the epithelial consequences of TCMR.

Objective. C4 split-product deposition on HLA antigen-coated microparticles ([C4d]FlowPRA) was previously shown to be a specific marker of C4d-positive antibody-mediated rejection (AMR). The objective of this study was to assess the predictive value of [C4d]FlowPRA reactivity in a cohort of non-biopsied patients with stable graft function during the first year. Methods. A total of 133 kidney transplant recipients were enrolled (inclusion criteria: functioning graft at 12 months; prospective collection of sera taken before and at 1-3, 6, and 12 months after transplantation). Included patients were serially screened for humoral panel reactivity applying [IgG] and [C4d]FlowPRA screening. Results. Fifty-four of the 133 included recipients had stable graft function within the first year and were not subjected to diagnostic renal biopsy. In this particular patient group, detection of complement-fixing HLA reactivity tended to be less frequent than in the 79 patients with biopsied graft dysfunction (≥10% [C4d]FlowPRA before transplantation: 9% vs. 18% of recipients, P=0.2; ≥10% [C4d]FlowPRA after transplantation: 11% vs. 24%, P=0.06). In line with our previous results, within the group of biopsied patients, pre- and/or post-Tx [C4d]FlowPRA reactivity was tightly associated with the immunohistochemical detection of peritubular capillary C4d deposition (P=0.005), reflecting ongoing AMR. Remarkably, in initially stable patients, detectable [C4d]FlowPRA reactivity was not associated with inferior long-term outcomes. Within this patient group, recipients with and without (pre- and/or post-transplant) C4d-fixing anti-HLA reactivity did not differ with respect to 4-year allograft survival (P=0.4), 4-year serum creatinine levels (1.5 vs. 1.5 mg/dl; P=0.7), and proteinuria at 4 years (0.2 vs. 0.18 g/24h; P=0.8). Similar results were obtained for a comparison of [IgG]FlowPRA-positive vs. -negative subjects. Conclusion. Our data suggest that a considerable number of patients with initially stable graft function may have excellent long-term graft function despite serologically detectable levels of (complement-fixing) alloreactivity. For these antibody-positive recipients, a potential role of graft accommodation may be speculated.
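A note on the group comparisons just reported: positivity rates such as 9% of 54 non-biopsied vs. 18% of 79 biopsied recipients are proportions from modest samples, for which Fisher's exact test is a natural choice. A minimal sketch; the integer counts are back-calculated approximations from the reported percentages, not figures taken from the paper:

```python
# Minimal sketch: Fisher's exact test on pre-transplant [C4d]FlowPRA positivity.
# Counts are approximations back-calculated from the reported percentages.
from scipy.stats import fisher_exact

pos_stable, n_stable = 5, 54        # ~9% of the 54 non-biopsied recipients
pos_biopsied, n_biopsied = 14, 79   # ~18% of the 79 biopsied recipients

table = [
    [pos_stable, n_stable - pos_stable],
    [pos_biopsied, n_biopsied - pos_biopsied],
]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```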
The Immunoproteasome Subunit Beta 10 as a Novel Peripheral Blood and Intragraft Biomarker of Chronic Antibody-Mediated Allograft Rejection in Clinical Transplantation. Joanna Ashton-Chess. 1 In an attempt to identify non-invasive biomarkers of specific histological scarring, we compared publicly available gene sets derived from microarray studies of human renal transplant biopsies published in the literature with our own microarray data derived from studies of rat heart allografts. In this way we identified an immunoproteasome subunit (proteasome subunit beta 10, PSMB10) as a potentially interesting candidate. PSMB10 is one of three members of the immunoproteasome that are induced by interferon gamma. Messenger RNA profiling in renal transplant biopsies (n = 52) with normal histology, interstitial fibrosis and tubular atrophy, calcineurin inhibitor toxicity, transplant glomerulopathy or chronic antibody-mediated rejection (Banff 2005) revealed PSMB10 to be strongly and significantly increased in chronic antibody-mediated rejection vs. the other histological diagnoses. Receiver Operating Characteristic (ROC) curve analysis showed that PSMB10 mRNA could diagnose chronic antibody-mediated rejection with an AUC of 0.99, a sensitivity of 1.0 and a specificity of 0.95 (see the ROC sketch below). Moreover, PSMB10 mRNA was significantly increased in the PBMC (n = 24) of patients with chronic antibody-mediated rejection compared to those with normal histology. ROC analyses revealed an impressive AUC of 1.0, with all patients being correctly classified. Similar results were also observed in a rat allograft model, where PSMB10 was significantly increased at day 100 post-transplantation in both the heart allograft and the PBMC of animals presenting chronic transplant vasculopathy vs. syngeneic grafts. Moreover, inhibition of the proteasome by administration of Velcade® at 0.1 mg/kg every other day for the first 20 days post-transplantation significantly and dose-dependently prolonged allograft survival (MST 31.7 days in Velcade-treated vs. 6.3 days in untreated animals). Together, our data point toward PSMB10 as a blood and intragraft biomarker of chronic antibody-mediated rejection as well as a potential therapeutic target. Furthermore, our results suggest that using a threshold of PSMB10 in the blood could help in guiding the decision to biopsy in the clinic.

BAFF Monitoring after B-Cell Depletion Therapy for Acute Renal Transplant Rejection. Valeriya Zarkhin, 1 Snehal Mohile, 1 Li Li, 1 Jonathan Martin, 1 Minnie Sarwal. 1 1 Pediatrics, Stanford University, Stanford, CA. Introduction: The objective of this study was to investigate the interaction between B-cell activation factor of the TNF family (BAFF) levels and circulating B-cell repopulation in pediatric patients with acute kidney transplant rejection treated with the B-cell-depleting agent rituximab. Methods: 10 pediatric patients (3-23 yrs) with biopsy-proven B-cell-positive AR were treated with steroids and rituximab (4 × 375 mg/m²/dose/week). All patients were followed up for 12 months. Peripheral blood CD19 cells and donor-specific antibodies (DSA) were monitored monthly. The serum level of BAFF was measured by ELISA at AR and at 1, 3, 6, and 12 months post-AR treatment and correlated with clinical outcomes. Results: Complete depletion of circulating and intragraft B-cells was observed with rituximab, with improvement in AR grade in all patients. The median time of peripheral B-cell repopulation was 5 months (range 3-12 months; Fig. 1A).
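Returning briefly to the PSMB10 ROC analysis above: ROC analysis of a single continuous marker against a binary diagnosis reduces to ranking expression values. A minimal sketch, assuming scikit-learn and purely illustrative expression values; the reported AUC of 0.99 comes from the study's own data, not from this sketch.

```python
# Minimal sketch: ROC/AUC for a single continuous biomarker vs. a binary
# diagnosis, in the style of the PSMB10 analysis. Values are illustrative.
from sklearn.metrics import roc_auc_score, roc_curve

labels = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # 1 = chronic antibody-mediated rejection
psmb10 = [8.2, 7.9, 9.1, 8.5, 3.1, 2.8, 4.0, 3.5, 5.9, 3.3]  # arbitrary units

print(f"AUC = {roc_auc_score(labels, psmb10):.2f}")
fpr, tpr, thresholds = roc_curve(labels, psmb10)
for f, t, th in zip(fpr, tpr, thresholds):
    print(f"threshold {th:.1f}: sensitivity {t:.2f}, specificity {1 - f:.2f}")
```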
No correlation was found between pre-treatment peripheral B-cell number and the B-cell repopulation time (r=0.59, p=0.09). BAFF levels rose significantly with B-cell depletion, with maximum values at 3 months post-treatment (7.5-fold increase, p=0.0001), and returned to pre-treatment levels, with B-cell recovery, at 12 months (Fig. 1B). Serum BAFF levels correlated positively with B-cell depletion >6 months (r=0.91, p=0.004; Fig. 1B). A lack of depletion of DSA I, but not DSA II, correlated with higher BAFF levels (r=0.99, p=0.007). The timing of B-cell repopulation and depletion of DSA I may be dependent on serum BAFF level. Anti-BAFF treatment may be considered in addition to rituximab or standard immunosuppressive treatment protocols in patients with persistent and/or antibody-mediated rejection.

Background: The development of donor-specific HLA antibodies (DSA) post-transplant has been associated with graft failure. We have shown in a longitudinal study that increases in DSA may precede rejection by months. This retrospective analysis evaluates changes in maintenance immunosuppression (MI) and the appearance of DSA in stable transplant recipients. Methods: Sera from stable renal transplant recipients were collected at 4-6 month intervals and tested for the presence of DSA. The types and doses of immunosuppression were correlated with the appearance of DSA. Two hundred eighty stable renal transplant recipients who received either a deceased or a living donor kidney were monitored post-transplantation for the development of DSA. Patients have been followed for 1 to 7 years and had a minimum of 4 serum samples analyzed. All recipients received anti-lymphocyte induction therapy. Maintenance immunosuppression consisted of a calcineurin inhibitor, prednisolone, and mycophenolic acid. HLA single antigen beads analyzed on the Luminex instrument were used to establish donor specificity of the antibodies. A chart review was undertaken to determine the doses of MI post-transplantation. All MI was managed by the transplant team and changed according to clinical indications without regard to DSA. Results: Of the 280 patients monitored, 37 developed DSA post-transplantation with a functioning graft. DSA was against HLA class II antigens in 26 of 37 (70%), class I antigens in 8 of 37 (22%), and against both class I and II antigens in 3 of 37 (8%). DSA against class II was against DQ in all except one case. In the majority of the recipients, the appearance of DSA was preceded by dose reduction of the MI, either calcineurin inhibitor or mycophenolic acid or both. Conclusions: Our data show that DSA developed predominantly against HLA class II antigens and that the appearance of DSA was often preceded by reduction of one or more of the MI. These data show the importance of monitoring DSA when MI is decreased in a stable allograft recipient.

Antibody Production and Antigen Presentation Are Directly Inhibited by Mycophenolate Mofetil. Anat R. Tambur, 1 Joe Leventhal, 1 Nancy D. Herrera, 1 Joshua Miller. 1 1 Division of Organ Transplantation, Northwestern University, Chicago, IL. Immunosuppressive medications are primarily designed to target T cell proliferation. Mycophenolate mofetil (MMF) exerts its effect by inhibiting de novo synthesis of guanine, a DNA building block. We, and others, have previously shown that MPA (the active metabolite of MMF) affects the differentiation of monocytes into dendritic cells (DC).
We further demonstrated that cell-surface receptors associated with antigen uptake and antigen processing and presentation (CD83 and CD205) are down-regulated when cells are matured in the presence of MPA. This phenotype translated into decreased uptake of alloantigens and reduced stimulation of T cells. We concluded that MMF also inhibits cell functions requiring mRNA synthesis. We now present data regarding the role of MPA in the maturation and function of B-lineage cells. PBMCs from 10 subjects were cultured in the presence of CpG, IL-2, IL-10 and CD40L for 5 days to induce memory B and plasma cell maturation in vitro. Cultures were performed in the presence or absence of MPA (50 µg/µl) for the length of the incubation period. In vitro stimulation of B cells increased the memory population (CD19+ CD27+) from 2.8 ± 1.1% to 7 ± 2%. Similarly, plasma cells were increased from 4.4 ± 1.3% to 6.4 ± 1.8%. The addition of MPA to the culture inhibited stimulation of both memory and plasma cells (4.7 ± 3%, p=0.01, and 4.7 ± 2.3%, p=NS, respectively). We further analyzed the effects of MPA on antibody secretion using an ELISA (measuring soluble antibodies) as well as a B-cell ELISPOT assay (assessing the number of B cells that produce antibodies). While an expected increase in OD values was observed for stimulated samples compared with non-stimulated samples, a significant decrease was observed when stimulation occurred in the presence of MPA (non-stimulated cells: 0.249 ± 0.28; stimulated cells: 0.391 ± 0.33; MPA-treated stimulated cells: 0.279 ± 0.43; p<0.0005). The number of antibody-producing cells was also significantly lowered when cultures were performed in the presence of MPA (a mean of 106 cells counted for stimulated cells compared with 1-2 cells for non-stimulated and MPA-treated stimulated cells). To our knowledge, this is the first time that in vitro experimental data document the inhibitory effect of MPA on B-lineage cells and antibody secretion, although it has been clinically known for some time. These results confirm our previous observations regarding the effects of MMF on non-proliferating immune cells.

A Non-Allogeneic Stimulus Triggers the Production of De Novo HLA and MICA Antibodies. Luis E. Morales-Buenrostro, 1,2 Lluvia A. Marino-Vazquez, 1 Anh Nguyen, 3 Paul I. Terasaki, 2 Josefina Alberu. 1 1 Nephrology and Transplantation, Instituto Nacional de Ciencias Medicas y Nutricion Salvador Zubiran, Mexico City, DF, Mexico; 2 Terasaki Foundation Laboratory, Los Angeles, CA; 3 One Lambda Inc., Canoga Park, CA. Background: In a previous study, we found that healthy people developed HLA Abs after immunization against hepatitis B virus. The aim of this prospective study was to establish whether stimulation with influenza vaccine is capable of triggering the production of HLA and MICA Abs. Methods: We determined the presence of HLA and MICA Abs (de novo and preformed Abs) in 3 groups of patients vaccinated against influenza: A) 42 healthy adults, B) 40 ESRD patients, and C) 25 TR. Additionally, we followed 22 healthy unvaccinated people without exposure to sensitizing factors: D) control group. Serum samples were collected at baseline (pre-influenza shot), at 1 week, and monthly up to 6 months after immunization. HLA Abs were assessed with LABScreen Single Antigen beads for Luminex. All samples from each patient were tested simultaneously. A luminescence value higher than 500 was considered positive only if it was also at least 3 times the baseline value.
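The positivity rule just stated maps directly to code; a minimal sketch in which the 500-unit threshold and the 3-fold baseline requirement are taken from the text, and everything else is illustrative:

```python
# Minimal sketch: the antibody-positivity rule described above. A bead is
# positive only if luminescence exceeds 500 AND is at least 3x the baseline.
def is_positive(value: float, baseline: float) -> bool:
    return value > 500 and value >= 3 * baseline

print(is_positive(1800, 400))  # True: >500 and >=3x baseline
print(is_positive(900, 400))   # False: under 3x baseline
print(is_positive(450, 100))   # False: under the 500 cut-off
```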
We analyzed the data using chi-square, one-way ANOVA, and logistic regression. Results: The table shows the types of Abs in each group. Interestingly, we found preformed Abs across all four groups, including the control group (which was free of any known sensitizing factors). The proportion of de novo Abs was higher in groups B and C. Multivariate analysis showed that the only independent factor associated with the development of de novo Abs was the presence of preformed Abs. We observed a nonspecific immunologic response triggered by an external stimulus that was not necessarily associated with the vaccine in previously sensitized people.

… Introduction. Decisions about the minimization and ultimate withdrawal of immunosuppression (IS) would be facilitated by the identification of biomarkers associated with operational tolerance (OT). Methods. As part of an ITN/NIH-supported study, tolerant kidney transplant recipients (off all IS for >1 yr with stable function, N=22) were compared to recipients with stable function on IS (SIS, N=34), recipients with chronic allograft nephropathy (CAN, N=20), and healthy volunteers (HV, N=18). PBMC, whole-blood total RNA, and urine samples from each group were examined using flow cytometry, microarrays, and RT-PCR, respectively. Results. Analysis of microarrays revealed significantly higher expression of B cell differentiation genes in tolerant recipients compared to the SIS and CAN groups. Consistent with this finding, tolerant recipients also displayed higher numbers of naïve B cells in peripheral blood and increased expression of CD20 in urine relative to the SIS and CAN groups. No differences in Treg or genes associated with regulatory cells were observed in tolerant recipients relative to other groups. These analyses failed to demonstrate significant differences between tolerant recipients and HVs, although support vector machine learning methods suggested potential differences in a number of genes, including NFAT and calcineurin. Finally, relative to tolerant patients, those with CAN showed decreased numbers of T and NK cells and expressed lower levels of genes associated with immune cell activation in peripheral blood. Conclusions. Differences in B cell numbers may be useful in identifying tolerant renal transplant recipients or those predisposed to developing tolerance and could potentially provide insights into the mechanisms of tolerance.

Erythrocyte … Development of antibodies (Abs) to mismatched donor HLA antigens has been associated with acute and chronic rejection. Complement activation and C4d deposition have been correlated with humoral rejection of allografts. However, the utility of C4d staining in LTx has been controversial. A recent study (Arthritis Rheum. 2004 Nov;50(11):3596-604) showed a strong correlation between erythrocyte-bound C4d (E-C4d) and the diagnosis and monitoring of SLE. The goal of our study was to determine the utility of measuring E-C4d in the diagnosis of humoral rejection following human LTx. 26 LTx recipients were analyzed post-LTx for E-C4d using FACS of RBCs incubated with anti-C4d (Quidel) followed by FITC-goat anti-mouse antibody. 10 normal subjects were also analyzed. Serum was analyzed for the development of anti-HLA Abs by solid-phase assays and for the presence of autoAbs to Kα1 tubulin and collagen V (ELISA). Biopsies from 11 patients were stained immunohistochemically for C3d deposition. A summary of the results is presented in Table 1.
Table 1 (fragment): infection = 2/10; AR = 0/10; % E-C4d in normals = 10.35%; MFI in normals = 5.46. MFI = mean fluorescence intensity; AR = acute rejection; DSA = donor-specific Abs.

16 out of 26 patients showed a significant increase in the % bound E-C4d (P<0.05) as compared to controls. 11/16 had anti-HLA Abs and 13/16 had autoAbs, frequencies significantly different from those with low E-C4d (P=0.005). Staining of the biopsies showed C3d deposition in 6 recipients with increased E-C4d. All 4 patients with acute humoral rejection had elevated E-C4d. We conclude that there is a significant correlation between an increase in % E-C4d in LTx recipients and the development of Abs to either HLA antigens or auto-antigens during the post-LTx period. Biopsies from patients with increased E-C4d showed deposition of C3d in the allografts. Preliminary data suggest that measurement of E-C4d using the non-invasive method of flow cytometry may be of value in monitoring LTx patients for humoral rejection.

Innate Immunity: Chemokines, Cytokines

Innate immunity is emerging as an important initiator and modulator of the adaptive immune response. In the setting of transplantation, ischemia-reperfusion injury and tissue trauma appear to potentiate the alloimmune response. One of the mechanisms through which the innate immune system modulates an adaptive immune response is dendritic cells (DC). In this study, we examined DC activation in a mouse model of skin transplantation by monitoring the expression of MHC II and the p40 chain of the proinflammatory cytokine IL-12. p40 expression was detected in live cells using the Yet40 reporter mouse, in which a transgene for yellow fluorescent protein (YFP) was placed downstream of the endogenous IL-12p40 gene, thus faithfully "reporting" p40 expression. Skin from C57BL/6 or BALB/c donors was grafted onto the dorsal thorax of C57BL/6.Yet40 mice. Draining and non-draining lymph nodes (LN) were harvested at 24 hours and examined for YFP expression by fluorescence microscopy. A marked increase in the number of YFP-expressing DC was observed in draining LN in both syngeneic and allogeneic graft recipients. Surprisingly, YFP-expressing DC also increased in non-draining LN when compared to non-transplanted controls. Similarly, DC in draining and non-draining LN showed higher MHC II expression than those in non-transplanted controls. Upregulation of MHC II was highest at 24 hours and decreased significantly by 48-72 hours. To assess whether DC activation in non-draining LN was functionally significant, we monitored the activation of adoptively transferred ovalbumin-specific OT-II TCR transgenic T cells in response to footpad antigen challenge in mice with or without a syngeneic skin transplant on the contralateral upper thorax. OT-II T cells in transplanted animals proliferated approximately 5- to 10-fold better, and a higher percentage of the OT-II cells produced IL-2, than those from non-transplanted animals. Thus, local surgical trauma results in a widespread, time-limited, functional activation of DC that appears to act as a partial adjuvant for T cell responses. Together, these data suggest that surgical trauma may incite a systemic barrier to transplantation via the activation of DC, and therapeutic interventions that reduce surgical trauma and DC activation may help to improve the survival of transplanted grafts.

Tolerance of Cardiac Allografts: Studies with CX3CR1-Deficient Mice. Takaya Murayama, 1 Katsunori Tanaka, 1 Takuya Ueno, 1 Mollie Jurewicz, 1 Indira Guleria, 1 Paolo Fiorina, 1 Jesus Paez, 1 N. Rex Smith, 2
Mohamed Sayegh, 1 Reza Abdi. 1 1 Transplantation Research Center, Renal Division, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; 2 Department of Pathology, Massachusetts General Hospital, Harvard Medical School, Boston, MA. Although donor/tissue dendritic cells (dDC) have long been known to play a key role in mounting alloimmune responses, their generation, trafficking and role in tolerance have not been rigorously examined. We have used B6.FVB-Tg(Itgax-DTR/EGFP)57 mice, which carry GFP linked to the CD11c promoter. Using these mice as donors for heart allograft transplantation provided us a unique model to study dDC post-transplantation. Our trafficking data indicate that there is rapid migration of dDC into the spleen (3 hours post-transplantation) but not the lymph nodes, and that dDC were unexpectedly detected in the spleens of recipients long after rejection of the heart allografts, suggesting that dDC can escape the immunosurveillance of the host immune system. Our data also show that dDC proliferate in the lymphoid tissue of the recipients and co-express recipient MHC class II molecules. We then show that the CX3CR1 pathway constitutively regulates the generation of heart tissue DC. As compared to WT hearts, CX3CR1 -/- hearts contain lower numbers of DC, and transplanting CX3CR1 -/- donor hearts into WT BALB/c mice led to significant prolongation of allograft survival without immunosuppression (MST 8 vs. 17 days, respectively). Increasing CX3CR1 -/- heart DC by implanting donors with FLT3-producing hybridoma cells restored the tempo of rejection. Unexpectedly, induction of long-term survival with anti-CD154 blockade (MR1) and CTLA-4 Ig (but not low-dose rapamycin) was abrogated when CX3CR1 -/- hearts were used as donors, with concomitantly fewer Tregs in the CX3CR1 -/- heart allografts as compared to WT. Furthermore, co-transplanting hearts from WT and CX3CR1 -/- donors into the same recipient treated with MR1 resulted in significant prolongation of CX3CR1 -/- heart allograft survival. Depleting the dDC of heart donors with diphtheria toxin prior to transplantation also markedly worsened chronic rejection in the recipients at day 100 post-transplantation. Our data indicate that, in contrast to the widely accepted dogma, the presence of donor DC in graft tissue is not only central to allograft rejection but is also necessary for the induction and maintenance of peripheral tolerance.

Local C5a Interaction with C5aR on DCs Modulates DC Function, Subsequently Up-Regulating Allospecific T Cell Responses. Qi Peng, Ke Li, Steven H. Sacks, Wuding Zhou. MRC Centre for Transplantation, Department of Nephrology and Transplantation, King's College London, Guy's Campus, London, United Kingdom. The innate system of immunity plays an important role in ischemia-reperfusion injury and allograft rejection. The early stages of inflammatory processes are accompanied by complement activation. One biological consequence of this activation is the release of the potent inflammatory anaphylatoxins C3a and C5a, which have been reported to regulate a range of inflammatory responses. We previously reported that DCs express C3aR and C5aR, and that C3a-C3aR interaction has a positive impact on murine BM DCs in terms of activation phenotype and capacity for Ag uptake and allostimulation. However, the role of C5a in modulating DC function remains unclear.
The aim of this study was to investigate the role of local C5aR signalling in modulating murine BM DC function and the subsequent regulation of the allospecific T cell response. We first evaluated whether C5a-C5aR interaction could result from local expression of these factors. Our results showed that C5aR mRNA was detected by RT-PCR in WT DCs at different stages of DC culture, and C5a was detected by ELISA in the culture supernatants from different stages of DC culture. We next determined whether C5a-C5aR interaction modulates DC function in allospecific T cell stimulation in vitro and in vivo. We found that BM DCs cultured from C5aR-/- mice or treated with a C5aR antagonist (C5aRa, W54011) exhibited a less activated phenotype (producing significantly less IL-12 and more IL-10 in response to LPS stimulation); both C5aR-/- and antagonist-treated DCs (LPS stimulated) showed a reduced capacity to stimulate naïve alloreactive T cells, as measured by IFN-γ production and thymidine uptake. As regards the interaction in vivo, 10 days after i.p. administration of C5aRa-treated DCs into allogeneic mice, ex vivo mixed lymphocyte reactions showed that CD4+ T cells from those recipients had reduced thymidine uptake but increased IL-4 production compared with recipients of untreated DCs. Conversely, DCs treated with a C5aR agonist (C5a) exhibited a more activated phenotype (producing more IL-12 and less IL-10) and were more potent in allospecific T cell stimulation. Our findings demonstrate that murine BM DCs can express C5aR and that C5a can be generated locally; C5a-C5aR interaction up-regulates murine BM DC activation and their allostimulatory capacity. Thus, targeting C5a-mediated signalling may be able to prevent allograft injury.

Role of TNFα in Early Chemokine Production and Leukocyte Infiltration into Heart Allografts. Daisuke Ishii, 1 Austin D. Schenck, 1 Robert L. Fairchild. 1 1 Immunology, Glickman Urological and Kidney Institute, Cleveland Clinic, Cleveland, OH. OBJECTIVES: The acute-phase cytokines IL-6 and TNFα are produced early during inflammatory processes, including wound healing and ischemia/reperfusion. The goal of this study was to investigate the role of these cytokines in the induction of early chemokine production and leukocyte infiltration into heart allografts. METHODS: C57BL/6 (H-2b), BALB/c (H-2d), and BALB/c.IL-6-/- mice received vascularized syngeneic or completely MHC-mismatched A/J (H-2a) cardiac grafts. Grafts were retrieved at different time points, and total RNA and tissue protein were prepared and analyzed by quantitative RT-PCR and ELISA to test expression levels of TNF-α, IL-6, CXCL1/KC, CXCL2/MIP-2, and CCL2/MCP-1 in the grafts. Anti-TNFα mAb (500 µg) was given at the time of transplantation, with or without anti-CD154 mAb (200 µg on days 1 and 2). Infiltration of CD4+ and CD8+ T cells, neutrophils and macrophages was assessed by flow cytometry and immunohistochemistry. Donor-reactive T cell priming to IFN-γ-producing cells in the recipient spleen was measured by ELISPOT. RESULTS: Expression of TNFα and IL-6 mRNA reached an initial peak at 3 hrs post-transplant and a second peak at 9-12 hrs, with equivalent levels in both iso- and allografts. The neutrophil and macrophage chemoattractants CXCL1/KC, CXCL2/MIP-2 and CCL2/MCP-1 reached peak levels at 9 hrs post-transplant in both sets of grafts and then declined to background levels. IL-6 deficiency in the recipient or the cardiac allograft did not prolong allograft survival.
In untreated mice, heart allografts were rejected at 8.6 ± 0.6 days after transplantation. Anti-TNFα mAb decreased neutrophil and macrophage chemoattractant levels 50% at 9 hrs post-transplant, decreased subsequent neutrophil, macrophage and CD8+ cell infiltration into the allografts, and extended graft survival to 14.1 ± 0.8 days. Anti-TNFα mAb also decreased the number of donor-reactive IFN-γ-producing CD8 T cells almost 90% on day 7 post-transplant. Whereas anti-CD154 mAb prolonged survival to day 21, administration of anti-TNFα and anti-CD154 mAb delayed rejection to day 32 and resulted in long-term (> 80 days) survival of 40% of the heart allografts. CONCLUSIONS: These data indicate that anti-TNFα antibodies can delay donor-reactive CD8 T cell priming and leukocyte infiltration into heart allografts. As a conjunctive therapy, anti-TNFα antibodies can promote long-term survival of the allografts. Introduction: Recent evidence indicates that inflammation impairs immune regulation, yet the mechanisms behind this effect are not clear, particularly with regard to transplantation tolerance. In this study, we investigated the role of the inflammatory cytokines IL-6 and TNFα in transplantation tolerance induction. We first examined the impact of IL-6 + TNFα on in vitro T cell alloimmune responses. T cells stimulated by allogeneic APCs in conditioned media harvested from LPS-activated DCs proliferated more than APC-stimulated T cells cultured in conditioned media derived from non-LPS-activated DCs. The ability of CD4 and CD8 T cells to respond to allogeneic APCs in the LPS-activated conditioned media was significantly impaired by the addition of either anti-IL-6 or anti-TNFα mAbs. The addition of both mAbs further diminished T cell proliferation, indicating that IL-6 and TNFα synergize to augment in vitro T cell allostimulation. In support of these findings, we noted reduced T cell proliferation during the MLR when T cells were cultured in conditioned media derived from either IL-6-/- or TNFα-/- LPS-activated DCs. Furthermore, these diminished responses were restored by the addition of recombinant IL-6 or TNFα, respectively. To examine the in vivo implications of these findings, we employed a murine skin allograft model with recipients that were either B6 wild type, IL-6-/- or TNFα-/-. These groups were transplanted with a BALB/c skin graft and treated with (or without) perioperative costimulatory blockade (CTLA4 Ig and anti-CD154). In the absence of immune modulation, all groups rejected BALB/c skin allografts at a similar tempo (<12 days). In the presence of costimulatory blockade, IL-6-/- recipients (median survival time, MST = 22 days, p = 0.05) rejected their allografts at a slower tempo compared to WT (MST = 16 days) or TNFα-/- recipients (MST = 18 days). However, this response in IL-6-/- recipients was further delayed by administering a TNFα-inhibiting mAb (MST = 60 days), indicating that synergy between IL-6 and TNFα occurs in vivo and impairs the ability of costimulatory blockade to delay the onset of allograft rejection. Conclusions: We conclude that synergy between IL-6 and TNFα augments T cell alloimmune responses and impairs the effects of costimulatory blockade to delay allograft rejection. Abstract# 529 Background: Calcineurin inhibitors (CNI) are involved in the development of post-transplant diabetes mellitus (PTDM). Changes in insulin secretion and sensitivity are central mechanisms involved in the development of PTDM.
In addition, alterations in endothelial function seem to be involved. The present study investigated the effect of CNIs on these factors. Methods: In a predefined sub-study of a previously published randomized trial, we aimed to compare the effect of CNI treatment (n=27) with complete CNI avoidance (n=27) on insulin secretion and sensitivity as well as endothelial function. An oral glucose tolerance test and endothelial function investigation with laser Doppler flowmetry were performed in 44 patients at 10 weeks and 12 months following transplantation. Results: Insulin sensitivity already differed at 10 weeks post-transplant and was significantly better after 12 months in patients never treated with CNI drugs (P=0.043). Endothelial function was significantly correlated with insulin sensitivity (N=27, R²=0.22, P=0.013) at 10 weeks post-transplant, but not after 12 months (P=0.54). Insulin secretion tended to be higher in CNI-treated patients both at week 10 and month 12 (P=0.068). Conclusions: Findings in the present study indicate that long-term CNI treatment reduces insulin sensitivity, which was associated with impaired endothelial function. In response to this peripheral insulin resistance, a tendency towards a compensatory increase in insulin secretion was seen. These effects combined may indicate a future risk for premature cardiovascular disease in CNI-treated renal transplant recipients, but this hypothesis needs further study. [Table: group characteristics (n; Ktx; Ptx/KPtx; ECD; re-tx; PRA >20%; race (AA); low immunologic risk) for the Alem (n=113) and rATG groups.] Overall pt, Ktx, and Ptx survival are 96, 92, and 86% at 16 months median follow-up. Actuarial survival rates, initial length of stay, delayed graft function, steroid-free rates, major infection, and incidence of PTLD (1 rATG pt) were similar for the Alem and rATG groups, but treated acute rejection (AR) occurred in 17 (15%) Alem pts compared to 29 (27%) rATG pts (p=0.03), and biopsy-proven rejection (BPAR) in 13 (12%) Alem pts compared to 23 (21%) rATG pts (p=0.04). Only 1 Alem BPAR has occurred after 12 months. Total daily MMF doses were similar for the Alem and rATG groups at 3 months (1567±457 mg vs 1670±508 mg). Neupogen use was greater in the Alem group, 26 (23%), than in the rATG group (14 (13%), p=0.03). Excluding PAK, chronic allograft nephropathy (CAN) was observed in 19 (17%) Alem pts and 27 (25%) rATG pts (p=0.14). Conclusions: Alem and rATG induction both provide excellent 1- and 2-yr pt, Ktx, and Ptx survival. Alem is associated with lower acute rejection rates and perhaps less CAN, but requires increased Neupogen administration to help maintain MMF dosing. Thymoglobulin Dosing Intensity and Density: Effects on Induction Efficacy. Ruth-Ann M. Lee, 1 Adele H. Rike, 1 Background: Alemtuzumab (Campath 1H) has been used as induction therapy for kidney transplant recipients, with acute rejection rates reported by us and others of 7 to 20% at one year. The histologic type of rejection and the time frame for occurrence after treatment with alemtuzumab have not been well established. This study is a retrospective single-center review of acute rejection episodes in kidney transplant recipients treated with alemtuzumab induction, with respect to the kinetics and histologic patterns of acute allograft rejection. Methods: From 11/01/03 to 10/31/06, 416 kidney transplants were done meeting the inclusion criteria for this review.
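Categorical outcome comparisons of the kind reported above (e.g., treated AR in 17/113 Alem pts vs. 29 rATG pts, p=0.03) are typically tested on a 2x2 table. A sketch using Fisher's exact test; the rATG denominator below is an assumption, since it is not stated in the abstract:

# Comparing categorical outcome rates between two induction groups with
# Fisher's exact test. The Alem counts follow the abstract (17/113); the
# rATG group size of 109 is assumed for illustration.
from scipy.stats import fisher_exact

alem_rej, alem_n = 17, 113
ratg_rej, ratg_n = 29, 109          # assumed group size
table = [[alem_rej, alem_n - alem_rej],
         [ratg_rej, ratg_n - ratg_rej]]
odds_ratio, p = fisher_exact(table)
print(f"OR={odds_ratio:.2f}, p={p:.3f}")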
All patients had negative T and B cell flow cytometric crossmatches and received induction therapy with alemtuzumab 30 mg IV intra-operatively and methylprednisolone 500-750 mg during the first 24 hours, and were then maintained on tacrolimus (target level 5-7) and mycophenolate mofetil without steroids. All episodes of biopsy-proven acute rejection (AR) were reviewed. Patients in pre-transplant desensitization protocols or with documented non-adherence with medications were excluded from the analysis. Results: A total of 44 of 416 patients (10.6%) experienced AR during the study period, with a mean follow-up of 28 months (range 14-48 months). Of the AR episodes, 19 (43%) occurred within the first 3 months post-transplant, 13 (30%) occurred between 3-6 months, 5 (11%) occurred between 6-9 months, 1 (2%) occurred between 9-12 months, and 6 (14%) occurred more than one year post-transplant. Of the rejection episodes within the first 3 months post-transplant, 9/19 occurred within the first 30 days. Histologic analysis showed that 9/19 rejection episodes (47%) within the first 3 months included an antibody-mediated component (6/9 within the first 30 days). In contrast, 3/25 rejection episodes (12%) that occurred more than 3 months post-transplant were antibody mediated. Of the 12 patients with antibody-mediated rejection, only 3 patients had panel reactive antibody (PRA) levels > 25% at the time of transplant. Conclusion: This large experience with alemtuzumab induction therapy and a steroid-free maintenance protocol demonstrates that the majority of rejection episodes occur within the first 6 months post-transplant, with the largest fraction in the first 3 months. A significant number of early rejection episodes are antibody mediated and occur in unsensitized recipients. In a randomized, international, multicenter study comparing the use of Thymoglobulin (TMG) and basiliximab (BAS) in recipients at high risk for delayed graft function (DGF) or rejection, TMG was associated with less acute rejection (15.6% vs 25.5%, P=0.02) and a lower triple endpoint (rejection, death or graft loss; 20.6% TMG vs 33.6% BAS, P=0.02), but not a significantly lower quadruple endpoint including DGF. The purpose of this study was to compare the efficacy of TMG and BAS for induction stratified by donor source: standard criteria donor (SCD), extended criteria donor (ECD) or donor with hypertension (HTN). Methods: Retrospective review of data collected in the original randomized trial. Data-capture limitations necessitated defining ECD as donor age > 60, or donor age between 50 and 60 with both a donor history of HTN and donor renal insufficiency (history of ATN or creatinine above 2.5 mg/dl during the 24 hours prior to organ recovery/start of cold ischemia time). Results: 75 recipients received ECD kidneys [TMG n=40 (28.4%), BAS n=35 (25.6%), P=NS]. Outcomes are presented below. There were no differences in the rates of DGF between the groups examined. Conclusion: Standard and non-HTN donor recipients derived a substantial benefit from TMG compared to BAS, with less acute rejection and death. Contrary to the perceived niche of TMG in ECD recipients, TMG has its most beneficial effect in SCD recipients and recipients of donors without HTN at risk for acute rejection or DGF. Evaluation: We have changed our immunosuppressive protocol in ABO-incompatible kidney transplantations and attempted to determine whether the changes in agents have resulted in better outcomes.
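The ECD definition used in that review is a simple rule and can be expressed directly in code. A sketch, assuming inclusive age bounds for the 50-60 stratum and hypothetical field names:

# Donor classification following the stated ECD definition: age > 60, or
# age 50-60 with both a history of hypertension and donor renal
# insufficiency (history of ATN or creatinine above 2.5 mg/dl). The
# argument names are hypothetical, not the study's actual data fields.

def is_ecd(age, htn_history, atn_history, creatinine_mg_dl):
    if age > 60:
        return True
    renal_insufficiency = atn_history or creatinine_mg_dl > 2.5
    return 50 <= age <= 60 and htn_history and renal_insufficiency

print(is_ecd(63, False, False, 1.0))   # True: age alone qualifies
print(is_ecd(55, True, False, 2.8))    # True: 50-60 + HTN + creatinine
print(is_ecd(55, True, False, 1.2))    # False: no renal insufficiency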
We used tacrolimus (FK), mycophenolate mofetil (MMF) and methylprednisolone (MP) in immunologically high-risk patients between 2000 and 2007. Moreover, we performed splenectomy at the time of the transplant surgery in 117 patients (group 1) with ABO incompatibilities between 2000 and 2004, and administered rituximab as an alternative to splenectomy in 38 patients (group 2) with ABO incompatibilities between 2005 and 2007. In this study, we compared the graft survival rates as well as the incidence of acute rejection in these two treatment eras. The graft survival rate at one year was 94% in group 1 and 97% in group 2 (P=NS). The graft was lost in one of the 38 cases in group 2 due to insufficient doses of the immunosuppressive drugs. The incidence rate of acute rejection was 15% (18/117) in group 1 and 11% (4/38) in group 2 (P<0.01). There were no significant differences in serum creatinine level one year after transplantation between the two groups (1.5±0.3 mg/dl in group 1 vs. 1.4±0.4 mg/dl in group 2). No serious adverse events associated with rituximab or splenectomy were encountered in either group. In ABO-incompatible kidney transplantation, rituximab under FK/MMF combination as an alternative to splenectomy seems to yield an excellent result in terms of the incidence rate of acute rejection. Introduction: The choice of immunosuppression in the elderly kidney transplant recipient remains unclear. The objective of this study was to compare outcomes with different T cell-depleting induction agents in the elderly. Method: All solitary kidney transplant recipients over the age of 60 years who received induction therapy with either alemtuzumab or Thymoglobulin from 2002 to 2007 were included in this UNOS analysis. Overall graft survival, the risk of graft loss, and the risk of rejection were compared using Kaplan-Meier, Cox proportional hazards, and logistic regression analyses, respectively. Results: Patients receiving alemtuzumab had a significantly lower 3-year graft survival (70.8%) than patients receiving Thymoglobulin (79.3%, p=0.02) (Figure). After adjusting for other risk factors, alemtuzumab had a higher risk of graft loss compared to patients given Thymoglobulin. Purpose: The aim of this prospective randomized study was the comparison of efficacy and incidence of adverse events in two induction therapy regimens (ATG versus basiliximab) in patients receiving dual immunosuppression. Methods: 120 recipients of first or second deceased donor kidney transplants were prospectively randomized to receive either ATG (Fresenius) or basiliximab (Novartis) as induction therapy. Dual immunosuppression consisted of tacrolimus (Astellas) and methylprednisolone. CMV prophylaxis was not applied on a regular basis. Statistical analysis was performed with Fisher's exact or chi-squared test, ANOVA or Mann-Whitney U test, Kaplan-Meier curves and log-rank test. Results: Patient characteristics of the populations treated with ATG versus basiliximab were similar concerning average age (48 years), gender and dialysis time prior to transplantation (78 vs. 88 months). Average donor age and cold ischemia were also comparable (43 vs. 41 years and 833 vs. 852 minutes). The actuarial 5-year patient survival for the ATG subpopulation is 91.7% in comparison to 85% in the basiliximab group (n.s.). Analyzing graft survival after 5 years, rates of 88.3% in ATG patients compared to 75% in the basiliximab group can be observed (n.s.). The incidence of acute rejection episodes was similar in both groups (ATG: n=20 vs. basiliximab: n=16).
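The UNOS analysis above names Kaplan-Meier, Cox proportional hazards, and logistic regression, but not its software. As an illustration only, a Cox model relating induction agent to graft survival can be fitted as below, with synthetic records and the lifelines Python library standing in for whatever package was actually used:

# Sketch of a Cox proportional hazards model for graft survival by
# induction agent. All records are synthetic; 'induction' is coded
# 1 = alemtuzumab, 0 = Thymoglobulin, and 'event' is 1 for graft loss.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years":     [3.0, 1.2, 2.5, 0.8, 3.0, 2.1, 2.7, 3.0],
    "event":     [0,   1,   1,   1,   0,   0,   1,   0],
    "induction": [1,   1,   0,   1,   0,   0,   1,   0],
})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()   # hazard ratio with 95% CI for the induction covariate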
18 (30%) patients in the ATG group and 20 (33.3%) in the basiliximab group showed delayed graft function. Serum creatinine was not significantly different at 1 and 5 years (ATG: 1.4±0.6 mg/dl and 1.4±0.8 mg/dl vs. basiliximab: 1.3±0.7 mg/dl and 1.5±0.9 mg/dl). Patients in the ATG group had a higher rate of CMV infections (n=13 vs. n=3; p=0.05), whereas patients treated with basiliximab had significantly more hematological complications such as anaemia, leukopenia and thrombocytopenia. Conclusion: Comparing induction therapy with ATG and basiliximab, our data show similar patient and graft survival rates with slightly better results in the ATG group. Patients treated with ATG had a higher rate of CMV infections but fewer hematological complications. Predicting cardiovascular events (CVD) and the varied effects of immunosuppressive medication on CVD risk factors requires an understanding of how traditional and nontraditional risk factors impact CVD after kidney transplantation (KTx). Single-center studies have generally lacked statistical power and generalizability. Registry studies have lacked sufficient data on CVD risk factors. The PORT project is creating a multicenter international database of KTx recipients with the primary objective of developing risk prediction models for post-transplant CVD. As a preliminary data assessment, an analysis was done on 19,669 KTx performed from 1990-2007 at 11 transplant centers: 5 European centers, 4 North American centers, and 2 centers from Asia/Oceania. All data were extracted from preexisting databases at each individual transplant center and processed into the consolidated PORT database. 1,131 major adverse cardiac events (MACE) were identified, defined as non-fatal or fatal myocardial infarction, cardiac arrest, and sudden death. The MACE-free survival curves by participating center are shown in the figure. In this preliminary analysis, the overall one-year cumulative incidence of MACE was 2.2%, ranging from 0.2% to 4.4% across centers; the five-year cumulative incidence was 5.8%, ranging from 0.6% to 13.5%. The prevalence of diabetes pre-transplant varied from 10% to 47% across centers. In Cox proportional hazards models adjusted for age, gender, race, donor type, transplant center, and reported history of diabetes, hypertension (HTN), and AMI, patients with a reported history of AMI had a 112% increased risk (79%-152%, p<0.0001) for MACE. For the final analysis, the definition of MACE will be expanded to include major revascularization events. The final PORT database will be used to develop and validate an equation to predict MACE and other health outcomes of interest, after accounting for differential CVD risk factors internationally. BP medication use varied from center to center. The most commonly used BP medications were beta-blockers, followed by CCBs, and both were used with the same frequency at 1 and 6 months. In contrast, the percentage of patients on an ACEI or ARB more than doubled between 1 and 6 months, suggesting reluctance to use these agents early after Tx (a practice not necessarily evidence-based); however, more than 70% of subjects did not receive ACEI/ARB therapy even at 6 months. Fewer than half of all patients received aspirin, including only 60% with DM and/or CVD. Similarly, only half with DM and/or CVD received a statin at 1 and 6 months.
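The reported "112% increased risk (79%-152%)" for a history of AMI corresponds to a hazard ratio of about 2.12 (95% CI 1.79-2.52). The mapping from a Cox coefficient to such a statement is HR = exp(beta), with exp(beta ± 1.96·SE) for the interval; the beta and SE below are back-calculated to reproduce the published interval, not taken from the PORT dataset:

# How a Cox model coefficient maps to a "percent increased risk" statement.
# beta and se are chosen so the output matches the published 2.12 (1.79-2.52).
import math

beta, se = 0.7514, 0.0873
hr = math.exp(beta)
lo, hi = math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)
print(f"HR = {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
print(f"i.e., a {100*(hr-1):.0f}% increased risk ({100*(lo-1):.0f}%-{100*(hi-1):.0f}%)")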
These data indicate that current management of KTx recipients fails to utilize optimal CVD risk reduction measures in a timely fashion, perhaps missing an opportunity to reduce long-term morbidity and mortality from CVD in this at-risk population. Background: Cardiovascular (CV) risk reduction has been a primary reason for pursuing early corticosteroid withdrawal (ECSWD). To date, actual cardiovascular event (CVE) data (rather than CV risk) have not been reported for ECSWD. Therefore, we analyzed and compared actual CVE and CV-related survival in ECSWD (≤7 days) and chronic corticosteroid (CCS) pts. Methods: CVE and heart failure (HF) data were prospectively collected. CVE were defined as sudden death, myocardial infarction, angina, and cerebrovascular accident/transient ischemic attack. HF events were defined as pulmonary edema or HF diagnosis. Conclusions: RTx recipients receiving ECSWD experienced: 1) fewer CVE and 2) a trend toward overall better pt survival. These differences in CVE and pt survival do not present until at least 3 yrs PTx and therefore require long-term follow-up to become evident. Immunohistochemistry was performed for CD4+ and CD8+ cells. Results were compared with those of patients on maintenance IS (Gr-IS, n=29) and liver tissue from normal subjects (Gr-normal, n=11). Results: The follow-up time in Gr-Tol was longer than that in Gr-IS (Gr-Tol and Gr-IS: 121 m and 52 m, p<0.01). In Gr-Tol, typical features of neither acute nor chronic rejection were observed according to Banff criteria. The extent of graft fibrosis in Gr-Tol, however, was greater than that in Gr-IS and Gr-normal (Gr-Tol, Gr-IS and Gr-normal: 1.6, 0.9 and 0 by Ishak's modified staging; Gr-Tol vs. Gr-IS and Gr-IS vs. Gr-normal, p<0.01). The numbers of CD4+ and CD8+ cells in graft infiltrates were increased in Gr-Tol compared with Gr-normal, but equivalent to those in Gr-IS (CD4+: Gr-Tol, Gr-IS and Gr-normal, 14.6, 10.6 and 5.3 cells/field; Gr-Tol vs. Gr-normal p<0.05, Gr-Tol vs. Gr-IS NS. CD8+: Gr-Tol, Gr-IS and Gr-normal, 27.6, 26.3 and 10.5 cells/field; Gr-Tol vs. Gr-normal p<0.01, Gr-Tol vs. Gr-IS NS). Conclusions and Discussion: In tolerant grafts after pediatric living-donor Ltx, neither acute nor chronic rejection was observed, but fibrosis developed. Because of the similar extent of CD4+ and CD8+ cell infiltrates and the different follow-up times between tolerant and immunosuppressed patients, it remains questionable whether fibrosis in the tolerant graft is antigen-dependent. Serial protocol biopsy before and after starting IS weaning will detect fibrosis early, and observing whether reintroduction of maintenance IS reverses fibrosis in such cases will answer this question. Operational tolerance may not always guarantee intact graft morphology. Development of "Operational Tolerance" after Pediatric Liver Transplantation. Background: In the setting of our pediatric living-donor liver transplantation (Ltx) program, 15% of all patients (a significantly higher proportion compared with other transplant centers) achieved complete withdrawal of immunosuppression (IS), which is referred to as "operational tolerance". Nonetheless, some patients encountered rejection while they were undergoing weaning from IS. It is, therefore, essential to identify and characterize the differences that will enable patients in these two distinct populations to be distinguished reliably.
Methods: The study groups consisted of a tolerance group (Gr-Tol), in which 88 patients were successfully weaned off IS, and a rejection group (Gr-Rej), in which 22 patients experienced clinically evident rejection during or after the weaning process. The correlation between the clinical outcome (success or failure of weaning IS) and the following parameters was assessed: donor/recipient age, donor/recipient gender, ABO compatibility, HLA mismatch, graft size, early (<1 month) rejection episode and initial immunosuppression. Results: There was no difference between Gr-Tol and Gr-Rej with respect to donor/recipient age (Gr-Tol and Gr-Rej: donors 32 y and 32 y, NS; recipients 35 m and 26 m, NS) or donor/recipient gender (Gr-Tol and Gr-Rej: female donors 60% and 50%, NS; female recipients 60% and 73%, NS). ABO compatibility did not differ between the two groups (Gr-Tol and Gr-Rej, identical:compatible:incompatible: 63%:27%:10% and 55%:27%:18%, NS). The presence of HLA-B mismatch was more frequent in Gr-Tol than in Gr-Rej (Gr-Tol and Gr-Rej: 95% and 76%, p<0.05), while the presence of HLA-A or DR mismatch did not affect success or failure of weaning IS (HLA-A, Gr-Tol and Gr-Rej: 73% and 76%, NS; HLA-DR: 83% and 82%, NS). Graft size did not differ between the two groups (GBWR, Gr-Tol and Gr-Rej: 2.9% and 3.2%, NS). The patients in Gr-Rej experienced early rejection more frequently than those in Gr-Tol (Gr-Tol and Gr-Rej: 17% and 50%, p<0.05). Mean trough level of tacrolimus within 7 days after Ltx was comparable between the two groups (Gr-Tol and Gr-Rej: 11 ng/ml and 11 ng/ml, NS). Conclusions: Development of operational tolerance after pediatric Ltx was associated with the absence of early rejection and the presence of HLA-B mismatch between donors and recipients. Auxiliary Partial Orthotopic Liver Transplantation (APOLT) in Children with Fulminant Hepatic Failure. Patients with fulminant liver failure (FHF) who undergo auxiliary partial orthotopic liver transplantation (APOLT) have a chance to come off immunosuppression (ISP) when the native liver regenerates. It may be most beneficial for children with FHF; however, the literature regarding its use in children has been limited. From the beginning of the pediatric liver transplant program at our institution, 31 patients underwent liver transplantation for FHF. Of those, 7 received APOLT and the remainder standard liver transplantation (OLT). Seven children (age 8 months to 8 years) who received APOLT (APOLT group) were compared to a matched control group of 11 patients (OLT group). Because APOLT has been offered routinely at our institution since 2005, 6 of the APOLT cases were done since 2005. In the APOLT group, either a left lateral segment or left lobe graft was used. The recipient's left lobe was removed in all cases. In the OLT group, 7 received a whole liver graft and 4 received a partial liver graft. All native livers showed submassive to massive necrosis on pathology at the time of transplant. All children (100%) in the APOLT group are currently alive with a median follow-up of 761 days (range 116-4139 days), whereas 8 (72%) patients are alive in the OLT group (median follow-up 1724 days). Six of 7 children in the APOLT group (87%) showed native liver regeneration. The first four APOLT recipients (57%) are currently off ISP with fully regenerated native livers. Two of those patients developed complete atrophy of the graft liver; one underwent graft removal due to sepsis caused by severe rejection. One remaining patient who is off ISP is displaying progressive atrophy of the transplant liver.
Incidence of acute rejection was 57% (4/7) in the APOLT group vs 30% (3/10) in the OLT group. Other postoperative complications included hepatic artery thrombosis (HAT) (n=1), bile leak (n=1), biliary stricture (n=1) and bowel obstruction (n=1) in the APOLT group, and HAT (n=1), bile leak (n=2), bowel perforation (n=2), chylothorax (n=1) and aplastic anemia (n=2) in the OLT group. Median post-transplant length of stay was 15 days in the APOLT group and 20 days in the OLT group. Conclusions: APOLT was safely performed in children with FHF. A significant proportion of recipients displayed native liver regeneration and came off immunosuppression. Natural Killer Cell Dysfunction in Pediatric Acute Liver Failure. Nada Yazigi, 1 Greg Tiao, 1 Alexandra Filipovich, 2 John Bucuvalas. 1 In pediatric patients, indeterminate Acute Liver Failure (ALF) accounts for ∼50% of all cases and carries a particularly poor prognosis without transplantation. Evidence exists to suggest that acute liver failure may reflect a disproportionate immune response to a common stimulus. NK cells comprise a central component of the innate immune system. We hypothesized that NK cell dysfunction (innate or secondary to an antigenic insult) plays a pathologic role in indeterminate ALF. We reviewed peripheral NK cell function in a series of 15 consecutive children cared for at Cincinnati Children's Hospital who met criteria for indeterminate ALF as defined by the Pediatric ALF Study Group. Peripheral blood testing was carried out for NK cell number, cytolytic function, and perforin and granzyme activity as part of our clinical ALF protocol. Seven of fifteen patients had NK cell dysfunction. Only the severity of cholestasis was statistically higher in the NK cell dysfunction group. There was no statistical difference between the 2 groups with respect to age, INR, or peripheral blood cell counts. Three of seven patients with NK cell dysfunction died, in contrast to none of eight in the normal NK cell group. Of the 3 patients with NK cell dysfunction who eventually received a liver transplant, 2 had severe early recurrence of chronic hepatitis in the graft at one-year follow-up. Those outcomes are in sharp contrast to the group with no NK cell dysfunction, in which 7 patients needed transplantation but none had complications on short- or long-term follow-up. We documented NK cell dysfunction at the time of ALF diagnosis in 7/15 pediatric patients with indeterminate ALF. This subgroup of patients was found to have higher mortality, risk of infection, and recurrent disease in the graft. Our findings suggest that NK cell dysfunction is involved in the pathogenesis of indeterminate ALF. As such, it could be a prime target for therapeutic intervention, with the goals of rescuing the patient from liver failure and/or improving post-transplantation outcomes. Background: HRS is a reversible renal failure which occurs in pts with advanced liver disease and portal hypertension and is characterized by a marked decrease in GFR and RPF in the absence of other identifiable causes. The vasodilation theory is currently the most accepted hypothesis to explain the pathogenesis. In decompensated cirrhotics, the probability of developing HRS is 8-20%/yr and increases to 40% at 5 yrs. The ideal treatment is LTx. However, there is an urgent need for effective alternative tx to increase survival chances for pts until LTx can be performed. Interventions that have shown some promise are vasoconstrictors in the splanchnic circulation and TIPS.
The main objective was to compare the efficacy of two regimens (albumin/terlipressin and HES/terlipressin, each with or without midodrine) against TIPS, with GFR as the primary efficacy endpoint. Pts/Tx: Dx of HRS was based on the criteria proposed by the International Ascites Club. Only pts with ESLD on the waiting list for LTx were eligible to be enrolled. Pts were assigned to tx arms and randomized with or without midodrine. Volume/vasoconstrictor tx lasted for 10 d; midodrine was continued; follow-up was for 90 d. Results: IIa (albumin/terlipressin); IIb (HES/terlipressin); IIc (TIPS). Discussion: The combination of volume expansion/vasoconstriction effectively improved GFR in pts with HRS. Use of albumin showed no advantage compared to (cost-effective) HES. Although a marked improvement was observed during iv treatment, renal function deteriorated upon treatment withdrawal, whereas pts with continued midodrine showed superior long-term outcomes. We analyzed our single-institution experience to quantify the long-term incidence of renal failure based on month-3 GFR compared to subsequent determinations. Methods: This is an IRB-approved retrospective review of the prospectively maintained database of LT recipients. Exclusions: patients on renal replacement therapy (RRT) at the time of transplant, combined liver-kidney transplant, fulminant hepatic failure, and <1-year follow-up. GFRs (125I-iothalamate, Glofil method) were measured at initial evaluation (IE), month 3 (M3), year 1 (Y1), 2 (Y2), 5 (Y5), 10 (Y10), and 15 (Y15). Patients were grouped by GFR >80 (G1), 60-80 (G2), and <60 (G3). IE and M3 GFR were used as starting points for longitudinal analyses. Paired data analysis for IE, M3, Y5 and Y10 was also performed. Renal failure was defined as GFR <30, received kidney transplant, on dialysis, or on the kidney transplant list. Results: 592 liver transplant patients were reviewed between 1985 and 1999. Paired Glofil data were available for the Y5 and Y10 analysis in 114 patients. M3 GFR correlated more strongly with long-term renal function (p<0.0024). G1 demonstrated the largest reductions in GFR over time. G2 and G3, when corrected for patients who received kidney transplantation and RRT, demonstrated progressive reduction in GFR. Differences among G3, G2, and G1 were statistically significant (p<0.001, Wilcoxon two-sample test). This study clearly demonstrates progressive decline in GFR continuing out to 15 years after liver transplantation. M3 GFR correlates better with long-term renal function than IE GFR (which is not truly reflective of renal function at the time of LT). If M3 GFR was <60, the data showed a high rate of renal failure in our paired data analysis by Y5 (p<0.0024). After correcting for patients with renal failure, the previously reported stability in GFR between Y1 and Y5 is not seen. Analysis by grouping demonstrates that G1 patients at M3 have a lower incidence of renal failure >10 years after LT. G2 and G3 patients are at higher risk for renal failure each year after transplantation. Introduction: Calcineurin inhibitors have demonstrated efficacy in liver transplantation. However, they have the potential to impair renal function. Delayed tacrolimus (TAC) administration may reduce the risk of renal dysfunction. Methods: A prospective study included liver transplant pts randomised to delayed introduction of TAC (Day 5) + daclizumab (DAC) (Group A) or to immediate TAC administration (Group B). In both groups the TAC trough (T0) target was 10-20 ng/ml until Week 4 and 5-15 ng/ml thereafter. MMF was given at 2 g/d for 2 months, and corticosteroids (CS) at standard doses.
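The GFR grouping (G1 >80, G2 60-80, G3 <60) and the Wilcoxon two-sample comparison described above can be sketched as follows, with synthetic GFR values in place of the study's measurements:

# Grouping measured GFR values as in the study and comparing two samples
# with the Wilcoxon two-sample (rank-sum) test. All GFR values are synthetic.
from scipy.stats import ranksums

def gfr_group(gfr):
    return "G1" if gfr > 80 else "G2" if gfr >= 60 else "G3"

m3  = [95, 88, 72, 65, 55, 48, 102, 77, 58, 83]   # month-3 GFRs
y10 = [70, 66, 50, 41, 30, 28, 80, 52, 33, 61]    # paired year-10 GFRs

print([gfr_group(g) for g in m3])
stat, p = ranksums(m3, y10)
print(f"rank-sum statistic={stat:.2f}, p={p:.4f}")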
Pts with a serum creatinine (SCr) > 180 µmol/l at 12 hours (H12) were excluded. The primary endpoint was the rate of pts with a mean SCr > 130 µmol/l at Month 6. Month 24 results are presented. Results: 207 pts were randomised. Baseline characteristics were similar. At Month 6, mean TAC T0 was 9.6 (Group A) and 11.2 ng/ml (Group B). Median follow-up post-Tx was 8 years (5-21). The most frequent Tx indications were alcoholic (34%) and HCV (10%) cirrhosis. aMDRD glomerular filtration rate (GFR) was < 60 mL/min/1.73 m² in 11% (BT), 46% (1M), 48% (1Y) and 55% (5Y) of the patients. Changes in GFR were then compared according to the immunosuppressive protocol: group "CNI+MMF" = a calcineurin inhibitor (CNI) + mycophenolate mofetil (MMF); group "CNI" = a CNI without MMF. In the latter group, some patients received only a CNI and some a CNI + azathioprine; there was no difference between these 2 sub-groups in either renal function (RF) or CNI doses, so all these patients were pooled. In both groups, GFR decreased from BT: -14% in "CNI+MMF" vs -25% in "CNI" at 1M (p=0.06), -14% vs -29% at 1Y (p=0.03), and -12% vs -33% at 5Y (p=0.02). Although their mean GFR BT was lower (86 vs 97 mL/min/1.73 m², p=0.0002), the decrease in RF in "CNI+MMF" patients was less severe. Nearly 50% of the patients had renal insufficiency in the 5 years following liver Tx. The reduction in GFR was less pronounced in patients treated with MMF even though they were significantly more at risk BT. Except at 1M, there was no difference in CNI doses between the 2 groups, suggesting that the sustained smaller decrease in RF observed in "CNI+MMF" may not be explained only by CNI dose reduction. Acute rejection is a complex biologic process involving multiple cell types, cytokines and chemokines/chemokine receptors. We hypothesized that an mRNA panel that included genes implicated in the anti-allograft response would distinguish allografts undergoing acute rejection from normal allografts with a high degree of accuracy. We tested this hypothesis by measuring levels of urinary cell mRNA and peripheral blood cell mRNA for the cell surface proteins CD3, CD20, CD25, CD103, and CTLA4; the chemokines/chemokine receptors IP10, MIG, and CXCR3; the cytotoxic attack molecules granzyme B (GB) and perforin; and the immunoregulators Foxp3, TGF-beta1 and IL-10. Gene-specific primer pairs and probes were used in pre-amplification enhanced real-time quantitative PCR assays to measure mRNA and 18S rRNA transcripts. For each cell source, we used logistic regression to identify a linear function of up to 5 log-transformed measures that would distinguish biopsies of AR patients from those of stable transplant patients. Our study demonstrates that molecular signatures developed using urinary cell levels of just 5 genes (Signature 1: urinary cell levels of mRNA for CTLA4, Foxp3, GB, CD3, and MIG), or a combination of 3 urinary and blood cell levels (Signature 3: urinary cell level of CTLA4 mRNA and peripheral blood cell levels of CD103 and CTLA4), differentiate AR from stable biopsies with 100% sensitivity and 100% specificity. Blood cell levels alone are also informative, but less so. We conclude that molecular signatures developed from noninvasively ascertained mRNA profiles of urinary cells/peripheral blood cells predict acute rejection with extraordinary accuracy. Clinical trials to validate the predictive value of these signatures are worthy of pursuit.
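The signatures above are linear logistic functions of up to five log-transformed mRNA measures. A sketch of that modeling step on synthetic expression data (the study's actual data, preprocessing details and software are not given in the abstract):

# Sketch of a 5-gene logistic "signature" separating acute rejection (AR)
# from stable biopsies: a linear function of log-transformed urinary-cell
# mRNA levels. Expression values are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
genes = ["CTLA4", "Foxp3", "GB", "CD3", "MIG"]
ar     = rng.lognormal(mean=2.0, sigma=0.5, size=(20, 5))  # higher in AR
stable = rng.lognormal(mean=1.0, sigma=0.5, size=(20, 5))
X = np.log(np.vstack([ar, stable]))          # log-transformed measures
y = np.array([1] * 20 + [0] * 20)            # 1 = AR, 0 = stable

model = LogisticRegression().fit(X, y)
print(dict(zip(genes, model.coef_[0].round(2))))
print("training accuracy:", model.score(X, y))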
We have described the association of CD20+ B cell infiltrates in renal transplant (Tx) biopsies (bxs) with acute cellular rejection (ACR) and Tx dysfunction (dysfx). We have also found metabolically active plasma cells (Pcs) staining for S6 ribosomal protein (S6rp) within these Txs. Herein, we report the significance of CD138+ Pcs in rejection (rj) and evaluate the impact of CD20, CD138, and S6rp on long-term Tx function (fx) by calculated creatinine clearance (CrCl). We studied 46 Tx bxs from 32 pediatric (ped) patients (pts) who were bxed for suspicion of rj from Nov 2001 to Nov 2004. Pts were given daclizumab and maintained on prednisone, mycophenolate mofetil, and tacrolimus or cyclosporine. Immunohistochemical staining and quantification for CD20, CD138, S6rp, and C4d were performed under 500x light microscopy. Bxs were classified by modified Banff 97 criteria. CrCl was followed 2 yr post-bx. CD138+ Pcs were associated with C4d-negative ACR (p=0.0002) but not antibody-mediated rj (AMR, p=0.82). ROC analysis confirmed that >5 CD138+ cells/hpf was strongly associated with ACR, yielding 89% sensitivity and 83% specificity, correctly classifying 84%, with a total ROC area of 0.92 (95% CI 0.82, 1). Higher CD138 counts at bx correlated with worse Tx fx (Fig. 1). A univariate regression model showed that CD20, CD138, S6rp and time were associated with a decline in Tx fx at bx. A multivariate model showed that CD20, CD138, and time had the main effects on CrCl decline, with S6rp dropping out. All patients, regardless of rj status, had a 16 ml/min/1.73 m² CrCl decline attributable to time (p=0.004). Pts with CD20 had an additional sustained 19 ml/min/1.73 m² CrCl decline seen 2 yrs post-bx (p=0.03). Pts with CD138 also had a 19 ml/min/1.73 m² CrCl decline at bx (p=0.03), but there was an interaction between time and CD138 that negated a sustained effect (p=0.04). This study identifies a numerical threshold of >5 CD138 cells/hpf that is associated with ACR and Tx dysfx. Infiltrating CD20 cells had the greatest, sustained effect on Tx dysfx. We conclude that cells of the B lineage, particularly CD20, play a key, but undefined, role in ACR. Intragraft: There is now evidence that FOXP3+ cells are not indicators of tolerance, since FOXP3 is also increased during acute rejection. However, it is unknown whether FOXP3+ cells are present during chronic antibody-mediated rejection. Moreover, the relative balance of regulatory, effector and cytotoxic pathways in chronic vs. acute injury has yet to be explored. Here we addressed this issue. Intragraft regulatory, effector and cytotoxic transcriptional profiles were analysed within renal transplant biopsies (n = 43) classified (Banff 2005) as displaying normal histology, chronic calcineurin inhibitor toxicity (CNI-tox), chronic antibody-mediated rejection (CAMR) or acute cellular rejection (ACR). Granzyme B, Tbet and FOXP3 mRNA were measured by quantitative PCR, and FOXP3-positive cells were additionally quantified in graft biopsies by immunohistochemistry. Distinguishing mRNA profiles were analyzed in the peripheral blood (n = 26). Our data show that FOXP3 mRNA is increased not only in ACR (p<0.0001) but also in CAMR (p<0.05). Expression of FOXP3 mRNA correlated tightly with the density of FOXP3 protein-positive cells by immunohistochemistry (Spearman r = 0.71; p < 0.0001); FOXP3+ cells were found in aggregates and within tubules. Moreover, graft cytotoxic, effector and regulatory pathways were all found to be active in chronic as well as acute graft injury.
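The ROC analysis above fixes a cutoff (>5 CD138+ cells/hpf) and reports sensitivity, specificity and area under the curve. A sketch of those computations on synthetic biopsy counts:

# Sensitivity/specificity of a cell-count cutoff and ROC area, mirroring
# the described ROC analysis. The per-biopsy counts are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

acr    = np.array([8, 12, 6, 9, 15, 7, 4, 11])   # counts in ACR biopsies
no_acr = np.array([1, 3, 0, 6, 2, 4, 1, 2])      # counts in non-ACR biopsies

cutoff = 5
sens = (acr > cutoff).mean()          # true positives among ACR
spec = (no_acr <= cutoff).mean()      # true negatives among non-ACR
auc = roc_auc_score([1] * len(acr) + [0] * len(no_acr),
                    np.concatenate([acr, no_acr]))
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, AUC={auc:.2f}")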
Significant increases in Granzyme B, Tbet and FOXP3 mRNA were observed in CAMR, CNI-tox and ACR compared to normal histology (p<0.0001, p<0.001 or p<0.05). However, differences in the relative contribution of each pathway were evident, with significant accumulation of FOXP3 mRNA predominating in ACR and granzyme B predominating in CAMR. Thus, CAMR can be distinguished from both ACR and CNI-tox by an unfavorable intragraft Granzyme B/FOXP3 mRNA ratio (p<0.05). Interestingly, this ratio was reversed in the blood, suggesting different migratory patterns for regulatory and cytotoxic cells between the blood and the graft. Our data thus confirm that intragraft and peripheral blood FOXP3 accumulation is also a feature of CAMR of kidney grafts. Moreover, CAMR can be distinguished from other graft injury types based on its intragraft or blood cytotoxicity/regulatory profile. Survival of solid organ grafts depends on lifelong immunosuppression, which results in increased rates of infection and malignancy. Induction of tolerance to the allograft would represent the optimal solution for controlling both chronic rejection and the side effects of immunosuppression. We previously showed that operational tolerance after kidney transplantation can occur in some patients. Here, the potential of high-throughput microarray technology allowed us to study the peripheral blood gene expression profile associated with operational tolerance and chronic rejection in a cohort of human kidney graft recipients (n=26). Microarrays were used to compare the gene expression profiles of PBMC from patients with chronic rejection and drug-free operationally tolerant recipients. Results were analyzed using a classical statistical approach and a non-statistical analysis based on the identification of key leader genes associated with chronic rejection and operational tolerance, respectively, defined either as the genes changing expression the most or those having the strongest interconnections. 343 differentially expressed genes were identified between operationally tolerant patients and patients with chronic rejection. Non-adherence was defined as missing > 10% of prescribed doses on MAM and MEMS, and >2 SD among consecutive blood serum levels. Results: Participants were 28 transplant patients (M = 14.11 ± 3.38 years old, 75% male, 71.4% Caucasian). On the MAM, 67.9% of the patients acknowledged some non-adherence but minimized how many doses they missed. Using MEMS technology, 90% had some non-adherence; specifically, 23.8% of the participants missed doses, and only 56% of doses were taken within the allowable time frame. Using the > 2 SD criterion for blood serum levels, 42% of the participants were considered non-adherent. Non-adherence worsened with years since transplant. More missed (r = .59, p = .001) and late doses (r = .59, p = 0.04) on the MAM and >2 SD among blood serum levels (r = .83, p = .001) were associated with a higher incidence of acute rejections. Adherence data were examined for patients with documented acute rejections (N = 12). Sensitivity and specificity of each detection method were also examined. The MAM and SD detection methods each identified non-adherence in 67% of the patients with acute rejections; MEMS did not identify any additional non-adherent patients. Only 25% of the patients with acute rejections were identified consistently by all three adherence detection methods; all patients with acute rejections were identified by at least one method.
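The two operational non-adherence criteria above (missing >10% of prescribed doses; >2 SD among consecutive blood levels) can be written as simple flags. How the SD rule was computed is not fully specified; the sketch below assumes it means the standard deviation of a patient's levels exceeds 2:

# Flagging non-adherence under the two study criteria. The SD rule's exact
# operationalization is an assumption (SD of trough levels > 2).
from statistics import stdev

def missed_dose_flag(doses_taken, doses_prescribed):
    return (doses_prescribed - doses_taken) / doses_prescribed > 0.10

def level_variability_flag(trough_levels, threshold=2.0):
    return stdev(trough_levels) > threshold

print(missed_dose_flag(52, 60))                 # True: ~13% of doses missed
print(level_variability_flag([4.1, 9.8, 3.2]))  # True: SD is about 3.6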
Discussion: Non-adherence worsened with time since transplant and was associated with acute rejections. Since no single method of detecting adherence identified all the patients with acute rejections, multi-method adherence assessments should be used to accurately capture patients who are non-adherent. Non-adherence (NA) is a leading cause of allograft loss and results from multiple factors. Locus of control (LOC) and beliefs regarding health have been associated with adherence in other populations. 51 randomly chosen KTRs were interviewed using a confidential questionnaire administered by an outside investigator that included questions regarding LOC, health beliefs and self-efficacy. The population was 55% female, 92% Black, 15% Hispanic, 71% deceased donor kidney, 47% diabetic, 49% greater than high-school education, 36% employed, 30% married or cohabiting, 63% income <20K per year, 65% insured by Medicaid. Mean age was 47.8±13.5 yrs, time on dialysis 72.8±51 mos, months since transplant 23.9±14.3, total meds 7.3±3.0. By Pearson correlation, non-adherence (NA), defined as "having missed doses of immunosuppression over the preceding 3 months", was not correlated with race, gender, income, type of insurance, age, marital status, type of txp, mos on dialysis, time since transplant, or number of medications. NA was correlated with higher education level (r=0.4, p=0.006), current employment (r=0.3, p=0.05), and knowledge of the most recent creatinine value (r=0.35, p=0.015). NA was associated with concerns regarding prednisone (long-term effects, dependency; r=0.33, p=0.018) and feelings of greater personal control over illness (r=0.36, p=0.011), and was inversely correlated with powerful-others LOC (feeling that one's health is dependent on other people; r=-0.31, p=0.029) and belief in the necessity of medication for maintenance of transplant health (r=-0.3, p=0.031). We conclude, in our population of inner-city patients: 1. NA is not associated with standard demographic factors including income, race and gender. 2. Contrary to findings in other populations, NA is associated with higher education and current employment. 3. NA is associated with knowledge about creatinine value, concern regarding long-term effects of prednisone and disbelief in the importance of transplant medications. 4. NA is associated with feelings of personal control and the feeling that powerful others (e.g. health care providers) are not of high importance in the outcome of illness. 5. Education programs designed to address NA in this population should be targeted towards altering negative beliefs regarding medications and stress the importance of partnering with the transplant team for optimal long-term outcomes. Purpose: The present study aimed to prospectively examine the relationships among nonadherence, health-related quality of life (HRQOL), and family factors in adolescent kidney, liver, and heart transplant recipients. Method: 68 adolescent transplant recipients aged 11 to 20 years (M = 15.8, SD = 2.5; 44% female; 57% kidney, 25% liver, 18% heart) and their parents participated. At baseline and 18-month follow-up assessments, adolescents and their parents independently completed phone interviews assessing self-/proxy-reported medication adherence, HRQOL, and family cohesion and conflict. Medical record reviews were conducted to obtain current medications, immunosuppressant drug assays, and clinical outcomes in the past year (i.e., rejection episodes, hospitalizations, graft loss).
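The associations in these adherence studies are reported as Pearson correlations with p-values. A minimal sketch on synthetic paired scores:

# Pearson correlation between a belief/attribute score and a non-adherence
# measure, of the kind reported above. All paired values are synthetic.
from scipy.stats import pearsonr

belief_score = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
missed_doses = [0, 1, 0, 1, 2, 2, 3, 2, 4, 3]
r, p = pearsonr(belief_score, missed_doses)
print(f"r = {r:.2f}, p = {p:.4f}")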
Results: At baseline, adolescents classified as nonadherent based on self-report and tacrolimus standard deviation (SD) reported significantly lower general health perceptions (F(3, 64) = 3.63, p < .05), self-esteem (F = 4.58, p < .01), mental health (F = 3.09, p < .05), and behavior HRQOL (F = 2.75, p < .05) compared to adolescents classified as adherent. Similarly, parents of adolescents classified as nonadherent reported significantly lower physical functioning (F = 3.18, p < .05), self-esteem (F = 5.85, p < .01), and behavior HRQOL (F = 2.82, p < .05) for their adolescents. Family conflict was correlated with adolescent report of behavior (r = -.49, p < .01), physical functioning (r = -.33, p < .01), self-esteem (r = -.42, p < .01), and mental health HRQOL (r = -.34, p < .01). Family conflict was correlated with parent report of behavior (r = -.46, p < .01) and physical functioning (r = .24, p < .05). Improvement and deterioration in HRQOL from baseline to 18-month follow-up are currently being examined. It is expected that increased family conflict and decreased medication adherence will be associated with deteriorations in HRQOL. The interrelationships between medication adherence, family conflict, and HRQOL domains such as self-esteem and mental health suggest that interventions targeting these domains may result in improvements in medication adherence behavior. The use of CAM in general is associated with non-disclosure by patients to physicians. 51 randomly chosen KTRs were interviewed using a confidential questionnaire administered by an outside investigator, including questions on CAM usage, whether it was doctor-recommended, and whether the patient disclosed use. CAM was defined as ingestion of herbal or other preparations, use of mind-body techniques, or manipulation of the body for healing by someone who is not an allopathic medical provider. Use of vitamins and spirituality were excluded. The population was 55% female, 92% Black, 15% Hispanic, 71% deceased donor kidney, 47% diabetic, 49% > high-school education, 36% employed, 30% married or cohabiting, 63% income <20K per year, 65% insured by Medicaid. Mean age was 47.8±13.5 yrs, time on dialysis 72.8±51 mos, months since transplant 23.9±14.3, total meds 7.3±3.0. 45% of patients (n=23) used CAM. By Pearson r, CAM use was correlated with NA to immunosuppressants (r=0.4, p=0.041), blood sugar-lowering medications (r=0.57, p=0.003), and cholesterol-lowering medications (r=0.84, p=0.0001); with worries about long-term effects of medicines (r=0.324, p=0.02); with the belief that doctors place too much trust in medication (r=0.3, p=0.032); and with the belief that natural remedies are safer than medicines (r=0.3, p=0.034). CAM use was inversely related to the beliefs that health depends on allopathic medicines (r=-0.341, p=0.014), that medicines protect from worsening disease (r=-0.28, p=0.048), that following the doctor's orders is the best way to stay healthy (r=-0.29, p=0.04), and that having a kidney transplant makes them feel happy (r=-0.39, p=0.005). We conclude, in our population of inner-city patients: 1. Use of CAM is correlated with medication non-adherence. 2. Patients who use CAM are more worried about long-term effects of medication, believe that natural remedies are safer than medications, and believe that doctors place too much trust in medication. 3. Patients who use CAM do not believe that their health depends on allopathic medication, that medicines protect from worsening disease, or that following a doctor's orders is the best way to maintain optimal health.
4. Patients who use CAM are less happy with their kidney transplant. 5. Discussing CAM use and motivation for use is important in the transplant clinic and may alert the provider to possible risk for non-adherence. By multivariate Cox analysis, the risk of TG was related to the presence of HLA-II Ab. Death-censored graft loss occurred in 4.6% of patients without TG and in 38.4% of patients with TG (p<0.0001). HLA-II Ab are associated with a higher risk of TG and reduced graft survival. Furthermore, the risk of TG and its prognosis relate to the level of HLA-II Ab quantitated in a solid-phase assay. Term Survival of Cardiac Allografts in Wild-Type Mice by Alloantigen (alloAg)-Specific FOXP3+CD4+CD25+ Natural Regulatory T (nTreg) Cells. Guliang Xia, 1 Jie He. Methods: Fresh naive CD4+CD25+ nTreg were isolated from congeneic B6.PL mice via autoMACS and enriched for alloAg specificity by in vitro culture with either anti-CD3/CD28-coated Dynabeads (d0-11), then donor bone marrow-derived dendritic cells 8% (d+16) for DC/Beads-expanded nTreg, while the total fold expansion of nTreg remained similar (24.8∼30.2 for Beads/DC- or 5.2∼7.1 for DC/Beads-expansion) regardless of the presence or absence of TGF-β. Introducing RA (100 nM) into the Bead/DC-based, TGF-β/IL-2-conditioned culture resulted in marginal improvement, with 28.9% (d+18) of nTreg being Foxp3+. In MLR assays, nTreg expanded with TGF-β/IL-2 exerted more potent suppression than cells conditioned with IL-2 alone. In vivo, Beads/DC-expanded, TGF-β/IL-2-conditioned nTreg synergized with transient host T cell depletion (anti-Thy1.2 mAb i.p., 50 µg at d-3 and 25 µg on d+2) in C57BL/6 mice to suppress BALB/c heart allograft rejection, with 57.1% (n=7) and 100% (n=10) of allografts surviving over 100 days when 4×10⁷ or 8×10⁷ cells/mouse were injected immediately post-transplant, respectively. Anti-Thy1.2 treatment alone led to only 18.8% long-term survival. Infused nTreg survived long-term (5.8% of circulating T cells (8×10⁷ cell dose) or 1.2% (4×10⁷ cell dose) at d+7 post-transplant) and expressed high levels of Foxp3 (40∼60%) in vivo. Long-term surviving allografts showed characteristics of 'acquired immune privilege', with cellular infiltrates that were Foxp3+, TGF-β+, IL-10+ and indoleamine 2,3-dioxygenase (IDO)+, although signs of mild to moderate chronic rejection were still evident. Conclusion: T-bet deficiency results in up-regulation of IL-17 expression in addition to Th2-associated cytokines, resulting in acceleration of chronic rejection despite profound deficiency of IFN-γ. T-bet deficiency may contribute to alloimmune responses independent of IFN-γ. By in situ hybridization, Tir8 mRNA levels were higher (P<0.05) in cortical tubules, glomeruli, and perivascular and peritubular areas of kidney grafts at 1, 3 and 6-7 d post-tx than in naive kidneys. To assess how local expression of TIR8 affects the outcome of kidney grafts, we transplanted Tir8-/- B6x129 kidneys into DBA/2 mice. Most (83%) recipients of Tir8-/- kidneys rejected their grafts, with a median survival of 9.5 d (n=12, P<0.05 vs wt), and had more severe graft dysfunction (BUN levels) at days 1, 3 and 6-7 post-tx than recipients of a wt allograft (P<0.05). Opticept Trial: Efficacy and Safety of Monitored MMF in Combination with CNI in Renal Transplantation at 12 Months. Trough-based dose adjustments were made in the MMF CC arms. Antibody induction and/or corticosteroids were administered according to center practice.
Primary endpoints were the proportion of patients with treatment failure (biopsy-proven acute rejection [BPAR], graft loss, death) and the mean percent change in calculated glomerular filtration rate (GFR; Nankivell equation) at 12 months. Safety endpoints were the incidences of adverse events (AEs) and serious AEs. Baseline characteristics did not differ among treatment groups, with living donors accounting for approximately 50% of grafts. 82% received tacrolimus (Tac) and 18% cyclosporine (CyA); CNI doses and levels were significantly lower in Group A. MMF doses were greater in CyA-treated subjects in all groups, in CyA-treated patients and in Group A (p=0.08); stability of renal function over time was greatest in Group A. Despite higher MMF doses in Group A (p<0.001) at most time points, significantly fewer MMF withdrawals occurred in Group A vs. Groups B and C. Conclusions: A concentration-controlled MMF and reduced-level CNI regimen is not inferior to fixed-dose MMF with standard-dose CNI as regards BPAR and other endpoints. This regimen facilitated higher MMF dosing without an overall increase in adverse effects, and with a trend toward preservation of kidney function versus standard-dose CNI regimens. Comparison at One Year of Interstitial Fibrosis (IF) by Automatic Quantification in Renal Transplant Recipients with Cyclosporine (CsA) Discontinuation and Sirolimus (SRL). Introduction: We previously reported the clinical results of a multicentric study showing that CsA conversion to SRL at week (W) 12 is associated with a significant improvement in renal function. Routine renal biopsy (RB) was performed at W52 during this study. For each RB, a section was imaged using a colour video camera and analyzed by a colour-segmentation program which automatically extracts the green-coloured areas characteristic of IF. Results were expressed as percentage of IF and graded according to the Banff classification. Results: Male donor gender was associated with higher IF (30±2% vs. 23±2%, p=0.03). IF was numerically higher in patients who had experienced acute rejection (33±4%, n=18 vs. 26±1%, n=99, p=0.06). There was a positive correlation between renal function and the percentage of IF on RB (p=0.004). [Table: mean IF (%) and Banff grade I-III distribution for the intent-to-treat (n=117) and sirolimus (n=58) groups.] Conclusion: Despite significant improvement in renal function after CsA to SRL conversion at 3 months, we found no difference in IF on RB at W52. The observed improvement in renal function may be due to a hemodynamic effect. A longer delay may be necessary to observe histological improvement. The higher IF score than previously reported by others may be explained by the use of expanded criteria donors. Abstract# 528 Effects of CNI or MMF Withdrawal on Carotid Intima Media Thickness in Renal Transplant Recipients. Methods: We included 119 stable renal transplant patients on CNI-based immunosuppression, including steroids (10 mg/d) and MMF (2 g/d), who were randomized to MMF withdrawal (group A: CsA AUC 3000 ng*h/ml) or CNI withdrawal (group B: MPA AUC 75 µg*h/ml). Patients were treated for traditional risk factors according to stringent predefined targets. Ambulatory blood pressure (ABPM), lipids, estimated creatinine clearance (MDRD) and IMT were measured at baseline and after 12 months.
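The IF quantification described above rests on colour segmentation: the fraction of pixels falling in the "green" (fibrosis-stained) range of an imaged section. The study's actual program and thresholds are not described; the sketch below uses an assumed green-dominance rule on a toy RGB array:

# Sketch of automatic interstitial fibrosis (IF) quantification by colour
# segmentation: count pixels whose green channel dominates and report them
# as a percentage of the section. The threshold rule is an assumption.
import numpy as np

def percent_green(rgb, margin=30):
    """rgb: H x W x 3 uint8 image; a pixel is 'green' if G exceeds both
    R and B by at least `margin`."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    green = (g - r >= margin) & (g - b >= margin)
    return 100.0 * green.mean()

# Tiny synthetic "section": one green-stained pixel out of four.
img = np.array([[[200, 60, 60], [60, 200, 60]],
                [[60, 60, 200], [120, 120, 120]]], dtype=np.uint8)
print(f"IF = {percent_green(img):.1f}%")   # 25.0%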
Results: Groups were comparable with respect to demographic characteristics, immunological profile, renal function, systolic and diastolic blood pressure, and lipids. Mean duration of follow-up was 17.4±5.5 months. Only 1 patient (1.3%) in group B and 3 patients (3.8%) in group C experienced acute rejection despite adequate exposure (p=0.31). IMT did not change. Final Renal Function Outcomes from the Spare-the-Nephron (STN) Trial: Mycophenolate Mofetil (MMF)/Sirolimus (SRL) Maintenance Therapy and CNI Withdrawal in Renal Transplant Recipients. Purpose: To compare the effect on renal function of maintenance immunosuppression with MMF and SRL to that of MMF and a CNI in renal allograft recipients. Methods: In a 2-year open-label, prospective, randomized, controlled, multicenter study, 305 subjects maintained on MMF and a CNI were randomized 30-180 days post-transplantation to either MMF (1-1.5 g BID) plus SRL (2-10 mg followed by ≥2 mg/day). Results: Outcomes of the first 123 subjects receiving MMF/SRL and 126 receiving MMF/CNI (Tac, n=98; CsA, n=28) completing 1 year of follow-up will be reported here. Final outcomes of all 305 subjects will be presented at the congress. Mean time from transplant to randomization in both groups was 117 days. Groups were similar at baseline for all reported renal function endpoints. After 12 months of therapy, maintenance immunosuppression with MMF/SRL after CNI withdrawal appears to preserve renal function when compared with an MMF/CNI-containing regimen. Improved Outcomes after De Novo Renal Transplantation: 2-Year Results from the Symphony Study. H. Ekberg, 1 H. Tedesco-Silva Frei, 10 Y. Vanrenterghem, 11 P. Daloze, 12 P. Halloran. At 2 years, the rate of uncensored graft loss was lowest in patients receiving tacrolimus (8% vs 10-13% in other groups; Kaplan-Meier estimates). GFR at the end of the core study was slightly better in the follow-up ITT patients (65-70 ml/min) than in the core study ITT patients (57-65 ml/min), suggesting inclusion of better-performing patients in the follow-up. Renal function was generally stable over year 2. A slight improvement in GFR in the sirolimus group (+1.6 ml/min) was observed, whereas the tacrolimus group still had superior GFR (69 vs 65-67 ml/min in other groups). Conclusions: In follow-up patients, renal function was stable during the second year and GFR differences were less marked than at 1 year. A Prospective Randomized Study of Alemtuzumab vs Rabbit Anti-Thymocyte Globulin Induction in Kidney and Pancreas Transplantation. Gautreaux, 1 S. Iskandar, 4 P. Adams, 2 R. Stratta. 1 1 Surgery; 2 Medicine; 3 Pharmacy. Alemtuzumab (Alem) and rabbit anti-thymocyte globulin (rATG) are the most commonly used T-cell depleting induction agents in kidney (K) and pancreas (P) transplantation (tx). Expanded criteria donors (ECD) were included. Results: Between 2/1/05 and 8/24/07, 225 pts enrolled and 222 pts were transplanted. Of 222 pts, 180 (81%) had Ktx alone, 38 (17%) KPtx, and 4 (2%) PAKtx. Of 180 Ktx alone, 152 (84%) were deceased donor, and 61 (34%) were ECDs. Recipient age, race, re-tx Abstract# 535 Purpose: To determine the impact of ALG induction on long-term outcomes post-renal Tx. Methods: Between 01/85 and 01/86, 123 consecutive adult pts received a deceased donor renal Tx at a single institution. Results: The incidence of acute rejection was lower in Gr.2 (28% vs. 75%, P<0.0001). The incidence of CMV infection was 10% in Gr.1 and 18% in Gr.2 (P=NS).
The overall incidence of cancer was 22

Abstract# 538 Single-Dose Induction with Rabbit Anti-Thymocyte Globulin (rATG) Safely Improves Renal Allograft Function and Reduces Chronic Allograft Nephropathy. 1 Clifford Miles, 1 Gerald Groggel, 1 Lucile Wrenshall. 1 1 Divisions of Transplantation and Nephrology. We conducted a prospective, randomized trial in renal transplant recipients comparing two dosing protocols [single dose (6 mg/kg) vs. divided doses (1.5 mg/kg for 4 doses)] of rabbit anti-thymocyte globulin (rATG; Thymoglobulin®). We present herein the results of the first 142 patients. Throughout the first 6 months post-transplantation, recipients of kidneys from non-marginal deceased donors derived the greatest benefit in renal function (eGFR) from the single-dose regimen (p=0.006). The incidence of chronic allograft nephropathy (CAN) was also lower in the single-dose group, both in clinically indicated and protocol biopsies combined (p=0.045) and in 12-month protocol biopsies alone.

In multivariable regression, allograft failure strongly predicted increased risk of subsequent CVE. Among listed candidates, receipt of a transplant was associated with significant time. Adjusted for baseline factors, CVE after transplant predicted increased risk of subsequent mortality: HR 5.15 (CI 4.53-5.85) after IS.

Microalbuminuria Post-Renal Transplantation Is Related to Inflammation and Cardiovascular Risk. Our objective was to define the relationship between microalbuminuria and these risk factors in stable RTR. Methods: Over one year, we identified 223 stable RTR who were at least 2 months post-transplant and provided 3 successive urine albumin-to-creatinine ratio (ACR) measurements, excluding those with recent illness and overt proteinuria. Microalbuminuria was defined as an averaged ACR ≥2.0 in men and ≥2.8 in women (CDA 2003). Framingham-based traditional as well as novel cardiovascular risk factors associated with microalbuminuria were determined by univariate analysis (p<0.20), followed by stepwise backwards elimination (p>0.05) multivariate logistic regression analysis. Microalbuminuria did not correlate with prior acute rejection, delayed graft function, or any specific antihypertensive or immunosuppressive agents. Conclusions: Post-transplant microalbuminuria is highly prevalent and is associated with elevated CRP, elevated BP, and smoking. Its relationship to these other factors suggests that it reflects an inflammatory state in otherwise stable patients and thus may indicate graft and patient health.

The first year after kidney transplantation (Tx) is associated with increased mortality relative to dialysis. Early post-Tx deaths are often cardiovascular (CV) and frequently occur after the first week post-Tx. cTnT is a sensitive and specific marker of myocardial injury. In this study we investigated whether cTnT relates to early post-Tx survival. Methods: 396 patients received a kidney Tx from 9/2004 to 12/2006, 74% from living donors. cTnT was measured during the pre-Tx workup and periodically while on the Tx waiting list. 261 patients (64%) had a dobutamine stress echo (DSE) and 103 (26%) had a coronary angiogram. The combined endpoint of the study was death or major cardiac events. Survival was censored for graft loss. Results: Mean age was 51±12 years, 59% male. The pre-Tx cTnT level was elevated (>0.01 ng/ml) in 56% of patients. Other DSE-derived parameters did not relate significantly to survival. cTnT further stratified the risk associated with other variables.
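Before continuing with the cTnT results, a minimal sketch of the two-stage variable selection used in the microalbuminuria analysis above: a univariate screen at p < 0.20, then backward elimination at p > 0.05 in a multivariate logistic model. The DataFrame layout and column names are hypothetical assumptions.

    # Two-stage predictor selection for a binary outcome (sketch).
    import pandas as pd
    import statsmodels.api as sm

    def select_predictors(df: pd.DataFrame, outcome: str, candidates: list):
        # Stage 1: keep candidates with univariate p < 0.20.
        kept = []
        for var in candidates:
            X = sm.add_constant(df[[var]])
            p = sm.Logit(df[outcome], X).fit(disp=0).pvalues[var]
            if p < 0.20:
                kept.append(var)

        # Stage 2: backward elimination, dropping the worst p > 0.05 each round.
        while kept:
            X = sm.add_constant(df[kept])
            pvals = sm.Logit(df[outcome], X).fit(disp=0).pvalues.drop("const")
            worst = pvals.idxmax()
            if pvals[worst] <= 0.05:
                break
            kept.remove(worst)
        return kept

    # e.g. select_predictors(data, "microalbuminuria", ["crp", "sbp", "smoking"])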
Thus, among patients with EF<55%, 2-year survival was 97%, 88% and 60% (p=0.0003) in patients with cTnT <0.01, 0.01-0.03 and >0.03, respectively. Similarly, these cTnT ranges stratified risk in patients with low albumin. Conclusion: An elevated pre-Tx cTnT is a strong and independent predictor of reduced early post-Tx survival. cTnT allows stratification of risk in patients who have other risk factors such as low EF, low serum albumin and dialysis >2 years. In all patients, independent of any other variables, a normal cTnT was an excellent predictor (97%) of survival.

Abstract# 545 Validation of Framingham Risk Assessment by Actual Cardiovascular Event Data in Renal Transplant Recipients. Alloway, 2 Michael Cardi, 3 Gautham Mogilishetty, 2 Shazad Safdar.

Excellent Outcome after Liver Transplantation in Children with Cystic Fibrosis. Some studies have reported benefits of liver transplantation (LT) in CF patients, but large outcome studies are not available. We report the outcomes of a large cohort of CF patients undergoing LT. Methods: Pre- and post-LT patient characteristics, post-LT morbidity and mortality, and patient and graft survival were analyzed; 2702 patients (age <18 yr) received a 1st isolated LT. 62 CF patients were listed for a 1st LT; neither waitlist deaths nor the probability of death from time of listing differed from non-CF patients. 42 (1.6%) CF patients underwent LT, with an average follow-up of 3 yrs (0-10 yrs), average PELD of 5.4 (86.5% had a PELD <10), and median age of 11.9 yrs (0.7-17.5). Graft survival in CF patients was 85.0%, 81.2%, and 81.2% at 1, 3, and 5 yrs, compared to 85.6%, 80.7%, and 78.1%. Rejection rates were not different (50.6% CF vs 56.4% non-CF @ 5 yrs), with 50% of these patients requiring dialysis. Standardized height and weight scores showed no improvement over 2 years of follow-up in the CF patients (height Z -1.4 at Tx to -1.4 at 2 yrs, weight Z -1.2 to -1.5), but tended to improve in the non-CF group. In addition, death rates from time of listing are not increased compared to non-CF patients. These data support LT as a treatment for CF liver disease, but studies investigating the lack of growth improvement and the increased renal complications in these patients may further improve outcomes.

full CNI group. Conclusion: Compared to full CNI, low CNI/MMF a) allows renal function to recover in patients with impaired renal function at the time of LTx and b) preserves long-term renal function. CNI sparing in combination with MMF may become

Cellular Islet Autoimmunity Influences Clinical Outcome of Islet Cell Transplantation. Methods: Twenty-one T1D patients received cultured islet cell grafts prepared from multiple donors and transplanted under anti-thymocyte globulin (ATG) induction and tacrolimus plus mycophenolate mofetil (MMF) maintenance immunosuppression. Immunity against auto- and alloantigens was measured before and during one year after transplantation. Cellular auto- and alloreactivity was assessed by lymphocyte stimulation tests against autoantigens and by cytotoxic T lymphocyte precursor assays, respectively. Humoral reactivity was measured by auto- and alloantibodies. Clinical outcome parameters remained blinded until correlated with the immunological parameters. Results: All patients showed significant improvement of metabolic control, and 13 out of 21 became insulin-independent.
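Returning briefly to the cTnT study above: a minimal sketch of its survival stratification by pre-transplant cTnT category (<0.01, 0.01-0.03, >0.03 ng/ml), using Kaplan-Meier estimates and a log-rank test. The DataFrame layout is a hypothetical assumption.

    # Kaplan-Meier survival stratified by cTnT category (sketch).
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import multivariate_logrank_test

    def ctnt_group(x: float) -> str:
        return "<0.01" if x < 0.01 else ("0.01-0.03" if x <= 0.03 else ">0.03")

    def km_by_ctnt(df: pd.DataFrame):
        """df columns: years (follow-up), event (death/major cardiac event), ctnt."""
        df = df.assign(group=df["ctnt"].map(ctnt_group))
        for name, sub in df.groupby("group"):
            km = KaplanMeierFitter(label=name).fit(sub["years"], sub["event"])
            surv2 = float(km.survival_function_at_times(2.0).iloc[0])
            print(name, "2-yr survival:", round(surv2, 3))
        test = multivariate_logrank_test(df["years"], df["group"], df["event"])
        print("log-rank p =", test.p_value)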
Multivariate analyses showed that the presence of cellular autoimmunity before and after transplantation was associated with delayed insulin-independence (p=0.001 and p=0.01, respectively) and with lower circulating C-peptide levels during the first year after transplantation (p=0.002 and p=0.02, respectively). 7/8 patients without pre-existent T-cell autoreactivity became insulin-independent, versus 0/4 patients reactive to both islet autoantigens GAD and IA-2 before transplantation. Autoantibody levels and cellular alloreactivity were not associated with outcome. Conclusions: Cellular islet-specific autoimmunity affects the clinical outcome of islet cell transplantation under ATG-tacrolimus-MMF immunosuppression.

BMP-7 Is Downregulated & TGFβ1 to BMP-7 Ratio Favors EMT during Acute Rejection of Human Renal Allografts.

Allospecific CD154+ T-Cells Predict Rejection Risk and Measure Immunosuppressive Effect after Abdominal Organ Transplantation in 100 Recipients. Methods: Allospecific CD154+ T-cells were measured in <24 hours with polychromatic flow cytometry to identify Rejectors (who had experienced acute cellular rejection within 60 days post-transplantation) in single mixed leukocyte responses (MLR) from 100 cross-sectional recipients: 88 children with liver or intestine allografts, and 12 adults with renal allografts. Where possible, results were correlated with proliferative alloresponses measured by CFSE-dye dilution (n=45), allograft biopsies (n=46), and expression of CTLA4, a negative T-cell costimulator which antagonizes CD154-mediated effects (n=52). Results: In the first 33 children, logistic regression identified donor-specific, memory CD154+ T-cytotoxic cells (Tc) as enhanced among Rejectors compared with Non-Rejectors (408±231 vs 90±36 per 10,000 cells, p=0.003), relatively drug-resistant (r with drug levels = -0.5, p=NS), and with the greatest sensitivity/specificity (>93%) for Rejectors.

Noninvasively Developed Molecular Signatures Accurately Predict Acute Rejection of Human Renal Allografts.

greater emotional well-being (SF-36) and felt that their transplant interfered significantly less with various aspects of their life (IIRS). Conclusions: Findings highlight the potential utility of assessing attachment style in transplant populations.

CNI Sparing in De Novo Renal Transplantation: 3-Year Results from the Symphony Study.

Background: Single-center non-randomized results with steroid avoidance have shown patient and graft benefits. Methods: 130 unsensitized, primary kidney recipients, 0-21 yrs of age, were enrolled from 12 US transplant programs (2004-2006) in a prospective 1:1 randomized multicenter study of steroid-free (SF) vs. steroid-based (SB) immunosuppression with matched demographics. 24.1% of SF and 27.5% of SB patients were African American, and 15.5% of SF vs. 8.7% of SB had ESRD from FSGS. SF patients received extended (6 mo) daclizumab induction vs. standard (2 mo) induction in the SB group. Patients in both arms received tacrolimus and MMF maintenance. Protocol biopsies were performed at 0, 6, 12 and 24 mo, and for renal dysfunction. Primary endpoints were differences in standardized height scores and biopsy-proven acute rejection (BPAR) at 1 year. Results at 1 year: 60 SF and 70 SB patients were enrolled; 10 SF and 16 SB were 0-5 yrs of age. Patient survival was 100% in both arms. Graft survival was similar (96.7% in SF vs. 98.6% in SB).
Intent-to-treat median delta height SDS scores from baseline for the different age groups were: 0.58 for SF and 0.48 for SB in the 0-5 yr olds (p=0.80); 0.34 for SF and 0.34 for SB in the 6-12 yr olds (p=0.86).

Protection of Liver Ischemia Reperfusion Injury by Silencing of TNF-α and Complement 3 Genes. Roberto Hernandez-Alejandro, 1 Xusheng Zhang, 2 Dong Chen, 2 Xiufen Zheng, 2 Hongtao Sun, 2 Weihua Liu, 2 Marianne Beduhn, 2 Aminah Shunnar, 2 Motohiko Suzuki, 2 Norihiko Kubo, 2 Bertha Garcia, 2 Anthony Jevnikar. 1,2

Living kidney donation is rapidly increasing worldwide to offer a partial (?) solution for the numerous ESRD wait-listed pts. In spite of properly followed guideline criteria

Conclusion: DCD donors are a viable source of liver allografts for transplantation. Patients who receive DCD livers have outcomes comparable to subjects who receive grafts from brain-dead donors. Use of DCD livers from donors over 60 years of age is accompanied by a higher incidence of retransplantation and biliary complications.

Background: Hypothermic machine perfusion (HMP) is in its infancy in liver transplantation (LTx). Potential benefits include diminished reperfusion injury and improved early function. Methods: The study was designed as a Phase 1 trial of liver HMP. Exclusion criteria included: multiple-organ recipients, MELD>35, ICU patients, and patients >65 years of age. Donor livers >65 years, biopsies with >30% macrosteatosis, and DCD livers were also ineligible for HMP. Seventeen enrolled patients were transplanted with livers that underwent HMP for 4-7 hours using dual centrifugal perfusion with Vasosol solution at 4-6°C. Patient, operative and early outcome variables were recorded. We compared outcomes to 17 matched cold-stored (CS) controls from the same era. Results: All 17 HMP grafts functioned immediately by the usual clinical criteria, with intraoperative bile production. Results are summarized in Table 1.

Synergy between IL-6 and TNFα Promotes T Cell Alloreactivity and Impairs the Graft-Prolonging Effects of Costimulatory Blockade. Hua Shen, 1 Bethany M. Tesar, 1 Wendy E. Walker, 1 Daniel R. Goldstein. 1 1 Internal Medicine, Yale University, New Haven, CT.

A Novel Role of Th17 Cells in Allograft Rejection and Vasculopathy. Francesca D'Addio, 1 Jesus Paez-Cortez, 1 M. Javeed Ansari, 1 Laurie Glimcher, 2 John Iacomini, 1 Mohamed Sayegh, 1 Xueli Yuan. 1 1 Transplantation Research Center, Renal Division, Brigham and Women's Hospital, Boston, MA; 2 Harvard School of Public Health, Boston, MA. Introduction: The transcription factor T-bet plays a crucial role in Th1/Th2 development. Here, we investigated the role of T-bet in Th17 differentiation and the function of Th17 cytokines in allograft rejection using an MHC class II-mismatched model of cardiac allograft vasculopathy. Methods/Results: Cardiac allografts from bm12 mice were transplanted into wild-type as well as T-bet- and IFN-γ-deficient C57BL/6 recipients. T-bet-/- mice showed significantly accelerated allograft rejection (MST=14.75±0.96 days). However, as previously reported, all IFN-γ-/- and the majority of C57BL/6 mice accepted grafts for greater than 60 days. Upon in vitro stimulation of recipient splenocytes by irradiated donor cells, T-bet-/- and IFN-γ-/- lymphocytes produced significantly less IFN-γ and more Th2 cytokines.
Interestingly, production of the proinflammatory cytokines IL-17 and IL-6 was significantly higher in T-bet-/- mice (337±94.4 and 339±10.5 pg/ml) than in C57BL/6 (135.1±68.3 and 75.3±25.9 pg/ml; P=0.0399 and 0.0001 vs. T-bet-/-) and IFN-γ-/- (95.8±29.8 and 208±44.5 pg/ml; P=0.0135 and 0.0093 vs. T-bet-/-) mice. In vivo administration of an IL-17 neutralizing antibody (mab421) significantly prolonged survival of bm12 hearts (MST>30 days, p<0.01 compared to 15.3±2.6 days in the control IgG group) in T-bet-/- mice. Immunofluorescence staining of bm12 hearts harvested from T-bet-/- recipients indicated that both CD4 and CD8 infiltrating lymphocytes produced IL-17. However, T-bet-CD4 double-knockout mice did not reject bm12 heart grafts, nor did the grafts exhibit chronic vasculopathy. In contrast, T-bet-CD8 double-knockout mice rejected (MST: 18.9±3.2 days). Splenocytes from T-bet-CD4 knockouts produced significantly lower IL-6 (12.9±3.8 pg/ml) and IL-17 (7.9±4.6 pg/ml) than observed in T-bet knockouts (940±176.3 and 210.2±73.4 pg/ml) and T-bet-CD8 knockouts (1240.5±215.6 and 113.1±61.2 pg/ml, respectively) when re-stimulated with donor cells, while there was no significant difference in IFN-γ production.

Induction in the Elderly Transplant Recipient: An Analysis of the OPTN/UNOS Database. Suphamai Bunnapradist, 1 Steven Takemoto, 2 Jagbir Gill, 1 Tariq Shah. 2 1 Medicine-Nephrology, UCLA, LA, CA; 2 Medicine-Nephrology, National Institute of Transplantation, LA, CA.

We examined the incidence and mortality implications of cerebrovascular events (CVE) after kidney transplant. We also compared variations in risk on the transplant waitlist and after allograft failure. Methods: We used registry data from the US Renal Data System to retrospectively investigate ischemic stroke (IS), hemorrhagic stroke (HS) and transient ischemic attacks (TIA) among 29,614 adults who received kidney transplants in 1995-2002 with Medicare as primary payer. Patients with prior indications of CVE in the registry were excluded. We ascertained events from billing claims and estimated the incidence of first events by the product-limit method. At-risk time was censored at: loss of Medicare, the 3-yr transplant anniversary, non-CVE death, or end of study (12/31/2002). Cox regression was used to identify independent correlates of CVE, and to examine CVE events as time-dependent mortality predictors. We estimated CVE incidence after graft failure among patients without CVE diagnoses prior to graft loss (n=2,599), and among

The Association between Hyperuricemia at Six Months after Kidney Transplantation and the Development of New Cardiovascular Disease.

Many studies have previously reported safe withdrawal of prednisone (PW) late after kidney transplantation (KTx). To determine the best immunosuppression regimen during PW, we performed a prospective trial in which stable KTx patients were randomized to either a CsA- or an 'S' (sirolimus)-based regimen. Methods: All patients received antibody induction therapy at the time of KTx and were maintained on CsA, P and Cellcept®. Patients were excluded if they had >1 acute rejection, >1 g/d proteinuria or serum creatinine >2.5 mg/dl. 161 patients were enrolled, and data are presented for 141 patients with >52 weeks of follow-up (f/u), with a mean f/u of 112.2±29.2 weeks. No differences were observed in baseline characteristics between the groups. All patients were then randomized to either CsA (n=69) or 'S' (n=72), and Cellcept® was converted to an equivalent dose of MS. CsA was dosed by the C2 (2-hour) level with a goal level of 600 ng/ml.
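As an aside on the registry CVE analysis above: a minimal sketch of how its at-risk time could be constructed, censoring follow-up at the earliest of loss of Medicare, the 3-year transplant anniversary, non-CVE death, or end of study. Column names are hypothetical assumptions.

    # Build time-to-first-CVE with multiple administrative censoring dates (sketch).
    import pandas as pd

    END_OF_STUDY = pd.Timestamp("2002-12-31")

    def add_risk_time(df: pd.DataFrame) -> pd.DataFrame:
        """df columns (datetimes): tx_date, first_cve, medicare_end, death_date."""
        censor = pd.concat(
            [df["medicare_end"],
             df["tx_date"] + pd.DateOffset(years=3),
             df["death_date"],  # non-CVE death; CVE deaths count as events
             pd.Series(END_OF_STUDY, index=df.index)],
            axis=1).min(axis=1)
        stop = df["first_cve"].fillna(censor).clip(upper=censor)
        out = df.copy()
        out["event"] = df["first_cve"].notna() & (df["first_cve"] <= censor)
        out["years"] = (stop - df["tx_date"]).dt.days / 365.25
        return out  # (years, event) then feed a product-limit (KM) estimator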
The sirolimus target level was 8 ng/ml. Results: 5 patients withdrew from the study, and 12 patients on S returned to the CsA regimen because of side effects. 5 patients in the CsA group and 1 patient in the 'S' group had AR (3 of them due to drug non-compliance). Death-censored graft survival was 100%. The mean CsA drug level achieved was 664±188 ng/ml, and the mean 'S' drug level was 8.2±1 ng/ml. No significant differences were noted in hematological values or BP measurements.

Purpose: The clinical significance of C4d positivity in patients with acute rejection is well defined, but its significance in stable graft function is undetermined. This study was performed to evaluate the clinical outcome of protocol biopsy-proven C4d-positive renal transplants with stable graft function in the early posttransplantation period. Methods: 151 renal allograft biopsies were included. Protocol biopsies (n=79) were performed on stable allografts on day 14 posttransplantation, and indication biopsies (n=72) were performed on dysfunctioning allografts. The incidence of C4d positivity was compared between protocol and indication biopsies. Clinical characteristics, biopsy findings, graft function, acute rejection episodes, and graft survival rates were compared between the C4d-positive and C4d-negative grafts in each group. Results: C4d deposition was detected in 4 of 79 protocol biopsies (5.1%), versus 9.7% (7 of 72) of indication biopsies. The histological finding in C4d-positive protocol biopsies was minimal tubulointerstitial inflammation. In contrast, the findings in C4d-positive indication biopsies were varied, including acute humoral rejection, acute cellular rejection, acute tubular necrosis and calcineurin inhibitor toxicity. In the protocol biopsy group, graft function during the year after biopsy, acute rejection rate, and cumulative graft survival did not differ between the C4d-positive and C4d-negative grafts. All C4d-positive allografts maintained stable graft function without any antirejection therapy. In the indication biopsy group, graft function during the year after biopsy and acute rejection rate did not differ between the C4d-positive and C4d-negative grafts. However, the cumulative graft survival rate was worse in the C4d-positive grafts than in the C4d-negative ones (p=0.008). Conclusion: C4d positivity associated with allograft dysfunction indicates a poor graft outcome. However, C4d-positive allografts with stable graft function in the early posttransplantation period take an indolent course.

Methods: This was a retrospective single-centre study reviewing all adult patients who had a kidney transplant biopsy between April 2003 and October 2006 at Guy's Hospital. Results: 45 patients had diffuse (>50%) C4d staining out of 228 who had kidney transplant biopsies in this period and had been followed up within the centre. Of these 45 patients, 20 also had DSA prior to or at the time of biopsy. The fall in eGFR in this group a year post-biopsy was greater than in those with diffuse C4d staining but no DSA. The mean change in eGFR from the day of biopsy at one year was -3.1 ml/min/1.73m2 (±12.1) in those with DSA, compared with +17.7 ml/min/1.73m2 (±23.2) for those with C4d but without DSA. The changes in eGFR from the pre-biopsy baseline at one year showed a fall in both groups, but this was greater in those with DSA (-21.5 compared with -9.6 ml/min/1.73m2). Of 179 patients who never had diffuse or focal C4d staining on biopsy, only 81 had had DSA tested.
Of these, only 8 (10%) had a positive DSA result. Of these eight, four had features of rejection and four did not. One person in each of these groups is dialysis-dependent, and one person in the rejection group has an eGFR <15 ml/min/1.73m2. Although the numbers are small, this outcome appears to be worse than that of C4d-negative patients with no DSA but features of rejection, who in fact showed an improvement in eGFR from the day of biopsy of 14.4 ml/min/1.73m2, or an improvement from their pre-biopsy baseline of 1.3 ml/min/1.73m2, at one year. Conclusion: DSA is of additional value in evaluating the risk of graft failure. This appears to be of value in those with and without diffuse C4d staining on biopsy.

Utility of Post-Transplantation Flow Cytometry Crossmatching in Predicting Graft Outcomes. Michelle Willicombe, 1 Graham Shirling, 2 Ray Fernando, 2 Henry Stephens, 2 Paul Sweny, 1 Peter J. Dupont. 1 1 Department of Renal Medicine, Royal Free Hospital, London, United Kingdom; 2 Histocompatibility Laboratories, Anthony Nolan Trust, London, United Kingdom. De novo development of donor-specific anti-HLA antibodies after renal transplantation may be associated with increased rejection and decreased graft survival. Flow-cytometry crossmatches (FCXM) have been suggested as a method of screening for the development of donor-specific HLA antibodies post-transplantation, but interpretation of crossmatch results can be confounded by antibodies directed against antigens other than HLA. We assessed the impact of developing a positive FCXM post-transplantation on clinical outcomes in a cohort of live-donor renal allograft recipients. Methods: 29 patients were studied. 23/29 (79%) received tacrolimus-based and 6/29 (21%) ciclosporin-based immunosuppression. Median follow-up was 24 months. All patients had negative complement-dependent cytotoxic (CDC) T-cell crossmatches pretransplantation. 4/8 (50%) in the group with a positive FCXM had an acute rejection episode in the first 12 months, compared with 10/21 (41%) in the group with a negative FCXM (p=NS). Graft function at 12 months was not different between the groups (positive FCXM: median creatinine 137 µmol/L; negative FCXM: median creatinine 141 µmol/L; p=NS). 2/8 grafts (25%) were lost within the first year in the positive FCXM group, compared with 1/21 (5%) in the group with a negative post-transplant FCXM (p=NS). The development of a positive FCXM post-transplantation alone is not predictive of adverse clinical outcomes. This may be explained by the poor correlation between a positive FCXM and the presence of antibody directed against mismatched donor HLA antigens. Surveillance for the development of donor-specific anti-HLA antibodies after transplantation may be best performed using high-resolution bead technologies rather than FCXM. Use of the FCXM alone, without establishing antibody profiles, is of limited predictive value.

based upon the amount of antibody (Ab) measured by the titer of donor-specific HLA antibodies (DSA) or the fluorescence intensity (FI) of the donor-specific single-antigen bead. It is unclear whether Abs identified by sensitive single-antigen bead and solid-phase assays (Flow PRA and Luminex) correlate with, and are predictive of, a clinically relevant endpoint (a positive FCXM). We evaluated the PRA, DSA, bead-specific antigen FI and FCXM reactivity of pre-transplant (pre-Tx) sera from 219 recipients of a deceased donor renal allograft to determine whether the amount of Ab (measured by FI) predicts a (+) FCXM.
Patients with a (+) DSA, a (+) FCXM and pre-Tx class I PRA ≥80% (N=103, mean PRA 91±6%), when compared to patients with pre-Tx PRA <80% (N=58, mean PRA 53±21%), had comparable mean FIs (7,407±4,305 vs 7,483±5,380), FI ranges (1,000-20,243 vs 1,552-17,153) and median FIs (5,681 vs 5,670). The class II comparisons showed the same pattern. Surprisingly, patients with a (+) DSA, a (-) FCXM and pre-Tx class I PRA ≥80% (N=22, mean 90±6%), compared to patients with pre-Tx PRA <80% (N=36, mean PRA 50±18%), also had comparable mean FIs (6,758±4,841 vs 6,412±3,844) and FI ranges (1,656-16,596 vs 2,162-11,637).

We have recently shown that pre-treatment of the donor with EPO causes a substantial reduction in the dysfunction and injury associated with the transplantation of kidneys recovered after cardiac death. 5-aminoisoquinolinone (5-AIQ), a potent water-soluble PARP inhibitor, has been shown to reduce renal ischemia-reperfusion (I/R) injury. The aim of our study was to determine the effects on the graft and the recipient of pre-treatment of the donor with EPO and treatment of the recipient with 5-AIQ in a porcine model of DCD kidney transplantation. Material/Methods: 24 Landrace pigs were killed by lethal injection; their kidneys were subjected to 30 min of warm ischemic time (WIT) and then transplanted after 24 h of cold storage in Celsior. In the pre-treated group, donors received a single dose of EPO (1000 IU/kg) 30 min before cardiac arrest. In the treated group, recipients received a continuous infusion of 5-AIQ (5 mg/kg/h) starting 10 minutes before reperfusion and maintained for 60 minutes. Blood, urine and renal tissue samples were collected at the end of the experiment for biochemical, histological and immunohistochemical (PARS, iNOS and COX-2) evaluation. Data analysis was performed with the GraphPad Prism statistical package; p<0.05 was considered statistically significant. Results: Transplantation of kidneys from DCD donors resulted in: a significant rise in the levels of creatinine, N-acetyl-β-D-glucosaminidase, glutathione-S-transferase, AST, LDH, ALT, fractional excretion of Na+, interleukins 1 and 6, malondialdehyde, and myeloperoxidase activity (p<0.05); a significant reduction in urine flow and creatinine clearance; and disturbances in the histological and immunohistochemical pattern. Administration of EPO before ischemia and 5-AIQ before reperfusion significantly reduced the biochemical (p<0.01), histological and immunohistochemical evidence of glomerular dysfunction and tubular injury. They also reduced systemic injury, the inflammatory response and oxidative stress. Conclusions: Pre-treatment of the donor with EPO and treatment of the recipient with 5-AIQ causes a substantial reduction in the dysfunction and injury associated with the transplantation of kidneys recovered after cardiac death.

In the HMP group, perfusate AST levels strongly correlated with recipient peak AST by linear regression (p<0.001). Conclusions: HMP of liver grafts provided safe and reliable preservation in our pilot series. Perfusate AST may allow pretransplant prediction of reperfusion injury. A larger randomized trial will be necessary to demonstrate the magnitude of the benefits of HMP over CS in LTx.

Purpose: Liver discard rates have increased in a large, urban organ procurement organization from 8% to 24%. This is likely the result of increased transplant surgeon willingness to consider organs close to the margin of clinical acceptability.
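As a brief aside on the HMP result above (perfusate AST correlating with recipient peak AST): a minimal sketch of that kind of simple linear regression. The arrays are hypothetical placeholders for illustration, not study data.

    # Ordinary least-squares regression of recipient peak AST on perfusate AST
    # (sketch; values below are assumed placeholders).
    from scipy import stats

    perfusate_ast = [12, 25, 31, 44, 58, 72, 90]        # U/L at end of HMP (assumed)
    peak_ast = [210, 380, 450, 700, 990, 1300, 1650]    # U/L post-transplant (assumed)

    res = stats.linregress(perfusate_ast, peak_ast)
    print(f"slope={res.slope:.1f}, r={res.rvalue:.2f}, p={res.pvalue:.4g}")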
However, the costs associated with recovering these organs are high if the liver is discarded. We endeavored to determine whether a model based on pre-recovery data could predict liver discard.

Introduction: We report our 6-year experience with the use of Campath-1H (C1H) in adult liver transplantation. From December 2001 until July 2007 we administered C1H induction with low-dose maintenance tacrolimus immunosuppression to 166 adult recipients of a liver allograft. The most common primary diseases were Laennec's cirrhosis (n=60), cryptogenic cirrhosis (n=31) and autoimmune disease: PSC (n=17), PBC (n=16) and AIH (n=14). The first dose of C1H was administered immediately before (n=74) or after (n=92) the transplant procedure. Follow-up was until September 2007. Results: Five-year patient and graft survival was 90% and 85%, respectively. There were 14 deaths, due to stroke (n=2), chronic rejection (n=2), failure to thrive/pneumonia (n=3), sepsis (n=2), hepatic artery thrombosis, HCC, prostate cancer, graft lymphoma and non-compliance (one each). Seven patients were retransplanted, for primary non-function (n=2), portal vein thrombosis (n=1), hepatic artery thrombosis (n=1), hepatitis B (n=1) and chronic rejection (n=2). Thirty-six patients had biopsy-proven rejection episodes: mild (n=28), moderate (n=9) or severe (n=3). The average tacrolimus 12-hour trough levels were 6.6, 5.2 and 2.78 ng/ml for the 1st, 3rd and 5th years post-transplantation, respectively. There was no significant difference so far in transplant outcome between patients who received C1H before or after the transplant procedure. Immunosuppression-related complications included: A) opportunistic infections, most commonly Herpes Zoster (n=19), CMV (n=3) and Herpes Simplex (n=3); B) neoplasms: skin cancer (n=3), Kaposi sarcoma (n=1) and lymphoma (n=2); and C) nephrotoxicity: five patients received a kidney graft, for diabetic nephropathy (n=2), nephrotic syndrome (n=1) and calcineurin-inhibitor nephrotoxicity (n=2). Conclusion: The use of C1H induction with half the usual dose of tacrolimus is an effective regimen in adult liver transplantation. The timing of C1H administration does not appear to affect the clinical outcome so far.

A. David Mayer, 1 James M. Neuberger. 1 1 The Liver Unit, Queen Elizabeth Hospital, Birmingham, United Kingdom. Introduction: In the prospective ReSpECT study, primary liver transplant patients were randomised to 1 of 3 groups: A) standard-dose tacrolimus (target trough level >10 ng/ml) for the 1st month; B) 2 g mycophenolate mofetil (MMF) IV until at least day 5 and 2 g PO thereafter, plus reduced-dose tacrolimus (target trough level ≤8 ng/ml); and C) MMF as in Group B, plus reduced-dose tacrolimus introduced on day 5 (target trough level ≤8 ng/ml), plus daclizumab on days 1 and 7. Steroids were given in all groups according to local centre protocol. Results at 1 year showed that 2 g MMF + delayed and reduced tacrolimus + daclizumab was associated with significantly less impairment of renal function compared with standard treatment. Here we present the results for the per-protocol (PP) population. Methods: The PP population, which was defined prior to the sub-group analysis, consisted of patients from the full analysis set who had no inclusion/exclusion criteria violation, had at least one creatinine clearance (CrCl) value beyond 6 months, were treated according to the protocol, and did not receive any prohibited medication during the first 14 days or for less than 1 week at any time during the study.
A composite endpoint comprising freedom from renal dysfunction (≥20% decrease from baseline in calculated CrCl), acute rejection, graft loss or death was also investigated. Results: The full analysis set included 181, 168 and 168 patients, whereas the PP population included only 69, 67 and 67 patients in Groups A, B and C, respectively. The mean difference in calculated CrCl from baseline to 1 year was significantly smaller in Group C compared with Group A (-10.7 ml/min vs -22.0 ml/min, p=0.038), but was not significantly different between Groups A and B (-20.5 ml/min). The incidences of death (n=2, 0, 1) and graft loss (n=1, 0, 0) were similar in all 3 groups. The incidence of the composite endpoint at 1 year was significantly lower in Group C compared with Group A in both the full analysis set and the PP population (PP population: 70% vs 93%, p<0.0001), but was not significantly different between Groups A and B (85%). The PP analysis confirms the results from the full analysis set: 2 g MMF + delayed and reduced tacrolimus + daclizumab is associated with less impairment of renal function compared with standard treatment, with no negative effect on death and graft loss.

Aims: Post-transplant lymphoproliferative disorder (PTLD) is a serious complication of solid organ transplantation that is closely associated with Epstein-Barr virus (EBV) infection. EBV+ PTLD lymphomas express several latent viral genes, including latent membrane protein 1 (LMP1), a proven oncogene that is essential for human B cell transformation. LMP1 is able to activate Erk, JNK, p38, NFκB and PI3K. The aim of this study was to determine whether LMP1 isolated from PTLD tumors differs in signaling ability from LMP1 derived from the B.95 strain of EBV, originally isolated from a patient with infectious mononucleosis. Methods: LMP1 variants isolated from a panel of EBV+ PTLD-associated B cell lines were cloned and sequenced. Inducible chimeric constructs containing the LMP1 C-terminus and the NGFR transmembrane domain were created for each tumor variant and expressed in the Burkitt B lymphoma cell line BL41. LMP1 signaling in BL41 clones was induced by crosslinking of NGFR. Activation of p38, Erk, Akt and JNK was assayed by Western blotting (WB) with phospho-specific antibodies. NFκB activation was assayed by WB for IκB, and cFos induction was analyzed by WB and the TransAM cFos binding assay. Results: All three tumor variants of LMP1, as well as the B.95 LMP1 isoform, were able to induce p38 activation within 30 min of NGFR crosslinking, while Akt and JNK were activated within 10 min. All variants showed a similar ability to activate NFκB. However, the tumor LMP1 variants induced prolonged Erk activation (up to 3 hrs), while the B.95 LMP1 variant induced a transient response. cFos is induced only during the sustained phase of Erk activation. Indeed, the tumor variants of LMP1, but not B.95 LMP1, were able to induce cFos protein. Similarly, cFos binding to the AP1 consensus site was only observed in tumor LMP1-induced nuclear lysates. Two mutations in the C-terminus, aa212 (S vs G) and aa366 (T vs S), are conserved in the tumor LMP1 variants compared to B.95 LMP1. Point mutation of either of these amino acids from the B.95 to the tumor-variant version allowed sustained activation of Erk and subsequent cFos induction and binding to the AP1 site. Conclusion: Tumor-derived LMP1 has an enhanced ability to induce the cFos oncogene, and this property can be localized to two amino acids in the C-terminus.
These findings suggest that these specific amino acid residues of LMP1 are important in determining whether EBV infection is benign or results in PTLD.

The Absence of Interferon Regulatory Factor-3 (IRF-3) Confers Protection Against Liver Ischemia and Reperfusion Injury through an IL-10-Independent Pathway. Elizabeth R. Benjamin, 1 Xiu-Da Shen, 1 Feng Gao, 1 Yuan Zhai, 1 Genhong Cheng, 1 Ronald W. Busuttil, 1 Jerzy W. Kupiec-Weglinski. 1 1 Surgery, Dumont-UCLA Transplant Center, Los Angeles, CA. Toll-like receptor 4 (TLR4)-mediated liver reperfusion damage after warm ischemia requires signaling through the MyD88-independent, IRF3-dependent pathway, with CXCL-10 (IP-10) playing a central role in injury development. Studies using CXCL-10 KO mice have shown that these mice are protected through an IL-10-dependent mechanism. We chose to investigate IRF3, upstream of CXCL-10, to further characterize its role in injury progression and to better understand the involvement of IL-10 in this pathway. METHODS: We used IRF3 KO mice and their WT counterparts in a model of partial hepatic warm ischemia with 1, 2, and 6 h of tissue reperfusion (n=2 KO, WT at 1 and 6 h; n=3 KO, WT at 2 h). WT bone marrow-derived macrophages were generated and stimulated with LPS to determine the kinetics of IL-10 production. Tissue was analyzed for histology, and mRNA levels were measured by qPCR. RESULTS: Kinetic studies showed peak IL-10 production at 1 and 2 h post-reperfusion (PR). On pathology, IRF3 KO mouse livers were protected from I/R injury both early PR, at 1 and 2 h, and at 6 h PR when compared to WT. Consistent with these data, IL-6 mRNA induction was decreased in IRF3 KO mice, as compared with WT, at 1, 2, and 6 h. Although IL-10 induction was maintained in the CXCL-10 KO mice, the IRF3 KO mice showed decreased levels of IL-10 at 1 h PR. By 2 h, IL-10 levels had normalized to WT. CONCLUSION: IRF3 KO mice are protected from liver I/R injury, with evidence of this protection as early as 1 h PR. Although CXCL-10 KO mice are protected from I/R injury with maintained IL-10 expression, the absence of the upstream molecule IRF3 confers protection in an IL-10-independent manner. These data suggest a novel mechanism of I/R injury mediated by IRF3 in the liver.

Background: Bone marrow (BM) transplantation may induce donor-specific tolerance to prevent rejection of allogeneic solid organs while maintaining immunity against infections and tumors. Currently, allogeneic BM transplantation is limited by donor T cell-mediated graft-versus-host disease (GVHD), as well as a variable requirement for recipient marrow ablation and high numbers of donor BM cells. Furthermore, sustained macro-chimerism has not yet been easily or predictably achieved in partially ablated patients or large animals. While rejection of allografts is mediated primarily by recipient T cells, recent studies have demonstrated the capacity of NK cells to reject allogeneic BM and to prevent long-term mixed chimerism. Thus, NK cells represent a barrier to long-term BM engraftment even with T cell tolerance. We have previously identified a novel type of regulatory T (Treg) cell with a "double negative" (DN) phenotype (TCRαβ+CD3+CD4-CD8-). DN-Treg cells can effectively suppress anti-donor T and B cell responses and prolong graft survival in allo- and xenotransplantation models. We therefore tested the capacity of DN-Treg cells to alter NK cell function. Methods: C57BL/6 BM cells were i.v.
injected into sub-lethally irradiated (6.5 Gy) CB6F1 (H-2b/d) mice in a "parent-to-F1" model, or into allo-disparate BALB/c mice. BM cells were co-transplanted with various numbers of C57BL/6 DN-Treg cells, or with CD4+ or CD8+ T cells as controls. Recipient spleen cells were collected 7 days later to detect donor progenitors in a colony-forming-unit (CFU) assay. Mice then received cardiac (n=8) or skin transplants (n=8) to confirm tolerance. We found that donor-derived DN-Treg cells suppress NK cell-mediated allogeneic BM graft rejection in both "parent-to-F1" and fully MHC-mismatched BM transplantation models. Adoptive transfer of DN-Treg cells with donor BM cells promoted the establishment of stable mixed chimerism and donor-specific tolerance to BM-donor cardiac and skin grafts (MST>160 days), without inducing GVHD in sub-lethally irradiated mice. Perforin-deficient DN-Treg cells were unable to efficiently inhibit NK cell function, and donor BM did not engraft. These results demonstrate a potential approach to control innate immune responses and promote allogeneic BM engraftment and donor-specific tolerance through the use of DN-Treg cells.

The Framingham risk score (FRS) predicts cardiovascular (CV) risk in the general population, but may underestimate CV risk in kidney transplant (txp) patients (pts). The FRS has not previously been validated by prospective cardiovascular event (CVE) data collection in kidney txp pts. The purpose of this study was to validate the FRS against actual observed CVE data in kidney txp pts. Methods: CVE data were collected at routine intervals in our kidney txp pts and entered in a cardiovascular risk database. FRS was calculated from baseline to 7 yrs posttransplant (PTx). The individual FRS factors of age, sex, smoking, diabetes mellitus (DM), high-density lipoprotein (HDL), total cholesterol (TC), and blood pressure (BP) were evaluated for their ability to predict actual CVE occurring after kidney txp. Pts with coronary artery disease (CAD) were excluded from the FRS analysis. CVE were defined as sudden death, myocardial infarction, angina, and cerebrovascular accident/transient ischemic attack. FRS factors were evaluated by Cox proportional hazards in univariate (UVA) and multivariate (MVA) models.

Rho kinase (ROK) modulates the calcium sensitivity of vascular smooth muscle cells and contributes to the regulation of peripheral vascular tone in man. In essential hypertension, increased ROK activity contributes to the generation of vascular resistance. Arterial hypertension is a common complication in renal transplant recipients. In this study we were interested in the role of ROK in systemic hemodynamics in hypertensive renal transplant recipients (TX). We tested fasudil, a specific inhibitor of ROK. 10 TX and 10 matched control subjects (C) received either fasudil (1600 µg/min) or placebo intravenously over a period of 30 minutes. Peripheral blood pressure and heart rate were recorded every 5 min over a total of 120 minutes. Measurements for pulse wave analysis (SphygmoCor VT) were performed every 10 minutes during this period. Statistics were by ANOVA for repeated measurements. Compared to placebo, fasudil significantly reduced peripheral mean arterial pressure (p<0.05; Figure 1) and increased heart rate (+6.3±1.1 bpm, p<0.001) in TX but not in C.
Likewise, central systolic pressure (p=0.006; Figure 2), augmented pressure and the augmentation index were decreased in TX only. We conclude that acute inhibition of ROK by fasudil consistently and effectively lowers blood pressure in TX on calcineurin inhibitor-based immunosuppression. Interestingly, ROK inhibition also reduces central blood pressure and arterial stiffness in these patients. Improvement of both of these parameters has been linked to a reduction in cardiovascular morbidity and mortality in large trials. Hence, ROK inhibition might prove beneficial for the treatment of hypertension in renal transplant recipients.

Death with function causes half of late KTx failures, and cardiovascular disease (CVD) is the most common cause of death. Chronic kidney disease (CKD) is a CVD risk equivalent, justifying aggressive risk reduction with blood pressure (BP) control, statins, aspirin, and use of angiotensin-converting enzyme inhibitors (ACEI) and angiotensin receptor blockers (ARB). DeKAF is an NIH-sponsored prospective observational study examining causes of KTx failure at 7 transplant centers in the US and Canada, with current enrollment of over 1800 subjects. We examined the use of cardioprotective medications among patients transplanted after 10/1/05 with at least 6 months of follow-up, focusing on subgroups with preexisting diabetes (DM) and/or CVD.

We conducted a retrospective cohort study to assess the prevalence of, and predictors for, the development of hyperuricemia at 6 months after kidney transplantation, and the association between hyperuricemia and clinical outcomes including patient and graft survival, new cardiovascular events and chronic allograft nephropathy (CAN). Adult patients who underwent kidney transplantation at Mount Sinai Medical Center between 1/1/2001 and 12/30/2004 were included. Patients who died or lost the allograft within 6 months after transplantation were excluded from the analysis. Of the 307 patients with a functioning allograft at 6 months after transplantation, 163 (53%) had normal uric acid levels and 144 (47%) had hyperuricemia. After adjustment for age, race and sex, receiving a cadaveric kidney, having an eGFR <50 ml/min, and taking diuretics or cyclosporine were associated with higher odds of hyperuricemia. Over a mean of 4.3 years of follow-up, 81 patients had one or more of the pooled outcomes: 40 had new cardiovascular events, 41 developed biopsy-proven CAN, 4 died, and 20 had graft failure. Kaplan-Meier survival curves demonstrated that the pooled outcomes occurred more frequently in hyperuricemic patients (Figure, p<0.001). Because of the association between low eGFR and hyperuricemia, we analyzed the clinical outcomes in patients with low and normal eGFR. While 44.7% of hyperuricemic patients with an eGFR <50 ml/min had one of the pooled outcomes, the figure was 20.8% in patients with normal uric acid levels (p=0.038). Among patients with an eGFR ≥50 ml/min, 19.4% of normouricemic and 25% of hyperuricemic patients had one of the events. These results suggest an important association between hyperuricemia at 6 months after transplantation and new cardiovascular events, biopsy-proven CAN, and graft loss in kidney transplant recipients with decreased allograft function.

Background: Long-term survival after liver transplantation (LT) is now the rule rather than the exception. Hence, assessment of outcomes for children after LT must consider not only the quantity, but also the quality, of life years survived and restored.
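Returning briefly to the Framingham validation study described earlier in this passage: a minimal sketch of the univariate/multivariate Cox proportional-hazards screening of FRS factors it reports. Column names are hypothetical; years/cve denote time to first event and an event indicator.

    # Univariate and multivariate Cox models over Framingham factors (sketch).
    import pandas as pd
    from lifelines import CoxPHFitter

    FACTORS = ["age", "male", "smoking", "dm", "hdl", "tc", "sbp"]

    def cox_uva_mva(df: pd.DataFrame) -> pd.DataFrame:
        uva = {}
        for f in FACTORS:  # one factor at a time (UVA)
            cph = CoxPHFitter().fit(df[["years", "cve", f]],
                                    duration_col="years", event_col="cve")
            uva[f] = cph.summary.loc[f, ["exp(coef)", "p"]]
        mva = CoxPHFitter().fit(df[["years", "cve"] + FACTORS],
                                duration_col="years", event_col="cve")
        print(mva.summary[["exp(coef)", "p"]])  # multivariate hazard ratios (MVA)
        return pd.DataFrame(uva).T              # univariate HRs and p-values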
Aim: To examine key HRQOL themes after pediatric LT raised by both recipients and their parent proxies, with evaluation by time since LT (1-3 yrs, 3-5 yrs, 5-10 yrs, and >10 yrs). Methods: Semi-structured 1:1 item-generation interviews were conducted in person with children (C) and parents (P) at the time of ambulatory LT follow-up at 4 pediatric LT programs in Canada and the UK. All interviews were audio-taped, transcribed verbatim, and subjected to content analysis utilizing QSR NVivo 2.0 software for HRQOL-related theme generation. The participants interviewed were part of a larger research program aimed at developing a disease-specific instrument to assess HRQOL in children after LT. Results: Data representing 91 (47% male) pediatric LT recipients were obtained from a total of 146 (62 C, 84 P) item-generation interviews. Median recipient age at LT was 2.4 (range, 0.05 to 17) yrs, for primary indications including biliary atresia (45%), fulminant liver failure (17.6%), metabolic liver disease (4.4%), malignancy (8.8%) and others (24.2%). Median patient age at the time of interview was 9.9 (range, 1.1 to 18.8) yrs. Themes emerging at all time points post-LT included infection risks, limitations on physical activities, side effects from immunosuppression medications, educational supports, and ongoing bloodwork. Themes identified within the medium (1-5 yrs) follow-up included worries about rejection episodes, the need for future re-transplantation, school absenteeism, and altered sibling and family dynamics. The impact of living with a surgical scar was a more frequent theme among recipients >3 yrs from LT. As time from LT increased to >10 yrs, the themes suggested a focus on normalization and health-promoting behaviours, along with expressed desires to be like healthy peers. Conclusions: Unique HRQOL themes, not captured by currently available generic HRQOL tools, emerged from the item-generation interviews. The HRQOL themes identified after pediatric LT suggest the importance of considering time trajectories from LT, and a focus on elements of 'everyday life' apart from LT.

The shortage of cadaveric donors has led many transplant centers to expand their criteria for accepting life-saving organs. Utilization of donation after cardiac death (DCD) donors has been estimated to increase the number of cadaveric donors. We report our experience with a recently established DCD program at a pediatric hospital and the outcome of the transplanted grafts. Methods: In 2005 a protocol for DCD was established at a free-standing pediatric hospital. From 2005 to 2006, all patients undergoing withdrawal of care were evaluated for DCD. Patients meeting criteria for DCD underwent withdrawal by the critical care team, and organ retrieval was initiated if asystole was reached in less than 60 minutes. In addition, one DCD liver was imported and was included in the liver results. Results: During the 2-year study period, 7 patients (25% of total donors) underwent DCD, resulting in 14 organs (11 kidneys and 3 livers) transplanted. The 7 cases had a mean donor age of 9 yrs (range 3-16), WIT of 22 min (9-55), time with sBP <60 of 11 min (7-15), and time from asystole to aortic flush of 8 min (6-12). Four kidneys were transplanted locally with CIT 5-14 hrs, no DGF, and one-month creatinine of 1.2-1.7. The remaining 7 kidneys were exported for transplant.
Four livers were transplanted locally, with donor age 7 mth-16 yrs, WIT 12-28 min, recipient age 14 mth-61 yrs, CIT 6-15 hrs, peak AST 190-3630, day-7 AST 33-44, peak INR 1.4-1.8, day-7 INR 1.0-1.1, one-month total bilirubin 0.3-1.0, and graft survival of 1.5-2.9 yrs. There were no vascular or biliary complications. Conclusions: A protocol for DCD at a pediatric hospital increased the number of pediatric donors by 25%. Liver and kidney grafts from pediatric DCD donors demonstrated excellent graft function and survival.

Liver Retransplantation in Children: A 21-Year Single-Centre Experience. Christophe Bourdeaux, Andrea Brunati, Magda Janssen, Jean-Bernard Otte, Etienne Sokal, Raymond Reding. Pediatric Liver Transplant Program, Université Catholique de Louvain, Saint-Luc University Clinics, Brussels, Belgium. When graft failure occurs in liver recipients, secondary transplantation represents the only chance of long-term survival. In such instances, however, several surgical and immunological aspects should be carefully considered with respect to their impact on final outcome. In the present study, the epidemiology and outcome of graft loss following primary pediatric liver transplantation (LT) were analysed, with the hypothesis that early retransplantation (reLT) might be associated with lower immunological risks when compared to late reLT. Between March 1984 and December 2005, 745 liver grafts were transplanted into 638 children at Saint-Luc University Hospital, Brussels. Among them, a total of 90 children (14%) underwent 107 reLT, and were categorized into two groups (early reLT, n=58; late reLT, n=32) according to the interval between the two transplant procedures (< or > 30 days). Ten-year patient survival was 85% in recipients with a single LT, versus 61% in recipients requiring reLT (p=0.001). Ten-year patient survival rates were 59% and 66% for early and late reLT, respectively (p=0.423), with corresponding graft survival rates of 51% and 63% (p=0.231). Across successive eras, the rate of reLT decreased from 17% to 10%, with a parallel, progressive improvement of post-reLT outcome. No recurrence of chronic rejection (CR) was observed after reLT for CR (0/19). Two children developed a positive cross-match at reLT (2/10, 20%); both had been retransplanted late for CR secondary to immunosuppression withdrawal following a post-transplant lymphoproliferative disease. In summary, the need for reLT has decreased over the years, with a parallel improvement in its outcome. The results presented could not demonstrate better results for early reLT when compared to late reLT. The latter did not seem to be associated with higher immunological risk, except for children with immunosuppression withdrawal after the first graft.

Background: A serum conjugated bilirubin greater than 100 µmol/L (CB100) in neonates who receive parenteral nutrition (PN) has been demonstrated to be a predictor of end-stage liver disease requiring transplantation. Given the recent interest in the role of omega-6 lipids in the development of parenteral nutrition-associated liver disease (PNALD), we sought to examine, in a multiple-variable model, the role of days of maximal lipid (>2.5 g/kg/day) in the development of this outcome. Methods: Between 2003 and 2004, data were collected prospectively on all neonates undergoing an abdominal surgical procedure.
Univariate logistic regression models for the prediction of CB100 were developed with the following predictors: gestational age, percentile weight, percent predicted small bowel and colonic length, resection of the ileocecal valve, presence of a stoma, post-operative enteral tolerance, number of septic episodes, days of PN amino acids >2.5 g/kg/day, days of PN lipid >2.5 g/kg/day, and total days of PN. Univariate predictors significant at the 0.2 level were entered into a backward stepwise multiple-variable logistic regression. Results: 152 infants received PN post-operatively, and 22 developed CB100. Predictors that met criteria for consideration in the multiple-variable model were: age (p=0.079), weight (p=0.059), small bowel length (p=0.001), presence of a stoma (p=0.163), proportion of enteral feeds post-operatively (p=0.049), days of PN amino acids >2.5 g/kg/day (p=0.00), days of lipid >2.5 g/kg/day (p=0.00), and total days of PN (p=0.00). The final multiple-variable model, which had a negative predictive value of 97.6% and a positive predictive value of 52.6%, is presented in the table below. Our model suggests a key role of PN lipids and intercurrent septic events in the development of CB100 from PNALD. These data may provide targets, such as careful line care, reduction in the maximal lipid dose, or the use of alternate lipids such as omega-3 fatty acids, to prevent CB100, an identified marker of the need for subsequent liver transplantation in infants with PNALD.

Terminal renal failure occurs in more than 10% of liver transplant recipients after 10 years. We have previously shown that, besides the renal toxicity of calcineurin inhibitors, renal lesions may be related to diabetes, arterial hypertension, accumulation of hydroxyethyl starch (ELHOES), and the etiology of the liver disease. We hypothesized that these lesions may already be present at the time of liver transplantation (LT), a finding that could lead to adapting the perioperative management. This work investigated prospectively whether renal histopathological lesions were present before LT by systematically performing a renal biopsy by the endovenous route in 60 candidates for LT with end-stage liver disease. These patients were 58±10 years old and 46 were male; 10/60 had diabetes, and 21 had arterial hypertension. The liver disease was related to alcohol in 32 cases, HCV in 12 cases, HBV in 5 cases, and a cholestatic disease in 7 cases. At the time of the pre-LT workup, the biochemical parameters were: Child score 10±2, MELD score 18±4, prothrombin rate 50±12, serum creatinine 90±6 µmol/L, proteinuria 0.12±0.04 g/24h. Severe side effects related to the procedure were limited to 2 cases of macroscopic hematuria, lasting less than 24 hours. In 10 cases, the material obtained during the procedure did not allow histological analysis. Among the 50 samples available, 21 were considered normal; in 29 cases, lesions related to mesangial IgA glomerulonephritis (14 cases), diabetic glomerulosclerosis (12 cases), ELHOES accumulation (5 cases) and thrombotic microangiopathy (1 case) were found, often in combination. In 5 cases the lesions were severe, leading to combined kidney/liver transplantation in 2 cases. In conclusion, significant renal lesions are detectable in more than 50% of candidates for LT. Interestingly, the histological findings often combined lesions related to the liver disease with those related to an associated cause (diabetes, previous treatment with ELHOES or interferon).
The results of the histological analysis could help in deciding whether to perform a combined kidney/liver transplantation, to adapt the immunosuppressive regimen, or to abandon the LT project.

Because serum creatinine is one of the components of MELD, liver candidates with renal insufficiency have been transplanted in increasing numbers, with some candidates receiving a kidney along with the liver transplant. We aimed to compare liver graft outcomes for liver-alone (LTA) transplants with those from combined liver-kidney transplants (CLKT). A propensity score analysis was used to reduce the impact of selection bias in the comparison of outcomes in the two groups. Methods: Demographics, clinical factors and outcomes of LTA and CLKT recipients from 3/1/02 to 12/31/06 (N=10,388), obtained from the OPTN database, were used for the analysis. Univariate post-transplant survival rates were estimated using Kaplan-Meier survival, and multivariable post-transplant outcomes were analyzed using a Cox regression model with and without stratification by categories of the propensity score. The propensity score (the probability of receiving a CLKT) for each recipient was estimated using a logistic regression model. Several donor and recipient factors were included in both the Cox and logistic regression models. In this cohort, liver graft outcomes for CLKT were significantly better than those for LTA based on the multivariable analysis. The results were similar, although slightly less significant, when the model was adjusted for propensity. The superior outcomes of CLKT may be due to unobserved differences between these groups of recipients, reflecting data not currently captured by the OPTN.

The TGFβ1 to BMP-7 ratio was higher in the AR group (median ratio: 1635) compared to recipients with stable graft function and a normal biopsy (median ratio: 438, P=0.0002). Our observations that BMP-7 is specifically down-regulated during an episode of AR, and that the balance between TGFβ1 and BMP-7 is in favor of EMT, advance a mechanism for the deleterious impact of AR on the long-term outcome of human renal allografts.

The non-statistical bioinformatic approach identified 68 leader genes, the most highly interacting genes derived from the 343 SAM-gene list. An interaction map between the identified genes was calculated. This network is formed around 2 major clusters, a network of interleukins and a network of signal transduction, which allowed the identification of key genes such as BANK1 (a negative modulator of CD40-mediated Akt activation, thereby preventing a hyperactive B cell response) in blood from patients with operational tolerance, and IL7R (a specific marker absent on potentially regulatory CD4+CD25high T cells) in blood from patients with chronic rejection. We have identified, by a non-statistical analysis of peripheral blood gene expression in human kidney recipients, a cluster of strongly interconnected genes which could be a starting point for further analysis of the molecular mechanisms of kidney graft operational tolerance and chronic rejection.

Algorithm Based on Multiparameter Mixed Lymphocyte Reaction Assay for Tailoring Maintenance Immunosuppressants after Living Donor Liver Transplantation. Yuka Tanaka, Hideki Ohdan, Toshimasa Asahara. Department of Surgery, Hiroshima University, Hiroshima, Japan. Background: No reliable immunological parameters exist for identifying liver allograft recipients in whom immunosuppressants can be safely withdrawn.
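As an aside on the LTA vs CLKT comparison above: a minimal sketch of the propensity-score approach it describes, that is, a logistic model for the probability of receiving a CLKT, categorization of the score (quintiles chosen here as one reasonable option), and a Cox model stratified by those categories. Covariate and column names are hypothetical assumptions.

    # Propensity-score-stratified Cox regression for graft survival (sketch).
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter

    COVARIATES = ["meld", "age", "dm", "donor_age", "on_dialysis"]  # assumed

    def stratified_cox(df: pd.DataFrame) -> CoxPHFitter:
        # 1) Propensity: P(CLKT | covariates) via logistic regression.
        X = sm.add_constant(df[COVARIATES])
        df = df.assign(ps=sm.Logit(df["clkt"], X).fit(disp=0).predict(X))

        # 2) Categorize the score into strata.
        df["ps_stratum"] = pd.qcut(df["ps"], 5, labels=False)

        # 3) Cox model for graft loss, stratified by propensity stratum.
        cph = CoxPHFitter()
        cph.fit(df[["years", "graft_loss", "clkt", "ps_stratum"]],
                duration_col="years", event_col="graft_loss",
                strata=["ps_stratum"])
        return cph  # cph.summary gives the within-stratum CLKT hazard ratio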
Algorithm Based on Multiparameter Mixed Lymphocyte Reaction Assay for Tailoring Maintenance Immunosuppressants after Living Donor Liver Transplantation. Yuka Tanaka, Hideki Ohdan, Toshimasa Asahara. Department of Surgery, Hiroshima University, Hiroshima, Japan. Background: No reliable immunological parameters exist for identifying liver allograft recipients in whom immunosuppressants can be safely withdrawn. To minimize maintenance immunosuppressants, we established an algorithm for determining anti-donor alloreactivity based on a multiparameter mixed lymphocyte reaction (MLR) assay, in which the number and phenotype of alloreactive precursors can be quantified. We enrolled 68 adults undergoing living donor liver transplantation (LT). The initial immunosuppressive regimen comprised tacrolimus/cyclosporine and methylprednisolone, which were gradually tapered by 6 months after LT. Thereafter, therapeutic adjustments were determined by a policy of slow tapering in the case of normal liver function. The MLR assay was performed at 6-month intervals to monitor immune status. In this assay, CFSE-labeled PBMCs from recipients were used as responders; irradiated donor and third-party PBMCs were used as stimulators. After coculture, the responder cells were stained with anti-CD4 or anti-CD8 mAbs together with anti-CD25 mAb, followed by FCM analyses. The proliferation and CD25 expression of CD4+ and CD8+ T cell subsets in response to anti-donor and anti-third-party stimuli were analyzed; the immune status of LT patients was categorized as hypo-response, norm-response, or hyper-response for CD4+ T cells and as hyper-response or not for CD8+ T cells. Of the 68 patients, 33 had normal liver function at >6 months after LT. We examined the adjustment of immunosuppressants during the 6 months after MLR in these patients. In patients whose immune status was categorized as hyper-response for CD4+ or CD8+ T cells (n=4), immunosuppressants had to be increased. In patients with norm-response immune status (n=7), immunosuppressant tapering was abandoned. Immunosuppressant therapy was successfully tapered in patients with hypo-response immune status (n=22). Of 10 patients with hypo-response immune status at >3 years after LT, immunosuppressants were completely discontinued in 3. In these "operational tolerance" patients, the precursor frequency of anti-donor CD4+ T cells (mean=5.2±1.9%) was not reduced compared with that in non-tolerant patients, suggesting that donor-specific immune tolerance is maintained via inhibitory/suppressive mechanisms rather than via clonal deletion. Conclusion: The multiparameter MLR assay can provide a clinically validated rule for predicting the success of tailoring/weaning immunosuppression.
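The hypo-/norm-/hyper-response categorization above amounts to a decision rule over donor-directed proliferation relative to the third-party control. The sketch below is a schematic reconstruction in Python; the stimulation-index thresholds and the donor/third-party normalization are assumptions for illustration, not the study's validated cutoffs.

```python
from dataclasses import dataclass

@dataclass
class MLRResult:
    # Percentage of proliferating CD25+ responder cells in each condition.
    cd4_donor: float
    cd4_third_party: float
    cd8_donor: float
    cd8_third_party: float

def categorize(r: MLRResult, lo: float = 0.5, hi: float = 2.0) -> str:
    """Classify anti-donor reactivity relative to the third-party response."""
    cd4 = r.cd4_donor / max(r.cd4_third_party, 1e-9)
    cd8 = r.cd8_donor / max(r.cd8_third_party, 1e-9)
    if cd4 > hi or cd8 > hi:
        return "hyper-response"  # consider increasing immunosuppression
    if cd4 < lo:
        return "hypo-response"   # candidate for further tapering
    return "norm-response"       # hold the current regimen
```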
Psychological Factors Associated with Non-Adherence among Adolescents before and after Kidney Transplant. Nataliya Zelikovsky. Dept. of Pediatrics, Div. of Nephrology, The Children's Hospital of Philadelphia, Philadelphia, PA. Purpose: Little is known about psychosocial risk factors for poor adherence among pediatric transplant patients. Identification of variables that impact illness management can guide targeted interventions to improve adherence. Methods: A longitudinal study was conducted to determine whether quality of life (PedsQL), family functioning (FAD), and parent adjustment (PIP) would predict adherence in adolescent transplant patients. Psychological questionnaires were administered prior to and 12 months after the transplant. The Medical Adherence Measure (MAM), a semi-structured interview, was used to assess adherence, calculated as the percentages of prescribed doses missed and taken late. Results: 60 patients (mean age 14.32 ± 2.18 years; 75% male, 63% Caucasian) and their parents were evaluated at the time of listing for kidney transplant. The rate of non-adherence prior to transplant was high, with 90% of patients reporting some degree of non-adherence. Of these patients, 37% missed, and 28% took late, more than 10% of prescribed doses. On the quality of life measure, behavior issues were associated with missed (r=-.42, p=.01) and late doses (r=-.39, p<.05), and mental health issues were associated with late doses (r=-.38, p<.05). Adolescent reports of problems in affective responsiveness among family members were associated with missed (r=.33, p=.04) and late (r=.51, p=.001) doses. Missed doses were also associated with mother reports of difficulties with overall family functioning (r=.34, p<.05), communication (r=.41, p<.01) and role definitions (r=.33, p<.05) among family members. 49 of the families were re-evaluated one year after the kidney transplant; 65% had been on dialysis prior to transplant and 42% received living-related transplants. 58% of the patients reported some degree of non-adherence post-transplant and, using the more stringent criteria, 17% of patients reported missing and 27% reported taking late >10% of prescribed doses. Worse quality of life, such as limitations due to emotional problems (r=-.32, p<.05), behavioral problems (r=-.39, p<.01), and difficulties with family cohesion (r=-.32, p<.05), was related to worse adherence. Discussion: Adolescent quality of life in behavioral and emotional domains and family functioning play a significant role in adherence both before and after transplant. Programs to improve adherence among transplant patients should incorporate psychosocial supports and behavioral interventions to improve adjustment of patients and families.

Kidney transplantation leads to marked improvements in health, yet transplant (tx) recipients often have difficulty with sexual functioning, which can affect quality of life. The specific sexual concerns of tx recipients remain under-investigated. The purposes of this study were to 1) further establish the psychometric properties of the Sexual Concerns Questionnaire (SCQ), including reliability and preliminary construct validity, and 2) identify the sexual concerns of kidney tx recipients. The SCQ was answered by 390 kidney tx recipients who rated each item on a 0 (not at all) to 5 (extremely) scale. Cronbach's alpha was calculated to determine the reliability of the SCQ; an alpha of .83 indicated that the questionnaire was reliable. Exploratory factor analysis (EFA) was performed to establish preliminary construct validity of the SCQ; based on the EFA, 14 items were dropped and a 5-factor structure was accepted. Examples of items include Question 1: "How difficult is it for your vagina to get or stay wet or moist?" (for women) and "How difficult is it for you to get or keep an erection?" (for men), and Question 2: "How comfortable are you talking about sexual concerns with your doctors and nurses?" Participants were also asked to indicate how important their sexuality was to them on a 0 (not at all) to 6 (extremely) rating scale. Twenty-six percent of participants rated their sexuality as quite a bit important, 26% as very important, and 20% as extremely important. The findings provide evidence of a reliable questionnaire with preliminary construct validity. They also indicate that sexuality is an important issue for a majority of kidney tx recipients.
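Cronbach's alpha, as reported for the SCQ above, is computed from the item variances and the variance of the total score. A minimal sketch with NumPy; the variable name is an illustrative placeholder for the item-response matrix.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# e.g., alpha = cronbach_alpha(scq_scores)  # ~0.83 reported for the SCQ
```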
A Cross-Sectional Study of Fatigue before and after Liver Transplantation. James R. Rodrigue, Timothy Antonellis. [Fatigue, sleep-quality, mood, and QOL scores for pre- vs. post-LT patients are tabulated in the source (p=0.78); items were rated 1 (none) to 10 (extremely high), with higher scores indicating more fatigue, poorer sleep quality, or more mood disturbance, and higher QOL scores indicating better QOL.] One-third of pre-LT (32%) and post-LT (33%) patients reported severe fatigue. Poor sleep quality was reported by 68% and 79% of pre- and post-LT patients, respectively. Pre-LT fatigue was predicted by higher BMI (β = 0.21) and MELD (β = 0.20), depression (β = 0.35), and poor sleep quality (β = 0.48); adj. R² = 0.44, F = 10.72, p < 0.0001. Post-LT fatigue was predicted by older age (β = 0.19), tension-anxiety (β = 0.27), anger-hostility (β = 0.38), and poor sleep quality (β = 0.18); adj. R² = 0.38, F = 7.45, p < 0.0001 (a code sketch of this type of model appears below). More fatigue was associated with lower SF-36 physical (r = -0.34) and mental (r = -0.44) QOL. Conclusions. Fatigue and poor sleep quality are clinically significant problems for LT candidates and recipients. BMI, psychological functioning and sleep quality are modifiable variables that predict fatigue severity and should be targets of intervention when addressing fatigue symptoms.

Health Literacy in Kidney Transplant Recipients. Elisa J. Gordon, 1 Michael S. Wolf. 2 1 Medicine, Albany Medical Center, Albany, NY; 2 Medicine, Northwestern University. Background: To successfully manage the transplant long-term, kidney recipients must have a basic understanding of key transplant-related concepts and terms indicative of their condition and treatment, both to communicate properly with health care providers and to manage their health. Kidney recipients must also possess numeracy skills to enable proper medication-taking and to monitor serum creatinine levels, body temperature, etc. The objective of this study was to examine health literacy levels among kidney transplant recipients. Methods: We surveyed 124 consecutive adult renal transplant recipients using the Short Test of Functional Health Literacy in Adults (S-TOFHLA) and a modified version of the Rapid Estimate of Adult Literacy in Medicine, called the "REALM-Transplant," which measured patients' knowledge of 69 kidney transplant-related terms that patients are expected to be familiar with. Open-ended and multiple-choice questions assessed numeracy related to kidney survival. Results: Most kidney recipients (91%) had adequate health literacy (S-TOFHLA), but 81% were unfamiliar with at least 1 kidney transplant-related term (REALM-T). Patients who were less educated (p<0.0001), had lower income (p<0.002), or were single or without a partner (p=0.046) had significantly lower health literacy levels (S-TOFHLA). Patients less familiar with transplant-related terms (REALM-T) had less education (p<0.0001), lower income (p<0.0001), and were more often nonwhite (p=0.033). The five least familiar terms were: sensitization (50%), urethra (45%), trough level (41%), blood urea nitrogen (32%), and toxicity (31%). Sixteen percent wanted more information about their transplant. Numeracy levels varied: 21% knew the likelihood of 1-year survival; 29% knew that half of kidney recipients have problems with the transplant in the first 6 months; 86% knew the normal range of creatinine for kidney recipients; and 86% were aware of the risk of death within the first year of transplantation. Conclusion: At this clinic, kidney transplant recipients generally had high levels of health literacy. However, most had difficulty recognizing frequently used transplant-related terms, which could impede their understanding of health information and self-care management. Greater efforts are needed to educate kidney recipients about transplant concepts, which may foster better self-care management and, ultimately, better transplant outcomes.
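The standardized (beta) coefficients reported in the fatigue models above can be obtained by z-scoring the outcome and predictors before an ordinary least-squares fit. A minimal sketch with statsmodels; the data frame and variable names are illustrative stand-ins for the study measures.

```python
import pandas as pd
import statsmodels.api as sm

def standardized_betas(df, outcome, predictors):
    """Fit OLS on z-scored variables; returns betas and adjusted R^2."""
    cols = [outcome] + predictors
    z = (df[cols] - df[cols].mean()) / df[cols].std(ddof=0)
    model = sm.OLS(z[outcome], sm.add_constant(z[predictors])).fit()
    return model.params.drop("const"), model.rsquared_adj

# Hypothetical usage with illustrative variable names:
# betas, adj_r2 = standardized_betas(pre_lt, "fatigue",
#                                    ["bmi", "meld", "depression", "sleep"])
```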
All biopsies were scored by a single pathologist using the Banff, CADI and CNIT classifications. All indication biopsies with clinical acute rejection (AR; 23.3% for SF and 21.4% for SB) were excluded from this analysis. The histological and clinical parameters were assessed using multivariate generalized estimating equations (GEE) analysis (a code sketch appears below). Results: Subclinical AR was present in 12.2% of SF vs. 8.5% of SB biopsies at 6 mo and 0% of SF vs. 5% of SB biopsies at 12 mo (p=NS); borderline AR was seen in 2.4% of SF vs. 8.5% of SB biopsies at 6 mo and 10% of SF vs. 5% of SB biopsies at 12 mo (p=NS). Despite the pristine condition of the kidneys at implantation, and regardless of steroid exposure, there was a significant increasing trend (p<0.0001) in chronic tubulo-interstitial damage: 35% of 6-mo and 53% of 12-mo biopsies demonstrated IFTA, with moderate/severe changes (IFTA grade 2-3) in 6.8% and 10%, respectively. The prevalence of biopsies with ischemic glomerular changes (p<0.0001), tubular microcalcifications (p=0.009) and vascular intimal thickening (p=0.0008), and the number of sclerosed glomeruli (p<0.0001), increased over the first year after transplantation, without any difference between the SB and SF groups. By multivariate analysis, a critical risk factor for IFTA injury, independent of time after transplantation, was smaller recipient size. In this first-ever serial histological analysis embedded in a randomized multicenter pediatric study of steroid avoidance, we found significant progression of chronic graft injury in the first year post-transplantation in both study arms. Small recipient size is the primary risk factor for tubulo-interstitial damage, likely related to vascular size discrepancies between the recipient and the graft, resulting in chronic graft ischemia.

In the 1-year Symphony core study, a regimen of 2g mycophenolate mofetil (MMF) + low-dose tacrolimus (3-7 ng/ml) + daclizumab + steroids resulted in fewer acute rejections and a better glomerular filtration rate (GFR) compared with 2g MMF + steroids plus either standard-dose cyclosporine (CsA), low-dose CsA (50-100 ng/ml) + daclizumab, or low-dose sirolimus (4-8 ng/ml) + daclizumab. Methods: 960 patients participated in an optional 2-year follow-up. GFR data from 79% of included patients (48% of the core ITT population) were available at 3 years. Here we present results in the ITT population. Results: At inclusion into follow-up, 47%, 34% and 17% of patients received CsA, tacrolimus or sirolimus, and at 3 years 37%, 31% and 14%, respectively. Many follow-up patients had been switched to tacrolimus in the 1st year, including 24% of patients randomized to sirolimus. At 3 years, 95% of patients were on MMF and 69% on steroids. Over the 2nd and 3rd years, all arms had a low rate of biopsy-proven acute rejection (BPAR; 2-4%) and of graft loss (3-5%). Low-dose tacrolimus remained clearly superior in terms of BPAR (13% vs. 26%-38% in the other arms). Uncensored 3-year graft survival was 89% with low-dose tacrolimus and low-dose CsA, 87% with standard-dose CsA and 85% with low-dose sirolimus (p=0.19). Patient survival was between 94% and 97% (p=0.52). In the four arms, the mean GFR change over the 2nd and 3rd years was between +1 and -3 ml/min, and low-dose tacrolimus still had the highest GFR (69 vs. 64-66 ml/min, p=0.15). Observational follow-up results based on approximately half of the core Symphony population thus indicate that during the 2nd and 3rd years renal function was stable, BPAR and graft loss rates were low, and many patients substantially changed treatment regimen. Still, the ITT arm with 2g MMF + low-dose tacrolimus + daclizumab + steroids was superior at 3 years with respect to renal function and graft loss, but the differences were less marked and not statistically significant.
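The generalized estimating equations used for the serial protocol biopsies above account for the correlation of repeated biopsies within a patient. A minimal sketch with statsmodels, assuming a binary IFTA outcome and illustrative variable names; the study's actual covariate set and outcome codings were richer.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per protocol biopsy (hypothetical extract): patient_id,
# months_post_tx, steroid_free (0/1), recipient_weight, ifta (0/1).
biopsies = pd.read_csv("protocol_biopsies.csv")  # hypothetical file

model = smf.gee(
    "ifta ~ months_post_tx + steroid_free + recipient_weight",
    groups="patient_id",            # repeated biopsies cluster by patient
    data=biopsies,
    family=sm.families.Binomial(),  # binary outcome: any IFTA
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())
```

The exchangeable working correlation treats any two biopsies from the same patient as equally correlated, a common default when each patient contributes only a few serial observations.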
Histological evaluation of time-zero donor kidney biopsies has not conclusively predicted graft outcome. We hypothesized that gene expression analysis could provide additional information for determining graft outcome in the first year of transplantation. To this end, we evaluated all 49 implantation biopsies obtained post-reperfusion in 18 deceased donors (DD) and 31 living donors (LD) at our center. Biopsies were evaluated and scored using Banff criteria. Low-density real-time PCR arrays were utilized to measure intragraft expression of 95 genes associated with programmed cell death, fibrosis, innate and adaptive immunity, and oxidative stress signaling. Expression results were reported as fold changes relative to a pool of 25 normal kidney biopsies. In DD grafts, histological features of ATN were more common (44%) than in LD grafts (6%; p<0.001), whereas arteriosclerosis was infrequent in both groups (6% and 15%, respectively), as was the extent of glomerular sclerosis (0% and 2%). There was no association between these histological features and renal function at 1 year post-transplant. Not surprisingly, DD grafts displayed a pattern of gene expression remarkably different from that of LD grafts, including an increased expression of complement protein C3.
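Fold change relative to a pooled calibrator, as in the qPCR analysis above, is conventionally computed with the comparative-Ct (2^-ΔΔCt) method. The sketch below is a generic reconstruction assuming normalization to a single housekeeping gene, not the authors' published pipeline.

```python
# Comparative-Ct fold change: gene-of-interest Ct values are normalized
# to a housekeeping gene, then referenced to the pooled-normal calibrator.
def fold_change(ct_gene, ct_ref, ct_gene_pool, ct_ref_pool):
    d_ct_sample = ct_gene - ct_ref          # normalize sample to reference gene
    d_ct_pool = ct_gene_pool - ct_ref_pool  # normalize calibrator pool
    return 2.0 ** -(d_ct_sample - d_ct_pool)

# e.g., fold_change(24.1, 18.0, 27.3, 18.2) -> 8.0 (8-fold up-regulation)
```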
Background: ISA247 is a novel calcineurin inhibitor (CNI) developed using a pharmacodynamic approach for use in autoimmune disease and solid organ transplantation. In moderate to severe plaque psoriasis, a Canadian Phase III trial demonstrated that ISA247 is efficacious with minimal changes to renal function, and a European trial comparing ISA247 to cyclosporine A (CsA) is presently underway. In renal transplantation, a Phase IIa study comparing ISA247 to CsA in stable renal transplant recipients demonstrated ISA247 to be efficacious and well tolerated. A Phase IIb study comparing ISA247 to tacrolimus in de novo renal transplant patients is ongoing, and final data will be available in May 2008. We hypothesize that ISA247 is non-inferior to tacrolimus in terms of efficacy. Methods: This is a 6-month, randomized, multicenter, open-label, concentration-controlled study comparing three oral ISA247 dosing groups (0.4, 0.6, or 0.8 mg/kg bid) to tacrolimus in 42 North American transplant centres. All CNIs were titrated to target trough concentrations. Inclusion criteria included males and (non-pregnant) females aged 18-65 who were receiving a first deceased- or living-donor renal transplant. Cold ischemia times were to be ≤ 24 hours, and peak panel reactive antibodies ≤ 30%. The primary efficacy parameter of the trial is non-inferiority (in at least one dose group) in biopsy-proven acute rejection (BPAR) at 6 months as compared with tacrolimus. Secondary objectives include renal function; PK/PD relationships; patient and graft survival; and the proportion of patients with hypertension, hyperlipidemia or new-onset diabetes mellitus (NODM). Results: Interim data, as previously presented at the 2007 ATC, demonstrated that ISA247 had rejection rates similar to tacrolimus (ISA247 0.4 mg/kg bid 11%, ISA247 0.6 mg/kg bid 8%, ISA247 0.8 mg/kg bid 3%, tacrolimus 9%) and confirmed previous results indicating an improved safety profile. A total of 334 patients were enrolled between January 2006 and June 2007, with an optional extension to 12 months added to the trial. A recent approval by both the FDA and Health Canada has allowed continued use of ISA247 in these patients until commercialization. The six-month final results will be available for presentation at ATC 2008.
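Non-inferiority on a proportion endpoint such as BPAR is typically declared when the upper confidence bound of the rate difference falls below a pre-specified margin. The sketch below illustrates that logic in Python; the 10% margin, one-sided alpha, and event counts are hypothetical and are not taken from the trial's statistical plan.

```python
from scipy import stats

def noninferior(events_test, n_test, events_ref, n_ref,
                margin=0.10, alpha=0.05):
    """True if the upper one-sided CI bound on (p_test - p_ref) < margin."""
    p1, p2 = events_test / n_test, events_ref / n_ref
    se = (p1 * (1 - p1) / n_test + p2 * (1 - p2) / n_ref) ** 0.5
    upper = (p1 - p2) + stats.norm.ppf(1 - alpha) * se
    return upper < margin, upper

# Illustrative counts only: noninferior(9, 84, 8, 83) -> (True, ~0.088),
# i.e., the upper bound sits below the assumed 10% margin.
```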