key: cord-0005212-lfeqwzjt authors: Grill, Eva; Müller, Martin; Mansmann, Ulrich title: Health—exploring complexity: an interdisciplinary systems approach HEC2016: 28 August–2 September 2016, Munich, Germany date: 2016-08-13 journal: Eur J Epidemiol DOI: 10.1007/s10654-016-0183-1 sha: 60a80d5b8f7f2066e38b0877371eeccfc6927ac0 doc_id: 5212 cord_uid: lfeqwzjt

Health-exploring complexity: an interdisciplinary systems approach HEC2016

A continuous Learning Health System (LHS) has been envisioned, nationally and globally, as a critical and necessary initiative for improving health outcomes. The ability to use, share, and harness the potential of data so as to inform practice and improve health care decisions is the underlying platform of a LHS. Generating new knowledge rapidly, accelerating the adoption of innovations and discoveries, integrating new data sources, and providing guidance to providers regarding treatment effectiveness and patient risks are some of the benefits to be derived from a LHS. While rapid learning is the pinnacle goal of a LHS, lowering health care costs, improving population health, empowering consumers, and driving innovation have also been stated as outcomes of a LHS. A health data management infrastructure that encourages and implements innovative policies and practices is a prerequisite for supporting a LHS. The meaningful use of EHRs, incorporating patient-generated data, providing patients access to their health information, ensuring data quality and integrity, creating data trust, as well as implementing a supportive information governance environment are but some of the opportunities and challenges that need to be addressed to drive forward the future of health care and achieve a LHS.
In addition to a transformative health data management infrastructure that provides patients and providers with actionable data, a LHS can exist only with the support and engagement of a multitude of stakeholders, each contributing a set of required skills and competencies. A LHS will redefine relationships between physicians and patients, researchers and clinicians, health professionals and health information management and informatics professionals. Addressing these new and developing collaborative relationships is a priority. Examining workforce readiness globally to support a LHS, as well as defining the new roles and functions for health professionals and health informatics professionals, is needed to foster collaboration among the stakeholders engaged within the LHS and to enable operational effectiveness and sustainability.

A decision support framework integrating medical knowledge with live patient data for proactive chronic patient management Abidi SS 1, Abidi S 1, Abusharekh A 1, Hashemian N 1; 1 Dalhousie University, Halifax, Canada. Background Atrial fibrillation (AF) is the most common abnormality of cardiac rhythm. Timely intervention for AF-related symptoms, coupled with evidence-informed care at the primary care level, could help prevent AF-related complications such as stroke, reduce hospitalization costs and improve patients' quality of life. In this paper, we present a clinical decision support framework, termed e-IMPACT, that targets the translation of Atrial Fibrillation (AF) clinical guidelines into point-of-care decision support aids for primary care physicians. e-IMPACT offers AF patient monitoring, education and management services to proactively flag and respond to AF-related complications based on evidence-based AF clinical guidelines.
We use a suite of semantic web technologies to computerize clinical guidelines in terms of specialized ontologies, and reasoning engines to execute the computerized guidelines and provide patient-specific recommendations. Results e-IMPACT is currently being deployed across the province of Nova Scotia for a province-wide cluster randomized trial over 12 months to investigate the efficacy of our approach to proactive chronic patient management. In this paper we presented a semantic web based framework targeting the effective management of AF at the primary care level. The key distinction of e-IMPACT is pervasive patient monitoring to proactively initiate interventions and avoid potential AF-related adverse events. Our approach is a paradigm shift from the current use of decision support systems, which support physicians only during in-clinic encounters; the problem with that approach is that the intervention may come too late.

820 An ontological model of behaviour theory to generate personalized action plans to modify behaviours Abidi S 1, Abidi SS 1, Baig W 1; 1 Dalhousie University, Halifax, Canada. Behavior change (BC) approaches aim to assist patients in achieving self-efficacy in the self-management of their condition. Social cognitive theory (SCT) stipulates the self-efficacy construct as central to behavior change. We have taken a knowledge management approach to computerize the specialized self-efficacy constructs stipulated by SCT and formulate a high-level SCT knowledge model. We have collected and computerized behavior change content targeting healthy living and physical activity. Semantic web technologies have been used to develop an SCT ontology and SWRL rules to infer personalized self-management plans based on a given patient profile. We present a formative evaluation of the clinical correctness and relevance of the generated personalized action plans for a range of test patient profiles.
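The ontology-plus-SWRL-rules pattern described above can be sketched in miniature. The rule base, profile fields and actions below are invented for illustration; the real system reasons over an OWL ontology with a semantic reasoner rather than plain Python predicates.

```python
# Hypothetical sketch: a patient profile is matched against SWRL-style
# condition-action rules to assemble a personalized action plan.

def infer_action_plan(profile, rules):
    """Return the behaviour-change actions whose conditions the profile meets."""
    plan = []
    for rule in rules:
        if all(cond(profile) for cond in rule["conditions"]):
            plan.append(rule["action"])
    return plan

# Toy rule base standing in for the SCT ontology + SWRL rules (invented).
RULES = [
    {"conditions": [lambda p: p["self_efficacy"] == "low"],
     "action": "start with short guided walks (mastery experience)"},
    {"conditions": [lambda p: p["self_efficacy"] == "low",
                    lambda p: p["has_peer_group"]],
     "action": "pair with an active peer (vicarious experience)"},
    {"conditions": [lambda p: p["activity_minutes_per_week"] < 150],
     "action": "set a graded weekly activity goal"},
]

patient = {"self_efficacy": "low", "has_peer_group": True,
           "activity_minutes_per_week": 60}
print(infer_action_plan(patient, RULES))
```

In the actual framework the conditions live in the ontology and rule layer, so clinicians can update them without touching application code.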
Our results confirm the consistency and correctness of the ontology-based behaviour change model. This work presents a knowledge model that computerizes behaviour change theory in order to develop personalized, evidence-based action plans for behaviour change.

821 A digital health system to assist family physicians to safely prescribe NOAC medications Abidi S 1, Cox J 2, Abusharekh A 3, Hashemian N 3, Abidi SS 3; 1 Dalhousie University, Halifax, Canada; 2 Division of Cardiology, Dalhousie University, Halifax, Canada; 3 Faculty of Computer Science, Dalhousie University, Halifax, Canada. Background Atrial fibrillation (AF) is the most common cardiac arrhythmia, observed to be prevalent in almost 2 % of the general population. Although published clinical guidelines on AF management recommend the use of antithrombotic treatment, it is noted that family physicians (FPs) are underprescribing Novel Oral AntiCoagulant (NOAC) therapies. We present a computerized NOAC Authorization Decision Support System (NOAC-ADSS), accessible to FPs, to optimally prescribe NOAC based on Canadian AF clinical guidelines. We have adopted a knowledge management approach, leveraging semantic web technologies, to ontologically model the NOAC eligibility criteria and NOAC dosing algorithms, augmented with specialized SWRL rules that determine a patient's NOAC eligibility and dosage. In addition, we have developed a service that auto-fills the Nova Scotia NOAC authorization form (as a PDF file) based on the patient parameters and the recommendations of the NOAC-ADSS; the completed NOAC authorization form can be digitally signed by the FP and electronically submitted to the pharmacy and the Pharmacare office. To evaluate the accuracy of the NOAC-ADSS decision logic we evaluated its recommendations by engaging cardiac experts. 100 randomly selected de-identified patient cases were used.
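Conceptually, the SWRL-encoded eligibility and dosing logic reduces to condition-action rules over patient parameters. The sketch below is purely illustrative: every parameter name and threshold is invented for the example, and it is emphatically not clinical guidance.

```python
# Illustrative-only sketch of rule-based NOAC eligibility/dosing logic of the
# kind an ontology + SWRL rules might encode. All thresholds are invented.

def noac_recommendation(age, creatinine_clearance, has_contraindication):
    """Map a few (hypothetical) patient parameters to an eligibility verdict."""
    if has_contraindication:
        return {"eligible": False, "reason": "contraindication recorded"}
    if creatinine_clearance < 30:          # hypothetical renal cut-off
        return {"eligible": False, "reason": "renal function below threshold"}
    # Hypothetical dose-reduction rule on age / renal function
    dose = "reduced" if (age >= 80 or creatinine_clearance < 50) else "standard"
    return {"eligible": True, "dose": dose}

print(noac_recommendation(age=82, creatinine_clearance=55,
                          has_contraindication=False))
# -> {'eligible': True, 'dose': 'reduced'}
```

Keeping such rules in a declarative layer, as the NOAC-ADSS does, lets the dosing logic be updated when guidelines change without redeploying the application.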
For each patient case, the patient information was provided to the NOAC-ADSS and its recommendations were noted; next, the same patient case was independently presented to two cardiac specialist physicians, who provided their recommendations about NOAC eligibility, dosage (if eligible) and eligibility for special authorization. The evaluation results indicate a 96 % agreement between the NOAC-ADSS recommendations and those of the specialists, confirming the knowledge validity of the NOAC-ADSS. The prescription of appropriate and effective AF medications, based on AF clinical guidelines, will lead to a reduction in the incidence of AF-related adverse events. It is important to support FPs to optimally prescribe antithrombotic treatments, including the use of NOAC. In our work, we present a digital health inspired decision support solution to assist FPs to prescribe NOAC as per the AF clinical guidelines. Our approach also streamlines the process of NOAC authorization, making it easy for FPs to prescribe NOAC and complete the NOAC special authorization process.

A decision support environment to design personalized behavior modification plans for diabetes self-management Abidi S 1, Vallis M 1, Piccinini-Vallis H 1, Imran SA 1, Abidi SS 1; 1 Dalhousie University, Halifax, Canada. We present the Diabetes Web-Centric Information and Support Environment (D-WISE), which features: (a) a decision support tool to assist family physicians to administer Behavior Modification (BM) strategies to patients; and (b) a patient BM application that offers BM strategies and motivational interventions to engage patients. We take a knowledge management approach, using semantic web technologies, to model the social cognition theory constructs, Canadian diabetes guidelines and locally used BM protocols in terms of a BM ontology that drives the BM decision support for physicians, and BM strategy adherence monitoring and messaging for patients.
We present a qualitative analysis of D-WISE usability by both physicians and patients. Our results indicate that family physicians are willing to provide behaviour modification counselling to patients, provided they are supported with evidence-based tools. Despite the availability of evidence-based BM models, the challenge is to translate these models into personalized BM strategies that can help patients adopt healthy behaviours. We have presented a digital health approach that translates BM models into a point-of-care BM environment. The key contribution of our approach is the modeling of BM knowledge and then operationalizing the BM model to generate personalized BM strategies for diabetes patients to self-manage their condition.

Transcription of case report forms from unstructured referral letters: a semantic text analytics approach Abidi SS 1, Christie S 2; 1 Dalhousie University, Halifax, Canada; 2 Division of Neurosurgery, Dalhousie University, Halifax, Canada. In this paper we present a framework for the semi-automatic extraction of medical entities from referral letters and their use to transcribe a case report form (CRF). Our framework offers the functionality to: (a) extract medical entities from unstructured referral letters, (b) classify them according to their semantic type, and (c) transcribe a case report form based on the information extracted from the referral letter. We take a semantic text analytics approach in which the SNOMED CT ontology is used both to classify referral concepts and to establish semantic similarities between referral concepts and CRF elements. We used 100 spine injury referral letters and a standard case report form used by the Association of Dalhousie Neurosurgeons, Dalhousie University. Our results indicate a 90 % accuracy for mapping the content of the referral letters to the specialized case reporting form.
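The concept-to-CRF mapping step can be caricatured with plain string similarity as a crude stand-in for the SNOMED CT-based semantic similarity the framework actually uses. The CRF field names below are invented for illustration.

```python
# Hypothetical sketch of mapping an extracted referral-letter term to the
# closest case-report-form (CRF) field. difflib's string similarity stands in
# for ontology-based semantic similarity.
import difflib

CRF_FIELDS = ["spinal cord injury level", "mechanism of injury",
              "motor deficit", "sensory deficit"]

def map_to_crf(extracted_term, fields=CRF_FIELDS, cutoff=0.4):
    """Return the best-matching CRF field, or None if nothing is close enough."""
    matches = difflib.get_close_matches(extracted_term, fields, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(map_to_crf("motor deficits"))   # -> motor deficit
```

A real pipeline would first normalize the extracted phrase to a SNOMED CT concept and compare concepts, not surface strings, which is what makes the reported 90 % mapping accuracy plausible across wording variants.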
This work presents a medical text analytics approach to preparing case report forms or synoptic reports from existing clinical documents, such as referral letters submitted by physicians to specialists for the specialized care of their patients.

Time-related complexities in the analyses of observational time-to-event studies of health: why do we need more refined statistical methods? McGill University, Montreal, Canada michal.abrahamowicz@clinepi.mcgill.ca (for the STRATOS Topic Group 8 'Survival Analysis': Per Kragh Andersen, Richard Cook, Pierre Joly, Torben Martinussen, Maja Pohar-Perme, Jeremy Taylor, and Terry Therneau). Most health outcomes arise from longitudinal processes and are analyzed using time-to-event or survival analysis methods, which are the focus of STRATOS Topic Group 8 (TG8). The paramount challenge is to accurately account for time-related complexities, which requires developing specialized methods to address even those analytical issues that, in other analyses, are adequately handled by 'standard' statistical methods. We will illustrate such complexities using two examples, both involving challenges also investigated by other STRATOS TGs, which emphasizes the need for STRATOS-wide collaborations to develop an integrated modeling approach. The first challenge involves modeling the effects, i.e. functional forms, of continuous covariates. TG2 aims at developing guidance for flexible modeling of possibly nonlinear effects of continuous covariates in multivariable analyses. However, we will demonstrate how, in the specific context of survival analyses, the accuracy of both estimation and testing of non-linear (NL) effects may be affected by imposing an a priori proportional hazards (PH) assumption, which underlies the extremely popular Cox PH model.
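What a PH violation looks like can be shown with a small simulation: a binary covariate whose hazard ratio is 2 early in follow-up and 1 later. The sketch below (simulated data and plain numpy, not the talk's methods) estimates the hazard ratio separately per follow-up interval as events divided by person-time.

```python
# Minimal sketch of probing the proportional hazards assumption with
# interval-specific hazard ratios. A real analysis would use a survival
# package with flexible time-dependent effect models.
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Group 0: constant hazard 0.5. Group 1: hazard 1.0 before t = 1, then 0.5,
# i.e. a time-dependent effect: HR = 2 early, HR = 1 late.
t0 = rng.exponential(1 / 0.5, n)
early = rng.exponential(1 / 1.0, n)
t1 = np.where(early < 1, early, 1 + rng.exponential(1 / 0.5, n))

def interval_hazard(times, start, stop):
    """Events / person-time within [start, stop), assuming no censoring."""
    at_risk = times > start
    events = at_risk & (times < stop)
    person_time = np.clip(times[at_risk], None, stop) - start
    return events.sum() / person_time.sum()

for a, b in [(0.0, 1.0), (1.0, 5.0)]:
    hr = interval_hazard(t1, a, b) / interval_hazard(t0, a, b)
    print(f"interval [{a}, {b}): estimated HR = {hr:.2f}")
```

A single Cox PH fit to these data would report one averaged hazard ratio and hide the early excess risk, which is exactly the kind of bias the abstract warns about.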
The PH assumption implies that the effects of the covariates remain constant during the entire follow-up period, but there is ample empirical evidence of time-dependent (TD) effects, where the hazard ratio changes substantially during the follow-up [1]. We will show, using both simulated and real-life data (concerning prediction of death after septic shock, and prognostic studies of cancer recurrence or mortality), how ignoring TD effects may lead to biased estimates and invalid conclusions regarding NL effects, and vice versa, and affect the accuracy of risk prediction or (cost-)effectiveness assessment of health interventions [2]. Another time-related complexity arises when the values of the covariates themselves change during the follow-up time, implying use of time-varying covariates (TVC). The challenge is how to define the TVC that best reflects the way past covariate values may affect the current hazard. This problem can be seen as a complex case of measurement error, which is the focus of TG4. We will demonstrate that the choice of the way repeated-over-time TVC values are aggregated in the analysis may have a major impact on the model's fit to the data and/or power to detect an association [3]. We will then propose a more flexible weighted cumulative exposure model and illustrate its advantages in modeling TVC in analyses of adverse effects of medications [4]. Additional challenges are faced if the longitudinal changes in TVC are driven by the same process that affects the clinical endpoint and have to be predicted, together with survival, using joint modeling methodology [5].

To what extent do established risk factors explain stroke in the young? A case-control study. The INTERSTROKE study showed that across all age groups, about 90 % of all strokes are attributable to 10 risk factors. Whether the effect of established risk factors is equally important for younger patients has, however, not yet been established.
To close this gap in research, this study aims at determining the attributable risk of established risk factors for stroke in a young population under 55 years. We performed a case-control study based on a sample of 3308 stroke patients aged 18 to 55 years, included in the Stroke In Young Fabry Patients (SIFAP1) study in 26 German centers (cases). They were matched to individuals without a history of stroke from a nationwide population-based sample of adults in Germany, participating in the German Health Update (GEDA) study 2009 and 2010 (controls). Subjects were matched by sex and age in a 1:4 ratio. Cases and controls with a self-reported previous stroke were excluded from the analysis, as were patients with a stroke pathology other than ischemic stroke or primary intracerebral hemorrhage. We calculated adjusted population attributable risks (PAR) and corresponding 95 % confidence intervals (CI) of eight established risk factors for stroke (hypertension, hyperlipidemia, diabetes, coronary heart disease, smoking, high alcohol consumption, physical inactivity and elevated BMI) and their combinations. All analyses were performed for all stroke, and additionally for ischemic and hemorrhagic stroke separately. Additional analyses were carried out stratified by age groups (18-34, 35-44 and 45-55 years) and etiological subtypes of ischemic stroke according to TOAST criteria. We included 2125 cases (2009 ischemic and 116 hemorrhagic) and 8500 controls. Hypertension and physical inactivity were the most important risk factors and in combination accounted for 69.92 % (95 % CI 66.60-73.25), 69.22 % (66.35-72.08) and 82.08 % (72.47-91.69) of all, ischemic and hemorrhagic strokes, respectively. High alcohol consumption (PAR: 17.35 %), smoking (PAR: 14.95 %, 6.04-23.86) and diabetes (PAR: 4.84 %, 3.02-6.66) were additional important risk factors for ischemic stroke. Based on our results, eight risk factors explained 78.86 % (75.94-81.78) of all strokes in the young, and the two main risk factors, hypertension and physical inactivity, alone explained almost 70 %. Stratified analyses by age groups revealed a fairly consistent trend that established risk factors explain higher percentages of ischemic stroke risk in older age groups. TOAST subgroup analyses indicate a higher similarity between large-artery atherosclerosis and small-vessel occlusion with regard to risk factors, compared to cardioembolic stroke. Our results show that risk factors previously established for older age also account for a large part of stroke at younger age, with higher similarity in patients above 45 years. Targeted population-based risk factor interventions may also have a substantial effect in reducing stroke in the young.

Determination of nasal and oropharyngeal microbiomes in a multicenter population-based study: findings of the Pretest 1 phase of the German National Cohort. Background Nasal and oropharyngeal swabs are used to study microbial populations and the dynamics of multidrug-resistant pathogens (e.g. methicillin-resistant Staphylococcus aureus). However, collection of such specimens may be impeded by technical and logistic difficulties, in particular when applied in population-based studies. Previously, we reported that nasal and oropharyngeal swabbing was a highly acceptable and feasible method in the Pretest of the German National Cohort (GNC). We now examined microbial communities (microbiomes) from nasal and oropharyngeal swabs of this study. The feasibility study was conducted in 2011 in six study centers. Certified study personnel collected nasal and oropharyngeal swabs in the study centers, and participants self-collected nasal swabs at home and returned them by mail (total n = 1142 swabs). In a subsample (n = 313), we used next generation sequencing of the 16S rDNA V1-V2 variable regions to assess the ability of the swabs to detect microbiome structures. We also compared 4 different swab brands.
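The attributable-risk estimates in the stroke study above are adjusted PARs from matched case-control models; the unadjusted, Levin-type version of the same quantity is simple enough to sketch with illustrative numbers (not the study's data).

```python
# Sketch of an unadjusted (Levin) population attributable risk calculation.
# Prevalence and relative risk below are invented for illustration.

def levin_par(prevalence, rr):
    """PAR = p(RR - 1) / (1 + p(RR - 1)), with exposure prevalence p."""
    x = prevalence * (rr - 1)
    return x / (1 + x)

# e.g. a risk factor with 30 % prevalence and a relative risk of 2.5
print(f"PAR = {levin_par(0.30, 2.5):.1%}")   # -> PAR = 31.0%
```

The study's adjusted PARs additionally account for confounding and the matched design, so they are not reproducible from prevalence and RR alone.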
Microbial community structure analysis of the two ecological niches revealed the presence of 846 phylotypes (PT), 46 of which were exclusively found in oropharyngeal communities and 164 exclusively in nasal communities. The communities of these two habitats differed drastically, forming two separate clusters in non-metric multidimensional scaling plots. The anterior nasal communities were dominated by Actinobacteria (mean abundance of 45 % in a total of 195 samples analyzed), Firmicutes (40 %) and Proteobacteria (12 %), along with minor amounts of Bacteroidetes (2 %). Members of five phyla were abundant in the oropharyngeal communities (38 % Firmicutes, 22 % Bacteroidetes, 18 % Fusobacteria, 11 % Actinobacteria and 10 % Proteobacteria). Notably, the oropharyngeal communities were richer in species (with an average of 201 ± 41 phylotypes per sample) than the anterior nasal communities (87 ± 37 phylotypes per sample). One swab brand was found to be unsuitable for DNA extraction, but analysis of the diversity of microbial communities of the remaining three swab brands did not reveal any significant differences. There were only marginal differences in microbial communities between staff- and self-collected swabs. The expected differences in microbial communities between the anterior nares and oropharynx were detected, thus demonstrating the microbiological validity of the approach. Self-swabbing was found to be suitable for nasal microbiome analysis. Three of four swab brands performed equally well and may be recommended for larger population-based studies in the future.
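The separation seen in the ordination rests on a community dissimilarity measure; Bray-Curtis is a common choice for such abundance data. A minimal sketch with toy phylum-level profiles loosely echoing the percentages above (the NMDS step itself would be run in a dedicated package such as vegan or scikit-bio):

```python
# Bray-Curtis dissimilarity between two abundance profiles.
import numpy as np

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity: sum|u - v| / sum(u + v)."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    return np.abs(u - v).sum() / (u + v).sum()

# Toy phylum-level abundances (%): Actinobacteria, Firmicutes,
# Proteobacteria, Bacteroidetes, Fusobacteria
nasal = [45, 40, 12, 2, 0]
oropharynx = [11, 38, 10, 22, 18]
print(round(bray_curtis(nasal, oropharynx), 3))   # -> 0.384
```

Computing this pairwise over all samples yields the distance matrix that NMDS embeds, which is where the two habitat clusters become visible.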
Metabolic factors explain association between adult weight gain and colorectal cancer: data from the EPIC cohort Aleksandrova K 1, Schlesinger S 2, Fedirko V 3, Jenab M 4, Bueno-de-Mesquita HB 5, Riboli E 6, Pischon T 7, Boeing H 8; 1 Deutsches Institut für Ernährungsforschung Potsdam-Rehbrücke (DIfE), Nuthetal, Germany; 2 Christian-Albrechts-Universität zu Kiel, Kiel, Germany; 3 Rollins School of Public Health, Emory University, Atlanta, United States; 4 International Agency for Research on Cancer, Lyon, France; 5 National Institute for Public Health and the Environment (RIVM), Bilthoven, Netherlands; 6 Department of Epidemiology and Biostatistics, School of Public Health, Imperial College London, London, United Kingdom; 7 Max-Delbrück-Centrum für Molekulare Medizin (MDC) Berlin-Buch, Berlin, Germany; 8 DIfE, Nuthetal, Germany. Background and aim A growing body of evidence has consistently suggested that high body weight gain during adulthood is associated with a higher risk of colorectal cancer [1] [2] [3]. However, it remains unclear whether weight gain per se promotes colorectal cancer and what exact mechanisms may underlie this association. We therefore aimed to investigate whether the association between high adult weight gain and risk of colorectal cancer may be mediated by attained adiposity and metabolic biomarkers, using a nested case-control study within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. The association between high adult weight gain and colon and rectal cancer was evaluated using multivariable conditional logistic regression analyses. Relative risks (RRs), estimated from odds ratios as derived from the risk set sampling design, and 95 % confidence intervals (95 % CIs) were computed.
We evaluated the mediation effect of 20 different biomarkers on the relation between adult weight gain and CRC risk using data from a prospective nested case-control study among 452 incident cases diagnosed between 1992 and 2003, matched within risk sets to 452 controls within the EPIC cohort [4]. Adult weight gain was assessed as kilograms gained per year for the period between age 20 and study enrolment at an average age of 59 years. To explore potential mediation effects of the biomarkers, we added each individual biomarker to the multivariable-adjusted model and estimated the percent change in the regression coefficients (b's) using the difference-of-coefficients method originally proposed by Freedman & Schatzkin, 1992 [5], based on the formula B_indirect effect = [(b - b1)/b] * 100. Higher adult weight gain was associated with a higher risk of colon cancer [≥300 versus <300 g per year: multivariable relative risk (RR) = 1.54, 95 % confidence interval (CI) 1.07-2.24], but not with rectal cancer (RR = 1.07, 95 % CI 0.68-1.66). This association was mostly accounted for by attained waist circumference [reduction of 61 %], and by the biomarkers soluble leptin receptor [reduction of 43 %] and glycated hemoglobin [reduction of 28 %]. Other factors that showed an influence on the association included the biomarkers of oxidative stress, ROM [reduction of 11 %] and FRAP [reduction of 13 %], and the iron metabolism biomarker transferrin [reduction of 17 %]. Attained BMI in adulthood and leptin attenuated the association by 24 and 21 %, respectively; however, these changes did not prove to be statistically significant. Among the biomarkers, sOB-R and HbA1c further attenuated the association beyond waist circumference, by 58 and 35 %, respectively. Conclusions These novel data suggest that the observed association between adult weight gain and colon cancer could be primarily explained by attained abdominal fatness and biomarkers of metabolic dysfunction.
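The difference-of-coefficients measure, 100 * (b - b1)/b, can be illustrated on simulated data. Plain linear regression stands in here for the study's conditional logistic models, and the variables and effect sizes are invented.

```python
# Sketch of the Freedman-Schatzkin difference-of-coefficients idea: percent
# of the exposure effect "explained" after adjusting for a mediator.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
exposure = rng.normal(size=n)                    # e.g. adult weight gain
mediator = 0.8 * exposure + rng.normal(size=n)   # e.g. waist circumference
outcome = 0.5 * mediator + 0.1 * exposure + rng.normal(size=n)

def first_coef(y, X):
    """OLS via least squares; coefficient of the first column of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0]

ones = np.ones(n)
b = first_coef(outcome, np.column_stack([exposure, ones]))            # total
b1 = first_coef(outcome, np.column_stack([exposure, mediator, ones])) # direct
print(f"percent mediated = {100 * (b - b1) / b:.0f}%")
```

By construction the mediator carries most of the exposure's effect, so the printed percentage is large, mirroring how waist circumference absorbed 61 % of the weight-gain coefficient in the study.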
Avoiding weight gain and maintaining metabolic control from early adulthood on through lifestyle modifications could provide valid means for improved CRC prevention.

Epidemiologic associations of famine during pregnancy, adverse birth outcomes, and metabolic conditions in adulthood have been reported from natural experiments in Caucasian populations. This phenomenon may partially explain the upsurge of type 2 diabetes in sub-Saharan Africa. Thus, in a Ghanaian birth cohort, we investigated the relationship of malarial infection during pregnancy, birth outcome, and metabolic traits of the children at age 15 years. In the year 2000, we recruited 839 pregnant women in rural Ghana for this birth cohort; malarial infection and birth outcome were assessed. In this cohort, we observed that malarial infection in pregnancy increased the odds of low birth weight (<2500 g) by 50 % and the odds of preterm delivery (<37 weeks) by 40 %. In 2015, we followed up on 339 of these mothers, and assessed fasting plasma glucose (FPG), blood pressure (BP), weight, and height of their children born in 2000. In 145 adolescents, the distributions of early-life factors (malarial infection, birth weight, gestational age) across tertiles of metabolic traits at age 15 years (FPG, systolic BP, diastolic BP, BMI) were evaluated; associations of early-life factors with metabolic traits were calculated using linear regression. In 78 boys and 67 girls, the proportion of malarial infection during pregnancy tended to increase across tertiles of FPG, systolic BP, and diastolic BP, but decreased across tertiles of BMI at age 15 years. In linear regression, none of these trends remained. Mean birth weight increased across tertiles of BMI at age 15 years, and in the multiple-adjusted linear model, elevated birth weight (per 200 g) increased BMI in adolescence by 0.4 kg/m². No trends of mean birth weight were observed across tertiles of FPG and BP.
Mean gestational age decreased across tertiles of diastolic BP, which translated into +0.6 mmHg per gestational week of decrease in the linear model. No further trends were observed for gestational age. In rural Ghana, early-life factors may partially contribute to differences in metabolic traits in adolescence. The relationship between malarial infection in pregnancy, LBW, and low BMI in adolescence is unexpected and requires further investigation.

patient's room or near the nursing station to hear the sound of the alarm. (1) To reduce errors and improve patient safety, patient monitors that use smartphones and tablet devices have been developed simply to send an audible alarm to the nurse. (2) In other cases, web cameras have been used to observe elderly individuals who live alone and patients with dementia. (3) Web cameras would also enable medical staff to obtain more information via a patient image over a Wi-Fi network when an alarm is generated. However, no current patient monitoring systems use real-time patient imaging. Therefore, the aim of this study was to develop a monitoring system involving an iOS application that receives alarm information and patient images on smartphones from the monitor. System description The hardware of this system consists of the patient monitor (BSM-6701), a serial Wi-Fi adapter (REX-WF60), a wireless router, a USB web camera, a streaming server (a Raspberry Pi with an established Wi-Fi connection), and a mobile battery. Serial Wi-Fi adapters provide communication between the patient monitor and the wireless router. A streaming server connected to a USB web camera was located on the patient monitor. The application for receiving the alarms and streaming images was developed using the iOS SDK in conjunction with Apple's Xcode 7. We confirmed its operation by changing the threshold to generate an alarm.
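On the receiving side, the app must turn a monitor message into displayable fields. The wire format below is entirely hypothetical (the BSM-6701 serial protocol is not described here); the sketch only illustrates the parsing step the app performs before raising lights and vibration.

```python
# Hypothetical sketch: parse an alarm message from the monitor into the
# fields the app displays. The delimited format is invented for illustration.

def parse_alarm(message):
    """Parse 'device_id|HR|SpO2|alarm_code' into a dict."""
    device_id, hr, spo2, alarm = message.strip().split("|")
    return {"device_id": device_id, "hr": int(hr),
            "spo2": int(spo2), "alarm": alarm}

msg = "BSM-01|132|93|VPC"
print(parse_alarm(msg))
# -> {'device_id': 'BSM-01', 'hr': 132, 'spo2': 93, 'alarm': 'VPC'}
```

In the real system this logic lives in the iOS app (Swift/Objective-C) and is fed by the serial Wi-Fi adapter; the Python form here is only for compactness.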
In a further test of our system, a normal electrocardiogram (ECG), an abnormal ECG generated by a Human Patient Simulator (HPS) (SimMan, Laerdal) and alarms for ventricular premature contraction (VPC), heart rate (HR) and ST-segment level were transmitted to the smartphone from the patient monitor. The app received and displayed the device ID, HR, SpO2, information from 13 different potential alarms and the patient image in real time. An alarm transmitted from the patient monitor triggered lights and vibration on the smartphone. Tests using the simulator revealed that the alarms for VPC, HR and ST level worked as intended. Technical feasibility was assessed with an HPS providing normal and abnormal ECGs and by changing the alarm threshold. Our results demonstrate that our system obtains vital signs and alarms from the patient monitor and captures the patient image in a simulated environment. At present, a portable battery powers the Raspberry Pi, so an environment that can ensure a stable power supply is needed. Because our system can display the patient and the reason that the alarm has occurred, it is capable of complementing an existing alarm system. Moreover, our system can be introduced by utilizing existing equipment. It may be useful in other daily clinical situations.

Adverse event reporting in noncommercial non-interventional studies Amiri M 1, Ose C 2; 1 Zentrum für klinische Studien Uniklinikum Essen, Essen, Germany. In light of the growing role of the noncommercial non-interventional study (NIS) in Europe, this article explores the challenges of monitoring drug safety activities in one type of noncommercial NIS, the investigator-initiated non-interventional post-authorisation safety study (PASS), also known as post-marketing surveillance in the form of an observational safety study (German: Anwendungsbeobachtungsstudie) or drug registries.
''Clinical studies in the public interest without private financing are characterized as non-commercial clinical research. In non-commercial studies, the substance being researched frequently already has marketing approval for the investigated indication group'' [1]. Noncommercial NIS are typically financed by an academic institution such as a university hospital, and hence commercial profits, like increasing the sales of a particular drug product, are unlikely to have any influence on their results. However, a pharmaceutical company can also financially support the study, provided it remains impartial with regard to the design, the assessment of the obtained data and the results of these studies [2]. These types of studies can serve as a valuable tool to provide real-life information about the safety of a marketed medicinal product and are regulated by the German drug law (Arzneimittelgesetz, AMG). Although such studies are useful to safeguard human health, some issues regarding the pharmacovigilance activities remain under discussion. The EU Guideline on Good Pharmacovigilance Practices (GVP Module VIII, Post-authorisation safety study) provides fundamental instructions for the performance of pharmacovigilance in accordance with the legislation for marketing authorization holders (MAH) and commercial NIS, but it does not specifically address noncommercial NIS. Furthermore, the obligation to record and report adverse drug reactions in NIS is regulated according to section 63b of the 12th amendment of the German drug law of 2012. To this end, some measures that need to be taken to fulfill the regulatory demands on pharmacovigilance responsibilities in noncommercial NIS will be discussed in this article.

Workshop: online learning in health informatics: best practices and lessons learnt Ammenwerth E 1, de Keizer N 2, Koch S 3, Haas P 4; 1 UMIT - Private Universität für Gesundheitswissenschaften, Med.
Informatik und Technik Tirol, Hall in Tirol, Austria; 2 Academic Medical Center, Amsterdam, Netherlands; 3 Karolinska Institutet, Stockholm, Sweden; 4 Fachhochschule Dortmund, Dortmund, Germany Health informatics programs have existed in Europe for many years. Only recently have programs and courses been developed that are based on online learning instead of classroom teaching. Various educational theories and tools for online learning exist. In this workshop, we present four approaches to online-based health informatics courses from four countries: a group-oriented, interactive online module based on learning activities; a massive open online course (MOOC) for self-paced learning by large groups of students; self-paced instructional online modules with assignments, quizzes and exams; and an e-learning portal based on eLearning objects (eLOBs). In the workshop, we will first present the objective, target group, organization, content, online didactics and first experiences of these approaches. In small group work and a plenary discussion, we will then discuss best practice for online learning in health informatics and try to formulate recommendations. Systematic health IT evaluation studies are needed to ensure system quality and safety, both to create sound evidence and to ensure its application as part of an evidence-based health informatics approach. Well-trained health informatics specialists are required to guarantee that health IT evaluation studies are conducted in accordance with robust standards. Policy makers and managers need to appreciate how good evidence is obtained by scientific process, and be able to identify where this is the case. This contribution presents recommendations for the structure, scope and content of a university-based health IT evaluation course at the master or postgraduate level. These recommendations are based on a series of international workshops combined with a structured analysis of available courses and of the available literature. 
The recommendations describe 15 mandatory topics and 15 optional topics for a health IT evaluation course. Experiments with cross-language information retrieval on a health portal for psychology and psychotherapy Andrenucci A 1 1 Dept. of Computer and Systems Sciences, Stockholm University, Stockholm, Sweden Few studies have been performed on cross-language information retrieval (CLIR) in the field of psychology and psychotherapy. The aim of this paper is to analyze and assess the quality of available query translation methods for CLIR on a health portal for psychology. A test base of 100 user queries, 50 multi-word units (WUs) and 50 single WUs, was used. Swedish was the source language and English the target language. Query translation methods based on machine translation (MT) and dictionary look-up were used to submit query translations to two search engines: Google Site Search and Quick Ask. Standard IR evaluation measures and a qualitative analysis were used to assess the results. The lexicon extracted by word alignment of the portal's parallel corpus provided better statistical results among the dictionary look-ups. Google Translate provided more linguistically correct translations overall and also delivered better retrieval results among the MT approaches. Cross-language information retrieval and the medical domain, a review of the main approaches Andrenucci A 1 1 Dept. of Computer and Systems Sciences, Stockholm University, Stockholm, Sweden The information retrieval process in which the query submitted by users and the documents retrieved by the search engine are in different languages is called cross-language information retrieval (CLIR). Query translation is the most common translation approach, since it is less computationally costly and easier to maintain (Kishida 2005, Rosemblat et al. 2003). User queries in the source language are translated into the target language in order to match the document collection. 
There are three main types of query translation methods: approaches based on (a) bilingual dictionary search, (b) machine translation (MT) and (c) parallel corpora (Gey et al. 2005, Kishida 2005). The aim of this paper is to discuss these approaches and their implementation in the medical domain, and to elicit their differences and contexts of pertinence. We conducted a literature review of articles on CLIR from 1998 to 2016 based on PubMed, ACL proceedings, Google Scholar, CLEF and relevant publications referenced in those papers. To our knowledge this is the first comparison of the three approaches that focuses on the medical domain. Results CLIR based on MT translates the source-language query into a target-language query with the help of commercial machine translation software. Linguistic and semantic analyses are applied to user queries in order to improve translation quality. MT approaches have sometimes performed worse than dictionary-based translations, particularly for short queries: the information contained in one-word queries can be too limited to determine the context of the word. High-quality MT systems are also difficult and expensive to build and are sometimes limited to specific languages (Zhu & Wang 2006). Another drawback of MT is that systems tend to lack adaptation to specific domains (such as the medical domain) and have economic and usage constraints (Pecina et al. 2014). In recent years, machine translation approaches based on statistics (SMT) and training data have become increasingly popular. The strength of this approach is that it applies advanced machine learning techniques, training translation models on monolingual and bilingual resources in specific domains. Dictionary-based methods (also called dictionary look-up) implement word-by-word translation of queries using bilingual dictionaries. Among the drawbacks of this approach are ''out-of-vocabulary words'' and ''ambiguity'' of the translation candidates. 
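The word-by-word dictionary look-up just described, together with its two named drawbacks, can be sketched in a few lines; the Swedish-to-English lexicon entries below are invented for illustration and are not taken from the portal's actual lexicon:

```python
# Hedged sketch of dictionary-based (word-by-word) query translation.
# The lexicon is a toy stand-in for a bilingual dictionary or a lexicon
# extracted from an aligned parallel corpus.
LEXICON = {
    "behandling": ["treatment", "therapy"],  # "ambiguity": several candidates
    "depression": ["depression"],
}

def translate_query(query):
    """Translate each query word; out-of-vocabulary words are kept as-is."""
    return [LEXICON.get(word, [word]) for word in query.lower().split()]

print(translate_query("behandling depression"))
# -> [['treatment', 'therapy'], ['depression']]
```

Choosing among the candidate translations (disambiguation) and the out-of-vocabulary fallback correspond exactly to the two weaknesses the review attributes to this approach.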
In a parallel corpus, documents can be aligned at the sentence level and at the word level. Multilingual domain-specific resources are not available in many languages, and it is time-consuming and costly to build multilingual corpora and align them at the sentence and word level. Recent CLIR research shows that researchers tend to combine these approaches: this holistic view provides better translation and retrieval results overall. The best performing team at CLEF 2014 (Cross-Language Evaluation Forum) (SNUMedinfo, Shen et al. 2014) uses an MT system (Google Translate) to translate the queries into different languages. The system also uses the UMLS Metathesaurus (i.e. an advanced dictionary) to expand the queries with related terms. Research outside CLEF also confirms this trend. Trends in the incidence of hospitalized pelvic fractures in Germany during the years 2000-2011 Andrich S 1 , Haastert B 2 , Arend W 1 , Giersch R 1 , Jungbluth P 3 , Thelen S 3 , Windolf J 3 , Icks A 1 1 Institute of Health Services Research and Health Economics, Heinrich-Heine University Düsseldorf, Germany; 2 mediStatistica, Neuenrade, Germany; 3 Department of Trauma and Hand Surgery, University Hospital Düsseldorf, Germany Background Pelvic fractures are associated with a high health and economic burden for society, particularly among older people. The purpose of this study was to determine trends in the incidence of hospitalized pelvic fractures in the German population from 2000 to 2011. In our retrospective population-based observational study, we used data from the national hospital discharge register and were therefore able to cover 99 % of all hospital discharges in Germany. Pelvic fractures were identified by discharge diagnosis (ICD-10): S32.1, S32.2, S32.3, S32.4, S32.5 and S32.8. 
Assuming a Poisson distribution, we analyzed the pelvic fracture incidence per 10,000 person-years for each year along with 95 % confidence intervals (CI), overall and stratified by age (age groups 0-29, 30-59, 60 years or older), sex and region (Eastern and Western Germany). We accounted for recurrent and double registrations by employing a correction factor of 0.74 estimated from individual claims data. Incidence rates were age- and sex-standardized to the German population in 2009. Poisson regression models were used to estimate age-, sex- and region-adjusted annual changes (incidence rate ratios, IRR; 95 % CI). The number of hospital discharges due to pelvic fractures in Germany increased from 22,919 in 2000 (German population 82,259,540) to 39,807 in 2011 (German population 81,843,743). During the whole observation period, the overall standardized incidence rates (including the correction factor) for women were higher than for men (2011: 4.16 [4.11-4.21]). The etiology of pelvic fractures differs between younger and older people. In the older population, we found a relevant increase in the incidence of pelvic fractures during the study period, even after adjusting for age. Varying patterns in the incidence of pelvic fractures were observed depending on age, sex and region, especially in the younger age group. Notably, only the incidence trends for boys/men in Eastern Germany were significantly declining. Smoking is one of the major causes of diseases such as lung cancer. Beyond this risk factor, the aim of the present study was to evaluate whether there is an association of coronary artery calcification (CAC) with lung cancer development in the population-based Heinz Nixdorf Recall study (HNR), irrespective of the confounding effects of smoking status, number of cigarettes actually smoked and socioeconomic status (SES). The HNR comprises 4,814 individuals aged 45-75 years from the German Ruhr area. 
Complete data were available for 4,188 study participants with a median follow-up time of 10.21 years. Incident lung cancer cases were defined by ICD-10: C33-C34. To examine the effect of CAC on lung cancer risk, Cox proportional hazards regression was used to calculate hazard ratios (HR) with 95 % confidence intervals (95 % CI). CAC score was categorized as 0, 1-10, 10-100, 100-400 and >400. Time to event was either the time from study start to the date of lung cancer diagnosis or to last contact. As an indicator of SES, total years of formal education were used, categorized in four groups: ≤10, 11-13, 14-17 and ≥18 years. The full adjustment set included age, sex, smoking status at baseline (current, ex- and never-smokers) and education categories. All analyses were carried out using SAS 9.4 (SAS Institute, Cary, NC). The mean age of the study population was 59.6 ± 7.8 years; 2,084 (49.76 %) were male. 992 (23.69 %) were current smokers, 1,391 (33.21 %) ex-smokers and 1,805 (43.10 %) never smokers. There were 0.4 (95 % CI 0.2; 1.0), 1.2 (0.5; 2.4), 1.5 (0.8; 2.4), 1.8 (1.0; 3.2) and 4.0 (2.5; 6.0) lung cancer cases per 1,000 person-years in CAC categories 0, 1-10, 10-100, 100-400 and >400, respectively. In the crude model, HRs of 2.7 (95 % CI 0.9; 7.8), 3.1 (1.2; 8.1), 4.2 (1.6; 11.1) and 9.0 (3.6; 22.2) were observed in the CAC categories 1-10, 10-100, 100-400 and >400, using the group with no detectable CAC as reference. Adjusting for sex and age did not change the results significantly. Adjusting additionally for smoking status and number of cigarettes smoked, the HRs decreased to 2.6 (0.9; 7.6), 2.5 (0.9; 6.7), 3.0 (1.1; 8.2) and 5.5 (2.1; 14.8), whereby smoking status itself showed HRs of 2.7 (0.8; 8.4) for ex-smokers and 12.5 (4.3; 36.7) for current smokers, with never smokers as reference. With the complete adjustment set, the HRs dropped to 2.3 (0.8; 6.7), 2.2 (0.8; 5.8), 2.5 (0.9; 6.9) and 4.8 (1.8; 12.8). 
For male subjects the effect was higher in the CAC categories 100-400 and >400, with HRs of 3.5 (0.8; 15.8) and 6.1 (1.4; 26.9), respectively. We observed high effect size estimates especially for the highest CAC category. With adjustment for the strongest confounder, smoking, and for socioeconomic status, the risk decreased but was still present. We assume that CAC represents lifestyle factors besides smoking, as well as factors additionally associated with SES. The increased risk for men might be due to the higher number of men in the upper CAC categories. Development of a smartphone-based motion measurement system for the elderly and hemiplegia patients by use of frequency analysis Arisaka N 1 , Mizuno K 2 , Mamorita N 3 , Shiba Y 1 , Shimizu S 1 , Matsunaga A 1 , Tsuruta H 4 1 Kitasato University, Sagamihara-shi, Japan; 2 Kitasato University Hospital, Sagamihara-shi, Japan; 3 Kitasato University, Sagamihara, Japan; 4 Kitasato University, Sagamihara, Kanagawa, Japan Background and objectives Quantitative assessment of ambulatory function plays a very important role in rehabilitation for the elderly. However, special equipment for accurate gait measurement, such as a three-dimensional position analyzer, is complex and expensive [1] . Such special equipment is therefore often unavailable for routine gait assessment of the elderly. In this study, we developed a posture analysis system based on a small information terminal so that it can be used for gait assessment of the elderly in research and clinical practice. In addition, pedometers designed for healthy people cannot count the number of steps accurately when the elderly walk slowly [2, 3] . We therefore devised a new algorithm for smartphones that detects the number of steps during slow walking. Methods Overview First, we developed a stand-alone system (ver. 1) for measuring the patient's posture, and verified the accuracy of the measured angles. 
Second, we developed a remote control system with a control terminal and two measurement terminals (ver. 2). Finally, we devised a new algorithm that detects the number of steps in the slow walking of patients with hemiplegia, and verified the accuracy of this algorithm (ver. 3). Ver. 1 Subjects wore two iPod Touch (5th generation) devices on the chest (anterior to the sternum) and lumbar (posterior to the sacrum) regions, and inclination angles in the anteroposterior and horizontal directions were measured. Ver. 2 When a user touches the measurement terminal, unwanted signal noise is generated. For that reason, we developed a controller for remotely controlling the measurement terminals. Ver. 3 We devised a new algorithm based on frequency analysis. Accuracy of measurement during standing and walking The maximum inclination angles at the front of the sternum and the rear of the sacrum were measured simultaneously with a set of measurement terminals and a three-dimensional position analyzer. The new pedometer algorithm We compared our new algorithm with three conventional (commercial) pedometers in five healthy adults (all in their 20s). In addition, the measurement was verified in 31 patients with hemiplegia whose walking speed was less than 30.0 m/min (mean (SD) walking speed 18.9 (7.7) m/min). Results Accuracy of measurement during standing and walking For anteversion, there were no significant differences between the instruments under any condition, and the error was within the acceptable range. The new pedometer algorithm The measurement error of the number of steps was 8.85 % when patients walked very slowly (<30 m/min); with conventional pedometers it was impossible to count steps at such speeds. We have developed a body motion analysis system using smartphones. It was possible to measure the change in posture during both standing and walking. 
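A frequency-analysis step detector of the general kind described here can be sketched as follows; the gait-frequency band, sampling rate and synthetic test signal are illustrative assumptions, not the authors' actual algorithm:

```python
import math

def count_steps(signal, fs, band=(0.3, 2.0)):
    """Estimate steps from a 1-D acceleration signal sampled at fs Hz by
    finding the dominant frequency within a plausible slow-gait band."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [x - mean for x in signal]  # remove the DC (gravity) component
    best_freq, best_power = 0.0, 0.0
    for k in range(1, n // 2):            # naive DFT over positive bins
        freq = k * fs / n
        if not band[0] <= freq <= band[1]:
            continue
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(centred))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(centred))
        power = re * re + im * im
        if power > best_power:
            best_freq, best_power = freq, power
    return round(best_freq * (n / fs))    # steps ~ dominant frequency x duration

# Synthetic slow gait: a 1 Hz oscillation for 10 s sampled at 50 Hz.
fs = 50
sig = [math.sin(2 * math.pi * 1.0 * i / fs) for i in range(10 * fs)]
print(count_steps(sig, fs))  # -> 10
```

Unlike threshold-based pedometers, a band-limited spectral peak remains detectable at very low cadences, which is why a frequency-domain approach suits slow hemiplegic gait.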
Furthermore, our new counting algorithm shows promise as a pedometer for very low-speed walking. Our system is used routinely in five hospitals for measurements of the elderly, patients with hemiplegia and patients with Parkinson's disease. Treatment of patients in the acute phase of severe traumatic injury (SMI) remains a challenging problem of unquestionable importance. The growing number of cases (up to 45 cases per 100,000) and a non-decreasing mortality rate (about 16-45 %) make SMI one of the leading causes of death in the 21st century. The acid-base blood (ASB) test is a worldwide standard in the treatment of patients in a critical state. However, the clinical value of this test remains underestimated, especially for patients on ventilatory support. The current study aims to propose a new integral indicator based on the patient's routine capillary-blood ASB test and a trained artificial neural network. Methods ASB test results from 367 SMI patients, more than 1,400 tests in all, were used to train a single-layer neural network. As a result, a linear classifier was developed. The output of this classifier is an integral value varying from +10 to -10. Negative values indicate a negative prognosis for the patient's status. For clinical assessment of the proposed method, a special module was added to the Clinical Information System. This module allows the output of the neural network to be compared with the clinical evaluation of highly experienced medical experts. From the beginning of 2014 to the present, 82 SMI patients have been included for validation of the proposed neural-network prognosis method. The overall correlation between clinical expert evaluation and the output of the developed neural network was 87 %. The results of the current study show the practical value of the proposed method for evaluating and predicting SMI patient status based on routinely performed ICU blood tests using a specially adapted neural network technique. 
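A single-layer (linear) classifier with an integral output in roughly (-10, +10), as in the abstract above, could look like the following sketch; the feature names, weights and squashing function are invented placeholders, since the abstract does not publish the trained coefficients:

```python
import math

# Hypothetical coefficients for standardized acid-base features; the actual
# model described above was trained on more than 1,400 ASB test results.
WEIGHTS = {"ph": 4.0, "pco2": -1.5, "be": 2.0}
BIAS = 0.0

def prognosis_score(features):
    """Linear combination of ASB features squashed into (-10, +10);
    negative scores indicate a negative prognosis."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 10.0 * math.tanh(z)

score = prognosis_score({"ph": 0.5, "pco2": 0.2, "be": -0.1})
print(round(score, 2))
```

In practice the weights would be fitted to expert-labelled outcomes, and the squashing to a bounded integral scale is one simple way to obtain the reported +10 to -10 range.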
Background Nipah virus poses an imminent threat in Bangladesh, which has experienced almost yearly outbreaks since 2001. A comprehensive and coherent understanding of the disease is of paramount importance due to the high case fatality. The objective of this review is to summarize the epidemiology and control of Nipah encephalitis in Bangladesh. We searched three databases, PubMed, Web of Science and Google Scholar, using search terms appropriate for each database combined with Boolean operators, which yielded 144, 113 and 1,530 articles respectively. We screened titles first for relevance and then abstracts. 47 articles met our criteria and were finally reviewed. A literature matrix was compiled in Excel for analysis. Results Nipah outbreaks in Bangladesh differed from the initial outbreak in Malaysia in 1999. Unlike the Malaysian outbreak, person-to-person transmission occurred in Bangladesh, leaving humans most vulnerable. Infections were reported in domesticated pigs, dogs and cats in South Asia, and a serological study in Bangladesh showed 6.5, 4.3 and 44.2 % seroprevalence in cattle, goats and pigs respectively. Pteropus bats play a key role in Nipah transmission, and risk factors include drinking raw date palm sap contaminated with bat urine or faeces, eating half-eaten fruit laden with bat saliva, and close contact with cases or involvement in their burial. As date palm sap is harvested during winter, Nipah outbreaks mostly occurred during this season. Nipah occurred within the ''Nipah belt'' and had a high case fatality of more than 70 %. Avoiding raw date palm sap and drinking sap only from trees protected with a bamboo skirt or ''bana'' remain the principal preventive measures, along with strict infection control in hospitals to prevent secondary cases. Conclusion Extensive knowledge of this emerging zoonosis is the key to prevention and saving lives. 
Community sensitization and multisectoral intervention through a One Health approach can prevent this deadly disease, especially in outbreak areas. Use of SNOMED-CT for the transmission of laboratory infection prevention data Aschhoff M 1 , Treinat L 2 , Heitmann KU 3 , Thun S 4 , Dewenter H 5 , Pantazoglou E 4 1 ZTG Zentrum für Telematik und Telemedizin, Bochum, Germany; 2 ZTG Zentrum für Telematik und Telemedizin GmbH, Bochum, Germany; 3 HL7 Deutschland e. V., Köln, Germany; 4 Hochschule Niederrhein, Krefeld, Germany; 5 Hochschule Niederrhein, University of Applied Sciences, Krefeld, Germany According to German legal specifications, each resident physician, hospital and laboratory is obliged to transmit data about notifiable diseases or agents to the relevant health authority. In cases of reasonable suspicion of, illness from, or death due to an infectious disease, specific information is communicated differently by laboratories and physicians. Proprietary modes of transmission carry risks such as deficient or incomplete availability of data. At the least, these circumstances imply unpredictable health-related hazards for the population [4] . As the carrier of the information, HL7's Clinical Document Architecture (CDA) [1] is used to define the contents of the notifiable disease documentation. The structure of this CDA is based on the IHE Sharing Laboratory Reports (XD-LAB) [2] Integration Profile specification. One of the biggest challenges in this project is dealing with different code systems: local health authorities and laboratory systems predominantly use different code systems to deliver data. The internationally established medical terminology SNOMED CT can contribute to semantic interoperability and a highly specific description of laboratory findings. The applicability of SNOMED CT is to be tested in the domain of laboratory findings concerning notifiable infectious agents. 
As part of the project, the International Health Terminology Standards Development Organisation (IHTSDO) [3] granted a research license to be used to determine the benefit of using SNOMED CT to bridge the gap between the local code systems of laboratory systems and local health authorities. SNOMED CT will be used to describe the specimen and the pathogen, and this information will be delivered to the local health authority. The implementation of the CDA guideline was successfully tested in January 2016. We are currently testing this solution in a local scenario. A systematic review of computerized decision aid tools in pregnant women Aslani A 1 1 Shiraz University of Medical Science, Tehran, Iran, Islamic Republic Of Abstract Objective: Involving pregnant women in clinical decision making and keeping them well informed during their pregnancies supports patient autonomy. Computerized decision aids (CDAs) are a set of tools that can help patients make informed decisions about their health. The main objective of this paper is to review the literature on CDAs in pregnancy and childbirth and their effects on knowledge acquired, decisional conflict, and clinical outcomes. Methods: We included published studies of interventions designed to help pregnant women make decisions about screening options and obstetrical treatment. Two authors systematically searched the electronic bibliographic databases Ovid MEDLINE (1946 to October 2013) and Ovid Embase (1980 to October 2013) for articles published in English. We included only observational studies and RCTs in which computerized tools were used; studies that used only non-computerized tools (e.g., paper pamphlets, educational brochures, booklets, audio-booklets, decision boards) were excluded. Two reviewers assessed the quality of the selected studies with ''The Cochrane risk of bias assessment'' tool [1] . 
The search keywords were website, internet, ''decision support'', ''decision aids'', alert, reminder, teleconsult*, pregnancy, gestation, matern*, obstetrics. Results: We identified 190 potentially relevant articles on the basis of the abstract (from n = 3,511 titles). 182 articles were excluded on grounds of irrelevance to the population, interventions, outcomes, or study design. The reference lists of the included studies and relevant reviews led to the identification of two additional articles. The ten included studies comprised seven randomized controlled trials (RCTs) and three observational studies. Six studies used computerized decision aids in the form of a computer program, computer-based information system, or computer-generated reminders, and four studies assessed decision aids in the form of email or web-based consultation. Compared to written materials or routine care, CDAs improved pregnant women's knowledge of specific subjects [2, 3, 4] . A CDA tool is more effective than a written informational document in reducing decisional conflict (based on O'Connor's decisional conflict scale) (p < 0.05). Three studies demonstrated improved health outcomes, including an increased rate of vaginal delivery after a previous cesarean section [5] , undergoing invasive prenatal testing [4] , and intended external cephalic version for breech presentation [3] , but the differences were not statistically significant. Conclusion: Computerized decision aids showed improved effects on various health outcomes (higher knowledge acquisition, reduced decisional conflict) compared to traditional written patient education pamphlets and routine care. Further research on other potential effects of CDAs on patient decisions and clinical outcomes is still required. Dengue fever is among the top 13 neglected tropical diseases as classified by the WHO. Pakistan has been under major threat from dengue fever for more than a decade [1] . Punjab has been severely affected by dengue fever since 2006. 
The epidemiological trend, however, is now shifting. Since 2006 the number of people infected with dengue fever has risen continuously [2] . This paper discusses the general trends and demographic features of dengue fever in the Swat region of Pakistan. In 2013 the Swat region, which had been under insurgency for several years, was hit by a major dengue outbreak [3, 4] . This outbreak was comparable with the 2011 outbreak in Lahore, Punjab, in which more than 300 people died. In this outbreak a total of 9,032 cases of dengue fever and 33 deaths were reported [5, 6, 7] . The outbreak started suddenly in August and ended in November 2013. More than half of the cases (6,473) were reported in September. Methods Data on all reported cases of dengue fever were collected from the district health offices, which collect the reported data. As dengue is a notifiable disease, physicians must report any case immediately. Since the devolution of powers, the district health information system has worked efficiently for the collection of routine health data. Data were cleaned and then visualized graphically in MS Excel 2010. ArcGIS 10.3 was used to generate an incidence rate map of dengue in the district at union council level. Results are presented below in the form of several charts, which show the distribution of dengue cases by location, by month and by gender during the sudden 2013 outbreak. The map generated shows the incidence rate for each union council in Swat district. Males accounted for 6,157 (68 %) of the reported cases, while the remaining 2,875 (32 %) were female. The mean age of reported cases was 29.70 years. Several factors may have contributed to this sudden rise of dengue fever in Swat. In all outbreaks, the male-to-female ratio of the infected population has been around 70:30. 
This may be attributed to cultural practices, since women normally do not go out to work while men work in the fields. It is suspected that government operations against insurgents led to underreporting of the epidemic in the region, which contributed to the sudden outbreak. The climatic conditions and land-cover/land-use changes that might contribute to dengue outbreaks need further exploration. The best control measure for dengue virus is to control its vector. Conclusion It can be concluded that the period from August to October is a favorable season for the growth of the mosquitoes that are the vector of dengue virus. During this period, active vector control measures must be taken, along with awareness campaigns, to effectively control the spread of dengue infection. Areas highly affected by dengue must be given special consideration for effective dengue control. Information System (HIS). HIS is considered a prerequisite for the efficient delivery of high-quality health care in hospitals [2, 3] . Hospital information systems focus on the integration of all clinical, financial and administrative applications and thus could also be called integrated hospital information processing systems (IHIPS). An HIS is useful for optimizing the performance and workflow of hospitals. It can also help to overcome the issues hospitals face in accomplishing their tasks in a timely manner [4] . However, there are very few studies on the adoption and implementation of hospital information systems in developing countries, because developing one is a big challenge, as stated by several authors [5] [6] [7] . Methods A cross-sectional survey was carried out in different public-sector tertiary hospitals of Punjab and Islamabad. Data were collected through visits, interviews and the distribution of questionnaires. The interviews and visits were conducted in a selected group of hospitals, for which proper appointments or official contacts had already been made. 
The data obtained from the questionnaires filled out by the respondents were coded, entered and analyzed with the Statistical Package for the Social Sciences (SPSS) version 17. In total, eight major tertiary care hospitals were included in this study. The survey team, together with the hospital managers and MIS staff, completed the questionnaires, which were received in time for data analysis and report writing. As Lahore is the provincial capital and the most populous city of Punjab, four of the hospitals were selected from Lahore; thus half of the data for this survey was collected there. To improve our understanding of the management information systems, and for the sake of convenience, the findings of this study are summarized using a SWOT model. Strengths: It is encouraging to note that all the tertiary hospitals visited had a management information unit adequately staffed with professionals, most of them statisticians and information technology specialists. Weaknesses: Each hospital had some kind of data collection and processing system, but efficiency and appropriateness varied from location to location. There is no consistency or standardization of data elements among the hospitals studied; each has developed its own system without due consideration of what the others are doing. AAL new service excellence model-a framework for continuous new service development Auinger K 1 , Kriegel J 2 , Reckwitz L 3 , Schmitt-Rueth S 4 , Kraenzl-Nagl R 1 1 Fachhochschule Oberösterreich, Linz, Austria; 2 FH Oberösterreich, Linz, Austria; 3 Fakultät für Mathematik und Informatik, Georg-August-Universität Göttingen, Göttingen, Germany; 4 Fraunhofer-Arbeitsgruppe für Supply Chain Services, Nürnberg, Germany This paper presents the design of an Ambient Assisted Living (AAL) specific framework for Continuous New Service Development. 
It is known that the development of AAL services aimed at reducing the complexity of the living environments of elderly citizens often does not lead to the desired results on the market. Our research addresses this problem with a new Service Excellence Model (SEM) and outlines the benefits of this specific approach. The research is based on data from the PenAAL project (Performance Measurement Index for AAL solutions). Part of the project was the classification of business dimensions and the development of a related scoring tool for continuous benchmarking. Hormone replacement therapy and mammographic density-a systematic review Azam S 1 , Aro AR 1 , Jovanovic Andersen Z 2 1 University of Southern Denmark, Esbjerg, Denmark; 2 University of Copenhagen, Copenhagen, Denmark Hormone replacement therapy (HRT) has been associated with an increased risk of breast cancer. Mammographic density (MD) is one of the strongest risk factors for breast cancer. The aim of this study was to systematically review studies assessing the association between HRT use and MD, separately for different HRT regimens. We systematically searched the MEDLINE database using the six search terms ''hormone replacement therapy'', ''postmenopausal hormone replacement therapy'', ''oestrogen-progestin hormone replacement therapy'', ''HRT'', ''combined hormone replacement therapy'' and ''oestrogen alone hormone replacement therapy'' in combination with ''mammographic density'' and ''breast density'', from 2013 until March 2015. We included in this review studies that were published after 2013, were accessible in full-text format, were designed as randomized controlled trials (RCT), cohort, case-control, or cross-sectional studies, and were published in English. A descriptive analysis of data from the included studies on the association between HRT use and MD for different HRT regimens was performed. In addition, the study design features of all included studies were assessed. 
In total, twenty-two articles were selected. Eleven studies found that MD increased in ever HRT users compared to never users, and the highest increase in MD was observed in current users compared to former and never HRT users. Six studies showed that using oestrogen + progestin (E + P) was associated with higher MD than oestrogen alone. In addition, 4 studies reported that continuous oestrogen + progestin (CEP) users had higher MD than sequential oestrogen + progestin (SEP) and oestrogen-alone users. However, 2 studies showed that SEP users had higher MD compared to CEP and oestrogen-alone users. Conclusions Epidemiological evidence is rather consistent in suggesting a positive association between HRT use and MD among menopausal women, with the highest increase in MD among current users and CEP users. The potential public health implications of this association are increased breast discomfort, reduced sensitivity and specificity in detecting small tumours during mammography screening in women using HRT, and possibly increased breast cancer risk. registry for intensive care (IC) called NICE [2]. To this end a mapping was realized between the items in the quality registry and the DCMs. 2. Methods In the Netherlands, 44 DCMs are available describing different parts of the care process and clinical findings (e.g. demographics, contact information, blood pressure, etc.). The NICE registry contains 26 items on demographics and administrative information, 53 on physiology and laboratory outcomes, and 472 diagnoses, i.e. 445 reasons for admission based on the APACHE IV classification [3] and 27 comorbidities. These diagnoses have been mapped to SNOMED CT. 3. Results Twenty-two percent of the registry items could be mapped to a DCM. For 14 % of the items a DCM was not available. For the remaining 64 % of the items a DCM was available, but the context used in the definition of the item (e.g.
worst value in the first 24 h of admission) could not be represented with the DCMs, and therefore a full mapping could not be realized. Sixty-eight percent of the reasons for IC admission could be mapped to a precoordinated SNOMED CT concept; the other 32 % needed postcoordination. Twenty-two percent of the comorbidities could be mapped to a SNOMED CT concept. For the remaining comorbidities, the definition was either too ambiguous (37 %) or too narrow with additional context (41 %) to be mapped to SNOMED CT. 4. Discussion DCMs were initially meant to support data exchange between institutions, and therefore items related to the location of a patient within an institution (e.g. IC admission date) cannot be captured with current DCMs. Applying DCMs made explicit that the use of international terminologies such as SNOMED CT is assumed, but not yet realized for the entry of diagnoses in the Netherlands. It also clarified that the definitions of some data items in the quality registry are ambiguous, which should be resolved. DCMs are meant to aid standardized point-of-care data capture for the clinical process. However, like any quality registry, the NICE registry requires specific context for data items that is currently not included in the DCMs. For instance, for physiological and laboratory outcomes, the NICE registry requires the worst values (e.g. for blood pressure) in the first 24 h of admission. Although there is, for instance, a DCM for blood pressure, the context related to the worst value in the first 24 h of admission cannot be captured. This registry-specific context should be handled in the software using queries when extracting the data to the registry. 4.1. Conclusion Changes in the registry items and extensions of the DCMs are required for DCMs to aid a quality registry.
The panel discussion will have participants from ECDC and the health informatics community discussing the state of the art and the future of information management in the domain of infectious disease prevention and control. ECDC authors will briefly present current and future characteristics of EU-regulated information flow and management regarding data on infectious disease epidemiology and epidemic intelligence, with special emphasis on data-intensive activities such as molecular surveillance. External panelists are asked to introduce the possible influence of new paradigms of big data, reusable semantic assets, medical terminologies, and ontologies on infectious disease information management. Other areas to investigate are the role of social media and emerging fields such as digital surveillance. The ECDC Chief Scientist will chair the meeting, where interested participants will be asked to contribute their ideas and suggestions. Mission statement for the scientific panel The mission of the ECDC panel meeting is to establish the state of the art and the future of information management for current and emerging threats posed by infectious diseases. By asking the scientific community how to strengthen and renew information management, ECDC would like to pool Europe's knowledge to develop authoritative scientific opinions. Quality control of a health information system for causes of death statistics in a country in West Africa: capture-recapture method Barro GS 1,2 , Dufour JC 3 , Dandjinou M 4 , de Lame Pa 5,6 , Staccini P 7,8 Background Cardiorespiratory fitness (CRF) has been demonstrated to be a strong predictor of cardiovascular and all-cause mortality. Alcohol consumption and smoking are major risk factors for coronary and pulmonary disease, and smoking is common in alcohol drinkers. Data regarding the relationship between alcohol consumption and CRF are scarce, and data on the joint effects of alcohol drinking and smoking on CRF are lacking.
The aim of the present study was (a) to examine the dose-response relation of average alcohol consumption and risky single-occasion drinking (RSOD) with CRF, and (b) to investigate joint associations of average alcohol consumption, RSOD and smoking with CRF. We analyzed data from four independent population-based studies (Study of Health in Pomerania [SHIP], SHIP-Trend, US National Health and Nutrition Examination Survey [NHANES] 1999-2004, Health Interview and Examination Survey) including 10,651 men and women aged 20 to 85 years without lung disease or asthma. CRF was measured directly from exercise testing (peak oxygen uptake; peakVO2) or estimated using a standardized formula. Data on alcohol consumption and cigarette smoking were obtained from self-reports. We calculated the intake of alcohol in grams per day. RSOD was defined as the consumption of at least five drinks per occasion. Smoking was categorized as never, former, and current smoking. Analyses in the pooled data revealed that current abstainers and individuals consuming 60 grams of alcohol per day had lower peakVO2 than individuals consuming 10 grams of alcohol per day (linear regression coefficients β = -2.89 and β = -0.99, respectively). RSOD was not significantly associated with peakVO2. Analyses further revealed that current smoking modified the association between average daily alcohol consumption and peakVO2, while no effect of RSOD on peakVO2 was evident. In the present study, high levels of average daily alcohol consumption were associated with low peakVO2 values, indicating a detrimental influence of high alcohol consumption on CRF. The fact that peakVO2 was low in current abstainers lends support to the 'sick quitters' hypothesis. Smoking was found to modify the association between average daily alcohol consumption and CRF, underlining the impact of the presence of both risk factors on CRF. The importance of RSOD for CRF needs to be further elucidated.
Associations of physical activity and cardiorespiratory fitness with incident and recurrent major depressive disorder, depressive symptoms and incident anxiety in a general population Baumeister SE 1 , Leitzmann M 2 , Bahls M 3 , Dörr M 4 , Schmid D 5 , Schomerus G 6 , Markus MR 4 , Völzke H 7 , Gläser S 4 , Grabe H [OR = 0.45 (0.24-0.84)] as compared to depression or anxiety alone. Conclusions Greater CRF, but not self-reported PA, was associated with a lower incidence of clinical depression and clinical anxiety. Anti-b2-glycoprotein I autoantibody expression as a potential biomarker for strokes in patients with anti-phospholipid syndrome Background Anti-phospholipid syndrome (APS) is an autoimmune disease. Cerebral ischemia associated with APS occurs at a younger age than typical atherothrombotic cerebrovascular disease, is often recurrent, and is associated with high positive IgG anti-phospholipid (GPL) unit levels. Methods All cases were under 50 years of age and had no recognizable risk factors. ELISA was used to evaluate the presence of the IgG isotype of aCL, ab2-GPI, and aPS autoantibodies in their blood. Results The results indicated that the frequency of ab2-GPI was 14/50 (28 %), aCL 11/50 (22 %), and aPS 9/50 (18 %) among stroke patients. In contrast, aCL was detected in 2/30 (6.7 %) of control subjects; each of the other anti-phospholipid antibodies (APLA) was never observed. Of all the ab2-GPI+ cases, the incidence of stroke patients having the combined profile of ab2-GPI + aCL was 11/14 (78.6 %) and of ab2-GPI + aPS was 9/14 (64.3 %). Only 2/14 (14.3 %) of these ab2-GPI+ patients also expressed aCL in the absence of aPS. The frequency of patients expressing all three markers was 9/14 (64.3 %). In none of the APS/stroke patients was aCL or aPS expressed in the absence of ab2-GPI. Conversely, IgG ab2-GPI as a sole marker was seen in 3/14 (21.4 %) of these patients (i.e. in the absence of either other marker).
Conclusion It can be concluded from these studies that, among the three major forms of APLA examined, the presence of IgG ab2-GPI autoantibodies correlated best with stroke in patients concurrently suffering from APS. Extraction of SNOMED CT® codes from unstructured German clinical text Becker M 1 , Böckmann B 2 1 Fachhochschule Dortmund, Dortmund, Germany; 2 FH Dortmund, Dortmund, Germany Automatic extraction of medical concepts from medical reports and their classification with semantic standards is useful for standardization and for clinical research. This paper presents an approach to SNOMED CT concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objective is to test whether the natural language processing tool is suitable for the German language, i.e. whether it can identify UMLS concepts and SNOMED CT codes. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so that the pipeline can normalize to domain ontologies such as SNOMED CT using German UMLS concepts. For comparison, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested with a set of 199 German reports including 11,797 UMLS concepts, of which 9,018 concepts could be mapped to SNOMED CT. The number of mapped codes is a quantitative characteristic and does not describe the quality or correctness of the mapping. A novel Bayesian dose-response meta-analysis model selection framework accounting for study-specific exposure distributions and reference levels Behrens G 1 , Leitzmann M 2 1 University of Regensburg, Regensburg, Germany; 2 Universität Regensburg, Regensburg, Germany In epidemiology, dose-response meta-analyses of published studies are frequently conducted to clarify the association between exposure and disease.
However, previous dose-response meta-analysis methods were prone to bias because they did not account for heterogeneous definitions of reference levels across studies and/or because the treatment of study-specific exposure distributions was sub-optimal. We previously presented a novel Bayesian cubic dose-response meta-analysis approach that was able to account for heterogeneous reference levels. We now present an extended version of that approach: our Bayesian method is now able to select dose-response models from a wide class of regression models and to incorporate study-specific exposure distributions. Our novel Bayesian framework may be applied to any regression model used to represent the dose-response relation. Suitable reparametrizations of the dose-response relation allow for the incorporation of study-specific reference levels into the model. Study-specific exposure distributions may be represented by (truncated) mixtures of distributions, whose choice will be influenced by study-specific participant numbers within exposure categories. Bayesian model selection is possible once a prior distribution on the model space is specified. Evaluation of our Bayesian model may be based on (Reversible Jump) Markov chain Monte Carlo samples from the posterior distribution. Those samples may be generated using tempered versions of the posterior distribution as auxiliary distributions in the sampling process (tempered Markov chain Monte Carlo methods). In a toy example, our novel Bayesian approach was able to substantially reduce the bias induced by the sub-optimal treatment of exposure and reference levels. Similarly, in an example from the literature, the novel approach altered the inference on the dose-response relation materially. Our novel Bayesian dose-response meta-analysis model selection framework accounting for study-specific exposure distributions and reference levels may substantially reduce bias in dose-response meta-analyses.
Limitations of the incidence density ratio as an approximation of the hazard ratio Bender R 1,2 , Beckmann L 3 1 Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, Köln, Germany; 2 Faculty of Medicine, University of Cologne, Cologne, Germany; 3 Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen (IQWiG), Köln, Germany Adverse events play an important role in drug approval as well as in the benefit assessment of drugs. For the analysis of adverse events, simple standard methods for contingency tables are commonly used. However, the application of simple standard methods may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal observation times. To account for varying observation times, incidence densities, i.e. events per patient-years, are frequently used to quantify the risk of adverse events. For comparisons between groups, incidence density ratios are used together with confidence intervals based upon the Poisson distribution. The corresponding results are interpreted in the same way as hazard ratios. In this talk, the validity of the incidence density ratio as an approximation of the hazard ratio is investigated by means of simulations. Methods A simulation study was performed using various survival time distributions with constant, increasing and decreasing hazard functions. A number of different data situations were considered, with various sample sizes, baseline risks, effect sizes, and differences in mean observation times between the groups. From the simulated survival times, hazard ratios were estimated using the Cox proportional hazards model. Incidence density ratios with corresponding confidence intervals were estimated by the usual formulas based upon the Poisson distribution.
As performance measures, the coverage probability, the relative bias, and the mean squared error were applied, using the true hazard ratio for the comparisons. As expected, the incidence density ratio is an appropriate approximation of the hazard ratio if the true hazard function is constant. For non-constant hazard functions, the incidence density ratio is an appropriate approximation of the hazard ratio only if the mean observation times of the groups are equal and the baseline risk is not larger than 25 %. If the mean observation times differ between the groups, the validity of the incidence density ratio depends on the true survival time distribution, the difference between the mean observation times, the baseline risk, and the sample size. In the case of large differences in mean observation times between the groups, the incidence density ratio is not a valid approximation of the hazard ratio if the true hazard function is not constant. As constant hazard functions are rarely justified in practice, adequate survival time methods accounting for different observation times, rather than the simple incidence density ratio, should be used to analyze adverse events. A checklist for the evaluation of published indirect comparisons and network meta-analyses Bender R 1 , Sturtz S 1 , Kiefer C 1 1 Institute for Quality and Efficiency in Health Care (IQWiG), Cologne, Germany Background Systematic reviews provide an overview of the available studies on a certain topic. By means of meta-analyses, pooled effect estimates can be calculated if the data considered are sufficiently homogeneous. Besides traditional meta-analyses, in which direct head-to-head studies comparing 2 interventions are summarized, indirect comparisons and network meta-analyses are increasingly used. Existing approaches for indirect comparisons and network meta-analyses are presented and explained. The main assumptions and requirements of these methods are described.
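For the special case of a constant hazard (exponential survival times), the agreement between the incidence density ratio and the hazard ratio discussed above can be illustrated with a small simulation. This is a minimal sketch in pure Python, not the study's simulation code; all parameter values are illustrative.

```python
import random

def simulate_idr(hazard_a, hazard_b, follow_up_a, follow_up_b, n=20000, seed=1):
    """Simulate exponential survival times in two groups, censor at the
    end of follow-up, and return the incidence density ratio (IDR):
    (events / person-time) in group A divided by that in group B."""
    rng = random.Random(seed)
    events = {"a": 0, "b": 0}
    person_time = {"a": 0.0, "b": 0.0}
    for group, hazard, follow_up in (("a", hazard_a, follow_up_a),
                                     ("b", hazard_b, follow_up_b)):
        for _ in range(n):
            t = rng.expovariate(hazard)   # true event time
            person_time[group] += min(t, follow_up)  # censor at end of follow-up
            if t <= follow_up:
                events[group] += 1
    rate_a = events["a"] / person_time["a"]
    rate_b = events["b"] / person_time["b"]
    return rate_a / rate_b

# Constant hazards with true hazard ratio 2.0 and unequal observation
# times: the IDR still recovers the hazard ratio (up to simulation noise).
idr = simulate_idr(hazard_a=0.2, hazard_b=0.1, follow_up_a=1.0, follow_up_b=3.0)
print(round(idr, 2))
```

With a non-constant hazard (e.g. Weibull survival times), the same computation would drift away from the true hazard ratio as the difference in observation times grows, which is the limitation the abstract investigates.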
A checklist for the assessment of published indirect comparisons and network meta-analyses is suggested. By means of examples, different types of indirect comparisons and network meta-analyses are described and the application of the checklist is explained. Within the framework of systematic reviews, indirect comparisons and network meta-analyses make it possible to estimate effects without corresponding direct head-to-head studies and to simultaneously analyse networks containing more than 2 interventions. The adequate application of these methods requires strong assumptions. For an adequate assessment of published results from indirect comparisons and network meta-analyses, transparent and detailed documentation is essential. Indirect comparisons and network meta-analyses represent an important advancement of traditional meta-analyses. However, the underlying assumptions and requirements have to be acknowledged. Dealing with missings in predicting determinants for musculoskeletal pain among Danish fishermen via conditional imputation and missing categories Berg-Beckhoff G 1 , Østergaard H 2 , Riis Jepsen J 2 1 University of Southern Denmark, Esbjerg, Denmark; 2 Centre of Maritime Health and Society, University of Southern Denmark, Esbjerg, Denmark The aim of the analysis is to estimate the determinants of musculoskeletal pain among Danish fishermen after several positive structural changes to the physical work environment have been initiated. The present analysis focuses on handling missing values via different methods, namely conditional imputation and a "missing" category in multiple regression analyses. Methods A cross-sectional survey of a random sample of Danish fishermen was conducted in 2015 using the Nordic questionnaire for musculoskeletal pain in nine different body regions (neck, shoulder, elbow, hand, upper back, lower back, hip, knee, and foot).
In total, 270 fishermen participated in the study (response rate 27 %). Determinants of musculoskeletal pain were tested using a multiple linear regression analysis with an overall pain score summing up all nine pain locations. Missing answers were replaced by the mean of the respondent's remaining answers (conditional imputation). A Cronbach's alpha of 0.91 indicates very good internal consistency of the scale. Additionally, multinomial logistic regression analyses considering relevant confounders were used to look at each single pain site, with missing as an additional outcome. In all analyses, sideline occupations, work position, vessel type, education, and duration at sea were considered as further predictors. The prevalence of pain was high for all musculoskeletal locations. In the multiple linear regression analysis, workload was positively associated with musculoskeletal pain (β = 0.05; 95 % CI 0.04-0.05). Having a sideline occupation was negatively associated with musculoskeletal pain (β = -0.15; 95 % CI -0.24 to -0.03). A linear regression model excluding all missing values revealed similar results. Multinomial regression models showed that workload was the only consistent predictor of musculoskeletal pain, in particular regarding upper and lower limb pain. Two additional predictors were found across the nine pain-location models: sideline occupation was associated with less shoulder pain, and a work duration of more than 30 days per year was a predictor of hip pain. An additional category for missing values was considered in the multinomial regression analysis to see if missingness itself had an effect on the outcome. Overall, the odds ratios for the missing categories were small and far from significant, which suggests that missing values do not bias the overall results.
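The person-mean (conditional) imputation described above can be sketched in a few lines. This is a minimal illustration in Python; the pain scores are invented, not study data.

```python
def impute_person_mean(answers):
    """Replace missing answers (None) by the mean of the person's
    remaining non-missing answers (conditional imputation)."""
    observed = [a for a in answers if a is not None]
    if not observed:
        return answers[:]  # nothing to impute from
    mean = sum(observed) / len(observed)
    return [mean if a is None else a for a in answers]

# Hypothetical pain scores for the nine body regions; two items missing.
answers = [2, 3, None, 1, 4, None, 2, 3, 1]
imputed = impute_person_mean(answers)
overall_pain = sum(imputed)  # overall pain score over all nine locations
print(imputed, overall_pain)
```

The alternative method in the abstract, a "missing" category, instead keeps the item unanswered and treats missingness as an extra outcome level in the multinomial model rather than filling in a value.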
Both methods of handling missing values, the multiple linear regression model with conditional imputation and the multinomial logistic regression using missing categories in categorical outcomes, revealed similar results. The work of a fisherman remains physically demanding, although much less so than previously. Fishermen still have a high prevalence of musculoskeletal pain. Workload is the only consistent predictor of pain. Different forms of missing-value imputation revealed similar results, suggesting that missing values occur unsystematically. Background Social-emotional skills are associated with subsequent school career and school success. It is therefore necessary to promote these skills early in child development [1, 2, 3, 4]. The federal state law for child day care and preschools of Mecklenburg-Western Pomerania (MWP) regulates documentation and observation of children's development [5]. Day-care teachers shall assess developmental motor, linguistic, cognitive, and social risks by conducting the standardized, objective and valid Dortmund Developmental Screening for Preschools (Dortmunder Entwicklungsscreening für den Kindergarten, DESK 3-6) [6, 7, 8]. In the event of deviations from age-appropriate development, the law stipulates funds to offer targeted individualized support to children prone to developmental delays. Mandatory criteria for claiming benefits based on the law are (1) the annual performance of the screening and (2) participation in a scientific evaluation (n = 143 day-care centers (dcc); as of 11.03.2016). The DESK items are age-based and vary across three age groups (3-year-olds, 4-year-olds, and 5- to 6-year-olds). The DESK 3-6 was carried out by previously trained preschool teachers. Data analysis: descriptive statistics of age-adjusted stanine scores stratified by age and sex, and binary logistic regression. Sample size: n = 5,595 children. The results show that 539 children (9.6 %) are at risk in their social development.
A further 348 children (6.2 %) are possibly at risk. The results vary with gender and age: 13.8 % of boys are at risk in their social development, but only 5.5 % of girls. 8.5 % of the 3-year-old children (n = 119), 9.9 % of the 4-year-old children (n = 166) and 10.0 % of the 5/6-year-old children (n = 254) are at risk in their social development. Statistically significant risk factors: 3-year-olds: male gender (OR = 2.58, 95 % CI 1.73-3.86, p < 0.001), irregular dcc attendance (OR = 2.57, 95 % CI 1.06-6.24, p < 0.05), other than German nationality (OR = 7.36, p < 0.001), affected by chronic disease/disability (OR = 5.88, 95 % CI 2.16-16.08, p < 0.01); 4-year-olds: male gender (OR = 2.03, 95 % CI 1.48-2.79, p < 0.001), irregular dcc attendance (OR = 1.85, 95 % CI 1.00-3.42, p < 0.05), affected by chronic disease/disability (OR = 2.98, 95 % CI 1.69-5.23, p < 0.001); 5- to 6-year-olds: male gender (OR = 2.32, 95 % CI 1.76-3.06, p < 0.001), irregular dcc attendance (OR = 1.91, 95 % CI 1.11-3.28, p < 0.05), other than German native language (OR = 1.73, 95 % CI 1.00-2.99, p < 0.05), affected by chronic disease/disability (OR = 4.62, 95 % CI 2.89-7.37, p < 0.001). Conclusion A considerable proportion of children aged 3 to 6 years is affected by social-emotional developmental risks. The analyses show the important influence of dcc attendance on social-emotional skills. The results show the need for early prevention concepts that should be gender-specific and should take account of irregular dcc attendance and chronic disease/disability. These results provide an important contribution to achieving equal opportunities in social-emotional development in the pivotal period prior to school enrolment. Combining medical measurements from diverse sources: experiences from clinical chemistry Bietenbeck A 1 1 Klinikum rechts der Isar TU München, München, Germany In health care, exchange of data is becoming increasingly important.
However, it is still unclear how to interpret measurements of the same parameter from diverse data sources. In clinical chemistry, point-of-care testing (POCT), e.g. for blood glucose, complements the analytical spectrum of central laboratories. Available data from "central laboratory", "professional POCT" and "non-professional" blood glucose measurements can be used to realistically simulate errors in different environments. The ability to classify the disease status or to detect a significant change based on these simulated measurements was assessed in several scenarios. Patients were simulated with realistic "true values" of blood glucose and a "diseased" or "non-diseased" status. To further simulate a change of blood glucose values, a random difference was created and classified as "significant" or "non-significant". The scientific literature was searched for the accuracy and precision of blood glucose measurements in different environments. These values were used to calculate "measured values" from the "true values". In scenarios where only measurements from the same environment were combined, the area under the ROC curve (AUC) was used to express classifier performance. Balanced accuracy was used for scenarios with a combination of measurements from different environments. Situations with and without prior knowledge of the data source were simulated separately. The AUC for the classification of disease status remained around 0.68 in all scenarios. Median AUCs for the detection of a significant change ranged between 0.89 in the central laboratory and 0.76 in the non-professional environment. For the combination of different measurement environments, median balanced accuracies reached 0.63 for the classification of the disease status and differed little between scenarios. Median balanced accuracy to detect a significant change ranged from 0.81 to 0.72.
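The core simulation step described above, deriving "measured values" from "true values" given an environment's accuracy (systematic bias) and precision (imprecision), can be sketched as follows. This is a minimal illustration in Python; the bias and coefficient-of-variation figures are invented placeholders, not the values obtained from the literature search.

```python
import random

def measure(true_value, bias_percent, cv_percent, rng):
    """Turn a 'true' glucose value into a 'measured' value for one
    environment: apply a systematic bias (accuracy) and add Gaussian
    noise proportional to the value (precision, expressed as a
    coefficient of variation)."""
    biased = true_value * (1 + bias_percent / 100)
    sd = biased * cv_percent / 100
    return rng.gauss(biased, sd)

# Hypothetical environments as (bias %, CV %) pairs; placeholders only.
environments = {
    "central laboratory": (0.5, 2.0),
    "professional POCT": (1.0, 4.0),
    "non-professional": (2.0, 7.0),
}

rng = random.Random(42)
true_value = 100.0  # mg/dL
for name, (bias, cv) in environments.items():
    print(f"{name}: {measure(true_value, bias, cv, rng):.1f} mg/dL")
```

Classifying the simulated "diseased"/"non-diseased" status against a decision threshold on such measured values is then what the AUC and balanced-accuracy figures in the abstract summarize.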
When there was no prior knowledge about the additional data source, it could not be used to determine an optimal decision boundary; in these cases, median balanced accuracy was approximately 0.02 lower. These simulations highlight issues that are relevant beyond the combination of blood glucose values. Measurements conducted to help clarify one medical issue are often reused in other contexts. However, not all data sources are equally suitable for all analyses. Therefore, when data from laboratory medicine are exchanged, accompanying information that helps to estimate reliability is critically important. Key topics for guiding design and analysis of high-dimensional data Binder H 1 1 Universitätsmedizin der Johannes-Gutenberg-Universität Mainz, Mainz, Germany Harald Binder, on behalf of the STRATOS High-Dimensional Data Topic Group, University Medical Center Mainz, Germany, Harald.Binder@unimedizin-mainz.de When confronted with projects that comprise high-dimensional data, statisticians face challenges that can to some extent be addressed by established techniques from low-dimensional settings, but increasingly require specifically adapted techniques and novel developments. The STRATOS topic group "High-Dimensional Data" was formed to deal with the multitude of available proposals. Specifically, approaches for design considerations, data pre-processing, exploratory data analysis, data reduction, multiple testing, developing prediction models, dealing with categorical data, comparative effectiveness, and simulation, all in a high-dimensional setting, will be considered. The talk will introduce the work carried out by the group with respect to these areas, and will discuss selected areas, such as exploratory data analysis and data reduction, in more detail.
Health systems are turning to highly distributed, cross-organizational, regional or even international care settings that include many different stakeholders from multiple knowledge and policy domains and integrate human actors, devices, applications and components. Standards, publicly available specifications, and tools for managing the resulting interoperability challenges cannot be developed by one organization alone. Here, the seamless integration of specifications and artifacts from different organizations is crucial and so far insufficiently solved. The workshop, jointly organized by HL7 Germany and HL7 Austria and supported by experts engaged in ISO and CEN, will address the whole standards and product development and deployment lifecycle. It will help overcome the existing problems with standards and specifications from different Standards Development Organizations (SDOs) and guide standards developers, designers, and implementers in improving their processes and outcomes. The principles, methodologies, requirements and solutions proposed in the workshop will be shared with the related SDOs and published in an international journal. Is semantic interoperability enough for future healthcare? Blobel B 1 , Oemig F 2 1 University of Regensburg, Regensburg, Germany; 2 Deutsche Telekom Healthcare and Security Solutions GmbH, Mülheim, Germany For interrelating health information systems, different levels of ICT interoperability are practiced, ranging from structured and syntactic messaging through sharing concepts expressed in common terminologies up to sharing application services. Organizational, methodological and technological paradigm changes enable a personalized, predictive, preventive and participative approach to health and social services, supported by a comprehensive set of domains managed by different scientific disciplines.
Interoperability has to advance beyond ICT concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated. In this paper we report on our findings arising from a qualitative interview study of students' experiences in an undergraduate health informatics program. Our findings suggest that electronic health record competencies need to be integrated into an undergraduate curriculum. Participants suggested that there is a need to educate students about the use of the EHR, followed by best practices around interface design, workflow, implementation and safety, with this work culminating in students spearheading the design of the technology as part of their educational program. Internationally, over the past several years, researchers from around the world have focused on varying aspects of patient safety and risk management to improve health information technology (HIT) safety. Specifically, varying organizational methods have been developed to improve HIT safety and organizational learning. In this panel presentation researchers will review and discuss varying perspectives on methods for addressing patient safety and risk management. Published as textual documents, clinical practice guidelines (CPGs) have not been shown to impact physician practices when disseminated in their original format. However, when computerized and embedded in clinical decision support systems, they appear to be more effective.
In order to ease the translation from textual to computerized CPGs, we have elaborated a fine-grained knowledge model of CPGs (FGKM) to be used when authoring CPGs. The work has been conducted on VIDAL Recos® CPGs. The model was built following a bottom-up iterative process starting with 15 different CPGs. The first version of the FGKM was assessed on two new complex CPGs and enriched by comparison with the Guideline Elements Model (GEM). The final version of the FGKM was tested on the 2014 Hypertension CPGs. We compared the rules automatically derived from FGKM instances to those manually extracted from textual CPGs for decision support. Results showed that difficulties such as text normalization still have to be solved. The FGKM is intended to be used upstream of the CPG authoring process in order to ease the implementation and the update of both textual and computerized CPGs. The analysis of big, heterogeneous and distributed radio-oncological data for research purposes is our long-standing vision. Additionally, we want to support image-based retrospective analyses across multiple patients with data collection and analysis tools that can be linked to automatic analysis processes for specific medical questions. Previously, we successfully introduced a central radio-therapeutic research database integrated into the clinical environment to collect heterogeneous radio-oncological text and imaging data from distributed clinical source systems. Furthermore, we have developed an infrastructure to connect our central research database with data collection and analysis tools for the easy definition of analysis processes. In this work we show the successful realization of a typical analysis problem in a fully automated analysis process. We established a central research database in the context of the European research project ULICE [1]. Both structured text and radio-therapeutic imaging data (RT data) are collected. 
The research database is fully integrated into our clinical information systems and is automatically populated with data from these systems. For the analysis of RT data we introduced an analysis platform [2] with diverse service-oriented analysis tools [3]. To show the successful technical realization of the mentioned objectives, a typical radio-oncological use case was selected as a feasibility project. Radiation oncologists want to know how much dose was applied during primary treatment in the region of recurrence. First, the patient group has to be queried from the research database. Then, the RT data of primary and recurrence irradiation have to be collected and pre-processed for analysis. During analysis, images from the recurrence irradiation have to be registered with the original planning CT from the primary irradiation. The resulting transformation can be applied to the remaining primary RT data (e.g. segmented structures, planning parameters and dose distributions). Thereby, the primary RT data are projected to the coordinates of the recurrence. Then, dose statistics and dose volume histograms (DVHs) can be calculated for each patient, which summarize the three-dimensional dose distribution and represent the irradiation at a glance. Finally, a statistical analysis is performed across all patients and corresponding diagrams are generated. Nine patients were identified for our feasibility project. For these patients the analysis workflow was executed successfully. As a result, an overall dose statistic ( Fig. 1 ) and an overall DVH (Table 1) were generated, summarizing the results of all patients. The results of the analysis workflow represent the data needed for further evaluation by a physician. A first interpretation shows their importance for radiation oncology. The successful realization of our feasibility project shows that an automatic analysis of imaging data across multiple patients is possible. 
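The DVH calculation mentioned in the workflow can be sketched as follows; this is a minimal toy illustration with made-up voxel doses, not the authors' pipeline. A cumulative DVH reports, for each dose level, the fraction of a structure's voxels receiving at least that dose.

```python
# Minimal sketch (assumed, not the authors' implementation): a cumulative
# dose-volume histogram (DVH) gives, for each dose level d, the fraction
# of a segmented structure's voxels receiving at least d (doses in Gy).

def cumulative_dvh(doses, dose_levels):
    """doses: per-voxel dose values inside the segmented structure."""
    n = len(doses)
    return [sum(1 for d in doses if d >= level) / n for level in dose_levels]

# Toy example: four voxels of a structure, evaluated at three dose levels.
voxel_doses = [10.0, 20.0, 30.0, 40.0]
print(cumulative_dvh(voxel_doses, [0.0, 25.0, 35.0]))
# -> [1.0, 0.5, 0.25]
```

Summarizing a three-dimensional dose distribution into such a curve is what lets the per-patient results be pooled into the overall DVH reported above.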
With our approach, secondary use of radio-oncological data is no longer a vision but reality. As a consequence, evidence from retrospective analysis of big, heterogeneous radio-oncological data can be obtained substantially more easily, which will improve radio-oncological research in the future. The Internet is a free space that everyone shares, and a search for quality information there can be like a shot in the dark: it will not reliably hit its target. The Health On the Net Foundation (HON) was born in 1996, in the early days of the World Wide Web, from a collective decision by health specialists, guided by the late Jean-Raoul Scherrer, who anticipated the need for trustworthy online health information. Thus, HON was created to promote the deployment of useful and reliable online health information, and to enable its appropriate and efficient use. Two decades on, HON is the oldest and most valued quality marker for online health information (HONcode). The organization has maintained its reputation through dynamic measures, innovative endeavors and dedication to upholding key values and goals. This paper provides an overview of the HON Foundation and its activities, challenges, and achievements over the years. Regional deprivation is associated with the distribution of vulnerable asylum-seekers in Germany: a nation-wide small-area analysis Bozorgmehr K 1 , Razum O 2 , Szecsenyi J 1 , Maier W 3 , Stock C 1 1 Universität Heidelberg, Heidelberg, Germany; 2 Universität Bielefeld, Bielefeld, Germany; 3 Helmholtz Zentrum München, Neuherberg, Germany There is increasing evidence that contextual characteristics of the place of residence, such as regional deprivation, have an impact on the health status of individuals. Newly arriving asylum-seekers in Germany are relocated to one of the 16 Federal States through a quota system based on tax income and population size. 
After a stay of up to 6 months in reception centres, asylum-seekers are relocated to counties and municipalities within the respective Federal States based on administrative quotas. Asylum-seekers who receive social transfers according to the Asylum-Seekers' Benefits Act (AsylbLG) must usually reside in the assigned county or municipality. Moving to other regions or federal states is only possible under specific circumstances and after formal application to the local authorities. The characteristics of the assigned place of residence could have important implications for asylum-seekers' health. We aimed to assess the association between regional deprivation of the place of residence and the distribution of asylum-seekers who receive social transfers and could be considered vulnerable, such as women, children under 7 years, and elderly asylum-seekers aged 65 and above. We used nationally representative data on all asylum-seekers subject to the German AsylbLG at the end of the year 2013. The rates of observed to expected numbers of asylum-seekers (based on the underlying German county population size) and of vulnerable subgroups were analysed in Bayesian spatial models fit by the integrated nested Laplace approximation (INLA) approach. Regional deprivation as measured by the German Index of Multiple Deprivation (GIMD) was the exposure of interest, comprising seven domains of area deprivation (income, employment, education, municipal revenue, social capital, environment, and security). The analyses were performed at the county level (N = 402 Landkreise/kreisfreie Städte) and adjusted for effects of federal states as well as structured and unstructured spatial effects. Of the 224,993 asylum-seekers, 38.7 % were women, 13.8 % were children aged below seven years, and 19.8 % were aged above 65 years. 
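The observed-to-expected rate ratios underlying this analysis can be illustrated with a simple unadjusted sketch; the study itself fits Bayesian spatial models via INLA, so the following is only the basic quantity with a large-sample 95 % confidence interval on the log scale, using made-up counts.

```python
import math

# Illustrative only: unadjusted rate ratio of observed/expected counts in
# deprived vs. least deprived counties, with a Wald-type 95 % CI on the
# log scale (Poisson approximation). All counts below are invented.

def rate_ratio(obs_exposed, exp_exposed, obs_ref, exp_ref):
    rr = (obs_exposed / exp_exposed) / (obs_ref / exp_ref)
    se = math.sqrt(1 / obs_exposed + 1 / obs_ref)  # SE of log(rr)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Toy counts: 120 observed vs 100 expected asylum-seekers in deprived
# counties, 90 observed vs 100 expected in the least deprived counties.
rr, lo, hi = rate_ratio(120, 100, 90, 100)
print(round(rr, 2), round(lo, 2), round(hi, 2))
# -> 1.33 1.01 1.75
```

A ratio above 1 with a confidence interval excluding 1, as in the adjusted results reported below, indicates more asylum-seekers than expected from population size alone.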
The total numbers of asylum-seekers and of vulnerable asylum-seekers were consistently and significantly higher in counties with medium (Q3) and highest deprivation (Q4, Q5) relative to counties in the lowest deprivation quintile (Q1). The adjusted rate ratios for counties with highest deprivation (Q5) relative to those with lowest deprivation (Q1) were 1.21 (1.03-1.41) for total numbers, 1.26 (1.03-1.53) for women, 1.28 (1.04-1.58) for children aged under seven years, and 1.50 (1.08-2.08) for elderly asylum-seekers aged 65 years and above. No significant association was found comparing counties in the second (Q2) with those in the lowest deprivation quintile (Q1). The number of asylum-seekers and of those considered to be vulnerable was higher in counties with medium and highest deprivation compared to counties with lowest deprivation. The disproportionate distribution was highest for elderly asylum-seekers and children under seven years. Vulnerable subgroups tend to be exposed to more deprived places of residence, which may further increase health risks and health care needs. Health care planners need to be aware of this association to develop targeted health care programs or measures for prevention and health promotion. GeoHealth analytics to support proactive decision making in prevention of highly resistant microorganism outbreaks (HRMO) Braakman-Jansen A 1 , Siemons L 1 , Stein A 2 , Al Naiemi N 3,4 , van Gemert-Pijnen L 1 The number of disease outbreaks within Dutch hospital settings is increasing (from a frequency of 2.2 per month in 2012 to 4.7 per month in 2014), and 78 % of these outbreaks were due to highly resistant microorganisms (HRMO). HRMO can cause life-threatening infections in immunocompetent and immunocompromised patients, resulting in a significantly longer length of stay, higher mortality rates, and considerably higher healthcare costs [1]. 
As such, prevention of HRMO transmission and hospital-acquired infections is of utmost importance to improve patient outcomes and reduce healthcare costs. However, although current automated surveillance tools have been shown to be sensitive, specific and efficient in preventing the spread of HRMO and reducing healthcare-associated costs, these tools lack a systematic approach to detect changes in trends or sentinel events that require acute interventions. Also, surveillance is currently not used to predict antimicrobial resistance problems and upcoming disease outbreaks, due to the lack of advanced analytics. The medical microbiology laboratory LabMicTA and the Integrated Disease Management Corporation (IDMC) have developed a real-time automated surveillance system to monitor all pathogens, incorporating a historical dataset starting in 2010. In addition, antibiotic resistance and trends over time are displayed. All microorganisms detected in various patient materials are recorded, including medical data and the geolocation of the hospital and patient [2, 3]. The goal of this project is to develop a real-time predictive model to enrich the current system. The CeHRes roadmap for eHealth design will be used [4]. First, the needs and barriers of relevant stakeholders (doctors, microbiologists) with respect to clinical decision making will be explored in scenario-driven focus groups. Data will be translated into Geographic Information Systems requirements and mind maps for the predictive model. Then, the current surveillance system will be enriched with available geospatial time-series data of patients' movements within and between hospitals. Inferential modelling will be applied, and finally a predictive decision model will be developed and incorporated into an automated real-time surveillance system. The reliability and validity of this model will be evaluated. Next, this project aims to develop a data dashboard system for professionals, e.g. 
doctors, infection control practitioners and microbiologists, who have to make decisions based on the health data (study 2). The persuasiveness and usability will be tested via in-depth interviews and scenario-based tests with end users. Finally, the design principles applicable to such information-rich, expertise-requiring systems are examined, as is the trust of health care professionals in the visualised information. No results are available yet, as the project has just started. First results are expected in August 2016. Predictive models using complex datasets and advanced analytics will renew healthcare through the ability to better measure, aggregate and make sense of existing surveillance data, making personalized healthcare possible. Real-time decision making using geodata and clinical data is important to ensure and enable a sustainable, safe society and public health (UN Sustainable Development Goals). Digital early warning systems are highly needed to ensure a safe and healthy society and to prevent and control (re)emerging HRMO infections. 1 Universität Ulm, Institute of Epidemiology and Medical Biometry, Ulm, Germany; 2 Ulm University, Department of Dermatology and Allergic Diseases, Ulm, Germany; 3 Technische Universität Dresden, Dresden, Germany; 4 Universität Ulm, Ulm, Germany Background Evidence linking maternal psychosocial stress during pregnancy to subsequent child atopic eczema (AE) is growing, but the operationalization of AE is diverse. Objective To analyze the association between maternal prenatal stress or mood disorders and child AE, allowing for potential misclassification of AE diagnoses and symptoms. In the Ulm SPATZ Health Study, chronic stress and symptoms of mood disorders were assessed by standardized self-reported questionnaires in 934 singleton mothers following delivery in Ulm, Germany, from 04/2012 to 05/2013. 
Maternal hair cortisol concentrations (HCC, n = 626) at childbirth and the cumulative incidences of parent-reported child AE symptoms and of parent- and pediatrician-reported AE diagnoses were assessed until age 2 years (n = 787). At least one dermatological examination was performed in n = 167 children showing AE symptoms. Adjusted odds ratios (aOR) with 95 % confidence intervals were modelled by logistic regression analyses. Maternal stress and anxiety were associated with child AE symptoms (aOR for the highest vs. the lowest quartile of stress: 1.78 (1.02;3.10); for possible anxiety symptoms vs. no symptoms: 1.70 (1.06;2.72)). No relationship was found between stress or mood disorders and AE diagnoses, nor could we show a consistent association between maternal HCC and child AE. The associations of maternal prenatal stress and mood were similar or stronger in families with unaffected older siblings compared to all families with older siblings. Stress measurements or symptoms of mood disorders are related to AE symptoms but not consistently to AE diagnoses. The role of older siblings affected by AE should be further elucidated. The patient portal of the personal cross-enterprise electronic health record (PEHR) in the Rhine-Neckar-Region Over the last years we have stepwise implemented our vision of a personal cross-enterprise electronic health record (PEHR) in the Rhine-Neckar-Region in Germany. The patient portal is one part of the PEHR architecture with IHE connectivity. Patients can access and manage their medical records via the patient portal. Moreover, they can give consent regarding which healthcare providers are allowed to send data into or read data from their records. Forthcoming studies will provide evidence on improvements and further requirements. MI-Lab: a laboratory environment for medical informatics students Medical research and health care highly depend on the use of information technology. 
There is a wide range of application systems (patient administration system, laboratory information system, communication server, etc.) and heterogeneous data types (administrative data, clinical data, laboratory data, image data, genomic data, etc.). Students and researchers often do not have the opportunity to use the productive application systems of, e.g., hospitals or medical practices to gain practical experience or to examine new components and technologies. Therefore, the aim of this project is to develop a dedicated laboratory environment for patient health care and clinical research. Essential application systems were identified and a suitable architecture was designed for this purpose. It is accompanied by a teaching plan that considers learning modules for bachelor and master degrees in medical informatics. We implemented the laboratory environment, called MI-Lab, with multiple free and open-source software components. All components are installed on virtual machines and/or in Docker containers. This modular architecture creates a flexible system which can be deployed in various scenarios. The preliminary evaluation results suggest that laboratory environments like MI-Lab work well in teaching practical aspects of medical informatics and are widely accepted by students. Heterogeneous tumor documentation and different interpretations of terms lead to problems in analyses of clinical and epidemiological cancer registries. The objective was to design and implement a national wiki of oncological terms. To this end, terms from existing handbooks and documentation sources are analyzed, combined and summarized by medical experts of different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototype of a tumor wiki was designed and implemented. 
Target user groups are documentation officers as well as physicians and patients. Next steps are the integration of a term-suggestion process and the linkage to other information sources such as PubMed. Estimation of the incidence of diagnosed diabetes mellitus from consecutive waves of telephone health surveys in Germany: is trend detection feasible? Brinks R 1 , Röckl S 2 , Du Y 2 , Scheidt-Nave C 2 , Heidemann C 2 1 Deutsches Diabetes-Zentrum, Düsseldorf, Germany; 2 Robert Koch-Institut, Berlin, Germany There are only a few population-based data about the age- and sex-specific incidence of diabetes in Germany. Detecting temporal changes in the incidence of diabetes is important for unveiling changes in the risk profile of the considered population. The age-specific incidence of diagnosed diabetes can be calculated from cross-sectional studies if the age-specific relative mortality (R) of persons with diabetes compared to persons without diabetes is known [1]. We used three cross-sections representative of the population in Germany based on different waves of the telephone health surveys [1]. In view of this model, the question arises of how the quality of case-finding, i.e., the combined efforts to identify persons who contracted the disease, can be assessed. This question plays a role in many diseases with an asymptomatic preclinical phase, such as diabetes or cardiovascular disease. We use data from a Danish diabetes register [2] and simulate two scenarios of different case-finding performance. In one scenario, case-finding worsens over time, while in the other scenario case-finding improves. We apply several epidemiological indices to assess case-finding in both scenarios and compare the results. One of the prevalence-based indices leads to wrong conclusions. Some indices are partly insensitive for distinguishing the quality of case-finding. 
The incidence-based indices perform well. Prevalence-based indices for assessing case-finding should be interpreted with caution. If possible, incidence-based indices should be preferred. 1 Deutsches Diabetes-Zentrum, Düsseldorf, Germany; 2 Robert Koch-Institut, Berlin, Germany In Germany, data about the association between diabetes and limitations in daily activities are scarce. Consecutive waves of the GEDA cross-sectional national health interview surveys for adults in Germany aged 18 years and older were conducted in 2009, 2010 and 2012 using computer-assisted telephone interviewing. We used self-reported physician-diagnosed diabetes and self-reported limitations in daily activities (Global Activity Limitation Indicator, GALI) to calculate the age- and sex-specific lifetime spent with limitations for persons with and without diabetes. To get more reliable estimates of the age- and sex-specific prevalence of limitations, we pooled the prevalence data of all three cross-sections and referred our estimates to the year 2010. Mortality was taken into account from the mortality follow-up of the German National Health Interview and Examination Survey 1998 (GNHIES98). Pooled survey samples were used for the analysis. Restricting the age range to 30-84 years in order to ensure a sufficiently high number of observations across age strata, a total of 51,455 persons were included in this analysis. Estimates were referred to the German population in 2010. Confidence intervals were estimated by a bootstrap method with 2000 replicates. The age- and sex-specific fraction of lifetime spent with limitations of persons with (N = 4,545) and without diabetes (N = 46,910) in 2010 is shown in Fig. 1. For example, 52 and 32 % of the total lifetime of men aged 60 years with and without diabetes, respectively, were spent with limitations (left part of Fig. 1). The ratio (diabetes versus non-diabetes) for men ranged from 3.4 at age 30 to 1.1 at age 84. 
The corresponding ratio for women ranged from 1.8 at age 30 to 1.3 at age 84. In 2010, persons with diabetes spent a substantially higher percentage of their lifetime with limitations in their daily activities than persons without diabetes. Visualization tools represent a key element in triggering human creativity while being supported by the analytical power of the machine. Selecting the right visualization tool can be seen as purely a problem of resource usage; however, other factors such as familiarity, user experience and the data domain to be displayed also play an important role. This paper analyzes free network visualization tools for bioinformatics, frames them in domain-specific requirements and compares them. Performance of the Peto odds ratio with rare event data: a simulation study Brockhaus AC 1 , Grouven U 1 , Bender R 2 1 Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen (IQWiG), Köln, Germany; 2 Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen (IQWiG), Köln, Germany For the calculation of relative measures such as the risk ratio (RR) and the odds ratio (OR) in a single study, additional approaches are required in the case of zero events. In the case of zero events in one treatment arm, the Peto odds ratio (POR) can be calculated without continuity correction and is currently the relative effect estimation method of choice for binary data with rare events. The aim of this simulation study is the comparison of the estimators for the OR and the POR with the true OR in various situations with regard to several performance measures not yet considered. We provided a broader picture of the estimators for the two effect measures POR and OR in the situation of one study with two parallel groups without confounders by investigating the coverage, the width of the confidence interval (CI), the mean squared error (MSE), and the mean percentage error (MPE). 
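The standard one-step Peto odds ratio for a single 2x2 table can be sketched as follows; the counts are a toy example chosen to show that the estimate remains finite even with zero events in one arm, which is the property the abstract highlights.

```python
import math

def peto_odds_ratio(a, n1, c, n0):
    """One-step Peto OR for a 2x2 table: a events out of n1 in group 1,
    c events out of n0 in group 0. Needs no continuity correction even
    when one arm has zero events."""
    N = n1 + n0
    m1 = a + c                  # total events
    m0 = N - m1                 # total non-events
    O = a                       # observed events in group 1
    E = n1 * m1 / N             # expected events under the null
    V = n1 * n0 * m1 * m0 / (N ** 2 * (N - 1))  # hypergeometric variance
    return math.exp((O - E) / V)

# Toy example with zero events in the control arm: 3/100 vs 0/100.
print(round(peto_odds_ratio(3, 100, 0, 100), 2))
# -> 7.54
```

Note that an ordinary OR for this table would be infinite without a continuity correction, which is exactly why the POR is attractive for rare-event data.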
In the situation of rare events, small treatment effects and similar group sizes, we demonstrated that the estimator for the POR performed better than the estimator for the OR only with regard to coverage and MPE, but not with regard to the width of the CI and the MSE. For larger effects and unbalanced group size ratios, the coverage and MPE of the estimator for the POR were inappropriate. The estimator for the POR does not outperform the estimator for the OR across all considered performance measures. The performance of the estimator for the POR depends on the true and unknown effect size. Relative survival from melanoma stratified by tumor stage and histological subtype Brunssen A 1 , Jansen L 2 , Eisemann N 1 , Waldmann A 1 , Kraywinkel K 3 , Eberle A 4 , Holleczek B 5 , Zeißig S 6 , Brenner H 2,7,8 , Katalinic A 1,9 1 Institute for Cancer Epidemiology, University of Luebeck, Luebeck, Germany Background Previous research has shown that relative survival from melanoma differs between age groups, histological subtypes, tumor stages, and body sites. Superficial spreading melanoma is associated with a very good prognosis. In contrast, nodular melanoma has a much lower 5-year relative survival. It has also been observed that early-stage melanoma with lower tumor thickness is associated with a better prognosis than thick melanoma. Survival differences between the histological subtypes have been discussed as being mainly caused by tumor stage, but so far the relative survival of melanoma patients in Germany stratified by both T-stage and histological subtype is unknown. Objective To investigate relative survival from melanoma by T-stage for each histological subtype, stratified by sex. Methods Data from 12 German cancer registries are analyzed, covering a population of 28 million inhabitants. Patients with a diagnosis of primary melanoma (ICD-10: C43) in 1997-2012 are included, whereas patients younger than 15 years and death certificate only (DCO) cases are excluded. 
Patients with multiple melanomas of different histologic subtypes contribute to all respective subgroups. Stage-specific 5- and 10-year relative survival are estimated by histologic subtype (superficial spreading melanoma, nodular melanoma, amelanotic melanoma, lentigo maligna melanoma, acral lentiginous melanoma, and other melanoma) for women and men. Stage is operationalized as TNM stage [T-classification (T1, T2, T3, T4, Tx), nodal involvement (N0, N+, NX) and metastasis (M0, M1, MX)]. Survival estimates are based on period analysis. All survival estimates are age-standardized according to the International Cancer Survival Standards. If possible (depending on the number of cases), the effect of tumor stage and histological subtype on relative survival is assessed by model-based period analysis, adjusting for age group. For each histological subtype, stage-specific incidence is determined. The analyses are ongoing and results will be presented at the conference. Terminology-based recording of clinical data for multiple purposes within oncology Brønnum D 1 , Højen AR 2 , Gøeg KR 2 , Elberg P 2 1 Aalborg University Hospital, Aalborg, Denmark; 2 Aalborg University, Aalborg, Denmark Collecting clinical data once for use in both the electronic health record (EHR) and registries requires semantic interoperability. This paper presents the results of a systematic semantic analysis of similarities and differences in clinical documentation across a regional EHR and a national oncology registry to assess options for an integration of recording templates. Methods A comparison of current clinical information in the EHR and the national registry was carried out, using SNOMED CT as the frame of reference to find exact, similar and non-matches. Exact matches were found for 9 out of 19 terms from the registry and EHR in relation to clinical history, observations and findings at the examination and tumor control. 
Similar matches concerned clinical findings of more common side effects of therapy, whether present or absent. Both the EHR and the registry had information with no corresponding match. Clinical documentation during follow-up in head and neck cancer contains a core set of items recorded in both the EHR and the registry, representing clinical history, observations, more common side effects and tumor evaluation. These core items could be the point of departure for the integration or re-design of EHR systems. Effectiveness of courses in evidence-based medicine for year 3 undergraduates: results of an evaluation Buchberger B 1 , Mattivi JT 2 , Schwenke C 3 , Katzer C 1 , Huppertz H 1 , Wasem J 1 1 Universität Duisburg-Essen, Essen, Germany; 2 Universität Duisburg-Essen, Lehrstuhl für Medizinmanagement, Essen, Germany; 3 SCO:SSiS Statistical Consulting, Berlin, Germany An essential aim of courses in evidence-based medicine (EbM) is to improve the skills for reading and interpreting medical literature adequately. Regarding the conceptual framework, it is important to consider different educational levels. We conducted four short EbM courses of 90 min each for medical students and health care management students, focused on critical appraisal of the literature. At the end, the students assessed five publications about randomized controlled trials using five different instruments; the results were compared to expert assessments. In total, 169 undergraduates participated in our EbM courses. The students' assessments were affected by arbitrariness, without a positive or negative tendency. 16 % of the medical students stated that their command of English was weak or low. The instrument used was found to have an impact on the agreement rates between expert and student assessments (p = 0.0158). Three studies showed an influence of the instrument on the agreement rate (p < 0.05 each). Our results contrast sharply with those of many other comparable evaluations. 
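Agreement between expert and student ratings, as reported above, can be quantified beyond raw percent agreement; one common chance-corrected measure (not necessarily the one used in this study) is Cohen's kappa, sketched here with invented ratings.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two paired rating sequences."""
    n = len(rater_a)
    p_obs = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Expected agreement if both raters assigned categories independently.
    p_chance = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)

# Invented ratings of 10 papers ('low'/'high' risk of bias), expert vs student.
expert  = ["low", "low", "high", "high", "low", "high", "low", "low", "high", "low"]
student = ["low", "high", "high", "high", "low", "low", "low", "low", "high", "high"]
print(round(cohens_kappa(expert, student), 2))
# -> 0.4
```

Here raw agreement is 70 %, but kappa is only 0.4 because half of that agreement would be expected by chance; this is why chance-corrected measures are preferred when comparing raters.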
Reasons may be a lack of student motivation due to the compulsory nature of the courses, and the comparison to a reference standard in addition to self-ratings, which ensured objectivity. Undergraduates should become familiar with the principles of EbM, the research methods involved, and the reading of scientific papers as soon as possible. For a deeper understanding, clinical experience seems to be an indispensable precondition. Based on our results, we would recommend an integration of lectures about EbM and critical appraisal into medical curricula at least twice, and with greater intensity shortly before graduation. The specification and customization of clinical document types are tasks that require close collaboration between domain experts and IT specialists. However, these collaborators are often faced with difficulties due to different interpretations of domain knowledge. Therefore, an adequate method for sharing domain knowledge about documents is necessary. Even though there are some tools that help to define medical document types, there is a lack of approaches that focus on the understandability of the specification for the domain experts. This paper proposes a modeling approach based on the Clinical Document Architecture to address this gap. Cardiovascular diseases (CVD) are the leading cause of death and responsible for a large part of total healthcare expenditure in Germany and other high-income countries. Information on regional variation in the prevalence of CVD is important for the planning of targeted prevention activities and the allocation of healthcare resources. We examined differences in the prevalence of major CVD across the 16 federal states in Germany and how these compare to variations in CVD mortality across the federal states. We analysed pooled data from the 2009, 2010 and 2012 waves of the nation-wide population-based study German Health Update (GEDA), with a total of 62,606 participants aged ≥18 years. 
Data on self-reported physician-diagnosed CVD and sociodemographic characteristics were collected by standardized computer-assisted telephone interviews. We examined differences between the 16 German federal states in the lifetime prevalence of major CVD, which was defined as self-reported physician-diagnosed myocardial infarction, chronic coronary heart disease, heart failure or stroke. The ranking of states according to the prevalence of major CVD was compared to the ranking according to mortality rates from major CVD based on the official cause-of-death statistics of the German Federal Statistical Office. The influence of age, sex and social status on regional differences in major CVD prevalence was analysed by multiple logistic regression analyses. The lifetime prevalence of major CVD varied significantly between German federal states, ranging from 10.0 % in Baden-Württemberg and 10.2 % in Hamburg to 13.7 % in Rheinland-Pfalz and 15.8 % in Sachsen-Anhalt. These differences did not materially change in age-standardised analyses or after adjustment for age, sex and social status. The ranking of states according to major CVD prevalence differed substantially from the ranking according to major CVD mortality for some states. Sachsen-Anhalt had the highest rankings in both prevalence and mortality, while Baden-Württemberg was among the lowest-ranking states in both prevalence and mortality. There are notable differences between German federal states in the prevalence of major CVD which can only partly be explained by differences between states in age, sex and social status. Cancer survivors in the German adult population: prevalence estimates for major cancer sites in 2012 Buttmann-Schweiger N 1 , Dahm S 1 , Kraywinkel K 1 1 Robert Koch-Institut, Berlin, Germany An ageing population, the implementation of early detection and better treatment options increase the population of individuals living with cancer, and cancer increasingly gains public health relevance. 
Prevalence is regarded as a measure to support quantification of the resulting growing demand for healthcare services and associated financial resources. Different approaches to estimating cancer prevalence may be used, depending on data availability and purpose. We provide prevalence estimates for Germany from long-term cancer registry data. For the first time, nationwide estimates of >10-year prevalence and estimated complete prevalence are in progress. We use information on cancer incidence and survival from all reporting registries to obtain the first malignant cancer by site in the respective population alive in 2012 after a diagnosis of cancer made within the preceding 1, 5 or 10 years. Prevalence estimates of up to 20 years are derived from the longest-reporting cancer registries of Saarland, Hamburg and Muenster. ICD-10 codes and ICD-O-3 morphology codes are used to select site and histologic types. Appropriate correction methods for cases known from death certificates only, and for completeness, are to be discussed. Indirect statistical modelling of complete prevalence estimates from limited-duration prevalence is applied. As of 31 December 2012, more than 1.1 million women and 1.2 million men were living with anogenital, colorectal and prostate cancer, malignant melanoma, lung cancer, gastric and bowel cancer, or oropharyngeal cancer diagnosed in the past 10 years, of whom 665,000 women and 706,000 men were living with a diagnosis made within the past 5 years. Up-to-20-year prevalence estimates will be provided by the time of the conference. Although in most federal states cancer registration was first established around the year 2000, the number of long-term cancer survivors can be assessed indirectly. The existing data are sufficient for continuous monitoring of long-term cancer prevalence indicators.
Prospective relevance of early life factors for the intima-media thickness of the common carotid artery in younger adulthood: insights from linear and quantile regression analyses Buyken A 1 , Bolzenius K 1 , Penczynski K 1 1 DONALD Study, IEL-Nutritional Epidemiology, University of Bonn, Dortmund, Germany Perinatal factors may program offspring cardiovascular risk in later life. This study examined whether early life factors are related to adult carotid intima-media thickness (IMT), a surrogate marker of early atherosclerosis predictive of future cardiovascular events. We first performed linear regression analyses to identify relevant early life predictors of mean adult IMT. Second, we used quantile regression analyses to assess the relevance of the identified predictors for the upper parts of the distribution, i.e. the values most likely associated with higher future cardiovascular risk. We studied term-born participants (121 girls, 99 boys) of the DONALD Study with a bilateral sonographic measurement of the IMT in young adulthood (18-40 years; IMT averaged from 8-16 measurement points) and data on early life factors: gestational duration, gestational weight gain, parental age at birth, first-born status, birth weight (including appropriateness for gestational age), and full breastfeeding. Information on gestation and birth was taken from the maternity passport; full breastfeeding was prospectively assessed at routine visits (ages 0.25, 0.5, 0.75, 1, 1.5 and 2 years). Multivariable linear regression models also considered age at IMT measurement, the clinician performing the measurement, birth year, smokers in the household, maternal overweight and maternal education as potential confounders. Considering the same covariates, quantile regression models were derived for percentiles P10, P25, P50, P75 and P90 and specifically analyzed for associations at P75 and P90 of the adult IMT distribution.
Mean adult IMT was 566 μm (SD 60 μm) among males and 554 μm (SD 54 μm) among females. Linear regression analyses identified maternal age at birth as a relevant predictor of mean adult IMT among females: offspring IMT was higher among females whose mothers had been older at birth (β = 37.5 (SE 14.9) μm/decade, p = 0.01). No early life predictors were identified for mean adult IMT among males. In quantile regression analyses, maternal age at birth was not of specific relevance solely for the upper IMT quantiles in females (p = 0.006 at P75, p = 0.7 at P90); instead, values across the entire IMT distribution were higher among females whose mothers had been older at birth. Among males, older maternal age at birth also emerged as a potential predictor of adult IMT in quantile regression analyses; however, older maternal age at birth was related to lower IMT levels, and across the distribution this association reached significance at P75 only (p = 0.03). This study suggests that advanced maternal age may have adverse overall consequences for female offspring IMT. However, according to quantile regression, older maternal age at birth is of universal relevance and not specifically directed at higher female offspring IMT levels. Of note, the quantile regression analyses point towards a sex difference in the association of maternal age at birth with adult IMT. The recent developments in the area of mobile computing have fueled the debate on potential applications for independent living, well-being and medical interventions, both in academia and in practice. A considerable amount of research and development of IT-based assistance systems is focused on clinical applications. However, assistance systems that can be used in everyday life by a broad user group of people with an increased need for care are rare. In this paper we present the vision of iCare, a research program to support safe and independent living of people with special needs.
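For readers less familiar with quantile regression, the underlying idea can be sketched in a few lines: the τ-th sample quantile is the value minimising the asymmetric "pinball" (check) loss, and quantile regression generalises this by letting the minimiser depend on covariates. A minimal stand-alone illustration (the IMT-like values are invented for this sketch, not DONALD Study data):

```python
def pinball_loss(q, values, tau):
    """Asymmetric check (pinball) loss of candidate q for the tau-quantile."""
    return sum(tau * (v - q) if v >= q else (tau - 1) * (v - q) for v in values)

def fit_quantile(values, tau):
    # The empirical tau-quantile minimises the pinball loss; quantile
    # regression generalises this by letting the minimiser depend on
    # covariates such as maternal age at birth.
    return min(sorted(values), key=lambda q: pinball_loss(q, values, tau))

# Invented IMT-like values in micrometres (for illustration only)
imt = [540, 548, 552, 556, 560, 566, 575, 590, 610, 640]
p50 = fit_quantile(imt, 0.50)   # the median (P50)
p90 = fit_quantile(imt, 0.90)   # the upper tail (P90), as analysed above
```

Because the loss is asymmetric, fitting at P90 weights under-prediction nine times as heavily as over-prediction, which is why the estimate lands in the upper tail of the distribution.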
In addition, we introduce DeSearch, a privacy-aware sensor-based network to locate missing people, as a first result of iCare. Automated import of clinical data from HL7 messages into OpenClinica and tranSMART using Mirth Connect Camacho Rodríguez JC 1 , Löbe M 1 , Stäubert S 1 1 Institute for Medical Informatics, Statistics and Epidemiology (IMISE), Leipzig, Germany Electronic data capture (EDC) tools are designed to simplify data acquisition, improve data quality and manage clinical data electronically. Some data are collected from the laboratory information management system (LIMS), which is an important data source for a study. OpenClinica is an open-source clinical data management system (CDMS) for web-based electronic data capture that is widely used in academic clinical research. tranSMART is also an open-source web-based platform, used for the management and analysis of different data types common in clinical and translational research. Many LIMS use the Health Level 7 standard, Version 2.x (HL7 v2), as a message exchange protocol. In this paper, we implement Mirth Connect as a communication server to convert these HL7 messages either to Operational Data Model (ODM) data for automatic import into OpenClinica, or to tab-delimited text files whose data are uploaded into tranSMART using the tMDataLoader tool. A national medical information system for Senegal: architecture and services Camara G 1 , Diallo AH 2 , Lo M 2 , Tendeng JN 2 , Lo S 2 1 University Alioune Diop of Bambey, Bambey, Senegal; 2 University Gaston Berger, Saint-Louis, Senegal In Senegal, great amounts of data are generated daily by medical activities such as consultations, hospitalizations, blood tests, X-rays, births, deaths, etc. These data are still recorded in registers, printed images, audio and video recordings, which are processed manually.
However, some medical organizations have their own software for non-standardized management of patient records, appointments, wages, etc., without any possibility of sharing these data or communicating with other medical structures. This leads to many limitations in reusing or sharing these data because of their possible structural and semantic heterogeneity. To overcome these problems we have proposed a National Medical Information System for Senegal (SIMENS). As an integrated platform, SIMENS provides an EHR system that supports healthcare activities, a mobile version and a web portal. The SIMENS architecture also proposes data and application integration services to support interoperability and decision making. Sedentary behavior recognition using accelerometer and symbolic data Ceron J 1 , Lopez D 1 , Ramirez GA 1 , Sierra-Torres H 1 , Alvarez-Rosero R 1 1 Universidad del Cauca, Popayan, Colombia Sedentary behavior has been associated with the development of noncommunicable diseases (NCD) such as cardiovascular diseases (CVD), type 2 diabetes, and cancer. Accelerometers and inclinometers have been used to estimate sedentary behaviors; however, a major limitation is that these devices do not provide contextual information such as the activity performed, e.g., TV viewing, sitting at work, driving, etc. Objective: To propose and evaluate the precision of a system for objectively measuring sedentary behavior and identifying the context (location) in which it occurs. Results: The system is implemented as an Android mobile app, which identifies an individual's sedentary behaviors and location based on accelerometry data obtained from a smartwatch and symbolic location data obtained from Bluetooth beacons. The system infers sedentary behaviors by means of a supervised machine learning classifier. The precision in the classification of the six studied sedentary behaviors exceeded 90 %, with the Random Forest algorithm being the most precise.
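As a rough illustration of the bagged-tree idea behind a Random Forest classifier, the following self-contained sketch bags depth-one trees (decision stumps) over bootstrap resamples and takes a majority vote. It is a simplified stand-in, not the authors' implementation (which would typically use a library such as scikit-learn), and the two-feature accelerometer summaries are synthetic:

```python
import random
from collections import Counter

def best_stump(rows):
    """Exhaustively pick the (feature, threshold) depth-one split that
    minimises misclassification; rows are (features, label) pairs."""
    best = None
    for f in range(len(rows[0][0])):
        for feats, _ in rows:
            thr = feats[f]
            left = [lab for x, lab in rows if x[f] <= thr]
            right = [lab for x, lab in rows if x[f] > thr]
            err = sum(len(side) - Counter(side).most_common(1)[0][1]
                      for side in (left, right) if side)
            if best is None or err < best[0]:
                l_maj = Counter(left).most_common(1)[0][0]
                r_maj = Counter(right).most_common(1)[0][0] if right else l_maj
                best = (err, f, thr, l_maj, r_maj)
    return best[1:]

def fit_forest(rows, n_trees=25, seed=1):
    """Bagging: each stump is trained on a bootstrap resample."""
    rng = random.Random(seed)
    return [best_stump([rng.choice(rows) for _ in rows])
            for _ in range(n_trees)]

def predict(forest, x):
    """Majority vote over the ensemble."""
    votes = [l if x[f] <= thr else r for f, thr, l, r in forest]
    return Counter(votes).most_common(1)[0][0]

# Synthetic (mean |acceleration|, variance) summaries per time window
train = [((0.02, 0.001), "sitting"), ((0.03, 0.002), "sitting"),
         ((0.04, 0.002), "sitting"), ((0.80, 0.25), "walking"),
         ((0.90, 0.30), "walking"), ((1.10, 0.40), "walking")]
forest = fit_forest(train)
```

A real Random Forest additionally subsamples features per split and grows deeper trees, but the bootstrap-plus-vote structure is the same.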
Conclusion: The proposed system allows the recognition of specific sedentary behaviors and their location with very high precision.
E. Grill et al.
A systematic review of studies on the knowledge, attitude, and perception of doctors, nurses, pharmacists, and medical students towards antibiotic use and resistance in developing countries Background Irrational use of antibiotics is a leading cause of antibiotic resistance, especially in developing countries where patient management is based mainly on prescription of medicines owing to deficiencies in diagnostic procedures. The objective was to conduct a systematic review of studies on the knowledge, attitude, and perception of doctors, nurses, pharmacists, and final-year medical students towards antibiotic use and resistance in Africa, Asia, and South America from 2010 to 2015, and to summarize the findings, which will be useful epidemiologically and for future interventions and decisions regarding antibiotic prescribing. PubMed, Scopus and Google Scholar were used to conduct a systematic search for studies on the topic. We included papers that focused on health practitioners' knowledge of antibiotic use, local resistance, and the extent of the problem of antibiotic resistance, as well as the health practitioners' confidence in antibiotic prescribing, the guides commonly used when prescribing antibiotics, and recommendations to improve antibiotic prescribing. Studies that assessed other indicators were excluded. The systematic search resulted in a total of 961 papers, with 900 papers left after removing duplicates. Upon reviewing the titles, 33 papers were identified as relevant. After further reviewing the abstracts, 16 papers were excluded as unrelated to the topic. Seventeen full-text papers were then reviewed, 7 of which were excluded because different indicators were used for assessment.
A total of 10 articles fulfilled our criteria and were therefore included (4 from Asia, 3 from Africa, 2 from South America and 1 from the Middle East). All the studies were cross-sectional and used questionnaires as the survey tool. A good level of knowledge of antibiotic use but a low level of knowledge of local antibiotic resistance was reported in most of the studies. Confidence in antibiotic prescribing was reportedly high in most of the studies (ranging from 44.8 to 88.6 % of all the participants). There was wide variability in how the participants viewed the extent of the problem of antibiotic resistance; most participants agreed that it is a problem nationally and globally, but to a lesser extent in their own practice. Internet sources, university courses, antibiotic guidelines, antimicrobial prescribing guidelines, and support from the local infectious diseases service were the most commonly used guides for antibiotic prescribing (ranging from 45.7 to 75.5 % of the participants). There is a low level of knowledge of local antibiotic resistance and wide variation in how the participants recognized the extent of the problem of antibiotic resistance, despite reports of good knowledge of antibiotic use and a high level of confidence in antibiotic prescribing. Obesity is a modifiable risk factor for many chronic diseases affecting both high- and low-income countries. Obesity and overweight-related diseases cause about 2.8 million deaths per year worldwide [1]. There are gaps in current knowledge of the epidemiology of obesity in low-income countries. This is an abstract of the study protocol and the results of a pilot study to investigate the epidemiology of adult obesity in Enugu state in south-east Nigeria. The state has a total population of 3.3 million people [2]. Household surveys involve 1674 households and 6665 adults aged 20 to 60 years. All eligible adults in each household are to be interviewed and measured.
Cluster randomized sampling was used (31 randomly selected clusters with an expected number of 215 persons per cluster). Census enumeration areas (EAs) with well-defined boundaries were used as clusters. The EAs were pre-defined by the government for the purposes of census enumeration. The sampling method was designed to obtain a representative sample of the adult population of Enugu state. Anthropometric data (weight, height, abdominal circumference, biceps skinfold thickness and triceps skinfold thickness) were taken. The NHANES anthropometric training manual was used to train data collectors. Oral interviewer-administered questionnaires were used to collect sociodemographic, lifestyle, nutritional and behavioural data. GPS coordinates of households and spatial data were collected. Multilevel and spatial statistical modelling is to be used for data analysis. Overall prevalence of obesity was 31.6 % (12.0 % in men, 40.7 % in women). The majority of obese people perceived their body size as underweight or normal. Conclusion The study protocol shows a unique way of cluster sampling in household surveys using government-designated census enumeration areas with well-defined geographical boundaries. The study protocol is also unique in the way it obtains a representative sample of the different strata of the population. The pilot study shows a high prevalence of obesity, especially among women, and a mismatch between the perceptions of obese people about their size and their actual weight. Field work of the main study is expected to be completed in April 2016. A prediction model for insulin-associated weight gain in patients with type 2 diabetes Cichosz S 1 Background Weight gain is a continued challenge when initiating insulin therapy in patients with type 2 diabetes. However, if prediction of insulin-associated weight gain were possible on an individualized level, targeted initiatives could be undertaken to reduce weight gain.
The objective of this study was to identify predictors of weight gain in insulin-treated patients with type 2 diabetes. Methods 412 individuals with type 2 diabetes were, in addition to metformin or placebo, randomised to 18 months of treatment with three different insulin analogue treatment regimens. Participants with excessive weight gain were defined as the group with weight gain in the 4th quartile. We developed a pattern classification method to predict individuals prone to excessive weight gain. The median weight gain among all patients (n = 412) was 2.4 (95 % prediction interval: -5.6 to 12.4) kg, and 8.9 (95 %: 6.3 to 15.2) kg for the upper 4th quartile (n = 103), during the 18 months. No clinical baseline data were strong predictors of excessive weight gain. However, the weight gain in the first 3 months of the trial and the subsequent dose of insulin yielded a useful predictor of the weight gain at the 18-month follow-up. Combining these two predictors into a prediction model with other clinically available information produced a ROC AUC of 0.80. We have developed a prediction model that could help identify a substantial proportion of individuals with type 2 diabetes prone to large weight gain during insulin therapy. A general software framework for reading centers with application to ophthalmology findings in clinical studies and telemedicine Clin L 1 , Leitritz MA 2 , Brand L 1 , Dietter J 3 , Burgert O 1 , Dynowski M 4 , Ueffing M 3 , Thies C 1 Although reading centers across applications have largely the same requirements, no specific software support system is available. In this work the general requirements of a reading center are determined and the underlying general software components are identified. Based on these findings, a general information system architecture and implementation for the examination of medical data by medical experts is presented in the context of health studies. A fundamental challenge is the adaptability of interfaces and data models to varying study contexts.
A further challenge is the integration of external data processing for computer-assisted diagnosis. The system has been successfully applied to two different ophthalmologic clinical studies, with 2000 and 3000 fundus images and three examiners, respectively. It is also prepared to process fundus images for more than 40,000 participants in a large longitudinal study. The design of a specific reading center platform supports the processing of large sets of data by few medical experts. By providing an open-source framework, this can be made available to the community and extended with special interfaces, form management or methods for computer-assisted diagnosis. The association between admission anemia and long-term mortality in patients with acute myocardial infarction: results from the MONICA/KORA myocardial infarction registry Colombo M 1,2 , Kirchberger I 1,3 , Amann U 1,2 , Heier M 1,3 , Thilo C 4 , Kuch B 5 , Peters A 3 , Meisinger C 1,3 1 KORA Herzinfarkt Register Augsburg, Augsburg, Germany; 2 Helmholtz Zentrum München, München, Germany; 3 Helmholtz Zentrum München, Neuherberg, Germany; 4 Klinikum Augsburg, Augsburg, Germany; 5 Klinikum Nördlingen, Nördlingen, Germany Low hemoglobin levels indicate anemia and are commonly found in patients with acute myocardial infarction (AMI). Studies have shown that in patients with AMI, anemia on admission is associated with increased short- and long-term mortality. This study aims to examine the impact of anemia at hospital admission on long-term all-cause mortality following AMI in patients recruited from a population-based registry. The final study population comprised 2011 patients who survived more than 28 days following AMI. Patients were consecutively hospitalized for AMI that occurred between January 2005 and December 2008 and were followed up until December 2011. Hemoglobin levels were measured at hospital admission and categorized according to the WHO classification of anemia.
Mild anemia was defined as hemoglobin levels of 110-119 g/l in women and 110-129 g/l in men. Patients with hemoglobin levels ≤109 g/l were assigned to the group with moderate to severe anemia. To compare survival in patients with and without anemia, Cox proportional hazards regression models adjusted for potential confounders were calculated. Mild anemia and moderate to severe anemia were present in 183 (9.1 %) and 100 (5 %) patients on admission, respectively. During a median follow-up time of 4.2 years (interquartile range 2.3 years), 11.9 % of the total study population died. Long-term mortality among patients with and without anemia was 30.0 % (patients with mild anemia: 26.2 %; patients with moderate to severe anemia: 37.0 %) and 9 %, respectively. Multivariate Cox proportional hazards regression showed significantly increased mortality risks in patients with mild (HR 1.74) and moderate to severe (HR 2.05, 1.37-3.05) anemia compared to patients without anemia. The presence of anemia at hospital admission adversely affects long-term survival following AMI in our study population. Anemia on admission, even if only present in the mild form, should be considered a risk factor in patients with AMI. Long menstrual periods are determinant factors in long-term exposure to hormones. Long-term exposure to hormones has been positively associated with the risk of hormone-related disease. Longer gynecological age was associated with an increased risk of endometrial cancer [1]. According to previous studies, early age at menarche was positively related to the risk of breast cancer and inversely correlated with the risk of stomach and gallbladder cancer. Also, age at menopause has been related to women's cancer, depression and insomnia [2] [3] [4] [5] [6] [7] [8]. Therefore, the aim of this study was to examine the relationship of gynecological age with epidemiological characteristics.
Methods Data from the Fourth to Sixth Korea National Health and Nutrition Examination Survey (KNHANES IV-VI, including the second year of investigation; 2007-2009, 2010-2012, 2013-2014) were used in this study. Among 65,973 individuals, 11,936 women had a gynecological age. Only the 7,868 women who had experienced natural menopause were included for analysis. The time of menopause was stratified. Gynecological age was defined as the duration between age at menarche and age at menopause. In addition, we applied sampling weights for the subjects. In the normal age-at-menopause group, parents' education level, income level, marriage status (p < 0.0001), quality of life (p < 0.0001), subjective cognitive health and weight control experience (p < 0.0001) showed different means of gynecological age. Besides pulse rate, SBP was inversely related, and DBP, height, weight and BMI (kg/m2) were positively related, to gynecological age. In contrast, among women who experienced early or late menopause, gynecological age showed no relation to health behavior and mental health. Conclusion It has been held that menopause age is determined genetically and is not related to socioeconomic position or environmental factors [9]. We show that gynecological age was related to socioeconomic status, mental health, health behavior, obstetric and gynecological history, and anthropometry. We surmise that the father's education level was related to childhood nutrition, and we estimated that childhood nutritional status was related to early age at menarche. In a previous study, middle-aged women's mental health was associated with early menopause [4]. However, in this study, women who experienced depressive symptoms showed no relation to gynecological age. Longer gynecological age was related to good quality-of-life scores among women with a normal age at menopause. Increasing hormone exposure was associated with body mass. Obesity was related not only to gynecological age but also to early age at menarche and late age at menopause.
Interestingly, women who were overweight or more by BMI but rated their subjective body mass as average or below had a relatively shorter gynecological age than others. Unlike the natural menopause group, the surgical menopause group does not have a full menstrual period; in previous studies this precluded assessment of the whole gynecological age. This study suggests gynecological age and the epidemiological characteristics that are related to it among women who experienced natural menopause. Development and validation of statistical models for a web-based risk calculator predicting complications of colorectal cancer surgery Crispin A 1 1 Department of Medical Information Sciences, Biometry, and Epidemiology, LMU, München, Germany To provide statistical models for a web-based calculator predicting individual complication probabilities of patients undergoing colorectal cancer (CRC) surgery in Germany. Prediction models were derived from records of patients undergoing first-time CRC surgery between 2010 and 2014, documented in the database of the Study Documentation and Quality Center (StuDoQ) of the German Society for General and Visceral Surgery (Deutsche Gesellschaft für Allgemein- und Viszeralchirurgie, DGAV), a registry of surgical interventions due to CRC in participating hospitals throughout Germany. The StuDoQ database covers more than 200 patient variables, including demographic characteristics, medical history, tumor features, comorbidities, behavioral risk factors, and outcomes. Prediction models were developed in a learning sample of 3813 patients. To avoid overfitting and overly optimistic assessment of our prediction models, we used penalized logistic regression (ridge regression). Models were evaluated in a separate internal validation sample of 2294 patients. Discrimination, i.e. the ability to distinguish between patients with and without events, was assessed using the c statistic, i.e. the area under the receiver operating characteristic (ROC) curve.
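The core of this approach, an L2-penalised (ridge) logistic model evaluated by the c statistic and Brier score, can be sketched in self-contained code. This is an illustrative toy with a single synthetic predictor, not the StuDoQ models, which were fitted on more than 200 registry variables:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_ridge_logistic(X, y, lam=0.1, lr=0.5, epochs=3000):
    """Logistic regression with an L2 (ridge) penalty on the weights,
    fitted by plain gradient descent; the intercept is not penalised.
    The penalty shrinks coefficients and guards against overfitting."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [lam * wj for wj in w], 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            gw = [gj + (p - yi) * xj for gj, xj in zip(gw, xi)]
            gb += p - yi
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def c_statistic(scores, y):
    """c statistic (ROC AUC): probability that a randomly chosen event
    scores higher than a randomly chosen non-event; ties count half."""
    pos = [s for s, yi in zip(scores, y) if yi == 1]
    neg = [s for s, yi in zip(scores, y) if yi == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Synthetic single-predictor data (e.g. a scaled risk score) and outcomes
X = [[0.1], [0.3], [0.4], [0.6], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_ridge_logistic(X, y)
risks = [sigmoid(w[0] * xi[0] + b) for xi in X]
auc = c_statistic(risks, y)
brier = sum((p - yi) ** 2 for p, yi in zip(risks, y)) / len(y)
```

In practice one would fit the learning sample, then compute the c statistic and Brier score on a held-out validation sample, exactly as the abstract describes.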
Calibration was examined graphically, by plotting observed frequencies versus predicted probabilities, and numerically, by means of Brier scores. We report results for four selected outcomes: 30-day mortality, anastomosis leakage, bladder voiding disturbance after rectal surgery, and wound infections. Most models exhibited reasonable discriminatory power. When applied to the validation sample, the resulting c statistics ranged between 0.61 for wound infections and 0.85 for mortality. The corresponding Brier scores ranged from 0.024 for mortality to 0.110 for bladder voiding disturbances. Most models showed satisfactory discrimination and calibration. This does not, however, preclude overly optimistic or pessimistic individual predictions. Risk estimates for rare complications are less reliable than those for more frequent ones. Similarly, particularly high, and thus especially interesting, individual risk estimates tend to be unreliable. Users of risk calculators are strongly advised to seek statistical guidance to prevent irrational clinical decisions. Trends of cancer survival rates estimated from data of German epidemiologic cancer registries Dahm S 1 1 Robert Koch-Institute, Berlin, Germany Cancer survival is a key measure in the evaluation of cancer control. Large variation in survival rates estimated from German epidemiologic cancer registries makes it difficult to interpret trends in survival. The study was based on 3.6 million cancer cases notified to 14 German epidemiologic cancer registries within the years 1997 to 2012. 5-year relative survival rates were estimated stratified by cancer site, registry, follow-up year, age group and sex. For each stratum, the proportion of cases known from death certificates only (DCO) was recorded in addition. Survival rates by cancer site, age group and sex were modeled by mixed linear regression using random intercepts for registries, and DCO proportions and follow-up years as fixed effects.
Based on this linear model, trends in cancer survival rates were predicted at a DCO proportion of 0 %. After controlling for DCO proportions and follow-up years, survival rates showed only a small variation between registries. For most cancer sites we estimated significant positive trends in survival rates over the years. The strongest increase in survival was seen for prostate cancer. Conclusion The proposed method could be used to estimate recent trends in cancer survival in Germany. Accompanying depression with FINE: a smartphone-based approach Dang M 1 , Mielke C 1 , Diehl A 2 , Haux R 1 1 Peter L. Reichertz Institut für Medizinische Informatik, Braunschweig, Germany; 2 Klinik für Psychiatrie, Psychotherapie und Psychosomatik, Städtisches Klinikum Braunschweig, Braunschweig, Germany Major depression is the most prevalent mental disorder and one of the main reasons for disability. To be successful in treating depression, early identification and intervention are necessary. It is therefore important to design more objective and more efficient depression screening techniques. Interventions provided by mobile apps show promise due to their capability to support people in their everyday lives. Until very recently, the general approach to the design and functionality of mental health apps that work effectively in the context of diagnostics had not been widely explored. For this reason, we have investigated potentially significant depression-correlated parameters derived from self-reports, smartphone usage patterns and sensor data to specify our concept. Following the results of the requirement analysis, we developed the Android app 'Fine'. A feasibility check with a specific target audience has shown that the app can record most of the selected parameters reliably. It has also shown that the overall concept was received positively by the target audience.
Further work is planned to improve the functionalities and to adapt them to the specific needs of depression care. Long-term effects of the intervention "Active Health Promotion in Old Age" over a 13-year period: results from the LUCAS cohort study (prolong health) The randomised controlled trial was conducted in Germany (2001-2002). Eligible study participants were 60 years and older, without need of nursing care (German long-term care legislation) and without cognitive impairments. The intervention included a reinforcement programme to strengthen health recommendations: participants of the intervention group had the choice to take part in the small-group session programme "Active Health Promotion in Old Age" or in a preventive home visit programme, or to opt for no programme. The half-day group sessions were provided by an interdisciplinary geriatric health educator team and included information on medical aspects of ageing, physical activity, healthy eating and social participation. Home visits comprised a multidimensional assessment performed by a specially trained nurse. One-year follow-up results showed favourable effects in the intervention group in the use of preventive services and health behaviour [1]. The majority of the intervention group chose the programme "Active Health Promotion in Old Age". After completion of the RCT, members of the control group could also participate in this programme [2]. In addition to the 1-year follow-up, long-term analyses were performed based on the ITT principle (original randomisation), showing no significant differences [3]. Next, we examined long-term effects in participants and non-participants of the multicomponent intervention "Active Health Promotion in Old Age" (1st Prize, German Preventative Award 2005) [4].
On-treatment analyses pooled all participants from the intervention and the control group who decided to take part in the intervention "Active Health Promotion in Old Age" and compared them to all other RCT participants (= non-participants). Long-term analyses comprised a time period of 13.8 years using the primary endpoint survival. Analyses included Kaplan-Meier curves, log-rank tests and Cox proportional hazards regressions. We adjusted for sociodemographic and functional factors, self-perceived health and medical conditions. The on-treatment analyses showed significant differences in survival between participants of the intervention "Active Health Promotion in Old Age" and non-participants: log-rank test p < 0.0001; Cox regression hazard ratio (HR) = 0.499 [95 % CI 0.426; 0.584]. The treatment hazard ratio remained significant after adjustment. Giving the participants of an intervention the opportunity to choose between two intervention programmes (in contrast to the conventional ITT principle of offering one intervention only) showed significant preventative effects not only in the short term (1-year follow-up) but also in the long-term perspective (13 years). Individual preferences in the heterogeneous older population should be considered in order to provide tailored interventions to diverse target groups. Approximately 25 % of the German population is 60 years or older, with 90 % living independently in the community. Nonetheless, older people vary in health status, functional competences and limitations. Health-promoting, preventative and therapeutic intervention programmes aiming to propose tailored and thus effective interventions have to take these differences into account. The LUCAS Functional Ability Index (LUCAS-FAI) was developed to screen functional competence in senior citizens in the community setting.
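The Kaplan-Meier curves underlying survival comparisons of this kind can be written in a few lines; a minimal sketch with invented toy follow-up data (not LUCAS data; the study's formal comparisons additionally used log-rank tests and Cox regression):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator.  times: follow-up in years per subject;
    events: 1 = death observed, 0 = censored at that time.
    Returns the survival curve as (event time, S(t)) pairs."""
    s, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        at_risk = sum(1 for ti in times if ti >= t)
        if deaths:
            s *= 1.0 - deaths / at_risk   # conditional survival past t
            curve.append((t, s))
    return curve

# Toy follow-up data for six illustrative subjects
times = [2, 5, 8, 13, 13, 13]   # years of follow-up
events = [1, 0, 1, 0, 0, 0]     # deaths at 2 and 8 years; rest censored
curve = kaplan_meier(times, events)
```

Censored subjects leave the risk set without triggering a step, which is what lets the estimator use incomplete follow-up; plotting one curve per group and comparing them is the graphical counterpart of the log-rank test.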
This validated self-reporting index includes pre-clinical markers and focusses equally on functional resources and restrictions to subdivide the heterogeneous group of older persons into the classes ROBUST, post-ROBUST, preFRAIL and FRAIL. The prognostic validity of the self-reporting index was investigated in the Longitudinal Urban Cohort Ageing Study (LUCAS). Survival analyses showed significant differences between these classes [1, 2]. One question to detect functional resources was: ''Please indicate if you are able to walk 500 meters''. Alternatively, we have also used a question on cycling instead. Despite many attempts to provide appropriate tools for supporting stroke patients in their care and rehabilitation processes, there is still room for improvement. We propose an online care and rehabilitation planning tool as a potential eHealth service for stroke patients. The aim is to study the care professionals' perceived usefulness of the planning tool. We developed and presented a functional prototype to a neurology team in Stockholm. Three focus groups were performed with the care professionals in the team, and the data were analysed based on the unified theory of acceptance and use of technology. Although care professionals mentioned challenges such as limited time for using the tool and issues related to responsibility for the system, they were positive towards the tool and its potential usefulness in easing understanding of the rehabilitation process and supporting collaboration. Identification of sepsis patients using temporal abstractions and temporal pattern matching Despite the increasing use of electronic medical records (EMRs), high-quality clinical data is still lacking [1]. Among the main issues is the widespread use of free text for clinical documentation. To some extent, Natural Language Processing (NLP) can be used to extract complex concepts from clinical text, but building an NLP engine demands significant effort.
Electronic medical records are unique systems in which concepts with different levels of abstraction coexist. For instance, an EMR may contain individual temperature measurements as well as the concept of 'fever', which is an abstraction over a set of temperature measurements. Following that principle, we have built ClinicalTime, a system that allows researchers to visually express patients' phenotypes using clinical intervals and instants as well as their temporal relationships. Here, we describe the use of ClinicalTime to identify patients with sepsis using the MIMIC II database. Methods MIMIC II is an anonymized database of clinical data containing information from intensive care unit episodes [2]. To create our dataset, we randomly selected 60 patient records. These records were blindly annotated by two clinicians according to the presence/absence of sepsis, defined as a Systemic Inflammatory Response Syndrome (SIRS) of infectious origin [3]. Disagreements were resolved by consensus. In parallel, we described the same criteria using temporal intervals and temporal patterns as described in our previous work. Briefly, in ClinicalTime a researcher uses a graphical interface to define clinical temporal intervals. The researcher defines the intervals and their temporal and mathematical relationships and later combines any number of intervals to define a temporal pattern of clinical events. Subsequently, ClinicalTime performs a temporal abstraction from structured clinical data, in this case MIMIC II, and identifies the patients that match the pre-defined pattern [4]. Using the manually annotated dataset as the gold standard, we calculated ClinicalTime's accuracy, precision, recall and F-measure. Overall, 38 % of all patients had sepsis, concordant with previous literature reports. ClinicalTime correctly classified 73 % of all patients in the sample (accuracy).
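The 'fever' example above illustrates the core idea of temporal abstraction: raw measurements are collapsed into clinically meaningful intervals. A minimal sketch of that step, with an illustrative threshold and invented measurements (not MIMIC II data):

```python
FEVER_THRESHOLD = 38.0  # degrees Celsius; illustrative cut-off only

def fever_intervals(measurements):
    """Abstract raw readings into 'fever' intervals.

    measurements -- list of (hour, temperature), sorted by time
    Returns a list of (start_hour, end_hour) covering consecutive
    measurements above the threshold.
    """
    intervals = []
    start = end = None
    for hour, temp in measurements:
        if temp > FEVER_THRESHOLD:
            if start is None:
                start = hour          # a new fever interval opens
            end = hour
        elif start is not None:
            intervals.append((start, end))  # interval closes
            start = None
    if start is not None:
        intervals.append((start, end))      # still febrile at last reading
    return intervals

readings = [(0, 37.0), (1, 38.5), (2, 39.0), (3, 37.5), (4, 38.2)]
print(fever_intervals(readings))  # → [(1, 2), (4, 4)]
```

In a system like ClinicalTime, such derived intervals would then be combined with other intervals (e.g. abnormal heart rate, positive cultures) via temporal relationships to form a full sepsis pattern.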
Our system performed moderately at classifying patients with sepsis, with a precision and recall of 63 and 74 %, respectively, and an F-measure of 0.68. The main source of misclassification was false positives, amounting to 16 % of all patients. Other systems based on natural language processing have shown better results, such as the study by Murff et al., which obtained 83 % accuracy for identifying patients with sepsis; however, that accuracy comes at the considerable cost of developing a specific NLP engine. Conclusion ClinicalTime performed moderately in the correct classification of sepsis patients in the ICU. Despite the existence of systems that may perform better, ClinicalTime's simplicity (a query system based on the graphical definition of clinical cases using temporal intervals and their relationships, which allows researchers to define their cases in a few minutes) makes it a significant addition to the currently available tools for identifying patient cohorts based on their phenotype in large clinical databases. The 2013 ACC/AHA guidelines introduced an algorithm for risk assessment of atherosclerotic cardiovascular disease (ASCVD) within 10 years. In Germany, risk assessment with the ESC SCORE is limited to cardiovascular mortality. We therefore sought to recalibrate and evaluate the ACC/AHA risk score in the German population and to compare it to the ESC SCORE. We studied 5,238 participants from the KORA surveys S3 (1994-1995) and S4 (1999-2001) and 4,208 subjects from the Heinz Nixdorf Recall (HNR) Study (2000-2003). There were 383 (7.3 %) and 271 (6.4 %) first non-fatal or fatal ASCVD events within 10 years in KORA and in HNR, respectively. Risk scores were evaluated in terms of calibration and discrimination performance. The original ACC/AHA risk score overestimated 10-year ASCVD rates by 37 % in KORA and 66 % in HNR.
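The reported evaluation metrics follow directly from a confusion matrix. The counts below are hypothetical values chosen to be consistent with the reported figures (60 patients, roughly 38 % with sepsis), not the study's actual confusion matrix:

```python
# Hypothetical confusion-matrix counts for n = 60 patients,
# chosen to reproduce the reported metrics.
tp, fp, fn, tn = 17, 10, 6, 27

accuracy  = (tp + tn) / (tp + fp + fn + tn)   # fraction correctly classified
precision = tp / (tp + fp)                    # sepsis calls that were right
recall    = tp / (tp + fn)                    # sepsis cases that were found
f_measure = 2 * precision * recall / (precision + recall)  # harmonic mean

print(round(accuracy, 2), round(precision, 2),
      round(recall, 2), round(f_measure, 2))
# → 0.73 0.63 0.74 0.68
```

With these counts, false positives (fp = 10) are indeed about 16 % of the 60 patients, matching the abstract's stated main source of misclassification.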
After recalibration, miscalibration diminished to 8 % underestimation in KORA and 12 % overestimation in HNR. Discrimination performance of the ACC/AHA risk score was not affected by the recalibration (KORA: C = 0.78, HNR: C = 0.74). The ESC SCORE overestimated by 5 % in KORA and by 85 % in HNR. The C-statistic was 0.82 in KORA and 0.76 in HNR. The recalibrated ACC/AHA risk score showed strongly improved calibration compared to the original ACC/AHA risk score in both German cohorts. Predicting only cardiovascular mortality, discrimination performance of the commonly used ESC SCORE remained somewhat superior to the ACC/AHA risk score. Yet non-fatal events are of increasing importance given the decreased case fatality of acute myocardial infarction. Considering this advantage, the recalibrated ACC/AHA risk score provides a meaningful tool for estimating the 10-year risk of fatal and non-fatal cardiovascular disease in the German population. Clinical data is often captured in unstructured texts and scattered across different health information systems. This complicates the aggregation of information in the process of clinical decision making. However, a quick overview and an efficient representation of relevant aspects of a patient's health status are crucial for this process. While accessing patient data and perusing clinical documents, relevant details need to be discovered quickly. In this paper, we introduce an approach to visualize relevant information from clinical documents by tag clouds. Conventional tag clouds visualize the content of a document by displaying the terms it contains in different sizes, with the size calculated from the term frequency. Important facts and diagnostic results with low occurrence in a text may be ignored by this naïve method. In this paper, we therefore adapt conventional tag clouds using information extraction and a guideline-based classification schema, so that the clinical concerns can be visualized more accurately.
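The conventional frequency-based sizing criticized above can be sketched in a few lines; this makes concrete why a rare but clinically important term ends up in the smallest font. The point range is an arbitrary illustrative choice:

```python
from collections import Counter

def tag_sizes(terms, min_pt=10, max_pt=36):
    """Map each term to a font size proportional to its frequency.

    terms -- flat list of extracted terms (repetitions = frequency)
    Returns {term: font size in points}, linearly scaled between
    min_pt (rarest) and max_pt (most frequent).
    """
    counts = Counter(terms)
    lo, hi = min(counts.values()), max(counts.values())
    if lo == hi:                      # all terms equally frequent
        return {t: max_pt for t in counts}
    return {t: min_pt + (c - lo) * (max_pt - min_pt) // (hi - lo)
            for t, c in counts.items()}

terms = ["fracture"] * 5 + ["pain"] * 3 + ["stenosis"]
print(tag_sizes(terms))  # a term mentioned once gets the minimum size
```

A single mention of "stenosis" gets 10 pt regardless of its diagnostic weight, which is exactly the shortcoming the guideline-based classification schema in the abstract is meant to correct.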
The aspects are extracted according to a classification schema developed by clinical experts. We evaluate the approach on a set of radiology reports for cervical spine treatment. Patient centered event representation for the treatment of multifactorial diseases: current progress and challenges Deng Y 1 , Gaebel J 1 , Denecke K 2 1 Universität Leipzig, Leipzig, Germany; 2 Bern University of Applied Sciences, Biel, Switzerland Events such as clinical interventions and adverse drug events are among the basic semantic units in the clinical workflow and are the foundation of pathway representation. Current research has mainly concentrated on event recognition through concept mapping using medical ontologies (UMLS, SNOMED CT) and gene relation detection in the biological context. However, the analysis of the patient status and the interaction between the patient status and a context event is still at an early stage. In order to realize efficient personalized treatment design and pathway planning, the correlation between a patient status and different types of clinical events should be analyzed. In this paper, we will provide a summary of the current research progress in clinical event detection in the biomedical domain and compare two approaches to event acquisition: an event schema produced using a guideline-based method and an expert-based annotation. We will apply the approaches to generate a structured annotation corpus and a special case of an event schema based on the complication classification and risk management in the treatment of laryngeal cancer. Background Clinical trials are mainly performed in the western world to prove the safety, efficiency, and efficacy of novel drugs or medical devices. For data capture and comprehensive documentation, electronic systems are nowadays available.
However, such systems are stand-alone, providing no interface for data exchange and system interconnection. In addition, image- or signal-based surrogates (imaging biomarkers) are increasingly acquired in those trials. Currently, this binary large object (BLOB) data is also not integrated into the IT systems supporting clinical trials, since it is approaching big-data volumes. We present a novel approach to system interconnection based on existing standards and representational state transfer (REST) Web services. The core idea is taken from health level seven (HL7)-based hospital information systems, where the particular information system components are interconnected via a centralized communication server (CommServ) that translates messages into the language of the communicating components. Currently, the CommServ interconnects: the clinical trial management system (CTMS), two electronic data capture systems (EDCS): OpenClinica for clinical trials and a Google Web Toolkit (GWT)-based rare disease registry (RDR) framework, a picture archiving and communication system (PACS) for diagnostic images and biosignals (DICOM4CHEE), a de-identification service (TMF-PID), a randomization service (R), and Matlab- or Java-based data processing. All messages sent via the communication server are coded in the extensible markup language (XML). Requests to the EDCS are coded according to the operational data model (ODM) of the clinical data interchange standards consortium (CDISC), while enrolment numbers and remote procedure calls (RPC) for BLOB data processing use a proprietary scheme. We demonstrate all directions of data flow in two case studies using mobile devices: an app that displays enrolment for trials, regardless of whether the figures are hosted in the CTMS, OpenClinica, or the RDR framework, and an app that allows photographic documentation of skin lesions or wounds, regardless of whether the data is sent to OpenClinica or the RDR framework.
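To make the XML messaging idea concrete, here is a minimal sketch of building such a request. The element and attribute names are hypothetical; the real enrolment requests follow CDISC ODM, which is far richer than this toy schema:

```python
import xml.etree.ElementTree as ET

def enrolment_request(trial_id, target_system):
    """Build a toy XML query asking a target EDCS for enrolment figures.

    trial_id      -- identifier of the trial (hypothetical format)
    target_system -- name of the receiving component, e.g. "OpenClinica"
    """
    msg = ET.Element("Message", {"type": "EnrolmentQuery",
                                 "target": target_system})
    ET.SubElement(msg, "TrialID").text = trial_id
    return ET.tostring(msg, encoding="unicode")

# A communication server would route this by the "target" attribute
# and translate it into the receiver's own protocol.
print(enrolment_request("TRIAL-042", "OpenClinica"))
```

The design point mirrored here is that the sender never addresses OpenClinica or the RDR framework directly: the communication server inspects the message and performs the translation, which is what lets the mobile app stay indifferent to where the figures are hosted.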
Our Macbeth Color Checker (MCC)-based reference card [1], which is placed next to the lesion, is located automatically in the image and, based on the card's actual geometry and the colors of the MCC plates, used to calibrate the definition and value ranges of the two-dimensional images. In addition, the integrated barcode is read automatically. The card ID is looked up via the communication server for automatic subject referencing and, based on the trial and subject IDs, the current image is sent to the respective EDCS, which acknowledges receipt. So far, 31 virtual machines hosting the variety of services are linked via the CommServ. Conclusion IT systems for clinical research can be integrated effectively using a communication server. However, the existing standardized protocols are insufficient to support all functionality. Regional differences in the prevalence of cardiovascular risk factors in Germany Diederichs C 1 , Neuhauser H 2 , Kroll LE 1 , Lange C 1 , Mensink G 3 , Scheidt-Nave C 1 , Busch MA 4 1 Robert Koch-Institut, Berlin, Germany; 2 Robert Koch-Institut, Berlin, Germany; 3 Robert Koch-Institut, Berlin, Germany; 4 Robert Koch-Institut, Berlin, Germany Cardiovascular disease is the major cause of death worldwide. Eight risk factors, including physical inactivity, smoking, risky alcohol consumption, low fruit and vegetable intake, high body mass index, hypertension, high cholesterol and high blood glucose, were identified by the WHO as accounting for more than 60 % of cardiovascular deaths. We investigated whether there are differences in the prevalence of these risk factors between the 16 federal states in Germany that could help explain the known regional variations in cardiovascular disease prevalence and mortality in Germany. We used pooled data from the 2009, 2010 and 2012 waves of the national population-based study German Health Update (GEDA), with a total of 62,606 participants aged ≥18 years.
Data was collected by standardized computer-assisted telephone interviews. We examined differences between the 16 German federal states in the prevalence of eight major cardiovascular risk factors, separately and combined as the total number of risk factors per person. Risk factors were categorized as follows: physical inactivity (no regular sport in the last 3 months), smoking (current regular or occasional smokers), alcohol use (risky consumption), low fruit and vegetable intake (<1 portion of fruit, vegetables or juice/day), obesity (BMI ≥ 30 kg/m²) and lifetime medical history of hypertension, hyperlipidemia or diabetes. Logistic regression analysis was used to analyze differences between federal states, accounting for variations in age, sex and social status. The federal state with the lowest risk factor prevalence served as the reference category. The results were weighted to represent the German population on 31.12.2011. Statistically significant differences between the 16 federal states were observed for all cardiovascular risk factors except risky alcohol consumption. Diabetes had the largest regional variation between the lowest (6.7 %) and highest prevalence (13.3 %), followed by low fruit and vegetable intake (11.8 versus 21.0 %), obesity (11.8 versus 20.6 %) and hypertension. There are notable differences between German federal states in the prevalence of the eight most important cardiovascular risk factors, with the highest prevalence for six of them found in federal states of former East Germany. The observed variations in cardiovascular risk profiles are in line with known differences in cardiovascular morbidity and mortality between federal states in Germany and emphasize the need for benchmarking in cardiovascular prevention and health care.
Comparative co-authorship network analysis to map and analyse global Chagas and African trypanosomiasis research Dinges SS 1 , Göppel C 1 , Tinnemann P 1 1 Institute for Social Medicine, Epidemiology and Health Economics, Charité-Universitätsmedizin Berlin, Berlin, Germany The World Health Assembly in 2013 called for strengthening research to accelerate the work to overcome the global impact of neglected tropical diseases (NTDs). Social network analysis (SNA) of bibliometric big data enables new empirical measures of health research metrics and contributes to developing appropriate strategies and research to combat NTDs. We aim to quantitatively compare the NTD research bibliometrics of Chagas disease (Trypanosoma cruzi) and Human African Trypanosomiasis (HAT, Trypanosoma brucei gambiense and rhodesiense), two protozoan diseases with particularly weak treatment options, by mapping research intensity, localisation and cooperation patterns. We collected and analysed bibliometric data from publications (original articles and reviews) with relevance for human medicine on Chagas and HAT research between 2005 and 2015 from the Web of Science database. Co-authorship networks were constructed for each disease based on authors and countries as network nodes. Network structures and the importance of nodes and edges were analysed by SNA parameters. For the author connectedness analysis, all authors' affiliated countries were stratified by economic level according to the World Bank income classifications. Data processing was conducted in established software tools using Microsoft Excel, OpenRefine, and OpenStreetMap; for the creation, visualization and analysis of social networks, Table2Net and Gephi were used. We collected and processed bibliometric data of 17,975 (Chagas) and 10,639 (HAT) published articles, of which 7,319 (Chagas) and 4,251 (HAT) were included in our analysis, respectively.
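Two of the SNA measures applied to these co-authorship networks, the share of authors in the giant component and the average edge weight, can be computed without any graph library. A pure-Python sketch on a tiny illustrative graph (not the study data):

```python
from collections import defaultdict

def giant_component_share(edges):
    """Fraction of nodes in the largest connected component.

    edges -- iterable of (author_a, author_b, weight) tuples
    """
    adj = defaultdict(set)
    for a, b, _ in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                      # iterative depth-first search
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        best = max(best, len(comp))
    return best / len(adj)

def average_edge_weight(edges):
    """Mean weight per edge; weight = number of joint papers."""
    edges = list(edges)
    return sum(w for _, _, w in edges) / len(edges)

# toy network: A-B-C form one component, D-E another
toy = [("A", "B", 2), ("B", "C", 1), ("D", "E", 1)]
print(giant_component_share(toy), average_edge_weight(toy))
```

On the real networks these quantities correspond to figures like the 81.27 % giant component and 1.22 average edge weight reported for Chagas.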
The author-based co-authorship SNA yielded 23,615 nodes/154,774 edges (Chagas) and 12,740 nodes/88,390 edges (HAT). There are slightly more papers per author for HAT (13.88 against 13.11) and more authors per paper for Chagas (6.46 against 5.73). For Chagas, the same people cooperate more often (average edge weight 1.22 against 1.18). Both networks differ significantly in number of publications, number of authors and network cohesion. In comparison, the network constructed from published HAT research is smaller and less connected, with proportionally more components and a smaller giant component (76.78 % of the authors against 81.27 % for Chagas). Less research is conducted in affected countries (8.5 % of the publication output for HAT against 56.9 % for Chagas). In terms of publication output and SNA parameters, middle-income countries affected by Chagas perform more strongly than low-income countries; for HAT it is the other way around. There is less collaboration between high-income countries and affected countries in HAT compared to Chagas research, with less than half of all possible connections. HAT research is more equally distributed among the affected countries. In both disease networks, a disproportionately small number of individuals appear as authors of the published research. Conclusion Significant differences in published research on NTDs exist. Co-authorship networks of published research are currently dominated by only a few actors; however, SNA metrics are suggested as a means to identify promising authors and countries where research support, i.e. funding, could be more specifically targeted. Currently established policies to address NTD research should be changed to foster individual or institutional NTD research capacity building, potentially addressing research collaboration. Converting ODM metadata to FHIR questionnaire resources Doods J 1 , Neuhaus P 1 , Dugas M 1 1 Universität Münster, Münster, Germany Abstract.
Interoperability between systems and data sharing between domains are becoming more and more important. The portal medical-data-models.org offers more than 5,300 UMLS-annotated forms in CDISC ODM format in order to support interoperability, and several additional export formats are available. CDISC's ODM and the Questionnaire resource of HL7's FHIR framework were analyzed, a mapping between their elements created, and a converter implemented. The developed converter was integrated into the portal, with FHIR Questionnaire XML or JSON download options. New FHIR applications can now use this large library of forms. What added value can Massive Open Online Courses bring for initial medical training? Douali N 1 , Jaulent MC 2 1 Inserm (Paris 6 University), Paris, France; 2 INSERM, Paris, France Background Although many universities are rushing to join the world of MOOCs, it seems that medical schools are more conservative. None has used MOOCs to validate the different modules. Recently, in a Swedish study, the authors tested using MOOCs for the virtual patient. This study demonstrates that the use of MOOCs can replace other learning methods in medicine. The certification of these MOOCs is not used in the medical curriculum of the students and only serves to supplement their knowledge. In this study, we developed a MOOC hosted on MedeSpace.fr. The MOOC was designed for further training in the field of health economics. It details the courses taught to medical students in France. Initially this MOOC is intended for medical students as additional training. The objective is to study the number of students who will participate in this training and the number who will be certified. Results 2,360 people registered, of whom 600 are medical students.
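The ODM-to-FHIR conversion described in the converter abstract above boils down to an element-by-element mapping. A miniature, purely illustrative sketch: the dictionaries stand in for the real ODM XML and FHIR JSON structures, and the type table is an assumed subset, not the converter's actual mapping:

```python
# Assumed subset of an ODM DataType -> FHIR Questionnaire item.type map.
ODM_TO_FHIR_TYPE = {
    "text": "string",
    "integer": "integer",
    "float": "decimal",
    "date": "date",
    "boolean": "boolean",
}

def odm_item_to_fhir(odm_item):
    """Map one simplified ODM item definition to a FHIR Questionnaire item."""
    return {
        "linkId": odm_item["OID"],          # ODM OID becomes the linkId
        "text": odm_item["Question"],       # question text carries over
        "type": ODM_TO_FHIR_TYPE.get(odm_item["DataType"], "string"),
    }

item = {"OID": "I.HEIGHT", "Question": "Body height (cm)",
        "DataType": "integer"}
print(odm_item_to_fhir(item))
```

A full converter additionally has to handle code lists, item groups, translations and the UMLS annotations, which is where most of the mapping effort lies.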
102 people were certified, 90 of them medical students. The low certification rate is common to all MOOCs, but in this study the certified individuals were mostly medical students. This work is the first step in an effort to develop MOOCs at the medical faculty of Pierre and Marie Curie University. MOOCs can be used to fill the shortage of teachers in certain fields and can be used in continuing medical education. Malnutrition personalized diagnosis decision support system Douali N 1 , Jaulent MC 2 1 Inserm (Paris 6 University), Paris, France; 2 INSERM, Paris, France Population forecasts for France suggest a rapid increase in the number of over-75-year-olds, who may constitute 16 % of the population as a whole by 2050. Protein-energy malnutrition is common in the frail elderly admitted to hospital or living in institutions and results in many negative health outcomes. It can result in multiple medical complications, hospitalization and even death. Protein-energy malnutrition is often underdiagnosed in the elderly and can therefore cause serious complications. The aim of this study is to use a CDSS to predict malnutrition in elderly patients admitted to hospital. The CDSS uses Case-Based Fuzzy Cognitive Maps as its method of reasoning. We included 421 patients in this study and compared the physician's decision with the CDSS prediction. Results 203 patients had malnutrition, and the CDSS achieved a performance of 93 % compared to the physicians' final decisions. Conclusion Diagnosing protein-energy malnutrition can improve the care of patients and prevent serious complications. Development of clinical decision support for personalized patient care is needed for it to achieve its potential and improve the quality, safety and efficiency of healthcare. Psychotropic drug use and alcohol consumption among older adults in Germany.
Results of the German Health Interview and Examination Survey for Adults (DEGS1, 2008-2011) Du Y 1 , Wolf IK 1 , Knopf H 1 1 Robert Koch-Institute, Berlin, Germany Use of psychotropic drugs and alcohol among older adults, particularly concomitant use of the two substances, is a growing public health concern and should be constantly monitored and carefully reviewed over time. Studies investigating psychotropic drug use and alcohol consumption among older adults in Germany are scarce. Using representative data from the most recent national health survey, we analyze the prevalence and determinants of psychotropic drug use and alcohol consumption in this population. Methods Study subjects were 60-79-year-old participants (N = 2,508) of the German Health Interview and Examination Survey for Adults (DEGS1) conducted in 2008-2011. All medications used during the last 7 days before the medical interview were documented. Psychotropic drugs were defined as medicines acting on the nervous system (ATC code N00) excluding anesthetics (N01) and analgesics/antipyretics (N02B), but including opiates (N02A) and codeines used as antitussives (R05D). Psychotropic drugs with herbal active ingredients were considered and coded under specific therapeutic subgroups. Alcohol consumption during the preceding 12 months was measured by frequency (drinking any alcohol-containing beverages at least once a week or daily) and quantity (alcohol consumed in grams/day; cut-offs of 10/20 grams/day for women/men were used to define moderate and risky drinking). Odds ratios (OR) and 95 % confidence intervals (95 % CI) for psychotropic drug use and alcohol consumption were derived from logistic regression models. The SPSS complex samples procedure was used for analysis. Results were weighted to the population of 31.12.2010.
Among the 2,508 study subjects, 21.4 % used at least one psychotropic medication; by frequency, 51.0 % drank alcohol at least once a week and 18.4 % drank daily, while by quantity 66.9 % consumed alcohol moderately and 17.0 % riskily. 2.8 % used psychotropic drugs combined with daily alcohol drinking. Among psychotropic drug users, 62.7 % drank alcohol moderately and 14.2 % drank riskily; 45.7 % reported drinking alcohol at least once a week and 12.9 % reported drinking alcohol daily. The most frequently used psychotropic medications were antidepressants (prevalence 7.9 %) and anti-dementia drugs (4.2 %). Phyto-medicines accounted for 22.7 % of all psychotropic drugs. The most frequently used phyto-medicines were Ginkgo biloba (prevalence 3.8 %), followed by valerian (1.5 %) and St. John's wort (1.1 %). Factors associated with higher psychotropic drug use were female sex (OR 1.83, 95 % CI 1.27-2.64), poor health status (1.74, 1.28-2.36), having a certified disability (1.60, 1.17-2.18) and exposure to polypharmacy (4.30, 3.11-5.94) (1.51, 1.09-2.10). Determinants of combined use of psychotropic drugs and daily alcohol consumption were exposure to polypharmacy (2.46, 1.10-5.48) and having a worse health status (1.88, 1.03-3.43). Despite the high risk of synergistic effects of psychotropic drugs and alcohol, a substantial proportion of elderly psychotropic drug users consume alcohol riskily and daily. Health professionals should discuss the additional health risks of alcohol consumption when prescribing psychotropic drugs to older adults. Studies in the past decade have suggested that light-to-moderate alcohol consumption is associated with reduced risks of cardiovascular morbidity and mortality [1, 2]. This may well influence the drinking patterns of older adults and possibly the combined use of psychotropic drugs and alcohol.
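The odds ratios above come from logistic regression on complex survey data; the simplest way to see what an OR with a 95 % CI means is the 2×2-table (Woolf/log) method. The counts below are invented for illustration and do not reproduce the DEGS1 weighting:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and 95 % CI from a 2x2 table (log method).

    a, b -- exposed:   cases, non-cases
    c, d -- unexposed: cases, non-cases
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# invented counts: 20/80 drug users vs 10/90 non-users with the outcome
print(odds_ratio_ci(20, 80, 10, 90))
```

A regression-based OR additionally adjusts for covariates (sex, health status, polypharmacy) and, in DEGS1, for the survey's sampling weights, which this sketch deliberately omits.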
Combined use of the two substances is extremely harmful and risky for the elderly and needs to be constantly monitored and carefully reviewed over time. Study subjects were 60-79-year-old participants of GNHIES98 (N = 1,614) and DEGS1 (N = 2,508). All drugs used during the last 7 days were documented. Psychotropic drugs, including phyto-medicines and synthetic medicines, were defined as medicines acting on the nervous system (ATC code N00) excluding anesthetics (N01) and analgesics/antipyretics (N02B), but including opiates (N02A) and codeines used as antitussives (R05D). Alcohol consumption during the preceding 12 months was measured by frequency (drinking any alcohol-containing beverages at least once a week or daily) and quantity (alcohol consumed in grams/day; cut-offs of 10/20 grams/day for women/men were used to define moderate and risky drinking). The changes in prevalences from GNHIES98 to DEGS1, adjusted for potential confounders (sex, age, region, community size, social status, living alone, polypharmacy, certified disability), were derived from the predictive margins calculated from a logistic regression model [3]. The standard errors and correlation of the predictive margins were approximated by the SAS LSMEANS statement and used in the calculation of the 95 % confidence intervals (CI). All results were weighted to the population of 31.12.2010. The prevalence of psychotropic drug use remained constant (adjusted absolute change: -0.3 %, 95 % CI -5.4 to 4.8) between GNHIES98 (20.5 %) and DEGS1 (21.4 %) overall, as well as for phyto-medicines (6.7 vs. 6.5 %) and synthetic medicines. Use of psychotropic drugs among older adults in Germany remains stable overall but shows a significant structural change in therapeutic subgroups. In spite of significantly higher daily alcohol consumption, psychotropic drug use combined with daily alcohol drinking shows no significant increase.
As more older adults engage in moderate alcohol consumption, the health outcomes of moderate drinking, particularly in combination with psychotropic drug use, need to be closely monitored and further investigated. How can continuity of care be harmful for diabetes patients? Earlier work found associations between continuity of care (COC) and positive clinical outcomes [1, 2]. However, COC was typically measured only for primary health care and thus ignored care received from other medical disciplines. In the present study, we used the complete care spectrum of diabetes mellitus (DM) patients to analyse COC and its relation to the outcomes ''total days of hospitalization'' and ''mortality'' for DM patients in Lower Austria. Our study was based on claims data from the province of Lower Austria from 2008 to 2011. COC was quantified by the COC index [3], which measures the extent to which different care providers were seen [4]. We used DM-specific medication to identify study patients [5]. Using a cohort study design, the COC index of patients was measured for the duration of one year, starting on the day of their first DM-specific medication. Only outcomes occurring after this harvesting phase were considered, to avoid inferential errors that might originate from commingling the two phases. The outcomes ''total days of hospitalization'' and ''mortality'' were examined by means of negative binomial models and Cox models, respectively. Age, gender, number of visits, hospital stays, comorbidities, and received medication were used as control variables. Our study cohort consisted of 51,728 DM patients. We did not find a relation between COC and the patients' hospitalizations. However, according to our study, increased COC seems to be associated with higher mortality (patients with a COC index of 0.88 had a 67 % higher mortality risk compared to patients with a COC index of 0.33). More details of the project results are available in German [6].
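The COC index referenced above ([3]) is, in its widely used Bice-Boxerman form, computed from the distribution of a patient's visits over providers: COC = (Σ nⱼ² − N) / (N(N − 1)), where nⱼ is the number of visits to provider j and N the total number of visits. A sketch, assuming this is the variant the study used, with invented visit data:

```python
from collections import Counter

def coc_index(provider_per_visit):
    """Bice-Boxerman continuity-of-care index.

    provider_per_visit -- one provider identifier per visit
    Returns 1.0 when all visits go to a single provider and
    approaches 0 with maximal dispersion over providers.
    """
    counts = Counter(provider_per_visit)
    n = sum(counts.values())
    if n < 2:
        raise ValueError("COC index needs at least two visits")
    return (sum(c * c for c in counts.values()) - n) / (n * (n - 1))

# e.g. 6 visits split 4/1/1 over three disciplines (hypothetical patient)
print(round(coc_index(["gp"] * 4 + ["cardio", "ortho"]), 2))  # → 0.4
```

The index's dependence on the number of distinct providers is visible directly in the formula, which is exactly the property the discussion below argues should be disentangled from discipline count.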
Our results, which are in contrast to earlier work, may be explained by the fact that we considered all medical visits for COC, whereas most earlier work focused only on primary care visits. DM patients typically visit different medical disciplines (our cohort visited care providers from a median of 4 different disciplines within one year), which entails a lower COC index. However, this broader care spectrum may be more beneficial for them than visiting just one single care provider, although the latter would entail a high COC index. Thus, a simple and reasonable interpretation of the COC index is only possible if the COC index is disentangled from the number of different medical disciplines a patient has visited. This suggests an adaptation of the COC index metric. Exploring complexity of medical data models: the role of open access, common data elements and uniform semantic annotation Background Data sharing between IT systems requires compatible data models. Due to the complexity of medicine, there is an astronomical number of medical data models [1, 2]. At present, the vast majority of these models are not available to the scientific community, neither for datasets of routine healthcare nor for research. Most data models are not semantically annotated; therefore, mapping data elements between different medical data sources is an error-prone, manual process. The establishment of data standards, i.e. common data elements (CDEs), requires discussion in the scientific community. Open access to medical data models [3] can create transparency and foster this consensus process. Semantic annotation of data models [4] can support the mapping of data elements between different data sources, even in a multi-lingual setting. This annotation should be uniform, i.e. the same code for the same concept, to enable automated mappings.
The Portal of Medical Data Models (MDM, [2]), a registered European research infrastructure, contains approximately 5,300 data models and more than 440,000 data items. It is open access, and the majority of these items are annotated with UMLS codes [5]. The system provides a web-based editor for data models with support for uniform semantic annotation. When new data elements are entered, matching data elements already in the MDM portal are presented for reuse. Information infrastructures for open medical data models with uniform semantic annotation are thus available. However, at present the data models of ongoing commercial trials are entirely non-transparent. Authors and editors of scientific journals should demand open access to data models to foster data sharing and the standardisation of medical data.

The rare disease joint action and how to make rare diseases visible in health care systems
The Joint Action (JA) for Rare Diseases (RD) to promote the implementation of recommendations on Policy, Information and Data for RD was initiated in June 2015. It includes six Workpackages (WP) on the following topics: Coordination, Communication, Evaluation, Orphanet, Orphacodes and Policy Development. The main objectives of the RD-Action are the further development and sustainability of the Orphanet database, contributing to solutions that ensure an appropriate codification of RD in health information systems, continuing the implementation of already identified priorities, and supporting the work of the Commission Expert Group on RD (CEGRD). WP5 of the JA aims to develop a toolset to assist European countries in implementing a specific coding system for RD, the Orphacodes, in a standardized and interoperable way. This should improve the codification of RD and hence their traceability in health care systems. It is based on the CEGRD's Recommendation on Ways to Improve Codification for RD in Health Information Systems.
A major aspiration of WP5 is the definition of common guidelines addressing the quality of codification and the coherence of exploitation at the European level. It includes four tasks; each has its own focus, but they are designed to build on each other. Deliverables and milestones are set to ensure the achievement of the objectives. The first task aims at defining the strategy and tools necessary to implement the Orphacodes in the European countries. A steering group of Member States (MS) will be set up to learn from local experiences already in place and to better define the required steps and strategy. The second task aims to specify the resources required for coding RD consistently across Europe; its substance is to create a master file from pre-existing data from countries and to test the coding guidelines. Following this, the Orphacodes will be promoted across MS by sharing coding tools. In task 3, the master file, together with the guidelines, will also be tested against already existing systems and fine-tuned according to the test results. With the development of the toolset, inter-observer variability should be reduced and consistency across coding systems improved. During the last task, the next steps to address long-term maintenance and sustainability of the resources and guidelines will be determined. To achieve the objectives and deliverables of the first task, a survey was performed, yielding an up-to-date overview of the RD coding situation in contributing countries. The results of the survey were analyzed, interpreted and published in December 2015. The analysis showed that merely introducing a new coding system may not be sufficient to implement it in a standardized way. A risk to take into account is the high variability of the existing coding systems in the MS, and of the local legal regulations governing their application.
Large geographical coverage of a standardized coding system enables the identification of a higher number of RD patients for research and statistical purposes.

In individualized treatment, a well-known strategy concentrates on identifying patient characteristics that predict response to a drug. Factors of this kind may be difficult to establish in some disease areas. In such cases, individualized treatment can still be achieved by adapting the treatment scheme at each monitoring time point according to the observed efficacy response. For eye diseases treated by injection into the eye, flexible treatment schemes are commonly used; these may differ in the frequency of monitoring visits and injections. For instance, under the so-called PRN (pro re nata) regimen, monthly monitoring visits are scheduled; following one or more initial injections, a decision whether to inject is made at each monthly visit based on visual function. In an alternative approach referred to as "Treat and Extend" (T&E), following the initial injections an injection at each monitoring visit is mandatory, but the length of the monitoring interval can be adapted based on visual function. Comparisons of safety between flexible treatment schemes like these are usually based on rates of safety events. However, differences in rates may be difficult to interpret if treatment patterns vary considerably between treatment schemes, and a stratified analysis conditional on the treatment pattern may not be feasible, as only few patients may share the same treatment pattern. We propose a model that addresses these issues for the evaluation of adverse events, involving a piecewise-constant hazard rate (Cox, D.R. et al., 1984) and time-dependent covariates to reflect the diverse treatment patterns with varying treatment exposure. We present a maximum likelihood approach for estimation and inference and apply this to the PRN and T&E regimens.
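The piecewise-constant hazard building block of such a model can be sketched as follows; within each interval the maximum-likelihood hazard is simply events divided by exposure. The proposed model additionally carries time-dependent covariates for the treatment patterns, which this illustration omits:

```python
import math

def piecewise_rates(events, exposure):
    """ML estimates of a piecewise-constant hazard:
    within interval j the estimated hazard is events[j] / exposure[j]
    (events in the interval over person-time at risk in it)."""
    return [d / e for d, e in zip(events, exposure)]

def log_likelihood(rates, events, exposure):
    """Poisson log-likelihood of a piecewise-exponential model
    (up to an additive constant); useful for comparing candidate
    rate vectors or building likelihood-ratio tests."""
    return sum(d * math.log(r) - r * e
               for r, d, e in zip(rates, events, exposure))
```

Because exposure enters the likelihood directly, regimens with very different injection and visit patterns can be compared on a common footing, which is the point of the abstract's approach.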
By modeling the safety risk under diverse treatment patterns, and by using maximum likelihood methodology for estimation and inference, it is possible to take the differing exposures associated with different treatment regimens into account. This will be compared with clinical study results. Conclusion By means of the proposed method, targeted questions can be answered, such as comparisons of the safety of different regimens. This is useful for treatment decisions in clinical practice.

The increasing importance of IT in nursing requires educational measures to support its meaningful application. However, many countries do not yet have national recommendations for nursing informatics competencies. We therefore developed an iterative triple methodology to yield validated, country-specific recommendations for informatics core competencies in nursing. We identified relevant competencies from national sources (step 1), matched and enriched these with input from the international literature (step 2), and fed the resulting 24 core competencies into a survey (120 invited experts, of whom 87 responded) and two focus group sessions with a total of 48 experts (steps 3a/3b). The focus group sessions confirmed and expanded the survey findings. As a result, we were able to define role-specific informatics core competencies for three countries.

Addressing the complexity of mobile app design in hospital setting with a tailored software development life cycle model
Ehrler F 1, Lovis C 1, Blondon K 1
1 University Hospitals of Geneva, Geneva, Switzerland
Recent studies on workflow processes in hospital settings have shown that, since the introduction of EHRs, care providers spend an increasing amount of their time on documentation rather than on bedside patient care. In order to improve the bedside work process and facilitate bedside documentation, we are developing an evidence-based mobile app for healthcare providers.
In this paper, we present a tailored software development life cycle model that we created and validated during the design and development of this smartphone application.

Large scale eHealth deployment in Europe: insights from concurrent use of standards
Eichelberg M 1, Chronaki C 2
1 OFFIS - Institute for Information Technology, Oldenburg, Germany; 2 HL7 Foundation, Brussels, Belgium
Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and a multitude of overlapping and sometimes competing standards can be employed to define document formats, terminology and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects therefore need to answer an important question: how can alternative or inconsistently implemented standards and specifications be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment? In the eStandards project, 19 European case studies reporting from R&D, large-scale eHealth deployment and policy projects were analyzed. Although the study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use, and on the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.
Effect of plant-based diets on inflammatory profiles: a systematic review and meta-analysis of intervention trials
Background Persistent low-grade inflammation is considered an independent risk factor for the development of chronic diseases including neurodegenerative diseases, cardiovascular disease and cancer (1). Substantial evidence supports the role of modifiable lifestyle factors such as diet in chronic disease risk (2). In observational studies, the consumption of plant-based diets was associated with reduced inflammatory profiles and subsequently lowered risk of chronic diseases (3). Plant-based diets are defined as dietary patterns that emphasize foods of plant origin, particularly vegetables, grains, legumes, and fruits (4). These diets generally exclude or rarely include meats, but may include dairy products, eggs, and fish. Variations of plant-based diets are the Mediterranean (5), Nordic (6) and Dietary Approaches to Stop Hypertension (DASH) (7) diets, for which previous studies found favorable associations with lower chronic disease risk. In this context, it can be hypothesized that plant-based dietary strategies prevent the generation of a pro-inflammatory environment in the human organism, which may be one important mechanism linking healthy diets to reduced chronic disease risk. Therefore, we performed a systematic review and meta-analysis of intervention trials to assess the effect of plant-based diets on inflammation-associated biomarker profiles.
Methods Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched for articles published between January 1946 and January 2016. Eligibility criteria included: (1) participants older than 18 years of age; (2) a plant-based dietary intervention; (3) data on mean differences in biomarkers between individuals consuming plant-based diets and those consuming control diets.
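Mean differences of this kind are typically combined with a random-effects model. A minimal DerSimonian–Laird sketch, with purely illustrative numbers rather than the review's data:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird).

    effects: per-study mean differences; variances: their
    within-study variances. Returns (pooled effect, tau^2),
    where tau^2 is the estimated between-study variance.
    """
    w = [1.0 / v for v in variances]          # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    # re-weight by total (within + between) variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

The high I² values reported below (e.g. 94.4 % for CRP) correspond to a large tau² relative to the within-study variances, which is why a random-effects rather than fixed-effect model is the appropriate choice here.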
Exclusion criteria were: (1) intervention trials shorter than 4 weeks; (2) studies among pregnant women or terminally ill patients; (3) concomitant interventions; (4) interventions based on individual food components rather than on overall dietary patterns. Data collected included: study design, baseline characteristics of the study population, information on dietary interventions, and outcome assessment (biomarker measurements). Data were pooled using random-effects models. Heterogeneity was assessed by subgroup and sensitivity analyses, and meta-regression. Evaluated biomarkers of inflammatory status were: C-reactive protein (CRP), interleukin-6 (IL-6), tumor necrosis factor-alpha (TNF-α), soluble intercellular adhesion molecule 1 (sICAM), leptin, adiponectin and resistin. Of the 2,583 initially identified publications, 28 met the meta-analysis inclusion criteria (a total of 2,689 participants; median age, 53 years). Consumption of plant-based diets was associated with a reduction in the mean concentrations of the following biomarkers: CRP [effect size -0.55 mg/l, 95 % confidence interval (CI) -0.78; -0.32, I² = 94.4 %], IL-6 [effect size -0.25 ng/l, 95 % CI -0.56; 0.06, I² = 74 %], and, to some degree, sICAM [95 % CI …; 2.17, I² = 93.2 %]. No substantial effects were revealed for the remaining biomarkers: TNF-α, resistin, adiponectin and leptin. Conclusion Plant-based diets are associated with an improvement in inflammatory profiles, as indicated by decreases in CRP and IL-6 concentrations, and could provide a means for the therapy and prevention of inflammation-related chronic diseases.

(1). These include the work sites perform in the trial as well as the work the coordinating center performs in managing sites. Site work includes identifying potential subjects, verifying that subjects meet study inclusion/exclusion criteria, conducting study procedures, and collecting study subjects' baseline and follow-up information.
Many site-based activities revolve around the collection and reporting of clinical trial information and could potentially be enhanced through information technologies.
Secondary data collection A major breakthrough in clinical trial operations occurred when a Swedish group used national cardiovascular registries to collect and manage clinical trial data (2). This and other government initiatives led to interest in the secondary use of registry and/or patient billing data in clinical trials. However, the secondary use of existing data is associated with limitations in study design and may raise issues regarding data quality.
eSource For at least a decade, researchers have explored the use of eSource technologies to populate study data sets by capturing data directly from electronic medical record systems (3). More recently, standards and methodologies such as Retrieve Form for Data Capture (RFD) and Fast Healthcare Interoperability Resources (FHIR) seek to overcome many limitations of secondary data use for clinical trials while retaining most of the benefits. However, these methods require set-up that may place a burden on informatics-naïve sites.
Data quality Data quality issues are important considerations with any data collection method, and recently even more so for health record data. Concerns over the quality of health record data have deep roots (4, 5, 6). Current interest focuses on problems created by the inability of most emerging data capture methods to provide for data validation and correction. Two distinct modes of data collection exist: (1) collection of existing data for some secondary use, and (2) collection of new data for some primary or secondary use. Data quality assessment methods are needed in both cases; data quality assurance and control methods should be applied in the latter case.
We present a framework that covers both cases, and as such is applicable over the wide range of proposed and existing methods for the collection of EHR data for use in clinical studies.
Ethics and regulatory Ethical and regulatory issues underlie each data capture and data quality issue associated with the use of new medical informatics technologies in clinical trials. If registry, billing or electronic medical record data are repurposed for clinical trials, do patients and regulators need to be informed, and do patients need to consent? If so, in which situations is this required, and how will consent be obtained?
Summary In this workshop, we will review the technical, data quality and ethical/regulatory issues associated with using medical informatics to improve clinical trial operations. Workshop participants will contribute to this work and help set an agenda for a subsequent workshop planned for the Information Technology and Communication in Health (ITCH) 2017 Conference in Victoria, BC.

Graph databases for openEHR repositories
openEHR is a specification for an Electronic Health Record (EHR) architecture addressing adaptability and interoperability issues. Building openEHR repositories is challenging, since it requires a thorough grasp and implementation of the openEHR Reference Model (RM), which has numerous classes in a deeply hierarchical tree structure. Moreover, a mismatch between the database model and the RM can lead to high development time and cost. A graph model shares many semantic similarities with the definitions of the openEHR RM, making it a potential fit for their representation and implementation. This research proposes a method for implementing an openEHR repository with a graph database employing the labeled property graph model. The performance and scalability of Neo4j for real-life EHR applications are also considered. For the implementation of the openEHR RM, we propose the following approach: nodes correspond to instances of the RM classes.
Relationships are modeled following the RM class hierarchy. Relationship types are named according to archetype paths originating in openEHR's path mechanism. RM class names are assigned as node labels. The 'archetype_node_id' and 'node_id' properties are assigned as node attributes. This approach preserves the ability to locate nodes using a path mechanism. Subsequently, we propose the use of archetype paths to direct traversals over the graph structure. The complexity of executing a traversal step in Neo4j is O(1), compared with an average of O(log(n)) for a binary search to locate an index entry. Finally, to gain performance, we label archetype root and leaf nodes using the semantics provided by archetypes. Since tree structures are graphs in the graph-theoretic sense, modeling the openEHR RM as a graph consisted of direct mappings. Multiple EHR instances were created and queried using Cypher. Using Neo4j's browser, we were able to directly visualize a patient's EHR as a semantic graph. The resultant graph database contains a disjoint union of directed rooted trees. Each tree corresponds to the EHR of an individual person, where the root node indicates the EHR ID and the leaf nodes correspond to the values of archetype elements. Moreover, we found that Cypher queries could be directly mapped to the Archetype Query Language (AQL) queries proposed by openEHR. A possible obstacle lies in the limited number of nodes currently supported by Neo4j (34 billion), which is likely to be surpassed in real EHR applications. Horizontal scaling requires us to partition our graph, and partitioning big graphs is usually a challenge. However, in the case of an openEHR repository, having disconnected graphs allows for good partitioning, in which locally executing traversal operations can be maximized for routine clinical activities. Further research is required on this matter. We proposed a method for implementing the openEHR RM using a graph database.
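The mapping described above (RM instances as labeled nodes, archetype paths as relationship types, O(1) traversal per hop) can be illustrated with a stdlib-only toy store; the labels, paths and values below are invented for illustration and are not real openEHR archetypes:

```python
class GraphEHR:
    """Toy labeled-property-graph store for an openEHR-like tree.

    Node labels stand in for RM class names; relationship types are
    named after path segments, mirroring the archetype-path approach.
    A real repository (e.g. Neo4j) adds persistence, indexing and a
    query language on top of the same structural idea.
    """
    def __init__(self):
        self.nodes = {}   # node id -> {"label": ..., "props": {...}}
        self.edges = {}   # (source id, relationship type) -> target id

    def add_node(self, nid, label, **props):
        self.nodes[nid] = {"label": label, "props": props}

    def add_edge(self, src, rel, dst):
        self.edges[(src, rel)] = dst

    def traverse(self, root, path):
        """Follow a '/'-separated path from root; one dict lookup
        (O(1)) per hop, analogous to a graph traversal step."""
        node = root
        for rel in path.strip("/").split("/"):
            node = self.edges[(node, rel)]
        return node
```

Because each patient's EHR is its own rooted tree, the store is a disjoint union of trees, which is what makes the per-patient partitioning mentioned above straightforward.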
The proposed method was applied using Neo4j. The resultant model aligns semantically with the definitions in the openEHR specification, making it fit for the representation and implementation of its RM. Further exploration of performance gains using the knowledge, structure and semantics provided by the openEHR archetypes is encouraged.

Outline Historically, research and health care information technology systems have been disconnected, supporting separate, and sometimes redundant, processes and workflows. Unfortunately, the use of disparate systems can result in patient safety concerns, inefficient processes, data quality issues and challenges in research billing. The PowerTrials solution integrates research processes into the workflow of Cerner's electronic health record (EHR), ensuring that clinical research and clinical care share relevant data. You can use PowerTrials to organise the flow of clinical trial data by efficiently identifying candidates, consistently managing documentation and integrating data capture. This integrated approach enhances patient safety, supports study recruitment, and streamlines research processes, ensuring protocol compliance, efficiency and data quality. PowerTrials also tests study feasibility: information gathered can guide protocol revisions and enable the discovery and prioritisation of research opportunities. Please find below further detail about the key benefits:
Enhance patient safety PowerTrials is embedded in the EHR, so all clinicians are aware of their patients' research participation via an "on study" flag.
Support study recruitment PowerTrials systematically compares EHR records and study criteria to alert researchers and clinicians when a patient is a good candidate for a study. Protocol-specific screening modules support two distinct workflows for the identification of potential research candidates.
Specifically, these modules screen:
- studies against a specified patient population (researcher workflow)
- a patient against available studies (clinician workflow)
Streamline research processes PowerTrials integrates with core EHR features so you can ensure accurate and efficient research visits and streamline charges.

Deterministic data linkage between organizations: a verified procedure applied in Austria
Endel F 1, Dremsek A 2, Holl J 2, Endel G 3, Wagner-Pinter M 2
1 Vienna University of Technology, Vienna, Austria; 2 Synthesis Forschung, Vienna, Austria; 3 Main Association of the Austrian Social Insurance Companies, Vienna, Austria
Background DEXHELPP ("Decision Support for Health Policy and Planning"), a COMET K-project funded by FFG, is a collaboration between small and medium-sized businesses, the Vienna University of Technology, Gesundheit Österreich GmbH (a subsidiary of the federal ministry of health), and the main association of Austrian health insurance institutions. Analysing, sharing and linking sensitive, indirectly identifiable personalized data is a required but highly delicate process in this project. In the federal state of Austria, responsibilities concerning the health and social security system are spread among various public institutions. As a result, data collections are detached, and although a unique person identifier ("Sozialversicherungsnummer" in German) exists, legal and organisational obstacles have to be tackled to utilize or even link data. To prepare secure and reliable data linkage of several sources at a central facility, we defined and implemented a thorough process. The designed procedure is divided into two distinct steps. First, project-dependent identifiers (called 'common IDs') holding no usable information are exchanged. Second, the involved data sources prepare a defined extract, including the common ID, and independently transfer it to the project's secure computing facility, where linkage is performed.
Exchanging the common ID is the most sophisticated part of this process. Several participants are involved: two project partners managing research data, two trust centers liable for pseudonymisation, and data custodians managing the collaboration. At the beginning, one data center creates a unique random common ID and transmits it, together with the associated pseudonyms, to that center's own data trustee. The trustee replaces each pseudonym with the identifying unique person identifier and securely transmits both variables to the second data trustee. After pseudonymisation, the second data center receives the data, consisting of the common ID associated with its own pseudonyms. Finally, a dataset including the common ID and the local pseudonym is available at both data sources. As a result, the sources are potentially linkable without any identifying information being revealed during this process. The whole procedure has already been performed successfully in a pilot study restricted to a certain part of Austria. Two large data sources from different collaborating organizations, potentially covering the entire Austrian population (~8.5 million inhabitants) over a period of several years, took part in this first proof of concept. Tough questions about security, permission and liability were discussed during the process and sufficiently answered. It is now possible to securely transfer data to the project's independent research facility, where deterministic linkage on an individual basis can be performed directly. An important milestone of the DEXHELPP project has been reached by demonstrating data linkage in a complex environment. The described process not only makes it possible to share and link data from different sources in a reliable manner, but also earned the trust of all participating organizations while ensuring privacy beyond legal requirements. Confidence in and understanding of the presented procedure turned out to be a key point of this project.
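The two-trustee exchange can be sketched end-to-end in code. The dictionaries below stand in for the trustees' lookup tables, and all identifiers are invented; the real procedure additionally involves secure transfer channels and organisational separation that a sketch cannot capture:

```python
import secrets

def exchange_common_id(pseudonyms_a, trustee_a, trustee_b):
    """Sketch of the two-trustee common-ID exchange described above.

    trustee_a: pseudonym_a -> person identifier (held by trustee A only)
    trustee_b: person identifier -> pseudonym_b (held by trustee B only)
    Returns one table per data source (local pseudonym -> common ID);
    the research facility later joins extracts on the common ID alone,
    never seeing the person identifier.
    """
    table_a, handoff = {}, {}
    for pa in pseudonyms_a:
        cid = secrets.token_hex(8)       # random; carries no information
        table_a[pa] = cid
        handoff[trustee_a[pa]] = cid     # trustee A re-identifies
    # trustee B re-pseudonymises with its own local pseudonyms
    table_b = {trustee_b[pid]: cid for pid, cid in handoff.items()}
    return table_a, table_b
```

The key property is that the common ID is random: whoever sees only the extracts plus the common ID learns nothing about the underlying person identifier.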
Constructing an ROC curve to assess a treatment-predictive continuous biomarker
Engelhardt A 1, Shen YM 1, Mansmann U 1
1 IBE, LMU München, München, Germany
This paper presents the idea of an ROC curve that quantifies the discriminatory potential of a continuous biomarker for treatment selection when the outcome is continuous. The analysis assumes data from a randomized parallel-group design. We use non-parametric density estimators to construct an ROC curve based on the probabilities that a (non-)responder, defined by better (worse) response to treatment as opposed to control, in the treatment group has a biomarker value above a value c. In a simulation study based on a normal distribution, our non-parametric approach comes close to the true AUC. Application to a real data set shows that, despite a significant interaction term in a proportional hazards model, a biomarker may not be helpful for treatment decisions. Our proof-of-principle study opens the door to further developments and generalizations.

Antenatal detection of intrauterine growth restriction: a case-control study
Ernst SA 1, Brand T 1, Zeeb H 1
1 Leibniz Institute for Prevention Research and Epidemiology - BIPS, Bremen, Germany
Intrauterine growth restriction (IUGR) can be described as the inability of a fetus to reach its designated growth potential at any gestational age [1]. As demonstrated recently by results of a population-based study, IUGR is the single largest risk factor for stillbirth, increasing the stillbirth rate fourfold compared to pregnancies with normal growth, and eightfold if antenatally undetected [2]. Early detection of IUGR is therefore of core importance for maternal and child health. However, observational studies from the late 1990s and early 2000s reported antenatal detection rates of 25-30 % for suboptimal fetal growth (i.e. IUGR) through routine fetal ultrasonography [3, 4].
Possible reasons for the antenatal non-detection of IUGR have not yet been well elucidated. The aim of this study was to describe the sensitivity of antenatal detection of IUGR and to investigate whether specific groups have a higher chance of non-detected IUGR (e.g. women with a migrant background). All mothers who delivered a small for gestational age (SGA) newborn (birth weight <10th percentile for gestational age and sex) in one of three hospitals in Bremen (Germany) during the recruitment phase from November 2012 to June 2015 were eligible for inclusion in this hospital-based case-control study. Cases were defined as neonates with an IUGR that was not detected antenatally, while controls were defined as neonates whose IUGR was identified antenatally. Data collection instruments included a newborn documentation sheet and a standardized, computer-assisted personal interview (CAPI). Univariate and adjusted logistic regression models were fitted for the association of antenatal non-detection of IUGR with maternal migration status and socio-economic status, providing odds ratios (OR) with 95 % confidence intervals (CI). During the study period, 1,087 of all 12,926 newborns were identified as SGA (8.0 %). The overall response rate was 15.0 % (n = 161). Suboptimal fetal growth was identified antenatally in n = 77 pregnancies (controls) and was not detected antenatally in n = 84 (cases), yielding an antenatal detection rate of 47.8 %. Detection rates increased with the severity of the growth restriction (birth weight <10th percentile: 34.2 %; <5th percentile: 64.7 %; <3rd percentile: 57.7 %). In multivariate analyses, the point estimate indicates that antenatal non-detection of IUGR is about twice as likely in women with a migrant background (OR 1.90) as in non-migrants, although the association was not statistically significant. We did not find statistically significant differences regarding parental socio-economic status.
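Odds ratios like the one reported above come from logistic regression. As a minimal numerical illustration, a crude OR with a Woolf-type 95 % CI can be computed directly from a 2×2 table; the counts in the example are hypothetical, not the study's data:

```python
import math

def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Crude odds ratio with a 95 % CI (Woolf's log method).

    Rows of the 2x2 table: exposure (e.g. migrant background);
    columns: case/control (e.g. IUGR not detected / detected).
    """
    or_ = (exposed_cases * unexposed_controls) / \
          (exposed_controls * unexposed_cases)
    # standard error of log(OR): sqrt of summed reciprocal counts
    se = math.sqrt(sum(1.0 / n for n in (exposed_cases, exposed_controls,
                                         unexposed_cases, unexposed_controls)))
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```

With small strata (as in this study, n = 161), the CI around such an OR is wide, which is consistent with a point estimate near 2 that nonetheless fails to reach statistical significance.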
Although detection rates seem to have increased since the late 1990s, about half of the pregnancies affected by suboptimal fetal growth remain undetected antenatally.

Bone age assessment is important in many fields of medicine, including pediatrics, forensic medicine, and sports medicine. Several methods have been developed to determine bone age. Among them, the Tanner–Whitehouse method is considered the most reliable, but it is complicated and time-consuming, making it less welcome in clinical practice. Electronic methods of data entry and reporting may enhance the usability of this score-based method. The purpose of this study is to develop a web-based system that reduces the time required for bone age determination, followed by an assessment of the actual effect in clinical practice. We have designed and implemented a web-based application for bone age determination based on the Tanner–Whitehouse (TW3) method to help radiologists determine the bone age of children through a user-friendly interface. Different levels of access are embedded in the system to ensure appropriate privacy and confidentiality for both patients and practitioners. There are also customizable image libraries, which allow the user to define the score for each of the target areas by rapid point-and-click methods. The calculated score is plotted onto a normal curve as a summary report, including a visual summary of the process and a chart of the individual's skeletal age compared with chronological age, based on reference values for the RUS skeletal maturity score. The system was assessed with respect to functional accuracy and its impact on reducing bone age reading time. In compliance with the TW3 method, 30 radiographs were reported both with and without this application, and the time needed for reporting was recorded. The images were viewed with the ClearCanvas viewer.
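Score-based methods such as TW3 sum per-bone ratings and map the total onto a reference curve. A minimal sketch of that mapping, with an invented reference table (real TW3 RUS tables are far denser and sex-specific):

```python
import bisect

# Hypothetical reference table: (total RUS score, skeletal age in years).
# Purely illustrative; real TW3 tables differ by sex and are much denser.
REFERENCE = [(100, 5.0), (300, 8.0), (600, 11.0), (900, 14.0), (1000, 16.0)]

def bone_age(bone_scores):
    """Sum per-bone ratings and linearly interpolate skeletal age
    from the reference table; clamps at the table's ends."""
    total = sum(bone_scores)
    scores = [s for s, _ in REFERENCE]
    i = bisect.bisect_left(scores, total)
    if i == 0:
        return REFERENCE[0][1]
    if i == len(REFERENCE):
        return REFERENCE[-1][1]
    (s0, a0), (s1, a1) = REFERENCE[i - 1], REFERENCE[i]
    return a0 + (a1 - a0) * (total - s0) / (s1 - s0)
```

Automating exactly this lookup and the subsequent plotting is what allows a web-based system to cut reading time while keeping the score-based method's accuracy.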
The system is a customizable application that allows the user to keep patient data for long-term comparisons, and it provides not only the score and bone age, but also a collection of images best matching the case and a plot indicating the skeletal maturity situation relative to normal cases. The results support that this system can help physicians and radiologists determine the skeletal maturation of children accurately. The web-based system also allowed us to reduce the reading time with the TW3 method from 8.4 min to 2.7 min on average. The online score-based bone age determination tool can significantly reduce the time required for interpretation and reporting, is reproducible, and can be used from anywhere. We hope this new approach facilitates a wider use of score-based, more accurate methods.

Rivaroxaban vs. phenprocoumon use in Germany and risk of bleeding: a claims data analysis based on 80,000 patients
Fassmer AM 1, Jobski K 2, Haug U 1, Schink T 1
1 Leibniz Institute for Prevention Research and Epidemiology - BIPS GmbH, Bremen, Germany; 2 School of Medicine and Health Sciences, Carl von Ossietzky University of Oldenburg, Oldenburg, Germany
Background Rivaroxaban (RVX) is an increasingly used new oral anticoagulant (NOAC) licensed, inter alia, for stroke prevention in atrial fibrillation (AF) and for the treatment of venous thromboembolism (VTE). Clinical trials compared RVX to warfarin, but not to the standard of care in Germany, phenprocoumon (PPC). Recent insights regarding defective devices used to measure blood clotting in the control group of these trials raised new questions about the risk of bleeding associated with RVX compared with standard drugs. The aim of this study was to characterize new users of RVX versus PPC in Germany and to assess and compare bleeding rates in both groups using claims data.
Methods Based on data from the German Pharmacoepidemiological Research Database (GePaRD) from 2011 to 2013, a cohort of new users of RVX or PPC was established. Comedication, comorbidity, potential indication as well as thromboembolic and bleeding risk scores (CHA2DS2-VASc and HAS-BLED) were assessed. Crude incidence rates (IRs) per 1,000 person-years (PY) were estimated for any bleeding leading to hospitalization and for the sub-types intracerebral, gastrointestinal and urogenital bleeding. A nested case-control analysis (NCCA) will be applied to compare the adjusted risk of these bleeding events between new users of RVX and PPC using conditional logistic regression. Results The study cohort included 31,596 new RVX users and 48,965 new PPC users. Main indications were AF (40 %) and VTE (23 %). New users of RVX were younger (median 69 vs. 71 years) and had less cardiovascular comorbidity (31 vs. 46 % in users with AF). The CHA2DS2-VASc score was lower in RVX users, while HAS-BLED was similar in both groups. The overall crude bleeding rates were slightly lower in new RVX users than in new PPC users (overall IR: 17.58 per 1,000 PY (95 % CI 15.98-19.29) vs. 19.60 per 1,000 PY (18.43-20.83)). Conclusion New users of RVX and PPC differed regarding cardiovascular comorbidity and thromboembolic risk score. Therefore, the crude IRs should be interpreted with caution. As further data for the NCCA are pending, its results will be presented at the conference.

Fehr A 1 , Hense S 1 , Ziese T 1 1 Robert Koch-Institut, Berlin, Germany

Evidence-based public health policy needs data and health information from valid and comparable sources. Harmonized data collection at the European level reduces variation and duplication of data collection in member states. The ECHI initiative (European Core Health Indicators) aims at promoting sustainable, policy-relevant European public health monitoring, thereby reducing health information inequalities within and between EU member states. 
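The crude incidence rates per 1,000 person-years with 95 % confidence intervals reported in the rivaroxaban/phenprocoumon analysis above are commonly derived from event counts and accumulated person-time; a minimal sketch using a log-scale Wald interval and hypothetical counts (the abstract does not report the underlying event numbers):

```python
import math

def crude_ir_per_1000(events: int, person_years: float, z: float = 1.96):
    """Crude incidence rate per 1,000 PY with a log-normal (Wald) 95% CI."""
    ir = 1000.0 * events / person_years
    # On the log scale, SE(log IR) ~= 1/sqrt(events) for Poisson event counts
    se_log = 1.0 / math.sqrt(events)
    lo = ir * math.exp(-z * se_log)
    hi = ir * math.exp(z * se_log)
    return ir, lo, hi

# Hypothetical example: 400 bleeding events over 20,000 person-years
ir, lo, hi = crude_ir_per_1000(400, 20_000)
print(f"IR = {ir:.2f} per 1,000 PY (95% CI {lo:.2f}-{hi:.2f})")
```

The interval is computed on the log scale and back-transformed, which keeps the lower bound positive even for small event counts.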
In four EU-funded projects between 1998 and 2012 (ECHI-1, ECHI-2, ECHIM, JA-ECHIM), a comprehensive list of indicators was developed. From these, 88 public health indicators were selected for the ''ECHI-Shortlist''. The Shortlist is divided into three sections according to the level of 'implementation-readiness' of each indicator. The ''Implementation Section'' comprises over 50 indicators. For these, data are available for a majority of member states as part of regular international data collections. The indicators can thus be used to support policy making. The data for many of these indicators derive from the European Health Interview Survey (EHIS) or the European Union Statistics on Income and Living Conditions (EU-SILC). Indicators in the two remaining sections are either operationalized and nearly ready to be incorporated into regular international data collections (''Work-in-Progress Section'') or still in need of conceptual and methodological development (''Developmental Section''). Maintaining the ECHI-Shortlist is an ongoing European public health task. Changes in underlying indicator methodology or data collections require changes in definitions and background information for some Shortlist indicators, and emerging public health topics warrant information as a basis for policy making. Since 2015, the EU has funded the BRIDGE Health (BRidging Information and Data Generation for Evidence-based Health policy and research) consortium project. Its overarching objective is to develop a concept for a sustainable European infrastructure for public health monitoring (EU-HIS). BRIDGE Health vertically connects European indicator projects. Horizontally, these projects cooperate on relevant topics such as standardization methods, data quality, health information priority setting, and legal and ethical frameworks. Within the scope of the BRIDGE Health project, the Unit ''Health Monitoring'' of the Robert Koch Institute (RKI) is tasked with leading Work Package 4 (WP4). 
WP4 maps data availability for ECHI indicators across Europe, evaluates and improves existing ECHI indicators, and assesses policy relevance for indicator topics. WP4 implements these tasks by conducting a survey among health data experts in EU member states. Based on its results, WP4 will further develop the ECHI-Shortlist, in close cooperation with representatives from international organizations and with a network of experts on national health indicator implementation. Clear policy relevance is a crucial criterion for the inclusion of indicators into the ECHI-Shortlist. The availability of data is then key to the implementation of indicators. Sustainable and harmonized data collections allow the assessment and comparison of states and trends in population health and health-related determinants in Europe, as well as in health care and services. Several EU-funded projects have set the cornerstones for European public health monitoring. BRIDGE Health will facilitate an up-to-date list of indicators and a concept for a European health information infrastructure to support the design, implementation and monitoring of public health policy.

Obesity and post-operative cognitive dysfunction: a systematic review and meta-analysis Feinkohl I 1 , Winterer G 2 , Pischon T 1,2 1 Max-Delbrück-Centrum für Molekulare Medizin (MDC) Berlin-Buch, Berlin, Germany; 2 Charité - Universitätsmedizin Berlin, Germany

Post-operative cognitive dysfunction (POCD) occurs frequently after surgery and is associated with an increased risk of subsequent dementia diagnosis as well as premature death. Although identification of risk factors for POCD would enable risk assessment of patients undergoing surgery and would additionally help to shed light on the etiology of the condition, few risk factors have been identified to date. Obesity is an established risk factor for hospitalization and late-life cognitive impairment, and is therefore a plausible candidate predictor of POCD. 
Here, we report a systematic review and meta-analysis of studies on the association between obesity and risk of POCD. Methods PubMed and the Cochrane Library were systematically searched. Studies were included if they had prospective designs, reported on human adults undergoing surgery, if cognitive function was measured pre- and post-surgery, if obesity, body mass index (BMI) and/or body weight were ascertained, and if associations with POCD were reported as relative risks or odds ratios. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) and MOOSE (Meta-analysis Of Observational Studies in Epidemiology) guidelines were followed. Underweight, weight loss, and post-operative delirium were not considered. The search yielded 287 articles, of which six met the inclusion criteria. Samples totaled 1,432 older patients (mean age ≥62 years) who were followed up for 24 h to twelve months after surgery. Study quality was relatively poor and studies varied in their consideration of potential covariates. Analysis of three studies with obesity defined as a categorical measure found a trend for a higher risk of POCD among persons with BMI >30 kg/m² versus ≤30 kg/m² that fell short of statistical significance (RR 1.27; 95 % CI 0.95, 1.70; p = 0.10), with indication of moderate statistical heterogeneity between the studies (I² = 77 %; p = 0.01). No associations were found in one study that analyzed BMI as a continuous predictor of POCD (RR 0.98 per kg/m²; 95 % CI 0.93, 1.03; p = 0.45) or when effects were pooled across two studies that associated body weight as a continuous measure with risk of POCD (RR 0.99 per kg; 95 % CI 0.89, 1.09; p = 0.83). In this first systematic review of obesity as a potential risk factor for POCD, we found that only a very small number of studies have addressed the topic. Their results overall provide only limited support for an increased risk of POCD in patients who are obese. 
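The pooled relative risks and the I² statistic quoted in the meta-analysis above follow the standard inverse-variance approach on the log scale; a minimal sketch with illustrative study values (not the actual data from the six included articles):

```python
import math

def pool_fixed_effect(rrs, cis, z=1.96):
    """Inverse-variance fixed-effect pooling of relative risks on the log scale."""
    logs = [math.log(r) for r in rrs]
    # Back-calculate SE(log RR) from each reported 95% CI width
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    ws = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    se_pooled = 1.0 / math.sqrt(sum(ws))
    # Cochran's Q and I^2 quantify between-study heterogeneity
    q = sum(w * (l - pooled_log) ** 2 for w, l in zip(ws, logs))
    df = len(rrs) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    rr = math.exp(pooled_log)
    ci = (math.exp(pooled_log - z * se_pooled), math.exp(pooled_log + z * se_pooled))
    return rr, ci, i2

# Illustrative inputs: three studies reporting RR with a 95% CI
rr, ci, i2 = pool_fixed_effect([1.5, 1.1, 1.3], [(1.1, 2.0), (0.8, 1.5), (0.9, 1.9)])
print(f"pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {i2:.0f}%")
```

With substantial heterogeneity, as in the abstract's I² = 77 %, a random-effects model would widen the pooled interval accordingly.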
Further large-scale, prospective investigations as well as more detailed reanalysis of data from existing surgical cohorts are necessary for clarification.

Background Avoidable mortality is a composite measure of deaths from conditions that might be avoided by timely and effective medical care. It was developed as an indicator for studying the quality of health care services. Previous research has shown that avoidable mortality represents a substantial proportion of all-cause mortality. Furthermore, large regional and socioeconomic disparities in avoidable mortality have been reported. The main objectives of this study are to examine: (1) time trends in avoidable mortality in Switzerland, (2) Swiss results in an international context, and (3) the Swiss-specific association between socioeconomic position (SEP) and avoidable mortality. Based on mortality data from the Swiss Federal Statistical Office, 3-year age-standardized mortality rates (ASMRs) per 100,000 person-years (PYs) were calculated for the population aged less than 75 years, covering the period 1996-2010 and the following groups of causes of death: avoidable conditions, ischemic heart diseases (IHD) (defined as partly avoidable) and non-avoidable conditions. For international comparisons, Swiss mortality rates were calculated for the periods 1997-1998 and 2006-2007,

As a rule, until now, the nursing process has only been used for the planning and evaluation of direct patient care. It would be far too short-sighted to limit the use of these data to the immediate nursing care of a patient during their hospitalisation. A dissertation project at Witten/Herdecke University [1] is investigating which patient characteristics are suitable for predicting a higher or lower degree of nursing intensity. Moreover, the project will also examine whether it is possible to classify prospective nursing intensity on the basis of data obtained directly from routine nursing documentation. 
A prognostic determination of nursing intensity of this kind can be used not only as a basis for human resource planning in hospitals, but also to respond to health policy issues. In addition to the collection and compilation of these data, innovative methods for analysing this volume of data are needed in order to generate information that can guide action. Classic statistical methods reach their limits here or cannot be applied due to the nature of the data. In order to identify predictors of nursing intensity, hitherto unclear relationships between independent and dependent variables will be examined. Multivariate linear regressions are not suitable in this case, because they are drawn from the field of parametric statistics, and their assumptions are not met by the design of the computational model. Methods from the field of non-parametric statistics represent an alternative. Four predictive modelling methods were selected for the proposed research project: regression trees, random forests, multinomial logistic regression and multivariate adaptive regression splines. These are explorative approaches aimed at uncovering structure, such as those found in data-mining processes [2]. What is characteristic of these approaches is not only the large volume of data and the speed with which the data are generated, but also the often very diverse sources of those data. However, it is not possible to arrive at reliable claims on the basis of data volume alone if there are no suitable algorithms available to test the quality of the very diverse (routine) data sources. The first results from the data mining process will be presented over the course of the presentation. To date, we have only just begun to recognise the possibilities made available through the use of ''big data''. Yet for all the euphoria surrounding this, we must also minimise the risks, which have only been discussed in a rudimentary fashion thus far. 
The question of how to handle, apply and utilise the data obtained through routine documentation poses as much of a challenge in dealing with new information technologies [3] as, for example, issues concerning data protection and data security. Addressing these challenges will be a multidisciplinary task that must involve the concomitant engagement of scientific fields such as information technology, mathematics, epidemiology, biostatistics, social and nursing science, and ethics.

Unlocking data for statistical analyses and data mining: generic case extraction of clinical items from i2b2 and tranSMART

In medical science, modern IT concepts are increasingly important for gaining new insights into complex diseases. Data warehouses (DWH) as central data repository systems play a key role by providing standardized, high-quality and secure medical data for effective analyses. However, DWHs in medicine must fulfil various requirements concerning data privacy and the ability to describe the complexity of (rare) disease phenomena. Here, i2b2 and tranSMART are free alternatives representing DWH solutions developed especially for medical informatics purposes. However, several functionalities are not yet provided in a sufficient way. In fact, data import and export is still a major problem because of the diversity of schemas, parameter definitions and data quality, which vary across individual clinics. Further, statistical analyses inside i2b2 and tranSMART are possible, but restricted to the implemented functions. Thus, data export is needed to provide a data basis that can be used directly in statistics software like SPSS and SAS or data mining tools like Weka and RapidMiner. The standard export tools of i2b2 and tranSMART essentially create a database dump of key-value pairs which cannot be used immediately by the mentioned tools, which require an instance-based or case-based representation of each patient. 
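The case-based representation described above, one row per patient instead of key-value pairs, can be obtained with a simple pivot; a minimal sketch with hypothetical concept names (not the actual i2b2/tranSMART export schema):

```python
from collections import defaultdict

def pivot_facts(facts):
    """Pivot (patient_id, concept, value) key-value triples into one row per patient."""
    rows = defaultdict(dict)
    for patient_id, concept, value in facts:
        rows[patient_id][concept] = value
    return dict(rows)

# Hypothetical observation facts as they might appear in a key-value export
facts = [
    (1, "age", 54), (1, "sex", "f"), (1, "hba1c", 6.8),
    (2, "age", 61), (2, "sex", "m"),
]
rows = pivot_facts(facts)
print(rows[1])  # {'age': 54, 'sex': 'f', 'hba1c': 6.8}
```

The resulting per-patient rows (with missing concepts left absent) can then be written out as a flat table for SPSS, SAS, Weka or RapidMiner.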
To overcome this limitation, we developed a concept called Generic Case Extractor (GCE) which pivots the key-value pairs of each clinical fact into a row-oriented format for each patient, sufficient to enable analyses in a broader context. Therefore, complex

Patients and methods RNA expression of genes depicting the p53 pathway (p14ARF, MDM2, p21, BAX and HMGA2) was studied in tumor tissues of 208 HNSCC patients. Network analysis was used to characterize the relation between genetic biomarkers acting upstream (p14ARF, HMGA2, MDM2) and downstream (p21, BAX) of the tumor suppressor p53. Multiple imputation was used to fill in missing values of gene expressions and covariates, creating a total of 5 data sets. Gaussian graphical models within a bootstrap approach were then applied to assess the structure of the gene expression network [4]. The final network was generated using a model which counts how often an edge is detected in 10,000 bootstrap network replications. The effect of the p14ARF network on human papilloma virus infection (HPV) status and patient survival (overall survival and tumor-specific survival) was determined using logistic and Cox regression models for a follow-up time of 6-8 years. Kaplan-Meier survival curves were compared using the log-rank test. Principal component analysis was conducted to summarize the gene expression profile of a network and its effect on patient survival. Expression analysis yielded a bimodal expression of the tumor suppressor p14ARF, which acts upstream of p53. Transcription networks were subsequently separated into a p14ARF high expression network and a p14ARF low expression network. In the p14ARF low expression network, the p53 pathway displayed more network edges, indicating stronger activity of the p53 tumor suppressor pathway, including a stronger association of p14ARF with the senescence gene p21. In the p14ARF high expression network, the interaction between p14ARF and p53 pathway components was weaker. 
P14ARF expression was significantly correlated with positivity for human papilloma virus infection (HPV) (OR = 0.39, 95 % CI = [-1.76; -0.14]). High expression of p14ARF was potentially associated with overall survival time (median difference = 55 months, p = 0.036). Gene expression networks showed no association with patient survival. The two main transcription networks derived from HNSCC tumor tissue gene expression analysis can be characterized by low and high p14ARF expression. Gene expression networks with high p14ARF display fewer p53 network associations, which argues for a tumor suppressor activity of p14ARF independent of p53. High p14ARF expression is a hallmark of human papilloma virus associated HNSCC, and these patients have a favorable outcome.

A risk-based neural network approach for predictive modeling of blood glucose dynamics Frandes M 1 , Timar B 1,2 , Lungeanu D 1 1 University of Medicine and Pharmacy Timisoara, Timisoara, Romania; 2 Third Medical Clinic, Emergency Hospital, Timisoara, Romania

For type 1 diabetes patients, maintaining blood glucose (BG) at normal values is a challenging task due to, e.g., variable insulin responses, diets, lifestyles, and emotional conditions. Hyperglycemic and hypoglycemic events can generate various complications (e.g. diabetic ketoacidosis, retinopathy, neuropathy), so predicting BG values ahead of time is of great importance for diabetes self-management. Herein, we propose a non-linear autoregressive neural network approach, based on the minimal dataset available from a continuous glucose monitoring (CGM) sensor, with an integrated measure of intra-patient BG variability. The method keeps the balance between accuracy and complexity, allowing a fast response with no additional effort or discomfort for the patient. 
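A non-linear autoregressive model of the kind proposed above is trained on lagged windows of past CGM readings; a minimal sketch of the input construction, with an illustrative window length and values (not the authors' configuration):

```python
def make_lagged(series, n_lags):
    """Build (inputs, target) pairs: the last n_lags readings predict the next one."""
    pairs = []
    for t in range(n_lags, len(series)):
        pairs.append((series[t - n_lags:t], series[t]))
    return pairs

# Illustrative CGM readings (mg/dL), one value per 5-minute sampling interval
cgm = [110, 114, 120, 131, 140, 152, 149]
pairs = make_lagged(cgm, n_lags=3)
# Each (window, target) pair would feed a non-linear regressor,
# e.g. a small feed-forward neural network
print(pairs[0])  # ([110, 114, 120], 131)
```

An intra-patient variability measure, as mentioned in the abstract, could be appended to each input window as an additional feature.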
Genetic sum score of risk alleles associated with body-mass-index interacts with socioeconomic status in the Heinz Nixdorf Recall Study Frank M 1 , Dragano N 2 , Eisele L 1 , Arendt M 1 , Forstner AJ 3,4 , Nöthen MM 3 , Erbel R 1 , Moebus S 1 , Jöckel KH 1 , Schmidt B 1

Indicators of socioeconomic status (SES) are known to be related to risks of diseases such as obesity. In addition, twin and family studies suggest that SES modifies the impact of genetic factors on disease, indicating increasing genetic risks with decreasing SES. However, empirical studies on such gene-by-SES interactions incorporating molecular genetic information are sparse. The aim of the present study was to investigate whether a sum score of genetic risk alleles (GRS) related to Body Mass Index (BMI) interacts with indicators of SES in a population-based study sample. Derived from the latest genome-wide association meta-analysis (Locke et al. 2015), 97 single nucleotide polymorphisms (SNPs) associated with BMI were selected for calculating a GRS in 4,518 participants (age 45-74 years) of the population-based HNR study. SES indicators (formal years of education and monthly household equivalent income) and BMI (kg/m²) were assessed at study baseline. Linear regression models were fitted to estimate sex- and age-adjusted effects separately for education and income on BMI, and for the GRS on BMI. Genetic effects were additionally stratified by SES group. Finally, interaction terms were included in linear regression models to check for possible additive interaction effects between the GRS and SES indicators. Social inequalities in BMI were found in the study population, with a decrease in BMI of 0.6 per 1000€/month (95 % CI -0.79, -0.39; p = 4.4 × 10⁻⁹) and 0.25 per year of education (95 % CI -0.31, -0.19; p < 2 × 10⁻¹⁶). 
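A genetic risk score of the kind described above is typically the sum of risk-allele dosages across the selected SNPs, optionally weighted by per-allele effect sizes; a minimal sketch with hypothetical SNP names and weights (not the 97 Locke et al. variants):

```python
def genetic_risk_score(dosages, weights=None):
    """Sum risk-allele dosages (0/1/2) across SNPs, optionally weighted by effect size."""
    if weights is None:
        weights = {snp: 1.0 for snp in dosages}
    return sum(weights[snp] * dose for snp, dose in dosages.items())

# Hypothetical participant: dosage = number of BMI-increasing alleles per SNP
dosages = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
weights = {"rs0001": 0.08, "rs0002": 0.03, "rs0003": 0.05}  # per-allele effect on BMI

print(genetic_risk_score(dosages))           # unweighted count of risk alleles
print(genetic_risk_score(dosages, weights))  # effect-size-weighted score
```

The resulting score enters the regression like any continuous covariate, and a GRS-by-SES product term tests the interaction described in the abstract.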
For the genetic association, a 0.10 (95 % CI 0.07, 0.12; p < 2 × 10⁻¹⁶) increase in BMI per additional risk allele was observed. Stratified by educational group, a 0.24 increase in BMI per additional risk allele (95 % CI 0.16, 0.31; p = 3.5 × 10⁻¹⁰) was seen in the lowest educational level (≤10 years) and a 0.03 increase (95 % CI -0.02, 0.01; p = 0.21) in the highest educational level (≥18 years). Stratified by sex-specific income quartiles, a 0.14 (95 % CI 0.09, 0.18; p = 3.2 × 10⁻⁹) increase in BMI was found in the lowest income quartile and a 0.05 (95 % CI 0.01, 0.09; p = 0.01) increase in the highest income quartile. A GRS-by-income interaction was observed for BMI (β_GRS×income = -0.

The federal state law for child day care and preschools (FSL-CDC-P) is a statutory activity of the federal state government of MWP. Its goal is to reduce the effects of social inequalities on developmental health in day-care centers (dcc). An annual amount of EUR 5 million is provided for dcc in social hotspots (n = 143 ''DESK-dcc'' [11.03.2016]; 2012: n = 108 DESK-dcc). A mandatory criterion for claiming benefits and grants is an annual administration of the ''Dortmund Developmental Screening for Preschools (DESK 3-6)'', a standardized, objective, valid, and reliable screening measure to detect early developmental risks [1, 2] that is highly accepted by day-care teachers (dct) [3]. In the event of deviations from age-appropriate motor, linguistic, cognitive, or social-emotional developmental health, the amendment stipulates funds to employ additional staff to offer targeted individualized support to children aged 3 to 6 years affected by developmental risks [4, 5, 6]. Another mandatory criterion for claiming the benefits is the participation of the dcc in a scientific evaluation including longitudinal DESK data and data on the amount of additional staff employed (aoase), obtained from paper-and-pencil interviews (ppi) of the administration staff of the DESK-dcc. 
3) for repeating a school year, as was early onset (0-5 vs. 6-9 years of age) of disease (OR 2.3; 95 % CI 1.0; 5.2). Late disease onset (OR 0.5; 95 % CI 0.2; 0.9) was protective. There was no association with sex, parental SES, type of IBD, region of residence or emotional problems. In contrast, disease-specific variables had no influence on the type of school attended. The most important risk factor for not attending grammar school was low parental SES (OR 2.9, 95 % CI 1.9 to 4.7, low vs. middle). The sensitivity analysis confirmed these findings. In addition, children in Bavaria were less likely to attend grammar school. Results for family situation and migration background, or residence in Austria, were inconclusive in all analyses due to low numbers. The impact of disease severity and age of disease onset on age-appropriate schooling clearly indicates an influence of IBD characteristics on school performance, as shown by increased rates of having to repeat a school year. Predictors of the type of school attended confirm previous reports on the prominent role of parental SES in educational chances in Germany. Differences in the schooling systems between states may obscure relevant regional effects. These will, therefore, be examined in more detail in subsequent analyses. The high degree of dissatisfaction with the school situation among parents of chronically ill children underlines the need for a more interdisciplinary approach in health services research, in particular for young people. 
From interoperability to semantic interpretability Freriks G 1 , Huff S 2 1 EN13606 Association and Clinical Information Modeling Initiative, Buitenkaag, Netherlands; 2 Intermountain Healthcare and CIMI, Salt Lake City, United States

2-Panel discussion: From Interoperability to Semantic Interpretability. Organisers: Stan Huff, Intermountain Healthcare, chairman CIMI; Gerard Freriks, EN13606 Association, co-director ERS.

Abstract: Clinical Decision Support (CDS), eHealth, mHealth and pHealth expect that health-related data from disparate sources can be reused freely. In truth, this has not been possible to date. The Clinical Information Modeling Initiative (CIMI), a transcontinental group of experts, aims to create shared information models whose interfaces provide data that can be interpreted in a complete and patient-safe way. CIMI information models give CDS, eHealth, mHealth and pHealth data models that can capture rich data in its full context in a normalised way, thereby allowing safe interpretation. This workshop aims to start a discussion about the process by which present health IT-systems and user communities transform existing IT-systems into real state-of-the-art systems, and the shared international infostructure needed to support them. The workshop will present CIMI's goals, its impact on eHealth exchange of data within and between IT-systems, and will allow for ample discussion of its impact on European developments.

Agenda:
10 min Introduction to CIMI: the organisation, goals, status, potential impacts
10 min Collaborations: SNOMED-CT, LOINC, OMG UML tooling, HL7 FHIR, CEN/ISO communities
25 min Discussions

Topic abstracts 1-CIMI is a volunteer organisation, now formally part of HL7. 
Among its members are: Intermountain Healthcare, Mayo Clinic, Veterans Health Administration, Kaiser Permanente, NHS-UK, IHTSDO, EN13606 Association, openEHR, … The goal is to create clinical information models that users (healthcare providers, organisations, projects) can share. The models are pre-populated with the correct codes from existing reference terminologies, thereby reducing the degrees of freedom in creating information objects, messages, etc. This approach will substantially improve semantic interoperability. 2-CIMI collaborations: CIMI models will be created with tools based on OMG specifications and on ISO 13606, and will use codes from IHTSDO and LOINC. This topic will explain the active collaborations and expected results. Consumers of CIMI information models will be the HL7 FHIR, CEN/ISO 13606 and openEHR communities. This topic will explain how CIMI models will be created and used by users in health IT.

A genome-wide association meta-analysis on apolipoprotein A-IV concentrations Friedel S 1 , Lamina C 1 , Coassin S 1 , Rueedi R 2,3 , Yousri NA 4,5 , Seppälä I 6 , Gieger C 7, 8, 9 , Schönherr S 1

Background Apolipoprotein A-IV (apoA-IV) is a major component of HDL and chylomicron particles and plays an important role in reverse cholesterol transport. It is an early marker of impaired renal function. The aim of this project was to identify genetic loci that are associated with apoA-IV concentrations based on a hypothesis-free genome-wide approach and to investigate a possible relation with already known susceptibility loci for related traits (kidney function, HDL cholesterol and triglycerides). Methods A genome-wide association meta-analysis was conducted with about 10 million SNPs using data from five population-based cohorts (n = 13,813), followed by a replication step in two additional studies (n = 2,267). To account for possible heterogeneity, a fixed-effects or random-effects model was used to associate each SNP with apoA-IV concentrations. 
Additionally, a look-up of the replicated SNPs in downloadable GWA meta-analysis results was performed for kidney function (defined by eGFR), chronic kidney disease, HDL cholesterol and triglycerides. Moreover, weighted SNP-scores were built from already known susceptibility loci for the aforementioned traits and associated with apoA-IV concentrations. All apoA-IV measurements were done by ELISA centrally in one laboratory. Three independent SNPs from two genomic regions (APOA4 and KLKB1) were identified that were significantly associated with apolipoprotein A-IV concentrations: rs1729407 near APOA4 (p = 6.77E-44), rs5104 in APOA4 (p = 1.79E-24) and rs4241819 in KLKB1 (p = 5.63E-14). Analyses based on weighted SNP-scores for kidney function showed a significant inverse association with apoA-IV concentrations (p = 5.5E-05). Furthermore, an increase in triglyceride-increasing alleles was shown to decrease apoA-IV concentrations (p = 0.0078). When looking up the independent and significant SNPs from our results in downloadable GWAS data, only the apoA-IV lead SNP rs1729407 showed an association with HDL cholesterol (p = 7.1E-07). Our investigation identified two independent SNPs located in or next to the APOA4 gene and one SNP in the KLKB1 gene. To the best of our knowledge, this is the first observation of an association of KLKB1 with apoA-IV, which points towards an involvement in renal metabolism and/or an interaction within HDL particles. Analyses of SNP-scores capturing kidney function, HDL cholesterol and triglyceride levels indicate a potential causal effect of primarily kidney function and, to a lesser extent, triglycerides on apoA-IV concentrations. 
A general theory to calculate parameters for imperfect diagnostic tests in the absence of a complete gold standard Fröhlich A 1 , Schauer B 2 1 Friedrich-Loeffler-Institut, Institut für Epidemiologie, Insel Riems, Germany; 2 Universitätsmedizin Greifswald, Institut für Community Medicine, Greifswald, Germany

When imperfect diagnostic tests are used, estimates of the test characteristics (sensitivity and specificity) are needed to calculate the true prevalence, since the apparent prevalence does not reveal to what extent the observed test results reflect the truth. These test characteristics are usually estimated by comparing the imperfect test with a gold standard. However, in many cases a gold standard does not exist. By applying several tests simultaneously to the same samples, the class membership and consequently the performance of each test can be estimated under certain conditions via latent class analysis (LCA), based on the joint probability of observed result patterns. However, among the many publications applying LCA in the field of diagnostic testing, most appear to focus on the technical implementation without considering questions relevant for safe practical application: (1) Is the common mandatory assumption of conditional independence of the diagnostic tests realistic? (2) Can a unique solution be identified (identifiability)? We will present a general theory in the form of a system of equations for multiple populations and tests, derived from real observations. This theory is based on two assumptions: (a) the populations have pairwise distinct disease prevalences, and (b) test sensitivity and specificity do not vary between populations. Based on this theory, exact point estimates can be obtained for sensitivity, specificity, the prevalence of each included population, and the correlations between diagnostic tests. It does not per se require additional information in the form of priors and thus provides exact estimates. 
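The basic relationship between apparent and true prevalence that motivates this work can be illustrated with the classical Rogan-Gladen correction for a single imperfect test with known sensitivity and specificity; a textbook sketch, not the multi-test, multi-population theory proposed by the authors:

```python
def apparent_prevalence(p, se, sp):
    """Expected test-positive fraction given true prevalence and test characteristics."""
    return p * se + (1 - p) * (1 - sp)

def rogan_gladen(ap, se, sp):
    """Invert the relation above to recover true prevalence (Rogan-Gladen estimator)."""
    return (ap + sp - 1) / (se + sp - 1)

# True prevalence 20%, test with 90% sensitivity and 95% specificity
ap = apparent_prevalence(0.20, se=0.90, sp=0.95)
print(round(ap, 3))                            # 0.22 -- what the imperfect test observes
print(round(rogan_gladen(ap, 0.90, 0.95), 3))  # 0.2  -- the truth is recovered
```

When sensitivity and specificity themselves are unknown, they must be estimated jointly, which is exactly the multi-population setting the abstract addresses.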
Therefore, this general theory also allows better insight into the conditions needed for identifiable models, which are crucial for assessing the requirements during the design stage and for judging the practical value of existing results. In conclusion, the general theory provides exact estimates of test characteristics, prevalence and correlation between tests. It can be applied to any area that uses dichotomous measurements to characterize test characteristics, prevalence or correlation between tests. More exact estimates of these parameters will in turn improve the accuracy of other dependent applications such as sample size

Functional capacities are an important indicator of health and participation among older persons. Assessment of functional capacities and limitations in national health surveys of aging populations is therefore important, but time-consuming and costly. We examined the relationship between self-reported daily activity limitation and test-based limitations of physical functioning in relation to morbidity/multimorbidity, in order to characterize potentially vulnerable populations who might be missed without functional capacity testing. The ''German Health Interview and Examination Survey for Adults'' (DEGS1) comprised interviews, examinations and tests. Self-reported daily activity limitation was assessed via the Global Activity Limitation Indicator (GALI), based on a single question with the answer categories ''severely limited/limited but not severely/not limited at all.'' In the present analysis, people reporting severe limitations were considered limited. A modification of the Short Physical Performance Battery (SPPBmod) (Guralnik 1994) was used to measure physical functioning. SPPBmod is based on the Timed Up and Go test, chair-rise test and balance tests. Test-based limitations of physical functioning were defined as an SPPBmod score of 8 or lower. 
Data on 1,707 community-dwelling people aged 65 to 79 years participating in DEGS1 with full records on GALI and SPPBmod were analysed. 21 chronic conditions and seeing and hearing limitations were assessed. Multimorbidity was defined as having 3 or more of these conditions. Agreement and disagreement between GALI and SPPBmod were assessed with descriptive methods, and the kappa statistic was calculated. Participants who showed limitations in SPPBmod but did not consider themselves severely limited were considered as particularly vulnerable. Among persons with limitations in SPPBmod, we identified characteristics of this potentially vulnerable subgroup using multivariable logistic regression analysis with self-reported severe activity limitation (no vs. yes) as the outcome and chronic health conditions, recurrent falls, social support and education as independent variables. [4.8-9.7 %] of men had self-reported severe activity limitation as well as measured functional limitations. A total of 13.8 % [11.2-16.9 %] of women and 7.9 % [5.8-10.7 %] of men had poor physical performance only. In both sexes, multimorbidity was a significant and independent determinant of poor physical performance in the absence of self-reported activity limitation; additional independent determinants included recurrent falls among men and low social support and lower education among women. Discussion and conclusions GALI and SPPBmod measure different aspects of functional impairment. Special focus should be put on persons with test-based limitations but without self-reported limitations. In future surveys of older adults both indicators should be included in order to properly assess the burden of disease connected to chronic conditions.
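The agreement step described above (self-reported GALI limitation vs. test-based SPPBmod limitation, summarized with the kappa statistic) can be illustrated with a minimal Cohen's kappa computation for two binary classifications; the data in the test are hypothetical, not DEGS1 values:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for two binary raters.
    `pairs` is a list of (rating_a, rating_b) tuples with values 0/1."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n       # observed agreement
    p_a1 = sum(a for a, _ in pairs) / n                # rater A positive rate
    p_b1 = sum(b for _, b in pairs) / n                # rater B positive rate
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)   # chance agreement
    if expected == 1.0:
        return 1.0
    return (observed - expected) / (1 - expected)
```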
Sleep quality as mediator between drinking and depressive symptoms Guo L 1 , Zhang WH 2 , Lu C 3 Background Previous research has indicated that drinking is related to poor sleep quality and depressive symptoms, and that poor sleep quality is also associated with depressive symptoms. However, the mechanisms explaining these relationships have yet to be determined. In this study, we aimed to investigate whether sleep quality mediates the relationship between drinking and depressive symptoms. Methods A total of 34,283 school students in grades 7-12 were sampled from schools in Guangdong. We used a self-administered questionnaire to collect data. The Pittsburgh Sleep Quality Index (PSQI) was used to assess the occurrence of sleep disturbance, and the Center for Epidemiologic Studies Depression Scale (CES-D) was used to identify whether individuals had depressive symptoms. A path model was fitted using AMOS 17.0. The mean PSQI global score was 5.6 (±2.8) points, the mean CES-D score was 14.5 (±8.6) points, and 5.9 % of the students had depressive symptoms. The results of the path model suggested that without adjustment for other variables, the standardized indirect effect between drinking and depressive symptoms was 0.032; with adjustment for age, gender, family economic status, and study pressure, the standardized indirect effect was 0.031, which represents 81.6 % of the total effect of drinking on depressive symptoms. These results suggest that sleep quality mediated the association between drinking and depressive symptoms. Prevention efforts should focus on the high-risk population involved in both drinking and poor sleep quality. Weekly pattern for online information seeking on HIV: a multilanguage study Studies have demonstrated that there are weekly patterns of information-seeking activities on sexual health topics in some selected languages.
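The product-of-coefficients logic behind the mediation analysis in the drinking/sleep study above can be sketched in a few lines of unstandardized ordinary least squares (a simplified illustration on synthetic data, not the AMOS path model used by the authors):

```python
def _mean(v):
    return sum(v) / len(v)

def _cov(u, v):
    mu, mv = _mean(u), _mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation sketch:
    a = slope of M on X; b = slope of Y on M adjusting for X;
    returns (indirect effect a*b, total effect of X on Y)."""
    a = _cov(x, m) / _cov(x, x)
    # two-predictor OLS for Y ~ X + M via the normal equations
    det = _cov(x, x) * _cov(m, m) - _cov(x, m) ** 2
    b = (_cov(m, y) * _cov(x, x) - _cov(x, y) * _cov(x, m)) / det
    total = _cov(x, y) / _cov(x, x)
    return a * b, total
```

On noise-free data where Y depends on X only through M, the indirect effect equals the total effect (full mediation).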
However, it is not known if this weekly pattern is found across the ten most commonly-used languages on the Internet, and whether international public events might have an impact on these information-seeking patterns. The objective of this study is to examine sexual health information-seeking patterns for searches performed in several languages, and also to analyze the potential impact of public events on these information-seeking rates. We extracted the number of hits on the HIV article on Wikipedia for the ten most used languages on the Internet for all of the year 2015. The results confirm the existence of a weekly pattern for the searches performed in English, Spanish, Portuguese, Japanese, Russian, French, and German. But the weekly pattern was not found for searches in Mandarin Chinese, Arabic, and Malay. The number of HIV queries increased significantly during two public events, the World AIDS Day, and the announcement regarding the HIV-positive condition of the celebrity actor Charlie Sheen. The existence of higher peaks in searching rates at the beginning of the week for some languages, and the increase in queries related to public events could represent valuable opportunities for public campaigns promoting sexual health. Beyond cohort selection: an analytics-enabled i2b2 Gabetta M 1,2 , Malovini A 3 , Bucalo M 1 , Zini E 2 , Tibollo V 3 , Priori SG 3 , Vettoretti S 4 , Larizza C 5 , Bellazzi R 2 , Barbarini N 1 1 BIOMERIS s.r.l., Pavia, Italy; 2 Center for Health Technologies, Università di Pavia, Pavia, Italy; 3 Fondazione IRCCS S. Maugeri, Pavia, Italy; 4 Fondazione IRCCS Ca' Granda Ospedale Maggiore Policlinico, Milan, Italy; 5 Center for Health Technologies, Università di Pavia, PAVIA, Italy The i2b2 software is a widely adopted solution for secondary use of clinical data for clinical research, specifically designed for cohort identification. i2b2 is still lacking functionalities for data analysis. 
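The weekly-pattern analysis of Wikipedia HIV page hits described above reduces to a day-of-week aggregation of a daily count series. A minimal sketch (assumes at least one full week of data; the counts in the test are invented, not 2015 Wikipedia figures):

```python
from collections import defaultdict
from datetime import date, timedelta

def weekday_profile(daily_hits, start):
    """Mean page views per weekday (Monday..Sunday) for a series of
    daily counts beginning on `start` (a datetime.date)."""
    totals, counts = defaultdict(int), defaultdict(int)
    for offset, hits in enumerate(daily_hits):
        weekday = (start + timedelta(days=offset)).weekday()  # 0 = Monday
        totals[weekday] += hits
        counts[weekday] += 1
    return [totals[d] / counts[d] for d in range(7)]
```

A pronounced Monday peak in the returned profile would correspond to the beginning-of-week search peaks reported for several languages.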
The aim of this work is to extend the i2b2 framework, enabling clinical researchers to perform statistical analyses and thus accelerating the process of hypothesis testing. To this aim we have developed a flexible extension of i2b2 able to exploit different statistical engines. We have implemented first applications for basic statistics and survival analyses that exploit this extension and are accessible through suitable user interfaces designed with special consideration for usability. Achieving baseline balance in cluster-randomized trials with a user-friendly covariate-constrained randomization tool Gabrysch S 1 , Lorenz E 1 1 Institute of Public Health, University of Heidelberg, Heidelberg, Germany Cluster-randomized trials are increasingly used to evaluate complex interventions. Randomization ensures that arms are comparable in terms of the distribution of known and, more importantly, unknown factors that may influence the outcome; however, this only holds if the sample size is large. The number of clusters in cluster-randomized trials is often limited, and therefore one cannot rely on chance alone to ensure balance of important covariates and of sample size between arms. Covariate-constrained randomization is often the method of choice when baseline data are available, but has rarely been used due to the need for statistical support and specialized computer software to implement it. We provide a brief overview of methods to balance baseline covariates in cluster-randomized trials, explain the principles of covariate-constrained randomization and describe existing software tools in SAS and R. We describe our new ccrand procedure in Stata and illustrate the command using an example dataset. The new command, ccrand, implements a covariate-constrained randomization procedure for cluster-randomized trials in Stata. It can be used to ensure balance of one or more baseline covariates between trial arms by restriction to those allocations that meet specified balance criteria.
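The restriction step at the heart of covariate-constrained randomization can be sketched as follows: enumerate all candidate allocations, keep only those meeting a balance criterion, and later draw one at random. This is a generic illustration in Python (not the ccrand implementation, which supports stratification and multiple covariates):

```python
from itertools import combinations

def constrained_allocations(clusters, max_diff):
    """Enumerate all equal-sized two-arm allocations whose arm means of a
    baseline covariate differ by at most `max_diff`.
    `clusters` maps cluster id -> covariate value; an even number of
    clusters is assumed.  Mirror allocations (arms swapped) are listed
    separately."""
    ids = sorted(clusters)
    k = len(ids) // 2
    valid = []
    for arm_a in combinations(ids, k):
        arm_b = tuple(c for c in ids if c not in arm_a)
        mean_a = sum(clusters[c] for c in arm_a) / k
        mean_b = sum(clusters[c] for c in arm_b) / k
        if abs(mean_a - mean_b) <= max_diff:
            valid.append((arm_a, arm_b))
    return valid
```

The trial's actual allocation would then be drawn at random from the constrained set (e.g. with `random.choice`); full enumeration is only feasible for a moderate number of clusters, which is exactly the setting in which constrained randomization is needed.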
The command allows the combination of covariate-constrained randomization with stratification, supporting a large number of strata and a moderate number of clusters within strata; it puts no restriction on the number of covariates to be balanced and it includes a final validity check. The availability of a user-friendly Stata command could make covariate-constrained randomization more accessible to non-statisticians and help to improve baseline balance in cluster-randomized trials. A virtual patient representation in the medical rehabilitation domain Gal N 1 , Andrei D 2 , Poenaru DV 3 , Stoicu-Tivadar V 4 , Gal-Nadasan EG 2 The virtual patient is represented as a matchstick skeleton and is used to analyze and track the recovery of the orthopedic patient with posture problems. The creation of the digital patient was realized using a markerless depth camera, the Microsoft Kinect. The gathered data was saved into a BVH type motion capture file. This file records not only the skeletal structure of the patient but also the patient's movements, from which the adduction, rotation and flexion angles of the joints can be analyzed. The data is stored in structured text format, making it suitable for use in telemedicine. The results confirm the utility and usability of the digital patient in clinical reasoning and in educational applications. Adverse drug events in Austrian hospital discharge diagnoses Gall W 1 , Endel G 2 , Grossmann W 3 , Jankowitsch-Zimbova S 1 , Ratajczak P 1 , Sheikh Rezaei S 1 , Rinner C 1 , Wolzt M 1 Estimates suggest that approximately 5 % of hospital admissions are associated with Adverse Drug Events (ADEs) [1]. We developed the web application ''JADE regio'' for physicians to directly explore ADEs in Austrian hospital discharge diagnoses between the years 2001 and 2011. We used a research database that consists of pseudonymized health claims data from hospital stays of all Austrian citizens in the years 2001 and 2011.
To identify ADE-relevant diagnoses, the 505 ADE diagnoses (segmented into the seven categories A1, A2, B1, B2, C, D, E) defined for Germany by Stausberg [2] were adapted to fit the documentation habits in Austria (resulting in 458 ICD-10 diagnoses). We incorporated 16 comorbidities, calculated odds ratios and implemented geographic visualisations. The tool was developed with Shiny, a web application framework for R, and PostgreSQL. The implemented tool provides a numerical and graphical overview of hospital stays with ADE-associated diagnoses. The ADE diagnoses can be analysed in conjunction with comorbidities, demographic parameters and spatial information of the patients and the hospitals to focus on specific cohorts. Overall analyses show, for example, that women have a higher risk of ADE diagnoses within the ADE categories A1, A2, B1 and B2. The proportion of ADE diagnoses increased from 4.5 % in 2001 to 5.5 % in 2011. The tool has been evaluated and used by physicians to generate hypotheses. Despite several limitations, secondary use of health claims data can complement clinical studies in exploring ADEs with population-based analyses. Calretinin as a blood-based biomarker for mesothelioma in men Gawrych K 1 , Johnen G 1 , Raiko I 1 , Pesch B 1 , Weber DG 1 , Taeger D 1 , Lehnert M 1 , Kollmeier J 2 , Bauer T 2 , Musk AW 3,4,5 , Robinson BW 3,5 , Brüning T 1 , Creaney J 3,5 Malignant mesothelioma (MM) is an aggressive tumor of the serous membranes, associated with previous exposure to asbestos. Its diagnosis and treatment are challenging. Mesothelin is currently the best available blood-based marker for MM. However, no single tumor marker has sufficient sensitivity for an early detection of MM. We have developed an assay to determine calretinin in serum and plasma as another promising tumor marker (Raiko et al. 2010, BMC Cancer).
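The odds-ratio comparisons used in the ADE exploration tool above (e.g. women vs. men within an ADE category) can be illustrated with a small 2x2-table helper; the counts in the test are hypothetical, not Austrian claims data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio with a 95 % Wald confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - 1.96 * se_log)
    upper = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lower, upper
```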
The objective of this study is to verify the performance of calretinin in discriminating MM in a larger study population of men with former exposure to asbestos and to compare its performance with that of the established marker mesothelin. In this case-control comparison in men with former exposure to asbestos, blood samples were collected in Australia from 80 MM cases and 75 controls and in Germany from 36 MM cases and 72 controls. The cases did not include the sarcomatoid subtype. Controls were frequency matched to cases by age in 5-year groups. Both case groups were similar regarding subtypes of MM and age at blood drawing. Calretinin and mesothelin were determined by enzyme-linked immunosorbent assays (ELISA) in plasma samples collected prior to therapy. Biomarker classification performance was determined by nonparametric and parametric estimation of the ROC curve within the pooled and single study groups. The Chi square test was performed to compare the areas under the curve (AUCs). Calretinin concentrations did not differ by country of origin and were significantly higher in MM patients than in controls. The AUCs of the nonparametric ROC curves of calretinin and mesothelin were 0.90 (95 % CI 0.85-0.95) and 0.91 (95 % CI 0.87-0.96), respectively, in the Australian group, and 0.83 (95 % CI 0.74-0.92) and 0.84 (95 % CI 0.76-0.93), respectively, in the German group. When the pooled data from both countries were considered, the accuracy of prediction for MM was robust, with an AUC of 0.86 (95 % CI 0.82-0.91) for calretinin and 0.89 (95 % CI 0.85-0.93) for mesothelin. AUCs of the parametric ROC curves were slightly higher. Calretinin detects MM with similarly good performance as mesothelin. We are currently conducting a prospective study in asbestos-exposed subjects aimed at validating the performance of calretinin, mesothelin, and other tumor markers to detect MM prior to the occurrence of symptoms.
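The nonparametric AUC estimate used to compare calretinin and mesothelin is equivalent to the Mann-Whitney statistic: the probability that a randomly chosen case has a higher marker value than a randomly chosen control. A minimal sketch on hypothetical marker values:

```python
def auc(case_scores, control_scores):
    """Nonparametric AUC (Mann-Whitney estimate): the fraction of
    case-control pairs in which the case scores higher; ties count 0.5."""
    wins = 0.0
    for s_case in case_scores:
        for s_ctrl in control_scores:
            if s_case > s_ctrl:
                wins += 1.0
            elif s_case == s_ctrl:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))
```

An AUC of 0.5 corresponds to a non-informative marker, 1.0 to perfect separation of cases and controls.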
The effect of exposure to ultraviolet radiation in infancy on melanoma risk Gefeller O 1 , Fiessler C 1 , Radespiel-Tröger M 2 , Uter W 1 , Pfahlberg AB 1 1 University of Erlangen -IMBE, Erlangen, Germany; 2 Bavarian Health and Food Safety Administration, Nuremberg, Germany Evidence on the effect of ultraviolet radiation (UVR) exposure in infancy on melanoma risk in later life is scarce. Three recent studies suffering from methodological shortcomings suggest that people born in spring carry a higher melanoma risk. Our study aimed at verifying whether such a seasonal pattern of melanoma risk actually exists. Methods Data from the Bavarian population-based cancer registry on the birth months of 28,374 incident melanoma cases between 2002 and 2012 were analysed and compared with data from the Bavarian State Office for Statistics and Data Processing on the birth month distribution in the Bavarian population. Crude and adjusted analyses using negative binomial regression models stratified by sex were performed and supplemented by a variety of subgroup analyses. In the crude analysis, the birth months March-May were overrepresented among male and female melanoma cases. Sex-specific negative binomial regression models adjusted only for birth year revealed a significant association between melanoma risk and birth month, with on average 16 % higher relative incidence rates for March, April and May compared to the reference month December. However, after additionally adjusting for the birth month distribution of the Bavarian population, these risk estimates decreased markedly: in both sex-specific analyses, no single month differed significantly from the low-risk reference month December, and no seasonal pattern or significant overall association between melanoma risk and birth month was apparent. Similar results emerged in all subgroup analyses.
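The key adjustment above, accounting for the population's own birth-month distribution, amounts to comparing observed case counts per month with the counts expected if cases simply followed that distribution. A simplified observed/expected sketch with invented counts (not a substitute for the study's negative binomial models):

```python
def birth_month_ratios(case_counts, population_counts):
    """Observed/expected case ratio per birth month, where the expected
    count follows the population's birth-month distribution.
    A ratio of 1.0 means no excess for that month."""
    total_cases = sum(case_counts.values())
    total_pop = sum(population_counts.values())
    return {
        month: case_counts[month]
               / (total_cases * population_counts[month] / total_pop)
        for month in case_counts
    }
```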
Our large registry-based study provides no evidence that people born in spring carry a higher risk of developing melanoma in later life and thus lends no support to the hypothesis of higher UVR susceptibility during the first months of life. Patient self-management is a vital factor for successful therapy, especially for chronic diseases such as diabetes. Self-management includes timely intake of prescribed medication, i.e., it requires a high degree of medication adherence. The concept of structured self-management and continuous care (''managed care loops'') can be supported by eHealth services in order to achieve a high degree of individualized therapy. This paper presents an approach for a medication reminder and adherence assessment system, exemplified by an app for the Apple Watch (Mediwatch app): A physician can generate and edit a medication plan via a web UI. Then the patient's smartwatch fetches the current plan via web services and reminds the patient to take the prescribed drug. Information about the patient's decision (confirmation, postponement, cancellation) is then delivered back to the physician, who can provide timely feedback or adjust the medication plan. Preliminary analysis of difficulty of importing pattern-based concepts into the National Cancer Institute Thesaurus Geller J 1 , He Z 2 1 New Jersey Institute of Technology, Newark, United States; 2 Florida State University, Tallahassee, United States Maintenance of biomedical ontologies is difficult. We have developed a pattern-based method for dealing with the problem of identifying missing concepts in the National Cancer Institute thesaurus (NCIt). Specifically, we are mining patterns connecting NCIt concepts with concepts in other ontologies to identify candidate missing concepts. However, the final decision about a concept insertion is always up to a human ontology curator.
In this paper, we estimate the difficulty of this task for a domain expert by counting possible choices for a pattern-based insertion. We conclude that even with the support of our mining algorithm, the insertion task is challenging. Parental and child sleep domains at two and three years of age: longitudinal data from the Ulm SPATZ Health Study cohort Genuneit J 1 , Braig S 1 , Urschitz M 2 , Rothenbacher D 1 1 Institute of Epidemiology and Medical Biometry, Ulm University, Ulm, Germany; 2 Universitätsmedizin der Johannes Gutenberg-Universität Mainz, Mainz, Germany Background Several sleep domains have been associated with various health outcomes in children and adults, including mental disorders. Additionally, infant sleep characteristics and parental sleepiness are major sources of parental concern and psychosocial stress. However, longitudinal studies investigating the impact of preschool child sleep domains on parental sleep domains in a family context in the general population are scarce. The aim of this study is to investigate parental and child sleep domains in early childhood in the context of a birth cohort study. In the Ulm SPATZ Health Study, 934 singleton newborns and their mothers were recruited from the general population following delivery in the University Medical Center Ulm, Southern Germany, between 04/2012 and 05/2013. A total of 571 fathers also opted into the study. Parental and child sleep were assessed in 343 and 260 triads during the 2-year and the ongoing 3-year follow-up, respectively, using the Pittsburgh Sleep Quality Index (PSQI) and the Children's Sleep Habits Questionnaire (CSHQ) in separate self-administered questionnaires. Poor sleep quality (PSQI > 5) was detected for 36 % of the mothers and 24 % of the fathers, with both parents affected in 10 % of the families at 2 years. The crude relative risk (95 % confidence interval) of persistence vs.
remission of poor sleep quality at 3 years was 2.55 (95 % CI 1.82; 3.56) and 5.50 (95 % CI 3.16; 9.55) among mothers and fathers, respectively. The correlation of total child CSHQ scores between both timepoints was Spearman's r = 0.64 (p < 0.001). Parasomnias were common, affecting 63 and 68 % of the children at 2 and 3 years, respectively, with 50 % of the children being affected at both follow-ups. The child being restless and moving a lot during sleep was the most common contribution to the CSHQ parasomnia subscale; sleepwalking was virtually never reported. More severe parasomnias (subscale score ≥ 9) were associated with about threefold and twofold risks of very poor or poor maternal sleep at 2 and 3 years, respectively. The effect on paternal sleep quality was less pronounced, not statistically significant at 2 years and absent at 3 years. Overall, similar results were obtained using all available data instead of fully assessed triads. Poor sleep quality is prevalent and persists among parents even when their children have developed more stable sleeping patterns in early childhood. Child parasomnias seem to have a strong influence on poor maternal sleep quality, although confounding of this association by social or psychosocial determinants is possible. Further adjusted analyses including the ongoing 3-year follow-up data are warranted with regard to several domains of sleep, including duration, latency, efficiency, and daytime dysfunction. A practical method for data handling in multi-method usability research studies Georgsson M, Staggers N Background Analyses of large, complex data sets are common in health informatics and usability research. Researchers need feasible ways of streamlining data handling and analyses. We offer a useful approach across qualitative and digitalized data. Illustrated by a usability evaluation study on an mHealth system, we present methods for managing a large set of usability data using qualitative data analysis software (QDAS).
Three different data collection methods were used (usability testing, in-depth interviews, and open-ended questionnaire responses). The process began at initial transcription, and all data were imported into the system. Content analysis was used throughout, from problem identification to assigning problem classifications and severity ratings to linking problems with system views. This approach was practical and useful as it allowed the capture and synthesis of a large number of multifaceted usability problems. We recommend this approach to other researchers performing usability evaluations on large data sets. Mobile health apps play an important role in healthcare processes and health promotion. In recent years many Persian mHealth apps were developed and are available in various national app markets. Cafebazaar is the largest Persian app store and contains more than 3500 Android apps in the medical and health & fitness categories (about 1600 apps in medicine and more than 1900 apps in health & fitness). In this study, some characteristics of the 200 top Persian medical apps of Cafebazaar were investigated, and the apps were then categorized by their use cases based on Yasini and Marchand's use-case classification model. Results showed that 20 (10 %) of the studied apps had the ability to collect information, and only 6 % of the apps declared the involvement of at least one health professional in the conception or development of the app. In 35 % of the studied apps, no contact information was provided for the users; references were offered in 160 (80 %) of the apps, but only 21 (10.5 %) applied reliable sources for their content. Thirteen distinct use cases were found in all 200 apps, of which two were new to the already published use-case model. This study shows that Persian mHealth apps, like other existing apps in the world, have a long way to go to reach some basic standards. The lack of regulatory agencies and the absence of a dynamic evaluation system for mHealth apps might be the main reasons for these defects. This study also shows that 20 use cases existing in international health-related apps are not yet used in Persian apps, and therefore there is a rich potential for creating new apps in the mHealth field. Finally, the use-case-based classification model proved usable for Persian mHealth apps and may facilitate users' access to these apps. A SOA-based solution to monitor vaccination coverage among HIV-infected patients in Liguria Giannini B 1 , Gazzarata R 1,2 , Sticchi L 3 , Giacomini M 1,2 1 Department of Informatics, Bioengineering, Robotics and System Engineering - University of Genova, Genova, Italy; 2 Healthropy s.r.l., Savona, Italy; 3 Department of Health Sciences, University of Genova, Genova, Italy Vaccination in HIV-infected patients constitutes an essential tool in the prevention of the most common infectious diseases.
The Ligurian Vaccination in HIV Program is a proposed vaccination schedule specifically dedicated to this risk group. Selective strategies are proposed within this program, employing ICT (Information and Communication Technology) tools to identify this susceptible target group, to monitor immunization coverage over time and to manage failures and defaulting. The proposal is to connect an immunization registry system to an existing regional platform that allows clinical data re-use among several medical structures, in order to completely manage the vaccination process. This architecture will adopt a Service Oriented Architecture (SOA) approach and standard HSSP (Health Services Specification Program) interfaces to support interoperability. According to the presented solution, vaccination administration information retrieved from the immunization registry will be structured according to the specifications within the immunization section of the HL7 (Health Level 7) CCD (Continuity of Care Document) document. Immunization coverage will be evaluated through the continuous monitoring of serology and antibody titers gathered from the hospital LIS (Laboratory Information System), structured into an HL7 Version 3 (v3) Clinical Document Architecture Release 2 (CDA R2) document. The use of tools, modelling methods, data types, and endpoints in systems medicine: a survey on projects Systems medicine is the consequent continuation of research efforts on the road to an individualized medicine. Thereby, systems medicine tries to offer a holistic view of the patient by combining different data sources to highlight different perspectives on the patient's health. Our research question was to identify the main data types, modelling methods, analysis tools, and endpoints currently used and studied in systems medicine. Therefore, we conducted a survey on projects with a systems medicine background. Fifty participants completed this survey.
The results of the survey were analyzed using histograms and cross tables, and finally compared to the results of a former literature review with the same research focus. The data types reported in this survey were widely diversified. As expected, genomic and phenotype data were used most frequently. In contrast, environmental and behavioral data were rarely used in the projects. Overall, the cross tables of the data types in the survey and the literature review showed overlapping results. Co-design of a computer-assisted medical decision support system to manage antibiotic prescription in an ICU ward The need to achieve high levels of semantic interoperability in the health domain is regarded as a crucial issue. Nowadays, one of the weaknesses when working in this direction is the lack of a coordinated use of information and terminological models to define the meaning and content of clinical data. IHTSDO is aware of this problem and has recently developed the SNOMED CT Expression Constraint Language to specify subsets of concepts. In this paper, we describe an implementation of an execution engine for this language. Our final objective is to allow advanced terminological binding between archetypes and SNOMED CT as a fundamental pillar for building semantically interoperable systems. The execution engine is available at http://snquery.veratech.es. The use of gamification for behavioral change has been gaining traction for some time and is now a popular strategy in both commercial and academic fields [1-7]. There are many definitions for the term; one often used, provided by Deterding et al. [1], states that ''gamification is the use of game design elements in non-game contexts''. This definition is not without its faults, as it is very broad, but other proposed definitions fare no better. Gamification key opinion leaders openly discuss the problems of defining and characterizing gamification [8-11].
The use of this tactic in health and fitness mobile apps has increased despite little to no in-depth inquiry into its effectiveness and appropriate functionality [3, 4]. Not having a clear definition makes it even more difficult to detect whether gamification is being used or not. Current methodologies for detection require operators to manually review and determine whether gamification has been used [12]. These approaches might work well for relatively small sample sizes but become cumbersome with larger samples. This study elaborates on the first steps of the construction and validation of a proposed method for automatic detection of gamification elements in apps for health. Methods A panel of gamification experts constructed an initial list of terms commonly used when describing gamification elements. The initial list was revised and refined to remove terms that were too broad or too specific (e.g. ''explore'' or ''on-boarding''). A random sample of 50 health app descriptions was produced from the Medical category of the iTunes App Store. These descriptions were given to a gamification expert, who detected the presence of gamification elements; this served as the gold standard. The descriptions were automatically searched for matches with the initial and final lists of gamification keywords. Any match classified the description as belonging to a gamified app. A Fleiss kappa analysis was done to check reliability with both lists. The final list contained 54 terms (78 in the initial list). The Fleiss kappa before refining the list was 0.29 with SD 0.10 (CI 0.090-0.497); after refining, kappa was 0.50 with SD 0.09 (CI 0.312-0.695). While the Fleiss kappa was not initially ideal, it improved on the next iteration. There are some limitations to this method. Our gamification keywords might not be as representative as they could be.
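The automatic detection step above (flag a description as gamified on any keyword match) can be sketched as a whole-word search over the description text; the keyword list here is a hypothetical fragment, not the experts' 54-term list:

```python
import re

def flags_gamification(description, keywords):
    """Flag an app description as gamified if any keyword occurs as a
    whole word (a single match suffices, mirroring the rule above).
    Exact matching only: plural or inflected forms are missed, one of
    the false-negative sources discussed in the limitations."""
    text = description.lower()
    return any(re.search(r"\b" + re.escape(kw.lower()) + r"\b", text)
               for kw in keywords)
```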
There might be terms, expressions or concepts that cannot be easily identified or that developers exclude from their descriptions (false negatives). The method also determines an app to be ''gamified'' on just a single match, so ''gamificationness'' might be overestimated (false positives). It is, however, not without its merits, as it makes for interesting first steps towards an effective method for fast detection in large samples. Background Patient engagement and self-efficacy are increasingly regarded as vital to improved health outcomes in many contexts, including discharge after surgery, chronic conditions and others. Mobile health can assist care delivery in many of these. To establish the breadth of new ideas and directions in the field from a review of the literature as well as use case experiences by a panel that has done work in the field and demonstrated success. Results mHealth, i.e. the use of mobile phones and tablets, as well as eHealth, including tele-monitoring and participation in social networks, can help support patient engagement and self-efficacy and deliver anytime/anywhere clinical practice guidelines and patient education. Perceived stress during pregnancy, associated socioeconomic and psychometric characteristics and possible preventive targets: first results from the PRINCE study Goletzke J 1 , Pagenkemper M 1 , Hansen G 1 , Becher H 1 , Diemert A 1 , Arck P 1 1 University Medical Center Hamburg-Eppendorf, Hamburg, Germany Prenatal stress is associated with adverse pregnancy outcomes for both the mother and the newborn child. From a preventive point of view, it is hence important to identify women with increased stress levels. To date, it is unknown whether and how stress levels change during pregnancy and which characteristics are associated with higher levels of prenatal stress.
Our objective was to examine perceived stress over the course of pregnancy and associated socioeconomic and psychometric factors in a cohort of healthy pregnant women from Hamburg, Germany. Methods Data from 335 women participating in the pilot study of the PRINCE (Prenatal Identification of Children's Health) cohort, a population-based prospective cohort study, were used for the present analyses. Once per trimester the women completed the Perceived Stress Scale (PSS), the Edinburgh Postnatal Depression Scale (EPDS) and the SF-12 regarding physical and psychological aspects of quality of life. Furthermore, data regarding socioeconomic factors, social support (Berlin Social Support Scale) as well as coping strategies (brief COPE inventory) were assessed once during pregnancy. For the analyses, the study population was allocated into quartiles yielding groups of low (1st quartile; n = 84), medium (2nd and 3rd quartile; n = 167) and high (4th quartile; n = 83) mean perceived stress, and the three groups were compared with regard to socioeconomic and psychometric characteristics using ANOVA for normally distributed continuous variables, the Kruskal-Wallis test for non-normally distributed continuous variables and the Chi square test for categorical variables. Mean (SD) scores on the PSS in the first, second and third trimester were 20.9 (7.1), 20.9 (7.1) and 19.5 (7.4), respectively, with a significant decrease of perceived stress between the second and third trimester (p = 0.005). Comparing the three groups of perceived stress, there were no significant differences regarding the examined socioeconomic factors. However, women reporting higher perceived stress during pregnancy were more likely to have higher EPDS scores (p < 0.0001) and a lower quality of life, regarding both the physical (p = 0.0002) and mental (p < 0.0001) health composite score. 
Also, both the perceived available and the actually received social support were significantly lower in the higher stress group (p < 0.0001 and p = 0.002, respectively). Regarding coping strategies, women with lower scores of perceived stress were more likely to focus on the positive (p = 0.001) and less likely to follow an evasive coping strategy (p < 0.0001). Our data indicate that the PSS can serve as a tool for identifying women with increased prenatal stress, and that an early application in the first and second trimester of pregnancy, when stress levels were comparatively higher, might be particularly relevant. While not confined to certain socioeconomic characteristics, higher perceived stress during pregnancy was linked to different adverse psychometric aspects in our study. Furthermore, higher social support and effective coping strategies were associated with lower levels of perceived prenatal stress. Identification of misreporting of energy intake from infancy to school age: results from a longitudinal European study One of the major challenges in obtaining accurate dietary data is the issue of misreporting. Under- and over-reporting lead to reduced validity of dietary recall methods and distorted analysis of relationships between nutrient intake and health (1, 2). Identification of misreporting is crucial in paediatric populations. However, the methods applied to do so are based on the adult population (3, 4), which may lead to misclassification. We identified misreporting of energy intake in young children enrolled in a longitudinal study by using a straightforward approach based on estimated energy requirements. Children (n = 1,199) enrolled in the Childhood Obesity Programme based in 5 European countries (Belgium, Germany, Italy, Poland and Spain) with repeated measurements of food intake and anthropometric indices at ages 1, 3, 6, 12, 24, 36, 48, 60, 72 and 96 months (total observations = 6,137). 
Cut-offs for the ratio of reported energy intake (EI) to estimated energy requirement (EER) were calculated using different methods based on Butte (5) and Torun (6) to identify misreporters. Misreporters were studied according to age, gender, BMI z-scores and country. Employing individual cut-offs for each child, a higher proportion of over-reporters (19.1 %) than under-reporters (10.6 %) was identified. The highest proportions of under-reporting (28.1 %) and over-reporting (34.1 %) were found at ages 96 months and 12 months, respectively. Energy intakes of obese children (3.9 %) were more likely to be under-reported (43.4 %). The proportion of under-reporting in boys (12.0 %) was slightly higher than in girls (9.2 %). Spain had the highest proportion of misreported records (36.3 %), followed by Poland (30.5 %), Belgium (27.8 %), Italy (26.2 %) and Germany (24.6 %). The fixed cut-offs were derived from the mean individual upper and lower cut-off values, which are 0.80 and 1.20 for children aged ≤12 months and 0.75 and 1.25 for children >12 months, respectively. Based on these values, we identified 10.4 % under-reporters and 17.4 % over-reporters. Agreement between both methods was good (Cohen's kappa κ = 0.79, observed agreement = 90.4 %, n = 5,546 observations). Since the fixed cut-offs were rounded off for easy recall and application, some degree of disagreement between the two methods is expected. Misreporting is a serious bias in nutritional studies on children. It can easily be identified using fixed cut-offs (children ≤ 12 months: ± 20 %; children > 12 months: ± 25 %) for the agreement between energy intake and estimated energy requirement in studies with less than 3-day dietary recall. Particularly in infants, comparison of energy intake should be made against the energy requirement, which additionally includes the energy needed for growth and development. 
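The fixed-cut-off classification described in the misreporting abstract above can be sketched as follows. This is a minimal illustration; the function name and interface are our own, not the study's actual implementation.

```python
def classify_report(ei, eer, age_months):
    """Classify a dietary record by the fixed EI/EER cut-offs described above:
    0.80-1.20 (i.e. +/-20 %) for children aged <=12 months and 0.75-1.25
    (+/-25 %) for older children. EI = reported energy intake,
    EER = estimated energy requirement (same units, e.g. kcal/day)."""
    lo, hi = (0.80, 1.20) if age_months <= 12 else (0.75, 1.25)
    ratio = ei / eer
    if ratio < lo:
        return "under-reporter"
    if ratio > hi:
        return "over-reporter"
    return "plausible"

print(classify_report(700, 1000, 24))   # under-reporter (ratio 0.70 < 0.75)
print(classify_report(900, 1000, 6))    # plausible (0.80 <= 0.90 <= 1.20)
print(classify_report(1300, 1000, 6))   # over-reporter (1.30 > 1.20)
```

The wider band for older children and the narrower band for infants reflect the rounded cut-off values reported above.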
Obese nondiabetic pregnancies and high maternal glycated hemoglobin at delivery as an indicator of offspring and maternal postpartum risks: the prospective PEACHES Mother-Child Cohort Gomes D 1,2 , Brandlhuber L 3,4 , Burgmann M 5 , Riedel Sobotzki C 6 , Zwafink C 3 , Anzill S 3 , Holdt L 7 , Teupser D 7 Obese women develop metabolic abnormalities during pregnancy, such as glucose intolerance, which increase later-life risks in mothers and offspring. There are no specific recommendations for gestational diabetes (GDM) diagnosis in obese pregnant women, who are at high risk for glycometabolic disturbances. Although diagnosed as nondiabetic during pregnancy using the new stringent criteria of the International Association of Diabetes and Pregnancy Study Groups (IADPSG), a high frequency of elevated glycated hemoglobin (HbA1c) at delivery was found in obese mothers (1). We defined a cut-off value for maternal HbA1c at delivery and investigated the influence of high HbA1c on offspring outcomes at birth. Effects on long-term postpartum measures of maternal glucose metabolism and inflammation at >2 years after delivery were also assessed (2). Pregnant women were enrolled into the Programming of Enhanced Adiposity Risk in Childhood-Early Screening (PEACHES) study, a prospective mother-child cohort in Germany initiated by our group in 2010. The analyzed cohort groups comprised normal-weight (n = 155) and obese mothers (n = 307) who were diagnosed as GDM-negative based on the IADPSG criteria. Women were followed up to 2.9 years (median) postpartum. The main exposure variable was increased maternal HbA1c at delivery. Offspring outcome variables included perinatal parameters such as large-for-gestational-age (LGA) birth weight, absolute birth weight and cord-blood serum C-peptide concentrations. Maternal outcome variables included dysglycemia and inflammation biomarkers such as fasting glucose and high-sensitivity C-reactive protein (hsCRP). 
Univariate and multiple regressions were used to assess the impact of high maternal HbA1c at delivery on the outcome variables. The upper-range cut-off value for maternal HbA1c at delivery was derived as the 90th percentile of levels from normal-weight GDM-negative mothers using a cumulative distribution curve. Among obese women, the frequency of high HbA1c levels at delivery (upper-range cut-off value: 5.7 %) was approximately 3-fold higher than in healthy normal-weight women. Children of GDM-negative obese women with HbA1c at delivery ≥5.7 vs. <5.7 % were at a higher risk of being LGA (adjusted odds ratio: 3.48, 95 % CI 1.58-7.90) and had 0.09 ng/mL higher mean cord-blood serum C-peptide concentrations (95 % CI 0.01-0.17 ng/mL). No statistical differences were found in clinical neonatal outcomes between the two groups. At the postpartum visit, obese mothers with HbA1c at delivery ≥5.7 vs. <5.7 % (n = 42) had a 0.3 % higher mean HbA1c (95 % CI 0.1-0.5 %), 6.0 mg/dL higher mean fasting glucose (95 % CI 2.4-9.5 mg/dL) and 6.8 mg/L higher hsCRP (95 % CI 1.4-12.3 mg/L) concentrations. Increased HbA1c in obese GDM-negative women at delivery indicates gestational dysglycemia, which appears to have an impact on offspring and long-term maternal health. It could be used as an indicator to identify women at greater risk, thereby allowing the provision of preventive strategies and closer health checkups in the early postpartum phase. We thank all participants, physicians, and midwives involved in the PEACHES study for their enthusiastic support. Benzodiazepine and z-substance use and risk modification of dementia-a claims data analysis Benzodiazepines and related z-substances (BDZRs) are widely prescribed to treat anxiety and sleep disorders, particularly in the elderly. Conflicting results have been obtained when the association between BDZR use and dementia risk has been analyzed. 
Previous studies on claims or cohort data often describe an increased risk of dementia with the use of BDZRs. For Germany, longitudinal analyses on large population-based data sets are missing. Our objective was to evaluate the association between BDZR prescription and incident dementia of any type. We used longitudinal German public health insurance data from 2004 to 2011 and analyzed the association between BDZR use and incident dementia in a case-control design. We examined patient samples aged 60 years and older who were free of dementia at baseline. A lag time between BDZR prescription and dementia diagnosis was introduced to address potential protopathic bias. To assess dose dependency we used a dose-time index. Odds ratios were calculated applying conditional logistic regression, adjusted for potential confounding factors such as comorbidities (e.g. stroke, depression, ischemic heart disease, diabetes, epilepsy, anxiety, insomnia, schizophrenia, and hypertension) and polypharmacy. The use of BDZRs was associated with a significantly increased risk of incident dementia in patients aged 60 years and older (odds ratio 1.21, 95 % confidence interval 1.13-1.29). For long-acting substances the association was slightly more pronounced than for short-acting ones. We observed a trend towards an increased risk of incident dementia with higher BDZR exposure. The restricted use of BDZRs in the elderly may contribute to dementia prevention. Universität Duisburg-Essen, Berlin, Germany Conducting systematic literature reviews is a time-consuming and costly task. In order to reach high levels of evidence, researchers need to be able to search medical databases for relevant results while minimising the number of irrelevant records that need to be excluded manually. To enhance the effectiveness of systematic reviews researchers can apply search filters, i.e. tested and preferably validated search strategies. 
Depending on the type of filter, they focus on either methodological or subject-specific aspects of publications. To appraise the quality of filters, they are treated as diagnostic tests and assessed by performance measures such as sensitivity, specificity, and precision. A search filter is always developed for a specific interface of a database. Therefore, a filter is not easily transferable between interfaces or even databases. Translating a filter for use in a different database requires thorough knowledge of the underlying indexing rules of the databases involved. The aim of the project is to develop and validate a subject-specific search filter that identifies publications relevant to the topic of eHealth in the MEDLINE database. Moreover, the research field of search filters will be assessed systematically by examining filter development publications with regard to factors such as type of filter developed, performance measures used, databases addressed, and methods of search term identification applied. The project will be divided into three parts: (A) Scoping review of search filter development studies on the basis of a systematic database search. (B) Based on the results of the scoping review, a questionnaire will be developed and sent to experts in search filter research. The results of part (A) and part (B) will serve as quantitative and qualitative input for examining the research domain of search filters. (C) Systematic construction of a gold standard for publications on eHealth. The gold standard will be split into a development and a validation set. Word frequency analyses will be deployed in the development set in order to identify terms that will be used to build the filter. Subsequently, the filter will be internally validated in the validation set. Results of the scoping review will be discussed with regard to opinions and perceptions of the experts addressed. Furthermore, divergent aspects of development methodology will be depicted. 
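Treating a search filter as a diagnostic test, as described in the filter project above, the standard performance measures follow from a 2×2 comparison of filter output against a gold standard. The sketch below is illustrative only; the function name and the counts are our own.

```python
def filter_performance(tp, fp, fn, tn):
    """Performance of a search filter treated as a diagnostic test.

    tp/fn: relevant records retrieved/missed by the filter;
    fp/tn: irrelevant records retrieved/correctly excluded."""
    return {
        "sensitivity": tp / (tp + fn),   # share of relevant records found
        "specificity": tn / (tn + fp),   # share of irrelevant records excluded
        "precision":   tp / (tp + fp),   # share of retrieved records that are relevant
    }

# Made-up counts for a hypothetical validation set of 1000 records
perf = filter_performance(tp=80, fp=40, fn=20, tn=860)
print(perf)  # sensitivity 0.8, specificity ~0.956, precision ~0.667
```

The trade-off visible here (a broad filter gains sensitivity at the cost of precision) is exactly what the internal validation set described above is meant to quantify.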
Discussing these aspects will help to answer the question of whether the development and use of search filters should be promoted more strongly. Finally, a validated search filter for publications on the topic of eHealth will help researchers to conduct searches with greater certainty. We had a response rate of 80.7 %; the validation rate was between 64 and 87 %. Our overall cataract prevalence was 43 % (and 9 % each for AMD and glaucoma); glaucoma and AMD co-occurred mainly with cataracts. In cases with a known age of onset of cataracts and glaucoma (n = 35), we did not observe a preferential sequence of the two events (cataract→glaucoma: n = 18; glaucoma→cataract: n = 17). In contrast, for the co-occurrence of cataracts and AMD (n = 37), the sequence cataract→retinal diseases/AMD was more frequent (n = 29) than the opposite direction (AMD/retinal diseases→cataract: n = 8) (p = 0.001, two-sided binomial test). In analysing the comorbidity of eye diseases with diabetes, hypertension and airway diseases, we identified a significant association of diabetes with cataract surgery in men. Similarly, sympathomimetic drugs or inhaled glucocorticoids (used in the treatment of chronic airway diseases) were highly significant risk factors for cataracts in older men. Other eye disorders did not show comorbidity with diabetes, hypertension and airway diseases, even when different subgroups of eye diseases were analysed. We identified alcohol and current smoking only in older women as risk factors for glaucoma or cataract surgery, respectively. Thyroid hormone replacement therapy in women was a risk factor for dry-eye syndrome, but was inversely associated with AMD. Males treated with thyroid hormones had a significantly higher risk of glaucoma. Among metabolites and enzyme activities, uric acid and alkaline phosphatase activity appear to have protective effects in women only: alkaline phosphatase for dry-eye syndrome and uric acid for cataracts. 
We investigated a limited number of candidate genes for their association with eye disorders. The ARMS2 gene (age-related maculopathy susceptibility gene 2) and the CFH gene (complement factor H) showed a significant association with AMD. Conclusion In old age, combinations of eye diseases were frequent. Moreover, the importance of classical risk factors like diabetes, hypertension and airway diseases decreased, either due to a survivor bias leaving healthier survivors in the older age group, or due to an increased influence of other, as yet unknown, risk factors. The KORA-Age project was financed by the German Federal Ministry of Education and Research (BMBF FKZ 01ET0713, 01ET1003A and 01ET1003C) as part of the 'Health in old age' program. Detection and classification of prostate cancer by geometric measurements Greim T 1 , Braumann UD 1 , Muders M 2 , Löffler M 3 1 Interdisciplinary Centre for Bioinformatics (IZBI), Leipzig, Germany; 2 Institute of Pathology, University Hospital Carl Gustav Carus, Dresden, Germany; 3 Institute for Medical Informatics, Statistics and Epidemiology (IMISE), Leipzig, Germany Description and purpose Grading of prostate cancer mainly applies the Gleason scheme, which depends on morphological and architectural aspects of prostate glands. Intra- and inter-observer reproducibility remains the main problem, even after revisions (Epstein 2010). To further improve and standardize malignancy analyses, computer-assisted approaches should be integrated. Unlike in our previous work (Loeffler et al. 2012), which used specimens from radical prostatectomy with ROIs, we are now working with prostate needle biopsies containing only few glands. Prior to classifying malignant glands they need to be localized, for which a set of characteristics addressing general prostate tumor growth is used. To classify located tumor glands, two measures quantifying their morphology were applied: inverse compactness and number of holes. 
Methods First, the H&E-stained specimens are color-separated, since for further processing only the hematoxylin-stained basophil structures (e.g. nuclei, endoplasmic reticula) are important. Next, an edge-preserving image smoothing was applied to close staining gaps. Nuclei of epithelial cells are combined as connected chains. The resulting images are binarized using hysteresis thresholding to highlight connectivity. Malignant glands lose their basal epithelial cell layer. Using thinning and a distance transform, the epithelial thickness is measured, so thin tumor glands can be distinguished from physiological and hypertrophic glands. Further features useful for tumor detection are both circular and monomorphic glandular lumina as well as spatial tumor gland aggregations. Extents of non-overlapping bounding disks are determined by both the size and the number of underlying elements. Thus isolated small structures can be separated from agglomerated tumor glands. A total of 102 prostate needle biopsy specimens was available. There were 29 specimens containing tumor patterns of Gleason grade (GG) 3 and 73 with patterns of GG 4. The biopsies were graded by a trained uropathologist (grading was blinded to the image analysts). Because the inverse compactness is defined as the ratio of the area of an isoperimetric circle to the area of the epithelium, it is high for GG 4 with lobulated structures. Increasing malignancy is characterized by the fusion of several glands. Hence, high-grade tumors exhibit multiple holes, and counting them gives large values for GG 4. Combining both measures, accuracy reached 92 % for full biopsies with a single tumor pattern. In this work we introduce a computer-assisted approach to both standardize and support the pathologist's grading. With our algorithm for detecting and classifying prostate tumors we reach a remarkable accuracy of 92 %. Soon a total of about 6000 specimens will be available for testing and improvement. 
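The inverse compactness measure used in the prostate grading abstract above, the ratio of the area of the isoperimetric circle (the circle with the same perimeter) to the area of the shape, can be sketched as follows. This is illustrative only; in the actual pipeline the perimeter and area come from the segmented epithelium.

```python
import math

def inverse_compactness(perimeter, area):
    """Ratio of the area of the isoperimetric circle (same perimeter) to the
    shape's own area: exactly 1.0 for a circle, and larger for elongated or
    lobulated outlines such as the Gleason grade 4 patterns described above."""
    isoperimetric_area = perimeter ** 2 / (4 * math.pi)
    return isoperimetric_area / area

# Sanity check: a circle of radius r has perimeter 2*pi*r and area pi*r^2,
# so its inverse compactness is exactly 1.
r = 3.0
print(inverse_compactness(2 * math.pi * r, math.pi * r ** 2))  # 1.0 (up to rounding)
```

Because any non-circular shape encloses less area than a circle of the same perimeter, the measure grows monotonically with lobulation, which is why it separates GG 4 from GG 3 patterns.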
Additionally, we will mark the malignant glands according to their Gleason grade in ''heat maps'' and will be able to distinguish different tumor patterns in one specimen. Prevalence and determinants of vestibular hypofunction in the general population: results from the KORA-FF4 survey Grill E 1 , Heuberger M 2,3 , Strobl R 4,5 , Saglam M 6 , Holle R 7 , Linkohr B 7 , Ladwig KH 8 , Peters A 7 , Lehnen N 5,9 Loss of vestibular function impairs the reflexes responsible for gaze and posture stabilization and for preventing falls. This may result in chronic dizziness, cause problems with walking and driving, and affect spatial memory and navigation. Although bilateral vestibular hypofunction (VH) is one of the most frequent causes of movement-related vertigo in older individuals, affecting both perception and postural stability, its etiology and risk factors are still incompletely understood. Also, data on the prevalence of VH in the general population are scarce because the gold-standard clinical tests that could give reliable estimates are mostly considered too complex to be employed in large surveys. The objective of this study was to estimate the prevalence and determinants of uni- and bilateral vestibular hypofunction in the general population. Data originate from the second follow-up (FF4) in 2013/14 of the KORA (Cooperative Health Research in the Region of Augsburg) S4 study (1999-2001) from Southern Germany. To assess the presence or absence of VH, all participants who reported moderate or severe vertigo and dizziness during the last 12 months were tested with the Video Head Impulse Test (Video-HIT). Specially trained examiners applied high-acceleration, small-amplitude passive head rotations (''head impulses'') to the left and right in the plane of the horizontal semicircular canals while participants fixated a target straight ahead. 
During head impulses, head movements were measured with inertial sensors, and eye movements with video-oculography (EyeSeeCam system). To determine the presence or absence of VH, the gain of the vestibulo-ocular reflex was calculated as the ratio of the medians of eye and head velocity in a window between 55 and 65 ms after head impulse start, and the presence or absence of re-fixation saccades was noted by two experienced otoneurologists. To estimate the prevalence of VH in asymptomatic persons, we also tested a random sub-sample of participants who were representative of the general population and who reported no vertigo during the last 12 months. Participants were only tested if they explicitly consented to the procedure and had no problems with head movements or the cervical spine. A total of 2279 participants were included (mean age 60.8 years, 51.6 % female); 570 (25.0 %) reported moderate or severe vertigo or dizziness during the last 12 months. Of these, 432 were tested with the Video-HIT. Likewise, 185 asymptomatic participants representative of the general population were tested. 9.4 % of symptomatic individuals had uni- or bilateral VH, and 3.5 % had bilateral VH. Of the tested asymptomatic individuals, 3.9 % had either uni- or bilateral VH. The prevalence of symptomatic VH among all participants increased from 0.6 % in individuals below 50 to 8.1 % in individuals aged 79 and over. The prevalence of symptomatic bilateral VH in all participants aged 79 and over was 4.3 %. Older age and self-reported acute hearing loss were independently associated with VH. This is the first study presenting reliable estimates of the prevalence of vestibular hypofunction in a representative sample of the general population. Older age and acute hearing loss were predictive of VH, pointing to Menière's disease and degenerative origins. Diagnosing patients with vertigo and dizziness is a challenge in primary care settings where laboratory examinations are often not available. 
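The gain computation described in the vestibular hypofunction study above (median eye velocity over median head velocity within 55-65 ms of impulse onset) can be sketched as follows. This is a minimal illustration with made-up velocity samples; the actual EyeSeeCam processing is more involved.

```python
import statistics

def vor_gain(time_ms, eye_velocity, head_velocity, window=(55, 65)):
    """Vestibulo-ocular reflex gain: ratio of the median eye velocity to the
    median head velocity inside a time window after head-impulse onset.
    A gain well below 1 suggests vestibular hypofunction on the tested side."""
    idx = [i for i, t in enumerate(time_ms) if window[0] <= t <= window[1]]
    med_eye = statistics.median(eye_velocity[i] for i in idx)
    med_head = statistics.median(head_velocity[i] for i in idx)
    return med_eye / med_head

# Made-up samples (deg/s) at 5 ms resolution around the analysis window
t = [45, 50, 55, 60, 65, 70]
eye = [5, 30, 45, 50, 55, 20]
head = [60, 80, 90, 100, 110, 90]
print(vor_gain(t, eye, head))  # 0.5 (median eye 50 / median head 100)
```

Using medians within a fixed early window makes the estimate robust to the re-fixation saccades mentioned above, which occur later in the trace.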
This study uses data from patients with confirmed diagnoses of vestibular syndromes to develop and validate simple diagnostic prediction models for the primary care physician. We describe the implementation of these models into an application that may assist practitioners with their clinical decisions. Are SES and migration history associated with stress in pregnancy? An analysis of baseline data from the BaBi birth cohort in Bielefeld, Germany Grosser A 1 , Hinz I 1 , Dong Q 1 , Höller C 1 , Breckenkamp J 1 , Razum O 1 , Spallek J 2,1 1 Universität Bielefeld, Bielefeld, Germany; 2 Brandenburg University of Technology Cottbus-Senftenberg, Senftenberg, Germany In Germany, as in many other modern societies, there is a social gradient in health: people in lower social positions have an increased risk of morbidity and mortality. Stress is hypothesized to be a major pathway between socioeconomic disparities and health outcomes. Pearlin's (1987) stress model postulates that socioeconomic status (SES) influences stress symptoms in adults. Low SES and ethnic minority or migrant history are associated with acute or chronic stress. Stress in pregnancy can have a negative impact on maternal and fetal health. Until now it has not been clear (i) to what degree low SES and migration history contribute to maternal stress and (ii) whether the determinants of stress are different in pregnancy due to biological and endocrinological changes. We thus investigated the role of SES and migration history as possible determinants of stress in pregnancy. We used baseline data of a prospective population-based birth cohort in Bielefeld, Germany (BaBi). Subjective chronic stress was measured using an adapted version of the INTERHEART general stress scale that combined stress at work or at home. Chronic stress was grouped as none or some vs. several or permanent stress. Acute stress was identified if participants indicated the occurrence of major adverse life events during pregnancy. 
SES was described by participants' age, educational level, household net income and number of children (marital status was excluded due to the low number of singles). Migration history was identified by participants' place of birth, with birth outside Germany indicating a migration history. We performed bivariate and multivariate logistic regression analyses to identify whether SES and migration history are associated with chronic and acute stress in pregnancy. Data of 598 women surveyed during pregnancy and postpartum were analysed, of whom 34.6 % (n = 207) perceived chronic stress and 39.2 % (n = 232) perceived acute stress in pregnancy. Bivariate analysis revealed that variables of SES were not significantly associated with any form of stress. Women with a migration history had a significantly higher number of children (p < .05), lower educational levels (p < .001), and lower household net incomes (p < .001). Furthermore, they reported a significantly lower prevalence of subjective chronic stress (p < .05) compared to participants without a migration history. This association remained statistically significant in the multivariate logistic model, indicating 40 % lower odds of chronic stress among participants with a migration history (OR 0.6 [95 % CI 0.4-0.9]). For acute stress, however, none of the tested possible influencing factors was significant. Conclusion Women experienced a considerable burden of acute and chronic stress in pregnancy. Migrants experienced significantly less chronic stress than non-migrants, in spite of their significantly lower SES. Low SES did not contribute to maternal stress in this population. Additional research is needed to study other factors such as personality and health status to better understand chronic and acute stress in pregnancy. Furthermore, the development of a more elaborate theory of stress in pregnancy could help to guide further research. 
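As an aside on the effect size reported in the BaBi abstract above: the crude odds ratio underlying a logistic model comes from a 2×2 exposure-outcome table. The sketch below uses made-up counts (consistent with the reported totals of 598 women and 207 with chronic stress, but the split by migration history is hypothetical); the abstract's OR of 0.6 is adjusted, not crude.

```python
def odds_ratio(a, b, c, d):
    """Crude odds ratio from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    return (a / b) / (c / d)

# Hypothetical counts: 30 of 130 women with migration history vs.
# 177 of 468 without reporting chronic stress
print(round(odds_ratio(30, 100, 177, 291), 2))  # 0.49
```

An OR below 1, as here, expresses lower odds of the outcome in the exposed group; adjustment for confounders in the multivariate model can move this crude value.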
The high prevalence of stress among pregnant women cannot be explained by lower SES and needs further attention. Patient empowerment meets concerns for patients-a study of patient accessible electronic health records in Sweden As part of an EU project, the Swedish county of Uppsala launched a patient portal, Journalen, in 2012 [1]. Patients can now access their Electronic Health Records (EHR) online, which aims to increase patient empowerment. Medical professionals reacted strongly to patients accessing the medical records. Their main concerns related to the quality of care, the effect on their work environment, the delivery of bad news through the eHealth service, and the wellbeing of patients. While the opportunities of implementing these eHealth services seem promising, the concerns of the medical professionals have to be understood and addressed, as does the actual use of the system by patients. This presentation integrates results from two interview studies with physicians and patients on patients accessing their medical records online [2, 3]. The presented results are a synthesis of the interview studies with 12 physicians [2] and 30 patients [3], which took place about 6-12 months after the launch of the portal. The synthesis presented in this paper focuses on the Technological Frames [4] of physicians and patients, including attitudes and experiences in relation to possible (1) anxiety creation, (2) increased workload, and (3) the general value of patients reading medical records. Anxiety creation due to receiving bad news: Many physicians believe that breaking bad news to patients during a patient encounter is vital, as this gives them the possibility to also explain treatments and answer questions. Somewhat unexpectedly, some patients preferred receiving bad news through Journalen instead of waiting for the physicians. These patients argue that waiting times cause more anxiety. 
The choice of not accessing is also important, as there are patients who do not want to receive bad news before a patient encounter. Workload increases: Many physicians are worried about the workload of doctors, as patients reading the medical record online may generate an increased number of phone calls from anxious patients. However, many patients did not tend to make any additional contacts to ask questions. Some patients even believe that access to their medical records reduces the number of contacts with healthcare. Usefulness of online access: Many physicians are concerned that online access will have a negative impact on patients, such as increased anxiety and misconceptions, because patients lack an understanding of medical terms. Unlike the doctors, many patients argue that they do not have major difficulties in understanding the contents. They also argue that Journalen was central to coping with their disease. From this study it is clear that the Technological Frames of physicians differ from those of patients, and that they have different attitudes towards and experiences with the system. The intention of the politicians was that the system would contribute to patient empowerment, but this framing of the technology differs from the physicians' view, as they are concerned about the consequences. More research is needed on the framing of the technology and how it has changed after the launch of the system. The sex difference of overweight and obesity in urban Shanghai, China Gu H 1 , Wang L 1 , Qian X 1 , Wang K 2 , Xuan Z 2 , Fu C 3 Objective To explore the sex difference of overweight and obesity among urban adults in China. In the downtown area of Shanghai, China, we conducted face-to-face household interviews with 11,041 subjects aged 15 years or above, with a response rate of 98.8 %. Height and weight were measured at the same time. 
Body mass index (BMI, kg/m²) was used to group subjects into three categories: obesity (≥30.0), overweight (25.0-29.9) and normal weight (<25.0). Among the subjects, 52.3 % were female and the average age was 52.1 years. The proportions of overweight and obesity were 32.6 and 7.6 %, respectively. There were significantly higher prevalences of overweight and obesity in men than in women. The prevalence of overweight increased with age in both men and women, but a significant sex difference was observed in the prevalence of obesity. It decreased from 11.1 % in the age group 15-29 years to 6.2 % in the age group 80 years or above in men, but the reverse trend was found in women (from 2.7 % in the age group 15-29 years to 12.4 % in the age group 80 years or above). There were sex differences in overweight and obesity in this urban Chinese population. Sex-specific interventions and policies need to be developed for the control of prevalent overweight and obesity. Optimizing digital health informatics interventions through unobtrusive quantitative process evaluations Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our possibilities to design better interventions. Process evaluations can be used to improve this understanding by assessing the fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the ''information value chain''. However, past process evaluations often did not assess the relationships between those stages. 
In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a byproduct of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and to inform essential design choices to optimize their efficacy. Modelling of operative report documents for data integration into an openEHR-based enterprise data warehouse In order to integrate operative report documents from two operating room management systems into a data warehouse, we investigated the application of the two-level modelling approach of openEHR to create a shared data model. Based on the systems' analyses, a template consisting of 13 archetypes has been developed. Of these 13 archetypes, 3 have been obtained from the international archetype repository of the openEHR foundation. The remaining 10 archetypes have been newly created. The template was evaluated by an application system expert and through conducting a first test mapping of real-world data from one of the systems. The evaluation showed that by using the two-level modelling approach of openEHR, we succeeded in representing an integrated and shared information model for operative report documents. More research is needed to learn about the limitations of this approach in other data integration scenarios. Overweight, high blood pressure and arterial stiffness in early childhood Obesity's adverse impact on blood pressure (BP) and its association with impaired arterial function have been increasingly explored in the young. However, research has paid little attention to early childhood. The objective of this study was to investigate whether elevated BP and arterial stiffness occur more often in overweight or obese (OWOB) children, with a focus on preschool children. 
The data were baseline measurements of 150 children aged 2 to 6 years (4.79 ± 0.83, 52.7 % females) participating in the Fitness for Kids cluster randomized controlled trial. Heart rate, brachial and aortic BP, pulse pressure (PP), mean arterial BP (MAP), augmentation pressure (AugP) and index (AIx), as well as pulse wave velocity (PWV) were measured non-invasively using an oscillometric device (Mobil-O-Graph, I.E.M., Stolberg, Germany). We applied multilevel linear and logistic models with daycare centers as a random intercept, adjusting for age, height, sex (BP variables) and heart rate (PWV, AugP, AIx), to estimate the effect of OWOB (BMI ≥ P90 according to German reference data) on BP and arterial stiffness. Prevalence of OWOB was 10.7 % overall, and 22.2 % in those who exhibited an elevated SBP (≥ P95 according to the German KiGGS Study reference values) compared to 7.5 % in children with normal SBP. Children who were OWOB demonstrated 4.5, 4.8, 9.1 and 3.8 % higher brachial SBP, aortic SBP, aortic PP and PWV (all P < 0.05), respectively, but no significant differences regarding AIx or AugP compared to normal-weight children could be detected. The odds of having an elevated SBP were significantly higher in OWOB children as opposed to their normal-weight peers (OR = 3.43, 95 % CI 1.18-10.00, P = 0.024), and PWV was significantly higher in hypertensive children even after adjusting for BMI and MAP (4.47 ± 0.15 vs. 4.26 ± 0.17; 0.21 m/s, 95 % CI 0.14-0.28, P < 0.001). The results suggest that overweight is simultaneously associated with adverse BP levels and stiffer arteries, even in preschool children. Early treatment and prevention of overweight must be a priority because it may prevent irreversible damage to the cardiovascular system. The small sample size must be taken into account. More research is required to examine the differential interaction between overweight, high blood pressure and arterial stiffness. 
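The abstract above reports an odds ratio with a 95 % confidence interval for elevated SBP in OWOB children. As an illustration of how such an estimate arises in its simplest, unadjusted form, the sketch below computes a crude odds ratio with a Wald confidence interval from a 2×2 table. The counts are hypothetical, and the study itself used multilevel models adjusting for covariates, so this is not a reproduction of the published analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    z = 1.96 gives a 95 % interval."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under the Wald approximation
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, NOT the study's data:
# 6 OWOB children with elevated SBP, 10 OWOB with normal SBP,
# 12 normal-weight with elevated SBP, 122 normal-weight with normal SBP.
or_, lo, hi = odds_ratio_ci(6, 10, 12, 122)
```

A confidence interval whose lower bound exceeds 1, as in the abstract's OR = 3.43 (95 % CI 1.18-10.00), indicates a statistically significant association at the chosen level.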
Aortic pulse wave velocity as a surrogate marker for arterial stiffness and its determinants in early childhood Background Aortic pulse wave velocity (PWV) has emerged as an established marker for arterial stiffness used to stratify cardiovascular risk and to improve the prediction of future events in adults. A considerable literature has grown up around hemodynamic indices detecting target organ damage in different pediatric diseases, such as early atherosclerosis, obesity, hypertension or congenital heart diseases. Due to the lack of hard end points there is an increasing need for noninvasive surrogate markers to identify altered vascular function in pediatric populations. However, particularly for preschool children, data on PWV are sparse. Therefore, this study set out to characterize factors determining PWV in early childhood. The present data are based on the baseline measurements from the Fitness for Kids Study, a cluster randomized controlled exercise intervention program in preschoolers. A total of 139 healthy children aged 2 to 6 years (4.80 ± 0.82 years, 51.8 % girls) were recruited from five daycare centers in Hamburg, Germany. We measured weight, height, heart rate, blood pressure (BP), mean arterial BP (MAP) as well as PWV noninvasively by a validated oscillometric device (Mobil-O-Graph, I.E.M., Stolberg, Germany). Statistical analyses included correlations to examine the relation between anthropometrics, hemodynamics and PWV, as well as linear mixed models with random intercepts for daycare centers to assess sex differences and to derive the determinants of PWV. E. Grill et al. On average, boys had a PWV of 4.34 ± 0.32 m/s and girls of 4.29 ± 0.32 m/s (P = 0.324). Correlation analysis revealed significant associations of PWV with height, weight, BMI, systolic and diastolic BP as well as MAP (P < 0.05), but not with age. 
In the multivariate analysis we found that MAP (beta = 0.037, P < 0.001) was the major determinant of PWV in the investigated age range. Additionally, mean standardized values of systolic BP and MAP increased with age and height, while PWV increased weakly or remained practically unchanged. The study offers important insights into arterial ageing in early life, namely that not age, but rather MAP independently predicts PWV in very young children. According to previous research, MAP physiologically increases with growth, which in turn enhances elastic properties and compliance of arteries despite an accompanying age- and BP-dependent rise in arterial stiffening. This effect on vascular structure and function might explain our finding that PWV remained constant and will probably start to show a noticeable increase with the onset of puberty. Our results emphasize the usefulness not only of routine BP measurements in early childhood but moreover of measuring PWV in order to evaluate arterial function. Research questions should focus on further clinical and lifestyle risk factors potentially influencing PWV in healthy young children. Background The aim should be to protect women from carcinoma of the cervix and its preliminary stages with a type-specific vaccination. Half of all women develop few or no antibodies against HPV. A possible refinement of HPV vaccination with antibody titer identification could be a sensible support for women with HPV infection. Furthermore, continuing care, education and invitation of vaccinated women to take part in gynecological screening with morphological diagnostics remain applicable. During microscopic inspection of gynecological cells of the cervix of vaccinated women between the ages of 18 and 26, mild, moderate and severe dysplasia can be detected using the known malignancy criteria, as classified in the Münchner Nomenklatur III in German-speaking areas. 
From January 2015 to January 2016, 27 cases of dysplasia of the cervix were diagnosed in Pap smears of vaccinated women. 59 % of these cases had been vaccinated with Gardasil; in 41 % the vaccine was not specified. The dysplasias show the following distribution: 63 % of cases show a mild dysplasia (low SIL). In 37 % a high SIL lesion can be verified; of these, 33 % have a moderate dysplasia and 4 % a severe dysplasia. Development and progress of the dysplasia: in 74 % of cases the year of vaccination is unknown. In 7 % of cases a dysplasia appeared after 1 year, in 4 % after 4 years and in 15 % after 5 years of HPV vaccination. 70 % of cases are in dysplasia follow-up examinations; 15 % of these healed within 2 years. In 15 % of high SIL cases a histological clarification followed, which confirmed the high SIL lesion in 100 % of cases. The rate of dysplasia formation in vaccinated women relative to the rate in non-vaccinated women cannot be illustrated at the moment. At present there is no valid specific cut-off for an HPV antibody titer identification, and a subtype determination of HPV antibodies is not possible either; therefore a reliable statement about the duration of immunity and the actual raising of specific antibodies against HPV is not possible. The HPV test does not reliably indicate dormant HPV infections. The effect of HPV vaccination on a dormant HPV infection, and thus its actual effect on the formation of a dysplasia, is therefore unclear. It can also be observed that further HPV types are detected in high SIL, even though these are not officially classified as high risk. These HPV types are currently not detected using the common HPV tests and are not included in the new advanced vaccine. 
If the development of further new high-risk HPV types continues, it is questionable whether an HPV screening program can withstand HPV mutation, while morphological diagnostics are unaffected and the malignancy criteria remain consistent. It is through morphological diagnostics that such high SIL lesions are detected. Thus the importance of gynecological screening with morphological diagnostics becomes apparent. More than 40 years have passed since large-scale Japanese hospitals began using hospital information systems (HIS). Fortunately, the initial concerns about HIS networks have been reduced, and the importance of wireless information networks in hospitals will continue to increase. We herein discuss the problem of "network availability", which can be defined as "being able to communicate with the required transmission rate, wherever and whenever a user wants". We summarize the elements of "network availability" in hospitals and make suggestions for ensuring availability, especially focusing on wireless communications. When using wireless LAN, the network administrator must be mindful of the following: the location and output power of each access point (AP), the effect of construction materials on signal reach, and security issues. We hope to encourage hospital staff to recognize the benefits of the use of telecommunications equipment and to understand the importance of cooperation with the people responsible for design, construction, and maintenance of the hospital systems. Participatory heuristic evaluation of the second iteration of the eWALL interface application Hangaard SV 1 , Schaarup C 1 , Hejlesen O 1 1 Aalborg University, Aalborg, Denmark The number of people having a chronic disease is increasing. Telehealth may provide an alternative to traditional medicine, as telehealth solutions have been shown to have a positive influence on quality of life and to decrease the number of hospital visits. A new telehealth solution is the eWALL system. 
Previously, the eWALL interface application has been evaluated using participatory heuristic evaluation (PHE). The previous round of PHE led to drastic changes of the eWALL interface application. Thus, a second round of PHE was performed. Five usability experts and two work domain professionals inspected the eWALL interface application and identified usability problems. The usability experts and work domain professionals identified 384 usability problems. The work domain professionals had a tendency to use other heuristics than the usability experts. The study highlights the relevance of using PHE in an interface development process. The use of smartphones in Norwegian Social Care Services Hansen LI 1 , Fossum M 2 , Fruhling A 3 1 University of Agder, Faculty of Health and Sport Sciences, Grimstad, Norway; 2 University of Agder, Faculty of Health and Sport Sciences, Grimstad, Norway; 3 School of Interdisciplinary Informatics, University of Nebraska, Omaha, Nebraska, United States This study aims to understand how smartphone technology was perceived by social workers responsible for piloting social services software and the experiences of involving end-users as co-developers. The pilot resulted in an improved match between the smartphone software and workflow as well as mutual learning experiences among the social workers, clients, and the vendor. The pilot study revealed several graphical user interface (GUI) and functionality challenges. Implementing an ICT social service smartphone application may further improve efficiencies for social workers serving citizens; however, this study validates the importance of studying end-users' experiences with communication and the real-time use of the system in order to reap the anticipated benefits of ICT capabilities for smartphone social service applications. 
Accelerate healthcare data analytics: an agile practice to perform collaborative and reproducible analyses Hao B 1 , Sun W 1 , Yu Y 1 , Li J 1 , Hu G 1 , Xie G 1 1 IBM Research-China, Beijing, China Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, yet performing analyses on healthcare data in current practice still lacks efficiency for researchers. What is more, collaborating with different researchers and sharing analysis results are challenging issues. In this paper, we developed a practice to make the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in shorter time compared to our previous practice. The effects of dietary fatty acid (FA) intakes on serum lipids have been investigated thoroughly in adults. However, studies in children and adolescents are lacking. The current study therefore aims to describe the longitudinal associations of dietary FAs with serum lipids in two birth cohorts of German children from ages 10 to 15 years. Children with complete data on serum lipid concentrations, FA intakes and covariates at the 10- and 15-year follow-ups of the GINIplus and LISAplus birth cohort studies were included in the analyses (n = 1073). Lipid concentrations were measured from blood samples and dietary intake was assessed by food frequency questionnaires (FFQs). FA intake was expressed as percentage of total energy intake (%EI) following the nutrient density model. 
Longitudinal associations of the consumption of saturated fatty acids (SFA), monounsaturated fatty acids (MUFA), and n-3 and n-6 polyunsaturated fatty acids (PUFA) with total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), and triacylglycerols (TAG) were assessed by linear mixed effects models including a random intercept. Additionally, change-on-change analysis was performed using multiple linear regression models regressing changes in the consumption of FAs on changes in serum lipids. All analyses were adjusted for sex, study, region, parental education, pubertal onset, fasting status, exact age at blood withdrawal, BMI and total energy intake. Sex-stratified analyses were performed if significant interactions with sex were observed. In the longitudinal regression model, SFA, MUFA and n-3 intakes were significantly associated with higher HDL-C; however, this was limited to males in the sex-stratified analyses. Furthermore, MUFA intake was inversely associated with TAG in both sexes. Change-on-change analyses indicated that increases in n-3 PUFA intakes between 10 and 15 years of age were related to concurrent increases in HDL-C levels among males. Our findings suggest that during the transition from childhood to adolescence, dietary fatty acid intakes affect serum lipid levels in a way which is broadly comparable to effects observed for adults. Future analyses should elucidate whether the benefits of MUFA intakes in particular are comparable when replacing SFA, n-3 PUFA or carbohydrates. Are we doing the right research in biomedical and health informatics and are we doing it right? Some thought-provoking and critical proposals to encourage scientific debate on the nature of good research in medical informatics The multi-country EU project ADVOCATE (Added Value for Oral Care) strives to optimize the delivery of oral health care towards prevention. 
It involves the analysis of routinely collected oral health care records from health insurances in six European countries, including the National Health Service (NHS) England and NHS Scotland. It is planned to store the data in a central repository. The use of sensitive medical records is dependent on data usage agreements, each of which underlies different national data protection and privacy regulations. The result is that the requirements for data aggregation differ between the European partners. While some regulations may allow the use of a pseudonymised patient identifier, others require anonymisation before the data can be used in a central repository. Hence, a flexible workflow for the integration of routinely collected insurance data into a central repository must be established. Thus, we analysed the data usage agreements from ADVOCATE project partners to derive requirements for the data integration workflow. IT infrastructure for merging data from different clinical trials and across independent research networks Opsoclonus Myoclonus Syndrome (OMS) is a rare disease in children which is often associated with neuroblastoma and, therefore, requires treatment by pediatric neurologists and oncologists. The ongoing OMS trial investigates different questions related to OMS and potentially underlying neuroblastomas. In order to support this trial with an adequate IT infrastructure, linkage of neuroblastoma research databases with the OMS electronic data capture (EDC) system was required. Therefore, an EDC system for the OMS trial has been developed and integrated into the research infrastructure of the European Network for Cancer Research in Children and Adolescents (ENCCA) project. Application of ENCCA's pseudonymization concept enabled linkage of the OMS trial with neuroblastoma trials from two different scientific societies, while being compliant with current data protection regulations. 
Linkage of the neurological and the oncological domain could successfully be demonstrated, and a promising concept for secondary use of the data of both domains has been developed, proving the broad potential of the concepts for cross-domain research as promoted in the ENCCA project. Based on data comprising numbers of prescriptions reimbursed by statutory health insurance from January 2006 until June 2015 in Schleswig-Holstein, we analyzed prescriptions of all BZD and Z-hypnotics in intervals of once a month. Tetrazepam, which has been applied exclusively for the treatment of muscle tension, was included to evaluate the pharmacoeconomic significance of this BZD with regard to its withdrawal from the EU market in 2013 and to potential abuse of other BZD as substitutes for tetrazepam. Furthermore, patterns of prescriptions in 2015 relating to age, duration and administrative districts were examined. Because of small numbers of prescriptions, short-acting midazolam and triazolam as well as the anticonvulsant clonazepam and clomethiazole, mainly used for acute alcohol withdrawal, were excluded from the analysis. The tranquilizers alprazolam, chlordiazepoxide, clobazam, clorazepate, medazepam and prazepam and the hypnotics brotizolam, flunitrazepam, flurazepam and zaleplon did not represent important shares of prescribed substances (<5,000 prescriptions/year). Numbers of prescriptions regarding the tranquilizers bromazepam (47,300 [2006]/23,561 [2014], -50 %), diazepam (67,800/41,875, -38 %) and oxazepam (57,600/25,362, -56 %) indicated that these BZD are still popular, but prescription numbers declined continuously through the years. Lorazepam was prescribed most frequently, with an increasing number of prescriptions (65,089/72,477, +12 %). Prescriptions of BZD applied as hypnotics were also declining during the analyzed period: lormetazepam (34,842/15,894, -54 %), nitrazepam (21,400/7,790, -64 %), temazepam (29,242/11,295, -55 %). 
In the case of Z-hypnotics, zolpidem tended towards decreasing numbers of prescriptions (62,378/51,291, -17 %), whereas zopiclone seems to be favoured more and more by practitioners (71,044/79,167, +11 %). Besides tranquilizers and hypnotics, tetrazepam was prescribed extensively with an increasing number of prescriptions (71,500/82,800, +12 % [2012]), thus representing the most frequently used BZD until 2013. There were no striking changes in the prescription patterns of other BZD since the discontinuation of tetrazepam. Despite the existence of pertinent recommendations, in 2015 the vast majority of BZD and Z-hypnotics turned out to be prescribed to patients between 70 and 80 years old. Indication was not of relevance, since both hypnotics (short-acting) and tranquilizers (mid-long/long-acting) were prescribed alike to this group. In contrast, tetrazepam and diazepam were found to be prescribed particularly to people aged 45-50 years. The percentage of patients receiving prescriptions of BZD permanently (periods of >2 quarters) averaged 19 % in 2014. Distinct differences in the numbers of prescribed BZD and Z-hypnotics among various areas, consistent with a general north-south divide, became evident. No seasonal variations in prescriptions were noticed. Conclusion This analysis revealed an overuse of BZD and, increasingly, of Z-hypnotics that has to be questioned, especially with respect to the fact that the majority of these substances are prescribed to elderly patients. Thus, there is still a need to develop programs reducing uncritical prescription of BZD. Heinze G 1 , Dunkler D 2 Statistical models are handy tools for empirical medical research. They facilitate individualized outcome prognostication conditional on covariates as well as adjustment of the estimated effects of risk factors on the outcome by covariates. 
The theory of statistical models is well-established if the set of covariates to consider is fixed and small, such that we can assume that effect estimates are unbiased and the usual methods for confidence interval estimation are valid. In routine work, however, it is not known a priori which covariates should be included in a model, and often we are confronted with a number of candidate variables in the range of 10-30. This number is often too large to be considered in a statistical model. In recent decades many statisticians have extensively studied variable selection procedures for various purposes, e.g., for adjusting the effect of a risk factor of interest for confounders or other covariates, for hypothesis testing, or for deriving multivariable prediction models. In this talk, we will exemplify the application of variable selection using scientific questions and data from real medical studies with continuous and censored survival endpoints. Using these examples and simulated data, we will discuss the implications of variable selection, e.g., on bias and uncertainty of the final model. We demonstrate how resampling methods can help to quantify these instability issues in real-life data analyses. Currently, routine software implementations largely ignore instability issues caused by variable selection. We advocate the implementation of resampling methods in routine software such that applied researchers can become aware of instabilities in the finally selected model, in particular if they apply variable selection to too small data sets. Predictors of pre-diabetes: evidence from a German Cohort Study Hengelbrock J 1 1 Institute for Medical Biometry and Epidemiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany Pre-diabetes represents an elevation of plasma glucose and glycated hemoglobin above the normal range but below the clinical threshold. 
It is associated with early forms of nephropathy, chronic kidney disease and an increased risk of macrovascular disease, and is a high-risk state for diabetes. To date, the mechanisms whereby obesity, hypertension, laboratory parameters, lifestyle habits and psychosocial factors contribute to or predict the development of pre-diabetes are only incompletely understood. We conducted a cohort study of 2098 workers at an aircraft factory located in Germany and collected data on socio-demographic variables, lifestyle and dietary habits and clinical as well as laboratory parameters at a baseline survey. Of these, complete follow-up data 3 years later were available for 1319 individuals. Exclusion criteria were diagnosed diabetes or the use of anti-diabetic medication at baseline. Instead of classifying subjects as non-, pre- or diabetic based on fixed cut-off values for fasting plasma glucose (FPG) and hemoglobin A1c (HbA1c) levels, we analyzed glucose and HbA1c levels as continuous variables. Using a linear model, we estimated the association between glucose and HbA1c levels at follow-up and age, sex, obesity, hypertension, levels of LDL, HDL and triglycerides, as well as smoking, alcohol consumption, physical activity and perceived stress at baseline, with the sum of the standardized glucose and HbA1c values at follow-up as the dependent variable, controlling for baseline glucose and HbA1c levels. According to the definition of pre-diabetes of the American Diabetes Association (FPG of 100-125 mg/dl or HbA1c levels of 5.7-6.4 %), the prevalence of pre-diabetes at baseline was 29.4 %. Many subjects switched between the non-diabetic and pre-diabetic state from baseline to follow-up (non-diabetic to pre-diabetic: 173; vice versa: 146), and many had glucose and HbA1c levels close to the mentioned classification cut-offs for pre-diabetes. 
In the linear model, waist-to-height ratio at baseline explained more of the variance in glucose and HbA1c levels at follow-up than waist circumference or body mass index (BMI). In addition to waist-to-height ratio, baseline glucose and HbA1c levels, age and hypertension were positively associated with an increase in glucose and HbA1c levels at follow-up. HbA1c, glucose, waist-to-height ratio and age at baseline were the variables that contributed most to the explained variance of the model, and including more factors did not increase the model fit substantially. Modeling glucose and HbA1c levels at follow-up separately revealed similar results. Detailed estimates will be presented. In our study, only waist-to-height ratio, age and hypertension at the baseline survey were significantly associated with increased glucose and HbA1c levels at follow-up, controlling for baseline glucose and HbA1c. Additional factors such as laboratory parameters, lifestyle habits and psychosocial factors did not explain much of the remaining variance of glucose and HbA1c levels. However, the moderate size of the cohort must be taken into account. A planned further follow-up of the cohort will help to disentangle the causal mechanisms behind these factors and the development of pre-diabetes and diabetes. 
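The ADA cut-offs cited in the abstract above (FPG 100-125 mg/dl or HbA1c 5.7-6.4 %) can be written as a small classification rule. The sketch below is purely illustrative: the study deliberately analyzed glucose and HbA1c as continuous variables rather than applying such cut-offs, and the function and category names are my own, not the authors'.

```python
def glycemic_status(fpg_mg_dl, hba1c_pct):
    """Classify glycemic status by the ADA cut-offs cited in the abstract:
    pre-diabetes = FPG 100-125 mg/dl or HbA1c 5.7-6.4 %.
    Values at or above 126 mg/dl or 6.5 % fall into the diabetic range.
    Illustrative only; a real diagnosis requires confirmatory testing."""
    if fpg_mg_dl >= 126 or hba1c_pct >= 6.5:
        return "diabetic range"
    if fpg_mg_dl >= 100 or hba1c_pct >= 5.7:
        return "pre-diabetes"
    return "normal"
```

Note how either criterion alone suffices for the pre-diabetes label, which is one reason many subjects in the cohort sat close to a cut-off and switched categories between baseline and follow-up.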
Cross-sectional and longitudinal clustering of metabolic outcome-real-life data analysis based on 8,370 patients with type 1 diabetes Hermann J 1,2 , Schwandt A 1,2 , Rosenbauer J 3,2 , Holterhus PM 4 , Warncke K 5 , Binder E 6 , Meissner T 7 , Kapellen TM 8 , Holl R 1,2 1 Institute of Epidemiology and Medical Biometry, ZIBMT, Ulm University, Ulm, Germany; 2 German Center for Diabetes Research (DZD), München-Neuherberg, Germany; 3 German Diabetes Center, Leibniz Center for Diabetes Research at Heinrich Heine University Düsseldorf, Düsseldorf, Germany; 4 University Hospital of Schleswig-Holstein, Campus Kiel/Christian-Albrechts University of Kiel, Kiel, Germany; 5 Klinikum rechts der Isar, Technische Universität München, München, Germany; 6 Medical University of Innsbruck, Innsbruck, Austria; 7 University Children's Hospital Düsseldorf, Düsseldorf, Germany; 8 University of Leipzig, Hospital for Children and Adolescents, Leipzig, Germany In large cohort studies, it is often of interest to divide the cohort into subgroups of subjects with comparable characteristics. Mostly, subgroups are based on specific pre-defined variable cut-offs, while subgroup definition using clustering techniques is less common. However, in the context of disease heterogeneity and a personalized treatment approach, identification of subgroups with similar developmental trajectories over time is becoming more popular. This analysis aims to cluster patients with type 1 diabetes (T1D) based on HbA1c, one of the most important quality indicators in T1D therapy. In particular, changes of HbA1c during puberty are of interest. Results from both cross-sectional and longitudinal cluster techniques are compared against the background of clinical interpretability. 
Methods N = 8,370 patients from the German/Austrian multicenter diabetes registry DPV were analyzed using SAS 9.4 (T1D duration ≥2 years, HbA1c aggregated per year of life, follow-up from 10 to 18 years with mandatory HbA1c values at the age of 10/15 years and values in at least three further years). Two cross-sectional cluster techniques, K-means (PROC FASTCLUS) and hierarchical agglomerative clustering (PROC CLUSTER), were used. Cluster variables were HbA1c at the age of 10 years (pediatric baseline value) and relative change in HbA1c at the age of 15 years (puberty). In addition, latent class growth analysis (LCGA, PROC TRAJ [1])-a longitudinal method based on structural equation modeling (SEM) [2]-was applied. LCGA was based on HbA1c follow-up from the age of 10 to 18 years. Comparison of patient characteristics between subgroups was performed using multinomial logistic regression models. Based on clinical interpretability, a number of six clusters was predefined for the K-means analysis. The six-cluster solution (R² = 0.90) revealed cluster sizes between 156 (2 %) and 2472 (30 %) patients. Clusters differed significantly in age at T1D onset, migration background, body height, physical activity, and intensity of diabetes self-management (all p < 0.05). The six-cluster solution of the hierarchical agglomerative clustering with average linkage as the clustering method (R² = 0.61) led to two clusters consisting of only one and seven patients, respectively. The largest cluster contained 7052 (84 %) patients. In LCGA, BIC-based model selection revealed an optimal number of six clusters using a predefined range from two to six clusters. The smallest/largest cluster included 298 (3 %)/3143 (36 %) patients. Both K-means and LCGA detected a cluster of patients with stable-low HbA1c over time. 
Patients with high baseline HbA1c were split into groups with stable (12 % of all patients) or deteriorating (4 %) metabolic control by the LCGA, but not by the K-means or hierarchical cluster analysis. Both simple and more complicated SEM-based cluster methods exist to define subgroups within a cohort. While the K-means analysis used only baseline HbA1c and the change in HbA1c at the age of 15 years, LCGA clustered patients based on a 9-year follow-up of HbA1c values and thus seems preferable. Both methods can be extended to include further variables. Hierarchical clustering may require preliminary clustering (e.g. K-means) in large data sets. All three methods have to deal with statistically optimal versus clinically interpretable solutions. Studying genome-wide single nucleotide DNA polymorphisms (SNPs) to infer heritable components of the susceptibility to diseases is still challenging. Although it is evident that the concerted action of multiple SNPs affects disease susceptibility, univariate analyses are still most frequently employed. Deep learning approaches such as deep belief networks (DBN) allow for the extraction of high-level multivariable features. Having originally been developed for image classification (e.g. Huang 2012), DBNs have rarely been applied to genomic data. Still, genome-wide data usually comprise millions of SNPs, and a corresponding DBN would require an unrealistic amount of training samples to be efficient. Here we assess the performance of DBNs in detecting patterns in high-dimensional SNP data. We use susceptibility to radiation-induced carcinogenesis as a model to investigate how filtering SNPs based on external knowledge such as gene expression data or functional annotations of SNPs affects the performance of the DBN. In a simulation study we generated 100,000 genotypes using HapGen2 (Su 2011) and publicly available whole genome SNP data (The 1000 Genomes Project Consortium 2015). 
These 100,000 genotypes were used as background noise in which we placed different SNP patterns. We filtered the SNPs based on external knowledge and investigated the performance of DBNs in learning the simulated patterns by inspecting the likelihood of test data or by investigating samples generated from the model. DBNs with varying numbers of hidden layers were trained unsupervised with the simulated genotypes. By varying the amount of filtering, the frequency of the pattern in the training data, or the amount of training data, we evaluated to what extent DBN performance is sensitive to these parameters. Using external information we reduced the dimension of the simulated data to between 1,000 and 10,000 SNPs. With 1,000 SNPs and a pattern frequency of 10 %, the network was well trained using 5,000 genotypes. We observed only a subtle increase in the likelihood when further increasing the amount of training data. Even noisy SNP patterns were detected. However, we observed that with increased noise the network became more sensitive to the amount of training data. DBNs are well suited to capture patterns in high-dimensional SNP data when external data are used to reduce dimensionality prior to network training. Given that patterns are not too rare, they can be detected with a moderate amount of training data. Reliability of the regional myocardial infarction registry In recent decades, a declining mortality rate from cardiovascular disease as well as acute myocardial infarction (AMI) has been observed in Germany. However, there are large differences between the federal states in AMI mortality and morbidity. Saxony-Anhalt is one of the federal states with the highest AMI mortality rates in Germany. In order to investigate the reasons, the regional myocardial infarction registry of Saxony-Anhalt (RHESA) was established. A follow-up of RHESA was conducted in 2014 (RHESA-CARE). 
In RHESA and RHESA-CARE we receive data based on medical records as well as self-reported data including individual factors, via a hospital questionnaire (HQ) in RHESA and a computer-assisted telephone interview (CATI) in RHESA-CARE. The aim is to quantify the reliability of RHESA and RHESA-CARE data. We therefore address the following question: do the data from the HQ differ from the data of the CATI? Methods RHESA is a population-based registry of patients with fatal or nonfatal AMI that was established in July 2013. The registry population comprises inhabitants aged 25 years or more of the city of Halle (Saale) (n = 179,000) and inhabitants of the rural district Altmark (n = 165,000) in the federal state Saxony-Anhalt, Germany. RHESA-CARE is a follow-up of AMI patients who have given written consent to participate in the RHESA registry. It is conducted six weeks after AMI. We will use the data of patients who had been interviewed until June 2016 within the RHESA-CARE study. Until then we expect a sample size of 430. We will compare the patient-reported factors from the CATIs with the data of the HQ. We will focus on the main risk factors, defined as smoking status, diabetes mellitus, hypertension, hypercholesterolaemia, reinfarction and stroke, as well as information about height, percutaneous coronary intervention, bypass surgery and atrial fibrillation. For the reliability analyses of continuous data we will use Bland-Altman plots, and for comparison of categorical data Cohen's kappa. Analysis will be done with SAS 9.4 and R 3.0.3. Self-reported data continue to be important tools in epidemiology and public health research. In RHESA we use a hospital questionnaire to obtain information on patient health status and risk factor profile, whereas in RHESA-CARE we use a computer-assisted telephone interview. Patient-reported data and medical record data are important for obtaining meaningful estimates of disease and risk factor prevalence. 
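For the categorical comparisons described above, Cohen's kappa corrects raw agreement between two data sources for the agreement expected by chance alone. A minimal illustrative sketch (the study itself uses SAS 9.4 and R 3.0.3; the ratings below are invented):

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa for two paired categorical ratings
    (e.g. a risk factor coded by HQ vs. CATI for the same patients)."""
    assert len(ratings1) == len(ratings2) and ratings1
    n = len(ratings1)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    # Chance agreement from the marginal category frequencies.
    c1, c2 = Counter(ratings1), Counter(ratings2)
    expected = sum(c1[cat] * c2[cat] for cat in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement, 0 when agreement is no better than chance, and can be negative when the two sources systematically disagree.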
Can questionnaires identify which children with headache suffer from migraine? A large-scale study investigating children's self-report and parental ratings Hirschfeld G, Zernikow B 1 , Wager J 1 , Thiele C 2 1 Deutsches Kinderschmerzzentrum, Datteln, Germany; 2 Hochschule Osnabrück, Osnabrück, Germany Background Many children and adolescents suffer from recurrent headaches. The most frequent diagnoses are tension-type headaches and migraine. An important question for epidemiological studies is whether or not it is possible to classify children with those diagnoses based on questionnaire data. Several pain-related characteristics are surmised to differ between children with and without migraine: intensity, frequency, pain quality (pulsating vs. pressing), and accessory symptoms (nausea, vomiting, sensitivity to light). The aim of the present study was to investigate the utility of questionnaire ratings of these individual characteristics in identifying children who suffer from migraine. Methods 795 patients (72 % female; mean age = 15 years, standard deviation 2 years) admitted to a tertiary pain clinic participated in the study. Children who were diagnosed as having migraine (irrespective of additional diagnoses) were contrasted with those children who were not diagnosed as having migraine. Diagnosis was made by an experienced pain therapist after a 1.5-h interview with children and parents. Data on pain characteristics mentioned in the IHS criteria (intensity, frequency, pain-quality rhythm, pain-quality affect, days missed at school, pain-related disability, number of accessory symptoms) were collected via questionnaire before the interview (both children and parents reported characteristics). First, the individual characteristics were tested for significant differences between children with and without migraine and expressed as effect sizes. 
Second, Receiver Operating Characteristic (ROC) analyses were used to determine the diagnostic utility of the individual characteristics. Third, two separate decision trees, one for children and one for parents, were developed. In order to avoid overfitting, the depth of the tree was based on 50 replications of ten-fold stratified cross-validation. Diagnostic utility of classification was measured by the area under the curve (AUC) within the full sample and out of sample as estimated by cross-validation. Almost all of the individual characteristics showed significant differences between pediatric headache patients with vs. without migraine. However, with the exception of number of accessory symptoms and frequency, which both showed large effect sizes, the effect sizes were only small to medium. The ROC analysis similarly showed that only these two characteristics had acceptable AUCs. The decision tree for children's self-report consisted of only two nodes (frequency and number of accessory symptoms) and had an AUC of 76 % in sample and 74 % out of sample. The decision tree for parental reports also included frequency, number of accessory symptoms, pain-quality rhythm, and days missed in school. This decision tree had an AUC of 83 % in sample and 78 % out of sample. Even though many significant differences in headache characteristics exist, these do not easily translate into useful classifications. Only frequency and number of accessory symptoms showed some diagnostic utility when tested individually. When combining the different characteristics, we found overall only acceptable classification performance for children and their parents, with the latter performing slightly better. It is important to keep in mind that the models need to be evaluated in a prospective sample. Together these results highlight further avenues to improve clinical decision-making and epidemiological studies that rely on questionnaire data. 
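The AUC reported above has a simple probabilistic reading: it is the chance that a randomly chosen patient with migraine receives a higher classifier score than a randomly chosen patient without migraine. A small illustrative sketch of this rank-based (Mann-Whitney) computation, with invented scores rather than study data:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive case scores higher than a
    random negative case; ties count as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to chance-level discrimination, 1.0 to perfect separation; the study's cross-validated AUCs of 74-78 % sit between "acceptable" and "good" on common rules of thumb.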
Semi-automated annotation of biobank data using standard medical terminologies in a graph database Hofer P 1 , Neururer SB 2 , Goebel G 3 1 Medical University of Innsbruck, Innsbruck, Austria; 2 Medizinische Universität Innsbruck, Innsbruck, Austria; 3 Medical University Innsbruck, Innsbruck, Austria Data describing biobank resources frequently contain unstructured free-text information or insufficient coding standards. (Bio-)medical ontologies like the Orphanet Rare Diseases Ontology (ORDO) or the Human Disease Ontology (DOID) provide a high number of concepts, synonyms and entity relationship properties. Such standard terminologies increase the quality and granularity of input data by adding comprehensive semantic background knowledge from validated entity relationships. Moreover, cross-references between terminology concepts facilitate data integration across databases using different coding standards. In order to encourage the use of standard terminologies, our aim is to identify and link relevant concepts with free-text diagnosis inputs within a biobank registry. Relevant concepts are selected automatically by lexical matching and SPARQL queries against an RDF triplestore. To ensure correctness of annotations, proposed concepts have to be confirmed by medical data administration experts before they are entered into the registry database. Relevant (bio-)medical terminologies describing diseases and phenotypes were identified and stored in a graph database which was tied to a local biobank registry. Concept recommendations during data input trigger a structured description of medical data and facilitate data linkage between heterogeneous systems. Colorectal cancer (CRC) screening aims at both prevention and early detection of CRC. CRCs detected earlier through screening may have a different molecular pathology than symptom-detected cancers. Knowledge about the molecular pathology can help improve screening examinations and technologies. 
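The lexical-matching step of the semi-automated annotation pipeline described above can be approximated with simple fuzzy string matching. The concept IDs and labels below are invented stand-ins for ORDO/DOID content, and a real system would query the RDF triplestore via SPARQL rather than an in-memory dictionary:

```python
import difflib

# Hypothetical mini-terminology: concept IDs mapped to preferred
# terms and synonyms (illustrative only, not actual ontology content).
CONCEPTS = {
    "DOID:1612": ["breast cancer", "malignant neoplasm of breast"],
    "DOID:9352": ["type 2 diabetes mellitus", "adult-onset diabetes"],
    "ORPHA:98896": ["dilated cardiomyopathy"],
}

def suggest_concepts(free_text, cutoff=0.6):
    """Propose terminology concepts for a free-text diagnosis input by
    fuzzy lexical matching; as in the pipeline above, candidates would
    still need confirmation by a medical data administration expert."""
    labels = {label: cid for cid, names in CONCEPTS.items() for label in names}
    matches = difflib.get_close_matches(free_text.lower(), labels,
                                        n=3, cutoff=cutoff)
    return [(labels[m], m) for m in matches]
```

Keeping a human confirmation step after the automatic suggestion, as the abstract describes, is what makes the approach "semi-automated" rather than fully automatic.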
Thus, we aimed to compare screening- and symptom-detected CRCs with respect to molecular pathological characteristics. Mode of detection was assessed in 1,260 CRC patients of a large population-based case-control study from Germany (DACHS) with available information on molecular pathological features analysed in the tumor tissue of the patients (microsatellite instability (MSI), CpG island methylator phenotype (CIMP-high), KRAS and BRAF mutations, estrogen receptor (ER) beta expression). Multivariable logistic regression analyses were employed to calculate adjusted odds ratios (ORs) and 95 % confidence intervals (CIs). The study participants were on average 69 years old. MSI-high, CIMP-high, KRAS and BRAF mutations showed associations with age, sex, stage and location of CRC as expected. In multivariable analyses adjusting for the latter factors and other patient characteristics, CIMP-high CRC was detected more often at screening (17 %) than after occurrence of symptoms (10 %) (OR 1.70, 95 % CI 1.07-2.71). In contrast, CRCs lacking ER beta expression were detected less often at screening (36 versus 48 %; OR 0.56, 95 % CI 0.40-0.80). In this first study exploring differences in major molecular characteristics between screening- and symptom-detected CRCs, screening was less effective in detecting CRCs lacking ER beta expression, a finding with potential prognostic relevance. In 2015, there were around 214 million cases of malaria, which resulted in 438,000 deaths. Over the past 15 years, malaria control efforts have cut the number of deaths in half. Despite the enormous burden on sub-Saharan Africa, which accounted for 90 % of all malaria deaths in 2015, a number of countries on the continent are pursuing malaria elimination efforts. 1 The ability to track and target malaria transmission is vital to the success of malaria elimination programs, especially in elimination settings, where transmission is rare and clustered. 
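For intuition, odds ratios like those reported for the CRC analysis above can be illustrated with a crude unadjusted 2×2-table calculation and a Wald confidence interval; the study's actual estimates come from multivariable logistic regression, and the counts below are invented:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95 % CI from a 2x2 table:
        a = exposed cases,   b = exposed controls,
        c = unexposed cases, d = unexposed controls.
    The CI is exp(log(OR) +/- z * SE(log(OR)))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper
```

A CI that excludes 1.0, as for both CIMP-high (1.07-2.71) and ER beta (0.40-0.80) above, indicates a statistically significant association at the 5 % level.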
2 The DiSARM platform pulls surveillance and intervention data in real time, combines it with climate and environmental data, automates analyses by running spatial models in Google Earth Engine and in R, and produces risk and decision-support maps and task lists. A key feature of the platform is that non-experts can run spatial models, using an interface that requires minimal technical expertise, to produce risk maps. GIS experts are a scarce resource. 3 A mobile app is being developed for in-field use. After piloting DiSARM in Swaziland and Zimbabwe, the goal is to introduce the platform to all Elimination Eight countries in southern Africa. The development of DiSARM involves a number of steps, which have already taken place or are scheduled for this year. Initially, an in-depth evaluation of current malaria surveillance data processing has to be performed. This will be followed by in-country implementation in pilot countries, whose findings will influence the overall development of the platform. A major barrier to implementing DiSARM is that malaria surveillance systems are set up for reporting and monitoring & evaluation. This causes a number of bottlenecks for a quasi real-time application like DiSARM, because data are often transferred to the surveillance system in aggregate and transfer intervals are long. In most cases, manual action is required. Possible enhancements have to be identified during the in-country evaluation to shorten transfer intervals and automate data flows. Both proprietary surveillance systems, for example in Swaziland, and widely used surveillance platforms like District Health Information Software Version 2 (DHIS2), e.g. in Zimbabwe, are common. A process for implementing DiSARM in both scenarios has to be developed. The group of potential end-users for DiSARM is large and has diverse needs. 
A series of use cases has been formulated, and the use case of planning a vector control campaign will be implemented in the beta version. The interface has to be customizable to the different tasks of the different users. The potential of DiSARM in supporting malaria elimination efforts is huge. Pilots will give valuable insight for the roll-out in countries with proprietary and DHIS2-based malaria surveillance systems. Swaziland's malaria surveillance data processing is advanced and offers a good foundation for cloud-based risk mapping, but is not currently set up for real-time data flow. This is the biggest in-country implementation challenge of the platform. Automated data flows could address this. Toward a secure Hospital Information System and secondary use system of patients' data, based on replacement of the HIS in Nagasaki University Hospital Honda M 1 , Matsumoto T 1 1 Nagasaki University Hospital, Nagasaki, Japan Background The promotion of EMR systems, medical information networks among hospitals and clinics, and telemedicine has been a decade-long prime theme in the medical information field in Japan. Every university hospital has already established an EMR system, albeit with differences in quality and scale. But it remains an important problem to construct a Hospital Information System (HIS) that is more effective and more secure, and that supports a BCP (Business Continuity Plan) strategy against a major disaster. In January 2015, we replaced the HIS with a new one offering several features such as improved response times, non-stop operation, and improved safety. To realize the new HIS, we introduced a new infrastructure of virtualized servers, virtualized clients and a finger vein authentication system. We effectively built up not only the core systems of the HIS, such as the EMR, order-entry, and medical accounting systems, but also many subsystems, on the new infrastructure. 
The essential issues in updating the HIS in 2015 were the following: (1) Retaining suitable system responses; in our opinion, this issue is still the most important for sustainable clinical activities, even when a more sophisticated design is accomplished. (2) Continuity of non-stop service through the year; this means that part of the EMR system must keep running even during an annual planned power cut. (3) Three principles (regulations in Japan) for EMR systems; the EMR system has to implement several functions that satisfy the three principles of authenticity, preservation, and visual readability. (4) Improvement of information security functions including user authentication; each EMR client terminal is provided with a finger vein authentication device, and so far there have been no failures of this system. (5) DWH system for secondary use of patients' data; its construction and operation have several features: (a) an interface tool called ''Ensemble'' for an effective transformation step from the intermediate database to the DWH database, (b) use of nursing data and others as well as EMR data for the DWH database, and (c) use of PDF files as well as text files for the DWH database. (6) DR (Disaster Recovery) strategy as a BCP measure; the 42 National University Hospitals in Japan have introduced a remote backup system since 2014, and NUH joined this project as part of its BCP strategy. Such a remote backup system is essential to a BCP strategy, but difficult problems remain in its practical management. (7) Enhanced functionality of the regional medical information network called ''Ajisai net'' in the Nagasaki District; this is one of the most scalable and successful regional networks in Japan. Almost 25 hospitals and 250 clinics share patient information. In this article, we presented the total configuration and the essential points of the virtualized, secure HIS at NUH. 
To achieve a more secure and comfortable situation, effective management will be as important as the installation of sophisticated software and hardware. At the poster presentation, we will show concrete outcomes and problems. Non-specific effects of vaccinations and infections on humoral immune response in children: a secondary analysis based on data from a birth cohort study in Germany Background Epidemiological studies in low-income countries suggested that immunization with certain vaccines can have non-specific effects (NSE) on all-cause morbidity and mortality in children [1]. These effects could be attributed to non-specific modulation of the immune system [2, 3]. As vaccinations protect directly from infections, and infections have also been shown to modulate the immune system non-specifically [4], it is important to take infections into account when assessing NSE of vaccines. We examined whether routine childhood vaccinations and the individual infection history in the first two years of life are associated with NSE measured by levels of humoral response at the age of two years among children in a high-income country. We used data from the German LISAplus birth cohort study (Influence of Life-style factors on Development of the Immune System and Allergies in East and West Germany plus the influence of traffic emissions and genetics). Between 1997 and 1998, 3,094 healthy term newborns were recruited in four German regions (Munich, Wesel, Leipzig, and Bad Honnef). Of the 3,094 newborns, 2,664 (86.1 %) were followed up until the age of two years, and from 2,176 (70.3 %) blood samples were obtained at the age of two years [5]. We used information about vaccinations and infections from birth until the age of two years from self-administered questionnaires. The timing of infection and vaccination was coded as occurring during the first, the second, or both years of life. 
Children who were not exposed to a respective vaccination or infection during the entire observation period constituted the reference group. Humoral immune response was assessed by twelve IgG-antibody titers at the age of two years. We applied multivariable linear regression with backward selection to assess the effect of vaccinations and infections on IgG-antibody titers (one model for each titer; Box-Cox and z-transformed). All models were adjusted for sex of the child, age of the mother at birth, study region, and education of the parents. For 2,123 (68.6 %) children information was available for our analysis. For nine of the twelve investigated IgG titers, significant effects (p < 0.05) of certain vaccines and infections were observed. Wearable sensors for objective long-term measurement of physical activity, sleep quality and other parameters are increasingly used in large cohort studies such as the UK Biobank, the Tromsø Study and the German National Cohort. These devices provide large amounts of potentially valuable data. There are, however, several open issues related to data acquisition and analysis methodology, outcome parameters, device usability and availability of raw data. This workshop emphasizes the first of these issues and aims to discuss limitations and possible solutions of wearable sensor technology regarding quality assurance and data analysis in particular, and the harmonization of the methodology in general. Stimulated by invited presentations related to NOKO as an example, given by authors of this workshop proposal, guided discussions with all participants are going to take place. The workshop will conclude with practical recommendations for researchers involved in cohort studies using wearable sensors, and for those who need to analyze sensor data. Background Meta-analysis of diagnostic studies has been an active area of statistical research in recent years. 
While there exist many different approaches for the meta-analysis of a diagnostic test that collect a single pair of sensitivity and specificity from each study, methods for estimating full summary ROC curves are needed if several pairs of sensitivity and specificity, each one for a different threshold, are available. We propose a novel model for the meta-analysis of full ROC curves and compare it to an established simple approach suggested by Littenberg and Moses [1] which does, by construction, only use a single pair of sensitivity and specificity per study. Our idea is to estimate a summary ROC curve based on a bivariate model with random effects for interval-censored data, additionally allowing for modelling the underlying distributions of the diagnostic test values. Thereby we implement three possible distributions: the Weibull, log-normal and log-logistic distribution. In a small simulation study we compare our new model to the Littenberg/Moses model. Finally, all models are applied to an example data set where a screening method for diagnosing type 2 diabetes is evaluated. There, the simple analysis discards more than 70 % of the available observations. In terms of bias of the estimated AUCs, our simulation study shows that the model of Littenberg and Moses performs worse than our new model. To be concrete, in some scenarios biases of 12 percentage points are found, as compared to a maximum bias of 1.6 percentage points for our approach. For the example data set, our new model estimates an AUC of nearly 83 %. The rise of consumer health awareness and the recent advent of personal health management tools (including mobile and health wearable devices) have contributed to another shift transforming the healthcare landscape. Despite the rise of health consumers, the impact of user-generated health data remains to be validated. In fact, many applications hinge on the interpretability of this sort of data. 
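The Littenberg and Moses model referenced above fits a straight line D = a + b*S per study, where D = logit(TPR) - logit(FPR) and S = logit(TPR) + logit(FPR), and back-transforms it into a summary ROC curve. A minimal sketch of that baseline approach with invented (TPR, FPR) pairs, not the proposed interval-censored bivariate model:

```python
import math

def logit(p):
    """Log-odds transform."""
    return math.log(p / (1 - p))

def littenberg_moses(pairs):
    """Fit the Littenberg-Moses line D = a + b*S by ordinary least
    squares, where per study D = logit(TPR) - logit(FPR) and
    S = logit(TPR) + logit(FPR); return a summary ROC curve FPR -> TPR."""
    D = [logit(tpr) - logit(fpr) for tpr, fpr in pairs]
    S = [logit(tpr) + logit(fpr) for tpr, fpr in pairs]
    n = len(pairs)
    sbar, dbar = sum(S) / n, sum(D) / n
    num = sum((s - sbar) * (d - dbar) for s, d in zip(S, D))
    den = sum((s - sbar) ** 2 for s in S)
    b = num / den
    a = dbar - b * sbar

    def sroc(fpr):
        # Solve D = a + b*S for logit(TPR) at the given FPR,
        # then map back through the inverse logit.
        t = (a + (1 + b) * logit(fpr)) / (1 - b)
        return 1 / (1 + math.exp(-t))

    return sroc
```

Because this method keeps only one (sensitivity, specificity) pair per study, it discards the additional thresholds that the proposed model exploits, which is exactly the limitation the abstract addresses.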
The aim of this panel is two-fold. First, the panel aims to review the key dimensions of interpretability, spanning from quality and reliability to information security and trust management. Second, since similar issues and methodologies have been proposed in different application areas ranging from clinical decision support to behavioral interventions and clinical trials, the panelists will also discuss both the success stories and the areas that fall short. The opportunities and barriers identified can then serve as guidelines or action items individuals can bring to their organizations to further improve the interpretability of user-generated data. Workshop on interdisciplinary approaches for using visualization for wellness and clinical decision support Hsueh PY 1 , Demiralp C 1 , Zhu X 1 Background Visualization-enhanced decision support tools have the potential to revolutionize the way researchers and practitioners make sense of health data for wellness and clinical decision support. However, as it is a highly interdisciplinary area that spans health informatics, computer science and human-computer interaction, its progress has been hindered by the lack of interactive tooling that allows easy interpretation and incorporation into workflows. In this workshop, we will first provide an overview of the principles and methodologies of visualization and then dive into case studies on wellness and medication applications. The presentations will be followed by a breakout session, in which workshop participants will be divided into three groups to conduct hands-on exercises. In particular, the workshop participants will build on their first-hand experience of using interactive visualization to discover segmentations and patterns from open data. After the breakout session, each group will report on their experiences and assessment of the pros and cons of such methods for wellness and decision support. 
The workshop will conclude with a short presentation showcasing action maps going forward for presenting the current and future self for decision support. Cloud computing is a model of self-service, on-demand network access enabling the delivery of computing resources and services. Contemporary forecasts show that the technology is simultaneously evolving and maturing; the healthcare cloud market is expected to grow at a CAGR of about one third through 2018. For companies transitioning their enterprise operations to the cloud, the technology brings a variety of opportunities and challenges, which we identified in our systematic literature review (Management, Technology, Security, Legislative). Consequently, using a case study review, the research provides insights emphasizing cloud complexity and superposition, which seems to balance potential organizational limitations. Furthermore, the results give an overview of the healthcare areas where cloud computing has already undergone significant development (bioinformatics) or is expected to widen in the near future (cardiology). Managing healthcare data complexity with graph databases Since the introduction of diagnosis-related groups in the German healthcare system, classifying patient diagnoses and procedures with controlled vocabularies has become mandatory, thus creating a large dataset for secondary use in biomedical research. In this paper we present the analysis of an ICD dataset with regard to potentially reimbursement-motivated classification and the effects on precision and recall when considering the change history of ICD codes. 
Identifying the basis for clinical decisions-a feasibility study Hurlen P 1 , Ofstad EH 1 , Gulbrandsen P 2 1 Akershus University Hospital, Lørenskog, Norway; 2 University of Oslo, Oslo, Norway This study explored the possibility of defining a set of terms to describe and identify the basis for clinical decisions in a set of transcriptions from clinical encounters with previously identified decisions. The paper presents the considerations behind the exploratory study and considerations for further work. Can Big Data solve small problems? Paper use in a paperless hospital Hurlen P 1 , Pedersen J 1 1 Akershus University Hospital, Lørenskog, Norway A ''Big Data'' approach was used in part to study the use of printed paper in a 700-bed paperless hospital, and in part to study the usefulness of the approach. Between 1.2 and 1.5 million pages were printed each month, corresponding to 10 % of a citizen's monthly paper use. The identified use of printed paper did not seem to be high compared to other organisations. The big data approach was not able to answer all our questions, primarily because we did not get log data for the source programmes for the printing. The approach could consequently not provide the data needed to reduce paper printing. Overview of P4P systems outside Germany and their implication on clinical documentation Universitätsklinikum Erlangen, Erlangen, Germany One of the objectives of the new German Hospital Structures Act is to strengthen the quality of hospital care while controlling costs. Quality-related aspects will play a role in hospital remuneration. Specific quality surcharges or reductions in reimbursement will be introduced. These ''pay-for-performance'' (P4P) concepts are based on quality indicators to be developed and implemented in German hospitals. In this way, the quality of care in German hospitals should be objectively assessed using predetermined criteria. 
In some Anglo-American countries, P4P concepts in the healthcare sector are already being used. In a survey conducted among the member nations of the International Federation of Health Information Management Associations (IFHIMA), experiences with and the impact of P4P programs on clinical documentation are to be examined. The survey shall in particular contain the following items: (a) Do remuneration systems depending on quality indicators exist? (b) If these exist: -What criteria matter most, and what data are collected to evaluate the quality of health care objectively? -How does the requirement for documentation of quality indicators impact clinical documentation practices? -Does the hospital use a hospital reporting system based on particular indicators? -Do national benchmarking approaches exist now? (c) If these don't exist: -Are P4P concepts currently under discussion? -If so, are they based on predefined quality indicators? The survey will be carried out among the following IFHIMA member nations: USA, Canada, Australia, Japan, Korea, UK and Spain. The results of the survey described above will be presented and discussed in this presentation. Specifically, to what extent these findings might be applied to German hospitals will be reviewed. Development and validation of a cardiovascular risk score for aged patients based on the getABI dataset Häusler U 1 , Trampisch HJ 1 1 Institut für Medizinische Informatik, Biometrie und Epidemiologie der Ruhr Universität Bochum, Bochum, Germany Background Scores for cardiovascular disease (CVD) are an important part of primary prevention and the promotion of a health-conscious life style. No score is specifically adapted to elderly patients, even though Interheart (Anand et al. 2008) reports that classical risk factors are less important for patients older than 60. Therefore our rationale was to validate currently used scores and develop a new one for CVD in Germany within an aged population. 
Based on the getABI study, 4,603 participants without preexisting cardiovascular events were selected. 1,533 patients were chosen for validation on the basis of different study centers. This allowed a quasi-external (domain) validation of the developed score. The remaining patients were used for recalculation of the currently used scores and for the new development. Recalculation was done by logistic regression using the original risk factors and their transformations. For the development of the new score, logistic regression combined with best subset selection or backward reduction and lasso regression was used. 20 of the 28 initially considered risk factors (including medication) were selected for the data-driven parameter selection algorithms. Transformations for nonlinear parameters were done using restricted cubic splines. Performance measures included AUC, Brier score, Hosmer-Lemeshow statistics and decision curve analysis. External validation of the widely used CVD scores demonstrated low discrimination, with AUC values around 0.67 (SCORE 0.67, Arriba 0.66, FramCVD 0.67). Recalibration and recalculation did not largely improve this (SCORE 0.66, Arriba 0.67, FramCVD 0.68). The newly developed score included 10 parameters for males and 13 for females. The score showed significantly better discrimination, with an AUC of 0.71 (p < 0.031) in the validation data set, and adequate calibration (Hosmer-Lemeshow test, Chi square 15.7219, p > 0.072). The currently used scores for CVD have discrimination values that do not allow individual predictions. With regard to general CVD risk score development in older people, simple recalibration or recalculation of scores developed for a middle-aged population is not sufficient. Further predictors relevant for this age group are necessary. The newly developed score is an improved alternative for people older than 65. The new score can be generalized to a German population attending general practitioners. 
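Among the performance measures listed above, the Brier score is the simplest: the mean squared difference between predicted risk and the observed 0/1 outcome. An illustrative sketch (invented predictions, not getABI data):

```python
def brier_score(probs, outcomes):
    """Brier score: mean squared difference between predicted
    probabilities and observed binary outcomes. Lower is better;
    0.25 equals an uninformative constant prediction of 0.5."""
    assert len(probs) == len(outcomes) and probs
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)
```

Unlike the AUC, which captures only discrimination (ranking), the Brier score is also sensitive to calibration, which is why the study reports both alongside the Hosmer-Lemeshow test.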
Even though the new score is clearly better than the currently used CVD scores in aged patients, it should be used with care. An AUC of 0.71 still does not allow individual predictions on the basis of the score alone.

On the occasion of EFMI's 40th anniversary, the goal of this panel is to point out accomplishments in the history of biomedical and health informatics and to anticipate a number of challenges faced by our field for its future advancement. The panelists will elaborate on R&D achievements capitalizing on technological opportunities, professional curiosity and increasing citizen engagement, and point to further important opportunities for the coming years. We will stimulate discussion of how future advances and challenges of informatics can be influenced by lessons from the past and critical reflection on current practice: to learn from the past and not to "reinvent the wheel" where past experiences can help guide future developments. This relates to articulating how innovation happens, and how to provide added value in clinical practice and for empowered health care consumers.

Applying the SNOMED CT concept model to represent value sets for head and neck cancer documentation Højen AR 1 , Brønnum D 2 , Elberg P 1 , Gøeg KR 1 1 Aalborg University, Aalborg, Denmark; 2 Department of Oncology, Aalborg University Hospital, Aalborg, Denmark

This paper presents an analysis of the extent to which SNOMED CT is suitable for representing data within the domain of head and neck cancer. In this analysis we assess whether the concept model of SNOMED CT complies with the documentation needs of this clinical domain. Attributes from the follow-up template of the clinical quality registry for Danish Head and Neck Cancer, and their respective value sets, were mapped to SNOMED CT using existing mapping guidelines. Results show that post-coordination is important for representing specific types of value sets, such as absence of findings and severities.
The concept model of SNOMED CT was found suitable for representing the value sets of this material. We argue for the development of further mapping guidelines for consistent post-coordination and for initiatives that demonstrate the use of this important terminological feature in actual SNOMED CT implementations.

Towards an international framework for recommendations of core competencies in nursing and inter-professional informatics: the TIGER competency synthesis project

Background Informatics competencies of the health care workforce must meet the requirements of inter-professional, process- and outcome-oriented provision of care. In order to help nursing education transform accordingly, the TIGER Initiative deployed an international survey, with participation from 21 countries, to evaluate and prioritise a broad list of core competencies for nurses in five domains: (1) nursing management, (2) information technology (IT) management in nursing, (3) inter-professional coordination of care, (4) quality management, and (5) clinical nursing. Informatics core competencies were found to be highly important for all domains. In addition, this project compiled eight national case studies from Austria, Finland, Germany, Ireland, New Zealand, the Philippines, Portugal, and Switzerland that reflected core competencies from the country-specific perspective. This project takes a unique approach, as it is the first international effort to identify core informatics competencies for nurses in various roles, inclusive of inter-professional coordination of care and quality management. These findings will lead to an international framework of informatics recommendations.

Online tool for pharmacoepidemiology: big data exploration for drugs and cancer

Background Pre-marketing drug evaluation processes are still largely based on formal, extremely expensive randomized clinical trials and manual information collection processes that cover a relatively small sample of patients.
Moreover, rapid change in health information technology has dramatically increased the amount of accumulated health data. These data have become an important resource that provides an extraordinary opportunity to observe the emergence of new knowledge and its influence, particularly in drug repositioning and cancer prevention. Building on this potential, we aimed to develop an online informatics tool to evaluate the cancer risk of drugs by utilizing medical big data. Data source: We use Taiwan's National Health Insurance Database, which covers the health information, including characteristics and all drug information (e.g. prescriptions), of the 23 million Taiwanese population. Front-end development: The web-based interface was developed using PHP and JavaScript. In addition, we incorporated the evidence-based medicine (EBM) level 3 guidelines for observational studies, such as cohort, case-control, and/or self-controlled case series designs, to support users interacting with the system. Back-end development: A package of Apache, MySQL and PHP was used to build the server side of the system. We integrated the Elasticsearch API into our system in order to search and analyze data immediately. An example of transforming Taiwan NHI data to person-level data is shown in Box 1. We also integrated an analytics package (i.e. the R package) to perform the statistical analysis for a given study. This online analytical tool can massively explore and visualize big data on long-term drug use and cancers through the OMOSC system, enabling mass online studies of long-term drug use and cancer risk. It can guide health care professionals who lack data-mining skills in conducting such studies. The constructed online system automatically generates cases and controls from large databases for long-term drug exposures and cancer risk.
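For illustration, once such a system has tabulated exposed/unexposed cases and controls, an odds ratio with a Woolf 95 % confidence interval can be computed from the 2x2 table as follows (all counts here are hypothetical, not OMOSC output):

```python
# Minimal sketch: odds ratio and Woolf 95 % CI from a 2x2
# exposure/outcome table, as reported by case-control systems.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: exposed controls,
    c: unexposed cases, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)  # hypothetical counts
print(f"OR = {or_:.2f} (95 % CI {lo:.2f}-{hi:.2f})")
```

Adjusted odds ratios, as mentioned below, would instead come from a logistic regression that includes the selected confounders.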
The results are shown as odds ratios (OR); if confounding factors are selected, adjusted odds ratios (AOR) with 95 % confidence intervals (CI) are also provided for risk estimation. We used SAS statistical software on the same dataset to validate the OMOSC system results. The system enables massive online studies, saving time and cost. It is capable of drug and cancer risk evaluation on a societal scale, a big challenge that is approachable and achievable and has implications for those developing or using medications. Since clinical trials are often impossible to conduct due to cultural, cost, ethical, political or social obstacles, this kind of research model could play an important role in the health care industry by providing excellent opportunities for solving the technological, informatics, and organizational issues involved in evaluating drugs across broad domains by utilizing large-scale databases.

Designing mHealth tools empowering people with sickle cell disease

The global burden of sickle cell disease (SCD) is increasing, affecting about 100 million people. However, the disease is still neglected and lacks recognition. mHealth solutions have demonstrated their ability to support the self-management of chronic diseases. Only few studies related to SCD have been conducted, and no mHealth tools encompassing the comprehensive needs of people with SCD exist. The first objective of this study is to collect SCD patients' specific needs and to assess how mHealth tools can support them in their daily lives. In order to reflect patients' needs we performed a literature review and developed a web survey. The questionnaire was filled in by 33 patients worldwide. Three focus groups involving 4 patients were conducted, and finally, software requirements were extracted from the interviews and the focus groups.
The literature review highlighted that the disease's negative effects can be reduced if patients take simple and practical actions to maintain their health. The survey results confirmed the sub-optimality of the care received by patients (72.7 %), which motivates patients to avoid healthcare utilization. Patients demonstrated a clear interest in using tools such as mobile apps (81.2 %), web apps (45.5 %) and wearable devices (39.4 %) to support their self-care needs. To illustrate the findings and to establish a foundation for future research, a prototype of a mobile app has been designed. It exemplifies patients' requirements refined during the focus groups and takes account of the survey and literature results. Considering the wide range of self-management needs of people with SCD, an implementation of the suggested prototype may have the potential to encourage specific interventions improving patients' quality of life. Finally, extensive testing, usability studies and randomized controlled trials will help to gain clinical approval and patients' compliance.

Calculation of age-specific population-based reference intervals Ittermann T 1 , Schmidt CO 2 , Völzke H 3 1 Universitätsmedizin Greifswald, Greifswald, Germany; 2 Universität Greifswald, Greifswald, Germany; 3 Ernst-Moritz-Arndt-Universität, Greifswald, Germany

Reference intervals are widely applied in clinical practice, particularly for biomarkers. If a value outside of a reference interval is observed in an individual, the risk for a disease may be increased, indicating potential diagnostic significance. However, treatment decisions should not be drawn solely from a value outside of a reference interval, because reference intervals have no prognostic significance by themselves. Reference interval calculation requires two steps. First, a healthy population has to be defined. For this, data from large population-based studies are ideal, since exclusion criteria can be derived from high-quality study data.
Not only diagnosed but also previously undiagnosed diseases can be considered when defining a healthy population. Commonly, the reference interval for a parameter of interest is defined by its 2.5th and 97.5th percentiles in the healthy population. Recently, quantile regression has received increasing attention as a tool for reference interval calculation, because percentiles (e.g. the 2.5th and 97.5th) can be modeled directly. A further advantage of quantile regression is the potential for individualized reference intervals obtained by introducing variables such as age, sex, smoking status and body mass index into the model. Thus, reference intervals of a parameter can be given for subgroups of interest. Furthermore, fractional polynomials can be used to model non-linear associations between explanatory variables, e.g. age, and the respective percentile of the outcome. Two examples of reference interval calculation based on data from the Study of Health in Pomerania (SHIP) will be presented. Examples will include serum TSH levels and parameters of cardiopulmonary

Statistically significant interactions of the aMED with sex (p = 0.02) and ethnicity (p = 0.04) were observed for CRC-specific survival. A higher aMED score was associated with better CRC-specific survival in women (HR1SD = 0.86; 95 % CI 0.77-0.96; p-trend = 0.008), but not in men (HR1SD = 1.01; 95 % CI 0.92-1.11; p-trend = 0.84). The protective relation for the aMED was limited to African Americans and to colon, but not rectal, cancer. Removing total fruits and grains from the scores reduced the protective associations more than any other food group. No significant associations with CRC-specific survival were detected for the HEI-2010, AHEI-2010 or DASH indexes. Results were similar for all-cause mortality. The current findings indicate a weak influence of prediagnostic dietary patterns on survival in CRC cases; primarily the aMED was associated with better survival, possibly due to higher consumption of fruits and grains.
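Returning to the reference-interval methodology above: the quantile-regression approach can be sketched on synthetic data (not the SHIP cohort; the marker, effect sizes and the linear age term are invented for illustration):

```python
# Illustrative sketch: model the 2.5th and 97.5th percentiles of a biomarker
# as functions of age with quantile regression, yielding age-specific
# reference limits. Data are simulated, not from SHIP.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
age = rng.uniform(20, 80, 2000)
marker = 1.5 + 0.01 * age + rng.normal(0, 0.4 + 0.005 * age)  # heteroscedastic
df = pd.DataFrame({"age": age, "marker": marker})

lower = smf.quantreg("marker ~ age", df).fit(q=0.025)  # 2.5th percentile
upper = smf.quantreg("marker ~ age", df).fit(q=0.975)  # 97.5th percentile

for a in (30, 50, 70):
    lo = float(np.asarray(lower.predict(pd.DataFrame({"age": [a]})))[0])
    hi = float(np.asarray(upper.predict(pd.DataFrame({"age": [a]})))[0])
    print(f"age {a}: reference interval {lo:.2f}-{hi:.2f}")
```

A real analysis would add further covariates (sex, smoking, BMI) and fractional-polynomial terms for non-linear age effects, as the abstract describes.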
Background and aims In Germany, Legionella spp. is among the most important causative agents of (sporadic) community-acquired pneumonia (CAP) ("community-acquired Legionnaires' disease", CALD). Due to underreporting and a lack of source identification, it is unknown how many cases of CALD are due to contaminated drinking water in residential plumbing systems. The recently reformed drinking water ordinance aims to reduce Legionella contamination in the plumbing systems of residential buildings, excluding the household level. The aims of the study are (1) to estimate how frequently community-acquired pneumonias are caused by Legionella, (2) to examine how frequently the case-specific source of infection can be identified, and (3) to investigate, among cases of CALD acquired at home, to what extent (i) host (e.g. profession, behavior), (ii) environmental (drinking water contamination), and (iii) Legionella strain characteristics contribute to CALD acquisition through residential drinking water at home. Methods This prospective case-control study will take place from 2016 to 2019 in the German states of Berlin and Brandenburg. We will cooperate with hospitals to estimate the proportion of CAP that is CALD and to identify the causative strain of Legionella. After identification and reporting of a case of CALD, in cooperation with local health departments, the German Environment Agency and the Reference Laboratory for Legionella, we will initiate the following investigations: (1) an extensive case interview, (2) sampling and assessment of the drinking water plumbing system of the case, (3) sampling of potential other sources of infection. We will attempt to match environmental and case patients' Legionella cultures through typing methods. Two controls will be matched for age (5-year age groups), sex, smoking status, comorbidities and district. Controls will undergo the same questionnaire and environmental investigations as cases.
Expected results (1) The percentage of CAP that is CALD, (2) the percentage of CALD with an identified (typable) Legionella strain, (3) the percentage of CALD that is due to contaminated residential drinking water (CALD-DW), (4) risk factors for CALD-DW, such as contamination at the household level versus contamination in the house plumbing system outside the household level, and (5) an estimation of the proportion of CALD that is preventable through full implementation of the drinking water ordinance. Expected conclusion It is hoped that the anticipated results will allow conclusions about the causal distribution of CALD and contribute to optimizing the application of the drinking water ordinance.

An ontology-based scenario for teaching the management of health information systems Jahn F 1 , Schaaf M 1 , Kahmann C 1 , Tahar K 1 , Kuecherer C 2 , Paech B 2 , Winter A 1 1 Institut für Medizinische Informatik, Statistik und Epidemiologie, Universität Leipzig, Leipzig, Germany; 2 Software Engineering Heidelberg, Institute for Computer Science, University of Heidelberg, Heidelberg, Germany

The terminology for the management of health information systems is characterized by complexity and polysemy, which is challenging for both medical informatics students and practitioners. SNIK, an ontology of information management (IM) in hospitals, brings together IM concepts from different literature sources. Based on SNIK, we developed a blended learning scenario to teach medical informatics students IM concepts and their relationships. In proof-of-concept teaching units, students found the use of SNIK in teaching and learning motivating and useful. In the next step, the blended learning scenario will be rolled out to an international course for medical informatics students.
Dietary patterns and type 2 diabetes: systematic review and meta-analysis Jannasch F 1,2 , Kröger J 1,2 , Schulze M 1,2 1 German Institute of Human Nutrition Potsdam-Rehbruecke, Nuthetal, Germany; 2 Deutsches Zentrum für Diabetesforschung (DZD), Neuherberg, Germany

During the last decade, the focus in nutritional science has been set on dietary patterns, in addition to single foods, to determine associations with health outcomes. Existing reviews and meta-analyses concentrated on a selection of approaches and included limited numbers of studies. The aim of this systematic review is therefore a comprehensive overview of dietary patterns evaluated in prospective studies with regard to their association with type 2 diabetes (T2D). The literature search was performed in the electronic databases MEDLINE and Web of Science. The systematic review summarized each dietary pattern approach separately, and risk ratios were included in meta-analyses according to the applied approach and their comparability, using the Cochrane software package Review Manager 5.3. Depending on the heterogeneity between studies, fixed- or random-effects models were chosen. There is strong evidence for diabetes risk reduction by dietary patterns reflecting a Mediterranean diet, DASH or the AHEI. Evidence from exploratory patterns, although population-specific, supports that diets high in red and processed meat, refined grains, high-fat dairy, soft drinks and fried products and low in vegetables, legumes, fruits, whole grains, nuts and seeds, low-fat dairy, poultry and fish increase diabetes risk. As evidence is limited, further research should assess the generalizability of these exploratory approaches. The strongest associations with diabetes risk were observed for patterns derived by RRR, which highlights the usefulness of diabetes-related biomarkers in the evaluation of dietary patterns.
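As a sketch of the random-effects pooling mentioned above, a DerSimonian-Laird combination of log risk ratios (analogous to what Review Manager computes) looks like this; the study values are invented, not taken from the review:

```python
# Minimal DerSimonian-Laird random-effects meta-analysis of risk ratios.
# Inputs are study risk ratios and standard errors of the log risk ratios;
# all numbers below are hypothetical.
import math

def pool_random_effects(rrs, ses):
    y = [math.log(r) for r in rrs]                 # log risk ratios
    w = [1 / s ** 2 for s in ses]                  # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    dfree = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - dfree) / c)               # between-study variance
    w_star = [1 / (s ** 2 + tau2) for s in ses]    # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

rr, lo, hi = pool_random_effects([0.80, 0.72, 0.91], [0.10, 0.12, 0.15])
print(f"pooled RR = {rr:.2f} (95 % CI {lo:.2f}-{hi:.2f})")
```

With low heterogeneity (Q below its degrees of freedom), tau² collapses to zero and the result coincides with the fixed-effect estimate, which is why the model choice depends on heterogeneity, as noted above.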
Effects of lignans and isoflavones on the cell proliferation marker Ki-67 and HER2 receptor status Jaskulski S 1 , Rudolph A 1 , Sinn P 2 , Thöne K 3 , Flesch-Janys D 3 , Chang-Claude J 1

Our findings showed that higher isoflavone exposure, indicated by higher serum GEN levels, is associated with reduced cell proliferation. This could partially explain the protective effect of isoflavones on breast cancer prognosis. Further research is warranted to elucidate the differential associations between isoflavones and Ki-67 expression by tumor prognostic markers for breast cancer.

Data in an electronic health record must be recorded once, in a standardized and structured way at the point of care, to be reusable within the care process as well as for secondary purposes (the 'collect once, use many times' (COUMT) paradigm). COUMT has not yet been fully adopted by staff in every organization. Our study intends to identify concepts that underlie its adoption and to describe its current status in Dutch academic hospitals. Based on the literature we constructed a model that describes these concepts and that guided the development of a questionnaire investigating COUMT adoption. The questionnaire was sent to staff working with patient data or records in seven of the eight Dutch university hospitals. Results show a high willingness of end-users to comply with COUMT in the care process. End-users agree that COUMT is important and that they want to work in a structured and standardized way. However, end-users indicate that they do not actually use terminology or information standards, often register diagnoses and procedures in free text, and experience repeated recording of data. In conclusion, we found that COUMT is currently well adopted in mind, but not yet in practice.
Emergency data management from an international perspective: concept, practice, lessons learned and the way forward

Introduction A significant number of problems in emergency care are caused by a lack of provider access to pre-existing patient information at the point of care [1, 2]. A particular difficulty is access to information on the medical history of patients from other countries. This is due to a number of reasons, such as language barriers, difficulty in accessing the dataset over the world wide web, differences in expectations about the content of the data, and usability and workflow differences between emergency departments in different countries. This workshop aims to discuss possibilities and requirements for setting up a globally available emergency data set. On the basis of our work in this important area of research, we will present and discuss the following items: a review of current national activities (e.g. the emergency data set for the German Electronic Health Card developed by the German Medical Association) as well as international projects (e.g. the European epSOS project) in deploying emergency datasets; previous work in assessing the information needs and content of emergency data sets as a basis for designing information exchange; methods for assessing the ability of emergency data sets to address the usability and workflow needs of health care providers using the data to support decision making; validation of the emergency data set developed by the German Medical Association with clinicians in Canada to provide an international context; and core requirements for a successful implementation of international emergency data management. The results of the workshop are intended to form the basis of a concept for how emergency data can be used in an international context. There are many different efforts to improve the availability of electronic patient data in case of emergency.
However, the usability of the different approaches is mostly limited to the borders of the particular country, so that the emergency data are not available when traveling abroad. The workshop results can help to overcome these limitations.

Snapshot on infection protection for asylum seekers in North Rhine-Westphalia Jurke A 1 , Thole S 1 , Daniels-Haardt I 2 1 Infectiology and Hygiene, NRW Centre for Health, Münster, Germany; 2 Health Protection, Health Promotion, NRW Centre for Health, Münster, Germany

In 2015, 16.0 % of the population of the EU countries lived in Germany, yet in June 2015, 37.6 % of asylum seekers in EU countries made their application in Germany. Within Germany, 17.0 % of asylum seekers are allocated to North Rhine-Westphalia (NRW). This high number of people seeking asylum is currently challenging the national asylum system in an unprecedented way. Adherence to standard procedures of accommodation and allocation is hindered at the federal state and regional level. According to the German Asylverfahrensgesetz (§62, AsylVfG), asylum seekers living in accommodation centres are obliged to undergo a basic medical examination in order to exclude communicable diseases. The procedure of this examination is defined by the federal state health authorities. Health authorities in NRW recommend a set of vaccinations for refugees upon arrival. The federal states' vaccination guidance does not include all age-specific vaccinations recommended by the German Standing Committee on Vaccination (STIKO). Particular attention is paid to the prevention of diseases with public health significance. There are scarce data on the number of asylum seekers, their age, origin (Asylgeschäftsstatistik of the Federal Office for Migration and Refugees), their participation in the basic medical examination and the resulting diagnoses. In Germany, a set of infections is mandatorily notifiable to local health departments according to the German Protection against Infection Act (IfSG).
These notified cases should include information on the person (i.e. whether they are an asylum seeker). Nearly seventy percent of the applicants in 2013 and 2014 were younger than thirty years, most of whom were children under the age of sixteen. In 2015, the NRW notification data according to IfSG indicate that 36.3 % of infections affecting asylum seekers were caused by gastrointestinal pathogens, 25.1 % by the M. tuberculosis complex, 18.0 % by Varicella zoster virus, 14.8 % by hepatitis (A, B, C) viruses and 5.8 % by other pathogens (infection database, NRW Centre for Health, 27 November 2015). 46.3 % of infections affected children under the age of fourteen. The most vulnerable group among the immigrants are children and young people. Inevitably, the local departments of youth and family services have to take special care of unaccompanied minor migrants. Age-specific vaccinations are to be completed as soon as people enter the regular health care system. Even if there is underreporting of infectious diseases, the risk of importation of non-endemic infections via mobile populations seems to be rather low. Migrants, like the resident population, mostly come down with ordinary infections such as diarrhoea and vaccine-preventable diseases. The mandatory TB screening at allocation to accommodation centres may result in the detection of cases of TB infection needing follow-up, which may be challenging. The most important measures of infection prevention are access to medical care, early vaccination and the implementation of hygiene guidelines in the accommodations for asylum seekers.

Background The unexplained heritability of type 2 diabetes is high, suggesting that low-frequency or rare genetic variants might explain additional proportions. Metabolites associated with type 2 diabetes may aid the identification of new risk variants for type 2 diabetes. We used the Illumina HumanExome v1.1 Bead Array to genotype individuals from the German EPIC-Potsdam study.
Within a representative subsample (n = 2500), we exploratively examined the associations between single genetic variants and diabetes-associated metabolites measured in serum samples by targeted metabolomics (Biocrates). Findings (p < 1.64 × 10^-7) were replicated within the German KORA F4 study population. For the replicated genetic variants, associations with type 2 diabetes risk were investigated within the EPIC-Potsdam case-cohort (n = 2891; n cases = 758) and by look-up in DIAGRAM and other consortia for type 2 diabetes. We identified one new association between rs499974 (MOGAT2) and a diacyl-phosphatidylcholine ratio (PC aa C40:5/PC aa C38:5) and identified seven associations of loci which have been previously described with regard to metabolites, but found new ratios including sphingolipids: rs7412 (APOE) and SM (OH) C22:2/SM (OH) C22:1; rs7157785 (SGPP1) and SM C16:1/PC aa C28:1, SM (OH) C22:2/SM C24:0, SM (OH) C22:2/SM (OH) C14:1, SM (OH) C22:2/SM (OH) C22:1; rs364585 (SPTLC3) and SM C16:1/SM C18:0, SM (OH) C22:2/SM C16:1. Previously known variants are located in FADS1-2, CPS1, REV3L, NTAN1 and PNLIPRP2. Four replicated common SNPs (rs3204953, rs174550, rs499974, rs7157785) showed nominally significant associations with type 2 diabetes in DIAGRAM. We identified eight new associations between genetic variants and diabetes-related metabolite traits, providing insights into the metabolic pathways underlying the development of type 2 diabetes.

Identifying ischemic heart disease patients' preferences in obtaining health information: a strategy for clinical information design Health Information Technology Department, Semnan University of Medical Sciences, Semnan, Iran (Islamic Republic of)

Understanding ischemic heart disease patients' preferences in obtaining health information can improve notification procedures and the delivery of appropriate services to these patients, and increases the efficiency and effectiveness of their education.
This study aimed to identify ischemic heart disease patients' preferences in obtaining health information for clinical information design. This cross-sectional, descriptive-analytic study was performed on 200 patients with ischemic heart disease who were admitted in 2015. The measurement tool was a valid and reliable researcher-made questionnaire. Data were analyzed using chi-square tests, logistic regression and path analysis. Results 115 (57.5 %) of the patients needed this information. The highest average score (2.13) was related to the field of psychological problems. The results showed that age (OR = 3.509), job (OR = 20786), education (OR = 5.187) and income (OR = 3.774) had a direct, positive and significant relationship with the need for information in univariate analysis (P < 0.05); moreover, sex (OR = 2.756) had a direct, positive and significant relationship with the need for information in multivariate analysis (P = 0.007). The findings of this study indicate unsatisfied learning needs and the perceived importance of the information for the patients. The results show that some patient characteristics are significant determinants of an increased willingness to obtain health information. Awareness of this fact underlines the importance of effectively providing quality information to this category of patients.

Development of a diagnostic model for early differentiation of sepsis and SIRS in pediatric patients: a data-driven approach using machine-learning algorithms

treatment, antibiotics are not indicated and potentially harmful for SIRS patients. Differentiation between SIRS and sepsis, however, is difficult in a clinical setting, as the SIRS definition is part of the diagnostic criteria for sepsis. Furthermore, age-specific changes in symptoms and laboratory parameters hinder the differentiation of the two entities in pediatric patients.
We aimed to develop a diagnostic model for the discrimination of sepsis and SIRS in critically ill children based on routinely available parameters. Methods Sepsis and SIRS, defined by the International Pediatric Sepsis Consensus Conference (IPSCC) criteria, were evaluated in a secondary data analysis based on a single-center, prospective, randomized controlled trial (RCT) conducted at a pediatric intensive care unit (PICU) of a German tertiary-care hospital (ClinicalTrials.gov number: NCT00209768). We identified 238 episodes of SIRS and 58 episodes of sepsis among the 807 children aged 0 to 18 years originally enrolled in the RCT. Fifty variables (demographic information, clinical parameters, laboratory parameters) measured on the first day of disease were considered as potential predictors for the diagnostic model. We used a Random Forest approach for classification and applied a temporal split-sample approach to validate the developed model.

Background Antimicrobial resistance (AMR), i.e., the ability of microbes such as bacteria, viruses, fungi and parasites to resist the actions of one or more antimicrobial drugs or agents, is a serious global threat. Bacterial antibiotic resistance poses the largest threat to public health. The prevention of antimicrobial infections and their spread relies heavily on infection control management and requires urgent, coordinated action by many stakeholders. This is especially true for nosocomial infections, also known as healthcare-associated infections (HAIs), i.e., infections that are acquired in healthcare settings. It is known that continuous, systematic collection, analysis and interpretation of data relevant to nosocomial infections, and feedback for use by doctors and nurses, can reduce the frequency of these infections. Data from one hospital are more valid and more effective when they are compared with those from other hospitals.
In order to avoid false conclusions, comparisons are only possible when identical methods of data collection with fixed diagnostic definitions are used. The automatic aggregation of standardized data from electronic medical records (EMRs), lab data, surveillance data and data on antibiotic use would greatly enhance comparison and computerized decision support systems (CDSSs). Once standardized, data can be aggregated from the unit to the institutional, regional, national and EU level, analysed and fed back to enhance local decision support on antibiotic use and the detection of nosocomial infections. The authors will give introductory speeches followed by a moderated discussion with the presenters and the audience. The expected result is a report about the workshop that should contain a common minimum data set for nosocomial infection surveillance, or at least a roadmap for further standardization. Results may also lead to a joint publication of the authors. Several computerized decision support systems for electronic surveillance of nosocomial infections and antibiotic stewardship exist. Standardization is a prerequisite for international collaboration and comparison of local data.

We sought to analyze all medications from our medical ward system for a large patient cohort. Our system provides single medication applications with names, e.g. "Valoron N Trpf.", and dosage. However, an automated analysis also requires the ATC code. Thus, our main focus was on matching a catalog entry with the correct product name, less on the exact ingredient quantity. Others have used a similar approach with medications in unstructured text [1]. In contrast, our exported names are based on an internal catalog and thus should lead to a more exact mapping. This work suggests a mapping of medication names to a comprehensive medication catalog containing >150,000 entries. First, the exported and the catalog names are cleaned, e.g.
by removing application-specific multiplications of the form "0.5x…". Afterwards, the names are split into tokens, by characters (';', '/', ',', '-', '') and between number/string combinations (e.g. '20 mg'). Then, the catalog is reduced by matching the first text token of the exported medication. This subset is weighted by comparing each search token to each catalog token as a sum of the following individual weightings (ordered by descending weight): a full search token exists, a number/unit combination exists, and a search token partially exists in the set of catalog tokens. The summarized weighting is reduced by the number of tokens not used. The resulting mapping was validated by a manual comparison of searched and detected names. We analyzed >700,000 single applications from 2011 to 2016 from several wards, containing 7,174 unique names including multiples with different ingredient quantities. Manual validation was done on a subset of 2,959 items after removing the ingredient quantity. Of those, 2,695 were correctly matched. In 92 cases, the source was more specific than the mapped name, e.g. had an additional "retard" as in "Valoron N retard 50/4 mg", and in 77 cases the mapped name was more specific. 22 medications were wrongly mapped. 73 were not mapped, mostly because of absent catalog equivalents, in a few cases also because of spelling errors. Hence (without those 73), 93.4 % were correctly matched to a full medication name and 99.3 % correctly to the product name. The largest drawback of our mapping strategy is that documented names sometimes only include a very basic name, e.g. a brand name for the most common product of a company, which thus cannot directly be mapped to a catalog medication. Another issue is that some products can be used for multiple indications and some drugs contain multiple ingredients, representing different ATC codes. Further, medication names usually contain a specific ingredient quantity, e.g. '40 mg'.
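The reduce-then-weight matching described above can be sketched roughly as follows. This is a simplified illustration only: the helper names, the exact separator set and the numeric weights are our own assumptions, not the authors' implementation.

```python
import re

def tokenize(name):
    """Split a medication name into tokens on common separators (the
    abstract's list plus the period) and at number/letter boundaries,
    e.g. 'Valoron N 20mg' -> ['valoron', 'n', '20', 'mg']."""
    name = name.lower()
    name = re.sub(r"[;/,\-.]", " ", name)
    name = re.sub(r"(\d)([a-z])", r"\1 \2", name)  # split '20mg' -> '20 mg'
    name = re.sub(r"([a-z])(\d)", r"\1 \2", name)
    return name.split()

def score(search_tokens, catalog_tokens):
    """Weight a catalog candidate: full token matches count most, partial
    matches less; unmatched catalog tokens reduce the score. The weights
    (2.0 / 0.5 / 0.25) are illustrative, not those of the cited system."""
    s = 0.0
    matched = 0
    for t in search_tokens:
        if t in catalog_tokens:
            s += 2.0
            matched += 1
        elif any(t in c for c in catalog_tokens):
            s += 0.5
    s -= (len(catalog_tokens) - matched) * 0.25  # penalize unused tokens
    return s

def best_match(exported, catalog):
    """Restrict the catalog to entries sharing the first text token of the
    exported name, then return the highest-scoring candidate."""
    toks = tokenize(exported)
    first = next(t for t in toks if not t.isdigit())
    candidates = [c for c in catalog if first in tokenize(c)]
    return max(candidates, key=lambda c: score(toks, tokenize(c)), default=None)

catalog = ["Valoron N Tropfen", "Valoron N retard 50/4 mg", "Aspirin 100 mg"]
print(best_match("Valoron N Trpf.", catalog))
```

With this toy catalog, the unspecific export name "Valoron N Trpf." scores highest against "Valoron N Tropfen", because the "retard 50/4 mg" candidate carries more unmatched tokens.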
The export name might, however, include adaptations, e.g. '20 mg'. Often, if available in the catalog, this quantity was mapped correctly, but not reliably. Thus, a medication export/catalog mapping based on these names does not lead to a full catalog mapping. As a consequence, further catalog attributes might not be usable; for linkage with ATC codes, however, this approach is fully sufficient.

HL7 FHIR® is a standard for trial use in HL7's standards framework. It aims to support both mobile apps and web-based applications, by building on well-established technologies like XML, JSON and REST, as well as EHRs and other data-centric systems. HAPI-FHIR [1] is an open-source FHIR Java library and is used within the apps to create, receive and send the BGL-related resources to the HAPI-FHIR server. Camunda [2] is an open-source platform for workflow and business process management. It supports the execution of BPMN2, CMMN and DMN. The Patient Android Application [3] reads the glucose measurements from the glucose meter via its BLE interface and is based on Nordic Semiconductor's Android-nRF-Toolbox [4]. The received measurements are displayed and converted into FHIR resources. Thereafter, they are transferred to the FHIR server, on which an interceptor extracts the blood glucose values and sends them to the Process Server. In the next step the Process Server evaluates the BGLs using a DMN decision table. If the BGLs are not in the normal range, a push notification is sent to the Physician Application. If the patient does not transmit BGLs for a configurable period of time, a reminder is sent to his application. The Physician Application displays the BGLs of a selected patient and is implemented as a hybrid mobile app using the Ionic Framework [5].
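To illustrate the kind of resource the Patient Application might submit, a minimal FHIR Observation for one blood glucose reading could look as follows. This is a sketch under assumptions: the LOINC code 15074-8 and the UCUM unit are common conventions for blood glucose, but the abstract does not specify which profile or codes the system actually uses.

```python
import json

def glucose_observation(patient_id, mmol_per_l, when):
    """Build a minimal FHIR Observation resource (as a plain dict) for a
    blood glucose reading. Coding and units are illustrative assumptions."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org",
                             "code": "15074-8",
                             "display": "Glucose [Moles/volume] in Blood"}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": when,
        "valueQuantity": {"value": mmol_per_l,
                          "unit": "mmol/L",
                          "system": "http://unitsofmeasure.org",
                          "code": "mmol/L"},
    }

obs = glucose_observation("example-1", 6.2, "2016-05-04T08:30:00Z")
print(json.dumps(obs, indent=2))
```

In the architecture above, a payload of this shape would be POSTed to the FHIR server, where the interceptor could pick the value out of `valueQuantity` and forward it to the Process Server for evaluation against the DMN decision table.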
Several issues need to be addressed before using these apps routinely: (1) the FHIR community has to consider how the Patient App could be deployed onto patients' devices easily, without the need for the patient to configure it elaborately. Also, it needs to be ensured that only the authorized patient and healthcare professionals can access the BGL data. (2) Compliance with national constraints or legislative settings needs to be assured, as the app is presumably a medical product. The presented application fosters interoperability for BLE-enabled blood glucose meters. On the one hand, it enables patients to submit their measurements directly to their physician's HIS. On the other hand, physicians are enabled to track their patients' vital parameters in real time even though these patients are in outpatient care. Even though the app was implemented to work with BGLs, it can easily be adapted to work with blood pressure measurements or other vital signs.

Publishing clinical guidelines (GLs) on the web increases their accessibility. However, evaluating their usability and understanding how users interact with the websites has been neglected. In this study we used a Tobii eye-tracker to analyse users' interaction with five commercial and public GL sites popular in Norway (four in Norwegian and one in English of US origin (UpToDate)). We measured the number of clicks and usage rate for search functions, task completion time, and users' objective and perceived task success rates. We also measured the learning effect for inexperienced users. We found a direct correlation between participants' satisfaction regarding website usability and the time spent, number of mouse clicks and use of the search function to obtain the desired results. Our study showed that users' perceived success rate was not reliable and that GL publishers should evaluate their websites regarding presentation format, layout, navigation bar and search function.
Designing an assessment form for speech language pathology: toward standards for speech therapy documentation

Medical documentation in all health care services must be provided accurately and on time for each patient. Accurate, timely and accessible health care data play a major role in the planning, development and support of health care services. High-quality health care depends on a comprehensive patient medical record. This comprehensive information is necessary to support diagnosis and treatment, care quality measurement, improvement of public health, improving the productivity of health care and facilitating reimbursement. One important service is speech therapy, which is of value to patients with speech and voice problems. Data for speech therapy services are often collected via standard forms and presented to users, so such forms must be designed appropriately. A standard form should therefore have the following characteristics: an appropriate physical and logical structure, useful data elements, and prevention of duplicate data entry. Since no appropriate forms were available in Iran, one limitation has been the lack of standards for presenting and exchanging such data. We aimed to design an assessment form for documenting speech language therapies. This descriptive-comparative study was carried out in 2014; data elements of speech therapy records in Iran and related databases in developed countries were used to design a checklist, which was then debated by Iranian speech pathologists (30 experts) using a Delphi method. We thus designed a standard form for speech language therapy that describes the minimum data set required to ensure that data about speech problems can be easily interpreted. Data were analyzed using standard comparative tables and SPSS statistical software version 11.5. All proposed data elements were confirmed by the speech pathologists.
The final proposed form is composed of 10 main parts: (A) demographic information (17 elements), (B) referral information (6 elements), (C) medical history (11 elements), (D) verbal skills assessment (2 elements), (E) non-verbal skills assessment (2 elements), (F) articulation assessment (13 elements), (G) cognitive skills (12 elements), (H) linguistic and cultural assessment (2 elements), (I) diagnosis (7 elements), and (J) therapeutic recommendations and plans (9 elements). There is a need for a standardized screening instrument that speech therapists can use to document daily reports. Items from the standard form can be organized to screen for speech problems of patients in speech therapy centers. The minimum data set defined may also be used for the content and structure of an electronic speech screening system. The ultimate goal of this work was to establish a standard form for recording and reporting speech language pathology data, which will in turn facilitate the establishment of such databases.

Issues related to ensuring patient privacy and data ownership in clinical repositories hinder the growth of translational research. Previous studies have used an aggregator agent to obscure clinical repositories from the data user, and to ensure the privacy of output using statistical disclosure control. However, there remain several issues that must be considered. One such issue is that a data breach may occur when multiple nodes conspire. Another is that the agent may eavesdrop on or leak a user's queries and their results. We have implemented a secure computing method so that the data used by each party can be kept confidential even if all of the other parties conspire to crack the data. We deployed our implementation at three geographically distributed nodes connected to a high-speed layer-2 network. The performance of our method, with respect to processing times, suggests suitability for practical use.
Achieving a good understanding of the socio-technical system in critical or emergency situations is important for patient safety. Research on human-computer interaction in the field of anesthesia or surgery has the potential to improve the usability of user interfaces and enhance patient safety. Eye-tracking is a technology for analyzing gaze patterns; it can also measure what is being perceived by the physician during medical procedures. The aim of this review is to assess the applicability of eye-trackers in simulated or real environments of anesthesia, surgery or intensive care. We carried out a literature search in PubMed. Two independent researchers screened the titles and abstracts. The remaining 8 full papers were analyzed with respect to the applicability of eye-trackers. The articles cover topics such as the training of surgeons, novices vs. experts, or cognitive workload. None of the publications addressed our goal. The applicability or limitations of eye-tracker technology were stated only incidentally.

Only a few epidemiologic studies have examined sleep characteristics in relation to dietary behaviour. Our aim was to analyse whether sleep duration, midpoint of sleep and sleep quality are associated with dietary behaviour among the Bavarian population. Within the cross-sectional Bavarian Food Consumption Survey II, 1050 subjects aged 13-81 years were recruited. Data on lifestyle, socioeconomic and health status were collected by means of a computer-assisted personal interview. Over the following 2 weeks, dietary intake was assessed with three 24-h dietary recalls by telephone (EPIC-Soft). Dietary behaviour was analysed using the consumption of main food groups, energy-providing nutrients and energy intake. Characteristics of sleep were measured by means of the Pittsburgh Sleep Quality Index Questionnaire.
Categories of self-reported usual sleep duration per night (<6, 6-6.5, 7, 7.5-8, and >8 h), midpoint of sleep (early, medium, late) and overall sleep quality (poor, good) were used for our analysis. Results 814 participants aged 18 years or older, who completed at least two 24-h dietary recalls and who had complete and plausible information on sleep characteristics, were analysed. Mean sleep duration was 7.1 h per night with no difference between men and women. Median midpoint of sleep was 2:42 a.m. in men and 2:26 a.m. in women. Most subjects (>80 %) had a good sleep quality. Sleep duration was associated with intake of non-alcoholic beverages (p < 0.01), carbonated beverages (p = 0.04), water (p = 0.04) and coffee/black tea (p = 0.01). The highest intake of these beverages was observed among subjects sleeping <6 h. No association was found between sleep duration and the consumption of other food groups, energy-providing nutrients or total daily energy intake. Midpoint of sleep was only associated with intake of carbonated beverages (p = 0.02), with the highest intake among subjects with an early midpoint of sleep. No association was detected between sleep quality and dietary intake. Conclusion Mainly sleep duration was associated with beverage intake. Midpoint of sleep and quality of sleep were hardly related to food and beverage intake.

They simply forget! Knowledge, Attitude and Practice (KAP) of General Practitioners and Physician Assistants with respect to advising vaccinations to the elderly in Germany Klett-Tammen CJ 1,2, Krause G 1,2, Castell S 1 1 Helmholtz Centre for Infection Research (HZI), Braunschweig, Germany; 2 Hannover Medical School, Hannover, Germany Currently, official German vaccination recommendations for individuals ≥60 years include tetanus vaccination at ten-year intervals, an annual influenza vaccination and a single pneumococcal vaccination.
Despite reimbursement by statutory health insurances and published evidence for a positive benefit-risk ratio, vaccination coverage is not satisfactory, especially for influenza and pneumococcal vaccination. Advice from General Practitioners (GPs) and Physician Assistants (PAs) is known to influence vaccination uptake, but little is known about vaccination-related knowledge, attitudes, and practices (KAP) and possible predictors of advice-giving behavior towards the elderly among primary health care personnel. We conducted a nationwide survey, sending a piloted questionnaire on vaccination-related KAP and specifically advice-giving practices to 5,000 German GP practices (one GP version, one PA version) (net n = 4,995), to be returned by fax or prepaid mail. We performed multivariable logistic regressions, defining the outcomes as not advising at least one officially recommended vaccination in general, not advising the tetanus vaccination, not advising the influenza vaccination, and not advising the pneumococcal vaccination, all in the absence of a contraindication. Of 4,995 practices, 813 (16 %) returned at least one questionnaire, equaling 774 GPs (16 %) and 563 PAs (11 %). In 612 (79 %) of the practices, not only GPs but also PAs counsel on vaccinations. Twenty-one percent of all participants counselling on vaccinations stated that they had at least once not advised an officially recommended vaccination to an elderly patient, with significantly fewer PAs than GPs declaring this (14 % vs. 26 %, p < 0.001). This applies most frequently to pneumococcal vaccination (15 % of all participants). The most common explanation for non-advising given by both professions for each vaccination was having forgotten to advise it (50-74 %). Most respondents declared that they know (92 %) and trust (90 %) the official recommendations, with significantly more GPs than PAs agreeing to both items (97 vs. 86 %, p < 0.001 and 91 vs. 88 %, p = 0.005).
Statistically significant multivariable predictors of not advising vaccinations included the practice being located in West Germany (any vaccination: OR 2.9, 95 % CI 1.7-4.9; influenza vaccination: 2.4, 1.3-4.5; pneumococcal vaccination: 2.8, 1.6-5.1) and not counselling on vaccinations at regular intervals (any vaccination: OR 2.8, 1.5-5.3; influenza vaccination: 2.3, 1.1-5.1; pneumococcal vaccination: 3.1, 1.5-6.7). GPs had higher odds of not advising tetanus vaccination (2.8, 1.5-5.4), influenza vaccination (2.6, 1.5-4.6), and pneumococcal vaccination (1.8, 1.1-3.0) than PAs. The self-reported knowledge of and attitudes towards officially recommended vaccinations for the elderly among German GPs and PAs, in general and in relation to vaccinations against tetanus, influenza and pneumococci in particular, are satisfactory. While PAs also counsel on vaccinations in most general practices, they felt less well informed about official vaccination recommendations than GPs. Therefore, empowering PAs with tailor-made information on official recommendations is desirable to further improve advice-giving behavior. The main reason for not advising vaccinations in our study population is forgetting to do so rather than disagreeing with the official recommendation. Thus, we recommend improved implementation and promotion of recall systems as an integrated part of practice software already in place.

Infections with Borrelia burgdorferi sensu lato (Bbsl) can affect the skin (erythema migrans, borrelial lymphocytoma, acrodermatitis chronica atrophicans), the nervous system (neuroborreliosis), the joints (Lyme arthritis) or the heart (Lyme carditis). Concise data on European Lyme borreliosis (LB) epidemiology are not readily available. This overview is aimed at providing current data on the incidence of LB and on longitudinal trends. Data were collected by literature search in PubMed, from grey literature such as national health registries, as well as through personal contacts.
Data were subjected to the Mann-Kendall test for statistical trend analysis. Data on LB incidence in 35 countries were obtained. Reported incidences per 100,000 population vary not only with respect to geographic location and environmental factors but also due to the underlying reporting and surveillance systems (mandatory or voluntary reporting, sentinels, population-based studies) as well as the requirements for notification (clinical and/or laboratory confirmed) and the applied case definitions. Geographic distribution data reveal the lowest incidences in southwest Europe (Spain <1; 2012) and along the Mediterranean coast. Incidences increase northward (Sweden 69; 1995) and from western (England & Wales 2.0; 2014) towards central Europe (Switzerland 113; 2014). Incidence was highest in Slovenia (195; 2014), decreasing towards southeast Europe (Bulgaria 4.7; 2014). Neuroborreliosis incidence (11 countries) ranged from 0.4 (Ireland, 2014) to 5 (Sweden, 2010). Time series analysis revealed major trends in LB incidence: (a) 14 countries exhibited significant rising trends until 2009-2011, followed by an oscillating plateau phase (e.g. England & Wales, Slovenia); (b) 5 countries showed decreasing incidence over time (e.g. Bulgaria, Romania); and (c) other countries had stable incidence, e.g. Denmark, Belgium, France, Switzerland. Current data are heterogeneous and difficult to compare directly due to variations in study type, case definitions, year and enrolment criteria. To obtain comparable data, surveillance systems implementing harmonized case definitions are needed. Results from sentinels, e.g. in France, the Netherlands and Switzerland, indicate high reliability of the collected data. Additionally, concomitant training of participating practitioners, as exemplified by the Bavarian LYme-DIsease (LYDI)-Sentinel (2012-2015), helps to promote sustained awareness of LB and its appropriate diagnostic and clinical management.
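The Mann-Kendall test applied to these time series sums the signs of all pairwise differences between later and earlier observations. A minimal sketch (the incidence numbers below are purely illustrative, not the data reported above):

```python
from itertools import combinations
from math import sqrt

def mann_kendall(series):
    """Mann-Kendall trend statistic S = sum over i < j of sign(x_j - x_i),
    plus the normal-approximation z score (no correction for ties).
    S > 0 suggests a rising trend, S < 0 a falling one."""
    s = 0
    for xi, xj in combinations(series, 2):
        s += (xj > xi) - (xj < xi)
    n = len(series)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    return s, z

# hypothetical yearly incidences per 100,000 over 8 years
rising = [10, 12, 15, 19, 26, 30, 33, 41]
s, z = mann_kendall(rising)
print(s, round(z, 2))  # a strictly rising 8-point series gives S = 28
```

For a strictly monotone series of length n, S reaches its extreme value of ±n(n−1)/2; |z| > 1.96 would flag the trend as significant at the 5 % level under the normal approximation.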
Despite increasing incidences of LB in many European countries between 1999 and 2009/2010, data collected subsequently suggest that LB incidence in many countries has reached a plateau, but with interannual oscillations.

Developing a complex intervention to improve participation and quality of life in nursing home residents with joint contractures (JointConImprove) Klingshirn H 1, Saal S 2, Bartoszek G 3, Beutner K 2, Hirt J 2, Stephan AJ 1,4 Joint contractures are common problems in nursing home residents. Prevalence data in this setting vary between 25 and 80 %, presumably due to different definitions and population characteristics. Since there is little evidence for the effectiveness of interventions to prevent joint contractures in nursing home residents, attention should instead be paid to relieving the negative impact of joint contractures on daily life. From the patients' perspective, restrictions in social participation are the most burdensome. These may result from the accumulation of various factors, from changes in body function and structures to debilitating environmental factors, such as the design of living areas or the attitudes of health care professionals. An effective intervention has to reflect this complexity. We therefore aimed to develop a complex intervention to improve participation and quality of life in nursing home residents with joint contractures. The intervention development followed the UK Medical Research Council (MRC) framework, integrating the perspectives of the affected individuals, the health professionals and caregivers, and the researchers. The development process comprised a systematic literature review, analysis of existing data from interviews with affected individuals, and a focus group with nursing home residents, complemented by feedback from two nursing home managers.
Based on these preparatory studies, eight geriatricians and scientists in the fields of nursing, health and rehabilitation discussed potential intervention components in a structured consensus meeting. With these components, a first draft of the intervention was developed. The intervention was further developed in a focus group with four experts working in the field of nursing home management. The intervention was developed according to the theory of planned behavior and contains four main components. The first component is a one-day workshop for skilled nurses. This workshop consists of four modules and prepares nurses to implement the intervention in their nursing home. Module one introduces the nurses to their future roles. Module two summarizes the current evidence on joint contracture development and treatment. Module three provides measures to support nursing home residents with joint contractures in participation and quality of life. Module four presents methods of collegial consulting and training. The second component is a presentation of approximately 40 min conveying knowledge about joint contractures. Addressees are nursing home staff, residents and relatives. In the third and fourth components, nurses will be supported by telephone counselling and collegial peer review to discuss practical patient-related issues. The implementation process will be documented to describe the underlying mechanisms in relation to context, setting, professionals and residents. For this purpose, questionnaires before and after the workshop and a diary documenting nurses' experiences during the intervention period will be used. The next step is to evaluate the complex intervention in a cluster-randomized pilot study with seven nursing homes in two regions of Germany.
The pilot study will be accompanied by a comprehensive process evaluation to assess the barriers to and facilitators of successful implementation of the intervention and thus to provide sufficient information for planning a main trial.

One of the fundamental issues in the support of medical research is data management in its basic form: the maintenance of data files. Many research projects still store raw, result and analysis files on local hard drives without traceable versioning, metadata information and access management [1, 2]. This is not only troublesome during active research, but even more so when research should be reproducible, auditable and ready for long-term archiving. One solution to this problem is research management systems with file storage capability [3]. They provide versioning, access rights and logs as well as customizable metadata forms. The openBIS platform [4] is an open-source research management system by the CISD group of ETH Zurich. It was developed with the goal of supporting biological research data workflows. The multi-center consortium sysINFLAME is investigating inflammatory diseases [5]. With many different source systems and data types, and with clinicians, researchers, biomedical staff and statisticians collaborating on all stages of data-creating workflows in different institutes, we aimed at providing a robust and useful research management framework with openBIS. The interdisciplinary research consortium MetastaSys is analyzing molecular markers and pathways in cancer cells and their microenvironment which govern the fate and localization of tumor metastases [6]. The investigation spans biological samples such as cell lines and mouse models up to patient samples and annotations of specimens with histology and omics, before culminating in statistical models. One example of the necessity of research data management tools is the utilization of microbiome data in sysINFLAME.
The workflow starts with the extraction of bacterial genes from stool samples and data cleaning, and ends with quantifying the bacterial genera and cumulative analyses. We examined the data files originating at each step and chose relevant metadata to develop a research supporting platform. MetastaSys uses openBIS to collect and join the information generated on individual specimens in its collaborating workgroups. For statistical assessment, data are linked: e.g. cell lines to their cell line profiles, and animal model implantation results to the resulting expression profiles. The regular openBIS configuration allows for (1) checking in, (2) versioning, (3) deriving analysis data from the raw data, and (4) storing the process steps. In both projects, all data are linked to reproduce traceable data-to-information flows. Additionally, in MetastaSys openBIS is used to match information from the different work groups and join it according to its processing history. Researchers can store and retrieve data via a web GUI with individual metadata forms, with access to the data depending on the researcher's role. With openBIS aiding the research process in both projects for researchers spread over different institutes at all stages, its additional value in providing a basis for unified data access and long-term archiving has to be investigated next. An electronic lab notebook is already included in the openBIS platform. Furthermore, aiming at a productive implementation of openBIS, technical and organizational issues have to be considered: e.g. accessibility for source systems and researchers alike, an operational concept and sustainability in a working healthcare environment.

To identify diet-health relationships, individuals' usual dietary intake needs to be estimated. One frequently used dietary assessment method is the repeated application of the 24-h dietary recall (24hDR).
Adjustment for covariate information can improve the estimation of usual dietary intake based on repeated 24hDRs. The frequency information from a food frequency questionnaire (FFQ), used as an auxiliary covariate, may further improve the estimation of individuals' usual dietary intake. Objective The objective of this study was to investigate the effect of different adjustment sets on the estimation of individuals' usual dietary intake using repeated 24hDRs. Methods 811 study participants (49.5 % women, mean age 66 years) of the cross-sectional EPIC-Potsdam long-term study 1 (2010-2012) completed up to three 24hDRs (in total n = 2422 recalls) and one FFQ. Beginning with a reference model adjusted for day of the week, season, sex, and age, we successively added body-mass index (BMI), smoking status, educational attainment, and physical activity level (PAL) as adjusting variables. After applying the National Cancer Institute (NCI) method to 28 food groups using each adjustment set, we used the AIC goodness-of-fit criterion and the prediction variance for model comparison. We selected the prediction variance as a criterion because the higher the prediction variance of the estimated individuals' usual dietary intakes, the more precise the effect estimate in a diet-health outcome model. Applying the AIC criterion, no single best adjustment set for all food groups could be identified. Overall we observed that, in addition to the reference model, adjustment for BMI showed the most meaningful improvement, but all AIC values were comparable, suggesting that all models resulted in similar model fits. The additional inclusion of frequency information always improves the AIC criterion, suggesting an information gain for the discrimination of individuals. The relative change of the prediction variances of each food group after successive inclusion of the covariates was only slight, at around 1 %.
The inclusion of frequency information improves the prediction variance by around 20 % compared to the models without the frequency information. The improvement for less frequently consumed foods is roughly twice as high as that for frequently consumed foods. The sequence of covariates day of the week, sex, age, BMI and frequency information improves the estimation of individual usual dietary intake the most, but the inclusion of the additional covariates smoking status, educational attainment and physical activity was not harmful. Our study promotes the use of an FFQ as an auxiliary instrument, especially for episodically consumed foods.

HandiHealth, London, United Kingdom The openEHR project is well known for publishing and updating a set of open specifications for building maintainable and semantically interoperable (and even intraoperable) electronic health record systems that stay agile in a changing clinical reality. It is closely related to the family of ISO 13606 standards and to CIMI (now an HL7 WG). The detailed openEHR clinical models (archetypes and templates) are authored by global and regional clinical communities in an online environment where the authoring and review process gathers views and consensus from a breadth of clinical specialities. The openEHR archetypes are often used as a source for clinical requirements gathering also in non-archetype-based systems and interoperability standards (e.g. HL7 FHIR). This workshop will introduce and discuss openEHR-based implementations and integrations primarily from developer and systems-engineering perspectives. In recent years several different technical openEHR persistence implementation approaches have been published; two recent approaches, using graph databases and combinations of relational and schema-less databases, will also be described and discussed.
Developers in an openEHR context thus now have access both to a wealth of detailed clinical models and to a wealth of published approaches to technical implementation using various persistence solutions, APIs and programming languages. We will present openEHR implementations in detail at this workshop and discuss the future architecture of health care systems.

Designing concept models with openEHR clinical models for a nationwide EHR project in Japan The EHR Research Unit, Kyoto University, Kyoto, Japan; 2 Miyazaki University Hospital, Miyazaki, Japan We have developed an EHR system for regional health care over the past 15 years based on our own XML standard, named MML (Medical Markup Language) [1]. This EHR has stored the health records of more than 3,400 patients, but it was pointed out that there were requirements to catch up with clinical updates and to improve interoperability with other clinical standards, such as HL7 and others [2]. In 2015, we started a project to update this EHR system into a nationwide-scale healthcare data repository. Meanwhile, the openEHR archetype technology was regarded as a future-proof standard with a flexible domain concept model [3], and the Clinical Information Modeling Initiative (CIMI) has pursued global harmonization of clinical standards based on openEHR archetype technology [4]. We therefore decided to adopt openEHR technology to improve the interoperability of clinical data. The clinical data in our EHR system comprise 16 components constructed with MML standards, plus periodic health care surveillances of employees and students. We therefore drew up mindmaps of the MML clinical data and the surveillances to analyze the concept models to be constructed. After the mindmap analysis, we designed archetypes of the identified concepts with the Ocean Archetype Editor.
The artefacts were mainly taken from the openEHR Clinical Knowledge Manager (CKM) (http://www.openehr.org/ckm/), with or without modification, to keep interoperability with other standards. Since the archetypes on CKM could not cover all of the MML semantics, new archetypes were designed to complement them. Dental archetypes were designed because the surveillance of students includes a dental examination and CKM had no dental archetype. We successfully developed MML-compatible archetypes and clinical concept models based on openEHR CKM archetypes and MML-equivalent archetypes. The 21 MML components/modules could be constructed from 69 archetypes. The periodic surveillance for employees was constructed from 23 archetypes, and the surveillance for students from 18 archetypes. Most of the archetypes were taken from CKM, but 23 archetypes were specialized and 8 newly designed. The reasons for specialization were to adjust the demographic expressions (personal name, address, and telephone) to Japanese style and to extend archetypes for the dental domain. For the dental examination, 6 archetypes were developed, 5 of them to specify anatomical locations in the oral area. We constructed concept models with archetypes semantically equivalent to the legacy EHR, plus newly developed archetypes for the surveillances. 11 dental archetypes were specialized from existing archetypes. The suggested archetype technology improves the flexibility of the existing standard. Thank you for not smoking: predictors of the outcome of a smoking ban referendum in Bavaria, Germany Kohler S 1 1 Institute of Public Health, Heidelberg University, Heidelberg, Germany The Tobacco Control Scale surveys have repeatedly ranked tobacco control activity in Germany among the weakest in Europe. 
In 2010, a referendum with a 61 % approval rating obliged the Bavarian government to enact Germany's strictest smoking ban to date. Because of the low voter turnout of 37.7 %, it was debated whether the referendum's success was representative. This study assesses the associations between community characteristics and the outcome of the referendum that comprehensively banned smoking in Bavaria. Average voting behaviour in all 2056 constituencies is studied retrospectively. Simple and multivariable regression analyses were used to identify community-level factors associated with the outcome of the referendum. Secondary data from the Bavarian State Office for Statistics and Data Processing on community characteristics and the two rounds of the referendum, i.e. motion and vote, were combined. There were positive correlations between the number of people who signed the motion and both the voter turnout at the referendum and its approval rating. A negative correlation was estimated between voter turnout and the approval rating. These and other possible predictors of the referendum's approval rating, such as average age, gender, religion, ethnicity, population density, death rates, communal wealth, party affiliation of local council members and support for previously held referendums, were evaluated in simple and multivariable regression analyses. This study examined socioeconomic and political predictors of voting behaviour in the referendum that led to Germany's strictest smoking ban so far. Understanding the predictors of the outcome of a smoking ban referendum can help identify barriers and facilitators to non-smoker protection through direct democratic processes. Exposure to high sound pressure levels in discotheques or while listening to portable listening devices (PLD) can lead to hearing damage, especially in adolescents and young adults. 
A comprehensive analysis of total leisure noise exposure, including all relevant noisy activities, improves the exposure estimation, and repeated surveys make it possible to analyse how exposure changes with age. The Ohrkan study is funded by the Bavarian State Ministry of Health and Care. The aim of the study was to estimate the prevalence of total leisure noise exposure among adolescents and to analyse the change in leisure noise over 2.5 years of follow-up. At the Ohrkan baseline, a total of 2149 9th graders in the city of Regensburg (mostly aged 15-16 years) were recruited from 2009 to 2011 (response rate 56 %). At the 2.5-year follow-up the participants were contacted again and asked about their leisure noise exposure. Both at baseline and at follow-up, the self-reported frequency of noisy activities was combined with literature-based values of the usual sound pressure levels of these activities to calculate the total weekly noise exposure. This exposure was compared to the lower exposure action value (LEAV = 80 dB(A)) from the occupational health and safety regulations. Exposure data from both time points were available for 1704 adolescents (55 % female). The percentage of participants exposed to leisure noise exceeding the LEAV increased from 75 % at baseline to 92 % at follow-up, due to more participants visiting discotheques: 17 % (baseline) vs. 77 % (follow-up). In both surveys, listening to pop music via PLD was the activity reported by most participants and with the longest duration. In the follow-up survey, PLD use and discotheque visits each contributed 35 % to the total leisure noise exposure. The high number of adolescents with risky leisure noise exposure shows the need for early prevention programmes. In the pilot study Ohrkan Kids, a prevention module was therefore developed, consisting of music examples, information services, estimation of noise exposure due to PLD usage, and education of 11- to 14-year-old students. 
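The weekly exposure metric described above (combining self-reported activity durations with typical sound pressure levels, then comparing against the 80 dB(A) action value) can be sketched with standard energy averaging over a 40-hour reference week. This is a minimal illustration; the activity list, durations and levels below are assumptions, not the Ohrkan study's actual values.

```python
import math

# Hypothetical activities: (hours per week, typical sound level in dB(A)).
# Values are illustrative only, not taken from the study.
ACTIVITIES = {
    "portable_listening_device": (10.0, 83.0),
    "discotheque": (4.0, 97.0),
    "concert": (1.0, 100.0),
}

LEAV = 80.0             # lower exposure action value, dB(A)
REFERENCE_HOURS = 40.0  # occupational reference week (L_EX,40h)

def weekly_noise_exposure(activities):
    """Energy-average the per-activity doses into one weekly level in dB(A)."""
    dose = sum(hours * 10 ** (level / 10) for hours, level in activities.values())
    return 10 * math.log10(dose / REFERENCE_HOURS)

level = weekly_noise_exposure(ACTIVITIES)
exceeds = level > LEAV
```

Because the averaging is on the energy scale, a few loud hours in a discotheque can dominate many hours of quieter listening, which matches the shift in exposure drivers reported at follow-up.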
Modern high-quality education is based on guaranteed, structured and well-balanced curricula covering the necessary scope of knowledge and skills required in subsequent practice, and responding to changing epidemiological profiles, health systems challenges and population health needs. The general idea is to obtain vector representations of the words included in descriptions of disciplines, represent the textual content in a vector space (word2vec in particular) and visualise the results of the similarity analysis as an interactive web-based report. We present findings from a real application of the word2vec approach to curriculum data and reflect on its implications for the long-term process of curriculum innovation. The paper describes an appealing way of exploration that can be helpful to anyone involved in the complicated and time-consuming process of curriculum design at institutions of higher education. The final visual report serves as a comprehensive overview of selected parts of the curriculum described in each discipline, providing transparent data for further evaluation. A re-analysis of the Cochrane Library data: the dangers of unobserved heterogeneity in meta-analyses Background Heterogeneity has a key role in meta-analysis methods and can greatly affect conclusions. However, true levels of heterogeneity are unknown and researchers often assume homogeneity. We aim to: (a) investigate the prevalence of unobserved heterogeneity and the validity of the assumption of homogeneity; (b) assess the performance of various meta-analysis methods; (c) apply the findings to published meta-analyses. We accessed 57,397 meta-analyses, available in the Cochrane Library in August 2012. Using simulated data we assessed the performance of various meta-analysis methods in different scenarios. The prevalence of a zero heterogeneity estimate in the simulated scenarios was compared with that in the Cochrane data, to estimate the degree of 
unobserved heterogeneity in the latter. We re-analysed all meta-analyses using all methods and assessed the sensitivity of the statistical conclusions. Levels of unobserved heterogeneity in the Cochrane data appeared to be high, especially for small meta-analyses. A bootstrapped version of the DerSimonian-Laird approach performed best both in detecting heterogeneity and in returning more accurate overall effect estimates. Re-analysing all meta-analyses with this new method, we found that in cases where heterogeneity had originally been detected but ignored, 17-20 % of the statistical conclusions changed. Rates were much lower, between 1 and 3 %, where the original analysis did not detect heterogeneity or took it into account. When evidence for heterogeneity is lacking, standard practice is to assume homogeneity and apply a simpler fixed-effect meta-analysis. We find that assuming homogeneity often results in a misleading analysis, since heterogeneity is very likely present but undetected. Our new method represents a small improvement, but the problem largely remains, especially for very small meta-analyses. One solution is to test the sensitivity of the meta-analysis conclusions to assumed moderate and large degrees of heterogeneity. Equally, whenever heterogeneity is detected, it should not be ignored. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness as an example. We used the Clinical Practice Research Datalink, a UK primary care database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. 
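The stub-based search step described above (matching a code dictionary against clinician-suggested word-stubs and code-stubs) can be sketched in a few lines. This is not the pcdsearch implementation itself; the dictionary entries below are illustrative placeholders, not real CPRD code-lists.

```python
import re

# Toy code dictionary mapping codes to clinical terms; entries are invented
# for illustration and are not authoritative Read codes.
CODE_DICT = {
    "E10..": "Schizophrenic disorders",
    "E11..": "Affective psychoses",
    "E112.": "Single major depressive episode",
    "Eu20.": "[X]Schizophrenia",
    "H33..": "Asthma",
}

def search_codes(code_dict, word_stubs=(), code_stubs=()):
    """Return candidate codes whose term matches any word-stub (case-insensitive)
    or whose code starts with any code-stub, in the spirit of pcdsearch."""
    patterns = [re.compile(re.escape(s), re.IGNORECASE) for s in word_stubs]
    hits = {}
    for code, term in code_dict.items():
        if any(p.search(term) for p in patterns) or \
           any(code.startswith(stub) for stub in code_stubs):
            hits[code] = term
    return hits

candidates = search_codes(CODE_DICT,
                          word_stubs=["schizo", "psycho"],
                          code_stubs=["E11"])
```

The returned candidate list would then be reviewed by clinicians before the final code-list is used to identify patients, exactly as the workflow above describes.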
We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed by a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We describe a framework for researchers working with electronic health record databases for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists. Investigating the relationship between quality of primary care and premature mortality in England: a spatial whole-population study Kontopantelis E, Buchan I 1 , Doran T 2 1 University of Manchester, Manchester, United Kingdom; 2 University of York, United Kingdom Our aim was to quantify the relationship between a national primary care pay-for-performance programme, the UK's Quality and Outcomes Framework (QOF), and all-cause and cause-specific premature mortality closely linked with conditions included in the framework. We conducted a longitudinal spatial study at a low geographical level ("lower layer super output area" or LSOA). The setting was 32,482 LSOAs (neighbourhoods of 1500 people on average), covering the whole population of England (approximately 53.5 million), from 2007 to 2012. We included 8647 English general practices which participated in the QOF for at least one year of the study period, covering over 99 % of patients registered with primary care. The intervention we assessed was the national pay-for-performance programme, which incentivises performance on over 100 quality-of-care indicators. The outcomes of interest were all-cause and cause-specific mortality rates for six chronic conditions: diabetes, heart failure, hypertension, ischaemic heart disease, stroke, and chronic kidney disease. 
We used multiple linear regressions to investigate the relationship between spatially estimated recorded quality of care and mortality. All-cause and cause-specific mortality rates declined over the study period. Higher mortality was associated with greater area deprivation, urban location, and a higher proportion of non-white population. In general, there was no significant relationship between practice performance on quality indicators included in the QOF and all-cause or cause-specific mortality rates in the practice locality. Higher reported achievement of activities incentivised under a major, nationwide pay-for-performance programme did not seem to result in reduced incidence of premature death in the population. Relationship between quality of care and choice of clinical computing system: retrospective analysis of family practice performance under the UK's Quality and Outcomes Framework Background An efficient user interface plays a major role in the implementation of personal health records. Patients with different backgrounds, especially elderly people, require a simple tool to monitor their health status. To implement interfaces based on user requirements, we conducted interviews with doctors and patient focus groups from September to July 2014. We offered potential users an interactive prototype (Axure) of the web application to test. Focus group participants were eligible if they were home health consumers (aged ≥61 years who had received home care within the previous three years). We also conducted semi-structured interviews with doctors who worked at the piloting hospital in Tomsk, Russia. The open-source library D3.js (data-driven documents and JavaScript) is a high-quality diagramming solution using HTML, CSS, SVG and JavaScript. It allows creating static and dynamic graphical elements. The medical data were "archetyped" according to the ISO 13606 standard. XML Schema was applied to define the structure of the archetypes. 
To test the usability of the portal we conducted usability testing interviews from January to February 2015. The application of open-source libraries allows developing free and flexible graphical data representations. On the one hand, JavaScript libraries provide a powerful tool to create custom-made diagrams; on the other hand, the open-source community has developed many useful templates that can be applied with minor changes to visualise medical data. Simple and informative templates are suitable for patients with no medical background, including elderly people. Thus, the application of open-source solutions in AAL projects focused on elderly people can reduce the costs and time spent on the development of interfaces. Kopanitsa G 1 1 Tomsk Polytechnic University, Tomsk, Russian Federation The aim was to study the opinions of tuberculosis patients and doctors and to identify perceived opportunities and barriers to using a web portal, in order to optimize its use. The perceptions of 30 tuberculosis patients and 18 doctors (10 general practitioners and 8 TB specialists) from Tomsk, Russia were collected through semi-structured interviews. The responses were analysed for content using principles of grounded theory and thematic analysis, in order to gain insight into the participants' beliefs and attitudes towards adopting a tuberculosis web portal to increase the efficiency of the treatment and rehabilitation process. As the need for quality health care provision continues to expand, web portal technologies can be an efficient and intuitive approach for improving medical practices. The current paper set out to explore a qualitative perspective on the attitudes of 30 patients and 18 doctors towards using a tuberculosis web portal to report patients' vital signs and produce treatment recommendations. 
The findings from our study in the area of tuberculosis treatment and rehabilitation suggest that the participants generally accepted the introduction of a web portal for reporting health status and receiving recommendations from doctors as an alternative to traditional doctor visits. Development of a clinical decision support system for the patients of a laboratory service Kopanitsa G 1 1 Tomsk Polytechnic University, Tomsk, Russian Federation In Russia many patients turn to laboratory services directly, without a doctor's referral. This creates the problem of laboratory test results being interpreted by patients who lack a proper medical background. The patients therefore require that laboratory services provide not only the results of the tests but also their interpretations. A decision support system must solve a classification problem: associating a vector of test results with a set of diagnoses and finding the set of recommendations associated with every diagnosis in this set. The system was implemented in the Helix laboratory service in Saint Petersburg, Russia. At the moment it generates about 3500 reports a day. The implementation of the system increased the number of patients who see a doctor after laboratory tests by 14 %. A qualitative study with 100 patients demonstrated high acceptance of the system. The majority (82 %) of the patients reported that they trust the system and follow its advice to visit a doctor if necessary. The paper presents the process of development and implementation of a decision support system for laboratory service patients. The system allows patients to read and understand medical records in natural language. For the laboratory service, the system increased patient satisfaction and the number of patients who came back to the laboratory service for more detailed testing. 
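The classification problem stated above (mapping a vector of test results to flagged conditions plus recommendations) can be sketched as a simple rule-based interpreter. This is a minimal illustration, not the Helix system's logic; the reference ranges, condition names and recommendations below are hypothetical placeholders, not medical advice.

```python
# Hypothetical reference ranges; real systems use curated medical knowledge.
REFERENCE_RANGES = {
    "glucose_mmol_l": (3.9, 5.6),
    "tsh_miu_l": (0.4, 4.0),
    "hemoglobin_g_l": (120, 160),
}

# (analyte, direction) -> (possible condition, recommendation); illustrative only.
RULES = {
    ("glucose_mmol_l", "high"): ("possible hyperglycaemia", "consult an endocrinologist"),
    ("tsh_miu_l", "high"): ("possible hypothyroidism", "consult an endocrinologist"),
    ("hemoglobin_g_l", "low"): ("possible anaemia", "consult a general practitioner"),
}

def interpret(results):
    """Map a vector of test results to flagged conditions and advice."""
    findings = []
    for analyte, value in results.items():
        low, high = REFERENCE_RANGES[analyte]
        if value > high:
            direction = "high"
        elif value < low:
            direction = "low"
        else:
            continue  # within reference range, nothing to flag
        if (analyte, direction) in RULES:
            findings.append(RULES[(analyte, direction)])
    return findings

report = interpret({"glucose_mmol_l": 7.8, "tsh_miu_l": 2.1, "hemoglobin_g_l": 101})
```

A production system would attach such interpretations to the laboratory report in natural language, as the abstract describes.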
How to analyse the relation between bowel obstruction and gynaecological surgery using existing Danish health and social registries Kopp TI 1 , Norrbom C 2 , Osler M 1 , Løkkegaard E 2 1 Research Centre for Prevention and Health, Glostrup, Denmark; 2 Department of Obstetrics and Gynecology, Hillerød, Denmark Adhesions are frequent after gynaecological surgery and can cause serious complications, including bowel obstruction (1-4). In Denmark, short-term complications after gynaecological surgery (including caesarean section and hysterectomy) are well known (5), but the long-term risks have scarcely been investigated. Moreover, some surgical procedures may cause more adhesions than others, with a subsequently increased risk of bowel obstruction (1, 4, 6). In most countries, identifying persons with a relatively rare condition such as bowel obstruction can be a challenge due to the lack of systematic registration, requiring exhaustive efforts from the investigators. Denmark has an extensive collection of national and disease-specific registries with unique information at the patient level (7, 8). The personal identification (CPR) number assigned to every Danish citizen makes it possible to link data from multiple data sources, facilitating register-based research (9). The overall aim of the study was to investigate the incidence of bowel obstruction in women after different types of gynaecological surgery over the last decades in Denmark, using existing data retrieved from various Danish health and social registries. The present paper describes the method used in the study. Overall, data on 3.9 million women were obtained from the National Patient Registry (10) between 1977 and 2013. Data on relevant diagnoses and surgeries were retrieved for all included women, including dates of admission and surgery. 
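The CPR-based linkage described above (joining extracts from multiple registries on the personal identifier) can be sketched as an inner join. This is a schematic illustration only; the records and identifiers below are fabricated placeholders, not registry data.

```python
# Fabricated registry extracts for illustration; "cpr" stands in for the
# Danish personal identification number.
patient_registry = [
    {"cpr": "A1", "diagnosis": "bowel obstruction", "admitted": "2001-05-12"},
    {"cpr": "B2", "diagnosis": "appendicitis", "admitted": "1999-03-02"},
]
surgery_registry = [
    {"cpr": "A1", "procedure": "caesarean section", "date": "1998-07-01"},
    {"cpr": "C3", "procedure": "hysterectomy", "date": "2005-11-20"},
]

def link_by_cpr(left, right, key="cpr"):
    """Inner-join two registry extracts on the personal identifier."""
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    linked = []
    for row in left:
        for match in index.get(row[key], []):
            linked.append({**row, **match})
    return linked

cohort = link_by_cpr(patient_registry, surgery_registry)
```

In a real register-based study this join would be performed in a secure research environment, and the linked records would feed the time-to-event analyses mentioned below in the abstract.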
In order to adjust for socioeconomic position at the individual level, data on educational background, income and civil status were retrieved from Statistics Denmark (11) for the included women and their partners. From the Civil Registration System (12), we extracted data on vital status, immigration and emigration. To examine the incidence of bowel obstruction after caesarean sections, we retrieved data from the Danish Medical Birth Registry (13), where information on all births in Denmark is registered (i.e. number of births, birth outcome, maternal risk factors and complications). Data from the Danish Hysterectomy and Hysteroscopy Database (DHHD) (14) were extracted to investigate the risk of bowel obstruction after hysterectomy and after different surgical methods. In the DHHD, detailed information about women undergoing hysterectomy from 2003 onwards is registered, including indications, type of surgical procedure, complications, comorbidity, lifestyle factors, re-admissions and morbidity. This large amount of data, retrieved from various Danish health and social registries renowned for their comprehensive, high-quality data with very little selection bias (7, 8), enables us to conduct time-to-event analyses with bowel obstruction as the outcome. The data also allow comprehensive, unbiased descriptive analyses of gynaecological surgical procedures over time in the Danish female population. Using data from existing Danish health and social registries, we are able to examine many new aspects of the relation between bowel obstruction and various gynaecological surgeries with very little selection bias. Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet finding the right balance between costs, estimated effectiveness and the experience of the care recipient is a daunting task that requires considering vast amounts of information. 
We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw on techniques from Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of these technologies for guiding care workers in care planning. Applying harvest plots to visualize the efficacy of different guideline implementation strategies in the primary care setting Clinical practice guidelines represent the most important decision-making support, offering a synthesis of the current evidence and recommendations for action. However, their use is still not comprehensively accepted by general practitioners (GPs). Finding the most effective implementation method in the literature may improve acceptance and adherence. However, the diversity of the applied intervention schemes and the heterogeneity of the outcomes prevent direct comparison using traditional methods such as forest plots. Harvest plots provide an alternative graphical method, displaying not only the effect but also other parameters such as study quality. This study aimed to evaluate the effect of intervention methods by using harvest plots. We performed a systematic search in Medline and Embase for implementations of guidelines on non-communicable diseases in the primary care setting reported in the last 5 years. The quality of the studies was assessed according to the Cochrane criteria. The outcome indicators covered the categories of process of care (prescription, diagnostic behaviour, counselling and knowledge) and patient outcomes. The level of effect was estimated from the proportion of improved outcome indicators as "no", "moderate" or "strong". Results were visualized using harvest plots, displaying the level of effect and the quality of the paper in a matrix of intervention schemes vs. outcome categories. 
Each cell of this matrix displays parameters of the included studies using bar charts. The height of each bar corresponds to the quality of the paper, the colour to the sample size, and the positioning of the bars to the level of effect. Each row corresponds to an intervention scheme and each column to an outcome category. The matrix display of five outcome categories and six intervention schemes highlights at a glance the results from a total of 22 included papers. Outcome categories were efficient knowledge transfer (three studies), diagnostic behaviour (12 studies), prescription habits (13 studies), counselling (three studies), and patient-based outcomes (eight studies). Investigated intervention schemes were (1) training (four studies), (2) training supplemented by audit (five studies), (3) training supplemented by other methods (three studies), (4) guideline distribution (three studies), (5) diverse isolated single methods (three studies), and (6) multifaceted approaches (four studies). Training and multifaceted approaches had relevant effects on most outcome categories, while distribution of guidelines was largely ineffective. The heterogeneous outcomes and designs of the included studies called for a straightforward graphical presentation of the results. Harvest plots can harmonize the results and facilitate interpretation. Our review showed that, instead of increasing the complexity of intervention schemes, interventions should be adjusted to match the targeted specific outcome and the actual needs and expectations of the GPs. 
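The data structure behind such a harvest plot (an intervention-by-outcome matrix in which each study contributes one bar whose height, colour and position encode quality, sample size and effect) can be sketched before any drawing is done. The study records below are invented examples, not the 22 papers of the review.

```python
from collections import defaultdict

# Invented study records; fields mirror the harvest-plot dimensions described
# above (quality -> bar height, sample size -> colour, effect -> position).
STUDIES = [
    {"id": "S1", "intervention": "training", "outcome": "prescription",
     "quality": 3, "n": 420, "effect": "strong"},
    {"id": "S2", "intervention": "training+audit", "outcome": "prescription",
     "quality": 2, "n": 150, "effect": "moderate"},
    {"id": "S3", "intervention": "guideline distribution", "outcome": "knowledge",
     "quality": 1, "n": 90, "effect": "no"},
]

def harvest_matrix(studies):
    """Group studies into the intervention x outcome cells of a harvest plot."""
    cells = defaultdict(list)
    for s in studies:
        bar = {"id": s["id"], "height": s["quality"],
               "colour_key": s["n"], "position": s["effect"]}
        cells[(s["intervention"], s["outcome"])].append(bar)
    return dict(cells)

matrix = harvest_matrix(STUDIES)
```

Separating the matrix construction from the rendering keeps the visual encoding (bar height, colour, position) easy to change without touching the study data.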
Associations between HbA1c and progression of coronary artery calcification Kowall B 1 , Lehmann N 2 , Moebus S 3 , Erbel R 1 , Jöckel KH 1 , Stang A 1 1 Institut für Medizinische Informatik, Biometrie und Epidemiologie, Universitätsklinikum Essen, Essen, Germany; 2 Universitätsklinikum Essen, Essen, Germany; 3 Institut für Medizinische Informatik, Biometrie und Epidemiologie, Essen, Germany Background A hemoglobin A1c (HbA1c) of ≥6.5 % has been defined as a new criterion for the diagnosis of diabetes, but except for retinopathy little is known about how strongly the risks of diabetes complications (or their precursors) are increased in persons with newly detected HbA1c-defined diabetes. Coronary artery calcification (CAC) is an established predictor of cardiovascular events and might be a mediator between diabetes and incident cardiovascular events. We assessed whether the new 6.5 % HbA1c criterion for the diagnosis of diabetes is associated with progression of CAC, and how well glucose control measured by HbA1c is associated with CAC progression in persons with diabetes. In the Heinz Nixdorf Recall Study, a population-based cohort study in Germany (N = 3,453, age 45-74 years), CAC was assessed by electron-beam tomography at baseline and at a median follow-up time of 5.1 years. Among persons without previously known diabetes, 2,463 (71.3 %) had HbA1c <5.7 %, 653 (18.9 %) had HbA1c of 5.7-6.4 %, and 114 (3.3 %) had HbA1c ≥6.5 %; among persons with previously known diabetes, 129 (3.7 %) had good (HbA1c <7.0 %) and 94 (2.7 %) poor diabetes control (HbA1c ≥7.0 %). 1256 (36.4 %) participants were free of CAC at baseline. We assessed associations between (1) newly detected, HbA1c-defined diabetes (HbA1c ≥6.5 %) and prediabetes (HbA1c 5.7-6.4 %), and (2) poorly controlled (HbA1c ≥7.0 %) and well controlled (HbA1c <7.0 %) previously known diabetes as exposure variables and CAC onset and CAC progression as the outcome variables. 
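Risk ratios of the kind reported in such analyses can be computed from a 2x2 table with a Wald confidence interval on the log scale. The counts below are hypothetical, chosen only to illustrate the calculation; they are not the Heinz Nixdorf Recall data.

```python
import math

def risk_ratio(events_exposed, n_exposed, events_ref, n_ref, z=1.96):
    """Risk ratio with a Wald 95 % CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_ref / n_ref)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_ref - 1 / n_ref)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: CAC onset in 24/60 exposed participants
# vs. 200/1000 in the reference group.
rr, lo, hi = risk_ratio(24, 60, 200, 1000)
```

The study itself fitted robust Poisson regression models to obtain adjusted risk ratios; the crude calculation above only shows where the unadjusted RR and its interval come from.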
For CAC onset as the dependent variable, we fitted robust Poisson regression models (crude, and adjusted for age, sex, BMI, smoking, lipids, blood pressure, and use of antihypertensives); for CAC progression, we fitted linear regression models. In subjects without baseline CAC, the risk of CAC onset was increased in poorly and well controlled previously known diabetes (RR = 1.9, 95 % CI 1.1; 3.5, and RR = 1.9, 95 % CI 1.3; 2.9, respectively), but not in newly diagnosed, HbA1c-based diabetes or prediabetes (RR = 0.9, 95 % CI 0.4; 2.0, and RR = 1.1, 95 % CI 0.9; 1.4, respectively) (crude model; reference: persons without previously known diabetes with HbA1c <5.7 %). After adjustment, RRs were attenuated to 1.5 (95 % CI 0.8; 2.7) for well controlled and 1.5 (95 % CI 1.0; 2.4) for poorly controlled diabetes. The percentage increase of the mean 5-year CAC progression was largest in poorly controlled known diabetes (70.1 %, 95 % CI 35.9; 115.1) and well controlled known diabetes (22.4 %, 95 % CI 0.5; 49.1), but small in newly detected diabetes (0.4 %, 95 % CI -18.5; 23.8) and prediabetes (7.5 %, 18.4). Effects were only slightly attenuated after full model adjustment. In known diabetes, CAC progression was slower when diabetes control was good. In newly detected, HbA1c-diagnosed diabetes and prediabetes, there was no increased risk of CAC onset or CAC progression over a 5-year follow-up period. Associations between sleep characteristics and weight gain in an older population: results of the Heinz Nixdorf Recall Study Kowall B 1 , Lehnich AT 2 , Erbel R 2 , Moebus S 3 , Jöckel KH 2 , Stang A 2 1 Institut für Medizinische Informatik, Biometrie und Epidemiologie, Universitätsklinikum Essen, Essen, Germany; 2 Universitätsklinikum Essen, Essen, Germany; 3 Institut für Medizinische Informatik, Biometrie und Epidemiologie, Essen, Germany Sleep duration influences weight change in children and young adults, but there is less evidence for middle-aged and, in particular, older adults. 
We assessed associations of sleep duration, daytime napping and sleep disturbances, respectively, with change in weight and waist circumference in older subjects. In contrast to previous studies, we also used two points in time to assess sleep characteristics. We used data from the population-based Heinz Nixdorf Recall study, a cohort study in Germany with a baseline and two follow-up visits (age 45-74 years, median follow-up 5.1 years to the first and 5.2 years to the second follow-up visit). In adjusted linear regression models (N = 3,751), we estimated weight change between baseline and the first follow-up visit in relation to various self-reported sleep characteristics measured at baseline. Furthermore, we estimated change in weight and waist circumference, respectively, between the first and second follow-up visits in relation to patterns of sleep characteristics measured at baseline and at the first follow-up visit (N = 2,837). In all analyses, short and long sleep duration, sleep disturbances, and regular daytime napping were associated with less than one kilogram of weight gain and less than one centimetre of gain in waist circumference over five years compared to the respective reference categories. For example, compared with 7 to <8 h of night sleep, short night sleep (≤5 h at baseline) was associated with 0.5 kg of weight gain (95 % confidence interval -0.1 kg; 1.1 kg). Our study gave no evidence that sleep characteristics were associated with clinically relevant weight gain in the older population. Discharge letters are an important means of communication between physicians and nurses from intensive care units and their colleagues on normal wards. The patient data management system (PDMS) used at our local intensive care units provides an export tool to create discharge letters by inserting data items from electronic medical records into predefined templates. 
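Template-based letter generation of the kind the PDMS export tool performs, inserting data items from the record into predefined templates, can be sketched as placeholder substitution. The {{name}} placeholder syntax, the variable names and the sample values below are all assumptions for illustration, not the PDMS's or Arden Syntax's own format.

```python
import re

def fill_template(template, variables):
    """Replace {{name}} placeholders with variable content; unknown names stay put."""
    def lookup(match):
        name = match.group(1)
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{\{(\w+)\}\}", lookup, template)

# Hypothetical discharge-letter fragment and data items.
letter = fill_template(
    "Patient {{surname}} was admitted on {{admission_date}}. "
    "Creatinine at discharge: {{creatinine}} mg/dl.",
    {"surname": "Doe", "admission_date": "2016-03-04", "creatinine": 1.1},
)
```

An Arden Syntax interface function of the kind described in the following abstract goes further than this sketch, since the replacement values can be computed with the full set of Arden Syntax language constructs, including conditional text sections.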
Local intensivists criticized the limitations of this tool regarding the identification and further processing of clinically relevant data items for a flexible creation of discharge letters. As our PDMS supports Arden Syntax, and the demanded functionalities are well within the scope of this standard, we set out to investigate the suitability of Arden Syntax for the generation of discharge letters. To provide an easy-to-understand facility for integrating data items into document templates, we created an Arden Syntax interface function which replaces the names of previously defined variables with their content in a way that permits arbitrary custom formatting by clinical users. Our approach facilitates the creation of flexible text sections through conditional statements, as well as the integration of arbitrary HTML code and dynamically generated graphs. The resulting prototype enables clinical users to apply the full set of Arden Syntax language constructs to identify and process relevant data items in a way that far exceeds the capabilities of the PDMS export tool. The generation of discharge letters is an uncommon area of application for Arden Syntax, differing considerably from its original purpose. However, we found our prototype well suited for this task and plan to evaluate it in clinical production after the next major release of our PDMS. Background Ischemic heart disease is the most important cause of death in Germany, and hypertension is one of its most important risk factors. Among children, bronchitis is one of the most common acute diseases and asthma the most common chronic disease. 
We investigated the influence of road traffic noise ≥55 dB(A), of socioeconomic and sociodemographic risk factors summarized from a number of indicators into a Principal Component Analysis (PCA)-based score resulting in the new variable 'social burden', of physician density as a surrogate measure of access to health care, and of the proportion of people aged 65 or older on prevalence rates of heart insufficiency and hypertension. Additionally, the influence of road traffic noise ≥55 dB(A), social burden, and physician density on prevalence rates of acute bronchitis and asthma was analyzed in children aged 14 or younger. Methods All disease data and information on social indicators were obtained from the Central Research Institute of Ambulatory Health Care in Germany and from the Statistical Office for Hamburg and Schleswig-Holstein. To determine the population affected by noise ≥55 dB(A), a road traffic noise map was overlaid with a population map and the affected area fractions were calculated and scaled in percent. We modeled associations between potential risk factors and district prevalence of heart insufficiency, hypertension, acute bronchitis, and asthma by multivariate ANCOVA. Due to expected sex-specific differences, all analyses were conducted for both sexes. To account for the age dependency of heart insufficiency and hypertension, we used age-standardized prevalences. The independent variables 'district area affected by noise ≥55 dB(A)', 'proportion of people aged 65+', and 'physician density' were considered continuous and scaled per 5 % increase in district area; social burden was modeled in three groups (low, average, high social burden). The analysis of cardio-respiratory disease in Hamburg clearly indicated an association between the socioeconomic and sociodemographic district background and the spatial prevalence of heart insufficiency, hypertension, acute bronchitis, and asthma. 
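The PCA-based 'social burden' score described above can be illustrated with a minimal sketch. The district indicators and their values below are hypothetical; for two standardized, positively correlated indicators the first principal component of the correlation matrix points in the direction (1, 1)/√2, so the PC1 score reduces to a scaled sum of z-scores:

```python
import math
import statistics

def zscores(values):
    """Standardize a list of indicator values to mean 0 and SD 1."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def burden_score(indicator_a, indicator_b):
    """First-principal-component score for two positively correlated,
    standardized indicators: PC1 direction is (1, 1)/sqrt(2)."""
    za, zb = zscores(indicator_a), zscores(indicator_b)
    return [(a + b) / math.sqrt(2) for a, b in zip(za, zb)]

# Hypothetical district-level indicators (percent)
unemployment = [3.0, 4.5, 6.0, 9.0, 12.0]
welfare_rate = [2.0, 3.5, 5.0, 8.5, 11.0]
scores = burden_score(unemployment, welfare_rate)
# Districts can then be grouped into low/average/high burden, e.g. by tertiles.
```

With more than two indicators the PC1 weights must be estimated from the data; this two-indicator case is only the simplest closed-form instance of the same idea.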
A significantly higher prevalence of both cardiovascular health outcomes was detected for districts classified as high social burden, with the strongest effect among hypertensive female patients (4.34, p < 0.0001, in the highest social burden category compared to the group of low social burden). Additionally, the proportion of people aged 65+ showed highly significant results for both cardiovascular health outcomes (p < 0.0001). An increase of 5 % in physician density per district was significantly associated with a lower prevalence of hypertension (−1.57, p < 0.0001, among males and −1.54, p = 0.005, among females), whereas the district area affected by road traffic noise ≥55 dB(A) did not show any significant influence. For the respiratory health outcomes the same trends were observed, but the results were less strong and only partly significant. The results suggest that the socioeconomic and sociodemographic district background is significantly associated with cardio-respiratory diseases. Future studies should focus on the individual level to corroborate our findings and to develop strategies for reducing the prevalence of cardio-respiratory disease in high-burden areas. Globally, an estimated 14.5 million hospital admissions are caused by acute lower respiratory tract infections (ALRI) in children under 5 years each year. The implementation of molecular diagnostic tools (e.g. PCR) has led to the detection of a large number of organisms claimed to be causal in ALRI. However, current data are insufficient to determine whether these viruses are truly pathogenic or commensals. The aim of this study was to assess the prevalence, pathogenicity and clinical relevance of organisms in respiratory infections of Ghanaian children using a case-control design. Methods From September 2014 to August 2015 all children admitted to the Agogo Presbyterian Hospital in Ghana were screened for symptoms of ALRI and recruited for participation if symptomatic. 
Concurrently, healthy controls were recruited from the community. Pharyngeal swabs were taken from all participants and analysed by PCR for pneumococci, Mycoplasma, Influenza (A, B), Parainfluenza (1-4), hMPV, RSV, Entero-, Rhino-, Adeno-, Parechovirus and Human Coronaviruses (NL63, 229E, OC43, HKU-1). The frequency of organisms in both groups was determined and age-adjusted odds ratios (OR) for association with ALRI were calculated using logistic regression models. The attributable risk fraction of ALRI for each organism was estimated. Of 723 children admitted with fever, 337 (46.6 %) fulfilled the criteria for inclusion as cases. In total, 573 healthy controls were recruited. Of these, 235 (69.7 %) of cases and 271 (47.3 %) of controls tested positive for at least one organism. The most common organisms in cases were pneumococci (163, As a contribution to WHO resolution 66.23 [1] this WHO project is producing an eBook for the social determinants of health approach in order to build and strengthen global health workforce capacity. The rapid rise and acceptance of eBooks as information and learning resources provides a common 'place' to support, empower and deliver institutional learning as well as in-service learning such as the community engaged learning approach. The concept embraces an eLearning approach that uses Information and Communication Technologies (ICT), which opens up a wide range of possibilities, both in pre-service and in-service education. Smart health workforce education is a networked systems approach that integrates traditional print with multimedia and social media, aligned to an agile development process underpinned by web-based analytics. The interactive multimedia eBook format is a promising tool and resource with 'the capacity to shape how individuals access information in medical education and healthcare settings' [2] as well as proven learning benefit [3]. 
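The attributable-fraction estimation in the ALRI case-control study above can be sketched as follows. The counts are hypothetical (not taken from the abstract), and this is the simple unadjusted Miettinen-type estimate, whereas the study used age-adjusted logistic regression:

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    return (a * d) / (b * c)

def attributable_fraction(a, b, c, d):
    """Population attributable fraction from case-control data:
    AF = p_c * (OR - 1) / OR, where p_c is the proportion of cases exposed
    (the OR approximates the relative risk for a rare outcome)."""
    or_ = odds_ratio(a, b, c, d)
    p_c = a / (a + b)
    return p_c * (or_ - 1) / or_

# Hypothetical counts: 120 of 337 cases and 60 of 573 controls positive
# for some organism
af = attributable_fraction(120, 217, 60, 513)
```

An AF near 0.28 here would mean that roughly 28 % of ALRI cases could be attributed to that organism, if the association were causal.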
In order to cover different needs of ICT equipment and usage worldwide, the content will be delivered as an interactive PDF and on a website, next to the eBook in the open format EPUB3 as the central resource. Research questions have been developed as part of an international eBook project. Key aspects are the effectiveness of the eBook as a learning resource in healthcare education, the number and origins of downloads, and the rationale for the selection of the provided formats; assessing the context of usage, along with questionnaires, will provide in-depth insight for future developments. In Japan, the School Health Law has mandated an annual health checkup for all students in primary school, junior high school, high school and university since 1958. The checkup items comprise physical attributes and, optionally, exercise capacity. The results are reported to the Ministry of Education, Culture, Sports, Science and Technology (MEXT) as summary statistics. In almost all schools the results are recorded on paper and preserved for five years; after the legal preservation period, the records are discarded. In these circumstances, the authors pointed out the necessity of preserving school health checkup results from the viewpoint of life-long health record preservation. Because an ISO 13606 Archetype is defined as a maximum dataset, the standard is capable of integrating a new archetype for school health checkups. Therefore, the authors employ and extend openEHR Archetypes to express the health checkup items. Because every municipality employs its own paper format, the record items are not unified across the whole country. To define archetypes, the authors derive a basic archetype format representing the school health checkup from a guideline book published by the Japanese Society of School Health. As a first step, the authors compare the items in each entry to existing Archetypes, and define a cluster archetype for each entry. 
As a second step, existing health checkup data are analyzed to evaluate the validity of real-world data, which are recorded differently at several schools. Because there is no existing dental Archetype definition, the authors propose oral Archetypes. The proposed oral Archetype consists of eleven concepts. To evaluate the proposed Archetypes of the school health checkup, an analysis of existing health checkup data was performed, assessing how many Archetypes are actually used by these data. The granularity of existing health checkup items is coarser than that of the proposed Archetypes in the first place. The following characteristics were observed. Demographics: the test date is uncertain because some tests are performed on different dates, and the birth year is uncertain because only the school grade is registered in some schools. Observation: an iterative free-text form should be provided, especially for multiple symptom descriptions. Raw data in a defined free-text field often have a CSV-like structure, which can be separated into different free-text forms, e.g. 'Left shoulder keloid scar, atopy, contused wound'. If there is no specific abnormality, no data are recorded for normal status. Therefore, the observed datasets of every school are sparse. It was concluded that the proposed Archetypes have enough structure to preserve real data except for the iterative free text of specific items. The development of personalized medicine approaches is one of the major challenges of modern medicine. So far, research has focused on developing the most effective treatment for entire populations. However, the most effective treatment might be different for different subgroups of patients. Recent developments in modern laboratory medicine (like advances in genomics and proteomics) have led to the discovery of new biomarkers determining subgroups of patients benefitting from a specific treatment. The impact of genomic variability on treatment response is often assessed in a biomarker-based design. 
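The CSV-like free-text observations in the school health checkup data above can be split into individual entries with a minimal sketch (a simple comma split; real records may need more robust parsing):

```python
def split_observations(free_text: str):
    """Split a comma-separated free-text observation into individual items.

    An empty field (normal status, where schools record nothing) yields an
    empty list, reflecting the sparseness of the observed datasets.
    """
    return [item.strip() for item in free_text.split(",") if item.strip()]

entries = split_observations("Left shoulder keloid scar, atopy, contused wound")
```

Each resulting item can then be stored in its own free-text element of the proposed Archetype instead of one concatenated string.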
Currently, such studies are restricted in the sense that only patients for whom the biomarker can be measured can be enrolled. Patients who are unable to provide adequate biosamples are thus excluded from the trial, and in some cases the proportion of such patients can be substantial. A recent study in non-small cell lung cancer showed that the epidermal growth factor receptor (EGFR) could not be measured in about 30 % of the patients [1]. Hence, results from trials including only patients whose biomarker can be assessed provide evidence only on how to treat these patients. We demonstrate how current biomarker-based designs can be extended to also include patients with missing biomarker status. Analytical solutions for the test statistic of the interaction effect between the biomarker status and the received treatment are derived. We also show under what circumstances the data from these patients can be used to obtain a more powerful test of the interaction effect. The inequality of patient profile information in Japanese hospitals Kurihara Y 1 , Ishida H 2 , KIMURA E 3 , Gochi A 4 , Kondoh H 5 , SHIMAI Ki 6 , Nakajima N 7 , Tanaka T 8 , Ishikawa k 8 , Oohara M 9 , Sonoda T 10 , Takai K 11 1 Kochi University, Nankoku, Japan; 2 Yamaguchi University Hospital, Ube, Japan; 3 Ehime University, Matsuyama, Japan; 4 Okayama University Hospital, Okayama, Japan; 5 Tottori University Hospital, Yonago, Japan; 6 Tokushima University Hospital, Tokusima, Japan; 7 Kochi Medical School, Kochi University, Nankoku, Japan; 8 Hiroshima University Hospital, Hiroshima, Japan; 9 NEC Corporation, Tokyo, Japan; 10 Fujitsu Corporation, Hakata, Japan; 11 Japan IBM Corporation, Tokyo, Japan A model dataset of patient profile information was created based on the items used at five Japanese university hospitals, the patient information data elements in Health Level 7 (HL7) v2.5, and the standard datasets for medical information exchange used in Japan. 
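The biomarker-by-treatment interaction tested in the biomarker-design abstract above can be sketched for a continuous outcome. This is a generic Wald-type test on four-group summary data with invented numbers, not the analytical solutions derived in that work:

```python
import math
import statistics

def interaction_test(groups):
    """Wald test for a treatment-by-biomarker interaction on a continuous
    outcome. `groups` maps (treatment, biomarker) in {0,1}x{0,1} to lists
    of outcomes; the interaction is the difference of treatment effects
    between the biomarker-positive and biomarker-negative strata."""
    means = {k: statistics.fmean(v) for k, v in groups.items()}
    var_of_mean = {k: statistics.variance(v) / len(v) for k, v in groups.items()}
    estimate = (means[(1, 1)] - means[(0, 1)]) - (means[(1, 0)] - means[(0, 0)])
    se = math.sqrt(sum(var_of_mean.values()))
    return estimate, estimate / se

# Hypothetical outcomes: (treatment, biomarker) -> observations
groups = {
    (0, 0): [1.0, 1.2, 0.9, 1.1],
    (1, 0): [1.1, 1.3, 1.0, 1.2],   # small treatment effect, biomarker-negative
    (0, 1): [1.0, 0.9, 1.1, 1.0],
    (1, 1): [2.0, 2.2, 1.9, 2.1],   # large treatment effect, biomarker-positive
}
estimate, z = interaction_test(groups)
```

Patients with missing biomarker status form a fifth group that this naive test discards; the abstract's point is precisely how to bring their data back into the inference.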
In order to check the validity of the model dataset, a cross-sectional survey was performed. A preliminary analysis of 20 Japanese hospitals found that most items were implemented at some hospitals, but the number of items implemented at many hospitals was rather small. This result strongly shows the necessity for a standardized dataset of patient profile information. Shortening or abolishing shift report as the effect of ENR and EMR implementation, and its side effects Kurihara Y 1 , Tanaka A, Nakajima A, Fukuju Y, Mori N, Shimoda M 1 Kochi University, Nankoku, Japan We investigated the shortening or abolition of the shift report as an effect of implementing electronic nursing and medical records, and its side effects, at 10 Japanese hospitals. The majority of staff nurses accepted this change, but both directors and staff nurses recognized the risk of insufficient collection and communication of patient information, difficulties in understanding the risks and matters of patients not under their care, and an increase in the time needed to collect information from the computer before starting patient care. Directors should carefully evaluate and reduce the negative influences associated with changing or eliminating the traditional shift report. Bayesian modeling of skewed distributions for health care costs using mixture models Kurz CF 1 , Holle R 1 1 Helmholtz Zentrum München, German Research Center for Environmental Health (GmbH), Neuherberg/München, Germany The modeling of costs in health care utilization is often problematic because of markedly right-skewed distributions and a significant percentage of excess zeros. In addition, there may be a small but important proportion of high users, which makes it more adequate to distinguish between low users and high users, rather than non-users and users. In this study we present a fully Bayesian approach to determine the best modeling strategy using finite mixture models (FMMs) to distinguish these groups. 
This is an ideal setting for a Bayesian interpretation, where parameters of interest are described by a distribution reflecting uncertainty concerning the true value of the parameter. We start by examining the model fit of different distributions (e.g. Gamma, Lognormal) in FMMs using both generated and real data from German health insurance. Model choice is of even greater importance when using health claims data, due to the large number of observations available. We explore the ideal number of groups that provides the best model fit by comparing models using the Watanabe-Akaike information criterion and leave-one-out cross-validation. Results FMMs provide better model fit than traditional (i.e. single-group) distributions for cost data in health economic outcomes. Distinguishing between high and low user groups using FMMs is often more adequate than single distribution models. In the DREAM ALS Stratification Prize4Life Challenge held in summer 2015, more than $25,000 was awarded to find the most accurate prediction of ALS progression and survival. Based on patient data from two national registries, solvers were asked to predict survival for three different time intervals and were then evaluated on undisclosed information from additional data. In this presentation I describe my method that was submitted to the challenge. Methods I used extensive feature engineering to generate new predictive variables out of the available longitudinal data. Then, based on different variable selection and feature importance methods (elastic net, boosting, random forest), the most predictive features were used in a random survival forest model. In a post-challenge analysis I stratified the patients into four risk groups by nonparametric density clustering. Regarding prediction accuracy, this approach ranked second best. 
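The low-user/high-user grouping in the cost-mixture abstract above can be illustrated with a minimal frequentist EM sketch on simulated log costs (the abstract's actual model is fully Bayesian and compares fits via WAIC; the data and component values here are invented):

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def em_two_component(data, iterations=50):
    """EM for a two-component Gaussian mixture on log costs:
    a 'low user' and a 'high user' component."""
    mu = [min(data), max(data)]
    sigma = [1.0, 1.0]
    weight = [0.5, 0.5]
    for _ in range(iterations):
        # E step: responsibility of the 'high user' component for each point
        resp = []
        for x in data:
            p0 = weight[0] * normal_pdf(x, mu[0], sigma[0])
            p1 = weight[1] * normal_pdf(x, mu[1], sigma[1])
            resp.append(p1 / (p0 + p1))
        # M step: update mixing weights, means and standard deviations
        n1 = sum(resp)
        n0 = len(data) - n1
        weight = [n0 / len(data), n1 / len(data)]
        mu = [sum((1 - r) * x for r, x in zip(resp, data)) / n0,
              sum(r * x for r, x in zip(resp, data)) / n1]
        sigma = [max(0.1, math.sqrt(sum((1 - r) * (x - mu[0]) ** 2 for r, x in zip(resp, data)) / n0)),
                 max(0.1, math.sqrt(sum(r * (x - mu[1]) ** 2 for r, x in zip(resp, data)) / n1))]
    return mu, sigma, weight

random.seed(1)
# Simulated log costs: many low users around 4, few high users around 8
data = [random.gauss(4, 0.5) for _ in range(180)] + [random.gauss(8, 0.5) for _ in range(20)]
mu, sigma, weight = em_two_component(data)
```

The fitted weight of the second component estimates the small proportion of high users that a single-distribution model would blur into its right tail.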
The prediction model confirms previous reports suggesting that past disease progression measured by the ALSFRS (ALS functional rating scale) score, time of disease onset, onset site and age are strong predictors of survival. In the post-challenge analysis I found that patient stratification can further improve the predictions. The DREAM ALS stratification challenge was an interesting experience and will hopefully help to better understand and predict the progression of ALS. It provided a realistic setting with real data and attracted over 25 teams from around the world. What kind of user testing should be required before healthcare IT is released? Ensuring the usability and safety of health information technology (HIT) depends on the extent and quality of user testing. To address the question of what kind of user testing should be required before HIT is released, the panelists will describe their perspectives and work. This includes methods for improving the level of testing using rapid low-cost usability testing methods, clinical simulations, multi-phased testing and use of best available evidence. Regulatory issues will also be discussed, including how regulations applied to medical devices could be extended to ensuring the usability and safety of healthcare software systems. International Perspectives on Innovation in HIT to drive Healthcare Transformation Kuziemsky C 1 , Nøhr C 2 , de Lusignan S 3 , Westbrook J 4 1 University of Ottawa, Ottawa, Canada; 2 Aalborg University, Aalborg, Denmark; 3 University of Surrey, GUILDFORD, United Kingdom; 4 Macquarie University, Sydney, Australia Background Health Information Technology (HIT) will play a key role in healthcare transformation. However, to date, HIT implementation efforts have often fallen short of their desired objectives and, in fact, have often produced undesired consequences (e.g. patient safety or communication issues) that have actually been a barrier to improved healthcare delivery. 
We first need to understand the complexity of healthcare transformation before we can design HIT to support it. Innovation is defined as 'the intentional introduction and application within a role, group, or organization, of ideas, processes, products or procedures, new to the relevant unit of adoption, designed to significantly benefit the individual, the group, or wider society' [3]. Part of understanding the design of innovative technologies is that they both shape and are shaped by the settings and contexts where they exist, and are defined by the various activities and behaviors, both individual and collaborative, that interact with the technologies [4]. Too often HIT implementation is seen as an opportunity to automate rather than to informate [5], and fails because of negative unintended consequences, while positive ones go unrecognized. Health informatics specialists and the HIT they design need to span disciplinary boundaries if they are to be effective [6]. HIT innovations to date have been pursued in a somewhat piecemeal fashion, and that is likely why HIT implementation has not yet lived up to its potential. This international panel will contribute to addressing the above issues by discussing international perspectives on HIT innovation and then beginning the development of a unified framework to guide HIT innovation across the system development lifecycle. Methods This international panel will look at HIT innovation as a driver of healthcare transformation in four countries (Australia, Canada, Denmark and the United Kingdom). A model that describes the relationship between technology and requirements in the context of strategic management of organizational complexity will be used to frame all the presentations [7]. 
Following all the presentations, the panel will discuss the four perspectives as a basis for beginning the development of a unified framework for the design, implementation and evaluation of innovative HIT. Audience participation will be sought for discussion of the framework. The four panelists will provide complementary perspectives on HIT innovation, as each speaker will discuss a specific part of the system development lifecycle. Craig Kuziemsky (Canada) will look at the elicitation of system requirements from organizational and social perspectives. Christian Nøhr (Denmark) will discuss system design from human factors perspectives. Simon de Lusignan (United Kingdom) will discuss the implementation of clinical systems. Johanna Westbrook (Australia) will describe evaluation considerations for HIT innovation. Each panelist will draw upon case studies from their respective countries to supplement their presentation. Background Network meta-analysis is increasingly used to synthesize evidence on several treatments for the same condition, which have been compared in studies with varying subsets of treatments. Often, not all relevant clinical outcomes have been reported in all studies. In this situation, multivariate meta-analysis promises to use data more efficiently and to reduce reporting bias. Unfortunately, information on the correlation of effect estimates for different outcomes is often scarce or lacking. To cope with this, several approaches have been published, among them Bayesian methods, extrapolation from individual patient data of a few studies, and other likelihood-based methods [1]. It is not yet clear whether and when uncertain assessment of within-study correlation parameters induces bias in effect estimates or generates unduly small confidence intervals. 
We therefore propose a tool to assess the sensitivity of treatment effect estimates in network meta-analysis to misspecification of within-study correlations. First, we model a common correlation structure. Secondly, we display the multivariate maximum likelihood estimate of a treatment effect as a function of a correlation parameter. In addition, we display the profile likelihood of the correlation parameter to assess uncertainty about the correlation parameter. We illustrate the method on some published multivariate network analyses. The point estimates resulting from multivariate and univariate models generally differ more if treatment effects are modified by the outcome reporting patterns, if the proportion of studies not reporting some outcome is high, and if the multiple correlation (including the heterogeneity variance structure) between the outcome of interest and all other outcomes is strong. The width of confidence intervals is reduced if the latter two conditions are favorable, a case which has been referred to as borrowing of strength by [2]. If the data do not indicate heterogeneity, the dependence of results on uncertain correlation parameters is stronger than in the case of heterogeneity. Multivariate network meta-analysis is the method of choice to explore reporting bias. It should always be presented in parallel with univariate network meta-analysis. The potential of multivariate network meta-analysis to reduce reporting bias and to lend strength depends on the degree of heterogeneity and the number of heterogeneity degrees of freedom. Significant association between annual hospital volume and the risk of in-hospital stroke or death after carotid endarterectomy, but not following carotid stenting Kühnl A 1 1 MRI, TUM, München, Germany The purpose of this study was to analyze the association between hospital volume and the risk of stroke or death following CEA or CAS on a national level in Germany. 
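The profile-likelihood display over a correlation parameter proposed in the network meta-analysis abstract above can be sketched in a deliberately reduced form: a bivariate normal log-likelihood for standardized outcome-effect pairs, evaluated over a grid of the correlation rho (the pairs are invented; the real tool works with the full multivariate network model):

```python
import math

def bivariate_loglik(pairs, rho):
    """Log-likelihood of standardized effect-estimate pairs under a
    bivariate normal with zero means, unit variances and correlation rho."""
    ll = 0.0
    for x, y in pairs:
        q = (x * x - 2 * rho * x * y + y * y) / (1 - rho ** 2)
        ll += -math.log(2 * math.pi) - 0.5 * math.log(1 - rho ** 2) - 0.5 * q
    return ll

# Hypothetical standardized residuals of two outcomes in five studies
pairs = [(0.5, 0.2), (-1.0, -0.3), (0.2, -0.1), (1.2, 0.6), (-0.4, -0.8)]
grid = [i / 100 for i in range(-95, 96)]
profile = [(rho, bivariate_loglik(pairs, rho)) for rho in grid]
rho_hat = max(profile, key=lambda t: t[1])[0]
```

Plotting `profile` shows how flat or peaked the likelihood is in rho, which is exactly the uncertainty the proposed sensitivity tool is meant to make visible.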
Methods Secondary data analysis using microdata from the nationwide statutory German quality assurance registry on all surgical or endovascular interventions of the extracranial carotid artery between 2009 and 2014. Hospitals were categorized into quintiles that were determined empirically according to the annual case volume. The resulting volume thresholds for our analysis were 10, 25, 46, and 79 for CEA and 2, 6, 12, and 26 for CAS procedures. The primary (binary) endpoint of this study was any stroke or death occurring until discharge from hospital. For risk-adjusted analyses, a multilevel regression model with robust error variance was applied. A total of 161,448 CEA and 17,575 CAS procedures were included in the analysis. In CEA patients, the crude risk for stroke or death decreased monotonically from 4.2 % in low-volume hospitals (first quintile, 1-10 CEA/year) to 2.1 % in hospitals providing ≥80 CEA/year (fifth quintile, p < 0.001 for trend). No such trend was seen in CAS patients (p = 0.304). Risk-adjusted analyses confirmed a significant inverse relationship between hospital volume and the risk of stroke or death after CEA, but not following CAS procedures. This study provides evidence for a significant inverse volume-outcome association in patients treated with CEA. No significant association between hospital volume and the risk of stroke or death was found in CAS patients. Lay crowd-sourced expertise (LCE) and its influence on the new role of patients: ethical and societal issues The emergence of social media on the Internet allows patients to discuss their chronic diseases within online communities sharing common interests. This allows patients to gather other patients' experience and gain new knowledge that is usually not shared by healthcare professionals. 
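The empirical volume-quintile categorization in the carotid-surgery analysis above can be sketched as follows. The hospital volumes below are invented for illustration (they are not the registry data), and the simple order-statistic cut-points here stand in for whatever quantile definition the study used:

```python
def quintile_thresholds(volumes):
    """Empirical quintile cut-points of annual hospital case volumes:
    the volume at the 20th, 40th, 60th and 80th percentile positions."""
    ordered = sorted(volumes)
    n = len(ordered)
    return [ordered[int(n * k / 5) - 1] for k in range(1, 5)]

def crude_risk(events, patients):
    """Crude in-hospital stroke-or-death risk in percent."""
    return 100.0 * events / patients

# Hypothetical annual CEA volumes of 15 hospitals
volumes = [3, 5, 8, 10, 14, 20, 25, 33, 46, 50, 61, 79, 90, 120, 150]
thresholds = quintile_thresholds(volumes)
```

Hospitals are then assigned to the quintile whose threshold range contains their volume, and the crude risk is compared across quintiles, e.g. for a trend test.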
In this context, further studies are required on the actual impact of the use of social networks on the quality of life of patients participating in these online communities, focusing on the evolving role and impact of lay crowd-sourced expertise (LCE) in improving disease management and control. We present a study on a large number of posts from social networks of different online communities. This study allowed us to choose four pathologies with distinctive characteristics relevant for our future analysis, and to define the themes that will be covered in future work by online questionnaires. The analysis of responses from patients who volunteer to participate will help us explore how interactions between patients in these online communities may help them gain useful information for managing their conditions and improving their quality of life. Furthermore, we will identify new ethical issues that arise in the sharing of health data. A genome-wide meta-analysis approach to evaluate the genetic basis of lipoprotein(a) Lamina C 1 , Friedel S 1 , Coassin S 1 , Rueedi R 2 , 3 , Yousri NA 4 , 5 , Seppälä I 6 , Gieger C 7 , 8 , 9 , Schönherr S 1 , Forer L 1 , Erhart G 1 , Marques -Vidal P 10 , Waeber G 10 , Dähnhardt D 1 Lipoprotein(a) (Lp(a)) is a lipoprotein consisting of an LDL-like particle and the glycoprotein apolipoprotein(a) (apo(a)), which are covalently linked to each other. Lp(a) has been shown to be a causative and independent risk factor for cardiovascular diseases and is under pronounced genetic control. The concentration of Lp(a) is mostly influenced by the size of the apo(a) isoforms, which are expressed by copy-number variations in the LPA gene, the so-called Kringle IV repeats. 
In a hypothesis-free genome-wide approach, we aim to identify other variants which might be involved in the regulation of Lp(a) concentrations, either within or in the proximity of the LPA gene and also in other genes. Besides the search for new genome-wide findings, we also aimed to identify tagging SNPs which might be used as proxies for the apo(a) isoforms/Kringle IV repeats, which are not easily measurable in large-scale epidemiological studies. Genome-wide data from five studies of European ancestry (n ≈ 13,000) were available for analysis. In each study, each of the roughly 10 million included SNPs was tested for association with inverse normally transformed Lp(a) values in an additive genetic model using linear regression, adjusted for age and sex. Study-specific results were meta-analyzed using fixed effects as well as random effects models. Stepwise regression models were used to assess independently associated SNPs. All measurements of Lp(a) concentrations by ELISA and of apo(a) isoforms by SDS agarose gel electrophoresis were performed centrally in one laboratory. SNPs in the LPA gene region were most strongly associated with Lp(a) concentrations. More than 2000 SNPs were genome-wide significant, with a minimum p-value of 1.6e-424. A conditional stepwise regression model identified 39 SNPs which were still genome-wide significant in a joint model. These 39 SNPs jointly explain 37 % of the phenotypic variance of Lp(a) concentrations. A subset of these SNPs, however, tags apo(a) isoforms: 5 SNPs are sufficient to explain 39 % of the variance of apo(a) isoforms. Besides SNPs in LPA, one SNP in the APOE gene was also significantly associated with Lp(a) concentrations. For carriers of rs7412-T, Lp(a) is decreased by 3.5 mg/dL (p = 3.4e-10). Our GWA meta-analysis in more than 13,000 individuals identified SNPs in two independent genes: LPA and APOE. 
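The inverse normal transformation applied to Lp(a) before the association testing above can be sketched as a rank-based transform. The Lp(a) values below are invented, the Blom-type offset c = 3/8 is one common convention (the abstract does not state which was used), and ties are not handled in this minimal version:

```python
from statistics import NormalDist

def inverse_normal_transform(values):
    """Rank-based inverse normal transformation (Blom offset c = 3/8),
    as commonly applied to skewed traits such as Lp(a) before GWAS."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    transformed = [0.0] * n
    for rank, idx in enumerate(order, start=1):
        transformed[idx] = NormalDist().inv_cdf((rank - 0.375) / (n + 0.25))
    return transformed

lp_a = [2.0, 150.0, 30.0, 8.0, 75.0]   # hypothetical Lp(a) values, mg/dL
z = inverse_normal_transform(lp_a)
```

The transformed values preserve the ranking of the raw concentrations but follow an approximately standard normal distribution, which stabilizes the linear-regression association tests against the extreme right skew of Lp(a).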
rs7412 in APOE has consistently been shown to decrease LDL cholesterol and the risk of coronary heart disease. The numerous independent SNPs that were identified within the LPA region imply a major task for the subsequent identification of the truly functional variants. Further studies are needed to replicate and validate our proposed SNP marker panel, which can tag especially small apo(a) isoforms with sufficient sensitivity and specificity. Automatic extraction of drug adverse effects from product characteristics (SPCs): a text versus Background Potential adverse effects (AEs) of drugs are described in their summary of product characteristics (SPC), a textual document. Automatic extraction of AEs from SPCs is useful for detecting AEs and for building drug databases. However, this task is difficult because each AE is associated with a frequency that must be extracted, and the presentation of AEs in SPCs is heterogeneous, consisting of plain text and tables in many different formats. We propose a taxonomy for the presentation of AEs in SPCs. We set up natural language processing (NLP) and table parsing methods for extracting AEs from texts and tables of any format, and evaluate them on 10 SPCs. Automatic extraction performed better on tables than on texts. Tables should be recommended for the presentation of the AEs section of SPCs. The future impact of healthcare services digitalization on the health workforce: the increasing role of medical informatics Overall, it is still not clear what the impact on the health workforce will be. Different types of services ('store-and-forward', 'remote monitoring' and real-time 'interactive services') will require distinct health workforces and clinical information. There are areas in which eHealth services will make it possible to optimize human resources, whereas other areas will demand more health professionals, probably even promoting the emergence of new professions. 
Technological change allegedly eliminates routine labor, whether physical or cognitive, and increases demand for non-routine work typically requiring additional education. Depending on the type of service, digitalization will entail a reduced or an increased need for healthcare workers; yet whatever the scenario, medical informatics will play an increasing role. Nevertheless, the impacts of digitization are much more profound and raise many questions that are open to further research. Immunoglobulin G (IgG), a glycoprotein produced and secreted by B-cells, plays a major role in the human immune response. IgG glycosylation patterns affect the Fc binding receptors which, being activated, prompt the immune response. To analyze the genetic influence on glycosylation of IgG, we performed a genome-wide association study (GWAS) on LC-MS-measured glycans in plasma of about 1600 individuals in the KORA (Cooperative Health Research in the Augsburg Region) study cohort. Additionally, we performed GWAS on ratios of glycans within IgG subclasses in order to elucidate genetic factors determining single steps in the glycosylation pathways. We replicated the results from all GWAS in the Leiden Longevity Study (LLS) comprising about 1842 individuals. The results indicate that in addition to genes encoding enzymes known to contribute to IgG glycosylation (ST6GAL1 (chr. 3), B4GALT1 (chr. 9), FUT8 (chr. 14), MGAT3 (chr. 22)), other gene coding areas have a strong influence on IgG glycosylation patterns. Interestingly, associations were found with SNPs on chromosome 7 in the region of IKZF1, a gene encoding a DNA-binding transcription factor. This is of particular interest as its non-binding isoforms are suggested to alter the progression of B-cell malignancies in humans. 
In order to facilitate and increase the usability of Electronic Health Records (EHRs) for healthcare professionals' daily work, we have designed a system that enables functional and flexible EHR access, based on the execution of clinical workflows and the composition of Semantic Web Services (SWS). The backbone of this system is an ontology. In this paper we present the strategy that we have followed for its design, and an overview of the resulting model. The designed ontology enables modelling of patient-centred clinical EHR workflows, the sequence of tasks involved, and the associated functionality and managed clinical data. This semantic model also constitutes the main pillar for enabling dynamic service selection in our system. The extracted information can be viewed as a set of links between variables. In the first example above, variables are ''high young adult BMI'' and ''premenopausal ER+ cancer''. In the second case, variables are ''breast cancer'' and ''low-level (<21 g/d) alcohol drinkers''. In order for Medical Recap to be able to aggregate such links into a dependence graph or a Bayesian network, it needs to group the extracted variables (terms) semantically into concepts, as some terms may refer to similar or related things. For example, the terms ''premenopausal ER+ cancer'' and ''breast cancer'' may be grouped under the concept Breast Cancer. The most common way to approach this problem is clustering. Typically, clustering relies on the provision of a semantic similarity or relatedness measure between the terms or texts to cluster. It then applies a clustering algorithm to group similar terms under clusters. There exists a panoply of similarity measures in the literature, including distributional similarity measures [6, 9, 12], knowledge based [10, 11], or more recently word-embedding based methods [1, 2]. 
In the medical domain, knowledge based semantic similarity approaches rely on the UMLS meta-thesaurus, MeSH, or the SNOMED-CT ontology, e.g., [2, 7, 8] . In this paper, we explore an alternative to similarity-based clustering, suitable for grouping terms (i.e., short text fragments) semantically. The objective is to provide an accurate grouping of terms into a hierarchy of concepts using a domain thesaurus (UMLS). This has the advantage of not depending on the quality of a similarity measure or a clustering algorithm. In addition, it allows users to apply term groupings at custom conceptualisation (abstraction) levels as needed. Our approach uses UMLS to identify concept mentions in the extracted variables. It does so by constructing a hierarchy of subterms for each input term (either by applying a natural language processing constituency parser, or by extracting the lattice of n-grams). It then matches the so-created hierarchies to UMLS concepts. Next, it identifies synonymous sub-terms based on their UMLS concept matches. Then, it selects the most relevant concept mentions based on their frequencies and positions in the term hierarchies. Finally, it creates the hierarchy of concepts and terms according to their meanings (using both UMLS narrower/broader relationships and lexical relationships). The approach described above for hierarchical semantic grouping of medical terms has been implemented and tested as part of the Medical Recap system. The result is an interactive tool that groups medical terms under a hierarchy of concepts, which then may be further edited by the user. The tool supports flexible term grouping, as it allows the user to select the concept levels at which the final term grouping should be applied. 
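The sub-term hierarchy and concept-matching steps described above can be sketched as follows. This is a minimal illustration, not the Medical Recap implementation: the concept dictionary is a toy stand-in for UMLS concept lookup, and all names are assumptions.

```python
# Sketch: group extracted medical terms under concepts via their contiguous
# n-gram (sub-term) lattice. CONCEPTS is a hypothetical stand-in for UMLS.

CONCEPTS = {
    "breast cancer": "Breast Cancer",
    "premenopausal er+ cancer": "Breast Cancer",
    "bmi": "Body Mass Index",
}

def subterm_lattice(term):
    """All contiguous word n-grams of a term, longest first."""
    words = term.lower().split()
    return [" ".join(words[i:i + n])
            for n in range(len(words), 0, -1)
            for i in range(len(words) - n + 1)]

def group_terms(terms):
    """Map each term to the concept of its longest matching sub-term."""
    groups = {}
    for term in terms:
        concept = next((CONCEPTS[g] for g in subterm_lattice(term)
                        if g in CONCEPTS), None)
        groups.setdefault(concept, []).append(term)
    return groups

groups = group_terms(["premenopausal ER+ cancer", "breast cancer",
                      "high young adult BMI"])
```

In the real system the dictionary lookup would be replaced by UMLS concept matching, and the narrower/broader relationships would arrange the matched concepts into a hierarchy.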
Accuracy of recall of disclosed medical findings from whole-body magnetic resonance imaging: results from a population-based study Lau K 1 , Hegenscheid K 2 , Schmidt CO 3 1 Institute for Medical Psychology, Greifswald, Germany; 2 Department of Radiology, Greifswald, Germany; 3 Institute for Community Medicine, Greifswald, Germany Background Patient self-reports on past diagnoses are indispensable in clinical practice and research. However, these reports may be biased due to poor recall of previous events and may lead to invalid prevalence estimates, thereby producing misleading results. Quantifying the magnitude of bias regarding reports on past diagnoses requires a precise knowledge of what has been communicated previously. However, many studies such as those based on secondary data have weaknesses in this regard. A precise baseline is provided by the Study of Health in Pomerania (SHIP), a population-based study conducted in North-eastern Germany. The present work aimed to analyze the reliability of self-reports on disclosed incidental findings from a whole-body MRI examination. Methods Data from 2063 individuals aged 30 to 90 years at baseline were analyzed of whom 1193 (57.8 %) underwent whole-body MRI. MRI was performed using a 1.5-Tesla system. Findings of potential clinical relevance were communicated to participants using standardized letters via postal mail. A postal survey was conducted on average 2.47 years after the MRI examination among all SHIP participants which included questions about disclosed medical findings. Self-reported data were compared with existing study documents regarding communicated findings using standard measures of diagnostic accuracy. According to study documents, 344 (28.8 %) MRI participants had received a notification about a medical finding. 
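The comparison of self-reports with study documents rests on standard diagnostic accuracy measures. A minimal sketch, using illustrative counts (not the study's data) and simple Wald-type confidence intervals:

```python
# Sketch: sensitivity/specificity of self-reports against documented
# findings, with Wald 95 % CIs. The counts below are illustrative only.
import math

def sens_spec(tp, fn, tn, fp):
    """Return sensitivity, specificity, and a Wald 95 % CI for each."""
    def prop_ci(k, n):
        p = k / n
        half = 1.96 * math.sqrt(p * (1 - p) / n)
        return p, (p - half, p + half)
    sens, sens_ci = prop_ci(tp, tp + fn)   # recalled among notified
    spec, spec_ci = prop_ci(tn, tn + fp)   # denied among not notified
    return sens, sens_ci, spec, spec_ci

sens, sens_ci, spec, spec_ci = sens_spec(tp=260, fn=84, tn=740, fp=110)
```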
The sensitivity and specificity of participants' self-reports on whether they received an incidental finding were 76 % (95 % confidence interval 71; 81) and 87 % (95 % confidence interval 86; 89), respectively. The proportion of participants who had not undergone MRI examination, but reported having received a notification from an MRI examination, was 10.5 % (n = 180). Subgroup analyses revealed that recall was better in participants with at least 10 years of schooling than in individuals with less than 10 years of schooling (sensitivity 80 vs. 67 % and specificity 89 vs. 80 %), in married participants compared to unmarried participants (sensitivity 78 vs. 70 % and specificity 87 vs. 87 %) and in employed participants compared to unemployed participants (sensitivity 82 vs. 70 % and specificity 88 vs. 87 %). In the present study, sensitivity and specificity of participants' self-reports of disclosed medical findings were prone to a moderate degree of recall bias. Some participants recalled a communication without having participated in the MRI examination, which might be related to the long recall period. Results of subgroup analyses demonstrated that the reliability of recalled events differs across population subgroups, particularly regarding sensitivity. Therefore, participants' sociodemographic characteristics should be considered in research and clinical practice in the evaluation of potential recall bias regarding self-reports of past communicated medical findings. Lautenbacher H 1 , Swoboda W 2 , Hagmann S 2 , Jobst F 3 , Haase U 1 1 Universitätsklinikum Tübingen, Tübingen, Germany; 2 University of applied sciences Neu-Ulm, Neu-Ulm, Germany; 3 University Medicine Ulm, Ulm, Germany Most modern hospitals use hospital information systems (HIS) that are composed of heterogeneous components and characterized by distributed data processing. 
Communication with its components is driven by non-proprietary protocols like ANSI HL7, one of the most important standards used in hospitals. There is almost 30 years of experience with it in the area of medical informatics, and many extensions and enhancements are in use or in progress. On the other hand, any change of interface technology is risky and expensive. Not every further development of a standard undertaken by standardization panels is accepted by software houses and hospitals. So it is important for hospitals to know which extensions are coming up in the future. This work describes a qualitative survey of HIS manufacturers on this matter. The sample includes 25 manufacturers of the most important clinical subsystems in the University Hospital Tübingen. The questionnaire consists of five questions: Do they plan to use HL7 version 3? How long will it be possible to use certain former 2.x versions? Do they have a preferred 2.x version? Do they plan new message types? Do they further support z-segments if hospitals are going to use them? All questions were sent to the manufacturers by email; answers were evaluated by qualitative analysis. Results 25 manufacturers were asked, 24 of them answered and 23 answers could be evaluated. One could not be evaluated because the material provided could not be assigned to the questions. Beforehand, we had reminded more than 50 % by follow-up letters. 15 manufacturers (68 %) have no concrete plans to use HL7 version 3, excluding the usage of CDA. The rest have their software prepared for HL7 version 3 or are planning this, but without a concrete request. 18 manufacturers (78 %) are going to use version 2.x for five years or longer; 15 of them are not planning an expiry date. Most of the producers do not prefer a single 2.x version for extensions, because they are experienced in dealing with various versions. 
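For illustration, HL7 v2.x messages are pipe-delimited segment lists, and the standard reserves segment IDs beginning with ''Z'' for site-specific extensions. A minimal sketch with hypothetical message content:

```python
# Sketch: list segment types of an HL7 v2.x message and flag local
# Z-segments. The message content below is hypothetical.

MESSAGE = "\r".join([
    "MSH|^~\\&|HIS|UKT|LAB|UKT|202301011200||ADT^A01|42|P|2.5",
    "PID|1||12345^^^UKT||Doe^Jane",
    "ZPI|1|site-specific extension data",   # hypothetical Z-segment
])

def segment_ids(message):
    """Segment type is the first pipe-delimited field of each segment."""
    return [seg.split("|", 1)[0] for seg in message.split("\r") if seg]

def z_segments(message):
    """Segment IDs starting with 'Z' are local extensions in HL7 v2."""
    return [sid for sid in segment_ids(message) if sid.startswith("Z")]

zs = z_segments(MESSAGE)
```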
Almost all of the system providers are not considering new message types (20, or 87 %), and 13 manufacturers (57 %) see no, or no insuperable, problems with hospitals using z-segments. In a nutshell, system providers put no stock in HL7 version 3 (except CDA); they will not push newer technologies towards practical implementation. The established HL7 version 2.x communication interfaces are running, and the advantages of new message types are not enough to take the risk of implementation and invest the additional money (subsequent discussions with manufacturers showed that even new developments such as HL7 FHIR are viewed with suspicion). Most of them allow the use of z-segments so that users can implement hospital requirements. Fine? No, because there are a lot of challenges with existing interfaces (e.g. scheduling of entities). The authors anticipate the continued existence and usage of inefficient old structures and increasing complexity of the communication landscape through newer extensions. A service design thinking approach for a stakeholder-centred eHealth Lee E Background eHealth is a healthcare practice supported by electronic processes and communication. Stakeholders in eHealth services play an important role when adopting or integrating new technologies in their work, but the design process of eHealth services is characterised by insufficient stakeholder engagement. Failure to identify stakeholders and their needs in eHealth projects has resulted in dissatisfied customers and in many parts of the projects having to be redone. To improve this situation, a service design thinking approach is proposed. We conducted a case study in Norway from August to November 2014 to evaluate a message exchange module in an Electronic Health Record (EHR) system and to gather ideas for future improvement. 
We considered seven components affecting service experience: (a) service customers, (b) service workers, (c) service setting, (d) service process, (e) service objective, (f) service interaction type, and (g) sub-service provision context. The findings from our study suggest that SJML and SJC offer effective ways of engaging stakeholders, in particular when analysing existing eHealth service processes and facilitating discussions about future eHealth service processes. Conclusion Future research should look at how to engage stakeholders who cannot participate in a workshop setting. Software tools that support visualisation of service processes in a remote environment could be one solution. Observational studies could be used to measure stakeholders' online engagement. Equity impact assessment-Considering social inequalities in the evaluation of interventions to promote physical activity among older adults. An equity-focused review as part of the project EQUAL Lehne G 1 , Brand T 2 , Bolte G 1 1 Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany; 2 Leibniz Institute for Prevention Research and Epidemiology -BIPS GmbH, Bremen, Germany Social inequalities in health are one of the main challenges in public health. Social epidemiological studies have shown that physical activity is less prevalent among socially disadvantaged population groups [1]. Universal prevention strategies not specifically targeting particular social groups of a population may unintentionally miss socially disadvantaged groups and thus widen social inequalities (''intervention-generated inequalities'') [2]. The project EQUAL within the BMBF-funded prevention research network AEQUIPA addresses the question of how social inequalities can be considered in the development, implementation, and evaluation of interventions to promote physical activity among older adults. 
The purpose of this study is to analyze the impact of interventions on social inequalities in physical activity among older adults and to identify appropriate methodological approaches for equity impact assessment using systematic review methods. Methods Nine electronic databases (MEDLINE, PsycINFO, CINAHL, CENTRAL, Physical Education Index, Social Science Citation Index, ASSIA, Sociological Abstracts, IBSS) were searched using keywords related to ''physical activity'', ''interventions'', ''intervention effects'', and ''older adults''. Searches were limited to peer-reviewed journal articles published from July 2005 to July 2015, written in English or German. Studies were included if they reported the effects of interventions on physical activity behavior among adults aged 50 years and over. Interventions targeting particular social groups or persons with certain diseases or risk factors were excluded. A review protocol was drafted adhering to the PRISMA-P 2015 statement and published in Systematic Reviews [3]. The final review will be reported following the equity extension of the PRISMA statement (PRISMA-E 2012). Results Searches identified 7,704 articles. Fifty-nine studies were included, representing a diverse range of study designs, intervention strategies, and outcome measures. In addition to age and gender, the majority of studies collected at least one other social factor (e.g., ethnicity, education, area deprivation), with these most commonly reported in the description of the study population. Around half of the studies considered social factors as covariables in multivariate analyses. Eleven studies had at least an indication of having analyzed potential effects on inequalities. Intervention effects were most often compared by age and gender subgroups. Differences in intervention effects according to marital status, ethnicity, and educational level were considered less frequently. 
The methods used for measuring equity impacts and their quality varied by study design. Several trends for the presence of equity impacts could be derived from the analyses. With regard to gender inequalities, the results of three studies pointed to gender-specific intervention effects. Public health interventions may unintentionally widen social inequalities in physical activity among older adults. The findings of this equity-focused review suggest that many studies have not exploited the potential for assessing equity impacts so far. The systematic application of appropriate methodological approaches and transparent reporting of their results can provide important indications for the design of those interventions that are most likely to be effective across all social groups of older adults and thus do not increase health inequalities. Background During the last 15 years, several statins were made available as generic drugs and therapy became more cost-efficient. The aim was to analyze whether statin prescriptions are associated with social position in a health care system with universal coverage. We used data from the baseline (2000-2003, n = 4814, 49.8 % male, 45-75 years old) and first follow-up examination (2006-2008, n = 4157, 49.4 % male) of the population-based Heinz Nixdorf Recall Study, Germany. We only included participants with an indication for statins (n = 2305) according to NCEP ATPIII (participants with coronary heart disease (CHD) or CHD risk equivalents, ten-year risk for CHD or cardiovascular disease according to Framingham risk scores, LDL level, number of risk factors). In the data from the first follow-up, we categorized statin prescriptions as generic or brand name. We used education as an indicator of social position (ISCED-97) and set up the three categories low, medium and high education. We applied log regression models to estimate prevalence ratios (PR) with 95 % confidence intervals (95 % CI). 
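For a single exposure contrast, a crude (unadjusted) prevalence ratio with a Wald-type 95 % CI on the log scale can be computed as below. This is only a sketch: the study's estimates come from adjusted regression models, and the counts here are illustrative.

```python
# Sketch: crude prevalence ratio (PR) with Wald 95 % CI on the log scale.
# cases1/n1: exposed group; cases0/n0: reference group (illustrative counts).
import math

def prevalence_ratio(cases1, n1, cases0, n0):
    pr = (cases1 / n1) / (cases0 / n0)
    # Standard error of log(PR) for two independent binomial proportions
    se_log = math.sqrt(1 / cases1 - 1 / n1 + 1 / cases0 - 1 / n0)
    lo = math.exp(math.log(pr) - 1.96 * se_log)
    hi = math.exp(math.log(pr) + 1.96 * se_log)
    return pr, (lo, hi)

pr, ci = prevalence_ratio(cases1=20, n1=100, cases0=10, n0=100)
```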
According to a directed acyclic graph, we adjusted for age and stratified by sex. During the baseline examination, 290 out of 1423 men with an indication for statins took a statin (20.4 %). For women we found a prevalence of 21.2 %. Among men, the adjusted PR for receiving a statin was 0.70 (95 % CI 0.43-1.15) for men with low education compared to men with high education. For women we found an adjusted PR of 1.03 (95 % CI 0.66-1.61) for participants with low education. Among participants receiving a statin, the adjusted PRs for men receiving a generic drug were 1.38 (95 % CI 1.17-1.63) and 1.25 (95 % CI 1.13-1.39) for low and medium education, respectively, compared to participants with high education. For women the adjusted PRs were 1.05 (95 % CI 0.89-1.23) and 1.09 (95 % CI 0.94-1.26) for low and medium education, respectively. Conclusion Men with low education have a lower probability of being prescribed a statin compared to men with high education and are more likely to receive a generic drug. Among women, education level was barely associated with the prescription of statins. This workshop is organized by the GI/GMDS technical committee ''Informatics and Life Science'' (ILW). The GI (Gesellschaft für Informatik) is the German Informatics Society. The Technical Committee Informatics and Life Science comprises the German informatics communities for bioinformatics, medical informatics and environmental informatics. Its aim is to support interdisciplinary research in these highly complex informatics application fields. This particular workshop is aimed at bringing together experts from the life science disciplines with data integration experts. One common problem in all life science disciplines is data integration. In the age of ''Big Data'' the number, size and complexity of related scientific data sources grow continuously. 
New tools and techniques have emerged, but subtle domain-specific integration challenges are hard to address with general-purpose integration recipes. Workable solutions typically require in-depth domain knowledge paired with state-of-the-art data integration technology. The workshop is intended to address and discuss issues such as, but not limited to:

Big Data integration challenges in Life Sciences
Domain-specific data integration problems and challenges
Geospatial data integration in environmental informatics
Reference ontologies and standards for Life Sciences
Lessons from data integration projects in Life Sciences
Biomedical data integration, including data integration for medical research
Data integration in healthcare information systems
Interdisciplinary approaches to data integration
Data Spaces for Life Sciences
Continuous data integration approaches in Life Science
Novel tools for data integration
Progress in the past 10 years

We have planned two workshop sessions. The first session will comprise technical data integration issues and reusable data integration approaches. This session will be opened with a keynote by Professor Ulf Leser from the Humboldt University in Berlin. The second session is aimed at the presentation of specific data integration challenges and solutions from the life sciences. This session will be opened with a keynote by Ulrich Sax from the University of Göttingen. Additional presentations will be given for peer-reviewed and accepted papers. Long versions of accepted submissions will be published in a special issue of the international journal ''it-information technology''. Award Students have the chance to apply for a 1000 € best paper award (GI FB ILW Förderpreis). Applicants should have completed their Master's or PhD thesis in the area of data integration in Life Sciences within the past 18 months. The submitted paper should contain a summary of the respective research results. 
The successful candidate will present his/her research results at the workshop. The workshop will increase the interdisciplinary exchange between data integration experts and Life Science domain experts. It will be the starting point for an ongoing discussion about the reusability of generic data integration techniques for specific data integration challenges in the life sciences. The workshop will also encourage participants from different communities to start collaborative interdisciplinary research projects. Disease risk prediction is highly important for early intervention and treatment, and the identification of predictive risk factors is key to accurate prediction. In addition to the original independent features in a dataset, some interaction features, such as comorbidities and combination therapies, may have a non-additive influence on the disease outcome and can also be used in risk prediction to improve prediction performance. However, it is usually difficult to identify possible interacting risk factors manually because of the combinatorial explosion of features. In this paper, we propose an automatic approach to identify predictive risk factors with interactions using frequent item set mining and feature selection methods. The proposed approach was applied in a real-world case study of predicting ischemic stroke and thromboembolism (TE) for atrial fibrillation (AF) patients on the Chinese atrial fibrillation registry dataset, and the results show that our approach can not only improve prediction performance, but also identify comorbidities and combination therapies that potentially influence TE occurrence in AF patients. A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto Li S 1 , Kuo MH 2 , Ryan D 1 1 Regional Geriatric Program of Toronto, Toronto, Canada; 2 The University Of Victoria, Victoria, Canada A web-based system can provide access to real-time data and information. 
Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere. In 2015, the British Heart Foundation commissioned a feasibility study into the establishment of a national Public Access Defibrillator (PAD) database for the UK and how it could improve out-of-hospital cardiac arrest (OHCA) survival [4]. As part of this feasibility study, the views of a range of stakeholders were sought, including those of community first responders (CFRs). CFRs are volunteers from local communities who respond to emergency calls received by the ambulance service and provide care until an ambulance arrives. There were over 12,000 CFRs operating across the UK in 2014 [5]. Most of the PADs are static AEDs made available in public places. The alternative is the use of mobile defibrillators that are carried to the scene of an emergency by CFRs [6]. Methods A survey was designed with 12 questions. It covered the areas of CFR demographics, experience with PADs, and their views on a national PAD database and on the technologies and apps available for locating defibrillators and for alerting lay responders to an emergency. The survey questionnaire was made available via an online survey tool (SurveyMonkey). 
Between November and December 2015, links to the survey were sent via email to CFRs via the National Ambulance Services Responder Managers Forum [7]. Results 760 responses were received (6.3 % response rate), with 12 out of 14 ambulance service regions represented. The experience of CFRs ranges from none (just finished training) to 16 years, with the largest proportion having under 2 years' experience (47.28 %, 322 of 681). Awareness of apps and their use was variable, with reasonable knowledge of GoodSAM, AED Locator and the South Central Ambulance Service app, but 75 % did not know whether apps are routinely used in their area. 35.47 % of respondents (255 of 719) felt a national database of AED locations would have a significant impact on awareness and use of defibrillators, with a further 42.84 % (308 of 719) thinking it would have some impact. Additionally, the use of apps linked to a national database was supported by over 85 % of respondents. In the survey, CFRs expressed generally positive opinions about a national PAD database and linked apps. The need for it arises naturally from having defibrillators in the community, and it is an additional tool to help save lives. Issues that need to be considered, however, include information accuracy and maintenance, and the need for training combined with first aid and CPR. CFRs identified the main barriers to defibrillator use as (1) not knowing how to use one, (2) not knowing where to find one and (3) fear of injuring the victim. Lack of awareness amongst the public about the availability and use of defibrillators was highlighted as a major challenge, and the absolute need for a high-profile awareness-raising campaign was flagged. 
Re-organising care of elderly, multi-morbid COPD and heart failure patients with low digital literacy-a 4-year Swedish telehealth intervention study Background Traditionally, when a patient is treated for a severe exacerbation of chronic obstructive pulmonary disease (COPD) or heart failure (HF) at a Swedish hospital, the patient is then discharged to primary care. It is mostly up to the patient to contact the primary care centre for the necessary follow-up visits. Because of this lack of follow-up, the risk of hospital readmission due to a new exacerbation is high. In order to break the vicious cycle of exacerbation, hospital admission, re-exacerbation and re-hospitalization, we therefore re-organised the care by letting a specialised home care (SHC) clinic, supported by a telehealth system, be responsible for the care of these patients instead of discharging them to primary care. The study will evaluate whether the re-organisation and the system can contribute to earlier detection of patients' deterioration and prevention of acute hospital admissions, and will assess costs and differences between COPD and HF patients. A 4-year (2013-2017) non-randomized single-centre telehealth intervention clinical study, compared with expected outcomes, was set up. Elderly, multi-morbid patients with COPD or HF exacerbations are admitted to specialised home care and tele-monitored by a multi-professional team at the SHC clinic. Patients included in the study have had at least two hospital admissions during the year preceding inclusion and a planned study participation of twelve months. The patients report daily to the clinic on performed physiological measurements (e.g. saturation, weight, blood pressure), symptom assessments (breathlessness, cough, phlegm) and intake of p.r.n. (''as needed'') medication. The care providers use a web-based system, the Health Diary, to view patients' reports for the detection of early signs of deterioration. 
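Such report review can be supported by simple automatic checks. A minimal sketch of threshold- and missed-report-based flagging, where the field names and threshold values are illustrative assumptions rather than the study's actual configuration:

```python
# Sketch: per-patient checks on a daily telehealth report.
# THRESHOLDS maps a measurement to an assumed (low, high) acceptable range.

THRESHOLDS = {"saturation": (90, 100), "weight_kg": (60, 80)}

def check_report(report, thresholds=THRESHOLDS):
    """Return a list of alarm strings; a missing report is itself an alarm."""
    if report is None:
        return ["missed daily report"]
    alarms = []
    for field, (low, high) in thresholds.items():
        value = report.get(field)
        if value is not None and not (low <= value <= high):
            alarms.append(f"{field} out of range: {value}")
    return alarms

alarms = check_report({"saturation": 87, "weight_kg": 72})
```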
The system generates alarms if the reported measurements or symptoms are below/above individually set thresholds, and also if a patient misses the daily report. To accommodate the patients' low digital literacy, we developed the telehealth system based on digital pen technology. This way the patients do not have to learn how to use a tablet, smartphone or computer; they only have to be able to write with the digital ink pen. Earlier studies have shown digital pen technology to be suitable for elderly, multi-morbid patients [1]. Exacerbations are identified using information provided through the Health Diary system and the EHR. Consumed health care is assessed as the number of patient-care provider contacts (home visits or telephone consultations by a nurse or physician). Presently (February 2016), 72 patients with advanced disease are enrolled (27 COPD and 45 HF patients), of whom 22 patients (9 COPD and 13 HF patients) have completed the 1-year study period or have died during the study period (8 COPD and 11 HF patients). Exacerbations were 2.4 and 0.8, and patient contacts were 85 and 53, per COPD and HF patient, respectively. While HF patients were significantly older than COPD patients, the two groups demonstrated no difference regarding gender distribution. Conclusion COPD patients exhibit exacerbations more frequently and demand much more home health care than patients with HF do. It seems that this difference in health care consumption is mainly due to disease characteristics. On the possibilities to reduce automatic alerts during electronic prescription Lindemann WB 1 1 Cabinet de Médecine Générale, Blaesheim, France One task of electronic health records (EHR) is decision support during electronic prescription by generating automatic alerts triggered by preexisting diseases, allergies or drug-drug interactions. Those alerts are generally too abundant and too unspecific to be useful [1-3]. 
It has been evaluated in a university hospital setting [2, 3] which information already present in the EHR could improve their quality. I have repeated this study in family medicine. For 29 randomly selected patients who chose me as their family doctor, I evaluated how many automatic alerts for their long-term medication would be avoidable if information usually present in the EHR had been considered. I had seen all patients at least 5 times over at least 3 months (the time necessary to complete the EHR in structured form). The statistical evaluation was done with Excel and SPSS 23. The 29 patients (13 women) were aged 17-88, average 64.5. Older patients received more medications and had more structured preexisting conditions (both p < 0.05). 181 prescribed medications generated 661 alerts of different levels. Of those 661 alerts, 80 were doubles and 194 ''insufficient data for checking the prescription''. 60 alerts misinterpreted a diagnosis, for example identifying ''hemochromatosis'' with ''liver cirrhosis/insufficiency''. At least 15 were avoidable by considering the age (e.g. statins and triptans display ''contraindicated for fertile women''). 26 did not consider a protective comedication (e.g. against peptic ulcer or epilepsy) and 16 did not consider the chronology (e.g. for gastrointestinal side effects/absorption interactions). Considering blood pressure values would have avoided 23 alerts, considering the dose a further 19 (e.g. Aspirin in ''cardiac'' dose); some were grotesquely wrong (e.g. an alert for associating central nervous system depressors with asthma or nitrate sprays containing traces of alcohol). Considering renal clearance, HbA1c and TSH values would have avoided 88, 9 and 2 alerts, respectively. 3 did not consider local/topical application. 107 alerts were manifestly needless, but I did not understand the triggering cause, e.g. 
an ACE inhibitor displayed ''contraindicated in arterial hypertension'', or the first prescription of insulin for a drug-naïve type-I diabetic displayed ''contraindication because of association with pioglitazone''. I evaluated 147 alerts as unavoidable (21.6 %) (e.g. ''special monitoring in elderly''), but in no case did I change the medication. I found no similar study in general medicine [4]; Seidling (2014) and Czock (2015) seem to be the only such studies in a hospital setting. My EHR uses the common database ''VIDAL'', comparable to the ''Rote Liste''. Since 2012, the 11th convention has incited French family doctors to use an EHR like the one I evaluated, which generates automatic alerts during prescription. The alert quality is not only bad but ridiculous. The main limitation of my study is that I am not a trained pharmacologist. But the evaluation of a medicament's benefit-risk profile is an essential part of general practice. I formalized and applied those criteria we use in practice and whose safety we check during the follow-up of our patients. A pharmacologist should find more possibilities to filter unnecessary alerts. By the congress, roughly 100 long-term and a sufficient number of intercurrent medications will have been evaluated. Background This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a software requirements list used to develop software systems for patient registries. 
Identifying depression among people with diabetes by health insurance data and the CES-D and PHQ-9 depression scales. The aim of this study was to assess potential differences in identifying depression among diabetes patients when using the Centre for Epidemiological Studies Depression (CES-D) scale, the Patient Health Questionnaire-9 (PHQ-9) or diagnoses in health insurance data. The study is based on a random sample (N = 1860) of all insured persons of a statutory health insurance fund (pronova BKK, covering 673,366 persons in Germany) with a documented diabetes diagnosis. A survey was performed, and survey data were linked on an individual level with health insurance data. Three methods were available to identify depression in the sample: the CES-D and the PHQ-9 from the baseline survey, and health insurance data. The PHQ-9 focuses on signs and symptoms of depression based on DSM-IV. The CES-D assesses the occurrence of depressive symptoms over the past seven days. Health insurance data on health care utilisation were available for the period covering 12 months before and after the baseline survey. Patients were divided into seven groups based on the results of the screening instruments and the health insurance data (e.g. group 1: identified by all methods; group 7: identified by health insurance data only; in the following numbered (1) to (7)). Groups were compared by a Kruskal-Wallis test and a chi-squared test for differences in demographic and clinical measures. In our analysis sample, 546 of 1580 participants were identified as depressive by at least one method: (1) 16.8 % of all patients with depression met the criteria for all three methods. (2) 11.3 % met the criteria for both screening tools but not for health insurance data. (3) 5.7 % met the criteria for PHQ-9 and health insurance data and (4) 8.1 % for CES-D and health insurance data. 
(5) 11.0 % met only the criteria for CES-D, (6) 8.3 % only the criteria for PHQ-9 and (7) 38.6 % had a diagnosis in health insurance data only. The prevalence of depression ranged from 14.6 % for the PHQ-9 and 16.3 % for the CES-D up to 24.0 % in health insurance data. The groups differed in terms of age, comorbidities and health-related quality of life, but not in terms of sex. Patients who met the criteria for all three instruments (group 1) had a significantly lower quality of life on the mental scale. Patients in groups involving health insurance data (groups 1, 3, 4 and 7) had a higher mean number of comorbidities than groups without a diagnosis in health insurance data. Most depressive patients were identified by at least two methods. Depending on the method, the number and the characteristics of the patients identified differed. These results are important for deciding which method to use when identifying depression in study samples. How to analyze right-censored change from baseline data? A simulation study for the analysis of a randomized controlled cochlear implant trial Liu X 1 In many clinical trials, baseline and single or multiple follow-up measurements are obtained, and the change from baseline is of main interest. For some clinical examinations (e.g. a hearing test with exposure to increasing sound volume), the upper limit to which they can be quantified is restricted by technical limitations, leading to right-censored measurements. Under the assumption that only subjects with uncensored baseline measurements are recruited to the trial, the problem of an upper limit of quantification is accompanied by right-censored change from baseline data in the primary analysis. In order to handle right-censored measurements, many studies simply replace these values by a single value, e.g. the upper limit of quantification or the upper limit of quantification + x, and apply a conventional analysis for non-censored data after this step. 
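The simple substitution step just described can be sketched as follows; this is an illustrative sketch with synthetic data, where the upper limit of quantification (ulq) and the distributions are assumptions, not values from the trial:

```python
import numpy as np

rng = np.random.default_rng(42)
ulq = 80.0  # assumed upper limit of quantification (illustrative)

# Baseline is uncensored by design; follow-up may exceed the ULQ.
baseline = rng.normal(60.0, 8.0, size=100)
follow_up_true = baseline + rng.normal(5.0, 10.0, size=100)

# Right-censoring: values above the ULQ cannot be quantified.
censored = follow_up_true > ulq
follow_up_obs = np.where(censored, ulq, follow_up_true)

# Simple substitution: keep the ULQ itself for censored values
# (a variant substitutes ULQ + x), then compute change from baseline
# and analyse it with conventional methods for non-censored data.
change = follow_up_obs - baseline
print(f"{int(censored.sum())} of 100 follow-up values censored; "
      f"mean change from baseline = {change.mean():.2f}")
```

Replacing censored values by a single constant distorts both the mean and the variance of the change scores, which is what motivates comparing substitution against methods that model the censoring explicitly.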
As an alternative to simple substitution, there are various statistical methods for the analysis of right-censored data, although many of them were originally designed for non-negative survival data. We conducted a simulation study to investigate the performance of various approaches to analyzing right-censored (positive and negative) change from baseline data. In this study, we focused on a simple situation with two comparators, non-censored baseline measurements and a single primary follow-up measurement. The sample size, the proportion of censoring and the correlation between baseline and follow-up were varied in the simulation. Appropriate methods, including simple substitution as well as parametric and non-parametric analyses of censored data, are currently being applied to the simulated datasets. To demonstrate and compare the performance of the various approaches, estimates of the treatment effect, the type I error rate and the power for the different scenarios and approaches will be presented. As an example, we illustrate the evaluated approaches using the measurements of a randomized controlled trial in patients with cochlear implantation. Smoking cessation, relapse, and breastfeeding duration: results of the Ulm SPATZ Health Study Logan C 1 , Rothenbacher D 1 , Genuneit J 1 1 Institut für Epidemiologie und Medizinische Biometrie, Universität Ulm, Ulm, Germany Maternal smoking during pregnancy and the post-partum period has long-term effects on child health. Though most women quit or reduce smoking while pregnant, the majority resume within 6 months of delivery. A number of studies have reported associations between smoking status and breastfeeding duration, leading some researchers to suggest further emphasis on breastfeeding education as a means to promote long-term smoking cessation. 
Conversely, results from other studies have demonstrated that interventions focusing on smoking cessation may result in longer breastfeeding duration, suggesting a more complicated interrelation than simple cause and effect. To contribute to current knowledge on this relationship, we analyzed the association between smoking status and predominant and overall breastfeeding duration within a population of smoking mothers, with a high proportion intending to breastfeed, recruited in a birth cohort study. The Ulm SPATZ Health Study cohort consists of 1,006 newborns and their mothers recruited from the general population shortly after delivery in the University Medical Center Ulm, Southern Germany, from April 2012 until May 2013. Demographic and birth data including maternal age, education, parity, nationality, pre-pregnancy body mass index, delivery mode, gestational age, and history and frequency of smoking and alcohol consumption before and during pregnancy were collected via questionnaire or electronic hospital records at recruitment. Current smoking and breastfeeding status and frequency were assessed via questionnaire at 6 weeks, 6 months, 1 year, and 2 years post-delivery. Smoking status was assessed categorically by time of first reported post-delivery smoking relapse as follows: <6 weeks, 6 weeks to 6 months, 6 months to 2 years, or long-term abstainers (no relapse reported). Cox proportional hazards models and Kaplan-Meier plots were used to estimate crude hazard ratios and log-rank p-values for the effect of smoking status on non-initiation and time to cessation of time-specific and continuous predominant and total breastfeeding. Results A total of 173 mothers of singleton term newborns who reported smoking within the year before pregnancy were included in this analysis. Of these, 34 % abstained from smoking up to 2 years post-delivery, while 32, 19, and 15 % resumed smoking by 6 weeks, 6 months, and 2 years, respectively. 
Smoking by 6 weeks was associated with maternal smoking during pregnancy, age >30 years, multiparity, and paternal smoking history. Significant divergence in both predominant (log-rank p value = 0.01) and total breastfeeding (p value = 0.001) patterns was observed between maternal smoking categories. Divergence in predominant breastfeeding was driven primarily by significantly higher rates of non-initiation [crude hazard ratio (HR) (95 % confidence interval (CI)): 3.01 (1.27; 7.12)] among <6 week smokers compared to long-term abstainers. Significantly higher rates of total breastfeeding cessation were mostly observed during or immediately following the period of smoking resumption, beginning at 3 months [3.11 (1.54; 6.28)]. Our results further support the association between timing of smoking relapse and total breastfeeding duration. Furthermore, we observed a similar association between smoking status and predominant breastfeeding duration. The effect of gestational weight gain on cord blood adiponectin and leptin: results of the Ulm Birth Cohort Studies Mounting evidence suggests that intrauterine growth restriction may play a role in the origin of a number of chronic diseases including obesity, cardiovascular disease, and type-2 diabetes. Fetal growth and development may be affected by nutritional availability during critical periods of gestation, when fetal programming most likely occurs. Previous studies have observed correlations between pre- and post-natal growth and cord blood levels of adiponectin and leptin. Adipokines, involved in signaling pathways associated with growth, metabolism, and the immune system, may thus provide a possible link between fetal environment, growth, and disease. However, few studies have investigated how these associations may be modified throughout pregnancy. We aimed to investigate the association between gestational weight gain (GWG), a proxy for fetal nutritional availability, throughout pregnancy and cord blood levels of 
adiponectin and leptin within the context of two similarly recruited population-based birth cohorts. The Ulm Birth Cohort Study (UBCS) and the Ulm SPATZ Health Study, two methodologically similar birth cohort studies, each consist of approximately 1,000 newborns and their mothers recruited from the general population shortly after delivery in the University Medical Centre Ulm, Southern Germany, from 11/2000-11/2001 and 04/2012-05/2013, respectively. Inflammatory biomarkers were measured by identical commercially available ELISAs at both time periods (R&D Systems). Trimester-specific and total GWG were assessed using z-scores based on the underlying cohort-specific population mean and adjusted for body mass index. Following log-transformation of the adiponectin and leptin concentrations, linear regression was used to estimate geometric mean ratios for the effect of a one-unit increase in trimester-specific and total GWG z-scores on these concentrations, as well as for the difference between low, normal, and excessive Institute of Medicine (IOM) recommended GWG categories, adjusting for several confounders. After exclusion of twins and pre-term births, complete GWG and biomarker data were available for 589 UBCS and 428 SPATZ mother-newborn pairs. Compared to the UBCS, mothers of the Ulm SPATZ Health Study were more highly educated, less likely to have recently smoked, but more likely to have given birth by elective cesarean delivery. Following adjustment for gender, gestational age, body mass index, parity, duration of labor, and maternal smoking, we observed significant positive associations between GWG and cord blood leptin throughout pregnancy in UBCS [geometric mean ratio (95 % confidence interval)] Modifiable lifestyle risk factors offer great potential to reduce the risk of cancer. Over the past years, the focus has changed from investigating the role of specific lifestyle factors, such as smoking and diet, towards considering lifestyle in its entirety. 
Recent studies suggest an association between adherence to recommendations on healthy lifestyle and mortality. Based on the cancer prevention recommendations of the World Cancer Research Fund (WCRF) and the American Institute for Cancer Research (AICR), we built a lifestyle score to investigate its association with all-cause, total cancer, specific cancer type, and CVD mortality. We performed Cox regressions and estimated rate advancement periods (RAP) as well as population attributable fractions (PAF) for all-cause and total cancer mortality, respectively. For this, we used Swiss census- and death registry-linked survey data allowing a mortality follow-up of up to 32 years (16,722 participants). Information on lifestyle score components and confounders was collected at baseline. Over a mean follow-up of 21.7 years, 3730 deaths were observed (1132 cancer and 1271 CVD deaths). Comparing the best with the poorest category of the lifestyle score showed a decreased risk for all-cause mortality (HR 0.82 China has greatly reduced the malaria burden over the last century and now aims at malaria elimination by 2020. In 2012, the country initiated the 1-3-7 malaria surveillance and response strategy. The 1-3-7 strategy refers to case reporting within one day, case investigation within three days, and focus investigation within seven days. A malaria focus is defined as a circumscribed locality situated in a currently or formerly malaria-endemic area, containing the continuous or intermittent epidemiological factors necessary for malaria transmission. The goal of focus investigation is to interrupt further malaria transmission in identified foci. 
Public health actions conducted during the focus investigation include providing residents in the foci with health education and screening for malaria symptoms (i.e., demographic reactive case detection), vector control actions (i.e., indoor residual spraying), and active examination of potential malaria cases by RDTs or PCR (i.e., geographic reactive case detection). Focus investigations in China face challenges, as public health actions were completed in less than 50 % of foci in the first year after the initiation of the program. Therefore, the aim of this study is to evaluate challenges and lessons learned from the malaria focus investigation in China. This qualitative study was conducted in two provinces of China, Gansu Province (northwestern China) and Jiangsu Province (southeastern China). Key informant interviews (n = 6) and in-depth interviews (n = 36) on implementation aspects of the overall 1-3-7 malaria surveillance strategy, and specifically on operational aspects of focus investigation, were conducted. The respondents included malaria experts, health staff, laboratory practitioners and village doctors at the provincial, city, county, township and village administrative levels. Themes related to challenges and lessons learned regarding the implementation of the focus investigation were identified. Major challenges were related to the difficulty of evaluating local transmission risk, to the absence of or difficulties in following guidelines, to difficulties in respecting the 7-day timeline, and to the acceptance of public health actions (e.g., indoor residual spraying or reactive case detection) by individuals or communities. 
Important lessons learned were the importance of communication in improving the logistics of the surveillance activities, quality control, ethical aspects, and the importance of active screening of migrant workers and their peers upon return to China in counties or villages where imported malaria cases occurred frequently. Surveillance and response play an essential role in malaria elimination programmes. The Chinese 1-3-7 malaria surveillance strategy has already proven to be successful but still needs improvement, especially in focus investigation. Developing a standard operational procedure for public health actions in focus investigation, strengthening intersectoral collaboration in active screening of potential imported malaria cases, and giving due consideration to ethical aspects of the actions are important for improving the implementation of focus investigation. As the 1-3-7 malaria surveillance and response strategy may also be considered as a model for other countries, the identified challenges and lessons learned should be considered in the adoption and adaptation of the program. We conducted a systematic review of published studies of recent time trends of hip fracture incidence in European countries. For each of the 14 countries with time series data available, we extracted sex- and age-specific incidence rates over time. Using pooled data, we modelled incidence rates using linear mixed-effects models, including the fixed effect of calendar year, as a quadratic function, and age, through fractional polynomials, as follows (the effects of age and calendar year were modelled after z-score transformations: age* = (age - mean(age))/SD(age); year* = (year - mean(year))/SD(year)): Women: log(rate) ~ b0 + b1(year*) + b2(year*)^2 + b3(age*/100)^3 + b4(age*/100)^3 log(age*/100) + e_women; Men: log(rate) ~ b0 + b1(year*) + b2(year*)^2 + b3(age*/100)^-2 + b4(age*/100)^-2 log(age*/100) + e_men. Random effects were tested to quantify country-level variability in background rates, timing of trend reversal and tempo of reversal. For each country, we calculated the predicted values for the calendar year of trend reversal (h) and the maximum incidence rate (k) for the mean age in the sample as the coordinates of the vertex. Gaussian mixture models were then applied to the random coefficients to identify clusters of countries defined by common behavioural features. Our model including a quadratic function of time with random effects for intercept (background rates) and calendar year slope (timing of trend reversal) showed good fit to the observed data. A term for tempo of reversal did not improve fit. In the pooled analysis, the predicted trend reversal was on average in 1999 in women (peak incidence rate about 600 per 100,000) and in 2000 in men (about 300 per 100,000). After extracting country-level random coefficients and applying mixture models, appropriate cluster solutions had three country groups for women and two for men. In most of Scandinavia, observed data were compatible with higher background rates but earlier trend reversal in both women and men. Later trend reversals but lower peak incidence rates were found in Southern Europe and in most Central European countries in both sexes, whereas an additional cluster of countries with intermediate rates and later trend reversals was identified only in women. Hip fracture incidence trends can be described within the same overall reversal pattern, suggesting that there is one rather than many epidemics. Country-level variability validly reflected previous knowledge on the geographical patterning of hip fracture burden. Decision support systems that generate alerts for drug-drug interactions have been shown to be a valid strategy to reduce medical error; even so, the use of these systems has not been as widespread as expected, probably due to the lack of a suitable design. 
This study compared two interfaces, one of them developed with participatory design techniques based on a user-centred design process. This work showed that the use of these techniques improves satisfaction, effectiveness and efficiency in an alert system for drug-drug interactions, which was evident in specific measures such as fewer errors in completing the specified task, shorter task time, optimized workload and higher overall user satisfaction with the system. We aimed to identify factors associated with lung function in 15-year-olds and to investigate the relative importance of early life events and current environmental and lifestyle factors. Methods Data from 1326 participants from the German GINIplus and LISAplus birth cohorts with valid spirometric indices, including forced expiratory volume in 1 s, forced vital capacity and expiratory flow rates, were analysed. Best subset selection was performed for linear regression models to investigate associations of early life factors (e.g. birthweight, breastfeeding, parental atopy and education, peak weight velocity, lower respiratory tract infections within the first 3 years of life), current lifestyle and environmental factors (e.g. active smoking, air pollution, serum vitamin D concentration, body mass index (BMI)), and allergic diseases (e.g. asthma, rhinitis) with spirometric lung function. Replication analyses (1000 replications) were performed in randomly selected subpopulations (N = 884) to assess which factors remained in at least 80 % of the best-fit models for at least one lung function parameter. The following factors were associated with spirometric lung function parameters indicative of lung volume, airflow limitation or airway function in the total population and remained in >80 % of the replication models: peak weight velocity and early lung infections (early life factors), and indoor passive smoke exposure, serum vitamin D concentration and BMI at 15 years (current environmental and lifestyle factors). 
Sex, height, and asthma status were also consistently relevant. In addition to the well-known measures included in prediction equations (sex, height), current asthma and BMI, as well as weight gain and pulmonary infections during infancy, were identified as prevalent factors associated with lung function in adolescents. Although our findings require replication in independent studies, they nevertheless highlight factors that should be considered in studies on lung health among adolescents. Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is maintaining and enhancing software after the funded project has ended. Even though many tools are made open source, only a few projects manage to attract a user base large enough to ensure sustainability. Reasons for this include the complex installation and configuration of biomedical software as well as an ambiguous terminology for the features provided, all of which make the evaluation of software laborious. Docker is an operating-system-level virtualization technology based on Linux containers that eases the deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support utilization and dissemination. Clinical registries are a powerful method to observe clinical practice and the natural history of disease. In contrast to clinical trials, where guidelines and standardized methods exist and are mandatory, only a few initiatives have published methodological guidelines for clinical registries. The objective of this paper was to review these guidelines and systematically assess their completeness, usability and feasibility by means of a SWOT analysis. The results show that each guideline has its own strengths and weaknesses. 
While one supports the systematic planning process, another discusses clinical registries in great detail. However, feasibility was mostly limited, and the special requirements of clinical registries, namely their flexible, expandable and adaptable technological structure, were not addressed consistently. Insights into the association between obesity and the risk of respiratory tract infections in the context of a population-based study in South Baden Respiratory tract infections (RTI) constitute a major morbidity factor in our society. The RTI burden is affected by inter-individual differences in pathogen exposure, but host factors also contribute to susceptibility towards airway infections. Here we present data from infection diaries collected in the context of the airway infection susceptibility (AWIS) study. Participants (n = 1630) aged 18 to 70 years from a large population-based cross-sectional survey on RTI were invited to participate in a follow-up study with the aim of assessing airway infections over three subsequent winter/spring seasons and correlating the findings with lifestyle factors, environmental factors and co-morbidities determined at baseline. A diary score and respective upper and lower RTI sub-scores were calculated by summing up the individual RTI reported in the timeframe under investigation, namely between 2012 and 2015. Stata (version 13, StataCorp, USA) was used to estimate unadjusted and adjusted correlations and odds ratios for the association of body mass index (BMI) and other indicators of obesity with the subsequent burden of RTI. BMI correlated significantly with the overall diary score as well as with the upper and lower RTI sub-scores. Obesity (BMI ≥ 30 kg/m²) conferred a 2.1-fold increased risk (95 % CI 1.4-3.2) of belonging to the group reporting the highest 10 % of RTI in the indicated study period. Adjustment for gender, age, education, contact with children and smoking only marginally changed the estimates. 
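As an aside on the type of estimate reported here: an unadjusted odds ratio and a Woolf-type 95 % confidence interval can be computed directly from a 2 × 2 table, as in the following minimal sketch (the counts are synthetic and purely illustrative, not the study's data):

```python
import math

# Hypothetical 2x2 table: rows obese / non-obese,
# columns top-10 % RTI burden yes / no (synthetic counts).
a, b = 30, 70   # obese: high burden / not
c, d = 15, 85   # non-obese: high burden / not

odds_ratio = (a * d) / (b * c)

# Woolf (log) method for an approximate 95 % confidence interval.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95 % CI {ci_low:.2f}-{ci_high:.2f})")
```

The adjusted estimates in the abstract come instead from a regression model that conditions on gender, age, education, contact with children and smoking simultaneously.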
In contrast, the correlation between RTI and obesity seems to be confounded by the association of co-morbidities with both variables. Differences in the RTI-obesity correlation could be observed across various subgroups, without reaching significance. The association between BMI and diary score appears to be linear above a BMI of 18.5 kg/m². Obesity appears to be a risk factor for frequent and severe RTI in the German population. Further analyses including data on serum vitamin D levels, genetic data, as well as epidemiological data on weight change over time will help to gain mechanistic insight into the observed relationship. Online medical device use prediction: assessment of accuracy Siemens Healthcare GmbH, Erlangen, Germany Background Providing medical practitioners and other healthcare professionals with adequate access to well-filled electronic patient records and health-related information contributes to an improvement of the treatment process and, ultimately, of the quality of patient care. This can be achieved by large, nationwide EHR systems. Such an EHR, however, is expected to contain many more documents and health-related artefacts than conventional EMRs (electronic medical records). This raises concerns about information overload among physicians and health professionals. Objectives This paper aims to introduce a new concept for the automated aggregation of a fully structured patient summary document based on information extracted from documents published in large-scale EHRs. Methods In the first step, semi-structured group interviews with experts and customers were conducted. Afterwards, a qualitative literature analysis was performed. As a last step, technical and medical standards in the field of interoperability were screened. Results The result is an architectural approach that integrates into the workflows and processes of large eHealth projects, as shown by the example of ELGA. 
The integration is achieved by introducing two new services, the Content Indexer and the Content Index. By offering standardized IHE XDS- and XDR-based interfaces, it is ensured that integration into standards-based systems is possible. Capturing data during the registration process allows information to be extracted without executing automated transactions that do not involve human actors (which are always regarded with suspicion and are hard to explain to the patient). Mathematical models in medical and health research Mansmann U 1 , Löffler M 2 1 Universität München, München, Germany; 2 Universität Leipzig, Med. Fakultät, Leipzig, Germany GMDS Working group ''Mathematical models in medicine and biology'' Mechanistic models enable researchers to integrate and analyze data from a wide range of different sources, for example molecular profiles, clinical trials, medical records and insurance claims, as well as public health data. Different modelling paradigms are in use, such as differential equations, agent-based models, Petri nets and spatial models of tissue organization. Models addressing relevant clinical questions typically aim at representing our current knowledge of the most important underlying mechanisms of the disease and therapy processes considered. Model construction closely interacts with statistics and (often high-dimensional) data analysis in order to estimate model parameters in an efficient and unbiased way, or in order to scale data to the model system. However, heterogeneity of data sets as well as over-fitting and validation are major challenges of ongoing modelling efforts. In systems medicine, models aim at representing an algorithmic image of complex biological processes, their influences, and outcomes. They model treatment response based on a complex interaction of pharmacokinetic and pharmacodynamic processes in vivo, helping to answer questions like: Which treatment schedule will help the patient to respond? 
What is the maximum dose of a drug that will limit side effects to an acceptable level? How can therapy be adapted to individual requirements? Especially as the number of therapeutic options and known individual risk factors increases, modelling could help to select promising new therapy schedules, which however require validation in clinical trials. Models may also reflect our understanding of complex healthcare settings in health economics and public health. They are useful to study issues like: How many patients would benefit from a treatment and how many would be harmed? Would the potential benefit of screening everyone for a disease outweigh the potential harm to people who receive unneeded care because of false-positive screening results? The purpose of this workshop is to introduce participants to the state of the art of mathematical modelling in medical research: (1) to understand the interplay between data, modelling and theory; (2) the relevance of a deeper understanding of the underlying biology in order to build useful models of diseases and treatment responses; (3) analyzing effects of complex interventions in biological systems when purely empirical approaches are unfeasible; (4) decision making based on model predictions. The whipworm Trichuris trichiura is one of the most common soil-transmitted helminths infecting humans. It is estimated that T. trichiura accounts for up to 800 million infections worldwide [1]. Although mortality from helminth infections is very low, the burden of disease and morbidity is high. The infection occurs after ingesting the eggs. Light T. trichiura infections are usually asymptomatic. Heavy infections may cause painful diarrhea, anemia and growth retardation in children [2]. Preventive measures include education to improve hygiene and food preparation. 
Large-scale deworming programs are also in place to control infection, especially for school-aged children, who represent the group most vulnerable to helminth infections [3]. Remotely sensed data on environmental variables such as rainfall, vegetation, land surface temperature or elevation are available free of charge in the public domain. Linking the spatial distribution of T. trichiura infection with environmental data can be useful to inform decisions to prevent infection through appropriate control measures taken directly in the environment, and to provide information for planning de-worming programs in high-risk areas characterized by the environmental data. The primary objective of this study is to identify environmental factors associated with T. trichiura infection in the Mbeya region of Tanzania. Methods Data from a cross-sectional study collected as part of the EMINI cohort study [4], including 6234 participants from nine different study sites in South-Western Tanzania, are analyzed. The statistical analysis involves uni- and multivariate log-link binomial regression modelling to assess associations of environmental factors with T. trichiura infection prevalence. Controlling for confounding and adjusting for household clustering play an important role in the analysis. Preliminary results suggest that T. trichiura infection occurs solely in Kyela, a low-lying study site in close vicinity to Lake Nyasa. The prevalence of T. trichiura infection there is 26.6 % (95 % confidence interval [95 % CI] 23.9 to 29.6). In the other eight study sites, either no or very few Trichuris-infected participants were found, which suggests that infection is related to one or more factors specific to the Kyela site. The site consists of two subsites with very different prevalences (2.7 %, 95 % CI 1.4 to 5.3, and 38.1 %, 95 % CI 34.3 to 42.0, respectively). The factors contributing to this high prevalence and the differences between the two subsites remain to be assessed. 
The study population has a median age of 16.7 years (interquartile range IQR = 8.9-35.3 years). The maximum prevalence of T. trichiura infection (40 %) is found in children between 10 and 15 years of age. This is in agreement with the literature, which shows that children are at highest risk for T. trichiura infection. Final results will be available after comprehensive data analysis. The prevalence of T. trichiura infection in the EMINI study sites in the Mbeya region of Tanzania seems to be restricted to the vicinity of Lake Nyasa, with local prevalences up to 38 %. Potential associations of T. trichiura infection with remotely sensed environmental factors remain to be examined. This poster presents the proof-of-concept of an evidence-based usability database applied to medication alerting systems. Methods A list of evidence-based usability principles has been turned into a relational database enabling queries. The developed database gathers 64 types of design principles, 167 instances of violations, 116 consequences for the user and 20 negative outcomes. The present database must now consider evidence from other sources of data (e.g. incident reports) and must be extended to other types of technologies. Poor usability of Medical Devices is a cause of use-errors that may ultimately lead to patient harm or even death. The European regulation requires that threatening risks related to usability-induced use-errors are identified and controlled by the manufacturer in order to get CE marking. This requires a usability engineering process to be integrated within the design and risk management processes. The standards harmonized to this essential requirement call for a final usability evaluation, namely "usability validation". However, the lack of a precise definition of this "validation" leads to difficulties and misguided applications.
The Useval-DM project aims at gathering scientific evidence regarding two main operational issues of the "usability validation" phase: the ecological validity of the test environments (e.g. in-lab vs. in-simulator vs. actual clinical setting) and the number of participants in validation tests. Moreover, the Useval-DM project intends to perform an economic evaluation of candidate validation protocols in order to calculate the tradeoff between their costs and effectiveness at detecting threatening usability-induced use-errors. Based on its results, the project will deliver guidelines and illustrative protocols to optimize usability validation of Medical Devices compliant with the European regulation. How to present evidence-based usability design principles dedicated to medication-related alerting systems to designers and evaluators? Results from a workshop Medication-related alerting system use-errors, as well as lack of adoption, are often attributed to usability issues. Previous work has used evidence from the literature to reveal usability principles specific to medication alerting systems and identify potential consequences of violating these principles. The current study sought to explore how best to convey these principles to designers and evaluators of these systems to facilitate their work. To this aim, a workshop with 19 participants was used to generate ideas and opinions on how to deliver these topic-specific design principles in a way that would be most helpful for them. Workshop participants generated several ideas about how (e.g., a collaborative, continuously updated forum) and with what content (e.g., illustrated examples, checklists, evidence sources, potential consequences of violating principles, strength of the evidence) usability principles for medication alerting systems are most usefully disseminated.
Participants, especially designers, expressed a desire to use these principles in practice and avoid previously documented mistakes, thereby making design and evaluation of these systems more effective and efficient. These insights are discussed in terms of feasibility and logistical challenges to developing the proposed documentation (e.g., maintenance, sourcing examples). To move this work forward, a more collaborative approach involving Human Factors specialists in medical informatics is necessary. Background The Westernized lifestyle has been blamed for allergy epidemics, and increased travel distances and frequency of travelling are among its characteristics. It has been speculated that travelling early in life to places which substantially differ from the home environment in terms of climate, vegetation and food could increase the exposure to further unknown allergens and hence promote the development of allergies, but no epidemiological study has investigated this research question. Methods Detailed data on travelling during the first two years of life, as well as a range of atopic outcomes along with potential confounders up to 15 years, were collected prospectively within two large population-based multicentre German birth cohorts, GINIplus and LISAplus. Maximum travelling distance (within Germany; middle/northern/eastern Europe; southern Europe; outside Europe) and total number of trips were considered as exposures. Six atopic outcomes were used: (1) doctor-diagnosed asthma, (2) doctor-diagnosed allergic rhinitis, (3) eyes and nose symptoms, (4) sensitization to food allergens, (5) sensitization to indoor and (6) outdoor inhalant allergens. Longitudinal associations between each exposure and health outcome pair were analyzed using generalized estimating equations (GEE).
The results of our longitudinal analyses of 5996 subjects do not support the research hypothesis that travelling abroad to different regions in Europe or beyond Europe increases the prevalence of doctor-diagnosed asthma and allergic rhinitis, nose and eyes symptoms and allergic sensitization up to 15 years of age, and there was no indication of age-varying effects. When the number of trips was used as the exposure, no statistically significant results and no pattern of an exposure-response relationship were revealed. Travelling abroad seems neither adverse nor beneficial in relation to the development of allergies. If our findings are replicated in other epidemiological studies from different regions, they might be important for parents, indicating that there is no reason to be afraid of taking a small child on a trip abroad because of the risk of developing various atopic conditions. Nevertheless, as travelling is unlikely to occur at random and as the possibility of reverse causation could not be disregarded, these first findings should be interpreted with caution. The promise of the Internet of Things in healthcare: how hard is it to keep? The Internet of Things is starting to be implemented in healthcare. One example is the automated monitoring systems currently being used to provide healthcare workers with feedback regarding their hand hygiene compliance. These solutions seem effective in promoting healthcare workers' self-awareness and action regarding their hand hygiene performance, which is still far from desired. Underlying these systems, an indoor positioning component (following the Internet of Things paradigm) is used to collect data from the ward regarding healthcare workers' position, which is later used to make assumptions about the usage of alcohol-based handrub dispensers and sinks.
We found that building such a system within the scope of the healthcare field is not a trivial task and that it must be subject to several considerations, which are presented, analyzed and discussed in this paper. Given their present limitations, Internet of Things technologies are not yet ready to address the demanding field of healthcare. In order to alleviate the effects that aging will have on our societies in terms of increased functional deficits, waning possibilities for participation and communication and a potential risk of emergencies, supportive technical solutions are frequently proposed. Such technologies may comprise a wide range of solutions including e.g. wearable sensor-based systems, ambient assisted living systems for personal environments or communication support tools. To make sure that the rapidly advancing field of research in technologies for the elderly meets the demands of the persons in need, research efforts should be interdisciplinary. Evaluation in controlled clinical trials or field experiments is extremely valuable and therefore highly commended, yet challenging. The objective of the workshop is to discuss current approaches in wearable and ambient sensor technologies, with a particular focus on challenges and lessons learned during field studies or clinical trials. We will emphasize experiences, common pitfalls and how to avoid them by means of three examples. Together with all workshop participants, we will attempt to draft and later publish a document that summarizes experiences and provides a guideline for future applications and evaluation studies. Using SNOMED CT expression constraints to bridge the gap between clinical decision-support systems and electronic health records Clinical decision-support systems (CDSSs) should be able to interact with the electronic health record (EHR) to obtain the patient data they require.
A recent solution for the interoperability of CDSSs and EHR systems consists in the use of a mediated schema which provides a unified view of their two schemas. The use of such a mediated schema requires the definition of a mapping between this schema and the EHR one. In this paper we investigate the use of the SNOMED CT Expression Constraint Language to characterize these mappings. Background Excess body weight is a well-established risk factor for both type 2 diabetes (T2D) and postmenopausal breast cancer, but considerable evidence also indicates that T2D may be an independent predictor of breast cancer risk. This analysis investigated the complex association of obesity and T2D with breast cancer across five ethnic groups within the Multiethnic Cohort (MEC). In this cohort, participants aged 45-75 years from five ethnic groups were recruited in 1993-1996. New cancer cases were identified through tumor registries and deaths through vital records. Body mass index (BMI) was calculated from self-reported weight and height. T2D was defined as a self-reported diagnosis from one of three questionnaires and confirmed by one of three administrative data sources. Cox regression with age as the time metric was applied to estimate hazard ratios (HR) and 95 % confidence intervals (CI) for the development of breast cancer in women with T2D. Separate models with BMI and T2D as time-varying exposure alone and in combination as well as by ethnicity were completed while adjusting for known confounders. Interactions were tested using a global Wald test of the cross-product terms for the variables of interest, i.e., ethnicity, BMI, and T2D status. Among 100,855 (24,427 white, 20,025 African American, 7,596 Native Hawaiian, 26,396 Japanese American, 22,411 Latina) women, 6,557 breast cancer cases were identified after 14.8 ± 4.1 years of follow-up. Of 17,991 self-reported T2D cases, 14,425 were confirmed by administrative data and 892 developed breast cancer. 
Native Hawaiians (HR = 1.52; 95 % CI 1.38-1.67) and Japanese Americans (HR = 1.20; 95 % CI 1.11-1.29) had a higher, and Latinas (0.81; 95 % CI 0.75-0.88) a lower, risk of developing breast cancer than whites. Compared to women with BMIs of 20 to <25 kg/m2, overweight and obese women had HRs of 1.22 (95 % CI 1.15-1.30) and 1.33 (95 % CI 1.24-1.43). The association was stronger among whites, Native Hawaiians, and Japanese Americans than African Americans and Latinas, with a borderline significant interaction (p = 0.08). In models without BMI, T2D was significantly associated with breast cancer (HR = 1.14; 95 % CI 1.06-1.23), but including BMI lowered the estimate to 1.08 (95 % CI 0.99-1.16). The T2D-ethnicity interaction was borderline significant (p = 0.07); stratification by ethnicity showed elevated risks with and without adjustment for BMI among Latinas (HR = 1.34; 95 % CI 1.15-1.57 and HR = 1.30; 95 % CI 1.11-1.53) and African Americans (HR = 1.18; 95 % CI 1.02-1.37 and HR = 1.30; 95 % CI 1.11-1.53), but not in the other three ethnic groups regardless of BMI. In agreement with previous reports, obesity and T2D are both predictors of breast cancer risk in the MEC study population. However, the findings across ethnic groups indicate complex relations between the three conditions possibly due to differences in genetics, adiposity, and metabolism. In health monitoring, self-reported health professional-diagnosed depression is an internationally used measure. It depends on several conditions, notably previous contact with a health professional, reporting of the depressive symptoms to that professional, a correctly made depression diagnosis that was communicated to the patient, and reporting of the diagnosis in the health survey interview. Information about mental health characteristics of participants with a self-reported diagnosed depression may contribute to a more accurate interpretation of survey results.
In this study, the prevalence of a health professional-diagnosed depression is compared with the prevalence of DSM-IV-based major depression (MD) and their correspondence is examined across different population groups. Moreover, the prevalence of mental disorders among participants with a self-reported health professional-diagnosed depression is reported, and depression severity is explored for individuals with a health professional-diagnosed depression and MD, with a health professional-diagnosed depression only, and with MD only. Methods Data from the cross-sectional nationwide German Health Interview and Examination Survey for Adults (DEGS1) and its mental health module (DEGS1-MH) were used (n = 4483; age 18-79 years). In DEGS1, health professional-diagnosed depression in the past 12 months was assessed as a self-report of the participants in the standardized physician-administered CAPI. In DEGS1-MH, DSM-IV-based 12-month MD and further mental disorders were assessed with a modified German version of the Composite International Diagnostic Interview (DIA-X/M-CIDI). In both assessments, mental health severity indicators were assessed. The median time lag between DEGS1 and DEGS1-MH was 6 weeks. Weighted percentages and 95 % confidence intervals (95 % CI) were calculated for participants with valid data in both instruments (N = 4382) using survey data procedures. A 12-month self-reported health professional-diagnosed depression was most frequent in those aged 45-64 years (8.9 %, 95 % CI 7.1-11.2) and 12-month MD in those aged 18-29 years (9.7 %, 95 % CI 7.1-13.2). Among participants with a health professional-diagnosed depression, 51.8 % (95 % CI 42.9-60.5) had any affective disorder, 54.7 % (95 % CI 46.0-63.1) had any anxiety disorder, and 73.4 % (95 % CI 64.2-80.9) had any mental disorder; the prevalence of MD was 37.2 % (95 % CI 38.8-46.5) and was highest in those aged 18-29 years and decreased in older age groups.
Among participants with MD, the prevalence of a health professional-diagnosed depression was 33.0 % (95 % CI 25.9-40.9); it was highest in participants aged 45-64 years and lowest in participants aged 18-29 years. Severity measures had the highest prevalence among participants with a health professional-diagnosed depression who also fulfilled the criteria for MD. Limiting the time lag did not alter the result substantially. Conclusion Identified cases vary between a self-reported health professional-diagnosed depression and MD. A large part (73.4 %) of participants with a health professional-diagnosed depression had any 12-month mental disorder. The results indicate that among survey participants with MD, a health-professional depression diagnosis is reported most often by middle-aged and less often by younger participants. However, if a participant reports health professional-diagnosed depression, it more often corresponds with MD in younger than in older participants. The results further suggest that participants with depression identified by both instruments had higher severity levels. A dynamic approach to support interoperability for medical reports using DICOM SR Matos P 1 , Bastião Silva LA 2 , Marques Godinho T 1 , Costa C 1 1 DETI/IEETA - University of Aveiro, Aveiro, Portugal; 2 BMD Software, Lda, Aveiro, Portugal The standardization of data structures for clinical observations in medical imaging environments is a relatively recent effort. The DICOM standard defines a set of supplements for different medical reports, termed Structured Reports (SR). In 2013, Integrating the Healthcare Enterprise (IHE) also followed this trend by publishing the profile Management of Radiology Report Templates (MRRT). However, the generalized adoption of these normalized reports has been delayed due to several factors. In fact, numerous medical institutions still use proprietary formats that do not promote sharing and remote access.
New strategies to incentivise the adoption of normalized report templates are needed to make them interoperable between distinct applications. This article proposes a new method to automatically generate DICOM SR from distinct data sources. It encompasses a flexible mapping schema that can be used with distinct medical imaging modalities. Our ultimate goal is to encourage the usage of DICOM SR by providing an effortless method to convert proprietary formats into standard ones. Moreover, the developed methods can also be used to support IHE MRRT profiles, making the reports accessible across different information systems and institutions. A strategy for reusing the data of electronic medical record systems for clinical research Matsumura Y 1,2 , Hattori A 2 , Manabe S 1,2 , Tsuda T 3 , Takeda T 1 , Okada K 1 , Murata T 4 , Mihara N 1 1 Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 2 MKS Inc., Ikeda, Osaka, Japan; 3 MKS Inc, Ikeda, Osaka, Japan; 4 Osaka University Hospital, Suita, Osaka, Japan There is a great need to reuse data stored in electronic medical record (EMR) databases for clinical research. We previously reported the development of a system in which progress notes and case report forms (CRFs) were simultaneously recorded using a template in the EMR in order to exclude redundant data entry. To make the data collection process more efficient, we are developing a system in which the data originally stored in the EMR database can be populated within a frame in a template. We developed interface plugin modules that retrieve data from the databases of other EMR applications. A universal keyword written in a template master is converted to a local code using a data conversion table, and then the objective data is retrieved from the corresponding database. The template element data, which are entered by a template, are stored in the template element database.
To retrieve the data entered by other templates, the objective data is designated by the template element code with the template code, or by the concept code if it is written for the element. When the application systems in the EMR generate documents, they also generate a PDF file and a corresponding document profile XML, which includes important data, and send them to the document archive server and the data sharing server, respectively. In the data sharing server, the data are represented as an item with an item code, a document class code and its value. By linking a concept code to an item identifier, objective data can be retrieved by designating a concept code. We employed a flexible strategy in which a unique identifier for a hospital is initially attached to all of the data that the hospital generates. The identifier is secondarily linked with concept codes. The data that are not linked with a concept code can also be retrieved using the unique identifier of the hospital. This strategy makes it possible to reuse any of a hospital's data. A key feature of the openEHR architecture is the direct, collaborative involvement of clinicians in the design, review, curation and governance of open source clinical content definitions, 'archetypes'. OpenEHR, which grew out of academic enquiry into approaches to EHR architecture, draws heavily from approaches to distributed open source development in attempting to maximise the implementation of high-quality clinical content models, whilst supporting the competing need to reflect local requirements and innovation. OpenEHR is both 'cathedral and bazaar' in Raymond's terms.
A carefully controlled and slowly developing core 'cathedral' information model underpins technical implementation but is designed to consume arbitrary clinical concept models designed and developed by a set of collaborating clinical communities operating rather more as a 'bazaar', as a means of maximizing value within the highly complex and diverse domain of healthcare documentation. The emergent methodology draws heavily on the paradigms of open source distributed development and 'crowd-sourcing'. It expects organic and evolutionary development, and anticipates and accepts speciation and 'forking' of assets. It accepts that local, national and international perspectives have to be supported and that full international alignment/adoption will not be a smooth journey, understanding that short-term imperatives will always trump longer-term aspirations. Nevertheless, steady progress is being made, largely due to considerable voluntary commitment from the community, with the output of openEHR clinical modelling becoming apparent through implemented national and international standards, as well as significant system implementations. This workshop will discuss the implications of this evolutionary approach to shared content development based on international, national and vendor-level experience as a means of maximizing benefit within a highly dynamic, culturally-mediated complex system. Asthma is a common chronic disease across Europe, though estimates of prevalence vary from 5 to 20 % [1]. Most management is provided in primary care and increasingly data about care are recorded in computerized medical record (CMR) systems. Ontologies enable the formal definition of concepts and their relationships. We tested an ontology for asthma using the Royal College of General Practitioners (RCGP) Research and Surveillance Centre (RSC) database (N = 1,246,735).
A case of asthma might be defined based on a recording of diagnosis (8.67 %), symptoms such as cough (35.13 %), diagnostic tests (17.48 %), therapy (15.04 %) or records of the process of care (9.46 %). Whether a combination or just one of these factors is used causes the prevalence in RCGP RSC to vary from 4.92 % to 41.96 %. Presenting the ontological basis of prevalence estimates would help make comparisons more reliable. The Web Ontology Language (OWL) can be used to enable such sharing: http://bioportal.bioontology.org/ontologies/AO/. Meineke F 1,2 , Stäubert S 1 1 Leipzig University, Leipzig, Germany; 2 IFB Adiposity Diseases Leipzig, Leipzig, Germany The German Federal Ministry of Education and Research (BMBF) starts the Medical Informatics Funding scheme in 2016 [1]. In the long term, each university hospital is requested to establish a Data Integration Center (DIC) for local management and dissemination of health care data for the benefit of research and patient care. The Open Archival Information System (OAIS) [2, 3] provides a widely adopted conceptual framework for modelling long-term preservation archives. We analyze whether OAIS is applicable and useful in modelling the archival core of a DIC. Neither the BMBF nor OAIS proposes implementation details or data and interface standards. The BMBF only lists general responsibilities a DIC should fulfill: give access to local data, set up quality management, be compliant with data protection provisions, include a user and rights administration, give support and training, and create interfaces for data exchange. Although a national steering committee will assure their interoperability for overarching access, it must be feared that the DICs will have very different architectures and workflows, hindering collaboration, harmonization or comparison. Methods OAIS [2] defines terminology (here: italics), a functional architecture, an information architecture and transformation workflows.
Information packages are submitted by producers, archived, transformed and released to consumers. Producers and consumers are persons or systems, including other archives. Work is done in functional entities, e.g. ingest, access or management. OAIS includes an elaborated information model, carefully deconstructing data and various metadata categories, e.g. for access and access rights. The components of OAIS are analysed, associated with the mandatory responsibilities of a DIC as defined by the BMBF, and examined for DIC-specific demands. We found that the responsibilities of a DIC are generally well covered by OAIS. As the DIC manages heterogeneous, patient-related clinical data, additional aspects are to be analyzed: Data pre-processing and curation is up to the producer, as there is no content-related expert knowledge inside an OAIS. For real-time data all processes must be fully automated. De-identification and anonymization services are part of access, as record linkage is required. OAIS defines all means for rights and access management for consumers. Full anonymization of clinical data requires detailed content-related metadata, e.g. to assure l-diversity or k-anonymity. The BMBF recommends that, in general, data should not be stored in the DIC itself; this is understandable e.g. for image and genotype data but might decrease overall access reliability, placing a heavy burden on the access software. The BMBF makes no data- or metadata-related proposal needed to assure interoperability and overarching data exchange. Here we may adopt OAIS ideas. OAIS demands that any information preserved is independently understandable to the designated community, without any assistance from the producers. The OAIS federated model for cooperating archives proposes a central common catalog as the access point for global consumers. Conclusion DIC and OAIS are not congruent concepts. But OAIS can serve as a masterplan and checklist for designing the archival part of a DIC infrastructure.
We conclude that an OAIS-based DIC is potentially more reliable, extensible and understandable than an ad hoc architecture without an underlying conceptual framework. The Markov-based modelling tool DYNAMO-HIA ('DYNAmic Model for Health Impact Assessment') allows the estimation and quantification of impacts on population health due to policies and interventions. Based on underlying epidemiological causal chains, the model simulates a target population dynamically through time, considering changes in individuals' exposure to a risk factor over the life course. Country-specific prevalence data on common risk factors (alcohol, smoking, BMI), on associated chronic diseases and the corresponding relative risks (RR) are supplied to the user by a ready-to-use database. For NRW, we aim at modelling an additional risk factor of growing concern: insufficient physical activity. Methods DYNAMO-HIA is a generic model which allows adding data on e.g. a further risk factor. Age- and gender-specific data on the risk factor prevalence, its relative mortality risk and disease-specific RRs are required. We examined whether 10 different data sources provided the required physical activity prevalence data at the country level, ideally for NRW. All are questionnaire-based, since surveys using mobile sensor systems are still at an early stage. The selection of associated health outcomes was based on sufficient evidence of the risk factor-disease relationship for physical activity. A comprehensive literature review was conducted to obtain RRs. For physical activity prevalence, survey data from the German Robert Koch-Institute (KIGGS wave 1, GEDA-NRW 2009/2010) was chosen. Since DYNAMO-HIA requires age- and sex-specific input data, extensive processing (imputation and smoothing) was necessary. A systematic literature review was hampered by the fact that physical activity is measured and reported in multiple dimensions and with different cut-offs, be that the type (e.g.
occupational, recreational or leisure), the dose (frequency/duration/intensity), the scale (continuous vs. categorical) or the timing of physical activity. We therefore focused on publications that used the physical activity levels and intensity recommended by the WHO. Furthermore, the classification into physical activity categories had to be similar. RR estimates were derived for use in DYNAMO-HIA by judging the published estimates, giving higher priority to data representing German or European populations and to those of relevant, large-scale studies. The associated health outcomes were all-cause mortality, ischemic heart disease and coronary heart disease, stroke, diabetes mellitus type 2, cancer of the colon/rectum and breast cancer. Expanding the database of DYNAMO-HIA by a further risk factor like physical activity is feasible, even though challenging, due to the complex structure and manifold approaches to measuring physical activity. Compromises need to be made, as well as assumptions related to the transferability of the results of the literature research. This has to be kept in mind when interpreting the upcoming modelling results for NRW. Functional health literacy-merely a question of educational background? Mensing M 1 1 Landeszentrum Gesundheit NRW, Bielefeld, Germany The importance of functional, interactive and critical health literacy in citizens and patients is increasingly recognised as a key determinant for health-related behaviour, the ability to navigate through today's complex healthcare systems, and health outcomes. This study examined the extent to which educational levels (ISCED) influence and predict the participants' performance in functional health literacy, that is, the ability to find and understand health information, and what other socio-demographic and -economic factors play a decisive role when estimating the chance of limited health literacy in adults.
In Germany, the federal state of NRW (North Rhine-Westphalia) took part in the HLS-EU with a total sample of n = 1,057. The computer-assisted personal interviews of adult residents aged 15+ years were carried out in July 2011 in 328 sample points. Health literacy was measured by asking for self-perceived, subjective difficulties in dealing with health information using 47 items. A composite indicator 'subjective functional health literacy (S_FHL)' was calculated, using those items that reflect difficulties in finding and understanding health information. As a comparable objective measure (O_FHL), the Newest-Vital-Sign Test was used. For both outcomes, multivariate logistic regression analyses were applied to evaluate adjusted ORs of possible predictor variables. The educational level is far from being the set-in-stone predictor of functional health literacy in NRW adults that it is commonly believed to be. Therefore, further effort is needed to understand and reduce insufficient or problematic health literacy, especially in the elderly and migrants. This is all the more true against the background of demographic change and increasing life expectancy on the one hand, and current migration flows on the other. These groups show an increased risk for limited health literacy levels, partially accompanied by a faulty self-assessment. Reducing barriers that hinder appropriate use of the German health system by the identified vulnerable groups represents a further contribution towards a reduction of health inequalities in NRW. Omentin-1 is a recently identified adipokine, mainly expressed in visceral adipose tissue. Cross-sectional studies reported inverse associations of omentin-1 with body fatness, metabolic syndrome, type 2 diabetes and cardiovascular disease. However, prospective data on the association between plasma omentin-1 levels and future risk of cardiovascular diseases (CVD) are lacking.
The aim of the study was to investigate the relationship between omentin-1 and incident myocardial infarction (MI) and stroke. Methods A case-cohort study nested within the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam cohort was used to investigate the relationship between omentin-1 and the risk of MI and stroke. Among 26,444 middle-aged men and women, 350 CVD events occurred during 8.2 ± 1.6 years of follow-up. Plasma omentin-1 was measured in a randomly selected subcohort of 2084 participants and in all incident CVD cases. Prentice-modified Cox proportional-hazards regression was used to estimate the association between omentin-1 and risk of MI and stroke, adjusted for established CVD risk factors. Interactions were tested with cross-product terms in the fully adjusted model. After adjustment for established CVD risk factors, omentin-1 was not significantly associated with risk of MI (HR per doubling of omentin-1: 1.17; 95 % CI 0.80-1.72; p = 0.43), but higher omentin-1 levels were associated with a higher risk of stroke (HR per doubling of omentin-1: 2.20; 95 % CI 1.52-3.21; p < 0.0001). In subgroup analyses, associations of omentin-1 with risk of stroke were generally stronger in lower versus higher CVD risk groups. For example, the association with risk of stroke was stronger in participants without metabolic syndrome (HR: 2.58; 95 % CI 1.64-4.07; p < 0.0001) than in those with metabolic syndrome (HR: 1.21; 95 % CI 0.59-2.50; p = 0.60) (p interaction = 0.05). Similar interactions were observed when participants were classified into low- or high-risk groups according to waist circumference, triglyceride levels, hsCRP or adiponectin levels. We found no evidence that omentin-1 is related to MI risk, but omentin-1 concentrations may be related to increased stroke risk. This association is stronger in metabolically healthy individuals. Further studies are needed to explain the different roles of omentin-1 under different metabolic conditions. 
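A "HR per doubling", as reported above, is commonly obtained by entering the exposure on a log2 scale in the Cox model, so that exponentiating the coefficient gives the hazard ratio per doubling of the exposure. A minimal sketch of that back-transformation, using a hypothetical coefficient and standard error rather than the study's data:

```python
import math

def hr_per_doubling(beta_log2, se):
    """Hazard ratio per doubling of exposure with a Wald 95% CI,
    given a Cox coefficient fitted on log2-transformed exposure."""
    hr = math.exp(beta_log2)
    lo = math.exp(beta_log2 - 1.96 * se)
    hi = math.exp(beta_log2 + 1.96 * se)
    return hr, lo, hi

# hypothetical coefficient and standard error (illustrative only)
hr, lo, hi = hr_per_doubling(0.3, 0.2)
print(round(hr, 2), round(lo, 2), round(hi, 2))
```

The Prentice weighting of the case-cohort design affects how the coefficient and its standard error are estimated, not this back-transformation step.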
Epilepsy diagnosis is frequently confirmed using electroencephalogram (EEG) recordings along with clinical data. The main difficulty in the diagnosis is the large amount of data generated by EEG, which must be analyzed by neurologists to identify abnormalities. One of the main research challenges in this area is the identification of relevant EEG features that allow automatic detection of epileptic seizures, especially when a large number of EEG features are analyzed. Objective The aim of this paper is to analyze the accuracy of algorithms typically used in feature selection processes, in order to propose a mechanism to identify a set of relevant features to support automatic epileptic seizure detection. This paper presents a set of 161 features extracted from EEG signals and a relevance analysis of these features in order to identify a reduced set for efficiently classifying EEG signals into two categories: normal or epileptic seizure (abnormal). A public EEG database was used to assess the relevance of the selected features. The results show that the number of features used for classification was reduced by 97.51 %. The paper provides an analysis of the accuracy of three algorithms, typically used in feature selection processes, in selecting a set of relevant features to support automatic epileptic seizure detection. The Forward Selection algorithm (FSA) produced the best results in the classification process, with an accuracy of 80.77 %. Care organizations in the Dutch region Apeldoorn want to collaborate more in order to improve the care provision to elderly and psychiatric patients living independently. In order to support the collaboration they intend to create a regional digital healthcare network. The research was focused on the relevance of a regional healthcare network for care providers. 
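Forward selection, the best-performing algorithm above, can be sketched as a greedy loop that repeatedly adds the feature giving the largest improvement in a scoring criterion and stops when no candidate improves it. The scoring function and feature names below are made up for illustration; the study scored subsets by classifier accuracy on EEG features.

```python
# Greedy forward selection sketch: at each pass, add the single feature
# that most improves the caller-supplied score; stop when none improves it.
def forward_select(features, score):
    selected = []
    best = score(selected)
    improved = True
    while improved:
        improved = False
        for f in [f for f in features if f not in selected]:
            s = score(selected + [f])
            if s > best:
                best, chosen, improved = s, f, True
        if improved:
            selected.append(chosen)
    return selected, best

# toy score: features 'a' and 'c' are informative; each added feature
# carries a small complexity penalty
def toy_score(subset):
    return sum({'a': 2.0, 'c': 1.0}.get(f, 0.0) for f in subset) - 0.1 * len(subset)

print(forward_select(['a', 'b', 'c', 'd'], toy_score))
```

With such a penalty, the loop keeps only the informative features, mirroring how 161 candidate features can collapse to a handful.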
Eleven semi-structured interviews based on the USE IT-model were conducted with care providers and staff members. Results show that care providers need to tune their activities for this target group and create an agreement on integrated care. The relevance of a digital communication and collaboration platform is high. The regional healthcare network should support the collaboration between care providers by: 1. Offering a communication platform to replace time-consuming communication by telephone; 2. Making patient information available to patient and care provider at the patient's home; 3. Giving insight into who is giving what care to whom; and 4. Giving access to knowledge about the target group: elderly and psychiatric patients living independently. The Berlin Initiative Study-a cohort of community dwelling older adults: prevalence of reduced kidney function and albuminuria Mielke N 1 , Ebert N 1 , Martus P 2 , Schäffner E 1 1 Institute of Public Health, Charité - Universitätsmedizin Berlin, Berlin, Germany; 2 Institute of Clinical Epidemiology and Medical Biostatistics, Tübingen, Germany Older adults account for the largest and still increasing proportion of those with decreased kidney function, a phenomenon that might also be nourished by an ongoing demographic shift in most industrialized countries. Although chronic kidney disease (CKD) is said to increase especially among older adults, epidemiologic data on kidney function in people ≥70 years are scarce. The Berlin Initiative Study (BIS) aims to fill this gap by evaluating the CKD burden in older adults. The BIS is a prospective population-based cohort study. Inclusion criteria were "Allgemeine Ortskrankenkasse" (AOK)-Nordost membership and age ≥70 years (y). Dialysis and kidney transplant patients as well as nursing cases were excluded. Sampling was stratified for age (70-74, 75-79, 80-84, 85-89, 90+) and gender. This cross-sectional analysis gives a detailed baseline characterization. 
Representativeness was analysed by comparing the distribution of comorbidity claims data ["hierarchical morbidity groups" (HMGs)] in the BIS population with the total source population from the AOK. The crude and adjusted associations of risk factors with reduced glomerular filtration rate (GFR) (eGFR_BIS2 < 60 ml/min/1.73 m²) and macroalbuminuria (ACR ≥ 30 mg/g) were analysed using chi-square tests and multiple logistic regression analysis. Results 2069 participants (70-74 y, n = 573; 75-79 y, n = 476; 80-84 y, n = 429; 85-89 y, n = 385; and ≥90 y, n = 206) were enrolled. Mean age was 80.4 y and slightly more females (53 %) participated. 26 % were diabetic, 79 % were on antihypertensive medication, 9 % had experienced a stroke, 14 % a myocardial infarction, 23 % had suffered from cancer, 18 % were anaemic, and 28 % were obese. The distribution of comorbidities (HMGs) in the BIS cohort was very similar to that in the insurance source population, with no difference in prevalence rates larger than 6 %. The kidney markers creatinine and cystatin C, as well as the albumin-creatinine ratio (ACR), increased with increasing age. After multivariate adjustment, reduced GFR and elevated ACR were associated with most cardiovascular risk factors. Participants with an ACR ≥ 30 mg/g were more often male (OR 1.36, 95 % CI 1.07-1.74), older than 80 y (OR 1.64, 95 % CI 1.29-2.09), diabetic (OR 2.46, 95 % CI 1.94-3.11) and anaemic (OR 1.35, 95 % CI 1.03-1.78), had lower GFR (OR 1.50, 95 % CI 1.12-2.01), and suffered from higher systolic and diastolic blood pressure (OR 1.60, 95 % CI 1.26-2.02 and OR 1.51, 95 % CI 1.16-1.95, respectively). High BMI was associated with a reduced risk of albuminuria (OR 0.75, 95 % CI 0.58-0.97). The BIS is a well characterized, representative community-based cohort of older adults. There are considerable age- and gender-specific differences in comorbidity prevalence rates. 
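For orientation, a crude odds ratio with a Woolf (log-scale) 95 % CI can be computed directly from a 2x2 table; the adjusted ORs above come from multiple logistic regression, which this sketch does not reproduce, and the counts below are made up.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR with Woolf 95% CI.
    a, b = exposed cases / non-cases; c, d = unexposed cases / non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical counts, not the BIS data
print(odds_ratio(40, 60, 25, 75))
```

An adjusted OR would instead be obtained as exp(coefficient) from the fitted logistic model containing all covariates.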
Participants with eGFR < 60 ml/min/1.73 m² or ACR ≥ 30 mg/g had significantly higher odds of most cardiovascular risk factors compared to those with eGFR > 60 ml/min/1.73 m² or ACR < 30 mg/g, respectively. Kidney function declined and ACR rose with increasing age. These data help build the ground for future research on health development, patient-centred care, and anticipation of health care expenditures in the growing field of geriatric nephrology. Background Biomedical text mining commonly uses concepts extracted from MEDLINE citations, often MeSH terms and their co-occurrences, as an indication of a relationship [1]. Increasingly, semantic relations extracted from titles and abstracts are also being used. These have a higher likelihood of representing actual relations expressed in text [2]. Although analysis of co-occurring concepts in MEDLINE has been addressed [3, 4], little attention has been paid to temporal aspects of those relationships. Such analysis (of both co-occurring concepts and their relationships) could be used to reveal trends in biomedical research. As a first step, we used MapReduce [5] to extract MeSH codes, their co-occurrences, and year of publication from the MRCOC file, representing all of MEDLINE. The concepts in this dataset are identified by their CUIs from the UMLS. In parallel, we extracted semantic relations from all of MEDLINE using SemRep [6] and joined them with publication years. From the semantic relation instances, we also produced aggregated counts of the relation arguments and pairs of arguments. We did this with various AWK scripts and Unix shell utilities. Finally we joined the MeSH occurrence and co-occurrence counts with those extracted with SemRep to show the trends of each concept, its associated co-occurrences, and semantic relations by year. 
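The core aggregation step (counting concept occurrences and co-occurrences per publication year) can be sketched in a few lines; the record format and CUIs below are illustrative stand-ins for the MRCOC-derived data.

```python
from collections import Counter

def aggregate(records):
    """Count per-year concept occurrences and unordered pair co-occurrences
    from (cui1, cui2, year) records."""
    occ, cooc = Counter(), Counter()
    for cui1, cui2, year in records:
        occ[(cui1, year)] += 1
        occ[(cui2, year)] += 1
        # order the pair so (A, B) and (B, A) count as the same co-occurrence
        pair = tuple(sorted((cui1, cui2)))
        cooc[(pair, year)] += 1
    return occ, cooc

records = [("C0001", "C0002", 2015),
           ("C0002", "C0001", 2015),
           ("C0001", "C0003", 2014)]
occ, cooc = aggregate(records)
print(cooc[(("C0001", "C0002"), 2015)])
```

In a MapReduce setting the same logic is split into a map phase emitting the keyed tuples and a reduce phase summing the counts per key.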
The resulting dataset consists of the counts of the occurring and co-occurring MeSH headings and the counts of the corresponding semantic predications, by publication year. The data are stored in a database table that contains eight columns: (1) CUI of the first concept; (2) CUI of the second concept; (3) semantic predication name; (4) publication year; (5) number of occurrences of the first concept; (6) number of occurrences of the second concept; (7) number of co-occurrences; and (8) number of occurrences of the semantic predication. There are more than one billion co-occurrence records in the MRCOC file. From these we obtained counts for 1,149,803 occurring concepts and 228,898,636 pairs of co-occurring concepts from 1902 to 2015. From 47,730,370 semantic relation instances we extracted 95,460,740 argument occurrences, which we aggregated into 3,000,744 distinct arguments by year. From the same semantic relation instances we produced 23,864,872 co-occurring pairs of arguments. After joining concept co-occurrences originating from MeSH headings with those originating from SemRep semantic relations, we obtained 2,732,765 co-occurrences. After joining concept occurrences originating from MeSH with those originating from semantic relations, we obtained 692,727 concept occurrences. The dataset that we produced comes from MEDLINE, but from two different concept and relationship sources: the assigned MeSH headings and the semantic relations extracted from the titles and abstracts with SemRep. From these two sources we first separately produced concept occurrences and co-occurrences by year, and then joined them together. The dataset produced is a solid starting point for temporal analysis, such as trend analysis, which we plan to do in the future. The construction and publication of predications from scientific literature databases such as MEDLINE is necessary given the large amount of resources available. 
The main goal is to infer meaningful predicates between relevant co-occurring MeSH concepts manually annotated from MEDLINE records. The resulting predications are formed as subject-predicate-object triples. We exploit the content of the MRCOC file to extract the MeSH indexing terms (main headings and subheadings) of MEDLINE. The predications were inferred by combining the semantic predicates from SemMedDB, the clustering of MeSH terms by their associated MeSH subheadings, and the frequency of relevant terms in the abstracts of MEDLINE records. The inference process also obtains and attaches a weight to each generated predication. As a result, we published the generated dataset of predications using the Linked Data principles to make it available for future projects. Comparison of two methods for habitual dietary intake assessment: a combination of 24-h food lists and a food frequency questionnaire (FFQ) versus conventional FFQ Mitry P 1 , Jourdan C, Zoller D 1 , Meisinger C 1 , Thierry S 1 , Knüppel S 2 , Boeing H 2 , Linseisen J 1 1 Helmholtz Zentrum München - Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH), Neuherberg, Germany; 2 Deutsches Institut für Ernährungsforschung Potsdam-Rehbrücke, Nuthetal, Germany Valid estimation of habitual dietary intake still poses a challenge in epidemiological studies. A new approach combines the strengths and information provided by two different instruments, i.e., repeated 24-h food lists (24HFLs) and a food frequency questionnaire (FFQ). The results are compared to a conventional FFQ method. Methods Three web-based 24HFLs and a FFQ were completed by 805 participants of the population-based KORA FF4 study (48.1 % women, age range 39-88 years). Consumption probabilities were estimated using logistic mixed models adjusted for age, gender, BMI, smoking, and physical activity, and additionally adjusted for the consumption frequency information derived from the FFQ. 
Daily consumption amounts of a consumed food item were predicted for each participant based on the results of the population-based Bavarian Food Consumption Study II, applying linear mixed models adjusted for the above-mentioned variables. Multiplying the consumption probability by the estimated daily consumption amount yielded the usual intake for each subject. These results were compared to the results derived from the FFQ only. Mean 24HFL-based results for food group intake were often higher than the FFQ-based results. The differences seem to be smaller in women than in men. Spearman correlation coefficients between the two methods ranged from 0.36 (for cereals/cereal products; women) to 0.8 (for alcoholic beverage intake; men). Comparison of the two methods resulted in weighted kappa values based on quintiles ranging from fair to excellent agreement. The results on usual dietary intake based on the new method differ from the FFQ-based results, both in absolute terms and in classification according to quintiles. A validation study including recovery biomarkers is warranted. Background Citizen engagement and patient involvement stand out as the most underutilized resources for meeting future challenges in health care. Placing the citizen at the center of digital health interventions is a daunting task if the intent is to bridge divides and broaden the community that benefits from digital health services and tools. Methods Panel discussion presenting research examples of how to make navigation tools more personal and relevant to the individual's situation and to peer interaction, both of which are critical. Mobilizing citizens has been shown to significantly improve personal health and well-being. However, the impact and sustainability of digital health interventions remain unclear. It is challenging to orient oneself and create a safe path through unfolding digital realities of unfamiliar information, fragmentation, and tools one does not feel confident with. 
To support citizen engagement in digital health, we will discuss how to create more relevant, tailored resources and services to safely navigate the increasingly complex digital health landscape, using the analogy of a personal digital health compass. The Ministry of Health Malaysia started training medical interns in the proper documentation of discharge diagnoses for inpatient cases in 2011. Medical interns are often responsible for completing the clinical component of the PER-PD301 Admission and Discharge form, while the medical record officers (MROs) are responsible for assigning the ICD-10 code. Therefore, the training aims to address the interns' competency in quality documentation, which affects the quality of the coding and the resulting health information. It does not educate interns on making the correct diagnosis. The training has 3 components: (1) documentation awareness, (2) the form and definitions, and (3) problem-based exercises. Documentation awareness informs interns of their responsibility to complete the form as well as the consequences of its quality. They are informed of how the information flows, starting from themselves to the MROs to a centralized national centre before reaching secondary recipients; how the information represents national statistics, is reused for research, and is perused for health system planning. They were also shown real-life samples of poor documentation, including illegible handwriting, use of short forms, and missing or improper information that does not define a diagnosis. Next is an introduction to the form, its structure, and the definitions of each data element required to be filled in. The diagnosis section is further divided into main diagnosis (for morbidity cases) or cause of death (for mortality cases), underlying cause, comorbidity(s), complication(s), and external factor(s) in accordance with ICD-10 guidelines. If filled in properly and completely, this information was deemed sufficient for MROs to assign the correct coding. 
Lastly, the participants were given interactive problem-based exercises consisting of clinical scenarios with explicit diagnosis details. The interns are then required to fill in the form, followed by an open discussion of the expected answers. The training ends with dos and don'ts as a take-home message. The ministry included this training as part of the Housemanship Induction Course in 2013, conducted at the central level. Due to increasing demand and the growing number of sessions, the training has now been decentralized to the state level. Decentralization was implemented after awareness and commitment were shown by hospital and state health department directors in support of quality information. Mohd Yusof M 1 , Matsumura Y 2 , Takeda T 3 , Mihara N 3 , Abdul Rahman R 4 Introduction Health Information Systems (HIS) and clinical workflow generate medication errors that affect the quality of patient care. Rigorous and comparative evaluation of medication error management in clinical practice enables an understanding of the role of management and its effect in reducing or eliminating errors that arise from clinical workflow and HIS. The primary objective of this study was to compare and understand the differences in the management of HIS-induced medication errors between hospitals in Malaysia and Japan in terms of approaches, impact, and lessons learned. Methods A comparative case study of summative evaluation was conducted, employing observation, interviews, and document analysis in a 1000-bed Japanese secondary care teaching hospital and a 300-bed Malaysian secondary care hospital, from the perspectives of users (nurses and physicians), management, and information technology staff. Medication error management plans and strategies, in terms of the human, process, organisation, and technology factors of the HIS of both hospitals, were evaluated. 
The two hospitals differ in terms of size, age, culture, and infrastructure, but they share common attributes in ensuring patient safety through the use of technology. Positive impacts on reducing medication error were associated with technical factors, including system alerts, notifications, and constraints; automated checking, dosing and calculation; and information completeness and accuracy; human factors, particularly user awareness and attitude; and organisational factors, namely policy, top management support, clinical process-technology alignment and champions. However, Computerised Physician Order Entry (CPOE) has a number of limitations in guiding physicians to choose the right medication, and it prompts excessive alerts without additional information to aid decision making. Tedious system use, limited screen view, and inconvenient, non-durable and insufficient numbers of personal digital assistants (PDAs) led to user resistance. Organisational factors such as planning, training, technology support, turnover rate, clinical workload, and communication were barriers to system implementation and use that subsequently affect medication error. A number of fits and misfits between the three factors, and lessons learned, were identified. An overview of the HIS and technology involved in the medication process, pharmacist interventions, the medication error reporting system and significant medication safety issues were compared. Recommendations to improve the current medication error issues were also discussed. A continuous, systematic program of quality improvement and peer review that includes mechanisms for monitoring, reviewing, and reporting medication error had positive impacts on error reduction and clinical workflow. Clinical staff perceived the program positively in managing and minimizing medication error. Conclusion HIS was designed to minimize error, but it also creates new risks in the medication process. 
Error occurrence is inevitable, but it can be minimized with pre-emptive endeavor. Managing medication error is crucial and contributes greatly to minimizing error and enhancing patient safety. Specific, tailored management approaches work well in a specific, contextual setting, but they may also be applied to different settings. The role of socio-technical factors and their fit in realizing the potential of HIS and management approaches in minimizing error requires continuous, in-depth evaluation. Evaluation could provide the understanding needed to manage error incidents as a learning process in continuous process improvement. Background As with many other chronic diseases, the approach currently used to explain the etiology of suboptimal bone strength is built on the premise that there is important tracking throughout the life course, modulated by gene-environment interactions. Anthropometrics comprise a set of partially modifiable bone strength influences of great importance during childhood. Strong cross-sectional relations between body size and bone have been clearly established in paediatric populations. However, it is unknown whether the identification of distinctive growth trajectories, capturing different shapes of overall growth, improves the statistical prediction of bone properties in addition to cross-sectional data. The aim of this study was to assess the influence of growth trajectories on bone physical properties at 7 years (y) of age. We studied a subsample of children from Generation XXI, a prospective population-based cohort study established in Porto, Portugal, who underwent whole body dual-energy X-ray absorptiometry (DXA) at the 7-year-old follow-up evaluation. Total body less head bone mineral content (BMC) and bone area (BA) were obtained. BMC regressed on BA was calculated as a measure of volumetric bone mineral density (vBMD), and sex-specific BMC and vBMD z-scores were established. 
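Sex-specific z-scores of the kind used above standardize each child's value against the mean and SD of his or her own sex group. A minimal sketch with made-up values (the study derived its z-scores from the Generation XXI DXA data, not from raw group statistics like this):

```python
import statistics

def zscores_by_group(values, groups):
    """z = (x - group mean) / group SD, computed within each group."""
    out = [0.0] * len(values)
    for g in set(groups):
        idx = [i for i, gg in enumerate(groups) if gg == g]
        xs = [values[i] for i in idx]
        m, s = statistics.mean(xs), statistics.pstdev(xs)
        for i in idx:
            out[i] = (values[i] - m) / s
    return out

# made-up BMC-like values for three girls ('f') and three boys ('m')
vals = [1, 2, 3, 10, 20, 30]
sexes = ['f', 'f', 'f', 'm', 'm', 'm']
print([round(z, 3) for z in zscores_by_group(vals, sexes)])
```

Standardizing within sex removes the mean difference between groups, so trajectory effects can be compared on a common scale.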
Children's weight and height were measured following standard protocols. The definition of growth trajectories in Generation XXI was based on weight measurements extracted from the children's health books, recorded in routine care, from birth until the age of 6 y. Four growth trajectory patterns were identified using normal mixture modelling for model-based clustering and labelled as "normal weight gain" (trajectory I), "persistent weight gain" (trajectory II), "weight gain during childhood" (trajectory III) and "weight gain during infancy" (trajectory IV). Associations between growth trajectories and BMC and vBMD, expressed as linear regression coefficients (b) and 95 % confidence intervals (95 % CI), were computed for boys and girls separately. To test the effect of growth trajectories on bone properties independently of current body size, models were additionally adjusted for 7 y weight and height. In girls, after adjustment for current weight and height, trajectory II was associated with a 0.12 (95 % CI 0.04; 0.21) higher BMC z-score than trajectory I. Children from trajectory II had significantly higher vBMD z-scores independently of current body size [girls: b = 0.38 (95 % CI 0.21; 0.55) and boys: b = 0.21 (95 % CI 0.02; 0.40)]. Our study adds that growth trajectory during infancy and childhood predicts bone physical properties at 7 y of age, beyond the effect of current body size. Demographic ageing and the evolution of smoking-attributable mortality: the example of Germany Mons U 1 Smoking is still the single most important preventable cause of death worldwide. For most smoking-caused diseases, incidence increases with age. Thus, the burden of smoking-attributable deaths is likely to be strongly influenced by trends in demographic ageing. 
With this study, I sought to quantify the evolution of smoking-attributable mortality in Germany over the last two decades since reunification, to explore the impact of demographic ageing, and to discuss potential avenues to curb the smoking-associated disease burden. Smoking-attributable mortality (SAM) for Germany was estimated using the current methodology and relative risk estimates provided by the US Centers for Disease Control and Prevention (CDC). Current and former smoking prevalence were extracted from the German microcensus as provided by the German Federal Statistical Office. Prevalence data for 1992, 1995, 1999, 2003, 2005, 2009 and 2013 were available. Cause-specific numbers of deaths for the respective microcensus survey years were taken from the mortality statistics of the German Statistical Office. From these data, SAM was calculated separately for each cause of death. The impact of demographic ageing was explored by applying direct age standardization. In order to estimate the impact of demographic ageing on future SAM, a forward projection until 2035 was modelled. Total SAM declined from about 139,000 cases in 1992 (15.7 % of all deaths) to about 125,000 cases in 2013 (14.0 %). The evolution over time was substantially different in men and women: while annual SAM in men declined by approximately 20,000 cases, SAM in women even increased somewhat. Differences between actual and age-standardized SAM were particularly striking in men. While age-standardized SAM in men decreased by nearly half, it remained more or less stable in women. The forward projection of SAM suggests that demographic ageing will lead to a further steady increase in actual SAM over the next two decades, by approximately one third among men and one fourth among women. Potential avenues to curb the smoking-associated disease burden are discussed. 
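Direct age standardization, as applied above, weights the observed age-specific rates by a fixed standard population so that rates from years with different age structures become comparable. A minimal sketch with hypothetical counts and made-up standard weights (not those used in the study):

```python
def directly_standardized_rate(deaths, pop, std_pop):
    """Weight each age-specific rate (deaths/pop) by the standard
    population and normalize by the total standard population."""
    weighted = sum((d / p) * w for d, p, w in zip(deaths, pop, std_pop))
    return weighted / sum(std_pop)

deaths  = [100, 400, 1500]        # hypothetical deaths per age group
pop     = [50000, 40000, 30000]   # hypothetical person-years per age group
std_pop = [40, 35, 25]            # made-up standard population weights

print(directly_standardized_rate(deaths, pop, std_pop))
```

Because the weights are fixed across calendar years, any remaining trend in the standardized rate reflects changes in the age-specific rates themselves rather than population ageing.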
Conclusion Especially in men, age-standardized SAM declined much more sharply than actual SAM, meaning that the decreases in SAM in men were largely offset by demographic ageing. If no efforts are made to curb smoking, however, the increasing number of deaths resulting from the ageing population will lead to a steep increase in SAM in the near future. Moreno Conde A 1,2 , Thienpont G 3 , Lamote I 4 , Coorevits P 5 , Parra C 6 , Kalra D 7 Interoperability assets is the term applied to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Examples include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while assets that were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinions of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, and (viii) Maintenance. 
When participants were asked to evaluate the overall quality in use framework, 70 % would recommend using the register to their colleagues, 70 % felt that it could provide relevant benefits for discovering new assets, and 50 % responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data. Semantic Data Virtualization in support of primary and secondary usage of clinical data Muth M 1 , Colaert D 2 1 Agfa HealthCare, Bonn, Germany; 2 Agfa HealthCare NV, Brussels, Belgium Agfa's Semantic Data Virtualization (SDV) platform will be presented. It is a platform fully built on the Semantic Web technology stack, providing just-in-time access to multi-sourced, semantically harmonized data for different types of applications. This is achieved by transforming the data in several steps using rules and a very powerful first-order-logic reasoning engine. Data is first converted to an internal model and stored as semantic graphs. Then, the data is re-exposed via REST services in a format suitable for the application that requested it. During these transformations, derived data can be calculated and exposed to the applications. Typical views on the data can be patient-centric for primary usage (EHR) or population-centric for secondary usage (analytics, clinical trials, research). Characterizing comorbidities, a step to organize the patient problem list Background An accurate and updated problem list is critical in a problem-oriented Electronic Health Record (EHR), as it plays a major role in assessing patient clinical status and facilitates communication between providers. The lack of organization and categorization of the problems limits the value of the list. 
There are certain problems that affect the clinical evolution of the patient more than others; these are known as comorbidities. Currently, comorbidities are poorly identified as such and frequently become lost within the patient problem list. The purpose of this article is to standardize the concept of comorbidity and to identify, classify and characterize comorbidities. Methods Descriptive, observational and retrospective study. A literature search of available definitions was performed to standardize the concept of "comorbidity". We then analyzed the EHR of Hospital Italiano de Buenos Aires from 1998 to 2015. All problems recorded as comorbid conditions in patient discharge summaries in the period between 1998 and 2015 were taken for analysis. Results 20,849 problems initially identified as comorbidities were obtained from patient discharge summaries over the 17 years studied. The 90 % most frequent were taken for individual analysis. After subsequent steps and according to established criteria, 614 were selected as comorbidities. These concepts were registered 657,694 times, representing 80.24 % of all records. The most frequently recorded comorbidity was high blood pressure (124,186), followed by dyslipidemia (60,245). The kappa analysis among the observers was 0.68 (95 % CI 0.615 to 0.729). The standardised concept of comorbidity that was reached was used to identify comorbidities. Most of the problems analyzed were characterized as comorbidities with substantial inter-observer agreement, yielding a list of the most frequent comorbidities. The Diesel Exhaust in Miners Study (DEMS) is the largest epidemiological study of the association between occupational exposure to diesel motor exhaust (DME) and lung cancer risk. The study base comprises underground and surface workers whose exposure to respirable elemental carbon (REC) differs by nearly two orders of magnitude. The study data have been analysed using a cohort approach as well as a nested case-control approach. 
Neither the external cohort analysis nor the primary internal cohort analysis yielded an association between DME and lung cancer (1). However, adjusting for work location (''ever-underground'' vs. ''surface-only'') resulted in a dose-response relationship both in the cohort and the case-control analyses (1, 2). Based mainly on the promising findings of the DEMS cohort, a working group of the International Agency for Research on Cancer (IARC) classified DME in June 2012 as ''carcinogenic to humans'' (3, 4). Subsequently, the disparities amongst DEMS results and the lack of agreement with the results from a cohort study of German potash miners (5) led to several critical commentaries. Recently, an expert panel set up by the Health Effects Institute evaluated the DEMS study results (6). However, a satisfactory explanation of the apparently self-contradictory DEMS results, and especially of the contextual meaning of the factor work location, was not provided. A synthesis of the published results leads to a new explanation. The key to this hypothesis is the assumption that workers who switch from a surface job to an underground job are very likely in much better physical condition than other surface workers. This special form of healthy-worker effect could therefore explain the apparently self-contradictory results of the DEMS cohort. A modified analysis by unconditional logistic regression yields a crude OR = 1.03 (95 % CI 0.74-1.43) for miners who ever worked in underground jobs. After adjustment for smoking the risk estimate decreases to OR = 0.92 (95 % CI 0.66-1.30). Hence, the results of the different approaches for the DEMS cohort are compatible with each other. Moreover, they are now also in line with the findings in the German potash miner cohort (5). 
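A crude odds ratio with a Wald 95 % confidence interval, of the kind quoted for the DEMS reanalysis, can be reproduced directly from a 2×2 table; a minimal sketch with hypothetical cell counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical balanced table: OR = 1, CI symmetric on the log scale
or_, lo, hi = odds_ratio_ci(10, 10, 10, 10)
```

An adjusted OR, as in the smoking-adjusted estimate above, would come from a fitted logistic regression rather than a single table.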
Conclusion Although a reanalysis of the DEMS based on the full data set is necessary to finally confirm the results of the approximate approach described, the evidence for a dose-response relationship between diesel motor exhaust and lung cancer in humans seems to be much weaker than assumed by the IARC. During the development of a new medicinal drug, identifying the optimal dose is a key goal whose importance cannot be stressed enough. Selecting too high a dose might result in an increased risk of adverse events or other safety concerns that could be avoided with a lower dose. Selecting too low a dose might result in insufficient efficacy of the drug. In the past, two main approaches were applied for identifying the optimal dose in dose-response studies: multiple comparison procedures, where the dose is considered a qualitative factor, or model-based procedures, where a functional dose-response relationship in terms of a pre-specified parametric model is assumed and the dose is considered a quantitative factor. Bretz et al. (2005) combined multiple comparison techniques and model-based approaches by introducing the MCP-Mod procedure, which marked an important methodological milestone in the advancement of dose-response analyses. The MCP-Mod approach was recently qualified by the CHMP as an efficient statistical methodology for model-based design and analysis of Phase II dose-finding studies. The MCP-Mod procedure allows the specification of several candidate dose-response models. Each of these models is used to test for a dose-response signal using multiple comparison techniques while preserving the overall type I error. The most appropriate model to describe the dose-response relationship is then selected, or several models are used to estimate the dose-response relationship by averaging the results from the models with a significant dose-response signal. 
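The candidate-model step of MCP-Mod can be illustrated with a few common dose-response shapes. The doses and parameter guesses below are hypothetical, and the normalization to [0, 1] is a simplification: real optimal contrasts (e.g. via the R package DoseFinding) also account for the covariance of the dose-group means.

```python
import math

# Illustrative candidate dose-response shapes, as specified in MCP-Mod.
def linear(d, delta=1.0):
    return delta * d

def emax(d, ed50=25.0):
    return d / (ed50 + d)

def exponential(d, delta=50.0):
    return math.exp(d / delta) - 1.0

doses = [0, 12.5, 25, 50, 100]                 # hypothetical design
candidates = {"linear": linear, "emax": emax, "exponential": exponential}

# Standardize each shape to [0, 1] over the dose range; these standardized
# mean profiles are what distinguish the candidate models from one another
# when each is tested for a dose-response signal.
standardized = {}
for name, f in candidates.items():
    vals = [f(d) for d in doses]
    lo, hi = min(vals), max(vals)
    standardized[name] = [(v - lo) / (hi - lo) for v in vals]
```

Each candidate is then tested with a multiplicity-adjusted contrast, and the dose-response relationship is estimated from the significant models, by selection or by model averaging.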
We applied the MCP-Mod procedure for finding the optimal dose in two Phase II studies that evaluated specific immunotherapy for the treatment of seasonal allergy. This talk gives a brief summary of the MCP-Mod methodology. The application of the method is then illustrated with emphasis on practical and methodological challenges that arose during the planning, conduct and analysis of each of the two studies. Model interpretations, results and conclusions are presented. Advantages of the MCP-Mod procedure over the standard analyses still often performed in dose-response studies are illustrated and discussed as well. From a practical perspective, this talk also covers prior discussions with clinicians, lessons learned, and computational challenges using SAS and R. The increased dissemination of powerful mobile computing devices with access to a variety of wireless communication channels has benefited on-line capture of medical research data in many areas. Yet there is still a need for off-line capture, especially when the availability of on-line connectivity cannot be established in advance. The clinical study ''BreathEase'' presents such a case, because it calls for extensive interviews conducted at the patient's home by visiting study staff. On the other hand, patient data also needed to be collected in an outpatient clinic. Furthermore, these activities needed to be co-ordinated according to protocol guidelines via a stationary system. Hence, the interview questionnaire data captured off-line needed to be merged with a stationary central database. For this purpose, an existing electronic data capture system (DBFORM) was configured to include some clinical trial management features and complemented with a module to synchronize data between the mobile and stationary databases. This paper describes the requirements, the procedure, and its implementation. 
While it is shown that off-line capture followed by on-line synchronization is feasible, certain ambiguities may arise that should be addressed by technical and/or other means. The German government as well as reimbursement organizations are intensively discussing the introduction of quality indicators in German hospitals. The German institute IQTIG has been asked to define quality indicators that could be used as part of the (DRG-based) reimbursement system. Methodology A literature review will identify possible quality indicators, for example in-house mortality, length of stay or readmissions. An important focus of this review will be whether these indicators are risk-adjusted for comorbidities and complications. After having identified a set of approved quality indicators, the authors will examine their representation in the daily clinical documentation in German hospitals. The company DMI provides scanning and archiving services to many hospitals, and its staff has profound knowledge of the clinical documentation processes of their customers. The authors will identify a sample of representative hospitals and check the content of their clinical documentation forms. A comparison with the previously defined quality indicators should give a first overview of their representation within the clinical documentation. The examination will be finalized in August 2016. The results will be presented during HEC2016, and the discussion should give possible indications for changes to the clinical documentation processes in case of adoption of a P4P model in Germany. Problems and challenges of stroke registration in Japan: Fukushima Stroke registry, 2016 Nakano H 1 1 Radiation Medical Science Center for the Fukushima health management survey, Fukushima Medical University, Fukushima, Japan The trends in case-fatality rates of stroke have not been investigated in the Japanese population. 
We have started monitoring stroke events in Fukushima prefecture in Japan from 2016, compiling information from a disease registration covering the entire population of Fukushima. The purpose of this study was to define the trends in case-fatality rates of stroke using a population-based disease registry that provides the most up-to-date information on a Japanese population. The stroke diagnostic criteria employed in this study were established by the Monitoring System for Cardiovascular Disease commissioned by the Japanese Ministry of Health and Welfare. The registered patients included all residents of the prefecture who were hospitalized with stroke in the prefecture. Stroke patients who were residents of Fukushima but visited or were referred to one of the three tertiary hospitals outside the prefecture were also included in the registry. Registered patients will be monitored annually by death certification. To conduct the registration annually, the Japanese Ministry of Public Management permitted original death certificates to be confirmed at the Fukushima Radiation and Health Center of Fukushima Medical University. Patients' privacy was protected at all times. Approval for this study was obtained from the institutional review board of Fukushima Medical University. The present study shows that stroke registration is not easy to put into practice. For example, the clinical information in the EMR (electronic medical record) is not designed with this secondary use of its data in mind. Considerable effort is needed to transfer the data from the EMR to the paper survey form; the survey takes 60 to 90 min per case. Confirmation is difficult because it is hard to locate the relevant clinical information in the EMR. Conclusion By way of comparison, Sweden, like Japan, has an advanced national medical insurance system, and its government-led database infrastructure is well developed. 
Coverage of the claims (''receipt'') database there is 100 %, and adoption of electronic medical records (EMR) is also close to 100 %. There is a national ID number system, and multiple databases such as EMRs, claims, and patient registries have been linked with a high degree of accuracy. By contrast, Japan clearly lags behind in building an Electronic Medical Record database. In Japan, there are still several obstacles to the dissemination of public health and medical information, such as the problem of protecting patients' personal information. There is a long way to go to construct a National Electronic Medical Record database in Japan. Lack of coherent policies and standard ''good practices'' for secondary use of medical data impedes efforts to strengthen the Japanese health care system. The nation requires a framework for the secondary use of medical data with a robust infrastructure of policies, standards, and best practices. Such a framework can guide and facilitate widespread collection, storage, aggregation, linkage, and transmission of medical data, while providing appropriate protections for legitimate secondary use. Epidemiological characteristics of the study population 2012-2014 in the AMLSG-BIO registry Nagel G 1 , Erhardt S 2 , Döhner K 3 , Fromm E 4 , Heuser M 5 , Paschka P 6 , Bullinger L 6 , Thol F 5 , Ganser A 5 , Döhner H 7 , Schlenk R 3 1 Background Acute myeloid leukemia (AML) is the most frequent acute leukemia in adults. We investigated its epidemiological characteristics. Patients aged 18 or more years with newly diagnosed AML from all AMLSG centers in Germany and Austria were registered within the AMLSG-BiO registry (ClinicalTrials.gov Identifier: NCT01252485). Data were collected on sex, age, date of diagnosis, World Health Organization (WHO)/Eastern Cooperative Oncology Group (ECOG) performance status, comorbidities and treatment. 
Crude and age-standardized incidence rates of AML for Germany (N = 3251) were calculated using German population data and compared with data from other German registries and the SEER program. Between 01.01.2012 and 31.12.2014, data of 3,525 patients with AML (45 % women) were registered, including 98 cases with acute promyelocytic leukemia (APL). The median age was 65 (Q1, Q3: 54, 74) years; men were slightly older (median age 66 vs. 64 years). Most cases were defined as de novo AML (71.09 %), followed by secondary (s)AML (21.84 %) and therapy-related (t)AML (6.94 %). Overall, 591 patients (16.79 %) participated in a clinical treatment trial, almost equally men and women. The comparison of the age-specific AML incidence rates with data from other registries revealed good coverage of AML patients in the AMLSG BiO registry in the younger and middle age classes. Patients 60 years and older were less frequently registered. In men, the highest incidence (7 cases per 100,000 person-years) was observed in the age class 70-74 years, whereas in women the highest incidence (5 cases per 100,000 person-years) was found in the age group 75-79 years. The most common mutations were NPM1 (914, 26 %), followed by FLT3-ITD (666, 18.9 %) and FLT3-TKD (219, 6.2 %). FLT3-ITD (54.05 vs. 45.95 %) and NPM1 mutations (54.27 vs. 45.73 %) were more prevalent in women than in men. The AMLSG-BiO registry has good population coverage in the younger and middle age patient groups. Prospective randomized trial of telemedicine-based collaborative care using a prefectural medical information network system Nakayama M 1 , 2 , Inoue R 3 1 Tohoku University, Sendai, Japan; 2 International Research Institute of Disaster Science, Tohoku University, Sendai, Japan; 3 Tohoku University Hospital, Sendai, Japan Although sharing patient medical records among doctors via electronic health records is a common practice, it remains unclear what effect this has on patients' prognosis. 
Thus, we began a clinical trial to clarify whether using a medical network system that enables specialists in university hospitals to dispense advice to primary care physicians in rural areas would help reduce patient risk. In 2013, a medical network system, the Miyagi Medical and Welfare Information Network (MMWIN), was launched in Miyagi prefecture, Japan, after the Great East Japan Earthquake. The MMWIN allows for the backup of medical information from hospitals, clinics, pharmacies, and various types of care facilities. By the end of March 2016, over 3 million registered patients and more than 90 million pieces of backed-up data will have been recorded in it. Doctors registered on the MMWIN can share the medical information of patients who agree to its use. We are recruiting 1,000 patients with varying levels of risk of heart or kidney disease or stroke between November 2015 and June 2016. The risk is being determined by cardiologists, nephrologists, and specialists in cerebrovascular disorders at Tohoku University Hospital according to standardized guidelines. High-risk patients will be recommended to visit a specialist in the area; low- or moderate-risk patients will be randomly allocated to either an intervention or an observation group. In the intervention group, specialists will check patients' data and give advice on patient care to their physicians. The final decision on the course of therapy will be made by the physicians. Both groups will be observed and their status checked every six months. If a patient's risk becomes aggravated over the trial's course, he or she will be recommended to visit a specialist. The main outcome measures are the occurrence of heart failure, myocardial infarction, angina pectoris, kidney diseases, stroke, death, and other medical events. We will also check changes in examination results and physicians' attitudes towards patients. 
This study was approved by the human research committee of Tohoku University School of Medicine and has been registered in the University Hospital Medical Information Network (UMIN000018552). Written informed consent will be provided by all patients before their enrolment. Results: Seven hundred and twenty-five patients had enrolled as of March 7, 2016. The intervention is being performed according to the protocol. Conclusion: A prospective randomized trial using this medical network system in a prefecture of Japan has begun to clarify the effect of telemedicine-based collaborative care between specialists in university hospitals and physicians in rural areas on patients' prognosis. Background Cancer, heart disease, and cerebrovascular disease are the three major leading causes of death in Korea. Mortality from heart disease has been gradually increasing, while mortality from cerebrovascular disease has been decreasing. However, these reports are based on descriptive chronological prevalence and mortality at the national level, not on an inferential analytic approach [1] [2] . The known trends therefore lack the information needed to infer inflection points over time and to grasp the pattern of change. The aim of this study was thus to analyze prevalence/mortality trends and to find grounds to support disease prevention using joinpoint analysis. Methods Prevalence data were obtained from the First to Sixth (including the second year of investigation) Korea National Health and Nutrition Examination Survey (KNHANES I-VI; 1998-2015). Mortality data were obtained from the Korean Statistics Information Service database of deaths in Korea for 1983-2014. Age-adjusted prevalence/mortality rates were calculated by the direct standardization method, using the age distribution of the 2010 census as the standard population. Rates are presented per 100,000 population. 
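Direct standardization, as used here, weights the age-specific rates by a fixed standard population; a minimal sketch with hypothetical strata (not the census weights used in the study):

```python
def age_standardized_rate(cases, person_years, std_pop):
    """Direct standardization: weight the age-specific rates
    (cases / person-years per stratum) by a standard population,
    and express the result per 100,000."""
    total_std = sum(std_pop)
    rate = 0.0
    for c, py, w in zip(cases, person_years, std_pop):
        rate += (c / py) * (w / total_std)   # rate x population weight
    return rate * 100_000

# Two hypothetical age strata with identical rates of 1 per 1,000:
# the standardized rate then equals the crude rate, 100 per 100,000.
r = age_standardized_rate([1, 1], [1000, 1000], [500, 1500])
```

The choice of standard population (here the 2010 census in the abstract above) only matters when the stratum-specific rates differ; it removes the effect of differing age structures when rates are compared over time.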
The joinpoint analysis was used to identify the best-fitting points where a statistically significant change in the trend occurred [3] . Both crude and age-adjusted prevalence/mortality rates for CVD declined. The analysis identified inflection points for mortality of all CVD among Korean adults over age 30, with 2 significant trends: a 5.2 % decrement per year between 1983-1990 and a 10.6 % decrement between 1995-1997. Mortality rates for hypertensive and cerebrovascular diseases showed a decreasing trend. In contrast, mortality rates for ischemic heart diseases increased by 4.4 % per year from 1983 to 2014, with 2 inflection points at 1991 and 2003 (1983-1991 APC 19.4; 1991-2003 APC 5.6; 2003-2014 APC -4.4). There was a gender difference, with declines smoother in women than in men. The trends were similar across age groups. In those aged 30-64 years, the overall mortality of all CVD decreased (AAPC -6.5); hypertensive diseases showed a decreasing trend, but ischemic heart diseases showed an increasing trend. In prevalence rates among adults over 30 years, angina and MI increased rapidly from 2005. For men, angina and MI consistently increased in all age groups; for women, angina and MI showed a decreasing trend starting in 2005. Also, among men aged 30-64 years there was one inflection point, at 2007, with a significant trend; for women the pattern was similar but not significant. Stroke prevalence showed a consistently increasing trend in men, and in women a steady or slightly increasing trend regardless of age group. Mortality rates of CVD, hypertensive and cerebrovascular diseases changed at one or more joinpoints during 1990-2002 and later showed a decreasing tendency. On the other hand, ischemic heart disease mortality increased gradually and then decreased with 2002 as the starting point; the AAPC increased and was statistically significant over the period. In prevalence rates, angina, MI, and stroke are on the rise. 
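Within each segment of a joinpoint model, the annual percent change (APC) quoted above comes from a log-linear fit of the rates against calendar year; a minimal sketch (joinpoint selection itself, i.e. finding where the segments break, needs dedicated software such as the NCI Joinpoint program):

```python
import math

def annual_percent_change(years, rates):
    """APC within one segment of a joinpoint model: fit
    ln(rate) = a + b*year by least squares, then APC = (exp(b) - 1) * 100."""
    n = len(years)
    ys = [math.log(r) for r in rates]
    mx = sum(years) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(years, ys)) / \
        sum((x - mx) ** 2 for x in years)
    return (math.exp(b) - 1.0) * 100.0

# Hypothetical segment: a rate falling exactly 5 % per year
years = list(range(1990, 1995))
rates = [100.0 * 0.95 ** i for i in range(5)]
apc = annual_percent_change(years, rates)
```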
As a result, we find that mortality rates of diseases that require medical technology decrease rapidly, while prevalence rates increase consistently. Human-Centered Design of an mHealth App for the prevention of Burnout Syndrome Background Stress-related disorders have become one of the main health problems in many countries and organizations worldwide. They can generate depression and anxiety, and can lead to work absenteeism and reduced productivity. Objective To design, develop, and evaluate an mHealth App for the prevention of Burnout Syndrome following the recommendations of standard User-Centered Design methodologies. (1) A descriptive cross-sectional study was performed on a sample of 59 faculty members and workers at the University of Cauca, Colombia, using the Maslach Burnout Inventory as an instrument for measuring burnout, accompanied by a demographic and technological questionnaire. (2) Three prototypes of the mHealth App were iteratively developed following the recommendations provided by the ISO Usability Maturity Model and the ISO User-Centered Design model. (3) Usability tests of the system were performed based on the ISO 9126 standard. The results obtained are considered positive, particularly those regarding users' satisfaction measured using the System Usability Scale. Contactless patient monitoring for the general ward-a systematic technology review Naziyok TP 1 , Röhrig R 2 , Zeleke AA 2 1 Carl von Ossietzky University Oldenburg, Oldenburg, Germany; 2 Carl von Ossietzky University, Oldenburg, Germany Background Sudden, serious life-threatening situations happen even on general wards. Current technologies rely on sensors attached to every patient, which is a source of failures and false alarms. The goal of this review was to assess the state of the art of potential techniques for contactless patient monitoring in general wards. The search was performed on MEDLINE. 
Inclusion criteria were studies of technologies that receive cardiac and respiratory signals from humans without contact. To maximize recall and precision, three categories of search terms were built: the first describes the technology used, the second what kind of data was collected, and the third included terms that described a contactless system. A computer scientist and a public health informatician performed independent assessments of the relevance of the papers in three stages: in the first stage the title, in the second stage the abstract, and in the third stage the full paper was reviewed. Results 453 unique references were screened; 34 research articles met the inclusion criteria. Ballistocardiography, radar and thermography are the most widely tested techniques. The majority of the studies were done in laboratory settings. No study shows the feasibility of a single contactless monitoring technology over the distance needed to monitor a room. Conclusion At present no single technology is feasible. A combination of technologies may become feasible in 10 or more years; by then, the ethical and privacy issues of these pervasive technologies will have to be addressed. Network meta-analysis comparing Tiotropium HandiHaler® and Respimat®-time to exacerbation in COPD patients Background Network meta-analysis allows several treatments to be investigated together, so that all treatment contrasts can be analysed in one model, including direct as well as indirect evidence [1] . The effects in the single trials are described by the hazard ratios, which are assumed to be constant over time, and their standard errors. The meta-analytic GLMM for the hazard ratio b_ji between the test treatment j and the reference treatment i, in study k with arms ik and jk, is ln(b_ji,k) ~ Normal(s_j - s_i + a_jk - a_ik, SE_ij^2), with s_i fixed for all i, a_ik ~ N(0, sigma^2) i.i.d., and s_1 = 0 (placebo). 
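The simplest consequence of such a model is the indirect contrast through the common comparator: on the log hazard-ratio scale the treatment effects subtract and, for independent estimates, the variances add. A minimal sketch (the inputs are hypothetical, not the trial estimates):

```python
import math

def indirect_log_hr(log_hr_j, se_j, log_hr_i, se_i):
    """Indirect contrast between treatments j and i through a common
    comparator (e.g. placebo), on the log hazard-ratio scale.
    Assumes independent estimates, so the variances add."""
    diff = log_hr_j - log_hr_i
    se = math.sqrt(se_j ** 2 + se_i ** 2)
    return diff, se

# Hypothetical inputs: HR 0.75 and HR 0.80 vs placebo
d, se = indirect_log_hr(math.log(0.75), 0.10, math.log(0.80), 0.12)
hr_indirect = math.exp(d)   # HR of treatment j vs treatment i
```

The full GLMM above goes further by pooling direct and indirect evidence across all trials while accounting for between-trial heterogeneity through the random arm effects a_ik.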
Bayesian distributions of the treatment contrasts can then be estimated through this common model for all 3 treatments, starting from uninformative prior distributions. We compare the MCMC method as implemented in WinBUGS [2] and, as a deterministic-numerical approximation, the Integrated Nested Laplace Approximation (INLA) method [3]. As an example, the relative effectiveness of 2 inhaled tiotropium formulations, the HandiHaler® and the Respimat®, and placebo is investigated. Double-blind randomized studies with a duration of at least 1 year, in patients with Chronic Obstructive Pulmonary Disease (COPD), were selected [4] . The endpoint is the time to occurrence of the first exacerbation of COPD during the trial. In total, 11 trials with 22 trial arms are included. The approximation worked well and the medical results did not differ by much. The Respimat® had a slight advantage compared with the HandiHaler®, and both were far superior to placebo. This was valid for the estimated treatment differences as well as for the ordering of the treatments. An intuitive graphical result summary [5] is also shown. In conclusion, the INLA method reached a result close to the reference method in this example with a moderately large number of studies included. Self-rated health of children and adolescents is a long-term predictor of incident health outcomes (results of the national KiGGS cohort in Germany) Neuhauser H 1 , Kurth BM 2 1 Robert Koch-Institut, Berlin, Germany; 2 RKI, Berlin, Germany Background Self-rated health (SRH) is a single-item assessment of an individual's own overall health status with well-documented predictive power for later morbidity, mortality and health service use in adults. However, studies on the longitudinal predictive power of SRH in children are very scarce. 
Cohort study based on the nationally representative Health Interview and Examination Survey of Children and Adolescents 2003-2006 (KiGGS0, age 0 to 17 years) and its first follow-up interview KiGGS1 (2009-2012), conducted six years later (median 6.1 years, range 5.3 to 7.7 years), N = 11,996. Self-rated health (SRH) was assessed as very good/good/fair/poor/very poor at baseline and at follow-up by parents (p-SRH) for participants aged ≤13 years at baseline and by the children/adolescents themselves (c-SRH) for those aged ≥11 years. The longitudinal association of p-SRH and c-SRH (very good/good vs. fair/poor/very poor) with three incident KiGGS1 health outcomes was investigated using logistic regression analyses: (1) long-term health-related functional difficulties (identical question in KiGGS0 and KiGGS1, part of the Children with Special Health Care Needs (CSHCN) screener); (2) parent- or self-reported chronic disease/health condition (this question was not contained in KiGGS0, therefore incident chronic disease/health condition was assumed in children who were CSHCN-negative in KiGGS0); and (3) consultation of ≥5 different medical specialties in the last 12 months (this question was also not asked in KiGGS0, therefore incident multiple consultations were assumed in participants who denied more than usual medical care in KiGGS0). Complex samples procedures were used due to the clustered sampling design, and interactions of p-SRH and c-SRH with child sex were tested. Results At baseline, p-SRH was very good/good in 94.4 % and c-SRH in 85.6 %. Dichotomized p-SRH (very good/good vs. fair/poor/very poor) had changed at follow-up for 8.9 % of children (4.1 % improved and 4.8 % deteriorated), while c-SRH had changed for 19.0 % (10.0 % improved and 9.0 % deteriorated). 
Fair/poor/very poor p-SRH for children aged 0-13 years at baseline was predictive of incident health-related functional difficulties (OR 3.51, 95 % CI 2.31-5.34), incident chronic disease (girls OR 3.13, 1.96-5.00; boys OR 1.56, 1.05-2.34) and multiple consultations (OR 1.84, 1.20-2.83) (adjusted for age, sex and social status). Similarly, c-SRH for adolescents aged 11-17 years at baseline was predictive of incident chronic disease (OR 2.06, 1.41-3.00) and of multiple consultations in girls (OR 2.60, 1.57-4.30), but not of multiple consultations in boys aged 11-17 at baseline and not of functional difficulties (data available only for children 11-13 years). Conclusion p-SRH for children aged 0-13 years predicts incident health outcomes and health service utilization six years later and can thus be a valuable prospective single-item health indicator. c-SRH, which has been shown to be conceptually different from SRH of adults, also has predictive power for health outcomes, but more limited and with differences between boys and girls. The incidence rates for colorectal cancer (CRC) are higher in men than in women; thus endogenous sex hormones may play a role. Former studies reported inconsistent findings on the association of age at menarche and age at menopause, as indicators of endogenous sex-hormone exposure, with CRC risk. These inconsistent results may be due to recall bias or unmeasured confounders. Mendelian Randomization (MR) and instrumental variable methods have been established as an approach to estimate causal relationships. Therefore, we investigated the role of age at menarche and age at menopause in CRC risk using an MR approach. Methods Participants consisted of 6339 patients with confirmed colorectal adenocarcinomas and 6652 controls from the Genetics and Epidemiology of Colorectal Cancer Consortium (GECCO). 
As instrumental variables, we constructed weighted genetic risk scores (GRS) based on 123 single nucleotide polymorphisms (SNPs) associated with age at menarche and 54 SNPs associated with age at natural menopause. Using logistic regression models, we investigated the association between GRS and CRC risk adjusted for age, study and the first 3 principal components of genetic ancestry. In a supplementary analysis we additionally adjusted for education, family history, aspirin use, menopausal hormone therapy (estrogen alone/estrogen-progestin combined), adult BMI and smoking. A genetically predicted one-year increase in age at menarche was associated with a non-significant reduction of CRC risk (OR = 0.95, 0.89-1.01). In the covariate-adjusted analysis, age at menarche was significantly associated with a reduced risk of CRC (OR = 0.89, 0.81-0.98). The association differed somewhat by cancer site, with an OR of 0.86 (0.78-0.96) for colon and an OR of 0.98 (0.83-1.15) for rectal cancer. For age at menopause, we did not observe a significant association with CRC risk (OR = 0.99, 0.95-1.04). We found some evidence that older age at menarche is associated with a reduced risk of CRC. However, confounding of these findings by adolescent BMI cannot be entirely excluded. We are currently replicating these results using data from the CORECT study. There is no evidence for a significant association of age at menopause with CRC risk. The Portable Document Format (PDF) is the most commonly used file format for the exchange of electronic documents. A lesser-known feature of PDF is the possibility to embed three-dimensional models and to display these models interactively with a suitable reader. This technology is well suited to present, explore and communicate complex biomedical data. This applies in particular to data that would suffer from a loss of information if reduced to a static two-dimensional projection. 
In this article, we present applications of 3D PDF for selected scholarly and clinical use cases in the biomedical domain. Furthermore, we present a sophisticated tool for the generation of the respective PDF documents. Constructing a pre-emptive system based on a multi-dimensional matrix and autocompletion to improve diagnostic coding in Acute Care Hospitals Noussa Yao J 1 INSERM UMR_S 1138, éq. 22, CRC, Paris, France Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public hospitals and in private hospitals providing public services are funded through charges for the services provided (T2A in French). Codification must be well matched to the severity of the patient's condition, to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity and uses information on the associated codes stemming from optimized knowledge bases of diagnosis codes. Human-centered development of an online social network for metabolic syndrome management Nuñez-Nava J 1 , Orozco-Sánchez P 2 , Lopez D 3 , Ceron J 1 , Alvarez-Rosero R 1 1 University of Cauca, Popayan, Colombia; 2 University of Cauca, Popayan, Colombia; 3 Universidad del Cauca, Popayan, Colombia Problem According to the International Diabetes Federation (IDF), a quarter of the world's population has Metabolic Syndrome (MS). Objective To develop, and assess the users' degree of satisfaction with, an online social network for patients who suffer from Metabolic Syndrome, based on the recommendations and requirements of Human-Centered Design. 
Results Following the recommendations of ISO 9241-210 for Human-Centered Design (HCD), an online social network was designed to promote physical activity and healthy nutrition. In order to guarantee the active participation of the users during the development of the social network, a survey, an in-depth interview, a focus group, and usability tests were carried out with people suffering from MS. Conclusions The study demonstrated how the different activities, recommendations, and requirements of ISO 9241-210 are integrated into a traditional software development process. Early usability tests demonstrated that the users' acceptance and the effectiveness and efficiency of the social network are satisfactory. Determinants of household food access and food consumption in a Zanzibari population in Unguja: a cross-sectional study Food access is the ability of households to acquire food through home production, animal rearing, purchase or a mixture of all three, and it is a major challenge for many populations in sub-Saharan African (SSA) countries. Compared to developed countries, the main causes of limited food accessibility and availability vary with geographical location and affect the dietary intake of SSA households differently. Objectives To investigate (1) food consumption in 235 randomly selected households using the Food Consumption Score (FCS), (2) the association between socioeconomic and demographic determinants of household food access and the food consumption score and (3) the association between these determinants and food access on the food consumption score. Methods During a baseline survey conducted in 2013, 235 households were randomly selected in 6 districts of rural, peri-urban and urban Unguja, Zanzibar. Determinants of food access were assessed at household level, as was food consumption, using a validated instrument (FCS; [1]). 
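The Food Consumption Score can be sketched as a weighted sum of food-group consumption frequencies over the last 7 days. The weights and cut-offs below follow the commonly used WFP version of the instrument and are stated here as an assumption; they should be checked against the variant actually applied in the survey.

```python
# Standard WFP food-group weights (assumed; verify against the
# instrument version cited as [1] in the abstract).
FCS_WEIGHTS = {
    "staples": 2.0, "pulses": 3.0, "vegetables": 1.0, "fruit": 1.0,
    "meat_fish": 4.0, "milk": 4.0, "sugar": 0.5, "oil": 0.5,
}

def food_consumption_score(days_consumed):
    """days_consumed: food group -> days eaten in the last 7 days (0-7)."""
    return sum(min(days, 7) * FCS_WEIGHTS[group]
               for group, days in days_consumed.items())

def fcs_category(score, poor_max=21.0, borderline_max=35.0):
    """Classify a score with the conventional 21/35 thresholds."""
    if score <= poor_max:
        return "poor"
    if score <= borderline_max:
        return "borderline"
    return "acceptable"
```

For example, a household eating only staples, sugar and vegetables every day, as the majority of households in the results did, scores 24.5 and falls in the borderline category.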
Economic determinants of the household were proxied by number of income sources, occupation and education of the head of household, number of assets and ownership of any type of vehicle. Proxy measures of household food access were, among others, home gardening, animal rearing, food purchases and distance to market. Two separate multi-level logistic regression models were created using the GLIMMIX procedure in SAS version 9.3 to investigate the determinants of food access and of the food consumption score. The status of household food consumption for poor, borderline and acceptable food consumption was 29.3, 38.4 and 32.3 % respectively, with the majority of the households consuming staples, sugar and vegetables. Poor food consumption is particularly high in households in the Northern district (41.8 %) of Unguja. About 77.7 % of the households reported good food access; we observed that low education level (according to ISCED; [2]) of the head of household, household size (>6 members), having one or more sources of income, having a vehicle and having several assets reduced the household's risk of poor food access. On the other hand, a male head of household having more than one wife and at least one reared animal increased the risk of poor food access in the household. The findings also suggest that only households that kept at least one animal were associated with a poor food consumption score. However, when all determinant variables were combined with food access, households with a male head and good food access (48.9 %), with at least one job and good food access (38.9 %) and with at least one animal and good food access (31.7 %) were significantly associated with a good food consumption score. Our data suggest that the study population had sufficient food in terms of accessibility and diversity during the survey period, with an average of 6 food groups consumed per household in the last 7 days before the survey. 
Poor food accessibility and poor food consumption remain a key public health problem in many households in developing countries; mixed strategies and policies could help improve the situation. Background Lithium occurs naturally in drinking water and may have a positive effect on mental health and protect against suicide. In clinical practice, lithium in high therapeutic doses is used as a mood stabilizer in the treatment of affective disorders. Previous studies performed at an ecological level have found an association between lithium in drinking water and risk of suicide. The present study investigated whether exposure to naturally occurring low levels of lithium in drinking water was associated with a reduced risk of suicide in Denmark, considering, for the first time, individual-level data and long-term exposure. The study population consisted of all 3,724,588 Danish adults (≥20 years), of whom 15,370 committed suicide in the study period from 1990 to 2012. Information on suicides was obtained from the nationwide Danish Register of Cause of Death. The use of prospectively collected individual-level data enabled complete follow-up over the course of 23 years. Data on lithium concentrations were obtained through a nationwide drinking water campaign from 2013 including 151 measurements from waterworks supplying approximately 42 % of all residents in Denmark. Spatial statistics were applied to investigate geographical patterns in lithium levels. The lithium measurements were interpolated using point kriging and an accumulated lithium exposure was computed for each individual based on municipality of residence every year. The association between accumulated lithium exposure and suicide rate was analyzed using Poisson regression analysis. Clustering in the residuals was further evaluated through spatial regression analysis. Results Significant regional clustering in drinking water lithium levels was found, with high levels in Eastern and low levels in Western Denmark. 
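The two computational steps of the lithium exposure assessment, spatial interpolation of the waterworks measurements and per-person accumulation over years of residence, can be sketched as below. The study used point kriging; inverse-distance weighting is shown here only as a simpler stand-in, and all names are illustrative.

```python
import math

def idw_estimate(px, py, measurements, power=2.0):
    """Interpolate a lithium level at (px, py) from (x, y, value)
    waterworks measurements by inverse-distance weighting -- a simpler
    stand-in for the point kriging used in the study."""
    num = den = 0.0
    for x, y, value in measurements:
        d = math.hypot(px - x, py - y)
        if d == 0.0:
            return value  # query point coincides with a measurement
        w = d ** -power
        num += w * value
        den += w
    return num / den

def accumulated_exposure(level_by_year):
    """Accumulated exposure: sum of the interpolated lithium level at
    the municipality of residence over every year of follow-up."""
    return sum(level_by_year.values())
```

The accumulated value per person would then enter the Poisson regression of suicide rates described above.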
Suicide rates at municipality level were also found to cluster. The Poisson regression analyses showed a significant dose-response trend of decreasing suicide rates with increasing accumulated lithium exposure. The trend remained the same after adjustment for gender, age group, socioeconomic status, civil status, and calendar year. The findings support the growing evidence that long-term exposure to naturally occurring lithium levels is protective against suicide. If these findings can be further supported they may have implications for future public health strategies on suicide prevention. Citizens' access to their digital health data in eleven countries: a comparative study Governments around the world are actively promoting citizens' electronic access to their health data as one of a number of ways to respond to the challenges of health care delivery in the 21st century. While numerous approaches have been utilized, it is evident from cross-country comparisons that there are different conceptualizations of both the expected and desired roles for citizens in the management of their own health, the benefits that will be delivered by citizen access, and how these benefits should be measured and benchmarked over time. This paper presents comparative analyses of the methods by which citizens are provided with access to their own health data across 11 countries. The paper aims to stimulate debate on electronic citizen access to health data and the challenges of measuring benefit, as well as reflection on the capacity of different citizens to engage with e-health. Several studies have shown that elderly cancer patients have poorer survival even after adjusting for their higher mortality due to other causes. In addition, some publications have shown that the gap in survival between younger and elderly cancer patients is still widening. Therefore, we aimed at (1) analyzing whether in Tyrol/Austria elderly cancer patients (70+ years) have poorer relative five-year survival compared to younger patients (50-69 years) and (2) analyzing the temporal trend of the survival difference between elderly and younger patients. The analysis was based on all incident cancer patients in the population of Tyrol diagnosed between 1990 and 2009. Life status was assessed by linking cancer data with the official mortality data until December 2014, thus allowing for at least five years of follow-up. We computed relative 5-year survival adjusted for mortality due to other causes with 95 % confidence intervals (95 % CI). The analysis was conducted for all cancer patients except non-melanoma skin cancer (NMSC) grouped together and also specifically for the most frequent cancer sites, i.e. breast cancer, prostate cancer, colorectal cancer, lung cancer and neoplasms of the hematologic/lymphatic system. For all cancer patients except NMSC diagnosed between 2000 and 2009, relative five-year survival was 69 % (95 % CI 68-70) for patients aged 50-69 and 54 % (95 % CI 52-55) for patients aged 70+. The difference of 15 % between younger and elderly patients was larger for patients diagnosed more recently than the difference between age groups for patients diagnosed between 1990 and 1999 (10 %). We observed a gap in relative 5-year survival between younger and elderly patients for all frequent cancer sites of between 5 and 18 %, and the gap has widened over the past decades for breast cancer and colorectal cancer. For prostate cancer, the gap decreased from 5 % in the 1990s to 2 % in the more recent period. Although survival of cancer patients steadily improved in the past decades, the survival gap between elderly and younger patients widened. This observation deserves attention from the medical community, given the expected increase in the number of elderly cancer patients during the next decades. 
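Relative survival of the kind reported in the Tyrol analysis above can be sketched as the observed survival proportion divided by the survival expected from general-population life tables; the function names are illustrative, and registry estimators typically use more refined methods (e.g. Ederer II) than this ratio.

```python
def relative_survival(observed, expected):
    """Relative survival: observed survival in the patient group divided
    by the survival expected in a comparable general population."""
    if not 0.0 < expected <= 1.0:
        raise ValueError("expected survival must lie in (0, 1]")
    return observed / expected

def survival_gap(rs_younger, rs_elderly):
    """Gap in relative survival between age groups, in percentage points."""
    return (rs_younger - rs_elderly) * 100.0
```

With the figures reported above, survival_gap(0.69, 0.54) yields the 15-point difference between the 50-69 and 70+ age groups.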
Moreover, there is a need for studies focusing on elderly cancer patients and on the reasons for the survival gap between elderly and younger patients. How to work with vocabulary correctly, exemplified with gender coding? Oemig F 1 , Blobel B 2 , Geibel P 3 Deutsche Telekom Healthcare and Security Solutions GmbH, Mülheim, Germany; 2 University of Regensburg, Regensburg, Germany; 3 German Association of Health IT Vendors - bvitg e.V., Berlin, Germany Popular data exchange standards facilitate the use of vocabulary in different ways. Quite often, the usage is geared toward simple implementations that do not reflect reality. In most cases the vocabulary is assembled in the form of simple code lists. A well-known bad practice is to provide no proper concept descriptions for the individual codes, leaving their interpretation to the developer, and to leave the underlying vocabulary domain unidentified, resulting in a mixture of concepts. This paper takes the gender use case, analyses the use of codes within some data exchange standards and makes recommendations for improvements in handling and managing code systems correctly. What could we learn from the influence of age on perceptions of a CIS by the clinical staff of a French hospital? The goal of this study was to examine the relationship between age and perceptions of a Clinical Information System (CIS). A survey was conducted in September 2015 in a French teaching hospital, based on a questionnaire consisting of Likert-scale items. The results show that age has a strong impact on Perceived Ease of Use, anxiety and Perceived Behavioral Control. The result related to Perceived Ease of Use is unexpected: younger staff reported being less comfortable with the technology than older staff. This result is not consistent with the literature. 
We propose an explanation: knowledge of clinical processes and organization may matter more than the general technology skills of younger generations. One of the main hurdles of research is understanding how to properly collect, organize, and secure data. Typically, data are entered into Microsoft Excel or Access documents. Structuring these documents is difficult, often leading to inefficiently organized and incorrectly entered data, especially in multi-center studies. Moreover, the security of the data depends on the security settings of the computer and the option for encryption. When storing sensitive data such as protected health information (PHI), security is vital and non-negotiable. We sought to construct a system to make the process of creating, organizing, and securing research data non-technical, effortless and dynamic. The first challenge was managing clinical data across multiple facilities, a large volume of patients, numerous studies each with significantly different sets of variables, and physical samples. We developed a private dynamic data management system, where non-technical personnel can maintain the unique data elements, including GCP requirements, for each study and conduct data entry via an auto-generated user interface. This started with a fixed data schema with the ability to represent any set of dynamic data, e.g. floating-point numbers, dates and times, and lists. In addition, this schema supports data validation to ensure data accuracy. To complete the full set of data required for studies, we created import tools to integrate electronic medical record information from external systems and automatically join those data elements to data entered into the system. Once the study data are captured and structured, the data must be exportable in an accessible format. 
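A fixed schema that nonetheless represents arbitrary study-specific elements with validation is commonly realized as an entity-attribute-value design. The sketch below is a hypothetical illustration of that pattern, not the authors' actual system; all class and element names are invented.

```python
from datetime import date

# Hypothetical dynamic study schema: element definitions live in data,
# not in table structure, so non-technical staff can add elements.
TYPE_CHECKS = {
    "float": lambda v: isinstance(v, float),
    "int": lambda v: isinstance(v, int) and not isinstance(v, bool),
    "date": lambda v: isinstance(v, date),
}

class StudySchema:
    def __init__(self):
        self.elements = {}  # element name -> (type, allowed values or None)

    def add_element(self, name, dtype, allowed=None):
        self.elements[name] = (dtype, allowed)

    def validate(self, record):
        """Return a list of validation errors for one data-entry record."""
        errors = []
        for name, value in record.items():
            if name not in self.elements:
                errors.append("unknown element: " + name)
                continue
            dtype, allowed = self.elements[name]
            if dtype == "list":
                if value not in (allowed or ()):
                    errors.append(name + ": value not in allowed list")
            elif not TYPE_CHECKS[dtype](value):
                errors.append(name + ": expected " + dtype)
        return errors
```

An auto-generated entry form could be rendered directly from `self.elements`, which is what makes the schema "dynamic" without changing the underlying storage.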
To achieve this task, the system was designed to make the process non-technical, utilizing a step-by-step "wizard" approach, making simple what would otherwise be complex and require SQL programming knowledge. With security being a top priority, the system was designed throughout with industry-leading best practices, including utilization of third-party authentication technology. Special consideration was given to user roles, allowing restricted access based on facility, study, and access to PHI. Finally, to open global access to any facility or external location, while ensuring the most robust, scalable, and failure-tolerant solution, we utilized the latest in cloud technologies. This included web services and data services that can be automatically scaled and geolocated across the globe. In the last five years, this solution has supported numerous clinical studies across three level-one trauma centers and one comprehensive stroke center, with users located across the United States of America. Currently, there are nearly 11,000 patients, 55,000 patient samples, and over four million points of data held in the system. Thus far, this system has supported the publication of over 30 peer-reviewed manuscripts. The solution we created to handle the requirements of dynamic data management has been extremely useful, and has organized and structured our data in ways that were unachievable using older methods. By utilizing the strength and security of the cloud-based platform, studies can be created in London, data can be entered in Munich, and then data can be exported in Rome. Collection of medical original data with a search engine for decision support Orthuber W 1,2 1 Universitätsklinik Schleswig-Holstein, Kiel, Germany; 2 Christian-Albrechts-Universität Kiel, Kiel, Germany Medicine is becoming more and more complex and humans can capture total medical knowledge only partially. 
Methods For specific access, a high-resolution search engine is demonstrated, which allows, besides conventional text search, the search of precise quantitative data of medical findings, therapies and results. For the search engine, users can define metric spaces ("Domain Spaces", DSs) containing all searchable quantitative data ("Domain Vectors", DVs). An implementation of the search engine is online at http://numericsearch.com. In future medicine the doctor could first make a rough diagnosis and check which fine diagnostics (quantitative data) colleagues had collected in such a case. Then the doctor decides about fine diagnostics, and the results are sent (semi-automatically) to the search engine, which filters the group of patients that best fits these data. In this specific group, variable therapies can be checked together with the associated therapeutic results, like an individual scientific study for the current patient. The statistical (anonymous) results could be used for specific decision support. Conversely, the therapeutic decision (in the best case with later results) could be used to enhance the collection of precise pseudonymous medical original data, which in turn yields increasingly better statistical (anonymous) search results. Strategies to improve quality in the electronic health record problem list Problem-oriented electronic health records have been one of the most widely adopted clinical documentation models in the health informatization process. The problem list has become the central feature. An accurate, complete, and up-to-date problem list improves quality of care, allows the development and implementation of clinical decision support systems (CDSS), and enables the development of health care programs as well as research studies. An accurate problem list also facilitates the administrative and billing processes in health systems. 
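The quantitative Domain Vector search described in the search engine abstract above amounts to nearest-neighbor retrieval in a metric space. The sketch below is a minimal illustration under assumed names; the choice of the Euclidean metric is likewise an assumption, since each Domain Space may define its own metric.

```python
import math

def euclidean(u, v):
    """Distance between two Domain Vectors of equal length."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def nearest_patients(query, records, k=3, metric=euclidean):
    """Return the k patient records whose quantitative vector lies
    closest to the query vector in the chosen metric."""
    return sorted(records, key=lambda rec: metric(query, rec["vector"]))[:k]
```

The filtered group returned by such a query is what the abstract envisages as the basis for comparing therapies and their outcomes for the current patient.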
In hospitals and health care centers that adopt problem-oriented electronic health records, the problem list should be the central feature of the health information system. Although many institutions and vendors declare their registries to be problem oriented, several models are implemented, ranging from an isolated problem list to a fully integrated one. Moreover, another big difficulty with problem lists is that clinicians do not always complete them, and the health problems are not always updated. Objective: The proposal of this workshop is to describe what a real problem-oriented electronic health record is and to describe solutions and strategies to improve quality in the electronic health record problem list. Use of disease-modifying drugs for MS in Germany Pachaly L 1 , Khil L 2 , Berger K 2 , Garbe E 1 , Schmedt N 1 1 Leibniz Institute for Prevention Research and Epidemiology - BIPS GmbH, Bremen, Germany; 2 Institute of Epidemiology and Social Medicine, University of Münster, Münster, Germany Despite an increasing number of disease-modifying drugs (DMD) for the treatment of multiple sclerosis (MS), up-to-date population-based studies describing their use are scarce. The objective of this study was to investigate the characteristics of DMD users with MS in Germany. We conducted a cohort study based on the German Pharmacoepidemiological Research Database (GePaRD) between January 1st, 2006 and December 31st, 2013. Cohort entry was the date of the first DMD dispensation after 24 months of continuous insurance with an MS diagnosis in the year before cohort entry. MS patients were described with respect to sex, age, comorbidities and comedications. Further, we calculated exacerbation rates (ERs) and classified DMD users into five mutually exclusive groups: continuous single use, multiple use, switch, interruption and discontinuation. The cohort comprised 15,377 DMD users with a median age of 40 years and 68 % females. 
The most frequent DMDs at cohort entry were interferon β-1a (41.4 %), glatiramer acetate (26.6 %), interferon β-1b (18.8 %), azathioprine (6.3 %) and natalizumab (2.9 %). Nearly half of all patients were new DMD users. During follow-up, 42.3 % of the patients used the same DMD continuously and 39.6 % interrupted treatment. In contrast, 11.8 % switched and 5.6 % discontinued treatment. Only 0.6 % were multiple DMD users. The highest ER was found in natalizumab users with 0.50 events per person-year. Of all DMD users, MS patients on fingolimod treatment were most often diagnosed with muscle spasticity (23.1 %) and bladder dysfunction (31.7 %) and showed the highest rate of methylprednisolone use, with 0.42 dispensed DDDs per day of follow-up. In addition, azathioprine users showed a high prevalence of most comorbidities and comedications. Conclusion Treatment patterns and user characteristics of MS patients varied for different DMDs in Germany. Our results indicate the highest MS severity in users of fingolimod and natalizumab. Surprisingly, we still found a high number of MS patients on treatment with azathioprine. Analysis of apps for adults and older adults by using the "ALFA4Hearing" model (at-a-glance labelling for features of apps for hearing health care) Paglialonga A 1 , Chiaramello E, Pinciroli F 2 , Tognola G 3 1 CNR Consiglio Nazionale delle Ricerche, Milan, Italy; 2 Politecnico di Milano, Milano, Italy; 3 CNR IEIIT, Institute of Electronics, Computer and Telecommunication Engineering, Milan, Italy In this study, we explored trends, challenges and opportunities in the field of apps for hearing health care (HHC) for adults and older adults by using the ALFA4Hearing model (At-a-glance Labelling for Features of Apps for Hearing health care), a novel HHC-specific tool for the identification and assessment of apps. Falls are a major problem in the older population, as their consequences represent a threat to physical and psychological wellbeing. 
The efficiency of fall-preventive interventions can be improved by targeting the interventions specifically at subjects identified as being at higher risk. The Timed Up and Go test (TUG) is a mobility test widely used in geriatric clinical practice. The time for its completion, as an indicator of fall risk, has shown modest predictive accuracy [1, 2]. Some studies tried to improve it by leveraging richer quantitative information than the sole time for completion, employing wearable inertial sensors (e.g. [3, 4]). However, the results of some of these studies are possibly flawed by biases [5]. Our objective is to develop and validate a tool for fall risk assessment based on the TUG. Methods A subset of the participants of the InCHIANTI FU4 study performed the TUG while wearing a smartphone at the lower back [6-8]. The participants were subsequently followed up with telephone interviews for one year to assess the occurrence of falls. Two hundred sixty-four persons aged 65 years or more performed the test and were followed up for at least 6 months. Motion signals were segmented, identifying the phases of sit-to-walk, walk, turn, and turn-to-sit. For each phase, we extracted its duration and computed 37 other statistical and biomechanical features from the acceleration signals (e.g. range, root mean square, jerk, and number of steps). Dimensionality reduction was performed with factor analysis [9], after factoring out the effect of age, sex, weight, height, and Mini Mental State Examination. The associations between sensor-derived factors and number of falls were tested with negative binomial regression, adjusting for different follow-up lengths, age and sex. Fifty persons (19 %) fell at least once during the follow-up, and 14 of them (5 %) at least twice. 
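Per-phase features of the kind listed above can be sketched from a raw acceleration trace as follows; the feature set and names are a small illustrative subset, not the study's full pipeline of 37 features.

```python
import math

def segment_features(accel, fs):
    """A few statistical features of one TUG phase from an acceleration
    trace sampled at fs Hz: duration, range, RMS, mean absolute jerk."""
    n = len(accel)
    duration = n / fs
    rng = max(accel) - min(accel)
    rms = math.sqrt(sum(a * a for a in accel) / n)
    # jerk: first difference of acceleration scaled by the sampling rate
    jerk = [(accel[i + 1] - accel[i]) * fs for i in range(n - 1)]
    mean_abs_jerk = sum(abs(j) for j in jerk) / len(jerk)
    return {"duration_s": duration, "range": rng,
            "rms": rms, "mean_abs_jerk": mean_abs_jerk}
```

In the study such features, computed per phase (sit-to-walk, walk, turn, turn-to-sit), would feed the factor analysis after covariate adjustment.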
By applying factor analysis to the available features derived from the inertial sensors, we identified 8 main latent factors related to turning ability, movement smoothness, weight transfer ability, and global fitness. Two of these were found to be significantly associated with falls: factor 5, related to mediolateral weight transfer ability, with an incidence rate ratio (IRR) of 0.740 (95 % CI 0.551-0.994), and factor 8, containing total TUG duration and related to low global fitness, with an IRR of . Other results about the associations of TUG features with history of falls in this population have been presented elsewhere [10, 11]. The study design and model validation were devised to minimize known sources of bias, according to published recommendations [5, 12]. The significant association of factor 5, which by construction is uncorrelated with factor 8, indicates that it is possible to retrieve from the TUG information about fall risk beyond its total duration. We will evaluate whether this is sufficient to substantially improve the TUG's predictive accuracy, and we will compare our risk models with clinically accepted methodologies for risk screening. Where possible, we will also test previously published sensor-based risk models. 
Polygenic risk score of 55 coronary artery disease risk variants is associated with the progression of coronary artery calcification: results of the Heinz Nixdorf Recall Study Pechlivanis S 1 , Lehmann N 1 , Mahabadi AA 2 , Erbel R 1 , Hoffmann P 3,4 , Jöckel KH 1 , Nöthen MM 3 , Moebus S 1,5 1 Institute for Medical Informatics, Biometry and Epidemiology, University Hospital of Essen, University Duisburg-Essen, Essen, Germany; 2 Department of Cardiology, West-German Heart Centre, University Hospital of Essen, University Duisburg-Essen, Essen, Germany; 3 Department of Genomics, Life & Brain Center, University of Bonn, Bonn, Germany; 4 Division of Medical Genetics, Department of Biomedicine, University of Basel, Basel, Switzerland; 5 Centre for Urban Epidemiology, University Hospital Essen, Essen, Germany Atherosclerosis is the primary cause of coronary artery disease (CAD) and precedes the onset of most cases of clinically apparent coronary heart disease by decades. Coronary artery calcification (CAC) is one of the most sensitive and specific markers of coronary atherosclerosis. CAC quantification has been shown to improve the ability to predict future CVD events. In recent years, 55 single nucleotide polymorphisms (SNPs) associated with CAD were identified in genome-wide association studies [1]. In our study we hypothesized that the polygenic risk score (PRS) of the CAD-associated SNPs influences the progression of CAC. We included 2187 participants of the Heinz Nixdorf Recall Study with CAC measured at two time points (five years apart). CAC was determined using non-contrast electron-beam computed tomography and quantified using the Agatston score at baseline and after five years. CAC at 5 years was modelled from the age- and sex-specific CAC percentiles at baseline, exponentially extrapolating along the subject's percentile over the time between measurements. 
A predefined acceptance band nominally covers 20 % of the observed values around the individually predicted value at 5 years. CAC progression was classified as within the band (expected), above the band (rapid), and below the band (slow), as described in [2]. To evaluate the effect of the 55 SNPs associated with CAD, a PRS for each individual was constructed by adding up the number of risk alleles (0/1/2) of each SNP, weighting by the natural log of the reported odds ratio for CAD. Generalized linear regression models were used to estimate the relative risk (RR) (95 % confidence interval) and to explore the impact of the PRS on the outcome "rapid vs. expected/slow CAC progression". The models were further adjusted for cardiovascular risk factors (age, systolic and diastolic blood pressure, high density lipoprotein cholesterol, low density lipoprotein cholesterol, smoking status, body mass index, use of cholesterol-lowering medication and antihypertensive medication) as well as stratified by sex. The prevalence of rapid CAC progression in the Heinz Nixdorf Recall Study was 19.9 %. The number of CAD risk alleles in the PRS ranged from 32 to 62. The weighted CAD PRS was normally distributed within a range from 2.6 to 5.0 (mean ± SD: 3.93 ± 0.37). In the crude model the weighted CAD PRS was associated with rapid CAC progression (relative risk (RR) (95 % confidence interval) per SD: 1.12 (1.03; 1.22), p = 0.009). A similar effect was observed in the models adjusted for cardiovascular risk factors (1.10 (0.99; 1.21), p = 0.06). The sex-stratified models adjusted for cardiovascular risk factors did not reveal differences between men and women (men: 1.08 (0.98; 1.20), p = 0.13; women: 1.07 (0.93; 1.23), p = 0.33). Conclusion A higher polygenic risk score of 55 GWAS-identified CAD variants seems to be associated with rapid progression of coronary artery calcification in the Heinz Nixdorf Recall Study. 
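The score construction and progression classification described above can be sketched as follows. The function names are illustrative, and the band bounds are taken as given inputs rather than derived from the age- and sex-specific percentile model the study uses.

```python
import math

def weighted_cad_prs(risk_allele_counts, odds_ratios):
    """PRS: per-SNP risk-allele count (0/1/2) weighted by the natural
    log of the SNP's reported odds ratio for CAD, summed."""
    return sum(n * math.log(o)
               for n, o in zip(risk_allele_counts, odds_ratios))

def classify_progression(observed_cac, band_lower, band_upper):
    """Classify observed 5-year CAC against the acceptance band
    around the individually predicted value."""
    if observed_cac > band_upper:
        return "rapid"
    if observed_cac < band_lower:
        return "slow"
    return "expected"
```

The binary outcome "rapid vs. expected/slow" derived this way is what the generalized linear models above regress on the standardized PRS.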
The increasing demand for home care due to the growing number of older people with an increased incidence of multiple chronic conditions requires the adoption of technological innovation that can improve the integration and coordination of health and social services. This paper presents the H@H infrastructure that aims to improve the coordination and cooperation among stakeholders, facilitating the provision of care services in a continuity-of-care framework. Moreover, a conceptual model that combines the main concepts of the ContSys standard with social care components is proposed. Flavonoids may be responsible for some of the preventive effects of a diet rich in fruit and vegetables on type 2 diabetes. Ideally, flavonoid intake estimation should combine dietary instruments and a validated exposure biomarker so as to avoid exclusive reliance on self-reports and flavonoid content databases. This study aimed to investigate the prospective association of flavonoid intake from fruit and vegetables (FlavFV) during puberty, estimated at the dietary and biomarker level, with diverse risk markers of type 2 diabetes in young adulthood. The analysis used data from healthy participants of the DONALD Study with a fasting blood sample in adulthood (18-39 y) including measurements of the following type 2 diabetes risk markers: updated homeostasis model assessment insulin sensitivity (HOMA2-%S), alanine aminotransferase (ALT), γ-glutamyltransferase (GGT), and adiponectin. Habitual FlavFV intake during puberty (females: 9-15 y, males: 10-16 y) was estimated at the dietary level using a minimum of two weighed 3-day dietary records (n = 256 for HOMA2-%S and adiponectin, n = 275 for ALT and GGT) and at the exposure biomarker level using urinary hippuric acid (HA) from at least two 24-h urine samples (n = 235 for HOMA2-%S, n = 234 for adiponectin, and n = 247 for ALT and GGT; with a balanced sex ratio across all samples). 
Sex-stratified multivariable linear regression models were adjusted for potential confounding by age at blood measurement, BMI-SD scores (BMI-SDS), body surface area or energy intake and other dietary factors at baseline, as well as early-life factors, socioeconomic factors and parental disease history (diabetes, steatosis). A higher habitual intake of FlavFV during puberty was independently related to higher adult HOMA2-%S in females (p = 0.032). A similar trend was found using the urinary biomarker HA (p = 0.087). Also in females, a higher urinary HA excretion, but not FlavFV, tended to be related to higher adult adiponectin levels (p = 0.096). No associations with HOMA2-%S or adiponectin were found in males (p > 0.2). For males, independent inverse associations of pubertal flavonoid intake with adult liver enzyme levels were found at both the dietary and biomarker level: GGT (p = 0.020 for FlavFV and 0.023 for HA) and ALT (p = 0.027 for FlavFV and 0.030 for HA). No associations with liver enzyme levels were found in females (p > 0.4). Our data suggest a beneficial effect of a higher habitual flavonoid intake from fruit and vegetables during puberty on risk markers of type 2 diabetes in adulthood in a sex-specific manner: with benefits for insulin sensitivity among females and for liver enzymes among males. There is significant, unexploited potential to improve patients' engagement in psychotherapy treatment through technology use. We developed Tele-Board MED (TBM), a digital tool to support documentation and patient-provider collaboration in medical encounters. Our objective is the evaluation of TBM's practical effects on patient-provider relationships and patient empowerment in the domain of talk-based mental health interventions. We tested TBM in individual therapy sessions at a psychiatric ward using action research methods. 
The qualitative results, in the form of therapist observations and patient stories, show an increased acceptance of diagnoses and patient-therapist bonding. We compare the observed effects to patient-provider relationship and patient empowerment models. We can conclude that the functions of TBM, namely that notes are shared and cooperatively taken with the patient, that diagnostics and treatment procedures are depicted via visuals and in plain language, and that patients get a copy of their file, lead to increased patient engagement and improved collaboration, communication and integration in consultations. ANDANTE: a proposal for setting up a database on pediatric patients treated with proton radiotherapy in future epidemiological studies Perstorfer K 1 , Walsh L 2 1 Federal Office for Radiation Protection, Neuherberg/München, Germany; 2 University of Zurich, Zurich, Switzerland Survivors of childhood cancer generally have an increased risk of developing secondary cancers associated with the treatment for the first primary cancer. Proton therapy represents a highly effective treatment technique for some types of childhood cancers, but scattered radiation from secondary neutrons is an unwanted by-product. The ANDANTE project investigated the relative risk of neutrons compared to photons for tumorigenesis, as a function of dose and energy. The approach was multidisciplinary, including physical measurements and modeling, molecular biology, radiobiology and epidemiology. One aim of ANDANTE was to use the experience gained in the other parts of the project to design a prospective epidemiological study using pediatric radiotherapy data collected from multiple radiotherapy centers worldwide. The results of this epidemiological part are reported here. The groundwork involved a feasibility study carried out on patient data at the first hospital-based proton facility, at Loma-Linda University Medical Centre (LLUMC), USA. 
Patient data at LLUMC were obtained for 242 children treated for arteriovenous malformations or astrocytoma. Epidemiological data on mean age at treatment, available length of follow-up time, survival times and other general characteristics of the cohort were collected and assessed. However, ANDANTE scientists were not allowed access to data on the occurrence of second malignant neoplasms (SMN). Another approach involved a literature review of secondary cancers occurring in patients after radiotherapy in childhood. This was done to obtain indications of the types of secondary cancers occurring, and of the lengths of typical epidemiological follow-up times and numbers of patients required by a future prospective study. The main results, based on 30 reviewed papers, were that brain and nervous system cancer, breast cancer and thyroid cancer have the shortest time to occurrence as secondary malignant neoplasms after radiotherapy for first malignant neoplasms, and that a minimum length of follow-up of 20-25 years should be planned for, based on the patient numbers in the reviewed studies. Power calculations were done to obtain another type of estimate of the typical follow-up times and patient numbers required to detect a neutron effect contribution to the occurrence of SMNs in future prospective studies. Data from 17 European proton therapy centers were collected, and the number of patients treated with protons in the United States per year was estimated. The power calculations were based on neutron risks per organ-averaged absorbed neutron dose (typically 2 mGy for active and 10 mGy for passive modalities of proton therapy). The results indicated that adequate power is achievable, with a 20-year follow-up, if an average of 2000 persons per year enter the cohort over the 20-year follow-up period and the solid cancer neutron risks are at the upper level of the risk range determined from the cohort of Japanese A-bomb survivors. 
The proposal presented here forms a well-designed basis for setting up a database and/or a prospective epidemiological study on future implications of radiotherapy during childhood. Patients' perception of clinicians' use of ICT during patient consultation in the different sectors of Danish healthcare Petersen LS 1 , Bertelsen P 1 In Denmark, ICT is a central part of almost all healthcare professionals' daily practices, and patients are increasingly encouraged to take an active interest in their own health data. ICT is therefore an important part of what happens at consultations between patients and healthcare professionals. We explore the impact of ICT based on a survey of citizens'/patients' experience of interaction with healthcare professionals: how often and for what purposes ICT was used in communication with patients in the different sectors of Danish healthcare. The results show that ICT is used in communication with citizens and during interaction with patients; however, ICT is mostly used for the healthcare professionals' own benefit, and in only about 15-39 % of the reported instances was ICT used to communicate and interact with the patient. Through the concept of boundary objects, we propose a model that splits the object of technology-mediated information into three settings for communication between patients and healthcare professionals. We propose further studies into how ICT can be used to explore the possibilities for more interactive and involving care processes as a key element in the further development of eHealth. 
Using propensity score matching to compare patients with and without treatment switch in a randomized trial on chronic myeloid leukaemia Pfirrmann M 1 IBE - Institut für Medizinische Informationsverarbeitung, Biometrie und Epidemiologie, Ludwig-Maximilians-Universität München, München, Germany From 2002 until 2012, in the German chronic myeloid leukaemia (CML) study IV, 1,408 patients were randomized between treatments starting with the first-line tyrosine kinase inhibitor (TKI) imatinib (Hehlmann et al., JCO, 2014). In particular after significantly higher molecular response proportions were published for second-line TKIs in 2010, some patients of CML study IV switched treatment. This raised the question of whether treatment switch hinted at an effect on long-term survival. Eligibility criteria for the analysis were CML in chronic phase, assessment of molecular response, and start of treatment within 6 months from diagnosis. Furthermore, the latest result on molecular response status within 3 months before the switch as well as the ELTS score (Pfirrmann et al., Leukemia, 2016) had to be known. The ELTS score consists of four prognostic factors evaluated at diagnosis. In accordance with the propensity score definition of Rosenbaum and Rubin (Biometrika, 1983), the probability that a patient would switch was estimated as a function of the continuous ELTS score. Matching between patients with and without treatment switch was performed in two steps. First, for each patient i of the n patients who switched, matching candidates among the non-switching patients were identified who (a) still received imatinib treatment at the time patient i switched and (b) had a matching molecular result within 3 months before the switching time of patient i. Second, of the m matching candidates from the first step, the non-switching patient with a propensity score as close as possible to that of patient i was sought. 
Following Austin (Pharm Stat, 2011), a caliper of 0.2 × the standard deviation of the logit of the propensity score was applied in order to consider a non-switching patient as an acceptable matching partner. The z-difference (Kuss, J Clin Epidemiol, 2013) was used to assess whether the two matched samples had a similar distribution of the baseline score. Finally, 150 patients with a single treatment switch fulfilled all eligibility criteria. Among the non-switching patients, a matching partner was found for 140 individuals. Before the matching, a z-difference of 2.304 was calculated for the ELTS score: switching patients had higher, prognostically more unfavourable ELTS values. After the matching, the z-difference was 0.012. Starting from matched switching times, overall survival probabilities between the groups were not significantly different (p > 0.5). Propensity score matching yielded two samples with similar distributions of the prognostic score and identical distributions of molecular response status. Applying the matching procedure, selection bias with respect to switching was reduced, but the selection process remained unknown. Matching also reduced power. However, there was no hint at possible differences in long-term outcome which could necessitate immediate action in the German CML study IV. The analysis was retrospective. A prospectively designed study on switching in patients for whom its necessity is not obvious could be worthwhile. 
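As an illustration only (not the study's actual implementation), greedy 1:1 nearest-neighbour matching on the logit of the propensity score with Austin's caliper rule can be sketched as follows; the function and variable names are hypothetical:

```python
import math
import statistics

def logit(p):
    return math.log(p / (1 - p))

def caliper_match(ps_switch, ps_candidates):
    """Greedy 1:1 nearest-neighbour matching on the logit of the
    propensity score, with a caliper of 0.2 x the standard deviation
    of the logit across all patients (Austin's rule of thumb)."""
    all_logits = [logit(p) for p in ps_switch + ps_candidates]
    caliper = 0.2 * statistics.pstdev(all_logits)
    available = set(range(len(ps_candidates)))
    pairs = {}
    for i, p in enumerate(ps_switch):
        if not available:
            break
        # nearest remaining candidate on the logit scale
        j = min(available, key=lambda k: abs(logit(p) - logit(ps_candidates[k])))
        if abs(logit(p) - logit(ps_candidates[j])) <= caliper:
            pairs[i] = j          # accept only if within the caliper
            available.remove(j)   # match without replacement
    return pairs
```

In the study, candidates would additionally be pre-filtered on treatment status and molecular response at the switching time, as described in the two-step procedure above.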
In-work poverty and self-rated health in a cohort of working Germans: a hybrid approach for decomposing the effects of within-person and between-person differences in in-work poverty status Pförtner TK 1 , Schmidt-Catran A 2 1 Institute of Medical Sociology, Health Services Research, and Rehabilitation Science (IMVR), University of Cologne, Cologne, Germany; 2 Institute of Sociology and Social Psychology, University of Cologne, Cologne, Germany This study investigated whether self-rated health (SRH) is predicted by in-work poverty and how between-person and within-person differences in in-work poverty status contribute to this relationship. We used a logistic within-between random effects model in a nationally representative German sample with 19 waves of data collection (1995-2013) to estimate between-person and within-person effects of in-work poverty status on poor SRH. Interactions by age and gender were tested, and models were controlled for sociodemographic, socioeconomic and work-related characteristics. We found significant differences in SRH between individuals with different in-work poverty status but no evidence that within-person differences in in-work poverty status are associated with poor SRH. The association between in-work poverty and SRH was significantly stronger for women but did not significantly differ by age. All findings were robust when including sociodemographic, socioeconomic and working characteristics. Conclusion For Germany, we found a polarization of poor SRH between the non-poor and the working poor but no causal association of in-work poverty status with SRH within persons. The association between sarcopenia, muscle wasting related to ageing, and disability is not well understood. 
The objectives of this study, using data from a 3-year prospective population-based cohort, were to estimate the prevalence of sarcopenia in older people in Germany and to test the hypothesis that sarcopenia is associated with increasing disability in older adults. We performed a cross-sectional analysis of n = 927 participants and a longitudinal analysis of n = 859 participants aged 65 or older enrolled in the KORA-Age Study. Sarcopenia was defined based on the European Working Group on Sarcopenia in Older People (EWGSOP) algorithm, which consists of three components: low muscle mass, measured by a skeletal muscle index of ≤8.72 kg/m² for men and ≤6.33 kg/m² for women; low muscle strength, measured by handgrip strength <30 kg for men and <20 kg for women; and low muscle performance, measured by gait speed ≤0.8 m/s. Identification of sarcopenia required the presence of both low muscle mass and low muscle function (strength or performance). Disability status was measured by the Health Assessment Questionnaire-Disability Index (HAQ-DI). We defined the presence of any disability as HAQ-DI >0. Directed Acyclic Graphs (DAG) were constructed to identify an unbiased set of covariates. Linear mixed effects models were used to estimate the total effect of sarcopenia on disability scores. The overall prevalence of sarcopenia was 5.7 % (men 4.0 %, women 7.5 %) and increased with age. Prevalence of any disability was 46.9 % at baseline and 60.0 % at follow-up. The 3-year incidence of disability was 32.7 %. After adjustment for all potential confounders in the minimally sufficient adjustment set derived from the DAG, persons with sarcopenia at baseline had a higher disability score at follow-up by 0.146 (95 % confidence interval 0.031-0.262; p = 0.013) compared to those without sarcopenia. The EWGSOP algorithm, along with the proposed gait speed and grip strength cut-off points, as well as muscle mass cut-off points calculated from a healthy German population, could be used to define sarcopenia in our study. 
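The EWGSOP decision rule with the cut-offs quoted above (low muscle mass plus low strength or low performance) can be expressed as a short predicate; the function name and argument conventions are illustrative, not taken from the study:

```python
def has_sarcopenia(sex, smi, grip_kg, gait_m_s):
    """EWGSOP rule as described in the abstract: sarcopenia requires
    low muscle mass AND (low strength OR low performance).
    sex is "m" or "f"; smi is the skeletal muscle index in kg/m^2."""
    low_mass = smi <= (8.72 if sex == "m" else 6.33)       # muscle mass
    low_strength = grip_kg < (30 if sex == "m" else 20)    # handgrip, kg
    low_performance = gait_m_s <= 0.8                      # gait speed, m/s
    return low_mass and (low_strength or low_performance)
```

For example, a man with an index of 8.0 kg/m², grip strength 25 kg and gait speed 1.0 m/s would be classified as sarcopenic (low mass and low strength), while the same measurements with an index of 9.0 kg/m² would not.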
These results support the longitudinal relationship between sarcopenia and disability in older people. Further longitudinal studies are recommended to confirm these findings. The aim of this study is to examine the association between subjective and objective measures of physical activity (PA) as well as heart rate variability (HRV) indices and manic symptomatology in adolescents. Prior research shows that bipolar patients were less physically active and demonstrated a significant reduction in HRV and parasympathetic activity (1, 2). The majority of studies on the association between PA, HRV and manic symptomatology have been conducted in controlled settings with clinical populations and retrospective questionnaire assessments. In contrast, our study examines the association in real life in a non-clinical sample of adolescents from the general population. The current preliminary analysis is part of an ongoing epidemiologic study in Dresden, Germany (''BeMIND'': Behavior and Mind Health Study) to examine mental health and illness conditions as well as psychological, developmental, cognitive-affective and biological risk factors among adolescents and young adults. Up to now, 85 individuals aged 14-21 years participated in a four-day ecological momentary assessment (EMA) study part. Participants were prompted with acoustic signals to complete EMA items via smartphone on two weekdays and two weekend days. Manic symptomatology was assessed using an adapted short version of the Altman Self-Rating Mania Scale (ASRM). During waking hours, subjects rated their manic symptomatology on four items (since the last prompt … (1) I felt happier or more cheerful than usual; (2) I felt more self-confident than usual; (3) I talked more than usual; (4) I have been more active than usual) on visual analogue scales from 0 (none of the time) to 100 (the whole time) eight times a day (approximately every 2 h). 
To assess PA, participants were asked at each prompt ''What percentage of the time were you physically active since the last beep?'' In addition, objective movement data (body acceleration) as well as HRV data (RR intervals) were recorded continuously on all assessment days with an HRV/movement recorder (Firstbeat Bodyguard 2) attached to the skin with two chest electrodes. Because of the nested structure of the EMA data, multilevel modeling (random intercept models) was used for the analyses. Higher rated manic symptomatology was associated with subjectively rated higher self-initiated PA. Comparing the manic symptomatology ratings on weekdays and the weekend, participants reported higher ratings during weekdays. Manic symptomatology fluctuated over the course of the day, with higher ratings in the afternoon to evening hours compared to morning times. The association between physical activity and manic symptomatology ratings persisted even after adjusting for time-of-day and day-of-week variations. First results indicate a positive association between PA and (hypo-)manic symptomatology in a non-clinical sample of adolescents. Symptomatology fluctuated over the course of the day and between leisure and school/working time. Data collection of the BeMIND study and analysis of the physiological data are ongoing. In addition to the association between manic symptomatology and subjectively rated PA, the associations with objectively measured PA (accelerometry) and the HRV measures will be analysed and discussed. Association of maternal and paternal sociodemographic characteristics and use of medications during pregnancy in a cohort The use of prescription medications during pregnancy has been associated with sociodemographic characteristics [1-6]. 
This prospective cohort study aims (a) to assess the relation between prescription medication use during pregnancy and sociodemographic characteristics, adjusting for indicators of maternal health status, and (b) to assess whether the relation varies according to the method of data collection. Maternal factors affecting recall can partially explain the different associations of sociodemographic factors with prescription redemption data and reported medication use, respectively. Further analysis to assess the role of maternal health behaviors, such as smoking and alcohol use, is ongoing. Agreement between self-reported medication use and dispensing data varied by therapeutic group [7]. The relation of sociodemographic characteristics with use of the most common therapeutic groups will be assessed as well. Medications prescribed at hospital discharge in patients with validated diagnosis of dementia Dementia is a major cause of morbidity and mortality among older people and is becoming increasingly prevalent with population aging. Dementia increases in- and out-patient health care utilization. Patients are often subject to polypharmacy. In these frail patients, drug interactions are a serious concern, and neuroleptics have been associated with excess morbidity and mortality. We present the preliminary results of an ongoing cohort study aimed at assessing (a) the validity of discharge diagnoses of dementia; (b) medications prescribed at discharge; (c) prescriptions redeemed after discharge, through record linkage with an outpatient prescription database; and (d) utilization of nursing homes and outpatient healthcare. Consistent with prior studies, codes for dementia showed high validity; however, review of hospital charts is required when the information in the HEMR is lacking. Proton pump inhibitors were prescribed in 50 % of discharges and antipsychotic medications in more than one third. 
Further analysis will assess the frequency of in-hospital initiation and the pattern of use after discharge. Clinical cancer registries are a valuable data source for health services research (HSR). HSR is in need of high-quality routine care data for its evaluations. However, the secondary use of routine data, such as documented cancer cases in a disease registry, poses new challenges in terms of data quality, IT management, documentation processes and data privacy. In the clinical cancer registry Heilbronn-Franken, real-world data from the Giessen Tumor Documentation System (GTDS) were utilized for analyses of patients' disease processes and guideline adherence in follow-up care. A process was developed to map disease state definitions to fields of the GTDS database and extract patients' disease progress information. Thus, the disease processes of sub-cohorts could be compared to each other, e.g., comparison of disease-free survival of HER2 (human epidermal growth factor receptor 2)-positive and -negative women who were treated with trastuzumab, a targeted therapy applied in breast cancer. In principle, such comparisons are feasible and of great value for HSR as they depict a routine care setting of a diverse patient cohort. Yet, local documentation practice, a missing flow of information from external health care providers, and small sub-cohorts impede the analysis of clinical cancer registry databases and their usage for HSR. Former analyses showed an increase in the prevalence of overweight until 2004 and a decrease until 2008. The authors of the latest publication supposed that prevention programs initiated this positive development. We examined the prevalence of overweight and underweight in the entire population of 5- and 6-year-old children before entering school in Rhineland-Palatinate, Germany, and assessed time trends over the last 6 years and the impact of migration background. 
The current study was based on cross-sectional data from the obligatory pre-school examinations in 35 out of 36 counties of Rhineland-Palatinate between 2009/10 and 2014/15 (n = 214,005). Body mass index (BMI; kg/m²) was calculated, and the prevalence of overweight (>90th percentile) and underweight (<10th percentile) was defined based on German age- and gender-specific normative data. Information on migration background was available since 2011/12 for 29 counties (n = 82,621). The prevalence of overweight and underweight was estimated to be 9.4 and 11.6 % in 2009/10, respectively. The prevalence of both conditions seemed to be stable over time (9.6 and 11.6 % in 2014/15, respectively). When restricting the analysis to children with information on migration background, the prevalence of overweight rose from 7.8 % in 2011/12 to 9.2 % in 2014/15, while the prevalence of underweight declined from 14.1 % in 2011/12 to 12.4 % in 2014/15. Mean BMI was higher in children with migration background (15.65, 95 % CI 15.62-15.68) compared to children without migration background (15.36, 95 % CI 15.35-15.38). The prevalence of overweight was 1.5 times higher and the prevalence of underweight 0.95 times lower in children with migration background compared to children without. In children with migration background, there was an increase in overweight from 10.3 % in 2011/12 to 12.8 % in 2014/15. In children without migration background, there was a slight increase in overweight (from 7.1 % in 2011/12 to 8.1 % in 2014/15) and a decrease in the prevalence of underweight (from 14.2 to 12.8 %). Further analyses using logistic regression models including year of examination, sex, age, migration background and socioeconomic status are pending. This large study of almost all preschool children of Rhineland-Palatinate from 2009 to 2015 shows rather stable patterns of overweight and underweight, which differ from past comparable studies in Germany. 
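The classification rule used above (overweight above the 90th, underweight below the 10th reference percentile) amounts to a simple comparison against externally supplied percentile values. A minimal sketch, assuming the age- and gender-specific reference percentiles have already been looked up in the German normative tables (function and argument names are illustrative):

```python
def bmi_category(bmi, p10, p90):
    """Classify a child's BMI against age- and gender-specific reference
    percentiles; p10 and p90 must come from external normative data."""
    if bmi > p90:
        return "overweight"   # > 90th percentile
    if bmi < p10:
        return "underweight"  # < 10th percentile
    return "normal"
```

In the study itself this lookup is repeated for every child with the percentile values matching that child's age and sex, which is why the prevalence estimates are reported per examination year and subgroup.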
Interestingly, results differed when analyses were restricted to counties with assessment of migration background. Our results indicate that country and/or region of origin influence the risk of overweight. The high prevalence of underweight in children without migration background and the upward shift of the BMI distribution in children with migration background need further investigation. Evaluation of sample size estimation tools for differentially expressed RNA-seq data Poplawski A 1 , Binder H 1 1 Institut für Medizinische Biometrie, Epidemiologie und Informatik (IMBEI), Universitätsmedizin der Johannes-Gutenberg-Universität Mainz, Mainz, Germany Background RNA-seq is the instrument of choice for measuring differential gene expression. Good experimental design maximizes power while minimizing costs. The experimental design includes the choice of the necessary number of biological replicates and the sequencing depth. Because of the negative binomial distribution, the high-dimensional structure and the correlation between genes, sample size estimation for RNA-seq differential expression profiling is not trivial. Several sample size estimation tools have been developed to deal with these challenges. We performed a systematic literature search and evaluation of such open source tools. We performed a systematic search in PubMed using the key words 'RNA', 'seq'/'RNA-seq' and 'sample size'/'power' and a complementary search using Google. We created pilot data by selecting two biological replicates from two different conditions using existing mouse [1, GSE68376] and human [GTEx Analysis Release V6, dbGaP Accession phs000424.v6.p] RNA-seq count data to compare the sample sizes estimated by the different tools. The human count data were also used to simulate data with known differentially expressed genes. We estimated the sample sizes necessary to achieve predetermined power values for these simulated data using DESeq, DESeq2 and edgeR. 
We randomly selected two test and two control samples from these simulated data to obtain pilot data and used them as the starting point for the tool application. We identified eight sample size tools, such as PROPER, SPSS and Scotty. Three tools estimate sample sizes via simulation, the others use mathematical formulas; two tools estimate the sample size only for a single gene, while the others perform multiple comparisons. Most of the tools take the negative binomial distribution into account and perform correction for multiple testing, but only SPSS can be used for more than two conditions. Applying the six tools that perform multiple comparisons to the mouse and human RNA-seq count data files yielded different sample sizes. To evaluate the tools' performance, we conducted simulations and compared the results with the tool outcomes. Only for two tools were the estimated sample sizes similar to our results. The use of tools for sample size estimation can help researchers to estimate the necessary sample size for RNA-seq data, but the results have to be treated with caution. One should always use pilot data and preferably more than one tool to estimate the necessary sample size, to be on the safe side. Prevalence of polypharmacy is not the same all over Germany Possehl C 1 1 DAK-Gesundheit, Versorgungsmanagement, Hamburg, Germany During the past years, polypharmacy has been recognized as a challenge in our health care system, especially in the treatment of elderly patients. As a statutory health insurance with clients from all over Germany, our data on pharmacy prescriptions enable us to estimate the prevalence of polypharmacy in different regions of the country. For each patient, we looked for simultaneous prescriptions of 5 or more different substances (ATC code) based on DDD coverage at 6 fixed points in time during two subsequent quarters. 
Patients who met these criteria at least twice in each quarter were considered positive for polypharmacy. The analysis was performed for the years 2013 to 2015. Results could be attributed to the regions of the German Associations of Statutory Health Insurance Physicians (KV-Regionen) using patients' residency information. We adjusted for age and sex to the population of our health insurance as well as to the German population. According to our data, polypharmacy seems to be more widespread in the eastern part of Germany. Even if information based on health insurance data is limited, our data can give some interesting insight into prescription practice in different parts of Germany. Efficient representation of health data de-identification policies Prasser F 1 , Kohlmayer F 1 , Kuhn KA 1 1 Technical University of Munich, Munich, Germany Patient privacy must be protected when sensitive biomedical data are shared. Data de-identification is an important protection method, where attributes in a dataset are transformed to meet formal privacy requirements specified in terms of mathematical or statistical models. A method which has been recommended for transforming health data is generalization, which means that user-defined hierarchies are utilized to iteratively reduce the precision of attribute values and thereby decrease privacy risks. With this model, the space of potentially privacy-preserving data transformations is defined by the set of all possible combinations of the generalization levels of each attribute. Each such combination is called a de-identification policy. To balance privacy with data quality, data de-identification algorithms need to manage information about a potentially large number of such policies. For high-dimensional data this can result in significant memory requirements. In this article, we present a space-efficient encoding scheme for data de-identification policies. 
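To illustrate the underlying idea of representing a policy compactly (this is a generic mixed-radix sketch, not the encoding scheme proposed in the abstract), a policy, i.e. one generalization level per attribute, can be packed into a single integer once the height of each attribute's hierarchy is known:

```python
def encode_policy(levels, heights):
    """Pack one generalization level per attribute into a single integer
    via mixed-radix encoding; heights[i] is the number of levels in the
    hierarchy of attribute i, and 0 <= levels[i] < heights[i]."""
    code = 0
    for level, height in zip(levels, heights):
        code = code * height + level
    return code

def decode_policy(code, heights):
    """Inverse of encode_policy: recover the per-attribute levels."""
    levels = []
    for height in reversed(heights):
        code, level = divmod(code, height)
        levels.append(level)
    return list(reversed(levels))
```

For three attributes with hierarchy heights 3, 4 and 2, each policy fits into a single integer below 24, instead of one integer per attribute; the actual scheme in the article is reported to reduce memory consumption far more aggressively.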
Our results show that our method reduces memory consumption by up to a factor of 25 when de-identifying high-dimensional data. A typical example is longitudinal datasets collected in research. Due to the inherent sensitivity of health data, patient privacy must be protected. De-identification means transforming a dataset in such a way that it becomes extremely difficult for an attacker to link its records to identified individuals. This can be achieved with different types of data transformations. As transformation impacts the information content of a dataset, it is important to balance an increase in privacy with a decrease in data quality. To this end, models for measuring both aspects are needed. Non-Uniform Entropy is a model for data quality which is frequently recommended for de-identifying health data. In this work we show that it cannot be used in a meaningful way for measuring the quality of data which has been transformed with several important types of data transformation. We introduce a generic variant which overcomes this limitation. We performed experiments with real-world datasets, which show that our method provides a unified framework in which the quality of differently transformed data can be compared to find a good or even optimal solution to a given data de-identification problem. We have implemented our method in ARX, an open source anonymization tool for biomedical data. The benefits of a formative evaluation for developing a highly innovative software: the case of the handover EHR Background Innovations are typically characterised by their relative newness for the user. In order for new eHealth applications to be accepted as innovations, additional criteria have been proposed, including ''use'' and ''usability''. The handoverEHR is a new approach that allows the user to translate the essentials of a clinical case into a graphical representation, the so-called cognitive map of the patient. 
This study aimed at testing the software's usability. Methods A convenience sample of 23 experienced nurses from different healthcare organisations across the country rated the usability of the handoverEHR after performing typical handover tasks. All usability scales of the IsoMetricsL questionnaire showed positive values (4, ''I agree''), with the exception of ''error tolerance'' (3, ''neutral statement''). A significant improvement was found in self-descriptiveness as compared to an initial usability test prior to this study. Different subgroups of users tended to rate the usability of the system differently. This study demonstrated the benefits of formative evaluations in terms of improving the usability of an entirely new approach. It thus helps to transform a novel piece of software towards becoming a real innovation. Our findings also hint at the importance of user characteristics that could affect the usability ratings. Data were collected before and 6 months after the implantation of the DBS. Caregivers' assistance was also obtained for completing the questionnaires. The Modified Caregiver Strain Index was used to assess the impact of DBS on the care providers' health, finances, social interactions and time demands. All the data obtained were compared before and after the insertion of DBS. The perceived or experienced difficulties after DBS insertion dropped from a mean of 72.5 (±18.2) to a mean of 33.4 (±15.7) after six months of treatment. Patients reported a high level of compliance after the surgical procedure (DBS), which was statistically significant (p < 0.05). A significant improvement (p < 0.05) in the caregiver strain index was reported after the surgical implantation of DBS in the care recipients (patients). Our study showed significant improvement in quality of life among patients with Parkinson's Disease after the implantation of DBS. Caregiver stress was also considerably reduced after the surgery in the care recipients (patients with DBS). 
Fingerprint pattern and blood groups in a twin population: a comparative study Educare Institute of Dental Sciences, Malappuram, India The distinguishing nature of the physical characteristics of a person is due both to the inherent individual genetic diversity within the human population and to the random processes affecting the development of the embryo. The focus of this study is to quantitatively determine the similarity of fingerprint patterns and blood groups in identical as well as non-identical twins. Methods A total of 24 pairs of identical twins and 33 pairs of non-identical twins were selected for the study. The fingerprints of the thumb, index, middle, ring and little fingers of both hands of the 57 pairs of twins were scanned. Due to differences in paper quality and degradation of the prints over time, several of these fingerprints were of poor quality, and we selected only 51 pairs. The blood groups of the study population were identified using the ABO system of blood grouping. The results showed that the 'arch' type was the most common fingerprint pattern in both identical (42.04 %) and non-identical twins (53.10 %). The loop type accounted for 26.59 and 22.24 % in identical and non-identical twins, respectively. All the identical twins shared the same blood group as their respective co-twin except one pair, in which the types differed (B+ and O+). The similarity in fingerprint pattern among identical twins was much higher than among non-identical twins, and this was statistically significant. Rh+ was the more common blood group among twins than Rh-. Suggesting missing relations in biomedical ontologies based on lexical regularities The number of biomedical ontologies has increased significantly in recent years. Many such ontologies are the result of the efforts of communities of domain experts and ontology engineers. 
The development and application of quality assurance (QA) methods should help these communities to develop ontologies that are useful for both humans and machines. According to previous studies, biomedical ontologies are rich in natural-language content, but most of them are less rich in axioms. Here, we are interested in studying the relation between content in natural language and content in axiomatic form. Analysing the labels of the classes permits the identification of lexical regularities (LRs), i.e. sets of words that are shared by the labels of different classes. Our assumption is that classes exhibiting an LR should be logically related through axioms, and this assumption is used to propose an algorithm for detecting missing relations in the ontology. Here, we analyse a lexical regularity of SNOMED CT, congenital stenosis, which has been reported as problematic by the SNOMED CT maintenance team. The web application is developed in Java with JSF2 and PrimeFaces and runs on Apache Tomcat. All data, including study configuration and participant data, are stored in a MySQL DBMS. Documents are generated from LaTeX templates using Apache's Velocity Template Language. WebModys (code name) comprises three responsibility layers: (1) the basic application providing the available study object types, generic business logic, the graphical user interface (GUI), the database interface, user rights and internationalization; (2) the study configuration defining a process graph of the different steps and levels of recruitment as well as related study objects; and (3) the daily participant management tasks and their documentation. The application layer provides the tools needed in everyday participant management, whereas the configuration layer employs a process graph to describe recruitment levels (nodes) as well as feasible and predefined interaction sequences of recruiting personnel and participants (edges).
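The node/edge idea of the configuration layer can be illustrated with a small sketch; the recruitment steps below are hypothetical examples, not the actual WebModys configuration:

```python
# Minimal sketch of a recruitment process graph: recruitment levels are
# nodes, feasible interaction sequences are directed edges. The step
# names are hypothetical, not the actual WebModys study configuration.
RECRUITMENT_GRAPH = {
    "invited":     ["contacted"],
    "contacted":   ["appointment", "refused"],
    "appointment": ["examined", "no_show"],
    "no_show":     ["contacted"],   # re-contact after a missed visit
    "examined":    [],
    "refused":     [],
}

def feasible(step_from, step_to, graph=RECRUITMENT_GRAPH):
    """A transition is allowed only along an edge of the process graph."""
    return step_to in graph.get(step_from, [])
```

Restricting every recorded interaction to the edges of such a graph is what lets the application layer enforce "feasible and predefined interaction sequences" at data-entry time.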
WebModys has been in productive use in SHIP-Trend-1 since spring 2016 and is planned for several studies at the BIPS. The application layer offers core functionalities for the management of participant data, appointments, documents and the recording of all recruitment steps performed. User rights can be assigned via the GUI. The GUI also provides configuration options for several study objects: schedule setup, document template upload, station and contact result configuration. The configuration layer GUI is still under development. The participant management layer supports the everyday work for standard tasks such as participant detail views, participant data and address editing, letter generation, printing and archiving as well as appointment, consent, ID, and group management. Support for further recruitment tasks will be integrated stepwise into the application. Conclusion At present, WebModys offers a basic working environment for the participant management of complex population-based studies with diverse recruitment strategies. The implementation and integration of more complex functionalities will continue. The productive use in SHIP-Trend-1 and BIPS studies will provide us with valuable experience for further application development. The relationship between symptom severity of major depressive disorder and lifestyle Major depressive disorder (MDD) is thought to be associated with the development of unhealthy lifestyle habits, which in turn may promote the development and progression of physical comorbidities such as cardiovascular disease or type 2 diabetes. However, the impact of the symptom severity of MDD is still unclear. The aim of this study was to examine the associations of MDD and its symptom severity with four individual lifestyle factors, namely diet quality, physical activity, body mass index (BMI), and smoking, as well as with an overall lifestyle index as a measure of the co-occurrence of multiple unhealthy lifestyle factors.
The study involved 1,478 participants aged 35-65 years from the BiDirect Study, including 881 patients with a clinical diagnosis of MDD and 597 non-depressed population-based controls. The psychiatric assessment was based on clinical interviews and included the Hamilton Depression Rating Scale (HAM-D) to assess the symptom severity of MDD. Controls with indications of a previous or current depressive symptomatology were excluded from this analysis. Self-reported data on smoking habits, diet, and physical activity were assessed during a face-to-face interview. BMI was calculated from measured weight and height. The four lifestyle factors were dichotomized as either healthy or unhealthy, and participants were assigned one point for each of the following unhealthy lifestyle habits: current smoking, poor diet quality (diet quality score ≤ median), low physical activity (according to WHO recommendations), and overweight (BMI ≥ 25.0 kg/m²). Next, an overall lifestyle index was calculated, which adds up the number of present unhealthy lifestyle factors. The cross-sectional associations of MDD and symptom severity with the individual lifestyle factors and the combined lifestyle index were investigated by alternating logistic regression and ordinal logistic regression, respectively. The analyses were adjusted for socio-demographic characteristics (age, sex, marital status, job status, and education). After adjustment for socio-demographic characteristics, patients with MDD showed significantly higher odds of smoking (OR = 2.41, 95 % CI 1.86-3.13), low physical activity (OR = 1.68, 95 % CI 1.31-2.14), and overweight (OR = 1.52, 95 % CI 1.19-1.94) than non-depressed controls. Likewise, we found significant associations between the symptom severity of MDD and these three lifestyle factors, mostly in a dose-response manner.
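The lifestyle index construction described above amounts to counting unhealthy factors; a minimal sketch using the stated cut-offs (the input values are hypothetical):

```python
def lifestyle_index(smoker, diet_score, diet_median, activity_ok, bmi):
    """Count unhealthy lifestyle factors as described in the abstract:
    current smoking, diet quality score <= median, physical activity
    below the WHO recommendation, and BMI >= 25.0 kg/m^2."""
    return sum([
        smoker,                     # current smoking
        diet_score <= diet_median,  # poor diet quality
        not activity_ok,            # low physical activity
        bmi >= 25.0,                # overweight
    ])

# Hypothetical participant: smoker, good diet, low activity, BMI 27
idx = lifestyle_index(smoker=True, diet_score=60, diet_median=50,
                      activity_ok=False, bmi=27.0)
```

The resulting 0-4 count is the ordinal outcome modelled by the ordinal logistic regression.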
For example, the odds of overweight increased with increasing symptom severity of MDD (mild: OR = 1.43, 95 % CI 1.07-1.93; moderate: OR = 1.71, 95 % CI 1.21-2.41; severe: OR = 1.81, 95 % CI 1.00-3.32; non-depressed controls = reference). In terms of overall lifestyle, MDD was also associated with the unhealthy lifestyle index (OR = 1.91, 95 % CI 1.55-2.34). Again, depression severity showed a strong dose-response association (mild: OR = 1.62, 95 % CI 1.26-2.07; moderate: OR = 2.67, 95 % CI 2.01-3.55; severe: OR = 3.28, 95 % CI 1.99-5.41). In conclusion, the findings support the notion that patients with MDD are more likely to exhibit an unhealthy lifestyle than non-depressed individuals, and they highlight an important role of symptom severity in this relationship. Given that the above-mentioned unhealthy lifestyle habits are established risk factors for several physical comorbidities, patients with MDD are an important target group for health-promoting lifestyle interventions. Managing quality and safety in real time? Evidence from an interview study Health systems around the world are investing increasing effort in monitoring care quality and safety. Dashboards can support this process, providing summary data on processes and outcomes of care and making use of data visualization techniques such as graphs. As part of a study exploring the development and use of dashboards in English hospitals, we interviewed senior managers across 15 healthcare providers. Findings revealed substantial variation in the sophistication of the dashboards in place, which largely presented retrospective data items determined by national bodies and depended on manual collation from a number of systems. Where real-time systems were in place, they supported staff in proactively managing quality and safety.
Mixed methods evaluations for understanding impact on patient outcomes and work practice Randell R 1 , Dowding D 2,3 , Scott P 4 , Georgiou A 5 Two approaches to the evaluation of health IT predominate: clinical trials that focus on patient outcomes and qualitative studies that explore the impact of health IT on work practice. The intention of this workshop, organised by the EFMI Human and Organisational Factors (HOFMI) Working Group and the Assessment of Health Information Systems (EVAL) Working Group, is to demonstrate that it does not need to be, and should not be, a choice between these two distinct approaches. The organisers will provide examples from their own work where they have sought to bridge this divide through the use of mixed methods evaluations. The goal is to encourage researchers to engage in mixed methods evaluations by presenting a range of possible designs. Background Peak oxygen uptake is a specific measure of global cardio-respiratory fitness and is determined by incremental exercise tests. Its applications range from the assessment of operability in patients with heart failure to the evaluation of endurance exercise in healthy individuals. Accordingly, peak oxygen uptake values span a wide range and depend on a number of factors. The goal of the present analysis was to find sex- and age-adjusted reference values for peak oxygen uptake and to consider further confounding factors. Methods 10,189 individuals (3,677 females and 6,512 males) were recorded during primary preventive examinations at three locations (Ruedesheim, Frankfurt and Munich). Absolute and relative peak oxygen uptake, maximum work rate as well as anthropometric, demographic and further characteristics were measured. The associations between peak oxygen uptake and possible confounding factors were analyzed using the Mann-Whitney U test and Spearman's correlation coefficient, as appropriate.
Based on these results, regression modelling was performed using absolute and relative peak oxygen uptake as well as maximum work rate as continuous dependent variables. Quantile regression analysis and non-parametric spline regression models were applied. Based on the estimated goodness of fit and the practical applicability, the best approach was selected. The analysis was performed stratified by sex and adjusted for age. After exclusion of cases with missing or implausible values in one of the dependent variables, 3,545 females and 6,371 males were eligible for the analysis. The median age of the observed population was 45 years (interquartile range: 41 to 50) for both males and females. The age groups of <30 and ≥70 years were sparse; therefore, a valid prediction is restricted to the age range from 30 to <70 years. Several factors were associated with peak oxygen uptake, including obesity, cigarette smoking, myocardial infarction, diabetes mellitus and burnout syndrome. Linear and polynomial quantile regressions were considered the most feasible approaches to calculate reference values for a patient with known characteristics. The models' goodness of fit was better for the polynomial models. Non-parametric spline regression models were considered less feasible, as the models are more complex and the calculation of predicted values is inconvenient. The underlying data provide a comprehensive sample to estimate percentile values for incremental exercise tests in the context of primary preventive examinations. Polynomial quantile regression was considered a feasible and valid approach for the calculation of adjusted quantile reference values.
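As a rough illustration of age-stratified percentile reference values, the sketch below computes empirical percentiles per age band using only the Python standard library; it is a deliberately simplified stand-in for the polynomial quantile regression actually used in the study, and the data layout is an assumption:

```python
import statistics
from collections import defaultdict

def reference_percentiles(records, band_width=10, q=(0.05, 0.5, 0.95)):
    """Empirical percentile reference values of peak oxygen uptake per
    age band. Ages outside the study's valid prediction range
    (30 to <70 years) are excluded, mirroring the abstract."""
    bands = defaultdict(list)
    for age, vo2_peak in records:
        if 30 <= age < 70:
            bands[(age // band_width) * band_width].append(vo2_peak)
    reference = {}
    for band, values in sorted(bands.items()):
        cuts = statistics.quantiles(values, n=100)  # 99 percentile cuts
        reference[band] = {p: cuts[round(p * 100) - 1] for p in q}
    return reference

# Hypothetical (age, peak oxygen uptake) records, all in the 30-39 band
ref = reference_percentiles([(35, float(v)) for v in range(1, 101)])
```

A regression-based approach additionally smooths these percentiles across age and lets further covariates enter the model, which is why the study preferred polynomial quantile regression over raw band-wise percentiles.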
Data privacy regarding the integration of laboratory data in a multicenter population-based study: juggling with pseudonyms using the example of the German National Cohort Rau H 1 , Stübs G 1 , Hoffmann W 1 1 Institute for Community Medicine, Section Epidemiology of Health Care and Community Health, Greifswald, Germany Background In many population-based studies aiming to identify potential risk and disease-associated factors, a variety of results from laboratory analyses is collected. The population-based German National Cohort study (GNC) [1] collects data and biological samples from a random population sample of 200,000 study participants. Most of the biological specimens (blood, urine, saliva, nasal swabs and stool) are stored in biorepositories. For each participant, one sample each of EDTA blood and serum is used immediately for laboratory analysis. Consequently, with a minimal set of 21 laboratory parameters per analysis and two survey dates, more than 1 million laboratory results from this instant analysis are expected. The key question is how to gather, store and provide high-quality laboratory data for future analyses while considering both data privacy and practicability. Within the GNC, this huge amount of laboratory data is collected by laboratories and sent via a laboratory information management system (LIMS) to the data management system (DMS). Particularly in complex studies, entities work together through direct communication. Data management must ensure that researchers or other entities cannot access identifying personal data of study participants. Consequently, this implies the division into separate domains to ensure data privacy. Personal identifying data are only accessible to a trusted third party (TTP) [2]. Since the laboratories, LIMS and DMS have to communicate directly with each other, multiple pseudonyms are used. As all gathered data have to be associated with the study participant, the pseudonyms are also managed by the TTP.
The first pseudonym is used in the laboratory, which carries out the instant analyses and communicates with the LIMS regarding orders and results. However, through this pseudonym it must not be able to learn any identifying data of a participant or their samples, except for age and gender where this is needed to classify the results. The second pseudonym is used by the LIMS to manage the positions of a participant's samples in the biobank for long-term storage, supply and instant analysis. The LIMS has to communicate directly with the laboratory, DMS and Transfer Unit [3], but must not be able to connect the samples via this pseudonym to any other medical or personal data stored in the DMS. The data management, however, has to process and store all study-relevant data, including the laboratory results, without knowledge of any identifying data of a participant. Therefore, a third pseudonym is used to ensure data privacy. In this scenario, the TTP connects the three participating entities by token-based, direct communication. Consequently, entities can communicate with each other in real time using a token from the TTP, without knowing any pseudonyms or participant data other than their own. Data protection within the NAKO cohort study is ensured through the use of different pseudonyms for different entities and purposes. This allows the utilization of, and direct communication between, several laboratories and other external services. In summary, the use of multiple pseudonyms for data protection purposes has already been successfully implemented in the German NAKO project.
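The token-based, multi-pseudonym scheme can be sketched as follows; the entity names, pseudonym formats and token mechanics are illustrative assumptions, not the actual TTP implementation:

```python
import secrets

class TrustedThirdParty:
    """Minimal sketch of the multi-pseudonym idea: each entity (LAB,
    LIMS, DMS) sees only its own pseudonym for a participant, and
    single-use tokens link records across entities in real time."""
    def __init__(self):
        self._pseudonyms = {}  # participant_id -> {entity: pseudonym}
        self._tokens = {}      # token -> participant_id

    def register(self, participant_id, entities=("LAB", "LIMS", "DMS")):
        self._pseudonyms[participant_id] = {
            e: f"{e}-{secrets.token_hex(4)}" for e in entities}

    def pseudonym_for(self, participant_id, entity):
        return self._pseudonyms[participant_id][entity]

    def issue_token(self, participant_id):
        token = secrets.token_hex(8)
        self._tokens[token] = participant_id
        return token

    def resolve(self, token, entity):
        """An entity redeems a token (single use) and learns only its
        own pseudonym for the participant, never anyone else's."""
        return self._pseudonyms[self._tokens.pop(token)][entity]

ttp = TrustedThirdParty()
ttp.register("P001")
token = ttp.issue_token("P001")
lims_pid = ttp.resolve(token, "LIMS")  # LAB pseudonym stays hidden
```

Because `resolve` consumes the token, two entities can exchange one record without either learning the mapping that only the TTP holds.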
A weighted combined effect measure for the analysis of a composite time-to-first-event endpoint with components of different clinical relevance Rauch G 1 , Eulenburg C 2 , Wegscheider K 3 , Kieser M 1 1 Universität Heidelberg, Heidelberg, Germany; 2 IMBE UKE Hamburg, Hamburg, Germany; 3 IMBE UKE Hamburg, Hamburg, Germany Composite endpoints combine several events within a single variable, which increases the number of expected events and is thereby meant to increase the power. However, the interpretation of results can be difficult, as the observed effect for the composite does not necessarily reflect the effects for the individual components, which may be of different magnitude or even point in opposite directions. This is especially a problem if the event types are of different clinical relevance, which is commonly the case in clinical applications. The common effect measure for composite endpoints is the all-cause hazard ratio, which gives equal weight to all types of events irrespective of their clinical relevance. An alternative is to consider a weighted average of the log-transformed cause-specific hazard ratios. This approach is standard when combining hazard ratios of different studies within a meta-analysis but has not yet been investigated in the context of composite endpoints. In meta-analyses, the weights are chosen as the inverse of the variances of the different hazard ratios, reflecting their level of uncertainty. However, the motivation of our approach was to take account of the different levels of clinical relevance. Therefore, we will investigate an extended approach in which the weights are given as the product of the inverse variance and a factor expressing the clinical relevance of the component. We will compare this new effect measure to the standard hazard ratio by Monte Carlo simulations and by means of a clinical trial example.
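The proposed measure, a relevance-scaled inverse-variance average of the log cause-specific hazard ratios, might be computed as in this sketch (all numbers are hypothetical):

```python
import math

def weighted_combined_hr(hrs, variances, relevance):
    """Combined effect as exp of a weighted mean of log cause-specific
    hazard ratios, with weights w_i = relevance_i / var_i, i.e. the
    product of the inverse variance and a clinical-relevance factor."""
    w = [r / v for r, v in zip(relevance, variances)]
    total = sum(w)
    log_combined = sum(wi * math.log(hr)
                       for wi, hr in zip(w, hrs)) / total
    return math.exp(log_combined)

# Hypothetical composite: death (HR 0.9, relevance 3) and
# hospitalization (HR 0.6, relevance 1), equal variances.
hr_w = weighted_combined_hr(hrs=[0.9, 0.6],
                            variances=[0.04, 0.04],
                            relevance=[3.0, 1.0])
```

With equal variances the relevance factors dominate, so the combined estimate stays close to the mortality effect instead of being masked by the softer hospitalization component.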
Depending on the underlying cause-specific hazard ratios of the components, the weighted approach and the standard unweighted approach might come to relevantly different results. The weighted approach is thereby less sensitive to masking effects caused by less relevant components and might thus provide an interpretation advantage. The weighted approach is easy to apply and can improve the interpretation of the composite effect. This new method thus provides a valid alternative to the standard unweighted approach. Public awareness of antibiotic resistance in Lower Saxony: results from the HaBIDS study Raupach-Rosin H 1 , Rübsamen N 1,2 , Schütte G 1 , Karch A 1,2 , Raschpichler G 1 , Mikolajczyk R 1,3 1 Helmholtz-Zentrum für Infektionsforschung, Braunschweig, Germany; 2 PhD program ''Epidemiology'' Braunschweig-Hannover, Germany; 3 Hannover Medical School, Hannover, Germany The development of antibiotic resistance is a major public health concern worldwide. The WHO recently published a multi-country survey on adherence to antibiotic therapy and public awareness regarding antibiotic resistance, which showed some misunderstandings regarding antibiotic resistance and a widespread belief that lay people do not have any influence on antibiotic resistance. However, no country from the European Union was included. We aimed to assess the current public awareness and knowledge about the development of antibiotic resistance in Lower Saxony and to identify misconceptions that policies and campaigns could address. Knowledge was determined using four questions. Furthermore, there were questions covering awareness of and worries with regard to multi-resistant pathogens (MRP) as well as questions concerning the responsibility for the spread of antibiotic resistance. Almost all participants (98.0 %) had heard about antibiotic resistance. One percent of the participants stated that they had carried a resistant pathogen themselves.
Nearly half of the participants (45.6 %) responded that they knew at least one carrier of an MRP personally. The median number of correctly answered knowledge questions was two (interquartile range [2, 3]); participants knowing somebody with MRP personally showed better knowledge (median 3, IQR [2, 3] vs. median 2, IQR [2, 3]; p < 0.001). Two thirds (67.8 %) of the participants considered the topic of antibiotic resistance very important. Regarding the risk of acquiring a resistant pathogen, participants were more worried for their family members (45.5 %) than for themselves (37.0 %; p < 0.001). In terms of responsibilities regarding the spread of MRP in the health care sector, 96.1 % of the participants considered health care workers responsible for engaging against MRP, 90.2 % stated that every individual is responsible, and 73.1 % considered politicians responsible for the reduction of MRP. In a case study, almost all participants (98 %) agreed that they would wash their hands after visiting a neighbour with MRP; only 0.8 % would completely avoid contact with a neighbour because of MRP. Conclusion Awareness of antibiotic resistance was widespread within our study population, and the majority considered the topic to be very important. Responsibility for controlling antibiotic resistance was attributed equally to health care workers and the general population. Even though participants were worried about MRP, they expected themselves to behave in a non-stigmatizing way when exposed to a person with MRP. Group-based trajectory models to classify medication adherence in a comparative effectiveness study of bisphosphonate use on the risk of hip fracture Reinders T 1 , Enders D 1 , Kollhorst B 1 Background The medication possession ratio (MPR) is an established measure of adherence. A major disadvantage of the MPR is that long time periods are reduced to a single number, leading to the same MPR for patients with completely different adherence patterns.
Group-based trajectory models provide the possibility to identify groups of patients with similar adherence patterns and to model the average adherence in each group over time. The objective of the study was to compare models for adherence prediction when adherence was defined by either a trajectory model or the MPR. For this purpose, an observational study classifying osteoporotic patients by their adherence to bisphosphonates was used, and the association between adherence and subsequent hip fractures was investigated. We conducted a retrospective cohort study among users of bisphosphonates with osteoporosis aged 45 years and older in the German Pharmacoepidemiological Research Database between 2007 and 2011. We evaluated medication adherence during the 540 days after initiating bisphosphonates and grouped patients based on a 5-group trajectory model. For comparison, the medication adherence of patients was also summarized into 2 and 4 groups using the MPR. Hip fractures were measured after adherence assessment. To evaluate the association between adherence and hip fractures, Cox proportional hazards models were used and the association was quantified by the hazard ratio (HR). Fit between models was compared using the model C-statistic. Among 47,664 bisphosphonate users, 586 (1.23 %) had a hip fracture during follow-up. The 2-group and 4-group MPR models and the 5-group trajectory model discriminated similarly between patients with and without hip fractures (C-statistics 0.67168, 0.67033 and 0.66929). HRs for adherence groups were similar among the 5-group trajectory and the categorized MPR models. The group of the 5-group trajectory model that included the patients with the highest medication adherence showed a significantly smaller risk of hip fractures (HR: 0.69, 95 % CI 0.57-0.84) compared to the group of patients who were least adherent, while no significant differences in risk were observed for the other three adherence groups (HR: 0.84, 0.87 and 1.12).
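The MPR's limitation noted above, that one ratio can hide very different adherence patterns, is easy to demonstrate in a small sketch; the dispensing records (start day, days supplied) are hypothetical:

```python
def medication_possession_ratio(dispensings, observation_days=540):
    """MPR = total days' supply dispensed / days observed, capped at 1.
    The 540-day window mirrors the study's assessment period."""
    supplied = sum(days for _, days in dispensings)
    return min(supplied / observation_days, 1.0)

# Patient A: steady quarterly refills. Patient B: front-loaded supply,
# then complete discontinuation. Both receive 270 days of supply.
a = medication_possession_ratio([(0, 90), (90, 90), (180, 90)])
b = medication_possession_ratio([(0, 270)])
```

Both patients end up with MPR 0.5 despite opposite trajectories; a group-based trajectory model fitted to the month-by-month coverage would place them in different groups.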
Conclusion Bisphosphonate adherence trajectory models and the MPR showed similar results in predicting hip fractures. However, while classifying groups based on the MPR requires expert knowledge, trajectory models allow for automated group classification. Medication use during pregnancy is common and increasing, but as pregnant women are systematically excluded from clinical trials, data regarding the safety of medication are inadequate [1]. Administrative claims databases are becoming increasingly important for investigating drug utilization and safety during pregnancy. However, the beginning of pregnancy is seldom available in these databases and has to be estimated. Several algorithms have been developed to estimate the beginning of pregnancy in administrative claims databases [2-4], though not yet in Germany. Our aims were to identify and describe existing algorithms applicable to German administrative claims data, to adapt them to the data structure of the German Pharmacoepidemiological Research Database (GePaRD), and to compare and evaluate their results. A comprehensive literature review yielded 16 algorithms that could be classified into three different categories. From each class, one algorithm was selected, adapted and implemented in GePaRD. To compare the performance and results of the algorithms, the proportion of pregnancies for which the beginning could be estimated as well as the estimated length of the pregnancies will be compared. Plausibility of results will be assessed by review of pregnancy care profiles based on in- and outpatient care as well as additional information such as birthweight. As the last part of the data was only delivered in February, analyses are not finished yet; results and conclusions will be presented at the conference.
Expectation-driven text extraction from medical ultrasound images Reul C 1 , Köberle P 1 , Üçeyler N 1 , Puppe F 2 1 University of Würzburg, Würzburg, Germany; 2 Universität Würzburg, Würzburg, Germany In this study, an expectation-driven approach is proposed to extract data stored as pixel structures in medical ultrasound images. MDD was a significant predictor of mortality after adjusting for age, sex, smoking status, LVEF, medical comorbidities and percutaneous coronary intervention. The severity of depressive symptoms was also a significant predictor of mortality after adjusting for covariates, and this was the case for continuous depression scores from both depression instruments (PHQ and HADS). In this consecutive cohort of German coronary heart disease patients, depressive symptoms as well as a clinical depression diagnosis were significant and independent predictors of all-cause mortality after adjusting for known medical predictors. Findings are discussed with respect to data from the international literature and recent calls for routine depression screening in adults. Seven measures of obesity [including waist-to-hip ratio, waist-to-height ratio, fat mass percentage, hypertriglyceridemic waist (HTGW, a combination of increased waist circumference and elevated serum triglyceride levels), and sarcopenic obesity (a combination of increased body fat and decreased muscle strength)] were assessed at baseline in 2009. HRQoL, assessed with the EuroQol 5 Dimensions questionnaire, and SRPC, assessed with a single question, were collected at baseline and at the 3-year follow-up examination in 2012. Linear and logistic regression were used to examine the longitudinal associations between the measures of obesity and the following three outcomes: change in HRQoL, improvement of SRPC, and deterioration of SRPC. None of the seven measures of obesity was significantly associated with the three outcomes in longitudinal analyses.
These results show that neither simple anthropometric measures nor more complex measurement techniques of obesity at baseline were significantly associated with changes in HRQoL and SRPC over 3 years of follow-up in older adults. A longer follow-up period or consideration of changes in obesity measures between baseline and follow-up is needed in further research. A tool for identification of familial colorectal cancer risk Rieger A 1 , Mansmann U 1 1 Ludwig-Maximilians-Universität München, München, Germany Familial clustering of colorectal cancer (CRC) is found in a fifth to a fourth of all diagnosed cases. Only a small part of familial CRC risk can be explained by specific genetic causes (e.g. ''hereditary nonpolyposis colorectal cancer'' (HNPCC)). For most familial CRC clusters the reasons are not specific; they may be complex genetic and/or lifestyle-based. Simple epidemiological findings show that risk families need a specific screening approach: it is known that the risk for first-degree relatives of a CRC case is doubled compared to the normal population [1]. At the moment, there are instruments such as the Amsterdam and Bethesda criteria to detect CRC risk families. If risk families cannot be detected, it is impossible to offer them appropriate CRC screening. Therefore, a risk score based on family anamnesis is proposed as a potential screening instrument and compared to a simple questionnaire provided by the ''Netzwerk gegen Darmkrebs e.V.'' [2]. Since the genetic background of familial CRC risk is unknown, we apply different models of inheritance (deterministic and random inheritance) and penetrance (different relative risks). These models are used to perform Bayesian analyses based on a familial CRC history and the family tree. The corresponding posteriors are candidates for familial CRC risk scores. Simulation studies are performed to assess the properties of the risk scores.
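The discrimination of such candidate risk scores is judged via ROC curves and their AUCs; a minimal empirical AUC via the Mann-Whitney statistic might look as follows (the posterior scores below are hypothetical):

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC via the Mann-Whitney statistic: the probability
    that a randomly chosen risk family scores higher than a randomly
    chosen non-risk family (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical posterior risk scores for risk vs. non-risk families
risk = [0.92, 0.85, 0.77, 0.95]
non_risk = [0.30, 0.55, 0.42, 0.81]
score_auc = auc(risk, non_risk)
```

Comparing such AUCs across the candidate scores (and against the questionnaire) is what motivates the recommendation to use the simplest genetic model when discrimination is equivalent.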
Data from the study ''Familien schützen und stärken - der Umgang mit dem familiären Darmkrebsrisiko'' are used to calculate ML estimators for the parameters of the models by grid search. The different risk scores (derived from the posteriors of the genetic models of interest) are compared, and ROC curves are calculated. In the simulation study, it was evident that the ability to differentiate between risk and non-risk families is comparable for the scores derived from the different models. The ROC curves based on the posterior risk scores show AUCs above 0.9. The posterior risk scores are therefore more accurate than the questionnaire, which has an AUC of about 0.8. In the application to the real data of the ''family study'', the AUC was also more than 0.9, compared with about 0.9 for the questionnaire. Since the posterior risk scores of the genetic models under study were comparable with respect to their ability to discriminate between CRC risk and non-risk families, it is recommended to use the simplest model to calculate the score of a family based on its familial CRC history. This may provide a strategy to identify CRC risk families and to offer them appropriate CRC screening. The myth on smartphone use and sleep: is it real? Rod N 1 , Dissing N 1 , Clark A 1 , Gerds TA 2 , Lund R 1 1 Department of Public Health, University of Copenhagen, Copenhagen, Denmark; 2 Department of Biostatistics, Copenhagen, Denmark Bip…zzzz…bip…zzz…bip…zz…bip…z…! Being constantly awakened during sleep is a method used in experimental sleep studies to show adverse health consequences of sleep deprivation. The widespread use of smartphones provides an interesting analogy to these experimental sleep studies. Smartphones are easy to carry into bed and offer multiple facilities, e.g. calling, social networking, texting, gaming and internet use, which may disrupt sleep initiation and maintenance.
We aim to comprehensively describe cell phone activity during the nightly sleep span and evaluate how it relates to physical and mental health symptoms. We use data on 800 college students enrolled in the Copenhagen Social Networks Study, which includes continuous monitoring of cell phone data in all enrolled students. High-resolution objective data on the duration and timing of cell phone activity (including calls, texting and social networking) during night time is used as a measure of disrupted sleep. Information on physical and mental symptoms was obtained from smartphone-administered questionnaires. We find substantial cell phone activity during self-reported sleep hours. While cell phone activity in the hours around bed time and awakening is expected, it is striking that more than 10 percent were also found to have cell phone activity in the middle of the nightly sleep period. Cell phone activity during sleep hours may interfere with normal physiological restitution and potentially constitutes a rising public health problem. Serum vitamin D concentration in pregnant women of light and dark skin color in Zurich Rohrmann S 1 , Richard A 1 , Quack Lötscher K 2 1 University of Zurich, Zurich, Switzerland; 2 University Hospital Zurich, Zurich, Switzerland Vitamin D fortification programs have largely eradicated health risks of vitamin D deficiency such as rickets and osteomalacia from western populations. However, vitamin D deficiency (<20 ng/ml) is widespread in industrialized nations. Currently, the vitamin D status of pregnant women living in Switzerland is unknown, although vitamin D insufficiency during pregnancy may impact glucose tolerance and increase the risk of preeclampsia. Therefore, we evaluated vitamin D concentrations in pregnant women and determined the prevalence of vitamin D deficiency.
Due to the stronger pigmentation of the skin, women with dark skin color are more likely to have lower vitamin D concentrations than women with lighter skin color. Hence, we also addressed whether the prevalence of vitamin D deficiency differs between women with lighter or darker skin color. Methods 205 women attending the ''Geburtshilfliche Poliklinik'' at the University Hospital Zurich were recruited for blood sampling between September 2014 and December 2015. Blood was collected at the first regular pregnancy visit. A woman's skin type was self-reported based on the Fitzpatrick Scale (5-point scale), which assesses appearance and reaction to sun exposure. Serum 25(OH) vitamin D concentrations were measured using a chemiluminescence immunoassay; concentrations <20 ng/mL (<50 nmol/L) were defined as vitamin D deficient. Covariates were assessed from hospital records and a questionnaire. Logistic regression was used to examine the odds of being vitamin D deficient depending on skin color. Results 129 women (63.2 %) had serum 25(OH) vitamin D concentrations below 20 ng/mL. Comparing prevalence by skin type, 55 % of women with light skin type (Fitzpatrick I, II and III; n = 152) and 83 % of women with dark skin type (Fitzpatrick IV and V; n = 52) had a vitamin D deficiency. The median serum concentration of 25(OH) vitamin D in all women was 17.1 ng/ml (Q1, Q3: 9.75-22.3 ng/ml). Women with light-colored skin had a median 25(OH) vitamin D concentration of 18.4 ng/ml (Q1, Q3: 10.8-24.0 ng/ml), whereas the concentration in women with dark-colored skin was 12.3 ng/ml (Q1, Q3: 6.55-18.6 ng/ml). After adjusting for age, plasma parathyroid hormone concentration, season of blood collection, vitamin D supplement use, body mass index, smoking status, parity, and education, women with dark-colored skin had higher odds of being vitamin D deficient than women with light skin color (odds ratio = 2.60, 95 % confidence interval 0.98-6.84).
We observed a high prevalence of vitamin D deficiency among pregnant women living in the Zurich area. The prevalence was higher in women with darker skin color; dark-skinned women had a 2.6 times higher risk of being vitamin D deficient than women with lighter skin color after taking potential confounders into account. This calls for more intense counseling to improve vitamin D status during pregnancy, i.e., use of vitamin D supplements during pregnancy, in particular for women with darker skin color. Utilization of preventive care among people with migrant background Rommel A 1 , Frank L 2 , Lampert T 1 1 Robert Koch-Institut, Berlin, Germany; 2 Robert Koch-Institut, Abteilung für Epidemiologie und Gesundheitsmonitoring, Berlin, Germany Because preventive care helps to detect health problems early, universal access is granted by law in many health systems. Thus, utilization should not depend strongly on social characteristics. Supposedly, people with migrant background are not sufficiently acquainted with the health systems of their host countries and therefore make less use of preventive services. However, research on migrants' health is frequently challenged by rather small and biased sub-samples. Moreover, different demographic and socio-economic patterns further impede simple comparisons between migrant and non-migrant populations. In our research we ask to what extent the utilization of preventive care depends on certain migrant characteristics. Furthermore, we try to show that predicted probabilities may alleviate research with migrant sub-samples. The sample of the German Health Interview and Examination Survey for Adults (DEGS1) (2008-2012, n = 7,987) contains 1,107 respondents with migrant background (MB). General health checks (GHC, age 35+, past 2 years), skin cancer screening (SCS, age 35+, past two years) and dental check-ups (DCU, past year) serve as exemplary outcomes.
Apart from sex, age and socio-economic status (SES), logistic regression considers 1st and 2nd migrant generation and length of stay (LOS) for the 1st generation (short LOS: ≤5 years; long LOS: >5 years). Predicted probabilities by LOS in years, resulting from separate models for 1st generation migrants, reveal that the convergence of utilization rates is to be expected only after several decades given current societal conditions. Conclusion Migrant background and LOS strongly determine the utilization of preventive care. Thus, our findings suggest substantive variation in service utilization. Utilization should be increased by regularly informing about existing services, starting early after immigration. Predictive margins allow for adjusted probabilities and intuitive interpretations and avoid the shortcomings of simple descriptive comparisons between non-migrants and migrant sub-populations. Maternal uric acid serum concentrations, renal function at delivery and health-related pregnancy outcomes in neonates: the Ulm SPATZ Health Study Rothenbacher D 1 , Braig S 2 , Müller M 2 , Koenig W 3 , Reister F 4 , Genuneit J 2 1 Ulm University, Institute of Epidemiology and Medical Biometry, Ulm, Germany; 2 Universität Ulm, Ulm, Germany; 3 Department of Internal Medicine II - Cardiology, University Medical Center Ulm, Ulm, Germany; 4 Department of Gynecology and Obstetrics, University Medical Center Ulm, Ulm, Germany Background Many studies in adults have shown that serum uric acid (SUA) plays a role in the risk for many cardiovascular and metabolic diseases including diabetes. During pregnancy, SUA, which is able to pass the placenta, undergoes distinct changes. In early pregnancy, serum levels decrease because of the uricosuric effects of estrogens and the increase in renal blood flow. During the third trimester, SUA increases considerably. SUA has been associated with inhibition of endothelial growth and proliferation and may therefore be associated with fetal growth.
The aim of this analysis was to describe the relationship of maternal SUA with renal function and to investigate its association with various measures of birth weight in neonates. In the Ulm SPATZ Health Study, 934 singleton newborns and their mothers were recruited from the general population during their hospital stay in the University Medical Center Ulm, Germany, between 04/2012 and 05/2013 (overall response 49 %). Demographic data was collected using a self-administered maternal questionnaire during the hospital stay. Clinical data related to the child's delivery was obtained from electronic hospital records. Serum uric acid values were measured photometrically with a Cobas 6000 from Roche. Cystatin C was determined by immunonephelometry on a Behring Nephelometer II (Dade-Behring, Marburg). The association of SUA and cystatin C (both in quartiles) with gestational age measures of the newborn (i.e. length-adjusted birth weight, Ponderal Index (PI = 100 × body weight (g)/body length³ (cm))) was quantified by means of multivariable logistic regression after adjustment for potential confounders. Overall, n = 885 mother-newborn pairs with measurements of SUA and cystatin C were included in the final analysis. Most of the mothers were of German nationality (85 %) and were between 26 and 35 years of age at delivery. Maternal pre-pregnancy BMI was normal (18.5 to <25.0 kg/m²) for 59.8 %. Median maternal SUA levels shortly after delivery were 300.3 µmol/l (interquartile range (IQR) 255.0; 340.0). Median maternal cystatin C serum levels were 0.90 mg/l (IQR 0.77; 1.00). SUA was associated with age, body mass index, and alcohol consumption, as well as with history of hypertension and many other maternal cardiovascular risk markers. Cystatin C was associated with parity. In the multivariable analysis, no clear association of SUA with various birth weight measures was seen in the fully adjusted models.
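The Ponderal Index formula stated above translates directly into code; the example weight and length are illustrative values for a term newborn, not study data.

```python
def ponderal_index(weight_g, length_cm):
    """Ponderal Index as defined in the abstract:
    PI = 100 * body weight (g) / body length^3 (cm)."""
    return 100 * weight_g / length_cm ** 3

# Illustrative values (not study data): a 3,500 g newborn of 50 cm length.
pi = ponderal_index(weight_g=3500, length_cm=50)
```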
However, cystatin C was negatively associated with small-for-gestational-age birth weight (<10th percentile), with an OR in the top vs. bottom quartile of 0.29 (95 % CI 0.13-0.63; p for trend in quartiles 0.009) after adjustment for maternal age at delivery, maternal nationality, maternal education, maternal BMI, smoking during pregnancy, mode of delivery, first parity, newborn's gender, maternal alcohol consumption, maternal diabetes, and hypertension before pregnancy. Contrary to the very few existing studies based on creatinine-defined renal function, we found a clearly reduced risk of a small-for-gestational-age baby with increased maternal cystatin C values in a population with relatively good renal function. SUA was not associated with birthweight outcomes, but it was associated with many maternal cardiovascular risk factors. Personal health data: privacy policy harmonization and global enforcement Ruotsalainen P 1 , Delgado J 2 , Lacroix P 3 , Kurihara Y 4 , Blobel B 5 , Pharow P 6 , Stechova K 7 , Sahama T 8 This workshop is jointly organized by the EFMI Working Groups Security, Safety and Ethics and Personal Portable Devices in cooperation with the IMIA Working Group ''Security in Health Information Systems''. In contemporary healthcare and personal health management, the collection and use of personal health information takes place in different contexts and jurisdictions. Global use of health data is also expanding. The approach taken by different experts, health service providers, data subjects and secondary users in understanding privacy and the privacy expectations others may have is strongly context dependent. To make eHealth, global healthcare, mHealth and personal health management successful and to enable fair secondary use of personal health data, it is necessary to find a practical and functional balance between the privacy expectations of stakeholder groups. The workshop will highlight these privacy concerns by presenting different cases and approaches.
Workshop participants will analyse stakeholder privacy expectations that arise in different real-life contexts such as portable health devices and personal health records, and develop a mechanism to balance them in such a way that global protection of health data and its meaningful use are realized simultaneously. Based on the results of the workshop, initial requirements for a global healthcare information certification framework will be developed. This workshop will identify stakeholder needs, concerns and privacy conflicts regarding PHI under changing contexts, and it will draft a context-specific privacy balancing mechanism for managing conflicts in the use of PHI and a global privacy protection certification framework for PHI. Outcomes of the workshop will be published to initiate wider discussion. Exploring the increased mortality risk for adults with type 2 diabetes compared to adults without diabetes in Germany Röckl S 1 , Heidemann C 1 , Paprott R 1 , Scheidt-Nave C 1 1 Robert Koch-Institut, Berlin, Germany To investigate all-cause mortality risk among adults with type 2 diabetes in Germany compared to adults without diabetes using mortality follow-up data of national health survey participants with successive consideration of established confounders. The German National Health Interview and Examination Survey 1998 (GNHIES98, n = 7124) is a survey representative of the 18-79 year old non-institutionalized German population. Between 2008 and 2011, information on vital status was collected for 98.0 % of the survey participants. A total of 6550 GNHIES98 participants with complete mortality follow-up also had complete information to be categorized at baseline as having known diabetes (self-reported physician-diagnosed diabetes or anti-diabetic medication, n = 330), unknown diabetes (no known diabetes but HbA1c ≥6.5 % (≥48 mmol/mol), n = 245) or no diabetes (neither known nor unknown diabetes, n = 5975).
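The baseline categorization described above (known vs. unknown vs. no diabetes) can be sketched as a small classification function; the function name and signature are ours for illustration, not the study's code.

```python
def diabetes_status(self_reported_diagnosis, antidiabetic_medication, hba1c_percent):
    """Baseline categorisation as described in the abstract:
    'known'   = self-reported physician diagnosis or anti-diabetic medication,
    'unknown' = neither of the above, but HbA1c >= 6.5 % (48 mmol/mol),
    'none'    = neither known nor unknown diabetes."""
    if self_reported_diagnosis or antidiabetic_medication:
        return "known"
    if hba1c_percent >= 6.5:
        return "unknown"
    return "none"
```

For example, a participant without a diagnosis or medication but with HbA1c of 6.7 % would be categorized as having unknown diabetes.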
Hazard ratios (HR) of death from all causes were calculated from Cox proportional hazards models using age as the time scale. In analyses comparing persons with and without total (known or unknown) diabetes, results were successively adjusted for baseline characteristics including sex, socioeconomic factors (education, poverty), cardiovascular risk factors (BMI, sport activity, smoking, hyperlipidemia, hypertension), comorbidities (cardiovascular disease, cancer, estimated glomerular filtration rate <60 mL/min per 1.73 m²) and high sensitivity C-reactive protein (hs-CRP). In additional unadjusted analyses, we stratified by sex or age group at baseline, and, in separate models, compared persons with unknown diabetes and persons with known diabetes each to persons without (unknown or known) diabetes. During a median follow-up of 12.0 years, adjustment for established confounders hardly influenced the approximately 1.6-fold higher hazard of dying from any cause for persons with known or unknown type 2 diabetes compared to persons without diabetes in Germany. The increased mortality risk associated with diabetes seems to be more pronounced among men than women, among persons not aware of their diabetes and especially among younger adults. University of Heidelberg, Heidelberg, Germany Manufacturer information, user experiences and product availability of assistive living technologies are usually not known to citizens or consultation centers. The different knowledge levels concerning the availability of technology show the need for building up a knowledge base.
For this reason the German Federal Ministry of Education and Research started the initiative ''Guiding for technology based living of elderly'', which was built on two pillars: the establishment of 22 municipal technology information centres (below: ''MTICs''), which focus on counselling the elderly and their relatives regarding the use of assistive technologies for longer independent living, and the development of a product knowledge base accumulating existing assistive products. The aim of this contribution is the definition of requirements for the development of knowledge bases for AAL consultations. Methods A variety of structured expert group discussions, individual feedback sessions and process analyses of the consultation interviews were performed in collaboration with the municipal technology information centers. Within those sessions, requirements for the implementation of the knowledge base were predefined and possible solutions for their implementation were discussed. Subsequently, the knowledge base was implemented as an open web platform based on the Semantic MediaWiki framework, with a user interface that enables users to search for products via different access paths. The major requirements, such as a maintainable and easy-to-use structure, were defined and implemented in a web-based knowledge base. The website went live in a field deployment and was used in around 3,700 consultation interviews of the municipal technology information centers over more than a year. Within this field phase, the implementation of the requirements for a knowledge base in the field of AAL consulting was evaluated and further developed. After the roll-out of the website, different evaluation tools were integrated into the consultation process. Website analysis as well as permanent documentation of the consultation interviews were implemented. The knowledge base introduced above was actively used within the project duration of 1.5 years by up to 22 MTICs.
The evaluation shows the additional need for an individual planning tool with decision support for identifying problems in the daily living of the elderly in order to provide support with adequate assistive technologies. Paper vs. online symptom diary: results from the first six months of the LoewenKIDS study Rübsamen N 1,2 , Raupach-Rosin H 1 , Zoch B 1,2 , Dorendorf E 1 , Bittner M 1 , Helmke C 1 , Schlinkmann KM 1,2 , Karch A 1,2 , Mikolajczyk R 1,3 1 Helmholtz Centre for Infection Research (HZI), Braunschweig, Germany; 2 PhD Programme ''Epidemiology'', Braunschweig-Hannover, Germany; 3 Hannover Medical School, Hannover, Germany The LoewenKIDS study is a birth cohort study designed to study infections and the development of immunity in early childhood. A symptom diary for the first six years of life is a central instrument of this study, designed to collect information on acute respiratory and gastrointestinal infections in a prospective manner. Traditionally, symptom diaries were kept on paper, and the data was entered manually into a database. With the increasing use of the internet, online symptom diaries can be used as an alternative mode of data collection. We aimed to compare the paper and online modes in the LoewenKIDS study with respect to completeness (the proportion of days filled in) as well as in terms of reported illness episodes. Methods Participants in the LoewenKIDS study were asked to keep a symptom diary starting at the birth of their child. They could choose between a paper-based and an online mode. Recruitment for the birth cohort started in February 2015 and is ongoing. In January 2016, we extracted symptom diary data for the first six months of life of all children born between February 1st, 2015 and June 30th, 2015. We included data for 35 children: twelve (34 %) families filled in the online diary and 23 families the paper-based diary. In total, the participants filled in the diary on 4,875 of 6,440 (75.7 %) required days.
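A completeness comparison between the two diary modes can be sketched as a two-proportion z-test. The day counts below are hypothetical reconstructions from the abstract's figures (roughly 184 required days per child in the first six months gives about 4,232 paper and 2,208 online days), so they are illustrative rather than the study's exact data.

```python
from math import erfc, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    z = (p1 - p2) / sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal CDF
    return z, p_value

# Reconstructed illustrative counts: paper 83 % of 4,232 days filled in,
# online 62 % of 2,208 days filled in.
z, p = two_proportion_z(x1=3513, n1=4232, x2=1369, n2=2208)
```

With counts of this magnitude, the difference between 83 % and 62 % completeness is far beyond chance, consistent with the p < 0.001 reported for the mode comparison.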
The paper-based diary was filled in on 83.0 % of the required days whereas the online diary was filled in on 62.0 % (p < 0.001). Parents reported 59 episodes of upper respiratory tract infections in the first six months of their child's life (median 2; interquartile range [2, 4]). The number of reported episodes did not differ between modes (p = 0.16). One quarter of these episodes (27.1 %) were accompanied by fever (temperature >38.4 °C); their median length was 5.5 days [IQR 2, 10] and did not differ between modes (p = 0.18). Parents reported medication on 13.2 % of all days (852/6,440). In the starting phase of LoewenKIDS, more parents opted for the paper-based diary, and paper-based diaries were filled out more completely. It is not clear whether the preference for paper-based diaries will continue over time as the children grow. As a consequence of our results, we improved the user-friendliness of the online symptom diary, established regular reminders, and developed an app for smartphones. Rübsamen N 1,2 , Akmatov M 1,3 , Karch A 1,2,4 , Mikolajczyk R 1,5 1 Helmholtz Centre for Infection Research (HZI), Braunschweig, Germany; 2 PhD Programme ''Epidemiology'', Braunschweig-Hannover, Germany; 3 TWINCORE, Centre for Experimental and Clinical Infection Research, Hannover, Germany; 4 German Centre for Infection Research, Hannover-Braunschweig, Germany; 5 Hannover Medical School, Hannover, Germany Longitudinal panel surveys make it possible to study changes over time. Attrition from the panel is a threat because of reduced sample size and the possible introduction of bias if attrition is differential. We established a longitudinal population-based panel, the Hygiene and Behaviour Infectious Diseases Study (HaBIDS), in four regions in Lower Saxony, Germany. In two regions, we invited stratified random samples of individuals aged between 15 and 69 years who could choose between paper-and-pencil and online questionnaires.
In the two other regions, individuals were offered only online participation. Between March 2014 and July 2015, the online participants (n = 1,444) received eleven questionnaires. Reminder emails were sent to participants who did not fill in questionnaires within two weeks. We analysed formal withdrawal and non-usage attrition (participants who did not fill in two or more consecutive online questionnaires and did not return to the panel later on) in the online group among those who filled in the first online questionnaire. We analysed data of 1,285 online participants. Half of these participants (51.5 %) filled in all questionnaires, while 22.4 % missed one or more questionnaires, 19.1 % became non-users, 6.1 % withdrew, and 0.9 % could no longer be contacted by email. Withdrawal was associated with age and frequency of Internet usage, whereas non-usage was associated with age and receiving reminders: older participants were more likely to withdraw than younger ones (hazard ratio (HR) 1.21 [1.02, 1.43] per 10 years of age), but were less likely to become non-users (HR 0.84 [0.76, 0.92] per 10 years). Participants using the Internet daily were less likely to withdraw than less frequent users (HR 0.61 [0.38, 0.96]). Participants who frequently did not fill in questionnaires before the reminders were more likely to become non-users (HR 1.21 [1.16, 1.27] per 10 % of questionnaires not filled in before the reminder). Participants' sex and education were associated with neither withdrawal nor non-usage attrition. Conclusion Sociodemographic characteristics, apart from age, did not play a role in online attrition. Intrinsic motivation (characterised by less need to be reminded) was the most important reason for staying in the panel. Conducting online-only studies might, thus, not be hampered by bias caused by differential attrition.
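Under approximately constant hazards, a hazard ratio like those reported for withdrawal and non-usage can be roughly approximated by a person-time rate ratio; the sketch below uses purely illustrative numbers, not HaBIDS data.

```python
def rate_ratio(events_a, persontime_a, events_b, persontime_b):
    """Crude incidence rate ratio between two groups; with constant hazards
    this approximates the hazard ratio a Cox model would estimate."""
    return (events_a / persontime_a) / (events_b / persontime_b)

# Illustrative numbers only: 80 dropouts over 6,000 person-months in group A
# vs. 500 dropouts over 60,000 person-months in group B.
rr = rate_ratio(events_a=80, persontime_a=6000,
                events_b=500, persontime_b=60000)
```

This crude ratio ignores covariates and time-varying hazards, which is why the abstract's regression-based hazard ratios are preferable for inference.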
Identification of patients with ventilator-associated pneumonia using ClinicalTime, a temporal abstraction and temporal pattern matching system The use of electronic medical records (EMRs) generates a scenario in which routinely collected clinical data can be reused for research purposes. However, the availability of high-quality clinical data is still elusive, mostly due to the extensive use of free text for clinical documentation [1]. Natural Language Processing (NLP) tools have been developed to extract clinical concepts from free text, but developing an NLP engine is a time-consuming task. Electronic medical records are distinctive databases where raw data and higher-level concepts coexist. For example, they contain individual blood pressure measurements as well as the concept of 'hypertension', which is a higher-level abstraction of a set of blood pressure readings. Using that principle, we have developed ClinicalTime, a clinical query system that enables researchers to describe the phenotype of a patient of interest using intervals and instants as well as their temporal relationships. In this abstract, we describe the use of ClinicalTime to identify patients with ventilator-associated pneumonia using the MIMIC II database. Methods MIMIC II is a clinical database containing more than 30,000 anonymized intensive care unit episodes [2]. We first created our gold standard dataset by randomly selecting 60 patient records. Two clinicians blindly annotated these records based on the presence or absence of a ventilator-associated pneumonia (VAP) either in the discharge summary or the ICD-9 diagnostic codes. Disagreements were resolved by consensus. In parallel, we used the same VAP criteria to define a temporal pattern in ClinicalTime. ClinicalTime has a graphical interface where a researcher can define temporal intervals of clinical events, as well as the mathematical and temporal relationships between intervals.
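As a rough sketch of the kind of temporal pattern such a system evaluates, assume (hypothetically) that VAP is flagged when pneumonia onset falls within the mechanical ventilation interval and at least 48 hours after ventilation began; the function name and this simplified criterion are ours for illustration, not ClinicalTime's actual implementation.

```python
from datetime import datetime, timedelta

def matches_vap_pattern(vent_start, vent_end, pneumonia_onset, min_delay_hours=48):
    """True if pneumonia onset lies within the ventilation interval and at
    least `min_delay_hours` after ventilation started (illustrative rule)."""
    return vent_start + timedelta(hours=min_delay_hours) <= pneumonia_onset <= vent_end

# Illustrative episode: ventilation from March 1 to March 7.
vent_start = datetime(2016, 3, 1, 8, 0)
vent_end = datetime(2016, 3, 7, 8, 0)
hit = matches_vap_pattern(vent_start, vent_end, datetime(2016, 3, 4, 12, 0))
miss = matches_vap_pattern(vent_start, vent_end, datetime(2016, 3, 2, 6, 0))
```

Expressing the criterion as interval relationships like this, rather than as free-text extraction, is what lets such queries run directly against structured EMR timestamps.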
The researcher then combines any number of intervals to define a clinical temporal pattern. Using the temporal pattern, ClinicalTime queries structured clinical data, performs a temporal abstraction from the database (in this case MIMIC II), and retrieves patients that match the aforementioned pattern [4]. With the annotated dataset as the gold standard, we computed our system's precision, recall and F-measure. Overall, only 5 % of all patients had a VAP, which is lower than what is reported in the literature, probably due to the high prevalence of cardiovascular surgery patients in the selected database. The system correctly classified 98 % of all patients (accuracy). ClinicalTime demonstrated high accuracy in classifying patients with VAP, with a precision of 100 %, a recall of 66 %, and an F-measure of 0.8. Only one VAP patient was missed by ClinicalTime but, given the low frequency of VAP, this greatly impacted the observed recall. Conclusion ClinicalTime performed well in identifying patients with VAP. This performance was especially good in terms of precision, since we did not observe any false positives. Given the low frequency of VAP in the sample, we need to continue testing ClinicalTime to see whether these results hold true with larger sample sizes. If these results can be further reproduced, ClinicalTime could become a significant addition to the currently available tools to identify patient cohorts in large clinical databases. Impact of a computer system and the encoding staff organization on the encoding of stays and on health institution financial production in France Sarazin M 1,2,3 , El merini A 4,5 , Staccini P 6,7 In France, the medicalization of information systems program (PMSI) is an essential tool for the management, planning and funding of health care. The performance of encoding data inherent to hospital stays has become a major challenge for health institutions. Some studies have highlighted the impact of the organizational set-up on encoding quality.
The aim of this study is to evaluate the impact of staff organization and of the computerized information system on the processing of encoded information. The study was conducted in two health institutions: the DIM department of the Firminy hospital (CH) and the DIM department of Nice University Hospital (CHU) in France. The studied stays took place between January 1st, 2012 and December 31st, 2015. Data considered in the study: the number of features of the computer system used to support the processing of coding, the number of encoding clerks, the information circuit, the financial value of a stay depending on the homogeneous patient group (groupe homogène de malades, GHM) established by the PMSI, disease diagnostic codes of the International Classification of Diseases, 10th revision (ICD-10), and procedure codes of the French classification (CCAM). Statistical analysis: the total and average differences between the financial valuation of stays before and after encoding clerk participation, the encoding ratio precaution/optimization per year, and a description of the nature of changes due to encoding clerk participation. A linear regression analysis was performed to measure the impact of the computer system on the average volume of treated stays per day. Impact of the encoding staff organization: an average of 105,000 stays were processed each year by the CHU and 16,500 by the CH. The number of GHM stays that changed each year with organization optimization increased from 5,929 to 7,737 for CHU stays (+30 %) and from 2,689 to 3,200 for CH stays (+19 %). In the CHU, financial gain related to encoding optimization increased from 6.9 million euros to 8 million euros. Financial loss related to encoding precaution increased from 3.9 million euros to 5.2 million euros. The precaution/optimization ratio was not significantly different (0.57 to 0.61). In the CH, financial gain increased from 990,250 euros to 1.26 million euros and financial loss from 180,000 euros to 206,000 euros, without any ratio difference.
Impact of the computer system: two computer systems were compared. A difference of 6 min on average per encoded stay was found (p < 0.005), with encoding time decreasing as the number of computer system features increased. Both the encoding organization and the computer system can improve encoding performance, increasing the number of validated stays and the quality of encoded data. Dispatching encoding clerks can be considered as a design for smaller hospitals. Computer system performance is necessary to improve the speed of access to encoding data: integration of clinical and surgical reports. Comparing the quality of documenting discharge summaries in paper-based and electronic systems Sarbaz M 1 , Kimiafar K 1 , Sheikhtaheri A 2 1 Mashhad University of Medical Sciences, Mashhad, Iran, Islamic Republic Of; 2 Iran University of Medical Sciences, Tehran, Iran, Islamic Republic Of The quality of health information may result in improved health care quality. Given the roles of medical records and health information for various purposes such as continuous care, education, research and legal issues, the importance of complete and accurate documentation is well recognized. Despite this, previous research indicates low quality of information; however, information technology as a solution may result in better documentation. In this regard, this study aimed to compare the documentation completeness of discharge summaries in manual and electronic systems. We conducted this descriptive study in a teaching hospital with 918 beds, located in north-eastern Iran, in 2014. This hospital is equipped with a hospital information system that can create discharge summaries for patients, which can later be printed and filed in paper medical records; however, providers are allowed to handwrite discharge summaries. We randomly selected 1,500 medical records of patients admitted in spring and summer of 2014 and compared the rate of completeness of all the data elements of the discharge summaries.
Among the selected discharge summaries, 64 % were electronic and 36 % were handwritten. Data elements such as record number, patients' first and family name, admission date, and provider name were completed in 100 % of both handwritten and electronic records. Father's name, room number, bed number, patient occupation, address and phone number were completed in 100 % of electronic records; however, these data were completed in 70, 0, 5, 10 and 0 % of paper-based records, respectively. Test results and final diagnosis were completed in 84 and 97 % of electronic records but in 45 and 70 % of paper ones. Sixty percent of paper records and 91 % of electronic records included data about discharge disposition. Recommendations for taking medications after discharge were documented in 27 % of electronic and 20 % of paper records. These figures for follow-up care were 78 vs. 20 %, respectively. Nutrition information was not documented in any records, paper or electronic. Conclusion Computerized systems, through features such as required data elements, can result in quantitatively better documentation; however, improving the content of records qualitatively requires more advanced features such as algorithms for preventive, detective and corrective measures for quality documentation. In addition, it seems that improving providers' attitudes towards the importance of documentation and educating them is still required. Epidemiology of transport accidents based on the International Statistical Classification of Diseases and Related Health Problems, tenth revision (ICD-10) in a developing country Sarbaz M 1 , Kimiafar K 1 , Khadem Rezaiyn M 1 , Vakili V 1 1 Mashhad University of Medical Sciences, Mashhad, Iran, Islamic Republic Of Transport accidents (TA) are the leading cause of unnatural mortality in the world, and Iran has a high incidence of road accidents. Approximately 1.2 million people die each year on the roads around the world.
In 2009, more than 800,000 traffic accidents occurred in Iran, in which more than 23,000 people were killed and 270,000 injured. The aim of the study was to investigate the epidemiology of transport accidents, based on ICD-10, in patients referred to the trauma departments of academic hospitals affiliated with Mashhad University of Medical Sciences in northeast Iran. This retrospective cross-sectional study was performed from March 20, 2013 to March 20, 2014. The study population included all records of inpatients admitted due to transport accidents (9,162 cases) to three specialized trauma hospitals. This study comprised only hospitalized patients and did not include outpatients referred to ambulatory departments. The twentieth chapter of ICD-10 covers external causes of morbidity and mortality. In the present study, based on the ICD-10, all TA codes were categorized according to the code blocks (V01-V99). The majority of people in transport accidents were men (75 %). Most transport accidents involved motorcycle riders (39.36 %), car occupants (26.21 %) and pedestrians (24.82 %), respectively. The highest mortality of transport accidents was related to motorcyclists (37.4 %) and pedestrians (32.2 %). The majority of victims among motorcyclists (22.03 %), car occupants (11.61 %) and pedestrians (17.93 %) were injured in collisions with a car, pick-up truck or van. Most accidents occurred in summer (33.2 %) and spring (26 %). The majority of accidents occurred between 6 pm and 8 pm, and fewer accidents occurred in the early hours of the day. As the present study showed, motorcyclist mortality was high and these accidents mostly involved young people; driving inexperience, unauthorized speed and a tendency to drive fast may be causes of these accidents, so officials should apply stricter requirements when issuing motorcycle driving licenses.
In low-income countries, inexpensive vehicles such as motorcycles are widely used; these offer little protection and are involved in dangerous road accidents. Official holidays in Iran fall in summer and spring, and much travel occurs in these seasons. Because Mashhad is a holy and religious city, many passengers and tourists travel there, so that the city's population in these seasons becomes three times its usual size; these factors may explain the higher number of accidents in these seasons. Policy makers should pay more attention to high-risk groups such as pedestrians and motorcyclists. Background and objective Hypertension is globally identified as the leading risk factor for morbidity and mortality. Effective blood pressure (BP) control reduces the risk of stroke, coronary heart disease and congestive heart failure. However, international data suggest that more than a third of treated hypertensives are uncontrolled. The purpose of this study is to estimate the prevalence and associated factors of untreated, uncontrolled and apparent-resistant hypertension (RH) in Germany 2008-2011. Methods Data from the German Health Examination Survey (DEGS1 2008-11, n = 7,115, age 18-79 years), consisting of a nationwide two-stage clustered sample from local population registers including standardized BP measurements and detailed recording of taken medications using the Anatomical Therapeutic Chemical (ATC) codes, were analyzed. Hypertension was defined as BP ≥140/90 mmHg (≥140/85 mmHg in diabetics) or antihypertensive medication in individuals with physician-diagnosed high BP. Based on the European Society of Hypertension (ESH) criteria, apparent-RH was defined as BP ≥140/90 mmHg (≥140/85 mmHg in diabetics) under treatment with ≥3 different classes of antihypertensive agents including a diuretic. In Germany 2008-2011, the prevalence of hypertension was 31.8 % and the proportion of awareness was 82.1 %.
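The BP control definitions above can be sketched as a classification function. This is a simplified illustration with names of our choosing: treatment status beyond the drug-class count is not modelled, and untreated uncontrolled hypertension is not distinguished here.

```python
def classify_bp_control(sbp, dbp, diabetic, n_drug_classes, on_diuretic):
    """Simplified classification following the ESH-based definitions in the
    abstract: uncontrolled = BP >= 140/90 mmHg (>= 140/85 in diabetics);
    apparent-resistant = uncontrolled on >= 3 drug classes incl. a diuretic."""
    dbp_cut = 85 if diabetic else 90
    uncontrolled = sbp >= 140 or dbp >= dbp_cut
    if not uncontrolled:
        return "controlled"
    if n_drug_classes >= 3 and on_diuretic:
        return "apparent-resistant"
    return "uncontrolled"
```

For example, a non-diabetic at 150/80 mmHg on three drug classes including a diuretic would be classified as apparent-resistant, while a diabetic at 138/86 mmHg on one drug would count as uncontrolled because of the lower diastolic cut-off.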
Among aware hypertensives, 37.9 % were uncontrolled, and among these 33.4 % were untreated. Being aware, untreated and uncontrolled was independently associated with male sex, young age, not having cardiovascular disease (CVD), not performing BP self-measurement, not being obese and not being a smoker. Treated aware hypertensives were uncontrolled in 28.9 % of cases (20.2 % uncontrolled with 1-2 drugs, 0.7 % uncontrolled with ≥ 3 drugs but no diuretic, and 6.8 % apparent RH). Apparent RH was positively associated with having diabetes mellitus as a comorbidity. Only 6.9 % of adults with apparent RH were treated with potassium-sparing agents. More than one third of aware hypertensive adults in Germany 2008-2011 had uncontrolled BP. Among uncontrolled hypertensives, a third were untreated. Not smoking, not being obese and not having CVD were independent risk factors for being uncontrolled and untreated. It seems that not having these ''obvious risk factors'' has itself become a risk factor for remaining untreated and uncontrolled. Sauerbrei W 1 1 IMBI, Universitätsklinikum Freiburg, Freiburg, Germany The validity and practical utility of observational medical research depend critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many recent methodological developments are ignored in practice. Consequently, the design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software.
Consequently, analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers keep up with recent methodological developments is to develop guidance documents that are disseminated to the research community at large. These observations led to the initiation of the international initiative for strengthening analytical thinking for observational studies (STRATOS), a large collaboration of experts in many different areas of biostatistical research. The overarching objective of STRATOS is to provide accessible and accurate guidance for the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests (Sauerbrei et al., Statistics in Medicine 2014, 5413-5432). The initiative involves more than 80 members from 15 countries (http://www.stratos-initiative.org/), who work in nine topic groups (TGs) focusing on general issues such as missing data, measurement error, survival analysis, and causal inference. We will show a potential way of working towards the STRATOS aims by introducing the work of topic group 2 (TG2), 'Selection of variables and functional forms in multivariable analysis', as an example. We will show the necessity of agreeing across the TGs on the use of a 'common language', which implies work on a glossary of terms, some broader rules for reviewing the literature, more specific rules for STRATOS publications, and more. TG2 needs to review the literature concerning the current use of methods for variable and function selection in health research, and to provide an overview and assessment of the advantages and disadvantages of the large number of strategies for variable selection and for deriving functional forms for continuous variables.
To assess and compare the methods of interest for TG2, as well as for all other TGs, simulation studies are a suitable approach for deriving objective evidence. Obviously, some general rules for designing and conducting meaningful simulation studies are also needed. Consequences of IT-blackouts in the area of fully digital documentation in hospitals Sax U 1 , Lipprandt M 2 , Röhrig R 2 1 Universitätsmedizin Göttingen, Göttingen, Germany; 2 Carl von Ossietzky University, Oldenburg, Germany Background Efficient health care processes are highly dependent on IT infrastructure [1] [2] [3] . The requirements for availability and accuracy of the involved applications are very high. Traditional risk assessment tries to reduce the risk of IT downtimes, but the likelihood of an IT failure is still not zero. Although availability or performance problems may slow down or hinder health care processes, they can be circumvented by alternative workflows. Problems with data integrity may be much worse, as much time may be lost before the problem is discovered, reported and fixed [4, 5] . This oral presentation aims to give an overview of the consequences of IT blackouts in the area of fully digital documentation. We categorize the most important types of IT system failures, assess the usefulness of the countermeasures, and argue that most risk management approaches fall short on service continuity. In a thorough literature analysis we sifted through the IT risk management literature [6] [7] [8] [9] [10] [11] [12] , the recommendations of the federal office of information security, and the Critical Incident Reporting Service (CIRS) databases in order to assess the reported IT-related incidents [13] [14] [15] . We found sound and valid recommendations for reducing the risk of IT failure [16] [17] [18] [19] [20] [21] [22] . Classically, security can be improved by redundancy of data, locations and staff, together with a proper set of well-trained SOPs.
Few experiences, though, are documented for the case of loss of data integrity. CIRS reports and the personal experience of the authors and their colleagues show the severity of these cases and the urgent need for open communication about IT-related problems and their solutions. Very few of the papers address the most critical clean-up phase after an incident. Procedures to recover, check and release the system back to the users are extremely demanding for the management involved, as they may have to arbitrate between patient safety and economic aspects. IT blackouts and IT disasters should be reported, analyzed and critically discussed [23] [24] [25] [26] in order to develop sustainable strategies to ensure process stability in the case of IT failures in medical processes. This requires a change from a culture of shame and blame to an error culture and a culture of safety for healthcare IT [27, 28] , as is under way in other areas of medicine. Disclaimer: Some aspects of this article and the literature review have been submitted in a full paper version to another journal. Implications of the Swedish National Regulatory Framework for the patient accessible electronic health record Scandurra I 1 , Lyttkens L 2 , Eklund B 2 1 Örebro University, Örebro, Sweden; 2 Uppsala County Council, Uppsala, Sweden Online access to one's own electronic health record is a controversial issue. In one Swedish county such an eHealth service has been in operation since 2012, and it is now being widely deployed in the other counties. This first review presents work regarding the current National Regulatory Framework (NRF) related to the public eHealth service Patient Accessible Electronic Health Record (PAEHR) and points out how optional rules have been applied in different counties. Results Potential implications of the different decisions made are discussed in terms of patient centricity and health information outcome.
In the current PAEHR, care providers have made different assessments of how to apply the NRF. For patients, this means that information gathered from the health record may be displayed differently depending on where, when and why they sought treatment. When a patient visits different care providers, such a solution may cause confusion and its purpose may be lost. Consequently, a revised NRF with fewer optional rules is recommended, as well as joint adherence to the next NRF by all county councils. Visualization of large ontologies in university education from a tool point of view Schaaf M 1 , Jahn F 1 , Tahar K 1 , Kuecherer C 2 , Winter A 1 , Paech B 2 1 Institut für Medizinische Informatik, Statistik und Epidemiologie, Universität Leipzig, Leipzig, Germany; 2 Universität Heidelberg, Heidelberg, Germany The realization of ontology visualization requirements in university education is a challenging task and should be supported by appropriate tools. This applies in particular if the ontology to be visualized is based on a large text corpus comprising a huge number of concepts, relations and annotations. In SNIK, we developed such an ontology of information management in hospitals in order to support the transfer of knowledge in university education. The challenge is to identify tools and methods capable of supporting ontology visualization and use as efficiently as possible. Related research fields (e.g. bioinformatics) are confronted with similar visualization problems, and the tools and methods used there could provide suitable solutions in our research field. In total, we assessed eight tools for the visualization of large ontologies to evaluate their suitability for representing knowledge in the field of medical informatics.
Participatory heuristic evaluation leads to extensive changes in the functionalities of the eWALL telehealth system Schaarup C 1 , Hangaard S 1 , Hejlesen O 1 1 Aalborg University, Aalborg, Denmark To cope with the growing number of elderly people suffering from chronic diseases, telehealth systems have become relevant as new ways of delivering health care. The telehealth system eWALL is a 46-inch interactive wall designed to assist people suffering from chronic obstructive pulmonary disease, mild cognitive impairment, and age-related impairment in their activities of daily living. A participatory heuristic evaluation (PHE) was performed on the first version of the eWALL telehealth system. The PHE led to extensive changes in the functionalities of the system. In this study, we compared, described and presented the functionality changes in the eWALL telehealth system. Irrelevant functionalities have been removed, new functionalities have been added, and existing functionalities have been changed. The result is a more user-friendly interface with relevant functionalities that address the end users' needs. However, further rounds of testing of the newly added functionalities are needed, and it must be explored whether potential end users want to use the eWALL telehealth system in their daily lives. Post-implementation study of a nursing e-chart: how nurses use their time Schachner B 1 , González Z 1 , Recondo F 1 , Sommer J 1 , Luna D 1 , Garcia G 1 , Benitez S 1 1 Nursing documentation is a significant component of the electronic health record; nevertheless, integrating a new chart into nursing activities requires multiple strategies to ensure adherence. The current literature demonstrates that nurses spend part of their time performing activities not related to direct patient care, some of which do not even fall under their purview. It is therefore important to quantify the effect that a new system could have on the proportion of time dedicated to documentation.
The objective of this work was to determine the time dedicated to different activities, including those related to electronic documentation, after the implementation of a renewed nursing chart in the Electronic Health Record at Hospital Italiano de Buenos Aires. An observational, cross-sectional work sampling study was performed. During the study 2396 observations were made in 3 wards. Nurses' activities comprised 36.09 % direct care, 28.9 % indirect care, 0.67 % support tasks, 22.99 % tasks not related to patients, 11.32 % personal activities, and 17.43 % documenting in the EHR. Comparison with the previous study shows that indirect care activities decreased by 12.28 % and tasks not related to patients increased by 11.85 %. The results demonstrate that the new nursing e-chart did not increase documentation time. Results of the international workshop 'From Global Burden of Disease Studies to National Burden of Disease Surveillance' Scheidt-Nave C 1 , Ziese T 1 , Fuchs J 1 , Kraywinkel K 1 , von der Lippe E 1 , Eckmanns T 1 , Haller S 1 , Buchholz U 1 , Plaß D 2 1 Robert Koch Institute, Berlin, Germany; 2 German Environment Agency, Berlin, Germany Health systems all over the world need to adapt to the new challenges that result from sociodemographic, epidemiologic and environmental changes. The challenges arise from infectious diseases and antimicrobial/antibiotic resistance, but also from a vastly increasing burden of non-communicable and age-related health conditions. The Global Burden of Disease (GBD) framework was first introduced in the early 1990s with the aim of permitting a first global, comprehensive and comparable evaluation of population health [1] . With the new update of the GBD, experts at the Institute for Health Metrics and Evaluation (IHME) in Seattle have refined the methodology to permit burden of disease analyses that are comparable over time.
Increasingly, individual countries have not only joined the GBD network but also initiated national burden of disease studies adapted to the specific public health challenges in their countries [2] [3] [4] . So far, Germany has only participated through individual experts or by providing national health survey data [5] . The international workshop ''From Global Burden of Disease Studies to National Burden of Disease Surveillance'' aimed to enhance cooperation in public health research and the exchange of information between GBD researchers and public and environmental health institutes in Germany. The workshop was organized by the Robert Koch Institute (RKI) and the German Environment Agency (UBA) in cooperation with the German Society for Epidemiology (DGEpi) e.V. and Bielefeld University. The framework, methodology and results of recent GBD analyses were presented by T. Achoki and M. Forouzanfar (IHME) and J. Schmidt (Public Health England). C.E. Stein (WHO) reflected on critical aspects of the methodology and the need for adjustments to country-specific challenges. Researchers from the RKI presented analyses from non-communicable disease surveillance in Germany, including public health and data challenges, and different approaches for estimating the burden of communicable diseases using the examples of healthcare-associated infections and influenza. D. Plass (UBA) described environmental burden of disease assessments for Germany. The workshop provided deeper insight into the concepts and methodology of the GBD framework and clearly demonstrated the potential of this approach for country-specific public health research and evaluation, not only at the national but also at smaller geographical levels. The workshop stimulated ideas for improving data collection systems and the use of already available data sources.
Setting up a continuous research network turned out to be a central issue for strengthening collaboration and jointly using the methodology to identify important health changes and intervention effects. Cooperation will be essential to further improve analysis methods and data transparency. Conclusions National burden of disease surveillance is considered not only a mission but a dynamic process of great value for public health in Germany. The following next steps were envisaged: set up a national burden of disease research network between IHME and national institutes, and build up a training program to strengthen the knowledge of country experts. It is also intended to foster research in the fields of disability weights, co-/multimorbidity and frailty indicators. Further, regional differences and social disparities should also be covered. Patients with sepsis experience a life-threatening organ dysfunction caused by a dysregulated host response to infection [1] . Recent estimates of highly variable population incidence rates from high-income countries ranged between 148 and 288 hospital-treated sepsis cases per 100,000 person-years, depending on disease severity [2] . For survivors, however, sepsis is not over after intensive care therapy, and a wide range of sepsis sequelae has been reported. Sepsis has therefore been called a ''hidden'' healthcare disaster [3] . While these adverse and long-lasting post-discharge outcomes are increasingly acknowledged, we lack a comprehensive assessment of the nature, extent, and impact of sepsis sequelae. A systematic review [4] that summarized the available evidence until 02/2009 demonstrated that studies mainly focus on long-term mortality and quality of life. When we recently updated this systematic review, we came to the same conclusions.
Consequently, we designed the Mid-German Sepsis Cohort (MSC), a patient cohort, to better and more completely quantify mid- and long-term functional disabilities resulting from sepsis. We focus on ICU-treated patients with sepsis. Recruitment will be based on the ICUs of six large hospitals in Thuringia and Saxony that are experienced in study conduct. After initiation in 03/2016, we aim to include 3,000 patients over a period of 3 years to obtain a sufficiently large sample of survivors discharged from the ICU, in order to assess the nature, extent, impact and time course of physical, mental, and emotional outcomes over a period of up to 5 years. Here we present details on the medical background and study design, including a summary of the operationalized outcomes measured by the MSC Investigator Group. Therefore, the objective was to examine changes in the distribution of mean BMI of German adults between 1990 and 2011. We used data from 25-69 year old participants of three German National Health Interview and Examination Surveys conducted in 1990-1992 (German Cardiovascular Prevention Study and Survey East, GCP/SE, n = 7466), 1997-1999 (GNHIES98, n = 5825) and 2008-2011 (DEGS1, n = 5375) and examined trends in mean BMI and in different BMI categories. Body weight was measured to the nearest 0.1 kg and body height to the nearest 0.1 cm. Body mass index (BMI) was calculated as body weight (kg) divided by body height squared (m²). Categories were defined according to WHO criteria as normal weight (BMI 18.5 to < 25.0 kg/m²), overweight (BMI 25.0 to < 30.0 kg/m²), obesity (BMI ≥ 30.0 kg/m²), obesity class I (BMI 30.0 to < 35.0 kg/m²), class II (BMI 35.0 to < 40.0 kg/m²), and class III (BMI ≥ 40.0 kg/m²). Means and prevalence estimates from all three surveys were standardized to the German population as of Dec 31, 2010 using SAS 9.4 survey procedures for complex sample designs, overall and within strata of sex and age groups.
Mean BMI increased significantly from 26.7 kg/m² (95 % CI 26.6-26.9) in 1990-1992 to 27.0 kg/m² (26.8-27.2) in 1997-1999 (p = 0.03). Mean BMI was unchanged between 1997-1999 and 2008-2011. Among men, mean BMI was higher in 2008-11 (27.5 kg/m², 27.3-27.7) than in 1990-1992 (27.0 kg/m², 26.8-27.1) (p = 0.0005), especially in the age groups 25-34 years (1990-1992: 25.5 (25.2-25.8), 2008-11: 26.1 (25.6-26.7); p = 0.03) and 55-69 years (1990-1992: 27.7 (27.5-28.0), 2008-11: 28.5 (28.1-28.9); p = 0.0004), while there was no overall change among women (p = 0.77). The rightward shift in the distribution is more pronounced for men than for women. From 1990-1992 to 2008-11 the prevalence of overweight decreased slightly but not significantly, from 50.6 (48.7-52.6) to 46.1 % (43.7-48.5) among men and from 33.0 (31.3-34.7) to 28.5 % (26.5-30.6) among women, whereas the prevalence of normal weight remained stable among men and women. During this period the prevalence of obesity increased from 18.9 (17.2-20.8) to 24.5 % (22.1-27.0) among men (p = 0.0002) and from 21.6 (19.8-23.4) to 23.0 % (20.9-25.3) among women (p = n.s.). Among men and women with obesity, the prevalence of obesity class I decreased from 85.0 (81.3-88.0) to 77.7 % (73.8-81.2) among men (p = 0.005) and from 69.6 (65.9-72.9) to 65.0 % (60.3-69.4) among women (p = n.s.), whereas among men the prevalence of obesity class II increased significantly from 11.7 (9.0-15.1) to 16.8 % (13.3-20.9) (p = 0.04) and among women the prevalence of obesity class III increased from 8.6 (6.7-10.9) to 12.6 % (9.8-16.0) (p = 0.02). Between 1990 and 2011, mean BMI among adults in Germany increased in men, especially in the younger and older age groups. This has resulted in an increased prevalence of obesity, particularly of the most severe level of obesity. To ensure that obesity prevention is targeted appropriately, it is important to monitor changes in the different BMI classes.
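The BMI computation and WHO cut-offs used in the abstract above can be expressed as a short sketch (illustrative code only; the survey analyses themselves were run with SAS survey procedures):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: body weight (kg) divided by body height squared (m^2)."""
    return weight_kg / height_m ** 2

def who_category(value: float) -> str:
    """WHO BMI categories as listed in the abstract."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal weight"
    if value < 30.0:
        return "overweight"
    if value < 35.0:
        return "obesity class I"
    if value < 40.0:
        return "obesity class II"
    return "obesity class III"
```

For example, the mean BMI of 27.0 kg/m² reported for 1997-1999 falls into the overweight category, while the class II and class III bands correspond to the "most severe" obesity levels whose prevalence increased.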
Transmission of respiratory and gastrointestinal infections in German households containing children Schlinkmann KM 1,2 , Bakuli A 1,2 , Dreesman J 3 , Mikolajczyk R 4 1 Helmholtz-Zentrum für Infektionsforschung, Braunschweig, Germany; 2 PhD program ''Epidemiology'' Braunschweig-Hannover, Braunschweig-Hannover, Germany; 3 Niedersächsisches Landesgesundheitsamt, Hannover, Germany; 4 Helmholtz-Zentrum für Infektionsforschung (HZI), Braunschweig, Germany Background Transmission of acute respiratory infections (ARI) and acute gastrointestinal infections (AGI) occurs not only at the community level but also at the household level. The aim of this study was to identify episodes of respiratory and gastrointestinal infections and the potential transmission of these infections in German households with children attending daycare, using a study design that enables family members to gather these data autonomously. We conducted a four-month prospective cohort study in the winter period 2014/2015. Recruitment proceeded in 75 of 151 daycare centers (DCCs) in Braunschweig, Lower Saxony, Germany, including children aged 0-6 years. One study arm focused on the transmission of respiratory and gastrointestinal infections within families; in this ''transmission'' module every household member was included. Participants kept a health diary for the whole study period and, in case of respiratory infections, took self-administered nasal swabs on the first day with symptoms for pathogen identification. We estimated the number of episodes and the potential transmission of ARIs and AGIs based on different assumptions. Altogether, there were 558 episodes of ARI with a mean duration of 7.7 days. For AGI, there were 146 episodes with a mean duration of 2.4 days.
Incidence rates varied by calendar month for ARI (lowest 0.35 in March, highest 0.78 in November, per person-month) and AGI (lowest 0.07 in March, highest 0.19 in December, per person-month) and were highest among children in the second year of life, with a steady decrease with increasing age thereafter. The median number of within-household transmissions of ARI was 4 (IQR 2;7) for 4-person households, 2 (IQR 1;3) for 3-person households and 1 (IQR 0;3) for 5-person households. Altogether, 353 nasal swabs were sent in for pathogen identification, and virus detection was successful in 39 %. When considering only symptoms, 256 potential ARI transmissions were observed. Among them, in 177 cases at least one swab was missing. The same pathogen was identified in 15 potential transmission events and different pathogens were identified in 8 potential transmission events. Pathogen identification was unsuccessful for both swabs in 31 transmission events and for one swab in 25 events. For AGI, only 3 % of the households had at least one potential transmission. Parents were able to collect the required data and samples autonomously. Symptoms observed in close time sequence in one household do not necessarily indicate an infection caused by the same pathogen. Schmid M 1 , Perperoglou A 2 To deal with complex functional forms of continuous variables, statisticians have developed a variety of spline methods. This project investigates the options available in R for building multivariable regression models with splines, compares different approaches and provides practical guidance on available software. For R packages that implement spline bases, we identified the types of splines available, whether a user can define the degrees of freedom, the number and position of knots, and whether there are methods for determining the smoothing parameters.
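As a language-agnostic illustration of what such a spline basis looks like, here is a minimal truncated power basis (a hypothetical sketch for exposition only; it is not drawn from any of the R packages under review, which typically use numerically more stable B-spline bases):

```python
def truncated_power_basis(x: float, knots: list, degree: int = 3) -> list:
    """Evaluate the truncated power basis at a point x.

    Columns: 1, x, ..., x^degree, followed by (x - k)_+^degree for each
    knot k. A linear regression on these columns fits a piecewise
    polynomial of the given degree that is smooth at the knots.
    """
    cols = [x ** d for d in range(degree + 1)]
    cols += [max(x - k, 0.0) ** degree for k in knots]
    return cols
```

With degree 1 and knots at 1 and 3, the basis at x = 2 is [1, 2, 1, 0]; the number of columns, degree + 1 + number of knots, is the degrees of freedom of the fit before any penalization, which is exactly the kind of user-settable quantity the package review catalogues.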
For R packages that fit regression models, we looked into the types of regression models supported, whether the package includes criteria for the significance of a non-linear effect, and whether there are graphical tools to visualize spline estimates. In this presentation we will present our first findings. We will show which packages are most used, what their interdependencies are, and what their basic features are. Furthermore, we will discuss the framework for further research, in which we will evaluate the quality and performance of the packages with the aim of providing detailed guidelines for applied researchers. SQUARE 2 - a web-based data monitoring tool for epidemiological and clinical studies Schmidt CO 1 , Albers M 2 , Krabbe C 3 , Radke D 4 , Henke J 5 1 Universität Greifswald, Greifswald, Germany; 2 University Medicine Greifswald, Greifswald, Germany; 3 Institut für Community Medicine, Greifswald, Germany; 4 Institute for Community Medicine, University Medicine Greifswald, Greifswald, Germany; 5 University of Greifswald, Greifswald, Germany Obtaining high data quality is pivotal in epidemiological and clinical studies. For this purpose, functional data monitoring is indispensable, providing an overview of data quality characteristics such as missing values, extreme values, and various measures of reliability and validity. However, efficiently monitoring data over the course of a study with a wide scope of examinations, multiple examination centers and thousands of study variables requires versatile IT tools that meet changing demands regarding the content, structure and layout of quality reports. We therefore developed a web-based data quality monitoring tool for the Study of Health in Pomerania (SHIP), a large population-based epidemiologic study. Methods Square 2 was designed as a Java EE web application. It is built upon freely available libraries such as JavaServer Faces (MyFaces, BootsFaces) and Rserve/Rengine for the direct use of an R server from Java.
Additionally, we used unit testing libraries (JUnit) and self-developed libraries for data persistence (ShipDBM) and encryption (pwencrypt). Square 2 is deployed in Tomcat 8. All data are stored in a PostgreSQL database (9.5). R was selected as the statistics interface because of its wide and growing scope of packages and because it is open source software. Square 2 allows direct PostgreSQL access from R, and the results of statistical analyses are administered in the database. Square 2 reports are presently generated as PDF files (built using LaTeX) and will be available in a web format in a future release. A GUI manages all phases of data management, the upload of statistical functions and the generation of reports. Square 2 is a platform for versatile web-based online data monitoring that can be adapted to its users' changing demands. It is presently used within the Study of Health in Pomerania (SHIP) and can be employed by non-statisticians to produce extensive quality reports. SQUARE 2 comprises an extensive user rights and roles concept, making it usable in large-scale studies with many different roles such as PIs, senior and junior quality officers, statisticians, examiners, and database managers. SQUARE 2 is modular in its design and can be extended to meet growing demands. Statisticians may extend the scope of statistical functions to improve quality reports without in-depth knowledge of the inner workings of SQUARE 2 . The availability of powerful data quality monitoring software such as SQUARE 2 supports high quality standards in large-scale epidemiological studies, saves cost and time through automated reporting, and may contribute to the further development of quality standards across studies. Explaining medical exceptions for dialysis patients Schmidt R 1 1 Universität Rostock, Rostock, Germany In medical practice, and in knowledge-based systems too, it is necessary to consider exceptions and to deal with them appropriately.
In medical studies and in research, exceptions shall be explained. We present a system that helps to explain cases that do not fit a theoretical hypothesis. Our starting points are situations where neither a well-developed theory nor reliable knowledge nor a proper case base is available; there are just some theoretical hypotheses and a set of measurements. Case-Based Reasoning is used to explain those cases that do not fit the model. The case base has to be set up incrementally; it contains the exceptional cases, and their explanations are the solutions, which can be used to help explain further exceptional cases. The system is demonstrated for dialysis patients. Finding biomarkers for diabetes Schmidt R 1 1 Universität Rostock, Rostock, Germany The aim of our project is to find biomarkers that indicate a future development of diabetes. In particular, we wish to calculate relative risk coefficients for the development of overt diabetes at different time points of life. So far, we have applied decision trees and other classification algorithms such as random forests and support vector machines to rather small data sets. Furthermore, feature selection was used to support the genes and biomarkers generated by the decision trees. Planning and analyzing adaptive designs with test statistics derived from stochastic differential equations Schmidt R 1 , Faldum A 1 Background Many common test statistics used in clinical trials may be viewed as finite-dimensional distributions of a Brownian motion. Brownian motion is the prototype of a diffusion process with independent increments. Whenever the independent-increment structure holds, the standard approach to adaptive designs based on p-clud p-values applies. In general, however, diffusion processes do not have independent increments; then control of the type I error rate is in general no longer guaranteed with traditional adaptive designs.
This requires adaptive designs with dependent p-values. Diffusion processes naturally arise as solutions to stochastic differential equations. Stochastic differential equations play an important role in modeling phenomena in fields as diverse as finance, physics, and biology, but are of increasing importance in medicine as well. We provide a methodology for adaptive designs with test statistics derived from stochastic differential equations. In principle, adaptive designs with dependent p-values can be handled using the conditional rejection probability approach. The latter, however, is primarily a meta-concept: computation of the conditional rejection probabilities involved is difficult in practice. To the best of our knowledge, no general concept for their computation has been described in the literature for the case of dependent p-values. We fill this gap in the context of adaptive designs with test statistics derived from stochastic differential equations. For this purpose, we use the Markov property of the solutions of the stochastic differential equations involved. The procedure is illustrated using the example of growth models in the context of personalized medicine. The required conditional rejection probabilities may be obtained as solutions to partial differential equations. Planning and analyzing adaptive designs may thus be reduced to solving (a system of) partial differential equations. We provide a general methodology for planning and analyzing adaptive designs with stage-wise test statistics derived from stochastic differential equations. For a large variety of models, we show that the planning of adaptive designs may be reduced to solving the eigenvalue problem for a self-adjoint linear differential operator. This yields an alternative approach to planning and analyzing adaptive designs.
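For the Brownian-motion special case mentioned in the Background, the conditional rejection probability has a simple closed form (a standard textbook formulation given here for orientation, not a formula quoted from the authors): with standard Brownian motion $W(t)$ under the null hypothesis, interim information time $t_1 < 1$, observed value $w_1$, and final critical value $c$,

```latex
\varepsilon(t_1)
  = P_{H_0}\!\left( W(1) > c \,\middle|\, W(t_1) = w_1 \right)
  = 1 - \Phi\!\left( \frac{c - w_1}{\sqrt{1 - t_1}} \right),
```

since $W(1) - W(t_1)$ is normally distributed with mean $0$ and variance $1 - t_1$, independent of the path up to $t_1$. It is exactly this independent-increment property that fails for general diffusions, which is why the authors resort to the Markov property and partial differential equations instead.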
Wearable technology in medicine: machine-to-machine (M2M) communication in distributed systems Schmucker M 1 , Yildirim K 1 , Igel C 2 , Haag M 1 1 Hochschule Heilbronn, GECKO Institute for Medicine, Informatics & Economics, Heilbronn, Germany; 2 DFKI-Deutsches Forschungszentrum für Künstliche Intelligenz GmbH, Berlin, Germany Smart wearables are capable of supporting physicians during various processes in medical emergencies. Nevertheless, it is almost impossible to operate several computers without neglecting a patient's treatment. Thus, it is necessary to set up a distributed network consisting of two or more computers to exchange data or initiate remote procedure calls (RPC). Without flawless connections between those devices, it is neither possible to transfer medically relevant data to the most suitable device nor to control one device with another. This paper shows how wearables can be paired and what problems occur when trying to pair several wearables. Furthermore, we describe which scenarios are possible in the context of emergency medicine/paramedicine. Evaluating therapeutic medical devices: recommendations from an EUnetHTA guideline Schnell-Inderst P 1 , Hunger T 1 , Sauerland S 2 , Perleth M 3 , Fujita-Rohwerder N 2 , Zens Y 2 , Gutierrez Ibarluzea I 4 , Siebert U 5,6,7 Background Health technology assessment (HTA) of medical devices (MD) may present specific challenges that need to be considered compared to pharmaceuticals. We aimed to identify areas where specific HTA methods for clinical effectiveness assessment may be required and to propose best-practice solutions for these problems. On the basis of a targeted literature review, we identified issues and methods that are specific or particularly relevant for MD assessment and derived recommendations for different areas of the evaluation of clinical effectiveness.
The guideline development process also included two commenting periods in order to obtain feedback from all European HTA agencies, stakeholders such as the MD industry, and the general public. Standard HTA methodology for finding, selecting, analyzing, synthesizing and interpreting evidence on clinical effectiveness is generally also applicable when assessing MD. Nevertheless, we found evidence that specific attention is required when defining, describing and evaluating MD interventions. The use of therapeutic MDs often implies further procedures that may vary, and the MD itself can be composed of several components undergoing frequent modification. The more complex nature of MD interventions requires a more elaborate development of the research question. A logic model may help in describing the components of the intervention and comparators, outcomes, and effect-modifying factors such as individual and institutional learning. For defining the intervention, one should be explicit about the focus of the HTA report: whether it is the evaluation of one particular MD product or of all MDs that can be used for a certain treatment method. Although randomized controlled trials are the preferred study design in the assessment of effectiveness, such evidence is frequently lacking for MD interventions. HTA assessors should also be familiar with special study designs and analysis methods that take into account the specifics of MD. Registry analyses can be considered to assess long-term outcomes in addition to RCTs but should only be used for the assessment of treatment effects when appropriate confounder control is possible. In many cases it is likely that there is an influence of institutional and individual expertise on treatment effects, which has to be taken into account in the assessment. Evaluation of therapeutic MD should generally be done with established HTA methods.
Compared to drugs, MD interventions are usually more complex and their effectiveness is influenced by expertise and learning. These features, together with the rapid development of technologies, pose challenges to MD assessment. Anti-Mullerian hormone and endometrial cancer: a multi-cohort study Schock H 1 , Jung S, Allen N 2 , Arslan A, Egleston B, Helzlsouer K, Idahl A, Falk R, Kaaks R 3 , Krogh V, Shu XO, Sluss P, Staats P, Tworoger S, Dorgan J 1 Deutsches Krebsforschungszentrum (DKFZ), Heidelberg, Germany; 2 Cancer Epidemiology Unit, University of Oxford, Oxford, United Kingdom; 3 DKFZ Heidelberg, Heidelberg, Germany The Mullerian ducts are the embryological precursors of the female reproductive tract, including the uterus, and the anti-Mullerian hormone (AMH) plays a key role in the regulation of sexual differentiation of the developing fetus by causing regression of these structures in males. We hypothesized that AMH secreted by the ovaries of adult women may inhibit the development of endometrial cancer, given that these tumors arise in tissues derived from the Mullerian ducts. AMH has been shown to inhibit endometrial tumor growth through apoptosis and cell cycle arrest in in vitro and in vivo experimental models. However, to date there are no epidemiologic data on the association between AMH and endometrial cancer. We used data from eight cohort studies located in the United States, Europe, and China. Our investigation was limited to healthy women premenopausal at blood collection, as AMH is undetectable after menopause. We identified 329 endometrial cancer cases and 339 matched controls. Concentrations of AMH were measured centrally using a commercially available enzyme-linked immunosorbent assay (Ansh Labs, Webster, TX).
Ten multiply-imputed datasets were created to impute the missing values of the following potential confounders: body mass index (BMI; 21 % missing), ever use of oral contraceptives (23 % missing), total number of pregnancies (27 % missing), and smoking status (4 % missing). These variables were included in the imputation models in addition to age at blood draw, year of blood draw, cohort, current oral contraceptive use, race, and education to estimate the imputed value based on the distribution within each cohort. Conditional logistic regression was used to estimate odds ratios (OR) and 95 % confidence intervals [CI] across tertiles of AMH concentrations, defined among the control population in each cohort, and for a doubling of AMH concentrations (OR(log2)). OR estimates were calculated in each of the multiply-imputed datasets and pooled using Rubin's rule. The associations between AMH levels and endometrial cancer risk were evaluated using both unadjusted models and multivariable models adjusted for age at blood draw, BMI, ever use of oral contraceptives (no, yes), total number of pregnancies (0, 1, 2, 3, ≥4), and smoking status (never, past, current). Results AMH was not significantly associated with risk of endometrial cancer overall. Background Access to modern health services and medication is a key factor in the prevention of child deaths. Because individuals are charged user fees at the point of use, utilization of health services is often linked with high and partly unexpected costs for the average household in disadvantaged populations. This expense often hampers access to modern health services, especially for young children. Micro health insurance schemes have been implemented across developing countries as a means of facilitating access to modern medical care, with the ultimate aim of improving health. This effect, however, has not been explored sufficiently.
We investigated the effect of enrolment into community-based health insurance on mortality in children under 5 years of age in a health and demographic surveillance system (HDSS), a prospective dynamic cohort of about 90,000 individuals in Nouna, rural Burkina Faso. From 2004 onwards, community-based health insurance (CBHI) was offered to the study population. The enrolment period is 1 year, after which enrolment has to be renewed. Enrolled persons can access care free of charge at the point of use at their reference health facility. Surgery and in-patient treatment at Nouna hospital are also covered. We included all children born between 2000 and 2010 and analyzed the effect of health insurance enrolment on child mortality with a Cox regression model. Insurance enrolment status was included in the analysis as a time-dependent covariate. We adjusted for variables that we had found to be related to enrolment in health insurance in a preceding analysis. Of the 33,500 children born between 2000 and 2010, 4.8 % had ever been enrolled in CBHI before 2010. About 74 % of those children were insured until the end of the observation period, and 95 % of the insured children were continuously insured without disruption. We identified socioeconomic status, father's education, distance to the closest health facility, year of birth, and insurance status of the mother at the time of birth as the major determinants of health insurance enrolment. In the main analysis, we found that mortality risk was 46 % lower in children enrolled in health insurance compared to non-enrolled children (HR = 0.54, 95 % CI 0.43-0.68) after confounder adjustment. The present analysis found the strongest effect of health insurance enrolment on mortality shown to date. This strong effect is likely explained by increased utilization of health services by enrolled children.
Because malaria is a main cause of death in the study area, early consultation of health services in case of infection could prevent many deaths. A second explanation is the reduction of out-of-pocket expenditure on health, which means that this money can be spent on other health-related goods such as food, housing, or clothes. Despite adjustment for the most important determinants of CBHI enrolment, some residual confounding cannot be excluded due to the self-selection of individuals into the CBHI scheme and due to the low recruitment rate. Nevertheless, implementation of health insurance could be a major driving factor of the reduction in child mortality in the developing world. Penile cancer is a rare disease in Europe and North America. Cancer registry data were used to estimate the incidence, mortality and survival of penile cancer in Saxony, Germany. Methods Data on incidence were analyzed for the period 1961-2012 and mortality for 1990-2012, standardized with the European population. Trend analyses of incidence and mortality were performed using joinpoint regression. Survival rates for primary penile cancer were estimated overall, by T stage, by UICC stage and by year of diagnosis for the years 1963-2012. Age-standardized incidence increased from 1. The incidence of penile cancer has increased in recent years; however, survival rates were higher in Saxony compared to other published studies in Europe and the United States. An architecture for the integration of clinical data from a PEHR in a regional research platform Schreiweis B 1 , Bronsch T 1 , Stein KE 1 , Nöst S 1 , Aguduri LS 1 , Brandner A 1 , Pensold P 1 , Weiss N 1 , Yüksekogul N 1 , Bergh B 1 , Heinze O 1 1 University Hospital Heidelberg, Heidelberg, Germany Making clinical information available for research is not only relevant for healthcare institutions, but also for regional EHRs, as cross-sectorial information can be made accessible.
In the INFOPAT (INFOrmation technology for PATient-oriented health care in the Rhine-Neckar metropolitan region) project we are thus implementing both a regional personal cross-enterprise electronic health record (PEHR) and a regional research platform (RRP) based on information from the PEHR. IHE profiles are implemented to achieve interoperability between healthcare institutions' electronic medical records (EMR) and the PEHR on the one hand, and between the PEHR and the RRP on the other hand. The use case for the RRP is cross-sectorial quality assessment and improvement for colorectal cancer based on a quality indicator (QI) approach including patients' perspectives. For semantic interoperability, the responses are transferred in the form of HL7 CDA L2 documents. The resulting architecture for a RRP shows that implementing a PEHR in combination with a RRP based on international communication standards is possible. IHE XDS can also be used for the integration of patient care and biomedical research infrastructures. Cervical cancer and frequency of participation in screening: results from the TeQaZ study Schriefer D 1 , Radde K 1 , Schülein S 1 , Schoffer O 1 , Pinkert S 1 , Schweigler D 1 , Polster M 1 , Garbe Y 1 , Klug S 1 1 Cancer Epidemiology, University Cancer Center, University Hospital, TU Dresden, Dresden, Germany The incidence of cervical cancer in Germany remains high in relation to other countries in Western Europe. Possible reasons for this include low participation in cervical cancer screening (CCS) and low quality of cytology. Germany currently does not have an organized CCS program, and screening efforts with the Pap smear remain opportunistic. The TeQaZ study is a case-control study investigating participation in CCS and evaluating the quality of cytology. The aim of this analysis was to investigate differences with regard to frequency of participation in CCS and other risk factors between women who developed cervical cancer and healthy controls.
Methods Incident cases of cervical cancer, diagnosed between 2012 and 2016 in Saxony, Rhineland-Palatinate and surrounding regions, were recruited. Cases were matched with three population-based controls, recruited via population registers, based on age and region of residence. Cases and controls completed a telephone interview regarding the frequency of CCS participation during the past ten years. Various socio-demographic and other risk factors were assessed. Univariable and multivariable conditional logistic regression analyses were performed to model the probability of developing cervical cancer. A total of more than 200 cases and 600 controls were included in the study and successfully completed the study questionnaire. Univariable logistic regression showed that women who participated in CCS less frequently were more likely to develop cervical cancer than women who participated more frequently. Additionally, findings from multivariable logistic regression adjusting for factors potentially associated with the development of cervical cancer will be presented. These factors include smoking, number of children, number of sexual partners, use of oral contraceptives and other socio-demographic factors. The findings of the TeQaZ study regarding frequency of participation in CCS make an important contribution towards the development of an evidence-based, organized CCS program in Germany. Such an organized program should include population-based invitations to attend screening at appropriate intervals, with high-quality cytology and HPV testing for the appropriate age groups. Using administrative data to examine extent and risks of psychotropic drug use in children and adolescents: possibilities and limitations Background Although many drugs work differently in children and adolescents than in adults, drug safety in this patient group is generally an understudied field, among other reasons because of the ethical problems of including minors in clinical trials.
This often leads to off-label use of drugs in pediatric patients, but the extent and associated risks are unknown. To overcome this knowledge gap, claims data of statutory health insurances (SHIs) may provide valuable information. We aim to illustrate the possibilities as well as the limitations of using claims data for this purpose based on a study that investigated the off-label use of neuroleptic and antidepressant drugs and the risks of stimulant use in children and adolescents. We used the German Pharmacoepidemiological Research Database (GePaRD), which contains claims data of about 20 million persons from four SHIs. Our study included minors aged 0 to 17 years. We analyzed prescription patterns of antidepressant and neuroleptic drugs (with a focus on off-label use) for the years 2004 to 2011 using both cross-sectional and longitudinal approaches. A case-control design was used to examine the risks of off-label use. In addition, we performed a cohort study including minors with attention-deficit/hyperactivity disorder (ADHD) to investigate cardiovascular risks of methylphenidate use. Each year, there were about two million individuals aged 0 to 17 years in the database. We observed varying rates of antidepressant and neuroleptic drug use with marked changes regarding sub-classes of drugs and single substances during the study period. The prevalence of antidepressant off-label use decreased from 58 % in 2004 to 41 % in 2011. During the same time period, the prevalence of neuroleptic off-label use varied between 52 % and 71 %. Adverse drug events such as cardio- and cerebrovascular events, suicidality, death, and extrapyramidal events in minors with antidepressant or antipsychotic medication were rare, and no significant differences were detected between on- and off-label use.
In the ADHD cohort, comprising about 119,000 minors with a total time of methylphenidate use of 1.2 million person-years, only nine major cardiovascular events were observed, which did not allow for statistical analysis. However, there was no statistically significant increase in the risk of non-major cardiovascular events for current methylphenidate use compared to non-use. Conclusion By using claims data we were able to perform comprehensive analyses of antidepressant and neuroleptic drug use in children and adolescents and to study time trends over a period of eight years. The large study population enabled detailed stratifications not only by age and sex but also by on-/off-label use, drug classes, and single substances. The database enabled us to investigate rare exposures and rare outcomes without recall bias. However, even in the large source population some events were too rare to be studied. Further limitations include the fact that the database does not contain any laboratory parameters or information about inpatient drug use. Besides, outpatient diagnoses are available on a quarterly basis only. Despite these limitations, claims data allow us to address important research questions regarding drug use in children and adolescents that could not be answered otherwise. Degree of urbanization and utilization of ambulatory care: a least-squares means analysis Schulz M 1 , Czihal T 1 , Erhart M 2 , von Stillfried D 2 1 Zentralinstitut für die kassenärztliche Versorgung in Deutschland (Zi), Berlin, Germany; 2 Zentralinstitut für die Kassenärztliche Versorgung in Deutschland (Zi), Berlin, Germany We recently introduced two area-level socio-economic pattern variables identified by factor analysis, the socio-economic health index (SGX) and the index of urbanization (UX), which have proven useful in predicting morbidity and assessing need for care [1]. The aim of the present study was to evaluate the association of socio-economic patterns with the utilization of ambulatory care.
For this analysis, complete nation-wide claims data on ambulatory physician services according to §295 SGB V from 2010 (N ≈ 70,000,000 patients) were used. Based on the place of residence of patients, the volume of services per patient, quantified in points according to the EBM (Einheitlicher Bewertungsmaßstab, uniform value scale), served as the measure of utilization of ambulatory care. Data were aggregated to county level (N = 413) and supplemented by factor scores of the socio-economic pattern variables SGX and UX. Analysis of covariance was used to compare mean outcome levels (i.e. total volume and, separately, GP and specialists' services) between tertiles of UX and SGX, adjusted for covariates (outflow of services, share of people aged 65+, hospital beds per 100,000 inhabitants). We evaluated the impact of region (East vs. West Germany) as a potential effect modifier on the association between UX/SGX and measures of utilization by fitting models including interaction terms for UX/SGX tertile and the East-West variable. We found strong associations between UX and measures of utilization of ambulatory care and evidence of effect modification by region. Except for the volume of psychotherapy (PT) services, adjusted mean volumes of total, GP, specialists' and ophthalmologists' services were higher in East than in West Germany. With increasing urbanization tertile, total volume per patient ranged from 150,387 to 169,015 points in East Germany and from 141,780 to 157,215 points in West Germany. Higher UX was related to a slight decrease in the volume of GP services in East but not in West Germany (p for interaction < .05). The volume of all specialists' services showed a steeper increase in East than in West Germany as UX increased (p for interaction < .1). Specifically, the volume of ophthalmological services increased steadily in East Germany (from 5,352 points in tertile 1 to 6,901 points in tertile 3) but was almost constant in West Germany (p for interaction < .001).
With regard to SGX, we observed positive associations with the volume of total and GP services and no or weak associations with the remaining volumes. There was no evidence of effect modification by region, yet values for SGX were consistently higher in the East. The impact of urbanization (UX) on utilization of ambulatory care is modified by region, except for PT services. Thus, based on the volume of utilized services per patient, mean outcome levels tended to be higher in East than in West Germany. This finding may be in part attributable to differences in morbidity and in supply and warrants future research on risk factors explaining variance in utilization differences. Socio-economic and health deprivation (SGX), in contrast, was found to impact utilization of ambulatory care equally in both regions, suggesting that this measure captures major parts of the heterogeneity accounting for variation in service utilization. Big data resources are difficult to process without a scaled hardware environment that is specifically adapted to the problem. The emergence of flexible cloud-based virtualization techniques promises solutions to this problem. This paper demonstrates how a billion lines of data can be processed in a reasonable amount of time in a cloud-based environment. Our use case addresses the accumulation of concept co-occurrence data in MEDLINE annotations as a series of MapReduce jobs, which can be scaled and executed in the cloud. Besides showing an efficient way of solving this problem, we generated an additional resource for the scientific community to be used for advanced text mining approaches. How fit is SNOMED CT for eHealth interoperability in Europe?
Schulz S 1 , Cangioli G 2 , Chronaki C 3 , Kalra D 4 , Stroetmann V 5 , Thiel R 6 , Thun S 7 1 Medizinische Universität Graz, Graz, Austria; 2 HL7 International Foundation, Florence, Italy; 3 HL7 International Foundation, Athens, Greece; 4 Eurorec, Brussels, Belgium; 5 empirica Technology Research mbH, Bonn, Germany; 6 empirica Communication and Technology Research, Bonn, Germany; 7 Hochschule Niederrhein, Krefeld, Germany ASSESS CT is an EU-funded project that aims at contributing to better semantic interoperability of eHealth services in Europe. Its main goal is the investigation of the fitness of the international clinical terminology SNOMED CT as a potential standard for EU-wide eHealth deployments. This panel will report on the current use of SNOMED CT in Europe and worldwide, offer experimental evidence for the fitness of SNOMED CT, and present an impact assessment and socio-economic stakeholder analysis. The audience will be actively engaged in the discussion of policy recommendations and the identification of next steps in advancing semantic interoperability with working reference clinical terminologies that are active parts of the emerging digital health standards landscape that encourages innovation. Competing risk bias in a leading medical journal Schumacher M 1 , Ohneberg K 2 , Beyersmann J 3 1 Institut für Med. Biometrie und Med. Informatik/ Universitätsklinikum Freiburg, Freiburg, Germany; 2 Universitätsklinikum Freiburg, Freiburg, Germany; 3 Institut für Statistik, Universität Ulm, Ulm, Germany About 10 years ago, statistical methods for time-to-event data were used in about two thirds of articles published in the New England Journal of Medicine (NEJM) [1]. We investigated the current prevalence and sought to determine whether competing risks, a major source of survival bias [2], were adequately addressed. We assessed all original articles published in the 2015 issues of NEJM.
We identified those publications with a time-related primary endpoint and investigated whether competing risks were present and, if so, whether they were adequately dealt with. For some selected example studies [3], we reconstructed the correct cumulative incidence curves and contrasted them with the corresponding, incorrect Kaplan-Meier curves in order to illustrate the origin and the magnitude of the bias [4]. In total, 219 articles were assessed. The vast majority (188; 86 %) had a time-related primary endpoint; 153 (70 %) used statistical methodology for time-to-event data in a narrower sense. In 50 articles, competing risks were present; 25 (50 %) of those were susceptible to competing risk bias. The major reason was that a Kaplan-Meier analysis for a non-comprehensive composite endpoint was carried out, e.g. by considering only deaths of a specific cause and taking deaths from other causes as censored observations. Studies susceptible to competing risk bias were concentrated in the field of cardiovascular diseases/diabetes (16), in contrast to only 2 that we found in the field of oncology, where avoidance of competing risks by the use of comprehensive composite endpoints such as disease- or progression-free survival is standard practice. With the selected example studies, we show that cumulative incidence curves are generally overestimated when using Kaplan-Meier methodology. The magnitude of the bias ranges from a negligible to a considerable amount depending on the sizes of both the incidence rate for the primary endpoint and that for the event constituting the competing risk. The proportion of articles in NEJM using statistical methods for time-to-event data still increased from 2005 to 2015. Although many tutorial papers are available [4], there is a considerable number of studies susceptible to competing risk bias. This bias can be avoided by using adequate statistical methodology.
There is no excuse not to use it, and Kaplan-Meier methodology should be completely abandoned in publications of studies with competing risks, not only in the NEJM but in all journals [5]. Schuster R 1 , Heidbreder M 2 , Emcke T 3 , von Arnstedt E 4 1 MDK Nord, Lübeck, Germany; 2 Medizinischer Dienst der Krankenversicherung Nord, Lübeck, Germany; 3 Kassenärztliche Vereinigung Schleswig-Holstein, Bad Segeberg, Germany; 4 MDK-Nord, Hamburg, Germany We introduce a classification structure that is based to a large extent on the patient focus. In this context, patients in an outpatient setting can be classified with respect to their medical as well as their pharmacoeconomic background. In the hospital setting, each patient receives a main diagnosis and a DRG (Diagnosis Related Group). By analogy with DRGs, the MRG classification is also related to the morbidity of the patient. A more detailed structure could contain a risk adjustment with respect to age, multimorbidity and treatment intensity where the respective differences are relevant. Such a classification of patient types leads to a medical office type structure for the analyzed physician. Subsequently, a natural discrimination of the medical disciplines can be achieved. To this purpose, we analyzed prescription data of the SHI in Schleswig-Holstein for the second quarter of 2015. We applied the ATC classification, with some additions for medical products not included, and aggregated drug groups at the four-character level, such as A10A for insulins and analogues. For each physician consulted by a patient, the group with the largest costs for related drugs within the quarter defines the MRG. Hence, this group should strongly reflect the morbidity of the patient. One can consider the cost as an indicator of the severity of the drug treatment and could also use other weight functions instead of the cost. When all patients have been classified by the proposed method, an office type structure results.
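The max-cost rule described above can be sketched in a few lines of Python. This is only an illustration of the aggregation logic; the ATC codes and cost values below are made-up examples, not data from the study.

```python
from collections import defaultdict

def assign_mrg(prescriptions):
    """Assign the MRG: aggregate drug costs at the 4-character ATC level
    and return the group with the largest total cost for the quarter.
    `prescriptions` is a list of (ATC code, cost) pairs for one patient
    and one consulted physician."""
    group_costs = defaultdict(float)
    for atc_code, cost in prescriptions:
        group_costs[atc_code[:4]] += cost
    return max(group_costs, key=group_costs.get)

# Hypothetical patient-quarter: two insulin prescriptions outweigh an NSAID,
# so the MRG is A10A ("insulins and analogues").
example = [("A10AB01", 120.0), ("A10AE04", 80.0), ("M01AE01", 15.0)]
```

Replacing the cost with another weight function, as the abstract suggests, only changes the values accumulated in `group_costs`, not the max-group rule.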
As an example, we considered a physician who cares for 14 % of his patients in the MRG type ''insulins and analogues'' and for 11.8 % of patients using test strips for measuring glucose, which appear as ''other diagnostic agents'' in the classification. In his medical discipline, general practitioners (GPs) account for only 3.8 % in these two groups. The physician can thereby be identified as a diabetologic specialist. Regarding orthopedics, we observe a patient type structure in which 42.9 % of all patients are related to the MRG type M01A (anti-inflammatory and antirheumatic products, non-steroids). Costs again depend mainly on the medical discipline. In oncology, average costs per patient of 15,288.17 € related to the MRG group (immunosuppressive agents, including all the other drugs for the patient) can be observed, versus 2,515.02 € for orthopedics. Conclusion So far, the so-called ''Richtgrößen'' (cost benchmarks per patient in a few groups) were used under German law as a measure of the drug economic efficiency of physicians in the outpatient area, but they reflect the morbidity structure only sparsely. From the beginning of 2017, the ''Richtgrößen'' may be replaced. Besides epidemiological benefits, the MRG concept can further be utilized for drug economic measurements. By using the subgroup structure, a risk adjustment can be accomplished. This does not have to be precise at the patient level, but at the physician level. The distance of a physician to his medical discipline measures to what extent he is typical or not. Extreme values may be a hint for. Current Ambient Assisted Living (AAL) environments lack integration of sensors and actuators of other sub-domains. Creating technical and organizational integration is addressed by the BASIS project (Build Automation by a Scalable and Intelligent System), which aims to build a cross-domain home bus system.
The main objective of this paper is to present an overview of the design, architecture and state of realization of BASIS by describing the requirements development process, the underlying hardware design and the software architecture. We built a distributed system of one independent building manager with several redundantly meshed segment controllers, each controlling a bus segment with any number of bus nodes. The software system layer is divided into logical partitions representing each sub-domain. Structured data storage is possible with a special FHIR-based, home-centered data warehouse. The system has been implemented in six apartments running under daily living conditions. BASIS integrates a broad range of sub-domains, which poses challenges to all project partners in terms of a common terminology and project management methods, but enables the development of inter-domain synergies, such as using the same sensor and actuator hardware for a broad range of services and use cases. Facilitating clinical pathway standardization and adoption using a web-based software solution Schwarz B 1 , Rupp S 1 , Hinsmann A 1 , Mekhzoum VD 1 , Friesen A 1 , Friedl T 1 While clinical pathways are an accepted tool for process optimization and quality management, their adoption into clinical routine is often lacking. This is often due to difficulties that arise when integrating process control into common electronic health records and administrative or diagnostic software. The solution presented here allows a standardized approach to modeling clinical pathways using the Business Process Model and Notation (BPMN) and integrates into existing software solutions. An intuitive interface supports pathway execution, and a feedback system allows the processes to be continually improved.
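At its core, executing a pathway of the kind described above means tracking which step is due next and collecting feedback for later process improvement. The following Python sketch is a toy model of that idea for a strictly sequential pathway only; it is not the presented product, the step names are hypothetical, and real BPMN additionally covers gateways, parallel branches, and events.

```python
from dataclasses import dataclass, field

@dataclass
class PathwayStep:
    name: str
    done: bool = False

@dataclass
class ClinicalPathway:
    """Toy sequential pathway executor with a feedback log."""
    steps: list
    feedback: list = field(default_factory=list)

    def next_step(self):
        # First step not yet completed, or None when the pathway is finished.
        return next((s for s in self.steps if not s.done), None)

    def complete(self, name, note=""):
        step = self.next_step()
        if step is None or step.name != name:
            raise ValueError(f"step {name!r} is not the current step")
        step.done = True
        if note:  # feedback entries drive later refinement of the pathway
            self.feedback.append((name, note))

# Hypothetical two-step fragment of an oncology pathway:
pathway = ClinicalPathway([PathwayStep("staging"), PathwayStep("tumor board")])
```

The feedback log mirrors the abstract's point that pathway adoption improves when deviations observed during execution flow back into the process model.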
Pre- and post-diagnosis food intake and food-based dietary patterns and risk of mortality and cancer recurrence among cancer survivors: a systematic review and meta-analysis of cohort studies Schwedhelm C 1 , Boeing H 1 , Hoffmann G 2 , Aleksandrova K 1 , Schwingshackl L 1 1 Deutsches Institut für Ernährungsforschung Potsdam-Rehbrücke (DIfE), Nuthetal, Germany; 2 University of Vienna, Vienna, Austria To date, the existing evidence on the association between dietary patterns [a priori based (diet quality indices) and data-driven (prudent/healthy or Western)], individual food components (fruits, vegetables, dairy, meat, fish, and cereals), and beverages (alcohol, tea, and coffee) and the risk of overall mortality and cancer recurrence among cancer survivors has not been the subject of a systematic review. The aim of this study was to conduct a systematic review and meta-analysis of cohort studies to investigate the association between food intake and food-based dietary patterns/indices and overall mortality and cancer recurrence among cancer survivors. We performed a literature search including cohort studies of cancer survivors investigating pre- and/or post-diagnosis food/beverage intake and food-based dietary patterns/indices and overall mortality and cancer recurrence. Study-specific risk ratios (RR) were pooled comparing the highest vs. lowest pre- and post-diagnosis exposure categories using random effects models. Results 108 studies enrolling 183,374 cancer survivors were included. Higher intakes of vegetables (RR, 0.85; 95 % CI, 0.77-0.94), fruits (RR, 0.90; 95 % CI, 0.82-0.99) and fish (RR, 0.83; 95 % CI, 0.72-0.95) were inversely associated with overall mortality, while higher alcohol consumption was positively associated with overall mortality (RR, 1.08; 95 % CI, 1.02-1.15) and cancer recurrence (RR, 1.16; 95 % CI, 1.02-1.32).
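The random-effects pooling used above (highest vs. lowest exposure categories, combined on the log scale) can be sketched with the standard DerSimonian-Laird estimator. This is a minimal illustration, not the review's actual code; the input log risk ratios and standard errors in the usage example are invented.

```python
import math

def pool_random_effects(log_rrs, ses):
    """DerSimonian-Laird random-effects pooling of study-level log risk ratios.

    log_rrs: list of log(RR) per study; ses: their standard errors.
    Returns the pooled RR and its 95 % confidence interval (back-transformed).
    """
    w = [1.0 / se ** 2 for se in ses]                      # inverse-variance weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sw  # fixed-effect estimate
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)          # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))
```

With three hypothetical studies reporting RRs of 0.80, 0.90 and 0.85, the pooled RR lies between them, with a wider interval than any single study would suggest if heterogeneity is present.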
Adherence to the highest diet quality indices category (RR, 0.78; 95 % CI, 0.72-0.85) (post-diagnosis: RR, 0.79; 95 % CI, 0.71-0.89) and to the highest category of a prudent/healthy dietary pattern (RR, 0.81; 95 % CI, 0.67-0.98) (post-diagnosis: RR, 0.77; 95 % CI, 0.60-0.99) was inversely associated with overall mortality. The Western dietary pattern was associated with an increased risk of overall mortality (RR, 1.47; 95 % CI, 1.26-1.70) (post-diagnosis: RR, 1.51; 95 % CI, 1.24-1.85). No significant association could be observed for the prudent/healthy or Western dietary pattern with cancer recurrence. Conclusion This meta-analysis provides evidence that adherence to high diet quality indices and a prudent/healthy dietary pattern is inversely associated with overall mortality, whereas an unhealthy Western dietary pattern is positively associated with risk of overall mortality among cancer survivors.
Non-interventional studies-requirements and supervision within the European Union Schweim J 1,2 , Ose C 1,2 1 Institute for Medical Informatics, Biometry and Epidemiology (IMIBE), University Hospital Essen, Essen, Germany; 2 Zentrum für klinische Studien Uniklinikum Essen, Essen, Germany To this day, society and some members of the scientific community view Non-Interventional Studies (NIS), as defined in Art. 2c of the Clinical Trials Directive [1], or Observational Studies, with scepticism and misconceive them as a type of study with higher potential for misuse and interference. The continuing criticism focuses especially on the conjunction of scientific questions with advertising measures and the risk of financial incentives influencing investigators' prescription behaviour. Meanwhile, the pharmaceutical industry as well as non-commercial sponsors appreciate NIS as an important instrument for confirming results gained from clinical trials by observing the use of an authorised medicinal product under routine conditions.
From the perspective of competent authorities, Post-Authorisation Studies may provide important information regarding the efficacy or safety of medicinal products. The conduct of Post-Authorisation Efficacy (PAES) or Safety Studies (PASS) [2] may be imposed as an obligation on a Marketing Authorisation Holder (MAH), but such studies can also be carried out voluntarily. Specific rules for protocols, reports etc. have to be followed. Still, inconsistencies in interpretation and divergent requirements for the conduct of NIS exist among EU Member States. As an example, the ethical and regulatory requirements for NIS in Germany, the United Kingdom (UK) and Scandinavian countries (Denmark, Finland, Norway, Sweden) have been examined. A search of the webpages (English versions) of independent ethics committees (IEC), medicines agencies and data protection agencies for each country was necessary to collect the information, and verification was performed by consulting national laws on medical research. The responsibilities for supervision and the requirements to be fulfilled before starting a NIS vary considerably among EU Member States. For example, Germany and Norway both require an application to the local IEC and a notification to the competent medicines agency; a German IEC is involved according to physicians' professional law. In the UK, neither an IEC nor an authority needs to be informed. In Denmark, Finland or Sweden, either one of the bodies needs to be involved. Germany has additionally installed rules for the limitation and supervision of remunerations paid to investigators participating in NIS. In most EU countries, the pharmaceutical industry has set its own transparency standards, including reasonable compensation for investigators reflecting the fair market value. The institution planning the study is responsible for its classification as a NIS and its clear distinction from a clinical trial. In case of uncertainties, the national competent authority offers advice.
As demonstrated by the requirement of an application in the national language, the procedures are exclusively national. So far, only the German medicines agencies publish the notifications of NIS in a publicly accessible database. With ENCePP/EUPAS, a European database for pharmacovigilance and pharmacoepidemiological studies exists in which registration is recommended. The divergent and nationally specific requirements may hinder the performance of multi-national NIS, especially regarding the language barrier. Harmonisation in this field is a future task, which unfortunately will not be achieved with the implementation of the Clinical Trials Regulation [3], as this law only applies to (interventional) clinical trials. Consistent EU standards could be the answer to critical voices raised against NIS.
We had to determine the requirements such a system has to meet. By performing extensive Requirements Engineering, we aimed to increase the success of the future system and user satisfaction. Investigations revealed the existence of TORE (Task and Object-oriented Requirements Engineering), a task-driven approach for determining requirements for user-interface- and information-intensive systems. In this paper, we present an adaptation of this method for our purposes, resulting in a reasonable list of requirements for CTMS acquisition.
An update on solarium use and risk for malignant melanoma: a critical systematic review and meta-analysis
Cervical cancer remains the third most common cancer among women worldwide, with HPV infection being causal for its development. This analysis describes various recruitment strategies for the MARZY study, which investigated cervical cancer screening (CCS) invitation models, reasons for non-participation in CCS and population-based human papillomavirus testing in Germany. The MARZY study is a prospective, population-based cohort study.
A sample of women aged 30-65 years, living in the city of Mainz and the rural district of Mainz-Bingen, was randomly selected for the study via population registries. During the baseline investigation (2005-2007), women in the intervention group were sent a personalized letter of invitation and were asked to make an appointment with an office-based gynecologist of their own choice. Those women who did not respond to the invitation letter were contacted via a first reminder letter, a second reminder letter or a telephone interview. This analysis investigated the proportion of women participating in the study after each intervention, as well as the time period until study participation. Logistic regression analyses were performed to investigate associations between study participation and factors such as age, nationality and study region. A total of 5,275 women in the intervention group were eligible to participate and received an invitation letter. The proportion of women participating after each intervention, the time period until study participation, as well as factors associated with participation will be reported. The findings of this analysis will provide insights into which recruitment strategies are effective in terms of contributing towards a higher response rate when conducting large epidemiological studies.
The aim of this investigation was to assess the relationship between acute adverse effects and health-related quality of life in prostate cancer patients who underwent radiotherapy (RT). Prostate cancer patients undergoing RT were recruited in seven countries between April 2014 and October 2015 for an ongoing multicentre prospective cohort study (www.requite.eu). In this preliminary analysis, RT details as well as quality of life (QoL) data prior to and at the end of RT were available for 382 patients. Acute adverse effects (gastrointestinal/GI, genitourinary/GU) were defined as grade 2 or higher according to CTCAE version 4 criteria.
Global Health Status (GHS)/QoL was assessed using the EORTC QLQ-C30 questionnaire. Differences in GHS/QoL between patients with and without acute adverse effects were assessed using the Wilcoxon-Mann-Whitney test. Multiple linear regression was used to investigate associations between common acute adverse reactions and a worsening of quality of life, adjusted for age. The median age of the patients was 70 years. Twenty-eight percent were post-prostatectomy patients, and 68 % received neoadjuvant or adjuvant hormone treatment. Sixty-two (16 %) and 66 (17 %) of the 382 patients developed GI and GU toxicity, respectively. Eighteen (4.7 %) suffered from both GI and GU toxicity. The most common acute toxicities included proctitis (9.7 %), diarrhea (5.5 %), urinary frequency (10.5 %), urinary urgency (6.8 %) and retention (4.5 %). Significant clinical worsening of GHS/QoL (i.e. a drop of more than 10 points on a scale from 0 to 100 at the end of RT) was reported by 111 (29 %) patients. Patients without GI toxicity and those without GU toxicity both had a median QoL change of 0 points, whereas patients with grade ≥2 GI toxicity and those with grade ≥2 GU toxicity showed a median change of -8.3 (range, GI toxicity yes: -75.0 to 66.7; no: -58.3 to 66.7; p = 0.028; GU toxicity yes: -75.0 to 66.7; no: -66.7 to 66.7; p = 0.015, respectively). All the individual common acute toxicities except diarrhea were associated with a reduction in GHS/QoL (median changes for all individual acute toxicities: -8.33; p < 0.01). In the multiple linear regression analysis, proctitis was the only common adverse effect significantly associated with a decrease in GHS/QoL (p = 0.021). We showed that patients with GI or GU toxicity had significantly lower GHS/QoL scores at the end of RT than patients who did not experience adverse effects. Proctitis may have the strongest impact on worsening of QoL. Updated results of these analyses based on a larger patient cohort will be presented.
In addition, there will be further follow-ups at 1 and 2 years after treatment in order to study late adverse effects.
The state of asthma epidemiology-an overview of systematic reviews Seibold A 1 , Genuneit J 1 1 Institute of Epidemiology and Medical Biometry, Ulm University, Ulm, Germany Epidemiological research on asthma has been conducted for decades. While numerous expert opinion publications exist, there is no systematic overview of what this field of research has achieved and which knowledge gaps may exist. We performed, to our knowledge, the first overview of systematic reviews in allergy epidemiology, conducting a systematic literature search in MEDLINE and EMBASE up to 12/2014. We included reviews indicating a systematic literature search and dealing with incidence, prevalence, time trends, or risk/protective factors for several allergic diseases. We focused on research on interventions for primary prevention; reviews on interventions for secondary or tertiary prevention or therapeutic interventions were excluded. Here, we describe systematic reviews dealing with asthma or wheeze. Amongst other indicators, the topic of research was extracted, and quality was evaluated using the AMSTAR checklist. The search retrieved 4,566 hits, of which 411 were relevant, with n = 305 (75 %) on asthma or wheeze. They were published between 1990 and 2015 (median 2011); the median AMSTAR score was 8 (Q1 = 6, Q3 = 10), with significantly increasing quality in more recent years. Of all AMSTAR items, inclusion of grey literature and assessment of publication bias were most often neglected (in 76 and 51 % of the articles, respectively). The topic most often studied was genetics (27 % of the reviews), followed by pollution (16 %, including smoking), microbes (14 %), and body weight and diet (each 12 %). Whereas 28 % investigated childhood specifically, only 4 % focused on adult asthma.
Most of the reviews included studies regardless of population age or did not specify the population ages in their search criteria. Numerous systematic reviews in asthma epidemiology have been published, although some have limited methodological quality. Substantial evidence has been gathered for genetics, smoking, body weight, infections and other allergens, but other research topics lack systematic reviews, potentially due to a lack of original data. Asthma is a complex disease with multiple causes, which is increasingly acknowledged in the literature. Thus, future systematic reviews should probably focus more on overarching themes like gene-environment interactions or the interrelation of environmental factors rather than single determinants of the disease.
Traffic noise and depression risks-a large case-control study based on secondary data The evidence to date pointing towards a relation between traffic noise and depressive disorders is still inconclusive. The purpose of this study is to clarify the relation between depressive disorders and traffic noise, separately as well as combined for aircraft, road, and railway traffic noise, in a large secondary-data-based case-control study. The study population consisted of individuals who were insured by three large statutory health insurance funds in the Rhine-Main area (administrative region Darmstadt and Rhine Hesse) of Germany. The results suggest that traffic noise exposure might lead to depression. As a potential explanation for the decreasing risks at high traffic noise levels, vulnerable people might actively cope with noise (e.g. insulate or move away). Given the high proportion of the population that is exposed to traffic noise and the clinical importance of depression, the results are of high public health relevance.
Reporting of prognostic tumour marker studies after the introduction of the REMARK guideline needs improvement An earlier review of prognostic tumour marker studies demonstrated that many lacked key information needed by readers to evaluate their reliability and clinical applicability [1]. The aim of the current study was to examine whether the quality of reporting has improved in the meantime. As closely as possible, we used the methods of the earlier review of published articles from the 'pre-REMARK' era. This approach includes the utilization of the same data extraction form, with questions representing subitems of the original items of the REMARK checklist [1]. The literature search for prognostic tumour marker studies was done in Web of Science in 2013. Altogether, we assessed adherence to REMARK for 53 publications citing REMARK ('citing group') and 53 publications not citing REMARK ('not-citing group'; matched by journal and issue). Descriptive comparisons over time and between groups were done, with a particular focus on 10 items of the REMARK checklist. Background and reasons for the restriction to 10 out of 20 items will be provided. Overall, the proportion of the 10 items that were assessed increased slightly on average, from 53 % (range: 10-90 %) in the earlier study to 58 % (range: 30-100 %) in the citing group and 58 % (range: 20-100 %) in the not-citing group. The improvement, however, was not seen in all 10 items. While the improvement was substantial for some (e.g. item 6, 'Study design: follow-up'; past study: 40 %, citing group: 60 %, not-citing group: 62 %), reporting worsened for others (e.g. item 13, distribution of demographic characteristics; past study: 58 %, citing group: 42 %, not-citing group: 55 %). In principle, it should be easy to report all study details included in the REMARK checklist. However, our investigation shows that many items are still poorly reported, so there remains much room for improvement.
To improve the clinical value of published prognosis research in cancer, authors, editors and peer reviewers should be aware of and follow reporting recommendations.
This workshop aims at discussing alternative approaches to resolving the problem of health information fragmentation, which partially results from the difficulty complex health systems have in interacting semantically at the information level. In principle, we challenge the current paradigm of keeping medical records where they were created and discuss an alternative approach in which an individual's health data can be maintained by new entities whose sole responsibility is the sustainability of individual-centric health records. In particular, we will discuss the unique characteristics of the European health information landscape. This workshop is also a business meeting of the IMIA Working Group on Health Record Banking.
Towards educational electronic health records (EHRs): a design process for integrating EHRs, simulation, and video tutorials Electronic health records (EHRs) are becoming ubiquitous in healthcare practice. However, their use in medical education has been slower to catch on, and a new category of EHRs, known as eduEHRs, is beginning to emerge. These systems allow learners to explore and experiment with EHRs in the context of medical education. However, current eduEHRs have limitations, such as a lack of built-in dynamic interaction that would mimic real-world use of these tools. To overcome this, the integration of eduEHRs with software and tools such as video simulations and tutorials holds considerable promise. In this paper, we describe a new design process for integrating EHRs, simulations and video tutorials.
Shaw P 1 Many variables of interest in epidemiological observational studies are subject to measurement error and misclassification. However, in many fields of epidemiological research the impact of such errors is either not appreciated or is ignored.
As part of the STRengthening Analytical Thinking for Observational Studies (STRATOS) Initiative [1], a Task Group on measurement error and misclassification (TG4) is currently engaged in several activities to increase awareness of this problem among biostatisticians and epidemiologists and to point to methods to address it. As part of this effort, TG4 conducted a literature survey of four types of research studies that are subject to exposure measurement error: (1) nutritional cohort studies, (2) nutritional surveys, (3) physical activity cohort studies, and (4) air pollution cohort studies. The survey was conducted both as a general search of articles in these areas, to understand current practice with regard to addressing measurement error, and as a specific methods search, to understand which methods were used when adjustments for measurement error were made in the analysis. The TG4 survey revealed that while researchers in these areas were generally aware of measurement error issues that affected their studies, very few researchers adjusted for the error in their analysis. Furthermore, most articles provided an incomplete discussion of the potential effects of measurement error on their results. The methods search revealed that regression calibration [2] was one of the most widely used methods to adjust for covariate measurement error. We illustrate this method using a data example from nutritional epidemiology. This example highlights how measurement error can bias measures of association if ignored in the analysis. We also discuss analysis options for settings where this method is not expected to perform well. Conclusion While many researchers acknowledged errors in their instruments, few fully appreciated how this error would affect their results or made any adjustments for error in their analysis.
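The regression-calibration idea that TG4 found most widely used can be illustrated with a small simulation. This is a sketch under simple classical-error assumptions (two replicate error-prone measurements per subject, a linear outcome model), not the TG4 nutritional example; all numbers, names and the attenuation-based correction shown here are illustrative.

```python
import random
import statistics

def regression_calibration_demo(n=5000, beta=0.5, seed=1):
    """Simulate classical measurement error and correct the naive slope.

    True exposure x is measured with error twice (w1, w2); the outcome y
    depends linearly on x. The naive slope of y on the mean of replicates is
    attenuated by the reliability ratio; dividing by an estimate of that
    ratio (from the within-person replicate variance) recovers beta.
    """
    rng = random.Random(seed)
    x = [rng.gauss(0, 1) for _ in range(n)]           # true exposure
    w1 = [xi + rng.gauss(0, 1) for xi in x]           # replicate measurement 1
    w2 = [xi + rng.gauss(0, 1) for xi in x]           # replicate measurement 2
    y = [beta * xi + rng.gauss(0, 0.5) for xi in x]   # outcome
    wbar = [(a + b) / 2 for a, b in zip(w1, w2)]
    # error variance estimated from replicate differences: Var(w1-w2) = 2*Var(U)
    var_u = statistics.variance([a - b for a, b in zip(w1, w2)]) / 2
    var_wbar = statistics.variance(wbar)
    lam = (var_wbar - var_u / 2) / var_wbar           # reliability of mean of replicates

    def slope(xs, ys):
        mx, my = statistics.mean(xs), statistics.mean(ys)
        return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
                / sum((a - mx) ** 2 for a in xs))

    naive = slope(wbar, y)
    corrected = naive / lam                           # regression-calibration correction
    return naive, corrected
```

With these settings the naive slope is attenuated towards roughly two-thirds of the true value of 0.5, and the corrected estimate moves back close to 0.5, illustrating the bias the survey found most analyses leave uncorrected.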
Use of methods such as regression calibration to correct for error depends on the availability of data that allow estimation of the magnitude and nature of the errors, and currently there is a great need to incorporate the collection of such data within study designs.
Awareness, attitude and readiness of clinical staff towards telemedicine: a study in Mashhad, Iran Sheikhtaheri A 1 , Sarbaz M 2 , Kimiafar K 2 , Ghayour M 2 , Rahmani S 1 1 Iran University of Medical Sciences, Tehran, Iran, Islamic Republic Of; 2 Mashhad University of Medical Sciences, Mashhad, Iran, Islamic Republic Of A questionnaire was used to survey Iranian clinical staff about telemedicine. The score for awareness was 13 ± 5.5 out of 35, indicating low awareness about telemedicine. Only 43.7 % stated they had heard about tele-consultation. The figure for tele-monitoring was 20.1 %. Awareness about other types of telemedicine services was even lower. The most frequently used sources of information about telemedicine were friends (51.4 %) and public media (30.3 %). Attitudes were generally positive about telemedicine (63.42 ± 9.5 out of 95). A significant positive correlation was found between attitude and awareness (p = 0.027). In conclusion, Iranian clinical staff have little knowledge about telemedicine services; however, they have a positive perception of this type of service. Providing appropriate education and information resources to them is necessary.
Method for detecting interstitial pneumonia from accumulated image reports obtained from electronic medical records Shimai Y 1 , Takeda T 1 , Manabe S 1 , Teramoto K 2 , Mihara N 1 , Matsumura Y 3 1 Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 2 Tottori University Hospital, Yonago City Tottori, Japan; 3 Osaka University Graduate School of Medicine, Osaka, Japan We are carrying out a project with the goal of evaluating the safety of medicines using data obtained from electronic medical records (EMR).
Interstitial pneumonia (IP) is one of the most serious drug-induced adverse reactions and can potentially lead to the death of the patient. The aim of this study is to evaluate the risk of medicines causing drug-induced IP based on data obtained from EMR. IP is usually diagnosed by chest CT examinations. However, because many radiologists write their image reports in free text form, it is not easy for a system to determine which reports clearly describe a diagnosis of IP. In this study, we developed a method to detect IP reports using a keyword analysis of free text data. The chest CT reports obtained from the EMR of Osaka University Hospital were used in this study. We extracted 400 reports in which a diagnosis of IP was made and 400 reports in which a diagnosis of diseases other than IP was made by radiologists. We used 300 of the 400 reports as learning data and the remaining 100 reports as verification data. The image report form consisted of two sections, namely a findings section and a diagnosis section. When a definitive diagnosis of IP was made in the diagnosis section, we identified the report as an IP report. When there was no definitive diagnosis of IP in the diagnosis section, we analyzed the findings section. We extracted any keywords that appeared more than 10 times in the findings section of the learning data. Thereafter, we calculated the positive and negative likelihood ratio of each keyword for IP. The IP score was defined as the product of the likelihood ratios of the keywords. We calculated the IP score of each report in the verification data. The cut-off value for detecting IP reports was determined as the point on the ROC curve nearest to (0, 1). To evaluate the accuracy of our IP detection method, we randomly selected another 100 chest CT reports and applied the same method to them.
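The scoring step can be sketched as follows. The likelihood-ratio values and the cut-off of 0.06 are those reported in the abstract's results; the function names and the representation of a report as a set of extracted keywords are illustrative assumptions, not the authors' implementation.

```python
def ip_score(report_keywords, lr_pos, lr_neg):
    """IP score: product over the keyword vocabulary of LR+ if the keyword
    is present in the report's findings section, LR- if it is absent."""
    score = 1.0
    for kw in lr_pos:
        score *= lr_pos[kw] if kw in report_keywords else lr_neg[kw]
    return score

# Positive/negative likelihood ratios reported for the three most
# characteristic keywords
LR_POS = {"honeycombing": 740.0, "collagen": 140.0, "interstitial pneumonia": 56.20}
LR_NEG = {"honeycombing": 0.75, "collagen": 0.95, "interstitial pneumonia": 0.06}

CUTOFF = 0.06  # cut-off chosen as the ROC point nearest (0, 1)

def is_ip_report(report_keywords):
    """Classify a report as IP if its score reaches the cut-off."""
    return ip_score(report_keywords, LR_POS, LR_NEG) >= CUTOFF
```

A report mentioning only ''honeycombing'' scores 740 × 0.95 × 0.06 and is flagged as IP, while a report containing none of the three keywords scores below the cut-off.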
The keywords ''honeycombing'', ''collagen'' and ''interstitial pneumonia'' showed higher positive likelihood ratios. These keywords are thought to be the words most characteristically associated with IP. The likelihood ratios for the positive and negative findings of these keywords were 740 and 0.75 for ''honeycombing'', 140 and 0.95 for ''collagen'', and 56.20 and 0.06 for ''interstitial pneumonia'', respectively. Using test dataset 1, the cut-off value was 0.06, the sensitivity was 0.95 and the specificity was 0.98. For the detection of IP using test dataset 2, the sensitivity was 0.89 and the specificity was 0.99. Conclusion Using this method, we could predict the occurrence of IP with high specificity from accumulated image reports in a hospital information system.
Gamma radiation in decontamination of jerked beef Silva M, V. Vicalvi MC, Batista LP, G. Solidônio E, X. F. R. Sena K, Colaço W Food-borne diseases are a major health problem in many countries. This concern relates to the fact that in recent years the occurrence of foodborne diseases has become frequent due to failure to comply with hygienic and sanitary standards. The processing of jerked beef is very similar to that of charque, and at all stages of its technological processing the meat is exposed to contamination, especially in operations where it is handled more. The objective was to determine which radiation dose among 2 kGy, 4 kGy and 6 kGy would be most effective in decontaminating the product sold in a large supermarket network in Recife. Jerked beef is sold in vacuum packages weighing 500 g each; for this work, six 500 g bags were purchased and divided into two different batches (each of three samples). Under sterile conditions, the meat was cut, placed in Petri dishes and weighed; each sample was divided into eight subsamples of 25 g each, generating 48 subsamples.
Of these, 12 subsamples were assigned to the control group and the remaining 36 subsamples were taken to the irradiator with a cobalt-60 source (MDS Nordion Gammacell 220 Excel). The subsamples were added to an Erlenmeyer flask with 225 ml of sterile water and agitated for 15 min to create a wash water; another part was added to an Erlenmeyer flask with 225 ml of sterile water and stored at room temperature for 14 h to form a desalting water. Aliquots of 1 µl of these waters were removed, streaked by the depletion method onto sheep blood agar and incubated at 35 °C for 24 h for analysis of bacterial growth and the microbial count. Using the wash-water methodology, no growth was observed on any plate. For the desalting water, the results were as follows: in the first experiment, control group values were between 5.133 × 10^5 and 9.56 × 10^8 CFU/g; after irradiation, values ranged from 1.7 × 10^5 to 2 × 10^6 for 2 kGy, 0 to 6 × 10^4 for 4 kGy, and 0 for 6 kGy. Statistical analysis between the control and the doses showed no significant difference for the 2 kGy dose (p = 0.130919), but a statistical difference for the doses of 4 kGy (p = 0.040510) and 6 kGy (p = 0.047905). In experiment two, control values ranged from 2.3 × 10^9 to 4.1 × 10^9 CFU/g; for the doses of 2 kGy, 4 kGy and 6 kGy the respective values were 6.5 × 10^7 to 1.05 × 10^9, 1.7 × 10^5 to 1.76 × 10^5, and 0 to 1.3 × 10^4. As in the first experiment, there was no statistically significant difference for the 2 kGy dose (p = 0.079057), while differences were significant for 4 kGy and 6 kGy (p = 0.028125 and p = 0.028151, respectively). We can conclude that the dose of 2 kGy is ineffective, and doses of 4 and 6 kGy are effective in decontaminating jerked beef, as there is a statistically significant difference between the control and these doses. Keywords Jerked beef, Food contamination, Irradiation.
Trends of advanced breast cancer incidence rates after implementation of the mammography screening program in the Regierungsbezirk (RB) Münster Simbrich A 1 , Wellmann I 1 , Heidrich J 2 , Heidinger O 2 , Hense HW 1 1 Institute of Epidemiology and Social Medicine at the University of Münster, Münster, Germany; 2 Epidemiological Cancer Registry of North Rhine-Westphalia, Münster, Germany Mammography screening programs (MSP) aim to detect early-stage breast cancers and to decrease the incidence of advanced stages in order to reduce breast cancer mortality. We investigated whether the incidence of advanced-stage breast cancers changed in the MSP target population of women aged 50-69 years since implementation of the MSP in 2005. In the RB Münster, 13,879 women in the target population were newly diagnosed with invasive breast cancer (BC) between 2000 and 2013. Data from the population-based epidemiological cancer registry were used to stage the cancers. Missing values (10.4 %) were imputed using a fully conditional specification (FCS) method, in which the missing values for all variables are filled in sequentially, one variable at a time. The incidence rates for early-stage (UICC I) and advanced-stage (UICC II+) BC were determined, and a joinpoint analysis was performed using 3-year moving averages. The incidence of UICC I BC increased rapidly after the introduction of the MSP in 2005 and declined modestly after 2010. The incidence of UICC II+ BC rose after the start of the MSP and decreased markedly after 2009 to levels lower than in the pre-MSP phase 2000-2004. The observed decreases in UICC II+ BC incidence occurred only in the age groups 55 to 74 years, corresponding to annual percent changes (APC) ranging from -4.6 to -7.3. In the age groups of 70-74 and 75-79 years, an increase in UICC II+ BC incidence could be observed after 2010 and 2006, respectively. No changes in the incidence of UICC II+ BC were found for the age group 50-54 years.
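The smoothing and trend measures used above can be sketched as follows. The joinpoint regression itself (locating the change points by segmented log-linear fits) is more involved and is not shown; the functions and the example rates in the tests are illustrative, not registry data.

```python
def moving_average_3y(rates):
    """Centered 3-year moving average of annual incidence rates
    (the first and last years have no centered window and are dropped)."""
    return [sum(rates[i - 1:i + 2]) / 3.0 for i in range(1, len(rates) - 1)]

def annual_percent_change(rate_start, rate_end, n_years):
    """APC implied by two rates n_years apart, assuming a constant
    log-linear trend within the segment (as in a joinpoint segment)."""
    return ((rate_end / rate_start) ** (1.0 / n_years) - 1.0) * 100.0
```

For instance, a rate falling from 100 to 80 per 100,000 over five years corresponds to an APC of roughly -4.4 %, of the same order as the changes of -4.6 to -7.3 reported above.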
Participation in the MSP over the years ranged between 53 and 59 %. The incidence rate of advanced-stage breast cancers in the MSP target group decreased below pre-screening levels. Such changes were not seen in age groups newly entering the program or in those at least five years above the eligible age range. Our results indicate that the MSP holds the potential to reduce BC mortality in the target population.
Background Bayesian networks (BNs) can provide important information in the diagnosis and prognosis of breast cancer [1]. Through such analysis, it is possible to represent the interactions between the variables of interest and quantitatively measure the impact on a particular outcome (such as breast cancer) [2]. Objective The goal of the work described here was to correlate the findings of mammography and/or ultrasound of the breasts and other signs and symptoms with a diagnosis of breast cancer using a BN. Methods The data used in the training and test databases were obtained from the radiology service. The subjects were 164 women with a total of 170 nodules that underwent core biopsy between July 2014 and February 2015. The nodules had been imaged at least once and classified as Breast Imaging Reporting and Data System (BI-RADS) 3, 4, or 5. The BN was developed with Netica software, Version 5.17 (Norsys Software, Vancouver, BC, Canada, http://www.norsys.com) and comprised the patient's profile and signs and tests related to breast cancer. The output node was the outcome (malignant, benign). BN entries included family history (yes, no), palpable nodule (yes, no), BI-RADS USG category (3, 4a, 4b, 4c, 5), BI-RADS mammography category (0, 1, 2, 3, 4, 5), age (≥50, <50 years), and nodule size (>2, ≤2 cm). The knowledge needed to explain the quantitative and qualitative parts of the BN was acquired using Bayesian learning.
One problem a BN can address is missing information, which occurs in most databases; in this context, one of the most frequently used methods for Bayesian learning is the expectation-maximization (EM) algorithm [3]. For preparation of the BN, the database was divided into a training set (n = 85 nodules, 23 malignant) and a test set (n = 85 nodules, 22 malignant). Results The total sample consisted of 170 nodules belonging to 164 women aged 46.1 ± 14.2 years (mean ± SD, range: 19-89 years). In general, without considering any variable, the probability that a nodule was malignant was 26.7 %. On testing, the sensitivity of the BN was 93.3 % (95 % confidence interval [CI] 82.9-96.0), the specificity was 88.6 % (95 % CI 81.1-96.0), and the accuracy was 89.4 %. Conclusions We conclude that the sensitivity, specificity, and accuracy of our BN approach were similar to those obtained in related research, but we believe that the inclusion of variables such as breast density, lesion shape, and history of hormone replacement therapy may improve network performance. Background Diabetic retinopathy, which causes visual impairment/blindness, is one of the most common microvascular complications of diabetes [1]. It is considered the most frequent cause of blindness in adults between 20 and 60 years of age, and 45 % of those affected have diabetes mellitus [2]. Objective The aim of this study was to evaluate the accuracy of decision support systems in the diagnosis of diabetic retinopathy by means of a systematic review and meta-analysis of studies of diagnostic accuracy. Methods An exhaustive search of Medline, Embase, and the gray literature was performed for publications from 1970 through 2015; cross-sectional and cohort primary studies of diagnostic accuracy that evaluated images of individuals with diabetes (target condition) with or without diabetic retinopathy by means of decision support systems (test under evaluation) were included. 
The meta-analysis was performed with Meta-Disc software, Version 1.4. Results Twelve primary studies containing 10,195 images met the inclusion criteria and were analyzed. The overall sensitivity was 89.4 % (95 % confidence interval [CI] 87.8-90.9), and the specificity was 64.8 % (95 % CI 63.8-65.8). Diabetic retinopathy was observed in 1,624 images (pretest probability-posttest prevalence difference = 15.93 %), whereas there was no evidence of diabetic retinopathy in 8,571 (84.07 %) images. In the meta-analysis, 89.41 % (1,452/1,624) of cases assessed as diabetic retinopathy and 35.18 % (3,015/8,571) of cases assessed as negative with the gold standard were positive in the decision support system; the odds ratio for overall diagnosis was 55.06. Heterogeneity in sensitivity (p < 0.001) and high inconsistency (I² = 88.7 %) [3] were observed. The overall specificity analysis also revealed heterogeneity (p < 0.001) and high inconsistency (I² = 94.5 %) [3]. Because of the heterogeneity, various subgroup and sensitivity analyses were performed; however, the heterogeneity remained high. Sources of heterogeneity between studies were explored using meta-regression analysis with clinical and technological co-factors. There was no evidence that these co-factors were the cause of the heterogeneity. Therefore, in the use of decision support systems, the chance of a positive outcome in subjects with diabetic retinopathy is 55.06 times greater than the chance of a positive outcome in individuals with a normal retina on imaging. Despite the heterogeneity between the studies included in our meta-analysis, the chance of the decision support system being right in the diagnosis of diabetic retinopathy was high. The results indicate that these systems represent an accurate and noninvasive means of diagnosing diabetic retinopathy. 
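The sensitivity and specificity figures above can be reproduced from the aggregate 2×2 counts the abstract reports; a minimal sketch follows. Note that the abstract's pooled odds ratio of 55.06 comes from a study-weighted meta-analysis, so the crude single-table diagnostic odds ratio computed here will differ.

```python
import math

def diagnostic_accuracy(tp, fp, fn, tn):
    """Crude sensitivity, specificity and diagnostic odds ratio (DOR) from
    one 2x2 table, with a Wald 95 % CI for the DOR on the log scale. A
    meta-analytic pooled OR additionally weights the individual studies,
    which this single-table sketch does not."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    dor = (tp * tn) / (fp * fn)                     # diagnostic odds ratio
    log_se = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)   # SE of log(DOR)
    ci = (math.exp(math.log(dor) - 1.96 * log_se),
          math.exp(math.log(dor) + 1.96 * log_se))
    return sens, spec, dor, ci

# Aggregate counts reported in the abstract: 1,452/1,624 gold-standard
# positives and 3,015/8,571 gold-standard negatives flagged by the systems
sens, spec, dor, ci = diagnostic_accuracy(tp=1452, fp=3015, fn=172, tn=5556)
```

The crude sensitivity (89.4 %) and specificity (64.8 %) match the pooled values in the abstract because those were computed over total counts.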
Patterns and consequences of multimorbidity in the general population: there is no chronic disease management without rheumatic diseases management Simões D 1,2 , Araújo F 1 , Severo M 1,3 , Monjardino T 1,3 , Cruz I 1,4 , Carmona L 5 , Lucas R Background A substantial part of the adult population suffers from the concurrent presence of more than one chronic disease, a phenomenon known as multimorbidity. Using a large representative national survey, we aimed to identify empirical model-based patterns of multimorbidity from chronic non-communicable diseases, with a focus on the contribution of rheumatic and musculoskeletal diseases (RMDs), and to quantify their association with adverse health outcomes. Methods Cross-sectional data from the Portuguese Fourth National Health Survey were analyzed (n = 23,754). Latent class analysis was used to identify patterns of coexistence of 11 chronic non-communicable diseases (RMDs, diabetes, hypertension, chronic obstructive pulmonary disease, stroke, depression, myocardial infarction, cancer, osteoporosis, asthma and renal failure). Based on the OMERACT Filter 2.0, health outcomes included life impact (self-rated health, short- and long-term disability), pathophysiological manifestations (chronic pain) and resource use indicators (frequent healthcare utilization and recent out-of-pocket healthcare expenses). We assessed the association between patterns and adverse health outcomes through sex-, age- and body mass index-adjusted prevalence ratios (adjPR) with 95 % confidence intervals (95 % CI), obtained using Poisson regression. Four patterns of chronic non-communicable disease co-occurrence were identified; the fourth was labeled ''RMDs and Depression''. RMDs were highly prevalent in chronic patients (from 38.6 % in pattern 2 to 66.7 % in pattern 4). 
While negative self-rated health, short-term disability and chronic pain were more strongly associated with patterns 3 and 4, multimorbidity patterns 2 to 4 were similarly associated with long-term disability, healthcare utilization and out-of-pocket healthcare expenses. Subjects in pattern 3 had the strongest associations with all the adverse health outcomes, e.g. negative self-rated health (adjPR 1.99). Subjects assigned to pattern 2 had the lowest prevalence of adverse health outcomes, but were still more likely to report negative self-rated health (2.37, 95 % CI 2.23-2.53), short-term disability (2.07, 95 % CI 1.86-2.29), long-term disability (1.77), chronic pain (1.54), healthcare utilization (1.96), and out-of-pocket healthcare expenses (1.78). Our study emphasizes RMDs as a major presence in multimorbidity in the general population. Management strategies for the chronic patient with cardiometabolic, respiratory or depressive conditions should also target RMDs. The association of all chronic non-communicable multimorbidity patterns with a wide spectrum of adverse health outcomes emphasizes the potential of empirical methods to translate the patient-oriented model from clinical management to public health, and thus to inform chronic disease management policies. Background Although osteoporosis has been observed worldwide, this disease is not distributed proportionally, as the prevalence of osteoporosis varies among countries due to the different characteristics of different populations and the use of public health resources [1]. Indeed, in 2000, the European Union spent 32 billion Euros on healthcare costs associated with this disease [1, 2, 3]. Objective To apply the J48 algorithm as a classification task to data from women exhibiting changes in bone densitometry. Methods Technological study. 
The database used in these experiments was derived from a cross-sectional study involving 1871 women who underwent bone densitometry at a specialized service in the southern region of Santa Catarina from January to December 2012; the study was approved by the Ethics Committee under protocol 82939/2012. Initially, the information regarding the group of subjects with osteoporosis, including a total of 14 variables, was mined using preprocessed attributes as independent variables: age bracket, previous fracture, number of previous fractures, previous fracture of the femoral neck, previous spine fracture, previous fracture of the forearm, previous fracture of the rib, use of drugs, menopause, use of calcium, hormone replacement therapy, use of thyroid drugs, hysterectomy, oophorectomy, diagnosis, body mass index (BMI), weight, and three degrees of obesity. After the data mining (DM) step, only those rules representing an association higher than 5 % and a rate of correctly classified records greater than or equal to 60 % were analyzed. A total of 1385 rules were generated, and the best rules were selected for further evaluation using the WEKA tool, considering correctly and incorrectly classified instances, the kappa coefficient, the mean absolute error, the area under the ROC curve, sensitivity, specificity, and accuracy. The results of the classification task were analyzed using the J48 algorithm. Among the analyzed rules, the following associations with a diagnosis of osteoporosis were highlighted: osteoporosis was observed in women of up to 49 years of age (183 occurrences, 7 errors); in women 50 years or older with a BMI > 26 (520 occurrences, 125 errors); in women older than 50 years of age who use calcium, without fractures (527 occurrences, 121 errors); and in subjects older than 50 years of age who use calcium and have a BMI ≥ 26, without fractures (109 occurrences, 43 errors). 
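J48 is Weka's implementation of the C4.5 decision-tree learner; an analogous entropy-based tree can be sketched with scikit-learn. This is a stand-in (CART with entropy, not C4.5, which differs in pruning and multiway splits), and the toy attributes and labels carry no clinical meaning.

```python
# Entropy-based decision tree as a rough analogue of Weka's J48 (C4.5).
# scikit-learn's CART is a stand-in assumption; toy data only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy binary attributes: [age>=50, previous fracture, BMI>26, uses calcium]
X = np.array([
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
])
y = np.array([1, 0, 0, 1, 0, 1, 0, 0])  # 1 = osteoporosis (toy labels)

tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
# Human-readable rules, analogous to the rule lists mined with WEKA
rules = export_text(tree, feature_names=["age50", "fracture", "bmi26", "calcium"])
```

`export_text` prints the tree as nested if/else rules, which is the closest scikit-learn analogue to the association rules evaluated in the abstract.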
Conclusion The analyzed results showed higher values of sensitivity, accuracy and area under the ROC curve in experiments conducted with individuals with osteoporosis. The majority of the generated rules were consistent with the literature, and the few differences might serve as hypotheses for further studies. Based on these results, we recommend that future researchers conduct DM experiments with larger databases and apply other tasks, methods and DM algorithms to deepen the generated knowledge. Impact of socio-economic position on cancer stage at presentation All but two of the socio-economic parameters were associated with advanced stage, always in the direction that patients with a poorer socio-economic position were more likely to present with advanced stage. Our findings provide evidence that, in Germany, employment and income in particular are related to tumour stage. It is of great concern that these socio-economic gradients occur even in systems with equal access to health care. Measures should be taken to reduce this inequality. Incidence and survival of neuroendocrine tumors in Lower Saxony: a population-based study Sirri E 1 , Vohmann C 1 , Kieschke J 1 1 Epidemiologisches Krebsregister Niedersachsen (EKN) -Registerstelle -, Oldenburg, Germany Neuroendocrine tumors (NETs) are rare neoplasms (incidence: 1-5/100,000 person-years) that originate from neuroendocrine tissues localized in different organ systems. An increasing incidence of NETs has been observed in many countries. NETs are more common in the gastrointestinal tract (GIT) and the bronchopulmonary system. Despite their identification more than a century ago, NETs remain a poorly understood disease: clear causative factors have not yet been delineated, and classification of the disease has faced challenges. The latest WHO classification guideline (2010) takes biological behavior into account for the GIT in order to apply risk stratification. 
Prognostic factors for NETs include grading, topography, morphology and sex. In general, there is a paucity of epidemiological data on NETs in Germany. This study describes the incidence rates and relative survival (RS) of NETs in Lower Saxony (LS). Patients of all ages diagnosed in 2004-2013 with NETs (ICD-O-3 classification of all cancer sites with malignant morphology codes 8013, 8041-8045, 8150-8153, 8155-8157, 8240-8242, 8245-8249, and 8240/1), extracted from the database of the LS Cancer Registry, were analyzed. For this study, composite carcinoids, medullary carcinoma and Merkel cell carcinoma were excluded. Death certificate only cases were included in the incidence calculations but excluded from the survival computations. Incidence rates were estimated for NETs overall, by sex and by topography. For the survival calculations, only patients with malignant NETs aged 15-99 years (n = 12,570) were analyzed. The cohort approach was employed to derive 5-year RS by sex and by topography (gastrointestinal tract (ICD-O-3 C15-C26), separately for pancreas (C25), and lung (C34)). In LS, a total of 13,574 patients were diagnosed with NETs in 2004-2013: 8,220 men (61 %) and 5,354 women (39 %). Median age at diagnosis was 67 years for men and 65 years for women. Almost 71 % of NETs originated from the lung, 20 % from the GIT, and 5 % were of unknown primary site. Overall, the age-adjusted (European standard) incidence rate was 15/100,000 (men) and 10/100,000 (women). After excluding NETs of the lung, the incidence was 3.9 and 3.3/100,000 for men and women, respectively. The incidence increased from 3.5 in 2004 to 4.9/100,000 in 2013 for men and almost doubled for women, from 2.6 to 4.8/100,000. The overall 5-year RS for NETs excluding the lung was better for women than for men (57.3 vs. 47.6 %). The same pattern was observed for all NETs of the GIT (68.8 vs. 58.5 %), as well as for NETs of the pancreas (53.8 vs. 45.1 %). NETs of the lung showed the lowest 5-year RS (women: 14.7 % vs. men: 8.4 %). 
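Age-adjusted rates like those above are obtained by direct standardization against a standard population. A minimal sketch follows; the three coarse age groups and toy weights are assumptions (the real European standard population uses 18 five-year age groups).

```python
def direct_standardized_rate(cases, person_years, std_pop):
    """Directly age-standardized incidence rate per 100,000 person-years.
    `cases`, `person_years` and `std_pop` are parallel lists over age groups:
    each group's crude rate is weighted by its standard-population share."""
    assert len(cases) == len(person_years) == len(std_pop)
    weighted = sum(c / py * w for c, py, w in zip(cases, person_years, std_pop))
    return weighted / sum(std_pop) * 100_000

# Illustrative numbers only: three coarse age groups, toy standard weights
cases = [10, 40, 80]
person_years = [500_000, 400_000, 200_000]
std_weights = [40_000, 35_000, 25_000]
asr = direct_standardized_rate(cases, person_years, std_weights)
```

Because the weights come from the standard population rather than the observed one, rates standardized this way are comparable across regions and calendar years despite differing age structures.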
Our observation of an increasing incidence of NETs concurs with other studies. Subsequent analyses will consider the 2010 WHO classification and more topographical subtypes, especially for NETs of the GIT. The results are expected to give some insight into why women survive better than men, as shown in this study. Background The application of metabolomics techniques to human stool samples is becoming increasingly common. Usually, some time passes between sample collection and long-term storage in the freezer, allowing undesired enzymatic reactions or degradation of metabolites. The aim of this study was to assess the effects of pre-freezing storage period and temperature on the stability of metabolites in human stool samples. Methods Stool samples from six persons were collected, aliquoted and stored at two different temperatures (4 °C and room temperature) over different storage periods. Cooled storage periods were 4, 12, 24, and 48 h; room-temperature periods were 12, 24, and 48 h; the immediately frozen aliquot was used as the reference sample. Metabolites were analyzed by means of the Metabolon platform (Metabolon Inc.). To examine the influence of the different pre-freezing conditions, the nonparametric Friedman test for repeated measures and Dunnett's test for multiple comparisons were used. We also performed a principal component analysis to visualize the associations. Overall, 393 different metabolites were identified in the stool samples. For the samples stored at 4 °C over different time periods, we observed only a few metabolites with significant deviations in their concentration (4 h: 0 metabolites, 12 h: 1 metabolite, 24 h: 3 metabolites, 48 h: 1 metabolite). As expected, samples kept at room temperature over different time periods yielded more metabolites with significant changes in concentration (12 h: 13 metabolites, 24 h: 2 metabolites, 48 h: 13 metabolites). 
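The repeated-measures comparison described in the Methods can be sketched with SciPy's Friedman test. The metabolite concentrations below are invented toy numbers, and Dunnett's post-hoc comparison against the frozen reference is omitted here.

```python
# Friedman test for repeated measures across storage conditions, applied to
# one metabolite measured in the same six subjects under each condition.
# Toy numbers only, not study data.
from scipy.stats import friedmanchisquare

frozen_ref = [1.00, 0.95, 1.10, 0.98, 1.05, 1.02]  # immediately frozen aliquot
h4_4c      = [1.01, 0.96, 1.08, 0.99, 1.04, 1.01]  # 4 h at 4 degrees C
h48_rt     = [1.40, 1.32, 1.55, 1.38, 1.45, 1.41]  # 48 h at room temperature

stat, p_value = friedmanchisquare(frozen_ref, h4_4c, h48_rt)
```

The test ranks each subject's three measurements and asks whether the rank sums differ across conditions; in this toy example the room-temperature aliquot is always ranked highest, so the test flags the conditions as different.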
However, no clear time trend was observed in the samples stored over different time periods at 4 °C or at room temperature. For some of the stool metabolites we identified significant variability over the storage periods; overall, however, we found no indication of linear changes over time. According to our data, storage of stool samples at 4 °C is preferable to storage at room temperature. Leukemia mortality and low dose ionizing radiation in the WISMUT uranium miner cohort Sobotzki C 1 , Fenske N 2 1 Bundesamt für Strahlenschutz - Federal Office for Radiation Protection, Oberschleissheim, Germany; 2 Bundesamt für Strahlenschutz - Federal Office for Radiation Protection, Germany Background An increased risk of leukemia following acute exposure to high doses of ionizing radiation is well established, while the risks associated with protracted low-dose or internal exposures are unclear. Analyses were conducted in a cohort of 58,982 German uranium miners with mortality follow-up from 1946 to 2008. The red bone marrow (RBM) dose for low-linear energy transfer (LET) radiation (mainly external gamma radiation) and high-LET radiation (mainly radon gas) was estimated based on a job-exposure matrix and biokinetic/dosimetric models. Linear excess relative risks (ERR) and 95 % confidence intervals (CI) were estimated via Poisson regression models for chronic lymphatic leukemia (CLL) and non-CLL. The mean cumulative low- and high-LET RBM doses among the 86 % of workers with radiation exposure were 47 mGy and 9 mGy, respectively. There was a positive, non-significant linear dose response for mortality from non-CLL (n = 104) in relation to low-LET (ERR/Gy = 1.8; 95 % CI -0.8; 6.2) and high-LET dose (ERR/Gy = 19.2; 95 % CI -0.5; 53.7). This excess was mainly related to chronic myeloid leukemia (n = 29), with an ERR/Gy of 5.7 (95 % CI -0.31; 21.4) and 36.4 (95 % CI -2.3; 149.8) for low- and high-LET, respectively. 
The ERR/Gy tended to be about five to ten times higher for high- versus low-LET RBM dose; however, the confidence intervals largely overlapped. The results do not indicate any association of death from CLL (n = 54) with either of the two types of radiation. Conclusion Our findings support an increased risk of death from non-CLL in relation to chronic low-LET radiation and alpha radiation, and no such relation for CLL. To increase patients' empowerment and involvement in their own health, several countries have decided to provide patients with electronic access to their health records. This paper reports on the main findings from sub-studies and pilots prior to the large-scale implementation of patients' access to their medical records in the Northern Norway Regional Health Authority in December 2015. All patients aged 16 years or older have digital access to their own EHR. Parents of children under the age of 12 have digital access to their children's records as well. By default, all documents are available in digital format as soon as they are approved (signed), unless exempted from disclosure. The following research questions have been addressed: Who asks for access to the patient record today? What are the rationale and requirements for electronic access to the patient record? What are the attitudes and experiences regarding use of the Internet for health purposes? What are the patients' experiences from piloting the service? Methods Data for the participatory design process was collected through questionnaires and interviews. Multiple methods have been used: 1. Registration of patient requests to the hospital regarding access to the record, followed by a questionnaire. 2. A national survey conducted by TNS Gallup in November 2013 on the use of the Internet for health purposes. 3. Two pilots on real patient data. The largest pilot included nearly 500 patients. 
Patients wanted digital access to their records because they wished to learn more about their diagnosis and treatment, to share the information with others, and to check whether it was correct. The national survey showed that a total of 78 % had used the Internet for one or more health-related purposes, and electronic access to the EHR was one factor that mattered when choosing a GP. The usability tests from the pilots revealed that the service in general functioned as expected. The patients reported that they would continue to use the service, would recommend it to others, and generally had no problems understanding the content. The positive results may stem from developing the service step by step, based on the patients' requirements and needs. It has been crucial to allow enough time to evaluate every step and to implement necessary improvements before moving to the next phase. Timing has been another important factor; without a national health portal and a secure log-in mechanism widely in use, it would have been challenging to establish a full-scale service. The interdisciplinary outpatient clinic of the German Center for Vertigo and Balance Disorders (DSGZ) at the Munich University Hospital of the Ludwig-Maximilians-Universität München treats more than 3000 patients per year. Up to now, data from the clinical investigations have been stored decentrally at different locations, each using different systems and formats. As a consequence, systematic analysis is possible only with considerable effort. Therefore, the installation of a patient-based registry (DizzyReg) is an important objective of the DSGZ. Central access to and storage of patient data in a prospective database can facilitate interdisciplinary research and makes it possible to prospectively analyze determinants and patient-relevant outcomes of different, even rare, vestibular disorders. Moreover, the registry can provide well-characterized cases for case-control studies. 
Methods Core data will be imported directly by sending an HTTP request to the clinical workplace system (KAS). Manual data come from the medical history, such as the duration and type of vertigo attacks; the clinical investigations, such as the video head impulse test to measure the vestibulo-ocular reflex; and neurological examinations, such as Romberg's test. Data sources currently include medical reports and examinations. Further measurements are collected by means of a questionnaire and include information on socio-demographic data, health care utilization, and health-related quality of life. Data are stored on a virtual server with a static IP, a fixed domain, and a Secure Sockets Layer certificate, located at the hospital. Collection and management of the data are realized in a browser-based data management system. Data will be released only anonymously and after a positive vote of an internal steering group. Postal or telephone follow-ups are planned. Within the first six weeks, 136 persons agreed to store their data in the patient registry (response rate 57 %). Reasons for refusal were cognitive impairment (9 %), not being of legal age (9 %), lack of language skills (9 %), poor health condition (8 %), other technical failures (6 %), and refusal on principle (60 %). Acceptance among the patients is high. In further steps, data from children and young people will be incorporated and the overall process further optimized. DizzyReg is the first prospective registry that can be used for investigations of clinically and precisely characterized vestibular diseases. Implementation of an ODM and HL7 compliant electronic patient-reported outcome system Soto-Rey I 1 , Dugas M 1 , Storck M 1 1 Interoperability is one of the biggest issues in health informatics, despite the huge effort invested in solving it. The Clinical Data Interchange Standards Consortium (CDISC) and Health Level 7 (HL7) are two of the most recognized institutions working in this field. 
Several systems are becoming compliant with their standards; however, the process of accomplishing this is not always straightforward. In this manuscript, we present the successful implementation of CDISC ODM and HL7 import and export functions for ''MoPat'', a web-based multi-language electronic patient-reported outcomes system. The system has been evaluated and tested and is currently being used for clinical study and routine data collection, including more than 10,000 patient encounters. At the meta-analysis and network meta-analysis level, direct allocations of missing binary outcome data are based on a limited number of assumptions about the missing data (and usually on very extreme scenarios), thereby restricting the sensitivity assessment of the implications of missing outcome data. Hollis proposed a graphical sensitivity assessment of missing binary outcome data in a two-arm clinical trial, which provides a complete assessment of the impact of missing outcome data on the trial results. This graphical sensitivity assessment was extended to a published pairwise meta-analysis of randomized controlled trials in the field of dentistry and to a published network meta-analysis on the comparative efficacy of 12 antimanic treatments and placebo in acute mania. All analyses were implemented within the Bayesian framework using non-informative normal prior distributions centered at zero with a large variance for the location parameters, and a normal distribution truncated at zero for the heterogeneity standard deviation. The results of the meta-analysis were particularly affected by the allocation of missing outcome data as events in either arm, with respect to both the magnitude and the strength of the results, leading to unrealistic conclusions. 
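For a single two-arm trial, the Hollis-style sensitivity assessment amounts to sweeping the possible allocations of participants with missing outcomes to events or non-events and mapping the resulting effect estimates. A minimal sketch of the four corner scenarios (not the full contour, and with invented trial counts) follows.

```python
def odds_ratio(events_t, n_t, events_c, n_c):
    """Crude odds ratio of treatment vs control from observed 2x2 counts."""
    a, b = events_t, n_t - events_t
    c, d = events_c, n_c - events_c
    return (a * d) / (b * c)

def corner_scenarios(e_t, n_t, m_t, e_c, n_c, m_c):
    """Odds ratios when the m_t / m_c participants with missing outcomes are
    all imputed as non-events (0) or events (1) in each arm; the full
    graphical assessment interpolates between these corners."""
    return {
        (impute_t, impute_c): odds_ratio(
            e_t + impute_t * m_t, n_t + m_t,
            e_c + impute_c * m_c, n_c + m_c)
        for impute_t in (0, 1) for impute_c in (0, 1)
    }

# Toy trial: 30/100 observed events plus 10 missing (treatment),
# 20/100 observed events plus 10 missing (control)
ors = corner_scenarios(e_t=30, n_t=100, m_t=10, e_c=20, n_c=100, m_c=10)
```

The spread between the largest and smallest corner odds ratio shows how far the missing data could, in principle, move the trial's conclusion; the contour plots described above fill in all intermediate allocations.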
The graphical evaluation of all possible imputation scenarios in the efficacy of the 12 antimanic treatments relative to placebo provided consistent conclusions for all placebo comparisons, apart from asenapine for scenarios that deviate from the best-case scenario, carbamazepine for scenarios close to the worst-case scenario, and ziprasidone for scenarios that allocated fewer missing outcome data as events to this treatment than to placebo. There were only small changes in the ranking of most of the treatments, possibly due to the low rate of missing outcome data. The impact of the allocation scenarios on heterogeneity, especially for the best- and worst-case scenarios, was large. Finally, the consistency model achieved a better trade-off between model fit and complexity than the inconsistency model. However, only the all-missing-are-non-events assumption (also considered in the original publication) provided the best predictions, whereas the worst-case scenario provided the worst predictions. In a meta-analysis and network meta-analysis context, contour plots can facilitate the complete investigation of the results under a wide range of imputation scenarios for the missing binary outcome data. This approach can help to explain the implications of variations in the degree and direction of the missing binary outcome data. Background Breastfeeding is generally highly recommended, but recent reports have raised doubts about the evidence for the WHO recommendation of exclusive breastfeeding for six or four months to prevent obesity, obesity-related disorders in adulthood, allergies and neurobehavioural disorders. Dyslexia is one of the most common neurobehavioural disorders in childhood, with prevalence rates of up to 10 % depending on diagnostic criteria and region. Apart from the genetic background, which, as for other complex diseases, involves many genes, and the familial aggregation, the etiology is not completely understood. 
Although the impact of breastfeeding on a broad spectrum of neurobehavioural disorders has been extensively (and controversially) discussed, no single study has so far investigated the association between breastfeeding and reported symptoms of dyslexia later in life. Therefore, the aim of this study was to analyze the association between breastfeeding and symptoms of dyslexia prospectively, using data from two large population-based birth cohorts in Germany. Within the two ongoing German multicenter birth cohorts GINIplus and LISAplus, information on reading/spelling problems and difficulties was collected at the 10- and 15-year follow-ups. Parents answered the question of whether their child presented reading/spelling problems as well as reading/spelling disorders. Here we used the combined answers on reading/spelling problems or disorders. Breastfeeding data for the first six months of life were collected at the 6-month survey in LISAplus and the 1-year survey in GINIplus. Breastfeeding was defined as exclusive breastfeeding for at least the first four months of life. Thus, data from 4411 children aged 10 years and 4379 adolescents aged 15 years could be included in this analysis. Logit models were adjusted for sex, age, cohort, study center, parental education and parental psychopathology. Reading/spelling problems or difficulties were reported for 11.4 and 10.0 % of the subjects aged 10 and 15 years, respectively. Exclusive breastfeeding for at least the first four months of life was reported for 57.4 % of all subjects. The majority of the cohort members (65.2 %) have a high educational background, while for only 7.2 % of the subjects both parents have fewer than 10 years of schooling. 
Exclusive breastfeeding for at least the first four months of life was statistically significantly associated with lower adjusted odds ratios for reading/spelling problems or difficulties at age 10 years (aOR 0.75, 95 % CI 0.60-0.93, p < 0.05), while the association was attenuated (0.86 (0.70-1.07), p = 0.17) at age 15 years. Exclusive breastfeeding for at least the first four months of life might be a protective factor against dyslexia, although residual confounding cannot be ruled out so far. The WHO classification of testicular cancer recommends documenting the histological components of mixed germ cell tumors (mGCTs). In the ICD-O, these tumors are coded as 9085/3 (''mixed germ cell tumor''). Therefore, a routine analysis of ICD-O codes of testicular cancer does not permit the description of the histological types in mGCTs. The aim of this study was to review all reports, especially pathology reports, of registered mGCTs and to document the histological types involved. For the years 2008-2013, we reviewed the pathological and clinical reports (whatever was available) of mGCTs (9085/3) reported to the North Rhine-Westphalia cancer registry. Besides these tumors, we also reviewed cases coded as 9101/3 (choriocarcinoma with other components) and 9081/3 (teratocarcinoma), as teratocarcinomas are defined as tumors with a mixed histology of embryonal carcinoma and teratoma. We studied the age distribution of the various subtypes and estimated population-based crude incidence rates per million person-years. Overall, 5,263 primary malignant testicular cancer cases were registered during 2008-2013. Among the 4,916 germ cell tumors (93.4 % of all testicular cancers), 20.8 % were mGCTs. The crude incidence rate for GCTs with one histology was 75.1 (SE 1.2) and for mGCTs 19.7 (0.6). The rate of pure choriocarcinoma (n = 19 cases) (a highly aggressive GCT) was 0.4 (SE 0.1), whereas the rate of mGCTs with choriocarcinoma (n = 117 cases) was 2.3 (SE 0.2). 
Mixed GCTs without seminoma had a higher rate than mGCTs with seminoma (9.7 [0.4]). Discussion This is one of the few in-depth population-based studies on the histology of mixed testicular germ cell tumors. The most frequent mGCT consisted of embryonal carcinoma and teratoma (teratocarcinoma), followed by seminoma and embryonal carcinoma. Although the highly aggressive tumor choriocarcinoma is rare in its pure form (0.4 % of all GCTs), mixed GCTs contain a choriocarcinoma component in 11.6 % of cases. Daytime napping has been associated with both increased and decreased risks of cardiovascular events. Previous studies did not assess details of daytime napping, including the reason for napping, its location and its subjective effect. The aim of this study was to provide these details, assessed in a cross-sectional study in Germany. We used data from the innovation sample of the Socio-Economic Panel Study (SOEP), which represents a population-based sample of private households in Germany covering people aged 16 years and older. Overall, 1383 personal interviews took place between September 2013 and January 2014 among people aged 16-95 years. We estimated prevalences as well as age- and sex-adjusted prevalence ratios (PR) with 95 % confidence intervals (95 % CI). The prevalence of regular daytime napping was 14.2 and 13.6 % among men and women, respectively. Reasons for regular daytime napping were nonrestorative nocturnal sleep (7.5 and 27.1 %), habitual napping (96.8 and 81.3 %) and physical or mental exhaustion (44.6 and 61.1 %) among men and women, respectively. Regular long daytime nappers (>60 min) more frequently reported nonrestorative nocturnal sleep (28.6 versus 14.3 %; age- and sex-adjusted PR = 1.8, 95 % CI 0.9-3.4) and exhaustion (69.1 versus 48.3 %; age- and sex-adjusted PR = 1.9, 95 % CI 1.1-3.5) as reasons for their daytime napping than regular short daytime nappers. 
Regular (at least 3-4 times/week) loud (louder than usual speech) snoring and self-reported regular (at least 3-4 times/week) breathing arrests during sleep were associated with regular midday napping after adjusting for age and sex (PR = 1.9, 95 % CI 1.4-2.6 and PR = 1.9, 95 % CI 1.2-3.0, respectively). Discussion Self-reported regular loud snoring and regular breathing arrests at night, both indicators of sleep apnea, were associated with regular daytime napping. Regular long daytime napping was more frequently associated with nonrestorative nocturnal sleep and exhaustion than regular short daytime napping.

Stephan AJ 1 , Strobl R 2 , Holle R 3 , Meisinger C 4 , Schulz H 5 , Ladwig KH 6 , Thorand B 7 , Peters A 3 , Grill E 2 Introduction Although aging typically results in a successive accumulation of physiological deficits, considerable variability in the extent of deficit accumulation (DA) can be observed between individuals. Individual characteristics may therefore decelerate DA, postpone its onset, or both. Various socio-demographic, socio-economic and lifestyle factors have been shown to be associated with DA. However, determinants and trajectories of DA are still incompletely understood. The objectives of this study were to estimate the 3-year incidence of DA levels and to identify predictors of DA trajectories over time in people aged 65 and older. Methods Data originate from the 2009 baseline assessment and the 2012 follow-up of the KORA (Cooperative Health Research in the Region of Augsburg)-Age study from Southern Germany. A Deficit Accumulation Index (DAI) was constructed from 30 health deficits. Incidence calculations were based on published DAI threshold values for non-frail (DAI ≤ 0.08), pre-frail (0.08 < DAI < 0.25) and frail individuals (DAI ≥ 0.25). State transitions were examined graphically.
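The DAI construction and threshold classification described in the Methods can be sketched as follows (equal weighting of the 30 deficit items is assumed, as is usual for such indices; the study's exact item definitions are not reproduced here):

```python
def deficit_accumulation_index(deficits: list[float]) -> float:
    """DAI = mean of deficit indicators (0 = absent, 1 = fully present),
    here over the 30 health deficits described in the Methods."""
    return sum(deficits) / len(deficits)

def frailty_state(dai: float) -> str:
    """Classify using the published thresholds quoted above:
    non-frail DAI <= 0.08, frail DAI >= 0.25, pre-frail in between."""
    if dai <= 0.08:
        return "non-frail"
    if dai >= 0.25:
        return "frail"
    return "pre-frail"

# Example: 4 of 30 deficits present -> DAI = 0.133 -> pre-frail
state = frailty_state(deficit_accumulation_index([1] * 4 + [0] * 26))
```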
The effect of income, education, age, sex, marital status, physical activity (PA), smoking, alcohol consumption and body mass index (BMI) on the level of and change in DA was analyzed using generalized linear mixed models with DAI values as a continuous outcome. Data were available from 1079 participants at baseline and 822 at follow-up (mean age at baseline 76 years, range 65 to 93 years, 50 % female). Over the three-year follow-up period, out of 244 participants who were non-frail at baseline, 114 became pre-frail (incidence 47 %) and eight became frail (incidence 3 %). Higher baseline levels of DA were significantly associated with lower education, higher age, female sex, less household and leisure time PA, and higher BMI. For participants aged 65-69, 60 % (CI 54-65) …

Starting from 18 years of experience with the iPath platform, the organizers would like to challenge the audience to discuss low-cost, sustainable solutions and their extension to other regions. iPath-Network is a web application service based on an open source collaboration platform originally developed at the University of Basel. iPath is actively used in 35 countries for consultation, education and research. The system is stable, easy to use and not highly demanding in resources. For that reason it is used in many developing countries, where connectivity may be difficult or high bandwidth unavailable. At universities, internally accessible installations of iPath are used for scientific research. The targeted audience is broad, including medical staff interested in using simple IT solutions and seeking collaboration with peers, IT analysts and developers, members of academia, and students.

The Frankfurt Center for Rare Diseases (FRZSE) at the University Hospital Frankfurt is in the process of setting up a registry for undiagnosed patients utilizing the OSSE (www.osse-register.de) open source software and toolbox.
Before visiting the undiagnosed patients center at the FRZSE, patients usually have a history of consulting a large number of medical specialists and have undergone numerous diagnostic procedures. These pre-existing diagnostic procedures and their respective results are gathered in a registry to make the data accessible for further analysis and to reduce the time to a final diagnosis. The publicly funded project OSSE (Open Source Registry System for Rare Diseases in the EU) provides reusable software for rare disease registries [1]. This web-based registry solution is available as open source and, by design, focuses on ensuring interoperability between different registries. The underlying federated approach allows distributed searches, which are designed to comply with data protection requirements and preserve data sovereignty. Generally, the data structure as well as the forms have to be defined. To ensure that the configuration meets the registry's purpose, an iterative approach in close contact with the registry owners is appropriate. At first, the server was set up using the prepared OSSE CD image [2]. After the basic configuration, we defined the documented dataset [3] in the central metadata repository. Based on the feedback, the structure was fine-tuned and the data elements were split into two forms, one for basic data and one for data that may change over time. After this step, it was possible to enter data into the registry, and users began thoroughly testing the functionality and the usability. Reviewing the users' feedback surfaced changes to the data structure and forms, which were implemented accordingly. After a final testing run and the approval of the registry owners, the production system was set up and the inclusion of patients into the registry started. The feedback regarding the usability was very helpful.
However, since some of the necessary adaptations would have required modifications of the source code, the respective problems could not be addressed before the production phase. Still, that feedback greatly influenced the plan to implement new features into, and further improve, the OSSE registry toolkit. During the iterative process it turned out that merely documenting the diagnosis in the basic data form is not feasible: as the registry focuses on patients without a diagnosis, the diagnosis will be made at a later date or could even change over time. The described interactive design process should be conducted in the OSSE demo environment. As there is currently no automatic migration process to move a registry to the production environment, this has to be done manually. The implementation of a migration tool is planned for a future release. To ensure compatibility with datasets from other similar registries, the feasibility of documenting symptoms and complaints by recording codes from the HPO (Human Phenotype Ontology [4]) will be evaluated.

How to use a risk calculator: the StuDoQ E-learning tool Strahwald B 1 , Crispin A 1 , Rieger A 1 , Mansmann U 1 1 IBE, LMU München, München, Germany Background Risk calculators are decision aids for both patients and physicians. A surgical risk calculator estimates the patient-specific probabilities of postoperative complications for different interventions [1]. As such, it is a helpful tool to weigh the potential harms and benefits of an intervention. The surgical risk calculator is designed to complement counseling, not to replace it. It should be used for shared decision making (SDM), a central concept of patient-centered care [2, 3]. This process allows patient and surgeon to decide about further treatment together: the surgeon contributes his expert knowledge while the patient contributes his values and preferences. However, there are two major problems.
First, research has shown that many physicians have low numerical and statistical literacy [4, 5, 6] and are therefore unable to interpret the results of a risk calculator adequately. In particular, they will not be able to explain the meaning of estimates and the uncertainty of any model-based result. This renders informed consent of the patient impossible. Second, many physicians are not well trained in shared decision making [7]. Risk calculators could therefore be abused to persuade patients and ignore their values and preferences. Methods Development of an e-learning course for surgeons. The learning objectives are: basic principles of a risk calculator and its limitations, principles of shared decision making, and the use of the StuDoQ risk calculator in clinical practice. The e-learning course offers targeted training for surgeons. It guides the user step by step through the whole process. At any point, the user can get additional information about the structure of the risk calculator and about the principles of patient-oriented risk communication. Thus the physician learns (1) to evaluate the potential and limitations of risk calculators, and (2) how to use the results of risk calculators for shared decision making in clinical settings. The complexity and potential for ambivalent use of surgical risk calculators suggest the need to support surgeons in the use of those calculators. The e-learning course offers targeted training for surgeons on how to employ the StuDoQ risk calculator appropriately.

Comparison of underreporting calculated for three dietary assessment methods. Results of the German National Nutrition Survey II Straßburg A 1 , Eisinger-Watzl M 1 , Krems C 1 , Hoffmann I 1 1 Max Rubner-Institut, Karlsruhe, Germany Underreporting is a main problem in the assessment of dietary intake.
Food consumption and nutrient intake of under-reporters and plausible reporters were compared across diet history interviews (DHI), weighed food records (WFR) and 24-h recalls (24HR) to show potential differences between the dietary assessment methods. Methods Underreporting among 677 participants (14-80 years) of the German National Nutrition Survey II was assessed for each dietary assessment method based on the ratio of reported energy intake to calculated resting metabolic rate (Goldberg et al. 1991; Black 2000). Resting metabolic rate was determined by the formula of Müller et al. (2004), considering sex, age and body weight, and for adolescents additionally height. Nutrient intake was calculated with the German Nutrient Database (BLS 3.02). Confidence intervals were calculated on the basis of bootstrap samples. The proportion of underreporting is 23 % for DHI, 22 % for WFR and 16 % for 24HR. In all three dietary assessment methods, consumption of most food groups (except water) was lower for under-reporters compared with plausible reporters. Especially foods rich in sugar and/or fat (pastries, sweets and ice cream) were reported in considerably lower amounts by under-reporters than by plausible reporters. In the WFR, under-reporters reported sizeably lower amounts of raw and cooked vegetables than plausible reporters; this was not found for DHI and 24HR. For all three methods, median nutrient intake of under-reporters and plausible reporters differed by 30-40 % for most nutrients. Altogether, the results underline that underreporting is a problem in each of the three dietary assessment methods. For all food groups but vegetables, comparable results regarding underreporting were found across the three dietary assessment methods. Recording of vegetables, a highly inhomogeneous food group that is mostly consumed in mixed dishes, is complex.
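The underreporting screen described in the Methods, the ratio of reported energy intake to resting metabolic rate, can be sketched as follows. The cut-off of 1.35 is the commonly cited Goldberg value and may differ from the survey-specific cut-offs, and the Müller et al. (2004) RMR equations are not reproduced here, so the RMR is taken as an input:

```python
def is_underreporter(energy_intake_kcal: float, rmr_kcal: float,
                     cutoff: float = 1.35) -> bool:
    """Flag under-reporting when reported energy intake falls below
    `cutoff` times the resting metabolic rate (a Goldberg-type screen).
    The default cutoff of 1.35 is the commonly cited value, not
    necessarily the one applied in the survey."""
    return energy_intake_kcal / rmr_kcal < cutoff

# Example: 1800 kcal reported against an RMR of 1500 kcal -> ratio 1.2
flagged = is_underreporter(1800, 1500)
```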
Recording fatigue or a change in eating behaviour leading to undereating may therefore be a possible reason for the low vegetable consumption assessed with the WFR.

Background Dispensing an ePrescription for medicinal products for human use by a community pharmacist in a cross-border situation regularly poses specific challenges: the univocal identification of the medicine specified in the foreign prescription and, if this product is not authorised, not available (shortage) or substitution is required by regulation, the dispensation of a similar medicine in line with national law. With respect to the univocal identification of medicines, global efforts by the European Medicines Agency, the FDA and others focus on implementing the ISO IDMP suite (11615/16, 11238-40), Identification of Medicinal Products. The presentation analyses the related substitution challenges. Based on a literature review and analyses of EU regulations and cross-border ePrescription guidelines, an analytical framework was established. Informal discussions with experts (pharmacists, physicians, regulators) revealed that substitution is a rather ill-defined concept, and its understanding depends very much on the concrete experience and regulation in the respective country. Defining a suitable subset of univocally formulated questions (administered in English, but reflecting reality in 28 countries) for an empirical survey was a very complex process. Regular pre-testing of tentative formulations by uninvolved experts from diverse countries was applied. Data collection was via an online LimeSurvey. Potential respondents were identified through a snowball system, involving known experts, the membership of the Pharmaceutical Group EU and other associations. Almost 100 persons responded, covering 26 of 28 EU member states. Data were analysed per country and compared across countries, after triangulating and integrating multiple results per country into a single file.
Almost half of the countries have national or regional ePrescription systems. Four generic types of (e)Prescriptions prevail: specifying an individual originator medicinal product, a generic product, only an active ingredient (e.g. INN prescription), or a predefined set of products (cluster prescription). Substitution occurs mostly at the brand name level (same producer, but different name in the dispensing country) or the generic substance level (originator or generic branded product replaced by another generic product). Therapeutic substitution (different active ingredient and/or a product from a different therapeutic class) is virtually absent. Substitution along other dimensions such as dosage form, strength or route of administration is sometimes allowed within strict limits. INN or cluster prescriptions cannot be handled cross-border due to missing international databases. The Smart Open Services for European Patients (epSOS) project, implementing an infrastructure for the cross-border exchange of electronic patient summaries and ePrescriptions, identified substitution challenges as a major barrier to safely and successfully dispensing a medicine specified in a foreign ePrescription. Global efforts to establish and maintain comprehensive databases of medicinal (and pharmaceutical) products and their identifying attributes, fully structured, with mostly coded data elements and accessible in different languages, will greatly facilitate coping with this challenge. Nevertheless, the great diversity of substitution options and regulations prevailing across countries suggests better aligning regulation and/or reaching agreements on how to handle this challenge. Acknowledgement This presentation reports results from an EU-funded project involving SDOs, regulatory agencies and research organisations, with EMA, FDA, WHO and others as informal partners.
Assessing patient activation through eHealth: a multi-site study across Europe The European PALANTE project covered 7 sites catering to chronic disease patients in 6 European countries, offering a wide variety of eHealth services. A specific goal was to better understand how the use of eHealth services facilitates activation and empowerment. The eHealth services covered diabetes or CHF patients' access to their PHR; support for ankylosing spondylitis patients; online access to discharge notes or X-ray exposure data; a Personal Health Care Manager module within a nation-wide EHR system; and interactive (a) mobile or (b) internet-TV-based access to a PHR for COPD. The number of patients involved varied from 80 to several thousand per pilot. Patient empowerment was operationalised by the PAM® (patient activation measure). To explore which factors may impact the actual level of PAM and its change over time, 15 independent variables were identified (ranging from gender and income to scales measuring trust in the healthcare system and past treatment experience). Expected and experienced usefulness as well as ease of use were measured applying the Technology Acceptance Model (TAM). Pooling the data across all sites rendered only insignificant results, but analysing data per site led to highly significant results and identified certain impact factors which apply in most situations. The average PAM score for all of these pilot sites is considerably above 50 (mean). The mean scores of Sites 1 (Andalusia, diabetes) and 7b (Basque Country, asthma, TV platform) were even close to 70. These data indicate that the patient population involved was already more activated than the average patient at the start of the pilots. Across-site analysis indicates that patients' activation can almost invariably be improved, particularly by an improved treatment experience and by assuring that they gain more trust in the healthcare services provided.
For most patients involved, the expected and experienced usefulness of eHealth services is regarded as a priori given: the majority of patients were already quite empowered and trust the recommendations made by their healthcare professionals, but the perceived and experienced usability quite often had a significant impact. Other factors that were sometimes significant were the patient's location (urban, suburban, rural), age, family situation (living with others or alone), and employment situation (being employed or not). These do not hold across most sites and need interpretation in the context of the individual site. Obtaining high-quality empirical data from these diverse patient populations turned out to be more demanding than anticipated. The difficulty of translating the survey instruments into different languages without losing their exact meaning contributes to this. The results provide well-grounded, exploratory insights into what impacts patient empowerment and activation. The diversity and size of the pilots allowed a deeper analysis of a variety of independent variables expected to impact whether and to what extent citizens and patients can become (more) activated. Observing a noteworthy increase in patient activation may require an observation period longer than 6 months, and measurement at three points in time seems advisable.

These consequences, and especially the fear and insecurity generated by the anticipation of unemployment, are expected to generate mental pressure. Hence, job insecurity during the recession might have had negative effects on both mental and physical health. We analyze the effect of job insecurity on mental and physical health in Germany during the financial crisis of 2008. We contribute to the literature on the fear of unemployment and health outcomes using administrative data from a sickness fund, which have so far been neglected in this field of research.
This comes with much larger sample sizes and much more detailed health outcomes. As civil servants have very secure jobs in Germany, they were not affected by the 2008 crisis with regard to the fear of a job loss. Hence, within a difference-in-differences estimation framework, we distinguish between civil servants and all other employees before and after the peak of the crisis in 2008 to measure the effect of the fear of unemployment on various health outcomes. In our analysis we control for a long list of socio-economic and health variables. We use claims data from Germany's largest sickness fund, the Techniker Krankenkasse, with more than 10 million insured. This data source covers the years 2007-2009 and allows us to measure mental and physical health based on prescribed doses of certain drugs within one year. Using prescribed drugs instead of diseases as health outcomes has the advantage of more detailed information on the magnitude of health impairments. Preliminary results suggest that the fear of unemployment goes along with poor mental health, and especially depression, as individuals who had to fear unemployment during the financial crisis have significantly more prescriptions of antidepressants than those with secure jobs. Conclusion While job displacements are an important feature of any well-functioning market economy, they seem to have adverse consequences for those fearing job losses. We conclude that the underlying data, which have not been used in this context so far, constitute a very reliable source of information for the analyzed question as well as for various other economic studies.

From the project to standard care: the correct assessment of the benefits of telemedical applications is essential Background Telemedical services are normally not reimbursed in Germany. However, in order to increase their use, it is important to establish unified reimbursement rules.
For admission to reimbursement catalogues, proof of benefit is central. Within the self-administration entity, but also in parts of the scientific community, proof is accepted exclusively in the form of (several) randomized controlled trials (RCTs), in analogy to pharmaceutical research. Occasionally, this delays the admission of innovations. Hence, the main research hypothesis of this paper is as follows: if telemedical processes are based on an already well-established diagnostic procedure, it is less likely that the digitalization of data transfer leads to a new medical entity that would entail RCTs. This is because telemedical services differ with respect to the intensity of their technical usage as well as their ways of service provision and the degree to which they provide benefits or risks for patients. For the purpose of this study, the existing range of telemedical services is divided into three categories. A specific methodological repertoire, ranging from feasibility and acceptance analyses to RCTs, was assigned to each category for the evaluation. The evaluation is founded on an examination of the evidence base of the underlying medical model of the telemedical service. In addition, the economic risk and the risk for patients are also taken into account. The paper is based on the experience of ZTG GmbH in designing projects, on its evidence reports, on its engagement in working groups at state and federal level, and on the analysis of projects of the German ''Telemedizinportal'' (telemedicine portal). The three categories are as follows: the first consists of services whose achievements are already reimbursed; only the way of data transmission differs. Here, feasibility and acceptance analyses are planned. The second category encompasses applications which deliver evidence-based achievements through telematics processes; besides the technology, collaboration between stakeholders is modified as well.
In this case, health-economic proof based on a simplified evaluation would be sufficient. The third category encompasses applications based on a medical model that has not yet been confirmed and that is created for the first time through telemedical processes; for these, RCTs are indicated. When evaluating telematics applications it is vital to differentiate methodically. For such an evaluation to be recognized, it is important to discuss the methods used and to examine whether the quality of the study results is adequate. There are methodological, economic and ethical reasons to consider thoroughly when deciding whether to apply RCTs. In the case of applications that merely differ in the way of data transmission or rest on telecooperative procedures between health professionals, a more pragmatic form of evaluation seems sufficient.

Modeling requirements for cohort and register IT The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks such as cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The aim was to make transparent the complex relationships between requirements, which are described in use cases from a given text catalog. By analyzing and modeling the requirements, a better understanding and optimization of the catalog are intended. There are two subgoals: (a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; (b) to analyze the current state models and to find simplifications within the generic catalog. Methods Processing the generic catalog was performed by means of text extraction, conceptualization and concept mapping.
Methods of enterprise architecture planning (EAP) were then used to model the extracted information. To work on objective (a), questionnaires were developed utilizing the model. They were used for semi-structured interviews, whose results were evaluated via qualitative content analysis. Afterwards the current state was modeled. Objective (b) was addressed by model analysis. A given generic text catalog of requirements was transferred into a model. As the result of objective (a), current state models of one existing cohort study and two registers were created and analyzed. An optimized model called the KoReg reference model is the result of objective (b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current state models. Information managers could reduce the effort of planning the IT infrastructure utilizing the KoReg reference model. Modeling the current state and generating reports from the model, which could be used as requirements specifications for bids, is supported, too.

In 2012, 95,000 children and adolescents under the age of 20 were murdered, with the majority of child homicides taking place in low and middle income countries. Gender, age and geography play an important role in understanding child homicides, with boys above the age of 15 in Latin America being at highest risk of child homicide (UNICEF 2014). Despite increased knowledge on patterns of child homicides, little is known about a main risk factor: the perpetrators of child homicide and their relationship to the victim.
To address this important gap and to inform prevention strategies for the reduction of child homicide, this study combines the findings of a systematic literature review, a survey of 191 statistical and police offices worldwide and a survey of homicide experts to identify the proportions of different types of child homicide perpetrators. Across the 27 databases searched, two authors independently screened 6064 abstracts and 579 full texts, resulting in 125 included studies. These studies were matched with 11 country estimates resulting from the conducted surveys. The review identified data on perpetrators of child homicide from 31 countries, with the majority of available data originating from high income countries. There was a concerning lack of information from the regions where child homicide is most prevalent: Latin America and Central and Western Africa. Findings from the available data indicate that the majority of child homicides are committed by family members or acquaintances, with stranger homicides constituting only approximately eight percent of child homicides. Parents commit approximately half of all child homicides; among them, mothers were the main perpetrators for children under one year and fathers for children above the age of one. Child homicides are often the result of longstanding child maltreatment. A substantial proportion of child homicide perpetrators commit or attempt suicide. Child homicide often occurs as a result of familicide, the murder of more than one family member, most often in relation to intimate partner homicide, with children either being direct targets to punish the partner or being bystanders to the intimate partner homicide. The systematic review concludes that children face the highest risk of homicide within their family and from people they know, and that prevention strategies need to be devised accordingly.
The limited compilation of routine data on perpetrators of child homicide, especially outside high income countries, needs to be addressed, as does the need for separate analyses of child homicide, especially among adolescents, in official homicide statistics. This kind of information is imperative for improving prevention efforts to effectively reduce child and adolescent homicide mortality worldwide.

Identifying the risk factors of elderly cognitive impairment using feature selection and association: findings from a Singapore cross-sectional study Sudhir Pillai P 1 , Feng L 2 , Leong TY 3,1 1 School of Computing, National University of Singapore, Singapore, Singapore; 2 Department of Psychological Medicine, National University of Singapore and National University Health System, Singapore, Singapore; 3 School of Information Systems, Singapore Management University, Singapore, Singapore Mild cognitive impairment in the elderly, which may lead to full-blown dementia, has been separately associated with various demographic, lifestyle, and medical conditions of individuals [1, 2, 3]. It is unclear, however, how the different risk factors are related in contributing to the neurodegenerative processes. Advanced machine learning methods such as feature selection and association inference can help explicate the relevant relations from different sources and aspects. We aim to identify the potential risk factors for cognitive impairment and their inter-relationships based on multi-view data, using feature selection and association inference with Bayesian networks. We also aim to examine the challenges and opportunities in applying such techniques to a real-life health problem with uncertain causes, multifaceted manifestations, and progressive development patterns. We examine a multi-view dataset of n = 319 participants from a cross-sectional study on the community-living elderly in Singapore.
This dataset of 138 explanatory variables is derived from questionnaires about demographic, cognitive, nutritional, lifestyle, medical and family history and clinical characteristics of the participants, called views. Considering an individual as a data point, we first merge the heterogeneous data, which include ordinal, nominal and scale variables, and derive relevant scores such as diet and cognitive scores. Next, we perform hierarchical clustering on the dataset to obtain view-specific clusters such that similar features across data points are grouped together, using the Minkowski distance with Ward's linkage algorithm [4]. Then, we compare the utility of the views and their cluster structures by classifying the data points as diseased or healthy with a Support Vector Machine based on the representative cluster features. Similar to feature selection, which finds the most relevant and uncorrelated features, clustering minimizes within-cluster distances and maximizes between-cluster distances [5]; the cluster representative feature is the one at the least total distance from the others in its cluster. We also learn a Bayesian network for discovering the relationships among the variables in different views using the hill-climbing approach [6]. To cope with missing values and the sparse and skewed nature of certain views such as diet information, we performed nearest neighbour imputation [7], normalization, etc. Preliminary cluster analysis results show that the Mini Mental State Exam (MMSE) score, weight, age, sex, alcohol consumption, vegetable-fruit-soy, mediterranean-diet and tea scores are important from the cognitive, physical health, demographic, lifestyle and diet clusters. Diet and cognitive variables provide better classification than the others, with 66.7 % accuracy in distinguishing the cognitively impaired from the healthy. It would be better, however, to include the available classification labels while building the clusters.
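The nearest-neighbour imputation step with a Minkowski distance, as used in the preprocessing above, can be sketched as follows. This is a minimal standard-library version for illustration, not the study's actual pipeline:

```python
# k-nearest-neighbour imputation using the Minkowski distance over the
# coordinates that are observed in both rows. Missing values are None.
import math

def minkowski(u, v, p=2):
    """Minkowski distance over the coordinates where both values are known."""
    pairs = [(a, b) for a, b in zip(u, v) if a is not None and b is not None]
    return sum(abs(a - b) ** p for a, b in pairs) ** (1 / p)

def knn_impute(data, k=3, p=2):
    """Fill each missing value with the mean of that feature over the
    k nearest rows (by Minkowski distance on the observed features)."""
    filled = [row[:] for row in data]
    for i, row in enumerate(data):
        for j, value in enumerate(row):
            if value is None:
                donors = [r for r in data if r is not row and r[j] is not None]
                donors.sort(key=lambda r: minkowski(row, r, p))
                neigh = donors[:k]
                filled[i][j] = sum(r[j] for r in neigh) / len(neigh)
    return filled
```

In practice such imputation is done with a library routine; the sketch only shows the mechanics of combining a Minkowski metric with neighbour averaging.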
In the Bayesian network learnt, we could visualize relationships between the MMSE scores and tea consumption and food scores, and between anxiety and perceived deficits, which are in accordance with results from previously reported studies. The factors affecting cognitive health are manifold and should be examined both separately and together using multimodal, multi-view machine learning techniques. The complex relationships need to be examined from different perspectives, with the results systematically integrated and interpreted to provide a comprehensive and coherent view of the risk profiles. Lifestyle changes and medical or psychological interventions developed by taking these factors into consideration may help in dementia prevention and in delaying cognitive decline. With the rapid growth of clinical data and knowledge, feature construction for clinical analysis becomes increasingly important and challenging. Given a clinical dataset with up to hundreds or thousands of columns, the traditional manual feature construction process is usually too labour-intensive to generate a full spectrum of features with potential value. As a result, advanced large-scale data analysis technologies, such as feature selection for predictive modelling, cannot be fully utilized for clinical data analysis. In this paper, we propose an automatic feature construction framework for clinical data analysis, namely Feature++. It leverages available public knowledge to understand the semantics of the clinical data, and is able to integrate external data sources to automatically construct new features based on predefined rules and clinical knowledge. We demonstrate the effectiveness of Feature++ 
in a typical predictive modelling use case with a public clinical dataset, and the results suggest that the proposed approach is able to fulfil typical feature construction tasks with minimal dataset-specific configuration, so that more accurate predictive models can be built. In the study, the most common types of medical images used are X-ray and ultrasound images. The effective representation of the image format LBC is demonstrated using an image quality index, the Structural Similarity Index (SSIM). An approach to support collaborative ontology construction The increasing number of terms used in textbooks for information management (IM) in hospitals makes it difficult for medical informatics students to grasp IM concepts and their interrelations. Formal ontologies which comprehend and represent the essential content of textbooks can facilitate the learning process in IM education. The manual construction of such ontologies is time-consuming and thus very expensive [4]. Moreover, most domain experts lack skills in using a formal language like OWL [3] and usually have no experience with standard editing tools like Protégé [5, 6]. This paper presents an ontology modeling approach based on Excel2Owl, a self-developed tool which efficiently supports domain experts in collaboratively constructing ontologies from textbooks. This approach was applied to classic IM textbooks, resulting in an ontology called SNIK (Semantic Network of IM in Hospitals). Our method facilitates the collaboration between domain experts and ontologists in the development process. Furthermore, the proposed approach enables ontologists to detect modeling errors and to rapidly evaluate and improve the quality of the resulting ontology. This approach allows us to visualize the modeled textbooks and to analyze their semantics automatically. Hence, it can be used for e-learning purposes, particularly in the field of IM in hospitals. 
In clinical trials, investigating the proportion of patients with each disease who are treated in a hospital is important for determining the number of patients who can be allocated to hospitals. Japanese health insurance claims data include standardized disease and medicine data. However, the disease data have some reliability problems, because healed diseases are sometimes not deleted, or because a disease that a patient does not actually have is registered to claim the cost of an examination. Therapeutic medicines, on the other hand, are administered to target particular diseases. In this study, we developed a system for estimating the number of patients with each disease using the disease data and the therapeutic medicine data. We converted the ICD-10 code to a 4-grade classification code so that we could predict diseases in a shallow layer (e.g. gastrointestinal disease) when it was difficult to predict the precise diseases in a deep layer (e.g. gastric ulcers). A table showing the disease code and the corresponding therapeutic medicine code was provided by the Japan Pharmaceutical Information Center (JAPIC). We calculated the disease probability score from the diseases and therapeutic medicines and recorded the predicted disease. For the system evaluation, we used the health insurance claims data from Osaka University Hospital for January 2015. A total of 58,526 diseases were predicted from the health insurance claims data of 18,393 patients. One hundred twenty patients were randomly extracted for use in a chart review that was performed by an expert physician. Two hundred twenty-four of 329 predicted diseases were correctly predicted; 56 were reasonably predicted, and 49 were incorrectly predicted. The main disease was correctly predicted in 71 patients. In conclusion, we could estimate the number of patients with each disease using the health insurance claims data with a certain degree of accuracy. EPR or PHR? 
A model to clarify their relation Tange H 1 1 Maastricht University, Maastricht, Netherlands The spirit of the times is in favor of the personal health record (PHR) and against the electronic patient record (EPR). Yet, there is a place for both. In this contribution a model is proposed in which they are positioned in parallel pathways for health and healthcare. The model is based on well-accepted theories, consists of decisions, motivations, and conversations, and presents decision making as a three-step process. Decisions are connected by conversations that produce information. A model is presented in which each information item has its natural place and can be assigned to the PHR, the EPR, or both. The model may serve as a frame of mind for the discussion of how the PHR and EPR should evolve in the future. The adoption of the model will take time and effort. The scope of the research was to characterise to what extent it is possible to use electronic medical records today to perform the cancer-related studies which are traditionally done with the help of paper records, and to propose what might be possible in the future, based on current plans for capturing further electronic data. To that end, we also produced a list of parameters that are most unlikely to become available from the EHR system within the next 5 years. Method Exemplar data elements for the studies were obtained from three research protocols. The availability of each was studied in the EHR systems and labelled using a coding system designed to cover a range of availability and accessibility levels for data elements. The source or clinical application for each data element was given. Demonstrations of how certain data elements can be derived from the available EHR data were provided. Data items from the protocols were categorised under the seven accessibility and availability levels. 
The demonstration of the derivation of a data value for the platinum-free interval showed the feasibility of deriving fields from available EHR data. We have shown that many of the key data items are currently documented and available within a hospital setting. Although this work was based at a large cancer centre, its informatics programme is not particularly advanced, and many hospitals in the UK would be able to capture and provide either the same level of data or higher. Intellectual information system of medical aid control in the scope of Russian medical insurance Taranik M 1 , Kopanitsa G 1,2 1 Tomsk Polytechnic University, Tomsk, Russian Federation; 2 Tomsk State University for Architecture and Building, Tomsk, Russian Federation The article presents an intelligent information system developed for healthcare providers. The system addresses the problem of quality control of medical care within the Russian medical insurance system. Its main components are ISO 13606, fuzzy logic and case-based reasoning. The system forecasts medical insurance payments by analysing medical records and generates two evaluations, one based on medical standards and one on a set of precedents. Implementing the system increased insurance payments to the healthcare provider by 10 %. Standardized headings as a foundation for semantic interoperability in EHR The new Swedish Patient Act, which allows the patient to choose health care in different county councils, implies the need to share health-related information in electronic health records (EHRs) across county councils. This demands interoperability in terms of structured and standardized data. Headings structure data in EHRs. Headings could also be part of a standardized terminology. The aim was to study to what extent terminology is shared and standardized across county councils in Sweden. 
Headings from three county councils were analyzed to see to what extent they were shared across county councils and to what extent they corresponded to concepts in SNOMED CT and the National Board of Health and Welfare's term dictionary (NBHW's TD). In total, 41 % of the headings were shared across two or three county councils. A third of the shared headings corresponded to concepts in SNOMED CT. Further, an eighth of the shared headings corresponded to concepts in NBHW's TD. Five percent of the shared headings were found in both SNOMED CT and NBHW's TD. The results showed that the extent of shared and standardized terminology in terms of headings across the three county councils studied was negligible. The studied EHRs cannot be defined as EHRs according to the current definition because they are not interoperable. E. Grill et al. Using Bayesian networks to verify assumptions in epidemiologic data analysis Epidemiologic data analysis in observational studies is faced with high-dimensional and heterogeneous data where the covered pieces of information are spread over numerous variables with complex interaction patterns [1]. Accordingly, there are various established strategies for the inference and interpretation of associations as well as methodological concepts for the handling of well-known sources of error, i.e. confounding, effect modification, several types of bias, missing data and random error. Apart from that, common epidemiologic variables frequently are proxies or markers, and researchers can hardly ever guarantee that the intended information is reasonably well portrayed by the collected data. We thus postulate that a disagreement between the intended informational content of a variable and the information actually encoded in the variable can disturb statistical analysis, and we suggest a data-driven approach to check whether the formal content of the dataset is roughly consistent with the intended informational content. 
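A minimal, hedged sketch of such a data-driven check: greedy hill-climbing over edge additions, scored here by empirical mutual information with an acyclicity guard. Real Bayesian-network structure learning (e.g. a BIC-scored search) also considers edge deletions and reversals; this add-only toy is our own simplification, not the authors' algorithm:

```python
from math import log
from collections import Counter

def mutual_info(xs, ys):
    """Empirical mutual information (in nats) of two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def has_path(children, src, dst):
    """Depth-first reachability test, used to keep the graph acyclic."""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(children[node])
    return False

def hill_climb(data, threshold=0.01):
    """Greedily add the acyclic edge with the largest mutual information
    until no remaining edge scores above `threshold`."""
    names = list(data)
    children = {v: set() for v in names}
    while True:
        best, best_score = None, threshold
        for a in names:
            for b in names:
                if a == b or b in children[a] or has_path(children, b, a):
                    continue
                score = mutual_info(data[a], data[b])
                if score > best_score:
                    best, best_score = (a, b), score
        if best is None:
            return children
        children[best[0]].add(best[1])
```

A strongly dependent variable pair ends up linked while an independent variable stays isolated, which is the kind of structural mismatch with prior assumptions the abstract proposes to inspect.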
For demonstration purposes, we used exemplary study data from the de novo Parkinson (DeNoPa) cohort that comprises 108 early stage Parkinson's disease (PD) patients and 102 healthy controls [2]. The starting point was a conventional logistic regression analysis evaluating whether the extent and localization of vascular brain damage at baseline can predict cognitive decline (one of the symptoms in PD) during a 24-month follow-up interval. Subsequently, we applied a well-established, easy-to-use structure learning algorithm for Bayesian networks (i.e. probabilistic models that specify the joint probability mass function of a set of variables in a structured form) [3] [4] [5] and evaluated how the network structure thus obtained can support the understanding of the data as well as the explanation and interpretation of the previously obtained statistical results. From a content-related view, the identification of potential confounders and mediators as well as the further conceptual classification of numerous variables (especially age, education and depression) in the dataset was ambiguous, and the logistic regression analysis led to rather vague results. The learned Bayesian network, however, revealed several probably misleading presumptions and helped to work out optimisations for future data collection, data preparation and data analysis. In addition to profound background knowledge and prudently applied statistical skills, structure learning algorithms for Bayesian networks can be used as a diagnostic tool to reconsider the researcher's assumptions regarding the dataset. The networks thus obtained can guide interpretation and help to extract what is actually in the data. The cognitive functioning of socially isolated individuals may profit from high mental work demands Previous studies have shown that individuals with poor social relationships have an increased risk for dementia. 
Dementia risk, however, can also be positively influenced by lifestyle factors. One very important factor is high mental demands at work, in particular because the work environment affects a very long period of life. Thus, our objective was to investigate whether the cognitive functioning of socially isolated individuals may profit from high mental work demands. Analyses were based on n = 10,000 participants (aged 40-80 years) of the population-based German LIFE-Adult-Study. All participants underwent medical examinations and filled out standardized questionnaires. Cognitive functioning was assessed via the Verbal Fluency Test (VFT) and the Trail-Making Test (TMT). Social relationships were assessed via the Lubben Social Network Scale (LSNS-6). The interaction between social isolation and mental demands on cognitive functioning was analyzed via multivariate regression analyses. The difference in cognitive functioning between high and low mental work demand conditions was larger in socially isolated individuals (VFT: 2.7 words, TMT-B: 26 s) than in socially well integrated individuals (VFT: 2.1 words, TMT-B: 9 s). Multivariate regression analyses, adjusted for age, gender, and education, indicated that both mental work demands and social relationships are significantly associated with the level of cognitive functioning (p < 0.001). Results also suggest interaction effects, indicating a stronger impact of mental work demands on cognitive functioning in socially isolated individuals than in well integrated individuals. The findings imply that individuals with poor social relationships may particularly benefit from high mental work demands with regard to their risk for dementia. The level of mental demands at work could therefore be an important target for tailored preventative approaches. Universität Bonn, Germany Previous studies have shown an impact of an overall high level of mental work demands on a decreased dementia risk. 
In our study, we investigated whether this impact may be driven by specific mental work demands and whether it may be exposure-dependent. We followed a sample of 2,315 GP patients aged 75+ years participating in up to seven assessment waves (every 1.5 years) of the longitudinal AgeCoDe study. Analyses of the impact of specific mental work demands on dementia risk were carried out via multivariate regression modeling. We found significant associations between specific mental work demands, particularly information processing and pattern detection, and a 10-15 % lower dementia risk per level. Our longitudinal observations suggest that different types of mental work demands have different effects on dementia risk, and thus that theoretical models should clarify these differential impacts in order to efficiently guide interventional studies. The costs and benefits of SNOMED CT implementation: an economic assessment model The EU-funded ASSESS CT project investigates the fitness of SNOMED CT as a potential standard for EU-wide eHealth deployments. To cover core implementation aspects, the project analyzed SNOMED CT's impact from technical, business, organizational, governance and socio-economic viewpoints [3]. This paper presents the interim results of the socio-economic analysis of SNOMED CT. The main contribution is the development of an Economic Assessment Model that can be used to systematically assess the socio-economic impact of SNOMED CT and other terminologies. Such economic assessments are valuable for public authorities when deciding on future investment in clinical terminologies, because they allow customization to different contexts and help to forecast the likelihood of a reasonable return on investment. Methods For a comprehensive socio-economic analysis, data to measure the benefits and costs for each specific stakeholder are needed. The method that supports the linking of these data is cost-benefit analysis (CBA). 
To identify and describe indicators, we documented methodological challenges in existing studies and performed indicator development and validation for costs and benefits. We analyzed data collected in ASSESS CT through country focus groups, questionnaires and case studies. We used the analysis of implementation experiences to confirm or reject theoretical benefit indicators. In addition, we discussed which benefit indicators were most important to stakeholders at the 2nd ASSESS CT validation workshop, focusing on real implementation experiences. A Danish-Swedish implementation cost-benefit study was performed, including a dedicated focus group in which key stakeholders from eHealth authorities and national implementation projects in Denmark and Sweden were asked to quantify the costs and benefits of their projects. To give an overview of costs and benefits, the set of cost indicators developed is presented in detailed tables, along with a set of major benefit indicators. The description of each indicator includes scope, specificities, and assumptions about the size and form of the indicator across the different scenarios. The data collected from the Danish-Swedish case study meant that, for each indicator, the appropriate variables and metrics could be identified. The assessment framework allowed cost items (within indicators) to be assigned to different stakeholders, as the underlying CBA tool is stakeholder-centric. We populated the CBA tool using the data from the Danish-Swedish case study, as well as the most appropriate assumptions. The indicators were successfully implemented in a Microsoft Excel-based CBA tool. This implementation allows interested parties to perform their own analyses. Comprehensive information based on the indicator descriptions, available figures and assumptions is incorporated to guide the stakeholders through the assessment process for their specific case. 
The ASSESS CT project has developed the first draft of an economic assessment model that bases any impact assessment on scientific methodologies, real observations, and actual data. The strength of our approach is that indicators and assumptions are implemented in a customizable CBA tool, which can be refined whenever better data become available. The next steps are to (a) finalise the CBA method and (b) produce a toolkit for general use by practitioners, scientists, and policy-makers alike. Time trends in embolic and ischemic stroke incidence in the German MONICA/KORA cohort study Background Cardiovascular diseases still account for the majority of deaths worldwide. Stroke incidence and mortality have already been described in some populations participating in the WHO MONICA project (Monitoring trends and determinants in cardiovascular disease), indicating declines. The incidence trends could partly be linked to trends in the frequency of classic cardiovascular risk factors. Here, we performed analyses of the incidence of fatal and non-fatal stroke in the South German MONICA/KORA Augsburg surveys and its secular trend. Data from the four MONICA/KORA Augsburg surveys S1, S2, S3, and S4 were used to calculate age-standardized incidence rates (IR) of first-ever non-fatal and fatal embolic and ischemic stroke. Participants' age range at baseline was 25-64 years in S1 and 25-74 in S3-S4. To assess time trends over 24 years of observation, the IRs per survey were compared. The difference between IRs was considered significant if their 95 % confidence intervals (CI) did not overlap. The age-standardized IR of embolic and ischemic stroke per 100,000 person-years declined from survey to survey; the difference became significant when comparing the IR of S1 (IR = 222.9; CI 194.6-254.2) with S4 (IR = 104.9; CI 85.8-127.0). IRs were higher in men than in women, but a similar time trend was found in both sexes. 
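The comparison rule used here (a difference counts as significant when the 95 % CIs do not overlap) is easy to make explicit. The log-normal CI in the sketch below is one common construction for Poisson counts and is our assumption, since the abstract does not state how its CIs were computed (and its rates are additionally age-standardized):

```python
from math import sqrt, exp

def incidence_rate(cases: int, person_years: float, per: float = 100_000.0):
    """Crude incidence rate per `per` person-years with a log-normal 95 % CI."""
    rate = cases / person_years * per
    se_log = 1.0 / sqrt(cases)  # SE of log(rate) for a Poisson count
    return rate, rate * exp(-1.96 * se_log), rate * exp(1.96 * se_log)

def cis_overlap(a, b) -> bool:
    """True if two (low, high) confidence intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]
```

Applied to the reported S1 and S4 intervals, `cis_overlap((194.6, 254.2), (85.8, 127.0))` is False, matching the significant decline.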
Similarly, the IR of fatal embolic and ischemic stroke strongly declined, in total (S1: IR = 48.4; CI 35.7-64.1 versus S4: IR = 3.4; CI 0.8-9.3) as well as in both sexes. In the study region Augsburg, a decline in stroke events and mortality over time was observed. This result is in good agreement with findings in other international MONICA populations. Estimating out-of-sample performance for diagnostic classifications-comparing resampling methods Thiele C 1 , Hirschfeld G 1 Hochschule Osnabrück, Osnabrück, Germany The in-sample performance of statistical classifiers overestimates the performance on new data. Both complex (classification trees) and simple (univariate cutpoints) classification methods are prone to overestimation. The present study compares validation methods for estimating out-of-sample performance for classification trees as well as Youden-based cutpoints. We compared the following validation methods and variants of those methods with roughly equal computational requirements: a single training/testing split, leave-one-out cross-validation (LOOCV), bootstrapping (including the leave-one-out bootstrap, bootstrap.632, bootstrap.632+ and the optimism-corrected bootstrap) and cross-validation (CV) with 2, 5 and 10 folds and up to 10 repetitions. The validation methods were compared in a nested-validation approach using a dataset with 767 observations and 9 predictor variables. The nested-validation approach proceeded as follows: First, a pseudo-sample of size 30 or 80 was drawn without replacement. Second, the validation procedures were executed within the pseudo-sample, yielding different estimates of the out-of-sample Youden index. Third, these estimates were compared against the Youden indices that were achieved by fitting the respective model on the complete pseudo-sample and predicting all observations in the rest of the data set. These steps were repeated at least 6000 times. 
In order to test the generalizability of the findings, we also performed a simulation study for the Youden-based cutpoint model. Using that model, the validation methods were tested on simulated binary data to confirm the results obtained using the real data set. Inspection of the different repetitions in terms of bias and variance yielded the following results. First, the leave-one-out bootstrap has lower variance than cross-validation but is more pessimistically biased, while cross-validation is almost unbiased. Second, the alternative bootstrap variants were highly optimistically biased. Third, additional bootstrap or CV repetitions lead to lower variance but do not influence the bias. Fourth, a single training/testing split and LOOCV suffer from high variance, high bias, and, in the case of LOOCV, also from a large computational burden. Fifth, confidence intervals based on the different validation methods do not have the desired coverage probability. Inspection of the simulation model yielded the same pattern of results as the analysis of the empirical data. Many different validation methods for estimating out-of-sample performance and informing model selection have been suggested in the machine-learning literature but have not yet been applied to medical decision making. Our comparison of different validation methods showed that unbiased estimates of out-of-sample performance are feasible at a level of variance similar to that exhibited by the in-sample estimate. The choice between the different validation methods should be guided by the goal of the researcher. If she aims to select a specific model, bootstrapping, in particular leave-one-out bootstrapping, offers the smallest variance. If she aims to estimate the out-of-sample performance, CV offers the smallest bias. In particular, 5-fold CV repeated 10 times proved to offer a good balance between bias, variance and computational burden. 
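To make the validated quantity concrete, here is a hedged toy of one of the compared estimators: k-fold CV of the out-of-sample Youden index for a cutpoint chosen on each training fold. The function names and the guard against one-class test folds are ours; the study itself covered many more estimator variants:

```python
import random

def youden(scores, labels, cut):
    """Youden index J = sensitivity + specificity - 1 at a given cutpoint."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    sens = sum(s >= cut for s in pos) / len(pos)
    spec = sum(s < cut for s in neg) / len(neg)
    return sens + spec - 1

def best_cut(scores, labels):
    """Cutpoint maximizing the in-sample Youden index (smallest on ties)."""
    return max(sorted(set(scores)), key=lambda c: youden(scores, labels, c))

def cv_youden(scores, labels, k=5, seed=0):
    """k-fold CV estimate of the out-of-sample Youden index."""
    idx = list(range(len(scores)))
    random.Random(seed).shuffle(idx)
    estimates = []
    for fold in (idx[i::k] for i in range(k)):
        if len({labels[i] for i in fold}) < 2:
            continue  # Youden index is undefined on a one-class test fold
        train = [i for i in idx if i not in fold]
        cut = best_cut([scores[i] for i in train], [labels[i] for i in train])
        estimates.append(youden([scores[i] for i in fold],
                                [labels[i] for i in fold], cut))
    return sum(estimates) / len(estimates)
```

On perfectly separable data the CV estimate recovers the maximal Youden index of 1; on noisy data it will sit below the optimistic in-sample value, which is the bias the abstract quantifies.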
In the AKTIN project we use i2b2 databases to establish local data warehouses in the participating clinics [1]. The concepts of the AKTIN datasets are imported from standardized HL7 CDA documents [1] [2] [3] [4], so that at this stage no further data mapping is required in i2b2. However, we need a description language for the i2b2 ontology to be able to maintain it, distribute it to the local i2b2 installations and produce documentation. In order to create or update the i2b2 ontology from a script we need SQL statements to manipulate the i2b2 database tables. It is not feasible to create and maintain these manually. The complete AKTIN ontology for the base module consists of 2102 concepts, which translate to about twice that number of insert statements in i2b2. Most of these come from the ICD10GM (n = 1795) and CEDIS (n = 188) terminologies. The other concepts refer to LOINC where possible or use proprietary AKTIN codes. Since SQL is not a good way to maintain ontology information, we looked for relevant standard formats and found the Simple Knowledge Organization System (SKOS). It is the W3C standard for representing knowledge, and a variety of processing tools is available. Using XSL transformations we were able to create a script to transform the XML release document of ICD10GM to SKOS. The CEDIS classification is available in a less structured tabular format that can also be converted into SKOS. The resulting SKOS ontology can be loaded into any Resource Description Framework (RDF) engine (e.g. Apache Jena, OpenRDF). The RDF graph is then traversed to obtain the necessary SQL insert statements for i2b2. We used XSL-FO (via Apache FOP) to create a PDF documentation of the AKTIN ontology. The AKTIN ontology information is available in machine-readable form and can easily be used by different systems, which we have demonstrated by generating a human-readable PDF overview from the ontology as well as an automated i2b2 import. 
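The traversal from SKOS concepts to SQL inserts can be illustrated with the standard library alone. A real implementation would use an RDF engine and fill all required i2b2 metadata columns; the two-column insert and the tiny inline SKOS snippet below are simplifications of ours, not the AKTIN code:

```python
import xml.etree.ElementTree as ET

SKOS_XML = """\
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:skos="http://www.w3.org/2004/02/skos/core#">
  <skos:Concept rdf:about="urn:example:icd10gm:K25">
    <skos:notation>K25</skos:notation>
    <skos:prefLabel>Gastric ulcer</skos:prefLabel>
  </skos:Concept>
</rdf:RDF>"""

SKOS = "{http://www.w3.org/2004/02/skos/core#}"

def skos_to_sql(xml_text: str, table: str = "i2b2") -> list:
    """Walk skos:Concept nodes and emit simplified i2b2-style INSERT statements."""
    statements = []
    for concept in ET.fromstring(xml_text).iter(SKOS + "Concept"):
        code = concept.findtext(SKOS + "notation")
        label = concept.findtext(SKOS + "prefLabel")
        statements.append(
            f"INSERT INTO {table} (c_basecode, c_name) "
            f"VALUES ('{code}', '{label}');")
    return statements
```

Because the SKOS file is the single source of truth, the same walk can feed the SQL generation, the PDF documentation, and any other downstream consumer.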
The developed transformations are generic and can be reused in similar projects. It is feasible and efficient to manage biomedical ontologies for clinical data warehouses using only the existing international standards SKOS, XSLT, and XSL-FO. Our approach shows that careful application of standard formats and tools leads to a coherent workflow and less maintenance work. All proprietary or modified contents of the ontology need to be maintained in one place only: the SKOS ontology representation. It is also possible to process the ontology with external standard tools like Protégé without further transformation. The standard catalogs (e.g. ICD) do not need any manual maintenance: if a new release is available, it can be transformed to SKOS and the whole ontology can be deployed to multiple i2b2 installations, documentations, reports etc. This significantly reduces the work needed to maintain the ontology for AKTIN and also increases its quality, because there are as few manual steps as possible, which effectively prevents copy-and-paste errors and ensures that changes are deployed to all relevant parts of the project. Objective Patient satisfaction with teleconsultation services can increase their acceptance. Validated and standardized questionnaires to measure the quality aspects of teleconsultation relevant from the patients' perspective are not yet available. We aim to develop such a questionnaire. Methods First, a systematic literature search was performed and focus groups were held to acquire quality aspects of teleconsultations that patients perceive as important. Thirty-seven unique quality aspects distilled from these activities were used for questionnaire development based on the framework of the Consumer Quality Index. In future research, the comprehensiveness, relevance and unambiguousness of the concept questionnaire need to be tested and the reliability and internal cohesion of the questionnaire assessed. 
Quality management in non-interventional studies and a comparison of quality control in non-interventional studies with clinical trials Thomas P 1 1 University of Essen, Essen, Germany Studies or trials carried out by academics at the university are investigator-initiated (IITs) and non-commercial. There is a general perception that the quality of non-interventional studies may not be high or comparable to that of clinical trials, given that there are no legal requirements to be followed but only guidelines, internal SOPs and the study protocol [1]. Various studies have been carried out by many research-based pharmaceutical companies with respect to quality management measures in such studies. At Essen University Hospital, various clinical trials and non-interventional studies have been carried out to date. Risk-adapted monitoring strategies are used, as proposed by the ADAMON project and required by the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung, BMBF) [2]. According to the ADAMON (adaptiertes Monitoring) strategy, the risk of a trial is analysed (risk analysis) and the category of the trial is determined (risk-based categorisation). Depending on the results of the risk analysis and risk-based categorisation, a monitoring strategy is chosen [3]. Monitoring strategies are divided into three classes (K1 to K3) depending on various factors. Patient-specific factors such as the safety, protection and well-being of trial participants and their rights, integrity and confidentiality determine the category into which monitoring for that trial will fall. When a trial has a high risk in terms of patient safety and protection, it will fall into the K1 class of monitoring strategy. Other principles for risk-adapted strategies concern source data verification (SDV): whether 100 % SDV is required, and in what percentage of patients and centres. 
The frequency and duration of visits per centre per year and the presence of central monitoring to oversee remote data entry (RDE) are other factors. Centre-specific factors such as local infrastructure do not play a major role in determining the monitoring strategy. In centres which fall under the K2 or K3 monitoring strategy, problem-oriented monitoring is carried out if the necessity arises. The aim of this presentation is to discuss the different quality assurance measures in non-interventional studies in the academic domain. The presentation will go into detail on various quality management measures, such as the appropriateness of the study design to avoid biased results; the adequacy of the data analysis to assure the authenticity, completeness and validity of the data; the identification of deficiencies and their resolution at an early stage; the design of case record forms (CRF, electronic CRF); standardization and feasibility assessment of the case report form (CRF); the type of training given to participating staff (physicians, nurses); the type or method of source data verification carried out; patient information; informed consent; and telephone monitoring after patient inclusion in the study. Extensive review of the CRF, double data entry, and checks for plausibility of the data are also considered. A comparison between quality control measures in non-interventional studies and quality control aspects in clinical trials is made and the results are discussed. The level of agreement between the quality standards of a non-interventional study and those of a clinical trial is also interpreted. Dementia is a progressive disease characterized by cumulative cognitive decline. In the absence of a cure or effective disease treatment, the largest effect on reducing dementia occurrence has been reported to be prevention by reducing dementia risk. Often-used drugs are of special interest as risk factors for dementia in the elderly, especially the consequences of their regular use. 
Anticholinergic drugs have been shown to increase the risk of cognitive decline in the elderly. However, the classification of anticholinergic drugs according to their anticholinergic profile is inconsistent. Since cholinergic transmission in the brain is mainly mediated by the M1 muscarinic acetylcholine receptor, we classified drugs from anticholinergic risk lists according to their M1 receptor affinity and analyzed the risk modification for incident dementia of any type. Analyses were conducted on a longitudinal sample of the statutory health insurance AOK. The data include information on inpatient and outpatient diagnoses and drug prescriptions on a quarterly basis for the years 2004-2011. We included patients aged 75 years and older who did not have dementia at baseline. Anticholinergic drugs were taken from three anticholinergic risk lists. The PDSP database and a literature search were used to define Ki values for the substances. Hazard ratios were calculated using time-dependent Cox regression including covariates such as age, gender, several comorbidities, and polypharmacy. Anticholinergic drug prescription significantly increased the risk of incident dementia in the elderly. The effect was stronger for anticholinergic drugs with small Ki values (i.e. high M1 affinity) than for substances with larger Ki values. Further risk factors for dementia (e.g. age, depression, stroke) were confirmed. Taking the M1 receptor affinity into account may contribute to determining the anticholinergic burden with respect to dementia risk in the elderly. The cross-sectional and longitudinal associations of psychosocial well-being with sleep duration and sleep quality in European children and adolescents Poor psychosocial well-being and poor sleep (short sleep duration, poor sleep quality) have both been linked to cardiometabolic disorders in children and adolescents [1, 2].
The aim of this study was to explore the cross-sectional and longitudinal associations of well-being with sleep in order to gain more insight into pathways potentially leading to cardiometabolic disorders. Methods We analysed parent- and self-reported data of 3.0- to 15.9-year-old children and adolescents from 8 European countries participating in the IDEFICS/I.Family cohort. The cross-sectional analysis included 6,277 children and adolescents participating in the latest follow-up examinations (2013/2014; T3). The longitudinal analysis was carried out with data from 3,635 children and adolescents who had also participated at T1 (2009/2010). A well-being score was calculated using 16 items of the KINDL questionnaire covering emotional well-being, self-esteem and relations to family and friends (range: 0-48 points) [3]. Nocturnal sleep duration was age-standardised. Sleep quality parameters included difficulties falling asleep (yes/no) and trouble getting up in the morning (yes/no). Linear and logistic mixed-effects models were estimated to assess the cross-sectional associations between well-being and sleep, accounting for the clustered study design created by the presence of siblings in the study sample (random effect for family affiliation). Mixed-effects models were also used to regress change in well-being between T1 and T3 on sleep at T3. All models were adjusted for age, sex, pubertal status, educational level of parents, study centre and media time, and included an indicator for self- vs. proxy-reported data. Cross-sectionally, well-being was positively associated with sleep duration. For every four-point increase in well-being score there was a 0.044 (95 % confidence interval (CI) [0.025; 0.063]) unit increase in sleep duration z-score; e.g. a child with a well-being score of 40 slept on average 9 to 13 min longer than a child with a score of 20.
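The quoted example can be reproduced arithmetically. The following sketch assumes an age-specific standard deviation of nocturnal sleep duration of roughly 45-60 min (an assumption not stated in the abstract; one z-score unit corresponds to one SD), which recovers the reported 9-13 min range; the function name is illustrative.

```python
def extra_sleep_minutes(score_diff, beta_per_4_points=0.044, sd_minutes=45.0):
    """Translate a well-being score difference into extra sleep in minutes,
    via the reported coefficient of 0.044 z-units per 4 score points."""
    z_units = (score_diff / 4.0) * beta_per_4_points
    return z_units * sd_minutes

# A child scoring 40 vs. one scoring 20 (difference of 20 points) gains
# 0.22 z-units; with an assumed SD of 45-60 min this is roughly 10-13 min.
```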
Higher well-being was also associated with lower odds of having trouble getting up in the morning. Our results indicate that psychosocial well-being and sleep are associated and that positive changes in well-being over time are associated with longer sleep duration and better sleep quality later on. Therefore, when investigating the effect of well-being on cardiometabolic disorders, sleep could be a confounding or possibly a mediating factor. We will further explore the potential effect of (changes in) sleep duration and sleep quality on well-being. A tele-based cross-sectoral conference in the patient's home: a randomized controlled trial Objective To assess whether discharge planning followed by a cross-sectoral video conference affects readmissions among patients discharged from medical and geriatric departments. The study was a multi-center, 1:1 parallel-group, individually randomized controlled trial. Inclusion criteria were age 55 years or older, discharge alive from the medical or geriatric departments at Roskilde or Køge hospital, Denmark, and residency in the surrounding municipalities. The inclusion period ran from 4 July 2013 until 31 December 2014 and all patients were recruited by one of the five study nurses. All participants provided informed consent. Participants were randomized via a computer-generated randomization sequence. Patients in both groups received standard discharge planning and follow-up. The intervention group additionally received a video conference between the patient, a municipal nurse and a project nurse on the day of discharge, focusing on medication, appointments with the hospital and general practitioner, aids, nutrition and transportation. The predefined primary outcome was unplanned readmission at any hospital in Denmark within 180 days after discharge. Secondary outcomes at 8, 30 and 180 days included readmission, use of primary health care and municipal services.
Outcomes were register-based and analysed according to the intention-to-treat principle. Of the 4934 patients assessed for eligibility, 1292 were randomized to intervention (n = 644) or control (n = 648). No patients were lost to follow-up. The intervention and control groups were similar in baseline characteristics. The municipal nurses evaluated the videoconferences as worthwhile in 57 % of cases. In 50 % of the conferences issues concerning medication were identified; 40 % identified problems concerning nutrition, mobility or a lack of information on plans. The intervention and control groups did not differ in the primary outcome, with 49 % of control patients and 47 % of intervention patients having an unplanned admission within 180 days of discharge (p = 0.43). The number of unplanned admissions and length of stay were also similar. At 30 days the proportion of intervention patients with an unplanned admission was 21 % compared to 25 % among control patients; this difference was not significant (p = 0.08). The groups received the same number of general practitioner services and the same amount of municipal services four months after discharge. The mean increase in municipal nursing services compared to usage at admission was lower among intervention patients than among control patients (p = 0.04). This intervention study of systematic discharge planning including a videoconference between the primary and secondary sectors showed no significant difference between intervention and control groups. Given the nature of this intervention as a direct discharge planning intervention, immediate readmission at 8 or 30 days would be expected to be the more relevant outcome, and the results showed a trend towards a lower readmission rate at 30 days. The program for early detection of cervical cancer in Germany will be reorganized over the coming years.
By law, the Federal Joint Committee (Gemeinsamer Bundesausschuss, G-BA) is responsible for establishing the new program protocols. In March 2015, the G-BA published an outline of the planned early detection program [1]. Women younger than 30 years will be eligible for an annual Pap smear. Women aged 30 years and older will be allowed to choose one of two early detection plans: they may opt for an annual Pap smear, which reflects the current offering of the statutory health insurances, or an HPV test every five years. Once one plan has been chosen, switching between plans will not be allowed. After at least six years, an evaluation of this choice model should help determine the superiority of one plan over the other. The Cancer Epidemiology working group of the German Society for Epidemiology was invited to a meeting of the G-BA in December 2015 to discuss the planned cervical cancer early detection program and to provide a written statement. The statement highlighted various organizational and methodological shortcomings of the plan, in particular in relation to the comparative evaluation of the two early detection strategies. In this workshop, the opportunities and limits of evaluating the planned cervical cancer early detection program in Germany will be presented and discussed. The goal is to exchange ideas across institutions and to lay the groundwork for developing a methodologically sound evaluation plan. Prevalence of cardiovascular risk factors in a German sample: findings from the STAAB Cohort Study Tiffe T 1 , Gelbrich G 1 , Wagner M 2 , Morbach C 3 , Rücker V 4 , Kisker C 5 , Störk S 6 , Heuschmann P 1 Comprehensive Heart Failure Center, Würzburg, Germany Background Cardiovascular disease (CVD) is the main contributor to hospital admission and disability in middle-aged and older patients, and the leading cause of death (40 %) in Germany.
Changes in vascular risk factors in persons without established CVD are thought to be responsible for the observed pronounced reduction in CVD mortality over the last decades (>50 %). Therefore, national and international guidelines for CV prevention emphasize control of vascular risk factors and adoption of healthy lifestyles also in the primary prevention setting. The current study assessed the prevalence of risk factors and their determinants in a cross-sectional sample of the general population without established CVD. The STAAB Cohort Study was initiated in 2013 to determine the prevalence and natural course of early and asymptomatic heart failure stages (i.e., stages A/B) in a representative sample of the general population of Würzburg (Germany) aged 30-79 years. The presented data exclude participants with established CV disease (coronary artery disease, peripheral artery disease, stroke) and known diabetes, as some risk prediction models (e.g. SCORE) classify their risk similarly to persons with prior CVD. CVD risk factors were defined according to the European Guidelines on CVD prevention in clinical practice (version 2012) as follows: hypertension (blood pressure >140/90 mmHg or antihypertensive medication), hyperlipidemia (total cholesterol >190 mg/dl or lipid-lowering medication), smoking (self-reported), obesity (body mass index ≥30 kg/m²), and elevated HbA1c (>6.0 %). The influence of age, sex, education and living status on the accumulation of these risk factors (0-1 vs. 2-5) was identified by multivariable logistic regression. Among the 1298 participants, mean age was 53.1 years (SD ± 20.5) and 52.6 % were female. The prevalence of CVD risk factors was as follows: hypertension 33.9 %, hyperlipidemia 3.5 %, HbA1c >6.0 % 5.8 %, smoking 19.2 %, obesity 17.3 %. Of all participants, 44.8 % had no risk factor, 35.3 % had one, and 19.9 % had multiple risk factors (0.8 risk factors on average).
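The risk-factor definitions above can be expressed as a small counting function. This is a sketch only: the field names are illustrative, while the thresholds are those quoted from the 2012 European guidelines in the text.

```python
def count_risk_factors(p):
    """Count CVD risk factors for one participant, using the thresholds
    quoted in the text. p: dict with keys systolic_bp, on_antihypertensives,
    total_chol_mg_dl, on_lipid_lowering, smoker, bmi, hba1c_percent."""
    factors = [
        p["systolic_bp"] > 140 or p["on_antihypertensives"],    # hypertension
        p["total_chol_mg_dl"] > 190 or p["on_lipid_lowering"],  # hyperlipidemia
        p["smoker"],                                            # smoking
        p["bmi"] >= 30,                                         # obesity
        p["hba1c_percent"] > 6.0,                               # elevated HbA1c
    ]
    return sum(factors)
```

A participant would then be assigned to the "0-1" or "2-5" accumulation group used in the logistic regression according to this count.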
In a multivariable model, the probability of having 2 or more risk factors was higher in men (OR 1.4, 95 % CI 1.0-1.8), older participants (OR per decade 1.4, 95 % CI 1.3-1.6), and participants with primary education (OR 2.3, 95 % CI 1.6-3.2; reference: tertiary degree). A borderline statistically significant lower probability was found for subjects living alone (OR 0.74, 95 % CI 0.06-1.0). For developing effective programs targeting CVD prevention in the general population it is important to understand the whole disease continuum and the burden of uncontrolled CV risk factors in the general population. Risk-adapted prevention strategies for colorectal cancer: the RaPS study Background People aged 40-60 years with a family history (FH) of colorectal cancer (CRC) in 1st-degree relatives (FDRs) have a 2- to 4-fold increased risk of CRC compared to the average-risk population (1). Therefore, experts recommend starting CRC screening earlier for this high-risk group (e.g., offering a first screening colonoscopy at age 40). However, information on the prevalence of relevant colonoscopic findings in this group is sparse, and no risk-adapted screening is offered in the German health care system (i.e., screening colonoscopy is uniformly offered from age 55 on). Thus, a multicenter epidemiological study, the RaPS study, was started in 2015 within the framework of the German Cancer Consortium (DKTK). The primary aims of this study are: to determine the prevalence of a FH of CRC in FDRs in the German population aged 40-54 years; to investigate the prevalence of colorectal neoplasms (CRC/adenomas) among people with an affected FDR in this group; and to develop risk-adapted prevention strategies for this high-risk group based on the collected information. Methods A random sample of around 120,000 persons from the general population aged 40-54 years from the catchment areas of the study centers in Dresden, Munich and Stuttgart was contacted to assess FH of CRC via an online questionnaire.
All respondents (targeted number 30,000) receive feedback on their individual risk factors and recommendations for CRC prevention and screening. Those with a FH of CRC in FDRs are invited to the study centers for individual consultation regarding CRC prevention (targeted number 1,200). Participants are asked to donate blood and stool samples, which will be stored in a biorepository for interdisciplinary translational research within the DKTK. Considering the individuals' medical history, diagnostic colonoscopy is recommended and colonoscopy records will be obtained. The prevalence of CRC and its precursors will be evaluated according to age and sex of participants and the number, type and age at diagnosis of affected FDRs. Furthermore, genetic, epigenetic, proteomic and microbiomic biomarkers in blood will be investigated to predict the presence of colorectal neoplasms in this group. Risk markers and their eligibility for risk-adapted screening offers (including effectiveness and cost-effectiveness) will be examined. Recruitment of participants, sample collection and analysis are ongoing and expected to be completed by September 2017. Currently, of 16,803 persons who have completed our online questionnaire, 1,593 have reported a FH of CRC (~10 %). Of those, 851 have reported previous colonoscopies. In addition, 672 persons have already attended or have shown interest in an individual consultation at our study centers. Nearly all participants who underwent consultation were willing to donate blood and stool samples. By the time of the conference, updated information will be provided. For the first time in Germany, this study will provide data on the prevalence of persons with a FH of CRC in the age group 40-54 years, which will enable us to derive evidence-based screening strategies for this high-risk group. Our study provides a comprehensive approach to CRC prevention among high-risk individuals with a FH of CRC.
Background Health literacy (HL) is defined as the knowledge and competence to access, understand, appraise, and apply health information for health judgements. The conceptual model of HL integrates three health-relevant areas (health care, disease prevention, health promotion) [1]. Low HL is associated with various health outcomes such as self-perceived health status, mortality and use of health care facilities [2]. Therefore, HL is of increasing interest in epidemiological studies. The aim of the present analysis is to validate a short version of the 47-item health literacy questionnaire HLS-EU-Q for use in epidemiological studies. During the second follow-up of the CARLA Study (Cardiovascular Disease, Living and Ageing in Halle) in 2013, the HLS-EU questionnaire was administered to 140 subjects aged 55 to 83 years. HL scores for both questionnaires were calculated according to the recommendations of the European Health Literacy Project [3]. The short version, containing 16 items, was adapted by Pelikan and colleagues [4]. Agreement between the full and short versions of the questionnaire was evaluated using Bland-Altman plots and the kappa coefficient. Furthermore, we calculated Cronbach's alpha to determine the internal consistency of the questionnaires. In total, 139 subjects (52 % males) could be included in the analysis. The mean age of the subjects was 66.7 (SD 6.7) years. The mean duration for completing the full version was 12.3 (SD 3.3) minutes and for the short version 3.6 (SD 2.8) minutes. The mean general HL score was 34.0 for the full version and 34.5 for the short version of the HLS-EU-Q. The mean difference between the full and short versions for general HL was -0.45 (95 % CI -0.77 to -0.14). The Bland-Altman plot showed good agreement between both versions of the HLS-EU-Q. Cronbach's alpha was 0.94 for the full version and 0.84 for the short version.
The short version showed good agreement with the full version of the HLS-EU questionnaire, at least for the whole scale (general health literacy). While the subscale regarding disease prevention seems to overestimate HL substantially, the other subscales rather underestimate HL. (United Nations, 2015). Therefore, this study aims to identify protective factors to reduce under-five mortality in Yemen. Methods A secondary data analysis was performed based on the Demographic and Health Survey (DHS) Yemen 2013. Descriptive and bivariate analyses were conducted. Simple logistic regressions and a multiple logistic regression model were calculated to identify factors associated with under-five mortality. The factors included in the regression model were based on literature findings and their significance level in the bivariate analyses. Odds ratios (OR) and 95 % confidence intervals (CI) were calculated. Based on the DHS data, under-five mortality was 60.5 per 1,000 live births in Yemen in 2013. Of 14,688 live births overall, 77.8 % (n = 11,431) survived to the age of five years. Regional differences from 32.9 to 88.1 child deaths per 1,000 live births were observed. 78.5 % of deaths under five years occurred in rural areas. The vaccination status of children, particularly immunizations against polio, measles, BCG and DPT, was recognized as an important protective factor in simple logistic regression. Further variables such as injuries as a result of violence as well as the body mass index of the mother were considered but not included in the regression model. The multiple logistic regression model indicated the following main protective factors for under-five mortality: a higher wealth index (OR = 0.56; CI 0.35-0.87), a higher number of children ever born to a mother (OR = 0.09; CI 0.05-0.14), a higher age (above vs.
below 21 years) of the partner (OR = 0.28; CI 0.12-0.67), a higher educational level (OR = 0.39; CI 0.17-0.88), and breastfeeding (OR = 0.96; CI 0.96-0.97). In contrast, birth at home was associated with a higher likelihood of under-five mortality (OR = 4.53; CI 2.40-8.53). Nagelkerke's R² indicated that 40.1 % of the variance could be explained by the factors included in the model. Access to health services in Yemen is limited because of poverty, an inadequate health infrastructure and low investment in the health sector. Rural regions, where 71.4 % of the population lives, are especially at risk. Our study shows that children born at home are at higher risk of under-five mortality. Therefore, much more action to strengthen the health care sector is needed. The study results demonstrate that further improvements in terms of family planning, breastfeeding behaviour, professional birth care in clinics and vaccination status are needed to reduce under-five mortality. Algorithmic summaries of perioperative blood pressure fluctuations Toddenroth D 1 , Ganslandt T 2 , Drescher C 1 , Weith T 3 , Prokosch HU 1,2 , Schuettler J 3 , Muenster T 3 1 Chair of Medical Informatics, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany; 2 Medical Center for Communication and Information Technology, University Hospital Erlangen, Erlangen, Germany; 3 Dept of Anaesthesiology & Intensive Care, University Hospital Erlangen, Erlangen, Germany Automated perioperative measurements such as cardiovascular monitoring data are commonly compared to established upper and lower thresholds, but could also allow for more complex interpretations. Analyzing such time series in extensive electronic medical records for research purposes may itself require customized automation, so we developed a set of algorithms for quantifying different aspects of temporal fluctuations.
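One way to quantify such fluctuations is a time-weighted area of hypotensive deficits below a mean arterial pressure (MAP) threshold. The sketch below is an illustration, not the authors' implementation, and the 65 mmHg default threshold is an assumption.

```python
def hypotension_area(times_min, map_mmhg, threshold=65.0):
    """Trapezoidal area (mmHg * min) of MAP deficits below a threshold.

    Clipping deficits to zero before applying the trapezoid rule slightly
    underestimates the area around threshold crossings, which is usually
    acceptable for a summary metric on densely sampled series."""
    deficits = [max(threshold - m, 0.0) for m in map_mmhg]
    area = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        area += (deficits[i - 1] + deficits[i]) / 2.0 * dt
    return area
```

Dividing such an area by the series duration yields an average hypotension burden per unit time, which can be compared across cases of different length.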
We implemented conventional measures of dispersion, summaries of absolute gradients between successive values, and Poincaré plots. We aggregated the severity and duration of hypotensive episodes by calculating the average area under different mean arterial pressure (MAP) thresholds. We applied these methods to 30,452 de-identified MAP series, and analyzed the similarity between alternative indices via hierarchical clustering. To explore the potential utility of these propositional metrics, we computed their statistical association with presumed complications due to cardiovascular instability. We observed that hierarchical clustering reliably segregated features that had been designed to quantify dissimilar aspects. Summaries of temporary hypotension turned out to be significantly increased among patient subgroups with subsequent signs of a complicated recovery. These associations were even stronger for measures that were specifically geared to capturing short-term MAP variability. These observations suggest the potential of our proposed algorithms for quantifying heterogeneous aspects of short-term MAP fluctuations. Future research might also target a wider selection of outcomes and other attributes that may be subject to intraoperative variability. Helmholtz-Zentrum für Infektionsforschung (HZI), Braunschweig, Germany The use of mobile phone information technology in the health sector has received a lot of attention, especially during the Ebola Virus Disease (EVD) outbreak. Mobile health (mHealth) promises major improvements, but an overview of what kinds of tools are available and the functionalities they offer is lacking. We therefore conducted a systematic review of mHealth tools in the context of the recent EVD outbreak in order to support further mHealth developments for infectious disease control.
We conducted a systematic search for all articles published in any language indexed in Google Scholar, PubMed, Embase, CAB Abstracts (Global Health), Popline and Web of Science with publication dates from 01.01.2014 until 31.12.2015, using the following search strategy: ("Outbreak" OR "Epidemic") AND ("mobile phone" OR "smartphone" OR "smart phone" OR "mobilephone") AND ("Ebola" OR "EVD" OR "hemorrhagic"). Each publication was then categorized as a book chapter, a scientific peer-reviewed journal article or a non-peer-reviewed web article. A three-phased approach was used, from manual selection to extraction based on a standardized extraction form focused on the technical design and functionality of the described tools. Two reviewers independently screened the titles and abstracts of all publications resulting from the search, selecting original publications on tools that run on smartphones and deal with the management of EVD or other hemorrhagic fever outbreaks. We retrieved 690 publications from the automated search in Google Scholar, of which we identified 118 (17 %) as relevant to the topic; among these, 50 (42 %) publications described a total of 42 different tools. We found 4 publications in PubMed, 3 of which were relevant to the topic and also included in Google Scholar. No articles were found in Embase, CAB Abstracts (Global Health), Popline or Web of Science using the same search strategy. Among the three key functionalities, surveillance notification was supported by 14 tools (36 % of the 39 for which that information was retrievable), contact tracing by 4 (25 % of 16) and case management by 6 (18 % of 34). Three tools support all three of the above ("commcare", "sense-followup", and "sormas").
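The boolean search strategy above can be mirrored in a simple screening filter. This is a sketch: real bibliographic database syntax and word-boundary handling differ, and the function name is illustrative.

```python
def matches_search(text):
    """Apply the three AND-combined OR-groups of the search strategy
    to a title/abstract string (case-insensitive substring matching)."""
    t = text.lower()
    group1 = any(w in t for w in ("outbreak", "epidemic"))
    group2 = any(w in t for w in ("mobile phone", "smartphone",
                                  "smart phone", "mobilephone"))
    group3 = any(w in t for w in ("ebola", "evd", "hemorrhagic"))
    return group1 and group2 and group3
```

A record is retained only when at least one term from each of the three groups occurs, which is exactly how the AND-combined query behaves in the databases searched.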
Four tools (13 % of 32) had the capacity for data entry in offline mode; 9 (47 % of 19) were programmed with open-source development technology; thirteen tools (33 % of 39) contained outbreak management functionalities; 31 (84 % of 37) contained a bi-directional communication feature; and for 5 tools (16 % of 32) a reference to an official data standard in disease surveillance was stated. Among the large number of tools developed in the context of EVD outbreak management, it appears that only three contain the three key functionalities of outbreak management for EVD. We recommend focusing further development on those tools that contain comprehensive portfolios of outbreak management functionalities and reference established standards in disease surveillance. Purpose: An important part of the electronic information available in a Hospital Information System (HIS) has the potential to be automatically exported to Electronic Data Capture (EDC) platforms to improve clinical research. This automation has the advantage of reducing manual data transcription, a time-consuming and error-prone process. However, quantitative evaluations of the process of exporting data from a HIS to an EDC system, in particular in comparison with manual transcription, have not been reported extensively. In this work, an assessment of the quality of an automatic export process, focused on laboratory data from a HIS, is presented. Methods: The quality of the laboratory data was assessed for two types of processes: (1) a manual process of data transcription, and (2) an automatic process of data transfer. The automatic transfer was implemented as an Extract, Transform and Load (ETL) process. A comparison was then carried out between the manual and automatic data collection methods. The criteria used to measure data quality were correctness and completeness.
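Correctness and completeness could be operationalized as follows. This is a sketch; the study's exact operational definitions may differ, and the field names are illustrative.

```python
def data_quality(reference, captured):
    """Compare captured EDC values against a reference (e.g. source HIS data).

    reference, captured: dicts mapping field name -> value; None marks a
    missing value. Returns (correctness, completeness):
      completeness = share of expected fields that were captured at all,
      correctness  = share of captured fields whose values match the reference."""
    expected = [k for k, v in reference.items() if v is not None]
    present = [k for k in expected if captured.get(k) is not None]
    correct = [k for k in present if captured[k] == reference[k]]
    completeness = len(present) / len(expected) if expected else 1.0
    correctness = len(correct) / len(present) if present else 1.0
    return correctness, correctness if False else completeness if False else (correctness, completeness)[1] if False else completeness
```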
Results: The manual process had an overall error rate of 2.6 to 7.1 %, with the lowest error rate obtained when data fields without a clear definition were removed from the analysis (p < 0.001). For the automatic process, the overall error rate was 1.9 to 12.1 %, with the lowest error rate obtained when excluding information missing in the HIS but transcribed to the EDC from other physical sources. Conclusion: The automatic ETL process can be used to collect laboratory data for clinical research if the data in the HIS, as well as physical documentation not included in the HIS, are identified beforehand and follow a standardized data collection protocol. The genomic medicine game Background During the advance of genomic medicine [1], one of the most commonly mentioned obstacles is the lack of confidence of health professionals outside the field of genetics in understanding and interpreting the results derived from genomic tests. This need for training in the field is a recurrent topic in analyses of the major gaps in the advancement and application of genomic medicine. Our proposal, the Genomic Medicine Game (GMG), is an educational tool designed to support genomic education and help clinicians, researchers, students and potentially a broader sector of society to better understand the basis of genomic medicine by showing the basics of human genomic variation and its relationship with different health-related conditions. Methods The Genomic Medicine Game is supported by genome simulation followed by a role-game-based interaction with the audience in which each participant is given an individual genome with a different set of phenotypic and environmental factors so they can learn about their different interactions. A key element in the development of the GMG was the creation of a manually curated knowledge base with a selection of different environmental factors and clinically relevant phenotypes and genotypes.
This knowledge base is used in a Java module to generate simulated genomes with associated phenotypes and environmental factors, which are handed to participants to be used in a role-game where the different genotype-phenotype-environment interactions are presented. The GMG module was tested on different OS platforms (Mac, Windows, Linux) to ensure cross-platform use. This testing process validated its simple interface, its capability to generate large numbers of simulated genomes and the correct use of allelic frequencies among the simulated genomes. We ran a successful initial demonstration of the Genomic Medicine Game with a group of health professionals (n = 24). The dynamics of this demonstration included an initial explanation of the aims and purpose of the experience, followed by the role-playing experience where all the different variants were explained using the three scenarios included. The Genomic Medicine Game is a tool for teaching the fundamental elements of genomic medicine, combining audience engagement, genome simulation and the use of a curated knowledge base. We have tested the GMG with a group of health researchers with promising results. Participants engaged in the role-playing dynamics, and suggested as a future improvement an expansion of the current number of examples included in the exercise. Utilization of the German skin cancer screening program and effects on melanoma incidence and disease severity: a secondary data-based analysis Malignant neoplasms of the skin are the most common tumours diagnosed in industrialized countries. In July 2008, a nationwide screening program for adults from age 35 was introduced in Germany. However, evidence on utilisation and on effects on disease burden and prognosis is still limited. Our analysis is based on data from the outpatient routine care of a large German health insurance company, covering anonymized data of >2 million insured persons aged 35 years or older from Saxony.
The data include complete information on diagnoses, prescriptions and procedures at the patient level for the years 2005 to 2012. Melanoma and non-melanoma skin cancer (NMSC) cases were identified using an ICD-10-code-based algorithm. Screening participation was determined using the Uniform Value Scale ("Einheitlicher Bewertungsmaßstab"). Cross-sectional and longitudinal analyses were applied to determine the utilization of the screening program. Furthermore, both incidence and indicators of disease severity, as a proxy for prognosis, were determined with respect to screening participation. Overall, 38.0 % of insured persons aged ≥35 years were screened for skin cancer at least once between 07/2008 and 12/2012. On average, the annual participation rate was 12.5 %. Participation was highest among women aged 60-69 years (15.7 %) and men aged 70-79 years (15.5 %). Of 533,393 persons screened during the observation period, 1,668 (0.3 %) were diagnosed with melanoma and 13,535 (2.5 %) with NMSC. During 2007 and 2008 an increase in incidence was observed for both entities. However, melanoma incidence per 100,000 insured persons in 2008 was higher in the first half-year than in the second half-year, after introduction of the screening program. Incidence of NMSC did not change considerably during 2008. Age-standardized (European standard) incidence of melanoma and NMSC remained stable after introduction of the screening program, at 24.2 and 165.7 per 100,000 insured persons, respectively. The proportions of screening participants receiving interferon-alpha treatment (8.6 %) and/or being diagnosed with lymph node (8.5 %) or distant metastasis (1.5 %) were lower than among non-participants (11.2, 8.5 and 3.5 %, respectively). These differences were, however, not statistically significant. This study provides important facts regarding the utilization of skin cancer screening in Saxony.
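Age-standardized rates such as those reported above are obtained by direct standardization: each age stratum's rate is weighted by that stratum's share of a standard population (here, the European standard). The sketch below is generic, not the study's code, and the example weights are illustrative rather than the actual European standard weights.

```python
def standardized_rate(stratum_rates, standard_weights):
    """Directly standardized rate: weight each age stratum's rate (per
    100,000) by the standard population's share of that stratum."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(r * w for r, w in zip(stratum_rates, standard_weights))
```

Because the weights are fixed, standardized rates from different calendar years (e.g. before and after 07/2008) are comparable even when the insured population ages over the observation period.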
Our results suggest that the increase in incidence is not attributable to the introduction of the screening program. However, final conclusions regarding the effects of the screening program on incidence and disease severity are limited due to a restricted period of observation and limited clinical cancer-specific information available. Longer follow-up and linkage with clinical or epidemiologic registry data are necessary to clarify these issues more precisely. Inconsistencies between antiparkinsonian drugs and ICD-10 codes in inpatients: a TOLBIAC project case study Triki N 1 , Bousquet C 2,3 , Lardon J 4,1 , Asfari H 3 , Spiga R 1 , Trombert-Paviot B 1, 3 In France, data derived from hospital information systems are adequate to feed the prospective payment system. Checking the consistency between drugs prescribed to patients and their indications could solve difficulties related to the identification of chronic diseases that are undercoded in ICD-10, such as Parkinson's disease. Our goal was to highlight patient stays mentioning administration of antiparkinsonian drugs but not coded for Parkinson's disease. Our approach was to parameterize tables of associations between ICD-10 codes and drug identifiers in the Web100T® application, which collects medical information in our hospital and displays related inconsistencies for patient stays. Based on acute care patient stays of the second semester of 2015, we identified 246 patients corresponding to 253 stays, of which 33 % were not coded with the ICD-10 G20 code for Parkinson's disease. The precision of our approach was 29 %. Based on these data we predict roughly 84 patient stays without mention of Parkinson's disease. We plan to extend this study to other drugs and other kinds of data available in the health information system, such as biology or medical devices, in order to improve the coding of chronic diseases in our hospital. 
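The drug-to-diagnosis consistency check described above can be illustrated with a toy sketch; the association table, record fields and example stays below are hypothetical, and the actual Web100T parameterization is not reproduced (ATC group N04 does denote antiparkinson drugs):

```python
# Hypothetical association table: ATC drug-class prefix -> expected ICD-10 code.
DRUG_TO_ICD = {"N04": "G20"}  # antiparkinson drugs -> Parkinson's disease

def flag_inconsistent_stays(stays):
    """Return ids of stays that mention a mapped drug but lack the expected
    ICD-10 code. `stays` is a list of dicts with 'atc' and 'icd' code lists;
    a toy stand-in for the parameterized consistency check."""
    flagged = []
    for stay in stays:
        for atc in stay["atc"]:
            expected = DRUG_TO_ICD.get(atc[:3])
            if expected and expected not in stay["icd"]:
                flagged.append(stay["id"])
                break
    return flagged

stays = [
    {"id": 1, "atc": ["N04BA02"], "icd": ["G20", "I10"]},  # consistent
    {"id": 2, "atc": ["N04BA02"], "icd": ["I10"]},         # G20 missing
    {"id": 3, "atc": ["C07AB02"], "icd": ["I10"]},         # no mapped drug
]
flagged = flag_inconsistent_stays(stays)
```

Only the second stay is flagged: it mentions an antiparkinsonian drug without the corresponding G20 code, which is exactly the kind of stay the project surfaces for recoding.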
Connecting routine care and research systems in a university hospital to accomplish bilateral additional values Trinczek B 1 1 Meierhofer AG, München, Germany The local system infrastructure of a modern (university) hospital is assembled from several different systems that send and receive patient data and workflow data utilizing a communication server. While the communication and process workflow between routine care systems and users is highly standardized and sophisticated, systems focussing on research often have to take a back seat. This leads to flaws such as (1) routine care and research personnel working next to each other without knowledge of each other, (2) patient data being gathered, processed and stored redundantly, and (3) potential benefits for and from routine care and research not being recognized and leveraged. The demands of the specific professional groups of routine care and research as well as the target system's behaviour were determined by textual descriptions and workflow diagrams. A review of the specific cases by the project partners verified that the concept defines a benefit of the routine care and research platform. The project partners implemented new features and adapted customizable features. The basic structure of the proposed solution is that the involved systems are located in their own domains and manage their data independently, but transfer patient and process-relevant data as completely and promptly as possible. End users in routine care can receive information and tasks from the research system (e.g. proposals for recruitment) and vice versa. Routine care always has priority over research. The routine care system MCC has been extended with concepts such as ''study'', ''informed consent'' and ''proposal for recruitment''. Patient data exports were developed by means of well-established communication standards. MCC provides data no later than five minutes after entry. 
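One plausible step in such an export pipeline is pseudonymization of patient identifiers on the research side; a minimal sketch, in which the key, the HMAC construction and the truncation length are all illustrative assumptions rather than the trust centre's actual scheme:

```python
import hashlib
import hmac

SECRET = b"hypothetical-trust-centre-key"  # illustrative only, never hard-code keys

def pseudonymize(patient_id: str) -> str:
    """Deterministic keyed pseudonym: the same patient always maps to the
    same token, so research records can be linked without exposing the
    identifier. A sketch, not the real trust centre implementation."""
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

p1 = pseudonymize("patient-001")
p2 = pseudonymize("patient-002")
```

Determinism is what makes repeated exports for the same patient linkable on the research side, while the keyed hash prevents reversal without the trust centre's secret.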
The research system is then able to process these well-structured exports by utilizing a rule-based workflow engine and thus achieve automatic pseudonymization and detection of recruitment eligibility and the need for additional forms. Routine care personnel as well as research personnel are able to check proposals for recruitment via work lists, can contact the patient and record his/her informed consent or refusal in the electronic trust centre's user interface. In that kind of scenario, the routine care data and the research data of a person are divided into different administrative cases. All services and measures are assigned to the respective administrative case, but are still linked to the entity ''patient'' and consequently within the scope of the billing system. A continuation of the study is possible even after the patient has been discharged from hospital (in the context of his acute treatment). Furthermore, separating documentation between a patient's administrative cases ensures data protection. Discussion and outlook Besides the patient recruitment, further functions have been designed and implemented based on the derived cases. The connection between MCC and CentraXX at the UMG enables a new cooperation between routine care and research against the background of successfully performed integration tests. The project partners are planning to analyze the benefits of the implemented solution after the system's go-live in Q1/2017, e.g. by conducting a pre-post comparison. Changes in the prevalence of metabolic risk factors and the metabolic syndrome among adults in Germany-German health interview and examination surveys 1997-99 and 2008-11 Truthmann J 1 , Schienkiewitz A 1 , Heidemann C 1 , Knopf H 1 , Scheidt-Nave C 1 1 Robert Koch-Institut, Berlin, Germany To investigate the prevalence of metabolic risk factor clustering as defined by the metabolic syndrome (MetS) and changes in prevalence over time among adults in Germany. 
We analysed data from national health interview and examination surveys 1997-99 and 2008-11 for adults in Germany 18-79 years of age. The following metabolic risk factors were considered: serum glucose ≥5.6 mmol/l (≥7.0 mmol/l among participants who fasted less than four hours); blood pressure ≥130/85 mmHg; triglycerides ≥1.7 mmol/l (≥2.1 mmol/l among participants who fasted less than four hours); high density lipoprotein-cholesterol ≤1.3/1.0 mmol/l (HDL-C; women/men); waist circumference ≥80/94 cm (WC; women/men). MetS was defined as the presence of three or more metabolic risk factors. Drug treatment for elevated levels of triglycerides, HDL-C or diabetes, and antihypertensive drug treatment among persons with a self-reported history of physician-diagnosed hypertension, were additionally considered for defining the presence of the respective metabolic risk factor [1]. Persons with incomplete data were excluded (1997-99: N = 479/7124; 2008-11: N = 201/7115). Prevalence estimates and changes in prevalence over time, standardized to the population 2010, were calculated using SAS 9.4 survey procedures for complex sample designs, overall and within strata of sex, age (18-39, 40-64, 65-79 years) and social status (low, intermediate, high). The prevalence of elevated blood pressure including treated hypertension increased among men (45 to 52 %) and decreased among women (39 to 38 %) over time. The prevalence of elevated triglycerides decreased among men (43 to 35 %) and women (22 to 18 %), and the prevalence of reduced HDL-C decreased among men (25 to 15 %) and remained relatively constant among women (24 and 23 %). The prevalence of elevated WC decreased among men (58 to 54 %) and women (60 to 57 %). The prevalence of elevated glucose remained unchanged among men (20 %) and women (14 %). The overall prevalence of the MetS slightly decreased among men (34 to 30 %) and women (24 to 22 %) between 1997-99 and 2008-11. 
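The component definition used here translates directly into a small rule-based classifier; a sketch under simplifying assumptions (the non-fasting thresholds and the medication-based criteria from the abstract are omitted, and the women/men HDL-C and waist cut-offs follow the common convention):

```python
def metabolic_risk_factors(glucose, bp_sys, bp_dia, trig, hdl, waist, sex):
    """Count the five MetS components for one participant (sketch only;
    units: mmol/l for lipids/glucose, mmHg for blood pressure, cm for waist)."""
    factors = [
        glucose >= 5.6,                       # elevated serum glucose
        bp_sys >= 130 or bp_dia >= 85,        # elevated blood pressure
        trig >= 1.7,                          # elevated triglycerides
        hdl <= (1.3 if sex == "f" else 1.0),  # reduced HDL-C (women/men)
        waist >= (80 if sex == "f" else 94),  # elevated waist circumference
    ]
    return sum(factors)

def has_mets(**kw):
    # MetS = presence of three or more metabolic risk factors
    return metabolic_risk_factors(**kw) >= 3

positive = has_mets(glucose=6.0, bp_sys=140, bp_dia=80,
                    trig=2.0, hdl=1.4, waist=70, sex="f")
negative = has_mets(glucose=5.0, bp_sys=120, bp_dia=80,
                    trig=1.0, hdl=1.5, waist=90, sex="m")
```

The first example has exactly three components (glucose, blood pressure, triglycerides) and so meets the three-or-more rule; the second has none.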
However, the decrease in prevalence was statistically significant only among middle-aged men and women (age group 40-64 years). Lower social status was significantly associated with higher MetS prevalence in both surveys. There was a consistent decrease in MetS prevalence across social status strata among men. Among women, a decrease over time was restricted to women with low social status. Between the 1997-99 and 2008-11 survey periods, the overall MetS prevalence slightly decreased in both sexes, mainly due to improvements in serum lipids. As of 2008-11, almost one third of men and a quarter of women aged 18-79 years in Germany still met the criteria of the MetS, indicating a high risk of cardiovascular morbidity and mortality. Impacts on effect estimation within a stepped wedge design in health care research: real study results in the light of findings obtained through a computer simulation experiment Trutschel D 1 , Reuther S 1 , Holle D 1 , Verde P 1 Deutsches Zentrum fuer Neurodegenerative Erkrankungen, Witten, Germany The stepped wedge design (SWD) is an alternative study design for situations in which a simple parallel design is not useful or not feasible. This design is relatively new, and for health care researchers in practice several methodological pitfalls are possible. The SWD is a type of cross-over design in which different units cross over from control conditions to the intervention over time. The aim is to give researchers an orientation, before beginning a study, on how sensitive a trial using the increasingly popular SWD is to common scenarios in research practice. For the effect estimation, a linear mixed-effects model with fixed effects of intervention and time, as well as random effects for cluster adjustment, was used. The simulation experiment involved a factorial design based on three factors: different delays of intervention effects, number of missing clusters and time point at which clusters were lost. 
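As a rough illustration of how such a simulation experiment can be set up, here is a minimal Python sketch of one stepped wedge data set and its effect estimate (the cluster counts, effect sizes, the cross-over schedule and the use of fixed cluster effects in place of the abstract's random intercepts are all simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SWD: 6 clusters, 7 periods; cluster c crosses over after period c.
n_clusters, n_periods, n_per_cell = 6, 7, 20
true_effect, cluster_sd, resid_sd = 0.5, 0.3, 1.0

rows = []
cluster_effects = rng.normal(0, cluster_sd, n_clusters)
for c in range(n_clusters):
    for t in range(n_periods):
        treated = 1.0 if t > c else 0.0  # stepped cross-over schedule
        y = (0.1 * t                      # secular time trend
             + true_effect * treated
             + cluster_effects[c]
             + rng.normal(0, resid_sd, n_per_cell))
        rows.extend((c, t, treated, yi) for yi in y)

data = np.array(rows)
n = data.shape[0]
# Design matrix: intercept, period dummies, cluster dummies (a fixed-effect
# stand-in for the random intercepts), and the treatment indicator last.
cols = [np.ones(n)]
cols += [(data[:, 1] == t).astype(float) for t in range(1, n_periods)]
cols += [(data[:, 0] == c).astype(float) for c in range(1, n_clusters)]
cols.append(data[:, 2])
X = np.column_stack(cols)
beta, *_ = np.linalg.lstsq(X, data[:, 3], rcond=None)
effect_hat = beta[-1]
```

Repeating this loop under the three factors of the factorial design (effect delay, number of lost clusters, time point of loss) and recording `effect_hat` each time is the essence of the abstract's experiment.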
We investigate the impact of the different identified factors on the intervention effect estimation within an SWD trial. We identified that a delay in the intervention effect had the greatest influence on the estimates. The variance of the effect estimates increased with the number of clusters that were lost during a trial. The time point of cluster loss had only a marginal influence. The results of the simulation study show that the SWD was not robust against a delay in the intervention effect. In the light of these findings, the results of a real SWD trial testing a complex intervention for older people with dementia are evaluated with respect to possible contamination. Assessment of content-based image retrieval approaches for mammography based on breast density patterns Turkeli S 1 , Purwadi NS 1 , Atay HT 1 , Kurt KK 1 Istanbul Technical University, Istanbul, Turkey This study assesses methods commonly used in content-based image retrieval (CBIR) for screening mammography analysis. Methods A database consisting of mammogram patches taken from the IRMA database, covering 12 different BI-RADS classes related to breast density patterns, is used in this study. Three feature extraction methods, namely grey-level co-occurrence matrix (GLCM), principal component analysis (PCA), and scale-invariant feature transform (SIFT), are investigated and compared with prior studies. Two retrieval methods are also used, namely k-nearest neighbor (KNN) and mutual information (MI), to measure the similarity between the query image and the images in the database. The results are evaluated using the positive count rate in each query for each class. This study is expected to contribute towards better Computer-Aided Diagnosis (CADx) and specifically screening mammography analysis in clinical cases. 
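The KNN retrieval step and the positive-count evaluation can be sketched on synthetic feature vectors; this is a hedged illustration only (the clustered toy features stand in for whatever GLCM, PCA or SIFT would produce, and the class counts are invented):

```python
import numpy as np

def knn_retrieve(query_feat, db_feats, k=3):
    """Toy k-NN retrieval over precomputed feature vectors: return the indices
    of the k database entries closest to the query in Euclidean distance."""
    d = np.linalg.norm(db_feats - query_feat, axis=1)
    return np.argsort(d)[:k]

rng = np.random.default_rng(3)
# Hypothetical database: 4 density classes, 10 patches each, 8-dim features
# clustered around one centroid per class.
centroids = rng.normal(0, 5, (4, 8))
db = np.vstack([c + rng.normal(0, 0.5, (10, 8)) for c in centroids])
labels = np.repeat(np.arange(4), 10)

query = centroids[2] + rng.normal(0, 0.5, 8)  # a query from class 2
hits = knn_retrieve(query, db, k=3)
# positive count: how many of the top-k retrieved patches share the query's class
positives = int((labels[hits] == 2).sum())
```

Averaging this positive count over many queries per class yields the positive count rate the abstract uses as its evaluation measure.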
Long-term traffic noise exposure corrected for indoors and its association with outdoor air pollution and annoyance in participants of the Heinz Nixdorf Recall study Traffic noise is an environmental burden associated with annoyance and cardiovascular morbidity. Housing characteristics and personal behavior can lead to substantial measurement error of personal exposure when relying only on facade noise levels. Furthermore, the high correlation of facade noise levels with outdoor air pollution might lead to inefficient control of confounding in health effects analyses. Assessing traffic noise levels indoors at home instead of outdoors might be a better indicator of personal exposure due to noise attenuation and noise annoyance reactions. It might also limit confounding with outdoor air pollution levels. As noise and air Background An age-related decline in muscle mass contributes to impaired function and disability. The interest in modifiable lifestyle factors to tackle such a decline is growing. Thigh muscle volume (TMV) is known to be a good predictor of total body muscle mass. Thus, the aim of the present study was to investigate the associations of dietary protein intake and physical activity with TMV in a Northern German population. This cross-sectional analysis comprised 593 participants recruited by the PopGen biobank in Kiel, Germany. Participants underwent magnetic resonance imaging, and TMV was subsequently quantified using an Automatic Tissue Labelling Analysis Software (ATLAS). Nutrient intakes were estimated by a semi-quantitative food frequency questionnaire linked to the German Food Code and Nutrient Database. Physical activity including walking, cycling, sports (without cycling), gardening, workmanship and housework was assessed using a self-administered questionnaire. 
Multivariable linear regression models were used to analyze associations of dietary protein (% of energy intake from total, plant and animal protein), physical activity (total in MET-h/wk and single activities in h/wk, respectively), age, and sex with TMV. Age, sex, height, waist circumference, total energy intake (kcal/d), and chronic disease (yes, no) were considered as potential confounders. Analyses including dietary protein and physical activity were adjusted for one another, as were plant and animal dietary protein. In total, 41 % of the study sample were women and the median age was 62 years. TMV ranged from 5.7 to 18.6 dm³. Higher age was associated with lower TMV (β = -0.02, p < 0.0001). Further, men had higher TMV compared with women (β = 1.44, p < 0.0001). More hours of cycling and sports per week were associated with higher TMV (β = 0.05, p < 0.01 and β = 0.04, p < 0.01, respectively). Neither total physical activity, other single activities, nor dietary protein was statistically significantly associated with TMV. The study suggests that in addition to non-modifiable factors, physical activities with a moderate to high intensity level such as sports or cycling may influence muscle mass. Decision system integrating preferences to support sleep staging Ugon A 1 , Sedki K 2 , Kotti A 1 , Seroussi B 3,4 , Philippe C 5 , Ganascia JG 1 , Garda P 1 , Bouaud J 5,6 , Pinna A 1 integrating, for some of them, preferences to support decisions in conflict situations. Applied to a doubtful epoch, our approach took the appropriate decision. 
Evaluation of nonlinear registration method in abdominal CT images for integration of dose distribution in multiple radiotherapies Uemura K 1 , Tanikawa T 2 , Mukai M 3 , Ando Y 3 1 Kagawa University Hospital, Kagawa, Japan; 2 Asahikawa Medical University, Asahikawa, Japan; 3 National Institute of Radiological Sciences, Chiba, Japan In recent years, precise irradiation methods such as IMRT and heavy-ion radiotherapy have been developed and widely used for cancer treatment. These radiation devices have made it possible to irradiate complicated-shaped targets and have improved the effect of radiotherapy. In the treatment of patients who receive two or more courses of radiotherapy because of local recurrence and/or remote metastasis, physicians are required to plan the radiation treatment so that the dose distribution in surrounding organs does not exceed the tolerance dose. However, it is very complex and difficult to make a new plan by visually referring to a past radiation treatment plan, because the shape of organs changes nonlinearly among images obtained separately. In this study, we tried to develop a new system to support multiple treatment plans with reliable and effective radiation. As a first step, we investigated a nonlinear registration method for aligning intra-subject X-ray CT images obtained on different days. In the proposed method, we used the National Library of Medicine Insight Segmentation and Registration Toolkit (ITK) with the Free-Form Deformation (FFD) method for nonlinear registration of three-dimensional image sets. In this method, normalized mutual information was used as the evaluation function for the registration of two images. At first, we evaluated optimal values of seven parameters in FFD using simulated and actual CT images, and determined the parameter values that made the mean square error of signal values over all voxels smallest. 
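The normalized mutual information metric at the heart of this registration can be sketched from a joint intensity histogram; this is a toy stand-in for the similarity measure, not the ITK implementation, and the bin count and test images are arbitrary:

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI = (H(A) + H(B)) / H(A, B), estimated from a joint histogram.
    Ranges from 1 (independent images) to 2 (identical intensity structure);
    a registration optimizer would maximize this over transform parameters."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    h_xy = -np.sum(pxy[nz] * np.log(pxy[nz]))
    h_x = -np.sum(px[px > 0] * np.log(px[px > 0]))
    h_y = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return (h_x + h_y) / h_xy

rng = np.random.default_rng(1)
img = rng.random((64, 64))
# An image compared with itself should score higher than against pure noise.
self_nmi = normalized_mutual_information(img, img)
noise_nmi = normalized_mutual_information(img, rng.random((64, 64)))
```

In the abstract's pipeline this score is evaluated repeatedly while the FFD control points are adjusted, driving the deformed image toward maximal NMI with the reference.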
Next, we applied this method to the registration between abdominal CT images obtained for the same patient on different days, referred to as the original image and the target image. The target image was transformed to the original image using the determined FFD parameters for 8 patients, and the accuracy of registration was assessed by calculating the difference at six anatomical landmarks between markers on the original image and those on the registered target image, drawn by a specialist. In addition, in order to reduce the calculation time for the nonlinear registration, we included a graphics processing unit (GPU) in the processing, and compared the calculation time on the GPU with that on a conventional central processing unit (CPU) without GPU. In the nonlinear transformation of abdominal CT images, the mean square error of the six landmarks was 3.78 mm and ranged from 0.0 mm (upper point of pubic symphysis) to 32.93 mm (upper point of left kidney). These values were satisfactory for the radiotherapy dose distribution. This nonlinear registration took about 30-40 min without the GPU, and the calculation time depended on the total number of CT slices and the FFD parameters. The GPU performed this process 6.3-11 times faster than the CPU. The proposed nonlinear registration method using FFD showed tolerable accuracy in the transformation of 3D abdominal CT images, and demonstrated that it would be applicable to the transformation of dose distribution images for radiation therapy in patients with repeated irradiations. The processing using the GPU shortened the calculation time dramatically, suggesting it would perform the deformation within a tolerable time for clinical use. Unreconciled data structures and formats are a common obstacle to the urgently required sharing and reuse of data within healthcare and medical research. 
Within the North German Tumor Bank of Colorectal Cancer, clinical and sample data, based on a harmonized data set, are collected and can be pooled by using a hospital-integrated Research Data Management System supporting biobank and study management. Adding further partners who are not using the core data set requires manual adaptations and mapping of data elements. Facing this manual intervention and focusing on the reuse of heterogeneous healthcare instance data (value level) and data elements (metadata level), a metadata repository has been developed. The metadata repository is an ISO 11179-conformant server application built for annotating and mediating data elements. The implemented architecture includes the translation of metadata information about data elements into the FHIR standard using the FHIR DataElement resource with the ISO 11179 Data Element Extensions. The FHIR-based processing allows exchange of data elements with clinical and research IT systems as well as with other metadata systems. With increasingly annotated and harmonized data elements, data quality and integration can be improved, successfully enabling data analytics and decision support. Evidence synthesis for a single randomized controlled trial and observational data in small populations Unkel S 1 , Röver C 1 , Friede T 1 1 Universitätsmedizin Göttingen, Göttingen, Germany We consider the scenario of a single randomized controlled trial (RCT) comparing an experimental treatment to a control in a small patient population. Whereas in large populations usually two independent RCTs are required to demonstrate efficacy and safety for marketing authorization, in small populations the conduct of even a single RCT with a sufficient sample size might be extremely difficult or not feasible. Inspired by an ongoing paediatric study in Alport syndrome (Gross et al. 
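To give a flavour of what such a FHIR-mediated data element might look like, here is a minimal, hypothetical sketch (the identifier system, paths and definition text are invented for illustration, and the ISO 11179 extensions of the repository's actual profile are not shown):

```python
# Hypothetical serialization of one annotated data element as a (STU3-era)
# FHIR DataElement resource; all values are illustrative only.
data_element = {
    "resourceType": "DataElement",
    "identifier": [{"system": "urn:example:mdr", "value": "de-0042"}],
    "status": "active",
    "element": [{
        "path": "TumorStage",
        "definition": "UICC tumor stage at primary diagnosis",
        "type": [{"code": "CodeableConcept"}],
    }],
}

def element_paths(resource):
    """Collect the element paths a data element defines -- the kind of
    metadata the repository mediates between clinical and research systems."""
    return [e["path"] for e in resource.get("element", [])]
```

Exchanging such resources lets a receiving system map an incoming field like `TumorStage` onto its own schema without a bilateral, hand-written mapping.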
2012), we consider study designs in which information external to the randomized comparison, such as data arising from disease registries, is integrated into the design and analysis of an RCT in different ways. Methods of generalized The different attitudes can be seen as decisions between a value-based and a benefit-based strategy. In a value-based strategy, a test is introduced when its value is proven; in a benefit-based strategy, a test is introduced when the benefit is proven, but later the decision may be changed and the new therapy may be introduced to all patients. We investigate the potential conclusions from enrichment designs and interaction designs about value or benefit. We further investigate the main determinants of the superiority of one strategy over the other by considering 6 different consequences for a single test. Enrichment designs can inform the benefit-based strategy, and interaction designs or separate studies in test-positive and test-negative patients can inform the value-based strategy. We have to expect that interaction studies are underpowered to come to a definite conclusion about the value of a test. Advantages and disadvantages of the strategies are mainly determined by their influence on the timing of the corresponding studies: the benefit-based strategy allows test-positive patients to benefit earlier from the new therapy, but it may delay the conduct of interaction studies and hence delay the detection of tests with no value but with a benefit from the new therapy for all patients. Benefit-based strategies are preferable if the risk of off-label use and delayed decisions on the value of a test can be limited. 
If this cannot be achieved by administrative means like conditional approval, the superiority of the two approaches depends on how often the value-based strategy would never allow a test of value to come into use and on how often the benefit-based strategy may prevent detection of the no-value status of a test. Vach W 1 , Sun H 2 1 Klinische Epidemiologie, Freiburg, Germany; 2 Clinical Epidemiology, IMBI, Freiburg, Germany The emergence of more and more potentially predictive markers with low to moderate prevalence has given rise to the idea of umbrella trials. In umbrella trials, several markers are measured in each patient and, depending on the marker pattern, patients are randomized to a new marker-specific treatment or the current standard treatment. Hence umbrella trials allow studies on each single treatment of interest to be conducted simultaneously. Already in 2006, Vach & Christensen pointed out that there may be more efficient ways to randomize patients in such studies. If patients with multiple markers are randomized in an adequate manner, it is possible to use patients receiving standard treatment for several comparisons. Further efficiency gains are possible if patients with multiple markers are allowed to receive multiple treatments. We consider different randomization strategies (fixed sequence, random allocation, random allocation allowing multiple treatments, block randomization) and their combination with different analysis strategies: two-group comparisons, adjustment for marker status, adjustment for co-treatment and a joint analysis based on a common model. We develop an overall model for data generation reflecting essential properties of data in umbrella trials: prevalence of markers, positive/negative correlations between markers, treatment effects, treatment effect modification by other markers, interactions between treatments, and prognostic value of markers. 
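The shared-control idea can be illustrated numerically: reusing standard-treatment patients across comparisons increases the number of patients contributing to each marker-specific analysis. A hedged sketch (the prevalences, the first-positive-marker assignment rule and the 1:1 split are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical umbrella trial: 3 independent markers, 15 % prevalence each;
# marker-positive patients are randomized 1:1 to marker treatment vs standard.
n = 4000
markers = rng.random((n, 3)) < 0.15
any_pos = markers.any(axis=1)
treated = rng.random(n) < 0.5            # arm assignment
assigned = np.argmax(markers, axis=1)    # first positive marker (toy rule)

contrib_exclusive = np.zeros(3, dtype=int)
contrib_shared = np.zeros(3, dtype=int)
for m in range(3):
    in_m_treated = any_pos & treated & (assigned == m) & markers[:, m]
    # exclusive: a control patient counts only for the first positive marker
    ctrl_exclusive = any_pos & ~treated & (assigned == m)
    # shared: a control patient counts for every marker they are positive for
    ctrl_shared = any_pos & ~treated & markers[:, m]
    contrib_exclusive[m] = in_m_treated.sum() + ctrl_exclusive.sum()
    contrib_shared[m] = in_m_treated.sum() + ctrl_shared.sum()
```

Multi-marker controls are counted more than once under the shared scheme, so each comparison gains patients without recruiting anyone extra, which is the efficiency mechanism the abstract attributes to adequate randomization.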
This model allows a comprehensive comparison of the different strategies with respect to power and sample size. Clever randomization of patients may reduce the sample size by up to 20 %. Allowing multiple treatments in one patient makes factorial designs possible and can imply a further reduction of 30 %. Efficiency depends mainly on the degree of additivity of multiple treatments and the adequate choice of analysis strategy. Conclusions Randomization and analysis strategies have a high impact on the efficiency of umbrella trials. The most crucial point is the question of whether we allow multiple treatments or not. Due to the potential impact on the necessary sample size, the design of umbrella trials should be done with great care. Many scientists perform initial data analyses (IDA) as part of their research studies. For some this may mean data cleaning, others understand IDA as basic data summaries, and yet others perform exploratory analyses, which are then formalized in statistical models. These first steps of data analysis are often performed in an informal and unorganized way, and it has recently been stressed that this may have a large and intransparent impact on the final results presented in publications. Already 30 years ago, Chatfield discussed a common, general frame for IDA. However, the conditions and scope for IDA have changed over the last 30 years. Data sets have grown in size and complexity. With standard software packages, many researchers may rush to perform sophisticated analyses without systematically checking for errors in the data and without a clear understanding of the underlying features of the data. To improve this situation, IDA has to be established as a necessary and genuine step in the minds of all researchers. On the other hand, we are today more concerned about undesired or negative consequences of IDA. 
IDA may lead to unjustified removal of ''disturbing'' observations, to data-driven hypotheses, to non-transparent changes in the statistical analysis plan, or to misleading ''optimization'' of analysis strategies. All these changes over the last three decades make it necessary to take a fresh look at what the core of IDA is. Although there have been articles listing and discussing important elements of IDA, we lack a general, conceptual framework. Consequently, in this paper we develop a conceptually oriented, contemporary view on IDA. We try to clarify the scope and the aim of IDA, focusing on the question ''Why do we do what?''. We divide IDA into five major steps: (1) Data cleaning, (2) Data screening, (3) Initial data reporting, (4) Refining and updating the analysis plan, and (5) Reporting IDA in research papers. For each step we try to define the basic issues to be solved and to identify the main conceptual approaches we may apply. For example, we characterize data screening as an attempt to understand those properties of the data that may affect future analysis and interpretation. It consists of two main activities: comparing the actual distribution of variable(s) with implicit or explicit expectations, and detecting data properties which may represent a potential threat to the correct application of statistical methods or the adequate interpretation of results. Across all steps, key elements of IDA are transparency and full documentation. We hence introduce principles to achieve this goal and discuss several aspects of the organizational frame. 
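A minimal data-screening step in the spirit of step (2) might look like the following sketch; the report fields, the expected-range check and the example blood pressure values are all invented for illustration:

```python
import numpy as np

def screen_numeric(values, expected_range):
    """Compare the observed distribution of one variable with explicit
    expectations and flag properties that may threaten later analyses
    (missingness, out-of-range values, crude asymmetry). Sketch only."""
    v = np.asarray(values, dtype=float)
    lo, hi = expected_range
    return {
        "n": v.size,
        "n_missing": int(np.isnan(v).sum()),
        # NaN compares False on both sides, so missing values are not counted here
        "n_out_of_range": int(((v < lo) | (v > hi)).sum()),
        "skewed": bool(abs(np.nanmean(v) - np.nanmedian(v)) > np.nanstd(v)),
    }

# Example: systolic blood pressure with one impossible entry and one missing.
report = screen_numeric([120, 135, 118, 999, np.nan, 142], (70, 250))
```

Writing such a report for every variable, and documenting it, is one concrete way to make the screening step transparent rather than an informal glance at the data.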
Implementing web-based patient-reported screening software for ICU follow-up clinics: a mixed methods approach van Beusekom I 1 , Bakhshi-Raiez F 1 , van der Schaaf M 2 , Dongelmans D 2 , de Keizer N 2 1 Academic Medical Center, University of Amsterdam, Amsterdam, Netherlands; 2 Academic Medical Center, Amsterdam, Netherlands Background ICU survivors frequently suffer psychological distress, reduced social well-being and long-term physical limitations after ICU discharge. Because of the complexity and magnitude of these complaints, multidisciplinary care is required. Screening ICU survivors is time consuming for healthcare providers, and recommendations concerning the use of web-based patient-reported screening software, intended to make the screening more efficient, were developed. Although web-based screening software has major benefits, research about the use of web-based questionnaires in ICU survivors is scarce. We created new software and conducted a pilot study to test the software on site. After finishing the pilot study, semi-structured interviews were conducted with caregivers who used the patient-reported screening software. In total, 112 ICU survivors filled out the questionnaires, of whom 44 used the new software. ICU survivors using the new software were significantly younger compared with the paper questionnaire group, had a longer ICU stay, and fewer items were missing. The most important barriers to using the software were patients stating they had no e-mail address and healthcare personnel not registering e-mail addresses. The results of our study can be used during the implementation of web-based patient-reported screening software in this specific group of patients. User oriented platform for data analytics in medical imaging repositories Valério M 1 , Marques Godinho T 1 , Costa C 1 1 DETI/IEETA - University of Aveiro, Aveiro, Portugal The production of medical imaging studies and associated data has been growing in the last decades. 
Their primary use is to support medical diagnosis and treatment processes. However, the secondary use of the tremendous amount of stored data is generally more limited. Nowadays, medical imaging repositories have turned into rich databanks holding not only the images themselves, but also a wide range of metadata related to the medical practice. Exploring these repositories through data analysis and business intelligence techniques has the potential to increase the efficiency and quality of medical practice. Nevertheless, the continuous production of tremendous amounts of data makes analysis by conventional approaches difficult. This article proposes a novel automated methodology to derive knowledge from medical imaging repositories that does not disrupt the regular medical practice. Our method is able to apply statistical analysis and business intelligence techniques directly on top of live institutional repositories. It is a Web-based solution that provides extensive dashboard capabilities, including complete charting and reporting options, combined with data mining components. Moreover, it enables the operator to set a wide multitude of query parameters and operators through the use of an intuitive graphical interface. and improving scientific data analysis. This work focusses on the field of myeloid leukemia (ML), where a semantic core of common data elements (CDEs) in routine and trial documentation was established. Analysis was performed using automatic UMLS-based coding evaluation of existing clinical trials and data standards in myeloid leukemia. The initial CDEs were reviewed and commented on by leukemia experts and then electronically surveyed systematically item by item. CDEs (n = 227) were generated before being systematically surveyed in an international voting process by seven hematologists from four countries. The total agreement score was 86 %. Of these, 116 elements (51 %) had an agreement score of 100 %. 
This work generated CDEs with language-independent semantic codes and international clinical expert review to build a first approach towards an international data standard for ML. A first version of the CDE list is implemented in the data standard Operational Data Model and additionally in other data formats for reuse in different medical information systems. Popularity of Russian medical educational sites and dynamics of their rankings Background Network analysis provides a new perspective on modeling the complex interplay between feelings, thoughts, behaviours, or physical symptoms. In this perspective, each measured item is a link in one or more possibly cyclic causal chains that predict the dynamic behavior of the whole set of items. The set of causal chains can be represented by a weighted directed network. We explore the short-term interplay between several mood measures and assess which pairwise connections of items are strongest and which items are most central for the overall network. Analyses are based on data of the ongoing Behavior and Mind Health (BeMIND) Study in Dresden, Germany, which examines developmental trajectories of mental disorders in a community sample of adolescents and young adults aged 14-21. In the ecological momentary assessment (EMA) study part, so far N = 81 participants completed EMA eight times a day for four days via smartphone. At each EMA, participants rated in particular the degree to which they felt satisfied vs. unsatisfied, calm vs. agitated, full of energy vs. without energy, optimistic, pessimistic, and the amount of time they had felt stressed since the last beep. Our analysis employs a slight modification of the multilevel vector autoregression (VAR) method proposed by Bringmann et al. (2013). The strength of an edge between two items in the network comes from the regression coefficients of VAR models. Item values at time point t + 1 are regressed on item values at time point t for all consecutive time points on the same day. 
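The lagged-regression step that defines the edge weights can be sketched with ordinary least squares on simulated data; this pools all time points into a single regression and omits the multilevel, random-intercept part of the abstract's actual model, and the coefficient matrix below is invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 3-item mood network with a known VAR(1) coefficient matrix
# (row = target item at t+1, column = predictor item at t).
A_true = np.array([[0.5, 0.2, 0.0],
                   [0.0, 0.4, -0.3],
                   [0.1, 0.0, 0.5]])
T = 2000
x = np.zeros((T, 3))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + rng.normal(0, 0.5, 3)

# Edge weights: regress items at t+1 on items at t via least squares.
X_lag, Y = x[:-1], x[1:]
B, *_ = np.linalg.lstsq(X_lag, Y, rcond=None)
A_hat = B.T  # back to row = target, column = predictor

# Simple "outstrength" centrality: summed absolute outgoing weights per item,
# excluding the self-loop on the diagonal.
outstrength = np.abs(A_hat).sum(axis=0) - np.abs(np.diag(A_hat))
```

The recovered matrix `A_hat` approximates `A_true`; its diagonal corresponds to the self-loops and its off-diagonal entries to the signed directed edges the abstract interprets.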
We chose random intercept models to incorporate the multilevel structure of the data. The strength of an edge can be either positive or negative, indicating a positive or negative predictive interplay. To assess the importance of single items, standard centrality measures were used. Betweenness measures the importance of an item for the information flow through the whole network. Outstrength represents the degree to which an item affects all other items of the network. Instrength measures how much an item is affected by the other items of the network. The resulting network included significant self-loops for all measures, meaning that an item value at time point t + 1 is significantly predicted by the same item at time point t. Strong positive edges were found between feeling full of energy and feeling satisfied, between feeling calm and feeling satisfied, and between feeling pessimistic and feeling stressed afterwards. The strongest negative edge was found between feeling calm and feeling stressed. The highest betweenness was found for energy level, the highest outstrength for feeling calm, and the highest instrength for feeling satisfied. The occurrence of self-loops in the network is in line with earlier findings (Bringmann et al., 2013). The high betweenness of the energy level underlines the importance and feedback-loop property of this measure. Feeling calm was the most influential of the items used, suggesting that a calm state can improve mood in the long term. Future research may compare networks representing short-term dynamics of mood states between depressed and non-depressed subjects. Prevalence of hypertension and adherence to hypertension guidelines in the population-based Heinz Nixdorf Recall study: a comparison of participants on medication with and without blood pressure control Of the participants with hypertension taking ≥1 antihypertensive agent (76.0 %), 56.9 % achieved BP control. 
In the subgroup of participants with uncontrolled BP despite medication (43.1 %), the mean BP was 154/85.7 mmHg (SD 14.5/10.1). Participants ≥67 years had a mean BP of 156.0/85.5 mmHg (SD 14.7/10.5), while this was slightly lower in younger subjects (152.3/88.8 mmHg; SD 13.9/8.6). The mean BPs did not differ between men and women, nor between participants with and without CHD. Of the participants on medication, 95.8 % received first-line therapy (single or combined): ACEI 42.0 %, ARB 23.2 %, BB 53.2 %, thiazides 36.4 %, CCB 21.6 %. On average, participants used 2.0 agents (range 1-7), with no difference between those with and without BP control. The mean dosages were also similar between participants with BP control (50.7 % of the recommended maximum) and those lacking control (52.3 %). There were no differences by sex or age group. Participants with CHD used on average 0.5 more antihypertensive agents than those without CHD. In participants with CHD, we observed no difference in the number of antihypertensive substances between those reaching the BP target and those who did not (mean 2.6 agents, range 1-6, vs. 2.4, range 1-6). The percentages of the recommended maximum dosages of antihypertensive medications also did not differ between these groups (52.4 % vs. 51.5 %). In this population-based study we observed that about 60 % of the participants on antihypertensive medications reached the target BP. Overall, pharmacological agents were selected according to guidelines. There is potential to increase medication dosages in participants lacking BP control if clinically indicated. 548 A mobile ECG system for the evaluation of cardiovascular risk Villamil C 1 , Landinez S 1 , Lopez D 1 , Blobel B 2 1 Universidad del Cauca, Popayan, Colombia; 2 University of Regensburg, Regensburg, Germany Problem: Cardiovascular diseases (CVD) are the number one cause of death globally. 
The World Health Organization estimated that 80 % of the deaths caused by CVD take place in low- and middle-income countries (LMIC). Objective: This paper describes the development of a mobile electrocardiogram (ECG) system designed to support the evaluation of cardiovascular risk. Methods: The system was developed using low-cost technology, implemented on the open-source platform SANA, adding an ECG signal to the process of cardiovascular risk evaluation. Results: Main functionalities of the system include visualization and analysis of the ECG signal on the Android mobile device, calculation of four cardiovascular risk scales, standard ECG transmission using the European Data Format (EDF), and integration into an Electronic Health Record system. Ten experts recommended 28 different application scenarios for the system and evaluated its performance (100 %) and the relevance of its functionalities (89 %). Conclusions: The paper demonstrates the feasibility of developing a low-cost, open-source, mobile ECG system able to support the evaluation of cardiovascular risk and potentially useful for other health promotion and prevention programs and scenarios, especially in LMIC. The two-sided market perspective of e-health Business models reflect how organizations conduct commercial transactions and thereby create and capture value [1]. These models are meaningful boundary objects for explaining the relation between strategy and IT [2]: the same technology commercialized in different ways may result in different economic outcomes and value perceptions [3]. In healthcare contexts a system may be justified by social cost reductions, such as patient travelling time, fares, loss of remuneration etc., but these savings are not realized as cash receipts, so the system provider is out of pocket while society gains. This workshop will introduce the components of an ecosystem in a two-sided market. 
In particular, we will discuss and offer some food for thought regarding the ''chicken and egg'' and ''two-homing'' issues and business models. The workshop is composed of two parts. In the first part, the participants will learn about the components of the ecosystem of a two-sided market and the requisites for its success. They will also be involved in reviewing the components of the ecosystem. The content of the discussion will be summarized, and the speakers will seek publication of an opinion paper with the conclusions of the panel in one of the official IMIA journals. Obesity is now epidemic. More than 75 % of obese children can have fatty pancreas (non-alcoholic fatty pancreas disease, NAFPD) and/or fatty liver (non-alcoholic fatty liver disease, NAFLD), and these diseases increase the risk of prediabetes, pancreatitis and cardiovascular disease later in young adulthood. Biopsy and nuclear magnetic resonance are sensitive methods for NAFPD and NAFLD diagnosis, but they are not highly recommended for children. The aim of this study is to propose a medical alert based on scores for usual plasma and anthropometric parameters, which helps the physician to suspect NAFPD and NAFLD in obese children and alerts them to recommend abdominal ultrasound. We searched databases such as BioMed Central, Thomson and Elsevier using the keywords NAFPD or NAFLD and usual plasma parameters. We selected studies on NAFLD and on the less-studied NAFPD that were conducted on large numbers of subjects. Most of these studies demonstrated, in logistic regression analyses and simple Pearson correlations, positive correlations between these diseases and body mass index (BMI), triglyceridemia, thrombocyte count, and an AST/ALT ratio lower than 1, and a negative correlation with HDL-cholesterol. We conducted our own study of 60 overweight and obese children (10-16 years old) with and without these diseases as verified by ultrasound. 
For each of the above parameters we assigned a score of 0 or 1 according to standard cutoff values (40 mg/dL for HDL-c, 110 mg/dL for triglycerides, 400,000/mm³ for thrombocytes, and an AST/ALT ratio lower than 1), and of 0, 1 or 2 for BMI according to its severity. For each patient, we computed the total score, with possible values between 0 and 6. The sensitivity for NAFLD was 65.51 % and the specificity 51.72 %; for NAFPD the sensitivity was 61.07 % and the specificity 50.9 %. We built an Excel application (macro) using VBA (Visual Basic for Applications), which computes the total score for a given patient based on the five parameters presented above. If the value is greater than or equal to 3, a window is shown indicating that the patient should be examined by abdominal ultrasound. The determination of the above-mentioned parameters is inexpensive and is usually performed for every patient at least once a year. The physician should adopt a more ''aggressive'' management, starting with lifestyle changes under the supervision of a nutritionist, in those children with positive results. The impact of this study is important in preventive medicine and should reduce the incidence of obesity complications such as prediabetes and pancreatitis. This study may be a starting point for establishing a score indicating the risk of fatty pancreas and fatty liver. 
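A minimal sketch of this scoring rule follows; the direction of each cutoff is inferred from the correlations reported above (high triglycerides, high thrombocyte count, AST/ALT below 1 and BMI score as risk; low HDL-c as risk), and the function names are illustrative, not the study's VBA macro.

```python
def fatty_risk_score(hdl_c, triglycerides, thrombocytes, ast, alt, bmi_class):
    """Total risk score (0-6) from the cutoffs described in the abstract.

    hdl_c, triglycerides in mg/dL; thrombocytes per mm^3;
    bmi_class: 0, 1 or 2 according to BMI severity (study-specific grading).
    """
    score = bmi_class
    score += 1 if hdl_c < 40 else 0            # low HDL-cholesterol
    score += 1 if triglycerides > 110 else 0   # elevated triglycerides
    score += 1 if thrombocytes > 400_000 else 0  # elevated thrombocyte count
    score += 1 if ast / alt < 1 else 0         # AST/ALT ratio below 1
    return score

def ultrasound_alert(score):
    # A total score >= 3 triggers the recommendation for abdominal ultrasound.
    return score >= 3
```

For example, a child with HDL-c 35, triglycerides 150, thrombocytes 450,000, AST/ALT = 0.75 and BMI class 2 scores 6 and would be flagged for ultrasound.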
The EUthyroid Consortium-concept and design Völzke H 1 , Albrecht D, Thuesen B, Erlund I, Markou K, Zimmermann M, Peeters R, Siebert U 2 , Rochau U 3 , Grünert I, Hubalevaska-Dydejczyk A, Lazarus J 1 Ernst-Moritz-Arndt-Universität, Greifswald, Germany; 2 UMIT - University for Health Sciences, Medical Informatics and Technology, Hall i.T., Austria; 3 UMIT - Private Universität für Gesundheitswissenschaften, Medizinische Informatik und Technik, Hall in Tirol, Austria Despite the fact that iodine deficiency (ID) can easily be prevented by iodine fortification of table salt, industrial salt and cattle feed, Europe is among the worst regions in terms of access to iodized salt and is seriously iodine deficient, perpetuating the single most important preventable cause of brain damage. Up to an estimated 360 million European citizens are exposed to ID. This is due not only to variable iodine provision but also to significant heterogeneity of prevention and monitoring activities, leading to inappropriate interventions, health inequalities and an increased disease burden with substantial impact on health-care costs. A major concern remains the large number of pregnant women exposed to ID, which results in a measurable decrease in the cognitive potential of their children. An effective European monitoring program is a crucial step towards eradication of ID disorders (IDD), with significant benefits for European citizens and the sustainability of health-care systems. In total, the effects of ID cause significant, preventable costs in the health-care systems of affected regions. Twenty-seven countries contribute to the EUthyroid consortium, which is funded by the EC Horizon 2020 programme. The overall aims of EUthyroid are to evaluate IDD prevention programmes in European countries and to initiate capacity building for integrated activities in Europe. 
Coordinated measures at national and EU level are a long-term goal, which will be worked towards by targeted communication of project results. To achieve these aims, EUthyroid will establish a meta-platform for collaborative data collection and use. National and regional registry data will be collected to gain an overview of prevalent and incident thyroid diseases and treatments in European nations. In parallel, EUthyroid will improve the data quality of national and regional IDD monitoring studies by providing infrastructures for training and for the standardization of interviews, laboratory measurements and thyroid ultrasound. These data will then be collected in a centralized EUthyroid database. All data will be used to provide a European map of iodine status and subclinical disorders and to relate the iodine status of populations to thyroid-related outcomes for the evaluation of IDD prevention programmes. In addition, thyroglobulin, a novel and promising biomarker of iodine status, will be evaluated using biomaterials collected in IDD monitoring studies and three mother-child cohorts from regions with different iodine status. The latter cohorts will investigate the important association between iodine status in pregnant women and neurocognitive function in their children. Combining findings from all studies, health-economic analyses will be conducted to investigate the benefits and harms as well as the cost-effectiveness of IDD prevention. Potential barriers against harmonized IDD prevention and monitoring programmes will be explored by collecting information from national and European ministries. The successful implementation of the ambitious EUthyroid research programme, and the creation of impact through coordinated communication measures, rests on a carefully selected consortium. 
Most of the partners represent their nations in the Iodine Global Network, a non-governmental, non-profit organisation associated with WHO, UNICEF and other world nutrition charities combating ID globally. Using specialized software to streamline the production of systematic reviews of complex public health interventions: a comparison of Covidence and abstrackr Background Systematic reviews are of crucial importance for evidence-based health care, but their high-quality conduct requires a substantial commitment of time and resources. Specialized software, such as Covidence and abstrackr, has been developed to increase the speed, efficiency and quality of the production of systematic reviews. Their use for systematic reviews of clinical questions has been validated, but no reports on their use for systematic reviews of complex public health interventions have been published so far. We compare two software tools recommended by Cochrane based on their usefulness for an ongoing Cochrane review of environmental interventions to reduce the consumption of sugar-sweetened beverages. We compare strengths and weaknesses and discuss their compatibility with each other and with common software applications used for statistical analysis and reference management, such as Review Manager and EndNote. Special attention is paid to the specific needs of reviews of complex public health interventions. Further criteria for the comparison are the time needed to familiarize oneself with the programs, supported data formats for import/export, data mining functions, pricing, data security, functions to manage collaboration and review progress, and software tools assisting the abstract screening process. Both Covidence and abstrackr can be used to facilitate manual title and abstract screening through user-friendly screening interfaces. 
Moreover, abstrackr employs prediction algorithms based on machine learning to sort references by their predicted relevance, based on defined keywords and the manual screening decisions for a subset of references. This allows for semi-automatic screening of the remaining references. Covidence relies on manual screening only, but provides further tools for full-text screening, data extraction and risk-of-bias assessment. The respective templates were developed for systematic reviews of clinical questions, but can be adapted to the more complex data extraction and risk-of-bias assessment needed for public health reviews. Moreover, Covidence allows automated export of results into RevMan for statistical analysis. Both tools are web-based applications that allow multiple users to work on the same review. Specialized software tools can contribute considerably to improving the speed, efficiency, convenience and quality of the production of systematic reviews, including systematic reviews of complex public health interventions. Abstrackr provides advanced functionality for title and abstract screening, and its semi-automated prediction algorithm shows promise in reducing the time needed for screening. However, further validation of this novel method is needed. By contrast, Covidence requires manual screening of all references, but can be used to manage all steps of a systematic review from title screening to data analysis. Is geography really destiny in healthcare? von Stillfried D 1 , Czihal T 1 1 Zentralinstitut für die kassenärztliche Versorgung in Deutschland (ZI), Berlin, Germany The bulk of research on geographical variation in healthcare is based on John Wennberg's statement equating geography with destiny. This may largely be caused by the way researchers have looked at populations, i.e. by area of residence. 
Comparing outcomes for populations served by informal network structures, as created by utilization, demonstrates systematic variation depending on the networks rather than on geography. This perspective may lead to different conclusions. Objective To explore the importance of regionally distributed variables versus the importance of practice patterns, we applied informal network analysis to ambulatory care in Germany. To identify potential for interventions, we looked at the stability of practice patterns over time. Using a complete nationwide claims data set for Germany (70 million patients, 100 K physician practices) for 2010, we attributed each patient to the primary care provider with the largest share of services for that patient and added all other ambulatory physicians contacted by the patient to create the virtual network. To compare quality of care across networks we looked at the incidence of selected process variables. Networks were attributed to geographic areas (counties) based on patients' residences. To analyze variability over time, we followed all networks that treated at least 100 patients in 2012, 2013 and 2014. For 2010, 43 K networks could be identified, with only small systematic differences in distribution between metropolitan and rural areas. Quality of care did not vary by rural/urban setting. Regional patterns exist but cannot be explained by the urban/rural context. Trends over time could be observed in 38 K networks, indicating that indicators of process quality (e.g. HbA1c testing for diabetics) have been subject to significant change in many networks. Judging from the degree of change in process variables over time, much of the regional variation could be attributed to practice patterns and would therefore offer potential for deliberate improvement. The causes of the massive improvement of process indicators in some networks need to be studied further. 
Yet the results suggest that in healthcare there is much less destiny in geography than hitherto assumed. Regional patterns may be affected by varying levels of substitution between inpatient and ambulatory care. We could not control for social structure as a major risk factor differing between networks. Small numbers may be a problem when comparing care for less prevalent indications across networks. This might be overcome by creating bigger networks based on hospitals as network nodes or by other ways of clustering small numbers in networks. Determinants of activity limitation among older adults in Germany. Results from German Health Update (GEDA) national health interview survey waves 2009, 2010 and 2012 von der Lippe E 1 , Fuchs J 1 , Scheidt-Nave C 1 1 Robert Koch-Institut, Berlin, Germany The Global Activity Limitation Indicator (GALI) is based on a single survey question asking about a person's extent of long-term daily activity limitation due to a health problem, with the answer categories ''severely limited'', ''limited but not severely'' and ''not limited at all''. The GALI has been shown to be a valid indicator of functional limitations and is hence used as the underlying measure to calculate the European indicator Healthy Life Years (HLY). The present study aimed to identify determinants of the GALI among older adults in Germany. We used pooled data from three consecutive waves of the telephone interview survey German Health Update (GEDA) for adults in Germany aged 18 years and older, conducted in 2009, 2010 and 2012 as part of the continuous national health monitoring. We limited the sample to adults aged 50 years and older (N = 29,562). The prevalence [95 % confidence interval (CI)] of persons reporting themselves as severely limited was calculated across strata of sex, 10-year age groups, social status (low, intermediate, high), living alone (yes/no), obesity defined as body mass index ≥30 kg/m² (yes/no), and among persons with and without chronic health conditions. 
We grouped information on single health conditions into 8 categories (cardiometabolic, cancer, musculoskeletal, lower respiratory, liver/renal, chronic back pain, depression, and sensory limitation defined as severe visual or hearing impairment). Adjusted odds ratios (OR) as a measure of association with severe limitation (yes/no) were calculated from sex-specific multiple logistic regression models including the number of concurrent physical health conditions (0, 1, 2, ≥3), depression (yes/no), sensory limitation (yes/no), obesity (yes/no) and sociodemographic characteristics. Overall, 18 % of adults reported severe activity limitation (women: 17.1 % [16.2-18.0]; men: 18.5 % [17.7-19.3]). In both sexes, the prevalence of severe limitation increased significantly from about 14 % among persons 50-59 years of age to nearly one third among those 80 years and older. Severe limitation was significantly related to lower social status, living alone, being obese, and having any of the examined chronic health conditions. Multivariate analyses largely confirmed these findings and demonstrated the importance of comorbidity. In both sexes, low social status, depression and any severe sensory limitation were independently associated with a two- to threefold increase in the odds of reporting severe activity limitation. Compared to persons with no physical health condition, the odds were about threefold higher among men and women with any one of the considered physical health conditions and 15- to 20-fold higher among those with three or more physical health conditions (OR [95 % CI]: women 15.4 [11.6-20.4]; men 20.2 [14.2-28.8]). Among older adults in Germany, non-medical determinants of health as well as chronic health conditions, in particular multimorbidity, are important determinants of activity limitation as captured by the GALI indicator. Single physical, mental and functional conditions showed associations of similar strength with activity limitation. 
No important sex differences were observed. As a limitation, the current results may underestimate the strength of the true associations between chronic conditions and activity limitation, due to the exclusion of severely ill, hospitalized individuals and people living in nursing homes. Controlled studies in the authorization process of medicinal products: objectives, problems, current practice and suggestions for new regulations Wadepuhl M 1 1 Büro für Versuchsplanung und statistische Auswertung, Ulm, Germany Controlled, randomized and blinded trials (RCTs) are the gold standard for producing reliable answers to clinical questions. However, they are almost absent from the process of CE certification, i.e. the process regulating the access of medicinal products (devices) to the European market. The new EU regulation, initiated after several product scandals and expected this summer, attempts to improve the quality of the evidence on which certification is based. It turned out that a simple transfer of the principles applied in drug approval is inefficient, even though the epistemological problems are comparable. Having addressed the analysis of safety elsewhere (Wadepuhl, 2011), I will discuss the use of RCTs mainly from a social perspective. These ideas have to be seen in the context of the development of the ''Ausführungsbestimmungen'' (implementing provisions) of the new EU regulation, which have to create an innovative, cost-effective environment where a multitude of clinical questions can be answered with sufficient certainty. This discussion is restricted to the highest risk class, i.e. class III devices such as implants. The crucial difference from drugs is the complicated application of implantable devices. This leads to the use of the term ''effectiveness'' instead of ''efficacy'' when the performance of a device is addressed. In other words, the ''applicator'' of a device can no longer be neglected. 
Due to learning curves, habits and the need for product specialists, equivalent controls are often difficult to establish and randomization is complicated. ''Usability'' is therefore another dimension of a device that has to be addressed. Blinding, even observer blinding, is often unethical or impossible. Furthermore, generalizability is hampered by the large influence of the investigator. Finally, the patient pool available for a study is often limited, a problem that will worsen with individually manufactured devices. All these obstacles can easily be demonstrated with recent or ongoing studies. There is no doubt that RCTs are necessary at certain stages of the life cycle of a device or class of devices, but their design and objective must be adjusted to the above presuppositions and predefined precisely. Possible solutions are explored by looking, e.g., at analogous problems in drug development, and include: definition of phases of development (I-IV) and a restricted definition of ''conformity''; definition of equivalence comparable to biosimilars; tightening the use of historical controls; enforcement and improvement of post-marketing surveillance; and review of the consequences of device scandals (e.g. PIP, Ranvier, cranial stents). Prevalence of chronic kidney disease and care givers' awareness in patients hospitalized for coronary heart disease Wagner M 1,2,3 , Fette G 4,2 , Wanner C 3 , Hartmann K 1 , Rücker V 1 , Oezkur M 5,2 , Gelbrich G 1,6 , Störk S 2,7 , Heuschmann P 1 Background Aside from classic CV risk factors (e.g. blood pressure, LDL cholesterol, diabetes, overweight/obesity, smoking), chronic kidney disease (CKD) has been identified as an important risk modifier in patients with coronary heart disease (CHD). CKD predisposes the patient to complications during hospital stays for CHD, such as acute kidney injury (AKI). Both CKD and AKI represent risk factors for the progression of CKD, progression of CHD, and impaired outcomes in CHD. 
Therefore, reporting CKD and/or AKI in discharge letters is important to transfer information on these major events to the physician in the ambulatory setting. From the hospital's perspective, adequate ICD coding of CKD and AKI is relevant for reimbursement. In the current study we describe the prevalence of CKD in patients admitted for CHD. We also analyzed the reporting of CKD and/or AKI in the discharge letter and the ICD coding, both reflecting the care giver's awareness of CKD. We used data on n = 498 patients of the University Hospital Wuerzburg enrolled in the EUROASPIRE IV study. These patients had been hospitalized for CHD (myocardial infarction or ischemia, angioplasty/stent, coronary bypass grafting) 6 months to 3 years before the EUROASPIRE IV study visit, at which subjects consented to a detailed retrospective chart review. Information on CKD and AKI in the diagnoses or summary of the discharge letters was extracted by doctoral students. Data collection of ICD codes and SCr measurements was supported by the CHFC Datawarehouse, which enables data from clinical routine to be used for scientific purposes with digitized, (semi-)automated data transfer. All serum creatinine (SCr) measurements (n = 2332) were used to define AKI (any rise in SCr ≥0.3 mg/dl during the hospital stay) and CKD (glomerular filtration rate, GFR-CKDEPI, at admission or at discharge <60 ml/min/1.73 m², i.e. CKD stage 3+). Relevant ICD codes included N18, N19, I12.0 and I13 for CKD, and N17 for AKI. Study participants were on average 67 years old, and 83 % were male. A total of 32.6 % of patients had CKD or AKI during their hospital stay (CKD on admission 17.6 %, CKD at discharge 19.0 %, AKI 25.1 %). Patients with either CKD or AKI were older (73 vs. 64 years), more likely to have undergone CABG (32.3 vs. 8.1 %, p < 0.01) and less likely to have received PCI/stent (50.3 vs. 77.8 %, p < 0.01) compared to subjects without renal impairment. 
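The AKI and CKD case definitions above reduce to simple rules over the serum creatinine and eGFR values. A minimal sketch follows; the CKD-EPI eGFR computation itself is omitted, and the function names are illustrative.

```python
def had_aki(scr_values):
    """AKI: any rise in serum creatinine >= 0.3 mg/dl during the stay.

    scr_values: SCr measurements (mg/dl) in chronological order.
    A 'rise' is taken here as an increase relative to any earlier value.
    """
    lowest_so_far = scr_values[0]
    for scr in scr_values[1:]:
        if scr - lowest_so_far >= 0.3:
            return True
        lowest_so_far = min(lowest_so_far, scr)
    return False

def had_ckd(egfr_admission, egfr_discharge):
    # CKD stage 3+: eGFR (CKD-EPI) < 60 ml/min/1.73 m^2 at admission or discharge.
    return egfr_admission < 60 or egfr_discharge < 60
```

For example, the SCr trajectory 1.0, 1.1, 1.4 mg/dl qualifies as AKI (a 0.4 mg/dl rise), whereas a monotone fall from 1.4 to 1.1 does not.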
CKD or AKI was mentioned in the discharge letter for 22.3 % of patients with CKD or AKI. Renal ICD codes were recorded for 72.7 % of patients with CKD or AKI. Our data suggest that the coding of reimbursement-relevant ICD codes for CKD or AKI, i.e. the ''hospital's awareness of CKD'', is fair, but there is still a considerable number of patients for whom accurate ICD coding would yield monetary benefits for the hospital. In contrast, the ''physician's awareness of CKD'', i.e. CKD and/or AKI clearly stated in the discharge letter, indicates room for improvement. Consistent reporting of CKD and AKI in discharge letters may lead to improved information transfer to care givers in the outpatient setting, and may also lead to better informed patients. Epidemiological challenges when assessing cancer risk in a contaminated Austrian alpine valley Waldhoer T., Mueller K., Hackl M., Hutter H-P., Heinzl H. Introduction From 1926 to 1981 contaminated waste was deposited at a special landfill by the local chemical industry in Carinthia, Austria. In 2010, it was decided to clear the site, and about 100,000 tons of contaminated slaked lime were burnt between July 2012 and November 2014 in a local cement plant. In spring 2014, an unusually high environmental load of hexachlorobenzene (HCB) was measured in milk products from this Carinthian alpine valley. In December 2014, public authorities appointed a group of experts to conduct a human biomonitoring study in order to investigate the HCB exposure of the population. At the same time, media presented figures showing that the district containing this alpine valley region exhibits the highest cancer rates in Carinthia. This caused general health-related concerns. Debates started over whether the recent exposure to HCB and/or other negative environmental factors in the past are associated with an increased cancer risk. 
The aim of this study is to address methodological challenges and technical questions concerning the calculation and presentation of cancer risk estimates for this alpine valley region. Epidemiological analyses of cancer risks are usually based on data from public cancer registries. Epidemiologists can perform such analyses, and they should, at least theoretically, produce similar results. In practice this will not always be the case, as various methodological and technical issues may be handled differently by different researchers. Methodological issues concern the question of whether the analyses are hypothesis driven, and if so, who is entitled to formulate the hypotheses. For example, besides all cancers combined, should various subgroups or even single cancer entities also be considered? The question of whether the results should be accompanied by confidence intervals (i.e. hypothesis testing) is also methodological. Technical issues relate to the choice of epidemiological measures, the type of standardisation, smoothing, the pooling of calendar years, the width of age intervals, and the general question of what to do if the numerator or denominator becomes too small. We will especially focus on principles for addressing these technical issues in practice. Methodological and technical issues in the generation of epidemiological reports are often ignored by the consumers of these reports (e.g. journalists, politicians, the general public). Considering the increased public health concerns, epidemiologists may even face insinuations that their specific analyses exaggerate unimportant facts or, on the contrary, hide important ones. In this case, even the simplest descriptive result may become a matter of debate. 
Our guidance for such situations consists of the following three principles: (1) involve physicians and toxicologists in the methodological issues, (2) follow clear epidemiological principles to address technical issues in a robust and readily comprehensible manner, and (3) as somebody will always disapprove of what you do, keep calm and stick to your analysis. Estimating usual dietary intake in the BVS II study: comparison of the results derived by the NCI method and a weighted mean approach Wawro N 1 , Kleiser C 1 , Himmerich S, Gedrich K 2 , Boeing H 3 , Knüppel S 4 , Linseisen J 5 1 Helmholtz Zentrum München - Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH), Neuherberg/München, Germany; 2 Technische Universität München, Freising, Germany; 3 DIfE, Nuthetal, Germany; 4 Deutsches Institut für Ernährungsforschung Potsdam-Rehbrücke, Nuthetal, Germany; 5 Helmholtz Zentrum München, Neuherberg, Germany Estimating habitual dietary intake is still a challenge, and new statistical methods have been suggested. We applied the NCI method, which has been shown to improve on existing methods (Tooze, 2006), to BVS II data. Methods A sample of 1050 Bavarian residents aged 13-80 years participated in a computer-assisted personal interview (CAPI) and completed three 24-h dietary recalls by telephone interview using EPIC-SOFT. Underreporters were excluded from the analysis. The usual intake was calculated for the main food groups and subgroups. We applied a weighted-means approach to the 24-h recall data, accounting for weekday and weekend-day consumption. These distributions were compared with those derived by the NCI method, which fits a two-part model for the probability of consumption and the amount consumed on consumption days. We included age, sex, BMI, smoking, physical activity and SES as adjustment variables. As expected, we found good agreement of mean intake values across most food groups and subgroups for the distributions derived with the two approaches. 
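The weighted-means step can be illustrated as follows. Note that the 5:2 weekday-to-weekend weighting is an assumption for illustration; the study's exact weights are not stated here.

```python
def usual_intake_weighted(weekday_recalls, weekend_recalls):
    """Usual daily intake as a weekday/weekend-weighted mean of 24-h recalls.

    Assumes each week has 5 weekdays and 2 weekend days (illustrative weighting).
    """
    weekday_mean = sum(weekday_recalls) / len(weekday_recalls)
    weekend_mean = sum(weekend_recalls) / len(weekend_recalls)
    return (5 * weekday_mean + 2 * weekend_mean) / 7
```

This yields a point estimate of mean usual intake per person; unlike the NCI two-part model, it does not separate within-person from between-person variation, which is why the two approaches agree on means but not on the dispersion of the intake distribution.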
A comparison of the 25th, 50th and 75th percentiles shows differing dispersions: e.g., for meat the interquartile range in men narrows from 120 to about 80 when using the NCI method. Mean intake can easily be derived from the weighted means approach. However, only the NCI method gives valid information on the distribution accounting for intra-individual variation, which is important when food consumption or nutrient intake above or below a given limit is evaluated. Sensitisations and allergies in Bavarian preschool children (SEAL): a project within the framework of the Health-Monitoring-Units (GME) Weber A 1 , Herr C 1 , Hendrowarsito L 1 , Buters J 2 , Pusch G 2 , Effner R 2 , Nennstiel-Ratzel U 3 , von Mutius E 4 , Kolb S 1 Background Both the total number of pollen and the allergenicity of pollen may increase due to climate change. Early contact with allergens in childhood can lead to sensitisations and possibly cause asthma and other atopic diseases. So far it has not been examined how the increase and the modification of pollen affect sensitisations in preschool children. Beyond that, the importance of standardized, repeated assessment of asthma and allergies, as well as of corresponding symptoms, including objective measurements, is obvious. Methods Supported by the Bavarian State Ministry of Health, health monitoring units (GME) were established in 2004 by the Bavarian Health and Food Safety Authority. The aim is to routinely collect standardized health data of children. In three of these cross-sectional studies (2004/2005, 2006/2007 and 2012/2013) questionnaires included items on doctor-diagnosed asthma, hay fever and atopic dermatitis, symptoms of these diseases (e.g. wheezing, itchy eyes, rash) as well as already performed allergy testing and thereby identified allergens. Based on these findings, a study is being conducted with the addition of objective measurement of sensitisations in Bavarian preschool children.
Sensitisations are analysed in capillary blood samples of preschool children using two methods for determining specific immunoglobulin E (IgE): ImmunoCAP ISAC Array and Euroline. Beyond this, the local pollen load in nearby cities is extrapolated. With these parameters an examination of the association between sensitisations in preschool children and local pollen load is possible. Data from the GME suggest that the parent-reported prevalence of doctor-diagnosed asthma (2.6 to 2.8 %), hay fever (4.7 to 4.0 %) and atopic dermatitis (12.4 to 11.1 %) either remained fairly stable or decreased non-significantly from 2004 to 2012. Doctor-diagnosed asthma and allergies, as well as symptoms, were more frequent in boys, except for atopic eczema. In 2012/2013, of the children with at least one parent-reported symptom of asthma and allergies, 44 % had previously undergone an allergy test. The most commonly reported allergens in medically diagnosed allergic children were house dust mites, pollen, animals and food. In the current SEAL study (Sensitisations and Allergies in Bavarian preschool children) these results will be compared to objectively measured sensitisation of the preschool children. As a first result, it appears that both children with and without sensitisations are participating in the SEAL study. Furthermore, with the data of the compulsory school entrance examination, data from the SEAL study can be weighted to check its representativeness for preschool children in the whole of Bavaria. The GME provides standardized and repeated cross-sectional data that allow monitoring of the health status of children and influencing factors, in terms of surveillance. Currently the GME structure is used to examine the associations between pollen counts and preschool children's allergic sensitisation.
Long-term auditory complications after childhood cancer: a report from the Swiss Childhood Cancer Survivor Study Weiss A 1 , Sommer G 1 , Kasteler R 1 , Scheinemann K 2 , Grotzer M 3 , Kompis M 4 , Kuehni C 1 1 Institute of Social and Preventive Medicine, University of Bern, Bern, Switzerland; 2 Division of Paediatric Haematology/Oncology, Children's Hospital Lucern, Luzern, Switzerland; 3 Department of Paediatric Oncology, University Children's Hospital Zurich, University of Zurich, Zurich, Switzerland; 4 Department of ENT, Head and Neck Surgery, University Hospital Bern, University of Bern, Bern, Switzerland Background Ototoxicity is a common adverse effect of childhood cancer treatment. It results in auditory complications such as hearing loss and tinnitus. Previous estimates of the prevalence and incidence of auditory complications vary greatly and are hard to compare between studies. It thus remains unclear how auditory complications vary between diagnostic childhood cancer groups. Platinum compounds with ototoxic properties and cochlea-sparing radiation techniques are increasingly used to treat childhood cancer. It is important to know how these developments in ototoxic cancer treatment affect the burden of auditory complications in childhood cancer survivors. Therefore, we aimed: (1) to investigate the prevalence of self-reported hearing loss and tinnitus in survivors of all diagnostic childhood cancer groups, compared to their siblings; (2) to assess the effects of cancer treatments on hearing loss; and (3) to describe the cumulative incidence of hearing loss over different time periods of cancer diagnosis. Within the Swiss Childhood Cancer Survivor Study, we sent a questionnaire to all Swiss ≥5-year survivors aged ≤16 years at cancer diagnosis, diagnosed between 1976 and 2005.
We compared the prevalence of self-reported hearing loss and tinnitus between survivors and siblings, analysed the effect of treatment-related factors using multivariable logistic regression, and estimated the cumulative incidence of hearing loss for different treatment groups (platinum compounds, cranial radiation, both) and for different time periods of cancer diagnosis (1976-1985, 1986-1995, 1996-2005). We included 2061 survivors and 864 siblings. Median (IQR) age at survey for survivors was 21 years (6-46) and median time since diagnosis was 15 years (5-36). The prevalence of self-reported hearing loss was higher in survivors of childhood cancer than in their siblings (10 vs. 3 %, p < 0.001). Survivors and siblings reported a similar prevalence of tinnitus (4 vs. 5 %; p = 0.574). Survivors of CNS tumors had the highest prevalence of hearing loss (25 %). Survivors exposed to platinum compounds (carboplatin, OR = 2.01; cisplatin, OR = 8.94; both combined, OR = 7.83; p < 0.001), cranial radiation doses over 29 Gray (30-49 Gray, OR = 1.67; ≥50 Gray, OR = 2.23; p = 0.011), brain surgery (OR = 2.28; p < 0.001) and BMT (OR = 2.09; p = 0.022) were at increased risk of hearing loss after cancer diagnosis. In those treated with platinum compounds only, hearing loss did not develop later than 7 years after cancer diagnosis, but in those with cranial radiation, hearing loss still occurred 17 years after cancer diagnosis. The incidence of hearing loss increased after the introduction of platinum compounds in the 1980s, but tended to decrease again in more recent years. Survivors treated with platinum compounds, cranial radiation and brain surgery have an increased risk of developing hearing loss. Despite the increased use of platinum compounds with ototoxic properties, the burden of hearing loss did not increase in recent decades. Survivors benefit from the implementation of new treatment regimens with less ototoxic radiation and carefully dosed platinum compounds.
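Prevalence comparisons of this kind are typically summarised as an odds ratio with a log-scale (Woolf) confidence interval. A minimal sketch follows; the 2x2 counts below are illustrative assumptions, not the study data:

```python
# Hedged sketch: odds ratio with a Woolf (log-scale) 95 % CI from a 2x2
# table, as used when comparing self-reported hearing loss between
# survivors and siblings. Counts are hypothetical, for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: exposed cases/non-cases; c/d: unexposed cases/non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: cases/non-cases among survivors and siblings
or_, lo, hi = odds_ratio_ci(206, 1855, 26, 838)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The multivariable ORs reported in the abstract would additionally adjust for treatment covariates within a logistic regression model.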
Implementing PEHR: design and integration of a consent creator service Weiss N 1 , Aguduri LS 1 , Yüksekogul N 1 , Schreiweis B 1 , Brandner A 1 , Bronsch T 1 , Pensold P 1 , Stein KE 1 , Bergh B 1 , Heinze O 1 1 University Hospital Heidelberg, Heidelberg, Germany Giving patients full control over their medical data electronically remains one of the most discussed topics in healthcare today. The INFOPAT project in the Rhine-Neckar region focuses on a personal cross-enterprise electronic health record (PEHR) in which the patient plays a major role. Thus, patients should be provided with the possibility of granting access to their medical data, which could be realized using a consent creator service. This paper presents a user interface concept for such a service as well as aspects of the technical implementation. In addition, a pattern for integrating the service into an existing IHE-based infrastructure is shown. These concepts could be further adapted for improving patient empowerment in health care projects. Towards developing a coherent architecture for IT benchmarks: lessons learned from the literature and evaluating IT components Weiß JP 1 , Hübner U 1 , Liebe JD 1 , Hüsers J 1 , Teuteberg F 2 1 Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany; 2 University Osnabrück, Osnabrück, Germany IT benchmarking has been used increasingly to measure the performance, costs and quality of health information systems. Existing benchmarks, however, have suffered from time-consuming data management and analyses due to the lack of a coherent IT architecture. The aim of this study, therefore, was (1) to identify existing concepts and solutions for IT tools to support benchmarking and (2) to check the suitability of these solutions for this task. To this end, a systematic literature review was performed, and the concepts and solutions identified in the literature were tested for their support of six core benchmark processes.
The systematic literature review resulted in 7 studies, which partly covered the entire benchmark process. The most elaborate approaches contained a data warehouse and some related tools. Based on these findings, an overall architecture for data management and analysis of benchmarks was developed. Open source components (LimeSurvey, Pentaho, OpenRefine, MySQL, R) were implemented in an initial prototype with a data warehouse as the information hub. Data could flow bidirectionally and semi-automatically. This approach demonstrated the feasibility of implementing the architecture and showed first benefits in terms of consistent, less error-prone data. Health status and use of health services of children with insecure residence status in Germany Wenner J 1 , Razum O 2 , Schenk L 3 , Ellert U 4 , Bozorgmehr K 5 1 Bielefeld University, Bielefeld, Germany; 2 Uni Bielefeld, Bielefeld, Germany; 3 Charité-Universitätsmedizin Berlin, Berlin, Germany; 4 Robert Koch-Institut, Berlin, Germany; 5 Universität Heidelberg, Heidelberg, Germany One third of all refugees who apply for asylum in Germany are children. Little is known about their health status [1]. National and international studies on refugee children's health suggest that pre- and peri-migration exposures related to war, flight, persecution and living conditions in the country of origin contribute to a poorer health status of minor refugees compared to other children living in the reception countries [2-5]. However, selection effects (healthy migrant effect) might balance these effects out [6]. In addition, there is growing evidence that post-migration living conditions and the often insecure legal status constitute an additional risk factor for the opportunity of minor refugees to grow up healthy and to access health care services [7-10].
Differentiation within the refugee population according to social and legal status is therefore necessary when assessing and comparing the health status of minor refugees living in Germany. We analyzed the health status of children and adolescents with insecure residence status compared to children with secure residence status (with and without migration background) in Germany. Our aim was to assess to what extent the legal status and the related living conditions of minors in Germany were associated with their health status and their use of health services. We used data from the German Health Interview and Examination Survey for Children and Adolescents (KiGGS). The KiGGS is the largest nation-wide survey on the health of children in Germany [11]. In multivariable logistic regression models the association between children's residence status and their subjective and mental health, as well as their utilization of emergency services and vaccination status, was analyzed while adjusting for children's social status and migration background. We also assessed whether the association between residence status and health outcomes is mediated by social status. Among 17,245 children, 197 (1.1 %) had an insecure residence status. Adjusting for age and sex, an insecure residence status is associated with poorer subjective health [OR = 3.12 (2.07-4.94)], mental health problems [OR = 1.83 (1.16-2.87)], an incomplete vaccination status [OR = 2.0 (1.33-3.0)] and the use of emergency health services [OR = 2.28 (1.2-4.36)]. After also adjusting for social and migration status, only the association with the use of emergency care remains significant [OR = 2.53 (1.18-5.43)]. The mediation analysis did not lead to any statistically significant findings. The association between residence status and the use of emergency health services indicates possible barriers to the use of regular primary care services, which require further research.
Though the importance of the legal status for health outcomes cannot be demonstrated here, the findings indicate health inequalities related to the children's migration background and their social status. The results probably underestimate the association between residence status and the outcomes because of misclassification of children with insecure residence status. For more focused analyses of refugee children's health and its determinants, representative samples and a more specific assessment of their residence status and living conditions are necessary. Predictors for the prescription of aclidinium bromide as compared to tiotropium bromide using German claims data Wentzell N 1 , Kollhorst B 1 , Schink T 1 Background Aclidinium bromide (aclidinium) and tiotropium bromide (tiotropium) are long-acting anticholinergics for the long-term treatment of chronic obstructive pulmonary disease (COPD). In the early benefit assessment, no proof of an added benefit was found by German authorities, but dispensations of aclidinium have continuously increased since its launch. The objective of this study was to determine potential predictors for the initial prescription of aclidinium compared to tiotropium. Methods Retrospective cohort study based on claims data from one statutory health insurance provider included in the German Pharmacoepidemiological Research Database. The study population comprised patients with a first dispensation of aclidinium or tiotropium between October 1st and December 31st, 2012, after an aclidinium/tiotropium-naïve period of one year. Age, sex, COPD-related variables, Charlson Comorbidity Index (CCI), time spent in hospital, history of renal disease (a potential contra-indication) and characteristics of the prescribing physician were considered as potential predictors, and their impact was assessed with multivariable logistic regression.
During the study period, 544 insurants received an index prescription of aclidinium (61 % women, median age 69 years) and 4,379 insurants started tiotropium therapy (58 % women, median age 71 years). Patients living in rural areas were more likely to receive aclidinium (OR: 1.43; CI 1.1-1.87). A pulmonologist as the prescribing physician (1.91; 1.55-2.35), participation in the Disease Management Program for COPD (1.26; 1.01-1.57), chronic use of corticosteroids (1.75; 1.25-2.44) and a history of renal disease (1.46; 1.03-2.07) increased the likelihood of receiving aclidinium. In contrast, COPD hospitalizations (0.5; 0.32-0.8), hospitalization time (0.95; 0.92-0.97) and CCI (0.89; 0.84-0.95) increased the likelihood of receiving tiotropium. Both clinical factors and aspects of care were identified as predictors for an aclidinium prescription. The strongest predictor was a pulmonologist as the prescribing physician. General comorbidity increased the probability of receiving tiotropium. Background Allergic respiratory diseases represent a global health problem leading to reduced quality of life for patients. The two major treatment strategies comprise symptom treatment and specific immunotherapy, which is deemed the only treatment able to alter the course of disease. This study aims to analyze the course of disease of respiratory allergy by treatment strategy and disease group. The analysis is based on routine data of a German statutory health insurance. A cohort is established based on data from 2007 and observed until 2012. According to their confirmed outpatient diagnoses, patients are assigned to one of the disease groups rhinitis, asthma or both diseases for each year. Prescription-requiring medication is analyzed in the same manner for patients receiving no medication, medication for rhinitis, medication for asthma or medication for both diseases.
To compare patients under symptom treatment alone with patients under additional immunotherapy, a matched-pair approach is applied. The study population comprises 165,446 respiratory allergy patients, of which 70 % are assigned to rhinitis, while 16 % have asthma and 14 % report both diseases. Only in about 12 % of rhinitis patients and 28 % of asthma patients does a second allergic respiratory diagnosis occur during the observation period. About 50 % of patients with both diseases have a remission of one diagnosis. Under immunotherapy these patients are more likely to lose their asthma diagnosis compared to symptom treatment (subcutaneous immunotherapy (SCIT) OR 1.36, p = 0.089; sublingual immunotherapy (SLIT) OR 3.57, p = 0.012). Analysis of medication reveals less asthma medication and less need for continuous medication under immunotherapy in rhinitis patients (SCIT OR 0.75, p = 0.086) and patients with both diseases (SCIT OR 0.65, p = 0.006; SLIT OR 0.39, p = 0.011). Due to the small number of observations, asthma patients are not included in the comparison of treatment strategies. Results of the detailed analysis of diagnoses reflect the variability characteristic of the course of allergic diseases. Although limited by the accuracy of documentation and the lack of clinical information, the comparison of treatment strategies shows some advantages of immunotherapy. The Precise Observational System for the Safe Use of Medicines (POSSUM): an approach for studying medication administration errors in the field Westbrook J 1 , Raban M 1 , Lehnbom E 2 , Li L 1 1 Australian Institute of Health Innovation, Macquarie University, Sydney, Australia; 2 Faculty of Pharmacy, University of Sydney, Sydney, Australia Medication administration errors (MAEs) in hospital are frequent and significantly more likely to result in serious harm to patients than other medication error types. Many interventions have been proposed in order to reduce MAEs and the amount of harm associated with these errors.
A major limitation in assessing the effectiveness of these interventions has been the lack of robust measures for assessing changes in MAEs and associated harms. Drawing upon extensive foundational research, we have developed a robust approach and data collection software to be applied in direct observational studies of nurses to allow measurement of changes in MAE rates. We report how this approach is being applied in a large stepped-wedge cluster randomised controlled trial to assess the effectiveness of an electronic medication management system to reduce MAEs in a paediatric hospital. Practical challenges in planning and conducting a stepped wedge cluster randomised controlled trial to assess the effects of health information technology in hospitals Westbrook J 1 , Raban M 1 , Baysari M 1 , Li L 1 , Kim T 1 , Prgomet M 1 , Mumford V 2 1 Australian Institute of Health Innovation, Macquarie University, Sydney, Australia; 2 Australian Institute of Health Innovation, Sydney, Australia Background Health information technologies (HIT) are complex interventions that may result in both expected and unexpected consequences [1-4]. The need to apply robust and sophisticated evaluation models to assess HIT interventions is recognised. Systematic reviews demonstrate an excessive reliance upon uncontrolled before-and-after studies and qualitative studies [5, 6]. The opportunity to conduct randomised controlled trials of HIT interventions is extremely limited. Stepped-wedge cluster randomised controlled trials (SWCRCT) present a relatively new design with possibilities for application in HIT evaluation [7]. However, there are significant practical challenges. Our aim was to design and execute a SWCRCT to assess the effectiveness of an electronic medication management (eMM) system at two paediatric hospitals. This paper describes the study and some of the practical challenges encountered in this process.
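In a stepped-wedge design, clusters (here, wards) cross from control to intervention one step apart in a randomised order. A minimal sketch of such a rollout schedule follows; the function and the seeded randomisation are illustrative assumptions, not the trial's actual software:

```python
# Hedged sketch of a stepped-wedge rollout schedule: each cluster (ward)
# crosses from control (0) to intervention (1) one step apart, in a
# randomised order. Illustrative only.
import random

def stepped_wedge_schedule(n_clusters, seed=None):
    """Return {cluster: row of 0/1 over n_clusters + 1 measurement periods}."""
    order = list(range(n_clusters))
    random.Random(seed).shuffle(order)  # randomise the crossover order
    schedule = {}
    for step, cluster in enumerate(order):
        # This cluster switches on after period `step`; period 0 is all-control.
        schedule[cluster] = [0] * (step + 1) + [1] * (n_clusters - step)
    return schedule

sched = stepped_wedge_schedule(8, seed=42)
for ward in sorted(sched):
    print(ward, sched[ward])
```

Every ward contributes both control and intervention periods, which is what allows the mixed models described in the Methods to separate the intervention effect from secular trends.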
Methods A SWCRCT will be used to measure prescribing and medication administration error (MAE) rates, potential and actual adverse drug events, and length of stay pre- and post-eMM system implementation. In stage 1, 8 wards within the first paediatric hospital have been randomised to receive the eMM system one week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed using a SWCRCT. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses followed by record reviews. Clinicians will abstract data from medical records using a structured form, assisted by harm identification guides. In total, 1,232 patient admissions will be reviewed for prescribing errors and over 5,000 medication administrations will be observed. Outcomes will be assessed at the patient level using mixed models, taking into account the correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews with clinicians will investigate the effects of the system on workflow. Data from hospital 1 will guide eMM improvements and the enhanced eMM will then be implemented at hospital 2. The SWCRCT is underway. Specific practical challenges include the identification of MAEs. As there are few tested data collection tools, the team was required to develop an observational approach and specialised data collection software for this study, the Precise Observational System for the Safe Use of Medicines (POSSUM). The significant efforts required to train observers, determine data definitions, and assess inter-rater reliability will be discussed. Preliminary field work confirmed that clinicians' work practices often deviate significantly from processes as described within hospital policy and procedure manuals.
Thus data collection in the field must adapt to this challenge in order to reflect the 'real world' experience of providers, while maintaining high levels of data quality and meaningful indicators. Our study hospitals have provided an opportunity rarely afforded to research teams: the ability to randomise the order in which the hospitals implement an HIT intervention. Many organisational, research and practical issues had to be addressed to maintain the integrity of such a large study. Our experience provides many practical lessons for researchers seeking to embark on the use of the SWCRCT design to assess health system interventions. Linked data for quality monitoring and commissioning for health and care service planning White P 1 , George A 1 , Bourne T 1 1 Kent County Council, Maidstone, United Kingdom Background Improving access to and linkage between routine administrative datasets in the National Health Service (NHS) and the rest of the public sector for research and statistical purposes would have demonstrable effects on economic growth and help us respond more effectively to challenges related to the health and wellbeing of the population. Making better use of these under-utilised resources can provide efficiency gains through the use of whole-population person-level linked datasets that can potentially speed up the production of policy-relevant research supporting current national policy imperatives such as service integration and integrated care. The Joint Strategic Needs Assessment (JSNA) is a statutory requirement across all Health and Well Being Boards, providing a helicopter view of current population health and wellbeing and giving recommendations for improvement. While the JSNA core datasets produce key headline indicators of population health and wellbeing, they are derived separately from person-level data which is anonymised at source and not capable of being 'joined up'.
In addition, the JSNA development process faces significant challenges in answering more complex questions from commissioners, particularly around service redesign against a backdrop of fiscal containment, assessing care continuity across the patient journey, and accurately predicting future health and care service use to meet population need. These types of questions cannot be answered by the siloed datasets which the JSNA currently uses. Aims/objectives/learning outcomes
- Knowledge and understanding of the different public sector data sources and datasets used in health and social care commissioning in England
- Knowledge and understanding of data quality and completeness (SWOT analysis)
- Knowledge and understanding of the IT infrastructure necessary for the collection and collation of these routine administrative datasets, e.g. IT rules for setting up data warehouses
- Knowledge and understanding of information governance: how the EU data protection act is applied at national level in terms of health legislation and policy for data sharing; knowledge of the national IG Toolkit (SWOT analysis)
- Knowledge and understanding of how routine administrative datasets are used for public health quality monitoring and commissioning purposes, beyond the JSNA core dataset
- Emerging importance of the development and design of whole-population person-level linked datasets for commissioning purposes
- Examples of analyses done via the local JSNA, in-depth needs assessments, service reviews etc.
- Examples of advanced software applications such as system modelling to predict service demand and to increase efficiency in healthcare quality monitoring
Platform to support study planning and requesting patient healthcare information and biomaterial for research projects Wieland M 1 , Bougatf N 2 , Döllinger C 3 , Schreiweis B 4 , Hund H 5 , Bergh B 4 , Debus J 2 , Fegeler C 6 , Katus H 7 , Kirsten R 3 1 German Cancer Consortium (DKTK), German Cancer Research Center (DKFZ), Heidelberg, Germany; 2 National Center for Radiation Oncology, Heidelberg, Germany; 3 BioMaterialBank Heidelberg, Heidelberg, Germany; 4 Center for Information Technology and Medical Engineering, University Hospital Heidelberg, Heidelberg, Germany; 5 GECKO Research Institute, Heilbronn University, Heilbronn, Germany; 6 Heilbronn University, Heilbronn, Germany; 7 Department of Cardiology, University Hospital Heidelberg, Heidelberg, Germany Recent progress in biomarker research and personalized medicine has rapidly increased the demand of research institutions for healthcare information and human biomaterial. This research is often performed within multi-site projects or large consortia, like the German Centers for Health Research (DZG), adding a new challenge for the IT departments of all the involved partners. We combined use cases, existing laws and guidelines as well as an expert panel to identify the specific requirements of the research community and the involved data/biomaterial providers in order to design a decentralized research platform. The concept includes support for interdisciplinary and multi-centric research including study planning, search, request and reception of data and biomaterial. It also considers an independent data trustee and the complex ethical, legal and social issues related to clinical research. The required user interfaces, modeled processes and communication elements will be developed based on well-established standards of the IHE consortium. If necessary, new profiles will be developed within the project.
This will be the first system to fully support the entire biomedical research workflow: from the idea to the publication. Prevalence of varicella zoster virus IgG antibodies and determinants of seropositivity in children and adolescents in the pre-vaccine era, Germany Since 2004, routine varicella vaccination of children (11-14 months) has been recommended in Germany. Evaluation of the vaccination strategy is supported by sero-epidemiological data before and after implementation of routine vaccination. Objectives: (1) determination of the age-specific prevalence of anti-VZV IgG antibodies in the pre-vaccine era; (2) identification of factors associated with seropositivity. Method Serum samples of 13,433 varicella-unvaccinated children aged 1-17 years from the population-based German Health Interview and Examination Survey for Children and Adolescents (KiGGS; conducted 2003-2006) were tested for anti-VZV IgG by EUROIMMUN ELISA. Equivocal results were retested by the fluorescent antibody to membrane antigen test (FAMA). Statistical analyses used a weighting factor adjusting the study population to the German population. Seroprevalence is reported as percentages (%) with a 95 % confidence interval (CI). Odds ratios (OR) were computed by multivariate logistic regression to determine the association between socio-demographic factors and seropositivity. The overall seropositivity rate was 80.6 % (95 % CI 79.4-81.2). Seropositivity rates differed significantly between age groups up to the age of six years, but not by gender. About 95 % of adolescents had anti-VZV IgG antibodies. Multivariate analyses showed that, besides age, older siblings and an early start of day care were the main factors increasing the seropositivity rate in preschoolers; migration background reduced the chance of seropositivity in school children (OR: 0.65; 0.43-0.99) and adolescents (OR 0.62; 0.4-0.97).
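The design-weighted seroprevalence described above (a weighting factor adjusting the sample to the German population) can be sketched as follows; the weights, outcomes and the simplified variance formula are illustrative assumptions, not the KiGGS analysis:

```python
# Hedged sketch: design-weighted prevalence with an approximate 95 % CI,
# analogous to adjusting a survey sample to the reference population with
# weighting factors. Data are illustrative, not KiGGS.
import math

def weighted_prevalence(outcomes, weights, z=1.96):
    """outcomes: 0/1 seropositivity per child; weights: survey weights."""
    total_w = sum(weights)
    p = sum(o * w for o, w in zip(outcomes, weights)) / total_w
    # Simple binomial approximation, ignoring the design effect.
    n = len(outcomes)
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

p, lo, hi = weighted_prevalence([1, 1, 0, 1], [0.8, 1.2, 1.0, 1.0])
print(f"{100 * p:.1f} % (95 % CI {100 * lo:.1f}-{100 * hi:.1f})")
```

A full survey analysis would additionally account for the sampling design in the variance (design effect), which the simple formula here deliberately omits.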
Pre-vaccine data are the baseline for monitoring the impact of the routine varicella vaccination strategy in Germany once seroprevalence data from the post-vaccine era become available. During the pre-vaccine era most children contracted VZV infection by the age of six. Identified risk groups for reduced seropositivity, such as school children with a migration background, should be targeted for catch-up vaccination. Opportunities and challenges of consumer-centric eHealth services: an interdisciplinary workshop Wiesner M 1 , Griebel L 2 , Hagglund M 3 , Heinze O 4 , Pobiruchin M 5 , Pohl AL 6 , Scandurra I 7 , Schreiweis B 4 1 Heilbronn University, Dept. of Medical Informatics, Heilbronn, Germany; 2 Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany; 3 Karolinska Institutet, Stockholm, Sweden; 4 Center for Information Technology and Medical Engineering, University Hospital Heidelberg, Heidelberg, Germany; 5 Heilbronn University, GECKO Institute for Medicine, Informatics & Economics, Heilbronn, Germany; 6 Flensburg University of Applied Sciences, Institute for eHealth and Management in HealthCare, Flensburg, Germany; 7 Örebro University, Örebro, Sweden The European Union (EU) and its national governments are making huge investments to increase the dissemination of eHealth services. Several countries in the EU have implemented eHealth services at different levels. However, the current eHealth/mHealth situation in Europe is heterogeneous and fragmented, especially when citizens work or travel in different European countries. In Germany, after more than ten years, neither the healthcare professional card nor the telematics infrastructure is yet available at a broad scale to be used for national eHealth services. In Sweden, a national eHealth infrastructure has been implemented which at least enables cross-provider access to health data at the national level.
At the same time, the consumer-oriented second health market is growing rapidly, mainly driven by wellness, fitness and sports apps for citizens' personal needs. This further fragments the already scattered eHealth landscape, impeding the integration of eHealth services for citizens across healthcare institutions and country borders. The main focus of this workshop will be to interactively explore challenges for citizens in accessing their patient information from different stakeholders' perspectives. Participants can expect profound knowledge, exchange of experiences and insights into different eHealth systems in Europe. Speakers' contributions and the input provided by workshop participants will be captured by the organizers during the group work and presented as a graphical illustration during the open discussion phase.

Influence of human factor issues on patient centered mHealth apps impact; where do we stand? Wildenbos GA 1 , Peute L 2 , Jaspers M 3 1 Academic Medical Center, University of Amsterdam, Amsterdam, Netherlands; 2 Academic Medical Center, Universiteit van Amsterdam, Amsterdam, Netherlands; 3 Academic Medical Center-University of Amsterdam, Amsterdam, Netherlands Interest in mHealth applications (apps) has grown in recent years, and their potential to help patients engage more actively in the management of their care has resulted in a large number of apps. This paper discusses the preliminary results of a literature review of studies published in 2014-2015 concerning the impact of patient-centered mHealth apps. Abstracts were included when they described an mHealth app targeted at patients and reported on the effects of this app on patient care. From a total of 559 potentially relevant articles, 17 papers were finally included. Nine studies reported a positive impact of the patient-centered mHealth app on patient care; 4 of these studies were randomized controlled trials.
Measured impacts in the 17 studies focused on improving patients' physical activity, self-efficacy and medication adherence. Human factors issues potentially mediating these effects were discussed in all studies. Transitions in the interaction between healthcare providers and their patients were most often discussed as influencing the impact of the mHealth app. More research is needed, focusing on human factors issues mediating the effect of patient-centered mHealth apps, to consolidate knowledge of the effectiveness of mHealth. This research should preferably be guided by socio-technical models.

Making medication data meaningful: illustrated with hypertension Williams R 1 , Brown B 1 , Peek N 2 , Buchan I 1 1 University of Manchester, Manchester, United Kingdom; 2 The University of Manchester, Manchester, United Kingdom We demonstrate, with application to hypertension management, an algorithm for reconstructing therapeutic decisions from electronic primary care medication prescribing records. These decisions concern the initiation, termination and alteration of therapy, and have further utility in: monitoring patient adherence to medication; care pathway analysis including process mining; advanced phenotype construction; audit and feedback; and measuring care quality. By accounting for SMK, we identified a total of 189 non-overlapping independent genetic loci for association with the three traits. Compared to previous studies, 23 of the 189 loci were found for the first time in the current study: six for BMI, 11 for WC and six for WHR. Our methodological investigations of approaches to identify GxSMK interaction effects demonstrated the importance of testing for GxSMK interaction with and without a priori filtering for the SMK-adjusted effects.
By applying the recommended approaches, we identified six genetic loci with significant evidence of different genetic effects across smokers and non-smokers: two for BMI (near CHRNA5 and INPP4B, both with effects in smokers only and no effect in non-smokers), two for WC (near GRIN2A and PRNP, both with opposite effects between smokers and non-smokers) and two for WHR (RSPO3 and LYPLAL1, both with effects in non-smokers only). The majority of these loci harbor strong candidate genes with evidence for influence on obesity or a potential role in the modulation of effects through tobacco use. In summary, we identify several novel obesity loci by accounting for SMK and highlight genetic loci with evidence of interaction with SMK. Our results underscore the importance of appropriately modelling genetic associations with known biological relationships between phenotypes and environmental exposures.

Development of tobacco control policy and its impact on smoking prevalence worldwide Winkler V 1 , Anderson C 2 , Becher H 3 1 Ruprecht-Karls-Universität Heidelberg, Germany; 2 Charité-Universitätsmedizin Berlin, Germany; 3 University Medical Center Hamburg-Eppendorf, Germany To describe worldwide levels and development of tobacco control policy by country development status from 2007 to 2014 and to analyze the corresponding relation to recent changes in smoking prevalence. Policy measure data representing the years 2007 to 2014 were collected from all available WHO reports on the global tobacco epidemic. Corresponding policy percentage scores (PS) were calculated based on MPOWER measure indicators. Age-standardized smoking prevalence data for the years 2010 and 2015 were collected from the WHO Global Health Observatory Data Repository. Development of PS was analysed with respect to WHO region and OECD country development status.
Scatter plots and regression analysis were used to depict the relationship between tobacco control policy in 2010 and change in smoking prevalence by sex and development status.

Background Asymmetric (ADMA) and symmetric dimethylarginine (SDMA) inhibit the cellular uptake of nitric oxide, which has various biological functions that make it an important cardiovascular health-promoting agent. ADMA and SDMA have shown adverse cardiovascular properties and are elevated in patients with cardiovascular disease, including heart failure (HF). We studied the association between ADMA and SDMA and the risk of HF. Analyses were conducted within a case-cohort study of the European Prospective Investigation into Cancer and Nutrition (n = 27,548), comprising a random subcohort (n = 2,224, including 19 HF cases) and all remaining HF cases (n = 176) that occurred within 8.3 years of follow-up. Serum concentrations of dimethylarginines were measured using liquid chromatography-tandem mass spectrometry. Hazard ratios (HRs) and 95 % confidence intervals (CI) were estimated across quartiles and per doubling of ADMA and SDMA concentrations using Cox proportional hazards regression. In a multivariable adjusted model, each doubling of ADMA was associated with a 60 % higher HF risk (HR [95 % CI] 1.60 [1.10-2.31]). Between SDMA and HF risk a U-shaped association was observed (HR [95 % CI] for the second, third, and fourth quartile compared to the first: 0.52 [0.33-0.82], 0.63 [0.40-0.99], and 0.71 [0.46-1.10], p for nonlinearity .01). We provide substantiated evidence for a relationship between ADMA and cardiovascular endpoints. In addition to the established relation between ADMA and myocardial infarction, our findings indicate a positive association between ADMA and HF incidence in persons without apparent myocardial infarction. Targeting ADMA metabolism might open up new therapeutic perspectives for HF prevention and treatment.
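The ADMA abstract reports hazard ratios "per doubling" of concentration. A common way to obtain such estimates, not necessarily the one used in the study, is to enter the log2-transformed concentration into the Cox model, so that exp(β) is directly the HR per doubling; a coefficient on the natural-log scale can be rescaled by ln 2. The coefficient value below is hypothetical, chosen only so that it reproduces an HR of the reported magnitude:

```python
import math

def hr_per_doubling_from_log2_coef(beta_log2):
    # If the Cox model uses log2(concentration) as the covariate,
    # exp(beta) is already the hazard ratio per doubling.
    return math.exp(beta_log2)

def hr_per_doubling_from_ln_coef(beta_ln):
    # If the model instead uses ln(concentration), a doubling changes
    # the covariate by ln(2), so the coefficient is rescaled accordingly.
    return math.exp(beta_ln * math.log(2))

# Illustration: a hypothetical log2-scale coefficient of 0.47
# corresponds to an HR of about 1.60 per doubling.
hr = hr_per_doubling_from_log2_coef(0.47)
```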
Further investigations are needed to shed more light on mechanisms involved in the pathogenesis of HF related to elevated ADMA concentrations.

Dietary patterns and the risk of gallstone disease in the Health Professionals Follow-Up Study Janine Wirth, Andrew T. Chan, Amit Joshi, Cornelia Weikert, Michael Leitzmann, Edward Giovannucci, Kana Wu Background Gallstone disease has been linked to mortality and increased risks of major chronic diseases, such as diabetes, cancers and cardiovascular diseases. We plan to examine the association between adherence to three a priori defined dietary patterns, namely the alternate Mediterranean diet (aMed), the Alternative Healthy Eating Index (AHEI), and the Dietary Approaches to Stop Hypertension (DASH) diet, and the risk of gallstone disease. The research questions will be examined using data from the Health Professionals Follow-Up Study (HPFS), a cohort of 51,529 male U.S. health care professionals aged 40 to 75 years at the time of recruitment in 1986. Diet was assessed at baseline (1986) and every four years during follow-up. Hazard ratios (HR) and 95 % confidence intervals (CI) of gallstones (as a combined endpoint of either medically confirmed gallstones or cholecystectomy during 1988-2012) according to quintiles of adherence to the aMed, the AHEI and the DASH diet will be calculated for simple updated (most recent = short-term effect) and cumulative average measurements (long-term effect) using Cox proportional hazards regression. We expect to observe a lower risk of gallstones in persons with higher adherence to each of the three dietary patterns. Which of the patterns is linked to the most beneficial outcome remains to be investigated and will open up new perspectives for investigating healthy lifestyle factors for the prevention of gallstone disease.
Adherence to all three dietary patterns, the aMed, the AHEI and the DASH diet, showed similar associations with gallstone risk in a crude model stratified by age and adjusted for total energy intake. The fifth versus the first quintile of adherence was associated with an approximately 30-40 % lower risk of gallstone disease. Analyses of short-term (single update) and long-term adherence (cumulative average over time) showed only slight differences in the risk estimates.

A comparison of distributed computing technologies for medical image processing in digital pathology Witt M 1 , Strutz M 1 , Lindequist B 1 , Jansen C 1 , Hufnagl P 1,2 , Heßling H 1 , Krefting D 1 1 Hochschule für Technik und Wirtschaft, Berlin, Germany; 2 Charité-Universitätsmedizin, Institut für Pathologie, Berlin, Germany Modern scanners can produce high-resolution whole slide images (WSI) [1] of human tissue. Digital image processing can be applied to WSIs of pathological samples, and the results can assist diagnosis. The protein Ki-67 [2] has been established as a biomarker for the measurement of cell proliferation in breast cancer. A high value indicates high tumor growth and impacts the patient's treatment. Several automated image analysis methods to identify Ki-67-positive and -negative tumor cells are available. Processing a WSI can take several hours or days depending on image size and algorithm complexity. A common approach to reducing computational time is a distributed architecture, in which processes can be executed in a highly parallelized manner. Pathological slides can typically be split into tiles which can be processed independently. Different technologies with different levels of abstraction are available, ranging from Big Data frameworks to Cloud infrastructures. In this paper we compare these technologies using Ki-67 image analysis as an example. The investigated test case consists of a stained WSI (H-DAB) that was separated into 3629 equally sized tiles.
The algorithm [3] rates the given tiles and produces validated Ki-67 scores [4]. It is executed on a single PC (1), on a cluster with Hadoop Streaming (2) [5], and on an OpenStack Magnum [6, 7] Docker Swarm (3) [8]. For the distributed systems, the data is stored on an HDFS backend. Besides comparing processing times, the distribution and result-aggregation frameworks of the presented systems are investigated, as they play a vital role in medical image processing. On a single PC (1), the processing time is about 28 h, which is significantly reduced by the distributed systems (cf. Table 1). Due to the use of the MapReduce paradigm, problem distribution and result collection are inherently handled by (2). Because OpenStack and Docker Swarm are Infrastructure-as-a-Service frameworks, all software-related orchestration has to be done separately. This requires more administration effort, but allows full control of the data flow in the execution chain. In Hadoop, tracking and tracing of the data are challenging and require in-depth system knowledge. The IaaS approach outperforms Hadoop in processing time, even on an older and smaller cluster, presumably due to the high orchestration overhead of Hadoop. As the image processing algorithm was not specifically designed for Hadoop, certain features might not be supported optimally, so better adaptation might improve processing time. However, the tight coupling of the Hadoop core components makes it hard to adapt the system to requirements such as data protection measures. Due to the multi-tenancy approach of OpenStack, separation of data, processing and network are core capabilities, while Docker Swarm allows for lightweight cluster distribution. We therefore consider container virtualization within an IaaS environment a promising basis for efficient medical image analysis. Commercial airline crew are exposed to ionizing radiation of cosmic origin.
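The tiling approach in the Ki-67 comparison above is a map-reduce pattern: tiles are scored independently (map) and per-tile scores are combined into one slide-level result (reduce). A toy sketch of that structure, not the published algorithm: `score_tile` and `aggregate` are hypothetical stand-ins, and a thread pool stands in for the cluster-level parallelism of Hadoop Streaming or Docker Swarm.

```python
from multiprocessing.pool import ThreadPool

def score_tile(tile):
    """Toy stand-in for rating one tile: count 'positive' pixel values.
    The real Ki-67 rating algorithm is far more involved."""
    return sum(1 for px in tile if px > 0.5)

def aggregate(counts, tile_size):
    """Reduce step: combine per-tile counts into one slide-level fraction."""
    return sum(counts) / (len(counts) * tile_size)

# A WSI split into independent tiles (3629 in the abstract; 3 toy tiles here).
tiles = [[0.1, 0.9, 0.6], [0.2, 0.8, 0.4], [0.7, 0.7, 0.1]]
with ThreadPool(processes=2) as pool:
    counts = pool.map(score_tile, tiles)   # map step, parallelizable
slide_score = aggregate(counts, tile_size=3)
```

The same split is what makes the workload portable across the compared backends: Hadoop Streaming supplies the map/reduce plumbing for free, while on an IaaS stack the `pool.map`/`aggregate` orchestration must be provided by the application.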
With excess effective doses of about 2-6 millisievert (mSv) annually, and a cumulative occupational lifetime dose of up to 100 mSv, airline crew are one of the occupational groups with the highest radiation exposures. Airline crew's exposure to ionizing radiation, as well as to other risk factors, has raised concerns about potential health effects. Currently, the third follow-up of a large cohort of German aircrew employed by two German airlines is under way, extending the observation time until 2014 and adding 10 years to the last follow-up. For 6,061 cockpit crew members, a cumulative yearly effective dose had already been estimated for the years 1960-2004. In addition, for both cockpit and cabin crew, the federal radiation registry (SSR) has recorded monthly calculated doses since August 2003. For future dose-related analyses including the 20,757 cabin crew members, their cumulative yearly dose from 1960-2004 first needs to be estimated. We here report results from building a statistical prediction model for cabin-crew dose from 1960-2004, including estimates of the out-of-sample prediction error. To estimate radiation exposure for cockpit personnel from 1960 to 2004, their individual yearly flight hours per aircraft type were combined with a job-exposure matrix that listed effective radiation dose rates per flight hour, aircraft type and calendar year. From the SSR, we obtained individual monthly doses from August 2003 until July 2014 for 9,614 cabin crew members (1,736 male, 7,878 female) and 3,421 cockpit personnel (3,374 male, 47 female). Based on these data, we built a statistical model that can be applied to predict cabin-crew dose from 1960-2004. We first aggregated the dose information provided by the SSR into average cumulative yearly dose stratified by age (in 5-year groups), role (cabin vs. cockpit), and gender.
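The cockpit dose reconstruction described above multiplies yearly flight hours per aircraft type by job-exposure-matrix dose rates and sums the products. A minimal sketch of that calculation; all aircraft types, hours and dose rates below are hypothetical illustrations, not published rates:

```python
def annual_dose(flight_hours, dose_rates):
    """Combine yearly flight hours per aircraft type with a job-exposure
    matrix of effective dose rates (microsievert per flight hour) into
    a cumulative yearly effective dose for one crew member."""
    return sum(hours * dose_rates[aircraft]
               for aircraft, hours in flight_hours.items())

# Hypothetical year for one crew member (rates are illustrative only):
hours_1999 = {"A320": 400, "B747": 250}       # flight hours per type
rates_1999 = {"A320": 3.0, "B747": 5.5}       # microsievert per flight hour
dose_1999 = annual_dose(hours_1999, rates_1999)
```

In the actual matrix the rates also vary by calendar year, reflecting the solar cycle and route mix; the sum above would then run over aircraft-type/year cells rather than types alone.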
To predict yearly cumulative cabin-crew dose for the years covered by the SSR, we considered average cumulative male cockpit-crew dose, average heliocentric potential, role (cabin vs. cockpit), age (in 5-year groups), and gender. We included these predictors as well as relevant pairwise interaction terms in a linear regression model. By comparing predicted aggregate cumulative yearly cabin-crew dose to the corresponding actual dose as calculated by the SSR, we estimated the out-of-sample prediction error using cross-validation. Mean monthly dose from 2003-2014 was 213 µSv (range 0-860 µSv); mean cumulative yearly dose was 2 mSv (range 0-6 mSv). Several systematic patterns affected the annual doses (11-year solar cycle) or the monthly doses (seasonal peaks closely correlated with available seat miles). Average doses followed a characteristic, gender-specific career pattern for cabin and cockpit crew. The estimated out-of-sample prediction error (RMSE) for aggregate cumulative yearly cabin-crew dose was 43 µSv, i.e. about 20 % of the average cumulative yearly dose. This new approach to retrospective dose reconstruction among aircrew uses several information sources and benefits from the fact that estimated effective doses of cockpit and cabin crew are correlated, given their job patterns. We aim for further validation by comparing our approach and results to international aircrew studies.

Measured and predicted radiation dose distribution in functional heart regions from 3D conformal radiotherapy for breast cancer Cardiac late effects are a major health problem for long-term survivors of radiotherapy for breast cancer. Cohort studies to better understand the exact dose-response relationship require individual estimates of the radiation dose to the heart.
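The out-of-sample RMSE quoted above is obtained by cross-validation: repeatedly fit the model on part of the data and score predictions on the held-out part. A minimal sketch under stated assumptions: synthetic data, a single predictor (cockpit dose as a proxy for cabin dose), and plain univariate least squares standing in for the full interaction model.

```python
import math
import random

def fit_ols(xs, ys):
    """Univariate least squares: y ~ a + b * x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def cv_rmse(xs, ys, k=5, seed=0):
    """Out-of-sample RMSE by k-fold cross-validation (synthetic sketch)."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    sq_errs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        a, b = fit_ols([xs[i] for i in train], [ys[i] for i in train])
        sq_errs += [(ys[i] - (a + b * xs[i])) ** 2 for i in fold]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

# Synthetic example: cabin dose roughly proportional to cockpit dose (mSv).
cockpit = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5]
cabin   = [0.9, 1.4, 1.7, 2.3, 2.6, 3.2, 3.5, 4.1, 4.4, 5.0]
rmse = cv_rmse(cockpit, cabin)
```

Because every observation is predicted only by models that never saw it, this RMSE estimates how the model would perform on the crew members whose historical doses are unknown, which is exactly the quantity the abstract reports.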
In the PASSOS retrospective epidemiological cohort study of 12,264 women treated for breast cancer, we examined a sample of 774 patients treated with megavoltage tangential field radiotherapy in 1998-2008 and individually determined their heart dose. We obtained electronic treatment planning records for the 774 patients in the dosimetry sample. All dose distributions were recalculated using Eclipse with the anisotropic analytical algorithm (AAA) for photon fields and the electron Monte Carlo algorithm for electron boost fields. Based on individual dose-volume histograms for the complete heart and for several functional sub-structures, we estimated several dose measures and compared multiple patient groups. Statistical models were fitted to predict absorbed dose using only covariate information from patients' clinical records on tumor location, patient anatomy and radiotherapy prescription. We used these models to predict individual heart dose in the remaining cohort while assessing the out-of-sample prediction error using cross-validation. Mean heart dose spanned a range of 0.9-19.1 Gy for left-sided radiotherapy and 0.3-11.6 Gy for right-sided radiotherapy. The average (median) mean heart dose was 4.6 Gy (3.7 Gy) for left-sided and 1.7 Gy (1.4 Gy) for right-sided radiotherapy. Younger age, higher body mass index, tumor location in a medial quadrant, and presence of a parasternal field were also associated with higher heart dose. The out-of-sample prediction error for mean heart dose was 54 % (coefficient of variation). Prediction error in functional sub-structures ranged from 49-68 % for mean dose and from 52-86 % for extreme exposure. Tumor location and treatment choices influence cardiac dose with complex interactions. There is considerable variability in heart dose, with dose metrics of different cardiac sub-structures showing different patterns in their dependency on external influences.
Based on a patient sample with exact heart dosimetry, it is possible to use clinical information alone to predict radiation exposure of the heart in the remaining cohort with a quantified error suitable for dose-response analyses of cardiac late effects. Dose-response analysis of late cardiac effects after radiotherapy benefits from detailed individual dosimetry and should account for errors in estimated dose.

Prospective association between dietary soy protein/isoflavone intake and metabolic syndrome among Korean adults ≥40 years old: the Korean Multi-Rural Communities Cohort Study (MR Cohort) Woo HW 1,2 Despite considerable attention to the potential beneficial effects of soy protein and isoflavone on metabolic syndrome (MetS), findings linking their habitual consumption to MetS are limited. This study aimed to evaluate the association between habitual soy protein/isoflavone intake and MetS incidence, as well as its components, in a population-based cohort of Korean men and women aged ≥40 years. Methods Data from the Korean Multi-Rural Communities Cohort Study (MR Cohort), which is part of the Korean Genome Epidemiology Study (KoGES), were used. A total of 4,404 subjects (1,741 men, 2,663 women) without MetS were included. Soy protein and isoflavone intake was calculated using a food frequency questionnaire (FFQ) composed of 106 items [1]. MetS was defined using the updated National Cholesterol Education Program Adult Treatment Panel III criteria as modified according to the International Diabetes Federation [2] and Korean Diabetes Association criteria [3]. The association between soy protein/isoflavone intake and MetS risk was investigated by a modified Poisson regression model using a robust error estimator [4]. During the follow-up period, 659 new cases of MetS occurred.
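The MetS analysis above uses modified Poisson regression because it estimates relative risks (RR) directly, which odds ratios from logistic regression only approximate when the outcome is common. As a minimal illustration of the quantity being estimated, not the regression itself, the sketch below computes an unadjusted RR with a 95 % CI from a 2×2 table using the standard log-RR variance; all counts are hypothetical. The regression generalises this to covariate-adjusted RRs with robust (sandwich) standard errors.

```python
import math

def risk_ratio(a, b, c, d):
    """Unadjusted risk ratio with 95 % CI from a 2x2 table:
    exposed: a cases, b non-cases; unexposed: c cases, d non-cases."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR) from the usual large-sample formula.
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 60 MetS cases among 1,000 high-intake women
# versus 95 cases among 1,000 low-intake women.
rr, lo, hi = risk_ratio(60, 940, 95, 905)
```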
Among women, increased soy protein and isoflavone intake was associated with a decreased risk of MetS (RR = 0.63, 95 % CI 0.47-0.85, P for trend = 0.0131 for the highest soy protein intake; RR = 0.58, 95 % CI 0.44-0.78, P for trend = 0.0057 for the highest isoflavone intake). Among men, there was no significant association with MetS; however, in individual component analyses, daily intake of soy protein and isoflavone was inversely associated with the risk of elevated blood glucose (RR = 0.67, 95 % CI 0.49-0.90, P for trend = 0.1132 for the highest soy protein intake; RR = 0.66, 95 % CI 0.49-0.90, P for trend = 0.0964 for the highest isoflavone intake) and elevated triacylglycerol (RR = 0.58, 95 % CI 0.39-0.85, P for trend = 0.1218 for the highest soy protein intake; RR = 0.64, 95 % CI 0.43-0.93, P for trend = 0.2873 for the highest isoflavone intake). In addition, among women, daily intake of soy protein and isoflavone showed a significant inverse association with the risk of abdominal obesity, elevated blood pressure, and elevated triacylglycerol. In conclusion, our findings suggest that higher daily intake of soy protein and isoflavone may reduce the risk of MetS and some metabolic components.

The sex difference in the prevalence of metabolic syndrome among a rural Chinese community population Xiao T 1 , Chen X 2 , Su M 3 , Chen Y 4 , Jiang Q 1 , Fu C 1 The sex difference in the risk of metabolic syndrome (MS) needs to be further clarified. This study aimed to determine the sex difference in MS prevalence among a rural Chinese adult population in transition. Methods A cross-sectional study of 27,889 adults aged 18 to 93 years was conducted in Yuhuan County, Zhejiang Province, in 2012. MS was defined by incorporating the diagnostic criteria of the American Heart Association/National Heart, Lung and Blood Institute (AHA/NHLBI) and the International Diabetes Federation (IDF).
A logistic regression model was used to examine the relationship between female sex and metabolic syndrome, and adjusted odds ratios (aOR) with 95 % confidence intervals (CI) were calculated for the association. Of the 27,889 participants, 59 % were female. Their mean age was 53.2 ± 14.0 years. The overall prevalence of MS was 25.3 %, ranging from 3.7 % in the 18-24 year age group to 36.6 % in the 75-93 year age group. The prevalence of MS reached a peak at the age of 55-64 years in men, but increased continuously with age in women. The prevalence of MS was significantly higher in women (28.1 %) than in men (21.3 %) (χ² = 165.427, p < 0.001), and the sex difference remained after adjustment for covariates (aOR = 1.82, 95 % CI 1.68-1.98). Metabolic syndrome was more prevalent in women than in men in this rural Chinese population in transition.

Validating and transforming CDA HL7 v3 documents for I2B2 import Xu T 1 , Thiemann VS 1 , Röhrig R 1 , Majeed RW 1 1 Carl von Ossietzky University, Oldenburg, Germany Background We are building a decentralized emergency care research IT infrastructure (AKTIN) to ensure patient privacy and comply with the many and varied privacy laws in Germany [1]. Each participating hospital will store its own data in a data warehouse (DWH) located on its own premises. The data originates directly from the emergency room (ER) protocol [2, 3], documented in HL7 v3 Clinical Document Architecture (CDA) [4]. The DWH is based on the open-source clinical data warehouse i2b2 (informatics for integrating biology and the bedside) [5], with custom tools to validate and import the ER-CDA documents. Methods CDA documents can be transmitted to the DWH via two standard interfaces: we implement an IHE XDS.b document repository for SOAP API calls as well as an HL7 FHIR Binary [6] endpoint for RESTful data transfers. The documents originating from the information infrastructure have to be validated.
The specification is available from HL7 Germany [7, 8]. We make use of the generated Schematron definition and validate the documents against the given rules. Validation ensures that only specified documents, in this case those compliant with the ER-CDA, are accepted into the DWH. We break down the complex CDA XML structure into Entity-Attribute-Value (EAV) format, since i2b2 stores data as encounter datasets based on entity-value pairs. Transformation from and to CDA documents is not novel [9], but in our case only one direction, and only one CDA definition, is needed. Furthermore, not all parts of the ER-CDA are required; e.g. a laboratory test will not be part of every encounter. Thus, not all fields defined in the CDA are necessarily filled. As no person-identifying data is needed in the DWH, fields such as names are removed from the document as an early step of anonymization. Since a CDA document can be submitted to the DWH without being finalized, e.g. while the patient is still in care, each encounter dataset in the DWH is identified via the encounter ID in the original CDA document, generated by the originating infrastructure. Thus, the dataset can be updated given a corresponding updated CDA document. Our implementation allows us to validate and transform given ER-CDA documents. In the process, we can filter out faulty transmissions and unneeded fields. For validation, we had to adjust the ER-CDA definitions to optimize the process. Likewise, we identified and implemented interchangeable units, based on the different conventions used in different locations. Each value is stored as-is in the DWH but can be transformed into other units. The resulting software runs within the same application server as i2b2 and requires no additional database or configuration. The software can be used with any existing i2b2 installation without further modification.
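The CDA-to-EAV flattening and the removal of identifying fields described above can be sketched with standard XML tooling. This is a heavily simplified stand-in, not the ER-CDA schema or the AKTIN code: the element names, the LOINC-style codes and the helper `to_eav` are all hypothetical, and a real implementation would first validate against the Schematron rules.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an ER-CDA fragment (real documents follow
# the HL7 v3 CDA schema and are far richer).
CDA = """<document>
  <encounter id="enc-42">
    <observation code="8480-6" value="128" unit="mmHg"/>
    <observation code="8867-4" value="72" unit="/min"/>
    <name>PERSON-IDENTIFYING, DROPPED BEFORE IMPORT</name>
  </encounter>
</document>"""

def to_eav(xml_text):
    """Flatten observations into (entity, attribute, value) rows, keyed by
    the encounter ID so a resubmitted document can update the dataset.
    Person-identifying fields such as <name> are simply not extracted,
    mirroring the early anonymization step."""
    root = ET.fromstring(xml_text)
    rows = []
    for enc in root.iter("encounter"):
        eid = enc.get("id")
        for obs in enc.iter("observation"):
            rows.append((eid, obs.get("code"), obs.get("value")))
    return rows

rows = to_eav(CDA)
```

Keying every row by the encounter ID is what makes updates idempotent: importing a corrected CDA for `enc-42` replaces the earlier rows rather than duplicating the encounter.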
In comparison to the ETL process for general CDA documents described in [10] [11] [12] , the transformation and import process we have developed is specialized for the German ER protocol, focusing on the special needs and regulatory requirements in Germany [13]. Code available: https://gitlab.uni-oldenburg.de/AKTIN/dwhimport

Overview Abstract Security and privacy issues have been receiving increasing attention from stakeholders in the healthcare sector in recent years, driven by several trends in the healthcare domain, such as: (1) digitization of health and medical records; (2) proliferation of Bring-Your-Own-Device (BYOD) practices in daily healthcare work; (3) telemedicine and electronic records sharing; (4) connecting medical devices to cyberspace; (5) Internet- and cloud-based healthcare services delivered across borders while complying with local laws and regulations. Most of these trends, while improving the accuracy, efficiency, and convenience of healthcare service delivery, also increase the attack surface from the adversary's perspective. Taking advantage of the digitalization of medical and health records and cloud-based data management applications, e.g., hospitals' and health insurance companies' own databases or services provided by cloud-based platforms such as Microsoft HealthVault, Google Health, and Dossia, an attacker may hack into a health database [1] containing full profiles of the targets. BYOD and electronic records sharing can increase the chances of data breaches caused by inadvertent or malicious security and privacy policy violations [2]. The wired and wireless network connectivity of medical devices, e.g., pacemakers [3], can open an unexpected channel for cyberattacks if security requirements were not considered in the technology design phase.
At the same time, the scope of entities with legal liability for security breaches may be expanded from healthcare organizations to include other involved business associates [4] under increasing privacy concerns, requiring more stakeholders in the healthcare ecosystem, including technology solution providers, to make efforts to ensure the protection of the data processed and used. However, healthcare staff's practical experience and statistics show that most security breaches are attributable to socio-technical and human factors, which go far beyond the area of technology [2, 5]. Such challenges include gaps in working culture, lack of security training and awareness, negligence of security policy under emergencies or other stressful situations, lack of individualized guidelines for security incident response, and lack of a quantitative way of modelling the human factors in the whole healthcare ecosystem from the information security perspective. We believe an interdisciplinary and socio-technical approach is needed to secure health informatics solutions. Future security and privacy designs for health informatics should fully incorporate the knowledge created from human response and behavior analysis when modelling the human factors for incident prevention, detection, response, and disaster recovery. In this workshop we plan to invite people with varied backgrounds to share their views on security challenges and hopes for health informatics, and to exchange ideas on implementing a socio-technical approach to securing health informatics.
463 mHealth quality: a process to seal qualified mobile health apps Yasini M 1 , Béranger J 2 , Desmarais P 3 , Perez L 1 , Marchand G 1 1 DMD Santé, Paris, France; 2 KEOSYS, Scientific, Ethical & Medical Department, Paris, France; 3 Desmarais Avocats, Paris, France A large number of mobile health applications (apps) with a variety of functionalities are currently available. User ratings in the app stores do not seem reliable for determining the quality of apps, and traditional methods of evaluation are not suited to the fast-paced nature of mobile technology. In this study, we propose a collaborative multidimensional scale to assess the quality of mHealth apps. In our process, app quality is assessed in various dimensions including medical reliability, legal consistency, ethical consistency, usability, personal data privacy and IT security. A hypothetico-deductive approach was used in various working groups to define audit criteria based on the various use cases that an app could provide. These criteria were then implemented as web-based self-administered questionnaires, and the automatic generation of reports was considered. This method is on the one hand specific to each app, because it allows each health app to be assessed according to the functionalities it offers. On the other hand, the method is automatic, transferable to all apps and adapted to the dynamic nature of mobile technology. Keywords Mobile health, Assessment, Smart phones, Health apps, Certification

504 Predictors of self-reported health status of a population in Oromia National Regional State, Ethiopia Yesuf EA 1,2 , Kim T 3 , Tushune K 2 , Assefa F 2 , Workneh N 4 , Nemera G 5 , Noh Y 3 , Min S 6 , Chun C 7 , Cheon J 6 , Kassahun W 8 , Balcha F 5 , Assefa A 9 , Segni H 10 Background Health and quality of life measurements are widely used in Europe and North America.
They are useful for measuring the self-reported health status of general populations and of patients with diseases such as diabetes, hypertension and arthritis. Health status measures predict mortality in the general population. The Short Form 36 (SF-36) has been validated and is widely used in countries in Europe and North America. However, the SF-36 has not been validated for populations in Ethiopia. We aimed to validate the SF-36 in the Afan Oromo-speaking general population in Ethiopia and to determine predictors of self-reported physical and mental health. We conducted a cross-sectional study in the Jimma Zone of the Ethiopian Oromia National Regional State (ONRS), the largest state in Ethiopia. The Jimma Zone of ONRS had a population of 2,936,631 in 2015. The sample was randomly drawn from 24 districts, 18 of which were rural. Sample size was based on a prevalence of 0.5 (50 %) with a 0.03 margin of error and a 10 % contingency for potential non-response. All adults 18 years and older were eligible to participate. Data were collected via face-to-face interviews by trained interviewers. We translated the English version of the SF-36 into Afan Oromo, the official language of ONRS. Items were adapted to the local context; for example, the item on 'climbing stairs' was translated as 'walking up hills.' This version was then back-translated into English. The translated version was reviewed and approved by 11 of the 14 authors, giving 78.6 % agreement. We used Cronbach's α for reliability and Spearman's correlation coefficient for scale validation. Linear regression was used to analyze predictors of health status. We obtained responses from 1,163 (99.1 %) participants. Females accounted for 45.3 % of the participants. Age ranged from 18 to 86 years (mean ± SD 35.5 ± 14.2). Seven of the eight SF-36 scales were found to be reliable, with Cronbach's α ≥ 0.7. The vitality scale was not reliable.
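The reliability criterion used above, Cronbach's α ≥ 0.7, follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch with hypothetical data; the three-item "scale" below is a toy, not an SF-36 subscale:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per scale item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Toy 3-item scale answered by 5 respondents (hypothetical data):
items = [[3, 4, 5, 2, 4],
         [3, 5, 5, 2, 3],
         [2, 4, 5, 1, 4]]
alpha = cronbach_alpha(items)
```

When items move together across respondents, the total-score variance dominates the summed item variances and α approaches 1; an unreliable scale such as the vitality subscale here would show α below the 0.7 threshold.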
The Physical Functioning scale showed valid item-scale correlations except for the items 'walking one block' and 'bathing or dressing yourself.' All mental health items were item-scale valid. The means (standard errors) of the physical functioning and mental health scales were 76.66 (0.76) and 72.18 (0.63), respectively. Predictors of better physical functioning were male sex and higher schooling vs. no schooling; physical functioning decreased significantly with increasing age (age 18-24 (β = 29.47), age 25-34 (β = 23.68), age 35-44 (β = 20.22), age 45-54 (β = 16.50), age 55-64 (β = 14.90); p = .0001 for all age groups). Being single (β = 10.88, p = 0.038) or married (β = 8.60, p = 0.016) compared to widowed/divorced, and being employed (β = 7.59, p = 0.0002), also indicated better physical functioning. Conclusion This is the first study showing that in the Ethiopian Oromia state, women, widowed/divorced individuals, individuals with no or only informal schooling, and unemployed persons were at risk of lower physical functioning. Likewise, physical functioning decreased monotonically with age. The results point to the potential for preventive interventions. Snapshotting your clinical data: a data exploration framework for clinical data analytics Yu Y 1 , Sun W 1 , Hao B 1 , Hu G 1 , Liu H 1 , Xie G 1 1 IBM Research-China, Beijing, China Clinical data analytics is used in various aspects of healthcare, such as predictive modelling, clinical decision support and personalized health management. Before a formal data analysis such as predictive analysis can be conducted, the analyst must know the main characteristics of a dataset and how it can be analyzed. Given the variety of clinical data sources, the data exploration tasks differ even for the same kind of clinical data analytics, such as predictive analysis.
The Data Exploration Framework (DEF) provides a generalized method and process that takes a clinical dataset as input, transforms the analyst's requirements into configurable parameters, and automatically generates diverse charts to visualize the results. By snapshotting the exploration results of the clinical dataset at different analytics stages, the analyst obtains a "gallery" of snapshots of the clinical dataset and understands it much better during the analysis process. Providing patients with access to their medical data has recently emerged as a topic in several countries. Different approaches are possible; for example, patient portals are used to give patients access to their medical data. The University Hospital Heidelberg is engaged in a research project to develop a personal cross-enterprise electronic health record (PEHR). The objective of this work is to describe the architecture and implementation of a component called the IHE Connector, which provides the native IHE-based integration between the patient portal and the PEHR core components. The architecture of the PEHR is based on accepted international standards. The core components consist of ready-to-use software products such as a master patient index. The patient portal has been developed using the Liferay framework. The IHE Connector is mainly based on the Open eHealth Integration Platform (IPF) framework, which has been deeply integrated into the patient portal to support the needed IHE transactions. Several IHE profiles for sharing documents and patient information are supported by the IHE Connector. As IPF already provides interfaces for some IHE profiles, others had to be developed from scratch. The IHE Connector can be used not only for connectivity between the patient portal and the PEHR core, but also to provide connectivity for third-party apps and healthcare providers' information systems.
Future of telehealth services in Pakistan Zahid Z 1 1 COMSATS Institute of Information Technology, Islamabad, Islamabad, Pakistan Background The practice of telehealth is increasing steadily in both the developing and the developed world, and many healthcare professionals already provide some of their services through telehealth. Telehealth may help to improve the primary healthcare system by bringing healthcare services to the doorstep. With the help of telehealth, many services can be provided, ranging from consultation to treatment and the provision of different therapies. There is a dearth of qualified healthcare professionals in Pakistan, and this gap can be bridged to a considerable extent through telehealth services. Telehealth can also be helpful in monitoring the geographical distribution of diseases and can thus act as a guide to preventing outbreaks and the wide spread of particular diseases. The main motivation behind this study was the unavailability of basic healthcare at the community level, especially in remote areas of Pakistan. Methods This was a quantitative survey study in which 350 health professionals from different cities of Pakistan participated. Data were analyzed with SPSS-16. The prime objective was to assess the future of telehealth in Pakistan and the knowledge of it among the country's health professionals, in order to understand the feasibility of introducing telehealth services into the healthcare system. The results showed that healthcare professionals in Pakistan have good knowledge of telehealth: 60 % of the participants said that they have very good knowledge of ICT and 38 % rated their knowledge as excellent, while 75 % of the participants rated their knowledge about telehealth as very good, and about 35 % of all participants were practicing telehealth to some extent, ranging from patient follow-up to the delivery of required therapies.
About 83 % of participants said that, with the help of telehealth, a variety of services can be provided to the underserved community, especially those living in the remote areas of Pakistan that make up the major part of its population. We conclude that there is good scope for telehealth care in Pakistan, because most healthcare professionals have good ICT knowledge and already practice telehealth in their daily work to some extent, doing follow-ups, consultations and therapy provision for clients in remote areas with the help of different communication tools. With the help of telehealth, the government of Pakistan could provide basic healthcare services in rural and remote areas, which would help to improve the health status of its citizens. A limitation of this study is the small sample, due to limited resources; the results therefore cannot be extrapolated. Expansion of the hierarchical terminology auditing framework through usage of Levenshtein distance-based criterion Zakharchenko A 1 , Geller J 1 1 New Jersey Institute of Technology, Newark, United States The multitude of different medical terminologies and the abundance of information they present, combined with the development of automated tools and decision support systems, make cross-referencing and compatibility between different medical terminologies one of the key questions in medical informatics research. Manual resolution of incompatibilities and errors tends to be inefficient and expensive due to the sheer volume of information involved. In our previous work [1] we proposed an approach for automatically labeling potential similarities in the hierarchical structure of terminologies based on the semantic likeness and hierarchical position of the concepts themselves.
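The Levenshtein distance named in the title of this abstract can be sketched as follows; the normalization into a [0, 1] similarity score for comparing concept labels is an illustrative assumption, not necessarily the scoring actually used in HierarchyDiff:

```python
def levenshtein(a, b):
    """Minimum number of single-character insertions, deletions and
    substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    """Normalized similarity in [0, 1], e.g. for comparing concept labels."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

assert levenshtein("kitten", "sitting") == 3
```

Two near-identical ingredient labels from different terminologies would thus receive a similarity close to 1, flagging them as candidates for cross-referencing.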
The efficiency of this approach was demonstrated by developing a framework and implementing it in an automated tool, which we called HierarchyDiff. This tool was able to detect similarities between related hierarchies of two different terminologies: the Pharmaceutical/biologic product hierarchy and the Substance hierarchy of SNOMED CT [2], and the INGREDIENT_KIND hierarchy of the National Drug File-Reference Terminology (NDF-RT) [3], and to suggest potential ways of combining and cross-referencing the information in these hierarchies. Although that approach demonstrated positive results, by replacing the underlying string-matching algorithm in the semantic part of the HierarchyDiff tool with a newer algorithm based on the Levenshtein distance [4] and by expanding the hierarchical match algorithm, we obtained a significant improvement in the rate at which HierarchyDiff detects similarities between hierarchies. Nonparametric meta-analysis for diagnostic accuracy studies Zapf A 1 1 Universitätsmedizin Göttingen, Göttingen, Germany Background Summarizing the information of many studies in a meta-analysis is becoming more and more important, also in the field of diagnostic studies. The special challenge in meta-analyses of diagnostic accuracy studies is that, in general, sensitivity and specificity are co-primary endpoints. Across studies, both endpoints are correlated, and this correlation has to be considered in the analysis. The standard approach for such a meta-analysis is the bivariate logistic random effects model. An alternative, more flexible approach is to use marginal beta-binomial distributions for the true positives and the true negatives, linked by copula distributions. However, both approaches can lead to convergence problems. We developed a new, nonparametric approach to the analysis, which has greater flexibility with respect to the correlation structure.
Furthermore, the nonparametric approach avoids convergence problems. In a simulation study, it became apparent that the empirical coverage of all three approaches is generally below the nominal level. Regarding bias, empirical coverage and mean squared error, the nonparametric model is often superior to the standard model and comparable with the copula model. I will also show the application of the three approaches to two example meta-analyses: one with very high specificities and low variability, and one with an outlier study. In summary, compared with the standard model and the copula model, the nonparametric model has better or comparable statistical properties, places no restrictions on the correlation structure, and always converges. The consideration of multiple thresholds per study is a subject of further research. Mining patterns of disease progression: a topic-model-based approach Zhang L 1 , Zhao J 1 , Wang Y 1 , Xie B 1 1 Peking University, Beijing, China Knowledge of how diseases progress and transform is crucial for clinical decision making. Frequent-pattern-mining techniques, such as sequential pattern mining (SPM) algorithms, can automatically extract such knowledge from large collections of electronic medical records (EMR). However, EMR data are usually unorganized and highly noisy. Finding meaningful disease patterns often calls for manual work, such as cohort and feature selection on EMR data by medical professionals. We propose a topic-model-based SPM approach to find disease progression patterns from diagnostic records. We improve the traditional SPM algorithms by filtering and grouping the diagnosis sequences according to different clinical topics. These topics represent certain clinical conditions with closely related diagnoses, and are detected without prior medical knowledge.
Subsequently, we are confident that the most highly ranked and credible webpages of official sources are automatically discovered, even though these are not contained in P. A full-scale crawl of the GHW remains an open task and will be conducted as a next step. Electronic health record with structured data and patient narrative Zvarova J 1 , Zvára K 2 , Tomečková M 3 , Peleska J 3 1 Charles University in Prague, 1st Faculty of Medicine, Prague, Czech Republic; 2 Charles University in Prague, 1st Faculty of Medicine, Praha 2, Czech Republic; 3 EuroMISE Mentor Association, Prague, Czech Republic The most important source of information for biomedicine and healthcare is data. Electronic healthcare documentation is a key element of electronic healthcare (eHealth). A huge amount of data is collected in unstructured form, e.g. in narrative medical reports. Unstructured data are very difficult to analyze, and most of the information they contain is used rarely or never. With the penetration of information and communication technology into healthcare, comprehensive lifelong electronic health records, combined with appropriate means of representing, accessing and visualizing health data, have been developed. We can now provide physicians with an electronic health record that facilitates the rapid capture of structured data and narrative information for integration and reuse. We propose a design in which the patient narrative is enhanced with partial structuring of narrative information, and structured data are captured with a voice-controlled interactive user-computer interface. Partial structuring is based on marked-up items associated with standardized codes that enable linkage to other events as well as efficient reuse of information, which can speed up data entry by clinicians. We propose new methods for partial structuring of narrative information stored in an electronic health record in the Czech language.
The method can be applied in the same way to narrative medical reports in other Slavic languages. The final result of the method is a semi-structured, normalized medical report in which extracted information is matched to codebook concepts. Information structured by codebook concepts can be stored in the electronic health record and reused for medical decision support and quality-assurance tasks. The model is validated through examples from dentistry and cardiology. We developed a lifelong electronic health record for oral medicine with an interactive user-computer interface, DentCross, which reflects well the traditional way of data capture in dentistry. Data capture can be controlled by voice or by keyboard. Application of the method for partial structuring of patient narratives is demonstrated on 48 patient narratives from the field of cardiology. Two cardiologists found 2226 matches to codebook concepts in the 48 narrative medical reports. These concepts and codes were stored to be reused for extracting structured information from other narrative medical reports. Relevant aspects of the user-computer interface, data structures and processing rules are discussed. Structured narratives have the potential to facilitate the capture of data directly from clinicians by allowing freedom of expression, giving immediate feedback, supporting reuse of clinical information, and structuring data for subsequent processing such as medical decision-making, quality assurance and clinical research. The rapid capture of structured data and narrative information can strongly support the quality of epidemiological studies, clinical trials and public health surveys. The present study was supported by grants PRVOUK P28/LF1/6 and SVV 260267 of Charles University in Prague.
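The matching of narrative text to codebook concepts described above can be sketched as a normalized-term lookup; the miniature codebook and the whitespace normalization below are hypothetical illustrations, not the authors' actual Czech-language method:

```python
import re

# hypothetical codebook: normalized concept term -> standardized code
CODEBOOK = {
    "atrial fibrillation": "I48",
    "hypertension": "I10",
    "chest pain": "R07.4",
}

def match_concepts(narrative):
    """Return (concept, code) pairs whose terms occur in the narrative."""
    text = re.sub(r"\s+", " ", narrative.lower())
    return [(term, code) for term, code in CODEBOOK.items() if term in text]

report = "Patient with known hypertension presents with chest  pain."
# whitespace is normalized before matching, so 'chest  pain' still matches
```

In a real system the matched codes would be stored alongside the narrative, enabling the reuse for decision support and quality assurance that the abstract describes.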
Identifying potentially interesting covariates for genetic risk scores under data protection constraints in consortia Zöller D 1 , Schniedermann M 2 , Kadioglu D 1 , Binder H 1 1 University Medical Center Mainz, Mainz, Germany; 2 Johannes Gutenberg University Mainz, Mainz, Germany In medical research, the samples available in a single biobank and the corresponding molecular measurements often do not contain enough information to answer increasingly complex scientific questions. By complementing the measurements from one biobank with measurements from other biobanks, statistically more reliable results can be obtained. Yet, due to data protection constraints in Germany, such a contribution of data is only readily permissible after anonymizing the respective datasets; this usually requires aggregating the data. In addition, data owners might not be willing to contribute data on an individual level and thereby give up their data sovereignty. We propose a method for multivariate regression modeling that requires only aggregated data from the different data locations. This approach is used to identify potentially interesting covariates for a genetic risk score from a potentially large set of covariates. Specifically, we propose a regularized regression approach, namely componentwise likelihood-based boosting, based solely on univariate coefficient estimates obtained from a linear regression for the endpoint of interest and on the covariance matrix of the covariates. The univariate coefficient estimates and the covariance matrices of the different data providers can be combined via pooling. Due to the regularized regression, the data need to be standardized. Ideally, this would be done based on all individual data, but due to the mentioned data protection and data sovereignty issues, this might only be possible per data provider. We evaluate the proposed method in a simulation study, comparing our results to those obtained with the pooled individual data.
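A minimal sketch of componentwise boosting driven purely by aggregated statistics, in the spirit of the approach above, is given below; the plain least-squares update rule and the step length nu = 0.1 are illustrative assumptions, as componentwise likelihood-based boosting may use different updates:

```python
def boost_from_aggregates(cov_xy, cov_xx, steps=200, nu=0.1):
    """Componentwise boosting for a linear model using only aggregated
    data: the covariance of each standardized covariate with the outcome
    (cov_xy, length p) and the p-by-p covariate covariance matrix
    (cov_xx), e.g. pooled over several data providers.  No
    individual-level data are touched."""
    p = len(cov_xy)
    beta = [0.0] * p
    for _ in range(steps):
        # covariance of each covariate with the current residual
        score = [cov_xy[j] - sum(cov_xx[j][k] * beta[k] for k in range(p))
                 for j in range(p)]
        j = max(range(p), key=lambda i: abs(score[i]))  # best-fitting component
        beta[j] += nu * score[j] / cov_xx[j][j]         # small, shrunken update
    return beta

# two standardized, uncorrelated covariates; only the first has an effect
beta = boost_from_aggregates([0.8, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

Because only cov_xy and cov_xx enter the updates, the data providers need to share aggregate statistics only, which is the key to satisfying the data protection constraints discussed above.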
The proposed method leads to a moderate increase in the number of falsely detected covariates, but detects covariates with a true effect to an extent comparable to a method based on the individual data, provided the number of locations is rather small. With an increasing number of locations, the number of false positives increases considerably. Standardizing per data provider instead of standardizing the total data even proved favorable in our simulations. In summary, the proposed approach provides multivariable risk prediction models with adequate performance in situations with distributed data. Accounting for smoking in meta-analyses of genome-wide association studies identifies novel loci and gene-smoking interactions for obesity traits Obesity and cigarette smoking are important risk factors for cardiometabolic diseases. Smokers often exhibit lower body mass index (BMI) and higher waist circumference (WC) compared to nonsmokers. Previous meta-analyses of genome-wide association (GWA) studies revealed many genetic loci affecting obesity traits. However, these studies have largely ignored smoking status, and little is known about whether genetic effects differ between smokers and non-smokers. By accounting for smoking status (SMK, current vs.
not current smoking), we here aim to identify novel genetic loci and gene-smoking (GxSMK) interaction effects for (1) BMI, a measure of overall obesity. The experiment on real-world EMR data shows that our approach is able to find meaningful progression patterns with less noise, and can help quickly identify interesting patterns related to a certain clinical condition with less human effort. Analyzing the German health web using a focused crawling approach Zowalla R 1 , Wiesner M 1 , Pfeifer D 1 1 Heilbronn University, Dept. of Medical Informatics, Heilbronn, Germany Given the increasing number of webpages in the medical domain, it is nearly impossible to keep track of such dynamic content manually [1, 2]. In this context, an automatic Web crawling approach can help to analyze the structure of the health-related Web. By harvesting only such content, it is possible to (i) identify information hubs, (ii) find content providers of high prestige and (iii) identify important topics and trends. In this paper, a German Health Web (GHW) crawler is outlined (i.e. all .de, .at and .ch domains). With the crawled data it is possible to extract the structure of the GHW for (i)-(iii) and to provide access to large amounts of health-related text material for linguistic analysis. Material/methods Given a set of seed points P, a Web crawler extracts the hyperlinks contained in P and traverses the directed graph of the Web [3]. In this context, we obtained P from the Open Directory Project. Due to the enormous size of the Web [4], discriminating whether a webpage's content is health-related or not is an important task to reduce the total number of pages that need to be crawled.
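The focused-crawling loop described above can be sketched as follows; the keyword-based is_relevant stub is a stand-in assumption for the SVM classifier the authors actually use, and fetch/extract_links are caller-supplied (e.g. urllib plus an HTML parser):

```python
from collections import deque
from urllib.parse import urljoin

def is_relevant(text):
    """Stand-in relevance check; the real crawler uses an SVM classifier."""
    return any(kw in text.lower() for kw in ("gesundheit", "krankheit", "therapie"))

def focused_crawl(seeds, fetch, extract_links, max_pages=1000):
    """Breadth-first crawl that only expands pages judged health-related.
    fetch(url) -> html and extract_links(html) -> [href] are supplied by
    the caller."""
    frontier, seen, health_pages = deque(seeds), set(seeds), []
    while frontier and len(seen) < max_pages:
        url = frontier.popleft()
        html = fetch(url)
        if not is_relevant(html):
            continue                      # prune: do not expand off-topic pages
        health_pages.append(url)
        for href in extract_links(html):
            link = urljoin(url, href)
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return health_pages
```

Pruning at the relevance check is what keeps the crawl tractable: off-topic pages are fetched once but never expanded, so the frontier stays focused on the health-related part of the Web graph.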
In this context, a focused crawler uses techniques from the field of machine learning to estimate the relevance of a webpage for the domain of interest [5, 6]. We applied well-known text classification methods described in [7, 8] to determine the relevance via SVMs. In total, the crawler visited 120,584,630 webpages, of which 17,448,782 were classified as health-related. The crawl dates from Feb 3rd-10th, 2016. A total of 91 % originates from .de, 6 % from .at and 3 % from .ch domains. For a preliminary analysis of particular well-known webpages of the GHW, we generated two sub-graphs with a depth d = 6 for the seed URLs (a) www.bmg.bund.de [Nodes = 28,776; Edges = 999,130; Average Degree = 34.72; Network Diameter = 13] and (b) www.rki.de [Nodes = 73,694; Edges = 3,325,714; Average Degree = 45.3; Network Diameter = 16]. The corresponding .graphml files for BMG/RKI can be downloaded free of charge at shcinfo.gecko.hs-heilbronn.de/german-health-web. In addition, we can provide other graph files for research purposes, generated on demand for other parts of the GHW. Discussion Several limitations can be identified: First, it is not clear whether P covers a broad range of topics within the health domain. Second, due to limited resources and time, the crawled data cover only a certain fraction of the estimated GHW. Third, the SVM classifier might have produced false positives. Moreover, the partial structural analysis of the crawled data was conducted for only two points of interest. Still, as we used graph entry URLs of very high credibility, the resulting graph structure can be assumed to be a valid partial representation of the GHW. A first investigation of a randomly selected sample consisting of several hundred pages suggests that our approach seems to produce valid results.