The use of volumetric projections in Digital Human Modelling software for the identification of large goods vehicle blind spots

Journal contribution posted on 23 November 2015 by Steve Summerskill, Russell Marshall, Sharon Cook, James Lenard, John H. Richardson

The aim of the study is to understand the nature of blind spots in the vision of drivers of Large Goods Vehicles caused by vehicle design variables such as driver eye height and mirror designs. The study was informed by the processing of UK national accident data using cluster analysis to establish whether vehicle blind spots contribute to accidents. To establish the cause and nature of blind spots, six top-selling trucks in the UK, covering a range of sizes, were digitised and imported into the SAMMIE Digital Human Modelling (DHM) system. A novel CAD-based vision projection technique, validated in a laboratory study, allowed multiple mirror and window aperture projections to be created, resulting in the identification and quantification of a key blind spot. The identified blind spot was shown to have the potential to be associated with the scenarios identified in the accident data. The project led to the revision of UNECE Regulation 46, which defines mirror coverage in the European Union, with new vehicle registrations in Europe required to meet the amended standard after June 2015.
Read the paper on the publisher website.

Categories: Design Practice and Management not elsewhere classified
Keywords: Digital Human Modelling; Trucks; Vehicles; Blind spot; Class V mirror; Heavy goods vehicle; Vulnerable road user; Field of vision; Vehicle ergonomics; Accident data; Cluster analysis
School: Design and Creative Arts
Department: Design
Published in: Applied Ergonomics: human factors in technology and society
Volume: 53
Issue: Part A
Pages: 267-280
Citation: SUMMERSKILL, S. ... et al, 2016. The use of volumetric projections in Digital Human Modelling software for the identification of large goods vehicle blind spots. Applied Ergonomics, 53, pt. A, pp. 267-280.
Publisher: © Elsevier
Version: AM (Accepted Manuscript)
Publisher statement: This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/
Acceptance date: 25/10/2015
Publication date: 2015-11-14
Copyright date: 2016
Notes: This paper was accepted for publication in the journal Applied Ergonomics and the definitive published version is available at http://dx.doi.org/10.1016/j.apergo.2015.10.013.
DOI: https://doi.org/10.1016/j.apergo.2015.10.013
ISSN: 0003-6870
Publisher version: http://dx.doi.org/10.1016/j.apergo.2015.10.013
Language: en
Licence: CC BY-NC-ND 4.0