Programme 2024

Thursday 11th July 2024

Time – Uren 1202 / Uren 1203
08:15 – Registration & Refreshments (U1201)
09:00 – Welcoming and Housekeeping
09:10 – Plenary Speaker: Christine Vermeulen, Behind Enemy Lines – Vascular War Trauma
09:50 – Change-over time (partition added)
10:00 – Uren 1202: Session 1A – Biology, Physiology and Rehabilitation of Blast and Conflict Injury | Uren 1203: Workshop 1 – Development of Physiologically Based Injury Criteria for Behind Armor Blunt Trauma
11:30 – Refreshment break (U1201)
12:00 – Uren 1202: Session 2A – Paediatric Blast and Conflict Injury | Uren 1203: Workshop 2 – Neck Injury Studies in Warfighters
13:30 – Lunch break (U1201)
14:30 – Uren 1202: Session 3A – Musculoskeletal Injury and Osseointegration | Uren 1203: Workshop 3 – Is attacking the self identity of the victim the prime purpose of an explosive weapon?
15:30 – Refreshment break (U1201)
16:00 – Uren 1202: Session 4A – Living with Blast and Conflict Injury and the Lived Experience | Uren 1203: Workshop 4 – Impact of Repetitive Blast Exposure on Military Health and Performance – Multinational Efforts and Guidelines

17:30: Drinks Reception in Uren 1201 – those going to the Conference Banquet need to leave by ~18:45

19:30: Conference Banquet at DoubleTree Hilton, Ealing, W5 3HN.

Friday 12th July 2024

Time – Uren 1202 / Uren 1203
08:30 – Registration & Refreshments (U1201)
09:00 – Welcoming and Housekeeping
09:10 – Plenary Speaker: Harry Parker, Life beyond survival: 15 years of living with blast injury
09:50 – Change-over time (partition added)
10:00 – Uren 1202: Session 5A – Blast and Conflict Neurotrauma – Treatment and Rehabilitation | Uren 1203: Session 5B – Blast Force Protection
11:15 – Refreshment break (U1201)
11:45 – Uren 1202: Session 6A – Lightning talks and Poster Session | Uren 1203: Workshop 5 – A Consensus Meeting to Develop Guidelines for Paediatric Patients Undergoing Surgical Amputation Following Blast Injury – Updating the Paediatric Blast Injury Manual
13:15 – Lunch break (U1201)
14:15 – Uren 1202: Session 7A – Blast and Conflict Neurotrauma – Prediction and Prevention | Uren 1203: Workshop 6 – What are the clinical challenges of contemporary conflicts? How can we solve them? Developing a modified Delphi approach
15:45 – Refreshment break (U1201)
16:15 – Plenary Speaker: Jesse Mez, Head trauma in the community, contact sports and the military: Disentangling heterogeneity to clarify head trauma’s relationship with later life neurodegenerative outcomes
16:55 – Closing Remarks
17:00 – End of Conference

Sessions

Session 1A

Biology, Physiology and Rehabilitation of Blast and Conflict Injury


10:00 – 10:15

Influence of Blast Exposure on the Neurobiological Pathway between TBI and Cognitive Function
Sarah Martindale, Salisbury Veterans Affairs Healthcare System
Abstract

Introduction: Post-traumatic stress disorder (PTSD), traumatic brain injury (TBI), and blast are associated with long-term cognitive outcomes in Veterans. Despite overlap in symptoms and frequent comorbidity, work suggests that these conditions affect cognition independently. The present analysis identifies functional neurobiological pathways by which PTSD, TBI, and blast exposure affect neuropsychological functioning long-term.
Method: Participants were 181 US Military combat Veterans. Structured clinical interviews provided diagnoses and characterization of TBI, blast exposure, and PTSD. Neuropsychological testing characterized cognitive function. Magnetoencephalography characterized the functional brain connectomes for each participant. Linear regression identified factors contributing to cognitive function including connectome metrics, deployment TBI, blast exposure, PTSD, and conditional effects.
Results: Conditional relationships between blast and the connectome, as well as TBI and the connectome, were identified for several cognitive outcomes. Primary Blast TBI was most consistently related to general cognitive function, compared with Deployment TBI and Blast TBI. Blast exposure severity, with or without associated TBI, was related to tests of processing speed but not general cognitive functioning. Aspects of the connectome altered by blast and TBI included degree, global efficiency, connection strength, and the K-core.
Discussion: These results demonstrate that the relationship between the functional connectome of the brain and cognitive function is dependent on blast exposure and TBI history. Outcomes support a neurological pathway of influence for blast and TBI on cognitive function that is distinct from psychiatric conditions such as PTSD. These relationships were observed independently of the effects of covariates such as PTSD, age, education, and race, suggesting that they specifically represent sequelae of blast and TBI.
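
The “conditional effects” in this analysis are interaction terms between connectome metrics and exposure history in a linear regression. A minimal sketch of that kind of model is shown below in Python with statsmodels; the variable names and simulated data are illustrative assumptions, not the study’s dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per Veteran (illustrative variables, simulated values)
    rng = np.random.default_rng(0)
    n = 181
    df = pd.DataFrame({
        "cognition": rng.normal(size=n),          # neuropsychological composite score
        "global_eff": rng.normal(size=n),         # a connectome metric (e.g. global efficiency)
        "blast_severity": rng.integers(0, 4, n),  # ordinal blast-exposure severity
        "tbi": rng.integers(0, 2, n),             # deployment TBI (0/1)
        "ptsd": rng.integers(0, 2, n),            # PTSD diagnosis (0/1)
        "age": rng.normal(40, 8, n),
    })

    # "Conditional effect" = interaction between the connectome metric and exposure history
    model = smf.ols(
        "cognition ~ global_eff * blast_severity + global_eff * tbi + ptsd + age",
        data=df,
    ).fit()
    print(model.summary())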



The Functional Connectome as a Neurological Pathway for TBI and Blast Exposure to Influence Long-Term Symptom Presentation
Jared Rowland, Salisbury Veterans Affairs Healthcare System
Abstract

Background: Traumatic brain injury (TBI) and blast exposure are common among warfighters and can result in long-term symptoms. However, the role of blast exposure and associated neurological changes in long-term symptom presentation is not well defined. The current study investigated the functional connectome of the brain as a neurological pathway producing long-term symptoms following deployment-related TBI and blast exposure.

Method: Participants were 181 US Military combat Veterans. Structured clinical interviews provided diagnoses and characterization of TBI, blast exposure, and PTSD. Self-report measures characterized long-term symptoms (psychiatric, behavioral health, and quality of life). Resting-state magnetoencephalography (MEG) characterized the functional connectome of the brain for each participant. Linear regression identified factors contributing to symptom presentation including relevant covariates, connectome metrics, deployment TBI, blast exposure, PTSD, and conditional effects.

Results: Several conditional relationships were identified, demonstrating that the connectome was related to outcomes only in the presence of deployment-related TBI (including blast-related TBI and primary blast TBI) and blast exposure. These effects were most frequently associated with primary blast TBI. Overall, outcomes suggest a potential dose-escalation effect as the severity of blast exposure increases. No conditional relationships were identified for PTSD; however, the main effect of PTSD was significant in all models.

Conclusions: The current study demonstrates the importance of brain function for understanding long-term post-deployment symptom outcomes, particularly as a pathway of influence for blast exposure and deployment-related TBI. These findings provide a clear demonstration of a neurological pathway of influence for deployment-related TBI and blast exposure on symptom presentation over a decade past the injury event.



Evaluation of the Blast Injuries using the Blast Gauge Sensors
Thanyani Pandelani, University of South Africa
Abstract

Blast Traumatic Brain Injury (bTBI) has become the “signature wound” of the conflicts in Iraq and Afghanistan. In 2018, the Defense and Veterans Brain Injury Center’s worldwide count of service members diagnosed with TBI between 2000 and 2018 stood at 383,947, which is why TBI is known as the signature injury of modern war. This has grave implications, as soldiers reporting some degree of bTBI have been shown to be five times more likely to report a major decline in health in the six months following injury. A full 41.3% report a severe decline in health and quality of life in the five years following deployment. Classifying the TBI correctly can ensure appropriate treatment and prevent long-term decline in health following bTBI.
Battlefield medical personnel rely on visual signs and patients’ personal accounts to alert them to the possibility of TBI. Identifying both the magnitude and the characteristics of the clinically relevant blast waves that caused the brain damage is of paramount importance for correct diagnosis. Measuring an individual’s exposure level is therefore key to understanding the mechanisms of blast TBI and can provide insight for choices about treatment and therapies.
The Blast Gauge System (BGS) provides a quantitative means of measuring blast-related exposure, thus providing a mechanism for medical personnel to better identify those at risk of TBI. A series of laboratory and field tests was conducted using the BGS, with pencil probes as controls. The tests were conducted with PE4 charges of 300 g and 1 kg detonated at different heights of burst (HOBs).
The experimental results were compared with respect to incident pressures, positive phase duration of the blast wave, impulse and relevant injury criteria. Blast injuries were predicted using the Bowen curves for both the BGS and the pencil probe. The pencil probe predicted higher injury levels than the BGS.
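
In practice, applying a Bowen-type curve means comparing a measured (peak incident overpressure, positive-phase duration) pair against an iso-injury threshold curve. A minimal sketch is given below; the digitised curve values and the example gauge reading are placeholders for illustration, not the published Bowen data or the trial measurements.

    import numpy as np

    # Hypothetical digitisation of a Bowen-style iso-injury curve:
    # positive-phase duration (ms) versus incident overpressure threshold (kPa).
    duration_ms   = np.array([2.0, 5.0, 10.0, 30.0, 100.0, 400.0])
    threshold_kpa = np.array([700.0, 450.0, 320.0, 240.0, 200.0, 180.0])

    def exceeds_injury_threshold(peak_kpa: float, dur_ms: float) -> bool:
        """Compare a measured (pressure, duration) pair against the interpolated curve."""
        # Interpolate in log-log space, as blast injury curves are normally plotted
        thr = np.exp(np.interp(np.log(dur_ms), np.log(duration_ms), np.log(threshold_kpa)))
        return peak_kpa >= thr

    # Example: a gauge recording of 350 kPa incident pressure with a 6 ms positive phase
    print(exceeds_injury_threshold(350.0, 6.0))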


Repetitive subconcussive blast overpressure results in neurophysiological slowing and DMN dysconnectivity measured with MEG, not evident in fMRI, and independent of concussion history
Benjamin Dunkley, The Hospital for Sick Children
Abstract

Repetitive subconcussions can lead to serious neurological deficits and are common in the military, with certain roles routinely exposed to subconcussion through repetitive blast overpressure. Post-mortem studies reveal that cumulative exposure and total force from sports collisions are better predictors of chronic traumatic encephalopathy (CTE) than concussion history. CTE and proteinopathic dementias can only be diagnosed post mortem – thus, an in vivo marker would be game-changing. Magnetoencephalography (MEG) is a sensitive, non-invasive tool with exceptional temporal sampling for imaging electrochemical activity and could provide a surrogate biomarker of tauopathy. Additionally, functional magnetic resonance imaging (fMRI) has shown that functional connectivity is associated with tauopathy patterns. We applied MEG and fMRI to examine the effects of repetitive subconcussion on neuronal activity and functional connectivity, as well as on neurological symptoms and mental health, in Canadian Armed Forces (CAF) members and Veterans. MEG revealed evidence of disrupted neuronal activity in those with greater exposure to overpressure, including neural slowing (delta, 1-3 Hz) in the right frontal (F=8.85, p=.004), temporal (F=7.95, p=.006), and subcortical regions (F=7.27, p=.009), and functional dysconnectivity in the posterior default mode network (low gamma, 30-55 Hz: F=4.20, p=.044; high gamma, 80-150 Hz: F=5.31, p=.024). These abnormalities were independent of concussion or traumatic stress history, and MEG revealed functional dysconnectivity not detected with fMRI. Those with greater blast exposure also had poorer somatic and cognitive functioning, with no blast-related differences in mental health. This study suggests that repetitive subconcussions have deleterious effects on the brain and that MEG provides a potential avenue both for identifying treatment targets, by localising affected brain regions, and for prevention, by identifying those at risk of cumulative subconcussive neurotrauma.
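
As a rough illustration of how band-limited “slowing” comparisons of this kind are computed, the sketch below estimates relative delta-band power from a Welch spectrum and compares two exposure groups with an F-test. The sampling rate, signal lengths and simulated data are assumptions for illustration, not the study’s MEG pipeline.

    import numpy as np
    from scipy.signal import welch
    from scipy.stats import f_oneway

    fs = 600.0  # assumed MEG sampling rate (Hz)

    def delta_power(signal: np.ndarray) -> float:
        """Relative power in the delta band (1-3 Hz) from a Welch PSD."""
        freqs, psd = welch(signal, fs=fs, nperseg=int(4 * fs))
        band = (freqs >= 1.0) & (freqs <= 3.0)
        return psd[band].sum() / psd.sum()

    rng = np.random.default_rng(1)
    low_exposure  = [delta_power(rng.normal(size=int(60 * fs))) for _ in range(20)]
    high_exposure = [delta_power(rng.normal(size=int(60 * fs))) for _ in range(20)]

    # One-way ANOVA (F-test) between exposure groups, mirroring the F statistics reported
    F, p = f_oneway(low_exposure, high_exposure)
    print(F, p)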


Repetitive Blast Exposure is Associated with Increased [18F]flortaucipir Uptake using In Vivo Tau-PET Imaging in Canadian Armed Forces Breachers and Snipers
Shamantha Lora, CAMH, Brain Imaging Center
Abstract

Chronic traumatic encephalopathy (CTE), a progressive tauopathy, is suspected to occur as a result of repetitive exposure to (sub)concussive head trauma, as sustained by military personnel in operational and training environments. Here we used positron emission tomography (PET) imaging of the tau radioligand [18F]flortaucipir to determine whether exposure to low-level repetitive blast overpressure (ReBOp) in Canadian Armed Forces (CAF) members with long-term exposure to blast and weapons systems is associated with higher tau levels and if [18F]flortaucipir uptake is related to clinical symptoms.

This cross-sectional study compared brain [18F]flortaucipir levels [Standardized Uptake Value ratios (SUVr)] in an all-male sample of CAF breachers/snipers with extensive ReBOp exposure (N=25; 43 yr) to matched CAF controls without ReBOp (N=6; 38 yr), using 6 MRI-delineated regions of interest (ROIs) with the cerebellum as a reference.

CAF members exposed to ReBOp showed significantly higher [18F]flortaucipir SUVr across ROIs compared to controls (p=0.03). Effects were greatest in the prefrontal (6.1%; p<0.01) and temporal cortices (7.2%; p<0.01) and remained significant in the prefrontal cortex after covarying out age (p=0.03) but not BMI (p=0.12). [18F]flortaucipir SUVr values (n=25) were positively correlated with years of breaching and explosives exposure (r=0.36-0.54, p=0.01-0.06). Greater [18F]flortaucipir SUVr values in prefrontal regions were associated with poor sleep quality (r=0.52, p=0.01), anxiety (r=0.40, p<0.05), and post-concussive symptoms (r=0.48-0.62, p<0.02).

Our findings of elevated [18F]flortaucipir SUVr in ReBOp-exposed CAF members, linked with poorer clinical outcomes and years of blast exposure, support our hypothesis and are partly in line with an earlier PET study of ReBOp. Investigations in larger military cohorts should model tau to better understand the pathobiological significance and delineate what constitutes safe exposure limits to ReBOp.
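
The SUVr measure itself is a simple ratio: mean tracer uptake in a target region divided by mean uptake in the cerebellar reference. A minimal sketch with a synthetic image and masks is shown below; it is illustrative only and not the study’s processing pipeline.

    import numpy as np

    def suvr(pet: np.ndarray, roi_mask: np.ndarray, reference_mask: np.ndarray) -> float:
        """Standardised uptake value ratio: mean ROI uptake over mean reference (cerebellar) uptake."""
        return pet[roi_mask].mean() / pet[reference_mask].mean()

    # Toy example with a synthetic 3D "PET image" and boolean masks
    rng = np.random.default_rng(2)
    pet = rng.random((64, 64, 64))
    roi = np.zeros(pet.shape, dtype=bool); roi[20:30, 20:30, 20:30] = True
    cerebellum = np.zeros(pet.shape, dtype=bool); cerebellum[40:50, 10:20, 5:15] = True
    print(suvr(pet, roi, cerebellum))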


Computational Modelling of Blast-Induced Burn Injuries
Timothy Brewer, Synthetik Applied Technologies LLC
Abstract

The accurate prediction of human burn injuries from explosive events requires two key components: 1) the development of a computational solver that can accurately compute the thermal effects associated with an explosive detonation including both convective and radiative heat transfer from the fireball and detonation products, and 2) the implementation of an appropriate skin damage model that can accurately calculate the resulting type, extent, and severity of burn injuries.
Development of a suitable computational solver requires several key additions and modifications to conventional computational fluid dynamics (CFD) codes used to simulate airblast. First, and most importantly, energy-temperature coefficients must be obtained for the explosive detonation product mixture, allowing more accurate temperatures to be computed than with standard models.
Alone, this improvement to the energy-temperature relationships greatly enhances the accuracy of the computed temperatures, but inclusion of a radiation model is also important, as this removes additional energy from the “fireball” and transports it through ambient regions. The use of a radiation model also allows investigation of thermal effects on skin by using both the temperature gradient and the radiative heat flux at boundaries to calculate a thermal flux, which can then be used to determine damage.
Different skin damage models have been investigated, and a quantitative comparison of several legacy (long duration) burn and thermal radiation dose models is presented. Furthermore, simulation results from the newly developed computational model are provided for notional human subjects at a range of stand-off distances. Simulation outputs are subsequently compared to two of the 7/7 London Underground bombings (Aldgate and King's Cross) to demonstrate the validity of this approach.
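
Many legacy skin burn models are variants of the Henriques-Moritz Arrhenius damage integral, Omega = ∫ A·exp(-Ea/(R·T)) dt, with Omega of about 1 commonly taken as a second-degree burn threshold. The sketch below uses frequently cited literature coefficients for basal-layer skin; these are assumptions for illustration, not necessarily the values used in the authors' solver.

    import numpy as np

    # Commonly cited Henriques-Moritz coefficients for basal-layer skin damage (illustrative)
    A  = 3.1e98   # frequency factor (1/s)
    Ea = 6.28e5   # activation energy (J/mol)
    R  = 8.314    # universal gas constant (J/(mol*K))

    def burn_damage(temps_kelvin: np.ndarray, dt: float) -> float:
        """Integrate the Arrhenius damage rate over a basal-layer temperature history."""
        return float(np.sum(A * np.exp(-Ea / (R * temps_kelvin)) * dt))

    # Example: 2 s at a constant basal-layer temperature of 60 degrees C, sampled every 1 ms
    temps = np.full(2000, 60.0 + 273.15)
    print(burn_damage(temps, 1e-3))   # Omega >= 1 indicates a predicted second-degree burn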


Physical and chemical changes in human blood when exposed to a simulated blast under-pressure wave.
Jo Harding, Massey University, New Zealand
Abstract

Injuries from blast are found in a variety of clinical settings following exposure to an explosion. A blast wave is formed by the gas released from the explosive material, but also by compression of the surrounding atmospheric gas. That compression in turn leads to an under-pressure phase.

While blast trauma is well documented in the literature, the origins of microscopic gas emboli resulting from blast are not. As such, the management of, and potential consequences of, gas emboli in blast victims are not currently based upon research evidence.

The major objective of this research was to outline and test an alternative theory of microscopic gas emboli development in blast other than the popular but untested translocation theory.

We present data from an in-vitro examination of the effect of the under-pressure phase of blast on human blood, using a simulated, isolated under-pressure wave. This research has shown that a rapid decompression liberates dissolved gas (carbon dioxide) from blood into gas bubbles. This was supported by a lowered carbon dioxide content in the active samples and consistent acid-base chemistry, measured using a blood gas analyser.

We postulate that these microbubbles may have the potential to cause or potentiate injury following exposure to blast. These findings justify ongoing research to further test the hypothesis.

Ethical approval was obtained. A power analysis (80% power) determined a sample size of N=10-12.
Each sample was tested for a baseline value, then divided into two aliquots, one acting as its own control. Baseline and control samples confirmed sample preservation.
Active samples were exposed to decompression; controls were not. Both aliquots were tested on the blood gas analyser following each decompression. The order of blood gas testing was randomised.
Venous samples (n=43)
Apparatus data: decompression level (M -95.43 kPa), time to peak decompression (M 0.04 seconds), total time of decompression (M 0.35 seconds). Baseline carbon dioxide tension was M 52.66 mmHg (95% CI 50.9 – 54.4), control M 51.0 mmHg (95% CI 49.3 – 52.6), and active M 46.9 mmHg (95% CI 45.0 – 48.7). A t-test comparing active and control samples showed a significant difference (p=0.002); baseline and control did not differ significantly. Sodium bicarbonate and potassium changes were in keeping with pH buffering in response to the changes in carbon dioxide.
Arterial samples (n=25)
Apparatus data: decompression level (M -94.24 kPa), time to peak decompression (M 0.20 seconds), total time of decompression (M 0.138 seconds). Baseline carbon dioxide tension was M 32.8 mmHg (95% CI 32.4 – 33.2), control M 32.9 mmHg (95% CI 32.6 – 33.3), and active M 31.0 mmHg (95% CI 30.3 – 31.6). A t-test comparing active and control samples showed a significant difference (p<0.001); baseline and control showed no significant difference.
Sodium bicarbonate and potassium changes were consistent with pH buffering in response to the changes in carbon dioxide.
This research shows that a rapid decompression liberates dissolved carbon dioxide gas from blood, supported by a lowered carbon dioxide content in active samples and consistent acid-base chemistry.
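
Because each donor sample was split into an active and a control aliquot, the active-versus-control comparison corresponds to a paired t-test. A minimal sketch with hypothetical carbon dioxide tensions (mmHg), not the study data, is:

    from scipy.stats import ttest_rel

    # Hypothetical paired measurements: control and decompressed halves of the same samples
    control = [51.2, 50.4, 52.0, 49.8, 51.5, 50.9, 52.3, 50.1, 51.0, 50.6]
    active  = [47.1, 46.3, 48.0, 45.9, 47.5, 46.8, 48.2, 46.0, 47.3, 46.5]

    t_stat, p_value = ttest_rel(control, active)
    print(t_stat, p_value)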


Back to main programme (Thursday 11th July)


Session 2A

Paediatric Blast and Conflict Injury


The impact of blast injuries on children: A case study from Gaza
Becky Platt, Save the Children
Abstract

Becky Platt, a paediatric nurse who has recently returned from working in a field hospital in Gaza with Save the Children, will talk about the physical impact of blast injuries on children, the psychological distress they cause, and the research that needs to be done to manage pain appropriately. Her talk will cover why it is so important to have a specific focus on paediatric care from the point of wounding right through to the discharge of the patient and beyond, as well as the international responsibilities states have to protect children from being injured by blast.


Long-term bone growth complications following traumatic injuries in children in conflict zones.
Sumudith Jayasuriya, Imperial College London
Abstract

Severe extremity injuries to children due to crush or conflict are common and result in significant medium- and long-term complications. Advances in surgical approach, wound management, and post-operative care for children with such injuries have yielded improved short- to medium-term outcomes. However, there are long-term unresolved complications, particularly related to bone growth, requiring research endeavour at both the basic and clinical science levels. The aim of this work is to identify the key medium- and long-term unresolved clinical issues related to bone growth in children with severe extremity injuries due to conflict or crush and to develop a research programme that addresses these. Through literature review, clinical input, and ongoing collaborative work with detailed case file analysis, two key priorities were identified: growth plate damage and post-amputation overgrowth. Damage to the growth plates may lead to angular deformities and length discrepancies in the affected limb. Such limb deformation can be detrimental to the child long-term, resulting in functional deficit and secondary musculoskeletal conditions such as scoliosis and osteoarthritis. Additionally, in children who received amputations as a treatment for their traumatic injuries, post-amputation bone overgrowth is another long-term complication that can cause pain, skin conditions, and issues with prosthesis fitting, thereby significantly impacting the child’s well-being. While these issues are currently unaddressed, there is a clear motivation to study these conditions in terms of how they relate to the injury mechanism, as well as how they can be mitigated through contemporary post-operative management approaches such as mechanical mediation. The literature review has identified that animal and computational models of the bone growth response to traumatic injury can address the basic science questions to inform future clinical studies.


Malnutrition in children and provision of school meals in conflict settings.
Natalia Kovalevskaya, Imperial College London
Abstract

The provision of high-quality, diversified, and nutritious food to school-age children is one of the principal areas of national security. Historically, national school meal programmes have played an important role during conflicts and emergencies. These programmes contribute to promoting peace and social stability and have been scaled up in emergencies as a rapid safety net. School feeding is a popular intervention in food-insecure settings and is designed to improve children’s education, health, and nutrition. It is estimated that approximately one in six children lives in a conflict zone. Armed conflicts precipitate global food crises and severe food insecurity, and worsen malnutrition and other health outcomes. Malnutrition is manifested in several broad forms: underweight, overweight and obesity, and micronutrient deficiencies. Although a history of armed conflict has been associated with poor child health, there is limited research on the impact of conflict on children’s nutritional status. Malnutrition adversely affects the physical, mental, and social aspects of children’s health and prevents them from reaching their full potential. Poor nutrition in school children is a risk factor for the development of nutrition-related non-communicable diseases later in life. Adequate nutrition during conflict and in humanitarian emergency settings is one of the crucial factors in the prevention of malnutrition and growth deficiencies in school children. This narrative review aims to outline the issue of malnutrition in children living in conflict settings and humanitarian crises and to highlight the relationship between school meals and children’s health.


Pain management for injured children: A manual for low resource settings
Emma Fisher, University of Bath
Abstract

The Paediatric Blast Injury Field Manual was launched in 2019 outlining medical care from pre-hospital to rehabilitation for injured children. However, there was little information within this manual on how to manage pain in children with major trauma, and psychosocial information was scarce. Pain management is a critical part of care for injured children. Pain is often not assessed and under-treated, which can lead to long-term negative outcomes such as the development of chronic pain, higher disability, and increased distress. Effectively treating pain in children with major trauma from war-inflicted injuries is critical. In this project, we are creating a sister manual providing pain management guidance for low resource settings such as those currently or recently experiencing war. We have received feedback from Gazan medics of the context in which they are currently delivering care and the need for resources. We provide guidance using a biopsychosocial approach, emphasising strategies that can be easily implemented to improve pain management and comfort for children with injuries within these contexts. Whilst the manual is written for healthcare providers, we provide skills for caregivers to help manage their child’s pain. Following the Blast Injury Field Manual, we will provide guidance from pre-hospital through to rehabilitation, and provide overview guidance on chronic pain management. Pain assessment strategies, the role of culture, and neurodiversity are also covered within the manual. The manual will be translated and launched in November 2024 at the WISH Summit in Qatar.


How Do Social Justice Appraisals Impact Chronic Pain Development in Adolescents Injured due to War Based Violence?
Anna Gibby, University of Bath
Abstract

Over 400 million children live in warzones globally and are highly susceptible to major injury and significant pain as a result. How people conceptualise and cognitively appraise their pain determines its impact on their lives. Social justice appraisal, or the ‘evaluation of fairness’, is one such cognitive appraisal undertaken by individuals who develop pain after sustaining accidental injury. Perceived injustice is one type of social justice appraisal, in which the individual frames the severity of their loss as the result of an unfair injury or accident. In adults, perceived injustice has been found to increase pain severity, intensity and chronicity, as well as promote pain-related behaviours, including pain catastrophising.

With civilian war-based injuries being among the most unjust causes of pain, and the recent conflict in Gaza resulting in thousands of injured children, research efforts must be directed towards understanding the association between social justice appraisals and the pain experience in this population.

The central aim of this PhD is to determine the influence of social justice appraisals on pain experienced as a consequence of wartime injury in adolescents.

We aim to achieve this in several ways, including reviewing existing literature and exploring established paediatric datasets, as well as collecting primary data through fieldwork with refugee populations in the form of interviews and questionnaires. We hope this combination of methods will help us to identify and better understand the mechanisms that contribute to adverse pain outcomes in this population, including severity, disability, impact and chronicity.

A more comprehensive understanding will help identify treatment targets for healthcare professionals to better manage pain in children injured by conflict and prevent them from developing long-term pain.

Affiliated with Centre for Pain Research/University of Bath


The potential of AI and Augmented Reality technology to support capacity strengthening initiatives for health providers treating children with paediatric blast injury.
Rathi Guhadasan, Save the Children
Abstract

Children injured in conflict pose specific problems for those trying to treat them. Paediatricians are often not trained in the specific management of conflict trauma, such as blast injuries, while first responders, emergency physicians and surgeons may be experienced at managing this type of trauma in adults but do not know the specific adjustments needed to treat children safely and effectively, leading to excess paediatric mortality and disability.
UK-based experts have developed resources and face-to-face courses in paediatric blast injury management, but the nature of conflict settings makes these challenging to deliver. Augmented reality can help providers access and search valuable resources in real time and can support local trainers to deliver scenario-based training, with low-fidelity mannequins easily and cheaply enhanced to lend authenticity to such sessions and to extend the domains in which participants can learn and be tested. In this presentation, we will demonstrate some of these techniques and share results from user testing of pilot models developed specifically for capacity strengthening in the field of paediatric blast injury.


Panel Discussion

Back to main programme (Thursday 11th July)


Session 3A

Musculoskeletal Injury and Osseointegration


Measurement of gluteal size, hip extensor strength and gait biomechanics evaluated by a 6MWT demonstrates preserved physical performance in young military veterans with lower limb amputations 10-15 years post-battlefield trauma
Jose Manuel Frias Bocanegra, Loughborough University
Abstract

Introduction: Rehabilitation pathways to improve outcomes in the veteran amputee population are important. Late pathologies in the intact limbs may lead to further disability. To date, no comprehensive study has quantified key parameters and indicators of MSK pathology in this cohort using clinically applied gait tests. This study addresses that gap.
Method: ADVANCE cohort unilateral transtibial (UTT, n=12), unilateral transfemoral (UTF, n=9) and bilateral transfemoral (BTF, n=8) amputees were matched to able-bodied controls (ABC, n=9). Participants were tested using a 20-camera 3D motion capture system and 6 force plates, allowing analysis of spatiotemporal parameters and hip joint kinematics and kinetics during a 6-Minute Walking Test (6MWT). An isokinetic dynamometer was employed to measure hip extensor strength/torque. 3T MRI scans permitted quantification of gluteal muscle and adipose tissue.
Results: Overall performance of the 4 groups was similar. However, significant differences included: lower peak torque in BTF than on the intact UTF side; higher peak torque on the UTT residual side than UTF; and higher peak torques in ABC than on the UTF and BTF residual sides. UTF showed higher torque on the intact side. MRI analysis showed BTF had high gluteal muscle and fat symmetry; UTF had higher intact-side muscle thicknesses and higher fat on the residual side; only UTT displayed higher residual-side gluteus medius muscle thickness. 6MWT motion capture gait analysis showed greater total distance covered and walking speed in controls compared to UTF and BTF, and UTT showed higher values for both parameters than BTF. Ground reaction forces showed significant differences in gait cycle events for UTT, UTF and BTF.
Conclusion: Comprehensive rehabilitation protocols and state-of-the-art prosthetics have allowed this population to maintain excellent physical performance. Comparisons within groups show opportunities for improvement.
We thank the participants and staff at Stanford Hall who helped with this ADVANCE study.
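
Between-limb comparisons of the kind reported above (peak torques, gluteal muscle thickness) are often summarised with a simple symmetry index. The sketch below uses hypothetical peak hip-extensor torques, not study values:

    def symmetry_index(intact: float, residual: float) -> float:
        """Classic symmetry index (%): 0 means perfectly symmetric limbs."""
        return 100.0 * (intact - residual) / (0.5 * (intact + residual))

    # Example with hypothetical peak hip-extensor torques (Nm)
    print(symmetry_index(intact=165.0, residual=142.0))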


Assessment of Thorax Response and Impact Location Dependence for Behind Armor Blunt Trauma
Duane Cronin, University of Waterloo
Abstract

Behind Armor Blunt Trauma (BABT) injuries can result from non-perforating ballistic impacts on thoracic armor. At present, the evaluation of soft armor performance involves subjecting it to various impacts, including shot-to-edge and shot-to-shot tests. Clay backing is utilized to measure the Back Face Deformation (BFD), where 44 mm is often used as a threshold. In a recent study, the correlation between BFDs and injuries was investigated using a computational human body model (HBM) to gain insights into injury tolerance and to re-create real-world BABT cases, which resulted in good, but conservative, injury predictions relative to the medical reports. It was noted that impacts occurred at various locations on protective body armour, but location is not considered in contemporary injury thresholds.

In this study, a thorax model, enhanced for high-rate deformation, was used to assess soft armor BABT for varying impact locations and BFDs. Twenty-four cases were simulated for three severities and applied to the thorax in a grid pattern covering low and high compliance regions spanning the sternum, ribs, and costal cartilage. Whole thoracic and locationally segregated injury risk curves (IRCs) were calculated for deformation- and kinetic-based metrics. Then the variability and prediction accuracy of the methods were evaluated using receiver-operating characteristic (ROC) area-under-the-curve (AUC) analysis.

The number of rib fractures, percent pulmonary contusion by volume, and corresponding injury rank, increased with increasing BFD. It was found that deformation-based IRCs generally had lesser variability and better accuracy than the kinetic-based IRCs. When impacts of like locations were grouped, it was found that location had a marked effect on the deformation-based IRCs, generally decreasing variability and increasing accuracy relative to the whole thorax IRC, suggesting that different regions of the thorax may have varying deformation tolerance levels.
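
An injury risk curve of this kind can be viewed as a probability-of-injury model fitted to a candidate metric, with ROC AUC quantifying how well that metric separates injurious from non-injurious cases. A minimal sketch using scikit-learn follows; the BFD values and injury outcomes are invented for illustration, not the simulation results.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Hypothetical cases: candidate metric (e.g. BFD in mm) and injury outcome (0/1)
    bfd_mm  = np.array([[22.0], [28.0], [31.0], [35.0], [38.0], [40.0], [44.0], [47.0], [52.0], [58.0]])
    injured = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

    # Injury risk curve as a logistic fit of injury probability against the metric
    irc = LogisticRegression().fit(bfd_mm, injured)
    risk = irc.predict_proba(bfd_mm)[:, 1]

    # Prediction accuracy summarised as ROC area under the curve
    print(roc_auc_score(injured, risk))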


Can Advanced Ankle-Foot Prostheses Mitigate Risk for Musculoskeletal Conditions after Lower Limb Loss?
Brad Hendershot, Extremity Trauma and Amputation Center of Excellence, Walter Reed National Military Medical Center (EACE-WRNMMC)
Abstract

Preserving musculoskeletal (MSK) health after limb loss is critical for maximizing long-term outcomes. Given the role of biomechanical risk factors for low back pain (LBP) and knee osteoarthritis (KOA), two highly prevalent post-amputation MSK conditions, we characterized trunk-pelvis and (contralateral) knee joint mechanics among nine persons with unilateral transtibial limb loss (7 male/2 female, 40±8yr, 179±5cm, 89.7±14.1kg, 63±96mo since amputation) while walking with three different types of ankle-foot prostheses. Each completed a full-body gait evaluation, at three targeted walking speeds, after 1 week each with an energy storing and returning (ESR), ESR with articulation (ART), and powered (POW) device. Several biomechanical outcomes associated with risk for LBP (trunk-pelvis ranges of motion [ROM], peak lumbar moments) and KOA (peaks in vertical ground reaction force, external knee adduction moment, and medial condylar force) were evaluated across ankle-foot prosthesis types and walking speeds using linear mixed models (p<0.05). Despite a foot type-speed interaction in pelvis axial rotation ROM (p=0.001), with the POW device resulting in smaller increases in pelvis ROM at faster walking speeds than the ESR and ART devices, there were no between-device differences in trunk ROM or peak lumbar moments (p>0.14). Similarly, there were no differences in knee outcomes by foot type, nor interactions with walking speed (p>0.07). While these findings suggest more advanced prosthetic feet may not moderate biomechanical risk factors for LBP and KOA during walking, future work should extend outside the laboratory to various environments and activities of daily living.

Acknowledgments: Supported by DoD Award #W81XWH-17-2-0014. The views expressed are those of the authors, and do not reflect the official policies of the U.S. Departments of Defense, Veterans Affairs, nor the U.S. Government. The authors thank J.R.G., J.M.C., and D.V.H. for their contributions to data collection & processing.
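
The foot type by walking speed analysis described above is a repeated-measures design, commonly handled with a linear mixed model that includes a random intercept per participant. A minimal sketch with simulated data follows; the variable names, speeds and effect sizes are assumptions for illustration, not the study dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: 9 participants x 3 foot types x 3 walking speeds
    rng = np.random.default_rng(3)
    subjects = np.repeat(np.arange(9), 9)
    foot  = np.tile(np.repeat(["ESR", "ART", "POW"], 3), 9)
    speed = np.tile([1.0, 1.3, 1.6], 27)
    pelvis_rom = 8 + 2 * speed + rng.normal(0, 1, subjects.size)   # simulated outcome

    df = pd.DataFrame({"subject": subjects, "foot": foot, "speed": speed, "pelvis_rom": pelvis_rom})

    # Fixed foot-type x speed interaction, random intercept per participant
    m = smf.mixedlm("pelvis_rom ~ C(foot) * speed", df, groups=df["subject"]).fit()
    print(m.summary())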


Walking Before and After Unilateral Transfemoral Osseointegration: Loading Implications for Contralateral Knee Health
Courtney Butowicz, EACE-WRNMMC
Abstract

Mobility after limb loss is critical for maximizing independence and quality of life. In persons with unilateral transfemoral limb loss (UTFLL) using socket-suspended prostheses, compromised mobility is often accompanied by an elevated risk for secondary musculoskeletal conditions related to altered limb loading (e.g., contralateral knee pain and osteoarthritis [OA]). Osseointegration (OI), through skeletal connection of a terminal prosthesis, aims to improve mobility; however, corresponding influences on contralateral knee outcomes remain unclear. We compared contralateral knee loads during walking, at multiple speeds, before and 24 months post-OI in ten persons with UTFLL (9M/1F, mean±SD age: 39±12yr, stature: 175±12cm, pre-OI body mass: 86.3±18.6kg, 99±50mo from amputation to OI). Biomechanical outcomes associated with knee OA risk (peaks in vertical ground reaction force [vGRF] and external knee adduction moment [EKAM]) were evaluated using linear mixed models with a fixed effect of OI and a covariate of walking speed, determined per stride (p<0.05). Peak EKAM increased as walking speed increased (p<0.001) but was similar post-OI (p=0.318). Peak vGRF increased post-OI (p<0.0001), particularly at faster walking speeds (p<0.0001). The lack of EKAM change despite an increased vGRF suggests proximal adaptations (i.e., increased lateral trunk lean) likely reduce the moment arm at the knee, perhaps reducing risk for knee OA at the expense of other joints (e.g., the spine). While greater mobility after OI is a clinical goal, further work is needed to capture movement strategies more broadly (i.e., across activities and over time), complementing a comprehensive approach to evaluating both near- and long-term outcomes of OI.

Acknowledgements: Supported by W81XWH-17-2-0060. The views expressed are those of the authors and do not necessarily reflect the official policy of the Uniformed Services University of the Health Sciences, Department of Defense, nor the US Government.


Back to main programme (Thursday 11th July)


Session 4A

Living with Blast and Conflict Injury and the Lived Experience


The ADVANCE Study: a prospective, longitudinal cohort study investigating the medical and psychosocial outcomes of UK combat casualties from the Afghanistan war. Complete baseline results.
Alexander Bennet, ADVANCE
Abstract

The ADVANCE study is a longitudinal cohort study evaluating the effect of combat trauma (CT) on health indicators in military personnel who served in the UK-Afghanistan War (2003–2014). The cohort (579 male adult UK combat veterans with CT) was frequency-matched to 565 uninjured men. At baseline, mean (±SD) age was 34.1±5.4 years and mean time from injury/deployment was 8.3±2.1 years.
Cardiovascular: CT is associated with an increased prevalence of metabolic syndrome and arterial stiffness, co-influenced by age, injury severity, physical activity and socioeconomic status. CT, traumatic amputation and its physical deficit are associated with lower coronary flow reserve and additional subclinical cardiovascular disease (CVD) risk, and are independently associated with increased 10-year CVD risk.
Musculoskeletal: Bone mineral density (BMD) was lower at the femoral neck of the amputated limb, with a greater reduction for above-knee than below-knee amputees. Similarities in spine BMD and activity levels between amputees and controls suggest that these bone health changes are mechanically driven rather than systemic. Radiographic measures of knee osteoarthritis (KOA) were worse for amputees than for the injured non-amputated (INA) group, yet participant-reported KOA scores were worse for the INA group. Injured participants (without knee injury or amputation) had 1.74-fold higher odds of radiographic KOA than uninjured participants, suggesting an influence of major CT on KOA.
Mental health: The rates of PTSD, depression, anxiety and associated multimorbidity were greater in the injured than in the uninjured group. Specifically, the amputation-injury subgroup showed minimal differences in the odds of reporting poor mental health outcomes, whereas the INA subgroup showed double the odds.
Conclusions: There are clear detrimental mental and physical health outcomes 8 years post-CT. The amputee subgroup is distinct, as its patient-reported outcomes were typically better than those of the INA participants and frequently not different from the control group.


Improving initial management of the injured at Ghanaian district and regional hospitals with a Trauma Intake Form: Implementation Outcomes
Aldina Mesic, Imperial College London
Abstract

Globally, injuries are a leading cause of death, especially in low- and middle-income countries. Although trauma care improvements have reduced mortality rates, efforts in these countries mainly target tertiary centers rather than smaller hospitals closer to the scene of injuries. Enhancing initial assessment and care provision at the primary access point to the health system could significantly enhance outcomes for those injured in resource-constrained, conflict-affected, or fragile settings.

In this study, a standardized trauma intake form (TIF) with real-time clinical decision support prompts was developed for non-specialized providers in eight non-tertiary hospitals in Ghana. The TIF led to a notable decrease in mortality rate from 17.7% to 12.1% among seriously injured patients and improved 14 of 16 trauma care key performance indicators for all patients seeking care.

Understanding the implementation of this intervention is crucial for scaling it up in Ghana and beyond. In this study, we conducted a mixed-methods evaluation with 241 clinicians and found that the uptake of the TIF varied among facilities, with some showing high adoption rates while others lagged behind. Despite generally positive perceptions of the TIF’s acceptability and feasibility, concerns about time constraints were noted. Suggestions for improvement included making the form shorter, more user-friendly, and tailored to different clinician types.

Overall, the study demonstrates the effectiveness of the TIF in improving clinical outcomes and its perceived acceptability. However, addressing issues such as limited uptake in some facilities requires tailoring the form to suit the needs of smaller, low-volume facilities. Lessons learned from this study are relevant for improving trauma care provision in similar resource-constrained and fragile settings.


Blast Injury in Operation HERRICK (2002-2014): A Review of the ADVANCE Injury Data
Sarah Dixon-Smith, Imperial College London
Abstract

Over the 11 years of operations in Afghanistan, more than 2,400 British military personnel were wounded in action, 70% sustaining blast injury. Advances in trauma care and evacuation meant that many casualties who may have died of wounds in previous conflicts survived, creating a new cohort of ‘unexpected survivors’ with life-changing trauma. Whilst the short-term outcomes of these injuries appear to have been “favourable”, the full extent of the long-term physical health outcomes is currently unknown. The ArmeD SerVices TrAuma RehabilitatioN OutComE (ADVANCE) study, a prospective 20-year cohort investigation, aims to address this gap by examining the long-term impacts of conflict wounds in ~600 British military personnel injured in Afghanistan, matched with an uninjured comparison group. To date, studies published from ADVANCE have focused on specific areas of injury, and no baseline or descriptive analysis of injury has been carried out.

This paper will present an analysis of injury data recorded in the UK’s Joint Theatre Trauma Registry for the 538 injured ADVANCE participants to determine trends in combat injury characteristics and stages of blast injury. In the published literature, blast is often presented as four distinct stages, each with specific causation and subsequent injury, and investigated in isolation rather than as an interplay of complex polytrauma. This study will explore how far this is applicable in practice, with the hypothesis that blast injuries do not present in such discrete stages, but instead as multiple, non-sequential and interacting injuries, which overlap and continue to interact over the long term, requiring specialist planned rehabilitation and interventions.

Results from this study will contribute to other areas of investigation in ADVANCE and become a foundational analysis for future work into the impact of conflict injury on physical health, provision of protective equipment, and prehospital trauma interventions, amongst others.


Association of serum biomarkers with early radiographic knee osteoarthritis, knee pain and function in a young, trauma-exposed population – findings from the ADVANCE study
Joanne Stocks, University of Nottingham
Abstract

Objective

Identifying osteoarthritis (OA) using molecular biomarkers is a drug-discovery and patient-care priority given the significant burden of pain, symptoms and morbidity. The ArmeD SerVices TrAuma RehabilitatioN OutComE (ADVANCE) study is investigating long-term combat-injury outcomes; this substudy aims to understand the association of preselected candidate biomarkers of early OA with knee radiographic OA (rOA), knee pain and function in this high-risk population for post-traumatic OA.

Design

ADVANCE compares combat-injured participants with age, rank, deployment and job-role frequency-matched uninjured participants. Immunoassay-measured serum biomarkers, knee radiographs, knee injury and osteoarthritis outcome scale (KOOS), and six-minute walk-tests (6MWT) are reported at baseline. Univariate, regression and correlation analyses were performed and adjusted for age, body mass, socioeconomic status, and ethnicity.

Results

1145 male participants were recruited, aged 34.1±5.4 years and 8.9±2.2 years from injury (n=579 trauma-exposed, of which traumatic amputation n=161) or deployment (n=566 matched). Cartilage oligomeric matrix protein (COMP) was significantly higher in the combat-injured group (267.22 ug/l v 254.05 ug/l, p=0.01), and significantly lower with traumatic amputation (197.26 ug/l, p<0.001), decreasing with the number of amputations (p<0.001). Leptin was higher (p=0.005) and adiponectin lower (p=0.017) in those with versus without knee pain, associated with increased risks of 22% and 17% for knee pain, and 46% and 34% for painful rOA, respectively. There were no significant differences between those with rOA in the trauma-exposed and unexposed groups.

Conclusions

The most notable findings of this large, unique study are the similarities between those developing OA regardless of trauma-exposure, the differences in COMP levels related to combat-injury and traumatic-amputation, and the relationship between adipokines and pain.
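
The adjusted associations reported above (for example, the odds of knee pain per adipokine after accounting for age and body mass) correspond to covariate-adjusted regression models. A minimal sketch with simulated data follows; the variable names and values are illustrative assumptions, not the ADVANCE dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 1145
    df = pd.DataFrame({
        "knee_pain": rng.integers(0, 2, n),          # 0/1 outcome
        "leptin": rng.lognormal(1.0, 0.5, n),        # adipokine levels (arbitrary units)
        "adiponectin": rng.lognormal(2.0, 0.4, n),
        "age": rng.normal(34, 5, n),
        "body_mass": rng.normal(85, 12, n),
    })

    # Adjusted logistic regression: odds of knee pain per unit adipokine, controlling for covariates
    m = smf.logit("knee_pain ~ leptin + adiponectin + age + body_mass", data=df).fit(disp=0)
    print(np.exp(m.params))   # exponentiated coefficients = odds ratios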


Post First World War Conflict Disability in Industrial South Wales.
Beth Griffiths, Swansea University
Abstract

This paper will argue that the experience of veterans living with conflict injuries in industrial south Wales differed from that of veterans in non-industrial areas of the United Kingdom, a subject not previously addressed. By the end of the First World War, Britain had 75,000 men ‘permanently disabled’, of whom 41,000 had lost one or more limbs. All these men experienced evacuation, treatment, rehabilitation, and reintegration into a post-war world. While these elements were common to injured soldiers, life as a disabled veteran became an individual experience dependent upon the results of these four stages alongside personal circumstances. The disabled veterans of previous wars had been deemed destined for a life of unemployment, poverty, and beggary. However, south Wales, dominated by the coal, steel, and tinplate industries, was a region with a long history of civilian disability. This economic context, alongside the paternalistic attitudes of company owners and union strength, led to stronger community and kinship networks. These networks were so integral that the YMCA and British Legion in Wales were more involved in the lives of veterans than in other parts of Britain and facilitated their reintegration into the regional society and economy.
This paper will explore this process of reintegration in south Wales and examine the extent to which disabled veterans were able to meet societal and cultural norms surrounding masculinity. Masculinity was a constant theme in the lives of disabled veterans, and the financial security of employment was an important factor enabling them to fulfil their role as a man. Archival sources, census, pension, and military records have contributed to the compilation of a sample of one hundred and forty-seven men affected by conflict injuries who returned to south Wales. Using reintegration into employment and society as measures of masculinity, this paper will draw conclusions about how successfully they returned to society.


Inequalities in injury prevalence and the availability of emergency care: a multi-country district-level analysis of the Demographic Health Surveys
Aldina Mesic, Imperial College London
Abstract

Injuries are a leading cause of morbidity and mortality, accounting for an estimated 520 million injuries and 4.5 million deaths annually. Despite mortality reductions over the past three decades, substantial regional, national, and sub-national disparities persist, with nearly 90% of deaths occurring in low- and middle-income countries (LMICs). One of the key contributing factors to this burden is inadequate access to pre-hospital and emergency care in LMICs. A study of African countries found that 29% of people were located more than 2 hours from the nearest facility, falling short of the global standard. This is despite projections indicating that access to appropriate and timely emergency services would lead to a 36% reduction in disability and a 45% reduction in mortality in LMICs.
We aimed to assess the spatial distribution of injuries and emergency care services in three LMICs (Kenya, Nepal, and Uganda) using Demographic Health Survey data at the household and facility level. We descriptively report the prevalence of road traffic and other-cause (e.g., violence) injuries by sex, age, and type at the district level. We use the Service Provision Assessment to classify facilities into four categories: Level A (facilities with 24-hour emergency services), Level B (facilities with resuscitative capabilities), Level C (tertiary care facilities), and Level X (facilities with insufficient emergency care capacity). In an exploratory analysis, we also assess the location of conflict deaths in relation to emergency care capacity. This study seeks to shed light on the distribution of injuries and emergency care services, recognizing the critical role they play in addressing this global health challenge. Through our analysis of data from Kenya, Nepal, and Uganda, we aim to provide insights that can guide more effective allocation of emergency care resources, ultimately contributing to improved health outcomes and reduced mortality rates in these regions.
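
The facility classification described above is essentially a rule-based mapping from surveyed capabilities to a level. The sketch below uses three simplified boolean criteria as assumptions; it is not the full Service Provision Assessment logic.

    def classify_facility(open_24h: bool, has_resuscitation: bool, is_tertiary: bool) -> str:
        """Simplified mapping to the four levels described in the abstract."""
        if is_tertiary:
            return "C"   # tertiary care facility
        if open_24h and has_resuscitation:
            return "A"   # 24-hour emergency services
        if has_resuscitation:
            return "B"   # resuscitative capability without 24-hour cover
        return "X"       # insufficient emergency care capacity

    print(classify_facility(open_24h=False, has_resuscitation=True, is_tertiary=False))  # "B"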


Back to main programme (Thursday 11th July)


Session 5A

Blast and Conflict Neurotrauma – Treatment and Rehabilitation


Thresholds of Subconcussive Blast Exposure Associated with Differences in Brain Structure and Function
Jared Rowland, Salisbury Veterans Affairs Healthcare System
Abstract

Introduction: A high proportion of military personnel are exposed to primary blast forces as part of training and/or combat. Blast forces have been shown to affect brain structure and function. Most service-related blast exposures are considered subconcussive and do not result in overt clinical symptoms. No standards exist to identify when blast exposure will increase the risk of negative outcomes. This proposal presents a subconcussive blast exposure profile associated with alterations in brain structure and function.
Method: 107 US Military Veterans without history of military-related traumatic brain injury (TBI) were included in analyses. Blast exposure was evaluated using the Salisbury Blast Interview. Outcomes were hippocampal brain volume and functional brain connectomes.
Results: There were no effects of maximum blast severity or frequency of exposure on brain volume or function. There were significant effects of average exposure severity on functional connectome metrics, including Rich Club organisation, connection frequency and gamma connections, as well as on hippocampal volume. There were clear thresholds at each level of severity. Finally, a profile of subconcussive blast exposure was developed using a combination of exposures across severities that identified differences in brain function and structure.
Discussion: These results demonstrate that negative outcomes are unlikely to occur after a single subconcussive event, owing to its lower severity. Similarly, a high number of exposures to low-severity events was unrelated to outcomes. Repeated exposure to sufficiently severe events is required for neurological effects to be observed following subconcussive blast exposure. This is in contrast to blast exposure more generally, where severity and close-range exposures are the strongest predictors. Applicable thresholds based on a blast exposure profile can identify individuals at risk of effects of subconcussive blast on brain structure or function.
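
One way to operationalise such an exposure profile is as a set of exposure-count thresholds, one per severity level, with risk flagged when any level is met or exceeded. The sketch below uses invented thresholds purely for illustration; they are not the values derived in the study.

    # Illustrative (not study-derived) minimum exposure counts per severity level
    THRESHOLDS = {"low": 500, "moderate": 50, "high": 5}

    def exceeds_profile(exposure_counts: dict) -> bool:
        """True if any severity level meets or exceeds its threshold count."""
        return any(exposure_counts.get(level, 0) >= n for level, n in THRESHOLDS.items())

    print(exceeds_profile({"low": 120, "moderate": 60, "high": 1}))   # True (moderate level)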


Repetitive, but not single, mild blast neurotrauma results in persisting neurological deficits and selective cortical loss in a novel rat model
Robert Dickinson, Imperial College London
Abstract

Repeated mild blast neurotrauma/traumatic brain injury (mbTBI) is common in combat soldiers and in the training of Special Forces. There is evidence to suggest that repeated exposure to a mild or subthreshold blast can cause serious and long-lasting impairments, but the underlying mechanisms are unclear. We describe and characterise the effects of single and tightly-coupled repeated mbTBI in a novel rat model, using male Sprague Dawley rats exposed to blast shockwaves generated with a shock tube. The primary study outcomes are neurological function (unconsciousness, neuroscore, weight loss, RotaRod performance) and neuronal density in brain regions associated with sensorimotor function. We show that exposure to a single shockwave does not result in functional impairment or histologic injury, consistent with a mild or subthreshold injury. However, exposure to three tightly coupled shockwaves results in unconsciousness together with persistent neurological impairment. Significant neuronal loss following repeated blast was observed in the motor cortex, somatosensory cortex, auditory cortex and the amygdala. Our study identifies specific brain regions particularly sensitive to repeated mbTBI. The reasons for the sensitivity of specific regions to injury may include their exposure to less attenuated shockwaves, or proximity to tissue-density transitions, and merit further investigation. Our model can be used to elucidate mechanisms of sensitisation to injury, determine the temporal window of sensitivity of brain parenchyma, aid biomarker discovery and evaluate novel treatments.


Volume differences in blast-related mild TBI with cognitive implications in service members and veterans: a LIMBIC-CENC study
Emily Dennis, University of Utah
Abstract

Blast-related mild traumatic brain injuries (brTBI) have been called the “signature injury” of the recent conflicts in Iraq and Afghanistan, with more than 100,000 injuries sustained by service members between 2001 and 2018 and many more “subthreshold” exposures. Blast-related TBI is thought to have some unique mechanisms resulting from the over- and underpressure of the blast. Leveraging the multi-site LIMBIC-CENC sample, we sought to identify alterations in brain structure in individuals with blast-related TBI, and to determine if these alterations related to cognitive function.
Using a structured interview, lifetime history of all possible concussive events (PCEs) was assessed. Every PCE was classified as mTBI versus not mTBI, as occurring during deployment or outside deployment, and as blast-related or non-blast, based on the mechanism of injury. Participants also completed a battery of cognitive tests, including the Trail Making Test (TMT), the Wechsler Adult Intelligence Scale 4th Edition (WAIS-IV) Processing Speed Index (PSI), and the WAIS-IV Digit Span subtest for working memory. We used tensor-based morphometry (TBM) to create Jacobian determinant images containing voxelwise volume information.
Comparing individuals with a history of blast-related TBI to those with no history of brTBI (including both unexposed individuals and those with impact-only TBI), we found smaller volumes in the brTBI group in the bilateral superior corona radiata and in bilateral clusters centred on the globus pallidus, some of the stiffest tissues in the brain. These effects remained after covarying for PTSD, MDD, and problematic alcohol use, and remained when comparing the brTBI group to non-blast deployment TBI. The volumes of these clusters were significantly associated with TMT-B completion time and Digit Span performance in the blast-related TBI group (ps < .03) and, across the whole sample, significantly mediated the association between brTBI and cognitive performance.
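
The mediation result above (cluster volume carrying part of the brTBI-cognition association) is typically estimated as an indirect effect a*b with a bootstrap confidence interval. A minimal sketch with simulated data follows; the variables and effect sizes are assumptions, not the LIMBIC-CENC sample.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated data: exposure (brTBI), mediator (regional volume), outcome (processing speed)
    rng = np.random.default_rng(5)
    n = 500
    brtbi  = rng.integers(0, 2, n)
    volume = -0.4 * brtbi + rng.normal(0, 1, n)
    speed  = 0.5 * volume - 0.1 * brtbi + rng.normal(0, 1, n)
    df = pd.DataFrame({"brtbi": brtbi, "volume": volume, "speed": speed})

    def indirect_effect(d: pd.DataFrame) -> float:
        """a*b: effect of brTBI on volume times effect of volume on speed (adjusting for brTBI)."""
        a = smf.ols("volume ~ brtbi", d).fit().params["brtbi"]
        b = smf.ols("speed ~ volume + brtbi", d).fit().params["volume"]
        return a * b

    # Percentile bootstrap for the indirect effect's confidence interval
    boot = [indirect_effect(df.sample(n, replace=True)) for _ in range(500)]
    print(indirect_effect(df), np.percentile(boot, [2.5, 97.5]))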


Relationship Between Structural and Functional Network Connectivity Changes for Patients with Brain Injury and Chronic Health Symptoms
Maheen Adamson, WOMEN, VA Palo Alto Healthcare System/Neurosurgery, Stanford University School of Medicine
Abstract

Background. TBI is a frequent cause of disability. Structural (SC) and functional (FC) connectivity were used to evaluate network properties in TBI. The aim of the study was to test for differences in SC and FC between TBI patients and controls.

Methods. 46 participants were divided into 3 groups: a control group (CG), n=13; TBI patients without chronic symptoms (TBIncs), n=16; and TBI patients with self-reported chronic symptoms (TBIcs), n=17. For each participant, one high-resolution T1W image, two DWI scans and one resting state functional MRI (rsfMRI) scan were acquired. T1W anatomical images were processed using FreeSurfer, which provides 34 cortical parcels per hemisphere from the Desikan-Killiany (DK) parcellation. DWIs were processed using MRtrix3, and SC data were collected for all connections between the 68 parcels. rsfMRI data were processed using the CONN toolbox, and FC was obtained for the same DK parcels. SC and FC were compared between groups, with the Benjamini-Hochberg algorithm applied to perform false discovery rate (FDR) correction.

Results. The correlation between SC and FC was 11.5% and 11.9% stronger for TBIncs and TBIcs, respectively, compared to CG. SC reduction was observed in 4 parcels and 6 parcel clusters for TBIcs, but in only one cluster for TBIncs, compared to CG. FC reduction was observed in only one cluster for TBIncs, but in one parcel and two parcel clusters for TBIcs, compared to CG.

Conclusions. Abnormal FC may be the result of damage to specific functional areas, or damage to the SC between functional areas. Combined assessment of SC and FC may provide a predictive model for clinical outcomes.
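To illustrate the false discovery rate correction named in the methods, the sketch below implements the Benjamini-Hochberg step-up procedure on made-up p-values; it is not the study's processing pipeline.

    import numpy as np

    def benjamini_hochberg(p_values, q=0.05):
        """Return a boolean mask of p-values declared significant at FDR level q."""
        p = np.asarray(p_values, dtype=float)
        m = p.size
        order = np.argsort(p)                         # ascending p-values
        thresholds = q * (np.arange(1, m + 1) / m)    # BH critical values i/m * q
        below = p[order] <= thresholds
        k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
        mask = np.zeros(m, dtype=bool)
        mask[order[:k]] = True                        # reject the k smallest p-values
        return mask

    # e.g. connection-wise p-values from group comparisons of SC or FC edges
    print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))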


Xenon as a neuroprotective treatment for traumatic brain injury
Robert Dickinson, Imperial College London
Abstract

Military traumatic brain injury (TBI) can be categorized into non-blast TBI (blunt or penetrating injury without an explosion) and blast TBI, consisting of ‘primary blast injury’ due to blast wave exposure, ‘secondary blast injury’ resulting from blunt or penetrating injury from fragments, and ‘tertiary blast injury’ due to acceleration/deceleration forces. Survivors of TBI have an increased risk of death and are more likely to develop dementia or cognitive impairment in later life. Preclinical in vitro and in vivo models play an important role in understanding TBI pathophysiology and in the evaluation of novel therapies. There are few preclinical neuroprotection studies examining very long-term outcomes and survival after TBI.

Xenon is an elemental noble gas used medically as a general anaesthetic and in medical imaging. Following the discovery that xenon inhibits NMDA receptors and activates certain types of potassium channel, there has been growing interest in investigating the efficacy of xenon as a treatment for acquired brain injuries. Early attention focussed on brain ischemia, and xenon has already undergone successful early clinical trials for ischemic brain injury after out-of-hospital cardiac arrest. More recently, xenon has been evaluated as a neuroprotectant in preclinical models of blunt and blast TBI. This presentation will summarise recent preclinical studies investigating the efficacy and mechanism of xenon as a neuroprotectant for TBI.


Questions & Answers

Back to main programme (Friday 12th July)


Session 5B

Blast Force Protection


Behaviour of synthetic gelatine as a human tissue surrogate for blast protection applications
Emma Osborne, University of Sheffield
Abstract

The need for further research into the transmission of blast effects through the human body is motivated by the devastating physical harm caused by explosions. Organic gelatine, with a concentration of 10% at 4°C, is often used as a soft tissue simulant but has limitations (e.g. storage requirements, variability and shelf life [1]). Synthetic gelatine overcomes these limitations and displays consistent material properties under quasi-static compression [2]. However, the blast wave transmission properties of synthetic gelatine were previously unknown.

This paper presents recent results from preliminary experiments on the transmission of blast loading through 10% synthetic gelatine (from Clear Ballistics) under far-field blast conditions. Previously, this behaviour has been explored experimentally using pressure transducers. However, this has been shown to produce inconsistent and noisy results. To address these problems, a novel experimental set-up was developed.

A 305 mm diameter, 10 mm thick gelatine disc was adhered to the front face of a 1 mm thick 1050 H14 aluminium alloy panel. A black and white speckle pattern was applied to the rear surface of the panel to visualise its transient out-of-plane displacement. The blast load was generated by detonating PE4 discs at a 650 mm stand-off from the front surface of the gelatine. Stereo-imaging and digital image correlation were used to measure the transient displacement. Comparisons between panel responses with and without gelatine are used to infer the effect of the gelatine on blast load transmission.

[1] J. Read, R. Hazael and R. Critchley, Soft Tissue Simulants for Survivability Assessment—A Sustainability Focussed Review, J. Appl. Sci., Vol. 12 (2022) 4954.
[2] E.L. Clarke, R.J. Curry, G.S. Langdon, S.D. Clarke, J.P. Tyler, Method Development for Compression Testing of Synthetic Ballistic Gelatine. Cranfield Online Research Data. (2024).


Design and performance evaluation of a hollow pyramid structure filled with liquid for blast mitigation
Yuan Li, Northwestern Polytechnical University
Abstract

A shock wave protection structure for personnel must not only limit back-plate deflection and transmitted force, but also allow for wearing comfort and ease of movement. Because of these strict design requirements, current personal protection against blast shock waves is still insufficient. This study designs a new personnel shock wave protection structure by combining the respective advantages of V-shaped structures and liquid-filled structures. Shock wave protection was investigated for the designed, additively manufactured liquid-filled pyramid structure through shock tube testing and numerical simulation. A nylon tube filled with silicone rubber was used to simulate the human body and was fixed at the nozzle of a shock tube. The designed structure was placed in front of the silicone rubber, and the pressure curves measured at the shock tube nozzle (incident pressure) and inside the silicone rubber (transmitted pressure) were compared to analyze the protective effect. Test results show that, compared to the unprotected case, the liquid-filled hollow pyramid structure attenuated overpressure by 26.0% and impulse by 13.1%. They also show that with more plugs, more liquid mass leaked and the overpressure attenuation improved. On this basis, the energy of the plugs and liquid, the motion of the pyramid shell structure and related quantities were studied in detail through simulation, and the shock wave protection mechanism of the liquid-filled pyramid structure was clarified. There is a balance between the pressure attenuation provided by energy dispersion and offset and the pressure enhancement caused by cone-valley focusing. These results can provide an important reference for improving blast protection for personnel.


Pressure dependent material characterization of ovine lung parenchyma for applications in finite element modelling
Patricia Thomas, Wake Forest University
Abstract

Finite element models (FEM) are an informative tool in the study of blast injury, as they give additional insight into the mechanical parameters that may lead to injury. Given the body of experimental studies using ovine models, computational analogs of this species are a potentially useful tool for FEM-based injury criteria development [1]. However, no published ovine FEMs utilize ovine-specific lung material properties, which we propose are critical for pulmonary injury criteria development. Lung parenchyma models also do not typically consider the pressure of air within the lungs, which varies with breathing and is thought to affect injury risk in blast applications [2]. Therefore, in this study we characterize ovine pulmonary tissue at a range of pressures over the breathing cycle. Intact ovine lung samples were obtained immediately after sacrifice, and testing was completed within 8 hours (N=7). The two largest lobes were used for spherical indentation testing (N=59). Samples were loaded at 20 mm/s to a depth of 10 mm and held for 120 seconds. For each test, the lungs were inflated to one of three pressures relevant to the breathing cycle: 0, 4, and 10 mmHg. The resulting data were analyzed using a MATLAB code to fit a 3-parameter Maxwell model. ANOVA tests were run on the instantaneous and relaxed shear moduli. The mean instantaneous and relaxed shear moduli across all the data were 1.94 and 1.05 kPa, respectively, which is within the range of published rodent lung properties of 0.6–7 kPa [3]. These results indicate that lung pressure over the breathing cycle may not affect the properties of the lung tissue, although further study may be needed given the near-significant findings.
References: [1] Stuhmiller JH. Blast Injury, Translating Research into Operational Medicine. 2008. [2] Clemedson C-J. Blast injury. 1956. [3] Andrikakou P et al. Sci Rep. 2016.
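To illustrate the kind of relaxation fit described in the methods (performed in MATLAB in the study), the sketch below fits a 3-parameter Maxwell (standard linear solid) curve to synthetic hold-phase data; the values are chosen only to loosely resemble the reported moduli and are not the actual measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    def sls_relaxation(t, g0, g_inf, tau):
        """Shear relaxation modulus (kPa) of a 3-parameter Maxwell / standard linear solid model."""
        return g_inf + (g0 - g_inf) * np.exp(-t / tau)

    # synthetic "hold phase" data standing in for indentation-derived shear modulus
    t = np.linspace(0, 120, 200)                       # 120 s hold, as in the protocol
    g_true = sls_relaxation(t, 1.9, 1.05, 8.0)         # hypothetical parameters
    g_meas = g_true + np.random.default_rng(1).normal(0, 0.02, t.size)

    popt, _ = curve_fit(sls_relaxation, t, g_meas, p0=[2.0, 1.0, 5.0])
    g0, g_inf, tau = popt
    print(f"instantaneous modulus ~ {g0:.2f} kPa, relaxed ~ {g_inf:.2f} kPa, tau ~ {tau:.1f} s")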


Effect of wearing a soft ballistic vest on thoracic blast injury
Johanna Boutillier, French-German Research Institute of Saint-Louis
Abstract

When designing new protective equipment for soldiers and law enforcement officers, the blast threat is not taken into account; the main focus is often on protection against ballistic, knife and fragment threats. Primary blast injuries mainly concern air-filled organs such as the lungs and the gastrointestinal tract, and studies have shown that some thoracic protective equipment (TPE) can worsen the level of injury.
An ISL anthropomorphic mannequin, called BOPMAN for Blast OverPressure MANnequin, was used to evaluate the efficiency of a soft ballistic TPE against blast threats of increasing intensities. Using the developed methodology, both qualitative (better or worse than) and quantitative (lung injury risk estimation) evaluations are possible.
Scenarios ranging from 85 g of C4 detonating at 3.8 m from the mannequin to 4 kg of C4 at 3 m were performed unprotected and with a soft ballistic vest. Results show a near-constant amplification factor of 1.3±0.2 on BOPMAN measurements with the vest compared to the unprotected measurements. Estimated lung injury risks indicate that scenarios that should not generate lung injury when unprotected can be injurious with a soft thoracic protection. High-speed videos also showed that the TPE slapped the mannequin’s chest when the shock wave arrived; the amplification could be the result of this slap, but further investigation is needed to confirm this hypothesis.
The BOPMAN thorax was adapted into a laboratory version in order to understand the observed blast amplification behind a soft ballistic vest and to propose improved solutions. A comparison of the metrics obtained unprotected or equipped with a soft ballistic pack showed a good correlation between both models.
These devices, both experimental and numerical, are intended to be used for the evaluation of thoracic protective systems (prototype samples or complete protections) against blast threats, with an estimation of the injury risk for the dismounted soldier or the law enforcement officer.


The Further Development and Improvement Upon the Finite Element Ovine Thorax Model (FE-OTM) for Use in High-Rate Non-Penetrating Blunt Impacts and Blast Waves
Juliette Caffrey, Wake Forest School of Medicine
Abstract

We present the Finite Element Ovine Thorax Model v2.0 for high-rate non-penetrating blunt impact (NPBI) and study its suitability for blast environments. This model includes anatomy from C5 to L4, airways and blood vessels (to a diameter of 3 mm), and rib geometry based on CT scan data with a resolution of 1 mm [1]. Constitutive material models include ovine-specific adipose tissue (general viscoelastic), hide (Ogden rubber), costal cartilage (Ogden rubber), cortical rib (piecewise linear plasticity) and others sourced from the literature [2]. A series of 27 impacts at 40 to 70 m/s with hemispherical and flat impactor heads was run to represent the NPBI environment. An additional simulation was run to test the model in the blast environment, based on experimental work by Johnson et al.: the model was placed in an enclosed 3 m x 2.4 m x 2.4 m compartment with 0.57 kg of TNT located in a corner at a 2 m standoff distance [3]. The model was stable in all NPBI conditions run, exhibited no non-physical motion, and showed less than 5% hourglass energy. Preliminary analysis showed that the peak force for the NPBI impacts ranged from 5 to 20 kN, which is comparable to experimental data (3 to 15 kN). The model successfully implemented a surface-to-surface tiebreak contact between the lungs and chest wall in the blast environment, for an impact with lung pressures up to 1 MPa, indicating the potential to study resulting lung strains and pressure propagation. Results showed that the model is robust enough for validation in the high-rate NPBI environment, the environment for which it was constructed; however, further development may be needed to extend it to blast conditions. Once validated, the model will serve to elucidate the injury biomechanics by linking metrics such as tissue strain to observed injury outcomes.
References: 1. Caffrey, J.M. PLOS ONE (2023). 2. Thomas, P.K. Ann Biomed Eng (2023). 3. Johnson, D.L., DTIC, ADA275038 (1993).


Back to main programme (Friday 12th July)


Session 6A

Lightning Talks

Evaluation of veteran characteristics presenting to Op RESTORE, The Veterans Physical Health and Wellbeing Service
Amrutha Bhaskaran, Imperial College London
Abstract

The physical health needs of UK military veterans have not been well characterized. Op RESTORE is the first specific provision in the UK for veterans with physical health challenges; it combines physical and mental health expertise with integrated third-sector support to deliver holistic care. Since 2016, Op RESTORE has cared for more than 1,000 patients. This study aimed to analyse the Op RESTORE database to understand the characteristics veterans present with (patient characteristics, injury characteristics and complexity). The database was analysed for patient characteristics (gender, age, military service) and characteristics of the case (physical and mental health conditions), with subgroup analysis by gender comparing various physical and mental health conditions. The analysis showed a male preponderance of veterans (91%), with most veterans in the 35-50 age group (43%). Single-site issues (54%) were more common than poly-site issues (45%). 64% of veterans had isolated physical health conditions and 34.3% had both physical and mental health conditions. Injuries were grouped into clinical areas: the majority of both male and female patients had musculoskeletal (MSK) injuries (>80%), followed by neurological pathology (6.7%). Within MSK, 39% of veterans had lower extremity pathology, 29% spinal pathology and 16.7% upper extremity pathology, while 3.5% had a combination of spinal, upper limb and lower limb pathology. This is the first detailed analysis of UK veterans’ physical health needs. MSK pathology was the dominant health need, with the knee and back the most commonly affected sites. EQ-5D-5L analysis showed that 77% of veterans had improved health-related quality of life between referral and discharge.


Blast Fragment Penetrating Injury: Models for Prediction and Prevention
Thuy-Tien Nguyen, Imperial College London
Abstract

Blast-fragment penetrating injury is the most common injury from an explosive event. It is caused by objects such as shrapnel from the explosive device, or debris and soil ejecta from the vicinity, that are energised to high velocities by the explosion. Our studies aim to investigate this injury mechanism in soft and skeletal tissues of the torso, pelvis and lower extremities, in order to quantify injury risk and assess protective strategies.
Ballistic impacts by a range of projectiles of various shapes and sizes as well as soil ejecta were performed using a bespoke gas-gun system. High-speed photography and fluoroscopic systems were used to observe the impact event and tissue response. Survivability analyses were conducted to quantify the risk of specific injuries.
We quantified the response of different tissues and organs to ballistic threats and the behaviour of the fragment during and post-impact, and report the impact velocity thresholds with a 50% probability of a given injury outcome. These findings were subsequently used to identify suitable simulants for consistent and well-controlled experimental models for injury studies and protection assessments. Our results can also be used to improve the accuracy of numerical tools that predict the injury outcome of explosive events.
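To illustrate how an impact velocity threshold with 50% injury probability (a V50) can be estimated, the sketch below fits a logistic dose-response curve by maximum likelihood to synthetic velocity-outcome data; it is not the authors' survivability analysis and all values are hypothetical.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    def fit_v50(velocity, injured):
        """Fit P(injury) = expit(b0 + b1*v) by maximum likelihood and return V50 = -b0/b1."""
        v = np.asarray(velocity, float)
        y = np.asarray(injured, float)
        def nll(beta):
            p = expit(beta[0] + beta[1] * v)
            eps = 1e-9                            # guard against log(0)
            return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
        res = minimize(nll, x0=np.array([0.0, 0.01]), method="Nelder-Mead")
        b0, b1 = res.x
        return -b0 / b1                           # velocity at which P(injury) = 0.5

    # hypothetical impact velocities (m/s) and binary injury outcomes
    velocity = np.array([60, 70, 80, 90, 100, 110, 120, 130], dtype=float)
    injured  = np.array([ 0,  0,  0,  1,   0,   1,   1,   1], dtype=float)
    print(f"Estimated V50 ~ {fit_v50(velocity, injured):.1f} m/s")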


Identifying effective interventions to maintain bone health in lower limb amputees
Linjie Wang, Imperial College London
Abstract

Introduction
Bone degradation occurs in active veteran lower limb amputees [1], with above-knee amputees (AKA) more severely affected than through-knee amputees (TKA). Bone mineral density (BMD) decreases are not systemic, occurring mainly in the femur on the amputated side, and are suspected to be due to reduced mechanical stimulus. To explore this, a combined musculoskeletal (MSK) and finite element (FE) modelling framework with a bone adaptation algorithm [2] was utilised for AKA and TKA subjects and an able-bodied (control) subject.
Methodology
MSK models adapted from [3] were used to simulate five daily activities (walking, stand-to-sit, sit-to-stand, stair-ascent, and stair-descent). Muscle forces and joint contact forces (JCFs) were calculated and exported into FE models for bone adaptation simulations.
Results
MSK simulations showed AKA and TKA subjects preferentially loaded the intact leg for all activities, with loading asymmetry less apparent for walking. Reduced ground reaction forces resulted in altered hip and prosthesis knee JCFs on the amputated side for both subjects. FE models predicted bone degradation, in agreement with clinical studies [1, 4], with reduced cortical thickness in the diaphysis and reduced trabecular Young’s modulus in the proximal femur for AKA and to a lesser extent for TKA.
Conclusion
The hypothesis that localized BMD reduction in lower limb amputees is due to reduced mechanical stimulus is supported. Localized unloading osteopenia occurs in AKA due to lack of end loading. Bone degradation occurs in TKA, despite preservation of end loading, due to loss of muscles crossing the knee. The developed modelling framework allows assessment of potential interventions such as socket design, direct fixation, and tailored activity regimes to prevent and mitigate localized bone degradation in amputees.
References
1. McMenemy et al., JBMR 38:1227-1233, 2023
2. Phillips et al., Int Biomech 2:43-61, 2015
3. Favier et al., CMBBE 24:1310-1325, 2021
4. Sherk et al., JBMR 23:1449-1457, 2008
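For illustration only, the sketch below shows a generic, textbook-style strain-energy-driven density update of the kind used in iterative bone adaptation; it is not the specific algorithm of Phillips et al. [2], and all parameter values are hypothetical.

    import numpy as np

    def remodelling_step(rho, sed, k_ref=0.004, lazy=0.5, rate=1.0, dt=1.0,
                         rho_min=0.01, rho_max=1.8):
        """One explicit update of element densities rho (g/cm^3) from strain energy density sed (MPa)."""
        stimulus = sed / rho                     # strain energy per unit mass
        drho = np.zeros_like(rho)
        hi = stimulus > (1 + lazy) * k_ref       # overloaded elements gain density
        lo = stimulus < (1 - lazy) * k_ref       # underloaded elements lose density
        drho[hi] = rate * (stimulus[hi] - (1 + lazy) * k_ref)
        drho[lo] = rate * (stimulus[lo] - (1 - lazy) * k_ref)
        return np.clip(rho + dt * drho, rho_min, rho_max)

    # one iteration with made-up element values (a real loop would re-run the FE solve
    # and the five activity load cases between updates)
    rho = np.array([1.2, 1.2, 1.2])
    sed = np.array([0.002, 0.005, 0.012])        # e.g. averaged over walking, stairs, sit-to-stand
    print(remodelling_step(rho, sed))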


Questions & Answers

Occupational blast exposure measurements and simulations
Hugo van Duijnhoven, Netherlands Organisation for Applied Scientific Research
Abstract

Introduction: Repetitive blast exposure over a longer period of time may cause (mild) traumatic brain injury. Due to the growing body of evidence of repetitive blast-induced mTBI, TNO is investigating the influence of blast on the human brain. A relevant challenge within this topic is determining occupational blast overpressure exposure in different training scenarios. Blast overpressure exposure depends on, for example, the type of weapon system being trained with (explosive detonations or firing of projectiles), the proximity to the blast source and the surroundings.
Goal: To gain insight into the occupational blast overpressure exposure from various weapon systems and to store the data in an accessible exposure database.
Methods: Measurements were performed for explosive breaching, 60 mm mortar and 155 mm artillery firing. Peak incident overpressure and peak impulse were captured using blast pencils. Additionally, computational fluid dynamics (CFD) simulations were performed to gain insight into scenario-based blast overpressure. Using CFD simulations validated against experimental in-field measurements, both the proximity to the blast source and the surroundings can be modified, which offers a great advantage when investigating different training scenarios.
Results: Peak pressures between 5 and 20 kPa were measured for the breachers. In the 60 mm mortar measurements, the gunner was exposed to ~10 kPa and the ammunition handler to ~15 kPa. Peak overpressures in the 155 mm artillery measurements were ~30 kPa for conditions with the doors open and ~5 kPa within the vehicle with the doors closed.
After calibration, CFD simulations of the breaching scenarios produced peak pressures and peak impulses similar to the experimental measurements. Simulation results for the 60 mm mortar and 155 mm artillery will follow shortly.
Conclusion: Blast overpressure was measured for 3 weapon systems and is currently being simulated for 1, providing additional insight into personnel blast exposure during training.
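As a minimal illustration of the two metrics captured by the blast pencils, the sketch below evaluates peak overpressure and positive-phase impulse for an idealised Friedlander waveform; the waveform parameters are hypothetical and do not represent the TNO measurements or CFD results.

    import numpy as np

    def friedlander(t, p_peak, t_pos, b=1.0):
        # incident overpressure (kPa) at time t (ms) for an idealised Friedlander wave
        return np.where(t < t_pos, p_peak * (1.0 - t / t_pos) * np.exp(-b * t / t_pos), 0.0)

    t = np.linspace(0.0, 10.0, 2001)               # time base, ms
    p = friedlander(t, p_peak=15.0, t_pos=4.0)     # hypothetical ~15 kPa wave

    peak = float(p.max())                                            # kPa
    impulse = float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(t)))     # kPa*ms (trapezoidal rule)
    print(f"peak overpressure ~ {peak:.1f} kPa, positive-phase impulse ~ {impulse:.1f} kPa*ms")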


Addressing poor design in life-or-death trauma: Towards an ISO standard for tourniquets
Angus Clark, Imperial College London
Abstract

Emergency tourniquets are the last line of defence against significant blood loss following a traumatic incident where direct pressure is unsuccessful. While their recommended use is under constant discussion [1] due to poor usage and resulting unnecessary limb loss [2], their ability to achieve haemostasis in major limb trauma undoubtedly and consistently saves lives [3]. Despite the incredibly high performance and reliability requirements of these life-saving devices, there is currently no internationally agreed standard for emergency tourniquet design or performance, resulting in significantly varied performance across designs. In addition, the increasing prevalence of conflicts in low-resource environments has led to the use of improvised tourniquets, which demonstrate reduced effectiveness, increased pain and increased risk of tissue damage [4]. To address this, this project aims to propose an international ISO standard for emergency tourniquets through the identification of clear and detailed design requirements and standardised performance tests, alongside the parallel development of the necessary testing equipment. Design requirements were generated and categorised using a comprehensive literature review, and corresponding verification protocols were selected to guarantee standardised measurement.

References:

  1. Navein, J., R. Coupland, and R. Dunn, The tourniquet controversy. J Trauma, 2003. 54(5 Suppl): p. S219-20.
  2. Kragh Jr, J.F., et al., Minor morbidity with emergency tourniquet use to stop bleeding in severe limb trauma: research, history, and reconciling advocates and abolitionists. Military medicine, 2011. 176(7): p. 817-823.
  3. Kragh, J.F., Jr., et al., Battle casualty survival with emergency tourniquet use to stop limb bleeding. J Emerg Med, 2011. 41(6): p. 590-7.
  4. Cornelissen, M.P., et al., The safety and efficacy of improvised tourniquets in life-threatening hemorrhage: a systematic review. Eur J Trauma Emerg Surg, 2020. 46(3): p. 531-538.

10 litres via facemask: poison or a breath of fresh air?
Paul Ransom, HALO Trust
Abstract

Current standard treatment for severely injured patients is to administer high-flow oxygen (e.g. 15 litres/min) via facemask (ATLS, NICE/BNF, WHO Essential Skills, Standards in Pre-Hospital Care).
Military medicine in the US, UK and Scandinavia has questioned this practice over the past decade, stemming partly from logistical problems providing oxygen close to the point of injury.
Hyperoxia is damaging to cells due to the production of reactive oxygen species from mitochondria and other sources and supplemental oxygen use is already discouraged for strokes, heart attacks and severe brain injury.
Subgroups of patients with chest trauma, blast lung or inhalational injury present additional problems. Current guidelines for these groups are almost universally in favour of supplemental oxygen, yet there is some evidence that hyperoxia and supplemental oxygen may in fact harm even this group of patients. Humanitarian demining medicine is bound by IMAS (International Mine Action Standards) guidelines as well as national regulation, which often mandate administering oxygen in ambulances or at the scene of the injury. We are reluctant to adopt this practice due to the lack of evidence that it is beneficial to severely traumatised patients. In our retrospective review of mortality from explosive ordnance injury in HALO over 25 years, an experienced emergency doctor and military nurse could find no case where supplemental oxygen is likely to have preserved or prolonged life. Additional concerns include explosive risks to oxygen cylinders from mine accidents or bullets, difficulties in accurate patient monitoring, and supply, logistics and maintenance issues with cylinders and monitors.

We would welcome and be happy to share additional research into the use of pre-hospital and in-hospital oxygen delivery to trauma patients, which would allow rational use of this drug for trauma in remote-area medicine as well as in emergency rooms and operating theatres.


Questions & Answers

Conversion of an in-silico porcine model of phosgene-induced lung injury to human physiology predicts a longer therapeutic window for beneficial application of continuous positive airway pressure ventilation
Declan Bates, University of Warwick
Abstract

A recent study developed the first computational model of the pathophysiology of phosgene-induced lung injury in porcine subjects [1]. Data from experiments performed in several cohorts of large juvenile female pigs (111 data points from 37 subjects), including individual arterial blood gas readings, respiratory rate and heart rate, were used to develop the computational model. Close matches were observed between model outputs and the experimental data, for both terminally anaesthetised and conscious subjects. The model was applied to investigate continuous positive airway pressure (CPAP) as a pre-hospital treatment when initiated at different time points post LD50 exposure. The model predicted that clinically relevant benefits are obtained only when CPAP is initiated within 8 h after exposure.

To investigate the relevance of this to human casualties, we converted the in silico porcine model to human physiology. Parameters that were altered in the model to translate it to human physiology included weight, total blood volume, heart rate, cardiac output, respiratory rate, lung residual volume, functional residual capacity, inspiratory reserve volume, total capacity and tidal volume. We applied the same protocol as in our previous investigation [1], which simulated the application of ambient-air CPAP at different time points post-exposure and at varying pressure levels. Simulations were run until a steady state was reached in all cardio-pulmonary parameters and PaO2, PaCO2, shunt fraction and end-expiratory lung volume were recorded. In contrast to the in silico porcine model, the in silico human model predicts that clinically relevant benefits are obtained even when application of CPAP is initiated up to 18 hours post-exposure.

This new model can be used as a tool for conducting rapid and cost-effective investigations into treatment of chemical lung injury.

References:

[1] Mistry S, et al, 2024. Toxicology Letters, 391, pp. 45-54.


Exploring perspectives of Invictus Games active esport participation among military personnel with physical and/or psychological illnesses and injuries and their families
Celina Shirazipour, Cedars-Sinai Medical Center
Abstract

Background: Adapted sport can be an important component of rehabilitation following blast injury. However, numerous barriers exist, including availability and accessibility of quality programming. One potential solution is active esports.
Purpose: Explore perspectives of active esport participation among military personnel with physical and/or psychological illnesses and injuries and their families. The current study specifically focused on individuals with access to Invictus Games Foundation active esport programming, which included virtual cycling events, rowing events, and sports leagues.
Methods: Military personnel with illnesses and injuries (n=16; 4 women; ages 29-66) and their family (n=4; 3 women, ages 37-67) representing 5 countries (Australia, Canada, France, United Kingdom, United States of America) participated in virtual interviews. Data were analyzed using inductive thematic analysis.
Results: Participants who had engaged in active esport identified benefits including: (1) increased access to sport recovery; (2) enjoyable competition; and (3) valuable social interactions. Active esport also had features that, for some, made it preferable to in-person sport, particularly scheduling flexibility and the opportunity to engage in sport-based recovery while still being present for family. Notably, those who had access to active esport programming but had not participated had contradictory perceptions of active esport as lacking these benefits.
Conclusion: Findings provide insight regarding active esports as a potential component of recovery programs. The opposing perceptions of those who had participated in active esports compared to those who had not participated highlight the need for greater understanding of active esports, particularly the diversity of activities available. Researchers should continue to explore the potential of active esports for expanding the reach and impact of sport recovery programming.


Questions & Answers

Poster Session

Delayed xenon treatment prevents injury development following blast-traumatic brain injury in vitro
Robert Dickinson, Imperial College London
Abstract

Background. Blast traumatic brain injury is a ‘signature injury’ of recent military operations, and is recognized as having a unique pathophysiology. Despite increased research effort, there are currently no clinically effective treatments to limit the development of ongoing brain injury following blast. Xenon is a noble gas shown to be neuroprotective in models of blunt traumatic brain injury. To evaluate the efficacy of xenon as a potential neuroprotectant in blast TBI we developed a novel in vitro model of blast-induced TBI.

Methods. Organotypic hippocampal slice cultures (OHSCs) were prepared from mouse pups. Tissue culture inserts were sealed in sterile sample bags pre-filled with warmed (37°C) oxygenated medium. A shock-tube was used to generate Friedlander-type blast waves. The sample bag was positioned vertically in front of the shock-tube with the inserts positioned perpendicular to the axis of the shock tube with the OHSCs exposed to a single shockwave. Sham slices were treated identically, but without blast. Following blast the inserts were carefully transferred to culture plates, and placed in a custom-made chamber. One hour after blast, xenon (50% atm) or control gas (helium) was applied, and injury was quantified by measuring propidium iodide fluorescence at 24 h, 48 h and 72 h.

Results. Xenon had a protective effect against blast trauma at all time-points measured. Injury in the xenon-treated slices was reduced by 47 ± 12% (p<0.01) at 24 hours after blast; at 48 hours after blast, injury was reduced by 31 ± 7% (p<0.05); at 72 hours after blast, injury was reduced by 39 ± 7% (p<0.001). Injury in the xenon-treated blast-injured slices at 24 hours and 72 hours was not significantly different from that in uninjured sham slices.

Conclusion. Xenon treatment starting 1 hour after trauma, limits injury progression following blast-induced traumatic brain injury in vitro. These findings support the idea that xenon could be used as a novel treatment for blast-induced TBI.


What is low-level blast? The question that plagues the academic community
Cory McEvoy, USASOC
Abstract

Special Operations Forces service members across the globe are exposed to blast and overpressure due to an extremely high training and operational tempo. The effect of cumulative sub-concussive blasts on short- and long-term neurological function is relatively unknown. Chronic neurological injury from low-level blast remains unproven, partly because the academic community lacks agreed definitions of the force levels involved. Low-level blast has varying definitions: there are open-source US Army memos stating that soldiers’ exposures need to be limited to under 4 psi. While this may be a noble undertaking, what is the scientific rationale? Additionally, the US National Defense Authorization Act of 2023 mandates that blast overpressure be monitored through wearable sensors, but what does this mandate accomplish without reasonable and established threshold limits? As a scientific community, it is critical to define the variables before undertaking time-intensive research. A cursory search of PubMed returns over 1,900 results for “Low-Level Blast”, and many of these articles either fail to state the pressure threshold they define as low-level, or show a large spread in the force considered low-level: definitions range from 2 psi to 20 psi, while others consider only impulse levels. The ability to define low-level blast specifically for military exposure hazards is critically important in determining what may lead to injury and the follow-on preventative care. Service members today deserve scientists working together in the same direction towards defining a relatively simple variable in this incredibly complex biomechanical problem.


Measurement and prediction of human injury: an introduction to Dstl’s injury analysis and modelling team
Gregory James, Dstl
Abstract

Dstl’s Injury Analysis and Modelling Team collaborates with international governmental agencies, industry and academia to assess, develop, use and maintain a range of injury models for the effects of militarily relevant weapons against the person. We are responsible for providing advice on the use of injury models within Defence and the measurement of injurious mechanisms to ensure the residual risks are understood. The injury models are capable of measuring primary, secondary, tertiary and quaternary injury mechanisms.
In addition to the injury models we also document casualties sustained on combat operations by combining anonymised Abbreviated Injury Scale coded injury data with operational and intelligence reporting. This enables us to assess the efficacy of system changes such as new Personal Protective Equipment or alternative Tactics, Techniques & Procedures. We also monitor threat trends to inform injury model requirements and support medical planning.
Our inputs are crucial to assessing effects of military weapons or terrorist threats on people. Those people may be mounted or dismounted, service personnel or civilians, within or around structures, across land, air or sea domains.
We are part of the advice team that underpins a number of Defence capabilities and helps Defence to understand operational risk. The enhanced representations of human injury we provide benefit areas including Operational Analysis, planning and safety, such as the estimation of collateral damage, ensuring civilians and our allies are not subjected to undue risk caused by UK military action.
Our poster illustrates some models from this blast injury capability for UK Defence and Security.
© Crown copyright (2024), Dstl. This information is licensed under the Open Government Licence v3.0. To view this licence, visit https://www.nationalarchives.gov.uk/doc/open-government-licence/. Any enquiries regarding this publication should be sent to: Dstl.


Pharmacological interventions for blast neurotrauma: a preclinical systematic review and meta-analysis
Eszter Ujvari, Imperial College London
Abstract

Background: Blast neurotrauma is experienced by military personnel and civilians in conflicts but the underlying pathophysiology is not fully understood. Preclinical animal models are playing an important role in investigating the pathophysiology and in evaluating neuroprotective treatments. A variety of pharmacological interventions have been explored using preclinical models, but there are currently no clinically proven neuroprotectants.

Methods: A systematic search of MEDLINE and Embase was carried out. Titles, abstracts and full texts were screened and data extracted, followed by pairwise meta-analyses. Study quality was assessed using a modified CAMARADES risk-of-bias score. Between-study heterogeneity was examined by subgroup analysis, funnel plot asymmetry and Egger’s regression. The protocol was prospectively registered in the Open Science Foundation Registry (https://osf.io/f39k8/).

Results: A total of 60 articles met the inclusion criteria, representing diverse pharmacotherapies falling into 9 broad classifications (anti-inflammatory agents, anticonvulsants, antioxidants, endoplasmic reticulum (ER) stress modulators, GLP-1 agonists, general anaesthetics, microglia modulators, oxygen therapy and peptide hormones). The risk-of-bias scores showed that studies were predominantly of moderate quality (72% moderate, 25% low, 3% high). The meta-analysis showed greater improvement in neurological outcomes with peptide hormones, microglial modulators and ER stress modulators. Substantial between-study heterogeneity was detected.

Conclusions: Our results show a diversity of pharmacotherapies evaluated preclinically as neuroprotective treatments for blast neurotrauma. There is a need for further studies of individual treatments, but some classes of drugs such as peptide hormones, microglial modulators and ER stress modulators appear to be effective in preclinical models and merit further investigation.
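To illustrate the pairwise random-effects pooling described above, the sketch below computes a DerSimonian-Laird pooled effect and between-study variance from synthetic effect sizes; it is not the review's analysis and the numbers are made up.

    import numpy as np

    def dersimonian_laird(effects, variances):
        """Pooled effect, its standard error and tau^2 under a random-effects model."""
        y = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
        y_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q (heterogeneity statistic)
        df = y.size - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                 # between-study variance estimate
        w_re = 1.0 / (v + tau2)                       # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, se, tau2

    # e.g. standardised mean differences for one hypothetical drug class across studies
    effects = [0.9, 0.4, 1.3, 0.2, 0.7]
    variances = [0.10, 0.08, 0.20, 0.05, 0.12]
    print(dersimonian_laird(effects, variances))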


Understanding the response of skeletal muscle to a blast shock wave using a novel prototype shock tube.
Abigail Spear, Dstl

Clinical Treatments using 3D Printing/Metal additive manufacturing for Patients
Swapnil Kumar, Imperial College London
Abstract

Treating accident victims is a critical issue, and to address it, engineers collaborate with medical professionals and hospitals to develop engineering solutions. These solutions include using finite element analysis to adjust the alignment of fractured bones/skulls and utilizing 3D printing or metal additive manufacturing to create implants. In our research, we obtained CT scans of skull/tibia bones damaged in road accidents. We then performed finite element analysis to determine the forces involved in causing the disruption/fracture. Following this analysis, we manufactured implants made from Ti-6Al-4V/SS-316L. These implants were used to repair and correctly orient the damaged skull/tibia bone, ensuring its proper functionality.


Back to main programme (Friday 12th July)


Session 7A

Blast and Conflict Neurotrauma – Prediction and Prevention


Experimental and numerical exposure of a skull substitute to blast conditions
Natacha Elster, French-German Research Institute of Saint Louis
Abstract

Shock waves propagating during an explosion may cause severe blast traumatic brain injury (bTBI), whose injury mechanisms are not yet understood. To evaluate the plausibility of the skull deflection hypothesis, a substitute with a vibratory response akin to a dry human skull was designed. The chosen geometry is a truncated spherical shell, with dimensions selected using numerical tools. The goal of this study was to expose the substitute to free-field blast conditions in order to characterize its mechanical behaviour.

The skull substitute was filled with water and instrumented with strain gauges and internal pressure sensors. The substitute was then subjected to seven exposure configurations with incident pressures between 75 kPa and 200 kPa, for two blast durations of 1.2 ms and 2.0 ms. Because of the limited number of measurement channels, the number of obtained signals is restricted. To overcome these limitations, the LS-Dyna software was employed to replicate and expand the scenarios carried out experimentally.

The time histories of the shell strains and internal pressures were then analysed. Dose-response effects were highlighted for both quantities, while no significant effect of positive phase duration was observed. The results also showed an ipsilateral-contralateral effect for internal pressures: an increase in pressure was measured in the ipsilateral zone, while a negative pressure was recorded in the contralateral zone. In addition, the first peaks and the maximum magnitude of each mechanical quantity were extracted from the numerical results to complement the experimental findings. Linear trends were observed as a function of incident pressure, but no conclusion could yet be drawn regarding the duration of the positive phase.

In the future, a greater number of scenarios will be considered during the numerical analyses to further characterise the mechanical response of the substitute to the blast phenomenon. This approach aims to continue investigating the skull deflection injury mechanism.


Potential for Interaction of Overpressure and Recoil Forces in Long-Range Precision Rifle (LPR) Discharge and Effects on Brain Response
Dilaver Singh, University of Waterloo
Abstract

Operators of long-range precision rifles (LPRs) are exposed to overpressure from rifle discharge as well as head kinematics from the recoil of the rifle. Both overpressure and recoil forces have been postulated to contribute to concussion-like symptoms in operators. Although the effects of overpressure and head kinematics have been investigated separately in the past, the specific loadings during LPR discharge and the potential interaction of these effects have not previously been quantified. In this study, two experiments were performed to measure the overpressures and recoil head kinematics caused by LPR discharge. The overpressures were measured using an instrumented headform on a mannequin, and the recoil head kinematics were measured using human volunteers with instrumented mouthguards. High-speed videos of the experiments enabled estimation of the relative timing between the overpressure wave incidence and the onset of head kinematics. Finite element (FE) models of the head were employed to quantify the effects of the overpressure and recoil kinematics on the brain response in terms of transient intracranial pressure and strain. The experiments demonstrated consistency in the overpressure magnitudes (~20 kPa incident pressure) and timing (3.6 ms after discharge). However, the onset of recoil kinematics showed significant variability between operators, occurring between 11.0 and 28.0 ms after LPR discharge. The FE models predicted that the strain response in the brain largely depended on the recoil kinematics, with low sensitivity to overpressure loading. However, the intracranial pressure response was governed by both overpressure and recoil kinematics, and additionally demonstrated an interaction between them in some cases. This work provides important experimental information on the potential interaction of overpressure and recoil forces in LPR discharge, and insights on the brain response that can inform injury mitigation strategies for operators.


Blast brain injury from Iraq and Afghanistan: the pathology and comparisons between mounted and dismounted fatalities
Emily Ashworth, Imperial College London
Abstract

Background
Previous research has shown that injuries to the head and neck were present in at least 19% of injured personnel from the Iraq and Afghanistan commitments (UK-JTTR). The mechanisms that cause such injuries to the central nervous system are not yet known. The aim of this study was to identify the head and spinal injuries in fatalities due to blast in both the mounted and dismounted cohorts, group these into constellations of injuries, and then develop hypotheses on the causative mechanisms.

Methods
All UK military fatalities from blast who suffered a head injury from 2007-2013 in the Iraq and Afghanistan conflicts were identified retrospectively. Post-mortem CTs (PMCTs), where available, were interrogated for injuries to the head, neck and spine. All injuries were documented and classified using the Society of British Neurosurgeons Brain Injury Classification. Chi-squared and Fisher’s Exact tests were used to reject independence, develop classification of injury constellations, and form a hypothesis for injury mechanisms.

Results
Blast brain injury is heterogeneous; however, injury mechanisms were identified in both cohorts. There were 46 and 71 fatalities from mounted and dismounted blast, respectively, who suffered a head injury and had a PMCT available for analysis. Chi-squared and Fisher’s exact tests showed that independence could be rejected for lateral ventricle blood and injuries to the abdomen and thorax.

Conclusions
Five partially overlapping injury constellations were identified in the mounted cohort, and four mechanisms of injury in the dismounted cohort. Some of these mechanisms overlapped and showed correlation with civilian injuries.

These hypothesised mechanisms can now be investigated to consider mitigation strategies or clinical treatments.


Exploring Neural Correlates of Ibogaine In Special Forces Combat Veterans Through Multimodal Imaging
Maheen Adamson, Stanford University
Abstract

Objective: This analysis sought to identify the neural mechanisms underlying the strong therapeutic results from a recent study that evaluated the safety and clinical impact of ibogaine in treating military veterans with traumatic brain injury (TBI). TBI is a leading cause of disability with sequelae of psychiatric symptoms such as post-traumatic stress disorder (PTSD), major depressive disorder (MDD), and generalized anxiety disorder (GAD).

Methods: We collected arterial spin labeling (ASL) and blood-oxygen-level-dependent (BOLD) functional magnetic resonance imaging (fMRI) data at three time points pre- and post-treatment in 30 Special Operations Veterans (SOV) who had voluntarily enrolled in Tabernanthe iboga exposure at a clinic in Mexico. We used a multimodal whole-brain resting-state exploratory approach, examining changes in regional cerebral blood flow, functional connectivity and network communication, to characterize neural features that were altered post-ibogaine treatment.

Results: Significant changes were identified in blood flow (p<0.001, PFDR<0.05), functional connectivity (p<0.005), and networks of the limbic and sensory-motor system, regions associated with TBI and PTSD. We found associations between neuroimaging findings in the left hemisphere insula, anterior cingulate cortex, and hippocampus-dorsal attention network with clinical measures of disability index and PTSD symptomology.

Conclusions and Relevance: Our novel multimodal neuroimaging approach revealed potential mechanisms underlying the therapeutic benefits of ibogaine for SOV suffering from TBI with comorbid disability and psychiatric symptoms. Further research with larger and more diverse populations would be beneficial to establish the clinical and neuroimaging alterations produced by ibogaine in subjects without lifetime TBIs or combat-induced PTSD.


Changes in brain structure and age in Veterans with TBIs following treatment with Magnesium-Ibogaine
Maheen Adamson, Stanford University
Abstract

Introduction: TBI is common among Veterans of recent conflicts and may lead to a range of symptoms, as well as accelerated brain aging. Ibogaine, a psychoactive alkaloid, has neuroplasticity-promoting properties and may help remodel neural circuitry and improve functioning in Veterans with TBI.

Methods: We conducted an observational study of 30 Veterans with multiple blast TBI (mbTBI) and complex clinical problems who received ibogaine treatment over several hours, preceded and followed by preparation and integration. At baseline, immediately post-treatment and at 1 month, we performed clinical assessments and structural MRI scans. We derived cortical thickness (CT) measures with the ANTs longitudinal CT pipeline and evaluated CT and volume in cortical and subcortical gray matter and in cerebellar ROIs. To evaluate longitudinal changes in CT and volume across ROIs, we used linear mixed effects (LME) models. We used the brainageR algorithm to estimate brain age.

Results: A Wald Χ2 test of regional LME models revealed a significant (pFDR<0.05) effect of study visit on CT in 13 ROIs. Pairwise t-tests demonstrated significant (pholm<0.05) increases in CT following ibogaine relative to the baseline visit in 11 regions. For subcortical volume, Wald Χ2 test of the subcortical LME models revealed a significant (pFDR<0.05) main effect on the log-jacobian determinant in the Right Ventral Diencephalon. Wald Χ2 test of the LMEs revealed a significant change in brain age across time points [Χ2(2)=10.64, p=0.0049]. Post-hoc t-tests gave a significant (pholm<0.05) reduction of 1.60 years in predicted brain age relative to baseline one month after treatment (t=3.18, p=0.0082, d=1.035).

Conclusions: This study provides the first evidence of measurable brain morphometric changes in humans following ibogaine therapy, here in Veterans with mbTBI. More research is needed to fully understand the mechanisms by which ibogaine works and to determine its long-term impact on cortical structure.
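As a simplified stand-in for the per-ROI mixed-model and pairwise testing described above, the sketch below runs Holm-corrected paired t-tests across three visits on synthetic cortical thickness values; the visit labels and values are hypothetical, not the study data.

    import numpy as np
    from scipy.stats import ttest_rel
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(2)
    n = 30
    baseline  = rng.normal(2.50, 0.10, n)             # cortical thickness (mm) in one ROI
    post      = baseline + rng.normal(0.03, 0.05, n)  # immediately post-treatment
    one_month = baseline + rng.normal(0.02, 0.05, n)  # 1-month follow-up

    pairs = {"post vs baseline": (post, baseline),
             "1-month vs baseline": (one_month, baseline),
             "1-month vs post": (one_month, post)}

    p_raw = [ttest_rel(a, b).pvalue for a, b in pairs.values()]
    reject, p_holm, _, _ = multipletests(p_raw, alpha=0.05, method="holm")

    for name, p, ph, r in zip(pairs, p_raw, p_holm, reject):
        print(f"{name}: p={p:.4f}, p_holm={ph:.4f}, significant={r}")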


Considering repeated low-level blast exposure in the context of the current classification of TBI
James Stone, University of Virginia
Abstract

Repeated exposure to low-intensity blast is recognized as a potential cause of significant neurological change in operational personnel. Studies of experienced breachers demonstrate blast-related alterations in brain structure and function, and blast-associated increases in neuroinflammation have been seen in special operators. Repeated low-level blast exposure has eluded categorization within the standard TBI classification systems, which are defined by an acute response to a potentially traumatic insult. Many low-level blast exposures may in fact be considered subconcussive events that do not meet standard definitions of TBI but may result in cumulative neurological deficits over a career.

The current presentation will review the state of the field related to the neurological correlates of repeated low-level blast exposure and will consider these data in the context of current classification schemes for TBI. Data presented will also include new findings from a recently completed, comprehensive study of experienced artillery personnel. In that study, significant associations were seen between the history of artillery exposure and alterations in a variety of neuroimaging metrics. Repeated artillery exposures were associated with significant alterations in white matter regions that are central to coordination, integration and processing. Conversely, total blast exposure, measured by the generalized blast exposure value (GBEV), was relatively more associated with alterations in cortical and resting brain activity measures. These results suggest unique neurological changes as a function of weapon system.

A roadmap will be considered concerning how to develop a schema to characterize and classify the neurological effects of repeated low-level blast exposure to help equip operational leadership and medical personnel with the tools needed to mitigate and treat adverse brain health effects in exposed populations.


Back to main programme (Friday 12th July)