- Project, 2016-2018. Funder: UKRI. Project Code: EP/N018958/1. Funder Contribution: 507,674 GBP. Partners: University of Edinburgh, Wolfram Research Europe Ltd, University of Salford, The Mathworks Ltd, University of London, MICROSOFT RESEARCH LIMITED, 3DS, NAG, University of Sheffield, Maplesoft...
"Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development, with knock-on effects on their productivity and on the reliability, readability and reproducibility of their software [Nature 2015]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals working collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic, constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy, and how best to deploy them, is in itself a specialist domain with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance and load carrying. When optimally performed it results in a researcher who knows which challenges they can attack alone and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities while navigating this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals with a curiosity-driven nature, dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness; it will have a multiplicative effect, since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from, yet complements, the top-down approach provided by the EPSRC-funded Software Sustainability Institute. The outcomes of this fellowship will be:
Local and National RSE Capability: an RSE Group at Sheffield as a credible roadmap for others, pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs.
Scalable software support methods: a scalable approach, based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality guidelines for research software and guidance for researchers on how best to incorporate research software engineering support within their grant proposals.
HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into desktop applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners.
Communication and public understanding: a communication campaign to raise the profile of research software, exploiting high-profile social media and online resources and establishing an informal forum for research software debate.
References:
[Goble 2014] Goble, C. "Better Software, Better Research". IEEE Internet Computing 18(5): 4-8 (2014).
[SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers". http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014).
[Nature 2015] Editorial, "Rule rewrite aims to clean up scientific software". Nature 520(7547), April 2015.
- Project, 2016-2018. Funder: UKRI. Project Code: NE/K00008X/2. Funder Contribution: 42,744 GBP. Partners: INGV (Nat Inst Volcanology and Geophys), Durham University, HSL, University of Bergen, SFU, Willis Limited, Met Office, NOC, FLE, University of London...
Submarine landslides can be far larger than terrestrial landslides, and many generate destructive tsunamis. The Storegga Slide offshore Norway covers an area larger than Scotland and contains enough sediment to cover all of Scotland to a depth of 80 m. This huge slide occurred 8,200 years ago and extends for 800 km down slope. It produced a tsunami with a run-up of >20 m around the Norwegian Sea and 3-8 m on the Scottish mainland. The UK faces few other natural hazards that could cause damage on the scale of a repeat of the Storegga Slide tsunami. The Storegga Slide is not the only huge submarine slide in the Norwegian Sea: published data suggest that there have been at least six such slides in the last 20,000 years. For instance, the Traenadjupet Slide occurred 4,000 years ago and involved ~900 km3 of sediment. Based on a recurrence interval of ~4,000 years (2 events in the last 8,000 years, or 6 events in 20,000 years), there is a 5% probability of a major submarine slide, and possible tsunami, occurring in the next 200 years. Sedimentary deposits in Shetland dated at 1,500 and 5,500 years, in addition to the 8,200-year Storegga deposit, are thought to indicate tsunami impacts and provide evidence that the Arctic tsunami hazard is still poorly understood. Given the potential impact of tsunamis generated by Arctic landslides, we need a rigorous assessment of the hazard they pose to the UK over the next 100-200 years, their potential cost to society, the degree to which existing sea defences protect the UK, and how tsunami hazards could be incorporated into multi-hazard flood risk management. This project is timely because rapid climatic change in the Arctic could increase the risk posed by landslide-tsunamis. Crustal rebound associated with future ice melting may produce larger and more frequent earthquakes, such as the one that probably triggered the Storegga Slide 8,200 years ago. The Arctic is also predicted to undergo particularly rapid warming in the next few decades, which could lead to dissociation of gas hydrates (ice-like compounds of methane and water) in marine sediments, weakening the sediment and potentially increasing the landsliding risk. Our objectives will be achieved through an integrated series of work blocks that examine the frequency of landslides in the Norwegian Sea preserved in the recent geological record, associated tsunami deposits in Shetland, future trends in the frequency and size of earthquakes due to ice melting, slope stability and tsunami generation by landslides, tsunami inundation of the UK, and potential societal costs. This forms a work flow that starts with observations of past landslides and evolves through modelling of their consequences to predicting and costing the consequences of potential future landslides and associated tsunamis. Particular attention will be paid to societal impacts and mitigation strategies, including examination of the effectiveness of current sea defences. This will be achieved through engagement of stakeholders from the start of the project, including government agencies that manage UK flood risk, international bodies responsible for tsunami warning systems, and the re-insurance sector. The main deliverables will be: (i) better understanding of the frequency of past Arctic landslides and the resulting tsunami impact on the UK; (ii) improved models for submarine landslides and associated tsunamis that help to explain why certain landslides cause tsunamis and others don't;
(iii) a single modelling strategy that starts with a coupled landslide-tsunami source, tracks propagation of the tsunami across the Norwegian Sea, and ends with inundation of the UK coast, with tsunami sources of various sizes and origins tested; (iv) a detailed evaluation of the consequences and societal cost to the UK of tsunami flooding, including the effectiveness of existing flood defences; and (v) an assessment of how climate change may alter landslide frequency and thus tsunami risk to the UK.
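The 5% figure above follows from the simple rate-times-window estimate. As a minimal sketch in Python (with a Poisson assumption that is ours, not the proposal's), the more careful calculation gives almost the same answer:

```python
import math

# Recurrence interval of ~4,000 years between major slides (from the text).
rate = 1.0 / 4000.0   # expected events per year
window = 200.0        # forecast horizon in years

# Simple estimate quoted in the abstract: rate x window.
p_simple = rate * window                    # 200 / 4000 = 0.05

# Treating slide occurrence as a Poisson process (our assumption), the
# probability of at least one event in the window is 1 - exp(-rate * window).
p_poisson = 1.0 - math.exp(-rate * window)  # ~0.0488

print(f"simple: {p_simple:.1%}   Poisson: {p_poisson:.1%}")
```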
- Project, 2016-2020. Funder: UKRI. Project Code: NE/P001378/1. Funder Contribution: 396,492 GBP. Partners: University of London, University of Alberta, Swiss Federal Institute of Technology ETH Zürich
Transition zone seismic discontinuities (TZSDs), manifestations of mineral phase transitions and/or compositional changes between the upper mantle and the lower mantle, hold the key to resolving the mystery of mass and heat transport in the Earth's mantle and the long-term evolution of the Earth's interior. However, seismic characterizations of TZSDs are typically incomplete because of limits in the data frequency bandwidth and sensitivity relevant to TZSDs. We introduce a simple, effective and high-resolution probe of mantle discontinuities through examination of broadband forward- and backward-scattered waves in the context of the teleseismic receiver function method. This approach will allow us to comprehensively characterize TZSDs beneath the continents, including properties such as discontinuity topography, sharpness and gradient, shear velocity jump and density jump. To date, no single study has been capable of simultaneously determining these essential seismic properties of the TZSDs. These renewed descriptions of TZSDs will be used to explore outstanding questions, including mineralogical models of the transition zone and the presence of volatiles/melt. In particular, we aim to address how current and past subduction determine short-term and long-term mantle mixing, and whether such a mixing process may in turn shape slab sinking dynamics. A series of outstanding questions can be much better addressed with our new seismic observations: Did long-term mixing over billions of years result in apparent chemical layering, as indicated in geodynamic models? What are the degree and the length scale of lateral heterogeneity if such chemical layering exists? Is it possible that primordial structure may survive long-term mixing and become trapped in the transition zone? Is the transition zone potentially a relatively shallow reservoir for long-term storage and geochemical evolution of basalt? Does chemical layering or large-scale primordial structure dictate slab sinking dynamics? Do modern and ancient subduction recycle water into the deep mantle and transition zone? Does a hydrated transition zone induce convective instability and contribute to intraplate volcanism? In the proposed work, we will use an innovative and effective observation of broadband forward- and backward-scattered waves to provide a comprehensive characterization of TZSDs, including properties such as discontinuity topography, sharpness, velocity and density jumps across the boundaries, and the gradient above/below the discontinuities. These unprecedentedly rich observations will provide renewed constraints on fundamental processes relevant to the Earth's interior and evolution.
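The receiver function method mentioned above deconvolves the vertical component of a teleseismic P arrival from the radial component, isolating conversions and reverberations generated at discontinuities such as those near 410 km and 660 km depth. As a rough illustration of the core step only, here is a minimal sketch of the classic frequency-domain water-level deconvolution; the stabilisation and filter parameters are illustrative assumptions, not the project's actual processing choices:

```python
import numpy as np

def receiver_function(radial, vertical, dt, water_level=0.01, a=1.0):
    """Water-level spectral deconvolution of vertical from radial component.

    radial, vertical : equal-length 1-D arrays of a teleseismic P waveform
    dt               : sample interval in seconds
    water_level      : fraction of peak vertical power used as a floor
    a                : Gaussian low-pass width parameter (rad/s)
    """
    n = len(radial)
    R = np.fft.rfft(radial, n)
    Z = np.fft.rfft(vertical, n)

    # Stabilise the spectral division by flooring the denominator
    # ("water level"), then low-pass with the usual Gaussian filter.
    power = (Z * np.conj(Z)).real
    denom = np.maximum(power, water_level * power.max())
    w = 2.0 * np.pi * np.fft.rfftfreq(n, dt)
    gauss = np.exp(-(w ** 2) / (4.0 * a ** 2))

    return np.fft.irfft(R * np.conj(Z) / denom * gauss, n)
```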
- Project, 2016-2018. Funder: UKRI. Project Code: AH/P008038/1. Funder Contribution: 80,530 GBP. Partners: Inst for Justice & Democracy in Haiti, University of Birmingham, Queen's University Canada, Inst of Social Work & Social Science
As of April 2016, a total of 103,510 uniformed personnel from 123 countries were serving in 16 peacekeeping operations around the world. Where foreign soldiers - during war, occupation or peacekeeping operations - are on foreign soil, military-civilian relations develop, including those between soldiers and local women. Peacekeepers have increasingly been associated with sexual exploitation and abuse of the vulnerable populations they are mandated to protect. Many of the intimate relations between peacekeeping personnel and local women, both voluntary and exploitative in nature, have led to pregnancies and to children being born. These so-called 'peace babies' and their mothers face particular challenges in volatile post-conflict communities, reportedly including childhood adversities as well as stigmatization, discrimination and disproportionate economic and social hardships. This project proposes an in-depth study of the situation of 'peace babies' conceived by personnel from or associated with the United Nations Stabilization Mission in Haiti (MINUSTAH). MINUSTAH is among the missions associated with allegations of misconduct, not least related to sexual and gender-based violence and, consequently, with the unintended legacy of children fathered by UN personnel. The UN has recently acknowledged that 'peacekeeper babies' exist. Yet an evidence base relating to the welfare of children fathered by UN peacekeepers (globally or in Haiti) is virtually non-existent, and it is clear that the existing UN policies and support programs are inadequate. The proposed study addresses this critical knowledge gap through the following original contributions:
- Theoretical contribution: analysing the lack of accountability of the UN and its personnel for children fathered by UN peacekeepers by introducing a victim-centred approach;
- Empirical contributions: i) exploring the gender norms and the socioeconomic, cultural and security circumstances that contribute to unequal power relations between UN personnel and local civilians; ii) mapping the whereabouts of 'peace babies' in Haiti through a situational analysis of the areas surrounding six UN bases and exploring the circumstances around their conceptions; and iii) investigating the life experiences of women raising children fathered by peacekeepers; and
- Methodological contribution: using an innovative mixed quantitative/qualitative research tool, Cognitive Edge's SenseMaker, to provide a more nuanced understanding of these complex issues.
The multidisciplinary collaboration between scholars from the University of Birmingham, Queen's University, Kingston, the Centre of International and Defence Policy, and the Haitian-based Enstiti Travay Sosyal ak Syans Sosyal (ETS), along with civil society organisations, the Institute for Justice and Democracy in Haiti and the Haitian-based Bureau des Avocats Internationaux, will address this knowledge gap and enhance our understanding of the challenges faced by peace babies and their families, as well as the obstacles to accessing support. Beyond the core UK-Canada-Haiti partnership, the project will include further ODA-recipient countries (among others Cambodia, Bosnia, Liberia and the DRC), and a final project conference will apply insights from Haiti to Peace Support Operations (PSO) more generally, in discourse with academic and non-academic participants from other countries with extensive PSO experience.
- Project, 2016-2017. Funder: UKRI. Project Code: EP/P006078/1. Funder Contribution: 333,594 GBP. Partners: Vienna University of Technology, University of Trento, CNR, UBC, Heriot-Watt University, Enshape
Some of the most fundamental and perhaps bizarre processes expected to occur in the vicinity of black holes are out of observational reach. To address this issue we utilise analogue systems, in which we study fluctuations on a background flow that reproduces an effective black hole in the experiment. In the literature this line of research is referred to as analogue models for gravity, or simply analogue gravity. Analogue models provide not only a theoretical but also an experimental framework in which to verify predictions of classical and quantum fields exposed to 'extreme' spacetime geometries, such as rapidly rotating black holes. This project brings together two internationally recognised experts in the field of analogue gravity with the aim of pushing the field in a new direction: we propose ground-breaking studies to mimic some of the bizarre processes occurring in the vicinity of rotating black holes from general relativity, using rotating fluids in both water and optical systems. In particular, we will investigate both theoretically and experimentally the interaction between an input wave and a rotating black hole spacetime geometry, here recreated by the rotating fluid. This allows us to mimic a scattering process associated with rotating black holes called black-hole superradiant scattering (BH-SS). From a historical viewpoint this kind of radiation is the precursor to Hawking radiation. More precisely, black hole superradiance is the scattering of waves from a rotating black hole: if the incoming wave also possesses a small amount of angular momentum, it will be reflected with an increased amplitude, i.e. it is amplified at the expense of the black hole, which thus loses some of its rotational energy. It has also been pointed out that the same physics may take place in very different systems; for example, light incident on a rotating metallic (or absorbing) cylinder may also be amplified upon reflection. Yet no-one has ever attempted to experimentally investigate the underlying physics, which extends beyond general relativity and is relevant to a variety of hydrodynamical and rotating systems. We aim to provide the first ever experimental evidence of this intriguing and fundamental amplification mechanism in two different hydrodynamical systems. The first is a water spout, controlled so that the correct boundary conditions are obtained and optimised for observing BH-SS. The second is a less conventional fluid that is made out of light: light propagating in a special medium can behave as a fluid or even a superfluid. By building upon highly developed photonic technologies, e.g. for the control and measurement of laser beam wavefronts, we will implement very precisely tailored and characterised experiments. One of the unique aspects of this project is the marriage between two very different lab-based systems, one using water and the other using light, to tackle an outstanding problem in physics that is of relevance to astrophysical, hydrodynamical and optical systems.
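For orientation, the textbook statement of the effect described above (a standard result quoted here for context, not taken from the proposal): a mode of frequency $\omega$ and azimuthal number $m$ scattered by a black hole, or analogue horizon, rotating with angular velocity $\Omega_H$ is amplified precisely in the superradiant regime

```latex
0 < \omega < m\,\Omega_H
\quad \Longrightarrow \quad
|R|^2 > 1 ,
```

where $R$ is the reflection coefficient: the reflected wave carries more energy than the incident one, the surplus being extracted from the rotational energy of the (analogue) black hole.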
- Project, 2016-2017. Funder: UKRI. Project Code: AH/N006178/1. Funder Contribution: 26,235 GBP. Partners: University of Cambridge, University of Waterloo (Canada), University of Sussex, University of London, TNA
In recent years we have all become familiar with the notion of information overload, the digital deluge, the information explosion, and numerous variations on this idea. At the heart of this phenomenon is the growth of born-digital big data, a term which encompasses everything from aggregated tweets and Facebook posts to government emails, from the live and archived web to data generated by wearable and household technology. While there has been growing interest in big data and the humanities in recent years, as exhibited notably in the AHRC's digital transformations theme, most academic research in this area has been undertaken by computer scientists and in emerging fields such as social informatics. As yet, there has been no systematic investigation of how humanities researchers are engaging with this new type of primary source, of what tools and methods they might require in order to work more effectively with big data in the future, or of what might constitute a specifically humanities approach to big data research. What kinds of questions will this data allow us to ask and answer? How can we ensure that this material is collected and preserved in such a way that it meets the requirements of humanities researchers? What insights can scholars in the humanities learn from ground-breaking work in the computer and social sciences, and from the archives and libraries that are concerned with securing all of this information? The proposed research Network will bring together researchers and practitioners from all of these stakeholder groups, to discern whether there is a genuine humanities approach to born-digital big data, and to establish how this might inform, complement and draw on other disciplines and practices. Over the course of three workshops - one to be held at The National Archives in Kew, one at the Institute of Historical Research, University of London, and one at the University of Cambridge - the Network will address the current state of the field; establish the most appropriate tools and methods for humanities researchers for whom born-digital material is an important primary source; discuss the ways in which researchers and archives can work together to facilitate big data research; identify the barriers to engagement with big data, particularly in relation to skills; and work to build an engaged and lasting community of interest. The focus of the Network will be on history, but it will also encompass other humanities and social science disciplines, and it will include representatives of non-humanities disciplines, for example the computer, social and information sciences. Cross-disciplinary approaches and collaborative working are essential in such a new and complex area of investigation, and the Network relates to the current highlight notice encouraging the exploration of innovative areas of cross-disciplinary enquiry. While the value of greater engagement between researchers in the humanities and the sciences in developing new approaches to and understandings of born-digital big data has been recognised for some time, only very tentative first steps have been made towards realising this aim (for example, forthcoming activity organised by the Turing Institute). The Network will provide a forum from which to launch precisely this kind of cross-disciplinary discussion, defining a central role for the humanities.
During the 12 months of the project all members of the Network will contribute to a web resource, which will present key themes and ideas to both an academic audience and the interested general public. External experts from government, the media and other relevant sectors will also be invited to contribute, to ensure that the Network takes account of a range of opinions and needs. The exchange of knowledge and experience that takes place at the workshops will also be distilled into a white paper, which will be published under a CC-BY licence in month 12 of the Network.
- Project, 2016-2019. Funder: UKRI. Project Code: NE/M017540/2. Funder Contribution: 284,801 GBP. Partners: Deltares-Delft, UNIMI, Durham University, Utrecht University, MBARI, MUN, BU, BIO, CSIC, Shell International Exploration & Produc...
Turbidity currents are the volumetrically most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux from all of the world's rivers, and such flows form the largest sediment accumulations on Earth (submarine fans). These flows break strategically important seafloor cable networks that carry >95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have from submarine flows in action, in stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly, such that source-to-sink data are needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active - occurring on annual or shorter time scales - where previous work provides a basis for future projects, and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because no comparable network has been established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is also to engage modellers in the design and analysis of monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of systems and the more infrequent types of flow. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. This proposal is timely because of recent efforts to develop novel technology for monitoring flows that holds great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and destroy sensors placed on traditional moorings on the seafloor. It includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Key preliminary data are lacking in some test sites, such as detailed bathymetric base-maps or seismic datasets.
Our final objective is to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC grant to monitor flows in Monterey Canyon in 2014-2017, and to a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and a NERC Research Programme Consortium award. He is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.
- Project, 2016-2017. Funder: UKRI. Project Code: ES/N007883/1. Funder Contribution: 523,273 GBP. Partners: KCL, University of Montreal, DCU, SEOUL NATIONAL UNIVERSITY
In a globalised economic and business context, the norms that shape human resource management travel internationally. This is particularly the case within the multinational company, where individuals are responsible for the creation, diffusion, interpretation and negotiation of norms - which may be rules, principles or guidelines - across international operations. We refer to such individuals as "globalizing actors". The aim of our research is to identify the resources mobilized by globalizing actors in the creation, diffusion, interpretation and negotiation of norms concerning the global coordination of human resources (see 'Objectives' for more detail). Previous research has examined individuals in important international positions, focusing on their orientations and values (e.g. whether they possess 'global mindsets'), the management of international assignments, and the characteristics of members of the international business elite. However, these literatures have not systematically examined the actual roles of globalizing actors within firms, and precisely how they create, diffuse and manage international norms. We examine what such actors actually do within a theoretical framework that sees the behaviour of globalizing actors as shaped by institutions: the institutions of the country in which they originated affect their competencies; they must be sensitive to a variety of host national institutions; and they must navigate their way through a growing range of transnational institutions. Their role is also shaped by organizational context, particularly how the firm derives synergies from integrating its operations internationally, which influences the types of global norms required. However, globalizing actors are not prisoners of institutional and organizational contexts. They can create new norms, develop strategies that help shape the 'rules of the game' and attempt to exploit institutional contradictions and ambiguities. We will explore the individual-level resources these actors bring to these contexts, such as their skills and knowledge - 'human capital' - the relationships they have with others in terms of power, position and trust - their 'social capital' - and their transnational experiences or exposure. We will examine UK MNCs, both at home and across subsidiaries in Europe, North America and East Asia. The research will use multiple methods, consisting of five steps:
1. Pilot Work. Using seed-corn funding, we have tested key concepts and generated contacts for twelve full case studies in subsequent stages of the research.
2. UK Interviews. These will focus on those charged with creating new norms, spreading them across international operations, or ensuring compliance.
3. Foreign Subsidiary Interviews. We will conduct interviews in the international operations of each firm, enabling us to understand frames of reference and actor choices in foreign subsidiaries.
4. Multi-level Survey. The survey of a set of globalizing actors will establish the individual-level capabilities associated with the establishment and diffusion of global norms.
5. Quantitative Diary Study. This methodological innovation allows us both to explore what globalizing actors actually do and to test predictors of behaviours and attitudes.
The research will make a substantial and distinctive contribution to understanding the processes of international management, by focusing on individual "globalizing actors" within the multiple institutional and organisational contexts in which they make decisions. Equally, through the development and communication of a strong evidence base on how firms build individual and organisational capabilities in international management, the research also aims to enable improvements in the economic effectiveness of UK firms with overseas operations, while acting in ways that respond to the need for social responsibility at local-host and global levels.
- Project, 2016-2022. Funder: UKRI. Project Code: MR/N005759/1. Funder Contribution: 3,039,500 GBP. Partners: McMaster University
Cardiovascular disease is a leading cause of death globally, estimated to be responsible for approximately 17 million deaths annually. Heart disease and stroke account for nearly one third of all deaths and are a major cause of hospitalization. Patients with congestive heart failure (CHF) are at particularly high risk. Clinical trials demonstrate that nearly one third of patients with CHF will experience a myocardial infarction (MI), stroke, or hospitalization for CHF. Observational studies have established an association between influenza infection and major adverse vascular events. It follows that vaccinating such a high-risk group as patients with CHF against influenza may prevent adverse vascular events. However, these studies are subject to bias, and a well-designed clinical trial is needed to test the effect of influenza vaccination on preventing adverse vascular events. The goal of this study is to assess whether inactivated influenza vaccine can reduce adverse vascular events in high-risk participants. We will address the question by randomizing patients at high risk for adverse vascular events to either annual inactivated influenza vaccine or placebo over three influenza seasons. The primary outcome is a composite of cardiovascular (CV) death, non-fatal myocardial infarction (MI), non-fatal stroke, and hospitalization for CHF. We will enroll 3,500 participants from centres in seven countries: the Philippines (the lead centre), Mozambique, Sudan, Uganda, Saudi Arabia, Malaysia and China. This proposed randomized trial has important implications for the management of patients at high risk for major adverse vascular events. Although the influenza vaccine is recommended annually for groups with diabetes and cardiovascular disease in many countries, uptake of these recommendations is relatively low. Cardiologists in most jurisdictions do not routinely recommend annual influenza vaccine for their patients as a strategy to reduce future adverse vascular events such as acute coronary syndrome or stroke. Uptake of influenza vaccine in patients with heart disease varies by country but in INTER-CHF sites (where our trial will be conducted) is 11% on average. Rigorous demonstration that influenza vaccine leads to a reduction in major adverse vascular events would represent a landmark study. We anticipate that such a trial would influence management decisions by physicians for patients at high risk for major vascular events. The effect size we propose testing is comparable to that of available secondary prevention strategies and, given that the vaccine is given once annually, it is simple and inexpensive. Given the large burden of disease, the possibility of reducing cardiovascular and stroke-related death is a compelling argument for this trial. If influenza vaccine is shown to reduce adverse vascular events, it will represent an important change in how prevention of adverse vascular events is thought about. The fact that our primary outcome is a composite including various forms of vascular disease will increase generalizability. The study would be a milestone in contributing to evidence-based clinical as well as public health policy.
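To give a feel for how a trial of this scale is sized, here is a minimal two-proportion sample-size sketch in Python. The event rates and effect size below are illustrative assumptions of ours (loosely motivated by the 'nearly one third' composite event figure quoted above), not the trial's actual statistical plan:

```python
import math
from scipy.stats import norm

# Hypothetical planning inputs (NOT from the protocol): composite event
# rate of 30% in the placebo arm over follow-up, 20% relative risk
# reduction with vaccination, two-sided alpha of 0.05, 80% power.
p1, p2 = 0.30, 0.24
alpha, power = 0.05, 0.80

z_a = norm.ppf(1 - alpha / 2)   # ~1.96
z_b = norm.ppf(power)           # ~0.84

# Standard normal-approximation formula for two independent proportions.
n_per_arm = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
print(f"~{math.ceil(n_per_arm)} participants per arm "
      f"(before allowing for dropout or non-adherence)")
```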
- Project, 2016-2019. Funder: UKRI. Project Code: EP/N018745/1. Funder Contribution: 320,381 GBP. Partners: Perimeter Institute, University of Oxford, UBC, University of Waterloo (Canada), State University of New York at Potsdam, Universitat Autònoma de Barcelona (UAB)
Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation (QC), is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation, in the 1980s, that entanglement represented not merely a curiosity of quantum theory but a critical resource which could be exploited to achieve heretofore impossible communication tasks. Bell identified quantum nonlocality as the essentially quantum aspect of entanglement in the 1960s. While it is widely understood that quantum computation offers substantial efficiency advantages over classical computation for particular problems, it is understood neither what the precise class of such problems is nor which particular aspect or aspects of quantum theory enable these advantages. The applications for QC which have been identified are likely only a fraction of the full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark. Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features - such as entanglement, superposition, and discord - have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A 88) and Howard et al. (Nature 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, a sought-after key to understanding the unique capabilities of QC. Our vision is to deepen the theory of contextuality with the goals of achieving an understanding of the precise role it plays in QC and of how it is a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts, including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.
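Contextuality can be made concrete with the Mermin-Peres 'magic square': a 3x3 grid of two-qubit Pauli observables in which every row multiplies to +I while the columns multiply to +I, +I, -I, so no assignment of fixed ±1 values to the nine observables can reproduce the quantum predictions. The sketch below is our own illustration in Python/numpy (not code from the project) and simply verifies the operator identities:

```python
import numpy as np

# Pauli matrices and the 2x2 identity.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The Mermin-Peres magic square of two-qubit observables.
square = [
    [np.kron(X, I2), np.kron(I2, X), np.kron(X, X)],
    [np.kron(I2, Y), np.kron(Y, I2), np.kron(Y, Y)],
    [np.kron(X, Y),  np.kron(Y, X),  np.kron(Z, Z)],
]

I4 = np.eye(4)
for r in range(3):
    prod = square[r][0] @ square[r][1] @ square[r][2]
    assert np.allclose(prod, I4)        # every row multiplies to +I
for c in range(3):
    prod = square[0][c] @ square[1][c] @ square[2][c]
    target = -I4 if c == 2 else I4      # third column multiplies to -I
    assert np.allclose(prod, target)

print("Rows give +I, third column gives -I: "
      "no noncontextual ±1 assignment satisfies all six constraints.")
```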
26 Projects, page 1 of 3
Loading
- Project . 2016 - 2018Funder: UKRI Project Code: EP/N018958/1Funder Contribution: 507,674 GBPPartners: University of Edinburgh, Wolfram Research Europe Ltd, University of Salford, The Mathworks Ltd, University of London, MICROSOFT RESEARCH LIMITED, 3DS, NAG, University of Sheffield, Maplesoft...
"Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development with the knock-on effects to their productivity, and the reliability, readability and reproducibility of their software [Nature Biotechnology]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic - constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy and how best to deploy them is in itself a specialist domain, with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance, and load carrying. When optimally performed it results in a researcher who knows what challenges they can attack alone, and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities as they navigate through this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals who have a curiosity driven nature dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness, it will have a multiplicative effect since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from yet complements the top-down approach provided by the EPRSC-funded Software Sustainability Institute. The outcomes of this fellowship will be: Local and National RSE Capability: A RSE Group at Sheffield as a credible roadmap for others pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs. Scalable software support methods: A scalable approach based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality-guidelines for research software and for researchers on how best to incorporate research software engineering support within their grant proposals. HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into Desktop Applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners. Communication and public understanding: A communication campaign to raise the profile of research software exploiting high profile social media and online resources, establishing an informal forum for research software debate. References [Goble 2014] Goble, C. "Better Software, Better Research". 
IEEE Internet Computing 18(5): 4-8 (2014) [SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers" http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014) [Nature 2015] Editorial "Rule rewrite aims to clean up scientific software", Nature Biotechnology 520(7547) April 2015
- Project . 2016 - 2018Funder: UKRI Project Code: NE/K00008X/2Funder Contribution: 42,744 GBPPartners: INGV (Nat Inst Volcanology and Geophys), Durham University, HSL, University of Bergen, SFU, Willis Limited, Met Office, NOC, FLE, University of London...
Submarine landslides can be far larger than terrestrial landslides, and many generate destructive tsunamis. The Storegga Slide offshore Norway covers an area larger than Scotland and contains enough sediment to cover all of Scotland to a depth of 80 m. This huge slide occurred 8,200 years ago and extends for 800 km down slope. It produced a tsunami with a run up >20 m around the Norwegian Sea and 3-8 m on the Scottish mainland. The UK faces few other natural hazards that could cause damage on the scale of a repeat of the Storegga Slide tsunami. The Storegga Slide is not the only huge submarine slide in the Norwegian Sea. Published data suggest that there have been at least six such slides in the last 20,000 years. For instance, the Traenadjupet Slide occurred 4,000 years ago and involved ~900 km3 of sediment. Based on a recurrence interval of 4,000 years (2 events in the last 8,000 years, or 6 events in 20,000 years), there is a 5% probability of a major submarine slide, and possible tsunami, occurring in the next 200 years. Sedimentary deposits in Shetland dated at 1500 and 5500 years, in addition to the 8200 year Storegga deposit, are thought to indicate tsunami impacts and provide evidence that the Arctic tsunami hazard is still poorly understood. Given the potential impact of tsunamis generated by Arctic landslides, we need a rigorous assessment of the hazard they pose to the UK over the next 100-200 years, their potential cost to society, degree to which existing sea defences protect the UK, and how tsunami hazards could be incorporated into multi-hazard flood risk management. This project is timely because rapid climatic change in the Arctic could increase the risk posed by landslide-tsunamis. Crustal rebound associated with future ice melting may produce larger and more frequent earthquakes, such as probably triggered the Storegga Slide 8200 years ago. The Arctic is also predicted to undergo particularly rapid warming in the next few decades that could lead to dissociation of gas hydrates (ice-like compounds of methane and water) in marine sediments, weakening the sediment and potentially increasing the landsliding risk. Our objectives will be achieved through an integrated series of work blocks that examine the frequency of landslides in the Norwegian Sea preserved in the recent geological record, associated tsunami deposits in Shetland, future trends in frequency and size of earthquakes due to ice melting, slope stability and tsunami generation by landslides, tsunami inundation of the UK and potential societal costs. This forms a work flow that starts with observations of past landslides and evolves through modelling of their consequences to predicting and costing the consequences of potential future landslides and associated tsunamis. Particular attention will be paid to societal impacts and mitigation strategies, including examination of the effectiveness of current sea defences. This will be achieved through engagement of stakeholders from the start of the project, including government agencies that manage UK flood risk, international bodies responsible for tsunami warning systems, and the re-insurance sector. The main deliverables will be: (i) better understanding of frequency of past Arctic landslides and resulting tsunami impact on the UK (ii) improved models for submarine landslides and associated tsunamis that help to understand why certain landslides cause tsunamis, and others don't. 
(iii) a single modelling strategy that starts with a coupled landslide-tsunami source, tracks propagation of the tsunami across the Norwegian Sea, and ends with inundation of the UK coast. Tsunami sources of various sizes and origins will be tested (iv) a detailed evaluation of the consequences and societal cost to the UK of tsunami flooding , including the effectiveness of existing flood defences (v) an assessment of how climate change may alter landslide frequency and thus tsunami risk to the UK.
- Project . 2016 - 2020Funder: UKRI Project Code: NE/P001378/1Funder Contribution: 396,492 GBPPartners: University of London, University of Alberta, Swiss Federal Institute of Technology ETH Zürich
Transition zone seismic discontinuities (TZSDs), manifestations of mineral phase transitions or/and compositional changes between the upper mantle and the lower mantle, hold the key to resolve the mystery of mass and heat transport in the Earth's mantle and the long-term evolution of the Earth's interior. However, seismic characterizations of TZSDs are typically incomplete because of the limit in the data frequency bandwidth and sensitivity relevant to TZSDs. We innovate a simple, effective and high resolution probing of mantle discontinuity through examination of broadband forward and backward scattering waves in the context of the teleseismic receiver function method. This approach will allow us to comprehensively characterize TZSDs beneath the continents, including properties such as discontinuity topography, sharpness and gradient, shear velocity jump and density jump. To date, there has been no single study that is capable of simultaneously determining these essential seismic properties in the TZSDs. These renewed descriptions of TZSDs will be used to explore outstanding questions including mineralogical models of the transition zone and the presence of volatile/melt. In particular, we aim to address how current and past subduction determine short-term and long-term mantle mixing and whether such a mixing process may in turn shape slab sinking dynamics. A series of outstanding questions can be much better addressed with our new seismic observations: Did long-term mixing of billions of years result in apparent chemical layering as indicated in geodynamic models? What are the degree and the length scale of lateral heterogeneity if such a chemical layering exists? Is it possible that primordial structure may survive long term mixing and become trapped in the transition zone? Is the transition zone potentially a relatively shallow reservoir for long-term storage and geochemical evolution of basalt? Does chemical layering or large-scale primordial structure dictate the slab sinking dynamics? Does modern and ancient subduction recycle water into the deep mantle and transition zone? Does hydrated transition zone induce convective instability and contribute to intraplate volcanism? In the proposed work, we will use an innovative and effective observation with broadband forward and backward scattering waves to provide a comprehensive characterization of TZSDs, including properties such as discontinuity topography, sharpness, velocity and density jumps across the boundaries, and the gradient above/below the discontinuities. These unprecedentedly rich observations will provide renewed constraints on fundamental processes relevant to the Earth's interior and evolution.
- Project . 2016 - 2018Funder: UKRI Project Code: AH/P008038/1Funder Contribution: 80,530 GBPPartners: Inst for Justice & Democracy in Haiti, University of Birmingham, Queen's University Canada, Inst of Social Work & Social Science
As of April 2016, a total of 103,510 uniformed personnel from 123 countries were serving in 16 peacekeeping operations around the world. Where foreign soldiers - during war, occupation or peacekeeping operations - are on foreign soil, military-civilian relations develop, including those between soldiers and local women. Peacekeepers have increasingly been associated with sexual exploitation and abuse of the vulnerable populations they had been mandated to protect. Many of the intimate relations between peacekeeping personnel and local women, of both voluntary and exploitative nature, have led to pregnancies and to children being born. These so-called 'peace babies' and their mothers face particular challenges in volatile post-conflict communities, reportedly including childhood adversities as well as stigmatization, discrimination and disproportionate economic and social hardships. This project proposes an in-depth-study on the situation of 'peace babies' conceived by personnel from or associated with the United Nations Stabilization Mission in Haiti (MINUSTAH). MINUSTAH is among the missions associated with allegations of misconduct, not least related to sexual and gender-based violence and consequently the unintended legacy of children fathered by UN personnel. The UN has recently acknowledged that 'peacekeeper babies' exist. Yet, an evidence base relating to the welfare of children fathered by UN peacekeepers (globally or in Haiti) is virtually non-existent, and it is clear that the existing UN policies and support programs are inadequate. The proposed study addresses this critical knowledge gap through the following original contributions: - Theoretical contribution - analysing the lack of accountability of the UN and its personnel for children fathered by UN peacekeepers by introducing a victim-centred approach; - Empirical contributions: i) exploring the gender norms, and the socioeconomic, cultural and security circumstances that contribute to unequal power relations between UN personnel and local civilians; ii) mapping the whereabouts of 'peace babies' in Haiti through a situational analysis of the areas surrounding six UN bases and exploring the circumstances around their conceptions; and iii) investigating the life experiences of women raising children fathered by peacekeepers; and - Methodological contribution - using an innovative mixed quantitative/qualitative research tool, Cognitive Edge's SenseMaker, to provide a more nuanced understanding of these complex issues. The multidisciplinary collaboration between scholars from the University of Birmingham, Queen's University, Kingston, the Centre of International and Defence Policy, and Haitian-based Enstiti Travay Sosyal ak Syans Sosyal (ETS), along with civil society organisations, the Institute for Justice and Democracy in Haiti and Haitian-based Bureau des Avocats Internationaux, will address this knowledge gap and enhance our understanding of the challenges faced by peace babies and their families as well as the obstacles to accessing support. Beyond the core UK-Canada-Haiti partnership, the project will include further ODA-recipient countries (among others Cambodia, Bosnia, Liberia and the DRC) and in a final project conference will apply insights from Haiti to Peace Support Operations (PSO) more generally in discourse with academic and non-academic participants from other countries with extensive PSO experience.
- Project . 2016 - 2017Funder: UKRI Project Code: EP/P006078/1Funder Contribution: 333,594 GBPPartners: Vienna University of Technology, University of Trento, CNR, UBC, Heriot-Watt University, Enshape
Some of the most fundamental and perhaps bizarre processes expected to occur in the vicinity of black holes are out of observational reach. To address this issue we utilise analogue systems where we study fluctuations on a background flow that in the experiment reproduces an effective black hole. In the literature this line of research is referred to as analogue models for gravity, or simply analogue gravity. Analogue models provide not only a theoretical but also an experimental framework in which to verify predictions of classical and quantum fields exposed to 'extreme' spacetime geometries, such as rapidly rotating black holes. This project brings together two world-wide recognised experts in the field of analogue gravity with the aim of pushing the field in a new direction: we propose ground-breaking studies to mimic some of the bizarre processes occurring in the vicinity of rotating black holes from general relativity and rotating fluids in both water and optical systems. In particular, we will investigate both theoretically and experimentally the interaction between an input wave and a rotating black hole spacetime geometry, here recreated by the rotating fluid. This allows us to mimic a scattering process associated to rotating black hoes called superradiant scattering. From a historical viewpoint this kind of radiation is the precursor to Hawking radiation. More precisely, black hole superradiance is the scattering of waves from a rotating black hole: if the incoming wave also possesses a small amount of angular momentum, it will be reflected with an increased amplitude, i.e. it is amplified at the expense of the black hole that thus loses some of its rotational energy. It has also been pointed out that the same physics may take place in very different systems, for example light incident on a rotating metallic (or absorbing) cylinder may also be amplified upon reflection. Yet, no-one has ever attempted to experimentally investigate the underlying physics that extend beyond general relativity and are relevant to a variety of hydrodynamical and rotating systems. We aim to provide the first ever experimental evidence of this intriguing and fundamental amplification mechanism in two different hydrodynamical systems. The first is a water spout, controlled so that the correct boundary conditions are obtained and optimised for observing BH-SS. The second is a less conventional fluid that is made out of light. Light propagating in a special medium can behave as a fluid or even a superfluid. By building upon highly developed photonic technologies e.g. for the control and measurements of laser beam wavefronts, we will implement very precisely tailored and characterised experiments. One of the unique aspects of this project is the marriage between two very different lab-based systems, one using water the other using light, to tackle an outstanding problem in physics that is of relevance to astrophysics, hydrodynamic and optical systems.
- Project . 2016 - 2017Funder: UKRI Project Code: AH/N006178/1Funder Contribution: 26,235 GBPPartners: University of Cambridge, University of Waterloo (Canada), University of Sussex, University of London, TNA
In recent years we have all become familiar with the notion of information overload, the digital deluge, the information explosion, and numerous variations on this idea. At the heart of this phenomenon is the growth of born-digital big data, a term which encompasses everything from aggregated tweets and Facebook posts to government emails, from the live and archived web to data generated by wearable and household technology. While there has been a growing interest in big data and the humanities in recent years, as exhibited notably in the AHRC's digital transformations theme, most academic research in this area has been undertaken by computer scientists and in emerging fields such as social informatics. As yet, there has been no systematic investigation of how humanities researchers are engaging with this new type of primary source, of what tools and methods they might require in order to work more effectively with big data in the future, and of what might constitute a specifically humanities approach to big data research. What kinds of questions will this data allow us to ask and answer? How can we ensure that this material is collected and preserved in such a way that it meets the requirements of humanities researchers? What insights can scholars in the humanities learn from ground-breaking work in the computer and social sciences, and from the archives and libraries who are concerned with securing all of this information? The proposed research Network will bring together researchers and practitioners from all of these stakeholder groups, to discern if there is a genuine humanities approach to born-digital big data, and to establish how this might inform, complement and draw on other disciplines and practices. Over the course of three workshops, one to be held at The National Archives in Kew, one at the Institute of Historical Research, University of London, and one at the University of Cambridge, the Network will address the current state of the field; establish the most appropriate tools and methods for humanities researchers for whom born-digital material is an important primary source; discuss the ways in which researchers and archives can work together to facilitate big data research; identify the barriers to engagement with big data, particularly in relation to skills; and work to build an engaged and lasting community of interest. The focus of the Network will be on history, but it will also encompass other humanities and social science disciplines. It will also include representatives of non-humanities disciplines, for example the computer, social and information sciences. Cross-disciplinary approaches and collaborative working are essential in such a new and complex area of investigation, and the Network relates to the current highlight notice encouraging the exploration of innovative areas of cross-disciplinary enquiry. While there has for some time been a recognition of the value of greater engagement between researchers in the humanities and the sciences in the development of new approaches to and understandings of born-digital big data, only very tentative first steps have been made towards realising this aim (for example forthcoming activity organised by the Turing Institute). The Network will provide a forum from which to launch precisely this kind of cross-disciplinary discussion, defining a central role for the humanities. 
During the 12 months of the project all members of the Network will contribute to a web resource, which will present key themes and ideas to both an academic audience and the interested general public. External experts from government, the media and other relevant sectors will also be invited to contribute, to ensure that the Network takes account of a range of opinions and needs. The exchange of knowledge and experience that takes place at the workshops will also be distilled into a white paper, which will be published under a CC-BY licence in month 12 of the Network.
- Project . 2016 - 2019 Funder: UKRI Project Code: NE/M017540/2 Funder Contribution: 284,801 GBP Partners: Deltares-Delft, UNIMI, Durham University, Utrecht University, MBARI, MUN, BU, BIO, CSIC, Shell International Exploration & Produc...
Turbidity currents are volumetrically the most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux from all of the world's rivers, and these flows form the largest sediment accumulations on Earth (submarine fans). They break strategically important seafloor cable networks that carry > 95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have from submarine flows in action, in stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly, such that source-to-sink data is needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active - occurring on annual or shorter time scales - where previous work provides a basis for future projects, and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because no comparable network has been established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is also to engage modellers in the design and analysis of monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of systems, and the more infrequent types of flow. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. This proposal is timely because of recent efforts to develop novel flow-monitoring technology that holds great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and destroy sensors placed on traditional moorings on the seafloor. It includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Key preliminary data are lacking at some test sites, such as detailed bathymetric base-maps or seismic datasets.
Our final objective is to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC grant to monitor flows in Monterey Canyon in 2014-2017, and to a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and a NERC Research Programme Consortium award. He is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.
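The proposal notes that these flows break seafloor cables and can reach speeds of up to 20 m/s. One classic way such speeds have been inferred indirectly, dating back to the 1929 Grand Banks event, is from the timing of sequential cable breaks down a slope. A minimal sketch of that calculation, using entirely hypothetical cable positions and break times (not data from this project):

```python
# Hypothetical positions (km downslope) and break times (minutes after
# the triggering event) for three seafloor cables; values are illustrative.
cables_km = [50.0, 200.0, 400.0]
breaks_min = [60.0, 210.0, 450.0]

# Average speed of the flow front between successive cable breaks
for i in range(1, len(cables_km)):
    dist_m = (cables_km[i] - cables_km[i - 1]) * 1_000
    dt_s = (breaks_min[i] - breaks_min[i - 1]) * 60
    print(f"{cables_km[i-1]:.0f}-{cables_km[i]:.0f} km: {dist_m / dt_s:.1f} m/s")
    # prints ~16.7 m/s, then ~13.9 m/s: the front decelerates downslope
```

Direct monitoring of the kind proposed here would replace such after-the-fact inferences with in-situ measurements of concentration, velocity and structure.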
- Project . 2016 - 2017 Funder: UKRI Project Code: ES/N007883/1 Funder Contribution: 523,273 GBP Partners: KCL, University of Montreal, DCU, SEOUL NATIONAL UNIVERSITY
In a globalised economic and business context, the norms that shape human resource management travel internationally. This is particularly the case within the multinational company, where individuals are responsible for the creation, diffusion, interpretation and negotiation of norms - which may be rules, principles or guidelines - across international operations. We refer to such individuals as "globalizing actors". The aim of our research is to identify the resources mobilized by globalizing actors in the creation, diffusion, interpretation and negotiation of norms concerning the global coordination of human resources (see 'Objectives' for more detail). Previous research has examined individuals in important international positions, focusing on their orientations and values (e.g. whether they possess 'global mindsets'), the management of international assignments, and the characteristics of members of the international business elite. However, these literatures have not systematically examined the actual roles of globalizing actors within firms, or precisely how they create, diffuse and manage international norms. We examine what such actors actually do within a theoretical framework that sees their behaviour as shaped by institutions: the institutions of the country in which they originated affect their competencies; they must be sensitive to a variety of host national institutions; and they must navigate a growing range of transnational institutions. Their role is also shaped by organizational context, particularly how the firm derives synergies from integrating its operations internationally, which influences the types of global norms required. However, globalizing actors are not prisoners of institutional and organizational contexts. They can create new norms, develop strategies that help shape the 'rules of the game' and attempt to exploit institutional contradictions and ambiguities. We will explore the individual-level resources these actors can deploy in these contexts: their skills and knowledge ('human capital'), the relationships they have with others in terms of power, position and trust ('social capital'), and their transnational experiences and exposure. We will examine UK MNCs, both at home and across subsidiaries in Europe, North America and East Asia. The research will use multiple methods, consisting of five steps:
1. Pilot Work. Using seed-corn funding, we have tested key concepts and generated contacts for twelve full case studies in subsequent stages of the research.
2. UK Interviews. These will focus on those charged with creating new norms, spreading them across international operations, or ensuring compliance.
3. Foreign Subsidiary Interviews. We will conduct interviews in the international operations of each firm, enabling us to understand frames of reference and actor choices in foreign subsidiaries.
4. Multi-level Survey. A survey of a set of globalizing actors will establish the individual-level capabilities associated with the establishment and diffusion of global norms.
5. Quantitative Diary Study. This methodological innovation allows us both to explore what globalizing actors actually do and to test predictors of behaviours and attitudes.
The research will make a substantial and distinctive contribution to the understanding of international management processes, by focusing on individual "globalizing actors" within the multiple institutional and organisational contexts in which they make decisions. Equally, through the development and communication of a strong evidence base on how firms build individual and organisational capabilities in international management, the research aims to improve the economic effectiveness of UK firms with overseas operations, while encouraging them to act in ways that respond to the need for social responsibility at local-host and global levels.
- Project . 2016 - 2022 Funder: UKRI Project Code: MR/N005759/1 Funder Contribution: 3,039,500 GBP Partners: McMaster University
Cardiovascular disease is a leading cause of death globally, estimated to be responsible for approximately 17 million deaths annually. Heart disease and stroke account for nearly one third of all deaths and are a major cause of hospitalization. Patients with congestive heart failure (CHF) are at particularly high risk: clinical trials demonstrate that nearly one third of patients with CHF will experience a myocardial infarction (MI), stroke, or hospitalization for CHF. Observational studies have established an association between influenza infection and major adverse vascular events. It follows that vaccinating such a high-risk group as patients with CHF against influenza may prevent adverse vascular events. However, these observational studies are subject to bias, and a well-designed clinical trial is needed to test the effect of influenza vaccination on preventing adverse vascular events. The goal of this study is to assess whether inactivated influenza vaccine can reduce adverse vascular events in high-risk participants. We will address the question by randomizing patients at high risk for adverse vascular events to either annual inactivated influenza vaccine or placebo over three influenza seasons. The primary outcome is a composite of cardiovascular (CV) death, non-fatal MI, non-fatal stroke, and hospitalization for CHF. We will enroll 3,500 participants from centres in seven countries: the Philippines (the lead centre), Mozambique, Sudan, Uganda, Saudi Arabia, Malaysia and China. This proposed randomized trial has important implications for the management of patients at high risk for major adverse vascular events. Although influenza vaccine is recommended annually for groups with diabetes and cardiovascular disease in many countries, uptake of these recommendations is relatively low. Cardiologists in most jurisdictions do not routinely recommend annual influenza vaccination for their patients as a strategy to reduce future adverse vascular events such as acute coronary syndrome or stroke. Uptake of influenza vaccine in patients with heart disease varies by country, but at INTER-CHF sites (where our trial will be conducted) it is 11% on average. Rigorous demonstration that influenza vaccination reduces major adverse vascular events would represent a landmark result, and we anticipate that such a trial would influence management decisions by physicians for patients at high risk of major vascular events. The effect size we propose to test is comparable to that of available secondary prevention strategies, and because the vaccine is given only once annually, the intervention is simple and inexpensive. Given the large burden of disease, the possibility of reducing cardiovascular and stroke-related death is a compelling argument for this trial. If influenza vaccine is shown to reduce adverse vascular events, it will represent an important change in how the prevention of such events is approached. The fact that our primary outcome is a composite including various forms of vascular disease will increase generalizability. The study would be a milestone in contributing to evidence-based clinical as well as public health policy.
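The abstract does not state the event rates or effect size behind the 3,500-participant target, but the standard two-proportion sample-size formula shows how a trial of roughly this scale relates to a plausible effect. A minimal sketch, assuming a hypothetical 30% control-arm composite event rate (in line with "nearly one third") and a hypothetical 20% relative risk reduction; neither value is taken from the proposal:

```python
from scipy.stats import norm

# Hypothetical design assumptions (illustrative, not from the proposal):
p_control = 0.30   # composite event rate in the placebo arm over follow-up
rrr = 0.20         # assumed relative risk reduction from the vaccine
p_vaccine = p_control * (1 - rrr)

alpha, power = 0.05, 0.80
z_a = norm.ppf(1 - alpha / 2)  # critical value, two-sided test
z_b = norm.ppf(power)          # critical value for the desired power

# Standard two-proportion sample-size formula (participants per arm)
variance = p_control * (1 - p_control) + p_vaccine * (1 - p_vaccine)
n_per_arm = (z_a + z_b) ** 2 * variance / (p_control - p_vaccine) ** 2
print(round(n_per_arm))  # ~856 per arm, ~1,712 total
```

Under these illustrative assumptions roughly 1,700 participants would suffice before allowing for dropout and non-adherence, so a 3,500-participant trial would retain power under less favourable assumptions.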
- Project . 2016 - 2019 Funder: UKRI Project Code: EP/N018745/1 Funder Contribution: 320,381 GBP Partners: Perimeter Institute, University of Oxford, UBC, University of Waterloo (Canada), State University of New York at Potsdam, Universitat Autònoma de Barcelona (UAB)
Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation, is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation, in the 1980s, that entanglement represented not merely a curiosity of quantum theory but a critical resource which could be exploited to achieve heretofore impossible communication tasks. In the 1960s, Bell identified quantum nonlocality as the essentially quantum aspect of entanglement. While it is widely understood that quantum computation (QC) offers substantial efficiency advantages over classical computation for particular problems, it is understood neither what the precise class of such problems is nor which particular aspect or aspects of quantum theory enable these advantages. The applications identified for QC are likely only a fraction of its full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark. Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features - such as entanglement, superposition, and discord - have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A, 88) and Howard et al. (Nature, 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, a sought-after key to understanding the unique capabilities of QC. Our vision is to deepen the theory of contextuality with the goals of achieving an understanding of the precise role it plays in QC and of how it is a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts, including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.
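As a concrete illustration of the nonlocality that contextuality generalizes, the CHSH Bell inequality can be checked numerically: any local (classical hidden-variable) model satisfies |S| <= 2, while measurements on the two-qubit singlet state reach the quantum bound of 2*sqrt(2). A minimal sketch (illustrative only, not code from the project):

```python
import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin observable at angle theta in the X-Z plane (eigenvalues +/-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Two-qubit singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi> of the two local measurements."""
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

# Measurement angles that maximise the CHSH value for the singlet
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4

S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
print(abs(S))  # ~2.828 = 2*sqrt(2), exceeding the classical bound of 2
```

Contextuality extends this idea beyond spatially separated measurements to the compatibility structure of measurements on a single system, which is what makes it a candidate resource for quantum computational advantage.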