The following results are related to Canada.
23 Projects, page 1 of 3

Filters: Canada, UK Research and Innovation, 2019

  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP
    Partners: Autodesk Inc, University of Salford, University of Toronto

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they cannot be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and cannot replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living cell-based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available online for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also to develop low-cost open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-world applicability of synthetic biology.

  • Funder: UKRI Project Code: ES/N007883/2
    Funder Contribution: 408,373 GBP
    Partners: University of Montreal, DCU, Loughborough University, SEOUL NATIONAL UNIVERSITY

    In a globalised economic and business context, the norms that shape human resource management travel internationally. This is particularly the case within the multinational company, where individuals are responsible for the creation, diffusion, interpretation and negotiation of norms - which may be rules, principles or guidelines - across international operations. We refer to such individuals as "globalizing actors". The aim of our research is to identify the resources mobilized by globalizing actors in the creation, diffusion, interpretation and negotiation of norms concerning the global coordination of human resources (see 'Objectives' for more detail). Previous research has examined individuals in important international positions, focusing on their orientations and values (e.g. whether they possess 'global mindsets'), the management of international assignments and the characteristics of members of the international business elite. However, these literatures have not systematically examined the actual roles of globalizing actors within firms, or precisely how they create, diffuse and manage international norms. We examine what such actors actually do within a theoretical framework that sees the behaviour of globalizing actors as shaped by institutions: the institutions of the country in which they originated affect their competencies; they must be sensitive to a variety of host national institutions; and they must navigate their way through a growing range of transnational institutions. Their role is also shaped by organizational context, particularly how the firm derives synergies from integrating its operations internationally, which influences the types of global norms required. However, globalizing actors are not prisoners of institutional and organizational contexts. They can create new norms, develop strategies that help shape the 'rules of the game' and attempt to exploit institutional contradictions and ambiguities. We will explore the individual-level resources these actors use to deal with these contexts: their skills and knowledge - 'human capital' - the relationships they have with others in terms of power, position and trust - their 'social capital' - and their transnational experiences or exposure. We will examine UK MNCs, both at home and across subsidiaries in Europe, North America and East Asia. The research will use multiple methods, consisting of five steps:
    1. Pilot Work. Using seed-corn funding, we have tested key concepts and generated contacts for twelve full case studies in subsequent stages of the research.
    2. UK Interviews. These will focus on those charged with creating new norms, spreading them across international operations, or ensuring compliance.
    3. Foreign Subsidiary Interviews. We will conduct interviews in the international operations of each firm, enabling us to understand frames of reference and actor choices in foreign subsidiaries.
    4. Multi-level Survey. The survey of a set of globalizing actors will establish the individual-level capabilities associated with the establishment and diffusion of global norms.
    5. Quantitative Diary Study. This methodological innovation allows us both to explore what globalizing actors actually do and to test predictors of behaviours and attitudes.
The research will make a substantial and distinctive contribution to the understanding of the processes of international management by focusing on individual "globalizing actors" within the multiple institutional and organisational contexts in which they make decisions. Equally, through the development and communication of a strong evidence base on how firms build individual and organisational capabilities in international management, the research also aims to enable improvements in the economic effectiveness of UK firms with overseas operations, while acting in ways that respond to the need for social responsibility at local-host and global levels.

  • Funder: UKRI Project Code: NE/L000318/1
    Funder Contribution: 620,481 GBP
    Partners: Nat Uni of Life & Env Sci Ukraine NUBiP, DSA, CIT, McMaster University, PHE, Inst Radiation and Nuclear Safety IRSN, NERC Centre for Ecology and Hydrology, Norwegian University of Life Sciences (NMBU), Faculty of Biosciences, Marine Ecology Rese Institute (Japan), Belgian Nuclear Research Centre SCK CEN

    For all sources of radioactivity, radiological risk assessments are essential for safeguarding human and environmental health. But assessments often have to rely upon simplistic assumptions, such as the use of simple ratios in risk calculations which combine many processes into a single number. This pragmatic approach has largely arisen from the lack of scientific knowledge and/or data in key areas. The resultant uncertainty has been taken into account through conservative approaches to radiological risk assessment, which may overestimate risk. Uncertainty arises at all stages of the assessment process, from the estimation of transfer to human foodstuffs and wildlife, through exposure, to risk. Reducing uncertainty is important as it relates directly to scientific credibility, which will always be open to challenge given the highly sensitive nature of radiological risk assessment in society. We propose an integrated, multi-disciplinary programme to assess and reduce the uncertainty associated with radiological risk assessment to protect human health and the environment. At the same time we will contribute to building the capacity needed to ensure that the UK rebuilds and maintains expertise in environmental radioactivity into the future. Our project has four major and highly inter-related components to address the key goal of RATE: to rebuild UK capacity and make a major contribution to enhancing environmental protection and safeguarding human health. The first component will study how the biological availability of radionuclides varies in soils over time. We will investigate whether short-term measurements (collected in three-year controlled experiments) can be used to predict the long-term availability of radionuclides in soils by testing our models in the Chernobyl exclusion zone. The second component will apply the concepts of 'phylogeny' and 'ionomics' to characterise radionuclide uptake by plants and other organisms. These approaches, and statistical modelling methods, are increasingly applied to describe the uptake of a range of elements in plant nutrition, and we are pioneering their use for studying radionuclide uptake in other organisms and human foods. A particularly exciting aspect of the approach is the possibility of making predictions for any plant or animal. This is of great value as it is impossible to measure uptake for all wildlife, crops and farm animals. The third component of the work will extend our efforts to improve the quantification of radiation exposure and the understanding of the resultant biological effects by investigating the underlying mechanisms involved. A key aim is to see whether what we know from experiments on animals and plants in the laboratory is a good representation of what happens in the real world: some scientists believe that animals in the natural environment are more susceptible to radiation than laboratory animals, and we need to test this to have confidence in our risk assessments. Together these studies will enable us to reduce and better quantify the uncertainties associated with radiological risk assessment. By training a cohort of PDRAs and PhD students, our fourth component will help to renew UK capacity in environmental radioactivity by providing trained, experienced researchers who are well networked within the UK and internationally through the contacts of the investigators. Our students will be trained in a wide range of essential skills through their controlled laboratory studies and through working in contaminated environments.
They will benefit from being members of a multidisciplinary team and from opportunities to take placements with our beneficiaries and our extensive range of project partners. The outputs of the project will benefit governmental and non-governmental organisations with responsibility for assessing the risks posed to humans and wildlife by environmental radioactivity. It will also make a major contribution to improved scientific and public confidence in the outcomes of environmental safety assessments.
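
    To illustrate what the abstract means by 'the use of simple ratios in risk calculations', here is a minimal sketch of a ratio-based screening calculation in the style of assessment tools such as ERICA; every numerical value in it is a hypothetical placeholder, not data from this project.

        # Minimal sketch of a ratio-based radiological screening calculation.
        # All numbers are hypothetical placeholders; real assessments take
        # concentration ratios (CRs) and dose conversion coefficients (DCCs)
        # from curated databases and treat their uncertainty explicitly.

        soil_activity_cs137 = 1200.0  # Bq/kg dry soil (hypothetical Cs-137 measurement)

        # Concentration ratio: whole-body activity in biota per unit soil activity.
        # Collapsing uptake, metabolism and habitat use into one number is exactly
        # the "simple ratio" simplification the project aims to improve upon.
        cr_small_mammal = 0.05        # (Bq/kg biota) / (Bq/kg soil), hypothetical

        # Dose conversion coefficient: dose rate per unit activity concentration.
        dcc_internal = 2.0e-4         # (uGy/h) / (Bq/kg), hypothetical

        biota_activity = cr_small_mammal * soil_activity_cs137  # Bq/kg
        dose_rate = biota_activity * dcc_internal                # uGy/h

        screening_level = 10.0  # uGy/h, a widely used generic screening benchmark
        print(f"dose rate: {dose_rate:.2e} uGy/h "
              f"({'below' if dose_rate < screening_level else 'above'} screening level)")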

  • Funder: UKRI Project Code: AH/P006175/1
    Funder Contribution: 47,675 GBP
    Partners: University of Birmingham, Queen's University Canada

    As of March 2016, a total of 104,773 uniformed personnel from 123 countries were serving in 16 peacekeeping operations around the world. Where foreign soldiers - during war, occupation or peacekeeping operations - are stationed on foreign soil, military-civilian relations develop, including those between soldiers and local women. Peacekeepers have increasingly been associated with the sexual exploitation and abuse of the vulnerable populations they have been mandated to protect. Many of the intimate relations between peacekeeping personnel and local women, both voluntary and exploitative in nature, have led to pregnancies and to children being born. These so-called 'peace babies' and their mothers face particular challenges in volatile post-conflict communities, reportedly including childhood adversities as well as stigmatization, discrimination and disproportionate economic and social hardships. The network connects two strands of inquiry around 'peace babies' - from the academic world and from within the development sector - in a spirit of conversation and collaboration, to examine the challenges of humanitarian intervention in a transnational historical context. Building on the firm belief that history's focus on causality and long-term processes of change is indispensable for appreciating the complex dynamics of socio-cultural change, the network contributes a deeper understanding of development and aims to affect practice. It provides an historical complement to the wealth of available analyses - internal and external - of the contemporary humanitarian environment. Specifically, the network proposes an in-depth study of the situation of 'peace babies' by exploring the children conceived by personnel from or associated with the United Nations Stabilization Mission in Haiti (MINUSTAH). MINUSTAH is among the missions that have been associated with allegations of a range of abuses, not least related to sexual and gender-based violence and, consequently, with the unintended legacy of children fathered by UN personnel. The UN has recently acknowledged that 'peacekeeper babies' exist. Yet an evidence base relating to the welfare of children fathered by UN peacekeepers (globally or in Haiti) is virtually non-existent, and it is clear that the existing UN policies and support programs are inadequate. This multidisciplinary collaboration between scholars from Queen's University, the University of Birmingham, the Centre of International and Defence Policy, and the Haitian-based Enstiti Travay Sosyal ak Syans Sosyal (ETS), along with civil society organisations, the Institute for Justice and Democracy in Haiti and the Haitian-based Bureau des Avocats Internationaux, will address this knowledge gap and enhance our historically informed understanding of the challenges faced by peace babies and their families, as well as the obstacles to accessing support. Beyond the core UK-Canada-Haiti partnership, the network will include a further four ODA-recipient countries (Cambodia, Bosnia, Liberia and the DRC) and will apply insights from Haiti to peace support operations (PSOs) more generally, in discourse with academic and non-academic participants from those countries with extensive PSO experience.
The network is structured around three network meetings (two workshops and a network conference, the latter supplemented by an early-career research workshop) which will create a sustainable partnership focused on the co-creation of knowledge as well as the collaborative mobilisation of this knowledge to inform academic and non-academic stakeholders interested in peacekeepers' children. The findings of the workshops and the final conference will inform both academic outputs and - going forward - the development of an intersectoral research agenda; furthermore, they will frame a special journal issue on 'Peace Babies' and will be at the core of the network's activities beyond the funding period, both as a platform for continued transnational and intersectoral conversation and as a basis for collaborative research.

  • Funder: UKRI Project Code: NE/M017028/1
    Funder Contribution: 766,686 GBP
    Partners: University of Guelph, WU, University of Salford

    Soils provide many functions for humans, including the storage of carbon and nutrient cycling, which are crucial for the production of food and the mitigation of climate change. However, there is much concern that soils, and the functions that they provide, are being threatened by a range of pressures, including intensive farming methods and the increased frequency of extreme climatic events, such as drought. Not only do these disturbances pose an immediate threat to the functioning of soils, but they could also impair their ability to resist and recover from further stresses that come in the future. Our project will tackle this problem by addressing two general questions: first, what makes a soil able to withstand and recover from disturbance events, such as drought, and, second, how can we use this knowledge to ensure soils can buffer disturbances in the future? These are questions that have puzzled soil scientists for many years but so far remain unresolved. An area that offers much promise in tackling this issue, however, is food web ecology. Food webs are the networks of interactions describing who eats whom amongst the myriad organisms within an ecosystem. And in soil, they are the engine that drives the very processes of nutrient cycling and energy flow on which the functioning of soil, and the terrestrial ecosystems it supports, depends. It has been proposed for many years, but so far not fully tested in soil, that simple food webs are less able to withstand and recover from disturbance events, such as drought, than complex ones. We want to test this theory in soil, which harbours some of the most complex, but also sensitive, food webs on Earth. We will test the idea, through experiments and models, that the ability of a soil to withstand, recover and adapt to disturbance events depends on the architecture and diversity of the soil food web, which governs the rate of transfer of nutrients and energy through the plant-soil system. We also propose that soil disturbances associated with intensive land use, such as trampling and fertiliser addition, erode the very food web structures that make the soil system stable, thereby reducing the ability of soil to resist and recover from future disturbances, such as extreme weather events. We will also resolve what makes a food web stable, and test the roles of different types of organisms in soil, such as mycorrhizal fungi, which we believe play a major role. And finally, we will develop new models to help us better predict how soils will respond to future threats and to guide management decisions on sustainable soil management in a rapidly changing world. These questions are at the heart of the NERC Soil Security programme, which seeks to resolve what controls the ability of soils and their functions to resist, recover and ultimately adapt to perturbations, such as those caused by land use and extreme climatic events.
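
    Since the argument here turns on food webs as networks, a toy illustration may help (my sketch, with invented species and links, not data from the project): a web can be represented as a directed graph of who-eats-whom, from which simple structural properties often linked to stability, such as connectance, can be read off.

        # Toy soil food web as a directed graph: an edge (a, b) means "a eats b".
        # Species and links are invented for illustration only.
        links = [
            ("bacteria", "detritus"), ("fungi", "detritus"),
            ("protozoa", "bacteria"), ("nematode", "fungi"),
            ("nematode", "bacteria"), ("mite", "nematode"),
            ("mite", "protozoa"),
        ]

        species = {s for link in links for s in link}
        S, L = len(species), len(links)

        # Connectance C = L / S^2: the fraction of possible feeding links realised.
        # Structural measures like this are one way to quantify the "architecture"
        # that the project proposes to relate to resistance and recovery.
        connectance = L / S**2
        print(f"S = {S} nodes, L = {L} links, connectance C = {connectance:.3f}")

        # Basal resources: nodes that eat nothing (the entry points for energy).
        consumers = {a for a, _ in links}
        print("basal resources:", sorted(species - consumers))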

  • Funder: UKRI Project Code: EP/M01052X/1
    Funder Contribution: 731,953 GBP
    Partners: University of Edinburgh, RU, UMD, SFU, University of Kent

    Condensed matter physics has developed a relatively complete theory of the common phases in materials, leading to many technologically important devices including electronic screens, memory storage and switching devices. Landau, or mean-field, theory has provided a framework to model, predict and understand phases and transitions in a surprisingly diverse variety of materials and dynamical systems. While these conventional ground states have proven technologically important and the underlying theory represents a major success for scientists, these phases have proven incredibly difficult to suppress and often emerge when new materials properties are sought or engineered. To discover the novel phases that will lead to a new materials revolution, these common phases need to be suppressed to allow exotic and unconventional properties to emerge. The most common vehicle for turning off conventional phases in materials has been the introduction of disorder through chemical doping, resulting in strong random fields. Many important theories have been formulated and tested to describe the effects of random fields and, in particular, to account for the fine balance between surface and bulk free energy. However, the use of disorder has proved limiting: properties are often templated into the material and are not directly controllable, and the resulting ground state of the material is difficult to understand. Another route to suppressing conventional phases, explored over the last decade, is to introduce strong fluctuations. While this can be trivially done with temperature, new phases have emerged from the study of quantum systems, where the physics is governed by quantum mechanics and the Heisenberg uncertainty principle. The study of quantum systems has resulted in the discovery of many new phases of matter, including high-temperature superconductors and quantum spin liquids, where the magnetism is dynamic at any temperature. A limitation of quantum fluctuations is that the properties do not carry over directly to ferroelectric-based systems or to multiferroics, where magnetic and structural properties are strongly coupled. Also, owing to the strongly fluctuating nature of the ground state, the properties have not been found to be easily tunable, limiting immediate use for applications. This proposal therefore aims to take a different route by studying classically frustrated systems, where a large ground-state degeneracy is introduced naturally through the lattice and quantum mechanical effects are small. Emphasis will be placed on lattices based upon a triangular geometry. The absence of the strong fluctuations that exist in quantum systems provides the ability to controllably tune between different ground states, making this route a potential means of creating new switching devices or novel memory storage systems. The proposal aims to investigate classically frustrated magnets and ferroelectrics. These systems can be described within a common framework and will be studied using scattering techniques to provide a bulk real-space image of the ground state. The properties will be tuned with magnetic and electric fields, supplying a direct route towards technologically applicable materials. The combined approach of investigating ferroelectrics and magnets will result in a complete understanding applicable to immediate industrial applications.
These new materials will lead to the discovery of new phases, including new high-temperature multiferroics, classical spin liquids, and localized controllable boundaries or defects.
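
    As a minimal illustration of the geometric frustration invoked above (my gloss, not text from the proposal): for Ising spins with antiferromagnetic coupling on a single triangle, no configuration can satisfy all three bonds at once.

        % Ising antiferromagnet on one triangle: coupling J > 0, spins s_i = \pm 1
        E = J\,(s_1 s_2 + s_2 s_3 + s_3 s_1)
        % Anti-aligning any two spins forces the third to align with one of them,
        % so at least one bond is always "frustrated":
        E_{\min} = -J \quad \text{(attained by 6 of the 8 configurations)}

    On the full triangular lattice this local degeneracy survives globally: Wannier's classic result gives the triangular Ising antiferromagnet a residual entropy of roughly 0.323 k_B per spin, the kind of large classical ground-state degeneracy the proposal sets out to exploit.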

  • Funder: UKRI Project Code: EP/R004730/1
    Funder Contribution: 101,150 GBP
    Partners: University of Toronto, University of Warwick, Abdus Salam ICTP, SISSA - ISAS

    The objects of study in differential geometry are smooth manifolds, which correspond to smooth curved objects of finite dimension. In modern differential geometry, it is becoming more and more common to consider sequences (or flows) of smooth manifolds. Typically the limits of such sequences (or flows) are no longer smooth. It is then useful to isolate a natural class of non-smooth objects which generalizes the classical notion of a smooth manifold and which is closed under the process of taking limits. If the manifolds in the sequence satisfy a lower bound on their sectional curvatures, a natural class of non-smooth objects which is closed under (Gromov-Hausdorff) convergence is given by special metric spaces known as Alexandrov spaces; if instead the manifolds satisfy a lower bound on their Ricci curvatures, a natural class of non-smooth objects, closed under (measured Gromov-Hausdorff) convergence, is given by special metric measure spaces (i.e. metric spaces endowed with a reference volume measure) known as RCD(K,N) spaces. These are a 'Riemannian' refinement of the so-called CD(K,N) spaces of Lott-Sturm-Villani, which are metric measure spaces with Ricci curvature bounded below by K and dimension bounded above by N in a synthetic sense, via optimal transport. In the proposed project we aim to understand in more detail the structure and the analytic and geometric properties of RCD(K,N) spaces. The new results will also have an impact on the classical world of smooth manifolds satisfying curvature bounds.
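
    To unpack 'in a synthetic sense, via optimal transport', here is the standard dimension-independent instance of the Lott-Sturm-Villani condition (a sketch of the CD(K,\infty) case; the abstract itself does not spell out the definition). A metric measure space (X, d, m) is CD(K,\infty) if the relative entropy is K-convex along Wasserstein geodesics:

        % For every geodesic (\mu_t)_{t \in [0,1]} in the Wasserstein space
        % (P_2(X), W_2), with \mathrm{Ent}(\mu \mid m) = \int \rho \log \rho \, dm
        % when \mu = \rho\, m (and +\infty otherwise):
        \mathrm{Ent}(\mu_t \mid m) \;\le\; (1-t)\,\mathrm{Ent}(\mu_0 \mid m)
            + t\,\mathrm{Ent}(\mu_1 \mid m) - \tfrac{K}{2}\, t(1-t)\, W_2^2(\mu_0, \mu_1)

    The 'Riemannian' refinement defining RCD spaces additionally requires the Cheeger energy to be quadratic (equivalently, the heat flow to be linear), which excludes Finsler-type geometries while retaining stability under measured Gromov-Hausdorff convergence.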

  • Funder: UKRI Project Code: EP/N018958/2
    Funder Contribution: 305,534 GBP
    Partners: University of London, University of Edinburgh, University of Salford, University of Leeds, The Mathworks Ltd, Wolfram Research Europe Ltd, MICROSOFT RESEARCH LIMITED, NAG, N8 Research Partnership, 3DS...

    "Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development with the knock-on effects to their productivity, and the reliability, readability and reproducibility of their software [Nature Biotechnology]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic - constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy and how best to deploy them is in itself a specialist domain, with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance, and load carrying. When optimally performed it results in a researcher who knows what challenges they can attack alone, and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities as they navigate through this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals who have a curiosity driven nature dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness, it will have a multiplicative effect since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from yet complements the top-down approach provided by the EPRSC-funded Software Sustainability Institute. The outcomes of this fellowship will be: Local and National RSE Capability: A RSE Group at Sheffield as a credible roadmap for others pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs. Scalable software support methods: A scalable approach based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality-guidelines for research software and for researchers on how best to incorporate research software engineering support within their grant proposals. HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into Desktop Applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners. Communication and public understanding: A communication campaign to raise the profile of research software exploiting high profile social media and online resources, establishing an informal forum for research software debate. References [Goble 2014] Goble, C. "Better Software, Better Research". 
IEEE Internet Computing 18(5): 4-8 (2014) [SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers" http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014) [Nature 2015] Editorial "Rule rewrite aims to clean up scientific software", Nature Biotechnology 520(7547) April 2015

  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP
    Partners: RWTH, CNRS, Maplesoft, UniGe, Coventry University

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform "Quantifier Elimination" (QE). We say a logical statement is "quantified" if it is preceded by a qualification such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise, the implications are unclear - what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an equivalent unquantified one. In this case we may replace the statement by "b^2 - 4ac > 0", which is the condition for x to have two solutions. You may have recognised this equivalence from GCSE mathematics, when studying the quadratic equation. The important point here is that the latter statement can actually be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not in the form of a numerical answer but an algebraic description which offers insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how, in general, the number of solutions depends on the coefficients a, b and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; another from economics is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases QE can theoretically help, but in practice the size of the statements means that state-of-the-art procedures run out of computer time/memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical, but understudied, problem and is the focus of this project. At the moment QE procedures make such choices either under the direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition and experience but with limited scientific basis). The purpose of this project is to replace these by machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with the field of Symbolic Computation which studies QE, as the latter prizes exact correctness and so shuns the use of probabilistic tools, making its application here very novel. We are able to combine these different worlds because the choices which we will use ML to make all produce a correct and exact answer (but with different computational costs). The project follows pilot studies undertaken by the PI, which experimented with one ML technique and found that it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use for both low-level algorithm decisions and choices between different theories and implementations.
Although focused on QE, we will also demonstrate ML as a new route to optimisation in Computer Algebra more broadly, and the work encompasses Project Partners and events to maximise this impact. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple, in collaboration with industrial Project Partner Maplesoft.
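
    To make the quadratic example above concrete, here is a small numerical sanity check (my illustration, not the project's QE procedure): it verifies on random coefficients that the quantified statement is equivalent to the unquantified answer b^2 - 4ac > 0, together with the side condition a != 0 that the simplified GCSE answer leaves implicit.

        # Sanity-check the QE equivalence on random real coefficients:
        #   "there exists x: ax^2 + bx + c = 0 has two (distinct real) solutions"
        #   <=>  b^2 - 4ac > 0  and  a != 0
        # This is an empirical illustration only; a real QE procedure derives the
        # right-hand side symbolically, with no rounding error.
        import cmath
        import random

        def two_distinct_real_roots(a, b, c, tol=1e-9):
            """Solve a*x^2 + b*x + c = 0 over the complex numbers and check
            whether it has two distinct real roots."""
            if a == 0:
                return False  # degenerate (linear) case: at most one root
            d = cmath.sqrt(b * b - 4 * a * c)
            r1, r2 = (-b + d) / (2 * a), (-b - d) / (2 * a)
            both_real = abs(r1.imag) < tol and abs(r2.imag) < tol
            return both_real and abs(r1 - r2) > tol

        random.seed(0)
        for _ in range(10_000):
            a, b, c = (random.uniform(-5, 5) for _ in range(3))
            assert two_distinct_real_roots(a, b, c) == (b * b - 4 * a * c > 0 and a != 0)
        print("quantified statement <=> b^2 - 4ac > 0 (a != 0) on all sampled coefficients")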

  • Funder: UKRI Project Code: NE/M017540/2
    Funder Contribution: 284,801 GBP
    Partners: Utrecht University, MBARI, GSC, NCU, SDSU, BU, Middlesex University London, ConocoPhillips Company, CSIC, UNIMI...

    Turbidity currents are volumetrically the most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux from all of the world's rivers, and they form the largest sediment accumulations on Earth (submarine fans). These flows break strategically important seafloor cable networks that carry >95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have from submarine flows in action, which is a stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly, such that source-to-sink data is needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active - occurring on annual or shorter timescales - where previous work provides a basis for future projects, and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because there has been no comparable network established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is also to engage modellers in the design and analysis of monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of systems, and the more infrequent types of flow. This proposal is timely because of recent efforts to develop novel technology for monitoring flows that holds great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and destroy sensors placed on traditional moorings on the seafloor. It includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. Key preliminary data are lacking at some test sites, such as detailed bathymetric base-maps or seismic datasets.
Our final objective is to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC Grant to monitor flows in Monterey Canyon in 2014-2017, and a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and NERC Research Programme Consortium award. He is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.

Advanced search in
Projects
arrow_drop_down
Searching FieldsTerms
Any field
arrow_drop_down
includes
arrow_drop_down
The following results are related to Canada. Are you interested to view more results? Visit OpenAIRE - Explore.
23 Projects, page 1 of 3
  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP
    Partners: Autodesk Inc, University of Salford, University of Toronto

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they can not be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and can not replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living cell based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available on-line for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also develop low cost open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud-computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-word applicability of synthetic biology.

  • Funder: UKRI Project Code: ES/N007883/2
    Funder Contribution: 408,373 GBP
    Partners: University of Montreal, DCU, Loughborough University, SEOUL NATIONAL UNIVERSITY

    In a globalised economic and business context, the norms that shape human resource management travel internationally. This is particularly the case within the multinational company, where individuals are responsible for the creation, diffusion, interpretation and negotiation of norms - which may be rules, principles or guidelines - across international operations. We refer to such individuals as "globalizing actors". The aim of our research is to identify the resources mobilized by globalizing actors in the creation, diffusion, interpretation and negotiation of norms concerning the global coordination of human resources (see 'Objectives' for more detail). Previous research has examined individuals in important international positions, focusing on their orientations and values (e.g. whether they possess 'global mindsets'), the management of international assignments and the characteristics of members of the international business elite. However, these literatures have not systematically examined the actual roles of globalizing actors within firms, and precisely how they create, diffuse, and manage international norms. We examine what such actors actually do within a theoretical framework that sees the behaviour of globalizing actors as shaped by institutions: the institutions in the country in which they originated affect their competencies; they must be sensitive to a variety of host national institutions; and they must navigate their way through a growing range of transnational institutions. Their role is also shaped by organizational context, particularly how the firm derives synergies from integrating their operations internationally, which influences the types of global norms required. However, globalizing actors are not prisoners of institutional and organizational contexts. They can create new norms, develop strategies that help shape the 'rules of the game' and attempt to exploit institutional contradictions and ambiguities. We will explore the individual level resources of these actors to deal with these contexts, such as their skills and knowledge - 'human capital' - the relationship these actors have to others in terms of power, position and trust - their 'social capital' - and their transnational experiences or exposure. We will examine UK MNCs, both at home and across subsidiaries in Europe, North America and East Asia. The research will use multiple methods, consisting of five steps: 1. Pilot Work. Using seed-corn funding, we have tested key concepts and generated contacts for twelve full case studies in subsequent stages of the research. 2. UK interviews. These will focus on those charged with creating new norms, spreading them across international operations, or ensuring compliance. 3. Foreign Subsidiary Interviews. We will conduct interviews in the international operations of each firm, enabling us to understand frames of reference and actor choices in foreign subsidiaries. 4. Multi-level Survey. The survey of a set of globalizing actors will establish individual level capabilities associated with the establishment and diffusion of global norms. 5. Quantitative Diary Study. This methodological innovation allows us both to explore what globalizing actors actually do and to test predictors of behaviours and attitudes. 
The research will make a substantial and distinctive contribution to understanding of the processes of international management, through focusing on individual "globalizing actors" within the contexts of the multiple institutional and organisational contexts within which they make decisions. Equally, through the development and communication of a strong evidence base on how firms build individual and organisational capabilities in international management, the research also aims to enable improvements in the economic effectiveness of UK firms with overseas operations, while acting in ways that respond to the need for social responsibility at local-host and global levels.

  • Funder: UKRI Project Code: NE/L000318/1
    Funder Contribution: 620,481 GBP
    Partners: Nat Uni of Life & Env Sci Ukraine NUBiP, DSA, CIT, McMaster University, PHE, Inst Radiation and Nuclear Safety IRSN, NERC Centre for Ecology and Hydrology, Norwegian University of Life Sciences (NMBU), Faculty of Biosciences, Marine Ecology Rese Institute (Japan), Belgian Nuclear Research Centre SCK CEN

    For all sources of radioactivity, radiological risk assessments are essential for safeguarding human and environmental health. But assessments often have to rely upon simplistic assumptions, such as the use of simple ratios in risk calculations which combine many processes. This pragmatic approach has largely arisen due to the lack of scientific knowledge and/or data in key areas. The resultant uncertainty has been taken into account through conservative approaches to radiological risk assessment which may tend to overestimate risk. Uncertainty arises at all stages of the assessment process from the estimation of transfer to human foodstuffs and wildlife, exposure and risk. Reducing uncertainty is important as it relates directly to scientific credibility, which will always be open to challenge given the highly sensitive nature of radiological risk assessment in society. We propose an integrated, multi-disciplinary, programme to assess and reduce the uncertainty associated with radiological risk assessment to protect human health and the environment. At the same time we will contribute to building the capacity needed to ensure that the UK rebuilds and maintains expertise in environmental radioactivity into the future. Our project has four major and highly inter-related components to address the key goal of RATE to rebuild UK capacity and make a major contribution to enhancing environmental protection and safeguarding human health. The first component will study how the biological availability of radionuclides varies in soils over time. We will investigate if short-term measurements (collected in three year controlled experiments) can be used to predict the long-term availability of radionuclides in soils by testing our models in the Chernobyl exclusion zone. The second component will apply the concepts of 'phylogeny' and 'ionomics' to characterise radionuclide uptake by plants and other organisms. These approaches, and statistical modelling methods, are increasingly applied to describe uptake of a range of elements in plant nutrition, and we are pioneering their use for studying radionuclide uptake in other organisms and human foods. A particularly exciting aspect of the approach is the possibility to make predictions for any plant or animal. This is of great value as it is impossible to measure uptake for all wildlife, crops and farm animals. The third component of the work will extend our efforts to improve the quantification of radiation exposure and understanding of resultant biological effects by investigating the underlying mechanisms involved. A key aim is to see whether what we know from experiments on animals and plants in the laboratory is a good representation of what happens in the real world: some scientists believe that animals in the natural environment are more susceptible to radiation than laboratory animals: we need to test this to have confidence in our risk assessments. Together these studies will enable us to reduce and better quantify the uncertainties associated with radiological risk assessment. By training a cohort of PDRA and PhDs our fourth component will help to renew UK capacity in environmental radioactivity by providing trained, experienced researchers who are well networked within the UK and internationally through the contacts of the investigators. Our students will be trained in a wide range of essential skills through their controlled laboratory studies and working in contaminated environments. 
They will benefit from being a member of a multidisciplinary team and opportunities to take placements with our beneficiaries and extensive range of project partners. The outputs of the project will benefit governmental and non-governmental organisations with responsibility for assessing the risks to humans and wildlife posed by environmental radioactivity. It will also make a major contribution to improved scientific and public confidence in the outcomes of environmental safety assessments.

  • Funder: UKRI Project Code: AH/P006175/1
    Funder Contribution: 47,675 GBP
    Partners: University of Birmingham, Queen's University Canada

    As of March 2016, a total of 104,773 uniformed personnel from 123 countries were serving in 16 peacekeeping operations around the world. Where foreign soldiers - during war, occupation or peacekeeping operations - are on foreign soil, military-civilian relations develop, including those between soldiers and local women. Peacekeepers have increasingly been associated with sexual exploitation and abuse of the vulnerable populations they had been mandated to protect. Many of the intimate relations between peacekeeping personnel and local women, of both voluntary and exploitative nature, have led to pregnancies and to children being born. These so-called 'peace babies' and their mothers face particular challenges in volatile post-conflict communities, reportedly including childhood adversities as well as stigmatization, discrimination and disproportionate economic and social hardships. The network connects two strands of inquiry around 'peace babies' - from the academic world and from within the development sector - in a spirit of conversation and collaboration, to examine challenges of humanitarian intervention in a transnational historical context. Building on the firm belief that history's focus on causality and long-term processes of change is indispensable for appreciating the complex dynamics of socio-cultural change, the network contributes a deeper understanding of development and aims to affect practice. It provides an historical complement to the wealth of available analyses - internal and external - of the contemporary humanitarian environment. Specifically, the network proposes an in-depth-study of the situation of 'peace babies' by exploring the children conceived by personnel from or associated with the United Nations Stabilization Mission in Haiti (MINUSTAH). MINUSTAH is among the missions that have been associated with allegations of a range of abuses, not least related to sexual and gender-based violence and consequently the unintended legacy of children fathered by UN personnel. The UN has recently acknowledged that 'peacekeeper babies' exist. Yet, an evidence base relating to the welfare of children fathered by UN peacekeepers (globally or in Haiti) is virtually non-existent, and it is clear that the existing UN policies and support programs are inadequate. This multidisciplinary collaboration between scholars from Queen's University, the University of Birmingham, the Centre of International and Defence Policy, and Haitian-based Enstiti Travay Sosyal ak Syans Sosyal (ETS), along with civil society organisations, the Institute for Justice and Democracy in Haiti and Haitian-based Bureau des Avocats Internationaux, will address this knowledge gap and enhance our historically-informed understanding of the challenges faced by peace babies and their families as well as the obstacles to accessing support. Beyond the core UK-Canada-Haiti partnership, the network will include a further four ODA-recipient countries (Cambodia, Bosnia, Liberia and the DRC) and will apply insights from Haiti to PSOs more generally in discourse with academic and non-academic participants from those countries with extensive PSO experience. 
The network is structured around three network meetings (two workshops and a network conference, the latter supplemented by an early-career research workshop) which will create a sustainable partnership that focuses on co-creation of knowledge as well as a collaborative mobilisation of this knowledge to inform academic and non-academic stakeholders interested in peacekeepers' children. The findings of the workshops and the final conference will inform both academic outputs and - going forward - the development of an intersectoral research agenda; furthermore they will frame a special journal edition on 'Peace Babies' and will be at the core of the network's activities beyond the funding period, both as a platform for continued transnational and intersectoral conversation and of collaborative research

  • Funder: UKRI Project Code: NE/M017028/1
    Funder Contribution: 766,686 GBP
    Partners: University of Guelph, WU, University of Salford

    Soils provide many functions for humans, including the storage of carbon and nutrient cycling, which are crucial for the production of food and mitigation of climate change. However, there is much concern that soils, and the functions that they provide, are being threatened by a range of pressures, including intensive farming methods and increased frequency of extreme climatic events, such as drought. Not only do these disturbances pose an immediate threat to the functioning of soils, but they could also impair their ability to resist and recover from further stresses that come in the future. Our project will tackle this problem by addressing two general questions: first, what makes a soil able to withstand and recover from disturbance events, such as drought, and, second how can we use this knowledge to ensure soils can buffer disturbances in the future? These are questions that have puzzled soil scientists for many years, but so far, remain unresolved. An area that offers much promise, however, in tackling this issue is food web ecology. Food webs are the networks of interactions describing who eats whom amongst the myriad organisms within an ecosystem. And in soil, they are the engine that drives the very processes of nutrient cycling and energy flow on which the functioning of soil and the terrestrial ecosystems they support, depend. It has been proposed for many years, but so far not fully tested in soil, that simple food webs are less able to withstand and recover from disturbance events, such as drought than complex ones. We want to test this theory in soil, which harbours some of the most complex, but also sensitive, food webs on Earth. We test the idea, through experiments and models, that the ability of a soil to withstand, recover and adapt to disturbance events depends on the architecture and diversity of the soil food web, which governs the rate of transfer of nutrients and energy through the plant-soil system. We also propose that soil disturbances associated with intensive land use, such as trampling and fertiliser addition, erode the very food web structures that make the soil system stable, thereby reducing the ability of soil to resist and recover from future disturbances, such as extreme weather events. We will also resolve what makes a food web stable, and test the roles of different types of organisms in soil, such as mycorrhizal fungi, which we believe play a major role. And finally, we will develop new models to help us better predict how soils will respond to future threats and to guide management decisions on sustainable soil management in a rapidly changing world. These question are at the heart of the NERC Soil Security programme which seeks to resolve what controls the ability of soils and their functions to resist, recover and ultimately adapt, to perturbations, such as those caused by land use and extreme climatic events.

  • Funder: UKRI Project Code: EP/M01052X/1
    Funder Contribution: 731,953 GBP
    Partners: University of Edinburgh, RU, UMD, SFU, University of Kent

    Condensed matter physics has developed a relatively complete theory of the common phases of materials, leading to many technologically important devices, including electronic screens, memory storage and switching devices. Landau, or mean-field, theory has provided a framework to model, predict and understand phases and transitions in a surprisingly diverse variety of materials and also dynamical systems. While these conventional ground states have proven technologically important and the underlying theory represents a major success for scientists, these phases have proven incredibly difficult to suppress and often emerge when new materials properties are sought or engineered. To discover the novel phases that will lead to a new materials revolution, these common phases need to be suppressed so that exotic and unconventional properties can emerge. The most common vehicle for turning off conventional phases in materials has been the introduction of disorder through chemical doping, resulting in strong random fields. Many important theories have been formulated and tested to describe the effects of random fields, and in particular to account for the fine balance between surface and bulk free energy. However, the use of disorder has proved limiting: properties are often templated into the material rather than being directly controllable, and the resulting ground state of the material is difficult to understand. Another route to suppressing conventional phases, explored over the last decade, is the introduction of strong fluctuations. While this can be done trivially with temperature, new phases have emerged from the study of quantum systems, where the physics is governed by quantum mechanics and the Heisenberg uncertainty principle. The study of quantum systems has resulted in the discovery of many new phases of matter, including high-temperature superconductors and quantum spin liquids, in which the magnetism is dynamic at any temperature. A limitation of quantum fluctuations is that their properties do not carry over directly to ferroelectric-based systems, or to multiferroics, where magnetic and structural properties are strongly coupled. Also, owing to the strongly fluctuating nature of the ground state, the properties have not been found to be easily tunable, limiting immediate use in applications. This proposal therefore aims to take a different route by studying classically frustrated systems, where a large ground-state degeneracy is introduced naturally through the lattice and quantum mechanical effects are small. Emphasis will be placed on lattices based upon a triangular geometry. The absence of the strong fluctuations that exist in quantum systems provides the ability to tune controllably between different ground states, making this route a potential means of creating new switching devices or novel memory storage systems. The proposal aims to investigate classically frustrated magnets and ferroelectrics. These systems can be described within a common framework and will be studied using scattering techniques to provide a bulk, real-space picture of the ground state. The properties will be tuned with magnetic and electric fields, supplying a direct route towards technologically applicable materials. The combined approach of investigating ferroelectrics and magnets will result in a complete understanding applicable to immediate industrial applications. These new materials will lead to the discovery of new phases, including new high-temperature multiferroics, classical spin liquids, and localized controllable boundaries or defects.
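
    The simplest concrete example of this lattice-induced degeneracy is three antiferromagnetic Ising spins on a single triangle, where no configuration can satisfy all three bonds at once. The sketch below is purely illustrative (not taken from the proposal; the coupling and units are arbitrary) and enumerates the configurations to count the degenerate ground states.

from itertools import product

J = 1.0  # antiferromagnetic coupling, J > 0 (arbitrary illustrative unit)
bonds = [(0, 1), (1, 2), (2, 0)]  # the three bonds of one triangle

def energy(spins):
    # E = J * sum over bonds of s_i * s_j; J > 0 penalises aligned neighbours
    return J * sum(spins[i] * spins[j] for i, j in bonds)

configs = list(product([-1, +1], repeat=3))
energies = {s: energy(s) for s in configs}
e_min = min(energies.values())
ground_states = [s for s, e in energies.items() if e == e_min]
print(f"ground-state energy: {e_min}, degeneracy: {len(ground_states)} of {len(configs)}")
# -> 6 of the 8 configurations are degenerate ground states: on every triangle
#    at least one bond must stay 'unhappy', the hallmark of geometric frustration.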

  • Funder: UKRI Project Code: EP/R004730/1
    Funder Contribution: 101,150 GBP
    Partners: University of Toronto, University of Warwick, Abdus Salam ICTP, SISSA - ISAS

    The objects of study in differential geometry are smooth manifolds, which correspond to smooth curved objects of finite dimension. In modern differential geometry, it is becoming more and more common to consider sequences (or flows) of smooth manifolds. Typically, the limits of such sequences (or flows) are no longer smooth. It is then useful to isolate a natural class of non-smooth objects which generalize the classical notion of smooth manifold, and which is closed under the process of taking limits. If the sequence of manifolds satisfies a lower bound on the sectional curvatures, a natural class of non-smooth objects closed under (Gromov-Hausdorff) convergence is given by special metric spaces known as Alexandrov spaces; if instead the sequence of manifolds satisfies a lower bound on the Ricci curvatures, a natural class of non-smooth objects, closed under (measured Gromov-Hausdorff) convergence, is given by special metric measure spaces (i.e. metric spaces endowed with a reference volume measure) known as RCD(K,N) spaces. These are a 'Riemannian' refinement of the so-called CD(K,N) spaces of Lott-Sturm-Villani, which are metric measure spaces with Ricci curvature bounded below by K and dimension bounded above by N in a synthetic sense, via optimal transport. In the proposed project we aim to understand in more detail the structure and the analytic and geometric properties of RCD(K,N) spaces. The new results will also have an impact on the classical world of smooth manifolds satisfying curvature bounds.
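
    For orientation, the simplest instance of the synthetic condition can be written out explicitly. The following is the standard Lott-Sturm-Villani formulation of CD(K,∞) as found in the literature, added here for illustration rather than quoted from the proposal: a metric measure space $(X,d,m)$ satisfies CD(K,∞) if between any two measures of finite entropy there is a Wasserstein geodesic $(\mu_t)$ along which the relative entropy is $K$-convex,

    \[
    \mathrm{Ent}_m(\mu_t) \;\le\; (1-t)\,\mathrm{Ent}_m(\mu_0) + t\,\mathrm{Ent}_m(\mu_1) - \frac{K}{2}\,t(1-t)\,W_2^2(\mu_0,\mu_1), \qquad t \in [0,1],
    \]

    where $\mathrm{Ent}_m(\mu) = \int_X \rho \log \rho \, dm$ for $\mu = \rho\, m$. The finite-dimensional CD(K,N) condition is analogous but formulated with Rényi-type entropies, and the 'Riemannian' RCD refinement additionally requires the Cheeger energy to be quadratic (infinitesimal Hilbertianity), which rules out Finsler-type geometries.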

  • Funder: UKRI Project Code: EP/N018958/2
    Funder Contribution: 305,534 GBP
    Partners: University of London, University of Edinburgh, University of Salford, University of Leeds, The Mathworks Ltd, Wolfram Research Europe Ltd, MICROSOFT RESEARCH LIMITED, NAG, N8 Research Partnership, 3DS...

    "Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development with the knock-on effects to their productivity, and the reliability, readability and reproducibility of their software [Nature Biotechnology]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic - constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy and how best to deploy them is in itself a specialist domain, with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance, and load carrying. When optimally performed it results in a researcher who knows what challenges they can attack alone, and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities as they navigate through this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals who have a curiosity driven nature dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness, it will have a multiplicative effect since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from yet complements the top-down approach provided by the EPRSC-funded Software Sustainability Institute. The outcomes of this fellowship will be: Local and National RSE Capability: A RSE Group at Sheffield as a credible roadmap for others pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs. Scalable software support methods: A scalable approach based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality-guidelines for research software and for researchers on how best to incorporate research software engineering support within their grant proposals. HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into Desktop Applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners. Communication and public understanding: A communication campaign to raise the profile of research software exploiting high profile social media and online resources, establishing an informal forum for research software debate. References [Goble 2014] Goble, C. "Better Software, Better Research". 
IEEE Internet Computing 18(5): 4-8 (2014) [SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers" http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014) [Nature 2015] Editorial "Rule rewrite aims to clean up scientific software", Nature Biotechnology 520(7547) April 2015

  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP
    Partners: RWTH, CNRS, Maplesoft, UniGe, Coventry University

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform "Quantifier Elimination" (QE). We say a logical statement is "quantified" if it is preceded by a quantifier such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise, its implications are unclear: what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an equivalent unquantified one. In this case we may replace the statement by "b^2 - 4ac > 0", the condition for the equation to have two solutions for x. You may have recognised this equivalence from GCSE mathematics, from the study of the quadratic equation. The important point here is that the latter statement can be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not numerical answers but algebraic descriptions which offer insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how, in general, the number of solutions depends on the coefficients a, b and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; another, from economics, is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases QE can help in theory, but in practice the size of the statements means that state-of-the-art procedures run out of computer time or memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical but understudied problem, and is the focus of this project. At the moment QE procedures make such choices either under the direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition and experience but with limited scientific basis). The purpose of this project is to replace these with machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with Symbolic Computation, the field that studies QE, as the latter prizes exact correctness and so shuns probabilistic tools, making their application here very novel. We are able to combine these different worlds because the choices for which we will use ML all produce a correct and exact answer (though with different computational costs). The project follows pilot studies undertaken by the PI, which experimented with one ML technique and found that it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use both for low-level algorithm decisions and for choices between different theories and implementations. Although focused on QE, we will also demonstrate ML as a new route to optimisation in Computer Algebra more broadly, and the work encompasses Project Partners and events to maximise this impact. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple, in collaboration with industrial Project Partner Maplesoft.
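
    As a minimal illustration of QE being run in software (purely illustrative: the Z3 solver is used here only because it is freely scriptable, and is not named in the project, which targets Maple), the following sketch asks a quantifier-elimination tactic to remove the "there exists".

from z3 import Reals, Exists, And, Tactic

a, b, c, x = Reals('a b c x')

# Linear warm-up: "there exists x with a < x < b" should reduce to a < b;
# Z3's 'qe' tactic handles linear real arithmetic reliably.
linear = Exists([x], And(a < x, x < b))
print(Tactic('qe')(linear))

# The quadratic example from the text: existence of a root of a*x^2 + b*x + c.
# Whether this nonlinear case is fully eliminated depends on the Z3 version;
# dedicated real QE tools (e.g. QEPCAD B, Redlog, Maple's RegularChains)
# return a discriminant-style answer such as b^2 - 4ac >= 0, with case splits on a.
quadratic = Exists([x], a*x*x + b*x + c == 0)
print(Tactic('qe')(quadratic))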

  • Funder: UKRI Project Code: NE/M017540/2
    Funder Contribution: 284,801 GBP
    Partners: Utrecht University, MBARI, GSC, NCU, SDSU, BU, Middlesex University London, ConocoPhillips Company, CSIC, UNIMI...

    Turbidity currents are volumetrically the most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux of all the world's rivers, and such flows form the largest sediment accumulations on Earth (submarine fans). These flows break strategically important seafloor cable networks that carry over 95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have of submarine flows in action, in stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly along their paths, so source-to-sink data are needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active, occurring on annual or shorter timescales; where previous work provides a basis for future projects; and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because no comparable network has been established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is to engage modellers in the design and analysis of the monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of these systems, and the more infrequent types of flow. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. This proposal is timely because of recent efforts to develop novel monitoring technology that holds great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and destroy sensors placed on traditional moorings on the seafloor. It includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Key preliminary data are lacking at some test sites, such as detailed bathymetric base-maps or seismic datasets. Our final objective is therefore to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC grant to monitor flows in Monterey Canyon in 2014-2017, and to a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and a NERC Research Programme Consortium award; he is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.