Search results: projects related to Canada (filters: UK Research and Innovation; UKRI|EPSRC; 2021)
17 Projects, page 1 of 2
  • Funder: UKRI Project Code: EP/V011855/1
    Funder Contribution: 4,436,180 GBP
    Partners: Marine Minerals Ltd, Apto Solutions, Bullitt, Natural History Museum, Cornish Mining World Heritage, Ravel, Mkango Resources Limited, Cornwall Resources Limited, Critical Minerals Association, Roskill Information Services Ltd...

    The Circular Economy (CE) is a revolutionary alternative to a traditional linear, make-use-dispose economy. It is based on the central principle of maintaining continuous flows of resources at their highest value for the longest period, and then recovering, cascading and regenerating products and materials at the end of each life cycle. Metals are ideal flows for a circular economy: with careful stewardship and good technology, metals mined from the Earth can be reused indefinitely. Technology metals (techmetals) are an essential and distinct subset of specialist metals. Although they are used in much smaller quantities than industrial metals such as iron and aluminium, each techmetal has its own specific and special properties that give it essential functions in devices ranging from smart phones, batteries, wind turbines and solar cells to electric vehicles. Techmetals are thus essential enablers of a future circular, low-carbon economy, and demand for many is increasing rapidly. For example, meeting the UK's 2050 ambition for offshore wind turbines will require ten years' worth of global neodymium production, and replacing all UK-based vehicles with electric vehicles would require 200% of the cobalt and 75% of the lithium currently produced globally each year. The UK is 100% reliant on imports of techmetals, including from countries that present geopolitical risks. Some techmetals are therefore classed as Critical Raw Materials (high economic importance and high risk of supply disruption). Only four of the 27 raw materials considered critical by the EU have an end-of-life recycling input rate higher than 10%. Our UKRI TechMet CE Centre brings together, for the first time, world-leading researchers to maximise opportunities around the provision of techmetals from primary and secondary sources and to lead materials stewardship, creating a National Techmetals Circular Economy Roadmap to accelerate the UK towards a circular economy. This will help the UK meet its Industrial Strategy Clean Growth agenda and its ambitious 2050 climate change targets with secure and environmentally acceptable supplies of techmetals. There are many challenges to a future techmetal circular economy. With growing demand, new mining is needed, and we must keep the environmental footprint of this primary production as low as possible. Materials stewardship of techmetals is difficult because their fate is often hard to track. Most arrive in the UK 'hidden' in complex products from which they are difficult to recover. Collection is inefficient, consumers may not feel incentivised to recycle, and policy and legislative initiatives such as Extended Producer Responsibility focus on large-volume metals rather than small-quantity techmetals. There is a lack of end-to-end visibility and connection between different parts of techmetal value chains. The TechMet consortium brings together the Universities of Exeter, Birmingham, Leicester and Manchester and the British Geological Survey, which are already working on how to improve the raw materials cycle, manufacture goods to be re-used and recycled, recycle complex goods such as batteries, and use and re-use equipment for as long as possible before it needs recycling. One of our first tasks is to track the current flows of techmetals through the UK economy, which, although fundamental, are poorly known.
The Centre will conduct new interdisciplinary research on interventions to improve each stage in the cycle and join up the value chain - raw materials can be newly mined and recycled, and manufacturing technology can be linked directly to re-use and recycling. The environmental footprint of our techmetals will be evaluated. Business, regulatory and social experts will recommend how the UK can best put all these stages together to make a new techmetals circular economy and produce a strategy for its implementation.

  • Funder: UKRI Project Code: EP/W007673/1
    Funder Contribution: 972,421 GBP
    Partners: University of London, University of Toronto, KageNova, Curtin University, UCD

    The emerging era of exascale computing, ushered in by the forthcoming generation of supercomputers, will provide both opportunities and challenges. The raw compute power of such high performance computing (HPC) hardware has the potential to revolutionize many areas of science and industry. However, novel computing algorithms and software must be developed to ensure the potential of novel HPC architectures is realized. Computational imaging, where the goal is to recover images of interest from raw data acquired by some observational instrument, is one of the most widely encountered classes of problems in science and industry, with myriad applications across astronomy, medicine, planetary and climate science, computer graphics and virtual reality, geophysics, molecular biology, and beyond. The rise of exascale computing, coupled with recent advances in instrumentation, is leading to novel and often huge datasets that, in principle, could be imaged for the first time in an interpretable manner at high fidelity. However, to unlock interpretable, high-fidelity imaging of big data, novel methodological approaches, algorithms and software implementations are required; we will develop precisely these components as part of the Learned EXascale Computational Imaging (LEXCI) project. Firstly, whereas traditional computational imaging algorithms are based on relatively simple hand-crafted prior models of images, in LEXCI we will learn appropriate image priors and physical instrument simulation models from data, leading to much more accurate representations. Our hybrid techniques will be guided by model-based approaches to ensure effectiveness, efficiency, generalizability and uncertainty quantification. Secondly, we will develop novel algorithmic structures that support highly parallelized and distributed implementations, for deployment across a wide range of modern HPC architectures. Thirdly, we will implement these algorithms in professional research software. The structure of our algorithms will allow not only computations but also memory and storage requirements to be distributed across multi-node architectures. We will develop a tiered parallelization approach targeting both large-scale distributed-memory parallelization, for distributing work across nodes, and light-weight data parallelism through vectorization or light-weight threads, for distributing work on processors and co-processors. This tiered parallelization approach will ensure the software can be used across the full range of modern HPC systems. Combined, these developments will provide a future computing paradigm to help usher in the era of exascale computational imaging. The resulting computational imaging framework will have widespread application and will be applied to a number of diverse problems as part of the project, including radio interferometric imaging, magnetic resonance imaging, seismic imaging, computer graphics, and beyond. The resulting software will be deployed on the latest HPC computing resources to evaluate its performance and to feed back to the community the computing lessons learned and techniques developed, so as to support the general advance of exascale computing.
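
    As a rough illustration of the hybrid, model-based imaging the abstract describes, the sketch below alternates a data-fidelity gradient step with a denoising step standing in for a learned prior (a "plug-and-play" iteration). The forward operator, step size and the smoothing denoiser are all assumptions made for illustration, not LEXCI's actual algorithms.

    import numpy as np

    def forward(x, psf):
        # Toy measurement operator: circular convolution with a known blur.
        return np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(psf)))

    def adjoint(y, psf):
        # Adjoint of the circular convolution above.
        return np.real(np.fft.ifft2(np.fft.fft2(y) * np.conj(np.fft.fft2(psf))))

    def denoise(x, strength=0.1):
        # Placeholder for a learned denoiser (e.g. a trained network);
        # simple Laplacian smoothing keeps the sketch self-contained.
        lap = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
               np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4 * x)
        return x + strength * lap

    def reconstruct(y, psf, step=0.5, iters=50):
        x = adjoint(y, psf)                            # start from the "dirty" image
        for _ in range(iters):
            grad = adjoint(forward(x, psf) - y, psf)   # data-fidelity gradient
            x = denoise(x - step * grad)               # prior applied as a denoising step
        return x

    truth = np.zeros((32, 32)); truth[12:20, 12:20] = 1.0
    psf = np.zeros((32, 32)); psf[:3, :3] = 1.0 / 9.0  # small box blur
    x_hat = reconstruct(forward(truth, psf), psf)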

  • Funder: UKRI Project Code: EP/V028251/1
    Funder Contribution: 613,910 GBP
    Partners: Microsoft Research, Xilinx Corp, SU, UBC, Deloitte LLP, RIKEN, Hebei University, Cordouan Technologies, Dunnhumby, Imperial College London...

    The DART project aims to pioneer a ground-breaking capability to enhance the performance and energy efficiency of reconfigurable hardware accelerators for next-generation computing systems. This capability will be achieved by a novel foundation for a transformation engine based on heterogeneous graphs for design optimisation and diagnosis. While hardware designers are familiar with transformations based on Boolean algebra, the proposed research promotes a design-by-transformation style by providing, for the first time, tools that facilitate experimentation with design transformations and their regulation by meta-programming. These tools will cover design space exploration based on machine learning, and end-to-end tool chains mapping designs captured in multiple source languages to heterogeneous reconfigurable devices targeting cloud computing, the Internet of Things and supercomputing. The proposed approach will be evaluated through a variety of benchmarks involving hardware acceleration, and through codifying strategies for automating the search for neural architectures that offer both high accuracy and high efficiency in hardware implementation.
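
    To make the design-by-transformation idea concrete, here is a deliberately tiny sketch in the same spirit: a design held as a graph of typed nodes, with one rewrite rule fusing a multiply feeding an add into a single multiply-accumulate unit. The node types and the rule are invented for illustration and bear no relation to DART's actual engine or intermediate representation.

    from dataclasses import dataclass, field
    from typing import List, Union

    @dataclass
    class Node:
        kind: str                                   # operator type: 'mul', 'add', 'mac', ...
        inputs: List[Union["Node", str]] = field(default_factory=list)

    def fuse_mul_add(n: Node) -> Node:
        """Toy rewrite rule: add(mul(a, b), c) -> mac(a, b, c), applied bottom-up."""
        n.inputs = [fuse_mul_add(i) if isinstance(i, Node) else i for i in n.inputs]
        if (n.kind == "add" and n.inputs and isinstance(n.inputs[0], Node)
                and n.inputs[0].kind == "mul"):
            mul = n.inputs[0]
            return Node("mac", mul.inputs + n.inputs[1:])
        return n

    # y = a*b + c collapses to a single fused multiply-accumulate unit.
    design = Node("add", [Node("mul", ["a", "b"]), "c"])
    print(fuse_mul_add(design))                     # Node(kind='mac', inputs=['a', 'b', 'c'])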

  • Funder: UKRI Project Code: EP/W002973/1
    Funder Contribution: 4,300,500 GBP
    Partners: University of Salford, MMU, Aalto University, Astrazeneca Plc, GLA, University of Cambridge, Spectra Analytics, Etsimo Healthcare Oy, Health Innovation Manchester, Delft University of Technology...

    Machine learning offers great promise in helping us solve problems by automatically learning solutions from data, without us having to specify all details of the solution as in earlier computational approaches. However, we still need to tell machine learning systems what problems we want them to solve, and this is currently done by specifying desired outcomes and designing objective functions and rewards. Formulating the rewards for a new problem is not easy for us as humans, and is particularly difficult when we only partially know the goal, as is the case at the beginning of scientific research. In this programme we develop ways for machine learning systems to help humans steer them through the process of collecting more information: designing experiments, interpreting what the results mean, and deciding what to measure next, so as to finally reach a conclusion and a trustworthy solution to the problem. The machine learning techniques will be developed first for three practically important problems and then generalized to be broadly applicable. The first is diagnosis and treatment decision making in personalized medicine; the second is the steering of scientific experiments in synthetic biology and drug design; and the third is the design and use of digital twins for physical systems and processes. An AI centre of excellence will be established at the University of Manchester, in collaboration with the Turing Institute and a number of partners from industry and the healthcare sector, and with strong connections to networks of the best national and international AI researchers.
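
    The loop below illustrates, in miniature, the measure-interpret-decide cycle the abstract describes: a bootstrap ensemble supplies uncertainty estimates, and the next experiment is placed where the model is least certain. The toy target function, the ensemble and the acquisition rule are all assumptions for illustration, not the programme's methods.

    import numpy as np

    rng = np.random.default_rng(0)
    true_f = lambda x: np.sin(3 * x)          # stand-in for the system under study
    X = list(rng.uniform(0, 2, 3))            # three initial experiments
    Y = [true_f(x) for x in X]

    def predict(x, n_models=20):
        # Bootstrap ensemble of low-degree polynomial fits: a stand-in for a
        # learned model that reports its own uncertainty.
        preds = []
        for _ in range(n_models):
            idx = rng.integers(0, len(X), len(X))
            coef = np.polyfit(np.array(X)[idx], np.array(Y)[idx],
                              deg=min(3, len(X) - 1))
            preds.append(np.polyval(coef, x))
        return np.mean(preds), np.std(preds)

    for step in range(10):                    # the measure-interpret-decide loop
        cand = np.linspace(0, 2, 101)
        uncertainty = [predict(c)[1] for c in cand]
        x_next = float(cand[int(np.argmax(uncertainty))])
        X.append(x_next)                      # run the experiment the model asked for
        Y.append(true_f(x_next))
    print("queried points:", np.round(X, 2))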

  • Funder: UKRI Project Code: EP/W000652/1
    Funder Contribution: 800,898 GBP
    Partners: SU, University of Reading, Draper & Dash Healthcare, Humanity Vision Limited, Oxford Immune Algorithmics, Massachusetts Institute of Technology, USA, Imperial College London, University College London Hospital (UCLH) NHS Foundation Trust, The Chinese University of Hong Kong, RBFT...

    There is extremely high demand for laboratory-based blood tests from community settings in the UK, and analysis suggests an important future role for remote blood monitoring that would enable patients and health professionals to carry out their own tests remotely, greatly benefiting patients and speeding up decision making. The COVID-19 pandemic has further highlighted the need for remote and connected blood testing that goes beyond the online virtual clinics in the NHS outpatient setting. In current blood testing services for community healthcare, it is challenging to obtain and process blood samples outside of the clinical setting without training and lab facilities, and patients are required to attend a GP surgery or hospital for tests, with the attendant travel burden and infection risk. Many blood analyses are done in batches that take a long time to build up, so the speed of analysis for routine tests and the time taken for diagnosis are further challenges. Despite recent innovations in point-of-care testing, current blood analysis tools in practice are mainly mechanical or labour-intensive, require extensive filtering and manual tweaking, and are not suitable for regular at-home monitoring and longitudinal analytics. There is no personalised real-time approach available to track disease complexity and conditions over time, which is critical for early detection of acute diseases and the management of chronic conditions. In England, around 95% of clinical pathways rely on patients having access to efficient, timely and cost-effective pathology services, and around 500 million biochemistry and 130 million haematology tests are carried out per year. Inefficient and infrequent blood testing therefore leads to late diagnosis, incomplete knowledge of disease progression and potential complications across a wide range of populations. Given these challenges and the current digital transformation in healthcare, this is a timely opportunity to bring researchers, clinicians and industrialists together to address the challenges of blood monitoring and analytics. The proposed Network+ will build an interdisciplinary community that will explore future blood testing solutions to achieve remote, inclusive, rapid, affordable and personalised blood monitoring, addressing the above challenges in community health and care. To achieve the Network+ vision, research will be conducted through collaborations spanning information and communication technology (ICT), data and analytical science, clinical science, applied optics, biochemistry, engineering and the social sciences. The network will address three key technical challenges in blood testing (remote monitoring; ICT; personalised data and AI) in a range of exemplar clinical areas including cancer, autoimmune diseases, sickle cell disease, preoperative care, pathology services and general primary care.

  • Funder: UKRI Project Code: EP/V002325/1
    Funder Contribution: 395,816 GBP
    Partners: Macquarie University, Massachusetts Institute of Technology, USA, University of Quebec, University of California System, CASE WESTERN RESERVE UNIVERSITY, University of Leeds, Université Paris Diderot

    When we begin to study mathematics, we learn that the operation of multiplication on numbers satisfies some basic rules. One of these rules, known as associativity, says that for any three numbers a, b and c, we get the same result if we multiply a and b and then multiply the result by c, or if we multiply a by the result of multiplying b and c. This leads to the abstract algebraic notion of a monoid, which is a set (in this case the set of natural numbers) equipped with a binary operation (in this case multiplication) that is associative and has a unit (in this case the number 1). If we continue to study mathematics, we encounter a new kind of multiplication, no longer on numbers but on sets, known as the Cartesian product. Given two sets A and B, their Cartesian product is the set A x B whose elements are the ordered pairs (a, b), where a is an element of A and b is an element of B. Pictorially, the Cartesian product of two sets is a grid with coordinates given by the elements of the two sets. This operation satisfies some rules, analogous to those for the multiplication of numbers, but a little more subtle. For example, if we are given three sets A, B and C, then the set A x (B x C) is isomorphic (rather than equal) to the set (A x B) x C. Here, being isomorphic means that they are essentially the same by means of a one-to-one correspondence between the elements of A x (B x C) and those of (A x B) x C. This construction leads to the notion of a monoidal category, which amounts to a collection of objects and maps between them (in this case the collection of all sets and functions between them) equipped with a multiplication (in this case the Cartesian product) that is associative and has a unit (in this case the one-element set) up to isomorphism. Monoidal categories, introduced in the '60s, have been extremely important in several areas of mathematics (including logic, algebra, and topology) and theoretical computer science. In logic and theoretical computer science, they connect to linear logic, in which one keeps track of the resources necessary to prove a statement. This project is about the next step in this sequence of abstract notions of multiplication: the notion of a monoidal bicategory. In a bicategory, we have not only objects and maps but also 2-maps, which can be thought of as "maps between maps" and allow us to capture how different maps relate to each other. In a monoidal bicategory, we have a way of multiplying objects, maps and 2-maps, subject to complex axioms. Monoidal bicategories, introduced in the '90s, have potential for applications even greater than that of monoidal categories, as they allow us to keep track of even more information. We seek to realise this potential by advancing the theory of monoidal bicategories. We will prove fundamental theorems about them, develop new connections to linear logic and theoretical computer science, and investigate examples that are of interest in algebra and topology. Our work connects to algebra via an important research programme known as "categorification", which is concerned with replacing set-based structures (like monoids) with category-based structures (like monoidal categories) in order to obtain more subtle invariants. Our work links to topology via the notion of an operad, a flexible tool used to describe algebraic structures in which axioms hold not as equalities but up to weak forms of isomorphism.
Overall, this project will bring the theory of monoidal bicategories to a new level and promote interdisciplinary research within mathematics and with theoretical computer science.
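
For readers who prefer symbols, the ladder the abstract climbs can be summarised in standard notation; the block below is a compact restatement of textbook definitions, not a result of the project.

    % A monoid: a set M with an associative multiplication and a unit,
    %     (ab)c = a(bc),   1a = a = a1.
    % A monoidal category weakens these equalities to natural isomorphisms
    \[
      \alpha_{A,B,C} \colon A \otimes (B \otimes C) \,\cong\, (A \otimes B) \otimes C,
      \qquad
      \lambda_A \colon I \otimes A \cong A,
      \qquad
      \rho_A \colon A \otimes I \cong A,
    \]
    % subject to coherence axioms (the pentagon and triangle identities).
    % A monoidal bicategory weakens coherence one level further: the constraints
    % are related by invertible 2-maps, themselves subject to higher axioms.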

  • Funder: UKRI Project Code: EP/W001071/1
    Funder Contribution: 220,947 GBP
    Partners: ErgoWind S.r.l., Offshore Wind Consultants Ltd, NERC British Geological Survey, UWO, University of Brighton

    The proposed research aims to develop an innovative mitigation device to protect next-generation onshore and offshore wind farms from dynamic loading caused by extreme natural events. In 2020, 20% of the UK's electricity was obtained from wind, using both onshore and offshore wind farms. In order to increase this percentage and help the UK meet its climate change targets, new wind farms with taller and larger wind turbines, situated in more extreme locations, are planned. Projections of growth also indicate expansion into emerging markets and the construction of new wind farms in developing countries. These next-generation wind turbines will therefore have to cope with harsher climate conditions induced by stronger storms and taller sea waves, and with extreme events such as earthquakes and tsunamis. Several simplifying assumptions used for the design of previous generations of wind turbines can no longer be applied, and new critical factors and uncertainties linked to power-generation efficiency and structural safety will emerge, affecting their resilience and life-cycle. The particular focus of this research is the traditional transition piece of a wind turbine, the structural element that connects the tower with its foundation and must tolerate extreme stresses induced by dynamic loading during extreme natural events. The aim is to replace the traditional connector with a novel mechanical joint of hourglass shape, termed an Hourglass Lattice Structure (HLS). This innovation will combine the unique features of two proven technologies that are extremely effective in seismic engineering, namely the "reduced beam section" approach and the "rocking foundation" design. In particular, the proposed HLS device, because of its hourglass shape, will facilitate rocking behaviour in order to create a highly dissipating "fuse" that protects the wind tower and foundation. The effect of the proposed device on structural life-cycle risk will be assessed through analytical, numerical and experimental investigation, using as a measure of efficiency the levelized cost of energy (LCOE), namely the cost per unit of energy based on amortized capital cost over the project life. In addition, experimental testing of small-scale offshore wind turbines will be carried out by means of an innovative test rig, the first-ever underwater shake-table hosted in a hydraulic flume, which will be deployed, calibrated, and used to simulate multi-hazard scenarios such as those recently discovered and dubbed "stormquakes". The successful outcome of this timely project will allow next-generation wind turbines to be more resilient and cost-effective, so that wind energy can develop as a competitive renewable energy resource with less need for government subsidy. The inclusion of industrial partners in all stages of the project ensures that the technical developments will be included in commercial devices for medium- to long-term impact.
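
    Since LCOE is the project's headline efficiency measure, a minimal sketch of the standard calculation may help: discounted lifetime cost divided by discounted lifetime energy output. All figures below (capital cost, operating cost, yield, discount rate) are illustrative assumptions, not project data.

    def lcoe(capex, opex_per_year, energy_mwh_per_year, years, discount=0.05):
        """Levelized cost of energy: discounted lifetime cost divided by
        discounted lifetime energy output (GBP per MWh)."""
        cost = capex + sum(opex_per_year / (1 + discount) ** t
                           for t in range(1, years + 1))
        energy = sum(energy_mwh_per_year / (1 + discount) ** t
                     for t in range(1, years + 1))
        return cost / energy

    # Illustrative only: 10 MW turbine, ~4,000 full-load hours/year, 25-year life.
    print(round(lcoe(capex=12e6, opex_per_year=3e5,
                     energy_mwh_per_year=40_000, years=25), 1), "GBP/MWh")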

  • Funder: UKRI Project Code: EP/V041665/1
    Funder Contribution: 1,504,770 GBP
    Partners: AMP Clean Energy, SFU, Ferrite Microwave Technologies LLC, University of Birmingham, GEIRI Europe

    The Committee on Climate Change suggests that we need to decarbonise all heat in buildings by 2050 to achieve the Net Zero emissions targets. The electrification of heat supply, through either direct electric heating or heat pumps, seems the most likely route to be realised in practice. However, the complete electrification of heat will result in much higher electricity demand in winter than in summer. Furthermore, because heat demand closely tracks ambient temperature, it will also lead to spikes in electricity demand, which are a major challenge for the grid. The HARVEST project will develop a new solution that can absorb and accumulate curtailed or waste renewable electricity throughout the year using thermochemical heat storage technology, and then convert and magnify the heat output in winter and the cooling output in summer using heat pump technology. The unique features of the proposed solution are: (1) a microwave-assisted process to flexibly absorb renewable electricity; and (2) a compact and efficient regeneration process based on direct-contact reaction between thermochemical heat storage materials and ammonia solution. We have established a strong multidisciplinary consortium, consisting of leading researchers from the University of Birmingham, the University of Edinburgh, and University College London, to address the key challenges in both the scientific/technological and social aspects. Our research will contribute significantly to several approaches identified in the 'Decarbonising Heating and Cooling 2' call document, in particular 'new technologies of heating and/or cooling' and 'new methods or significant developments for heat storage or cold storage'. Our research is further supported by UK and international partners to maximise knowledge exchange and impact delivery.

  • Project: 2021 - 2023
    Funder: UKRI Project Code: EP/V049763/1
    Funder Contribution: 130,807 GBP
    Partners: University of Alberta, NTU

    The last decade has seen staggering advances in our ability to acquire and process information at the single-atom and single-molecule levels. Both the scanning tunnelling microscope (STM) and its slightly younger sibling, the atomic force microscope (AFM), now enable individual atoms to be probed, positioned and, in essence, programmed by exploiting control of an impressively wide variety of physicochemical processes and properties right down to the single chemical bond limit. Recent work by Andreas Heinrich's team at IBM Research Labs has excitingly bridged the worlds of quantum information processing and not just nanotechnology but atomtech. This opens up entirely new approaches not just to quantum computing but to much more energy-efficient classical information processing via spin control in solid state devices (whose power consumption is increasingly unsustainable for many applications). Although exceptionally impressive, the single-atom qubits achieved by the IBM team are fabricated and manipulated on a bespoke material system involving a thin oxide film on a metal substrate, which is unfortunately not the most technologically relevant or scalable of architectures. Our New Horizons application instead involves information processing, logic, and spin control at the single-atom level in silicon, a material that remains at the very core of our information society and will likely remain there for quite some time to come. We will exploit recent advances by Bob Wolkow's team at the University of Alberta in the fabrication of atomic-scale Boolean gates to develop a new spin logic architecture based on the surprising "innate" magnetism of electron orbitals created on an atomically sculpted silicon surface.

  • Funder: UKRI Project Code: EP/T015748/1
    Funder Contribution: 421,950 GBP
    Partners: UC, CNRS, Coventry University, Maplesoft, Macquarie University, RWTH

    A statement is quantified if it has a qualification such as "for all" or "there exists". Consider an example commonly encountered in high school mathematics when studying quadratics: "there exists x such that ax^2 + bx + c = 0 has two different solutions for x". The statement is mathematically precise, but the implications are unclear: what restrictions does this statement of existence force upon us? Quantifier Elimination (QE) replaces such a statement by an equivalent unquantified one, in this case by "either a is not zero and b^2 - 4ac is greater than 0, or all of a=b=c=0". The quantifier "there exists" and the variable x have been eliminated. The key points are: (a) the result may be derived automatically by a computer from the original statement using QE; (b) QE uncovers the special case a=0, which humans often miss! Solutions to QE problems are not numbers but algebraic descriptions which offer insight. In the example above QE did not provide solutions to a particular equation; it told us in general how the number of solutions depends on (a,b,c). QE makes explicit the mathematical structure that was hidden: it is a way to "simplify" or even "solve" mathematical problems. For statements in polynomials over real numbers there will always exist an equivalent formula without the quantification. However, actually obtaining the answer can be very costly in terms of computation, and those costs rise with the size of the problem. We call this the "doubly exponential wall", in reference to how fast these costs rise. Doubly exponential means rising in line with the power of a power: a problem of size n costs roughly 2^(2^n). When applying QE in practice, results may be found easily for small problems, but as sizes increase you inevitably hit the wall where a computation will never finish. The doubly exponential wall cannot be broken completely: this rise in costs is inevitable. However, the aim of this project is to "push back the wall" so that many more practical problems may be tackled by QE. The scaling means that pushing the wall even a small way offers enormous potential: 2^(2^4) is less than 66,000, while 2^(2^5) is over 4 billion! We will achieve this through the development of new algorithms, inspired by an existing process (cylindrical algebraic decomposition) but with substantial innovations. The first innovation is a new computation path inspired by another area of computer science (satisfiability checking) which has pushed back the wall of another famously hard problem (Boolean satisfiability); the team are founding members of a new community for knowledge exchange here. The second innovation is the development of new mathematical formalisms of the underlying algebraic theory so that it can exploit structure in the logic; the team has prior experience of such developments and is joined by a project partner who is the world expert on the topic (McCallum). The third innovation is the relaxation of conditions on the underlying algebraic object that have been in place for 40+ years; the team are the authors of one such relaxation (cylindrical algebraic coverings), together with project partner Abraham. QE has numerous applications, perhaps most crucially in the verification of critical software, and also in artificial intelligence: an AI recently passed the University of Tokyo mathematics entry exam using QE technology.
This project will focus on two emerging application domains: (1) Biology, where QE can be used to determine the medically important values of parameters in a system; (2) Economics, where QE can be used to validate findings, identify flaws and explore possibilities. In both cases, although QE has been shown by the authors to be applicable in theory, current procedures run out of computer time/memory when applied to many problem instances. We are joined by project partners from these disciplines: SYMBIONT from systems biology and the economist Mulligan.
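
The quadratic example above is easy to sanity-check by brute force. The short script below compares the quantified statement (evaluated directly from the theory of quadratics) with the quantifier-free formula quoted in the abstract, over a grid of integer coefficients; it is a verification sketch, not a QE implementation.

    from itertools import product

    def exists_two_roots(a, b, c):
        # Direct truth value of: "there exist x1 != x2 with a*x^2 + b*x + c = 0".
        if a != 0:
            return b * b - 4 * a * c > 0    # quadratic with two distinct real roots
        if b != 0:
            return False                    # linear equation: exactly one solution
        return c == 0                       # 0 = 0 holds for every x

    def qe_result(a, b, c):
        # The quantifier-free equivalent produced by QE.
        return (a != 0 and b * b - 4 * a * c > 0) or (a == 0 and b == 0 and c == 0)

    vals = range(-5, 6)
    assert all(exists_two_roots(a, b, c) == qe_result(a, b, c)
               for a, b, c in product(vals, repeat=3))
    print("QE equivalence confirmed on an 11 x 11 x 11 grid of coefficients")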
