The following results are related to Canada. Are you interested in viewing more results? Visit OpenAIRE - Explore.
23 Projects, page 1 of 3

  • Canada
  • UK Research and Innovation
  • 2019

  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP
    Partners: University of Salford, Autodesk Inc, University of Toronto

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they cannot be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and cannot replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living-cell-based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available on-line for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also develop low-cost open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-world applicability of synthetic biology.
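    As a rough illustration of the kind of computational design and screening tool proposed (a minimal sketch under assumed, illustrative parameters; none of the names or values below come from the project), a toy dose-response screen over candidate sensor designs might look like this:

```kotlin
import kotlin.math.pow

// Toy dose-response model for a cell-free biosensor: reporter output as a Hill
// function of analyte concentration. All names and parameter values here are
// illustrative assumptions, not measurements or code from the project.
fun reporterOutput(analyte: Double, k: Double, n: Double, vMax: Double = 1.0): Double =
    vMax * analyte.pow(n) / (k.pow(n) + analyte.pow(n))

fun main() {
    // "Rapid prototyping" in miniature: score a few hypothetical designs
    // (different binding constants K and Hill coefficients n) against a target
    // analyte level and keep the ones that clear a detection threshold.
    val designs = listOf(0.1 to 1.0, 0.5 to 2.0, 2.0 to 4.0) // (K, n) pairs
    val analyte = 0.8
    val threshold = 0.5
    for ((k, n) in designs) {
        val out = reporterOutput(analyte, k, n)
        println("K=$k, n=$n -> output=%.3f, pass=%b".format(out, out >= threshold))
    }
}
```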

  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP
    Partners: RWTH, UniGe, Coventry University, CNRS, Maplesoft

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform "Quantifier Elimination" (QE). We say a logical statement is "quantified" if it is preceded by a qualification such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise, the implications are unclear - what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an unquantified one which is equivalent. In this case we may replace the statement by: "b^2 - 4ac > 0", which is the condition for x to have two solutions. You may have recognised this equivalence from GCSE mathematics, when studying the quadratic equation. The important point here is that the latter statement can actually be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not in the form of a numerical answer but an algebraic description which offers insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how in general the number of solutions depends on the coefficients a, b, and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; another from economics is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases, QE can theoretically help, but in practice the size of the statements means state-of-the-art procedures run out of computer time/memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical but understudied problem, and is the focus of this project. At the moment QE procedures make such choices either under direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition / experience but with limited scientific basis). The purpose of this project is to replace these with machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with the field of Symbolic Computation which studies QE, as the latter prizes exact correctness and so shuns the use of probabilistic tools, making its application here very novel. We are able to combine these different worlds because the choices which we will use ML to make will all produce a correct and exact answer (but with different computational costs). The project follows pilot studies undertaken by the PI which experimented with one ML technique and found it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use for both low-level algorithm decisions and choices between different theories and implementations.
    Although focused on QE, we will also demonstrate ML as a new route to optimisation in Computer Algebra more broadly; the work encompasses Project Partners and events to maximise this wider impact. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple in collaboration with industrial Project Partner Maplesoft.
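    Written out formally, the quadratic example above becomes the following equivalence (a standard statement included for orientation only; note that the quantifier-free form quietly assumes a genuine quadratic, i.e. a ≠ 0, a side condition the simplified version in the text omits):

```latex
\underbrace{\exists x\,\exists y\;\bigl(x \neq y \;\wedge\; a x^{2}+b x+c=0 \;\wedge\; a y^{2}+b y+c=0\bigr)}_{\text{quantified statement}}
\;\Longleftrightarrow\;
\underbrace{\,b^{2}-4ac>0\,}_{\text{quantifier-free equivalent}}
\qquad (\text{for } a \neq 0).
```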

  • Funder: UKRI Project Code: NE/M011429/1
    Funder Contribution: 549,872 GBP
    Partners: HGF, Namibia Rare Earths Inc, RPC, Maakrish Ltd, Nuna Minerals A/S, SRK Consulting UK Ltd, Oakdene Hollins (United Kingdom), UCT, Umwelt und Ingenieurtechnik GmbH, Mkango Resources Limited...

    Rare earth elements (REE) are the headline of the critical metals security of supply agenda. All the REE were defined as critical by the European Union in 2010, and in subsequent analysis in 2014. Similar projects in the UK and USA have highlighted 'heavy' REE (HREE - europium through to lutetium) as the metals most likely to be at risk of supply disruption and in short supply in the near future. The REE are ubiquitous within modern technologies, including computers and low-energy lighting, energy storage devices, large wind turbines and smart materials, making their supply vital to UK society. The challenge is to develop new, environmentally friendly and economically viable neodymium (Nd) and HREE deposits so that the use of REE in new and green technologies can continue to expand. The principal aims of this project are to understand the mobility and concentration of Nd and HREE in natural systems and to investigate new processes that will lower the environmental impact of REE extraction and recovery. By concentrating on the critical REE, the research will be wide-ranging in the deposits and processing techniques considered. It gives NERC and the UK a world-leading research consortium on critical REE, concentrating on deposit types identified in the catalyst phase as most likely to have low environmental impact, and on research that bridges the two goals of the SoS programme. The project brings together two groups from the preceding catalyst projects (GEM-CRE, MM-FREE) to form a new interdisciplinary team, including the UK's leading experts in REE geology and metallurgy, together with materials science, high/low temperature fluid geochemistry, computational simulation/mineral physics, geomicrobiology and bioprocessing. The team brings substantial background IP and the key skills required. The research responds to the needs of industry partners and involves substantive international collaboration as well as a wider international and UK network across the REE value chain. The work programme has two strands. The first centres on conventional deposits, which comprise all of the REE mines outside China and the majority of active exploration and development projects. The aim is to make a step change in the understanding of the mobility of REE in these natural deposits via mineralogical analysis, experiments and computational simulation. Then, based on this research, the aim is to optimise the most relevant extraction methods. The second strand looks to the future to develop a sustainable new method of REE extraction. The focus will be the ion adsorption deposits, which could be exploited with the lowest environmental impact of any of the main ore types using a well-controlled in-situ leaching operation. Impact will be immediate through our industry partners engaged in REE exploration and development projects, who will gain improved deposit models and better, more efficient, and therefore more environmentally friendly, extraction techniques. There will be wider benefits for researchers in other international teams and companies as we publish our results. Security of REE supply is a major international issue and the challenges tackled in this research will be relevant to practically all REE deposits. Despite the UK not having world-class REE deposits itself, the economy is reliant on REE (e.g. the functional materials and devices industry is worth ~£3 Bn p.a.) and therefore the UK must lead research into the extraction process.
    Manufacturers who use REE will also benefit from the research by receiving up-to-date information on prospects for future Nd and HREE supply. This will help them plan their longer-term product development, as well as their shorter-term purchasing strategy. Likewise, the results will be useful to inform national and European-level policy and to interest, entertain and educate the wider community about the natural character and importance of the REE.

  • Project, 2018 - 2019
    Funder: UKRI Project Code: EP/R009600/1
    Funder Contribution: 101,024 GBP
    Partners: University of Toronto, Lancaster University, Vienna University of Technology, Monash University, Zuhlke Engineering Ltd, Cleanweb UK, IRISA Rennes

    BACKGROUND Values are deeply held principles guiding decision-making processes of individuals, groups and organizations. Software is inevitably affected by values: the organizational values of the project sponsor, research partners, and developers. Some values (e.g. financial value) are easier to quantify than others (e.g. trust, responsibility), with the latter often dismissed in software production processes as lacking measurable evidence. This is problematic because all values, including the less easily measured ones, influence how people use, access and engage with software systems, with far-reaching impact not only on the commercial success of software products, but more widely on society. VISION Values-First SE is a systematic and disciplined approach to the elicitation, articulation, and deliberation of human values in software production. Given the pervasiveness of software and its impact on society, we - developers, researchers, clients, and end-users - must strengthen our capacity and confidence to externalise the value-sets built into software and use them to track how software behaves. Recent examples such as the Google boycott stem from the (often unintentional) breach of implicitly held values-systems: simply put, companies do not want to be associated with extremist values perceived as opposite to those held by society. However, the interplay between values held by the software industry (e.g. prestige, social responsibility), clients (e.g. financial, care for their customers and employees), and end-users (e.g. trust, social justice) is complex, difficult to articulate and rarely fully captured by current SE decision-making processes. CONTEXT Awareness of the impact of software on politics, society, and the environment is not new: from cyber-security to environmental informatics, to digital health, a large body of ethical computing work exists looking at mechanisms that can guide developers' and managers' responsibilities (e.g. codes of ethics). What is new is the unprecedented scale, reach and complexity of such impact and the urgency for developers to "be prepared to be responsible". One of the biggest challenges for developers is that the full impact of values choices in the code developed is often unseen and unintentional: when writing software, the platform often obfuscates the process even to the software developer. How can software developers be prepared to be responsible when it is not clear what they should be responsible for? For example, in the Android Software Development Kit (SDK), the geocoding of location is done by sharing the precise location with a third-party organization (including Google). The implications of sharing this data, and how it is stored, treated and reused, are not fully explained to the developer in tutorials, in the SDK documentation or in the IDE at the time of writing the code. For the developer, those simple lines of code could have an unseen impact on the encoded values of the software produced. However, current SE methods do not seem to provide the means to follow values through a software development process and, in particular, there are currently no methods that allow values to be used as a reference framework for decision making at key stages of software development. Articulating, negotiating, and capturing human values across SE decision-making processes is precisely the main aim of Values-First SE.
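    To make the Android example concrete, here is a minimal Kotlin sketch of the SDK call in question (the surrounding function and variable names are illustrative assumptions; only the Geocoder API itself comes from the Android SDK):

```kotlin
import android.content.Context
import android.location.Geocoder
import java.util.Locale

// Sketch of the Geocoder usage discussed above. Geocoder resolves coordinates to
// human-readable addresses by querying a backend geocoding service, so the user's
// precise location leaves the device when getFromLocation() runs - and nothing at
// the call site signals that data sharing to the developer.
fun describeLocation(context: Context, latitude: Double, longitude: Double): String? {
    val geocoder = Geocoder(context, Locale.getDefault())
    // Network call to a third-party geocoding backend; may return null or an empty list.
    val addresses = geocoder.getFromLocation(latitude, longitude, 1)
    return addresses?.firstOrNull()?.getAddressLine(0)
}
```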

  • Funder: UKRI Project Code: ES/N000501/2
    Funder Contribution: 98,615 GBP
    Partners: University of Cambridge, NTU, York University Canada, Newcastle University

    Autism Spectrum Conditions (ASC) are a diverse group of developmental brain conditions that cause difficulties in communication and social interaction, unusually narrow interests, and difficulties adapting to change. One in 100 people (700,000 in the UK) have an ASC, most of whom are adults. A majority of the total economic cost of ASC to the UK is spent on supporting adults (£25 billion out of a total of £28 billion), with 36% of this cost attributable to lost employment opportunities (Knapp et al. 2009). The individual and social costs of ASC in adulthood are also high, with research showing poor outcomes in terms of educational attainment, unemployment (Howlin, 2000), and high rates of depression (32%), suicidal thoughts (66%) and suicidal behaviours (35%) (Cassidy et al. 2014). The latest reports from the ESRC Centre for Economic Performance, and the Chief Medical Officer, describe the high individual, social and economic costs of leaving mental health problems such as depression untreated. However, there are no valid measures of depression or suicide risk for adults with ASC, despite evidence that these are common problems (Cassidy et al. 2014; Segers and Rawana, 2014). Measures for typically developing adults are not appropriate for adults with ASC, who tend to interpret questions literally (Happe et al. 1995), and have difficulty verbalising their emotional experiences (Bird et al. 2010). Depression and suicidality also manifest differently in ASC; inflexible thinking and impulsivity may increase risk (Cassidy et al. 2014). In addition to the lack of appropriate measures, research progress is also hampered by the lack of a data set that includes enough adults with ASC to effectively evaluate their rates of depression and suicidality on a national scale; the UK adult psychiatric morbidity survey (2007) included only 19 adults with ASC. The lack of research and appropriate measures has had a profoundly negative impact on adults with ASC: 1) it is not possible to conduct detailed research into the nature, risk or protective factors for depression or suicidality in adults with ASC; 2) it is not possible to effectively assess their depression or suicide risk in clinical practice; 3) without the knowledge base or assessment tools, new theories and effective evidence-based treatments cannot be developed or evaluated; 4) we cannot effectively evaluate the prevalence of depression or suicidality on a national scale, in order to inform effective government policy. Hence, adults with ASC are not currently able to access evidence-based assessment or therapies for depression or suicidality, despite being at potentially high risk. This research project will address these fundamental issues by developing the first empirically validated measures of depression and suicidality for adults with ASC, for use in a national survey. This will form the first nationally representative dataset containing rates of depression and suicidality in adults with ASC in the UK, made available for secondary analysis. These objectives will be achieved by creating synergy between psychiatrists and clinicians involved in ageing, autism, suicide, mental health and risk assessment research, across internationally recognized institutions (Universities of Coventry, Newcastle, and Cambridge).
This research will build on my previously published research, which has utilized big data to explore the health and behaviour of adults with ASC, including the first large-scale clinic study of depression and suicidality in adults with late diagnosis of Asperger Syndrome (a high functioning subgroup on the autism spectrum) (Cassidy et al., 2014). This project will enable me to foster a new inter-disciplinary mixed-methods approach to the study of mental health in ASC, which I will continue to lead beyond the funding period.

  • Funder: UKRI Project Code: NE/M017540/2
    Funder Contribution: 284,801 GBP
    Partners: SDSU, NCU, Fugro (United Kingdom), BU, MBARI, Osaka Institute of Technology, Shell International Exploration & Produc, Deltares-Delft, CSIC, Victoria University of Wellington...

    Turbidity currents are volumetrically the most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux from all of the world's rivers, and they form the largest sediment accumulations on Earth (submarine fans). These flows break strategically important seafloor cable networks that carry > 95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have from submarine flows in action, which is a stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly, such that source-to-sink data is needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active - occurring on an annual or shorter timescale, where previous work provides a basis for future projects, and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because there has been no comparable network established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is also to engage modellers in the design and analysis of monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of systems, and the more infrequent types of flow. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. This proposal is timely because of recent efforts to develop novel technology for monitoring flows that hold great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and destroy sensors placed on traditional moorings on the seafloor. This includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Key preliminary data are lacking in some test sites, such as detailed bathymetric base-maps or seismic datasets.
Our final objective is to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC Grant to monitor flows in Monterey Canyon in 2014-2017, and a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and NERC Research Programme Consortium award. He is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.

  • Funder: UKRI Project Code: EP/K034383/1
    Funder Contribution: 2,246,110 GBP
    Partners: University of Rome, AIM, Abdus Salam ICTP, University of Waterloo (Canada), University of Warwick

    L-functions and modular forms are fundamental mathematical objects that encode much of our knowledge of contemporary number theory. They form part of a web of interconnected objects, the understanding of which in the most basic cases lies at the foundations of much of modern mathematics. A spectacular example is Wiles' proof of Fermat's Last Theorem, which was an application of a fundamental "modularity" link between L-functions, modular forms and elliptic curves. This project will greatly extend and generalize such connections, both theoretically and computationally. The research vision inspiring our programme can be summarised as: "Breaking the boundaries of classical L-functions and modular forms, and exploring their applications to 21st-century mathematics, physics, and computer science". Our guiding goal is to push forward both theoretical and algorithmic developments, in order to develop L-functions and modular forms far beyond current capabilities. This programme will systematically develop an extensive catalogue of number theoretic objects, and will make this information available through an integrated online resource that will become an indispensable tool for the world's research community. L-functions are to pure mathematics what fundamental particles are to physics: their interactions reveal fundamental truths. To continue the analogy, computers are to number theorists what colliders are to particle physicists. Aside from their established role as empirical "testers" for conjectures and theories, experiments can often throw up quite unexpected phenomena which go on to reshape modern theory. Our programme will establish a major database and encyclopedia of knowledge about L-functions and related objects, which will play a role analogous to that of the LHC for the scientists at CERN. Both are at the threshold of tantalising glimpses into completely uncharted territories: higher degree L-functions for us and the Higgs boson for them. Theoretical and computational work on higher degree L-functions has only started to make substantial progress in the past few years. There do not currently exist efficient methods to work with these, and rigorous computations with them are not yet possible. Neither is there yet an explicit description of all the ways in which degree 3 L-functions can arise. We will address these facets in our research programme: both algorithmic development and theoretical classification. As well as having theoretical applications to modularity relationships as in Wiles' proof, detailed knowledge of L-functions has more far-reaching implications. Collections of L-functions have statistical properties which first arose in theoretical physics. This surprising connection, which has witnessed substantial developments led by researchers in Bristol, has fundamental predictive power in number theory; the synergy will be vastly extended in this programme. In another strand, number theory plays an increasingly vital role in computing and communications, as evidenced by its striking applications to both cryptography and coding theory. The Riemann Hypothesis (one of the Clay Mathematics Million Dollar Millennium Problems) concerns the distribution of prime numbers, and the correctness of the best algorithms for testing large prime numbers depends on the truth of a generalised version of this 150-year-old unsolved problem. These are algorithms which are used by public-key cryptosystems that everyone who uses the Internet relies on daily, and that underpin our digital economy.
Our programme involves the creation of a huge amount of data about a wide range of modular forms and L-functions, which will far surpass in range and depth anything computed before in this area. This in turn will be used to analyse some of the most famous outstanding problems in mathematics, including the Riemann Hypothesis and another Clay problem, the Birch and Swinnerton-Dyer conjecture.
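    For orientation only (a standard textbook statement, not a result of this programme): the simplest L-function is the Riemann zeta function,

```latex
\zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}} \;=\; \prod_{p\ \mathrm{prime}} \bigl(1 - p^{-s}\bigr)^{-1}, \qquad \operatorname{Re}(s) > 1,
```

    and the Riemann Hypothesis asserts that every non-trivial zero of its analytic continuation lies on the line Re(s) = 1/2. Higher-degree L-functions generalise this Dirichlet-series-plus-Euler-product structure.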

  • Funder: UKRI Project Code: EP/N018745/1
    Funder Contribution: 320,381 GBP
    Partners: UBC, State University of New York at Potsdam, Perimeter Institute, Universitat Autònoma de Barcelona (UAB), University of Waterloo (Canada), University of Oxford

    Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation, is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation in the 1980s that entanglement represented not merely a curiosity of quantum theory but a critical resource which could be exploited to achieve heretofore impossible communication tasks. Bell identified quantum nonlocality as the essentially quantum aspect of entanglement in the 1960s. While it is widely understood that quantum computation offers substantial efficiency advantages over classical computation for particular problems, it is neither understood what the precise class of such problems is nor what the particular aspect or aspects of quantum theory enabling these advantages are. The applications for QC which have been identified are likely only a fraction of the full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark. Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features - such as entanglement, superposition, and discord - have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A, 88) and Howard et al. (Nature, 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, a sought-after key to understanding the unique capabilities of QC. Our vision is to deepen the theory of contextuality with the goals of achieving an understanding of the precise role it plays in QC and how it is a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.
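    As background (a standard textbook statement, not a result of the project): Bell nonlocality is usually made quantitative through the CHSH inequality, where E(a, b) denotes the correlation of measurement outcomes for settings a and b,

```latex
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2 \ \text{(local hidden variables)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}.
```

    Contextuality, the focus of this project, generalises this idea beyond pairs of spatially separated measurements.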

  • Funder: UKRI Project Code: EP/M013294/1
    Funder Contribution: 35,513,900 GBP
    Partners: University of Birmingham, Cardno AUS, RU, Knowledge Transfer Partnership, Elekta Oy, JK Guest Group, Micro-g LaCoste, GeoDynamics Worldwide Srl, BP British Petroleum, South East Physics Network...

    The Hub will create a seamless link between science and applications by building on our established knowledge exchange activities in quantum technologies. We will transform science into technology by developing new products, demonstrating their applications and advantages, and establishing a strong user base in diverse sectors. Our overarching ambition is to deliver a wide range of quantum sensors to underpin many new commercial applications. Our key objective is to ensure that the Hub's outputs will have been picked up by companies, or industry-led TSB projects, by the end of the funding period. The Hub will comprise: a strong fabrication component; quantum scientists with a demonstrated ability to combine scientific excellence with technological delivery; and leading engineers with the broad collective expertise and connections required to develop and use new quantum sensors. We have identified, and actively involved, industry enablers to build a supply chain for quantum sensor technology. As well as direct physics connections to industry, the engineers provide strong links to relevant industrial users, thus providing information on industrial needs and enabling rapid prototype deployment in the field. To establish a coherent national collaborative effort, the Hub will include a UK network on quantum sensors and metrology, which will also exploit the connections that Prof Bongs and all Hub members have forged in Europe, the US and Asia. This inter-linkage ensures capture of the most advanced developments in quantum technology around the world for exploitation by the UK. Quantum sensors and metrology, plus some devices in quantum communication, are the only areas where laboratory prototypes have already proven superior to their best classical counterparts. This sets the stage, credibly, for rapid and disruptive applications emerging from the Hub. The selection of prototypes will be driven by commercial pull, i.e. each prototype project within the Hub must demonstrate, from the outset, industry or practitioner engagement from our engineering and/or industrial collaborators. We have strong industry support across several disciplines, with the structures in place actively to manage technology and knowledge transfer to the industry sector. Particular roles are played by NPL and e2v. We will closely collaborate with NPL as metrology end-user on clock, magnetometer and potentially watt balance developments, with a lecturer-level Birmingham-NPL fellow contributed by Birmingham University and our PDRAs spending ~17 man-years, in addition to 3-5 PhD students, on these joint projects in the Advanced Metrology Laboratory/incubator space. e2v have a unique industrial manufacturing/R&D facility co-located within the School of Physics and Astronomy at Nottingham that has already catalysed the expansion of their activities into the Quantum Technology domain. Public Engagement conveying the Hub's breakthroughs will be a high priority - for example annually at the Royal Society Summer Exhibitions. In addition to cohort-training of 80 PhD students working within the Hub, the Hub will contribute to the training of ~500 PhD students via electronically-shared lectures (many already running within the e-learning graduate schools MPAGS, MEGS, SEPNET and SUPA) across the institutions within the Hub. The Hub will create an internationally-leading centre of excellence with major impact in the area of quantum sensors and metrology.
To widen the impact of the Hub and ensure long-term sustainability, we will actively pursue European and other international collaborative funding for both underlying fundamental research and the technology development.

  • Funder: UKRI Project Code: NE/L014076/1
    Funder Contribution: 638,057 GBP
    Partners: Cristal Pigment UK Ltd, University of Belgrade, University of Birmingham, University of Granada, UCT, C-Tech Innovation (United Kingdom), UWC, University of Surrey, Plymouth University, FCO...

    30 years' research on metal biorecovery from wastes has paid scant attention to strong CONTEMPORARY demands for (i) conservation of dwindling vital resources (e.g. platinum group metals (PGM), more recently rare earth elements (REE), base metals (BMs) and uranium) and (ii) the unequivocal need to extract/refine them in a non-polluting, low-energy way. 21st-century technologies increasingly rely on nanomaterials which have novel properties not seen in bulk materials. Bacteria can fabricate nanoparticles (NPs), bottom up, atom by atom, with exquisite fine control offered by enzymatic synthesis and bio-scaffolding that chemistry cannot emulate. Bio-nanoparticles have proven applications in green chemistry, low carbon energy, environmental protection and potentially in photonic applications. Bacteria can be grown cheaply at scale for facile production. We have shown that bacteria can make nanomaterials from secondary wastes, yielding, in some cases, a metallic mixture which can show better activity than 'pure' nanoparticles. Such fabrication of structured bimetallics can be hard to achieve chemically. For some metals like rare earths and uranium (which often co-occur in wastes), their biorecovery from scraps, e.g. magnets (rare earths), and wastes (mixed U/rare earths), when separated, can make 'enriched' solids for delivery into further commercial refining to make new magnets (rare earths) or nuclear fuel (U). Biofabricating these solids is often beyond the ability of living cells, but they can form scaffolds, with enzymatic processes harnessed to make biomineral precursors, often selectively. B3 will invoke tiered levels of complexity, maturity and risk. (i) Base metal mining wastes (e.g. Cu, Ni) will be biorefined into concentrated sludges for chemical reprocessing or alternatively to make base metal-bionanoproducts. (ii) Precious metal wastes will be converted into bionanomaterials for catalysis, environmental and energy applications. (iii) Rare earth metal wastes will be biomineralised for enriched feed into further refining or into new catalysts. (iv) Uranium waste will be biorefined into mineral precursors for commercial nuclear fuels. In all cases, the environment will be spared the dual impacts of both primary source pollution AND the high energy demand of processing from primary 'crude'. Metallic scraps are tougher, requiring acids for dissolution. Approaches will include the use of acidophilic bacteria, the use of alkalinizing enzymes, or using bacteria to first make a chemical catalyst (benignly) which can then convert the target metal of interest from the leachate into new nanomaterials (a hybrid living/nonliving system, already shown). Environmentally friendly leaching & acid recycling will be evaluated and leaching processes optimised via extant predictive models. The interface between biology, chemistry, mineralogy and physics, exemplified by nanoparticles held in their unique 'biochemical nest', will receive special focus, being where major discoveries will be made; cutting-edge technologies will relate structure to function, and validate the contribution of upstream waste doping or 'blending'; these, as well as novel materials processing, will increase bio-nanoparticle efficacy. Secondary wastes to be biorefined will include magnet scraps (rare earths), print cartridges (precious metals), road dusts (PMs, Fe, Ce) & metallurgical wastes (mixed rare earths/base metals/uranium).
Their complex, often refractory nature gives a higher 'risk' than mine wastes but in compensation, the volumes are lower, & the scope for 'doping' or 'steering' to fabricate/steer engineered nanomaterials is correspondingly higher. B3 will have an embedded significant (~15%) Life Cycle Analysis iterative assessment of highlighted systems, with end-user trialling (supply chains; validations in conjunction with an industrial platform). B3 welcomes new 'joiners' from a raft of problem holders brought via Partner network backup.

search
The following results are related to Canada. Are you interested to view more results? Visit OpenAIRE - Explore.
23 Projects, page 1 of 3
  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP
    Partners: University of Salford, Autodesk Inc, University of Toronto

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they can not be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and can not replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living cell based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available on-line for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also develop low cost open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud-computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-word applicability of synthetic biology.

  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP
    Partners: RWTH, UniGe, Coventry University, CNRS, Maplesoft

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform ``Quantifier Elimination'' (QE). We say a logical statement is ``quantified'' if it is preceded by a qualification such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise the implications are unclear - what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an unquantified one which is equivalent. In this case we may replace the statement by: "b^2 - 4ac > 0", which is the condition for x to have two solutions. You may have recognised this equivalence from GCSE mathematics, when studying the quadratic equation. The important point here is that the latter statement can actually be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not in the form of a numerical answer but an algebraic description which offers insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how in general the number of solutions depends on the coefficients a, b, and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; while another from economics is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases, QE can theoretically help, but in practice the size of the statements means state-of-the-art procedures run out of computer time/memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical, but understudied problem and is the focus of this project. At the moment QE procedures make such choices either under direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition / experience but with limited scientific basis). The purpose of this project is to replace these by machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with the field of Symbolic Computation which studies QE, as the latter prizes exact correctness and so shuns the use of probabilistic tools making its application here very novel. We are able to combine these different worlds because the choices which we will use ML to make will all produce a correct and exact answer (but with different computational costs). The project follows pilot studies undertaken by the PI which experimented with one ML technique and found it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use for both low level algorithm decisions and choices between different theories and implementations. 
Although focused on QE, we will also demonstrate ML as being a new route to optimisation in Computer Algebra more broadly and work encompasses Project Partners and events to maximise this. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple in collaboration with industrial Project Partner Maplesoft.

  • Funder: UKRI Project Code: NE/M011429/1
    Funder Contribution: 549,872 GBP
    Partners: HGF, Namibia Rare Earths Inc, RPC, Maakrish Ltd, Nuna Minerals A/S, SRK Consulting UK Ltd, Oakdene Hollins (United Kingdom), UCT, Umwelt und Ingenieurtechnik GmbH, Mkango Resources Limited...

    Rare earth elements (REE) are the headline of the critical metals security of supply agenda. All the REE were defined as critical by the European Union in 2010, and in subsequent analysis in 2014. Similar projects in the UK and USA have highlighted 'heavy' REE (HREE - europium through to lutetium) as the metals most likely to be at risk of supply disruption and in short supply in the near future. The REE are ubiquitous within modern technologies, including computers and low energy lighting, energy storage devices, large wind turbines and smart materials, making their supply vital to UK society. The challenge is to develop new environmentally friendly and economically viable, neodymium (Nd) and HREE deposits so that use of REE in new and green technologies can continue to expand. The principal aims of this project are to understand the mobility and concentration of Nd and HREE in natural systems and to investigate new processes that will lower the environmental impact of REE extraction and recovery. By concentrating on the critical REE, the research will be wide ranging in the deposits and processing techniques considered. It gives NERC and the UK a world-leading research consortium on critical REE, concentrating on deposit types identified in the catalyst phase as most likely to have low environmental impact, and on research that bridges the two goals of the SoS programme. The project brings together two groups from the preceding catalyst projects (GEM-CRE, MM-FREE) to form a new interdisciplinary team, including the UK's leading experts in REE geology and metallurgy, together with materials science, high/low temperature fluid geochemistry, computational simulation/mineral physics, geomicrobiology and bioprocessing. The team brings substantial background IP and the key skills required. The research responds to the needs of industry partners and involves substantive international collaboration as well as a wider international and UK network across the REE value chain. The work programme has two strands. The first centres on conventional deposits, which comprise all of the REE mines outside China and the majority of active exploration and development projects. The aim is to make a step change in the understanding of the mobility of REE in these natural deposits via mineralogical analysis, experiments and computational simulation. Then, based on this research, the aim is to optimise the most relevant extraction methods. The second strand looks to the future to develop a sustainable new method of REE extraction. The focus will be the ion adsorption deposits, which could be exploited with the lowest environmental impact of any of the main ore types using a well-controlled in-situ leaching operation. Impact will be immediate through our industry partners engaged in REE exploration and development projects, who will gain improved deposit models and better and more efficient, and therefore more environmentally friendly, extraction techniques. There will be wider benefits for researchers in other international teams and companies as we publish our results. Security of REE supply is a major international issue and the challenges tackled in this research will be relevant to practically all REE deposits. Despite the UK not having world class REE deposits itself, the economy is reliant on REE (e.g. the functional materials and devices industry is worth ~£3 Bn p.a.) and therefore the UK must lead research into the extraction process. 
Manufacturers who use REE will also benefit from the research by receiving up to date information on prospects for future Nd and HREE supply. This will help plan their longer term product development, as well as shorter term purchasing strategy. Likewise, the results will be useful to inform national and European level policy and to interest, entertain and educate the wider community about the natural characters and importance of the REE.

  • Project . 2018 - 2019
    Funder: UKRI Project Code: EP/R009600/1
    Funder Contribution: 101,024 GBP
    Partners: University of Toronto, Lancaster University, Vienna University of Technology, Monash University, Zuhlke Engineering Ltd, Cleanweb UK, IRISA Rennes

    BACKGROUND Values are deeply held principles guiding decision-making processes of individuals, groups and organizations. Software is inevitably affected by values: the organizational values of the project sponsor, research partners, and developers. Some values (e.g. financial value) are easier to quantify than others (e.g. trust, responsibility) with the latter often dismissed in software production processes as lacking of measurable evidence. This is problematic because all values, including less-easy to measure ones, influence how people use, access and engage with software systems with far-reaching impact not only on the commercial success of software products, but more widely on society. VISION Values-First SE is a systematic and disciplined approach to the elicitation, articulation, and deliberation of human values in software production. Given the pervasiveness of software and its impact on society, we - developers, researchers, clients, and end-user - must strengthen our capacity and confidence to externalise the values-sets built into software and use them to track how software behaves. Recent examples such as the Google boycott stem from the (often unintentional) breach of implicitly held values-systems: simply put, companies do not want to be associated with extremist values perceived as opposite to those held by society. However, the interplay between values held by software industry (e.g. prestige, social responsibility), clients (e.g. financial, care for their customers and employees), and end-users (e.g. trust, social justice) is complex, difficult to articulate and rarely fully captured by current SE decision-making processes. CONTEXT Awareness of the impact of software on politics, society, and the environment is not new: from cyber-security to environmental informatics, to digital-health, a large body of ethical computing exists looking at mechanisms that can guide developers and managers' responsibilities (e.g. code of Ethics). What is new is the unprecedented scale, reach and complexity of such impact and the urgency for developers to "be prepared to be responsible". One of the biggest challenges for developers is that the full impact of values choices in the code developed is often unseen and unintentional: when writing software, often the platform obfuscates the process even to the software developer. How can software developers be prepared to be responsible when it is not clear what they should be responsible for? For example, in the Android Software Development Kit (SDK), the geocoding of location is done by sharing of precise location with a third party organization (including Google). The implications of sharing this data, how it is stored, treated and reused is not fully explained to the developer in tutorials, in the SDK documentation or in the IDE at the time of writing the code. Those simple lines of code for the developer could have an unseen impact on the encoded values of the software produced. However, current SE methods do not seem to provide the means to follow values through a software development process and, in particular, there are currently no methods that allow values to be used as a reference framework for decision making at key stages of software development. Articulating, negotiating, and capturing human values across SE decision-making processes is precisely Values-first SE main aim.

  • Funder: UKRI Project Code: ES/N000501/2
    Funder Contribution: 98,615 GBP
    Partners: University of Cambridge, NTU, York University Canada, Newcastle University

    Autism Spectrum Conditions (ASC) are a diverse group of developmental brain conditions that cause difficulties in communication, social interaction, unusually narrow interests and difficulties adapting to change. One in 100 people (700,000 in the UK) have an ASC, most of whom are adults. A majority of the total economic cost of ASC to the UK is spent on supporting adults (£25 billion out of a total of £28 billion), with 36% of this cost attributable to lost employment opportunities (Knapp et al. 2009). The individual and social costs of ASC in adulthood are also high, with research showing poor outcomes in terms of educational attainment, unemployment (Howlin, 2000), and high rates of depression (32%), suicidal thoughts (66%) and suicidal behaviours (35%) (Cassidy et al. 2014). The latest reports from the ESRC Centre for Economic Performance, and the Chief Medical Officer, describe the high individual, social and economic costs of leaving mental health problems such as depression untreated. However, there are no valid measures of depression or suicide risk for adults with ASC, despite evidence that these are common problems (Cassidy et al. 2014; Segers and Rawana, 2014). Measures for typically developing adults are not appropriate for adults with ASC, who tend to interpret questions literally (Happe et al. 1995), and have difficulty verbalising their emotional experiences (Bird et al. 2010). Depression and suicidality also manifest differently in ASC; inflexible thinking and impulsivity may increase risk (Cassidy et al. 2014). In addition to lack of appropriate measures, research progress is also hampered by the lack of a data set that includes enough adults with ASC to effectively evaluate their rates of depression and suicidality on a national scale; the UK adult psychiatric morbidity survey (2007) only included 19 adults with ASC. The lack of research and appropriate measures have had a profoundly negative impact on adults with ASC; 1) it is not possible to conduct detailed research into the nature, risk or protective factors for depression or suicidality in adults with ASC; 2) it is not possible to effectively assess their depression or suicide risk in clinical practice; 3) without the knowledge base or assessment tools, new theories and effective evidence based treatments cannot be developed or evaluated; 4) we cannot effectively evaluate the prevalence of depression or suicidality on a national scale, in order to inform effective government policy. Hence, adults with ASC are not currently able to access evidence based assessment or therapies for depression or suicidality, despite being at potentially high risk. This research project will address these fundamental issues by developing the first empirically validated measures of depression and suicidality for adults with ASC, for use in a national survey. This will form the first nationally representative dataset containing rates of depression and suicidality in adults with ASC in the UK, made available for secondary analysis. These objectives will be achieved by creating synergy between psychiatrists and clinicians involved in ageing, autism, suicide, mental health and risk assessment research, across internationally recognized institutions (Universities of Coventry, Newcastle, and Cambridge). 
This research will build on my previously published work, which has utilized big data to explore the health and behaviour of adults with ASC, including the first large-scale clinic study of depression and suicidality in adults with a late diagnosis of Asperger Syndrome (a high-functioning subgroup on the autism spectrum) (Cassidy et al., 2014). This project will enable me to foster a new inter-disciplinary, mixed-methods approach to the study of mental health in ASC, which I will continue to lead beyond the funding period.

  • Funder: UKRI Project Code: NE/M017540/2
    Funder Contribution: 284,801 GBP
    Partners: SDSU, NCU, Fugro (United Kingdom), BU, MBARI, Osaka Institute of Technology, Shell International Exploration & Produc, Deltares-Delft, CSIC, Victoria University of Wellington...

    Turbidity currents are volumetrically the most important process for sediment transport on our planet. A single submarine flow can transport ten times the annual sediment flux from all of the world's rivers, and these flows form the largest sediment accumulations on Earth (submarine fans). They break strategically important seafloor cable networks that carry > 95% of global data traffic, including the internet and financial markets, and threaten expensive seabed infrastructure used to recover oil and gas. Ancient flows form many deepwater subsurface oil and gas reservoirs in locations worldwide. It is sobering to note quite how few direct measurements we have of submarine flows in action, in stark contrast to other major sediment transport processes such as rivers. Sediment concentration is the most fundamental parameter for documenting what turbidity currents are, and it has never been measured for flows that reach submarine fans. How then do we know what type of flow to model in flume tanks, or which assumptions to use to formulate numerical or analytical models? There is a compelling need to monitor flows directly if we are to make step changes in understanding. The flows evolve significantly, such that source-to-sink data are needed, and we need to monitor flows in different settings because their character can vary significantly. This project will coordinate and pump-prime international efforts to monitor turbidity currents in action. Work will be focussed around key 'test sites' that capture the main types of flows and triggers. The objective is to build up complete source-to-sink information at key sites, rather than producing more incomplete datasets in disparate locations. Test sites are chosen where flows are known to be active, occurring on annual or shorter time scales, where previous work provides a basis for future projects, and where there is access to suitable infrastructure (e.g. vessels). The initial test sites include turbidity current systems fed by rivers, where the river enters marine or fresh water, and where plunging ('hyperpycnal') river floods are common or absent. They also include locations that produce powerful flows that reach the deep ocean and build submarine fans. The project is novel because no comparable network has been established for monitoring turbidity currents. Numerical and laboratory modelling will also be needed to understand the significance of the field observations, and our aim is also to engage modellers in the design and analysis of monitoring datasets. This work will also help to test the validity of various types of model. We will collect sediment cores and seismic data to study the longer-term evolution of systems and the more infrequent types of flow. Understanding how deposits are linked to flows is important for outcrop and subsurface oil and gas reservoir geologists. This proposal is timely because of recent efforts to develop novel technology for monitoring flows, which holds great promise. This suite of new technology is needed because turbidity currents can be extremely powerful (up to 20 m/s) and can destroy sensors placed on traditional moorings on the seafloor. It includes new sensors, new ways of placing those sensors above active flows or in near-bed layers, and new ways of recovering data via autonomous gliders. Key preliminary data are lacking at some test sites, such as detailed bathymetric base-maps or seismic datasets.
Our final objective is to fill in key gaps in 'site-survey' data to allow larger-scale monitoring projects to be submitted in the future. This project will add considerable value to an existing NERC Grant to monitor flows in Monterey Canyon in 2014-2017, and to a NERC Industry Fellowship hosted by submarine cable operators. Talling is PI for two NERC Standard Grants, a NERC Industry Fellowship and a NERC Research Programme Consortium award. He is also part of a NERC Centre, and thus fulfils all four criteria for the scheme.

  • Funder: UKRI Project Code: EP/K034383/1
    Funder Contribution: 2,246,110 GBP
    Partners: University of Rome, AIM, Abdus Salam ICTP, University of Waterloo (Canada), University of Warwick

    L-functions and modular forms are fundamental mathematical objects that encode much of our knowledge of contemporary number theory. They form part of a web of interconnected objects, the understanding of which in the most basic cases lies at the foundations of much of modern mathematics. A spectacular example is Wiles' proof of Fermat's Last Theorem, which was an application of a fundamental "modularity" link between L-functions, modular forms and elliptic curves. This project will greatly extend and generalize such connections, both theoretically and computationally. The research vision inspiring our programme can be summarised as: "Breaking the boundaries of classical L-functions and modular forms, and exploring their applications to 21st-century mathematics, physics, and computer science". Our guiding goal is to push forward both theoretical and algorithmic developments, in order to develop L-functions and modular forms far beyond current capabilities. This programme will systematically develop an extensive catalogue of number-theoretic objects, and will make this information available through an integrated online resource that will become an indispensable tool for the world's research community. L-functions are to pure mathematics what fundamental particles are to physics: their interactions reveal fundamental truths. To continue the analogy, computers are to number theorists what colliders are to particle physicists. Aside from their established role as empirical "testers" for conjectures and theories, experiments can often throw up quite unexpected phenomena which go on to reshape modern theory. Our programme will establish a major database and encyclopedia of knowledge about L-functions and related objects, which will play a role analogous to that of the LHC for the scientists at CERN. Both are at the threshold of tantalising glimpses into completely uncharted territories: higher-degree L-functions for us and the Higgs boson for them. Theoretical and computational work on higher-degree L-functions has only started to make substantial progress in the past few years. There do not currently exist efficient methods to work with these, and rigorous computations with them are not yet possible. Neither is there yet an explicit description of all the ways in which degree 3 L-functions can arise. We will address both facets in our research programme: algorithmic development and theoretical classification. As well as having theoretical applications to modularity relationships, as in Wiles' proof, detailed knowledge of L-functions has more far-reaching implications. Collections of L-functions have statistical properties which first arose in theoretical physics. This surprising connection, which has witnessed substantial developments led by researchers in Bristol, has fundamental predictive power in number theory; this synergy will be vastly extended in this programme. In another strand, number theory plays an increasingly vital role in computing and communications, as evidenced by its striking applications to both cryptography and coding theory. The Riemann Hypothesis (one of the Clay Mathematics Million Dollar Millennium Problems) concerns the distribution of prime numbers, and the correctness of the best algorithms for testing large prime numbers depends on the truth of a generalised version of this 150-year-old unsolved problem. These algorithms are used by the public-key cryptosystems that everyone who uses the Internet relies on daily, and that underpin our digital economy; the GRH-dependent criterion behind such primality tests is sketched after this summary.
Our programme involves the creation of a huge amount of data about a wide range of modular forms and L-functions, which will far surpass in range and depth anything computed before in this area. This in turn will be used to analyse some of the most famous outstanding problems in mathematics, including the Riemann Hypothesis and another Clay problem, the Birch and Swinnerton-Dyer conjecture.
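As an aside added here for clarity (an illustrative statement, not part of the proposal), the GRH-dependent criterion behind deterministic primality testing mentioned above can be stated as follows:

```latex
% Illustrative statement only. The Dirichlet L-function attached to a character \chi:
\[
  L(s,\chi) \;=\; \sum_{n=1}^{\infty} \frac{\chi(n)}{n^{s}}, \qquad \operatorname{Re}(s) > 1 .
\]
% Assuming the Generalised Riemann Hypothesis for all such L(s,\chi), an odd integer
% n > 1 is prime if and only if it is a strong probable prime to every base a with
\[
  a \;\le\; 2(\ln n)^{2} \qquad \text{(Bach's bound)} .
\]
```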

  • Funder: UKRI Project Code: EP/N018745/1
    Funder Contribution: 320,381 GBP
    Partners: UBC, State University of New York at Potsdam, Perimeter Institute, Universitat Autònoma de Barcelona (UAB), University of Waterloo (Canada), University of Oxford

    Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation, is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation, in the 1980s, that entanglement represented not merely a curiosity of quantum theory but a critical resource which could be exploited to achieve heretofore impossible communication tasks. Bell identified quantum nonlocality as the essentially quantum aspect of entanglement in the 1960s. While it is widely understood that quantum computation offers substantial efficiency advantages over classical computation for particular problems, it is understood neither what the precise class of such problems is nor which particular aspect or aspects of quantum theory enable these advantages. The applications of quantum computation (QC) which have been identified are likely only a fraction of its full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark. Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features - such as entanglement, superposition, and discord - have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A, 88) and Howard et al. (Nature, 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, a sought-after key to understanding the unique capabilities of QC. Our vision is to deepen the theory of contextuality with the goals of understanding the precise role it plays in QC and how it acts as a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts, including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.

  • Funder: UKRI Project Code: EP/M013294/1
    Funder Contribution: 35,513,900 GBP
    Partners: University of Birmingham, Cardno AUS, RU, Knowledge Transfer Partnership, Elekta Oy, JK Guest Group, Micro-g LaCoste, GeoDynamics Worldwide Srl, BP British Petroleum, South East Physics Network...

    The Hub will create a seamless link between science and applications by building on our established knowledge exchange activities in quantum technologies. We will transform science into technology by developing new products, demonstrating their applications and advantages, and establishing a strong user base in diverse sectors. Our overarching ambition is to deliver a wide range of quantum sensors to underpin many new commercial applications. Our key objective is to ensure that the Hub's outputs will have been picked up by companies or industry-led TSB projects by the end of the funding period. The Hub will comprise: a strong fabrication component; quantum scientists with a demonstrated ability to combine scientific excellence with technological delivery; and leading engineers with the broad collective expertise and connections required to develop and use new quantum sensors. We have identified, and actively involved, industry enablers to build a supply chain for quantum sensor technology. As well as direct physics connections to industry, the engineers provide strong links to relevant industrial users, thus providing information on industrial needs and enabling rapid prototype deployment in the field. To establish a coherent national collaborative effort, the Hub will include a UK network on quantum sensors and metrology, which will also exploit the connections that Prof Bongs and all Hub members have forged in Europe, the US and Asia. This inter-linkage ensures capture of the most advanced developments in quantum technology around the world for exploitation by the UK. Quantum sensors and metrology, plus some devices in quantum communication, are the only areas where laboratory prototypes have already proven superior to their best classical counterparts. This sets the stage, credibly, for rapid and disruptive applications emerging from the Hub. The selection of prototypes will be driven by commercial pull, i.e. each prototype project within the Hub must demonstrate, from the outset, industry or practitioner engagement from our engineering and/or industrial collaborators. We have strong industry support across several disciplines, with structures in place to actively manage technology and knowledge transfer to the industry sector. Particular roles are played by NPL and e2v. We will collaborate closely with NPL, as metrology end-user, on clock, magnetometer and potentially Watt balance developments, with a lecturer-level Birmingham-NPL fellow contributed by Birmingham University, and with our PDRAs spending ~17 man-years, in addition to 3-5 PhD students, on these joint projects in the Advanced Metrology Laboratory/incubator space. e2v have a unique industrial manufacturing/R&D facility co-located within the School of Physics and Astronomy at Nottingham that has already catalysed the expansion of their activities into the quantum technology domain. Public engagement conveying the Hub's breakthroughs will be a high priority, for example annually at the Royal Society Summer Exhibitions. In addition to cohort-training of the 80 PhD students working within the Hub, the Hub will contribute to the training of ~500 PhD students via electronically shared lectures (many already running within the e-learning graduate schools MPAGS, MEGS, SEPNET and SUPA) across the institutions within the Hub. The Hub will create an internationally leading centre of excellence with major impact in the area of quantum sensors and metrology.
To widen the impact of the Hub and ensure long-term sustainability, we will actively pursue European and other international collaborative funding for both underlying fundamental research and the technology development.

  • Funder: UKRI Project Code: NE/L014076/1
    Funder Contribution: 638,057 GBP
    Partners: Cristal Pigment UK Ltd, University of Belgrade, University of Birmingham, University of Granada, UCT, C-Tech Innovation (United Kingdom), UWC, University of Surrey, Plymouth University, FCO...

    30 years' research on metal biorecovery from wastes has paid scant attention to strong CONTEMPORARY demands for (i) conservation of dwindling vital resources (e.g. platinum group metals (PGMs), more recently rare earth elements (REEs), base metals (BMs) and uranium) and (ii) the unequivocal need to extract/refine them in a non-polluting, low-energy way. 21st-century technologies increasingly rely on nanomaterials, which have novel properties not seen in bulk materials. Bacteria can fabricate nanoparticles (NPs), bottom up, atom by atom, with the exquisitely fine control offered by enzymatic synthesis and bio-scaffolding that chemistry cannot emulate. Bio-nanoparticles have proven applications in green chemistry, low-carbon energy and environmental protection, and potentially in photonics. Bacteria can be grown cheaply at scale for facile production. We have shown that bacteria can make nanomaterials from secondary wastes, yielding, in some cases, a metallic mixture which can show better activity than 'pure' nanoparticles. Such fabrication of structured bimetallics can be hard to achieve chemically. For some metals, like rare earths and uranium (which often co-occur in wastes), biorecovery from scraps (e.g. magnets for rare earths) and wastes (mixed U/rare earths), once these are separated, can make 'enriched' solids for delivery into further commercial refining to make new magnets (rare earths) or nuclear fuel (U). Biofabricating these solids is often beyond the ability of living cells, but the cells can form scaffolds, with enzymatic processes harnessed to make biomineral precursors, often selectively. B3 will invoke tiered levels of complexity, maturity and risk. (i) Base metal mining wastes (e.g. Cu, Ni) will be biorefined into concentrated sludges for chemical reprocessing, or alternatively to make base-metal bionanoproducts. (ii) Precious metal wastes will be converted into bionanomaterials for catalysis, environmental and energy applications. (iii) Rare earth metal wastes will be biomineralised for enriched feed into further refining or into new catalysts. (iv) Uranium waste will be biorefined into mineral precursors for commercial nuclear fuels. In all cases, the environment will be spared the dual impacts of primary-source pollution AND the high energy demand of processing from primary 'crude'. Metallic scraps are tougher, requiring acids for dissolution. Approaches will include the use of acidophilic bacteria, the use of alkalinizing enzymes, or using bacteria first to make a chemical catalyst (benignly) which can then convert the target metal of interest from the leachate into new nanomaterials (a hybrid living/non-living system, already demonstrated). Environmentally friendly leaching and acid recycling will be evaluated, and leaching processes optimised via extant predictive models. The interface between biology, chemistry, mineralogy and physics, exemplified by nanoparticles held in their unique 'biochemical nest', will receive special focus, being where major discoveries will be made; cutting-edge technologies will relate structure to function and validate the contribution of upstream waste doping or 'blending'; these, as well as novel materials processing, will increase bio-nanoparticle efficacy. Secondary wastes to be biorefined will include magnet scraps (rare earths), print cartridges (precious metals), road dusts (PMs, Fe, Ce) and metallurgical wastes (mixed rare earths/base metals/uranium).
Their complex, often refractory nature gives a higher 'risk' than mine wastes but, in compensation, the volumes are lower, and the scope for 'doping' or 'steering' to fabricate engineered nanomaterials is correspondingly higher. B3 will have an embedded, significant (~15%) iterative Life Cycle Analysis assessment of highlighted systems, with end-user trialling (supply chains; validations in conjunction with an industrial platform). B3 welcomes new 'joiners' from a raft of problem holders, brought in via the backing of the Partner network.