The following results are related to Canada. To view more results, visit OpenAIRE - Explore.

13 Projects
Filters: Canada · 2013-2022 · UKRI|EPSRC · OA Publications Mandate: No · 2019
  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they cannot be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and cannot replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living-cell-based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available online for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also develop low-cost, open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-world applicability of synthetic biology.

  • Funder: UKRI Project Code: EP/M01052X/1
    Funder Contribution: 731,953 GBP

    Condensed matter physics has developed a relatively complete theory of common phases in materials, leading to many technologically important devices including electronic screens, memory storage, and switching devices. Landau, or mean-field, theory has provided a framework to model, predict, and understand phases and transitions in a surprisingly diverse variety of materials and also dynamical systems. While these conventional ground states have proven technologically important and the underlying theory represents a major success for scientists, these phases have proven incredibly difficult to suppress and often emerge when new materials properties are sought or engineered. To discover novel phases that will lead to a new materials revolution, these common phases need to be suppressed to allow exotic and unconventional properties to emerge. The most common vehicle for turning off conventional phases in materials has been the introduction of disorder through chemical doping, resulting in strong random fields. Many important theories have been formulated and tested to describe the effects of random fields and in particular to account for the fine balance between surface and bulk free energy. However, the use of disorder has proved limiting: properties are often templated into the material and not directly controllable, and the resulting ground state of the material is difficult to understand. Another route to suppressing conventional phases, explored more intensively over the last decade, is to introduce strong fluctuations. While this can be done trivially with temperature, new phases have emerged from studying quantum systems, where the physics is governed by quantum mechanics and the Heisenberg uncertainty principle. The study of quantum systems has resulted in the discovery of many new phases of matter, including high temperature superconductors and quantum spin liquids, where the magnetism is dynamic at any temperature. A limitation of quantum fluctuations is that the properties do not carry over directly to ferroelectric-based systems or to multiferroics, where magnetic and structural properties are strongly coupled. Also, owing to the strongly fluctuating nature of the ground state, the properties have not been found to be easily tunable, limiting immediate use in applications. This proposal therefore aims to take a different route by studying classically frustrated systems, where a large ground-state degeneracy is introduced naturally through the lattice and quantum mechanical effects are small. Emphasis will be placed on lattices based upon a triangular geometry. The absence of the strong fluctuations that exist in quantum systems provides the ability to controllably tune between different ground states, making this route a potential means of creating new switching devices or novel memory-storage systems. The proposal aims to investigate classically frustrated magnets and ferroelectrics. These systems can be described within a common framework and will be studied using scattering techniques to provide a bulk, real-space image of the ground state. The properties will be tuned with magnetic and electric fields, supplying a direct route towards technologically applicable materials. The combined approach of investigating ferroelectrics and magnets will result in a complete understanding applicable to immediate industrial applications.
These new materials will lead to the discovery of new phases including new high temperature multiferroics, classical spin liquids, or localized controllable boundaries or defects.
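
    [Editorial illustration, not part of the proposal text.] The ground-state degeneracy that a triangular geometry introduces can be seen in a minimal sketch: an antiferromagnetic Ising triangle cannot satisfy all three bonds at once. The brute-force enumeration below, with an assumed coupling J = 1, is purely illustrative.

    from itertools import product

    J = 1.0  # antiferromagnetic coupling (J > 0 penalises aligned neighbours)
    bonds = [(0, 1), (1, 2), (0, 2)]  # the three bonds of a single triangle

    energies = {}
    for spins in product([-1, +1], repeat=3):  # all 2^3 Ising configurations
        energies[spins] = sum(J * spins[i] * spins[j] for i, j in bonds)

    e_min = min(energies.values())
    ground_states = [s for s, e in energies.items() if e == e_min]
    print(e_min, len(ground_states))  # prints "-1.0 6": six degenerate ground states

    Six of the eight configurations tie for the minimum energy because one bond is always left unsatisfied; this classical, lattice-induced degeneracy is what the proposal seeks to exploit.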

  • Funder: UKRI Project Code: EP/N018958/2
    Funder Contribution: 305,534 GBP

    "Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development with the knock-on effects to their productivity, and the reliability, readability and reproducibility of their software [Nature Biotechnology]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic - constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy and how best to deploy them is in itself a specialist domain, with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance, and load carrying. When optimally performed it results in a researcher who knows what challenges they can attack alone, and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities as they navigate through this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals who have a curiosity driven nature dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness, it will have a multiplicative effect since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from yet complements the top-down approach provided by the EPRSC-funded Software Sustainability Institute. The outcomes of this fellowship will be: Local and National RSE Capability: A RSE Group at Sheffield as a credible roadmap for others pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs. Scalable software support methods: A scalable approach based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality-guidelines for research software and for researchers on how best to incorporate research software engineering support within their grant proposals. HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into Desktop Applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners. Communication and public understanding: A communication campaign to raise the profile of research software exploiting high profile social media and online resources, establishing an informal forum for research software debate. References [Goble 2014] Goble, C. "Better Software, Better Research". 
IEEE Internet Computing 18(5): 4-8 (2014). [SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers", http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014). [Nature 2015] Editorial, "Rule rewrite aims to clean up scientific software", Nature 520(7547), April 2015.

  • Funder: UKRI Project Code: EP/R004730/1
    Funder Contribution: 101,150 GBP

    The objects of study in differential geometry are smooth manifolds, which correspond to smooth curved objects of finite dimension. In modern differential geometry, it is becoming more and more common to consider sequences (or flows) of smooth manifolds. Typically, the limits of such sequences (or flows) are no longer smooth. It is then useful to isolate a natural class of non-smooth objects which generalizes the classical notion of smooth manifold and is closed under the process of taking limits. If the sequence of manifolds satisfies a lower bound on the sectional curvatures, a natural class of non-smooth objects which is closed under (Gromov-Hausdorff) convergence is given by special metric spaces known as Alexandrov spaces; if instead the sequence of manifolds satisfies a lower bound on the Ricci curvatures, a natural class of non-smooth objects, closed under (measured Gromov-Hausdorff) convergence, is given by special metric measure spaces (i.e. metric spaces endowed with a reference volume measure) known as RCD(K,N) spaces. These are a 'Riemannian' refinement of the so-called CD(K,N) spaces of Lott-Sturm-Villani, which are metric measure spaces with Ricci curvature bounded below by K and dimension bounded above by N in a synthetic sense via optimal transport. In the proposed project we aim to understand in more detail the structure and the analytic and geometric properties of RCD(K,N) spaces. The new results will also have an impact on the classical world of smooth manifolds satisfying curvature bounds.
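
    [Editorial background, stated in standard notation rather than quoted from the project; for simplicity it gives the CD(K,∞) case, i.e. a lower Ricci bound K without the upper dimension bound N.] In the Lott-Sturm-Villani approach, a metric measure space (X, d, m) has Ricci curvature bounded below by K when the relative entropy is K-convex along Wasserstein geodesics:

    \[
      \operatorname{Ent}_m(\mu_t) \;\le\; (1-t)\,\operatorname{Ent}_m(\mu_0) + t\,\operatorname{Ent}_m(\mu_1)
      \;-\; \frac{K}{2}\, t(1-t)\, W_2^2(\mu_0,\mu_1), \qquad t \in [0,1],
    \]

    for every geodesic $(\mu_t)$ in the Wasserstein space $(\mathcal{P}_2(X), W_2)$, where $\operatorname{Ent}_m(\mu) = \int \rho \log \rho \, dm$ if $\mu = \rho\, m$ (and $+\infty$ otherwise). The 'Riemannian' refinement RCD additionally requires the Sobolev space $W^{1,2}(X,d,m)$ to be a Hilbert space.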

    Usage counts: 22 views · 56 downloads
  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform "Quantifier Elimination" (QE). We say a logical statement is "quantified" if it is preceded by a qualification such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise, the implications are unclear - what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an unquantified one which is equivalent. In this case we may replace the statement by: "b^2 - 4ac > 0", which is the condition for x to have two solutions. You may have recognised this equivalence from GCSE mathematics, when studying the quadratic equation. The important point here is that the latter statement can actually be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not in the form of a numerical answer but an algebraic description which offers insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how in general the number of solutions depends on the coefficients a, b, and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; another from economics is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases, QE can theoretically help, but in practice the size of the statements means state-of-the-art procedures run out of computer time/memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical but understudied problem, and is the focus of this project. At the moment QE procedures make such choices either under direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition / experience but with limited scientific basis). The purpose of this project is to replace these with machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with the field of Symbolic Computation which studies QE, as the latter prizes exact correctness and so shuns the use of probabilistic tools, making its application here very novel. We are able to combine these different worlds because the choices which we will use ML to make will all produce a correct and exact answer (but with different computational costs). The project follows pilot studies undertaken by the PI which experimented with one ML technique and found it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use for both low-level algorithm decisions and choices between different theories and implementations.
Although focused on QE, we will also demonstrate ML as a new route to optimisation in Computer Algebra more broadly; the work encompasses Project Partners and events to maximise this. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple in collaboration with industrial Project Partner Maplesoft.
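
    [Editorial aside; the formula below is standard textbook material rather than output quoted from the project.] Written out precisely over the real numbers, the quadratic example above is the QE equivalence

    \[
      \exists x_1\, \exists x_2\; \bigl( x_1 \neq x_2 \;\wedge\; a x_1^2 + b x_1 + c = 0 \;\wedge\; a x_2^2 + b x_2 + c = 0 \bigr)
      \;\Longleftrightarrow\; a \neq 0 \;\wedge\; b^2 - 4ac > 0,
    \]

    where the right-hand side is quantifier-free and mentions only the coefficients a, b and c; producing such an equivalent, quantifier-free condition is exactly what a QE procedure computes.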

  • Funder: UKRI Project Code: EP/R009600/1
    Funder Contribution: 101,024 GBP

    BACKGROUND Values are deeply held principles guiding the decision-making processes of individuals, groups and organizations. Software is inevitably affected by values: the organizational values of the project sponsor, research partners, and developers. Some values (e.g. financial value) are easier to quantify than others (e.g. trust, responsibility), with the latter often dismissed in software production processes as lacking measurable evidence. This is problematic because all values, including those that are less easy to measure, influence how people use, access and engage with software systems, with far-reaching impact not only on the commercial success of software products, but more widely on society. VISION Values-First SE is a systematic and disciplined approach to the elicitation, articulation, and deliberation of human values in software production. Given the pervasiveness of software and its impact on society, we - developers, researchers, clients, and end-users - must strengthen our capacity and confidence to externalise the values-sets built into software and use them to track how software behaves. Recent examples such as the Google boycott stem from the (often unintentional) breach of implicitly held values-systems: simply put, companies do not want to be associated with extremist values perceived as opposite to those held by society. However, the interplay between values held by the software industry (e.g. prestige, social responsibility), clients (e.g. financial, care for their customers and employees), and end-users (e.g. trust, social justice) is complex, difficult to articulate and rarely fully captured by current SE decision-making processes. CONTEXT Awareness of the impact of software on politics, society, and the environment is not new: from cyber-security to environmental informatics to digital health, a large body of ethical computing work exists looking at mechanisms that can guide developers' and managers' responsibilities (e.g. codes of ethics). What is new is the unprecedented scale, reach and complexity of such impact and the urgency for developers to "be prepared to be responsible". One of the biggest challenges for developers is that the full impact of values choices in the code developed is often unseen and unintentional: when writing software, the platform often obfuscates the process even to the software developer. How can software developers be prepared to be responsible when it is not clear what they should be responsible for? For example, in the Android Software Development Kit (SDK), the geocoding of location is done by sharing the precise location with a third-party organization (including Google). The implications of sharing this data, and how it is stored, treated and reused, are not fully explained to the developer in tutorials, in the SDK documentation or in the IDE at the time of writing the code. Those simple lines of code could have an unseen impact on the encoded values of the software produced. However, current SE methods do not seem to provide the means to follow values through a software development process and, in particular, there are currently no methods that allow values to be used as a reference framework for decision making at key stages of software development. Articulating, negotiating, and capturing human values across SE decision-making processes is precisely the main aim of Values-First SE.

  • Funder: UKRI Project Code: EP/K034383/1
    Funder Contribution: 2,246,110 GBP

    L-functions and modular forms are fundamental mathematical objects that encode much of our knowledge of contemporary number theory. They form part of a web of interconnected objects, the understanding of which in the most basic cases lies at the foundations of much of modern mathematics. A spectacular example is Wiles' proof of Fermat's Last Theorem, which was an application of a fundamental "modularity" link between L-functions, modular forms and elliptic curves. This project will greatly extend and generalize such connections, both theoretically and computationally. The research vision inspiring our programme can be summarised as: "Breaking the boundaries of classical L-functions and modular forms, and exploring their applications to 21st-century mathematics, physics, and computer science". Our guiding goal is to push forward both theoretical and algorithmic developments, in order to develop L-functions and modular forms far beyond current capabilities. This programme will systematically develop an extensive catalogue of number theoretic objects, and will make this information available through an integrated online resource that will become an indispensable tool for the world's research community. L-functions are to pure mathematics what fundamental particles are to physics: their interactions reveal fundamental truths. To continue the analogy, computers are to number theorists what colliders are to particle physicists. Aside from their established role as empirical "testers" for conjectures and theories, experiments can often throw up quite unexpected phenomena which go on to reshape modern theory. Our programme will establish a major database and encyclopedia of knowledge about L-functions and related objects, which will play a role analogous to that of the LHC for the scientists at CERN. Both are at the threshold of tantalising glimpses into completely uncharted territories: higher degree L-functions for us and the Higgs boson for them. Theoretical and computational work on higher degree L-functions has only started to make substantial progress in the past few years. There do not currently exist efficient methods to work with these, and rigorous computations with them are not yet possible. Neither is there yet an explicit description of all ways in which degree 3 L-functions can arise. We will address these facets in our research programme: both algorithmic development and theoretical classification. As well as having theoretical applications to modularity relationships as in Wiles' proof, detailed knowledge of L-functions has more far-reaching implications. Collections of L-functions have statistical properties which first arose in theoretical physics. This surprising connection, which has witnessed substantial developments led by researchers in Bristol, has fundamental predictive power in number theory; the synergy will be vastly extended in this programme. In another strand, number theory plays an increasingly vital role in computing and communications, as evidenced by its striking applications to both cryptography and coding theory. The Riemann Hypothesis (one of the Clay Mathematics Million Dollar Millennium Problems) concerns the distribution of prime numbers, and the correctness of the best algorithms for testing large prime numbers depends on the truth of a generalised version of this 150-year-old unsolved problem. These are algorithms which are used by public-key cryptosystems that everyone who uses the Internet relies on daily, and that underpin our digital economy.
Our programme involves the creation of a huge amount of data about a wide range of modular forms and L-functions, which will far surpass in range and depth anything computed before in this area. This in turn will be used to analyse some of the most famous outstanding problems in mathematics, including the Riemann Hypothesis and another Clay problem, the Birch and Swinnerton-Dyer conjecture.
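
    [Editorial background, standard material rather than project text.] The prototype L-function is the Riemann zeta function,

    \[
      \zeta(s) \;=\; \sum_{n=1}^{\infty} n^{-s} \;=\; \prod_{p \ \mathrm{prime}} \bigl(1 - p^{-s}\bigr)^{-1}, \qquad \operatorname{Re}(s) > 1,
    \]

    and the Riemann Hypothesis referred to above asserts that every non-trivial zero of its analytic continuation lies on the critical line $\operatorname{Re}(s) = \tfrac{1}{2}$.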

    Usage counts: 81 views · 108 downloads
  • Funder: UKRI Project Code: EP/N018745/1
    Funder Contribution: 320,381 GBP

    Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation, is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation, in the 1980s, that entanglement represented not merely a curiosity of quantum theory but a critical resource which could be exploited to achieve heretofore impossible communication tasks. Bell identified quantum nonlocality as the essentially quantum aspect of entanglement in the 1960s. While it is widely understood that quantum computation offers substantial efficiency advantages over classical computation for particular problems, it is understood neither what the precise class of such problems is nor which particular aspect or aspects of quantum theory enable these advantages. The applications for QC which have been identified are likely only a fraction of the full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark. Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features---such as entanglement, superposition, and discord---have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A, 88) and Howard et al. (Nature, 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, a sought-after key to understanding the unique capabilities of QC. Our vision is to deepen the theory of contextuality with the goals of achieving an understanding of the precise role it plays in QC and how it is a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.
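
    [Editorial background on the nonlocality Bell identified; the inequality below is standard and not quoted from the proposal.] In the CHSH form, any local hidden-variable model obeys

    \[
      S \;=\; \langle A_0 B_0 \rangle + \langle A_0 B_1 \rangle + \langle A_1 B_0 \rangle - \langle A_1 B_1 \rangle, \qquad |S| \le 2,
    \]

    while measurements on entangled quantum states can reach $|S| = 2\sqrt{2}$ (Tsirelson's bound). Contextuality generalises this gap between classical and quantum correlations from spatially separated measurements to compatible measurements on a single system.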

  • Funder: UKRI Project Code: EP/M013294/1
    Funder Contribution: 35,513,900 GBP

    The Hub will create a seamless link between science and applications by building on our established knowledge exchange activities in quantum technologies. We will transform science into technology by developing new products, demonstrating their applications and advantages, and establishing a strong user base in diverse sectors. Our overarching ambition is to deliver a wide range of quantum sensors to underpin many new commercial applications. Our key objective is to ensure that the Hub's outputs will have been picked up by companies, or industry-led TSB projects, by the end of the funding period. The Hub will comprise: a strong fabrication component; quantum scientists with a demonstrated ability to combine scientific excellence with technological delivery; and leading engineers with the broad collective expertise and connections required to develop and use new quantum sensors. We have identified, and actively involved, industry enablers to build a supply chain for quantum sensor technology. As well as direct physics connections to industry, the engineers provide strong links to relevant industrial users, thus providing information on industrial needs and enabling rapid prototype deployment in the field. To establish a coherent national collaborative effort, the Hub will include a UK network on quantum sensors and metrology, which will also exploit the connections that Prof Bongs and all Hub members have forged in Europe, the US and Asia. This inter-linkage ensures capture of the most advanced developments in quantum technology around the world for exploitation by the UK. Quantum sensors and metrology, plus some devices in quantum communication, are the only areas where laboratory prototypes have already proven superior to their best classical counterparts. This sets the stage, credibly, for rapid and disruptive applications emerging from the Hub. The selection of prototypes will be driven by commercial pull, i.e. each prototype project within the Hub must demonstrate, from the outset, industry or practitioner engagement from our engineering and/or industrial collaborators. We have strong industry support across several disciplines, with structures in place to actively manage technology and knowledge transfer to the industry sector. Particular roles are played by NPL and e2v. We will collaborate closely with NPL as a metrology end-user on clock, magnetometer and potentially Watt balance developments, with a lecturer-level Birmingham-NPL fellow contributed by Birmingham University and our PDRAs spending ~17 man-years, in addition to 3-5 PhD students, on these joint projects in the Advanced Metrology Laboratory/incubator space. e2v has a unique industrial manufacturing/R&D facility co-located within the School of Physics and Astronomy at Nottingham that has already catalysed the expansion of their activities into the Quantum Technology domain. Public Engagement conveying the Hub's breakthroughs will be a high priority - for example annually at the Royal Society Summer Exhibitions. In addition to cohort-training of 80 PhD students working within the Hub, the Hub will contribute to the training of ~500 PhD students via electronically-shared lectures (many already running within the e-learning graduate schools MPAGS, MEGS, SEPNET and SUPA) across the institutions within the Hub. The Hub will create an internationally-leading centre of excellence with major impact in the area of quantum sensors and metrology.
To widen the impact of the Hub and ensure long-term sustainability, we will actively pursue European and other international collaborative funding for both underlying fundamental research and the technology development.

    Usage counts: 733 views · 4,840 downloads
  • Funder: UKRI Project Code: EP/P006078/2
    Funder Contribution: 225,535 GBP

    Some of the most fundamental and perhaps bizarre processes expected to occur in the vicinity of black holes are out of observational reach. To address this issue we utilise analogue systems, where we study fluctuations on a background flow that, in the experiment, reproduces an effective black hole. In the literature this line of research is referred to as analogue models for gravity, or simply analogue gravity. Analogue models provide not only a theoretical but also an experimental framework in which to verify predictions of classical and quantum fields exposed to 'extreme' spacetime geometries, such as rapidly rotating black holes. This project brings together two internationally recognised experts in the field of analogue gravity with the aim of pushing the field in a new direction: we propose ground-breaking studies to mimic some of the bizarre processes occurring in the vicinity of rotating black holes from general relativity and rotating fluids in both water and optical systems. In particular, we will investigate both theoretically and experimentally the interaction between an input wave and a rotating black hole spacetime geometry, here recreated by the rotating fluid. This allows us to mimic a scattering process associated with rotating black holes, called superradiant scattering. From a historical viewpoint this kind of radiation is the precursor to Hawking radiation. More precisely, black hole superradiance is the scattering of waves from a rotating black hole: if the incoming wave also possesses a small amount of angular momentum, it will be reflected with an increased amplitude, i.e. it is amplified at the expense of the black hole, which thus loses some of its rotational energy. It has also been pointed out that the same physics may take place in very different systems; for example, light incident on a rotating metallic (or absorbing) cylinder may also be amplified upon reflection. Yet, no-one has ever attempted to experimentally investigate the underlying physics, which extend beyond general relativity and are relevant to a variety of hydrodynamical and rotating systems. We aim to provide the first ever experimental evidence of this intriguing and fundamental amplification mechanism in two different hydrodynamical systems. The first is a water spout, controlled so that the correct boundary conditions are obtained and optimised for observing black hole superradiant scattering (BH-SS). The second is a less conventional fluid that is made out of light. Light propagating in a special medium can behave as a fluid or even a superfluid. By building upon highly developed photonic technologies, e.g. for the control and measurement of laser beam wavefronts, we will implement very precisely tailored and characterised experiments. One of the unique aspects of this project is the marriage between two very different lab-based systems, one using water, the other using light, to tackle an outstanding problem in physics that is of relevance to astrophysics, hydrodynamic and optical systems.
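
    [Editorial pointer to the standard amplification criterion; not text from the proposal.] For a bosonic wave of frequency $\omega$ and azimuthal number $m$ scattered off a black hole whose horizon rotates with angular velocity $\Omega_H$, superradiant amplification occurs when

    \[
      0 \;<\; \omega \;<\; m\,\Omega_H ,
    \]

    in which case the reflected wave carries more energy than the incident one ($|R|^2 > 1$) and the hole loses rotational energy; the analogue experiments aim to realise the same condition with the rotating fluid playing the role of the black hole.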

    Usage counts: 33 views · 224 downloads
Advanced search in
Projects
arrow_drop_down
Searching FieldsTerms
Any field
arrow_drop_down
includes
arrow_drop_down
The following results are related to Canada. Are you interested to view more results? Visit OpenAIRE - Explore.
13 Projects
  • Funder: UKRI Project Code: EP/P017401/1
    Funder Contribution: 100,794 GBP

    Synthetic biology is an exciting new discipline which offers the potential to bring many benefits to human health and welfare. One near-market example is the use of engineered genetic networks to make biological sensors, or biosensors, which can rapidly detect toxins and harmful microorganisms. However, most synthetic biology systems are based on living genetically modified cells, and due to safety concerns and regulatory issues, they can not be used outside of a specially approved laboratory, whereas the greatest unmet need for biosensors is in the field, for 'point-of-use' and 'point-of-care' tests for health hazards. The laboratory of Professor James Collins recently reported a remarkable breakthrough, using non-living biological systems based on genetic components dried onto strips of paper. These systems can be prepared very cheaply, can be stored stably for long periods, and, since they are not alive and can not replicate, they pose no risks to the environment. This technology is therefore ideal for further development of sensors for human health. In addition, these cell-free systems can be prepared in large numbers very rapidly, in a matter of hours, and tested rapidly, in a matter of minutes, whereas living cell based systems may take weeks to prepare and days to test. This makes the new technology ideal for 'rapid prototyping' of genetic circuits. Many designs can be rapidly generated and tested, and the most successful can then be used to generate cell-based systems for applications where this is required, such as engineered metabolic pathways for manufacturing pharmaceuticals and other valuable compounds. In this project, we will further develop these remarkable systems and create new tools which will make it even easier to design and develop them. Firstly, we will create new computational tools which can be used to design genetic circuits for many applications. These will be made available on-line for the benefit of the research community. Secondly, we will establish methods for rapid automated assembly and testing of new circuits, allowing many thousands of variants to be generated and tested in a very short time with minimal human effort. Thirdly, we will seek to improve the basic technology, to improve the performance of the cell-free devices, and also develop low cost open-source electronic readers which can easily be used in the field along with the sensors we develop. Fourthly, we will demonstrate the usefulness of the technology by generating sensors which can rapidly and sensitively detect various external inputs. All of our new inventions will be made available to the research community. In addition to the other advantages mentioned above, this technology also makes it easy for users to develop their own assays simply by adding appropriate DNA components to a basic mixture, using standard protocols. Such devices can be manufactured and distributed cheaply on a very large scale. In conjunction with low-cost readers, ubiquitous mobile devices equipped with GPS and time data, and cloud-computing, this will offer the possibility to detect health hazards with unprecedented levels of speed and detail, with potentially huge effects on human health and welfare. Furthermore, these devices are ideal for use in education, allowing users to design and test their own genetic circuits without the issues inherent in using living cells. For these reasons, our proposal offers tremendous benefits and represents a step change in the real-word applicability of synthetic biology.

    more_vert
  • Funder: UKRI Project Code: EP/M01052X/1
    Funder Contribution: 731,953 GBP

    Condensed matter physics has developed a relatively complete theory of common phases in materials leading to many technologically important devices including electronic screens, memory storage, and switching devices. Landau, or mean-field theory, has provided a framework to model, predict, and understand phases and transitions in a surprisingly diverse variety of materials and also dynamical systems. While these conventional ground states have proven technologically important and the underlying theory represents a major success for scientists, these phases have proven incredibly difficult to suppress and often emerge when new materials properties are sought or engineered. To discover novel phases that will lead to a new materials revolution, these common phases need to be suppressed to allow exotic and unconventional properties to emerge. The most common vehicle to turn off conventional phases in materials has been through the introduction of disorder through chemical doping resulting in strong random fields. Many important theories have been formulated and tested to describe the effects of random fields and in particular to account for the fine balance between surface and bulk free energy. However, the use of disorder has proved limiting as properties are often templated into the material and not directly controllable and also the resulting ground state of the material is difficult to understand. Another route, which has more recently been explored in the last decade, to suppress conventional phases is by introducing strong fluctuations. While this can be trivially done with temperature, new phases have emerged by studying quantum systems where the physics are governed by quantum mechanics and the Heisenberg uncertainty principle. The study of quantum systems has resulted in the discovery of many new phases of matter including high temperature superconductors and also quantum spin-liquids where the magnetism is dynamic at any temperature. A limitation of quantum fluctuations is that the properties do not carry over directly to ferroelectric based systems and also multiferroics where magnetic and structural properties are strongly coupled. Also, owing to the strong fluctuating nature of the ground state, the properties have not been found to be easily tunable limiting immediate use for applications. This proposal aims to therefore take a different route by studying classically frustrated systems where a large ground state degeneracy is introduced naturally through the lattice and quantum mechanical effects are small. Emphasis will be placed on lattices based upon a triangular geometry. The lack of strong fluctuations (that exists in quantum systems) provides the ability to controllably tune between different ground states making this route a potential means of creating new switching devices or novel memory storage systems. The proposal aims to investigate classically frustrated magnets and ferroelectrics. These systems can be described within a common framework and will be studied using scattering techniques to provide a bulk real space image of the ground state. The properties will be tuned with magnetic and electric fields supplying a direct route for discovering a new route towards technologically applicable materials. The combined approach of investigating ferroelectrics and magnets will result in a complete understanding applicable to immediate industrial applications. 
These new materials will lead to the discovery of new phases including new high temperature multiferroics, classical spin liquids, or localized controllable boundaries or defects.

    more_vert
  • Funder: UKRI Project Code: EP/N018958/2
    Funder Contribution: 305,534 GBP

    "Software is the most prevalent of all the instruments used in modern science" [Goble 2014]. Scientific software is not just widely used [SSI 2014] but also widely developed. Yet much of it is developed by researchers who have little understanding of even the basics of modern software development with the knock-on effects to their productivity, and the reliability, readability and reproducibility of their software [Nature Biotechnology]. Many are long-tail researchers working in small groups - even Big Science operations like the SKA are operationally undertaken by individuals collectively. Technological development in software is more like a cliff-face than a ladder - there are many routes to the top, to a solution. Further, the cliff face is dynamic - constantly and quickly changing as new technologies emerge and decline. Determining which technologies to deploy and how best to deploy them is in itself a specialist domain, with many features of traditional research. Researchers need empowerment and training to give them confidence with the available equipment and the challenges they face. This role, akin to that of an Alpine guide, involves support, guidance, and load carrying. When optimally performed it results in a researcher who knows what challenges they can attack alone, and where they need appropriate support. Guides can help decide whether to exploit well-trodden paths or explore new possibilities as they navigate through this dynamic environment. These guides are highly trained, technology-centric, research-aware individuals who have a curiosity driven nature dedicated to supporting researchers by forging a research software support career. Such Research Software Engineers (RSEs) guide researchers through the technological landscape and form a human interface between scientist and computer. A well-functioning RSE group will not just add to an organisation's effectiveness, it will have a multiplicative effect since it will make every individual researcher more effective. It has the potential to improve the quality of research done across all University departments and faculties. My work plan provides a bottom-up approach to providing RSE services that is distinctive from yet complements the top-down approach provided by the EPRSC-funded Software Sustainability Institute. The outcomes of this fellowship will be: Local and National RSE Capability: A RSE Group at Sheffield as a credible roadmap for others pump-priming a UK national research software capability; and a national Continuing Professional Development programme for RSEs. Scalable software support methods: A scalable approach based on "nudging", to providing research software support for scientific software efficiency, sustainability and reproducibility, with quality-guidelines for research software and for researchers on how best to incorporate research software engineering support within their grant proposals. HPC for long-tail researchers: 'HPC-software ramps' and a pathway for standardised integration of HPC resources into Desktop Applications fit for modern scientific computing; a network of HPC-centric RSEs based around shared resources; and a portfolio of new research software courses developed with partners. Communication and public understanding: A communication campaign to raise the profile of research software exploiting high profile social media and online resources, establishing an informal forum for research software debate. References [Goble 2014] Goble, C. "Better Software, Better Research". 
IEEE Internet Computing 18(5): 4-8 (2014) [SSI 2014] Hettrick, S. "It's impossible to conduct research without software, say 7 out of 10 UK researchers" http://www.software.ac.uk/blog/2014-12-04-its-impossible-conduct-research-without-software-say-7-out-10-uk-researchers (2014) [Nature 2015] Editorial "Rule rewrite aims to clean up scientific software", Nature Biotechnology 520(7547) April 2015

    more_vert
  • Funder: UKRI Project Code: EP/R004730/1
    Funder Contribution: 101,150 GBP

    The subject of study of differential geometry are smooth manifolds, which correspond to smooth curved objects of finite dimension. In modern differential geometry, it is becoming more and more common to consider sequences (or flows) of smooth manifolds. Typically the limits of such sequences (or flows) are non smooth anymore. It is then useful to isolate a natural class of non smooth objects which generalize the classical notion of smooth manifold, and which is closed under the process of taking limits. If the sequence of manifolds satisfy a lower bound on the sectional curvatures, a natural class of non-smooth objects which is closed under (Gromov-Hausdorff) convergence is given by special metric spaces known as Alexandrov spaces; if instead the sequence of manifolds satisfy a lower bound on the Ricci curvatures, a natural class of non-smooth objects, closed under (measured Gromov-Hausdorff) convergence, is given by special metric measure spaces (i.e. metric spaces endowed with a reference volume measure) known as RCD(K,N) spaces. These are a 'Riemannian' refinement of the so called CD(K,N) spaces of Lott-Sturm-Villani, which are metric measure spaces with Ricci curvature bounded below by K and dimension bounded above by N in a synthetic sense via optimal transport. In the proposed project we aim to understand in more detail the structure, the analytic and the geometric properties of RCD(K,N) spaces. The new results will have an impact also on the classical world of smooth manifolds satisfying curvature bounds.

    visibility22
    visibilityviews22
    downloaddownloads56
    Powered by Usage counts
    more_vert
  • Funder: UKRI Project Code: EP/R019622/1
    Funder Contribution: 100,987 GBP

    This project concerns computational mathematics and logic. The aim is to improve the ability of computers to perform ``Quantifier Elimination'' (QE). We say a logical statement is ``quantified'' if it is preceded by a qualification such as "for all" or "there exists". Here is an example of a quantified statement: "there exists x such that ax^2 + bx + c = 0 has two solutions for x". While the statement is mathematically precise the implications are unclear - what restrictions does this statement of existence force upon us? QE corresponds to replacing a quantified statement by an unquantified one which is equivalent. In this case we may replace the statement by: "b^2 - 4ac > 0", which is the condition for x to have two solutions. You may have recognised this equivalence from GCSE mathematics, when studying the quadratic equation. The important point here is that the latter statement can actually be derived automatically by a computer from the former, using a QE procedure. QE is not subject to the numerical rounding errors of most computations. Solutions are not in the form of a numerical answer but an algebraic description which offers insight into the structure of the problem at hand. In the example above, QE shows us not what the solutions to a particular quadratic equation are, but how in general the number of solutions depends on the coefficients a, b, and c. QE has numerous applications throughout engineering and the sciences. An example from biology is the determination of medically important values of parameters in a biological network; while another from economics is identifying which hypotheses in economic theories are compatible, and for what values of the variables. In both cases, QE can theoretically help, but in practice the size of the statements means state-of-the-art procedures run out of computer time/memory. The extensive development of QE procedures means they have many options and choices about how they are run. These decisions can greatly affect how long QE takes, rendering an intractable problem easy and vice versa. Making the right choice is a critical, but understudied problem and is the focus of this project. At the moment QE procedures make such choices either under direct supervision of a human or based on crude human-made heuristics (rules of thumb based on intuition / experience but with limited scientific basis). The purpose of this project is to replace these by machine learning techniques. Machine Learning (ML) is an overarching term for tools that allow computers to make decisions that are not explicitly programmed, usually involving the statistical analysis of large quantities of data. ML is quite at odds with the field of Symbolic Computation which studies QE, as the latter prizes exact correctness and so shuns the use of probabilistic tools making its application here very novel. We are able to combine these different worlds because the choices which we will use ML to make will all produce a correct and exact answer (but with different computational costs). The project follows pilot studies undertaken by the PI which experimented with one ML technique and found it improved upon existing heuristics for two particular decisions in a QE algorithm. We will build on this by working with the spectrum of leading ML tools to identify the optimal techniques for application in Symbolic Computation. We will demonstrate their use for both low level algorithm decisions and choices between different theories and implementations. 
Although focused on QE, we will also demonstrate ML as being a new route to optimisation in Computer Algebra more broadly and work encompasses Project Partners and events to maximise this. Finally, the project will deliver an improved QE procedure that makes use of ML automatically, without user input. This will be produced in the commercial Computer Algebra software Maple in collaboration with industrial Project Partner Maplesoft.

    more_vert
  • Funder: UKRI Project Code: EP/R009600/1
    Funder Contribution: 101,024 GBP

    BACKGROUND Values are deeply held principles guiding decision-making processes of individuals, groups and organizations. Software is inevitably affected by values: the organizational values of the project sponsor, research partners, and developers. Some values (e.g. financial value) are easier to quantify than others (e.g. trust, responsibility) with the latter often dismissed in software production processes as lacking of measurable evidence. This is problematic because all values, including less-easy to measure ones, influence how people use, access and engage with software systems with far-reaching impact not only on the commercial success of software products, but more widely on society. VISION Values-First SE is a systematic and disciplined approach to the elicitation, articulation, and deliberation of human values in software production. Given the pervasiveness of software and its impact on society, we - developers, researchers, clients, and end-user - must strengthen our capacity and confidence to externalise the values-sets built into software and use them to track how software behaves. Recent examples such as the Google boycott stem from the (often unintentional) breach of implicitly held values-systems: simply put, companies do not want to be associated with extremist values perceived as opposite to those held by society. However, the interplay between values held by software industry (e.g. prestige, social responsibility), clients (e.g. financial, care for their customers and employees), and end-users (e.g. trust, social justice) is complex, difficult to articulate and rarely fully captured by current SE decision-making processes. CONTEXT Awareness of the impact of software on politics, society, and the environment is not new: from cyber-security to environmental informatics, to digital-health, a large body of ethical computing exists looking at mechanisms that can guide developers and managers' responsibilities (e.g. code of Ethics). What is new is the unprecedented scale, reach and complexity of such impact and the urgency for developers to "be prepared to be responsible". One of the biggest challenges for developers is that the full impact of values choices in the code developed is often unseen and unintentional: when writing software, often the platform obfuscates the process even to the software developer. How can software developers be prepared to be responsible when it is not clear what they should be responsible for? For example, in the Android Software Development Kit (SDK), the geocoding of location is done by sharing of precise location with a third party organization (including Google). The implications of sharing this data, how it is stored, treated and reused is not fully explained to the developer in tutorials, in the SDK documentation or in the IDE at the time of writing the code. Those simple lines of code for the developer could have an unseen impact on the encoded values of the software produced. However, current SE methods do not seem to provide the means to follow values through a software development process and, in particular, there are currently no methods that allow values to be used as a reference framework for decision making at key stages of software development. Articulating, negotiating, and capturing human values across SE decision-making processes is precisely Values-first SE main aim.

    more_vert
  • Funder: UKRI Project Code: EP/K034383/1
    Funder Contribution: 2,246,110 GBP

    L-functions and modular forms are fundamental mathematical objects that encode much of our knowledge of contemporary number theory. They form part of a web of interconnected objects, the understanding of which in the most basic cases lies at the foundations of much of modern mathematics. A spectacular example is Wiles' proof of Fermat's Last Theorem, which was an application of a fundamental "modularity" link between L-functions, modular forms and elliptic curves. This project will greatly extend and generalize such connections, both theoretically and computationally. The research vision inspiring our programme can be summarised as: "Breaking the boundaries of classical L-functions and modular forms, and exploring their applications to 21st-century mathematics, physics, and computer science". Our guiding goal is to push forward both theoretical and algorithmic developments, in order to develop L-functions and modular forms far beyond current capabilities. This programme will systematically develop an extensive catalogue of number theoretic objects, and will make this information available through an integrated online resource that will become an indispensable tool for the world's research community. L-functions are to pure mathematics what fundamental particles are to physics: their interaction reveal fundamental truths. To continue the analogy, computers are to number theorists what colliders are to particle physicists. Aside from their established role as empirical "testers" for conjectures and theories, experiments can often throw up quite unexpected phenomena which go on to reshape modern theory. Our programme will establish a major database and encyclopedia of knowledge about L-functions and related objects, which will play a role analogous to that of the LHC for the scientists at CERN. Both are at the threshold of tantalising glimpses into completely uncharted territories: higher degree L-functions for us and the Higgs boson for them. Theoretical and computational work on higher degree L-functions has only started to make substantial progress in the past few years. There do not currently exist efficient methods to work with these, and rigorous computations with them are not yet possible. Neither is there yet an explicit description of all ways in which degree 3 L-functions can arise. We will address these facets in our research programme: both algorithmic development and theoretical classification. As well as having theoretical applications to modularity relationships as in Wiles' proof, detailed knowledge of L-functions has more far-reaching implications. Collections of L-functions have statistical properties which first arose in theoretical physics. This surprising connection, which has witnessed substantial developments led by researchers in Bristol, has fundamental predictive power in number theory; the synergy will be vastly extended in this programme. In another strand, number theory plays an increasingly vital role in computing and communications, as evidenced by its striking applications to both cryptography and coding theory. The Riemann Hypothesis (one of the Clay Mathematics Million Dollar Millennium Problems) concerns the distribution of prime numbers, and the correctness of the best algorithms for testing large prime numbers depend on the truth of a generalised version of this 150-year-old unsolved problem. These are algorithms which are used by public-key cryptosystems that everyone who uses the Internet relies on daily, and that underpin our digital economy. 
Our programme involves the creation of a huge amount of data about a wide range of modular forms and L-functions, which will far surpass in range and depth anything computed before in this area. This in turn will be used to analyse some of the most famous outstanding problems in mathematics, including the Riemann Hypothesis and another Clay problem, the Birch and Swinnerton-Dyer conjecture.
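    As a small illustration of the kind of computation such a catalogue supports, the sketch below (a minimal example assuming Python with the mpmath library, which is not mentioned in the abstract) checks numerically that the first few nontrivial zeros of the Riemann zeta function, the simplest L-function, lie on the critical line Re(s) = 1/2, exactly as the Riemann Hypothesis predicts for all of them:

        # Minimal sketch: check numerically that the first nontrivial zeros of
        # the Riemann zeta function lie on the critical line Re(s) = 1/2.
        # Assumes the mpmath library; not part of the project's own software.
        from mpmath import mp, zetazero, zeta

        mp.dps = 25  # working precision in decimal places

        for n in range(1, 6):
            rho = zetazero(n)  # n-th nontrivial zero of zeta(s)
            # real part is 0.5 for each of these zeros, and |zeta(rho)| ~ 0
            print(n, rho.real, rho.imag, abs(zeta(rho)))

    Rigorous, large-scale versions of this kind of verification, extended to higher degree L-functions, are the sort of capability the programme's algorithmic strand aims to provide.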

    Usage counts: 81 views, 108 downloads
  • Funder: UKRI Project Code: EP/N018745/1
    Funder Contribution: 320,381 GBP

    Realizing the potential of applications of quantum theory to information processing, which include quantum communication and quantum computation (QC), is one of the primary goals of contemporary engineering and physics. The key theoretical breakthroughs enabling quantum communication technologies were the discovery of the phenomenon of quantum entanglement in the 1930s and the realisation, in the 1980s, that entanglement is not merely a curiosity of quantum theory but a critical resource which can be exploited to achieve heretofore impossible communication tasks. Bell identified quantum nonlocality as the essentially quantum aspect of entanglement in the 1960s.

    While it is widely understood that quantum computation offers substantial efficiency advantages over classical computation for particular problems, neither the precise class of such problems nor the particular aspect or aspects of quantum theory enabling these advantages is understood. The applications of QC identified so far are likely only a fraction of its full potential, however, as only a handful of quantum algorithms have been discovered. Peter Shor, whose discovery of the first practical quantum algorithm founded modern quantum computer science, contemplated why so few quantum algorithms have been discovered and suggested that "quantum computers operate in a manner so different from classical computers that our techniques for designing algorithms and our intuitions for understanding the process of computation no longer work". In seeking quantum algorithms without a clear idea of the essential quantum phenomenon accounting for quantum computational advantage, we are working in the dark.

    Despite decades of research, the key feature of quantum theory enabling quantum advantage over classical computers remains elusive. Several of quantum theory's novel features, such as entanglement, superposition, and discord, have been proposed as candidates but have subsequently proven insufficient. Recent evidence, such as that provided by Raussendorf (Phys. Rev. A, 88) and Howard et al. (Nature, 510), demonstrates that a generalization of nonlocality called contextuality plays an important role in QC and suggests that it is, perhaps, the sought-after key to understanding the unique capabilities of QC.

    Our vision is to deepen the theory of contextuality with the goals of achieving an understanding of the precise role it plays in QC and of how it serves as a resource for computational advantage. Our team is uniquely positioned to tackle this challenge: the PIs are co-inventors of the two leading theoretical frameworks for contextuality. We will achieve our goal by collaborating with an international, interdisciplinary team of experts, including those responsible for the initial evidence linking contextuality and QC as well as recognized leaders in quantum algorithms and the resource theory of nonlocality.
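    The reference to Bell and quantum nonlocality can be made concrete with a small numerical example. The sketch below (a minimal illustration assuming Python with numpy; the observables and state are textbook choices, not taken from the project) evaluates the CHSH correlation for a two-qubit singlet state: any local hidden-variable model is bounded by |S| <= 2, while the quantum value reaches 2*sqrt(2), and contextuality generalises precisely this kind of gap beyond spatially separated measurements.

        # Minimal sketch (assumes numpy): the CHSH test that separates classical
        # correlations (|S| <= 2) from quantum ones (|S| up to 2*sqrt(2) ~ 2.828).
        import numpy as np

        # Pauli observables and the two-qubit singlet state (|01> - |10>)/sqrt(2)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

        def corr(A, B, psi):
            """Expectation value <psi| A (tensor) B |psi>."""
            return float(np.real(psi.conj() @ np.kron(A, B) @ psi))

        # Textbook optimal measurement settings for the singlet
        A0, A1 = Z, X
        B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

        S = corr(A0, B0, singlet) + corr(A0, B1, singlet) \
          + corr(A1, B0, singlet) - corr(A1, B1, singlet)
        print(abs(S))  # ~2.828, exceeding the classical bound of 2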

  • Funder: UKRI Project Code: EP/M013294/1
    Funder Contribution: 35,513,900 GBP

    The Hub will create a seamless link between science and applications by building on our established knowledge exchange activities in quantum technologies. We will transform science into technology by developing new products, demonstrating their applications and advantages, and establishing a strong user base in diverse sectors. Our overarching ambition is to deliver a wide range of quantum sensors to underpin many new commercial applications. Our key objective is to ensure that the Hub's outputs will have been picked up by companies, or by industry-led TSB projects, by the end of the funding period.

    The Hub will comprise: a strong fabrication component; quantum scientists with a demonstrated ability to combine scientific excellence with technological delivery; and leading engineers with the broad collective expertise and connections required to develop and use new quantum sensors. We have identified, and actively involved, industry enablers to build a supply chain for quantum sensor technology. As well as direct physics connections to industry, the engineers provide strong links to relevant industrial users, thus providing information on industrial needs and enabling rapid prototype deployment in the field. To establish a coherent national collaborative effort, the Hub will include a UK network on quantum sensors and metrology, which will also exploit the connections that Prof Bongs and all Hub members have forged in Europe, the US and Asia. This inter-linkage ensures capture of the most advanced developments in quantum technology around the world for exploitation by the UK.

    Quantum sensors and metrology, plus some devices in quantum communication, are the only areas where laboratory prototypes have already proven superior to their best classical counterparts. This sets the stage, credibly, for rapid and disruptive applications emerging from the Hub. The selection of prototypes will be driven by commercial pull, i.e. each prototype project within the Hub must demonstrate, from the outset, industry or practitioner engagement from our engineering and/or industrial collaborators. We have strong industry support across several disciplines, with structures in place to actively manage technology and knowledge transfer to the industry sector. Particular roles are played by NPL and e2v. We will collaborate closely with NPL as a metrology end-user on clock, magnetometer and potentially Watt-balance developments, with a lecturer-level Birmingham-NPL fellow contributed by Birmingham University and our PDRAs spending ~17 person-years, in addition to 3-5 PhD students, on these joint projects in the Advanced Metrology Laboratory/incubator space. e2v has a unique industrial manufacturing/R&D facility co-located within the School of Physics and Astronomy at Nottingham, which has already catalysed the expansion of its activities into the quantum technology domain.

    Public engagement conveying the Hub's breakthroughs will be a high priority, for example annually at the Royal Society Summer Exhibitions. In addition to cohort-training of 80 PhD students working within the Hub, the Hub will contribute to the training of ~500 PhD students via electronically shared lectures (many already running within the e-learning graduate schools MPAGS, MEGS, SEPNET and SUPA) across the institutions within the Hub. The Hub will create an internationally leading centre of excellence with major impact in the area of quantum sensors and metrology. To widen the impact of the Hub and ensure long-term sustainability, we will actively pursue European and other international collaborative funding for both underlying fundamental research and technology development.

    Usage counts: 733 views, 4,840 downloads
  • Funder: UKRI Project Code: EP/P006078/2
    Funder Contribution: 225,535 GBP

    Some of the most fundamental, and perhaps most bizarre, processes expected to occur in the vicinity of black holes are out of observational reach. To address this issue we utilise analogue systems, in which we study fluctuations on a background flow that, in the experiment, reproduces an effective black hole. In the literature this line of research is referred to as analogue models for gravity, or simply analogue gravity. Analogue models provide not only a theoretical but also an experimental framework in which to verify predictions of classical and quantum fields exposed to 'extreme' spacetime geometries, such as rapidly rotating black holes.

    This project brings together two internationally recognised experts in the field of analogue gravity with the aim of pushing the field in a new direction: we propose ground-breaking studies to mimic some of the bizarre processes occurring in the vicinity of rotating black holes from general relativity using rotating fluids in both water and optical systems. In particular, we will investigate, both theoretically and experimentally, the interaction between an input wave and a rotating black hole spacetime geometry, here recreated by the rotating fluid. This allows us to mimic a scattering process associated with rotating black holes called superradiant scattering. From a historical viewpoint this kind of radiation is the precursor to Hawking radiation. More precisely, black hole superradiance is the scattering of waves from a rotating black hole: if the incoming wave also possesses a small amount of angular momentum, it will be reflected with an increased amplitude, i.e. it is amplified at the expense of the black hole, which thus loses some of its rotational energy. It has also been pointed out that the same physics may take place in very different systems; for example, light incident on a rotating metallic (or absorbing) cylinder may also be amplified upon reflection. Yet no-one has ever attempted to experimentally investigate this underlying physics, which extends beyond general relativity and is relevant to a variety of hydrodynamical and rotating systems.

    We aim to provide the first ever experimental evidence of this intriguing and fundamental amplification mechanism in two different hydrodynamical systems. The first is a water spout, controlled so that the correct boundary conditions are obtained and optimised for observing black hole superradiant scattering. The second is a less conventional fluid that is made out of light: light propagating in a special medium can behave as a fluid or even a superfluid. By building upon highly developed photonic technologies, e.g. for the control and measurement of laser-beam wavefronts, we will implement very precisely tailored and characterised experiments. One of the unique aspects of this project is the marriage between two very different lab-based systems, one using water and the other using light, to tackle an outstanding problem in physics that is of relevance to astrophysical, hydrodynamic and optical systems.
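    For orientation, the textbook form of the superradiance condition (a standard result quoted here for reference; the symbols below are not defined in the abstract) states that a wave mode of angular frequency \(\omega\) and azimuthal number \(m\) scattered off a rotating black hole with horizon angular velocity \(\Omega_H\) is amplified when

        \[ 0 < \omega < m\,\Omega_H \quad\Longrightarrow\quad |\mathcal{R}|^2 > 1, \]

    where \(\mathcal{R}\) is the reflection coefficient, so the reflected flux exceeds the incident one and the excess energy is drawn from the rotational energy of the black hole. In the analogue experiments, \(\Omega_H\) is played by the effective angular velocity of the rotating fluid (or optical) vortex at the analogue horizon.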

    Usage counts: 33 views, 224 downloads