309 Research products, page 1 of 31
- Other research product . 2017 . Open Access . English . Authors: Kapila, Sahil; Oni, Abayomi Olufemi; Kumar, Amit. Country: Canada. Project: NSERC
The development of a cost structure for energy storage systems (ESS) has received limited attention. In this study, we developed data-intensive techno-economic models to assess the economic feasibility of ESS, specifically pumped hydro storage (PHS) and compressed air energy storage (CAES). The costs were developed using data-intensive bottom-up models, and scale factors were developed for each component of the storage systems. The life cycle costs of energy storage were estimated for capacity ranges of 98-491 MW, 81-404 MW, and 60-298 MW for PHS, conventional CAES (C-CAES), and adiabatic CAES (A-CAES), respectively, to ensure a market-driven price can be achieved. For CAES systems, costs were developed for storage in salt caverns, hard rock caverns, and porous formations. The results show that the annual life cycle storage cost is $220-400 per kW-year for PHS, $215-265 per kW-year for C-CAES, and $375-480 per kW-year for A-CAES. The levelised cost of electricity is $69-121 per MWh for PHS, $58-70 per MWh for C-CAES, and $96-121 per MWh for A-CAES. C-CAES is economically attractive at all capacities, PHS is economically attractive at higher capacities, and A-CAES is not attractive at any capacity. The developed information is helpful in making investment decisions related to large energy storage systems.
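The relationship between an annual life-cycle storage cost (per kW-year) and a levelised cost of electricity (per MWh) can be sketched with a standard capital-recovery calculation. This is a minimal illustration with made-up inputs (capital cost, discount rate, lifetime, and annual discharge are assumptions, not the study's data):

```python
def annualized_cost_per_kw(capex_per_kw, fixed_om_per_kw_yr, rate, life_yrs):
    """Annual life-cycle storage cost ($/kW-year) via the capital recovery factor."""
    crf = rate * (1 + rate) ** life_yrs / ((1 + rate) ** life_yrs - 1)
    return capex_per_kw * crf + fixed_om_per_kw_yr

def lcoe_per_mwh(annual_cost_per_kw_yr, annual_mwh_per_kw):
    """Levelised cost of electricity ($/MWh) for a given annual discharge per kW."""
    return annual_cost_per_kw_yr / annual_mwh_per_kw

# Illustrative inputs only (not the paper's figures):
annual_cost = annualized_cost_per_kw(capex_per_kw=2000, fixed_om_per_kw_yr=30,
                                     rate=0.08, life_yrs=40)
lcoe = lcoe_per_mwh(annual_cost, annual_mwh_per_kw=3.0)  # ~3 MWh delivered per kW-year
```

Dividing the annual cost by the energy actually delivered per kW is what makes a low-cost-per-kW-year system also low cost per MWh only when its utilization is high.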
- Other research product . 2010 . Open Access . English . Authors: Greyson, Devon; Morrison, Heather; Waller, Andrew. Publisher: Canadian Library Association. Project: NSERC, CIHR, SSHRC
This article is a summary of recent Open Access activity in Canada, focusing on policies and mandates, repositories, and initiatives in libraries.
- Other research product . 2005 . Open Access . Authors: Pittner, Heiko. Publisher: Ludwig-Maximilians-Universität München. Country: Germany. Project: NSERC
This work reports on experiments in which antihydrogen atoms have been produced in cryogenic Penning traps from antiproton and positron plasmas by two different methods, and on experiments that have been carried out subsequently in order to investigate the antihydrogen atoms. By the first method, antihydrogen atoms have been formed during the process of positron cooling of antiprotons in so-called nested Penning traps and detected via a field ionization method. A linear dependence of the number of detected antihydrogen atoms on the number of positrons has been found. A measurement of the state distribution has revealed that the antihydrogen atoms are formed in highly excited states. This, along with the high production rate, suggests that the antihydrogen atoms are formed by three-body recombination processes and subsequent collisional deexcitations. However, current theory cannot yet account for the measured state distribution. Typical radii of the detected antihydrogen atoms lie in the range between 0.15 µm and 0.4 µm. The most deeply bound antihydrogen atoms have radii below 0.1 µm. Antihydrogen atoms of that size have chaotic positron orbits, so that for the first time antihydrogen atoms have been detected that cannot be described by the GCA model. The kinetic energy of the most weakly bound antihydrogen atoms has been measured to be about 200 meV, which corresponds to an antihydrogen velocity of approximately 6200 m/s. A simple model suggests that these atoms are formed from only one deexcitation collision, and methods that might lead to a decrease of the antihydrogen velocity are presented. By the second method, antihydrogen atoms have been synthesized in charge-exchange processes. Lasers are used to produce a Rydberg cesium beam within the cryogenic Penning trap that collides with trapped positrons, so that Rydberg positronium atoms are formed via charge-exchange reactions.
Due to their charge neutrality, the Rydberg positronium atoms are free to leave the positron trapping region. The Rydberg positronium atoms that collide with nearby stored antiprotons form antihydrogen atoms in charge-exchange reactions. So far, 14 +/- 4 antihydrogen atoms have been detected background-free via a field-ionization method. The antihydrogen atoms produced via the two-step charge-exchange mechanism are expected to have a temperature of 4.2 K, the temperature of the antiprotons from which they are formed. A method is proposed by which the antihydrogen temperature can be determined with an accuracy of better than 1 K from a measurement of the time delay between antihydrogen annihilation events and the laser pulse that initiates the antihydrogen production via the production of Rydberg cesium atoms. First experiments were carried out during the last days of the 2004 beam time, but the number of detected antihydrogen annihilations was too low for a determination of the antihydrogen temperature. Trapped antiprotons have been directly exposed to laser light delivered by a titanium-sapphire laser in order to investigate whether the laser light causes any loss of the trapped antiprotons. Experiments have shown that no extra loss occurs for laser powers of less than 590 mW. This is an important result against the background of the future plan to confine antihydrogen atoms in a combined Penning-Ioffe trap and then carry out laser spectroscopy on these atoms, since it reveals that laser light does not increase the pressure in the trapping region to the extent that annihilations with the background gas become noticeable. The ATRAP Collaboration plans to investigate antihydrogen atoms precisely. The ultimate goal is to test the CPT theorem by a high-precision measurement of the 1S-2S transition of antihydrogen and a comparison with the precisely known value of the corresponding transition in hydrogen.
This thesis presents the achievement of the first step towards this challenging goal: the production of cold antihydrogen itself.
- Other research product . 2018 . Open Access . English . Authors: Schneider, M.; Barthlott, S.; Hase, F.; González, Y.; Yoshimura, K.; García, O. E.; Sepúlveda, E.; Gomez-Pelaez, A.; Gisi, M.; Kohlhepp, R.; Dohe, S.; Blumenstock, T.; Wiegele, A.; Christner, E.; Strong, K.; Weaver, D.; Palm, M.; Deutscher, N. M.; Warneke, T.; Notholt, J.; Lejeune, B.; Demoulin, P.; Jones, N.; Griffith, D. W. T.; Smale, D.; Robinson, J. Project: NSERC, EC | MUSICA (256961)
Within the project MUSICA (MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water), long-term tropospheric water vapour isotopologue data records are provided for ten globally distributed ground-based mid-infrared remote sensing stations of the NDACC (Network for the Detection of Atmospheric Composition Change). We present a new method allowing for an extensive and straightforward characterisation of the complex nature of such isotopologue remote sensing datasets. We demonstrate that the MUSICA humidity profiles are representative of most of the troposphere, with a vertical resolution ranging from about 2 km (in the lower troposphere) to 8 km (in the upper troposphere) and with an estimated precision of better than 10%. We find that the sensitivity with respect to the isotopologue composition is limited to the lower and middle troposphere, whereby we estimate a precision of about 30‰ for the ratio between the two isotopologues HD¹⁶O and H₂¹⁶O. The measurement noise, the applied atmospheric temperature profiles, the uncertainty in the spectral baseline, and the cross-dependence on humidity are the leading error sources. We introduce an a posteriori correction method for the cross-dependence on humidity, and we recommend applying it to isotopologue ratio remote sensing datasets in general. In addition, we present mid-infrared CO₂ retrievals and use them to demonstrate the MUSICA network-wide data consistency. In order to indicate the potential of long-term isotopologue remote sensing data if provided with a well-documented quality, we present a climatology and compare it to simulations of an isotope-incorporated AGCM (Atmospheric General Circulation Model). We identify differences in the multi-year mean and seasonal cycles that significantly exceed the estimated errors, thereby indicating deficits in the modeled atmospheric water cycle.
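The isotopologue ratio mentioned above is conventionally reported as a delta value relative to the VSMOW standard, and the a posteriori humidity correction can be pictured as removing a fitted dependence of that delta on humidity. A minimal sketch, assuming a simple log-linear correction whose slope and reference humidity are illustrative placeholders (the paper's actual correction coefficients differ):

```python
import math

VSMOW_D_H = 155.76e-6  # D/H isotope ratio of the VSMOW standard

def delta_d(hdo, h2o):
    """delta-D in permil from HDO and H2O abundances (HDO/H2O is ~2x the D/H ratio,
    since either of the two hydrogens may be a deuterium)."""
    return (hdo / h2o / (2 * VSMOW_D_H) - 1.0) * 1000.0

def correct_humidity_crosstalk(delta, humidity, slope, h_ref):
    """Hypothetical a posteriori correction: subtract a linear dependence of the
    retrieved delta on log-humidity, anchored at a reference humidity h_ref."""
    return delta - slope * (math.log(humidity) - math.log(h_ref))
```

A sample with exactly the VSMOW isotope ratio yields delta-D = 0‰; depleted (lighter) vapour, typical of the free troposphere, gives negative values.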
- Other research product . 1998 . Open Access . English . Authors: Lin, Hai-Hui. Publisher: National Library of Canada = Bibliothèque nationale du Canada. Project: NSERC
- Other research product . 2010 . Open Access . Authors: Araujo, Hugo Andres. Country: Canada. Project: NSERC
In British Columbia, one of the main negative impacts on salmonid habitat is the production of fine sediments generated by forest roads and other human activities. Given this concern, this study's main objective was to develop a quantitative framework for estimating the effects of extreme suspended-sediment events caused by forest road construction and use on populations of Chinook (Oncorhynchus tshawytscha) and coho salmon (Oncorhynchus kisutch) in a medium-sized coastal watershed of the lower Fraser River. The framework incorporates existing knowledge of sediment production by forest roads to make a quantitative link between traffic levels and physiological responses of salmonids. The results suggest that extreme sedimentation events generated by heavy traffic levels negatively affect populations of Chinook and coho. Population numbers declined in proportion to the elevated suspended-sediment concentrations, following a non-linear trend in which Chinook salmon are more vulnerable to the deleterious effects of sediments than coho salmon.
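A common way to express the non-linear link between sediment exposure and fish response in this literature is a severity-of-ill-effect score that grows log-linearly with both exposure duration and concentration (the form popularized by Newcombe and Jensen). A minimal sketch with illustrative coefficients, not the study's fitted values:

```python
import math

def severity_of_ill_effect(conc_mg_l, duration_h, a=1.0, b=0.5, c=0.6):
    """Severity-of-ill-effect score (scored 0-14 in the literature) as a
    log-linear function of exposure duration and suspended-sediment
    concentration. Coefficients a, b, c here are illustrative only."""
    return a + b * math.log(duration_h) + c * math.log(conc_mg_l)

# Doubling the concentration raises severity by a fixed increment (c * ln 2),
# which is the non-linear (diminishing-increment) shape the abstract describes.
baseline = severity_of_ill_effect(conc_mg_l=100, duration_h=24)
doubled = severity_of_ill_effect(conc_mg_l=200, duration_h=24)
```

Species-specific vulnerability (Chinook vs. coho) would correspond to different fitted coefficients per species.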
- Other research product . Collection . 2022 . Open Access . English . Authors: Léger-Daigle, Romy; Noisette, Fanny; Bélanger, Simon; Cusson, Mathieu; Nozais, Christian. Publisher: PANGAEA. Project: NSERC
The dataset compiles pigment content, absorptance data, photosynthetic parameters and primary production data as proxies for summertime photoacclimation of the temperate intertidal eelgrass Zostera marina after a 25-day exposure to a natural light intensity gradient (6, 36, 74, 133, 355, 503 and 860 µmol photons/m²/s) under laboratory conditions at the Pointe-au-Père research station, East Rimouski, Quebec, Canada. The data bundle contains: 1) photosynthetic and total absorptance data at the end of the experiment, which respectively represent the fraction of incident visible light absorbed by the photosynthetic tissues corrected and not corrected for non-photosynthetic absorption; 2) pigment content at the end of the experiment, which includes chlorophyll a and b and total carotenoids contents; 3) photosynthetic parameters obtained by Rapid Light Curves (RLC) on days 5 and 25, including photosynthetic apparatus efficiency (alpha), capacity (ETRmax) and saturation (Ek); 4) whole shoot primary production at the end of the experiment, which was calculated from oxygen fluxes under light and dark conditions, and normalized by leaf surface.
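The rapid-light-curve parameters listed in item 3 (alpha, ETRmax, Ek) are commonly obtained by fitting a saturating photosynthesis-irradiance model such as the Jassby-Platt hyperbolic tangent; Ek then follows as ETRmax/alpha. A minimal sketch with assumed parameter values (not the dataset's fitted values), using the light levels quoted in the text:

```python
import math

def etr(e, alpha, etr_max):
    """Jassby-Platt saturating model for a rapid light curve:
    ETR = ETRmax * tanh(alpha * E / ETRmax)."""
    return etr_max * math.tanh(alpha * e / etr_max)

def saturation_irradiance(alpha, etr_max):
    """Ek: the irradiance where the initial slope alpha meets the ETRmax plateau."""
    return etr_max / alpha

# Illustrative parameters; irradiances are the experiment's light treatments.
irradiances = [6, 36, 74, 133, 355, 503, 860]  # µmol photons/m²/s
curve = [etr(e, alpha=0.3, etr_max=40.0) for e in irradiances]
ek = saturation_irradiance(alpha=0.3, etr_max=40.0)  # ≈ 133 µmol photons/m²/s
```

At low light the model reduces to ETR ≈ alpha * E (light-limited), while at high light it plateaus at ETRmax (light-saturated), which is exactly the behavior the three fitted parameters summarize.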
- Other research product . 2010 . Open Access . English . Authors: Edmonds, Jeff. Publisher: Dagstuhl Seminar Proceedings. 10071 - Scheduling. Country: Germany. Project: NSERC
The goal is to prove a surprising lower bound for resource-augmented nonclairvoyant algorithms for scheduling jobs with sublinear nondecreasing speed-up curves on multiple processors, with the objective of average response time. Edmonds and Pruhs in SODA09 prove that for every $\epsilon > 0$, there is an algorithm $ALG_{\epsilon}$ that is $(1+\epsilon)$-speed $O(1/\epsilon^{2})$-competitive. A problem, however, is that this algorithm $ALG_{\epsilon}$ depends on $\epsilon$. The goal is to prove that every fixed deterministic nonclairvoyant algorithm has a suboptimal speed threshold: for every (graceful) algorithm $ALG$, there is a threshold $1+\beta_{ALG}$, with $\beta_{ALG} > 0$, such that the algorithm is $\Omega(1/(\epsilon\,\beta_{ALG}))$-competitive with speed $(1+\beta_{ALG})+\epsilon$ and $\omega(1)$-competitive with speed $1+\beta_{ALG}$. I have worked very hard on it and have felt that I was close. The proof technique is to use Brouwer's fixed point theorem to break the cycle of needing to know which input will be given before one can know what the algorithm will do, and needing to know what the algorithm will do before one can know which input to give. Everything I have can be found at
- Other research product . Other ORP type . 2016 . Open Access . English . Authors: Malo, Lauren Douglas
handle: 1974/14948
Country: Canada. Project: NSERC
Climate change is expected to have marked impacts on forest ecosystems. In Ontario forests, this includes changes in tree growth, stand composition, and disturbance regimes, with expected impacts on many forest-dependent communities, the bioeconomy, and other environmental considerations. In response to climate change, renewable energy systems, such as forest bioenergy, are emerging as critical tools for carbon emissions reductions and climate change mitigation. However, these systems may also need to adapt to changing forest conditions. Therefore, the aim of this research was to estimate changes in forest growth and forest cover in response to anticipated climatic changes in the year 2100 in Ontario forests, to ultimately explore the sustainability of bioenergy in the future. Using the Haliburton Forest and Wildlife Reserve in Ontario as a case study, this research used a spatial climate analog approach to match modeled Haliburton temperature and precipitation (via the Fourth Canadian Regional Climate Model) to regions currently exhibiting a similar climate (climate analogs). From there, current forest cover and growth rates of core species in Haliburton were compared to forest plots in analog regions from the US Forest Service Forest Inventory and Analysis (FIA). This comparison used two different emission scenarios, corresponding to a high and a mid-range emission future. This research then explored how these changes in forests may influence bioenergy feasibility in the future. It examined possible volume availability and composition of bioenergy feedstock under future conditions. This research points to a potential decline of softwoods in the Haliburton region with a simultaneous expansion of pre-established hardwoods such as northern red oak and red maple, as well as a potential loss in sugar maple cover.
From a bioenergy perspective, hardwood residues may be the most feasible feedstock in the future with minimal change in biomass availability for energy production; under these possible conditions, small scale combined heat and power (CHP) and residential pellet use may be the most viable and ecologically sustainable options. Ultimately, understanding the way in which forests may change is important in informing meaningful policy and management, allowing for improved forest bioenergy systems, now and in the future.
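The climate analog matching described above amounts to a nearest-neighbor search in climate space: find the region whose present-day temperature and precipitation are closest to the projected 2100 values. A simplified sketch with hypothetical numbers and a plain scaled Euclidean distance (the study's actual matching criteria and data differ):

```python
def climate_analog(target, candidates):
    """Return the candidate region whose (temperature, precipitation) climate is
    closest to the target's projected climate. Simplified stand-in for analog
    matching: Euclidean distance with precipitation rescaled so the two
    variables have comparable magnitudes."""
    def dist(c):
        dt = c["temp_c"] - target["temp_c"]
        dp = (c["precip_mm"] - target["precip_mm"]) / 100.0  # rough rescale
        return (dt * dt + dp * dp) ** 0.5
    return min(candidates, key=dist)

# Hypothetical projected 2100 climate and candidate regions:
projected = {"temp_c": 8.5, "precip_mm": 1050}
regions = [
    {"name": "A", "temp_c": 5.0, "precip_mm": 900},
    {"name": "B", "temp_c": 8.8, "precip_mm": 1020},
    {"name": "C", "temp_c": 12.0, "precip_mm": 1300},
]
best = climate_analog(projected, regions)  # region "B" is the closest analog
```

Once an analog region is found, its present-day forest inventory plots (here, FIA plots) serve as an empirical preview of the target region's possible future forest composition.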
- Other research product . 2019 . Open Access . English . Authors: Madureira, Marlene; Sá, José Carlos; Lopes, Manuel Pereira; Ferreira, Luís Pinto; Pereira, Maria Teresa
handle: 10400.22/15841
Publisher: Società Editrice Esculapio. Country: Portugal. Project: NSERC
This study aims to design a new warehouse layout as a solution to the warehouse's main problem: lack of space to store all the materials in stock. Besides the existing warehouse building, which currently presents an unsuitable layout for the storage of large volumes, there is a second area right next to the first that can increase the storage area. The two buildings were re-dimensioned to accommodate a greater quantity of stock, making it possible to transform the warehouse building into an industrial warehouse with appropriate storage methods. The final layout increased the storage area by 64%, from a total of 1,471.41 m² to 2,414.22 m² overall.
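The quoted 64% figure follows directly from the two floor areas, as a quick check confirms:

```python
# Storage area before and after the redesign (from the abstract).
before_m2 = 1471.41
after_m2 = 2414.22

# Relative increase in percent: (new - old) / old * 100 ≈ 64.1%
increase_pct = (after_m2 - before_m2) / before_m2 * 100
```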