Categories
ScienceDaily

Are salt deposits a solution for nuclear waste disposal?

Around the world, there are pools of water filled with nuclear waste waiting for their final resting place. This is waste that was created from decades of nuclear power generation, and the waste must be handled carefully.

In the United States, scientists are studying several solutions for disposing of these wastes. Phil Stauffer and researchers at Los Alamos National Laboratory have been working with the US Department of Energy and other national laboratories on one long-term, safe disposal solution: salt.

“Deep salt formations that already exist in the United States are one candidate for long-term disposal,” says Stauffer. “This ‘high-level’ nuclear waste can create a lot of heat, in addition to the radioactivity that must be contained. We need to develop a clear path to dispose of this waste.”

Salt deposits exist underground. They are self-healing, have very low permeability, and conduct heat well. All of these properties are important for containing the waste and dissipating the heat it generates. Salt formations can form an excellent barrier against the long-term release of radionuclides into the human environment.

The United States and Germany are disposing of low- and intermediate-level nuclear waste in repositories in salt deposits. Those wastes don’t create as much heat. So, more studies were needed to determine the safety and efficacy of salt deposits for the high-level nuclear waste.

But salt is not just a physical barrier — it’s a chemical one, too. So, how these salt deposits would react to the presence of water, heat and other geologic factors needed to be researched.

Recent thermal testing underground began by creating a full-scale mock-up of a waste canister and heating it for nearly a year. This is the first time this has been done in the United States since the late 1980s.

In parallel, the Department of Energy’s research team is running a campaign to study generic nuclear waste repositories. This includes studying how water migrates towards heat sources in salt. Enter the “Brine Availability Test in Salt” (BATS) project, with Stauffer and the rest of the team. The research team began a pilot program several years ago.

By drilling boreholes into the salt deposits and running heater tests in these salt-surrounded holes, researchers gain insight to inform decisions. The testing is occurring deep underground, within drifts (underground hallways), using large drill-rig equipment.

Phase 1s (s for shakedown) began in the summer of 2018 and ran for nearly a year. “The lessons learned and insights gained in this initial testing are proving vital to the design and implementation of the next, larger-scale experiment,” says Stauffer.

In addition, researchers can turn to computer modeling to predict some of the outcomes. “Long-term modeling can be used to develop the appropriate initial pressure and other important factors for the boreholes,” says Stauffer. These factors include temperature response and water availability.

Phase 1 began in January 2020 and will run for several months. It will include more complicated data collection, including fiber optic cables, electrical resistivity tomography, and real-time isotopic measurements on the water evaporated from the brine.

These aspects of brine availability will be investigated in future phases of the BATS testing. Data from the next experiment will be used to further refine models and will be shared with the international research community. Project plans call for gradually ramping up the scale of the heater experiments to eventually explore salt-brine availability to spent nuclear fuel waste canisters in a configuration representing a possible future high-level waste repository.

The team’s research was recently published in Vadose Zone Journal. Research was supported by the Department of Energy Office of Nuclear Energy and implemented through a collaboration between Los Alamos National Laboratory, Sandia National Laboratories (SNL), Lawrence Berkeley National Laboratory (LBNL), and the DOE Office of Environmental Management (EM) Carlsbad Field Office (CBFO).

Story Source:

Materials provided by American Society of Agronomy. Note: Content may be edited for style and length.


Categories
ScienceDaily

Engineers develop precision injection system for plants

While the human world is reeling from one pandemic, there are several ongoing epidemics that affect crops and put global food production at risk. Oranges, olives, and bananas are already under threat in many areas due to diseases that affect plants’ circulatory systems and that cannot be treated by applying pesticides.

A new method developed by engineers at MIT may offer a starting point for delivering life-saving treatments to plants ravaged by such diseases.

These diseases are difficult to detect early and to treat, given the lack of precision tools to access plant vasculature to treat pathogens and to sample biomarkers. The MIT team decided to take some of the principles involved in precision medicine for humans and adapt them to develop plant-specific biomaterials and drug-delivery devices.

The method uses an array of microneedles made of a silk-based biomaterial to deliver nutrients, drugs, or other molecules to specific parts of the plant. The findings are described in the journal Advanced Science, in a paper by MIT professors Benedetto Marelli and Jing-Ke Weng, graduate student Yunteng Cao, postdoc Eugene Lim at MIT, and postdoc Menglong Xu at the Whitehead Institute for Biomedical Research.

The microneedles, which the researchers call phytoinjectors, can be made in a variety of sizes and shapes, and can deliver material specifically to a plant’s roots, stems, or leaves, or into its xylem (the vascular tissue involved in water transportation from roots to canopy) or phloem (the vascular tissue that circulates metabolites throughout the plant). In lab tests, the team used tomato and tobacco plants, but the system could be adapted to almost any crop, they say. The microneedles can not only deliver targeted payloads of molecules into the plant, but they can also be used to take samples from the plants for lab analysis.

The work started in response to a request from the U.S. Department of Agriculture for ideas on how to address the citrus greening crisis, which is threatening the collapse of a $9 billion industry, Marelli says. The disease is spread by an insect called the Asian citrus psyllid that carries a bacterium into the plant. There is as yet no cure for it, and millions of acres of U.S. orchards have already been devastated. In response, Marelli’s lab swung into gear to develop the novel microneedle technology, led by Cao as his thesis project.

The disease infects the phloem of the whole plant, including roots, which are very difficult to reach with any conventional treatment, Marelli explains. Most pesticides are simply sprayed or painted onto a plant’s leaves or stems, and little if any penetrates to the root system. Such treatments may appear to work for a short while, but then the bacteria bounce back and do their damage. What is needed is something that can target the phloem circulating through a plant’s tissues, which could carry an antibacterial compound down into the roots. That’s just what some version of the new microneedles could potentially accomplish, he says.

“We wanted to solve the technical problem of how you can have precise access to the plant vasculature,” Cao adds. This would allow researchers to inject pesticides, for example, that would be transported between the root system and the leaves. Present approaches use “needles that are very large and very invasive, and that results in damaging the plant,” he says. To find a substitute, they built on previous work that had produced microneedles using silk-based material for injecting human vaccines.

“We found that adaptations of a material designed for drug delivery in humans to plants was not straightforward, due to differences not only in tissue vasculature, but also in fluid composition,” Lim says. The microneedles designed for human use were intended to biodegrade naturally in the body’s moisture, but plants have far less available water, so the material didn’t dissolve and was not useful for delivering the pesticide or other macromolecules into the phloem. The researchers had to design a new material, but they decided to stick with silk as its basis. That’s because of silk’s strength, its inertness in plants (preventing undesirable side effects), and the fact that it degrades into tiny particles that don’t risk clogging the plant’s internal vasculature systems.

They used biotechnology tools to increase silk’s hydrophilicity (making it attract water), while keeping the material strong enough to penetrate the plant’s epidermis and degradable enough to then get out of the way.

Sure enough, they tested the material on their lab tomato and tobacco plants, and were able to observe injected materials, in this case fluorescent molecules, moving all the way through the plant, from roots to leaves.

“We think this is a new tool that can be used by plant biologists and bioengineers to better understand transport phenomena in plants,” Cao says. In addition, it can be used “to deliver payloads into plants, and this can solve several problems. For example, you can think about delivering micronutrients, or you can think about delivering genes, to change the gene expression of the plant or to basically engineer a plant.”

“Now, the interests of the lab for the phytoinjectors have expanded beyond antibiotic delivery to genetic engineering and point-of-care diagnostics,” Lim adds.

For example, in their experiments with tobacco plants, they were able to inject an organism called Agrobacterium to alter the plant’s DNA — a typical bioengineering tool, but delivered in a new and precise way.

So far, this is a lab technique using precision equipment, so in its present form it would not be useful for agricultural-scale applications, but the hope is that it can be used, for example, to bioengineer disease-resistant varieties of important crop plants. The team has also done tests using a modified toy dart gun mounted to a small drone, which was able to fire microneedles into plants in the field. Ultimately, such a process might be automated using autonomous vehicles, Marelli says, for agricultural-scale use.

Meanwhile, the team continues to work on adapting the system to the varied needs and conditions of different kinds of plants and their tissues. “There’s a lot of variation among them, really,” Marelli says, “so you need to think about having devices that are plant-specific. For the future, our research interests will go beyond antibiotic delivery to genetic engineering and point-of-care diagnostics based on metabolite sampling.”

The work was supported by the Office of Naval Research, the National Science Foundation, and the Keck Foundation.


Categories
ScienceDaily

New scavenger technology allows robots to ‘eat’ metal for energy

When electronics need their own power sources, there are two basic options: batteries and harvesters. Batteries store energy internally, but are therefore heavy and have a limited supply. Harvesters, such as solar panels, collect energy from their environments. This gets around some of the downsides of batteries but introduces new ones, in that they can only operate in certain conditions and can’t turn that energy into useful power very quickly.

New research from the University of Pennsylvania’s School of Engineering and Applied Science is bridging the gap between these two fundamental technologies for the first time in the form of a “metal-air scavenger” that gets the best of both worlds.

This metal-air scavenger works like a battery, in that it provides power by repeatedly breaking and forming a series of chemical bonds. But it also works like a harvester, in that power is supplied by energy in its environment: specifically, the chemical bonds in metal and air surrounding the metal-air scavenger.

The result is a power source that has 10 times more power density than the best energy harvesters and 13 times more energy density than lithium-ion batteries.

In the long term, this type of energy source could be the basis for a new paradigm in robotics, where machines keep themselves powered by seeking out and “eating” metal, breaking down its chemical bonds for energy like humans do with food.

In the near term, this technology is already powering a pair of spin-off companies. The winners of Penn’s annual Y-Prize Competition are planning to use metal-air scavengers to power low-cost lights for off-grid homes in the developing world and long-lasting sensors for shipping containers that could alert to theft, damage or even human trafficking.

The researchers, James Pikul, assistant professor in the Department of Mechanical Engineering and Applied Mechanics, along with Min Wang and Unnati Joshi, members of his lab, published a study demonstrating their scavenger’s capabilities in the journal ACS Energy Letters.

The motivation for developing their metal-air scavenger, or MAS, stemmed from the fact that the technologies that make up robots’ brains and the technologies that power them are fundamentally mismatched when it comes to miniaturization.

As the size of individual transistors shrinks, chips provide more computing power in smaller and lighter packages. But batteries don’t benefit the same way from getting smaller; the density of chemical bonds in a material is fixed, so smaller batteries necessarily mean fewer bonds to break.

“This inverted relationship between computing performance and energy storage makes it very difficult for small-scale devices and robots to operate for long periods of time,” Pikul says. “There are robots the size of insects, but they can only operate for a minute before their battery runs out of energy.”

Worse still, adding a bigger battery won’t allow a robot to last longer; the added mass takes more energy to move, negating the extra energy provided by the bigger battery. The only way to break this frustrating inverted relationship is to forage for chemical bonds, rather than to pack them along.

“Harvesters, like those that collect solar, thermal or vibrational energy, are getting better,” Pikul says. “They’re often used to power sensors and electronics that are off the grid and where you might not have anyone around to swap out batteries. The problem is that they have low power density, meaning they can’t take energy out of the environment as fast as a battery can deliver it.”

“Our MAS has a power density that’s ten times better than the best harvesters, to the point that we can compete against batteries,” he says. “It’s using battery chemistry, but doesn’t have the associated weight, because it’s taking those chemicals from the environment.”

Like a traditional battery, the researchers’ MAS starts with a cathode that’s wired to the device it’s powering. Underneath the cathode is a slab of hydrogel, a spongy network of polymer chains that conducts electrons between the metal surface and the cathode via the water molecules it carries. With the hydrogel acting as an electrolyte, any metal surface it touches functions as the anode of a battery, allowing electrons to flow to the cathode and power the connected device.

For the purposes of their study, the researchers connected a small motorized vehicle to the MAS. Dragging the hydrogel behind it, the MAS vehicle oxidized metallic surfaces it traveled over, leaving a microscopic layer of rust in its wake.

To demonstrate the efficiency of this approach, the researchers had their MAS vehicle drive in circles on an aluminum surface. The vehicle was outfitted with a small reservoir that continuously wicked water into the hydrogel to prevent it from drying out.

“Energy density is the ratio of available energy to the weight that has to be carried,” Pikul says. “Even factoring in the weight of the extra water, the MAS had 13 times the energy density of a lithium ion battery because the vehicle only has to carry the hydrogel and cathode, and not the metal or oxygen which provide the energy.”
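To make that ratio concrete, here is a toy calculation; the numbers are hypothetical round values invented for illustration, not measurements from the study. The point is simply that when the metal and oxygen stay in the environment, the mass in the denominator shrinks to just the hydrogel, cathode, and make-up water.

```python
# Toy illustration of the energy-density argument above.
# All masses and energies are hypothetical round numbers chosen only to show the
# effect of not carrying the reactants; they are not data from the study.

def energy_density_wh_per_kg(available_energy_wh, carried_mass_kg):
    """Energy density = available energy divided by the mass the vehicle must carry."""
    return available_energy_wh / carried_mass_kg

energy_wh = 50.0  # energy drawn from oxidizing the surface metal (hypothetical)

# A battery-like device carries its own anode metal and oxidant...
battery_like = energy_density_wh_per_kg(energy_wh, carried_mass_kg=0.200)
# ...while a scavenger-like device carries only the hydrogel, cathode, and water.
scavenger_like = energy_density_wh_per_kg(energy_wh, carried_mass_kg=0.015)

print(f"battery-like:   {battery_like:6.0f} Wh/kg")
print(f"scavenger-like: {scavenger_like:6.0f} Wh/kg ({scavenger_like / battery_like:.0f}x)")
```

With these made-up masses the ratio works out to roughly 13x, which is the same order of improvement the researchers report for their MAS against a lithium-ion battery.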

The researchers also tested the MAS vehicles on zinc and stainless steel. Different metals give the MAS different energy densities, depending on their potential for oxidation.

This oxidation reaction takes place only within 100 microns of the surface, so while the MAS may use up all the readily available bonds with repeated trips, there’s little risk of it doing significant structural damage to the metal it’s scavenging.

With so many possible uses, the researchers’ MAS system was a natural fit for Penn’s annual Y-Prize, a business plan competition that challenges teams to build companies around nascent technologies developed at Penn Engineering. This year’s first-place team, Metal Light, earned $10,000 for their proposal to use MAS technology in low-cost lighting for off-grid homes in the developing world. M-Squared, which earned $4,000 in second place, intends to use MAS-powered sensors in shipping containers.

“In the near term, we see our MAS powering internet-of-things technologies, like what Metal Light and M-Squared propose,” Pikul says. “But what was really compelling to us, and the motivation behind this work, is how it changes the way we think about designing robots.”

Much of Pikul’s other research involves improving technology by taking cues from the natural world. For example, his lab’s high-strength, low-density “metallic wood” was inspired by the cellular structure of trees, and his work on a robotic lionfish involved giving it a liquid battery circulatory system that also pneumatically actuated its fins.

The researchers see their MAS as drawing on an even more fundamental biological concept: food.

“As we get robots that are more intelligent and more capable, we no longer have to restrict ourselves to plugging them into a wall. They can now find energy sources for themselves, just like humans do,” Pikul says. “One day, a robot that needs to recharge its batteries will just need to find some aluminum to ‘eat’ with a MAS, which would give it enough power to work until its next meal.”

This work was supported by the Office of Naval Research, grant N00014-19-1-2353. It was carried out in part at the Singh Center for Nanotechnology, which is supported by the NSF National Nanotechnology Coordinated Infrastructure Program under grant NNCI-1542153.


Categories
DCED

Gov. Wolf Announces $450 Million Loan Program for Financially Strained Hospitals – PA Department of Community & Economic Development

Harrisburg, PA — There are more than 19,000 COVID-19 cases in the state as of midnight today with numbers expected to continue increasing, highlighting an even greater need to ensure that Pennsylvania’s hospitals are equipped to care for patients and workers. To assist, Governor Tom Wolf today announced a new loan program – the Hospital Emergency Loan Program, or HELP – that will provide short-term financial relief to Pennsylvania’s hospitals as they prepare for the growing surge of individuals infected with COVID-19 and the economic fallout of the nationwide pandemic.

“The combination of increased costs and reduced revenue has hurt many hospitals financially,” Gov. Wolf said. “We must support our hospitals through this unprecedented time. When this pandemic finally ends, we’re going to need hospitals to care for our regular medical needs, like heart attacks and broken bones. This new loan program will provide immediate relief to our hospitals, which are on the frontlines of this pandemic.”

The $450 million loan package will be available to the commonwealth’s hospitals to provide immediate financial support for working capital to ensure that these facilities have sufficient personnel, equipment, and personal protective equipment.

The funding was disbursed by the Pennsylvania Infrastructure Investment Authority (PENNVEST) and will be administered by the Pennsylvania Department of Community and Economic Development through the Pennsylvania First Program (PA First). It was approved by Treasurer Joe Torsella, who played a crucial role in the expedited release of this emergency funding.

“Hospitals across Pennsylvania should be focused on saving lives, not worrying about how to make ends meet until federal relief funds arrive months from now,” said Pennsylvania State Treasurer Joe Torsella, whose office must approve any investments made by the PENNVEST board. “I am proud to approve this prudent investment that will provide immediate, low-cost, and direct financing to enable hospitals to sufficiently staff their floors, purchase treatment supplies and protective equipment, and successfully prepare for the surge of COVID-19 patients in the coming weeks. I commend the PENNVEST board for taking this step, and Governor Wolf for his leadership and continued commitment to protecting Pennsylvanians throughout this crisis.”

Pennsylvania health care facilities licensed as hospitals by the Pennsylvania Department of Health under the Health Care Facilities Act of 1979 that are eligible to receive federal grant funding through the CARES Act are eligible for HELP. The maximum loan size is $10 million per hospital at an interest rate of 0.5 percent.

Applications will be available on DCED’s website starting at 10:00 AM April 13 through April 20. The costs must be incurred between March 1 and Sept. 1.

HELP will allow hospitals to take responsive action now until funding through the federal Coronavirus Aid, Relief, and Economic Security (CARES) Act, which was signed into law on March 27, 2020, is fully disbursed, with the goal of easing the financial strain of the pandemic and smoothing the transition back into regular health care operation.

Permitted expenses under HELP will mirror those under the CARES Act, allowing hospitals to close out their loan with CARES funding once it is received.

For more information about DCED, visit the DCED website, and be sure to stay up-to-date with all of our agency news on Facebook, Twitter, and LinkedIn.

MEDIA CONTACT:

Lyndsay Kensinger, [email protected]

Casey Smith, DCED, [email protected]

# # #


Author: Marketing451

Categories
Hackster.io

Enrique Albertos // Hackster Impact Prize

Enrique Albertos won our Impact Prize for his project "I Was There… Recycling the Waste!" Read our interview, in English & Spanish: https://www.hackster.io/news/feature-enrique-albertos-hackster-impact-prizewinner-034135a0ec7e
// Follow Enrique on Hackster: https://www.hackster.io/javagoza
// Submit to one of our contests, and you could be next! https://www.hackster.io/contests

Plus, check out a recycled hands-free door opener that you can make yourself with recycled plastics: https://bazar.preciousplastic.com/index.php?dispatch=products.view&product_id=281

Categories
UnrealEngine

Connect with the Unreal Engine community online

Although many physical events around the world are on hold, there are plenty of places to connect with the Unreal Engine community online. From forums to webinars, livestreams, and full-on virtual events, our community of creators is continually staying active.

Below is a listing of permanent resources and online activities that we’d love to invite you to. Please check this post often, as it will be updated regularly with newly added events.

PERMANENT, FREE RESOURCES

Support and Documentation
From your first steps with Unreal Engine to completing your most ambitious real-time project, we’re here to help. With comprehensive reference documentation, instructional guides, community-based support, and options for dedicated professional support, you have full access to everything you need to succeed. 

Unreal Online Learning
This growing catalog of nearly 50 courses and guided learning paths tracks your progression and awards achievements, whether you’re spending your first hours in tools such as Sequencer or brushing up on your visualization skills.

Unreal Engine on YouTube
Here’s where you’ll find archives of Inside Unreal, live training, and other broadcasts from our Twitch channel; tech talks from GDC, Unreal Fest Europe, and other conferences; and so much more.

Webinar Series
Check out our free webinars to learn all about the latest Twinmotion features and workflows and how to use Unreal Engine to create photorealistic scenes and interactive designs.
 


UNREAL EVENTS

March 26, 2020 | 9am ET & 2pm ET – WEBINAR: What’s New in Twinmotion 2020.1
We recently hosted the live webinar What’s New in Twinmotion 2020.1. The replay is now available. In this webinar, Martin Krasemann, Twinmotion Technical Marketing Specialist at Epic Games, presents a deep dive into some of the new Twinmotion 2020.1 features.

Find out how the release brings improved fidelity and higher-quality lighting, more realistic vegetation and humans, new features to review and present projects, and more. Watch Now

March 26, 2020 | 2pm ET – INSIDE UNREAL: Blender to Unreal Tools, Part 3
It’s time for the third part in our “Blender to Unreal” series!

In part 2 we covered how to work with Rigify and the Unreal Mannequin. We’re leaving Manny behind for this adventure, and will proceed to demonstrate how to import custom animations, characters, and skeletons.

April 2, 2020 | 2pm ET – INSIDE UNREAL: State of Audio in 4.25
You may have heard that we have several new exciting audio features coming in 4.25, and this week we thought we’d take a look at them!

To kick things off, the audio team will give a quick update on the Audio Mixer. From there we will shift to discussions and demos of the Native Convolution Reverb, Native Ambisonics Decoding and Encoding Support, as well as the new non-real-time audio analysis plugin Synesthesia. And finally, Wyeth Johnson will cap off the show with a demo on visualizing audio with the Niagara Audio Data Interface.

April 2, 2020 | 9am ET & 2pm ET – WEBINAR: Unreal Engine and Quixel: pushing the boundaries of 3D
Traveling the planet with a mission to build the world’s largest library of scans, Quixel has sought to vastly simplify production of all digital experiences. Now, since joining forces with Epic Games, this mission is accelerating.  

Teddy Bergsman and Galen Davis will join Daryl Obert to demonstrate how Quixel Megascans, Bridge, and Mixer 2020—along with the power of Unreal Engine—are pushing the boundaries of what’s now possible with 3D. Register Now.
 


ADDITIONAL EVENTS

Members of our team often perform presentations, sit in on panel discussions, or otherwise share their insight while participating in a variety of online events. Here are some events where Epic plans to have a presence.

NVIDIA GTC Digital
GTC Digital brings the training, research, insights, and direct access to the brilliant minds of NVIDIA’s GPU Technology Conference online. Epic Games’ Film & TV industry manager, David Morin, presents: Creating In-Camera VFX with Real-Time Workflows.

Pocket Gamer Connects Digital
A new virtual conference for the games industry. Unreal Engine is a Platinum sponsor with two solo presentations and two panel seats. Details coming soon!

More online events will be added as they are confirmed, so please check back often!
 


Categories
ScienceDaily

Scientists identify microbe that could help degrade polyurethane-based plastics

There may be a small answer to one of the biggest problems on the planet.

German researchers report in the journal Frontiers in Microbiology that they have identified and characterized a strain of bacteria capable of degrading some of the chemical building blocks of polyurethane.

“The bacteria can use these compounds as a sole source of carbon, nitrogen and energy,” said Dr. Hermann J. Heipieper, a senior scientist at the Helmholtz Centre for Environmental Research-UFZ in Leipzig, Germany and co-author of the new paper. “This finding represents an important step in being able to reuse hard-to-recycle PU products.”

In 2015, polyurethane products alone accounted for 3.5 million tons of the plastics produced in Europe. Polyurethane is used in everything from refrigerators and buildings to footwear and furniture to numerous other applications that can leverage its lightweight, insulating and flexible properties.

Unfortunately, polyurethane is difficult and energy-intensive to recycle or destroy as most of these kinds of plastics are thermosetting polymers that do not melt when heated. The waste mostly ends up in landfills where it releases a number of toxic chemicals, some of which are carcinogenic.

The use of microorganisms like bacteria and fungi to break down oil-based plastics is an ongoing area of research. However, few studies have addressed biodegradation of polyurethanes like the current paper.

The team in Germany managed to isolate a bacterium, Pseudomonas sp. TDA1, from a site rich in brittle plastic waste; the strain shows promise in attacking some of the chemical bonds that make up polyurethane plastics.

The researchers performed a genomic analysis to identify the degradation pathways at work. They made preliminary discoveries about the factors that help the microbe metabolize certain chemical compounds in plastic for energy. They also conducted other analyses and experiments to understand the bacterium’s capabilities.

This particular strain is part of a group of bacteria that are well-known for their tolerance of toxic organic compounds and other forms of stress, according to Dr. Christian Eberlein with the Helmholtz Centre for Environmental Research-UFZ. He is a co-author on the paper who coordinated and supervised the work.

“That trait is also named solvent-tolerance and is one form of extremophilic microorganisms,” he said.

The research is part of a European Union scientific program dubbed P4SB (From Plastic waste to Plastic value using Pseudomonas putida Synthetic Biology), which is attempting to find useful microorganisms that can bioconvert oil-based plastics into fully biodegradable ones. As the name implies, the project has focused on a bacterium known as Pseudomonas putida.

In addition to polyurethane, the P4SB consortium, which includes the Helmholtz Centre for Environmental Research-UFZ, is also testing the efficacy of microbes to degrade plastics made of polyethylene terephthalate (PET), which is widely used in plastic water bottles.

Heipieper said that the first step of any future research on Pseudomonas sp. TDA1 will be to identify the genes that code for the extracellular enzymes that are capable of breaking down certain chemical compounds in polyester-based polyurethanes. Extracellular enzymes, also called exoenzymes, are proteins secreted outside of a cell that cause a biochemical reaction.

However, there is no immediate plan to engineer these or other enzymes using synthetic biology techniques for bioplastic production. That could involve, for instance, genetically converting the bacteria into mini-factories capable of transforming oil-based chemical compounds into biodegradable ones for planet-friendly plastics.

Heipieper said more “fundamental knowledge” like that gathered in the current study is needed before scientists can make that technological and commercial leap.

One small step at a time.

Story Source:

Materials provided by Frontiers. Note: Content may be edited for style and length.


Categories
ScienceDaily

Shedding light on optimal materials for harvesting sunlight underwater

There may be many overlooked organic and inorganic materials that could be used to harness sunlight underwater and efficiently power autonomous submersible vehicles, report researchers at New York University. Their research, publishing March 18 in the journal Joule, develops guidelines for optimal band gap values at a range of watery depths, demonstrating that various wide-band gap semiconductors, rather than the narrow-band semiconductors used in traditional silicon solar cells, are best equipped for underwater use.

“So far, the general trend has been to use traditional silicon cells, which we show are far from ideal once you go to a significant depth since silicon absorbs a large amount of red and infrared light, which is also absorbed by water — especially at large depths,” says Jason A. Röhr, a postdoctoral research associate in Prof. André D. Taylor’s Transformative Materials and Devices laboratory at the Tandon School of Engineering at New York University and an author on the study. “With our guidelines, more optimal materials can be developed.”

Underwater vehicles, such as those used to explore the abyssal ocean, are currently limited by onshore power or inefficient on-board batteries, preventing travel over longer distances and periods of time. But while solar cell technology that has already taken off on land and in outer space could give these submersibles more freedom to roam, the watery world presents unique challenges. Water scatters and absorbs much of the visible light spectrum, soaking up red solar wavelengths even at shallow depths before silicon-based solar cells would have a chance to capture them.

Most previous attempts at underwater solar cells have used silicon or amorphous silicon, both of which have narrow band gaps best suited for absorbing light on land. However, blue and yellow light manages to penetrate deep into the water column even as other wavelengths diminish, suggesting semiconductors with wider band gaps not found in traditional solar cells may provide a better fit for supplying energy underwater.

To better understand the potential of underwater solar cells, Röhr and colleagues assessed bodies of water ranging from the clearest regions of the Atlantic and Pacific oceans to a turbid Finnish lake, using a detailed-balance model to measure the efficiency limits for solar cells at each location. Solar cells were shown to harvest energy from the sun down to depths of 50 meters in Earth’s clearest bodies of water, with chilly waters further boosting the cells’ efficiency.

The researchers’ calculations revealed that solar cell absorbers would function best with an optimum band gap of about 1.8 electronvolts at a depth of two meters and about 2.4 electronvolts at a depth of 50 meters. These values remained consistent across all water sources studied, suggesting the solar cells could be tailored to specific operating depths rather than water locations.
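As a quick back-of-the-envelope check (our own arithmetic, not part of the paper), a band gap in electronvolts corresponds to an absorption cutoff wavelength of roughly 1240 / E_g nanometres, so the reported optima map onto cutoffs that move away from the red end of the spectrum as the cell goes deeper.

```python
# Back-of-the-envelope check relating the reported optimal band gaps to
# absorption cutoff wavelengths (lambda_cutoff ~= hc / E_g ~= 1239.84 eV*nm / E_g).

def cutoff_wavelength_nm(band_gap_ev):
    """Longest wavelength (nm) a semiconductor with this band gap can absorb."""
    return 1239.84 / band_gap_ev

for depth_m, gap_ev in [(2, 1.8), (50, 2.4)]:
    print(f"{depth_m:>2} m depth: E_g = {gap_ev} eV -> cutoff ~ {cutoff_wavelength_nm(gap_ev):.0f} nm")

# Prints roughly 689 nm at 2 m and 517 nm at 50 m: the deeper the cell operates,
# the further its absorption edge shifts away from the red light that water soaks up first.
```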

Röhr notes that cheaply produced solar cells made from organic materials, which are known to perform well under low-light conditions, as well as alloys made with elements from groups three and five on the periodic table could be ideal in deep waters. And while the substance of the semiconductors would differ from solar cells used on land, the overall design would remain relatively similar.

“While the sun-harvesting materials would have to change, the general design would not necessarily have to change all that much,” says Röhr. “Traditional silicon solar panels, like the ones you can find on your roof, are encapsulated to prohibit damage from the environment. Studies have shown that these panels can be immersed and operated in water for months without sustaining significant damage to the panels. Similar encapsulation methods could be employed for new solar panels made from optimal materials.” Now that they have uncovered what makes effective underwater solar cells tick, the researchers plan to begin developing optimal materials.

“This is where the fun begins!” says Röhr. “We have already investigated unencapsulated organic solar cells which are highly stable in water, but we still need to show that these cells can be made more efficient than traditional cells. Given how capable our colleagues around the world are, we are sure that we will see these new and exciting solar cells on the market in the near future.”

Story Source:

Materials provided by Cell Press. Note: Content may be edited for style and length.


Categories
ScienceDaily

The one ring — to track your finger’s location

Smart technology keeps getting smaller. There are smartphones, smartwatches and now, smart rings, devices that allow someone to use simple finger gestures to control other technology.

Researchers at the University of Washington have created AuraRing, a ring and wristband combination that can detect the precise location of someone’s index finger and continuously track hand movements. The ring emits a signal that can be picked up on the wristband, which can then identify the position and orientation of the ring — and the finger it’s attached to. The research team published these results Dec. 11 in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

“We’re thinking about the next generation of computing platforms,” said co-lead author Eric Whitmire, who completed this research as a doctoral student at the Paul G. Allen School of Computer Science & Engineering. “We wanted a tool that captures the fine-grain manipulation we do with our fingers — not just a gesture or where your finger’s pointed, but something that can track your finger completely.”

AuraRing is composed of a coil of wire wrapped 800 times around a 3D-printed ring. A current running through the wire generates a magnetic field, which is picked up by three sensors on the wristband. Based on what values the sensors detect, the researchers can continuously identify the exact position of the ring in space. From there, they can determine where the user’s finger is located.
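The article does not spell out the wristband's actual estimation algorithm, but the general idea of recovering a position from a few field readings can be sketched with a deliberately simplified model: an isotropic 1/r^3 falloff and a nonlinear least-squares fit. The sensor layout, source strength, and falloff model below are all assumptions made for illustration only.

```python
# Illustrative sketch only: the article does not describe AuraRing's actual algorithm.
# This toy model assumes the ring's field magnitude falls off as k / r^3 (ignoring the
# dipole's orientation) and recovers a 3D position from three wristband sensors with a
# nonlinear least-squares fit. Sensor layout and source strength are hypothetical.
import numpy as np
from scipy.optimize import least_squares

SENSORS = np.array([[0.000, 0.000, 0.0],   # sensor positions on the wristband, metres
                    [0.030, 0.000, 0.0],
                    [0.015, 0.025, 0.0]])
K = 1e-9  # calibrated source strength (field * distance^3), hypothetical

def predicted_magnitudes(ring_pos):
    """Field magnitude each sensor would read for a ring at ring_pos."""
    r = np.linalg.norm(SENSORS - ring_pos, axis=1)
    return K / r**3

def locate_ring(measured, initial_guess=(0.0, 0.05, 0.02)):
    """Least-squares estimate of the ring position from three magnitude readings."""
    fit = least_squares(lambda p: predicted_magnitudes(p) - measured, x0=initial_guess)
    return fit.x

# Simulate a ring a few centimetres in front of the wrist and recover its position.
true_pos = np.array([0.010, 0.060, 0.010])
print(np.round(locate_ring(predicted_magnitudes(true_pos)), 4))
# Note: with all sensors in one plane this toy model has a mirror ambiguity above and
# below the wrist; the real system resolves full pose with richer field measurements.
```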

“To have continuous tracking in other smart rings you’d have to stream all the data using wireless communication. That part consumes a lot of power, which is why a lot of smart rings only detect gestures and send those specific commands,” said co-lead author Farshid Salemi Parizi, a doctoral student in electrical and computer engineering. “But AuraRing’s ring consumes only 2.3 milliwatts of power, which produces an oscillating magnetic field that the wristband can constantly sense. In this way, there’s no need for any communication from the ring to the wristband.”

With continuous tracking, AuraRing can pick up handwriting — potentially for short responses to text messages — or allow someone to have a virtual reality avatar hand that mimics what they’re doing with their actual hand. In addition, because AuraRing uses magnetic fields, it can still track hands even when they are out of sight, such as when a user is on a crowded bus and can’t reach their phone.

“We can also easily detect taps, flicks or even a small pinch versus a big pinch,” Salemi Parizi said. “This gives you added interaction space. For example, if you write ‘hello,’ you could use a flick or a pinch to send that data. Or on a Mario-like game, a pinch could make the character jump, but a flick could make them super jump.”

The researchers designed AuraRing to be ready to use as soon as it comes out of the box and not be dependent on a specific user. They tested the system on 12 participants with different hand sizes. The team compared the actual location of a participant’s finger to where AuraRing said it was. Most of the time, the system’s tracked location agreed with the actual location within a few millimeters.

This ring and wristband combination could be useful for more than games and smartphones, the team said.

“Because AuraRing continuously monitors hand movements and not just gestures, it provides a rich set of inputs that multiple industries could take advantage of,” said senior author Shwetak Patel, a professor in both the Allen School and the electrical and computer engineering department. “For example, AuraRing could detect the onset of Parkinson’s disease by tracking subtle hand tremors or help with stroke rehabilitation by providing feedback on hand movement exercises.”

The technology behind AuraRing is something that could be easily added to smartwatches and other wristband devices, according to the team.

“It’s all about super powers,” Salemi Parizi said. “You would still have all the capabilities that today’s smartwatches have to offer, but when you want the additional benefits, you just put on your ring.”

Story Source:

Materials provided by University of Washington. Original written by Sarah McQuate. Note: Content may be edited for style and length.


Categories
UnrealEngine

Capturing the reality of space using ray tracing in Deliver Us The Moon

Though there are arguably as many styles as there are games, there is a clear tendency in part of the industry to create games that are increasingly realistic. The degree to which we can attain this goal greatly depends on the tools that are at our disposal. Unreal Engine 4 and NVIDIA’s latest real-time ray tracing technology usher in a new age of cinematic quality in games—even for small indie developers such as ourselves at KeokeN Interactive. Our debut game, Deliver Us The Moon, derives its realistic visuals in no small part from our use of RTX. Let’s take a look at its application and practicality. 

The technology behind RTX can be used for various purposes, but in its most common application, real-time ray tracing greatly enhances both shadows and reflections. By making light behave in-game as it would behave in reality, RTX introduces a new level of intricacy and credibility to the shadows and reflections that appear in-game. One of the most striking examples of this is how reflections are no longer restricted by screen space. Sources of light that are well outside the player’s screen space still reflect on surfaces. Though its absence may not seem apparent at first, the difference is striking. It’s one of those things you didn’t know you missed until you’ve seen it.

The floor and glass materials reflect objects outside of screen space. The blue holographic image reflected in the window is actually in a room behind the astronaut. 

Similarly, RTX renders shadows in a manner that isn’t just a semblance, but an actual representation of how shadows behave in reality. The result is a greater level of realism and believability to your game. 

A big part of implementing ray tracing is about performance. Current hardware can do amazing things, but this is only the first generation of ray-tracing hardware, which means we couldn’t do crazy things like ray-traced shadows from all lights. When it comes to lighting, we tried to keep it subtle and focus on a few spotlights that actually benefit a lot from ray-traced shadows. One thing to keep in mind is that using any source radius, source length or soft source radius in a light makes it heavier than when it’s just a point. When it’s just a single point from which the shadows are cast, the renderer doesn’t need to denoise the shadow, and the frame takes less time to render. The sad part is that ray-traced shadows are the prettiest when their source isn’t just a single point, but an area from which light originates. The increasing softness made by area lights over the length of the shadow is one of the things that make ray-traced shadows stand out and look realistic. We also decided to have our directional lights cast ray traced shadows. This gave a huge boost to realism and seems to be the ideal application of ray-traced shadows for now. All objects get sharp contact shadows and become softer the further they are from the shadow casting object.

Note the individual shadows on the radio knobs. 

Real-time ray tracing is a brand new tool for developers, and as with every tool, its success depends on not just the results you can achieve with it, but also its practicality. Especially for small indie developers like us, everyday reality and limited resources dictate time-efficient tools and methods. When RTX became publicly available for developers, Deliver Us The Moon was nearing the end of its polishing phase. The game’s realistic graphics are a major spearpoint, so we knew we wanted real-time ray tracing. But implementing this technology meant going back and potentially opening Pandora’s Box. Its risk and feasibility assessment came back positive in large part because of its integration in Unreal Engine. This was a decisive factor for us. 

RTX demonstrated by the astronaut’s reflection in the window and the shadows of the handlebars to the left. 

Implementing RTX after your game is practically finished means you’ll have to adjust some Materials and lighting setups. With Deliver Us The Moon, we focused on reflections, since those have the most visual impact. 

The translucency reflections required some changes to the glass master Material. The biggest change comes from less roughness and less detailed normals. Before, we had glass with small speckles, which gave it a more industrial and worn look, but when that was rendered with ray tracing, it looked “off.” Adjusting the roughness and normals yielded the results we needed. A before and after comparison is shown below. Besides the fact that it looks better, it was also needed to improve performance. Having a detailed and strongly contrasting normal means the rays traced will diverge more, which means more rays are needed to get a decent image. This, together with roughness, determines the performance impact of translucent Materials. 

Detailed normals on the window shader make the ray traced reflections too noisy.
Simplified normals give a sharp translucent reflection and increase performance.

Another thing we had to look at was the number of refraction rays. This is the number of times a ray can travel through translucent Materials. Ideally, we would want this set to three or four, since we have many windows and particles that can overlap. Sadly, that was not great for performance, which meant we had to set it to one. Then, using the latest tech from NVIDIA’s Unreal Engine 4.23 RTX branch, we could enable Hybrid Translucency. This allows objects that don’t need to be ray traced, such as small dust particles, to be rendered via regular rasterization. The downside is that those objects also won’t show in the ray-traced reflections.

Across all opaque materials, there were a few things we did to improve performance. Just like in translucent Materials, a lot of detail in the normal map and higher roughness values are bad for performance. The most important Material node is the RayTracingQualitySwitch. It lets you specify how a Material renders in the regular pass versus in ray-traced reflections. We opted not to use the normals when rendering in a ray-traced reflection. Besides that, we also used a mip bias in our texture samplers to use lower-res mips, lowering the needed VRAM bandwidth.
Since Unreal Engine 4.23, ray-traced reflections can also fall back to the reflection sphere captures after the last reflection bounce. Before that, we had to use at least three bounces to get a decent result without showing too much black. This fallback allowed us to set the maximum number of reflection bounces to one without getting big black blocks of untraced reflections, while considerably improving performance.
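As a rough sketch of how this kind of per-project tuning could be scripted (our own example, not something described by the KeokeN team), the Unreal Editor's Python plugin can push console variables such as the reflection-bounce and refraction-ray limits discussed above. The variable names below are our assumption of the 4.23-era settings and may differ between engine versions; the same values can also be set per-scene on a Post Process Volume.

```python
# Sketch only: applies the budget-friendly ray tracing limits discussed above via the
# Unreal Editor's Python scripting plugin. The console variable names are assumed from
# the 4.23-era ray tracing settings and should be verified against your engine version.
import unreal

def apply_budget_friendly_ray_tracing(world_context=None):
    """Clamp reflection bounces and translucency refraction rays to the cheap settings."""
    for command in (
        "r.RayTracing.Reflections.MaxBounces 1",          # fall back to sphere captures after one bounce
        "r.RayTracing.Translucency.MaxRefractionRays 1",   # a single pass through translucent Materials
    ):
        unreal.SystemLibrary.execute_console_command(world_context, command)

apply_budget_friendly_ray_tracing()
```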

Implementing new technology can often be a time-consuming task because of stability issues and unknowns. You have to be willing to take the risk and prepare for worst-case scenarios. But even in Early Access, we went through the implementation process with relative ease. This was crucial for Deliver Us The Moon’s tight schedule, and we believe it represents a pivotal moment for RTX. Because of Unreal, ray tracing is not only accessible to a select group of high-budget industry titans, but also to small-sized teams with limited budgets and workforce. 
In conclusion, we use ray tracing to support Deliver Us The Moon’s realistic graphics. Naturally behaving shadows and reflections allow us to further improve the realistic style that makes our players so enthusiastic. But it should be noted that the technology has similar benefits for games that don’t aim for realistic graphics per se. Whether your game is realistic, heavily stylized or comic-styled, the efficiency and ease-of-use of ray tracing allows you to set up a high-quality lighting scenario without having to bake everything down, freeing up time for you to focus on other parts of the game. Everybody wins when you need to spend less time getting better results. The technology yields incredible results and is bound to set the stage for many games to come. We are eager to put it to use in our next games. 
 

Author: KeokeN Interactive Technical Artist Daniel Torkar and Technical Level Designer Kevin van Schaijk