Engineering and Technology Updates
Energy-Efficient Construction Materials Work Better in Colder Climates, say Researchers
Researchers from Lithuania and Cyprus claim that the energy payback period of phase change materials, a new technology in the construction industry, is shortest in colder climates. The optimal location for their use is the interior on the northern side of the building. The study provides informed answers regarding the application of PCMs to improve buildings’ energy efficiency. In recent years, phase change materials (PCMs) used to improve the energy efficiency of buildings have been gaining momentum. PCMs can store and release large amounts of energy: when melting from the solid phase they absorb heat, providing a cooling effect, and when solidifying from the liquid phase they release heat, providing a warming effect. Together with colleagues from Frederick University in Cyprus, KTU researchers conducted a study across different European regions aiming to calculate the efficiency of applying PCMs in the energy upgrade of existing buildings. Their research revealed that the efficiency and energy payback period of a PCM depend on certain conditions, such as the geographical location and the wall orientation of the building. The work examines the application of PCM coatings under diverse meteorological conditions in Europe, for all major building orientations. In total, 16 numerical simulations were carried out for the four calendar months of January, April, July and October and for the three latitudes of Athens, Milan and Copenhagen. The first 8 numerical simulations were performed with phase change material integrated into the building element structure, and the other 8 in the absence of PCM. The PCM thickness incorporated was 4 cm. The annual energy saving was calculated from the four typical months, representing the four seasons of the year (winter, spring, summer and autumn). “One of the main study outcomes highlighted the fact that PCM performed better under cold conditions,” says researcher Klumbytė.
According to the researchers, this makes perfect sense: firstly, in colder conditions a PCM absorbs more energy, and secondly, since buildings in colder climates use more energy (electricity, heating, etc.), the energy saving under these conditions is more efficient. “In the study, we have developed the energy payback period concept, which means the balance between the energy used to produce these materials and the energy gained while using them. The energy payback period indicates how long it will take for the energy that is saved by the PCMs to eliminate the energy costs of their production,” explains another researcher, Fokaides. The study revealed that PCM implementation can contribute to energy savings in certain cases, varying from 0.24 up to 29.84 kWh/m2a, with energy payback periods from less than a year to almost 20 years. The longest energy payback period was calculated in warmer climates, and the shortest in colder locations. The optimal orientation for placing PCMs is west and east in Athens, east and north in Milan, and north in Copenhagen. Also, PCMs work best when they are integrated into interior structures. According to Fokaides, the study addresses topics that have not been discussed in the scientific literature before: the optimal location of the phase change material in the building, its optimal orientation and the energy payback period are entirely new concepts in the broad theme of the energy performance of the built environment. The KTU researchers claim that the methodology and dataset provided in this work can be used for the further development of buildings’ thermal assessment tools. Currently, the team is starting a new research project worth 1.5 million, which will focus on the digitalisation of the findings. This could include developing smart sensors to measure building elements’ thermal performance in real time, among other aspects. According to the scientists, this topic has vast potential for commercialisation.
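The payback concept quoted above reduces to a simple ratio. A minimal sketch in Python (the 60 kWh/m2 production-energy figure is hypothetical; only the 29.84 and 0.24 kWh/m2a savings come from the study):

```python
# Energy payback period: years for the energy saved by a PCM layer to
# offset the energy spent producing it.

def energy_payback_years(embodied_energy_kwh_m2: float,
                         annual_saving_kwh_m2a: float) -> float:
    """Embodied (production) energy divided by annual energy saving."""
    if annual_saving_kwh_m2a <= 0:
        raise ValueError("the PCM must save energy for payback to occur")
    return embodied_energy_kwh_m2 / annual_saving_kwh_m2a

# Best-case saving from the study (29.84 kWh/m2a) vs. the worst case
# (0.24 kWh/m2a), assuming a hypothetical 60 kWh/m2 of production energy.
print(energy_payback_years(60.0, 29.84))  # ≈ 2 years
print(energy_payback_years(60.0, 0.24))   # 250 years
```

The same hypothetical embodied energy thus pays back in roughly two years in the best case but would take centuries in the worst, which is why location and orientation matter so much.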
Source: https://www.sciencedaily.com/releases/2023/02/230203105332.htm
Autonomous Driving: New Algorithm Distributes Risk Fairly
Researchers at the Technical University of Munich (TUM) have developed autonomous driving software that distributes risk on the street in a fair manner. The algorithm contained in the software is considered to be the first to incorporate the 20 ethics recommendations of the EU Commission expert group, and thus makes significantly more differentiated decisions than previous algorithms. The operation of automated vehicles is to be made significantly safer by assessing the varying degrees of risk to pedestrians and motorists. The code is available to the general public as Open Source software. Technical realization is not the only obstacle to be mastered before autonomously driving vehicles can be allowed on the street on a large scale. Ethical questions play an important role in the development of the corresponding algorithms: software has to be able to handle unforeseeable situations and make the necessary decisions in case of an impending accident. Researchers at TUM have now developed the first ethical algorithm that fairly distributes levels of risk rather than operating on an either/or principle. Approximately 2,000 scenarios involving critical situations were tested, distributed across various types of streets and regions such as Europe, the USA and China. The research work is the joint result of a partnership between the TUM Chair of Automotive Technology and the Chair of Business Ethics at the TUM Institute for Ethics in Artificial Intelligence (IEAI). Maximilian Geisslinger, a scientist at the TUM Chair of Automotive Technology, explains the approach: “Until now, autonomous vehicles were always faced with an either/or choice when encountering an ethical decision. But street traffic can’t necessarily be divided into clear-cut, black-and-white situations; rather, the countless shades of gray in between have to be considered as well.
Our algorithm weighs various risks and makes an ethical choice from among thousands of possible behaviours — and does so in only a fraction of a second.” The basic ethical parameters on which the software’s risk evaluation is based were defined by an expert panel as a written recommendation on behalf of the EU Commission in 2020. The recommendation includes basic principles such as priority for the worst-off and the fair distribution of risk among all road users. In order to translate these rules into mathematical calculations, the research team classified vehicles and persons moving in street traffic based on the risk they present to others and on their respective willingness to take risks. A truck, for example, can cause serious damage to other traffic participants, while in many scenarios the truck itself will only experience minor damage. The opposite is the case for a bicycle. In the next step, the algorithm was constrained not to exceed a maximum acceptable risk in the various street situations. In addition, the research team added variables to the calculation which account for responsibility on the part of the traffic participants, for example the responsibility to obey traffic regulations. Previous approaches treated critical situations on the street with only a small number of possible manoeuvres; in unclear cases the vehicle simply stopped. The risk assessment now integrated in the researchers’ code results in more possible degrees of freedom with less risk for all. An example illustrates the approach: an autonomous vehicle wants to overtake a bicycle while a truck is approaching in the oncoming lane. All the existing data on the surroundings and the individual participants are now utilized. Can the bicycle be overtaken without driving in the oncoming traffic lane while at the same time maintaining a safe distance from the bicycle?
What is the risk posed to each respective vehicle, and what risk do these vehicles pose to the autonomous vehicle itself? In unclear cases, the autonomous vehicle with the new software always waits until the risk to all participants is acceptable. Aggressive manoeuvres are avoided, while at the same time the autonomous vehicle doesn’t simply freeze up and abruptly jam on the brakes. A simple yes or no is replaced by an evaluation containing a large number of options. The researchers emphasized that even algorithms based on risk ethics, although they can make decisions grounded in the underlying ethical principles in every possible traffic situation, still cannot guarantee accident-free street traffic. In the future, it will additionally be necessary to consider further differentiations such as cultural differences in ethical decision-making. Until now the algorithm developed at TUM has been validated in simulations. In the future the software will be tested on the street using the research vehicle EDGAR. The code embodying the findings of the research activities is available as Open Source software.
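The risk-balancing idea described above (score each candidate manoeuvre by the risk it imposes on every road user, discard anything that exceeds a maximum acceptable risk, and otherwise wait) can be sketched as follows. All numbers, weights and the risk model itself are invented for illustration and are far simpler than the actual open-source TUM algorithm:

```python
# Illustrative sketch of risk-balanced manoeuvre selection.
# Scenario values and the risk model are hypothetical.

MAX_ACCEPTABLE_RISK = 0.5  # hypothetical per-road-user cap

def weighted_risk(p_collision, harm, responsibility):
    """Risk to one road user: collision probability times expected harm,
    discounted when that user bears responsibility (e.g. broke a rule)."""
    return p_collision * harm * (1.0 - responsibility)

def choose_manoeuvre(manoeuvres):
    """Pick the manoeuvre with the lowest total risk among those keeping
    every individual road user under the cap; otherwise wait (None)."""
    feasible = []
    for name, risks in manoeuvres.items():
        per_user = [weighted_risk(*r) for r in risks]
        if all(r <= MAX_ACCEPTABLE_RISK for r in per_user):
            feasible.append((sum(per_user), name))
    return min(feasible)[1] if feasible else None

# Overtaking example: (collision probability, harm, responsibility)
# per road user, for each candidate manoeuvre.
manoeuvres = {
    "overtake_now": [(0.30, 0.9, 0.0),   # cyclist
                     (0.20, 0.5, 0.0)],  # oncoming truck
    "wait_behind":  [(0.02, 0.9, 0.0),
                     (0.01, 0.5, 0.0)],
}
print(choose_manoeuvre(manoeuvres))  # wait_behind
```

Both manoeuvres here stay under the hypothetical cap, so the selection falls to the one with the lowest total risk; if every option breached the cap, the sketch returns None, mirroring the described behaviour of waiting until the situation clears.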
Source: https://www.sciencedaily.com/releases/2023/02/230205081301.htm
Compact, Non-Mechanical 3D Lidar System Could Make Autonomous Driving Safer
Our roads might one day be safer thanks to a completely new type of system that overcomes some of lidar’s limitations. Lidar, which uses pulsed lasers to map objects and scenes, helps autonomous robots, vehicles and drones to navigate their environment. The new system represents the first time that the capabilities of conventional beam-scanning lidar systems have been combined with those of a newer 3D approach known as flash lidar. Investigators led by Susumu Noda from Kyoto University in Japan describe their new nonmechanical 3D lidar system, which fits in the palm of the hand. They also show that it can be used to measure the distance of poorly reflective objects and automatically track the motion of these objects. “With our lidar system, robots and vehicles will be able to reliably and safely navigate dynamic environments without losing sight of poorly reflective objects such as black metallic cars,” said Noda. “Incorporating this technology into cars, for example, would make autonomous driving safer.” The new system is possible thanks to a unique light source the researchers developed called a dually modulated photonic-crystal laser (DM-PCSEL). Because this light source is chip-based it could eventually enable the development of an on-chip all-solid-state 3D lidar system. “The DM-PCSEL integrates non-mechanical, electronically controlled beam scanning with flash illumination used in flash lidar to acquire a full 3D image with a single flash of light,” said Noda. “This unique source allows us to achieve both flash and scanning illumination without any moving parts or bulky external optical elements, such as lenses and diffractive optical elements.” Lidar systems map objects within view by illuminating those objects with laser beams and then calculating the distance of those objects by measuring the beams’ time of flight (ToF) — the time it takes for the light to travel to objects, be reflected and then return to the system. 
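The time-of-flight principle described above is a one-line calculation: distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Time-of-flight ranging: light travels to the object and back,
# so range is half the round-trip time times the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance to a target from a lidar round-trip time measurement."""
    return C * round_trip_seconds / 2.0

# A reflection arriving 200 nanoseconds after the pulse left
# corresponds to a target roughly 30 metres away.
print(tof_distance_m(200e-9))  # ≈ 30 m
```

The nanosecond scale of these round trips is why lidar systems need fast, sensitive detectors such as the ToF camera used here.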
Most lidar systems in use and under development rely on moving parts such as motors to scan the laser beam, making these systems bulky, expensive and unreliable. One non-mechanical approach, known as flash lidar, simultaneously illuminates and evaluates the distances of all objects in the field of view with a single broad, diffuse beam of light. However, flash lidar systems can’t be used to measure the distances of poorly reflective objects like black metallic cars due to the very small amount of light reflected from these objects. These systems also tend to be large because of the external lenses and optical elements needed to create the flash beam. To address these critical limitations, the researchers developed the DM-PCSEL light source. It has both a flash source that can illuminate a wide 30°×30° field of view and a beam-scanning source that provides spot illumination with 100 narrow laser beams. They incorporated the DM-PCSEL into a 3D lidar system, which allowed them to measure the distances of many objects simultaneously using wide flash illumination while also selectively illuminating poorly reflective objects with a more concentrated beam of light. The researchers also installed a ToF camera to perform distance measurements and developed software that enables automatic tracking of the motion of poorly reflective objects using beam-scanning illumination. The researchers demonstrated the new lidar system by using it to measure the distances of poorly reflective objects placed on a table in a lab. They also showed that the system can automatically recognize poorly reflective objects and track their movement using selective illumination. The researchers are now working to demonstrate the system in practical applications, such as the autonomous movement of robots and vehicles. 
They also want to see if replacing the ToF camera with a more optically sensitive single-photon avalanche photodiode array would allow the measurement of objects across even longer distances.
Source: https://www.sciencedaily.com/releases/2023/02/230209114747.htm
Fighting Climate Change: Ruthenium Complexes for Carbon Dioxide Reduction to Valuable Chemicals
Climate change is a global environmental concern. A major contribution to climate change comes from excessive burning of fossil fuels. They produce carbon dioxide (CO2), a greenhouse gas responsible for global warming. In this light, governments globally are framing policies to curb such carbon emissions. However, merely curbing carbon emissions may not be enough. Managing the generated carbon dioxide is also necessary. On this front, scientists have suggested chemically converting CO2 into value-added compounds, such as methanol and formic acid (HCOOH). Producing the latter requires a source of hydride ion (H-), which is equivalent to one proton and two electrons. For instance, the nicotinamide adenine dinucleotide (NAD+/NADH) reduction-oxidation couple is a hydride (H-) generator and reservoir in biological systems. Against this backdrop, a group of researchers led by Professor Hitoshi Tamiaki from Ritsumeikan University, Japan, have now developed a novel chemical method that reduces CO2 to HCOOH using NAD+/NADH-like ruthenium complexes. Prof. Tamiaki explains the motivation behind their research. “Recently, a ruthenium complex with an NAD+ model — [Ru(bpy)2(pbn)](PF6)2 — was shown to undergo photochemical two-electron reduction. It produced the corresponding NADH-type complex [Ru(bpy)2(pbnHH)](PF6)2 under visible light irradiation in the presence of triethanolamine in acetonitrile (CH3CN),” he elaborates. “Further, the bubbling of CO2 into the [Ru(bpy)2(pbnHH)]2+ solution regenerated [Ru(bpy)2(pbn)]2+ and produced formate ion (HCOO-). However, its yield was quite low. Therefore, transferring H- to CO2 required an improved catalytic system.” Consequently, the researchers explored various reagents and reaction conditions to facilitate CO2 reduction. Based on those experiments, they proposed a photoinduced two-electron reduction of the [Ru(bpy)2(pbn)]2+/[Ru(bpy)2(pbnHH)]2+ redox couple in the presence of 1,3-dimethyl-2-phenyl-2,3-dihydro-1H-benzo[d]imidazole (BIH). 
Moreover, using water (H2O) instead of triethanolamine in CH3CN further improved the yield. In addition, the researchers explored the underlying reaction mechanism using techniques like nuclear magnetic resonance, cyclic voltammetry, and UV-Vis spectrophotometry. Based on this, they proposed the following: First, photoexcitation of [Ru(bpy)2(pbn)]2+ produces the [RuIII(bpy)2(pbn•−)]2+* radical, which undergoes reduction by BIH to give [RuII(bpy)2(pbn•−)]2+ and BIH•+. Following this, H2O protonates the ruthenium complex, generating [Ru(bpy)2(pbnH•)]2+ and BI•. The obtained product undergoes disproportionation to generate [Ru(bpy)2(pbnHH)]2+ and give back [Ru(bpy)2(pbn)]2+. The former is then reduced by BI• to produce [Ru(bpy)(bpy•−)(pbnHH)]+. This complex is an active catalyst and transfers H- to CO2, producing HCOO- and formic acid. The researchers showed that the proposed reaction demonstrated a high turnover number — the moles of CO2 converted per mole of catalyst — of 63. Excited by these findings, the researchers hope to develop a new methodology of energy conversion (sunlight to chemical energy) for the production of novel renewable materials. “Our method would also decrease the total amount of CO2 gas on Earth and help maintain the carbon cycle. Thus, it could reduce global warming in the future,” adds Prof. Tamiaki. “Further, the novel organic hydride transfer technology will provide us with invaluable chemical compounds.”
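The turnover number reported above is simply moles of product formed per mole of catalyst. A minimal sketch, with hypothetical amounts chosen to reproduce the reported value of 63:

```python
# Turnover number (TON): moles of product per mole of catalyst.
# The example amounts below are hypothetical, not from the study.

def turnover_number(moles_product: float, moles_catalyst: float) -> float:
    """How many conversions each catalyst molecule performs on average."""
    return moles_product / moles_catalyst

# e.g. 6.3 mmol of formate produced by 0.1 mmol of ruthenium catalyst
print(turnover_number(6.3e-3, 1.0e-4))  # ≈ 63
```

A TON well above 1 is what distinguishes a true catalyst from a stoichiometric reagent: each ruthenium complex is regenerated and reused dozens of times.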
Source: https://www.sciencedaily.com/releases/2023/02/230209094129.htm
Novel Microscope Developed to Design Better High-Performance Batteries
Lithium-ion batteries have transformed everyday lives — almost everyone has a smartphone, more electric vehicles can be spotted on the roads, and they keep power generators going during emergencies. As more portable electronic devices, electric vehicles and large-scale grid implementations come online, the demand for higher energy density batteries that are safe and affordable continues to grow. Now, a University of Houston research team, in collaboration with researchers from the Pacific Northwest National Laboratory and the U.S. Army Research Laboratory, has developed an operando reflection interference microscope (RIM) that provides a better understanding of how batteries work, which has significant implications for the next generation of batteries. “We have achieved real-time visualization of solid electrolyte interphase (SEI) dynamics for the first time,” said Xiaonan Shan, assistant professor of electrical and computer engineering at UH’s Cullen College of Engineering and corresponding author of the study. “This provides key insight into the rational design of interphases, a battery component that has been the least understood and most challenging barrier to developing electrolytes for future batteries.” The highly sensitive microscope allows researchers to study the SEI layer, an extremely thin and fragile layer on the battery electrode surface that determines battery performance. Its chemical composition and morphology are continuously changing — making it a challenge to study. “A dynamic, non-invasive and high-sensitivity operando imaging tool is required to understand the formation and evolution of the SEI. Such a technique capable of directly probing the SEI has been rare and highly desirable,” said Yan Yao, the Hugh Roy and Lillie Cranz Cullen Distinguished Professor of electrical and computer engineering, who has worked with Shan on this project for the last four years.
“We have now demonstrated that RIM is the first of its kind to provide critical insight into the working mechanism of the SEI layer and help design better high-performance batteries,” said Yao, who is also the principal investigator of the Texas Center for Superconductivity at the University of Houston. The research team applied the principle of interference reflection microscopy in the project, where the light beam — centering at 600 nanometers with spectrum width of about 10 nanometers — was directed towards the electrodes and SEI layers and reflected. The collected optical intensity contains interference signals between different layers, carrying important information about the evolution process of SEI and allowing the researchers to observe the entire reaction process. “The RIM is very sensitive to surface variations, which enables us to monitor the same location with large-scale high spatial and temporal resolution,” said UH graduate student Guangxia Feng, who performed much of the experimental work on the project. The researchers note that most battery researchers currently use cryo-electron microscopes, which only take one picture at a certain time and cannot continuously track the changes at the same location. “I wanted to approach energy research from a different angle by adapting and developing new characterization and imaging methods which provide new information to understand the reaction mechanism in energy conversion processes,” said Shan, who specializes in developing imaging techniques and spectrometry techniques to study electrochemical reactions in energy storage and conversions. This new imaging technique could also be applied to other state-of-the-art energy storage systems. “To realize the next generation of batteries, it is essential to understand the reaction mechanisms and novel materials,” she said, adding that developing higher energy batteries also benefits the environment.
Source: https://www.sciencedaily.com/releases/2023/02/230209141509.htm
AST SpaceMobile Makes History in Cellular Connectivity, Completing the First-Ever Space-Based Voice Call Using Everyday Unmodified Smartphones
AST SpaceMobile, Inc., the company building the first and only space-based cellular broadband network accessible directly by standard mobile phones, recently announced the successful completion of the first-ever two-way voice calls made directly to everyday unmodified smartphones using the BlueWalker 3 (“BW3”) satellite. This is the first time anyone has achieved a direct voice connection from space to everyday cellular devices, demonstrating a significant advancement in AST SpaceMobile’s mission to provide connectivity to the nearly 50% of the global population who remain unconnected to cellular broadband. The first voice call was made from the Midland, Texas area to Rakuten in Japan over AT&T spectrum using a Samsung Galaxy S22 smartphone. The initial test calls validated AST SpaceMobile’s patented system and architecture and were completed using unmodified smartphones. The calls demonstrated the power of AST SpaceMobile’s BW3 satellite, the largest commercial communications array ever deployed in low Earth orbit, and mark an important step toward providing space-based 2G, 3G, 4G LTE and 5G cellular broadband globally. Engineers from Vodafone, Rakuten and AT&T participated in the preparation and testing of the first voice calls with BW3. In addition to test calls, AST SpaceMobile engineers conducted initial compatibility tests on a variety of smartphones and devices. The phones successfully exchanged Subscriber Identification Module (“SIM”) and network information directly with BW3 — crucial for delivering broadband connectivity from space to any phone or device. Additional testing and measurements of the smartphone uplink and downlink signal strength confirmed the ability to support cellular broadband speeds and 4G LTE / 5G waveforms. AST SpaceMobile has over 2,600 patent and patent-pending claims for its technology and has built state-of-the-art facilities in Midland, Texas that collectively span 185,000 square feet.
AST SpaceMobile has agreements and understandings with mobile network operators globally that have approximately 2 billion existing subscribers, including Vodafone Group, Rakuten Mobile, AT&T, Bell Canada, Orange, Telefonica, TIM, Saudi Telecom Company, Zain KSA, Etisalat, Indosat Ooredoo Hutchison, Smart Communications, Globe Telecom, Millicom, Smartfren, Telecom Argentina, Telstra, Africell, Liberty Latin America and others. AST SpaceMobile is building the first and only global cellular broadband network in space to operate directly with standard, unmodified mobile devices, based on its extensive IP and patent portfolio. Its engineers and space scientists are on a mission to eliminate the connectivity gaps faced by today’s five billion mobile subscribers and finally bring broadband to the billions who remain unconnected. The company cautions that the ongoing testing of the BW3 test satellite may not be completed as planned (for example, because of loss of satellite connectivity, destruction of the satellite, or other communication failures), and that any adjustments or modifications the testing indicates could result in material additional costs and delays in commercializing its service. Mr Sriram Jayasimha, FNAE, is Chief Scientist, Commercial Applications, AST SpaceMobile.
SSLV-D2 Rocket with 3 Satellites Lifts Off from Sriharikota
This is the SSLV’s second developmental flight.
SSLV-D2 Launch: The Indian Space Research Organisation (ISRO) successfully launched the Small Satellite Launch Vehicle (SSLV-D2) from the Satish Dhawan Space Centre at Sriharikota on the morning of February 10, 2023. The launch vehicle carried three satellites: ISRO’s earth observation satellite EOS-07 and two co-passenger satellites, Janus-1 and AzaadiSAT-2. The three satellites have been injected into the intended 450-km circular orbit around the Earth. This is the SSLV’s second developmental flight; the maiden flight, carried out on August 7, 2022, was a partial failure because of an orbit anomaly and a deviation in the rocket’s flight path. For those unaware, the key features of the SSLV include low-cost access to space, low turnaround time, flexibility in accommodating multiple satellites, and minimal demands on launch infrastructure. The Earth observation satellite EOS-07 weighs 156.3 kg and has been designed, developed and realized by ISRO. The co-passenger satellite Janus-1 is a 10.2 kg satellite belonging to ANTARIS, USA. The other co-passenger satellite, AzaadiSAT-2, weighs 8.7 kg and was built through the combined effort of about 750 girl students across India, guided by Space Kidz India, Chennai, as per ISRO‘s official website.
SSLV-D2 vehicle characteristics:
SSLV, as per ISRO, is configured with three solid propulsion stages and a velocity terminal module. It is a 34 m tall, 2 m diameter vehicle having a lift-off mass of 120 t.
Vehicle Height: 34 m
Vehicle Diameter: 2 m
Lift off Mass: ~119 t
Vehicle Configuration: SS1 + SS2 + SS3 + VTM
SSLV-D2 Mission Specifications:
Altitude (km): 450
Inclination (deg): 37.2
Launch Pad: FLP
Researchers Detail Never-Before-Seen Properties in a Family of Superconducting Kagome Metals
Dramatic advances in quantum computing, smartphones that only need to be charged once a month, trains that levitate and move at superfast speeds. Technological leaps like these could revolutionize society, but they remain largely out of reach as long as superconductivity — the flow of electricity without resistance or energy waste — isn’t fully understood. One of the major limitations for real-world applications of this technology is that the materials that make superconducting possible typically need to be at extremely cold temperatures to reach that level of electrical efficiency. To get around this limit, researchers need to build a clear picture of what different superconducting materials look like at the atomic scale as they transition through different states of matter to become superconductors. Scholars in a Brown University lab, working with an international team of scientists, have moved a small step closer to cracking this mystery for a recently discovered family of superconducting Kagome metals. In a new study, they used an innovative strategy combining nuclear magnetic resonance (NMR) and a quantum modeling theory to describe the microscopic structure of this superconductor at 103 kelvins, equivalent to about 274 degrees below zero Fahrenheit. The researchers described the properties of this bizarre state of matter for what’s believed to be the first time. Ultimately, the findings represent a new achievement in a steady march toward superconductors that operate at higher temperatures. Superconductors that can operate at room temperature (or close to it) are considered the holy grail of condensed-matter physics because of the tremendous technological opportunities they would open in power efficiency, including in electricity transmission, transportation and quantum computing. The new study focuses on the superconductor RbV3Sb5, which is made of the metals rubidium, vanadium and antimony.
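The temperature quoted above can be checked with standard unit conversions; a quick sketch:

```python
# Converting the study temperature of 103 K to Celsius and Fahrenheit.

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_celsius(103.0))     # ≈ -170 °C
print(kelvin_to_fahrenheit(103.0))  # ≈ -274 °F
```

Frigid by everyday standards, yet 103 K is still far warmer than the millikelvin regime many exotic quantum states require, which is part of what makes this transition interesting.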
The material earns its name from its peculiar atomic structure, which resembles a basketweave pattern featuring interconnected star-shaped triangles. Kagome materials fascinate researchers because of the insight they provide into quantum phenomena, bridging two of the most fundamental fields of physics: topological quantum physics and condensed matter physics. Previous work from different groups established that this material goes through a cascade of different phase transitions as the temperature is lowered, forming different states of matter with different exotic properties. When this material is brought to 103 kelvins, the lattice structure changes and the material exhibits what’s known as a charge-density wave, where the electrical charge density jumps up and down. Understanding these jumps is important for the development of theories that describe the behaviour of electrons in quantum materials like superconductors. What hadn’t been seen before in this type of Kagome metal was what the physical structure of this lattice and charge order looked like at the temperature the researchers were examining, the highest-temperature state at which the metal starts transitioning between different states of matter. Using a new strategy combining NMR measurements with a modeling approach known as density functional theory, which is used to simulate the electronic structure and positions of atoms, the team was able to describe the new structure the lattice changes into and its charge-density wave. They showed that the structure moves from a 2x2x1 pattern with a signature Star of David motif to a 2x2x2 pattern. This happens because the Kagome lattice inverts in on itself when the temperature gets extremely frigid. The new lattice it transitions into is made up largely of separate hexagons and triangles, the researchers showed.
They also showed how this pattern connects when they take one plane of the RbV3Sb5 structure and rotate it, “gazing” into it from a different angle. Probing this atomic structure is a necessary step toward providing a complete portrait of the exotic states of matter this superconducting material transitions into, the researchers said. They believe the findings will prompt further investigation of whether this formation and its properties help superconductivity or whether it is something that should be suppressed to make better superconductors. The unique new technique they used will also allow the researchers to answer a whole new set of questions.
Source: https://www.sciencedaily.com/releases/2023/02/230210185152.htm
Research Reveals Thermal Instability of Solar Cells but Offers a Bright Path Forward
A new type of solar technology has seemed promising in recent years. Halide perovskite solar cells are both high performing and low cost for producing electrical energy — two necessary ingredients for any successful solar technology of the future. But new solar cell materials should also match the stability of silicon-based solar cells, which boast more than 25 years of reliability. A team led by Juan-Pablo Correa-Baena, assistant professor in the School of Materials Sciences and Engineering at Georgia Tech, shows that halide perovskite solar cells are less stable than previously thought. Their work reveals the thermal instability that happens within the cells’ interface layers, but also offers a path forward towards reliability and efficiency for halide perovskite solar technology. Their research has immediate implications for both academics and industry professionals working with perovskites in photovoltaics, a field concerned with electric currents generated by sunlight. Lead halide perovskite solar cells promise superior conversion of sunlight into electrical power. Currently, the most common strategy for coaxing high conversion efficiency out of these cells is to treat their surfaces with large positively charged ions known as cations. These cations are too big to fit into the perovskite atomic-scale lattice, and, upon landing on the perovskite crystal, change the material’s structure at the interface where they are deposited. The resulting atomic-scale defects limit the efficacy of current extraction from the solar cell. Despite awareness of these structural changes, research on whether the cations are stable after deposition is limited, leaving a gap in understanding of a process that could impact the long-term viability of halide perovskite solar cells. To carry out the experiment, the team created a sample solar device using typical perovskite films. 
The device features eight independent solar cells, which lets the researchers experiment and generate data based on each cell’s performance. They investigated how the cells would perform, both with and without the cation surface treatment, and studied the cation-modified interfaces of each cell before and after prolonged thermal stress using synchrotron-based X-ray characterization techniques. First, the researchers exposed the pre-treated samples to 100 degrees Celsius for 40 minutes, and then measured their changes in chemical composition using X-ray photoelectron spectroscopy. They also used another type of X-ray technology to investigate precisely what type of crystal structures form on the film’s surface. Combining the information from the two tools, the researchers could visualize how the cations diffuse into the lattice and how the interface structure changes when exposed to heat. Next, to understand how the cation-induced structural changes impact solar cell performance, the researchers employed excitation correlation spectroscopy in collaboration with Carlos Silva, professor of physics and chemistry at Georgia Tech. The technique exposes the solar cell samples to very fast pulses of light and detects the intensity of light emitted from the film after each pulse to understand how energy from light is lost. The measurements allow the researchers to understand what kinds of surface defects are detrimental to performance. Finally, the team correlated the changes in structure and optoelectronic properties with the differences in the solar cells’ efficiencies. They also studied the changes induced by high temperatures in two of the most commonly used cations and observed the differences in dynamics at their interfaces. The researchers learned that the surfaces of metal halide perovskite films treated with organic cations keep evolving in structure and composition under thermal stress.
They saw that the resulting atomic-scale changes at the interface can cause a meaningful loss in power conversion efficiency in solar cells. In addition, they found that the speed of these changes depends on the type of cations used, suggesting that stable interfaces might be within reach with adequate engineering of the molecules.
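The core comparison in a study like this — measuring each cell’s power conversion efficiency before and after thermal stress and quantifying the loss — can be illustrated with a short sketch. All function names and numbers below are hypothetical, chosen only to mirror the eight-cell device and the 100 °C / 40-minute stress test described above; they are not the study’s data.

```python
# Hypothetical sketch: quantify power-conversion-efficiency (PCE) loss
# after thermal stress for a device with eight independent cells.
# Values are illustrative, NOT from the Georgia Tech study.

def relative_pce_loss(pce_before, pce_after):
    """Fractional efficiency loss after thermal stress."""
    return (pce_before - pce_after) / pce_before

# Illustrative PCE values (%) for the eight cells of one treated device,
# measured before and after 40 minutes at 100 degrees Celsius.
treated_before = [20.1, 19.8, 20.4, 20.0, 19.9, 20.2, 20.3, 19.7]
treated_after  = [17.0, 16.8, 17.5, 17.1, 16.9, 17.2, 17.4, 16.6]

losses = [relative_pce_loss(b, a)
          for b, a in zip(treated_before, treated_after)]
mean_loss = sum(losses) / len(losses)
print(f"Mean relative PCE loss across eight cells: {mean_loss:.1%}")
```

Averaging over the eight independent cells, as sketched here, is what makes a multi-cell test device useful: it separates a systematic interface effect from cell-to-cell fabrication noise.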
Source: https://www.sciencedaily.com/releases/2023/02/230209224439.htm
Researchers Develop New, Automated, Powerful Diagnostic Tool for Drug Detection
In recent years, a mass spectrometry process that can detect the amounts of drugs in a biological sample, such as blood, has become a powerful diagnostic tool, helping medical professionals identify and monitor levels of therapeutic drugs that can cause unwanted or dangerous side effects in patients. Holding back this technique — called liquid chromatography tandem mass spectrometry, or LC-MS/MS for short — is that it often requires relatively large biological samples and a number of complicated steps that must be done by hand to prepare samples for analysis. At Brown University, a team of biomedical engineers has been working to make this time-consuming process simpler and much more automated, a key ingredient to the technique being widely adopted by clinicians. In the study, they present a robust new method for accurately measuring and identifying eight antidepressants most commonly prescribed to women: bupropion, citalopram, desipramine, imipramine, milnacipran, olanzapine, sertraline and vilazodone. The method does just what the researchers hoped: it identifies and monitors these drugs from small biological samples of 20 microliters each, roughly the amount of blood taken from a finger prick. It can also be carried out almost entirely by liquid-handling robots found in most clinical mass spectrometry labs. Once the samples are ready, the user puts them through the mass spectrometer, which breaks each sample down into tiny fragments that carry tell-tale signs of the drugs being looked for. The method’s accuracy is comparable to other LC-MS/MS-based techniques, but it requires a much smaller sample and can be largely automated using the liquid handlers. These innovations set up the system’s immediate potential to be widely translated to clinical settings to help monitor the impacts of drugs prescribed for patients diagnosed with depression, including women experiencing postpartum depression.
Depression is a growing global crisis, and women face higher rates of diagnosis than men. The percentage of patients prescribed antidepressants has tripled over the past two decades, and clinicians find themselves at a crossroads between finding the right drug to suit a patient and monitoring its abundance in the body, the researchers wrote in the study. Currently, there are no commercial products in the U.S. to help clinicians directly monitor how much of these drugs is present in patients, the researchers noted. Clinicians often end up relying on more qualitative methods, like surveys, because of how burdensome mass spectrometry methods are for patients in terms of sample size and how time-consuming it is to prepare the samples for the machine. Tripathi and colleagues in his lab started working on this potential solution in 2021 after they were asked to evaluate a commercial European kit that uses LC-MS/MS to detect drugs in humans. The work has largely been the result of a collaboration between Brown graduate and undergraduate students who work in the lab. The researchers decided to take a crack at designing their own kit that could be just as accurate but much simpler. They started by identifying some of the most commonly used antidepressants and from there worked to refine how the LC-MS/MS technique identifies the drugs, including how much of a sample it needs, and to establish a control they could run against actual samples. After running a barrage of quality control checks, tweaking and testing different methods of measuring the samples under different conditions, the researchers took their entire process for preparing the sample and broke it down so that it could be programmed into a machine that could handle the preparation of the liquids. The Brown researchers used a JANUS G3 Robotic Liquid Handler in their work but said that clinicians can use simpler or more advanced machines.
The team detailed how they programmed their machine in a way that others can easily replicate with their own equipment. The team also created prototype kits that can be sent to clinicians, so they can implement the method in their labs. The kits include the chemicals and solvents needed, along with a detailed instruction booklet that flags what clinicians should watch for, drawn from the team’s own experience and the numerous tweaks made during the quality-control process.
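Quantification in LC-MS/MS assays like this one typically works by building a calibration curve from standards of known concentration, often normalized against an internal standard, and then back-calculating the concentration in an unknown sample from its measured peak-area ratio. The sketch below illustrates that general idea; the concentrations, area ratios, and sample values are hypothetical and are not taken from the Brown kit.

```python
# Hypothetical sketch of calibration-curve quantification, the standard
# back-calculation step in LC-MS/MS drug assays. Values are illustrative.

def fit_line(xs, ys):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration standards: known drug concentrations (ng/mL) and the
# measured analyte-to-internal-standard peak-area ratios.
conc  = [10.0, 25.0, 50.0, 100.0, 200.0]
ratio = [0.21, 0.52, 1.01, 2.05, 4.02]

slope, intercept = fit_line(conc, ratio)

# Unknown patient sample: measured area ratio -> estimated concentration.
sample_ratio = 1.48
sample_conc = (sample_ratio - intercept) / slope
print(f"Estimated drug concentration: {sample_conc:.1f} ng/mL")
```

Normalizing against an internal standard is what makes such assays robust to small run-to-run losses during sample preparation, which matters especially when, as here, the starting sample is only 20 microliters.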
Source: https://www.sciencedaily.com/releases/2023/02/230210185142.htm