Agents with antifungal activity play a vital role as therapeutics in health care, as do fungicides in agriculture. Effectiveness, toxicological profile, and eco-friendliness are among the properties used to select suitable substances. Furthermore, a steady supply of new agents with different modes of action is required to counter the well-known potential of human and phyto-pathogenic fungi to develop resistance against established antifungals. Here, we use an in vitro growth assay to investigate the activity of the calcineurin inhibitor tacrolimus in combination with the commercial fungicides cyproconazole and hymexazol, as well as with two earlier reported novel {2-(3-R-1H-1,2,4-triazol-5-yl)phenyl}amines, against the fungi Aspergillus niger, Colletotrichum higginsianum, and Fusarium oxysporum and the oomycete Phytophthora infestans, all of which are notoriously harmful in agriculture. When tacrolimus was added in a concentration range from 0.25 to 25 mg/L to the tested antifungals (at a fixed concentration of 25 or 50 mg/L), the inhibitory activities were distinctly enhanced. Molecular docking calculations revealed triazole derivative 5, 2-(3-(adamantan-1-yl)-1H-1,2,4-triazol-5-yl)-4-chloroaniline, as a potent inhibitor of the chitin deacetylases (CDA) of Aspergillus nidulans and A. niger (AnCDA and AngCDA, respectively), stronger than the previously reported polyoxin D, J075-4187, and chitotriose. The results are discussed in the context of potential synergism and molecular mode of action.
Hydroxybenzene, commonly known as phenol, is one of the most important organic commodity chemicals. The cumene process is the most widely used route to phenol worldwide. A crucial step in this process is the Hock rearrangement, which has a major impact on the overall cumene consumption rate and determines the safety level of the process. The most widely used catalyst for the cleavage of cumene hydroperoxide (CHP) is sulfuric acid. Besides being strongly corrosive, which increases plant investment costs, it also requires neutralization after the decomposition step to prevent side reactions. In this study, we show that high-temperature-treated Linde Type X (LTX) zeolites exhibit a high activity for the peroxide cleavage step. In addition, the structure–activity relationship responsible for this good performance in the reaction system of the Hock rearrangement was investigated. XRPD analyses revealed the formation of a new phase after temperature treatment above 900 °C. The Si/Al ratio determined by EDX suggested the formation of extra-framework aluminum, which was confirmed by solid-state NMR analysis. The newly formed extra-framework aluminum was found to be responsible for the high catalytic activity. BET analyses showed that the surface area drops at higher calcination temperatures, which leads to a lower catalytic activity for most known reactions. In this study, however, no decrease in activity was observed. The newfound material shows extraordinarily high activity as a catalyst in the Hock cleavage and has the potential to be a heterogeneous alternative to sulfuric acid for this reaction.
The sizing of thermal storage tanks in building services is usually based on domestic hot water heating according to DIN 4708. As a rule, the users' demand is used for the design. The cumulative-curve method and the resulting contribution of the heat generator are well known. For buffer storage tanks, by contrast, a distinction is made as to which combination of storage tank and heat generator is to be used, and sizes are frequently estimated or designed using rules of thumb. In addition, numerous manufacturers offer design programs, which are always geared towards the building's peak demand.
This paper presents a method that treats the thermal storage tank as a second heat supplier in the building, covering the demand together with the heat generator. In this way, the storage design is linked to the heat generator capacity. Balancing over a defined time period (24 h) with phases of high and low demand, the heat generator and the storage tank jointly cover the supply. Since the heat supply of a building depends primarily on the outdoor air temperature, a method based on this quantity is presented here that enables a simple calculation of the heat content of a storage tank.
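The kind of calculation involved can be sketched briefly. This is an illustrative sketch only, not the paper's exact procedure: tank volume, temperature spread, demand profile, and generator capacity below are assumed values. The usable storage heat content follows Q = V·ρ·c_p·ΔT, and over the balancing period the storage must cover the demand that exceeds the generator output.

```python
# Illustrative sketch (not the paper's exact method): heat content of a water
# storage tank and the demand the storage must cover next to the generator.
# All parameter values are assumed for illustration.

RHO = 1000.0   # water density [kg/m^3]
CP = 4.186     # specific heat of water [kJ/(kg*K)]

def storage_heat_content_kwh(volume_m3, t_hot_c, t_cold_c):
    """Usable heat content Q = V * rho * cp * dT, converted to kWh."""
    q_kj = volume_m3 * RHO * CP * (t_hot_c - t_cold_c)
    return q_kj / 3600.0  # 1 kWh = 3600 kJ

def required_storage_kwh(hourly_demand_kw, generator_kw):
    """Energy the storage must supply: demand exceeding the generator output."""
    return sum(max(d - generator_kw, 0.0) for d in hourly_demand_kw)

# Example: 0.75 m^3 tank with a 60 C -> 35 C temperature spread
q = storage_heat_content_kwh(0.75, 60.0, 35.0)   # -> about 21.8 kWh
# Simplified 24 h demand profile [kW] against a 10 kW generator
demand = [5, 5, 5, 20, 20, 5, 5, 5] + [5] * 16
deficit = required_storage_kwh(demand, 10.0)      # -> 20.0 kWh
print(round(q, 1), deficit)
```

If the computed deficit exceeds the tank's heat content, either the tank or the generator has to be enlarged; this is the coupling between storage size and generator capacity that the method exploits.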
Despite intensive research over the last three decades, it has not yet been possible to bring an effective vaccine against human immunodeficiency virus (HIV) and the resulting acquired immunodeficiency syndrome (AIDS) to market. Virus-like particles (VLP) are a promising approach for efficient and effective vaccination and could play an important role in the fight against HIV. For example, HEK293 (human embryo kidney) cells can be used to produce virus-like particles. In this context, given the quality-by-design (QbD) concept for manufacturing, a digital twin is of great importance for the production of HIV-Gag-formed VLPs. In this work, a dynamic metabolic model for the production of HIV-Gag VLPs was developed and validated. The model can represent the VLP production as well as the consumption or formation of all important substrates and metabolites. Thus, in combination with already described process analytical technology (PAT) methods, the final step towards the implementation of a digital twin for process development and design, as well as process automation, was completed.
Concept for Combining LCA and Hazardous Building Material Assessment for Decision Support Using BIM
(2022)
Abstract
The construction and building sector is responsible for a large part of the world’s resource and energy consumption and is considered the largest global emitter of greenhouse gas (GHG) emissions. Hazardous and toxic substances in building materials affect indoor air quality as well as the environment and thus have a high impact on human health, as we spend around 90 percent of our lives in buildings. Life cycle assessment (LCA) and the hazardous building material requirements of green building certification systems make it possible to reduce the environmental and health impacts of building products and materials. However, they are usually very complex and time-consuming to perform and require expert knowledge to use the results for decision support. Digital approaches that support the simplified application of these methods and the intuitive visualization of results are therefore becoming increasingly important. Building Information Modeling (BIM) in particular offers high potential for this purpose, as the integration and linking of geometric and semantic information in 3D models for LCA and hazardous building material assessment can be done much more efficiently and intuitively. Within the scope of this work, the following three objectives were pursued: (1) development of a method for combining LCA and hazardous building material assessment, (2) simplification of the results by converting them into comprehensible indicators for decision support, and (3) implementation of the method in a BIM-based digital assistant for intuitive visualization and communication. The preliminary results show a concept for the combined use of LCA and hazardous building material assessment in Germany, differentiated into six use cases. A prototypical implementation as a BIM-integrated digital assistant was developed for one of these use cases. For the first time, this prototype provides understandable real-time feedback on LCA and hazardous building material requirements.
This research project contributes to the awareness in the context of embodied impacts and low emitting materials in buildings and advances the current digitalization potentials.
This study aimed to simulate the sector-coupled energy system of Germany in 2030 under restrictions on CO2 emission levels and to observe how the system evolves with decreasing emissions. Moreover, the study presents an analysis of the interconnection between electricity, heat and hydrogen and of how technologies providing flexibility react when CO2 emission levels are restricted. This investigation has not yet been carried out with the technologies under consideration in this study. It shows how the energy system behaves under different set boundaries of CO2 emissions and how the costs and technologies change with different emission levels. The study results show that the installed capacities of renewable technologies constantly increase with tighter limitations on emissions. However, their usage rates decrease at low CO2 emission levels in response to more curtailed energy. The sector-coupled technologies behave differently in this regard. Heat pumps show similar behaviour, while the electrolyser usage rate increases with greater renewable energy penetration. The system flexibility is not primarily driven by the hydrogen sector; in low CO2 emission level scenarios, the flexibility shifts towards the heating sector and electrical batteries.
Electroplating generates high volumes of rinse water that is contaminated with heavy metals. This study presents an approach for direct metal recovery and recycling from simulated rinse water, made up of an electroplating electrolyte used in industry, using reverse osmosis (RO). To simulate the real industrial application, the process was examined at various permeate fluxes, ranging from 3.75 to 30 L·m−2·h−1 and hydraulic pressures up to 80 bar. Although permeance decreased significantly with increasing water recovery, rejections of up to 93.8% for boric acid, >99.9% for chromium and 99.6% for sulfate were observed. The final RO retentate contained 8.40 g/L chromium and was directly used in Hull cell electroplating tests. It was possible to deposit cold-hued chromium layers under a wide range of relevant current densities, demonstrating the reusability of the concentrate of the rinsing water obtained by RO.
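The two performance figures quoted above, rejection and water recovery, follow from simple definitions. The sketch below uses assumed concentrations and volumes for illustration; none of the numbers are taken from the study's raw data.

```python
# Minimal sketch of the reverse-osmosis (RO) performance figures quoted in
# the abstract. Feed/permeate values below are assumed for illustration.

def rejection(c_feed, c_permeate):
    """Observed solute rejection R = 1 - c_p / c_f (as a fraction)."""
    return 1.0 - c_permeate / c_feed

def water_recovery(v_permeate, v_feed):
    """Water recovery Y = V_p / V_f (as a fraction)."""
    return v_permeate / v_feed

# Example: chromium at 2.0 g/L in the feed, 0.002 g/L in the permeate
r_cr = rejection(2.0, 0.002)      # -> 0.999, i.e. 99.9 % rejection
y = water_recovery(75.0, 100.0)   # -> 0.75, i.e. 75 % water recovery
print(f"{r_cr:.3f} {y:.2f}")
```

High rejection combined with high water recovery is what concentrates the metal in the retentate until it becomes reusable as an electroplating electrolyte.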
Thioredoxin (Trx) overexpression is known to be a cause of chemotherapy resistance in various tumor entities. However, Trx effects on resistance are complex and depend strictly on tissue type. In the present study, we analyzed the impact of the Trx system on intrinsic chemoresistance of human glioblastoma multiforme (GBM) cells to cytostatic drugs. Resistance of GBM cell lines and primary cells to drugs and signaling inhibitors was assessed by 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assays. Impact of Trx inhibition on apoptosis was investigated by proteome profiling of a subset of proteins and annexin V apoptosis assays. Trx-interacting protein (TXNIP) was overexpressed by transfection and protein expression was determined by immunoblotting. Pharmacological inhibition of Trx by 1-methyl-2-imidazolyl-disulfide (PX-12) reduced viability of three GBM cell lines, induced expression of active caspase-3, and reduced phosphorylation of AKT-kinase and expression of β-catenin. Sensitivity to cisplatin could be restored by both PX-12 and recombinant expression of the upstream Trx inhibitor TXNIP, respectively.
In addition, PX-12 also sensitized primary human GBM cells to temozolomide. Combined inhibition of Trx and the phosphatidylinositide 3-kinase (PI3K) pathway resulted in massive cell death. We conclude that the Trx system and the PI3K pathway act as a sequential cascade and could potentially present a new drug target.
Potential analyses identify possible locations for renewable energy installations, such as wind turbines and photovoltaic arrays. The results of previous potential studies for Germany, however, are not consistent due to different assumptions, methods, and datasets being used. For example, different land-use datasets are applied in the literature to identify suitable areas for technologies requiring open land. For the first time, commonly used datasets are compared regarding the area and position of identified features to analyze their impact on potential analyses. It is shown that the use of Corine Land Cover is not recommended, as it leads to an overestimation of the potential area in a typical wind potential analysis by factors of 4.7 and 5.2 in comparison to Basis-DLM and OpenStreetMap, respectively. Furthermore, we develop scenarios for onshore wind, offshore wind, and open-field photovoltaic potential estimations based on land-eligibility analyses using the land-use datasets that our pre-analysis showed to perform best. Moreover, we calculate the rooftop photovoltaic potential using nationwide 3D building data for the first time. The potentials have a high sensitivity towards exclusion conditions, which are also currently being discussed in public. For example, if restrictive exclusions are chosen for the onshore wind analysis, the potential necessary for climate neutrality cannot be met. The potential capacities and possible locations are published for all administrative levels in Germany in a freely accessible database (Tool for Renewable Energy Potentials-Database), for example, to be incorporated into energy system models.
Microphone arrays consisting of sensors mounted on the surface of a rigid, spherical scatterer are popular tools for the capture and binaural reproduction of spatial sound scenes. However, microphone arrays with a perfectly spherical body and uniformly distributed microphones are often impractical for the consumer sector, in which microphone arrays are generally mounted on mobile and wearable devices of arbitrary geometries. Therefore, the binaural reproduction of sound fields captured with arbitrarily shaped microphone arrays has become an important field of research. In this work, we present a comparison of methods for the binaural reproduction of sound fields captured with non-spherical microphone arrays. First, we evaluated equatorial microphone arrays (EMAs), where the microphones are distributed on an equatorial contour of a rigid, spherical scatterer.
Second, we evaluated a microphone array with six microphones mounted on a pair of glasses. Using these two arrays, we conducted two listening experiments comparing four rendering methods based on acoustic scenes captured in different rooms. The evaluation includes a microphone-based stereo approach (sAB stereo), a beamforming-based stereo approach (sXY stereo), beamforming-based binaural reproduction (BFBR), and BFBR with binaural signal matching (BSM). Additionally, the perceptual evaluation included binaural Ambisonics renderings, which were based on measurements with spherical microphone arrays. In the EMA experiment we included a fourth-order Ambisonics rendering, while in the glasses array experiment we included a second-order Ambisonics rendering. In both listening experiments, in which participants compared all approaches with a dummy head recording, we applied non-head-tracked binaural synthesis, with sound sources only in the horizontal plane. The perceived differences were rated separately for the attributes timbre and spaciousness. Results suggest that most approaches perform similarly to the Ambisonics rendering. Overall, BSM and microphone-based stereo were rated the best for EMAs, and BFBR and microphone-based stereo for the glasses array.
Abstract
(−)‐Menthol is one of the most popular aroma compounds worldwide. While in the past it was mostly extracted from mint plants, today (−)‐menthol synthesis from other raw materials is becoming more relevant. Common starting materials for menthol synthesis are m‐cresol, citral and myrcene, but substrates like menthone, mono‐ and bicyclic terpenes and terpenoids have also been used for this purpose in the past. As (−)‐menthol of high purity is required for many applications, asymmetric syntheses and enantiomeric resolution of the obtained raw products are applied in menthol production. This review gives an overview of the most important synthetic menthol production processes of the companies Symrise, Takasago and BASF and of relevant literature in the field of menthol synthesis, with a focus on the last 20 years.
In water electrolyzers, polymer electrolyte membranes (PEMs) such as Nafion can accumulate cations stemming from salt impurities in the water supply, which leads to severe cell voltage increases. This combined experimental and computational study discusses the influence of sodium ion poisoning on the ionic conductivity of Nafion membranes and on the ion transport in a water electrolysis cell based on such membranes. Conductivities of Nafion and of aqueous solutions with the same amount of dissolved cations are measured with impedance spectroscopy and compared with respect to Nafion’s microstructure. The dynamic behavior of the voltage of a water electrolysis cell is characterized as a function of the sodium ion content and current density, showing the differences in ion transport at alternating and direct currents. These experimental results are elucidated with a physical ion transport model for sodium-ion-poisoned Nafion membranes, which describes a proton depletion and sodium ion accumulation at the cathode. During proton depletion, the cathodic hydrogen evolution is maintained by the water reduction that forms hydroxide ions. Together with sodium ions from the membrane, the formed hydroxide ions can diffuse pairwise into the water supply, so that the membrane’s sodium ions can at least partly be replaced with anodically formed protons.
Air-blast loading is a serious threat to military and civil vehicles, buildings, containers, and cargo. Sandwich-structured composites have attracted increasing interest in modern lightweight design and in structures exposed to dynamic loading regimes due to their high resistance against blast and ballistic impacts. The functional properties of such composites are determined by the interplay of their face sheet material and the employed core topology, the latter being the most important parameter affecting the structural behavior of sandwich composites. Therefore, this contribution presents a thorough numerical investigation of different core topologies in sandwich-structured composites subjected to blast loading. Special emphasis is put on prismatic and lattice core topologies displaying auxetic and classical non-auxetic deformation characteristics in order to illustrate the beneficial properties of auxetic core topologies. Their dynamic responses, elastic and plastic deformations, failure mechanisms, and energy absorption capabilities are numerically analyzed and compared. The numerical studies are performed by means of the commercial finite element code ABAQUS/Explicit, including a model for structural failure.
Conventional individual head-related transfer function (HRTF) measurements are demanding in terms of measurement time and equipment. For more flexibility, free body movement (FBM) measurement systems provide an easy-to-use way to measure full-spherical HRTF datasets with less effort. However, having no fixed measurement installation implies that the HRTFs are not sampled on a predefined regular grid but rely on the individual movements of the subject. Furthermore, depending on the measurement effort, a rather small number of measurements can be expected, ranging, for example, from 50 to 150 sampling points. Spherical harmonics (SH) interpolation has been extensively studied recently as one method to obtain full-spherical datasets from such sparse measurements, but previous studies primarily focused on regular full-spherical sampling grids. For irregular grids, it remains unclear up to which spatial order meaningful SH coefficients can be calculated and how the resulting interpolation error compares to regular grids. This study investigates SH interpolation of selected irregular grids obtained from HRTF measurements with an FBM system. Intending to derive general constraints for SH interpolation of irregular grids, the study analyzes how the variation of the SH order affects the interpolation results. Moreover, the study demonstrates the importance of Tikhonov regularization for SH interpolation, which is popular for solving ill-posed numerical problems associated with such irregular grids. As a key result, the study shows that the optimal SH order that minimizes the interpolation error depends mainly on the grid and the regularization strength but is almost independent of the selected HRTF set. Based on these results, the study proposes to determine the optimal SH order by minimizing the interpolation error of a reference HRTF set sampled on the sparse and irregular FBM grid. 
Finally, the study verifies the proposed method for estimating the optimal SH order by comparing interpolation results of irregular and equivalent regular grids, showing that the differences are small when the SH interpolation is optimally parameterized.
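The core numerical step, a Tikhonov-regularized least-squares fit of coefficients to sparse, irregular samples, can be sketched compactly. For brevity a generic random basis matrix stands in for the spherical harmonics evaluated at the measurement directions; the grid size, order, and regularization strength are assumed values, not those of the study.

```python
import numpy as np

# Sketch of Tikhonov-regularized least-squares coefficient fitting, the
# numerical core of SH interpolation from sparse, irregular grids. A generic
# basis matrix stands in for the spherical harmonics; all sizes and the
# regularization strength eps are assumed for illustration.

rng = np.random.default_rng(0)

n_samples, n_coeffs = 60, 25  # e.g. 60 irregular points; order 4 -> (4+1)^2 = 25
Y = rng.standard_normal((n_samples, n_coeffs))  # basis at the sample directions
h = Y @ rng.standard_normal(n_coeffs)           # synthetic "measured" values

eps = 1e-2  # regularization strength
# Regularized normal equations: c = (Y^H Y + eps * I)^-1 Y^H h
c = np.linalg.solve(Y.conj().T @ Y + eps * np.eye(n_coeffs), Y.conj().T @ h)

h_hat = Y @ c  # reconstruct on the sample grid
err = np.linalg.norm(h - h_hat) / np.linalg.norm(h)
print(err < 0.05)
```

Increasing eps biases the fit but stabilizes it when the grid makes the normal equations ill-conditioned; this trade-off is exactly why the optimal SH order in the study depends on the grid and the regularization strength.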
New risk geographies are emerging with war and conflict resurfacing, including nuclear threats. This poses challenges to civil protection for conducting risk-informed preparedness planning. A spatial assessment of Germany and Europe is conducted using a geographic information system. Buffer circles of nuclear explosion effects and fallout buffers show potentially exposed areas around major cities. Different scenarios indicate shrinking areas safe from exposure. However, even in a densely populated country, rural areas and smaller cities can be identified that could provide sites for evacuation shelters. Changing wind directions poses a challenge for civil protection planning because fallout risk covers most German territory even when few cities are attacked. However, wind speeds and topography can help identify suitable shelter areas. More knowledge about the temporal development of a nuclear explosion and its specific forms of harm can also help to improve risk knowledge and planning. While nuclear warfare at first seems to render useless any option for safe areas and survival, the spatial risk assessment shows that exposure does not occur at all places at all times. Being safe from harm will be difficult in such a worst-case scenario, but avoiding large city perimeters and being informed can also help reduce risk.
Remaining-useful-life (RUL) prediction of Li-ion batteries is used to provide an early indication of the expected lifetime of the battery, thereby reducing the risk of failure and increasing safety. In this paper, a detailed method is presented to make long-term predictions of the RUL based on a combination of a gated recurrent unit neural network (GRU NN) and a soft-sensing method. Firstly, an indirect health indicator (HI) that accurately describes power degradation instead of capacity was extracted from the charging processes using a soft-sensing method. Then, a GRU NN with a sliding window was applied to learn the long-term performance development. The method also uses dropout and early stopping to prevent overfitting. To build the models and validate the effectiveness of the proposed method, a real-world NASA battery data set with various battery measurements was used. The results show that the method can produce a long-term and accurate RUL prediction at each position of the degradation progression based on several historical battery data sets.
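The sliding-window step that feeds a sequence model such as a GRU can be sketched as follows. The HI series and window length below are synthetic, assumed values; training the GRU itself is omitted.

```python
import numpy as np

# Sketch of the sliding-window preparation for sequence models such as a
# GRU: turn a health-indicator (HI) time series into (window, target) pairs.
# The HI values and window length are assumed; GRU training is omitted.

def make_windows(series, window):
    """Split a 1-D series into overlapping input windows and next-step targets."""
    x, y = [], []
    for i in range(len(series) - window):
        x.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(x), np.array(y)

hi = np.linspace(1.0, 0.7, 50)   # synthetic, monotonically degrading HI
x, y = make_windows(hi, window=10)

print(x.shape, y.shape)          # (40, 10) and (40,)
```

At prediction time, the model is applied recursively: each predicted HI value is appended to the window so that the degradation trajectory, and hence the RUL, can be extrapolated far beyond the last measurement.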
The management of the liquid fraction of digestate produced from the anaerobic digestion of biodegradable municipal solid waste is difficult: its land application is limited by high ammonium concentrations, and municipal wastewater treatment plants struggle to treat it due to its high pollutant loads. The amount of leachate produced by landfills, and the pollutant load in that leachate, usually decrease over time, which frees up capacity in landfill leachate treatment plants (LLTPs) to treat additional wastewater. To address these two challenges, the co-treatment of landfill leachate and the liquid fraction of anaerobic digestate in an industrial-scale LLTP was investigated, along with the long-term impacts of the liquid fraction of anaerobic digestate on biocoenosis and on LLTP operational expenses. The co-treatment of landfill leachate and the liquid fraction of anaerobic digestate was compared to conventional leachate treatment in an industrial-scale LLTP using two parallel lanes (Lane-1 and Lane-2). The average nitrogen removal efficiencies in Lane-1 (co-treatment) were 93.4%, 95%, and 92%, respectively, for C/N ratios of 8.7, 8.9, and 9.4. The average nitrogen removal efficiency in Lane-2 (conventional landfill leachate treatment), meanwhile, was 88%, with a C/N ratio of 6.5. The LLTP’s average chemical oxygen demand (COD) removal efficiencies were 63.5%, 81%, and 78% during phases one, two, and three, respectively. As the volume ratios of the liquid fraction of anaerobic digestate increased, selective oxygen uptake rate experiments demonstrated the dominance of heterotrophic bacteria over ammonium- and nitrite-oxidising organisms. The inclusion of the liquid fraction of anaerobic digestate during co-treatment did not cause a significant increase in operational resources, i.e., oxygen, the external carbon source, activated carbon, and energy.
A bifacial photovoltaic (PV) simulation model is created by combining the optical view factor matrix with electrical output simulation in Python to analyse the energy density of bifacial systems. A discretization of the rear side of the bifacial modules allows a further investigation of mismatching and losses due to inhomogeneous radiation distribution. The model is validated, showing a deviation of −1.25 % from previous simulation models and giving hourly resolved output data with higher accuracy than existing software for bifacial PV systems.
To realize a reliable and cost-effective application of high-temperature superconductive (HTS) equipment at high-voltage (HV) levels, the influence of thermally induced gas bubbles on the dielectric strength of different solid insulating materials in liquid nitrogen (LN2) was investigated. A heatable copper tape electrode arrangement was developed, simulating HTS tapes with insulation in between. AC breakdown measurements were performed without and with forced boiling on insulating papers, polypropylene laminated paper (PPLP) and polyimide (PI) films. Under nucleate boiling, the influence of bubbles on the dielectric strength of all materials was not significant. However, under film boiling, the dielectric strength of the insulating papers decreased to a level comparable to their dielectric strength in air, demonstrating the insufficient impregnation of porous materials under film boiling. For PI there was no degradation at all. PPLP retained about 70% of its basic dielectric strength in LN2.
Abstract
This paper discusses the comparison of two methods to achieve thermal comfort using an air conditioning (AC) system in a small indoor space: adaptive control and fuzzy control. Indoor thermal comfort is provided either for individuals or for a group of people. Because small indoor spaces are usually somewhat cramped, crowded, and poorly ventilated, the ambience can be very uncomfortable for both sedentary and active work, so an AC system can be very useful for providing thermal comfort. Both methods can be utilised depending on how thermal comfort is viewed and how the level of thermal comfort is decided. Each method has its own advantages and limitations, which are also covered in this paper.
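A fuzzy controller of the kind compared here is built from membership functions over linguistic temperature labels. The sketch below shows only this basic building block; the labels and breakpoints are assumed for illustration and are not taken from the paper.

```python
# Sketch of a triangular fuzzy membership function, the basic building block
# of a fuzzy AC controller. Linguistic labels and breakpoints are assumed.

def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set with corners (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Fuzzy sets for room temperature in deg C: "cool", "comfortable", "warm"
def classify(temp_c):
    return {
        "cool": triangular(temp_c, 10, 18, 23),
        "comfortable": triangular(temp_c, 20, 24, 28),
        "warm": triangular(temp_c, 25, 30, 40),
    }

print(classify(26.0))  # partly "comfortable", partly "warm"
```

A full controller would combine such degrees with rules ("if warm, increase cooling") and defuzzify the result into an AC setpoint; the adaptive-control alternative instead shifts the comfort setpoint with the outdoor conditions.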
Climate change involves changes both in long-term average values and in the tails of probability density functions, where the extreme events are located. However, obtaining average values is more straightforward than obtaining the high-temporal-resolution information necessary to capture the extreme events on those tails. Such information is difficult to get in areas lacking sufficient rain stations. Thanks to the development of Satellite Precipitation Estimates with a daily resolution, this problem has been overcome, so Extreme Precipitation Indices (EPI) can be calculated for the entire Colombian territory. However, Colombia is strongly affected by the ENSO (El Niño-Southern Oscillation) phenomenon. Therefore, it is pertinent to ask whether the EPI's long-term change due to climate change is more important than the anomalies due to climate variability induced by the warm and cold phases of ENSO (El Niño and La Niña, respectively). In this work, we built EPI annual time series at each grid point of the selected Satellite Precipitation Estimate (CHIRPSv2) over Colombia to answer this question. Then, the Mann-Whitney-Wilcoxon test was used to compare the samples drawn in each case (i.e., change tests for both long-term change and climatic variability). After performing the analyses, we found that the importance of the change depends on the region analyzed and the EPI considered. However, some general conclusions became evident: during El Niño (La Niña) years, the EPI anomaly follows the general trend of reduction, i.e., drier conditions (increase, i.e., wetter conditions), observed in the Colombian annual precipitation amount, but only in the Pacific, Caribbean, and Andean regions. In the eastern plains of Colombia (Orinoquía and Amazonian regions), the EPI show a certain insensitivity to change due to climatic variability. On the other hand, the EPI's long-term changes in the Pacific, Caribbean, and Andean regions are spatially scattered. Still, long-term changes in the eastern plains show moderate spatial consistency with statistical significance.
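The statistical comparison described above can be sketched with SciPy's implementation of the Mann-Whitney U test. The two EPI series below are synthetic stand-ins for, say, El Niño versus La Niña years; the means, spreads, and sample sizes are assumed.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Sketch of the Mann-Whitney-Wilcoxon comparison: testing whether an
# extreme-precipitation index (EPI) differs between two groups of years.
# The series are synthetic, with assumed means and spreads.

rng = np.random.default_rng(42)
epi_nino = rng.normal(loc=40.0, scale=5.0, size=30)  # drier years (assumed)
epi_nina = rng.normal(loc=50.0, scale=5.0, size=30)  # wetter years (assumed)

stat, p_value = mannwhitneyu(epi_nino, epi_nina, alternative="two-sided")
print(p_value < 0.05)  # a small p-value indicates a significant difference
```

Being rank-based, the test makes no normality assumption about the EPI distributions, which is why it suits heavy-tailed precipitation indices; in the study it is applied at every CHIRPSv2 grid point.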
Abstract
Two types (with and without a hydrolysis stabilizer) of polyamide 6.6 (PA6.6) reinforced with 30% w/w glass fibres were examined against the influence of automotive cooling fluids, e.g. ethylene glycol aqueous solutions. The overall goal was to find a methodology to compare the performance of PA6.6 materials against the impacts of the hydrolysis environment. The stabilizer effect on the hydrolytic resistance of the materials was assessed using tensile tests according to ISO 527, and their strain‐at‐break values were evaluated in more detail. The degradation mechanism of both PA types was monitored by infrared spectroscopy and SEM. The material lifetime was described by the Arrhenius equation. The results show that the hydrolysis stabilizer operates effectively at low temperature but exhibits weak performance above 130 °C, which is explained by faster consumption of the stabilizing agent. © 2021 The Authors. Polymer International published by John Wiley & Sons Ltd on behalf of Society of Industrial Chemistry.
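An Arrhenius lifetime description of the kind used for such ageing data can be sketched briefly. The activation energy and pre-exponential factor below are assumed values for illustration, not the study's fitted parameters.

```python
import math

# Sketch of an Arrhenius lifetime extrapolation: t(T) = A * exp(Ea / (R * T)).
# Ea and A below are assumed values, not the study's fitted parameters.

R = 8.314  # universal gas constant [J/(mol*K)]

def lifetime_hours(temp_c, ea_j_mol, a_hours):
    """Predicted time to reach the failure criterion at temperature temp_c."""
    t_k = temp_c + 273.15
    return a_hours * math.exp(ea_j_mol / (R * t_k))

ea = 90e3   # assumed activation energy [J/mol]
a = 1e-9    # assumed pre-exponential factor [h]

# Accelerated ageing at 130 C extrapolated to a service temperature of 90 C
t_130 = lifetime_hours(130.0, ea, a)
t_90 = lifetime_hours(90.0, ea, a)
print(t_90 > t_130)  # lower temperature -> longer predicted lifetime
```

In practice Ea and A are fitted from times-to-failure measured at several elevated temperatures; the abstract's finding that the stabilizer fails above 130 °C means a single Arrhenius line no longer covers the whole temperature range.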
Due to the COVID-19 pandemic, university students worldwide have experienced drastic changes in their academic and social lives, with the rapid shift to online courses and contact restrictions being reported among the major stressors. In the present study, we aimed at examining students’ perceived stress over the course of the pandemic as well as individual psychological and social coping resources within the theoretical framework of the Transactional Model of Stress and Coping in the specific group of STEM students. In four cross-sectional studies with a total of 496 computer science students in Germany, we found that students reported significantly higher levels of perceived stress at both measurement time points in the second pandemic semester (October/November 2020; January/February 2021) as compared to the beginning of the first (April/May 2020), indicating that students became sensitized to the constant pandemic-related stress exposure rather than habituating to the “new normal”. Regarding students’ coping resources in the higher education context, we found that both high (a) academic self-efficacy and (b) academic online self-efficacy as well as low (c) perceived social and academic exclusion among fellow students significantly predicted lower levels of students’ (d) belonging uncertainty to their study program, which, in turn, predicted lower perceived stress at the beginning of the first pandemic semester. At the beginning of the second pandemic semester, we found that belonging uncertainty still significantly mediated the relationship between students’ academic self-efficacy and perceived stress. Students’ academic online self-efficacy, however, no longer predicted their uncertainty about belonging, but instead had a direct buffering effect on their perceived stress.
Students’ perceived social and academic exclusion among fellow students only marginally predicted their belonging uncertainty and no longer predicted their perceived stress 6 months into the pandemic. We discuss the need and importance of assessing and monitoring students’ stress levels as well as faculty interventions to strengthen students’ individual psychological and social coping resources in light of the still ongoing pandemic.
The Ganges-Brahmaputra (GB) delta is one of the most disaster-prone areas in the world due to a combination of high population density and exposure to tropical cyclones, floods, salinity intrusion and other hazards. Due to the complexity of natural deltaic processes and human influence on these processes, structural solutions like embankments are inadequate on their own for effective hazard mitigation. This article examines nature-based solutions (NbS) as a complementary or alternative approach to managing hazards in the GB delta, investigating their potential as a sustainable method for mitigating the impacts of coastal disaster risks, mainly cyclones and flooding. Using the emerging framework of NbS principles, we evaluate three existing approaches: tidal river management, mangrove afforestation, and oyster reef cultivation, all of which are actively being used to help reduce the impacts of coastal hazards. We also identify major challenges (socioeconomic, biophysical, governance and policy) that need to be overcome to allow broader application of the existing approaches by incorporating the NbS principles. In addition to addressing GB delta-specific challenges, our findings provide more widely applicable insights into the challenges of implementing NbS in deltaic environments globally.
In this paper we describe traffic sign recognition with neural networks in the frequency domain. Traffic signs exist in all countries to regulate the traffic of vehicles and pedestrians. Each country has its own set of traffic signs that are more or less similar. They consist of a set of abstract forms, symbols, numbers and letters, which are combined into different signs. Automatic traffic sign recognition is important for driver assistance systems and for autonomous driving. Traffic sign recognition is a subtype of image recognition. The traffic signs are usually recorded by a camera and must be recognized in real time, i.e. assigned to a class. We use neural networks for traffic sign recognition. The special feature of our method is that the traffic sign recognition does not take place in the spatial domain but in the frequency domain. This has advantages because it is possible to significantly reduce the number of neurons and thus the computing effort of the neural network compared to a conventional neural network.
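The core idea described above — shrinking the network input by classifying a few low-frequency FFT coefficients instead of raw pixels — can be illustrated with a minimal sketch. The toy image size, the untrained random weights, and the three hypothetical sign classes are all assumptions, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def freq_features(img, keep=8):
    """2-D FFT of a grayscale sign image; keep only the keep x keep
    low-frequency corner, which concentrates most of the image energy.
    This shrinks the input layer to 64 values instead of 32*32 = 1024,
    reducing the number of neurons and the computing effort."""
    F = np.fft.fft2(img)
    return np.abs(F[:keep, :keep]).ravel()

# Toy dense classifier on the reduced input (random, untrained weights;
# three hypothetical sign classes for illustration)
W = rng.standard_normal((64, 3)) * 0.01

def classify(img):
    logits = freq_features(img) @ W
    return int(np.argmax(logits))

img = rng.random((32, 32))  # stand-in for a cropped camera image
print(classify(img))        # class index in {0, 1, 2}
```

In practice the frequency-domain features would feed a trained network; the point of the sketch is only the input-size reduction that motivates the frequency-domain approach.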
The development and adoption of digital twins (DT) for Quality-by-Design (QbD)-based processes with flexible operating points within a proven acceptable range (PAR) and automation through Advanced Process Control (APC) with Process Analytical Technology (PAT), instead of conventional process execution based on offline analytics and inflexible process set points, is one of the great challenges in modern biotechnology. Virus-like particles (VLPs) are part of a line of innovative drug substances (DS). VLPs, especially those based on human immunodeficiency virus (HIV), HIV-1 Gag VLPs, have very high potential as a versatile vaccination platform, allowing for pseudotyping with heterologous envelope proteins, e.g., the S protein of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Because these VLPs are enveloped, optimal process control with minimal hold times is essential. This study demonstrates, for the first time, the use of a digital twin for the overall production process of HIV-1 Gag VLPs from cultivation, clarification, and purification to lyophilization. The accuracy of the digital twins is in the range of 0.8 to 1.4% in depth filtration (DF) and 4.6 to 5.2% in ultrafiltration/diafiltration (UFDF). The uncertainty due to variability in the model parameter determination is less than 4.5% (DF) and less than 3.8% (UFDF). In the DF, a prediction of the final filter capacity was demonstrated from as low as 5.8% (9 mbar) of the final transmembrane pressure (TMP). The scale-up based on DT in chromatography shows optimization potential in productivity up to a factor of 2. The schedule based on DT and PAT for APC has been compared to conventional process control, and hold-time and process duration reductions by a factor of 2 have been achieved. This work lays the foundation for the short-term validation of the DT and PAT for APC in an automated S7 process environment and the conversion from batch to continuous production.
Different mechanisms mediate the toxicity of RNA. Genomic retroviral mRNA hijacks infected host cell factors to enable virus replication. The viral genomic RNA of the human immunodeficiency virus (HIV) encompasses nine genes that encode, in less than 10 kb, all proteins needed for replication in susceptible host cells. To do so, the genomic RNA undergoes complex alternative splicing to facilitate the synthesis of the structural, accessory, and regulatory proteins. However, HIV strongly relies on the host cell machinery, recruiting cellular factors to complete its replication cycle. Antiretroviral therapy (ART) targets different steps in the cycle, preventing disease progression to the acquired immunodeficiency syndrome (AIDS). Comprehension of the interaction of the host immune system with the virus has fostered the development of a variety of vaccine platforms. Despite encouraging provisional results in vaccine trials, no effective vaccine has been developed yet. However, novel promising vaccine platforms are currently under investigation.
Despite great efforts to develop a vaccine against human immunodeficiency virus (HIV), which causes AIDS if untreated, no approved HIV vaccine is available to date. A promising class of vaccines is virus-like particles (VLPs), which have been shown to be very effective for the prevention of other diseases. In this study, production of HI-VLPs using different 293F cell lines, followed by a three-step purification of HI-VLPs, was conducted. The quality-by-design-based process development was supported by process analytical technology (PAT). The HI-VLP concentration increased 12.5-fold while >80% purity was achieved. This article reports on the first general process development and optimization up to purification. Further research will focus on process development for polishing and formulation up to lyophilization. In addition, process analytical technology and process modeling for process automation and optimization by digital twins in the context of the quality-by-design framework will be developed.
Resilience in the urban context can be described as a continuum of absorptive, adaptive, and transformative capacities. The need to move toward a sustainable future and bounce forward after any disruption has led recent urban resilience initiatives to engage with the concept of transformative resilience when and where conventional, top-down resilience initiatives are less likely to deliver effective strategies, plans, and implementable actions. Transformative resilience pathways emphasize the importance of reflexive governance, inclusive co-creation of knowledge, innovative and collaborative learning, and self-organizing processes. To support these transformative pathways, and in view of techno-social co-evolution and digital transformation, the use of new data sources such as Volunteered Geographic Information (VGI) and crowdsourcing is being promoted. However, a literature review on VGI and transformative resilience reveals that a comprehensive understanding of the complexities and capacities of utilizing VGI for transformative resilience is lacking. Therefore, based on a qualitative content analysis of available resources, this paper explores the key aspects of using VGI for transformative resilience and proposes a comprehensive framework structured around the identified legal, institutional, social, economic, and technical aspects to formalize the process of adopting VGI in transformative resilience initiatives.
This paper documents the design, implementation and evaluation of the Unfolding Space Glove—an open source sensory substitution device. It transmits the relative position and distance of nearby objects as vibratory stimuli to the back of the hand and thus enables blind people to haptically explore the depth of their surrounding space, assisting with navigation tasks such as object recognition and wayfinding. The prototype requires no external hardware, is highly portable, operates in all lighting conditions, and provides continuous and immediate feedback—all while being visually unobtrusive. Both blind (n = 8) and blindfolded sighted participants (n = 6) completed structured training and obstacle courses with both the prototype and a white long cane to allow performance comparisons to be drawn between them. The subjects quickly learned how to use the glove and successfully completed all of the trials, though still being slower with it than with the cane. Qualitative interviews revealed a high level of usability and user experience. Overall, the results indicate the general processability of spatial information through sensory substitution using haptic, vibrotactile interfaces. Further research would be required to evaluate the prototype’s capabilities after extensive training and to derive a fully functional navigation aid from its features.
In this work, supported cellulose acetate (CA) mixed matrix membranes (MMMs) were prepared and studied concerning their gas separation behavior. The dispersion of carbon nanotube fillers was studied as a function of polymer and filler concentrations using the mixing methods of the rotor–stator system (RS) and the three-roll-mill system (TRM). Compared to the dispersion quality achieved by RS, samples prepared using the TRM seem to have slightly bigger, but fewer and more homogeneously distributed, agglomerates. The green solvent γ-butyrolactone (GBL) was chosen as the polyimide (PI) polymer solvent, whereas diacetone alcohol (DAA) was used for preparing the CA solutions. The thin CA separation layer was coated using a spin coater. For coating on the PP carriers, a short parameter study was conducted regarding the plasma treatment to affect the wettability, the coating speed, and the volume of dispersion applied to the carrier. As predicted by the parameter study, the amount of dispersion that remained on the carriers decreased with increasing rotational speed during the spin coating process. The dry separation layer thickness was varied between about 1.4 and 4.7 μm. Electrically conductive additives in a non-conductive matrix show a steeply increasing electrical conductivity after passing the so-called percolation threshold; this was used to evaluate the agglomeration behavior in suspension and in the applied layer. Gas permeation tests were performed using a constant volume apparatus at feed pressures of 5, 10, and 15 bar. The highest calculated ideal CO2/N2 selectivity, 21, was achieved for the CA membrane and corresponded to a CO2 permeability of 49.6 Barrer.
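The ideal selectivity quoted above is simply the ratio of single-gas permeabilities, so the implied N2 permeability can be back-calculated from the reported CO2 value (the N2 figure below is derived, not reported in the abstract):

```python
def ideal_selectivity(p_a, p_b):
    """Ideal (single-gas) selectivity of a membrane:
    ratio of the pure-gas permeabilities of species a and b."""
    return p_a / p_b

p_co2 = 49.6          # Barrer, reported for the CA membrane
p_n2 = p_co2 / 21     # implied N2 permeability (~2.36 Barrer)

print(round(ideal_selectivity(p_co2, p_n2)))  # → 21
```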
The oxidation of cumene and the subsequent cleavage of cumene hydroperoxide (CHP) with sulfuric acid (Hock rearrangement) is still, by far, the dominant synthetic route to phenol. In 2020, the global phenol market reached a value of 23.3 billion US$ with a projected compound annual growth rate of 3.4% for 2020–2025. From ecological and economical viewpoints, the key step of this process is the cleavage of CHP. One sought-after way to reduce both the energy consumption and the waste production of the process is to substitute sulfuric acid with heterogeneous catalysts. Different types of zeolites, silicon-based clays, heteropoly acids, and ion exchange resins have been investigated and tested in various studies. For each type of these solid acid catalysts, several materials were found that show high yield and selectivity to phenol. In this mini-review, first a brief introduction and overview of the Hock process is given. Next, the mechanism, kinetics, and safety aspects are summarized and discussed. Subsequently, the different types of heterogeneous catalysts and their performance in the Hock process are illustrated. Finally, the different approaches to substituting sulfuric acid in the synthetic route to phenol are briefly summarized and a short outlook is given.
Pressure injuries remain a serious health complication for patients and nursing staff. Evidence from the past decade has not yet been analysed through narrative synthesis. PubMed, Embase, CINAHL Complete, Web of Science, Cochrane Library, and other reviews/sources were screened. Risk of bias was evaluated using a slightly modified QUIPS tool. Risk factor domains were used to assign (non)statistically independent risk factors. In total, 67 studies with 679,660 patients were included. In studies with low to moderate risk of bias, non-blanchable erythema reliably predicted pressure injury stage 2. Factors influencing mechanical boundary conditions, e.g., higher interface pressure or BMI < 18.5, as well as factors affecting interindividual susceptibility (male sex, older age, anemia, hypoalbuminemia, diabetes, hypotension, low physical activity, existing pressure injuries) and treatment-related aspects, such as length of stay in intensive care units, were identified as possible risk factors for pressure injury development. Health care professionals’ evidence-based knowledge of the above-mentioned risk factors is vital to ensure optimal prevention and/or treatment. Readily accessible risk factors, e.g., sex, age, BMI, pre-existing diabetes, and non-blanchable erythema, can serve as yellow flags for pressure injury development. Close communication concerning further risk factors, e.g., anemia, hypoalbuminemia, or low physical activity, may optimize prevention and/or treatment. Further high-quality evidence is warranted.
Due to their pronounced bioactivity and limited availability from natural resources, metabolites of the soft coral Pseudopterogorgia elisabethae, such as erogorgiaene and the pseudopterosines, represent important target molecules for chemical synthesis. We have now developed a particularly short and efficient route towards these marine diterpenes exploiting an operationally convenient enantioselective cobalt‐catalyzed hydrovinylation as the chirogenic step. Other noteworthy C−C bond forming transformations include diastereoselective Lewis acid‐mediated cyclizations, a Suzuki coupling and a carbonyl ene reaction. Starting from 4‐methyl‐styrene the anti‐tubercular agent (+)‐erogorgiaene (>98 % ee) was prepared in only 7 steps with 46 % overall yield. In addition, the synthesis of the pseudopterosin A aglycone was achieved in 12 steps with 30 % overall yield and, surprisingly, was found to exhibit a similar anti‐inflammatory activity (inhibition of LPS‐induced NF‐κB activation) as a natural mixture of pseudopterosins A−D or iso‐pseudopterosin A, prepared by β‐D‐xylosylation of the synthetic aglycone.
In the chemical industry, large amounts of saline wastewater arise. Its disposal into rivers places a considerable burden on the ecosystem. To strive for a circular economy and enable viable raw material recycling, energy‐efficient concentration processes are required. High‐pressure reverse osmosis meets this criterion, but its industrial application demands suitable membrane elements that withstand the exceptional operating conditions and provide sufficient performance. Hence, new requirements regarding the design of spiral‐wound elements arise. To identify these, specific performance‐limiting effects need to be better understood.
Different methods have been proposed for in situ root-length density (RLD) measurement. One widely employed method is the time-consuming sampling of soil cores or monoliths (MO). The profile wall (PW) method is a less precise, but faster and less laborious alternative. However, depth-differentiated functions to convert PW RLD estimates to MO RLD measurements have not yet been reported. In this study, we perform a regression analysis to relate PW results to MO results and determine whether calibration is possible for distinct crop groups (grasses, brassicas and legumes) consisting of pure and mixed stands, and whether soil depth affects this calibration. The methods were applied over two years to all crop groups, and their absolute and cumulative RLD were compared using linear (LR) and multiple linear (MLR) regression. PW RLD was found to strongly underestimate MO RLD in absolute values and in densely rooted areas. However, close agreement between both methods was found for cumulative root length (RL) when applying MLR, highlighting the influence of soil depth. The level of agreement between methods varied strongly with depth. Therefore, the application of PW as the main RLD estimation method can provide reliable estimates of cumulative root distribution traits of cover crops.
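A depth-aware calibration of PW estimates against MO measurements of the kind described above can be sketched with ordinary least squares; the paired observations below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical paired observations: profile-wall (PW) RLD estimates,
# sampling depth (cm), and monolith (MO) RLD measurements.
pw    = np.array([0.5, 1.0, 2.0, 3.0, 1.5, 0.8])
depth = np.array([10.0, 10.0, 30.0, 30.0, 60.0, 60.0])
mo    = np.array([1.2, 2.1, 3.9, 5.8, 2.6, 1.5])

# MLR calibration: MO ~ b0 + b1*PW + b2*depth, fitted by least squares.
# Including depth as a regressor captures the depth dependence of the
# PW-to-MO relationship noted in the study.
X = np.column_stack([np.ones_like(pw), pw, depth])
beta, *_ = np.linalg.lstsq(X, mo, rcond=None)

predicted = X @ beta
print(beta)  # intercept, PW slope, depth coefficient
```

With real data, the fitted coefficients would form the depth-differentiated conversion function from PW estimates to MO-equivalent RLD values.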
Table Tennis Tutor: Forehand Strokes Classification Based on Multimodal Data and Neural Networks (2021)
Beginner table-tennis players require constant real-time feedback while learning the fundamental techniques. However, due to various constraints such as the mentor’s inability to be around all the time and expensive sensors and equipment for sports training, beginners are unable to get the immediate real-time feedback they need during training. Sensors have been widely used to train beginners and novices in various skills, including psychomotor skills. Sensors enable the collection of multimodal data which can be utilised with machine learning to classify training mistakes, give feedback, and further improve the learning outcomes. In this paper, we introduce the Table Tennis Tutor (T3), a multi-sensor system consisting of a smartphone device with its built-in sensors for collecting motion data and a Microsoft Kinect for tracking body position. We focused on forehand stroke mistake detection. We collected a dataset recording an experienced table tennis player performing 260 short forehand strokes (correct) and mimicking 250 long forehand strokes (mistake). We analysed and annotated the multimodal data for training a recurrent neural network that classifies correct and incorrect strokes. To investigate the accuracy level of the aforementioned sensors, three combinations were validated in this study: smartphone sensors only, the Kinect only, and both devices combined. The results of the study show that the smartphone sensors alone perform worse than the Kinect, while the combination of both devices achieves similar accuracy with better precision. To further strengthen T3’s potential for training, an expert interview session was held virtually with a table tennis coach to investigate the coach’s perception of having a real-time feedback system to assist beginners during training sessions. The outcome of the interview shows positive expectations and provided further input that can be beneficial for future implementations of T3.
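The sequence-classification step can be sketched as a minimal Elman-style recurrent network over per-timestep sensor vectors. The feature layout, dimensions, and (untrained) weights below are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each timestep is a 9-D vector, e.g. accelerometer + gyroscope +
# Kinect wrist position (an illustrative feature layout).
n_in, n_hidden = 9, 16
W_ih  = rng.standard_normal((n_in, n_hidden)) * 0.1     # input-to-hidden
W_hh  = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # hidden recurrence
W_out = rng.standard_normal((n_hidden, 2)) * 0.1         # correct vs. mistake

def classify_stroke(seq):
    """Unroll a simple Elman RNN over the stroke sequence and classify
    the final hidden state. 0 = short stroke (correct), 1 = long (mistake)."""
    h = np.zeros(n_hidden)
    for x in seq:
        h = np.tanh(x @ W_ih + h @ W_hh)
    return int(np.argmax(h @ W_out))

stroke = rng.standard_normal((50, n_in))  # 50 timesteps of sensor data
print(classify_stroke(stroke))            # class index in {0, 1}
```

A trained version would fit the three weight matrices on the annotated 260/250-stroke dataset; the sketch only shows how multimodal timesteps flow through a recurrent classifier.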
Remote sensing applications of change detection are increasingly in demand in many areas of land use, urbanization, and disaster risk reduction. The Sendai Framework for Disaster Risk Reduction and the New Urban Agenda by the United Nations call for risk monitoring. This study maps and assesses the urban area changes of 23 Mexican–US border cities with a remote sensing-based approach. A literature study on existing work on hazard mapping and social vulnerability in those cities reveals a need for further studies on urban growth. Using a multi-modal combination of aerial, declassified (CORONA, GAMBIT, HEXAGON programs), and recent (Sentinel-2) satellite imagery, this study expands existing land cover change assessments by capturing urban growth back to the 1940s. Results of a Geographic Information System and census data assessment reveal that massive urban growth has occurred on both sides of the national border. On the Mexican side, population and area growth exceed those of the US cities in many cases. In addition, flood hazard exposure has grown along with growing city sizes, despite structural river training. These findings indicate a need for more risk monitoring that includes remote sensing data. They also have socio-economic implications, as the social vulnerability on the Mexican and US sides differs. This study calls for the maintenance and expansion of open data repositories to enable such transboundary risk comparisons. Common vulnerability variable sets could enable better comparisons, as could comparable flood zonation mapping techniques. To enable risk monitoring, basic data such as urban boundaries should be mapped per decade and provided on open data platforms in GIS formats, not just in map viewers.
Pluvial floods claimed more than 180 lives in Germany in July 2021, when a large and slow-moving storm system affected Germany and many neighbouring countries. The death tolls and damages were the highest since 1962 in Germany, and soon after, the crisis management came under public critique. This study undertook an online survey to better understand the crisis management and identify lessons to learn. It received positive interest among operational relief forces and other helpers (n = 2264). The findings reveal an overall satisfaction with the operation in general as well as personal lessons learned. They also reveal shortcomings in many areas, including information distribution, coordination, the parallel ongoing COVID-19 pandemic, infrastructure resilience, and other factors. Areas for improvement of the crisis management system are likewise suggested by the respondents. Cooperation and support by the affected population are perceived as positive. This helps to inform other necessary areas of research, such as studies on the perception of the affected people. The gaps in assessments of operational forces and some methodological constraints are discussed to advance future follow-up studies.
Floods are a known natural hazard in Germany, but the amount of precipitation and the ensuing high death toll and damages after the events of 14 to 15 July 2021 in particular came as a surprise. Almost immediately, questions about failures in the early warning chains and the effectiveness of the German response emerged, also internationally. This article presents lessons to learn and argues against a blame culture. The findings are based on comparisons with findings from previous research projects carried out in the Rhein-Erft-Kreis and the city of Cologne, as well as on discussions with operational relief forces after the 2021 events. The main disaster aspects of the 2021 flood are related to issuing and understanding warnings and a lack of information and data exchange, unfolding upon a situation of an ongoing pandemic and aggravated further by critical infrastructure failure. Increasing frequencies of flash floods and other extremes due to climate change are just one side of the transformation and challenges Germany and neighbouring countries are facing. The vulnerability paradox also heavily contributes: German society became increasingly vulnerable to failure due to an increased dependency on its infrastructure and emergency system, and the ensuing expectations of the public for a perfect system.
In the literature, many studies outline the advantages of agrivoltaic (APV) systems from different viewpoints: optimized land use, productivity gains in both the energy and water sectors, economic benefits, etc. A holistic analysis of an APV system is needed to understand its full advantages. For this purpose, a 0.15 ha case-study farm in a village in Niger, West Africa, was chosen as the reference farm. Altogether, four farming cases are considered: traditional rain-fed farming, irrigation with diesel-powered pumps, irrigation with solar pumps, and the APV system. The APV system is further analyzed under two scenarios: benefits to investors, and combined benefits to investors and farmers. An economic feasibility analysis model is developed. Different economic indicators are used to present the results: gross margin, farm profit, benefit-cost ratio, and net present value (NPV). All the economic indicators obtained for the solar-powered irrigation system were positive, whereas all those for the diesel-powered system were negative. Additionally, the diesel system would emit about 4005 kg of CO2 annually to irrigate the chosen reference farm. The land equivalent ratio (LER) was obtained as 1.33 and 1.13 for the cases with shading-induced yield loss excluded and included, respectively.
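The NPV and benefit-cost indicators named above follow standard discounted-cash-flow definitions; the discount rate and cash flows below are illustrative, not the study's data:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the initial (year-0) flow,
    typically negative for an upfront investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs; > 1 is favourable."""
    return npv(rate, benefits) / npv(rate, costs)

# Illustrative example: a solar pump with an upfront cost and steady
# annual farm benefits over five years, discounted at 8%.
flows = [-1000, 350, 350, 350, 350, 350]
print(round(npv(0.08, flows), 2))  # positive NPV -> economically viable
```

The study's positive indicators for the solar-powered cases and negative ones for the diesel case correspond to NPV > 0 and NPV < 0 under this kind of calculation.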
This paper studies the process of business cycle synchronization in the European Union and the euro area. As our baseline methodology, we adopt rolling window correlation coefficients of various economic indicators observed since 2000. Among the indicators, we distinguish between real economic indicators, like real GDP growth and unemployment, and nominal indicators, like inflation and the government budget. Given the direct implications of this kind of analysis for the common monetary policy of the European Central Bank (ECB), special attention is paid to the pattern of business cycle synchronization in the core and peripheral members of the euro area. Our analysis of quarterly data covering the first two decades of the euro area shows that there was a certain synchronization tendency in the first years of the common currency. However, the European debt crisis halted the economic integration within the European Union and—even more so—within the euro area. Since the ECB can to a large extent intervene only with “one-size-fits-all” monetary policy instruments, this makes the conduct of stabilisation policies within the euro area increasingly cumbersome.
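The baseline measure — a rolling-window correlation of a quarterly indicator between two economies — can be sketched with pandas on synthetic data (the series, window length, and country labels are invented for illustration):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Synthetic quarterly GDP growth for two hypothetical euro-area members:
# a common component plus country-specific noise (larger for "periphery").
n = 80  # 20 years of quarterly observations
common = rng.standard_normal(n)
core      = pd.Series(common + 0.3 * rng.standard_normal(n))
periphery = pd.Series(common + 0.8 * rng.standard_normal(n))

# 20-quarter (5-year) rolling correlation as a synchronization measure;
# rising values indicate increasing business cycle synchronization.
rolling_corr = core.rolling(window=20).corr(periphery)
print(rolling_corr.iloc[-1])
```

With real data, plotting `rolling_corr` over time reveals the synchronization and de-synchronization phases the paper discusses around the euro introduction and the debt crisis.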
Linoleic acid hydroperoxides are versatile intermediates for the production of green note aroma compounds and bifunctional ω-oxo-acids. An enzyme cascade consisting of lipoxygenase, lipase and catalase was developed for one-pot synthesis of 13-hydroperoxyoctadecadienoic acid starting from safflower oil. Reaction conditions were optimized for hydroperoxidation using lipoxygenase 1 from Glycine max (LOX-1) in a solvent-free system. The addition of green surfactant Triton CG-110 improved the reaction more than two-fold and yields of >50% were obtained at linoleic acid concentrations up to 100 mM. To combine hydroperoxidation and oil hydrolysis, 12 lipases were screened for safflower oil hydrolysis under the reaction conditions optimized for LOX-1. Lipases from Candida rugosa and Pseudomonas fluorescens were able to hydrolyze safflower oil to >75% within 5 h at a pH of 8.0. In contrast to C. rugosa lipase, the enzyme from P. fluorescens did not exhibit a lag phase. Combination of P. fluorescens lipase and LOX-1 worked well upon LOX-1 dosage and a synergistic effect was observed leading to >80% of hydroperoxides. Catalase from Micrococcus lysodeikticus was used for in-situ oxygen production with continuous H2O2 dosage in the LOX-1/lipase reaction system. Foam generation was significantly reduced in the 3-enzyme cascade in comparison to the aerated reaction system. Safflower oil concentration was increased up to 300 mM linoleic acid equivalent and 13-hydroperoxides could be produced in a yield of 70 g/L and a regioselectivity of 90% within 7 h.
Water scarcity drives governments in arid and semi-arid regions to promote strategies for improving water use efficiency. Water-related research generally also plays an important role in the same countries and for the same reason. However, it remains unclear how to link the implementation of new government strategies and water-related research. This article’s principal objective is to present a novel approach that defines water-related research gaps from the point of view of a government strategy. The proposed methodology is based on an extensive literature review, followed by a systematic evaluation of the topics covered in both grey and peer-reviewed literature. Finally, we assess if and how the different literature sources contribute to the goals of the water strategy. The methodology was tested by investigating the impact of the water strategy of Jordan’s government (2008–2022) on the research conducted in the Azraq Basin, considering 99 grey and peer-reviewed documents. The results showed an increase in the number of water-related research documents from 37 published between 1985 and 2007 to 62 published between 2008 and 2018. This increase should not, however, be attributed to the development of Jordan’s water strategy; in fact, it matches the general upward trend in research production in Jordan. Moreover, the results showed that only about 80% of the documents align with the goals identified in the water strategy. In addition, the distribution of the documents among the different goals of the strategy is heterogeneous; hence, research gaps can be identified, i.e., goals of the water strategy that are not addressed by any of the documents sourced. To foster innovative and demand-based research in the future, a matrix was developed that links basin-specific research focus areas (RFAs) with the MWI strategy topics.
In doing so, the goals that are not covered by a particular RFA are highlighted. This analysis can inspire researchers to develop and apply new topics in the Azraq Basin to address the research gaps and strengthen the connection between the RFAs and the strategy topics and goals. Moreover, the application of the proposed methodology can motivate future research to become demand-driven, innovative, and contribute to solving societal challenges.
The paper structure of historical prints acts as a unique fingerprint: paper of the same origin shows similar chain line distances. As the manual measurement of chain line distances is time-consuming, automatic detection of chain lines is beneficial. We propose an end-to-end trainable deep learning method for the segmentation and parameterization of chain lines in transmitted light images of German prints from the 16th century. We trained a conditional generative adversarial network with a multitask loss for line segmentation and line parameterization. We formulated a fully differentiable pipeline for line coordinate estimation that consists of line segmentation, horizontal line alignment, 2D Fourier filtering of line segments, line region proposals, and differentiable line fitting. We created a dataset of high-resolution transmitted light images of historical prints with manual line coordinate annotations. Our method shows superior qualitative and quantitative chain line detection results with high accuracy and reliability on our historical dataset in comparison to competing methods. Further, we demonstrated that our method achieves a low error of less than 0.7 mm in comparison to manually measured chain line distances.
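The line-parameterization idea — fitting each chain line by least squares and reading the spacing from the fitted parameters — can be sketched on toy pixel coordinates (the coordinates and scan resolution below are assumptions, and the real pipeline uses a differentiable fit inside a network):

```python
import numpy as np

def fit_line(xs, ys):
    """Least-squares fit y = m*x + c to pixel coordinates of one chain line."""
    A = np.column_stack([xs, np.ones_like(xs)])
    (m, c), *_ = np.linalg.lstsq(A, ys, rcond=None)
    return m, c

# Chain lines are roughly parallel, so for two lines with the same slope
# their spacing can be read off the fitted intercepts (toy coordinates, px).
xs = np.array([0.0, 100.0, 200.0, 300.0])
line1 = fit_line(xs, 50.0 + 0.01 * xs)
line2 = fit_line(xs, 78.0 + 0.01 * xs)

px_per_mm = 10.0  # hypothetical scan resolution
print(round((line2[1] - line1[1]) / px_per_mm, 2))  # chain line distance in mm
```

Converting fitted intercept differences to millimetres is what allows comparing automatic results against the manually measured distances, as in the reported sub-0.7 mm error.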
The paper focuses on a study of turbulence decay in flow with a streamwise gradient. For the first time, an analytical solution of this problem was obtained based on the k‐ε turbulence model in a one‐dimensional (1D) approximation, as well as on the symmetry properties of the system of differential equations. The Lie group technique enabled reducing the problem to a linear differential equation. The analytical solution enabled parametric studies, which are computationally cheap in comparison to CFD-based simulations. The lattice Boltzmann method (LBM) in a two‐dimensional (2D) approximation was used to validate the analytical results. The large eddy simulation (LES) Smagorinsky approach was used to close the LBM model. Computations revealed that the rate of turbulence decay differs significantly between positive and negative streamwise pressure gradients. Further comparisons showed that the analytical solution underpredicts the numerical results, which can be attributed to the simplified problem statement used to derive the closed‐form analytical solution. Comparisons of calculations with experiments revealed that the theoretical models used in the study underpredict the measurements for flows with a positive pressure gradient. Hence, it can be concluded that the LBM technique combined with the LES Smagorinsky model requires further modification.
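The kind of closed-form decay behaviour referred to above can be illustrated, in a simplified setting, by the standard k‐ε equations for decaying turbulence convected at constant mean velocity U with zero production (a textbook case, not the paper's exact gradient-flow solution):

```latex
% 1D k-epsilon balance for decaying turbulence at constant mean velocity U:
\begin{align}
  U \frac{\mathrm{d}k}{\mathrm{d}x} &= -\varepsilon, &
  U \frac{\mathrm{d}\varepsilon}{\mathrm{d}x}
    &= -C_{\varepsilon 2}\,\frac{\varepsilon^{2}}{k},
\end{align}
% whose solution is the classical power-law decay
\begin{equation}
  k(x) \propto (x - x_0)^{-n}, \qquad
  n = \frac{1}{C_{\varepsilon 2} - 1} \approx 1.09
  \quad (C_{\varepsilon 2} = 1.92).
\end{equation}
```

A streamwise pressure gradient adds production and convection terms to these balances, which is why the paper's Lie-group reduction is needed to obtain a closed-form solution in the general case.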
The production of pharmaceutical ingredients, intermediates and final products strongly depends on the utilization of water. Water is also required for the purification and preparation of reagents. Each specific application determines the respective water quality. In the European Union, the European Pharmacopeia (Ph. Eur.) contains the official standards that assure quality control of pharmaceutical products during their life cycle. According to this, the production of water for pharmaceutical use is mainly based on multi-stage distillation and membrane processes, especially, reverse osmosis. Membrane distillation (MD) could be an alternative process to these classical methods. It offers advantages in terms of energy demand and a compact apparatus design. In the following study, the preparation of pharmaceutical-grade water from tap water in a one-step process using MD is presented. Special emphasis is placed on the performance of two different module designs and on the selection of optimum process parameters.