Aim: European cities face heightened hydrological risks as a result of climate change, while ecological degradation has reduced the environment's capacity to absorb and regulate such fluctuations. Climate forecasts predict more intense convective rainfall and winter flood events in the Wupper Basin in Germany, against a background trend of reduced mean rainfall during the summer months. On 14 July 2021, intense convective rainfall fell at points across Western Germany and led to flash floods in the Wupper Basin; many sites were inundated, and the Wupper and Dhünn rivers rose to new record highs. Green-blue infrastructure offers strategies to reduce the impacts of such hazards while providing a range of co-benefits. A study was undertaken to determine which green-blue interventions will be most effective at reducing the impacts of hydrometeorological hazards in a study area in the west of the Wupper Basin. Furthermore, as landscape features are highly influential in hydrology, the study sought to establish which sites within the landscape can deliver maximum results from green-blue interventions with a minimum of change to current land uses.
Region: Europe, peri-urban and rural, undulating, low mountainous landscapes
Methods: Literature findings on observed and projected climate data are summarised, and long-term rainfall data from the study area is analysed to confirm rainfall trends. A state-of-the-art review is conducted and summarised to form a toolbox of potential interventions. The most recent hazardous hydrometeorological event is analysed to inform the locational priorities of potential interventions. Landscape features that have the most influence on basin hydrology are identified from the literature. These sites are paired with green-blue interventions that are shown to have the highest potential impact on interception, infiltration, runoff and flooding. A series of spatial analyses are carried out to produce maps detailing locations and interventions with high potential to reduce the impact of hydrometeorological hazards in the study area. All of the evidence gathered from the literature analysis is combined in an implementation guide for green-blue interventions in the Wupper Basin.
Results: The hazards caused by the hydrometeorological extremes of flooding and drought are addressed or minimised through green-blue interventions that increase interception and infiltration and reduce runoff and flooding. Priority locations are identified as the riparian zone with slope ≤15%; hilltop, lower-slope and toe-slope positions; all locations with a slope ≥30%; and areas with a high topographic wetness index (TWI). A series of spatial analyses were carried out, yielding suggestions that include potential retention or detention areas and ponds, sites for revegetation, and potential locations for shelterbelts/hedgerows, buffer strips, conservation or strip tillage, reduced mowing intensity or frequency, and biochar additions. An implementation guide is created that summarises the green-blue interventions and landscape locations with the highest potential and describes the mechanisms involved in addressing the hydrometeorological hazards.
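The topographic wetness index used to prioritise locations above is commonly computed as TWI = ln(a / tan β), with a the specific upslope contributing area and β the local slope. A minimal sketch (function name and input values are illustrative, not from the thesis):

```python
import math

def topographic_wetness_index(upslope_area_m2, flow_width_m, slope_deg):
    """TWI = ln(a / tan(beta)), where a is the specific catchment area
    (upslope contributing area per unit contour width) and beta the local slope."""
    specific_area = upslope_area_m2 / flow_width_m
    slope_rad = math.radians(slope_deg)
    return math.log(specific_area / math.tan(slope_rad))

# Flat, convergent cells accumulate water: large area + gentle slope -> high TWI
print(topographic_wetness_index(50_000, 10, 2))   # gentle valley floor
print(topographic_wetness_index(500, 10, 30))     # steep upper slope
```

Cells with a large contributing area and gentle slope score high and are candidate retention or pond sites, matching the priority given to high-TWI areas in the results.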
Keywords: Green-blue interventions, hydrometeorological hazard reduction, Wupper Basin hydrology
For a customer, facing an empty shelf in a store can be frustrating. The store does not always realize that a product has been out of stock for a while, as the item is still listed as in stock in the inventory management system. To address this issue, a camera can be used to check for Out-of-Stock (OOS) situations.
This master thesis evaluates different model configurations of Artificial Neural Networks (ANNs) to determine which one best detects OOS situations in the store using images. To create a dataset, 2,712 photos were taken in six stores. The photos clearly show whether there is a gap on the shelf or whether the product is in stock. Based on the pre-trained VGG16 model from Keras, two fully connected layers were added, yielding 36 different ANN configurations that differ in their optimization method and activation function pairing. In total, 216 models were generated in this thesis to investigate the effects of three different optimization methods combined with twelve different activation function pairings. An almost balanced ratio of OOS and in-stock data was used to generate these models.
The evaluation of the generated OOS models shows that the FTRL optimization method achieved the least favorable results and is therefore not suitable for this application. Model configurations using the Adam or SGD optimization methods achieve much better results. Of the top six model configurations, five use the Adam optimization method and one uses SGD. They all achieved an accuracy of at least 93% and a recall of at least 91% for the OOS class.
As the ratio between OOS and in-stock data in the previously generated models did not correspond to reality, the in-stock images were augmented. Including the augmented images, new OOS models were generated for the top six model configurations. The results of these OOS models show no convergence, which suggests that more training epochs would lead to better results. However, the OOS model using the Adam optimization method with the Sigmoid and ReLU activation functions stands out positively: it achieved the best result, with an accuracy of 97.91% and a recall of 87.82% for the OOS class.
Overall, several OOS models have the potential to increase both store sales and customer satisfaction. In a future study, the OOS models should be deployed in a store to evaluate their performance under real conditions. The resulting insights can be used for continuous optimization of the models.
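Accuracy and per-class recall, the two metrics reported for the OOS models above, can be computed directly from predictions; a minimal sketch on illustrative labels (not the thesis data):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recall(y_true, y_pred, positive):
    """Fraction of true 'positive'-class samples the model actually found."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp / (tp + fn)

# "OOS" is the positive class; the labels below are invented for illustration
y_true = ["OOS", "OOS", "OOS", "in_stock", "in_stock", "OOS"]
y_pred = ["OOS", "in_stock", "OOS", "in_stock", "in_stock", "OOS"]
print(accuracy(y_true, y_pred))          # 5 of 6 correct
print(recall(y_true, y_pred, "OOS"))     # 3 of 4 OOS gaps detected
```

Recall on the OOS class is the business-critical metric here: a missed gap (false negative) means the shelf stays empty unnoticed.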
The demand for explainable and transparent models increases with the continued success of reinforcement learning. In this article, we explore the potential of generating shallow decision trees (DTs) as simple and transparent surrogate models for opaque deep reinforcement learning (DRL) agents. We investigate three algorithms for generating training data for axis-parallel and oblique DTs with the help of DRL agents (“oracles”) and evaluate these methods on classic control problems from OpenAI Gym. The results show that one of our newly developed algorithms, the iterative training, outperforms traditional sampling algorithms, resulting in well-performing DTs that often even surpass the oracle from which they were trained. Even higher dimensional problems can be solved with surprisingly shallow DTs. We discuss the advantages and disadvantages of different sampling methods and insights into the decision-making process made possible by the transparent nature of DTs. Our work contributes to the development of not only powerful but also explainable RL agents and highlights the potential of DTs as a simple and effective alternative to complex DRL models.
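As a toy illustration of the sampling idea described above (an oracle labels sampled states, and a shallow DT is fitted to mimic it), the sketch below uses a hand-rolled decision stump and a synthetic one-dimensional policy instead of a Gym environment and a DRL agent; all names are illustrative:

```python
import random

def oracle(state):
    """Stand-in for a trained DRL agent ("oracle"): a simple policy on a 1-D state."""
    return 1 if state > 0.0 else 0

def fit_stump(samples):
    """Fit a depth-1 axis-parallel decision tree (a stump) to (state, action)
    pairs by scanning the sampled states as candidate split thresholds."""
    best_threshold, best_errors = None, float("inf")
    for threshold, _ in samples:
        errors = sum((1 if s > threshold else 0) != a for s, a in samples)
        if errors < best_errors:
            best_threshold, best_errors = threshold, errors
    return best_threshold, best_errors

random.seed(0)
states = [random.uniform(-1, 1) for _ in range(200)]
samples = [(s, oracle(s)) for s in states]   # query the oracle for labels
threshold, errors = fit_stump(samples)
print(threshold, errors)
```

Even this minimal surrogate recovers the oracle's decision boundary exactly on the sampled states, which is the transparency payoff the article exploits: the learned threshold is directly inspectable, unlike the weights of a deep network.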
In child–robot interaction (cHRI) research, many studies pursue the goal of developing interactive systems that can be applied in everyday settings. For early education, the setting of a kindergarten is increasingly targeted. However, when cHRI and research are brought into a kindergarten, a range of ethical and related procedural aspects have to be considered and dealt with. While ethical models elaborated within other human–robot interaction settings, e.g., assisted living contexts, can provide some important indicators for relevant issues, we argue that it is important to start developing a systematic approach to identify and tackle those ethical issues which arise with cHRI in kindergarten settings on a more global level and to address the impact of the technology from a macroperspective beyond the effects on the individual. Based on our experience in conducting studies with children in general and on pedagogical considerations on the role of the institution of kindergarten in particular, in this paper we set out some relevant aspects that have barely been addressed explicitly in current cHRI research. Four areas are analyzed and key ethical issues are identified in each: (1) the institutional setting of a kindergarten, (2) children as a vulnerable group, (3) the caregivers' role, and (4) pedagogical concepts. With our considerations, we aim at (i) broadening the methodology of current studies within the area of cHRI, (ii) revalidating it based on our comprehensive empirical experience with research in kindergarten settings, in both laboratory and real-world contexts, and (iii) providing a framework for the development of a more systematic approach to address the ethical issues in cHRI research within kindergarten settings.
We consider a risk model in discrete time with dividends and capital injections. The goal is to maximise the value of a dividend strategy. We show that the optimal strategy is of barrier type. That is, all capital above a certain threshold is paid as dividend. A second problem adds tax to the dividends but an injection leads to an exemption from tax. We show that the value function fulfils a Bellman equation. As a special case, we consider the case of premia of size one. In this case we show that the optimal strategy is a two barrier strategy. That is, there is a barrier if a next dividend of size one can be paid without tax and a barrier if the next dividend of size one will be taxed. In both models, we illustrate the findings by de Finetti’s example.
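As a numerical illustration of the barrier idea, the sketch below runs value iteration for a simple discrete de Finetti model (a random-walk surplus; all parameters are illustrative and not taken from the article) and scans small barriers for the best one:

```python
def barrier_value(b, p=0.7, delta=0.95, x0=0, iters=4000):
    """Expected discounted dividends from surplus x0 under a barrier strategy
    in the discrete de Finetti model: the surplus moves +1 with probability p
    and -1 otherwise, ruin stops the process below 0, and any increment above
    the barrier b is immediately paid out as a dividend of size 1."""
    V = [0.0] * (b + 1)
    for _ in range(iters):
        V = [delta * (p * ((1.0 if x == b else 0.0) + V[min(x + 1, b)])
                      + (1 - p) * (V[x - 1] if x > 0 else 0.0))
             for x in range(b + 1)]
    return V[x0]

# Scan small barriers for the value-maximising one (Bellman fixed points)
values = {b: barrier_value(b) for b in range(6)}
best = max(values, key=values.get)
print(best, round(values[best], 3))
```

The scan exhibits the barrier structure the paper proves: the value first rises with the barrier (retaining a buffer delays ruin) and then falls (discounting penalises deferred dividends), so an interior barrier is optimal.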
In the last decade, the utilization of the waste by-product apple pomace has been extensively researched (owing to its difficult disposal), and it currently finds beneficial use in various industries, e.g., as a substrate for microbial growth or for the recovery of pectin, xyloglucan and polyphenols. In this research, apple juice was produced at pilot scale. Furthermore, apple pomace was employed as a substrate for the production of pectin, biofuel (pellets) and concentrated apple pomace extract. Extensive mass and heat balances were conducted to evaluate the feasibility of this approach at industrial scale. The produced pellets had characteristics very similar to wood pellets (net calorific value of 20.3 MJ/kg). Dried apple pomace contained 11.9% pectin. Fed-batch cultivation of baker's yeast with apple pomace extract demonstrated a potential for the partial substitution of molasses in industrial bioprocesses. This concept shows how a zero-discharge biorefinery process converts waste from apple juice production into three valuable products, enabling connections between different industries.
Changing our unsustainable linear water management pattern is necessary to face growing global water challenges. This article proposes an integrated framework to analyse and understand the role of different contextual conditions in the possible transition towards water circularity. Our framework combines a systematic multi-level perspective to explore the water system and the institutional work theory for technology legitimation. The framework consists of the following stages: (1) describing and understanding the water context, (2) assessment of the selected technologies’ circularity level, (3) assessment of the alternative circular technologies’ legitimacy, and (4) identification of the legitimation actions to support the upscale of alternative circular technologies. The practical applicability of the integrated assessment framework and its four assessment stages was demonstrated in the exploration of circular water technologies for the horticulture sector in Westland, the Netherlands. The results revealed the conditions that hinder or enable the legitimation of the circular water technologies, such as political environmentalism, trust in water governing authorities, and technical, financial, and knowledge capabilities.
Ten years after the journal's first publication, we take a closer look at the knowledge flows of the output of the journal Publications. We analyzed the papers, topics, their authors and countries to assess the development of scholarly communication within Publications. Our bibliometric analyses show the journal's research community, where this community's knowledge comes from, where it goes, and how diverse the community is in terms of internationality and multidisciplinarity. We compare these findings with the scope and topical goals the journal specifies. We aim to inform the editors and editorial board about the journal's development in order to advance the journal's role in scholarly communication. The results show that, regarding topical diversity and internationality, the journal has developed remarkably. Moreover, the journal tends towards the field of library and information science but strengthens its multidisciplinary status via its topics and author backgrounds.
Owing to the continuous development and media presence of artificial intelligence, increasing corporate security has taken on particular importance for companies. In the field of machine learning especially, applications that can support such measures are continuously emerging.
This thesis investigated whether a potential increase in corporate security can be achieved by deploying an object detection prototype based on a YOLOv5 algorithm. Example scenarios were defined, and the effectiveness of this algorithm in detecting and identifying objects with respect to the security requirements of a corporate environment was evaluated.
The research methodology comprised the development and construction of the prototype, which is based on a YOLOv5 algorithm and was trained on an object detection training dataset. The prototype was then implemented in a laboratory environment and tested for its ability to detect objects according to defined security requirements.
The implementation of such a prototype was able to help support security measures in companies, accelerate security responses, and enable more proactive approaches to threat prevention. Based on these results, further research and practical applications in the field of corporate security are conceivable.
The paper presents results of the modelling of heat transfer at film boiling of a liquid in a porous medium on a vertical heated wall bordering the porous medium. Such processes are observed in the cooling of high-temperature surfaces of heat pipes, microstructural radiators, etc. The heating condition at the wall was either a constant wall temperature or a constant heat flux. The outer boundary of the vapor film was in contact with moving or stationary liquid inside the porous medium. An analytical solution was obtained for the problem of fluid flow and heat transfer using the porous medium model in the Darcy–Brinkman and Darcy–Brinkman–Forchheimer approximations. It was shown that heat transfer at film boiling in a porous medium is less intensive than in the absence of a porous medium (free fluid flow) and decreases further with decreasing permeability of the porous medium. Significant differences were observed between the two models: 20% for small Darcy numbers (Da < 2) for the Darcy–Brinkman model, and 80% for the Darcy–Brinkman–Forchheimer model. In the Darcy–Brinkman model, depending on the interaction conditions at the vapor–liquid interface (no mechanical interaction, or stationary fluid), a sharp decrease in heat transfer was observed for Darcy numbers lower than five. The analytical predictions of the heat transfer coefficients qualitatively agreed with the data of Cheng and Verma (Int J Heat Mass Transf 24:1151–1160, 1981), though they yielded lower heat transfer coefficients for the conditions of constant wall temperature and constant wall heat flux.
The paper presents an analytical study of the main features of heat transfer in incompressible steady-state flow in a microconfusor, taking into account second-order slip boundary conditions. The second-order boundary conditions serve as a closure of the system of continuity, momentum transport, and energy differential equations. As a result, novel solutions were obtained for the velocity and temperature profiles, as well as for the friction coefficient and the Nusselt number. These solutions demonstrate that an increase in the Knudsen number leads to a decrease in the Nusselt number. It was shown that accounting for the second-order terms in the boundary conditions noticeably affects the fluid flow characteristics but does not influence the heat transfer characteristics. It was also revealed that flow slippage effects on heat transfer weaken with an increase in the Prandtl number.
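For reference, the second-order slip boundary condition mentioned above is commonly written in its general textbook form (the coefficients A1 and A2 are model-dependent and are not taken from the paper, and sign conventions vary between authors):

```latex
u_s - u_w \;=\; A_1 \,\mathrm{Kn}\, \left.\frac{\partial u}{\partial n}\right|_w
\;-\; A_2 \,\mathrm{Kn}^2 \left.\frac{\partial^2 u}{\partial n^2}\right|_w
```

Here u_s is the gas velocity at the wall, u_w the wall velocity, Kn the Knudsen number, and n the wall-normal coordinate; dropping the Kn^2 term recovers the classical first-order slip condition.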
Surrogate-based optimization, nature-inspired metaheuristics, and hybrid combinations have become state of the art in algorithm design for solving real-world optimization problems. Still, it is difficult for practitioners to get an overview that explains their advantages compared with the large number of other available methods in the field of optimization. Available taxonomies lack the embedding of current approaches in the larger context of this broad field. This article presents a taxonomy of the field, which explores and matches algorithm strategies by extracting similarities and differences in their search strategies. A particular focus lies on algorithms using surrogates, nature-inspired designs, and those created by automatic algorithm generation. The extracted features of algorithms, their main concepts, and search operators allow us to create a set of classification indicators to distinguish between a small number of classes. The features allow a deeper understanding of components of the search strategies and further indicate the close connections between the different algorithm designs. We present intuitive analogies to explain the basic principles of the search algorithms, particularly useful for novices in this research field. Furthermore, this taxonomy allows recommendations for the applicability of the corresponding algorithms.
To date, the establishment of high-titer stable viral packaging cells (VPCs) at large scale for gene therapeutic applications is very time- and cost-intensive. Here we report the establishment of three human suspension 293-F-derived ecotropic MLV-based VPCs. Classic stable transfection of an EGFP-expressing transfer vector resulted in a polyclonal VPC pool that facilitated cultivation in shake flasks at 100 mL volume and yielded high functional titers of more than 1 × 10^6 transducing units/mL (TU/mL). When the transfer vector was flanked by transposon terminal inverted repeats (TIRs) and co-transfected with a plasmid encoding the transposase, productivities could be slightly elevated to more than 3 × 10^6 TU/mL. In contrast, using mRNA encoding the transposase as a proof of concept, productivities were drastically improved by more than ten-fold, exceeding 5 × 10^7 TU/mL. In addition, these VPC pools were generated within only 3 weeks. The production volume was successfully scaled up to 500 mL employing a stirred-tank bioreactor (STR). We anticipate that the stable transposition of transfer vectors employing transposase transcripts will be of utility for the future establishment of high-yield VPCs producing pseudotype vector particles with a broader host tropism on a large scale.
Ground tire rubber (GTR) is a product obtained by grinding worn tire treads before retreading, or via the cryogenic or ambient-temperature milling of end-of-life tires (ELTs). The aim of this study is to evaluate whether calcium carbonate can be substituted by GTR and, if so, to what extent. Different types of ground tire rubber are incorporated in an EPDM (ethylene–propylene–diene rubber) model compound as partial or complete substitutes for calcium carbonate. The raw compounds and the vulcanizates are characterized to identify the limits. In general, it is apparent that increasing amounts of GTR and larger particles degrade the mechanical properties. The GTR also influences the vulcanization kinetics, reducing the scorch time by up to 50% and the vulcanization time by up to nearly 80%, which is significant for production processes. The compounds with one-third substitution with the smaller-particle-size GTR show mostly similar or even better properties than the reference.
Academic search systems aid users in finding information covering specific topics of scientific interest and have evolved from early catalog-based library systems to modern web-scale systems. However, evaluating the performance of the underlying retrieval approaches remains a challenge. An increasing amount of requirements for producing accurate retrieval results have to be considered, e.g., close integration of the system’s users. Due to these requirements, small to mid-size academic search systems cannot evaluate their retrieval system in-house. Evaluation infrastructures for shared tasks alleviate this situation. They allow researchers to experiment with retrieval approaches in specific search and recommendation scenarios without building their own infrastructure. In this paper, we elaborate on the benefits and shortcomings of four state-of-the-art evaluation infrastructures on search and recommendation tasks concerning the following requirements: support for online and offline evaluations, domain specificity of shared tasks, and reproducibility of experiments and results. In addition, we introduce an evaluation infrastructure concept design aiming at reducing the shortcomings in shared tasks for search and recommender systems.
Stable recombinant mammalian cells are of growing importance in pharmaceutical biotechnology production scenarios for biologics such as monoclonal antibodies, growth and blood factors, cytokines and subunit vaccines. However, the establishment of recombinant producer cells using classical stable transfection of plasmid DNA is hampered by low stable gene transfer efficiencies. Consequently, the subsequent selection of transgenic cells and the screening of clonal cell populations are time- and thus cost-intensive. To overcome these limitations, expression cassettes were embedded into transposon-derived donor vectors. Upon co-transfection with transposase-encoding constructs, elevated vector copy numbers stably integrated into the genomes of the host cells are readily achieved, which, under stringent selection pressure, facilitates the establishment of cell pools characterized by sustained, high-yield recombinant protein production. Here, we discuss some aspects of transposon vector technologies that render these vectors promising candidates for further utilization in the production of biologics.
This paper introduces CAAI, a novel cognitive architecture for artificial intelligence in cyber-physical production systems. The goal of the architecture is to reduce the implementation effort for the usage of artificial intelligence algorithms. The core of CAAI is a cognitive module that processes the user's declarative goals, selects suitable models and algorithms, and creates a configuration for the execution of a processing pipeline on a big data platform. Constant observation and evaluation against performance criteria assess the performance of pipelines for many different use cases. Based on these evaluations, the pipelines are automatically adapted if necessary. The modular design with well-defined interfaces enables the reusability and extensibility of pipeline components. A big data platform implements this modular design, supported by technologies such as Docker, Kubernetes, and Kafka for the virtualization and orchestration of the individual components and their communication. The implementation of the architecture is evaluated using a real-world use case. The prototypic implementation is accessible on GitHub and contains a demonstration.
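As a toy illustration of the goal-to-pipeline selection performed by the cognitive module described above (all goal names, algorithm names, and scores below are invented for illustration; CAAI's actual interfaces differ):

```python
# Map a declarative user goal to candidate algorithms, evaluate each against
# a performance criterion, and keep the best configuration.
ALGORITHMS = {
    "anomaly_detection": ["isolation_forest", "autoencoder"],
    "quality_prediction": ["random_forest", "linear_model"],
}

def select_pipeline(goal, evaluate):
    """Pick the candidate algorithm for `goal` with the best evaluation score."""
    best_algo, best_score = None, float("-inf")
    for algo in ALGORITHMS[goal]:
        score = evaluate(algo)  # e.g. cross-validated accuracy on the platform
        if score > best_score:
            best_algo, best_score = algo, score
    return best_algo, best_score

algo, score = select_pipeline(
    "quality_prediction",
    evaluate=lambda a: {"random_forest": 0.92, "linear_model": 0.88}[a],
)
print(algo, score)
```

In the real architecture this selection runs continuously, so a pipeline that degrades on new data is automatically replaced by a better-scoring alternative.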
Editorial
(2020)
Reducing the carbon emissions from hotels on non-interconnected islands (NII) is essential in the context of a low carbon future for the Mediterranean region. Maritime tourism is the major source of income for Greece and many other countries in the region, as well as hot-temperate and tropical regions worldwide. Like many NIIs, Rhodes attracts a high influx of tourists every summer, doubling the island’s energy demand and, given the high proportion of fossil fuels in the Rhodian energy supply, increasing carbon emissions. Using the theoretical framework ‘FINE’, this paper presents the optimisation of a medium-sized hotel’s energy system with the aim of reducing both cost and carbon emissions. By introducing a Photovoltaic (PV) net metering system, it was found that the carbon emissions associated with an NII hotel’s energy system could be reduced by 31% at an optimised cost. It is suggested that large-scale deployment of PV or alternative renewable energy sources (RES) in NII hotels could significantly reduce carbon emissions associated with the accommodation sector in Greece and help mitigate climate change.
Steer-by-wire systems represent a key technology for highly automated and autonomous driving. In this context, robust steering control is a fundamental precondition for automated vehicle lateral control. However, there is a need for improvement, because degrees of freedom, signal delays, and nonlinear characteristics of the plant are not considered in the design models of current steering controls. To design a highly robust steering control, suitable optimal models of a steer-by-wire system are required. Therefore, this paper presents an innovative nonlinear detail model of a steer-by-wire system. The detail model represents all characteristics of a real steer-by-wire system. In a dominance analysis of the detail model, all dominant characteristics of a steer-by-wire system, including parameter dependencies, are identified. Through model reduction, a reduced model of the steer-by-wire system is then developed that can be used for a subsequent robust control design. Furthermore, this paper compares the steer-by-wire system with a conventional electromechanical power steering system and shows similarities as well as differences.
Positive Impact of Red Soil on Albedo and the Annual Yield of Bifacial Photovoltaic Systems in Ghana
(2023)
The annual yield of bifacial photovoltaic systems is highly dependent on the albedo of the underlying soil. There are currently no published data about the albedo of red soil in western Africa. In this study, the impact of the albedo of red soil in Ghana on the energy yield of bifacial photovoltaic systems is analysed. A bifacial photovoltaic simulation model is created by combining the optical view factor matrix with an electrical output simulation. For an exact simulation, the albedo of red soil at three different locations in Ghana is measured for the first time. The average albedo of each red soil is determined, as well as the measurement span including instrumentation uncertainty; values between 0.175 and 0.335 were measured. Considering these data, a state-of-the-art bifacial photovoltaic system with modules of 19.8% average efficiency in northern Ghana can achieve an annual energy yield of 508.8 kWh/m2 and a bifacial gain of up to 18.3% in comparison with monofacial photovoltaic panels. To summarise, red soil in two out of three locations in Ghana shows higher albedo values than most natural ground surfaces and therefore positively impacts the annual yield of bifacial photovoltaic systems.
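The bifacial gain quoted above is the relative annual-yield increase over a monofacial reference; a one-line sketch (the monofacial yield of 430.1 kWh/m2 below is back-calculated for illustration, not a value measured in the study):

```python
def bifacial_gain(bifacial_yield, monofacial_yield):
    """Relative annual-yield increase of a bifacial system over a
    monofacial reference, in percent."""
    return 100.0 * (bifacial_yield - monofacial_yield) / monofacial_yield

# A 508.8 kWh/m2 bifacial yield against a hypothetical 430.1 kWh/m2
# monofacial reference reproduces roughly the 18.3% gain reported above.
print(round(bifacial_gain(508.8, 430.1), 1))
```

Because the rear-side contribution scales with ground reflectance, a higher-albedo soil raises the numerator directly, which is why red soil improves the gain.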
Maximising Distribution Grid Utilisation by Optimising E-Car Charging Using Smart Meter Gateway Data
(2023)
The transition towards climate neutrality will result in an increase in electric vehicles, as well as other electric loads, leading to higher loads on electrical distribution grids. This paper presents an optimisation algorithm that enables the integration of more loads into distribution grid infrastructure using information from smart meters and/or smart meter gateways. To achieve this, a mathematical programming formulation was developed and implemented. The algorithm determines the optimal charging schedule for all electric vehicles connected to the distribution grid, taking into account various criteria to avoid violating physical grid limitations and ensuring non-discriminatory charging of all electric vehicles on the grid while also optimising grid operation. Additionally, the expandability of the infrastructure and fail-safe operation are considered through the decentralisation of all components. Various scenarios are modelled and evaluated in a simulation environment. The results demonstrate that the developed optimisation algorithm allows for higher transformer loads compared to a P(U) control approach, without causing grid overload as observed in scenarios without optimisation or P(U) control.
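A minimal sketch of the scheduling idea: greedy valley-filling of charging energy under a transformer limit. This is an illustrative stand-in for the paper's mathematical programming formulation, with invented base loads, limits, and charging rates:

```python
def schedule_charging(base_load, limit, demands, rate=1.0):
    """Assign each vehicle's energy demand to the time slots with the most
    spare transformer capacity, never exceeding the grid limit.
    base_load: load per time slot; limit: transformer limit;
    demands: per-vehicle energy; rate: max charge per slot."""
    load = list(base_load)
    plans = []
    for demand in demands:
        plan = [0.0] * len(load)
        while demand > 1e-9:
            slot = min(range(len(load)), key=lambda t: load[t])  # emptiest slot
            spare = min(rate, limit - load[slot], demand)
            if spare <= 1e-9:
                break  # grid fully utilised; remaining demand is unserved
            plan[slot] += spare
            load[slot] += spare
            demand -= spare
        plans.append(plan)
    return plans, load

plans, load = schedule_charging([2.0, 5.0, 3.0, 1.0], limit=6.0, demands=[3.0, 2.0])
print(load)
```

The greedy rule flattens the load curve by filling the valleys first; the actual algorithm additionally enforces non-discriminatory allocation across vehicles and other physical grid constraints.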
Start-ups operate through dynamic seed, start-up and growth stages in an uncertain and volatile environment. An analysis of 59 start-ups shows that these companies have special characteristics with regard to the organisational characteristics of employer attractiveness and flexible work organisation. The effects of these two organisational characteristics on an agile workforce are substantiated by a literature study. The study concludes with a theoretical-conceptual model that illustrates the factors influencing employer attractiveness and flexible work organisation. The results of the survey are brought together with the current state of the literature, and an approach to organisational agility is developed that takes deregulation tendencies into account.
Even though researchers increasingly acknowledge the dark side of customer participation (i.e., behavioral customer engagement), particularly in professional services whose high cognitive demands cause customer participation stress (i.e., a negative psychological state resulting from the customer's overextension by required participation efforts), insights into how firms can effectively mitigate customer participation stress remain limited. Building on transactional stress theory, we investigate whether customers can effectively cope with the expected cognitive demands of professional services. Moreover, by introducing an adapted coping construct (i.e., coping support), we examine whether employees can provide coping support to help decrease customer participation stress. The findings of a time-lagged field study with customers of a large German bank (N = 117) suggest that customer coping before the encounter cannot mitigate the effect of anticipated cognitive demands on customer participation stress. Instead, the results of both the field study and a follow-up experimental study (N = 218) show that the type of employee coping support offered during service encounters is crucial. While focusing on action coping support is not ideal in situations with high cognitive demands, firms should advise their professional service employees to offer emotional coping support to attenuate the unfavorable effect of cognitive demands on customer participation stress.
To date, the establishment of high-titer stable viral packaging cells (VPCs) at large scale for gene therapeutic applications is very time- and cost-intensive. Here we report the establishment of three human suspension 293-F-derived ecotropic MLV-based VPCs. The classic stable transfection of an EGFP-expressing transfer vector resulted in a polyclonal VPC pool that facilitated cultivation in shake flasks of 100 mL volumes and yielded high functional titers of more than 1 × 10⁶ transducing units/mL (TU/mL). When the transfer vector was flanked by transposon terminal inverted repeats (TIRs) and co-transfected with a plasmid encoding the transposase, productivities could be slightly elevated to more than 3 × 10⁶ TU/mL. In contrast, using mRNA encoding the transposase as a proof of concept, productivities were drastically improved by more than ten-fold, exceeding 5 × 10⁷ TU/mL. In addition, these VPC pools were generated within only 3 weeks. The production volume was successfully scaled up to 500 mL employing a stirred-tank bioreactor (STR). We anticipate that the stable transposition of transfer vectors employing transposase transcripts will be of utility for the future establishment of high-yield VPCs producing pseudotype vector particles with a broader host tropism on a large scale.
In recent years, there have been numerous technical innovations, such as CGM systems or insulin pumps, that have made life easier for people with type 1 diabetes. However, this also means that more and more information is available. The aim of the present study is to find out more about the daily handling of this information. The following research question was asked: What information do people with type 1 diabetes use? To answer it, a quantitative online survey of people with type 1 diabetes was conducted by Prof. Dr. Matthias Fank at the Technical University of Cologne. The online survey mainly consisted of 25 closed questions, which were asked on a scale from 0 to 10. The responses of 1,025 people aged at least 18 years were included in the evaluation. The most important information for type 1 diabetics is the current glucose value: 67.5% ranked it first. Current glucose levels are provided by CGM systems, which are used by 94.2% of people with type 1 diabetes. Quarterly visits to the diabetologist are important and provide important information; 30.8% "completely" agree with this statement on a scale from 0 to 10. Only 2.2% of people with type 1 diabetes are satisfied with their current diabetes management apps. There is a desire for a manufacturer-independent app: the strongest agreement, with a value of 10, was chosen by almost a quarter (24.6%) of the people with type 1 diabetes. The study provides an insight into diabetes therapy and shows the need for action.
Austria, along with the European Union, is committed to the net-zero climate goal. This requires all sectors to be decarbonized, and hydrogen plays a vital role here, as stated in the national hydrogen strategy. A report commissioned by the Austrian government predicts a minimum hydrogen demand of 16 TWh per year in Austria in 2040. Besides hydrogen imports, domestic production can ensure supply. Hence, this study analyses the levelized cost of hydrogen for an off-grid production plant comprising a proton exchange membrane electrolyzer, wind power, and solar photovoltaics in Austria. In the first step, the capacity factors of the renewable electricity sources are determined by conducting a geographic information system analysis. Secondly, the levelized cost of electricity for wind power and solar photovoltaics plants in Austria is calculated. Thirdly, the most cost-efficient portfolio of wind power and solar photovoltaics plants is determined using electricity generation profiles with a 10-min granularity. The modelled system variants differ in location, capacity factors of the renewable electricity sources, and full load hours of the electrolyzer. Finally, selected variables are tested for their sensitivities. With the applied model, the hydrogen production cost for decentralized production plants can be calculated for any specific location. The levelized cost of hydrogen estimates range from 3.08 EUR/kg to 13.12 EUR/kg of hydrogen, and the costs were found to be most sensitive to the capacity factors of the renewable electricity sources and the full load hours of the electrolyzer. The novelty of the paper stems from the applied model that calculates the levelized cost of renewable hydrogen in an off-grid hydrogen production system. The model finds a cost-efficient portfolio of directly coupled wind power and solar photovoltaics systems for 80 different variants in an Austria-specific context.
In this work, we propose a novel data-driven approach to recover missing or corrupted motion capture data, either in the form of 3D skeleton joints or 3D marker trajectories. We construct a knowledge base of prior motion data that makes it possible to infer missing or corrupted information. We then build a kd-tree in parallel on the GPU for fast search and retrieval of nearest neighbors from this knowledge base. We exploit the concept of histograms to organize the data and use an off-the-shelf radix sort algorithm to sort the keys within a single GPU processor. We query the missing joints or markers and fetch a fixed number of nearest neighbors for the given input query motion. We employ an objective function with multiple error terms that substantially recovers 3D joint or marker trajectories in parallel on the GPU. We perform comprehensive experiments to evaluate our approach quantitatively and qualitatively on publicly available motion capture datasets, namely CMU and HDM05. The results show that recovery works best for boxing, jumptwist, run, martial arts, salsa, and acrobatic motion sequences, while kicking and jumping sequences result in slightly larger errors. On average, however, our approach achieves outstanding results. Generally, it outperforms all competing state-of-the-art methods in most test cases with different action sequences, producing reliable results with minimal errors and without any user interaction.
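The retrieval-and-aggregation step described above can be sketched in a few lines. This is a simplified illustration only: a brute-force search stands in for the paper's parallel GPU kd-tree, and a plain average of the retrieved neighbors stands in for the multi-term objective function; all names are illustrative.

```python
import numpy as np

def recover_missing_joints(knowledge_base, query, missing_idx, k=3):
    """Fill missing coordinates in `query` by averaging the k nearest
    poses from the knowledge base. Matching uses only the observed
    (non-missing) dimensions; a brute-force search stands in for the
    paper's GPU kd-tree."""
    observed = [i for i in range(query.shape[0]) if i not in missing_idx]
    # Euclidean distance in the observed subspace
    dists = np.linalg.norm(knowledge_base[:, observed] - query[observed], axis=1)
    nn = np.argsort(dists)[:k]  # indices of the k nearest poses
    recovered = query.copy()
    recovered[missing_idx] = knowledge_base[nn][:, missing_idx].mean(axis=0)
    return recovered
```

In the actual approach, this search runs in parallel on the GPU over histogram-organized keys, which is what makes the recovery fast enough for long trajectories.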
As the population ages, the demand for care for older adults is increasing. To maintain their independence and autonomy, even with declining health, assistive technologies such as connected medical devices or social robots can be useful. In previous work, we introduced a novel health monitoring system that combines commercially available products with apps designed specifically for older adults. The system is intended for the long-term collection of subjective and objective health data. In this work, we present an exploratory user experience (UX) and usability study we conducted with older adults as the target group of the system and with younger expert users who tested our system. All participants interacted with a social robot conducting a health assessment and tested sensing devices and an app for data visualization. The UX and usability of the individual components of the system were rated highly in questionnaires in all sessions. All participants also said they would use such a system in their everyday lives, demonstrating the potential of these systems for supporting users in managing their own health. Finally, we found factors such as previous experience with social robots and technological expertise to have an influence on the reported UX of the users.
One-step preparation of bilayered films from kraft lignin and cellulose acetate to mimic tree bark
(2020)
This contribution presents the development of a dry-cast method for the one-step preparation of bio-based films from wood polymers that mimic the bilayered structure of tree bark, the natural protective layer of the tree. In a simplified view, natural bark can be considered as the superposition of an external homogeneous and non-porous layer (outer bark) and a porous substructure layer (inner bark). This work is a first step for the future development of bio-based biomimetic wood coatings. The film had a bark-like appearance and its total density, bulk density and porosity were similar to values measured in natural bark. Furthermore, the structural characteristics of the studied film, namely specific surface area (BET) and pore size distribution, as well as the performance of the water adsorption ability were investigated and discussed.
High-quality rendering of spatial sound fields in real-time is becoming increasingly important with the steadily growing interest in virtual and augmented reality technologies. Typically, a spherical microphone array (SMA) is used to capture a spatial sound field. The captured sound field can be reproduced over headphones in real-time using binaural rendering, virtually placing a single listener in the sound field. Common methods for binaural rendering first spatially encode the sound field by transforming it to the spherical harmonics domain and then decode the sound field binaurally by combining it with head-related transfer functions (HRTFs). However, these rendering methods are computationally demanding, especially for high-order SMAs, and require implementing quite sophisticated real-time signal processing. This paper presents a computationally more efficient method for real-time binaural rendering of SMA signals by linear filtering. The proposed method allows representing any common rendering chain as a set of precomputed finite impulse response filters, which are then applied to the SMA signals in real-time using fast convolution to produce the binaural signals. Results of the technical evaluation show that the presented approach is equivalent to conventional rendering methods while being computationally less demanding and easier to implement using any real-time convolution system. However, the lower computational complexity goes along with lower flexibility. On the one hand, encoding and decoding are no longer decoupled, and on the other hand, sound field transformations in the SH domain can no longer be performed. Consequently, in the proposed method, a filter set must be precomputed and stored for each possible head orientation of the listener, leading to higher memory requirements than the conventional methods. 
As such, the approach is particularly well suited for efficient real-time binaural rendering of SMA signals in a fixed setup where usually a limited range of head orientations is sufficient, such as live concert streaming or VR teleconferencing.
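The core of the proposed rendering chain, applying one precomputed FIR filter pair per microphone channel and summing the filtered channels into the two ear signals, reduces to a multichannel convolution. A minimal sketch follows; plain `np.convolve` stands in for the partitioned fast convolution a real-time system would use, and the array shapes are assumptions.

```python
import numpy as np

def binaural_render(sma_signals, filters_l, filters_r):
    """Binaural rendering by linear filtering: each SMA channel is
    convolved with its precomputed left/right FIR filter, and the
    filtered channels are summed into the two ear signals.
    sma_signals: (channels, samples); filters_*: (channels, taps)."""
    left = sum(np.convolve(x, h) for x, h in zip(sma_signals, filters_l))
    right = sum(np.convolve(x, h) for x, h in zip(sma_signals, filters_r))
    return left, right
```

One such filter set is needed per supported head orientation, which is the memory trade-off discussed above.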
3D printing is capable of providing dose individualization for pediatric medicines and translating the precision medicine approach into practical application. In pediatrics, dose individualization and preparation of small dosage forms is a requirement for successful therapy, which is frequently not possible due to the lack of suitable dosage forms. For precision medicine, individual characteristics of patients are considered for the selection of the best possible API in the most suitable dose with the most effective release profile to improve therapeutic outcome. 3D printing is inherently suitable for manufacturing of individualized medicines with varying dosages, sizes, release profiles and drug combinations in small batch sizes, which cannot be manufactured with traditional technologies. However, understanding of critical quality attributes and process parameters still needs to be significantly improved for this new technology. To ensure health and safety of patients, cleaning and process validation need to be established. Additionally, adequate analytical methods will be required for the in-process control of intermediates, regarding their printability, as well as for control of the final 3D printed tablets, considering any risks of this new technology. The PolyPrint consortium is actively working on developing novel polymers for fused deposition modeling (FDM) 3D printing, filament formulation and manufacturing development as well as optimization of the printing process, and the design of a GMP-capable FDM 3D printer. In this manuscript, the consortium shares its views on quality aspects and measures for 3D printing from drug-loaded filaments, including formulation development, the printing process, and the printed dosage forms. Additionally, engineering approaches for quality assurance during the printing process and for the final dosage form are presented together with considerations for a GMP-capable printer design.
We study p-adic L-functions Lp(s, χ) for Dirichlet characters χ. We show that Lp(s, χ) has a Dirichlet series expansion for each regularization parameter c that is prime to p and the conductor of χ. The expansion is proved by transforming a known formula for p-adic L-functions and by controlling the limiting behavior. A finite number of Euler factors can be factored out in a natural manner from the p-adic Dirichlet series. We also provide an alternative proof of the expansion using p-adic measures and give an explicit formula for the values of the regularized Bernoulli distribution. The result is particularly simple for c = 2, where we obtain a Dirichlet series expansion that is similar to the complex case.
The publish or perish culture of scholarly communication results in quality and relevance being subordinate to quantity. Scientific events such as conferences play an important role in scholarly communication and knowledge exchange. Researchers in many fields, such as computer science, often need to search for events to publish their research results, establish connections for collaborations with other researchers and stay up to date with recent works. Researchers need to have a meta-research understanding of the quality of scientific events to publish in high-quality venues. However, there are many diverse and complex criteria to be explored for the evaluation of events. Thus, finding events with quality-related criteria becomes a time-consuming task for researchers and often results in an experience-based subjective evaluation. OpenResearch.org is a crowd-sourcing platform that provides features to explore previous and upcoming events of computer science, based on a knowledge graph. In this paper, we devise an ontology representing scientific event metadata. Furthermore, we introduce an analytical study of the evolution of computer science events leveraging the OpenResearch.org knowledge graph. We identify common characteristics of these events, formalize them, and combine them as a group of metrics. These metrics can be used by potential authors to identify high-quality events. On top of the improved ontology, we analyzed the metadata of renowned conferences in various computer science communities, such as VLDB, ISWC, ESWC, WIMS, and SEMANTiCS, in order to inspect their potential as event metrics.
This paper introduces a Business Cycle Indicator to compile a transparent and reliable chronology of past business cycle turning points for Germany. The indicator is derived by applying the statistical method of Principal Component Analysis to information from 20 economic time series. In this way, the Business Cycle Indicator captures the development of broader economic activity and has several advantages over a business cycle assessment based on quarterly series of Gross Domestic Product.
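The construction described, a composite indicator formed as the first principal component of a set of standardized economic series, can be sketched as follows. This is illustrative only: series selection, detrending, and the exact PCA variant used in the paper are not reproduced here.

```python
import numpy as np

def business_cycle_indicator(series):
    """First principal component of standardized time series as a
    composite indicator. `series` has rows = time, cols = indicators."""
    # Standardize each series to zero mean and unit variance
    X = (series - series.mean(axis=0)) / series.std(axis=0)
    # SVD of the standardized data; the first right-singular vector
    # holds the loadings of the first principal component
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]
```

Note that the sign of a principal component is arbitrary; in practice the indicator would be oriented so that expansions correspond to positive values.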
In Latin America and the Caribbean, river restoration projects are increasing, but many lack strategic planning and monitoring. We tested the applicability of a rapid visual social–ecological stream assessment method for restoration planning, complemented by a citizen survey on perceptions and uses of blue and green infrastructure. We applied the method at three urban streams in Jarabacoa (Dominican Republic) to identify and prioritize preferred areas for nature-based solutions. The method provides spatially explicit information for strategic river restoration planning, and its efficiency makes it suitable for use in data-poor contexts. It identifies well-preserved, moderately altered, and critically impaired areas regarding their hydromorphological and socio-cultural conditions, as well as demands on green and blue infrastructure. The transferability of the method can be improved by defining reference states for assessing the hydromorphology of tropical rivers, refining socio-cultural parameters to better address river services and widespread urban challenges, and balancing trade-offs between ecological and social restoration goals.
Competencies in the field of databases are a compulsory part of computer science. However, the available textbooks, lecture formats, and tools can often only be integrated into one's own teaching to a limited extent. In this essay, we describe our experiences in using (freely) available digital content and developing our own for introductory database courses. Student preferences are determined by means of usage analyses and surveys. We set out requirements for how the often elaborately produced digital materials can be integrated by teachers into their teaching and learning environments. As a constructive answer to this challenge, the EILD concept for developing content for teaching databases is presented. The content is intended to be usable in a wide variety of learning scenarios and to be freely available as OER (open educational resources) under a Creative Commons (CC) license.
Ten female and five male participants (age range 28–50 years) were recruited at esoteric fairs or via esoteric chatrooms. In a guided face-to-face interview, they reported origins and contents of their beliefs in e.g. esoteric practices, supernatural beings, rebirthing, channeling. Transcripts of the tape-recorded reports were subjected to a qualitative analysis. Exhaustive categorization of the narratives’ content revealed that paranormal beliefs were functional with regard to two fundamental motives – striving for mastery and valuing me and mine (striving for a positive evaluation of the self). Moreover, paranormal beliefs paved the way for goal-setting and leading a meaningful life but, on the negative side, could also result in social exclusion. Results are discussed with reference to the adaptive value of paranormal beliefs.
The series "Basic Knowledge of Sustainability" compiles fundamental facts on various sustainability topics, e.g. climate change, waste management, and social justice. The aim is to provide basic knowledge that serves as a starting point for more in-depth analyses and as a fact-based foundation for forming one's own opinion. This essay covers the measurement, causes, and consequences of climate change as well as the national and international strategies and instruments for addressing the climate problem. First, fundamental information on the concepts of climate, climate change, and climate systems is presented; in particular, the measurement and assessment of long-term temperature changes is critically examined. After an overview of the natural and anthropogenic causes of climate change, the global temperature targets and the remaining CO2 budgets derived from them are presented and analyzed with regard to their feasibility. The final part covers the strategies and instruments for tackling climate change worldwide and in Germany.
The paper represents an analysis of convective instability in a vertical cylindrical porous microchannel performed using the Galerkin method. The dependence of the critical Rayleigh number on the Darcy, Knudsen, and Prandtl numbers, as well as on the ratio of the thermal conductivities of the fluid and the wall, was obtained. It was shown that a decrease in permeability of the porous medium (in other words, increase in its porosity) causes an increase in flow stability. This effect is substantially nonlinear. Under the condition Da > 0.1, the effect of the porosity on the critical Rayleigh number practically vanishes. Strengthening of the slippage effects leads to an increase in the instability of the entire system. The slippage effect on the critical Rayleigh number is nonlinear. The level of nonlinearity depends on the Prandtl number. With an increase in the Prandtl number, the effect of slippage on the onset of convection weakens. With an increase in the ratio of the thermal conductivities of the fluid and the wall, the influence of the Prandtl number decreases. At high values of the Prandtl numbers (Pr > 10), its influence practically vanishes.
This paper presents the cognitive module of the Cognitive Architecture for Artificial Intelligence (CAAI) in cyber-physical production systems (CPPS). The goal of this architecture is to reduce the implementation effort of artificial intelligence (AI) algorithms in CPPS. Declarative user goals and the provided algorithm knowledge base allow dynamic pipeline orchestration and configuration. A big data platform (BDP) instantiates the pipelines and monitors the CPPS performance for further evaluation through the cognitive module. Thus, the cognitive module is able to select feasible and robust configurations for process pipelines in varying use cases. Furthermore, it automatically adapts the models and algorithms based on model quality and resource consumption. The cognitive module also instantiates additional pipelines to evaluate algorithms from different classes on test functions. CAAI relies on well-defined interfaces to enable the integration of additional modules and reduce implementation effort. Finally, an implementation that uses Docker and Kubernetes for the virtualization and orchestration of the individual modules and Kafka as the messaging technology for module communication is used to evaluate a real-world use case.
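The goal-driven selection step, matching declarative user goals against an algorithm knowledge base under resource constraints, can be illustrated with a toy selector. This is not the actual CAAI implementation; the dictionary schema, field names, and scoring rule are invented for illustration.

```python
def select_pipeline(goal, knowledge_base, budget):
    """Pick the algorithm whose declared capabilities match the user
    goal and whose resource estimate fits the budget; among feasible
    candidates, prefer the best reported model quality."""
    feasible = [a for a in knowledge_base
                if goal in a["solves"] and a["cost"] <= budget]
    # Return the highest-quality feasible candidate, or None if none fits
    return max(feasible, key=lambda a: a["quality"], default=None)
```

In CAAI, this decision is additionally informed by monitoring data from the big data platform, so the selection can be revised as model quality or resource consumption changes.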
During spaceflight, humans experience a variety of physiological changes due to deviations from familiar earth conditions. Specifically, the lack of gravity is responsible for many effects observed in returning astronauts. These impairments can include structural as well as functional changes of the brain and a decline in cognitive performance. However, the underlying physiological mechanisms remain elusive. Alterations in neuronal activity play a central role in mental disorders and altered neuronal transmission may also lead to diminished human performance in space. Thus, understanding the influence of altered gravity at the cellular and network level is of high importance. Previous electrophysiological experiments using patch clamp techniques and calcium indicators have shown that neuronal activity is influenced by altered gravity. By using multi-electrode array (MEA) technology, we advanced the electrophysiological investigation covering single-cell to network level responses during exposure to decreased (micro-) or increased (hyper-) gravity conditions. We continuously recorded in real-time the spontaneous activity of human induced pluripotent stem cell (hiPSC)-derived neural networks in vitro. The MEA device was integrated into a custom-built environmental chamber to expose the system with neuronal cultures to up to 6 g of hypergravity on the Short-Arm Human Centrifuge at the DLR Cologne, Germany. The flexibility of the experimental hardware set-up facilitated additional MEA electrophysiology experiments under 4.7 s of high-quality microgravity (10⁻⁶ to 10⁻⁵ g) in the Bremen drop tower, Germany. Hypergravity led to significant changes in activity. During the microgravity phase, the mean action potential frequency across the neural networks was significantly enhanced, whereas different subgroups of neurons showed distinct behaviors, such as increased or decreased firing activity.
Our data clearly demonstrate that gravity as an environmental stimulus triggers changes in neuronal activity. Neuronal networks especially reacted to acute changes in mechanical loading (hypergravity) or de-loading (microgravity). The current study clearly shows the gravity-dependent response of neuronal networks endorsing the importance of further investigations of neuronal activity and its adaptive responses to micro- and hypergravity. Our approach provided the basis for the identification of responsible mechanisms and the development of countermeasures with potential implications on manned space missions.
The European heating sector is currently heavily dominated by fossil fuels. Composting is a naturally occurring process in which heat is liberated from the composting substrate at a higher rate than the process needs to support itself. This difference could be harnessed for low-heat applications such as residential consumption, alleviating some of the impacts fossil fuel emissions represent. In this study, the composting heat recovery reported in the literature was compared to the energy demand for space and water heating in four European countries. A review of potential heat production from waste representative of the residential sector was performed. We found that the theoretically recoverable composting heat does not significantly reduce the need for district heating. However, it can significantly reduce the energy demand for water heating, being able to supply countries such as Greece with between 36% and 100% of the yearly hot water demand, or 12% to 53% of the yearly hot water demand of countries such as Switzerland, depending on the efficiency of heat recovery.
The Production of Isophorone
(2023)
Isophorone is a technically important compound used as a high-boiling-point solvent for coatings, adhesives, etc., and it is used as a starting material for various valuable compounds, including isophorone diisocyanate, a precursor for polyurethanes. For over 80 years, isophorone has been synthesized via base-catalyzed self-condensation of acetone. This reaction has a complex reaction mechanism with numerous possible reaction steps including the formation of isophorone, triacetone dialcohol, and ketonic resins. This review provides an overview of the different production processes of isophorone in liquid- and vapor-phase and reviews the literature-reported selectivity toward isophorone achieved using different reaction parameters and catalysts.
The annual yield of bifacial photovoltaic systems is highly dependent on the albedo of the underlying soil. There are currently no published data about the albedo of red soil in western Africa. In this study, the impact of the albedo of red soil in Ghana on the energy yield of bifacial photovoltaic systems is analysed. A bifacial photovoltaic simulation model is created by combining the optical view factor matrix with an electrical output simulation. For an exact simulation, the albedo of red soil at three different locations in Ghana is measured for the first time. The average albedo of each red soil is determined, as well as the measurement span including instrumentation uncertainty; values between 0.175 and 0.335 were measured. Considering these data, a state-of-the-art bifacial photovoltaic system with modules averaging 19.8% efficiency in northern Ghana can achieve an annual energy yield of 508.8 kWh/m2 and a bifacial gain of up to 18.3% in comparison with monofacial photovoltaic panels. To summarise, red soil in two out of three locations in Ghana shows higher albedo values than most natural ground surfaces and therefore positively impacts the annual yield of bifacial photovoltaic systems.
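The bifacial gain quoted above is a relative yield comparison against a monofacial reference system. A minimal sketch (the example numbers are illustrative, not the study's measurements):

```python
def bifacial_gain(yield_bifacial, yield_monofacial):
    """Relative annual energy gain of a bifacial PV system over a
    monofacial reference system (e.g. 0.183 means +18.3%)."""
    return (yield_bifacial - yield_monofacial) / yield_monofacial
```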
Today, the Internet and social networks are an integral part of our everyday lives. In the communication age, companies use these networks to communicate with users on the respective platforms. The aim of this bachelor's thesis is to examine to what extent an extension can be developed to achieve these goals.
As part of this work, a prototype is implemented to enable the development of a multichannel social media marketing tool based on Atlassian products.
The thesis first gives an overview of the most important fundamentals. Then the requirements for the prototype are gathered and evaluated in an analysis, and a market analysis is conducted. This is followed by the design and implementation of the prototype, complemented by a usability test.
This is the fifth time that TH Köln has conducted this study examining the local XR industry on behalf of Mediencluster NRW GmbH, a subsidiary of Film- und Medienstiftung NRW. Aside from the two surveys on the North Rhine-Westphalian sector, there have now been three studies on the nationwide XR (extended or cross reality) sector in Germany. By this we mean all companies that create products and services in the field of virtual, mixed or augmented reality (but not firms that merely employ XR as users).