Vertical indoor farming (VIF) with hydroponics offers a promising perspective for sustainable food production. Intelligent control of VIF system components plays a key role in reducing operating costs and increasing crop yields. Modern machine vision (MV) systems use deep learning (DL) in combination with camera systems for various tasks in agriculture, such as disease and nutrient-deficiency detection, and flower and fruit identification and classification for pollination and harvesting. This study demonstrates the applicability of MV technology with DL modelling to detect the growth stages of chilli plants using YOLOv8 networks. The influence of different bird's-eye-view and side-view datasets and of different YOLOv8 architectures was analysed. To generate the image data for training and testing the YOLO models, chilli plants were grown in a hydroponic environment and imaged throughout their life cycle using four camera systems. The growth stages were divided into growing, flowering, and fruiting classes. All trained YOLOv8 models identified the growth stages reliably and with high accuracy. The results indicate that models trained with data from both views generalise better; the medium-sized YOLOv8 architecture achieved the best performance.
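For YOLOv8 training on such a dataset, the three growth-stage classes map naturally onto a standard Ultralytics `data.yaml`; a minimal sketch (the paths and split layout are illustrative assumptions, not taken from the study):

```yaml
# Hypothetical dataset layout for the three growth-stage classes.
path: datasets/chilli    # dataset root (assumed)
train: images/train      # training split (assumed)
val: images/val          # validation split (assumed)
names:
  0: growing
  1: flowering
  2: fruiting
```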
In recent times, large language models (LLMs) have made significant strides in generating computer code, blurring the lines between code created by humans and code produced by artificial intelligence (AI). As these technologies evolve rapidly, it is crucial to explore how they influence code generation, especially given the risk of misuse in areas such as higher education. The present paper explores this issue by using advanced classification techniques to differentiate between code written by humans and code generated by ChatGPT, a type of LLM. We employ a new approach that combines powerful embedding features (black-box) with supervised learning algorithms including Deep Neural Networks, Random Forests, and Extreme Gradient Boosting to achieve this differentiation with an impressive accuracy of 98%. For the successful combinations, we also examine their model calibration, showing that some of the models are extremely well calibrated. Additionally, we present white-box features and an interpretable Bayes classifier to elucidate critical differences between the code sources, enhancing the explainability and transparency of our approach. Both approaches work well, but provide at most 85–88% accuracy. Tests on a small sample of untrained humans suggest that humans do not solve the task much better than random guessing. This study is crucial in understanding and mitigating the potential risks associated with using AI in code generation, particularly in the context of higher education, software development, and competitive programming.
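Model calibration of a binary human-vs-AI classifier like this can be quantified with the expected calibration error (ECE), the gap between a model's stated confidence and its actual accuracy; a minimal pure-Python sketch (the equal-width binning scheme and bin count are illustrative assumptions, not the paper's exact procedure):

```python
def expected_calibration_error(probs, labels, n_bins=10):
    """ECE for a binary classifier: probs are P(class 1), labels are 0/1.

    Predictions are bucketed by confidence; ECE is the bin-size-weighted
    average gap between mean confidence and accuracy in each bin.
    """
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p, 1.0 - p)              # confidence of the predicted class
        pred = 1 if p >= 0.5 else 0
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, int(pred == y)))
    n = len(probs)
    ece = 0.0
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            accuracy = sum(hit for _, hit in b) / len(b)
            ece += len(b) / n * abs(avg_conf - accuracy)
    return ece
```

A perfectly calibrated, perfectly confident model scores 0; a fully confident but always-wrong one scores 1.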
Purpose: To evaluate the differences between two extended depth-of-focus intraocular lenses, the Alcon IQ Vivity and the Bausch & Lomb LuxSmart, and to compare them with a simple monofocal lens, the Alcon IQ, using a simulation-based approach.
Methods: A mathematical lens model was created for each lens type based on a measured surface geometry. The lens model was then used in a raytracer to calculate a refractive power map of the lens and a ray propagation image for the focal zone.
Results: The simulations confirm the enhanced depth of focus of these two lenses. There are apparent differences between the models. For the Vivity, more light is directed into the far focus in low-light conditions, whereas the LuxSmart behaves in a more pupil-independent manner and prioritizes intermediate vision.
Conclusions: The simulation-based approach was effective in evaluating and comparing the design aspects of these lenses. It can be positioned as a valuable third tool for lens characterization, complementing in vivo studies and in vitro measurements.
Translational Relevance: Because this approach focuses not only on the resulting optical performance but also on the underlying functional mechanisms, it paves the way for better adaptation to the individual needs and preferences of patients.
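The refractive power map in such a raytracer ultimately rests on the measured surface geometry. As a toy illustration of that relationship only (all values below are hypothetical, not from the measured lenses), the thin-lens maker's equation gives the paraxial power of an IOL immersed in aqueous humour:

```python
def thin_lens_power(n_lens, n_medium, r1_m, r2_m):
    """Paraxial power (dioptres) of a thin lens in a medium; signed radii in metres."""
    return (n_lens - n_medium) * (1.0 / r1_m - 1.0 / r2_m)

# Hypothetical biconvex IOL: acrylic (n ≈ 1.55) in aqueous humour (n ≈ 1.336),
# 12 mm radii of curvature on both surfaces (second radius negative by convention).
power = thin_lens_power(1.55, 1.336, 0.012, -0.012)  # ≈ 35.7 D
```

A real IOL simulation must of course go beyond this paraxial estimate, which is exactly why the raytracing approach described above is needed.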
Spatial exposure analysis of monuments to flood hazards, using the example of the city of Cologne
(2024)
The protection of cultural assets is essential at all times for preserving a population's cultural identity. During or after a crisis caused by a natural hazard, the protection of cultural assets must therefore be regarded as an essential part of disaster management. Using a geographic information system (GIS) analysis, this thesis examines the exposure of monuments to flood hazards, taking the city of Cologne, Germany, as an example. Three potential flood scenarios with different return periods are assumed: HQ10-50, HQ100, and HQ500 flood events. The monuments examined are limited to built structures that are neither habitable nor accessible. The results show that in the most extreme scenario, around one third of the monuments examined are affected by the flood hazard, with some flooded to depths of up to 6 m. Through a weighted overlay of population density data and georeferenced photos from the social media platform Flickr, the local relevance of the monuments in Cologne can be expressed as a relative value. A subsequent prioritisation of the monuments based on the determined relevance value and the respective inundation depth shows that many monuments outside Cologne's city centre also require a detailed vulnerability analysis with respect to flood hazards and expected damage. The findings of this work may be of interest both to the monument protection officers of the city of Cologne and to the local hazard-response authorities.
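The weighted overlay of population density and geotagged photo counts can be sketched as a min-max normalisation of each layer followed by a weighted sum per monument; a minimal sketch (equal weights are an assumption here, the thesis's actual weighting is not reproduced):

```python
def normalise(values):
    """Min-max scale a list of layer values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def relevance_scores(pop_density, photo_counts, w_pop=0.5, w_photo=0.5):
    """Relative relevance per monument from two normalised layers (weights assumed)."""
    p = normalise(pop_density)
    f = normalise(photo_counts)
    return [w_pop * a + w_photo * b for a, b in zip(p, f)]
```

Ranking monuments by this score together with inundation depth then yields the prioritisation described above.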
Virtualization fundamentally changes how social relations form, but its effect on network structure in collaborative teams is poorly understood. This paper compares team networks from nine government-funded projects that were conducted virtually because of the COVID-19 pandemic with 15 prepandemic projects from the same funding program. Results of our comparative analysis of 2,746 dyadic ties in 24 teams showed lower levels of network density, clustering, and structural cohesion in virtualized projects, indicating fragmented virtual teams. Furthermore, expressive networks, defined by the sharing of personal information, were affected more than instrumental networks, which revolve around the sharing of expert knowledge.
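The reported network measures have simple definitions; a minimal sketch for undirected tie data (the adjacency representation as a dict of neighbour sets is an assumption for illustration, not the study's actual data format):

```python
from itertools import combinations

def density(adj):
    """Fraction of possible dyadic ties that exist; adj maps node -> set of neighbours."""
    n = len(adj)
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2  # each tie counted twice
    return 2 * edges / (n * (n - 1)) if n > 1 else 0.0

def global_clustering(adj):
    """Transitivity: closed connected triples / all connected triples."""
    closed = opened = 0
    for v, nbrs in adj.items():
        for a, b in combinations(nbrs, 2):  # triples centred at v
            if b in adj[a]:
                closed += 1
            else:
                opened += 1
    return closed / (closed + opened) if closed + opened else 0.0
```

Lower values of both measures in the virtualized projects are what indicates the fragmentation described above.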
There is an urgent need to develop sustainable agricultural land use schemes. Intensive crop production has induced increased greenhouse gas emissions and enhanced nutrient and pesticide leaching to groundwater and streams. Climate change is also expected to increase drought risk as well as the frequency of extreme precipitation events in many regions. Consequently, sustainable management schemes require sound knowledge of site-specific soil water processes that explicitly take into account the interplay between soil heterogeneities and crops. In this study, we applied a principal component analysis to a set of 64 soil moisture time series from a diversified cropping field featuring seven distinct crops and two weeding management strategies. Results showed that about 97 % of the spatial and temporal variance of the data set was explained by the first five principal components. Meteorological drivers accounted for 72.3 % of the variance and 17.0 % was attributed to different seasonal behaviour of different crops. While the third (4.1 %) and fourth (2.2 %) principal components were interpreted as effects of soil texture and cropping schemes on soil moisture variance, respectively, the effect of soil depth was represented by the fifth component (1.7 %). However, neither topography nor weed control had a significant effect on soil moisture variance. Contrary to common expectations, soil and rooting pattern heterogeneity seemed not to play a major role. Findings of this study highly depend on local conditions. However, we consider the presented approach generally applicable to a large range of site conditions.
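The variance decomposition used here can be reproduced with a standard PCA on the covariance matrix of the soil moisture time series; a minimal NumPy sketch on synthetic data (not the study's measurements):

```python
import numpy as np

def pca_explained_variance(X):
    """Fraction of total variance explained by each principal component.

    X: array of shape (n_timesteps, n_series), one column per soil moisture sensor.
    """
    Xc = X - X.mean(axis=0)                    # centre each series
    cov = np.cov(Xc, rowvar=False)             # covariance across series
    eigvals = np.linalg.eigvalsh(cov)[::-1]    # eigenvalues, descending
    return eigvals / eigvals.sum()
```

With 64 series, the study's finding that five components capture ~97 % of the variance corresponds to the first five entries of this ratio vector summing to ~0.97.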
HIV Gag virus-like particles (HIV Gag VLPs) are promising HIV vaccine candidates. In the literature, they are often described as shear-sensitive particles, and authors usually recommend operating tangential flow filtration (TFF) gently, at shear rates below 4,000 s⁻¹ to 6,000 s⁻¹. This in turn poses a severe limitation to the performance of TFF-mediated concentration of VLPs, which would be substantially enhanced by working at higher shear rates. To our knowledge, studies examining the shear sensitivity of HIV Gag VLPs and providing detailed evidence for the fragility of these particles have not been conducted yet. Thus, we investigated the effect of high shear rates on the colloidal stability of mosaic VLPs (Mos-VLPs) as relevant examples of HIV Gag VLPs. For this purpose, Mos-VLPs were exposed to shear rates ranging from 3,395 s⁻¹ to 22,365 s⁻¹ for 2 h. The average hydrodynamic diameter (AHD) and the polydispersity index (PDI) of the associated particle size distribution were used as stability indicators and measured after the treatment and during storage through dynamic light scattering. At high shear rates, we observed an increase in both AHD and PDI during the storage of HIV Mos1.Gag VLPs (bVLP, without envelope proteins) and Mos1.Gag + Mos2S.Env VLPs (eVLP, with envelope proteins). eVLPs exhibited higher colloidal stability than bVLPs, and we discuss the potential stabilizing role of envelope proteins. We finally demonstrated that the dispersion medium also has a considerable impact on the stability of Mos-VLPs.
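For orientation, the wall shear rate that bounds TFF operation follows, for laminar Newtonian flow in a cylindrical hollow fibre, from the volumetric flow rate and fibre radius; a sketch (the example numbers are illustrative, not taken from this study):

```python
import math

def wall_shear_rate(q_m3_s, radius_m):
    """Wall shear rate (s^-1) for laminar Newtonian flow in a cylindrical fibre: 4Q / (pi R^3)."""
    return 4.0 * q_m3_s / (math.pi * radius_m ** 3)

# Hypothetical example: ~2.9 mL/min per fibre through a 0.25 mm radius fibre
# lands near the conventional 4,000 s^-1 "gentle operation" limit.
rate = wall_shear_rate(4.909e-8, 2.5e-4)  # ~4.0e3 s^-1
```

The cubic dependence on radius is why modest geometry changes move the achievable shear rate so strongly.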
Given the growing importance of distributed systems, the diverse use of multimedia content on different devices, and the ongoing evolution of browsers, it is clearly relevant to examine the potential of cross-device browser games. The goal of this thesis is to further develop the prototype of the game "Finalblockdown", created in the preceding practical project, through a participatory process and finally to release it as open-source software. Directly involving selected test users in the game's further development aims to improve the gaming experience, technical performance, and usability in a comprehensive, user-oriented way.
To carry out the participatory process, several focus groups were formed that participated actively throughout development and thoroughly tested and evaluated each iteration. The test runs combined play-testing with group interviews, which were recorded for evaluation.
Based on the documented, collected, and prioritised feedback from all groups, a development plan was drawn up for each iteration.
The participatory process brought significant benefits, particularly regarding the design of the user interface, the game mechanics, and testing. However, limitations in time and resources show that the participatory approach is time-intensive, which must be taken into account during planning. The process produced a visually and playfully compelling game that implements complex programming concepts such as WebSockets, WebGL graphics, and spatial hashing in vanilla JavaScript, and that was released via self-hosting. Nevertheless, the system still has substantial potential for optimisation and further development, which makes it well suited as open-source software and allows it to be developed further beyond the scope of the project. Further research could focus on both process-related and technical aspects. Regarding the participatory process, one could analyse how involving test users in complex areas of development, including system architecture and performance, could strengthen the democratisation of design. Technical investigations could focus on how optimisations in hosting, WebSockets, and collision detection can improve system performance.
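Spatial hashing, one of the concepts mentioned above, buckets game objects into a uniform grid so that collision checks only need to consider neighbouring cells instead of every object pair. A language-neutral Python sketch of the idea (cell size and object format are assumptions; the game itself implements this in vanilla JavaScript):

```python
from collections import defaultdict

CELL = 64  # grid cell size in pixels (tuning assumption)

def cell_of(x, y):
    """Grid cell containing the point (x, y)."""
    return (int(x // CELL), int(y // CELL))

def build_hash(objects):
    """Map each grid cell to the ids of objects inside it; objects: id -> (x, y)."""
    grid = defaultdict(list)
    for obj_id, (x, y) in objects.items():
        grid[cell_of(x, y)].append(obj_id)
    return grid

def nearby(grid, x, y):
    """Candidate collision partners: objects in the 3x3 block of cells around (x, y)."""
    cx, cy = cell_of(x, y)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            out.extend(grid.get((cx + dx, cy + dy), []))
    return out
```

Only the candidates returned by `nearby` need precise collision tests, which turns an O(n²) all-pairs check into roughly O(n) for sparsely distributed objects.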
In the contemporary era, many organizations and companies are confronted with a significant surge in data volumes. This has led to the challenge of capturing, storing, managing, and analyzing terabytes of data, which are stored in diverse formats and originate from numerous internal and external sources. Furthermore, the emergence of novel applications, such as trading and artificial intelligence, has made the processing of vast amounts of data in real time an absolute necessity. These requirements exceed the processing capacity of traditional on-disk database management systems, which are ill-equipped to manage this data and to provide real-time results. Therefore, data management requires new solutions to cope with the challenges of growing data volumes and real-time processing. An in-memory database system (IMDB) is a database management system that is emerging as a solution to these challenges, with the support of other technologies. IMDBs can process massive data distinctly faster than traditional database management systems. This work examines the approach of IMDBs, with a particular focus on SAP HANA, and compares it with other IMDBs.
The city of Barcelona has developed the transversal project of Superblocks, which combines traffic pacification with the installation of urban green and urban furniture in order to counteract growing problems of poor air quality, noise, insufficient urban green, and a lack of public space. The implementation of the Superblock in Sant Antoni has revealed a reduction of motorized transport in the whole neighbourhood. This thesis aims to understand the sociological dimensions of behaviour changes and modal shifts, and to contextualize the transformation in the light of an ongoing paradigm shift in urban mobility. Quantitative data on traffic volumes shows the measured reduction of motorized transport in Sant Antoni. A quantitative questionnaire sheds light on modal shifts and attitudes of residents in the intervention zone. Further qualitative interviews create an in-depth understanding of individual residents and their modal changes, mobility behaviour, and attitudes towards the changes. Finally, expert interviews in the fields of transport planning and city administration substantiate the ongoing paradigm shift towards a more ecological, socially connected, and sustainable city.
Keywords: urban planning, urban mobility, sustainable mobility transition, evaporating transport, modal shift, behaviour change, social practice
The cleaning of aged silk fibers poses a common challenge in the conservation of textiles, since traditional cleaning techniques often yield unsatisfactory results or even harm objects. In this regard, cleaning objects with laser radiation is a promising addition to the range of available methods. Because it is contactless, even brittle and touch-sensitive objects with disfiguring or harmful soiling could potentially be cleaned and thereby made accessible for research and presentation. Examples of treatment have sometimes shown spectacular results. Still, there is some skepticism concerning the safety of this treatment for textile materials, which has been reinforced by previous nanosecond laser cleaning studies on silk fibers at a wavelength of 532 nm. Taking these published results into account, this study extends the range of examined laser parameters from 532 nm nanosecond lasers to 1064 nm nanosecond and even 800 nm femtosecond lasers, re-evaluating the effect of this treatment on the fibers. The physicochemical processes taking place on silk fibers during laser cleaning are complex and still not fully understood. The aim of this project was therefore to clarify the potential effects of those processes on the condition of silk samples treated with different combinations of wavelength, pulse duration, energy density, and number of pulses per spot. It also examines the influence of the presence of soiling on the results. The analysis of potential effects was carried out using statistical methods and advanced analytics. Scanning electron microscopy, Fourier-transform infrared spectroscopy, and colorimetry provided the insights required to assess the effects.
Results show that laser cleaning of silk fibers, like most other conventional cleaning techniques, is not entirely without risk, but knowing the possible effects helps in deciding whether the benefits of the technique justify these risks.
The focus of this paper is Jouguet detonation in an ideal gas flow in a magnetic field. A modified Hugoniot detonation equation is obtained, taking into account the influence of the magnetic field on the detonation process and on the parameters of the detonation wave. It is shown that, under the influence of a magnetic field, combustion products move away from the detonation front at supersonic speed, and that the speed of the detonation products increases with the magnetic field strength. A dependence is also obtained that allows the influence of heat release on the detonation parameters to be evaluated.
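For context on the heat-release dependence, the classical field-free Chapman-Jouguet theory already ties the detonation speed to the specific heat release q in the strong-detonation limit; a sketch of that baseline relation (this deliberately omits the paper's magnetic-field correction):

```python
import math

def cj_speed(gamma, q):
    """Approximate Chapman-Jouguet detonation speed (m/s) for large heat release.

    Strong-detonation limit for an ideal gas: D ~ sqrt(2 * (gamma^2 - 1) * q),
    with gamma the ratio of specific heats and q the heat release in J/kg.
    """
    return math.sqrt(2.0 * (gamma ** 2 - 1.0) * q)
```

Against this baseline, the paper's result is that the magnetic field further accelerates the combustion products leaving the front.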
There is a wide variety of drought indices, yet a consensus on suitable indices and temporal scales for monitoring streamflow drought remains elusive across diverse hydrological settings. Considering the growing interest in spatially distributed indices for ungauged areas, this study addresses the following questions: (i) What temporal scales of precipitation-based indices are most suitable to assess streamflow drought in catchments with different hydrological regimes? (ii) Do soil moisture indices outperform meteorological indices as proxies for streamflow drought? (iii) Are snow indices more effective than meteorological indices for assessing streamflow drought in snow-influenced catchments? To answer these questions, we examined 100 near-natural catchments in Chile with four hydrological regimes, using the standardised precipitation index (SPI), standardised precipitation evapotranspiration index (SPEI), empirical standardised soil moisture index (ESSMI), and standardised snow water equivalent index (SWEI), aggregated across various temporal scales. Cross-correlation and event coincidence analysis were applied between these indices and the standardised streamflow index at a temporal scale of 1 month (SSI-1), as representative of streamflow drought events. Our results underscore that there is not a single drought index and temporal scale best suited to characterise all streamflow droughts in Chile, and their suitability largely depends on catchment memory. Specifically, in snowmelt-driven catchments characterised by a slow streamflow response to precipitation, the SPI at accumulation periods of 12–24 months serves as the best proxy for characterising streamflow droughts, with median correlation and coincidence rates of approximately 0.70–0.75 and 0.58–0.75, respectively. In contrast, the SPI at a 3-month accumulation period is the best proxy over faster-response rainfall-driven catchments, with median coincidence rates of around 0.55. 
Despite soil moisture and snowpack being key variables that modulate the propagation of meteorological deficits into hydrological ones, meteorological indices are better proxies for streamflow drought. Finally, to exclude the influence of non-drought periods, we recommend using event coincidence analysis, a method that helps assess the suitability of meteorological, soil moisture, and/or snow drought indices as proxies for streamflow drought events.
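At its core, event coincidence analysis asks how often a streamflow drought event has a matching precursor event within a tolerance window; a minimal sketch (representing events as lists of time indices and the window in the same units are both simplifying assumptions):

```python
def coincidence_rate(a_events, b_events, window=0):
    """Fraction of events in a_events with at least one event in b_events within +/- window steps."""
    if not a_events:
        return 0.0
    hits = sum(any(abs(t - s) <= window for s in b_events) for t in a_events)
    return hits / len(a_events)
```

Restricting the comparison to event times, rather than correlating the full index series, is exactly what removes the influence of non-drought periods.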
After 50 years, there is still an ongoing debate about the Limits to Growth (LtG) study. This paper recalibrates the 2005 World3-03 model: the input parameters are changed to better match empirical data on world development, using an iterative method to compute and optimize different parameter sets. The improved parameter set results in a World3 simulation that shows the same overshoot-and-collapse mode in the coming decade as the original business-as-usual scenario of the LtG standard run. The main effect of the recalibration is to raise the peaks of most variables and shift them a few years into the future. The parameters with the largest relative changes are those related to industrial capital lifetime, pollution transmission delay, and urban-industrial land development time.
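An iterative recalibration of the kind described, nudging parameters so that simulated series better match empirical data, can be illustrated with a greedy coordinate search (the real study optimises World3-03 parameters against world development data; this toy version fits a single slope):

```python
def recalibrate(model, params, observed, steps=200, delta=0.05):
    """Greedy coordinate search: nudge each parameter up/down, keep improvements.

    model(params) must return a simulated series comparable to `observed`;
    the fit criterion is the sum of squared errors.
    """
    def err(p):
        return sum((s - o) ** 2 for s, o in zip(model(p), observed))

    best = dict(params)
    best_err = err(best)
    for _ in range(steps):
        improved = False
        for k in best:
            for factor in (1 + delta, 1 - delta):
                trial = dict(best)
                trial[k] = best[k] * factor
                e = err(trial)
                if e < best_err:
                    best, best_err, improved = trial, e, True
        if not improved:
            break  # local optimum at this step size
    return best, best_err
```

Real system-dynamics calibration uses more sophisticated optimisers, but the loop structure, perturb, simulate, compare, keep, is the same.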
Water is used in the production of pharmaceutical ingredients, intermediates and final products. Accordingly, the quality requirements are particularly high. Besides quality, the sustainability of production and climate change mitigation will play an increasingly important role. For instance, in 2015, the total global emissions of the pharma sector were significantly higher than the CO<sub>2</sub> emissions generated by the automotive sector. Thus, efforts must be made at all stages of pharmaceutical production to reduce the environmental impact.
Comparing the InsurTech ecosystems of the United States and Germany (Europe), there are significant regional differences in the choice of business models. While many InsurTechs in the United States have opted for the business model of a fully licensed insurer, this business model is much less common in Europe. In Europe, many InsurTechs seem to shy away from applying for a license as an insurer and limit themselves to the business model of a broker or a managing general agent. This paper analyzes the factors that influence an InsurTech's choice of business model when deciding whether or not to apply for an insurance license. It examines the impact of different local market environments on these decisions, as well as the role that access to venture capital plays in business model decisions and how regulators and their actions influence the decision‐making process.
The use and development of statistical design of experiments (DoE), particularly in pharmaceutical process development, has grown rapidly over the last decades. This rise aligns with Green Chemistry principles, which seek reduced resource usage and heightened efficiency. In this study, we employed a comprehensive DoE approach to optimize the catalytic conversion of 1-decene to n-decanal through direct Wacker-type oxidation using the previously determined efficient PdCl2(MeCN)2 catalytic system. The aim was to maximize selectivity and conversion efficiency. Through systematic variation of seven factors (substrate amount, catalyst amount, co-catalyst amount, reaction temperature, reaction time, homogenization temperature, and water content), this study identified the critical parameters that direct the reaction toward the desired product. The statistical analysis revealed high significance for both selectivity and conversion, with surface diagrams illustrating the optimal conditions. Notably, catalyst amount emerged as a pivotal factor influencing conversion, with reaction temperature and co-catalyst amount significantly affecting both conversion efficiency and selectivity. The refined model demonstrated strong correlations between predicted and observed values, highlighting the impact of these factors on both selectivity and conversion.
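The systematic variation of several factors at two levels each corresponds to a (fractional) factorial design; enumerating a full 2^k design is a one-liner (the factor names and ranges below are illustrative, not the study's actual settings):

```python
from itertools import product

def two_level_design(factors):
    """Full 2-level factorial design: every low/high combination of the given factors.

    factors: dict mapping factor name -> (low, high). Returns a list of run dicts.
    """
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Hypothetical 3-factor example -> 2^3 = 8 runs.
runs = two_level_design({'catalyst_mmol': (0.05, 0.2),
                         'temp_C': (60, 80),
                         'time_h': (4, 18)})
```

With seven factors a full design would need 2^7 = 128 runs, which is why DoE studies typically use fractional designs that screen the same factors with far fewer experiments.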
The bachelor's thesis entitled "Gezielte Maßnahmen zur Förderung von Studentinnen in IT-Studiengängen am Campus Gummersbach der TH Köln" (Targeted measures to support female students in IT degree programmes at TH Köln's Gummersbach campus) aims to improve the presence and success of women in the IT degree programmes of TH Köln. The focus of this work is to develop feasible support measures based on empirical findings and a theoretical framework. The main measure is the initiative "Be a IT-Girl", which is intended to promote equality and the empowerment of female students through various approaches, such as a dedicated podcast format and a versatile room concept.
The work was supported by a combination of qualitative and quantitative research, with surveys and analyses conducted on the challenges and needs of female students at the Gummersbach campus. The developed measures were assessed with regard to their feasibility and expected effectiveness, with a particular focus on practical implementability and adaptation to the specific conditions of the Gummersbach site.
Overall, the thesis offers not only deep insight into the problems and potentials of supporting women in IT, but also concrete solution approaches that can contribute to a lasting change and improvement of the situation.
The frequency and intensity of global disasters are increasing significantly. Their effects increasingly also impact critical infrastructure and restrict the functionality of important facilities. Hospitals in particular contain many processes that must be protected. For this reason, the German KRITIS umbrella act (KRITIS-Dachgesetz), based on the CER Directive (EU 2022/2557), is due to be adopted by 17 October 2024. One future requirement of the KRITIS umbrella act will be the creation of measures to increase resilience, which are to be documented in resilience plans. Measures from business continuity strategies (i.e. business continuity management systems) can be an integral part of these resilience plans. This thesis aims to identify interfaces between business continuity management systems and existing hospital alarm and emergency plans. The findings are intended to give hospitals affected by the KRITIS umbrella act an insight into the requirements of business continuity management systems and to help demonstrate options for their implementation in hospitals. During the project period, 14 qualitative expert interviews were conducted with representatives of hospitals, public authorities, legal institutions, and international crisis-preparedness experts, 11 of which are evaluated in this research. The interviews were semi-structured and followed an exploratory approach in order to capture opinions on the prospective requirements of the KRITIS umbrella act, to identify current measures for increasing resilience in hospitals, and to determine potential implementation challenges.
The interviews revealed broad agreement on the need to establish business continuity management systems as an integral part of hospital resilience management. The key findings are the distinction between event-specific and process-related approaches to risk identification, and the extended requirements of business continuity management systems regarding the restart and recovery of processes and systems. The hospital incident command, which business continuity management systems likewise describe as the central body for incident response with its special organisational structure, offers potential for synchronisation. Existing redundancies can also be carried over into the requirements of business continuity strategies. A central obstacle to implementation in hospitals may be the sector-specific staff shortage and the risk of insolvency. The findings were used to develop concrete recommendations for hospitals. Early identification of disruptions, emergencies, or crises and corresponding reporting to the responsible bodies are essential components of successful incident response. Building on this, the establishment of a "head of hospital alarm and emergency planning" (Leiter KAEP) is recommended, who is responsible, among other things, for creating and updating the hospital alarm and emergency plans and who has an overall view of the relevant interfaces and dependencies of all departments in the hospital.