620 Ingenieurwissenschaften
Document Type
- Article (56)
- Doctoral Thesis (27)
- Conference Proceeding (9)
- Master's Thesis (4)
- Book (1)
- Habilitation (1)
- Report (1)
- Working Paper (1)
Institute
- Institut für Strukturmechanik (ISM) (33)
- F. A. Finger-Institut für Baustoffkunde (FIB) (6)
- Professur Bauphysik (6)
- Graduiertenkolleg 1462 (5)
- In Zusammenarbeit mit der Bauhaus-Universität Weimar (4)
- Institut für Konstruktiven Ingenieurbau (IKI) (4)
- Materialforschungs- und -prüfanstalt an der Bauhaus-Universität (4)
- Professur Baubetrieb und Bauverfahren (4)
- Junior-Professur Komplexe Tragwerke (3)
- Professur Bauchemie und Polymere Werkstoffe (3)
Keywords
- Finite-Elemente-Methode (7)
- Beton (5)
- Modellierung (5)
- OA-Publikationsfonds2022 (5)
- Strukturmechanik (5)
- Building Information Modeling (4)
- Bruchmechanik (3)
- Erdbeben (3)
- Ingenieurbau (3)
- OA-Publikationsfonds2020 (3)
When predicting sound pressure levels induced by structure-borne sound sources and describing the sound propagation path through the building structure as exactly as possible, it is necessary to characterize the vibration behavior of the structure-borne sound sources. In this investigation, the characterization of structure-borne sound sources was performed using the two-stage method (TSM) described in EN 15657. Four different structure-borne sound sources were characterized and subsequently installed in a lightweight test stand. The resulting sound pressure levels in an adjacent receiving room were measured. In the second step, sound pressure levels were predicted according to EN 12354-5 based on the parameters of the structure-borne sound sources. Subsequently, the predicted and the measured sound pressure levels were compared to obtain reliable statements on the achievable accuracy when using source quantities determined by TSM with this prediction method.
Identification of modal parameters of a space frame structure is a complex assignment due to a large number of degrees of freedom, close natural frequencies, and different vibrating mechanisms. Research has been carried out on the modal identification of rather simple truss structures. So far, less attention has been given to complex three-dimensional truss structures. This work develops a vibration-based methodology for determining modal information of three-dimensional space truss structures. The method uses a relatively complex space truss structure for its verification. Numerical modelling of the system gives modal information about the expected vibration behaviour. The identification process involves closely spaced modes that are characterised by local and global vibration mechanisms. To distinguish between local and global vibrations of the system, modal strain energies are used as an indicator. The experimental validation, which incorporated a modal analysis employing the stochastic subspace identification method, has confirmed that considering relatively high model orders is required to identify specific mode shapes. Especially in the case of the determination of local deformation modes of space truss members, higher model orders have to be taken into account than in the modal identification of most other types of structures.
The characteristic values of climatic actions in current structural design codes are based on a specified probability of exceedance during the design working life of a structure. These values are traditionally determined from past observational data under the assumption of a stationary climate. However, this assumption becomes invalid in the context of climate change, where the frequency and intensity of climatic extremes vary over time. This paper presents a methodology to calculate non-stationary characteristic values using state-of-the-art climate model projections. The non-stationary characteristic values are calculated in compliance with the requirements of structural design codes by forming quasi-stationary windows over the entire bias-corrected climate model data. Three approaches for the calculation of non-stationary characteristic values considering the design working life of a structure are compared and their consequences for the exceedance probability are discussed.
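As an illustrative sketch of the quasi-stationary-window idea described above, the following Python fragment fits a Gumbel distribution to block maxima inside sliding windows and evaluates a characteristic value for a given annual exceedance probability. The method-of-moments fit and the window length and step are assumptions for illustration, not the paper's actual procedure.

```python
import math

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to block maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - 0.5772 * scale          # Euler-Mascheroni correction
    return loc, scale

def characteristic_value(maxima, p_exceed=0.02):
    """Value with annual exceedance probability p (e.g. 2% -> ~50-year value)."""
    loc, scale = gumbel_fit(maxima)
    return loc - scale * math.log(-math.log(1.0 - p_exceed))

def windowed_characteristic_values(annual_maxima, window=30, step=10):
    """Quasi-stationary sliding windows over a (bias-corrected) projection series."""
    out = []
    for start in range(0, len(annual_maxima) - window + 1, step):
        block = annual_maxima[start:start + window]
        out.append((start, characteristic_value(block)))
    return out
```

Tracking how the windowed characteristic value drifts over the projection horizon is one way to visualize the non-stationarity the abstract refers to.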
As an optimization that starts from a randomly selected structure generally does not guarantee reasonable optimality, the use of a systemic approach, named the ground structure, is widely accepted in the design of steel truss and frame structures. However, in the case of reinforced concrete (RC) structural optimization, because of the orthogonal orientation of structural members, randomly chosen or architect-sketched framing is used. Such a one-time fixed layout, besides lacking a systemic approach, does not necessarily guarantee optimality. In this study, an approach for generating a candidate ground structure to be used for cost or weight minimization of 3D RC building structures, slabs included, is developed. A multiobjective function at the floor optimization stage and a single objective function at the frame optimization stage are considered. A particle swarm optimization (PSO) method is employed for selecting the optimal ground structure. This method enables the generation of a simple, yet realistic, representation of a topologically preoptimized ground structure while both structural and main architectural requirements are considered. This is supported by a case study for different floor domain sizes.
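A minimal, generic particle swarm optimizer in the spirit of the PSO stage described above might look as follows. The objective, bounds and all hyperparameters are placeholders; the paper's actual ground-structure encoding and cost function are not reproduced here.

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer; f maps a position vector to a scalar cost."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (personal best) + social pull (global best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Calling `pso_minimize(lambda x: sum(xi * xi for xi in x), [(-5, 5), (-5, 5)])` drives the swarm toward the origin; a structural application would replace the objective with a cost or weight evaluation of a candidate ground structure.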
We present a physics-informed deep learning (PIDL) model for the transient heat transfer analysis of three-dimensional functionally graded materials (FGMs) employing a Runge–Kutta discrete time scheme. First, the governing equation, associated boundary conditions and the initial condition for transient heat transfer analysis of FGMs with exponential material variations are presented. Then, the deep collocation method with the Runge–Kutta integration scheme for transient analysis is introduced. The prior physics that helps the model to generalize is incorporated by constraining the temperature variable with the discrete time schemes and the initial/boundary conditions. Furthermore, fitted activation functions suitable for dynamic analysis are presented. Finally, we validate our approach through several numerical examples on FGMs with irregular shapes and a variety of boundary conditions. In the numerical experiments, the PIDL predictions show good agreement with analytical solutions and other numerical methods for both temperature and flux distributions, and the approach adapts to the transient analysis of FGMs with different shapes, making it a promising surrogate model for transient dynamic analysis.
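Independently of the deep learning part, the Runge–Kutta discrete time stepping that the model builds on can be illustrated by a classical RK4 step applied to a semi-discretized 1D heat equation. The grid, diffusivity and zero-temperature boundary handling below are illustrative assumptions, not the paper's 3D FGM setup.

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def heat_rhs(alpha, dx):
    """Semi-discrete 1D heat equation u_t = alpha * u_xx with fixed zero boundaries."""
    def f(t, u):
        n = len(u)
        du = [0.0] * n
        for i in range(1, n - 1):
            du[i] = alpha * (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
        return du
    return f
```

Stepping an initial "hat" temperature profile with this scheme shows the expected diffusive behaviour: the peak decays while heat spreads to the neighbouring nodes.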
Nonlocal theories concern the interaction of objects that are separated in space. Classical examples are Coulomb's law and Newton's law of universal gravitation; both have had significant impact in physics and engineering. One classical application in mechanics is the failure of quasi-brittle materials: while local models lead to an ill-posed boundary value problem and associated mesh-dependent results, nonlocal models guarantee well-posedness and are, moreover, relatively easy to implement in commercial computational software.
Ongoing digitalization is giving rise to innovative workflows and organizational systems both within construction projects and within companies. In this context, the digital advancement driven by Building Information Modeling [BIM] must be understood as a change process that will permanently reshape organizational structures. BIM is the leading digital working methodology in the construction industry, capable of meeting design-, execution- and project-related requirements. Compared with other sectors, however, the German construction industry must be regarded as digitally behind. It is characterized by a market with a high share of small and medium-sized enterprises [KMU]. Owing to these companies' lack of familiarity with its application, BIM is not yet used comprehensively and consistently in projects. Focusing on the construction project as a temporary organization, the present research addresses the creation of a realistic picture of proven BIM use cases in model projects. It identifies the BIM challenges currently facing first-time users, which have so far hindered consistent BIM adoption in Germany.
The research focuses on the evaluation of success-critical factors [ekF] in BIM use cases [AWF] by means of a qualitative content analysis. Through the application of BIM, the digital transformation entails structurally relevant determinants of change for organizations, as well as challenges that are examined in the use-case research.
The objective is threefold. A newly developed BIM structural model captures current guideline work and standardization and thereby establishes the framework of BIM structures required in a construction project. From this structural model, a model for examining use-case risks was derived. This model is applied to specifically researched BIM model projects in Germany in order to derive an ekF risk matrix from the success-critical factors of the BIM use cases practiced in them. From this, a supporting BIM application instrument in the form of BPMN process flows for KMU emerges. As a result of combining the BIM structural model with the use-case analysis, each process overview marks the location of risks per use case. In this way, companies without BIM expertise in construction project organizations gain instrumental, low-threshold access to BIM, enabling them to exploit the collaborative and economic advantages of the digital working methodology.
Model of demand-oriented service provision in FM based on sensor technologies and BIM
(2023)
While digitalization in the construction sector is receiving ever greater attention, particularly in the planning and construction phases of buildings, the digital potential of facility management remains far less exploited than it could be. Given that the operation of buildings accounts for a substantial share of life-cycle costs, a focus on digital processes in building operation is required. In facility management, services are frequently provided either on a schedule-driven basis, i.e. at static intervals, or on demand. Both forms of service provision show deficits, for example because activities are performed at defined intervals without any actual need, or because existing needs are not identified for lack of means to determine them. The definition and determination of a demand for service provision, in particular, is often subjective. Moreover, service providers are often not involved in the early phases of building planning and receive the data and information necessary for their services only shortly before the building goes into operation.
Current approaches of Building Information Modeling (BIM) and the growing availability of sensor technologies in buildings offer opportunities to remedy the deficits outlined above.
This thesis therefore develops data models and methods that can trigger building management services in an objectified and automated manner, using BIM-based database structures together with evaluation and decision-making methodologies. The focus is on the facility service of cleaning and care within infrastructural facility management.
An extensive review of established norms and standards as well as publicly available service tenders forms the basis for defining the information required for service provision. The identified static building and process information is structured in a relational database model which, after a presentation of measured quantities and a description of the procedure for selecting suitable sensors for demand detection, is extended by sensor information. To allow measured values from different sensors, including those already installed in buildings, to be used for triggering services, a normalization methodology is implemented in the database model. In this way, the demand for service provision can be determined on the basis of threshold values. Linking methods for combining different applications are also integrated into the database model. In addition to the direct triggering of required activities, the developed model enables the opportune triggering of services, i.e. service provision before the actual demand arises. In this way, similar activities or activities in close spatial proximity can sensibly be performed early, saving travel distance for the service provider. The thesis also describes the algorithms required for evaluation, decision-making and order monitoring.
The developed model of demand-oriented service provision is validated in a relational database and shows, by simulating different building operation scenarios, that demands can be determined on the basis of sensor technologies and that services can be triggered, commissioned and documented opportunely.
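A hedged sketch of the threshold-based triggering idea: the fragment below normalizes raw sensor readings onto a dimensionless demand scale and distinguishes between services that are due and those that may be triggered opportunely, i.e. shortly before the demand arises. The sensor ranges, thresholds and opportune margin are invented for illustration and are not taken from the thesis.

```python
def normalize(value, lo, hi):
    """Map a raw sensor reading onto a dimensionless 0..1 demand scale."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def due_services(readings, sensor_ranges, thresholds, opportune_margin=0.1):
    """Return services to trigger: 'due' above threshold, 'opportune' close below it."""
    orders = {}
    for service, value in readings.items():
        lo, hi = sensor_ranges[service]
        demand = normalize(value, lo, hi)
        limit = thresholds[service]
        if demand >= limit:
            orders[service] = "due"
        elif demand >= limit - opportune_margin:
            orders[service] = "opportune"
    return orders
```

Normalizing first is what allows heterogeneous sensors, including those already installed in a building, to feed one common triggering rule.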
The present article aims to provide an overview of the consequences of dynamic soil-structure interaction (SSI) on building structures and the available modelling techniques to resolve SSI problems. The role of SSI has been traditionally considered beneficial to the response of structures. However, contemporary studies and evidence from past earthquakes showed detrimental effects of SSI in certain conditions. An overview of the related investigations and findings is presented and discussed in this article. Additionally, the main approaches to evaluate seismic soil-structure interaction problems with the commonly used modelling techniques and computational methods are highlighted. The strength, limitations, and application cases of each model are also discussed and compared. Moreover, the role of SSI in various design codes and global guidelines is summarized. Finally, the advancements and recent findings on the SSI effects on the seismic response of buildings with different structural systems and foundation types are presented. In addition, with the aim of helping new researchers to improve previous findings, the research gaps and future research tendencies in the SSI field are pointed out.
Plastic structural analysis can be applied without difficulty and with little effort to structural member verifications with regard to lateral torsional buckling of doubly symmetric rolled I-sections. Such analyses can be performed based on the plastic zone theory, specifically using finite beam elements with seven degrees of freedom and second-order theory considering material nonlinearity. The existing Eurocode enables these approaches, and the upcoming generation will provide corresponding regulations in EN 1993-1-14. The investigations allow the determination of computationally accurate limit loads, which are determined in the present paper for selected structural systems with different sets of parameters, such as length, steel grade and cross-section type. For verification/validation, the results are compared to approximations obtained by more sophisticated FEM analyses (commercial software Ansys Workbench, using solid elements). In this context, differences in the results of the numerical models are addressed and discussed. In addition, the results are compared to resistances obtained by common design regulations based on reduction factors χLT, including the regulations of EN 1993-1-1 (with the German National Annex) as well as prEN 1993-1-1:2020-08 (the proposed new Eurocode generation). In conclusion, correlations of the results and their advantages as well as disadvantages are discussed.
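For orientation, the reduction-factor approach mentioned above can be sketched with the general-case formula for χLT from EN 1993-1-1. The imperfection factor depends on the applicable buckling curve; the value 0.34 below (curve b) is only an example, and design use would of course require the full code context.

```python
import math

def chi_lt(lambda_bar, alpha_lt=0.34):
    """Lateral torsional buckling reduction factor, EN 1993-1-1 general case:
    Phi = 0.5 * (1 + alpha_LT * (lambda - 0.2) + lambda^2),
    chi = 1 / (Phi + sqrt(Phi^2 - lambda^2)) <= 1."""
    phi = 0.5 * (1.0 + alpha_lt * (lambda_bar - 0.2) + lambda_bar ** 2)
    chi = 1.0 / (phi + math.sqrt(phi ** 2 - lambda_bar ** 2))
    return min(chi, 1.0)
```

For slendernesses up to 0.2 no reduction applies (χLT = 1.0), while at a relative slenderness of 1.0 the factor drops to roughly 0.6 for curve b.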
Encapsulation-based self-healing concrete (SHC) is the most promising technique for providing concrete with a self-healing mechanism, owing to its capacity to heal fractures effectively without human intervention, extending the operational life and lowering maintenance costs. The healing mechanism is created by embedding capsules containing the healing agent inside the concrete. The healing agent is released once the capsules fracture, and healing occurs in the vicinity of the damaged part. The healing efficiency of SHC is still not fully understood and depends on several factors; in the case of microcapsule-based SHC, the fracture of the microcapsules is the most important prerequisite for releasing the healing agent and hence healing the cracks. This study contributes to verifying the healing efficiency of SHC and the fracture mechanism of the microcapsules. The extended finite element method (XFEM) is a flexible and powerful discrete crack method that allows crack propagation without re-meshing and has shown high accuracy in modeling fracture in concrete. In this thesis, a computational fracture modeling approach for encapsulation-based SHC is proposed based on the XFEM and the cohesive surface technique (CS) to study the healing efficiency and the potential for fracture and debonding of the microcapsules, or of the solidified healing agent, from the concrete matrix. The concrete matrix and the microcapsule shell are both modeled by the XFEM and coupled by CS. The effects of the healed-crack length, the interfacial fracture properties and the microcapsule size on the load-carrying capacity and fracture pattern of the SHC have been studied. The obtained results are compared to those of the zero-thickness cohesive element approach to demonstrate the accuracy and validity of the proposed simulation.
The fracture simulation is further developed to study the influence of capsular clustering on the fracture mechanism by varying the contact surface area of the CS between the microcapsule shell and the concrete matrix. The proposed fracture simulation is extended to 3D to validate the 2D computational simulations and to estimate the accuracy difference between 2D and 3D simulations. In addition, a design method is developed to determine the size of the microcapsules such that a sufficient volume of healing agent is available to heal the expected crack width. This method is based on the configuration of the unit cell (UC), the representative volume element (RVE) and periodic boundary conditions (PBC), relating them to the volume fraction (Vf) and the crack width as variables. The proposed microcapsule design is verified through computational fracture simulations.
Determining the earthquake hazard of any settlement is one of the primary steps in reducing earthquake damage. Therefore, the earthquake hazard maps used for this purpose must be renewed over time. The Turkey Earthquake Hazard Map has been used instead of the Turkey Earthquake Zones Map since 2019. A probabilistic seismic hazard assessment was performed using these last two maps and different attenuation relationships for Bitlis Province (Eastern Turkey), located in the Lake Van Basin, which has a high seismic risk. The earthquake parameters were determined by considering all districts and neighborhoods in the province. Probabilistic seismic hazard analyses were carried out for these settlements using seismic sources and four different attenuation relationships. The obtained values are compared with the design spectra stated in the last two earthquake maps. Significant differences exist between the design spectra obtained for the different exceedance probabilities. In this study, adaptive pushover analyses of sample reinforced concrete buildings were performed using the design ground motion level. Structural analyses were carried out using three different design spectra, as given in the last two seismic design codes, and the mean spectrum obtained from the attenuation relationships. The different design spectra significantly change the target displacements predicted for the performance levels of the buildings.
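Two basic ingredients of such probabilistic hazard studies can be illustrated in a few lines: a Gutenberg–Richter recurrence rate for the seismic sources and a Poisson model for the probability of exceedance within a design life. The a- and b-values below are arbitrary; the study's actual source model and attenuation relationships are far more involved.

```python
import math

def gr_annual_rate(m, a, b):
    """Annual rate of events with magnitude >= m, Gutenberg-Richter: log10 N = a - b*m."""
    return 10.0 ** (a - b * m)

def exceedance_probability(rate, years):
    """Poisson probability of at least one exceedance within 'years'."""
    return 1.0 - math.exp(-rate * years)
```

For example, an annual exceedance rate of 0.01 gives a probability of about 39% in 50 years, while the familiar 10%-in-50-years design level corresponds to a return period of roughly 475 years.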
The floods of 2002 and 2013, as well as the recent flood of 2021, caused billions of euros' worth of property damage in Germany. The aim of the project Innovative Vulnerability and Risk Assessment of Urban Areas against Flood Events (INNOVARU) was the development of a practicable flood damage model that enables realistic damage statements for the residential building stock. In addition to the determination of local flood risks, it also takes into account the vulnerability of individual buildings and allows for the prognosis of structural damage. In this paper, we discuss an improved method for the prognosis of structural damage due to flood impact. Detailed correlations between inundation level and flow velocity, depending on the vulnerability of the building type as well as the number of storeys, are considered. Because reliable damage data from events with high flow velocities were not available, an innovative approach was adopted to cover a wide range of flow velocities: the proposed approach combines comprehensive damage data collected after the 2002 flood in Germany with damage data from the 2011 Tohoku earthquake tsunami in Japan. The application of the developed methods enables a reliable reinterpretation of the structural damage caused by the August 2002 flood in six study areas in the Free State of Saxony.
The fracture of the microcapsules is essential for releasing the healing agent that heals cracks in encapsulation-based self-healing concrete. Capsular clustering, generated during the concrete mixing process, is considered one of the critical factors in the fracture mechanism. Since the literature lacks studies on this issue, self-healing concrete cannot be designed without an appropriate modelling strategy. In this paper, the effects of microcapsule size and clustering on the fracture of microcapsules are studied computationally. A simple 2D computational modelling approach is developed based on the eXtended Finite Element Method (XFEM) and the cohesive surface technique. The proposed model shows that microcapsule size and clustering play significant roles in governing the load-carrying capacity and the crack propagation pattern, and determine whether a microcapsule fractures or debonds from the concrete matrix. The larger the circumferential contact length of a microcapsule, the higher the load-carrying capacity. When it is below 25% of the microcapsule circumference, debonding of the microcapsule from the concrete becomes more likely. The greater the core/shell ratio (i.e. the smaller the shell thickness), the greater the likelihood that microcapsules fracture.
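The reported 25% criterion can be encoded as a trivial decision rule. This is purely a restatement of the finding above for illustration, not a component of the XFEM model itself.

```python
def capsule_failure_mode(contact_length, circumference, threshold=0.25):
    """Predict 'debond' when the bonded share of the capsule circumference is
    below the reported threshold (25% of the circumference), else 'fracture'."""
    ratio = contact_length / circumference
    return "debond" if ratio < threshold else "fracture"
```

In a design setting, such a rule would flag clustered capsules with small bonded contact areas as likely to debond without releasing their healing agent.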
Operator Calculus Approach to Comparison of Elasticity Models for Modelling of Masonry Structures
(2022)
The solution of any engineering problem starts with a modelling process aimed at formulating a mathematical model that describes the problem under consideration with sufficient precision. Because of the heterogeneity of modern engineering applications, mathematical modelling nowadays ranges from incredibly precise micro- and even nano-modelling of materials to macro-modelling, which is more appropriate for practical engineering computations. In the field of masonry structures, a macro-model of the material can be constructed based on various elasticity theories, such as classical elasticity, micropolar elasticity and Cosserat elasticity. Evidently, a different macro-behaviour is expected depending on the specific theory used in the background. Although there have been several theoretical studies of different elasticity theories in recent years, there is still a lack of understanding of how the modelling assumptions of different elasticity theories influence the modelling results for masonry structures. Therefore, a rigorous approach to the comparison of different three-dimensional elasticity models based on quaternionic operator calculus is proposed in this paper. Three elasticity models are described and spatial boundary value problems for these models are discussed. In particular, explicit representation formulae for their solutions are constructed. Using these representation formulae, explicit estimates for the solutions obtained by the different elasticity theories are then derived. Finally, several numerical examples are presented, which indicate a practical difference between the solutions.
Owing to their often limited human and financial resources, small municipalities in rural areas have so far been only sporadically active in the fields of energy efficiency and renewable energy. The question therefore arises repeatedly how the federal and state climate protection strategies can be implemented there cost-effectively with the available staff. Against this background, a tool is being developed that enables an active, largely barrier-free entry into this subject with little effort.
The core of the software solution is a process-oriented development and facilitation model for testing and implementing affordable options for energy savings and efficient energy use in predominantly rural areas.
It enables municipalities to enter the necessary processes of the energy and heating transition. The modular structure is not intended to fully replace the regular steps of the necessary (integrated) planning processes. Rather, the online application can generate, largely automatically, concrete proposals for measures that form a solid foundation for the future energy development of the municipalities.
For targeted validation of the results and the derivation of potential measures, model municipalities in Thuringia, Bavaria and Hesse are involved as real-world laboratories during the trial phase.
So far, the tool is available only to the participating model municipalities. The developed software solution is to be made available step by step to all interested municipalities, together with various aids and a number of other practical components.
Analyses of the existing building stock at district level produce large amounts of image data for documentation purposes. Afterwards, these data often cannot be unambiguously assigned to exact locations and viewing angles on the building; this applies in particular to people unfamiliar with the site and to detail shots. Capturing thermal bridges or other building details with thermograms poses an additional challenge. In practice, analogue, error-prone solutions are often used here.
Georeferencing can close this gap and ensure unambiguous communication and evaluation. In contrast to conventional cameras, state-of-the-art smartphones are sufficiently equipped to record not only the location but also the orientation angles of an image. The georeferenced images can be integrated manually into an existing district model on the basis of the information recorded in the so-called Exif data.
The user-friendly implementation is tested on a university model district and examined for its automation potential in Python. An existing district model served as the geometric basis and was extended with RGB images and thermograms. The described procedure is examined for its possible use in an energetic district survey and in the documentation of building damage.
This contribution provides users with a tool that enables high-quality documentation of an as-is building survey, including at district level.
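The core Exif-to-model step — converting GPS degree/minute/second tags into decimal coordinates and bundling them with the recorded viewing direction — can be sketched as follows. Reading the tags from an actual image file would additionally require a library such as Pillow; the record layout used here is an assumption for illustration.

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert Exif-style (deg, min, sec) GPS coordinates to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def camera_pose(exif_gps, exif_direction):
    """Bundle position and viewing direction into one record for a district model."""
    lat = dms_to_decimal(*exif_gps["lat"], exif_gps["lat_ref"])
    lon = dms_to_decimal(*exif_gps["lon"], exif_gps["lon_ref"])
    return {"lat": lat, "lon": lon, "heading_deg": exif_direction}
```

Such a pose record is what allows an RGB image or thermogram to be placed at the correct location and viewing angle in an existing district model.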
Data acquisition systems and methods to capture high-resolution images or reconstruct 3D point clouds of existing structures are an effective way to document their as-is condition. These methods enable a detailed analysis of building surfaces, providing precise 3D representations. However, for the condition assessment and documentation, damages are mainly annotated in 2D representations, such as images, orthophotos, or technical drawings, which do not allow for the application of a 3D workflow or automated comparisons of multitemporal datasets. In the available software for building heritage data management and analysis, a wide range of annotation and evaluation functions are available, but they also lack integrated post-processing methods and systematic workflows. The article presents novel methods developed to facilitate such automated 3D workflows and validates them on a small historic church building in Thuringia, Germany. Post-processing steps using photogrammetric 3D reconstruction data along with imagery were implemented, which show the possibilities of integrating 2D annotations into 3D documentations. Further, the application of voxel-based methods on the dataset enables the evaluation of geometrical changes of multitemporal annotations in different states and the assignment to elements of scans or building models. The proposed workflow also highlights the potential of these methods for condition assessment and planning of restoration work, as well as the possibility to represent the analysis results in standardised building model formats.
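The voxel-based comparison of multitemporal annotations mentioned above can be reduced to a small sketch: points are binned into voxel indices, and the symmetric difference of occupied voxels flags candidate geometric changes between two epochs. The voxel size and data layout are illustrative assumptions, not the article's implementation.

```python
def voxelize(points, voxel_size):
    """Assign 3D points (e.g. annotated damage regions) to integer voxel indices."""
    voxels = {}
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)
        voxels.setdefault(key, []).append(p)
    return voxels

def changed_voxels(voxels_t0, voxels_t1):
    """Voxels occupied in only one epoch -> candidate geometric change."""
    return set(voxels_t0) ^ set(voxels_t1)
```

Because voxel indices are stable across epochs, annotations from different recording campaigns can be compared without aligning raw point clouds point by point.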
In this paper, we present an open-source code for the first-order and higher-order nonlocal operator method (NOM), including a detailed description of the implementation. The NOM is based on so-called supports, dual-supports, nonlocal operators, and an operator energy functional ensuring stability. The nonlocal operator is a generalization of the conventional differential operators. Combined with the method of weighted residuals and variational principles, the NOM establishes the residual and the tangent stiffness matrix of the operator energy functional through simple matrix operations, without the need for shape functions as in classical computational methods such as FEM. The NOM only requires the definition of the energy, drastically simplifying its implementation. The implementation in this paper focuses on linear elastic solids for the sake of conciseness, though the NOM can handle more complex nonlinear problems. The NOM is flexible and efficient for solving partial differential equations (PDEs), and it is also quite easy for readers to use the NOM and extend it to other complicated physical phenomena described by one or a set of PDEs. Finally, we present some classical benchmark problems, including the cantilever beam and the plate-with-a-hole problem, and we also extend the method to complicated problems including phase-field fracture modeling and gradient elasticity materials.
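The flavour of a first-order nonlocal operator can be conveyed by a 1D sketch: the derivative at a node is obtained from weighted differences over all nodes in its support, and the construction reproduces linear fields exactly. The weight function and horizon below are arbitrary choices for illustration, not those of the published code.

```python
def nonlocal_derivative(x, u, i, horizon):
    """First-order nonlocal derivative of field u at node i, using all nodes
    within the support (distance <= horizon) and a distance-decay weight."""
    num = den = 0.0
    for j in range(len(x)):
        r = x[j] - x[i]
        if j != i and abs(r) <= horizon:
            w = 1.0 / abs(r)           # simple decay weight (assumption)
            num += w * (u[j] - u[i]) * r
            den += w * r * r
    return num / den
```

For a linear field u = 2x + 1 the operator returns exactly 2, independently of the weight chosen, which mirrors the consistency property that makes such operators usable in place of shape-function gradients.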
Realistic uncertainty description incorporating aleatoric and epistemic uncertainties can be achieved within the framework of polymorphic uncertainty, which is computationally demanding. Utilizing a domain decomposition approach for random-field-based uncertainty models, the proposed level-based sampling method can reduce these computational costs significantly and shows good agreement with a standard sampling technique. While 2-level configurations tend to become unstable with decreasing sampling density, 3-level setups show encouraging results for the investigated reliability analysis of a structural unit square.
Bolted connections are commonly used in steel construction. The load-bearing behavior of bolted fittings has been studied extensively in various research activities, and the bearing capacity of bolted connections can be assessed well by standard regulations for practical applications. With regard to tensile loading, the nut does not strongly influence the resistance, since failure occurs in the bolt due to the higher material strength of the nut. In some applications, so-called "blind holes" are used to connect plated components. In a manner of speaking, the nut is replaced by the "outer" plate with a prefabricated hole and thread, into which the bolt can be screwed and tightened. In such connections, the limit load capacity cannot be assessed solely from the bolt resistance, since the threaded hole in the base material strongly influences the structural behavior. In this context, the available screw-in depth of the blind hole is of fundamental importance. The German National Annex of EN 1993-1-8 provides information on the depth necessary to transfer the full tensile capacity of the bolt. However, some connections do not allow such depths to be fabricated. In these cases, the capacity of the connection is unclear and not specified. In this paper, first experiments on corresponding connections with different screw-in depths are presented and compared to the limit load capacities according to the standard.
Marine macroalgae such as Ulva intestinalis have promising properties as feedstock for cosmetics and pharmaceuticals. However, since the quantity and quality of naturally grown algae vary widely, their exploitability is reduced – especially for producers in high-priced markets. Moreover, the expansion of marine or shore-based cultivation systems is unlikely in Europe, since promising sites lie in fishing zones, recreational areas, or natural reserves. The aim was therefore to develop a closed photobioreactor system enabling full control of abiotic environmental parameters and an effective reconditioning of the cultivation medium in order to produce marine macroalgae at sites distant from the shore. To assess the feasibility and functionality of the chosen technological concept, a prototype plant was implemented in central Germany – a site distant from the sea. Using a newly developed, submersible LED light source, cultivation experiments with Ulva intestinalis led to growth rates of 7.72 ± 0.04 % day−1 in a cultivation cycle of 28 days. Based on the space demand of the production system, this results in a fresh mass productivity of 3.0 kg m−2, or 1.1 kg m−2 per year. Considering also the ratio of biomass to energy input of 2.76 g kWh−1, significant future improvements of the developed photobioreactor system should include the optimization of growth parameters and the reduction of the system’s overall energy demand.
Scaling of concrete due to salt frost attack is an important durability issue in moderate and cold climates. The actual damage mechanism is still not completely understood. Two recent damage theories—the glue spall theory and the cryogenic suction theory—offer plausible but conflicting explanations for the salt frost scaling mechanism. The present study deals with the cryogenic suction theory, which assumes that freezing concrete can take up unfrozen brine from a partly frozen deicing solution during salt frost attack. According to the model hypothesis, the resulting saturation of the concrete surface layer intensifies the ice formation in this layer and causes salt frost scaling. In this study, an experimental technique was developed that makes it possible to quantify to what extent brine uptake can increase ice formation in hardened cement paste (used as a model material for concrete). The experiments were carried out with low-temperature differential scanning calorimetry, where specimens were subjected to freeze–thaw cycles while in contact with NaCl brine. The results showed that the ice content in the specimens increased over subsequent freeze–thaw cycles due to brine uptake at temperatures below 0 °C. At the same time, the ability of the hardened cement paste to bind chlorides from the absorbed brine affected the freezing/melting behavior of the pore solution and the magnitude of the ice content.
The derivation of nonlocal strong forms for many physical problems remains cumbersome in traditional methods. In this paper, we apply the variational principle/weighted residual method based on the nonlocal operator method to derive nonlocal forms for elasticity, thin plates, gradient elasticity, electro-magneto-elasticity, and the phase-field fracture method. The nonlocal governing equations are expressed as an integral form on the support and dual-support. The first example shows that nonlocal elasticity has the same form as dual-horizon non-ordinary state-based peridynamics. The derivation is simple and general, and it can efficiently convert many local physical models into their corresponding nonlocal forms. In addition, a criterion based on the instability of the nonlocal gradient is proposed for fracture modelling in linear elasticity. Several numerical examples are presented to validate the nonlocal elasticity and nonlocal thin plate formulations.
In the literature, the influence of openings in infill walls that are bounded by a reinforced concrete frame and excited by seismic drift forces in both the in-plane and out-of-plane directions remains uncharted. Therefore, a 3D micromodel was developed and subsequently calibrated to gain more insight into the topic. The micromodels were calibrated against equivalent physical test specimens from in-plane and out-of-plane drift-driven tests on frames with and without infill walls and openings, as well as out-of-plane bending tests of masonry walls. The micromodels were rectified based on their behavior and damage states. The calibration process revealed which parameters the micromodels were sensitive or insensitive to with regard to behavior and computational stability. It was found that, even within the same material model, some parameters had a greater effect when assigned to concrete than to masonry. Generally, the in-plane behavior of infilled frames was found to be largely governed by the interface material model. The out-of-plane masonry wall simulations were governed by the tensile strength of both the interface and the masonry material model, whereas the out-of-plane drift-driven test was governed by the concrete material properties.
Encapsulation-based self-healing concrete has recently received a lot of attention in the civil engineering field. Capsules are embedded in the cementitious matrix during concrete mixing. When cracks appear, the embedded capsules placed along the path of the incoming crack are fractured and release healing agents in the vicinity of the damage. The capsule materials need to be designed so that they break at small deformations, allowing the internal fluid to be released to seal the crack. This study focuses on computational modeling of fracture in encapsulation-based self-healing concrete. Numerical 2D and 3D models with randomly packed aggregates and capsules have been developed to analyze the fracture mechanisms that play a significant role in the fracture probability of capsules and consequently in the self-healing process. The capsules are assumed to be made of poly(methyl methacrylate) (PMMA), and the potential cracks are represented by pre-inserted cohesive elements with tension and shear softening laws along the element boundaries of the mortar matrix, aggregates, and capsules, and at the interfaces between these phases. The effects of the volume fraction, core-wall thickness ratio, and mismatched fracture properties of the capsules on the load-carrying capacity of self-healing concrete and the fracture probability of the capsules are investigated. The output of this study will become a valuable tool assisting not only experimentalists but also manufacturers in designing appropriate capsule materials for self-healing concrete.
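The tension and shear softening laws carried by the pre-inserted cohesive elements are typically bilinear traction–separation relations. A minimal sketch of such a law (parameter values purely illustrative, not taken from the study):

```python
def bilinear_traction(delta, delta0=0.01, delta_f=0.1, t_max=3.0):
    """Bilinear cohesive law: linear elastic rise to the peak traction
    t_max at separation delta0, then linear softening to zero traction
    at the final separation delta_f (fully debonded)."""
    if delta <= 0.0:
        return 0.0
    if delta < delta0:
        return t_max * delta / delta0                          # elastic branch
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # softening
    return 0.0                                                 # debonded
```

The area under this curve is the fracture energy of the interface; in a cohesive-element model, assigning different `t_max` and `delta_f` to mortar, aggregate, capsule wall, and the interfaces between them is what controls whether a crack runs through or around a capsule.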
A vast number of existing buildings were constructed before the development and enforcement of seismic design codes and run the risk of being severely damaged under seismic excitations. This poses not only a threat to human life but also affects the socio-economic stability of the affected area. Therefore, it is necessary to assess the present vulnerability of such buildings to make an educated decision on risk mitigation by seismic strengthening techniques such as retrofitting. However, it is not economically or temporally feasible to inspect, repair, and augment every old building on an urban scale. As a result, reliable rapid screening methods, namely Rapid Visual Screening (RVS), have garnered increasing interest among researchers and decision-makers alike. In this study, the effectiveness of five different Machine Learning (ML) techniques in vulnerability prediction applications has been investigated. Damage data from four different earthquakes, in Ecuador, Haiti, Nepal, and South Korea, have been utilized to train and test the developed models. Eight performance modifiers have been implemented as variables with supervised ML. The investigations in this paper illustrate that the vulnerability classes assessed by the ML techniques were very close to the actual damage levels observed in the buildings.
In this paper we present the theoretical background for a coupled analytical–numerical approach to model the crack propagation process in two-dimensional bounded domains. The goal of the coupled analytical–numerical approach is to obtain the correct solution behaviour near the crack tip with the help of an analytical solution constructed using tools of complex function theory, and to couple it continuously with the finite element solution in the region far from the singularity. In this way, crack propagation can be modelled without remeshing. Possible directions of crack growth can be calculated by minimizing the total energy, composed of the potential energy and the dissipated energy based on the energy release rate. Within this setting, an analytical solution of a mixed boundary value problem based on complex analysis and conformal mapping techniques is presented for a circular region containing an arbitrary crack path. More precisely, the linear elastic problem is transformed into a Riemann–Hilbert problem in the unit disk for holomorphic functions. Utilising the advantages of the analytical solution in the region near the crack tip, the total energy can be evaluated within short computation times for various crack kink angles and lengths, leading to a potentially efficient way of computing the minimization procedure. To this end, the paper presents a general strategy for the new coupled approach to crack propagation modelling. Additionally, we discuss obstacles in the way of a practical realisation of this strategy.
In this work, extensive reactive molecular dynamics simulations are conducted to analyze nanopore creation by nanoparticle impact on single-layer molybdenum disulfide (MoS2) in its 1T and 2H phases. We also compare the results with a graphene monolayer. In our simulations, the nanosheets are exposed to a spherical rigid carbon projectile with high initial velocities ranging from 2 to 23 km/s. The results for the three structures are compared to identify the most critical factors in the perforation and the resistance force during impact. To analyze the perforation and impact resistance, the kinetic energy and displacement time history of the projectile, as well as its perforation resistance force, are investigated.
Interestingly, although the elastic modulus and tensile strength of graphene are almost five times higher than those of MoS2, the results demonstrate that the 1T and 2H MoS2 phases are more resistant to impact loading and perforation than graphene. For the MoS2 nanosheets, we find that the 2H phase is more resistant to impact loading than its 1T counterpart.
Our reactive molecular dynamics results highlight that, in addition to strength and toughness, the atomic structure is another crucial factor that can contribute substantially to the impact resistance of 2D materials. The obtained results can help guide experimental setups for nanopore creation in MoS2 or other 2D lattices.
Discrete function theory in the higher-dimensional setting has been in active development for many years. However, the available results focus on the discrete setting for canonical domains such as the half-space, while the case of bounded domains has generally remained unconsidered. Therefore, this paper extends the higher-dimensional function theory to arbitrary bounded domains in Rn. Along the way, a discrete Stokes formula, a discrete Borel–Pompeiu formula, and discrete Hardy spaces for general bounded domains are constructed. Finally, several discrete Hilbert problems are considered.
When it comes to monitoring huge structures, the main issues are limited time, high costs, and handling large amounts of data. Methods from the field of optimal design of experiments are useful and supportive in reducing and managing them. Having optimal experimental designs at hand before conducting any measurements leads to a highly informative measurement concept in which the sensor positions are optimized with respect to minimal errors in the structure's models. To reduce computational time, a combined two-step approach using the Fisher information matrix and the mean-squared error is proposed under consideration of different error types. The error descriptions contain random/aleatoric and systematic/epistemic portions. Applying this combined approach to a finite element model, using artificial acceleration time measurement data with artificially added errors, yields the optimized sensor positions. These findings are compared with results from laboratory experiments on the modeled structure, a tower-like cantilever represented by a hollow pipe. In conclusion, the combined approach leads to a sound experimental design that yields a good estimate of the structure's behavior and model parameters without the need for preliminary measurements for model updating.
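A widely used way to turn the Fisher information matrix into sensor positions is the effective independence (EfI) method, which greedily deletes the candidate position contributing least to the information content of the measured mode shapes. The following numpy sketch illustrates the idea; the cantilever mode-shape matrix is a made-up example, not the paper's model:

```python
import numpy as np

def effective_independence(phi, n_sensors):
    """Effective independence (EfI): iteratively delete the candidate
    position with the smallest leverage on the Fisher information matrix
    FIM = phi^T phi, where the rows of phi are the mode-shape values at
    the candidate sensor DOFs."""
    keep = list(range(phi.shape[0]))
    phi = phi.copy()
    while len(keep) > n_sensors:
        fim_inv = np.linalg.inv(phi.T @ phi)
        # E_d[i] = phi_i . FIM^-1 . phi_i  (contribution of position i)
        ed = np.einsum('ij,jk,ik->i', phi, fim_inv, phi)
        worst = int(np.argmin(ed))             # least informative position
        phi = np.delete(phi, worst, axis=0)
        del keep[worst]
    return keep

# Ten candidate positions along a cantilever, two illustrative mode shapes.
x = np.linspace(0.1, 1.0, 10)
phi = np.column_stack([x**2, x**2 * (x - 0.7)])
selected = effective_independence(phi, 3)
```

Each retained index identifies a sensor position; ranking by `ed` rather than recomputing every subset keeps the greedy search cheap compared with evaluating all position combinations.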
This study proposes an efficient Bayesian, frequency-based damage identification approach to identify damage in cantilever structures with an acceptable error rate, even at high noise levels. The catenary poles of electric high-speed train systems were selected as a realistic case study covering the objectives of this study. Compared to other frequency-based damage detection approaches described in the literature, the proposed approach efficiently detects damage in cantilever structures at higher levels of damage identification, namely identifying both the damage location and severity, using a low-cost structural health monitoring (SHM) system with a limited number of sensors, for example accelerometers. The integration of Bayesian inference as a stochastic framework makes it possible to exploit data fusion in merging informative data from multiple damage features, which increases the quality and accuracy of the results. The findings provide the decision-maker with the information required to manage maintenance, repair, or replacement procedures.
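The kind of Bayesian, frequency-based identification described above can be illustrated with a grid-based posterior over damage location and severity, given measured natural frequencies. The forward model, frequencies, and noise level below are hypothetical stand-ins, not the study's pole model:

```python
import numpy as np

def predicted_freqs(location, severity):
    """Hypothetical forward model: each natural frequency of the intact
    cantilever drops by an amount depending on where the damage sits
    and how severe it is."""
    intact = np.array([1.2, 7.5, 21.0])            # Hz, illustrative
    sens = np.array([np.cos(location), np.sin(2.0 * location), 1.0]) ** 2
    return intact * (1.0 - 0.1 * severity * sens)

def map_damage(measured, sigma=0.05):
    """Grid-based Bayesian update: Gaussian likelihood of the measured
    frequencies, uniform prior over damage location and severity.
    Returns the maximum-a-posteriori (MAP) estimate."""
    locs = np.linspace(0.0, 1.0, 50)               # normalized position
    sevs = np.linspace(0.0, 1.0, 50)               # stiffness loss fraction
    post = np.zeros((locs.size, sevs.size))
    for i, loc in enumerate(locs):
        for j, sev in enumerate(sevs):
            resid = measured - predicted_freqs(loc, sev)
            post[i, j] = np.exp(-0.5 * np.sum((resid / sigma) ** 2))
    post /= post.sum()                             # normalize the posterior
    i, j = np.unravel_index(np.argmax(post), post.shape)
    return locs[i], sevs[j]

loc_hat, sev_hat = map_damage(predicted_freqs(0.4, 0.6))
```

Fusing several damage features, as the paper does, amounts to multiplying additional likelihood terms into `post` before normalizing, which sharpens the posterior over location and severity.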
"Quality management (QM), in the sense of smoothly functioning workflows, is an indispensable part of every office, regardless of its size and core business, and regardless of whether a certification procedure is undergone or not. Over the years, engineering and architecture offices typically accumulate numerous individual organizational rules intended to ease day-to-day work. A systematic compilation, introduction, and monitoring of these rules, however, is often omitted. Those responsible are frequently deterred by the supposed effort of systematically compiling them into a QM manual, and by the supposedly far greater effort of an external review in the course of an external audit with subsequent certification. The benefit that arises from a precisely tailored and actively practiced QM manual alone therefore goes unrealized.
The QM standard „Planer am Bau“ (PaB) is a sector-specific standard developed specifically for engineering and architecture offices and takes only their concerns into account. The result, based on clearly defined minimum requirements, is lean manuals adapted to the particularities of each office, which can be audited and certified by TÜV Rheinland. This certificate provides proof of an effective QM system, which can be advantageous, among other things, in VgV tender procedures."
In recent decades, a multitude of concepts and models have been developed to understand, assess, and predict muscular mechanics in the context of physiological and pathological events.
Most of these models are highly specialized and designed to selectively address fields in, e.g., medicine, sports science, forensics, product design, or CGI; their data are often not transferable to other fields of application. A single universal model covering the details of biochemical and neural processes, as well as the development of internal and external force and motion patterns and appearance, would not be practical given the diversity of the questions to be investigated and the need to find answers efficiently. With reasonable limitations, though, a generalized approach is feasible.
The objective of the work at hand was to develop a model for muscle simulation which covers the phenomenological aspects and is thus universally applicable in domains where specialized models have been used up to now. This includes investigations of active and passive motion, structural interaction of muscles within the body and with external elements, for example in crash scenarios, but also research topics like the verification of in vivo experiments and parameter identification. For this purpose, elements for the simulation of incompressible deformations were studied, adapted, and implemented into the finite element code SLang. Various anisotropic, visco-elastic muscle models were developed or enhanced. The applicability was demonstrated on the basis of several examples, and a general basis for the implementation of further material models was developed and elaborated.
The Marmara Region (NW Turkey) has experienced significant earthquakes (M > 7.0) to date, and a destructive earthquake is also expected in the region. To determine the effect of the site-specific design spectrum, eleven provinces located in the region were chosen according to the Turkish Earthquake Building Code updated in 2019. Additionally, the differences between the previous and updated regulations of the country were investigated. Peak Ground Acceleration (PGA) and Peak Ground Velocity (PGV) were obtained for each province using earthquake ground motion levels with 2%, 10%, 50%, and 68% probability of exceedance in 50-year periods. The PGA values in the region range from 0.16 to 0.7 g for earthquakes with a return period of 475 years. For each province, a sample reinforced-concrete building with two different numbers of stories and otherwise identical ground and structural characteristics was chosen. Static adaptive pushover analyses were performed for the sample building using each province's design spectrum. The variations in the earthquake and structural parameters were investigated according to geographic location. It was determined that the site-specific design spectrum significantly influences target displacements for performance-based assessments of buildings, due to the seismicity characteristics of the studied location.
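The site-specific design spectra underlying such pushover analyses share the plateau-and-decay shape prescribed by modern codes. A generic Eurocode-8-style elastic spectrum can be sketched as follows; the corner periods and soil/damping factors are illustrative defaults, not the Turkish code values for any particular province:

```python
def elastic_spectrum(T, pga, tb=0.15, tc=0.4, td=2.0, s=1.0, eta=1.0):
    """Generic elastic acceleration response spectrum Se(T) with four
    branches: linear rise up to tb, plateau at 2.5*pga*s*eta between tb
    and tc, constant-velocity (1/T) decay up to td, then
    constant-displacement (1/T^2) decay."""
    if T <= tb:
        return pga * s * (1.0 + T / tb * (eta * 2.5 - 1.0))
    if T <= tc:
        return pga * s * eta * 2.5                 # plateau
    if T <= td:
        return pga * s * eta * 2.5 * tc / T        # constant velocity
    return pga * s * eta * 2.5 * tc * td / T ** 2  # constant displacement

sa_1s = elastic_spectrum(1.0, pga=0.4)   # spectral acceleration at T = 1 s
```

Because the target displacement of a pushover analysis is read from such a spectrum at the structure's effective period, changing the province-specific PGA and corner periods shifts the demand directly, which is the effect the study quantifies.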
A Machine Learning Framework for Assessing Seismic Hazard Safety of Reinforced Concrete Buildings
(2020)
Although averting a seismic disturbance and its physical, social, and economic disruption is practically impossible, advancements in computational science and numerical modeling can equip humanity to predict its severity, understand the outcomes, and prepare for post-disaster management. Many aged buildings amidst developed metropolitan areas are still in service, designed before the establishment of national seismic codes or without the introduction of construction regulations. In that case, risk reduction is significant for developing alternatives and designing suitable models to enhance the performance of existing structures. Such models will be able to classify risks and casualties related to possible earthquakes through emergency preparation. Thus, it is crucial to recognize structures that are susceptible to earthquake vibrations and need to be prioritized for retrofitting. However, studying each building's behavior under seismic actions through full structural analysis is unrealistic because of the rigorous computations, long duration, and substantial expenditure involved. This calls for a simple, reliable, and accurate process known as Rapid Visual Screening (RVS), which serves as a primary screening platform comprising an optimum number of seismic parameters and predetermined performance damage conditions for structures. In this study, the damage classification technique was studied, and the efficacy of the Machine Learning (ML) method in damage prediction via a Support Vector Machine (SVM) model was explored. The ML model is trained and tested separately on damage data from four different earthquakes, namely in Ecuador, Haiti, Nepal, and South Korea. Each dataset consists of a varying number of input data and eight performance modifiers.
Based on the study and the results, the ML model using SVM classifies the given input data into the corresponding classes and accomplishes the hazard safety evaluation of buildings.
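A miniature version of such an SVM classifier can be sketched without any ML library: the code below trains a linear SVM by sub-gradient descent on the regularized hinge loss, using two synthetic "performance modifier" features and synthetic damage labels (all data and hyperparameters are illustrative, not the study's datasets):

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.05):
    """Linear SVM via sub-gradient descent on the regularized hinge loss
    L = lam/2 * |w|^2 + mean(max(0, 1 - y_i (w.x_i + b))), y_i in {-1,+1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:                      # inside the margin: push out
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:                               # only the regularizer acts
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b >= 0 else -1

# Toy data: two "performance modifiers"; label +1 = damaged, -1 = safe.
rng = random.Random(0)
X = [[rng.uniform(0, 1), rng.uniform(0, 1)] for _ in range(100)]
y = [1 if x1 + x2 > 1.0 else -1 for x1, x2 in X]
w, b = train_linear_svm(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

Real RVS datasets have eight modifiers instead of two and multiple damage grades instead of a binary label, which is typically handled with one-vs-rest copies of the same binary classifier.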
Recently, the demand for residence and use of urban infrastructure has increased, raising the risk to human lives from natural calamities. The occupancy demand has rapidly increased the construction rate, while inadequately designed structures are all the more vulnerable. Buildings constructed before the development of seismic codes are additionally susceptible to earthquake vibrations. Structural collapse causes economic loss as well as setbacks for human lives. Applying different theoretical methods to analyze structural behavior is expensive and time-consuming. Therefore, introducing a rapid vulnerability assessment method to check structural performance is necessary for future developments. The process mentioned earlier is known as Rapid Visual Screening (RVS). This technique was created to identify, inventory, and screen potentially hazardous structures. Sometimes, poor construction quality means that some of the required parameters are unavailable; in this case, the RVS process becomes tedious. Hence, to tackle such situations, multiple-criteria decision-making (MCDM) methods open a new gateway for seismic vulnerability assessment. The various parameters required by RVS can be taken up in MCDM, which evaluates multiple conflicting criteria in decision making across several fields. This paper aims to bridge the gap between RVS and MCDM. Furthermore, to define the correlation between these techniques, the methodologies from the Indian, Turkish, and Federal Emergency Management Agency (FEMA) codes have been implemented. The effects of seismic vulnerability of structures have been observed and compared.
The gases oxygen and nitrogen are required for a wide range of technical, industrial, biological, and medical purposes. Beyond the classical metalworking and chemical industries, oxygen is used above all in medicine, in the optimization of combustion and sewage treatment plants, and in fish farming, while nitrogen serves as a protective or inert gas in the plastics industry, in aerospace, and in fire protection.
The supply of oxygen and nitrogen is realized almost exclusively by separation from ambient air, which consists of roughly 78 vol% nitrogen, 21 vol% oxygen, and 1 vol% trace gases (Ar, CO2, Ne, He, ...). Air separation processes established on the market are the Linde process, pressure swing adsorption (PSA), and various membrane processes. The required gases are thereby either produced directly on site at the consumer (PSA and polymer membrane processes: low purities) or produced centrally in large plants (Linde process: high purities) and subsequently delivered to the consumer as bottled or tank gases (transport costs).
For smaller consumers with high purity requirements for oxygen or nitrogen, the only option is to purchase the gases as cost-intensive transported gases from central gas suppliers, thereby entering into a dependency on them (supply contracts, bottle/tank rentals, ...) and maintaining their own storage for the required gases (additional effort, storage costs, space requirements).
The aim of this work is to develop ceramic material systems based on high-temperature chemical reactions as reactive oxide ceramics and to investigate them with regard to their possible use for oxygen separation in novel air separation plants.
In principle, such plants are to follow the concept of regenerative oxygen separation, with the reactive oxide ceramics serving as a fixed-bed material in their reactors, alternately loaded with air and unloaded under vacuum or oxygen-poor atmospheres.
The use of reactive oxide ceramics, which would offer higher oxygen exchange quantities and rates than previous materials while providing a long service life, corrosion resistance, and relatively simple handling, is intended as a step toward an efficient alternative air separation technology.
In the best case, an air separation plant based on reactive oxide ceramics would make it possible to produce very pure oxygen and, at the same time, oxygen-free inert gas in small plants, as well as to enrich or deplete the oxygen content of air, process gases, or exhaust gases.
Such a technology based on reactive oxide ceramics would thus have a very broad range of applications and, accordingly, enormous economic potential.
The Institute of Structural Engineering, the Institute of Structural Mechanics, and the Institute for Computing, Mathematics and Physics in Civil Engineering at the Faculty of Civil Engineering of the Bauhaus-Universität Weimar presented special topics of structural engineering to highlight the broad spectrum of civil engineering in the field of modeling and simulation.
The summer course sought to impart knowledge and to combine research with a practical context through a challenging and demanding series of lectures, seminars, and project work. Participating students were enabled to work with advanced methods and their practical application.
The extraordinary format of the interdisciplinary summer school offers foreign and domestic students the opportunity to study advanced developments in numerical methods and sophisticated modelling techniques across different disciplines of civil engineering, going far beyond traditional graduate courses.
The proceedings at hand are the result of the Bauhaus Summer School course Forecast Engineering held at the Bauhaus-Universität Weimar in 2018. They summarize the results of the conducted project work and provide the abstracts/papers of the participants' contributions, as well as impressions from the accompanying programme and organized cultural activities.
The design of engineering structures, today as in the past, takes place on the basis of static calculations. The consideration of uncertainties in model quality becomes more and more important with the development of new construction methods and design requirements. In addition to the traditional force-based approaches, experience and observations of the deformation behavior of components and the overall structure under different exposure conditions allow the introduction of novel detection and evaluation criteria.
The proceedings at hand are the result of the Bauhaus Summer School course Forecast Engineering held at the Bauhaus-Universität Weimar in 2017. They summarize the results of the conducted project work and provide the abstracts of the participants' contributions, as well as impressions from the accompanying programme and organized cultural activities.
The special character of this course lies in the combination of basic disciplines of structural engineering with applied research projects in the areas of steel and reinforced concrete structures, earthquake and wind engineering, and informatics, linking them to mathematical methods and modern visualization tools. Its innovative character results from the ambitious engineering tasks and advanced modeling demands.
The proceedings at hand are the result of the International Master Course Module: "Nonlinear Analysis of Structures: Wind Induced Vibrations" held at the Faculty of Civil Engineering at Bauhaus-University Weimar, Germany in the summer semester 2019 (April - August). This material summarizes the results of the project work done throughout the semester, provides an overview of the topic, as well as impressions from the accompanying programme.
Wind engineering is a particular field of civil engineering that evaluates the response of structures to wind loads. Bridges, high-rise buildings, chimneys, and telecommunication towers may be susceptible to wind-induced vibrations due to their increased flexibility, so a special design is carried out for this aspect. Advances in technology and scientific studies permit research at small scale for more accurate analyses. Scaled models of real structures are therefore built and tested for various construction scenarios. These models are placed in wind tunnels, where experiments are conducted to determine parameters such as critical wind speeds for bridge decks and static wind coefficients and forces for buildings or bridges. The objective of the course was to offer students insight into the assessment of long-span cable-supported bridges and high-rise buildings under wind excitation. The participating students worked in interdisciplinary teams to deepen their understanding of the behaviour of wind-sensitive structures and the influences acting on them.
Turbomachinery plays an important role in many cases of energy generation or conversion and is therefore a promising starting point for optimization aimed at increasing the efficiency of energy use. In recent years, the use of automated optimization strategies in combination with numerical simulation has become increasingly popular in many fields of engineering. However, the complex interactions between fluid and solid mechanics encountered in turbomachines, on the one hand, and the high computational expense needed to calculate the performance, on the other, have prevented a widespread use of these techniques in this field. The objective of this work was the development of a strategy for efficient metamodel-based optimization of centrifugal compressor impellers, with the main focus on reducing the required numerical expense. The central idea followed in this research was to incorporate preliminary information acquired from low-fidelity computation methods and empirical correlations into the sampling process to identify promising regions of the parameter space. This information was then used to concentrate the numerically expensive high-fidelity computations of the fluid dynamic and structural mechanic performance of the impeller in these regions while still maintaining a good coverage of the whole parameter space. The development of the optimization strategy can be divided into three main tasks. Firstly, the available preliminary information had to be researched and rated. This research identified loss models based on one-dimensional flow physics and empirical correlations as the best-suited method to predict the aerodynamic performance. The loss models were calibrated using available performance data to obtain a high prediction quality.
As no sufficiently exact models for predicting the mechanical loading of the impeller could be identified, a metamodel based on finite element computations was chosen for this estimation. The second task was the development of a sampling method that concentrates samples in regions of the parameter space where the preliminary information predicts high-quality designs while maintaining a good overall coverage. As available methods like rejection sampling or Markov-chain Monte Carlo did not meet the requirements in terms of sample distribution and input correlation, a new multi-fidelity sampling method called “Filtered Sampling” was developed. The last task was the development of an automated computational workflow encompassing geometry parametrization, geometry generation, grid generation, and computation of the aerodynamic performance and the structural mechanic loading. Special emphasis was put on a geometry parametrization strategy based on fluid mechanic considerations to prevent the generation of physically inexpedient designs. Finally, the optimization strategy, which utilizes the previously developed tools, was successfully employed to carry out three optimization tasks. The efficiency of the method was proven by the first and second test cases, in which an existing compressor design was optimized by the presented method. The results were comparable to optimizations that did not take preliminary information into account, while the required computational expense could be halved. In the third test case, the method was applied to generate a new impeller design. In contrast to the previous examples, this optimization featured larger variations of the impeller designs. The applicability of the method to parameter spaces with significantly varying designs could therefore be proven as well.
Components of structural glazing have to meet different requirements and resist various impacts, depending on the field of application. Within an international research project of the EU innovation program Horizon 2020, special glass panes with a fluid circulating in capillaries are being developed to exploit solar energy. Major influences on this glazing are UV irradiation and fluid contact, which affect the mechanical and optical durability of the bonding material within the glass setup. With regard to visual requirements, acrylate adhesives and EVA films are analyzed as possible bonding materials by destructive and non-destructive testing methods. Two types of specimens are presented for obtaining the mechanical behavior and the surface appearance of the bonding material.
This paper describes the challenges of predicting traffic-induced pollutant immissions. Its focus is the development and implementation of a simulation environment for the evaluation of environmentally oriented traffic management strategies. The simulation environment is developed across the three domains of traffic, emission and immission, and is first applied to the evaluation of traffic measures for the Friedberger Landstraße in Frankfurt am Main.
Identification of flaws in structures is a critical element in the management of maintenance and quality assurance processes in engineering. Nondestructive testing (NDT) techniques based on a wide range of physical principles have been developed and are used in common practice for structural health monitoring. However, basic NDT techniques are usually limited in their ability to provide accurate information on the locations, dimensions and shapes of flaws. One alternative for extracting additional information from the results of NDT is to combine them with a computational model that provides a detailed analysis of the physical process involved and enables the accurate identification of the flaw parameters. The aim here is to develop strategies to uniquely identify cracks in two-dimensional (2D) structures under dynamic loadings.
A local NDT technique that combines the eXtended Finite Element Method (XFEM) with dynamic loading in order to identify cracks in structures quickly and accurately is developed in this dissertation. The Newmark-β method with Rayleigh damping is used for the time integration. We apply the Nelder-Mead (NM) and quasi-Newton (QN) methods to identify the crack tip in a plate. The inverse problem is solved iteratively, with XFEM used to solve the forward problem in each iteration. For a time-harmonic excitation with a single frequency and a short-duration signal measured along part of the external boundary, the crack is detected through the solution of an inverse time-dependent problem. Compared to static loads, we show that dynamic loads are more effective for crack detection problems. Moreover, we tested different dynamic loads and found that the NM method works more efficiently under harmonic loads than under pounding loads, while the QN method achieves almost the same results for both load types.
A global strategy that combines Multilevel Coordinate Search (MCS) with XFEM (XFEM-MCS) under dynamic electric loading to detect multiple cracks in 2D piezoelectric plates is proposed in this dissertation. The Newmark-β method is employed for the time integration, and in each iteration the forward problem is solved by XFEM for the various cracks. The objective functional is minimized using the global search algorithm MCS. The test problems show that the XFEM-MCS algorithm under dynamic electric loading can be effectively employed for multiple-crack detection in piezoelectric materials, and it proves to be robust in identifying defects in piezoelectric structures. Fiber-reinforced composites (FRCs) are extensively applied in practical engineering since they have high stiffness and strength. Experiments reveal a so-called interphase zone, i.e. the space between the outer interface of the fiber and the inner interface of the matrix. The interphase strength between the fiber and the matrix strongly affects the mechanical properties as a result of the large interface-to-volume ratio. For the purpose of understanding the mechanical properties of FRCs with a functionally graded interphase (FGI), a closed-form expression for the interface strength between a fiber and a matrix is obtained in this dissertation using a continuum modeling approach based on the van der Waals (vdW) forces. Based on the interatomic potential, we develop a new modified nonlinear cohesive law, which is applied to study the interface delamination of FRCs with FGI under different loadings. The analytical solutions show that the delamination behavior strongly depends on the interphase thickness, the fiber radius, and the Young's moduli and Poisson's ratios of the fiber and the matrix. Thermal conductivity is the property of a material to conduct heat.
With the development and deeper study of 2D materials, especially graphene and molybdenum disulfide (MoS2), the thermal conductivity of 2D materials has attracted wide attention. Classical molecular dynamics (MD) simulations show that the thermal conductivity of graphene nanoribbons (GNRs) tends to decrease under tensile strain. Hence, strain effects in graphene can play a key role in the continuous tunability and applicability of its thermal conductivity at the nanoscale, while the reduction of thermal conductivity is an obstacle for thermal management applications. Up to now, the thermal conductivity of graphene under shear deformation has not been investigated. From a practical point of view, good thermal management of GNRs has significant potential applications in future GNR-based thermal nanodevices, where it can greatly improve the performance of nanosized devices by dissipating heat. Meanwhile, since graphene is a thin membrane structure, it is also important to understand its wrinkling behavior under shear deformation. MoS2 exists in the stable semiconducting 1H phase (1H-MoS2), while the metallic 1T phase (1T-MoS2) is unstable at ambient conditions. Much attention has been focused on studying the nonlinear optical properties of 1H-MoS2. In very recent research, the 1T-type monolayer crystals of TMDCs, MX2 (MoS2, WS2, ...), were reported to have an intrinsic in-plane negative Poisson's ratio. Fortunately, at nearly the same time, an unprecedented long-term (>3 months) air stability of 1T-MoS2 was achieved by using the donor lithium hydride (LiH). Therefore, it is very important to study the thermal conductivity of 1T-MoS2.
The thermal conductivity of graphene under shear strain is systematically studied in this dissertation by MD simulations. The results show that, in contrast to the dramatic decrease of the thermal conductivity of graphene under uniaxial tension, the thermal conductivity of graphene is not sensitive to shear strain, decreasing by only 12-16%. Wrinkles evolve when the shear strain is around 5%-10%, but the thermal conductivity barely changes.
The thermal conductivities of single-layer 1H-MoS2 (1H-SLMoS2) and single-layer 1T-MoS2 (1T-SLMoS2) with different sample sizes, temperatures and strain rates have been studied systematically in this dissertation. We find that the thermal conductivities of 1H-SLMoS2 and 1T-SLMoS2 in both the armchair and the zigzag directions increase with increasing sample length, while increasing the width of the sample has only a minor effect on the thermal conduction of these two structures. Regarding the size effect, the thermal conductivity of 1H-SLMoS2 is smaller than that of 1T-SLMoS2. Furthermore, the temperature-effect results show that the thermal conductivities of both 1H-SLMoS2 and 1T-SLMoS2 decrease with increasing temperature. The thermal conductivities of 1H-SLMoS2 and 1T-SLMoS2 are nearly the same (difference <6%) in both chiral orientations at corresponding temperatures, especially in the armchair direction (difference <2.8%). Moreover, we find that the strain effects on the thermal conductivities of 1H-SLMoS2 and 1T-SLMoS2 differ. More specifically, the thermal conductivity decreases with increasing tensile strain for 1T-SLMoS2, while it fluctuates with growing strain for 1H-SLMoS2. Finally, we find that the thermal conductivity of same-sized 1H-SLMoS2 is similar to that of the strained 1H-SLMoS2 structure.
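The MD studies above report thermal conductivities; as background, in a direct (NEMD-style) setup the conductivity follows from Fourier's law, kappa = -J / (dT/dx), once the imposed heat flux and the steady-state temperature gradient are known. A small sketch with purely illustrative numbers (not values from the dissertation):

```python
# Fourier's law: J = -kappa * dT/dx, hence kappa = -J / (dT/dx).
# All numbers below are illustrative, not results from this work.

def thermal_conductivity(heat_flux, t_hot, t_cold, distance):
    """Effective conductivity in W/(m K) from an imposed heat flux
    (W/m^2) and the steady-state temperature drop over `distance` (m)."""
    gradient = (t_cold - t_hot) / distance  # dT/dx, negative for heat flowing +x
    return -heat_flux / gradient

# Example: 1e12 W/m^2 flux across a 100 nm sample with a 20 K drop.
kappa = thermal_conductivity(1e12, t_hot=320.0, t_cold=300.0, distance=100e-9)
print(kappa)  # ≈ 5000.0 W/(m K)
```

The extremely large flux is typical of NEMD setups, where nanometre-scale samples carry steep imposed gradients.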
Matrix-free voxel-based finite element method for materials with heterogeneous microstructures
(2019)
Modern image detection techniques such as micro computer tomography (μCT), magnetic resonance imaging (MRI) and scanning electron microscopy (SEM) provide us with high-resolution images of the microstructure of materials in a non-invasive and convenient way. They form the basis for the geometrical models of high-resolution analysis, so-called image-based analysis.
However, especially in 3D, discretizations of these models easily reach 100 million degrees of freedom and require extensive hardware resources in terms of main memory and computing power to solve the numerical model. Consequently, the focus of this work is to combine and adapt numerical solution methods so as to reduce first the memory demand and then the computation time, and thereby enable the execution of image-based analysis on modern desktop computers. Hence, the numerical model is a straightforward grid discretization of the voxel-based (pixels with a third dimension) geometry, which omits boundary detection algorithms and allows a reduced storage of the finite element data structure and a matrix-free solution algorithm.
This in turn reduces the effort of almost all applied grid-based solution techniques and results in memory-efficient and numerically stable algorithms for the microstructural models. Two variants of the matrix-free algorithm are presented. The efficient iterative solution method of conjugate gradients is used with preconditioners that can be applied matrix-free, such as the Jacobi and the especially well-suited multigrid method. The jagged material boundaries of the voxel-based mesh are smoothed through embedded boundary elements which carry different material information at the integration points and are integrated sub-cell-wise, though without additional boundary detection. The efficiency of the matrix-free methods can be retained.
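The matrix-free idea can be sketched in a few lines: the operator is applied on the fly as a stencil, so no global stiffness matrix is ever assembled, and the Jacobi preconditioner reduces to a diagonal scaling. The 1D Laplacian below is only a stand-in for the voxel stencil of the actual work:

```python
# Matrix-free preconditioned conjugate gradients (PCG): the "matrix" is
# a stencil applied on the fly, never stored.

def apply_stencil(u):
    # Matrix-free matvec: 3-point Laplacian with homogeneous Dirichlet ends.
    n = len(u)
    return [2 * u[i] - (u[i - 1] if i > 0 else 0.0)
                     - (u[i + 1] if i < n - 1 else 0.0) for i in range(n)]

def jacobi_precondition(r):
    # The stencil diagonal is 2 everywhere, so Jacobi is a trivial scaling.
    return [ri / 2.0 for ri in r]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pcg(b, tol=1e-10, max_iter=500):
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual of the zero initial guess
    z = jacobi_precondition(r)
    p = z[:]
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = apply_stencil(p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = jacobi_precondition(r)
        rz_new = dot(r, z)
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

b = [1.0] * 50
x = pcg(b)
residual = [bi - axi for bi, axi in zip(b, apply_stencil(x))]
print(max(abs(ri) for ri in residual))  # ~0: x satisfies A x = b
```

Memory scales with the number of unknowns rather than with the number of matrix entries, which is exactly what makes 100-million-DOF voxel models tractable on desktop hardware.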
The high resource demand of the building sector clearly indicates the need to search for alternative, renewable and energy-efficient materials. This work presents paper-laminated sandwich elements with a core of corrugated paperboard that can serve as architectural components with load-bearing capacity after a linear folding process. Conventional methods use either paper tubes or glued layers of honeycomb panels. In contrast, the folded components are extremely lightweight, provide material strength exactly where it is statically required and offer many possibilities for design variants. After removing stripes of the paper lamination, the sandwich can be folded in a linear way at these positions. Without the resistance of the missing paper, the sandwich core can easily be compressed. The final folding angle correlates with the width of the removed paper stripe. As such, this angle can be described by a simple geometric equation. The geometrical basis for the production of folded sandwich elements was established, and many profile types such as triangular, square or rectangular shapes were generated. The method allows the easy planning and fast production of components that can be used in the construction sector. A triangular profile was used to create a load-bearing frame as the supporting structure of an experimental building. This first permanent building made completely of corrugated cardboard was evaluated in a two-year test to confirm the efficiency of the developed components. In addition to the frame shown in this paper, large-scale sandwich elements with a core of folded components can be used to fabricate lightweight ceilings and large-scale sandwich components. The method enables the efficient production of linearly folded cardboard elements which can replace conventional wooden components like beams, pillars or frames, and brings a fully recycled material into the context of architectural construction.
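The abstract does not state the geometric equation relating the folding angle to the stripe width. One plausible reconstruction, clearly an assumption and not the authors' published relation, treats the fold as closing the removed stripe of width w over a core of thickness t, so that each half rotates about the intact face and w = 2·t·tan(φ/2):

```python
import math

# Hypothetical reconstruction of the geometric relation: a stripe of
# width w removed from one face of a sandwich with core thickness t is
# closed by the fold, each half rotating phi/2 about the intact face,
# giving w = 2 * t * tan(phi / 2).

def folding_angle(stripe_width, core_thickness):
    """Fold angle in degrees produced by a removed stripe of given width."""
    return math.degrees(2 * math.atan(stripe_width / (2 * core_thickness)))

def stripe_width_for_angle(angle_deg, core_thickness):
    """Inverse relation: stripe width needed for a target fold angle."""
    return 2 * core_thickness * math.tan(math.radians(angle_deg) / 2)

# Under this assumed model, a 90° fold in a 20 mm thick board needs a
# 40 mm stripe (since tan 45° = 1).
w = stripe_width_for_angle(90.0, core_thickness=20.0)
print(w, folding_angle(w, 20.0))  # ≈ 40.0  ≈ 90.0
```

Whatever the exact published formula, the abstract's point stands: the angle is a monotone function of the stripe width, so target angles can be planned directly from the cut layout.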
Occupant needs with regard to residential buildings are not well known, due to a lack of representative scientific studies. To address this lack of data, a large-scale study was carried out using a Post Occupancy Evaluation of 1,416 building occupants. Several criteria describing the needs of occupants were evaluated with regard to their subjective level of relevance. Additionally, we investigated the degree to which deficiencies subjectively exist, and the degree to which occupants were able to accept them. From the data obtained, a hierarchy of criteria was created. It was found that building occupants ranked the physiological needs of air quality and thermal comfort highest. Health hazards such as mould and contaminated building materials were unacceptable for occupants, while other deficiencies were more likely to be tolerated. Occupant satisfaction was also investigated. We found that most occupants can be classified as satisfied, although some differences exist between different populations. To explain the relationship between the constructs of what we call relevance, acceptance, deficiency and satisfaction, we created an explanatory model. Using correlation and regression analysis, the validity of the model was then confirmed with the collected data. The results of the study are relevant both in shaping further research and in providing guidance on how to maximize tenant satisfaction in real estate management.
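As a generic illustration of the kind of correlation and regression check used to validate such an explanatory model, the sketch below computes a Pearson correlation coefficient and an ordinary-least-squares slope. The data is synthetic, not from the survey:

```python
# Pearson correlation and OLS slope from first principles; the scores
# below are invented, standing in for two of the study's constructs.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic example: satisfaction scores rising with acceptance scores.
acceptance   = [1.0, 2.0, 3.0, 4.0, 5.0]
satisfaction = [1.5, 2.0, 3.5, 4.0, 5.0]
r = pearson_r(acceptance, satisfaction)
slope = ols_slope(acceptance, satisfaction)
print(r, slope)
```

A strong positive r together with a positive regression slope is the pattern that would support a link from acceptance to satisfaction in the model.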
This text describes the intensive investigation of honeycomb panels made of paper materials, which can assume new spatial states through folding processes and thereby extend their original range of applications. The solution approaches presented operate at the interface of architecture and structural engineering, since the folded components are not only extremely load-bearing but also possess an aesthetic form. The developed methods and constructions are presented at a high architectural level and verified with simple engineering methods. To find solutions, geometric methods are applied alongside constructive rules of thumb and research from architecture and science.
The focus of this work is the investigation of folds in honeycomb panels. While engaging with the topic, however, many further aspects appeared very interesting and worth pursuing. As the theoretical foundation of this work, the historical development and the societal significance of paper and paper materials are therefore analyzed, and their production processes are examined. This approach makes it possible to assess the potential and significance of paper as a material. It strengthens the context of the work and leads to interesting future research directions.
Intensive investigations are devoted to the geometric determination of folds in honeycomb panels made of paper materials, as well as to their realization as structural components. The static properties of the elements and their constructional potential are also explored and documented. Important impulses from research and technology feed into the study and allow the results to be situated in an architectural context. Test series and material studies on prototypes confirm the results of virtual and computational studies. Concepts for the parametric computation and visualization of the research results are presented, pointing to viable future planning aids for industry. Numerous test series on a wide variety of sealing concepts led to the realization of a remarkable experimental building. It allows the developed components to be examined permanently under realistic conditions and confirms their performance. This not only enables continuous monitoring and evaluation of the performance data, but also provides visible proof that efficient, high-quality architecture exploiting the enormous design potential of folded honeycomb panels can be realized with paper materials.