In this work, extensive reactive molecular dynamics simulations are conducted to analyze nanopore creation by nanoparticle impact on single-layer molybdenum disulfide (MoS2) in its 1T and 2H phases. We also compare the results with a graphene monolayer. In our simulations, the nanosheets are exposed to a spherical rigid carbon projectile with high initial velocities ranging from 2 to 23 km/s. Results for the three different structures are compared to identify the most critical factors governing perforation and the resistance force during impact. To analyze perforation and impact resistance, the kinetic-energy and displacement time histories of the projectile, as well as its perforation resistance force, are investigated.
Interestingly, although the elastic modulus and tensile strength of graphene are almost five times higher than those of MoS2, the results demonstrate that the 1T and 2H MoS2 phases are more resistant to impact loading and perforation than graphene. For the MoS2 nanosheets, we find that the 2H phase is more resistant to impact loading than its 1T counterpart.
Our reactive molecular dynamics results highlight that, in addition to strength and toughness, the atomic structure is another crucial factor that can contribute substantially to the impact resistance of 2D materials. The obtained results can help guide experimental setups for nanopore creation in MoS2 and other 2D lattices.
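The kinetic-energy and displacement time histories mentioned above can be post-processed into a resistance force. A minimal sketch, assuming F = -dE_k/dx; the decay law and all numbers below are invented toy data, not results from the simulations:

```python
import numpy as np

def resistance_force(displacement, kinetic_energy):
    """Estimate the perforation resistance force as F = -dE_k/dx."""
    return -np.gradient(kinetic_energy, displacement)

# Toy trajectory: a projectile decelerating while penetrating the sheet
# (assumed exponential velocity decay, reduced units).
x = np.linspace(0.0, 2.0, 50)   # penetration depth
m, v0 = 1.0, 10.0               # mass and initial velocity
v = v0 * np.exp(-x)
Ek = 0.5 * m * v**2
F = resistance_force(x, Ek)
print(F[:3])  # positive values: the force opposes the motion
```

Because the kinetic energy decreases monotonically, the estimated force is positive everywhere, i.e. it always opposes the penetration.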
As a result of international refugee movements along the so-called Balkan route, a so-called refugee district has emerged in Serbia's capital Belgrade in recent years. In the context of migration and flight, numerous fields of tension become visible at different spatial and political levels. For refugees, these create a situation marked by standstill, hopelessness, control, danger, and displacement. However, the complexity and diversity of the various actors who shape the situation of refugees on the Balkan route also give rise to niches, forms of resistance, and the possibility of (new) alliances. In this way, a collective practice of non-movement emerges, in resistance against oppression and for global freedom of movement.
According to Eurocode, the computation of the bending strength of steel cantilever beams is a straightforward process. The approach is based on an Ayrton-Perry adaptation of the buckling curves for steel members in compression, which involves computing an elastic critical buckling load to account for the instability. NCCI documents offer a simplified formula to determine the critical bending moment for cantilever beams with a symmetric cross-section. Besides the NCCI recommendations, other approaches, e.g. the research literature or finite element analysis, may be employed to determine critical buckling loads. However, in certain cases they yield different results. The present paper summarizes and compares the abovementioned analytical and numerical approaches for determining critical loads and exemplarily analyses the corresponding cantilever beam capacities using numerical approaches based on plastic-zones theory (GMNIA).
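The Ayrton-Perry-type reduction that Eurocode 3 applies once a critical moment is known can be sketched as follows. The imperfection factor and the toy numbers are purely illustrative, and the critical moment M_cr itself must be supplied externally (NCCI formula, literature, or FEA):

```python
import math

def chi_LT(M_rk, M_cr, alpha_LT=0.34):
    """Ayrton-Perry reduction factor, EC3 'general case' form.

    M_rk : characteristic bending resistance (W_y * f_y)
    M_cr : elastic critical lateral-torsional buckling moment
    """
    lam = math.sqrt(M_rk / M_cr)  # relative slenderness
    phi = 0.5 * (1 + alpha_LT * (lam - 0.2) + lam**2)
    return min(1.0, 1.0 / (phi + math.sqrt(phi**2 - lam**2)))

# Toy numbers (kNm), not taken from the paper:
print(round(chi_LT(M_rk=300.0, M_cr=450.0), 3))
```

The design bending resistance then follows as chi_LT times the characteristic resistance divided by the partial factor.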
The spread of breathing air when playing wind instruments and singing was investigated and visualized using two methods: (1) schlieren imaging with a schlieren mirror and (2) background-oriented schlieren (BOS). These methods visualize airflow by visualizing density gradients in transparent media. The playing of professional woodwind and brass instrument players, as well as professional classically trained singers, was investigated to estimate the spread distances of the breathing air. For better comparison and a consistent measurement series, a single high note, a single low note, and an extract of a musical piece were investigated. Additionally, anemometry was used to determine the velocity of the spreading breathing air and the extent to which it was quantifiable. The results showed that the ejected airflow from the examined instruments and singers did not exceed a spreading range of 1.2 m into the room. However, differences between the various instruments have to be considered to properly assess the spread of the breathing air. The findings discussed below help to estimate the risk of cross-infection for wind instrument players and singers and to develop efficacious safety precautions, which is essential during critical health periods such as the current COVID-19 pandemic.
This paper presents numerical analysis of the discrete fundamental solution of the discrete Laplace operator on a rectangular lattice. Additionally, to provide estimates in interior and exterior domains, two different regularisations of the discrete fundamental solution are considered. Estimates for the absolute difference and lp-estimates are constructed for both regularisations. Thus, this work extends the classical results in the discrete potential theory to the case of a rectangular lattice and serves as a basis for future convergence analysis of the method of discrete potentials on rectangular lattices.
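The discrete Laplace operator on a rectangular lattice that underlies these estimates is, in its standard 5-point form (our notation, with distinct mesh widths h1 and h2 in the two directions):

```python
import numpy as np

def discrete_laplace(u, h1, h2):
    """5-point discrete Laplacian on a rectangular lattice."""
    ul = np.zeros_like(u)
    ul[1:-1, 1:-1] = (
        (u[2:, 1:-1] - 2 * u[1:-1, 1:-1] + u[:-2, 1:-1]) / h1**2
        + (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / h2**2
    )
    return ul

h1, h2 = 0.5, 0.25
x = np.arange(0, 5) * h1
y = np.arange(0, 9) * h2
X, Y = np.meshgrid(x, y, indexing="ij")
u = X**2 - Y**2                      # discrete harmonic for this stencil
L = discrete_laplace(u, h1, h2)
print(np.abs(L[1:-1, 1:-1]).max())   # ~0 up to round-off
```

The discrete fundamental solution studied in the paper is exactly the lattice function on which this operator produces a discrete delta.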
Global structural analyses in civil engineering are usually performed assuming linear-elastic material behavior. However, for steel structures a certain degree of plasticization may be considered, depending on the member classification. Corresponding plastic analyses taking material nonlinearities into account are effectively realized using numerical methods. The frequently applied finite elements of two- and three-dimensional models evaluate plasticity at defined nodes using a yield surface, i.e. by a yield condition, hardening rule, and flow rule. The corresponding calculations entail a large numerical and time-consuming effort, and they do not rely on the theoretical background of beam theory, to which the regulations of the standards mainly correspond. For that reason, methods using (one-dimensional) beam elements combined with cross-sectional analyses are commonly applied to steel members in terms of plastic-zones theory. In these approaches, plasticization is generally assessed by means of axial stress only. In this paper, a more precise numerical representation of combined stress states, i.e. axial and shear stresses, is presented, and the results of the proposed approach are validated and discussed.
Polylactic acid (PLA) is a highly applicable material used in 3D printers owing to significant features such as its deformability and affordable cost. To improve end-use quality, it is important to enhance the quality of fused filament fabrication (FFF)-printed PLA objects. The purpose of this investigation was to boost toughness and to reduce the production cost of FFF-printed tensile test samples with the desired part thickness. To avoid printing numerous redundant samples, the response surface method (RSM) was used. A statistical analysis was performed with extruder temperature (ET), infill percentage (IP), and layer thickness (LT) as controlled factors. An artificial neural network (ANN) and an ANN-genetic algorithm (ANN-GA) were further developed to estimate the toughness, part thickness, and production-cost dependent variables. Results were evaluated via correlation coefficients and RMSE values. According to the modeling results, ANN-GA as a hybrid machine learning (ML) technique could enhance the accuracy of modeling by about 7.5, 11.5, and 4.5% for toughness, part thickness, and production cost, respectively, in comparison with the single ANN method. On the other hand, the optimization results confirm that the optimized specimen is cost-effective and able to undergo comparatively large deformation, which supports the usability of printed PLA objects.
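At its core, RSM fits a second-order polynomial response surface to observed factor-response pairs. A hedged numpy sketch with invented data; the factor ranges and the toy response below are placeholders, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
ET = rng.uniform(190, 230, 30)   # extruder temperature, degC (assumed range)
IP = rng.uniform(20, 80, 30)     # infill percentage
LT = rng.uniform(0.1, 0.3, 30)   # layer thickness, mm
# Invented response with linear and interaction effects:
y = 2.0 + 0.01 * ET + 0.03 * IP - 4.0 * LT + 0.0005 * ET * IP

# Full second-order design matrix: intercept, linear, interaction, quadratic.
X = np.column_stack([np.ones_like(ET), ET, IP, LT,
                     ET * IP, ET * LT, IP * LT,
                     ET**2, IP**2, LT**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = np.sqrt(np.mean((X @ coef - y) ** 2))
print(rmse)  # near zero: the surface reproduces the toy data exactly
```

On real, noisy measurements the residual would of course not vanish; the fitted surface is then optimized (here the study uses GA on top of an ANN surrogate) to locate the best factor combination.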
Burning of clinker is the step with the greatest influence on cement quality during the production process. Appropriate characterisation for quality control and decision-making is therefore critical to maintaining stable production, but also for the development of alternative cements. Scanning electron microscopy (SEM) in combination with energy-dispersive X-ray spectroscopy (EDX) delivers spatially resolved phase and chemical information for cement clinker. These data can be used to quantify the phase fractions and the chemical composition of the identified phases.
The contribution aims to provide an overview of phase fraction quantification by semi-automatic phase segmentation using high-resolution backscattered electron (BSE) images and lower-resolution EDX element maps. To this end, a tool for image analysis was developed that uses state-of-the-art algorithms for pixel-wise image segmentation and labelling in combination with a decision tree that allows searching for specific clinker phases. Results show that this tool can be applied to segment sub-micron-scale clinker phases and to obtain a quantification of all phase fractions. In addition, statistical evaluation of the data is implemented within the tool to reveal whether the imaged area is representative of all clinker phases.
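A decision tree over pixel-wise BSE grey values and EDX counts might look as follows. The thresholds, the EDX channel, and the phase assignments are invented placeholders for illustration, not the calibrated rules of the actual tool:

```python
def label_pixel(bse, mg_count):
    """Assign a clinker-phase label from a BSE grey value and an EDX count.

    All thresholds and phase names are hypothetical examples.
    """
    if bse > 200:
        return "alite" if mg_count < 10 else "ferrite"
    if bse > 120:
        return "belite"
    return "pore"

bse_map = [[250, 130], [90, 240]]   # toy BSE image, 2x2 pixels
mg_map = [[2, 3], [0, 30]]          # toy EDX counts for the same pixels
labels = [[label_pixel(b, m) for b, m in zip(rb, rm)]
          for rb, rm in zip(bse_map, mg_map)]
print(labels)
```

Phase fractions then follow from counting labels over all pixels, which is where the representativeness check mentioned above comes in.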
This study demonstrates the application and combination of multiple imaging techniques [light microscopy, micro-X-ray computed tomography (μ-CT), scanning electron microscopy (SEM) and focused ion beam nano-tomography (FIB-nT)] for analyzing the microstructure of hydrated alite across multiple scales. However, comparison with mercury intrusion porosimetry (MIP) makes it obvious that the imaged 3D volumes and 2D images do not sufficiently overlap at certain scales to allow a continuous quantification of the pore size distribution (PSD). This can be overcome by improving the resolution and increasing the measured volume. Furthermore, the results show that the fibrous morphology of the calcium-silicate-hydrate (C-S-H) phases is preserved during FIB-nT, which is a requirement for characterising nano-scale porosity. Finally, it was shown that combining FIB-nT with energy-dispersive X-ray spectroscopy (EDX) data facilitates the phase segmentation of an 11 × 11 × 7.7 μm3 volume of hydrated alite.
In the literature, the influence of openings in infill walls that are bounded by a reinforced concrete frame and excited by seismic drift forces in both the in-plane and out-of-plane directions remains uncharted. Therefore, a 3D micromodel was developed and subsequently calibrated to gain more insight into the topic. The micromodels were calibrated against their equivalent physical test specimens: in-plane and out-of-plane drift-driven tests on frames with and without infill walls and openings, as well as out-of-plane bending tests of masonry walls. The micromodels were rectified based on their behavior and damage states. As a result of the calibration process, it was found that the micromodels were sensitive to some parameters and insensitive to others with regard to behavior and computational stability. It was also found that, even within the same material model, some parameters had a greater effect when attributed to concrete than to masonry. Generally, the in-plane behavior of infilled frames was found to be largely governed by the interface material model. The out-of-plane masonry wall simulations were governed by the tensile strength of both the interface and masonry material models, whereas the out-of-plane drift-driven test was governed by the concrete material properties.
Realistic uncertainty description incorporating aleatoric and epistemic uncertainties can be achieved within the framework of polymorphic uncertainty, which is computationally demanding. Utilizing a domain decomposition approach for random-field-based uncertainty models, the proposed level-based sampling method can reduce these computational costs significantly and shows good agreement with a standard sampling technique. While 2-level configurations tend to become unstable with decreasing sampling density, 3-level setups show encouraging results for the investigated reliability analysis of a structural unit square.
Bauhaus visiting professor Mirjam Wenzel lectured on 30 June 2021 in the Audimax of the Bauhaus-Universität Weimar on the genesis and conception of Jewish museums. She discussed the extent to which these museums are particularly relevant to current social and political questions. Prof. Wenzel's second public lecture at the Bauhaus-Universität Weimar outlined the potential of cultural institutions in times of socio-political change in general, and the significance of Jewish museums in the face of verbal and physical violence against Jews in particular.
Compiling and disseminating information about incidents and disasters are key to disaster management and relief. But due to inherent limitations of the acquisition process, the required information is often incomplete or missing altogether. To fill these gaps, citizen observations spread through social media are widely considered to be a promising source of relevant information, and many studies propose new methods to tap this resource. Yet, the overarching question of whether and under which circumstances social media can supply relevant information (both qualitatively and quantitatively) still remains unanswered. To shed some light on this question, we review 37 disaster and incident databases covering 27 incident types, compile a unified overview of the contained data and their collection processes, and identify the missing or incomplete information. The resulting data collection reveals six major use cases for social media analysis in incident data collection: (1) impact assessment and verification of model predictions, (2) narrative generation, (3) recruiting citizen volunteers, (4) supporting weakly institutionalized areas, (5) narrowing surveillance areas, and (6) reporting triggers for periodical surveillance. Furthermore, we discuss the benefits and shortcomings of using social media data for closing information gaps related to incidents and disasters.
Chemical glass frosting processes are widely used to create visually attractive glass surfaces. A commonly used frosting bath mainly contains ammonium bifluoride (NH4HF2) mixed with hydrochloric acid (HCl). The frosting process consists of several baths: first, a preliminary bath that cleans the object; second, the frosting bath, which etches the rough, light-scattering structure into the glass surface; and finally, the washing baths that clean the frosted object. This is where the constituents of the preceding steps accumulate and have to be filtered from the sewage. In the present contribution, phosphoric acid (H3PO4) was used as a substitute for HCl to reduce the amount of ammonium (NH4+) and chloride (Cl−) dissolved in the waste water. In combination with magnesium carbonate (MgCO3), it allows the precipitation of ammonium from the sewage as ammonium magnesium phosphate (MgNH4PO4). However, a trivial replacement of HCl by H3PO4 in the frosting process causes extensive frosting errors, such as inhomogeneous size distributions of the structures or domains that are not fully covered by these structures. By modifying the composition of the preliminary bath, it was possible to improve the frosting result considerably. To determine the optimal composition of the preliminary bath, a semi-automatic evaluation method was developed that makes an objective comparison of the resulting surface quality possible.
Entrepreneurship and start-up activities are seen as a key response to recent upheavals in the media industry: Newly founded ventures can act as important drivers for industry transformation and renewal, pioneering new products, business models, and organizational designs (e.g. Achtenhagen, 2017; Buschow & Laugemann, 2020).
In principle, media students represent a crucial population of nascent entrepreneurs: individuals who will likely become founders of start-ups (Casero-Ripollés et al., 2016). However, their willingness to start a new business is generally considered to be rather low (Goyanes, 2015), and for journalism students, the idea of innovation tends to be conservative, following traditional norms and professional standards (Singer & Broersma, 2020). In a sample of Spanish journalism students, López-Meri et al. (2020) found that one of the main barriers to entrepreneurial intentions is that students feel they lack knowledge and training in entrepreneurship.
In the last 10 years, a wide variety of entrepreneurship education courses have been set up in media departments of colleges and universities worldwide.
These programs have been designed to sensitize and prepare communications, media and journalism students to think and act entrepreneurially (e.g. Caplan et al., 2020; Ferrier, 2013; Ferrier & Mays, 2017; Hunter & Nel, 2011). Entrepreneurial competencies
and practices not only play a crucial role for start-ups but, in times of digital transformation, are increasingly sought after by legacy media companies as well (Küng, 2015).
At the Department of Journalism and Communication Research, Hanover University of Music, Drama and Media, Germany, we have been addressing these developments with the “Media Entrepreneurship” program. The course, established in 2013, aims to provide fundamental knowledge of entrepreneurship as well as to promote students' entrepreneurial thinking and behavior. This article presents the pedagogical approach of the program and investigates its learning outcomes. By outlining and evaluating the Media Entrepreneurship program, this article aims to promote good practices in entrepreneurship education in communications, media and journalism, and to reflect on the limitations of such programs.
This article focuses on the research and development of new cellulose ether derivatives as innovative superplasticizers for mortar systems. Several synthetic strategies were pursued to obtain new compounds and to study their properties in cementitious systems as new bio-based additives. The new water-soluble admixtures were synthesized from a complex carboxymethylcellulose-based backbone that was first hydrolyzed and then sulfo-ethylated in the presence of sodium vinyl sulphonate. Starting from a complex biopolymer widely known as a thickening agent was very challenging; only by varying the hydrolysis times and reaction temperatures could the desired goal be achieved. The obtained derivatives showed different molecular weights (Mw) and anionic charges on their backbones. An improvement in the shear stress and dynamic viscosity values of CEM II 42.5R cement was observed for the samples obtained with longer, higher-temperature hydrolysis and sulfo-ethylation. Investigations into the chemical nature of the pore solution, calorimetric studies and adsorption experiments clearly showed the ability of the carboxymethyl cellulose superplasticizer (CMC SP) to interact with cement grains and influence hydration processes within a 48-h time window, causing a delay in the hydration reactions in the samples. The fluidity of the cementitious matrices was ascertained through slump tests, and preliminary studies of the mechanical and flexural strength of hardened mortar formulated with the new ecological additives yielded promising values. Finally, computed tomography (CT) images completed the investigation of the pore network structure of the hardened specimens, highlighting their promising pore structure.
The development of a hydro-mechanically coupled Coupled-Eulerian-Lagrangian (CEL) method and its application to the back-analysis of vibratory pile driving model tests in water-saturated sand is presented. The predicted pile penetration using this approach is in good agreement with the results of the model tests as well as with fully Lagrangian simulations. In terms of pore water pressure, however, the results of the CEL simulation show slightly worse agreement with the model tests than the Lagrangian simulation. Some shortcomings of the hydro-mechanically coupled CEL method in the case of frictional contact problems and pore fluids with a high bulk modulus are discussed. Lastly, the CEL method is applied to the simulation of vibratory driving of open-profile piles under partially drained conditions to study installation-induced changes in the soil state. It is concluded that, despite the shortcomings addressed, the proposed method is capable of realistically reproducing the most important mechanisms in the soil during the driving process.
One of the most important subjects in hydraulic engineering is the reliable estimation of the transverse distribution of bed and wall shear stresses in rectangular channels. This study makes use of the Tsallis entropy, genetic programming (GP) and adaptive neuro-fuzzy inference system (ANFIS) methods to assess the shear stress distribution (SSD) in rectangular channels.
To evaluate the results of the Tsallis entropy, GP and ANFIS models, laboratory observations were used in which shear stress was measured using an optimized Preston tube. This was then used to measure the SSD at various aspect ratios in the rectangular channel. To investigate the shear stress percentage, 10 data series with a total of 112 different data points were used. The results of the sensitivity analysis show that the most influential parameter for the SSD in smooth rectangular channels is the dimensionless parameter B/H, where B is the channel width and H is the flow depth. With the parameters (b/B) and (B/H) as inputs for the bed and (z/H) and (B/H) for the wall, the GP model performed better than the others. Based on the analysis, it can be concluded that the GP and ANFIS algorithms are more effective in estimating shear stress in smooth rectangular channels than the Tsallis entropy-based equations.
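Comparing such models against the Preston-tube observations comes down to RMSE and correlation metrics. A minimal helper with invented observation and prediction values (the numbers bear no relation to the study's data):

```python
import numpy as np

def rmse(obs, pred):
    """Root-mean-square error between observations and predictions."""
    return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(pred)) ** 2)))

def corr(obs, pred):
    """Pearson correlation coefficient."""
    return float(np.corrcoef(obs, pred)[0, 1])

# Invented shear-stress observations vs. two hypothetical model outputs:
obs = [0.9, 1.2, 1.6, 2.1]
gp = [1.0, 1.1, 1.7, 2.0]
tsallis = [0.6, 1.5, 1.3, 2.5]
print(rmse(obs, gp), rmse(obs, tsallis))  # the smaller RMSE wins
print(corr(obs, gp))
```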
This study proposes an efficient Bayesian, frequency-based damage identification approach to identify damage in cantilever structures with an acceptable error rate, even at high noise levels. The catenary poles of electric high-speed train systems were selected as a realistic case study to cover the objectives of this study. Compared to other frequency-based damage detection approaches described in the literature, the proposed approach efficiently detects damage in cantilever structures at higher levels of damage identification, namely identifying both the damage location and severity, using a low-cost structural health monitoring (SHM) system with a limited number of sensors, for example accelerometers. The integration of Bayesian inference, as a stochastic framework, into the proposed approach makes it possible to exploit data fusion in merging the informative data from multiple damage features, which increases the quality and accuracy of the results. The findings provide the decision-maker with the information required to manage maintenance, repair, or replacement procedures.
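The Bayesian core of such an approach can be sketched in a few lines. The forward model, noise level, and all numbers below are invented for illustration; the paper's actual damage features and fusion scheme are more elaborate:

```python
import numpy as np

def forward(d):
    """Assumed frequency ratios of two modes for damage severity d in [0, 1]."""
    return np.array([1.0 - 0.30 * d, 1.0 - 0.10 * d])

# Grid Bayesian update with a flat prior and Gaussian likelihood:
d_grid = np.linspace(0.0, 1.0, 201)
measured = np.array([0.85, 0.95])   # observed frequency ratios, true d = 0.5
sigma = 0.01                         # assumed measurement noise
log_like = np.array([-np.sum((forward(d) - measured) ** 2) / (2 * sigma**2)
                     for d in d_grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()                   # normalized posterior on the grid
print(d_grid[np.argmax(post)])       # MAP estimate of the severity
```

Fusing a second damage feature would simply add its log-likelihood to `log_like`, which is the data-fusion benefit the abstract refers to.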
Biofeedback constitutes a well-established, non-invasive method to voluntarily interfere in emotional processing by means of cognitive strategies. However, treatment durations exhibit strong inter-individual variation, and first successes can often be achieved only after a large number of sessions. Sham feedback, i.e. feedback that does not correspond to the participant’s actual state, constitutes a rather untapped approach. The current study aims to gain insights into the mechanisms of sham feedback processing in order to support new techniques in biofeedback therapy. We carried out two experiments and applied different types of sham feedback based on skin conductance responses and pupil size changes during affective processing. Results indicate that standardized but context-sensitive sham signals based on skin conductance responses exert a stronger influence on emotional regulation than individual sham feedback from ongoing pupil dynamics. Also, sham feedback should forgo unnatural signal behavior to avoid irritation and skepticism among participants. Altogether, a reasonable combination of stimulus features and sham feedback characteristics makes it possible to considerably reduce actual bodily responsiveness within a single session.
This paper outlines an important step in characterizing a novel field of robotic construction research in which a cable-driven parallel robot is used to extrude cementitious material in three-dimensional space, thus offering a comprehensively new approach to computational design and construction, and to robotic fabrication at larger scales. Developed by the Faculty of Art and Design at Bauhaus-Universität Weimar (Germany), the Faculty of Architecture at the University of Applied Sciences Dortmund (Germany) and the Chair of Mechatronics at the University of Duisburg-Essen (Germany), this approach offers unique advantages over existing additive manufacturing methods: the system is easily transportable and scalable, it does not require additional formwork or scaffolding, and it offers digital integration and informational oversight across the entire design and building process. This paper considers (1) the key research components of cable-robotic 3D printing (such as computational design, material exploration, and robotic control), and (2) the integration of these parameters into a unified design and building process. Particular attention is given to demonstrating the approach at full scale.
This article aims to develop a social theory of violence that emphasizes the role of the third party as well as the communication between the subjects involved. For this, Teresa Koloma Beck’s essay ‘The Eye of the Beholder: Violence as a Social Process’, which adopts a social-constructivist perspective, is taken as a starting point. On the one hand, the basic concepts and the benefits of this approach are presented; on the other hand, its social-theoretical problems are revealed. These deficits are counteracted by expanding Koloma Beck’s approach with a communicative-constructivist framework. Thus, the role of communicative action and the ‘objectification of violence’ is emphasized. These aspects shape the perception, judgement and (de-)legitimation of violence phenomena and the emergence of a ‘knowledge of violence’. Communicative actions and objectifications form a key to understanding violent interactions and the link between the micro and macro levels. Finally, the methodological consequences for violence research and Communicative Constructivism are discussed, and possible research fields that are opened up by looking at communicative action and objectifications within the ‘triads of violence’ are outlined.
Among the diverse ventures of the socially engaged “counter-science” that appeared on the scene in the Federal Republic of Germany around 1980 was the Berliner Wissenschaftsladen e. V., WILAB for short, founded in 1982, a kind of “alternative” spin-off of the Technische Universität Berlin. This contribution situates the founding of the “Laden” in the context of contemporary advances in (regional) research and technology policy. It shows how the deindustrializing island city, through countermeasures in “innovation policy”, even came to play a certain pioneering role: innovations visible beyond the city limits, such as the founders’ fair BIG TECH or the Berliner Innovations- und Gründerzentrum (BIG), opened in 1983 as the first “incubator” [sic] in the Federal Republic, can be credited to TU-transfer, the technology transfer office of the TU Berlin launched in 1977/78.
In other words, one was increasingly confronted with conditions that were less and less compatible with the dreams of a “critical”, self-determined (counter-)science. In latent contrast to the historiographical prominence of the science-critical zeitgeist, ventures committed to “alternative” goals, such as WILAB, led a relatively marginalized niche existence. Nevertheless, the aim pursued at WILAB, unpromising as it was in this light, of initiating a different, namely “more humane” information technology sheds instructive light on the emergence of “entrepreneurial” science in the Federal Republic around 1980.
Object-Oriented Damage Information Modeling Concepts and Implementation for Bridge Inspection
(2022)
Bridges are designed to last for more than 50 years and consume up to 50% of their life-cycle costs during their operation phase. Several inspections and assessment actions are executed during this period. Bridge and damage information must be gathered, digitized, and exchanged between different stakeholders. Currently, the inspection and assessment practices rely on paper-based data collection and exchange, which is time-consuming and error-prone, and leads to loss of information. Storing and exchanging damage and building information in a digital format may lower costs and errors during inspection and assessment and support future needs, for example, immediate simulations regarding performance assessment, automated maintenance planning, and mixed reality inspections. This study focused on the concept for modeling damage information to support bridge reviews and structural analysis. Starting from the definition of multiple use cases and related requirements, the data model for damage information is defined independently from the subsequent implementation. In the next step, the implementation via an established standard is explained. Functional tests aim to identify problems in the concept and implementation. To show the capability of the final model, two example use cases are illustrated: the inspection review of the entire bridge and a finite-element analysis of a single component. Main results are the definition of necessary damage data, an object-oriented damage model, which supports multiple use cases, and the implementation of the model in a standard. Furthermore, the tests have shown that the standard is suitable to deliver damage information; however, several software programs lack proper implementation of the standard.
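The object-oriented damage model described above can be pictured, in schematic form, as damage objects attached to component objects. The class layout below is our simplified invention for illustration, not the paper's data model or its standard-based implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Damage:
    """A single observed damage, linked to a component."""
    kind: str        # e.g. "crack", "spalling" (example vocabulary)
    location: tuple  # local coordinates on the component
    severity: str    # e.g. "minor", "severe"

@dataclass
class Component:
    """A bridge component that aggregates its recorded damages."""
    name: str
    damages: list = field(default_factory=list)

girder = Component("main girder")
girder.damages.append(Damage("crack", (1.2, 0.3), "minor"))
print(len(girder.damages), girder.damages[0].kind)
```

Use cases such as an inspection review or a finite-element analysis would then traverse this component-damage graph rather than paper records.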
Quantification of cracks in concrete thin sections considering current methods of image analysis
(2022)
Image analysis is used in this work to quantify cracks in concrete thin sections via modern image processing. Thin sections were impregnated with a yellow epoxy resin to increase the contrast between voids and the other phases of the concrete. By means of several pre-processing steps, machine learning and Python scripts, cracks can be quantified over an area of up to 40 cm2. As a result, the crack area, lengths and widths were estimated automatically within a single workflow. Crack patterns caused by freeze-thaw damage were investigated. To compare the inner degradation of the investigated thin sections, the crack density was used. Cracks in the thin sections were measured manually in two different ways to validate the automatically determined results. On the one hand, the presented work shows that the crack width can be determined pixelwise, thus providing a width distribution. On the other hand, the automatically measured crack lengths differ from the manually measured ones.
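The crack-area and crack-density part of such a workflow reduces to a few lines once a binary crack mask exists. A toy version with an invented image, threshold, and pixel size, standing in for the paper's calibrated pipeline:

```python
import numpy as np

def crack_density(image, threshold, pixel_size_mm=0.01):
    """Crack area (mm^2) and crack density from a grey-value image.

    Assumes resin-filled cracks appear brighter than the threshold.
    """
    mask = image > threshold
    crack_area = mask.sum() * pixel_size_mm**2
    total_area = mask.size * pixel_size_mm**2
    return crack_area, crack_area / total_area

img = np.zeros((100, 100))
img[40:42, :] = 255          # a synthetic horizontal "crack", 2 px wide
area, density = crack_density(img, threshold=128)
print(area, density)
```

Pixelwise widths, as described above, would additionally require a skeletonization step that this sketch omits.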
In this paper, we present an open-source code for the first-order and higher-order nonlocal operator method (NOM), including a detailed description of the implementation. The NOM is based on so-called supports, dual-supports, nonlocal operators, and an operator energy functional ensuring stability. The nonlocal operator is a generalization of the conventional differential operators. Combined with the method of weighted residuals and variational principles, the NOM establishes the residual and tangent stiffness matrix of the operator energy functional through simple matrix operations, without the need for shape functions as in other classical computational methods such as FEM. The NOM only requires the definition of the energy, drastically simplifying its implementation. The implementation in this paper focuses on linear elastic solids for the sake of conciseness, though the NOM can handle more complex nonlinear problems. The NOM is flexible and efficient for solving partial differential equations (PDEs), and it is easy for readers to use the NOM and extend it to solve other complicated physical phenomena described by one or a set of PDEs. Finally, we present some classical benchmark problems, including the cantilever beam and the plate-with-a-hole problem, and we also extend the method to complicated problems, including phase-field fracture modeling and gradient elasticity materials.
In this study, we propose a nonlocal operator method (NOM) for the dynamic analysis of (thin) Kirchhoff plates. The nonlocal Hessian operator is derived based on a second-order Taylor series expansion. The NOM does not require any shape functions and associated derivatives as in ‘classical’ approaches such as FEM, drastically facilitating the implementation. Furthermore, the NOM is higher-order continuous, which is exploited for thin plate analysis requiring C1 continuity. The nonlocal dynamic governing formulation and the operator energy functional for Kirchhoff plates are derived from a variational principle. The velocity-Verlet algorithm is used for the time discretization. After confirming the accuracy of the nonlocal Hessian operator, several numerical examples are simulated with the nonlocal dynamic Kirchhoff plate formulation.
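The velocity-Verlet time integration named above is a standard explicit, symplectic scheme. A minimal sketch for a single degree of freedom (a harmonic oscillator standing in for one modal equation of the plate) is shown below; the oscillator parameters are illustrative and not taken from the paper.

```python
def velocity_verlet(x, v, accel, dt, steps):
    """Integrate x'' = accel(x) with the velocity-Verlet scheme."""
    a = accel(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt  # position update
        a_new = accel(x)                    # acceleration at new position
        v = v + 0.5 * (a + a_new) * dt      # velocity from averaged accel.
        a = a_new
    return x, v

# Harmonic oscillator x'' = -x (unit mass and stiffness), initial energy 0.5.
x, v = velocity_verlet(x=1.0, v=0.0, accel=lambda x: -x, dt=0.01, steps=1000)
energy = 0.5 * v * v + 0.5 * x * x
print(energy)  # stays near the initial energy 0.5 (symplectic scheme)
```

The good long-time energy behaviour visible here is the usual reason this scheme is chosen for explicit structural dynamics.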
We present a stochastic deep collocation method (DCM) based on neural architecture search (NAS) and transfer learning for heterogeneous porous media. We first carry out a sensitivity analysis to determine the key hyper-parameters of the network and reduce the search space, and subsequently employ hyper-parameter optimization to obtain the final parameter values. The presented NAS-based DCM also saves the weights and biases of the most favorable architectures, which are then used in the fine-tuning process. We also employ transfer learning techniques to drastically reduce the computational cost. The presented DCM is then applied to the stochastic analysis of heterogeneous porous material. To this end, a three-dimensional stochastic flow model is built, providing a benchmark for the simulation of groundwater flow in highly heterogeneous aquifers. The performance of the presented NAS-based DCM is verified in different dimensions using the method of manufactured solutions. We show that it significantly outperforms finite difference methods in both accuracy and computational cost.
In machine learning, if the training data are independently and identically distributed with the test data, a trained model can make accurate predictions for new data samples. Conventional machine learning strongly depends on massive amounts of domain-specific training data to uncover their latent patterns. In contrast, domain adaptation and transfer learning are sub-fields of machine learning concerned with solving the inescapable problem of insufficient training data by relaxing the domain-dependence hypothesis. In this contribution, we address this issue and, by combining both methods in a novel way, develop a computationally efficient and practical algorithm to solve boundary value problems based on nonlinear partial differential equations. We adopt a meshfree analysis framework to integrate prevailing geometric modelling techniques based on NURBS and present an enhanced deep collocation approach that also plays an important role in the accuracy of the solutions. We start with a brief introduction on how these methods expand upon this framework. We observe excellent agreement between these methods and show how fine-tuning a pre-trained network on a specialized domain can lead to outstanding performance compared to existing approaches. As a proof of concept, we illustrate the performance of the proposed model on several benchmark problems.
In this work, we present a deep collocation method (DCM) for three-dimensional potential problems in non-homogeneous media. This approach utilizes a physics-informed neural network with material transfer learning, reducing the solution of the non-homogeneous partial differential equations to an optimization problem. We tested different configurations of the physics-informed neural network, including smooth activation functions, sampling methods for the generation of collocation points, and combined optimizers. A material transfer learning technique is utilized for non-homogeneous media with different material gradations and parameters, which enhances the generality and robustness of the proposed method. In order to identify the most influential parameters of the network configuration, we carried out a global sensitivity analysis. Finally, we provide a convergence proof of our DCM. The approach is validated through several benchmark problems, also testing different material variations.
Subscription-based news platforms (such as “Apple News+” or “Readly”) that bundle content from different publishers into one comprehensive package and offer it to media users at a fixed monthly rate are a new way of accessing and consuming digital journalism. These services have received little attention in journalism studies, although they differ greatly from traditional media products and distribution channels. This article empirically investigates the perception of journalism platforms based on eight qualitative focus group discussions with 55 German news consumers.
Results show that the central characteristics these platforms should fulfill in order to attract users are strikingly similar to the characteristics of media platforms from the music and video industries, in particular regarding price points, contract features, and modes of usage. Against this background, the potential and perspectives of a subscription-based news platform for journalism’s societal role are discussed.
Immanuel Kant’s thought is a central historical and theoretical reference in Hans Blumenberg’s metaphorological project. This is demonstrated by the fact that in the Paradigms the author outlines the concept of absolute metaphor by explicitly referring to §59 of the Critique of the Power of Judgment and recognizing in the Kantian symbol a model for his own metaphorics. However, Kant’s name also appears in the chapter on the metaphor of the “terra incognita”, where Blumenberg observes that Kant not only theorized the presence of symbolic hypotyposis in our language [...] but also made extensive use of metaphors linked to “determinate historical experiences”, in particular geographical metaphors. In my essay, I start from the analysis of Kant’s geographical metaphors in order to rethink Blumenberg’s archaeological method as an archaeology of media that grounds the study of metaphors in the materiality of communication and the combination of tools, agents and media.
Real-world labs hold the potential to catalyse rapid urban transformations through real-world experimentation. Characterised by a rather radical, responsive, and location-specific nature, real-world labs face constraints in scaling experimental knowledge. To make a significant contribution to urban transformation, the produced knowledge must go beyond the level of the building, street, or small district where the real-world experiments are conducted. Thus, a conflict arises between experimental boundaries and the stimulation of broader implications. The challenges of scaling experimental knowledge have been recognised as a problem but remain largely unexplained. Against this background, the article discusses the applicability of the “typology of amplification processes” by Lam et al. (2020) to explore and evaluate the potential of scaling experimental knowledge from real-world labs. The application of the typology is exemplified in the case of the Bauhaus.MobilityLab. The Bauhaus.MobilityLab takes a unique approach by testing and developing cross-sectoral mobility, energy, and logistics solutions with a distinct focus on scaling knowledge and innovation. For this case study, different qualitative research techniques are combined according to “within-method triangulation” and synthesised in a strengths, weaknesses, opportunities, and threats (SWOT) analysis. The analysis of the Bauhaus.MobilityLab shows that the “typology of amplification processes” is useful as a systematic approach to identifying and evaluating the potential of scaling experimental knowledge.
For the safe and efficient operation of dams, frequent monitoring and maintenance are required. These are usually expensive, time consuming, and cumbersome. To alleviate these issues, we propose applying a wave-based scheme for the location and quantification of damages in dams.
To obtain high-resolution “interpretable” images of the damaged regions, we drew inspiration from non-linear full-multigrid methods for inverse problems and applied a new cyclic multi-stage full-waveform inversion (FWI) scheme. Our approach is less susceptible to the stability issues faced by the standard FWI scheme when dealing with ill-posed problems. In this paper, we first selected an optimal acquisition setup and then used synthetic data to demonstrate the capability of our approach in identifying a series of anomalies in dams by a mixture of reflection and transmission tomography. The results were sufficiently robust, demonstrating the potential of the approach for non-destructive testing of dams.
In this work, the degradation performance of the photocatalytic oxidation of eight micropollutants (amisulpride, benzotriazole, candesartan, carbamazepine, diclofenac, gabapentin, methylbenzotriazole, and metoprolol) in real secondary effluent was investigated using three different reactor designs. For all reactor types, the influence of the irradiation power on the reaction rate and the energetic efficiency was investigated. The flat cell and batch reactors showed similar substance-specific degradation behavior. In the immersion rotary body reactor, benzotriazole and methylbenzotriazole showed a significantly lower degradation affinity. The flat cell reactor achieved the highest mean degradation rate, with half-life values ranging from 5 to 64 min and a mean of 18 min, due to its high ratio of catalyst surface to hydraulic volume. The electrical energy per order (EE/O) values were calculated for all micropollutants as well as the mean degradation rate constant of each experimental step. The lowest substance-specific EE/O value of 5 kWh/m3 was measured for benzotriazole in the batch reactor. The batch reactor also reached the lowest mean values (11.8–15.9 kWh/m3), followed by the flat cell reactor (21.0–37.0 kWh/m3) and the immersion rotary body reactor (23.9–41.0 kWh/m3). Catalyst arrangement and irradiation power were identified as the major influences on the energetic performance of the reactors. Low radiation intensities as well as a submerged catalyst arrangement allowed a reduction in energy demand by a factor of 3–4. A treatment according to the existing treatment goals of wastewater treatment plants (80% total degradation) was achieved using the batch reactor with a calculated energy demand of 7000 Wh/m3.
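The EE/O figure of merit used above is, for a batch system, commonly computed as the electrical energy needed to reduce a pollutant concentration by one order of magnitude per cubic metre of water. A minimal sketch of this standard definition follows; the numbers are illustrative, not the paper's measurements.

```python
import math

def ee_o_batch(power_kw, time_h, volume_m3, c0, ct):
    """EE/O for a batch reactor in kWh per m^3 per order of magnitude.
    power_kw: electrical power of the lamp(s), time_h: treatment time,
    c0 / ct: initial and final pollutant concentrations."""
    return (power_kw * time_h) / (volume_m3 * math.log10(c0 / ct))

# One 1 kW lamp, one hour, 1 m^3, 90% removal (one order of magnitude):
print(ee_o_batch(1.0, 1.0, 1.0, c0=10.0, ct=1.0))  # 1.0 kWh/m^3
```

Lower EE/O means a more energy-efficient reactor, which is how the batch, flat cell, and immersion rotary body designs are ranked in the abstract.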
One of the most important renewable energy technologies in use today is the wind turbine. In this paper, we are interested in identifying the operating status of wind turbines, especially rotor blades, by means of multiphysical models. Testing mechanical structures with ultrasonic-based methods is state of the art. However, due to the material density and the required high resolution, the testing is performed with high-frequency waves, which cannot penetrate the structure in depth. Therefore, there is a need to adopt techniques from the fields of multiphysical model-based inversion schemes or data-driven structural health monitoring. Before investing effort in the development of such approaches, further insights are necessary to make the techniques applicable to structures such as wind power plants (blades). Among the expected developments, further acceleration of the so-called “forward codes” for a more efficient implementation of the wave equation can be envisaged. Here, we employ electromagnetic waves for the early detection of cracks. Because in many practical situations it is not possible to apply techniques from tomography (characterized by multiple source and sensor pairs), we focus on the question of whether the existence of cracks can be determined using only one source for the emitted waves.
The goal of architecture is changing in response to the expanding role of cities, rapid urbanization, and transformation under changing economic, environmental, social, and demographic factors. As cities grew in the early modern era, overcrowding, urbanization, and pollution led reformers to consider the future shape of cities. One of the most critical topics in contemporary architecture is the future concept of living. Domed cities, as such a future concept, are rarely considered in practice and serve chiefly as “utopian” visions in the discourse on future ways of living. This paper reviews domed cities to deepen the understanding of the idea in practice and of its architectural approach. The main aim of this paper is to provide a broad overview of domed cities in the face of pollution, one of the main concerns in many European cities. Accordingly, the review of the existing projects focuses on their conceptual quality. This review will pave the way for further studies on future developments in the realm of domed cities. Due to the lack of accessible literature on the topic, the city of Celje, one of the most polluted cities in Slovenia, is taken as a case study for considering the dome concept. This review’s primary contribution is to allow architects to explore a broad spectrum of innovation by comparing what is achievable today against the possibilities generated by domed cities. As this study shows, the concept of living under a dome remains to be developed in theory and practice. The current challenging climatic situation will accelerate the evolution of these concepts, resulting in the formation of new typologies, which are a requirement for humanity.
Bolted connections are widely employed in structures such as transmission poles, wind turbines, and television (TV) towers. The behaviour of bolted connections is often complex and plays a significant role in the overall dynamic characteristics of a structure. The goal of this work is to conduct a fatigue life-cycle assessment of the bolted connection block of a 193 m tall TV tower, for which 205 days of real measurement data have been obtained from the installed monitoring devices. Based on the recorded data, the best-fit stochastic wind distribution for 50 years, the decisive wind action, and the locations for the fatigue analysis have been determined. A 3D beam model of the entire tower is developed to extract the nodal forces corresponding to the connection block location under various mean wind speeds; this model is then coupled with a detailed finite element model of the connection block, with over three million degrees of freedom, to acquire stress histories for selected bolts. The random stress histories are analysed using the rainflow counting algorithm (RCA), and the damage is estimated using Palmgren-Miner's damage accumulation law. A modification is proposed to integrate the loading-sequence effect, which is otherwise ignored, into the RCA, and the differences between the two RCAs are investigated in terms of the accumulated damage.
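Palmgren-Miner damage accumulation, referred to above, sums the ratio of applied to allowable cycles over all stress ranges; with a Basquin-type S-N curve N(S) = C / S^m it reduces to one line. A minimal sketch with invented parameters follows; the rainflow-counted cycle histogram is assumed to be given, and the constants are not the tower's actual S-N data.

```python
def miner_damage(cycle_counts, C, m):
    """Accumulated fatigue damage D = sum n_i / N_i with N(S) = C / S**m.
    cycle_counts: iterable of (stress_range, n_cycles) pairs, e.g. the
    output of a rainflow count. Failure is predicted at D >= 1."""
    return sum(n / (C / s ** m) for s, n in cycle_counts)

# Illustrative S-N curve: C = 1e12, m = 3  ->  N(100 MPa) = 1e6 cycles.
damage = miner_damage([(100.0, 5e5), (50.0, 1e6)], C=1e12, m=3)
print(damage)  # 0.5 + 0.125 = 0.625
```

Note that this linear rule ignores the loading-sequence effect, which is exactly the limitation the modified rainflow counting proposed in the paper addresses.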
The floods in 2002 and 2013, as well as the recent flood of 2021, caused billions of euros' worth of property damage in Germany. The project Innovative Vulnerability and Risk Assessment of Urban Areas against Flood Events (INNOVARU) aimed to develop a practicable flood damage model that enables realistic damage statements for the residential building stock. In addition to the determination of local flood risks, it also takes into account the vulnerability of individual buildings and allows the prognosis of structural damage. In this paper, we discuss an improved method for the prognosis of structural damage due to flood impact. Detailed correlations between inundation level and flow velocity, depending on the vulnerability of the building types as well as the number of storeys, are considered. Because reliable damage data from events with high flow velocities were not available, an innovative approach was adopted to cover a wide range of flow velocities: the proposed approach combines comprehensive damage data collected after the 2002 flood in Germany with damage data from the tsunami following the 2011 Tohoku earthquake in Japan. The application of the developed methods enables a reliable reinterpretation of the structural damage caused by the August 2002 flood in six study areas in the Free State of Saxony.
A safe and economic structural design based on the semi-probabilistic concept requires statistically representative safety elements, such as characteristic values, design values, and partial safety factors. Regarding climate loads, the safety levels of current design codes strongly reflect experience based on former measurements and investigations assuming stationary conditions, i.e. constant frequencies and intensities. Due to climate change, however, the occurrence of extreme weather events is expected to change in the future, influencing the reliability and safety of structures and their components. Building on established approaches, a systematically refined, data-driven methodology is therefore proposed for the determination of design parameters that accounts for nonstationarity as well as standardized targets of structural reliability and safety. The presented procedure picks up the fundamentals of European standardization and extends them with respect to nonstationarity by applying a shifting-time-window method. The application of the method is demonstrated for projected snow loads, and various influencing parameters are discussed.
Design-related reassessment of structures integrating Bayesian updating of model safety factors
(2022)
In the semi-probabilistic approach of structural design, the partial safety factors are defined by assigning some degree of uncertainty to actions and resistances, associated with the stochastic nature of the parameters. However, uncertainties for individual structures can be better examined by incorporating measurement data provided by sensors of an installed health monitoring scheme. In this context, the current study proposes an approach to revise the partial safety factor on the action side, γE, for existing structures by integrating Bayesian model updating. A simple numerical example of a beam-like structure with artificially generated measurement data is used so that the influence of different sensor setups and data uncertainties on the revised safety factors can be investigated. It is revealed that a health monitoring system can reassess the current capacity reserve of the structure by updating the design safety factors, resulting in a better life-cycle assessment of structures. The outcome is furthermore verified by analysing a real-life small steel railway bridge, ensuring the applicability of the proposed method in practice.
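The Bayesian updating step can be illustrated with the simplest conjugate case: a normal prior on the mean load effect combined with normally distributed monitoring data of known measurement scatter. This is a generic textbook sketch, not the study's actual updating scheme, and all numbers are invented.

```python
def update_normal_mean(mu0, tau0, sigma, data):
    """Conjugate normal-normal update of the mean load effect.
    mu0, tau0: prior mean and prior std of the mean,
    sigma: known std of individual measurements, data: observations.
    Returns the posterior mean and posterior std of the mean."""
    n = len(data)
    xbar = sum(data) / n
    precision = 1.0 / tau0 ** 2 + n / sigma ** 2
    mu_post = (mu0 / tau0 ** 2 + n * xbar / sigma ** 2) / precision
    return mu_post, precision ** -0.5

# Prior design assumption: mean load effect 100 (std 20); four monitoring
# measurements (std 10 each) suggest a lower value around 90.
mu_post, tau_post = update_normal_mean(100.0, 20.0, 10.0,
                                       [88.0, 92.0, 89.0, 91.0])
print(mu_post, tau_post)  # posterior shifts toward the measured ~90
```

A reduced posterior mean and uncertainty of the load effect is what would subsequently justify revising γE downward for the monitored structure.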
Determining the earthquake hazard of any settlement is one of the primary steps in reducing earthquake damage. Therefore, the earthquake hazard maps used for this purpose must be renewed over time. The Turkey Earthquake Hazard Map has been used instead of the Turkey Earthquake Zones Map since 2019. A probabilistic seismic hazard analysis was performed using these last two maps and different attenuation relationships for Bitlis Province (Eastern Turkey), which is located in the Lake Van Basin, an area of high seismic risk. The earthquake parameters were determined by considering all districts and neighborhoods in the province. Probabilistic seismic hazard analyses were carried out for these settlements using seismic sources and four different attenuation relationships. The obtained values are compared with the design spectra stated in the last two earthquake maps. Significant differences exist between the design spectra obtained for the different exceedance probabilities. In this study, adaptive pushover analyses of sample reinforced concrete buildings were performed using the design ground motion level. Structural analyses were carried out using three different design spectra, as given in the last two seismic design codes, and the mean spectrum obtained from the attenuation relationships. Different design spectra significantly change the target displacements predicted for the performance levels of the buildings.
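The exceedance probabilities referred to above follow, under the usual Poisson assumption of probabilistic seismic hazard analysis, directly from the mean return period of the ground motion level. A minimal sketch of this standard relation (the values are the classic design-code example, used here for illustration):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """P(at least one exceedance in t years) under a Poisson model:
    P = 1 - exp(-t / T_R)."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# The classic design level: a 475-year return period corresponds to
# roughly a 10% probability of exceedance in a 50-year building life.
p = exceedance_probability(475.0, 50.0)
print(round(p, 3))
```

Comparing spectra at different exceedance probabilities, as the study does, amounts to comparing ground motions at different return periods through this relation.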
The seismic vulnerability assessment of existing reinforced concrete (RC) buildings is an important input for disaster mitigation plans and rescue services. Different countries have developed various Rapid Visual Screening (RVS) techniques and methodologies to deal with the devastating consequences of earthquakes for the structural characteristics of buildings and for human casualties. Artificial intelligence (AI) methods, such as machine learning (ML) algorithms, are increasingly used in various scientific and technical applications. Investigations into the use of these techniques in civil engineering have shown encouraging results and reduced human intervention, including uncertainties and biased judgment. In this study, several known non-parametric algorithms are investigated for RVS using a dataset covering different earthquakes. Moreover, the methodology allows the buildings’ vulnerability to be examined based on factors related to the buildings’ importance and exposure. In addition, a web-based application built on Django is introduced. The interface is designed to ease seismic vulnerability investigation in real time. The concept was validated using two case studies, and the achieved results demonstrate the potential efficiency of the proposed approach.
In the wake of the news industry’s digitization, novel organizations that differ considerably from traditional media firms in terms of their functional roles and organizational practices of media work are emerging. One new type is the field repair organization, which is characterized by supporting high‐quality media work to compensate for the deficits (such as those which come from cost savings and layoffs) which have become apparent in legacy media today. From a practice‐theoretical research perspective and based on semi‐structured interviews, virtual field observations, and document analysis, we have conducted a single case study on Science Media Center Germany (SMC), a unique non‐profit news start‐up launched in 2016 in Cologne, Germany. Our findings show that, in addition to field repair activities, SMC aims to facilitate progress and innovation in the field, which we refer to as field advancement. This helps to uncover emerging needs and anticipates problems before they intensify or even occur, proactively providing products and tools for future journalism. This article contributes to our understanding of novel media organizations with distinct functions in the news industry, allowing for advancements in theory on media work and the organization of journalism in times of digital upheaval.
Paper-based data acquisition and manual transfer between incompatible software or data formats during bridge inspections, as currently practiced, are time-consuming, error-prone, and cumbersome, and lead to information loss. A fully digitized workflow using open data formats would reduce data loss, effort, and the costs of future inspections. On the one hand, existing studies have proposed methods to automate data acquisition and visualization for inspections; these studies lack an open standard to make the gathered data available for other processes. On the other hand, several studies discuss data structures for exchanging damage information among different stakeholders; however, those studies do not cover automatic data acquisition and transfer. This study focuses on a framework that incorporates automatic damage data acquisition and transfer together with a damage information model for data exchange. This enables inspectors to use damage data for subsequent analyses and simulations. The proposed framework shows the potential of a comprehensive damage information model and the related (semi-)automatic data acquisition and processing.
This dataset presents the numerical analysis of the heat and moisture transport through a facade equipped with a living wall system designated for greywater treatment. While such greening systems provide many environmental benefits, they involve pumping large quantities of water onto the wall assembly, which can increase the risk of moisture in the wall as well as impaired energetic performance due to increased thermal conductivity with increased moisture content in the building materials. This dataset was acquired through numerical simulation using the coupling of two simulation tools, namely Envi-Met and Delphin. This coupling was used to include the complex role the plants play in shaping the near-wall environmental parameters in the hygrothermal simulations. Four different wall assemblies were investigated, each assembly was assessed twice: with and without the living wall. The presented data include the input and output parameters of the simulations, which were presented in the co-submitted article [1].
The call to place the concepts of city and critique at the center of a debate offers a great opportunity to reach an understanding, far beyond conceptual clarifications of our common object of work (which can be very fruitful in their own right), about the function we exercise in society when we practice, research, and teach spatial planning. Since in the Federal Republic there is not only a great need for public planning but also considerable demand for it, and since the planning-related sciences enjoy an overall stable institutional standing, we run the risk of neglecting the socio-political legitimation of our professional field and discipline, of treating it as a given. After all, we hardly have to justify ourselves.
Multi-criteria decision analysis (MCDA) is an established methodology to support decision-making in multi-objective problems. For conducting an MCDA, in most cases a set of objectives (SOO) is required, consisting of a hierarchical structure of objectives, criteria, and indicators. The development of an SOO is usually based on moderated development processes demanding high organizational and cognitive effort from all stakeholders involved. This article proposes elementary interactions as the key paradigm of an algorithm-driven development process for an SOO that requires little moderation effort. Elementary interactions are self-contained information requests that can be answered with little cognitive effort. The pairwise comparison of elements in the well-known analytic hierarchy process (AHP) is an example of an elementary interaction. Each elementary interaction in the presented development process contributes to the stepwise development of an SOO. Based on the hypothesis that an SOO may be developed exclusively using elementary interactions (EIs), a concept for a multi-user platform is proposed. Essential components of the platform are a Model Aggregator, an Elementary Interaction Stream Generator, a Participant Manager, and a Discussion Forum. While the latter component serves the professional exchange of the participants, the first three components are intended to be automatable by algorithms. The proposed platform concept has been partly evaluated in an explorative validation study demonstrating the general functionality of the algorithms outlined. In summary, the suggested platform concept shows the potential to ease SOO development processes: it does not restrict the application domain, it is intended to work with little administrative and moderation effort, and it supports the further development of an existing SOO in the event of changes in external conditions.
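The pairwise comparison named above as an example of an elementary interaction can be turned into priority weights with the row geometric-mean approximation of the AHP eigenvector. A minimal sketch follows; the 2x2 comparison matrix is purely illustrative.

```python
import math

def ahp_weights(matrix):
    """Priority weights from a reciprocal pairwise-comparison matrix
    via the row geometric-mean method (an AHP eigenvector approximation)."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geo-means
    total = sum(geo)
    return [g / total for g in geo]                        # normalize to 1

# One elementary interaction: "criterion A is 3 times as important as B".
weights = ahp_weights([[1.0, 3.0],
                       [1.0 / 3.0, 1.0]])
print(weights)  # [0.75, 0.25]
```

Each answered pairwise comparison thus contributes a small, self-contained piece of the hierarchy's weighting, which is what makes it suitable as a low-effort elementary interaction.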
The algorithm-driven development of SOOs proposed in this article may ease the development of MCDA applications and thus have a positive effect on their spread.
Operator Calculus Approach to Comparison of Elasticity Models for Modelling of Masonry Structures
(2022)
The solution of any engineering problem starts with a modelling process aimed at formulating a mathematical model, which must describe the problem under consideration with sufficient precision. Because of the heterogeneity of modern engineering applications, mathematical modelling nowadays ranges from incredibly precise micro- and even nano-modelling of materials to macro-modelling, which is more appropriate for practical engineering computations. In the field of masonry structures, a macro-model of the material can be constructed based on various elasticity theories, such as classical elasticity, micropolar elasticity, and Cosserat elasticity. Evidently, a different macro-behaviour is expected depending on the specific theory used in the background. Although there have been several theoretical studies of different elasticity theories in recent years, there is still a lack of understanding of how the modelling assumptions of different elasticity theories influence the modelling results for masonry structures. Therefore, a rigorous approach to the comparison of different three-dimensional elasticity models, based on quaternionic operator calculus, is proposed in this paper. In this way, three elasticity models are described and spatial boundary value problems for these models are discussed. In particular, explicit representation formulae for their solutions are constructed. After that, using these representation formulae, explicit estimates for the solutions obtained by the different elasticity theories are derived. Finally, several numerical examples are presented, which indicate a practical difference in the solutions.
It is widely accepted that most people spend the majority of their lives indoors. Most individuals do not realize that while indoors, roughly half of the heat exchange affecting their thermal comfort occurs as thermal infrared radiation. We show that while researchers have been aware of its significance for thermal comfort over the past century, systematic error has crept into the most common evaluation techniques, preventing adequate characterization of the radiant environment. Measuring and characterizing radiant heat transfer is a critical component of both building energy efficiency and occupant thermal comfort and productivity. Globe thermometers are typically used to measure mean radiant temperature (MRT), a common metric for accounting for the radiant effects of an environment at a point in space. In this paper, we extend previous field work to a controlled laboratory setting to (1) rigorously demonstrate that the existing correction factors in ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) Standard 55 and ISO 7726 for quantifying MRT with globe thermometers are not sufficient; (2) develop a correction that improves the use of globe thermometers and addresses the problems in the current standards; and (3) show that mean radiant temperature measured with ping-pong-ball-sized globe thermometers is not reliable due to a stochastic convective bias. We also provide an analysis of the maximum precision of the globe sensors themselves, an analysis missing from the contemporary literature.
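The forced-convection correction examined above converts a globe reading into mean radiant temperature; the ISO 7726 form for a globe is sketched below. This illustrates the standard formula the paper critiques, with illustrative input values, not the paper's proposed correction.

```python
def mean_radiant_temp(t_globe, t_air, v_air, diameter=0.15, emissivity=0.95):
    """MRT (deg C) from a globe thermometer, ISO 7726 forced-convection form:
    MRT = [(Tg+273)^4 + 1.1e8 * v^0.6 / (eps * D^0.4) * (Tg - Ta)]^(1/4) - 273
    with v_air in m/s and the globe diameter D in m."""
    h = 1.1e8 * v_air ** 0.6 / (emissivity * diameter ** 0.4)
    return ((t_globe + 273.0) ** 4 + h * (t_globe - t_air)) ** 0.25 - 273.0

# Globe at 25 C, air at 22 C, air speed 0.3 m/s: the radiant environment
# must be warmer than the globe reading to offset its convective losses.
mrt = mean_radiant_temp(25.0, 22.0, 0.3)
print(round(mrt, 1))
```

The strong dependence on air speed and globe diameter visible in the coefficient h is the entry point for both the insufficiency of the standard correction and the stochastic convective bias of small (ping-pong-ball-sized) globes discussed in the paper.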