A realistic uncertainty description incorporating both aleatoric and epistemic uncertainties can be achieved within the framework of polymorphic uncertainty, which is computationally demanding. Utilizing a domain decomposition approach for random-field-based uncertainty models, the proposed level-based sampling method can reduce these computational costs significantly and shows good agreement with a standard sampling technique. While 2-level configurations tend to become unstable with decreasing sampling density, 3-level setups show encouraging results for the investigated reliability analysis of a structural unit square.
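The nested sampling structure behind such polymorphic uncertainty models can be illustrated with a minimal sketch: an outer epistemic loop over interval-valued parameters and an inner aleatoric Monte Carlo loop. The limit-state function, parameter values, and sample sizes below are illustrative assumptions, not the configuration used in the study.

```python
import random

def limit_state(x, theta):
    # hypothetical limit state: failure when the squared response exceeds theta
    return theta - x * x

def failure_probability(theta, n_inner=2000, seed=0):
    # aleatoric loop: Monte Carlo over the standard-normal random variable x
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n_inner)
                if limit_state(rng.gauss(0.0, 1.0), theta) < 0.0)
    return fails / n_inner

# epistemic loop: an interval-valued parameter theta yields bounds on P_f
thetas = [1.0, 2.0, 4.0]
pf_values = [failure_probability(t) for t in thetas]
print(min(pf_values), max(pf_values))
```

The level-based method of the study refines this brute-force double loop; the sketch only shows why the cost multiplies across uncertainty levels.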
Bauhaus visiting professor Mirjam Wenzel spoke on 30 June 2021 in the Audimax of the Bauhaus-Universität Weimar about the genesis and conception of Jewish museums. She discussed the extent to which these museums are particularly relevant to current social and political questions. Prof. Wenzel's second public lecture at the Bauhaus-Universität Weimar outlined the potential of cultural institutions in times of socio-political change in general, and the significance of Jewish museums in the face of verbal and physical violence against Jews in particular.
Compiling and disseminating information about incidents and disasters are key to disaster management and relief. But due to inherent limitations of the acquisition process, the required information is often incomplete or missing altogether. To fill these gaps, citizen observations spread through social media are widely considered to be a promising source of relevant information, and many studies propose new methods to tap this resource. Yet, the overarching question of whether and under which circumstances social media can supply relevant information (both qualitatively and quantitatively) still remains unanswered. To shed some light on this question, we review 37 disaster and incident databases covering 27 incident types, compile a unified overview of the contained data and their collection processes, and identify the missing or incomplete information. The resulting data collection reveals six major use cases for social media analysis in incident data collection: (1) impact assessment and verification of model predictions, (2) narrative generation, (3) recruiting citizen volunteers, (4) supporting weakly institutionalized areas, (5) narrowing surveillance areas, and (6) reporting triggers for periodical surveillance. Furthermore, we discuss the benefits and shortcomings of using social media data for closing information gaps related to incidents and disasters.
Chemical glass frosting processes are widely used to create visually attractive glass surfaces. A commonly used frosting bath mainly contains ammonium bifluoride (NH4HF2) mixed with hydrochloric acid (HCl). The frosting process consists of several baths: first, the preliminary bath to clean the object; second, the frosting bath, which etches the rough, light-scattering structure into the glass surface; and finally, the washing baths to clean the frosted object. This is where the constituents of the preceding steps accumulate and have to be filtered from the sewage. In the present contribution, phosphoric acid (H3PO4) was used as a substitute for HCl to reduce the amount of ammonium (NH4+) and chloride (Cl−) dissolved in the waste water. In combination with magnesium carbonate (MgCO3), it allows the precipitation of ammonium from the sewage as ammonium magnesium phosphate (MgNH4PO4). However, a trivial replacement of HCl by H3PO4 in the frosting process causes extensive frosting errors, such as inhomogeneous size distributions of the structures or domains that are not fully covered by these structures. By modifying the preliminary bath composition, it was possible to improve the frosting result considerably. To determine the optimal composition of the preliminary bath, a semi-automatic evaluation method has been developed. This method makes an objective comparison of the resulting surface quality possible.
Entrepreneurship and start-up activities are seen as a key response to recent upheavals in the media industry: Newly founded ventures can act as important drivers for industry transformation and renewal, pioneering new products, business models, and organizational designs (e.g. Achtenhagen, 2017; Buschow & Laugemann, 2020).
In principle, media students represent a crucial population of nascent entrepreneurs: individuals who will likely become founders of start-ups (Casero-Ripollés et al., 2016). However, their willingness to start a new business is generally considered to be rather low (Goyanes, 2015), and for journalism students, the idea of innovation tends to be conservative, following traditional norms and professional standards (Singer & Broersma, 2020). In a sample of Spanish journalism students, López-Meri et al. (2020) found that one of the main barriers to entrepreneurial intentions is that students feel they lack knowledge and training in entrepreneurship.
In the last 10 years, a wide variety of entrepreneurship education courses have been set up in media departments of colleges and universities worldwide.
These programs have been designed to sensitize and prepare communications, media and journalism students to think and act entrepreneurially (e.g. Caplan et al., 2020; Ferrier, 2013; Ferrier & Mays, 2017; Hunter & Nel, 2011). Entrepreneurial competencies and practices not only play a crucial role for start-ups but, in times of digital transformation, are increasingly sought after by legacy media companies as well (Küng, 2015).
At the Department of Journalism and Communication Research, Hanover University of Music, Drama and Media, Germany, we have been addressing these developments with the "Media Entrepreneurship" program. The course, established in 2013, aims to provide fundamental knowledge of entrepreneurship as well as to promote students' entrepreneurial thinking and behavior. This article presents the pedagogical approach of the program and investigates its learning outcomes. By outlining and evaluating the Media Entrepreneurship program, this article aims to promote good practices in entrepreneurship education in communications, media and journalism, and to reflect on the limitations of such programs.
This article focuses on the research and development of new cellulose ether derivatives as innovative superplasticizers for mortar systems. Several synthetic strategies have been pursued to obtain new compounds and to study their properties in cementitious systems as new bio-based additives. The new water-soluble admixtures were synthesized using a complex carboxymethylcellulose-based backbone that was first hydrolyzed and then sulfo-ethylated in the presence of sodium vinyl sulphonate. Starting with a complex biopolymer that is widely known as a thickening agent was very challenging; the desired goal was achieved only by varying the hydrolysis times and temperatures of the reactions. The obtained derivatives showed different molecular weights (Mw) and anionic charges on their backbones. An improvement in the shear stress and dynamic viscosity values of CEM II 42.5R cement was observed with the samples obtained with longer, higher-temperature hydrolysis and sulfo-ethylation. Investigations into the chemical nature of the pore solution, calorimetric studies and adsorption experiments clearly showed the ability of the carboxymethyl cellulose superplasticizer (CMC SP) to interact with cement grains and influence hydration processes within a 48-h time window, causing a delay in the hydration reactions in the samples. The fluidity of the cementitious matrices was ascertained through slump tests, and preliminary studies of the mechanical and flexural strength of hardened mortar formulated with the new ecological additives provided first reference values for the mechanical properties. Finally, computed tomography (CT) images completed the investigation of the pore network structure of the hardened specimens, highlighting their promising pore structure.
The development of a hydro-mechanically coupled Coupled Eulerian-Lagrangian (CEL) method and its application to the back-analysis of vibratory pile driving model tests in water-saturated sand is presented. The predicted pile penetration using this approach is in good agreement with the results of the model tests as well as with fully Lagrangian simulations. In terms of pore water pressure, however, the results of the CEL simulation show a slightly worse accordance with the model tests compared to the Lagrangian simulation. Some shortcomings of the hydro-mechanically coupled CEL method in the case of frictional contact problems and pore fluids with a high bulk modulus are discussed. Lastly, the CEL method is applied to the simulation of vibratory driving of open-profile piles under partially drained conditions to study installation-induced changes in the soil state. It is concluded that the proposed method is capable of realistically reproducing the most important mechanisms in the soil during the driving process despite the addressed shortcomings.
One of the most important subjects in hydraulic engineering is the reliable estimation of the transverse distribution of bed and wall shear stresses in rectangular channels. This study uses the Tsallis entropy, genetic programming (GP), and adaptive neuro-fuzzy inference system (ANFIS) methods to assess the shear stress distribution (SSD) in rectangular channels.
To evaluate the results of the Tsallis entropy, GP and ANFIS models, laboratory observations were used in which shear stress was measured with an optimized Preston tube, which was then used to measure the SSD at various aspect ratios in the rectangular channel. To investigate the shear stress percentage, 10 data series with a total of 112 different data points were used. The results of the sensitivity analysis show that the most influential parameter for the SSD in smooth rectangular channels is the dimensionless parameter B/H, where B is the transverse coordinate and H is the flow depth. With the parameters (b/B) and (B/H) for the bed and (z/H) and (B/H) for the wall as inputs, the GP model performed better than the others. Based on the analysis, it can be concluded that the GP and ANFIS algorithms are more effective in estimating shear stress in smooth rectangular channels than the Tsallis entropy-based equations.
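As a hedged illustration of the data-driven side of such SSD modeling, the sketch below fits an ordinary least-squares line relating a relative bed shear stress to the aspect ratio B/H. The observations are invented stand-ins, and a plain linear fit replaces the GP and ANFIS models of the study.

```python
# invented observations: fraction of total shear carried by the bed vs B/H
aspect = [1.0, 2.0, 4.0, 6.0, 8.0]
tau_bed = [0.55, 0.65, 0.78, 0.84, 0.88]

def linear_fit(xs, ys):
    # ordinary least squares for a single predictor
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

intercept, slope = linear_fit(aspect, tau_bed)
print(round(intercept, 3), round(slope, 3))
```

A positive slope reproduces the qualitative finding that B/H drives the bed-shear share; the actual models in the study are nonlinear.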
This study proposes an efficient Bayesian, frequency-based damage identification approach to identify damage in cantilever structures with an acceptable error rate, even at high noise levels. The catenary poles of electric high-speed train systems were selected as a realistic case study to cover the objectives of this study. Compared to other frequency-based damage detection approaches described in the literature, the proposed approach efficiently detects damage in cantilever structures at higher levels of damage identification, namely identifying both the damage location and severity, using a low-cost structural health monitoring (SHM) system with a limited number of sensors, for example accelerometers. The integration of Bayesian inference as a stochastic framework makes it possible to utilize the benefit of data fusion in merging the informative data from multiple damage features, which increases the quality and accuracy of the results. The findings provide the decision-maker with the information required to manage maintenance, repair, or replacement procedures.
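The core idea of frequency-based Bayesian damage identification can be sketched with a grid-based posterior over a single damage severity, given one measured natural frequency. The stiffness-reduction model, intact frequency, measurement, and noise level below are all illustrative assumptions, not the study's actual model.

```python
import math

f_intact = 10.0  # Hz, assumed first natural frequency of the intact pole

def model_freq(severity):
    # assumed stiffness-reduction model: f ~ f0 * sqrt(1 - severity)
    return f_intact * math.sqrt(1.0 - severity)

measured = 9.2  # Hz, hypothetical measurement
sigma = 0.1     # measurement noise standard deviation, assumed

# grid-based Bayesian inference with a uniform prior on severity in [0, 0.9]
grid = [i / 100 for i in range(91)]
likelihood = [math.exp(-0.5 * ((measured - model_freq(s)) / sigma) ** 2)
              for s in grid]
z = sum(likelihood)
posterior = [l / z for l in likelihood]
map_severity = grid[posterior.index(max(posterior))]
print(map_severity)
```

Fusing several frequencies (or other damage features) would multiply their likelihoods before normalizing, which is the data-fusion benefit the abstract refers to.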
Biofeedback constitutes a well-established, non-invasive method to voluntarily interfere with emotional processing by means of cognitive strategies. However, treatment durations exhibit strong inter-individual variation, and first successes can often be achieved only after a large number of sessions. Sham feedback constitutes a rather untapped approach, providing feedback that does not correspond to the participant's actual state. The current study aims to gain insights into the mechanisms of sham feedback processing in order to support new techniques in biofeedback therapy. We carried out two experiments and applied different types of sham feedback based on skin conductance responses and pupil size changes during affective processing. Results indicate that standardized but context-sensitive sham signals based on skin conductance responses exert a stronger influence on emotional regulation than individual sham feedback from ongoing pupil dynamics. Also, sham feedback should forego unnatural signal behavior to avoid irritation and skepticism among participants. Altogether, a reasonable combination of stimulus features and sham feedback characteristics makes it possible to considerably reduce actual bodily responsiveness within a single session.
This paper outlines an important step in characterizing a novel field of robotic construction research in which a cable-driven parallel robot is used to extrude cementitious material in three-dimensional space, thus offering a comprehensive new approach to computational design and construction, and to robotic fabrication at larger scales. Developed by the Faculty of Art and Design at Bauhaus-University Weimar (Germany), the Faculty of Architecture at the University of Applied Sciences Dortmund (Germany) and the Chair of Mechatronics at the University of Duisburg-Essen (Germany), this approach offers unique advantages over existing additive manufacturing methods: the system is easily transportable and scalable, it does not require additional formwork or scaffolding, and it offers digital integration and informational oversight across the entire design and building process. This paper considers 1) key research components of cable-robotic 3D printing (such as computational design, material exploration, and robotic control), and 2) the integration of these parameters into a unified design and building process. Particular attention is given to demonstrating the approach at full scale.
This article aims to develop a social theory of violence that emphasizes the role of the third party as well as the communication between the involved subjects. For this Teresa Koloma Beck’s essay ‘The Eye of the Beholder: Violence as a Social Process’ is taken as a starting point, which adopts a social-constructivist perspective. On the one hand, the basic concepts and the benefits of this approach are presented. On the other hand, social-theoretical problems of this approach are revealed. These deficits are counteracted by expanding Koloma Beck’s approach with a communicative-constructivist framework. Thus, the role of communicative action and the ‘objectification of violence’ is emphasized. These aspects impact the perception, judgement and (de-)legitimation of violence phenomena and the emergence of a ‘knowledge of violence’. Communicative actions and objectifications form a key to understanding violent interactions and the link between the micro and macro levels. Finally, the methodological consequences for the research of violence and Communicative Constructivism are discussed. Furthermore, possible research fields are outlined, which open up by looking at communicative action and the objectifications within the ‘triads of violence’.
Among the various undertakings of socially engaged "counter-science" that appeared on the scene in West Germany around 1980 was the Berliner Wissenschaftsladen e. V. (WILAB), founded in 1982, a kind of "alternative" spin-off of the Technische Universität Berlin. This contribution situates the founding of the "Laden" in the context of contemporary advances in (regional) research and technology policy. It shows how the deindustrializing island city, through countermeasures of "innovation policy", even came to play a certain pioneering role: innovations visible beyond the city limits, such as the founders' fair BIG TECH or the Berliner Innovations- und Gründerzentrum (BIG), opened in 1983 as the first "incubator" [sic] in West Germany, were the work of the technology transfer office of the TU Berlin, TU-transfer, launched in 1977/78.
In other words, the conditions that emerged here were increasingly incompatible with the dreams of a "critical", self-determined (counter-)science. In latent contrast to the historiographical prominence of the science-critical zeitgeist, undertakings committed to "alternative" goals, such as WILAB, led a relatively marginalized niche existence. Nevertheless, the aim pursued at WILAB, which in this light had little prospect of success, namely to initiate a different, "more humane" information technology, sheds instructive light on the emergence of "entrepreneurial" science in West Germany around 1980.
Object-Oriented Damage Information Modeling Concepts and Implementation for Bridge Inspection
(2022)
Bridges are designed to last for more than 50 years and consume up to 50% of their life-cycle costs during their operation phase. Several inspections and assessment actions are executed during this period. Bridge and damage information must be gathered, digitized, and exchanged between different stakeholders. Currently, the inspection and assessment practices rely on paper-based data collection and exchange, which is time-consuming and error-prone, and leads to loss of information. Storing and exchanging damage and building information in a digital format may lower costs and errors during inspection and assessment and support future needs, for example, immediate simulations regarding performance assessment, automated maintenance planning, and mixed reality inspections. This study focused on the concept for modeling damage information to support bridge reviews and structural analysis. Starting from the definition of multiple use cases and related requirements, the data model for damage information is defined independently from the subsequent implementation. In the next step, the implementation via an established standard is explained. Functional tests aim to identify problems in the concept and implementation. To show the capability of the final model, two example use cases are illustrated: the inspection review of the entire bridge and a finite-element analysis of a single component. Main results are the definition of necessary damage data, an object-oriented damage model, which supports multiple use cases, and the implementation of the model in a standard. Furthermore, the tests have shown that the standard is suitable to deliver damage information; however, several software programs lack proper implementation of the standard.
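A minimal sketch of what an object-oriented damage model might look like, using Python dataclasses. The class and attribute names are hypothetical and do not reflect the data model or the implementation standard used in the study.

```python
from dataclasses import dataclass, field

@dataclass
class Damage:
    damage_type: str   # e.g. "crack", "spalling"; vocabulary is illustrative
    width_mm: float
    length_mm: float

@dataclass
class Component:
    name: str
    damages: list = field(default_factory=list)

@dataclass
class Bridge:
    name: str
    components: list = field(default_factory=list)

    def all_damages(self):
        # inspection-review use case: aggregate damages over the whole bridge
        return [d for c in self.components for d in c.damages]

girder = Component("main girder")
girder.damages.append(Damage("crack", width_mm=0.3, length_mm=120.0))
bridge = Bridge("demo bridge", components=[girder])
print(len(bridge.all_damages()))
```

Attaching damages to components, as here, is what lets a single model serve both the whole-bridge review and the per-component structural analysis use cases.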
Quantification of cracks in concrete thin sections considering current methods of image analysis
(2022)
Image analysis is used in this work to quantify cracks in concrete thin sections via modern image processing. Thin sections were impregnated with a yellow epoxy resin to increase the contrast between voids and the other phases of the concrete. By means of several pre-processing steps, machine learning and Python scripts, cracks can be quantified in an area of up to 40 cm2. As a result, the crack area, lengths and widths were estimated automatically within a single workflow. Crack patterns caused by freeze-thaw damage were investigated. To compare the inner degradation of the investigated thin sections, the crack density was used. Cracks in the thin sections were measured manually in two different ways to validate the automatically determined results. On the one hand, the presented work shows that the width of cracks can be determined pixelwise, thus providing a plot of the width distribution. On the other hand, the automatically measured crack length differs from the manually measured one.
In this paper, we present an open-source code for the first-order and higher-order nonlocal operator method (NOM), including a detailed description of the implementation. The NOM is based on so-called support, dual-support, nonlocal operators, and an operator energy functional ensuring stability. The nonlocal operator is a generalization of the conventional differential operators. Combined with the method of weighted residuals and variational principles, the NOM establishes the residual and tangent stiffness matrix of the operator energy functional through simple matrix operations, without the need for shape functions as in other classical computational methods such as the FEM. The NOM only requires the definition of the energy, drastically simplifying its implementation. The implementation in this paper is focused on linear elastic solids for the sake of conciseness, though the NOM can handle more complex nonlinear problems. The NOM is flexible and efficient for solving partial differential equations (PDEs), and it is straightforward for readers to use the NOM and extend it to other complicated physical phenomena described by one or a set of PDEs. Finally, we present some classical benchmark problems, including the cantilever beam and the plate-with-a-hole problem, and we also extend the method to complicated problems including phase-field fracture modeling and gradient elasticity materials.
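The idea of a nonlocal operator as a generalization of a differential operator can be sketched in 1D: a weighted least-squares slope over a support recovers the first derivative without shape functions. The weight function and support points below are illustrative choices, not the scheme of the paper.

```python
def nonlocal_gradient(f, x, support, weight=lambda r: 1.0):
    # first-order nonlocal operator in 1D: a weighted least-squares slope
    # over the support approximates the derivative without shape functions
    num = sum(weight(abs(xj - x)) * (f(xj) - f(x)) * (xj - x) for xj in support)
    den = sum(weight(abs(xj - x)) * (xj - x) ** 2 for xj in support)
    return num / den

def f(x):
    return x ** 2

x0 = 1.0
support = [x0 + dx for dx in (-0.2, -0.1, 0.1, 0.2)]
print(nonlocal_gradient(f, x0, support))
```

For this symmetric support the odd error terms cancel and the sketch returns the exact derivative 2.0; in general the approximation improves as the support shrinks.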
In this study, we propose a nonlocal operator method (NOM) for the dynamic analysis of (thin) Kirchhoff plates. The nonlocal Hessian operator is derived based on a second-order Taylor series expansion. The NOM does not require any shape functions and associated derivatives, as 'classical' approaches such as the FEM do, drastically facilitating the implementation. Furthermore, the NOM is higher-order continuous, which is exploited for thin plate analysis that requires C1 continuity. The nonlocal dynamic governing formulation and operator energy functional for Kirchhoff plates are derived from a variational principle. The velocity Verlet algorithm is used for the time discretization. After confirming the accuracy of the nonlocal Hessian operator, several numerical examples are simulated with the nonlocal dynamic Kirchhoff plate formulation.
We present a stochastic deep collocation method (DCM) based on neural architecture search (NAS) and transfer learning for heterogeneous porous media. We first carry out a sensitivity analysis to determine the key hyper-parameters of the network in order to reduce the search space, and subsequently employ hyper-parameter optimization to obtain the parameter values. The presented NAS-based DCM also saves the weights and biases of the most favorable architectures, which are then used in the fine-tuning process. We also employ transfer learning techniques to drastically reduce the computational cost. The presented DCM is then applied to the stochastic analysis of heterogeneous porous material. To this end, a three-dimensional stochastic flow model is built, providing a benchmark for the simulation of groundwater flow in highly heterogeneous aquifers. The performance of the presented NAS-based DCM is verified in different dimensions using the method of manufactured solutions. We show that it significantly outperforms finite difference methods in both accuracy and computational cost.
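To illustrate the collocation principle behind a DCM without a deep-learning stack, the sketch below minimizes the residual of the PDE u'' + 2 = 0 at collocation points over a one-parameter trial function u_a(x) = a·x·(1 − x) by gradient descent. A real DCM replaces the trial function with a neural network and adds boundary-condition terms; everything here is an illustrative toy.

```python
# collocation points in the interior of the domain (0, 1)
collocation_points = [0.1 * i for i in range(1, 10)]

def residual(a, x):
    # u_a''(x) = -2a for the trial function, so the PDE residual is -2a + 2
    return -2.0 * a + 2.0

a = 0.0   # trainable parameter of the trial function
lr = 0.05
for _ in range(200):
    # gradient of the mean squared residual with respect to a
    grad = sum(2.0 * residual(a, x) * (-2.0)
               for x in collocation_points) / len(collocation_points)
    a -= lr * grad
print(round(a, 4))  # converges to 1.0, i.e. the exact solution u = x(1 - x)
```

The NAS and transfer-learning components of the paper tune the network architecture and reuse trained weights; the residual-minimization loop above is the part they all share.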
In machine learning, if the training data is independently and identically distributed as the test data, then a trained model can make accurate predictions for new samples of data. Conventional machine learning depends strongly on massive amounts of domain-specific training data to understand the latent patterns. In contrast, domain adaptation and transfer learning are sub-fields of machine learning concerned with solving the inescapable problem of insufficient training data by relaxing the domain-dependence hypothesis. In this contribution, this issue is addressed and, by a novel combination of both methods, we develop a computationally efficient and practical algorithm to solve boundary value problems based on nonlinear partial differential equations. We adopt a meshfree analysis framework to integrate the prevailing geometric modelling techniques based on NURBS and present an enhanced deep collocation approach that also plays an important role in the accuracy of the solutions. We begin with a brief introduction on how these methods expand upon this framework. We observe excellent agreement between the methods and show how fine-tuning a pre-trained network on a specialized domain may lead to outstanding performance compared to existing approaches. As a proof of concept, we illustrate the performance of our proposed model on several benchmark problems.
In this work, we present a deep collocation method (DCM) for three-dimensional potential problems in non-homogeneous media. This approach utilizes a physics-informed neural network with material transfer learning, reducing the solution of the non-homogeneous partial differential equations to an optimization problem. We tested different configurations of the physics-informed neural network, including smooth activation functions, sampling methods for collocation point generation, and combined optimizers. A material transfer learning technique is utilized for non-homogeneous media with different material gradations and parameters, which enhances the generality and robustness of the proposed method. In order to identify the most influential parameters of the network configuration, we carried out a global sensitivity analysis. Finally, we provide a convergence proof of our DCM. The approach is validated through several benchmark problems, also testing different material variations.
Subscription-based news platforms (such as “Apple News+” or “Readly”) that bundle content from different publishers into one comprehensive package and offer it to media users at a fixed monthly rate are a new way of accessing and consuming digital journalism. These services have received little attention in journalism studies, although they differ greatly from traditional media products and distribution channels. This article empirically investigates the perception of journalism platforms based on eight qualitative focus group discussions with 55 German news consumers.
Results show that the central characteristics these platforms should fulfill in order to attract users are strikingly similar to the characteristics of media platforms from the music and video industries, in particular regarding price points, contract features, and modes of usage. Against this background, the potential and perspectives of a subscription-based news platform for journalism’s societal role are discussed.
Immanuel Kant's thought is a central historical and theoretical reference in Hans Blumenberg's metaphorological project. This is demonstrated by the fact that in the Paradigms the author outlines the concept of absolute metaphor by explicitly referring to §59 of the Critique of the Power of Judgment and recognizing in the Kantian symbol a model for his own metaphorics. However, Kant's name also appears in the chapter on the metaphor of the "terra incognita": not only did he theorize the presence of symbolic hypotyposis in our language [...], but he also made extensive use of metaphors linked to "determinate historical experiences", in particular geographical metaphors. In my essay, I would like to start from the analysis of Kant's geographical metaphors in order to rethink Blumenberg's archaeological method as an archaeology of media that grounds the study of metaphors in the materiality of communication and in the combination of tools, agents and media.
Real-world labs hold the potential to catalyse rapid urban transformations through real-world experimentation. Characterised by a rather radical, responsive, and location-specific nature, real-world labs face constraints in the scaling of experimental knowledge. To make a significant contribution to urban transformation, the produced knowledge must go beyond the level of a building, street, or small district where real-world experiments are conducted. Thus, a conflict arises between experimental boundaries and the stimulation of broader implications. The challenges of scaling experimental knowledge have been recognised as a problem, but remain largely unexplained. Based on this, the article will discuss the applicability of the “typology of amplification processes” by Lam et al. (2020) to explore and evaluate the potential of scaling experimental knowledge from real-world labs. The application of the typology is exemplified in the case of the Bauhaus.MobilityLab. The Bauhaus.MobilityLab takes a unique approach by testing and developing cross-sectoral mobility, energy, and logistics solutions with a distinct focus on scaling knowledge and innovation. For this case study, different qualitative research techniques are combined according to “within-method triangulation” and synthesised in a strengths, weaknesses, opportunities, and threats (SWOT) analysis. The analysis of the Bauhaus.MobilityLab proves that the “typology of amplification processes” is useful as a systematic approach to identifying and evaluating the potential of scaling experimental knowledge.
For the safe and efficient operation of dams, frequent monitoring and maintenance are required. These are usually expensive, time consuming, and cumbersome. To alleviate these issues, we propose applying a wave-based scheme for the location and quantification of damages in dams.
To obtain high-resolution “interpretable” images of the damaged regions, we drew inspiration from non-linear full-multigrid methods for inverse problems and applied a new cyclic multi-stage full-waveform inversion (FWI) scheme. Our approach is less susceptible to the stability issues faced by the standard FWI scheme when dealing with ill-posed problems. In this paper, we first selected an optimal acquisition setup and then applied synthetic data to demonstrate the capability of our approach in identifying a series of anomalies in dams by a mixture of reflection and transmission tomography. The results had sufficient robustness, showing the prospects of application in the field of non-destructive testing of dams.
In this work, the degradation performance of the photocatalytic oxidation of eight micropollutants (amisulpride, benzotriazole, candesartan, carbamazepine, diclofenac, gabapentin, methylbenzotriazole, and metoprolol) in real secondary effluent was investigated using three different reactor designs. For all reactor types, the influence of irradiation power on the reaction rate and energetic efficiency was investigated. The flat cell and batch reactors showed very similar substance-specific degradation behavior. In the immersion rotary body reactor, benzotriazole and methylbenzotriazole showed a significantly lower degradation affinity. The flat cell reactor achieved the highest mean degradation rate, with half-life values ranging from 5 to 64 min and a mean of 18 min, due to its high catalyst-surface-to-hydraulic-volume ratio. The electrical energy per order (EE/O) values were calculated for all micropollutants, as well as the mean degradation rate constant of each experimental step. The lowest substance-specific EE/O value of 5 kWh/m3 was measured for benzotriazole in the batch reactor. The batch reactor also reached the lowest mean values (11.8–15.9 kWh/m3), followed by the flat cell reactor (21.0–37.0 kWh/m3) and the immersion rotary body reactor (23.9–41.0 kWh/m3). Catalyst arrangement and irradiation power were identified as the major influences on the energetic performance of the reactors. Low radiation intensities as well as the use of a submerged catalyst arrangement allowed a reduction in energy demand by a factor of 3–4. Treatment according to the existing treatment goals of wastewater treatment plants (80% total degradation) was achieved using the batch reactor with a calculated energy demand of 7000 Wh/m3.
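The EE/O figures quoted above follow the standard electrical-energy-per-order definition, EE/O = P·t / (V·log10(C0/C)): the energy needed to reduce a pollutant concentration by one order of magnitude per unit volume. The reactor numbers in the snippet are hypothetical, not those of the study.

```python
import math

def ee_per_order(power_kw, volume_m3, time_h, c0, c):
    # electrical energy per order (EE/O) in kWh/m^3 per order of magnitude
    return (power_kw * time_h) / (volume_m3 * math.log10(c0 / c))

# hypothetical run: 0.1 kW lamp, 5 L batch, 1 h, 90% degradation (one order)
val = ee_per_order(0.1, 0.005, 1.0, c0=100.0, c=10.0)
print(round(val, 3))
```

With these assumed numbers the run removes exactly one order of magnitude, so the EE/O equals the volumetric energy input of 20 kWh/m3.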
Wind power turbines are among the most important renewable energy technologies in use today. In this paper, we are interested in identifying the operating status of wind turbines, especially rotor blades, by means of multiphysical models. Testing mechanical structures with ultrasonic-based methods is state-of-the-art technology. However, due to the material density and the required high resolution, the testing is performed with high-frequency waves, which cannot penetrate the structure in depth. Therefore, there is a need to adapt techniques from the fields of multiphysical model-based inversion schemes or data-driven structural health monitoring. Before investing effort in the development of such approaches, further insights and approaches are necessary to make the techniques applicable to structures such as wind power plants (blades). Among the expected developments, further acceleration of the so-called “forward codes” for a more efficient implementation of the wave equation could be envisaged. Here, we employ electromagnetic waves for the early detection of cracks. Because in many practical situations it is not possible to apply techniques from tomography (characterized by multiple source and sensor pairs), we focus on the question of whether the existence of cracks can be determined using only one source for the emitted waves.
The goal of architecture is changing in response to the expanding role of cities, rapid urbanization, and transformation under changing economic, environmental, social, and demographic factors. As cities grew in the early modern era, overcrowding, urbanization, and pollution led reformers to consider the future shape of cities. One of the most critical topics in contemporary architecture is the future concept of living. Domed cities, as such a concept, are rarely considered and serve chiefly as “utopian” visions in the discourse on future ways of living. This paper reviews domed cities to deepen the understanding of the idea in practice and of its architectural approach. Its main aim is to provide a broad overview of domed cities in the face of pollution, one of the main concerns in many European cities. The review of existing projects therefore focuses on their conceptual quality and paves the way for further studies of future developments in the realm of domed cities. The city of Celje, one of the most polluted cities in Slovenia, is taken as a case study for considering the dome concept, given the lack of accessible literature on the topic. The review’s primary contribution is to allow architects to explore a broad spectrum of innovation by comparing what is achievable today against the possibilities generated by domed cities. The concept of living under a dome remains to be developed in theory and practice; the current challenging climatic situation will accelerate the evolution of such concepts, resulting in the formation of new typologies that humanity will require.
Bolted connections are widely employed in structures like transmission poles, wind turbines, and television (TV) towers. The behaviour of bolted connections is often complex and plays a significant role in the overall dynamic characteristics of the structure. The goal of this work is to conduct a fatigue lifecycle assessment of such a bolted connection block of a 193 m tall TV tower, for which 205 days of real measurement data have been obtained from the installed monitoring devices. Based on the recorded data, the best-fit stochastic wind distribution for 50 years, the decisive wind action, and the locations to carry out the fatigue analysis have been decided. A 3D beam model of the entire tower is developed to extract the nodal forces corresponding to the connection block location under various mean wind speeds, which is later coupled with a detailed complex finite element model of the connection block, with over three million degrees of freedom, for acquiring stress histories on some pre-selected bolts. The random stress histories are analysed using the rainflow counting algorithm (RCA) and the damage is estimated using Palmgren-Miner's damage accumulation law. A modification is proposed to integrate the loading sequence effect, which is otherwise ignored, into the RCA, and the differences between the two RCAs are investigated in terms of the accumulated damage.
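The two processing steps named above, rainflow counting and Palmgren-Miner summation, can be sketched as follows. This is a simplified three-point rainflow in the spirit of ASTM E1049, not the sequence-aware modification proposed in the paper, and the S-N curve constants are placeholders:

```python
def rainflow_cycles(series):
    """Simplified three-point rainflow counting (ASTM E1049 style).
    Returns (stress_range, count) pairs with count 0.5 or 1.0."""
    # reduce the signal to its turning points
    pts = []
    for x in series:
        if pts and x == pts[-1]:
            continue
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x                           # same direction: extend
        else:
            pts.append(x)
    cycles, stack = [], []
    for p in pts:
        stack.append(p)
        while len(stack) >= 3:
            x_rng = abs(stack[-1] - stack[-2])    # most recent range
            y_rng = abs(stack[-2] - stack[-3])    # previous range
            if x_rng < y_rng:
                break
            if len(stack) == 3:                   # range contains the start
                cycles.append((y_rng, 0.5))
                stack.pop(0)
            else:
                cycles.append((y_rng, 1.0))
                del stack[-3:-1]                  # drop the counted pair
    for a, b in zip(stack, stack[1:]):            # leftover half cycles
        cycles.append((abs(b - a), 0.5))
    return cycles

def miner_damage(cycles, C, m):
    """Palmgren-Miner sum with an assumed S-N curve N(S) = C * S**(-m)."""
    return sum(n * s ** m / C for s, n in cycles if s > 0)

# Classic ASTM E1049 example load history
history = [-2, 1, -3, 5, -1, 3, -4, 4, -2]
cycles = rainflow_cycles(history)
damage = miner_damage(cycles, C=1e12, m=3)
```

In a full assessment, the stress histories from the finite element model would replace `history`, and the damage sums per mean wind speed would be weighted by the fitted wind distribution.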
The floods of 2002 and 2013, as well as the recent flood of 2021, caused billions of euros’ worth of property damage in Germany. The project Innovative Vulnerability and Risk Assessment of Urban Areas against Flood Events (INNOVARU) aimed to develop a practicable flood damage model that enables realistic damage statements for the residential building stock. In addition to determining local flood risks, it takes into account the vulnerability of individual buildings and allows the prognosis of structural damage. In this paper, we discuss an improved method for the prognosis of structural damage due to flood impact. Detailed correlations between inundation level and flow velocity, depending on building-type vulnerability and the number of storeys, are considered. Because reliable damage data from events with high flow velocities were not available, an innovative approach was adopted to cover a wide range of flow velocities: the comprehensive damage data collected after the 2002 flood in Germany were combined with damage data from the tsunami following the 2011 Tohoku earthquake in Japan. The application of the developed methods enables a reliable reinterpretation of the structural damage caused by the August 2002 flood in six study areas in the Free State of Saxony.
A safe and economic structural design based on the semi-probabilistic concept requires statistically representative safety elements, such as characteristic values, design values, and partial safety factors. Regarding climate loads, the safety levels of current design codes strongly reflect experiences based on former measurements and investigations assuming stationary conditions, i.e. constant frequencies and intensities. However, due to climate change, the occurrence of corresponding extreme weather events is expected to alter in the future, influencing the reliability and safety of structures and their components. Based on established approaches, a systematically refined, data-driven methodology is therefore proposed for the determination of design parameters considering nonstationarity as well as standardized targets of structural reliability and safety. The presented procedure picks up the fundamentals of European standardization and extends them with respect to nonstationarity by applying a shifting time window method. The application of the method is demonstrated for projected snow loads, and various influencing parameters are discussed.
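The shifting time window method can be sketched as follows: an extreme value distribution (here a Gumbel distribution, fitted by the method of moments) is applied to the annual maxima inside each window, and the window is shifted year by year so that a drift of the characteristic value becomes visible. The synthetic snow-load series, its trend, and the window width are illustrative assumptions, not data from the study:

```python
import math
import random

def gumbel_characteristic(maxima, p=0.98):
    """Method-of-moments Gumbel fit; returns the p-fractile
    (p = 0.98 corresponds to the 50-year return value of annual maxima)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi          # scale parameter
    mu = mean - 0.5772 * beta                       # location parameter
    return mu - beta * math.log(-math.log(p))

def shifting_window(series, width):
    """Characteristic value over a window shifted year by year,
    exposing a possible nonstationary trend."""
    return [gumbel_characteristic(series[i:i + width])
            for i in range(len(series) - width + 1)]

# Synthetic annual snow-load maxima with an assumed downward trend
random.seed(1)
loads = [1.5 - 0.01 * t + random.gauss(0.0, 0.2) for t in range(60)]
ck = shifting_window(loads, 30)
```

A design-code calibration would then derive time-dependent partial factors from the windowed characteristic values instead of a single stationary one.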
Design-related reassessment of structures integrating Bayesian updating of model safety factors
(2022)
In the semi-probabilistic approach to structural design, the partial safety factors are defined by assigning some degree of uncertainty to actions and resistance, associated with the parameters’ stochastic nature. However, the uncertainties for an individual structure can be better examined by incorporating measurement data provided by sensors of an installed health monitoring system. In this context, the current study proposes an approach to revise the action-side partial safety factor for existing structures, γE, by integrating Bayesian model updating. A simple numerical example of a beam-like structure with artificially generated measurement data is used so that the influence of different sensor setups and data uncertainties on the revised safety factors can be investigated. It is revealed that the health monitoring system can reassess the current capacity reserve of the structure by updating the design safety factors, resulting in a better life cycle assessment of structures. The outcome is furthermore verified by analysing a real-life small railway steel bridge, ensuring the applicability of the proposed method to practical applications.
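A minimal sketch of the idea, under simplifying assumptions that are not from the paper: the mean load effect is updated with monitoring data via a conjugate normal-normal model, and the action-side partial factor is then re-derived as the ratio of design to characteristic value, using the FORM sensitivity factor αE = −0.7 and target reliability index β = 3.8 familiar from the Eurocode basis of design:

```python
def posterior_mean_var(mu0, var0, data, var_obs):
    """Conjugate normal-normal update of the mean load effect
    (observation variance assumed known)."""
    n = len(data)
    xbar = sum(data) / n
    var_post = 1.0 / (1.0 / var0 + n / var_obs)
    mu_post = var_post * (mu0 / var0 + n * xbar / var_obs)
    return mu_post, var_post

def gamma_e(mu, sigma, beta=3.8, alpha_e=-0.7, k=1.645):
    """Action-side partial factor: design value (FORM, normal action)
    divided by the 95 % characteristic value."""
    e_d = mu - alpha_e * beta * sigma   # design value of the load effect
    e_k = mu + k * sigma                # characteristic value
    return e_d / e_k

# Assumed prior and monitoring data (illustrative numbers only)
mu_post, var_post = posterior_mean_var(100.0, 400.0,
                                       [90.0, 95.0, 92.0], 25.0)
gamma_updated = gamma_e(mu_post, var_post ** 0.5)
```

When the monitoring data confirm a lower and less uncertain load effect, the updated factor drops below its prior value, which is one way the capacity reserve described above becomes quantifiable.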
Determining the earthquake hazard of any settlement is one of the primary studies for reducing earthquake damage. Therefore, earthquake hazard maps used for this purpose must be renewed over time. The Turkey Earthquake Hazard Map has been used instead of the Turkey Earthquake Zones Map since 2019. A probabilistic seismic hazard analysis was performed using these last two maps and different attenuation relationships for Bitlis Province (Eastern Turkey), located in the Lake Van Basin, which has a high seismic risk. The earthquake parameters were determined by considering all districts and neighborhoods in the province. Probabilistic seismic hazard analyses were carried out for these settlements using seismic sources and four different attenuation relationships. The obtained values are compared with the design spectra stated in the last two earthquake maps. Significant differences exist between the design spectra obtained for different exceedance probabilities. In this study, adaptive pushover analyses of sample reinforced-concrete buildings were performed using the design ground motion level. Structural analyses were carried out using three different design spectra, as given in the last two seismic design codes, and the mean spectrum obtained from the attenuation relationships. Different design spectra significantly change the target displacements predicted for the performance levels of the buildings.
The seismic vulnerability assessment of existing reinforced concrete (RC) buildings is a significant input to disaster mitigation plans and rescue services. Different countries have developed various Rapid Visual Screening (RVS) techniques and methodologies to deal with the devastating consequences of earthquakes for the structural characteristics of buildings and for human casualties. Artificial intelligence (AI) methods, such as machine learning (ML) algorithms, are increasingly used in various scientific and technical applications. Investigations into using these techniques in civil engineering applications have shown encouraging results and reduced human intervention, including uncertainties and biased judgment. In this study, several known non-parametric algorithms are investigated for RVS using a dataset covering different earthquakes. Moreover, the methodology encourages examining the buildings’ vulnerability based on factors related to the buildings’ importance and exposure. In addition, a web-based application built on Django is introduced; the interface is designed to ease seismic vulnerability investigation in real time. The concept was validated using two case studies, and the achieved results showed the proposed approach’s potential efficiency.
In the wake of the news industry’s digitization, novel organizations that differ considerably from traditional media firms in terms of their functional roles and organizational practices of media work are emerging. One new type is the field repair organization, which is characterized by supporting high‐quality media work to compensate for the deficits (such as those which come from cost savings and layoffs) which have become apparent in legacy media today. From a practice‐theoretical research perspective and based on semi‐structured interviews, virtual field observations, and document analysis, we have conducted a single case study on Science Media Center Germany (SMC), a unique non‐profit news start‐up launched in 2016 in Cologne, Germany. Our findings show that, in addition to field repair activities, SMC aims to facilitate progress and innovation in the field, which we refer to as field advancement. This helps to uncover emerging needs and anticipates problems before they intensify or even occur, proactively providing products and tools for future journalism. This article contributes to our understanding of novel media organizations with distinct functions in the news industry, allowing for advancements in theory on media work and the organization of journalism in times of digital upheaval.
Paper-based data acquisition and manual transfer between incompatible software or data formats during inspections of bridges, as done currently, are time-consuming, error-prone, cumbersome, and lead to information loss. A fully digitized workflow using open data formats would reduce data loss, effort, and the costs of future inspections. On the one hand, existing studies have proposed methods to automate data acquisition and visualization for inspections; these studies lack an open standard to make the gathered data available for other processes. On the other hand, several studies discuss data structures for exchanging damage information among different stakeholders; however, those studies do not cover the process of automatic data acquisition and transfer. This study focuses on a framework that incorporates automatic damage data acquisition, transfer, and a damage information model for data exchange. This enables inspectors to use damage data for subsequent analyses and simulations. The proposed framework shows the potential for a comprehensive damage information model and related (semi-)automatic data acquisition and processing.
This dataset presents the numerical analysis of the heat and moisture transport through a facade equipped with a living wall system designated for greywater treatment. While such greening systems provide many environmental benefits, they involve pumping large quantities of water onto the wall assembly, which can increase the risk of moisture accumulation in the wall and impair its energetic performance, since the thermal conductivity of building materials increases with moisture content. This dataset was acquired through numerical simulation using the coupling of two simulation tools, namely Envi-Met and Delphin. This coupling was used to include the complex role the plants play in shaping the near-wall environmental parameters in the hygrothermal simulations. Four different wall assemblies were investigated, and each assembly was assessed twice: with and without the living wall. The presented data include the input and output parameters of the simulations, which are presented in the co-submitted article [1].
The call to place the concepts of city and critique at the centre of a debate offers a great opportunity to reach an understanding, far beyond conceptual clarifications of our common subject of work (which can be very fruitful in themselves), about the function we perform in society when we practise, research, and teach spatial planning. Since in the Federal Republic there is not only a great need but also considerable demand for public planning, and the planning-related sciences enjoy an overall stable institutional standing, we run the risk of neglecting the socio-political legitimation of the professional field and the discipline, of treating it as given. After all, we hardly need to justify ourselves.
Multi-criteria decision analysis (MCDA) is an established methodology to support the decision-making of multi-objective problems. For conducting an MCDA, in most cases, a set of objectives (SOO) is required, which consists of a hierarchical structure comprised of objectives, criteria, and indicators. The development of an SOO is usually based on moderated development processes requiring high organizational and cognitive effort from all stakeholders involved. This article proposes elementary interactions as the key paradigm of an algorithm-driven development process for an SOO that requires little moderation effort. Elementary interactions are self-contained information requests that may be answered with little cognitive effort. The pairwise comparison of elements in the well-known analytic hierarchy process (AHP) is an example of an elementary interaction. Each elementary interaction in the presented development process contributes to the stepwise development of an SOO. Based on the hypothesis that an SOO may be developed exclusively using elementary interactions (EIs), a concept for a multi-user platform is proposed. Essential components of the platform are a Model Aggregator, an Elementary Interaction Stream Generator, a Participant Manager, and a Discussion Forum. While the latter component serves the professional exchange of the participants, the first three components are intended to be automatable by algorithms. The proposed platform concept has been partly evaluated in an explorative validation study demonstrating the general functionality of the algorithms outlined. In summary, the suggested platform concept demonstrates the potential to ease SOO development processes: it does not restrict the application domain, it is intended to work with little administrative and moderation effort, and it supports the further development of an existing SOO in the event of changes in external conditions.
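The pairwise comparison cited above as an example of an elementary interaction can be illustrated with a small sketch. It uses the geometric-mean approximation of the AHP principal eigenvector; the comparison values are invented for illustration and are not from the article:

```python
import math

def ahp_priorities(matrix):
    """Priority weights from a pairwise comparison matrix
    (geometric-mean approximation of the principal eigenvector)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Three hypothetical criteria compared pairwise: entry [i][j] says how
# strongly criterion i is preferred over criterion j (Saaty 1-9 scale).
A = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     3.0],
     [1.0 / 5, 1.0 / 3, 1.0]]
w = ahp_priorities(A)   # normalized weights, descending for this matrix
```

Each answered comparison is exactly the kind of self-contained, low-effort request the platform's Elementary Interaction Stream Generator would issue; the Model Aggregator would then fold the resulting weights into the growing SOO.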
The algorithm-driven development of SOOs proposed in this article may ease the development of MCDA applications and, thus, may have a positive effect on the spread of MCDA applications.
Operator Calculus Approach to Comparison of Elasticity Models for Modelling of Masonry Structures
(2022)
The solution of any engineering problem starts with a modelling process aimed at formulating a mathematical model that describes the problem under consideration with sufficient precision. Because of the heterogeneity of modern engineering applications, mathematical modelling nowadays ranges from incredibly precise micro- and even nano-modelling of materials to macro-modelling, which is more appropriate for practical engineering computations. In the field of masonry structures, a macro-model of the material can be constructed based on various elasticity theories, such as classical elasticity, micropolar elasticity, and Cosserat elasticity. Evidently, a different macro-behaviour is expected depending on the specific theory used in the background. Although there have been several theoretical studies of different elasticity theories in recent years, there is still a lack of understanding of how the modelling assumptions of different elasticity theories influence the modelling results for masonry structures. Therefore, a rigorous approach to the comparison of different three-dimensional elasticity models, based on quaternionic operator calculus, is proposed in this paper. Three elasticity models are described and spatial boundary value problems for these models are discussed. In particular, explicit representation formulae for their solutions are constructed. By using these representation formulae, explicit estimates for the solutions obtained by the different elasticity theories are derived. Finally, several numerical examples are presented, which indicate a practical difference between the solutions.
It is widely accepted that most people spend the majority of their lives indoors. Most individuals do not realize that while indoors, roughly half of the heat exchange affecting their thermal comfort is in the form of thermal infrared radiation. We show that while researchers have been aware of its thermal comfort significance over the past century, systemic error has crept into the most common evaluation techniques, preventing adequate characterization of the radiant environment. Measuring and characterizing radiant heat transfer is a critical component of both building energy efficiency and occupant thermal comfort and productivity. Globe thermometers are typically used to measure mean radiant temperature (MRT), a commonly used metric for accounting for the radiant effects of an environment at a point in space. In this paper we extend previous field work to a controlled laboratory setting to (1) rigorously demonstrate that existing correction factors used in the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Standard 55 or ISO 7726 for using globe thermometers to quantify MRT are not sufficient; (2) develop a correction to improve the use of globe thermometers to address problems in the current standards; and (3) show that mean radiant temperature measured with ping-pong-ball-sized globe thermometers is not reliable due to a stochastic convective bias. We also provide an analysis of the maximum precision of globe sensors themselves, an analysis missing from the contemporary literature.
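For context, the existing correction the paper argues is insufficient is, in essence, the ISO 7726 forced-convection formula relating globe temperature, air temperature, and air speed to MRT. This sketch restates that standard baseline only; it is not the improved correction developed in the paper:

```python
def mrt_globe(tg, ta, va, d=0.15, eps=0.95):
    """Mean radiant temperature from a globe thermometer using the
    ISO 7726 forced-convection correction.
    tg: globe temperature (deg C), ta: air temperature (deg C),
    va: air speed (m/s), d: globe diameter (m), eps: globe emissivity."""
    h = 1.1e8 * va ** 0.6 / (eps * d ** 0.4)   # convective correction term
    return ((tg + 273.0) ** 4 + h * (tg - ta)) ** 0.25 - 273.0

# Example with a standard 150 mm globe: the correction pushes MRT
# above the globe reading when the globe is warmer than the air.
mrt = mrt_globe(tg=25.0, ta=23.0, va=0.3)
```

Note how strongly the term scales with the diameter `d`: shrinking the globe to ping-pong-ball size inflates the convective correction, which is consistent with the stochastic convective bias reported above.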
Data acquisition systems and methods to capture high-resolution images or reconstruct 3D point clouds of existing structures are an effective way to document their as-is condition. These methods enable a detailed analysis of building surfaces, providing precise 3D representations. However, for condition assessment and documentation, damages are mainly annotated in 2D representations, such as images, orthophotos, or technical drawings, which do not allow for the application of a 3D workflow or automated comparisons of multitemporal datasets. The available software for building heritage data management and analysis offers a wide range of annotation and evaluation functions but lacks integrated post-processing methods and systematic workflows. The article presents novel methods developed to facilitate such automated 3D workflows and validates them on a small historic church building in Thuringia, Germany. Post-processing steps using photogrammetric 3D reconstruction data along with imagery were implemented, which show the possibilities of integrating 2D annotations into 3D documentation. Further, the application of voxel-based methods on the dataset enables the evaluation of geometrical changes of multitemporal annotations in different states and the assignment to elements of scans or building models. The proposed workflow also highlights the potential of these methods for condition assessment and planning of restoration work, as well as the possibility to represent the analysis results in standardised building model formats.
This dataset consists mainly of two subsets. The first subset includes measurements and simulation data conducted to validate the simulation tool ENVI-met. The measurements were conducted at the campus of the Bauhaus-University Weimar in Weimar, Germany and consisted of recording exterior air temperature, globe temperature, relative humidity, and wind velocity at 1.5 m at four points on four different days. After the measurements, the geometry of the campus was modelled and meshed; the simulations were conducted using the weather data of the measurement days with the aim of investigating the accuracy of the model.
The second data subset consists of ENVI-met simulation data on the potential of facade greening to improve the outdoor environment and the indoor air temperature during heatwaves in Central European cities. The data consist of the boundary conditions and the simulation output of two simulation models: with and without facade greening. The geometry of the models corresponded to a residential building district in Stuttgart, Germany. The simulation output consisted of exterior air temperature, mean radiant temperature, relative humidity, and wind velocity at 12 different probe points in the model, in addition to the indoor air temperature of an exemplary building. The dataset presents both vertical profiles of the probed parameters as well as the time series output of the five-day simulation duration. Both data subsets correspond to the investigations presented in the co-submitted article [1].
The fracture of microcapsules is an important issue for releasing the healing agent that heals cracks in encapsulation-based self-healing concrete. The capsular clustering generated during the concrete mixing process is considered one of the critical factors in the fracture mechanism. Since there is a lack of studies in the literature regarding this issue, the design of self-healing concrete cannot be made without an appropriate modelling strategy. In this paper, the effects of microcapsule size and clustering on the fractured microcapsules are studied computationally. A simple 2D computational modelling approach is developed based on the eXtended Finite Element Method (XFEM) and a cohesive surface technique. The proposed model shows that microcapsule size and clustering play significant roles in governing the load-carrying capacity and the crack propagation pattern, and determine whether the microcapsule will be fractured or debonded from the concrete matrix. The higher the microcapsule circumferential contact length, the higher the load-carrying capacity; when it is lower than 25% of the microcapsule circumference, there is a greater possibility of the microcapsule debonding from the concrete. The greater the core/shell ratio (i.e., the smaller the shell thickness), the greater the likelihood of the microcapsule fracturing.
In the issue marking the tenth anniversary of sub\urban, with the thematic focus „sub\x: Verortungen, Entortungen", we publish a debate that departs from the textual discussions previously conducted in this section of our journal. In the run-up to planning our anniversary issue, we asked the current members of our scientific advisory board to discuss two fundamental questions of critical urban research in short contributions: What is the city? What is critique?
Plastic structural analysis may be applied without difficulty and with little effort for structural member verifications regarding lateral torsional buckling of doubly symmetric rolled I-sections. Such analyses can be performed based on the plastic zone theory, specifically using finite beam elements with seven degrees of freedom and second-order theory considering material nonlinearity. The existing Eurocode enables these approaches, and the upcoming generation will provide corresponding regulations in EN 1993-1-14. The investigations allow the determination of computationally accurate limit loads, which are determined in the present paper for selected structural systems with different sets of parameters, such as length, steel grade, and cross-section type. For verification and validation, the results are compared to approximations obtained by more sophisticated FEM analyses (commercial software Ansys Workbench applying solid elements), and differences in the results of the numerical models are addressed and discussed. In addition, the results are compared to resistances obtained by common design regulations based on reduction factors χLT, including the regulations of EN 1993-1-1 (with German National Annex) as well as prEN 1993-1-1: 2020-08 (the proposed new Eurocode generation). Finally, correlations of the results and their advantages as well as disadvantages are discussed.
Vertical green system for gray water treatment: Analysis of the VertiKKA-module in a field test
(2022)
This work presents a modular Vertical Green System (VGS) for gray water treatment, developed at the Bauhaus-Universität Weimar. The concept was transformed into a field study in which four modules were built and tested with synthetic gray water. Each module set contains a small and a larger module with the same treatment substrate and was fed hourly. A combination of lightweight structural material and biochar from agricultural residues and wood chips was used as the treatment substrate. In this article, we present the first 18 weeks of operation. Regarding treatment efficiency, the parameters chemical oxygen demand (COD), total phosphorus (TP), ortho-phosphate (ortho-P), total bound nitrogen (TNb), ammonium nitrogen (NH4-N), and nitrate nitrogen (NO3-N) were analyzed and are presented in this work. The results of the modules with agricultural residues are promising: the data show up to 92% COD reduction, and the phosphate and nitrogen fractions are reduced significantly in these modules. By contrast, the modules with wood chips reduce only 67% of the incoming COD and correspondingly less of the phosphate and nitrogen fractions.
Nonlocal theories concern the interaction of objects that are separated in space; classical examples are Coulomb's law and Newton's law of universal gravitation. They have had significant impact in physics and engineering. One classical application in mechanics is the failure of quasi-brittle materials: while local models lead to an ill-posed boundary value problem and associated mesh-dependent results, nonlocal models guarantee well-posedness and are furthermore relatively easy to implement in commercial computational software.
Research into bio-based epoxy resins has intensified in recent decades. Here, it is of great importance to use raw materials whose use does not compete with food production. In addition, the performance of the newly developed materials should be comparable to that of conventional products. Possible starting materials are lignin degradation products such as vanillin and syringaldehyde, for which new synthesis routes to the desired products must be found and the resulting properties determined. In this article, the first synthesis of two amine hardeners, starting from vanillin and syringaldehyde and using the Smiles rearrangement reaction, is reported. The amine hardeners were mixed with bisphenol A diglycidyl ether, and the curing was compared to that with isophorone diamine, 4,4′-diaminodiphenyl sulfone, and 4-aminobenzylamine by means of differential scanning calorimetry. It was found that the two amines prepared are cold-curing. As TG-MS studies showed, the thermal stability of at least one of the polymers prepared with the potentially bio-based amines is comparable to that of the polymer prepared with isophorone diamine, and similar degradation products are formed during pyrolysis.