TY - JOUR A1 - König, Reinhard T1 - Die Stadt der Agenten und Automaten JF - FORUM - Architektur & Bauforum N2 - PLANNING SUPPORT THROUGH THE ANALYSIS OF SPATIAL PROCESSES USING COMPUTER SIMULATIONS. Meaningful urban planning is only possible once one understands, at least in principle, how a city with its complex, interwoven processes essentially works. This is because every act of planning is an intervention in the complex organism of a city. If this intervention takes place without knowledge of how that organism functions, its effects cannot be assessed either. This contribution shows how urban processes can be understood by means of computer simulations that make use of so-called multi-agent systems and cellular automata. KW - Computational Urban Design KW - CAD Y1 - 2007 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26083 ER - TY - THES A1 - Schwedler, Michael T1 - Integrated structural analysis using isogeometric finite element methods N2 - The gradual digitization of the architecture, engineering, and construction industry over the past fifty years has led to an extremely heterogeneous software environment, which today is embodied by the multitude of different digital tools and proprietary data formats used by the many specialists contributing to the design process in a construction project. While these projects become increasingly complex, the demands on financial efficiency and on completion within a tight schedule grow at the same time. The digital collaboration of project partners has been identified as one key issue in successfully dealing with these challenges. Yet currently, the numerous software applications and their respective individual views on the design process severely impede that collaboration. One approach to establishing a unified basis for digital collaboration, regardless of the existing software heterogeneity, is a comprehensive digital building model contributed to by all project partners. This type of data management, known as building information modeling (BIM), has many benefits, yet its adoption is associated with many difficulties and thus proceeds only slowly. One aspect in the field of conflicting requirements on such a digital model is the cooperation of architects and structural engineers. Traditionally, these two disciplines use different abstractions of reality for their models, which consequently leads to incompatible digital representations. The advent of isogeometric analysis (IGA) promised to ease the discrepancy between design and analysis model representations. Yet that initial focus quickly shifted towards using these methods as a more powerful basis for numerical simulations. Furthermore, the isogeometric representation alone is not capable of solving the model abstraction problem. It is thus the intention of this work to contribute to an improved digital collaboration of architects and engineers by exploring an integrated analysis approach on the basis of a unified digital model and solid geometry expressed by splines. In the course of this work, an analysis framework is developed that utilizes such models to automatically conduct the numerical simulations commonly required in construction projects. In essence, this makes it possible to retrieve structural analysis results from BIM models in a fast and simple manner, thereby facilitating rapid design iterations and profound design feedback.
The BIM implementation Industry Foundation Classes (IFC) is reviewed with regard to its capabilities of representing the unified model. The current IFC schema strongly favors the use of redundant model data, a major pitfall in digital collaboration. Additionally, it does not allow the geometry to be described by volumetric splines. As the pursued approach builds upon a single model for both architectural and structural design and furthermore requires solid geometry, the necessary schema modifications are suggested. Structural entities are modeled by volumetric NURBS patches, each of which constitutes an individual subdomain that, with regard to the analysis, is incompatible with the remaining full model. The resulting consequences for numerical simulation are elaborated in this work. The individual subdomains have to be weakly coupled, for which the mortar method is used. Different approaches to discretize the interface traction fields are implemented and their respective impact on the analysis results is evaluated. All necessary coupling conditions are automatically derived from the related geometry model. The weak coupling procedure leads to a linear system of equations in saddle point form, which, owing to the volumetric modeling, is large in size and whose coefficient matrix, due to the use of higher-degree basis functions, has a high bandwidth. The peculiarities of the system require adapted solution methods that generally cause higher numerical costs than the standard procedures for symmetric, positive-definite systems do. Different methods to solve the specific system are investigated and an efficient parallel algorithm is finally proposed. When the structural analysis model is derived from the unified model in the BIM data, it generally does not initially meet the discretization requirements necessary to obtain sufficiently accurate analysis results. The consequently necessary patch refinements must be controlled automatically to allow for an entirely automatic analysis procedure. For that purpose, an empirical refinement scheme based on the geometrical and possibly mechanical properties of the specific entities is proposed. The level of refinement may be selectively manipulated by the structural engineer in charge. Furthermore, a Zienkiewicz-Zhu type error estimator is adapted for use with isogeometric analysis results. It is shown that this estimator, too, can be used to steer an adaptive refinement procedure. T3 - ISM-Bericht // Institut für Strukturmechanik, Bauhaus-Universität Weimar - 2016,2 KW - Finite-Elemente-Methode KW - NURBS KW - Isogeometrische Analyse KW - finite element method KW - isogeometric analysis KW - mortar method KW - building information modelling Y1 - 2017 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20170130-27372 ER - TY - THES A1 - Amiri, Fatemeh T1 - Computational modelling of fracture with local maximum entropy approximations N2 - The key objective of this research is to study fracture with a meshfree method, local maximum entropy approximations, and to model fracture in thin shell structures with complex geometry and topology. This topic is of high relevance for real-world applications, for example in the automotive industry and in aerospace engineering. The shell structure can be described efficiently by meshless methods, which are capable of describing complex shapes as a collection of points instead of a structured mesh.
In order to find the appropriate numerical method to achieve this goal, the first part of the work was the development of a method based on local maximum entropy (LME) shape functions together with enrichment functions used in partition of unity methods to discretize problems in linear elastic fracture mechanics. We obtain improved accuracy relative to the standard extended finite element method (XFEM) at a comparable computational cost. In addition, we keep the advantages of the LME shape functions, such as smoothness and non-negativity. We show numerically that optimal convergence (the same as in FEM) for the energy norm and the stress intensity factors can be obtained through the use of geometric (fixed area) enrichment with no special treatment of the nodes near the crack, such as blending or shifting. As the extension of this method to three-dimensional problems and complex thin shell structures with arbitrary crack growth is cumbersome, we developed a phase field model for fracture using LME. Phase field models provide a powerful tool to tackle moving interface problems and have been used extensively in physics and materials science. They are gaining popularity in a wide range of applications in applied science and engineering; recently, a second-order phase field approximation for brittle fracture has gathered significant interest in computational fracture, in which sharp crack discontinuities are modeled by a diffusive crack. By minimizing the system energy with respect to the mechanical displacements and the phase field, subject to an irreversibility condition to avoid crack healing, this model can describe crack nucleation, propagation, branching and merging. One of the main advantages of phase field modeling of fracture is the unified treatment of interfacial tracking and mechanics, which potentially leads to simple, robust, scalable computer codes applicable to complex systems. In other words, this approximation considerably reduces the implementation complexity, because numerical tracking of the fracture is not needed, albeit at the expense of a high computational cost. We present a fourth-order phase field model for fracture based on local maximum entropy (LME) approximations. The higher-order continuity of the meshfree LME approximation allows the fourth-order phase field equations to be solved directly, without splitting the fourth-order differential equation into two second-order differential equations. Notably, in contrast to previous discretizations that use at least a quadratic basis, only linear completeness is needed in the LME approximation. We show that the crack surface can be captured more accurately in the fourth-order model than in the second-order model. Furthermore, fewer nodes are needed for the fourth-order model to resolve the crack path. Finally, we demonstrate the performance of the proposed meshfree fourth-order phase-field formulation for five representative numerical examples. Computational results are compared to analytical solutions within linear elastic fracture mechanics and to experimental data for three-dimensional crack propagation. In the last part of this research, we present a phase-field model for fracture in Kirchhoff-Love thin shells using the local maximum-entropy (LME) meshfree method. Since the crack is a natural outcome of the analysis, it does not require an explicit representation and tracking, which is advantageous over techniques such as the extended finite element method, which requires tracking of the crack paths.
The geometric description of the shell is based on statistical learning techniques that allow dealing with general point-set surfaces without a global parametrization and that can be applied to surfaces of complex geometry and topology. We show the flexibility and robustness of the present methodology for two examples: a plate in tension and a set of open connected pipes. KW - Fracture mechanics KW - Local maximum entropy approximants KW - PU Enrichment method KW - Phase-field model KW - Thin shell KW - Kirchhoff-Love theory Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160719-26310 ER - TY - CHAP A1 - König, Reinhard A1 - Schmitt, Gerhard ED - Szoboszlai, Mihály T1 - Backcasting and a new way of command in computational design : Proceedings T2 - CAADence in Architecture Conference N2 - It is not uncommon that analysis and simulation methods are used mainly to evaluate finished designs and to prove their quality, whereas the potential of such methods lies in guiding and controlling a design process from the very beginning. Therefore, we introduce a design method that moves away from a “what-if” forecasting philosophy and increases the focus on backcasting approaches. We use the power of computation by combining sophisticated methods for generating designs with analysis methods to close the gap between analysis and synthesis of designs. For the development of future-oriented computational design support, we need to be aware of the human designer’s role. A productive combination of the excellence of human cognition with the power of modern computing technology is needed. We call this approach “cognitive design computing”. The computational part aims to mimic the way a designer’s brain works by combining state-of-the-art optimization and machine learning approaches with available simulation methods. The cognition part respects the complex nature of design problems by providing models for human-computation interaction. This means that a design problem is distributed between computer and designer. In the context of the conference slogan “back to command”, we ask how we may imagine the command over a cognitive design computing system. We expect that designers will need to cede control of some parts of the design process to machines, but in exchange they will gain a powerful new command over complex computing processes. This means that designers have to explore the potential of their role as commanders of partially automated design processes. In this contribution we describe an approach for the development of a future cognitive design computing system with a focus on urban design issues. The aim of this system is to enable an urban planner to treat a planning problem as a backcasting problem by defining what performance a design solution should achieve and then automatically querying or generating a set of best possible solutions. This kind of computational planning process offers proof that the resulting design meets the original, explicitly defined design requirements. A key way in which digital tools can support designers is by generating design proposals. Evolutionary multi-criteria optimization methods allow us to explore a multi-dimensional design space and provide a basis for the designer to evaluate conflicting requirements: a task urban planners are faced with frequently. We also reflect on why designers will give more and more control to machines.
Therefore, we investigate first approaches that employ machine learning methods to learn how designers use computational design support systems in combination with manual design strategies to deal with urban design problems. By observing how designers work, it is possible to derive more complex artificial solution strategies that can help computers make better suggestions in the future. KW - Cognitive design computing KW - machine learning KW - backcasting KW - design synthesis KW - evolutionary optimization KW - CAD Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-25996 SP - 15 EP - 25 CY - Budapest ER - TY - JOUR A1 - König, Reinhard A1 - Bauriedel, Christian T1 - Generating settlement structures: a method for urban planning and analysis supported by cellular automata JF - Environment and Planning B: Planning and Design N2 - Previous models for the explanation of settlement processes pay little attention to the interactions between settlement spreading and road networks. On the basis of a dielectric breakdown model in combination with cellular automata, we present a method to precisely steer the generation of settlement structures with regard to their global and local density as well as the size and number of the forming clusters. The resulting structures depend on the logic of how the dependence between the settlements and the road network is implemented in the simulation model. After analysing the state of the art, we begin with a discussion of the mutual dependence of roads and land development. Next, we elaborate a model that permits the precise control of the permeability of the developing structure as well as the settlement density, using the fewest necessary control parameters. On the basis of different characteristic values, possible settlement structures are analysed and compared with each other. Finally, we reflect on the theoretical contribution of the model with regard to the context of urban dynamics. KW - Cellular automata KW - Computational urban design Y1 - 2009 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160624-26054 UR - http://www.envplan.com.proxy.lib.sfu.ca/abstract.cgi?id=b34025 SP - 602 EP - 624 ER - TY - CHAP A1 - König, Reinhard A1 - Varoudis, Tasos T1 - Spatial Optimizations: Merging depthmapX, spatial graph networks and evolutionary design in Grasshopper T2 - Proceedings of eCAADe 34: Complexity & Simplicity N2 - In the Space Syntax community, the standard tool for computing all kinds of spatial graph network measures is depthmapX (Turner, 2004; Varoudis, 2012). The process of evaluating many design variants of networks is relatively complicated, since they need to be drawn in a separate CAD system, then exported and imported into depthmapX via the dxf file format. This procedure prevents a continuous integration into a design process. Furthermore, the standalone character of depthmapX makes it impossible to use its network centrality calculations for optimization processes. To overcome these limitations, we present in this paper the first steps of experimenting with a Grasshopper component (reference omitted until final version) that can access the functions of depthmapX and integrate them into Grasshopper/Rhino3D.
Here, the component is implemented in such a way that it can be used directly by an evolutionary algorithm (EA) implemented in a Python scripting component in Grasshopper. KW - depthmapx KW - python KW - optimization KW - space syntax KW - grasshopper Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26040 SP - 1 EP - 6 CY - Oulu, Finland ER - TY - THES A1 - Ehrhardt, Dirk T1 - ZUM EINFLUSS DER NACHBEHANDLUNG AUF DIE GEFÜGEAUSBILDUNG UND DEN FROST-TAUMITTELWIDERSTAND DER BETONRANDZONE N2 - Die Festigkeitsentwicklung des Zementbetons basiert auf der chemischen Reaktion des Zementes mit dem Anmachwasser. Durch Nachbehandlungsmaßnahmen muss dafür gesorgt werden, dass dem Zement genügend Wasser für seine Reaktion zur Verfügung steht, da sonst ein Beton mit minderer Qualität entsteht. Die vorliegende Arbeit behandelt die grundsätzlichen Fragen der Betonnachbehandlung bei Anwendung von Straßenbetonen. Im Speziellen wird die Frage des erforderlichen Nachbehandlungsbedarfs von hüttensandhaltigen Kompositzementen betrachtet. Die Wirkung der Nachbehandlung wird anhand des erreichten Frost-Tausalz-Widerstandes und der Gefügeausbildung in der unmittelbaren Betonrandzone bewertet. Der Fokus der Untersuchungen lag auf abgezogenen Betonoberflächen. Es wurde ein Modell zur Austrocknung des jungen Betons erarbeitet. Es konnte gezeigt werden, dass in einer frühen Austrocknung (Kapillarphase) keine kritische Austrocknung der Betonrandzone einsetzt, sondern der Beton annähernd gleichmäßig über die Höhe austrocknet. Es wurde ein Nomogramm entwickelt, mit dem die Dauer der Kapillarphase in Abhängigkeit der Witterung für Straßenbetone abgeschätzt werden kann. Eine kritische Austrocknung der wichtigen Randzone setzt nach Ende der Kapillarphase ein. Für Betone unter Verwendung von Zementen mit langsamer Festigkeitsentwicklung ist die Austrocknung der Randzone nach Ende der Kapillarphase besonders ausgeprägt. Im Ergebnis zeigen diese Betone dann einen geringen Frost-Tausalz-Widerstand. Mit Zementen, die eine 2d-Zementdruckfestigkeit ≥ 23,0 N/mm² aufweisen, wurde unabhängig von der Zementart (CEM I oder CEM II/B-S) auch dann ein hoher Frost-Tausalz-Widerstand erreicht, wenn keine oder eine schlechtere Nachbehandlung angewendet wurde. Für die Praxis ergibt sich damit eine einfache Möglichkeit der Vorauswahl von geeigneten Zementen für den Verkehrsflächenbau. Betone, die unter Verwendung von Zementen mit langsamerer Festigkeitsentwicklung hergestellt werden, erreichen einen hohen Frost-Tausalz-Widerstand nur mit einer geeigneten Nachbehandlung. Die Anwendung von flüssigen Nachbehandlungsmitteln (NBM gemäß TL NBM-StB) erreicht eine ähnliche Wirksamkeit wie eine 5-tägige Feuchtnachbehandlung. Voraussetzung für die Wirksamkeit der NBM ist, dass sie auf eine Betonoberfläche ohne sichtbaren Feuchtigkeitsfilm (feuchter Glanz) aufgesprüht werden. Besonders wichtig ist die Beachtung des richtigen Auftragszeitpunktes bei kühler Witterung, da hier aufgrund der verlangsamten Zementreaktion der Beton länger Anmachwasser abstößt. Ein zu früher Auftrag des Nachbehandlungsmittels führt zu einer Verschlechterung der Qualität der Betonrandzone. Durch Bereitstellung hydratationsabhängiger Transportkenngrößen (Feuchtetransport im Beton) konnten numerische Berechnungen zum Zusammenspiel zwischen der Austrocknung, der Nachbehandlung und der Gefügeentwicklung durchgeführt werden. Mit dem erstellten Berechnungsmodell wurden Parameterstudien durchgeführt.
Die Berechnungen bestätigen die wesentlichen Erkenntnisse der Laboruntersuchungen. Darüber hinaus lässt sich mit dem Berechnungsmodell zeigen, dass gerade bei langsam reagierenden Zementen und kühler Witterung ohne eine Nachbehandlung eine sehr dünne Randzone (ca. 500 µm – 1000 µm) mit stark erhöhter Kapillarporosität entsteht. N2 - The hardening of cement concrete is based on the chemical reaction of cement and water. Ensuring a sufficient amount of water in the concrete is therefore essential; all measures taken to this end are referred to as the curing of concrete. This dissertation provides a basic consideration of the curing of concrete for concrete pavements. In this regard, the use of cements with slow strength development, e.g. cements containing blast furnace slag, is the main topic. The effectiveness of curing was evaluated on the basis of the freeze-thaw de-icing resistance and the microstructure of the hardened outer concrete surface. Concrete surfaces with textured mortar are the focus of the investigations. The results were used to develop a model of the drying of young concrete. It could be shown that the outer concrete surface does not dry out during the first drying phase (called the capillary phase); instead, the concrete dries evenly over the height of the sample during this phase. A fast drying of the outer concrete surface only takes place after the capillary drying phase. Based on all results, a nomogram (for road concrete) was created to estimate the duration of the capillary drying phase. If there is no curing after the capillary drying phase, a concrete made with a slowly reacting cement runs a high risk of harmful drying of the outer concrete surface. In this case such a concrete shows a very poor freeze-thaw de-icing resistance. By using cements with a 2-day compressive strength ≥ 23.0 N/mm², a good freeze-thaw de-icing resistance could be assured even when no or only poor curing was applied. This criterion can be used to estimate the usability of cements with granulated blast furnace slag when a high freeze-thaw de-icing resistance is essential. A cement with a lower 2-day compressive strength can also be used; in this case, good curing has to be ensured. Spraying liquid curing compounds according to TL NBM-StB is suitable for curing concrete pavements. The curing compound should not be sprayed onto a concrete surface with a visible water film. Provided the curing compound is applied at the right time, its effectiveness is equal to five days of wet curing; applying it too early has a negative influence on the quality of the concrete surface. Choosing the right application time is particularly important at low temperatures, because the concrete bleeds much longer due to the slower cement reaction. By providing hydration-dependent transport parameters (water transport in concrete), it was possible to create a numerical model that accounts for the interaction between drying, curing and the development of the microstructure. The model was used for a parameter study. It could be shown that the combination of cements with slow strength development, low ambient temperatures and a lack of curing leads to a very thin surface zone (approx. 500 µm – 1000 µm) with strongly increased capillary porosity.
KW - Beton KW - Zement KW - Nachbehandlung KW - Modellbildung KW - Nachbehandlungsmittel KW - Straßenbeton KW - Frost-Tausalz-Widerstand KW - curing compounds KW - concrete pavements Y1 - 2017 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20171120-36889 ER - TY - THES A1 - Müller, Matthias T1 - Salt-frost Attack on Concrete - New Findings regarding the Damage Mechanism N2 - The reduction of the cement clinker content is an important prerequisite for improving the CO2 footprint of concrete. Nevertheless, the durability of such concretes must be sufficient to guarantee a satisfactory service life of structures. Salt frost scaling resistance is a critical factor in this regard, as it is often diminished at increased clinker substitution rates. Furthermore, only limited long-term experience with such concretes exists. A high salt frost scaling resistance thus cannot be achieved by applying only descriptive criteria, such as the concrete composition. It is therefore to be expected that, in the long term, a performance-based service life prediction will replace the descriptive concept. To achieve the important goal of clinker reduction for concretes in cold and temperate climates as well, it is important to understand the mechanisms underlying salt frost scaling. However, conflicting damage theories dominate the current state of the art. The goal of this thesis was consequently to evaluate existing damage theories and to examine them experimentally. It was found that only two theories have the potential to describe the salt frost attack satisfactorily – the glue spall theory and the cryogenic suction theory. The glue spall theory attributes the surface scaling to the interaction of an external ice layer with the concrete surface. Only when moderate amounts of deicing salt are present in the test solution can the resulting mechanical properties of the ice cause scaling. However, the results in this thesis indicate that severe scaling also occurs at deicing salt levels at which the ice is much too soft to damage concrete. Thus, the inability of the glue spall theory to account for all aspects of salt frost scaling was shown. The cryogenic suction theory is based on the eutectic behavior of salt solutions, which consist of two phases – water ice and liquid brine – between the freezing point and the eutectic temperature. The liquid brine acts as an additional moisture reservoir, which facilitates the growth of ice lenses in the surface layer of the concrete. The experiments in this thesis confirmed that ice formation in hardened cement paste increases due to the suction of brine at sub-zero temperatures. The extent of additional ice formation was influenced mainly by the porosity and by the chloride binding capacity of the hardened cement paste. Consequently, the cryogenic suction theory plausibly describes the actual generation of scaling, but it has to be expanded by some crucial aspects to represent the salt frost scaling attack completely. The most important aspect is the intensive saturation process, which is ascribed to the so-called micro ice lens pump. Therefore a combined damage theory was proposed, which considers multiple saturation processes. Important aspects of this combined theory were confirmed experimentally. As a result, the combined damage theory constitutes a good basis for understanding the salt frost scaling attack on concrete on a fundamental level.
Furthermore, a new approach was identified to account for the reduced salt frost scaling resistance of concretes with reduced clinker content. T2 - Frost-Tausalz-Angriff auf Beton - Neue Erkenntnisse zum Schadensmechanismus KW - Beton KW - Frost KW - Concrete KW - Salt frost attack KW - Damage mechanism KW - Glue Spall KW - Cryogenic Suction Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20230103-48681 UR - https://e-pub.uni-weimar.de/opus4/frontdoor/index/index/docId/4502 N1 - English version of my German-language dissertation entitled "Frost-Tausalz-Angriff auf Beton - Neue Erkenntnisse zum Schadensmechanismus". ER - TY - THES A1 - Krtschil, Anna T1 - Vergleich verschiedener Indikatoren in Bezug auf die Ökobilanz von Gebäuden N2 - In this bachelor's thesis, two indicators for evaluating a life cycle assessment are compared: the Swiss environmental impact points method (Umweltbelastungspunkte) is compared with the Dutch ReCiPe method. KW - Umweltbilanz KW - Ökobilanz KW - Umweltbelastungspunkte KW - ReCiPe KW - GaBi Software Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20150716-24340 ER - TY - THES A1 - Harirchian, Ehsan T1 - Improved Rapid Assessment of Earthquake Hazard Safety of Existing Buildings Using a Hierarchical Type-2 Fuzzy Logic Model N2 - Although it is impractical to avert natural disasters, advances in simulation science and seismological studies make it possible to lessen the catastrophic damage. In many urban areas there currently exists a large number of structures that are prone to damage by earthquakes. These were constructed without the guidance of a national seismic code, either before it existed or before it was enforced. For instance, in Istanbul, Turkey, a highly seismic area, around 90% of buildings are substandard, a situation that can be generalized to other earthquake-prone regions of Turkey. The reliability of this building stock with respect to earthquake-induced collapse is currently uncertain. Nonetheless, it is also not feasible to perform a detailed seismic vulnerability analysis on each building, as it would be too complicated and expensive. This indicates the necessity of a reliable, rapid, and computationally easy method for seismic vulnerability assessment, commonly known as Rapid Visual Screening (RVS). In the RVS methodology, an observational survey of buildings is performed and, according to the data collected during the visual inspection, a structural score is calculated without performing any structural calculations, in order to determine the expected damage of a building and whether the building needs a detailed assessment. Although this method saves time and resources, the evaluation process is dominated by vagueness and uncertainties owing to the subjective/qualitative judgments of the experts who perform the inspection. The vagueness can be handled adequately through fuzzy set theory, but ordinary (type-1) fuzzy sets do not cover all sorts of uncertainty because of their crisp membership functions. In this study, a novel method for the rapid visual hazard safety assessment of buildings against earthquakes is introduced, in which an interval type-2 fuzzy logic system (IT2FLS) is used to cover uncertainties. In addition, the proposed method provides the possibility of evaluating the earthquake risk of a building by considering factors related to building importance and exposure. A smartphone app prototype of the method has been introduced.
For validation of the proposed method, two case studies have been selected, and the results of the analysis demonstrate the robustness and efficiency of the proposed method. KW - Fuzzy-Logik KW - Erdbebensicherheit KW - Fuzzy logic KW - RC Buildings KW - Rapid Visual Assessment KW - Seismic Vulnerability KW - Uncertainty Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20210326-43963 ER -