Refine
Document Type
- Conference Proceeding (38)
- Article (22)
- Doctoral Thesis (16)
- Part of a Book (10)
- Bachelor Thesis (8)
- Master's Thesis (3)
- Book (2)
- Periodical (2)
- Study Thesis (2)
- Habilitation (1)
Institute
- Graduiertenkolleg 1462 (19)
- In Zusammenarbeit mit der Bauhaus-Universität Weimar (19)
- Institut für Strukturmechanik (ISM) (18)
- Universitätsbibliothek (11)
- Professur Bauphysik (8)
- Professur Informatik in der Architektur (6)
- F. A. Finger-Institut für Baustoffkunde (FIB) (4)
- An-Institute (2)
- Institut für Europäische Urbanistik (2)
- Professur Bauchemie und Polymere Werkstoffe (2)
Keywords
- Angewandte Mathematik (50)
- Angewandte Informatik (36)
- Computerunterstütztes Verfahren (36)
- Strukturmechanik (14)
- Elektronisches Buch (7)
- Bauphysik (3)
- Bibliothek (3)
- E-Book-Reader (3)
- Urheberrecht (3)
- Architektur (2)
- Beton (2)
- Buchmarkt (2)
- Kulturwissenschaft (2)
- Lernspiel (2)
- Lesen (2)
- Magnesiumsulfat (2)
- Medienwissenschaft (2)
- Quantitative Sozialforschung (2)
- Studium (2)
- Verlag (2)
- 0947 (1)
- 0948 (1)
- 0949 (1)
- 3D Surface Models (1)
- 50.32 (1)
- 56 (1)
- 56.62 (1)
- ATZ (1)
- Abaqus (1)
- Actor-Network-Theory (1)
- Aktionsraumforschung (1)
- Alterung (1)
- Anachronismus (1)
- Anorganische Salze (1)
- Apps <Programm> (1)
- Augmented Reality Displays (1)
- Automatisierung (1)
- BDA (1)
- Bausteinbibliothek (1)
- Baustoff Quartett (1)
- Bayes-Inferenz (1)
- Benutzer (1)
- Berührungslose Messung (1)
- Bewertung (1)
- Bewertungsmethode (1)
- Brückenkappe (1)
- Buchbranche (1)
- BuildVille (1)
- Computational Design (1)
- Computerunterstütztes Lernen (1)
- Coupled models (1)
- Crowdsourcing (1)
- Damm (1)
- Digital Game Based Learning (1)
- Digital Games Based Learning (1)
- Digital Rights Management (1)
- Digitale Welt (1)
- Digitaler Buchmarkt (1)
- Dispersionskeramik (1)
- Dresden (1)
- E-Book (1)
- EDA <Optimierung> (1)
- EDOC-Tage Weimar 2011 (1)
- Elektronische Buch (1)
- Elektronische Formate (1)
- Embankment, sensitivity analysis, parameter identification, Particle Swarm Optimization (1)
- Empfindlichkeit (1)
- Entscheidungsfindung (1)
- Entwurfssystem, computerbasiertes Planen, evolutionäre Optimierung, Grundrisse, generative Methoden (1)
- Enzyklopädie (1)
- Erinnerung (1)
- Erwerbungsetat (1)
- Evaluation methods (1)
- Evolutionärer Algorithmus (1)
- Eye Typing (1)
- Eye tracking movement (1)
- FEM (1)
- Facilides (1)
- Fehlerabschätzung (1)
- Feststoff (1)
- Feuer (1)
- Finite-Elemente-Methode (1)
- Fliplife (1)
- Fliplife Mitarbeiterrekrutierung (1)
- Flugasche (1)
- Forschung (1)
- Freemium (1)
- Frost-Tausalz-Widerstand (1)
- Fuzzylogik (1)
- GIXRD (1)
- Gaze Control (1)
- Gaze Interaction (1)
- Gekoppelte Modelle (1)
- Geschichtsschreibung (1)
- Gesellschaftsspiel (1)
- Gips (1)
- Grundriss (1)
- Grundstücksumlegung (1)
- HPC (1)
- Halle (1)
- Historischer Film (1)
- Housing (1)
- Hüftgelenkprothese (1)
- Iannis Xenakis (1)
- Information Retrieval (1)
- Informationsportal (1)
- Ingenieurbau (1)
- Isogeometric Analysis (1)
- Kallmeyer (1)
- Knoop Mikrohärte (1)
- Kriechen (1)
- Lagos (1)
- Landesplanung (1)
- Landesplanungsgeschichte (1)
- Laser (1)
- Lehre (1)
- Lernen nach Gagné (1)
- Lernspiele (1)
- Medien (1)
- Mesh quality (1)
- Metastaseis (1)
- Milieuforschung (1)
- Minecraft (1)
- Missionsmanager (1)
- Modalanalyse (1)
- Model quality, Model error estimation, Kinematical model, Geometric non-linearity, Finite Element method (1)
- Modellqualität, Modellfehlerschätzer, Geometrisch nicht-lineare Berechnung, Kinematik Modell, Finite Elemente Methode (1)
- Moderne (1)
- Multiplayer (1)
- Multiple Volume Rendering (1)
- Musik (1)
- Musik und Architektur (1)
- NRA (1)
- NURBS (1)
- Nachbehandlung (1)
- Nicht-lineare Planung (1)
- Nutzung (1)
- Overlay (1)
- Philips Pavillon (1)
- Polieren (1)
- Positionierung (1)
- Preispolitik (1)
- Probleme (1)
- Procedural modeling (1)
- Prognosequalität (1)
- Quartett (1)
- Quarzglas (1)
- Referenz (1)
- Referenzsensoren (1)
- Regionalplanung (1)
- Residentielle Mobilität (1)
- Salzmischungen (1)
- Schwerelose Ökonomie (1)
- Sensitivitätsanalyse (1)
- Simulation (1)
- Singleplayer (1)
- Skill (1)
- Skillmanager (1)
- Skillsystem (1)
- Slums (1)
- Social Game (1)
- Social Games (1)
- Sorptionswärme (1)
- Spieler (1)
- Spielermodell (1)
- Stadtgestalt (1)
- Stadtlandschaft (1)
- Stochastik (1)
- Strategische Planung (1)
- Straßenbeton (1)
- Städtischer Raum (1)
- Sulfatangriff (1)
- Systementwurf (1)
- Templates (1)
- Thaumasit (1)
- Thermische Energiespeicherung (1)
- Thermochemische Wärmespeicherung (1)
- Tribologie (1)
- UML (1)
- Unsicherheitsanalyse (1)
- Unterteilungsalgorithmen (1)
- Urban governance (1)
- Vergessen (1)
- Vertriebsweg (1)
- Vibrationssensor (1)
- Visual Perception (1)
- Waschbeton (1)
- Weimar / Bauhaus-Universität (1)
- Wohnstandortentscheidungen (1)
- Wohnstandortpräferenzen (1)
- architektonischer Entwurf (1)
- bogenförmig (1)
- bridge-vehicle interaction (1)
- buckling (1)
- cement (1)
- cluster analysis (1)
- complex data analysis (1)
- composite (1)
- composite cement (1)
- corpus construction (1)
- cylindrical shell structures (1)
- damage (1)
- documentation (1)
- domain decomposition (1)
- durability (1)
- evaluation (1)
- experimentelle Modalanalyse (1)
- finite element method (1)
- fire (1)
- gypsum (1)
- heterogeneous material (1)
- high-performance computing (1)
- laser; contactless measurement; quartz glass; polishing; temperature; residual stress; simulation; sensitivity (1)
- metakaolin (1)
- multiphase (1)
- output-only (1)
- philosophy (1)
- plagiarism detection (1)
- plasterboard (1)
- poröse Trägermaterialien (1)
- random vibrations (1)
- scalable smeared crack analysis (1)
- sensitivity and uncertainty analysis (1)
- singular value decomposition (1)
- solver (1)
- stochastic subspace (1)
- städtische Strukturen (1)
- text reuse (1)
- urban planning (1)
- urban research-quantitative (1)
- writing assistance (1)
- überschnittene Bohrpfahlwand (1)
Year of publication
- 2012 (105)
Non-destructive techniques for damage detection have become a focus of engineering interest in recent years. However, applying these techniques to large, complex structures such as civil engineering buildings still has limitations, since such structures are unique and the methodologies often require a large number of specimens for reliable results. Cost and time can therefore strongly influence the final outcome.
Model Assisted Probability Of Detection (MAPOD) has taken its place among the damage identification techniques, especially with advances in computing capacity and modeling tools. Nevertheless, the essential condition for a successful MAPOD is having a reliable model in advance, which opens the door to questions of model assessment and model quality. In this work, an approach is proposed that uses Partial Models (PM) to compute the Probability Of damage Detection (POD). A simply supported beam, which can be structurally modified and tested under laboratory conditions, is taken as an example. The study includes both experimental and numerical investigations, the application of vibration-based damage detection approaches, and a comparison of the results obtained from tests and simulations.
Finally, a methodology to assess the reliability and robustness of the models is proposed.
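The probability-of-detection idea underlying MAPOD can be illustrated with a minimal Monte Carlo sketch. This is a generic illustration, not the partial-model approach of the paper: the linear indicator model, noise level, and detection threshold below are assumed purely for demonstration.

```python
import random

def pod_curve(damage_sizes, threshold=0.5, noise_std=0.2, n_trials=2000, seed=0):
    """Estimate the probability of detection (POD) for each damage size.

    A detection is counted when the noisy damage indicator exceeds the
    threshold.  The linear indicator model and Gaussian measurement noise
    are toy assumptions for illustration only.
    """
    rng = random.Random(seed)
    pod = {}
    for a in damage_sizes:
        detections = 0
        for _ in range(n_trials):
            indicator = a + rng.gauss(0.0, noise_std)  # simulated measurement
            if indicator > threshold:
                detections += 1
        pod[a] = detections / n_trials
    return pod

pod = pod_curve([0.1, 0.5, 1.0, 2.0])
```

As expected, the estimated POD rises with damage size; in a MAPOD study the synthetic indicator would come from the (partial) structural model rather than from the toy linear model used here.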
The present research analyses the prediction error obtained under different data availability scenarios to determine which measurements contribute to an improvement of the model prognosis and which do not. A fully coupled 2D hydro-mechanical model of a water retaining dam is taken as an example. Here, the mean effective stress in the porous skeleton is reduced due to an increase in pore water pressure under drawdown conditions. Relevant model parameters are ranked by scaled sensitivities, Particle Swarm Optimization is applied to determine the optimal parameter values, and model validation is performed to determine the magnitude of the forecast error. We compare the predictions of the optimized models with results from a forward run of the reference model to obtain the actual prediction errors.
The analyses presented here were performed on 31 data sets of 100 observations of varying data types. Calibrating with multiple types of information instead of only one yields better calibration results and an improvement in model prognosis. However, when several types of information are used, the number of observations has to be increased in order to cover a representative part of the model domain; otherwise, a compromise between data availability and domain coverage proves best. Which type of calibration information contributes to the best prognoses could not be determined in advance, because the error in model prognosis does not depend on the calibration error but on the parameter error, which unfortunately cannot be determined in reality since the true parameter values are unknown. Excellent calibration fits with parameter values near the limits of physically reasonable ranges produced the highest prognosis errors, while models that included excess pore pressure values in the calibration provided the best prognoses, independent of the calibration fit.
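The Particle Swarm Optimization step used above for parameter identification can be sketched in a few lines. This is a generic textbook PSO with assumed inertia and acceleration coefficients, demonstrated on a toy quadratic misfit rather than on the hydro-mechanical model:

```python
import random

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize `objective` over the box `bounds` with a basic particle swarm."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # keep the particle inside the parameter box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy misfit function with its minimum at (3, -1)
best, best_val = pso(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                     bounds=[(-10, 10), (-10, 10)])
```

In the calibration context, `objective` would be the misfit between simulated and observed quantities (displacements, pore pressures, etc.) and `bounds` the physically reasonable parameter ranges.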
This paper presents a strain smoothing procedure for the extended finite element method (XFEM). The resulting “edge-based” smoothed extended finite element method (ESm-XFEM) is tailored to linear elastic fracture mechanics and, in this context, to outperform the standard XFEM. In the XFEM, the displacement-based approximation is enriched by the Heaviside and asymptotic crack tip functions using the framework of partition of unity. This eliminates the need for mesh alignment with the crack and for re-meshing as the crack evolves. Edge-based smoothing (ES) relies on a generalized smoothing operation over smoothing domains associated with the edges of simplex meshes, and produces a softening effect leading to a close-to-exact stiffness, “super-convergence” and “ultra-accurate” solutions. The present method takes advantage of both the ES-FEM and the XFEM. Thanks to strain smoothing, the subdivision of elements intersected by discontinuities and the integration of the (singular) derivatives of the approximation functions are avoided, since interior integration is transformed into boundary integration. Numerical examples show that the proposed method significantly improves the accuracy of stress intensity factors and achieves a near-optimal convergence rate in the energy norm, even without geometrical enrichment or blending correction.
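The partition-of-unity enrichment mentioned above uses the generalized Heaviside function across the crack faces and the four standard asymptotic branch functions at the crack tip. A minimal sketch of these functions follows (for illustration only; the assembly into ESm-XFEM shape functions and the smoothing itself are not shown):

```python
import math

def heaviside(signed_distance):
    """Generalized Heaviside enrichment: +1 on one side of the crack, -1 on the other."""
    return 1.0 if signed_distance >= 0.0 else -1.0

def crack_tip_branch_functions(r, theta):
    """The four asymptotic crack-tip enrichment functions of linear elastic
    fracture mechanics, in polar coordinates (r, theta) centred at the tip.
    Together with the partition of unity they reproduce the sqrt(r)
    displacement field near the tip."""
    sr = math.sqrt(r)
    return [sr * math.sin(theta / 2.0),
            sr * math.cos(theta / 2.0),
            sr * math.sin(theta / 2.0) * math.sin(theta),
            sr * math.cos(theta / 2.0) * math.sin(theta)]

vals = crack_tip_branch_functions(1.0, 0.0)
```

The first branch function is discontinuous across the crack faces (theta = ±pi), which is what allows a crack tip ending inside an element to be represented.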
A phantom-node method is developed for three-node shell elements to describe cracks. The method can treat arbitrary cracks independently of the mesh; a crack may cut elements completely or partially. Overlapping elements are placed at the position of the crack and are partially integrated to implement the discontinuous displacement across the crack. To treat the element containing a crack tip, a new kinematical relation between the overlapped elements is developed; no enrichment function for the discontinuous displacement field is required. Several numerical examples are presented to illustrate the proposed method.
Electromagnetic wave propagation is present in the vast majority of situations that occur in everyday life, whether in mobile communications, DTV, satellite tracking, broadcasting, etc. Because of this, the study of increasingly complex media for the propagation of electromagnetic waves has become necessary in order to optimize resources and increase the capabilities of devices, as required by the growing demand for such services.
Within electromagnetic wave propagation, different parameters are considered that characterize it under various circumstances; of particular importance are the reflectance and transmittance. There are several methods for the analysis of reflectance and transmittance, such as the boundary-condition approximation method, the plane wave expansion method (PWE), etc., but this work focuses on the WKB and SPPS methods.
The implementation of the WKB method is relatively simple, but it is found to be efficient only when working at high frequencies. The SPPS (Spectral Parameter Power Series) method, based on the theory of pseudoanalytic functions, solves this problem through a new representation of the solutions of Sturm-Liouville equations and has recently proven to be a powerful tool for solving different boundary value and eigenvalue problems. Moreover, it has a structure very suitable for numerical implementation, which in this case was carried out in Matlab for the evaluation of both conventional and turning-point profiles.
The comparison between the two methods yields valuable information about their performance, which is useful for determining the validity and suitability of their application to problems where these parameters must be calculated in real-life applications.
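The core of the SPPS method is the representation of the solution as a power series in the spectral parameter, whose coefficients are iterated integrals (formal powers). A minimal numerical sketch for the simplest Sturm-Liouville case u'' = λu (p = r = 1, q = 0, so the non-vanishing λ = 0 solution is u0 ≡ 1), using trapezoidal quadrature on a uniform grid — this illustrates the recursion only, not the Matlab implementation referred to above:

```python
def cumtrapz(values, h):
    """Cumulative trapezoidal integral of sampled values on a uniform grid."""
    out = [0.0]
    for i in range(1, len(values)):
        out.append(out[-1] + 0.5 * (values[i - 1] + values[i]) * h)
    return out

def spps_solution(lam, xs, n_terms=8):
    """First SPPS solution u1 = u0 * sum_k lam^k X~^(2k) of (p u')' + q u = lam r u,
    specialized to p = r = 1, q = 0, u0 = 1.  In general the formal powers
    X~^(n) alternate the integration weights u0^2*r (n odd) and 1/(u0^2*p)
    (n even); with this specialization both weights are 1."""
    h = xs[1] - xs[0]
    X = [1.0] * len(xs)               # X~^(0)
    u1 = [1.0] * len(xs)              # k = 0 term of the series
    for k in range(1, n_terms + 1):
        X = cumtrapz(X, h)            # X~^(2k-1)
        X = cumtrapz(X, h)            # X~^(2k)
        for i in range(len(xs)):
            u1[i] += lam ** k * X[i]
    return u1

xs = [i / 2000.0 for i in range(2001)]   # uniform grid on [0, 1]
u1 = spps_solution(-1.0, xs)             # lam = -1  ->  u1(x) = cos(x)
```

For λ = -1 the truncated series reproduces cos(x) to within the quadrature and truncation error; for a reflectance/transmittance computation the weights would carry the actual material profile.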
This thesis investigated decision-making in the construction process of bridge caps (Brückenkappen). It turned out that the fuzzy method could be a suitable tool for examining the sub-processes with regard to their potential for parallelization. To test this theory, the process was analyzed in detail on the basis of work by the Professur Baubetrieb und Bauverfahren and, supported by our own research, a UML diagram was created in the form of an activity diagram. Building on this workflow and the knowledge gained about the construction of a bridge cap, the individual processes could be grouped into sub-processes, so-called building blocks. These building blocks were created to make the simulation feasible by reducing the complexity of the workflow, so that only those processes need to be assessed that can be influenced or whose order is variable to a certain degree. Easier interaction with the building blocks and their transfer into a simulation program was realized via templates, so that every building block has a uniform structure. Among other things, the building blocks contain the respective resources and parameters, as well as the dependencies between the processes and the associated priority. Fuzzy logic was used for decision-making, with the parallelization problem as the objective. This was realized via a decision tree and the rule base derived from it. Thus, starting from a fixed process and by fuzzifying the input parameters priority, process duration, and availability of resources and labor, different paths could be identified, though only in connection with the previously analyzed dependencies. Fuzzy sets were created for each of the input parameters.
Using the decision tree, annotated with the linguistic variables, the rule base was set up via "and" conjunctions. The shortened rule base in this thesis contains, in principle, only those rules that actually lead to a decision. It follows that the fuzzy method makes it possible to decide whether two processes can be parallelized or not, and whether delays that have occurred can be made up.
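The fuzzification and "and" conjunction of rules described above can be sketched as follows. This is a minimal illustration with assumed triangular membership functions, units, and a single rule; the actual fuzzy sets and rule base of the thesis are not reproduced here:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c];
    a == b or b == c yields a one-sided ramp (shoulder)."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def parallelize(priority, duration, resource_availability):
    """Single illustrative rule:
    IF priority is high AND duration is short AND resources are available
    THEN parallelize.  The AND conjunction is the usual min operator; the
    rule fires when its strength reaches 0.5."""
    strength = min(tri(priority, 5, 10, 10),                # 'high' priority (scale 0-10)
                   tri(duration, 0, 0, 5),                  # 'short' duration (days)
                   tri(resource_availability, 0.5, 1, 1))   # 'available' (fraction)
    return strength >= 0.5

decision = parallelize(priority=9, duration=2, resource_availability=0.9)
```

A full rule base would evaluate all rules from the decision tree and combine their strengths, together with the analyzed process dependencies, before deciding whether two processes can run in parallel.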
This work describes an algorithm and corresponding software for incorporating general nonlinear multiple-point equality constraints in an implicit sparse direct solver. It is shown that direct addressing of sparse matrices is possible in general circumstances, circumventing the traditional linear or binary search when introducing (generalized) constituents into a sparse matrix. Nested and arbitrarily interconnected multiple-point constraints are introduced by processing the multiplicative constituents with a built-in topological ordering of the resulting directed graph. A classification of discretization methods is performed, and some re-classified problems are described and solved under the proposed perspective. The dependence relations between solution methods, algorithms and constituents become apparent. Fracture algorithms can be naturally cast in this framework. Solutions based on control equations are also directly incorporated as equality constraints. We show that arbitrary constituents can be used as long as the resulting directed graph is acyclic. It is also shown that graph partitions and orderings should be performed in the innermost part of the algorithm, a fact with some peculiar consequences. The core of our implicit code is described, specifically new algorithms for direct access of sparse matrices (by means of the clique structure) and general constituent processing. It is demonstrated that the graph structures of the second derivatives of the equality constraints are cliques (or pseudo-elements) and are naturally included as such. A complete algorithm is presented which allows full automation of the equality constraints, avoiding the need for pre-sorting. Verification applications in four distinct areas are shown: single and multiple rigid body dynamics, solution control and computational fracture.
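The built-in topological ordering of the constraint graph can be illustrated with Kahn's algorithm. This is a generic sketch: the constraint names and dependency structure below are invented for demonstration, and a cyclic graph — excluded by the paper's acyclicity requirement — raises an error:

```python
from collections import deque

def topological_order(deps):
    """Order constraints so that every constraint appears after the ones it
    depends on.  `deps` maps each node to the nodes it depends on.
    Raises ValueError if the dependency graph contains a cycle."""
    indegree = {n: 0 for n in deps}
    dependents = {n: [] for n in deps}
    for node, prerequisites in deps.items():
        for p in prerequisites:
            indegree[node] += 1
            dependents[p].append(node)
    ready = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in dependents[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                ready.append(m)
    if len(order) != len(deps):
        raise ValueError("constraint graph is cyclic")
    return order

# Hypothetical nested multiple-point constraints: c depends on a and b, b on a.
order = topological_order({"a": [], "b": ["a"], "c": ["a", "b"]})
```

Processing constituents in this order guarantees that, when a constraint is evaluated, every quantity it depends on has already been resolved — the property the paper relies on for arbitrarily interconnected constraints.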
In this paper, a wavelet energy damage indicator is used within response surface methodology to identify damage in a simulated filler-beam railway bridge. The approximation model is constructed so as to include the operational and surrounding conditions in the assessment. The procedure is split into two stages, a training phase and a detection phase. During the training phase, a so-called response surface is built from training data using polynomial regression and radial basis function approximation. The response surface is then used to detect damage in the structure during the detection phase. The results show that the response surface model is able to detect moderate damage in one of the bridge supports while the temperatures and train velocities are varied.
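The training/detection split can be sketched with the simplest possible response surface: a one-dimensional quadratic fitted by least squares. This is a pure illustration under assumed conditions — a single environmental variable, a made-up indicator model, and a fixed residual threshold — rather than the paper's wavelet energy indicator and radial basis function surrogate:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    A = [row[:] + [bi] for row, bi in zip(A, b)]      # augmented copy
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):                               # back substitution
        x[r] = (A[r][3] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal equations."""
    V = [[1.0, x, x * x] for x in xs]
    AtA = [[sum(V[i][r] * V[i][c] for i in range(len(xs))) for c in range(3)]
           for r in range(3)]
    Atb = [sum(V[i][r] * ys[i] for i in range(len(xs))) for r in range(3)]
    return solve3(AtA, Atb)

def is_damaged(x, y_observed, beta, tol=0.1):
    """Detection phase: flag damage when the measured indicator deviates
    from the response surface prediction by more than `tol`."""
    y_pred = beta[0] + beta[1] * x + beta[2] * x * x
    return abs(y_observed - y_pred) > tol

# Training phase: healthy-state indicator versus temperature (toy model).
temps = [i / 10.0 for i in range(21)]                  # 0.0 .. 2.0
indicator = [1.0 + 2.0 * t + 0.5 * t * t for t in temps]
beta = fit_quadratic(temps, indicator)
```

In the paper's setting the surface is multi-dimensional (temperature and train velocity) and the indicator comes from wavelet energies of measured responses, but the two-phase logic — fit on healthy data, flag outliers afterwards — is the same.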
Different types of data provide different types of information. The present research analyzes the prediction error obtained when different data types are available for calibration, evaluating the contribution of different measurement types to model calibration and prognosis. A coupled 2D hydro-mechanical model of a water retaining dam is taken as an example. Here, the mean effective stress in the porous skeleton is reduced due to an increase in pore water pressure under drawdown conditions. Relevant model parameters are identified by scaled sensitivities; Particle Swarm Optimization is then applied to determine the optimal parameter values, and finally the prognosis error is determined. We compare the predictions of the optimized models with results from a forward run of the reference model to obtain the actual prediction errors. The analyses presented here were performed by calibrating the hydro-mechanical model to 31 data sets of 100 observations of varying data types. The prognosis results improve when diversified information is used for calibration. However, when several types of information are used, the number of observations has to be increased in order to cover a representative part of the model domain. For an analysis with a constant number of observations, a compromise between data type availability and domain coverage proves to be the best solution. Which type of calibration information contributes to the best prognoses could not be determined in advance. The error in model prognosis does not depend on the calibration error but on the parameter error, which unfortunately cannot be determined in inverse problems since its real value is unknown. The best prognoses were obtained independently of the calibration fit. However, excellent calibration fits led to an increased variation of the prognosis error; in the case of excellent fits, the parameter values more often came near the limits of physically reasonable values.
To improve the reliability of the prognoses, the expected values of the parameters should be considered as prior information in the optimization algorithm.
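Using the expected parameter value as prior information amounts to adding a penalty term to the calibration objective. A minimal sketch with a toy one-parameter problem, an assumed prior mean, standard deviation, and weight, solved by dense grid search rather than by PSO:

```python
def regularized_objective(theta, misfit, prior_mean, prior_std, weight=1.0):
    """Data misfit plus a quadratic penalty pulling theta toward the prior.
    The penalty discourages parameter values far from their expected value,
    even when such values would slightly improve the calibration fit."""
    return misfit(theta) + weight * ((theta - prior_mean) / prior_std) ** 2

def grid_minimize(f, lo, hi, n=6001):
    """Brute-force 1D minimization on a dense grid (illustration only)."""
    best_x, best_f = lo, f(lo)
    for i in range(1, n):
        x = lo + (hi - lo) * i / (n - 1)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

misfit = lambda t: (t - 4.0) ** 2        # the data alone favour theta = 4
theta_map = grid_minimize(lambda t: regularized_objective(t, misfit, 2.0, 1.0),
                          0.0, 6.0)      # prior mean 2, prior std 1
```

With the weight set to zero the estimate returns to the purely data-driven optimum; increasing the weight pulls it toward the prior mean, trading some calibration fit for parameter values that stay within the physically expected range.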