Refine
Document Type
- Conference Proceeding (38)
- Article (22)
- Doctoral Thesis (16)
- Part of a Book (10)
- Bachelor Thesis (8)
- Master's Thesis (3)
- Book (2)
- Periodical (2)
- Study Thesis (2)
- Habilitation (1)
- Sound (1)
Institute
- Graduiertenkolleg 1462 (19)
- In Zusammenarbeit mit der Bauhaus-Universität Weimar (19)
- Institut für Strukturmechanik (ISM) (18)
- Universitätsbibliothek (11)
- Professur Bauphysik (8)
- Professur Informatik in der Architektur (6)
- F. A. Finger-Institut für Baustoffkunde (FIB) (4)
- An-Institute (2)
- Institut für Europäische Urbanistik (2)
- Professur Bauchemie und Polymere Werkstoffe (2)
Keywords
- Applied mathematics (50)
- Applied computer science (36)
- Computer-aided methods (36)
- Structural mechanics (14)
- Electronic book (7)
- Building physics (3)
- Library (3)
- E-book reader (3)
- Copyright (3)
- Architecture (2)
Year of publication
- 2012 (105)
We present an extended finite element formulation for dynamic fracture of piezoelectric materials. The method is developed in the context of linear elastic fracture mechanics and applied to mode-I and mixed-mode fracture of quasi-steady cracks. An implicit time integration scheme is employed. The results are compared with results obtained by the boundary element method and show excellent agreement.
This work describes an algorithm and corresponding software for incorporating general nonlinear multiple-point equality constraints in an implicit sparse direct solver. It is shown that direct addressing of sparse matrices is possible in general circumstances, circumventing the traditional linear or binary search for introducing (generalized) constituents to a sparse matrix. Nested and arbitrarily interconnected multiple-point constraints are introduced by processing multiplicative constituents with a built-in topological ordering of the resulting directed graph. A classification of discretization methods is performed, and some re-classified problems are described and solved under this proposed perspective. The dependence relations between solution methods, algorithms and constituents become apparent. Fracture algorithms can be cast naturally in this framework. Solutions based on control equations are also directly incorporated as equality constraints. We show that arbitrary constituents can be used as long as the resulting directed graph is acyclic. It is also shown that graph partitions and orderings should be performed in the innermost part of the algorithm, a fact with some peculiar consequences. The core of our implicit code is described, specifically new algorithms for direct access of sparse matrices (by means of the clique structure) and general constituent processing. It is demonstrated that the graph structure of the second derivatives of the equality constraints forms cliques (or pseudo-elements), which are naturally included as such. A complete algorithm is presented which allows full automation of equality constraints, avoiding the need for pre-sorting. Verification applications in four distinct areas are shown: single and multiple rigid body dynamics, solution control and computational fracture.
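The acyclicity requirement on the constraint graph mentioned above can be checked with a standard topological ordering, e.g. Kahn's algorithm. The sketch below is a generic illustration (node names and the admissibility rule are assumptions, not the paper's implementation):

```python
from collections import defaultdict, deque

def topological_order(edges):
    """Return a topological ordering of the constraint dependency graph,
    or None if the graph contains a cycle (constraints then inadmissible)."""
    adj = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
        nodes.update((u, v))
    queue = deque(n for n in nodes if indeg[n] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return order if len(order) == len(nodes) else None
```

Processing constituents in this order guarantees that every constraint is resolved only after all constraints it depends on.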
The lattice dynamics properties of twisted bilayer graphene are investigated. There are large jumps in the inter-layer potential at twisting angles θ=0° and 60°, implying the stability of the Bernal-stacking and the instability of the AA-stacking structure, while a long plateau over [8°, 55°] indicates the ease of twisting bilayer graphene in this wide angle range. Significant frequency shifts are observed for the z-breathing mode around θ=0° and 60°, while the frequency is constant over the wide range [8°, 55°]. Using the z-breathing mode, a mechanical nanoresonator is proposed that operates at a robust resonant frequency in the terahertz range.
The upper limits of the thermal conductivity and the mechanical strength are predicted for the polyethylene chain by performing ab initio calculations and applying the quantum mechanical non-equilibrium Green's function approach. Specifically, there are two main findings from our calculation: (1) the thermal conductivity can reach a high value of 310 W m−1 K−1 in a 100 nm polyethylene chain at room temperature, and the thermal conductivity increases with the length of the chain; (2) the Young's modulus of the polyethylene chain is as high as 374.5 GPa, and the polyethylene chain can sustain an (ultimate) strain of 32.85%±0.05% before undergoing a structural phase transition into gaseous ethylene.
The 19th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus-Universität Weimar from 4 to 6 July 2012. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss open questions. The conference covers a broad range of research areas: numerical analysis, function-theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer science, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
This working paper describes how, starting from an existing street network, building areas can be parcelled out automatically, i.e. subdivided into lots, using subdivision algorithms, and subsequently built up on the basis of various urban-design typologies. The subdivision of building areas and the generation of building structures are subject to specific urban-planning constraints, requirements and parameters. The goal is to develop, from the investigations presented, a recommendation system for urban-design drafts, which is discussed further on the basis of a first software prototype for generating urban structures.
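A toy version of such a lot-subdivision step could look like the following recursive halving sketch. It operates under strong simplifying assumptions (axis-aligned rectangular blocks, a pure area criterion, no planning constraints) and is not the prototype described above:

```python
def subdivide(x, y, w, h, max_area):
    """Recursively split an axis-aligned block into lots until each lot's
    area is at most max_area; always split along the longer side."""
    if w * h <= max_area:
        return [(x, y, w, h)]
    if w >= h:
        return (subdivide(x, y, w / 2, h, max_area)
                + subdivide(x + w / 2, y, w / 2, h, max_area))
    return (subdivide(x, y, w, h / 2, max_area)
            + subdivide(x, y + h / 2, w, h / 2, max_area))
```

Real parcelling algorithms additionally respect street frontage, lot shape and zoning parameters.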
Metakaolin made from kaolin is used around the world but rarely in Vietnam, where abundant deposits of kaolin are found. The first studies on producing metakaolin were conducted with high-quality Vietnamese kaolins. The results showed the potential to produce metakaolin and the effect it has on the strength development of mortars and concretes. However, the utilisation of low-quality kaolin for producing Vietnamese metakaolin has not been studied so far.
The objectives of this study were to produce a good-quality metakaolin from low-quality Vietnamese kaolin and to facilitate the utilisation of Vietnamese metakaolin in composite cements.
To reach these goals, the optimal thermal conversion of Vietnamese kaolin into metakaolin was determined through extensive investigations, using the analysis results of DSC/TGA, XRD and CSI. During calcination in the range of 500–800 °C for 1–5 hours, the calcined kaolin was also characterised with respect to mass loss, BET surface, PSD, density and the presence of residual water. A good correlation between residual water and BET surface is found.
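A correlation such as the one between residual water and BET surface is commonly quantified with the Pearson coefficient. The sketch below is generic; the data values in the test are illustrative, not measurements from the thesis:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples,
    e.g. residual-water content vs. BET surface of calcined kaolin."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Values near +1 or -1 indicate a strong linear relation; values near 0 indicate none.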
The pozzolanic activity of the metakaolin was tested by various methods, namely the saturated lime method, the mCh method and the TGA-CaO method. The results of the study show which method is the most suitable for characterising the real activity of metakaolin and agrees best with concrete performance. Furthermore, the pozzolanic activity results obtained with these methods were analysed and compared with each other with respect to the BET surface.
The properties of Vietnamese metakaolin were established through investigations of water demand, setting time, spread flowability and strength. It is concluded that, depending on the intended use of the composite cement and the curing conditions, each Vietnamese metakaolin can be used appropriately to produce (1) a composite cement with a low water demand, (2) a high-strength composite cement, (3) a composite cement that reduces CO2 emissions and improves the economics of cement products, or (4) a high-performance mortar.
The durability of metakaolin mortar was tested successfully to find the metakaolin content needed to resist ASR, sulfate and sulfuric acid attack.
Modern digital material approaches for the visualization and simulation of heterogeneous materials make it possible to investigate the behavior of complex multiphase materials, including their physically nonlinear material response, at various scales. However, these computational techniques require extensive hardware resources, in terms of computing power and main memory, to solve large-scale discretized 3D models numerically. Because the number of degrees of freedom can quickly grow into the two-digit million range, the limited hardware resources must be utilized as efficiently as possible to execute the numerical algorithms in minimal computation time. Hence, in the field of computational mechanics, various methods and algorithms can lead to an optimized runtime behavior of nonlinear simulation models; several such approaches are proposed and investigated in this thesis.
Today, the numerical simulation of damage effects in heterogeneous materials is performed by adapting multiscale methods. Consistent modeling in three-dimensional space with an appropriate discretization resolution on each scale (based on a hierarchical or concurrent multiscale model), however, still poses computational challenges with respect to the convergence behavior, the scale transition and the solver performance of the weakly coupled problems. The computational efficiency and the distribution among available hardware resources (often based on a parallel hardware architecture) can be improved significantly. In recent years, high-performance computing (HPC) and graphics processing unit (GPU) based computation techniques have been established for the investigation of scientific objectives. Their application leads to the modification of existing, and the development of new, computational methods that exploit massively clustered computer hardware resources. In numerical simulation in materials science, e.g. the investigation of damage effects in multiphase composites, the suitability of such models is often restricted by the number of degrees of freedom (d.o.f.) in the three-dimensional spatial discretization. This complicates the implementation of the nonlinear simulation procedure and, at the same time, strongly influences memory demand and computation time.
In this thesis, a hybrid discretization technique has been developed for the three-dimensional discretization of a three-phase material that respects the numerical efficiency of nonlinear (damage) simulations of these materials. The increase in computational efficiency results from the improved scalability of the numerical algorithms. Consequently, substructuring methods for partitioning the hybrid mesh were implemented, tested and adapted to the HPC framework, using several hundred CPU (central processing unit) nodes for the finite element assembly. A memory-efficient, iterative and parallelized equation solver, combined with a special preconditioning technique for the underlying equation system, was modified and adapted to enable combined CPU- and GPU-based computations.
Hence, the author recommends applying the substructuring method to hybrid meshes that respect the different material phases and their mechanical behavior, which makes it possible to split the structure into elastic and inelastic parts. The consideration of the nonlinear material behavior, specified for the corresponding phase, is then limited to the inelastic domains only, which reduces the computing time of the nonlinear procedure. Because of the high numerical effort of such simulations, an alternative approach to nonlinear finite element analysis, based on sequential linear analysis, was also implemented with respect to scalable HPC. The incremental-iterative procedure of the nonlinear step in finite element analysis (FEA) was replaced by a sequence of linear FE analyses whenever damage occurred in critical regions, known in the literature as the saw-tooth approach. As a result, qualitative (smeared) crack initiation in 3D multiphase specimens has been simulated efficiently.
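The saw-tooth idea (replacing the incremental-iterative nonlinear step by a sequence of linear analyses with discrete stiffness drops) can be illustrated on a trivial system of springs in parallel. This is only a sketch: the parallel-spring model and the reduction factor are assumptions for illustration, not the thesis's FE implementation:

```python
def saw_tooth(k, strengths, load, reduction=0.5, max_steps=20):
    """Saw-tooth sketch: springs in parallel share a prescribed load;
    after each linear solve, the stiffness of the most over-stressed
    spring is reduced and the linear analysis is repeated."""
    k = list(k)
    u = 0.0
    for _ in range(max_steps):
        u = load / sum(k)                        # one linear analysis
        ratios = [ki * u / s for ki, s in zip(k, strengths)]
        worst = max(range(len(k)), key=ratios.__getitem__)
        if ratios[worst] <= 1.0:                 # admissible state: stop
            break
        k[worst] *= reduction                    # discrete "damage" jump
    return u, k
```

Each pass is a cheap linear solve, which is what makes the approach attractive for scalable HPC.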
This paper presents a strain smoothing procedure for the extended finite element method (XFEM). The resulting “edge-based” smoothed extended finite element method (ESm-XFEM) is tailored to linear elastic fracture mechanics and, in this context, outperforms the standard XFEM. In the XFEM, the displacement-based approximation is enriched by the Heaviside and asymptotic crack-tip functions using the framework of partition of unity. This eliminates the need for mesh alignment with the crack and for re-meshing as the crack evolves. Edge-based smoothing (ES) relies on a generalized smoothing operation over smoothing domains associated with the edges of simplex meshes, and produces a softening effect leading to a close-to-exact stiffness, “super-convergence” and “ultra-accurate” solutions. The present method takes advantage of both the ES-FEM and the XFEM. Thanks to strain smoothing, the subdivision of elements intersected by discontinuities and the integration of the (singular) derivatives of the approximation functions are avoided by transforming interior integration into boundary integration. Numerical examples show that the proposed method significantly improves the accuracy of stress intensity factors and achieves a near-optimal convergence rate in the energy norm, even without geometrical enrichment or blending correction.
The concept of isogeometric analysis, where functions that are used to describe geometry in CAD software are used to approximate the unknown fields in numerical simulations, has received great attention in recent years. The method has the potential to have profound impact on engineering design, since the task of meshing, which in some cases can add significant overhead, has been circumvented. Much of the research effort has been focused on finite element implementations of the isogeometric concept, but at present, little has been seen on the application to the Boundary Element Method. The current paper proposes an Isogeometric Boundary Element Method (BEM), which we term IGABEM, applied to two-dimensional elastostatic problems using Non-Uniform Rational B-Splines (NURBS). We find it is a natural fit with the isogeometric concept since both the NURBS approximation and BEM deal with quantities entirely on the boundary. The method is verified against analytical solutions where it is seen that superior accuracies are achieved over a conventional quadratic isoparametric BEM implementation.
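The NURBS approximation underlying such an IGABEM discretization rests on B-spline basis functions, which can be evaluated with the classical Cox-de Boor recursion. The sketch below is a textbook illustration (the rational weighting that turns B-splines into NURBS is omitted for brevity):

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis function
    of degree p at parameter u, for the given knot vector."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right
```

On the open knot vector [0,0,0,1,1,1] the quadratic basis reduces to the Bernstein polynomials, and the functions sum to one (partition of unity).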
Monogenic functions play a role in quaternion analysis similarly to that of holomorphic functions in complex analysis. A holomorphic function with nonvanishing complex derivative is a conformal mapping. It is well-known that in Rn+1, n ≥ 2 the set of conformal mappings is restricted to the set of Möbius transformations only and that the Möbius transformations are not monogenic. The paper deals with a locally geometric mapping property of a subset of monogenic functions with nonvanishing hypercomplex derivatives (named M-conformal mappings). It is proved that M-conformal mappings orthogonal to all monogenic constants admit a certain change of solid angles and vice versa, that change can characterize such mappings. In addition, we determine planes in which those mappings behave like conformal mappings in the complex plane.
This paper presents a novel numerical procedure based on the framework of isogeometric analysis for static, free vibration, and buckling analysis of laminated composite plates using the first-order shear deformation theory. The isogeometric approach utilizes non-uniform rational B-splines to implement quadratic, cubic, and quartic elements. A shear-locking problem still exists in the stiffness formulation; hence, it is significantly alleviated by a stabilization technique. Several numerical examples are presented to show the performance of the method, and the results obtained are compared with other available ones.
This paper presents a novel numerical procedure for computing limit and shakedown loads of structures using a node-based smoothed FEM in combination with a primal–dual algorithm. An associated primal–dual form based on the von Mises yield criterion is adopted. The primal-dual algorithm together with a Newton-like iteration are then used to solve this associated primal–dual form to determine simultaneously both approximate upper and quasi-lower bounds of the plastic collapse limit and the shakedown limit. The present formulation uses only linear approximations and its implementation into finite element programs is quite simple. Several numerical examples are given to show the reliability, accuracy, and generality of the present formulation compared with other available methods.
An analytical molecular mechanics model for the elastic properties of crystalline polyethylene
(2012)
We present an analytical model for the elastic properties of crystalline polyethylene based on a molecular mechanics approach. Along the polymer chain direction, the united-atom (UA) CH2-CH2 bond-stretching and angle-bending potentials are replaced with equivalent Euler-Bernoulli beams. Between any two polymer chains, explicit formulae are derived for the van der Waals interaction, represented by linear springs of different stiffness. The nine independent elastic constants are then evaluated systematically using these formulae. The analytical model is validated by our united-atom molecular dynamics (MD) simulations and against available all-atom molecular dynamics results in the literature. The established analytical model provides an efficient route for the mechanical characterization of crystalline polymers and related materials.
A simple multiscale analysis framework for heterogeneous solids based on a computational homogenization technique is presented. The macroscopic strain is linked kinematically to the boundary displacement of a circular or spherical representative volume which contains the microscopic information of the material. The macroscopic stress is obtained from the energy principle between the macroscopic and the microscopic scale. The new method is applied to several standard examples to demonstrate the accuracy and consistency of the proposed method.
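In the simplest (piecewise-constant) case, the macroscopic stress recovered from such a representative volume is a volume average over the microscopic phases. A sketch of that averaging step, for one stress component:

```python
def average_stress(volumes, stresses):
    """Volume-averaged macroscopic stress over the phases of a
    representative volume: <sigma> = (1/V) * sum_i V_i * sigma_i."""
    V = sum(volumes)
    return sum(v * s for v, s in zip(volumes, stresses)) / V
```

In a real homogenization scheme the same average is taken componentwise over the full stress tensor field of the discretized microstructure.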
A phantom-node method is developed for three-node shell elements to describe cracks. This method can treat arbitrary cracks independently of the mesh. The crack may cut elements completely or partially. Elements are overlapped on the position of the crack, and they are partially integrated to implement the discontinuous displacement across the crack. To consider the element containing a crack tip, a new kinematical relation between the overlapped elements is developed. There is no enrichment function for the discontinuous displacement field. Several numerical examples are presented to illustrate the proposed method.
The aim of this paper is to discuss explicit series constructions for the fundamental solution of the Helmholtz operator on some important examples of non-orientable conformally flat manifolds. We focus on higher-dimensional generalizations of the Klein bottle, which in turn generalize the higher-dimensional Möbius strips discussed in preceding works. We discuss some basic properties of pinor-valued solutions to the Helmholtz equation on these manifolds.
This paper focuses on first numerical tests of the coupling between an analytical solution and the finite element method, using an example problem from fracture mechanics. The calculations follow the ideas proposed in [1]. The analytical solutions are constructed using an orthogonal basis of holomorphic and anti-holomorphic functions. For the coupling with the finite element method, special elements are constructed by means of the trigonometric interpolation theorem.
Lesen - Schreiben - Apparate
(2012)
The aim of our contribution is to clarify the relation between totally regular variables and Appell sequences of hypercomplex holomorphic polynomials (sometimes simply called monogenic power-like functions) in Hypercomplex Function Theory. After their introduction in 2006 by two of the authors of this note on the occasion of the 17th IKM, the latter have been the subject of investigations by different authors with different methods and in various contexts. The former concept, introduced by R. Delanghe in 1970 and later also studied by K. Gürlebeck in 1982 for the case of quaternions, has an obvious relationship with the latter, since it describes a set of linear hypercomplex holomorphic functions all powers of which are also hypercomplex holomorphic. Due to the non-commutative nature of the underlying Clifford algebra, being a totally regular variable or an Appell sequence is not a trivial property, as it is for the integer powers of the complex variable z=x+iy. Simple examples also show that not every totally regular variable and its powers form an Appell sequence, and vice versa. Under a very natural normalization condition, the set of all paravector-valued totally regular variables which are also Appell sequences will be completely characterized. In some sense the result can also be considered an answer to a remark of K. Habetha in chapter 16: Function theory in algebras of the collection Complex analysis. Methods, trends, and applications, Akademie-Verlag Berlin, (Eds. E. Lanckau and W. Tutschke) 225-237 (1983) on the use of exact copies of several complex variables for the power series representation of any hypercomplex holomorphic function.
Gaze-based human-computer interaction has been a research topic for over a quarter of a century. For most of this time, the main scenario for gaze interaction has been helping handicapped people to communicate and interact with their environment. With the rapid development of mobile and wearable display technologies, a new application field for gaze interaction has appeared, opening new research questions.
This thesis investigates the feasibility of mobile gaze-based interaction, studying in depth the use of pie menus as a generic and robust widget for gaze interaction, as well as visual and perceptual issues of head-mounted (wearable) optical see-through displays.
It reviews conventional gaze-based selection methods and investigates in detail the use of pie menus for gaze control. It studies and discusses layout issues, selection methods and applications. Results show that pie menus can accommodate up to six items in width and multiple depth layers, allowing fast and accurate navigation through hierarchical levels by using or combining multiple selection methods. Based on these results, several text entry methods based on pie menus are proposed. Character-by-character text entry, text entry with bigrams, and text entry with bigrams derived from word prediction, as well as the possible selection methods, were examined in a longitudinal study. The data showed large advantages of the bigram entry methods over single-character text entry in speed and accuracy. Participants preferred the novel selection method based on saccades (selecting by borders) over the conventional and well-established dwell-time method.
On the one hand, pie menus proved to be a feasible and robust widget, which may enable the efficient use of mobile eye-tracking systems that are not accurate enough for controlling elements of a conventional interface. On the other hand, visual perception on mobile display technologies needs to be examined in order to determine whether these results transfer to mobile devices.
Optical see-through devices enable observers to see additional information embedded in real environments. There is already some evidence of increased visual load on such systems. We investigated participants' visual performance with a visual search task and dual tasks, presenting visual stimuli on the optical see-through device, on a computer screen only, and on both devices simultaneously. Results showed that switching between the presentation devices (i.e. perceiving information simultaneously from both devices) produced costs in visual performance. The implications of these costs and of further perceptual and technical factors for mobile gaze-based interaction are discussed, and solutions are proposed.
Capturing Sheep With Minecraft deals with selected problems of building physics and their implementation in the computer game Minecraft. Selected building-physics problems are modelled in Minecraft in order to bring them closer to school and university students. A Minecraft scenario was designed that the player is meant to solve by counteracting the modelled problems.
In this paper, a wavelet energy damage indicator is used in response surface methodology to identify damage in a simulated filler-beam railway bridge. The approximate model is designed to include the operational and environmental conditions in the assessment. The procedure is split into two stages, a training phase and a detection phase. During the training phase, a so-called response surface is built from training data using polynomial regression and radial basis function approximation. The response surface is then used to detect damage in the structure during the detection phase. The results show that the response surface model is able to detect moderate damage in one of the bridge supports while the temperatures and train velocities are varied.
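The two-phase idea above (train a response surface, then flag deviations) can be sketched in miniature with an exact quadratic fit in one variable. This is a toy illustration; the function names, threshold and data are assumptions, not the paper's model:

```python
def quadratic_response_surface(pts):
    """Training phase: fit the exact quadratic through three (x, y)
    training points, returned as a callable model (Lagrange form)."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    def model(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return model

def damaged(model, x, measured, tol):
    """Detection phase: flag damage when the measured indicator deviates
    from the surface prediction by more than a tolerance."""
    return abs(measured - model(x)) > tol
```

In the paper's setting, x would be a vector of operational conditions (temperature, train velocity) and the regression a least-squares polynomial or RBF fit over many training samples.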
This thesis treats the history of regional planning in Central Germany and consists of two parts, covering the 1920s and the 1990s. It deals with the emergence of regional planning in the Central German industrial district (around Merseburg) and in Thuringia. The cognitive, institutional and planning foundations of regional planning are worked out; the paradigmatic foundations (decentralization) and the international relations (especially to the USA) play a special role. The work of key figures (among them Prager, Luthardt and Langen) and the connections to the Bauhaus are also examined. The plan for Central Germany (1932) is analysed comprehensively; in the analysis of its genesis, first approaches to non-linear planning can be identified. The second part of the thesis is devoted to the Industrielles Gartenreich, a project of the Stiftung Bauhaus Dessau that was recognised as a corresponding region of the EXPO. Here the overall concept, but also projects such as Ferropolis, are at the centre of attention. The analyses of regional planning are related to societal and, specifically, economic development. Finally, both development phases and their regional-planning results are assessed comparatively, placed within a model of emergent-adaptive planning, and an outlook on the development of regional planning as non-linear planning is given.
Texts from the web can be reused individually or in large quantities. The former is called text reuse and the latter language reuse. We first present a comprehensive overview of the different ways in which text and language are reused today, and how exactly information retrieval technologies can be applied in this respect. The remainder of the thesis then deals with specific retrieval tasks. In general, our contributions consist of models and algorithms, their evaluation, and, for that purpose, large-scale corpus construction.
The thesis divides into two parts. The first part introduces technologies for text reuse detection, and our contributions are as follows: (1) A unified view of projecting-based and embedding-based fingerprinting for near-duplicate detection, and the first evaluation of fingerprint algorithms on Wikipedia revision histories as a new, large-scale corpus of near-duplicates. (2) A new retrieval model for the quantification of cross-language text similarity, which gets by without parallel corpora. We have evaluated the model in comparison to other models on many different pairs of languages. (3) An evaluation framework for text reuse and particularly plagiarism detectors, which consists of tailored detection performance measures and a large-scale corpus of automatically generated and manually written plagiarism cases. The latter have been obtained via crowdsourcing. This framework has been successfully applied to evaluate many different state-of-the-art plagiarism detection approaches within three international evaluation competitions.
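A minimal projecting-based fingerprint for near-duplicate detection can be sketched by hashing word shingles and keeping the smallest hash values. This is a generic illustration of the technique family, not the specific algorithms evaluated in the thesis:

```python
import hashlib

def fingerprint(text, k=3, n=8):
    """Hash all k-word shingles of a document and keep the n smallest
    hash values as its fingerprint."""
    words = text.lower().split()
    shingles = {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}
    hashes = sorted(int(hashlib.md5(s.encode()).hexdigest(), 16)
                    for s in shingles)
    return set(hashes[:n])

def resemblance(fp_a, fp_b):
    """Jaccard overlap of two fingerprints; values near 1 suggest
    near-duplicate documents."""
    return len(fp_a & fp_b) / len(fp_a | fp_b)
```

Because only n hashes are stored per document, comparing large collections pairwise stays cheap.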
The second part introduces technologies that solve three retrieval tasks based on language reuse, and our contributions are as follows: (4) A new model for the comparison of textual and non-textual web items across media, which exploits web comments as a source of information about the topic of an item. In this connection, we identify web comments as a largely neglected information source and introduce the rationale of comment retrieval. (5) Two new algorithms for query segmentation, which exploit web n-grams and Wikipedia as a means of discerning the user intent of a keyword query. Moreover, we crowdsource a new corpus for the evaluation of query segmentation which surpasses existing corpora by two orders of magnitude. (6) A new writing assistance tool called Netspeak, which is a search engine for commonly used language. Netspeak indexes the web in the form of web n-grams as a source of writing examples and implements a wildcard query processor on top of it.
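The n-gram-based scoring behind query segmentation (contribution 5 above) can be sketched naively: enumerate all segmentations of the keyword query and score each by the counts of its segments. The counts below are hypothetical, and the thesis's actual algorithms are considerably more refined:

```python
def segmentations(words):
    """Enumerate all segmentations of a word list as lists of tuples."""
    if not words:
        return [[]]
    result = []
    for i in range(1, len(words) + 1):
        for tail in segmentations(words[i:]):
            result.append([tuple(words[:i])] + tail)
    return result

def best_segmentation(words, ngram_count):
    """Score each segmentation by the product of its segments' (web)
    n-gram counts; unseen segments get a smoothing count of 1."""
    def score(seg):
        p = 1
        for segment in seg:
            p *= ngram_count.get(segment, 1)
        return p
    return max(segmentations(words), key=score)
```

A query of n words has 2^(n-1) segmentations, so real systems prune or use dynamic programming.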
Im Rahmen dieser Arbeit wurde die Entscheidungsfindung im Herstellungsprozess von Brückenkappen untersucht. Es stellte sich heraus, dass die Fuzzy-Methode ein geeignetes Werkzeug sein könnte, die Teilprozesse auf die Möglichkeit ihrer Parallelisierung hin zu untersuchen. Um diese Theorie zu testen, wurde auf den Grundlagen von Arbeiten der Professur Baubetrieb und Bauverfahren der Prozess näher analysiert und unterstützend durch eigene Recherchen ein UML-Diagramm erstellt, welches als Aktivitätsdiagramm ausgebildet wurde. Aufbauend auf diesem Ablauf und den gewonnenen Kenntnissen zur Herstellung einer Brückenkappe, konnten die einzelnen Prozesse zu Teilprozessen, sogenannten Bausteinen, zusammengefasst werden. Diese Bausteine sind entstanden, um die Simulation möglich zu machen, indem der Ablauf weniger komplex wird und nur die Prozesse zu beurteilen sind, die beeinflusst bzw. in ihrer Reihenfolge bis zu einem gewissen Grad variabel sind. Eine erleichterte Interaktion mit den Bausteinen und deren Überführung in ein Simulationsprogramm wurde über Templates realisiert. So besitzt jeder Baustein eine einheitliche Struktur. Unter anderem beinhalten die Bausteine die jeweiligen Ressourcen und Parameter, sowie die Abhängigkeiten der Prozesse untereinander und die zugehörige Priorität. Zur Entscheidungsfindung wurde die Fuzzylogik herangezogen und die Problemstellung der Parallelisierung zum Ziel gesetzt. Die Realisierung wurde über einen Entscheidungsbaum und das daraus resultierende Regelwerk erreicht. Somit ließen sich, ausgehend von einem festgelegten Prozess und durch die Fuzzifizierung der Eingangsparameter Priorität, Prozessdauer und Verfügbarkeit der Ressourcen und Arbeitskräfte, verschiedene Pfade identifizieren, allerdings nur in Verbindung mit den vorher analysierten Abhängigkeiten. Für jeden der Eingangsparameter wurden so Fuzzy-Sets erstellt. 
The rule base was set up via AND conjunctions over the decision tree, which had been annotated with the linguistic variables. The abridged rule base in this thesis essentially contains only those rules that actually lead to a decision. It follows that the fuzzy method makes it possible to decide whether or not two processes can be parallelized and whether delays that have occurred can be made up again.
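A fuzzified AND rule of this kind can be sketched in a few lines. The triangular membership functions and parameter ranges below are illustrative assumptions, not the rule base of the thesis:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def parallelize_degree(priority, availability):
    """Degree to which two sub-processes should run in parallel,
    using min() as the fuzzy AND over one illustrative rule:
    IF priority is high AND resource availability is high THEN parallelize."""
    high_prio = tri(priority, 0.4, 1.0, 1.6)       # membership in 'high priority'
    high_avail = tri(availability, 0.4, 1.0, 1.6)  # membership in 'high availability'
    return min(high_prio, high_avail)

print(round(parallelize_degree(0.9, 0.8), 3))
```

A crisp yes/no decision is then obtained by thresholding the degree, e.g. parallelize if it exceeds 0.5.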
In crowdsourcing-based systems, quality assurance of user-generated content is of great importance for maintaining usability. The building physics learning game "BuildVille" uses a crowdsourcing approach for its quiz application: the quiz questions are generated by the users themselves. This thesis aims to ensure that faulty questions, and questions entered by mistake or as a joke, are detected as early as possible and then corrected or excluded from further distribution. To this end, a concept for the QA measures in BuildVille is developed, based on an analysis of existing crowdsourcing-based systems with regard to the quality assurance measures they implement.
Building Physics Quartet (Bauphysikalisches Quartett)
(2012)
Quartet is a card game as old as it is popular. It is especially popular with children, while hardly anyone in the older generations plays with quartet cards.
Quartet games designed specifically for young children are largely furnished with content that conveys knowledge in a playful way, and good learning outcomes are recorded in this target group.
So how can these learning outcomes be achieved by playing with quartet cards? And how can this effect be transferred to students?
The goal of this thesis is to apply the concept of the quartet card game to building physics content and, where necessary, to extend or modify the game principles. The target group the game is aimed at are the students of the Faculty of Civil Engineering.
A particular challenge is to bring together different object classes of building physics relevance in one game and to make them comparable. The resulting quartet card game should cover not just one object class but several, and the object classes should be chosen such that categories with building physics content can be found for them.
Attention should also be paid to structuring the learning content so that the game concept can easily be transferred to other subject domains. The result of this thesis is two finished, playable quartet games.
One of the most recent developments in the games industry are so-called social games: digital games that are played within social networks such as Facebook and Myspace.
Studies show that commercial digital games are more than just a pastime; they foster cognitive, affective, and psychomotor competencies. For this reason, digital games have been used in education for decades, harnessing their motivational power to achieve learning effects.
The goal of this thesis is to develop game mechanics for a building physics social game. Starting from the identification of game mechanics, based on an analysis of how existing popular social games work, and from a basic pedagogical understanding of Digital Game Based Learning (DGBL), game mechanics are developed that can be used to convey building physics competencies.
Design of a Player Model for an Extensible Game Platform for Education in Building Physics
(2012)
In the project "Intelligentes Lernen" (Intelligent Learning), the chairs of Content Management and Web Technologies, Virtual Reality Systems, and Building Physics at the Bauhaus-Universität Weimar are developing innovative information technologies for eLearning environments. In the subareas of retrieval, extraction, and visualization of large document collections, as well as simulation- and plan-based knowledge transfer, algorithms and tools are investigated to make eLearning systems more powerful and thus optimize learning success.
In the area of simulation-based knowledge transfer, the project's goal is the development of a multiplayer online game (MOG) to support education in building physics.
In this bachelor's thesis, a player model for managing player-specific data is designed for this digital learning software and integrated into the existing framework. The focus of the work lies in organizing the skills the player has learned and in selecting suitable game tasks adapted to the player's state of knowledge. For use in the eLearning domain, the extensibility of the model with new learning complexes is an essential requirement.
Experimental investigation of a method for the optimal positioning of reference sensors in experimental modal analysis with output-only methods according to Brehm (2011). The influence of the positioning and number of the reference sensors and of the positioning of the roving sensors is investigated, using the stochastic subspace method to evaluate the output-only measurement data.
Meshfree methods (MMs) such as the element free Galerkin (EFG) method have gained popularity because of some advantages over other numerical methods such as the finite element method (FEM). A group of problems that has attracted a great deal of attention from the EFG method community includes the treatment of large deformations and of strong discontinuities such as cracks. One efficient way to model cracks is to add special enrichment functions to the standard shape functions, as in the extended FEM within the FEM context and in the cracking particles method based on the EFG method. It is well known that explicit time integration in dynamic applications is conditionally stable. Furthermore, in enriched methods, the critical time step may tend to very small values, leading to computationally expensive simulations. In this work, we study the stability of enriched MMs and propose two mass-lumping strategies. We then show that the critical time step for enriched MMs based on lumped mass matrices is of the same order as the critical time step of MMs without enrichment. Moreover, we show that, in contrast to the extended FEM, even with a consistent mass matrix the critical time step does not vanish when the crack directly crosses a node.
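The effect of mass lumping on the stable step size can be illustrated on a single unenriched linear bar element, using the usual estimate dt_crit = 2/omega_max. This is a minimal sketch with assumed material data; the enriched formulation studied in the paper is not reproduced here:

```python
import numpy as np

# Assumed steel bar properties (illustrative only).
E, rho, A, L = 210e9, 7800.0, 1e-4, 0.1
c = np.sqrt(E / rho)  # 1D wave speed

# Two-node linear bar element: stiffness, consistent mass, row-sum lumped mass.
K = (E * A / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])
M_cons = (rho * A * L / 6.0) * np.array([[2.0, 1.0], [1.0, 2.0]])
M_lump = np.diag(M_cons.sum(axis=1))

def dt_crit(K, M):
    """Critical explicit time step 2/omega_max from the generalized
    eigenproblem K v = omega^2 M v (here via M^-1 K)."""
    omega2 = np.linalg.eigvals(np.linalg.solve(M, K)).real.max()
    return 2.0 / np.sqrt(omega2)

print(dt_crit(K, M_cons) / (L / c))  # ~0.577, i.e. 1/sqrt(3)
print(dt_crit(K, M_lump) / (L / c))  # ~1.0: lumping enlarges the stable step
```

Already in this unenriched case, row-sum lumping enlarges the critical time step relative to the consistent mass matrix; the paper's point is that suitable lumping keeps the step of enriched methods at this same order.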
Civil engineers take advantage of models to design reliable structures. In order to fulfill the design goal with a certain amount of confidence, the utilized models should be able to predict the probable structural behavior under the expected loading schemes. Therefore, a major challenge is to find models which provide less uncertain and more robust responses. The problem becomes twofold when the model to be studied is a global model comprised of different interacting partial models. This study aims at model quality evaluation of global models, with a focus on frame-wall systems as the case study. The paper presents the results of the first step taken toward accomplishing this goal. To start the model quality evaluation of the global frame-wall system, the main element (i.e. the wall) was studied through nonlinear static and dynamic analysis using two different modeling approaches: the fiber section model and the Multiple-Vertical-Line-Element-Model (MVLEM). The influence of the wall aspect ratio (H/L) and the axial load on the response of the models was studied. The results from nonlinear static and dynamic analysis of both models are presented and compared. The models produced quite different responses in the range of low aspect ratio walls under large axial loads, owing to the different contribution of shear deformations to the top displacement. In the studied cases, the results implied that careful attention should be paid to the model quality evaluation of wall models, specifically when they are to be coupled to other partial models, such as a moment frame or a soil-footing substructure, whose response is sensitive to shear deformations. In this case, even a high quality wall model would not result in a high quality coupled system, since it fails to interact properly with the rest of the system.
A simple multiscale analysis framework for heterogeneous solids based on a computational homogenization technique is presented. The macroscopic strain is linked kinematically to the boundary displacement of a circular or spherical representative volume which contains the microscopic information of the material. The macroscopic stress is obtained from the energy principle between the macroscopic and microscopic scales. The new method is applied to several standard examples to demonstrate its accuracy and consistency.
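The averaging step at the heart of such a scheme can be illustrated in its simplest possible form: a one-dimensional iso-strain (Voigt) average over two phases. The phase data are illustrative assumptions; the paper's boundary-value formulation on a circular or spherical volume is not reproduced here:

```python
import numpy as np

# Illustrative two-phase microstructure (assumed data, 1D simplification).
vol_frac = np.array([0.7, 0.3])      # matrix / inclusion volume fractions
E_phase = np.array([3.0e9, 70.0e9])  # phase Young's moduli [Pa]
eps_macro = 1e-3                     # prescribed macroscopic strain

# Iso-strain assumption: every phase sees the macroscopic strain.
sigma_micro = E_phase * eps_macro

# Macroscopic stress as the volume average of the microscopic stresses:
# <sigma> = sum_i v_i * sigma_i
sigma_macro = np.sum(vol_frac * sigma_micro)
E_eff = sigma_macro / eps_macro
print(E_eff / 1e9)  # effective modulus in GPa: 0.7*3 + 0.3*70 = 23.1
```

A full computational homogenization replaces the iso-strain assumption with a boundary-value problem on the representative volume, but the macroscopic stress is still recovered by the same volume-averaging idea.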
The aim of this study is to show an application of model robustness measures for soil-structure interaction (henceforth written as SSI) models. Model robustness is a measure of the ability of a model to provide useful answers for input parameters which typically have a wide range in geotechnical engineering. The calculation of SSI is a major problem in geotechnical engineering, and several different models exist for its estimation; these can be separated into analytical, semi-analytical and numerical methods. This paper focuses on numerical models of SSI, specifically macro-element type models and more advanced finite element models using a contact description via continuum or interface elements. A brief description of the models used is given in the paper. Following this description, the applied SSI problem is introduced: a statically loaded shallow foundation with an inclined load. The different partial models for the SSI effects are assessed using different robustness measures in a numerical application. The paper investigates the capability of these measures to assess the model quality of SSI partial models. A variance-based robustness approach and a mathematical robustness approach are applied. These robustness measures are used in a framework which also allows the investigation of computationally expensive models. Finally, the results show that the concept of using robustness approaches combined with other model-quality indicators (e.g. model sensitivity or model reliability) can lead to a unified model-quality assessment for SSI models.
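A variance-based robustness measure of this kind can be sketched as follows. The definition below is an assumption for illustration (less response variance over the scattered input range means a more robust model), not the paper's exact measure:

```python
import random

def robustness(model, sampler, n=10_000, seed=0):
    """Monte Carlo estimate of a variance-based robustness score in (0, 1]:
    sample the uncertain input, evaluate the model, and map the response
    variance to a score that is high when the variance is low."""
    rng = random.Random(seed)
    ys = [model(sampler(rng)) for _ in range(n)]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    return 1.0 / (1.0 + var)

# Two toy 'settlement models' of the same foundation (hypothetical):
flat = lambda x: 0.1 * x   # weak input dependence -> more robust
steep = lambda x: 2.0 * x  # strong input dependence -> less robust
sample = lambda rng: rng.gauss(1.0, 0.2)  # scattered soil parameter

print(robustness(flat, sample) > robustness(steep, sample))  # True
```

In practice the sampler would draw the geotechnical input parameters from their measured scatter, and the score would be combined with sensitivity and reliability indicators as the paper suggests.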
Non-destructive techniques for damage detection have become a focus of engineering interest in recent years. However, applying these techniques to large, complex structures such as civil engineering buildings still has limitations, since these structures are unique and the methodologies often need a large number of specimens for reliable results; cost and time can therefore greatly influence the final results.
Model Assisted Probability Of Detection (MAPOD) has taken its place among damage identification techniques, especially with advances in computing capacity and modeling tools. Nevertheless, the essential condition for a successful MAPOD is having a reliable model in advance, which opens the door to model assessment and model quality problems. In this work, an approach is proposed that uses Partial Models (PM) to compute the Probability Of damage Detection (POD). A simply supported beam, which can be structurally modified and tested under laboratory conditions, is taken as an example. The study includes both experimental and numerical investigations, the application of vibration-based damage detection approaches, and a comparison of the results obtained from tests and simulations.
Finally, a methodology to assess the reliability and robustness of the models is proposed.
The investigations documented in this book deal with the development of methods for the algorithmic solution of layout tasks in an architectural context. Layout here denotes the aesthetically and functionally meaningful arrangement of spatial elements, e.g. of parcels, buildings, and rooms at particular scales. The investigations were carried out within a research project funded by the Deutsche Forschungsgemeinschaft (German Research Foundation).
It is well known that complex quaternion analysis plays an important role in the study of higher order boundary value problems of mathematical physics. Following the ideas given for real quaternion analysis, the paper deals with certain orthogonal decompositions of the complex quaternion Hilbert space into its subspaces of null solutions of Dirac type operator with an arbitrary complex potential. We then apply them to consider related boundary value problems, and to prove the existence and uniqueness as well as the explicit representation formulae of the underlying solutions.
The quality of cladding elements has a marked effect on the fire resistance of metal-stud wall constructions. This work therefore investigated the influence of additives in gypsum boards with regard to a possible improvement of this property.
For this purpose, special test specimens adapted to the respective test conditions were produced using a wide variety of additives. Their effects were assessed primarily by means of the following five criteria:
1) the time of the temperature rise after dewatering of the specimen,
2) the maximum temperature on the back of the board,
3) the size and number of cracks,
4) the stability of the board after thermal exposure,
5) the shrinkage of prismatic specimens.
Of particular importance was characterizing the effects of a simulated fire exposure of 970 °C over 90 minutes on laboratory gypsum boards. The temperature change on the back of the board was recorded continuously over the entire test period. The cohesion of the boards after thermal exposure was, for the first time, evaluated quantitatively via the number and size of the cracks formed in the specimens. Cracking is caused by the reduction of the specimen volume as the water of crystallization is driven off. Since this parameter cannot be determined in the board test, the length-change behavior of prisms after 90 minutes of tempering at 1000 °C in a muffle furnace was additionally determined.
The addition of 80 g/m² of glass fibers and 7.75 % limestone powder proved particularly beneficial to the behavior of gypsum boards under fire exposure. This improvement is mainly attributable to the higher stability and lower shrinkage of the board.
Based on the laboratory-scale results, formulation proposals for improving the fire resistance of gypsum boards under practical conditions were developed. The required large-format boards were produced on the production line of Knauf Gips KG. These boards, installed as a wall construction with two layers of cladding, successfully passed a full-scale test. Lower deflection of the wall construction, reduced volume shrinkage of the boards, and increased board stability demonstrate the improved properties of this modified fire-protection board.
Further investigations showed that it is irrelevant whether the boards were produced on the basis of natural or FGD gypsum, or with a high or low weight per unit area. The clearly best result, a fire resistance of 118 minutes, was achieved by a wall construction made of fire-protection boards based on a stucco plaster of 100 % FGD gypsum, with 83.9 g/m² of glass fibers and 1 % vermiculite, a weight per unit area of 10.77 kg/m², and a board thickness of 12.5 mm.
The target fire resistance of 120 minutes with two layers of cladding and no insulation could be reached in the future if the volume reduction can be compensated even better and the board stability increased. One possibility is to substitute the cardboard layers on both sides with a glass-fiber fleece wrapping; with the gypsum core reinforced with glass fibers, the W112 wall construction without insulation then achieves a fire resistance of well over 120 minutes.
The curing of a pavement concrete is of particular importance for achieving a high freeze-thaw and de-icing salt resistance of the finished pavement. In the exposed-aggregate concrete method, curing takes place in several steps. A first curing step protects the concrete against evaporation until the retarded surface mortar is brushed off. This is followed by a second curing step, usually by spraying on a liquid curing compound.
The second curing step is decisive for the freeze-thaw and de-icing salt resistance of the pavement. Within a research project, it was therefore investigated to what extent an optimization of the second curing step can increase the freeze-thaw and de-icing salt resistance of exposed-aggregate concrete surfaces, particularly when using cements containing ground granulated blast-furnace slag. Even a single wet curing treatment produced a significantly higher resistance of the exposed-aggregate concrete to combined freeze-thaw and de-icing salt attack.
Thin-walled cylindrical composite shell structures are often applied in aerospace for lighter and cheaper launcher transport systems. These structures are sensitive to geometrical imperfections and prone to buckling under axial compression. Today the design is based on NASA guidelines from the 1960s [1], using a conservative lower-bound curve embodying many experimental results of that time. It is well known that the advantages and different characteristics of composites, as well as the evolution of manufacturing standards, are not considered appropriately in this outdated approach. The DESICOS project was initiated to provide new design guidelines that account for the advantages of composites and allow further weight reduction of space structures by guaranteeing a more precise and robust design.
Therefore it is necessary, among other things, to understand how cutouts of different dimensions affect the buckling load of a thin-walled cylindrical shell structure in combination with initial geometric imperfections. This work aims to identify a ratio between the characteristic dimension of the cutout (in this case the cutout diameter) and the characteristic dimension of the structure (in this case the cylinder radius) that indicates whether the buckling behavior is dominated by the initial imperfections or by the cutout.