56.03 Methods in Civil Engineering
Document Type
- Conference Proceeding (599)
- Article (143)
- Doctoral Thesis (28)
- Master's Thesis (4)
- Diploma Thesis (1)
Institute
- Professur Informatik im Bauwesen (467)
- In Zusammenarbeit mit der Bauhaus-Universität Weimar (173)
- Graduiertenkolleg 1462 (34)
- Institut für Strukturmechanik (ISM) (30)
- Professur Angewandte Mathematik (18)
- Institut für Konstruktiven Ingenieurbau (IKI) (8)
- Professur Baubetrieb und Bauverfahren (8)
- Professur Baumechanik (7)
- Professur Stahlbau (6)
- Professur Stochastik und Optimierung (5)
Keywords
- Computerunterstütztes Verfahren (287)
- Architektur <Informatik> (198)
- CAD (164)
- Angewandte Mathematik (144)
- Angewandte Informatik (143)
- Computer Science Models in Engineering; Multiscale and Multiphysical Models; Scientific Computing (72)
- Modellierung (65)
- Bauwerk (50)
- Finite-Elemente-Methode (46)
- Building Information Modeling (38)
The aim of subproject C5 of Collaborative Research Centre (SFB) 398 at Ruhr-Universität Bochum is the development of a lifetime-oriented design model for steel structures. The reference system of the last funding period was a single-bay steel hall with crane runways. This contribution presents the results of a sensitivity analysis of the steel frame under crane travel loads. These traffic loads were described as Poisson-driven pulse processes, fully decoupled from other external influences such as wind and snow; the random properties of the crane travel loads were captured by the random variables pulse duration and pulse intensity. To minimise the computational cost of the lifetime analysis, several system parameters were fixed a priori on the basis of parameter studies of their dynamic effect. In a subsequent fatigue analysis, the damage contributions in a frame corner caused by individual pulse load events were computed and stored in a database. A concluding Distance-Controlled Monte Carlo simulation of the pulse processes over a maximum period of 100 years converted the realisations of the individual pulse loads into partial damages using this database, which were summed to a total damage according to the Palmgren-Miner hypothesis. The influence of the crane loads on the design of steel halls was quantified by comparing the computed lifetime with the planned service life.
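The core of such a lifetime simulation can be sketched in a few lines: pulses arrive as a Poisson process, each pulse contributes a partial damage, and Miner's rule sums them linearly. The arrival rate, the lognormal intensity model and the damage-per-event relation below are illustrative assumptions, not values from the project:

```python
import random

def simulate_total_damage(rate_per_year, years, damage_of_pulse, seed=0):
    """One Monte Carlo realisation: crane-load events arrive as a
    Poisson pulse process; every pulse gets a random intensity, and
    Miner's rule sums the partial damages linearly."""
    rng = random.Random(seed)
    t, damage = 0.0, 0.0
    while True:
        t += rng.expovariate(rate_per_year)        # Poisson inter-arrival times
        if t > years:
            break
        intensity = rng.lognormvariate(0.0, 0.25)  # assumed intensity model
        damage += damage_of_pulse(intensity)       # partial damage per event
    return damage

# hypothetical damage-per-event relation; the project instead looks up
# precomputed damage entries in a database
total = simulate_total_damage(200.0, 100.0, lambda s: 1e-5 * s ** 3)
```

In the actual procedure the lifetime is the time at which the accumulated damage reaches 1; here the sketch only accumulates damage over a fixed horizon.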
In this contribution the software design and implementation of an analysis server for the computation of failure probabilities in structural engineering is presented. The structures considered are described in terms of an equivalent Finite Element model, the stochastic properties, like e.g. the scatter of the material behavior or the incoming load, are represented using suitable random variables. Within the software framework, a Client-Server-Architecture has been implemented, employing the middleware CORBA for the communication between the distributed modules. The analysis server offers the possibility to compute failure probabilities for stochastically defined structures. Therefore, several different approximation (FORM, SORM) and simulation methods (Monte Carlo Simulation and Importance Sampling) have been implemented. This paper closes in showing several examples computed on the analysis server.
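Of the methods named above, crude Monte Carlo simulation is the simplest to sketch; the limit state function and distributions below are a toy example, not the server's models (FORM, SORM and Importance Sampling are not shown):

```python
import random

def mc_failure_probability(g, sample, n=100_000, seed=1):
    """Crude Monte Carlo estimate of the failure probability:
    P_f ~ (# samples with g(x) <= 0) / n, where g is the limit
    state function and g <= 0 denotes failure."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(sample(rng)) <= 0.0)
    return fails / n

# toy limit state: resistance R ~ N(10, 1) versus load S ~ N(6, 1), g = R - S
def sample_rs(rng):
    return (rng.gauss(10.0, 1.0), rng.gauss(6.0, 1.0))

pf = mc_failure_probability(lambda x: x[0] - x[1], sample_rs)
```

For this toy case the exact failure probability is about 0.0023, which the estimate approaches as n grows.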
The design of safety-critical structures exposed to cyclic excitation demands non-degrading or limited-degrading behavior during extreme events. Among other factors, the structural behavior is mainly determined by the number of plastic cycles completed during the excitation. Existing simplified methods often ignore this dependency, or assume or require sufficient cyclic capacity. The paper introduces a new performance-based design method that explicitly considers a predefined number of re-plastifications, utilizing approaches from shakedown theory and signal processing methods. The paper introduces the theoretical background, explains the steps of the design procedure and demonstrates the applicability with the help of an example. This project was supported by the German Science Foundation (Deutsche Forschungsgemeinschaft, DFG).
Limit state design of hybrid structures with meshless methods using mathematical optimization
(2003)
The revitalization of existing structures is among the frequent tasks in urban reconstruction processes. Adaptation to new requirements commonly entails substantial changes in the general configuration of structures. The resulting revitalized structures are characterized by a hybrid design, where old and new, identical or diverse materials and members are coupled in different ways. In the planning stage, the treatment of these systems leads to the application of complex, hybrid mechanical models. High-performance numerical instruments have to be applied for solving not only analysis but also targeted design problems. Because of the hybrid character of the mechanical models in revitalization planning processes, the use of hybrid technologies is advantageous. In this paper a mixed-domain technique is used for coupling the element-free Galerkin (EFG) method and finite elements (FE). The derived models are adopted for the design of nonlinearly loaded hybrid structures. The investigations show a good adaptability of the meshless methods to the design of hybrid structures using optimization strategies. With this method the advantages of both finite element and meshless methods can be exploited where they are most suitable. With a minimum number of unknowns while maintaining adequate quality of the results, the application of mixed finite element and meshless methods is a promising alternative to traditional methods in structural analysis and optimization.
Many problems related to data integration in AEC can be better tackled by an approach that takes into account the heterogeneity of tasks, models and applications but does not require continuous consistency of the evolving design data, at each data management operation. Such an approach must incorporate adequate services that can facilitate reintegration of concurrently modified data at reasonably selected coordination points. In this paper we present a set of methods which, used in combination, can achieve that goal. After a description of the principal envisaged cooperative work scenario each of these methods is discussed in detail and current observations drawn from their software realisation are given. Whilst the suggested approach is valid for any EXPRESS-based data model, the practical focus of work has been on facilitating IFC-driven integration.
Online teaching in practice
(2003)
The BMBF-funded project PORTIKO (a multimedia teaching and learning platform for the civil engineering degree programme) comprises ten civil engineering institutes in Braunschweig and Dresden as well as the distance-learning programme at TU Dresden. In addition, three staff projects for didactics, teleteaching and the multimedia platform act as service providers and coordinators for the other subprojects. At TU Braunschweig, the Institut für Baustoffe, Massivbau und Brandschutz is responsible, among other things, for the core courses in concrete construction (5th and 6th semester) and for a special specialisation, fire and disaster protection (7th to 9th semester). This contribution first gives an overview of the platform used and the hardware and software techniques employed. It then reports first experiences with online teaching and the different strategic approaches, derived from them, for continuing the work in concrete construction and fire protection after the end of the project. To strengthen the link to practice, two project areas were formed in PORTIKO that are meant to cooperate closely because of the tight relationship between their subjects: the 'Virtual House' and the 'Virtual Infrastructure'. The Virtual House is used to show how the individual subjects, with the help of coordinated exercise examples, demonstrate to students the necessity of teamwork and the interaction of different disciplines. The online platform offers students a wide range of materials, e.g. lecture notes enriched with video clips and photos of experiments and construction sites, audio-annotated lecture slides, old examinations with model solutions, and assignments for homework exercises. Online teaching can also provide students with interactive applications that allow playful experimentation with, and observation of, parameter influences.
Such interactive exercises are built with Java, Flash and Authorware; examples are presented.
Object-oriented modelling techniques are currently offered primarily to developers of CAD systems. Via the steps of OO model analysis and OO software design they produce OO programs which, once compiled, fix the model found by the software engineer. In general, but especially in building design, this approach is unsatisfactory: standardisation of models does not succeed here, the design process is comparatively long, and cooperation between engineering disciplines with different model domains is the rule. Moreover, models in early phases exhibit a high degree of vagueness and abstraction. CAD tools supporting these phases therefore require: instead of a standardised product model, a uniform, cognitively grounded model structuring paradigm, of which object orientation is one possible realisation; an explicit, available domain model for the continuous interpretation of building models; descriptive elements that ease the interpretation of objects and attributes; and a concept for handling vagueness and abstraction. This leads to the following requirements for the development of CAD systems: explicit availability of class objects and their creation and modification at run time; inheritance at class and instance level; extended attribute concepts (facets); support of aggregation as an essential model structuring relation; and availability of OO interfaces both for building CAD systems from tools and for separating model management from model representation. A prominent feature of the object-oriented paradigm is its closeness to the application domain, since phenomena of the treated domain are (meant to be) mirrored analogously in models and programs.
Under the basic assumption that this paradigm is also used by end users to build their own model worlds, FLEXOB aims to create a homogeneous environment that makes the software engineer's model world available to the user for analysis and allows this model world to be extended at a descriptive level. The tool FLEXOB and some essential implementation details are presented in this contribution. It is a C++ class library that can be used either as an object module or as a Windows DLL. Aspects of the usage regime of such flexible model managers are also addressed.
Digital support of planning processes is a current research focus of the Chair of Computer Science in Architecture (InfAR) and the junior professorship of Architectural Informatics at the Faculty of Architecture of the Bauhaus-Universität Weimar. Anchored in the DFG Collaborative Research Centre 524, 'Tools and Constructions for the Revitalisation of Buildings', concepts and prototypes for domain-oriented planning support are being developed. As one aspect of this, this contribution shows how manual building surveying can gain a whole new significance for data acquisition through the use of modern tactile recording methods. The potential of the various methods of coordinate determination with tactile tools is evaluated. From this, a strategy is developed that shows the optimised use of tactile recording, in combination with classical surveying methods, for the different needs as planning progresses. The feasibility of such a concept is demonstrated by case studies and possible workflow scenarios for individual methods. By integrating tactile measuring methods into building surveying it can be achieved that relevant (geometric) information can be integrated into a comprehensive building model, that the survey can again take place in direct contact with the surveyed object, and that the methods can be used simply and easily by all participants, so that building survey and planning become interlinked again.
Previous achievements in integrated information management have concentrated on the interoperability of applications such as CAD, structural analysis or facility management, based on product models that introduce additional application-independent model layers (core models). In recent years it has become clear that, besides the interoperability of autonomous applications, the concurrent processes of model instantiation and evolution have to be modeled, including their relationship to available project resources, persons, legal requirements and communication infrastructure. This paper discusses some basic concepts of an emerging methodology relating the fields of product modeling, project management and workflow systems by elaborating the concept of a process model, which decomposes the project goals into executable activities. Integrated information management systems should be related to process models to detect pending activities, deadlocks and alternatives of execution. In line with the heterogeneous nature of project communication processes, a method for dynamic classification of ad-hoc activities is suggested that complements predefined high-level process definitions. In a brief outline of the system architecture, we show how sophisticated information management systems can be made broadly available using conventional Internet technologies.
The increased implementation of site data capture technologies invariably results in an increase in data warehousing and database technologies to store captured data. However, restricted use of data beyond the initial application could potentially result in a loss of understanding of site processes. This could in turn lead to poor decision making at production, tactical and strategic levels. Concrete usage data have been collected from two piling processes. These data have been analysed and the results highlighted potential improvements that could be made to existing site management and estimating processes. A cost benefit analysis has been used to support decision making at the strategic level where the identified improvements require capital expenditure.
It has been shown that symmetries of moment functions of stochastic processes play an important role in the identification of systems: they provide a group-theoretic method for choosing the model structure and model parameters. In the first stage, a group-theoretic analysis of some fundamental concepts of stochastic dynamics, namely stochastic processes and functional series of Volterra-Wiener type, is undertaken. The analysis of group representations of the moment functions of order m of stochastic processes is the basic, original concept of the work. The following groups play the main role in the models: the symmetric group Sm, the special affine group SAff(m), the general linear groups GL(n, R) and GL(n, C), and their subgroups. In the second stage, the informational entropy is introduced as a measure of the randomness in the identified models. The group-theoretic approach underlines the unity of nonlinear system identification and leads to useful engineering results within second-order (stochastic) theory.
For the dynamic behavior of lightweight structures such as thin shells and membranes exposed to fluid flow, the interaction between the two fields is often essential. Computational fluid-structure interaction provides a tool to predict this interaction and to complement or possibly replace expensive experiments. Partitioned analysis techniques enjoy great popularity for the numerical simulation of these interactions, due to their computational superiority over simultaneous, i.e. fully coupled monolithic, approaches: they allow the independent use of suitable discretization methods and modular analysis software. For the fluid we use GLS-stabilized finite elements on a moving domain, based on the incompressible unsteady Navier-Stokes equations, where the formulation guarantees geometric conservation on the deforming domain. The structure is discretized by nonlinear, three-dimensional shell elements.
Commonly used sequential staggered coupling schemes may exhibit instabilities due to the so-called artificial added mass effect. As the best remedy to this problem, subiterations should be invoked to guarantee kinematic and dynamic continuity across the fluid-structure interface. Since iterative coupling algorithms are computationally very costly, their convergence rate is decisive for their usability. To ensure and accelerate convergence, the updates of the interface position are relaxed. The time-dependent 'optimal' relaxation parameter is determined automatically, without any user input, by exploiting a gradient method or applying an Aitken iteration scheme.
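The Aitken-relaxed interface iteration can be illustrated on a scalar fixed-point problem; here x stands in for the interface position and f for one fluid-plus-structure sweep, both illustrative stand-ins rather than the paper's solvers:

```python
import math

def aitken_fixed_point(f, x0, omega0=0.5, tol=1e-12, max_iter=100):
    """Relaxed fixed-point iteration x <- x + omega * r with the scalar
    Aitken update of the relaxation factor omega, as used to accelerate
    subiterations in partitioned coupling schemes."""
    x, r_old, omega = x0, None, omega0
    for _ in range(max_iter):
        r = f(x) - x                              # interface residual
        if abs(r) < tol:
            break
        if r_old is not None and r_old != r:
            omega = omega * r_old / (r_old - r)   # Aitken Delta^2 update
        x += omega * r                            # relaxed update
        r_old = r
    return x

# toy contraction x = cos(x): the Aitken update accelerates convergence
root = aitken_fixed_point(math.cos, 1.0)
```

In the vector-valued FSI setting the scalar quotient becomes a projection of successive residual differences, but the structure of the update is the same.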
Decentralised microcontroller-based measurement technology in the building control systems of historic buildings
(2003)
The background is historic library buildings whose holdings are endangered by unfavourable indoor climatic conditions. So that building-automation measures can be planned, the environmental quantities must be recorded over a longer period. The measures specifically target climate control, with its tasks of heating regulation and ventilation. Such a measurement system can be realised with the Beck 'IPC@CHIP' microcontroller. The measurement system developed at the FH Fulda records air temperature, humidity, air pressure and UV intensity every three minutes; these are considered the quantities affecting the organic materials of the holdings. The measured values are preprocessed in an electronic circuit and passed to a microcontroller, which archives the data continuously over a period of up to seven years. With its internal flash memory and Ethernet interface, the microcontroller can serve the collected measurements as a web server. When the measurement system is connected to the Internet, the climate histories can be displayed in an ordinary web browser; a visualisation front end was created for this in Java. It offers a wide range of data evaluation options, including access to every individual measurement, to daily, monthly and annual means, and the calculation of minima and maxima. The flash memory is also used to store documentation and operating instructions, which are thus retrievable over the network. The entire measurement system is integrated into a book and has been recording the climatic conditions of the Herzogin-Anna-Amalia-Bibliothek in Weimar since August 2002. Once the long-term measurements are complete, it can be decided what contribution a building automation system with integrated climate control can make to preserving the value of the holdings.
For the analysis and planning of transport networks, detailed information concerning travel sequences is required. The paper examines an activity chain model to determine the stochastic travel demand that an individual generates in order to participate in an activity, or a sequence of activities, over the day. The transition from one activity to another depends on the activity participation decision, which is made sequentially and is constrained in space and time. Activity chains derived from travel survey data are used as a basis to develop a more realistic model than conventional discrete trip models. Probabilistic concepts and statistical procedures are used to estimate characteristics of travel demand such as the number of daily out-of-home activities and the transition probabilities between activities for homogeneous behavioural groups. In addition, the paper compares the model for two towns of different size.
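The sequential transition decisions can be sketched as a simple Markov chain over activities; the activity set and probabilities below are invented placeholders, not survey estimates, and the space-time constraints of the full model are omitted:

```python
import random

# hypothetical transition probabilities between activities; in the model
# these would be estimated from travel survey data per behavioural group
TRANSITIONS = {
    "home":     [("work", 0.6), ("shopping", 0.3), ("home", 0.1)],
    "work":     [("shopping", 0.3), ("leisure", 0.2), ("home", 0.5)],
    "shopping": [("leisure", 0.2), ("home", 0.8)],
    "leisure":  [("home", 1.0)],
}

def simulate_chain(rng, max_steps=10):
    """Draw one daily activity chain by sequential transition decisions,
    starting at home and ending when the individual returns home."""
    chain, state = ["home"], "home"
    for _ in range(max_steps):
        activities, probs = zip(*TRANSITIONS[state])
        state = rng.choices(activities, probs)[0]
        chain.append(state)
        if state == "home":
            break
    return chain

chain = simulate_chain(random.Random(1))
```

Repeated draws of such chains yield the distribution of daily out-of-home activity counts that the paper estimates from data.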
Steel structural design is an integral part of the building construction process, and various design methods have been applied in practice to satisfy the design requirements. This paper applies Differential Evolution algorithms to automate design synthesis and rationalize the design process, and demonstrates their capacity to handle continuous and/or discrete optimization of steel structures. The goal of this study is to propose an optimal design of steel frame structures using built-up I-sections and/or a combination of standard hot-rolled profiles. For all steel frame structures optimized in this paper, the algorithm produced solutions better than the original design by the manufacturer. Considering the criteria regarding the quality and efficiency of practical design, optimal design with Differential Evolution algorithms can completely replace conventional design because of its excellent performance.
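The basic Differential Evolution loop (here the common DE/rand/1/bin variant) is compact; the sketch below minimises a toy continuous objective in place of the frame-sizing problem, and all parameter values are conventional defaults rather than the paper's settings:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=300, seed=0):
    """Minimal DE/rand/1/bin sketch for continuous minimisation:
    mutate with a scaled difference of two population members,
    crossover binomially, and keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)           # ensure >= 1 mutated gene
            trial = [
                min(max(pop[a][d] + F * (pop[b][d] - pop[c][d]),
                        bounds[d][0]), bounds[d][1])
                if rng.random() < CR or d == j_rand else pop[i][d]
                for d in range(dim)
            ]
            f_trial = f(trial)
            if f_trial <= cost[i]:                # greedy selection
                pop[i], cost[i] = trial, f_trial
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# toy continuous objective (sphere function), optimum at the origin
x_best, f_best = differential_evolution(lambda v: sum(t * t for t in v),
                                        [(-5.0, 5.0)] * 3)
```

For discrete section catalogues, the continuous trial values would additionally be rounded or mapped onto the available profile list before evaluation.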
Using results from the FABEL project, it is shown what contributions methods of artificial intelligence, in particular knowledge processing, can make to the design of complex buildings. Specialised knowledge-intensive methods and general case-based methods for the retrieval and reuse of earlier designs are presented as examples. Questions of integrating knowledge, cases and data are discussed. The prototype of the FABEL project uses the metaphor of a virtual construction site to offer the various methods as planning tools integrated in a CAD system. A planning model provides additional orientation for the planner. The results are of interest for the design of complex one-of-a-kind buildings, but should also be relevant as a supplement to electronically offered catalogues.
Peculiarities of the renovation of industrial enterprises under conditions of economic self-sufficiency
(1997)
Problems of reorienting the building complex toward a sharply increased share of reconstruction work, capital repair and modernisation of industrial plants are considered in this work. A concept for the development and creation of a unified system for the operation and renovation of industrial plants is worked out. The system is based on computer and database technology and takes real economic relations into account.
A practical framework for generating cross-correlated fields with a specified marginal distribution function, autocorrelation function and cross correlation coefficients is presented in the paper. The contribution extends a recent journal paper [1]. The approach relies on well-known series expansion methods for the simulation of a Gaussian random field. The proposed method requires all cross-correlated fields over the domain to share an identical autocorrelation function, and the cross correlation structure between each pair of simulated fields to be defined simply by a cross correlation coefficient. These relations result in specific properties of the eigenvectors of the covariance matrices of the discretized fields over the domain. These properties are used to decompose the eigenproblem, which must normally be solved in computing the series expansion, into two smaller eigenproblems; this decomposition represents a significant reduction of the computational effort. Non-Gaussian components of a multivariate random field are simulated via memoryless transformation of underlying Gaussian random fields, for which the Nataf model is employed to modify the correlation structure. In this method the autocorrelation structure of each field is fulfilled exactly, while the cross correlation is only approximated. The associated errors can be computed before performing the simulations; it is shown that they occur mainly in the cross correlation between distant points and that they are negligibly small in practical situations.
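The series-expansion step for a single Gaussian field can be sketched directly from the spectral decomposition of the discretised covariance matrix; the exponential autocorrelation, the grid and the correlation length below are illustrative assumptions (the paper's contribution, the decomposition across cross-correlated fields, is not reproduced here):

```python
import numpy as np

def gaussian_field_samples(x, corr_len, n_samples, rng):
    """Series-expansion sketch for one field: discretise an (assumed)
    exponential autocorrelation on the points x, take the spectral
    decomposition C = Phi L Phi^T, and synthesise standard Gaussian
    realisations H = Phi sqrt(L) xi with iid standard normal xi."""
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    lam, phi = np.linalg.eigh(C)
    lam = np.clip(lam, 0.0, None)        # guard tiny negative eigenvalues
    xi = rng.standard_normal((len(x), n_samples))
    return phi @ (np.sqrt(lam)[:, None] * xi)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
H = gaussian_field_samples(x, corr_len=2.0, n_samples=2000, rng=rng)
```

Truncating the expansion to the largest eigenvalues, and exploiting the shared eigenvectors across cross-correlated fields, is where the computational savings described above come from.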
Scalarization methods are a category of multiobjective optimization (MOO) methods. These methods allow the usage of conventional single objective optimization algorithms, as scalarization methods reformulate the MOO problem into a single objective optimization problem. The scalarization methods analysed within this thesis are the Weighted Sum (WS), the Epsilon-Constraint (EC), and the MinMax (MM) method. After explaining the approach of each method, the WS, EC and MM are applied, a-posteriori, to three different examples: to the Kursawe function; to the ten bar truss, a common benchmark problem in structural optimization; and to the metamodel of an aero engine exit module.
The aim is to evaluate and compare the performance of each scalarization method that is examined within this thesis. The evaluation is conducted using performance metrics, such as the hypervolume and the generational distance, as well as using visual comparison.
The application to the three examples gives insight into the advantages and disadvantages of each method, and provides further understanding of an adequate application of the methods concerning high dimensional optimization problems.
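The Weighted Sum method described above can be sketched in a few lines: each weight vector collapses the two objectives into one scalar objective, and sweeping the weights traces the (convex part of the) Pareto front. The biobjective test functions and the grid-search solver below are illustrative stand-ins, not the thesis examples:

```python
def weighted_sum_front(f1, f2, solve, weights):
    """Weighted Sum sketch: each weight turns the biobjective problem
    into a single-objective one; sweeping the weights approximates
    (the convex part of) the Pareto front."""
    front = []
    for w in weights:
        x = solve(lambda t: w * f1(t) + (1.0 - w) * f2(t))
        front.append((f1(x), f2(x)))
    return front

# toy biobjective problem on [0, 1]: f1 = x^2, f2 = (x - 1)^2; a plain
# grid search stands in for a real single-objective optimizer
def grid_min(g):
    xs = [i / 1000.0 for i in range(1001)]
    return min(xs, key=g)

front = weighted_sum_front(lambda x: x * x, lambda x: (x - 1.0) ** 2,
                           grid_min, [i / 10.0 for i in range(11)])
```

The known weakness of the Weighted Sum method, its inability to reach non-convex parts of the front, is exactly what motivates comparing it against the Epsilon-Constraint and MinMax methods.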
The development of the qualitative methods of investigating dynamic systems suggested by the authors is an effective means of identifying dynamic systems. The paper presents the results of extensive investigations of the behaviour of linear dynamic systems, and of a symmetric system with a double-well potential, under polyharmonic excitation. The phase space of a dynamic system is multi-dimensional: each point of this space is characterised by no fewer than four coordinates, namely displacement, velocity, acceleration and time. Real space has three dimensions and is more convenient for analysis, so we limit the phase space to three dimensions: displacement, velocity and acceleration. Other choices of phase-plane parameters are also possible [1, 2]. The phase trajectory in a plane is of greatest interest. It is known that accelerations are more sensitive than displacements to deviations of oscillations from harmonic ones; in the acceleration, power criteria are interpreted most clearly, and the acceleration-displacement dependence mirrors the elastic characteristic about the axis of the diagram. Only the phase trajectories allow the type and level of nonlinearity of a system to be established. The use of the given phase trajectories enables us to determine, with a high degree of reliability, the following features: presence or absence of nonlinear behaviour of the dynamic system; the type of nonlinearity; and the type of dynamic process (oscillations of the fundamental tone, combination oscillations, chaotic oscillations).
Unlike existing asymptotic and stochastic methods of identifying dynamic systems, the suggested technique does not involve a significant amount of computational procedure, and it has a number of advantages in the investigation of complicated oscillations.
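The (displacement, velocity, acceleration) triples that form these phase trajectories are easy to generate numerically; the sketch below integrates an undamped double-well oscillator as an illustrative choice of symmetric system, not the authors' particular model:

```python
def duffing_trajectory(x0, v0, dt=0.001, steps=20000):
    """Velocity-Verlet sketch of an undamped, unforced double-well
    oscillator x'' = x - x^3; returns the (displacement, velocity,
    acceleration) phase-space points whose planar projections give
    the phase trajectories discussed above."""
    def acc(x):
        return x - x ** 3                 # double-well restoring force
    x, v, a = x0, v0, acc(x0)
    trajectory = [(x, v, a)]
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = acc(x)
        v += 0.5 * (a + a_new) * dt       # velocity update
        a = a_new
        trajectory.append((x, v, a))
    return trajectory

# start inside the right-hand potential well
trajectory = duffing_trajectory(x0=0.5, v0=0.0)
```

Plotting the acceleration against the displacement for such a trajectory directly exposes the cubic nonlinearity, which is the diagnostic use the authors describe.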
The reduction of the oscillation amplitudes of structural elements is necessary not only to maintain their durability and longevity but also to eliminate the harmful effect of oscillations on people and on technological operations; dampers are widely applied for this purpose. One of the most widespread models of structural friction forces, with a piecewise-linear dependence on displacement, was analysed. The author suggests mapping the phase trajectories in the "acceleration - displacement" plane. Unlike trajectories mapped in the "velocity - displacement" plane, they do not require a large number of geometric constructions to identify the characteristics of dynamic systems, which improves accuracy. The analytical assumptions were verified by numerical modeling; the results show good agreement between the numerical and analytical estimates of the dissipative characteristic.
Pre-stressed structural elements are widely used in large-span structures; as a rule, they have higher stiffness characteristics. Pre-stressed rods can be applied as girders of different purposes, and as their separate parts, e.g. rods of trusses and frames. Among the numerous ways of prestressing, the compression of girders, trusses and frames by tightenings of high-strength materials is in common use.
This work presents a discrete event simulator that can be used for the automated sequencing and optimization of building processes. The sequencing is based on the commonly used component-activity-resource relations, taking structural and process constraints into account. For the optimization, a genetic algorithm approach was developed, implemented and successfully applied to several real-life steel constructions. In this contribution we discuss the application of the discrete event simulator, including its optimization capabilities, to a 4D process model of the steel structure of an automobile recycling facility.
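The core event loop of such a simulator can be sketched with a priority queue of finish events: an activity starts as soon as its predecessors are finished and a resource is free. The activity names, durations and precedence relations below are a hypothetical toy sequence, and the component-activity-resource modelling and the genetic algorithm of the paper are not reproduced:

```python
import heapq

def simulate_schedule(durations, preds, crews):
    """Discrete-event sketch: an activity starts as soon as all of its
    predecessors are finished and a crew is free; returns finish times."""
    started, finished, events = set(), {}, []
    t, free = 0.0, crews

    def dispatch():
        nonlocal free
        for a in sorted(durations):               # deterministic tie-break
            if a not in started and free > 0 and preds[a] <= finished.keys():
                started.add(a)
                free -= 1
                heapq.heappush(events, (t + durations[a], a))

    dispatch()
    while events:
        t, a = heapq.heappop(events)              # next finishing activity
        finished[a] = t
        free += 1
        dispatch()
    return finished

# hypothetical steel-erection sequence executed by a single crew
finish = simulate_schedule(
    {"columns": 3.0, "bracing": 1.0, "beams": 2.0, "deck": 2.0},
    {"columns": set(), "bracing": {"columns"},
     "beams": {"columns"}, "deck": {"beams"}},
    crews=1,
)
```

A genetic algorithm would act on top of such a simulator by varying the dispatch priorities or sequences and using the simulated makespan as the fitness.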
What is nowadays called (classic) Clifford analysis consists in the establishment of a function theory for functions belonging to the kernel of the Dirac operator. While such functions can very well describe problems of a particle with internal SU(2)-symmetries, higher order symmetries are beyond this theory. Although many modifications (such as Yang-Mills theory) were suggested over the years they could not address the principal problem, the need of a n-fold factorization of the d’Alembert operator. In this paper we present the basic tools of a fractional function theory in higher dimensions, for the transport operator (alpha = 1/2 ), by means of a fractional correspondence to the Weyl relations via fractional Riemann-Liouville derivatives. A Fischer decomposition, fractional Euler and Gamma operators, monogenic projection, and basic fractional homogeneous powers are constructed.
The theory of random matrices, or random matrix theory (RMT in what follows), was developed at the beginning of the fifties to describe the statistical properties of energy levels of complex quantum systems [1], [2], [3]. In the early eighties it enjoyed renewed interest, since it was recognized as a very useful tool in the study of numerous physical systems; specifically, it is very useful in the analysis of chaotic quantum systems. In fact, in recent years many papers have appeared on the problem of quantum chaos, which concerns the quantization of systems whose underlying classical dynamics is irregular (i.e. chaotic). The simplest models considered in this field are billiards of various shapes. From the classical point of view, a point particle in a 2-dimensional billiard displays regular or irregular motion depending on the shape of the billiard; for instance, motion in a rectangular or circular billiard is regular thanks to the symmetries of the boundary. On the other hand, billiards of arbitrary shape imply chaotic motion, i.e. exponential divergence of initially nearby trajectories. In order to study quantum billiards we have to consider the Schroedinger equation in various 2-dimensional domains. The eigenvalues of the Schroedinger equation represent the allowed energy levels of our quantum particle in the billiard under consideration, while the eigenfunction norms represent the probability density of finding the particle in a certain position. The question of quantum chaos is whether the character of the classical motion (regular or chaotic) can influence some properties
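The standard RMT diagnostic alluded to above, the nearest-neighbour spacing statistics of the energy levels, can be sketched with a random symmetric (GOE-like) matrix; the matrix size and normalisation below are illustrative choices:

```python
import numpy as np

def goe_spacings(n, rng):
    """Nearest-neighbour eigenvalue spacings of a GOE-like random
    symmetric matrix, normalised to unit mean spacing. Spectra of
    chaotic systems show level repulsion (hardly any small spacings),
    while regular systems follow Poisson spacing statistics."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / 2.0                  # real symmetric random matrix
    eigenvalues = np.linalg.eigvalsh(h)  # returned in ascending order
    s = np.diff(eigenvalues)
    return s / s.mean()

rng = np.random.default_rng(0)
s = goe_spacings(400, rng)
```

A proper analysis would first unfold the spectrum to a locally uniform mean density before comparing the spacing histogram with the Wigner surmise; the sketch skips that step.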
The subject of this talk is the problem of surface design based upon a mesh that may contain both triangular and quadrangular domains. We investigate the cases in which such a combined mesh proves preferable to a pure triangulation for bivariate data interpolation. First we describe a modification of the well-known flipping algorithm that constructs a locally optimal combined mesh with a predefined quality criterion. Then we introduce two quality measures for triangular and quadrangular domains and present the results of a computational experiment that compares the integral interpolation errors and errors in gradients produced by the piecewise surface models generated by the flipping algorithm with the introduced quality measures. The experiment shows that triangular meshes with the Delaunay quality measure provide better interpolation accuracy only if the interpolated function is strictly convex, while a saddle-shaped function is better interpolated by bilinear patches within a combined mesh. For a randomly shaped function, combined meshes demonstrate smaller error values and better stability compared with pure triangulations. Finally we consider other resources for mesh improvement, such as excluding "bad" points from the input set of the mesh generation procedure. Because the function values at these points should not be lost, some linear or bilinear patches are replaced by nonlinear patches that pass through the excluded points.
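The paper's own quality measures are not reproduced here, but the standard empty-circle criterion that drives a Delaunay flipping algorithm can be sketched as follows (an illustrative stand-in, not the authors' code): two triangles sharing an edge have their diagonal flipped whenever the opposite vertex lies inside the circumcircle of its neighbour.

```python
import numpy as np

def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of the
    counter-clockwise triangle (a, b, c) -- the classic incircle test
    used as the Delaunay quality criterion."""
    m = np.array([
        [a[0] - d[0], a[1] - d[1], (a[0] - d[0]) ** 2 + (a[1] - d[1]) ** 2],
        [b[0] - d[0], b[1] - d[1], (b[0] - d[0]) ** 2 + (b[1] - d[1]) ** 2],
        [c[0] - d[0], c[1] - d[1], (c[0] - d[0]) ** 2 + (c[1] - d[1]) ** 2],
    ])
    return bool(np.linalg.det(m) > 0.0)

def should_flip(a, b, c, d):
    """For triangles (a, b, c) and (a, c, d) sharing the edge a-c:
    flip the diagonal if d violates the empty-circle rule."""
    return in_circumcircle(a, b, c, d)
```

A locally optimal mesh in the Delaunay sense is one in which no shared edge triggers this test; the modified algorithm of the paper replaces the incircle test by other quality measures and may keep a quadrangle instead of flipping.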
The stress state of a piecewise-homogeneous elastic body, which has a semi-infinite crack along the interface, under in-plane and antiplane loads is considered. One of the crack edges is reinforced by a rigid patch plate on a finite interval adjacent to the crack tip. The crack edges are loaded with specified stresses. The body is stretched at infinity by specified stresses. External forces with a given principal vector and moment act on the patch plate. The problem reduces to a Riemann-Hilbert boundary-value matrix problem with a piecewise-constant coefficient for two complex potentials in the plane case and for one in the antiplane case. The complex potentials are found explicitly using a Gaussian hypergeometric function. The stress state of the body close to the ends of the patch plate, one of which is also simultaneously the crack tip, is investigated. Stress intensity factors near the singular points are determined.
The individual views of the people involved in the design process on a building product imply different models for planning and calculation. In order to interpret the geometrical, topological and semantic data of a building model, we identify a structural component graph, a graph of room faces, a room graph and a relational object graph as aids, and we explain algorithms to derive these relations. The application of the presented technique is demonstrated by the analysis and discretization of a sample model in the scope of building energy simulation.
The worldwide growth of communication networks and associated technologies provides the basic infrastructure for new ways of executing the engineering process. Collaboration amongst team members separated in time and location is of particular importance. Two broad themes can be recognized in research pertaining to distributed collaboration. One theme focuses on the technical and technological aspects of distributed work, while the other emphasises its human aspects. The case of finite element structural analysis in a distributed collaboratory is examined in this paper. The approach taken has its roots in the human aspects of the structural analysis task. Based on experience of how structural engineers currently approach and execute this task while utilising standard software designed for use on local workstations only, criteria are stated for a software architecture that could support collaborative structural analysis. Aspects of a pilot application and the results of qualitative performance measurements are discussed.
Many real-life problems lead to the solution of a complicated (large-scale, multicriteria, unstable, nonsmooth, etc.) nonlinear optimization problem. In order to cope with large-scale problems and to develop many optimum plans, a hierarchical approach to problem solving may be useful. The idea of hierarchical decision making is to reduce the overall complex problem to smaller and simpler approximate problems (subproblems) which may then be treated independently. One way to break a problem into smaller subproblems is the use of decomposition-coordination schemes. For finding proper values of the coordination parameters in convex programming, some rapidly convergent iterative methods are developed, and their convergence properties and computational aspects are examined. Problems of their global implementation and a polyalgorithmic approach are discussed as well.
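As an illustration of the decomposition-coordination idea (a minimal textbook sketch, not the rapidly convergent methods developed in the paper), consider a separable convex problem whose coupling constraint is handled by a single coordination parameter acting as a price:

```python
def coordinate(a1, a2, c, step=0.4, iters=200):
    """Dual decomposition for  min (x1-a1)^2 + (x2-a2)^2  s.t. x1+x2 = c.
    A coordinator adjusts the price lam; each subproblem is then solved
    independently in closed form: argmin (x-a)^2 + lam*x = a - lam/2."""
    lam = 0.0
    for _ in range(iters):
        x1 = a1 - lam / 2.0            # subproblem 1, solved locally
        x2 = a2 - lam / 2.0            # subproblem 2, solved locally
        lam += step * (x1 + x2 - c)    # coordination: price update
    return x1, x2, lam
```

The simple subgradient price update shown here converges only linearly; the point of the paper is precisely to replace such updates by rapidly convergent iterations for the coordination parameters.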
Methods with convergence order p >= 2 (Newton's method, the tangent hyperbola and tangent parabola methods, etc.) and their approximate variants are studied. Conditions are presented under which the approximate variants preserve the convergence rate intrinsic to these methods, and some computational aspects (possibilities to organize parallel computation, globalization of a method, the solution of linear equations versus matrix inversion at every iteration, etc.) are discussed. Polyalgorithmic computational schemes (hybrid methods) combining the best features of various methods are developed, and possibilities of their application to the numerical solution of two-point boundary-value problems in ordinary differential equations and of decomposition-coordination problems in convex programming are analyzed.
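A minimal illustration (standard textbook methods, not the variants analyzed in the paper): Newton's method next to Steffensen's derivative-free variant, whose difference quotient with step h = f(x) is one classical example of an approximate variant that preserves the quadratic rate.

```python
def newton(f, df, x0, tol=1e-12, maxit=50):
    """Classical Newton iteration x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(maxit):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

def steffensen(f, x0, tol=1e-12, maxit=50):
    """Approximate variant: the derivative is replaced by the
    difference quotient (f(x + f(x)) - f(x)) / f(x).  Because the
    step h = f(x) shrinks with the residual, the quadratic
    convergence rate of Newton's method is preserved."""
    x = x0
    for _ in range(maxit):
        fx = f(x)
        if fx == 0.0:
            return x
        dq = (f(x + fx) - fx) / fx
        x -= fx / dq
        if abs(f(x)) < tol:
            return x
    return x
```

Both iterations solve f(x) = x^2 - 2 to machine accuracy from x0 = 1.5 in a handful of steps; only the second needs no derivative.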
The extended finite element method (XFEM) offers an elegant tool to model material discontinuities and cracks within a regular mesh, so that the element edges do not necessarily coincide with the discontinuities. This allows the modeling of propagating cracks without the requirement of adapting the mesh incrementally. Using a regular mesh offers the advantage that simple refinement strategies based on the quadtree data structure can be used to refine the mesh in regions that require a high mesh density. An additional benefit of the XFEM is that the transmission of cohesive forces through a crack can be modeled in a straightforward way without introducing additional interface elements. Finally, different criteria for the determination of the crack propagation angle are investigated and applied to numerical tests of cracked concrete specimens, which are compared with experimental results.
From a macroscopic point of view, failure within concrete structures is characterized by the initiation and propagation of cracks. In the first part of the thesis, a methodology for macroscopic crack growth simulations for concrete structures using a cohesive discrete crack approach based on the extended finite element method is introduced. Particular attention is paid to the investigation of criteria for crack initiation and crack growth. A drawback of the macroscopic simulation is that the real physical phenomena leading to the nonlinear behavior are only modeled phenomenologically. For concrete, the nonlinear behavior is characterized by the initiation of microcracks which coalesce into macroscopic cracks. In order to obtain a higher resolution of these failure zones, a mesoscale model for concrete is developed that models particles, mortar matrix and the interfacial transition zone (ITZ) explicitly. The essential features are a representation of particles using a prescribed grading curve, a material formulation based on a cohesive approach for the ITZ and a combined damage-plasticity model for the mortar matrix. Compared to numerical simulations, the response of real structures exhibits stochastic scatter, due e.g. to the intrinsic heterogeneities of the structure. For mesoscale models, these intrinsic heterogeneities are simulated by using a random distribution of particles and by simulating spatially variable material parameters using random fields. There are two major problems related to numerical simulations on the mesoscale. First, the material parameters for the constitutive description of the materials are often difficult to measure directly. In order to estimate material parameters from macroscopic experiments, a parameter identification procedure based on Bayesian neural networks is developed which is universally applicable to any parameter identification problem in numerical simulations based on experimental results.
This approach offers information about the most probable set of material parameters based on experimental data, together with information about the accuracy of the estimate. Consequently, it can be used a priori to determine a set of experiments to be carried out in order to fit the parameters of a numerical model to experimental data. The second problem is the computational effort required for mesoscale simulations of a full macroscopic structure. For this purpose, a coupling between the mesoscale and macroscale models is developed. Representative mesoscale simulations are used to train a metamodel that is finally used as a constitutive model in a macroscopic simulation. Special focus is placed on the ability to appropriately simulate unloading.
This thesis deals with the dynamic analysis of the Sprottetal bridge in response to asphalt damage that has occurred. It comprises the creation of an FE model, a presentation of the theoretical foundations of structural dynamics, and the evaluation of computed mode shapes and asphalt stresses with regard to the currently valid design codes.
The article presents an analysis of the stress distribution in a reinforced concrete support beam bracket which is a component of a prefabricated reinforced concrete building. The building structure is a spatial frame in which expansion joints were applied. The proper stiffness of the structure is provided by frames with stiff joints, monolithic lift shafts and staircases. The prefabricated slab floors are supported by beam shelves shaped as an inverted letter 'T'. The beams are supported by the column brackets. In order to lower the storey height and at the same time fulfill the architectural demands, the designer reduced the height of the beam at the support zone. The analyzed case refers to the bracket zone, where a slanted crack on the support beam bracket was observed. It could appear as a result of exceeding the allowable tensile stresses in the reinforced concrete in the bracket zone. It should be noted that the construction solution applied, i.e. the concurrent support of the "undercut" beam on the column bracket, causes a local concentration of stresses in the undercut zone, where the largest transverse forces and tangential stresses occur concurrently. Additional stresses resulting from placing the slab floors on the lower part of the beam shelves are superimposed on those described above.
The paper contains a description of dynamic effects in the silo wall during the outflow of a stored material. The work allows the danger of structural damage due to resonant vibrations to be determined, and is of practical importance in determining the influence of cyclic pressures and vibro-creep during prolonged use of a silo. The paper is the result of tests on silo walls at semi-technical scale. The model is generally applicable and also allows the identification of parameters in real-size silos.
This work considers thick-walled shell structures under static loads. The shell consists of several zones, and in each zone the shell mid-surface is defined by its own system of geometric equations. The strain field retains all six independent components under the assumption that cross-sectional fibres normal to the mid-surface of the unloaded shell remain straight. A three-dimensional isoparametric finite element is proposed, and the computation is carried out using a macro-element technique. The work defines the essential parameters of the shell geometry as well as the corresponding classes of the structural model. A structural information model and an FEM information model are developed, and the information links between the two models are defined. These object-oriented models are implemented in Microsoft Visual C++ 4.0 under Windows 95. An arch-wall structure is considered as a numerical example.
In the ongoing transition from a segmented, drawing-based to an integrated, model-based way of working in the planning and construction of buildings, computer models are no longer used solely for the physical simulation of building behaviour, but also for coordination between the individual planning disciplines and project participants. The joint creation and use of this model as a virtual representation of the building and its construction processes, known as Building Information Modeling (BIM), is a central element of planning. The integration of scheduling into this way of working has so far been inadequate, mostly limited to a downstream 4D simulation for communicating planning results; relative to the additional effort involved, it thus offers too little benefit to the scheduler. The subject of this thesis is the deeper embedding of scheduling in the model-based way of working. Based on a comprehensive analysis of the boundary conditions and the information needs of scheduling, concepts are developed for the efficient reuse of data stored in the model by means of a linking language, for comprehensive data exchange based on the Industry Foundation Classes (IFC), and for change management using object-level versioning. The data relevant to model-based scheduling and their mutual relationships are formally described, and the compatibility of their granularity is ensured by an object-splitting functionality. Algorithms for spatial queries are also developed for the precise extraction of data. The presented concepts and their applicability are demonstrated by an extensive pilot implementation on several practical examples, thereby proving their practical relevance and benefit.
The purpose of this research is to develop a method to retrieve a building name from the impression of the building. First, images of buildings are registered in a database by means of a questionnaire. Next, images of the target building are compared with those in the image database, and the building with the highest overall matching degree is retrieved. This system achieved good retrieval results. Moreover, image processing was applied: the image databases were trained by a neural network on the extracted image features, and a retrieval system based on image processing was examined.
An important feature of the 2003 SARS outbreak in Canada, Singapore, and Hong Kong was that many health care workers (HCWs) developed SARS after caring for patients with SARS. This has been ascribed to inadequate or ineffective patient isolation. However, it is difficult for dense cities to provide sufficient isolation facilities within a short period of time. This has raised concerns from the public for new strategies in the planning and design of isolation facilities. Considering that SARS or other infectious diseases could seriously damage our society’s development, isolation facilities that could be rapidly and economically constructed with appropriate environmental controls are essential. For this reason, the design team of the Department of Architecture collaborated with a special task force from the Faculty of Medicine, who are the frontline medical officers treating the SARS patients, to design Rapidly Assembled Isolation Patient Wards. Both architecture and medicine are well established disciplines, but they have little in common in terms of the mode of knowledge construction and practice. This induced much intellectual exploration and research interest in conducting this study. The process has provided an important reference for cross disciplinary studies between the architectural and medical domains.
The paper proposes a new method for general 3D measurement and 3D point reconstruction. The method explicitly aims at practical applications: its features include low technical expense and minimal user interaction, a clear separation of the problem into steps that are solved by simple mathematical methods (direct, stable and optimal in the least-squares sense), and scalability. The method expects as inputs the internal and radial distortion parameters of the camera(s) used, and a plane quadrangle with known geometry within the scene. First, for each single picture the 3D position of the reference quadrangle (with respect to each camera coordinate frame) is calculated. These 3D reconstructions of the reference quadrangle are then used to obtain the relative external parameters of each camera with respect to the first one. With known external parameters, triangulation is finally possible. The differences from other known procedures are outlined, with attention paid to the stable mathematical methods (no use of nonlinear optimization) and the low user interaction combined with good results.
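The plane-based pose step described above can be sketched with standard linear algebra (an illustrative reconstruction under the usual pinhole assumptions, not the authors' implementation): the homography of the reference quadrangle is estimated by a direct linear transform and decomposed into the camera-relative rotation and translation.

```python
import numpy as np

def homography(world_xy, img_uv):
    """Direct linear transform: 3x3 homography from four point
    correspondences lying on the world plane z = 0."""
    A = []
    for (X, Y), (u, v) in zip(world_xy, img_uv):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)          # null vector of A

def plane_pose(K, world_xy, img_uv):
    """Pose (R, t) of the reference quadrangle from one image, using
    H ~ K [r1 r2 t] for points on the plane z = 0."""
    H = np.linalg.inv(K) @ homography(world_xy, img_uv)
    H = H / np.linalg.norm(H[:, 0])      # fix scale: r1 has unit norm
    if H[2, 2] < 0:                      # keep the plane in front
        H = -H
    r1, r2, t = H[:, 0], H[:, 1], H[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # project onto rotations
    return U @ Vt, t
```

With noise-free correspondences the pose is recovered exactly; with real measurements the SVD steps give the least-squares solution without any nonlinear optimization, matching the spirit of the method.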
Complex gridshell structures used in architecturally ambitious constructions remain as appealing as ever in the public realm. This paper describes the theory and approach behind the software realisation of a tool which helps in finding the affine self-weight geometry of gridshell structures. The software tool DOMEdesign supports the formal design process of lattice and grid shell structures based upon the laws of physics. The computer-aided simulation of suspension models is used to derive structurally favourable forms for domes and arches subject to compression load, based upon the input of simple architectonic parameters. Irregular plans, three-dimensional topography, a choice of different kinds of shell lattice structures and the desired height of the dome are examples of design parameters which can be used to modify the architectural design. The provision of data export formats for structural dimensioning and visualisation software enables engineers and planners to use the data in further planning and to communicate the design to the client.
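The suspension-model idea can be illustrated with the force density method, a common linear form-finding technique (a minimal sketch under assumed values, not the DOMEdesign implementation): a hanging chain under self-weight settles into the funicular shape, and inverting the sag gives a pure-compression arch.

```python
import numpy as np

def funicular_chain(n_free=9, q=1.0, load=-1.0):
    """Force-density form finding for a hanging chain with both ends
    fixed at height zero: solve D z = p, where D is the tridiagonal
    force-density Laplacian, q the force density of each link and
    p the nodal load.  Inverting the sag (z -> -z) yields the
    compression arch shape."""
    D = np.zeros((n_free, n_free))
    for i in range(n_free):
        D[i, i] = 2.0 * q
        if i > 0:
            D[i, i - 1] = -q
        if i < n_free - 1:
            D[i, i + 1] = -q
    p = np.full(n_free, load)
    return np.linalg.solve(D, p)         # heights of the free nodes
```

Because the equations are linear in the nodal coordinates, arbitrary plans and support heights only change the right-hand side and the boundary terms, which is what makes this family of methods attractive for interactive design tools.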
Since the 1990s, the Pascal matrix, its generalizations and applications have been the focus of a great number of publications. As is well known, the Pascal matrix, the symmetric Pascal matrix and other special matrices of Pascal type play an important role in many scientific areas, among them numerical analysis, combinatorics, number theory, probability, image processing, signal processing, electrical engineering, etc. We present a unified approach to matrix representations of special polynomials in several hypercomplex variables (new Bernoulli, Euler etc. polynomials), extending results of H. Malonek, G. Tomaz: Bernoulli polynomials and Pascal matrices in the context of Clifford Analysis, Discrete Appl. Math. 157(4)(2009) 838-847. The hypercomplex version of a new Pascal matrix with block structure, which resembles the ordinary one for polynomials of one variable, is discussed in detail.
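As a small illustration of the matrices involved (standard facts about the one-variable case, not the hypercomplex extension of the paper), the symmetric Pascal matrix factors as the product of the lower triangular Pascal matrix with its transpose, a consequence of the Vandermonde convolution identity:

```python
import numpy as np
from math import comb

def lower_pascal(n):
    """Lower triangular Pascal matrix, L[i][j] = C(i, j)."""
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)], float)

def symmetric_pascal(n):
    """Symmetric Pascal matrix, S[i][j] = C(i+j, i)."""
    return np.array([[comb(i + j, i) for j in range(n)] for i in range(n)], float)
```

Since L is unitriangular, the factorization S = L L^T shows at once that every symmetric Pascal matrix has determinant 1.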
This abstract proposes an instrumental system for the analysis of mechanics problems of the deformable solid body. It allows the researcher to describe, within one task, the input data for the object under analysis and the problem scheme based on variational principles. A particular feature of the system is the ability to describe an object of arbitrary geometric shape and to define the computational scheme programmatically for the purpose at hand. The methods allow problems with an indefinite functional and indefinite geometry of the object (or a set of objects) to be computed. The system supports problems with an indefinite scheme based on the finite element method (FEM); the restrictions of the system are therefore those of the FEM itself. In contrast to other known FEM programs (ANSYS, LS-DYNA, etc.), the described system is more flexible in defining the input data and choosing the computational scheme. An original subsystem for the analysis of numerical results is built in; it can visualise all numerical results, plot diagrams of the unknown variables, etc. The subsystem has been validated on two- and three-dimensional problems of elasticity and plasticity under geometrically nonlinear conditions. Contact problems of statics and dynamics are also discussed.
The digital support of planning processes is a current research focus of the Chair of Computer Science in Architecture (InfAR) and the junior professorship of Architectural Informatics at the Faculty of Architecture of the Bauhaus-Universität Weimar. Rooted in the DFG collaborative research centre 524 "Tools and constructions for the revitalisation of buildings", concepts and prototypes for domain-oriented planning support are being developed. As one aspect of this work, this paper presents the vision of an incrementally growing geometry model for computer-aided building surveying which accompanies the surveyor from the first site visit onwards. The geometric information obtained in each phase of the survey is to be reused, refined or corrected in the subsequent phases; surveying techniques and the geometry model are closely coupled. Different views of a shared geometry model aim to let the user exploit the advantages of planar representations without losing the three-dimensional overview or the corresponding spatial manipulations. The geometry model is embedded in a dynamic building model. This paper addresses building surveying under the following assumptions: - the survey serves to prepare planning within existing buildings - only one level of accuracy (in the range of +/- 10 cm) is supported - the geometric representation of the surveyed building is based exclusively on planar surfaces
The paper describes a concept for the step-by-step computer-aided capture and representation of geometric building data in the context of planning-oriented building surveying. Selected aspects of the concept have been implemented and tested as prototypes. The process of step-by-step capture and representation is determined by the order in which the user experiences the building. Only the information that the user knows (can see) or can reasonably deduce is represented. In addition, approaches to the flexible combination of different measuring techniques and geometric abstractions are described which are based upon geodetic computational adjustment.
Experience from past arch dam designs shows that shape optimization of arch dams has significant practical value, since it can fully exploit material characteristics and reduce construction costs. Suitable variables need to be chosen to formulate the objective function, e.g. minimizing the total volume of the arch dam. Additionally, a series of constraints is derived and a reasonable and convenient penalty function is formed, which can easily enforce the characteristics of the constraints and of the optimal design. As the optimization method, a genetic algorithm is adopted to perform a global search, while ANSYS is used for the mechanical analysis under coupled thermal and hydraulic loads. One of the constraints of the newly designed dam is to fulfill requirements on structural safety; therefore, a reliability analysis is applied to support decisions concerning predictions of both the safety and the service life of the arch dam. In this way, the key factors which significantly influence the stability and safety of the arch dam can be identified, providing a good way to take preventive measures to prolong the service life of an arch dam and enhance the safety of the structure.
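A toy sketch of the penalty-function genetic search described above (illustrative only: the two-variable objective and the constraint below are stand-ins, not the arch dam model or its ANSYS analysis):

```python
import random

def penalised_fitness(x, limit=1.0, weight=100.0):
    """Toy objective: minimise the 'volume' x0*x1 subject to a
    'stress' constraint 1/x0 + 1/x1 <= limit, enforced by adding a
    quadratic penalty for any constraint violation."""
    volume = x[0] * x[1]
    stress = 1.0 / x[0] + 1.0 / x[1]
    return volume + weight * max(0.0, stress - limit) ** 2

def genetic_search(pop=40, gens=150, lo=0.5, hi=10.0, seed=1):
    """Elitist real-coded GA: blend crossover + Gaussian mutation."""
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi), rng.uniform(lo, hi)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=penalised_fitness)
        elite = P[: pop // 2]                  # survivors
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()                   # blend crossover
            child = [w * a[i] + (1 - w) * b[i] for i in range(2)]
            i = rng.randrange(2)               # mutate one gene
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.2)))
            children.append(child)
        P = elite + children
    return min(P, key=penalised_fitness)
```

For this toy problem the constrained optimum is x0 = x1 = 2 with volume 4; the penalty term steers the population back towards the feasible region, which is the role the penalty function plays in the dam optimization.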
Recently, many active control systems for building structures have been developed on the basis of modern control theory and installed in real buildings. The authors have previously proposed intelligent fuzzy optimal active control (IFOAC) systems. IFOAC systems imitate intelligent activities of the human brain such as prediction, adaptation and decision-making, and can take objective and subjective judgements on the active control into account. However, IFOAC systems are best suited to far-field earthquakes; their control effect becomes small for near-field earthquakes, which contain a few velocity pulses with large amplitudes. To improve the control effect for near-source earthquakes, the authors have also proposed hybrid control (HC) systems, in which an IFOAC system is combined with a fuzzy control system. In HC systems, the fuzzy control system is introduced as a reflective fuzzy active control (RFAC) system that imitates the spinal reflexes of humans, and active control forces are applied to buildings according to switching rules. In this paper, the fuzzy control rules of the RFAC system and the switching rules of the active control forces in the HC system are optimized by parameter-free genetic algorithms (PfGAs), using different earthquake inputs. Digital simulations show that the HC system can effectively reduce maximal response displacements under restrictions on actuator strokes in the case of a near-source earthquake; the effectiveness of the proposed HC system is discussed and clarified.
The truss model for predicting the shear resistance of reinforced concrete beams has often been criticized for underestimating the concrete shear strength, especially for beams with low shear reinforcement. Two challenges are commonly encountered in any truss model and are responsible for its inaccurate shear strength prediction: first, the cracking angle is usually assumed empirically, and second, the shear contribution of the arching action is usually neglected. This research introduces a novel approach, using an artificial neural network (ANN) to accurately evaluate the shear cracking angle of reinforced and prestressed concrete beams. The model inputs include the beam geometry, concrete strength, shear reinforcement ratio and the prestressing stress, if any. ...
This paper proposes the application of a two-parameter damage model, based on a nonlinear finite element approach, to the analysis of masonry panels. Masonry is treated as a homogenized material whose material characteristics are defined by a homogenization technique. Masonry panels subjected to shear loading are studied using the proposed procedure within the framework of three-dimensional analyses. The nonlinear behaviour of masonry is modelled using concepts of damage theory: an adequate damage function is defined to take into account the different response of masonry under tension and compression. Cracking can therefore be interpreted as a local damage effect, defined by the evolution of known material parameters and by one or several functions which control the onset and evolution of damage. The model takes into account all the important aspects which should be considered in the nonlinear analysis of masonry structures, such as stiffness degradation due to mechanical effects and the objectivity of the results with respect to the finite element mesh. Finally, the proposed damage model is validated by comparison with experimental results available in the literature.
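The damage concept can be illustrated in its simplest scalar form (a 1D sketch with assumed parameter values, not the paper's two-parameter masonry model): the stress is the elastic stress reduced by a factor (1 - d), where the damage variable d grows from 0 to 1 once a strain threshold is exceeded.

```python
import math

def damage_stress(strain, E=30e3, eps0=1e-4, epsf=5e-4):
    """1D isotropic damage law: sigma = (1 - d) * E * eps, with an
    exponential softening evolution of the damage variable d once the
    history variable kappa passes the threshold eps0.  E in MPa,
    strains dimensionless; parameter values are illustrative only."""
    kappa = abs(strain)
    if kappa <= eps0:
        d = 0.0                              # undamaged, linear elastic
    else:
        d = 1.0 - (eps0 / kappa) * math.exp(-(kappa - eps0) / (epsf - eps0))
    return (1.0 - d) * E * strain, d
```

In the paper's setting the single threshold is replaced by separate damage functions for tension and compression, which is what allows the asymmetric response of masonry to be captured.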
This paper deals with the modelling and analysis of masonry vaults. Numerical FEM analyses are performed using the LUSAS code. Two vault typologies are analysed (barrel and cross-ribbed vaults), parametrically varying geometrical proportions and constraints. The proposed model and the developed numerical procedure are implemented in a computer analysis. Numerical applications are developed to assess the effectiveness of the model and the efficiency of the numerical procedure. The main objective of the present paper is the development of a computational procedure which allows the 3D structural behaviour of masonry vaults to be defined. For each investigated example, the homogenized limit analysis approach has been employed to predict the ultimate load and failure mechanisms. Finally, both a mesh dependence study and a sensitivity analysis are reported. The sensitivity analysis is conducted by varying the mortar tensile strength and the mortar friction angle over a wide range, with the aim of investigating the influence of the mechanical properties of the joints on the collapse load and failure mechanisms. The proposed computer model is validated by comparison with experimental results available in the literature.
This paper presents the results of applying a fuzzy inference system to estimate the number of potential Park and Ride users. This number is usually difficult to evaluate because it depends on human factors and the data in the considered system are uncertain; traditional mathematical approaches cannot handle such rough data, so a fuzzy approach is applied. A fuzzy methodology is a suitable way to describe the choice of transport mode, especially since the uncertainty accompanying the choice process has a fuzzy character. The proposed approach is based on the Mamdani fuzzy inference system; calculations use Matlab with the Fuzzy Logic Toolbox. The Mamdani model requires knowledge of the shapes of the membership functions as input. These functions can be calibrated using the results of questionnaires conducted among users of the Park and Ride system; due to the lack of a representative sample of users, the results of expert questionnaires were used instead to calibrate the membership function shapes. The describing factor is the generalized cost of the trip for the different modes of transport. The approach consists of two main stages: modeling the share of public/private transport trips, and a multimodal model estimating the number of Park and Ride users. Verification of the methodology is treated as an indirect proof: the approach is applied to estimate a bi-modal split, and the results are compared with traditional approaches based on logit functions. The comparable results of the proposed fuzzy approach and the traditional logit models can be treated as a confirmation of the chosen methodology.
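A two-rule Mamdani system of the kind described can be sketched in a few lines (illustrative membership functions and rules, not the ones calibrated from the expert questionnaires; plain Python instead of the Fuzzy Logic Toolbox):

```python
import numpy as np

def mamdani_share(cost_ratio):
    """Two-rule Mamdani sketch for the P+R mode choice:
       IF trip cost is low  THEN P+R usage is high
       IF trip cost is high THEN P+R usage is low
    Min implication, max aggregation, centroid defuzzification.
    cost_ratio in [0, 1]: generalised P+R trip cost relative to the
    competing mode (illustrative scaling)."""
    u = np.linspace(0.0, 1.0, 201)                     # usage universe
    low_cost = np.interp(cost_ratio, [0, 1], [1, 0])   # rule firing
    high_cost = np.interp(cost_ratio, [0, 1], [0, 1])  # strengths
    usage_high = np.interp(u, [0.3, 1.0], [0, 1])      # output fuzzy
    usage_low = np.interp(u, [0.0, 0.7], [1, 0])       # sets
    agg = np.maximum(np.minimum(low_cost, usage_high),
                     np.minimum(high_cost, usage_low))
    return float((u * agg).sum() / agg.sum())          # centroid
```

The output varies smoothly with the generalised cost, which is the behaviour a calibrated logit curve would also show, and is why the two families of models can be compared as the paper does.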
Indentation experiments have been carried out over the past century to determine the hardness of materials. Modern indentation machines can continuously monitor load and displacement with high precision and accuracy. In recent years, research interest has focused on methods to extract material properties from indentation load-displacement curves. Analytical methods to interpret these curves are difficult to formulate due to material and geometric nonlinearities as well as complex contact interactions. In the present study, an artificial neural network model was constructed for the interpretation of indentation load-displacement curves. Large-strain, large-deformation finite element analyses were first carried out to simulate indentation experiments. The data from the finite element analyses were then used to train the artificial neural network model. The model was able to accurately determine the material properties when presented with load-displacement curves that were not used in the training process. The proposed artificial neural network model is robust and directly relates the characteristics of the indentation load-displacement curve to the elasto-plastic material properties.
A concept of non-commutative Galois extension is introduced, and binary and ternary extensions are considered. Non-commutative Galois extensions of the Nonion algebra and of su(3) are constructed. Ternary and binary Clifford analysis are then introduced for non-commutative Galois extensions, and the corresponding Dirac operators are constructed.
This paper describes monitoring of the in-valley discharge and underground water level at the place where the tunnel will be constructed and also, the numerical analysis for prediction applying the Tank Model and Linear Filter Method to calculate the prediction. The application of these analyses has actually allowed the change of underground water level to be grasped and more effective information system to be established by comparing the real-time monitoring data with the real-time calculation of prediction.
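The Tank Model family mentioned above can be illustrated with its simplest member, a single linear tank whose storage drains proportionally to its content. The parameter value and rainfall series below are illustrative assumptions, not the site data.

```python
# A single linear tank: storage S receives rainfall and drains at rate k*S,
# so the discharge is Q = k*S. This is a sketch of the Tank Model idea only.

def tank_discharge(rain, k=0.2, s0=0.0):
    """Simulate discharge for a rainfall series (one value per time step)."""
    s, q = s0, []
    for r in rain:
        s = s + r - k * s   # water balance: inflow minus linear outflow
        q.append(k * s)
    return q

# Under constant rainfall the tank approaches steady state, where Q -> rain rate.
q = tank_discharge([2.0] * 200, k=0.2)
```

Real Tank Models cascade several such tanks with multiple outlets; the linear-filter step of the paper would then relate observed to predicted discharge series.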
The aim of this study is to show an application of model robustness measures for soil-structure interaction (henceforth SSI) models. Model robustness is a measure of the ability of a model to provide useful answers for input parameters which typically have a wide range in geotechnical engineering. The calculation of SSI is a major problem in geotechnical engineering, and several different models exist for its estimation; these can be separated into analytical, semi-analytical and numerical methods. This paper focuses on numerical SSI models, specifically macro-element type models and more advanced finite element models using a contact description with continuum or interface elements. A brief description of the models used is given in the paper. Following this description, the applied SSI problem is introduced: the observed event is a statically loaded shallow foundation with an inclined load. The different partial models considering the SSI effects are assessed using different robustness measures during the numerical application. The paper investigates the capability of these measures to assess the model quality of SSI partial models. A variance-based robustness approach and a mathematical robustness approach are applied. These robustness measures are used in a framework which also allows the investigation of computationally expensive models. Finally, the results show that the concept of combining robustness approaches with other model-quality indicators (e.g. model sensitivity or model reliability) can lead to a unified model-quality assessment for SSI models.
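The variance-based idea can be sketched in a few lines: a partial model whose answer varies little over the wide geotechnical input range is rated more robust. The measure R = 1/(1 + Var(output)) and both "partial models" below are assumptions for demonstration, not the SSI models of the paper.

```python
# Illustrative variance-based robustness measure for two toy partial models.
import random

random.seed(1)

def robustness(model, samples):
    """R = 1 / (1 + Var(model output over the sampled input range))."""
    out = [model(x) for x in samples]
    mean = sum(out) / len(out)
    var = sum((o - mean) ** 2 for o in out) / len(out)
    return 1.0 / (1.0 + var)

# Friction angle sampled over a wide geotechnical range (illustrative).
phi = [random.uniform(20.0, 40.0) for _ in range(1000)]

sensitive = lambda p: 0.5 * p            # response strongly driven by phi
insensitive = lambda p: 10.0 + 0.05 * p  # response nearly constant

r_sens = robustness(sensitive, phi)
r_insens = robustness(insensitive, phi)
```

The nearly constant model scores close to 1, the strongly parameter-driven model much lower, which is the ranking behaviour a robustness indicator is meant to provide.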
In the design of a structure, the implementation of reliable soil-foundation-structure interaction into the analysis process plays a very important role. The paper presents the determination of the parameters of a suitably chosen soil-foundation model and their influence on the structural response. Since the mechanical data of the structure can be determined with satisfactory accuracy, the properties of the soil-foundation model were identified from the measured dynamic response of the real structure. A simple model describing the soil-foundation behaviour was incorporated into a classical 3-D finite element analysis of the structure using commercial software. Results obtained from measurements on the pier were then compared with those obtained with the finite element model of the pier-foundation-soil system. On the basis of this comparison, the coefficients describing the soil-foundation model were adjusted until the calculated dynamic response coincided with the measured one; in this way, the difference between the two results was reduced to 1%. Full-scale tests measuring the eigenmotion of the bridge were performed through all erection stages of the new bridge in Maribor. Thus an effective and experimentally verified 3-D model for a complex dynamic analysis of the bridge under earthquake loading was obtained. A significant advantage of the model is that it was updated on the basis of the dynamic measurements, i.e. improved using in-situ geomechanical measurements. The model is very accurate in describing the superstructure and economical in describing the soil mass, thus representing an optimal solution with regard to computational effort.
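The updating loop described above (adjust soil-foundation coefficients until the computed dynamic response matches the measured one) can be sketched on a one-degree-of-freedom surrogate. The mass, target frequency and search interval are illustrative assumptions, not the bridge data.

```python
# Adjust a soil-foundation spring stiffness until the computed natural
# frequency of a 1-DOF surrogate matches a "measured" value. Bisection works
# because the frequency grows monotonically with stiffness.
import math

def natural_frequency(k, m):
    """f = sqrt(k/m) / (2*pi) for a single-DOF oscillator."""
    return math.sqrt(k / m) / (2.0 * math.pi)

def identify_stiffness(f_measured, m, k_lo, k_hi, tol=1e-9):
    while k_hi - k_lo > tol * k_hi:
        k_mid = 0.5 * (k_lo + k_hi)
        if natural_frequency(k_mid, m) < f_measured:
            k_lo = k_mid
        else:
            k_hi = k_mid
    return 0.5 * (k_lo + k_hi)

m = 5.0e5                # kg, illustrative pier mass
f_meas = 2.0             # Hz, illustrative measured frequency
k = identify_stiffness(f_meas, m, 1.0e6, 1.0e12)
```

In the paper the "model" is a full 3-D FE analysis and several coefficients are tuned, but the principle is the same: minimize the gap between computed and measured response.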
The frame of this paper is the development of methods and procedures for the description of the motion of an arbitrarily shaped foundation. Since the infinite half-space cannot be properly described by a model of finite dimensions without violating the radiation condition, the basic problems are the infinite dimensions of the half-space as well as its non-homogeneous nature. Consequently, an approach has been investigated that solves this problem indirectly by developing a Green's function in which the non-homogeneity and the infiniteness of the half-space are included. Once the Green's function is known, the next step is the evaluation of the contact stresses acting between the foundation and the surface of the half-space through an integral equation, which is solved over the area of the foundation using the Green's function as the kernel. The derivation of the three-dimensional Green's function for the homogeneous half-space (Kobayashi and Sasaki 1991) has been carried out using the potential method; the partial differential equations occurring in the problem are reduced to ordinary ones through the Hankel integral transform. The general idea for obtaining the three-dimensional Green's function for the layered half-space is similar, but in that case some additional phenomena may occur. One of them is the possible appearance of Stoneley surface waves propagating along the contact surfaces of the layers; their contribution to the final result is in most cases important enough that they should not be neglected. The main advantage of the presented results compared to those obtained with numerical methods is their accuracy, especially in the case of thin layers, because all essential steps of the Green's function evaluation, except the contour integration along the branch cut, are carried out analytically.
On the other hand, the disadvantage of this method is that the mathematical effort for obtaining the Green's function increases drastically with the number of layers. Future work will therefore be directed toward simplifying the process described above.
The sizing of simple resonators like guitar strings or laser mirrors is directly connected to the wavelength and represents no complex optimisation problem. This is not the case with liquid-filled acoustic resonators of non-trivial geometries, where several masses and stiffnesses of the structure and the fluid have to fit together. This creates a scenario of many competing and interacting resonances varying in relative strength and frequency as design parameters change. Hence, the resonator design involves a parameter-tuning problem with many local optima. As a solution, evolutionary algorithms (EAs) coupled to a forced-harmonic FE simulation are presented. A new hybrid EA is proposed and compared to two state-of-the-art EAs on selected test problems. The motivating background is the search for better resonators suitable for sonofusion experiments, where extreme states of matter are sought in collapsing cavitation bubbles.
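The kind of many-local-optima tuning problem described above can be illustrated with a compact (μ + λ) evolution strategy on a standard multimodal test function. The Rastrigin function stands in for the forced-harmonic FE simulation, which is far too expensive to embed here; all algorithm settings are illustrative.

```python
# (mu + lambda) evolution strategy on the Rastrigin function: elitist
# selection, Gaussian mutation, bounded search space.
import math, random

random.seed(42)

def rastrigin(x):
    """Many local optima; global minimum 0 at the origin."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def evolve(dim=2, mu=10, lam=40, sigma=0.5, generations=100, lo=-5.12, hi=5.12):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(mu)]
    history = []
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            parent = random.choice(pop)
            child = [min(hi, max(lo, g + random.gauss(0.0, sigma))) for g in parent]
            offspring.append(child)
        # Elitist (mu + lambda) selection keeps the best-so-far individuals,
        # so the best fitness can never get worse between generations.
        pop = sorted(pop + offspring, key=rastrigin)[:mu]
        history.append(rastrigin(pop[0]))
    return history

hist = evolve()
```

Hybrid EAs of the kind the paper proposes typically add local search or adaptive step sizes on top of such a baseline loop.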
In this paper we evaluate 2D models for the soil-water characteristic curve (SWCC) that incorporate the hysteretic nature of the relationship between volumetric water content θ and suction ψ. The models are based on nonlinear least-squares estimation of experimental data for sand. To estimate the dependent variable θ, the proposed models include two independent variables: suction and the sensor reading position (depth d in the column test). The variable d represents not only the position where suction and water content are measured but also the initial suction distribution before each of the hydraulic loading test phases. As a result, the proposed 2D regression models have the advantage that they (a) can be applied for the prediction of θ at any position along the column and (b) give the functional form of the scanning curves.
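The least-squares estimation step can be sketched in one dimension. The van Genuchten curve form and the synthetic "measurements" below are illustrative assumptions standing in for the sand column data; the paper's 2D models additionally include the depth d as a second regressor.

```python
# Fit a soil-water characteristic curve theta(psi) by least squares, using a
# plain grid search as the simplest nonlinear estimator.

def van_genuchten(psi, alpha, n, theta_r=0.05, theta_s=0.40):
    """van Genuchten SWCC with m = 1 - 1/n (illustrative parameterization)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * psi) ** n) ** m

# Synthetic data generated with alpha = 0.5, n = 2.0.
suctions = [0.1 * i for i in range(1, 60)]
observed = [van_genuchten(p, 0.5, 2.0) for p in suctions]

def sse(alpha, n):
    """Sum of squared errors between model and observations."""
    return sum((van_genuchten(p, alpha, n) - o) ** 2
               for p, o in zip(suctions, observed))

best = min(((a / 10.0, 1.0 + k / 10.0)
            for a in range(1, 20) for k in range(1, 30)),
           key=lambda an: sse(*an))
```

On real, noisy data a gradient-based least-squares solver would replace the grid search, and separate parameter sets would be fitted for the drying and wetting (hysteretic) branches.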
Current investigations into a continuous support of the design process centre on descriptive models of the design objects. These models allow representations to be derived and design results to be passed on. Pragmatic structurings of the design process subdivide it, according to organisational and business aspects (planability and accountability), into a sequence of design phases (HOAI). These structurings do not consider HOW the actual model-creating process takes place, yet clarifying this is the necessary prerequisite for a true CADesign. The paper therefore starts from a unified set of generic design actions. Even when the various design phases and the design activities of the individual engineering disciplines are linked to specific design models, this provides a basis for a methodical foundation of corresponding CAD tools. The methodical procedure resembles that proposed in the form of style guides for the design of graphical user interfaces. Essential practical uses of such basic activities arise for: the systematisation of computer-aided design activities, in particular by extending the descriptive model with an operational one, together with their extended interpretability; the creation of knowledge-based tools for automatic model generation/configuration; and the implementation of powerful UNDO and TMS mechanisms.
Due to technological progress and European standardization (e.g. ISO certification), there is an increasing demand to automate the digital recording of building plans. Currently two approaches are available: (1) labour-intensive redrawing with digitizing tablets or by screen digitizing, and (2) automatic scanning and vectorizing, where vectorization generally demands interactive follow-up treatment due to incomplete or ambiguous results. This paper proposes a knowledge-based approach to building plan analysis. The procedure comprises the following processing steps: scanning of the building plan; enhancement of the image quality and binarization; extraction of lines, line junctions and character fields; knowledge-based interpretation and grouping of image features into domain-specific feature aggregates such as door symbols, stair symbols, etc.; optical character recognition and lexicon-based interpretation of character fields; and matching of feature aggregates with dimension sets. First results of a prototype implementation are presented. Finally, an extension of the approach towards a semantic modeling concept is presented, showing a coupling between 3D object modeling and an explicit 2D modeling of images and technical drawings.
Building information modeling offers a huge potential for increasing the productivity and quality of construction planning processes. Despite its promising concept, this approach has not found widespread use. One of the reasons is the insufficient coupling of the structural models with the general building model. Instead, structural engineers usually set up a structural model that is independent of the building model and consists of mechanical models of reduced dimension. An automatic model generation, which would be valuable in case of model revisions, is therefore not possible. This can be overcome by a volumetric formulation of the problem. A recent approach applied the p-version of the finite element method to this problem; this method, in conjunction with a volumetric formulation, is suited to simulate the structural behaviour of both "thick" solid bodies and thin-walled structures. However, a notable discretization error remains in the numerical models. This paper therefore proposes a new approach for overcoming this situation. It suggests combining isogeometric analysis with the volumetric models in order to integrate the structural design into the digital, building-model-centered planning process and to reduce the discretization error. The concept of isogeometric analysis consists, roughly, in the application of NURBS functions to represent both the geometry and the shape functions of the elements. These functions possess some beneficial properties regarding numerical simulation; their use, however, leads to some intricacies related to the setup of the stiffness matrix. This paper describes some of these properties.
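The NURBS machinery at the heart of isogeometric analysis can be sketched directly: the Cox-de Boor recursion evaluates the B-spline basis, and the rational (NURBS) basis is its weighted normalization. The knot vector, degree and weights below are illustrative choices.

```python
# Cox-de Boor recursion for B-spline basis functions and the rational
# (NURBS) basis built from them.

def bspline_basis(i, p, u, knots):
    """N_{i,p}(u) via the Cox-de Boor recursion (0/0 treated as 0)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_basis(p, u, knots, weights):
    """Rational basis R_i(u) = w_i N_{i,p}(u) / sum_j w_j N_{j,p}(u)."""
    vals = [weights[i] * bspline_basis(i, p, u, knots)
            for i in range(len(weights))]
    total = sum(vals)
    return [v / total for v in vals]

knots = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 3.0, 3.0]   # open knot vector, p = 2
weights = [1.0, 0.8, 1.2, 1.0, 0.9]                 # 5 control points
```

In isogeometric analysis these same functions describe both the geometry and the trial space, which is precisely why the stiffness-matrix integration becomes more intricate than with polynomial Lagrange elements.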
Current building product models explicitly represent components, attributes of components, and relationships between components. These designer-focused product models, however, do not represent many of the design conditions that are important for construction, such as component similarity, uniformity, and penetrations. Current design and construction tools offer limited support for detecting these construction-specific design conditions. This paper describes the ontology we developed using the manufacturing concept of features to represent the design conditions that are important for construction. The feature ontology provides the blueprint for the additions and changes needed to transform a standard product model into a construction-specific product model. The ontology formalizes three classes of features, defines the attributes and functions of each feature type, and represents the relationships between features explicitly. The descriptive semantics of the ontology allows practitioners to represent their varied preferences for naming features, specifying features that result from component intersections and the similarity of components, and grouping features that affect a specific construction domain. A software prototype that implements the ontology enables practitioners to transform designer-focused product models into feature-based product models that represent the construction perspective.
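The shape of such a feature ontology (feature classes with attributes and explicit feature-to-feature relationships) can be sketched as a small class hierarchy. All class and attribute names below are assumptions for illustration, not the paper's schema.

```python
# Minimal sketch of a feature ontology: a base feature class, two derived
# feature types, and explicit links between feature instances.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Feature:
    name: str
    construction_domain: str            # domain the feature affects, e.g. "concrete"
    related: List["Feature"] = field(default_factory=list)

@dataclass
class IntersectionFeature(Feature):
    """Feature resulting from the intersection of components (e.g. a penetration)."""
    components: List[str] = field(default_factory=list)

@dataclass
class SimilarityFeature(Feature):
    """Feature capturing similarity/uniformity of a group of components."""
    components: List[str] = field(default_factory=list)

penetration = IntersectionFeature(
    name="duct penetration", construction_domain="concrete",
    components=["wall-07", "duct-12"])
uniform_walls = SimilarityFeature(
    name="repeated wall type", construction_domain="formwork",
    components=["wall-01", "wall-02", "wall-03"])
uniform_walls.related.append(penetration)   # explicit feature-to-feature link
```

A prototype along the paper's lines would populate such instances automatically from the designer-focused product model rather than by hand.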
A method is presented that allows the measurement of pore sizes in fair-faced concrete surfaces and the assessment of colour impressions to be carried out automatically. Subjective influences on the assessment are excluded, while savings of 80% and more compared with manual counting of voids are demonstrated. The techniques used are a special marking frame, a high-resolution digital camera and dedicated image analysis procedures. The latter provide the necessary preprocessing functions, such as geometric rectification, calibration and brightness equalization, and perform the actual assessments: pore size and pore distribution analysis as well as colour comparisons with defined standards for judging colour homogeneity over the fair-faced concrete surfaces to be analysed. The results are recorded both as images and as data, the latter e.g. in an Excel file. The end-to-end applicability of the method is illustrated on a number of buildings executed in fair-faced concrete, e.g. the Federal Chancellery and the lecture hall centre of TU Dresden. Essential influencing parameters such as illumination geometry and brightness thresholds are discussed, and comparisons with manual measurements are drawn. In further research the presented approach is to be prepared for practical application, and comparative investigations against measurements carried out by experts are to be performed for validation purposes. The approach can contribute to questions of standardization; it is also usable in special areas of the development of concrete mixing technologies and in precast element production. The relevant processing steps are illustrated in the paper with images and tables.
Optimum technological solutions must take into account the entire life cycle of structures, including design procedures as well as quality assurance, inspection, maintenance, and repair strategies. Unfortunately, current design standards do not provide a satisfactory basis for ensuring expected structural lifetimes, which may vary from only a few years for temporary structures to over a century for bridges, water dams or nuclear repositories. Consistent scientific concepts are urgently required to cover this wide spectrum of lifetimes in structural design and maintenance. This was the motivation for a group of scientists at the Ruhr-University Bochum (RUB) to start a special research program, supported since 1996 by the German Research Foundation (DFG) within the Collaborative Research Center SFB 398. Institutes of the University of Wuppertal and of the University of Duisburg-Essen joined the research group. The goal of the Center is to study sources of damage and deterioration in materials and structures, to develop consistent models and simulation methods, to predict structural lifetimes and, finally, to integrate these predictions into new lifetime-oriented design strategies.
Research activities in our center are organised in three Project Groups as follows:
- Modelling of lifetime effects
- Methods for lifetime-oriented structural analyses
- Future lifespan-oriented design strategies.
... WITHOUT RIGHT ANGLE.
(2006)
Currently, sculptural design is one of the most discussed themes in architecture. Due to their light weight, easy transportation and assembly, as well as an almost unlimited structural variety, parameterised spatial structures are excellently suited for the constructive realisation of free-formed claddings. They subdivide the continuous surface into a structure of small nodes, straight members and plane glass panels, and thus provide an opportunity to realise arbitrary double-curved claddings with a high degree of transparency using industrial semi-finished products (steel sections, flat glass). Digital design strategies and a huge number of similar-looking but individually unique structural members demand continuous digital project handling. Within a research project named MYLOMESH, a free-formed spatial structure was designed, constructed, fabricated and assembled, with all required steps carried out on the basis of digital data. Different digital connections (scripts) between various software tools, which are usually not used in the planning process of buildings, were created; they allow a completely digital workflow. The project, its design, meshing and constructive detailing, as well as the above-mentioned scripts, are described in this paper.
Information technology plays a key role in the everyday operation of buildings and campuses. Many proprietary technologies and methodologies can assist in effective Building Performance Monitoring (BPM) and efficient management of building resources. The integration of related tools such as energy simulation packages, facility, energy and building management systems, and enterprise resource planning systems benefits BPM. However, the complexity of integrating such domain-specific systems prevents their common usage. Service Oriented Architecture (SOA) has been deployed successfully in many large multinational companies to create integrated and flexible software systems, but so far this methodology has not been applied broadly to the field of BPM. This paper argues that SOA provides an effective integration framework for BPM. A service-oriented architecture for the ITOBO framework for sustainable and optimised building operation is proposed, and an implementation of a building performance monitoring system is introduced.
DIGITAL SUPPORT OF MATERIAL AND PRODUCT SELECTION IN THE ARCHITECTURAL DESIGN AND PLANNING PROCESS
(2006)
Architecture is predominantly perceived through the surfaces that bound a space. The surface materials used should support the design intention and have to fulfil various technical and economic requirements. If the architect wants to select the "right" or the "best" material, he has to weigh very different and sometimes contradictory criteria and must weight them individually for the specific purpose. This selection process is only insufficiently supported by today's digital systems. If all the various parameters can be expressed as numerical values, the method of multidimensional scaling offers architects a way to find the best-fitting material on the basis of their individual weighting of criteria. By displaying the result of the architect's multidimensional query in a spatial arrangement, multidimensional scaling can support an interactive selection process with additional feedback on the applied search strategy.
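The scaling idea can be sketched directly: materials with weighted criterion distances are embedded in the plane by iteratively reducing the stress, i.e. the mismatch between embedded and target distances. The materials, criterion vectors and weights below are illustrative assumptions.

```python
# Multidimensional scaling sketch: embed materials in 2D by greedy random
# search that monotonically reduces the stress function.
import math, random

random.seed(7)

materials = ["wood", "steel", "glass", "concrete"]
# Criterion vectors (e.g. cost, transparency, warmth), already weighted.
profiles = [[0.2, 0.1, 0.9], [0.8, 0.0, 0.2], [0.5, 1.0, 0.1], [0.6, 0.0, 0.3]]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

target = [[dist(p, q) for q in profiles] for p in profiles]

def stress(pos):
    """Sum of squared differences between embedded and target distances."""
    return sum((dist(pos[i], pos[j]) - target[i][j]) ** 2
               for i in range(len(pos)) for j in range(i + 1, len(pos)))

pos = [[random.random(), random.random()] for _ in materials]
s0 = stress(pos)
s = s0
for _ in range(2000):
    cand = [[x + random.gauss(0.0, 0.1) for x in p] for p in pos]
    sc = stress(cand)
    if sc < s:                 # greedy acceptance keeps the stress monotone
        pos, s = cand, sc
final_stress = s
```

In the interactive setting of the paper, the architect's weighting of criteria changes the target distances, and the re-computed 2D arrangement gives direct visual feedback on the search.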
The paper is devoted to a study of the properties of homogeneous solutions of the massless field equation in higher dimensions. We first treat the case of dimension 4. Here we use the two-component spinor language (developed for the purposes of general relativity). We describe how massless field operators are related to higher spin analogues of the de Rham sequence - the so-called Bernstein-Gel'fand-Gel'fand (BGG) complexes - and how they are related to the twisted Dirac operators. We then study the similar question in higher (even) dimensions, where more tools from the representation theory of the orthogonal group are needed. We recall the definition of the massless field equations in higher dimensions and their relation to higher-dimensional conformal BGG complexes, and then discuss properties of homogeneous solutions of the massless field equation. Using some recent techniques for the decomposition of tensor products of irreducible $Spin(m)$-modules, we are able to add some new results on the structure of the spaces of homogeneous solutions of massless field equations. In particular, we show that the kernel of the massless field equation in a given homogeneity contains at least one specific irreducible submodule.
The problem F|n=2|F is to minimize a given objective function F(C_{1,m}, C_{2,m}) of the completion times C_{i,m} of two jobs i ∈ J = {1, 2} processed on m machines M = {1, 2, …, m}. Both jobs have the same technological route through the m machines. The processing time t_{i,k} of job i ∈ J on machine k ∈ M is known; operation preemptions are not allowed. Let R^{2m} be the space of non-negative 2m-dimensional real vectors t = (t_{1,1}, …, t_{1,m}, t_{2,1}, …, t_{2,m}) with the Chebyshev distance d(t, t*). To solve problem F|n=2|F, we can use the geometric algorithm, which includes the following steps: 1) construct the digraph (V, A) for problem F|n=2|F and find the so-called border vertices in (V, A); 2) construct the set R_t of trajectories corresponding to the shortest paths in digraph (V, A) from the origin vertex to each of the border vertices; 3) find an optimal path in the set R_t that represents a schedule with the minimal value of the objective function F. Let the path t_u ∈ R_t be optimal for the problem F|n=2|F with operation processing times defined by the vector t. If for any small positive real number ε > 0 there exists a vector t* ∈ R^{2m} such that d(t, t*) = ε and the path t_u is not optimal for the problem F|n=2|F with operation processing times defined by t*, then the optimality of the path t_u is not stable. The main result of the paper is the proof of necessary and sufficient conditions for the optimality stability of the path t_u. If the objective function F is continuous and non-decreasing (e.g., makespan, total completion time, maximum lateness or total tardiness), then testing whether the optimality of the path t_u ∈ R_t is stable takes O(m log m) time.
Collaborative Design Processes: A Class on Concurrent Collaboration in Multidisciplinary Design
(2004)
The rise of concurrent engineering in construction demands early team formation and constant communication throughout the project life cycle, but educational models in architecture, engineering and construction have been slow to adjust to this shift in project organization. Most students in these fields spend the majority of their college years working on individual projects that do not build teamwork or communication skills. Collaborative Design Processes (CDP) is a capstone design course where students from the University of Illinois at Urbana-Champaign and the University of Florida learn methods of collaborative design enhanced by the use of information technology. Students work in multidisciplinary teams to collaborate from remote locations via the Internet on the design of a facility. An innovation of this course compared to previous efforts is that students also develop process designs for the integration of technology into the work of multidisciplinary design teams. The course thus combines both active and reflective learning about collaborative design and methods. The course is designed to provide students the experience, tools, and methods needed to improve design processes and better integrate the use of technology into AEC industry work practices. This paper describes the goals, outcomes and significance of this new, interdisciplinary course for distributed AEC education. Differences from existing efforts and lessons learned to promote collaborative practices are discussed. Principal conclusions are that the course presents effective pedagogy to promote collaborative design methods, but faces challenges in both technology and in traditional intra-disciplinary training of students.
This paper is concerned with the numerical treatment of quasilinear elliptic partial differential equations. In order to solve the given equation, we propose a Galerkin approach but, in contrast to conventional finite element discretizations, we work with trial spaces that not only exhibit the usual approximation and good localization properties but, in addition, lead to expansions of any element of the underlying Hilbert spaces in terms of multiscale or wavelet bases with certain stability properties. Specifically, we select as trial spaces a nested sequence of spaces from an appropriate biorthogonal multiscale analysis. This gives rise to a nonlinear discretized system. To overcome the problems of nonlinearity, we make use of the machinery of interpolating wavelets to obtain knot-oriented quadrature rules. Finally, Newton's method is applied to approximate the solution in the given ansatz space. The results of some numerical experiments with different biorthogonal systems, confirming the applicability of our scheme, are presented.
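The overall scheme (Galerkin discretization of a nonlinear elliptic equation, knot-oriented quadrature for the nonlinear term, Newton iteration) can be sketched on a deliberately simple stand-in: hat functions replace the biorthogonal wavelet bases, and the model problem is -u'' + u³ = f on (0,1) with homogeneous Dirichlet conditions. All discretization choices below are illustrative assumptions.

```python
# Galerkin/Newton sketch for -u'' + u^3 = f on (0,1), u(0) = u(1) = 0,
# with hat functions and a nodal ("knot-oriented") quadrature rule for the
# nonlinear term. f is chosen so that u(x) = sin(pi x) is the exact solution.
import math

n = 64                                     # number of intervals
h = 1.0 / n
x = [i * h for i in range(1, n)]           # interior nodes
f = [math.pi ** 2 * math.sin(math.pi * xi) + math.sin(math.pi * xi) ** 3
     for xi in x]

def thomas(sub, diag, sup, rhs):
    """Solve a tridiagonal system by the Thomas algorithm."""
    m = len(rhs)
    cp, dp = [0.0] * m, [0.0] * m
    cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, m):
        den = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / den
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / den
    u = [0.0] * m
    u[-1] = dp[-1]
    for i in range(m - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u

u = [0.0] * (n - 1)                        # initial Newton iterate
for _ in range(20):
    # Residual: stiffness part + lumped (nodal) nonlinear part - load.
    res = []
    for i in range(n - 1):
        lap = (2.0 * u[i] - (u[i - 1] if i > 0 else 0.0)
               - (u[i + 1] if i < n - 2 else 0.0)) / h
        res.append(lap + h * u[i] ** 3 - h * f[i])
    # Jacobian is tridiagonal: stiffness matrix + diag(3 h u_i^2).
    sub = [-1.0 / h] * (n - 1)
    sup = [-1.0 / h] * (n - 1)
    diag = [2.0 / h + 3.0 * h * u[i] ** 2 for i in range(n - 1)]
    du = thomas(sub, diag, sup, res)
    u = [ui - di for ui, di in zip(u, du)]

err = max(abs(ui - math.sin(math.pi * xi)) for ui, xi in zip(u, x))
```

In the paper the hat functions are replaced by a biorthogonal wavelet hierarchy, so the same Newton loop additionally benefits from the multiscale stability and compression properties of the basis.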
We consider a structural truss problem where all of the physical model parameters are uncertain: not just the material values and applied loads, but also the positions of the nodes are assumed to be inexact but bounded and are represented by intervals. Such uncertainty may typically arise from imprecision during the process of manufacturing or construction, or round-off errors. In this case the application of the finite element method results in a system of linear equations with numerous interval parameters which cannot be solved conventionally. Applying a suitable variable substitution, an iteration method for the solution of a parametric system of linear equations is firstly employed to obtain initial bounds on the node displacements. Thereafter, an interval tightening (pruning) technique is applied, firstly on the element forces and secondly on the node displacements, in order to obtain tight guaranteed enclosures for the interval solutions for the forces and displacements.
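The interval viewpoint above can be illustrated with naive interval arithmetic on a toy load-displacement computation. The Interval class, the spring model and all numbers below are illustrative; the paper's parametric iteration and pruning are far more involved.

```python
# Naive interval arithmetic applied to two springs in series under an
# uncertain load: u = F/k1 + F/k2.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __truediv__(self, other):
        assert other.lo > 0, "divisor interval must not contain zero"
        return Interval(self.lo / other.hi, self.hi / other.lo)

F = Interval(9.0, 11.0)       # load in kN, bounded but inexact
k1 = Interval(90.0, 110.0)    # spring stiffnesses in kN/m
k2 = Interval(180.0, 220.0)

# Each occurrence of F is evaluated independently here -- exactly the kind of
# overestimation that the parametric solution and pruning steps attack.
u = F / k1 + F / k2
```

Treating the two occurrences of F as independent intervals inflates the enclosure; exploiting the parameter dependency, as the paper does for node coordinates and loads, tightens it.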
IFC-BASED MONITORING INFORMATION MODELING FOR DATA MANAGEMENT IN STRUCTURAL HEALTH MONITORING
(2015)
This conceptual paper discusses opportunities and challenges in the digital representation of structural health monitoring systems using the Industry Foundation Classes (IFC) standard. State-of-the-art sensor nodes, collecting structural and environmental data from civil infrastructure systems, are capable of processing and analyzing the data sets directly on board. Structural health monitoring (SHM) based on sensor nodes that possess such "on-chip intelligence" is, in this study, referred to as "intelligent SHM", and an infrastructure system equipped with an intelligent SHM system is referred to as "intelligent infrastructure". Although intelligent SHM will continue to grow, there is as yet no well-defined formalism for digitally representing information about the sensors, about the overall SHM system, and about the monitoring strategies being implemented ("monitoring-related information"). Based on a review of available SHM regulations and guidelines as well as existing sensor models and sensor modeling languages, this conceptual paper investigates how to represent monitoring-related information digitally in a semantic model. With the Industry Foundation Classes, an open standard for the digital representation of building information exists; however, monitoring-related information cannot be represented with the current IFC object model. This paper therefore proposes a conceptual approach for extending the IFC object model to include monitoring-related information. Taking civil infrastructure systems as an illustrative example, it becomes possible to adequately represent, process, and exchange monitoring-related information throughout the whole life cycle of civil infrastructure systems, which is referred to as monitoring information modeling (MIM). Since this paper is conceptual, additional research efforts are required to further investigate, implement, and validate the proposed concepts and methods.
Due to the increasing number of wind energy converters, the accurate assessment of the lifespan of their structural parts and of the entire converter system is becoming more and more important. Lifespan-oriented design, inspections and remedial maintenance are challenging because of the complex dynamic behavior of these systems. Wind energy converters are subjected to stochastic turbulent wind loading, causing a correspondingly stochastic structural response and vibrations associated with an extreme number of stress cycles (up to 10^9, according to the rotation of the blades). Currently, wind energy converters are designed for a service life of about 20 years. However, this estimation is more or less made by rule of thumb and is not backed by profound scientific analyses or accurate simulations. By contrast, modern structural health monitoring systems allow an improved identification of deterioration and can thereby drastically advance the lifespan assessment of wind energy converters. In particular, monitoring systems based on artificial intelligence techniques represent a promising approach towards cost-efficient and reliable real-time monitoring. Therefore, an innovative real-time structural health monitoring concept based on software agents is introduced in this contribution. Recently, this concept has also been turned into a real-world monitoring system, developed in a DFG joint research project at the authors' institute at the Ruhr-University Bochum. This paper primarily addresses the agent-based development, implementation and application of the monitoring system, focusing in due detail on the real-time monitoring tasks.
The problem of computing stresses and settlements in the half-space under various types of loads arises frequently in geotechnical engineering. In 1885 Boussinesq derived theoretical expressions for the stresses at a point within an ideal mass; his equation considers a point load on the surface of a semi-infinite, homogeneous, isotropic, weightless, elastic half-space. In 1942 Newmark performed the integration of Boussinesq's equation for the vertical stress under a corner of a rectangular area carrying a uniform load. The determination of vertical stresses under a rectangular footing has been satisfactorily solved by repeated integration of Boussinesq's equation over an arbitrary rectangle on the surface of the half-space, with a non-uniform load represented by piecewise linear interpolation functions. The determination of stresses in the case where the footing shape is an arbitrary quadrilateral, however, remained unsolved. The paper discusses an approach to the computation of vertical stresses and settlements at an arbitrary point of the half-space, loaded with a uniform load whose shape in the ground plan is a general four-noded form with straight edges. Since the form is transformed onto a bi-unit square and all integrations are performed over this area, all solutions are also valid for an arbitrary triangle through the implementation of the degeneration rule.
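The described approach can be sketched numerically: the Boussinesq point-load solution for the vertical stress is integrated over a uniformly loaded quadrilateral by mapping it onto the bi-unit square with bilinear shape functions. The footing geometry, load level and the simple midpoint quadrature below are illustrative choices, not the paper's integration scheme.

```python
# Vertical stress under a uniformly loaded 4-noded quadrilateral via the
# Boussinesq point-load solution and bilinear mapping onto [-1, 1]^2.
import math

def sigma_z_point(Q, xq, yq, xp, yp, z):
    """Boussinesq: sigma_z = 3 Q z^3 / (2 pi R^5) for a surface point load Q."""
    R = math.sqrt((xp - xq) ** 2 + (yp - yq) ** 2 + z ** 2)
    return 3.0 * Q * z ** 3 / (2.0 * math.pi * R ** 5)

def sigma_z_quad(p, corners, xp, yp, z, n=60):
    """Uniform pressure p over a quadrilateral, midpoint rule on [-1,1]^2."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            xi = -1.0 + (2.0 * i + 1.0) / n
            eta = -1.0 + (2.0 * j + 1.0) / n
            # Bilinear shape functions and their derivatives.
            N = [(1 - xi) * (1 - eta) / 4, (1 + xi) * (1 - eta) / 4,
                 (1 + xi) * (1 + eta) / 4, (1 - xi) * (1 + eta) / 4]
            dN_dxi = [-(1 - eta) / 4, (1 - eta) / 4, (1 + eta) / 4, -(1 + eta) / 4]
            dN_deta = [-(1 - xi) / 4, -(1 + xi) / 4, (1 + xi) / 4, (1 - xi) / 4]
            x = sum(Nk * c[0] for Nk, c in zip(N, corners))
            y = sum(Nk * c[1] for Nk, c in zip(N, corners))
            # Jacobian determinant of the bilinear map.
            dxdxi = sum(d * c[0] for d, c in zip(dN_dxi, corners))
            dydxi = sum(d * c[1] for d, c in zip(dN_dxi, corners))
            dxdeta = sum(d * c[0] for d, c in zip(dN_deta, corners))
            dydeta = sum(d * c[1] for d, c in zip(dN_deta, corners))
            detJ = abs(dxdxi * dydeta - dxdeta * dydxi)
            dA = detJ * (2.0 / n) ** 2
            total += sigma_z_point(p * dA, x, y, xp, yp, z)
    return total

square = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]   # 2 m x 2 m footing
```

Two sanity checks confirm the sketch: under the centre of a square of side B at depth z = B/2, the result reproduces the classical Newmark influence value (about 0.70·p), and far below the footing it converges to the point-load solution.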
To fulfil safety requirements, changes in the static and/or dynamic behaviour of a structure must be analysed with great care. These changes are often caused by a local reduction of stiffness due to irregularities in the structure, such as cracks. For simple structures such an analysis can be performed directly by solving the equations of motion, but for more complex structures a different, usually numerical, approach must be applied. The problem of incorporating a crack into the structural behaviour has been studied by many authors, who have usually modelled the crack as a massless rotational spring of suitable stiffness placed on the beam at the location where the crack occurs. Recently, the numerical procedure for computing the stiffness matrix of a beam element with a single transverse crack has been replaced by an element stiffness matrix written in fully symbolic form. A detailed comparison of the results obtained using 200 2D finite elements with those obtained with a single cracked beam element has confirmed the usefulness of such an element.
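A minimal numerical sketch of the rotational-spring crack model (not the symbolic stiffness matrix discussed above; geometry, load and spring stiffness are illustrative): the crack is a massless rotational spring coupling duplicated rotation DOFs at midspan of a simply supported beam.

```python
import numpy as np

def beam_k(EI, l):
    """Euler-Bernoulli beam element stiffness, DOFs (w1, th1, w2, th2)."""
    return (EI / l**3) * np.array([
        [ 12,    6*l,  -12,    6*l],
        [6*l, 4*l**2, -6*l, 2*l**2],
        [-12,   -6*l,   12,   -6*l],
        [6*l, 2*l**2, -6*l, 4*l**2]])

def midspan_deflection(EI, L, P, k_spring):
    """Simply supported beam, two elements, crack at midspan modelled as a
    massless rotational spring between duplicated rotation DOFs.
    Global DOFs: w1, th1, w2, th2a, th2b, w3, th3."""
    l = L / 2
    K = np.zeros((7, 7))
    ke = beam_k(EI, l)
    for dofs in ([0, 1, 2, 3], [2, 4, 5, 6]):
        for i, gi in enumerate(dofs):
            for j, gj in enumerate(dofs):
                K[gi, gj] += ke[i, j]
    # rotational spring couples the two midspan rotations
    K[3, 3] += k_spring; K[4, 4] += k_spring
    K[3, 4] -= k_spring; K[4, 3] -= k_spring
    f = np.zeros(7); f[2] = P            # point load at midspan
    free = [1, 2, 3, 4, 6]               # w1 = w3 = 0 (simple supports)
    u = np.linalg.solve(K[np.ix_(free, free)], f[free])
    return u[1]                          # midspan deflection (DOF w2)

EI, L, P = 1.0, 2.0, 1.0
print(midspan_deflection(EI, L, P, 1e9))   # -> ~0.1667 = P*L**3/(48*EI), intact beam
print(midspan_deflection(EI, L, P, 10.0))  # softer spring -> larger deflection, ~0.1917
```

As the spring stiffness grows the intact beam is recovered, while a compliant spring adds PL²/(16k) of extra deflection, which is the mechanism such a cracked element captures.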
The concept of a sensitivity analysis of the limit state of a structure with respect to selected basic variables is presented. The sensitivity is expressed in the form of the probability distribution of the structural limit state. The analysis is performed with a problem-oriented Monte Carlo simulation procedure, based on defining the elementary event of the problem as a structural limit state; the sample space thus consists of limit states of the structure. A one-dimensional random multiplier, defined on this sample space, is introduced. This multiplier refers to the dominant basic variable (or group of variables) of the problem. The numerical procedure results in a set of random numbers whose normalized relative histogram is an estimator of the PDF of the structural limit state. Estimators of reliability, or of the probability of failure, are statistical characteristics of this histogram. The procedure is illustrated by a sensitivity analysis of the serviceability limit state of a monumental structure: the colonnade of the Licheń Basilica in central Poland. The limit state of the structure is examined with reference to the horizontal deflection of the upper deck. Wind actions are taken as the dominant variables. It is assumed that the wind load intensities acting on the lower and upper storeys of the colonnade are identically distributed but correlated random variables. Three correlation variants of these variables are considered, and the resulting limit state histograms are analysed. The paper ends with conclusions on the method and some general remarks on fully probabilistic design.
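The histogram-as-PDF-estimator idea can be sketched as below. The linear response coefficients c1 and c2, the Gaussian marginals, and the deflection limit are illustrative assumptions, not values from the paper; only the structure of the procedure (correlated wind intensities, histogram of the response, exceedance probability) follows the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def deflection_samples(rho, n=200_000):
    """Upper-deck horizontal deflection as a linear response to two identically
    distributed, correlated wind load intensities (lower and upper storey)."""
    c1, c2 = 0.6, 0.4                  # assumed response per unit load
    mean, std = 1.0, 0.3               # assumed identical marginals
    cov = std**2 * np.array([[1.0, rho], [rho, 1.0]])
    q = rng.multivariate_normal([mean, mean], cov, size=n)
    return c1 * q[:, 0] + c2 * q[:, 1]

limit = 1.5                            # assumed serviceability limit
for rho in (0.0, 0.5, 0.95):           # three correlation variants
    d = deflection_samples(rho)
    pdf, edges = np.histogram(d, bins=100, density=True)  # PDF estimator
    pf = np.mean(d > limit)            # estimated exceedance probability
    print(f"rho={rho:.2f}  P[d > {limit}] ~ {pf:.4f}")
```

With positive response coefficients, stronger correlation widens the response distribution and raises the exceedance probability, which is exactly the kind of trend the three correlation variants in the paper expose.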
The Bernstein polynomials have important applications in many branches of mathematics and other sciences, for instance approximation theory, probability theory, statistics, number theory, the solution of differential equations, numerical analysis, the construction of Bezier curves, q-calculus, operator theory, and applications in computer graphics. The Bernstein polynomials are used to construct Bezier curves. Bezier was an engineer with the Renault car company who set out in the early 1960s to develop a curve formulation that would lend itself to shape design. Engineers may find it most intuitive to think of Bezier curves in terms of the center of mass of a set of point masses. In this paper we therefore study generating functions and functional equations for these polynomials. By applying these functions, we investigate the interpolation function and many properties of these polynomials.
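The Bernstein basis and its "center of mass" reading of a Bezier curve can be made concrete in a few lines (the quadratic control polygon is an arbitrary example):

```python
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t) = C(n,i) * t^i * (1-t)^(n-i)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier(points, t):
    """Bezier curve point: Bernstein-weighted 'center of mass' of the
    control points (the weights are nonnegative and sum to 1)."""
    n = len(points) - 1
    x = sum(bernstein(n, i, t) * px for i, (px, py) in enumerate(points))
    y = sum(bernstein(n, i, t) * py for i, (px, py) in enumerate(points))
    return (x, y)

ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]  # quadratic example
print(bezier(ctrl, 0.0))  # -> (0.0, 0.0): curve interpolates the first control point
print(bezier(ctrl, 0.5))  # -> (1.0, 1.0): Bernstein weights 0.25, 0.5, 0.25
print(bezier(ctrl, 1.0))  # -> (2.0, 0.0): ... and the last control point
```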
In order to model and simulate collapses of large-scale complex structures, a user-friendly and high-performance software system is essential. Because a large number of simulation experiments have to be performed, efficient interactive control and visualization of model parameters and simulation results are crucial, next to an appropriate simulation model and high-performance computing. In this respect, this contribution is concerned with advancements of the software system CADCE (Computer Aided Demolition using Controlled Explosives), which is extended under particular consideration of computational steering concepts. Focus is placed on problems and solutions for the collapse simulation of real-world large-scale complex structures. The simulation model applied is based on a multilevel approach embedding finite element models on a local as well as a near-field length scale, and multibody models on a global scale. Within the global-level simulation, relevant effects of the local and near-field scales, such as fracture and failure processes of the reinforced concrete parts, are approximated by means of tailor-made multibody subsystems. These subsystems employ force elements representing nonlinear material characteristics in terms of force/displacement relationships that are determined in advance by finite element analysis. In particular, enhancements concerning the efficiency of the multibody model and improvements of the user interaction are presented that are crucial for the capability of the computational steering. Several collapse simulations of real-world large-scale structures demonstrate the implementation of the above-mentioned approaches within the computational steering.
CRITICAL STRESS ASSESSMENT IN ANGLE TO GUSSET PLATE BOLTED CONNECTION BY SIMPLIFIED FEM MODELLING
(2010)
Simplified modelling of friction-grip bolted connections between steel members and gusset plates is often applied in engineering practice. The paper deals with the simplification of the pre-tensioned bolt model and of the load transfer within the connection, and investigates the influence on the normal strain (and thus stress) distribution at the critical cross-section. Laboratory tests of bolted connections of single-angle or double-angle members to gusset plates were taken as the basis for the numerical analysis. FE models were created using 1D and 2D elements; angles and gusset plates were modelled with shell elements. Two methods of modelling the friction-grip bolting were considered: a bolt-regarding approach, with systems of 1D elements modelling the bolts, and two variants of a bolt-disregarding approach with special constraints over part of the member and gusset plate surfaces in contact: a) constraints over the whole contact area, b) constraints over the area around each bolt shank ("partially tied"). Modelling friction-grip bolted connections with simplified bolt models may be effective, especially in analyses restricted to the elastic range. In such cases, disregarding the bolts and replacing them with "partially tied" modelling seems more attractive: it is less time-consuming and provides results of similar accuracy compared to analyses using simplified bolt modelling.
VARIATION OF ROTATIONAL RESTRAINT IN GRID DECK CONNECTION DUE TO CORROSION DAMAGE AND STRENGTHENING
(2006)
An approach to assessing the rotational restraint of the stringer-to-crossbeam connection in the deck of a 100-year-old steel truss bridge is presented. The sensitivity of the rotational restraint coefficient of the connection to corrosion damage and strengthening is analyzed. Two criteria for assessing the rotational restraint coefficient are applied, a static and a kinematic one: the former is based on the bending moment distribution in the considered member, the latter on the member rotation at the given joint. A finite element model built of 2D elements is described: webs and flanges are modeled with shell elements, while the rivets in the connection are modeled with a system of beam and spring elements. The rivet modeling method is verified against T-stub connection test results published in the literature. The FEM analyses proved that the recorded extent of corrosion damage does not alter the initial rotational restraint of the stringer-to-crossbeam connection. Strengthening the stringer midspan influences the midspan bending moment and the stringer end rotation in different ways. Restoring a member's load-bearing capacity usually means strengthening its critical regions (where the highest stress levels occur). This alters the distribution of flexural stiffness over the member length and influences the rotational restraint at its connections to other members. The impact depends on the criterion chosen for assessing the rotational restraint coefficient.
Particle Simulation and Evaluation of Personal Exposure to Contaminant Sources in an Elevator Space
(2004)
An elevator encloses a small volume, is normally used by everyone for short periods of time, and is equipped with only a simple ventilation system. Any contaminant released within it may cause serious problems. This research adapts a fire and smoke simulation software (FDS) to a non-fire indoor airflow scenario. Differently from previous research, particles are chosen as the unit of risk evaluation. A personal and multi-personal exposure model is proposed. The model takes into account the influence of the human thermal boundary, coughing, inhalation, exhalation, standing position, and the fan factor. The model is easy to use and suitable for the practical design of elevator systems.
The uncertainty in the construction industry is greater than in other industries; consequently, most construction projects do not go entirely as planned. The project management plan therefore needs to be adapted repeatedly within the project lifecycle to suit the actual project conditions. Generally, the risks of change in the project management plan are difficult to identify in advance, especially if they are caused by unexpected events such as human errors or changes in client preferences. Knowledge acquired from different sources is essential to identify probable deviations as well as to find proper responses to the change risks faced. Hence, it is necessary to have a knowledge base containing known solutions for the common exceptional cases that may cause changes in each construction domain. The ongoing research work presented in this paper uses the process modeling technique of Event-driven Process Chains to describe different patterns of structural change in schedule networks. This results in several so-called "change templates". Under each template, different types of change risk/response pairs can be categorized and stored in a knowledge base. This knowledge base is described as an ontology model populated with reference construction process data. The implementation of the developed approach can be seen as an iterative scheduling cycle that is repeated within the project lifecycle as new change risks surface. This helps to check whether ready solutions for the situation at hand are available in the knowledge base. Moreover, if a solution is adopted, CPSP ("Change Project Schedule Plan"), a prototype developed for the purpose of this research work, is used to make the needed structural changes to the schedule network automatically, based on the change template.
What-if scenarios can be implemented with the CPSP prototype in the planning phase to study the effect of specific situations without endangering the project objectives. Hence, better designed and more maintainable project schedules can be achieved.
Monitoring and assessment are core tasks in the management and revitalization of buildings. Various methods can be used to acquire the required geometric information, such as the size or deformation of a building. As the potential of digital photography grows continuously, industrial photogrammetry today represents an important alternative to classical methods such as strain gauges or other tactile sensors. Modern industrial photogrammetry captures images with digital systems. This means that the information in digital images must be analyzed by digital image processing to obtain the image coordinates of the measurement points. One of the tasks of image processing for photogrammetric purposes is therefore to locate the center of circular targets; modern operators deliver sub-pixel accuracy for the point coordinates. In terms of hardware, the optical measurement method of industrial photogrammetry primarily requires high-resolution digital cameras. These can be divided into video cameras, high-speed cameras, smart cameras, and so-called consumer and professional cameras. The geometric resolution of high-end digital cameras today exceeds 10 megapixels. For data transfer to the computer, several standards are available on the market, e.g. USB 2.0, GigE Vision, CameraLink or FireWire. The choice of standard always depends on the specific task, since none of these technologies holds a leading position. Modern photogrammetry offers many new possibilities for the monitoring and assessment of structures. It can provide one-, two-, three- or four-dimensional information, in real time if required. As a non-contact measurement method, photogrammetry can still be used where tactile sensors can no longer be applied, e.g. due to their space requirements. High-resolution video cameras even allow dynamic investigations to be carried out with great precision.
This paper presents OpenSTEP, an innovative software platform intended for building advanced distributed integrated systems and conducting multidisciplinary collaborative projects in both academia and industry. The paper discusses an open system architecture, methodology, component library and CASE toolkit enabling developers to build a wide range of interoperable applications and systems compliant with STEP and, in particular, with IFC, which is becoming an increasingly important standard for information integration in architecture, engineering and construction.
The paper presents a general map-based approach to the prototyping of products in virtual reality environments. Virtual prototyping of products is considered as a consistent simulation and visualization process mapping the source product model into its target visual representations. The approach makes it possible to formally interrelate the product and visual information models by defining mapping rules, to specify a prototyping scenario as a composition of map instances, and then to explore particular product models in virtual reality environments by interpreting the composed scenario. Once realized, the proposed approach provides a strongly formalized method and a common software framework for building virtual prototyping applications. As a result, such applications gain in expressiveness, reusability and reliability, and also take on additional runtime flexibility.
Multimodel Numerical Analysis of the Elasto-Visco-Plastic Deformation of Materials and Constructions
(1997)
At present there is no generally accepted theory of visco-plasticity applicable to a wide class of materials and arbitrary loading paths. A multimodel approach, based on the creation of a hierarchical sequence of models, is the most rational. The developed library of elasto-visco-plastic models includes both the simplest models and sophisticated ones demanding numerous experimental data. A unified general form of the constitutive equations for all elasto-visco-plastic models used is presented, based on the concept of tensorial internal state variables. This permits the use of a unified solution algorithm for boundary value problems across the different variants of the material models. The developed system of selection criteria generates the necessary conditions and provides the choice of the simplest variant of the theory sufficient for a correct solution of the problem. The formulation of the selection criteria is based on the peculiarities of viscoplastic material behavior over a wide range of thermomechanical loading and on numerous computational experiments with structures of different complexity levels. A set of effective schemes for integrating the stress-strain relations and for solving the nonlinear finite element system is discussed for the considered class of material models. The applicability of the different material models is studied both for a material element and for complicated structures. The application of the multimodel approach in numerical computations has demonstrated the possibility of reliably predicting the stress-strain response under a wide variety of combined loadings.
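The internal-state-variable idea can be illustrated with the simplest member of such a model hierarchy. This is a generic 1D Perzyna-type sketch with illustrative material parameters, not a reproduction of the authors' model library.

```python
import numpy as np

def viscoplastic_1d(E=200e3, sigma_y=200.0, eta=1e3, strain_rate=1e-3,
                    t_end=10.0, dt=1e-3):
    """Explicit integration of a 1D elasto-viscoplastic model of Perzyna type:
    sigma = E * (eps - eps_vp),   d(eps_vp)/dt = <sigma - sigma_y> / eta,
    where <.> is the Macaulay bracket and the viscoplastic strain eps_vp
    is the single internal state variable."""
    eps = eps_vp = 0.0
    stresses = []
    for _ in range(int(t_end / dt)):
        eps += strain_rate * dt                         # prescribed strain path
        sigma = E * (eps - eps_vp)                      # elastic relation
        stresses.append(sigma)
        eps_vp += max(sigma - sigma_y, 0.0) / eta * dt  # viscoplastic flow
    return np.array(stresses)

sig = viscoplastic_1d()
# steady-state overstress = eta * strain_rate, so sigma tends to sigma_y + 1.0
print(round(sig[-1], 2))  # -> 201.0
```

More sophisticated models in the hierarchy add further internal variables (back stresses, damage, hardening measures) while keeping this same update structure, which is what makes a unified solution algorithm possible.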
Geometric modeling has gained great importance in the engineering sciences. The visualization of two- and three-dimensional problems has become indispensable in today's applications. Increasingly, tasks from the field of geometric modeling come to the fore that go beyond the established dimensions 1-3 and are no longer purely geometric in nature. These include tasks from the areas of numerical simulation, parameter identification and structural analysis. Geometric methods such as triangulation, convex hull, geometric intersection and interpolation are to be applied to these non-geometric tasks. To this end, these algorithms, which are all based on the classical geometry of Euclidean space, are analyzed with regard to their transferability and revised. Using the example of a parameter identification, a systematic procedure is presented that makes it possible to comprehensively describe the range of candidate parameters despite few trial computations. This allows a better understanding of the interrelations among the parameters. Often more than one parameter combination exists, so that these form an isoline, which in turn describes an infinite number of solutions of the posed problem within the investigated domain.
Car following models are used to describe the behavior of a number of cars on the road dependent on the distance to the car in front. We introduce a system of ordinary differential equations and perform a theoretical and numerical analysis in order to find solutions that reflect various traffic situations. We present three different variations of the model motivated by reality.
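A minimal instance of such a system of ordinary differential equations is the linear follow-the-leader model, in which each driver relaxes toward the speed of the car in front. The sensitivity parameter, initial spacing and integration scheme below are illustrative choices, not the paper's specific model variants.

```python
import numpy as np

def simulate(n_cars=5, v_leader=30.0, kappa=0.5, gap0=20.0, t_end=60.0, dt=0.01):
    """Follow-the-leader model integrated with explicit Euler steps:
    dv_i/dt = kappa * (v_{i-1} - v_i) for followers; the leader (car 0)
    drives at constant speed v_leader."""
    x = -gap0 * np.arange(n_cars)           # positions, car 0 in front
    v = np.zeros(n_cars); v[0] = v_leader   # followers start at rest
    for _ in range(int(t_end / dt)):
        a = np.zeros(n_cars)
        a[1:] = kappa * (v[:-1] - v[1:])    # reaction to the car in front
        x += v * dt
        v += a * dt
    return x, v

x, v = simulate()
print(np.round(v, 3))  # all speeds converge to the leader's 30 m/s
```

Richer variants replace the speed difference with a function of the headway x_{i-1} - x_i (e.g. optimal-velocity models), which is where qualitatively different traffic situations such as stop-and-go waves appear.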
This text describes the intensive investigation of honeycomb panels made of paper materials which, through folding processes, can assume new spatial states and thus extend their original range of application. The solution approaches presented move in the field of tension between architecture and structural engineering, since the folded components are not only extremely load-bearing but also possess an aesthetic form. The developed methods and constructions are presented at a high architectural level and verified with simple engineering methods. Geometric procedures are applied to find solutions, as are constructional rules of thumb and research from architecture and science.
The focus of the work lies on the investigation of folds in honeycomb panels. During the examination of the topic, however, many further aspects appeared very interesting and worth pursuing. As the theoretical basis of this work, the historical development and the societal significance of paper and paper materials are therefore analyzed and their production processes examined. This approach makes it possible to assess the potential and significance of paper as a material. The context of the work is thereby strengthened and leads to interesting future research approaches.
Intensive investigations are devoted to the geometric determination of folds in honeycomb panels made of paper materials and to their realization as structural components. The static properties of the elements and their constructional potential are also explored and documented. Important impulses from research and technology flow into the study and allow the results to be situated in an architectural context. Series of tests and material studies on prototypes corroborate the results of virtual and computational studies. Concepts for the parametric computation and visualization of the research results are presented and point to viable planning tools for industry. Numerous test series on a wide variety of sealing concepts lead to the realization of a remarkable experimental building. It allows the developed components to be examined permanently under realistic conditions and confirms their performance. This not only enables permanent monitoring and evaluation of the performance data, it also provides visible proof that efficient, high-quality architecture exploiting the enormous design potential of folded honeycomb panels can be realized with paper materials.
The high resource demand of the building sector clearly indicates the need to search for alternative, renewable and energy-efficient materials. This work presents paper-laminated sandwich elements with a core of corrugated paperboard that can serve as load-bearing architectural components after a linear folding process. Conventional methods use either paper tubes or glued layers of honeycomb panels. In contrast, the folded components are extremely lightweight, provide material strength exactly where it is statically required, and offer many possibilities for design variants. After removing strips of the paper lamination, the sandwich can be folded linearly at this position: without the resistance of the missing paper, the sandwich core can be easily compressed. The final folding angle correlates with the width of the removed paper strip and can therefore be described by a simple geometric equation. The geometric basis for the production of folded sandwich elements was established, and many profile types such as triangular, square or rectangular shapes were generated. The method allows easy planning and fast production of components for the construction sector. A triangle profile was used to create a load-bearing frame as the supporting structure of an experimental building. This first permanent building made completely of corrugated cardboard was evaluated in a two-year test to confirm the efficiency of the developed components. In addition to the frame shown in this paper, large-scale sandwich elements with a core of folded components can be used to fabricate lightweight ceilings. The method enables the efficient production of linearly folded cardboard elements which can replace conventional wooden components such as beams, pillars or frames, bringing a fully recyclable material into the context of architectural construction.
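One plausible form of the strip-width/fold-angle relation is the V-cut geometry, where the two faces of the sandwich close the gap left by the removed strip. Note that this specific equation is a reconstruction under that assumption, not taken from the paper.

```python
from math import atan, degrees

def fold_angle(strip_width, panel_thickness):
    """Assumed V-cut relation: removing a lamination strip of width w from a
    sandwich of thickness d lets the faces close a V-groove whose fold
    angle theta satisfies w = 2 * d * tan(theta / 2).
    NOTE: this relation is a plausible reconstruction, not from the paper."""
    return degrees(2 * atan(strip_width / (2 * panel_thickness)))

d = 30.0                               # illustrative board thickness in mm
print(round(fold_angle(2 * d, d), 1))  # strip twice the thickness -> 90.0 degrees
print(round(fold_angle(0.0, d), 1))    # no strip removed -> 0.0, panel stays flat
```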
Kommunale Wohnungsgesellschaft mbH Erfurt (KoWo), with around 20,000 apartments in the state capital, is the largest housing company in Thuringia. Its real estate portfolio is heterogeneous in its technical condition and with regard to the different locations of the properties. Due to vacancies and varying modernization measures and states, the profitability of individual properties differs considerably. Without a uniform valuation of the real estate portfolio with respect to property attractiveness, location quality and property profitability, a long-term strategic development of the portfolio is difficult. From the technical survey of the building stock, through valuation via a scoring model and mapping in a portfolio model with an associated standard strategy, to the further processing of the data in the 20-year maintenance plan, the procedure for valuing the real estate portfolio is demonstrated in a practice-oriented manner.
The situation in the real estate industry has changed fundamentally in recent years. New construction activity throughout Germany has declined markedly. Building within the existing stock, with its fields of renovation, modernization and repair, is therefore moving to the center of activity. According to the Bundesarbeitskreis Altbauerneuerungen, 74 percent of the total construction volume will next year go into the modernization of old buildings, while the share of new construction drops to 26 percent. This applies not only to residential but also to commercial properties. The persistent recession and the disenchantment of the New Economy have brought vacancies and falling rents in this sector. The housing industry, too, has to contend with vacancies, above all in the new federal states. This increasingly forces property operators to adopt methods and tools that raise efficiency within their own company. The task is to handle more refurbishment projects in the same time, which requires a process-oriented approach and supporting tools. With the baucommunicator, a tool has been developed that enables the facility manager to meet this challenge in coordinating the execution of construction and refurbishment measures. The core functions of the software are schedule, cost and quality management in the execution of construction processes. Its cross-company design gives both the client and the executing firms access to the centrally stored project information. The workflow functionality of the baucommunicator reduces the number of process activities, saving costs and increasing the quality of execution.
This work analyzes the load- and time-dependent deformation behavior and the stiffness degradation of plain normal-strength concrete and of high-strength self-compacting concrete (SCC). In addition to experimental investigations, numerical simulations of concrete creep under short-term, quasi-static, uniaxial compressive loading are carried out.