It is well known that the solution of the fundamental equations of linear elasticity for a homogeneous isotropic material, in both the plane stress and plane strain cases, can be equivalently reduced to the solution of a biharmonic equation. The discrete version of the theorem of Goursat is used to describe the solution of the discrete biharmonic equation with the help of two discrete holomorphic functions. In order to obtain a Taylor expansion of discrete holomorphic functions, we introduce a basis of discrete polynomials which fulfill the so-called Appell property with respect to the discrete adjoint Cauchy-Riemann operator. All these steps are important in the field of fracture mechanics, where stress and displacement fields in the neighborhood of singularities caused by cracks and notches have to be calculated with high accuracy. Using the sum representation of holomorphic functions, it seems possible to reproduce the order of the singularity and to determine important mechanical characteristics.
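For orientation, the continuous counterpart of this construction is Goursat's classical representation of a biharmonic function U (e.g. the Airy stress function) by two holomorphic functions φ and χ:

```latex
\Delta^2 U = 0, \qquad
U(x,y) = \operatorname{Re}\!\big(\bar{z}\,\varphi(z) + \chi(z)\big), \qquad z = x + \mathrm{i}y,
```

where φ and χ are holomorphic in the domain. The discrete theorem mentioned in the abstract mirrors this formula with discrete holomorphic functions.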
The increased implementation of site data capture technologies invariably results in an increase in data warehousing and database technologies to store captured data. However, restricted use of data beyond the initial application could potentially result in a loss of understanding of site processes. This could in turn lead to poor decision making at production, tactical and strategic levels. Concrete usage data have been collected from two piling processes. These data have been analysed and the results highlighted potential improvements that could be made to existing site management and estimating processes. A cost benefit analysis has been used to support decision making at the strategic level where the identified improvements require capital expenditure.
The stress state of a piecewise-homogeneous elastic body, which has a semi-infinite crack along the interface, under in-plane and antiplane loads is considered. One of the crack edges is reinforced by a rigid patch plate on a finite interval adjacent to the crack tip. The crack edges are loaded with specified stresses. The body is stretched at infinity by specified stresses. External forces with a given principal vector and moment act on the patch plate. The problem reduces to a Riemann-Hilbert boundary-value matrix problem with a piecewise-constant coefficient for two complex potentials in the plane case and for one in the antiplane case. The complex potentials are found explicitly using a Gaussian hypergeometric function. The stress state of the body close to the ends of the patch plate, one of which is also simultaneously the crack tip, is investigated. Stress intensity factors near the singular points are determined.
This paper is focused on the first numerical tests of the coupling between an analytical solution and the finite element method, using a problem of fracture mechanics as an example. The calculations were done according to the ideas proposed in [1]. The analytical solutions are constructed using an orthogonal basis of holomorphic and anti-holomorphic functions. For the coupling with the finite element method, special elements are constructed using the trigonometric interpolation theorem.
The preliminary design of a wearable computer for supporting Construction Progress Monitoring
(2000)
Progress monitoring has become more and more important as owners have increasingly demanded shorter times for the delivery of their projects. This trend is even more evident in high-technology industries, such as the computer industry and the chemical industry. Fast-changing markets, such as the computer industry, force companies to build new facilities quickly. To make a statement about construction progress, the status of a building has to be determined and monitored over a period of time. By depicting the construction progress in a diagram over time, statements can be made about the anticipated completion of the project and about delays and problems in certain areas. With this information, measures can be taken to efficiently "catch up" on the schedule of the project. New technologies, such as wearable computers, speech recognition, touch screens and wireless networks, could help to move electronic data processing to the construction site. Progress monitoring could take great advantage of this move, as several intermediate steps of processing progress data become unnecessary. The processing of progress data could be done entirely by computers, which means that data for supporting decisions can be made available at the moment the construction progress is measured. This paper describes a project that investigates how these new technologies can be linked to create a system that enhances the efficiency of progress monitoring. During the project, a first prototype of a progress monitoring system was developed that allows construction companies and site supervisors to measure construction progress on site using wearable computers that are speech controlled and connected to a central database via a wireless network.
THE INFLUENCE OF THE LOCAL CONCAVITY ON THE FUNCTIONING OF BEARING SHELL OF HIGH-RISE CONSTRUCTION
(2012)
Areas with various defects and damages, which reduce carrying capacity, were examined in a study of metal chimneys. In this work, the influence of the local dimples on the function of metal chimneys was considered. Modeling tasks were completed in the software packages LIRA and ANSYS. Parameters were identified, which characterize the local dimples, and a numerical study of the influence of local dimples on the stress-strain state of shells of metal chimneys was conducted. A distribution field of circular and meridional tension was analyzed in a researched area. Zones of influence of dimples on the bearing cover of metal chimneys were investigated. The bearing capacities of high-rise structures with various dimple geometries and various cover parameters were determined with respect to specified areas of the trunk. Dependent relationships are represented graphically for the decrease in bearing capacity of a cover with respect to dimples. Diameter and thickness of covers of metal chimneys were constructed according to the resulting data.
This article presents the Rigid Finite Element Method in the calculation of the deflection of reinforced concrete beams with cracks. Initially, this method was used in the shipbuilding industry. Later, it was adapted to homogeneous calculations of bar structures. In this method, rigid mass discs serve as the element model. In the planar layout, three generalized coordinates (two translational and one rotational) correspond to each disc. These discs are connected by elastic ties. The novel idea is to take a discrete crack into account in the Rigid Finite Element Method. It consists in suitably reducing the rigidity of the rotational ties located at the spots where cracks occurred. The flexibility of such a tie results from the flexural deformability of the element and the occurrence of the crack. As part of the numerical analyses, the influence of cracks on the total deflection of beams was determined. Furthermore, the results of the calculations were compared to the results of the experiment. Overestimations of the calculated deflections relative to the measured deflections were found. The article quantifies the size of the overestimation and describes its causes.
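One way to read the crack model quantitatively (our illustration, not necessarily the article's exact formulation): if adjacent rigid discs are joined over a segment of length Δl by a rotational tie, a natural lumped stiffness and its reduction at a cracked section are

```latex
k_{\varphi} = \frac{EI}{\Delta l}, \qquad
k_{\varphi,\mathrm{cr}} = \frac{EI_{\mathrm{cr}}}{\Delta l} < k_{\varphi},
```

so that under a bending moment M the extra rotation concentrated at the cracked tie, Δφ = M / k_{φ,cr}, feeds directly into the computed total beam deflection.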
In this paper we present rudiments of a higher dimensional analogue of the Szegö kernel method to compute 3D mappings from elementary domains onto the unit sphere. This is a formal construction which provides us with a good substitution of the classical conformal Riemann mapping. We give explicit numerical examples and discuss a comparison of the results with those obtained alternatively by the Bergman kernel method.
In this note, we describe quite explicitly the Howe duality for Hodge systems and connect it with well-known facts of harmonic analysis and Clifford analysis. In Section 2, we briefly recall the Fisher decomposition and the Howe duality for harmonic analysis. In Section 3, the well-known fact that Clifford analysis is a real refinement of harmonic analysis is illustrated by the Fisher decomposition and the Howe duality for the space of spinor-valued polynomials in Euclidean space under the so-called L-action. On the other hand, for Clifford algebra valued polynomials, we can consider another action, called in Clifford analysis the H-action. In the last section, we recall the recently obtained Fisher decomposition for the H-action. Whereas in Clifford analysis the prominent role is played by the Dirac equation, in this case the basic set of equations is formed by the Hodge system. Moreover, the analysis of Hodge systems can be viewed even as a refinement of Clifford analysis. In this note, we describe the Howe duality for the H-action. In particular, in Proposition 1, we recognize the Howe dual partner of the orthogonal group O(m) in this case as the Lie superalgebra sl(2|1). Furthermore, Theorem 2 gives the corresponding multiplicity-free decomposition with an explicit description of the irreducible pieces.
THE HERPICH AFFAIR OF 1924
(2011)
Michele Stavagna is an architect and architectural historian who lives and works in Berlin and is the correspondent from Italy for the magazine “der architekt - BDA”. He was educated at the Università IUAV of Venice (Italy), holds a degree in architectural design and a PhD in the history of architecture and urban design, and has taught Theory and History of Industrial Design at the Università degli Studi of Trieste (Italy). Stavagna translated and edited the first Italian edition of “Die Baukunst der neuesten Zeit” by G. A. Platz. His research focuses on the birth and affirmation of Modernism within the broader context of the mass public and the economic development of modern society.
The conventional way of describing an image is in terms of its canonical pixel-based representation. Other image description techniques are based on image transformations. Such an image transformation converts a canonical image representation into a representation in which specific properties of an image are described more explicitly. In most transformations, images are locally approximated within a window by a linear combination of a number of a priori selected patterns. The coefficients of such a decomposition then provide the desired image representation. The Hermite transform is an image transformation technique introduced by Martens. It uses overlapping Gaussian windows and projects images locally onto a basis of orthogonal polynomials. As the analysis filters needed for the Hermite transform are derivatives of Gaussians, Hermite analysis is in close agreement with the information analysis carried out by the human visual system. In this paper we construct a new higher dimensional Hermite transform within the framework of quaternionic analysis. The building blocks for this construction are the Clifford-Hermite polynomials rewritten in terms of quaternionic analysis. Furthermore, we compare this newly introduced Hermite transform with the Quaternionic-Hermite Continuous Wavelet transform. The Continuous Wavelet transform is a signal analysis technique suitable for non-stationary, inhomogeneous signals for which Fourier analysis is inadequate. Finally, the developed three-dimensional filter functions of the Quaternionic-Hermite transform are tested with traditional scalar benchmark signals with respect to their selectivity in detecting pointwise singularities.
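The statement that the analysis filters are derivatives of Gaussians can be made concrete in 1D: the n-th derivative of a Gaussian is, up to sign and normalization, a Hermite polynomial times the same Gaussian. A minimal sampled sketch (function names and sampling parameters are ours, not Martens'):

```python
import math

def hermite(n, x):
    """Physicists' Hermite polynomial H_n(x) via the recurrence
    H_{n+1}(x) = 2x H_n(x) - 2n H_{n-1}(x)."""
    if n == 0:
        return 1.0
    h_prev, h = 1.0, 2.0 * x
    for k in range(1, n):
        h_prev, h = h, 2.0 * x * h - 2.0 * k * h_prev
    return h

def hermite_filter(n, sigma=1.0, half_width=4, samples_per_unit=8):
    """Sampled n-th order Hermite analysis filter: up to normalization,
    the n-th derivative of a Gaussian, H_n(x/sigma) * exp(-(x/sigma)**2)."""
    step = 1.0 / samples_per_unit
    ks = range(-half_width * samples_per_unit, half_width * samples_per_unit + 1)
    return [hermite(n, k * step / sigma) * math.exp(-(k * step / sigma) ** 2)
            for k in ks]
```

For instance, `hermite_filter(1)` is an (antisymmetric) edge detector and `hermite_filter(2)` a ridge detector; the paper lifts this family to three dimensions via quaternionic analysis.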
Scientific colloquium held from 14 to 16 October 1999 at the Bauhaus-Universität in Weimar on the topic 'global village - Perspektiven der Architektur' (perspectives of architecture).
THE FOURIER-BESSEL TRANSFORM
(2010)
In this paper we devise a new multi-dimensional integral transform within the Clifford analysis setting, the so-called Fourier-Bessel transform. It appears that in the two-dimensional case, it coincides with the Clifford-Fourier and cylindrical Fourier transforms introduced earlier. We show that this new integral transform satisfies operational formulae which are similar to those of the classical tensorial Fourier transform. Moreover the L2-basis elements consisting of generalized Clifford-Hermite functions appear to be eigenfunctions of the Fourier-Bessel transform.
Kari Jormakka has been teaching architectural theory at the Bauhaus University in Weimar since 2007. In addition, he has been an Ordinarius Professor of architectural theory at Vienna University of Technology since 1998. Previously, he has taught at the Knowlton School of Architecture at the Ohio State University, the University of Illinois at Chicago, Tampere University of Technology as well as Harvard University. Author of ten books and many papers on architectural history and theory, he studied architecture at Otaniemi University in Helsinki and at Tampere University of Technology, as well as philosophy at Helsinki University.
We briefly review and use the recent comprehensive research on the manifolds of square roots of −1 in real Clifford geometric algebras Cl(p,q) in order to construct the Clifford Fourier transform. In essence, the complex imaginary unit j in the kernel of the complex Fourier transform is replaced by a square root of −1 in Cl(p,q). The Clifford Fourier transform (CFT) thus obtained generalizes previously known and applied CFTs, which replaced the complex imaginary unit j only by blades (usually pseudoscalars) squaring to −1. A major advantage of real Clifford algebra CFTs is their completely real geometric interpretation. We study (left and right) linearity of the CFT for constant multivector coefficients in Cl(p,q), translation (x-shift) and modulation (ω-shift) properties, and signal dilations. We show an inversion theorem. We establish the CFT of vector differentials, partial derivatives, vector derivatives and spatial moments of the signal. We also derive Plancherel and Parseval identities as well as a general convolution theorem.
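Schematically, the substitution described above gives a kernel of the form (our shorthand; the placement of the kernel relative to the multivector signal matters in general, which is what the left/right linearity results address):

```latex
\mathcal{F}^{f}\{h\}(\omega)
= \int_{\mathbb{R}^{p,q}} h(x)\, e^{-f\,\omega \cdot x}\, \mathrm{d}^{m}x,
\qquad f \in Cl(p,q),\quad f^2 = -1,\quad m = p + q,
```

which reduces to the classical complex Fourier transform when f = j and Cl(p,q) is replaced by the complex numbers.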
For planning in existing built contexts, the building survey is the starting point for initial planning proposals, for the diagnosis and documentation of building damages, for the creation of objectives catalogues, for the detailed design of renovation and conversion measures and for ensuring fulfilment of building legislation, particularly by change of use and refitting. An examination of currently available IT-tools shows insufficient support for planning within existing contexts, most notably a deficit with regard to information capture and administration. This paper discusses the concept for a modular surveying system (basic concept, separation of geometry from semantic data, and separation into sub-systems) and the prototypical realisation of a system for the complete support of the entire building surveying process for existing buildings. The project aims to contribute to the development of a planning system for existing buildings. ...
Tobias Danielmeier teaches design at the Otago Polytechnic as well as at the University of Otago in Dunedin, New Zealand. He holds a Masters of Arts in Architecture from the Münster School of Architecture and is currently completing his PhD at the University of Otago. His research investigates the art, business and science of winery architecture and their interrelation with place and technology. Tobias Danielmeier’s practical experience includes projects for Reichardt Architekten, Essen, and Bolles+Wilson, Münster.
A new, headless power has replaced the imperialism of past eras. The new world order, the »Empire«, transcends all the boundaries of our traditional political concepts – state and society, war and peace, control and freedom. The decentralized and deterritorialized Empire rules us by exerting direct influence on us through the media, technology and social practices.
Architecture and spatial planning have changed radically in recent decades. The old modernist aspirations toward affordable housing and a rational organization of cities have receded into the background, as have the postmodern obsessions with communication, user participation and public space. Instead, aesthetic and decidedly less political concerns now stand in the foreground: debates between a critical and a projective practice, between blobs and boxes, between atmosphere and ornament.
Yet that is far from the end of the story, as this volume makes clear. The contributions to the 11th Bauhaus-Kolloquium span a period reaching from the founding of the Bauhaus in Weimar to the global architecture of our time, tracing the development of the Empire back while at the same time asking about the consequences and alternatives that architecture faces today.
Non-destructive techniques for damage detection have become the focus of engineering interest in the last few years. However, applying these techniques to large complex structures like civil engineering buildings still has some limitations, since these types of structures are unique and the methodologies often need a large number of specimens for reliable results. For this reason, cost and time can greatly influence the final results.
Model Assisted Probability Of Detection (MAPOD) has taken its place among the ranks of damage identification techniques, especially with advances in computer capacity and modeling tools. Nevertheless, the essential condition for a successful MAPOD is having a reliable model in advance. This condition opens the door to model assessment and model quality problems. In this work, an approach is proposed that uses Partial Models (PM) to compute the Probability Of damage Detection (POD). A simply supported beam, which can be structurally modified and tested under laboratory conditions, is taken as an example. The study includes both experimental and numerical investigations, the application of vibration-based damage detection approaches, and a comparison of the results obtained from tests and simulations. Finally, a proposal for a methodology to assess the reliability and robustness of the models is given.
This paper describes the application of interval calculus to the calculation of plate deflection, taking into account the inevitable and acceptable tolerances of the input data (input parameters). A simply supported reinforced concrete plate was taken as an example. The plate was loaded by a uniformly distributed load. Several parameters that influence the plate deflection are given as closed intervals. Accordingly, the results are obtained as intervals, so it is possible to follow the direct influence of a change of one or more input parameters on the output values (in our example, the deflection) using one model and one computing procedure. The described procedure could be applied to any FEM calculation in order to keep calculation tolerances, ISO tolerances and production tolerances within admissible limits. Wolfram Mathematica was used as the tool for the interval calculations.
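The core of the approach – propagating closed intervals through a formula so that every output is itself an interval – can be sketched with a tiny interval class. The formula below is the textbook mid-span deflection of a simply supported strip under uniform load, w = 5qL⁴/(384EI), used purely as a hypothetical stand-in for the paper's FEM model; all names and numbers are ours:

```python
class Interval:
    """Closed interval [lo, hi] with basic arithmetic."""
    def __init__(self, lo, hi):
        self.lo, self.hi = (lo, hi) if lo <= hi else (hi, lo)
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    def __truediv__(self, o):
        assert o.lo > 0 or o.hi < 0, "division by interval containing 0"
        return self * Interval(1.0 / o.hi, 1.0 / o.lo)
    def __repr__(self):
        return f"[{self.lo:.6g}, {self.hi:.6g}]"

def scalar(x):
    return Interval(x, x)

# Uncertain inputs (illustrative tolerances): load q, span L,
# elastic modulus E, moment of inertia I, all in SI units.
q = Interval(9.5e3, 10.5e3)
L = Interval(3.98, 4.02)
E = Interval(30e9, 34e9)
I = Interval(5.2e-4, 5.6e-4)

# w = 5 q L^4 / (384 E I); for positive intervals the repeated
# product L*L*L*L gives the exact range of L^4.
w = (scalar(5.0) * q * (L * L * L * L)) / (scalar(384.0) * E * I)
```

Every deterministic evaluation with inputs picked inside the given tolerances is guaranteed to land inside `w`, which is exactly the property the paper exploits to track tolerance propagation with a single computation.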
The development of a consistent material model for textile reinforced concrete requires the formulation and calibration of several sub-models on different resolution scales. Each of these models represents the material structure at the corresponding scale. While the models at the micro-level are able to capture the fundamental failure and damage mechanisms of the material components (e.g. filament rupture and debonding from the matrix) their computational costs limit their application to the small size representative unit cells of the material structure. On the other hand, the macro-level models provide a sufficient performance at the expense of limited range of applicability. Due to the complex structuring of the textile reinforced concrete at several levels (filament - yarn - textile - matrix) it is a non-trivial task to develop a multiscale model from scratch. It is rather more effective to develop a set of conceptually related sub-models for each structural level covering the selected phenomena of the material behavior. The homogenized effective material properties obtained at the lower level may be verified and validated using experiments and models at the higher level(s). In this paper the development of a consistent material model for textile reinforced concrete is presented. Load carrying and failure mechanisms at the micro, meso and macro scales are described and models with the focus on the specified scales are introduced. The models currently being developed in the framework of the collaborative research center are classified and evaluated with respect to the failure mechanisms being captured. The micromechanical modeling of the yarn and bonding behavior is discussed in detail and the correspondence with the experiments focused on the selected failure and interaction mechanisms is shown. The example of modeling the bond layer demonstrates the application of the presented strategy.
The goal of the collaborative research center (SFB 532) "Textile reinforced concrete (TRC): the basis for the development of a new material technology", installed in 1998 at Aachen University, is a complex assessment of mechanical, chemical, economic and production-related aspects in an interdisciplinary environment. The research project involves 10 institutes performing parallel research in 17 projects. The coordination of such a research process requires effective software support for information sharing in the form of data exchange, data analysis and data archiving. Furthermore, the processes of experiment planning and design, modification of material compositions and design parameters, and development of new material models in such an environment call for systematic coordination applying the concepts of operations research. Flexible organization of the data coming from several sources is a crucial premise for a transparent accumulation of knowledge and, thus, for successful research in the long run. The technical information system (TRC-TIS) developed in the SFB 532 has been implemented as a database-powered web server with a transparent definition of the product and process model. It serves as an intranet server with access domains devoted to the involved research groups. At the same time, it allows the presentation of selected results simply by granting a data object access from the public area of the server via the internet.
Experimental testing of nailed connections taken from old roof trusses is presented in this paper. To enable the further use and preservation of nailed roof trusses, it is important to understand how nail corrosion and the aging processes of steel and wood affect the load-bearing capacity and deformation behaviour of such structures. The hypothesis that corroded nails allow an increase in load-bearing capacity was investigated. Several old and new joints were tested in a first test series, and the results were very promising regarding the initial assumption. However, more tests must be carried out to verify the results.
Dr. phil.; since 2001 Professor of History and Theory of Architecture and the City; previously Professor of Art History at TU Graz; visiting professorships in the history of art, architecture and design at the UdK Berlin and the universities of Kassel, Oldenburg and Bonn; studied art history, sociology, psychology and philosophy. Karin Wilhelm has organized several international exhibitions on modern architecture and design (Berlin, London, Stockholm) and was a member of the academic advisory boards of the Stiftung Bauhaus Dessau and the Deutsches Architekturmuseum (DAM). Numerous publications, most recently: Bauhaus Weimar 1919–1924 (1996); Kunst als Revolte? Von der Fähigkeit der Künste, Nein zu sagen (1996); Visionen vom Glück – Visionen vom Untergang: Zeichen und Diskurse zur „schönen neuen Welt“ (1998); Sehen – Gehen – Denken: der Entwurf des Bauhausgebäudes, in: ‚Das Bauhausgebäude in Dessau 1926–1999‘ (1998); City-Lights – Zentren, Peripherien, Regionen: interdisziplinäre Positionen für eine urbane Kultur (2002); Idea and form: Häuser von Szyszkowitz + Kowalski (2003); Formationen der Stadt. Camillo Sitte weitergelesen (2005).
Due to the amount of flow simulation and measurement data, automatic detection, classification and visualization of features is necessary for an inspection. Therefore, many automated feature detection methods have been developed in recent years. However, in most cases only one feature class is visualized afterwards, and many algorithms have problems in the presence of noise or superposition effects. In contrast, image processing and computer vision have robust methods for feature extraction and for computing derivatives of scalar fields. Furthermore, interpolation and other filters can be analyzed in detail. An application of these methods to vector fields would provide a solid theoretical basis for feature extraction. The authors suggest Clifford algebra as a mathematical framework for this task. Clifford algebra provides a unified notation for scalars and vectors as well as a multiplication of all basis elements. The Clifford product of two vectors provides the complete geometric information about the relative positions of these vectors. Integration of this product results in Clifford correlation and convolution, which can be used for template matching of vector fields. For the frequency analysis of vector fields and the behavior of vector-valued filters, a Clifford Fourier transform has been derived for 2D and 3D. Convolution and other theorems have been proved, and fast algorithms for the computation of the Clifford Fourier transform exist. Therefore, the computation of the Clifford convolution can be accelerated by computing it in the Clifford Fourier domain. Clifford convolution and the Clifford Fourier transform can be used for a thorough analysis and subsequent visualization of flow fields.
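In the plane, the Clifford product of two vectors carries exactly a scalar (dot) part and a bivector (wedge) part, which a single complex number can encode: conj(a)·b has real part a·b and imaginary part a∧b. A toy template-matching sketch under that identification (our illustration of the idea, not the authors' 2D/3D implementation; all names are ours):

```python
def geometric_product(a, b):
    # 2D geometric (Clifford) product of two vectors encoded as complex
    # numbers: real part = dot product, imaginary part = wedge (bivector).
    return a.conjugate() * b

def clifford_correlation(field, templ):
    """Slide templ over field; at each offset, sum the geometric products.
    A large real part means the field vectors align with the template."""
    H, W = len(field), len(field[0])
    h, w = len(templ), len(templ[0])
    best_score, best_pos = float("-inf"), None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            s = sum(geometric_product(templ[i][j], field[r + i][c + j])
                    for i in range(h) for j in range(w))
            if s.real > best_score:
                best_score, best_pos = s.real, (r, c)
    return best_pos, best_score

# Toy flow field: unit vectors pointing right (+x), with a 2x2 patch of
# upward (+y) vectors embedded at row 2, column 2.
field = [[1 + 0j] * 5 for _ in range(5)]
for i in range(2):
    for j in range(2):
        field[2 + i][2 + j] = 1j

templ = [[1j, 1j], [1j, 1j]]   # template: a patch of upward vectors
pos, score = clifford_correlation(field, templ)
```

The imaginary part of the summed product, discarded here for brevity, measures how much the field is rotated relative to the template; this is the "complete geometric information" the abstract refers to.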
Model management systems are a suitable technological basis for managing digital building models during planning activities, for new construction as well as for the revitalization of buildings. Supporting revitalization processes imposes specific requirements on the design of integrated planning environments, such as the representation of information afflicted with various types of vagueness, the need to model both the target and the actual state of the building, and the ability to handle temporally inconsistent model states. The required dynamics of the domain models and the required usability in virtual enterprises place further demands on the implementation basis of model management systems. For implementing such systems, it proves advantageous to exploit the properties of object-oriented programming languages with non-static type systems, since their meta-level and their introspection and reflection mechanisms provide an efficient implementation basis. To effectively support synchronous cooperative planning activities within individual disciplines, a notification mechanism was implemented that informs domain applications coupled to the model management system about concurrent modifications of the associated domain model or of project information. Furthermore, a mechanism exists for the simplified connection of existing applications that are based on static partial models or that support standardized, model-based exchange formats. Finally, a hybrid system architecture consisting of a central project server, domain servers and domain clients is presented, which is suitable for use under the conditions of cooperative, geographically distributed work in revitalization projects within virtual enterprises.
On 25 March 2010, as part of its annual series of construction management conferences, the Professur Baubetrieb und Bauverfahren, together with the working group "Unikatprozesse" of the section "Simulation in Produktion und Logistik" (SPL) within the Arbeitsgemeinschaft Simulation (ASIM), held a one-day workshop entitled "Modellierung von Prozessen zur Fertigung von Unikaten" (modeling of processes for the production of one-of-a-kind products). Many construction processes are characterized by their one-of-a-kind nature. One-of-a-kind products are marked by prototypical uniqueness, individuality, manifold boundary conditions, and a low degree of standardization and repetition. This makes realistic modeling for the simulation of such processes difficult. Most of the contributions reproduced in this volume are devoted to this particularity.
On 31 March 2008, the Professur Baubetrieb und Bauverfahren and the Junior Professorship for Theoretical Methods of Project Management of the Bauhaus-Universität Weimar held a one-day workshop entitled "Auf dem Weg zum digitalen (Bau-)haus-Bau". It continued the series of workshops begun in autumn 2007 at the University of Kassel under the title "Simulation in der Bauwirtschaft". This time the focus was placed on the simulation of construction processes – simulation with the aim of digitally supporting work preparation, construction execution and site controlling.
The conference addressed managing directors, project managers, site managers and project controllers in planning and execution, with contributions on claim and change management in construction, workflow management in construction practice, the integration of information processes based on Nemetschek technologies, and competence building through targeted continuing education.
The safe operation of important civil structures such as bridges can be assessed using fracture analysis. Since analytical methods are not capable of solving many complicated engineering problems, numerical methods have been increasingly adopted. In this paper, a part of an isotropic material which contains a crack is considered as a partial model, and the quality of the proposed model is evaluated. EXtended IsoGeometric Analysis (XIGA) is a newly developed numerical approach [1, 2] which benefits from the advantages of its origins: the eXtended Finite Element Method (XFEM) and IsoGeometric Analysis (IGA). It is capable of simulating crack propagation problems without the need for remeshing and of capturing the singular field at the crack tip by using crack tip enrichment functions. Also, an exact representation of the geometry is possible using only a few elements. XIGA has also been successfully applied to the fracture analysis of cracked orthotropic bodies [3] and to the simulation of curved cracks [4]. XIGA applies NURBS functions for both the geometry description and the solution field approximation. The drawback of NURBS functions is that local refinement cannot be defined, because they are based on tensor-product constructs, unless multiple patches are used, which also has some limitations. In this contribution, XIGA is further developed to make local refinement feasible by using T-spline basis functions. The adoption of a recovery-based error estimator in the proposed approach for evaluating the model quality and performing the adaptive processes is in progress. Finally, some numerical examples with available analytical solutions are investigated with the developed scheme.
SYSWELD Forum 2011
(2011)
On 25 and 26 October 2011, 70 national and international experts from research and practice met at the Bauhaus-Universität Weimar for the fourth SYSWELD Forum to exchange views on current developments in the numerical simulation of heat treatment and welding. Numerical simulation in the field of welding and heat treatment has advanced considerably in recent years and offers a forward-looking and innovative field of work for engineers.
This paper presents a robust model updating strategy for the system identification of wind turbines. To control the updating parameters and to avoid ill-conditioning, a global sensitivity analysis using the elementary effects method is conducted. The formulation of the objective function is based on Müller-Slany's strategy for multi-criteria functions. As a simulation-based optimization, a simulation adapter is developed to interface the simulation software ANSYS with the locally developed optimization software MOPACK. Model updating is first tested on the beam model of the rotor blade. The discrepancy between the numerical model and the reference has been markedly reduced by the model updating process. The effect of model updating becomes even more pronounced in the comparison of the measured and numerical properties of the wind turbine model. The frequency deviations of the updated model are rather small. The complete comparison, including the free vibration modes via the modal assurance criterion, shows the excellent agreement of the modal parameters of the updated model with those from the measurements. The successful model validation via model updating demonstrates the applicability and effectiveness of the solution concept.
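The elementary effects (Morris) screening used to rank updating parameters can be sketched roughly as follows; the model `f`, the step size and the trajectory count are hypothetical placeholders, not the authors' setup:

```python
import random

def elementary_effects(f, n_params, delta=0.1, n_trajectories=20, seed=0):
    """Crude Morris-style screening: mean absolute elementary effect per parameter.

    f          -- model mapping a list of parameters in [0, 1] to a scalar response
    n_params   -- number of candidate updating parameters
    delta      -- one-at-a-time perturbation step on the unit hypercube
    Returns a list of mu* values; a large value marks an influential parameter.
    """
    rng = random.Random(seed)
    mu_star = [0.0] * n_params
    for _ in range(n_trajectories):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_params)]
        base = f(x)
        for i in range(n_params):
            xi = list(x)
            xi[i] += delta
            ee = (f(xi) - base) / delta          # elementary effect of parameter i
            mu_star[i] += abs(ee) / n_trajectories
    return mu_star
```

Parameters with small mu* can then be frozen, which keeps the updating problem well-conditioned.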
Due to the complex interactions between the ground, the driving machine, the lining tube and the built environment, the accurate assignment of in-situ system parameters for numerical simulation in mechanized tunneling is always subject to tremendous difficulties. However, the more accurate these parameters are, the more applicable the computed responses will be. In particular, if the entire length of the tunnel lining is examined, the appropriate selection of the various ground parameters is critical to the success of a tunnel project and, more importantly, will prevent potential casualties. In this context, methods of system identification for the adaptation of numerical simulations of ground models are presented. Both deterministic and probabilistic approaches are considered for typical scenarios representing notable variations or changes in the ground model.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process by building a system based on internet technology. The system links distributed databases and allows building partners to remotely access and manipulate specific information. It provides an up-to-date building representation that is enriched and refined throughout the building production process. A recent collaboration with Nemetschek France (a subsidiary of Nemetschek AG, the AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help building project actors choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility of building online product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decision support to the selection process for construction works. To do so, we consider each actor's role in the system and the pieces of information each one needs to achieve a given task.
The subject of this talk is the problem of surface design based on a mesh that may contain both triangular and quadrangular domains. We investigate the cases in which such a combined mesh is preferable to a pure triangulation for bivariate data interpolation. First we describe a modification of the well-known flipping algorithm that constructs a locally optimal combined mesh with a predefined quality criterion. Then we introduce two quality measures for triangular and quadrangular domains and present the results of a computational experiment that compares the integral interpolation errors and the errors in gradients of the piecewise surface models produced by the flipping algorithm with the introduced quality measures. The experiment shows that triangular meshes with the Delaunay quality measure provide better interpolation accuracy only if the interpolated function is strictly convex, whereas a saddle-shaped function is better interpolated by bilinear patches within a combined mesh. For a randomly shaped function, combined meshes demonstrate smaller error values and better stability compared with pure triangulations. Finally, we consider further means of mesh improvement, such as excluding 'bad' points from the input set of the mesh-generating procedure. Because the function values at these points should not be lost, some linear or bilinear patches are replaced by nonlinear patches that pass through the excluded points.
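A single local flip step of the kind the modified flipping algorithm performs can be illustrated with the classical max-min-angle (Delaunay) criterion; the quality measures in the talk may differ, so this is only an indicative sketch:

```python
import math

def min_angle(tri):
    """Smallest interior angle (radians) of a triangle given as three (x, y) points."""
    angles = []
    for i in range(3):
        a, b, c = tri[i], tri[(i + 1) % 3], tri[(i + 2) % 3]
        u = (b[0] - a[0], b[1] - a[1])
        v = (c[0] - a[0], c[1] - a[1])
        cos_a = (u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
        angles.append(math.acos(max(-1.0, min(1.0, cos_a))))
    return min(angles)

def should_flip(p, q, r, s):
    """For the convex quad p-q-r-s currently split by diagonal p-r: flip to
    diagonal q-s if that strictly raises the minimum angle over the two
    triangles (the max-min-angle, i.e. Delaunay, criterion)."""
    before = min(min_angle((p, q, r)), min_angle((p, r, s)))
    after = min(min_angle((q, r, s)), min_angle((q, s, p)))
    return after > before
```

Sweeping this test over all interior edges until no flip improves the criterion yields the locally optimal mesh; other quality measures plug in by replacing `min_angle`.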
The design of mobile IT systems, especially the design of wearable computer systems, is a complex task that requires computer science knowledge, such as that related to hardware configuration and software development, in addition to knowledge of the domain in which the system is intended to be used. Particularly in the AEC sector, it is necessary that the support from mobile information technology fit the work situation at hand. Ideally, the domain expert alone can adjust the wearable computer system to achieve this fit without having to consult IT experts. In this paper, we describe a model that helps in transferring existing design knowledge from non-AEC domains to new projects in the construction area. The base for this is a model and a methodology that describes the usage scenarios of said computer systems in an application-neutral and domain-independent way. Thus, the actual design information and experience will be transferable between different applications and domains.
Structural engineering projects are increasingly organized in networked cooperations, owing to steadily growing competitive pressure and the high degree of complexity of the concurrent design activities. Software intended to support such collaborative structural design processes must meet enormous requirements. In the course of our joint research, we analyzed the pros and cons of applying both the peer-to-peer (University of Bonn) and the multi-agent architecture style (University of Bochum) in the field of collaborative structural design. In this paper, we combine the benefits of both architecture styles in an integrated conceptual approach. We demonstrate the added value of the integrated multi-agent peer-to-peer approach by means of an example scenario in which several structural engineers cooperatively design the basic structural elements of an arched bridge using heterogeneous CAD systems.
In a superelliptic shell joined to a circular cylinder, bending stresses are absent when it is subjected to uniform pressure. Some geometrical characteristics have been found. Expressions for determining the stresses at the shell crest (at the singular point of plane type) are suggested. The problem of the theoretical critical buckling load of an elongated shell supported by frames is studied. The critical buckling load for two shells with different specifications was determined experimentally.
The particular aggressiveness of highly concentrated magnesium sulfate solutions acting on concrete has been known for many decades. In addition to the sulfate, the magnesium also attacks the hardened cement paste. At high solution concentrations, the magnesium attack even becomes dominant over the sulfate attack. Magnesium contents below 300 mg/l in groundwater, however, have so far been regarded as non-aggressive. In field-exposure and laboratory tests it was nevertheless found that even at practice-relevant magnesium (<300 mg/l) and sulfate contents (1,500 mg/l), the magnesium led to a marked intensification of the sulfate attack at low temperatures. This intensification occurred in mortars and concretes in which the increased sulfate resistance was to be achieved by partially replacing the cement, a CEM II/A-LL, with 20 % fly ash in accordance with the fly-ash provisions of EN 206-1/DIN 1045-2.
With a partial cement replacement of 30 % fly ash, a clear improvement of the sulfate resistance was achieved even in magnesium-bearing sulfate solutions. Mortars with HS cement as the binder showed no damage at all. The damage was caused by a combination of several influences. On the one hand, the sulfate resistance of the cement/fly-ash system was weakened by the insufficient reaction of the fly ash due to the low storage temperature. On the other hand, the action of the magnesium in the surface zone presumably destabilised the C-S-H phases, which promoted thaumasite formation there. In addition, the consumption of portlandite and the drop in pH in the surface zone hindered the pozzolanic reaction of the fly ash.
This contribution will range freely over the domain of signal, image and surface processing and touch briefly upon some topics that have been close to the heart of people in our research group. Much of the worldwide research of the last 20 years in this domain deals with multiresolution. Multiresolution makes it possible to represent a function (in the broadest sense) at different levels of detail. This has been applied not only to signals and images but also to the solution of all kinds of complex numerical problems. Since wavelets came into play in the 1980s, this idea has been applied and generalized by many researchers, and we therefore use it as the central idea throughout this text. Wavelets, subdivision and hierarchical bases are the appropriate tools to obtain these multiresolution effects. We introduce some of the concepts in a rather informal way and show that the same concepts work in one, two and three dimensions. The applications in the three cases are, however, quite different, and thus one wants to achieve very different goals when dealing with signals, images or surfaces. Because completeness in our treatment is impossible, we have chosen to describe two case studies after introducing some concepts in signal processing. These case studies are still the subject of current research. The first attempts to solve a problem in image processing: how to approximate an edge in an image efficiently by subdivision. The method is based on normal offsets. The second is the use of Powell-Sabin splines to give a smooth multiresolution representation of a surface. In this context we also illustrate the general method of constructing a spline wavelet basis using a lifting scheme.
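The lifting scheme mentioned at the end can be illustrated in its simplest case, one level of a Haar-like transform (a generic textbook sketch, not the Powell-Sabin spline wavelet construction from the talk):

```python
def haar_lift(signal):
    """One level of the Haar wavelet transform written as a lifting scheme:
    split into even/odd samples, predict the odds from the evens, then update
    the evens so that the coarse signal keeps the mean of the original."""
    even = signal[0::2]
    odd = signal[1::2]
    detail = [o - e for e, o in zip(even, odd)]           # predict step
    approx = [e + d / 2.0 for e, d in zip(even, detail)]  # update step
    return approx, detail
```

Applying `haar_lift` recursively to the `approx` output yields the familiar multiresolution pyramid: each level halves the resolution while the `detail` coefficients record what was lost.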
Polymer modification of mortar and concrete is a widely used technique for improving their durability properties. Hitherto, the main application fields of such materials have been the repair and restoration of buildings. However, due to constantly increasing service-life requirements and cost efficiency, polymer-modified concrete (PCC) is also used for construction purposes. There is therefore a demand for studying the mechanical properties of PCC and its essential differences compared with conventional concrete (CC). It is important to investigate whether all the hypotheses and existing analytical formulations assumed for CC are also valid for PCC. In the present study, analytical models available in the literature are evaluated. These models are used for estimating the mechanical properties of concrete. The property investigated in this study is the modulus of elasticity, which is estimated from the value of the compressive strength. An existing database was extended and adapted for polymer-modified concrete mixtures along with their experimentally measured mechanical properties. Based on the indexed data, a comparison between model predictions and experiments was conducted by calculating forecast errors.
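As an illustration of this kind of model evaluation, the sketch below pairs one well-known strength-to-stiffness relation (the Eurocode 2 formula E_cm = 22·(f_cm/10)^0.3, with f_cm in MPa and E_cm in GPa) with a mean-absolute-percentage forecast error; the PCC data points are invented placeholders, not values from the study's database, and the paper's actual model set may differ:

```python
def ec2_modulus(f_cm):
    """Eurocode 2 estimate of the mean modulus of elasticity (GPa)
    from the mean compressive strength f_cm (MPa)."""
    return 22.0 * (f_cm / 10.0) ** 0.3

def mape(measured, predicted):
    """Mean absolute percentage error between measured and predicted values."""
    return 100.0 * sum(abs((m - p) / m) for m, p in zip(measured, predicted)) / len(measured)

# hypothetical PCC data points: (compressive strength in MPa, measured E in GPa)
data = [(30.0, 27.5), (45.0, 33.0), (60.0, 36.0)]
pred = [ec2_modulus(fc) for fc, _ in data]
err = mape([e for _, e in data], pred)
```

Repeating this for each candidate model over the whole database gives the forecast-error comparison the abstract describes.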
Information science researchers and developers have spent many years addressing the problem of retrieving exactly the information needed and using it for analysis purposes. In information-seeking dialogues, the user, e.g. a construction project manager or supplier, often asks questions about specific aspects of the tasks they want to perform. But most of the time it is difficult for software systems to unambiguously understand their overall intentions. The existence of information tunnels (Tannenbaum 2002) aggravates this phenomenon. This study includes a detailed case study of the material management process in the construction industry. Based on this case study, the structure of a formal user model for information retrieval in construction management is proposed. This prototype user model will be incorporated into the system design for construction information management and retrieval. The information retrieval system is a user-centered product based on the development of a user-configurable visitor mechanism for managing and retrieving project information without worrying too much about the underlying data structure of the database system. An executable UML model combined with an OODB is used to reduce the ambiguity of the user's intentions and to achieve user satisfaction.
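A user-configurable visitor mechanism of the kind described can be sketched generically as a traversal that accepts a user-supplied visitor function, so the user queries the project without touching the storage structure; the project tree and field names below are hypothetical:

```python
def visit_tree(node, visitor, results=None):
    """Depth-first traversal applying a user-configurable visitor function;
    the visitor returns a value to collect, or None to skip the node."""
    if results is None:
        results = []
    hit = visitor(node)
    if hit is not None:
        results.append(hit)
    for child in node.get("children", []):
        visit_tree(child, visitor, results)
    return results

# hypothetical project tree for a material-management query
project = {"type": "task", "name": "piling",
           "children": [{"type": "material", "name": "concrete C30/37", "children": []},
                        {"type": "material", "name": "rebar B500", "children": []}]}

materials = visit_tree(project, lambda n: n["name"] if n["type"] == "material" else None)
```

Swapping the lambda swaps the query; the traversal, and hence the underlying data structure, stays hidden from the user.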
With the advances in computer technology, structural optimization has become a prominent field in structural engineering. In this study, an unconventional approach to structural optimization is presented which utilizes the Energy method with Integral Material behaviour (EIM), based on Lagrange's principle of minimum potential energy. With the EIM, an alternative method for nonlinear analysis, the equilibrium condition is secured through minimization of the potential energy as an optimization problem. Imposing this problem as an additional constraint on a higher cost function of a structural property, a bilevel programming problem is formulated. A nested solution strategy for the bilevel problem is used, treating the energy and the upper objective function as separate optimization problems. Exploiting the convexity of the potential energy, gradient-based algorithms are employed for its minimization, while the upper cost function is minimized using gradient-free algorithms because its properties are unknown. Two practical examples are considered in order to prove the efficiency of the method. The first presents a sizing problem for a steel I-section within an encased composite cross section, utilizing the material nonlinearity. The second is a discrete shape optimization of a steel truss bridge, which is compared with a previous study based on the Finite Element Method.
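The nested bilevel strategy can be caricatured on a one-degree-of-freedom system: the inner problem finds equilibrium by gradient descent on a convex potential energy, and the outer problem searches gradient-free over discrete design candidates. All stiffness and load values are hypothetical placeholders, not the EIM formulation itself:

```python
def inner_equilibrium(area, load=10.0, k0=100.0, steps=200):
    """Inner problem: displacement u minimising the convex potential energy
    Pi(u) = 0.5 * k * u**2 - load * u  by gradient descent (hypothetical
    linear stiffness model k = k0 * area)."""
    k = k0 * area
    u = 0.0
    lr = 0.5 / k                   # step size chosen so the descent converges
    for _ in range(steps):
        u -= lr * (k * u - load)   # dPi/du = k*u - load
    return u

def outer_design(areas, load=10.0, u_max=0.05):
    """Outer problem: gradient-free enumeration over candidate cross-section
    areas; minimise the area (a proxy for cost) subject to a displacement limit,
    calling the inner energy minimisation for each candidate."""
    best = None
    for a in areas:
        if inner_equilibrium(a, load) <= u_max and (best is None or a < best):
            best = a
    return best
```

The key structural point survives the caricature: every evaluation of the outer cost requires a full inner energy minimisation, which is why the convexity of the potential energy matters.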
The planning of projects in building engineering is a complex process characterized by a dynamic composition and many modifications during the definition and execution of processes. For computer-aided, network-based cooperation, a formal description of the planning process is necessary. In the research project "Relational Process Modelling in Cooperative Building Planning", a process model is described by three parts: an organizational structure with participants, a building structure with states, and a process structure with activities. This research project is part of the priority program 1103 "Network-Based Cooperative Planning Processes in Structural Engineering" funded by the German Research Foundation (DFG). Planning processes in civil engineering can be described by workflow graphs. The process structure describes the logical planning process and can be formally defined as a bipartite graph. This structure consists of activities, transitions, and relationships between activities and transitions. In order to minimize errors during the execution of a planning process, a consistent and structurally correct process model must be guaranteed. This contribution presents the concept and the algorithms for checking the consistency and correctness of the process structure.
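One elementary structural-correctness check of the kind described is verifying that the process structure really is bipartite, i.e. that every relationship connects an activity with a transition and never two nodes of the same kind. A minimal sketch with hypothetical node names:

```python
def is_bipartite_process(activities, transitions, edges):
    """Structural check for a process graph: every edge must connect an
    activity with a transition (in either direction), never two nodes of the
    same kind, and must not reference unknown nodes."""
    a, t = set(activities), set(transitions)
    if a & t:
        return False                      # a node cannot belong to both kinds
    for u, v in edges:
        if (u in a and v in t) or (u in t and v in a):
            continue
        return False                      # same-kind or unknown endpoints
    return True
```

A full consistency check would add further conditions (e.g. connectedness and absence of dead transitions) on top of this bipartiteness test.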