56.03 Methods in Civil Engineering
SYSWELD Forum 2011
(2011)
On 25 and 26 October 2011, 70 national and international experts from research and practice met at the Bauhaus-Universität Weimar for the fourth SYSWELD Forum to exchange views on current developments in numerical simulation in the field of heat treatment and welding. Numerical simulation of welding and heat treatment has advanced considerably in recent years and offers a forward-looking and innovative field of work for engineers.
The 19th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus-Universität Weimar from 4 to 6 July 2012. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences and to report on and discuss their results from research, development, and practice. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer science, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus-Universität Weimar from 20 to 22 July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences and to report on and discuss their results from research, development, and practice. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer science, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
Long-span cable-supported bridges are prone to wind-induced aerodynamic instabilities, and this phenomenon is usually a major design criterion. If the wind speed exceeds the critical flutter speed of the bridge, this constitutes an Ultimate Limit State. The prediction of the flutter boundary therefore requires accurate and robust models. This paper studies various combinations of models to predict the flutter phenomenon.
Since flutter is a coupling of aerodynamic forcing with a structural dynamics problem, different types and classes of models can be combined to study the interaction. Here, both numerical approaches and analytical models are utilised and coupled in different ways to assess the prediction quality of the hybrid model. The aerodynamic force models employed are the analytical Theodorsen expressions for the motion-induced aerodynamic forces of a flat plate and Scanlan derivatives as a meta-model. Further, Computational Fluid Dynamics (CFD) simulations using the Vortex Particle Method (VPM) were used to cover numerical models.
The structural representations were dimensionally reduced to two-degree-of-freedom section models calibrated from global models, as well as a fully three-dimensional Finite Element (FE) model. The two-degree-of-freedom system was analysed both analytically and numerically.
Generally, all models were able to predict the flutter phenomenon, and relatively close agreement was found for the particular bridge. In conclusion, the model choice for a given practical analysis scenario is discussed in the context of the analysis findings.
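The flutter-onset idea behind such coupled models can be illustrated with a toy calculation (not the paper's Theodorsen, Scanlan, or VPM models): a two-degree-of-freedom heave/pitch section with made-up aerodynamic stiffness and damping matrices, where the critical speed is the first wind speed at which an eigenvalue of the state matrix crosses into the right half-plane.

```python
import numpy as np

def flutter_speed(M, Cs, Ks, Ca, Ka, u_max=10.0, du=0.05):
    """Sweep wind speed U and return the first U at which the coupled
    two-degree-of-freedom (heave/pitch) system loses stability, i.e. an
    eigenvalue of the first-order state matrix gains a positive real part."""
    Minv = np.linalg.inv(M)
    for u in np.arange(0.0, u_max, du):
        C = Cs - u * Ca          # effective (structural - aerodynamic) damping
        K = Ks - u**2 * Ka       # effective stiffness incl. aerodynamic terms
        A = np.block([[np.zeros((2, 2)), np.eye(2)],
                      [-Minv @ K, -Minv @ C]])
        if np.max(np.linalg.eigvals(A).real) > 1e-9:
            return u
    return None

# Toy section model: unit mass/inertia, heave and pitch frequencies 1 and 2 rad/s.
M  = np.eye(2)
Cs = np.diag([0.02, 0.02])
Ks = np.diag([1.0, 4.0])
Ca = np.diag([0.005, 0.002])             # illustrative aerodynamic damping
Ka = np.array([[0.0, 0.2], [0.0, 0.1]])  # lift/moment due to pitch (toy values)

uf = flutter_speed(M, Cs, Ks, Ca, Ka)
print("instability onset of the toy model near U =", uf)
```

The sweep-and-eigenvalue structure is the common core; a realistic analysis would replace the constant matrices with frequency-dependent Theodorsen or Scanlan terms.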
Integrated structural engineering systems usually consist of a large number of design objects that may be distributed across different platforms. These design objects need to exchange data and information with each other. For efficient communication among design objects, a common communication protocol needs to be defined. This paper presents the elements of a communication protocol that uses a mediator agent to facilitate communication among design objects. This protocol is termed the Mediative Communication Protocol (MCP). The protocol uses certain design communication performatives and the semantics of an Agent Communication Language (ACL), mainly the Knowledge Query and Manipulation Language (KQML), to implement its steps. Details of a mediator agent that facilitates the communication among design objects are presented. The Unified Modeling Language (UML) is used to present the mediative protocol and to show how the mediator agent can be used to execute the steps of the protocol. An example from a structural engineering application is presented to demonstrate and validate the protocol. It is concluded that the mediative protocol is a viable protocol for facilitating object-to-object communication and also has the potential to facilitate communication among the different project participants at the higher levels of integrated structural engineering systems.
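A minimal sketch of the mediator idea, using KQML-style performatives; the object names, properties, and message fields below are hypothetical illustrations, not the MCP specification from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    """A KQML-style performative message exchanged between design objects."""
    performative: str   # e.g. "ask-one", "tell", "sorry"
    sender: str
    receiver: str
    content: dict = field(default_factory=dict)

class Mediator:
    """Central agent that routes messages between registered design objects,
    so the objects never need to know each other's location or platform."""
    def __init__(self):
        self._objects = {}

    def register(self, name, handler):
        self._objects[name] = handler

    def route(self, msg: Message) -> Message:
        handler = self._objects.get(msg.receiver)
        if handler is None:
            return Message("sorry", "mediator", msg.sender,
                           {"error": f"unknown design object {msg.receiver!r}"})
        return handler(msg)

# A design object holding beam properties answers "ask-one" queries.
def beam_object(msg: Message) -> Message:
    properties = {"depth": 0.60, "width": 0.30}  # illustrative values, metres
    if msg.performative == "ask-one" and msg.content.get("property") in properties:
        return Message("tell", "beam-1", msg.sender,
                       {"property": msg.content["property"],
                        "value": properties[msg.content["property"]]})
    return Message("sorry", "beam-1", msg.sender, {})

mediator = Mediator()
mediator.register("beam-1", beam_object)
reply = mediator.route(Message("ask-one", "slab-1", "beam-1", {"property": "depth"}))
print(reply.performative, reply.content["value"])
```

The mediator keeps the design objects decoupled: a slab object asks for the beam depth without knowing where or how the beam object is implemented.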
Identification of modal parameters of a space frame structure is a complex assignment due to a large number of degrees of freedom, close natural frequencies, and different vibrating mechanisms. Research has been carried out on the modal identification of rather simple truss structures. So far, less attention has been given to complex three-dimensional truss structures. This work develops a vibration-based methodology for determining modal information of three-dimensional space truss structures. The method uses a relatively complex space truss structure for its verification. Numerical modelling of the system gives modal information about the expected vibration behaviour. The identification process involves closely spaced modes that are characterised by local and global vibration mechanisms. To distinguish between local and global vibrations of the system, modal strain energies are used as an indicator. The experimental validation, which incorporated a modal analysis employing the stochastic subspace identification method, has confirmed that considering relatively high model orders is required to identify specific mode shapes. Especially in the case of the determination of local deformation modes of space truss members, higher model orders have to be taken into account than in the modal identification of most other types of structures.
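The modal-strain-energy indicator for separating local from global vibration mechanisms can be sketched on a toy fixed-base spring chain; the stiffnesses, mode shapes, and 50% threshold below are illustrative assumptions, not values from the space truss study:

```python
def element_strain_energies(stiffness, mode):
    """Strain energy stored in each element of a fixed-base spring chain
    for a given mode shape (energy = 0.5 * k * relative displacement^2)."""
    disp = [0.0] + list(mode)  # prepend the fixed support
    return [0.5 * k * (disp[i + 1] - disp[i]) ** 2
            for i, k in enumerate(stiffness)]

def classify(stiffness, mode, threshold=0.5):
    """Label a mode 'local' if a single element carries more than
    `threshold` of the total modal strain energy, otherwise 'global'."""
    e = element_strain_energies(stiffness, mode)
    return "local" if max(e) / sum(e) > threshold else "global"

k = [1.0, 1.0, 1.0]
print(classify(k, [1.0, 2.0, 3.0]))  # energy spread evenly over the chain
print(classify(k, [0.0, 0.0, 1.0]))  # energy concentrated in the tip element
```

For a real truss the element strain energies would come from the stiffness matrices and identified mode shapes, but the energy-share criterion is the same.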
In a superelliptic shell joined to a circular cylinder, bending stresses are absent when it is subjected to uniform pressure. Some geometrical characteristics have been found. Expressions for determining stresses at the shell crest (at the singular point of plane type) are suggested. The problem of the theoretical critical buckling load of an elongated shell supported by frames is studied. The critical buckling load for two shells with different specifications was found experimentally.
A method for automatically maintaining the vibration amplitude of a number of mechanisms at a given level, when the exciting force amplitude varies greatly, is presented. For this purpose, a pendulum is attached to the mechanism through a viscoelastic hinge. The load of the pendulum can move along its arm and is viscoelastically connected to it.
Fuzzy functions are suitable for dealing with uncertainties and fuzziness in a closed form while maintaining the informational content. This paper seeks to understand, elaborate, and explain the problem of interpolating crisp and fuzzy data using continuous fuzzy-valued functions. Two main issues are addressed. The first covers how the fuzziness induced by the reduction and deficit of information, i.e. the discontinuity of the interpolated points, can be evaluated considering the interpolation method used and the density of the data. The second issue deals with the need to differentiate between impreciseness, and hence fuzziness, only in the interpolated quantity, impreciseness only in the location of the interpolated points, and impreciseness in both the quantity and the location. In this paper, a brief background on the concepts of fuzzy numbers and fuzzy functions is presented. The numerical side of computing with fuzzy numbers is concisely demonstrated. The problems of fuzzy polynomial interpolation, interpolation on meshes, and mesh-free fuzzy interpolation are investigated. The integration of the previously noted uncertainty into a coherent fuzzy-valued function is discussed. Several sets of artificial and original measured data are used to examine the mentioned fuzzy interpolations.
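The simplest case, impreciseness only in the interpolated quantity, can be sketched with triangular fuzzy numbers: for linear interpolation the weights are non-negative, so the alpha-cut (here: triangle) endpoints interpolate component-wise. This is an illustrative sketch, not the paper's general formulation:

```python
def interpolate_fuzzy(x0, f0, x1, f1, x):
    """Linearly interpolate two triangular fuzzy numbers (left, peak, right)
    given at crisp locations x0 and x1. Non-negative weights guarantee that
    interval endpoints of every alpha-cut interpolate component-wise."""
    t = (x - x0) / (x1 - x0)
    return tuple((1 - t) * a + t * b for a, b in zip(f0, f1))

# Fuzzy measurements at x = 0 and x = 1; evaluate halfway between them.
mid = interpolate_fuzzy(0.0, (0.0, 1.0, 2.0), 1.0, (2.0, 3.0, 4.0), 0.5)
print(mid)  # (1.0, 2.0, 3.0)
```

Fuzziness in the point locations, or higher-order interpolation with sign-changing weights, requires genuine interval arithmetic per alpha-cut rather than this component-wise shortcut.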
The fire resistance of concrete members is controlled by the temperature distribution in the considered cross section. The thermal analysis can be performed with the advanced temperature-dependent physical properties provided by EN 1992-1-2. However, the recalculation of laboratory tests on columns from TU Braunschweig shows that there are deviations between the calculated and measured temperatures. It can therefore be assumed that the mathematical formulation of these thermal properties could be improved. A sensitivity analysis is performed to identify the governing parameters of the temperature calculation, and a nonlinear optimization method is used to enhance the formulation of the thermal properties. The proposed simplified properties are partly validated by the recalculation of measured temperatures of concrete columns. These first results show that the scatter of the differences between the calculated and the measured temperatures can be reduced by the proposed simple model for the thermal analysis of concrete.
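The fitting step, calibrating a simplified quadratic property law against data by least squares, can be sketched as follows. The coefficients and temperature range below are invented for illustration and are not the EN 1992-1-2 code values:

```python
import numpy as np

# Synthetic "measured" conductivity values generated from a known quadratic
# in t = temperature / 100 degC (illustrative coefficients, not code values).
true_coeffs = [0.01, -0.24, 2.0]       # highest power first
t = np.linspace(0.2, 12.0, 25)         # 20 .. 1200 degC
lam = np.polyval(true_coeffs, t)

# Least-squares fit of the simplified quadratic conductivity model.
fitted = np.polyfit(t, lam, 2)
print(np.round(fitted, 6))
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; with measured temperatures, the same least-squares machinery yields the enhanced simplified property curve.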
The aim of this work is to develop a strategy for the physically nonlinear analysis of bracing systems. Beyond the traditional scope of analysing newly erected structures, the intended applications also include planning tasks associated with conversion and refurbishment measures. Changes that arise during the service history or in the revitalisation process are taken into account in the computational models. In many cases it is expedient from a planning perspective to include the nonlinearity of the material behaviour in the normative verification concepts in order to exploit load-bearing reserves. The associated numerical effort is limited by using separate models to capture the cross-sectional and the system load-bearing behaviour, without reducing the complexity of the task. From detailed cross-sectional investigations of the structural walls, integral material relations are derived, which form the basis for the nonlinear structural analysis. The modelling of articulated bracing walls is based on their decomposition into plane finite beam segments resulting from discretisation in the longitudinal and transverse directions. In addition to the normal forces, shear forces, and bending moments acting at the segment ends, shear stresses along the longitudinal element edges are captured. The physical nonlinearity is taken into account by incorporating integral material relations at the segment boundaries. The numerical implementation uses methods of mathematical optimization. The capability of the analysis strategy is demonstrated by investigations of bracing systems of large-panel buildings.
This paper concerns schedule synchronization problems in public transit networks. It consists of three main parts. In the first, the subject area is introduced, the terms are defined, and a framework for optimal synchronization in the form of a problem representation and formulation is proposed. The second part is devoted to the transfer synchronization problem, where passengers change transit lines at transfer points. An integrated Tabu Search and Genetic Algorithm solution method is developed for this specific problem. The third part deals with the headway harmonization problem, i.e. the synchronization of the schedules of different transit lines on common segments of routes. For the solution of this problem, a new bilevel optimization method is proposed, with zone harmonization at the bottom level and coordination of zones, by time buffers assigned to timing points, at the upper level. Finally, the synchronization problems are illustrated numerically by real-life examples of public transport lines in Cracow.
Different types of data provide different types of information. The present research analyzes the prediction error obtained under different data type availability for calibration. The contributions of different measurement types to model calibration and prognosis are evaluated. A coupled 2D hydro-mechanical model of a water-retaining dam is taken as an example. Here, the mean effective stress in the porous skeleton is reduced due to an increase in pore water pressure under drawdown conditions. Relevant model parameters are identified by scaled sensitivities. Then, Particle Swarm Optimization is applied to determine the optimal parameter values, and finally, the error in prognosis is determined. We compare the predictions of the optimized models with results from a forward run of the reference model to obtain the actual prediction errors. The analyses presented here were performed by calibrating the hydro-mechanical model to 31 data sets of 100 observations of varying data types. The prognosis results improve when diversified information is used for calibration. However, when several types of information are used, the number of observations has to be increased to cover a representative part of the model domain. For an analysis with a constant number of observations, a compromise between data type availability and domain coverage proves to be the best solution. Which type of calibration information contributes to the best prognoses could not be determined in advance. The error in model prognosis does not depend on the error in calibration but on the parameter error, which unfortunately cannot be determined in inverse problems since the real parameter values are unknown. The best prognoses were obtained independently of the calibration fit. However, excellent calibration fits led to an increase in the variation of the prognosis error: in the case of excellent fits, parameter values approached the limits of physically reasonable values more often.
To improve the reliability of the prognoses, the expected values of the parameters should be included as prior information in the optimization algorithm.
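The calibration step can be sketched with a minimal Particle Swarm Optimization applied to a synthetic two-parameter model; this is an illustrative toy, not the dam model, and the swarm parameters are common textbook defaults:

```python
import random

def pso(cost, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal Particle Swarm Optimization for model calibration: each
    particle is a candidate parameter vector, moved by inertia plus
    attraction to its personal best and the swarm's global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Calibrate a two-parameter linear "model" against synthetic observations.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
obs = [1.0 + 2.0 * x for x in xs]  # truth: intercept 1, slope 2
misfit = lambda p: sum((p[0] + p[1] * x - o) ** 2 for x, o in zip(xs, obs))
params, best = pso(misfit, [(-5.0, 5.0), (-5.0, 5.0)])
print(params, best)
```

In the real calibration the misfit would compare hydro-mechanical model output against the 100-observation data sets instead of this linear toy.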
Mobile software agents for novel functions and benefits in intelligent building systems
(2003)
Application-oriented software in networked buildings is steadily increasing. New standards allow simple remote access to install new services or to deploy updates. In this context, the OSGi standard is presented, which enables the management of software during operation. In addition, the network load of the heterogeneous networks within and to buildings is constantly growing. Here, mobile software agents can demonstrate their advantages over conventional, static communication mechanisms. The following text describes the integration of such mobile software agents into existing standards for intelligent buildings and illustrates it using the example of the Innovationszentrum Intelligentes Haus Duisburg (www.inhaus-duisburg.de). After the introduction, Chapter 2 describes the current state of the art, focusing in particular on the OSGi standard and the technology of mobile software agents. Chapter 3 focuses on preliminary analyses of remote maintenance, the optimization of control systems, and the integration of dynamic network participants, all of which are facilitated by the described mechanisms. Chapter 4 briefly summarizes the results and gives an outlook.
In this paper, a wavelet energy damage indicator is used with response surface methodology to identify damage in a simulated filler-beam railway bridge. The approximate model is designed to include operational and environmental conditions in the assessment. The procedure is split into two stages, a training phase and a detection phase. During the training phase, a so-called response surface is built from training data using polynomial regression and radial basis function approximation approaches. The response surface is then used to detect damage in the structure during the detection phase. The results show that the response surface model is able to detect moderate damage in one of the bridge supports while the temperatures and train velocities vary.
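The radial-basis-function branch of the response surface can be sketched as a Gaussian-kernel interpolant over operational conditions. The training data below (normalised temperature/velocity inputs and damage-indicator outputs) are invented for illustration:

```python
import numpy as np

def fit_rbf(points, values, eps=1.0):
    """Fit a Gaussian radial basis function interpolant through the training
    points: solve Phi @ w = values with Phi_ij = exp(-eps * |x_i - x_j|^2)."""
    pts = np.asarray(points, float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    w = np.linalg.solve(np.exp(-eps * d2), np.asarray(values, float))
    def predict(x):
        r2 = ((pts - np.asarray(x, float)) ** 2).sum(-1)
        return float(np.exp(-eps * r2) @ w)
    return predict

# Training data: (normalised temperature, normalised train velocity) pairs
# mapped to a synthetic damage indicator value.
X = [(0.1, 0.2), (0.1, 0.8), (0.5, 0.5), (0.9, 0.2), (0.9, 0.9)]
y = [0.11, 0.15, 0.40, 0.62, 0.70]
surface = fit_rbf(X, y)
print(round(surface((0.5, 0.5)), 6))
```

Because it interpolates, the surface reproduces the training values exactly; during detection, deviations between new measurements and the surface prediction would flag damage.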
We give a sufficient and a necessary condition for an analytic function f on the unit disk D with Hadamard gap to belong to a class of weighted logarithmic Bloch spaces, as well as to the corresponding little weighted logarithmic Bloch space, under some conditions posed on the defined weight function. Also, we study the relations between the class of weighted logarithmic Bloch functions and some other classes of analytic functions with the help of analytic functions in the Hadamard gap class.
In this study we introduce a concept of a discrete Laplacian on the plane lattice and consider its iteration dynamical system. First, we discuss and prove some basic properties of the dynamical system. Next, using computer simulations, we show that the following phenomena can be reproduced quite well: (1) the crystals of water, (2) the designs of carpets and embroideries, (3) the time evolution of the numbers of families of extinct animals, and (4) the ecosystems of living things. Hence we may expect that evolutions and self-organizations can be understood by means of such dynamical systems. Here we want to stress the following fact: although several well-known chaotic dynamical systems can describe chaotic phenomena, they have difficulties describing evolutions and self-organizations.
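One possible reading of iterating a discrete Laplacian on the plane lattice is the following sketch, using a 4-neighbour stencil with periodic boundaries; the specific update rule is an assumption for illustration, not necessarily the authors' system:

```python
def laplacian_step(u, dt=0.2):
    """One iteration u <- u + dt * L u of the discrete Laplacian on a square
    lattice with periodic boundaries (4-neighbour stencil)."""
    n = len(u)
    return [[u[i][j] + dt * (u[(i - 1) % n][j] + u[(i + 1) % n][j]
                             + u[i][(j - 1) % n] + u[i][(j + 1) % n]
                             - 4 * u[i][j])
             for j in range(n)] for i in range(n)]

# Seed a single lattice site and iterate: the pattern spreads symmetrically,
# conserving the total mass, much like a growing symmetric motif.
n = 11
u = [[0.0] * n for _ in range(n)]
u[5][5] = 1.0
for _ in range(20):
    u = laplacian_step(u)
print(round(u[5][5], 4), round(u[5][6], 4))
```

Replacing the linear update with thresholding or nonlinear rules is what turns such lattice dynamics into pattern-forming systems of the kind the abstract describes.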
The search for the best building design requires a concerted design approach for both structure and foundation. Our work is an application of this approach. Our objective is to create an interactive tool that can define, at the early design stages, the orientations of the structure and foundation systems that satisfy the client and the architect as well as possible. While the concerns of these two actors are primarily technical and economical, they also wish to apprehend the environmental and social dimensions of their projects. Thus, this approach is based on alternative studies and on a multi-criteria analysis. In this paper, we present the context of our work, the problem formulation that allows a concerted design of the structure and foundation systems, and the process of identifying feasible solutions.
The p-Laplace equation is a nonlinear generalization of the Laplace equation. This generalization is often used as a model problem for special types of nonlinearities. The p-Laplace equation can be seen as a bridge between very general nonlinear equations and the linear Laplace equation. The aim of this paper is to solve the p-Laplace equation for 2 < p < 3 and to find strong solutions. The idea is to apply a hypercomplex integral operator and spatial function theoretic methods to transform the p-Laplace equation into the p-Dirac equation. This equation will be solved iteratively by using a fixed point theorem.
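The fixed-point strategy can be illustrated in one dimension: a damped Picard iteration that freezes the nonlinear coefficient |u'|^(p-2) and repeatedly solves the resulting linear problem. This is a finite-difference sketch for p = 2.5, not the paper's hypercomplex integral operator approach:

```python
import numpy as np

def p_laplace_1d(p=2.5, n=199, f=1.0, iters=300, eps=1e-8, damping=0.5):
    """Damped Picard (fixed-point) iteration for the 1-D p-Laplace problem
    -(|u'|^(p-2) u')' = f on (0,1), u(0) = u(1) = 0: freeze the coefficient
    a = (|u'|^2 + eps)^((p-2)/2), solve the linearised problem, repeat."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    u = 0.5 * x * (1.0 - x)          # start from the linear (p = 2) solution
    for _ in range(iters):
        du = np.diff(np.concatenate(([0.0], u, [0.0]))) / h  # midpoint slopes
        a = (du ** 2 + eps) ** ((p - 2) / 2)
        A = (np.diag(a[:-1] + a[1:])
             - np.diag(a[1:-1], 1) - np.diag(a[1:-1], -1))
        u_new = np.linalg.solve(A, np.full(n, f * h * h))
        u = damping * u_new + (1.0 - damping) * u
    return x, u

x, u = p_laplace_1d()
# Exact midpoint value for this problem is (1/2)^(p/(p-1)) * (p-1)/p.
print(round(u[len(u) // 2], 4))
```

The regularisation eps keeps the coefficient nonzero where u' vanishes, and the damping stabilises the fixed-point iteration for p > 2.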
In order to minimize the probability of foundation failure resulting from cyclic action on structures, researchers have developed various constitutive models to simulate the foundation response and soil interaction under such complex cyclic loads. The efficiency and effectiveness of these models are largely influenced by the cyclic constitutive parameters. Although much research is being carried out on these relatively new models, few details exist in the literature about the model-based identification of the cyclic constitutive parameters. This can be attributed to the difficulties and complexities of the inverse modeling of such complex phenomena. A variety of optimization strategies are available for the solution of least-squares problems, as is usually done in the field of model calibration. For the back analysis (calibration) of the soil response to oscillatory load functions, however, this paper gives insight into the model calibration challenges and puts forward a method for the inverse modeling of the cyclically loaded foundation response such that high-quality solutions are obtained with minimum computational effort. Model responses are thus produced which adequately describe what would otherwise be observed in the laboratory or in the field.
Expert systems integrating fuzzy reasoning techniques are a powerful tool to support practicing engineers during the early stages of structural design. In this context, fuzzy models have proved to be very suitable for the representation of complex design knowledge. However, their definition is a laborious task. This paper introduces an approach for the design and optimization of fuzzy systems based upon Genetic Programming. To keep the emerging fuzzy systems transparent, a new framework for the definition of linguistic variables is also introduced.
Structural engineering projects are increasingly organized in networked cooperations due to permanently increasing competitive pressure and the high degree of complexity of the concurrent design activities. Software that intends to support such collaborative structural design processes faces enormous requirements. In the course of our common research work, we analyzed the pros and cons of applying both the peer-to-peer (University of Bonn) and the multi-agent architecture style (University of Bochum) within the field of collaborative structural design. In this paper, we join the benefits of both architecture styles in an integrated conceptual approach. We demonstrate the surplus value of the integrated multi-agent peer-to-peer approach by means of an example scenario in which several structural engineers co-operatively design the basic structural elements of an arched bridge using heterogeneous CAD systems.
Perception and processing of events in distributed planning for structural fire protection
(2003)
The building planning process is characterized by a high degree of cooperation between planning participants from different disciplines. On the one hand, designs are detailed on the basis of planning information from other participants; on the other hand, the designs of individual participants also set important constraints for the overall planning. This contribution describes an approach for the holistic support of distributed planning, using structural fire protection as an example. The approach accounts for the distributed and parallel planning practice applied today in the design of medium-sized and large buildings. Distribution is modelled not only for the planning participants; the individual pieces of planning information are also held in distributed form within the shared cooperation network. The focus of this contribution is on the perception of planning changes and events during planning and on the processing of this information to ensure consistent planning. This is achieved, on the one hand, by the CoBE awareness model, with which events can be detected and made available to the information network. On the other hand, the event handling and the subsequent domain-specific information processing are described with the help of a multi-agent system.
Electric trains are considered one of the most eco-friendly and safest means of transportation. Catenary poles are used worldwide to support overhead power lines for electric trains. The performance of the catenary poles has an extensive influence on the integrity of the train systems and, consequently, on the connected human services. It has become essential to develop structural health monitoring (SHM) systems that provide the instantaneous in-service status of catenary poles, making the decision whether to keep or repair damaged poles more feasible. This study develops a data-driven, model-free approach for the status monitoring of cantilever structures, focusing on pre-stressed, spun-cast ultrahigh-strength concrete catenary poles installed along high-speed train tracks. The proposed approach evaluates multiple damage features in a unified damage index, which leads to a straightforward interpretation and comparison of the output. Besides, it distinguishes between multiple damage scenarios of the poles, whether caused by material degradation of the concrete or by cracks that may propagate during the life span of the structure. Moreover, using a logistic function to classify the integrity of the structure avoids the expensive learning step of existing damage detection approaches, namely modern machine and deep learning methods. The findings of this study look very promising for application to other types of cantilever structures, such as poles that support power transmission lines, antenna masts, chimneys, and wind turbines.
This study proposes an efficient Bayesian, frequency-based damage identification approach to identify damages in cantilever structures with an acceptable error rate, even at high noise levels. The catenary poles of electric high-speed train systems were selected as a realistic case study to cover the objectives of this study. Compared to other frequency-based damage detection approaches described in the literature, the proposed approach is efficiently able to detect damages in cantilever structures to higher levels of damage detection, namely identifying both the damage location and severity using a low-cost structural health monitoring (SHM) system with a limited number of sensors; for example, accelerometers. The integration of Bayesian inference, as a stochastic framework, in the proposed approach, makes it possible to utilize the benefit of data fusion in merging the informative data from multiple damage features, which increases the quality and accuracy of the results. The findings provide the decision-maker with the information required to manage the maintenance, repair, or replacement procedures.
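The Bayesian identification of both damage location and severity from frequency shifts can be sketched with a grid posterior over a toy two-segment cantilever; the sensitivity numbers, frequency model, and noise level below are assumptions for illustration, not values from the study:

```python
import math

# Sensitivity of the first two natural frequencies to damage in each of two
# segments of a toy cantilever (illustrative numbers).
SENS = {0: (0.50, 0.10),   # damage near the fixed end mainly affects mode 1
        1: (0.10, 0.45)}   # damage near the tip mainly affects mode 2

def frequencies(location, severity, f0=(1.0, 6.0)):
    """Toy model: each frequency drops with the square root of the local
    stiffness reduction, weighted by the mode's sensitivity to the segment."""
    return tuple(f * math.sqrt(1.0 - severity * SENS[location][j])
                 for j, f in enumerate(f0))

def posterior(measured, sigma=0.01):
    """Grid-based Bayesian inference: flat prior over damage scenarios
    (location, severity), Gaussian likelihood on the measured frequencies."""
    grid = [(loc, s / 20) for loc in SENS for s in range(21)]
    logl = {th: -sum((m - f) ** 2 for m, f in zip(measured, frequencies(*th)))
                / (2 * sigma ** 2) for th in grid}
    zmax = max(logl.values())
    weights = {th: math.exp(l - zmax) for th, l in logl.items()}
    z = sum(weights.values())
    return {th: w / z for th, w in weights.items()}

true_state = (1, 0.30)                 # 30% stiffness loss in the tip segment
post = posterior(frequencies(*true_state))
best = max(post, key=post.get)
print(best)
```

Fusing further damage features would simply multiply additional likelihood terms into the same posterior, which is the data-fusion benefit the abstract refers to.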
Over the last decade, the technology of constructing buildings has developed dramatically, especially with the huge growth of CAD tools that help in modeling buildings, bridges, roads, and other construction objects. Often, quality control and dimensional accuracy checks in the factory or on the construction site are based on manual measurements of discrete points. These measured points of the realized object, or a part of it, are compared with the points of the corresponding CAD model to determine whether and where the construction element fits the respective CAD model. This process is complicated and laborious even with modern measuring technology, due to the complicated shape of the components, the large amount of manually collected measurement data, and the high cost of manually processing the measured values. By using a modern 3D scanner, however, one obtains information about the whole constructed object and can make a complete comparison against the CAD model, giving an impression of the overall quality of the object. In this paper, we present a case study of controlling measurement quality during the construction phase of a steel bridge using 3D point cloud technology. Preliminary results show that early detection of mismatches between the real element and the CAD model could save considerable time, effort, and, obviously, expense.
Humans are able to think, to feel, and to sense. We are also able to compute, but not very well. In contrast, computers are giants in computing, yet they cannot do anything besides computing. Appropriate combinations of the different gifts and strengths of human and computer may result in impressive performances. In the 3-Hirn approach, one human and two computers are involved, with different programs running on the computers. The human starts the machines and inspects the solutions they propose, compares these candidate solutions, and finally decides for one of the alternatives. So the human makes the final choice from a small number of computer proposals. In performance-oriented chess, 3-Hirn combinations consisting of an amateur player and commercial software have reached world-class level. 3-Hirn is a decision support system with a multiple choice structure. Such Multiple Choice Systems will be exhibited and discussed.
The node moving and multistage node enrichment adaptive refinement procedures are extended to the mixed discrete least squares meshless (MDLSM) method for the efficient analysis of elasticity problems. In the formulation of the MDLSM method, a mixed formulation is adopted to avoid second-order differentiation of the shape functions and to obtain displacements and stresses simultaneously. In the refinement procedures, a robust error estimator is used that is based on the value of the least-squares residual functional of the governing differential equations and their boundary conditions at the nodal points; it is inherently available from the MDLSM formulation and can efficiently identify zones with higher numerical errors. The results are compared with the refinement procedures in the irreducible formulation of the discrete least squares meshless (DLSM) method and show the accuracy and efficiency of the proposed procedures. The comparison of the error norms and convergence rates also shows the fidelity of the proposed adaptive refinement procedures in the MDLSM method.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process by building a system based on internet technology. The system links distributed databases and allows building partners to remotely access and manipulate specific information. It provides an updated building representation that is enriched and refined throughout the building production process. A recent collaboration with Nemetschek France (subsidiary company of Nemetschek AG, AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help building project actors choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility of building online building product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decision support to the construction works selection process. To do so, we consider each actor's role in the system and the pieces of information each one needs to achieve a given task.
This paper describes two new truss structures based on fractal geometry. One is the famous Sierpinski gasket; the other is a fractal triangle derived by applying a process that forms the leaves of a cedar tree, using M. F. Barnsley's contraction mapping theory. The x-y coordinates of an arbitrary nodal point of the structures are therefore generated easily once the IFS (Iterated Function System) codes and a scale are specified. Structural members are defined similarly. Thus data for frame analysis can be generated automatically, which is significant when the structure in question has a complex configuration. Next, analytical results under the vertical and wind loadings of the Japanese Building Code are shown. Here the members are assumed to be timber with a cross section of 15 cm × 15 cm. Finally, the authors conclude that geometrically new truss structures were developed and that automatic data generation for frame analysis was attained using IFS. The analytical results show that they contribute to saving material compared with a king-post truss.
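The automatic node generation described above can be sketched for the Sierpinski gasket. The corner coordinates and the deterministic set iteration below are illustrative assumptions, not the authors' IFS codes.

```python
# The three contraction maps of the Sierpinski gasket: each scales a
# point by 1/2 towards one of the triangle's corners (the IFS code).
CORNERS = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.75)]

def ifs_map(point, corner, ratio=0.5):
    """One contraction: move `point` towards `corner` by `ratio`."""
    x, y = point
    cx, cy = corner
    return (cx + ratio * (x - cx), cy + ratio * (y - cy))

def gasket_nodes(depth):
    """Nodal coordinates of the depth-`depth` Sierpinski gasket,
    generated deterministically by applying all three maps to the
    current node set in each iteration."""
    points = set(CORNERS)
    for _ in range(depth):
        points = {ifs_map(p, c) for p in points for c in CORNERS}
        points |= set(CORNERS)
    return points

# depth 1 gives 6 nodes (corners + midpoints), depth 2 gives 15
nodes_level2 = gasket_nodes(2)
```

From such a node set, member data for frame analysis follow by connecting the nodes of each smallest sub-triangle, which is what makes the fully automatic data generation attractive for complex configurations.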
Current disaster management procedures rely primarily on heuristics, which results in strategies that are very cautious and sub-optimal in terms of saving life, minimising damage and returning the building to its normal function. Effective disaster management also demands decentralized, dynamic, flexible, short-term and cross-domain resource sharing, which is not well supported by existing distributed computing infrastructures. The paper proposes a conceptual framework for emergency management in the built environment, using the Semantic Grid as an integrating platform for different technologies. The framework supports a distributed network of specialists in the built environment, including structural engineers, building technologists, decision analysts etc. It brings together the necessary technology threads, including the Semantic Web (to provide a framework for shared definitions of terms, resources and relationships), Web Services (to provide dynamic discovery and integration) and Grid Computing (for enhanced computational power, high-speed access, collaboration and security control), to support the rapid formation of virtual teams for disaster management. The proposed framework also makes extensive use of modelling and simulation (both numerical and using visualisations), data mining (to find resources in legacy data sets) and visualisation. It also includes a variety of hardware instruments with access to real-time data. Furthermore, the whole framework is centred on collaborative working by the virtual team. Although the focus of this paper is on disaster management, many aspects of the discussed Grid and visualisation technologies will be useful for other forms of collaboration as well. Conclusions are drawn about the possible future impact on the built environment.
Interactive visualization based on 3D computer graphics is nowadays an indispensable part of any simulation software used in engineering. Nevertheless, the implementation of such visualization software components is often avoided in research projects because it is a challenging and potentially time-consuming task. In this contribution, a novel Java framework for the interactive visualization of engineering models is introduced. It supports the task of implementing engineering visualization software by providing adequate program logic as well as high-level classes for the visual representation of entities typical of engineering models. The presented framework is built on top of the open-source visualization toolkit VTK. In VTK, a visualization model is established by connecting several filter objects in a so-called visualization pipeline. Although designing and implementing a good pipeline layout is demanding, VTK does not directly support the reuse of pipeline layouts. Our framework tailors VTK to engineering applications on two levels. On the first level, it adds new, engineering-model-specific filter classes to VTK. On the second level, ready-made pipeline layouts for certain aspects of engineering models are provided. For instance, there is a pipeline class for one-dimensional elements like trusses and beams that is capable of showing the elements along with deformations and member forces. In order to facilitate the implementation of a graphical user interface (GUI) for each pipeline class, there is a reusable Java Swing GUI component that allows the user to configure the appearance of the visualization model. Because of its flexible structure, the framework can easily be adapted and extended to new problem domains.
Currently it is used in (i) an object-oriented p-version finite element code for design optimization, (ii) an agent-based monitoring system for dam structures and (iii) the simulation of destruction processes by controlled explosives based on multibody dynamics. Application examples from all three domains illustrate that the presented approach is powerful as well as versatile.
The optimization of continuous structures requires careful attention to discretization errors. Compared to an ordinary low-order formulation (h-elements) in conjunction with adaptive mesh refinement in each optimization step, the use of high-order finite elements (so-called p-elements) has several advantages. However, compared to the h-method, a higher-order finite element analysis program poses higher demands from a software engineering point of view. In this article, the basics of an object-oriented higher-order finite element system especially tailored to use in structural optimization are presented. Besides the design of the system, aspects related to the employed implementation language Java are discussed.
A multicriteria statement of the above-mentioned problem is presented. It differs from the classical statement of the spanning tree problem. The quality of a solution is estimated by a vector objective function which contains weight criteria as well as topological criteria (degree and diameter of the tree). Many real processes are not yet deterministic, which is why the investigation of stability is very important; many errors are connected with calculations. The stability analysis of vector combinatorial problems makes it possible to determine the magnitude of changes in the initial data for which the optimal solution remains unchanged. Furthermore, the investigation of stability allows a whole class of problems to be constructed from a single problem by means of parameter variations. Analysing the problems belonging to this class allows an exact and adequate description of the model.
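To make the vector objective concrete, a minimal sketch evaluating the three criteria (total weight, maximum node degree, tree diameter) for a given spanning tree might look as follows; the edge-list representation is an assumption for the example, not the authors' formulation.

```python
from collections import deque

def tree_criteria(n, edges):
    """Vector objective of a spanning tree on nodes 0..n-1:
    (total weight, maximum node degree, diameter in edges).
    `edges` is a list of (u, v, weight) tuples."""
    adj = {v: [] for v in range(n)}
    weight = 0
    for u, v, w in edges:
        adj[u].append(v)
        adj[v].append(u)
        weight += w

    def eccentricity(start):
        # BFS distances (in edges) from `start`
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        far = max(dist, key=dist.get)
        return dist[far], far

    # double BFS yields the diameter of a tree exactly
    _, far = eccentricity(0)
    diameter, _ = eccentricity(far)
    max_degree = max(len(adj[v]) for v in adj)
    return weight, max_degree, diameter

# star on 4 nodes: weight 6, max degree 3, diameter 2
print(tree_criteria(4, [(0, 1, 1), (0, 2, 2), (0, 3, 3)]))
```

A Pareto comparison of candidate trees by such vectors is one natural way to rank solutions when weight and topological criteria conflict.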
>CyberCity< is a concept that, through a virtual replica of the spatial reality of a city (Berlin), provides a familiar perceptual environment to ease orientation and navigation, so that desired information can be reached as quickly and vividly as possible via this virtual browser. This environment model is also suitable as a simulation model for visualizing urban assessments of new projects, traffic engineering measures and ecological loads. In particular, it is intended as an orientation environment for telepresence over communication networks, which acquires a particular social significance through virtual representatives (avatars).
Poland is not situated in any seismic region of the earth; however, there are still areas where underground mining is conducted. In these areas, so-called 'paraseismic tremors' are very frequent phenomena. When a building examination is carried out in order to assess its safety, a complete analysis is necessary in which the influence of tremors is included. To decide whether a building is able to carry dynamic loads, it is necessary to compute its dynamic characteristics, i.e. natural frequencies, which is not possible using standard techniques. After an in-situ diagnosis of the building by an expert, computer techniques together with specialized software for dynamic, static and strength analyses become a suitable tool. In this paper, special attention is paid to a typical twelve-storey WGP (Wroclaw Great Plate) prefabricated building with regard to its special type of joints. During dynamic actions these joints have a decisive influence on the building's behavior. Paraseismic tremors are especially dangerous for these buildings and can cause pre-failure states. Laboratory investigations of a part of a building or of a separate joint can be difficult and very expensive; therefore computer modeling was used to investigate the behavior of such elements and of whole buildings under different kinds of loads.
Building Information Modeling is a powerful tool for design and for maintaining a consistent set of data in a virtual repository. For application in the realization phases and on site, it needs further development. The paper describes the main challenges and main features that will help software development to better serve the needs of construction site managers.
The development of 3D technologies in many different areas during the last decades leads us towards a complete 3D representation of planet earth at a high level of detail. On the lowest level we have geographical information systems (GIS) representing the outer layer of our planet as a 3D model. Meanwhile, these systems do not only provide a geographical model but also present additional information such as ownership, infrastructure and other data that might be of interest for the construction business. In the future these systems will serve as a basis for virtual environments for the planning and simulation of construction sites. In addition, work is being done on the integration of GIS systems with 3D city models in the area of urban planning, and thus on the integration of different levels of detail. This article presents research on the use of 3D models in construction at the next level of detail below that of urban planning. The 3D city model is taken as the basis for the 3D model of the construction site. In this virtual nD world, a contractor can organize and plan his resources, simulate different variants of construction processes and thus find the most effective solution with regard to costs and time. On the basis of previous research, the authors present a new approach to cost estimation and simulation using development technologies from game software.
The paper analyses the application of 3D gaming technologies in the simulation of processes associated with human resources and machinery on construction sites in order to determine process costs. It addresses the problem of the level of detail in process simulation. The authors outline special boundary conditions for the simulation of cost-relevant resource processes on virtual construction sites. The approach considers the different needs for detail in process simulation during the planning and building phases. For the simulation of process costs on a construction site (contractor's view), the level of detail has to be set high. A prototype for the determination of process durations (and thereby process costs) developed at the Bauhaus-Universität Weimar is presented as a result of ongoing research on the level of detail in process simulation. It demonstrates the method of process cost determination at a high level of detail (interaction between excavator and truck) through interaction with the virtual environment of the site.
The authors present fundamentals and methods for preparing tenders and performing cost calculations that work directly with three-dimensional building component models. This contributes to describing the entire life cycle of a building consistently by means of 3D models. The first section presents basic considerations on the use of PLM/PDM technologies in construction. The conventional tendering methodology is then analysed. The differences in data structure between a conventional bill of quantities and a three-dimensional component catalogue (object database) are presented. From this, methods and processes for creating a 3D specification of works are derived, and a suitable user interface for a three-dimensional component catalogue is presented. Finally, practical problems in the use of component catalogues are discussed. A key building block is the cost calculation: calculation methods based on a two-dimensional bill of quantities and on a three-dimensional component catalogue are compared. This is complemented by solutions for connecting electronic marketplaces to the component catalogue for pricing purposes. Finally, an outlook is given on how a synthesis between a three-dimensional component catalogue and standard textual specifications of works can be achieved.
The use of CAD applications has decisively changed planning work in construction. In design planning, the changes are already clearly visible. Since the introduction of three-dimensional planning tools, the client can view and evaluate the building in its virtual entirety on the computer. This offers many advantages: the planner is forced to plan all essential details exactly a priori, whereby many errors can be avoided. This opens up new fields of application, in particular for planning in construction companies, e.g. within work preparation.
The paper introduces a systematic construction management approach supporting the expansion of a specified construction process, both automatically and semi-automatically. Throughout the whole design process, many requirements must be taken into account in order to fulfil the demands defined by clients. In implementing those demands into a design concept and up to the execution plan, constraints such as site conditions, building codes, and the legal framework are to be considered. However, the complete information needed to make a sound decision is not yet available in the early phase; decisions are traditionally taken based on experience and assumptions. Due to the vast number of appropriate available solutions, particularly in building projects, it is necessary to make those decisions traceable. This is important in order to be able to reconstruct the considerations and assumptions taken, should the project's objectives change in the future. The research is carried out by means of building information modelling, where rules derived from the standard logic of construction management knowledge are applied. The knowledge comprises the comprehensive interaction amongst the bidding process, cost estimation, construction site preparation as well as specific project logistics, which are usually still considered separately. By means of these rules, favourable decisions regarding prefabrication and in-situ implementation can be justified. Modifications depending on the information available within the current design stage will be consistently traceable.
Due to economic, technical or political reasons, about 100 nuclear power plants all over the world have been shut down to date. All these power stations are still awaiting their complete dismantling, which, for a single reactor, costs up to one billion euros and takes up to 15 years. In our contribution we present a resource-constrained project scheduling approach minimizing the total discounted cost of dismantling a nuclear power plant. Such a project can be subdivided into a number of disassembly activities. The execution of these activities requires time and scarce resources such as manpower, special equipment or storage facilities for the contaminated material arising from the dismantling. Moreover, several minimum and maximum time lags (temporal constraints) between the start times of the different activities have to be observed. Finally, each disassembly activity can be processed in two alternative execution modes, which lead to different disbursements and determine the resource requirements of the considered activity. The optimization problem is to determine a start time and an execution mode for each activity such that the discounted cost of the project is minimal, no temporal constraints are violated, and the activities' resource requirements do not exceed the availability of any scarce resource at any point in time. In our contribution we introduce an appropriate multi-mode project scheduling model with minimum and maximum time lags as well as renewable and cumulative resources for the described optimization problem. Furthermore, we show that the considered optimization problem is NP-hard in the strong sense. For small problem instances, optimal solutions can be obtained from a relaxation-based enumeration approach incorporated into a branch-and-bound algorithm. In order to be able to solve large problem instances, we also propose a truncated version of the devised branch-and-bound algorithm.
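The objective and the temporal constraints of such a model can be sketched in a few lines; this is an illustration of the problem structure, not the authors' branch-and-bound model, and all activity names, rates and mode data are invented for the example.

```python
from math import exp

def discounted_cost(schedule, rate):
    """Net present value of the project disbursements.

    `schedule` maps activity -> (start_time, disbursement); each
    disbursement is assumed to fall due when the activity starts.
    Continuous discounting: cost * exp(-rate * start_time)."""
    return sum(c * exp(-rate * t) for t, c in schedule.values())

def respects_time_lags(starts, lags):
    """Minimum/maximum time lags: each entry (i, j, d_min, d_max)
    requires d_min <= starts[j] - starts[i] <= d_max."""
    return all(d_min <= starts[j] - starts[i] <= d_max
               for i, j, d_min, d_max in lags)

# two alternative execution modes for one activity "B": mode 1 is
# fast but expensive, mode 2 slower but cheaper (illustrative numbers);
# discounting can make the later, cheaper disbursement preferable
modes = {1: (5, 80.0), 2: (9, 70.0)}
best = min(modes, key=lambda m: discounted_cost({"B": modes[m]}, 0.1))
```

In the full problem, mode choice also changes the resource profile, so a feasible schedule must additionally satisfy the renewable and cumulative resource limits at every point in time.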
We investigate aspects of the reliability of tram-network sections, which forms part of a reliability model of the whole city tram network. One of the main points of interest is the chronological development of disturbances (namely the differences between the departure time provided in the schedule and the real departure time) on subsequent sections during tram line operation. These developments were observed in comprehensive measurements carried out in Krakow during the rebuilding of one of the main transportation nodes (Rondo Mogilskie). The construction activities caused large disturbances in tram line operation, with effects extending to neighboring sections. In a second part, the stochastic character of the section running time is analyzed in more detail. We consider sections with only one beginning stop as well as sections with two or three beginning stops located at different streets of an intersection. The possibility of merging results from sections with two beginning stops into one data set is checked with suitable statistical tests for comparing the means of two samples. The section running time may depend on the gap between two successive trams and on the deviation from the schedule; this dependence is described by a multiple regression formula. The main measurements were carried out in the city center of Krakow in two stages: before and after major changes in the tramway infrastructure.
Model of Tram Line Operation
(2006)
From the passenger's perspective, punctuality is one of the most important features of tram operations. Unfortunately, in most cases this requirement is only insufficiently fulfilled. In this paper we present a simulation model for tram operations with a special focus on punctuality. The aim is to obtain a helpful tool for designing timetables and for analyzing the effects of changing tram priorities at traffic lights or the kind of track separation. A realization of tram operations is assumed to be a sequence of running times between successive stops and times spent by the tram at the stops. In this paper the running time is modeled as the sum of its mean value and a zero-mean random variable. With the help of multiple regression we find that the average running time is a function of the length of the sections and the number of intersections. The random component is modeled as the sum of two independent zero-mean random variables: one describes the disturbance caused by waiting at an intersection, the other the disturbance caused by the driving process. The time spent at a stop is assumed to be a random variable, too. Its distribution is estimated from measurements of these stop times for different tram lines in Kraków. Finally, a special case of the introduced model is considered and numerical results are presented. This paper is associated with the CIVITAS-CARAVEL project "Clean and better transport in cities". The project has received research funding from the Community's Sixth Framework Programme. The paper reflects only the authors' views, and the Community is not liable for any use that may be made of the information contained therein.
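The model structure described above can be sketched as a simple Monte Carlo simulation. All coefficients, noise standard deviations and distributions below are illustrative assumptions, not the values estimated from the Kraków measurements.

```python
import random

def simulate_run(sections, coeffs, seed=None):
    """One realization of a tram run along a line.

    `sections` is a list of (length_m, n_intersections, stop_time_mean);
    `coeffs` = (a, b, c) gives the regression mean of the running time,
    a + b*length + c*intersections. Two independent zero-mean Gaussian
    disturbances model waiting at intersections and driving; the stop
    time is drawn from an (assumed) exponential distribution."""
    rng = random.Random(seed)
    a, b, c = coeffs
    total = 0.0
    for length, crossings, stop_mean in sections:
        mean_run = a + b * length + c * crossings
        wait_noise = rng.gauss(0.0, 5.0)   # intersection waiting
        drive_noise = rng.gauss(0.0, 3.0)  # driving disturbance
        stop_time = rng.expovariate(1.0 / stop_mean)
        total += mean_run + wait_noise + drive_noise + stop_time
    return total

# averaging many realizations approaches the sum of mean components
runs = [simulate_run([(400, 2, 20.0), (600, 1, 25.0)],
                     coeffs=(10.0, 0.08, 8.0), seed=i)
        for i in range(2000)]
avg = sum(runs) / len(runs)
```

Such a simulator makes it possible to compare timetable variants by the distribution of total travel time, not just its mean, which is what matters for punctuality.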
The ride of a tram along its line, defined by a timetable, consists of the travel times between successive stops and the times spent by the tram at the stops. In this paper, statistical data collected in the city of Krakow is presented and evaluated. Under Polish conditions, the time spent at stops makes up a remarkable 30% of the total time of tram line operation. Moreover, this time is characterized by large variability. The time spent by a tram at a stop consists of the alighting and boarding time and the time lost at the stop after alighting and boarding have ended but before departure. The alighting and boarding time itself usually depends on the random number of alighting and boarding passengers and also on the number of passengers inside the vehicle. The time spent at the stop after the end of alighting and boarding, however, is the effect of certain random events, mainly the impossibility of departing from the stop due to the lack of priorities for public transport vehicles. The main focus of the paper lies on the description and modelling of these effects. This paper is associated with the CIVITAS-CARAVEL project "Clean and better transport in cities". The project has received research funding from the Community's Sixth Framework Programme. The paper reflects only the authors' views, and the Community is not liable for any use that may be made of the information contained therein.
From the passenger's perspective, punctuality is one of the most important features of tram route operation. We present a stochastic simulation model with a special focus on determining the important factors of influence. The statistical analysis is based on large samples (sample size nearly 2000) accumulated from comprehensive measurements on eight tram routes in Cracow. For the simulation, we are interested not only in average values but also in stochastic characteristics like the variance and other properties of the distribution. A realization of tram operations is assumed to be a sequence of running times between successive stops and times spent by the tram at the stops, divided into passengers' alighting and boarding times and times waiting for the possibility of departure. The running time depends on the kind of track separation, including the priorities at traffic lights, the length of the section and the number of intersections. For every type of section, a linear mixed regression model describes the average running time and its variance as functions of the length of the section and the number of intersections. The regression coefficients are estimated by the iteratively re-weighted least squares method. The alighting and boarding time mainly depends on the type of vehicle, the number of passengers alighting and boarding, and the occupancy of the vehicle. For the distribution of the time waiting for the possibility of departure, suitable distributions like the Gamma and the Lognormal distribution are fitted.
The idea of a simulation program to support urban planning is explained: four different, clearly defined development paths can be calculated for the rebuilding of a shrinking town. Aided by self-organization principles, a complex system can be created. The dynamics are based on the action patterns of single actors, whose behaviour in turn cyclically depends on the generated structure. The global influences which control the development can be divided into a spatial, a socioeconomic, and an organizational-juridical level. The simulation model is intended to offer conclusions on new planning strategies, especially in the context of the creation process of rebuilding measures. The example of a transportation system is shown by means of prototypes for the visualisation of the dynamic development process.
This thesis deals with variance-reducing methods for the Monte Carlo simulation of stochastic processes, for the purpose of assessing the reliability of building structures with nonlinear system behaviour. Chapter 2 is a literature review of variance-reducing Monte Carlo methods. In Chapter 3, the spectral representation of a stationary scalar Gaussian process is derived, and various simulation models are discussed on this basis. The variance-reducing simulation method developed in Chapter 4 is based on the spectral representation. After a first pilot simulation, the frequencies for introducing random amplitudes are determined and their parameters are fitted. The second run uses these parameters according to the principle of importance sampling. In Chapter 5, the method is applied to a bridge under earthquake loading. The bridge is equipped with so-called hysteretic devices for energy dissipation. The accuracy and efficiency of the simulation method as well as the performance of the hysteretic devices for the seismic retrofitting of structures are demonstrated.
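The spectral representation underlying the method can be sketched as follows. The flat example spectrum and the discretization are illustrative assumptions, and the sketch deliberately omits the random-amplitude and importance-sampling steps of the thesis.

```python
import math
import random

def spectral_sample(spectrum, omega_max, n_terms, t_grid, seed=None):
    """One realization of a zero-mean stationary Gaussian process via
    the spectral representation
        X(t) = sum_k sqrt(2 * S(w_k) * dw) * cos(w_k * t + phi_k),
    with independent phases phi_k ~ U(0, 2*pi). `spectrum` is the
    one-sided power spectral density S(w), truncated at omega_max."""
    rng = random.Random(seed)
    dw = omega_max / n_terms
    # midpoint frequencies and amplitudes from the discretized spectrum
    omegas = [(k + 0.5) * dw for k in range(n_terms)]
    amps = [math.sqrt(2.0 * spectrum(w) * dw) for w in omegas]
    phis = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_terms)]
    return [sum(A * math.cos(w * t + p)
                for A, w, p in zip(amps, omegas, phis))
            for t in t_grid]

# flat spectrum S(w) = 0.5 on [0, 10]: the process variance equals
# the integral of S, i.e. 5.0
x = spectral_sample(lambda w: 0.5, 10.0, 256, [0.0, 0.1, 0.2], seed=1)
```

In the variance-reducing scheme, selected terms of this sum additionally receive random amplitudes whose parameters are fitted after the pilot run, biasing the samples towards failure-relevant realizations in the importance-sampling sense.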