56.03 Methoden im Bauingenieurwesen
Document Type
- Conference Proceeding (599)
- Article (143)
- Doctoral Thesis (28)
- Master's Thesis (4)
- Diploma Thesis (1)
Institute
- Professur Informatik im Bauwesen (467)
- In Zusammenarbeit mit der Bauhaus-Universität Weimar (173)
- Graduiertenkolleg 1462 (34)
- Institut für Strukturmechanik (ISM) (30)
- Professur Angewandte Mathematik (18)
- Institut für Konstruktiven Ingenieurbau (IKI) (8)
- Professur Baubetrieb und Bauverfahren (8)
- Professur Baumechanik (7)
- Professur Stahlbau (6)
- Professur Stochastik und Optimierung (5)
- Professur Massivbau I (4)
- Institut für Bauinformatik, Mathematik und Bauphysik (IBMB) (3)
- Professur Bauphysik (3)
- Professur Informatik in der Architektur (3)
- Professur Modellierung und Simulation - Konstruktion (3)
- Professur Bodenmechanik (2)
- Professur Computer Vision in Engineering (2)
- Professur Holz- und Mauerwerksbau (2)
- Professur Informations- und Wissensverarbeitung (2)
- Professur Tragwerkslehre (2)
- bauhaus.institut für experimentelle Architektur (2)
- Juniorprofessur CAD in der Bauinformatik (1)
- Juniorprofessur Simulation und Experiment (1)
- Professur Baustatik und Bauteilfestigkeit (1)
- Professur Planung von Ingenieurbauten (1)
Keywords
- Computerunterstütztes Verfahren (287)
- Architektur <Informatik> (198)
- CAD (164)
- Angewandte Mathematik (144)
- Angewandte Informatik (143)
- Computer Science Models in Engineering; Multiscale and Multiphysical Models; Scientific Computing (72)
- Modellierung (65)
- Bauwerk (50)
- Finite-Elemente-Methode (46)
- Building Information Modeling (38)
SYSWELD Forum 2011
(2011)
On 25 and 26 October 2011, 70 national and international experts from research and practice met at the Bauhaus-Universität Weimar for the fourth SYSWELD Forum to exchange views on current developments in numerical simulation in the field of heat treatment and welding. Numerical simulation in the field of welding and heat treatment has advanced considerably in recent years and offers a forward-looking and innovative field of work for engineers.
The 19th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus University Weimar from 4 to 6 July 2012. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss them. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus University Weimar from 20 to 22 July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss them. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
Long-span cable-supported bridges are prone to wind-induced aerodynamic instabilities, and this phenomenon is usually a major design criterion. If the wind speed exceeds the critical flutter speed of the bridge, this constitutes an ultimate limit state. The prediction of the flutter boundary therefore requires accurate and robust models. This paper aims at studying various combinations of models to predict the flutter phenomenon.
Since flutter couples aerodynamic forcing with a structural dynamics problem, different types and classes of models can be combined to study the interaction. Here, both numerical approaches and analytical models are utilised and coupled in different ways to assess the prediction quality of the hybrid model. The models employed for the aerodynamic forces are the analytical Theodorsen expressions for the motion-induced aerodynamic forces of a flat plate and Scanlan derivatives as a meta-model. Further, Computational Fluid Dynamics (CFD) simulations using the Vortex Particle Method (VPM) were used to cover numerical models.
The structural representations were dimensionally reduced to two-degree-of-freedom section models calibrated from global models, as well as a fully three-dimensional Finite Element (FE) model. The two-degree-of-freedom system was analysed both analytically and numerically.
Generally, all models were able to predict the flutter phenomenon and relatively close agreement was found for the particular bridge. In conclusion, the model choice for a given practical analysis scenario will be discussed in the context of the analysis findings.
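As a sketch of how such a hybrid flutter check can be organised, the following Python fragment assembles a two-degree-of-freedom (heave/pitch) section model, adds aeroelastic damping and stiffness matrices built from Scanlan-type flutter derivatives, and sweeps the wind speed until a complex eigenvalue crosses into the right half-plane. All structural values and the derivative functions are illustrative placeholders, not data or expressions from the paper.

```python
# Sketch: 2-DOF (heave/pitch) flutter boundary search via complex eigenvalues.
# The flutter derivatives below are illustrative placeholders; in practice
# H1*..H4*, A1*..A4* come from Theodorsen theory, wind tunnel tests, or CFD.
import numpy as np

rho, B = 1.25, 31.0          # air density [kg/m^3], deck width [m] (assumed)
m, I = 22740.0, 2.47e6       # mass / mass moment of inertia per unit length (assumed)
f_h, f_a = 0.10, 0.278       # heave/pitch natural frequencies [Hz] (assumed)
zeta = 0.003                 # structural damping ratio (assumed)

M = np.diag([m, I])
K = np.diag([m * (2*np.pi*f_h)**2, I * (2*np.pi*f_a)**2])
C = np.diag([2*zeta*np.sqrt(M[0, 0]*K[0, 0]), 2*zeta*np.sqrt(M[1, 1]*K[1, 1])])

def derivatives(U_red):
    """Placeholder trends of the eight Scanlan derivatives vs reduced velocity."""
    H1, H2, H3, H4 = -2.0*U_red, 0.25*U_red, -1.5*U_red**2, 0.1*U_red
    A1, A2, A3, A4 = 0.4*U_red, -0.08*U_red, 0.35*U_red**2, 0.02*U_red
    return H1, H2, H3, H4, A1, A2, A3, A4

def max_real_part(U, omega0=2*np.pi*f_a):
    """Iterate on the oscillation frequency; return largest eigenvalue real part."""
    omega = omega0
    for _ in range(30):
        U_red = U / (omega / (2*np.pi) * B)       # reduced velocity U/(f B)
        H1, H2, H3, H4, A1, A2, A3, A4 = derivatives(U_red)
        q = 0.5 * rho * U**2
        k = omega * B / U                          # reduced frequency
        Cae = q*B/U * np.array([[k*H1,    k*H2*B],
                                [k*A1*B,  k*A2*B**2]])
        Kae = q * np.array([[k**2*H4,    k**2*H3*B],
                            [k**2*A4*B,  k**2*A3*B**2]])
        A_sys = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [-np.linalg.solve(M, K - Kae), -np.linalg.solve(M, C - Cae)]])
        lam = np.linalg.eigvals(A_sys)
        omega = abs(lam[np.argmax(lam.real)].imag) or omega
    return lam.real.max()

for U in np.arange(5.0, 120.0, 5.0):               # wind speed sweep [m/s]
    if max_real_part(U) > 0.0:                     # instability: flutter onset
        print(f"flutter onset near U = {U:.0f} m/s (placeholder data)")
        break
```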
An integrated structural engineering system usually consists of a large number of design objects that may be distributed across different platforms. These design objects need to communicate data and information among each other, and efficient communication among them requires a common communication protocol. This paper presents the elements of a communication protocol that uses a mediator agent to facilitate communication among design objects, termed the Mediative Communication Protocol (MCP). The protocol uses certain design communication performatives and the semantics of an Agent Communication Language (ACL), mainly the Knowledge Query and Manipulation Language (KQML), to implement its steps. Details of a mediator agent that facilitates the communication among design objects are presented. The Unified Modeling Language (UML) is used to present the mediative protocol and to show how the mediator agent can be used to execute its steps. An example from a structural engineering application is presented to demonstrate and validate the protocol. It is concluded that the mediative protocol is a viable means of facilitating object-to-object communication and also has the potential to facilitate communication among the different project participants at the higher levels of integrated structural engineering systems.
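As an illustration of the protocol's flavour, the sketch below models KQML-style messages and a mediator that routes "ask-one" queries to design objects that have advertised a capability. The performative handling, the registry, and all names are assumptions for illustration; the paper's MCP steps and performative set are richer.

```python
# Sketch of KQML-style performatives routed through a mediator agent,
# loosely in the spirit of the Mediative Communication Protocol (MCP).
from dataclasses import dataclass

@dataclass
class KQMLMessage:
    performative: str          # e.g. "ask-one", "tell", "advertise"
    sender: str
    receiver: str
    content: str               # domain content, e.g. a design query
    language: str = "KIF"      # content language (assumed)
    ontology: str = "structural-design"
    reply_with: str | None = None
    in_reply_to: str | None = None

class MediatorAgent:
    """Design objects register capabilities; queries are routed to a provider."""
    def __init__(self):
        self.capabilities: dict[str, str] = {}   # topic -> design object name

    def handle(self, msg: KQMLMessage) -> KQMLMessage | None:
        if msg.performative == "advertise":
            self.capabilities[msg.content] = msg.sender
            return None
        if msg.performative == "ask-one":
            provider = self.capabilities.get(msg.content)
            if provider is None:
                return KQMLMessage("sorry", "mediator", msg.sender, msg.content)
            # forward the query to the design object that advertised the topic
            return KQMLMessage("ask-one", "mediator", provider, msg.content,
                               reply_with=msg.reply_with, in_reply_to=msg.sender)
        raise ValueError(f"unsupported performative: {msg.performative}")

mediator = MediatorAgent()
mediator.handle(KQMLMessage("advertise", "beam-designer", "mediator", "beam-section"))
print(mediator.handle(KQMLMessage("ask-one", "slab-designer", "mediator",
                                  "beam-section", reply_with="q1")))
```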
Identification of modal parameters of a space frame structure is a complex assignment due to a large number of degrees of freedom, close natural frequencies, and different vibrating mechanisms. Research has been carried out on the modal identification of rather simple truss structures. So far, less attention has been given to complex three-dimensional truss structures. This work develops a vibration-based methodology for determining modal information of three-dimensional space truss structures. The method uses a relatively complex space truss structure for its verification. Numerical modelling of the system gives modal information about the expected vibration behaviour. The identification process involves closely spaced modes that are characterised by local and global vibration mechanisms. To distinguish between local and global vibrations of the system, modal strain energies are used as an indicator. The experimental validation, which incorporated a modal analysis employing the stochastic subspace identification method, has confirmed that considering relatively high model orders is required to identify specific mode shapes. Especially in the case of the determination of local deformation modes of space truss members, higher model orders have to be taken into account than in the modal identification of most other types of structures.
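A compact sketch of the covariance-driven variant of the stochastic subspace identification method mentioned in the abstract; the block size, model order, and the surrogate two-mode signal are illustrative assumptions.

```python
# Sketch of covariance-driven stochastic subspace identification (SSI-cov).
import numpy as np

def ssi_cov(Y, dt, order, blocks=20):
    """Y: (channels, samples) output-only data; returns natural frequencies [Hz]."""
    l, N = Y.shape
    # output correlation matrices R_i = E[y_{k+i} y_k^T]
    R = [Y[:, i:] @ Y[:, :N - i].T / (N - i) for i in range(2 * blocks + 1)]
    # block Hankel matrix of correlations
    H = np.block([[R[i + j + 1] for j in range(blocks)] for i in range(blocks)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # observability matrix from the dominant subspace (model order = 'order')
    O = U[:, :order] * np.sqrt(s[:order])
    A = np.linalg.pinv(O[:-l]) @ O[l:]          # shift-invariance property
    mu = np.linalg.eigvals(A)                   # discrete-time poles
    lam = np.log(mu) / dt                       # continuous-time poles
    f = np.abs(lam) / (2 * np.pi)
    zeta = -lam.real / np.abs(lam)
    keep = (np.abs(zeta) < 0.2) & (lam.imag > 0)   # crude pole filtering
    return np.sort(f[keep])

# usage sketch with surrogate data (a noisy two-mode signal, purely illustrative)
dt = 0.01
t = np.arange(0, 60, dt)
Y = np.vstack([np.sin(2*np.pi*1.7*t) + 0.5*np.sin(2*np.pi*6.3*t),
               np.cos(2*np.pi*1.7*t) - 0.3*np.sin(2*np.pi*6.3*t)])
Y += 0.1 * np.random.randn(*Y.shape)
print(ssi_cov(Y, dt, order=8))   # approx. 1.7 and 6.3 Hz among spurious poles
```

The abstract's point about model order shows up directly here: with `order` set too low, closely spaced or local modes are missed, so in practice one computes a stabilization diagram over many orders.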
In a superelliptic shell joined to a circular cylinder, bending stresses are absent when the shell is subjected to uniform pressure. Some geometrical characteristics have been found. Expressions for determining stresses at the shell crest (a singular point of plane type) are suggested. The problem of the theoretical critical buckling load of an elongated shell supported by frames is studied. The critical buckling load of two shells with different specifications was found experimentally.
A method is given for automatically maintaining the vibration amplitude of a number of mechanisms at a given level when the exciting force amplitude varies greatly. For this purpose, a pendulum is attached to a mechanism through a viscoelastic hinge. The load of the pendulum can move along an arm, to which it is viscoelastically connected.
Fuzzy functions are suitable to deal with uncertainties and fuzziness in a closed form while maintaining the informational content. This paper tries to understand, elaborate, and explain the problem of interpolating crisp and fuzzy data using continuous fuzzy-valued functions. Two main issues are addressed here. The first covers how the fuzziness induced by the reduction and deficit of information, i.e. the discontinuity of the interpolated points, can be evaluated considering the interpolation method used and the density of the data. The second issue deals with the need to differentiate between impreciseness, and hence fuzziness, only in the interpolated quantity, impreciseness only in the location of the interpolated points, and impreciseness in both the quantity and the location. In this paper, a brief background of the concept of fuzzy numbers and of fuzzy functions is presented. The numerical side of computing with fuzzy numbers is concisely demonstrated. The problems of fuzzy polynomial interpolation, interpolation on meshes, and mesh-free fuzzy interpolation are investigated. The integration of the previously noted uncertainty into a coherent fuzzy-valued function is discussed. Several sets of artificial and original measured data are used to examine the mentioned fuzzy interpolations.
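A minimal sketch of one of the discussed cases, impreciseness only in the interpolated quantity, using triangular fuzzy numbers and linear interpolation of the alpha-cut bounds; the fuzzy-number shape and the interpolant are assumptions for illustration.

```python
# Sketch: interpolating fuzzy-valued data by interval arithmetic on alpha-cuts.
import numpy as np

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def fuzzy_interp(x, xs, fuzzy_ys, alphas=np.linspace(0, 1, 5)):
    """Interpolate fuzzy values given at crisp locations xs; returns the
    alpha-cuts of the fuzzy result at x (impreciseness only in the quantity)."""
    cuts = []
    for alpha in alphas:
        los, his = zip(*(alpha_cut(y, alpha) for y in fuzzy_ys))
        # lower and upper bounds are interpolated separately; with linear
        # interpolation the weights are non-negative, so this is exact
        cuts.append((alpha, np.interp(x, xs, los), np.interp(x, xs, his)))
    return cuts

xs = [0.0, 1.0, 2.0]
ys = [(0.8, 1.0, 1.3), (1.9, 2.0, 2.2), (1.4, 1.6, 1.9)]   # (left, peak, right)
for alpha, lo, hi in fuzzy_interp(0.5, xs, ys):
    print(f"alpha={alpha:.2f}: [{lo:.3f}, {hi:.3f}]")
```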
The fire resistance of concrete members is controlled by the temperature distribution of the considered cross section. The thermal analysis can be performed with the advanced temperature-dependent physical properties provided by EN 1992-1-2. However, the recalculation of laboratory tests on columns from TU Braunschweig shows that there are deviations between the calculated and measured temperatures. It can therefore be assumed that the mathematical formulation of these thermal properties could be improved. A sensitivity analysis is performed to identify the governing parameters of the temperature calculation, and a nonlinear optimization method is used to enhance the formulation of the thermal properties. The proposed simplified properties are partly validated by the recalculation of measured temperatures of concrete columns. These first results show that the scatter of the differences between the calculated and the measured temperatures can be reduced by the proposed simple model for the thermal analysis of concrete.
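For orientation, the sketch below evaluates the two thermal-conductivity limit curves as given in EN 1992-1-2 and fits a simplified linear conductivity model by least squares, mirroring the paper's idea of optimizing simplified thermal properties. The "measured" data are synthetic placeholders, and the coded expressions should be verified against the standard.

```python
# Sketch: EN 1992-1-2 conductivity bounds for concrete and a least-squares
# fit of a simplified model lambda(theta) = a + b*theta.
import numpy as np

def lambda_upper(theta):   # W/(m K), 20 C <= theta <= 1200 C (per EN 1992-1-2)
    return 2.0 - 0.2451 * (theta / 100) + 0.0107 * (theta / 100) ** 2

def lambda_lower(theta):
    return 1.36 - 0.136 * (theta / 100) + 0.0057 * (theta / 100) ** 2

theta = np.linspace(20, 1200, 60)
lam_meas = lambda_lower(theta) + 0.05 * np.random.randn(theta.size)  # placeholder

# simplified model fitted by linear least squares
A = np.vstack([np.ones_like(theta), theta]).T
(a, b), *_ = np.linalg.lstsq(A, lam_meas, rcond=None)
print(f"lambda(theta) = {a:.3f} + {b:.2e} * theta  (simplified fit)")
```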
The aim of this work is the development of a strategy for the physically nonlinear analysis of bracing systems. In addition to the traditional scope of tasks in the analysis of newly erected structures, the focus of application also includes planning tasks associated with conversion and refurbishment measures. Changes arising during the service history or in the revitalization process are taken into account in the computational models. In many cases it is expedient from a planning perspective to include the nonlinearity of the material behaviour in the normative verification concepts in order to exploit load-bearing reserves. The associated numerical effort is limited by using separate models to capture the cross-sectional and the system load-bearing behaviour, without reducing the complexity of the problem. Integral material relations are derived from detailed cross-sectional investigations of the load-bearing walls and form the basis of the nonlinear structural analysis. The modelling of articulated bracing walls is based on their decomposition into plane finite beam segments resulting from discretization in the longitudinal and transverse directions. In addition to the normal forces, shear forces, and bending moments acting at the member ends, shear stresses along the longitudinal element edges are captured. The physical nonlinearity is taken into account by incorporating integral material relations at the segment boundaries. The numerical implementation uses methods of mathematical optimization. The performance of the analysis strategy is demonstrated by investigations of bracing systems in large-panel construction.
This paper concerns schedule synchronization problems in public transit networks. It consists of three main parts. In the first part, the subject area is introduced, the terms are defined, and a framework for optimal synchronization in the form of a problem representation and formulation is proposed. The second part is devoted to the transfer synchronization problem, where passengers change transit lines at transfer points. An integrated Tabu Search and Genetic Algorithm solution method is developed for this specific problem. The third part deals with the headway harmonization problem, i.e. the synchronization of the schedules of different transit lines on common segments of routes. For the solution of this problem, a new bilevel optimization method is proposed, with harmonization of zones at the bottom level and coordination of zones, by time buffers assigned to timing points, at the upper level. Finally, the synchronization problems are numerically illustrated by real-life examples of public transport lines in Cracow.
Different types of data provide different types of information. The present research analyzes the prediction error obtained under different data-type availability for calibration. The contributions of different measurement types to model calibration and prognosis are evaluated. A coupled 2D hydro-mechanical model of a water-retaining dam is taken as an example. Here, the mean effective stress in the porous skeleton is reduced due to an increase in pore water pressure under drawdown conditions. Relevant model parameters are identified by scaled sensitivities. Then, Particle Swarm Optimization is applied to determine the optimal parameter values, and finally the error in prognosis is determined. We compare the predictions of the optimized models with results from a forward run of the reference model to obtain the actual prediction errors. The analyses presented here were performed by calibrating the hydro-mechanical model to 31 data sets of 100 observations of varying data types. The prognosis results improve when diversified information is used for calibration. However, when using several types of information, the number of observations has to be increased in order to cover a representative part of the model domain. For an analysis with a constant number of observations, a compromise between data-type availability and domain coverage proves to be the best solution. Which type of calibration information contributes to the best prognoses could not be determined in advance. The error in model prognosis does not depend on the calibration error but on the parameter error, which unfortunately cannot be determined in inverse problems since the real parameter values are unknown. The best prognoses were obtained independently of the calibration fit. However, excellent calibration fits led to an increased variation of the prognosis error; in the case of excellent fits, parameter values more often approached the limits of physically reasonable values. To improve the reliability of the prognoses, the expected values of the parameters should be considered as prior information in the optimization algorithm.
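A minimal sketch of the Particle Swarm Optimization step used for the calibration; the parameter names, bounds, and the stand-in misfit function are assumptions, not values from the study.

```python
# Minimal particle swarm optimization sketch for parameter identification;
# the objective is a stand-in for the misfit between simulated and observed
# dam behaviour (e.g. pore pressures and displacements).
import numpy as np

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = lo + np.random.rand(n_particles, dim) * (hi - lo)   # positions
    v = np.zeros_like(x)                                    # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                      # global best
    for _ in range(iters):
        r1, r2 = np.random.rand(2, n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# stand-in misfit: squared relative residuals against two "true" parameters
truth = np.array([1.0e-7, 0.30])   # e.g. permeability, Poisson ratio (assumed)
obj = lambda p: np.sum(((p - truth) / truth) ** 2)
best, fbest = pso(obj, bounds=[(1e-9, 1e-5), (0.1, 0.45)])
print(best, fbest)
```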
Mobile Software-Agenten für neuartige Funktionen und Nutzeffekte in intelligenten Gebäudesystemen
(2003)
Application-oriented software within networked buildings is steadily increasing. New standards allow simple remote access in order to install new services or to apply updates. In this context, the OSGi standard is presented, which enables the management of software during operation. In addition, the network load of the heterogeneous networks within and to buildings is constantly increasing. Here, mobile software agents can demonstrate their advantages over conventional, static communication mechanisms. The following text describes the integration of such mobile software agents into existing standards for intelligent buildings and illustrates it using the example of the Innovationszentrum Intelligentes Haus Duisburg (www.inhaus-duisburg.de). After the introduction, Chapter 2 describes the current state of the art, focusing in particular on the OSGi standard and the technology of mobile software agents. Chapter 3 concentrates on preliminary analyses for remote maintenance, the optimization of control loops, and the integration of dynamic network participants, all of which are facilitated by the described mechanisms. Chapter 4 briefly summarizes the results and gives an outlook.
In this paper, a wavelet energy damage indicator is used in the response surface methodology to identify damage in a simulated filler-beam railway bridge. The approximation model is set up to include the operational and surrounding conditions in the assessment. The procedure is split into two stages, a training phase and a detection phase. During the training phase, a so-called response surface is built from training data using polynomial regression and radial basis function approximation approaches. The response surface is then used to detect damage in the structure during the detection phase. The results show that the response surface model is able to detect moderate damage in one of the bridge supports while the temperatures and train velocities are varied.
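A sketch of the two response-surface types named in the abstract, a quadratic polynomial and a Gaussian radial basis function surface, over synthetic (temperature, velocity) inputs; all data and the kernel width are placeholders.

```python
# Sketch: response surfaces by quadratic polynomial regression and by
# Gaussian radial basis functions; inputs (temperature, train velocity) and
# outputs (wavelet-energy damage indicator) are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform([-10, 20], [35, 120], size=(40, 2))          # (temp, velocity)
y = 0.3*X[:, 0] + 0.01*X[:, 1]**1.5 + rng.normal(0, 1, 40)   # placeholder

# quadratic polynomial surface: y = b0 + b1 t + b2 v + b3 t^2 + b4 v^2 + b5 t v
P = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1]])
beta, *_ = np.linalg.lstsq(P, y, rcond=None)

# Gaussian RBF surface with centres at the training points
eps = 0.05
D2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
w = np.linalg.solve(np.exp(-eps * D2) + 1e-8*np.eye(len(X)), y)

def predict_rbf(x):
    d2 = ((X - x)**2).sum(-1)
    return np.exp(-eps * d2) @ w

x_new = np.array([12.0, 80.0])
poly = np.array([1, *x_new, x_new[0]**2, x_new[1]**2, x_new[0]*x_new[1]]) @ beta
print(poly, predict_rbf(x_new))
```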
We give a sufficient and a necessary condition for an analytic function f on the unit disk D with Hadamard gap to belong to a class of weighted logarithmic Bloch spaces, as well as to the corresponding little weighted logarithmic Bloch space, under some conditions posed on the defined weight function. Also, we study the relations between the class of weighted logarithmic Bloch functions and some other classes of analytic functions with the help of analytic functions in the Hadamard gap class.
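For orientation, the standard definitions behind the abstract read roughly as follows; the precise conditions on the weight are those stated in the paper.

```latex
% Hadamard gap series:
f(z) = \sum_{k=1}^{\infty} a_k z^{n_k}, \qquad
\inf_{k} \frac{n_{k+1}}{n_k} = \lambda > 1 .
% Weighted logarithmic Bloch space with weight \mu (schematic form):
\|f\|_{\mathcal{B}^{\log}_{\mu}}
  = |f(0)| + \sup_{z \in D} \mu(z)\,\log\!\frac{2}{1-|z|^{2}}\,|f'(z)| < \infty ,
% and f lies in the little space when
\lim_{|z| \to 1^{-}} \mu(z)\,\log\!\frac{2}{1-|z|^{2}}\,|f'(z)| = 0 .
```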
In this study we introduce a concept of a discrete Laplacian on the plane lattice and consider its iteration dynamical system. First, we discuss and prove some basic properties of the dynamical system. Next, by means of computer simulations, we show that we can reproduce the following phenomena quite well: (1) the crystallization of water, (2) the designs of carpets and embroideries, (3) the change over time of the numbers of families of extinct animals, and (4) the ecosystems of living things. Hence we may expect that evolution and self-organization can be understood by use of such dynamical systems. Here we want to stress the following fact: although several well-known chaotic dynamical systems can describe chaotic phenomena, they have difficulties in describing evolution and self-organization.
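A minimal sketch of such an iteration dynamical system on a plane lattice; the update rule, the threshold nonlinearity, and the grid size are illustrative assumptions.

```python
# Sketch: iterating a discrete Laplacian on a plane lattice from a single
# seed, producing growing lattice patterns of the kind the abstract describes.
import numpy as np

n, steps = 101, 50
u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0                        # single seed in the centre

for _ in range(steps):
    # five-point discrete Laplacian with periodic boundaries
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u = u + 0.2 * lap                          # explicit iteration step
    u = np.where(u > 0.01, u, 0.0)             # nonlinearity producing patterns

print(f"active sites after {steps} steps: {(u > 0).sum()}")
```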
The search for the best building design requires a concerted design approach for both structure and foundation, and our work is an application of this approach. Our objective is to create an interactive tool which is able to define, at the early design stages, the orientations of structure and foundation systems that satisfy the client and the architect as well as possible. While the concerns of these two actors are primarily technical and economic, they also wish to apprehend the environmental and social dimensions of their projects. Thus, this approach is based on alternative studies and on a multi-criteria analysis. In this paper, we present the context of our work, the problem formulation that allows a concerted design of structure and foundation systems, and the process of identifying feasible solutions.
The p-Laplace equation is a nonlinear generalization of the Laplace equation. This generalization is often used as a model problem for special types of nonlinearities. The p-Laplace equation can be seen as a bridge between very general nonlinear equations and the linear Laplace equation. The aim of this paper is to solve the p-Laplace equation for 2 < p < 3 and to find strong solutions. The idea is to apply a hypercomplex integral operator and spatial function theoretic methods to transform the p-Laplace equation into the p-Dirac equation. This equation will be solved iteratively by using a fixed point theorem.
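In the usual notation, the equation in question is

```latex
\Delta_p u \;=\; \operatorname{div}\!\left(|\nabla u|^{\,p-2}\,\nabla u\right) \;=\; 0,
\qquad 2 < p < 3,
% which reduces to the linear Laplace equation \Delta u = 0 for p = 2.
```

which makes the role of the nonlinearity explicit: the coefficient |∇u|^{p-2} couples the equation to its own solution, and the paper's strategy is to absorb this nonlinearity into a fixed-point iteration on the equivalent p-Dirac equation.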
In order to minimize the probability of foundation failure resulting from cyclic action on structures, researchers have developed various constitutive models to simulate the foundation response and soil interaction under these complex cyclic loads. The efficiency and effectiveness of these models is largely influenced by the cyclic constitutive parameters. Although much research is being carried out on these relatively new models, little or no detail exists in the literature about the model-based identification of the cyclic constitutive parameters. This can be attributed to the difficulties and complexities of inverse modeling of such complex phenomena. A variety of optimization strategies is available for the solution of least-squares problems, as usually posed in the field of model calibration. For the back analysis (calibration) of the soil response to oscillatory load functions, however, this paper gives insight into the model calibration challenges and puts forward a method for the inverse modeling of the response of cyclically loaded foundations such that high-quality solutions are obtained with minimum computational effort. The resulting model responses adequately describe what would otherwise be observed in the laboratory or in the field.
Expert systems integrating fuzzy reasoning techniques represent a powerful tool to support practicing engineers during the early stages of structural design. In this context, fuzzy models have proved to be very suitable for the representation of complex design knowledge. However, their definition is a laborious task. This paper introduces an approach for the design and optimization of fuzzy systems based upon Genetic Programming. To keep the emerging fuzzy systems transparent, a new framework for the definition of linguistic variables is also introduced.
Structural engineering projects are increasingly organized in networked cooperations due to continually increasing competitive pressure and a high degree of complexity in performing the concurrent design activities. Software that intends to support such collaborative structural design processes faces enormous requirements. In the course of our common research work, we analyzed the pros and cons of applying both the peer-to-peer (University of Bonn) and the multi-agent architectural style (University of Bochum) within the field of collaborative structural design. In this paper, we join the benefits of both architectural styles in an integrated conceptual approach. We demonstrate the added value of the integrated multiagent–peer-to-peer approach by means of an example scenario in which several structural engineers cooperatively design the basic structural elements of an arched bridge using heterogeneous CAD systems.
Wahrnehmung und Verarbeitung von Ereignissen bei der verteilten Planung im baulichen Brandschutz
(2003)
The building planning process is characterized by a high degree of cooperation between planning participants from different disciplines. On the one hand, plans are detailed on the basis of planning information from other participants; on the other hand, the plans of individual participants also set important boundary conditions for the overall planning. This contribution describes an approach for the holistic support of distributed planning, using structural fire protection as an example. The approach accounts for the distributed and parallel planning that is applied today in the planning of large and medium-sized buildings. Distribution is modelled not only for the planning participants; the individual pieces of planning information are also held in distributed form within the shared cooperation network. The focus of this contribution is on the perception of planning changes and events during planning, and on the processing of this information to ensure consistent planning. This is achieved, on the one hand, by the CoBE awareness model, with which events can be detected and made available to the information network. On the other hand, event handling and the subsequent discipline-specific information processing are described with the help of a multi-agent system.
Electric trains are considered one of the most eco-friendly and safest means of transportation. Catenary poles are used worldwide to support overhead power lines for electric trains. The performance of the catenary poles has an extensive influence on the integrity of the train systems and, consequently, the connected human services. Nowadays it is essential to develop SHM systems that provide the instantaneous status of catenary poles in service, making the decision-making processes of keeping or repairing damaged poles more feasible. This study develops a data-driven, model-free approach for status monitoring of cantilever structures, focusing on pre-stressed, spun-cast ultrahigh-strength concrete catenary poles installed along high-speed train tracks. The proposed approach evaluates multiple damage features in a unified damage index, which leads to straightforward interpretation and comparison of the output. Besides, it distinguishes between multiple damage scenarios of the poles, whether caused by material degradation of the concrete or by cracks that can propagate during the life span of the given structure. Moreover, using a logistic function to classify the integrity of the structure avoids the expensive learning step of existing damage detection approaches, namely those using modern machine and deep learning methods. The findings of this study look very promising for application to other types of cantilever structures, such as poles that support power transmission lines, antenna masts, chimneys, and wind turbines.
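A minimal sketch of the unified-index idea: normalized damage features are merged into one scalar and passed through a logistic function for a binary integrity decision. The feature names, weights, and threshold are assumptions, not values from the study.

```python
# Sketch: merging several normalized damage features into a unified damage
# index and classifying the pole's integrity with a logistic function.
import numpy as np

def unified_damage_index(features, weights, steepness=10.0, midpoint=0.5):
    """features: dict of normalized damage features in [0, 1]."""
    z = sum(weights[k] * v for k, v in features.items())   # weighted merge
    return 1.0 / (1.0 + np.exp(-steepness * (z - midpoint)))  # logistic squash

features = {"freq_shift": 0.62, "mode_shape_change": 0.40, "damping_change": 0.15}
weights = {"freq_shift": 0.50, "mode_shape_change": 0.35, "damping_change": 0.15}
di = unified_damage_index(features, weights)
print(f"damage index = {di:.2f} ->", "damaged" if di > 0.5 else "healthy")
```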
This study proposes an efficient Bayesian, frequency-based damage identification approach to identify damage in cantilever structures with an acceptable error rate, even at high noise levels. The catenary poles of electric high-speed train systems were selected as a realistic case study to cover the objectives of this study. Compared to other frequency-based damage detection approaches described in the literature, the proposed approach is able to detect damage in cantilever structures efficiently and to higher levels of damage identification, namely identifying both the damage location and severity, using a low-cost structural health monitoring (SHM) system with a limited number of sensors, for example accelerometers. The integration of Bayesian inference, as a stochastic framework, into the proposed approach makes it possible to exploit data fusion in merging the informative data from multiple damage features, which increases the quality and accuracy of the results. The findings provide the decision-maker with the information required to manage the maintenance, repair, or replacement procedures.
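A sketch of the Bayesian fusion idea: a grid posterior over damage location and severity given measured natural-frequency shifts, with a Gaussian likelihood multiplying the evidence of several modes. The frequency-shift model g() and all numbers are crude placeholders, not the paper's structural model of the pole.

```python
# Sketch: Bayesian, frequency-based damage identification on a grid of
# candidate damage states (location, severity), fusing several modes.
import numpy as np

locations = np.linspace(0.0, 1.0, 51)      # normalized position on the pole
severities = np.linspace(0.0, 0.6, 31)     # stiffness reduction ratio

def g(loc, sev, mode):
    """Placeholder: relative frequency drop of 'mode' for damage (loc, sev)."""
    return sev * np.sin(np.pi * mode * loc) ** 2 / (2 * mode)

measured = np.array([0.030, 0.012, 0.004])   # relative drops, modes 1..3 (assumed)
sigma = 0.005                                 # measurement noise level (assumed)

L, S = np.meshgrid(locations, severities, indexing="ij")
log_post = np.zeros_like(L)                   # flat prior
for mode, d in enumerate(measured, start=1):
    log_post += -0.5 * ((g(L, S, mode) - d) / sigma) ** 2   # data fusion
post = np.exp(log_post - log_post.max())
post /= post.sum()

i, j = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: location={locations[i]:.2f}, severity={severities[j]:.2f}")
```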
Over the last decade, the technology of constructing buildings has developed dramatically, especially with the huge growth of CAD tools that help in modeling buildings, bridges, roads, and other construction objects. Often, quality control and dimensional accuracy in the factory or on the construction site are based on manual measurements of discrete points. These measured points of the realized object, or a part of it, are compared with the points of the corresponding CAD model to see whether and where the construction element fits the respective CAD model. This process is complicated and difficult even when using modern measuring technology, due to the complicated shape of the components, the large amount of manually acquired measurement data, and the high cost of manually processing the measured values. However, by using a modern 3D scanner, one obtains information about the whole constructed object and can make a complete comparison against the CAD model, which gives an idea of the quality of the object as a whole. In this paper, we present a case study of controlling the quality of measurement during the construction phase of a steel bridge by using 3D point cloud technology. Preliminary results show that an early detection of mismatches between the real element and the CAD model could save a lot of time, effort, and, obviously, expense.
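A minimal sketch of the scan-versus-CAD comparison step: nearest-neighbour deviations of the registered point cloud from points sampled on the CAD surface. Both clouds are synthetic placeholders here, and the tolerance is an assumption.

```python
# Sketch: point-cloud-to-CAD deviation analysis via nearest neighbours.
import numpy as np
from scipy.spatial import cKDTree

# placeholder stand-ins for the real data: points sampled on the CAD surface
# and the registered laser scan (in practice loaded from file, e.g. .xyz)
rng = np.random.default_rng(0)
cad_pts = rng.uniform(0, 10, size=(50000, 3))
scan_pts = cad_pts[:20000] + rng.normal(0, 0.002, size=(20000, 3))

tree = cKDTree(cad_pts)
dist, _ = tree.query(scan_pts)        # distance of each scan point to the model

tol = 0.005                            # 5 mm tolerance (assumed)
print(f"mean deviation : {dist.mean()*1000:.2f} mm")
print(f"95th percentile: {np.percentile(dist, 95)*1000:.2f} mm")
print(f"out of tolerance: {100.0 * (dist > tol).mean():.1f} % of scan points")
```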
Humans are able to think, to feel, and to sense. We are also able to compute, but not very well. In contrast, computers are giants in computing, yet they cannot do anything besides computing. Appropriate combinations of the different gifts and strengths of human and computer may result in impressive performances. In the 3-Hirn approach, one human and two computers are involved. Different programs run on the computers. The human starts the machines, inspects the solutions they propose, compares these candidate solutions, and finally decides for one of the alternatives. So the human makes the final choice from a small number of computer proposals. In performance-oriented chess, 3-Hirn combinations consisting of an amateur player and commercial software have reached world-class level. 3-Hirn is a decision support system with a multiple-choice structure. Such multiple-choice systems will be exhibited and discussed.
The node-moving and multistage node enrichment adaptive refinement procedures are extended to the mixed discrete least squares meshless (MDLSM) method for the efficient analysis of elasticity problems. In the MDLSM formulation, a mixed formulation is adopted to avoid second-order differentiation of the shape functions and to obtain displacements and stresses simultaneously. In the refinement procedures, a robust error estimator is used that is based on the value of the least-squares residual functional of the governing differential equations and their boundary conditions at the nodal points; it is inherently available from the MDLSM formulation and can efficiently identify zones with higher numerical errors. The results are compared with the refinement procedures in the irreducible formulation of the discrete least squares meshless (DLSM) method and show the accuracy and efficiency of the proposed procedures. The comparison of the error norms and convergence rates also shows the fidelity of the proposed adaptive refinement procedures in the MDLSM method.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process by building a system based on Internet technology. The system links distributed databases and allows building partners to access and manipulate specific information remotely. It provides an updated building representation that is enriched and refined all along the building production process. A recent collaboration with Nemetschek France (a subsidiary of Nemetschek AG, an AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help building project actors choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility of building online building-product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decision support to the construction works selection process. To do so, we consider each actor's role in the system and the pieces of information each one needs to achieve a given task.
This paper describes a couple of new truss structures based on fractal geometry. One is the famous Sierpinski gasket, and the other is a fractal triangle derived by applying a process that forms the leaves of a cedar tree using M. F. Barnsley's contraction mapping theory. The x-y coordinates of an arbitrary nodal point of the structures are therefore generated easily once the IFS (Iterated Function System) codes and a scale are specified. Structural members are defined similarly. Thus, data for frame analysis can be generated automatically, which is significant if the objective structure has a complex configuration. Analytical results under the vertical and wind loadings of the Japanese Building Code are then shown. Here, the members are assumed to be timber with a cross section of 15 cm × 15 cm. The authors conclude that geometrically new truss structures were developed and that automatic data generation for frame analysis was attained using IFS. The analytical results show that they contribute to saving material compared with a king-post truss.
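A minimal sketch of the IFS-based node generation for the Sierpinski gasket; the scale and recursion depth are illustrative, and member generation would follow analogously from the same maps.

```python
# Sketch: generating nodal coordinates of the Sierpinski gasket with an
# Iterated Function System (IFS). Each map is a contraction x -> (x + v)/2
# toward one vertex v of the outer triangle.
import numpy as np

scale = 10.0                                   # overall truss size [m] (assumed)
vertices = scale * np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
maps = [lambda p, v=v: 0.5 * (p + v) for v in vertices]   # the three IFS codes

def gasket_nodes(depth):
    """Apply every map to the current point set 'depth' times."""
    pts = vertices
    for _ in range(depth):
        pts = np.vstack([f(pts) for f in maps])
    return np.unique(pts.round(9), axis=0)     # merge coincident nodes

nodes = gasket_nodes(depth=4)
print(f"{len(nodes)} unique nodes")            # node list for frame analysis
```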
Current disaster management procedures rely primarily on heuristics, which results in strategies that are very cautious and sub-optimal in terms of saving life, minimising damage, and returning the building to its normal function. Effective disaster management also demands decentralized, dynamic, flexible, short-term, cross-domain resource sharing, which is not well supported by existing distributed computing infrastructures. The paper proposes a conceptual framework for emergency management in the built environment, using the Semantic Grid as an integrating platform for different technologies. The framework supports a distributed network of specialists in the built environment, including structural engineers, building technologists, decision analysts, etc. It brings together the necessary technology threads, including the Semantic Web (to provide a framework for shared definitions of terms, resources, and relationships), Web Services (to provide dynamic discovery and integration), and Grid Computing (for enhanced computational power, high-speed access, collaboration, and security control), to support the rapid formation of virtual teams for disaster management. The proposed framework also makes extensive use of modelling and simulation (both numerical and using visualisations), data mining (to find resources in legacy data sets), and visualisation. It also includes a variety of hardware instruments with access to real-time data. Furthermore, the whole framework is centred on collaborative working by the virtual team. Although the focus of this paper is on disaster management, many aspects of the discussed Grid and visualisation technologies will be useful for other forms of collaboration. Conclusions are drawn about the possible future impact on the built environment.
Interactive visualization based on 3D computer graphics is nowadays an indispensable part of any simulation software used in engineering. Nevertheless, the implementation of such visualization software components is often avoided in research projects because it is a challenging and potentially time-consuming task. In this contribution, a novel Java framework for the interactive visualization of engineering models is introduced. It supports the task of implementing engineering visualization software by providing adequate program logic as well as high-level classes for the visual representation of entities typical for engineering models. The presented framework is built on top of the open-source visualization toolkit VTK. In VTK, a visualization model is established by connecting several filter objects in a so-called visualization pipeline. Although designing and implementing a good pipeline layout is demanding, VTK does not directly support the reuse of pipeline layouts. Our framework tailors VTK to engineering applications on two levels. On the first level, it adds new, engineering-model-specific filter classes to VTK. On the second level, ready-made pipeline layouts for certain aspects of engineering models are provided. For instance, there is a pipeline class for one-dimensional elements like trusses and beams that is capable of showing the elements along with deformations and member forces. In order to facilitate the implementation of a graphical user interface (GUI) for each pipeline class, there exists a reusable Java Swing GUI component that allows the user to configure the appearance of the visualization model. Because of its flexible structure, the framework can easily be adapted and extended to new problem domains. Currently it is used in (i) an object-oriented p-version finite element code for design optimization, (ii) an agent-based monitoring system for dam structures, and (iii) the simulation of destruction processes by controlled explosives based on multibody dynamics. Application examples from all three domains illustrate that the presented approach is powerful as well as versatile.
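To illustrate the pipeline concept the framework builds on, the following sketch sets up a minimal VTK pipeline, a truss member swept into a tube, using VTK's Python binding for brevity. The paper's framework wraps the same pipeline ideas in reusable Java classes, so the class roles, not the language, are the point here.

```python
# Sketch of a VTK visualization pipeline for a one-dimensional element:
# polydata source -> tube filter -> mapper -> actor -> renderer.
import vtk

# geometry: two nodes of a truss element (second node displaced for effect)
points = vtk.vtkPoints()
points.InsertNextPoint(0.0, 0.0, 0.0)
points.InsertNextPoint(2.0, 0.0, 0.1)

line = vtk.vtkCellArray()
line.InsertNextCell(2)
line.InsertCellPoint(0)
line.InsertCellPoint(1)

poly = vtk.vtkPolyData()
poly.SetPoints(points)
poly.SetLines(line)

# filter stage: sweep the line into a tube for a member-like appearance
tube = vtk.vtkTubeFilter()
tube.SetInputData(poly)
tube.SetRadius(0.03)
tube.SetNumberOfSides(16)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(tube.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()   # opens an interactive window
```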
The optimization of continuous structures requires careful attention to discretization errors. Compared to an ordinary low-order formulation (h-elements) in conjunction with adaptive mesh refinement in each optimization step, the use of high-order finite elements (so-called p-elements) has several advantages. However, compared to the h-method, a higher-order finite element analysis program poses higher demands from a software engineering point of view. In this article, the basics of an object-oriented higher-order finite element system especially tailored to use in structural optimization are presented. Besides the design of the system, aspects related to the employed implementation language Java are discussed.
A multicriteria statement of the above-mentioned problem is presented. It differs from the classical statement of the spanning tree problem. The quality of a solution is estimated by a vector objective function which contains weight criteria as well as topological criteria (degree and diameter of the tree). Many real processes are not deterministic, which is why the investigation of stability is very important; many errors are connected with the calculations. The stability analysis of vector combinatorial problems allows one to discover the magnitude of changes in the initial data for which the optimal solution does not change. Furthermore, the investigation of stability allows one to construct a class of problems from a single problem by means of parameter variations. The analysis of the problems belonging to this class yields an exact and adequate description of the model.
>CyberCity< is a concept that, through a virtual image of the spatial reality of a city (Berlin), provides a familiar perceptual environment to ease orientation and navigation, so that a desired piece of information can be reached as quickly and vividly as possible via this virtual browser. This environment model is also suitable as a simulation model for visualizing urban assessments of new projects, traffic measures, and ecological loads. In particular, it is intended as an orientation environment for telepresence over communication networks, which acquires a special social significance through virtual representatives (avatars).
Poland is not situated in any seismic region of the earth; however, there are still areas where underground mining is conducted. In these areas, so-called 'paraseismic tremors' are very frequent phenomena. When a building examination is carried out in order to determine its safety, it is necessary to make a complete analysis in which the influence of tremors is included. To decide whether a building is able to carry dynamic loads, it is necessary to compute its dynamic characteristics, i.e. natural frequencies, which is not possible using standard techniques. After an expert diagnoses the building in situ, computer techniques together with specialized software for dynamic, static, and strength analyses become a suitable tool. In this paper, special attention is paid to a typical twelve-storey WGP (Wroclaw Great Plate) prefabricated building with a special type of joints. During dynamic actions, these joints have a decisive influence on the building's behavior. Paraseismic tremors are especially dangerous for these buildings and can cause pre-failure states. Laboratory investigations of a part of a building or of a separate joint can be difficult and very expensive; therefore, computer modeling was used to investigate the behavior of such elements and of whole buildings under different kinds of loads.
Building Information Modeling is a powerful tool for design and for maintaining a consistent set of data in virtual storage. For application in the realization phases and on site, it needs further development. The paper describes the main challenges and main features that will help the development of software to better serve the needs of construction site managers.
The development of 3D technologies in many different areas over the last decades is leading us towards a complete 3D representation of planet earth at a high level of detail. At the lowest level, we have geographical information systems (GIS) representing the outer layer of our planet as a 3D model. Meanwhile, these systems do not only provide a geographical model but also present additional information, such as ownership and infrastructure, that might be of interest for the construction business. In the future, these systems will serve as the basis for virtual environments for the planning and simulation of construction sites. In addition, work is being done on the integration of GIS systems with 3D city models in the area of urban planning, and thus on the integration of different levels of detail. This article presents research on the use of 3D models in construction at the next level of detail below urban planning. The 3D city model is taken as the basis for the 3D model of the construction site. In this virtual nD world, a contractor can organize and plan resources, simulate different variants of construction processes, and thus find the most effective solution with respect to costs and time. On the basis of previous research, the authors present a new approach for cost estimation and simulation using development technologies from game software.
The paper analyses the application of 3D gaming technologies in the simulation of processes associated with human resources and machinery on construction sites in order to determine process costs. It addresses the problem of detailing in process simulation. The authors outline special boundary conditions for the simulation of cost-relevant resource processes on virtual construction sites. The approach considers the different needs for detailing in process simulation during the planning and building phases. For the simulation of process costs on a construction site (the contractor's view), the level of detail has to be set high. A prototype for the determination of process durations (and thereby process costs), developed at the Bauhaus University Weimar, is presented as a result of ongoing research on detailing in process simulation. It demonstrates the method of process cost determination at a high level of detail (the interplay between excavator and truck) through interaction with the virtual environment of the site.
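A minimal sketch of how such an excavator-truck interaction can determine a process duration and cost. All capacities, cycle times, and rates are illustrative assumptions; the actual prototype derives durations from interaction in the virtual environment rather than from fixed cycle times.

```python
# Sketch: a minimal queueing loop for the excavator-truck "game" that
# determines an earthworks process duration and, from it, the process cost.
import heapq

LOAD_TIME = 3.0                    # min per truck loading (assumed)
HAUL_CYCLE = 17.0                  # min haul + dump + return (assumed)
TRUCKS, SOIL = 3, 600.0            # number of trucks, soil volume [m^3]
TRUCK_CAP = 12.0                   # m^3 per truck load (assumed)
COST_RATE = 180.0 + TRUCKS * 95.0  # EUR/h: excavator + trucks (assumed)

t, moved = 0.0, 0.0
excavator_free = 0.0
arrivals = [0.0] * TRUCKS          # all trucks start at the excavator
heapq.heapify(arrivals)

while moved < SOIL:
    arrival = heapq.heappop(arrivals)        # next truck reaching the bench
    start = max(arrival, excavator_free)     # truck may queue for the excavator
    excavator_free = start + LOAD_TIME       # excavator busy while loading
    moved += TRUCK_CAP
    heapq.heappush(arrivals, excavator_free + HAUL_CYCLE)
    t = excavator_free

hours = t / 60.0
print(f"duration: {hours:.1f} h, process cost approx. {hours * COST_RATE:.0f} EUR")
```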
The authors present fundamentals and methods for preparing tender documents and performing cost calculations that work directly with three-dimensional building component models. This contributes to describing the entire life cycle of a building consistently by means of 3D models. The first section presents basic considerations on the use of PLM/PDM technologies in the construction industry. Subsequently, the conventional tendering methodology is analysed, and the differences in data structure between a conventional bill of quantities and a three-dimensional component catalogue (object database) are presented. From this, methods and processes for creating a 3D specification of works are derived, and a suitable user interface for a three-dimensional component catalogue is presented. Practical problems in the use of component catalogues are then discussed. An essential building block is the cost calculation: calculation methods based on a two-dimensional bill of quantities and on a three-dimensional component catalogue are compared. This is complemented by solutions for connecting electronic marketplaces to the component catalogue for the purpose of price formation. Finally, an outlook is given on how a synthesis between a three-dimensional component catalogue and textual standard specifications can be achieved.
The use of CAD applications has decisively changed planning work in the construction industry. In design planning, the changes are already clearly visible. Since the introduction of three-dimensional planning tools, the client has been able to view and evaluate the building in its virtual entirety on the computer. This offers many advantages: the planner is forced to plan all essential details exactly a priori, whereby many errors can be avoided. This opens up new fields of application, particularly for planning in construction companies, e.g. within work preparation.
The paper introduces a systematic construction management approach supporting the expansion of a specified construction process, both automatically and semi-automatically. Throughout the whole design process, many requirements must be taken into account in order to fulfil the demands defined by clients. In implementing those demands, from the design concept up to the execution plan, constraints such as site conditions, building codes, and the legal framework are to be considered. However, the complete information needed to make a sound decision is not yet available in the early phase, and decisions are traditionally taken based on experience and assumptions. Due to the vast number of appropriate available solutions, particularly in building projects, it is necessary to make those decisions traceable. This is important in order to be able to reconstruct the considerations and assumptions made, should the project's objectives change in the future. The research is carried out by means of building information modelling, where rules derived from the standard logic of construction management knowledge are applied. The knowledge comprises a comprehensive interaction amongst the bidding process, cost estimation, construction site preparation, and specific project logistics, which are usually still considered separately. By means of these rules, favourable decisions regarding prefabrication and in-situ implementation can be justified. Modifications depending on the information available within the current design stage will be consistently traceable.
Due to economical, technical, or political reasons, about 100 nuclear power plants all over the world have been disconnected to date. All these power stations are still waiting for their complete dismantling, which, for a single reactor, costs up to one billion euros and takes up to 15 years. In our contribution, we present a resource-constrained project scheduling approach minimizing the total discounted cost of dismantling a nuclear power plant. A project of dismantling a nuclear power plant can be subdivided into a number of disassembly activities. The execution of these activities requires time and scarce resources like manpower, special equipment, or storage facilities for the contaminated material arising from the dismantling. Moreover, we have to regard several minimum and maximum time lags (temporal constraints) between the start times of the different activities. Finally, each disassembly activity can be processed in two alternative execution modes, which lead to different disbursements and determine the resource requirements of the considered activity. The optimization problem is to determine a start time and an execution mode for each activity such that the discounted cost of the project is minimal, and neither the temporal constraints are violated nor do the activities' resource requirements exceed the availability of any scarce resource at any point in time. We introduce an appropriate multi-mode project scheduling model with minimum and maximum time lags as well as renewable and cumulative resources for the described optimization problem. Furthermore, we show that the considered optimization problem is NP-hard in the strong sense. For small problem instances, optimal solutions can be obtained from a relaxation-based enumeration approach which is incorporated into a branch-and-bound algorithm. In order to be able to solve large problem instances, we also propose a truncated version of the devised branch-and-bound algorithm.
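In generic notation, the optimization problem has roughly the following structure; this is a sketch, not the paper's exact formulation, and the cumulative resources for the contaminated-material storage add analogous inventory constraints.

```latex
% S_i = start time of activity i, m_i = chosen execution mode,
% c_{i m_i} = disbursement, \alpha = discount rate, p_{i m_i} = duration,
% r_{i m_i k} = requirement of renewable resource k, R_k = its capacity:
\min_{S,\,m} \;\; \sum_{i \in V} c_{i m_i}\, e^{-\alpha S_i}
\quad \text{s.t.} \quad
d^{\min}_{ij} \;\le\; S_j - S_i \;\le\; d^{\max}_{ij} \quad \forall (i,j),
\qquad
\sum_{i \,:\, S_i \le t < S_i + p_{i m_i}} r_{i m_i k} \;\le\; R_k
\quad \forall k,\; \forall t .
```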
We investigate aspects of tram-network section reliability as part of a model of the reliability of the whole city tram network. One of the main points of interest is the chronological development of disturbances (namely the differences between the scheduled and the real departure times) on subsequent sections during tram line operation. These developments were observed in comprehensive measurements carried out in Krakow during the rebuilding of one of the main transportation nodes (Rondo Mogilskie). The building activities caused large disturbances in tram line operation, with effects extending to neighbouring sections. In the second part, the stochastic character of the section running time is analyzed in more detail. Sections with only one beginning stop are considered, as well as sections with two or three beginning stops located on different streets at an intersection. The possibility of merging results from sections with two beginning stops into one set is checked with suitable statistical tests for comparing the means of two samples. The section running time may depend on the gap between two successive trams and on the deviation from the schedule; this dependence is described by a multiple regression formula. The main measurements were carried out in the city centre of Krakow in two stages: before and after major changes in the tramway infrastructure.
Model of Tram Line Operation
(2006)
From the passenger's perspective, punctuality is one of the most important features of tram operations. Unfortunately, in most cases this feature is only insufficiently fulfilled. In this paper, we present a simulation model for tram operations with a special focus on punctuality. The aim is to obtain a helpful tool for designing timetables and for analyzing the effects of changing the priorities for trams at traffic lights and the kind of track separation. A realization of tram operations is assumed to be a sequence of running times between successive stops and times spent by the tram at the stops. In this paper, the running time is modeled as the sum of its mean value and a zero-mean random variable. With the help of multiple regression, we find that the average running time is a function of the length of the sections and the number of intersections. The random component is modeled by a sum of two independent zero-mean random variables: one describes the disturbance caused by waiting at an intersection, and the other the disturbance caused by the driving process. The time spent at a stop is assumed to be a random variable, too. Its distribution is estimated from measurements of these stop times for different tram lines in Kraków. Finally, a special case of the introduced model is considered and numerical results are presented. This paper is associated with the CIVITAS-CARAVEL project "Clean and better transport in cities". The project has received research funding from the Community's Sixth Framework Programme. The paper reflects only the authors' views, and the Community is not liable for any use that may be made of the information contained therein.
The ride of a tram along its line, defined by a timetable, consists of the travel times between subsequent sections and the times spent by the tram at the stops. In this paper, statistical data collected in the city of Krakow is presented and evaluated. Under Polish conditions, the time trams spend at stops makes up a remarkable 30% of the total time of tram line operation; moreover, this time is characterized by large variability. The time spent by a tram at a stop consists of the alighting and boarding time and the time lost at the stop after alighting and boarding have ended but before departure. The alighting and boarding time itself usually depends on the random number of alighting and boarding passengers and also on the number of passengers inside the vehicle. The time spent at the stop after alighting and boarding have ended, however, is the effect of certain random events, mainly the impossibility of departing from the stop caused by the lack of priorities for public transport vehicles. The main focus of the talk lies on the description and modelling of these effects. This paper is associated with the CIVITAS-CARAVEL project "Clean and better transport in cities". The project has received research funding from the Community's Sixth Framework Programme. The paper reflects only the authors' views, and the Community is not liable for any use that may be made of the information contained therein.
From the passenger's perspective, punctuality is one of the most important features of tram route operation. We present a stochastic simulation model with a special focus on determining the important factors of influence. The statistical analysis is based on large samples (the sample size is nearly 2000) accumulated from comprehensive measurements on eight tram routes in Cracow. For the simulation, we are interested not only in average values but also in stochastic characteristics like the variance and other properties of the distribution. A realization of tram operations is assumed to be a sequence of running times between successive stops and times spent by the tram at the stops, divided into passenger alighting and boarding times and times waiting for the possibility of departure. The running time depends on the kind of track separation, including the priorities at traffic lights, the length of the section, and the number of intersections. For every type of section, a linear mixed regression model describes the average running time and its variance as functions of the length of the section and the number of intersections. The regression coefficients are estimated by the iteratively re-weighted least squares method. The alighting and boarding time mainly depends on the type of vehicle, the number of passengers alighting and boarding, and the occupancy of the vehicle. For the distribution of the time waiting for the possibility of departure, suitable distributions like the Gamma distribution and the Lognormal distribution are fitted.
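A sketch of the regression step on synthetic data: average running time as a linear function of section length and number of intersections, with observation weights re-estimated iteratively as a simple stand-in for the iteratively re-weighted least squares fit of the mixed model. All coefficients and noise levels are placeholders.

```python
# Sketch: running time ~ length + intersections, fitted by a simple
# iteratively re-weighted least squares loop with a length-dependent
# variance model (a stand-in for the paper's mixed regression model).
import numpy as np

rng = np.random.default_rng(1)
length = rng.uniform(200, 1200, 200)          # section length [m]
inters = rng.integers(0, 6, 200)              # number of intersections
run = 30 + 0.11 * length + 14 * inters + rng.normal(0, 5 + 0.01 * length)

X = np.column_stack([np.ones(200), length, inters])
V = np.column_stack([np.ones(200), length])   # variance regressors
w = np.ones(200)
for _ in range(10):                           # IRLS loop
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], run * sw, rcond=None)
    resid = run - X @ beta
    # variance modelled as a linear function of length, then weights updated
    var_coef, *_ = np.linalg.lstsq(V, resid**2, rcond=None)
    w = 1.0 / np.maximum(V @ var_coef, 1.0)

print("running time = {:.1f} + {:.3f}*length + {:.1f}*intersections".format(*beta))
```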
The idea of a simulation program to support urban planning is explained: four different, clearly defined development paths can be calculated for the rebuilding of a shrinking town. Aided by self-organization principles, a complex system can be created. The dynamics are based on the action patterns of individual actors, whose behaviour cyclically depends on the generated structure. Global influences, which control the development, can be divided into a spatial, a socioeconomic, and an organizational-juridical level. The simulation model should offer conclusions on new planning strategies, especially in the context of the creation process of rebuilding measures. The example of a transportation system is shown by means of prototypes for the visualisation of the dynamic development process.
The thesis deals with variance-reducing methods for the Monte Carlo simulation of stochastic processes, for the purpose of assessing the reliability of building structures with nonlinear system behaviour. Chapter 2 is a literature survey of variance-reducing Monte Carlo methods. In Chapter 3 the spectral representation of a stationary scalar Gaussian process is derived; on this basis, several simulation models are discussed. The variance-reducing simulation method developed in Chapter 4 is based on the spectral representation: after a first pilot simulation, the frequencies for the introduction of random amplitudes are determined and their parameters are fitted, and the second run uses these parameters according to the principle of importance sampling. In Chapter 5 the method is applied to a bridge under earthquake loading. The bridge is equipped with so-called hysteretic devices for energy dissipation. Both the accuracy and efficiency of the simulation method and the performance of the hysteretic devices for the seismic retrofitting of structures are demonstrated.
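The basic spectral simulation underlying Chapters 3 and 4 can be sketched as follows; the spectral density and its discretisation are illustrative assumptions, and the importance-sampling step with random amplitudes developed in the thesis is not shown.

    import numpy as np

    # Spectral representation of a stationary scalar Gaussian process:
    # X(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k), phi_k ~ U(0, 2*pi).
    def simulate_process(S, w_max, n_freq, t, rng):
        w = (np.arange(n_freq) + 0.5) * (w_max / n_freq)  # frequency grid
        dw = w_max / n_freq
        phi = rng.uniform(0.0, 2.0 * np.pi, n_freq)       # random phases
        amp = np.sqrt(2.0 * S(w) * dw)                    # deterministic amplitudes
        return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

    rng = np.random.default_rng(1)
    S = lambda w: 1.0 / (1.0 + w**2)        # illustrative one-sided spectral density
    t = np.linspace(0.0, 50.0, 2001)
    x = simulate_process(S, w_max=10.0, n_freq=512, t=t, rng=rng)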
In modern buildings the complexity of the heating technology is constantly increasing, which makes it ever more difficult to guarantee an ecologically and economically sound interplay of the components. Networking the various components of a heating system by means of standard computer network technology is intended to help increase energy efficiency. Embedded systems and microcontrollers act as controllers for the subsystems; by communicating with each other they are to adapt their control behaviour to one another. An Internet connection makes further information available for plant operation and can also be used for remote maintenance of the installation. Small Java programs executed in the web browser of a computer, so-called applets, allow the operating states of heating systems to be visualised in real time, and recording the operating data makes their analysis possible.
The planning process in structural engineering is characterised by three cyclically recurring phases: the distribution of tasks, the parallel processing with corresponding coordination, and the merging of the results. Available planning software mainly supports only the work of the second phase and the exchange of data by means of documents. The subject of this thesis is the development of a system architecture that in principle covers all phases of distributed processing and the different kinds of cooperation (asynchronous, parallel, alternating) and integrates existing applications. The common working material of the participants is abstracted not as a set of documents but as a set of object and element versions and their relations. Elements extend objects by application-independent properties (features). For the processing of a task, subsets are formed on the basis of the features; new versions are derived for their elements and loaded into a private workspace. The processing is reduced to operations that keep the common working material consistent. The system architecture is described formally by mathematical means, available technology is described, and its use is presented in an implementation concept. The implementation concept is realised as a pilot in the Internet environment in the Java language, using a version management tool and relational databases.
The distributed processing of shared product models is a subject of current research in civil engineering. The presented approach has to reconcile two conflicting requirements: on the one hand, the planners must be able to form the subsets of the product model to be processed very flexibly; on the other hand, revision and release states must be defined permanently and immutably. In a versioned environment with many dependencies these requirements are difficult to fulfil. The presented approach shows how revision and release states can be formed without restricting flexible distributed processing. The release states must satisfy certain properties: for example, only one version of an object may be contained, and the bindings to other object versions must be taken into account in a consistent way. A mathematical description based on set theory and graph theory is chosen.
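In set and graph terms, the stated release-state properties can be checked roughly as sketched below; representing versions as (object, version number) pairs and bindings as a dependency map are illustrative assumptions of this sketch, not the paper's formalism.

    # A version is a pair (object_id, version_no); 'depends' maps a version
    # to the versions it is bound to. A release state must contain at most
    # one version per object and must be closed under these bindings.
    def is_valid_release(release, depends):
        objects = [obj for (obj, _) in release]
        if len(objects) != len(set(objects)):      # at most one version per object
            return False
        for version in release:                    # closed under dependencies
            for needed in depends.get(version, ()):
                if needed not in release:
                    return False
        return True

    depends = {("wall", 2): {("slab", 1)}, ("slab", 1): set()}
    print(is_valid_release({("wall", 2), ("slab", 1)}, depends))  # True
    print(is_valid_release({("wall", 2)}, depends))               # False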
The synchronous distributed processing of common source code in the software development process is supported by well-proven methods. The planning process shows similarities to the software development process; however, no similarly consistent and successful methods exist for applications in construction projects. A new approach is proposed in this contribution.
Thermal building simulation based on BIM models. Thermal-energetic simulations can be used to estimate the heating demand of buildings and districts. These simulations are based on building models containing geometric and physical information. The geometric model is usually created from existing construction plans or in-situ surveys, which demands a comparatively large effort of investigation and modelling. Alterations applied to the structure later frequently require manual changes of the related model, which increases the effort further. The physical model represents the set of parameters and boundary conditions given by material properties, location, and environmental influences on the building. The link between the two models is realised within the respective simulation software and is usually not transferable to other software products. By applying Building Information Modeling (BIM), simulation data can be stored consistently and exchanged with other applications via interfaces. To this end, a method is presented that enables thermal-energetic simulations, including subsequent evaluations, on the basis of the standardised exchange format Industry Foundation Classes (IFC). Geometric and physical parameters are extracted directly from a building model that is kept up to date over the entire life cycle and are passed to the simulation. This accelerates the simulation process with regard to building modelling and after later structural alterations. The developed method relies on simple modelling conventions for the creation of the building information model and ensures a complete transfer of all input and output data.
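How such an IFC-based extraction might look can be sketched with the open-source ifcopenshell package (which requires a local IFC file); the file name, the choice of IfcSpace entities and the property-set handling are illustrative assumptions, not the method described in the paper.

    import ifcopenshell
    import ifcopenshell.util.element as element

    # Illustrative extraction of spaces and their property sets from an
    # IFC model; a real thermal simulation would map these onto zones,
    # constructions and boundary conditions.
    model = ifcopenshell.open("building.ifc")    # hypothetical file
    for space in model.by_type("IfcSpace"):
        psets = element.get_psets(space)         # physical parameters, if present
        print(space.GlobalId, space.Name, sorted(psets))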
Practical examples show that the cost flow can be improved and the total amount of money spent in construction and subsequent use can be cut significantly. The calculation is based on spreadsheets, which are easy to develop on most PCs nowadays. Construction works are a field where the evaluation of cash flow can and should be applied. Decisions about cash flow in construction are decisions with long-term impact and long-term memory: mistakes from the distant past have a massive impact on the present situation and on the far economic future of the activities. Two approaches exist, the just-in-time (JIT) approach and the life cycle costs (LCC) approach. The calculation example shows the dynamic results for the production speed as opposed to a stable flow of production over the duration of the activities. More sophisticated rescheduling towards an optimal solution may bring extra profit in return. In the technologies and organisational processes for industrial buildings, railways and road reconstruction, public utilities and housing developments there are assembly procedures that are very appropriate for this purpose, and complicated research, development and innovation projects are likewise well suited to this kind of application. The money of investors in large investments, and all publicly invested money, may be spent more efficiently if an optimisation speed strategy can be calculated.
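The spreadsheet-style cash flow evaluation can be reproduced in a few lines; the discount rate and the yearly cash flows below are illustrative values, not data from the paper.

    # Net present value of a construction cash flow (year 0 = investment),
    # the quantity a JIT versus LCC comparison would be based on.
    def npv(rate, cash_flows):
        return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

    flows = [-1000.0, 120.0, 150.0, 180.0, 200.0, 220.0]  # illustrative values
    print(f"NPV at 5 %: {npv(0.05, flows):.2f}")
    print(f"NPV at 8 %: {npv(0.08, flows):.2f}")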
The contribution presents a model that is able to simulate construction duration and cost for a building project. The model predicts a set of expected project costs and a duration schedule depending on input parameters such as production speed, scope of work, time schedule, bonding conditions, and the maximum and minimum deviations from the scope of work and the production speed. The simulation model is able to calculate, on the basis of an input level of probability, the corresponding construction cost and time duration of a project; the reciprocal view is concerned with finding the adequate level of probability for given construction costs and activity durations. The interpretive outputs of the application software include the compilation of a presumed dynamic progress chart. This progress chart represents the expected scenario of the development of a building project, mapping potential time dislocations of particular activities. The calculation of the presumed dynamic progress chart is based on an algorithm which calculates mean values as a partial result of the simulated building project. Construction cost and time models are, in many ways, useful tools in project management. Clients are able to make proper decisions about the time and cost schedules of their investments, and building contractors are able to schedule the predicted project cost and duration before any decision is finalized.
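A minimal sketch of the simulation idea: sample activity durations, accumulate them along a simple sequential schedule, and read off the duration belonging to a chosen probability level. The distributions and values are illustrative assumptions, not the paper's model.

    import numpy as np

    rng = np.random.default_rng(7)
    n_runs = 10_000

    # Triangular(min, mode, max) durations in days for a sequential chain
    # of activities (illustrative values).
    activities = [(8, 10, 15), (18, 20, 30), (4, 5, 9)]
    durations = sum(rng.triangular(lo, mode, hi, n_runs)
                    for lo, mode, hi in activities)

    # Project duration that is met with 90 % probability.
    print(f"90 % quantile: {np.quantile(durations, 0.90):.1f} days")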
Securing competitiveness in the construction sector, especially for small and medium-sized enterprises, requires active measures in response to the changing competitive situation. Small entrepreneurial units can achieve a substantial competitive advantage through higher flexibility, fast reaction to customer wishes or to the current situation on the construction site, and closeness to the market. To this end it is necessary to support the information and communication flows by using standardised and inexpensive hardware and software such as handhelds, and in particular to remove the existing obstacles in the information flow between construction site and office. Using the projects >IuK - SystemBau< and >eSharing< as examples, an introduction strategy for mobile computing in small entrepreneurial units of the construction industry (SMEs), based on an extensive requirements analysis, is presented. The following aspects are described: consistent use of the technology taking the different qualification levels into account, introduction support through training, process analysis and possible integration into existing software environments, and field tests.
Detailed investigations of structures in FE analyses repeatedly lead to the problem of a suitable mesh design. While a coarse mesh is sufficient in large areas, a very fine mesh must be chosen at critical locations in order to obtain sufficiently accurate results precisely there. When realising local mesh refinements, the main problem is the design of the transition from the coarse to the fine mesh. The contribution presents a family of FE transition elements with which a fully compatible coupling of a few large elements with many small elements can be achieved in only one step. These newly developed so-called pNh elements allow the connection of N smaller elements at one or more sides (element sides for h-refinement). This is achieved by N piecewise defined shape functions at the corresponding sides, where the partition need not be equidistant. Moreover, it is possible to connect elements of different polynomial degree p at the standard sides and the refinement sides. The practical use of the transition elements requires suitable automatic or semi-automatic mesh generators that incorporate these elements. Within a substructure-oriented modelling this can be realised particularly favourably. The contribution shows how an effective generation of the mesh refinements can be achieved by decomposing the overall model into regions with a coarse mesh, with a transition mesh, and with a fine mesh. The advantages of the presented transition element concept are comprehensively demonstrated on a practical example from civil engineering.
For the management or reorganisation of existing buildings, data concerning dimensions and construction are necessary. Often these data are given exclusively by paper-based drawings, and no digital data such as a computer-based product model or even a CAD model are available. In order to perform mass calculation, damage mapping or a recalculation of the structure, these drawings of the building under consideration have to be analysed manually by the engineer, which is a very time-consuming job. In order to close this gap between the drawings of an existing building and a digital product model, an approach is presented in this paper to digitise a drawing, to build up geometric and topological models, and to recognise construction parts of the building. Finally, all recognised parts are transformed into a three-dimensional geometric model which provides all geometric information necessary for the product model. During this import process the semantics of a ground floor plan have to be converted into a 3D model.
The authors' own research in applied single-criterion and multi-criteria optimisation of bar structures, together with an analysis of the accessible literature on structural synthesis, allows an attempt to define a general algorithm for formulating a structural optimisation problem. The practical value of such an algorithm consists, in the authors' opinion, in enabling a designer to create a correct mathematical model of synthesis problems, independently of the known mathematical methods employed for finding an unconditional extremum of a function of several variables. The proposed algorithm is not a ready-for-use tool for solving all optimisation problems, but it constitutes an easy-to-expand theoretical basis. This basis should allow a designer to arrive at a proper set of compromises on the way to constructing a mathematical model of a specific optimisation problem. The algorithm presented in the paper is constructed as a sequence of successive questions, which the designer answers with yes or no, together with a set of selections from a knowledge base consisting of the components of an optimisation problem. The order of the questions adopted by the authors in the algorithm is subjective, but it is supported by their experience both in applied optimisation and in the design of structures such as trusses and frames.
Maxwell's equations can be rewritten in terms of a perturbed Dirac operator D + a. The advantage is that in this setting Maxwell's equations are treated as a system of first order differential equations. To ensure the uniqueness of a non-homogeneous differential equation in the whole space, additional conditions are needed.
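A sketch of the factorization behind such reformulations, under the usual Clifford-algebra conventions for the time-harmonic case; the sign conventions and the wave number \(\alpha\) are assumptions of this sketch, not taken from the abstract:

    \[
      D = \sum_{k=1}^{3} e_k \,\partial_{x_k}, \qquad D^2 = -\Delta,
    \]
    \[
      -(D - \alpha)(D + \alpha) = \Delta + \alpha^2, \qquad \alpha = \omega\sqrt{\varepsilon\mu},
    \]

so a field with \((D + \alpha)u = 0\) automatically satisfies the Helmholtz equation \((\Delta + \alpha^2)u = 0\) that governs the time-harmonic Maxwell system.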
A realistic and reliable model is an important precondition for the simulation of revitalization tasks and the estimation of the system properties of existing buildings. The main focus lies on parameter identification, optimization strategies and the preparation of experiments. As usual, the structures are modelled by the finite element method. This technique, like others, is based on idealizations and empirical material properties. Within a given theory, the parameters of the model should be approximated by gradually performed experiments and their analysis. This approximation is performed by solving an optimization problem which is usually non-convex, of high dimension, and possesses a non-differentiable objective function. Therefore we use an optimization procedure based on genetic algorithms which was implemented by using the program package SLang...
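A sketch of such an identification loop with an evolutionary optimiser; scipy's differential evolution is used here as a generic stand-in for the genetic algorithm implemented in SLang, and the two-parameter model and data are illustrative.

    import numpy as np
    from scipy.optimize import differential_evolution

    # "Measured" response (illustrative) and a simple parametrised model.
    t = np.linspace(0.0, 1.0, 50)
    measured = (2.0 * np.exp(-1.5 * t)
                + 0.01 * np.random.default_rng(3).standard_normal(50))

    def objective(p):
        amplitude, decay = p
        residual = measured - amplitude * np.exp(-decay * t)
        return np.max(np.abs(residual))     # non-differentiable objective

    result = differential_evolution(objective,
                                    bounds=[(0.0, 5.0), (0.0, 5.0)], seed=3)
    print(result.x)                         # identified parameters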
Models in the context of engineering can be classified into process based and data based models. Whereas a process based model describes the problem by an explicit formulation, a data based model is often used where no such mapping can be found due to the high complexity of the problem. Artificial Neural Networks (ANN) are a data based model which is able to "learn" a mapping from a set of training patterns. This paper deals with the application of ANN in time dependent bathymetric models. A bathymetric model is a geometric representation of the sea bed. Typically, a bathymetry is measured and afterwards described by a finite set of measured data. Measuring at different time steps leads to a time dependent bathymetric model. To obtain a continuous surface, the measured data have to be interpolated by some interpolation method. Unlike explicitly given interpolation methods, the presented time dependent bathymetric model using an ANN trains the approximated surface in space and time in an implicit way. The ANN is trained by topographic measured data consisting of the location (x,y) and the time t. In other words, the ANN is trained to reproduce the mapping h = f(x,y,t), and afterwards it is able to approximate the topographic height for a given location and date. In a further step, this model is extended to take meteorological parameters into account. This leads to a model of more predictive character.
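A minimal sketch of such a data based model using a small feed-forward network; the synthetic soundings and the scikit-learn estimator are illustrative stand-ins for the paper's ANN and data.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Synthetic soundings: location (x, y), time t and measured depth h.
    X = rng.uniform(0.0, 1.0, (500, 3))                 # columns: x, y, t
    h = (np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])      # static sea bed shape
         + 0.1 * X[:, 2]                                # slow temporal trend
         + 0.01 * rng.standard_normal(500))             # measurement noise

    # Train the mapping h = f(x, y, t) implicitly, as described above.
    ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    ann.fit(X, h)
    print(ann.predict([[0.5, 0.5, 0.7]]))               # depth at a new point/date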
Computing in civil engineering, or Bauinformatik for short, has developed continuously in Germany in recent years and has gained a firm place in the civil engineering faculties of German universities. Initially, this development focused very strongly on computing the physical behaviour of structures. In recent years further fields have been added, above all computer-aided drafting (CAD). These developments have reached a high degree of maturity and have a firm place in teaching and research in Bauinformatik. The role of Bauinformatik in general has developed rapidly in recent years: the breadth of applications has constantly increased, and the demand for seamless usability of all information processed in this context is raised ever more strongly. Two problem areas in particular are at the centre of many developments in the construction industry and the corresponding software houses: the development and use of technical models to support the construction process, and the support of business management requirements in the construction industry. The second requirement especially has hardly been a subject of Bauinformatik so far; current efforts, above all from practice, are working to change this. The Faculty of Civil Engineering at the Bauhaus-Universität is beginning to respond to these demands by continuously creating the necessary prerequisites for competently taking up this subject in teaching and research. External lecturers who are at the forefront of the corresponding developments are being integrated into the teaching. This is intended to initiate a continuous further development that will help to ensure an up-to-date education of future civil engineers.
The failure probability with respect to a limit state is usually determined by the integral I of the joint density of the basic variables over the failure domain. A closed-form solution is only possible in the special case of normally distributed basic variables and a linear limit state equation. In other cases various approximation methods are in use, which are based on the moments of the basic variables and on suitably chosen indices as safety measures. Greater accuracy is offered by the first and second order reliability theories, which likewise start from I. The contribution presents a novel method whose starting point is not I but the force method, a standard algorithm of structural engineering. Including the governing random variables in the matrix of coefficients and in the load terms generalises the system of elasticity equations to a random system of elasticity equations. Its solution, obtained by passing to a deterministic substitute system, yields the statically indeterminate quantities as functions of the random variables acting in the system (e.g. the Young's modulus of the members and the loading). Since this relation is available analytically, the effect of individual random influences on the statically indeterminate quantities and on the resulting safety-relevant state variables can be assessed. The density function of the limit state equation can be computed or determined by simulation, from which I follows. Random variables that are not normally distributed are taken into account by expansion into orthogonal polynomials of Gaussian random variables.
For more than fifty years, methods of probability theory have also been used to investigate structural safety. Despite the progress achieved since then and the obvious advantages, this approach has not yet gained a sufficient foothold in practice. The contribution treats the problem of structural safety with a novel method. In contrast to the usual probabilistic methods it does not start from distribution functions; instead, the governing random variables are placed at the centre and introduced directly into the computational procedure. The mathematical tool is the Wiener chaos polynomials: in the space of random variables with bounded variance they form a basis with which an arbitrary random variable can be expanded into orthogonal polynomials of Gaussian random variables. This yields an effective formalism that closely follows the conventional displacement method and may be regarded as its probabilistic generalisation. The method delivers the limit state condition as a function of the random variables acting on the structure. The failure probability can therefore be determined by Monte Carlo simulation, and the difficulties connected with the evaluation of the probability integral in the First Order Reliability Method (FORM) are avoided. An example structure shows how changes of certain design parameters affect the failure probability.
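The expansion into orthogonal polynomials of Gaussian random variables can be sketched for a single lognormal quantity; the target variable X = exp(Z) and the truncation order are illustrative choices, not the paper's example.

    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # Expand X = exp(Z), Z ~ N(0,1), in probabilists' Hermite polynomials,
    # X = sum_k c_k He_k(Z) with c_k = E[X He_k(Z)] / k!.
    nodes, weights = hermegauss(40)            # quadrature for weight exp(-z^2/2)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalise to the N(0,1) density

    c = []
    for k in range(6):
        he_k = np.zeros(k + 1); he_k[k] = 1.0  # coefficient vector of He_k
        moment = np.sum(weights * np.exp(nodes) * hermeval(nodes, he_k))
        c.append(moment / math.factorial(k))

    # For this example the exact coefficients are e^(1/2)/k!.
    print(np.round(c, 4))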
Objects for civil engineering applications can be identified by their reference in memory, their alphanumeric name or their geometric location. Particularly in graphical user interfaces, it is common to identify objects geometrically by selection with the mouse. As the number of geometric objects in a graphical user interface grows, it becomes increasingly important to perform the basic operations add, search and remove for geometric objects with great efficiency. Guttman has proposed the R-tree for geometric identification in an environment which uses pages on disc as data structure. Minimal bounding rectangles are used to structure the data in such a way that neighborhood relations can be described effectively. The literature shows that the parameters which influence the efficiency of R-trees have been studied extensively, but without conclusive results. The goal of the research reported in this paper is to determine reliably the parameters which significantly influence the efficiency of R-trees for geometric identification in technical drawings. In order to make this investigation conclusive, it must be performed with the best available software technology; therefore an object-oriented software implementation of the method is developed. This implementation is tested with technical drawings containing many thousands of geometric objects. These drawings are created automatically by a stochastic generator which is incorporated into a test bed consisting of an editor and a visualiser. This test bed is used to obtain statistics for the main factors which affect the efficiency of R-trees. The investigation shows that the following main factors affecting the efficiency can be identified reliably: the number of geometric objects in the drawing; the minimum and maximum number of children of a node of the tree; and the maximum width and height of the minimal bounding rectangles of the geometric objects relative to the size of the drawing.
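Geometric identification with an R-tree, sketched with the Python rtree package (which wraps libspatialindex); the rectangles are illustrative, and the package is a stand-in for the object-oriented implementation developed in the paper.

    from rtree import index

    # Build an R-tree over minimal bounding rectangles (xmin, ymin, xmax, ymax)
    # of drawing objects, keyed by object id.
    idx = index.Index()
    drawing = {1: (0, 0, 2, 1), 2: (1, 1, 3, 3), 3: (5, 5, 6, 7)}
    for obj_id, mbr in drawing.items():
        idx.insert(obj_id, mbr)

    # Geometric identification: which objects could lie under the mouse?
    hits = list(idx.intersection((0.5, 0.5, 1.5, 1.5)))
    print(hits)   # e.g. [1, 2]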
The current situation in structural design is characterised by the cooperative interaction of a fairly large number of specialists from different disciplines (architecture, structural design, etc.) in temporary project consortia. In coordinating the resulting complex, dynamic and interlinked planning processes, planning deficiencies and losses of quality frequently occur. This article shows how agent technology can provide approaches to improving the planning situation. For this purpose an agent model for networked cooperative structural design is presented and demonstrated by the design of a pedestrian arch bridge. The agent model captures (1) the participating specialist planners and organisations, (2) the structure-specific planning processes, (3) the associated (partial) product models and (4) the engineering software used. From this, three submodels are derived: (1) an agent-based cooperation model, (2) an agent-based product model integration and (3) a model for agent-based software integration. The focus of the article is on the presentation of the agent-based cooperation model.
The tasks of civil engineering require direct access to the objects of a data base that are distributed over the workspace of a session and several binary files. In the methods of an application, every object should be directly addressable by its name as a persistent identifier, independently of where it is stored. If required, objects should automatically be loaded from files. The user should be able to determine in which file a particular object is stored. The access time to an object should on average be comparable to the access time to an object in a Java method. A concept for a generalised data base is presented, and its capability is compared with the existing software for the storage and management of data in civil engineering. It proves expedient to distinguish strictly between persistent and transient objects already in the design of an application. All persistent objects of the application are named. Unnamed persistent objects of the Java platform, for example collections and graphical objects, can also be stored with a handle (a name assigned by the data base). Access to an object is fast, since only primitive data types and strings are stored in binary form, without recourse to database systems.
Applications for civil engineering tasks usually contain graphical user interfaces for the engineering processes, and the persistent objects of the applications are stored in data bases. This paper investigates how the interaction between a graphical user interface and a data base influences the development of a civil engineering application. A graphic application for the linear elastic analysis of plane frames, previously developed with standard tools of the Java platform, is compared to a redesigned implementation using a generalized data base for persistent objects. The investigation leads to the following results: - A strict distinction between persistent and transient objects influences the class structure of an application, in particular the class structure of a graphical user interface. - The structure of an application depends on the logic for updating references to persistent and transient graphical objects after an application is read from a file. - The complexity of the reference management can usually be handled better by just-in-time referencing associated with String identifiers than by automated referencing associated with Name identifiers.
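The naming idea behind the generalized data base can be sketched in a few lines (here in Python rather than Java, purely for brevity); the one-file-per-name layout and the use of pickle are illustrative simplifications.

    import pickle
    import pathlib

    class NamedObjectBase:
        """Toy data base: name -> object, with lazy loading from binary files."""

        def __init__(self, directory):
            self.dir = pathlib.Path(directory)
            self.cache = {}                         # transient: loaded objects

        def put(self, name, obj):
            self.cache[name] = obj
            with open(self.dir / name, "wb") as f:  # one file per named object
                pickle.dump(obj, f)

        def get(self, name):
            if name not in self.cache:              # load on demand
                with open(self.dir / name, "rb") as f:
                    self.cache[name] = pickle.load(f)
            return self.cache[name]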
In recent years, research and development in the field of structural design software have concentrated on extending the functional scope. As a consequence, it is now necessary to make the increased functionality accessible to the widest possible range of users through work environments designed in an engineering-appropriate way, so that work can be as efficient and error-free as possible. From the structural designer's point of view, engineering-appropriate software must provide a user-software interaction adapted to the specific workflow; the required functionalities have to be integrated into a uniform system, and adaptability by the user has to be ensured. Meeting these requirements with conventional means would demand a disproportionately high development effort. Consequently, from the software developer's point of view, a modern software architecture for structural design must additionally provide a higher degree of reuse and independent extensibility. This contribution presents a concept based on compound documents that allows standard software and domain-specific software components to be combined into an engineering-appropriate work environment. The analysis and documentation of a structural element, including the associated data management, can thus take place within one compound document. At the same time, the degree of software reuse can be raised beyond the level achieved so far by defining a component framework as an independently extensible software architecture and by employing software components with their own user interfaces. The feasibility of the concept is demonstrated by a pilot implementation.
The aim of this work is the development of concepts for the efficient implementation of engineering-appropriate structural design software. The proposed approach is based on an optimal adaptation of the user interface to the specific workflow, an increased degree of reuse, and the possibility of independently extending software systems. Compound documents are used as user interface and software architecture in order to combine standard software and domain-specific software components into engineering-appropriate structural design software. The feasibility of the concept is verified by a pilot implementation.
Today's methods for the target specification of construction work (cost determination and scheduling) are based on an abstracted and simplified view of the interrelations in construction projects. Bills of quantities, cost estimates and construction schedules are only indirectly oriented to the geometry of the building and the construction site. The media used, such as paper, 2D files, digital specifications or 3D representations, make the search for information on the construction site a time-consuming process that is inefficient in view of existing media technologies. Interactive virtual environments allow rigid interrelations to be resolved by interactive interventions of the user and visualise complex construction production processes. The concept of visual interactive simulation of construction production provides for developing the target specification on the basis of an interactive 3D model, in order to take spatial changes and parallel processes on the virtual construction site better into account when making decisions on the construction sequence. If a high degree of interactivity with the 3D model is demanded, computer game technologies lend themselves very well to verification purposes. The visual interactive simulation of construction production is thus to be understood as a 3D-model-based method of process modelling that requires decisions as input and delivers the cost determination and the scheduling as output.
The contribution focuses on the development of a basic computational scheme that provides a suitable calculation environment for the coupling of analytical near-field solutions with numerical standard procedures in the far-field of the singularity. The proposed calculation scheme uses classical methods of complex function theory, which can be generalized to 3-dimensional problems by using the framework of hypercomplex analysis. The adapted approach is mainly based on the factorization of the Laplace operator \(\Delta\) by the Cauchy-Riemann operator \(\bar{\partial}\) and its conjugate \(\partial\) (in the plane, \(\Delta = 4\,\partial\bar{\partial}\)), where exact solutions of the respective differential equation are constructed by using an orthonormal basis of holomorphic and anti-holomorphic functions.
In this paper the influence of changes in the mean wind velocity, the power-law exponent of the wind profile, the drag coefficient of the terrain and the structural stiffness on structural models of different complexity is investigated. The paper gives a short introduction to wind profile models and to Davenport's approach for computing the structural response to wind-induced vibrations. First, this approach is demonstrated on a simple example (a skyscraper). This simple example allows the reader to study the differences in variance when one of the above-mentioned parameters is changed and to see the influence of structural models of different complexity on the result. Furthermore, an approach for estimating the required level of discretization is given. With this knowledge, the design of structural models can be based on a deeper understanding of the different behaviour of the individual models.
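The power-law wind profile that enters such studies is simple to state; the reference values and the exponent below are illustrative, not those of the paper.

    # Power-law wind profile: v(z) = v_ref * (z / z_ref) ** alpha,
    # with the exponent alpha depending on the terrain roughness.
    def mean_wind(z, v_ref=25.0, z_ref=10.0, alpha=0.16):
        return v_ref * (z / z_ref) ** alpha

    for z in (10, 50, 100, 200):
        print(f"z = {z:4d} m  ->  v = {mean_wind(z):5.1f} m/s")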
Rectangular steel frames subjected to strong ground motion are considered. Their behavior factor is evaluated numerically using nonlinear time history analysis and different ground acceleration records. The behavior factor is determined under the assumption that a severe collapse mechanism occurs during the time history. The system of equations is transformed into a single equation and then the energy balance concept is applied. The expression for the behavior factor is derived, its application to a four-story, two-bay steel frame is illustrated, and the corresponding results are discussed.
The development of a life-cycle-structured cooperation platform is described, which is based on an integrated process- and goal-oriented project model. Furthermore, the structure of a life-cycle-oriented object structure model and its implementation in the platform are documented. The complete conceptual model is described, which represents the basis of a life-cycle-oriented structuring of the planning object and supports the thematic classification of the object and project management data.
This paper describes ongoing research on the representation of and reasoning about construction specifications, which is part of a bigger research project that aims at developing a formalism for automating the identification of deviations and defects on construction sites. We specifically describe the requirements on product and process models and an approach for representing and reasoning about construction specifications to enable the automated detection and assessment of construction deviations and defects. This research builds on previous research on modeling design specifications and extends and elaborates the concept of contexts developed in that domain. The paper provides an overview of how the construction specifications are being modeled in this research and points out future steps that need to be accomplished to develop the envisioned automated deviation and defect detection system.
Thin-walled spatial structures are widely used in modern technology and construction. In the fuel industry, reservoirs of various capacities are used for the long-term storage of oil and gas, which for technological reasons may be buried in the soil. Reservoir shells combine high toughness with low specific consumption of material. At the same time, being under the soil, they experience steady-state and dynamic loads from the surrounding medium which, particularly when the reservoir is empty, can lead to a loss of stability of its shape. On the other hand, the contact interaction of shell and soil strongly depends on the properties of the surrounding medium and its saturation with liquid. For building generalised models of a porous elastic medium saturated by a liquid, Biot's equations of motion for the displacements of the solid and fluid phases can be used. The elaboration of such mathematical interaction models and their realisation by means of modern computing software allows the behaviour of spatial thin-walled structures to be studied on the basis of the geometrically nonlinear theory of shells.
In earlier research, generalized multidimensional Hilbert transforms have been constructed in m-dimensional Euclidean space, in the framework of Clifford analysis. Clifford analysis, centred around the notion of monogenic functions, may be regarded as a direct and elegant generalization to higher dimension of the theory of the holomorphic functions in the complex plane. The considered Hilbert transforms, usually obtained as a part of the boundary value of an associated Cauchy transform in m+1 dimensions, might be characterized as isotropic, since the metric in the underlying space is the standard Euclidean one. In this paper we adopt the idea of a so-called anisotropic Clifford setting, which leads to the introduction of a metric dependent m-dimensional Hilbert transform, showing, at least formally, the same properties as the isotropic one. The Hilbert transform being an important tool in signal analysis, this metric dependent setting has the advantage of allowing the adjustment of the co-ordinate system to possible preferential directions in the signals to be analyzed. A striking result to be mentioned is that the associated anisotropic (m+1)-dimensional Cauchy transform is no longer uniquely determined, but may stem from a diversity of (m+1)-dimensional "mother" metrics.
The one-dimensional continuous wavelet transform is a successful tool for signal and image analysis, with applications in physics and engineering. Clifford analysis offers an appropriate framework for taking wavelets to higher dimension. In the usual orthogonal case Clifford analysis focusses on monogenic functions, i.e. null solutions of the rotation invariant vector valued Dirac operator ∂, defined in terms of an orthogonal basis for the quadratic space R^m underlying the construction of the Clifford algebra R_{0,m}. An intrinsic feature of this function theory is that it encompasses all dimensions at once, as opposed to a tensorial approach with products of one-dimensional phenomena. This has allowed for a very specific construction of higher dimensional wavelets and the development of the corresponding theory, based on generalizations of classical orthogonal polynomials on the real line, such as the radial Clifford-Hermite polynomials introduced by Sommen. In this paper, we pass to the Hermitian Clifford setting, i.e. we let the same set of generators produce the complex Clifford algebra C_{2n} (with even dimension), which we equip with a Hermitian conjugation and a Hermitian inner product. Hermitian Clifford analysis then focusses on the null solutions of two mutually conjugate Hermitian Dirac operators which are invariant under the action of the unitary group. In this setting we construct new Clifford-Hermite polynomials, starting in a natural way from a Rodrigues formula which now involves both Dirac operators mentioned. Due to the specific features of the Hermitian setting, four different types of polynomials are obtained, two types of even degree and two types of odd degree. These polynomials are used to introduce a new continuous wavelet transform, after thorough investigation of all necessary properties of the involved polynomials, the mother wavelet and the associated family of wavelet kernels.
On the basis of the little material available (an architecture plan and some photographs), a computer model is developed for a bullet shaped dome, part of the Belgian Congo pavilion created by the architect Henry Lacoste for the International Colonial Exhibition of 1931 in Paris. The ingenious and elegant wooden skeleton of the dome is approximated in two stages. The first approximation focusses on the curves traced on the dome by the wooden laminae, which appear to be loxodromes, cutting the meridians at a constant angle. In a second approximation the very specific joints of the laminae are taken into consideration. The resulting computer image shows an astonishing resemblance to the photographs. Finally, the shapes and dimensions of all laminae are calculated, enabling a possible reconstruction of the dome.
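For a spherical idealisation, such loxodromes can be generated directly from their defining property of cutting the meridians at a constant angle; the radius, the bearing angle and the spherical approximation of the bullet shaped dome are assumptions of this sketch.

    import numpy as np

    def loxodrome(beta, lat0=0.0, lon0=0.0, lat1=1.4, n=200, R=1.0):
        """Points of a loxodrome cutting all meridians at constant angle beta."""
        lat = np.linspace(lat0, lat1, n)
        # Mercator relation: lon = lon0 + tan(beta) * ln(tan(pi/4 + lat/2))
        lon = lon0 + np.tan(beta) * (np.log(np.tan(np.pi / 4 + lat / 2))
                                     - np.log(np.tan(np.pi / 4 + lat0 / 2)))
        x = R * np.cos(lat) * np.cos(lon)
        y = R * np.cos(lat) * np.sin(lon)
        z = R * np.sin(lat)
        return np.column_stack([x, y, z])

    pts = loxodrome(beta=np.radians(60.0))   # one lamina curve, ready to plot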
Euclidean Clifford analysis is a higher dimensional function theory offering a refinement of classical harmonic analysis. The theory is centered around the concept of monogenic functions, i.e. null solutions of a first order vector valued rotation invariant differential operator called the Dirac operator, which factorizes the Laplacian. More recently, Hermitean Clifford analysis has emerged as a new and successful branch of Clifford analysis, offering yet a refinement of the Euclidean case; it focusses on the simultaneous null solutions, called Hermitean (or h-) monogenic functions, of two Hermitean Dirac operators which are invariant under the action of the unitary group. In Euclidean Clifford analysis, the Clifford-Cauchy integral formula has proven to be a cornerstone of the function theory, as is the case for the traditional Cauchy formula for holomorphic functions in the complex plane. Previously, a Hermitean Clifford-Cauchy integral formula has been established by means of a matrix approach. This formula reduces to the traditional Martinelli-Bochner formula for holomorphic functions of several complex variables when taking functions with values in an appropriate part of complex spinor space. This means that the theory of Hermitean monogenic functions should encompass also other results of several variable complex analysis as special cases. At present we will elaborate further on the obtained results and refine them, considering fundamental solutions, Borel-Pompeiu representations and the Teodorescu inversion, each of them being developed at different levels, including the global level, handling vector variables, vector differential operators and the Clifford geometric product, as well as the blade level, where variables and differential operators act by means of the dot and wedge products. A rich world of results reveals itself, indeed including well-known formulae from the theory of several complex variables.
Image processing has been much inspired by human vision, in particular with regard to early vision. The latter refers to the earliest stage of visual processing, responsible for the measurement of local structures such as points, lines, edges and textures in order to facilitate the subsequent interpretation of these structures in higher stages (known as high level vision) of the human visual system. This low level visual computation is carried out by cells of the primary visual cortex. The receptive field profiles of these cells can be interpreted as the impulse responses of the cells, which are then considered as filters. According to the Gaussian derivative theory, the receptive field profiles of the human visual system can be approximated quite well by derivatives of Gaussians. Two mathematical models suggested for these receptive field profiles are, on the one hand, the Gabor model and, on the other hand, the Hermite model, which is based on the analysis filters of the Hermite transform. The Hermite filters are derivatives of Gaussians, while Gabor filters, which are defined as harmonic modulations of Gaussians, provide a good approximation to these derivatives. It is important to note that, even if the Gabor model is more widely used than the Hermite model, the latter offers some advantages, like being an orthogonal basis and having a better match to experimental physiological data. In our earlier research both filter models, Gabor and Hermite, have been developed in the framework of Clifford analysis. Clifford analysis offers a direct, elegant and powerful generalization to higher dimension of the theory of holomorphic functions in the complex plane. In this paper we expose the construction of the Hermite and Gabor filters, both in the classical and in the Clifford analysis framework. We also generalize the concept of complex Gaussian derivative filters to the Clifford analysis setting. Moreover, we present further properties of the Clifford-Gabor filters, such as their relationship with other types of Gabor filters and their localization in the spatial and in the frequency domain formalized by the uncertainty principle.
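The closeness of the two filter models can be made concrete in one dimension; the width, the modulation frequency and the comparison after normalisation are illustrative choices of this sketch.

    import numpy as np

    x = np.linspace(-4.0, 4.0, 401)
    sigma = 1.0
    g = np.exp(-x**2 / (2 * sigma**2))

    # First derivative of the Gaussian (Hermite model, order 1).
    hermite1 = -x / sigma**2 * g

    # Odd Gabor filter: harmonic modulation of the same Gaussian; for a
    # suitable frequency it approximates the first Gaussian derivative.
    gabor_odd = np.sin(x / sigma) * g

    # Compare the shapes after normalising both to unit maximum.
    err = np.max(np.abs(hermite1 / np.max(np.abs(hermite1))
                        - gabor_odd / np.max(np.abs(gabor_odd))))
    print(f"max deviation of normalised shapes: {err:.3f}")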
The conventional way of describing an image is in terms of its canonical pixel-based representation. Other image description techniques are based on image transformations. Such an image transformation converts a canonical image representation into a representation in which specific properties of the image are described more explicitly. In most transformations, images are locally approximated within a window by a linear combination of a number of a priori selected patterns. The coefficients of such a decomposition then provide the desired image representation. The Hermite transform is an image transformation technique introduced by Martens. It uses overlapping Gaussian windows and projects images locally onto a basis of orthogonal polynomials. As the analysis filters needed for the Hermite transform are derivatives of Gaussians, Hermite analysis is in close agreement with the information analysis carried out by the human visual system. In this paper we construct a new higher dimensional Hermite transform within the framework of quaternionic analysis. The building blocks for this construction are the Clifford-Hermite polynomials rewritten in terms of quaternionic analysis. Furthermore, we compare this newly introduced Hermite transform with the Quaternionic-Hermite continuous wavelet transform. The continuous wavelet transform is a signal analysis technique suitable for non-stationary, inhomogeneous signals for which Fourier analysis is inadequate. Finally, the developed three-dimensional filter functions of the Quaternionic-Hermite transform are tested with traditional scalar benchmark signals for their selectivity in detecting pointwise singularities.
For reliable planning within existing buildings, a wealth of very different information has to be taken into account, which is often only obtained during the planning or construction process. The prerequisite is always a survey of the existing building stock. Although computer programs supporting such building surveys exist, they are exclusively isolated solutions, and exporting the recorded data into a planning system entails losses of information. Despite the potential of current CAAD/BIM systems to manage as-built data, they are primarily designed for the planning of new buildings. The continuous processing of refurbishment projects within one CAAD/BIM system, from the survey of the existing building through design and approval planning to detailed design, is currently not adequately supported. At the Chair Informatik in der Architektur (InfAR) of the Faculty of Architecture at the Bauhaus-Universität Weimar, concepts and prototypes for the domain-oriented support of planning within existing buildings have been developed in recent years within the DFG collaborative research centre SFB 524 "Werkzeuge und Konstruktionen für die Revitalisierung von Bauwerken". The focus was on capturing all as-built data relevant to planning and on mapping them in a dynamic building model. Building on this research, the article deals with the context-related reuse and targeted provision of as-built data in the process of planning within existing buildings and with the integration of concepts of planning-relevant building surveys into commercial CAAD/BIM systems.
Iso-parametric finite elements with linear shape functions generally show too stiff an element behavior, called locking. When structural parts under bending loads are investigated, so-called shear locking appears because these elements cannot reproduce pure bending modes. Many studies have dealt with the locking problem, and a number of methods to avoid the undesirable effects have been developed. Two well known methods are the >Assumed Natural Strain< (ANS) method and the >Enhanced Assumed Strain< (EAS) method. In this study the EAS method is applied to a four-node plane element with four EAS parameters. The paper describes the well-known linear formulation, its extension to nonlinear materials, and the modeling of material uncertainties with random fields. For nonlinear material behavior the EAS parameters cannot be determined directly. Here the problem is solved by using an internal iteration at the element level, which is much more efficient and stable than the determination via a global iteration. To verify the deterministic element behavior, the results of common test examples are presented for linear and nonlinear materials. The modeling of material uncertainties is done by point-discretized random fields. To show the applicability of the element for stochastic finite element calculations, Latin Hypercube Sampling was applied to investigate the stochastic hardening behavior of a cantilever beam with nonlinear material. The enhanced linear element can be applied as an alternative to higher-order finite elements, where more nodes are necessary. The presented element formulation can be used in a similar manner to improve stochastic linear solid elements.
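Latin Hypercube Sampling for such a stochastic study can be sketched with scipy; the two parameters and their ranges are illustrative, not those of the cantilever example.

    import numpy as np
    from scipy.stats import qmc

    # Latin Hypercube design for two uncertain material parameters.
    sampler = qmc.LatinHypercube(d=2, seed=42)
    unit = sampler.random(n=100)                  # samples in [0, 1)^2

    # Scale to illustrative ranges: yield stress [MPa], hardening modulus [MPa].
    samples = qmc.scale(unit, l_bounds=[220.0, 500.0], u_bounds=[280.0, 1500.0])

    # Each row is one input set for a deterministic finite element run.
    print(samples[:3])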
In the context of finite element model updating using vibration test data, natural frequencies and mode shapes are used as validation criteria. Consequently, the order of natural frequencies and mode shapes is important. As only limited spatial information is available and noise is present in the measurements, the automatic selection of the most likely numerical mode shape corresponding to a measured mode shape is a difficult task. The most common criterion to indicate corresponding mode shapes is the modal assurance criterion. Unfortunately, this criterion fails in certain cases. In this paper, the pure mathematical modal assurance criterion will be enhanced by additional physical information of the numerical model in terms of modal strain energies. A numerical example and a benchmark study with real measured data are presented to show the advantages of the enhanced energy based criterion in comparison to the traditional modal assurance criterion.
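The traditional modal assurance criterion itself is a one-line computation; the real-valued mode shape vectors below are illustrative, and the strain-energy enhancement of the paper is not shown.

    import numpy as np

    def mac(phi_1, phi_2):
        """Modal assurance criterion between two mode shape vectors."""
        num = np.abs(phi_1.conj() @ phi_2) ** 2
        return num / ((phi_1.conj() @ phi_1) * (phi_2.conj() @ phi_2))

    phi_a = np.array([0.0, 0.31, 0.59, 0.81, 0.95])   # measured (illustrative)
    phi_b = np.array([0.0, 0.30, 0.60, 0.80, 1.00])   # numerical candidate
    print(f"MAC = {mac(phi_a, phi_b):.3f}")           # close to 1 -> likely match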
The presented work focuses on collaboration experiences gathered in complex design and engineering projects using the learning platform POLE Europe. Within the POLE environment, student teams from different universities, disciplines and cultural backgrounds are assigned to real-world projects with clearly defined design tasks, usually to be accomplished within one semester while working in a virtual environment most of the time. The concept of POLE and the information and collaboration technology are described.
Current software solutions for real estate planning, construction and use do not model the complete life cycle of a building. Well-integrated software tools exist for the planning and construction phases, and data integrity is maintained throughout these phases, but problems occur at the transition to the use phase: at this interface, the complete data set of planning and execution gets lost. Another deficiency is that current software solutions do not handle construction work and maintenance work equally. This is why a new software generation is needed which continuously covers the entire workflow from the planning phase to the demolition of a building. New data concepts have to be developed which allow work items for construction to be brought together with work items for real estate use.
Using a three-span continuous beam as an example, the failure probability of reinforced concrete beams under variable-repeated loading is investigated with respect to the limit state of adaptation (shakedown). The shakedown analysis accounts for the load-dependent degradation of the bending stiffness due to cracking. The associated mechanical problem can be reduced to the shakedown analysis of linear elastic, perfectly plastic beam structures with unknown but bounded bending stiffness. The failure probability is computed taking stochastic structural and loading quantities into account. Structural properties and permanent loads are treated as time-invariant random variables. Time-variant loads are modeled as service-life extreme values of Poisson rectangular pulse processes, accounting for load-coincidence effects in time, so that the failure probability is likewise a service-life-related quantity. The mechanical problems are solved numerically by mathematical optimization. The failure probability is estimated statistically with the Monte Carlo method.
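A minimal Python sketch of the statistical estimation step only: the optimization-based shakedown analysis itself is collapsed into a single random capacity variable here, and all distributions, rates, and service-life values are invented.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000

# Time-invariant structural variables, lumped (for illustration) into one
# random shakedown load-factor capacity of the beam.
capacity = stats.lognorm(s=0.10, scale=1.3).rvs(n, random_state=rng)

# Service-life extreme of a Poisson rectangular pulse process:
# pulse amplitudes ~ F, renewal rate lam, service life T (assumed values).
lam, T = 1.0, 50.0                       # one pulse/year over 50 years
F = stats.gumbel_r(loc=0.7, scale=0.08)  # amplitude distribution (assumed)
u = rng.random(n)
# Extreme-value CDF: F_max(x) = exp(-lam*T*(1 - F(x))); invert for sampling.
# (Clipping handles the negligible no-exceedance branch for large lam*T.)
q = np.clip(1.0 + np.log(u) / (lam * T), 1e-12, 1 - 1e-12)
load_extreme = F.ppf(q)

# Limit state: failure if the extreme load factor exceeds the capacity.
p_f = np.mean(load_extreme > capacity)
print(f"estimated failure probability: {p_f:.2e}")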
The execution of construction projects requires a high degree of expert knowledge from various disciplines, and a multitude of specialized domain models is employed. To transfer data from other planners into one's own, newly created domain model, the available contents of the various models have to be adapted by the planner to his requirements and supplemented with domain-specific content. This gives rise to relations that reveal the interdependencies between the domain models. A predefinition of the model network described by these relations that is both generally valid and complete is hardly possible. For the computer-internal representation, the model network is therefore decomposed into partial models and corresponding links. The quality of the resulting web of relations depends both on the quality of the data models and on the descriptive quality of the link types, whose definition requires a high degree of expert knowledge. A concept for structuring and decomposing the links into basic elements, together with the possibility of composing them into more complex elements, enables simpler creation, maintenance, and adaptation of comprehensive construction-specific content. To ensure a high-quality description of the model network, access to the specification and adaptation of the relation definitions that is geared to the engineer's skills is indispensable.
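A hypothetical Python sketch of the decomposition idea: typed basic link elements between elements of partial models, composable into more complex link types. All class and relation names are invented.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class ElementRef:
    """Reference to one element inside a partial (domain) model."""
    model: str       # e.g. "architectural", "structural"
    element_id: str  # e.g. "wall_07"

@dataclass(frozen=True)
class BasicLink:
    """Smallest unit of the model network: a typed, directed relation."""
    source: ElementRef
    target: ElementRef
    link_type: str   # e.g. "idealizes", "derives_geometry_from"

@dataclass
class CompositeLink:
    """More complex relation composed from basic link elements."""
    name: str
    parts: list[BasicLink] = field(default_factory=list)

# Example: a structural idealization link between two partial models.
wall = ElementRef("architectural", "wall_07")
plate = ElementRef("structural", "shell_element_group_03")
link = CompositeLink("wall_to_fe_idealization",
                     [BasicLink(wall, plate, "idealizes")])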
Data exchange, data models, and product data models have been research topics for several years. Various research projects and initiatives by a number of companies have led to cross-domain approaches such as IFC and various STEP application protocols. In steel construction specifically, the projects "Produktschnittstelle Stahlbau" and "CIMsteel" have been developed, advanced, and revised. As a further development of the existing exchange formats, newer approaches attempt to extend their use beyond pure data transfer: they integrate aspects of communication, collaboration, and management, and additionally take on tasks of data and model administration. The result is a digital representation that incorporates all the data gathered. Owing to the particular boundary conditions of the construction industry, a building model is assembled from interrelated domain models.
Data Models for Engineering Tasks, Exemplified by Residential Buildings in Steel Construction
(2006)
Models form the basis of planning. They represent the properties of a building required for a given task in a form adapted to that specific task. Between the various models used to represent the building there are technical interrelations with respect to the aspects they capture. In current planning practice, these dependencies are accounted for on the basis of empirical values, normative requirements, and simplifying assumptions. More detailed modeling of building properties leads to a closer interlocking of the various models. To avoid isolated domain-specific islands, a correspondingly adapted representation of the relations between the individual models is required. With the growing demands on the execution of engineering tasks, a data-level representation that goes beyond providing selected building information and supporting data exchange between different planners is gaining importance. This presupposes a discussion of the requirements for such a description from a domain perspective. The domain requirements are examined using the example of residential buildings in steel construction.
SLang - the Structural Language: Solving Nonlinear and Stochastic Problems in Structural Mechanics
(1997)
Recent developments in structural mechanics indicate an increasing need for numerical methods that deal with stochasticity. This process started with the modeling of loading uncertainties. More recently, system uncertainties such as physical or geometrical imperfections have also been modeled in probabilistic terms. Clearly, this task requires a close connection of structural modeling with probabilistic modeling. Nonlinear effects are essential for a realistic description of structural behavior. Since modern structural analysis relies quite heavily on the Finite Element Method, it is reasonable to base stochastic structural analysis on this method. Commercially available software packages cover deterministic structural analysis over a very wide range. However, the applicability of these packages to stochastic problems is rather limited. On the other hand, there are a number of highly specialized programs for probabilistic or reliability problems which can be used only in connection with rather simplistic structural models. In principle, both kinds of software could be combined to achieve the goal; the major difficulty which then arises in practical computation is defining the most suitable way of transferring data between the programs. To circumvent these problems, the software package SLang (Structural Language) has been developed. SLang is a command interpreter which acts on a set of relatively complex commands. Each command takes input from and gives output to simple data structures (data objects), such as vectors and matrices. All commands communicate via these data objects, which are stored in memory or on disk. The paper shows applications to structural engineering problems, in particular failure analysis of frames and shell structures with random loads and random imperfections. Both geometrical and physical nonlinearities are taken into account.
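The following Python sketch is not SLang syntax; it only illustrates the architectural pattern the abstract describes, namely an interpreter whose commands communicate exclusively through named data objects such as vectors and matrices. All command and object names are invented.

import numpy as np

objects = {}  # shared store of named data objects (vectors, matrices)

def cmd_create_vector(name, values):
    objects[name] = np.asarray(values, dtype=float)

def cmd_matrix_multiply(a, b, out):
    objects[out] = objects[a] @ objects[b]

def cmd_print(name):
    print(name, "=", objects[name])

# Every command reads from and writes to the object store, so a stochastic
# command (e.g. sampling) and a structural command (e.g. a solver) can be
# chained without ad-hoc file transfer between separate programs.
cmd_create_vector("load", [1.0, 2.0])
objects["K_inv"] = np.array([[2.0, 0.0], [0.0, 0.5]])  # mock flexibility
cmd_matrix_multiply("K_inv", "load", out="displacement")
cmd_print("displacement")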
This contribution will be freewheeling in the domain of signal, image and surface processing and touch briefly upon some topics that have been close to the heart of people in our research group. A lot of the research carried out worldwide in this domain over the last 20 years deals with multiresolution. Multiresolution makes it possible to represent a function (in the broadest sense) at different levels of detail. This has been applied not only to signals and images but also to the solution of all kinds of complex numerical problems. Since wavelets came into play in the 1980s, this idea has been applied and generalized by many researchers. Therefore we use it as the central idea throughout this text. Wavelets, subdivision and hierarchical bases are the appropriate tools to obtain these multiresolution effects. We introduce some of the concepts in a rather informal way and show that the same concepts work in one, two and three dimensions. The applications in the three cases are, however, quite different, and thus one wants to achieve very different goals when dealing with signals, images or surfaces. Because completeness in our treatment is impossible, we have chosen to describe two case studies after introducing some concepts in signal processing. These case studies are still the subject of current research. The first one attempts to solve a problem in image processing: how to efficiently approximate an edge in an image by subdivision. The method is based on normal offsets. The second case uses Powell-Sabin splines to give a smooth multiresolution representation of a surface. In this context we also illustrate the general construction of a spline wavelet basis using a lifting scheme.
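The lifting scheme mentioned at the end is easiest to see for the Haar wavelet; the Python sketch below shows one level of the (unnormalized) Haar transform as split, predict, and update steps. It uses Haar rather than the Powell-Sabin spline wavelets of the paper.

import numpy as np

def haar_lifting_forward(x):
    """One level of the (unnormalized) Haar wavelet transform via lifting:
    split -> predict -> update, the three canonical lifting steps.
    x must have even length."""
    even, odd = x[0::2].copy(), x[1::2].copy()
    d = odd - even     # predict: detail = deviation from the even prediction
    s = even + d / 2   # update: coarse signal keeps the pairwise average
    return s, d

def haar_lifting_inverse(s, d):
    """Inverse transform: undo the lifting steps in reverse order."""
    even = s - d / 2
    odd = d + even
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([2.0, 4.0, 6.0, 8.0])
s, d = haar_lifting_forward(x)   # s = [3., 7.], d = [2., 2.]
assert np.allclose(haar_lifting_inverse(s, d), x)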
For complex foundation structures, planning errors can be avoided by consistent modeling. Manual calculation methods generally do not permit a three-dimensional approach. Numerical methods such as the finite element method are an ideal tool for an integral simulation of the problem. The discretization of complex subsoil structures required for a finite element analysis cannot be handled manually. This contribution shows how a finite element model can be generated automatically from a geotechnical model, taking into account the specific requirements of the soil-structure system and of the construction process. The treatment of the geometric and mechanical particularities during mesh generation is presented.
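As a toy illustration of deriving a discretization from a geotechnical model (far simpler than the automatic generation described above), the following Python sketch meshes a rectangular soil cross-section with a structured quad grid and tags each element with the soil layer containing its centroid; all dimensions and layer data are invented.

import numpy as np

width, depth = 20.0, 10.0   # domain size in m (assumed)
nx, ny = 20, 10             # elements per direction
layer_bottoms = {2.0: "fill", 6.0: "clay", 10.0: "sand"}  # depth -> material

xs = np.linspace(0.0, width, nx + 1)
ys = np.linspace(0.0, depth, ny + 1)
nodes = [(x, y) for y in ys for x in xs]  # y-major node numbering

def material_at(d):
    """Material of the horizontal layer containing depth d."""
    for bottom, name in sorted(layer_bottoms.items()):
        if d <= bottom:
            return name

elements = []
for j in range(ny):
    for i in range(nx):
        n0 = j * (nx + 1) + i
        quad = (n0, n0 + 1, n0 + nx + 2, n0 + nx + 1)  # counter-clockwise
        centroid_depth = (ys[j] + ys[j + 1]) / 2
        elements.append((quad, material_at(centroid_depth)))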
This paper describes a framework for the computer-aided conceptual design of building structures that proceeds from the architectural considerations of the building. The central task carried out during conceptual design is the synthesis of the structural system. The paper proposes a methodology for the synthesis of structural solutions. Given the nature of architectural constraints, user-model interactivity is devised as the most suitable computational approach for driving the structural synthesis process. Taking advantage of the hierarchical organization of the structural system, this research proposes a top-down approach to structural synthesis. Through hierarchical refinement, the approach lends itself to the synthesis of global and local structural solutions. The components required for implementing the proposed methodology are briefly described; the main ones have been incorporated in a proof-of-concept prototype that is being tested and validated with actual buildings.
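A hypothetical Python sketch of the top-down refinement idea: a hierarchy of design levels is traversed from global to local, with a callback standing in for the user interaction the paper describes; all option names are invented placeholders.

SYNTHESIS_TREE = {
    "global system": ["moment frame", "braced frame", "shear walls"],
    "floor system": ["flat slab", "composite deck", "ribbed slab"],
    "lateral detail": ["X-bracing", "V-bracing"],
}

def synthesize(choose):
    """Walk the hierarchy top-down; `choose` models the user interaction
    (a callback picking one alternative per design level)."""
    solution = {}
    for level, alternatives in SYNTHESIS_TREE.items():
        solution[level] = choose(level, alternatives)
    return solution

# Non-interactive stand-in for the user: always take the first option.
design = synthesize(lambda level, alts: alts[0])
print(design)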