In recent years the simulation environment ColSim has been developed at Fraunhofer ISE; it is particularly suited to the investigation of control systems in building energy supply systems. The design objective is the realization of simulation-based controller design, which permits the direct deployment of the control modules on so-called embedded systems. The simulation tool is characterized by a modular, open structure that allows flexible extension (cf. TRNSYS [1]). The implementation was carried out in standardized ANSI C, which guarantees platform-independent use. The current development platform is a Linux cluster; as target platforms, both embedded industrial PCs and classical microcontroller boards have been used so far. The design method is demonstrated on a system controller for a solar thermal plant with 120 m2 of collector area (a SolarThermie2000 plant), in which a networked control system with Internet integration is employed. The control system itself runs an operating system (a lean embedded Linux system) that permits communication with the outside world. The control system thus has access to climate and radiation data that are relevant to the control process. This external information can be used on the one hand to 'save' sensors, and on the other hand it permits the use of predictive control methods to minimize the fossil (auxiliary heating) energy input. With the help of simulation-based system studies, adaptive behavior of the control system can be tested, realizing autonomous plant identification. For example, in the solar thermal system described in more detail, the dead time resulting from the piping between the storage tank and the discharge group is to be determined.
The operation of the discharge pump will depend on the one hand on the availability of the buffer water, and on the other hand on the tap volume expected from the consumer. The networked control systems developed on the basis of the simulation models are in future intended to realize the complete energy flow analysis of the building, with a transparent presentation of the system behavior via an Internet visualization. The operator and user are informed directly through online services (SMS, e-mail, fax) about (faulty) plant behavior. Precisely the sensitive renewable systems tend, owing to their complexity, towards malfunctions that often go unrecognized, because the conventional subsystems (e.g. the natural gas burner) as a rule 'compensate' for the failure.
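The dead-time determination mentioned above can be approached, for instance, by cross-correlating the storage-outlet and discharge-group signals. The following is a minimal illustrative sketch; the signal names, sampling rate and delay are invented for demonstration and are not ColSim internals:

```python
import numpy as np

def estimate_dead_time(u, y, dt):
    """Return the lag (in seconds) that maximizes the cross-correlation
    between an input signal u and its delayed response y."""
    u = u - np.mean(u)
    y = y - np.mean(y)
    corr = np.correlate(y, u, mode="full")   # lags from -(N-1) to N-1
    lag = np.argmax(corr) - (len(u) - 1)
    return lag * dt

# Synthetic check: the discharge group sees the storage outlet signal 12 s later
dt = 1.0                        # 1 s sampling interval (assumed)
t = np.arange(200)
u = np.sin(0.05 * t)            # storage outlet temperature variation (assumed)
y = np.roll(u, 12)              # transport dead time of 12 samples
y[:12] = u[0]                   # avoid wrap-around artifact from np.roll
print(estimate_dead_time(u, y, dt))
```

A real controller would apply this to measured temperatures; the estimated lag then parameterizes the pump control.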
COMPARISON OF SOME VARIANTS OF THE FINITE STRIP METHOD FOR ANALYSIS OF COMPLEX SHELL STRUCTURES
(2000)
The subject of this paper is to explore and evaluate the semi-analytical, analytical and numerical versions of the finite strip method (FSM) for static, dynamic and stability analyses of complex thin-walled structures. Many bridge superstructures, some roof and floor structures, reservoirs, channels, tunnels, subways, layered shells and plates, etc. can be analysed by this method. In both the semi-analytical and analytical variants, beam vibration or stability eigenfunctions, orthogonal polynomials, and products of these functions are used as longitudinal functions of the unknowns. In the numerical FSM, spline longitudinal displacement functions are implemented. In the semi-analytical and numerical FSM, conventional transverse shape functions for displacements are used; in the analytical FSM, the exact function of the strip normal displacement and the plane stress function are applied. These three basic variants of the FSM are compared qualitatively and quantitatively with respect to the following: basic ideas, modelling, unknowns, DOF, kind and order of the strips, longitudinal and transverse displacement and stress functions, compatibility requirements, boundary conditions, ways of obtaining the strip stiffness and load matrices, kind and size of the structure stiffness matrix and its bandwidth, mesh density, necessary number of series terms along the length, accuracy and convergence of the stresses and displacements, approaches for refining results, input and output data, computer resources used, application area, closeness to other methods, and options for future development. A numerical example is presented. Advantages and shortcomings are pointed out. Conclusions are given.
The problem of computing stresses and settlements in the half-space under various types of loads arises frequently in geotechnical engineering. In 1885 Boussinesq advanced theoretical expressions to determine the stresses at a point within an ideal mass. His equation considers a point load on the surface of a semi-infinite, homogeneous, isotropic, weightless, elastic half-space. Newmark in 1942 performed the integration of Boussinesq's equations for the vertical stress under a corner of a rectangular area carrying a uniform load. The problem of determining the vertical stresses under a rectangular footing has been satisfactorily solved by repeated integration of Boussinesq's equation over an arbitrary rectangle on the surface of the half-space, with a non-uniform load represented by piecewise linear interpolation functions. The problem of determining the stresses when the footing shape is an arbitrary quadrilateral, however, remains unsolved. The paper discusses an approach to the computation of vertical stresses and settlements at an arbitrary point of the half-space, loaded with a uniform load whose shape in plan can be a general four-noded form with straight edges. Since the form is transformed into a biunit square and all integrations are performed over this area, all solutions are also valid for an arbitrary triangle through the implementation of the degeneration rule.
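Boussinesq's point-load solution referred to above has the closed form sigma_z = 3 Q z^3 / (2 pi R^5) with R^2 = r^2 + z^2; a direct transcription:

```python
import math

def boussinesq_sigma_z(Q, r, z):
    """Vertical stress at depth z and horizontal offset r due to a surface
    point load Q on a homogeneous, isotropic, weightless elastic half-space."""
    R = math.hypot(r, z)                       # distance from load to the point
    return 3.0 * Q * z**3 / (2.0 * math.pi * R**5)

# Directly beneath the load (r = 0) the formula reduces to 3Q / (2*pi*z^2):
sigma = boussinesq_sigma_z(100.0, 0.0, 2.0)    # e.g. 100 kN at 2 m depth
print(sigma)
```

Newmark's and the later piecewise-linear solutions integrate exactly this kernel over the loaded area.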
Computational steering provides methods for the integration of modeling, simulation, visualization, data analysis and postprocessing. The user has full control over a running simulation and the possibility to modify objects (geometry and other properties), boundary conditions and other parameters of the system interactively. The objective of such a system is to explore the effects of changes immediately and thus to optimize the target problem interactively. We present a computational-steering-based system for fluid flow problems in civil engineering. It is based on three software components, as shown in figure 1. The modeler is the CAD system AutoCAD, which offers a powerful programming interface allowing efficient access to the geometric data, as well as convenient manipulators for geometric objects. The simulation kernel is a Lattice-Boltzmann (LB) solver for the Navier-Stokes equations, which is especially suitable for transient flows in complex geometries. For visualization and postprocessing we use the software tool AVS, which provides a powerful programming interface and allows the efficient visualization of flow fields. These three components are interconnected through two communication modules and three interfaces, as depicted in figure 1. Interface 1 is responsible for the transformation of the modified system for the simulation kernel, interface 2 for the proper preparation of the simulation data, whereas interface 3 transforms the data from the modeler into a format suitable for the visualization system. The whole system is synchronized by the two communication modules.
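As a rough illustration of the kind of kernel mentioned (not the authors' implementation), a single D2Q9 BGK lattice-Boltzmann collide-and-stream step on a periodic grid can be sketched as:

```python
import numpy as np

# D2Q9 lattice velocities and weights (standard values)
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
tau = 0.8                                  # BGK relaxation time (assumed)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution (lattice units, c_s^2 = 1/3)."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f):
    """One collide-and-stream step with periodic boundaries."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f + (equilibrium(rho, ux, uy) - f) / tau     # BGK collision
    for i in range(9):                               # streaming
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f

# sanity check: a uniform fluid at rest is a fixed point of the update
f0 = equilibrium(np.ones((8, 8)), np.zeros((8, 8)), np.zeros((8, 8)))
f1 = step(f0)
```

In a steering system this `step` loop would run continuously while geometry and boundary conditions are exchanged through the communication modules.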
Designing lighting in a 3D scene is in general a complex task in building conception, as it is subject to many constraints such as aesthetics or ergonomics. It is often achieved by experimental trials until an acceptable result is reached. Several rendering software packages (such as Radiance) allow an accurate computation of the lighting at each point of a scene, but this is a long process, and any modification requires the whole scene to be rendered again to obtain the result. The first guess is empirical, provided by the experience of the operator and rarely subjected to scientific considerations. Our aim is to provide a tool that helps designers achieve this work within the scope of global illumination. We consider the problem when certain data are prescribed: on one hand the mean lighting in some zones (for example on a desktop), and on the other hand some qualitative information about the location of sources (spotlights on the ceiling, halogens on the north wall, ...). The system we are designing computes the number of light sources, their positions and intensities, in order to obtain the lighting effects defined by the user. The algorithms we use combine radiosity computations with the resolution of a system of constraints.
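The inverse step described above, determining source intensities from prescribed zone lighting, can be caricatured as a linear least-squares problem. The transfer matrix and targets below are invented for illustration; the actual system combines radiosity computations with constraint resolution:

```python
import numpy as np

def solve_intensities(A, E):
    """Given a transfer matrix A (entry [j, i] = illuminance contributed to
    zone j by source i at unit intensity, e.g. from a radiosity pass) and
    target zone illuminances E, solve A @ I ~= E for source intensities I."""
    I = np.linalg.lstsq(A, E, rcond=None)[0]
    return np.clip(I, 0.0, None)   # crude non-negativity; real solvers constrain

# toy example: 3 zones, 2 candidate sources (transfer values assumed)
A = np.array([[0.8, 0.1],
              [0.3, 0.5],
              [0.1, 0.9]])
E_target = A @ np.array([100.0, 50.0])   # an achievable target
I_hat = solve_intensities(A, E_target)
print(I_hat)
```

Clipping is a simplification; a constrained solver (e.g. non-negative least squares) would enforce physical intensities properly.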
Poland is not situated in any seismic region of the earth; however, there are still areas where underground mining is conducted. In these areas, so-called 'paraseismic tremors' are very frequent phenomena. When a building examination is carried out in order to determine its safety, it is necessary to make a complete analysis in which the influence of tremors is included. To decide whether a building is able to carry dynamic loads or not, it is necessary to compute its dynamic characteristics, i.e. natural frequencies. This is not possible with standard techniques. After a building has been diagnosed in situ by an expert, computer techniques together with specialized software for dynamic, static and strength analyses become a suitable tool. In this paper special attention is paid to a typical twelve-storey WGP (Wroclaw Great Plate) prefabricated building, with regard to its special type of joints. During dynamic actions these joints have a decisive influence on the building's behavior. Paraseismic tremors are especially dangerous for these buildings and can cause pre-failure states. It can be difficult and very expensive to prepare laboratory investigations of part of a building or of a separate joint; therefore computer modeling suitable for investigating the behavior of such elements and of whole buildings under different kinds of loads was used.
The paper describes a development of the analytical finite strip method (FSM) in displacements for the linear elastic static analysis of complex orthotropic prismatic shell structures, simply supported at their transverse ends, with an arbitrary open or closed deformable cross-section contour, under general external loads. A number of bridge superstructures, some roof structures and others belong to the studied class. By longitudinal sections the prismatic thin-walled structure is discretized into a limited number of plane straight strips, which are connected continuously along their longitudinal edges at linear joints. The three displacements of points on the joint lines and the rotation about these lines are taken as the basic unknowns. In the longitudinal direction of the strips, the unknown quantities and external loads are represented by single Fourier series. In the transverse direction of each strip, the unknown values are expressed by hyperbolic functions representing an exact solution of the corresponding differential equations of the plane straight strip. The basic equations and relations for the membrane state, the bending state and the total state of the finite strip are obtained. The stiffness matrix of the strip in the local and global co-ordinate systems is derived. The basic relations of the structure are given and the general stages of the analytical FSM are traced. For long structures the FSM is more efficient than the classic finite element method (FEM), since the problem dimension is reduced by one and the number of unknowns decreases. In comparison with the semi-analytical FSM, the analytical FSM leads to a practically exact solution, especially for wider strips, and provides compatibility of the displacements and internal forces along the longitudinal linear joints.
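The single Fourier series representation used in the longitudinal direction can be illustrated for the classical case of a uniform load on a simply supported span, whose sine-series coefficients are 4q/(m*pi) for odd m (a standard result, not specific to this paper):

```python
import numpy as np

def uniform_load_series(q, L, x, n_terms):
    """Partial sum of the sine series of a uniform load q over span L.
    Each sine term satisfies the simply supported end conditions exactly,
    which is why such series suit the FSM longitudinal direction."""
    s = np.zeros_like(x)
    for m in range(1, 2 * n_terms, 2):          # odd terms only
        s += 4 * q / (m * np.pi) * np.sin(m * np.pi * x / L)
    return s

L, q = 10.0, 5.0
x = np.linspace(0.05 * L, 0.95 * L, 50)         # away from the ends (Gibbs effect)
approx = uniform_load_series(q, L, x, 200)
print(np.max(np.abs(approx - q)))               # small away from the ends
```

In the FSM, loads and unknowns expanded this way decouple term by term, reducing the problem dimension by one.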
As is well known, the approximation theory of complex-valued functions is one of the main fields in function theory. In general, several aspects of approximation and interpolation are only well understood by using methods of complex analysis. It seems natural to extend these techniques to higher dimensions by using Clifford analysis methods or, more specifically, in the lower dimensions 3 or 4, by using tools of quaternionic analysis. One starting point for such attempts has to be the suitable choice of complete orthonormal function systems that should replace the holomorphic function systems used in the complex case. The aim of our contribution is the construction of a complete orthonormal system of monogenic polynomials derived from a harmonic function system by systematically using the generalized quaternionic derivative.
The contribution introduces an adaptable process model to meet the special requirements of coordinating planning activities in AEC (Architecture, Engineering, Construction). The process model is based on the concept of Coloured Petri Nets and uses meta-information to characterize process-relevant information and to enable process control based on the actual planning results.
Cost and Schedule Controlling in Relation to Liquidity Management during Construction Projects
(2004)
The present paper describes a software application which can be used for relating the scheduled events of a construction project with the respective financial parameters, leading to an overall improvement in general controlling and liquidity management. For this purpose, existing construction schedules are taken and details of the assignment are recorded. Thus it becomes possible to assess a future payment status should changes in the designated schedule occur.
The phenomenological and computational aspects of applying various damage models to low-cycle and multi-cycle fatigue processes are investigated. Damage is considered as an internal state variable describing the macroscopic effects of progressive material degradation, within the framework of continuum damage mechanics. The present analysis is restricted to the case of isotropic damage, which can be modeled by a scalar variable. Strain-, force- and power-type kinetic equations for describing the damage evolution are considered. An original mixed strain-power type damage model is developed to take into account the different physical fracture mechanisms in monotonic and cyclic loading. The constitutive equations of plastic flow theory, coupled and uncoupled to damage, have been considered. A rational algorithm for implementing the developed damage models into a finite element code is presented. A set of computational experiments has been carried out for various structures (large antenna structures, pipelines, fastening units, a nuclear reactor vessel) and cases of loading. The predictions of the developed model are compared with experimental data for 1X18H10T steel tubular specimens under complex loading paths and for complex-profile beams under cyclic loading. The damage field distribution is the basic information for predicting crack initiation in structures. The developed method of a structural parameter for stress concentration zones is discussed for correcting the predicted crack location. It allows crack initiation in the near-surface domain to be described, as observed in numerous experiments.
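A deliberately simple sketch of cycle-by-cycle damage accumulation with a strain-type kinetic equation (not the authors' mixed strain-power model; the constants and strain amplitude are assumed for demonstration):

```python
def cycles_to_failure(d_eps, C=1e-4, k=2.0, n=1.0, d_crit=0.99):
    """Integrate a scalar damage variable D with the kinetic equation
    dD/dN = C * d_eps**k / (1 - D)**n by explicit Euler, one cycle per step,
    until D reaches the critical value d_crit (taken as failure)."""
    D, N = 0.0, 0
    while D < d_crit:
        D += C * d_eps**k / (1.0 - D)**n   # damage grows faster as D -> 1
        N += 1
    return N, D

N_f, D_f = cycles_to_failure(d_eps=1.0)
print(N_f)
```

In a finite element setting this update would run at every integration point, and the resulting damage field would indicate likely crack initiation sites.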
The digital support of planning processes is a current research and work focus of the Chair of Computer Science in Architecture (InfAR) and the junior professorship of Architectural Informatics at the Faculty of Architecture of the Bauhaus-Universität Weimar. Anchored in the DFG collaborative research centre (Sonderforschungsbereich) 524, 'Tools and Constructions for the Revitalization of Buildings', concepts and prototypes for domain-oriented planning support are being developed. This contribution presents and discusses a concept and a prototypical realization for the end-to-end support of the entire building survey process for existing building stock. The focus is on the early phase of the building survey as one building block in a holistic IT-supported planning environment. Through the targeted recording of planning-relevant parameters and their evaluation with respect to economic efficiency and reusability, or the examination of variants of usage concepts, essential decisions for cost-effective planning are made precisely in this phase. The publication focuses on the following points: - structuring and recording of information during the initial walk-through - sketch-based representation as a basis for formulating first design intentions and variant studies - a navigation and information environment - targeted evaluation options (e.g. economic efficiency calculation, reuse of building components, calculation of demolition quantities)
The tasks of civil engineering are characterized by the fact that both the planning and the execution of structures are subject to frequent changes. If the behavior of structures and the construction process are described in the computer with models, the scope and structure of the models change as a consequence of the changes in planning and execution. This process is called dynamization of the model. Dynamization leads to changes and inconsistencies in the models of the applications. Updating and reconciling relationships within a model, as well as securing the consistency of the models with one another, are therefore of central importance for the solution of civil engineering tasks. For the last ten years the object-oriented method has been intensively developed and employed in modeling for civil engineering applications. It has been shown that the application of this method leads to improvements in important areas of modeling. At the same time, however, it has also become apparent that the aspects of updating and consistency, which are important for civil engineering, cannot be described appropriately with it. This contribution presents the concept and realization of a simple modeling method with which the updating of objects and models and the assurance of consistency in civil engineering systems can be handled.
The totality of product information in the construction industry presents itself as a very inconsistent and redundant body of information. Manufacturers face the task of publishing this product information not only in the usual catalogue form but also in new media such as CD-ROM and the WWW. The effort of data capture, once expended, must pay off through simple multiple use of the data. Because of its technological properties, the WWW is very well suited as a primary source of product information. Above all because of its potential currency, it should be the basis for publications in the other media. With comparatively little effort, WWW presentations of product data can be structured so that automated capture and processing of these data is simplified. In this way data consistency can be ensured end-to-end, from the visual presentation in conventional catalogues to an automatically generated product database. Against the background of the situation in the construction industry, which is characterized by interface problems, a heterogeneous software landscape and insufficient exploitation of the possibilities of computer science, a generalized approach based on SGML is presented. With such a data offering, new, distributed and networked applications based on the same product data become possible.
This paper presents a number of technical aspects of one of the most elaborate instrumentation and data acquisition projects ever undertaken in Canada. Confederation Bridge, the longest bridge built over ice-covered seawater, has been equipped with state-of-the-art data acquisition devices and systems as well as data transfer networks. The Bridge has been providing a fixed surface connection between Prince Edward Island and the Province of New Brunswick in Canada since its opening in 1997. The Bridge has a rather long design service life of 100 years. Because of its large size and long span length, its design is not covered by any existing codes or standards worldwide. The focus of the paper is to introduce the data acquisition, transfer, processing and management systems. The instrumentation and communications infrastructure and devices are presented in some detail, along with the data processing and management systems and techniques. Teams of engineers and researchers use the collected data to verify the analysis and design assumptions and parameters, as well as to investigate the short-term and long-term behaviour and health of the Bridge. The collected data are also used to further research activities in the field of bridge engineering and to deepen our knowledge about the behaviour, reliability and durability of such complex structures, their components and materials.
Advances in construction data analysis techniques have provided useful tools to discover explicit knowledge on historical databases supporting project managers’ decision making. However, in many situations, historical data are extracted and preprocessed for knowledge discovery based on time-consuming and problem-specific data preparation solutions, which often results in inefficiencies and inconsistencies. To overcome the problem, we are working on the development of a new data fusion methodology, which is designed to provide timely and consistent access to historical data for efficient and effective management knowledge discovery. The methodology is intended to be a new bridge between historical databases and data analysis techniques, which shields project managers from complex data preparation solutions, and enables them to use discovered knowledge for decision making more conveniently. This paper briefly describes the motivation, the background and the initial results of the ongoing research.
The problem of data interoperability is now very important. The formal description of construction systems and objects must be based upon a model of the construction data domain. XML was selected as the basis of a universal data format, ensuring a natural hierarchy of objects, flexibility, good layout and extensibility. The language developed by the author is called Building Object Description Extensible Markup Language (bodXML). With existing programming methods, the types of all objects used in data transfer have to be defined beforehand, which limits the possibilities of IT in the application of new types. But the recipient software must recognize building objects even if the kind of object is unknown at the outset. The author offers a set of main topological and geometric properties that is sufficient for the recognition of the main three-dimensional building constructions with flat edges. Tests of an artificial neural network have shown that the kind of construction, represented as a set of the indicated parameters, is recognized with sufficient confidence.
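Since the abstract does not give the bodXML schema, the following is a purely hypothetical illustration of the underlying idea (all element and attribute names are invented): an object carries topological and geometric parameters from which a recipient can attempt recognition even when the type tag itself is unknown:

```python
import xml.etree.ElementTree as ET

# Sender side: serialize an object of unknown type with its topological
# parameters (counts of vertices, edges and flat faces -- assumed features).
obj = ET.Element("object", {"type": "unknown"})
topo = ET.SubElement(obj, "topology")
for name, value in [("vertices", "8"), ("edges", "12"), ("faces", "6")]:
    ET.SubElement(topo, name).text = value
xml_text = ET.tostring(obj, encoding="unicode")

# Recipient side: parse the parameters without knowing the object type;
# a classifier (e.g. the neural network mentioned above) would map this
# feature set to a construction class.
parsed = ET.fromstring(xml_text)
features = {e.tag: int(e.text) for e in parsed.find("topology")}
print(features)
```

The feature dictionary, not the type tag, is what the recognition step would consume.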
Many construction and facilities management Web sites can be found on the Internet. On them, interested parties can find databases of best practices, calculators, analyzers, software, expert and decision support systems, neural networks, etc. Technological innovation in these sectors has come mainly through changes in the availability of information and communication technology, which has enabled a variety of new services. Most of these tools and on-line systems seek to find the most economic decisions, and most of these decisions are aimed solely at economic objectives. Alternatives under evaluation, however, have to be assessed not only from the economic standpoint but also with regard to qualitative, technical, technological and other characteristics. Based on an analysis of the existing calculators, analyzers, information, expert and decision support systems and neural networks, and in order to determine the most efficient versions of best practices, a Decision Support Web-Based System for Construction Innovation (IDSS) was developed at Vilnius Gediminas Technical University.
Building design, realization, operation and refurbishment have to take into account the environmental impacts as well as the resulting costs over a long period of time. LCA methods had to be developed specifically for buildings because of their complexity, their long service life and the large number of actors involved. This was realized by integrating life cycle assessment, life cycle costing and building product models into integrated LCA models. However, the use of such models leads to difficulties. The principal ones are the treatment of uncertainty in LCA models and the lack of experience of practitioners who are not LCA specialists. Answers to these problems are the management of uncertainty and the development of simplified models for building design, construction and operation. This can be achieved by means of experimental designs or Monte Carlo simulation. The paper focuses on how these techniques can be used and on their possibilities and disadvantages, particularly concerning the development of simplified models.
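The Monte Carlo treatment of uncertainty mentioned above can be sketched on a deliberately tiny impact model; all distributions and characterization factors below are assumed for illustration only:

```python
import random

random.seed(0)

def sample_impact():
    """Draw one sample of a toy LCA impact indicator: a linear combination
    of uncertain annual inventory inputs (assumed normal distributions)."""
    heating = random.gauss(120.0, 15.0)        # kWh/m2.a, uncertain
    electricity = random.gauss(40.0, 5.0)      # kWh/m2.a, uncertain
    return 0.25 * heating + 0.55 * electricity # kg CO2-eq/m2.a (assumed factors)

samples = [sample_impact() for _ in range(10000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
print(mean, var ** 0.5)
```

The resulting distribution, rather than a single number, is what supports uncertainty-aware decisions; a simplified model could then be fitted to such samples.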
The construction of a new building interferes with the existing environment. A careful aesthetic study must be made at an early stage of the design, and the visualization of a three-dimensional (3D) model of the structure is the best way to analyse it. As some structures present a complex shape, it is difficult to execute a 3D model as well as the specific drawings. Using traditional graphical systems, the execution of deck-specific drawings is extremely time-consuming, and the 3D deck model gives only an approximation of the exterior shape of the deck. The modelling scheme proposed here allows the automation of the geometric design phases related to the bridge deck element, using as a means of integration a geometric database representative of the real deck shape. This concept was implemented in a computer program, which is an important support in the design process, namely at the conceptual and graphical stages. The application provides an important tool for the bridge designer, particularly at the conceptual stage, as it allows aesthetic and structural evaluation of the bridge early in the design. The geometric modelling process and graphical results of a case study are presented.
The paper investigates the accuracy of deflection predictions made by the finite element package ATENA and by the design code methods of ACI and EC2. Deflections have been calculated for a large number of experimental reinforced concrete beams reported by three investigators. Statistical parameters have been established for each technique at different load levels, separately for beams with small and moderate reinforcement ratios.
In this contribution, the design of an analysis environment is presented that supports an analyst in reaching a decision within a gradual collaborative planning process. An analyst represents a project manager, planner or any other person involved in the planning process. Today, planning processes are managed by several geographically distributed planners and project managers, which increases the complexity of such processes even further. Predicting the consequences of many planning decisions is not possible, in particular since the assessment of planning progress is not trivial. Several viewpoints, which depend on individual perceptions, have to be considered. In the following, methods to realize planning decision support are presented.
In the field of civil engineering, the reinforced concrete design course (RC course) involves complicated design procedures and many difficult specifications, so most students regard the RC course as a tough course, and teachers very often find the class time insufficient. Teachers of the RC course also spend a lot of time organizing examinations that involve tedious calculations and complicated logical reasoning. Furthermore, correcting examination papers with partial scoring takes even more of the teacher's time. The objective of this research is therefore to design and develop a partial-scoring assessment system to meet the needs of engineering design courses such as the RC course. The assessment system can generate test items with variable parameters. It also supports inference diagnosis of an examinee's misconceptions and gives partial scores when grading the examination. In this research, the example test subject is the analysis of a rectangular reinforced concrete beam with a single layer of steel bars.
An important feature of the 2003 SARS outbreak in Canada, Singapore, and Hong Kong was that many health care workers (HCWs) developed SARS after caring for patients with SARS. This has been ascribed to inadequate or ineffective patient isolation. However, it is difficult for dense cities to provide sufficient isolation facilities within a short period of time. This has raised concerns from the public for new strategies in the planning and design of isolation facilities. Considering that SARS or other infectious diseases could seriously damage our society’s development, isolation facilities that could be rapidly and economically constructed with appropriate environmental controls are essential. For this reason, the design team of the Department of Architecture collaborated with a special task force from the Faculty of Medicine, who are the frontline medical officers treating the SARS patients, to design Rapidly Assembled Isolation Patient Wards. Both architecture and medicine are well established disciplines, but they have little in common in terms of the mode of knowledge construction and practice. This induced much intellectual exploration and research interest in conducting this study. The process has provided an important reference for cross disciplinary studies between the architectural and medical domains.
Usually, the co-ordination of design and planning tasks of a project in the construction industry is done in a paper-based way. Subsequent modifications have to be handled manually, and the effects of modifications cannot be determined automatically. The approach of specifying a complete process model before the project starts does not meet the requirements of the construction industry: the effort of specification at the beginning and during the process (modifications) does not justify the use of standard process modelling techniques. A new approach is presented in this paper. A complete process model is derived from a core, which consists of process elements and specific relations between them. Modifications need to be specified in the core only, so the specification effort is reduced. The derivation of the complete process is based on graph theory, and graph algorithms are also used to determine the effects of modifications during project work.
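The graph-theoretic determination of modification effects can be sketched as reachability over the directed graph of process elements (the example elements are invented for illustration):

```python
from collections import deque

def affected(graph, changed):
    """Return all process elements reachable from the changed element,
    i.e. those whose results are potentially invalidated, via BFS."""
    seen, queue = {changed}, deque([changed])
    while queue:
        node = queue.popleft()
        for succ in graph.get(node, []):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    seen.discard(changed)
    return seen

# toy process core: element -> elements that depend on its results
process = {"survey": ["draft design"],
           "draft design": ["structural analysis", "cost estimate"],
           "structural analysis": ["detail design"],
           "cost estimate": []}
print(sorted(affected(process, "draft design")))
```

Only the core relations need maintaining; the affected set falls out of the graph automatically whenever a modification occurs.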
DETERMINATION OF THE DYNAMIC STRESS INTENSITY FACTOR USING ADVANCED ENERGY RELEASE EVALUATION
(2000)
In this study a simple and effective procedure, practically based upon the FEM, is developed and applied for the determination of the dynamic stress intensity factor (DSIF) as a function of the input frequency, using an advanced strain-energy-release evaluation based on the simultaneous release of a set of fictitious nodal spring links near the crack tip. The DSIF is expressed in terms of the released energy per unit crack length. The formulations of linear fracture mechanics are adopted. The technique is theoretically based upon the eigenvalue problem for the assessment of the spring stiffnesses and on the modal decomposition of the crack shape. The inertial effects are included in the released energy. A linear elastic material, time-dependent loading of sine type and steady-state response of the structure are assumed. The procedure allows the opening, sliding and mixed modes of structural fracture to be studied. This rational and powerful technique requires a mesh refinement near the crack tip. A numerical test example of a square notched steel plate under tension is given; only the opening mode of fracture is studied. The DSIF is calculated using a coarse mesh and a single node release for the released-energy computation, as well as with a fine mesh and the simultaneous release of four links for more accurate values. The results are analyzed, and comparisons with the known exact results for static loading are presented. The values of the DSIF are significantly larger than the values of the corresponding static SIF, and significant peaks of the DSIF are observed near the natural frequencies. The approach is general, practicable, reliable and versatile.
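The final conversion from released energy per unit crack length to a stress intensity factor follows Irwin's relation K = sqrt(E'G), with E' = E for plane stress; a sketch with assumed numeric values:

```python
import math

def sif_from_released_energy(dU, da, thickness, E):
    """Convert released strain energy dU [J] over a crack advance da [m]
    in a plate of given thickness [m] to a stress intensity factor
    (plane stress, opening mode) via Irwin's relation K = sqrt(E * G)."""
    G = dU / (da * thickness)    # energy release rate [N/m]
    return math.sqrt(E * G)      # K in Pa*sqrt(m)

# assumed values: 0.5 J released over a 1 mm advance in a 10 mm steel plate
K = sif_from_released_energy(dU=0.5, da=1e-3, thickness=0.01, E=210e9)
print(K / 1e6)                   # in MPa*sqrt(m)
```

In the dynamic procedure, dU additionally contains the inertial contribution, which is what makes the resulting DSIF frequency dependent.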
For modeling the singular stress and deformation fields in elastomers with a crack, the use of a special three-dimensional finite element is proposed. The weak compressibility of elastomers is accounted for on the basis of a threefold approximation of the displacement field, the deformation field and the volume-change function. Under intensive cyclic loading of elastomeric structures with a crack, heating and large deformations at the crack tip must be taken into account. The stress-strain state of the elastomer with a crack is determined from the solution of a nonlinear problem by a modified Newton-Kantorovich method. Stress intensity factors are computed for a rectangular plate with various arrangements of a through crack. The development of a surface crack and the dissipative heating in a prismatic shear element are investigated.
When structures are designed for a limited service life, it can be worthwhile to monitor the load-bearing capacity of their structural systems in order to avoid damage and ensure safe operation. Monitoring, here based on vibrations of the structure, is usually carried out automatically by computer-aided measurement equipment. The computer checks specific physical parameters or characteristic functions of the structure for changes; damage causes such a change, and the task of system identification is to detect it. A model can be built, for example, on a theoretical basis as a finite element model, or as a black-box model from measured data using the methods of deterministic or stochastic system identification. This paper describes the analysis of general deterministic and stochastic excitations and their vibration responses for modeling and system identification. As application examples for structural monitoring, methods for damage detection and localization are presented. The paper concludes with remarks on the numerical modeling of wind loads as a stochastic process and on the coupling of these models with finite element models, in order to allow a better estimate of a structure's service life already during the design process.
Communication software and distributed applications for control and building performance simulation software must be reliable, efficient, flexible, and reusable. This paper reports on the progress of a project which aims at better integrated building and systems control modeling in building performance simulation through run-time coupling of distributed computer programs. These requirements motivate the use of the Common Object Request Broker Architecture (CORBA), which offers significant advantages over communication through simple abstractions. However, setting up highly available applications with CORBA is hard, and neither control modeling software nor building performance environments provide a simple interface to CORBA objects. This paper therefore describes an architectural solution for coupling distributed control and building performance software tools with CORBA objects, and discusses the difficulties of developing CORBA-based distributed building control simulation applications. The paper finishes by giving some recommendations.
Development and Analysis of Sparse Matrix Concepts for Finite Element Approximation on general Cells
(2004)
In engineering and computing, finite element approximation is one of the best-known computational solution techniques. It is a great tool for finding solutions to mechanical, fluid-mechanical and ecological problems. Whoever works with the finite element method will need to solve a large system of linear equations. There are different ways to find a solution: one is to use a matrix decomposition technique such as LU or QR; the other is to use an iterative solution algorithm like Conjugate Gradients, Gauß-Seidel, multigrid methods, etc. This paper focuses on iterative solvers and the storage techniques they require...
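The combination the abstract discusses, a sparse storage scheme plus an iterative solver, can be sketched with compressed sparse row (CSR) storage and the Conjugate Gradient method (a self-contained plain-Python illustration; a production code would use an optimized library):

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in compressed sparse row (CSR) form."""
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

def conjugate_gradient(values, col_idx, row_ptr, b, tol=1e-12, max_iter=1000):
    """Solve A x = b for a symmetric positive definite A in CSR storage."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                       # residual r = b - A x, with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = csr_matvec(values, col_idx, row_ptr, p)
        alpha = rs / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# 1D Poisson stiffness matrix tridiag(-1, 2, -1), n = 3, stored in CSR
values  = [2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0]
col_idx = [0, 1, 0, 1, 2, 1, 2]
row_ptr = [0, 2, 5, 7]
x = conjugate_gradient(values, col_idx, row_ptr, [1.0, 1.0, 1.0])  # x ≈ [1.5, 2.0, 1.5]
```

CSR stores only the nonzero entries plus index arrays, and CG touches the matrix only through matrix-vector products, which is why the two fit together so naturally for finite element systems.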
PKPM series CAD software is an integrated CAD system for building design which integrates the following parts: architectural design, structural design, building services design, and statistical analysis of quantities and budget. These four parts share the same database with high efficiency. Over 80% of design corporations in China use PKPM series CAD software. This paper introduces the details and some key modules of the PKPM series CAD software.
Change management has been the focus of different IT systems. These IT systems were developed to represent design information, record design rationale, and facilitate design coordination and changes. They are largely based on managing reactive changes, particularly design changes, in which changes are recorded and then propagated to the relevant project members. Proactive changes, however, are hardly dealt with in IT systems. Proactive changes require estimating the likelihood of occurrence of a change event as well as the degree of change impacts on project parameters. Changes in construction projects often result from the uncertainty associated with the imprecise and vague knowledge of much project information at the early stages of projects; this is a major outcome of the case studies carried out as part of this research. Therefore, the proposed model considers that incomplete knowledge and certain project characteristics are always behind change causes. For proactive changes, predicting a change event is the main modelling task. The prediction model should integrate these main elements: 1) project characteristics that lead to change, 2) causes of change, 3) the likelihood of change occurrence, and 4) the change consequences. It should also define the dependency relationships between these elements. However, only limited documented data are available from previous projects for change cases, and many of the above elements can only be expressed in linguistic terms. This means that the model must capture the uncertainty and subjectivity associated with these sets of elements. Therefore, a fuzzy model is proposed in this research. The model analyses the impact of each set of elements on the others by assigning fuzzy values that express the uncertainty and subjectivity of their impact. The main aim is to predict change events and evaluate change effects on project parameters.
The fuzzy model described above was developed in an IT system for operational purposes and was designed as a Java package of components with their supporting classes, beans, and files. This paper describes the development and the architecture of the proposed IT system to achieve these requirements. The system is intended to help project teams in dealing with change causes and then the change consequences in construction projects.
Digital maps lend themselves readily to use as route guide maps. Route guide maps are provided through Web or mobile phone services, and demand for such services is increasing. However, producing a route guide map requires a great deal of time, so it is difficult for general users to make route guide maps. The purpose of the present research is the development of a system that can generate a route guide map using the Digital Map 2500 (Spatial Data Framework) published by the Geographical Survey Institute. This system requires neither advanced equipment nor expert knowledge, so anyone can produce route guide maps easily and quickly. Using the Digital Map 2500 reduces the time and cost required to generate a map. Moreover, a useful route guide map can be created by simplifying the map form based on the human cognitive map.
A simulation system has been developed as a computer-aided design tool to evaluate the effect of a proposed design on the thermal environment during the design process. This system calculates outdoor surface temperatures in order to evaluate the thermal impact of a design factor in outdoor space. In this study, the previous heat balance simulation system was improved to predict the surface temperature of a proposed design using 3D CAD. The system allows complicated outdoor spatial forms to be input efficiently and evaluates the surface temperature distribution from any viewpoint.
Development of Urban Land Use Model to Compare Transit-Oriented and Automobile-Oriented Cities
(2004)
This study is an attempt to develop a simple simulation model that can compare the differences between automobile-oriented and transit-oriented cities, and clarify how city forms differ by transportation mode. Following the theoretical model development, a series of simulation runs is carried out. The model allocates people who commute to the CBD from residential zones along a transportation corridor. The simulation analyses show that automobiles need much more traffic space than transit, as indicated by the proposed traffic space ratio both in the CBD and along the corridor.
This paper reports on the latest results in the development of a new approach to simulating the thermal behavior of buildings that overcomes the limitations of conventional heat-transfer simulation methods such as FDM and FEM. The proposed technique uses a coarse-grain approach to model development whereby each element represents a complete building component such as a wall, internal space, or floor. The thermal behavior of each coarse-grain element is captured using empirical modeling techniques such as artificial neural networks (ANNs). The main advantages of the approach compared to conventional simulation methods are: (a) simplified model construction for the end-user; (b) simplified model reconfiguration; (c) significantly faster simulation runs (orders of magnitude faster for two- and three-dimensional models); and (d) potentially more accurate results. The paper demonstrates the viability of the approach through a number of experiments with a model of a composite wall. The approach is shown to sustain highly accurate long-term simulation runs if the coarse-grain modeling elements are implemented as ANNs. In contrast, an implementation of the coarse-grain elements using a linear model is shown to function inaccurately and erratically. The paper concludes with an identification of ongoing work and future areas for development of the technique.
Seismic insulation is one of the most progressive types of seismic protection. Seismic insulation is understood as applying special devices for reducing the inertial seismic loads acting on a building. The constructions shown in this article solve the problems of seismic protection significantly more effectively than existing seismic insulation systems (SIS). The Special Mechanical Engineering Design Office (KBSM) has developed, manufactured and tested a shock-absorber unit (SAU) of large load capacity. The SAU is a block of pneumatic shock-absorbers (PSA) mounted concentrically around a guiding cylinder. The protected object rests upon the upper, movable part of the guiding cylinder of the SAU; in turn, the lower part of the SAU rests on the foundation plate mounted on the ground. A set of SAUs is a constructive realization of the SIS. The efficiency of such a SIS has been proved both theoretically and experimentally. An effective SIS can also be created on the basis of the elasto-plastic shock-absorbers developed by the KBSM. The new design and construction solutions are based on original SIS elements derived from units long used in other fields of technical equipment, whose safety and durability have been proved by more than 20 years of service.
Decentralized microcontroller-based measurement technology in the control systems of historic buildings
(2003)
The background is formed by historic library buildings whose holdings are endangered by unfavorable indoor climatic conditions. Before building automation measures can be planned, the environmental quantities must be recorded over an extended period. The measures specifically target the climate control systems with their tasks of heating regulation and ventilation. A measurement system can be realized with the Beck >IPC@CHIP< microcontroller. The measurement system developed at the FH Fulda records air temperature, humidity, air pressure and UV intensity at three-minute intervals; these are considered the quantities affecting the organic materials of the collection. The measured values are pre-processed in an electronic circuit and fed to a microcontroller, which archives the data continuously over a period of up to seven years. With its internal flash memory and Ethernet interface, the microcontroller can make the collected measurements available as a web server. Once the measurement system is connected to the Internet, the climate histories can be displayed in an ordinary web browser. A visualization interface was created for this purpose in the Java programming language. It offers a wide range of data evaluation options, including access to every single measured value, to daily, monthly and annual means, and to computed minima and maxima. The flash memory was also used to store documentation and operating instructions, which are thus retrievable over the network. The entire measurement system is integrated into a book and has been recording the climatic conditions of the Herzogin-Anna-Amalia-Bibliothek in Weimar since August 2002. Once the long-term measurements are complete, it can be decided what contribution a building automation system with integrated climate control can make to preserving the collection.
Berlin's development into the new capital of unified Germany requires the extensive expansion of its inner-city transport routes and comprehensive integration into the national transport network. The upgrade of the Inter-City-Express line between Hannover and Berlin required the re-planning of large track sections of Deutsche Bahn AG. In the age of computer-aided information processing, many possibilities exist precisely in the field of structural design of engineering works. In the planning of new bridges, a continuous computer-aided design and structural analysis workflow from preliminary design through fabrication is possible. Furthermore, special calculation methods for particular load types permit a more exact determination of internal forces and stresses, and thus a more accurate verification corresponding to the real load-bearing behavior. The application of modern computing and of computer-aided design and analysis methods is discussed using examples from structural engineering in the rehabilitation of the Berlin Stadtbahn. The Stadtbahn runs for long stretches over brick masonry viaducts. The rehabilitation concept provided for a load-distributing slab and a slab-track superstructure; in addition, the track spacings were adapted to current federal railway regulations. For this reason, an exact re-analysis of the Stadtbahn viaducts was required, taking a wide range of parameters into account (e.g. joints in the load-distributing slab, cracking in the masonry, shear deformations in the joints between concrete and masonry when limit values are exceeded). The railway overpass over the Holzmarktstraße, located between Berlin Hauptbahnhof and Jannowitzbrücke station, was demolished. The new structure consists of a two-span, skewed deck bridge for four tracks.
Because of the complicated geometric boundary conditions, the construction documents were produced entirely by computer. The main dimensions of the structure were determined from survey data processed in the CAD program. The CAD data provided the geometric input for the analysis program, and the cross-section dimensioning from the analysis in turn fed back into the CAD work. The coordinates for shop fabrication and for assembly on site were likewise derived from the analysis results.
Inverse matrix iteration is a method for determining the eigenstates of real, symmetric matrices with a convex profile structure. The method is distinguished by preserving the profile structure and by determining the eigenvalues in ordered sequence. The iteration allows the targeted determination of several consecutive eigenvalues of smallest magnitude, as required in civil engineering, in particular for stability and vibration analyses. Application of the method has shown that some of the sought eigenvalues can be determined within a few iteration cycles, while others require a larger number. This local deterioration of convergence can be remedied by a Jacobi boundary correction, which is explained in more detail in the paper.
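The basic step of inverse iteration, repeatedly solving A y = x and normalizing so that the iterate converges to the eigenvector of the smallest-magnitude eigenvalue, can be sketched as follows (a dense toy implementation; the profile-preserving variant and the Jacobi boundary correction of the paper are not reproduced):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (dense, for illustration only)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def inverse_iteration(A, iters=50):
    """Estimate the smallest-magnitude eigenvalue of a symmetric A."""
    n = len(A)
    x = [1.0] * n
    for _ in range(iters):
        y = solve(A, x)                  # one step: solve A y = x
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]        # normalize the iterate
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    # Rayleigh quotient as the eigenvalue estimate
    return sum(xi * axi for xi, axi in zip(x, Ax)) / sum(xi * xi for xi in x)

lam = inverse_iteration([[2.0, -1.0], [-1.0, 2.0]])  # smallest eigenvalue of the 2x2 example
```

For the 2x2 matrix above the eigenvalues are 1 and 3, and the iteration converges to the smaller one; deflation or shifting extends the idea to several consecutive eigenvalues as described in the abstract.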
The arrival of passengers at a stop of a public transport line is described as a stochastic process. The arrivals of the vehicles form a renewal process, whereas the arrivals of passengers within a renewal period are modeled as a nonstationary Poisson process. Measurements of the intensity function are available. The quantity considered is the total waiting time of the passengers at a stop.
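Arrivals of a nonstationary Poisson process with a given intensity function can be sampled by Lewis-Shedler thinning, from which the total waiting time over one renewal period follows directly (a sketch with an invented intensity function, not the measured one from the paper):

```python
import random

def thinning_arrivals(rate, rate_max, T, rng):
    """Sample arrival times of a nonstationary Poisson process on (0, T]
    via Lewis-Shedler thinning; rate_max must bound rate(t) on [0, T]."""
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)        # candidate from homogeneous process
        if t > T:
            return arrivals
        if rng.random() <= rate(t) / rate_max:  # accept with probability rate(t)/rate_max
            arrivals.append(t)

def total_waiting_time(rate, rate_max, T, rng):
    """Passengers arriving in (0, T] all board the vehicle arriving at time T;
    the total wait is the sum of T - t_i over the arrival times t_i."""
    return sum(T - t for t in thinning_arrivals(rate, rate_max, T, rng))

rng = random.Random(42)
# hypothetical intensity rising toward the departure: lambda(t) = 0.2 + 0.1 t
wait = total_waiting_time(lambda t: 0.2 + 0.1 * t, rate_max=1.2, T=10.0, rng=rng)
```

For this intensity the expected total wait per period is the integral of (T - t) lambda(t), about 26.7 here, which a simulation average reproduces.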
The railway bridges of Lehrter Bahnhof in Berlin - a holistic FE analysis concept
(2000)
The analysis models commonly used often seem inadequate to the complexity of modern bridge structures. Structural analyses are in many cases still based on decomposing the bridge into individual components and treating them with different partial models. Against the background of steadily growing computing power, this no longer appears up to date; the same applies, for example, to the common practice of analyzing plate-like bridge superstructures with beam models. This paper presents a holistic analysis concept that allows the entire structure to be analyzed on the basis of a single FE model. For all components, in addition to the computation of state variables, the design of reinforced and prestressed concrete members is carried out, up to and including verifications such as crack-width limitation. The application of this concept is shown using the railway overpass of the new Lehrter Bahnhof in Berlin as an example. The FE model comprises the subsoil, the foundations, the steel and cast-steel substructure, and the steel and prestressed concrete superstructure. Special features include the modeling of the T-beam-like superstructure by eccentric, prestressable shell elements and the separate maintenance of structure-related and load-related input files. This allows the sequential treatment of different subgrade moduli for the simulation of static and dynamic loading, the consideration of the tensioning of, and the interaction between, prestressed steel bracing carrying horizontal loads, and the consideration of the different structural systems arising during the construction of the prestressed concrete superstructure.
The attempt to describe all existing kinds of building objects in a single schema leads to an excessively large schema. At the international center for construction informatics, the author formulated the particularities of the construction industry and concluded that it suffices to describe in the classes the topological, geometric and graphical aspects of building objects, together with a few auxiliary concepts. A main goal in developing a language is the simplicity of the schema; in total, only about 50 classes are proposed for CAD systems. In models of building objects generated by CAD systems, the largest share of the data concerns the geometric form of the elements and their position in space. Only 7 main classes are used to describe building objects: Entity, System, Clone, Context, Annotation, Figure and Process. The class >Clone< is of particular importance: it allows a compact description of a set of similar objects that differ in only a few parameters (for example, their position in space). The presented concept leads to a strong reduction in file size and makes it easier to recognize the objects during data exchange. The structure diagrams of the classes were created with the UML language. The classes are also described in XML format and can be read on the homepage >http://www.mtu-net.ru/pavlov/bodXML<. To test the applicability of the schema, examples of the description of various building objects were developed; these examples are given on the above-mentioned homepage.
The general design scheme is considered, which includes an analysis of the market, an assessment of the product range and production volume, and the key task of selecting machining regimes. The latter task is made concrete: formulas and optimization algorithms with respect to the criteria of productivity and prime cost are given. Formulas for the productivity of systems with flexible linkage are presented, as well as refined relations for tool life at low cutting speeds. Keywords: market, machining regimes, productivity, prime cost, optimization
The functionality of 3D modelers and the performance of generally available computer systems permit the modeling of arbitrary objects in every conceivable discipline. The potential of 3D modelers should be an incentive to use them as routinely as the established standard applications. The models shown in the presentation cover a small spectrum of the possible fields of application; they are special neither in data volume nor in their degree of complexity. They are merely intended to arouse interest in the everyday use of 3D modelers and, where present, to help reduce barriers to their adoption. The image sequences shown in the presentation can be found at http://www.uni-weimar.de/animationen.
For many purposes, geometric information about existing buildings is necessary, e.g. for planning conservation or reconstruction. Architectural photogrammetry is a technique for acquiring 3D geometric data of buildings for a CAD model from images. This paper describes the state of the art in architectural photogrammetry and some developments towards automation. The photogrammetric process consists of image acquisition, orientation and restitution. Special attention is paid to digital methods, from digital image acquisition to restitution methods supported by digital image processing. There are several fields of development towards automation, e.g. feature extraction, extraction of edges and lines, and the detection of corresponding points. The acquired data may be used in a CAD environment or for visualization in virtual reality models, using digital orthoimages for texture mapping.
Re-using knowledge in architecture, engineering and construction (AEC) firms can lead to greater competitive advantage, improved designs, and more effective management of constructed facilities. This paper discusses the importance of exploration and discovery of reusable knowledge in a corporate archive, as opposed to simple search and retrieval. We describe, and illustrate through a scenario of use, an exploration framework and prototype, CoMem™, that formalizes the added value of exploration in the process of knowledge reuse. We discuss two exploration activities: (i) breadth-only overview exploration, which assists a user in rapidly localizing pockets of reusable knowledge in the large corporate archive, and (ii) iterative breadth-depth exploration, which enables a user to identify those reusable components of the corporate memory that may yield design issues that were not originally considered.
Discrete-continual Finite Element Method of Analysis for Three-dimensional Curvilinear Structures
(2003)
This paper is devoted to the discrete-continual finite element method (DCFEM) of analysis for three-dimensional curvilinear structures. Operational and variational formulations of the problem in the ring coordinate system are presented. A discrete-continual design model for structures with constant physical and geometrical parameters in the longitudinal direction is offered on the basis of so-called curvilinear discrete-continual finite elements. The element coordinate system, the approximation of nodal unknowns, and the construction of the element nodal load vector are considered. The element system of differential equations is formulated using a special generalized block-structured stiffness matrix of the discrete-continual finite element, and local differential relations are formulated. The resultant multipoint boundary problem for a system of ordinary differential equations is given. A method for the analytical solution of multipoint boundary problems in structural analysis is offered as well. Its major features include universality, a computer-oriented algorithm involving the theory of distributions, computational stability, optimal conditioning of the resultant systems, and a partial Jordan decomposition of the matrix of coefficients, eliminating the need to calculate root vectors. Brief information on the developed software is provided.
Available construction time-cost trade-off analysis models can be used to generate trade-offs between these two important objectives; however, their application to large-scale construction projects is limited by their impractical computational requirements. This paper presents the development of a scalable, multi-objective genetic algorithm that can simultaneously optimize the construction time and cost of large-scale construction projects. The genetic algorithm was implemented in a distributed computing environment using a recent standard for parallel and distributed programming, the Message Passing Interface (MPI). The performance of the model is evaluated using a set of performance measures, and the results demonstrate the capability of the model to significantly reduce the computational time required to optimize large-scale construction projects.
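The core of any multi-objective time-cost optimization is Pareto dominance between (duration, cost) alternatives; a minimal filter for the non-dominated front can be sketched as follows (illustrative data, not from the paper; the MPI-distributed genetic algorithm itself is not reproduced):

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives (time, cost)
    and differs from b, i.e. strictly better in at least one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(solutions):
    """Keep the non-dominated (time, cost) alternatives - the trade-off
    curve a multi-objective GA evolves toward."""
    return [s for s in solutions if not any(dominates(o, s) for o in solutions)]

# hypothetical schedule alternatives, each a (duration, cost) pair produced
# by a different choice of execution modes for the activities
alternatives = [(12, 100), (10, 130), (14, 90), (10, 120), (15, 95)]
front = pareto_front(alternatives)  # the non-dominated trade-off set
```

In a GA of this kind, fitness is assigned from such dominance comparisons, and the population-wide comparisons are exactly the part that parallelizes well across MPI processes.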
In this contribution, the software design and implementation of an analysis server for the computation of failure probabilities in structural engineering is presented. The structures considered are described by an equivalent finite element model; the stochastic properties, such as the scatter of the material behavior or of the incoming load, are represented by suitable random variables. Within the software framework, a client-server architecture has been implemented, employing the middleware CORBA for communication between the distributed modules. The analysis server offers the possibility to compute failure probabilities for stochastically defined structures. To this end, several approximation methods (FORM, SORM) and simulation methods (Monte Carlo simulation and importance sampling) have been implemented. The paper closes by showing several examples computed on the analysis server.
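The simplest of the simulation methods mentioned, crude Monte Carlo, estimates the failure probability as the fraction of samples whose limit-state value is non-positive (a sketch with an invented resistance-load limit state; FORM/SORM and importance sampling are not shown):

```python
import random

def monte_carlo_pf(limit_state, sample, n, rng):
    """Crude Monte Carlo estimate of P[g(X) <= 0], the failure probability."""
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0)
    return failures / n

# hypothetical stochastic structure: normally distributed resistance R and load S
def sample(rng):
    r = rng.gauss(10.0, 1.0)   # resistance
    s = rng.gauss(6.0, 1.5)    # load
    return r, s

g = lambda x: x[0] - x[1]      # limit state: failure when the load exceeds the resistance
rng = random.Random(1)
pf = monte_carlo_pf(g, sample, n=100_000, rng=rng)
```

For this limit state the margin R - S is normal with mean 4 and standard deviation about 1.8, so the exact failure probability is roughly 1.3%; importance sampling exists precisely to reach much smaller probabilities with far fewer samples.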