The method of difference potentials can be used to solve discrete elliptic boundary value problems in which all derivatives are approximated by finite differences. In classical potential theory, an integral equation on the boundary is investigated and solved approximately with the help of a quadrature formula. The advantage of the discrete method is that it establishes a linear system of equations on the boundary which can be solved immediately on the computer. The method of difference potentials described here is based on the discrete Laplace equation in the three-dimensional case. In a first step, the integral representation of the discrete fundamental solution is presented and its convergence behaviour with respect to the continuous fundamental solution is discussed. Because the method can be used to solve boundary value problems in interior as well as exterior domains, it is necessary to explain some geometrical aspects of the discrete domain and the double-layer boundary. A discrete analogue of the integral representation for functions in will be presented. The main result consists in splitting the difference potential on the boundary into a discrete single-layer and a discrete double-layer potential, respectively. The discrete potentials are used to establish and solve a linear system of equations on the boundary. The actual form of these equation systems and the conditions for solvability are presented for Dirichlet and Neumann problems in interior as well as exterior domains.
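For reference, the discrete Laplace equation underlying such a method on a uniform grid with mesh width $h$ can be written with the standard seven-point stencil (a textbook formulation, not quoted from the paper itself):

```latex
\Delta_h u(x) \;=\; \frac{1}{h^{2}} \sum_{j=1}^{3}
\bigl[\, u(x + h e_j) \;-\; 2\,u(x) \;+\; u(x - h e_j) \,\bigr] \;=\; 0 ,
```

where $e_j$ denotes the $j$-th unit vector in $\mathbb{R}^3$, so the value at each grid point is coupled to its six axis-aligned neighbours.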
The increased implementation of site data capture technologies invariably results in an increased use of data warehousing and database technologies to store the captured data. However, restricted use of the data beyond the initial application could potentially result in a loss of understanding of site processes. This could in turn lead to poor decision making at the production, tactical and strategic levels. Concrete usage data have been collected from two piling processes. These data have been analysed, and the results highlight potential improvements that could be made to existing site management and estimating processes. A cost-benefit analysis has been used to support decision making at the strategic level, where the identified improvements require capital expenditure.
The preliminary design of a wearable computer for supporting Construction Progress Monitoring
(2000)
Progress monitoring has become more and more important as owners have increasingly demanded shorter delivery times for their projects. This trend is even more evident in high-technology industries, such as the computer industry and the chemical industry. Fast-changing markets, such as the computer industry, force companies to build new facilities quickly. To make a statement about construction progress, the status of a building has to be determined and monitored over a period of time. By depicting the construction progress in a diagram over time, statements can be made about the anticipated completion of the project and about delays and problems in certain areas. With this information, measures can be taken to efficiently 'catch up' on the schedule of the project. New technologies, such as wearable computers, speech recognition, touch screens and wireless networks, could help to move electronic data processing to the construction site. Progress monitoring could take great advantage of this move, as several intermediate steps of processing progress data become unnecessary. The processing of progress data could be done entirely by computers, which means that data for supporting decisions can be made available at the moment the construction progress is measured. This paper describes a project that investigates how these new technologies can be linked to create a system that enhances the efficiency of progress monitoring. During the project, a first prototype of a progress monitoring system was developed that allows construction companies and site supervisors to measure construction progress on site using wearable computers that are speech-controlled and connected to a central database via a wireless network.
The purpose of this paper is to review a finite element model for the non-linear crack analysis of reinforced concrete beams and slabs. The non-linear behaviour of concrete and steel is described. Some calculations of 'self-stress' for concrete and reinforced concrete beams were made. Current computational aspects are discussed, and several remarks for future studies are given. The numerical model of concrete and reinforced concrete is described. The paper shows the results of calculations on a reinforced concrete plane stress panel with cracks. A non-linear numerical model for the analysis of reinforced concrete was assumed, and calculations were made using the finite element method. The results of the calculations, such as displacements, stresses and cracking, are shown in diagrams. They were compared with experimental results and other findings. Conclusions about the described model and the calculation results are drawn.
This abstract proposes an instrumental system for the analysis of mechanics problems of the deformable solid body. It gives the researcher the possibility to describe, within one task, the input data for the object under analysis and the problem scheme based upon variational principles. A particular feature of the system is the possibility to describe information concerning an object of any geometrical shape and to define the computation scheme by a program suited to the purpose. The methods allow computing tasks with an indefinite functional and indefinite geometry of the object (or a set of objects). The system provides the possibility to compute tasks with an indefinite scheme based upon the Finite Element Method (FEM); the restrictions of the system's usage are therefore determined by the restrictions of the FEM itself. In contrast to other known programs using the FEM (ANSYS, LS-DYNA, etc.), the described system is more universal in the definition of input data and the choice of the computational scheme. An original subsystem for the analysis of numerical results is built in. It can visualise all numerical results, plot diagrams of the unknown variables, etc. The subsystem has been validated in solving two- and three-dimensional problems of elasticity and plasticity under conditions of geometrical nonlinearity. Contact problems of statics and dynamics are discussed.
The conventional way of describing an image is in terms of its canonical pixel-based representation. Other image description techniques are based on image transformations. Such an image transformation converts a canonical image representation into a representation in which specific properties of an image are described more explicitly. In most transformations, images are locally approximated within a window by a linear combination of a number of a priori selected patterns. The coefficients of such a decomposition then provide the desired image representation. The Hermite transform is an image transformation technique introduced by Martens. It uses overlapping Gaussian windows and projects images locally onto a basis of orthogonal polynomials. As the analysis filters needed for the Hermite transform are derivatives of Gaussians, Hermite analysis is in close agreement with the information analysis carried out by the human visual system. In this paper we construct a new higher-dimensional Hermite transform within the framework of Quaternionic Analysis. The building blocks for this construction are the Clifford-Hermite polynomials rewritten in terms of Quaternionic Analysis. Furthermore, we compare this newly introduced Hermite transform with the Quaternionic-Hermite Continuous Wavelet transform. The Continuous Wavelet transform is a signal analysis technique suitable for non-stationary, inhomogeneous signals for which Fourier analysis is inadequate. Finally, the developed three-dimensional filter functions of the Quaternionic-Hermite transform are tested on traditional scalar benchmark signals with respect to their selectivity in detecting pointwise singularities.
In this paper the results of investigations of the free oscillations of pre-stressed flexible structural elements are presented. Two cases of central preliminary stress are investigated: without intermediate fastening of the tie to the flexible element, and with intermediate fastening in the middle of the element length. The given physical model can be applied to flexible sloping shells and arches, membranes, and large space antenna fields (besides flexible elements). The peculiarity of these systems is the possibility of the existence of non-adjacent equilibrium forms at definite relations of the physical parameters. The transition from one stable equilibrium form to another, non-adjacent form may be treated as a jump; in this case they are called systems with buckling, or systems with two potential «gaps». Such systems opened a new branch of mathematical physics: the theory of chaos and strange attractors. The analysis of the solutions confirms the effect, first obtained by the author, of the doubling of the oscillation period of the system during the transition from the «small» oscillations about a centre to the «large» oscillations about all three equilibrium states. The character of the dependence of the frequency (period) on the free oscillation amplitudes of the non-linear system also confirms the earlier result on the duality of the system behaviour: «small» oscillations possess the qualities of a soft system, while «large» oscillations possess the qualities of a rigid system. The natural frequency of the «small» oscillations, depending on the oscillation amplitudes, changes within an interval: the frequency takes the value zero at the amplitude values Aa and Ad (or Aa and Ae), and takes its maximum value at the amplitude value near point b. The natural frequency of the «large» oscillations also changes within an interval.
The influence of the intermediate fastening of the tie does not introduce qualitative changes in the behaviour of the investigated system; it only increases (four times) the critical value of the preliminary tension force.
In the design of a structure, the implementation of reliable soil-foundation-structure interaction into the analysis process plays a very important role. The paper presents the determination of the parameters of a suitably chosen soil-foundation model and their influence on the structure response. Since the mechanical data for the structure can be determined with satisfactory accuracy, the properties of the soil-foundation model were identified using the measured dynamic response of the real structure. A simple model describing the soil-foundation structure was incorporated into a classical 3-D finite element analysis of the structure with commercial software. Results obtained from the measured data on the pier were afterwards compared with those obtained with the finite element model of the pier-foundation-soil structure. On the basis of this comparison, the coefficients describing the properties of the soil-foundation model were adjusted until the calculated dynamic response coincided with the measured one. In this way, the difference between the two results was reduced to 1%. Full-scale tests measuring the eigenmotion of the bridge were performed through all erection stages of the new bridge in Maribor. In this way, an effective and experimentally verified 3-D model for a complex dynamic analysis of the bridge under earthquake loading was obtained. The significant advantage of the obtained model is that it was updated on the basis of the dynamic measurements, thus improving the model on the basis of in-situ geomechanical measurements. The model is very accurate in describing the upper structure and economical in describing the soil mass, thus representing an optimal solution with regard to computational effort.
In this paper, a generalized formulation of the problem of computer modelling of the interaction of complex composite structures with different types of dynamic loads and effects is discussed. An analysis is given of the use of some universal computing systems for the solution of such problems. It is also shown that a quantification of the dynamic models of complex composite systems with variable structure, depending on the character and intensity of the effects, is necessary. Different variants of modelling joints and spatial structural elements are suggested. This allows the complex modes of joint bending-torsional oscillations of such structures as bridges, towers and high-rise buildings to be considered. The peculiarities of the modelling and testing of some problems of the aerodynamics of objects and of the interaction of framework constructions with shock and moving loads are considered. Examples of the dynamic analysis of complex composite structures are shown. This is achieved by means of special methods for introducing the real excitations and loads of the analogous in-service object into the computing model. The suggested models have found wide use both in the design of new structures and in the dynamic monitoring of in-service structures.
For planning in existing built contexts, the building survey is the starting point for initial planning proposals, for the diagnosis and documentation of building damage, for the creation of catalogues of objectives, for the detailed design of renovation and conversion measures, and for ensuring fulfilment of building legislation, particularly in the case of change of use and refitting. An examination of currently available IT tools shows insufficient support for planning within existing contexts, most notably a deficit with regard to information capture and administration. This paper discusses the concept of a modular surveying system (basic concept, separation of geometry from semantic data, and separation into sub-systems) and the prototypical realisation of a system for the complete support of the entire building surveying process for existing buildings. The project aims to contribute to the development of a planning system for existing buildings. ...
Pre-stressed structural elements are widely used in large-span structures. As a rule, they have higher stiffness characteristics. Pre-stressed rods can be applied as girders for different purposes, and as their separate parts, e.g. rods of trusses and frames. Among the numerous ways of pre-stressing, the compression of girders, trusses and frames by tightenings (ties) of high-strength materials is in common use.
The development of a consistent material model for textile reinforced concrete requires the formulation and calibration of several sub-models on different resolution scales. Each of these models represents the material structure at the corresponding scale. While the models at the micro-level are able to capture the fundamental failure and damage mechanisms of the material components (e.g. filament rupture and debonding from the matrix) their computational costs limit their application to the small size representative unit cells of the material structure. On the other hand, the macro-level models provide a sufficient performance at the expense of limited range of applicability. Due to the complex structuring of the textile reinforced concrete at several levels (filament - yarn - textile - matrix) it is a non-trivial task to develop a multiscale model from scratch. It is rather more effective to develop a set of conceptually related sub-models for each structural level covering the selected phenomena of the material behavior. The homogenized effective material properties obtained at the lower level may be verified and validated using experiments and models at the higher level(s). In this paper the development of a consistent material model for textile reinforced concrete is presented. Load carrying and failure mechanisms at the micro, meso and macro scales are described and models with the focus on the specified scales are introduced. The models currently being developed in the framework of the collaborative research center are classified and evaluated with respect to the failure mechanisms being captured. The micromechanical modeling of the yarn and bonding behavior is discussed in detail and the correspondence with the experiments focused on the selected failure and interaction mechanisms is shown. The example of modeling the bond layer demonstrates the application of the presented strategy.
The goal of the collaborative research center (SFB 532) 'Textile reinforced concrete (TRC): the basis for the development of a new material technology', installed in 1998 at the Aachen University, is a complex assessment of mechanical, chemical, economical and production aspects in an interdisciplinary environment. The research project involves 10 institutes performing parallel research in 17 projects. The coordination of such a research process requires effective software support for information sharing in the form of data exchange, data analysis and data archiving. Furthermore, the processes of experiment planning and design, modification of material compositions and design parameters, and development of new material models in such an environment call for systematic coordination applying the concepts of operational research. Flexible organization of the data coming from several sources is a crucial premise for a transparent accumulation of knowledge and, thus, for successful research in the long run. The technical information system (TRC-TIS) developed in the SFB 532 has been implemented as a database-powered web server with a transparent definition of the product and process model. It serves as an intranet server with access domains devoted to the involved research groups. At the same time, it allows the presentation of selected results simply by granting a data object access from the public area of the server via the internet.
Model management systems are a suitable technological basis for managing digital building models during planning activities, both for new construction and for the revitalization of buildings. Supporting revitalization processes implies specific requirements for the design of integrated planning environments, such as the representation of information affected by various types of vagueness, the need to represent both the target and the actual state of the building, and the ability to handle temporarily inconsistent model states. The required dynamism of the domain models and the required usability in virtual enterprises place further demands on the implementation basis of model management systems. For implementing such systems, it proves advantageous to exploit properties of object-oriented programming languages with non-static type systems, since their meta-level together with introspection and reflection mechanisms provides an efficient implementation basis. To effectively support synchronous cooperative planning activities within individual disciplines, a notification mechanism was realized that informs domain applications coupled to the model management system about concurrent modifications to the associated domain model or to project information. Furthermore, a mechanism exists for the simplified connection of existing applications that are based on static partial models or support standardized, model-based exchange formats. Finally, a hybrid system architecture consisting of a central project server, domain servers and domain clients is presented, which is suitable for use under the constraints of cooperative and geographically distributed work in revitalization projects within virtual enterprises.
The planning process in structural engineering is characterized by three cyclically recurring phases: the phase of task distribution, the phase of parallel processing with corresponding coordination, and the phase of merging the results. Available planning software mostly supports only the work in the second phase and the exchange of data sets via documents. The subject of this work is the development of a system architecture that in principle accounts for all phases of distributed work and for different kinds of cooperation (asynchronous, parallel, reciprocal) and integrates existing applications. The shared working material of the participants is abstracted not as a set of documents but as a set of object and element versions and their relationships. Elements extend objects with application-independent properties (features). To process a task, subsets are formed on the basis of the features; new versions are derived for their elements and loaded into a private workspace. The processing is reduced to operations with which the shared working material can be kept consistent. The system architecture is described formally by mathematical means, available technology is described, and its use is presented in an implementation concept. The implementation concept is implemented as a pilot. This is done in the Internet environment in the Java language, using a version management tool and relational databases.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process by building a system through the use of internet technology. The system links distributed databases and allows building partners to remotely access and manipulate specific information. It provides an updated building representation that is enriched and refined throughout the building production process. A recent collaboration with Nemetschek France (a subsidiary of Nemetschek AG, an AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help building project actors to choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility to build online building product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decisional support to the construction works selection process. To do so, we consider each actor's role in the system and the pieces of information each one needs to achieve a given task.
The subject of this talk is the problem of surface design based upon a mesh that may contain both triangular and quadrangular domains. We investigate the cases in which such a combined mesh proves preferable to a pure triangulation for bivariate data interpolation. First we describe a modification of the well-known flipping algorithm that constructs a locally optimal combined mesh with respect to a predefined quality criterion. Then we introduce two quality measures for triangular and quadrangular domains and present the results of a computational experiment that compares the integral interpolation errors and the errors in gradients of the piecewise surface models produced by the flipping algorithm with the introduced quality measures. The experiment shows that triangular meshes with the Delaunay quality measure provide better interpolation accuracy only if the interpolated function is strictly convex, whereas a saddle-shaped function is better interpolated by bilinear patches within a combined mesh. For a randomly shaped function, combined meshes demonstrate smaller error values and better stability compared with pure triangulations. Finally, we consider other resources for mesh improvement, such as excluding 'bad' points from the input set of the mesh generation procedure. Because the function values at these points should not be lost, some linear or bilinear patches are replaced by nonlinear patches that pass through the excluded points.
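The local flip decision in the purely triangular Delaunay case can be illustrated by the classical in-circle predicate (a textbook formulation; the paper's own quality measures for combined meshes are not reproduced here):

```python
def in_circumcircle(a, b, c, d):
    """Return True if point d lies strictly inside the circumcircle
    of the counterclockwise-oriented triangle (a, b, c).

    For two triangles (a, b, c) and (a, c, d) sharing the edge (a, c),
    a positive test means the shared edge violates the Delaunay
    criterion and should be flipped to (b, d).
    """
    # Translate so that d is at the origin, then take the sign of the
    # 3x3 determinant of the lifted points (x, y, x^2 + y^2).
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0
```

A locally optimal mesh in this sense is one in which no edge's predicate (or, more generally, no quality measure) can be improved by a single flip.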
The design of mobile IT systems, especially the design of wearable computer systems, is a complex task that requires computer science knowledge, such as that related to hardware configuration and software development, in addition to knowledge of the domain in which the system is intended to be used. Particularly in the AEC sector, it is necessary that the support from mobile information technology fit the work situation at hand. Ideally, the domain expert alone can adjust the wearable computer system to achieve this fit without having to consult IT experts. In this paper, we describe a model that helps in transferring existing design knowledge from non-AEC domains to new projects in the construction area. The base for this is a model and a methodology that describes the usage scenarios of said computer systems in an application-neutral and domain-independent way. Thus, the actual design information and experience will be transferable between different applications and domains.
Structural engineering projects are increasingly organized in networked cooperations due to a permanently enlarged competition pressure and a high degree of complexity while performing the concurrent design activities. Software that intends to support such collaborative structural design processes implicates enormous requirements. In the course of our common research work, we analyzed the pros and cons of the application of both the peer-to-peer (University of Bonn) and multiagent architecture style (University of Bochum) within the field of collaborative structural design. In this paper, we join the benefits of both architecture styles in an integrated conceptual approach. We demonstrate the surplus value of the integrated multiagent–peer-to-peer approach by means of an example scenario in which several structural engineers are co-operatively designing the basic structural elements of an arched bridge, applying heterogeneous CAD systems.
In a superelliptic shell joined to a circular cylinder, bending stresses are absent when it is subjected to uniform pressure. Some geometrical characteristics have been found. Expressions for determining the stresses in the shell crest (in a singular point of plane type) are suggested. The problem of the theoretical critical buckling load of an elongated shell supported by frames is studied. The critical buckling load for two shells with different specifications was found experimentally.
Analysis System for Bridge Test (Chinese name abbr.: QLJC) is an application software package specially designed for bridge tests, used to analyze the static and dynamic characteristics of bridge structures, calculate the efficiency ratio of load tests, extract the results at observation points, and so on. In this paper, the research content, system design, calculation theory, characteristics and practical application of QLJC are introduced in detail.
The goal of this work was to support the creation of new CAD plans by reusing suitable existing CAD plans. The result is a generic approach to case-based reasoning. Since spatial structure plays an important role in CAD plans, the concept is geared towards structure-oriented applications; I therefore call it a concept for "structure-oriented case-based reasoning". The work specifies the minimum of knowledge needed for retrieving and reusing cases, how knowledge beyond this minimum is processed, which relationships exist, for example, between comparison knowledge and adaptation knowledge, and how this knowledge can be modeled. For illustration, the required knowledge is presented on the basis of several applications. The concept presented in this work allows a query to be supplemented, refined, and corrected. The two decisive algorithms serve to compare query and case and to adapt the information of the case for modifying the query.
Information science researchers and developers have spent many years addressing the problem of retrieving exactly the information needed and using it for analysis purposes. In information-seeking dialogues, the user, i.e. a construction project manager or supplier, often asks questions about specific aspects of the tasks they want to perform. But most of the time it is difficult for software systems to unambiguously understand their overall intentions. The existence of information tunnels (Tannenbaum 2002) aggravates this phenomenon. This study includes a detailed case study of the material management process in the construction industry. Based on this case study, the structure of a formal user model for information retrieval in construction management is proposed. This prototype user model will be incorporated into the system design for construction information management and retrieval. This information retrieval system is a user-centered product based on the development of a user-configurable visitor mechanism for managing and retrieving project information without worrying too much about the underlying data structure of the database system. An executable UML model combined with an OODB is used to reduce the ambiguity in the user's intentions and to achieve user satisfaction.
The planning of projects in building engineering is a complex process characterized by a dynamic composition and many modifications during the definition and execution of processes. For computer-aided and network-based cooperation, a formal description of the planning process is necessary. In the research project “Relational Process Modelling in Cooperative Building Planning”, a process model is described by three parts: an organizational structure with participants, a building structure with states, and a process structure with activities. This research project is part of the priority program 1103 “Network-Based Cooperative Planning Processes in Structural Engineering” promoted by the German Research Foundation (DFG). Planning processes in civil engineering can be described by workflow graphs. The process structure describes the logical planning process and can be formally defined by a bipartite graph. This structure consists of activities, transitions, and relationships between activities and transitions. In order to minimize errors at the execution time of a planning process, a consistent and structurally correct process model must be guaranteed. This contribution considers the concept and the algorithms for checking the consistency and correctness of the process structure.
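One elementary structural-correctness condition of such a bipartite process graph can be sketched as follows (a minimal illustration with assumed names and rules, not the project's actual algorithms): the activity and transition sets must be disjoint, and every relationship must connect an activity with a transition.

```python
def check_process_structure(activities, transitions, edges):
    """Check two basic structural conditions of a bipartite
    process graph:
      1. no node is both an activity and a transition;
      2. every relationship (edge) connects an activity with a
         transition, never activity-activity or transition-transition.
    """
    if activities & transitions:
        return False  # node sets must be disjoint
    nodes = activities | transitions
    for u, v in edges:
        if u not in nodes or v not in nodes:
            return False  # relationship references an unknown node
        # Both endpoints on the same side violates bipartiteness.
        if (u in activities) == (v in activities):
            return False
    return True
```

A full consistency check would add further rules (e.g. connectedness and reachability of an end state), but the bipartiteness test above is the structural core.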
Let the information of a civil engineering application be decomposed into objects of a given set of classes. Then the set of objects forms the data base of the application. The objects contain attributes and methods. Properties of the objects are stored in the attributes. Algorithms which the objects perform are implemented in the methods of the objects. If objects are modified by a user, the consistency of data in the base is destroyed. The data base must be modified in an update to restore its consistency. The sequence of the update operations is not arbitrary, but is governed by dependence between the objects. The situation can be described mathematically with graph theory. The available algorithms for the determination of the update sequence are not suitable when the data base is large. A new update algorithm for large data bases has been developed and is presented in this paper.
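The dependence-governed update sequence described above can be illustrated with a standard topological sort (Kahn's algorithm); this is a minimal sketch of the general idea, not the paper's specialized algorithm for large data bases:

```python
from collections import deque

def update_order(dependencies):
    """Return an order in which objects can be updated so that every
    object is updated only after all objects it depends on.

    dependencies: dict mapping each object to the set of objects it
    depends on. Raises ValueError on cyclic dependence, in which case
    no valid update sequence exists.
    """
    # Number of unresolved dependencies per object.
    indegree = {obj: len(deps) for obj, deps in dependencies.items()}
    # Reverse edges: which objects depend on a given object.
    dependents = {obj: [] for obj in dependencies}
    for obj, deps in dependencies.items():
        for d in deps:
            dependents[d].append(obj)

    ready = deque(obj for obj, n in indegree.items() if n == 0)
    order = []
    while ready:
        obj = ready.popleft()
        order.append(obj)
        for dep in dependents[obj]:
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)

    if len(order) != len(dependencies):
        raise ValueError("cyclic dependence: no valid update sequence")
    return order
```

This runs in time linear in the number of objects and dependence edges; the challenge addressed by the paper is performing such an update efficiently when the data base is large.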
In this paper we present a computer aided method supporting co-operation between different project partners, such as architects and engineers, on the basis of strictly three-dimensional models. The center of our software architecture is a product model described by the Industry Foundation Classes (IFC) of the International Alliance for Interoperability (IAI). From this, a geometrical model is extracted and automatically transferred to a computational model serving as a basis for various simulation tasks. In this paper the focus is on the advantages of a fully three-dimensional structural analysis performed with the p-version of the finite element method. Other simulation methods are discussed in a separate contribution in this volume (Treeck 2004). The validity of this approach is shown in a complex example.
We consider the standardization problem (SP), which can be formulated as follows. A demand bi is known for each type i in {1, 2, ..., n} of items. Production of yi items of the ith type brings a profit fi(yi), where fi is a nondecreasing concave function for each i in {1, 2, ..., n}. It is necessary to satisfy the demand and to maximize the total profit, provided that there exist "standardization possibilities": some types of items can be replaced by certain other types. We introduce the generalized standardization problem (GSP), in which the demand is given as a set of admissible demand vectors. We show that GSP and SP are special cases of the resource allocation problem over a network polymatroid. Based on this observation, we propose a polynomial-time solution algorithm for GSP and SP.
One of the most important current tasks and challenges of construction informatics is the realization of a continuous, cross-disciplinary data flow in the planning process of a construction project. With regard to the international competitiveness of the German construction industry, it is essential to exploit the existing efficiency potentials in building planning, which can be realized through a qualitative improvement of the planning as well as a reduction of the processing time of all planners involved. According to the current state of the art, the information objects are standardized so that they can be used throughout the process; they are made available in a generally valid format to the specialized programs of the planners. This work pursues the approach of achieving integration by standardizing the communication between the information objects and their application programs, which makes it possible to dispense with the standardization of the objects to be transferred. The aim of this work is the definition of implementation rules that must be fulfilled by all objects to be exchanged as well as by the applications that want to receive such objects. The processing of the objects is to take place in the familiar applications in an unchanged manner.
In this work, a new method for the integration of information in digital planning documents is developed. The basic idea of the integration approach rests on the active involvement of the users during the transfer of information and during the updating of planning documents that are inconsistent with other planning documents. This basic idea, combined with the possibilities of new communication technologies, was decisive for the specification of new methods for the transfer of information and for the monitoring of changes. These new methods are developed and presented in this work. The aim is the definition of implementation rules that all objects to be exchanged must fulfill. The realization of the integration tasks by the user is based on the possibilities of the traditional integration of analog documents.
The problem F|n=2|F is to minimize a given objective function F(C1,m, C2,m) of the completion times Ci,m of two jobs i ∈ J={1, 2} processed on m machines M={1, 2, …, m}. Both jobs have the same technological route through the m machines. The processing time ti,k of job i ∈ J on machine k ∈ M is known. Operation preemptions are not allowed. Let R2m be the space of non-negative 2m-dimensional real vectors t=(t1,1,…, t1,m, t2,1,…, t2,m) with Chebyshev’s distance d(t, t*). To solve problem F|n=2|F, we can use the geometric algorithm, which includes the following steps: 1) construct the digraph (V, A) for problem F|n=2|F and find the so-called border vertices in (V, A); 2) construct the set of trajectories corresponding to the shortest paths Rt in digraph (V, A) from the origin vertex to each of the border vertices; 3) find an optimal path in the set Rt that represents a schedule with minimal value of the objective function F. Let path tu ∈ Rt be optimal for the problem F|n=2|F with operation processing times defined by vector t. If for any small positive real number ε > 0 there exists a vector t* ∈ R2m such that d(t, t*) = ε and path tu is not optimal for the problem F|n=2|F with operation processing times defined by vector t*, then the optimality of path tu is not stable. The main result of the paper is the proof of necessary and sufficient conditions for the optimality stability of path tu. If the objective function F is continuous and non-decreasing (e.g., makespan, total completion time, maximal lateness or total tardiness), then testing whether the optimality of a path tu ∈ Rt is stable takes O(m log m) time.
Hydrodynamic and morphodynamic processes in inland waters and in the near-coastal zone produce highly complex phenomena. Suitable numerical modelling tools are required to assess the evolution of coastal zones and river beds as well as human interventions in the form of protective structures. A holistic model approach for the approximation of coupled sea-state, current and morphodynamic processes on the basis of stabilized finite elements is presented. The majority of the model equations of hydro- and morphodynamics are transport equations. In accordance with the transport character of these equations, a stabilized finite element method on triangles is presented. The presented approximation corresponds to a streamline-upwind Petrov-Galerkin method for vector-valued multidimensional problems, in which the error of a standard Galerkin method is minimized by means of an upwinding coefficient. The choice of the upwinding coefficient is transferable to other problem classes and is based solely on the character of the underlying equations. The model has been applied to sea-state and current investigations in the Jade-Weser estuary on the German North Sea coast.
The cost of keeping large-area urban computer aided architectural design (CAAD) models up to date justifies wider use and access. This paper reviews the potential for collaborative groupwork creation and maintenance of such models and suggests an approach to data entry, data management and the generation of models at appropriate levels of detail from a Geographic Information System (GIS). Staff at the University of the West of England (UWE) modelled a large area of Bristol to demonstrate millennium landmark proposals. It became swiftly apparent that continued amendment of the model to keep it an accurate reflection of changes on the ground was a major data management problem. Piecing in new CAAD models received from architectural practices, to visualise them in context as part of the planning negotiation process, has often taken staff several days of work for each instance. The model is so complex and proprietary that Bristol City operates a specialist visualisation bureau service. UWE later modelled the environs of the Tower of London to support bids for funding and to provide the context for judging the visual impact of iterative design development. Further research continued to develop more effective approaches. Data conversion and amalgamation from all the diverse sources was the major impediment to effective group working to create the models. It became apparent that a GIS would assist in retrieving all the appropriate data describing the part of the model under creation. It was possible to predict that the management of many historic part models, stepping back through time and allowing different expert interpretations to co-exist, would in itself be a major task requiring a spatial database/GIS. UWE started afresh from the original source data to explore the collaborative use of GIS and Virtual Reality Modelling Language (VRML) to integrate models and interventions from various sources and to generate an overall navigable interactive whole.
Current exploration of the combination of event-driven behaviours and Structured Query Language seeks to define how to appropriately modify objects in the VRML model on demand. This is beginning to realise the potential of this process for: asynchronous group modelling along the lines of a collaborative virtual design studio; historic building maintenance management; visitor management; interpretation of historic sites to visitors; and public planning information.
Spatial data acquisition, integration, and modeling for real-time project life-cycle applications
(2004)
Current methods for site modeling employ expensive laser range scanners that produce dense point clouds requiring hours or days of post-processing to arrive at a finished model. While these methods produce very detailed models of the scanned scene, useful for obtaining as-built drawings of existing structures, the associated computational time burden precludes them from being used onsite for real-time decision-making. Moreover, in many project life-cycle applications, detailed models of objects are not needed. Results of earlier research conducted by the authors demonstrated novel, highly economical methods that reduce data acquisition time and the need for computationally intensive processing. These methods enable complete local area modeling on the order of a minute, with sufficient accuracy for applications such as advanced equipment control, simple as-built site modeling, and real-time safety monitoring for construction equipment. This paper describes a research project that is investigating novel ways of acquiring, integrating, modeling, and analyzing project site spatial data that do not rely on dense, expensive laser scanning technology and that enable scalability and robustness for real-time field deployment. Algorithms and methods for modeling objects of simple geometric shape (geometric primitives) from a limited number of range points have been developed; these methods provide a foundation for the further development required to address more complex site situations, especially where dynamic site information (motion of personnel and equipment) is involved. Field experiments are being conducted to establish performance parameters and to validate the proposed methods and models. Initial experimental work has demonstrated the feasibility of this approach.
The frame of this paper is the development of methods and procedures for the description of the motion of an arbitrarily shaped foundation. Since the infinite half-space cannot be properly described by a model of finite dimensions without violating the radiation condition, the basic problems are the infinite dimensions of the half-space as well as its non-homogeneous nature. Consequently, an approach has been investigated to solve this problem indirectly by developing a Green's function in which the non-homogeneity and the infiniteness of the half-space have been included. Once the Green's function is known, the next step is the evaluation of the contact stresses acting between the foundation and the surface of the half-space through an integral equation, which is solved in the area of the foundation using the Green's function as the kernel. The derivation of the three-dimensional Green's function for the homogeneous half-space (Kobayashi and Sasaki 1991) has been carried out using the potential method. The partial differential equations occurring in the problem have been reduced to ordinary ones through the Hankel integral transform. The general idea for obtaining the three-dimensional Green's function for the layered half-space is similar, but in that case some additional phenomena may occur. One of them is the possible appearance of Stoneley surface waves propagating along the contact surfaces of the layers; their contribution to the final result is in most cases important enough that they should not be neglected. The main advantage of the results presented, in comparison with others obtained with numerical methods, is their accuracy, especially in the case of thin layers, because all essential steps of the Green's function evaluation except for the contour integration along the branch cut have been carried out analytically.
On the other hand, the disadvantage of this method is that the mathematical effort for obtaining the Green's function increases drastically with the number of layers. Future work will therefore be directed at simplifying the process described above.
SLang - the Structural Language : Solving Nonlinear and Stochastic Problems in Structural Mechanics
(1997)
Recent developments in structural mechanics indicate an increasing need for numerical methods that deal with stochasticity. This process started with the modeling of loading uncertainties. More recently, system uncertainties such as physical or geometrical imperfections have also been modeled in probabilistic terms. Clearly, this task requires a close connection of structural modeling with probabilistic modeling. Nonlinear effects are essential for a realistic description of structural behavior. Since modern structural analysis relies quite heavily on the Finite Element Method, it is reasonable to base stochastic structural analysis on this method. Commercially available software packages cover deterministic structural analysis in a very wide range; however, their applicability to stochastic problems is rather limited. On the other hand, there are a number of highly specialized programs for probabilistic or reliability problems which can be used only in connection with rather simplistic structural models. In principle, both kinds of software could be combined in order to achieve the goal; the major difficulty which then arises in practical computation is to define the most suitable way of transferring data between the programs. In order to circumvent these problems, the software package SLang (Structural Language) has been developed. SLang is a command interpreter which acts on a set of relatively complex commands. Each command takes input from and gives output to simple data structures (data objects), such as vectors and matrices. All commands communicate via these data objects, which are stored in memory or on disk. The paper shows applications to structural engineering problems, in particular failure analysis of frames and shell structures with random loads and random imperfections. Both geometrical and physical nonlinearities are taken into account.
Today's procedures for the awarding of public construction performance contracts are mainly paper-based. Although the use of electronic means is permitted by the VOB, the regulations are not yet sufficient; in particular, software agents within the AEC bidding process were not considered at all. The acceptance of an agent-based virtual marketplace for AEC bidding depends on a reliable and trustworthy public key infrastructure according to the (German) digital signature act. Only if confidentiality, integrity, non-repudiation, and authentication are provided reliably will users assign sensitive business processes like public tendering procedures to software agents. The development of a secure agent-based virtual marketplace for AEC bidding according to legal regulations is an entirely new approach from a technical as well as from a legal point of view. The objective of this research project is the development of intelligent software agents which are able to legally call for bids, to calculate proposals, and to award the contract to the successful bidder.
A simultaneous solution method for fluid-structure interactions in the field of civil engineering is presented. The structural dynamics are modelled with the geometrically nonlinear theory of elasticity in a total Lagrangian formulation. The flow is described by the incompressible Navier-Stokes equations. Where turbulence effects are significant, the Reynolds equations in combination with the k-omega turbulence model of Wilcox are employed. The level-set method is used to describe complex free surfaces. The uniform discretization of fluid and structure with the space-time finite element method leads to a consistent computational model for the coupled system. Since the isoparametric space-time elements can change their geometry in the time direction, the method allows a natural description of the flow domain that changes in time as a result of the structural motion. The weighted integral formulation of the coupling conditions, with global degrees of freedom for the interface stresses, ensures a conservative coupling of fluid and structure. Selected application examples demonstrate the capability of the developed methodology and confirm the good convergence properties of the simultaneous solution method.
This contribution describes the problems involved in predicting traffic-induced pollutant immissions. The focus is on the development and construction of a simulation environment for the evaluation of environmentally oriented traffic management strategies. The simulation environment is developed across the three fields of traffic, emission and immission, and is first applied to the evaluation of traffic measures for the Friedberger Landstraße in Frankfurt am Main.
Site superintendents performing project management tasks on construction sites need to access project documents and need to collect information that they observe while inspecting the site. Often, information that is observed on a construction site needs to be integrated into electronic documents or project control systems. In the future, we expect integrated product and process models to be the medium for storing and handling construction project management information. Even though mobile computing devices today are already capable of storing and handling such integrated product and process data models, the user interaction with such large and complex models is difficult and not adequately addressed in the existing research. In this paper, we introduce a system that supports project management tasks on construction sites effectively and efficiently by making integrated product and process models accessible. In order to effectively and efficiently enter or access information, site superintendents need visual representations of the project data that are flexible with respect to the level of detail, the decomposition structure, and the type of visual representation. Based on this understanding of the information and data collection needs, we developed the navigational model framework and the application Site Data Collection System (SiDaCoS), which implements that framework. The navigational model framework allows site superintendents to create customized representations of information contained in a product and process model that correspond to their data access and data collection needs on site.
The safety of structures depends on the reliable modelling of all structural parameters. Usually these parameters are described as deterministic or stochastic quantities. Stochastic quantities are random variables that capture uncertain information about structural parameters by means of density functions. Not all uncertain structural parameters can be represented as random variables; they can, however, be modelled as fuzzy quantities. Fuzzy quantities describe uncertain structural parameters as fuzzy sets with a membership function. Fuzzy modelling in civil engineering comprises fuzzification, fuzzy analysis, defuzzification and safety assessment. It makes it possible to analyse structures with non-stochastic uncertain input information, which occurs both for existing and for new structures. The uncertain results of the fuzzy modelling allow the system behaviour to be assessed more accurately; they are the starting point for a new safety assessment based on possibility theory. In the fuzzy analysis, alpha-level discretization can be employed to advantage. In the absence of monotonicity of the deterministic computations, and taking nonlinearity into account, the fuzzy analysis is carried out with optimization algorithms. Two examples are discussed: the solution of a transcendental eigenvalue problem and of a linear system of equations. The system responses of the fuzzy analysis form the basis of the safety assessment. For selected physical quantities, failure functions are defined that evaluate the possibility of failure. Using min-max operations of fuzzy set theory, the possibility of failure and the possibility of survival are obtained from the failure function and the fuzzy response.
The determined possibility of failure represents the subjective assessment of the possibility that the event "failure" occurs. Examples show the differences between the safety assessment by means of the fuzzy model and by means of the deterministic model.
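A minimal sketch of the alpha-level discretization mentioned above, assuming a triangular fuzzy input; the brute-force search on each alpha-cut stands in for the optimization algorithms of the paper (and, unlike pure interval endpoints, also covers non-monotone mappings). All names are hypothetical.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_map(f, tri, levels=5, samples=101):
    """Propagate a triangular fuzzy input through f by alpha-discretization.
    On each alpha-cut the response interval is found by brute-force sampling."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(tri, alpha)
        ys = [f(lo + j * (hi - lo) / (samples - 1)) for j in range(samples)]
        cuts.append((alpha, min(ys), max(ys)))
    return cuts
```

For example, mapping the triangular number (-1, 0, 2) through x² yields the interval [0, 4] at alpha = 0 and collapses to the crisp value 0 at alpha = 1.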
Accounting for stochastic system and load parameters in the analysis of structural behaviour permitted by the Eurocode, with global nonlinear system behaviour taken into account, is necessary, since this calls for a different safety concept. This becomes particularly clear when the plastic limit load factor (PLLF), which allows the system capacities to be exploited up to collapse, is used to assess the limit state. For the model of a plane reinforced concrete structure, rigid-ideal-plastic material behaviour is assumed. The PLLF for a given load pattern can be determined, starting from an extremum principle, by solving an optimization problem. This direct determination of collapse, however, causes difficulties in the stochastic analysis, since the associated limit state equations (LSE) are not well-behaved. The stochastic method of multi-modal importance sampling (MMIS) is proposed, which determines the probability of failure while taking the properties of this mechanical model into account, i.e. the procedure respects the only piecewise continuous LSE of the specific problem; it presupposes the associated limit state function. The essential design points are sought by applying the beta method, and the probability of failure is then determined with an importance sampling algorithm with a multimodal sampling density. The procedure finds and accounts for the essential failure regions of the problem with acceptable effort. Improvements could be achieved both in the search and iteration algorithms involved and in the choice of the individual sampling densities, which is the subject of further investigations.
In the first part of the talk, a heuristic for approximating the Pareto set of multi-criteria generalized job-shop scheduling problems is presented. The algorithm is based on a genetic local search heuristic. Averaging the start times of the operations is used as the recombination operator, and a threshold accepting algorithm is implemented as the local search. The algorithm was tested on a large set of benchmark instances with the objective functions makespan, tardiness, lateness and total completion time; the results for the various objective functions are presented. The approximation of the Pareto set produced by the algorithm can contain a large number of solutions, from which the decision maker must select one. Since the amount of data becomes very large even for moderate problem dimensions, this poses a problem for the decision maker; the large set of solutions must therefore be reduced to a manageable number while preserving the diversity of the remaining solutions. This reduction is referred to as short listing and is presented in the second part of the talk. In the first step of the short listing, the solutions are clustered by means of distance measures in the solution space; the distance measures used are defined on the permutations of the operations on the resources. Five distance measures and two clustering methods, a hierarchical and a non-hierarchical one, were examined. In the second step, one solution is selected from each cluster and presented to the decision maker. Two selection methods were examined: in the first case the best solution with respect to a ranking function, in the second case the median solution with respect to the distance measure was chosen. The distance measures, clustering algorithms and selection methods were compared on a large set of benchmark instances. The results are presented in the talk.
This work was partially supported by DAAD, the Fundamental Researches Foundation of Belarus and the International Soros Science Education Program. We consider a vector discrete optimization problem on a system of non-empty subsets (trajectories) of a finite set. The vector criterion of the problem consists of partial criteria of the types MINSUM, MINMAX and MINMIN. The stability of efficient (Pareto optimal, Slater optimal and Smale optimal) trajectories to perturbations of the vector criterion parameters has been investigated. Sufficient and necessary conditions for the local stability of efficient trajectories have been obtained. Lower bounds on the stability radii of efficient trajectories, and exact formulas in several cases, have been found for the case when the l∞-norm is defined in the space of vector criterion parameters.
O-D matrices are the most important source of information on the demand for transport in a city. They serve the proper design of the road network and of public transport lines. On this basis, traffic volumes are computed, which are used for dimensioning the traffic infrastructure. In transport planning, the four-step method is applied to compute O-D matrices. This method uses data from household surveys and from cordon surveys, and it requires a large amount of work and cost. Here, an alternative method is presented: the O-D matrix is estimated from traffic volumes measured at road cross-sections. Since measured traffic volumes are needed for other purposes as well (e.g. for signal control or for observing the motorization trend), this method is cheaper than the traditional procedure. Within this method there are two possibilities. First, the O-D matrix can be estimated directly from the traffic volume measurements. Second, the trip generations can be computed first and then the O-D matrix. The second procedure has a substantial advantage: on the basis of the trip generations it is possible to compute all elements of the O-D matrix, whereas estimating the O-D matrix directly from the measurements (the first procedure) yields only those elements that appear in the observed road cross-sections.
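The first possibility, estimating O-D flows directly from measured link volumes, can be sketched as a small least-squares problem. The single-path route assignment and the projected gradient solver below are illustrative assumptions, not the method of the paper.

```python
def estimate_od(routes, counts, n_pairs, iters=5000, lr=0.01):
    """Least-squares O-D estimation from link counts.
    routes[l] lists the O-D pairs whose (single, fixed) path uses link l;
    counts[l] is the measured volume on link l. Solves
    min ||A x - v||^2 subject to x >= 0 by projected gradient descent."""
    x = [0.0] * n_pairs
    for _ in range(iters):
        grad = [0.0] * n_pairs
        for l, pairs in enumerate(routes):
            r = sum(x[p] for p in pairs) - counts[l]   # residual on link l
            for p in pairs:
                grad[p] += 2.0 * r
        # Gradient step, then projection onto the non-negative orthant.
        x = [max(0.0, xi - lr * gi) for xi, gi in zip(x, grad)]
    return x
```

With two O-D pairs, where link 0 carries only pair 0 (count 5) and link 1 carries both pairs (count 8), the estimate converges to flows of 5 and 3.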
This paper concerns schedule synchronization problems in public transit networks. It consists of three main parts. In the first, the subject area is introduced, the terms are defined, and a framework for optimal synchronization in the form of a problem representation and formulation is proposed. The second part is devoted to the transfer synchronization problem, where passengers change transit lines at transfer points; an integrated Tabu Search and Genetic solution method is developed for this specific problem. The third part deals with the headway harmonization problem, i.e. the synchronization of the schedules of different transit lines on common segments of routes. For the solution of this problem a new bilevel optimization method is proposed, with zone harmonization at the bottom level and co-ordination of zones, by time buffers assigned to timing points, at the upper level. Finally, the synchronization problems are numerically illustrated by real-life examples of public transport lines in Cracow.
Although there are some good reasons to design engineering software as a stand-alone application for a single computer, there are also numerous possibilities for creating distributed engineering applications, in particular using the Internet. This paper presents some typical scenarios how engineering applications can benefit from including network capabilities. Also, some examples of Internet-based engineering applications are discussed to show how the concepts presented can be implemented.
In construction engineering, a schedule’s input data, which is usually not exactly known in the planning phase, is considered deterministic when generating the schedule. As a result, construction schedules become unreliable and deadlines are often not met. While the optimization of construction schedules with respect to costs and makespan has been a matter of research in the past decades, the optimization of the robustness of construction schedules has received little attention. In this paper, the effects of uncertainties inherent to the input data of construction schedules are discussed. Possibilities are investigated to improve the reliability of construction schedules by considering alternative processes for certain tasks and by identifying the combination of processes generating the most robust schedule with respect to the makespan of a construction project.
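The idea of comparing alternative process combinations by schedule robustness can be sketched with a Monte Carlo estimate. The serial-project simplification (makespan as the sum of task durations) and the triangular duration model are assumptions for illustration, not the paper's approach; all names are hypothetical.

```python
import itertools
import random

def simulate_makespan(durations, rng):
    """Serial project (simplification): the makespan is the sum of sampled
    task durations; each duration is triangular (optimistic, mode, pessimistic)."""
    return sum(rng.triangular(lo, hi, mode) for lo, mode, hi in durations)

def most_robust_plan(alternatives, deadline, runs=2000, seed=1):
    """Pick one process variant per task so that the estimated probability
    of finishing before `deadline` (the robustness measure here) is maximal."""
    rng = random.Random(seed)
    best = None
    for combo in itertools.product(*alternatives):
        hits = sum(simulate_makespan(combo, rng) <= deadline
                   for _ in range(runs))
        p = hits / runs
        if best is None or p > best[1]:
            best = (combo, p)
    return best
```

A variant with a slightly higher mode but a much smaller spread can thus beat a nominally faster but riskier variant, which is exactly the trade-off between makespan and robustness discussed above.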
This paper presents an application of dynamic decision making under uncertainty in planning and estimating underground construction. The application of the proposed methodology is illustrated by its application to an actual tunneling project—The Hanging Lake Tunnel Project in Colorado, USA. To encompass the typical risks in underground construction, tunneling decisions are structured as a risk-sensitive Markov decision process that reflects the decision process faced by a contractor in each tunneling round. This decision process consists of five basic components: (1) decision stages (locations), (2) system states (ground classes and tunneling methods), (3) alternatives (tunneling methods), (4) ground class transition probabilities, and (5) tunneling cost structure. The paper also presents concepts related to risk preference that are necessary to model the contractor’s risk attitude, including the lottery concept, utility theory, and the delta property. The optimality equation is formulated, the model components are defined, and the model is solved by stochastic dynamic programming. The main results are the optimal construction plans and risk-adjusted project costs, both of which reflect the dynamics of subsurface construction, the uncertainty about geologic variability as a function of available information, and the contractor’s risk preference.
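The stochastic dynamic programming step can be sketched as backward induction over the five components listed above (stages, ground-class states, method alternatives, transition probabilities, costs). This risk-neutral sketch omits the paper's utility-based risk preference, and all names and numbers are hypothetical.

```python
def tunnel_plan(stages, states, methods, P, cost):
    """Backward induction over tunneling rounds (risk-neutral simplification).
    P[m][s][s2]: probability that ground class s2 follows s under method m.
    cost[m][s]:  cost of excavating one round through class s with method m.
    Returns the minimal expected cost-to-go per initial state and the policy."""
    V = {s: 0.0 for s in states}           # terminal value beyond the last round
    policy = []
    for _ in range(stages):                # walk backwards from the last round
        Vn, act = {}, {}
        for s in states:
            best = None
            for m in methods:
                q = cost[m][s] + sum(P[m][s][s2] * V[s2] for s2 in states)
                if best is None or q < best[0]:
                    best = (q, m)
            Vn[s], act[s] = best
        V = Vn
        policy.append(act)
    policy.reverse()                       # policy[stage][state] -> method
    return V, policy
```

With a cheap method that is expensive in bad ground and a flat-cost conservative method, the optimal policy switches methods with the ground class, mirroring the round-by-round decision faced by the contractor.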
Most of the existing seismic design codes are based on response spectrum theory. The influence of inelastic deformations can be evaluated by considering an inelastic type of resisting force, and the inelastic spectrum is then considerably different from the elastic one. The influence of stiffness degradation and strength deterioration can also be accounted for by including more precise material models. Some recent papers discuss the corresponding changes in response spectra due to the P-Δ effect. The experience accumulated from recent earthquakes indicates that structural pounding may considerably influence the response of structures and should be taken into account in design procedures. The most convenient way to do this is to predict the influence of pounding on the response spectra for accelerations, velocities and displacements. Generally speaking, contact problems such as pounding are characterized by a large degree of nonlinearity and slow convergence of the computational procedures. Obtaining spectra in which the contact problem is accounted for therefore seems very attractive from an engineering point of view, because they could easily be implemented in design procedures. However, it is worth noting that there is no rigorous mathematical proof that the original system can be decomposed into single equations related to single-degree-of-freedom systems. It is the purpose of this paper to study the influence of pounding on the response spectra and to evaluate the amplification due to the impact. For this purpose, two adjacent SDOF systems are considered that are able to interact during the vibration process. The problem is solved as a function of the elastic stiffness ratio, which proves to be very important for such an assemblage. The contact between the masses is numerically simulated using opening gap elements as links.
Comparisons between calculated response spectra and linear response spectra are made in order to derive analytical relationships to simply obtain the contribution of pounding. The results are graphically illustrated in response spectra format and the influence of the stiffness ratio is clarified.
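The two-SDOF gap-element setup described above can be illustrated with a minimal time-stepping simulation. All masses, stiffnesses, the contact stiffness and the initial velocity pulse are illustrative assumptions, and a real spectrum computation would sweep over recorded ground motions and structural periods rather than a single free-vibration case:

```python
def pounding_response(k_ratio=2.0, gap=0.01, dt=1e-3, t_end=5.0):
    """Two adjacent SDOF oscillators that may pound across a gap.
    A one-sided 'gap element' of stiffness k_c acts only while the
    relative closing displacement exceeds the gap. Returns the peak
    absolute displacement of either mass. Illustrative values only."""
    m1 = m2 = 1.0
    k1 = 1000.0
    k2 = k_ratio * k1                 # elastic stiffness ratio of the pair
    k_c = 10.0 * max(k1, k2)          # contact (gap element) stiffness
    u1 = u2 = 0.0
    v1, v2 = 0.6, 0.0                 # velocity pulse applied to system 1
    peak = 0.0
    for _ in range(int(t_end / dt)):
        closing = u1 - u2 - gap       # > 0 means the gap has closed
        f_c = k_c * closing if closing > 0.0 else 0.0
        a1 = (-k1 * u1 - f_c) / m1
        a2 = (-k2 * u2 + f_c) / m2
        v1 += a1 * dt; v2 += a2 * dt  # symplectic (semi-implicit) Euler
        u1 += v1 * dt; u2 += v2 * dt
        peak = max(peak, abs(u1), abs(u2))
    return peak
```

Running such a model over a grid of periods and stiffness ratios, once with a very large gap (no contact) and once with a realistic gap, gives exactly the two spectra whose comparison the abstract describes.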
The management of resources is an essential task in each construction company. Today, ERP systems and e-Business systems are available to assist construction companies to efficiently organise the allocation of their personnel and equipment within the company, but they cannot provide the company with the idle resources for every single task that has to be performed during a construction project. Therefore, companies should have an alternative solution to better exploit expensive resources and compensate their fixed costs, but also have them available at the right time for their own business activities. This paper outlines the approach taken by the EU funded project “e-Sharing” (IST-2001-33325) to support resource management between construction companies. It will describe requirements for the management of construction resources, its core features, and the integration approach. Therefore, we will outline the approach of an integrated resource type model supporting the management and classification of construction equipment, construction tasks and qualification profiles. The development is based on a cross-domain analysis and evaluation of existing models. ...
Recently, many active control systems for building structures have been studied on the basis of modern control theory and installed in real buildings. The authors have previously proposed intelligent fuzzy optimal active control (IFOAC) systems. IFOAC systems imitate intelligent activities of the human brain such as prediction, adaptation and decision-making, and can take both objective and subjective judgements about the active control into account. However, IFOAC systems are considered suitable mainly for far-field earthquakes; their control effect becomes small for near-field earthquakes, which include a few velocity pulses with large amplitudes. To improve the control effect for near-source earthquakes, the authors have also proposed hybrid control (HC) systems, in which IFOAC systems are combined with a fuzzy control system. In HC systems, the fuzzy control system is introduced as a reflective fuzzy active control (RFAC) system and imitates the spinal reflexes of humans. Active control forces are applied to the building in accordance with switching rules on the control forces. In this paper, the fuzzy control rules of the RFAC system and the switching rules of the active control forces in the HC system are optimized by parameter-free genetic algorithms (PfGAs), using different earthquake inputs. The results of digital simulations show that the HC system can effectively reduce maximum response displacements under restrictions on the actuator strokes in the case of a near-source earthquake; the effectiveness of the proposed HC system is discussed and clarified.
Research on Establishment of a Standard of Traffic Impact Assessment with Integrated Database System
(2004)
Planning support systems, such as geographical information systems (GIS) and traffic flow simulation models, are widely used in recent urban planning research. In this paper we propose a method to apply traffic impact assessment (TIA) to large-scale commercial developments. In TIA research we often encounter the problem that the amount of data necessary for detailed investigation and analysis increases as commercial developments become larger and more complex. As a result, TIA presents two problems. The first is the difficulty of data acquisition. The second is the reliability of the data. As a solution, we developed an integrated database system.
A multicriteria statement of the above-mentioned problem is presented. It differs from the classical statement of the spanning tree problem: the quality of a solution is estimated by a vector objective function that contains weight criteria as well as topological criteria (degree and diameter of the tree). Many real processes are not fully determined, which is why the investigation of stability is very important; moreover, many errors arise from the computations themselves. The stability analysis of vector combinatorial problems allows one to discover how much the initial data may change before the optimal solution changes. Furthermore, the investigation of stability allows one to construct a class of problems from a single problem by means of parameter variations. Analysing the problems that belong to this class allows an exact and adequate description of the model to be obtained.
Technological processes, schedules, parallel algorithms, etc., which have technological limitations and demand increased efficiency of execution, can be described by digraphs on which the appropriate optimization problem (construction of an optimal schedule of the vertices of the digraph) can be solved. The problems investigated in this work have the following general statements. Problem 1: given a graph G and a value h, construct a parallel schedule of the vertices of the digraph of minimum length; we denote this problem S(G, h, l). Problem 2: given a graph G and a value l, construct a parallel schedule of the vertices of the digraph of minimum width; we denote this problem S(G, l, h). Problem 3: given a graph G, a value h and execution times of the operations di, i=1, …, n, construct a parallel schedule of the vertices of the digraph of minimum length; we denote this problem S(G, h, di, l). When h is arbitrary, problems 1-3 have exponential complexity. In this work a method for solving the problem S(T, h, di, l) is proposed, based on choosing the vertices of greatest weight. An approach to solving the problem S(G, 3, l) is proposed, where G is a graph satisfying the property S[i] =S [i], i=1, …, l. To obtain an estimate of the width of a schedule from an available estimate of its length, we propose an iterative algorithm of polynomial complexity, at each step of which the current value of the schedule width is set and used to refine the schedule length.
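For orientation, a width-limited parallel schedule of digraph vertices (an S(G, h, l)-type construction) can be sketched with a simple greedy list scheduler. This is only an illustration of the notions of schedule length and width, not the weight-based method proposed in the work:

```python
def parallel_schedule(succ, h):
    """Greedy list scheduling of digraph vertices: at each step at most h
    ready vertices (all predecessors already scheduled) execute in parallel.
    `succ` maps each vertex to its successors. Returns the schedule as a
    list of steps; its length is the schedule length (makespan)."""
    nodes = set(succ) | {v for vs in succ.values() for v in vs}
    indeg = {v: 0 for v in nodes}
    for u in succ:
        for v in succ[u]:
            indeg[v] += 1
    ready = sorted(v for v in nodes if indeg[v] == 0)
    schedule = []
    while ready:
        step = ready[:h]              # width limited to h
        ready = ready[h:]
        schedule.append(step)
        for u in step:                # release successors of finished vertices
            for v in succ.get(u, []):
                indeg[v] -= 1
                if indeg[v] == 0:
                    ready.append(v)
        ready.sort()
    return schedule
```

Varying h trades schedule width against schedule length, which is precisely the interplay the iterative width/length refinement in the abstract exploits.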
This paper introduces the research and application of an integrated highway construction management information system. It explains the development and application of a highway survey applet that runs on Java-enabled mobile phones and transmits engineering data over the GPRS wireless network, and describes field data collection software for highway construction that runs on a Pocket PC. It recommends a technique for long-distance transmission of engineering data based on a client/server (C/S) structure using VPN (Virtual Private Network) technology. In particular, it elaborates on the platform of the integrated highway construction management information system, which adopts geographic information system (GIS), database and network techniques, and covers subsystems for bid management, contract management, engineering design drawings, engineering survey calculation, measurement and payment, processing of engineering test data, quantity assessment, project planning and progress, engineering document management, etc. It furthermore proposes a visual analysis and query system for highway construction projects based on Web-GIS, and explains the research and application of highway construction office automation based on a browser/server (B/S) structure: real-time workflow and information processing such as the management of administration, business, authorization procedures and information distribution. Finally, the author describes the prospects of applying C/S and B/S structures in industry software development for highway construction management.
In current AEC practice client requirements are typically recorded in a building program, which, depending on the building type, covers various aspects from the overall goals, activities and spatial needs to very detailed material and condition requirements. This documentation is used as the starting point of the design process, but as the design progresses, it is usually left aside and changes are made incrementally based on the previous design solution. These incremental small changes can lead to a solution that may no longer meet the original requirements. In addition, design is by nature an iterative process and the proposed solutions often also cause evolution in the client requirements. However, the requirements documentation is usually not updated accordingly. Finding the latest updates and evolution of the requirements from the documentation is very difficult, if not impossible. This process can lead to an end result, which is significantly different from the documented requirements. Some important requirements may not be satisfied, and even if the design process was based on agreed-upon changes in the scope and requirements, differences in the requirements documents and in the completed building can lead to well-justified doubts about the quality of the design and construction process...
The availability and confident command of modern, low-cost information and communication technologies now allows even small business units in the construction industry to organise themselves in cross-company networks and to integrate better into the construction value chain. In so-called >virtual organisations< (VO), these economically important companies can exploit their competitive advantages, such as greater flexibility, fast response to customer wishes and closeness to the market, and thus secure and expand their long-term existence in liberalised competition on an enlarged EU market. The obstacles that currently exist in the flow of information between construction site and office can be removed by using mobile, wirelessly networked devices, such as >smart phones<, together with a newly designed infrastructure for knowledge management and context-sensitive information presentation. This paper presents conceptual approaches to an integrated information management in support of VO that are currently being developed within the BMBF project >IuK-System Bau<.
Peculiarities of the renovation of industrial enterprises under conditions of economic self-sufficiency
(1997)
Problems of reorienting the building complex towards a sharply increased share of reconstruction work, capital repair and modernisation of industrial plants are considered in this work. A conception for the development and creation of a unified system for the operation and renovation of industrial plants is worked out. This system is based on computer technology and takes real economic relations into consideration.
The increasingly required cooperation of various participants from different disciplines and the use of highly specialised domain applications in heterogeneous system environments underline the importance and necessity of new concepts and possibilities for creating a computer-supported integration level. The goal of such an integration level is to improve cooperation and communication among the participants. Its foundation is the establishment of efficient and error-free data and information exchange between the various specialist planners and applications. The basis of the data integration level is a digital building model in the sense of a >virtual building<, which provides all relevant data and information about a planned or existing building. In realising a building-model-oriented data integration level and its model management, the definition of the building model, i.e. the specification of the relevant data to be exchanged, proves to be extremely complex. The relation-oriented approach presented here, i.e. the realisation of data and information exchange by means of defined relations between dynamically modifiable domain models, offers approaches to: * reducing and mastering the complexity of the building model (partial model formation) * realising an efficient data exchange (relation management). The relation-oriented approach thus represents an adequate way of modelling a digital building model as a data integration level for the life cycle of a building.
In refurbishment, planning engineers and architects have always faced the task of capturing the geometry and structure of existing buildings and deriving reconstruction plans and technologies from them. These surveying measures are very extensive and costly. The goal of the approach presented here was therefore to develop a low-cost method that guarantees a contactless measurement of rooms bounded by planes (polyhedral rooms) with an accuracy appropriate to the requirements. Essentially two problem areas are treated. The first comprises the proof that plane objects (walls) can be reconstructed to scale from monocular photographs with an accuracy sufficient for the intended applications. Instead of the approach usual in photogrammetry, camera calibration by means of exactly surveyed control points, an approach was pursued in which a parameter-dependent measuring figure, produced on the object by laser spot projectors, in combination with a priori known image content forms the basis for the true-to-scale reconstruction of the object. The second problem area treats the composition of plane-bounded rooms from individual planes (walls), which, as the result of the first step, are available projectively rectified, i.e. as orthogonal views, although subject to method-related errors. The goal here is to gain accuracy by using a priori knowledge about the room structure together with methods of mathematical optimisation.
For the design of formwork and shop drawings, which generally consist of long, narrow rectangles, a design support is presented in which the size of the rectangles is specified as usual, but the position of the rectangles is given by topological statements. In the program these statements form constraints, distinguished into contact and alignment constraints. They position the new rectangle relative to an already placed one. For example, the statement that a column stands above a footing and loads it centrically uniquely determines the position of the column, given the position of the footing and the dimensions of both rectangles. Formulating positions by means of constraints has the advantage that they remain valid when dimensions change. The input mode of relative positioning presented here is an extension of the ortho mode invariably found in construction CAD programs.
Ideally, multiple computational building evaluation routines (particularly simulation tools) should be coupled in real time to the representational design model to provide timely performance feedback to the system user. In this paper we demonstrate how this can be achieved effectively and conveniently via homology-based mapping. We consider two models as homologous if they entail isomorphic topological information. If the general design representation (i.e., a shared object model) is generated in a manner so as to include both the topological building information and pointers to the semantic information base, it can be used to directly derive the domain representations (>enriched< object models with detailed configurational information and filtered semantic data) needed for evaluation purposes. As a proof of concept, we demonstrate a computational design environment that dynamically links an object-oriented space-based design model with structurally homologous object models of various simulation routines.
A distributed geotechnical remote analysis of data system (Distributed G-RAD) can benefit both owners and contractors in providing better quality control and assurance on geotechnical projects. The Distributed G-RAD approach involves efficient data acquisition using PDAs with GPS capability, radio frequency identification (RFID) tags for labeling soil samples, laser scanning for measuring lift thickness and volumes of stockpiles and borrow pits. Spatial data storage is provided using a geographic information system (GIS). Portions of this system are already developed while other parts are still being considered. This paper also describes how RFID and laser scanning technologies can be used in the larger Distributed G-RAD system.
For the management or reorganisation of existing buildings, data concerning dimensions and construction are necessary. Often these data are given exclusively by paper-based drawings and no digital data such as a computer based product model or even a CAD-model are available. In order to perform mass calculation, damage mapping or a recalculation of the structure these drawings of the building under consideration have to be analysed manually by the engineer. This is a very time-consuming job. In order to close this gap between drawings of an existing building and a digital product model an approach is presented in this paper to digitise a drawing, to build up geometric and topologic models and to recognise construction parts of the building. Finally all recognised parts are transformed into a three-dimensional geometric model which provides all necessary geometric information for the product model. During this import process the semantics of a ground floor plan has to be converted into a 3D-model.
This paper is a report of Radio Frequency Identification (RFID) technology and its potential applications in the commercial construction industry. RFID technology offers wireless communication between RFID tags and readers with non line-of-sight readability. These fundamental properties eliminate manual data entry and introduce the potential for automated processes to increase project productivity, construction safety, and project cost efficiency. Construction contractors, owners, and material suppliers that believe technology can further develop methods and processes in construction should feel obligated to participate in RFID studies for the advancement of the construction industry as a whole.
After more than a hundred years of arguments for and against quaternions, of exciting odysseys with new insights as well as disillusions about their usefulness, the mathematical world has seen in the last 40 years a burst in the application of quaternions and their generalizations in almost all disciplines that deal with problems in more than two dimensions. Our aim is to sketch some ideas - necessarily in a very concise and far from exhaustive manner - which contributed to the picture of this recent development. With the help of some historical reminiscences we first try to draw attention to quaternions as a special case of Clifford algebras, which play the role of a unifying language in the Babylon of several different mathematical languages. Secondly, we refer to the use of quaternions as a tool for modelling problems and at the same time for simplifying the algebraic calculus in almost all applied sciences. Finally, we intend to show that quaternions in combination with classical and modern analytic methods are a powerful tool for solving concrete problems, thereby giving origin to the development of Quaternionic Analysis and, more generally, of Clifford Analysis.
The theory of random matrices, or random matrix theory (RMT in what follows), was developed at the beginning of the fifties to describe the statistical properties of energy levels of complex quantum systems [1], [2], [3]. In the early eighties it enjoyed renewed interest, since it was recognized as a very useful tool in the study of numerous physical systems. Specifically, it is very useful in the analysis of chaotic quantum systems. In fact, in recent years many papers have appeared on the problem of quantum chaos, which concerns the quantization of systems whose underlying classical dynamics is irregular (i.e. chaotic). The simplest models considered in this field are billiards of various shapes. From the classical point of view, a point particle in a 2-dimensional billiard displays regular or irregular motion depending on the shape of the billiard; for instance, motion in a rectangular or circular billiard is regular thanks to the symmetries of the boundary. On the other hand, billiards of arbitrary shape imply chaotic motion, i.e. exponential divergence of initially nearby trajectories. In order to study quantum billiards we have to consider the Schroedinger equation in various 2-dimensional domains. The eigenvalues of the Schroedinger equation represent the allowed energy levels of our quantum particle in the billiard under consideration, while the eigenfunction norms represent the probability density of finding the particle in a certain position. The question of quantum chaos is whether the character of the classical motion (regular or chaotic) can influence some properties
To achieve an appropriate quality of public transport, the relation between service quality and cost has to be analysed. Under growing competition from private cars and other carriers, those measures should be taken that guarantee the greatest effectiveness. There are many options for improving this quality, e.g. increasing vehicle frequency, raising speed, punctuality and regularity, introducing low-floor vehicles, and much more. In this paper I analyse the aspects connected with frequency, i.e. increasing the frequency and maintaining a constant headway of the buses. The paper comprises models and examples. Combined with studies on the willingness to pay for quality improvements, decisions can then be made that define different standards of travel.
Pseudorigidity method for solving the problem of limit equilibrium of rigid-plastic constructions
(1997)
1. Design calculations based upon the theory of elasticity cannot completely satisfy engineers and designers, because they cannot answer the basic question about overload capability. Only calculations of the limit equilibrium of rigid-plastic constructions can answer this question completely enough. As a rule, such calculations are made under the hypothesis that the material of the construction has a rigid-plastic Prandtl diagram. This scheme of calculation gives qualitatively more correct results than the usual calculation based upon Hooke's law, and allows a more realistic estimate of the ultimate strength of a construction under different loads. Universal algorithms for solving the problem of limit equilibrium have been created since the middle of the 1960s. These algorithms are based upon the two basic theorems of limit analysis, the static and the kinematic. It was found that, with the help of the above-mentioned theorems, the problem of limit equilibrium can be formulated as a problem of linear programming (for a linearised yield condition) or nonlinear programming (for the Huber-Mises yield condition). The linear programming method as applied to the calculation of rod systems was developed mostly in the works of Prager W. [1] and Chiras A. [2]. The linear programming method as applied to plates and shells was widely used by Rganizin A. [3]; [3] contains a fuller bibliography on this problem. Calculation of limit equilibrium by means of linear and nonlinear programming has a few significant shortcomings: - the complexity and laboriousness of the preliminary preparation of the problem for the computer; - the necessity to use special program tools that are not part of the usual program packages for strength analysis. The author has worked out a new method for the calculation of limit equilibrium without the above-mentioned shortcomings.
The method is based upon the analogy between the relations connecting internal generalized forces and generalized deformations in an elastic system, and those connecting generalized forces and the rates of change of generalized deformations in a rigid-plastic system. Because the rigid-plastic deformation is thereby treated as an elastic deformation in a system with specially constructed rigidities, this method may be called the >pseudorigidity method<.
One of the basic types of strength calculation is the calculation of the limit equilibrium of constructions. This report describes a new method for solving the problem of limit equilibrium. The rigid-plastic system is substituted by an «equivalent» elastic system with specially constructed rigidities, which is why the method is called the pseudorigidity method. An iterative algorithm was developed for finding the pseudorigidities; it is realized in a special software procedure. Combining this procedure with any elastic calculation program (base program) yields a program that solves rigid-plastic problems. It is proved that the iterations converge to the solution of the limit equilibrium problem. Test solutions show that the pseudorigidity method is universal. It allows the following: - to solve limit equilibrium problems for various models (arch, beam, frame, plate, deep beam, shell, solid); - to take into account both linearised and quadratic yield conditions; - to solve problems for various kinds of loads (concentrated, distributed, given by a generalized vector); - to take into account different yield criteria in different sections, etc. The iterative PRM process converges quickly, and the accuracy of PRM is very high even with a rough finite-element discretisation. The author has used this method in designing systems protecting equipment of nuclear power stations, pipelines and transported cargo from extreme loads.
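The pseudorigidity iteration can be illustrated on the smallest possible example, a set of parallel rigid-plastic bars sharing one displacement: wherever the elastic force would exceed the yield force, the rigidity is replaced by the secant value, and the "elastic" problem is re-solved until the force distribution is admissible. This toy sketch is an illustrative assumption of the idea, not the author's finite-element algorithm:

```python
def pseudorigidity_solve(k, n_yield, load, tol=1e-9, max_iter=100):
    """Parallel bars with elastic rigidities k[i] and yield forces
    n_yield[i] share a single displacement u under a total load
    (load < sum(n_yield)). Wherever the elastic force exceeds yield,
    the rigidity is replaced by the secant 'pseudorigidity' n_y / u
    and the elastic problem is re-solved. Returns (u, member forces)."""
    k = list(k)
    u = 0.0
    forces = [0.0] * len(k)
    for _ in range(max_iter):
        u_new = load / sum(k)                 # elastic solve with current k
        forces = [ki * u_new for ki in k]
        changed = False
        for i, (f, ny) in enumerate(zip(forces, n_yield)):
            if f > ny + tol:
                k[i] = ny / u_new             # secant pseudorigidity
                changed = True
        if not changed and abs(u_new - u) < tol:
            return u_new, forces              # admissible force distribution
        u = u_new
    return u, forces
```

In the converged state the yielded bar carries exactly its yield force while equilibrium with the applied load is preserved, which is the essence of replacing the rigid-plastic system by an equivalent elastic one.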
Predictive heat flow control of solar-optimised residential buildings - thermal simulation of complex buildings
(2003)
Modern residential buildings today, and even more so those of the future, will to a large extent supply themselves with heat. Developments in facade construction play a central role here. In addition to an increased solar heat gain, these facades store heat and release it with a delay. These properties make the thermal behaviour of a building new and more complex. Within the development of heat flow controls for solar-thermally heated residential buildings with novel facades, this paper addresses the modelling of modern facade constructions with transparent thermal insulation and phase change materials in the wall assembly, combined with newly developed shading devices based on switchable layers on the window, which are of particular interest from a control engineering point of view. The paper describes the design of a control concept for thermal indoor climate control. These investigations are based on complex simulation models; real measured values were available for the model development of the novel facade elements.
Process optimisation in the logistics chain requires interdisciplinary, cross-company project work. In the age of globalised markets and the constant improvement of the competitiveness of medium-sized enterprises, intra- and inter-company resource and route planning (optimisation) via networked information is just as necessary as the preparation of information and the analysis of business processes. Process optimisation comprises the tasks of analysing, designing, planning and controlling processes. Supply chain management (SCM) is the overarching process optimisation in the logistics chain, i.e. the logical extension of production planning and control (PPS) to supply relationships. The structural model of the logistics chain comprises the processes of * product creation * development * order acquisition (sales, marketing) * production planning * procurement * production * distribution and disposal. Supply chain management designs and optimises these processes towards customers, suppliers and service providers according to company-specific objectives (optimisation of the value chain). Information systems provide the information supply in the logistics chain.
Against the background of increasingly strict environmental legislation and the recognition that construction waste is fundamentally suitable for closed material cycles, the deconstruction of buildings has gained importance in recent years. Because of the often tight time and cost constraints of a deconstruction project, the limited availability of personnel and equipment, the largely one-of-a-kind nature of the work and the changing sites, project-oriented planning of on-site measures is of great importance. This paper presents approaches to modelling and solving the resulting planning and optimisation problems of (de)construction processes using project planning models and methods. Besides economic questions, environmental and technical issues connected with the planning of deconstruction projects are also addressed. Applying the planning approaches to real buildings shows that combining building deconstruction with processing technology improves the quality of recycled building materials. To implement the necessary measures, schedules for building deconstruction are computed taking into account the waste management conditions of the respective planning region, building- and site-specific particularities, and technical and capacity restrictions. Model computations for different planning regions show that, under certain conditions, the dismantling of buildings can already be realised more economically than conventional demolition.
The paper closes with an outlook on the realisation of complex construction processes under tight time constraints and limited site space by means of production-synchronised resource planning, and on the consideration of uncertainties in the planning and execution of deconstruction projects.
The presented work focuses on collaboration experiences gathered in complex design and engineering projects using the learning platform POLE-Europe. Within the POLE environment, student teams from different universities, disciplines and cultural backgrounds are assigned to real-world projects with clearly defined design tasks, usually to be accomplished within one semester while working in a virtual environment for most of the time. The concept of POLE and the information and collaboration technology are described.
This paper presents a specific modeling technique that is focused on preparing planning processes in civil engineering. Planning processes in civil engineering are characterized by some peculiarities so that the sequence of planning tasks needs to be determined for each planning project. Neither the use of optimized partial processes nor the use of lower detailed and optimized processes guarantee an optimal overall planning process. The modeling technique considers these peculiarities. In a first step, it is focused on the logic of the planning process. Algorithms based on the graph theory determine that logic. This approach ensures consistency and logical correctness of the description of a planning process at the early beginning in its preparation phase. Sets of data – the products of engineers like technical drawings, technical models, reports, or specifications – form the core of the presented modeling technique. The production of these sets of data requires time and money. This is expressed by a specific weighting of each set of data in the presented modeling technique. The introduction of these weights allows an efficient progress measurement and controlling of a planning project. For this purpose, a link between the modeling technique used in the preparation phase and the execution phase is necessary so that target and actual values are available for controlling purposes. The present paper covers the description of this link. An example is given to illustrate the use of the modeling technique for planning processes in civil engineering projects.
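The weighting of the engineers' sets of data allows a simple earned-value style progress measure, sketched below. The field names are illustrative; the paper's actual controlling link between preparation and execution phase is richer:

```python
def planning_progress(data_sets):
    """Weight-share of completed 'sets of data' (technical drawings,
    models, reports, specifications). Each set's weight reflects the
    time and money its production requires; progress is the completed
    share of the total weight."""
    total = sum(d["weight"] for d in data_sets)
    done = sum(d["weight"] for d in data_sets if d["complete"])
    return done / total if total else 0.0
```

Comparing this actual value against the target value scheduled for the same date gives the controlling signal the abstract refers to.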
The Priority Programme ‘Network Based Co-operation in Structural Engineering’ of the ‘German Research Foundation’ (DFG) was established in the year 2000. This paper describes and discusses the main research directions and first results of the workgroup ‘Distributed Product Models’. The five projects of the workgroup have developed completely different solutions for specific application domains. Each solution concept deals with consistent product modeling and knowledge processing in a distributed environment in the planning process. The individual solution approaches of the projects are described and the underlying basic assumptions are discussed. A unified system architecture is described for all projects of the workgroup. Two different approaches (object-oriented and graph-based models) have been introduced for product and knowledge modeling. The common structure of these models is explained in order to fully understand the differences between these modeling approaches. Finally, the concepts for co-operative work and conflict management in a distributed environment are described: the solution approaches are distinguished by classifying the supported co-operation according to time. A final scientific summary describes the state of the art in network-based co-operation in structural engineering: the role of research directions such as knowledge modeling, standard product modeling and versioning in the distributed planning process is explained.
A building project, with its many different players, requires an open and commonly accepted standard for product model description. Product-model-based design tools support easy comparison of design alternatives and optimisation of the technical quality of a design solution. This supports the client's decision-making and the comparison of design targets throughout the whole building project. The use of product models enables these tasks to meet both schedule and cost requirements. Olof Granlund uses product models and interoperable software as the main tools in its projects. The use and the realised benefits are illustrated by examples from three real projects: a university building, where product models were used from the very early phases by the whole design team; an office building for a research organisation, where product models were used in a so-called self-reporting building system; and the headquarters of an international company, where product models were widely used for building performance analysis and visualisation in the design phase as well as for the configuration of the facilities management system for the operational phase.
The complexity of the relationships between the actors of a building project requires high efficiency in communication. Among other things, data sharing is crucial. The exchange of data is made possible by interfaces between expert programs, which rely on product models. The latter are neutral standards with formal definitions of building objects and their attributes. This paper deals with the state of the art and the research activities concerning product models in the steel construction domain and the advantages provided by this technology for the sector.
Processing technical and environmental data on building materials, components, and systems has become more important during the last few years. Increased sensitivity towards environmental and energy problems has led to the demand for simulation and evaluation of the long-term behavior of buildings. The results of such simulations are expected to enable architects and engineers to develop a broader, interdisciplinary understanding of the impact of their products (buildings) on the environment. However, conducting such evaluations is currently hampered by the lack of comprehensive, up-to-date, and ecologically relevant data on building materials, components, and systems. To address this problem, this paper proposes an approach to deal with the absent or uncertain attributes of building materials, components, and systems. In the past, various information systems have been developed to provide data on a limited set of building materials, including precise values pertaining to some of their characteristics, such as availability, manufacturers, costs, etc. These traditional information systems have difficulty in dealing with uncertain, incomplete and sparse data. However, uncertainty and incompleteness characterize the nature of most of the available, environmentally related characteristics of materials, components, and systems. In this paper, a fuzzy-logic-based augmentation of traditional information systems is proposed to provide management, utilization and manipulation of incomplete and uncertain data.
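One common way such a fuzzy-logic augmentation can represent an uncertain material attribute is a triangular membership function. The sketch below is illustrative only; the attribute, its bounds and the function shape are assumptions, not taken from the paper:

```python
# Triangular fuzzy membership for an uncertain material attribute,
# e.g. an insulation conductivity known only as "around 0.04 W/mK".
def tri_membership(x, low, peak, high):
    """Degree (0..1) to which x belongs to the fuzzy value (low, peak, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

mu = tri_membership(0.045, low=0.03, peak=0.04, high=0.06)  # partial membership
```

Queries against the information system can then rank materials by membership degree instead of rejecting records whose attributes are not known exactly.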
Increasingly powerful hard- and software allows for the numerical simulation of complex physical phenomena with high levels of detail. In light of this development the definition of numerical models for the Finite Element Method (FEM) has become the bottleneck in the simulation process. Characteristic features of the model generation are large manual efforts and a de-coupling of geometric and numerical model. In the highly probable case of design revisions all steps of model preprocessing and mesh generation have to be repeated. This includes the idealization and approximation of a geometric model as well as the definition of boundary conditions and model parameters. Design variants leading to more resource-efficient structures might hence be disregarded due to limited budgets and constrained time frames.
A potential solution to the above problem is given by the concept of Isogeometric Analysis (IGA). The core idea of this method is to directly employ a geometric model for numerical simulations, which makes it possible to circumvent model transformations and the accompanying data losses. The basis for this method are geometric models described in terms of Non-Uniform Rational B-Splines (NURBS). This class of piecewise continuous rational polynomial functions is ubiquitous in computer graphics and Computer-Aided Design (CAD). It allows the description of a wide range of geometries using a compact mathematical representation. The shape of an object thereby results from the interpolation of a set of control points by means of the NURBS functions, allowing efficient representations of curves, surfaces and solid bodies alike. Existing software applications, however, only support the modeling and manipulation of the former two. The description of three-dimensional solid bodies consequently requires significant manual effort, essentially ruling out the setup of complex models.
This thesis proposes a procedural approach for the generation of volumetric NURBS models. That is, a model is not described in terms of its data structures but as a sequence of modeling operations applied to a simple initial shape. In a sense this describes the "evolution" of the geometric model under the sequence of operations. In order to adapt this concept to NURBS geometries, only a compact set of commands is necessary, which, in turn, can be adapted from existing algorithms. A model can then be treated in terms of interpretable model parameters. This leads to an abstraction from its data structures, and model variants can be set up by varying the governing parameters.
The proposed concept complements existing template modeling approaches: templates can not only be defined in terms of modeling commands but can also serve as input geometry for said operations. Such templates, arranged in a nested hierarchy, provide an elegant model representation. They offer adaptivity on each tier of the model hierarchy and allow complex models to be created from only a few model parameters. This is demonstrated for volumetric fluid domains used in the simulation of vertical-axis wind turbines. Starting from a template representation of airfoil cross-sections, the complete "negative space" around the rotor blades can be described by a small set of model parameters, and model variants can be set up in a fraction of a second.
NURBS models offer high geometric flexibility, allowing a given shape to be represented in different ways. Different model instances can exhibit varying suitability for numerical analyses. For their assessment, Finite Element mesh quality metrics are considered. These metrics are based on purely geometric criteria and can identify model degenerations commonly used to achieve certain geometric features. They can inform decisions about model adaptations and provide a measure of their efficacy. Unfortunately, they do not reveal a relation between mesh distortion and the ill-conditioning of the equation systems resulting from the numerical model.
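The abstract does not name the specific metrics used; a widely used purely geometric metric of the kind described is the scaled Jacobian at the corners of an element, sketched here for a 2D quadrilateral as an illustration (the thesis itself concerns NURBS patches, not plain quads):

```python
# Scaled Jacobian at the corners of a quadrilateral element: 1.0 for a
# square, approaching 0 (or negative) as the element degenerates.
import math

def scaled_jacobian(quad):
    """quad: four (x, y) corners in counter-clockwise order; returns the worst corner value."""
    worst = 1.0
    for i in range(4):
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % 4]
        dx, dy = quad[(i - 1) % 4]
        e1 = (bx - ax, by - ay)                  # edge to next corner
        e2 = (dx - ax, dy - ay)                  # edge to previous corner
        cross = e1[0] * e2[1] - e1[1] * e2[0]    # corner Jacobian determinant
        norm = math.hypot(*e1) * math.hypot(*e2)
        worst = min(worst, cross / norm)
    return worst
```

A perfect square scores 1.0, while a sheared or collapsed element scores lower; this captures the "purely geometric" nature of such metrics, including their silence about the conditioning of the resulting equation systems.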
Preparation and provision of building information for planning within existing built contexts
(2004)
A prerequisite for planning within existing built contexts is precise information regarding the building substance, its construction and materials, possible damages and any modifications and additions that may have occurred during its lifetime. Using the information collected in a building survey the user should be able to “explore” the building in virtual form, as well as to assess the information contained with regard to a specific planning aspect. The functionality provided by an information module should cover several levels of information provision ranging from ‘simple retrieval’ of relevant information to the analysis and assessment of stored information with regard to particular question sets. Through the provision of basic functionality at an elementary level and the ability to extend this using plug-ins, the system concept of an open extendable system is upheld. Using this modular approach, different levels of information provision can be provided as required during the planning process.
Computing in civil engineering (Bauinformatik) is a pillar of the modern civil and environmental engineering sciences. It is concerned with research into fundamental informatics methods as well as with the application and further development of the information sciences in the civil and environmental domain. The Arbeitskreis Bauinformatik (working group on computing in civil engineering) is constituted of scientists who teach and conduct research in this field at universities in the German-speaking countries. Starting from the current state of development of the discipline, this position paper outlines the tasks of the working group and formulates a basis for its coordinated further development at German-speaking universities.
Plausibility in the Planning Process - Digital Planning Aids for the Developability of Building Plots
(2003)
The digital support of planning processes is a current focus of research and work at the Chair of Computer Science in Architecture (InfAR) and the junior professorship of Architectural Informatics at the Faculty of Architecture of the Bauhaus-Universität Weimar. Anchored in the DFG collaborative research centre SFB 524, 'Werkzeuge und Konstruktionen für die Revitalisierung von Bauwerken' (Tools and Constructions for the Revitalisation of Buildings), concepts and prototypes for domain-oriented planning support are being developed. The variety of factors that can influence the planning process, as well as their interdependencies, are only inadequately processed and managed by today's planning systems. These factors call for planning tools whose task is the acquisition, processing, integration and management of information, and the visualisation of complex informational relationships. The development of such systems is technically feasible. The difficulty lies in acquiring and structuring the information relevant to the planning process, and in preparing it and integrating it into a digital planning environment. The aim of the research project is to develop foundations for digital tools that lead to plausible solutions in the planning process and thus to increased planning certainty for the clients and contractors involved in construction. The goal is to develop program modules that substantively support the planner in working out solutions to a technical question, and that guarantee and plausibly demonstrate the traceability and correctness of a planning decision. The modules are intended to catalyse decision-making. The building projects of the future will to a large extent concern the existing building stock. This fact demands planning measures for which tools and aids leading to plausible and reliable planning decisions are entirely lacking.
The development of such aids is the aim of this research. The contribution presents prototypical software modules that address the problem of the potential developability of a building plot. The modules process rules taken from the relevant standards and regulations that must be complied with when working out a planning solution.
The approach discussed here is part of research into an overall concept for digital instruments which support the entire planning process and help in enabling planning decisions to be based upon clear reasoning and plausible arguments. Such specialist systems must take into account currently available technology, such as networked working patterns, object-orientation, building and product models as well as the working method of the planner. The paper describes a plausibility instrument for the formulation of colour scheme proposals for building interiors and elevations. With the help of intuitively usable light simulations, colour, material and spatial concepts can be assessed realistically. The software prototype “Coloured Architecture” is conceived as a professional extension to conventional design tools for the modelling of buildings. As such it can be used by the architect in the earliest design phases of the planning process as well as for colour implementation on location.
Complex gridshell structures used in architecturally ambitious constructions remain as appealing as ever in the public realm. This paper describes the theory and approach behind the software realisation of a tool which helps in finding the affine self-weight geometry of gridshell structures. The software tool DOMEdesign supports the formal design process of lattice and grid shell structures based upon the laws of physics. The computer-aided simulation of suspension models is used to derive structurally favourable forms for domes and arches subject to compression load, based upon the input of simple architectonic parameters. Irregular plans, three-dimensional topography, a choice of different kinds of shell lattice structures and the desired height of the dome are examples of design parameters which can be used to modify the architectural design. The provision of data export formats for structural dimensioning and visualisation software enables engineers and planners to use the data in further planning and to communicate the design to the client.
Discrete work processes can be formally described with the help of Petri nets. Petri nets are based on graph theory. The elements of two node sets, called places and transitions, are connected by directed edges. Places represent conditions or states, transitions represent events or activities. Petri nets can model not only a static predecessor-successor structure, but also events, alternatives and concurrency. This contribution presents how a construction schedule given as an activity-on-node network can be mapped onto a condition/event net in very few steps. All necessary sub-steps, such as forming subnets or coarsening and refining nodes, rest on a mathematically sound foundation. In contrast to other formulations of construction processes, the theory of Petri nets is a general theory and can be applied in many fields. The use of such a mathematical abstraction enables the reuse of previously developed solution approaches, so that the experience gained can also be applied to the modelling of other work processes.
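The core firing rule of such a condition/event net fits in a few lines. The sketch below is a minimal illustration with two invented, sequential construction tasks; it is not the mapping procedure of the paper, only the token-game semantics it builds on:

```python
# Minimal condition/event net: a transition (event) fires when all of its
# input places (conditions) hold a token. Task names are hypothetical.
transitions = {
    "excavate": {"in": {"site_ready"}, "out": {"pit_done"}},
    "pour_foundation": {"in": {"pit_done"}, "out": {"foundation_done"}},
}
marking = {"site_ready"}  # initial marking: the only condition that holds

def fire(name):
    t = transitions[name]
    if not t["in"] <= marking:           # enabled only if all inputs are marked
        raise RuntimeError(f"{name} is not enabled")
    marking.difference_update(t["in"])   # consume input tokens
    marking.update(t["out"])             # produce output tokens

fire("excavate")
fire("pour_foundation")
```

An activity-on-node schedule maps naturally onto this structure: each activity becomes a transition, and each precedence relation becomes a place between two transitions.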
In connection with the revitalisation of large-panel buildings, the assessment or reassessment of the existing bracing systems is of great importance. Bracing is generally provided by room-wide, room-high and mostly unreinforced concrete elements that are only nominally connected to one another. Within this study, an existing computational model for the physically nonlinear analysis of the bracing systems of multi-storey buildings under horizontal load is extended so that unreinforced horizontal joints can be taken into account. The analysis of the bracing systems is realised on the basis of mathematical optimisation methods, with the bracing walls regarded as open, thin-walled, slender beams coupled by floor slabs acting as rigid diaphragms. The contribution shows that taking into account the physically nonlinear load-bearing behaviour and the associated redistribution of internal forces reveals load-bearing reserves that can be credited towards the stability of both existing and revitalised buildings.
Physically Nonlinear Analysis of Bracing Systems Taking Load Sequence Effects into Account
(2003)
The physically nonlinear analysis of reinforced concrete structures taking shakedown behaviour (adaptive load-bearing behaviour) into account, using methods of mathematical optimisation, has been the subject of intensive research at the Chair of Concrete Structures I (Massivbau I) of the Bauhaus-Universität Weimar for years. In the following contribution, the models and algorithms developed there are applied, by way of example, to the investigation of the bracing systems of large-panel buildings. Since the bracing structure of these buildings consists of assembled large-format precast concrete elements, the overall load-bearing behaviour is governed by the behaviour of the joints. The physical nonlinearity is characterised by the cracking of the unreinforced horizontal joints and the flexible bond in the vertical joints, and is taken into account accordingly in the computational model. Example calculations show that significant stress redistributions occur in the described bracing systems as a result of the nonlinear joint behaviour. Furthermore, load sequence effects can be demonstrated computationally. In contrast to seismically loaded systems, which are subjected to repeated extreme loads within a very short time, the probability of occurrence of design-relevant wind loads is low.
The goal of the research is to increase the understanding of dynamic behaviour during crane operation and to develop computer-aided methods to improve the training of crane operators. There are approximately 125,000 cranes in operation today in the construction industry, responsible for a major portion of erection activities. Unfortunately, many accidents related to the operation of cranes on construction sites occur every year in the U.S. and other countries. For example, on November 28, 1989 a tower crane collapsed during the construction of a building in San Francisco, killing four construction workers and one civilian and injuring 28. According to statistics from the Occupational Safety and Health Administration (OSHA), there were 137 crane-related fatalities in the US from 1992 to 2001. A well-known internet website that keeps track of crane-related accidents (craneaccidents.com) reports 516 accidents and 277 fatalities from 2000 to 2002. These statistics show that even though many measures have been taken to decrease the number of crane-related accidents (Braam, 2002), the number of such accidents is still very large. It is important to recognize that each construction-related fatality is not only a great human loss but also increases the costs of insurance, lawsuits, and the construction budget due to the delay of a project (Paulson 1992)...
Physically Based Modeling and Multi-Physical Simulation System for Wood Structure Fire Performance
(2004)
This research is devoted to promoting performance-based engineering for wood structures in fire. It examines the characteristics of the material, the structural composition and collapse detection to identify the main factors in the collapse of wood structures in fire. The aim of the research is to provide an automatic simulation platform for this complicated interaction. A physically based model for slender members such as beams and columns, and a framework for multi-physical simulation, are provided to implement the system. The physically based model contains a material model, a structural mechanics model and a material mechanics model, as well as a geometry model for the combined simulation. The multi-physical simulation is built on this model and is capable of carrying out a simulation combining structural, fire (thermal, CFD) and material degradation analyses. The structural and fire simulations rely on two established software packages, ANSYS (an FEA package) and FDS (built around a CFD core), respectively. The authors developed their own system to couple the two existing ones. The system can calculate wood charring to determine the loss of cross-section and to detect collapse occurring in different ways. The paper gives the example of a traditional Chinese house to show how this simulation system works.
The paper analyses the influence of the effect of inertia on the reliability of production systems. System inertia is the phenomenon of continuing work for some time after the breakdown of one of the earlier phases. In our considerations, inertia is treated as the time elapsed from the onset of a breakdown until the system's inability to work. A special method had to be devised to investigate the effect of inertia, in order to evaluate the reliability of production systems and to attempt an algorithmic approach to controlling the reliability of a production system by means of inertia or redundancy. The method of reliability analysis is presented in an informal manner only. The possibilities for increasing the reliability of production systems are listed, and a comparison of the redundancy method and the inertia method is presented. The results of this comparison and of simulation studies of the influence of inertia on system reliability form the essential scope of the paper. Selected conclusions are as follows: when inertia approaches the last phase in the system, its influence on the shape of the distribution of the system's ability increases; an increase in inertia causes an increase in the availability of the system, which approaches a certain limit value; the dependence of the average of a system's disability on inertia has a saddle-like character, whereas the dependence of the number of breakdowns (stoppages) in the system has the nature of an S-curve.
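The conclusion that availability rises with inertia towards a limit value can be illustrated with a toy Monte Carlo model. This is a sketch under assumed failure and repair probabilities, not the paper's method or parameters: one phase fails and is repaired at random, and the downstream system keeps working for `inertia` time steps after a breakdown.

```python
# Toy simulation: availability of a system whose downstream phase keeps
# working for `inertia` steps after an upstream breakdown.
import random

def availability(inertia, steps=50_000, p_fail=0.02, p_repair=0.2, seed=1):
    rng = random.Random(seed)        # fixed seed: reproducible trajectory
    up, reserve, able = True, inertia, 0
    for _ in range(steps):
        if up:
            reserve = inertia        # inertia buffer refilled while running
            if rng.random() < p_fail:
                up = False           # breakdown occurs this step
        else:
            if rng.random() < p_repair:
                up = True            # repair completed this step
        if up or reserve > 0:        # system still able to work
            able += 1
            if not up:
                reserve -= 1         # inertia being consumed during downtime
    return able / steps
```

Because the up/down trajectory does not depend on `inertia`, runs with the same seed differ only in how much downtime the buffer masks, which makes the monotone rise towards a limit value easy to see.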
The English physicist and mathematician Roger Penrose discovered a mathematical tiling whose richness of form can provide new impulses in architecture. In the municipality of Bütgenbach in the Hohes Venn, Belgium, the administrative centre was fitted with several Penrose tilings by the Essen architect Ernst Burghartz. These are based on a computer program that the presenter developed together with Dr. Frank Martini. Mr Burghartz, however, modified this preliminary design according to his artistic ideas.
Particle Simulation and Evaluation of Personal Exposure to Contaminant Sources in an Elevation Space
(2004)
An elevator, which has a small volume, is normally used by everyone for short periods of time and is equipped with a simple ventilation system. Any contaminant released within it may cause serious problems. This research adapts a fire and smoke simulation software (FDS) to a non-fire indoor airflow scenario. Differing from previous research, particles are chosen as the unit of risk evaluation. A personal and multi-personal exposure model is proposed. The model takes the influence of the human thermal boundary, coughing, inhalation, exhalation, standing position, and the fan factor into account. The model is easy to use and suitable for the design of elevator systems in practice.
One of the most promising and recent advances in computer-based planning is the transition from classical geometric modeling to building information modeling (BIM). Building information models support the representation, storage, and exchange of various information relevant to construction planning. This information can be used for describing, e.g., geometric/physical properties or costs of a building, for creating construction schedules, or for representing other characteristics of construction projects. Based on this information, plans and specifications as well as reports and presentations of a planned building can be created automatically. A fundamental principle of BIM is object parameterization, which allows specifying geometrical, numerical, algebraic and associative dependencies between objects contained in a building information model. In this paper, existing challenges of parametric modeling using the Industry Foundation Classes (IFC) as a federated model for integrated planning are shown, and open research questions are discussed.
Parallel Mesh Generation
(1997)
When computing static or dynamic problems with the Finite Element Method, a discretisation of the domain to be analysed is necessary. In a sensible model of the domain, the element size is usually not constant but smaller at critical locations. The specifications for this can come from the user's experience or from an error estimation of a preceding FE computation [5]. If the FE computation is to run on a parallel computer, a partitioning of the domain, i.e. an assignment of the elements to the processors, is necessary. In the approach described here, in contrast to the usual procedures, first the input data for the mesh generator are transformed and then the element mesh is generated directly on the parallel computer, simultaneously on all processors. An assignment of the elements to the processors arises as a by-product of the mesh partitioning. The resulting subdomain boundaries are geometrically minimised. The load balance of the mesh partitioning as well as of the FE computation is ensured by an approximately equal number of elements per partition. The input data required are a description of the domain by polygonal chains and a mesh density function, e.g. given by points with the desired element size.
The effectiveness of the working processes accomplished by various technological machines depends to a large extent on the working quality of the supply, transporting and orienting mechanisms, which are very often realised as positional hydro-mechanical systems (PHMS). Choosing their best type and operating regimes requires the construction and analysis of models of their optimum control, which are complicated by nonlinearity and the multi-criteria nature of the problem, as well as by random variations of parameters and of the switching moments of the control regime. A common structure of such systems was developed that allows, within a common scheme, the degree of complexity of the PHMS and the methods of applying braking forces to be varied. For a series of increasingly complex systems (from a two-dimensional linear system to a nine-dimensional nonlinear one), problems of fastest approach to a neighbourhood of zero are solved and two-criteria problems are analysed (T: minimum time, Z(T): accuracy). Computational procedures for the synthesis of optimum PHMS are suggested. The effectiveness of the adopted solution methods is supported by the consistency of the results obtained from the investigation of the gradually more complex models and by their good agreement with physical experiments. The role of heuristic methods for improving approximately optimum control, elaborated on the basis of the theoretical models, is revealed. The basic methods for constructing optimum PHMS are also specified.
This paper contributes to the discussion on the introduction of exclusive lanes for public transport. Results are presented on the effectiveness of use of a 4-lane and a 6-lane street with and without an exclusive lane for buses. Two basic sub-models have been applied: the binary logit model for modal split estimation, which takes into consideration the relation of the travel time by private car to that by public transport; and the polynomial model for predicting link impedance, in which the real travel time is affected by the ratio of traffic volume to design capacity. The parameters of both sub-models have been calibrated for Polish conditions. Relationships are determined between the number of person trips and the traffic volume of buses/private cars, the market share of public transport in motorised trips, the average travel time lost in trips, and the average operating cost. An iterative technique addresses the feedback between the modal split and the traffic volumes. Numerical calculations were carried out using EXCEL and MATLAB. Typical values of corridor length, passenger car occupancy rate, design capacity of the bus, and access and egress times to and from the parking/bus stop in urban areas were applied. On the basis of the estimated results, the marginal parameter values (the number of people carried at which a separate lane for buses is most effective, the traffic volume for private cars, the traffic volume for buses, and the share of public transport in trips) have been obtained, with average travel time lost and operating cost as criteria. The introduction of street lanes with and without an exclusive lane for buses can then be optimised in relation to the number of person trips.
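The feedback between the two sub-models can be sketched as a fixed-point iteration. The sketch below is illustrative only: it uses a BPR-type polynomial impedance and invented parameters, not the values calibrated for Polish conditions in the paper.

```python
# Binary logit modal split fed back through a polynomial (BPR-type) link
# impedance, iterated to a fixed point. All parameters are illustrative.
import math

def equilibrium(person_trips, t_bus=30.0, t0_car=20.0, capacity=1500.0,
                occupancy=1.2, beta=0.1, iterations=100):
    share_car = 0.5                                    # initial guess
    for _ in range(iterations):
        volume = person_trips * share_car / occupancy  # cars per hour
        # polynomial impedance: travel time grows with volume/capacity
        t_car = t0_car * (1 + 0.15 * (volume / capacity) ** 4)
        # binary logit: probability of choosing the car over the bus
        share_car = 1 / (1 + math.exp(-beta * (t_bus - t_car)))
    return share_car, t_car
```

For mild parameters like these, the iteration contracts and converges without damping; with steeper congestion functions a damped update would be needed.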
The presented work focuses on the presentation of a discrete event simulator which can be used for automated sequencing and optimization of building processes. The sequencing is based on the commonly used component–activity–resource relations taking structural and process constraints into account. For the optimization a genetic algorithm approach was developed, implemented and successfully applied to several real life steel constructions. In this contribution we discuss the application of the discrete event simulator including its optimization capabilities on a 4D process model of a steel structure of an automobile recycling facility.
Starting from mathematical considerations, we have developed simple model approaches for the following optimisation problem and carried out numerical tests: a map is divided into squares, with each square assigned a weighting factor. This weighting factor is small if the area can be crossed without difficulty, and correspondingly large if the area is a nature reserve, a lake or terrain that is hard to traverse. We look for a favourable route from point A to point B, taking into account the features of the landscape expressed by the weighting factors. We first formulate the problem as a variational problem. A necessary condition that the solution function must satisfy is the Euler-Lagrange differential equation. With the help of the Hamiltonian function, this differential equation can be written in canonical form. By simplifying the model, the system of canonical equations can be made concrete enough to serve as the starting point for numerical investigations. To this end we transform the original landscape into a "mountain landscape", where high mountains characterise areas that are difficult to cross. The simplest model is a single mountain generated by the density function of a two-dimensional normal distribution. In addition, we have carried out computations for two overlapping mountains and for a ravine.
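A discrete analogue of this weighted-map route problem makes the idea concrete: on a grid whose cell weights form a single Gaussian "mountain", a least-cost path found by Dijkstra's algorithm detours around the peak. This sketch is not the variational method of the text, only a combinatorial counterpart with invented parameters.

```python
# Least-cost route on a weighted grid with a Gaussian "mountain" at the
# centre, solved with Dijkstra's algorithm (discrete analogue, not the
# canonical-equation approach of the text).
import heapq, math

N = 21  # grid size

def weight(x, y):
    """Cell cost: base cost 1 plus a Gaussian peak at (10, 10)."""
    return 1.0 + 50.0 * math.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 8.0)

def least_cost(start, goal):
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == goal:
            return d
        if d > dist[(x, y)]:
            continue                       # stale queue entry
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < N and 0 <= ny < N:
                nd = d + weight(nx, ny)    # pay the cost of entering the cell
                if nd < dist.get((nx, ny), math.inf):
                    dist[(nx, ny)] = nd
                    heapq.heappush(pq, (nd, (nx, ny)))
    return math.inf

cheap = least_cost((0, 10), (20, 10))
straight = sum(weight(x, 10) for x in range(1, 21))  # cost of the direct route
```

The optimal route is markedly cheaper than the straight line over the peak, mirroring how the continuous solution of the Euler-Lagrange equation bends around high-weight regions.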