This paper focuses on first numerical tests of the coupling between an analytical solution and the finite element method, using an example problem from fracture mechanics. The calculations follow the ideas proposed in [1]. The analytical solutions are constructed from an orthogonal basis of holomorphic and anti-holomorphic functions. For the coupling with the finite element method, special elements are constructed using the trigonometric interpolation theorem.
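The trigonometric interpolation underlying the special elements can be illustrated with a minimal sketch (the grid, the test function and all names are invented for the example): 2N+1 equally spaced samples on [0, 2π) determine a unique trigonometric polynomial of degree N, which can then be evaluated anywhere, e.g. along a coupling element boundary.

```python
# Hedged sketch of trigonometric interpolation via the discrete Fourier
# transform: compute the Fourier coefficients of the samples, then
# evaluate the resulting degree-N trigonometric polynomial at any point.

import cmath, math

def trig_interpolate(samples, x):
    """Evaluate the degree-N interpolant of 2N+1 uniform samples at x."""
    m = len(samples)                       # m = 2N + 1, assumed odd
    n = m // 2
    coef = [sum(s * cmath.exp(-2j * math.pi * k * j / m)
                for j, s in enumerate(samples)) / m
            for k in range(-n, n + 1)]
    return sum(c * cmath.exp(1j * k * x)
               for k, c in zip(range(-n, n + 1), coef)).real

N = 4
grid = [2 * math.pi * j / (2 * N + 1) for j in range(2 * N + 1)]
samples = [math.cos(2 * t) for t in grid]      # band-limited test function
print(abs(trig_interpolate(samples, 1.234) - math.cos(2 * 1.234)) < 1e-9)
```

For band-limited data (degree ≤ N) the interpolant reproduces the function exactly at every point, not only at the sample grid.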
It is well known that complex quaternion analysis plays an important role in the study of higher order boundary value problems of mathematical physics. Following the ideas given for real quaternion analysis, the paper deals with certain orthogonal decompositions of the complex quaternion Hilbert space into subspaces of null solutions of a Dirac-type operator with an arbitrary complex potential. We then apply them to related boundary value problems, and prove existence and uniqueness as well as explicit representation formulae for the underlying solutions.
Many structures in different engineering applications suffer from cracking. In order to make reliable prognoses about the serviceability of such structures, it is of utmost importance to identify cracks as precisely as possible by non-destructive testing. A novel approach (XIGA), which combines Isogeometric Analysis (IGA) and the Extended Finite Element Method (XFEM), is used for the forward problem, namely the analysis of a cracked material, see [1]. Applying the NURBS (Non-Uniform Rational B-Spline) based approach from IGA together with the XFEM makes it possible to describe arbitrarily shaped cracks effectively and avoids the need for remeshing during the crack identification problem. We want to exploit these advantages for the inverse problem of detecting existing cracks by non-destructive testing, see e.g. [2]. The quality of the reconstructed cracks, however, depends on two major issues: the quality of the measured data (measurement error) and the discretization of the crack model. The first is taken into account by applying regularizing methods with a posteriori stopping criteria. The second is critical in the sense that too few degrees of freedom, i.e. too few control points of the NURBS, do not allow a precise description of the crack. An increased number of control points, however, increases the number of unknowns in the inverse analysis and intensifies the ill-posedness. The trade-off between accuracy and stability is sought by applying an inverse multilevel algorithm [3, 4], in which the identification is started with short knot vectors that are successively enlarged during the identification process.
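The multilevel idea of successively enlarging the knot vectors can be sketched as follows (a simplified illustration, not the papers' actual refinement rule): between identification levels, a midpoint knot is inserted into every non-empty span, so the crack curve gains control points gradually.

```python
# Hypothetical sketch of multilevel knot-vector refinement: identification
# starts on a short (coarse) knot vector and, between levels, new knots
# are inserted so the NURBS crack description gains degrees of freedom.

def refine_knot_vector(knots):
    """Insert a midpoint knot into every non-empty interior span."""
    refined = list(knots)
    for a, b in zip(knots[:-1], knots[1:]):
        if b > a:                      # skip repeated (clamping) knots
            refined.append(0.5 * (a + b))
    return sorted(refined)

# Clamped coarse vector on [0, 1] (degree 2 here for brevity).
level0 = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
level1 = refine_knot_vector(level0)
level2 = refine_knot_vector(level1)

print(level1)                     # [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
print(len(level2) - len(level1))  # two non-empty spans -> two new knots
```

Starting the inverse analysis on `level0` and moving to `level1`, `level2`, ... mirrors the trade-off described above: few unknowns first (stability), more unknowns later (accuracy).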
The Influence of the Local Concavity on the Functioning of the Bearing Shell of High-Rise Construction
(2012)
Areas with various defects and damages, which reduce carrying capacity, were examined in a study of metal chimneys. In this work, the influence of local dimples on the function of metal chimneys was considered. Modeling tasks were completed in the software packages LIRA and ANSYS. Parameters characterizing the local dimples were identified, and a numerical study of the influence of local dimples on the stress-strain state of the shells of metal chimneys was conducted. The distribution field of circumferential and meridional tension was analyzed in the researched area. Zones of influence of dimples on the bearing cover of metal chimneys were investigated. The bearing capacities of high-rise structures with various dimple geometries and various cover parameters were determined with respect to specified areas of the trunk. The decrease in bearing capacity of a cover as a function of the dimple parameters is represented graphically. The diameter and thickness of the covers of metal chimneys were then determined according to the resulting data.
The aim of this paper is to discuss explicit series constructions for the fundamental solution of the Helmholtz operator on some important examples of non-orientable conformally flat manifolds. In this paper we focus on higher dimensional generalizations of the Klein bottle, which in turn generalize the higher dimensional Möbius strips discussed in our preceding works. We discuss some basic properties of pinor-valued solutions to the Helmholtz equation on these manifolds.
The process of analysis and design in structural engineering requires the consideration of different partial models, for example loading, structural materials, structural elements, and analysis types. The various partial models are combined by coupling several of their components. Due to the large number of available partial models describing similar phenomena, many different model combinations are possible to simulate the same aspects of a structure. The challenging task of an engineer is to select a model combination that ensures a sufficient, reliable prognosis. In order to achieve this reliable prognosis of the overall structural behavior, a high individual quality of the partial models and an adequate coupling of the partial models are required. Several methodologies have been proposed to evaluate the quality of partial models for their intended application, but a detailed study of the coupling quality is still lacking. This paper proposes a new approach to assess the coupling quality of partial models in a quantitative manner. The approach is based on the consistency of the coupled data and applies to both uni- and bidirectionally coupled partial models. Furthermore, the influence of the coupling quality on the output quantities of the partial models is considered. The functionality of the algorithm and the effect of the coupling quality are demonstrated using an example of coupled partial models in structural engineering.
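One way to turn the consistency of coupled data into a number can be sketched as below. The metric is an illustration of the general idea only, not the paper's actual measure: the interface quantity as sent by one partial model is compared with the value returned by the other, and the relative discrepancy is mapped to a [0, 1] score.

```python
# Hypothetical coupling-consistency score: 1.0 when the coupled data at
# the shared interface agree exactly, decreasing towards 0 as the two
# partial models disagree. All names and numbers are invented.

def coupling_quality(interface_a, interface_b, tol=1e-12):
    """Return a [0, 1] consistency score for two interface data vectors."""
    num = sum((a - b) ** 2 for a, b in zip(interface_a, interface_b)) ** 0.5
    den = max(sum(b * b for b in interface_b) ** 0.5, tol)
    return 1.0 / (1.0 + num / den)

# Interface displacements handed from partial model A to B and back:
u_ab = [1.00, 2.00, 3.00]              # sent by partial model A
u_ba = [1.00, 2.00, 3.00]              # returned by partial model B
print(coupling_quality(u_ab, u_ba))    # 1.0 for fully consistent data
```

For bidirectional coupling the same score can be evaluated in both transfer directions and combined, which is the spirit of the quantitative assessment described above.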
Bridge vibration due to traffic loading has been the subject of extensive research in recent decades. Such studies are concerned with deriving solutions for the bridge-vehicle interaction (BVI) and analyzing the dynamic responses considering the randomness of the coupled (BVI) model's input parameters and the randomness of road unevenness. This study goes further to examine the effects of such randomness of input parameters and processes on the variance of the dynamic responses in quantitative measures. The input parameters examined in the sensitivity analysis are the stiffness and damping of the vehicle's suspension system, the axle spacing, and the stiffness and damping of the bridge. This study also examines the effects of the initial excitation of a vehicle on the influence of the considered input parameters. Variance-based sensitivity analysis is usually applied to deterministic models; the model for the dynamic problem here, however, is stochastic due to the simulation of random processes. Thus, a setting using a joint meta-model is developed: one model for the mean response and another for the dispersion of the response. The joint model is developed within the framework of Generalized Linear Models (GLM). An enhancement of the GLM procedure is suggested and tested; this enhancement incorporates Moving Least Squares (MLS) approximation algorithms in the fitting of the mean component of the joint model. The sensitivity analysis is then performed on the joint model developed for the dynamic responses caused by BVI.
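The moving least squares ingredient can be sketched in one dimension (an illustration under simplifying assumptions; the paper embeds MLS in the mean component of a joint GLM meta-model over several inputs): at each query point a locally weighted linear model is solved, with Gaussian weights concentrating on nearby samples.

```python
# Sketch of a 1-D moving least squares (MLS) prediction: a weighted
# linear fit y = a + b*x is solved at every query point via the 2x2
# normal equations, with Gaussian weights centred on the query.

import math

def mls_predict(x_query, xs, ys, bandwidth=1.0):
    """Locally weighted linear fit evaluated at x_query."""
    w = [math.exp(-((x - x_query) / bandwidth) ** 2) for x in xs]
    s0 = sum(w)
    s1 = sum(wi * x for wi, x in zip(w, xs))
    s2 = sum(wi * x * x for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1
    a = (s2 * t0 - s1 * t1) / det
    b = (s0 * t1 - s1 * t0) / det
    return a + b * x_query

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 2.0, 3.0, 4.0]        # exactly linear data
print(mls_predict(1.5, xs, ys))       # recovers ~1.5 on linear data
```

On exactly linear data the local fit reproduces the line regardless of the weights; on noisy response data the bandwidth controls how local the mean-response surface is.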
Numerical simulations in the general field of civil engineering are common for the design process of structures and/or the assessment of existing buildings. The behaviour of these structures is analytically unknown and is approximated with numerical simulation methods like the Finite Element Method (FEM). Therefore, the real structure is transferred into a global model (GM, e.g. a concrete bridge) with a wide range of sub-models (partial models PM, e.g. material modelling, creep). These partial models are coupled together to predict the behaviour of the observed structure (GM) under different conditions. The engineer needs to decide which models are suitable for computing the physical processes determining the structural behaviour realistically and efficiently. Theoretical knowledge along with experience from prior design processes will influence this model selection decision; it is thus often a qualitative selection of different models. The goal of this paper is to present a quantitative evaluation of the global model quality for the simulation of a bridge subject to direct loading (dead load, traffic) and indirect loading (temperature), which induces restraint effects. The model quality can be investigated separately for each partial model and also for the coupled partial models in a global structural model. Probabilistic simulations are necessary for the evaluation of these model qualities, using uncertainty and sensitivity analysis. The method is applied to the simulation of a semi-integral concrete bridge with a monolithic connection between the superstructure and the piers, and elastomeric bearings at the abutments. The results show that the evaluation of global model quality is strongly dependent on the sensitivity of the considered partial models and their related quantitative prediction quality. The method is not only a relative comparison between different models, but also a quantitative representation of model quality using probabilistic simulation methods, which can support the process of model selection for numerical simulations in research and practice.
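The variance-based sensitivity analysis invoked above can be sketched for a toy model (all numbers invented; a real study would use dedicated sampling schemes rather than this brute-force estimator): the first-order index of an input is the variance of the conditional mean of the output, normalized by the total output variance.

```python
# Sketch of a Sobol-type first-order sensitivity index, estimated by
# brute-force conditional means for a two-input toy model with inputs
# uniform on (0, 1). Illustrative only.

import random

def first_order_index(model, which, n_outer=200, n_inner=200, seed=1):
    """S_i = Var( E[model | x_i] ) / Var(model)."""
    rng = random.Random(seed)
    def var(v):
        m = sum(v) / len(v)
        return sum((y - m) ** 2 for y in v) / len(v)
    cond_means, all_out = [], []
    for _ in range(n_outer):
        xi = rng.random()                  # fix input `which` at xi
        outs = []
        for _ in range(n_inner):
            x = [rng.random(), rng.random()]
            x[which] = xi
            outs.append(model(x))
        cond_means.append(sum(outs) / n_inner)
        all_out.extend(outs)
    return var(cond_means) / var(all_out)

model = lambda x: 4.0 * x[0] + 1.0 * x[1]  # x0 dominates the output
s0 = first_order_index(model, 0)
s1 = first_order_index(model, 1)
print(s0 > s1)                             # True: x0 is more influential
```

Ranking partial-model inputs by such indices is what connects the sensitivity of a partial model to its weight in the global model quality.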
We briefly review and use the recent comprehensive research on the manifolds of square roots of −1 in real Clifford geometric algebras Cl(p,q) in order to construct the Clifford Fourier transform. Basically in the kernel of the complex Fourier transform the complex imaginary unit j is replaced by a square root of −1 in Cl(p,q). The Clifford Fourier transform (CFT) thus obtained generalizes previously known and applied CFTs, which replaced the complex imaginary unit j only by blades (usually pseudoscalars) squaring to −1. A major advantage of real Clifford algebra CFTs is their completely real geometric interpretation. We study (left and right) linearity of the CFT for constant multivector coefficients in Cl(p,q), translation (x-shift) and modulation (ω-shift) properties, and signal dilations. We show an inversion theorem. We establish the CFT of vector differentials, partial derivatives, vector derivatives and spatial moments of the signal. We also derive Plancherel and Parseval identities as well as a general convolution theorem.
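One common way to write the transform described above is the following (a sketch of the general idea; sign conventions and the side on which the kernel is placed vary across the CFT literature and matter here, since a general square root of −1 need not commute with the multivector signal):

```latex
% CFT of a multivector signal h : R^n -> Cl(p,q) with respect to a
% chosen square root f of -1 in Cl(p,q):
\mathcal{F}\{h\}(\omega) \;=\; \int_{\mathbb{R}^n} h(x)\,
    e^{-f\, x\cdot\omega}\, \mathrm{d}^n x,
\qquad f \in C\ell(p,q), \quad f^2 = -1 .
```

Choosing f as the pseudoscalar recovers the previously known CFTs mentioned above; general f gives the generalization studied in the paper.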
New foundations for geometric algebra are proposed based upon the existing isomorphisms between geometric and matrix algebras. Each geometric algebra always has a faithful real matrix representation with a periodicity of 8. On the other hand, each matrix algebra is always embedded in a geometric algebra of a convenient dimension. The geometric product is also isomorphic to the matrix product, and many vector transformations such as rotations, axial symmetries and Lorentz transformations can be written in a form isomorphic to a similarity transformation of matrices. We take up the idea that Dirac applied to develop the relativistic electron equation when he took a basis of matrices for the geometric algebra instead of a basis of geometric vectors. Of course, this way of understanding the geometric algebra requires new definitions: the geometric vector space is defined as the algebraic subspace that generates the rest of the matrix algebra by addition and multiplication; isometries are simply defined as the similarity transformations of matrices as shown above; and finally, the norm of any element of the geometric algebra is defined as the nth root of the determinant of its representative matrix of order n×n. The main idea of this proposal is an arithmetic point of view consisting of reversing the roles of matrix and geometric algebras, in the sense that geometric algebra is a way of accessing, working with and understanding the most fundamental conception of matrix algebra as the algebra of transformations of multilinear quantities.
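The determinant-based norm can be checked numerically in the smallest non-trivial case (a sketch under the stated assumptions, using the standard isomorphism Cl(0,1) ≅ ℂ; this is our illustration, not the paper's worked example): the element a + b·e1 with e1² = −1 has the faithful 2×2 real representation [[a, −b], [b, a]], and the nth root of the determinant (n = 2) reproduces the complex modulus.

```python
# Numerical illustration of the norm defined as the n-th root of the
# determinant of the representative matrix, for Cl(0,1) = C.

import math

def rep(a, b):
    """2x2 real matrix representing a + b*e1 in Cl(0,1), e1^2 = -1."""
    return [[a, -b], [b, a]]

def norm(m):
    """n-th root of the determinant, with n = 2 for 2x2 matrices."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return math.sqrt(det)

print(norm(rep(3.0, 4.0)))   # 5.0, the modulus of 3 + 4i
```

The same recipe applies verbatim to larger faithful representations, with n equal to the matrix order.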
Safe operation of important civil structures such as bridges can be estimated by using fracture analysis. Since analytical methods are not capable of solving many complicated engineering problems, numerical methods have been increasingly adopted. In this paper, a part of an isotropic material which contains a crack is considered as a partial model, and the quality of the proposed model is evaluated. EXtended IsoGeometric Analysis (XIGA) is a newly developed numerical approach [1, 2] which benefits from the advantages of its origins: the eXtended Finite Element Method (XFEM) and IsoGeometric Analysis (IGA). It is capable of simulating crack propagation problems without remeshing and of capturing the singular field at the crack tip by using crack tip enrichment functions. Also, exact representation of the geometry is possible using only a few elements. XIGA has also been successfully applied to fracture analysis of cracked orthotropic bodies [3] and to the simulation of curved cracks [4]. XIGA applies NURBS functions for both the geometry description and the solution field approximation. The drawback of NURBS functions is that local refinement cannot be defined, since they are based on tensor-product constructs, unless multiple patches are used, which also has some limitations. In this contribution, XIGA is further developed to make local refinement feasible by using T-spline basis functions. Adopting a recovery-based error estimator in the proposed approach for evaluating the model quality and performing adaptive processes is in progress. Finally, some numerical examples with available analytical solutions are investigated by the developed scheme.
We study the Weinstein equation for u on the upper half-space R^3_+. The Weinstein equation is connected to axially symmetric potentials. We compute solutions of the Weinstein equation depending on the hyperbolic distance and on x_2. These results imply explicit mean value properties. We also compute the fundamental solution. The main tools are the hyperbolic metric and its invariance properties.
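As a hedged reminder of the standard form (the parameter naming may differ from the paper's conventions), the Weinstein equation on the upper half-space x_2 > 0 is usually written as

```latex
% Weinstein equation on the upper half-space x_2 > 0,
% with real parameter k:
\Delta u \;+\; \frac{k}{x_2}\,\frac{\partial u}{\partial x_2} \;=\; 0 .
```

For k = 0 it reduces to the Laplace equation, and for suitable k it governs the axially symmetric potentials mentioned above, which is why the hyperbolic metric is the natural tool.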
Non-destructive techniques for damage detection have become a focus of engineering interest in recent years. However, applying these techniques to large, complex structures such as civil engineering buildings still has some limitations, since these structures are unique and the methodologies often need a large number of specimens for reliable results. For this reason, cost and time can greatly influence the final results.
Model Assisted Probability Of Detection (MAPOD) has taken its place among damage identification techniques, especially with advances in computing capacity and modeling tools. Nevertheless, the essential condition for a successful MAPOD is having a reliable model in advance. This condition opens the door to model assessment and model quality problems. In this work, an approach is proposed that uses Partial Models (PM) to compute the Probability Of damage Detection (POD). A simply supported beam, which can be structurally modified and tested under laboratory conditions, is taken as an example. The study includes both experimental and numerical investigations, the application of vibration-based damage detection approaches and a comparison of the results obtained from tests and simulations.
Finally, a methodology to assess the reliability and robustness of the models is proposed.
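The core quantity of a MAPOD study can be illustrated with a toy computation (invented outcomes; real studies fit a statistical POD curve rather than raw frequencies): detection outcomes from many simulated or tested damage scenarios are grouped by damage level, and the fraction of detections per level estimates the POD.

```python
# Toy probability-of-detection (POD) estimate per damage level from a
# list of (damage_level, detected) outcomes. Data are invented.

def pod_by_level(outcomes):
    """outcomes: iterable of (damage_level, detected_bool) -> {level: POD}."""
    hits, trials = {}, {}
    for level, detected in outcomes:
        trials[level] = trials.get(level, 0) + 1
        hits[level] = hits.get(level, 0) + (1 if detected else 0)
    return {lvl: hits[lvl] / trials[lvl] for lvl in trials}

outcomes = [(1, False), (1, False), (1, True),
            (2, True),  (2, False), (2, True),
            (3, True),  (3, True),  (3, True)]
pod = pod_by_level(outcomes)
print(pod[3])   # 1.0: severe damage always detected in this toy data
```

Replacing the invented outcomes with results from the partial models is what turns this bookkeeping into the model-assisted POD described above.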
The present research analyses the prediction error obtained under different data availability scenarios to determine which measurements contribute to an improvement of the model prognosis and which do not. A fully coupled 2D hydromechanical model of a water-retaining dam is taken as an example. Here, the mean effective stress in the porous skeleton is reduced due to an increase in pore water pressure under drawdown conditions. Relevant model parameters are ranked by scaled sensitivities, Particle Swarm Optimization is applied to determine the optimal parameter values, and model validation is performed to determine the magnitude of the forecast error. We compare the predictions of the optimized models with results from a forward run of the reference model to obtain actual prediction errors.
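The calibration step can be sketched with a minimal one-parameter particle swarm (a generic PSO sketch with invented coefficients and a made-up misfit function, not the study's actual setup): particles explore the parameter range, attracted to their personal best and the global best misfit found so far.

```python
# Minimal global-best PSO minimizing a scalar misfit over one parameter.
# Inertia and acceleration coefficients are typical textbook values.

import random

def pso(objective, lo, hi, n_particles=20, n_iter=60, seed=0):
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best_p = list(pos)
    best_f = [objective(x) for x in pos]
    g = best_p[best_f.index(min(best_f))]      # global best position
    for _ in range(n_iter):
        for i in range(n_particles):
            vel[i] = (0.7 * vel[i]
                      + 1.5 * rng.random() * (best_p[i] - pos[i])
                      + 1.5 * rng.random() * (g - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)
            f = objective(pos[i])
            if f < best_f[i]:
                best_f[i], best_p[i] = f, pos[i]
                if f < objective(g):
                    g = pos[i]
    return g

# Hypothetical calibration misfit with the "true" parameter at 3.5:
misfit = lambda k: (k - 3.5) ** 2
print(pso(misfit, 0.0, 10.0))   # converges close to 3.5
```

In the actual study the objective would be the weighted misfit between simulated and observed quantities over all calibration data of a scenario.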
The analyses presented here were performed on 31 data sets of 100 observations of varying data types. Calibrating with multiple information types instead of only one brings better calibration results and an improvement in model prognosis. However, when using several types of information, the number of observations has to be increased to cover a representative part of the model domain; otherwise a compromise between data availability and domain coverage proves best. Which type of calibration information contributes to the best prognoses could not be determined in advance, because the error in model prognosis does not depend on the calibration error but on the parameter error, which unfortunately cannot be determined in reality since the true parameter values are unknown. Excellent calibration fits with parameter values near the limits of physically reasonable values produced the highest prognosis errors, while models that included excess pore pressure values for calibration provided the best prognoses, independent of the calibration fit.
Electromagnetic wave propagation is currently present in the vast majority of situations which occur in everyday life, whether in mobile communications, DTV, satellite tracking, broadcasting, etc. Because of this, the study of increasingly complex media for the propagation of electromagnetic waves has become necessary in order to optimize resources and increase the capabilities of the devices as required by the growing demand for such services.
Within electromagnetic wave propagation, different parameters are considered that characterize it under various circumstances; of particular importance are the reflectance and transmittance. There are several methods for the analysis of the reflectance and transmittance, such as the method of approximation by boundary conditions, the plane wave expansion method (PWE), etc., but this work focuses on the WKB and SPPS methods.
The implementation of the WKB method is relatively simple, but it is found to be efficient only when working at high frequencies. The SPPS (Spectral Parameter Power Series) method, based on the theory of pseudoanalytic functions, solves this problem through a new representation of the solutions of Sturm-Liouville equations and has recently proven to be a powerful tool for solving various boundary value and eigenvalue problems. Moreover, it has a structure well suited to numerical implementation, which in this case was carried out in Matlab for the evaluation of both conventional and turning-point profiles.
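The SPPS construction can be sketched for the simplest special case (a simplification for illustration, not the work's Matlab implementation): for u'' + λu = 0 with the non-vanishing particular solution u0 = 1, the spectral parameter power series reduces to iterated integrals whose sum reproduces cos(√λ·x); the general method handles arbitrary Sturm-Liouville coefficients by the same recursion.

```python
# Simplified SPPS sketch: build the power series in the spectral
# parameter lam by repeated cumulative integration, for u'' + lam*u = 0
# on [0, L] with u0 = 1 (p = r = 1, q = 0).

import math

def cumtrapz(f, xs):
    """Cumulative trapezoidal integral of samples f over grid xs."""
    out = [0.0]
    for i in range(1, len(xs)):
        out.append(out[-1] + 0.5 * (f[i] + f[i - 1]) * (xs[i] - xs[i - 1]))
    return out

def spps_solution(lam, L=1.0, n_terms=12, n_pts=2001):
    xs = [L * i / (n_pts - 1) for i in range(n_pts)]
    term = [1.0] * n_pts                   # X^(0) = 1
    u = list(term)
    for _ in range(n_terms):
        inner = cumtrapz(term, xs)         # two nested integrals per power
        term = [-lam * v for v in cumtrapz(inner, xs)]
        u = [a + b for a, b in zip(u, term)]
    return xs, u

xs, u = spps_solution(lam=4.0)
exact = math.cos(2.0 * xs[-1])             # cos(sqrt(lam)*x) at x = L
print(abs(u[-1] - exact) < 1e-4)           # True: series matches cos
```

The series 1 − λx²/2 + λ²x⁴/24 − ... generated by the recursion is exactly the cosine expansion, which is why truncation after a few terms already suffices here.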
The comparison between the two methods allows us to obtain valuable information about their performance, which is useful for determining the validity and suitability of their application to problems where these parameters must be calculated in real-life applications.
In this paper, a wavelet energy damage indicator is used in response surface methodology to identify damage in a simulated filler-beam railway bridge. The approximate model is designed to include the operational and surrounding conditions in the assessment. The procedure is split into two stages, a training and a detection phase. During the training phase, a so-called response surface is built from training data using polynomial regression and radial basis function approximation approaches. The response surface is then used to detect damage in the structure during the detection phase. The results show that the response surface model is able to detect moderate damage in one of the bridge supports while the temperatures and train velocities are varied.
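The two-stage procedure can be sketched as follows (a hedged stand-in: one operational variable, a quadratic surface fitted by ordinary least squares, and invented numbers; the paper uses polynomial regression and RBF surfaces over several variables): the training phase learns the healthy-state indicator as a function of temperature, and the detection phase flags measurements that lie far above the surface.

```python
# Training: least-squares quadratic response surface y = c0 + c1*t + c2*t^2,
# solved via the 3x3 normal equations with Gaussian elimination.
# Detection: residual against the surface exceeding a threshold -> damage.

def fit_quadratic(ts, ys):
    A = [[sum(t ** (i + j) for t in ts) for j in range(3)] for i in range(3)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(3)]
    for col in range(3):                       # forward elimination
        piv = A[col][col]
        for row in range(col + 1, 3):
            f = A[row][col] / piv
            A[row] = [a - f * p for a, p in zip(A[row], A[col])]
            b[row] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):                      # back substitution
        c[row] = (b[row] - sum(A[row][k] * c[k]
                               for k in range(row + 1, 3))) / A[row][row]
    return c

def predict(c, t):
    return c[0] + c[1] * t + c[2] * t * t

# Training data: healthy-state indicator at several temperatures
# (exactly 1 + 0.004*t^2 here, so the fit is recoverable).
ts = [0.0, 5.0, 10.0, 15.0, 20.0]
ys = [1.0, 1.1, 1.4, 1.9, 2.6]
c = fit_quadratic(ts, ys)

threshold = 0.5
measured = predict(c, 12.0) + 1.2              # simulated damaged reading
print(measured - predict(c, 12.0) > threshold) # True -> damage flagged
```

Extending the surface over both temperature and train velocity, as in the paper, only enlarges the regression basis; the detection logic stays the same.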
Long-span cable-supported bridges are prone to wind-induced aerodynamic instabilities, and this phenomenon is usually a major design criterion. If the wind speed exceeds the critical flutter speed of the bridge, this constitutes an Ultimate Limit State. The prediction of the flutter boundary therefore requires accurate and robust models. This paper aims at studying various combinations of models to predict the flutter phenomenon.
Since flutter is a coupling of aerodynamic forcing with a structural dynamics problem, different types and classes of models can be combined to study the interaction. Here, both numerical approaches and analytical models are utilised and coupled in different ways to assess the prediction quality of the hybrid model. The models employed for the aerodynamic forces are the analytical Theodorsen expressions for the motion-induced aerodynamic forces of a flat plate and Scanlan derivatives as a meta-model. Further, Computational Fluid Dynamics (CFD) simulations using the Vortex Particle Method (VPM) were used to cover numerical models.
The structural representations were dimensionally reduced to two-degree-of-freedom section models calibrated from global models, as well as a fully three-dimensional Finite Element (FE) model. A two-degree-of-freedom system was analysed both analytically and numerically.
Generally, all models were able to predict the flutter phenomenon and relatively close agreement was found for the particular bridge. In conclusion, the model choice for a given practical analysis scenario will be discussed in the context of the analysis findings.
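The flutter boundary as a stability limit can be illustrated with a deliberately reduced toy (one torsional degree of freedom and made-up coefficients, far simpler than the coupled two-degree-of-freedom models above): the motion-induced aerodynamic forces contribute a damping term that grows with wind speed, and the critical speed is where the total damping crosses zero.

```python
# Toy flutter-boundary search: bisection on the sign change of the
# total (structural minus aerodynamic) damping. Numbers are invented.

def total_damping(U):
    """Structural damping minus a speed-proportional aerodynamic part."""
    return 2.0 - 0.25 * U

def flutter_speed(damping, lo=0.0, hi=100.0, tol=1e-8):
    """Bisection between a stable (lo) and an unstable (hi) speed."""
    assert damping(lo) > 0 > damping(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if damping(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(flutter_speed(total_damping))   # close to 8.0 in these model units
```

In the actual model combinations, the damping of the coupled aeroelastic system comes from a complex eigenvalue analysis with Theodorsen, Scanlan or CFD-based forces, but the boundary search over wind speed follows the same logic.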
The 19th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus University Weimar from 4 to 6 July 2012. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss their work. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus University Weimar from 20 to 22 July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss their work. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
Project control is steadily gaining importance as a field of activity in the realization of investment projects. To cope with the extensive and complicated tasks of the project controller, software is increasingly being offered and used. It can be observed, however, that one cannot yet speak of integrated project control that takes the criteria of performance, schedule and cost into account, because essential theoretical and practical prerequisites are missing. This contribution presents deficits of current practice and approaches towards an integrated, computer-supported project control. Suitable solutions can be found through appropriate forms of project structuring and by coupling standard software available on the market. Problem areas here are the interfaces between the application programs, data acquisition and management, and suitable methods for cost estimation and cost tracking. The contribution presents one solution already in practical use and one currently under development.
Ever since data processing turned, in its full complexity, to the topic of Computer Integrated Manufacturing, production planning and control has been one of the areas in which computer support appeared most urgent. Later, comprehensive business solutions emerged which (rather imprecisely, to this day) are called Enterprise Resource Planning (ERP) systems and which also cover production planning functions in their logistics modules. All known MRP, PPS and ERP systems are based on successive planning. Advanced Planning and Scheduling (APS) systems have attracted increasing interest since about 1995. Besides Demand Planning, Production Planning and Scheduling, Distribution Planning, Transportation Planning and Supply Chain Planning, APS systems are expected to provide solutions for the number and locations of production sites and distribution warehouses, the assignment to production sites, capacity determination for personnel and equipment per site, inventory per part and warehouse, the determination of required means of transport and the frequency of their use, the assignment of warehouses to production sites and of markets to warehouses, and more. In other words, APS systems complement ERP solutions, use the data already available in the ERP system, and require novel algorithms and (meta-)heuristics. The talk presents and discusses models and real-time algorithms for optimizing logistics for processes with short-term requirements, geographically distributed production, warehousing of raw, intermediate and end products, and changing transport conditions, from the point of view of practical implementation and application in the form of an ASP solution.
Securing competitiveness in the construction industry, especially for small and medium-sized enterprises, requires active measures in response to the changing competitive situation. Small entrepreneurial units can gain a substantial competitive advantage through greater flexibility, fast reaction to customer wishes or to current situations on the construction site, and closeness to the market. To this end it is necessary to support the information and communication flows by using standardized and inexpensive hardware and software, such as handhelds, and in particular to remove the existing obstacles in the information flow between construction site and office. Using the projects >IuK - SystemBau< and >eSharing< as examples, an introduction strategy for >Mobile Computing< in small entrepreneurial units of the construction industry (SMEs), based on an extensive requirements analysis, is presented. The following aspects are described: consistent use of the technology with attention to the different qualification levels, introduction support through training, process analysis and possible integration into existing software environments, as well as field tests.
Today's situation in structural design is characterized by the cooperation of a larger number of specialists from different disciplines (architecture, structural design, etc.) in temporary project partnerships. Coordinating the resulting complex, dynamic and networked planning processes frequently leads to planning deficiencies and losses in quality. This article shows how agent technology can provide approaches to improving the planning situation. To this end, an agent model for networked-cooperative structural design is presented and demonstrated using the design of a pedestrian arch bridge. The agent model covers (1) the participating specialist planners and organizations, (2) the structure-specific planning processes, (3) the associated (partial) product models and (4) the (engineering) software used. From this, three sub-models are derived: (1) an agent-based cooperation model, (2) agent-based product model integration and (3) a model for agent-based software integration. The focus of the article is on the presentation of the agent-based cooperation model.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process, by building a system through the use of internet technology. The system links distributed databases and allows building partners to remotely access and manipulate specific information. It provides an updated building representation that is enriched and refined throughout the building production process. A recent collaboration with Nemetschek France (subsidiary company of Nemetschek AG, AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help building project actors choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility of building online building product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decisional support to the construction works selection process. To do so, we consider the actor's role in the system and the pieces of information each one needs to achieve a given task.
The planning of projects in building engineering is a complex process which is characterized by a dynamical composition and many modifications during the definition and execution time of processes. For a computer-aided and network-based cooperation a formal description of the planning process is necessary. In the research project “Relational Process Modelling in Cooperative Building Planning” a process model is described by three parts: an organizational structure with participants, a building structure with states and a process structure with activities. This research project is part of the priority program 1103 “Network-Based Cooperative Planning Processes in Structural Engineering” promoted by the German Research Foundation (DFG). Planning processes in civil engineering can be described by workflow graphs. The process structure describes the logical planning process and can be formally defined by a bipartite graph. This structure consists of activities, transitions and relationships between activities and transitions. In order to minimize errors at execution time of a planning process a consistent and structurally correct process model must be guaranteed. This contribution considers the concept and the algorithms for checking the consistency and the correctness of the process structure.
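One of the structural-correctness checks on the bipartite process structure can be sketched as follows (node names are invented for the example): every edge of the workflow graph must link an activity to a transition, never two nodes of the same kind.

```python
# Check the bipartiteness rule of the process structure: edges may only
# connect an activity with a transition. Nodes not listed among the
# activities are taken to be transitions here.

def check_bipartite(activities, transitions, edges):
    """Return the list of edges violating the activity<->transition rule."""
    bad = []
    for src, dst in edges:
        src_is_act = src in activities
        dst_is_act = dst in activities
        if src_is_act == dst_is_act:       # same kind on both ends
            bad.append((src, dst))
    return bad

activities  = {"draft_design", "check_statics"}
transitions = {"t1"}
edges_ok  = [("draft_design", "t1"), ("t1", "check_statics")]
edges_bad = [("draft_design", "check_statics")]     # skips the transition

print(check_bipartite(activities, transitions, edges_ok))   # []
print(check_bipartite(activities, transitions, edges_bad))  # one violation
```

A full consistency check as described above would add further conditions (e.g. connectivity and well-formed start/end of the workflow graph) on top of this structural rule.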
Let the information of a civil engineering application be decomposed into objects of a given set of classes. The set of objects then forms the data base of the application. The objects contain attributes and methods: properties of the objects are stored in the attributes, and the algorithms which the objects perform are implemented in the methods. If objects are modified by a user, the consistency of the data base is destroyed, and the data base must be modified in an update to restore it. The sequence of the update operations is not arbitrary but is governed by the dependences between the objects. This situation can be described mathematically with graph theory. The available algorithms for determining the update sequence are not suitable when the data base is large. A new update algorithm for large data bases has been developed and is presented in this paper.
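The paper's own large-data-base algorithm is not reproduced in the abstract; as a baseline, the update-sequence problem it describes is a topological ordering of the dependence graph, which the following sketch (hypothetical names) computes with Kahn's algorithm in O(V+E) time.

```python
from collections import deque

def update_sequence(dependencies):
    """dependencies: {obj: set of objects it depends on} (all keys present).
    Returns an update order in which every object is updated after the
    objects it depends on; raises if the dependences are cyclic."""
    indegree = {o: len(deps) for o, deps in dependencies.items()}
    dependents = {o: [] for o in dependencies}
    for o, deps in dependencies.items():
        for d in deps:
            dependents[d].append(o)
    queue = deque(o for o, n in indegree.items() if n == 0)
    order = []
    while queue:
        o = queue.popleft()
        order.append(o)
        for dep in dependents[o]:
            indegree[dep] -= 1
            if indegree[dep] == 0:
                queue.append(dep)
    if len(order) != len(dependencies):
        raise ValueError("cyclic dependence: no consistent update order")
    return order
```

Any algorithm suitable for large data bases must at least match this linear complexity while avoiding a full traversal when only a few objects were modified.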
The management of resources is an essential task in every construction company. Today, ERP systems and e-business systems are available to assist construction companies in efficiently organising the allocation of their personnel and equipment within the company, but no company can keep idle resources on hand for every single task that has to be performed during a construction project. Companies therefore need an alternative solution that lets them better exploit expensive resources and compensate their fixed costs, while still having them available at the right time for their own business activities. This paper outlines the approach taken by the EU-funded project “e-Sharing” (IST-2001-33325) to support resource management between construction companies. It describes the requirements for the management of construction resources, the core features of the system, and the integration approach. To this end, we outline an integrated resource type model supporting the management and classification of construction equipment, construction tasks and qualification profiles. The development is based on a cross-domain analysis and evaluation of existing models. ...
This paper presents a specific modeling technique focused on preparing planning processes in civil engineering. Planning processes in civil engineering are characterized by peculiarities such that the sequence of planning tasks needs to be determined anew for each planning project. Neither the use of optimized partial processes nor the use of less detailed, optimized processes guarantees an optimal overall planning process. The modeling technique takes these peculiarities into account. In a first step, it focuses on the logic of the planning process; algorithms based on graph theory determine that logic. This approach ensures consistency and logical correctness of the description of a planning process early in its preparation phase. Sets of data – the products of engineers such as technical drawings, technical models, reports, or specifications – form the core of the presented modeling technique. The production of these sets of data requires time and money, which is expressed by a specific weighting of each set of data. The introduction of these weights allows efficient progress measurement and controlling of a planning project. For this purpose, a link between the modeling technique used in the preparation phase and the execution phase is necessary so that target and actual values are available for controlling purposes. The present paper describes this link. An example illustrates the use of the modeling technique for planning processes in civil engineering projects.
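The weighted progress measurement described above can be sketched in a few lines (the weighting scheme here is an assumption, not the paper's exact definition): each set of data carries a weight reflecting its production effort, and overall progress is the weight-averaged completion of all sets.

```python
def planning_progress(data_sets):
    """data_sets: list of (weight, fraction_complete) pairs, one per
    set of data (drawing, model, report, ...), with weights expressing
    the time and money its production requires.
    Returns the weighted overall progress of the planning project in [0, 1]."""
    total = sum(w for w, _ in data_sets)
    earned = sum(w * f for w, f in data_sets)
    return earned / total if total else 0.0
```

Comparing this actual value against the target value scheduled for the same date gives the controlling signal mentioned in the abstract.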
The design of building projects involves several types of resources such as architects, structural engineers, mechanical engineers, electrical engineers, and draftsmen, among others. For design firms to stay in business in this very competitive market, they need to manage their resources in a way that improves productivity and cost effectiveness. This task, however, is not simple and requires a thorough analysis of process-level operations, resource use, and productivity. Typically, these operational aspects are the responsibility of the design office manager, who assigns available resources to the different design projects to save time and lower design expenses. Only limited studies in the literature have modeled overall organizational operations and behavioral aspects, particularly in firms specialized in the design of building projects. In an effort to simplify the modeling process, a simplified modeling and simulation tool is used in this research. A simulation model representing an actual design office was developed, assuming that the office performs designs for small, medium, and large building projects. The developed model was used to simulate several alternatives and examine various resource assignment strategies. The simulation was conducted over ten years, and the resulting productivity and income were measured.
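The abstract does not name the simulation tool, but the kind of analysis it describes can be illustrated with a toy event-based sketch (all parameters hypothetical): projects are handed to the next free engineer, and makespan and utilization are measured as simple stand-ins for productivity.

```python
import heapq
import random

def simulate_office(n_engineers, n_projects, mean_duration, seed=0):
    """Toy simulation of a design office: each project goes to the
    engineer who becomes free first; project durations are drawn from
    an exponential distribution with the given mean.
    Returns (makespan, average engineer utilization)."""
    rng = random.Random(seed)
    free_at = [0.0] * n_engineers       # time each engineer becomes free
    heapq.heapify(free_at)
    busy = 0.0
    for _ in range(n_projects):
        start = heapq.heappop(free_at)  # next free engineer
        duration = rng.expovariate(1.0 / mean_duration)
        busy += duration
        heapq.heappush(free_at, start + duration)
    makespan = max(free_at)
    if makespan == 0:
        return 0.0, 0.0
    return makespan, busy / (n_engineers * makespan)
```

Varying `n_engineers` against a fixed project load is the simplest form of the resource assignment comparison the study performs at much greater fidelity.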
The growing competitive pressure in the building industry increases the demands on the design and construction processes with respect to economical, technical and time aspects. These demands require efficient improvements of the value-added chain, which can be realized mainly through the use of innovative information and communication technologies. To support the collaboration of all participants involved in a building project, the workflow management system “BauKom-Online” has been developed. The focus of the system is to support the coordination of the participants and their information exchange. Such a software method is well suited to ensure a high-quality planning process. The modelling of business processes enables a better self-comprehension of the participants' work and helps to enhance the project performance. The system architecture of BauKom-Online contains two basic components: the process-modelling tool and the workflow engine. The process model consists of activities and states of the planning and construction processes and their relations. These connected processes compose the workflow. Such a process model for engineering purposes has to satisfy several needs, e.g., the consideration of planning and building alternatives, dynamic changes of the model during execution of the project, and the linkage to further technical objects like costs, building structure, specifications and document management. Furthermore, the scheduling of the project can be done within the process model and can be visualized as a Gantt diagram. ...
Applications for civil engineering tasks usually contain graphical user interfaces for the engineering processes, while the persistent objects of the applications are stored in data bases. This paper investigates how the interaction between a graphical user interface and a data base influences the development of a civil engineering application. A graphic application for the linear elastic analysis of plane frames, previously developed with standard tools of the Java platform, is compared to a redesigned implementation using a generalized data base for persistent objects. The investigation leads to the following results: a strict distinction between persistent and transient objects influences the class structure of an application, in particular that of the graphical user interface; the structure of an application depends on the logic for updating references to persistent and transient graphical objects after an application is read from a file; and the complexity of the reference management can usually be handled better by just-in-time referencing based on String identifiers than by automated referencing based on Name identifiers.
Integrated Engineering Workflow focused on the Structural Engineering in the Industrial Environment
(2004)
The engineering and construction industry has been slow to exploit the full potential of information technology. The industry is highly fragmented, price sensitive and risk-averse, and profit margins are small. Each project is unique, with few opportunities to carry technological innovation over from one project to the next. The technological innovations that have taken place merely simulate the old traditional paper workflow: engineering information in digital form is conveyed using traditional paper representations, which have to be interpreted by humans before the information can be used in other applications, thereby creating ‘islands of information’. Poorly implemented IT strategies can thus be seen to duplicate paperwork rather than reduce or eliminate it (Crowley et al., 2000). This paper introduces the Integrated Engineering Workflow (IEW) concept to re-organise a structural discipline working on multi-disciplinary projects so as to maximise the advantages offered by new information technology.
As a result of the pilot project “Grundwasser-Online”, the supervision and active controlling of large monitoring and catchment areas are realised by the cooperative integration of all related institutions, a synchronisation process that combines all distributed data into one central server database, and a high-level eGovernment service that provides evaluated information over the internet. Based on this software system, the local authorities are able to supervise the groundwater levels and to make adequate decisions, which finally result in official permissions for the usage of groundwater reservoirs by the water supply companies.
Available construction time-cost trade-off analysis models can be used to generate trade-offs between these two important objectives; however, their application is limited in large-scale construction projects due to their impractical computational requirements. This paper presents the development of a scalable, multi-objective genetic algorithm that provides the capability of simultaneously optimizing the construction time and cost of large-scale construction projects. The genetic algorithm was implemented in a distributed computing environment that utilizes a recent standard for parallel and distributed programming, the Message Passing Interface (MPI). The performance of the model is evaluated using a set of performance measures, and the results demonstrate the capability of the present model to significantly reduce the computational time required to optimize large-scale construction projects.
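The core idea of a multi-objective time-cost GA can be sketched serially (the paper's actual algorithm, MPI distribution, and project network handling are far richer; the sketch below is an assumption that simply sums activity durations and uses naive Pareto selection): each activity has several execution modes with different duration and cost, a chromosome picks one mode per activity, and selection keeps the nondominated solutions.

```python
import random

def pareto_front(population, evaluate):
    """Return the nondominated (individual, (time, cost)) pairs."""
    scored = [(ind, evaluate(ind)) for ind in population]
    front = []
    for ind, (t, c) in scored:
        dominated = any(t2 <= t and c2 <= c and (t2, c2) != (t, c)
                        for _, (t2, c2) in scored)
        if not dominated:
            front.append((ind, (t, c)))
    return front

def time_cost_ga(modes, generations=10, pop_size=30, seed=0):
    """modes: per activity, a list of (duration, cost) options.
    Evolves mode selections toward the time-cost Pareto front,
    treating activities as serial (a deliberate simplification).
    Returns the sorted distinct nondominated objective pairs."""
    rng = random.Random(seed)

    def evaluate(ind):
        t = sum(modes[i][m][0] for i, m in enumerate(ind))
        c = sum(modes[i][m][1] for i, m in enumerate(ind))
        return t, c

    pop = [[rng.randrange(len(m)) for m in modes] for _ in range(pop_size)]
    for _ in range(generations):
        parents = [ind for ind, _ in pareto_front(pop, evaluate)]
        children = []
        while len(children) < pop_size:
            p1, p2 = rng.choice(parents), rng.choice(parents)
            cut = rng.randrange(1, len(modes)) if len(modes) > 1 else 0
            child = p1[:cut] + p2[cut:]          # one-point crossover
            i = rng.randrange(len(modes))        # single-gene mutation
            child[i] = rng.randrange(len(modes[i]))
            children.append(child)
        pop = children + parents                 # elitism: keep the front
    return sorted({obj for _, obj in pareto_front(pop, evaluate)})
```

In a distributed MPI setting, the expensive step — evaluating the population — is what gets partitioned across processes, which is where the reported speed-up for large projects comes from.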
Cost and Schedule Controlling in Relation to Liquidity Management during Construction Projects
(2004)
The present paper describes a software application which can be used for relating the scheduled events of a construction project with the respective financial parameters, leading to an overall improvement in general controlling and liquidity management. For this purpose, existing construction schedules are taken and details of the assignment are recorded. Thus it becomes possible to assess a future payment status should changes in the designated schedule occur.
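The linking of scheduled events to financial parameters can be illustrated with a minimal sketch (names and the flat day-shift model are assumptions, not the application's actual data model): each milestone carries a payment amount, and shifting the schedule moves the payments, so the payment status at any cut-off date can be reassessed.

```python
from datetime import date, timedelta

def payment_status(events, shift_days=0, as_of=None):
    """events: list of (milestone_date, amount) pairs from the schedule.
    Shifting the whole schedule by `shift_days` moves every payment;
    returns the cumulative amount due up to and including `as_of`."""
    if as_of is None:
        as_of = date.today()
    delta = timedelta(days=shift_days)
    return sum(amount for d, amount in events if d + delta <= as_of)
```

Comparing the status for `shift_days=0` with a delayed scenario shows how a schedule slip changes the liquidity situation at a given date.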
The methods currently used for scheduling building processes have major advantages as well as disadvantages. The main advantages are the arrangement of the tasks of a project in a clear, easily readable form and the calculation of valuable information such as critical paths. The main disadvantage, on the other hand, is the inflexibility of the model caused by its modeling paradigms: small changes to the modeled information strongly influence the whole model and force many more details of the plan to be changed. In this article an approach is introduced that allows the creation of more flexible schedules. It aims at a more robust model that reduces the need to change more than a few pieces of information, while still being able to calculate the important results of the known models and leading to further valuable conclusions.
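The "valuable information like critical paths" of the classical models refers to the critical path method; a minimal sketch of its forward and backward pass (hypothetical data layout) shows what any more flexible replacement model still has to be able to compute.

```python
def critical_path(tasks):
    """tasks: {name: (duration, [predecessor names])}, acyclic.
    Classic CPM forward/backward pass; returns the project duration
    and the set of critical tasks (those with zero total float)."""
    # topological order via depth-first search over predecessors
    order, seen = [], set()
    def visit(t):
        if t in seen:
            return
        seen.add(t)
        for p in tasks[t][1]:
            visit(p)
        order.append(t)
    for t in tasks:
        visit(t)
    # forward pass: earliest start / earliest finish
    es, ef = {}, {}
    for t in order:
        dur, preds = tasks[t]
        es[t] = max((ef[p] for p in preds), default=0)
        ef[t] = es[t] + dur
    duration = max(ef.values())
    # backward pass: latest finish
    lf = {t: duration for t in tasks}
    for t in reversed(order):
        for p in tasks[t][1]:
            lf[p] = min(lf[p], lf[t] - tasks[t][0])
    critical = {t for t in tasks if lf[t] - tasks[t][0] == es[t]}
    return duration, critical
```

The inflexibility criticised in the article stems from the fixed predecessor lists in exactly such a model: changing one dependency invalidates both passes for every downstream task.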