This contribution presents the software design and implementation of an analysis server for computing failure probabilities in structural engineering. The structures considered are described by an equivalent finite element model; their stochastic properties, such as the scatter of the material behavior or of the incoming load, are represented by suitable random variables. Within the software framework, a client-server architecture has been implemented, employing the middleware CORBA for communication between the distributed modules. The analysis server computes failure probabilities for stochastically defined structures. To this end, several approximation methods (FORM, SORM) and simulation methods (Monte Carlo simulation and importance sampling) have been implemented. The paper closes with several examples computed on the analysis server.
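The simulation side of such an analysis server can be illustrated with a minimal crude Monte Carlo sketch. The limit-state function g = R − S and the distribution parameters below are invented for illustration and are not taken from the paper:

```python
import random

def failure_probability(limit_state, sample, n=200_000, seed=1):
    """Crude Monte Carlo estimate of Pf = P[g(X) <= 0]."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(*sample(rng)) <= 0)
    return failures / n

# Hypothetical limit state g = R - S: resistance R and load S,
# both modelled as normal random variables (values invented).
def sample(rng):
    return rng.gauss(5.0, 0.5), rng.gauss(3.0, 0.8)

pf = failure_probability(lambda r, s: r - s, sample)
# Analytically, R - S ~ N(2.0, 0.89), so Pf is roughly 0.017.
```

FORM/SORM would instead approximate the limit-state surface around the design point; the sampling estimator above is the baseline those methods are compared against.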
During the fundamental investigation phase of a building project, a large number of influencing factors and boundary conditions must be examined in order to provide the prerequisites for subsequent planning processes. These investigative tasks often demand considerable time and money, because no standardized workflows or interfaces provide efficient access to the information related to a specific construction site. Both human and natural circumstances must be taken into account in the fundamental investigation; in this project, two examples were therefore chosen to demonstrate a holistic approach to the integration and provision of georeferenced information. The internet site http://www.grundlagenermittlung.de was designed to support architects and civil engineers efficiently in the early planning phases of a building project. It offers web-based services with dynamic interfaces for the flexible search and collection of information concerning the building site. To this end, a central metadatabase server for description, discovery and integration has been established, which enables the registration of georeferenced services and the redirection of incoming requests to other, distributed data pools. Using this Yellow-Page concept in combination with the underlying metadata, based on the ISO 19115 standard, an efficient investigation of geographical and environmental information becomes possible.
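The Yellow-Page idea, a central registry that redirects requests to distributed georeferenced data pools, can be sketched as follows. Service names, URLs and bounding boxes are invented; a real registry would hold full ISO 19115 metadata records rather than a keyword set:

```python
# Minimal registry sketch: services register a keyword set and a
# bounding box (min_lon, min_lat, max_lon, max_lat); discovery
# redirects a query to all matching distributed data pools.
registry = []

def register(name, url, keywords, bbox):
    registry.append({"name": name, "url": url,
                     "keywords": set(keywords), "bbox": bbox})

def discover(keyword, lon, lat):
    return [s["url"] for s in registry
            if keyword in s["keywords"]
            and s["bbox"][0] <= lon <= s["bbox"][2]
            and s["bbox"][1] <= lat <= s["bbox"][3]]

register("soil-survey", "http://services.example/soil",
         ["soil", "geology"], (6.0, 50.0, 8.0, 52.0))
register("flood-zones", "http://services.example/flood",
         ["water", "flood"], (6.0, 50.0, 8.0, 52.0))
hits = discover("soil", 7.1, 51.3)   # -> ["http://services.example/soil"]
```

The registry answers only "who can serve this request"; the actual data stays in the distributed pools, which is the point of the Yellow-Page pattern.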
This paper describes two new truss structures based on fractal geometry. One is the famous Sierpinski gasket; the other is a fractal triangle derived by applying, via M. F. Barnsley's contraction-mapping theory, a process that forms the leaves of a cedar tree. Consequently, the x-y coordinates of an arbitrary nodal point on these structures are easily generated once the IFS (Iterated Function System) codes and a scale are specified. Structural members are defined similarly. Thus the data for frame analysis can be generated automatically, which is significant when the structure has a complex configuration. Analytical results under the vertical and wind loadings of the Japanese Building Code are then shown; members are assumed to be timber with a 15 cm × 15 cm cross-section. Finally, the authors conclude that geometrically new truss structures were developed and that automatic data generation for frame analysis was achieved using IFS. The analytical results show that, compared with a king-post truss, these structures save material.
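As a sketch of the node-generation idea: for the Sierpinski gasket the IFS consists of three contraction maps, each halving the distance to one vertex of a triangle, and iterating them on the vertex set yields the nodal coordinates at any refinement depth. This is a minimal illustration, not the authors' implementation:

```python
# The three IFS contraction maps of the Sierpinski gasket: each halves
# the distance to one vertex of an equilateral triangle.
def sierpinski_nodes(depth):
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 3 ** 0.5 / 2)]
    maps = [lambda p, v=v: ((p[0] + v[0]) / 2, (p[1] + v[1]) / 2)
            for v in vertices]
    nodes = set(vertices)
    for _ in range(depth):
        nodes = {f(p) for f in maps for p in nodes}
    return nodes

# Node counts follow 3 * (3**n + 1) / 2: 3, 6, 15, 42, ...
```

Because each map is affine, member connectivity can be generated by the same recursion, which is what makes fully automatic input generation for the frame analysis possible.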
The processes in the life cycle of buildings are characterised by a high degree of teamwork. Integrating all the distributed participants by providing an environment that specifically supports communication and collaboration between the actors is a fundamental step towards improving the efficiency of the processes involved and reducing total costs. In this article, a link-based modelling approach and its "intelligent" link management are introduced (1). This approach realises an integration environment based on a special building model that acts as a decision support system. Link-based modelling is characterised by the definition and specialisation of links between partial models. These intelligently managed links enable very flexible, task-specific data access and exchange between the different views and partial models of the participants.
This paper describes an Internet-enabled software model that can facilitate the development and utilization of nonlinear structural analysis programs. The software model gives users easy access to the analysis core program and the analysis results via a web browser or other application programs. In addition, new and legacy codes can be incorporated as distributed services and integrated into the software framework from disparate sites. A distributed project management system, taking advantage of Internet and database technologies, has been implemented to store and manage model information and simulation results. Nonlinear dynamic analyses and simulations of a bridge structure are performed to illustrate the facilities of the Internet-enabled software model.
At the start of the conceptual design process, designers begin to give tangible form to their thoughts by sketching. This helps with reasoning and communicates ideas to other members of the team. Sketches are gradually worked up into more formal drawings, which are then passed on to the later stages of the design process. There are, however, some problems with basing early ideas on sketching. For example, owing to their ad hoc nature, sketches tend to be only diagrammatic representations, so designers cannot be sure that their ideas are feasible and that what is being proposed meets the constraints described in the client brief. This can result in designers wasting time working up ideas that prove to be unsuitable. Also, the process of constraint checking is complex and time-consuming, so designers tend to limit their search of possible options and settle for satisfactory rather than good solutions. This paper describes the INTEGRA project, which examines the role of sketching in early conceptual design and how it can be linked to other aspects of the process, in particular automated constraint checking using an IT-based approach. The focus of the work is the design of framed buildings. A multi-disciplinary approach has been adopted, and the work has been undertaken in close collaboration with practising designers and clients.
The optimization of continuous structures requires careful attention to discretization errors. Compared with an ordinary low-order formulation (h-elements) combined with adaptive mesh refinement in each optimization step, high-order finite elements (so-called p-elements) have several advantages. However, compared with the h-method, a higher-order finite element analysis program poses higher demands from a software engineering point of view. In this article, the basics of an object-oriented higher-order finite element system especially tailored to use in structural optimization are presented. Besides the design of the system, aspects related to the implementation language, Java, are discussed.
In current AEC practice, client requirements are typically recorded in a building program which, depending on the building type, covers various aspects, from overall goals, activities and spatial needs down to very detailed material and condition requirements. This documentation is used as the starting point of the design process, but as the design progresses it is usually set aside, and changes are made incrementally based on the previous design solution. These small incremental changes can lead to a solution that no longer meets the original requirements. In addition, design is by nature an iterative process, and the proposed solutions often also cause the client requirements to evolve. However, the requirements documentation is usually not updated accordingly. Finding the latest updates and the evolution of the requirements in the documentation is very difficult, if not impossible. This process can lead to an end result that is significantly different from the documented requirements. Some important requirements may not be satisfied, and even if the design process was based on agreed-upon changes in scope and requirements, differences between the requirements documents and the completed building can lead to well-justified doubts about the quality of the design and construction process...
Spatial data acquisition, integration, and modeling for real-time project life-cycle applications (2004)
Current methods for site modeling employ expensive laser range scanners that produce dense point clouds requiring hours or days of post-processing to arrive at a finished model. While these methods produce very detailed models of the scanned scene, useful for obtaining as-built drawings of existing structures, the associated computational burden precludes their use onsite for real-time decision-making. Moreover, many project life-cycle applications do not need detailed models of objects. Results of earlier research by the authors demonstrated novel, highly economical methods that reduce data acquisition time and the need for computationally intensive processing. These methods enable complete local-area modeling on the order of a minute, with sufficient accuracy for applications such as advanced equipment control, simple as-built site modeling, and real-time safety monitoring for construction equipment. This paper describes a research project investigating novel ways of acquiring, integrating, modeling, and analyzing project site spatial data that do not rely on dense, expensive laser scanning technology and that enable scalability and robustness for real-time field deployment. Algorithms and methods for modeling objects of simple geometric shape (geometric primitives) from a limited number of range points are presented; these provide a foundation for the further development required to address more complex site situations, especially those involving dynamic site information (motion of personnel and equipment). Field experiments are being conducted to establish performance parameters and to validate the proposed methods and models. Initial experimental work has demonstrated the feasibility of this approach.
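One of the simplest instances of fitting a geometric primitive from very few range points is recovering a plane from three of them. This is a generic illustration of the idea, not the project's own algorithm:

```python
def plane_from_points(p1, p2, p3):
    """Plane through three non-collinear range points, returned as
    (unit_normal, d) with unit_normal . x = d for points x on the plane."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],      # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(c * c for c in n) ** 0.5
    n = [c / length for c in n]
    return n, sum(n[i] * p1[i] for i in range(3))

normal, d = plane_from_points((0, 0, 1), (1, 0, 1), (0, 1, 1))
# horizontal plane z = 1: normal (0, 0, 1), d = 1
```

With a handful more points, the same primitive can be fitted by least squares, which is what makes sparse-point modeling orders of magnitude cheaper than processing a dense scan.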
Information science researchers and developers have spent many years addressing the problem of retrieving exactly the information needed and using it for analysis purposes. In information-seeking dialogues, the user, e.g. a construction project manager or supplier, often asks questions about specific aspects of the tasks they want to perform, but most of the time it is difficult for software systems to understand their overall intentions unambiguously. The existence of information tunnels (Tannenbaum 2002) aggravates this phenomenon. This study includes a detailed case study of the material management process in the construction industry. Based on this case study, the structure of a formal user model for information retrieval in construction management is proposed. This prototype user model will be incorporated into the system design for construction information management and retrieval. The information retrieval system is a user-centered product based on a user-configurable visitor mechanism for managing and retrieving project information without worrying too much about the underlying data structure of the database system. An executable UML model combined with an OODB is used to reduce ambiguity in the user's intentions and to achieve user satisfaction.
For planning in existing built contexts, the building survey is the starting point for initial planning proposals, for the diagnosis and documentation of building damage, for the creation of objectives catalogues, for the detailed design of renovation and conversion measures, and for ensuring fulfilment of building legislation, particularly in cases of change of use and refitting. An examination of currently available IT tools shows insufficient support for planning within existing contexts, most notably a deficit with regard to information capture and administration. This paper discusses the concept of a modular surveying system (basic concept, separation of geometry from semantic data, and separation into sub-systems) and the prototypical realisation of a system for the complete support of the entire building surveying process for existing buildings. The project aims to contribute to the development of a planning system for existing buildings. ...
The increased implementation of site data capture technologies invariably results in increased use of data warehousing and database technologies to store the captured data. However, restricted use of the data beyond the initial application can result in a loss of understanding of site processes, which could in turn lead to poor decision-making at the production, tactical and strategic levels. Concrete usage data have been collected from two piling processes. These data have been analysed, and the results highlight potential improvements that could be made to existing site management and estimating processes. A cost-benefit analysis has been used to support decision-making at the strategic level, where the identified improvements require capital expenditure.
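A capital-expenditure decision of this kind is commonly reduced to a net-present-value comparison of discounted costs and benefits; a minimal sketch, with the discount rate and cash flows invented for illustration:

```python
def npv(rate, cashflows):
    """Net present value of yearly net cash flows, cashflows[0] at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical: 100k capital outlay now, 30k yearly savings for 5 years.
value = npv(0.08, [-100_000] + [30_000] * 5)
# positive NPV -> the improvement pays for itself at an 8% discount rate
```

The sign of the NPV at the firm's chosen discount rate is then the strategic-level decision criterion.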
This paper describes ongoing research on representing and reasoning about construction specifications, part of a larger research project that aims at developing a formalism for automating the identification of deviations and defects on construction sites. We specifically describe the requirements on product and process models and an approach for representing and reasoning about construction specifications to enable automated detection and assessment of construction deviations and defects. This research builds on previous research on modeling design specifications and extends and elaborates the concept of contexts developed in that domain. The paper provides an overview of how construction specifications are being modeled in this research and points out the future steps needed to develop the envisioned automated deviation and defect detection system.
The process of matching data represented in two different data models is a longstanding issue in the exchange of data between software systems. While the traditional manual matching approach cannot meet today's demands on data exchange, research shows that a fully automated generic approach to model matching is unlikely, and generic semi-automated approaches are not easy to implement. In this paper, we present an approach that focuses on matching data models in a specific domain. The approach combines a basic model-matching approach with a version-matching approach to deduce new matching rules that enable data transfer between two evolving data models.
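The core deduction step, combining model-matching rules from model A to model B with version-matching rules from B to its new version B', can be sketched as a composition of the two rule sets. The attribute names below are invented for illustration:

```python
def compose_matches(model_match, version_match):
    """Deduce A -> B' matching rules from A -> B model matches and
    B -> B' version matches; rules whose target disappeared are dropped."""
    return {a: version_match[b]
            for a, b in model_match.items() if b in version_match}

model_match = {"Wall.height": "WallHeight", "Wall.width": "WallWidth"}
version_match = {"WallHeight": "Wall.Height", "Door.Width": "Door.W"}
rules = compose_matches(model_match, version_match)
# -> {"Wall.height": "Wall.Height"}; "Wall.width" is dropped because
#    "WallWidth" no longer exists in the new model version
```

Real matching rules carry type conversions and structural transformations rather than bare name pairs, but the composition principle is the same.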
The paper describes further developments of the interactive evolutionary design concept relating to the emergence of mutually inclusive regions of high-performance design solutions. These solutions are generated from cluster-oriented genetic algorithm (COGA) output and relate to a number of objectives introduced during the preliminary design of military airframes. Data-mining of multi-objective COGA (moCOGA) output further defines these regions through the application of clustering algorithms, data reduction, and variable attribute relevance analyses. A number of visual representations of the COGA output, projected onto both variable and objective space, are presented. The multi-objective output of the COGA is compared with output from the Strength Pareto Evolutionary Algorithm (SPEA-II) to illustrate the manner in which moCOGAs can generate good approximations to Pareto frontiers.
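The Pareto-frontier comparison rests on the standard dominance relation; a minimal non-dominated filter for minimization objectives, as a generic sketch rather than SPEA-II itself:

```python
def pareto_front(points):
    """Non-dominated subset for minimization: p dominates q if p is
    no worse in every objective and strictly better in at least one."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

front = pareto_front([(1, 4), (2, 2), (4, 1), (3, 3), (2, 5)])
# -> [(1, 4), (2, 2), (4, 1)]
```

SPEA-II additionally assigns fitness from dominance strength and density to steer the search, but the set it approximates is exactly this non-dominated front.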
The paper presents a general map-based approach to the prototyping of products in virtual reality environments. Virtual prototyping of products is considered a consistent simulation and visualization process that maps the source product model into its target visual representations. The approach makes it possible to formally interrelate the product and visual information models by defining mapping rules, to specify a prototyping scenario as a composition of map instances, and then to explore particular product models in virtual reality environments by interpreting the composed scenario. Once realised, the proposed approach provides a strongly formalized method and a common software framework for building virtual prototyping applications. As a result, the applications gain in expressiveness, reusability and reliability, as well as taking on additional runtime flexibility...