10th International Conference on Computing in Civil and Building Engineering (ICCCBE-X), Weimar, 2004
The optimization of continuous structures requires careful attention to discretization errors. Compared to ordinary low-order formulations (h-elements) combined with adaptive mesh refinement in each optimization step, the use of high-order finite elements (so-called p-elements) has several advantages. However, compared to the h-method, a higher-order finite element analysis program poses higher demands from a software engineering point of view. In this article, the basics of an object-oriented higher-order finite element system especially tailored to use in structural optimization are presented. Besides the design of the system, aspects related to the employed implementation language, Java, are discussed.
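The distinguishing feature of p-elements is that accuracy is raised by increasing the polynomial degree rather than refining the mesh. As background, one common construction (the Szabó–Babuška hierarchic shape functions built from integrated Legendre polynomials; not necessarily the one used in the system described) can be sketched as follows — modes i ≥ 3 vanish at both element ends, so raising p adds functions without disturbing existing ones:

```python
import math

# Hierarchic 1-D shape functions for p-elements (Szabo-Babuska form).
# Illustrative background only; the abstract does not specify the
# shape functions actually implemented in the system.

def legendre(n, x):
    """Legendre polynomial P_n(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def shape(i, xi):
    """Hierarchic shape function N_i on the reference element [-1, 1]."""
    if i == 1:                      # linear nodal mode, left end
        return 0.5 * (1.0 - xi)
    if i == 2:                      # linear nodal mode, right end
        return 0.5 * (1.0 + xi)
    # Internal modes from integrated Legendre polynomials: these are
    # zero at xi = -1 and xi = +1 for every i >= 3.
    return (legendre(i - 1, xi) - legendre(i - 3, xi)) / math.sqrt(2 * (2 * i - 3))
```

Because the internal modes vanish at the element boundaries, previously computed stiffness contributions stay valid when the polynomial order is increased, which is what makes p-refinement attractive inside an optimization loop.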
At the start of the conceptual design process, designers begin to give tangible form to their thoughts by sketching. This helps with reasoning and communicates ideas to other members of the team. Sketches are gradually worked up into more formal drawings, which are then passed to the other stages of the design process. There are, however, some problems with basing early ideas on sketching. For example, due to their ad-hoc nature, sketches tend to be only diagrammatic representations, so designers cannot be sure that their ideas are feasible or that what is being proposed meets the constraints described in the client brief. This can result in designers wasting time working up ideas which prove to be unsuitable. Also, the process of constraint checking is complex and time consuming, so designers tend to limit their search of possible options and choose satisficing rather than good solutions. This paper describes the INTEGRA project, which examines the role of sketching in early conceptual design and how it can be linked to other aspects of the process, particularly automated constraint checking using an IT-based approach. The focus of the work is the design of framed buildings. A multi-disciplinary approach has been adopted, and the work has been undertaken in close collaboration with practising designers and clients.
This paper describes an Internet-enabled software model that could facilitate the development and utilization of nonlinear structural analysis programs. The software model allows users easy access to the analysis core program and the analysis results by using a web browser or other application programs. In addition, new and legacy codes can be incorporated as distributed services and integrated with the software framework from disparate sites. A distributed project management system, taking advantage of Internet and database technologies, is implemented to store and manage model information and simulation results. Nonlinear dynamic analyses and simulations of a bridge structure are performed to illustrate the facilities of the Internet-enabled software model.
The processes in the life cycle of buildings are characterised by a high degree of distributed teamwork. Integrating all the distributed participants by providing an environment which specifically supports communication and collaboration between the actors is a fundamental step towards improving the efficiency of the involved processes and reducing total costs. In this article, a link-based modelling approach and its “intelligent” link management are introduced (1). This approach realises an integration environment based on a special building model that acts as a decision support system. Link-based modelling is characterised by the definition and specialisation of links between partial models. These intelligently managed links enable very flexible and task-specific data access and exchange between all the different views and partial models of the participants.
This paper describes two new truss structures based on fractal geometry. One is the famous Sierpinski gasket; the other is a fractal triangle derived by applying a process that forms the leaves of a cedar tree, using M. F. Barnsley’s contraction mapping theory. The x-y coordinates of an arbitrary nodal point on the structures can therefore be generated easily once the IFS (Iterated Function System) codes and a scale are specified. Structural members are defined similarly. Thus, data for frame analysis can be generated automatically, which is significant when the structure has a complex configuration. Next, analytical results under the vertical and wind loadings of the Japanese Building Code are shown. Here, members are assumed to be timber with a cross section of 15 cm × 15 cm. Finally, the authors conclude that geometrically new truss structures were developed and that automatic data generation for frame analysis was attained using IFS. The analytical results show that the structures save material compared with a king-post truss.
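The nodal-point generation described above can be sketched for the Sierpinski gasket case: three affine contractions of ratio 1/2 are applied repeatedly to a seed triangle, and every image of a vertex becomes a node of the truss. The specific maps and the unit-triangle seed below are illustrative assumptions, not the paper's actual IFS codes:

```python
# Deterministic IFS generation of Sierpinski-gasket nodal coordinates.
# Three contraction maps w_i(p) = p/2 + t_i (Barnsley-style) applied to
# the vertices of a unit triangle; the map parameters are assumed for
# illustration, not taken from the paper.

def sierpinski_nodes(depth):
    """Return the set of (x, y) nodal points after `depth` IFS iterations."""
    translations = [(0.0, 0.0), (0.5, 0.0), (0.25, 0.5)]
    points = {(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)}  # seed triangle vertices
    for _ in range(depth):
        points = {
            (x / 2 + tx, y / 2 + ty)
            for (x, y) in points
            for (tx, ty) in translations
        }
    return points
```

One iteration yields the 6 nodes of the three half-scale sub-triangles, two iterations yield 15, and in general level n has (3^(n+1) + 3)/2 nodes; member connectivity can be derived the same way, which is what makes the automatic data generation for frame analysis feasible.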
When establishing the fundamentals of a building project, a huge number of influence factors and boundary conditions have to be investigated in order to provide the prerequisites for further planning processes. These investigative tasks often involve great effort in terms of time and money, because there are no standardized workflows and interfaces providing efficient access to the necessary information for a specific construction site. Both human and natural circumstances have to be taken into account in the fundamental investigation. In this project, two examples have been chosen to demonstrate a holistic approach to the integration and provision of georeferenced information. The internet site http://www.grundlagenermittlung.de has been designed to efficiently support architects and civil engineers in the early planning phases of a building project. It offers web-based services with dynamic interfaces for a flexible search and collection of information concerning the building site. To this end, a central Metadatabase Server for Description, Discovery and Integration has been established, which enables the registration of georeferenced services and the redirection of incoming requests to other, distributed data pools. Using this Yellow-Page concept in combination with the underlying metadata based on the ISO 19115 standard, an efficient investigation of geographical and environmental information becomes possible.
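The Yellow-Page idea — services register themselves with georeferenced metadata, and requests are redirected to matching providers — can be sketched minimally. The field names, the keyword/bounding-box matching rule, and the example URL below are invented for illustration and are far simpler than the ISO 19115 metadata actually used:

```python
# Minimal sketch of a Yellow-Page style service registry: georeferenced
# services register metadata (here just a keyword and a bounding box),
# and incoming requests are redirected to matching providers.

registry = []

def register(url, keyword, bbox):
    """bbox = (min_lon, min_lat, max_lon, max_lat) of the covered area."""
    registry.append({"url": url, "keyword": keyword, "bbox": bbox})

def discover(keyword, lon, lat):
    """Return URLs of registered services matching keyword and location."""
    return [
        s["url"] for s in registry
        if s["keyword"] == keyword
        and s["bbox"][0] <= lon <= s["bbox"][2]
        and s["bbox"][1] <= lat <= s["bbox"][3]
    ]

# Hypothetical provider covering roughly the area of Germany:
register("http://example.org/soil-wms", "soil", (5.9, 47.3, 15.0, 55.1))
hits = discover("soil", 11.3, 50.6)  # a query for a site near Weimar
```

The central server thus never stores the geodata itself; it only answers the discovery query and redirects the client to the distributed data pools, which is the essence of the Description, Discovery and Integration concept.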
In this contribution, the software design and implementation of an analysis server for the computation of failure probabilities in structural engineering are presented. The structures considered are described in terms of an equivalent finite element model; the stochastic properties, such as the scatter of the material behavior or the incoming load, are represented using suitable random variables. Within the software framework, a client-server architecture has been implemented, employing the middleware CORBA for the communication between the distributed modules. The analysis server offers the possibility to compute failure probabilities for stochastically defined structures. To this end, several approximation methods (FORM, SORM) and simulation methods (Monte Carlo simulation and importance sampling) have been implemented. The paper closes with several examples computed on the analysis server.
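Of the methods listed, crude Monte Carlo simulation is the simplest to sketch: the failure probability P_f = P[g(X) ≤ 0] is estimated as the fraction of random samples violating the limit state. The limit state g = R − S with normally distributed resistance R and load S below is a textbook stand-in, not the server's actual structural model:

```python
import random

# Crude Monte Carlo estimate of a failure probability P_f = P[g(X) <= 0].
# Distributions and limit state are assumed for illustration only.

def failure_probability(n_samples, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.gauss(5.0, 1.0)   # resistance: mean 5, std 1 (assumed)
        s = rng.gauss(2.0, 1.0)   # load: mean 2, std 1 (assumed)
        if r - s <= 0.0:          # limit state g = r - s violated -> failure
            failures += 1
    return failures / n_samples
```

For the small failure probabilities typical of structural reliability, crude Monte Carlo needs very many samples, which is why the server also offers importance sampling and the FORM/SORM approximations mentioned in the abstract.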
The problem of data interoperability is now very important. The formal description of construction systems and objects must be based on a model of the construction data domain. XML was selected as the basis of a universal data format, since it ensures a natural hierarchy of objects, flexibility, good layout and extensibility. The language developed by the author is called the Building Object Description Extensible Markup Language (bodXML). With conventional programming methods, the types of all transferred objects must be defined in advance, which limits the application of new types. The recipient software, however, must recognize building objects even if the kind of object is unknown at the outset. The author proposes a set of basic topological and geometric properties that is sufficient for the recognition of the main three-dimensional building constructions with flat edges. Tests of an artificial neural network have shown that the kind of construction, represented as a set of the indicated parameters, is recognized with sufficient confidence.
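The recognition idea — extract topological and geometric properties from an XML object description and feed them to a classifier — can be illustrated with standard XML parsing. The element and attribute names below are invented for illustration; the actual bodXML schema is not reproduced here:

```python
import xml.etree.ElementTree as ET

# Hypothetical bodXML-style fragment: a building object whose kind is
# unknown to the recipient, described only by topological counts and
# geometry. All names are illustrative, not the real bodXML schema.
doc = """<bodObject kind="unknown">
  <topology faces="6" edges="12" vertices="8"/>
  <geometry>
    <box dx="4.0" dy="2.5" dz="3.0"/>
  </geometry>
</bodObject>"""

root = ET.fromstring(doc)
topo = root.find("topology")
# A recipient could feed such counts as a feature vector to a classifier
# (the abstract's neural-network recognizer) even when `kind` is unknown.
features = [int(topo.get(k)) for k in ("faces", "edges", "vertices")]
```

Because the recipient parses generic properties rather than a fixed type catalogue, new kinds of objects can be transferred without both sides agreeing on their type definitions in advance.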
Advances in construction data analysis techniques have provided useful tools to discover explicit knowledge from historical databases, supporting project managers’ decision making. However, in many situations, historical data are extracted and preprocessed for knowledge discovery using time-consuming and problem-specific data preparation solutions, which often results in inefficiencies and inconsistencies. To overcome this problem, we are developing a new data fusion methodology designed to provide timely and consistent access to historical data for efficient and effective management knowledge discovery. The methodology is intended to be a new bridge between historical databases and data analysis techniques, one that shields project managers from complex data preparation solutions and enables them to use discovered knowledge for decision making more conveniently. This paper briefly describes the motivation, the background and the initial results of the ongoing research.
This paper presents a number of technical aspects of one of the most elaborate instrumentation and data acquisition projects ever undertaken in Canada. Confederation Bridge, the longest bridge built over ice-covered seawater, has been equipped with state-of-the-art data acquisition devices and systems as well as data transfer networks. The bridge has provided a fixed surface connection between Prince Edward Island and the Province of New Brunswick in Canada since its opening in 1997. It has a rather long design service life of 100 years, and because of its large size and long span length, its design is not covered by any existing code or standard worldwide. The focus of the paper is to introduce the data acquisition, transfer, processing and management systems. The instrumentation and communications infrastructure and devices are presented in some detail, along with the data processing and management systems and techniques. Teams of engineers and researchers use the collected data to verify the analysis and design assumptions and parameters and to investigate the short-term and long-term behaviour and health of the bridge. The collected data are also used to further research activities in the field of bridge engineering and to advance our knowledge about the behaviour, reliability and durability of such complex structures, their components and materials.