In the construction industry, the coordination of design and planning tasks of a project is usually done in a paper-based way. Subsequent modifications have to be handled manually, and their effects cannot be determined automatically. The approach of specifying a complete process model before project start does not meet the requirements of the construction industry: the effort of specification at the beginning and during the process (modifications) does not justify the use of standard process modelling techniques. This paper presents a new approach in which a complete process model is derived from a core. The core consists of process elements and specific relations between them. Modifications need to be specified in the core only, which reduces the specification effort. The derivation of the complete process is based on graph theory, and graph algorithms are also used to determine the effects of modifications during project work.
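The abstract names graph theory as the mechanism for determining the effects of modifications, but gives no algorithm. As a minimal illustration (the element names and dependency structure are invented, not taken from the paper), a breadth-first traversal over a dependency graph marks every process element reachable from a modified one:

```python
from collections import deque

def affected_elements(dependencies, modified):
    """Breadth-first search over the process graph: every element
    reachable from a modified element is (potentially) affected."""
    affected, queue = set(), deque(modified)
    while queue:
        node = queue.popleft()
        for successor in dependencies.get(node, ()):
            if successor not in affected:
                affected.add(successor)
                queue.append(successor)
    return affected

# Hypothetical core: design feeds structural analysis and HVAC planning;
# structural analysis feeds detailing.
core = {"design": ["structural", "hvac"], "structural": ["detailing"]}
print(affected_elements(core, ["design"]))
```

A change to "structural" alone would, under this sketch, only flag "detailing" for rework.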
An important feature of the 2003 SARS outbreak in Canada, Singapore, and Hong Kong was that many health care workers (HCWs) developed SARS after caring for patients with SARS. This has been ascribed to inadequate or ineffective patient isolation. However, it is difficult for densely populated cities to provide sufficient isolation facilities within a short period of time, which has raised public calls for new strategies in the planning and design of isolation facilities. Considering that SARS or other infectious diseases could seriously damage our society’s development, isolation facilities that can be rapidly and economically constructed with appropriate environmental controls are essential. For this reason, the design team of the Department of Architecture collaborated with a special task force from the Faculty of Medicine, the frontline medical officers treating the SARS patients, to design Rapidly Assembled Isolation Patient Wards. Both architecture and medicine are well-established disciplines, but they have little in common in their modes of knowledge construction and practice, which prompted much intellectual exploration and research interest in conducting this study. The process has provided an important reference for cross-disciplinary studies between the architectural and medical domains.
In the field of civil engineering, the reinforced concrete design course (RC course) involves complicated design procedures and many difficult specifications, so most students regard it as a tough course, and teachers often find the class time insufficient. Teachers of the RC course also spend a lot of time preparing examinations that involve tedious calculations and complicated logical reasoning, and correcting examination papers with partial scoring takes even more of their time. The objective of this research is therefore to design and develop a partial scoring assessment system to meet the needs of engineering design courses such as the RC course. The system can generate test items with variable parameters, supports inference diagnosis of the examinee’s misconceptions, and gives partial scores when grading the examination. In this research, the example test subject is the analysis of a rectangular reinforced concrete beam with a single layer of steel bars.
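The two ingredients named above, parameterized item generation and partial scoring, can be sketched roughly as follows. The beam parameters, solution steps and weights are invented placeholders, not the paper's actual RC beam analysis:

```python
import random

def generate_item(seed):
    """Generate a test item with variable parameters. The dimension
    choices here are illustrative only."""
    rng = random.Random(seed)
    b = rng.choice([250, 300, 350])   # beam width, mm (hypothetical)
    d = rng.choice([450, 500, 550])   # effective depth, mm (hypothetical)
    return {"width_mm": b, "depth_mm": d}

def partial_score(expected_steps, answered_steps, weights):
    """Award partial credit for each correct intermediate step instead of
    grading only the final answer."""
    return sum(w for step, w in zip(expected_steps, weights)
               if step in answered_steps)

# Examinee got the steel area and moment capacity right, missed one step.
total = partial_score(["As", "a", "Mn"], {"As", "Mn"}, [30, 30, 40])
print(total)
```

A real system would additionally map the missed step back to a misconception for the inference diagnosis described in the abstract.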
In this contribution, the design of an analysis environment is presented that supports an analyst in reaching decisions within a gradual collaborative planning process. An analyst may be a project manager, a planner or any other person involved in the planning process. Today, planning processes are managed by several geographically distributed planners and project managers, which further increases their complexity. The consequences of many planning decisions cannot be predicted, in particular because assessing planning progress is not trivial: several viewpoints, which depend on individual perceptions, have to be considered. In the following, methods to realize planning decision support are presented.
Bridge vibration due to traffic loading has been the subject of extensive research in recent decades. Such studies derive solutions for the bridge-vehicle interaction (BVI) and analyze the dynamic responses considering the randomness of the coupled model’s (BVI) input parameters and the randomness of road unevenness. This study goes further and examines, in quantitative measures, the effects of such randomness of input parameters and processes on the variance of the dynamic responses. The input parameters examined in the sensitivity analysis are the stiffness and damping of the vehicle’s suspension system, the axle spacing, and the stiffness and damping of the bridge. The study also examines how the initial excitation of a vehicle affects the influence of the considered input parameters. Variance-based sensitivity analysis is usually applied to deterministic models; the model for this dynamic problem, however, is stochastic because random processes are simulated. Thus, a setting using a joint meta-model is developed: one component for the mean response and another for the dispersion of the response. The joint model is developed within the framework of Generalized Linear Models (GLM). An enhancement of the GLM procedure is suggested and tested; it incorporates Moving Least Squares (MLS) approximation algorithms in the fitting of the mean component of the joint model. The sensitivity analysis is then performed on the joint model developed for the dynamic responses caused by BVI.
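Variance-based sensitivity analysis, as named above, attributes a share of the output variance to each input. The sketch below implements a generic pick-freeze Monte Carlo estimator of first-order Sobol indices for a toy function; the paper instead evaluates such indices on a joint GLM/MLS meta-model, which is not reproduced here:

```python
import random

def sobol_first_order(f, n_vars, n_samples=20000, seed=1):
    """Pick-freeze Monte Carlo estimator of first-order Sobol indices
    for inputs uniform on [0, 1] (generic sketch, not the paper's
    meta-model setting)."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_vars)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_vars)] for _ in range(n_samples)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n_samples
    var = sum((y - mean) ** 2 for y in fA) / n_samples
    indices = []
    for i in range(n_vars):
        # A with column i taken from B: "freeze" every input except x_i
        fAB = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        s_i = sum(yb * (yab - ya)
                  for yb, ya, yab in zip(fB, fA, fAB)) / n_samples / var
        indices.append(s_i)
    return indices

# Toy response dominated by the first input: S1 should be close to 1.
s1, s2 = sobol_first_order(lambda x: x[0] + 0.1 * x[1], n_vars=2)
```

For the real BVI problem the inputs would be suspension stiffness and damping, axle spacing, and bridge stiffness and damping, each with its own distribution.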
Sensor faults can affect the dependability and the accuracy of structural health monitoring (SHM) systems. Recent studies demonstrate that artificial neural networks can be used to detect sensor faults. In this paper, decentralized artificial neural networks (ANNs) are applied for autonomous sensor fault detection. On each sensor node of a wireless SHM system, an ANN is implemented to measure and to process structural response data. Structural response data is predicted by each sensor node based on correlations between adjacent sensor nodes and on redundancies inherent in the SHM system. Evaluating the deviations (or residuals) between measured and predicted data, sensor faults are autonomously detected by the wireless sensor nodes in a fully decentralized manner. A prototype SHM system implemented in this study, which is capable of decentralized autonomous sensor fault detection, is validated in laboratory experiments through simulated sensor faults. Several topologies and modes of operation of the embedded ANNs are investigated with respect to the dependability and the accuracy of the fault detection approach. In conclusion, the prototype SHM system is able to accurately detect sensor faults, demonstrating that neural networks, processing decentralized structural response data, facilitate autonomous fault detection, thus increasing the dependability and the accuracy of structural health monitoring systems.
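The detection principle described above, comparing each node's measurement against a prediction derived from adjacent sensors and flagging large residuals, can be sketched as follows. The ANN predictor of the paper is replaced by a simple neighbor average, and all readings and the topology are invented:

```python
def detect_faults(readings, neighbors, threshold):
    """Flag a sensor as faulty when its measurement deviates from the
    value predicted from adjacent sensors by more than a threshold.
    A neighbor average stands in for the paper's per-node ANN."""
    faults = []
    for node, measured in readings.items():
        predicted = (sum(readings[n] for n in neighbors[node])
                     / len(neighbors[node]))
        if abs(measured - predicted) > threshold:  # residual test
            faults.append(node)
    return faults

# Ring of four sensor nodes; s3 reports an implausible value.
readings = {"s1": 1.0, "s2": 1.1, "s3": 5.0, "s4": 0.9}
neighbors = {"s1": ["s2", "s4"], "s2": ["s1", "s3"],
             "s3": ["s2", "s4"], "s4": ["s3", "s1"]}
print(detect_faults(readings, neighbors, threshold=2.5))
```

Note that the threshold must be chosen with care: a faulty neighbor also corrupts the predictions of healthy nodes, which is one reason the paper evaluates several topologies and modes of operation.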
Building design, realization, operation and refurbishment have to take into account environmental impacts as well as the resulting costs over a long period of time. LCA methods had to be developed specifically for buildings because of their complexity, their long service life and the large number of actors involved. This was realized by integrating life cycle analysis, life cycle costing and building product models into integrated LCA models. However, the use of such models raises difficulties, principally the treatment of uncertainty in LCA models and the lack of experience of practitioners who are not LCA specialists. Answers to these problems are the management of uncertainty and the development of simplified models for building design, construction and operation, which can be achieved by means of experimental designs or Monte Carlo simulation. The paper focuses on how these techniques can be used and on their possibilities and drawbacks, particularly concerning the development of simplified models.
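As an illustration of the Monte Carlo route mentioned above, the sketch below propagates uncertain impact factors through a purely additive LCA model (impact = sum of quantity × factor). The material quantities and factor ranges are invented, not real LCA data:

```python
import random
import statistics

def monte_carlo_impact(quantities, factor_ranges, n=10000, seed=42):
    """Propagate uniform uncertainty in impact factors through a simple
    additive LCA model; returns mean and standard deviation of the
    total impact over n samples."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        totals.append(sum(q * rng.uniform(lo, hi)
                          for q, (lo, hi) in zip(quantities, factor_ranges)))
    return statistics.mean(totals), statistics.stdev(totals)

# Hypothetical building: concrete, steel, insulation quantities (t)
# with uncertain per-tonne impact factors (illustrative units).
mean, sd = monte_carlo_impact([120, 15, 4],
                              [(0.10, 0.14), (1.7, 2.1), (1.0, 1.4)])
print(mean, sd)
```

The resulting spread is exactly the kind of information a simplified model for non-specialist practitioners would need to report alongside the point estimate.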
Advances in construction data analysis techniques have provided useful tools to discover explicit knowledge in historical databases, supporting project managers’ decision making. However, in many situations historical data are extracted and preprocessed for knowledge discovery using time-consuming and problem-specific data preparation solutions, which often results in inefficiencies and inconsistencies. To overcome this problem, we are working on the development of a new data fusion methodology designed to provide timely and consistent access to historical data for efficient and effective management knowledge discovery. The methodology is intended to be a new bridge between historical databases and data analysis techniques, which shields project managers from complex data preparation solutions and enables them to use discovered knowledge for decision making more conveniently. This paper briefly describes the motivation, the background and the initial results of the ongoing research.
This paper presents a number of technical aspects of one of the most elaborate instrumentation and data acquisition projects ever undertaken in Canada. Confederation Bridge, the longest bridge built over ice-covered seawater, has been equipped with state-of-the-art data acquisition devices and systems as well as data transfer networks. The Bridge has provided a fixed surface connection between Prince Edward Island and the Province of New Brunswick in Canada since its opening in 1997 and has an unusually long design service life of 100 years. Because of its large size and long span length, its design is not covered by any existing codes or standards worldwide. The focus of the paper is to introduce the data acquisition, transfer, processing and management systems. The instrumentation and communications infrastructure and devices are presented in some detail, along with the data processing and management systems and techniques. Teams of engineers and researchers use the collected data to verify the analysis and design assumptions and parameters, and to investigate the short-term and long-term behaviour and health of the Bridge. The collected data are also used to further research activities in the field of bridge engineering and to extend our knowledge of the behaviour, reliability and durability of such complex structures, their components and materials.
Critical Stress Assessment in Angle to Gusset Plate Bolted Connection by Simplified FEM Modelling
(2010)
Simplified modelling of friction grip bolted connections between steel members and gusset plates is often applied in engineering practice. The paper deals with the simplification of the pre-tensioned bolt model and the simplification of load transfer within the connection, and investigates the influence on the normal strain (and thus stress) distribution at the critical cross-section. Laboratory tests of bolted connections of single-angle and double-angle members to gusset plates were taken as the basis for the numerical analysis. FE models were created using 1D and 2D elements; angles and gusset plates were modelled with shell elements. Two methods of modelling the friction grip bolting were considered: a bolt-regarding approach with 1D element systems modelling the bolts, and two variants of a bolt-disregarding approach with special constraints over part of the member and gusset plate surfaces in contact: a) constraints over the whole contact area, b) constraints over the area around each bolt shank (“partially tied”). Modelling friction grip bolted connections with simplified bolt modelling can be effective, especially when the analysis concerns the elastic range only. In that case, disregarding the bolts and replacing them with the “partially tied” approach seems more attractive: it is less time-consuming and provides results of similar accuracy compared to analyses using simplified bolt modelling.
Buildings can be divided into various types and described by a huge number of parameters. Within the life cycle of a building, especially during the design and construction phases, many engineers with different points of view, proprietary applications and data formats are involved, and their collaboration is characterised by a high amount of communication. Due to these aspects, a homogeneous building model for all engineers is not feasible. The status quo in civil engineering is the segmentation of the complete model into partial models. Currently, the interdependencies of these partial models are not in the focus of available engineering solutions. This paper addresses the problem of coupling partial models in civil engineering. According to the state of the art, applications and partial models are formulated with the object-oriented method. Although this method directly solves basic communication problems like subclass coupling, it was found that many relevant coupling problems remain to be solved. Therefore, it is necessary to analyse and classify the relevant coupling types in building modelling. Coupling in computer science refers to the relationship between modules and their mutual interaction and can be divided into different coupling types, which differ in the degree to which the coupled modules rely upon each other. This is exemplified by a general reference example from civil engineering. A uniform formulation of coupling patterns is described analogously to design patterns, a common methodology in software engineering. Design patterns are templates describing a general, reusable solution to a commonly occurring problem; a template is independent of the programming language and the operating system. These coupling patterns are selected according to the specific problems of building modelling, and a specific meta-model for coupling problems in civil engineering is introduced.
In our meta-model the coupling patterns are a semantic description of a specific coupling design.
Cost and Schedule Controlling in Relation to Liquidity Management during Construction Projects
(2004)
The present paper describes a software application which can be used for relating the scheduled events of a construction project with the respective financial parameters, leading to an overall improvement in general controlling and liquidity management. For this purpose, existing construction schedules are taken and details of the assignment are recorded. Thus it becomes possible to assess a future payment status should changes in the designated schedule occur.
The contribution introduces an adaptable process model to meet the special requirements of the coordination of planning activities in AEC (Architecture, Engineering, Construction). The process model is based on the concept of Coloured Petri-Nets and uses metainformation to characterize process-relevant information and to enable process-control based on the actual results of the planning.
Practical examples show that cost flow can be improved and that the total amount of money spent in construction and subsequent use can be cut significantly. The calculation is based on spreadsheets, which are nowadays very easy to develop on most PCs. Construction works are a field where the evaluation of cash flow can and should be applied. Decisions about cash flow in construction are decisions with long-term impact and long-term memory: mistakes from the distant past have a massive impact on the present situation and on the far economic future of economic activities. Two approaches exist, the just-in-time (JIT) approach and the life cycle cost (LCC) approach. The calculation example shows the dynamic results for varying production speed as opposed to a stable flow of production over the duration of activities; more sophisticated rescheduling towards an optimal solution might bring extra profit in return. The technologies and organizational processes for industrial buildings, railways and road reconstruction, public utilities and housing developments contain assembly procedures well suited to this purpose, and complicated research, development and innovation projects are also good candidates for these kinds of applications. The money invested in large projects, and public money in general, may be spent more efficiently if an optimal speed strategy can be calculated.
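The spreadsheet calculation described above essentially amounts to a cumulative liquidity column: for each period, the balance carried forward plus inflows minus outflows. A minimal Python equivalent (monthly figures invented for illustration):

```python
def cash_flow_balance(payments_in, payments_out):
    """Cumulative liquidity per period. A negative balance in any period
    signals a need for bridging finance (illustrative sketch)."""
    balance, history = 0.0, []
    for inflow, outflow in zip(payments_in, payments_out):
        balance += inflow - outflow
        history.append(balance)
    return history

# Hypothetical monthly figures (thousand EUR): client payments vs. costs.
print(cash_flow_balance([0, 200, 200, 300], [150, 120, 180, 100]))
```

Rescheduling activities shifts entries between the two rows, which is exactly the lever the optimal speed strategy manipulates.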
Unconstrained models are very often found in the broad spectrum of traffic demand models. In these models, no or only one-sided restrictions influence the choice of the individual. In traffic demand, however, there exist various dependencies of the traffic volume on the specific conditions of the territorial structure potentials. Kirchhoff and Lohse introduced bi- and tri-linearly constrained models to represent these dependencies; in principle, the dependencies are described as hard, elastic and open boundary sum criteria. In this article a model is formulated which departs from these predefined boundary sum criteria and allows a free determination of minimal and maximal boundary sum criteria. An iterative solution algorithm following the FURNESS procedure is presented at the same time. With freely selectable minimal and maximal boundary sum criteria, the modelling transport planner can represent traffic behaviour even better. Furthermore, all common boundary sum criteria can be calculated with this model, so the often necessary and sensible standard and special cases can also be modelled.
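The FURNESS procedure referred to above is iterative proportional fitting: rows and columns of a trip matrix are alternately scaled until the boundary sums are met. The sketch below handles only hard (exactly met) boundary sums; the elastic and open criteria, and the freely selectable minima and maxima of the article, are not reproduced:

```python
def furness(matrix, row_targets, col_targets, iterations=50):
    """Iterative proportional fitting (FURNESS procedure): alternately
    scale rows and columns of a trip matrix toward the target
    boundary sums."""
    m = [row[:] for row in matrix]
    for _ in range(iterations):
        for i, target in enumerate(row_targets):      # fit row sums
            s = sum(m[i])
            m[i] = [v * target / s for v in m[i]]
        for j, target in enumerate(col_targets):      # fit column sums
            s = sum(row[j] for row in m)
            for row in m:
                row[j] *= target / s
    return m

# Uniform seed matrix, origin totals [60, 40], destination totals [70, 30].
trips = furness([[1.0, 1.0], [1.0, 1.0]], [60, 40], [70, 30])
```

With a uniform seed the fitted matrix is simply the outer product of the marginals divided by the grand total; a structured seed matrix preserves its interaction pattern.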
The analysis of the response of complex structural systems requires describing the material constitutive relations by means of an appropriate material model. The level of abstraction of such a model may strongly affect the quality of the prognosis for the whole structure. It is therefore necessary to describe the material as exactly, but also as simply, as possible. All phenomena of crystalline materials such as steel that affect the behavior of a structure rely on physical effects interacting over spatial scales from the subatomic to the macroscopic range. Nevertheless, if the material is microscopically heterogeneous, phenomenological models may be appropriate for the purposes of civil engineering. Although constantly applied, such models are insufficient for steel with microscopic characteristics like texture, which typically occurs in hot-rolled steel members or in the heat-affected zones of welded joints. Texture manifests itself in crystalline materials as a regular crystallographic structure and crystallite orientation, influencing macroscopic material properties. The analysis of the structural response of material with texture (e.g. rolled steel or the heat-affected zone of a welded joint) therefore requires extending the phenomenological macroscopic material description with microscopic information. This paper introduces an enrichment approach for material models based on a hierarchical multiscale methodology: the grain texture is described on a mesoscopic scale and coupled with the macroscopic constitutive relations by means of homogenization. Because a variety of homogenization methods is available, the question of assessing the coupling quality arises. The applicability of the method and the effect of the coupling method on the reliability of the response are presented with an example.
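The simplest homogenization estimates, and a useful sanity check for any more refined coupling scheme, are the Voigt (uniform strain) and Reuss (uniform stress) bounds on the effective stiffness. The paper's texture-aware coupling is far more elaborate; the sketch below only illustrates the idea of averaging over constituents, with invented phase data:

```python
def voigt_reuss_bounds(fractions, moduli):
    """Upper (Voigt, arithmetic mean) and lower (Reuss, harmonic mean)
    bounds on the effective elastic modulus of a multi-phase material."""
    voigt = sum(f * e for f, e in zip(fractions, moduli))
    reuss = 1.0 / sum(f / e for f, e in zip(fractions, moduli))
    return voigt, reuss

# Hypothetical two-phase aggregate: 50/50 mix of 100 and 300 (e.g. GPa).
v, r = voigt_reuss_bounds([0.5, 0.5], [100.0, 300.0])
print(v, r)
```

Any admissible homogenization result must fall between the two values, which makes the bounds a cheap plausibility test for the coupling quality discussed in the abstract.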
This work presents a concept of interactive machine learning in a human design process. An urban design problem is viewed as a multiple-criteria optimization problem. A distinguishing feature of an urban design problem is the dependence of the design goal on the context of the problem. We model the design goal as a randomized fitness measure that depends on the context. In terms of multiple-criteria decision analysis (MCDA), the defined measure corresponds to the subjective expected utility of a user. In the first stage of the proposed approach, the algorithm explores the design space using clustering techniques. The second stage is an interactive design loop: the user makes a proposal, then the program optimizes it, gets the user’s feedback and returns control over the application interface.
Poland is not situated in a seismic region of the earth; however, there are still areas where underground mining is conducted, and in these areas so-called 'paraseismic tremors' are very frequent phenomena. When a building is examined in order to assess its safety, a complete analysis is necessary, in which the influence of tremors should be included. To decide whether a building is able to carry dynamic loads, it is necessary to compute its dynamic characteristics, i.e. its natural frequencies, which is not possible with standard techniques. After an in-situ diagnosis of a building by an expert, computer techniques together with specialized software for dynamic, static and strength analyses become a suitable tool. In this paper special attention is paid to a typical twelve-storey WGP (Wroclaw Great Plate) prefabricated building with a special type of joints. During dynamic actions these joints have a decisive influence on the building's behavior; paraseismic tremors are especially dangerous for these buildings and can cause pre-failure states. Laboratory investigations of part of a building or of a separate joint can be difficult and very expensive; therefore computer modeling was used to investigate the behavior of such elements and of whole buildings under different kinds of loads.
This paper proposes the application of a two-parameter damage model, based on a non-linear finite element approach, to the analysis of masonry panels. Masonry is treated as a homogenized material whose characteristics are defined using a homogenization technique. Masonry panels subjected to shear loading are studied with the proposed procedure within the framework of three-dimensional analyses. The nonlinear behaviour of masonry is modelled using concepts of damage theory: an adequate damage function is defined to take into account the different response of masonry under tension and compression. Cracking can therefore be interpreted as a local damage effect, defined by the evolution of known material parameters and by one or several functions which control the onset and evolution of damage. The model takes into account the important aspects of the nonlinear analysis of masonry structures, such as stiffness degradation due to mechanical effects and the objectivity of the results with respect to the finite element mesh. Finally, the proposed damage model is validated by comparison with experimental results available in the literature.
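To make the damage-theory vocabulary above concrete, here is the simplest scalar damage law with linear softening. It is a generic one-parameter illustration with invented numbers, not the paper's two-parameter tension/compression model:

```python
def damage(strain, strain_0, strain_u):
    """Scalar damage variable d in [0, 1] for a linear-softening law:
    zero below the onset strain, one beyond the ultimate strain."""
    if strain <= strain_0:
        return 0.0
    if strain >= strain_u:
        return 1.0
    return (strain_u / strain) * (strain - strain_0) / (strain_u - strain_0)

def effective_stress(strain, E, strain_0, strain_u):
    """Damaged stress: sigma = (1 - d) * E * strain, i.e. the secant
    stiffness degrades as damage evolves."""
    return (1.0 - damage(strain, strain_0, strain_u)) * E * strain

# Hypothetical parameters: E = 30000 MPa, onset 1e-4, ultimate 3e-4.
sigma = effective_stress(0.0002, 30000.0, 0.0001, 0.0003)
```

With this particular d, the stress descends linearly from E·strain_0 at onset to zero at the ultimate strain; a tension/compression split, as in the paper, would use two such functions with separate thresholds.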
Electromagnetic wave propagation is present in the vast majority of situations occurring in everyday life, whether in mobile communications, DTV, satellite tracking, broadcasting, etc. The study of increasingly complex media of propagation of electromagnetic waves has therefore become necessary in order to optimize resources and increase the capabilities of devices, as required by the growing demand for such services.
Various parameters characterize electromagnetic wave propagation under different circumstances; of particular importance are the reflectance and transmittance. There are several methods for the analysis of the reflectance and transmittance, such as approximation by boundary conditions, the plane wave expansion method (PWE), etc., but this work focuses on the WKB and SPPS methods.
The implementation of the WKB method is relatively simple, but the method is efficient only at high frequencies. The SPPS (Spectral Parameter Power Series) method, based on the theory of pseudoanalytic functions, solves this problem through a new representation of the solutions of Sturm-Liouville equations and has recently proven to be a powerful tool for various boundary value and eigenvalue problems. Moreover, it has a structure very suitable for numerical implementation, which in this case was carried out in Matlab for the evaluation of both conventional and turning-point profiles.
The comparison between the two methods yields valuable information about their performance, which is useful for determining the validity and suitability of their application to problems where these parameters are calculated in real-life applications.
In this paper we review two distinct complete orthogonal systems of monogenic polynomials over 3D prolate spheroids. The underlying functions take values either in the reduced or in the full quaternions (identified, respectively, with R3 and R4), and are generally assumed to be null-solutions of the well-known Riesz and Moisil-Théodoresco systems in R3. This is done in the spaces of square-integrable functions over R and H. The representations of these polynomials are given explicitly. Additionally, we show that these polynomial functions play an important role in defining the Szegö kernel function over the surface of 3D spheroids. As a concrete application, we derive the explicit expression of the monogenic Szegö kernel function over 3D prolate spheroids.
Reasonably accurate cost estimation of the structural system is quite desirable at the early stages of the design process of a construction project. However, the numerous interactions among the many cost-variables make the prediction difficult. Artificial neural networks (ANN) and case-based reasoning (CBR) are reported to overcome this difficulty. This paper presents a comparison of CBR and ANN augmented by genetic algorithms (GA) conducted by using spreadsheet simulations. GA was used to determine the optimum weights for the ANN and CBR models. The cost data of twenty-nine actual cases of residential building projects were used as an example application. Two different sets of cases were randomly selected from the data set for training and testing purposes. Prediction rates of 84% in the GA/CBR study and 89% in the GA/ANN study were obtained. The advantages and disadvantages of the two approaches are discussed in the light of the experiments and the findings. It appears that GA/ANN is a more suitable model for this example of cost estimation where the prediction of numerical values is required and only a limited number of cases exist. The integration of GA into CBR and ANN in a spreadsheet format is likely to improve the prediction rates.
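The role of the GA-tuned weights in the CBR model above can be illustrated with a weighted nearest-neighbor retrieval: the GA searches for the attribute weights that maximize prediction accuracy, and CBR then returns the cost of the most similar stored case. Attributes, weights and costs below are invented:

```python
def cbr_estimate(new_case, case_base, weights):
    """Case-based reasoning with attribute weights (the quantities a GA
    would tune): return the cost of the most similar stored case under
    a weighted Euclidean distance on normalized attributes."""
    def distance(a, b):
        return sum(w * (x - y) ** 2
                   for w, x, y in zip(weights, a, b)) ** 0.5
    best = min(case_base, key=lambda c: distance(new_case, c["attrs"]))
    return best["cost"]

# Two stored residential projects with normalized attributes and costs.
cases = [{"attrs": [0.2, 0.8], "cost": 450},
         {"attrs": [0.7, 0.3], "cost": 820}]
print(cbr_estimate([0.65, 0.35], cases, weights=[1.0, 0.5]))
```

In the GA/ANN variant the same weight-search idea applies, but the weights being optimized are those of the network rather than of the distance function.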
M. Christine Boyer is an urban historian whose interests include the history of the American city, city planning, preservation planning, and computer science. Before coming to Princeton University in 1991, Boyer was professor and chair of the City and Regional Planning Program at Pratt Institute. She was a visiting professor in the Ph.D. program at the TU Delft School of Design in Spring 2005. She has written extensively about American urbanism. Her publications include Dreaming the Rational City: The Myth of American City Planning 1890-1945 (Cambridge: The MIT Press, 1983), Manhattan Manners: Architecture and Style 1850-1900 (New York: Rizzoli, 1985), The City of Collective Memory (Cambridge: The MIT Press, 1994), and CyberCities (New York: Princeton Architectural Press, 1996).
A/E/C team members, while collaborating on building projects, rely on past experience and content through the use of project design archives (whether in paper or digital format). This leads to underutilization of potential knowledge, as decision making based on the reuse of data, information and knowledge is limited by access to these archives due to their sheer size and inconvenient presentation. This paper presents an integrated solution that leverages two technologies developed at Stanford: CoMem (Corporate Memory) and iRoom (interactive Room). It addresses critical limitations, i.e. content, context, visualization and interactivity, which constrain the process of collaborative exploration towards knowledge reuse and decision making.
In the AEC (Architecture / Engineering / Construction) industry, a number of individuals and organisations collaborate and work jointly on a construction project. The resulting consortium has a large pool of expertise and experience and can be defined as a Virtual Organisation (VO) formed for the duration of the project. VOs are electronically networked organisations in which IT and web-based communication technology play an important role in coordinating their various activities. This paper describes the design, development and implementation of a Grid-enabled application, the Product Supplier Catalogue Database (PSCD), which supports collaborative working in consortia. As part of the Grid-enabling process, specialised metadata is being developed to enable the PSCD to effectively utilise Grid middleware such as the Globus and Java CoG toolkits. We also describe our experience in designing, developing and deploying the security service of the application using the Globus Security Interface (GSI).
Collaborative Design Processes: A Class on Concurrent Collaboration in Multidisciplinary Design
(2004)
The rise of concurrent engineering in construction demands early team formation and constant communication throughout the project life cycle, but educational models in architecture, engineering and construction have been slow to adjust to this shift in project organization. Most students in these fields spend the majority of their college years working on individual projects that do not build teamwork or communication skills. Collaborative Design Processes (CDP) is a capstone design course where students from the University of Illinois at Urbana-Champaign and the University of Florida learn methods of collaborative design enhanced by the use of information technology. Students work in multidisciplinary teams to collaborate from remote locations via the Internet on the design of a facility. An innovation of this course compared to previous efforts is that students also develop process designs for the integration of technology into the work of multidisciplinary design teams. The course thus combines both active and reflective learning about collaborative design and methods. The course is designed to provide students the experience, tools, and methods needed to improve design processes and better integrate the use of technology into AEC industry work practices. This paper describes the goals, outcomes and significance of this new, interdisciplinary course for distributed AEC education. Differences from existing efforts and lessons learned to promote collaborative practices are discussed. Principal conclusions are that the course presents effective pedagogy to promote collaborative design methods, but faces challenges in both technology and in traditional intra-disciplinary training of students.
The construction industry is a project-based business bringing together many different organisations to complete a desired goal. The strategic use of Information and Communication Technologies (ICT) has enabled this goal to be completed more effectively. Two issues require addressing: the technology itself and the implementation factors of the technology. Such implementation factors should consider, among other things, the legal and contractual issues associated with the use of ICT, training requirements and the effects on organisational culture. To date the legal and contractual issues have not been extensively covered, and it is recognised that the technologies have not been properly covered by any recognised legal and contractual practices. This in turn threatens to inhibit the growth and prosperity of the use of the technology on construction projects. This paper discusses these legal and contractual issues and describes methods and tools that can be used to enable the technology to grow in a legally and contractually valid environment.
Collaboration in AEC Design : Web-enabling Applications using Peer-to-Peer Office Communicator
(2004)
A market analysis conducted by Gartner Dataquest in August 2001 has shown the typical characteristics of the AEC design process. High volatility in the membership of AEC design groups, with members dispersed over several external offices, is the common collaboration scenario. Membership is usually short-lived compared to the overall duration of the process. A technical solution has to take that into account by making joining and leaving a collaborative work group very easy. The modelling of the roles of collaboration between group members must be based on a commonly understood principle like the publisher/subscriber model, where the individual responsible for the distribution of vital information is clear. Security and trust in the confidentiality of the system are central concerns for its acceptance. Therefore, keeping the subset of data that will be published under the absolute control of the publisher is a must. This is not the case with server-based scenarios, sometimes even for psychological reasons. A loosely bound peer-to-peer network offers advantages over a server-based solution because of lower administrative overhead and simple installation procedures. In a peer-to-peer environment, a publish/subscribe role model can be implemented more easily. The publish/subscribe model matches the way AEC processes are modelled in real-world scenarios today, where legal proof of information exchange between external offices is of high importance. Workflow management systems for small to midsize companies of the AEC industry may adopt the peer-to-peer approach to collaboration in the future. Further investigations are being made at the research level (WINDS) by integrating the viewer and redlining application Collaborate! into a collaborative environment.
CLOSING THE WORLD’S FACTORY
(2011)
Joshua Bolchover is an urban researcher, academic and architectural designer. He is an Assistant Professor at the University of Hong Kong, focusing on researching and designing buildings in rural China. In 2010 he exhibited Rural Urban Ecology at the Venice Biennale 2010. He has curated, designed and contributed to several international exhibitions including: Utopia Now: Opening the Closed Area, a research project on the Hong Kong and Shenzhen border at the Venice Biennale 2008; Get it Louder, a touring exhibition in China; Airspace: What Skyline does London want; Hydan; Can Buildings Curate and has exhibited at the HK-SZ Biennale. Joshua was a local curator for the Manchester-Liverpool section of Shrinking Cities between 2003 and 2005. He has collaborated with Raoul Bunschoten, Chora, researching strategic urban projects and has worked with Diller + Scofidio in New York. Joshua has previously taught architecture at the Chinese University of Hong Kong, London Metropolitan University, Cambridge University and the Architectural Association. He was educated at Cambridge University and at the Bartlett School of Architecture. John Lin is an architect based in Hong Kong and a graduate of The Cooper Union in New York City. His experimental constructions have been published in FRAME magazine (2003) and exhibited in the Kolonihaven (Architecture Park) at the Louisiana Museum of Modern Art in Copenhagen (2004) and the Venice Biennale (2008). Current projects include the design of several school buildings in China. He has taught at the Royal Danish Academy of Fine Arts, School of Architecture, and The Chinese University of Hong Kong and is currently an Assistant Professor at the University of Hong Kong.
Image processing has been much inspired by human vision, in particular with regard to early vision. The latter refers to the earliest stage of visual processing, responsible for the measurement of local structures such as points, lines, edges and textures in order to facilitate the subsequent interpretation of these structures in higher stages (known as high-level vision) of the human visual system. This low-level visual computation is carried out by cells of the primary visual cortex. The receptive field profiles of these cells can be interpreted as the impulse responses of the cells, which are then considered as filters. According to the Gaussian derivative theory, the receptive field profiles of the human visual system can be approximated quite well by derivatives of Gaussians. Two mathematical models suggested for these receptive field profiles are, on the one hand, the Gabor model and, on the other hand, the Hermite model, which is based on the analysis filters of the Hermite transform. The Hermite filters are derivatives of Gaussians, while Gabor filters, which are defined as harmonic modulations of Gaussians, provide a good approximation to these derivatives. It is important to note that, even if the Gabor model is more widely used than the Hermite model, the latter offers some advantages, such as forming an orthogonal basis and matching experimental physiological data better. In our earlier research, both filter models, Gabor and Hermite, have been developed in the framework of Clifford analysis. Clifford analysis offers a direct, elegant and powerful generalization to higher dimensions of the theory of holomorphic functions in the complex plane. In this paper we present the construction of the Hermite and Gabor filters, both in the classical and in the Clifford analysis framework. We also generalize the concept of complex Gaussian derivative filters to the Clifford analysis setting.
Moreover, we present further properties of the Clifford-Gabor filters, such as their relationship with other types of Gabor filters and their localization in the spatial and in the frequency domain formalized by the uncertainty principle.
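As a loose illustration of the classical (non-Clifford) constructions discussed above, a 2D Gabor kernel and a first-order Gaussian derivative can be sketched in a few lines; the kernel size and parameter choices below are illustrative assumptions, not the filters of the paper:

```python
import numpy as np

def gaussian(x, y, sigma):
    """Isotropic 2D Gaussian (unnormalized)."""
    return np.exp(-(x**2 + y**2) / (2.0 * sigma**2))

def gabor_kernel(size, sigma, freq, theta):
    """Real part of a 2D Gabor filter: a plane wave modulating a Gaussian."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # rotate coordinates so the wave propagates along direction `theta`
    xr = x * np.cos(theta) + y * np.sin(theta)
    return gaussian(x, y, sigma) * np.cos(2.0 * np.pi * freq * xr)

def gaussian_derivative_x(size, sigma):
    """First derivative of a Gaussian w.r.t. x (odd, edge-sensitive profile)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    return -x / sigma**2 * gaussian(x, y, sigma)

g = gabor_kernel(21, sigma=3.0, freq=0.1, theta=0.0)
d = gaussian_derivative_x(21, sigma=3.0)
```

The visual similarity of `g` and a normalized `d` is the elementary counterpart of the approximation property mentioned in the abstract.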
Car-following models describe the behavior of a number of cars on the road in dependence on the distance to the car in front. We introduce a system of ordinary differential equations and perform a theoretical and numerical analysis in order to find solutions that reflect various traffic situations. We present three different variations of the model motivated by reality.
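A minimal numerical sketch of one common variant of such a system, the optimal-velocity car-following model with explicit Euler time stepping, may clarify the setup; all parameter values are illustrative assumptions, not those of the paper:

```python
import numpy as np

def optimal_velocity(gap, v_max=30.0, d_safe=5.0):
    """Desired speed as a smooth, increasing function of the gap ahead."""
    return v_max * (np.tanh(gap - d_safe) + np.tanh(d_safe)) / (1.0 + np.tanh(d_safe))

def simulate(n_cars=4, a=2.0, dt=0.01, steps=5000):
    """Explicit Euler integration of dx_i/dt = v_i, dv_i/dt = a (V(gap_i) - v_i).
    Car 0 is the leader and keeps a constant speed of 10 m/s."""
    x = -6.0 * np.arange(n_cars, dtype=float)     # cars start 6 m apart
    v = np.full(n_cars, 10.0)
    for _ in range(steps):
        gaps = x[:-1] - x[1:]                     # distance to the car in front
        acc = a * (optimal_velocity(gaps) - v[1:])
        x += v * dt
        v[1:] += acc * dt                         # the leader's speed stays fixed
    return x, v

x, v = simulate()
```

With these parameters the followers relax toward an equilibrium gap at the leader's speed, one of the stationary traffic situations such an analysis looks for.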
In order to minimize the probability of foundation failure resulting from cyclic action on structures, researchers have developed various constitutive models to simulate the foundation response and soil interaction under these complex cyclic loads. The efficiency and effectiveness of these models are largely influenced by the cyclic constitutive parameters. Although much research is being carried out on these relatively new models, little or no detail exists in the literature about the model-based identification of the cyclic constitutive parameters. This can be attributed to the difficulties and complexities of the inverse modeling of such complex phenomena. A variety of optimization strategies are available for the solution of least-squares problems, as is usually done in the field of model calibration. For the back analysis (calibration) of the soil response to oscillatory load functions, this paper gives insight into the model calibration challenges and also puts forward a method for the inverse modeling of cyclically loaded foundation response such that high-quality solutions are obtained with minimum computational effort. Model responses are thereby produced which adequately describe what would otherwise be observed in the laboratory or field.
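The least-squares back analysis described above can be illustrated with a deliberately simplified sketch: a hypothetical decaying oscillation stands in for the cyclic foundation response, and parameters are recovered by a finite-difference Gauss-Newton iteration. Model, names and values are invented for illustration, not the constitutive model of the paper:

```python
import numpy as np

def response(params, t):
    """Hypothetical cyclic response model: amplitude decay times an oscillation."""
    a, b = params
    return a * np.exp(-b * t) * np.cos(2.0 * np.pi * t)

def gauss_newton(t, y_obs, p0, n_iter=20, h=1e-6):
    """Back analysis as least squares: minimize sum((model - observation)^2)
    using a finite-difference Jacobian and Gauss-Newton updates."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        base = response(p, t)
        r = base - y_obs                              # residual vector
        J = np.empty((t.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = h
            J[:, j] = (response(p + dp, t) - base) / h
        p -= np.linalg.lstsq(J, r, rcond=None)[0]     # Gauss-Newton step
    return p

t = np.linspace(0.0, 5.0, 200)
p_true = np.array([2.0, 0.4])
y_obs = response(p_true, t)                           # synthetic "measurements"
p_fit = gauss_newton(t, y_obs, p0=[1.0, 0.1])
```

On noiseless synthetic data the iteration recovers the generating parameters; on real cyclic data, noise and model error make the problem far harder, which is exactly the difficulty the paper addresses.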
In building design systems, XML technologies can be used in many areas with the aim of making these systems modular and web-capable. Their use pays off as a basic data structure for various computer-internal models, as a control structure for customizing applications, as a link between object-based systems, and as a communication protocol between components. It is possible to represent, store and process complex objects from everyday planning practice by means of XML. It is possible to distribute the corresponding components across a network or to connect them via the Internet. The currently dominant view of XML as an exchange medium is complemented by the idea of an XML-based system: design objects can be formulated as >XML objects< and used in the sense of late binding.
Review of Discrete Optimization Techniques for CAD
- Discrete optimization in structure design: the morphological method; the alternative graph approach; convex discrete optimization without an objective function; matroidal decomposition in design; decomposition of layered matrices
- Discrete optimization in designing: the packing problem; optimal arrangement of rectangles and shortest paths in the L1 metric; partition problems
- Discrete optimization in computational geometry and computer graphics: maxima of a point set in the plane; triangulation; removal of hidden lines and surfaces (one of the main problems in computer graphics)
The main aim of the research project in progress is to develop virtual models as tools to support decision-making in the planning of construction maintenance. The virtual models transmit, visually and interactively, information related to the physical behaviour of materials and components of given infrastructures, defined as a function of time. The interactive application allows decisions to be made on design options in the definition of plans for maintenance, conservation or rehabilitation. The first virtual prototype, now in progress, concerns just lamps. It allows the examination of the physical model, visualizing, for each element modelled in 3D and linked to a database, the corresponding technical information concerned with the wear and tear of the material, calculated for that period of time. In addition, solutions for repair work or substitution and the inherent costs are predicted, the results being obtained interactively and visualized in the virtual environment itself. The aim is that the virtual model should be applicable directly to the 3D models of new constructions in rehabilitation situations. The practical usage of these models is directed towards supporting decision-making in the conception phase and in the planning of maintenance. In further work, other components will be analysed and incorporated into the virtual system.
We describe the database requirements of SEED (Software Environment to Support the Early Phases in Building Design). The requirements are typical for a database that intends to support a heterogeneous design support environment consisting of independent software modules with diverse internal design models, requirements not met by any commercial database system. The design and implementation of this database is an integral part of the overall software engineering effort. We describe the SEED approach that integrates external and in-house software based on a shared information model specified in the modeling language SPROUT, which allows for the specification of domains, classes, relationship types and their behavior, and multiple classifications. The SPROUT run-time system organizes and coordinates the communication between the software modules and the database.
To support research in the building sector and to help it move towards a new digital economy, the European Commission, under the 5th Framework initiative and especially the IST programme, funded various RTD projects. The opportunity to bring these IST projects together was acknowledged so that stronger links could be created under a clustering umbrella and, moreover, so that links between those projects and their RTD environment could be facilitated. This has been the objective of the work carried out within the ICCI (IST-2001-33022) Cluster project. This paper introduces the main aims and objectives of the project and then presents its principal outcomes. In a second part, it synthesises the underlying concepts, technology and tools that will make ICT-based Construction a reality in the near future, and gives recommended actions for the industry, the EC and Construction ICT R&D in Europe, passing on some of the benefit of this project experience to the three communities.
A concept of non-commutative Galois extension is introduced, and binary and ternary extensions are chosen. Non-commutative Galois extensions of the Nonion algebra and of su(3) are constructed. Ternary and binary Clifford analysis are then introduced for non-commutative Galois extensions, and the corresponding Dirac operators are associated with them.
In this paper, we present an empirical approach for the objective and quantitative benchmarking of optimization algorithms with respect to characteristics induced by the forward calculation. Owing to the professional background of the authors, this benchmarking strategy is illustrated on a selection of search methods with regard to the expected characteristics of geotechnical parameter back-calculation problems. Starting from a brief introduction to the approach employed, a strategy for optimization algorithm benchmarking is introduced. The benchmarking uses statistical tests carried out on well-known test functions superposed with perturbations, both chosen to mimic the objective function topologies found in geotechnical problems. Here, the moved axis-parallel hyper-ellipsoid test function and the generalized Ackley test function are analyzed in conjunction with an adjustable amount of objective function topology roughness and an adjustable fraction of failing forward calculations. In total, results for 5 optimization algorithms are presented, compared and discussed.
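A minimal sketch of such a perturbed test function, assuming the generalized Ackley function with additive roughness and randomly failing forward calculations, may be helpful; the interface below is our own illustration, not the authors' benchmark code:

```python
import numpy as np

def ackley(x):
    """Generalized Ackley test function; global minimum f(0) = 0."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
            + 20.0 + np.e)

def perturbed_ackley(x, roughness=0.1, fail_rate=0.05, rng=None):
    """Ackley superposed with noise ('roughness') and a chance of a failing
    forward calculation, reported here as NaN."""
    rng = np.random.default_rng() if rng is None else rng
    if rng.random() < fail_rate:
        return np.nan          # simulated solver failure
    return ackley(x) + roughness * rng.standard_normal()
```

Repeated optimization runs on such a function, with varying `roughness` and `fail_rate`, yield the statistics on which algorithms can then be compared.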
Rectangular steel frames subjected to strong ground motion are considered. Their behavior factor is numerically evaluated using nonlinear time history analysis and different ground acceleration records. The behavior factor is determined assuming that a severe collapse mechanism occurs throughout the time history. The system of equations is transformed into a single equation and then the energy balance concept is applied. The expression for the behavior factor is derived, its application to a four-story, two-bay steel frame is illustrated, and the corresponding results are discussed.
BAUHAUS ISOMETRY AND FIELDS
(2012)
While integration increases by networking, segregation strides ahead too. Most of us fixate our minds on special topics. Yet we rely on our intuition too. We are sometimes waiting for the inflow of new ideas or valuable information that we hold in high esteem, although we are not entirely conscious of its origin. We may even say the most precious intuitions are rooted in deep subconscious, collective layers of the mind. Take as a simple example the emergence of orientation in paleolithic events and its relation to the dihedral symmetry of the compass. Consider also the extension of this algebraic matter into the operational structures of the mind on the one hand and into the algebra of geometry, Clifford algebra as we call it today, on the other. Culture and mind, and even the individual act of creation, may be connected with transient events that are subconscious and inaccessible to cognition in principle. Other events causative for our work may be merely invisible to us, though in principle they should prove attainable. In this case we are just ignorant of the whole creative process. Sometimes we begin to use unusual tools or turn into handicraft enthusiasts. Then our small institutes turn into workshops and factories. All this is indeed joining with the Bauhaus and its spirit. We shall go together into this, and we shall present a record of this session.
It is not uncommon for analysis and simulation methods to be used mainly to evaluate finished designs and to prove their quality, whereas the potential of such methods is to lead or control a design process from the beginning on. We therefore introduce a design method that moves away from a “what-if” forecasting philosophy and increases the focus on backcasting approaches. We use the power of computation by combining sophisticated methods to generate designs with analysis methods, closing the gap between analysis and synthesis of designs. For the development of future-oriented computational design support we need to be aware of the human designer's role: a productive combination of the excellence of human cognition with the power of modern computing technology is needed. We call this approach “cognitive design computing”. The computational part aims to mimic the way a designer's brain works by combining state-of-the-art optimization and machine learning approaches with available simulation methods. The cognition part respects the complex nature of design problems by providing models for human-computer interaction. This means that a design problem is distributed between computer and designer. In the context of the conference slogan “back to command”, we ask how we may imagine the command over a cognitive design computing system. We expect that designers will need to let go of control of some parts of the design process to machines, but in exchange they will gain powerful new command over complex computing processes. This means that designers have to explore the potentials of their role as commanders of partially automated design processes. In this contribution we describe an approach for the development of a future cognitive design computing system with a focus on urban design issues.
The aim of this system is to enable an urban planner to treat a planning problem as a backcasting problem by defining what performance a design solution should achieve and to automatically query or generate a set of best possible solutions. This kind of computational planning process offers proof that the designer meets the originally and explicitly defined design requirements. A key way in which digital tools can support designers is by generating design proposals. Evolutionary multi-criteria optimization methods allow us to explore a multi-dimensional design space and provide a basis for the designer to evaluate contradicting requirements: a task urban planners are faced with frequently. We also reflect on why designers will give more and more control to machines. To this end, we investigate first approaches to learning how designers use computational design support systems in combination with manual design strategies to deal with urban design problems, employing machine learning methods. By observing how designers work, it is possible to derive more complex artificial solution strategies that can help computers make better suggestions in the future.
In civil engineering practice, values of column forces are often required before any detailed analysis of the structure has been performed. One of the reasons for this arises from the fast-tracked nature of the majority of construction projects: foundations are laid and base columns constructed whilst analysis and design are still in progress. The need for quick results when feasibility studies are performed, or when the effect of design changes on supporting columns is evaluated, creates other situations in which column forces are required but a detailed analysis to obtain them seems superfluous. It was therefore concluded that the development of an efficient tool for column force calculations, avoiding the extensive input required in a finite element analysis, would be highly beneficial. The automation of the process is achieved by making use of a Voronoi diagram, which is used (a) for subdividing the floor into influence areas and (b) as a basis for automatic load assignment. The implemented procedure is integrated into a CAD system in which the relevant geometric information of the floor, i.e. its shape and column layout, can be defined or uploaded. A brief description of the implementation is included. Some comparative results and considerations regarding the continuation of the study are given.
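The load-assignment idea can be sketched with a discrete, grid-based stand-in for the exact Voronoi subdivision: each cell of the floor is assigned to its nearest column and the uniform load over each influence area is summed. Function name, interface and the example slab are illustrative assumptions, not the implementation of the paper:

```python
import numpy as np

def tributary_loads(columns, lx, ly, q, n=200):
    """Approximate Voronoi influence areas on a rectangular lx-by-ly floor slab
    by assigning each cell of an n-by-n grid to its nearest column, then
    integrating the uniform load q (kN/m^2) over each influence area."""
    xs = (np.arange(n) + 0.5) * lx / n
    ys = (np.arange(n) + 0.5) * ly / n
    X, Y = np.meshgrid(xs, ys)
    cols = np.asarray(columns, dtype=float)
    # squared distance from every grid cell centre to every column
    d2 = (X[..., None] - cols[:, 0])**2 + (Y[..., None] - cols[:, 1])**2
    nearest = np.argmin(d2, axis=-1)                  # discrete Voronoi labels
    cell_area = (lx / n) * (ly / n)
    return np.bincount(nearest.ravel(), minlength=len(cols)) * cell_area * q

# four corner columns of a 10 m x 10 m slab under q = 5 kN/m^2
loads = tributary_loads([(0, 0), (10, 0), (0, 10), (10, 10)], 10.0, 10.0, 5.0)
```

By symmetry each corner column should attract a quarter of the total 500 kN, i.e. 125 kN, which the discrete assignment reproduces.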
Numerical simulations in the general field of civil engineering are common for the design process of structures and/or the assessment of existing buildings. The behaviour of these structures is analytically unknown and is approximated with numerical simulation methods like the Finite Element Method (FEM). Therefore the real structure is transferred into a global model (GM, e.g. concrete bridge) with a wide range of sub models (partial models PM, e.g. material modelling, creep). These partial models are coupled together to predict the behaviour of the observed structure (GM) under different conditions. The engineer needs to decide which models are suitable for computing realistically and efficiently the physical processes determining the structural behaviour. Theoretical knowledge along with the experience from prior design processes will influence this model selection decision. It is thus often a qualitative selection of different models. The goal of this paper is to present a quantitative evaluation of the global model quality according to the simulation of a bridge subject to direct loading (dead load, traffic) and indirect loading (temperature), which induce restraint effects. The model quality can be separately investigated for each partial model and also for the coupled partial models in a global structural model. Probabilistic simulations are necessary for the evaluation of these model qualities by using Uncertainty and Sensitivity Analysis. The method is applied to the simulation of a semi-integral concrete bridge with a monolithic connection between the superstructure and the piers, and elastomeric bearings at the abutments. The results show that the evaluation of global model quality is strongly dependent on the sensitivity of the considered partial models and their related quantitative prediction quality. 
This method is not only a relative comparison between different models, but also a quantitative representation of model quality using probabilistic simulation methods, which can support the process of model selection for numerical simulations in research and practice.
Models in the context of engineering can be classified into process-based and data-based models. Whereas a process-based model describes the problem by an explicit formulation, a data-based model is often used where no such mapping can be found due to the high complexity of the problem. The Artificial Neural Network (ANN) is a data-based model that is able to “learn” a mapping from a set of training patterns. This paper deals with the application of ANNs in time-dependent bathymetric models. A bathymetric model is a geometric representation of the sea bed. Typically, a bathymetry is measured and afterwards described by a finite set of measured data. Measuring at different time steps leads to a time-dependent bathymetric model. To obtain a continuous surface, the measured data have to be interpolated by some interpolation method. Unlike explicitly given interpolation methods, the presented time-dependent bathymetric model using an ANN trains the approximated surface in space and time in an implicit way. The ANN is trained with topographic measured data consisting of the location (x, y) and time t. In other words, the ANN is trained to reproduce the mapping h = f(x, y, t) and is afterwards able to approximate the topographic height for a given location and date. In a further step, this model is extended to take meteorological parameters into account, which leads to a model of a more predictive character.
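A minimal sketch of such a data-based model, assuming a one-hidden-layer network trained by plain batch gradient descent on synthetic soundings, illustrates the mapping h = f(x, y, t); the architecture, data and all parameters below are illustrative, not the model of the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic "soundings": seabed height h at location (x, y) and time t
N = 400
X = rng.uniform(-1.0, 1.0, size=(N, 3))               # columns: x, y, t
h = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1]) + 0.2 * X[:, 2]

# one-hidden-layer network: h_hat = W2 . tanh(W1 x + b1) + b2
H = 32
W1 = rng.normal(0.0, 1.0, size=(H, 3)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, size=H);      b2 = 0.0

def forward(X):
    A = np.tanh(X @ W1.T + b1)                         # hidden activations
    return A, A @ W2 + b2                              # activations, prediction

loss_start = np.mean((forward(X)[1] - h) ** 2)
lr = 0.05
for _ in range(2000):                                  # plain batch gradient descent
    A, pred = forward(X)
    err = 2.0 * (pred - h) / N                         # d(MSE)/d(prediction)
    gW2, gb2 = A.T @ err, err.sum()
    dA = np.outer(err, W2) * (1.0 - A ** 2)            # backprop through tanh
    gW1, gb1 = dA.T @ X, dA.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
loss_end = np.mean((forward(X)[1] - h) ** 2)
```

After training, `forward` can be queried at any (x, y, t), including dates between surveys, which is the implicit space-time interpolation described above.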
Lara Schrijver is an assistant professor at the Faculty of Architecture of the TU Delft. She is one of three program leaders for a new research program in the department of architecture, ‘The Architectural Project and its Foundations’. Schrijver holds degrees in architecture from Princeton University and the TU Delft. She received her Ph.D. from the TU Eindhoven in 2005. Schrijver has taught design and theory courses, and contributed to conferences in the Netherlands as well as abroad. She was an editor for OASE, journal for architecture, for ten years, and was co-organizer of the 2006 conference ‘The Projective Landscape’. Her current work revolves around the role of architecture in the city, and its responsibility in defining the public domain. Her first book, Radical Games, on the influence of the 1960s on contemporary discourse, is forthcoming in the spring of 2009.
Tatjana Schneider is lecturer at the School of Architecture, University of Sheffield. She holds a PhD in architecture. She worked in architectural practice in Germany and the UK, and has taught, lectured and published widely (including ‘Flexible Housing’ with Jeremy Till). She was a member of the worker’s cooperative G.L.A.S. (Glasgow Letters on Architecture and Space), which undertook agit-prop works, educational workshops, community based design consultancy and produced the quarterly journal glaspaper. Her work focuses on the production and political economy of the built environment. Current work includes the research ‘Spatial Agency’.
Architecture and globality
(2000)
Scientific colloquium held from 14 to 16 October 1999 at the Bauhaus-Universität Weimar on the theme 'global village - perspectives of architecture'
ARCHITECTURE AND ATMOSPHERE
(2011)
Nathalie Bredella is an architect. She was educated at the TU Berlin and Cooper Union, New York. She received a PhD in Architectural Theory. She taught architectural design at the TU Berlin. She ist the author of Architekturen des Zuschauens. Imaginäre und reale Räume im Film (transcript-verlag). The work is based on an interdisciplinary approach incorporating architecture, film theory and philosophy. Her interests in architectural practice focus on the relationship between spatial strategies, film and media on an urban and architectural scale.
We propose a new approach to the numerical solution of quasi-static elastic-plastic problems based on the Moreau-Yosida theorem. After the time discretization, the problem is expressed as an energy minimization problem for unknown displacement and plastic strain fields. The dependency of the minimization functional on the displacement is smooth, whereas the dependency on the plastic strain is non-smooth. Besides, there exists an explicit formula for calculating the plastic strain from a given displacement field. This allows us to reformulate the original problem as a minimization problem in the displacement only. Using the Moreau-Yosida theorem from convex analysis, the minimization functional in the displacements turns out to be Frechet-differentiable, although the hidden dependency on the plastic strain is non-differentiable. The second derivative exists everywhere apart from the elastic-plastic interface dividing the elastic and plastic zones of the continuum. This motivates the implementation of a Newton-like method, which converges super-linearly, as can be observed in our numerical experiments.
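For orientation, the Moreau-Yosida regularization underlying this differentiability result can be recalled in its standard form (a classical fact from convex analysis, not the paper's specific functional): for a proper, convex, lower semicontinuous $f$ and $\lambda > 0$,

```latex
e_{\lambda} f(x) = \min_{y}\Big( f(y) + \tfrac{1}{2\lambda}\,\lVert x - y \rVert^{2} \Big),
\qquad
\operatorname{prox}_{\lambda f}(x) = \operatorname*{arg\,min}_{y}\Big( f(y) + \tfrac{1}{2\lambda}\,\lVert x - y \rVert^{2} \Big),
```

and the envelope $e_{\lambda} f$ is continuously (Fréchet) differentiable with $\nabla e_{\lambda} f(x) = \frac{1}{\lambda}\big(x - \operatorname{prox}_{\lambda f}(x)\big)$, even when $f$ itself is not differentiable.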
The reduction of oscillation amplitudes of structural elements is necessary not only to maintain their durability and longevity but also to eliminate the harmful effect of oscillations on people and technological operations. Dampers are widely applied for this purpose. One of the most widespread models of structural friction forces, having a piecewise linear relation to displacement, was analysed. The author suggests the application of phase trajectory mapping in the "acceleration - displacement" plane. Unlike trajectory mappings in the "velocity - displacement" plane, these do not require a large number of geometrical constructions to identify the characteristics of dynamic systems, which improves accuracy. The analytical assumptions were verified by numerical modeling. The results show sufficiently good agreement between the numerical and analytical estimations of the dissipative characteristic.
In this paper a meshless component is presented which internally uses the common meshless interpolation technique >Moving Least Squares<. In contrast to the usual meshless integration schemes, such as cell quadrature and nodal integration, in this study integration zones of triangular geometry spanned by three nodes are used for 2D analysis. The boundary of the structure is defined by boundary nodes, which are similar to finite element nodes. Using the neighborhood relations of the integration zones, an efficient search algorithm was developed to detect the nodes within the influence of the integration points. The components are directly coupled with finite elements using a penalty method. A widely accepted model for describing the fracture behavior of concrete is the >Fictitious Crack Model<, applied in this study, which differentiates between micro cracks and macro cracks, with and without force transmission over the crack surface, respectively. In this study the crack surface is discretized by node pairs in the form of a polygon, which is part of the boundary. To apply the >Fictitious Crack Model<, finite interface elements are included between the crack surface nodes. The maximum principal strain at the crack tip is determined by introducing an influence area around the singularity. A practical example shows that the included elements improve the model by transmitting the surface forces during monotonic loading and by representing the contact forces of closed cracks during reverse loading.
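The >Moving Least Squares< interpolation at the core of such a component can be sketched in 1D (the paper works in 2D; the linear basis, weight function and names below are illustrative assumptions):

```python
import numpy as np

def mls_shape_functions(x_eval, nodes, radius):
    """1D Moving Least Squares shape functions with linear basis p = [1, x]
    and a smooth weight of compact support `radius`."""
    def weight(r):
        s = np.abs(r) / radius
        return np.where(s < 1.0, 1.0 - 3.0 * s**2 + 2.0 * s**3, 0.0)
    w = weight(x_eval - nodes)                     # weights of all nodes
    P = np.vstack([np.ones_like(nodes), nodes]).T  # basis evaluated at nodes
    A = (P * w[:, None]).T @ P                     # moment matrix A(x)
    B = (P * w[:, None]).T                         # B(x)
    return np.array([1.0, x_eval]) @ np.linalg.solve(A, B)  # phi_i(x)

nodes = np.linspace(0.0, 1.0, 11)
phi = mls_shape_functions(0.33, nodes, radius=0.25)
```

With a linear basis the shape functions form a partition of unity and reproduce linear fields exactly, the two properties a meshless component relies on for consistency.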
A fuzzy logic controller, WNC (Water Network Control), was developed for the control of urban drainage systems. The objectives are to avoid accidents, flooding, pollution through combined sewer overflows, and excessive operation and maintenance costs. Fuzzy logic proved to be a promising approach, flexible and easily accepted, because it includes expert knowledge. The proposed fuzzy control system is robust and also easy to understand and modify. It offers the operator the possibility to participate directly in the system control, combining the results of modern optimization techniques with the experience and knowledge accumulated over time by experts. Thus, the control of an urban sewer system can be well solved by implementing an intelligent control system based on available (fuzzy) information and on the experts' experience. An important feature of this fuzzy logic system is its capability to elaborate a control decision even in situations that were not considered in the design phase of the urban network.
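A Mamdani-style controller of the kind described can be sketched as follows; the rule base, membership functions and the storage-level framing are invented for illustration and are not the WNC rules:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

def pump_rate(level):
    """Mamdani-style inference for a storage water level in [0, 1]:
       IF level LOW    THEN pump SLOW
       IF level MEDIUM THEN pump MODERATE
       IF level HIGH   THEN pump FAST
    Defuzzified by the centroid of the aggregated output sets."""
    u = np.linspace(0.0, 1.0, 501)                 # pump-rate universe
    low  = tri(level, -0.5, 0.0, 0.5)              # rule firing strengths
    med  = tri(level,  0.0, 0.5, 1.0)
    high = tri(level,  0.5, 1.0, 1.5)
    # clip each output set at its rule's firing strength, aggregate by max
    agg = np.maximum.reduce([
        np.minimum(low,  tri(u, -0.5, 0.0, 0.5)),
        np.minimum(med,  tri(u,  0.0, 0.5, 1.0)),
        np.minimum(high, tri(u,  0.5, 1.0, 1.5)),
    ])
    return float(np.sum(agg * u) / np.sum(agg))    # centroid defuzzification

r_low, r_high = pump_rate(0.1), pump_rate(0.9)
```

Because the rules interpolate between overlapping membership functions, the controller still produces a graded decision for levels no single rule was written for, the robustness property highlighted above.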
All construction projects are constrained by their schedules, budgets and specifications, and by safety and environmental regulations. These constraints make construction management more complex and difficult. At the same time, much historical data that can support future decisions is kept in construction enterprises. To use this historical data effectively and efficiently, it is essential to apply data warehouse and data mining technologies. This paper introduces research which aims to develop a data warehouse system according to the requirements of construction enterprises and to use data mining technology to extract useful information and knowledge from the data warehouse system. The design, development and application of this system are described in detail.
In recent years, extensive investigations into the modeling and computational treatment of the load-bearing behavior of reinforced and prestressed concrete structures and structural elements, taking cracking and plastification into account, have been carried out at the Professur Massivbau I. These investigations are based on mathematical optimization as a unified methodological concept for mathematical problem description and problem solving. Already at the IKM 1994 [1], the author had the opportunity to report in summary on results of applying mathematical optimization in the field of nonlinear structural analysis. The present contribution gives an overview of the problem areas investigated since then and of the results and experience gained. When linear and quadratic optimization are applied, simplifications in modeling the load-bearing behavior specific to reinforced concrete are unavoidable because of the required linearity of the constraints. The approaches for describing the material behavior are particularly affected. By employing general nonlinear mathematical optimization methods, such a methodologically induced linearization of the computational model can be avoided...
Procedures for the construction of general solutions for some classes of partial differential equations (PDEs) are proposed, and a symmetry operators approach to raising the orders of polynomial solutions to linear PDEs is developed. We touch upon an "operator analytic function theory" for the solution of frequent classes of equations of mathematical physics, when their symmetry operators form a sufficiently vast space. MAPLE© package programs for building the operator variables are elaborated as well.
Steel profiles with slender cross-sections are characterized by their high susceptibility to instability phenomena, especially local buckling, which are intensified under fire conditions. This work presents a study on numerical modelling of the behaviour of steel structural elements with slender cross-sections in case of fire. To accurately carry out these analyses it is necessary to take those local instability modes into account, which is normally only possible with shell finite elements. However, aiming at the development of more expeditious methods, particularly important for analysing complete structures in case of fire, recent studies have proposed the use of beam finite elements that account for local buckling through the implementation of a new effective steel constitutive law. The objective of this work is to validate this methodology using the program SAFIR. Comparisons are made between the results obtained with the new methodology and finite element analyses using shell elements. The studies were performed on laterally restrained beams, unrestrained beams, axially compressed columns and columns subjected to bending plus compression.
The paper gives a general overview of and deals with a specified set of computer-aided analysis modules for hybrid structures loaded by extreme excitations. All problems are solved by methods of linear, quadratic or nonlinear mathematical optimization, which leads to very effective and economic design solutions. All approaches are derived from a general optimization problem that can easily be altered to conform to specific design tasks. Some advantages and possibilities of hybrid structural modeling (single or mixed model-supported) are discussed. The methods are illustrated by an example structure and optimization schemes.
An analysis of the reinforced concrete chimney geometry changes and their influence on the stresses in the chimney mantle was made. All the changes were introduced to a model chimney and compared. Relations between the stresses in the mantle of the chimney and the deformations determined by the change of the chimney's vertical axis geometry were investigated. The vertical axis of the chimney was described by a linear function (corresponding to the real rotation of the chimney together with the foundation) and by a parabolic function (corresponding to the real displacement of the chimney under the influence of horizontal wind forces). The positive stress pattern in the concrete as well as the negative stress pattern in the reinforcing steel have been presented, and the two cases were compared. An analysis of the stress changes in the chimney mantle depending on the modification of the mantle thickness (altered in a linear or an abrupt way) was carried out. The relation between the stresses and the change of the chimney's diameter from the bottom to the top was investigated. All the analyses were conducted by means of a specially developed computer program created in the Mathematica environment. The program also makes it possible to control the calculations and to visualize the results at every stage of the calculation process.
The general motivation of this research is to develop software to support the handling of the increased complexity of architectural design. In this paper we describe a system providing general support during the whole process. Instead of only developing design tools, we are also addressing the problem of the operating environment of these tools. We conclude that design tools have to be integrated in an open, modular, distributed, user-friendly and efficient environment. Two major fields have to be addressed: the development of design tools and the realisation of an integrated system as their operation environment. We briefly focus on the latter by discussing known technologies in the field of information technology and other design disciplines that can be used to realise such an environment. Regarding the first field, we have to state the need for a detailed tool specification. As a solution we suggest a strategy where the tool functions are specified on the basis of a transformation, in which a hierarchical process model is mapped into specifications of different design tools realising appropriate support for all sub-processes of architectural design. Using this strategy, the main steps to develop such a support system are: implementation of a framework as the basis for the integrated design system; a decision whether the tool specifications are already implemented in available tools - in this case these tools can be integrated using known methods for tool coupling; otherwise, new design tools have to be developed according to the framework.
This paper describes a research project that addresses the difficulties in dealing with regulatory documents such as national and regional codes. These documents tend to be voluminous, heavily cross-referenced, possibly ambiguous and even conflicting at times. There are often multiple documents that need to be consulted and satisfied; however, it is a difficult task to locate all of the relevant provisions. In addition, sections dealing with the same or similar conceptual ideas sometimes lay down conflicting requirements. We propose a framework for regulation representation, analysis and comparison, with emphasis on the extraction of similarities between provisions. We focus on accessibility regulations, whose intent is to provide the same or equivalent access to a building and its facilities for disabled persons. An XML regulatory repository is developed to extract structural as well as non-structural features from government regulations to aid user understanding and computational analysis. A similarity analysis is performed between different sources of regulations. In order to achieve a better comparison between provisions, we employ a combination of feature matching and structural analysis. Results are shown on comparisons between American and European codes, as well as in the domain of electronic rulemaking.
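The feature-matching step mentioned above can be sketched in a few lines. The sketch below is a deliberately simple assumption - word-set features and Jaccard similarity - not the paper's actual feature extraction or structural analysis; the example provision texts are invented for illustration.

```python
import re

def features(provision):
    """Extract a bag of lower-case word features from a provision text."""
    return set(re.findall(r"[a-z]+", provision.lower()))

def similarity(a, b):
    """Jaccard similarity between the feature sets of two provisions."""
    fa, fb = features(a), features(b)
    return len(fa & fb) / len(fa | fb)

# invented example provisions from two hypothetical codes
us = "Ramps shall have a maximum slope of 1:12."
eu = "The maximum slope of a ramp shall not exceed 1:12."
score = similarity(us, eu)
```

A real comparison would additionally weight features (e.g. by rarity) and exploit the XML document structure, but even this crude measure ranks provisions about the same concept above unrelated ones.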
This paper examines the impact of information technology (IT) utilization on construction firm performance. Based on empirical data collected from 74 US construction firms, the analyses provide evidence that IT has a positive impact on overall firm performance, schedule performance, and cost performance. Firm performance is a composite score of several metrics: schedule performance, cost performance, customer satisfaction, safety performance, and profit. No relationship is found between IT utilization and customer satisfaction, safety, or profit, although this may be due to limitations of the study, given the strong correlations between IT utilization and cost and schedule performance. The empirical evidence of a positive association between performance and IT use provided by this research is significant to both construction practice and the research literature. This evidence should encourage firms to adopt and invest in IT tools.
There are many construction projects in China, and large numbers of documents are exchanged among the multiple parties in a project, including the owner, the contractor and the engineer. Based on previous studies, an approach to utilizing the exchanged documents is established using data warehouse technology, and a prototype system called EXPLYZER is developed. The approach and the prototype system are verified through their application in a construction project. It is concluded that the approach can support decision-making in project management.
Recent research shows that current learning strategies in the construction industry have not been effective in implementing lean principles in construction. With that in mind, the researchers set out to investigate an alternative learning strategy to promote learning at the international level. A web-based environment was developed for this project with the intent of promoting learning and knowledge exchange on the theory and practice of "process transparency" across different countries.
The promise of lower costs for sensors that can be used for construction inspection means that inspectors will continue to have new choices to consider in creating inspection plans. However, these emerging inspection methods can require different activities, resources, and decisions such that it can be difficult to compare the emerging methods with other methods that satisfy the same inspection needs. Furthermore, the context in which inspection is performed can significantly influence how well certain inspection methods are suited for a given set of goals for inspection. Context information, such as weather, security, and the regulatory environment, can be used to understand what information about a component should be collected and how an inspection should be performed. The research described in this paper is aimed at developing an approach for comparing and selecting inspection plans. This approach consists of (1) refinement of given goals for inspection, if necessary, in order to address any additional information needs due to a given context and in order to reach a level of detail that can be addressed by an inspection activity; (2) development of constraints to describe how an inspection should be achieved; (3) matching of goals to available inspection methods, and generation of activities and resource plans in order to address the goals; and (4) selection of an inspection plan from among the possible plans that have been identified. The authors illustrate this approach with observations made at a local construction site.
The conceptual structure of an application that can support the structural analysis task in a distributed collaboratory is described in (van Rooyen and Olivier 2004). The application described there has a standalone component for executing the finite element method on a local workstation in the absence of network access. This application is comparable to current, local workstation based finite element packages. However, it differs fundamentally from standard packages since the application itself, and its objects, are adapted to support distributed execution of the analysis task. Basic aspects of an object-oriented framework for the development of applications which can be used in similar distributed collaboratories are described in this paper. An important feature of this framework is its application-centred design. This means that an application can contain any number of engineering models, where the models are formed by collections of objects according to semantic views within the application. This is achieved through very flexible classes Application and Model, which are described in detail. The advantages of the application-centred design approach are demonstrated with reference to the design of steel structures, where the finite element analysis model, member design model and connection design model interact to provide the required functionality.
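The application-centred idea - one pool of objects, shared by several semantic views - can be sketched as follows. The class and attribute names are our assumptions for illustration; they are not the actual Application and Model classes of the framework.

```python
class Model:
    """A semantic view: a named collection of references to shared objects."""
    def __init__(self, name):
        self.name = name
        self.objects = []

class Application:
    """Application-centred container: owns the objects and any number of
    engineering models formed over them as semantic views."""
    def __init__(self):
        self.objects = []
        self.models = {}

    def add_model(self, name):
        self.models[name] = Model(name)
        return self.models[name]

    def add_object(self, obj, *model_names):
        """Register an object once, then reference it from several views."""
        self.objects.append(obj)
        for name in model_names:
            self.models[name].objects.append(obj)

# one beam object shared by the analysis view and the member design view
app = Application()
app.add_model("analysis")
app.add_model("member_design")
beam = {"id": "B1", "profile": "IPE300"}
app.add_object(beam, "analysis", "member_design")
```

Because both views hold references to the same object rather than copies, a change made through the analysis model is immediately visible to the member design model, which is the interaction the abstract describes for steel design.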
The application of a recent method using formal power series is proposed. It is based on a new representation for solutions of Sturm-Liouville equations. This method is used to calculate the transmittance and reflectance coefficients of finite inhomogeneous layers with high accuracy and efficiency. By tailoring the refraction index profile defining the inhomogeneous medium, it is possible to develop very important applications such as optical filters. A number of profiles were evaluated, and some of them were then selected in order to improve their characteristics via modification of their profiles.
Low-skilled labor makes up a significant part of the construction sector, performing daily production tasks that do not require specific technical knowledge or confirmed skills. Today, the construction market demands increasing skill levels. Many jobs that were once undertaken by low-skilled or unskilled labor now demand some kind of formal skills. The jobs that require low-skilled labor are continually decreasing due to technological advancement and globalization. Jobs that previously required little or no training now require skilled people to perform the tasks appropriately. The study aims at improving the employability of less skilled manpower by finding ways to instruct them in performing construction tasks. A review of existing task instruction methodologies in construction and the gaps underlying them warrants an appropriate way to train and instruct low-skilled workers for tasks in construction. The idea is to ensure the required quality of construction, with technological and didactic aids appearing particularly purposeful to prepare potential workers for tasks in construction without exposing them to existing communication barriers. A BIM-based technology is considered promising, along with the integration of visual directives/animations to elaborate the construction tasks scheduled to be carried out on site.
We present an algebraically extended 2D image representation in this paper. In order to obtain more degrees of freedom, a 2D image is embedded into a certain geometric algebra. Combining methods of differential geometry, tensor algebra, monogenic signal and quadrature filter, the novel 2D image representation can be derived as the monogenic extension of a curvature tensor. The 2D spherical harmonics are employed as basis functions to construct the algebraically extended 2D image representation. From this representation, the monogenic signal and the monogenic curvature signal for modeling intrinsically one and two dimensional (i1D/i2D) structures are obtained as special cases. Local features of amplitude, phase and orientation can be extracted at the same time in this unique framework. Compared with the related work, our approach has the advantage of simultaneous estimation of local phase and orientation. The main contribution is the rotationally invariant phase estimation, which enables phase-based processing in many computer vision tasks.
This ethnographic study reports on emerging work processes and practices observed in the AEC (Architecture/Engineering/Construction) Global Teamwork program, i.e., what people experience when interacting with and through collaboration technologies, why people practice the way they do, and how the practice fits into the environment and changes work patterns. It presents the experience of two typical yet extreme high-performance AEC teamwork cases adopting and adapting to collaboration technologies, and how these technologies impact their work processes in practice. The findings illustrate the importance of collaboration technologies in cross-disciplinary, global teamwork. Observations indicate that high-performance teams that use the collaboration technologies effectively exhibit collaboration readiness at an early stage and manage to define a “third way” to meet the demands of the cross-disciplinary, multicultural and geographically distributed AEC workspace. The observations and implications represent the blueprint for yearly innovations and improvements to the design of the AEC Global Teamwork program.
For the dynamic behavior of lightweight structures such as thin shells and membranes exposed to fluid flow, the interaction between the two fields is often essential. Computational fluid-structure interaction provides a tool to predict this interaction and to complement or eventually replace expensive experiments. Partitioned analysis techniques enjoy great popularity for the numerical simulation of these interactions. This is due to their computational superiority over simultaneous, i.e. fully coupled monolithic, approaches, as they allow the independent use of suitable discretization methods and modular analysis software. For the fluid, we use GLS-stabilized finite elements on a moving domain based on the incompressible unsteady Navier-Stokes equations, where the formulation guarantees geometric conservation on the deforming domain. The structure is discretized by nonlinear, three-dimensional shell elements.
Commonly used sequential staggered coupling schemes may exhibit instabilities due to the so-called artificial added mass effect. As the best remedy to this problem, subiterations should be invoked to guarantee kinematic and dynamic continuity across the fluid-structure interface. Since iterative coupling algorithms are computationally very costly, their convergence rate is decisive for their usability. To ensure and accelerate the convergence of this iteration, the updates of the interface position are relaxed. The time-dependent, 'optimal' relaxation parameter is determined automatically, without any user input, by exploiting a gradient method or applying an Aitken iteration scheme.
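The Aitken scheme named above computes the relaxation factor from two successive interface residuals. A minimal sketch on a generic fixed-point problem (the model function g and its fixed point are illustrative assumptions, standing in for one fluid-then-structure solve per subiteration):

```python
import numpy as np

def aitken_fixed_point(g, d0, omega0=0.5, tol=1e-12, max_iter=100):
    """Relaxed fixed-point iteration d <- d + omega * r with r = g(d) - d,
    where omega is updated by Aitken's dynamic relaxation."""
    d = np.asarray(d0, dtype=float)
    r_old, omega = None, omega0
    for k in range(max_iter):
        r = g(d) - d                         # interface residual
        if np.linalg.norm(r) < tol:
            return d, k
        if r_old is not None:
            dr = r - r_old
            # Aitken update: omega_k = -omega_{k-1} * (r_{k-1} . dr) / |dr|^2
            omega = -omega * (r_old @ dr) / (dr @ dr)
        d = d + omega * r
        r_old = r
    return d, max_iter

# illustrative contraction with fixed point d* = [1, 2]
g = lambda d: 0.5 * d + np.array([0.5, 1.0])
d_star, iters = aitken_fixed_point(g, np.zeros(2))
```

For this linear model problem the Aitken factor becomes exact after one update, so the iteration converges in a couple of steps; in a real partitioned FSI loop the same update typically cuts the subiteration count substantially compared to a fixed relaxation factor.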
In engineering science, the modeling and numerical analysis of complex systems and relations plays an important role. In order to perform such an investigation, for example a stochastic analysis, in reasonable computational time, approximation procedures have been developed. A well-known approach is the response surface method, where the relation between input and output quantities is represented, for example, by global polynomials or by local interpolation schemes such as Moving Least Squares (MLS). In recent years, artificial neural networks (ANN) have been applied for such purposes as well. Recently, an adaptive response surface approach for reliability analyses was proposed, which is very efficient concerning the number of expensive limit state function evaluations. Due to the applied simplex interpolation, the procedure is limited to small dimensions. In this paper, this approach is extended to larger dimensions using combined ANN and MLS response surfaces to evaluate the adaptation criterion with only one set of joined limit state points. As adaptation criterion, a combination of the maximum difference in the conditional probabilities of failure and the maximum difference in the approximated radii is applied. Compared to response surfaces on directional samples or to plain directional sampling, the failure probability can be estimated with a much smaller number of limit state points.
Major problems in applying selective sensitivity to system identification are the requirement of precise knowledge about the system parameters and the realization of the required system of forces. This work presents a procedure which is able to derive selectively sensitive excitation through iterative experiments. The first step is to determine the selectively sensitive displacement and selectively sensitive force patterns. These values are obtained by introducing the prior information on the system parameters into an optimization which minimizes the sensitivities of the structural response with respect to the unselected parameters while keeping the sensitivities with respect to the selected parameters constant. In a second step, the force pattern is used to derive dynamic loads on the tested structure and measurements are carried out. An automatic control ensures the required excitation forces. In a third step, the measured outputs are employed to update the prior information. The strategy is to minimize the difference between a predicted displacement response, formulated as a function of the unknown parameters and the measured displacements, and the selectively sensitive displacement calculated in the first step. With the updated parameter values, a re-analysis of selective sensitivity is performed and the experiment is repeated until the displacement responses of the model and the actual structure conform. As an illustration, a simply supported steel beam under harmonic excitation is investigated, demonstrating that the adaptive excitation can be obtained efficiently.
We present recent developments of adaptive wavelet solvers for elliptic eigenvalue problems. We describe the underlying abstract iteration scheme of the preconditioned perturbed iteration. We apply the iteration to a simple model problem in order to identify the main ideas which a numerical realization of the abstract scheme is based upon. This indicates how these concepts carry over to wavelet discretizations. Finally we present numerical results for the Poisson eigenvalue problem on an L-shaped domain.
The uncertainty existing in the construction industry is greater than in other industries. Consequently, most construction projects do not go entirely as planned. The project management plan therefore needs to be adapted repeatedly within the project lifecycle to suit the actual project conditions. Generally, the risks of change in the project management plan are difficult to identify in advance, especially if these risks are caused by unexpected events such as human errors or changes in the client's preferences. The knowledge acquired from different resources is essential to identify the probable deviations as well as to find proper solutions to the change risks faced. Hence, it is necessary to have a knowledge base that contains known solutions for the common exceptional cases that may cause changes in each construction domain. The ongoing research work presented in this paper uses the process modeling technique of Event-driven Process Chains to describe different patterns of structure changes in the schedule networks. This results in several so-called “change templates”. Under each template, different types of change risk/response pairs can be categorized and stored in a knowledge base. This knowledge base is described as an ontology model populated with reference construction process data. The implementation of the developed approach can be seen as an iterative scheduling cycle that is repeated within the project lifecycle as new change risks surface. This helps to check the availability of ready solutions in the knowledge base for the situation at hand. Moreover, if the solution is adopted, CPSP (“Change Project Schedule Plan”), a prototype developed for the purpose of this research work, is used to make the needed structure changes of the schedule network automatically, based on the change template.
What-If scenarios can be implemented using the CPSP prototype in the planning phase to study the effect of specific situations without endangering the success of the project objectives. Hence, better designed and more maintainable project schedules can be achieved.
Acoustic travel-time tomography (ATOM) determines the distribution of the temperature in a propagation medium by measuring the travel time of acoustic signals between transmitters and receivers. To employ ATOM for indoor climate measurements, the impulse responses have been measured in the climate chamber lab of the Bauhaus-University Weimar and compared with the theoretical results of its image source model (ISM). A challenging task is distinguishing the reflections of interest in the reflectogram when the sound rays have similar travel times. This paper presents a numerical method to address this problem by finding optimal positions of transmitter and receiver, since these have a direct impact on the distribution of travel times. These optimal positions have the minimum number of simultaneous arrival times within a threshold level. Moreover, in the tomographic reconstruction, when some of the voxels remain empty of sound rays, the air temperature within those voxels is determined inaccurately. Based on the presented numerical method, the number of empty tomographic voxels is minimized to ensure the best sound-ray coverage of the room. Subsequently, a spatial temperature distribution is estimated by the simultaneous iterative reconstruction technique (SIRT). The experimental set-up in the climate chamber verifies the simulation results.
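The SIRT reconstruction mentioned above solves the ray-path system A x = t (A: path lengths per voxel, x: slowness per voxel, t: measured travel times) by averaged simultaneous corrections. A minimal sketch with an invented two-voxel, three-ray system (the update form is one common SIRT variant, not necessarily the exact one used in the paper):

```python
import numpy as np

def sirt(A, t, n_iter=200, relax=1.0):
    """Simultaneous Iterative Reconstruction Technique for A x = t."""
    A = np.asarray(A, dtype=float)
    x = np.zeros(A.shape[1])
    row_norm = (A ** 2).sum(axis=1)        # |a_i|^2 per ray
    col_count = (A != 0).sum(axis=0)       # number of rays crossing each voxel
    for _ in range(n_iter):
        resid = t - A @ x                  # travel-time residuals
        # distribute each ray's correction over its voxels, then average
        update = A.T @ (resid / row_norm)
        x = x + relax * update / col_count
    return x

A = np.array([[1.0, 0.0],                  # ray 1 crosses voxel 1 only
              [0.0, 1.0],                  # ray 2 crosses voxel 2 only
              [1.0, 1.0]])                 # ray 3 crosses both voxels
x_true = np.array([2.0, 3.0])              # invented slowness values
t = A @ x_true                             # synthetic travel times
x = sirt(A, t)
```

The reconstructed slowness is then converted to temperature via the temperature dependence of the sound speed; a voxel crossed by no ray has a zero column in A and receives no correction, which is exactly the empty-voxel problem the optimal sensor placement is designed to minimize.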
The paper presents the abstraction of process-relevant information in order to enable workflow management based on semantic data. It is shown, for three examples, how the standards define the information needed to perform a certain planning activity. Abstraction of process-relevant information is discussed for different granularities of the underlying process model. As one possible application, ProMiSE is introduced, which uses process-relevant data in individual tokens in a Petri-net-based process model.
A UNIFIED APPROACH FOR THE TREATMENT OF SOME HIGHER DIMENSIONAL DIRAC TYPE EQUATIONS ON SPHERES
(2010)
Using Clifford analysis methods, we provide a unified approach to obtain explicit solutions of some partial differential equations combining the n-dimensional Dirac and Euler operators, including generalizations of the classical time-harmonic Maxwell equations. The obtained regular solutions show strong connections between hypergeometric functions and homogeneous polynomials in the kernel of the Dirac operator.
A four-node quadrilateral shell element with smoothed membrane-bending based on Mindlin-Reissner theory is proposed. The element is a combination of a plate bending and membrane element. It is based on mixed interpolation where the bending and membrane stiffness matrices are calculated on the boundaries of the smoothing cells while the shear terms are approximated by independent interpolation functions in natural coordinates. The proposed element is robust, computationally inexpensive and free of locking. Since the integration is done on the element boundaries for the bending and membrane terms, the element is more accurate than the MITC4 element for distorted meshes. This will be demonstrated for several numerical examples.
As computer programs become ever more complex, software development has shifted from a focus on programming towards a focus on integration. This paper describes a simulation access language (SimAL) that can be used to access and compose software applications over the Internet. Specifically, the framework is developed for the integration of tools for project management applications. The infrastructure allows users to specify and use existing heterogeneous tools (e.g., Microsoft Project, Microsoft Excel, Primavera Project Planner, and AutoCAD) for the simulation of project scenarios. This paper describes the components of the SimAL language and the implementation efforts required in the development of the SimAL framework. An illustrative example that brings an online weather forecasting service into project scheduling and management applications is provided to demonstrate the use of the simulation language and the infrastructure framework.
Information technology plays a key role in the everyday operation of buildings and campuses. Many proprietary technologies and methodologies can assist in effective Building Performance Monitoring (BPM) and efficient management of building resources. The integration of related tools, such as energy simulation packages, facility, energy and building management systems, and enterprise resource planning systems, is of benefit to BPM. However, the complexity of integrating such domain-specific systems has prevented their common usage. Service Oriented Architecture (SOA) has been deployed successfully in many large multinational companies to create integrated and flexible software systems, but so far this methodology has not been applied broadly in the field of BPM. This paper envisions that SOA provides an effective integration framework for BPM. A service-oriented architecture for the ITOBO framework for sustainable and optimised building operation is proposed, and an implementation of a building performance monitoring system is introduced.
We present a software prototype for fluid flow problems in civil engineering, which combines essential features of Computational Steering approaches with efficient methods for model transfer and high performance computing. The main components of the system are described:
- The modeler, with a focus on the data management of the product model
- The pre-processing and post-processing toolkit
- The simulation kernel based on the Lattice Boltzmann method
- The required hardware for real-time computing
The truss model for predicting the shear resistance of reinforced concrete beams has usually been criticized for underestimating the concrete shear strength, especially for beams with low shear reinforcement. Two challenges are commonly encountered in any truss model and are responsible for its inaccurate shear strength prediction: first, the cracking angle is usually assumed empirically, and second, the shear contribution of the arching action is usually neglected. This research introduces a novel approach, using an Artificial Neural Network (ANN), for accurately evaluating the shear cracking angle of reinforced and prestressed concrete beams. The model inputs include the beam geometry, the concrete strength, the shear reinforcement ratio and the prestressing stress, if any. ...
For the analysis of arbitrary shell structures discretized by finite elements, an efficient numerical simulation strategy with quadratic convergence, including geometrically and physically nonlinear effects, is presented. First, a Finite-Rotation shell theory allowing constant shear deformations across the shell thickness is given in an isoparametric formulation. The assumed-strain concept enables the derivation of a locking-free finite element. The Layered Approach is applied to ensure a sufficiently precise prediction of the propagation of plastic zones through the shell thickness. The Riks-Wempner-Wessels global iteration scheme is enhanced by a Line-Search procedure to ensure the tracing of nonlinear deformation paths with relatively large load steps, even in the post-peak range. The elastic-plastic material model includes isotropic hardening. A new Operator-Split return algorithm ensures a highly accurate solution of the initial-value problem even for larger load steps. The combination with consistently linearized constitutive equations ensures quadratic convergence in a close neighbourhood of the exact solution. Finally, several examples demonstrate the accuracy and numerical efficiency of the developed algorithm.
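The elastic-predictor/plastic-corrector structure underlying such return algorithms can be illustrated in its simplest 1D form. The sketch below is the textbook radial return for linear isotropic hardening with invented material parameters, not the paper's Operator-Split algorithm for shells:

```python
def radial_return(eps, eps_p, alpha, E=210e3, H=10e3, sigma_y=240.0):
    """One load step of 1D plasticity with linear isotropic hardening.
    Elastic predictor / plastic corrector; illustrative parameters in MPa."""
    sigma_trial = E * (eps - eps_p)                  # elastic predictor
    f = abs(sigma_trial) - (sigma_y + H * alpha)     # trial yield function
    if f <= 0.0:
        return sigma_trial, eps_p, alpha             # step stays elastic
    dgamma = f / (E + H)                             # plastic corrector
    sign = 1.0 if sigma_trial > 0 else -1.0
    sigma = sigma_trial - E * dgamma * sign          # return to yield surface
    return sigma, eps_p + dgamma * sign, alpha + dgamma

# a strain of 0.002 exceeds first yield (240 / 210e3), giving a plastic step
sigma, eps_p, alpha = radial_return(0.002, 0.0, 0.0)
```

The corrected stress lands exactly on the updated yield surface sigma_y + H * alpha, and linearizing this update consistently is what yields the quadratic convergence of the global Newton-type iteration described in the abstract.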
The Element-free Galerkin Method has become a very popular tool for the simulation of mechanical problems with moving boundaries. The internally applied Moving Least Squares approximation generally uses Gaussian or cubic weighting functions and has compact support. Due to the approximative character of this method, the obtained shape functions do not fulfill the interpolation condition, which causes additional numerical effort for the imposition of the essential boundary conditions. The application of a singular weighting function, which leads to singular coefficient matrices at the nodes, can solve this problem, but requires a very careful placement of the integration points. Special procedures for handling such singular matrices have been proposed in the literature, which require additional numerical effort. In this paper a non-singular weighting function is presented, which leads to an exact fulfillment of the interpolation condition. This weighting function yields regular values of the weights and of the coefficient matrices in the whole interpolation domain, even at the nodes. Furthermore, this function gives much more stable results for varying sizes of the influence radius and for strongly distorted nodal arrangements than classical weighting function types. Nevertheless, for practical applications the results are similar to those obtained with the regularized weighting type presented by the authors in previous publications. Finally, a new concept is presented which enables an efficient analysis of systems with strongly varying node density. In this concept, the nodal influence domains are adapted to the nodal configuration by interpolating the influence radius for each direction from the distances to the natural neighbor nodes. This approach requires a Voronoi diagram of the domain, which is available in this study since Delaunay triangles are used as integration background cells.
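The paper's specific non-singular weighting function is not reproduced here; to illustrate the underlying idea, the following 1D sketch uses a regularized, almost-singular weight (this formula and the parameter EPS are our assumptions) inside an MLS approximation and checks that the interpolation condition is then fulfilled almost exactly at a node, without singular coefficient matrices.

```python
import numpy as np

EPS = 1e-5   # regularization parameter (illustrative choice)

def weights(x, nodes, radius):
    """Regularized almost-singular weighting: very large but finite at nodes."""
    r2 = ((x - nodes) / radius) ** 2
    w = np.where(r2 < 1.0, (r2 + EPS) ** -2 - (1.0 + EPS) ** -2, 0.0)
    return w / w.sum()

def mls(x, nodes, values, radius):
    """MLS approximation with linear basis and regularized weights."""
    w = weights(x, nodes, radius)
    P = np.vstack([np.ones_like(nodes), nodes]).T
    A = P.T @ (w[:, None] * P)                 # regular moment matrix
    coeff = np.linalg.solve(A, P.T @ (w * values))
    return np.array([1.0, x]) @ coeff

nodes = np.linspace(0.0, 1.0, 5)
values = np.sin(nodes)
# near-interpolation at a node: the error is controlled by EPS
err = abs(mls(nodes[2], nodes, values, radius=0.6) - values[2])
```

Because the weight at a node dominates all others by a factor of order 1/EPS², the fitted value there essentially coincides with the nodal value, so essential boundary conditions can be imposed directly, while all matrices remain regular everywhere in the domain.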
In the numerical examples it is shown that this method leads to a more uniform and reduced number of influencing nodes for systems with varying node density than classical circular influence domains. The small additional numerical effort for interpolating the influence radius thus leads to a remarkable reduction of the total numerical cost in a linear analysis while obtaining similar results. For nonlinear calculations this advantage would be even more significant.
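The violated interpolation condition discussed above can be demonstrated with a minimal one-dimensional Moving Least Squares computation. The sketch below uses a truncated Gaussian weight (a common classical choice, not the non-singular weight proposed in the paper) and node positions chosen only for illustration:

```python
import math

def mls_shape_functions(x, nodes, radius):
    """1D Moving Least Squares shape functions with a linear basis
    p = [1, x] and a truncated Gaussian weight (classical choice)."""
    w = []
    for xi in nodes:
        d = abs(x - xi) / radius
        w.append(math.exp(-(d / 0.4) ** 2) if d <= 1.0 else 0.0)
    # moment matrix A = sum_i w_i p(x_i) p(x_i)^T  (2x2 for a linear basis)
    a00 = sum(w)
    a01 = sum(wi * xi for wi, xi in zip(w, nodes))
    a11 = sum(wi * xi * xi for wi, xi in zip(w, nodes))
    det = a00 * a11 - a01 * a01
    # phi_i(x) = p(x)^T A^{-1} w_i p(x_i)
    phi = []
    for wi, xi in zip(w, nodes):
        b0, b1 = wi, wi * xi
        c0 = (a11 * b0 - a01 * b1) / det
        c1 = (-a01 * b0 + a00 * b1) / det
        phi.append(c0 + c1 * x)
    return phi

nodes = [0.0, 0.5, 1.0, 1.5, 2.0]
phi = mls_shape_functions(0.5, nodes, radius=1.1)
print(sum(phi))   # partition of unity holds (1.0 up to round-off) ...
print(phi[1])     # ... but phi at its own node is < 1: no interpolation
```

Because the regular Gaussian weight stays finite at the node, the shape function there is smaller than one, which is exactly why essential boundary conditions need extra treatment unless a singular or interpolating weighting function is used.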
In earlier research, generalized multidimensional Hilbert transforms have been constructed in m-dimensional Euclidean space, in the framework of Clifford analysis. Clifford analysis, centred around the notion of monogenic functions, may be regarded as a direct and elegant generalization to higher dimensions of the theory of holomorphic functions in the complex plane. The considered Hilbert transforms, usually obtained as a part of the boundary value of an associated Cauchy transform in m+1 dimensions, may be characterized as isotropic, since the metric in the underlying space is the standard Euclidean one. In this paper we adopt the idea of a so-called anisotropic Clifford setting, which leads to the introduction of a metric-dependent m-dimensional Hilbert transform showing, at least formally, the same properties as the isotropic one. Since the Hilbert transform is an important tool in signal analysis, this metric-dependent setting has the advantage of allowing the co-ordinate system to be adjusted to possible preferential directions in the signals to be analyzed. A striking result to be mentioned is that the associated anisotropic (m+1)-dimensional Cauchy transform is no longer uniquely determined, but may stem from a diversity of (m+1)-dimensional "mother" metrics.
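For orientation, the isotropic m-dimensional Hilbert transform of Clifford analysis referred to above is commonly written, for a vector variable $\underline{x}$, in the following form (a standard reference formula, not taken from this paper):

```latex
H[f](\underline{x}) \;=\; \frac{2}{a_{m+1}}\,\operatorname{p.v.}\!
\int_{\mathbb{R}^m} \frac{\underline{x}-\underline{y}}
{|\underline{x}-\underline{y}|^{m+1}}\, f(\underline{y})\, dV(\underline{y}),
\qquad
a_{m+1} = \frac{2\pi^{(m+1)/2}}{\Gamma\!\left(\tfrac{m+1}{2}\right)}
```

where $a_{m+1}$ is the area of the unit sphere in $\mathbb{R}^{m+1}$. For $m=1$ the kernel reduces to $1/(x-y)$ and $a_2 = 2\pi$, recovering the classical Hilbert transform $\frac{1}{\pi}\,\operatorname{p.v.}\!\int f(y)/(x-y)\,dy$; the anisotropic transform of the paper replaces the Euclidean metric in this kernel by a general one.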
This paper focuses on a new three-level discretisation strategy that enables the transition between continuum/structural (I) and structural/black-box modelling (II). Transition (I) is realised by means of a model-adaptive concept based on an innovative finite element technology. For transition (II) we apply the truncated balanced realisation (TBR) method. The latter is an established system-theoretical model reduction technique, which is here combined with a novel substructure technique. The approach provides a modular concept to facilitate the computational analysis of complex structures. The final goal is to apply the strategy to lifetime estimation.
Modern distributed engineering applications are based on complex systems consisting of various subsystems that are connected through the Internet. Communication and collaboration within the entire system require reliable and efficient data exchange between the subsystems. Middleware developed during the evolution of the web in recent years provides reliable and efficient data exchange for web applications and can be adopted to solve the data exchange problems in distributed engineering applications. This paper presents a generic approach for reliable and efficient data exchange between engineering devices using existing middleware known from web applications. Several existing middleware systems are examined with respect to their suitability for engineering applications. A suitable middleware is identified, and a prototype implementation simulating distributed wind farm control is presented and validated using several performance measurements.
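The message-oriented data exchange between such subsystems can be sketched in miniature. In the sketch below an in-process queue stands in for real middleware (e.g. a message broker), and the turbine identifier, measurement values, and function names are all invented for illustration:

```python
import json
import queue
import threading

# The queue plays the role of the middleware transport: it decouples the
# producing subsystem from the consuming one and guarantees delivery order.
broker = queue.Queue()

def turbine(turbine_id, measurements):
    """Subsystem: publish sensor readings as self-describing JSON messages."""
    for rpm in measurements:
        broker.put(json.dumps({"turbine": turbine_id, "rpm": rpm}))

def control_centre(expected):
    """Subsystem: reliably receive and decode messages (blocking receive)."""
    return [json.loads(broker.get()) for _ in range(expected)]

producer = threading.Thread(target=turbine, args=("WT-01", [12.1, 12.4, 12.3]))
producer.start()
msgs = control_centre(expected=3)
producer.join()
print(len(msgs), msgs[0]["turbine"])
```

A real deployment would replace the in-process queue with networked middleware, but the producer/consumer decoupling shown here is the property that makes the exchange reliable.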
Interactive visualization based on 3D computer graphics is nowadays an indispensable part of any simulation software used in engineering. Nevertheless, the implementation of such visualization software components is often avoided in research projects because it is a challenging and potentially time-consuming task. In this contribution, a novel Java framework for the interactive visualization of engineering models is introduced. It supports the task of implementing engineering visualization software by providing adequate program logic as well as high-level classes for the visual representation of entities typical for engineering models. The presented framework is built on top of the open-source visualization toolkit VTK. In VTK, a visualization model is established by connecting several filter objects in a so-called visualization pipeline. Although designing and implementing a good pipeline layout is demanding, VTK does not directly support the reuse of pipeline layouts. Our framework tailors VTK to engineering applications on two levels. On the first level it adds new, engineering-model-specific filter classes to VTK. On the second level, ready-made pipeline layouts for certain aspects of engineering models are provided. For instance, there is a pipeline class for one-dimensional elements like trusses and beams that is capable of showing the elements along with deformations and member forces. In order to facilitate the implementation of a graphical user interface (GUI) for each pipeline class, there is a reusable Java Swing GUI component that allows the user to configure the appearance of the visualization model. Because of its flexible structure, the framework can easily be adapted and extended to new problem domains.
Currently it is used in (i) an object-oriented p-version finite element code for design optimization, (ii) an agent-based monitoring system for dam structures and (iii) the simulation of destruction processes by controlled explosives based on multibody dynamics. Application examples from all three domains illustrate that the presented approach is powerful as well as versatile.
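The idea of a reusable pipeline layout, as opposed to wiring individual filters by hand, can be illustrated independently of VTK. The following self-contained Python sketch mimics the pull-based filter pipeline and a ready-made layout for one-dimensional elements; class names, the scaling parameter, and the data are invented, and the real framework is Java on top of VTK:

```python
class Filter:
    """Minimal pull-based pipeline filter: on update(), pull data from the
    upstream filter and transform it (the core idea of a VTK-style pipeline)."""
    def __init__(self, fn, upstream=None):
        self.fn, self.upstream = fn, upstream

    def update(self):
        data = self.upstream.update() if self.upstream else None
        return self.fn(data)

class TrussPipeline:
    """A ready-made pipeline layout for one-dimensional elements:
    node geometry -> apply scaled deformation -> map to drawable segments."""
    def __init__(self, nodes, displacements, scale=1.0):
        source = Filter(lambda _: nodes)
        deform = Filter(
            lambda pts: [(x + scale * dx, y + scale * dy)
                         for (x, y), (dx, dy) in zip(pts, displacements)],
            upstream=source)
        # pair consecutive deformed points into line segments for rendering
        self.sink = Filter(lambda pts: list(zip(pts, pts[1:])), upstream=deform)

    def render(self):
        return self.sink.update()

segs = TrussPipeline([(0, 0), (1, 0), (2, 0)],
                     [(0, 0), (0, -0.1), (0, 0)], scale=10).render()
print(segs)
```

Encapsulating the whole layout in one class is what makes the layout reusable: a client configures nodes, displacements and a scale factor instead of reconnecting filters.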
This paper describes a framework for computer-aided conceptual design of building structures that results from architectural considerations. The central task carried out during conceptual design is the synthesis of the structural system. This paper proposes a methodology for the synthesis of structural solutions. Given the nature of architectural constraints, user-model interactivity is devised as the most suitable computer methodology for driving the structural synthesis process. Taking advantage of the hierarchical organization of the structural system, this research proposes a top-down approach for structural synthesis. Through hierarchical refinement, the approach lends itself to the synthesis of global and local structural solutions. The components required for implementing the proposed methodology are briefly described. The main components have been incorporated in a proof-of-concept prototype that is being tested and validated with actual buildings.
Isoparametric finite elements with linear shape functions generally show a too stiff element behavior, called locking. When structural parts under bending loads are investigated, so-called shear locking appears, because these elements cannot reproduce pure bending modes. Many studies have dealt with the locking problem, and a number of methods to avoid the undesirable effects have been developed. Two well-known methods are the 'Assumed Natural Strain' (ANS) method and the 'Enhanced Assumed Strain' (EAS) method. In this study the EAS method is applied to a four-node plane element with four EAS parameters. The paper describes the well-known linear formulation, its extension to nonlinear materials, and the modeling of material uncertainties with random fields. For nonlinear material behavior the EAS parameters cannot be determined directly. Here the problem is solved by using an internal iteration at the element level, which is much more efficient and stable than determination via a global iteration. To verify the deterministic element behavior, the results of common test examples are presented for linear and nonlinear materials. The modeling of material uncertainties is done by point-discretized random fields. To show the applicability of the element to stochastic finite element calculations, Latin Hypercube Sampling was applied to investigate the stochastic hardening behavior of a cantilever beam with nonlinear material. The enhanced linear element can be applied as an alternative to higher-order finite elements, where more nodes are necessary. The presented element formulation can be used in a similar manner to improve stochastic linear solid elements.
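The Latin Hypercube Sampling used for the stochastic calculations is a standard variance-reduction scheme and can be sketched independently of the element formulation. The version below samples the unit hypercube; sample counts and the seed are arbitrary:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """Latin Hypercube Sampling on the unit hypercube: each dimension is
    split into n_samples equal strata and every stratum is hit exactly once,
    which covers the marginal distributions far better than plain Monte Carlo."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        for i, s in enumerate(strata):
            # place the point at a random position inside its stratum
            samples[i][d] = (s + rng.random()) / n_samples
    return samples

pts = latin_hypercube(10, 2)
# per dimension: exactly one point falls into each stratum [k/10, (k+1)/10)
```

In a stochastic finite element setting, each row of `pts` would be mapped through the inverse distribution functions of the random material parameters before running the deterministic element analysis.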
The methods currently used for scheduling building processes have some major advantages as well as disadvantages. The main advantages are the arrangement of the tasks of a project in a clear, easily readable form and the calculation of valuable information such as critical paths. The main disadvantage, on the other hand, is the inflexibility of the model caused by its modeling paradigms: small changes to the modeled information strongly influence the whole model and create the need to change many more details in the plan. In this article an approach is introduced that allows the creation of more flexible schedules. It aims at a more robust model that reduces the number of changes required while still being able to calculate the important results of the established models and leading to further valuable conclusions.
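The critical-path information mentioned above is, at its core, a longest-path computation on a task graph, which is also the kind of graph-theoretic algorithm such flexible models must still support. A generic sketch (task names, durations, and dependencies are invented):

```python
def critical_path_duration(tasks, deps):
    """Longest-path (critical path) duration on a task DAG.
    tasks: {name: duration}; deps: list of (predecessor, successor) pairs."""
    succ = {t: [] for t in tasks}
    indeg = {t: 0 for t in tasks}
    for a, b in deps:
        succ[a].append(b)
        indeg[b] += 1
    # topological order via Kahn's algorithm
    order, stack = [], [t for t in tasks if indeg[t] == 0]
    while stack:
        t = stack.pop()
        order.append(t)
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                stack.append(s)
    # earliest finish time of each task; the maximum is the project duration
    finish = {t: tasks[t] for t in tasks}
    for t in order:
        for s in succ[t]:
            finish[s] = max(finish[s], finish[t] + tasks[s])
    return max(finish.values())

dur = critical_path_duration(
    {"excavate": 3, "foundation": 5, "walls": 8, "roof": 4},
    [("excavate", "foundation"), ("foundation", "walls"), ("walls", "roof")])
print(dur)  # 20
```

Because only the dependency relation enters the computation, a schedule model that stores tasks and relations in a core graph can re-derive this information automatically after every modification.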
The evolution of data exchange and integration standards within the Architectural, Engineering and Construction industry is gradually making the long-held vision of computer-integrated construction a reality. The Industry Foundation Classes (IFC) and the CIMSteel Integration Standards (CIS/2) are two such standards that have seen remarkable successes over the past few years. Despite these successes, the standards support the exchange of product data more than process data, especially for processes that are loosely coupled with product models. This paper reports on ongoing research to evaluate the adequacy of the IFC and CIS/2 standards to support process modeling in the steel supply chain. Some initial recommendations are made regarding enhancements to the data standards to better support processes.