Experience from past arch dam designs shows that shape optimization of arch dams has significant practical value: it makes full use of the material characteristics and reduces construction cost. Suitable variables need to be chosen to formulate the objective function, e.g. to minimize the total volume of the arch dam. Additionally, a series of constraints is derived and a reasonable and convenient penalty function is formed, which easily enforces the characteristics of the constraints on the optimal design. For the optimization method, a Genetic Algorithm is adopted to perform a global search. Simultaneously, ANSYS is used for the mechanical analysis under coupled thermal and hydraulic loads. One of the constraints on the newly designed dam is that it must fulfill requirements on structural safety. Therefore, a reliability analysis is applied to offer sound decision support for predictions of both the safety and the service life of the arch dam. In this way, the key factors that significantly influence the stability and safety of an arch dam can be identified, providing a good basis for preventive measures to prolong the service life of an arch dam and enhance the safety of the structure.
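The coupling of a penalized objective with a genetic algorithm, as described above, can be sketched generically. The two-variable "dam" objective and constraint below are purely illustrative stand-ins, not the paper's actual model or load cases:

```python
import random

# Toy stand-in for the approach above: minimize a "volume" objective
# subject to a "stress" constraint enforced via an exterior penalty.
# Geometry, constraint and limits are invented for illustration only.

def volume(x):
    # x = (crest_thickness, base_thickness); fictitious volume measure
    return x[0] + 3.0 * x[1]

def stress_violation(x):
    # Fictitious constraint g(x) <= 0: violated when the product is < 1
    g = 1.0 - (x[0] * x[1])
    return max(0.0, g)

def penalized(x, r=100.0):
    # Feasible designs pay nothing; infeasible ones pay r * g^2
    return volume(x) + r * stress_violation(x) ** 2

def ga(pop_size=40, gens=200, bounds=(0.1, 5.0), seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=penalized)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()                   # blend crossover + mutation
            child = tuple(min(hi, max(lo, w * ai + (1 - w) * bi
                                      + rng.gauss(0, 0.1)))
                          for ai, bi in zip(a, b))
            children.append(child)
        pop = elite + children
    return min(pop, key=penalized)

best = ga()
```

The penalty keeps the GA's unconstrained ranking while steering the population toward the feasible region; the penalty weight trades off constraint satisfaction against objective value.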
This paper presents initial findings from the empirical analysis of community-based social enterprise (SE) and non-profit organisation (NPO) ecosystems in Johannesburg. SEs and NPOs are widely recognised as contributors to the resilience of marginalised urban communities. However, the connection between these organisations, urban governance, and community resilience has not yet been sufficiently understood, particularly in African urban contexts. The 'Resilient Urban Communities' project focuses on Johannesburg as a case study to shed light on this under-researched topic. The key to exploring it is understanding SEs and NPOs as providers of public services, job creators, and promoters of good governance, all of which contribute to community resilience. Using this premise as a starting point, this paper investigates ecosystem conditions with a particular focus on state-civil society partnerships. Empirical data was generated through semi-structured interviews and analysed with a grounded theory approach. Preliminary results of this ongoing research reveal that urban geography is a relevant ecosystem factor for SEs and NPOs from marginalised communities. We also suggest that co-production could be an opportunity for growth within the investigated state-civil society partnership.
This paper concerns schedule synchronization problems in public transit networks. It consists of three main parts. In the first part, the subject area is introduced, the terms are defined, and a framework for optimal synchronization in the form of a problem representation and formulation is proposed. The second part is devoted to the transfer synchronization problem, which arises when passengers change transit lines at transfer points. An integrated Tabu Search and Genetic solution method is developed for this specific problem. The third part deals with the headway harmonization problem, i.e. the synchronization of the schedules of different transit lines on common segments of routes. For the solution of this problem, a new bilevel optimization method is proposed, with zone harmonization at the bottom level and coordination of zones, by time buffers assigned to timing points, at the upper level. Finally, the synchronization problems are numerically illustrated by real-life examples of public transport lines in Cracow.
Seven existing buildings of the Herzogin-Anna-Amalia-Bibliothek Weimar were to be converted and supplemented by two above-ground and three underground new buildings. The paper describes in detail: the investigation of hazardous substances and the assessment of the risks posed by the construction work, the safety measures, and the execution of the cleaning work.
Although there are some good reasons to design engineering software as a stand-alone application for a single computer, there are also numerous possibilities for creating distributed engineering applications, in particular using the Internet. This paper presents some typical scenarios of how engineering applications can benefit from including network capabilities. In addition, some examples of Internet-based engineering applications are discussed to show how the presented concepts can be implemented.
The paper introduces a systematic construction management approach supporting the expansion of a specified construction process, both automatically and semi-automatically. Throughout the whole design process, many requirements must be taken into account in order to fulfil the demands defined by clients. In implementing those demands into a design concept and, eventually, the execution plan, constraints such as site conditions, building codes, and the legal framework have to be considered. However, the complete information needed to make a sound decision is not yet available in the early phase; decisions are traditionally taken based on experience and assumptions. Due to the vast number of available solutions, particularly in building projects, it is necessary to make those decisions traceable. This is important in order to be able to reconstruct the considerations and assumptions taken should the project's objectives change in the future. The research is carried out by means of building information modelling, where rules derived from the standard logic of construction management knowledge are applied. This knowledge comprises the comprehensive interaction amongst the bidding process, cost estimation, construction site preparation and specific project logistics, which are usually still considered separately. By means of these rules, decisions regarding prefabrication and in-situ execution can be justified. Modifications depending on the information available at the current design stage will remain consistently traceable.
In this paper, experimental studies and numerical analyses carried out on reinforced concrete beams are partially reported. They aimed to apply the rigid finite element method to calculations for reinforced concrete beams using a discrete crack model. Hence, the rotational ductility resulting from crack occurrence had to be determined, and a relationship for calculating it in static equilibrium was proposed. Laboratory experiments proved that dynamic ductility is considerably smaller. Therefore, the empirical parameter was scaled, and a formula for its value depending on the reinforcement ratio was obtained.
The topic of structural robustness is covered extensively in the current structural engineering literature, and a few evaluation methods already exist. Since these methods are based on different evaluation approaches, comparing them is difficult. However, all the approaches have one thing in common: they need a structural model that represents the structure to be evaluated. As the structural model is the basis of the robustness evaluation, the question arises whether the quality of the chosen structural model influences the estimated structural robustness index. This paper explains what robustness means in structural engineering and gives an overview of existing assessment methods. One is the reliability-based robustness index, which uses the reliability indices of an intact and a damaged structure. The second is the risk-based robustness index, which estimates structural robustness using direct and indirect risk. The paper describes how these approaches to the evaluation of structural robustness work and which parameters are used. Since both approaches need a structural model for the estimation of the structural behavior and the probability of failure, it is necessary to consider the quality of the chosen structural model. In any case, the chosen model has to represent the structure and the input factors and reflect the damages that occur. Using the example of two different model qualities, it is shown that the choice of model strongly influences the quality of the robustness index.
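The two indices named above are commonly written as follows in the literature (the reliability-based index is usually attributed to Frangopol and Curley, the risk-based index to Baker et al.; the exact variants used in this paper may differ):

```latex
\beta_R = \frac{\beta_{\mathrm{intact}}}{\beta_{\mathrm{intact}} - \beta_{\mathrm{damaged}}},
\qquad
I_{\mathrm{Rob}} = \frac{R_{\mathrm{Dir}}}{R_{\mathrm{Dir}} + R_{\mathrm{Ind}}}
```

Here \beta_{\mathrm{intact}} and \beta_{\mathrm{damaged}} are the reliability indices of the intact and damaged structure, and R_{\mathrm{Dir}}, R_{\mathrm{Ind}} are the direct and indirect risks; I_{\mathrm{Rob}} approaches 1 when consequences remain direct (robust) and 0 when indirect consequences dominate.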
In construction engineering, a schedule’s input data, which is usually not exactly known in the planning phase, is considered deterministic when generating the schedule. As a result, construction schedules become unreliable and deadlines are often not met. While the optimization of construction schedules with respect to costs and makespan has been a matter of research in the past decades, the optimization of the robustness of construction schedules has received little attention. In this paper, the effects of uncertainties inherent to the input data of construction schedules are discussed. Possibilities are investigated to improve the reliability of construction schedules by considering alternative processes for certain tasks and by identifying the combination of processes generating the most robust schedule with respect to the makespan of a construction project.
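As a toy illustration of the last idea, one might compare combinations of alternative processes by Monte Carlo simulation of the makespan. The tasks, durations and spreads below are invented, and a simple serial chain of tasks stands in for a full precedence network:

```python
import itertools
import random
import statistics

# Illustrative sketch (not the paper's method): for a serial chain of
# tasks, each with alternative processes of uncertain duration, pick
# the combination whose simulated makespan varies least.

# alternatives[task] = list of (mean_duration, spread) per process option
alternatives = [
    [(5.0, 0.5), (6.0, 0.1)],   # task 0: fast/risky vs. slow/reliable
    [(4.0, 1.0), (5.0, 0.2)],   # task 1
    [(3.0, 0.3)],               # task 2: no alternative
]

def simulate(combo, n=2000, seed=7):
    # Sample n makespans for one combination of processes
    rng = random.Random(seed)
    makespans = []
    for _ in range(n):
        total = 0.0
        for (mu, sigma) in combo:
            total += max(0.0, rng.gauss(mu, sigma))
        makespans.append(total)
    return makespans

def most_robust():
    # Robustness criterion here: smallest makespan standard deviation
    best, best_std = None, float("inf")
    for combo in itertools.product(*alternatives):
        std = statistics.pstdev(simulate(combo))
        if std < best_std:
            best, best_std = combo, std
    return best

combo = most_robust()
```

In this toy setting the slower but more predictable processes win; a realistic formulation would also trade robustness off against expected makespan and cost.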
REVISITING 1923
(2011)
Addison Godel is a student at the Knowlton School of Architecture at Ohio State, working towards a three-year Master of Architecture. He is a teaching assistant for a variety of history and theory courses, as well as two of the school’s European travel-abroad programs. His interests include the relationship between style and larger cultural forces, and the efforts of architecture to symbolically adapt and represent contemporary technology.
Restelo Neighbourhood: Expanding the Capital of the Empire with the First Portuguese Urban Planner
(2015)
For decades in Germany, historical research on dictatorial urban design in the first half of the 20th century focused on the National Socialist period. Studies on the urban design practices of other dictatorships remained an exception. This has changed. Meanwhile, the urban production practices of the Mussolini, Stalin, Salazar, Hitler and Franco dictatorships have become the subject of comprehensive research projects. Recently, a research group that studies dictatorial urban design in 20th century Europe has emerged at the Bauhaus-Institut für Geschichte und Theorie der Architektur und der Planung. The group is already able to refer to various research results.
Part of the research group’s self-conception is the assumption that the urban design practices of the named dictatorships can only be properly understood from a European perspective. The dictatorships influenced one another substantially. Furthermore, the specificities of the practices of each dictatorship can only be discerned if one can compare them to those of the other dictatorships. This approach requires strict adherence to the research methods of planning history and urban design theory. At the same time, these methods must be opened up to include those of general historical studies.
With this symposium, the research group aims to further qualify this European perspective. The aim is to compile an inventory of the various national historiographies on the topic of “urban design and dictatorship”. This inventory should offer an overview both at the general level of national historical research on urban design and at the level of particular urban design projects, persons or topics.
The symposium took place in Weimar, November 21-22, 2013. It was organized by Harald Bodenschatz, Piero Sassi and Max Welch Guerra and funded by the DAAD (German Academic Exchange Service).
Most existing seismic design codes are based on response spectrum theory. The influence of inelastic deformations can be evaluated by considering an inelastic type of resisting force; the inelastic spectrum is then considerably different from the elastic one. The influence of stiffness degradation and strength deterioration can also be accounted for by including more precise material models. Some recent papers discuss the corresponding changes in response spectra due to the P-Δ effect. The experience accumulated from recent earthquakes indicates that structural pounding may considerably influence the response of structures and should be taken into account in design procedures. The most convenient way to do so is to predict the influence of the pounding on the response spectra for accelerations, velocities and displacements. Generally speaking, contact problems such as pounding are characterized by a large degree of nonlinearity and slow convergence of the computational procedures. Thus, obtaining spectra in which the contact problem is accounted for seems very attractive from an engineering point of view, because they could easily be implemented into design procedures. However, it is worth noting that there is no rigorous mathematical proof that the original system can be decomposed into single equations related to single-degree-of-freedom systems. The purpose of this paper is to study the influence of pounding on the response spectra and to evaluate the amplification due to the impact. For this purpose, two adjacent SDOF systems are considered that are able to interact during the vibration process. The problem is solved as a function of the elastic stiffness ratio, which appears to be very important for such an assemblage. The contact between the masses is numerically simulated using opening gap elements as links.
Comparisons between calculated response spectra and linear response spectra are made in order to derive analytical relationships to simply obtain the contribution of pounding. The results are graphically illustrated in response spectra format and the influence of the stiffness ratio is clarified.
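A minimal numerical sketch of the mechanical setup described above (two adjacent SDOF systems coupled by an opening gap element) might look as follows. All parameters are invented, and a simple semi-implicit Euler integrator stands in for whatever time-stepping scheme the authors used:

```python
import math

# Two adjacent elastic SDOF oscillators under the same base
# acceleration, coupled by a gap element that engages only when the
# gap d closes -- a crude model of structural pounding.
# All values below are illustrative, not taken from the paper.

m1, m2 = 1.0, 1.0
k1, k2 = 40.0, 400.0      # stiffness ratio k2/k1 = 10
kc = 4000.0               # contact (gap element) stiffness
d = 0.01                  # initial separation between the masses [m]
dt, T = 1e-4, 5.0

def base_acc(t):
    # Harmonic base excitation, 1.5 Hz, 2 m/s^2 amplitude
    return 2.0 * math.sin(2.0 * math.pi * 1.5 * t)

u1 = u2 = v1 = v2 = 0.0
max_contact_force = 0.0
t = 0.0
while t < T:
    pen = (u1 - u2) - d                      # penetration of the gap
    fc = kc * pen if pen > 0 else 0.0        # force only when gap closes
    max_contact_force = max(max_contact_force, fc)
    a1 = (-k1 * u1 - fc) / m1 - base_acc(t)  # contact pushes mass 1 back
    a2 = (-k2 * u2 + fc) / m2 - base_acc(t)  # ... and mass 2 forward
    v1 += a1 * dt; v2 += a2 * dt             # semi-implicit Euler step
    u1 += v1 * dt; u2 += v2 * dt
    t += dt
```

Sweeping the stiffness ratio k2/k1 and recording the peak responses of both oscillators would produce exactly the kind of pounding-modified spectra the abstract describes.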
The management of resources is an essential task in every construction company. Today, ERP and e-business systems are available to assist construction companies in efficiently organising the allocation of their personnel and equipment within the company, but they cannot provide the company with idle resources for every single task that has to be performed during a construction project. Companies should therefore have an alternative solution to better exploit expensive resources and compensate for their fixed costs, while still having them available at the right time for their own business activities. This paper outlines the approach taken by the EU-funded project “e-Sharing” (IST-2001-33325) to support resource management between construction companies. It describes the requirements for the management of construction resources, the core features of the approach, and the integration concept. To this end, we outline an integrated resource-type model supporting the management and classification of construction equipment, construction tasks and qualification profiles. The development is based on a cross-domain analysis and evaluation of existing models. ...
Technological processes, schedules, parallel algorithms, etc., which are subject to technological constraints and demand efficient execution, can be described by digraphs on which an appropriate optimization problem (the construction of an optimal schedule of the vertices of the digraph) can be solved. The problems studied in this work have the following general statements. Problem 1: Given a graph G and a value h, construct a parallel schedule of the vertices of the digraph of minimum length; we denote this problem S(G, h, l). Problem 2: Given a graph G and a value l, construct a parallel schedule of the vertices of the digraph of minimum width; we denote this problem S(G, l, h). Problem 3: Given a graph G, a value h, and the execution times of the operations di, i = 1, …, n, construct a parallel schedule of the vertices of the digraph of minimum length; we denote this problem S(G, h, di, l). Problems 1-3 have exponential complexity when h is arbitrary. In this work, a method for solving the problem S(T, h, di, l) is proposed, based on selecting the vertices of greatest weight. An approach to solving the problem S(G, 3, l) is also proposed, where G is a graph satisfying the property S[i] = S[i], i = 1, …, l. To obtain an estimate of the schedule width from an available estimate of the schedule length, we propose an iterative algorithm of polynomial complexity, at each step of which the current value of the schedule width is set and used to refine the schedule length.
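As a sketch of the problem setting (not the weight-based selection method of the paper), a greedy level-by-level schedule of a digraph's vertices with width bound h and unit execution times can be written as:

```python
from collections import deque

# Greedy list scheduling of digraph vertices into time steps of width
# at most h (at most h vertices execute in parallel), respecting
# precedence. Unit execution times; every vertex must appear as a key.

def schedule(succ, h):
    # succ: dict vertex -> list of successors
    indeg = {v: 0 for v in succ}
    for v in succ:
        for w in succ[v]:
            indeg[w] += 1
    ready = deque(v for v in succ if indeg[v] == 0)
    steps = []
    while ready:
        # Take up to h currently ready vertices as one parallel step
        step = [ready.popleft() for _ in range(min(h, len(ready)))]
        steps.append(step)
        next_ready = []
        for v in step:
            for w in succ[v]:
                indeg[w] -= 1
                if indeg[w] == 0:
                    next_ready.append(w)
        ready.extend(next_ready)
    return steps

# Example: diamond-shaped digraph, width bound h = 2
g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
steps = schedule(g, 2)
```

With h = 2 the diamond schedules in three steps; tightening the bound to h = 1 stretches the same graph to four steps, illustrating the length/width trade-off behind Problems 1 and 2.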
RESEARCH ON THE DEFORMATION OF MULTILAYERED PLATES ON A RIGID FOUNDATION USING A REFINED UNFLEXURAL MODEL
(2006)
The stress-strain state (SSS) of multilayered plates on a rigid (undeformable) foundation is investigated. The computational scheme for a transversely loaded plate is formed by mirroring the plate about its contact surface with the foundation. The plate of double thickness is then loaded symmetrically on both sides with respect to its median surface. This allows modelling purely unflexural deformation, which reduces the number of unknowns and the overall order of differentiation of the resolving system of equations. The developed refined continual model takes into account transverse shear and transverse compression deformations in a high iterative approximation. Both rigid contact between the foundation and the plate and frictionless sliding on the contact surface are considered. Calculations confirm the efficiency of this approach, yielding solutions that are qualitatively and quantitatively close to three-dimensional solutions.
The availability and confident command of modern, low-cost information and communication technologies allows even small business units in the construction industry to organise themselves in cross-company networks and to integrate themselves better into the construction value chain. In so-called 'virtual organisations' (VO), these companies, which are important for the national economy, can exploit their competitive advantages, such as greater flexibility, fast reaction to customer wishes and market proximity, and thus secure and expand their long-term existence in liberal competition on an enlarged EU market. The obstacles that currently exist in the flow of information between construction site and office can be removed by using mobile, wirelessly networked devices, such as smartphones, together with a newly designed infrastructure for knowledge management and context-sensitive information presentation. This publication presents conceptual approaches to an integrated information management supporting VOs, which are currently being developed within the BMBF project 'IuK-System Bau'.
In this paper we consider three different methods for generating monogenic functions. The first is related to Fueter's well-known approach to the generation of monogenic quaternion-valued functions by means of holomorphic functions, the second is based on the solution of hypercomplex differential equations, and the third is a direct series approach based on the use of special homogeneous polynomials. We illustrate the theory by generating three different exponential functions and discuss some of their properties. Partially supported by the R&D unit Matemática e Aplicações (UIMA) of the University of Aveiro, through the Portuguese Foundation for Science and Technology (FCT), co-financed by the European Community fund FEDER.
We investigate aspects of tram-network section reliability, which operates as part of a model of the reliability of a whole city tram network. Here, one of the main points of interest is the chronological development of disturbances (namely the differences between the departure time provided in the schedule and the real departure time) on subsequent sections during tram line operation. These developments were observed in comprehensive measurements carried out in Krakow during the rebuilding of one of the main transportation nodes (Rondo Mogilskie). All the building activities caused large disturbances in tram line operation, with effects extending to neighboring sections. In the second part, the stochastic character of section running time is analyzed in more detail. Sections with only one starting stop are considered, as well as sections with two or three starting stops located on different streets at an intersection. Whether results from sections with two starting stops can be combined into one data set is checked with suitable statistical tests comparing the means of the two samples. Section running time may depend on the gap between two successive trams and on the deviation from the schedule. This dependence is described by a multiple regression formula. The main measurements were carried out in the city center of Krakow in two stages: before and after major changes in the tramway infrastructure.
The increasingly required cooperation of various participants from different disciplines and the use of highly specialised domain applications in heterogeneous system environments underline the importance and necessity of new concepts and possibilities for creating a computer-supported integration level. The aim of such an integration level is to improve cooperation and communication among the participants. The basis for this is the establishment of an efficient and error-free exchange of data and information between the various specialist planners and applications. The foundation of the data integration level is a digital building model in the sense of a 'virtual building', which provides all relevant data and information about a building that is being planned or that already exists. In realising a building-model-oriented data integration level and its model management, the definition of the building model, i.e. the specification of the relevant data to be exchanged, proves to be extremely complex. The relation-oriented approach presented here, i.e. the realisation of the data and information exchange by means of defined relations and relationships between dynamically modifiable domain models, offers approaches for: * reducing and mastering the complexity of the building model (formation of partial models) * realising an efficient data exchange (relation management). Thus, the relation-oriented approach represents an adequate way of modelling a digital building model as a data integration level for the life cycle of a building.
In refurbishment projects, planning engineers and architects have always faced the task of capturing the geometry and structure of existing buildings and deriving reconstruction plans and technologies from them. These surveying measures are very extensive and cost-intensive. The aim of the approach presented here was therefore to develop a low-cost method that guarantees a contactless measurement of rooms bounded by planes (polyhedral rooms) with an accuracy appropriate to the requirements. Essentially, two problem areas are addressed. The first comprises the proof that planar objects (walls) can be reconstructed to scale from monocular photographs with sufficient accuracy for the intended applications. Instead of the usual photogrammetric route of camera calibration by means of precisely measured control points, an approach was pursued in which a parameter-dependent measuring figure, projected onto the object by laser spot projectors, together with a priori known image content forms the basis for the true-to-scale reconstruction of the object. The second problem area deals with the composition of plane-bounded rooms from individual planes (walls), which, as the result of the first step, are available projectively rectified, i.e. as orthogonal views, though inevitably affected by method-related errors. The goal here is to achieve a gain in accuracy by using a priori knowledge of the room structure together with methods of mathematical optimization.
Reinforced concrete walls are commonly selected as the lateral resisting systems in the seismic design of buildings. The design procedure requires reliable and robust models to predict the wall response. Many researchers have therefore focused on using the available experimental data to comment on the quality of the models at hand. What is missing, though, is an uncertainty-aware treatment of the experimental data, since such data can be affected by different sources of uncertainty. In this paper, we introduce the database created for model quality evaluation purposes, considering the uncertainties in the experimental data. This is the first step of a larger study on experience-based model quality evaluation of reinforced concrete walls. Here, we briefly present the database as well as six sample validations of the developed numerical model (the quality of which is to be assessed). The database contains information on nearly 300 wall specimens from about 50 sources. Both the database and the numerical model, built for uncertainty/sensitivity analysis purposes, are mainly based on ten parameters, covering geometry, material, reinforcement layout and loading properties. The validation results show that the model is able to predict the wall response satisfactorily. Consequently, the validated numerical model can be used in further quality evaluation studies.
The theory of regular quaternionic functions of a reduced quaternionic variable is a three-dimensional generalization of complex analysis. The Moisil-Theodorescu system (MTS) is a regularity condition for such functions depending on the radius vector r = ix + jy + kz, seen as a reduced quaternionic variable. The analogues of the main theorems of complex analysis for the MTS are established in quaternionic form: Cauchy's theorem, the Cauchy integral formula, Taylor and Laurent series, approximation theorems and properties of Cauchy-type integrals. The analogues of positive powers (inner spherical monogenics) are investigated: a set of recurrence formulas between the inner spherical monogenics and explicit formulas are established. Some applications of regular functions in elasticity theory and hydrodynamics are given.
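For reference, the MTS regularity condition mentioned above can be written explicitly. In the standard formulation (the paper's notation may differ), a function f = u + iv_1 + jv_2 + kv_3 is regular when Df = 0 for the Dirac operator D, which splits into a scalar and a vector equation:

```latex
D f = \left( i\,\frac{\partial}{\partial x} + j\,\frac{\partial}{\partial y}
    + k\,\frac{\partial}{\partial z} \right) f = 0
\quad\Longleftrightarrow\quad
\operatorname{div} \vec{v} = 0, \qquad
\operatorname{grad} u + \operatorname{curl} \vec{v} = \vec{0},
```

where \vec{v} = (v_1, v_2, v_3). The scalar part of Df yields the divergence condition, the vector part the gradient-curl condition; for u = 0 the system reduces to an incompressible, irrotational vector field, which is the link to hydrodynamics.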
Für den Entwurf der i.a. aus langen schmalen Rechtecken bestehenden Schal- bzw. Werkpläne wird eine Entwurfsunterstützung vorgestellt, bei der die Größe der Rechtecke wie immer festgelegt wird, die Lage der Rechtecke aber durch topologische Angaben. Letztere bilden programmtechnisch Bedingungen, wobei zwischen Berühr- und Bündigkeitsbedingen unterschieden wird. Diese Angaben positionieren das neue Rechteck im Bezug zu einem bereits platzierten. Zum Beispiel erlaubt die Angabe, die Säule ist oberhalb des Fundamentes und belastet dieses mittig, eine eindeutige Festlegung der Lage der Säule bei gegebener Lage des Fundamentes und gegebenen Abmessungen beider Rechtecke. Die Formulierung mittels Bedingungen hat den Vorteil daß diese auch bei Änderung von Abmessungen gültig bleiben. Die hier vorgestellte Eingabeart der relativen Positionierung ist eine Erweiterung des Orthomodus, wie er bei Bau-CAD-Programmen stets gefunden wird.
Reconstruction of the indoor air temperature distribution using acoustic travel-time tomography
(2021)
Acoustic travel-time tomography (ATOM) is increasingly being considered as a remote sensing methodology to determine indoor air temperature distributions. It employs the relationship between the sound velocities along sound paths and the related travel times obtained from measured room impulse responses (RIR). Precise travel-time estimation is therefore of critical importance and can be performed by applying an analysis time-window method. In this study, multiple analysis time-windows with different lengths are proposed to overcome the challenge of accurately detecting the travel times in the RIR. The ATOM temperature distribution was measured in the climate chamber lab of the Bauhaus-University Weimar. As a benchmark, the temperatures of NTC thermistors are compared to the reconstructed temperatures derived from the ATOM technique, illustrating that this technique can be a reliable substitute for traditional thermal sensors. The numerical results indicate that the selection of an appropriate analysis time-window significantly enhances the accuracy of the reconstructed temperature distribution.
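The basic velocity-temperature relationship underlying ATOM can be sketched with the textbook dry-air approximation c ≈ 20.05·√T[K] (this is the standard first-order relation, not the authors' full tomographic reconstruction, and the path length and travel time below are invented):

```python
# Sketch of the basic ATOM relationship: the travel time t along a
# straight path of length L yields the path-averaged sound speed
# c = L / t, and for dry air c ≈ 20.05 * sqrt(T[K]), so the
# path-averaged temperature follows by inversion.

def path_avg_temperature_celsius(path_length_m, travel_time_s):
    c = path_length_m / travel_time_s      # average sound speed [m/s]
    t_kelvin = (c / 20.05) ** 2            # invert c = 20.05 * sqrt(T)
    return t_kelvin - 273.15

# Example: 5 m path, travel time estimated from the room impulse
# response (value invented); yields roughly room temperature
temp = path_avg_temperature_celsius(5.0, 0.01455)
```

The tomographic step then combines many such path averages over crossing paths to reconstruct the spatial temperature field, which is where the accurate travel-time detection discussed above becomes critical.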
Due to the increasing number of wind energy converters, the accurate assessment of the lifespan of their structural parts and of the entire converter system is becoming increasingly important. Lifespan-oriented design, inspections and remedial maintenance are challenging because of the converters' complex dynamic behavior. Wind energy converters are subjected to stochastic turbulent wind loading, causing a corresponding stochastic structural response and vibrations associated with an extreme number of stress cycles (up to 10^9, according to the rotation of the blades). Currently, wind energy converters are designed for a service life of about 20 years. However, this estimate is made more or less by rule of thumb and is not backed by profound scientific analyses or accurate simulations. By contrast, modern structural health monitoring systems allow improved identification of deterioration and thereby drastically advance the lifespan assessment of wind energy converters. In particular, monitoring systems based on artificial intelligence techniques represent a promising approach towards cost-efficient and reliable real-time monitoring. Therefore, an innovative real-time structural health monitoring concept based on software agents is introduced in this contribution. Recently, this concept has also been turned into a real-world monitoring system, developed in a DFG joint research project at the authors’ institute at the Ruhr-University Bochum. In this paper, primarily the agent-based development, implementation and application of the monitoring system are addressed, focusing on the real-time monitoring tasks in due detail.
Ideally, multiple computational building evaluation routines (particularly simulation tools) should be coupled in real time to the representational design model to provide timely performance feedback to the system user. In this paper we demonstrate how this can be achieved effectively and conveniently via homology-based mapping. We consider two models homologous if they entail isomorphic topological information. If the general design representation (i.e., a shared object model) is generated so as to include both the topological building information and pointers to the semantic information base, it can be used to directly derive the domain representations ('enriched' object models with detailed configurational information and filtered semantic data) needed for evaluation purposes. As a proof of concept, we demonstrate a computational design environment that dynamically links an object-oriented, space-based design model with structurally homologous object models of various simulation routines.
A distributed geotechnical remote analysis of data system (Distributed G-RAD) can benefit both owners and contractors by providing better quality control and assurance on geotechnical projects. The Distributed G-RAD approach involves efficient data acquisition using PDAs with GPS capability, radio frequency identification (RFID) tags for labeling soil samples, and laser scanning for measuring lift thickness and the volumes of stockpiles and borrow pits. Spatial data storage is provided by a geographic information system (GIS). Portions of this system have already been developed, while other parts are still under consideration. This paper also describes how RFID and laser scanning technologies can be used in the larger Distributed G-RAD system.
For the management or reorganisation of existing buildings, data concerning dimensions and construction are necessary. Often these data are available exclusively as paper-based drawings, and no digital data such as a computer-based product model or even a CAD model exist. In order to perform mass calculation, damage mapping or a recalculation of the structure, the drawings of the building under consideration have to be analysed manually by the engineer, which is very time-consuming. To close this gap between the drawings of an existing building and a digital product model, this paper presents an approach to digitise a drawing, to build up geometric and topological models and to recognise construction parts of the building. Finally, all recognised parts are transformed into a three-dimensional geometric model which provides all the geometric information necessary for the product model. During this import process the semantics of a ground floor plan have to be converted into a 3D model.
Stanford Anderson is Professor of History and Architecture and was Head of the Department of Architecture from 1991 through 2004. He was director of MIT’s PhD program in History, Theory and Criticism of Architecture, Art and Urban Form from its founding in 1974 to 1991 and in 1995-96. Anderson’s research and writing concern architectural theory, early modern architecture in northern Europe, American architecture and urbanism, and epistemology and historiography. He has organized numerous professional conferences and served on the editorial boards of Assemblage, Journal of Architectural Education, Places, and The MIT Press. In addition to numerous articles, his books are Planning for Diversity and Choice, On Streets, and Hermann Muthesius: Style-Architecture and Building Art. He is co-author of Kay Fisker. Peter Behrens and a New Architecture for the Twentieth Century appeared in 2000 and Eladio Dieste: Innovation in Structural Art in 2004. In 1997, The MIT Press published a collection of essays in his honor, edited by Martha Pollak: The Education of the Architect: Historiography, Urbanism, and the Growth of Knowledge. He was a Fulbright fellow at the Technische Hochschule in Munich and subsequently a fellow of the John Simon Guggenheim Foundation and the American Council of Learned Societies. Anderson received his bachelor’s degree from the University of Minnesota, his master’s in architecture from the University of California at Berkeley, and his doctoral degree in the history of art from Columbia University in New York City.
This paper is a report on Radio Frequency Identification (RFID) technology and its potential applications in the commercial construction industry. RFID technology offers wireless communication between RFID tags and readers with non-line-of-sight readability. These fundamental properties eliminate manual data entry and open up the potential for automated processes that increase project productivity, construction safety, and project cost efficiency. Construction contractors, owners, and material suppliers who believe that technology can further develop construction methods and processes should feel obligated to participate in RFID studies for the advancement of the construction industry as a whole.
Quo vadis Arbeitsschutz?
(2005)
After more than a hundred years of arguments for and against quaternions, of exciting odysseys with new insights as well as disillusionment about their usefulness, the mathematical world has seen in the last 40 years a burst in the application of quaternions and their generalizations in almost all disciplines dealing with problems in more than two dimensions. Our aim is to sketch, necessarily in a very concise and far from exhaustive manner, some ideas which contributed to the picture of this recent development. With the help of some historical reminiscences, we first try to draw attention to quaternions as a special case of Clifford algebras, which play the role of a unifying language in the Babylon of several different mathematical languages. Secondly, we refer to the use of quaternions as a tool for modelling problems and, at the same time, for simplifying the algebraic calculus in almost all applied sciences. Finally, we intend to show that quaternions in combination with classical and modern analytic methods are a powerful tool for solving concrete problems, thereby giving origin to the development of Quaternionic Analysis and, more generally, of Clifford Analysis.
The process of analysis and design in structural engineering requires the consideration of different partial models, for example loading, structural materials, structural elements, and analysis types. The various partial models are combined by coupling several of their components. Due to the large number of available partial models describing similar phenomena, many different model combinations are possible to simulate the same aspects of a structure. The challenging task of an engineer is to select a model combination that ensures a sufficient, reliable prognosis. In order to achieve this reliable prognosis of the overall structural behavior, a high individual quality of the partial models and an adequate coupling of the partial models are required. Several methodologies have been proposed to evaluate the quality of partial models for their intended application, but a detailed study of the coupling quality is still lacking. This paper proposes a new approach to assess the coupling quality of partial models in a quantitative manner. The approach is based on the consistency of the coupled data and applies to both uni- and bidirectionally coupled partial models. Furthermore, the influence of the coupling quality on the output quantities of the partial models is considered. The functionality of the algorithm and the effect of the coupling quality are demonstrated using an example of coupled partial models in structural engineering.
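As an illustration of the general idea (not the metric proposed in the paper), a consistency-based coupling indicator can be sketched as one minus the mean relative discrepancy between the quantities one partial model exports and the quantities its coupling partner imports:

```python
def coupling_consistency(sent, received, tol=1e-12):
    """Illustrative coupling-quality indicator in [0, 1]: 1 means the data
    exported by one partial model and imported by the other agree exactly;
    lower values indicate growing inconsistency at the coupling interface."""
    assert len(sent) == len(received)
    errs = []
    for s, r in zip(sent, received):
        scale = max(abs(s), abs(r), tol)   # relative error, guarded near zero
        errs.append(abs(s - r) / scale)
    return 1.0 - sum(errs) / len(errs)

# identical interface data couples perfectly
assert coupling_consistency([1.0, 2.0], [1.0, 2.0]) == 1.0
# a 10% mismatch in one of two coupled quantities lowers the indicator
assert coupling_consistency([1.0, 2.0], [1.0, 2.2]) < 1.0
```

For bidirectional coupling the same indicator can be evaluated in both transfer directions and combined, e.g. by taking the minimum.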
About a quarter (26%) of total final energy consumption in Germany is attributable to the residential sector, which therefore holds a considerable share of the potential for energy savings. In view of the European Union's climate protection target of increasing energy efficiency by 20% compared to 1990, the question arises as to which savings potentials actually exist in the residential sector and how they can be quantified. In this work, the influence of the parameters affecting final energy consumption is determined by means of a sensitivity analysis. The results of the sensitivity analysis show that the most influential parameters for final energy consumption are the required indoor temperature, the length of the heating period, the outdoor temperature (degree days) and the number of dwellings. These are variables that cannot be regulated by ordinances. The only parameter that can be regulated and has a significant influence on final energy consumption is the efficiency of the systems and appliances for space heating, hot water and cooking (and, to a small extent, the efficiency of the lighting used). To quantify the energy saving potential of the German residential sector with respect to this efficiency, data on the long-term development (1990-2010) of the efficiency of systems and appliances were analysed. Using various figures from the literature and saturation curves, the development of system and appliance efficiencies by energy source between 1990 and 2010 was determined. The resulting saturation curves make it possible to determine the development of useful energy consumption in the German residential sector.
It was found that the difference between useful energy consumption and final energy consumption decreased by 12% over the period considered, and that the energy saving potential can vary considerably depending on the energy source (currently by more than 35 percentage points). With regard to the above climate protection target, this work analyses various development scenarios based on system efficiency and energy sources. It becomes clear that the theoretical energy saving potential of the German residential sector with respect to average system efficiency is only between 4 and 15%. This means that a substantial reduction of final energy demand in the residential sector can only take place if other energy saving measures are considered. Based on the results of the sensitivity analysis, corresponding recommendations are given.
This paper demonstrates that quality violations and violations of safety regulations have much in common. Several examples show how accidents, cases of damage, health injuries and fires, particularly during welding, have been caused by weaknesses in technical and technological preparation; deficiencies in health, occupational and fire protection; insufficient qualification and subjective misconduct of workers; and material, manufacturing and design defects, thus confirming the relationships outlined.
To achieve an appropriate quality of public transport, the relationship between service quality and cost must be analysed. Under conditions of growing competition from private cars and other transport operators, those measures that ensure the greatest effectiveness should be adopted. There are many ways to improve this quality, e.g. increasing vehicle frequency, increasing speed, punctuality and regularity, introducing low-floor vehicles, and much more. In this paper I analyse the aspects connected with frequency, i.e. increasing the frequency and maintaining a constant headway of the scheduled buses. The paper comprises models and examples. In connection with studies on the willingness to pay for quality improvements, decisions can then be made that define different standards of travel.
Quality is one of the most important properties of a product. Providing optimal quality can reduce the costs of rework, scrap, recalls or even legal action while satisfying customers' demand for reliability. The aim is to achieve 'built-in' quality within the product development process (PDP). The common approach for this is robust design optimization (RDO), which uses stochastic quantities as constraints and/or objectives to obtain a robust and reliable optimal design. In classical approaches, the effort required for the stochastic analysis multiplies with the complexity of the optimization algorithm. The suggested approach shows that this effort can be reduced enormously by reusing previously obtained data: the support point set of an underlying metamodel is filled iteratively during the ongoing optimization in regions of interest, where necessary. A simple example shows that this is possible without significant loss of accuracy.
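The reuse idea can be sketched with a hypothetical caching surrogate that queries the expensive model only when no previously obtained support point lies within a trust radius of the requested design. The class name, the distance criterion and the radius are illustrative placeholders, not the paper's algorithm:

```python
class LazySurrogate:
    """Cache expensive model evaluations and add new support points only
    where no nearby point exists yet (illustrative sketch of data reuse
    during an ongoing optimization)."""
    def __init__(self, model, radius):
        self.model, self.radius = model, radius
        self.pts = []                      # support points as (x, y) pairs

    def __call__(self, x):
        near = [(abs(x - xi), yi) for xi, yi in self.pts
                if abs(x - xi) <= self.radius]
        if near:                           # reuse previously obtained data
            return min(near)[1]
        y = self.model(x)                  # expensive call only when needed
        self.pts.append((x, y))
        return y

calls = []
def expensive(x):                          # stand-in for a costly simulation
    calls.append(x)
    return x * x

s = LazySurrogate(expensive, radius=0.1)
s(1.0); s(1.05); s(2.0)
assert len(calls) == 2                     # the second query reused the cache
```

In a full RDO loop the cached points would feed a proper metamodel (e.g. a radial basis or kriging fit) rather than a nearest-neighbour lookup.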
Over the last decade, building construction technology has developed dramatically, especially with the huge growth of CAD tools that help in modeling buildings, bridges, roads and other construction objects. Often, quality control and dimensional accuracy checks in the factory or on the construction site are based on manual measurements of discrete points. These measured points on the realized object, or on a part of it, are compared with the points of the corresponding CAD model to see whether and where the construction element fits the respective CAD model. This process is complicated and difficult even with modern measuring technology, due to the complicated shape of the components, the large amount of manually collected measurement data and the high cost of manually processing the measured values. By using a modern 3D scanner, however, one obtains information about the whole constructed object and can make a complete comparison against the CAD model, giving an idea of the quality of the object as a whole. In this paper, we present a case study of controlling measurement quality during the construction phase of a steel bridge using 3D point cloud technology. Preliminary results show that early detection of mismatches between the real element and the CAD model could save a lot of time, effort and, obviously, expense.
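A minimal version of such a scan-versus-CAD check computes, for every scanned point, the distance to its nearest CAD reference point and reports the maximum deviation. This brute-force sketch only illustrates the comparison principle; real point clouds would use a spatial index and a point-to-surface distance:

```python
def max_deviation(scan_points, cad_points):
    """For each scanned 3D point, find the distance to the nearest CAD point
    and return the maximum over the scan: a crude quality-control indicator."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return max(min(d2(s, c) for c in cad_points) ** 0.5 for s in scan_points)

# toy data: three CAD vertices and slightly displaced scanned points
cad = [(0, 0, 0), (1, 0, 0), (1, 1, 0)]
scan = [(0, 0, 0.02), (1, 0, 0), (1, 1, -0.05)]
assert abs(max_deviation(scan, cad) - 0.05) < 1e-12
```

A tolerance threshold on this deviation (per element or per region) then flags mismatches between the built element and the model.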
Known as a sophisticated phenomenon in civil engineering, soil-structure interaction has been intensively investigated in the field of geotechnics. At the same time, the advent of powerful computers has led to the development of numerous numerical methods dealing with this phenomenon, resulting in a wide variety of approaches to simulating the behavior of the soil stratum. This survey studies two common approaches to modeling the soil's behavior in a system consisting of a structure with two degrees of freedom, representing a two-storey steel frame whose column rests on a pile embedded in sand at laboratory scale. The effect of the soil simulation technique on the dynamic behavior of the structure is of major interest in this study. The modeling approaches used are the so-called holistic method and the substitution of the soil by respective impedance functions.
One of the basic types of strength calculation is the calculation of the limit equilibrium of structures. This report describes a new method for solving the limit equilibrium problem. In this method, the rigid-plastic system is substituted by an 'equivalent' elastic system with specially constructed rigidities, which is why it is called the pseudorigidity method (PRM). An iterative algorithm was developed for finding the pseudorigidities and is implemented in a special software procedure. Combining this procedure with any elastic calculation program (base program) yields a program that solves rigid-plastic problems. It is proved that the iterations converge to the solution of the limit equilibrium problem. Test solutions show that the pseudorigidity method is universal. It allows the following: to solve limit equilibrium problems for various models (arch, beam, frame, plate, deep beam, shell, solid); to take into account both linearized and quadratic yield conditions; to solve problems for various kinds of loads (concentrated, distributed, given by a generalized vector); and to take into account different yield criteria in different sections, etc. The iterative PRM process converges quickly, and the accuracy of PRM is very high even for coarse finite-element meshes. The author has used this method to design systems protecting equipment of nuclear power stations, pipelines and transported cargo from extreme loads.
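The flavour of such an iteration can be shown on a toy system of parallel elastic-perfectly-plastic springs: an elastic solve is repeated while the rigidities of over-stressed members are scaled down, until no member force exceeds its yield force. This is an illustrative secant-stiffness sketch under invented data, not the author's implementation:

```python
def pseudorigidity_iteration(k, fy, load, tol=1e-9, max_iter=1000):
    """Iteratively re-assign reduced ('pseudo') rigidities to parallel springs
    until every member force respects its yield force; the converged forces
    describe the plastic redistribution under the given load."""
    k = list(k)
    for _ in range(max_iter):
        u = load / sum(k)                       # elastic solve, current rigidities
        forces = [ki * u for ki in k]
        worst = max(f / f_y for f, f_y in zip(forces, fy))
        if worst <= 1.0 + tol:
            return forces
        # scale down rigidities of over-stressed springs (secant update)
        k = [ki * min(1.0, f_y / f) for ki, f, f_y in zip(k, forces, fy)]
    raise RuntimeError("no equilibrium below yield: load exceeds limit load")

# two equal springs, yield forces 1 and 3, total load 3:
# the weaker spring yields at 1 and the rest redistributes to the other
forces = pseudorigidity_iteration([1.0, 1.0], fy=[1.0, 3.0], load=3.0)
assert abs(forces[0] - 1.0) < 1e-6 and abs(forces[1] - 2.0) < 1e-6
```

For loads above the limit load (here 4, the sum of the yield forces) the iteration cannot converge, which matches the limit-equilibrium interpretation.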
Präventive Aufgaben der Sicherheits- und Gesundheitsschutz-Koordination im Facility Management
(2003)
The relation between the costs of operating and maintaining a building and its production costs; omissions in the planning phase lead to additional effort in the operation and maintenance of the building; construction defects from the perspective of facility management; holistic building concepts; phases and fault lines of project development; requirements for the health and safety coordinator.
Prädiktive Wärmeflussregelung solaroptimierter Wohngebäude - Thermische Simulation komplexer Gebäude
(2003)
Modern residential buildings today, and especially those of the future, will largely supply themselves with heat. Developments in facade construction play a central role here. In addition to increased solar heat gain, there are the properties of heat storage and delayed heat release. Due to these properties, the building's thermal behavior is redefined and becomes more complex. Within the development of heat flow control for solar-thermally heated residential buildings with novel facades, this contribution addresses the modeling of modern facade constructions with respect to transparent thermal insulation and phase change materials within the wall assembly, combined with the control-relevant new development of shading devices based on switchable layers on the window. The contribution describes the design of a control concept for thermal indoor climate control. These investigations are based on complex simulation models, whereby real measured data could be used in developing the models of the novel facade elements.
PRÄ- UND POST-ARCHITEKTUR
(2011)
1988-1994 editor at Arch+. Visiting professorship at the TU Cottbus 2000-02. Initiator and co-director of the European research project Urban Catalyst 2001-03. Co-initiator of ZwischenPalastNutzung and artistic co-director of Volkspalast 2004. Head curator of the project Schrumpfende Städte (Shrinking Cities) for the German Federal Cultural Foundation 2002-08. Author and editor of several books and publications, including Wohltemperierte Architektur and Berlin_Stadt ohne Form. Since autumn 2006 professor of architectural theory and design at the University of Kassel, since March 2009 director of the Stiftung Bauhaus Dessau.
Process optimization in the logistics chain requires interdisciplinary, cross-company project work. In the age of globalized markets and the constant improvement of the competitiveness of medium-sized enterprises, intra-company and inter-company resource and route planning (optimization) via networked information is just as necessary as the preparation of information and the analysis of business processes. Process optimization comprises the tasks of analysing, designing, planning and controlling processes. Supply chain management (SCM) is the overarching process optimization in the logistics chain, i.e. the logical extension of production planning and control (PPS) to supplier relationships. The structural model of the logistics chain comprises the processes of product creation, development, order acquisition (sales, marketing), production planning, procurement, production, distribution and disposal. Through supply chain management, these processes are designed and optimized according to company-specific objectives towards customers, suppliers and service providers (optimization of the value chain). Computer science application systems provide the information supply in the logistics chain.
Against the background of increasingly strict environmental legislation and the recognition that construction waste is fundamentally suitable for material recycling, the deconstruction of buildings has gained importance in recent years. Due to often tight time and cost constraints for deconstruction, limited availability of personnel and equipment, largely one-of-a-kind production and changing locations, the project-oriented planning of on-site measures is of great importance. This contribution presents approaches to modeling and solving the resulting problems of planning and optimizing (de)construction processes using project planning models and methods. In addition to economic questions, environmental and technical issues related to the planning of deconstruction projects are addressed. An application of the planning approaches to real buildings shows that combining building deconstruction with processing technology can improve the quality of recycled building materials. To implement the required measures, schedules for building deconstruction are computed, taking into account the individual waste management conditions of the respective planning region, building- and site-specific characteristics, and technical and capacity restrictions. Model calculations carried out for different planning regions prove that, under certain conditions, the dismantling of buildings can already be realized economically compared to conventional demolition.
The contribution concludes with an outlook on the realization of complex construction processes, even under tight time constraints and limited space, by means of production-synchronized resource planning, and on the consideration of uncertainties in the planning and execution of deconstruction projects.
The presented work focuses on collaboration experiences gathered in complex design and engineering projects using the learning platform POLE-Europe. Within the POLE environment, student teams from different universities, disciplines and cultural backgrounds are assigned to real-world projects with clearly defined design tasks, usually to be accomplished within one semester while working in a virtual environment most of the time. The concept of POLE and the information and collaboration technology are described.
This paper presents a specific modeling technique that is focused on preparing planning processes in civil engineering. Planning processes in civil engineering are characterized by some peculiarities so that the sequence of planning tasks needs to be determined for each planning project. Neither the use of optimized partial processes nor the use of lower detailed and optimized processes guarantee an optimal overall planning process. The modeling technique considers these peculiarities. In a first step, it is focused on the logic of the planning process. Algorithms based on the graph theory determine that logic. This approach ensures consistency and logical correctness of the description of a planning process at the early beginning in its preparation phase. Sets of data – the products of engineers like technical drawings, technical models, reports, or specifications – form the core of the presented modeling technique. The production of these sets of data requires time and money. This is expressed by a specific weighting of each set of data in the presented modeling technique. The introduction of these weights allows an efficient progress measurement and controlling of a planning project. For this purpose, a link between the modeling technique used in the preparation phase and the execution phase is necessary so that target and actual values are available for controlling purposes. The present paper covers the description of this link. An example is given to illustrate the use of the modeling technique for planning processes in civil engineering projects.
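The graph-based core of such a technique, ordering weighted data sets by their logical dependencies and measuring progress as the completed share of weights, can be sketched as follows (the task names, dependencies and weights are invented for illustration):

```python
from graphlib import TopologicalSorter

# sets of data produced by planning tasks, each with a weight reflecting
# the effort of its production (hypothetical values)
weights = {"site survey": 3, "structural model": 5, "drawings": 4, "report": 2}

# logical dependencies: a data set lists the data sets it requires
deps = {"structural model": {"site survey"},
        "drawings": {"structural model"},
        "report": {"structural model"}}

# a consistent, logically correct sequence of the planning process
order = list(TopologicalSorter(deps).static_order())
assert order.index("site survey") < order.index("structural model")
assert order.index("structural model") < order.index("drawings")

# earned-value style progress: completed share of the total weight
done = {"site survey", "structural model"}
progress = sum(weights[d] for d in done) / sum(weights.values())
assert abs(progress - 8 / 14) < 1e-12
```

A cycle in the dependency graph would raise an exception here, which corresponds to detecting a logically inconsistent process description early in the preparation phase.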
A building project, with its many different players, requires an open and commonly accepted standard for product model description. Product-model-based design tools support easy comparison of design alternatives and optimisation of the technical quality of the design solution. This supports the client's decision-making and the comparison of design targets throughout the whole building project. The use of product models enables these tasks to meet both schedule and cost requirements. Olof Granlund uses product models and interoperable software as the main tool in projects. The use and the realised benefits are illustrated by examples from three real projects: a university building, where product models were used by the whole design team from the very early phases; an office building for a research organisation, where product models were used in a so-called self-reporting building system; and the headquarters of an international company, where product models were widely used for building performance analysis and visualisation in the design phase as well as for configuring the facilities management system for the operational phase.
Processing technical and environmental data on building materials, components, and systems has become more important during the last few years. Increased sensitivity towards environmental and energy problems has led to the demand for simulation and evaluation of the long-term behavior of buildings. The results of such simulations are expected to enable architects and engineers to develop a broader, interdisciplinary understanding of the impact of their products (buildings) on the environment. However, conducting such evaluations is currently hampered by the lack of comprehensive, up-to-date, and ecologically relevant data on building materials, components, and systems. To address this problem, this paper proposes an approach to deal with absent or uncertain attributes of building materials, components, and systems. In the past, various information systems have been developed to provide data on a limited set of building materials, including precise values for some of their characteristics, such as availability, manufacturers, costs, etc. These traditional information systems have difficulty dealing with uncertain, incomplete and sparse data. However, uncertainty and incompleteness characterize the nature of most of the available, environmentally related characteristics of materials, components, and systems. In this paper, a fuzzy-logic-based augmentation of traditional information systems is proposed to provide management, utilization and manipulation of incomplete and uncertain data.
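A common fuzzy-logic device for such uncertain attributes is the triangular fuzzy number, which encodes "about b, certainly between a and c" as a membership function. A small sketch with made-up material values (the attribute and numbers are illustrative, not from the paper):

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy number (a, b, c), suitable
    for representing a rough, uncertain material attribute."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# hypothetical attribute: 'embodied energy about 10, certainly between 6 and 15'
mu = triangular(6.0, 10.0, 15.0)
assert mu(10.0) == 1.0          # full membership at the most plausible value
assert mu(6.0) == 0.0           # no membership outside the support
assert abs(mu(8.0) - 0.5) < 1e-12
```

Queries against such an information system can then rank materials by membership degree instead of failing on missing crisp values.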
The concept of a sensitivity analysis of the limit state of a structure with respect to selected basic variables is presented. The sensitivity is expressed in the form of the probability distribution of the limit state of the structure. The analysis is performed by a problem-oriented Monte Carlo simulation procedure, based on the problem-specific definition of the elementary event as a structural limit state; the sample space thus consists of limit states of the structure. A one-dimensional random multiplier, defined on this sample space, is introduced. This multiplier refers to the dominant basic variable (or group of variables) of the problem. The numerical procedure results in a set of random numbers whose normalized relative histogram is an estimator of the PDF of the limit state of the structure. Estimators of reliability, or of the probability of failure, are statistical characteristics of this histogram. The procedure is illustrated by an example of the sensitivity analysis of the serviceability limit state of a monumental structure: the colonnade of the Licheń Basilica in central Poland. The limit state of the structure is examined with reference to the horizontal deflection of the upper deck. Wind actions are taken as the dominant variables. It is assumed that the wind load intensities acting on the lower and upper storeys of the colonnade are identically distributed but correlated random variables. Three correlation variants of these variables are considered, and the relevant limit state histograms are analysed. The paper ends with conclusions on the method and some general remarks on fully probabilistic design.
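The basic estimator behind such a procedure, the relative frequency of simulated limit-state violations, can be sketched as follows. The distribution, the deflection limit and the absence of correlation handling are simplified placeholders, not the paper's model:

```python
import random

def failure_probability(limit_state, sampler, n=100_000, seed=1):
    """Crude Monte Carlo estimate of P[g(X) < 0] from samples of the
    dominant basic variable (illustrative sketch)."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sampler(rng)) < 0)
    return fails / n

# hypothetical serviceability check: deflection limit 30 mm,
# wind-induced deflection modelled as N(20, 5) mm
g = lambda d: 30.0 - d
pf = failure_probability(g, lambda rng: rng.gauss(20.0, 5.0))
assert 0.015 < pf < 0.03        # exact value would be Phi(-2), about 0.023
```

The full set of sampled limit-state values (not just the failure count) gives the histogram that estimates the PDF of the limit state, from which reliability measures are read off.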
Preparation and provision of building information for planning within existing built contexts
(2004)
A prerequisite for planning within existing built contexts is precise information regarding the building substance, its construction and materials, possible damages and any modifications and additions that may have occurred during its lifetime. Using the information collected in a building survey the user should be able to “explore” the building in virtual form, as well as to assess the information contained with regard to a specific planning aspect. The functionality provided by an information module should cover several levels of information provision ranging from ‘simple retrieval’ of relevant information to the analysis and assessment of stored information with regard to particular question sets. Through the provision of basic functionality at an elementary level and the ability to extend this using plug-ins, the system concept of an open extendable system is upheld. Using this modular approach, different levels of information provision can be provided as required during the planning process.
The paper proposes a new method for general 3D measurement and 3D point reconstruction. Looking at its features, the method explicitly aims at practical applications. These features especially cover low technical expenses and minimal user interaction, a clear problem separation into steps that are solved by simple mathematical methods (direct, stable and optimal with respect to least error squares), and scalability. The method expects the internal and radial distortion parameters of the used camera(s) as inputs, and a plane quadrangle with known geometry within the scene. At first, for each single picture the 3D position of the reference quadrangle (with respect to each camera coordinate frame) is calculated. These 3D reconstructions of the reference quadrangle are then used to yield the relative external parameters of each camera regarding the first one. With known external parameters, triangulation is finally possible. The differences from other known procedures are outlined, paying attention to the stable mathematical methods (no usage of nonlinear optimization) and the low user interaction with good results at the same time.
For decades in Germany, historical research on dictatorial urban design in the first half of the 20th century focused on the National Socialist period. Studies on the urban design practices of other dictatorships remained an exception. This has changed. Meanwhile, the urban production practices of the Mussolini, Stalin, Salazar, Hitler and Franco dictatorships have become the subject of comprehensive research projects. Recently, a research group that studies dictatorial urban design in 20th century Europe has emerged at the Bauhaus-Institut für Geschichte und Theorie der Architektur und der Planung. The group is already able to refer to various research results.
Part of the research group’s self-conception is the assumption that the urban design practices of the named dictatorships can only be properly understood from a European perspective. The dictatorships influenced one another substantially. Furthermore, the specificities of each dictatorship's practices can only be discerned by comparing them with those of the other dictatorships. This approach requires strict adherence to the research methods of planning history and urban design theory, while at the same time opening these methods to include those of general historical studies.
With this symposium, the research group aims to further qualify this European perspective. The aim is to pursue an inventory of the various national historiographies on the topic of “urban design and dictatorship”. This inventory should offer an overview on the general national level of historical research on urban design as well as on the level of particular urban design projects, persons or topics.
The symposium took place in Weimar, November 21-22, 2013. It was organized by Harald Bodenschatz, Piero Sassi and Max Welch Guerra and funded by the DAAD (German Academic Exchange Service).
Plausibilität im Planungsprozess - Digitale Planungshilfen für die Bebaubarkeit von Grundstücken
(2003)
The digital support of planning processes is a current research focus of the Chair of Computer Science in Architecture (InfAR) and the Junior Professorship of Architectural Informatics at the Faculty of Architecture of the Bauhaus-Universität Weimar. Anchored in the DFG collaborative research centre SFB 524 "Tools and Constructions for the Revitalisation of Buildings", concepts and prototypes for discipline-oriented planning support are being developed. The multitude of factors that can influence the planning process, and their mutual dependencies, are processed and managed only inadequately by today's planning systems. These factors call for planning tools whose task is the acquisition, processing, integration and management of information as well as the visualisation of complex informational relationships. The development of such systems is technically feasible. The difficulty lies in acquiring and structuring the information relevant to the planning process and in preparing and integrating it into a digital planning environment. The aim of the research project is to develop foundations for digital tools that lead to plausible solutions in the planning process and thus to greater planning certainty for the contractors and clients involved in construction. The goal is to develop program modules that substantively support the planner in finding solutions to a specific question and that guarantee, and plausibly demonstrate, the traceability and correctness of a planning decision. These modules are intended to catalyse decision-making. The building projects of the future will to a large extent concern the existing building stock. This fact calls for planning measures for which tools and aids leading to plausible and reliable planning decisions are entirely lacking.
The development of such aids is the goal of this research. The paper presents prototypical software modules that address the problem of the potential developability of a plot of land. The modules process rules, taken from the relevant standards and regulations, that must be observed when working out a planning solution.
The approach discussed here is part of research into an overall concept for digital instruments which support the entire planning process and help in enabling planning decisions to be based upon clear reasoning and plausible arguments. Such specialist systems must take into account currently available technology, such as networked working patterns, object-orientation, building and product models as well as the working method of the planner. The paper describes a plausibility instrument for the formulation of colour scheme proposals for building interiors and elevations. With the help of intuitively usable light simulations, colour, material and spatial concepts can be assessed realistically. The software prototype “Coloured Architecture” is conceived as a professional extension to conventional design tools for the modelling of buildings. As such it can be used by the architect in the earliest design phases of the planning process as well as for colour implementation on location.
Complex gridshell structures used in architecturally ambitious constructions remain as appealing as ever in the public realm. This paper describes the theory and approach behind the software realisation of a tool which helps in finding the affine self-weight geometry of gridshell structures. The software tool DOMEdesign supports the formal design process of lattice and grid shell structures based upon the laws of physics. The computer-aided simulation of suspension models is used to derive structurally favourable forms for domes and arches subject to compression load, based upon the input of simple architectonic parameters. Irregular plans, three-dimensional topography, a choice of different kinds of shell lattice structures and the desired height of the dome are examples of design parameters which can be used to modify the architectural design. The provision of data export formats for structural dimensioning and visualisation software enables engineers and planners to use the data in future planning and to communicate the design to the client.
Discrete work processes can be formally described with the help of Petri nets. Petri nets are based on graph theory. The elements of two node sets are called places and transitions and are connected by directed edges. Places represent conditions or states; transitions represent events or activities. Petri nets make it possible to model not only a static predecessor-successor structure but also events, alternatives and concurrency. This paper presents how a construction schedule given as an activity-on-node network can be mapped onto a condition/event net in very few steps. All necessary sub-steps, such as forming subnets or coarsening and refining nodes, rest on a mathematically sound foundation. In contrast to other formulations of construction processes, the theory of Petri nets is a general theory and can be applied in many fields. The use of such a mathematical abstraction enables the reuse of previously developed solution approaches, so that the experience gained can also be applied to the modelling of other work processes.
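The condition/event semantics described above can be sketched in a few lines; the place and transition names are illustrative construction-schedule examples, not taken from the paper:

```python
# Minimal condition/event net: a transition is enabled when all of its input
# conditions hold and none of its output conditions hold yet (C/E semantics).
class CENet:
    def __init__(self):
        self.marking = set()       # currently holding conditions (places)
        self.transitions = {}      # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (set(inputs), set(outputs))

    def enabled(self, name):
        pre, post = self.transitions[name]
        return pre <= self.marking and not (post & self.marking)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} not enabled")
        pre, post = self.transitions[name]
        # Firing consumes the input conditions and establishes the outputs.
        self.marking = (self.marking - pre) | post
```

A construction activity then becomes a transition between its precondition and postcondition, e.g. "pour concrete" between "formwork done" and "concrete poured".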
In connection with the revitalisation of large-panel buildings, the assessment or reassessment of the existing bracing systems is of great importance. For bracing, room-wide, room-high and mostly unreinforced concrete elements are generally used, which are only nominally connected to one another. Within this study, an existing computational model for the physically nonlinear analysis of bracing systems of multi-storey buildings under horizontal load is extended so that unreinforced horizontal joints can be taken into account. The analysis of the bracing systems is realised on the basis of mathematical optimization methods, the bracing walls being treated as open thin-walled, slender bars coupled by floor slabs that are rigid in their plane (diaphragms). The paper shows that taking the physically nonlinear load-bearing behaviour and the associated redistribution of internal forces into account opens up load-bearing reserves that can be credited to the stability of both existing and revitalised buildings.
Physikalisch nichtlineare Analyse von Aussteifungssystemen unter Einbeziehung von Lastfolgeeffekten
(2003)
The physically nonlinear analysis of reinforced concrete structures taking shakedown behaviour (adaptive load-bearing behaviour) into account, using methods of mathematical optimization, has been the subject of intensive research at the Chair of Concrete Structures I of the Bauhaus-Universität Weimar for years. In the following paper, the models and algorithms developed there are applied, by way of example, to the investigation of bracing systems in large-panel construction. Since in these buildings the bracing structure consists of assembled large-format precast concrete elements, the overall load-bearing behaviour is governed by the behaviour of the joints. The physical nonlinearity is characterised by the cracking of the unreinforced horizontal joints and the flexible bond in the vertical joints, and is accounted for accordingly in the computational model. Example calculations show that significant stress redistributions occur in the described bracing systems as a result of the nonlinear joint behaviour. Furthermore, load-sequence effects can be demonstrated computationally. In contrast to seismically loaded systems, which are subjected to repeated extreme loads within a very short time, the probability of occurrence of design-relevant wind loads is low.
Summer overheating in buildings is a common problem, especially in office buildings with large glazed facades, high internal loads and low thermal mass. Phase change materials (PCM) that undergo a phase transition in the temperature range of thermal comfort can add thermal mass without increasing the structural load of the building. The investigated PCM were micro-encapsulated and mixed into gypsum plaster. The experiments showed a reduction of the indoor temperature of up to 4 K when using a 3 cm layer of PCM plaster with micro-encapsulated paraffin. The measurement results were used to validate a numerical model based on a temperature-dependent heat capacity function. Thermal building simulation showed that a 3 cm layer of PCM plaster can help to fulfil German regulations concerning the heat protection of buildings in summer for most office rooms.
One of the most promising and recent advances in computer-based planning is the transition from classical geometric modeling to building information modeling (BIM). Building information models support the representation, storage, and exchange of various information relevant to construction planning. This information can be used for describing, e.g., geometric/physical properties or costs of a building, for creating construction schedules, or for representing other characteristics of construction projects. Based on this information, plans and specifications as well as reports and presentations of a planned building can be created automatically. A fundamental principle of BIM is object parameterization, which allows specifying geometrical, numerical, algebraic and associative dependencies between objects contained in a building information model. In this paper, existing challenges of parametric modeling using the Industry Foundation Classes (IFC) as a federated model for integrated planning are shown, and open research questions are discussed.
PARAMETER IDENTIFICATION OF MESOSCALE MODELS FROM MACROSCOPIC TESTS USING BAYESIAN NEURAL NETWORKS
(2010)
In this paper, a parameter identification procedure using Bayesian neural networks is proposed. Based on a training set of numerical simulations, in which the material parameters are sampled in a predefined range using Latin Hypercube sampling, a Bayesian neural network, extended to describe the noise of multiple outputs using a full covariance matrix, is trained to approximate the inverse relation from the experiment (displacements, forces etc.) to the material parameters. The method offers not only the possibility to determine the parameters themselves, but also the accuracy of the estimates and the correlations between these parameters. As a result, a set of experiments can be designed to calibrate a numerical model.
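The sample-then-invert idea can be sketched as follows. The forward model (an axial bar, u = F*L/(E*A)) and the polynomial surrogate are illustrative stand-ins for the paper's mesoscale simulation and Bayesian neural network:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, bounds):
    # One sample per stratum in each dimension, strata shuffled per dimension.
    d = len(bounds)
    perms = np.array([rng.permutation(n) for _ in range(d)]).T
    u = (perms + rng.random((n, d))) / n
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

# Hypothetical forward model: axial displacement of a bar, u = F*L / (E*A).
def forward(E):
    F, L, A = 10e3, 2.0, 1e-3
    return F * L / (E * A)

# Training set: sample Young's modulus E in a predefined range, run the model.
E_samples = latin_hypercube(200, [(50e9, 250e9)])[:, 0]
u_samples = forward(E_samples)

# Fit the inverse relation u -> E with a polynomial surrogate
# (a deliberately simple stand-in for the Bayesian neural network).
u_lo, u_hi = u_samples.min(), u_samples.max()
coeffs = np.polyfit((u_samples - u_lo) / (u_hi - u_lo), E_samples, deg=4)

def estimate(u):
    # Map a measured displacement directly to a parameter estimate.
    return np.polyval(coeffs, (u - u_lo) / (u_hi - u_lo))
```

A Bayesian network would additionally return the uncertainty of the estimate; the surrogate here only reproduces the point-estimate part of the pipeline.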
This study contributes to the identification of coupled THM constitutive model parameters via back analysis against information-rich experiments. A sampling-based back analysis approach is proposed, comprising both the identification of the model parameters and the assessment of their reliability. The results obtained in the context of buffer elements indicate that sensitive parameter estimates generally obey a normal distribution. From the sensitivity of the parameters and the probability distribution of the samples we can provide confidence intervals for the estimated parameters and thus allow a qualitative assessment of the identified parameters, which will in future work be used as inputs for prognosis computations of buffer elements. These elements play an important role, e.g., in the design of nuclear waste repositories.
Traffic simulation is a valuable tool for the design and evaluation of road networks. Over the years, the level of detail to which urban and freeway traffic can be simulated has increased steadily, shifting from a merely qualitative macroscopic perspective to a very detailed microscopic view, where the behavior of individual vehicles is emulated realistically. With the improvement of behavioral models, however, the computational complexity has also steadily increased, as more and more aspects of real-life traffic have to be considered by the simulation environment. Despite the constant increase in computing power of modern personal computers, microscopic simulation remains computationally expensive, limiting the maximum network size that can be simulated on a single-processor computer in reasonable time. Parallelization can distribute the computing load from a single computer system to a cluster of several computing nodes. To this end, the existing simulation framework had to be adapted to allow for a distributed approach. As the simulation is ultimately targeted to be executed in real-time, incorporating real traffic data, only a spatial partition of the simulation was considered, meaning the road network has to be partitioned into subnets of comparable complexity to ensure homogeneous load balancing. The partition process must also ensure that the division between subnets only occurs in regions where there is no strong interaction between the separated road segments (i.e. not in the direct vicinity of junctions). In this paper, we describe a new microscopic reasoning voting strategy, and discuss to what extent the increasing computational costs of these more complex behaviors lend themselves to a parallelized approach. We show the parallel architecture employed, the communication between computing units using MPIJava, and the benefits and pitfalls of adapting a single-computer application for use on a multi-node computing cluster.
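The load-balancing constraint, contiguous subnets of comparable complexity with cuts only away from junctions, can be sketched for a linear corridor of road segments. The greedy rule, the weights and the cut flags are illustrative assumptions, not the partitioner used in the paper:

```python
# Greedy balanced partition of a corridor of road segments into k contiguous
# subnets. weights[i] is the simulation complexity of segment i; cut_allowed[i]
# marks boundaries far enough from junctions to be safe cut points.
def partition(weights, cut_allowed, k):
    target = sum(weights) / k          # ideal load per computing node
    parts, current, load = [], [], 0.0
    for i, w in enumerate(weights):
        current.append(i)
        load += w
        last = i == len(weights) - 1
        # Cut once the target load is reached, but only at a safe boundary.
        if not last and len(parts) < k - 1 and load >= target and cut_allowed[i]:
            parts.append(current)
            current, load = [], 0.0
    parts.append(current)
    return parts
```

A real road network requires a graph partitioner rather than a linear scan, but the trade-off is the same: balance the per-node load while restricting where the network may be severed.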
PANTS ON FIRE?
(2011)
Grace Quiroga studied architecture at the University of Michigan and at the Vienna University of Technology. Her ongoing architectural projects include the design of a housing project in the Chinese province of Sichuan for a thousand families displaced by the earthquake of 2008. In addition, she is working on a doctoral dissertation titled “Rem Koolhaas and the architecture culture of the AA in the 1970’s”.
OÍDA OUK EIDÓS
(2011)
Ralf Hennig is a PhD-candidate at the chair of Theory and History of Modern Architecture at the Bauhaus-University Weimar. His prior research interests are focused on historical and current interaction between media, architecture and the city as well as the influence of the alliance of these entities on traditional principles of dwelling. In 2004-2005 he was responsible for the conception and constitution of the postgraduate Master’s degree programme MediaArchitecture at the Bauhaus-University Weimar. In 2007-2008 he worked there as a scientific associate at the chair of Sociology of Globalisation, involved in various activities such as the research project MEDIACITY.
In civil engineering it is very difficult and often expensive to excite structures such as bridges and buildings with an impulse hammer or shaker. This problem can be avoided with the output-only method, a special feature of stochastic system identification. The permanently present ambient noise (e.g. wind, traffic, waves) is sufficient to excite the structures under their operational conditions. The output-only method estimates the observable part of a state-space model that contains the dynamic characteristics of the measured mechanical system. Because the ambient excitation is assumed to be white, there is no need to measure the input. Another advantage of the output-only method is the possibility of obtaining highly detailed models by a special technique called the polyreference setup. To emulate the availability of a much larger set of sensors, data from varying sensor locations are collected: several successive data sets are recorded with sensors at different locations (moving sensors) and at fixed locations (reference sensors). The covariance functions of the reference sensors serve as the basis for normalising the moving sensors. The result of the subsequent subspace-based system identification is a highly detailed black-box model that contains the weighting function, including the well-known dynamic parameters, eigenfrequencies and mode shapes, of the mechanical system. The emphasis of this lecture is the presentation of an extensive damage-detection experiment. A 53-year-old prestressed concrete tied-arch bridge in Hünxe (Germany) was deconstructed in 2005. Beforehand, numerous vibration measurements were carried out. The first system-modification experiment was an additional support near the bridge bearing of one main girder. In a further experiment one hanger of one tied arch was cut through as an induced damage. Some first outcomes of the described experiments will be presented.
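The covariance-driven core of such a subspace identification can be sketched for a single output channel. Hankel size, model order and the synthetic covariance sequence in the test are illustrative assumptions, not the bridge data:

```python
import numpy as np

def ssi_cov_frequencies(R, order, dt):
    """Covariance-driven stochastic subspace identification, single output.
    R: output covariance sequence R[0], R[1], ...; order: model order;
    dt: sampling interval. Returns identified natural frequencies in Hz."""
    n = len(R) // 2
    # Hankel matrix of output covariances (scalar blocks for one channel).
    H = np.array([[R[i + j + 1] for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(H)
    U1 = U[:, :order]                       # column space = observability range
    # Shift invariance of the observability matrix yields the state matrix A
    # up to a similarity transform, which leaves the eigenvalues unchanged.
    A = np.linalg.pinv(U1[:-1]) @ U1[1:]
    lam = np.linalg.eigvals(A)
    mu = np.log(lam.astype(complex)) / dt   # continuous-time poles
    freqs = np.abs(mu) / (2 * np.pi)
    return np.unique(np.round(freqs, 6))
```

For ambient data, R would be estimated from the measured outputs (reference channels included); here the method is exact because the test feeds it an ideal single-mode covariance decay.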
OUT OF EMPIRE
(2011)
Philip Ursprung studied art history, general history and German studies in Geneva, Vienna and Berlin. He received his doctorate from the FU Berlin in 1993 and his habilitation from the ETH Zürich in 1999. He has taught at the universities of Geneva, Basel and Zürich, at the ETH Zürich, the Kunsthochschule Berlin-Weissensee and the Universität der Künste Berlin. From 2001 to 2005 he was Swiss National Science Foundation professor for the history of contemporary art at the Department of Architecture of the ETH Zürich. Since 2005 he has been professor of modern and contemporary art at the University of Zürich. In 2007 he was a visiting professor at the Graduate School of Architecture, Planning and Preservation of Columbia University, New York. He has been a guest curator at the Museum für Gegenwartskunst in Basel, at the Canadian Centre for Architecture in Montreal and at the Graduate School of Architecture, Planning and Preservation of Columbia University, New York. He is the author of Grenzen der Kunst: Allan Kaprow und das Happening, Robert Smithson und die Land Art (Munich, 2003), editor of Herzog & de Meuron: Naturgeschichte (Montreal and Baden, 2002), and co-author of Images: A Picture Book of Architecture (Munich, 2004), Minimal Architecture (Munich, 2003) and Studio Olafur Eliasson: An Encyclopedia (Cologne, 2008). His most recent book is Caruso St John: Almost Everything (Barcelona, 2008).
It is well known that complex quaternion analysis plays an important role in the study of higher order boundary value problems of mathematical physics. Following the ideas given for real quaternion analysis, the paper deals with certain orthogonal decompositions of the complex quaternion Hilbert space into its subspaces of null solutions of Dirac type operator with an arbitrary complex potential. We then apply them to consider related boundary value problems, and to prove the existence and uniqueness as well as the explicit representation formulae of the underlying solutions.
This paper deals with the development of a new multi-objective evolution strategy in combination with an integrated pollution-load and water-quality model. The optimization algorithm combines the advantages of the Non-Dominated Sorting Genetic Algorithm and Self-Adaptive Evolution Strategies. The identification of a good spread of solutions on the Pareto-optimal front and the optimization of a large number of decision variables both demand numerous simulation runs. In addition, statements with regard to the frequency of critical concentrations and peak discharges require continuous long-term simulations. Therefore, a fast-operating integrated simulation model is needed that provides the required precision of the results. For this purpose, a hydrological deterministic pollution-load model has been coupled with a river water-quality model and a rainfall-runoff model. Wastewater treatment plants are simulated in a simplified way. The functionality of the optimization and simulation tool has been validated by analyzing a real catchment area including sewer system, WWTP, water body and natural river basin. For the optimization/rehabilitation of the urban drainage system, both innovative and approved measures have been examined and used as decision variables. As objective functions, investment costs and river water quality criteria have been used.
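The non-dominated sorting at the heart of NSGA-type algorithms can be sketched as follows (a generic O(n²) version for minimisation, not the authors' implementation):

```python
# Pareto dominance for minimisation: a dominates b if it is no worse in every
# objective and strictly better in at least one.
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    # Repeatedly peel off the set of currently non-dominated solutions.
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

With cost and water-quality violation as the two objectives, the first front returned is exactly the trade-off set presented to the decision maker.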
This paper contributes to the discussion on the introduction of exclusive lanes for public transport. Results are analysed for the effectiveness of a 4-lane and a 6-lane street with and without an exclusive bus lane. Two basic sub-models have been applied:
- the binary logit model for modal split estimation, which takes into consideration the ratio of travel time by private car to that by public transport;
- the polynomial model for predicting link impedance, in which the real travel time is affected by the ratio of traffic volume to design capacity.
The parameters of both sub-models have been calibrated for Polish conditions. Relationships are determined between the number of person trips and the traffic volumes of buses/private cars, the market share of public transport in motorised trips, the average travel time lost in trips and the average operating cost. An iterative technique addresses the feedback between the modal split and the traffic volumes. Numerical calculations were carried out using EXCEL and MATLAB. Typical values of corridor length, passenger car occupancy rate, design capacity of the bus, and access and egress times to and from the parking/bus stop in urban areas were applied. On the basis of the estimated results, the marginal parameter values (the number of people carried at which a separated bus lane is most effective, the traffic volume of private cars, the traffic volume of buses and the share of public transport in trips) were obtained, using the average travel time lost and the operating cost as criteria. The introduction of street lanes with and without an exclusive bus lane can thus be optimized in relation to the number of person trips.
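The iterative feedback between modal split and congestion can be sketched as follows. The logit scale, the BPR-type volume-delay function and all numerical values are illustrative assumptions, not the calibrated Polish parameters:

```python
import math

def equilibrium_split(demand, t_bus, t0_car, capacity,
                      beta=0.1, alpha=0.15, power=4, iters=100):
    """Fixed-point iteration between a binary logit modal split and a
    BPR-type link impedance. demand: person trips/h; times in minutes;
    capacity: car volume at which congestion becomes significant."""
    share_car = 0.5
    for _ in range(iters):
        volume = demand * share_car
        # Polynomial (BPR-type) volume-delay: travel time grows with load.
        t_car = t0_car * (1 + alpha * (volume / capacity) ** power)
        # Binary logit on the car/bus travel-time difference.
        u = -beta * (t_car - t_bus)
        new_share = 1 / (1 + math.exp(-u))
        share_car += 0.5 * (new_share - share_car)   # damped update
    return share_car, t_car
```

The damping factor stabilises the oscillation that an undamped split/volume loop typically exhibits; at convergence the share and the congested travel time are mutually consistent.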
Steel structural design is an integral part of the building construction process. So far, various design methods have been applied in practice to satisfy the design requirements. This paper applies Differential Evolution algorithms to automate specific synthesis tasks and to rationalise the design process. The capacity of Differential Evolution algorithms to deal with continuous and/or discrete optimization of steel structures is also demonstrated. The goal of this study is to propose an optimal design of steel frame structures using built-up I-sections and/or a combination of standard hot-rolled profiles. For all steel frame structures considered in this paper, the optimization produced solutions better than the original solution designed by the manufacturer. Considering the criteria regarding the quality and efficiency of practical design, the optimal designs produced with Differential Evolution algorithms can completely replace conventional design because of their excellent performance.
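A minimal DE/rand/1/bin loop of the kind used for such sizing problems might look as follows. This is a generic sketch run on a toy objective, not the authors' implementation or their structural model:

```python
import random

def differential_evolution(f, bounds, np_=30, F=0.8, CR=0.9, gens=200, seed=1):
    """Classic DE/rand/1/bin minimising f over box constraints `bounds`."""
    rnd = random.Random(seed)
    d = len(bounds)
    pop = [[rnd.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            # Mutation: base vector plus scaled difference of two others.
            a, b, c = rnd.sample([j for j in range(np_) if j != i], 3)
            j_rand = rnd.randrange(d)   # guarantees at least one mutant gene
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rnd.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(d)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= cost[i]:           # greedy one-to-one selection
                pop[i], cost[i] = trial, ft
    best = min(range(np_), key=cost.__getitem__)
    return pop[best], cost[best]
```

For a frame design, f would evaluate the structural weight plus penalty terms for violated code checks, and discrete profile catalogues can be handled by rounding the trial vector to the nearest catalogue entry.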
The aim of this study was to investigate the optimization of the strength development of quaternary cements with 50 % clinker by a variation of the particle size distribution of the components GGBFS, fly ash and limestone powder.
By balancing the overall particle size distribution of the cement, using unprocessed fly ash and coarse limestone powder in combination with a very fine GGBFS, the water demand of the resulting quaternary cements remained unaltered, while the compressive strength of the cements was increased significantly after 7, 28 and 56 days. As expected, the quaternary cement with 30 wt.% of the fine slag exhibited a stronger strength increase (about 18 % after 28 days) than the cements with only 20 wt.% slag (about 10 % after 28 days).
The sizing of simple resonators like guitar strings or laser mirrors is directly connected to the wavelength and poses no complex optimisation problem. This is not the case with liquid-filled acoustic resonators of non-trivial geometry, where several masses and stiffnesses of the structure and the fluid have to fit together. This creates a scenario of many competing and interacting resonances varying in relative strength and frequency as design parameters change. Hence, the resonator design involves a parameter-tuning problem with many local optima. As a solution, evolutionary algorithms (EAs) coupled to a forced-harmonic FE simulation are presented. A new hybrid EA is proposed and compared to two state-of-the-art EAs on selected test problems. The motivating background is the search for better resonators suitable for sonofusion experiments, in which extreme states of matter are sought in collapsing cavitation bubbles.
The curing of pavement concrete is of particular importance for achieving a high freeze-thaw and de-icing salt resistance of the finished concrete pavement. In the exposed-aggregate concrete construction method, curing takes place in several steps. A first curing step protects the concrete against evaporation until the retarded surface mortar is brushed off. This is followed by the second curing step, usually by spraying on a liquid curing agent.
The second curing step is decisive for the freeze-thaw and de-icing salt resistance of the concrete pavement. Within a research project it was therefore investigated to what extent the freeze-thaw and de-icing salt resistance of exposed-aggregate concrete surfaces can be increased by optimizing the second curing step, in particular when cements containing ground granulated blast-furnace slag are used. Even a single wet curing treatment achieved a significantly higher resistance of the exposed-aggregate concrete to freeze-thaw attack with de-icing salt.
Starting from mathematical considerations, we have developed simple model approaches for the following optimization problem and carried out numerical tests: a map is divided into squares, each square being assigned a weighting factor. This weighting factor is small if the area can be crossed without difficulty and correspondingly large if it is a nature reserve, a lake or terrain that is difficult to traverse. We seek a favourable route from point A to point B that takes into account the landscape features given by the weighting factors. We first formulate the problem as a variational problem. A necessary condition that the solution function must satisfy is the Euler-Lagrange differential equation. With the help of the Hamiltonian function, this differential equation can be written in canonical form. By simplifying the model, the system of canonical equations can be made concrete enough to serve as the starting point for numerical investigations. To this end, we transform the original landscape into a "mountain landscape" in which high mountains characterise areas that are difficult to cross. The simplest model is a single mountain generated by the density function of a two-dimensional normal distribution. In addition, we carried out calculations for two overlapping mountains and for a ravine.
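A discrete counterpart of this weighted-route problem is the shortest path on the grid of weighted squares, which Dijkstra's algorithm solves directly. This sketch, with an illustrative 4-neighbourhood and a cost equal to the sum of the weights of all visited cells, is not the variational method of the paper:

```python
import heapq

def cheapest_path_cost(weights, start, goal):
    """Minimal accumulated cell weight from start to goal on a grid map.
    weights[r][c] is the crossing penalty of cell (r, c)."""
    rows, cols = len(weights), len(weights[0])
    dist = {start: weights[start[0]][start[1]]}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                    # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + weights[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")
```

On a map with a "mountain" of high weights in the middle, the returned route detours around it, which is the same qualitative behaviour the variational solution exhibits.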
We consider an industrial application consisting of the mass minimization of a frame in an injection moulding machine. This frame has to compensate the forces acting on the mould inside the machine and has to fulfill certain critical constraints. The deformation of the frame with constant thickness is described by the plane stress equations of linear elasticity. If the thickness varies, we use a generalized plane stress state with constant thickness in the coarse-grid elements. These direct problems are solved by an adaptive multigrid solver. The mass minimization problem leads to a constrained minimization problem for a nonlinear functional, which is solved by a standard optimization algorithm requiring the gradients with respect to the design parameters. For the shape optimization problem, we assume that the machine components consist of simple geometrical primitives determined by a few design parameters. Therefore, we calculate the gradient in the shape optimization by means of numerical differentiation, which requires the solution of approximately 4 direct problems per design parameter. The adaptive solver guarantees the automatic detection of critical regions and ensures a good approximation to the exact solution of the direct problem. This rather slow approach can be significantly accelerated by using the adjoint method to express the gradient. It will be combined with a direct implementation of several terms that appear after applying the chain rule to the gradient.
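The adjoint acceleration mentioned above can be stated compactly. For a discretised problem with stiffness matrix K(p), displacements u and objective J (notation assumed here, not taken from the paper, and with a load vector f independent of the design parameters p), a single adjoint solve replaces the several direct solves per parameter:

```latex
% State equation and objective
K(p)\,u = f, \qquad J = J\bigl(u(p),\, p\bigr)
% One adjoint problem
K(p)^{\mathsf{T}}\,\lambda = \frac{\partial J}{\partial u}
% yields every gradient component without further direct solves:
\frac{\mathrm{d}J}{\mathrm{d}p_i}
  = \frac{\partial J}{\partial p_i}
  - \lambda^{\mathsf{T}}\,\frac{\partial K}{\partial p_i}\,u
```

The cost of the gradient thus becomes essentially independent of the number of design parameters, which is exactly why the adjoint approach outperforms finite differencing here.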
Performing parameter identification prior to numerical simulation is an essential task in geotechnical engineering. However, it has to be kept in mind that the accuracy of the obtained parameters is closely related to the chosen experimental setup, such as the number of sensors and their locations. A well-considered positioning of sensors can increase the quality of the measurement and reduce the number of monitoring points. This paper illustrates this concept by means of a loading device that is used to identify the stiffness and permeability of soft clays. With an initial setup of the measurement devices, the pore water pressure and the vertical displacements are recorded and used to identify the aforementioned parameters. Starting from these identified parameters, the optimal measurement setup is investigated with a method based on global sensitivity analysis. This method yields an optimal sensor location assuming three sensors for each measured quantity, and the results are discussed.
In this paper, a circulation-type society is expressed as a recurrent architecture network described by a multi-agent model consisting of the following agents: user, builder, reuse maker, fabricator, waste disposer, material maker and earth bank (see Fig. 1). Structural members, materials, resources and money move among these agents. Each agent has its own rules and aims regarding structural damage, lifetime, cost reduction, the number of structural members and the structural systems. Reasonable prices of members (fresh, reused, recycled and disposed) are optimized by genetic algorithms (GAs) in this system, taking an equal distribution of money among the agents into account.
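A price search of this kind can be sketched with a minimal real-coded genetic algorithm. The GA below is generic (tournament selection, arithmetic crossover, Gaussian mutation), and the `spread` fitness, standing in for the imbalance of money among agents as a function of two member prices, is a hypothetical placeholder for the paper's multi-agent evaluation:

```python
import random

def ga_minimize(fitness, bounds, pop_size=30, gens=60, mut=0.2, seed=1):
    """Minimal real-coded GA: tournament selection, arithmetic
    crossover, Gaussian mutation, clamped to the given bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(p, q)]
            if rng.random() < mut:
                d = rng.randrange(dim)
                child[d] += rng.gauss(0.0, 0.1 * (bounds[d][1] - bounds[d][0]))
                child[d] = min(max(child[d], bounds[d][0]), bounds[d][1])
            children.append(child)
        pop = children
    return min(pop, key=fitness)

# Hypothetical imbalance measure for two member prices (optimum at 3.0, 1.5).
spread = lambda p: (p[0] - 3.0) ** 2 + (p[1] - 1.5) ** 2
best = ga_minimize(spread, [(0.0, 5.0), (0.0, 5.0)])
print(best)
```

In the paper's setting the fitness would be computed by simulating the agent network; the GA itself only needs a black-box evaluation of each candidate price vector.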
Applications of Operations Research methods in the practice of project management are scarcely to be found. This application gap is all the more surprising given the large number of theoretical publications on the topic of >Operations Research and Project Management<. The contribution works out the various causes of the so far limited practical relevance of the discipline >Operations Research< and makes proposals for how more application-friendly models can be developed in the future.
The Laguerre polynomials appear naturally in many branches of pure and applied mathematics and mathematical physics. Debnath introduced the Laguerre transform and derived some of its properties; he also discussed applications to the study of heat conduction and to the oscillations of a very long and heavy chain with variable tension. An explicit boundedness result for some class of Laguerre integral transforms will be presented.
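The Laguerre transform maps a function on the half-line to the sequence of its expansion coefficients with respect to the Laguerre polynomials. A sketch of computing one such coefficient, f_n = ∫₀^∞ e^(−x) f(x) Lₙ(x) dx, via Gauss-Laguerre quadrature is shown below; note that normalization conventions vary in the literature, so this assumes the orthonormal weight e^(−x):

```python
import numpy as np
from numpy.polynomial.laguerre import laggauss, Laguerre

def laguerre_coefficient(f, n, order=40):
    """n-th Laguerre transform coefficient of f, computed as
    \\int_0^inf e^{-x} f(x) L_n(x) dx by Gauss-Laguerre quadrature
    (the nodes and weights already absorb the e^{-x} weight)."""
    x, w = laggauss(order)
    Ln = Laguerre.basis(n)
    return float(np.sum(w * f(x) * Ln(x)))
```

Orthonormality gives a quick sanity check: transforming L₂ itself yields 1 for n = 2 and 0 for n = 1, exactly (up to rounding) because the quadrature is exact for polynomial integrands of this degree.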
In many industries, companies lose visibility of the human and technical resources of their field service. On the one hand, field-service staff operate with almost royal freedom; on the other hand, they do not take part in the daily communication of the central office and suffer from a lack of involvement in its decisions. The result is inefficiency, followed by reproaches in both directions. With radio systems and later mobile phones, this gap began to close, but the solutions are still far from productive.
Online Teaching in Practice
(2003)
The project PORTIKO (a multimedia teaching and learning platform for the civil engineering degree programme), funded by the BMBF, comprises ten civil engineering institutes in Braunschweig and Dresden together with the distance-learning programme at TU Dresden. In addition, three staff projects for didactics, teleteaching and the multimedia platform act as service providers and coordinators for the other subprojects. At TU Braunschweig, the Institute for Building Materials, Concrete Construction and Fire Protection is responsible, among other things, for the core courses in concrete construction (5th and 6th semester) and for a special specialization, fire and disaster protection (7th to 9th semester). This contribution first gives an overview of the platform used as well as the hardware and software techniques employed. It then reports on first experiences with the use of online teaching and presents the different strategic approaches derived from them for how to proceed after the end of the project in concrete construction and in fire protection. To strengthen the practical orientation, two project areas were formed within PORTIKO which, owing to the close relationship between the subjects, are meant to cooperate in a targeted manner: the areas >Virtual House< and >Virtual Infrastructure<. Using the Virtual House as an example, it is shown how the individual subjects, by means of coordinated exercise examples, demonstrate to students the necessity of teamwork and of the interaction of different disciplines. Through the online platform, students have a wide range of resources at their disposal, e.g. lecture notes enriched with video clips and photos of experiments and construction sites, audio-annotated lecture slides, old examinations with model solutions, assignments for homework, and so on. In online teaching, students can also be provided with interactive applications that allow a playful exploration and observation of parameter influences.
Such interactive exercises are created with Java, Flash and Authorware; examples are presented.
Using a quaternionic reformulation of the electrical impedance equation, we consider a two-dimensional separable-variables conductivity function and, applying two different techniques, we obtain a special class of Vekua equation whose general solution can be approached by virtue of Taylor series in formal powers, for which it is possible to introduce an explicit Bers generating sequence.
A lot of real-life problems frequently lead to the solution of a complicated (large-scale, multicriteria, unstable, nonsmooth, etc.) nonlinear optimization problem. In order to cope with large-scale problems and to develop many optimum plans, a hierarchical approach to problem solving may be useful. The idea of hierarchical decision making is to reduce the overall complex problem to smaller and simpler approximate problems (subproblems), which may thereupon be treated independently. One way to break a problem into smaller subproblems is the use of decomposition-coordination schemes. For finding proper values for the coordination parameters in convex programming, some rapidly convergent iterative methods are developed, and their convergence properties and computational aspects are examined. Problems of their global implementation and a polyalgorithmic approach are discussed as well.
On the Navier-Stokes Equation with Free Convection in Strip Domains and 3D Triangular Channels
(2006)
The Navier-Stokes equations and related ones can be treated very elegantly with the quaternionic operator calculus developed in a series of works by K. Gürlebeck, W. Sprößig and others. This study is extended in the present paper. In order to apply the quaternionic operator calculus to solve these types of boundary value problems fully explicitly, one basically needs to evaluate two types of integral operators: the Teodorescu operator and the quaternionic Bergman projector. While the integral kernel of the Teodorescu transform is universal for all domains, the kernel function of the Bergman projector, called the Bergman kernel, depends on the geometry of the domain. With special variants of quaternionic holomorphic multiperiodic functions we obtain explicit formulas for three-dimensional parallel plate channels, rectangular block domains and regular triangular channels. The explicit knowledge of the integral kernels then makes it possible to evaluate the operator equations in order to determine the solutions of the boundary value problem explicitly.
In this paper we consider the time-independent Klein-Gordon equation on some conformally flat 3-tori with given boundary data. We set up an explicit formula for the fundamental solution. We show that any solution to the homogeneous Klein-Gordon equation on the torus can be represented as a finite sum over generalized 3-fold periodic elliptic functions that are in the kernel of the Klein-Gordon operator. Furthermore, we prove Cauchy and Green type integral formulas and set up a Teodorescu and a Cauchy transform for the toroidal Klein-Gordon operator. These in turn are used to set up explicit formulas for the solution to the inhomogeneous version of the Klein-Gordon equation on the 3-torus.
As numerical techniques for solving PDEs or integral equations become more sophisticated, the generation of the geometric inputs should follow that numerical advancement. This document describes the preparation of CAD data so that they can later be applied to hierarchical BEM or FEM solvers. For the BEM case, the geometric data are described by surfaces which we want to decompose into several curved four-sided patches. We show the treatment of untrimmed and trimmed surfaces; in particular, we prevent smooth corners, which are bad for diffeomorphism. Additionally, we consider the problem of characterizing whether a Coons map is a diffeomorphism from the unit square onto a planar domain delineated by four given curves. We aim primarily at having not only theoretically correct conditions but also practically efficient methods. For the FEM geometric preparation, we need to decompose a 3D solid into a set of curved tetrahedra. First, we describe a method of decomposition that does not add too many Steiner points (additional points not belonging to the initial boundary nodes of the boundary surface). Then, we provide a methodology for efficiently checking whether a tetrahedral transfinite interpolation is regular; that is done by a combination of a degree-reduction technique and subdivision. Along with the method description, we also report on some interesting practical results from real CAD data.
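The Coons-map question can be made concrete with a small sketch: the standard bilinearly blended Coons patch is assembled from four boundary curves, and the Jacobian determinant is sampled on a grid as crude numerical evidence of injectivity. This sampling is only a necessary check, not the theoretically correct certificate the paper aims at; the four curves here simply parametrize the unit square:

```python
import numpy as np

def coons(c_bottom, c_top, c_left, c_right, u, v):
    """Bilinearly blended Coons map of the unit square, built from four
    boundary curves c(t) -> R^2 that share their corner points."""
    lin = (1 - v) * c_bottom(u) + v * c_top(u) \
        + (1 - u) * c_left(v) + u * c_right(v)
    bil = (1 - u) * (1 - v) * c_bottom(0) + u * (1 - v) * c_bottom(1) \
        + (1 - u) * v * c_top(0) + u * v * c_top(1)
    return lin - bil

def looks_diffeomorphic(patch, n=20, h=1e-5):
    """Sampled check: the Jacobian determinant keeps one sign on an
    n x n grid. Evidence only, not a proof of diffeomorphism."""
    dets = []
    for u in np.linspace(h, 1 - h, n):
        for v in np.linspace(h, 1 - h, n):
            xu = (patch(u + h, v) - patch(u - h, v)) / (2 * h)
            xv = (patch(u, v + h) - patch(u, v - h)) / (2 * h)
            dets.append(xu[0] * xv[1] - xu[1] * xv[0])
    return bool(min(dets) > 0) or bool(max(dets) < 0)

# The unit square described by its four edges.
bottom = lambda t: np.array([t, 0.0])
top = lambda t: np.array([t, 1.0])
left = lambda t: np.array([0.0, t])
right = lambda t: np.array([1.0, t])
patch = lambda u, v: coons(bottom, top, left, right, u, v)
print(looks_diffeomorphic(patch))  # True for the square
```

For this choice of curves the Coons map reduces to the identity, so the determinant is 1 everywhere; curved or strongly distorted boundaries are where a sign change, and hence a folding of the patch, can occur.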
With the aid of the factorization of the Schrödinger operator by quaternionic differential operators of first order, proposed in recent works by S. Bernstein and K. Gürlebeck, we study the system describing force-free magnetic fields with a nonconstant proportionality factor, the static Maxwell system for inhomogeneous media, the Beltrami condition, and the Dirac equation with different types of potentials depending on one variable. We obtain integral representations for solutions of these systems.
The paper is devoted to a study of properties of homogeneous solutions of the massless field equation in higher dimensions. We first treat the case of dimension 4. Here we use the two-component spinor language (developed for the purposes of general relativity). We describe how massless field operators are related to higher-spin analogues of the de Rham sequence - the so-called Bernstein-Gel'fand-Gel'fand (BGG) complexes - and how they are related to the twisted Dirac operators. Then we study the similar question in higher (even) dimensions. Here we have to use more tools from the representation theory of the orthogonal group. We recall the definition of massless field equations in higher dimensions and their relation to higher-dimensional conformal BGG complexes. Then we discuss properties of homogeneous solutions of the massless field equation. Using some recent techniques for the decomposition of tensor products of irreducible $Spin(m)$-modules, we are able to add some new results on the structure of the spaces of homogeneous solutions of massless field equations. In particular, we show that the kernel of the massless field equation in a given homogeneity contains at least one specific irreducible submodule.
Monogenic functions play a role in quaternionic analysis similar to that of holomorphic functions in complex analysis. A holomorphic function with nonvanishing complex derivative is a conformal mapping. It is well known that in Rn+1, n ≥ 2, the set of conformal mappings is restricted to the Möbius transformations only and that the Möbius transformations are not monogenic. The paper deals with a locally geometric mapping property of a subset of monogenic functions with nonvanishing hypercomplex derivatives (named M-conformal mappings). It is proved that M-conformal mappings orthogonal to all monogenic constants admit a certain change of solid angles and, vice versa, that this change can characterize such mappings. In addition, we determine planes in which those mappings behave like conformal mappings in the complex plane.