The qualitative methods for the investigation of dynamic systems developed by the authors are an effective means for the identification of such systems. The paper presents the results of extensive investigations of the behaviour of linear dynamic systems and of a symmetrical system with a double-well potential under polyharmonic excitation. The phase space of a dynamic system is multi-dimensional: each point of this space is characterized by at least four coordinates, namely displacement, velocity, acceleration and time. Real space has three dimensions and is more convenient for analysis, so we restrict the phase space to three dimensions: displacement, velocity and acceleration. Other choices of phase-plane parameters are also possible [1, 2]. The phase trajectory in a plane is of the greatest interest: the accelerations of points are more sensitive to deviations of the oscillations from harmonic ones, power criteria are interpreted most clearly on such a trajectory, and the dependence is symmetric relative to the axis of the diagram of the elastic characteristic. Only the phase trajectories allow the type and level of non-linearity of a system to be established. Using the given phase trajectories, the following features can be determined with a high degree of reliability: the presence or absence of non-linear behaviour of the dynamic system; the type of non-linearity; and the type of dynamic process (oscillations of the basic tone, combinative oscillations, chaotic oscillations).
Unlike existing asymptotic and stochastic methods for the identification of dynamic systems, the suggested technique does not require a significant amount of computation, and it also has a number of advantages in the investigation of complicated oscillations.
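The qualitative picture described above can be reproduced with a short numerical experiment. The sketch below integrates a damped, harmonically driven double-well oscillator with the classic fourth-order Runge-Kutta scheme and records the three phase-space coordinates (displacement, velocity, acceleration). The equation and all parameter values are illustrative choices of ours, not taken from the paper.

```python
import math

def simulate_double_well(a_f=0.3, omega=1.2, c=0.25, x0=1.0, v0=0.0,
                         dt=0.01, steps=20000):
    """Integrate x'' = -c*x' + x - x**3 + a_f*cos(omega*t) with classic RK4.

    This is a symmetric double-well (Duffing-type) oscillator used as an
    illustrative test system; the parameter values are ours.  Returns the
    displacement, velocity and acceleration histories, i.e. the three
    phase-space coordinates discussed in the text.
    """
    def acc(x, v, t):
        return -c * v + x - x ** 3 + a_f * math.cos(omega * t)

    xs, vs, accs = [x0], [v0], [acc(x0, v0, 0.0)]
    x, v, t = x0, v0, 0.0
    for _ in range(steps):
        k1x, k1v = v, acc(x, v, t)
        k2x = v + 0.5 * dt * k1v
        k2v = acc(x + 0.5 * dt * k1x, k2x, t + 0.5 * dt)
        k3x = v + 0.5 * dt * k2v
        k3v = acc(x + 0.5 * dt * k2x, k3x, t + 0.5 * dt)
        k4x = v + dt * k3v
        k4v = acc(x + dt * k3x, k4x, t + dt)
        x += dt / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v += dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
        xs.append(x)
        vs.append(v)
        accs.append(acc(x, v, t))
    return xs, vs, accs
```

Plotting the acceleration history against the displacement history then gives the phase trajectory whose shape reveals the double-well non-linearity.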
In this paper a meshless component is presented which internally uses the common meshless interpolation technique "Moving Least Squares". In contrast to usual meshless integration schemes such as cell quadrature and nodal integration, this study uses integration zones with triangular geometry spanned by three nodes for the 2D analysis. The boundary of the structure is defined by boundary nodes, which are similar to finite element nodes. Using the neighbourhood relations of the integration zones, an efficient search algorithm was developed to detect the nodes within the influence of the integration points. The components are directly coupled with finite elements by a penalty method. A widely accepted model for describing the fracture behaviour of concrete is the "Fictitious Crack Model", which is applied in this study; it differentiates between micro cracks and macro cracks, with and without force transmission across the crack surface, respectively. In this study the crack surface is discretized by node pairs in the form of a polygon, which is part of the boundary. To apply the "Fictitious Crack Model", finite interface elements are included between the crack surface nodes. The maximum principal strain at the crack tip is determined by introducing an influence area around the singularity. A practical example shows that the included elements improve the model by transmitting the surface forces during monotonic loading and by representing the contact forces of closed cracks during reverse loading.
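To make the interpolation technique concrete, here is a minimal 1D Moving Least Squares approximation with a linear basis and a Gaussian weight function. The paper works in 2D with triangular integration zones, so this 1D reduction, and the support radius chosen here, are purely illustrative.

```python
import math

def mls_interpolate(x_eval, nodes, values, support=0.3):
    """Moving Least Squares fit with linear basis p(x) = [1, x] and a
    Gaussian weight; a minimal 1D sketch of the technique named in the
    abstract.  At each evaluation point a small weighted least-squares
    problem is solved for the local coefficients."""
    # Assemble the 2x2 moment matrix A = sum w_i p_i p_i^T and rhs b.
    a00 = a01 = a11 = b0 = b1 = 0.0
    for xi, ui in zip(nodes, values):
        w = math.exp(-((x_eval - xi) / support) ** 2)
        a00 += w
        a01 += w * xi
        a11 += w * xi * xi
        b0 += w * ui
        b1 += w * xi * ui
    det = a00 * a11 - a01 * a01
    c0 = (b0 * a11 - b1 * a01) / det   # local fit coefficients (Cramer)
    c1 = (a00 * b1 - a01 * b0) / det
    return c0 + c1 * x_eval
```

With a linear basis, MLS reproduces linear fields exactly, a property often used to verify implementations.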
The paper is devoted to the investigation of the dynamic behaviour of a cable under various types of excitation. Such an element has low rigidity and is sensitive to dynamic effects. The structural scheme is a cable whose ends are located at different levels. The dynamic behaviour of the cable is analysed under a kinematic excitation represented by the oscillations of the upper part of a tower. The lower end of the inclined cable is assumed motionless, and the motion of the upper end is assumed to be horizontal only. The fourth-order Runge-Kutta method was implemented in software, the fast Fourier transform was used for spectral analysis, and standard graphical software was adopted for presenting the results of the investigations. The mathematical model of the cable oscillations was developed taking viscous damping into account. The dynamic characteristics of the cable were analysed for various parameters of damping and kinematic excitation. Time series, spectral characteristics and amplitude-frequency characteristics were obtained, and the resonance amplitude for different oscillating regimes was estimated. It is noted that increasing the coefficient of viscous damping and decreasing the amplitude of the tower's oscillations reduce the value of the critical frequency and the resonant amplitudes.
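The workflow of the abstract, time integration followed by spectral analysis, can be sketched for a single-mode linearisation of the cable. The model, the semi-implicit Euler step (the paper uses fourth-order Runge-Kutta), and all parameters below are our simplifications, and the direct Fourier sums stand in for the FFT.

```python
import math

def cable_spectrum(omega0=2.0, zeta=0.05, a_exc=1.0, omega_exc=1.5,
                   dt=0.005, t_trans=100.0, t_meas=200.0):
    """Simulate a single-mode model x'' + 2*zeta*omega0*x' + omega0**2*x
    = a_exc*cos(omega_exc*t), a deliberate linearisation with invented
    parameters, then estimate the response spectrum by direct Fourier
    sums over the steady-state window."""
    x, v, t = 0.0, 0.0, 0.0
    series = []
    n_total = int((t_trans + t_meas) / dt)
    for _ in range(n_total):
        a = a_exc * math.cos(omega_exc * t) - 2 * zeta * omega0 * v - omega0 ** 2 * x
        v += dt * a          # semi-implicit Euler step (paper: RK4)
        x += dt * v
        t += dt
        if t > t_trans:      # discard the transient part
            series.append((t, x))
    spectrum = {}
    for k in range(1, 31):   # frequency grid 0.1 .. 3.0 rad/s
        w = 0.1 * k
        c = sum(xv * math.cos(w * tv) for tv, xv in series)
        s = sum(xv * math.sin(w * tv) for tv, xv in series)
        spectrum[round(w, 1)] = math.hypot(c, s) * 2.0 * dt / t_meas
    return spectrum
```

For this linear model the spectral peak sits at the excitation frequency, which is exactly the kind of diagnostic the spectral characteristics in the paper provide.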
In this paper we revisit the so-called Bergman kernel method (BKM) for solving conformal mapping problems. This method is based on the reproducing property of the Bergman kernel function. The main drawback of this well-known technique is that it involves an orthonormalization process and is thus numerically unstable. This difficulty can in some cases be overcome by using the Maple system, which makes no use of numeric quadrature. We illustrate this implementation with a numerical example. The construction of reproducing kernel functions is not restricted to real dimension 2. Results concerning the construction of Bergman kernel functions in closed form for special domains in the framework of hypercomplex function theory suggest that the BKM can also be extended to mapping problems in higher dimensions, in particular the 3-dimensional case. We describe such a generalized BKM approach and present numerical examples obtained with specially developed software packages for quaternions.
The quaternionic operator calculus can be applied very elegantly to solve, in an analytic way, many important boundary value problems arising in fluid dynamics and electrodynamics. In order to apply the quaternionic operator calculus to solve these types of boundary value problems fully explicitly, one has to evaluate two types of integral operators: the Teodorescu operator and the quaternionic Bergman projector. While the integral kernel of the Teodorescu transform is universal for all domains, the kernel function of the Bergman projector, called the Bergman kernel, depends on the geometry of the domain. Recently the theory of quaternionic holomorphic multiperiodic functions and automorphic forms has provided new impulses for setting up explicit representation formulas for large classes of hyperbolic polyhedron-type domains. These include block-shaped domains, wedge-shaped domains (with or without additional rectangular restrictions) and circularly symmetric finite and infinite cylinders as particular subcases. In this talk we give an overview of the recent developments in this direction.
The reduction of the oscillation amplitudes of structural elements is necessary not only to maintain their durability and longevity but also to eliminate the harmful effects of oscillations on people and on technological operations; dampers are widely applied for this purpose. One of the most widespread models of structural friction forces, with a piecewise linear relation to displacement, was analysed. The author suggests mapping the phase trajectories in the "acceleration – displacement" plane. Unlike trajectories mapped in the "velocity – displacement" plane, they do not require a large number of geometrical constructions for identifying the characteristics of dynamic systems, which improves the accuracy. The analytical assumptions were verified by numerical modelling; the results show sufficiently good agreement between the numerical and analytical estimates of the dissipative characteristic.
We propose a new approach to the numerical solution of quasi-static elastic-plastic problems based on the Moreau-Yosida theorem. After time discretization, the problem is expressed as an energy minimization problem for unknown displacement and plastic strain fields. The dependency of the minimization functional on the displacement is smooth, whereas the dependency on the plastic strain is non-smooth. Moreover, there exists an explicit formula for calculating the plastic strain from a given displacement field. This allows us to reformulate the original problem as a minimization problem in the displacement only. Using the Moreau-Yosida theorem from convex analysis, the minimization functional in the displacements turns out to be Fréchet-differentiable, although the hidden dependency on the plastic strain is non-differentiable. The second derivative exists everywhere apart from the elastic-plastic interface dividing the elastic and plastic zones of the continuum. This motivates a Newton-like method, which converges super-linearly, as can be observed in our numerical experiments.
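The key point, that the Moreau-Yosida regularisation of a non-smooth term is continuously differentiable with an explicit gradient, can be illustrated in one dimension with an absolute-value term standing in for the non-smooth plastic part. This toy is ours; it is not the paper's elastoplastic functional.

```python
def soft_threshold(u, t):
    """Proximal map of t*|.| in closed form."""
    if u > t:
        return u - t
    if u < -t:
        return u + t
    return 0.0

def moreau_envelope(u, alpha, gamma):
    """Moreau-Yosida regularisation of the non-smooth term alpha*|u|:
    env(u) = min_p alpha*|p| + (u - p)**2 / (2*gamma)."""
    p = soft_threshold(u, alpha * gamma)
    return alpha * abs(p) + (u - p) ** 2 / (2.0 * gamma)

def envelope_gradient(u, alpha, gamma):
    """The envelope is C^1 even though alpha*|u| is not; its gradient
    has the closed form (u - prox(u)) / gamma."""
    return (u - soft_threshold(u, alpha * gamma)) / gamma
```

The same structure, a smooth envelope whose gradient is expressed through a proximal map, is what makes a Newton-like method applicable to the displacement-only reformulation.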
The investigations presented in the talk served to capture the determining factors and relevant aspects of safe and healthy behaviour. Individual behaviour-controlling factors and the organisation's forms of control in dealing with safety and health risks are to be surveyed. The methodology, the questionnaire and first results of the study are presented.
Situation and development trends; work processes in occupational safety; the Internet; occupational safety on the Internet (Internet content on the way to the paperless office); data exchange and the paperless office. Thanks to telecommunications and the Internet we are well on the way to the paperless office. Paper-bound information is already being pushed back by online services and digital document management systems. Networking increases the possibilities for simultaneous work. In the field of occupational safety, good working opportunities already exist.
Occupational safety concept for the construction of single-bore tunnels, using the example of the ICE crossing of the Thuringian Forest
(2003)
Experiences in implementing the Construction Site Ordinance (Baustellenverordnung) during the construction of the new ICE line in Thuringia, compared with motorway construction projects. The decisive difficulty lies in the single-bore design of the tunnels. A decisive prerequisite for successful work is cooperation in the planning phase between the client, the planner and the occupational safety authority, characterised by mutual respect. Haul route plan: unlike in motorway construction, the site roads do not belong to the respective construction lot; all access routes are considered separately. Escape routes.
Scientific colloquium from 14 to 16 October 1999 at the Bauhaus-Universität Weimar on the topic 'global village - Perspektiven der Architektur'
ARCHITECTURE AND ATMOSPHERE
(2011)
Nathalie Bredella is an architect. She was educated at the TU Berlin and Cooper Union, New York, and received a PhD in architectural theory. She taught architectural design at the TU Berlin. She is the author of Architekturen des Zuschauens. Imaginäre und reale Räume im Film (transcript-verlag), a work based on an interdisciplinary approach incorporating architecture, film theory and philosophy. Her interests in architectural practice focus on the relationship between spatial strategies, film and media on an urban and architectural scale.
Architecture and globality
(2000)
Tatjana Schneider is a lecturer at the School of Architecture, University of Sheffield. She holds a PhD in architecture, worked in architectural practice in Germany and the UK, and has taught, lectured and published widely (including ‘Flexible Housing’ with Jeremy Till). She was a member of the workers’ cooperative G.L.A.S. (Glasgow Letters on Architecture and Space), which undertook agit-prop works, educational workshops and community-based design consultancy, and produced the quarterly journal glaspaper. Her work focuses on the production and political economy of the built environment. Current work includes the research project ‘Spatial Agency’.
GRAFT is an architectural firm located in Los Angeles, Berlin, and Beijing. Their collective professional experience encompasses a wide array of building types including fine arts, educational, institutional, commercial and residential facilities. The firm has won numerous awards in Europe as well as in the United States. GRAFT was established in 1998 in Los Angeles by Lars Krückeberg, Wolfram Putz and Thomas Willemeit and opened an office in Berlin in 2001. In 2003 GRAFT opened an office in Beijing with Gregor Hoheisel as partner for the Asian market. In 2007 Alejandra Lillo became a partner for the office in Los Angeles. GRAFT was conceived as a ‘label’ for architecture, urban planning, design, music, and the “pursuit of happiness”. Since the firm was established, it has been commissioned to design and manage a wide range of projects in multiple disciplines and locations. With the core of the firm’s enterprises gravitating around the field of architecture and the built environment, GRAFT has always maintained an interest in crossing the boundaries between disciplines and “grafting” the creative potentials and methodologies of different realities. This is reflected in the firm’s expansion into the fields of exhibition design and product design, art installations, academic projects and “events”, as well as in the variety of project locations in Germany, China, the UAE, Russia, Georgia, the U.S. and Mexico, to name a few.
Lara Schrijver is an assistant professor at the Faculty of Architecture of the TU Delft. She is one of three program leaders for a new research program in the department of architecture, ‘The Architectural Project and its Foundations’. Schrijver holds degrees in architecture from Princeton University and the TU Delft. She received her Ph.D. from the TU Eindhoven in 2005. Schrijver has taught design and theory courses, and contributed to conferences in the Netherlands as well as abroad. She was an editor for OASE, journal for architecture, for ten years, and was co-organizer of the 2006 conference ‘The Projective Landscape’. Her current work revolves around the role of architecture in the city, and its responsibility in defining the public domain. Her first book, Radical Games, on the influence of the 1960s on contemporary discourse, is forthcoming in the spring of 2009.
Models in the context of engineering can be classified into process-based and data-based models. Whereas a process-based model describes the problem by an explicit formulation, a data-based model is often used where no such mapping can be found due to the high complexity of the problem. An Artificial Neural Network (ANN) is a data-based model that is able to “learn“ a mapping from a set of training patterns. This paper deals with the application of ANNs in time-dependent bathymetric models. A bathymetric model is a geometric representation of the sea bed. Typically, a bathymetry is measured and afterwards described by a finite set of measured data; measuring at different time steps leads to a time-dependent bathymetric model. To obtain a continuous surface, the measured data has to be interpolated by some interpolation method. Unlike explicitly given interpolation methods, the presented time-dependent bathymetric model using an ANN trains the approximated surface in space and time in an implicit way. The ANN is trained with topographic measurement data consisting of the location (x,y) and the time t. In other words, the ANN is trained to reproduce the mapping h = f(x,y,t) and afterwards it is able to approximate the topographic height for a given location and date. In a further step, this model is extended to take meteorological parameters into account, which gives the model a more predictive character.
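The idea of training a network to reproduce h = f(x, y, t) can be sketched with a tiny one-hidden-layer network in pure Python. The architecture, learning rate and synthetic training data below are illustrative assumptions, since the paper's network and measurement data are not given here.

```python
import math
import random

def train_bathymetry_ann(samples, hidden=8, lr=0.05, epochs=400, seed=1):
    """Train a one-hidden-layer tanh network to reproduce h = f(x, y, t)
    by plain stochastic gradient descent.  `samples` is a list of
    ((x, y, t), h) pairs; returns the mean squared error per epoch."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    losses = []
    for _ in range(epochs):
        total = 0.0
        for inp, target in samples:
            # forward pass
            z = [math.tanh(sum(w * v for w, v in zip(row, inp)) + b)
                 for row, b in zip(w1, b1)]
            out = sum(w * h for w, h in zip(w2, z)) + b2
            err = out - target
            total += err * err
            # backpropagation of the squared error for this sample
            for j in range(hidden):
                gz = err * w2[j] * (1.0 - z[j] ** 2)
                w2[j] -= lr * err * z[j]
                for k in range(3):
                    w1[j][k] -= lr * gz * inp[k]
                b1[j] -= lr * gz
            b2 -= lr * err
        losses.append(total / len(samples))
    return losses
```

After training, the same forward pass evaluates an approximate topographic height for any (x, y, t), which is exactly the implicit space-time surface the abstract describes.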
Design processes in civil engineering are highly cooperative processes with alternating phases of asynchronous and synchronous teamwork. The information about the current design object can be modelled as object structures that are stored in corresponding model management systems. When realising cooperatively usable environments for building design, however, specific requirements of CSCW applications must be taken into account in the selection of basic techniques, requirements which certain traditional methods do not meet. Besides various effects on the interaction behaviour of the design environment, the cooperation awareness of the mechanisms employed plays an important role. Access control mechanisms are essential in network-based multi-user environments, but conventional methods are too inflexible and not sufficiently expressive; an adapted and extended variant of the matrix method is suitable for use in model management systems. Likewise, when selecting concurrency control mechanisms, attention must be paid to their suitability for groupware systems. To support asynchronous cooperation, locking methods can be applied to the information in model management systems. For synchronous teamwork applications, such mechanisms must be applied to the shared information as well as to system resources of the design environment. Floor-passing methods are suitable here; the applicability of transformation methods should be examined for the specific application to be implemented.
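A floor-passing mechanism of the kind referred to above can be reduced to a minimal token holder: at any moment exactly one participant holds the floor, and only the holder may pass it on. This is a toy sketch of the principle, not of any particular model management system.

```python
import threading

class FloorControl:
    """Minimal floor-passing scheme for synchronous cooperation: exactly
    one participant holds the floor (and with it, write access to the
    shared model) at a time, and the holder passes it on explicitly."""

    def __init__(self, first_holder):
        self._holder = first_holder
        self._lock = threading.Lock()   # guards the holder name itself

    def holder(self):
        with self._lock:
            return self._holder

    def request_pass(self, current, successor):
        """Pass the floor to `successor`; only the current holder may."""
        with self._lock:
            if self._holder != current:
                return False
            self._holder = successor
            return True
```

In a real design environment the same token would also gate access to system resources such as shared viewports, as the text notes.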
Numerical simulations in the general field of civil engineering are common in the design of structures and the assessment of existing buildings. The behaviour of these structures is analytically unknown and is approximated with numerical simulation methods such as the Finite Element Method (FEM). The real structure is transferred into a global model (GM, e.g. a concrete bridge) comprising a wide range of sub-models (partial models, PM, e.g. material modelling, creep). These partial models are coupled to predict the behaviour of the observed structure (GM) under different conditions. The engineer needs to decide which models are suitable for computing, realistically and efficiently, the physical processes that determine the structural behaviour. Theoretical knowledge along with experience from prior design processes influences this model selection, which is thus often a qualitative selection between different models. The goal of this paper is to present a quantitative evaluation of global model quality for the simulation of a bridge subject to direct loading (dead load, traffic) and indirect loading (temperature), which induces restraint effects. The model quality can be investigated separately for each partial model and also for the coupled partial models in a global structural model. Probabilistic simulations, using uncertainty and sensitivity analysis, are necessary for the evaluation of these model qualities. The method is applied to the simulation of a semi-integral concrete bridge with a monolithic connection between the superstructure and the piers, and elastomeric bearings at the abutments. The results show that the evaluation of global model quality depends strongly on the sensitivity of the considered partial models and their related quantitative prediction quality.
The method provides not only a relative comparison between different models but also a quantitative representation of model quality based on probabilistic simulation, which can support the process of model selection for numerical simulations in research and practice.
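One simple way to quantify how strongly each input (here standing in for a partial-model parameter) drives the output variance is a sampling-based sensitivity estimate. The sketch below uses squared correlation coefficients, which coincide with first-order Sobol indices for additive linear models; it is a much cruder stand-in for the uncertainty and sensitivity analysis the paper applies, and the two-parameter model is invented.

```python
import random

def first_order_sensitivity(model, n=20000, seed=7):
    """Crude Monte Carlo estimate of first-order sensitivity indices via
    squared correlation between each of two uniform inputs and the model
    output (exact for additive linear models, an approximation otherwise)."""
    rng = random.Random(seed)
    xs = [(rng.random(), rng.random()) for _ in range(n)]
    ys = [model(a, b) for a, b in xs]
    ymean = sum(ys) / n
    yvar = sum((y - ymean) ** 2 for y in ys) / n
    indices = []
    for dim in range(2):
        xv = [p[dim] for p in xs]
        xmean = sum(xv) / n
        cov = sum((x - xmean) * (y - ymean) for x, y in zip(xv, ys)) / n
        xvar = sum((x - xmean) ** 2 for x in xv) / n
        indices.append(cov * cov / (xvar * yvar))
    return indices
```

The resulting indices rank the inputs by their share of the output variance, which is the information needed to decide where better partial models pay off.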
This paper presents the development of an assessment scheme for a visual qualitative evaluation of nailed connections in existing structures, such as board trusses. In terms of further use and preservation, a quick visual inspection will help to evaluate the quality of a structure regarding its load-bearing capacity and deformation behaviour. Tests of old and new nailed joints in combination with a rating scheme point out the correlation between the load-bearing capacity and condition of a joint. Old joints of comparatively good condition tend to exhibit better results than those of poor condition. Moreover, aged joints are generally more load-bearing than newly assembled ones.
Most insolvencies in Germany occur in the construction industry. The reasons for this are complex, but a modern management information system (M-I-S) and construction site controlling make it possible to recognise early how site results are developing. This requires that the working estimate be kept continuously up to date. Only then are monthly target/actual comparisons and a cost-to-complete analysis possible and meaningful. A monthly rolling forecast of the site result at project completion ensures that serious changes in the result are uncovered immediately. Only with knowledge of these developments can management act early (in the sense of an early-warning system) and take corrective measures. The forecast of the result at completion is, however, not sufficient on its own as a control instrument. The financial situation of the construction site must also be checked regularly, i.e. the state of work completed must be reconciled with the invoices issued to the client, and the client's unpaid invoices must be reviewed. The best forecast result is worthless if the client does not pay for the services rendered. The economic data are available to the responsible staff online in the construction site information system (B-I-S). A traffic-light system highlights the economic situation of the site.
This research focuses on the Case-based Reasoning (CBR) paradigm in architectural design (CBD) and education. The starting point for further exploring this only seemingly comprehensively investigated field is the finding that promising concepts exist but play no role in the daily routine of practising architects or in university education. In search of reasons for this limited success, a critical review of the CBR approach to architectural education and design was performed. The aim was to identify gaps in CBD research and to discover potential fields of research that could improve acceptance and practical suitability. Two major shortcomings were identified. First, the way the retrieval mechanisms of the systems under investigation relate to the needs of architectural designers and students. Second, successful CBD systems rely on third parties sharing their experiences and filling the databases with relevant cases. Two questions therefore remain unanswered: which projects become part of the database, and how existing projects are not only described but also evaluated. This is an essential task and a prerequisite for meeting the requirements of the underlying theory of CBR.
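The retrieval step discussed above is classically a weighted nearest-neighbour search over case features. The following sketch shows that mechanism; the feature names and weights are invented for illustration, and it is precisely the fit of such similarity measures to designers' needs that the research questions.

```python
def retrieve_cases(case_base, query, weights, k=3):
    """Weighted nearest-neighbour retrieval, the standard first step of
    the CBR cycle.  Cases and the query are dicts of numeric features
    scaled to [0, 1]; returns the k most similar cases."""
    def similarity(case):
        score = sum(w * (1.0 - abs(case[f] - query[f]))
                    for f, w in weights.items())
        return score / sum(weights.values())
    return sorted(case_base, key=similarity, reverse=True)[:k]
```

Whatever is not encoded as a weighted feature is invisible to such retrieval, which is one concrete form of the gap between retrieval mechanisms and architectural needs.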
Data exchange, data models and product data models have been research topics for several years. Various research projects and initiatives by several companies have led to cross-domain approaches such as the IFC and various STEP application protocols (APs). In steel construction specifically, the projects "Produktschnittstelle Stahlbau" and "CIMsteel" have been developed, extended and revised. As a further development of the existing exchange formats, newer approaches attempt to extend their benefit beyond pure data transfer: they integrate aspects of communication, collaboration and management, and additionally take over tasks of data and model administration. The result is a digital representation that incorporates all acquired data. Owing to the particular boundary conditions in the construction industry, a building model is assembled from interrelated domain models.
In civil engineering practice, values of column forces are often required before any detailed analysis of the structure has been performed. One reason arises from the fast-tracked nature of most construction projects: foundations are laid and base columns constructed while analysis and design are still in progress. The need for quick results when feasibility studies are performed, or when the effect of design changes on supporting columns is evaluated, creates further situations in which column forces are required but a detailed analysis to obtain them seems superfluous. It was therefore concluded that an efficient tool for column force calculation, avoiding the extensive input required by a finite element analysis, would be highly beneficial. The automation of the process is achieved by means of a Voronoi diagram, which is used (a) to subdivide the floor into influence areas and (b) as a basis for automatic load assignment. The implemented procedure is integrated into a CAD system in which the relevant geometric information of the floor, i.e. its shape and column layout, can be defined or uploaded. A brief description of the implementation is included, together with some comparative results and considerations regarding the continuation of the study.
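The role of the Voronoi diagram can be illustrated with a discrete stand-in: rasterise the floor plate and assign each cell to its nearest column, which approximates the Voronoi influence areas and hence the tributary loads. The rectangular floor and the grid resolution are our illustrative choices; the paper constructs the exact diagram inside a CAD system.

```python
def tributary_loads(columns, width, height, load_per_area, cells=200):
    """Approximate Voronoi-based load assignment: each grid cell of a
    rectangular floor is assigned to the nearest column, so the
    accumulated cell loads approximate the tributary (influence) areas.
    `columns` is a list of (x, y) positions; returns one load per column."""
    dx, dy = width / cells, height / cells
    cell_load = load_per_area * dx * dy
    loads = [0.0] * len(columns)
    for i in range(cells):
        for j in range(cells):
            px, py = (i + 0.5) * dx, (j + 0.5) * dy
            nearest = min(range(len(columns)),
                          key=lambda k: (columns[k][0] - px) ** 2
                                        + (columns[k][1] - py) ** 2)
            loads[nearest] += cell_load
    return loads
```

For a symmetric column layout the discrete version already reproduces the exact tributary split, and total load is conserved by construction.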
A method for automatically maintaining the vibration amplitude of a number of mechanisms at a given level, when the exciting force amplitude varies greatly, is presented. For this purpose a pendulum is attached to the mechanism through a viscoelastic hinge. The load of the pendulum can move along an arm, to which it is viscoelastically connected.
It is not uncommon for analysis and simulation methods to be used mainly to evaluate finished designs and to prove their quality, whereas the potential of such methods is to lead or control a design process from the beginning. We therefore introduce a design method that moves away from a “what-if” forecasting philosophy and increases the focus on backcasting approaches. We use the power of computation, combining sophisticated methods for generating designs with analysis methods, to close the gap between analysis and synthesis of designs. For the development of a future-oriented computational design support we need to be aware of the human designer’s role: a productive combination of the excellence of human cognition with the power of modern computing technology is needed. We call this approach “cognitive design computing”. The computational part aims to mimic the way a designer’s brain works by combining state-of-the-art optimization and machine learning approaches with available simulation methods. The cognition part respects the complex nature of design problems through the provision of models for human-computer interaction; this means that a design problem is distributed between computer and designer. In the context of the conference slogan “back to command”, we ask how we may imagine the command over a cognitive design computing system. We expect that designers will need to cede control of some parts of the design process to machines, but in exchange they will gain new, powerful command over complex computing processes. Designers therefore have to explore their potential role as commanders of partially automated design processes. In this contribution we describe an approach for the development of a future cognitive design computing system with a focus on urban design issues.
The aim of this system is to enable an urban planner to treat a planning problem as a backcasting problem: the planner defines what performance a design solution should achieve, and the system automatically queries or generates a set of best possible solutions. This kind of computational planning process offers proof that the designer meets the originally, explicitly defined design requirements. A key way in which digital tools can support designers is by generating design proposals. Evolutionary multi-criteria optimization methods allow us to explore a multi-dimensional design space and provide a basis for the designer to evaluate contradictory requirements, a task urban planners face frequently. We also reflect on why designers will cede more and more control to machines. To this end, we investigate first approaches to learning how designers use computational design support systems in combination with manual design strategies to deal with urban design problems, employing machine learning methods. By observing how designers work, it is possible to derive more complex artificial solution strategies that can help computers make better suggestions in the future.
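The evaluation of contradictory requirements rests on Pareto dominance; a minimal filter for it looks as follows. The objective tuples are placeholders for whatever performance measures the planner defines.

```python
def pareto_front(solutions):
    """Return the non-dominated subset of candidate designs, each scored
    by a tuple of objectives to be minimised.  This dominance filter is
    the core step of evolutionary multi-criteria search."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, better in at least one
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

The surviving set is the trade-off frontier presented to the designer, who then commands the search rather than picking a single compromise up front.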
With the Award Regulation (Vergabeverordnung) and the new contracting regulations (Verdingungsordnungen), European procurement law has for the time being been transposed into national procurement law. Nevertheless, the new procurement law raises questions that will presumably be decided by the courts. The legal bases for construction under European and national procurement law are presented. The essential contents of the Act against Restraints of Competition (GWB), award procedures under the GWB and questions of legal protection are discussed.
BAUHAUS ISOMETRY AND FIELDS
(2012)
While integration increases through networking, segregation strides ahead too. Most of us fix our minds on special topics, yet we rely on our intuition too. We sometimes wait for the inflow of new ideas or valuable information that we hold in high esteem, although we are not entirely conscious of its origin. We may even say the most precious intuitions are rooted in deep subconscious, collective layers of the mind. Take as a simple example the emergence of orientation in paleolithic events and its relation to the dihedral symmetry of the compass. Consider also the extension of this algebraic matter into the operational structures of the mind on the one hand and into the algebra of geometry, Clifford algebra as we call it today, on the other. Culture and mind, and even the individual act of creation, may be connected with transient events that are subconscious and inaccessible to cognition in principle. Other events causative for our work may be merely invisible to us, though in principle they should turn out to be attainable; in this case we are simply ignorant of the whole creative process. Sometimes we begin to use unusual tools or turn into handicraft enthusiasts; then our small institutes turn into workshops and factories. All this is indeed in keeping with the Bauhaus and its spirit. We shall go into this together, and we shall present a record of this session.
Sandra Lippert-Vieira, born in Lisbon, Portugal, in 1971, completed her architecture studies at the Universidade Lusíada in Lisbon in 1995. Until 2003 she worked as a freelance architect in Lisbon and was a design assistant at the Universidade Moderna and Universidade Lusófona in Lisbon under Prof. em. Amâncio d’Alpoim Guedes, chair of design. She is currently a research associate in building theory with Prof. Daniele Marques at the Institute for Design, Art and Theory, KIT – Karlsruhe Institute of Technology. She received her doctorate at BTU Cottbus under Prof. Führ with the thesis „Dissoziative Architektur. Zwischen Teufelskralle und Scheinriese. Wege zu einem weiteren Verständnis der Architektur des Expressionismus.“
Previous publications:
"Wege zu einer Rezeptionsästhetik in der Architektur: das implizite Leben der gebauten Welt," in: Wolkenkuckucksheim (issue 2/08); "Texte und Kontexte" by Jürgen Habermas and "Martin Heidegger. Unterwegs zu seiner Biographie" by Hugo Ott. Research interests: expressionist architecture, outsider architecture, methods of architectural interpretation, the avant-garde, and post-anarchism.
Building defects in residential and commercial construction – a Thuringian survey and approaches to solving the problem
(2000)
The author gives an overview of current research on building defects in residential, commercial and industrial buildings (susceptibility to damage, temporal distribution of defects, distribution of causation and fault, damage intensity as a function of contract structures, costs of remedying defects and damage). Conclusions are drawn about the measures necessary to reduce damage intensity.
In the first application of the German Private Financing of Trunk Roads Act (Fernstraßenbauprivatfinanzierungsgesetz), the concessionaire was commissioned at an early project phase, before plan approval. Only this made it possible to develop a comprehensive site-management concept, tailored to the technology of an executing construction company, as a basis for the plan approval. Presentation of the project and of the construction-management concept: rough construction scheduling, site facilities, workforce deployment, soil management, supply of main building materials and earth transports in public space, and a traffic diversion concept for public traffic matched to the construction phases.
Rectangular steel frames subjected to strong ground motion are considered. Their behavior factor is evaluated numerically using nonlinear time-history analysis and different ground acceleration records. The behavior factor is determined assuming that a severe collapse mechanism occurs throughout the time history. The system of equations is transformed into a single equation, and then the energy balance concept is applied. An expression for the behavior factor is derived, its application to a four-story, two-bay steel frame is illustrated, and the corresponding results are discussed.
In structural design, for concrete as well as steel structures, nonlinear analysis methods will in future be applied to a greater extent than was customary or possible in the past. Important impulses come from European standardization. When applying analysis methods that account for the nonlinearity of material behavior and deliberately exploit it in determining structural safety, it is necessary to trace the development of plastic deformations and to include it as a criterion when assessing the ultimate limit state. This paper presents mathematical models for the following analysis tasks: determination of internal forces and deformations in plane frame structures according to second-order theory, taking physical nonlinearity into account, and determination of limit loads defined by stress and deformation criteria. It turns out that mathematical models based on extremal principles and incorporating mathematical optimization can be formulated effectively and with sufficient generality. As example calculations show, assessing the load-bearing capacity with deformation limits taken into account is of decisive importance for avoiding misjudgments of structural safety.
In this paper, we present an empirical approach for the objective and quantitative benchmarking of optimization algorithms with respect to characteristics induced by the forward calculation. Owing to the professional background of the authors, this benchmarking strategy is illustrated on a selection of search methods with regard to the expected characteristics of geotechnical parameter back-calculation problems. Starting from a brief introduction to the approach employed, a strategy for optimization algorithm benchmarking is introduced. The benchmarking uses statistical tests carried out on well-known test functions superposed with perturbations, both chosen to mimic the objective function topologies found in geotechnical problems. Here, the moved axis-parallel hyper-ellipsoid test function and the generalized Ackley test function, in conjunction with an adjustable amount of objective function roughness and an adjustable fraction of failing forward calculations, are analyzed. In total, results for five optimization algorithms are presented, compared and discussed.
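As a minimal illustration of the benchmarking idea, the sketch below implements the generalized Ackley function, superposes an adjustable sinusoidal roughness on it, and collects the best values found by a plain random search over repeated seeded runs. All parameter choices here are illustrative, not taken from the paper.

```python
import math
import random

def ackley(x, a=20.0, b=0.2, c=2 * math.pi):
    """Generalized Ackley test function; global minimum f(0, ..., 0) = 0."""
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(c * xi) for xi in x) / n
    return -a * math.exp(-b * math.sqrt(s1)) - math.exp(s2) + a + math.e

def perturbed(f, amplitude=0.5, freq=50.0):
    """Superpose a high-frequency term to mimic objective-topology roughness."""
    def g(x):
        return f(x) + amplitude * sum(math.sin(freq * xi) for xi in x) / len(x)
    return g

def random_search(f, dim, bounds=(-5.0, 5.0), evals=2000, seed=0):
    """Plain random search: the weakest sensible baseline in a benchmark."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(evals):
        best = min(best, f([rng.uniform(*bounds) for _ in range(dim)]))
    return best

# Statistics over repeated independent runs, the core of the benchmarking idea.
rough = perturbed(ackley)
results = [random_search(rough, dim=2, seed=s) for s in range(20)]
mean_best = sum(results) / len(results)
```

A real benchmark would replace `random_search` by each algorithm under test and apply a statistical test to the per-run results.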
With regard to the integration of the individual building life-cycle phases and of the various parties involved, in particular within planning and revitalization processes, decisive deficits currently exist. The general objective of the research presented in this paper is to support and improve this integration by providing all building-related information across disciplines and life-cycle phases. This requires, on the one hand, suitable approaches for modeling and integrating the manifold discipline-specific data and, on the other hand, suitable solutions enabling global access, navigation and search across the entire data stock. The modeling and management of building-related data has long been the subject of various research efforts. Within the collaborative research center SFB 524, a dedicated approach based on a runtime-dynamic federation of partial models was developed; its essential features are contrasted with other approaches. The focus of this paper, however, is the development of a suitable, flexible navigation and search layer for realizing project-global information retrieval. From the perspective of modeling and data management, as well as from that of information retrieval and presentation in planning processes, such search tools face various requirements, the most essential principle being maximum flexibility with regard to the available presentation techniques and their free combination with formal query techniques. The developed system concept is based on a framework that defines several basic types of search modules and their interaction principles. Individual search modules are realized as instances of these module types and can be integrated into the navigation layer dynamically at runtime, as required.
The technical realization of the system takes place in the environment of existing prototypes from preceding research activities. This technical environment imposes various constraints which make several adaptations of the general system concept necessary prior to prototype implementations. This paper presents the current state of development of the system from a conceptual and technical point of view, as well as first prototype realizations of search modules.
Processes in civil engineering are complex and involve a large number of different tasks with many logical dependencies. Based on these project-specific dependencies, a construction schedule is usually created manually. As a rule, several variants, and thus alternative construction sequences, exist for realizing a project. Which of these execution variants is put into practice is decided by the respective project manager. If changes or disturbances occur during construction, the affected tasks and sequences have to be modified by hand and alternative tasks and sequences executed instead. This procedure is often very laborious and expensive. Current research addresses the automatic generation of construction schedules, based on tasks together with their required preconditions and produced results. This paper presents a methodology for computing construction schedules with execution variants, in the form of workflow nets, at any point in time. The method is illustrated schematically using an example from road construction.
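The generation idea, tasks firing once their preconditions are satisfied and contributing their results to the project state, can be sketched as a simple forward-chaining scheduler. The task names and state tokens below are hypothetical, and the workflow-net formalism of the paper is richer than this sketch:

```python
def plan(tasks, initial_state):
    """Forward-chaining scheduler: repeatedly execute any task whose
    preconditions are contained in the current state, adding its results.
    `tasks` maps task name -> (set of preconditions, set of results)."""
    state = set(initial_state)
    pending = dict(tasks)
    order = []
    progress = True
    while pending and progress:
        progress = False
        for name, (pre, post) in list(pending.items()):
            if pre <= state:
                order.append(name)   # task is executable now
                state |= post        # its results enable later tasks
                del pending[name]
                progress = True
    return order, pending  # non-empty `pending` means no feasible sequence

# Hypothetical road-construction example (task names are illustrative only).
tasks = {
    "earthworks":   (set(),              {"subgrade ready"}),
    "base course":  ({"subgrade ready"}, {"base ready"}),
    "asphalt":      ({"base ready"},     {"surface ready"}),
    "road marking": ({"surface ready"},  {"road open"}),
}
order, unscheduled = plan(tasks, initial_state=set())
```

Alternative execution variants would appear here as several tasks producing the same result token, from which one variant is chosen.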
In the recent past, the development of new programs and of computing technology has led to radical changes in the methods used to analyze engineering structures. First among these are the matrix methods, which permit a compact and general way of writing the governing equations and are well suited for processing on computers. Electronic computers have thereby become capable not only of solving systems of linear algebraic equations with hundreds or thousands of unknowns, but also of processing differential equations and even of assembling these equations. Computers thus take over the major part of the process of analyzing and designing a structure. Questions concerning the use of these capabilities for problems of construction informatics and structural mechanics are undoubtedly topical, and this article is devoted to them. In design practice, complex bar systems and cylindrical folded-plate systems frequently occur. The following must be taken into account: displacements in bar systems, their elastic foundations, the longitudinal and transverse bending that occurs, and harmonic vibrations. This article presents a universal algorithm for deriving matrices for the stiffness and strength analysis of bars. The developed algorithm permits the analysis of complicated bar systems that are widespread in practice. The symbolic programming language MAPLE was used to derive the matrices for the strength analysis. Furthermore, the article shows the analysis of folded-plate systems using the discrete-continuous model of V. S. Vlasov by means of a universal algorithm.
For setting up the matrix for the strength analysis, layer-wise assembly and Gaussian elimination are proposed as system settings. The article concludes with examples of the analysis of complicated structures.
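As a small stand-in for the matrix methods discussed above, the sketch below assembles the global stiffness matrix of a chain of axial two-node bar elements in plain Python. The paper itself derives its matrices symbolically in MAPLE; the element data here are illustrative.

```python
def bar_stiffness(E, A, L):
    """Local stiffness matrix of an axial two-node bar, k = EA/L * [[1,-1],[-1,1]]."""
    k = E * A / L
    return [[k, -k], [-k, k]]

def assemble(elements, n_nodes):
    """Assemble the global stiffness matrix from (node_i, node_j, E, A, L) tuples
    by adding each local matrix into the rows/columns of its two nodes."""
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for i, j, E, A, L in elements:
        ke = bar_stiffness(E, A, L)
        for a, p in enumerate((i, j)):
            for b, q in enumerate((i, j)):
                K[p][q] += ke[a][b]
    return K

# Two collinear bars sharing the middle node:  0 --- 1 --- 2
# (steel, E = 210 GPa, A = 1 cm^2, L = 2 m each; illustrative values)
K = assemble([(0, 1, 210e9, 1e-4, 2.0), (1, 2, 210e9, 1e-4, 2.0)], n_nodes=3)
```

The shared node 1 receives contributions from both elements, which is the essence of the compact matrix formulation the text describes.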
New structural developments also demand new calculation and analysis methods from civil engineers. Polystyrene-concrete ribbed floors belong to such structures. These floors, which play an important role in new construction as well as in the refurbishment and reconstruction of old buildings, are extensively described from a technological point of view, but have a comparatively small basis of calculation methods for the static analysis of serviceability and strength parameters. The paper presents a method for analyzing such floors. The interesting problem lies in the so-called second phase, when the floor no longer behaves as an elastic plate. Changed stiffness and rheology play an important role in the calculation.
If redistributions of internal forces due to plastic deformation are permitted in structural design, the load intensity must be limited by appropriate limit-state criteria matching the physically nonlinear load-bearing behavior. For structures subjected to several loads acting independently, repeatedly and in arbitrary order, the adaptive limit load (shakedown load), expressed by the adaptive limit-load factor, is a suitable limit-state criterion. Owing to random system properties and temporally random load behavior, the adaptive limit-load factor is a random variable. To determine the stochastic adaptive limit-load factor and the probability of failure with respect to the shakedown limit state for a period [0,T], mathematical optimization (mechanical problem) and Monte Carlo simulation (stochastic problem) are employed, which requires transforming time-variant load models into equivalent time-invariant ones. Using the example of a fixed-end reinforced concrete frame, it is investigated how different stochastic models of the structure and different procedures for superposing load extremes affect the assessment of the structure's failure probability for various service lives. The investigations show that the failure probability increases significantly when stochastic structural properties are taken into account. The randomness of the tensile strength of the reinforcement is of the greatest importance; all other random variables influence the failure probability only in their entirety and are almost insignificant when considered individually.
It further turns out that a simplified superposition of load extremes leads to a clear overestimation of the failure probability and is therefore to be regarded as a conservative model.
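The Monte Carlo part of the procedure can be sketched in miniature: the snippet below estimates a failure probability P(R/S < 1) with normally distributed resistance and load effect. The distributions are purely illustrative; in the paper the limit-load factor of each realization comes from an optimization problem, not from a closed-form ratio.

```python
import random

def failure_probability(n_samples=200_000, seed=1):
    """Crude Monte Carlo estimate of P(limit-load factor < 1), where the
    factor is modelled here as R / S with normally distributed resistance R
    and load effect S (illustrative distributions only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        R = rng.gauss(mu=1.6, sigma=0.15)   # resistance-side randomness
        S = rng.gauss(mu=1.0, sigma=0.10)   # load-side randomness
        failures += (R / S) < 1.0           # realization fails if factor < 1
    return failures / n_samples

pf = failure_probability()
```

With these numbers the analytic value of P(R < S) is of the order of 1e-4, so a large sample count is needed for a stable estimate, which is exactly the practical cost of simulation methods the abstract alludes to.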
In this paper we study the structure of the solutions to higher-dimensional Dirac-type equations generalizing the known λ-hyperholomorphic functions, where λ is a complex parameter. The structure of the solutions to the system of partial differential equations (D − λ)f = 0 shows a close connection with Bessel functions of the first kind with complex argument. The more general system of partial differential equations considered in this paper combines Dirac and Euler operators and emphasizes the role of the Bessel functions. However, contrary to the simplest case, one now obtains Bessel functions of arbitrary complex order.
Assessment of the limit load of elastic-plastic structures using stochastic methods
(1997)
For the analysis of steel as well as concrete structures, national and international standards increasingly permit the application of physically nonlinear analysis models. It is to be expected that, alongside the traditional elastic analysis model, the linear-elastic/ideally-plastic material model will find its way into structural analysis. While sufficient practical experience exists for the traditional analysis methods based on the theory of elasticity, and extensive investigations of the safety level achieved with them are available, the nonlinear analysis methods represent a new field of experience both mechanically and in terms of safety theory. From the multitude of open problems, this paper addresses the following: determination of the failure probability of elasto-plastic structural systems according to the plastic limit-load criterion, and determination of the stochastic properties of the plastic limit-load parameter of elasto-plastic structural systems. The mechanical problem is solved via a linear optimization problem formulated according to the static theorem of plastic limit load. As the stochastic method, simulation is applied, which can be based either on a random generation of realizations (stochastic simulation) or on a systematic generation of realizations (constructive simulation). An example is presented for each of the subproblems.
A bibliographic survey of the author's work over the past 35 years is given. Considered individually, the vast majority of the publications are answers to a wide variety of specific technical questions raised by the practice of industrial construction and, in particular, welding technology. Taken as a whole, however, this bibliography can also be regarded as a contribution to the history of technology, in fields in which the civil, welding and safety engineers of eastern Germany can look back on their achievements with justified self-confidence.
Big soft orange
(2000)
Scientific colloquium held from 14 to 16 October 1999 at the Bauhaus-Universität Weimar on the topic 'global village – perspectives of architecture'
The changed global security situation of the last eight years has shown the importance of emergency management plans in public buildings. The use of computer simulators for assessing fire safety design and evacuation processes is therefore increasing. The aim of these simulators is to obtain more realistic evacuation simulations. The challenge is, firstly, to realize the virtual simulation environment based on geometrical and material boundary conditions, secondly, to consider the mutual interaction effects between different parameters and, finally, to achieve a realistic visualization of the simulated results. To carry out this task, a new software method on a BIM platform has to be developed which can integrate all required simulations and provide an immersive output: BIM-ISEE (Immersive Safety Engineering Environment). BIM-ISEE will integrate the Fire Dynamics Simulator (FDS) for fire and evacuation simulation into Autodesk Revit, a BIM platform, and will represent the simulation results in the immersive virtual environment at the institute (CES-Lab). With BIM-ISEE the fire safety engineer will be able to obtain more realistic visualizations in the immersive environment, to modify his concept more effectively, to evaluate the simulation results more accurately and to visualize the various simulation results. It can also give rescue staff the opportunity to perform and evaluate emergency evacuation training.
A concept of non-commutative Galois extension is introduced, and binary and ternary extensions are considered. Non-commutative Galois extensions of the Nonion algebra and of su(3) are constructed. Ternary and binary Clifford analysis are then introduced for non-commutative Galois extensions, and the corresponding Dirac operators are associated with them.
This work deals with the application of biorthogonal wavelet systems in parameter identification. Its purpose is to lay the foundations for using such wavelets, and hence the fast wavelet transform (FWT), systematically and effectively in the evaluation of dynamic experiments. To this end, a system of connection coefficients is derived from the wavelet filters. With their help, operators, in particular differentiation and integration operators, are projected into the corresponding wavelet spaces. All connection coefficients can be computed recursively and exactly in finitely many steps. Starting from the dynamic force excitations and the measured reaction accelerations or velocities at the individual degrees of freedom, unknown stiffness and damping values can then be identified. For this purpose, after wavelet decomposition of all relevant time signals, a matching is carried out on the individual frequency bands, which leads in particular to a system of linear matrix equations for determining the unknown parameters. For larger numbers of degrees of freedom and parameters, a multi-stage optimization procedure is proposed. Compared with identification methods in the time domain, elaborate numerical quadrature schemes and the resulting error sources and stability problems are avoided. Compared with frequency-domain methods formulated exclusively with the FFT, disturbances in the boundary spectra are easier to control and eliminate. Moreover, with an FWT method, simpler denoising algorithms become applicable. Finally, compared with an FFT method, a later transition to the identification of nonlinear MDOF systems is methodologically easier.
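As the simplest concrete instance of a fast wavelet transform, the sketch below implements the orthogonal Haar FWT and its inverse and checks perfect reconstruction. The work above uses biorthogonal filter pairs and connection coefficients, which this minimal example does not cover.

```python
import math

def haar_fwt(signal):
    """Full fast wavelet transform with the Haar filter pair (length must be a
    power of two). Returns [coarsest approximation, coarsest details, ...,
    finest details]."""
    a = list(signal)
    levels = []
    s = 1.0 / math.sqrt(2.0)
    while len(a) > 1:
        approx = [s * (a[2 * i] + a[2 * i + 1]) for i in range(len(a) // 2)]
        detail = [s * (a[2 * i] - a[2 * i + 1]) for i in range(len(a) // 2)]
        levels.append(detail)   # finest details are produced first
        a = approx
    return [a] + levels[::-1]

def haar_ifwt(coeffs):
    """Inverse transform: rebuild the signal from approximation and details."""
    s = 1.0 / math.sqrt(2.0)
    a = list(coeffs[0])
    for detail in coeffs[1:]:
        a = [v for ap, d in zip(a, detail) for v in (s * (ap + d), s * (ap - d))]
    return a

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]   # illustrative samples
coeffs = haar_fwt(signal)
restored = haar_ifwt(coeffs)
```

The per-level detail lists correspond to the frequency bands on which the matching described in the abstract is carried out.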
Components of structural glazing have to meet different requirements and resist various impacts, depending on the field of application. Within an international research project of the EU innovation program Horizon 2020, special glass panes with a fluid circulating in capillaries are being developed to exploit solar energy. Major influences on this glazing are UV irradiation and the contact with the fluid, which affect the mechanical and optical durability of the bonding material within the glass setup. With regard to the visual requirements, acrylate adhesives and EVA films are analyzed as possible bonding materials using destructive and non-destructive testing methods. Two types of specimens are presented for obtaining the mechanical behavior and the surface appearance of the bonding material.
The planning of complex buildings is increasingly carried out with planning tools that allow building information to be exported in the STEP format based on the IFC (Industry Foundation Classes). The availability of this interface makes it possible to use building information for further processing. For visualizing the geometric data, the IFC provide various geometric models for representing building components. Among other things, geometric boolean operations are required for 'cutting' openings (e.g. for windows and doors) out of components.
The subject of this paper is the presentation of an algorithm for computing boolean operations on the basis of a triangulated B-Rep (boundary representation) model following HUBBARD (1990). Since, within IFC building models, components are often the result of several boolean operations (e.g. subtracting several window openings from a given wall), Hubbard's algorithm was adapted so that several boolean operations can be computed simultaneously. This optimization achieves a considerable reduction of the required computations and thus of the computing time.
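The batching idea, subtracting several openings in one pass instead of chaining pairwise boolean operations, can be reduced to one dimension for illustration: the wall becomes an interval and each opening a sub-interval. This pure-interval sketch is only an analogy to the triangulated B-Rep algorithm, not an implementation of it.

```python
def subtract_many(wall, openings):
    """Subtract all openings from the interval `wall` in a single sweep.
    Intervals are (start, end) pairs; openings may arrive in any order.
    One pass over the sorted openings replaces a chain of pairwise
    boolean subtractions (the batching idea, reduced to 1D)."""
    segments = []
    cursor = wall[0]
    for lo, hi in sorted(openings):
        lo, hi = max(lo, wall[0]), min(hi, wall[1])
        if lo >= hi:
            continue                       # opening lies outside the wall
        if lo > cursor:
            segments.append((cursor, lo))  # solid piece before the opening
        cursor = max(cursor, hi)
    if cursor < wall[1]:
        segments.append((cursor, wall[1]))
    return segments

# A 10 m wall with two window openings and one door opening (illustrative).
solid = subtract_many((0.0, 10.0), [(2.0, 3.0), (7.5, 8.5), (5.0, 6.0)])
```

The sweep touches each opening once, whereas sequential subtraction would rebuild the intermediate result after every operation.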
To support research in the building sector and to help it move towards a new digital economy, the European Commission funded various RTD projects under the 5th Framework initiative, especially the IST programme. The opportunity to bring these IST projects together was recognized, so that stronger links could be created under a clustering umbrella and, moreover, links between those projects and their RTD environment could be facilitated. This has been the objective of the work carried out within the ICCI (IST-2001-33022) cluster project. This paper introduces the main aims and objectives of the project and then presents its principal outcomes. In a second part, it synthesizes the underlying concepts, technologies and tools that will make ICT-based construction a reality in the near future, and gives recommended actions for the industry, the EC and Construction ICT R&D in Europe, passing on some of the benefit of the project experience to these three communities.
We describe the database requirements of SEED (Software Environment to Support the Early Phases in Building Design). The requirements are typical for a database that intends to support a heterogeneous design support environment consisting of independent software modules with diverse internal design models, requirements not met by any commercial database system. The design and implementation of this database is an integral part of the overall software engineering effort. We describe the SEED approach, which integrates external and in-house software based on a shared information model specified in the modeling language SPROUT, allowing for the specification of domains, classes, relationship types and their behavior, and multiple classifications. The SPROUT run-time system organizes and coordinates the communication between the software modules and the database.
The main aim of the research project in progress is to develop virtual models as tools to support decision-making in the planning of construction maintenance. The virtual models make it possible to transmit, visually and interactively, information related to the physical behaviour of the materials and components of given infrastructures, defined as a function of time. The interactive application allows decisions to be made on design options when defining plans for maintenance, conservation or rehabilitation. The first virtual prototype, now in progress, concerns lamps only. It allows the examination of the physical model, visualizing, for each element modelled in 3D and linked to a database, the corresponding technical information on the wear and tear of the material, calculated for that period of time. In addition, solutions for repair work or replacement, and the inherent costs, are predicted, the results being obtained interactively and visualized in the virtual environment itself. The aim is that the virtual model should be applicable directly to the 3D models of new constructions in rehabilitation situations. The practical use of these models is thus directed towards supporting decision-making in the design phase and in the planning of maintenance. In further work, other components will be analysed and incorporated into the virtual system.
Review of discrete optimization techniques for CAD:
- Discrete optimization in structure design
- Morphological method
- The alternative graph approach
- Convex discrete optimization without objective function
- Matroidal decomposition in design
- Decomposition of layered matrices
- Discrete optimization in designing
- Packing problem
- Optimal arrangement of rectangles and shortest paths in the L1 metric
- Partition problems
- Discrete optimization in computational geometry and computer graphics
- Maxima of a point set in the plane
- Triangulation
- Removing hidden lines and surfaces, one of the main problems in computer graphics
In building-planning systems, XML technologies can be employed in many areas with the goal of making these systems modular and web-enabled. Their use pays off as a basic data structure for various computer-internal models, as a control structure for customizing applications, as a link between object-based systems, and as a communication protocol between components. It is possible to represent, store and process complex objects from everyday planning practice by means of XML, and to distribute the corresponding components across a network or connect them via the Internet. The view that dominates today, of XML as an exchange medium, is complemented by the idea of an XML-based system: design objects can be formulated as 'XML objects' and used in the sense of late binding.
Besides the standard calculation programs for civil engineering structures, mathematical programs have lately become established for solving the differential equations arising in the analysis of mechanical and static systems. Programs such as Maple, Matlab, MathCAD and Mathematica are popular in this field. To the knowledge of the authors, the program Maple offers the widest range of functionality. Its advantages are, e.g., the alternatively symbolic or numerical solution of systems of differential equations, the easy handling of parameter studies, the immediate visualization of results, the definition of macros for selected calculation steps and their export to other computer languages and, not least, the automatically generated, very clear documentation of the mathematical calculus.
In order to minimize the probability of foundation failure resulting from cyclic actions on structures, researchers have developed various constitutive models to simulate the foundation response and soil interaction under these complex cyclic loads. The efficiency and effectiveness of these models are largely influenced by the cyclic constitutive parameters. Although a lot of research is being carried out on these relatively new models, little or no detail exists in the literature on the model-based identification of the cyclic constitutive parameters. This can be attributed to the difficulties and complexities of the inverse modeling of such complex phenomena. A variety of optimization strategies are available for the solution of least-squares problems, as is usual in the field of model calibration. For the back analysis (calibration) of the soil response to oscillatory load functions, however, this paper gives insight into the model calibration challenges and puts forward a method for the inverse modeling of the cyclically loaded foundation response such that high-quality solutions are obtained with minimum computational effort. Model responses are thereby produced which adequately describe what would otherwise be observed in the laboratory or in the field.
Car following models are used to describe the behavior of a number of cars on the road dependent on the distance to the car in front. We introduce a system of ordinary differential equations and perform a theoretical and numerical analysis in order to find solutions that reflect various traffic situations. We present three different variations of the model motivated by reality.
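A minimal sketch of such a system, assuming a linear follow-the-leader law in which each car relaxes its speed toward the car ahead (the variants analysed in the paper are distance-dependent and richer), integrated with the explicit Euler method:

```python
def simulate(n_cars, k=0.8, v_leader=25.0, dt=0.05, steps=2000):
    """Explicit-Euler integration of a linear follow-the-leader model:
        dv_i/dt = k * (v_{i-1} - v_i),   dx_i/dt = v_i,
    with the leader (car 0) driving at constant speed. All parameter
    values here are illustrative."""
    x = [-10.0 * i for i in range(n_cars)]   # initial positions, 10 m spacing
    v = [0.0] * n_cars
    v[0] = v_leader                          # leader starts at cruise speed
    for _ in range(steps):
        dv = [0.0] + [k * (v[i - 1] - v[i]) for i in range(1, n_cars)]
        v = [vi + dt * dvi for vi, dvi in zip(v, dv)]
        x = [xi + dt * vi for xi, vi in zip(x, v)]
    return x, v

x, v = simulate(n_cars=4)
```

For this stable parameter choice all followers converge to the leader's speed, reflecting the free-flow equilibrium such models admit.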
An example demonstrates that insufficient attention to safety-relevant aspects during project planning has adverse consequences for the operation of a building. In this example, weighing all circumstances and taking the relevant occupational safety regulations into account, the only practicable way to a permanently economical solution for the complete cleaning of the glass facade, which makes up the predominant part of the total facade area, remains the retrofitting of so-called flat-roof davits in combination with a working platform that accounts for the nature of the roof, the varying number of storeys and the immediate proximity of the tram tracks.
Tests on polymer-modified cement concrete (PCC) have shown significantly large creep deformations. The reasons for this, as well as additional material phenomena, are explained in the following paper. Existing creep models developed for standard concrete are studied to determine the time-dependent deformations of PCC. These models are: model B3 by Bažant and Baweja, the models according to Model Code 90 and ACI 209, and model GL2000 by Gardner and Lockman. The calculated creep strains are compared with existing experimental data for PCC and the differences are pointed out. Furthermore, an optimization of the model parameters is performed to fit the models to the experimental data and achieve a better model prognosis.
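The parameter optimization step can be illustrated with a least-squares fit of a hypothetical power-law creep curve to synthetic data via a coarse grid search. The actual models named above (B3, MC90, ACI 209, GL2000) have many more parameters; everything below is a stand-in.

```python
def creep_model(t, a, b):
    """Hypothetical power-law creep strain, eps(t) = a * t**b."""
    return a * t ** b

# Synthetic "measured" creep strains generated from known parameters,
# so the fit can be verified exactly.
true_a, true_b = 0.30, 0.25
times = [1.0, 7.0, 28.0, 90.0, 365.0]        # loading ages in days
measured = [creep_model(t, true_a, true_b) for t in times]

def sse(a, b):
    """Sum of squared residuals between model and data."""
    return sum((creep_model(t, a, b) - m) ** 2 for t, m in zip(times, measured))

# Coarse grid search over both parameters, as a minimal stand-in for the
# model-parameter optimization mentioned in the abstract.
best = min(((sse(ai / 100.0, bi / 100.0), ai / 100.0, bi / 100.0)
            for ai in range(1, 101) for bi in range(1, 51)),
           key=lambda r: r[0])
err, fit_a, fit_b = best
```

Because the true parameters lie on the grid, the search recovers them exactly; with real experimental data a gradient-based or heuristic optimizer would take over from the grid.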
Image processing has been much inspired by human vision, in particular with regard to early vision. The latter refers to the earliest stage of visual processing, responsible for the measurement of local structures such as points, lines, edges and textures in order to facilitate the subsequent interpretation of these structures in higher stages (known as high-level vision) of the human visual system. This low-level visual computation is carried out by cells of the primary visual cortex. The receptive field profiles of these cells can be interpreted as the impulse responses of the cells, which are then considered as filters. According to the Gaussian derivative theory, the receptive field profiles of the human visual system can be approximated quite well by derivatives of Gaussians. Two mathematical models suggested for these receptive field profiles are, on the one hand, the Gabor model and, on the other hand, the Hermite model, which is based on the analysis filters of the Hermite transform. The Hermite filters are derivatives of Gaussians, while Gabor filters, which are defined as harmonic modulations of Gaussians, provide a good approximation to these derivatives. It is important to note that, even if the Gabor model is more widely used than the Hermite model, the latter offers some advantages, such as being an orthogonal basis and having a better match to experimental physiological data. In our earlier research both filter models, Gabor and Hermite, have been developed in the framework of Clifford analysis. Clifford analysis offers a direct, elegant and powerful generalization to higher dimensions of the theory of holomorphic functions in the complex plane. In this paper we present the construction of the Hermite and Gabor filters, both in the classical and in the Clifford analysis framework. We also generalize the concept of complex Gaussian derivative filters to the Clifford analysis setting.
Moreover, we present further properties of the Clifford-Gabor filters, such as their relationship with other types of Gabor filters and their localization in the spatial and in the frequency domain formalized by the uncertainty principle.
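The claim that Gabor filters approximate Gaussian derivatives can be checked numerically. The sketch below compares a first Gaussian derivative with the odd (sine) component of a Gabor filter whose frequency is matched to the envelope width; the parameter choices are illustrative and not taken from the paper.

```python
import numpy as np

def gaussian(x, sigma):
    """Normalized Gaussian envelope."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def gaussian_derivative(x, sigma):
    """First derivative of the Gaussian: an odd, edge-sensitive filter."""
    return -x / sigma**2 * gaussian(x, sigma)

def gabor(x, sigma, omega):
    """Gabor filter: harmonic modulation of a Gaussian envelope."""
    return gaussian(x, sigma) * np.exp(1j * omega * x)

x = np.linspace(-4.0, 4.0, 401)
sigma = 1.0
d1 = gaussian_derivative(x, sigma)
gb = gabor(x, sigma, omega=1.0 / sigma).imag   # odd (sine) Gabor component

# up to sign and amplitude, the two profiles are nearly identical
corr = np.corrcoef(d1, gb)[0, 1]
```

The correlation magnitude comes out close to 1 (the sign is negative only because the derivative is proportional to -x near the origin while the sine component grows with +x), which is the sense in which Gabor filters "provide a good approximation" to Gaussian derivatives.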
CLOSING THE WORLD’S FACTORY
(2011)
Joshua Bolchover is an urban researcher, academic and architectural designer. He is an Assistant Professor at the University of Hong Kong, focusing on researching and designing buildings in rural China. In 2010 he exhibited Rural Urban Ecology at the Venice Biennale. He has curated, designed and contributed to several international exhibitions, including: Utopia Now: Opening the Closed Area, a research project on the Hong Kong and Shenzhen border at the Venice Biennale 2008; Get it Louder, a touring exhibition in China; Airspace: What Skyline does London want; Hydan; and Can Buildings Curate, and has exhibited at the HK-SZ Biennale. Joshua was a local curator for the Manchester-Liverpool section of Shrinking Cities between 2003 and 2005. He has collaborated with Raoul Bunschoten, Chora, researching strategic urban projects and has worked with Diller + Scofidio in New York. Joshua has previously taught architecture at the Chinese University of Hong Kong, London Metropolitan University, Cambridge University and the Architectural Association. He was educated at Cambridge University and at the Bartlett School of Architecture. John Lin is an architect based in Hong Kong and a graduate of The Cooper Union in New York City. His experimental constructions have been published in FRAME magazine (2003) and exhibited in the Kolonihaven (Architecture Park) at the Louisiana Museum of Modern Art in Copenhagen (2004) and the Venice Biennale (2008). Current projects include the design of several school buildings in China. He has taught at the Royal Danish Academy of Fine Arts, School of Architecture, and The Chinese University of Hong Kong and is currently an Assistant Professor at the University of Hong Kong.
Collaboration in AEC Design : Web-enabling Applications using Peer-to-Peer Office Communicator
(2004)
A market analysis conducted by Gartner Dataquest in August 2001 has shown the typical characteristics of the AEC design process. High volatility in the membership of AEC design groups, with members dispersed over several external offices, is the common collaboration scenario. Membership is usually short-lived compared to the overall duration of the process. A technical solution has to take that into account by making joining and leaving a collaborative work group very easy. The modelling of collaboration roles between group members must be based on a commonly understood principle like the publisher/subscriber model, where it is clear which individual is responsible for the distribution of vital information. Security issues and trust in the confidentiality of the system are central concerns for its acceptance. Therefore, keeping the subset of data that will be published under the absolute control of the publisher is a must. This is not the case with server-based scenarios, sometimes even for psychological reasons. A loosely bound peer-to-peer network offers advantages over a server-based solution because of less administrative overhead and simple installation procedures. In a peer-to-peer environment, a publish/subscribe role model can be implemented more easily. The publish/subscribe model matches the way AEC processes are modelled in real-world scenarios today, where legal proof of information exchange between external offices is of high importance. Workflow management systems for small to midsize companies in the AEC industry may adopt the peer-to-peer approach to collaboration in the future. Further investigations are being made at the research level (WINDS) by integrating the viewer and redlining application Collaborate! into a collaborative environment.
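The publisher/subscriber principle described above can be sketched in a few lines. The class and topic names below are invented for illustration; the audit log merely hints at the "legal proof of exchange" requirement, and the publisher alone decides what is released to subscribers.

```python
from collections import defaultdict

class Publisher:
    """Minimal publish/subscribe sketch: the publisher keeps full control
    over which documents are released to which topic subscribers."""

    def __init__(self):
        self._subscribers = defaultdict(list)
        self.log = []  # audit trail, e.g. as legal proof of exchange

    def subscribe(self, topic, callback):
        """Joining a work group is just registering a callback."""
        self._subscribers[topic].append(callback)

    def unsubscribe(self, topic, callback):
        """Leaving is equally cheap, matching volatile group membership."""
        self._subscribers[topic].remove(callback)

    def publish(self, topic, document):
        """Release a document to all current subscribers and log it."""
        self.log.append((topic, document))
        for cb in self._subscribers[topic]:
            cb(document)

received = []
hub = Publisher()
hub.subscribe("structural-drawings", received.append)
hub.publish("structural-drawings", "rev-B of slab plan")
```

In a peer-to-peer setting each office would run its own such publisher, so published subsets of data never leave the publisher's control via a central server.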
The construction industry is a project-based business bringing together many different organisations to complete a desired goal. The strategic use of Information and Communication Technologies (ICT) has enabled this goal to be completed more effectively. Two issues require addressing: the technology itself and the implementation factors of the technology. Such implementation factors should consider, among other things, the legal and contractual issues associated with the use of ICT, training requirements, and its effects on the organisational culture. To date the legal and contractual issues have not been extensively covered, and it is recognised that the technologies are not properly covered by any recognised legal and contractual practices. This in turn threatens to inhibit the growth and prosperity of the use of the technology on construction projects. This paper discusses these legal and contractual issues and describes methods and tools that can be used to enable the technology to grow in a legally and contractually valid environment.
Collaborative Design Processes: A Class on Concurrent Collaboration in Multidisciplinary Design
(2004)
The rise of concurrent engineering in construction demands early team formation and constant communication throughout the project life cycle, but educational models in architecture, engineering and construction have been slow to adjust to this shift in project organization. Most students in these fields spend the majority of their college years working on individual projects that do not build teamwork or communication skills. Collaborative Design Processes (CDP) is a capstone design course where students from the University of Illinois at Urbana-Champaign and the University of Florida learn methods of collaborative design enhanced by the use of information technology. Students work in multidisciplinary teams to collaborate from remote locations via the Internet on the design of a facility. An innovation of this course compared to previous efforts is that students also develop process designs for the integration of technology into the work of multidisciplinary design teams. The course thus combines both active and reflective learning about collaborative design and methods. The course is designed to provide students the experience, tools, and methods needed to improve design processes and better integrate the use of technology into AEC industry work practices. This paper describes the goals, outcomes and significance of this new, interdisciplinary course for distributed AEC education. Differences from existing efforts and lessons learned to promote collaborative practices are discussed. Principal conclusions are that the course presents effective pedagogy to promote collaborative design methods, but faces challenges in both technology and in traditional intra-disciplinary training of students.
In the AEC (Architecture / Engineering / Construction) industry a number of individuals and organisations collaborate and work jointly on a construction project. The resulting consortium has a large pool of expertise and experience and can be defined as a Virtual Organisation (VO) formed for the duration of the project. VOs are electronically networked organisations where IT and web-based communication technology play an important role in coordinating their various activities. This paper describes the design, development and implementation of a Grid-enabled application called the Product Supplier Catalogue Database (PSCD) which supports collaborative working in consortia. As part of the Grid-enabling process, specialised metadata is being developed to enable the PSCD to effectively utilise Grid middleware such as the Globus and Java CoG toolkits. We also describe our experience in designing, developing and deploying the security service of the application using the Globus Security Interface (GSI).
A/E/C team members collaborating on building projects rely on past experience and content through the use of project design archives (whether in paper or digital format). This leads to underutilization of potential knowledge, as the reuse of data, information, and knowledge in decision-making is limited by access to these archives, owing to their sheer size and inconvenient presentation. This paper presents an integrated solution that leverages two technologies developed at Stanford, CoMem (Corporate Memory) and iRoom (interactive Room). It addresses the critical limitations, i.e. content, context, visualization and interactivity, constraining the process of collaborative exploration towards knowledge reuse and decision-making.
M. Christine Boyer is an urban historian whose interests include the history of the American city, city planning, preservation planning, and computer science. Before coming to Princeton University in 1991, Boyer was professor and chair of the City and Regional Planning Program at Pratt Institute. She was a visiting professor in the Ph.D. program at the TU Delft School of Design for Spring 2005. She has written extensively about American urbanism. Her publications include Dreaming the Rational City: The Myth of American City Planning 1890–1945 (Cambridge: The MIT Press, 1983), Manhattan Manners: Architecture and Style 1850–1900 (New York: Rizzoli, 1985), The City of Collective Memory (Cambridge: The MIT Press, 1994), and CyberCities (New York: Princeton Architectural Press, 1996).
In recent years, the simulation environment ColSim has been developed at Fraunhofer ISE; it is particularly suited to investigating control systems in building energy supply systems. The design goal is the implementation of simulation-based controller design, which permits direct deployment of the control modules on so-called embedded systems. The simulation tool is characterized by its modular, open structure, which allows flexible extension (cf. TRNSYS [1]). It is implemented in standard ANSI C, which guarantees platform-independent use. The current development platform is a Linux cluster; target platforms so far have included embedded industrial PCs as well as classic microcontroller boards. The design method is demonstrated on a system controller for a solar thermal plant with 120 m2 of collector area (a SolarThermie2000 plant), in which a networked control system with Internet integration is used. The control system itself runs an operating system (a lightweight embedded Linux) that allows communication with the outside world. The control system thus has access to climate and irradiation data relevant to the control process. This external information can be used on the one hand to save on sensors; on the other hand, it permits predictive control methods to minimize the use of fossil (auxiliary heating) energy. With the aid of simulation-based system studies, adaptive behaviour of the control system can be tested, enabling autonomous plant identification. For example, for the solar thermal system described here, the dead time resulting from the piping between the storage tank and the discharge group is to be determined.
The operation of the discharge pump will depend on the one hand on the availability of buffer water, and on the other hand on the draw-off volume expected from the consumer. The networked control systems developed on the basis of the simulation models are intended in future to perform the complete energy-flow analysis of the building, with a transparent presentation of the system behaviour via an Internet visualization. The operator and user are informed directly via online services (SMS, e-mail, fax) about (faulty) plant behaviour. Precisely the sensitive renewable subsystems are, owing to their complexity, prone to malfunctions that often go unnoticed, because the conventional subsystems (e.g. a natural gas burner) usually compensate for the failure.
Reasonably accurate cost estimation of the structural system is quite desirable at the early stages of the design process of a construction project. However, the numerous interactions among the many cost variables make prediction difficult. Artificial neural networks (ANN) and case-based reasoning (CBR) are reported to overcome this difficulty. This paper presents a comparison of CBR and ANN augmented by genetic algorithms (GA), conducted using spreadsheet simulations. GA was used to determine the optimum weights for the ANN and CBR models. The cost data of twenty-nine actual cases of residential building projects were used as an example application. Two different sets of cases were randomly selected from the data set for training and testing purposes. Prediction rates of 84% in the GA/CBR study and 89% in the GA/ANN study were obtained. The advantages and disadvantages of the two approaches are discussed in the light of the experiments and the findings. GA/ANN appears to be the more suitable model for this example of cost estimation, where the prediction of numerical values is required and only a limited number of cases exist. The integration of GA into CBR and ANN in a spreadsheet format is likely to improve the prediction rates.
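The GA/CBR idea, attribute-weighted case retrieval with the weights tuned by an evolutionary search, can be sketched as follows. The case base is invented, and a mutation-only loop stands in for a full GA (no crossover or population), so this is an illustration of the principle rather than the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def retrieve_cost(query, cases, costs, w):
    """Weighted nearest-neighbour retrieval: return the cost of the
    stored case most similar to the query under attribute weights w."""
    d = np.sqrt(((cases - query) ** 2 * w).sum(axis=1))
    return costs[np.argmin(d)]

def fitness(w, cases, costs):
    """Leave-one-out absolute prediction error over the case base."""
    err = 0.0
    for i in range(len(cases)):
        mask = np.arange(len(cases)) != i
        err += abs(retrieve_cost(cases[i], cases[mask], costs[mask], w)
                   - costs[i])
    return err

# toy case base: [floor area, storeys] -> construction cost (invented units)
cases = np.array([[100., 2.], [120., 2.], [200., 5.], [210., 5.], [300., 8.]])
costs = np.array([1.0, 1.2, 2.4, 2.5, 3.9])

# simple evolutionary loop (mutation-only, standing in for a full GA)
best_w = rng.random(2)
best_f = fitness(best_w, cases, costs)
for _ in range(200):
    cand = np.clip(best_w + rng.normal(0.0, 0.1, 2), 1e-6, None)
    f = fitness(cand, cases, costs)
    if f < best_f:
        best_w, best_f = cand, f
```

The same weighted-distance idea carries over to weighting inputs of an ANN, which is how the paper's GA/ANN variant differs.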
COMPARISON OF SOME VARIANTS OF THE FINITE STRIP METHOD FOR ANALYSIS OF COMPLEX SHELL STRUCTURES
(2000)
The subject of this paper is to explore and evaluate the semi-analytical, analytical and numerical versions of the finite strip method (FSM) for static, dynamic and stability analyses of complex thin-walled structures. Many bridge superstructures, some roof and floor structures, reservoirs, channels, tunnels, subways, layered shells and plates, etc. can be analysed by this method. In both the semi-analytical and analytical variants, beam eigenvalue vibration or stability functions, orthogonal polynomials, and products of these functions are used as longitudinal functions of the unknowns. In the numerical FSM, spline longitudinal displacement functions are implemented. In the semi-analytical and numerical FSM, conventional transverse shape functions for displacements are used. In the analytical FSM, the exact function of the strip normal displacement and the plane stress function are applied. These three basic variants of the FSM are compared qualitatively and quantitatively with regard to the following: basic ideas, modelling, unknowns, DOF, the kind and order of the strips, longitudinal and transverse displacement and stress functions, compatibility requirements, boundary conditions, ways of obtaining the strip stiffness and load matrices, the kind and size of the structure stiffness matrix and its bandwidth, mesh density, the necessary number of terms along the length, accuracy and convergence of the stresses and displacements, approaches for refining results, input and output data, computer resources used, application area, closeness to other methods, and options for future development. A numerical example is presented. Advantages and shortcomings are pointed out. Conclusions are given.
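The common starting point of these variants can be made concrete. For a strip simply supported at its transverse ends, the longitudinal expansion of, for example, the normal displacement typically takes the form (notation illustrative, not taken from the paper):

```latex
w(x,y) \;=\; \sum_{m=1}^{r} f_m(x)\,\sin\frac{m\pi y}{L},
\qquad \left.\sin\frac{m\pi y}{L}\right|_{y=0,\,L} = 0,
```

where $L$ is the structure length and $r$ the number of series terms. The semi-analytical FSM takes the transverse functions $f_m(x)$ as low-order polynomial shape functions, the analytical FSM replaces them with exact solutions of the strip differential equations, and the numerical FSM replaces the sine series by local spline functions along the length.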
In this paper we review two distinct complete orthogonal systems of monogenic polynomials over 3D prolate spheroids. The underlying functions take values in either the reduced or the full quaternions (identified, respectively, with R3 and R4), and are generally assumed to be null solutions of the well-known Riesz and Moisil-Théodoresco systems in R3. This is done in the spaces of square integrable functions over R and H. The representations of these polynomials are given explicitly. Additionally, we show that these polynomial functions play an important role in defining the Szegö kernel function over the surface of 3D spheroids. As a concrete application, we derive the explicit expression of the monogenic Szegö kernel function over 3D prolate spheroids.
Electromagnetic wave propagation is present in the vast majority of situations occurring in everyday life, whether in mobile communications, DTV, satellite tracking, broadcasting, etc. Because of this, the study of increasingly complex media for the propagation of electromagnetic waves has become necessary in order to optimize resources and increase the capabilities of devices, as required by the growing demand for such services.
Within electromagnetic wave propagation, different parameters are considered that characterize it under various circumstances; of particular importance are the reflectance and transmittance. There are several methods for the analysis of the reflectance and transmittance, such as the method of approximation by boundary conditions, the plane wave expansion (PWE) method, etc., but this work focuses on the WKB and SPPS methods.
The implementation of the WKB method is relatively simple, but it is efficient only when working at high frequencies. The SPPS (Spectral Parameter Power Series) method, based on the theory of pseudoanalytic functions, solves this problem through a new representation of the solutions of Sturm-Liouville equations and has recently proven to be a powerful tool for solving various boundary value and eigenvalue problems. Moreover, it has a structure well suited to numerical implementation, which in this case was carried out in Matlab for the evaluation of both conventional and turning-point profiles.
The comparison between the two methods provides valuable information about their performance, which is useful for determining the validity and appropriateness of their application to problems where these parameters must be calculated in real-life applications.
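As a small illustration of the WKB side of the comparison, the sketch below evaluates the WKB phase integral and the standard validity criterion for a hypothetical smooth wavenumber profile without turning points; the profile and frequency are invented, and this is not the paper's Matlab implementation.

```python
import numpy as np

# WKB sketch for u'' + k(x)^2 u = 0 with a smoothly varying wavenumber.
# The WKB solution u(x) ~ k(x)**-0.5 * exp(+/- i * integral of k dx)
# is accurate when |dk/dx| / k^2 << 1 (high frequency, no turning points).

def wkb_phase(x, k):
    """Accumulated WKB phase integral of k(x) dx (trapezoidal rule)."""
    return float(np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(x)))

def wkb_validity(x, k):
    """max |dk/dx| / k^2, the smallness parameter of the WKB expansion."""
    dk = np.gradient(k, x)
    return float(np.max(np.abs(dk) / k**2))

x = np.linspace(0.0, 1.0, 1001)
k0 = 50.0                              # high frequency -> WKB regime
n = 1.0 + 0.3 * np.sin(np.pi * x)      # hypothetical smooth profile
k = k0 * n

phase = wkb_phase(x, k)                # transmission phase through the slab
crit = wkb_validity(x, k)              # must be << 1 for WKB to apply
```

At low frequencies crit grows like 1/k0, which is precisely why WKB degrades there and a method such as SPPS becomes attractive.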
This paper proposes the application of a two-parameter damage model, based on a non-linear finite element approach, to the analysis of masonry panels. Masonry is treated as a homogenized material whose characteristics can be defined using a homogenization technique. Masonry panels subjected to shear loading are studied with the proposed procedure within the framework of three-dimensional analyses. The nonlinear behaviour of masonry can be modelled using concepts of damage theory. In this case an adequate damage function is defined to take into account the different response of masonry under tension and compression states. Cracking can therefore be interpreted as a local damage effect, defined by the evolution of known material parameters and by one or several functions which control the onset and evolution of damage. The model takes into account all the important aspects which should be considered in the nonlinear analysis of masonry structures, such as stiffness degradation due to mechanical effects and the objectivity of the results with respect to the finite element mesh. Finally, the proposed damage model is validated by comparison with experimental results available in the literature.
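A damage function controlling onset and evolution can be illustrated with one common scalar choice, exponential softening. The sketch below is a generic uniaxial model under monotonic loading, not the two-parameter masonry model of the paper, and all material values are invented.

```python
import numpy as np

def exp_damage(kappa, kappa0, kappa_f):
    """Scalar damage variable with exponential softening:
    d = 0 below the threshold kappa0, and tends to 1 beyond it."""
    return np.where(
        kappa <= kappa0,
        0.0,
        1.0 - (kappa0 / kappa) * np.exp(-(kappa - kappa0) / kappa_f),
    )

def stress(eps, E, kappa0, kappa_f):
    """Uniaxial damaged stress sigma = (1 - d) * E * eps; under monotonic
    loading the history variable kappa equals the current strain eps."""
    d = exp_damage(eps, kappa0, kappa_f)
    return (1.0 - d) * E * eps

E, k0, kf = 3000.0, 1e-4, 5e-4          # invented stiffness and thresholds
eps = np.linspace(1e-6, 2e-3, 200)
sig = stress(eps, E, k0, kf)            # linear branch, peak, then softening
```

The (1 - d) stiffness factor is the "stiffness degradation due to mechanical effects" mentioned above; different thresholds in tension and compression would give the asymmetric masonry response.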
In order to model and simulate collapses of large-scale complex structures, a user-friendly and high-performance software system is essential. Because a large number of simulation experiments have to be performed, efficient interactive control and visualization of model parameters and simulation results are crucial, in addition to an appropriate simulation model and high-performance computing. In this respect, this contribution is concerned with advancements of the software system CADCE (Computer Aided Demolition using Controlled Explosives), which is extended with particular consideration of computational steering concepts. Focus is placed on problems and solutions for the collapse simulation of real-world large-scale complex structures. The simulation model applied is based on a multilevel approach embedding finite element models on a local as well as a near-field length scale, and multibody models on a global scale. Within the global-level simulation, relevant effects of the local and near-field scales, such as fracture and failure processes of the reinforced concrete parts, are approximated by means of tailor-made multibody subsystems. These subsystems employ force elements representing nonlinear material characteristics in terms of force/displacement relationships that are determined in advance by finite element analysis. In particular, enhancements concerning the efficiency of the multibody model and improvements of the user interaction are presented that are crucial for the capability of the computational steering. Several collapse simulations of real-world large-scale structures demonstrate the implementation of the above-mentioned approaches within the computational steering.
Computational steering provides methods for the integration of modeling, simulation, visualization, data analysis and post-processing. The user has full control over a running simulation and the possibility to modify objects (geometry and other properties), boundary conditions and other parameters of the system interactively. The objective of such a system is to explore the effects of changes immediately and thus to optimize the target problem interactively. We present a computational-steering-based system for fluid flow problems in civil engineering. It is based on three software components, as shown in figure 1. The modeler is the CAD system AutoCAD, which offers a powerful programming interface allowing efficient access to the geometric data. It also offers convenient manipulators for geometric objects. The simulation kernel is a Lattice-Boltzmann (LB) solver for the Navier-Stokes equations, which is especially suitable for transient flows in complex geometries. For the visualization and post-processing we use the software tool AVS, which provides a powerful programming interface and allows the efficient visualization of flow fields. These three components are interconnected through two communication modules and three interfaces, as depicted in figure 1. Interface 1 is responsible for the transformation of the modified system for the simulation kernel, interface 2 is responsible for the proper preparation of the simulation data, whereas interface 3 transforms the data from the modeler into a format suitable for the visualization system. The whole system is synchronized by the two communication modules.
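The LB kernel at the core of such a steering loop can be sketched as a generic D2Q9 BGK update on a periodic toy domain. This is not the AutoCAD/AVS system itself; the grid size, relaxation time, and initial density field are illustrative, and boundaries are simply periodic.

```python
import numpy as np

# D2Q9 lattice: 9 discrete velocities and their equilibrium weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Second-order Maxwellian equilibrium distributions."""
    cu = np.einsum('qd,dxy->qxy', c, u)
    usq = (u**2).sum(axis=0)
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau):
    """One BGK collision + periodic streaming step."""
    rho = f.sum(axis=0)
    u = np.einsum('qd,qxy->dxy', c, f) / rho
    f += (equilibrium(rho, u) - f) / tau          # collision (relaxation)
    for q in range(9):                            # streaming along c[q]
        f[q] = np.roll(f[q], shift=tuple(c[q]), axis=(0, 1))
    return f

nx = ny = 16
rho0 = 1.0 + 0.01 * np.random.default_rng(1).random((nx, ny))
f = equilibrium(rho0, np.zeros((2, nx, ny)))      # start at rest
for _ in range(10):
    f = lbm_step(f, tau=0.8)
```

In a steering setting, the geometry edited in the modeler would enter this kernel as updated boundary nodes between steps, which is what makes the LB method attractive for interactive use.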
Designing lighting in a 3D scene is a complex task in building conception, as it is subject to many constraints such as aesthetics or ergonomics. It is often achieved by experimental trials until an acceptable result is reached. Several rendering packages (such as Radiance) allow an accurate computation of lighting for each point in a scene, but this is a long process, and any modification requires the whole scene to be rendered again to see the result. The first guess is empirical, provided by the experience of the operator, and rarely based on scientific considerations. Our aim is to provide a tool for helping designers achieve this work within the scope of global illumination. We consider the problem when certain data are specified: on the one hand, the mean lighting in some zones (for example on a desktop), and on the other hand, some qualitative information about the location of sources (spotlights on the ceiling, halogens on the north wall, ...). The system we are conceiving computes the number of light sources, their positions and intensities, in order to obtain the lighting effects defined by the user. The algorithms we use combine radiosity computations with the resolution of a system of constraints.
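The intensity part of this inverse problem can be posed as nonnegative least squares: choose source intensities so that the resulting zone illuminances match the user's targets. The sketch below assumes a precomputed zone-to-source transfer matrix (filled with invented coefficients; in a real system these would come from a radiosity solve) and uses projected gradient descent rather than the paper's constraint solver.

```python
import numpy as np

def solve_intensities(A, target, iters=5000, lr=None):
    """Projected-gradient least squares: minimize ||A s - target||
    subject to s >= 0 (intensities cannot be negative)."""
    if lr is None:
        lr = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
    s = np.zeros(A.shape[1])
    for _ in range(iters):
        s = np.maximum(0.0, s - lr * A.T @ (A @ s - target))
    return s

# A[i, j] = illuminance contributed to zone i by unit intensity of source j
A = np.array([[0.8, 0.2, 0.1],
              [0.1, 0.7, 0.2],
              [0.1, 0.1, 0.9]])
target = np.array([300.0, 500.0, 200.0])       # desired mean lux per zone
s = solve_intensities(A, target)
residual = float(np.linalg.norm(A @ s - target))
```

Choosing the number and qualitative placement of sources changes the columns of A, so the same solve can be repeated inside an outer search over source configurations.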
Poland is not situated in any seismic region of the earth; however, there are still areas where underground mining is conducted. In these areas, so-called 'paraseismic tremors' are very frequent phenomena. When a building examination is carried out in order to assess its safety, a complete analysis is necessary in which the influence of tremors is included. To decide whether a building is able to carry dynamic loads, it is necessary to compute its dynamic characteristics, i.e. its natural frequencies. This is often not possible using standard techniques. After in-situ diagnosis of a building by an expert, computer techniques together with specialized software for dynamic, static, and strength analyses become a suitable tool. In this paper special attention is paid to a typical twelve-storey WGP (Wroclaw Great Plate) prefabricated building, in particular its special type of joints. During dynamic actions these joints have a decisive influence on the building's behavior. Paraseismic tremors are especially dangerous for these buildings and can cause pre-failure states. Laboratory investigations of part of a building or of a separate joint can be difficult and very expensive; therefore computer modeling was used to investigate the behavior of such elements and of whole buildings under different kinds of loads.
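The natural-frequency computation mentioned above can be illustrated with the simplest lumped-mass shear-building idealization, solving the generalized eigenproblem K x = lambda M x. The storey stiffness and mass values below are invented placeholders, not data for the WGP building.

```python
import numpy as np

def shear_building_frequencies(k, m):
    """Natural frequencies (Hz) of an n-storey shear building with
    storey stiffnesses k[i] (N/m) and lumped floor masses m[i] (kg)."""
    n = len(m)
    K = np.zeros((n, n))
    for i in range(n):                    # assemble tridiagonal stiffness
        K[i, i] += k[i]
        if i + 1 < n:
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    # symmetrize the generalized problem: M^-1/2 K M^-1/2 (M is diagonal)
    Msqinv = np.diag(1.0 / np.sqrt(np.asarray(m, dtype=float)))
    lam = np.linalg.eigvalsh(Msqinv @ K @ Msqinv)   # ascending eigenvalues
    return np.sqrt(lam) / (2 * np.pi)

# 12 identical storeys with invented stiffness and mass values
freqs = shear_building_frequencies(k=[8e8] * 12, m=[4e5] * 12)
```

In practice the joint stiffnesses of the large-panel system would enter the k values, which is exactly where the modeling effort described in the paper is concentrated.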
The paper describes a development of the analytical finite strip method (FSM) in displacements for the linear elastic static analysis of complex orthotropic prismatic shell structures, simply supported at their transverse ends, with an arbitrary open or closed deformable cross-section contour under general external loads. A number of bridge top structures, some roof structures and others belong to the studied class. By longitudinal sections the prismatic thin-walled structure is discretized into a limited number of plane straight strips which are connected continuously at their longitudinal ends to linear joints. The three displacements of points on the joint lines and the rotation about these lines are taken as the basic unknowns. In the longitudinal direction of the strips the unknown quantities and external loads are represented by single Fourier series. In the transverse direction of each strip the unknown values are expressed by hyperbolic functions representing an exact solution of the corresponding differential equations of the plane straight strip. The basic equations and relations for the membrane state, the bending state and the total state of the finite strip are obtained. The rigidity matrix of the strip in the local and global co-ordinate systems is derived. The basic relations of the structure are given and the general stages of the analytical FSM are traced. For long structures the FSM is more efficient than the classic finite element method (FEM), since the problem dimension is reduced by one and the number of unknowns decreases. In comparison with the semi-analytical FSM, the analytical FSM leads to a practically exact solution, especially for wider strips, and provides compatibility of the displacements and internal forces along the longitudinal linear joints.
How do particular structure formations arise in cities, and which forces play a role in this process? To which elements can the phenomena be reduced in order to find the respective combination rules? How must general principles be formulated so that urban processes can be described in such a way that different structural qualities can be produced? With the aid of mathematical methods, models based on four basic levels are generated in the computer, through which the connections between the elements and the rules of their interaction can be examined. Conclusions about the functioning of development processes and further urban evolution can be derived.
The idea of a simulation program to support urban planning is explained: four different, clearly defined development paths can be calculated for the rebuilding of a shrinking town. Aided by self-organization principles, a complex system can be created. The dynamics are based on the action patterns of individual actors, whose behaviour in turn depends cyclically on the generated structure. Global influences controlling the development can be divided into spatial, socioeconomic, and organizational-juridical levels. The simulation model should offer conclusions about new planning strategies, especially in the context of the creation process of rebuilding measures. An example of a transportation system is shown by means of prototypes for the visualisation of the dynamic development process.
For reliable planning within existing buildings, a wealth of very diverse information must be taken into account, much of which only becomes available during the planning or construction process. The prerequisite is always a survey of the existing building. Although computer programs exist to support building surveys, they are without exception isolated solutions: exporting the captured data into a planning system entails a loss of information. Despite the potential of current CAAD/BIM systems to manage as-built data, these systems are primarily designed for new-build planning. The continuous processing of refurbishment projects, from surveying the existing building through design and approval planning to detailed design, within a single CAAD/BIM system is currently not adequately supported. At the Chair of Computer Science in Architecture (InfAR) of the Faculty of Architecture at the Bauhaus-Universität Weimar, concepts and prototypes for the domain-oriented support of planning within existing buildings have been developed in recent years within the DFG Collaborative Research Centre 524 "Tools and Constructions for the Revitalization of Buildings". The focus was on capturing all planning-relevant as-built data and representing it in a dynamic building model. Building on this research, this article addresses the context-related reuse and targeted provision of as-built data in the process of planning within existing buildings, and the integration of concepts for planning-relevant building surveys into commercially available CAAD/BIM systems.
With around 20,000 apartments in the state capital, the municipal housing company Kommunale Wohnungsgesellschaft mbH Erfurt (KoWo) is the largest housing company in Thuringia. Its real-estate portfolio is heterogeneous both in its technical condition and in the locations of the properties. Owing to vacancies and to differing modernization measures and states of modernization, the profitability of individual properties varies considerably. Without a uniform valuation of the building stock with respect to property attractiveness, location quality, and property profitability, the long-term strategic development of the real-estate portfolio is difficult. The procedure for valuing the real-estate portfolio is presented in a practice-oriented way, covering the steps of technical building surveys, valuation via a scoring model, mapping into a portfolio model with an associated standard strategy, and finally the further processing of the data in the 20-year maintenance plan.
Digital support for planning processes is a current research and work focus of the Chair of Computer Science in Architecture (InfAR) and the Junior Professorship of Architectural Informatics of the Faculty of Architecture at the Bauhaus-Universität Weimar. Anchored in the DFG Collaborative Research Centre 524 "Tools and Constructions for the Revitalization of Buildings", concepts and prototypes for domain-oriented planning support are being developed. Against the background of the increasing complexity of building tasks, the number of participants in a project and their geographic distribution are growing. Planning projects are therefore increasingly characterized by greater effort in planning coordination, organization, and communication. Global computer networks, i.e. the Internet, offer potential for solving these tasks. Against this background, a large number of systems employing the most diverse techniques have recently emerged. Common to all these systems is the vision of optimizing the planning process, simplifying communication, and improving time management. From the architects' point of view, the situation is currently ambivalent: on the one hand, the ideas underlying these "IBPM systems" are plausible and offer an immediately measurable benefit; on the other hand, numerous aspects evidently prevent the unrestricted use of these systems. One focus in examining these weaknesses is the omnipresent problem of inadequate support for graphical data, the most important information basis in the planning process.
From the concrete, domain-specific analysis of the planning process, the investigation of potential development options of existing systems and the intensive study of new Internet technologies, an application close to architectural practice emerged in the course of this research focus. It opens up the Internet beyond a pure presentation medium and a mere communication tool towards a powerful interactive interface for all participants in the design and planning process.
This work presents a concept of interactive machine learning in a human design process. An urban design problem is viewed as a multiple-criteria optimization problem. A distinctive feature of an urban design problem is the dependence of the design goal on the context of the problem. We model the design goal as a randomized fitness measure that depends on the context. In terms of multiple-criteria decision analysis (MCDA), the defined measure corresponds to a subjective expected utility of the user. In the first stage of the proposed approach, the algorithm explores the design space using clustering techniques. The second stage is an interactive design loop: the user makes a proposal, the program optimizes it, obtains the user's feedback and returns control over the application interface.
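The two stages described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the context vector, the quadratic fitness, the 2-means exploration and the hill-climbing optimizer are all assumptions standing in for the paper's clustering and optimization machinery.

```python
import random

random.seed(0)

# Hypothetical context-dependent fitness for a 2-D design vector x,
# weighted by user feedback (formula is illustrative only).
def fitness(x, context, feedback_weight=1.0):
    density_goal, green_goal = context
    return -feedback_weight * ((x[0] - density_goal) ** 2 + (x[1] - green_goal) ** 2)

# Stage 1: explore the design space with a crude 2-means clustering.
def explore(samples, iterations=10):
    centers = random.sample(samples, 2)
    for _ in range(iterations):
        groups = ([], [])
        for s in samples:
            d = [sum((a - b) ** 2 for a, b in zip(s, c)) for c in centers]
            groups[d.index(min(d))].append(s)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers

# Stage 2: interactive loop -- improve the user's proposal by hill climbing,
# then hand control back (the "user" is simulated by a feedback weight here).
def optimize(proposal, context, feedback_weight, steps=200):
    best, best_f = proposal, fitness(proposal, context, feedback_weight)
    for _ in range(steps):
        cand = tuple(v + random.gauss(0, 0.1) for v in best)
        f = fitness(cand, context, feedback_weight)
        if f > best_f:
            best, best_f = cand, f
    return best

context = (0.6, 0.3)  # assumed target values for the district
samples = [(random.random(), random.random()) for _ in range(100)]
centers = explore(samples)
proposal = centers[0]  # user picks a cluster representative as a start
improved = optimize(proposal, context, feedback_weight=1.0)
```

Since the optimizer only accepts improving moves, the returned design scores at least as well as the user's proposal under the current fitness measure.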
The analysis of the response of complex structural systems requires the description of the material constitutive relations by means of an appropriate material model. The level of abstraction of such a model may strongly affect the quality of the prognosis for the whole structure. It is therefore necessary to describe the material as exactly as necessary but as simply as possible. All phenomena of crystalline materials such as steel that affect the behavior of the structure rely on physical effects which interact over spatial scales from the subatomic to the macroscopic range. Nevertheless, if the material is microscopically heterogeneous, it may be appropriate to use phenomenological models for the purposes of civil engineering. Although routinely applied, these models are insufficient for steel with microscopic characteristics such as texture, which typically occurs in hot-rolled steel members or in the heat-affected zones of welded joints. Texture manifests itself in crystalline materials as a regular crystallographic structure and crystallite orientation and influences macroscopic material properties. The analysis of the structural response of textured material (e.g. rolled steel or the heat-affected zone of a welded joint) requires extending the phenomenological material description on the macroscopic scale by microscopic information. This paper introduces an enrichment approach for material models based on a hierarchical multiscale methodology: the grain texture is described on a mesoscopic scale and coupled with macroscopic constitutive relations by means of homogenization. Given the variety of available homogenization methods, the question of assessing the coupling quality arises. The applicability of the method and the effect of the coupling method on the reliability of the response are demonstrated on an example.
CONSTITUTIVE MODELS FOR SUBSOIL IN THE CONTEXT OF STRUCTURAL ANALYSIS IN CONSTRUCTION ENGINEERING
(2010)
Parameters of constitutive models are generally obtained by comparing the results of forward numerical simulations to measurement data. Mostly the parameter values are varied by trial and error in order to improve the fit and obtain plausible results. However, the description of complex soil behavior requires advanced constitutive models, and the rising complexity of these models increases the number of unknown constitutive parameters. An efficient identification "by hand" thus becomes quite difficult for most practical geotechnical problems. The main focus of this article is on finding a parameter vector in a given search space which minimizes the discrepancy between measurements and the associated numerical results. Classically, the parameter values are estimated from laboratory tests on small samples (triaxial or oedometer tests). For this purpose an automatic population-based approach is presented to determine the material parameters for reconstituted and natural Bothkennar Clay. After the identification, a statistical assessment of the numerical results is carried out to evaluate different constitutive models. In addition, a geotechnical problem, stone columns under an embankment, is treated in a well-instrumented field trial in Klagenfurt, Austria. For the identification, measurements from multilevel piezometers, multilevel extensometers and a horizontal inclinometer are available. Based on the simulation of the stone columns in a FE model, the constitutive parameters are identified analogously to the experimental tests by minimizing the absolute error between the measured and the numerical curves.
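The core idea of population-based identification can be illustrated with a toy example. The forward model below (an exponential settlement curve) and the simple elitist evolution strategy are assumptions for illustration; the paper's FE simulation and specific algorithm are not reproduced. Only the structure is the same: minimize the absolute error between measured and simulated curves over a bounded parameter search space.

```python
import math
import random

random.seed(1)

# Toy forward model standing in for the FE simulation (illustrative only):
# settlement s(t) = p0 * (1 - exp(-p1 * t)).
def forward(params, times):
    p0, p1 = params
    return [p0 * (1 - math.exp(-p1 * t)) for t in times]

# Discrepancy measure: absolute error between simulated and measured curves.
def error(params, times, measured):
    return sum(abs(s - m) for s, m in zip(forward(params, times), measured))

# Generic elitist population-based search: keep the best third of the
# population and refill it with Gaussian mutations of the survivors.
def identify(times, measured, bounds, pop_size=30, generations=60):
    pop = [tuple(random.uniform(lo, hi) for lo, hi in bounds)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: error(p, times, measured))
        parents = pop[: pop_size // 3]
        pop = parents + [
            tuple(min(max(v + random.gauss(0, 0.05 * (hi - lo)), lo), hi)
                  for v, (lo, hi) in zip(random.choice(parents), bounds))
            for _ in range(pop_size - len(parents))
        ]
    return min(pop, key=lambda p: error(p, times, measured))

times = [i * 0.5 for i in range(1, 20)]
true_params = (2.0, 0.4)
measured = forward(true_params, times)  # synthetic "measurements"
best = identify(times, measured, bounds=[(0.1, 5.0), (0.01, 2.0)])
```

In practice one would replace `forward` by a call to the FE solver and use a tested optimizer; the clamping to `bounds` mirrors the given search space mentioned in the abstract.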
Unconstrained models are frequently found in the broad spectrum of traffic demand models. In these models there are no restrictions, or only one-sided restrictions, influencing the choice of the individual. In traffic demand, however, various decisive dependencies of the traffic volume on the specific conditions of the territorial structure potentials exist. Kirchhoff and Lohse introduced bi- and tri-linearly constrained models to represent these dependencies. In principle, the dependencies are described as hard, elastic and open boundary sum criteria. In this article a model is formulated which departs from these predefined boundary sum criteria and allows a free choice of minimal and maximal boundary sum criteria. An iterative solution algorithm following a FURNESS procedure is presented as well. With the approach of freely selectable minimal and maximal boundary sum criteria, the modeling transport planner can represent the traffic situation even better. Furthermore, all common boundary sum criteria can be calculated with this model, so the frequently necessary and sensible standard and special cases can also be modeled.
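For reference, the classical FURNESS balancing step (iterative proportional fitting of a trip matrix to prescribed origin and destination sums) can be sketched as below. This is the standard hard-constraint case only; the paper's extension to freely chosen minimal and maximal boundary sum criteria is not reproduced here.

```python
# Classical FURNESS procedure: alternately scale rows and columns of a
# seed trip matrix until the prescribed boundary sums are matched.
def furness(matrix, row_targets, col_targets, iterations=50):
    m = [row[:] for row in matrix]
    for _ in range(iterations):
        for i, target in enumerate(row_targets):      # balance origins
            s = sum(m[i])
            if s:
                m[i] = [v * target / s for v in m[i]]
        for j, target in enumerate(col_targets):      # balance destinations
            s = sum(row[j] for row in m)
            if s:
                for row in m:
                    row[j] *= target / s
    return m

seed = [[1.0, 2.0], [3.0, 4.0]]
balanced = furness(seed, row_targets=[10.0, 20.0], col_targets=[12.0, 18.0])
```

A bounded variant in the spirit of the article would clamp the scaling factors so that each boundary sum stays within its chosen minimum and maximum rather than hitting a fixed target exactly.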
Practical examples show that the cost flow and the total amount of money spent in construction and subsequent use can be improved significantly. The calculation is based on spreadsheets, which are very easy to develop on most PCs nowadays. Construction works are a field where the evaluation of cash flow can and should be applied. Decisions about cash flow in construction are decisions with long-term impact and long-term memory: mistakes from the distant past have a massive impact on the present situation and on the far economic future of the activities. Two approaches exist: the just-in-time (JIT) approach and the life-cycle-cost (LCC) approach. The calculation example shows the dynamic results for the production speed as opposed to a stable flow of production over the duration of the activities. More sophisticated rescheduling towards an optimal solution may return extra profit. In the technologies and organizational processes for industrial buildings, railways and road reconstruction, public utilities and housing developments there are assembly procedures very appropriate for this purpose, and complicated research, development and innovation projects are likewise well suited to these kinds of applications. Large investments and public money may be spent more efficiently if an optimized speed strategy can be calculated.
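The kind of spreadsheet comparison described above can be mimicked in a few lines. The schedules and figures below are invented for illustration and are not the paper's data; the sketch only shows how a discounted cash-flow comparison of a stable versus an accelerated production schedule might look.

```python
# Net present value of a series of period cash flows at a fixed period rate;
# the period-0 flow is undiscounted.
def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Stable production: even spending, revenue at the end (hypothetical figures).
stable = [-100.0, -100.0, -100.0, -100.0, 460.0]
# Accelerated production: front-loaded spending, earlier revenue.
accelerated = [-150.0, -150.0, -100.0, 460.0, 0.0]

rate = 0.05
npv_stable = npv(stable, rate)
npv_accel = npv(accelerated, rate)
```

With these assumed figures the accelerated schedule is worth more in present-value terms because the revenue arrives a period earlier, which is exactly the speed-versus-stability trade-off the abstract refers to.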
As is well known, the approximation theory of complex-valued functions is one of the main fields in function theory. In general, several aspects of approximation and interpolation are only well understood by using methods of complex analysis. It seems natural to extend these techniques to higher dimensions by using Clifford analysis methods or, more specifically, in dimensions 3 or 4, by using tools of quaternionic analysis. One starting point for such attempts has to be the suitable choice of complete orthonormal function systems that should replace the holomorphic function systems used in the complex case. The aim of our contribution is the construction of a complete orthonormal system of monogenic polynomials derived from a harmonic function system by systematically using the generalized quaternionic derivative.
The design of challenging space structures frequently relies on the theory of folded plates. The models are composed of plane facets whose bending and membrane stiffness are coupled along the folds. In conventional finite element analysis of faceted structures the continuity of the displacement field is enforced exclusively at the nodes. Since the approximate solutions for transverse and in-plane displacements are not members of the same function space, separation occurs between the common nodes of adjacent elements. It is shown that the kinematic assumptions of Bernoulli are responsible for this incompatibility along the edges in facet models. A general answer to this problem involves substantial modification of plate and membrane theory, but a straightforward formulation can be derived for simply folded plates, i.e. structures whose folds do not intersect. A broad class of faceted structures, including models of various curved shells, belongs to this category and can be calculated consistently. The additional requirements to ensure continuity concern the mapping of displacement derivatives on the edges. An appropriate finite facet element provides node- and edge-oriented degrees of freedom whose transformation to system degrees of freedom depends on the geometric configuration at each node. The concept is implemented using conforming triangular elements. To evaluate the new approach, the energy norm of representative structures is calculated for refined meshes. The focus is placed on the mathematical convergence towards reliable solutions obtained from finite volume models.
The contribution introduces an adaptable process model to meet the special requirements of coordinating planning activities in AEC (Architecture, Engineering, Construction). The process model is based on the concept of Coloured Petri Nets and uses meta-information to characterize process-relevant information and to enable process control based on the actual results of the planning.