This contribution is based on the approaches and results of the research project >Prozessorientierte Vernetzung von Ingenieurplanungen am Beispiel der Geotechnik< (process-oriented networking of engineering planning, using geotechnics as an example) within the DFG priority programme 1103 >Vernetzt-kooperative Planungsprozesse im Konstruktiven Ingenieurbau< (networked cooperative planning processes in structural engineering). The goal is the development of a network-based cooperation platform to support engineering planning. Its methodological foundations are Petri nets with individual tokens combined with a semantic evaluation of information. Using an example, the contribution demonstrates the basic capabilities of Petri nets and shows how planning processes can be controlled on the basis of meta-information. In addition, the approach of capturing the changing process flow by means of component-oriented process patterns for geotechnical construction elements is pursued. Finally, a path towards implementation is outlined.
In view of powerful computer networks and the intensive globalisation of science and technology in the construction sector, the goal of computing in civil engineering is to design the planning processes of structural engineering for the use of distributed resources, to develop suitable cooperation models for specialist planning within the information network of the project participants, and to enable cooperative project work using distributed domain models across networks. To this end, new methodological foundations are being created that take the far-reaching changes in work and organisational structures into account. Against this background, the contribution presents current research topics in computing in civil engineering that are being developed, in a division of labour, by various projects within the DFG priority programme 1103 'Vernetzt-kooperative Planungsprozesse im Konstruktiven Ingenieurbau'. An integrative process model is presented that brings together the foundations for supporting networked cooperative planning with modern methods of information processing and communication technology. Using an example from structural fire protection, the methodological approaches of process modelling with Petri nets and of mobile software agents for supporting cooperation in specialist planning in construction are presented and explained.
We consider a structural truss problem where all of the physical model parameters are uncertain: not just the material values and applied loads, but also the positions of the nodes are assumed to be inexact but bounded and are represented by intervals. Such uncertainty may typically arise from imprecision during the process of manufacturing or construction, or round-off errors. In this case the application of the finite element method results in a system of linear equations with numerous interval parameters which cannot be solved conventionally. Applying a suitable variable substitution, an iteration method for the solution of a parametric system of linear equations is firstly employed to obtain initial bounds on the node displacements. Thereafter, an interval tightening (pruning) technique is applied, firstly on the element forces and secondly on the node displacements, in order to obtain tight guaranteed enclosures for the interval solutions for the forces and displacements.
In recent years, research and development in the field of structural design software has focused on extending the functional scope. As a consequence, the increased range of functions must be made accessible to as broad a group of users as possible through working environments designed in an engineering-appropriate way, so that work can be carried out efficiently and with few errors. From the perspective of structural engineers, engineering-appropriate software must offer user-software interaction adapted to the specific workflow. The required functionalities must be integrated into a unified system, and adaptability by the user must be ensured. Meeting these requirements with conventional means would require a disproportionately high development effort. Consequently, from the software developers' point of view, a modern software architecture for structural design must additionally provide a higher degree of reuse and independent extensibility. This contribution presents a concept based on compound documents that allows standard software and domain-specific software components to be combined into an engineering-appropriate working environment. The analysis and documentation of a structural element, including the associated data management, can thus take place within a single compound document. At the same time, the degree of software reuse can be raised beyond the level achieved so far by defining a component framework as an independently extensible software architecture and by employing software components with their own user interfaces. The feasibility of the concept is demonstrated by a pilot implementation.
In the past, several types of Fourier transforms in Clifford analysis have been studied. In this paper, first an overview of these different transforms is given. Next, a new equation in a Clifford algebra is proposed, the solutions of which will act as kernels of a new class of generalized Fourier transforms. Two solutions of this equation are studied in more detail, namely a vector-valued solution and a bivector-valued solution, as well as the associated integral transforms.
VARIATIONAL POSITING AND SOLUTION OF COUPLED THERMOMECHANICAL PROBLEMS IN A REFERENCE CONFIGURATION
(2015)
A variational formulation of a coupled thermomechanical problem of anisotropic solids for the case of non-isothermal finite deformations in a reference configuration is presented. The formulation of the problem includes: a condition of equilibrium flow of a deformation process in the reference configuration; an equation of coupled heat conduction in variational form, in which the influence of the deformation characteristics of the process on the temperature field is taken into account; tensor-linear constitutive relations for a hypoelastic material; kinematic and evolution relations; and initial and boundary conditions. Based on this formulation, several axisymmetric isothermal and coupled problems of finite deformations of isotropic and anisotropic bodies are solved. The solution of coupled thermomechanical problems for a hollow cylinder in the case of finite deformation showed an essential influence of the coupling on the distribution of temperature, stresses and strains. The obtained solutions show the development of the stress-strain state and the temperature change in axisymmetric bodies in the case of finite deformations.
VARIATION OF ROTATIONAL RESTRAINT IN GRID DECK CONNECTION DUE TO CORROSION DAMAGE AND STRENGTHENING
(2006)
An approach to assessing the rotational restraint of a stringer-to-crossbeam connection in the deck of a 100-year-old steel truss bridge is presented. The sensitivity of the rotational restraint coefficient of the connection to corrosion damage and strengthening is analyzed. Two criteria for assessing the rotational restraint coefficient are applied: a static and a kinematic one. The former is based on the bending moment distribution in the considered member, the latter on the member rotation at the given joint. A finite element model built from 2D elements is described: webs and flanges are modeled with shell elements, while the rivets in the connection are modeled with a system of beam and spring elements. The method of rivet modeling is verified against T-stub connection test results published in the literature. The FEM analyses proved that the recorded extent of corrosion damage does not alter the initial rotational restraint of the stringer-to-crossbeam connection. Strengthening of the stringer midspan influences the midspan bending moment and the stringer end rotation in different ways. Restoring a member's load bearing capacity usually means strengthening its critical regions (where the highest stress levels occur). This alters the flexural stiffness distribution over the member length and influences the rotational restraint at its connection to other members. The impact depends on the criterion chosen for assessing the rotational restraint coefficient.
The innovation management of media organisations is currently undergoing considerable change: in the altered market environment, flexibility, rapid changes of direction and adaptability prove to be central. Media management research must respond to this as well: in order to investigate the agility of current business practice validly, equally agile, adaptive research is required. To this end, the contribution proposes a practice-theoretical perspective on the innovation management of media organisations. Empirical research designs resulting from such an approach are discussed with regard both to their methodological challenges and to the management of the research projects involved. The contribution also addresses the new opportunities in academic publishing, university management and research organisation that practice-theoretically grounded empirical innovation research in the media industry demands.
Utilizing Modern FIB/SEM Technology and EDS for 3D Imaging of Hydrated Alite and its Pore Space
(2021)
The exploration of cementitious materials using scanning electron microscopes (SEM) is mainly done using fractured or polished surfaces. This leads to high-resolution 2D images that can be combined using EDX and EBSD to unveil details of the microstructure and composition of materials. Nevertheless, this does not provide a quantitative insight into the three-dimensional fine structure of, for example, C-S-H phases.
The focused ion beam (FIB) technology can cut a block of material in thin layers of less than 10 nm. This gives us a volume of 1000 μm³ with a voxel resolution of down to 4 x 4 x 10 nm³. The results can be combined with simultaneously acquired EDX data to improve image segmentation. Results of the investigation demonstrate that it is possible to obtain close-to-native 3D-visualisation of the spatial distribution of unreacted C3S, C-S-H and CH. Additionally, an optimized preparation method allows us to quantify the fine structure of C-S-H phases (length, aspect ratio, …) and the pore space.
Adopting the European laws concerning environmental protection will require sustained efforts by the authorities and communities of Romania; implementing modern solutions will become a fast and effective option for improving the functioning of these systems in order to prevent disasters. As part of the urban infrastructure, the drainage networks for pluvial and residual waters are included in the plan of promoting systems that protect environmental quality, with the purpose of integrated and adaptive management. The paper presents a distributed control system for the sewer network of the city of Iasi. The unsatisfactory technical state of the existing sewer system is described, focusing on objectives related to the implementation of the control system. The proposed distributed control system for the Iasi drainage network is based on hierarchical control theory for diagnosis, sewer planning and management. Two control levels are proposed: a coordinating level and a local execution level. The configuration of the distributed control system, including data acquisition and conversion equipment, interface characteristics, the local data bus, the data communication network and the station configuration, is described in detail. The project is intended to be a useful instrument for the local authorities in preventing and reducing the impact of future natural disasters on urban areas by means of modern technologies.
Interval analysis extends the concept of computing with real numbers to computing with real intervals. As a consequence, some interesting properties appear, such as the delivery of guaranteed results or confirmed global values. The former property is given in the sense that unknown numerical values are known to lie in a computed interval. The latter property states that the global minimum value, for example, of a given function is also known to be contained in an interval (or a finite set of intervals). Depending upon the amount of computational effort invested in the calculation, we can often find tight bounds on these enclosing intervals. The downside of interval analysis, however, is the mathematically correct, but often very pessimistic size of the interval result. This is in particular due to the so-called dependency effect, which arises when a single variable is used multiple times in one calculation. When applying interval analysis to structural analysis problems, this dependency has a great influence on the quality of the numerical results. In this paper, a brief background of interval analysis is presented and it is shown how it can be applied to the solution of structural analysis problems. A discussion of possible improvements as well as an outlook to parallel computing is also given.
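The dependency effect mentioned above can be illustrated in a few lines of code. The following minimal sketch (not from the cited work; the Interval class and the values are purely illustrative) shows how the naive interval evaluation of x - x over [1, 2] yields the pessimistic enclosure [-1, 1], although the exact range is 0.

# Minimal sketch of the dependency effect; all names are illustrative.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # Interval subtraction assumes the two operands are independent.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)

# Naive evaluation: the two occurrences of x are treated as independent,
# so the enclosure is pessimistically wide.
print(x - x)               # [-1.0, 1.0]

# Rewriting the expression so that x occurs only once gives the exact range:
print(Interval(0.0, 0.0))  # x - x is identically 0 for every x in [1, 2]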
This paper presents a generic methodology for measurement system configuration when the goal is to identify behaviour models that reasonably explain observations. For such tasks, the best measurement system provides maximum separation between candidate models. In this work, the degree of separation between models is measured using Shannon’s Entropy Function. The location and type of measurement devices are chosen such that the entropy of candidate models is greatest. This methodology is tested on a laboratory structure and, to demonstrate generality, an existing fresh water supply network in a city in Switzerland. In both cases, the methodology suggests an appropriate set of sensors for identifying the state of the system.
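The entropy-based selection criterion described above can be sketched in a few lines (not the authors' implementation; the prediction matrix, bin count and greedy single-sensor choice are purely illustrative): the predictions of all candidate models at each candidate sensor location are binned, the Shannon entropy of the resulting histogram is computed, and the location with the largest entropy, i.e. the greatest separation between models, is selected.

import numpy as np

def shannon_entropy(values, n_bins=10):
    """Entropy of candidate-model predictions at one location, after binning."""
    counts, _ = np.histogram(values, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# predictions[i, j]: prediction of candidate model i at candidate location j (made-up values)
predictions = np.array([
    [1.0, 2.0, 0.5],
    [1.1, 3.5, 0.5],
    [0.9, 5.0, 0.6],
    [1.0, 6.2, 0.4],
])

entropies = [shannon_entropy(predictions[:, j]) for j in range(predictions.shape[1])]
best = int(np.argmax(entropies))
print("entropy per location:", entropies, "-> choose location", best)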
The paper describes further developments of the interactive evolutionary design concept relating to the emergence of mutually inclusive regions of high-performance design solutions. These solutions are generated from cluster-oriented genetic algorithm (COGA) output and relate to a number of objectives introduced during the preliminary design of military airframes. The data mining of multi-objective COGA (moCOGA) output further defines these regions through the application of clustering algorithms, data reduction and variable attribute relevance analyses. A number of visual representations of the COGA output projected onto both variable and objective space are presented. The multi-objective output of the COGA is compared to output from a Strength Pareto Evolutionary Algorithm (SPEA-II) to illustrate the manner in which moCOGAs can generate good approximations to Pareto frontiers.
Neural network technologies are now widely applied in many fields of engineering, yet their use in building and construction has so far received comparatively little attention. Approximating neural networks make it possible to determine the stress-strain state of structural elements with considerably less computational effort than universal methods such as the finite element method. Today, neural networks are mostly used to analyse individual elements of building structures. This work considers the use of neural networks for computing the stress-strain state of a structure consisting of many elements. The main idea of the proposed analysis is to use neural networks to compute the internal forces and displacements of parts of the model that are selected according to their functional purpose. For example, a plate is intended to take up loads distributed over an area, while a bar takes up loads distributed along its length; elements act as force converters, and plates serve to disperse and transfer forces. A template is associated with a functional purpose and is regarded as a composition of model elements with defined functional purposes; a single template can embody several functional purposes. Once the displacement components have been obtained, their admissibility is checked. If a violation of the admissible limits is detected, the component database is searched for a component with an analogous functional purpose matching the type of violation. If such a component is found, the previous component is replaced by the new one. In this way, in addition to monitoring the condition of the structure component by component, it becomes possible to search for solutions to the problems that have been identified....
Portugal is one of the European countries with the highest freeway network coverage in terms of both area and population. The sharp growth of this network in recent years calls for methods to analyse and evaluate its quality of service in terms of traffic performance, typically performed through internationally accepted methodologies, namely that presented in the Highway Capacity Manual (HCM). Lately, the use of microscopic traffic simulation models has become increasingly widespread. These models simulate the individual movement of vehicles, allowing detailed traffic analyses to be performed. The main target of this study was to verify the possibility of using microsimulation as an auxiliary tool in the adaptation of the HCM 2000 methodology to Portugal. For this purpose, the microscopic simulators AIMSUN and VISSIM were used to simulate traffic circulation on the Portuguese A5 freeway. The results allowed the analysis of the influence of the main geometric and traffic factors involved in the HCM 2000 methodology. In conclusion, the study presents the main advantages and limitations of the microsimulators AIMSUN and VISSIM in modelling traffic circulation on Portuguese freeways. The main limitation is that these microsimulators are not able to simulate explicitly some of the factors considered in the HCM 2000 methodology, which invalidates their direct use as a tool for quantifying those effects and, consequently, makes the direct adaptation of this methodology to Portugal impracticable.
Urban design played a central role for the European dictatorships of the 20th century. It served to legitimate the regime, to produce agreement, and to demonstrate power, efficiency and speed; it communicated the social as well as the design projects of the dictatorial regimes domestically and internationally; and it tied old experts, as well as new ones, to the regime. Dictatorial urban design also played an important role after the fall of the dictatorships: it became the object of structural and verbal handling strategies: of demolition, of transformation, of reconstruction, of forgetting, of suppressing, of re-interpretation and of glorification. The topic area is, therefore, both historical and relevant to the present day. The discussion of the topic area is, like it or not, always embedded in the present state of societal engagement with dictatorships.
In order to even be able to discuss all of these aspects, different conceptual decisions are necessary. In retrospect, these may seem to many to be self-evident, although they are anything but. Our thesis is that there are three methodological imperatives in particular which allow an expanded approach to the topic area “urban design and dictatorship”. First and above all, the tunnel view, focused on individual dictatorships and neglecting the international dimension, must be overcome. Second, the differences in urban design over the course of a dictatorship should be emphasised through an appropriate periodisation. Third, we must strive for an open, flexible, but complex concept of urban design. The main focus lies on the urban design of the most influential dictatorships of the first half of the 20th century: the Soviet Union, Fascist Italy and Nazi Germany, including the urban design of the autarky periods in Portugal and Spain.
After all, urban design is not just a product of specific historic circumstances. It is a form that continues to have long-term effects, which demonstrates its usefulness and adaptability throughout this process. The urban design products undoubtedly still recall the dictatorial rule under which they were created. However, they are more than a memory space. They are also a living space of the present. They can and should be discussed with respect to their spatial and functional utility for today and tomorrow. Such a perspective is a given for the citizens of a city, but also for city marketing, having marvellous consequences. Only when we do not exclude this dimension a priori, even in academic discussions, can we do justice to the products of dictatorships.
And finally, the view of the urban design of dictatorships can and must contribute to questioning simplified and naive conceptions of dictatorships. With urban design in mind, we can observe how dictatorships work and how they were able to prevail. In Europe, these questions are highly topical.
The concept of reliability plays a central role in the assessment of transport networks. From the perspective of public transport users, one of the most important criteria for judging the quality of a route network is whether the destination can be reached within a given time with high certainty. This contribution formalises this notion of reliability mathematically. It first addresses the usual notion of network reliability in the sense of pairwise connection probabilities. This notion is then extended by considering reliability under a maximum admissible travel time. In past work, the ring-radius structure has proven to be a well-established model for the theoretical description of transport networks. These considerations are now extended by incorporating real transport network structures. The tram network of Krakow serves as a concrete example; in particular, the effects of a planned extension of the network on its reliability are investigated. This paper is part of the CIVITAS-CARAVEL project "Clean and better transport in cities". The project has received research funding from the Community's Sixth Framework Programme. The paper reflects only the author's views and the Community is not liable for any use that may be made of the information contained therein.
In reinforced and prestressed concrete construction, composite structures made of precast concrete elements with cast-in-place concrete toppings are increasingly used. Depending on span, loading and specific requirements, the precast elements are produced with ordinary or prestressed reinforcement. Because the individual parts of the cross-section are generally made of different concretes, and because the precast elements and the cast-in-place topping are produced at different times, differences in long-term behaviour arise that have to be taken into account in design and verification. Creep and shrinkage of the concretes as a consequence of time-dependent material behaviour, together with the specific characteristics of the composite cross-section, can be relevant for serviceability, in particular when the precast elements are prestressed. The theory of the elastic creeping body allows a sufficiently realistic description of the long-term behaviour of concrete, although in general no closed-form solution can be given for reinforced composite concrete cross-sections. An alternative arises if, for the individual creep intervals, a simplified assumption is made for the course of the creep function according to the theory of aging. The internal forces redistributed in the time interval (t_{j-1}, t_j) are then applied as a new loading on the individual parts of the cross-section in the time interval (t_j, t_{j+1}), using the creep function f(t_{j+1}, t_j). For sufficiently small time steps, the disadvantages of this simplification can be neglected.
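Written as a formula, the stepwise procedure described above corresponds to the following creep strain increment per interval (a sketch in our notation; E_c denotes the modulus of elasticity of the respective concrete part and \Delta\sigma(t_j) the stress due to the internal forces redistributed in the preceding interval):

\Delta\varepsilon_c\big|_{t_j}^{t_{j+1}} \;\approx\; \frac{\Delta\sigma(t_j)}{E_c}\, f(t_{j+1},t_j)

Re-establishing equilibrium of the composite cross-section in every interval and summing these increments over all time steps yields the accumulated long-term deformation.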
In today's information society, all business processes in a company are shaped by the use of the available information and communication technologies. As a result, companies are confronted with a veritable flood of information entering the company from external and internal sources. Successfully mastering this flood of information is the basis for effective work in the company. The essential prerequisite for this is, above all, a harmonious interplay between people, organisation and technology. Document management systems aim to support users in handling this task technologically. The article presented here highlights the problems that can arise during the implementation of a document management system, focusing in particular on small and medium-sized enterprises (SMEs) in the construction industry. These are typically weak at documenting their own organisational structures in business terms, which later leads to serious problems during a technological implementation. This is exactly where the article starts: it works out the relationship between organisation and technology in order to derive a generally applicable approach for implementing a document management system.
Using one part of the topology of architectural spaces, the volume adjacency graph (VAG), it is shown how topological modelling can integrate building-planning applications. To this end, a prototype is presented that essentially consists of three components. With the requirements manager, requirements are entered that can be handled well formally. With the topology manager, these requirements are combined with drawn rooms; the topological relations in the drawings are computed with the corresponding GIS tools and exported to a database. The requirements checker then compares the requirement data created with the requirements manager against the topology data. This approach is intended to show how topological models make it possible to formalise semantically rich information by representing it as properties of graphs.
Ungarn und das Bauhaus
(1976)
Fuzzy functions are suitable to deal with uncertainties and fuzziness in a closed form maintaining the informational content. This paper tries to understand, elaborate, and explain the problem of interpolating crisp and fuzzy data using continuous fuzzy valued functions. Two main issues are addressed here. The first covers how the fuzziness, induced by the reduction and deficit of information i.e. the discontinuity of the interpolated points, can be evaluated considering the used interpolation method and the density of the data. The second issue deals with the need to differentiate between impreciseness and hence fuzziness only in the interpolated quantity, impreciseness only in the location of the interpolated points and impreciseness in both the quantity and the location. In this paper, a brief background of the concept of fuzzy numbers and of fuzzy functions is presented. The numerical side of computing with fuzzy numbers is concisely demonstrated. The problem of fuzzy polynomial interpolation, the interpolation on meshes and mesh free fuzzy interpolation is investigated. The integration of the previously noted uncertainty into a coherent fuzzy valued function is discussed. Several sets of artificial and original measured data are used to examine the mentioned fuzzy interpolations.
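As a minimal illustration of computing with fuzzy numbers as used above (not from the cited work; all names and values are illustrative), the following sketch interpolates linearly between two triangular fuzzy ordinates by applying interval arithmetic level by level to their alpha-cuts.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def fuzzy_linear_interp(x, x0, x1, y0_tri, y1_tri, alphas=(0.0, 0.5, 1.0)):
    """Interpolate at crisp abscissa x between fuzzy ordinates y0 and y1, level by level."""
    t = (x - x0) / (x1 - x0)
    cuts = {}
    for alpha in alphas:
        lo0, hi0 = alpha_cut(y0_tri, alpha)
        lo1, hi1 = alpha_cut(y1_tri, alpha)
        # Convex combination of two intervals (t between 0 and 1)
        cuts[alpha] = ((1 - t) * lo0 + t * lo1, (1 - t) * hi0 + t * hi1)
    return cuts

# Interpolated fuzzy value at x = 0.25 between fuzzy ordinates (1, 2, 3) and (4, 5, 7)
print(fuzzy_linear_interp(0.25, 0.0, 1.0, (1.0, 2.0, 3.0), (4.0, 5.0, 7.0)))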
This paper presents a methodology for uncertainty quantification in cyclic creep analysis. Several models, namely the BP model, the Whaley and Neville model, the MC90 model modified for cyclic loading and a hyperbolic function modified for cyclic loading, are used for uncertainty quantification. Three types of uncertainty are included in the uncertainty quantification (UQ): (i) natural variability in loading and material properties; (ii) data uncertainty due to measurement errors; and (iii) modelling uncertainty and errors during cyclic creep analysis. By considering all types of uncertainty, a measure for the total variation of the model response is achieved. The study finds that the BP, modified hyperbolic and modified MC90 models perform best for cyclic creep prediction, in that order. Furthermore, global sensitivity analysis (SA) considering uncorrelated and correlated parameters is used to quantify the contribution of each source of uncertainty to the overall prediction uncertainty and to identify the important parameters. Errors in determining the input quantities and in the model itself can produce significant changes in the predicted creep values. The influence of the variability of the random input quantities on the cyclic creep was studied by means of stochastic uncertainty and sensitivity analysis, namely the methods of Gartner et al. and Saltelli et al. All input imperfections were considered to be random quantities. The Latin Hypercube Sampling (LHS) numerical simulation method (a Monte Carlo type method) was used. The stochastic sensitivity analysis showed that the variability of the cyclic creep deformation is most sensitive to the elastic modulus of the concrete, the compressive strength, the mean stress, the cyclic stress amplitude and the number of cycles, in that order.
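A minimal sketch of the Latin Hypercube Sampling step mentioned above (not the authors' implementation; variable names are illustrative): each input quantity is stratified into n_samples equiprobable intervals, one sample is drawn per interval, and the columns are permuted independently. The unit-hypercube design is then mapped to the actual input distributions.

import numpy as np

def latin_hypercube(n_samples, n_vars, rng=None):
    """Return an (n_samples, n_vars) LHS design on the unit hypercube."""
    rng = np.random.default_rng(rng)
    # One stratified sample per row and variable
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])   # decorrelate the columns
    return u

# Example: 100 samples of 3 input quantities, to be mapped afterwards to their
# actual distributions (e.g. elastic modulus, compressive strength, mean stress).
design = latin_hypercube(100, 3, rng=42)
print(design.shape, design.min(), design.max())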
The student project Umweltorientiertes Projektmanagement (environment-oriented project management, UWoPM) opens up a new generation of working methods in project management. Environmental concerns increasingly affect project business through legal regulations. By incorporating legally relevant data into the project database and legal provisions into the knowledge base, software support is offered for (at least elementary) legal questions. A dedicated presentation of legal knowledge and convenient support for filing applications are intended to make this a matter of course in project management. At the same time, the UWoPM prototype offers up-to-date system solutions for the communication between applicant and authority as well as for the workflow within the approval authority. This talk gives an introduction and overview of a total of 18 diploma theses in which the individual components of UWoPM were developed. The central problem was the transfer of legal provisions into knowledge bases. By restricting the work to >core provisions< and flagging the need for interpretation, a solution path was realised in principle. The technical realisation has not yet progressed beyond a prototype, but it demonstrates the feasibility of the overall approach.
The execution of project activities generally requires the use of (renewable) resources like machines, equipment or manpower. The resource allocation problem consists in assigning time intervals to the execution of the project activities while taking into account temporal constraints between activities emanating from technological or organizational requirements and costs incurred by the resource allocation. If the total procurement cost of the different renewable resources has to be minimized we speak of a resource investment problem. If the cost depends on the smoothness of the resource utilization over time the underlying problem is called a resource levelling problem. In this paper we consider a new tree-based enumeration method for solving resource investment and resource levelling problems exploiting some fundamental properties of spanning trees. The enumeration scheme is embedded in a branch-and-bound procedure using a workload-based lower bound and a depth first search. Preliminary computational results show that the proposed procedure is promising for instances with up to 30 activities.
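The workload-based lower bound mentioned above can be sketched as follows, under the usual interpretation that, for each renewable resource, the procured capacity can never be smaller than the total workload divided by the project deadline (our reading; not the authors' code):

import math

def workload_lower_bound(durations, demands, deadline):
    """
    Workload-based lower bound on the required capacity of each renewable resource:
    total work of resource k divided by the project deadline, rounded up.
    durations[i]: duration of activity i
    demands[i][k]: per-period demand of activity i for resource k
    """
    n_resources = len(demands[0])
    bounds = []
    for k in range(n_resources):
        work = sum(durations[i] * demands[i][k] for i in range(len(durations)))
        bounds.append(math.ceil(work / deadline))
    return bounds

# Three activities, two resources, deadline of 8 periods (made-up data)
print(workload_lower_bound([4, 3, 5], [[2, 1], [1, 0], [3, 2]], 8))   # -> [4, 2]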
The location or site of a building is undoubtedly characteristically tied to it. The spatial placement of a prominent architectural work, the access to a building in an inner-city environment or the management of a building stock is always treated visually in construction and architecture. Interaction with drawings, orientation using a site plan or documentation with photos are just a few examples. The aim here is to show, by means of suitable systems, the economic optimisation based on such data and its subsequent visualisation, and also to support evaluation by and interaction with the user. This article therefore focuses, by way of example, on transport and traffic. Mathematics: graphs and networks form models for optimising the underlying logistics processes, namely building-material requirements planning with ordering, transport and delivery of material, tour composition, and site selection. Computer science: furthermore, the software implementation of these models and their placement within accompanying projects on >mathematical optimisation< are presented.
TRANSFORMING TO EXPERT
(2011)
Katharina Richter holds a degree in Architecture, Urban and Regional Planning from the Bauhaus-University Weimar, where she has been Assistant Professor at the Chair of Computer Science in Architecture since 2000. She teaches architecture studios in undergraduate and graduate programs and has supervised several international teaching projects. In fall 2004 she taught and conducted research at the Washington Alexandria Architecture Consortium - Virginia Polytechnic and State University, Alexandria, VA. Her current research focuses on the potential of computer-based exchange of experiential knowledge in architecture. Between 2000 and 2006 she coordinated third-party verification procedures at the Collaborative Research Center SFB 524 „Materials and Structures in Revitalization of Buildings“, Bauhaus-University Weimar, Germany. Her work has been published at various international conferences as well as in related peer-reviewed journals.
TRANSFORMING TACIT KNOWLEDGE
(2011)
Sabine Ammon studied architecture and philosophy at the Technische Universität Berlin. Study and research visits led her to the University of London, Harvard University and ETH Zürich. Furthermore, she practised building design as a freelance architect. Her dissertation “Wissen verstehen. Perspektiven einer prozessualen Theorie der Erkenntnis”, Weilerswist 2009, develops a theory of knowledge, based on the philosophy of symbols. In her current research project she explores the epistemic dimension of architectural design processes.
Professor Jane Rendell is Director of Architectural Research at the Bartlett, UCL. An architectural designer and historian, art critic and writer, her work has explored various interdisciplinary intersections: feminist theory and architectural history, fine art and architectural design, autobiographical writing and criticism. She is author of Site-Writing: The Architecture of Art Criticism (2010), Art and Architecture (2006), The Pursuit of Pleasure (2002) and co-editor of Pattern (2007), Critical Architecture (2007), Spatial Imagination (2005), The Unknown City (2001), Intersections (2000), Gender Space Architecture (1999) and Strangely Familiar (1995). Her talks and texts have been commissioned by artists such as Daniel Arsham and Bik Van Der Pol, and by galleries such as the Baltic, the Hayward, Kunstmuseum Thon, the Serpentine, the Tate and the Whitechapel. She is on the editorial boards of ARQ (Architectural Research Quarterly), Haecceity, The Happy Hypocrite, The Issues and the Journal of Visual Culture in Britain.
In order to make control decisions, smart buildings need to collect data from multiple sources and bring it to a central location, such as the Building Management System (BMS). This needs to be done in a timely and automated fashion. Besides data gathered from different energy-using elements, information on occupant behaviour is also important for a building's requirement analysis. In this paper, the parameter of occupant density was considered to help characterise the behaviour of occupants towards a building space. Through this parameter, support for analysing building energy consumption and requirements based on occupant needs and demands was provided. The demonstrator presented provides information on the number of people present in a particular building space at any time, giving the space density. Collections of such density data over a certain period of time represent occupant behaviour towards the building space, giving its usage patterns. Similarly, inventory items were tracked and monitored when moving out of or being brought into a particular read zone. For both people and inventory items, this was achieved using small, low-cost, passive Ultra-High Frequency (UHF) Radio Frequency Identification (RFID) tags. Occupants were given the tags in the form factor of a credit card to be carried at all times. A central database was built in which occupant and inventory information for a particular building space was maintained for monitoring and central data access.
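The density aggregation described above can be sketched in a few lines (not the authors' implementation; the read log, field names and time resolution are purely illustrative): tag reads are grouped per zone and hour, and the number of distinct tags per group serves as a simple occupancy-density proxy.

from collections import defaultdict
from datetime import datetime

# Hypothetical RFID read log: (timestamp, tag_id, zone_id)
reads = [
    ("2021-05-03 09:05", "tag-001", "room-101"),
    ("2021-05-03 09:07", "tag-002", "room-101"),
    ("2021-05-03 09:58", "tag-001", "room-102"),
    ("2021-05-03 10:03", "tag-003", "room-101"),
]

def hourly_density(reads):
    """Number of distinct tags seen per zone and hour (a simple density proxy)."""
    seen = defaultdict(set)
    for ts, tag, zone in reads:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        seen[(zone, hour)].add(tag)
    return {key: len(tags) for key, tags in seen.items()}

print(hourly_density(reads))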
A central issue for the autonomous navigation of mobile robots is to map unknown environments while simultaneously estimating the robot's position within this map. This chicken-and-egg problem is known as simultaneous localization and mapping (SLAM). Asctec's quadrotor Pelican is a powerful and flexible research UAS (unmanned aircraft system) which enables the development of new real-time on-board algorithms for SLAM as well as autonomous navigation. The relative UAS pose estimation for SLAM, usually based on low-cost sensors like inertial measurement units (IMU) and barometers, is known to be affected by high drift rates. In order to significantly reduce these effects, we incorporate additional independent pose estimation techniques using exteroceptive sensors. In this article we present first pose estimation results using a stereo camera setup as well as a laser range finder, each used individually. Even though these methods fail in a few specific configurations, we demonstrate their effectiveness and value for the reduction of IMU drift rates and give an outlook on further work towards SLAM.
The process of matching data represented in two different data models is a longstanding issue in the exchange of data between different software systems. While the traditional manual matching approach cannot meet today’s demands on data exchange, research shows that a fully automated generic approach for model matching is not likely, and generic semi-automated approaches are not easy to implement. In this paper, we present an approach that focuses on matching data models in a specific domain. The approach combines a basic model matching approach and a version matching approach to deduce new matching rules to enable data transfer between two evolving data models.
This paper describes ongoing research on the representation of and reasoning about construction specifications, which is part of a larger research project that aims at developing a formalism for automating the identification of deviations and defects on construction sites. We specifically describe the requirements on product and process models and an approach for representing and reasoning about construction specifications to enable automated detection and assessment of construction deviations and defects. This research builds on previous research on modeling design specifications and extends and elaborates the concept of contexts developed in that domain. The paper provides an overview of how construction specifications are being modeled in this research and points out future steps that need to be accomplished to develop the envisioned automated deviation and defect detection system.
Leslie Kavanaugh is both an architect and a philosopher. She is a licensed architect in America and the Netherlands, as well as a member of the AIA, but studied philosophy from undergraduate to doctorate at the University of Amsterdam. She has taught philosophy and design at various institutions, including twelve years at TUDelft, and as a guest professor at the Tokyo Science University and Milano Politecnico. Her publications include The Architectonic of Philosophy: Plato, Aristotle, Leibniz (Amsterdam: University of Amsterdam Press, 2007), Crossovers (with A.Graafland), Meditations on Space (2010), Aggregates (2010), and Chronotopologies: Hybrid Spatialities and Multiple Temporalities (Amsterdam: Rodopi Press, forthcoming). Presently she is the founder and director of studiokav.com in Amsterdam, a multi-disciplinary and collaborative atelier. In addition, Kavanaugh is an affiliated Senior Scholar at the Philosophy Institute, Leiden University, the Netherlands.
The aim of our contribution is to clarify the relation between totally regular variables and Appell sequences of hypercomplex holomorphic polynomials (sometimes simply called monogenic power-like functions) in hypercomplex function theory. After their introduction in 2006 by two of the authors of this note on the occasion of the 17th IKM, the latter have been the subject of investigations by different authors with different methods and in various contexts. The former concept, introduced by R. Delanghe in 1970 and later also studied by K. Gürlebeck in 1982 for the case of quaternions, has an obvious relationship with the latter, since it describes a set of linear hypercomplex holomorphic functions all powers of which are also hypercomplex holomorphic. Due to the non-commutative nature of the underlying Clifford algebra, being a totally regular variable or an Appell sequence is not a trivial property, as it is for the integer powers of the complex variable z = x + iy. Simple examples also show that not every totally regular variable and its powers form an Appell sequence, and vice versa. Under a very natural normalization condition, the set of all paravector-valued totally regular variables which are also Appell sequences will be completely characterized. In some sense the result can also be considered as an answer to a remark of K. Habetha in chapter 16: Function theory in algebras of the collection Complex analysis. Methods, trends, and applications, Akademie-Verlag Berlin, (Eds. E. Lanckau and W. Tutschke) 225-237 (1983) on the use of exact copies of several complex variables for the power series representation of any hypercomplex holomorphic function.
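For reference, the Appell property referred to above can be written as follows (a sketch in our notation; here \overline{\partial} = \partial_{x_0} - \partial_{\underline{x}} denotes the conjugate generalized Cauchy-Riemann operator and \mathcal{P}_n the n-th polynomial of the sequence):

\tfrac{1}{2}\,\overline{\partial}\,\mathcal{P}_n = n\,\mathcal{P}_{n-1}, \qquad n \ge 1,

in direct analogy with \frac{d}{dz}\, z^n = n\, z^{n-1} for the integer powers of the complex variable z.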
A topology optimization method has been developed for structures subjected to multiple load cases (for example, a bridge pier subjected to wind loads, traffic and superstructure loads). We formulate the problem as a multi-criteria optimization problem, where the compliance is computed for each load case. Then the epsilon-constraint method (proposed by Chankong and Haimes, 1971) is adapted. The strategy of this method is based on minimizing the maximum compliance resulting from the critical load case while the other remaining compliances are considered as constraints. In each iteration, the compliances of all load cases are computed and only the maximum one is minimized. The topology optimization process thus switches from one load case to another according to the variation of the resulting compliances. In this work we motivate and explain the proposed methodology and provide some numerical examples.
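The switching strategy described above can be illustrated with a deliberately simplified, runnable toy sketch (not the authors' implementation; the quadratic "compliances", the scalar design variable and the gradient step are purely illustrative): in each iteration the worst load case is identified and only its objective drives the design update, while the remaining compliances would act as constraints in the actual method.

# Toy sketch of the min-max switching idea; all functions and values are made up.
load_cases = [
    lambda x: (x - 1.0) ** 2 + 1.0,   # "compliance" under load case 1
    lambda x: (x + 2.0) ** 2 + 0.5,   # "compliance" under load case 2
]

def grad(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

x = 0.0
for it in range(200):
    c = [f(x) for f in load_cases]
    worst = max(range(len(c)), key=lambda i: c[i])   # critical load case
    x -= 0.05 * grad(load_cases[worst], x)           # update driven by the maximum only

# x settles near the point where both compliances balance
print(x, [f(x) for f in load_cases])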
Digital models of buildings are widely used in civil engineering. In these models, geometric information is used as the leading information. Engineers are used to having geometric information and, for instance, it is state of the art to specify a point by its three coordinates. However, the traditional approaches have disadvantages. Geometric information is over-determined; thus, more geometric information is specified and stored than needed. In addition, engineers already deal with topological information: the denotation of objects in buildings is of a topological nature. The question is whether approaches in which topological information takes the leading role would be more efficient in civil engineering. This paper presents such an approach. Topological information is modelled independently of geometric information and is used for denoting the objects of a building. Geometric information is associated with topological information, so that geometric information “weights” the topology.
The concept presented in this paper has already been used in surveying existing buildings. Experience with this concept showed that the amount of geometric information required for a complete specification of a building could be reduced by a factor of up to 100. Further research will show how this concept can be used in planning processes.
Topological Houses
(2003)
Many properties of houses are of a topological nature. The problem of three-dimensional encoding is solved here by first giving an axiomatic description of a simplified concept of a >house< as a certain generalisation of a CW-complex and, secondly, by generalising local observation structures of embedded unconnected planar graphs to the three-dimensional case and proving that they allow retrieving all topological properties of these simplified houses. In the more general case of an architectural complex (a certain generalisation of a >house<), much topological information is still kept in these structures, making them a useful approach to encoding topological spaces. Finally, a lossless representation of observation structures in a relational database scheme, which we call PLAV (Points, Lines, Areas, Volumes), is given. We expect PLAV to be useful for encoding higher-dimensional (architectural) space-time complexes.
TOOL TO CHECK TOPOLOGY AND GEOMETRY FOR SPATIAL STRUCTURES ON BASIS OF THE EXTENDED MAXWELL'S RULE
(2006)
One of the simplest principles in the design of lightweight structures is to avoid bending. This can be achieved by dissolving girders into members acting purely in axial tension or compression. The use of cables for the tensioned members leads to even lighter structures, which are called cable-strut structures; they constitute a subclass of spatial structures. Giving fast information about the general feasibility of an architectural concept employing cable-strut structures is a challenging task due to their sophisticated mechanical behavior. In this regard it is essential to check whether the structure is stable and whether pre-stress can be applied. This paper presents a tool using the spreadsheet software Microsoft (MS) Excel which can provide such information. Thus no special software has to be purchased and the corresponding training effort is much lower. The tool was developed on the basis of the extended Maxwell's rule, which besides topology also considers the geometry of the structure. For this, the rank of the node equilibrium matrix is crucial. The significance and determination of the rank and the implementation of the corresponding algorithms in MS Excel are described in the following. The presented tool is able to support the structural designer in an early stage of the project in finding a feasible architectural concept for cable-strut structures. As examples for the application of the software tool, two special cable-strut structures, so-called tensegrity structures, were examined with respect to their mechanical behavior.
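The rank computation at the core of the extended Maxwell's rule can also be sketched outside of Excel. The following minimal example (our illustration, not the tool itself; geometry and support data are made up) assembles the equilibrium matrix of a pin-jointed 3D tripod, removes the supported degrees of freedom and derives the number of self-stress states and internal mechanisms from the matrix rank.

import numpy as np

def equilibrium_matrix(nodes, members):
    """3n x b matrix of member direction cosines for a 3D pin-jointed structure."""
    n, b = len(nodes), len(members)
    A = np.zeros((3 * n, b))
    for k, (i, j) in enumerate(members):
        d = np.asarray(nodes[j], float) - np.asarray(nodes[i], float)
        u = d / np.linalg.norm(d)
        A[3 * i:3 * i + 3, k] = u       # member k acting on node i
        A[3 * j:3 * j + 3, k] = -u      # and on node j
    return A

# Tripod: three fixed base nodes, one free apex, three bars (illustrative data)
nodes = [(1, 0, 0), (-0.5, 0.8, 0), (-0.5, -0.8, 0), (0, 0, 1.5)]
members = [(0, 3), (1, 3), (2, 3)]
fixed_dofs = list(range(9))             # base nodes 0..2 fully supported (c = 9)

A = equilibrium_matrix(nodes, members)
A_free = np.delete(A, fixed_dofs, axis=0)
r = np.linalg.matrix_rank(A_free)
s = len(members) - r                    # states of self-stress
m = A_free.shape[0] - r                 # internal mechanisms
print(f"rank = {r}, self-stress states s = {s}, mechanisms m = {m}")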
Computationally, design activity can be treated as a state transition. In stepwise processing, in-between form states are not easily observed; in this research a time-based concept is therefore introduced and applied in order to bridge the gap. In architecture, folding is one method of form manipulation, and architects also want to search for alternatives through this operation. The folding operation has to be defined and parameterized before time is introduced as a variable of folding. As a result, time-based transformation provides sequential form states and redirects design activity.
The concrete is modeled as a material with damage and plasticity, whereby the viscoplastic and viscoelastic behaviour depends on the rate of the total strains. Due to the damage behaviour, the compliance tensor develops different properties in tension and compression. Various yield surfaces, flow rules and damage rules have been tested with respect to their usability in a concrete model. A three-dimensional yield surface was developed by the author from a failure surface based on the Willam-Warnke five-parameter model. Only one general uniaxial stress-strain relation is used for the numerical control of the yield surface; from that curve, all necessary parameters for different concrete strengths and different strain rates can be derived by affine transformations. For the flow rule, a non-associated inelastic potential is used in the compression zone and a Rankine potential in the tension zone. Owing to the time-dependent formulation, the symmetry of the system equations is maintained in spite of the use of non-associated potentials for the derivation of the inelastic strains. For quasi-static computations, a simple viscoplastic law based on an approach by Perzyna is used. The principle of equality of dissipation power in the uniaxial and the triaxial state of stress is used; it is modified by a factor that depends on the current stress ratio and, in comparison with the Kupfer experiments, yields more realistic strains. The concrete model is implemented in a mixed hybrid finite element. Examples at the structural level are presented for the verification of the concrete model.
Albert Narath is a doctoral candidate in modern architectural history at Columbia University in New York and a Paul Mellon Pre-doctoral Fellow at the Center for Advanced Research in the Visual Arts at the National Gallery of Art in Washington, DC. He holds an MA degree from the Architectural Association in London. His dissertation concerns architectural and art historical debates surrounding the Neo-baroque at the end of the nineteenth century in Germany.
A realistic and reliable model is an important precondition for the simulation of revitalization tasks and the estimation of system properties of existing buildings. The main focus lies on parameter identification, optimization strategies and the preparation of experiments. As usual, structures are modeled by the finite element method. This, as well as other techniques, is based on idealizations and empirical material properties. Within one theory, the parameters of the model should be approximated by gradually performed experiments and their analysis. This approximation is performed by solving an optimization problem, which is usually non-convex, of high dimension and has a non-differentiable objective function. Therefore we use an optimization procedure based on genetic algorithms which was implemented using the program package SLang...
For the analysis and planning of transport networks, detailed information concerning travel sequences is required. The paper examines an activity chain model to determine the stochastic travel demand which an individual generates in order to participate in an activity or a sequence of activities over the day. The transition from one activity to another depends on the activity participation decision, which is made sequentially and is constrained in space and time. Activity chains derived from travel survey data are used as a basis to develop a more realistic model than conventional discrete trip models. Probabilistic concepts and statistical procedures are used to estimate characteristics of travel demand such as the number of daily out-of-home activities and the transition probabilities between activities for homogeneous behavioural groups. In addition, the paper compares the model for two towns of different size.
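The transition probabilities mentioned above can be estimated from observed chains by simple relative frequencies. A minimal sketch (not the authors' code; the activity labels and survey data are made up):

from collections import Counter, defaultdict

# Hypothetical activity chains from a travel survey
# (H = home, W = work, S = shopping, L = leisure)
chains = [
    ["H", "W", "S", "H"],
    ["H", "W", "H"],
    ["H", "S", "L", "H"],
    ["H", "W", "L", "H"],
]

def transition_probabilities(chains):
    """Relative-frequency estimate of P(next activity | current activity)."""
    counts = defaultdict(Counter)
    for chain in chains:
        for a, b in zip(chain, chain[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

print(transition_probabilities(chains))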
For the assessment of old buildings, thermographic analysis using infrared cameras is now widely employed. Image processing and evaluation can be economically practicable only if the image evaluation can be automated to the largest possible extent. For that reason, methods of computer vision for evaluating thermal images are presented in this paper. To detect typical elements in thermal images (i.e. grey-value images), such as thermal bridges and lintels, methods of digital image processing are applied, which provide numerical procedures to transform, modify and encode images. Image processing can be regarded as a multi-stage process: in order to carry out image analysis from image formation through enhancement and segmentation to classification, appropriate functions must be implemented. For this purpose, different measuring procedures and methods for automated detection and evaluation have been tested.
The method of difference potentials can be used to solve discrete elliptic boundary value problems, where all derivatives are approximated by finite differences. Following classical potential theory, an integral equation on the boundary is investigated, which is solved approximately with the help of a quadrature formula. The advantage of the discrete method is that it leads to a linear system of equations on the boundary, which can be solved immediately on the computer. The described method of difference potentials is based on the discrete Laplace equation in the three-dimensional case. In a first step, the integral representation of the discrete fundamental solution is presented and its convergence behaviour with respect to the continuous fundamental solution is discussed. Because the method can be used to solve boundary value problems in interior as well as in exterior domains, it is necessary to explain some geometrical aspects relating to the discrete domain and the double-layer boundary. A discrete analogue of the integral representation for functions in will be presented. The main result consists in splitting the difference potential on the boundary into a discrete single-layer and double-layer potential, respectively. The discrete potentials are used to establish and solve a linear system of equations on the boundary. The actual form of these equation systems and the conditions for solvability are presented for Dirichlet and Neumann problems in interior as well as in exterior domains.
The Lucas-Kanade tracker has proven to be an efficient and accurate method for the calculation of optical flow. However, this algorithm can reliably track only suitable image features like corners and edges. Therefore, the optical flow can only be calculated for a few points in each image, resulting in sparse optical flow fields. Accumulation of these vectors over time is a suitable method to retrieve a dense motion vector field. However, the accumulation process limits the application of the proposed method to fixed camera setups. Here, a histogram-based approach is favored to allow more than a single typical flow vector per pixel. The resulting vector field can be used to detect roads and prescribed driving directions which constrain object movements. The motion structure can be modeled as a graph: the nodes represent entry and exit points for road users as well as crossings, while the edges represent typical paths.
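The accumulation idea described above can be sketched with OpenCV as follows (our illustration, not the cited implementation; "traffic.avi", the bin count and the thresholds are placeholders): corners are tracked between consecutive frames of a fixed camera, and the quantized flow direction of each tracked point is accumulated into a per-pixel histogram.

import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.avi")               # placeholder fixed-camera sequence
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
h, w = prev_gray.shape
n_dirs = 8
hist = np.zeros((h, w, n_dirs), dtype=np.uint32)    # per-pixel histogram of flow directions

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=5)
    if p0 is not None:
        p1, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
        for (x0, y0), (x1, y1), ok_pt in zip(p0.reshape(-1, 2), p1.reshape(-1, 2), st.reshape(-1)):
            if not ok_pt:
                continue
            dx, dy = x1 - x0, y1 - y0
            if dx * dx + dy * dy < 0.25:             # ignore near-zero motion
                continue
            angle = np.arctan2(dy, dx)               # quantize the flow direction
            bin_ = int((angle + np.pi) / (2 * np.pi) * n_dirs) % n_dirs
            hist[int(y0), int(x0), bin_] += 1
    prev_gray = gray

# Pixels with a dominant direction indicate roads and their prescribed driving direction.
dominant = hist.argmax(axis=2)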
It is well known that the solution of the fundamental equations of linear elasticity for a homogeneous isotropic material in the plane stress and plane strain cases can be equivalently reduced to the solution of a biharmonic equation. The discrete version of the theorem of Goursat is used to describe the solution of the discrete biharmonic equation with the help of two discrete holomorphic functions. In order to obtain a Taylor expansion of discrete holomorphic functions, we introduce a basis of discrete polynomials which fulfill the so-called Appell property with respect to the discrete adjoint Cauchy-Riemann operator. All these steps are very important in the field of fracture mechanics, where stress and displacement fields in the neighborhood of singularities caused by cracks and notches have to be calculated with high accuracy. Using the sum representation of holomorphic functions, it seems possible to reproduce the order of the singularity and to determine important mechanical characteristics.
The increased implementation of site data capture technologies invariably results in an increase in data warehousing and database technologies to store captured data. However, restricted use of data beyond the initial application could potentially result in a loss of understanding of site processes. This could in turn lead to poor decision making at production, tactical and strategic levels. Concrete usage data have been collected from two piling processes. These data have been analysed and the results highlighted potential improvements that could be made to existing site management and estimating processes. A cost benefit analysis has been used to support decision making at the strategic level where the identified improvements require capital expenditure.
The stress state of a piecewise-homogeneous elastic body, which has a semi-infinite crack along the interface, under in-plane and antiplane loads is considered. One of the crack edges is reinforced by a rigid patch plate on a finite interval adjacent to the crack tip. The crack edges are loaded with specified stresses. The body is stretched at infinity by specified stresses. External forces with a given principal vector and moment act on the patch plate. The problem reduces to a Riemann-Hilbert boundary-value matrix problem with a piecewise-constant coefficient for two complex potentials in the plane case and for one in the antiplane case. The complex potentials are found explicitly using a Gaussian hypergeometric function. The stress state of the body close to the ends of the patch plate, one of which is also simultaneously the crack tip, is investigated. Stress intensity factors near the singular points are determined.
This paper is focused on the first numerical tests of the coupling between an analytical solution and the finite element method, using an example problem from fracture mechanics. The calculations were carried out according to the ideas proposed in [1]. The analytical solutions are constructed using an orthogonal basis of holomorphic and anti-holomorphic functions. For the coupling with the finite element method, special elements are constructed using the trigonometric interpolation theorem.
The preliminary design of a wearable computer for supporting Construction Progress Monitoring
(2000)
Progress monitoring has become more and more important as owners have increasingly demanded shorter delivery times for their projects. This trend is even more evident in high-technology industries, such as the computer and chemical industries. Fast-changing markets, such as the computer industry, force companies to build new facilities quickly. To make a statement about construction progress, the status of a building has to be determined and monitored over a period of time. By depicting the construction progress in a diagram over time, statements can be made about the anticipated completion of the project and about delays and problems in certain areas. With this information, measures can be taken to efficiently >catch up< on the project schedule. New technologies, such as wearable computers, speech recognition, touch screens and wireless networks, could help to move electronic data processing to the construction site. Progress monitoring could greatly benefit from this move, as several intermediate steps in processing progress data would become unnecessary. The processing of progress data could be done entirely by computers, which means that data for supporting decisions can be made available the moment the construction progress is measured. This paper describes a project that investigates how these new technologies can be linked to create a system that enhances the efficiency of progress monitoring. During the project a first prototype of a progress monitoring system was developed that allows construction companies and site supervisors to measure construction progress on site using wearable computers that are speech-controlled and connected to a central database via a wireless network.
THE INFLUENCE OF THE LOCAL CONCAVITY ON THE FUNCTIONING OF BEARING SHELL OF HIGH-RISE CONSTRUCTION
(2012)
Areas with various defects and damages that reduce the carrying capacity were examined in a study of metal chimneys. In this work, the influence of local dimples on the behaviour of metal chimneys was considered. The modelling tasks were completed in the software packages LIRA and ANSYS. Parameters characterizing the local dimples were identified, and a numerical study of the influence of local dimples on the stress-strain state of the shells of metal chimneys was conducted. The distribution of circumferential and meridional stresses was analysed in the investigated area. Zones of influence of the dimples on the bearing shell of metal chimneys were investigated. The bearing capacities of high-rise structures with various dimple geometries and various shell parameters were determined with respect to specified areas of the trunk. The decrease in the bearing capacity of a shell as a function of the dimple parameters is represented graphically. Recommendations for the diameter and thickness of the shells of metal chimneys were derived from the resulting data.
This article presents the Rigid Finite Element Method applied to the calculation of the deflection of reinforced concrete beams with cracks. Initially, this method was used in the shipbuilding industry; later it was adapted to the calculation of homogeneous bar structures. In this method, rigid mass discs serve as the element model. In the plane case, three generalized coordinates (two translational and one rotational) correspond to each disc. The discs are connected by elastic ties. The novel idea is to take a discrete crack into account within the Rigid Finite Element Method. It consists in suitably reducing the stiffness of the rotational ties located at the positions where cracks have occurred. The flexibility of such a tie results from the flexural deformability of the element and from the occurrence of the crack. As part of the numerical analyses, the influence of cracks on the total deflection of beams was determined. Furthermore, the results of the calculations were compared with the results of an experiment. The calculated deflections were found to overestimate the measured deflections. The article quantifies the size of this overestimation and describes its causes.
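The following is a minimal illustrative sketch of the core idea only, not the authors' implementation: a chain of rigid discs coupled by rotational ties, where the tie at a cracked section receives a reduced stiffness. The per-tie stiffness EI/l and the reduction factor are placeholders; the actual RFEM derivation of the tie properties differs in detail.

import numpy as np

def beam_rotational_stiffnesses(EI, seg_len, n_segments, cracked, reduction=0.3):
    # One rotational tie between each pair of adjacent rigid discs.
    # A crack at tie i is modelled by reducing that tie's rotational stiffness.
    k = np.full(n_segments - 1, EI / seg_len)   # placeholder elastic tie stiffness per segment
    for i in cracked:
        k[i] *= reduction                       # placeholder stiffness-reduction factor at a crack
    return k

# Example: 10 segments of 0.5 m each, cracks assumed at ties 3 and 5
k_ties = beam_rotational_stiffnesses(EI=8.1e6, seg_len=0.5, n_segments=10, cracked=[3, 5])
print(k_ties)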
In this paper we present the rudiments of a higher dimensional analogue of the Szegö kernel method to compute 3D mappings from elementary domains onto the unit sphere. This is a formal construction which provides us with a good substitute for the classical conformal Riemann mapping. We give explicit numerical examples and discuss a comparison of the results with those obtained alternatively by the Bergman kernel method.
In this note, we describe quite explicitly the Howe duality for Hodge systems and connect it with well-known facts of harmonic analysis and Clifford analysis. In Section 2, we briefly recall the Fisher decomposition and the Howe duality for harmonic analysis. In Section 3, the well-known fact that Clifford analysis is a real refinement of harmonic analysis is illustrated by the Fisher decomposition and the Howe duality for the space of spinor-valued polynomials in Euclidean space under the so-called L-action. On the other hand, for Clifford algebra valued polynomials we can consider another action, called in Clifford analysis the H-action. In the last section, we recall the Fisher decomposition for the H-action obtained recently. While in Clifford analysis the prominent role is played by the Dirac equation, in this case the basic set of equations is formed by the Hodge system. Moreover, the analysis of Hodge systems can even be viewed as a refinement of Clifford analysis. In this note, we describe the Howe duality for the H-action. In particular, in Proposition 1, we recognize the Howe dual partner of the orthogonal group O(m) in this case as the Lie superalgebra sl(2|1). Furthermore, Theorem 2 gives the corresponding multiplicity-free decomposition with an explicit description of the irreducible pieces.
THE HERPICH AFFAIR OF 1924
(2011)
Michele Stavagna is an architect and architectural historian who lives and works in Berlin and is the correspondent from Italy for the magazine “der architekt - BDA”. He was educated at the Università IUAV of Venice (Italy), holds a degree in architectural design and a PhD in the history of architecture and urban design, and has taught Theory and History of Industrial Design at the Università degli Studi of Trieste (Italy). Stavagna translated and edited the first Italian edition of “Die Baukunst der neuesten Zeit” by G. A. Platz. His research themes focus on the birth and affirmation of Modernism within the broader context of the mass public and the economic development of modern society.
The conventional way of describing an image is in terms of its canonical pixel-based representation. Other image description techniques are based on image transformations. Such an image transformation converts a canonical image representation into a representation in which specific properties of an image are described more explicitly. In most transformations, images are locally approximated within a window by a linear combination of a number of a priori selected patterns. The coefficients of such a decomposition then provide the desired image representation. The Hermite transform is an image transformation technique introduced by Martens. It uses overlapping Gaussian windows and projects images locally onto a basis of orthogonal polynomials. As the analysis filters needed for the Hermite transform are derivatives of Gaussians, Hermite analysis is in close agreement with the information analysis carried out by the human visual system. In this paper we construct a new higher dimensional Hermite transform within the framework of Quaternionic Analysis. The building blocks for this construction are the Clifford-Hermite polynomials rewritten in terms of Quaternionic analysis. Furthermore, we compare this newly introduced Hermite transform with the Quaternionic-Hermite Continuous Wavelet transform. The Continuous Wavelet transform is a signal analysis technique suitable for non-stationary, inhomogeneous signals for which Fourier analysis is inadequate. Finally the developed three dimensional filter functions of the Quaternionic-Hermite transform are tested with traditional scalar benchmark signals upon their selectivity at detecting pointwise singularities.
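As context for the statement that the analysis filters are derivatives of Gaussians: by the Rodrigues formula (a standard identity, added here only for orientation; the exact scaling and normalization used by Martens differ),

\[ \frac{d^n}{dx^n} e^{-x^2} \;=\; (-1)^n\, H_n(x)\, e^{-x^2}, \]

so, after rescaling to the window width, the n-th one-dimensional analysis filter is a Hermite polynomial modulated by the Gaussian window.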
Scientific colloquium held from 14 to 16 October 1999 at the Bauhaus-Universität in Weimar on the theme ‚global village - Perspektiven der Architektur'
THE FOURIER-BESSEL TRANSFORM
(2010)
In this paper we devise a new multi-dimensional integral transform within the Clifford analysis setting, the so-called Fourier-Bessel transform. It appears that in the two-dimensional case, it coincides with the Clifford-Fourier and cylindrical Fourier transforms introduced earlier. We show that this new integral transform satisfies operational formulae which are similar to those of the classical tensorial Fourier transform. Moreover the L2-basis elements consisting of generalized Clifford-Hermite functions appear to be eigenfunctions of the Fourier-Bessel transform.
Kari Jormakka has been teaching architectural theory at the Bauhaus University in Weimar since 2007. In addition, he has been an Ordinarius Professor of architectural theory at Vienna University of Technology since 1998. Previously, he has taught at the Knowlton School of Architecture at the Ohio State University, the University of Illinois at Chicago, Tampere University of Technology as well as Harvard University. Author of ten books and many papers on architectural history and theory, he studied architecture at Otaniemi University in Helsinki and at Tampere University of Technology, as well as philosophy at Helsinki University.
We briefly review and use the recent comprehensive research on the manifolds of square roots of −1 in real Clifford geometric algebras Cl(p,q) in order to construct the Clifford Fourier transform. Basically, in the kernel of the complex Fourier transform the complex imaginary unit j is replaced by a square root of −1 in Cl(p,q). The Clifford Fourier transform (CFT) thus obtained generalizes previously known and applied CFTs, which replaced the complex imaginary unit j only by blades (usually pseudoscalars) squaring to −1. A major advantage of real Clifford algebra CFTs is their completely real geometric interpretation. We study (left and right) linearity of the CFT for constant multivector coefficients in Cl(p,q), translation (x-shift) and modulation (ω-shift) properties, and signal dilations. We show an inversion theorem. We establish the CFT of vector differentials, partial derivatives, vector derivatives and spatial moments of the signal. We also derive Plancherel and Parseval identities as well as a general convolution theorem.
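As a compact illustration of the construction described above (notation and normalization are ours and not necessarily those of the paper), the complex Fourier kernel is kept while the imaginary unit j is replaced by a chosen square root of −1 in the algebra:

\[ \mathcal{F}_i\{f\}(\omega) \;=\; \int_{\mathbb{R}^n} f(x)\, e^{-\,i\,\langle x,\omega\rangle}\, d^n x, \qquad i \in Cl(p,q),\; i^2 = -1, \]

where the exponential is defined by its power series in the Clifford algebra; left- and right-sided versions arise because the multivector signal f(x) and the kernel need not commute.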
For planning in existing built contexts, the building survey is the starting point for initial planning proposals, for the diagnosis and documentation of building damages, for the creation of objectives catalogues, for the detailed design of renovation and conversion measures and for ensuring fulfilment of building legislation, particularly by change of use and refitting. An examination of currently available IT-tools shows insufficient support for planning within existing contexts, most notably a deficit with regard to information capture and administration. This paper discusses the concept for a modular surveying system (basic concept, separation of geometry from semantic data, and separation into sub-systems) and the prototypical realisation of a system for the complete support of the entire building surveying process for existing buildings. The project aims to contribute to the development of a planning system for existing buildings. ...
Scientific colloquium held from 14 to 16 October 1999 at the Bauhaus-Universität in Weimar on the theme ‚global village - Perspektiven der Architektur'
Tobias Danielmeier teaches design at the Otago Polytechnic as well as at the University of Otago in Dunedin, New Zealand. He holds a Masters of Arts in Architecture from the Münster School of Architecture and is currently completing his PhD at the University of Otago. His research investigates the art, business and science of winery architecture and their interrelation with place and technology. Tobias Danielmeier’s practical experience includes projects for Reichardt Architekten, Essen, and Bolles+Wilson, Münster.
A new, headless form of violence has replaced the imperialism of earlier times. The new world order, the »Empire«, transcends all the boundaries of our traditional political concepts: state and society, war and peace, control and freedom. The decentralized and deterritorialized Empire rules over us by exerting direct influence on people through the media, through technology and through social practices.
Architecture and spatial planning have changed radically over the past decades. The old modernist ambitions of affordable housing and a rational organization of cities have receded into the background, as have the postmodern obsessions with communication, user participation and public space. Instead, aesthetic and decidedly less political concerns now take centre stage: debates between a critical and a projective practice, between blobs and boxes, between atmosphere and ornament.
Yet this is far from the end of the story, as the present volume makes clear. The contributions to the 11th Bauhaus-Kolloquium span a period reaching from the founding of the Bauhaus in Weimar to the global architecture of our time, tracing the development of the Empire back in order to ask, at the same time, about the consequences and alternatives that architecture faces today.
Non-destructive techniques for damage detection have become a focus of engineering interest in the last few years. However, applying these techniques to large, complex structures such as civil engineering buildings still has some limitations, since these types of structures are unique and the methodologies often need a large number of specimens for reliable results. For this reason, cost and time can greatly influence the final results. Model Assisted Probability Of Detection (MAPOD) has taken its place among the ranks of damage identification techniques, especially with advances in computer capacity and modeling tools. Nevertheless, the essential condition for a successful MAPOD is having a reliable model in advance. This condition opens the door to model assessment and model quality problems. In this work, an approach is proposed that uses Partial Models (PM) to compute the Probability Of damage Detection (POD). A simply supported beam, which can be structurally modified and tested under laboratory conditions, is taken as an example. The study includes both experimental and numerical investigations, the application of vibration-based damage detection approaches and a comparison of the results obtained from tests and simulations. Finally, a proposal for a methodology to assess the reliability and the robustness of the models is given.
This paper describes the application of interval calculus to the calculation of plate deflection, taking into account the inevitable and acceptable tolerances of the input data (input parameters). A simply supported reinforced concrete plate loaded by a uniformly distributed load was taken as an example. Several parameters that influence the plate deflection are given as closed intervals. Accordingly, the results are obtained as intervals, so it is possible to follow the direct influence of a change of one or more input parameters on the output values (in our example, the deflection) by using one model and one computing procedure. The described procedure can be applied to any FEM calculation in order to keep calculation tolerances, ISO tolerances and production tolerances within close (admissible) limits. Wolfram Mathematica was used as the tool for the interval calculations.
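A minimal sketch of the idea with elementary interval arithmetic in Python, using the classical thin-plate expression w_max = α q a⁴ / D with D = E h³ / (12(1−ν²)) for a simply supported square plate under uniform load; the coefficient α and all input intervals below are illustrative placeholders, not the values or the FEM model used in the paper, and the naive interval evaluation overestimates the true enclosure because of parameter dependencies:

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __mul__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        p = [self.lo*other.lo, self.lo*other.hi, self.hi*other.lo, self.hi*other.hi]
        return Interval(min(p), max(p))
    __rmul__ = __mul__
    def __truediv__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        assert not (other.lo <= 0.0 <= other.hi)    # no division by an interval containing zero
        return self * Interval(1.0/other.hi, 1.0/other.lo)
    def __repr__(self):
        return f"[{self.lo:.4g}, {self.hi:.4g}]"

# Illustrative tolerances on the input parameters (placeholder values).
E  = Interval(29e9, 33e9)      # Pa, concrete modulus
h  = Interval(0.19, 0.21)      # m, plate thickness
q  = Interval(9.5e3, 10.5e3)   # N/m^2, uniform load
a  = Interval(3.98, 4.02)      # m, span
nu = 0.2
alpha = 0.00406                # classical coefficient for a simply supported square plate

D = E * h * h * h / (12.0 * (1.0 - nu**2))   # flexural rigidity as an interval
w_max = alpha * q * a * a * a * a / D        # deflection enclosure
print("deflection enclosure [m]:", w_max)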
The development of a consistent material model for textile reinforced concrete requires the formulation and calibration of several sub-models on different resolution scales. Each of these models represents the material structure at the corresponding scale. While the models at the micro-level are able to capture the fundamental failure and damage mechanisms of the material components (e.g. filament rupture and debonding from the matrix) their computational costs limit their application to the small size representative unit cells of the material structure. On the other hand, the macro-level models provide a sufficient performance at the expense of limited range of applicability. Due to the complex structuring of the textile reinforced concrete at several levels (filament - yarn - textile - matrix) it is a non-trivial task to develop a multiscale model from scratch. It is rather more effective to develop a set of conceptually related sub-models for each structural level covering the selected phenomena of the material behavior. The homogenized effective material properties obtained at the lower level may be verified and validated using experiments and models at the higher level(s). In this paper the development of a consistent material model for textile reinforced concrete is presented. Load carrying and failure mechanisms at the micro, meso and macro scales are described and models with the focus on the specified scales are introduced. The models currently being developed in the framework of the collaborative research center are classified and evaluated with respect to the failure mechanisms being captured. The micromechanical modeling of the yarn and bonding behavior is discussed in detail and the correspondence with the experiments focused on the selected failure and interaction mechanisms is shown. The example of modeling the bond layer demonstrates the application of the presented strategy.
The goal of the collaborative research center (SFB 532) >Textile reinforced concrete (TRC): the basis for the development of a new material technology<, established in 1998 at Aachen University, is a comprehensive assessment of mechanical, chemical, economic and production-related aspects in an interdisciplinary environment. The research project involves 10 institutes performing parallel research in 17 projects. The coordination of such a research process requires effective software support for information sharing in the form of data exchange, data analysis and data archiving. Furthermore, the processes of experiment planning and design, the modification of material compositions and design parameters and the development of new material models in such an environment call for systematic coordination applying the concepts of operational research. A flexible organization of the data coming from several sources is a crucial premise for a transparent accumulation of knowledge and, thus, for successful research in the long run. The technical information system (TRC-TIS) developed in the SFB 532 has been implemented as a database-powered web server with a transparent definition of the product and process model. It serves as an intranet server with access domains devoted to the involved research groups. At the same time, it allows the presentation of selected results simply by granting a data object access from the public area of the server via the internet.
Experimental testing of nailed connections taken from old roof trusses is presented in this paper. To enable the further use and preservation of nailed roof trusses, it is important to understand how nail corrosion and the aging processes of steel and wood affect the load-bearing capacity and deformation behaviour of such structures. The hypothesis that corroded nails allow an increase in load-bearing capacity was investigated. Several old and new joints were tested in a first test series, and the results were very promising with regard to the initial assumption. However, more tests must be carried out to verify the results.
Dr. phil., Professor of History and Theory of Architecture and the City since 2001; previously Professor of Art History at TU Graz; visiting professorships in the history of art, architecture and design at the UdK Berlin and the universities of Kassel, Oldenburg and Bonn; studied art history, sociology, psychology and philosophy. Karin Wilhelm has organized several international exhibitions on modern architecture and design (Berlin, London, Stockholm) and served on the academic advisory boards of the Stiftung Bauhaus Dessau and the Deutsches Architekturmuseum (DAM). Numerous publications, most recently: Bauhaus Weimar 1919–1924 (1996); Kunst als Revolte? Von der Fähigkeit der Künste, Nein zu sagen (1996); Visionen vom Glück – Visionen vom Untergang: Zeichen und Diskurse zur „schönen neuen Welt“ (1998); Sehen – Gehen – Denken: der Entwurf des Bauhausgebäudes, in: ‚Das Bauhausgebäude in Dessau 1926–1999‘ (1998); City-Lights – Zentren, Peripherien, Regionen: interdisziplinäre Positionen für eine urbane Kultur (2002); Idea and form: Häuser von Szyszkowitz + Kowalski (2003); Formationen der Stadt. Camillo Sitte weitergelesen (2005).
Due to the amount of flow simulation and measurement data, automatic detection, classification and visualization of features is necessary for inspection. Therefore, many automated feature detection methods have been developed in recent years. However, in most cases only one feature class is visualized afterwards, and many algorithms have problems in the presence of noise or superposition effects. In contrast, image processing and computer vision have robust methods for feature extraction and for the computation of derivatives of scalar fields. Furthermore, interpolation and other filters can be analyzed in detail. An application of these methods to vector fields would provide a solid theoretical basis for feature extraction. The authors suggest Clifford algebra as a mathematical framework for this task. Clifford algebra provides a unified notation for scalars and vectors as well as a multiplication of all basis elements. The Clifford product of two vectors provides the complete geometric information on the relative positions of these vectors. Integration of this product results in Clifford correlation and convolution, which can be used for template matching of vector fields. For the frequency analysis of vector fields and the behavior of vector-valued filters, a Clifford Fourier transform has been derived for 2D and 3D. Convolution and other theorems have been proved, and fast algorithms for the computation of the Clifford Fourier transform exist. Therefore the computation of the Clifford convolution can be accelerated by computing it in the Clifford Fourier domain. Clifford convolution and the Fourier transform can be used for a thorough analysis and subsequent visualization of flow fields.
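The key identity behind the statement that the Clifford product of two vectors carries their complete relative geometric information is the standard geometric product (added here only for context):

\[ u\,v \;=\; u\cdot v \;+\; u\wedge v, \]

where the scalar part encodes the angle between u and v and the bivector part the oriented plane they span. Clifford convolution of a vector field F with a filter mask H is then formed in analogy to scalar convolution, (H \ast F)(x) = \int H(x')\, F(x-x')\, dx', with the Clifford product under the integral (conventions for which factor is shifted vary between authors).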
Model management systems are a suitable technological basis for managing digital building models during planning activities, both for new construction and for the revitalization of existing buildings. Supporting revitalization processes implies specific requirements for the design of integrated planning environments, such as the representation of information affected by various types of vagueness, the need to represent both the target and the actual state of the building, and the ability to handle temporally inconsistent model states. The required dynamics of the domain models and the required usability in virtual enterprises place further demands on the implementation basis of model management systems. For implementing such systems it proves advantageous to exploit properties of object-oriented programming languages with non-static type systems, since their meta-level together with introspection and reflection mechanisms provides an efficient implementation basis. To effectively support synchronous cooperative planning activities within individual disciplines, a notification mechanism was realized that informs applications coupled to the model management system about concurrent modifications of the associated domain model or of project information. Furthermore, a mechanism exists for the simplified connection of existing applications that are based on static partial models or that support standardized, model-based exchange formats. Finally, a hybrid system architecture consisting of a central project server, domain servers and domain clients is presented, which is suitable for use under the conditions of cooperative and geographically distributed work in revitalization projects within virtual enterprises.
On 25 March 2010, as part of its annual series of conferences on construction management, the Professur Baubetrieb und Bauverfahren, together with the working group „Unikatprozesse“ of the section „Simulation in Produktion und Logistik“ (SPL) within the Arbeitsgemeinschaft Simulation (ASIM), organized a one-day workshop entitled „Modellierung von Prozessen zur Fertigung von Unikaten“ (modelling of processes for the production of one-of-a-kind products). Many construction processes are characterized by their one-of-a-kind nature. Such unique products are marked by prototypical singularity, individuality, diverse boundary conditions and a low degree of standardization and repetition. This makes realistic modelling for the simulation of so-called one-of-a-kind processes difficult. Most of the contributions reproduced in this volume are devoted to this particular challenge.
On 31 March 2008, the Professur Baubetrieb und Bauverfahren and the Junior Professorship of Theoretical Methods of Project Management at the Bauhaus-Universität Weimar organized a one-day workshop entitled „Auf dem Weg zum digitalen (Bau-)haus-Bau“. It was intended to continue the series of workshops begun in autumn 2007 at the University of Kassel under the title „Simulation in der Bauwirtschaft“. This time the focus was placed on the simulation of construction processes: simulation with the aim of providing digital support for work preparation, construction execution and site controlling.
The conference was aimed at managing directors, project managers, site managers and project controllers in planning and execution, with contributions on claim and change management in construction, workflow management in construction practice, the integration of information processes based on Nemetschek technologies, and competence building through targeted further training.
The safe operation of important civil structures such as bridges can be assessed by means of fracture analysis. Since analytical methods are not capable of solving many complicated engineering problems, numerical methods have been increasingly adopted. In this paper, a part of an isotropic material containing a crack is considered as a partial model and the quality of the proposed model is evaluated. EXtended IsoGeometric Analysis (XIGA) is a newly developed numerical approach [1, 2] which benefits from the advantages of its origins: the eXtended Finite Element Method (XFEM) and IsoGeometric Analysis (IGA). It is capable of simulating crack propagation problems without remeshing and of capturing the singular field at the crack tip by using crack tip enrichment functions. Moreover, an exact representation of the geometry is possible using only a few elements. XIGA has also been successfully applied to the fracture analysis of cracked orthotropic bodies [3] and to the simulation of curved cracks [4]. XIGA employs NURBS functions both for the geometry description and for the approximation of the solution field. The drawback of NURBS functions is that local refinement cannot be defined, since they are based on tensor-product constructs, unless multiple patches are used, which also has some limitations. In this contribution, XIGA is further developed to make local refinement feasible by using T-spline basis functions. Adopting a recovery-based error estimator in the proposed approach for evaluating the model quality and performing the adaptive processes is in progress. Finally, some numerical examples with available analytical solutions are investigated by the developed scheme.
SYSWELD Forum 2011
(2011)
On 25 and 26 October 2011, 70 national and international experts from research and practice met at the Bauhaus-Universität Weimar for the fourth SYSWELD Forum to exchange views on current developments in numerical simulation in the field of heat treatment and welding. Numerical simulation of welding and heat treatment has advanced considerably in recent years and offers a forward-looking and innovative field of work for engineers.
This paper presents a robust model updating strategy for the system identification of wind turbines. To control the updating parameters and to avoid ill-conditioning, a global sensitivity analysis using the elementary effects method is conducted. The formulation of the objective function is based on Müller-Slany's strategy for multi-criteria functions. As a simulation-based optimization, a simulation adapter was developed to interface the simulation software ANSYS and the locally developed optimization software MOPACK. Model updating is first tested on the beam model of the rotor blade. The deviation between the numerical model and the reference was markedly reduced by the model updating process. The effect of model updating becomes more pronounced in the comparison of the measured and the numerical properties of the wind turbine model: the frequency deviations of the updated model are rather small. The complete comparison, including the free vibration modes via the modal assurance criterion, shows the excellent agreement of the modal parameters of the updated model with those from the measurements. The successful implementation of model validation via model updating demonstrates the applicability and effectiveness of the solution concept.
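A minimal sketch of elementary-effects (Morris-type) screening as used to rank updating parameters, written here independently of ANSYS and MOPACK in a simplified one-at-a-time variant; the parameter ranges, the number of repetitions and the surrogate model function are placeholders, not the wind turbine model of the paper:

import numpy as np

def elementary_effects(model, bounds, r=10, seed=0):
    # For r random base points, perturb one normalized parameter at a time
    # and record the scaled response change (elementary effect).
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    k = len(bounds)
    delta = 0.1                                   # step in normalized [0, 1] parameter space
    effects = np.zeros((r, k))
    for j in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)
        f0 = model(bounds[:, 0] + x * (bounds[:, 1] - bounds[:, 0]))
        for i in range(k):
            xi = x.copy()
            xi[i] += delta
            fi = model(bounds[:, 0] + xi * (bounds[:, 1] - bounds[:, 0]))
            effects[j, i] = (fi - f0) / delta
    mu_star = np.abs(effects).mean(axis=0)        # mean absolute effect: overall importance
    sigma = effects.std(axis=0)                   # spread: nonlinearity / interaction indicator
    return mu_star, sigma

# Usage with a hypothetical surrogate of a natural frequency (placeholder model):
freq = lambda p: 1.6 * p[0] ** 0.5 / p[1]
mu_star, sigma = elementary_effects(freq, bounds=[(0.8, 1.2), (0.9, 1.1)])
print(mu_star, sigma)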
Due to the complex interactions between the ground, the driving machine, the lining tube and the built environment, the accurate assignment of in-situ system parameters for numerical simulation in mechanized tunneling is always subject to tremendous difficulties. However, the more accurate these parameters are, the more applicable the responses gained from computations will be. In particular, if the entire length of the tunnel lining is examined, the appropriate selection of the various ground parameters is decisive for the success of a tunnel project and, more importantly, will prevent potential casualties. In this context, methods of system identification for the adaptation of numerical simulations of ground models are presented. Both deterministic and probabilistic approaches are considered for typical scenarios representing notable variations or changes in the ground model.
SYSBAT - An Application to the Building Production Based on Computer Supported Cooperative Work
(2003)
Our proposed solution is to enable the partners of a construction project to share all the technical data produced and handled during the building production process by building a system based on internet technology. The system links distributed databases and allows building partners to remotely access and manipulate specific information. It provides an up-to-date building representation that is enriched and refined throughout the building production process. A recent collaboration with Nemetschek France (a subsidiary of Nemetschek AG, an AEC CAD software leader) focuses on a building product repository available in a web context. The aim is to help the actors of a building project to choose a technical solution that fits their professional needs, and to maintain our information system with up-to-date information. It starts with the possibility of building online product catalogs, in order to link Allplan CAD entities with building technical features. This paper presents the conceptual approaches on which our information system is built. Starting from a general organization diagram, we focus on the product and description branches of construction works (including the latest IFC model specifications). Our aim is to add decision support to the selection process for construction works. To do so, we consider each actor's role in the system and the pieces of information each one needs to achieve a given task.
The subject of this talk is the problem of surface design based upon a mesh that may contain both triangular and quadrangular domains. We investigate the cases in which such a combined mesh proves preferable to a pure triangulation for bivariate data interpolation. First we describe a modification of the well-known flipping algorithm that constructs a locally optimal combined mesh with a predefined quality criterion. Then we introduce two quality measures for triangular and quadrangular domains and present the results of a computational experiment that compares the integral interpolation errors and the errors in gradients caused by the piecewise surface models produced by the flipping algorithm with the introduced quality measures. The experiment shows that triangular meshes with the Delaunay quality measure provide better interpolation accuracy only if the interpolated function is strictly convex, while a saddle-shaped function is better interpolated by bilinear patches within a combined mesh. For a randomly shaped function, combined meshes demonstrate smaller error values and better stability in comparison with pure triangulations. Finally, we consider other means of mesh improvement, such as excluding >bad< points from the input set for the mesh generating procedure. Because the function values at these points should not be lost, some linear or bilinear patches are replaced by nonlinear patches that pass through the excluded points.
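A schematic sketch of the local flipping test, restricted to the purely triangular case; the quality measure here is the minimum interior angle, a common but simplified criterion, and neither the combined triangle/quad handling nor the paper's own measures are reproduced. A full pass would additionally check that the quadrilateral formed by the two triangles is convex and would iterate over all interior edges until no flip improves the local quality.

import numpy as np

def min_angle(tri, pts):
    # Quality of a triangle: its smallest interior angle (radians).
    a, b, c = (pts[i] for i in tri)
    angles = []
    for p, q, r in ((a, b, c), (b, c, a), (c, a, b)):
        u, v = q - p, r - p
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles.append(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return min(angles)

def flip_improves(edge, tri1, tri2, pts):
    # Edge (i, j) shared by tri1 = (i, j, k) and tri2 = (j, i, l).
    i, j = edge
    k = next(v for v in tri1 if v not in edge)
    l = next(v for v in tri2 if v not in edge)
    before = min(min_angle((i, j, k), pts), min_angle((j, i, l), pts))
    after = min(min_angle((i, l, k), pts), min_angle((j, k, l), pts))
    return after > before       # flip only if the worst local element improves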
The design of mobile IT systems, especially the design of wearable computer systems, is a complex task that requires computer science knowledge, such as that related to hardware configuration and software development, in addition to knowledge of the domain in which the system is intended to be used. Particularly in the AEC sector, it is necessary that the support from mobile information technology fit the work situation at hand. Ideally, the domain expert alone can adjust the wearable computer system to achieve this fit without having to consult IT experts. In this paper, we describe a model that helps in transferring existing design knowledge from non-AEC domains to new projects in the construction area. The base for this is a model and a methodology that describes the usage scenarios of said computer systems in an application-neutral and domain-independent way. Thus, the actual design information and experience will be transferable between different applications and domains.
Structural engineering projects are increasingly organized in networked cooperations due to continually increasing competitive pressure and the high degree of complexity involved in performing concurrent design activities. Software that intends to support such collaborative structural design processes faces enormous requirements. In the course of our joint research work, we analyzed the pros and cons of applying both the peer-to-peer (University of Bonn) and the multiagent architecture style (University of Bochum) within the field of collaborative structural design. In this paper, we join the benefits of both architecture styles in an integrated conceptual approach. We demonstrate the added value of the integrated multiagent–peer-to-peer approach by means of an example scenario in which several structural engineers cooperatively design the basic structural elements of an arched bridge using heterogeneous CAD systems.
In a superelliptic shell joined to a circular cylinder, bending stresses are absent when it is subjected to uniform pressure. Some geometrical characteristics have been found. Expressions for determining the stresses in the shell crest (at the singular point of plane type) are suggested. The problem of the theoretical critical buckling load of an elongated shell supported by frames is studied. The critical buckling load for two shells with different specifications was found experimentally.
The particular aggressiveness of highly concentrated magnesium sulfate solutions acting on concrete has been known for many decades. In addition to the sulfate, the magnesium also attacks the hardened cement paste. At high solution concentrations, the magnesium attack even plays a dominant role compared with the sulfate attack. Magnesium contents below 300 mg/l in groundwater, however, have so far been regarded as non-aggressive. In field-exposure and laboratory tests it was nevertheless found that even at practically relevant magnesium (<300 mg/l) and sulfate contents (1,500 mg/l), the magnesium led to a marked intensification of the sulfate attack at low temperatures. This intensification occurred in mortars and concretes in which an increased sulfate resistance was to be achieved by partially replacing the cement with 20 % fly ash added to a CEM II/A-LL, in accordance with the fly ash provisions of EN 206-1/DIN 1045-2.
With a partial cement replacement by 30 % fly ash, a marked improvement of the sulfate resistance could be achieved even in magnesium-containing sulfate solutions. Mortars with HS cement as the binder showed no damage at all. The damage was caused by a combination of several influences. On the one hand, the sulfate resistance of the cement-fly ash system was weakened by the insufficient reaction of the fly ash as a consequence of the low storage temperature. On the other hand, the action of the magnesium in the surface zone presumably destabilized the C-S-H phases, which promoted thaumasite formation at this location. In addition, the consumption of portlandite and the drop in pH in the surface zone hindered the pozzolanic reaction of the fly ash.
This contribution will be freewheeling in the domain of signal, image and surface processing and will touch briefly upon some topics that have been close to the heart of the people in our research group. A lot of the research carried out world-wide in this domain over the last 20 years deals with multiresolution. Multiresolution allows a function (in the broadest sense) to be represented at different levels of detail. This has been applied not only to signals and images but also to the solution of all kinds of complex numerical problems. Since wavelets came into play in the 1980s, this idea has been applied and generalized by many researchers. Therefore we use it as the central idea throughout this text. Wavelets, subdivision and hierarchical bases are the appropriate tools to obtain these multiresolution effects. We introduce some of the concepts in a rather informal way and show that the same concepts work in one, two and three dimensions. The applications in the three cases are, however, quite different, and thus one wants to achieve very different goals when dealing with signals, images or surfaces. Because completeness in our treatment is impossible, we have chosen to describe two case studies after introducing some concepts of signal processing. These case studies are still the subject of current research. The first one attempts to solve a problem in image processing: how to approximate an edge in an image efficiently by subdivision. The method is based on normal offsets. The second is the use of Powell-Sabin splines to give a smooth multiresolution representation of a surface. In this context we also illustrate the general method of constructing a spline wavelet basis using a lifting scheme.
Polymer modification of mortar and concrete is a widely used technique for improving their durability properties. Hitherto, the main application fields of such materials have been the repair and restoration of buildings. However, due to constantly increasing service-life requirements and cost efficiency, polymer-modified concrete (PCC) is also used for construction purposes. Therefore, there is a demand for studying the mechanical properties of PCC and the substantive differences compared to conventional concrete (CC). It is important to investigate whether all the assumed hypotheses and existing analytical formulations for CC are also valid for PCC. In the present study, analytical models available in the literature are evaluated. These models are used for estimating mechanical properties of concrete. The property investigated in this study is the modulus of elasticity, which is estimated from the value of the compressive strength. An existing database was extended and adapted for polymer-modified concrete mixtures along with their experimentally measured mechanical properties. Based on the indexed data, a comparison between model predictions and experiments was conducted by calculating forecast errors.
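A minimal sketch of the model-versus-experiment comparison step; the ACI-type relation E ≈ 4700·sqrt(f_c) (in MPa) is used here only as one example of such an analytical model, and the data points are invented placeholders, not the extended database of the paper:

import numpy as np

# (compressive strength f_c [MPa], measured modulus E [GPa]) -- placeholder data
data = np.array([[35.0, 27.5], [42.0, 29.8], [50.0, 31.0], [28.0, 24.1]])

def e_modulus_from_strength(fc_mpa):
    # Example analytical model: modulus of elasticity [GPa] estimated from compressive strength.
    return 4.7 * np.sqrt(fc_mpa)

pred = e_modulus_from_strength(data[:, 0])
meas = data[:, 1]
rel_err = (pred - meas) / meas
print("mean relative error:", rel_err.mean())
print("root-mean-square error [GPa]:", np.sqrt(((pred - meas) ** 2).mean()))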
Information science researchers and developers have spent many years addressing the problem of retrieving exactly the information needed and using it for analysis purposes. In information-seeking dialogues, the user, e.g. a construction project manager or supplier, often asks questions about specific aspects of the tasks they want to perform. But most of the time it is difficult for software systems to unambiguously understand their overall intentions. The existence of information tunnels (Tannenbaum 2002) aggravates this phenomenon. This study includes a detailed case study of the material management process in the construction industry. Based on this case study, the structure of a formal user model for information retrieval in construction management is proposed. This prototype user model will be incorporated into the system design for construction information management and retrieval. This information retrieval system is a user-centered product based on the development of a user-configurable visitor mechanism for managing and retrieving project information without worrying too much about the underlying data structure of the database system. An executable UML model combined with an OODB is used to reduce the ambiguity in the user's intentions and to achieve user satisfaction.
With the advances in computer technology, structural optimization has become a prominent field in structural engineering. In this study an unconventional approach to structural optimization is presented which utilizes the Energy method with Integral Material behaviour (EIM), based on Lagrange's principle of minimum potential energy. The equilibrium condition with the EIM, as an alternative method for nonlinear analysis, is secured through minimization of the potential energy as an optimization problem. Imposing this problem as an additional constraint on a higher cost function of a structural property, a bilevel programming problem is formulated. A nested strategy is used to solve the bilevel problem, treating the energy and the upper objective function as separate optimization problems. Exploiting the convexity of the potential energy, gradient-based algorithms are employed for its minimization, while the upper cost function is minimized using gradient-free algorithms, since its properties are not known in advance. Two practical examples are considered in order to demonstrate the efficiency of the method. The first is a sizing problem of an I-shaped steel section within an encased composite cross-section, utilizing the material nonlinearity. The second is a discrete shape optimization of a steel truss bridge, which is compared to a previous study based on the Finite Element Method.
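A schematic sketch of the nested bilevel strategy on a toy problem (one design variable, two displacement degrees of freedom), using a gradient-based minimizer for the convex inner energy problem and a gradient-free bounded search for the outer objective; the quadratic energy, the loads and the cost function are deliberately simplified stand-ins, not the EIM formulation itself:

import numpy as np
from scipy.optimize import minimize, minimize_scalar

F = np.array([10.0, 4.0])                        # external loads (placeholder)

def potential_energy(u, area):
    # Toy quadratic potential 1/2 u^T K(area) u - F^T u, standing in for the EIM energy.
    K = area * np.array([[ 3.0, -1.0],
                         [-1.0,  2.0]])
    return 0.5 * u @ K @ u - F @ u

def equilibrium_displacements(area):
    # Inner problem: minimize the convex potential energy -> equilibrium state.
    res = minimize(potential_energy, x0=np.zeros(2), args=(area,), method="BFGS")
    return res.x

def upper_cost(area):
    # Outer problem: e.g. structural mass plus a penalty on excessive displacement.
    u = equilibrium_displacements(area)
    return area + 50.0 * max(0.0, np.linalg.norm(u) - 4.0)

best = minimize_scalar(upper_cost, bounds=(0.5, 10.0), method="bounded")  # gradient-free outer search
print("optimal design variable:", best.x)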
The planning of projects in building engineering is a complex process characterized by a dynamic composition and many modifications during the definition and execution of processes. For computer-aided and network-based cooperation, a formal description of the planning process is necessary. In the research project “Relational Process Modelling in Cooperative Building Planning” a process model is described by three parts: an organizational structure with participants, a building structure with states and a process structure with activities. This research project is part of the priority program 1103 “Network-Based Cooperative Planning Processes in Structural Engineering” promoted by the German Research Foundation (DFG). Planning processes in civil engineering can be described by workflow graphs. The process structure describes the logical planning process and can be formally defined as a bipartite graph consisting of activities, transitions and relationships between activities and transitions. In order to minimize errors during the execution of a planning process, a consistent and structurally correct process model must be guaranteed. This contribution presents the concept and the algorithms for checking the consistency and the correctness of the process structure.
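A minimal sketch of one of the structural checks implied above: verifying that the process structure is bipartite, i.e. that every relationship connects an activity to a transition and never two nodes of the same kind. The node names and data layout are illustrative, not those of the research project.

def check_bipartite_structure(activities, transitions, relations):
    # activities, transitions: sets of node ids; relations: iterable of (source, target) pairs.
    errors = []
    if activities & transitions:
        errors.append(f"nodes in both sets: {activities & transitions}")
    for src, dst in relations:
        if src not in activities | transitions or dst not in activities | transitions:
            errors.append(f"unknown node in relation ({src}, {dst})")
        elif (src in activities) == (dst in activities):
            errors.append(f"relation ({src}, {dst}) connects two nodes of the same kind")
    return errors

# Example: a tiny process structure
acts = {"design slab", "check fire safety"}
trans = {"slab released"}
rels = [("design slab", "slab released"), ("slab released", "check fire safety")]
print(check_bipartite_structure(acts, trans, rels) or "structure is bipartite")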