Nanostructured materials are extensively applied in many fields of materials science for new industrial applications, particularly in the automotive and aerospace industries, due to their exceptional physical and mechanical properties. Experimental testing of nanomaterials is expensive, time-consuming, challenging and sometimes unfeasible. Therefore, computational simulations have been employed as an alternative method to predict macroscopic material properties. The behavior of polymeric nanocomposites (PNCs) is highly complex.
The origins of macroscopic material properties reside in the properties and interactions taking place on finer scales. It is therefore essential to use a multiscale modeling strategy to properly account for all the length and time scales associated with these material systems, which span many orders of magnitude. Numerous multiscale models of PNCs have been established; however, most of them connect only two scales, and only a few multiscale models for PNCs bridge four length scales (nano-, micro-, meso- and macro-scale). In addition, nanomaterials are stochastic in nature, and the prediction of macroscopic mechanical properties is influenced by many factors, such as fine-scale features. Mechanical properties predicted by traditional approaches deviate significantly from values measured in experiments because the uncertainty of material features is neglected. This discrepancy indicates that the effective macroscopic properties of materials are highly sensitive to various sources of uncertainty, such as loading and boundary conditions and material characteristics, while very few stochastic multiscale models for PNCs have been developed. Therefore, it is essential to construct PNC models within the framework of stochastic modeling and to quantify the stochastic effect of the input parameters on the macroscopic mechanical properties of these materials.
This study aims to develop computational models at four length scales (nano-, micro-, meso- and macro-scale) and hierarchical upscaling approaches bridging length scales from nano to macro. A framework for uncertainty quantification (UQ) applied to predicting the mechanical properties of the PNCs as a function of material features at different scales is studied. Sensitivity and uncertainty analysis are of great help in quantifying the effect of input parameters, considering both main and interaction effects, on the mechanical properties of the PNCs. To achieve this major goal, the following tasks are carried out:
At the nano-scale, molecular dynamics (MD) simulations were used to investigate the deformation mechanism of glassy amorphous polyethylene (PE) in dependence of temperature and strain rate. Steered molecular dynamics (SMD) simulations were also employed to investigate the interfacial characteristics of the PNCs.
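At the core of any MD workflow is a time-stepping integrator. As a minimal illustration (not the PE force field or the SMD protocol of the study), the sketch below implements the velocity Verlet scheme commonly used in MD codes, applied to a toy one-dimensional harmonic potential; all parameter values are hypothetical.

```python
import numpy as np

def velocity_verlet(x, v, force, m=1.0, dt=0.01, steps=1000):
    """Advance position x and velocity v with the velocity Verlet scheme."""
    f = force(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * f / m * dt**2   # position update
        f_new = force(x)                        # force at the new position
        v = v + 0.5 * (f + f_new) / m * dt      # velocity update with averaged force
        f = f_new
    return x, v

k = 1.0                          # spring constant of the toy potential U = k x^2 / 2
force = lambda x: -k * x
x, v = velocity_verlet(1.0, 0.0, force)
energy = 0.5 * v**2 + 0.5 * k * x**2   # should stay near the initial value 0.5
```

The near-conservation of the total energy over many steps is the usual sanity check that the integrator is implemented correctly.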
At the micro-scale, we developed an atomistic-based continuum model represented by a representative volume element (RVE), in which the SWNT's properties and the SWNT/polymer interphase are modeled at the nano-scale, while the surrounding polymer matrix is modeled with solid elements. A two-parameter model was then employed at the meso-scale. A hierarchical multiscale approach has been developed to obtain the structure-property relations at one length scale and transfer their effect to the higher length scales. In particular, we homogenized the RVE into an equivalent fiber. The equivalent fiber was then employed in a micromechanical analysis (the Mori-Tanaka model) to predict the effective macroscopic properties of the PNC. Furthermore, an averaging homogenization process was also used to obtain the effective stiffness of the PNC at the meso-scale.
Stochastic modeling and uncertainty quantification consist of the following ingredients:
- Simple random sampling, Latin hypercube sampling and Sobol' quasi-random sequences are employed to generate independent sample data, and Iman and Conover's method (inducing correlation in Latin hypercube sampling) is employed to generate dependent sample data.
- Surrogate models, such as polynomial regression, moving least squares (MLS), a hybrid method combining polynomial regression and MLS, Kriging regression, and penalized spline regression, are employed as approximations of the mechanical model. The advantages of surrogate models are their high computational efficiency and robustness, as they can be constructed from a limited amount of available data.
- Global sensitivity analysis (SA) methods, such as variance-based methods for models with independent and dependent input parameters, Fourier-based techniques for performing variance-based methods, and partial derivatives and elementary effects in the context of local SA, are used to quantify the effects of input parameters and their interactions on the mechanical properties of the PNCs. A bootstrap technique is used to assess the robustness of the global SA methods with respect to their performance.
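The sampling schemes named in the first item can be sketched with NumPy and SciPy's `scipy.stats.qmc` module; the Iman-Conover rank-correlation step is not shown, and the sample size and dimension below are arbitrary.

```python
import numpy as np
from scipy.stats import qmc

n, d = 8, 2
rng = np.random.default_rng(0)

srs = rng.random((n, d))                         # simple random sampling
lhs = qmc.LatinHypercube(d=d, seed=0).random(n)  # Latin hypercube sampling
sob = qmc.Sobol(d=d, seed=0).random_base2(3)     # Sobol' sequence, 2^3 points

# defining LHS property: each column places exactly one point per 1/n stratum
strata = np.sort(np.floor(lhs * n), axis=0)
```

The stratification check at the end distinguishes a Latin hypercube design from plain random sampling, which may leave strata empty.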
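A minimal polynomial-regression surrogate of the kind named in the second item can be sketched as follows; the "expensive" model is a stand-in function, and the polynomial degree and design points are arbitrary choices.

```python
import numpy as np

def expensive_model(x):
    """Stand-in for a costly mechanical simulation."""
    return np.sin(x) + 0.1 * x**2

# limited design-of-experiments data, as in the abstract
x_train = np.linspace(0.0, 3.0, 15)
y_train = expensive_model(x_train)

# degree-5 polynomial regression surrogate
coeffs = np.polyfit(x_train, y_train, deg=5)
surrogate = np.poly1d(coeffs)

# evaluate the cheap surrogate at an unseen point and check its error
x_new = 1.7
err = abs(surrogate(x_new) - expensive_model(x_new))
```

Once fitted, the surrogate is evaluated thousands of times at negligible cost, which is what makes the sampling-based SA and UQ loops above tractable.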
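One of the variance-based methods for independent inputs can be sketched with the Saltelli pick-freeze estimator of first-order Sobol' indices; the test model below is an additive toy function with known indices S1 = 0.2 and S2 = 0.8, not a PNC model.

```python
import numpy as np

def sobol_first_order(model, d, n=2**14, seed=0):
    """Monte Carlo estimate of first-order Sobol' indices (pick-freeze scheme)."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))                  # base sample
    B = rng.random((n, d))                  # independent resample
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))  # total output variance
    S = np.empty(d)
    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]                  # freeze all inputs except the i-th
        S[i] = np.mean(fB * (model(AB) - fA)) / var
    return S

# additive toy model: Var(Y) = (1 + 4)/12, so S1 = 1/5 and S2 = 4/5
model = lambda X: X[:, 0] + 2.0 * X[:, 1]
S = sobol_first_order(model, d=2)
```

For models with interactions, the same sample matrices also yield total-order indices, which is why this scheme is popular for ranking input parameters.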
In addition, the probability distributions of the mechanical properties are determined using the probability plot method. Upper and lower bounds of the predicted Young's modulus according to 95 % prediction intervals are provided.
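The probability-plot check and the 95 % prediction interval can be sketched as follows; the Young's modulus sample is synthetic, drawn from an assumed normal distribution rather than taken from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
E = rng.normal(loc=2.1, scale=0.15, size=30)  # synthetic Young's moduli, GPa

# normal probability plot: correlation r close to 1 supports normality
(_, _), (_, _, r) = stats.probplot(E, dist="norm")

# 95 % prediction interval for a single future observation
n, mean, s = E.size, E.mean(), E.std(ddof=1)
half = stats.t.ppf(0.975, df=n - 1) * s * np.sqrt(1 + 1 / n)
lower, upper = mean - half, mean + half
```

Note the factor sqrt(1 + 1/n): a prediction interval for one future measurement is wider than a confidence interval for the mean.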
The above-mentioned methods address the behaviour of intact materials. Novel numerical methods, such as a node-based smoothed extended finite element method (NS-XFEM) and an edge-based smoothed phantom node method (ES-Phantom node), were developed for fracture problems. These methods can be used to account for cracks at the macro-scale in future work. The predicted mechanical properties were validated and verified; they show good agreement with previous experimental and simulation results.
Polymeric clay nanocomposites are a new class of materials which have recently become the centre of attention due to their superior mechanical and physical properties. Several studies have been performed on the mechanical characterisation of these nanocomposites; however, most of these studies have neglected the effect of the interfacial region between the clay and the matrix, despite its significant influence on the mechanical performance of the nanocomposites.
There are different analytical methods to calculate the overall elastic material properties of composites. In this study we use the Mori-Tanaka method to determine the overall stiffness of the composite for the simple inclusion geometries of a cylinder and a sphere. Furthermore, the effect of an interphase layer on the overall properties of the composite is calculated. Here, we intend to obtain bounds for the effective mechanical properties to compare with the analytical results; hence, we use linear displacement boundary conditions (LD) and uniform traction boundary conditions (UT), respectively. Finally, the analytical results are compared with the numerical results and are found to be in good agreement.
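The LD and UT boundary conditions mentioned above yield upper and lower bounds on the apparent stiffness. In the simplest one-dimensional setting these reduce to the classical Voigt and Reuss estimates, sketched below with hypothetical moduli (not the clay/epoxy values of the study).

```python
def voigt_reuss_bounds(Em, Ei, f):
    """Upper (Voigt, uniform strain) and lower (Reuss, uniform stress) bounds
    on the effective Young's modulus of a two-phase composite, inclusion
    volume fraction f."""
    upper = (1 - f) * Em + f * Ei            # Voigt: LD-type kinematic bound
    lower = 1.0 / ((1 - f) / Em + f / Ei)    # Reuss: UT-type static bound
    return lower, upper

# toy moduli in GPa: polymer matrix with a stiff clay-like inclusion phase
lower, upper = voigt_reuss_bounds(Em=3.0, Ei=180.0, f=0.05)
```

Any admissible homogenization estimate, including the Mori-Tanaka result, must fall between these two bounds, which makes them a convenient sanity check for numerical RVE computations.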
The next focus of this dissertation is a computational approach with a hierarchical multiscale method on the mesoscopic level. In other words, in this study we use stochastic analysis and a computational homogenization method to analyse the effect of the thickness and stiffness of the interfacial region on the overall elastic properties of the clay/epoxy nanocomposites. The results show that an increase in interphase thickness reduces the stiffness of the clay/epoxy nanocomposites, and this decrease becomes significant at higher clay contents. The results of the sensitivity analysis show that the stiffness of the interphase layer has the more significant effect on the final stiffness of the nanocomposites. We also validate the results against available experimental results from the literature, which show good agreement.
This Ph.D. thesis addresses the fields of materiality, object culture and physical computing. Through an intensive theoretical examination of the topic of materiality, the author recognizes the importance of the sense of touch and of object surfaces for the everyday work of the practising designer, and proclaims a turn of design practice towards the potentials of materiality and its significance for the actors involved. The task of the practice-based research is to design an inclusive optimization method for product design with which design developments can be optimized by means of verifiable usage data. Based on the generation of user maps, the tactile pilot method yielded insights into biometric values, individual body sizes and different handling principles.
What is nowadays called (classic) Clifford analysis consists in the establishment of a function theory for functions belonging to the kernel of the Dirac operator. While such functions can very well describe problems of a particle with internal SU(2) symmetries, higher-order symmetries are beyond this theory. Although many modifications (such as Yang-Mills theory) were suggested over the years, they could not address the principal problem: the need for an n-fold factorization of the d'Alembert operator. In this paper we present the basic tools of a fractional function theory in higher dimensions for the transport operator (alpha = 1/2), by means of a fractional correspondence to the Weyl relations via fractional Riemann-Liouville derivatives. A Fischer decomposition, fractional Euler and Gamma operators, a monogenic projection, and basic fractional homogeneous powers are constructed.
Selection and Scaling of Ground Motion Records for Seismic Analysis Using an Optimization Algorithm
(2015)
Nonlinear time history analysis and seismic performance-based methods require a set of scaled ground motions. The conventional procedure of ground motion selection is based on matching the motion properties, e.g. magnitude, amplitude, fault distance, and fault mechanism; the seismic target spectrum is only used in the scaling process following the random selection. Therefore, the aim of this paper is to present a procedure for selecting sets of ground motions from a database of ground motions. The selection procedure is based on solving an optimization problem using Dijkstra's algorithm to match the selected set of ground motions to a target response spectrum. The selection and scaling procedure for optimized sets of ground motions is presented by examining analyses of nonlinear single-degree-of-freedom systems.
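The Dijkstra-based set selection itself is not reproduced here, but the per-record scaling step can be sketched as a least-squares fit of a record's response spectrum to the target spectrum; the spectral values below are invented for illustration.

```python
import numpy as np

def optimal_scale_factor(sa_record, sa_target):
    """Scale factor s minimizing || s * Sa_record - Sa_target ||^2,
    with both spectra sampled at the same periods."""
    sa_record = np.asarray(sa_record, float)
    sa_target = np.asarray(sa_target, float)
    return float(sa_record @ sa_target / (sa_record @ sa_record))

# toy spectral accelerations (in g) at five common periods
sa_target = np.array([0.9, 1.2, 1.0, 0.6, 0.3])
sa_record = np.array([0.45, 0.60, 0.50, 0.30, 0.15])  # exactly half the target
s = optimal_scale_factor(sa_record, sa_target)
```

In a full procedure this factor would be computed for every candidate record, and the optimization would then search for the set whose scaled mean spectrum best matches the target.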
Design experience with arch dams shows that shape optimization has significant practical value: it can make full use of the material characteristics and reduce construction costs. Suitable variables need to be chosen to formulate the objective function, e.g. minimizing the total volume of the arch dam. Additionally, a series of constraints is derived, and a reasonable and convenient penalty function is formed which can easily enforce the characteristics of the constraints and of the optimal design. As the optimization method, a genetic algorithm is adopted to perform a global search, while ANSYS is used for the mechanical analysis under coupled thermal and hydraulic loads. One of the constraints on the newly designed dam is to fulfil requirements on structural safety. Therefore, a reliability analysis is applied to offer good decision support for matters concerning predictions of both the safety and the service life of the arch dam. In this way, the key factors that significantly influence the stability and safety of the arch dam can be identified, providing a good way to take preventive measures to prolong the service life of an arch dam and enhance the safety of the structure.
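A genetic algorithm minimizing a volume objective under a penalty function can be sketched as follows. The two design variables, the stress stand-in and all GA settings are hypothetical and far simpler than an actual arch dam parameterization coupled to ANSYS.

```python
import numpy as np

def genetic_minimize(obj, bounds, pop=40, gens=80, seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and elitism."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    X = rng.uniform(lo, hi, size=(pop, len(bounds)))
    best_x, best_f = None, np.inf
    for _ in range(gens):
        fit = np.apply_along_axis(obj, 1, X)
        i = int(np.argmin(fit))
        if fit[i] < best_f:                                   # track best ever
            best_x, best_f = X[i].copy(), float(fit[i])
        a, b = rng.integers(pop, size=(2, pop))
        parents = X[np.where(fit[a] < fit[b], a, b)]          # tournament selection
        w = rng.random(X.shape)
        X = w * parents + (1 - w) * parents[rng.permutation(pop)]  # blend crossover
        mutate = rng.random(X.shape) < 0.1
        X = np.clip(X + mutate * rng.normal(0.0, 0.05, X.shape) * (hi - lo), lo, hi)
        X[0] = best_x                                          # elitism
    return best_x, best_f

# toy "volume" objective with a quadratic penalty for violating a stress limit
def volume(x):
    t, h = x                       # hypothetical thickness and height variables
    stress = 10.0 / (t * h)        # stand-in stress measure, limit 5.0
    return t * h + 1e3 * max(0.0, stress - 5.0) ** 2

x_opt, f_opt = genetic_minimize(volume, bounds=[(0.1, 3.0), (0.1, 3.0)])
```

For this toy problem the constrained optimum is t*h = 2, so the best objective value found should approach 2.0.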
Schwerpunkt Textil
(2015)
Textile: the title of the focus section of this issue of the ZMK can denote a thing, a material or a property. As an adjective, however, "textile" not only qualifies certain arts more closely (the notion of textile art has existed at least since Gottfried Semper), encompassing weaving, embroidery, braiding, knotting, knitting, crocheting, warp-knitting and much more (according to Semper, even the beginnings of architecture), but recently also media. Talk of "textile media" evidently aims at a different aspect of the concept of media than the one formed by the triad of storing, transmitting and processing: namely at the material, not the function. On the one hand, this concept of media can simply be understood, as is common in art history, in the sense of the materiality of an image carrier. On the other hand, by emphasizing materiality instead of functionality, talk of "textile media" directs the focus to a specific mediality of the textile and, beyond that, to a mediality of material as such. Precisely herein lies the reason for the enormous boom of the textile that has been observable for some years in fields as diverse as art, art history, and the anthropology of technology and society. In the art world, a series of exhibitions attests to this boom, for example "Kunst & Textil" in Wolfsburg, "Textiles: Open Letter" in Mönchengladbach, "Decorum. Tapis et Tapisseries d'Artistes" in Paris, "Soft Pictures" in Turin, and "To Open Eyes. Kunst und Textil vom Bauhaus bis heute" in Bielefeld. ...
Schwerpunkt Sendung
(2015)
The Zeitschrift für Medien- und Kulturforschung, too, reaches its readers as a Sendung (a transmission, a mail item on its way to delivery). Whoever reads it is thus involved with transmission, and that alone is a reason to engage with the Sendung. And a weighty one: the phenomenon and the concept of Sendung, as is intuitively evident to every postal customer, radio listener, churchgoer and reader of Goethe, have enormous relevance for media studies. As an empirical, cultural-technical matter, a Sendung (this issue of the ZMK, for instance) can be held physically in one's hands (or not, if the mail item fails to arrive), accepted or refused; it can be produced technically, say as a product of printing and editing, posted and collected, administered and organized, managed commercially. But a Sendung can also be felt, sensed and noticed, fulfilled and missed; it can move and touch, or leave untouched. Quite really it affects and attaches, aligns and itself sends. Taken as a media-philosophical concept, the Sendung consequently possesses the potential to unfold complex basic assumptions of media theory and, at the same time, to bundle and frame them reductively. It fundamentally connects and crosses what has been carefully separated conceptually, for example the sacred and the profane, the material and the immaterial, the active and the passive. It is thus a genuinely media-theoretical guiding concept through which the entire breadth of what a medium can be unfolds, from religion to mass media, from politics to the postman, from infrastructure to rapture. Moreover, in all its materiality it generates manifold paradoxical and reflexive trajectories: to think the Sendung at all therefore means, not least, to send and to be sent oneself.
The paper introduces a systematic construction management approach that supports the expansion of a specified construction process, both automatically and semi-automatically. Throughout the whole design process, many requirements must be taken into account in order to fulfil the demands defined by clients. In translating those demands into a design concept and further into the execution plan, constraints such as site conditions, building codes and the legal framework must be considered. However, the complete information needed to make a sound decision is not yet available in the early phase, and decisions are traditionally taken based on experience and assumptions. Due to the vast number of available solutions, particularly in building projects, it is necessary to make those decisions traceable. This is important in order to be able to reconstruct the considerations and assumptions taken should the project's objectives change in the future. The research is carried out by means of building information modelling, where rules derived from the standard logic of construction management knowledge are applied. This knowledge comprises a comprehensive interaction among the bidding process, cost estimation, construction site preparation and specific project logistics, which are usually still considered separately. By means of these rules, decisions regarding prefabrication and in-situ implementation can be justified, and modifications depending on the information available at the current design stage remain consistently traceable.
In construction engineering, a schedule’s input data, which is usually not exactly known in the planning phase, is considered deterministic when generating the schedule. As a result, construction schedules become unreliable and deadlines are often not met. While the optimization of construction schedules with respect to costs and makespan has been a matter of research in the past decades, the optimization of the robustness of construction schedules has received little attention. In this paper, the effects of uncertainties inherent to the input data of construction schedules are discussed. Possibilities are investigated to improve the reliability of construction schedules by considering alternative processes for certain tasks and by identifying the combination of processes generating the most robust schedule with respect to the makespan of a construction project.
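The robustness comparison described above can be sketched with a Monte Carlo simulation of the makespan under two alternative processes for one task; all durations and the deadline are invented, with triangular distributions standing in for expert duration estimates.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
deadline = 30.0  # days

# duration of the remaining fixed task sequence (min, most likely, max), days
fixed = rng.triangular(18, 20, 24, size=n)

# two alternative processes for one task:
# A is faster on average but highly uncertain, B is slower but reliable
dur_A = rng.triangular(4, 6, 14, size=n)
dur_B = rng.triangular(7, 8, 9, size=n)

# probability of meeting the deadline under each alternative
p_meet_A = np.mean(fixed + dur_A <= deadline)
p_meet_B = np.mean(fixed + dur_B <= deadline)
```

Although both alternatives have the same mean duration here, the low-variance process yields a higher probability of meeting the deadline, which is the sense in which it produces the more robust schedule.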
Restelo Neighbourhood: Expanding the Capital of the Empire with the First Portuguese Urban Planner
(2015)
For decades in Germany, historical research on dictatorial urban design in the first half of the 20th century focused on the National Socialist period. Studies on the urban design practices of other dictatorships remained an exception. This has changed. Meanwhile, the urban production practices of the Mussolini, Stalin, Salazar, Hitler and Franco dictatorships have become the subject of comprehensive research projects. Recently, a research group that studies dictatorial urban design in 20th century Europe has emerged at the Bauhaus-Institut für Geschichte und Theorie der Architektur und der Planung. The group is already able to refer to various research results.
Part of the research group’s self-conception is the assumption that the urban design practices of the named dictatorships can only be properly understood from a European perspective. The dictatorships influenced one another substantially. Furthermore, the specificities of the practices of each dictatorship can only be discerned if one can compare them to those of the other dictatorships. This approach requires strict adherence to the research methods of planning history and urban design theory. Meanwhile, these methods must be opened to include those of general historical studies.
With this symposium, the research group aims to further qualify this European perspective. The aim is to pursue an inventory of the various national historiographies on the topic of “urban design and dictatorship”. This inventory should offer an overview on the general national level of historical research on urban design as well as on the level of particular urban design projects, persons or topics.
The symposium took place in Weimar, November 21-22, 2013. It was organized by Harald Bodenschatz, Piero Sassi and Max Welch Guerra and funded by the DAAD (German Academic Exchange Service).
The theory of regular quaternionic functions of a reduced quaternionic variable is a three-dimensional generalization of complex analysis. The Moisil-Theodorescu system (MTS) is a regularity condition for such functions depending on the radius vector r = ix + jy + kz seen as a reduced quaternionic variable. The analogues of the main theorems of complex analysis for the MTS are established in quaternionic form: the Cauchy theorem, the Cauchy integral formula, Taylor and Laurent series, approximation theorems, and properties of Cauchy-type integrals. The analogues of positive powers (inner spherical monogenics) are investigated: a set of recurrence formulas between the inner spherical monogenics and explicit formulas are established. Some applications of regular functions in elasticity theory and hydrodynamics are given.
We present StarWatch, our application for real-time analysis of radio astronomical data in a virtual environment. Serving as an interface to radio astronomical databases, or applied to live data from radio telescopes, the application supports various data filters measuring the signal-to-noise ratio (SNR), Doppler drift, and the degree of signal localization on the celestial sphere, along with other useful tools for signal extraction and classification. Originally designed for the database of narrow-band signals from the SETI Institute (setilive.org), the application has recently been extended for the detection of wide-band periodic signals, necessary for the search for pulsars. We also address the detection of weak signals possessing arbitrary waveforms and present several data filters suitable for this purpose.
About a quarter (26 %) of total final energy consumption in Germany is attributable to the residential sector, which therefore accounts for a considerable share of the potential energy savings. In view of the European Union's climate protection target of increasing energy efficiency by 20 % compared with 1990, the question arises of which savings potentials actually exist in the residential sector and how they can be quantified. In this work, the influence of the parameters affecting final energy consumption is determined by means of a sensitivity analysis. The results of the sensitivity analysis show that the most influential parameters on final energy consumption are the required indoor temperature, the length of the heating period, the outdoor temperature (degree days) and the number of dwellings. These are variables that cannot be regulated by ordinances. The only parameter that can be regulated and that has a considerable influence on final energy consumption is the efficiency of the systems and appliances for space heating, hot water and cooking (and, to a small extent, the efficiency of the lighting used). To quantify the energy savings potential of the German residential sector with respect to this efficiency, data on the long-term development (1990-2010) of the efficiency of systems and appliances were analysed in this work. Using various figures from the literature and saturation curves, the development of the efficiencies of the systems and appliances by energy source between 1990 and 2010 was determined. The resulting saturation curves make it possible to determine the development of useful energy consumption in the German residential sector.
It was found that the difference between useful energy consumption and final energy consumption decreased by 12 % over the period considered, and that the energy savings potential can vary considerably depending on the energy source (currently by more than 35 percentage points). With regard to the above-mentioned climate protection target, various development scenarios based on the efficiency of the systems and the energy sources are analysed in this work. It becomes clear that the theoretical energy savings potential of the German residential sector with respect to the average efficiency is only between 4 and 15 %. This means that a significant reduction of final energy demand in the residential sector can only take place if other energy saving measures are considered. Based on the results of the sensitivity analysis, recommendations to this end are given.
Over the last decade, building construction technology has developed dramatically, especially with the huge growth of CAD tools that help in modeling buildings, bridges, roads and other construction objects. Often, quality control and dimensional accuracy checks in the factory or on the construction site are based on manual measurements of discrete points. These measured points of the realized object, or a part of it, are compared with the points of the corresponding CAD model to see whether and where the construction element fits the respective CAD model. This process is complicated and difficult even with modern measuring technology, owing to the complicated shape of the components, the large amount of manually acquired measurement data, and the high cost of manually processing the measured values. By using a modern 3D scanner, however, one obtains information about the whole constructed object and can make a complete comparison against the CAD model, which gives an idea of the quality of the object as a whole. In this paper, we present a case study on controlling the quality of measurement during the construction phase of a steel bridge using 3D point cloud technology. Preliminary results show that early detection of mismatches between the real element and the CAD model could save a lot of time, effort and, obviously, expense.
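The scan-versus-CAD comparison can be sketched with a nearest-neighbour query. The "CAD model" here is a flat plate sampled as a point grid and the scan is simulated, with hypothetical noise, defect and tolerance values; a real pipeline would first register the scan to the model.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# "CAD" reference: a flat 1 m x 1 m plate sampled on a 2 mm grid (z = 0)
g = np.arange(0.0, 1.0, 0.002)
gx, gy = np.meshgrid(g, g)
cad = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])

# simulated scan: 5000 points with 2 mm noise and a 10 mm bulge in the centre
xy = rng.random((5000, 2))
z = rng.normal(0.0, 0.002, 5000)
bulge = (xy[:, 0] - 0.5) ** 2 + (xy[:, 1] - 0.5) ** 2 < 0.01
z[bulge] += 0.010                      # simulated fabrication defect
scan = np.column_stack([xy, z])

# deviation of each scan point from the nearest CAD point
dist, _ = cKDTree(cad).query(scan)
flagged = np.mean(dist > 0.005)        # fraction outside a 5 mm tolerance
```

The flagged fraction localizes where the as-built element deviates from the model, which is exactly the early mismatch detection the case study aims at.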
Known as a sophisticated phenomenon in civil engineering, soil-structure interaction has been under deep investigation in the field of geotechnics. At the same time, the advent of powerful computers has led to the development of numerous numerical methods dealing with this phenomenon, resulting in a wide variety of approaches for simulating the behavior of the soil stratum. This survey studies two common approaches to modeling the soil's behavior in a system consisting of a structure with two degrees of freedom, representing a two-storey steel frame structure whose column rests on a pile embedded in sand at laboratory scale. The effect of the soil simulation technique on the dynamic behavior of the structure is of major interest in the study. The modeling approaches employed are the so-called holistic method and the substitution of the soil by respective impedance functions.
For decades in Germany, historical research on dictatorial urban design in the first half of the 20th century focused on the National Socialist period. Studies on the urban design practices of other dictatorships remained an exception. This has changed. Meanwhile, the urban production practices of the Mussolini, Stalin, Salazar, Hitler and Franco dictatorships have become the subject of comprehensive research projects. Recently, a research group that studies dictatorial urban design in 20th century Europe has emerged at the Bauhaus-Institut für Geschichte und Theorie der Architektur und der Planung. The group is already able to refer to various research results.
Part of the research group’s self-conception is the assumption that the urban design practices of the named dictatorships can only be properly understood from a European perspective. The dictatorships influenced one another substantially. Furthermore, the specificities of each dictatorship's practices can only be discerned by comparing them with those of the other dictatorships. This approach requires strict adherence to the research methods of planning history and urban design theory; at the same time, these methods must be opened up to include those of general historical studies.
With this symposium, the research group aims to further qualify this European perspective by compiling an inventory of the various national historiographies on the topic of “urban design and dictatorship”. This inventory should offer an overview both of national historical research on urban design in general and of particular urban design projects, persons and topics.
The symposium took place in Weimar, November 21-22, 2013. It was organized by Harald Bodenschatz, Piero Sassi and Max Welch Guerra and funded by the DAAD (German Academic Exchange Service).
One of the most promising and recent advances in computer-based planning is the transition from classical geometric modeling to building information modeling (BIM). Building information models support the representation, storage, and exchange of various information relevant to construction planning. This information can be used for describing, e.g., geometric/physical properties or costs of a building, for creating construction schedules, or for representing other characteristics of construction projects. Based on this information, plans and specifications as well as reports and presentations of a planned building can be created automatically. A fundamental principle of BIM is object parameterization, which allows specifying geometrical, numerical, algebraic and associative dependencies between objects contained in a building information model. In this paper, existing challenges of parametric modeling using the Industry Foundation Classes (IFC) as a federated model for integrated planning are shown, and open research questions are discussed.
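A core idea here, object parameterization, can be illustrated with a short sketch (hypothetical classes, not the IFC schema or any BIM API): a dependent property updates automatically when the object it is associated with changes.

```python
# Illustrative sketch (NOT the IFC object model): an associative dependency
# between two objects in a parametric model. Editing the wall once updates
# the dependent beam span, mimicking BIM object parameterization.

class Wall:
    def __init__(self, length):
        self.length = length  # geometric property, in metres

class Beam:
    def __init__(self, wall, clearance):
        self.wall = wall          # associative link to another object
        self.clearance = clearance

    @property
    def span(self):
        # algebraic dependency: the span follows the wall length
        return self.wall.length - 2 * self.clearance

wall = Wall(length=6.0)
beam = Beam(wall, clearance=0.25)
print(beam.span)   # 5.5
wall.length = 8.0  # change the wall once ...
print(beam.span)   # 7.5 ... and the dependent beam updates automatically
```

In a real building information model, such geometric, numerical, algebraic and associative dependencies are expressed declaratively in the model rather than in application code.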
This study contributes to the identification of coupled THM constitutive model parameters via back analysis against information-rich experiments. A sampling-based back analysis approach is proposed that comprises both the identification of the model parameters and the assessment of their reliability. The results obtained in the context of buffer elements indicate that sensitive parameter estimates generally obey the normal distribution. Based on the sensitivity of the parameters and the probability distribution of the samples, we can provide confidence intervals for the estimated parameters and thus a qualitative assessment of the identified parameters, which will be used in future work as inputs for prognosis computations of buffer elements. Such elements play an important role, for example, in the design of nuclear waste repositories.
The aim of this study was to optimize the strength development of quaternary cements with 50 % clinker by varying the particle size distribution of the components GGBFS, fly ash and limestone powder.
By balancing the overall particle size distribution of the cement, using unprocessed fly ash and coarse limestone powder in combination with a very fine GGBFS, the water demand of the resulting quaternary cements remained unaltered, while the compressive strength of the cements increased significantly after 7, 28 and 56 days. As expected, the quaternary cement with 30 wt.% of the fine slag exhibited a stronger strength increase (about 18 % after 28 days) than the cements with only 20 wt.% slag (about 10 % after 28 days).
The sizing of simple resonators like guitar strings or laser mirrors is directly connected to the wavelength and represents no complex optimisation problem. This is not the case with liquid-filled acoustic resonators of non-trivial geometries, where several masses and stiffnesses of the structure and the fluid have to fit together. This creates a scenario of many competing and interacting resonances varying in relative strength and frequency when design parameters change. Hence, the resonator design involves a parameter-tuning problem with many local optima. As a solution, evolutionary algorithms (EAs) coupled to a forced-harmonic FE simulation are presented. A new hybrid EA is proposed and compared to two state-of-the-art EAs on selected test problems. The motivating background is the search for better resonators suitable for sonofusion experiments, where extreme states of matter are sought in collapsing cavitation bubbles.
Performing parameter identification prior to numerical simulation is an essential task in geotechnical engineering. However, it has to be kept in mind that the accuracy of the obtained parameters is closely related to the chosen experimental setup, such as the number of sensors as well as their locations. Well-considered sensor positions can increase the quality of the measurements and reduce the number of monitoring points. This paper illustrates this concept by means of a loading device that is used to identify the stiffness and permeability of soft clays. With an initial setup of the measurement devices, the pore water pressure and the vertical displacements are recorded and used to identify the aforementioned parameters. Starting from these identified parameters, the optimal measurement setup is investigated with a method based on global sensitivity analysis. This method yields an optimal sensor layout assuming three sensors for each measured quantity, and the results are discussed.
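The idea of ranking sensor positions by sensitivity can be sketched as follows (a toy one-parameter forward model, not the paper's coupled hydro-mechanical model; all names and numbers are illustrative):

```python
import math

def pore_pressure(depth, k):
    # Toy forward model (illustrative only): a pore pressure reading that
    # decays with depth at a rate controlled by the permeability-like
    # parameter k. NOT the constitutive model used in the paper.
    return math.exp(-k * depth)

candidates = [0.5, 1.0, 2.0, 4.0]  # candidate sensor depths [m]
k0, dk = 1.0, 1e-4                 # nominal parameter value, perturbation

# local (finite-difference) sensitivity of each sensor reading w.r.t. k:
# a sensor whose reading reacts strongly to k is informative for identification
sens = {d: abs(pore_pressure(d, k0 + dk) - pore_pressure(d, k0 - dk)) / (2 * dk)
        for d in candidates}
best = max(sens, key=sens.get)
print(best)  # the most informative candidate depth for identifying k
```

The paper uses a global (variance-based) sensitivity analysis over the whole parameter space rather than this local derivative, but the ranking principle is the same.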
The Laguerre polynomials appear naturally in many branches of pure and applied mathematics and mathematical physics. Debnath introduced the Laguerre transform and derived some of its properties. He also discussed applications to the study of heat conduction and to the oscillations of a very long and heavy chain with variable tension. An explicit boundedness result for some classes of Laguerre integral transforms will be presented.
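For orientation, Debnath's simple Laguerre transform (order zero, weight $e^{-x}$) can be stated as follows; this is the classical definition, not a result specific to the paper:

```latex
\tilde{f}(n) = \int_{0}^{\infty} e^{-x}\, L_n(x)\, f(x)\, \mathrm{d}x,
\qquad n = 0, 1, 2, \dots,
\qquad\text{with inversion}\qquad
f(x) = \sum_{n=0}^{\infty} \tilde{f}(n)\, L_n(x),
```

which rests on the orthonormality $\int_{0}^{\infty} e^{-x} L_n(x) L_m(x)\,\mathrm{d}x = \delta_{nm}$ of the Laguerre polynomials.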
The p-Laplace equation is a nonlinear generalization of the Laplace equation. This generalization is often used as a model problem for special types of nonlinearities. The p-Laplace equation can be seen as a bridge between very general nonlinear equations and the linear Laplace equation. The aim of this paper is to solve the p-Laplace equation for 2 < p < 3 and to find strong solutions. The idea is to apply a hypercomplex integral operator and spatial function theoretic methods to transform the p-Laplace equation into the p-Dirac equation. This equation will be solved iteratively by using a fixed point theorem.
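For reference, the p-Laplace operator referred to above is

```latex
\Delta_p u := \operatorname{div}\!\left( |\nabla u|^{\,p-2}\, \nabla u \right),
\qquad \Delta_p u = 0,
```

which reduces to the linear Laplace equation $\Delta u = 0$ for $p = 2$; for $p \neq 2$ the coefficient $|\nabla u|^{p-2}$ makes the equation quasilinear and degenerate (or singular) where $\nabla u = 0$.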
In the modern working world, it is common to contract out operational tasks. In business this process is called outsourcing, and it is usually limited to work that does not belong to a company's core competencies. When such work is taken over by specialized companies, potentials arise that go far beyond mere cost reduction. The core of this thesis concerns the outsourcing of so-called ancillary activities in the context of building construction. The labor-law situation, feasibility and usefulness of such outsourcing are analyzed.
Labor law in the construction industry is highly complex and characterized by rigid regulations. The collective agreements declared generally binding commonly apply to all companies that predominantly perform construction activities. The collective-bargaining classification becomes problematic for companies that take over ancillary activities and do not usually fall within the scope of these agreements. When ancillary activities are awarded by means of contracts for work or services, it must be clarified precisely at what point unlawful temporary agency work (Arbeitnehmerüberlassung) begins.
This thesis examines the extent to which outsourcing ancillary construction activities through contracts for work or services runs up against the limits of labor law. In addition, the activities are classified under collective-bargaining law.
In essayistic form, the text follows a walk through the political center of Brasília, Brazil. The focus is on the design of the ground: How is the "drawing-board" capital designed in the horizontal plane? What do the representative squares of a city look like that was built primarily for cars? The investigative gaze rests on the experienced current state and reflects it associatively against results of research conducted in Germany. "Mächtiger Boden" ("Powerful Ground") was written as a satellite to the author's current research during a stay in Brazil.
In this study, an application of evolutionary multi-objective optimization algorithms to the optimization of sandwich structures is presented. The solution strategy is the Elitist Non-Dominated Sorting Evolution Strategy (ENSES), wherein Evolution Strategies (ES) serve as the Evolutionary Algorithm (EA) within the elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) procedure. Evolutionary algorithms are a suitable approach for multi-objective optimization problems because they are inspired by natural evolution, which is closely linked to Artificial Intelligence (AI) techniques, and elitism has proven to be an important factor in improving evolutionary multi-objective search. In order to evaluate the performance of ENSES, well-known study cases of sandwich structures are reconsidered. For Case 1, the goals of the multi-objective optimization are the minimization of the deflection and the weight of the sandwich structure; the length and the core and skin thicknesses are the design variables. For Case 2, the objective functions are the fabrication cost, the beam weight and the end deflection of the sandwich structure; there are four design variables, i.e., the weld height, the weld length, the beam depth and the beam width. Numerical results are presented in terms of Pareto-optimal solutions for both evaluated cases.
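The elitist non-dominated sorting at the core of NSGA-II-type methods rests on Pareto dominance, which can be sketched as follows (a generic illustration, not the authors' ENSES implementation):

```python
# Minimal sketch of Pareto dominance and extraction of the first
# non-dominated front, the selection core shared by NSGA-II-type methods.
# Both objectives are minimized (e.g. weight and deflection).

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def first_front(population):
    """Return the solutions not dominated by any other (the Pareto front)."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# toy population of (weight, deflection) pairs
pop = [(2.0, 5.0), (3.0, 3.0), (4.0, 4.0), (5.0, 1.0)]
print(first_front(pop))  # (4.0, 4.0) is dominated by (3.0, 3.0)
```

A full NSGA-II additionally ranks later fronts and breaks ties by crowding distance; ENSES replaces the genetic variation operators with Evolution Strategies while keeping this elitist sorting.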
We conducted extensive molecular dynamics simulations to investigate the thermal conductivity of polycrystalline hexagonal boron nitride (h-BN) films. To this end, we constructed large atomistic models of polycrystalline h-BN sheets with random and uniform grain configurations. By performing equilibrium molecular dynamics (EMD) simulations, we investigated the influence of the average grain size on the thermal conductivity of polycrystalline h-BN films at various temperatures. Using the EMD results, we constructed finite element models of polycrystalline h-BN sheets to probe the thermal conductivity of samples with larger grain sizes. Our multiscale investigations not only provide a general viewpoint regarding heat conduction in h-BN films but also suggest that polycrystalline h-BN sheets present a high thermal conductivity comparable to that of monocrystalline sheets.
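EMD estimates of lattice thermal conductivity commonly rest on the Green-Kubo relation, stated here in its standard isotropic form (the abstract does not reproduce it, so this is the textbook formula, not necessarily the authors' exact expression):

```latex
\kappa = \frac{1}{3\, k_{\mathrm{B}}\, T^{2}\, V}
\int_{0}^{\infty} \left\langle \mathbf{J}(0) \cdot \mathbf{J}(t) \right\rangle \mathrm{d}t,
```

where $\mathbf{J}$ is the microscopic heat current, $V$ the system volume, $T$ the temperature, $k_{\mathrm{B}}$ the Boltzmann constant, and $\langle\cdot\rangle$ an equilibrium ensemble average.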
Methods based on B-splines for model representation, numerical analysis and image registration
(2015)
The thesis consists of inter-connected parts for modeling and analysis using newly developed isogeometric methods. The main parts are reproducing kernel triangular B-splines, extended isogeometric analysis for solving weakly discontinuous problems, collocation methods using superconvergent points, and B-spline basis in image registration applications.
Each topic is oriented towards application of isogeometric analysis basis functions to ease the process of integrating the modeling and analysis phases of simulation.
First, we develop a reproducing kernel triangular B-spline-based FEM for solving PDEs. We review triangular B-splines and their properties. By construction, the triangular basis functions are very flexible for modeling complicated domains. However, instability results when they are applied for analysis. We modify the triangular B-spline by a reproducing kernel technique, calculating a correction term for the triangular kernel function from the chosen surrounding basis. The improved triangular basis yields results with higher accuracy and almost optimal convergence rates.
Second, we propose an extended isogeometric analysis for dealing with weakly discontinuous problems such as material interfaces. The original IGA is combined with XFEM-like enrichments which are continuous functions themselves but with discontinuous derivatives. Consequently, the resulting solution space can approximate solutions with weak discontinuities. The method is also applied to curved material interfaces, where the inverse mapping and the curved triangular elements are considered.
Third, we develop an IGA collocation method using superconvergent points. Collocation methods are efficient because no numerical integration is needed. In particular, when higher-order polynomial bases are applied, the method has a lower computational cost than Galerkin methods. However, the positions of the collocation points are crucial for the accuracy of the method, as they affect the convergence rate significantly. The proposed IGA collocation method uses superconvergent points instead of the traditional Greville abscissae. The numerical results show that the proposed method achieves better accuracy and optimal convergence rates, whereas traditional IGA collocation has optimal convergence only for even polynomial degrees.
Lastly, we propose a novel dynamic multilevel technique for image registration, an application of B-spline functions in image processing. The procedure aims to align a target image with a reference image via a spatial transformation. The method starts with an energy function that is the same as in FEM-based image registration. However, we simplify the solving procedure by working on the energy function directly, dynamically solving for the control points, which are the coefficients of the B-spline basis functions. The new approach is simpler and faster, and it is further enhanced by a multilevel technique in order to prevent instabilities. The numerical tests, comprising two artificial images and four real biomedical MRI brain and CT heart images, show that our registration method is accurate, fast and efficient, especially for large deformation problems.
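The univariate building block behind all of these spline spaces is the Cox-de Boor recursion for B-spline basis functions; a minimal sketch follows (the knot vector is an illustrative choice, not taken from the thesis):

```python
# Cox-de Boor recursion for univariate B-spline basis functions, the
# building block behind the spline spaces used throughout the thesis
# (triangular B-splines and the registration basis generalize this idea).

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        # half-open convention: piecewise-constant basis on [t_i, t_{i+1})
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:  # guard against zero-length knot spans
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

# quadratic basis on an open knot vector; the basis forms a partition of unity
knots = [0, 0, 0, 1, 2, 3, 3, 3]
total = sum(bspline_basis(i, 2, 1.5, knots) for i in range(5))
print(total)  # 1.0
```

Production code would evaluate the recursion bottom-up per knot span rather than recursively, but the recursive form mirrors the definition most directly.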
In this paper we introduce LUCI, a Lightweight Urban Calculation Interchange system, designed to bring the advantages of a calculation and content coordination system to small planning and design groups by means of an open-source middleware. The middleware focuses on problems typical of urban planning and therefore features a geo-data repository as well as a job runtime administration to coordinate simulation models and their multiple views. The described system architecture is accompanied by two exemplary use cases that have been used to test and further develop our concepts and implementations.
As human thought developed, so did the technology used for illumination. A look back through history, however, raises old and new questions: Can inadequate lighting negatively alter the image of historic buildings and monuments to the point of distorting how people perceive the work, and if so, what causes this? Do lighting designers consider criteria that protect not only historic buildings and monuments but also the environment? What consequences can the inadequate lighting of urban heritage have for the environment? Which factors must be considered for a proper illumination of urban heritage? The answers to these questions help lay the foundation for the proper illumination of urban heritage, avoiding light pollution and its effects as far as possible and seeking a balanced, harmonious reconciliation of technology, urban heritage and environment. The framework and case study is the urban heritage of a colonial-era city in southern Mexico with pre-Hispanic roots, whose streets and buildings still convey an atmosphere of mysticism reflecting its folklore and traditions: Chiapa de Corzo, Chiapas.
Theoretical part:
A comparison of data provided by institutes and statistics companies, e.g. Google, the Bundesnetzagentur and Statista in Germany, 中国互联网络信息中心 in China, and FIND in Taiwan, together with the surveys I conducted in the respective countries and the interviews in Weimar, reveals the cultural differences as well as the commonalities in smartphone use. In addition, interpreting these results yielded further conclusions that are closely connected with the emergence of these cultural differences.
Practical part:
The first design is a visual keyboard. To improve the on-screen keyboard, type right ! (the name of my design) attempts to solve the problem of mistyping during input. type right ! has two focal points: 1. changed positions of the letters and characters, and 2. a changed key shape.
The second design concerns a concept for an app that integrates different means of communication.
The refurbishment of old buildings often goes hand in hand with an increase in both the dead and live loads. The latter, combined with higher safety factors, often makes the reinforcement of old structures necessary. Most reinforcement methods involve transforming a structural timber member into a composite beam. Composite sections have a long tradition in timber construction. In the early days, multiple timber beams were connected with interlocking teeth and wooden shear connectors, which resulted in an elastic connection. Although historical timber structures are frequently upgraded, no method has yet been established and fully accepted by all stakeholders such as owners, builders, architects, engineers and cultural heritage organisations. Carbon fibre-reinforced polymers (CFRP) have already shown their efficiency in structural reinforcement, especially in concrete structures. Moreover, previous studies have shown that CFRP has the potential to meet the expectations of all parties involved.
In order to reach the service-limit state, a high amount of carbon fibres has to be used or, considering the cost of reinforcement, prestress has to be applied. However, prestressing often goes hand in hand with delamination issues. The camber method presented here offers an efficient solution for prestressing timber bending members and overcoming the known obstacles.
In the method proposed, the timber beam is cambered using an adjustable prop at midspan during the bonding of the CFRP lamella to the lower side of the bending member. After curing of the adhesive, the prop is removed and the prestressed composite beam is ready to be used. The prestress introduced in the system is not constant but has a triangular shape, peaking at midspan, where it is needed the most. The prestress force, which declines towards the ends of the beam, leads to a constant shear stress over the whole length of the reinforcement, avoiding a concentrated anchorage zone.
An analytical calculation model has been developed to calculate and design prestressed timber-bending members using the camber method. Numerical modelling, using a multi-surface plasticity model for timber, confirmed the results from the analytical model and showed clearly reduced delamination issues, comparing very favourably to traditional prestressing methods. The experimental parametric study, including the determination of the short-term load-bearing capacity of structural-sized beams, showed agreement with the analytical and numerical calculations. The prestressed reinforcement showed a benefit of nearly 50% towards the ultimate-limit state and up to 70% towards the service-limit state. Calculations revealed that the use of high-modulus CFRP allows even higher benefits, depending on the configurations and requirements. The long-term design of the prestressed composite beam was investigated by extending the analytical model. The creep of the timber leads to a load transfer from the timber towards the CFRP, and therefore increases the benefit towards the ultimate-limit design. Applying high-modulus CFRP lamellas allows for a complete utilisation of the design capacity of timber and carbon fibre-reinforced polymer.
The thorough investigation conducted demonstrated that the camber method is an efficient technique for prestressing and reinforcing timber-bending members. Furthermore, the calculation model presented allows for a safe design and estimation of long-term behaviour.
The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus University Weimar from 20 to 22 July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences and to report on and discuss their results in research, development and practice. The conference covers a broad range of research areas: numerical analysis, function theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference!
IFC-BASED MONITORING INFORMATION MODELING FOR DATA MANAGEMENT IN STRUCTURAL HEALTH MONITORING
(2015)
This conceptual paper discusses opportunities and challenges towards the digital representation of structural health monitoring systems using the Industry Foundation Classes (IFC) standard. State-of-the-art sensor nodes, collecting structural and environmental data from civil infrastructure systems, are capable of processing and analyzing the data sets directly on board the nodes. Structural health monitoring (SHM) based on sensor nodes that possess so-called “on-chip intelligence” is, in this study, referred to as “intelligent SHM”, and an infrastructure system equipped with an intelligent SHM system is referred to as “intelligent infrastructure”. Although intelligent SHM will continue to grow, there is no well-defined formalism for digitally representing information about sensors, about the overall SHM system, and about the monitoring strategies being implemented (“monitoring-related information”). Based on a review of available SHM regulations and guidelines as well as existing sensor models and sensor modeling languages, this conceptual paper investigates how to digitally represent monitoring-related information in a semantic model. With the Industry Foundation Classes, there exists an open standard for the digital representation of building information; however, it is not possible to represent monitoring-related information using the IFC object model. This paper proposes a conceptual approach for extending the current IFC object model in order to include monitoring-related information. Taking civil infrastructure systems as an illustrative example, it becomes possible to adequately represent, process, and exchange monitoring-related information throughout the whole life cycle of civil infrastructure systems, which is referred to as monitoring information modeling (MIM). However, since this paper is conceptual, additional research efforts are required to further investigate, implement, and validate the proposed concepts and methods.
Granite on the Ground: Former Nazi Party Rally Grounds, Nuremberg/Germany. A brief introduction
(2015)
In this paper we present some rudiments of a generalized Wiman-Valiron theory in the context of polymonogenic functions. In particular, we analyze the relations between different notions of growth orders and the Taylor coefficients. Our main intention is to look for generalizations of the Lindelöf-Pringsheim theorem. In contrast to the classical holomorphic and the monogenic setting we only obtain inequality relations in the polymonogenic setting. This is due to the fact that the Almansi-Fischer decomposition of a polymonogenic function consists of different monogenic component functions where each of them can have a totally different kind of asymptotic growth behavior.
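For context, the classical quantities that the paper generalizes are the following (standard definitions for entire functions, not the polymonogenic versions developed in the paper): for $f(z) = \sum_n a_n z^n$ with maximum modulus $M(r) = \max_{|z|=r} |f(z)|$, the growth order is

```latex
\rho = \limsup_{r \to \infty} \frac{\log \log M(r)}{\log r},
```

and the Lindelöf-Pringsheim theorem expresses it through the Taylor coefficients:

```latex
\rho = \limsup_{n \to \infty} \frac{n \log n}{-\log |a_n|}.
```

In the polymonogenic setting the paper obtains only inequalities between the analogous quantities, since the monogenic components of the Almansi-Fischer decomposition may grow at different rates.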
Media anthropology is a new and interdisciplinary field of research with very different subjects and methods that seems to be already heavily informed by a comparatively narrow understanding of media as mass media (e.g. TV, Internet, social web, etc.). Therefore, most theories in this field, at least implicitly, employ a hierarchical and often dichotomic preconception of the two poles of media-human relations, by analysing the operationalities and ontologies of the human and the media independently from one another. This article deviates from this line of thought by advocating an expanded, symmetrical and relational understanding of the terms media and human, taking them as always already intermingled facets of a broader dynamic configuration. Starting from a consideration of the historically powerful, yet overlooked media of the so-called habitat diorama, the heuristic concept of “anthropomediality” is to be developed. Eventually, this relational approach may open up a new, interesting field for interrogation of (media-)anthropological analysis in general.
In photogrammetry and computer vision the trifocal tensor is used to describe the geometric relation between projections of points in three views. In this paper we analyze the stability and accuracy of the metric trifocal tensor for calibrated cameras. Since a minimal parameterization of the metric trifocal tensor is challenging, the additional constraints of the interior orientation are applied to the well-known projective 6-point and 7-point algorithms for three images. The experimental results show that the linear 7-point algorithm fails for some noise-free degenerated cases, whereas the minimal 6-point algorithm seems to be competitive even with realistic noise.
Entwicklung eines Sommerreferenzjahres zur Bestimmung der sommerlichen Überhitzung von Gebäuden (Development of a Summer Reference Year for Determining the Summer Overheating of Buildings)
(2015)
Die Ableitung von sommer-fokussierten warmen Referenzjahren aus langjährigen Klimadaten erfolgt in Europa bisher nach unterschiedlichen, länderspezifischen Methoden, die sich in der Regel allein auf die Trockentemperatur beziehen und in der Auswahl eines zusammenhängenden realen Sommerhalbjahres resultieren. Simulationsergebnisse zur sommerlichen Überhitzung von natürlich belüfteten Gebäuden in Deutschland und Großbritannien zeigen jedoch für einige Wetterstationen weniger Überhitzung für Simulationen mit dem sommer-fokussierten Referenzjahr als für solche mit dem entsprechenden Testreferenzjahr (TRY) für den gleichen Ort. Dies gilt insbesondere dann, wenn einzelne Monate miteinander verglichen werden. Neben der Wahl eines kompletten Halbjahres, das sowohl extrem warme als auch vergleichsweise kühle Monate beinhalten kann, liegt dies vor allem begründet in der fehlenden Berücksichtigung der Solarstrahlung bei der Auswahl eines warmen Referenzjahres, die jedoch eine wichtige Rolle für sommerliche Überhitzungserscheinungen in Gebäuden spielt. Eine verlässliche, allgemein anerkannte Methode zur Erstellung von sommer-fokussierten Referenzjahren erscheint daher auch im Hinblick auf die rechtlichen Rahmenbedingungen in der Europäischen Union, die Strategien zur natürlichen Belüftung von Neubauten und Sanierungen begünstigen, erforderlich. Diese Arbeit präsentiert einen Ansatz zur Erstellung eines Sommerreferenzjahres (Summer Reference Year – SRY) aus dem TRY eines gegebenen Ortes und langjährigen Klimadaten. Die existierenden TRY-Daten werden hierbei skaliert, um den Bedingungen für Trockentemperatur und Solarstrahlung von nah-extremen Kandidatenjahren zu entsprechen, die separat über einen statistischen Ansatz ausgewählt werden. Anschließend werden Feuchttemperatur, Windgeschwindigkeit und Luftdruck des TRY durch lineare Korrelationen mit der Trockentemperatur angepasst, um die entsprechenden SRY-Daten zu erhalten. 
The advantage of this method is that the underlying weather pattern of the TRY is preserved, so that a clear relation exists between SRY and TRY, which ensures comparability of simulation results. Comparative building simulations with the underlying TRY and long-term climate data sets demonstrate that the SRY is suitable for determining summer overheating in naturally ventilated buildings. Furthermore, it is shown that, in contrast to the direct use of a candidate year for a near-extreme summer, the SRY allows a month-by-month comparison with the TRY and is free of unrepresentative peculiarities that may be present in the corresponding candidate years.
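The two-step idea described in the abstract — scale the TRY dry-bulb series towards a near-extreme target, then carry dependent variables along via their linear correlation with dry-bulb temperature — can be sketched as follows. This is a minimal illustration with toy data and hypothetical function names, not the thesis's actual scaling procedure:

```python
# Minimal sketch of the SRY construction idea: (1) scale the TRY dry-bulb
# series towards a near-extreme candidate mean, (2) adjust a dependent
# variable via its linear correlation with dry-bulb temperature.
# All names and numbers here are illustrative, not from the thesis.
from statistics import mean

def scale_to_target(series, target_mean):
    """Shift a TRY series so its mean matches a near-extreme target mean."""
    offset = target_mean - mean(series)
    return [v + offset for v in series]

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept of y against x."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Toy TRY hourly dry-bulb and wet-bulb temperatures (degrees Celsius)
try_dry = [18.0, 21.0, 25.0, 27.0, 23.0, 19.0]
try_wet = [14.0, 16.0, 18.5, 19.5, 17.0, 14.5]

# Step 1: scale dry-bulb temperature towards the candidate-year mean
sry_dry = scale_to_target(try_dry, target_mean=24.5)

# Step 2: regress wet-bulb on dry-bulb in the TRY, then evaluate the
# fitted relation on the scaled dry-bulb series
slope, intercept = linear_fit(try_dry, try_wet)
sry_wet = [intercept + slope * t for t in sry_dry]
```

A real implementation would operate per month on hourly data and treat solar radiation as a second scaling target, as the abstract indicates.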
The present study investigated the influence of the clinker composition and the type of sulfate carrier on the performance of a shotcrete cement. To avoid undersulfatization in a system with an aluminium sulfate/hydroxide accelerator, an anhydrite-based sulfate carrier should be used. This leads to better strength development at an early age.
Recently there has been a surge of interest in PDEs involving fractional derivatives in different fields of engineering. In this extended abstract we present some of the results developed in [3]. We compute the fundamental solution for the three-parameter fractional Laplace operator Δ by transforming the eigenfunction equation into an integral equation and applying the method of separation of variables. The obtained solutions are expressed in terms of Mittag-Leffler functions. For more details we refer the interested reader to [3], where an operational approach based on two Laplace transforms is also presented.
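The Mittag-Leffler functions in which the fundamental solutions above are expressed are defined by the power series E_{a,b}(z) = Σ_{k≥0} z^k / Γ(ak + b). As a small illustration (not part of [3]), the series can be evaluated by direct truncation, which is adequate for moderate |z|:

```python
# Truncated-series evaluation of the two-parameter Mittag-Leffler function
# E_{a,b}(z) = sum_{k>=0} z**k / Gamma(a*k + b).  Illustrative only; for
# large |z| dedicated algorithms are needed.
from math import gamma

def mittag_leffler(a, b, z, terms=80):
    """Partial sum of the defining series of E_{a,b}(z)."""
    return sum(z ** k / gamma(a * k + b) for k in range(terms))

# Sanity check: E_{1,1}(z) reduces to exp(z)
print(mittag_leffler(1.0, 1.0, 1.0))  # close to e ≈ 2.71828
```

Classical special cases such as E_{2,1}(z²) = cosh(z) provide further quick checks of an implementation.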
The Journey of Signs - A Study of the Symbolic Imagery of the Xiyü Culture along the Silk Road
(2015)
Abstract
The present study examines the traditional Xiyü signs in the visual culture of various ethnic groups, taking as its example Xinjiang in north-west China, through which the ancient Silk Road once ran. The study is intended as an attempt at a systematic presentation of the symbolic imagery in Xinjiang and thus, in a broader sense, also as an attempt to preserve the Xiyü cultural heritage, which has come under threat in view of the profound transformation in China and the Go-West strategy of the Chinese economic programme, which concerns China's vast western regions. A way must be found to reconcile modernization and tradition.
This study describes, analyses, and evaluates, by way of example, how signs and symbols of knowledge and culture along the Silk Road in Xinjiang have mingled between Eastern and Western cultures, how they were transmitted, and to what extent the Xiyü culture reflects that Xinjiang was a meeting point of Eastern and Western influences. In approaching this research objective, three main questions are posed:
First, what can the experiential model of the ancient Silk Road contribute to understanding cultural development in today's information age?
Second, what does diversity mean for cultural communication in Xinjiang? Are media events of visual communication and international influences on the rise?
Third, how can Xinjiang seize the opportunities of the Go-West strategy of the Chinese economic programme in an increasingly globalized world and regain vitality in future world economic development?
To approach these answers, the study uses a method that integrates a cultural-anthropological perspective. On the basis of a field study, the characteristics of the traditional visual Xiyü signs and their significance for the ethnic groups of Xinjiang and for cultural exchange between Europe and Asia are discussed, and the particular influence of trade and transport routes is demonstrated.
The visual sign forms of the Xiyü culture are presented from the period before Islamization up to the Islamic era of today. The research shows that the signs are not a site-bound cultural phenomenon or a local tradition; rather, they are consciously or unconsciously influenced by regional, national, and international cultural spheres. They are a result of cultural communication. Finally, the study examines the signs of the Xiyü culture of Xinjiang in relation to the Go-West strategy of the Chinese government. Economic development brings Xinjiang not only modernization and prosperity but also a cultural and ecological imbalance.
Human beings structure everything they perceive in their surroundings with signs, and the Xiyü signs provide such a visual structuring. The Xiyü culture continues to influence the way of life of the ethnic groups in Xinjiang. Today, the confrontation between traditional Xiyü culture and economic development has led to a serious conflict in Xinjiang.
This study is intended as a critical interpretation of the current situation of the Xiyü culture and of the Chinese economic programme in Xinjiang. Beyond that, it aims to contribute to the analysis of the hitherto insufficiently studied interactions between history, culture, politics, and economics in the cultural globalization of Xinjiang.