Particle Simulation and Evaluation of Personal Exposure to Contaminant Sources in an Elevation Space
(2004)
An elevator is a small, enclosed volume that is used by almost everyone for short periods of time and is typically equipped with only a simple ventilation system. Any contaminant released within it may therefore cause serious problems. This research adapts a fire and smoke simulation software (FDS) to a non-fire indoor airflow scenario. Unlike previous research, particles are chosen as the unit of risk evaluation. A personal and multi-personal exposure model is proposed that takes into account the influence of the human thermal boundary layer, coughing, inhalation, exhalation, standing position, and the fan factor. The model is easy to use and suitable for the design of elevator ventilation systems in practice.
Paraffin Nanocomposites for Heat Management of Lithium-Ion Batteries: A Computational Investigation
(2016)
Lithium-ion (Li-ion) batteries are currently considered vital components for advances in mobile technologies such as those in communications and transport. Nonetheless, Li-ion batteries suffer from temperature rises which sometimes lead to operational damage or may even cause fire. An appropriate solution to control the temperature changes during the operation of Li-ion batteries is to embed the batteries inside a paraffin matrix to absorb and dissipate heat. In the present work, we aimed to investigate the possibility of making paraffin nanocomposites for better heat management of a Li-ion battery pack. To fulfill this aim, heat generation during battery charging/discharging cycles was simulated using Newman's well-established electrochemical pseudo-2D model. We coupled this model to a 3D heat transfer model to predict the temperature evolution during battery operation. In the latter model, we considered different paraffin nanocomposite structures made by the addition of graphene, carbon nanotubes, and fullerene, assuming the same thermal conductivity for all fillers. This way, our results mainly correlate with the geometry of the fillers. Our results assess the degree of enhancement in heat dissipation of Li-ion batteries through the use of paraffin nanocomposites and may be used as a guide for experimental set-ups to improve the heat management of Li-ion batteries.
Scientific colloquium, 24 to 27 April 2003, in Weimar at the Bauhaus-Universität, on the theme 'MediumArchitektur - Zur Krise der Vermittlung' (MediumArchitecture: On the Crisis of Mediation)
Gropius als Lehrer
(1983)
Scientific colloquium, 27 to 30 June 1989, in Weimar at the Hochschule für Architektur und Bauwesen, on the theme 'Produktivkraftentwicklung und Umweltgestaltung. Sozialer und wissenschaftlich-technischer Fortschritt in ihren Auswirkungen auf Architektur und industrielle Formgestaltung in unserer Zeit. Zum 100. Geburtstag von Hannes Meyer' (Development of productive forces and environmental design. Social and scientific-technical progress and their effects on architecture and industrial design in our time. On the 100th anniversary of the birth of Hannes Meyer)
The concept of isogeometric analysis, where functions that are used to describe geometry in CAD software are used to approximate the unknown fields in numerical simulations, has received great attention in recent years. The method has the potential to have profound impact on engineering design, since the task of meshing, which in some cases can add significant overhead, has been circumvented. Much of the research effort has been focused on finite element implementations of the isogeometric concept, but at present, little has been seen on the application to the Boundary Element Method. The current paper proposes an Isogeometric Boundary Element Method (BEM), which we term IGABEM, applied to two-dimensional elastostatic problems using Non-Uniform Rational B-Splines (NURBS). We find it is a natural fit with the isogeometric concept since both the NURBS approximation and BEM deal with quantities entirely on the boundary. The method is verified against analytical solutions where it is seen that superior accuracies are achieved over a conventional quadratic isoparametric BEM implementation.
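The NURBS basis functions at the heart of the IGABEM discretisation can be evaluated with the standard Cox-de Boor recursion followed by rational weighting. A minimal sketch (the knot vector and weights below are illustrative, not taken from the paper; with uniform weights the rational basis reduces to plain B-splines):

```python
import numpy as np

def bspline_basis(i, p, u, U):
    """Cox-de Boor recursion: i-th B-spline basis function of degree p at u."""
    if p == 0:
        return 1.0 if U[i] <= u < U[i + 1] else 0.0
    left = 0.0
    if U[i + p] != U[i]:
        left = (u - U[i]) / (U[i + p] - U[i]) * bspline_basis(i, p - 1, u, U)
    right = 0.0
    if U[i + p + 1] != U[i + 1]:
        right = (U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, U)
    return left + right

def nurbs_basis(p, u, U, w):
    """Rational (NURBS) basis: weighted B-splines normalised to sum to one."""
    N = np.array([bspline_basis(i, p, u, U) for i in range(len(w))])
    return (N * w) / np.dot(N, w)

# Quadratic basis on an open knot vector with 4 control points.
U = [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
w = np.ones(4)
R = nurbs_basis(2, 0.25, U, w)  # partition of unity: R sums to 1
```

The partition-of-unity property checked in the comment is what lets the same basis represent both the geometry and the unknown boundary fields.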
As part of an international research project – funded by the European Union – capillary glasses for facades are being developed exploiting storage energy by means of fluids flowing through the capillaries. To meet highest visual demands, acrylate adhesives and EVA films are tested as possible bonding materials for the glass setup. Especially non-destructive methods (visual analysis, analysis of birefringent properties and computed tomographic data) are applied to evaluate failure patterns as well as the long-term behavior considering climatic influences. The experimental investigations are presented after different loading periods, providing information of failure developments. In addition, detailed information and scientific findings on the application of computed tomographic analyses are presented.
The problem of the computation of stresses and settlements in the half-space under various types of loads is often presented in geotechnical engineering. In 1885 Boussinesq advanced theoretical expressions to determine stresses at a point within an ideal mass. His equation considers a point load on the surface of a semi-infinite, homogeneous, isotropic, weightless, elastic half-space. Newmark in 1942 performed the integration of Boussinesq's equations for the vertical stress under a corner of a rectangular area loaded with a uniform load. The problem of the determination of vertical stresses under a rectangular shaped footing has been satisfactorily solved with renewal integration of the Boussinesq's equation over the arbitrary rectangle on surface of the half-space, with a non-uniform load represented with piecewise linear interpolation functions. The problem of the determination of stresses in the case when the footing shape is an arbitrary quadrilateral however remains unsolved. The paper discusses an approach to the computation of vertical stresses and settlements in an arbitrary point of the half-space, loaded with a uniform load, which shape in the ground plan can be a general four noded form with straight edges. Since the form is transformed into a biunit square and all integrations are performed over this area, all solutions are valid also for an arbitrary triangle by the implementation of the degeneration rule.
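Boussinesq's point-load solution, on which the integrations discussed above are built, gives the vertical stress at depth z and horizontal offset r as sigma_z = 3*Q*z^3 / (2*pi*R^5) with R^2 = r^2 + z^2. A minimal sketch (the load and depth values are purely illustrative):

```python
import math

def boussinesq_sigma_z(Q, r, z):
    """Vertical stress due to a point load Q on the surface of a
    semi-infinite, homogeneous, isotropic, elastic half-space
    (Boussinesq, 1885)."""
    R = math.hypot(r, z)                      # distance from load to the point
    return 3.0 * Q * z**3 / (2.0 * math.pi * R**5)

# Directly beneath the load (r = 0) the expression reduces to 3Q / (2*pi*z^2):
s = boussinesq_sigma_z(Q=100.0, r=0.0, z=2.0)
```

Integrating this kernel over a loaded area, as Newmark did for the rectangle, yields the stresses under distributed loads.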
Construction informatics (Bauinformatik) is a pillar of modern civil and environmental engineering. It is concerned with research into fundamental informatics methods as well as with the application and further development of the information sciences in the civil and environmental domain. The Arbeitskreis Bauinformatik (Working Group on Construction Informatics) consists of scientists who teach and conduct research in the field of construction informatics at universities in the German-speaking world. Starting from the current state of development of the field, this position paper outlines the tasks of the working group and formulates a basis for a coordinated further development at German-speaking universities.
Die engere Wohnumwelt
(1990)
Wir müssen nicht in einem Stil bauen! - Industriearchitektur als baukulturelle Herausforderung
(1990)
Among the various ventures of socially engaged "counter-science" that appeared on the scene in the Federal Republic of Germany around 1980 was the Berlin Wissenschaftsladen e. V. ("science shop"), founded in 1982 and known as WILAB, a kind of "alternative" spin-off of the Technische Universität Berlin. This contribution situates the founding of the "shop" in the context of contemporary advances in (regional) research and technology policy. It shows how the deindustrializing island city, through countermeasures framed as "innovation policy", even came to play a certain pioneering role: innovations visible beyond the city limits, such as the start-up fair BIG TECH or the Berliner Innovations- und Gründerzentrum (BIG), opened in 1983 as the first "incubator" [sic] in the Federal Republic, were the work of the technology transfer office of the TU Berlin, TU-transfer, launched in 1977/78.
In other words, one increasingly faced conditions here that were less and less compatible with the dreams of a "critical", self-determined counter-science. In latent contrast to the historiographical prominence of the science-critical zeitgeist, ventures committed to "alternative" goals, such as WILAB, led a relatively marginalized niche existence. Nevertheless, the aim pursued at WILAB, unpromising as it was in this light, of initiating a different, "more humane" information technology sheds instructive light on the emergence of "entrepreneurial" science in the Federal Republic around 1980.
Current building product models explicitly represent components, attributes of components, and relationships between components. These designer-focused product models, however, do not represent many of the design conditions that are important for construction, such as component similarity, uniformity, and penetrations. Current design and construction tools offer limited support for detecting these construction-specific design conditions. This paper describes the ontology we developed using the manufacturing concept of features to represent the design conditions that are important for construction. The feature ontology provides the blueprint for the additions and changes needed to transform a standard product model into a construction-specific product model. The ontology formalizes three classes of features, defines the attributes and functions of each feature type, and represents the relationships between features explicitly. The descriptive semantics of the ontology allows practitioners to represent their varied preferences for naming features, specifying features that result from component intersections and the similarity of components, and grouping features that affect a specific construction domain. A software prototype that implements the ontology enables practitioners to transform designer-focused product models into feature-based product models that represent the construction perspective.
The development of a hydro-mechanically coupled Coupled-Eulerian-Lagrangian (CEL) method and its application to the back-analysis of vibratory pile driving model tests in water-saturated sand is presented. The predicted pile penetration using this approach is in good agreement with the results of the model tests as well as with fully Lagrangian simulations. In terms of pore water pressure, however, the results of the CEL simulation show a slightly worse accordance with the model tests compared to the Lagrangian simulation. Some shortcomings of the hydro-mechanically coupled CEL method in case of frictional contact problems and pore fluids with high bulk modulus are discussed. Lastly, the CEL method is applied to the simulation of vibratory driving of open-profile piles under partially drained conditions to study installation-induced changes in the soil state. It is concluded that the proposed method is capable of realistically reproducing the most important mechanisms in the soil during the driving process despite its addressed shortcomings.
Image and the Space of the Modern City in Erich Mendelsohn's Amerika: Bilderbuch eines Architekten
(2008)
Scientific colloquium, 19 to 22 April 2007, in Weimar at the Bauhaus-Universität, on the theme 'Die Realität des Imaginären. Architektur und das digitale Bild' (The Reality of the Imaginary. Architecture and the Digital Image)
Due to technological progress and European standardization (e.g. ISO certification) there is an increasing demand to automate the digital recording of building plans. Currently two approaches are available: (1) labour-intensive redrawing with digitizing tablets or by screen digitizing, or (2) automatic scanning and vectorizing, where vectorization generally demands interactive follow-up treatment due to incomplete or ambiguous results. This paper proposes a knowledge-based approach to building plan analysis. The procedure comprises the following processing steps: scanning of the building plan; enhancing the image quality and binarization; extraction of lines, line junctions, and character fields; knowledge-based interpretation and grouping of image features into domain-specific feature aggregates such as door symbols, stair symbols, etc.; optical character recognition and lexicon-based interpretation of character fields; and matching of feature aggregates with dimension sets. First results of a prototype implementation are presented. Finally, an extension of the approach towards a semantic modeling concept is presented, showing a coupling between 3D object modeling and an explicit 2D modeling of images and technical drawings.
For the design of end-to-end support of the design process, descriptive models of the design objects are currently at the centre of investigation. These models allow representations to be derived and design results to be passed on. Pragmatic structurings of the design process divide it, according to organizational and business considerations (plannability and billability), into a sequence of design phases (HOAI). Such structurings do not address the HOW of the actual model-creating creative process, yet clarifying this is the necessary precondition for true computer-aided design. This contribution therefore starts from a unified set of generic design actions. Even when the various design phases and the design activities of the individual engineering disciplines are linked with specific design models, this provides a basis for the methodical foundation of corresponding CAD tools. The methodical procedure resembles the one proposed in the form of style guides for the design of graphical user interfaces. Essential practical uses of such basic activities arise for: the systematization of computer-supported design actions, in particular by extending the descriptive model with an operational one and by their extended interpretability; the creation of knowledge-based tools for automatic model generation and configuration; and the implementation of powerful UNDO and TMS mechanisms.
The article addresses five areas of particular relevance to scientific work: 1) the aims and subject matter of scientific work, 2) the relationship between science, knowledge, and progress, 3) an overview of the research landscape in Germany, taking the organization of science into account, 4) a detailed, practice-oriented explanation of the typical course of a research process, and 5) a sketch of literature-based research. The contribution draws numerous connections to urban research and uses examples to illustrate the content.
The frame of this paper is the development of methods and procedures for the description of the motion of an arbitrarily shaped foundation. Since the infinite half-space cannot be properly described by a model of finite dimensions without violating the radiation condition, the basic problems are the infinite dimensions of the half-space as well as its non-homogeneous nature. Consequently, an approach has been investigated to solve this problem indirectly by developing a Green's function in which the non-homogeneity and the infiniteness of the half-space are included. When the Green's function is known, the next step is the evaluation of the contact stresses acting between the foundation and the surface of the half-space through an integral equation. The equation should be solved over the area of the foundation using the Green's function as the kernel. The derivation of the three-dimensional Green's function for the homogeneous half-space (Kobayashi and Sasaki 1991) has been made using the potential method. Partial differential equations occurring in the problem have been reduced to ordinary ones through the Hankel integral transform. The general idea for obtaining the three-dimensional Green's function for the layered half-space is similar, but in that case some additional phenomena may occur. One of them is the possible appearance of Stoneley surface waves propagating along the contact surfaces of the layers. Their contribution to the final result is in most cases important enough that they should not be neglected. The main advantage of the results presented, in comparison to others obtained with numerical methods, is their accuracy, especially in the case of thin layers, because all essential steps of the Green's function evaluation, except for the contour integration along the branch cut, have been carried out analytically.
On the other hand, the disadvantage of this method is that the mathematical effort for obtaining the Green's function increases drastically with the number of layers. Future work will therefore be directed towards simplifying the process described above.
In the design of a structure, the implementation of reliable soil-foundation-structure interaction into the analysis process plays a very important role. The paper presents the determination of the parameters of a suitably chosen soil-foundation model and their influence on the structure response. Since the mechanical data for the structure can be determined with satisfactory accuracy, the properties of the soil-foundation model were identified using the measured dynamic response of the real structure. A simple model describing the soil-foundation structure was incorporated into a classical 3-D finite element analysis of the structure with commercial software. Results obtained from the measured data on the pier were then compared with those obtained with the finite element model of the pier-foundation-soil structure. On the basis of this comparison, the coefficients describing the properties of the soil-foundation model were adjusted until the calculated dynamic response coincided with the measured one. In this way, the difference between both results was reduced to 1%. Full-scale tests measuring the eigenmotion of the bridge were performed through all erection stages of the new bridge in Maribor. In this way an effective and experimentally verified 3-D model for a complex dynamic analysis of the bridge under earthquake loading was obtained. The significant advantage of the obtained model is that it was updated on the basis of the dynamic measurements, thus improving the model on the basis of in-situ geomechanical measurements. The model is very accurate in describing the upper structure and economical in describing the soil mass, thus representing an optimal solution regarding computational effort.
Although editorial diversity is regularly treated as one of the core challenges for journalism, journalistic organizations still struggle to create equal opportunities for access and advancement for women, people of color, people from immigrant families, or journalists affected by discrimination on other grounds. Instead, gender- and diversity-oriented newsroom research shows that newsrooms continue to reflect and (re)produce societal power relations in their structures, practices, and culture. Yet in the course of "entrepreneurial journalism", new organizations are emerging in many places today. These journalistic start-ups are characterized, among other things, by smaller, more open, and more flexible organizational structures. Their changed organizational character, however, has not yet been examined in depth with regard to its potentials and possible risks for diversity and (anti-)discrimination. This contribution therefore aims to position journalistic start-ups as a new subject on the agenda of organization- and discrimination-oriented journalism research. To this end, selected studies on "entrepreneurial journalism" are reviewed with regard to diversity and (anti-)discrimination, and existing desiderata are identified. On this basis, a programmatic research agenda is finally developed to stimulate follow-up empirical research on the topic.
Multi-criteria decision analysis (MCDA) is an established methodology to support the decision-making of multi-objective problems. For conducting an MCDA, in most cases, a set of objectives (SOO) is required, which consists of a hierarchical structure comprised of objectives, criteria, and indicators. The development of an SOO is usually based on moderated development processes requiring high organizational and cognitive effort from all stakeholders involved. This article proposes elementary interactions as the key paradigm of an algorithm-driven development process for an SOO that requires little moderation effort. Elementary interactions are self-contained information requests that may be answered with little cognitive effort. The pairwise comparison of elements in the well-known analytic hierarchy process (AHP) is an example of an elementary interaction. Each elementary interaction in the development process presented contributes to the stepwise development of an SOO. Based on the hypothesis that an SOO may be developed exclusively using elementary interactions (EIs), a concept for a multi-user platform is proposed. Essential components of the platform are a Model Aggregator, an Elementary Interaction Stream Generator, a Participant Manager, and a Discussion Forum. While the latter component serves the professional exchange of the participants, the first three components are intended to be automatable by algorithms. The platform concept proposed has been partly evaluated in an explorative validation study demonstrating the general functionality of the algorithms outlined. In summary, the platform concept suggested demonstrates the potential to ease SOO development processes: it does not restrict the application domain, it is intended to work with little administration and moderation effort, and it supports the further development of an existing SOO in the event of changes in external conditions.
The algorithm-driven development of SOOs proposed in this article may ease the development of MCDA applications and, thus, may have a positive effect on the spread of MCDA applications.
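The AHP pairwise comparison cited above as an example of an elementary interaction turns a reciprocal comparison matrix into priority weights. A minimal sketch using the row geometric-mean approximation of the principal eigenvector (the 3x3 matrix below is a hypothetical example, not data from the article):

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    via the row geometric-mean method (a standard AHP approximation
    of the principal-eigenvector weights)."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])  # geometric mean of each row
    return g / g.sum()                            # normalise to sum to one

# Hypothetical judgements: criterion 1 is 3x as important as criterion 2
# and 5x as important as criterion 3; entries below the diagonal are
# reciprocals by construction.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
w = ahp_weights(A)  # descending weights, summing to 1
```

Each answered comparison refines one entry of A, which is what makes the interaction "elementary": it is self-contained and cheap to answer, yet aggregates into the SOO's weighting.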
Recently, many studies on active control systems for building structures have been performed based on modern control theory, and such systems have been installed in real buildings. The authors have previously proposed intelligent fuzzy optimal active control (IFOAC) systems. IFOAC systems imitate intelligent activities of the human brain such as prediction, adaptation, decision-making, and so on. In IFOAC systems, objective and subjective judgements on the active control can be taken into account. However, IFOAC systems are considered suitable for far-field earthquakes, and their control effect becomes small in the case of near-field earthquakes, which include a few velocity pulses with large amplitudes. To improve the control effect in the case of near-source earthquakes, the authors have also proposed hybrid control (HC) systems, in which IFOAC systems and a fuzzy control system are combined. In HC systems, the fuzzy control system is introduced as a reflective fuzzy active control (RFAC) system and imitates the spinal reflexes of humans. In HC systems, active control forces are applied to buildings in accordance with switching rules on the active control forces. In this paper, the fuzzy control rules in the RFAC system and the switching rules of the active control forces in the HC system are optimized by parameter-free genetic algorithms (PfGAs). Here, the optimization is performed using different earthquake inputs. The results of digital simulations show that the HC system can effectively reduce maximal response displacements under restrictions on the strokes of the actuator in the case of a near-source earthquake, and the effectiveness of the proposed HC system is discussed and clarified.
Meshfree methods (MMs) such as the element free Galerkin (EFG) method have gained popularity because of some advantages over other numerical methods such as the finite element method (FEM). A group of problems that has attracted a great deal of attention from the EFG method community includes the treatment of large deformations and dealing with strong discontinuities such as cracks. One efficient solution to model cracks is adding special enrichment functions to the standard shape functions, as in the extended FEM within the FEM context, and the cracking particles method, based on the EFG method. It is well known that explicit time integration in dynamic applications is conditionally stable. Furthermore, in enriched methods, the critical time step may tend to very small values, leading to computationally expensive simulations. In this work, we study the stability of enriched MMs and propose two mass-lumping strategies. Then we show that the critical time step for enriched MMs based on lumped mass matrices is of the same order as the critical time step of MMs without enrichment. Moreover, we show that, in contrast to extended FEM, even with a consistent mass matrix, the critical time step does not vanish even when the crack directly crosses a node.
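The conditional stability discussed above is usually expressed through the critical time step dt_crit = 2/omega_max, where omega_max is the highest discrete eigenfrequency. For a lumped-mass linear bar element this reduces to the element length divided by the wave speed (a CFL-type bound). A minimal sketch, with illustrative steel-like material values not taken from the paper:

```python
import math

def critical_time_step(h, E, rho):
    """CFL-type critical time step for explicit central-difference
    integration of a lumped-mass linear bar element:
    dt_crit = h / c, with wave speed c = sqrt(E / rho).
    (Equivalent to 2 / omega_max, since omega_max = 2c / h.)"""
    c = math.sqrt(E / rho)
    return h / c

# Illustrative 10 mm element with steel-like properties:
dt = critical_time_step(h=0.01, E=210e9, rho=7850.0)
```

The point made in the abstract is that enrichment can shrink this bound dramatically, which is why mass lumping that keeps dt_crit at the unenriched order matters for cost.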
A simple multiscale analysis framework for heterogeneous solids based on a computational homogenization technique is presented. The macroscopic strain is linked kinematically to the boundary displacement of a circular or spherical representative volume which contains the microscopic information of the material. The macroscopic stress is obtained from the energy principle between the macroscopic scale and the microscopic scale. This new method is applied to several standard examples to show the accuracy and consistency of the proposed method.
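The macroscopic stress recovery described above ultimately amounts to a volume average of the microscopic stress over the representative volume, evaluated in practice as a weighted sum over integration points. A minimal sketch (the two quadrature points and their stress tensors are hypothetical data for illustration):

```python
import numpy as np

def macro_stress(stresses, volumes):
    """Volume average of microscopic stress tensors over the RVE:
    sigma_bar = (1/V) * sum_i sigma_i * V_i,
    with quadrature-point tensors `stresses` (n, d, d) and
    sub-volumes `volumes` (n,)."""
    V = volumes.sum()
    return np.einsum('i,ijk->jk', volumes, stresses) / V

# Two hypothetical integration points with equal sub-volumes:
s = np.array([[[2.0, 0.0], [0.0, 1.0]],
              [[4.0, 0.0], [0.0, 3.0]]])
v = np.array([0.5, 0.5])
sbar = macro_stress(s, v)  # plain average of the two tensors here
```

With an energy (Hill-Mandel) argument, this averaged stress is the quantity handed back to the macroscopic scale, consistent with the boundary-displacement coupling described in the abstract.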
Data acquisition systems and methods to capture high-resolution images or reconstruct 3D point clouds of existing structures are an effective way to document their as-is condition. These methods enable a detailed analysis of building surfaces, providing precise 3D representations. However, for the condition assessment and documentation, damages are mainly annotated in 2D representations, such as images, orthophotos, or technical drawings, which do not allow for the application of a 3D workflow or automated comparisons of multitemporal datasets. In the available software for building heritage data management and analysis, a wide range of annotation and evaluation functions are available, but they also lack integrated post-processing methods and systematic workflows. The article presents novel methods developed to facilitate such automated 3D workflows and validates them on a small historic church building in Thuringia, Germany. Post-processing steps using photogrammetric 3D reconstruction data along with imagery were implemented, which show the possibilities of integrating 2D annotations into 3D documentations. Further, the application of voxel-based methods on the dataset enables the evaluation of geometrical changes of multitemporal annotations in different states and the assignment to elements of scans or building models. The proposed workflow also highlights the potential of these methods for condition assessment and planning of restoration work, as well as the possibility to represent the analysis results in standardised building model formats.
Identity management provides PET (privacy-enhancing technology) tools for users to control the privacy of their personal data. With the support of mobile location determination techniques based on GPS, WLAN, Bluetooth, etc., context-aware and location-aware mobile applications (e.g. restaurant finders, friend finders, indoor and outdoor navigation) have gained considerable interest in the business and IT world. Considering sensitive static personal information (e.g. name, address, phone number) as well as dynamic personal information (e.g. current location, velocity in a car, current status), mobile identity management is required to help mobile users safeguard their personal data. In this paper, we evaluate certain required aspects and features (e.g. context-to-context dependence and relation, blurring in levels, trust management with P3P integration, extended privacy preferences) of mobile identity management.
It is widely accepted that most people spend the majority of their lives indoors. Most individuals do not realize that while indoors, roughly half of the heat exchange affecting their thermal comfort is in the form of thermal infrared radiation. We show that while researchers have been aware of its thermal comfort significance over the past century, systemic error has crept into the most common evaluation techniques, preventing adequate characterization of the radiant environment. Measuring and characterizing radiant heat transfer is a critical component of both building energy efficiency and occupant thermal comfort and productivity. Globe thermometers are typically used to measure mean radiant temperature (MRT), a commonly used metric for accounting for the radiant effects of an environment at a point in space. In this paper we extend previous field work to a controlled laboratory setting to (1) rigorously demonstrate that the existing correction factors in the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Standard 55 and ISO 7726 for using globe thermometers to quantify MRT are not sufficient; (2) develop a correction to improve the use of globe thermometers and address problems in the current standards; and (3) show that mean radiant temperature measured with ping-pong-ball-sized globe thermometers is not reliable due to a stochastic convective bias. We also provide an analysis of the maximum precision of globe sensors themselves, an analysis missing from the contemporary literature.
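For context, the standard forced-convection globe correction from ISO 7726 (the kind of correction the paper argues is insufficient) can be written as a one-line function. The defaults below correspond to a standard 0.15 m globe with emissivity 0.95; this is a textbook sketch of the standard formula, not the correction developed in the paper:

```python
def mean_radiant_temperature(t_globe, t_air, v_air, d=0.15, emissivity=0.95):
    """ISO 7726 forced-convection MRT estimate from a globe thermometer.
    Temperatures in deg C, air speed v_air in m/s, globe diameter d in m."""
    return ((t_globe + 273.0) ** 4
            + 1.1e8 * v_air ** 0.6 / (emissivity * d ** 0.4)
            * (t_globe - t_air)) ** 0.25 - 273.0
```

When globe and air temperature coincide, the convective term vanishes and MRT equals the globe temperature; when the globe reads above the air, the inferred MRT exceeds the globe reading.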
This paper presents a novel numerical procedure based on the framework of isogeometric analysis for the static, free vibration, and buckling analysis of laminated composite plates using the first-order shear deformation theory. The isogeometric approach utilizes non-uniform rational B-splines (NURBS) to implement quadratic, cubic, and quartic elements. The shear locking problem still exists in the stiffness formulation; however, it can be significantly alleviated by a stabilization technique. Several numerical examples are presented to show the performance of the method, and the results obtained are compared with other available results.
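The NURBS basis underlying such isogeometric elements is built from B-spline basis functions via the Cox-de Boor recursion; a minimal sketch (not the paper's code) is:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: i-th B-spline basis function of degree p at
    parameter u (the building block of NURBS shape functions)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_basis(u, p, knots, weights):
    """Rational basis: weighted B-splines normalized to a partition of unity."""
    vals = [weights[i] * bspline_basis(i, p, u, knots) for i in range(len(weights))]
    s = sum(vals)
    return [v / s for v in vals]
```

With an open knot vector and unit weights the quadratic case reduces to the Bernstein polynomials, a handy sanity check.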
Looking at our face in a mirror is one of the strongest phenomenological experiences of the Self, in which we need to identify the face reflected in the mirror as belonging to us. Recent behavioral and neuroimaging studies reported that self-face identification relies not only upon the visual-mnemonic representation of one's own face but also upon continuous updating and integration of visuo-tactile signals. Therefore, bodily self-consciousness plays a major role in self-face identification, with respect to the interplay between unisensory and multisensory processing. However, while previous studies demonstrated that the integration of multisensory body-related signals contributes to the visual processing of one's own face, there are so far no data regarding how self-face identification, inversely, contributes to bodily self-consciousness. In the present study, we tested whether self-other face identification impacts either the egocentered or heterocentered visuo-spatial mechanisms that are core processes of bodily self-consciousness and sustain self-other distinction. For that, we developed a new paradigm, named Double Mirror. This paradigm, consisting of a semi-transparent double mirror and computer-controlled light-emitting diodes, elicits a self-other face-merging illusion under ecologically more valid conditions, i.e., when participants are physically facing each other and interacting. Self-face identification was manipulated by exposing pairs of participants to an Interpersonal Visual Stimulation in which the reflections of their faces merged in the mirror. Participants simultaneously performed visuo-spatial and mental own-body transformation tasks centered on their own face (egocentered) or the face of their partner (heterocentered) in the pre- and post-stimulation phase. We show that self-other face identification altered the egocentered visuo-spatial mechanisms. Heterocentered coding was preserved.
Our data suggest that changes in self-face identification induced a bottom-up conflict between the current visual representation and the stored mnemonic representation of one's own face, which, in turn, impacted bodily self-consciousness in a top-down manner.
Scientific colloquium, held from 24 to 27 April 2003 at the Bauhaus-Universität Weimar, on the topic 'MediumArchitektur - Zur Krise der Vermittlung' ('Medium Architecture - On the Crisis of Mediation').
This paper proposes an instrumental system for the analysis of mechanics problems of deformable solid bodies. It allows the researcher to describe, within a single task, the input data for the object under analysis and the problem scheme based upon variational principles. A particular feature of the system is the ability to describe objects of arbitrary geometrical shape and to define the computational scheme programmatically for the purpose at hand. The methods allow the computation of tasks with an indefinite functional and indefinite geometry of the object (or set of objects). The system computes tasks with an indefinite scheme based upon the Finite Element Method (FEM); its restrictions are therefore determined by the restrictions of the FEM itself. In contrast to other known FEM programs (ANSYS, LS-DYNA, etc.), the described system is more universal in defining input data and choosing the computational scheme. An original subsystem for the analysis of numerical results is built in; it can visualize all numerical results, plot diagrams of the unknown variables, etc. The subsystem has been validated on two- and three-dimensional problems of elasticity and plasticity under conditions of geometrical nonlinearity. Contact problems of statics and dynamics are also discussed.
Volume rendering is a visualization technique for displaying various spatial measurement and simulation data in a descriptive, interactive graphical form. This contribution presents a method for overlaying multiple volume datasets with an architectural surface model. This complex rendering computation is performed with hardware-accelerated shaders on the graphics card. The implemented software prototype "VolumeRendering" is presented. In addition to the interactive computation method, emphasis was placed on user-friendly operation, the goal being to enable specialist planners to evaluate the volume data easily. Overlaying, for example, data from different measurement methods with a surface model yields synergies and new evaluation possibilities. Finally, the application of the software prototype is illustrated with examples from an interdisciplinary research project.
Scientific colloquium, held from 19 to 22 April 2007 at the Bauhaus-Universität Weimar, on the topic 'Die Realität des Imaginären. Architektur und das digitale Bild' ('The Reality of the Imaginary. Architecture and the Digital Image').
Many researchers are working on developing robots into adequate partners, whether at the workplace, at home, or in leisure activities, or on enabling elderly persons to lead a self-determined, independent life. While considerable progress has been made in, e.g., understanding, processing, and expressing speech or emotion, the relations between humans and robots are usually only short-term. In order to build long-term, i.e., social relations, qualities like empathy, trust building, dependability, and non-patronizing behavior will be required. But these are just terms, and as such no adequate starting points for "programming" these capacities, let alone for avoiding the problems and pitfalls in interactions between humans and robots. However, a rich source is available, so far unused for this purpose: artistic productions, namely literature, theater plays, operas, and films, with their multitude of examples. Poets, writers, dramatists, screenwriters, etc. have studied for centuries the facets of interactions between persons, their dynamics, and the related snags. And since we envisage human-robot relations as master-servant relations - the human obviously being the master - the study of these relations will be prominent. A procedure with four consecutive steps is proposed: Selection, Analysis, Categorization, and Integration. Only if we succeed in developing robots which are seen as servants will we be successful in supporting and helping humans through robots.
In this paper we introduce LUCI, a Lightweight Urban Calculation Interchange system, designed to bring the advantages of a calculation and content coordination system to small planning and design groups by means of open-source middleware. The middleware focuses on problems typical of urban planning and therefore features a geo-data repository as well as a job runtime administration to coordinate simulation models and their multiple views. The described system architecture is accompanied by two exemplary use cases that have been used to test and further develop our concepts and implementations.
Conventional superplasticizers based on polycarboxylate ether (PCE) show an intolerance to clay minerals due to intercalation of their polyethylene glycol (PEG) side chains into the interlayers of the clay mineral. An intolerance to very basic media is also known. This makes PCE an unsuitable choice as a superplasticizer for geopolymers. Bio-based superplasticizers derived from starch showed comparable effects to PCE in a cementitious system. The aim of the present study was to determine if starch superplasticizers (SSPs) could be a suitable additive for geopolymers by carrying out basic investigations with respect to slump, hardening, compressive and flexural strength, shrinkage, and porosity. Four SSPs were synthesized, differing in charge polarity and specific charge density. Two conventional PCE superplasticizers, differing in terms of molecular structure, were also included in this study. The results revealed that SSPs improved the slump of a metakaolin-based geopolymer (MK-geopolymer) mortar while the PCE investigated showed no improvement. The impact of superplasticizers on early hardening (up to 72 h) was negligible. Less linear shrinkage over the course of 56 days was seen for all samples in comparison with the reference. Compressive strengths of SSP specimens tested after 7 and 28 days of curing were comparable to the reference, while PCE led to a decline. The SSPs had a small impact on porosity with a shift to the formation of more gel pores while PCE caused an increase in porosity. Throughout this research, SSPs were identified as promising superplasticizers for MK-geopolymer mortar and concrete.
Methods with convergence order p ≥ 2 (Newton's method, the method of tangent hyperbolas, the method of tangent parabolas, etc.) and their approximate variants are studied. Conditions are presented under which the approximate variants preserve the convergence rate intrinsic to these methods, and some computational aspects (possibilities to organize parallel computation, globalization of a method, solution of the linear equations versus matrix inversion at every iteration, etc.) are discussed. Polyalgorithmic computational schemes (hybrid methods) combining the best features of various methods are developed, and possibilities of applying them to the numerical solution of two-point boundary-value problems in ordinary differential equations and to decomposition-coordination problems in convex programming are analyzed.
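The contrast between an exact second-order method and an approximate variant with a cheaper, less frequently updated derivative can be illustrated for a scalar equation; this sketch is only a simplified stand-in for the class of methods studied:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration (convergence order 2 near a simple root)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def frozen_newton(f, df, x0, refresh=3, tol=1e-12, max_iter=50):
    """Approximate variant: the derivative is re-evaluated only every
    `refresh` iterations, trading convergence rate for cheaper steps."""
    x, d = x0, df(x0)
    for k in range(max_iter):
        if k % refresh == 0:
            d = df(x)
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            break
    return x
```

Both variants converge for well-behaved problems; the frozen variant needs more iterations but each one is cheaper, which is exactly the trade-off the approximate-variant analysis quantifies.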
Few studies have investigated how search behavior affects complex writing tasks. We analyze a dataset of 150 long essays whose authors searched the ClueWeb09 corpus for source material, while all querying, clicking, and writing activity was meticulously recorded. We model the effect of search and writing behavior on essay quality using path analysis. Since the boil-down and build-up writing strategies identified in previous research have been found to affect search behavior, we model each writing strategy separately. Our analysis shows that the search process contributes significantly to essay quality through both direct and mediated effects, while the author's writing strategy moderates this relationship. Our models explain 25–35% of the variation in essay quality through rather simple search and writing process characteristics alone, a fact that has implications for how search engines could personalize result pages for writing tasks. Authors' writing strategies and associated searching patterns differ, producing differences in essay quality. In a nutshell: essay quality improves if search and writing strategies harmonize—build-up writers benefit from focused, in-depth querying, while boil-down writers fare better with a broader and shallower querying strategy.
Individual views on a building product held by the people involved in the design process imply different models for planning and calculation. In order to interpret the geometrical, topological, and semantic data of a building model, we identify a structural component graph, a graph of room faces, a room graph, and a relational object graph as aids, and we explain algorithms to derive these relations. The application of the presented technique is demonstrated by the analysis and discretization of a sample model in the scope of building energy simulation.
The theory of random matrices, or random matrix theory (RMT in what follows), was developed at the beginning of the fifties to describe the statistical properties of energy levels of complex quantum systems [1], [2], [3]. In the early eighties it enjoyed renewed interest, since it was recognized as a very useful tool in the study of numerous physical systems. Specifically, it is very useful in the analysis of chaotic quantum systems. In fact, in recent years many papers have appeared on the problem of quantum chaos, which concerns the quantization of systems whose underlying classical dynamics is irregular (i.e., chaotic). The simplest models considered in this field are billiards of various shapes. From the classical point of view, a point particle in a 2-dimensional billiard displays regular or irregular motion depending on the shape of the billiard; for instance, motion in a rectangular or circular billiard is regular thanks to the symmetries of the boundary. On the other hand, billiards of arbitrary shape imply chaotic motion, i.e., exponential divergence of initially nearby trajectories. In order to study quantum billiards we have to consider the Schroedinger equation in various 2-dimensional domains. The eigenvalues of the Schroedinger equation represent the allowed energy levels of our quantum particle in the billiard under consideration, while the eigenfunction norms represent the probability density of finding the particle in a certain position. The question of quantum chaos is whether the character of the classical motion (regular or chaotic) can influence some properties
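The standard spectral statistics used to distinguish regular from chaotic dynamics are the nearest-neighbour level-spacing distributions: the Wigner surmise (for the Gaussian orthogonal ensemble) exhibits level repulsion, while uncorrelated spectra of regular systems follow a Poisson law. A minimal sketch, assuming spacings normalized to unit mean:

```python
import math

def wigner_surmise(s):
    """GOE nearest-neighbour spacing density (chaotic spectra):
    p(s) = (pi/2) s exp(-pi s^2 / 4), with level repulsion p(0) = 0."""
    return math.pi / 2.0 * s * math.exp(-math.pi * s * s / 4.0)

def poisson_spacing(s):
    """Spacing density of uncorrelated levels (regular spectra): p(s) = exp(-s)."""
    return math.exp(-s)

def integrate(f, a, b, n=100000):
    """Midpoint-rule quadrature, sufficient to check normalization and mean."""
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))
```

Both densities integrate to one and have unit mean spacing; they differ most visibly near s = 0, where the Wigner surmise vanishes (levels repel) while the Poisson law peaks (levels cluster).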
Wissenschaftliches Kolloquium vom 24. bis 27. April 2003 in Weimar an der Bauhaus-Universität zum Thema: ‚MediumArchitektur - Zur Krise der Vermittlung'
When predicting sound pressure levels induced by structure-borne sound sources and describing the sound propagation path through the building structure as exactly as possible, it is necessary to characterize the vibration behavior of the structure-borne sound sources. In this investigation, the characterization of structure-borne sound sources was performed using the two-stage method (TSM) described in EN 15657. Four different structure-borne sound sources were characterized and subsequently installed in a lightweight test stand. The resulting sound pressure levels in an adjacent receiving room were measured. In the second step, sound pressure levels were predicted according to EN 12354-5 based on the parameters of the structure-borne sound sources. Subsequently, the predicted and the measured sound pressure levels were compared to obtain reliable statements on the achievable accuracy when using source quantities determined by TSM with this prediction method.
This work presents a discrete event simulator for the automated sequencing and optimization of building processes. The sequencing is based on the commonly used component-activity-resource relations, taking structural and process constraints into account. For the optimization, a genetic algorithm approach was developed, implemented, and successfully applied to several real-life steel construction projects. In this contribution we discuss the application of the discrete event simulator, including its optimization capabilities, to a 4D process model of the steel structure of an automobile recycling facility.
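A random-key genetic algorithm is one common way to encode precedence-constrained sequencing problems of this kind. The sketch below is a generic, simplified stand-in (minimizing total completion time on a single resource), not the simulator's actual component-activity-resource model; all names and parameters are illustrative:

```python
import random

def decode(keys, durations, preds):
    """Serial decoding of a random-key individual: always start the
    highest-priority activity whose predecessors are all finished."""
    n = len(durations)
    done, order, t, total = set(), [], 0.0, 0.0
    while len(order) < n:
        ready = [i for i in range(n) if i not in done and preds[i] <= done]
        j = max(ready, key=lambda i: keys[i])
        t += durations[j]
        total += t              # completion time of activity j
        done.add(j)
        order.append(j)
    return order, total

def genetic_sequencer(durations, preds, pop_size=30, gens=60, seed=0):
    """Random-key GA: elitist selection, uniform crossover, point mutation."""
    rng = random.Random(seed)
    n = len(durations)
    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    fitness = lambda ind: decode(ind, durations, preds)[1]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 3]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            child[rng.randrange(n)] = rng.random()   # mutation
            children.append(child)
        pop = elite + children
    return decode(min(pop, key=fitness), durations, preds)
```

Because feasibility is enforced in the decoder rather than in the chromosome, crossover and mutation never produce invalid sequences, which is a common design choice for precedence-constrained scheduling.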