Temporary changes in precipitation may lead to sustained and severe droughts or massive floods in different parts of the world. Knowledge of precipitation variability can effectively support decision-makers in water resources management. Large-scale circulation drivers have a considerable impact on precipitation in different parts of the world. In this research, the impact of the El Niño-Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO), and the North Atlantic Oscillation (NAO) on seasonal precipitation over Iran was investigated. For this purpose, 103 synoptic stations with at least 30 years of data were utilized. The Spearman correlation coefficient between the indices in the preceding 12 months and seasonal precipitation was calculated, and the significant correlations were extracted. Then, the month in which each index has the highest correlation with seasonal precipitation was determined. Finally, the overall increase or decrease in seasonal precipitation attributable to each index was calculated. The results indicate that the Southern Oscillation Index (SOI), the NAO, and the PDO have the strongest impact on seasonal precipitation, in that order. These indices have the highest impact on precipitation in winter, autumn, spring, and summer, respectively. The SOI affects winter precipitation in the opposite sense to the PDO and NAO, while in the other seasons each index has its own distinct impact on seasonal precipitation. Generally, all indices in their different phases may decrease seasonal precipitation by up to 100%, while they may also increase seasonal precipitation by more than 100% in different seasons. The results of this study can be used effectively in water resources management, and especially in dam operation.
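The lag-correlation screening described above can be sketched with synthetic data; the station records themselves are not reproduced here, and the month-9 coupling in `index_monthly` is an illustrative assumption:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic stand-in data (the real study uses 103 Iranian stations): 40 years of
# a monthly climate index and winter precipitation driven by the month-9 index.
years = 40
index_monthly = rng.normal(size=(years, 12))
winter_precip = 50 + 20 * index_monthly[:, 9] + rng.normal(scale=5, size=years)

# Correlate winter precipitation with the index in each preceding month and keep
# the month with the strongest (absolute) Spearman correlation.
results = []
for m in range(12):
    rho_m, p_m = spearmanr(index_monthly[:, m], winter_precip)
    results.append((m, rho_m, p_m))
month, rho, p = max(results, key=lambda t: abs(t[1]))
print(f"best lead month: {month}, Spearman rho = {rho:.2f} (p = {p:.1e})")
```

With real data, the same loop would additionally filter out correlations whose p-value exceeds the chosen significance level before selecting the best lead month.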
In displacement-oriented methods of structural mechanics, static and dynamic equilibrium conditions may lead to large coupled nonlinear systems of equations. In many cases they are solved iteratively using variants of Newton's method. Alternatively, the equations may be expressed as the Karush-Kuhn-Tucker conditions of an optimization problem and may therefore be solved with methods of mathematical programming. The work first deals with the fundamentals of the formulation as an optimization problem; in particular, the requirements arising from material nonlinearity and contact situations are analyzed. Subsequently, an algorithm is implemented that exploits the usually sparse structure of the Hessian matrix, with particular attention paid to analyzing and tuning the convergence behaviour. The implementation was tested on examples from the statics and dynamics of large systems. The results are verified for accuracy against alternative solutions (e.g. explicit methods). Potential areas of application are shown and the efficiency of the method is evaluated.
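The reformulation of equilibrium as an optimization problem can be illustrated on a toy system: a minimal sketch, assuming a two-spring chain with cubic hardening, where minimizing the potential energy recovers the equilibrium state (for large systems, a sparse Hessian could additionally be supplied to a Newton-type solver):

```python
import numpy as np
from scipy.optimize import minimize

k, k3, P = 100.0, 50.0, 20.0   # linear stiffness, cubic stiffness, applied load

def energy(u):
    """Total potential energy of a ground-spring-node-spring-node chain."""
    d = np.array([u[0], u[1] - u[0]])              # spring elongations
    return np.sum(0.5 * k * d**2 + 0.25 * k3 * d**4) - P * u[1]

def gradient(u):
    """Residual: internal spring forces minus external load."""
    d = np.array([u[0], u[1] - u[0]])
    f = k * d + k3 * d**3                          # nonlinear spring forces
    return np.array([f[0] - f[1], f[1] - P])

res = minimize(energy, x0=np.zeros(2), jac=gradient, method="BFGS")
u = res.x
print(u)   # at equilibrium, every spring in the chain carries the full load P
```

The stationarity condition of the minimization is exactly the nonlinear equilibrium equation one would otherwise hand to a Newton iteration.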
Increasing structural robustness is a goal of interest to the structural engineering community. The partial collapse of RC buildings is the subject of this dissertation. Understanding the robustness of RC buildings will guide the development of structures that are safer against abnormal loading scenarios such as explosions, earthquakes, fire, and/or long-term accumulation effects leading to deterioration or fatigue. Any of these may result in immediate local structural damage that can propagate to the rest of the structure, causing what is known as disproportionate collapse.
This work handles collapse propagation through various analytical approaches that simplify the mechanical description of reinforced concrete structures damaged by extreme accidental events.
A phantom-node method is developed for three-node shell elements to describe cracks. This method can treat arbitrary cracks independently of the mesh. The crack may cut elements completely or partially. Elements are overlapped on the position of the crack, and they are partially integrated to implement the discontinuous displacement across the crack. To consider the element containing a crack tip, a new kinematical relation between the overlapped elements is developed. There is no enrichment function for the discontinuous displacement field. Several numerical examples are presented to illustrate the proposed method.
Paraffin Nanocomposites for Heat Management of Lithium-Ion Batteries: A Computational Investigation
(2016)
Lithium-ion (Li-ion) batteries are currently considered vital components for advances in mobile technologies such as communications and transport. Nonetheless, Li-ion batteries suffer from temperature rises which sometimes lead to operational damage or may even cause fire. An appropriate solution to control the temperature changes during the operation of Li-ion batteries is to embed the batteries inside a paraffin matrix to absorb and dissipate heat. In the present work, we aimed to investigate the possibility of making paraffin nanocomposites for better heat management of a Li-ion battery pack. To fulfill this aim, heat generation during battery charging/discharging cycles was simulated using Newman's well-established electrochemical pseudo-2D model. We coupled this model to a 3D heat transfer model to predict the temperature evolution during battery operation. In the latter model, we considered different paraffin nanocomposite structures made by the addition of graphene, carbon nanotubes, and fullerene, assuming the same thermal conductivity for all fillers; this way, our results mainly correlate with the geometry of the fillers. Our results assess the degree of enhancement in heat dissipation of Li-ion batteries through the use of paraffin nanocomposites, and they may be used as a guide for experimental set-ups to improve the heat management of Li-ion batteries.
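The coupled electrochemical-thermal model itself is beyond a short example, but the heat-management idea can be sketched with a lumped two-node model; all parameter values below are illustrative assumptions, not data from the study:

```python
# Lumped two-node sketch: the cell generates heat Q, the paraffin node absorbs it
# and loses heat to the ambient. Explicit Euler integration in time.
Q = 2.0          # W, heat generation during discharge (assumed)
C_cell = 40.0    # J/K, cell heat capacity (assumed)
C_wax = 400.0    # J/K, paraffin heat capacity (assumed)
G = 1.0          # W/K, cell-to-paraffin conductance (assumed)
hA = 0.2         # W/K, paraffin-to-ambient loss (assumed)
T_amb = 25.0     # degC, ambient temperature

dt, t_end = 1.0, 1800.0
T_cell, T_wax = T_amb, T_amb
for _ in range(int(t_end / dt)):
    q_cw = G * (T_cell - T_wax)                       # heat flowing into the wax
    T_cell += dt * (Q - q_cw) / C_cell
    T_wax += dt * (q_cw - hA * (T_wax - T_amb)) / C_wax
print(f"cell temperature after {t_end:.0f} s: {T_cell:.1f} degC")
```

A higher effective conductance `G` (the role of the conductive nanofillers) lowers the cell-to-paraffin temperature gap, which is the effect the nanocomposite study quantifies for different filler geometries.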
Environmental and operational variables and their impact on structural responses have been acknowledged as one of the most important challenges for the application of ambient-vibration-based damage identification in structures. Damage detection procedures may yield poor results if the impacts of the loading and environmental conditions of the structures are not considered.
The reference-surface-based method proposed in this thesis is intended to overcome this problem. In the proposed method, meta-models are used to take the significant effects of the environmental and operational variables into account. The use of approximation models allows the proposed method to handle multiple non-damage variable effects simultaneously, which appears very complex for other methods. The inputs of the meta-model are the multiple non-damage variables, while the output is a damage indicator.
The reference-surface-based method diminishes the effect of the non-damage variables on the vibration-based damage detection results. Hence, the structural condition assessed using ambient vibration data at any time is more reliable. Immediately available, reliable information on the structural condition is required to respond quickly to an event, namely to take the necessary actions concerning the future use or further investigation of the structure, for instance shortly after extreme events such as earthquakes.
The critical part of the proposed damage detection method is the learning phase, in which the meta-models are trained using the input-output relation of observation data. Significant problems that may be encountered during the learning phase are outlined, and some remedies are suggested.
The proposed damage identification method is applied to numerical and experimental models. In addition to the natural frequencies, wavelet energy and stochastic subspace damage indicators are used.
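A minimal sketch of the reference-surface idea, assuming a single environmental variable (temperature) and a synthetic frequency-temperature relation: a meta-model trained on healthy-state data yields a residual that serves as the damage indicator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Healthy training data: natural frequency varies with temperature (synthetic).
T_train = rng.uniform(-10, 30, 200)
f_train = 2.5 - 0.004 * T_train + rng.normal(scale=0.01, size=200)

# Meta-model (reference surface): quadratic polynomial fit over temperature.
coeffs = np.polyfit(T_train, f_train, deg=2)
predict = np.poly1d(coeffs)

def damage_indicator(T_obs, f_obs):
    """Residual between the reference-surface frequency and the measurement."""
    return predict(T_obs) - f_obs

# New measurements: one healthy, one with a 5 % frequency drop (simulated damage).
healthy = damage_indicator(15.0, 2.5 - 0.004 * 15.0)
damaged = damage_indicator(15.0, 0.95 * (2.5 - 0.004 * 15.0))
print(healthy, damaged)   # near zero vs. clearly positive
```

With several non-damage variables, the polynomial would simply become a multivariate response surface over all of them, which is the case the thesis addresses.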
Practical examples show that, by improving the cash flow, the total amount of money spent in construction and further use may be cut significantly. The calculation is based on spreadsheets, which are very easy to develop on most PCs nowadays. Construction works are a field where the evaluation of cash flow can and should be applied. Decisions about cash flow in construction are decisions with long-term impact and long-term memory: mistakes from the distant past have a massive impact on the present situation and on the far economic future of economic activities. Two approaches exist: the just-in-time (JIT) approach and the life-cycle-cost (LCC) approach. The calculation example shows the dynamic results for the production speed as opposed to a stable flow of production over the duration of activities. More sophisticated rescheduling toward an optimal solution might bring extra profit in return. The technologies and organizational processes for industrial buildings, railways and road reconstruction, public utilities and housing developments contain assembly procedures that are very appropriate for this purpose, and complicated research, development and innovation projects are likewise well suited to these kinds of applications. Large investments and public money may be spent more efficiently if an optimised speed strategy can be calculated.
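The effect of cash-flow timing can be sketched with a simple spreadsheet-style net-present-value calculation; the figures and the 6 % discount rate are illustrative assumptions, not the paper's data:

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Two construction spending strategies with the same nominal total cost:
# JIT-style spending deferred toward completion vs. front-loaded spending.
rate = 0.06
jit = [-100, -150, -250, -500]
front_load = [-500, -250, -150, -100]

print(npv(rate, jit), npv(rate, front_load))   # deferred spending costs less in NPV
```

Although both schedules sum to the same nominal outlay, discounting makes the deferred (JIT-style) schedule cheaper in present-value terms, which is the lever the abstract refers to.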
The construction and operation of a sanitary landfill (SLF) in the Philippines raises concerns about the regulation of the activities of the informal sector in the area. In anticipation of these directives, an association of informal waste reclaimers called the Uswag Calajunan Livelihood Association, Inc. (UCLA) was formed in May 2009. One option identified was a waste-to-energy activity based on the production of fuel briquettes. With raw materials available in the area, what was lacking was an appropriate technology that would cater to their needs. This study therefore presents the case of UCLA and shows how socio-economic and technical aspects were integrated in the development and improvement of a briquetting technology needed to produce quality briquettes as part of their income-generating activities. A non-experimental, posttest-only design was utilized for the collection of descriptive information. The enhancement of the briquetting machine, from the first hand-press molder developed to the finalized design, is also described and discussed.
Results revealed that the improved briquetting technology withstood the wear and tear of operation, showing a significant (P<0.01) increase in the production rate (220 pcs/hr; 4 kg/hr) and bulk density (444.83 kg/m³) of the briquettes produced. The quality of the cylindrical briquettes in terms of bulk density, heating value (15.13 MJ/kg), moisture (6.2%), N and S closely met or fully met the requirements of DIN 51731. Based on the operating expenses, the briquettes may be marked up to Php0.25/pc (USD0.006) or Php15.00/kg (USD0.34) for profit generation. The potential daily earnings of Php130.00 (USD2.95) to Php288.56 (USD6.56) from producing briquettes are higher than the daily income of the majority of waste reclaimers, Php124.00 (USD2.82). The highly positive response (93%) on the usability of the briquettes and the willingness of the respondents (81%) to buy them when sold on the market indicate their promising potential as fuel in the nearby communities. The briquette production results from the case of UCLA could be considered a potential source of income given the social, technical, economic and environmental feasibility of the experiment. This method of utilizing wastes in an urban setting of a developing country may also be recommended for testing or replication in places with similar socio-economic and physical set-ups.
The promise of lower costs for sensors that can be used for construction inspection means that inspectors will continue to have new choices to consider in creating inspection plans. However, these emerging inspection methods can require different activities, resources, and decisions such that it can be difficult to compare the emerging methods with other methods that satisfy the same inspection needs. Furthermore, the context in which inspection is performed can significantly influence how well certain inspection methods are suited for a given set of goals for inspection. Context information, such as weather, security, and the regulatory environment, can be used to understand what information about a component should be collected and how an inspection should be performed. The research described in this paper is aimed at developing an approach for comparing and selecting inspection plans. This approach consists of (1) refinement of given goals for inspection, if necessary, in order to address any additional information needs due to a given context and in order to reach a level of detail that can be addressed by an inspection activity; (2) development of constraints to describe how an inspection should be achieved; (3) matching of goals to available inspection methods, and generation of activities and resource plans in order to address the goals; and (4) selection of an inspection plan from among the possible plans that have been identified. The authors illustrate this approach with observations made at a local construction site.
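Steps (3) and (4) of the approach above can be sketched as a toy matching problem; the method names, coverage sets, costs, and context needs below are hypothetical:

```python
from itertools import combinations

# Hypothetical inspection methods: what each covers, what it costs, and what the
# context must provide for it to be usable.
methods = {
    "visual":      {"covers": {"surface cracks"}, "cost": 1, "needs": {"daylight"}},
    "laser_scan":  {"covers": {"surface cracks", "geometry"}, "cost": 5, "needs": set()},
    "drone_photo": {"covers": {"geometry", "roof condition"}, "cost": 3,
                    "needs": {"flight permission"}},
}

def feasible_plans(goals, context):
    """Yield (method combination, cost) covering all goals under the context."""
    usable = [m for m, d in methods.items() if d["needs"] <= context]
    for r in range(1, len(usable) + 1):
        for combo in combinations(usable, r):
            covered = set().union(*(methods[m]["covers"] for m in combo))
            if goals <= covered:
                yield combo, sum(methods[m]["cost"] for m in combo)

goals = {"surface cracks", "geometry"}
context = {"daylight"}          # no flight permission available in this context
plan, cost = min(feasible_plans(goals, context), key=lambda pc: pc[1])
print(plan, cost)
```

Changing the context (e.g. adding flight permission) changes which methods are usable, which is exactly how the paper argues context should shape plan selection.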
This paper is a report on Radio Frequency Identification (RFID) technology and its potential applications in the commercial construction industry. RFID technology offers wireless communication between RFID tags and readers with non-line-of-sight readability. These fundamental properties eliminate manual data entry and introduce the potential for automated processes to increase project productivity, construction safety, and project cost efficiency. Construction contractors, owners, and material suppliers that believe technology can further develop methods and processes in construction should feel obligated to participate in RFID studies for the advancement of the construction industry as a whole.
We describe the database requirements of SEED (Software Environment to Support the Early Phases in Building Design). The requirements are typical for a database intended to support a heterogeneous design support environment consisting of independent software modules with diverse internal design models; they are not met by any commercial database system. The design and implementation of this database is an integral part of the overall software engineering effort. We describe the SEED approach, which integrates external and in-house software based on a shared information model specified in the modeling language SPROUT. SPROUT allows for the specification of domains, classes, relationship types and their behavior, and multiple classifications. The SPROUT run-time system organizes and coordinates the communication between the software modules and the database.
In this study we introduce a concept of a discrete Laplacian on the plane lattice and consider its iteration as a dynamical system. First we discuss some basic properties of the dynamical system to be proved. Next, using computer simulations, we show that we can realize the following phenomena quite well: (1) the crystallization of water, (2) the designs of carpets and embroideries, (3) the time evolution of the numbers of families of extinct animals, and (4) ecosystems of living things. Hence we may expect that such dynamical systems can help us understand evolution and self-organization. Here we want to stress the following fact: although several well-known chaotic dynamical systems can describe chaotic phenomena, they have difficulties in describing evolution and self-organization.
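A minimal sketch of such an iteration, assuming a thresholded diffusion rule on a square lattice (the authors' exact update rule is not specified here); a single seed grows into a symmetric, crystal-like pattern:

```python
import numpy as np

# Discrete Laplacian on a 2-D lattice combined with a threshold nonlinearity;
# iterating from a single seed grows a snowflake-like frozen region.
n, steps = 65, 20
u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0            # seed in the centre

for _ in range(steps):
    # 5-point discrete Laplacian with periodic boundaries (np.roll).
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
           + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u = np.clip(u + 0.25 * lap, 0.0, 1.0)
    u[u > 0.05] = 1.0              # threshold rule: a site "freezes" once enough mass arrives

print(int(u.sum()))                # number of frozen lattice sites
```

The diffusion step alone would only smear the seed out; the threshold is what produces the self-organized growth patterns the abstract alludes to.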
This research represents an effort to contribute to critical thinking through an analysis of the hegemonic neoliberal ideology, which supports the idea of the end of history and a technocratic universalism that in turn implies the imposition of a single model of life, denying, in the name of realism and the end of utopias, any other alternative possibility.
This makes it necessary to recover critical thinking in order to analyze and understand reality, thus overcoming the ideological barrier and claiming that things can be otherwise.
It is clear from this research that the discourse of sustainable development has unquestionably transformed the context and content of political activity in Europe. This discourse has exercised an obvious influence on governance processes, mainly because it has contributed to the introduction of a new political field, which was then promoted, either explicitly or implicitly, by policy-makers, researchers in the field and practitioners during the last three decades. Though it may be bold to affirm that the discourse of sustainable development is the sole driver of this whole set of changes, there is no doubt that it has played a key part in the way governance priorities have been handled on the European continent.
The underlying goal of this work is to reduce the uncertainty related to thermally induced stress prediction. This is accomplished by considering the use of non-linear material behavior, notably path-dependent thermal hysteresis behavior in the elastic properties.
Primary novel factors of this work center on two aspects.
1. Broad material characterization and mechanistic material understanding, giving insight into why this class of material behaves in characteristic manners.
2. Development and implementation of a thermal hysteresis material model and its use to determine impact on overall macroscopic stress predictions.
Results highlight microcracking evolution and behavior as the dominant mechanism for material property complexity in this class of materials. Additionally, it was found that for the cases studied, thermal hysteresis behavior impacts relevant peak stress predictions of a heavy-duty diesel particulate filter undergoing a drop-to-idle regeneration by less than ~15% for all conditions tested. It is also found that path independent heating curves may be utilized for a linear solution assumption to simplify analysis.
This work brings forth a newly conceived concept of a three-state, four-path, thermally induced microcrack evolution process; demonstrates experimental behavior consistent with the proposed mechanisms; develops a mathematical framework that describes the process; and quantifies the impact in a real-world application space.
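The path dependence can be sketched with a toy two-branch modulus model; the temperatures and moduli below are invented for illustration and are not the characterized values:

```python
# Illustrative path-dependent modulus: microcracks open on heating (softening the
# material) and close again only below a lower temperature on cooling, so the
# modulus at an intermediate temperature depends on the thermal history.
def modulus(T, direction, state):
    """Return (E, new_state); state records whether microcracks are open."""
    E_intact, E_cracked = 30.0, 22.0   # GPa, assumed values
    if direction == "heating" and T > 600.0:
        state = "open"                 # cracks open above the upper threshold
    if direction == "cooling" and T < 300.0:
        state = "closed"               # cracks close below the lower threshold
    return (E_cracked if state == "open" else E_intact), state

state = "closed"
path = [("heating", 500), ("heating", 700), ("cooling", 400), ("cooling", 200)]
history = []
for direction, T in path:
    E, state = modulus(T, direction, state)
    history.append(E)
print(history)   # [30.0, 22.0, 22.0, 30.0]
```

Note the hysteresis: between 300 and 600 the modulus is 30 GPa on the heating branch but 22 GPa on the cooling branch, which is why a path-independent heating curve is only a simplifying assumption.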
Meshfree methods (MMs) such as the element free Galerkin (EFG)method have gained popularity because of some advantages over other numerical methods such as the finite element method (FEM). A group of problems that have attracted a great deal of attention from the EFG method community includes the treatment of large deformations and dealing with strong discontinuities such as cracks. One efficient solution to model cracks is adding special enrichment functions to the standard shape functions such as extended FEM, within the FEM context, and the cracking particles method, based on EFG method. It is well known that explicit time integration in dynamic applications is conditionally stable. Furthermore, in enriched methods, the critical time step may tend to very small values leading to computationally expensive simulations. In this work, we study the stability of enriched MMs and propose two mass-lumping strategies. Then we show that the critical time step for enriched MMs based on lumped mass matrices is of the same order as the critical time step of MMs without enrichment. Moreover, we show that, in contrast to extended FEM, even with a consistent mass matrix, the critical time step does not vanish even when the crack directly crosses a node.
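The role of mass lumping in the critical time step can be illustrated on a standard 1-D bar discretization (not the enriched meshfree setting itself): for central-difference integration, dt_crit = 2/omega_max, and a lumped mass matrix permits a noticeably larger stable step than a consistent one:

```python
import numpy as np
from scipy.linalg import eigh

# 1-D bar, linear elements; compare lumped and consistent mass matrices.
n_el, L, E, A, rho = 10, 1.0, 1.0, 1.0, 1.0
h = L / n_el
n = n_el + 1

K = np.zeros((n, n))
M_c = np.zeros((n, n))
M_l = np.zeros((n, n))
ke = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
me_c = (rho * A * h / 6) * np.array([[2.0, 1.0], [1.0, 2.0]])  # consistent mass
me_l = (rho * A * h / 2) * np.eye(2)                           # row-sum lumping
for e in range(n_el):
    idx = np.ix_([e, e + 1], [e, e + 1])
    K[idx] += ke
    M_c[idx] += me_c
    M_l[idx] += me_l

# Fix the left end, then solve K x = omega^2 M x for the highest frequency.
K, M_c, M_l = K[1:, 1:], M_c[1:, 1:], M_l[1:, 1:]
results = {}
for name, M in [("lumped", M_l), ("consistent", M_c)]:
    w2 = eigh(K, M, eigvals_only=True)
    results[name] = 2.0 / np.sqrt(w2.max())
print(results)   # lumped mass allows the larger stable time step
```

For this element type the lumped critical step approaches h/c (about 0.1 here), while the consistent matrix reduces it by roughly the factor sqrt(3); the paper's contribution is showing that an analogous lumping keeps the step usable even with crack enrichment.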
The worldwide growth of communication networks and associated technologies provides the basic infrastructure for new ways of executing the engineering process. Collaboration amongst team members separated in time and location is of particular importance. Two broad themes can be recognized in research pertaining to distributed collaboration: one focuses on the technical and technological aspects of distributed work, while the other emphasises its human aspects. The case of finite element structural analysis in a distributed collaboratory is examined in this paper. The approach taken has its roots in the human aspects of the structural analysis task. Based on experience of how structural engineers currently approach and execute this task using standard software designed for use on local workstations only, criteria are stated for a software architecture that could support collaborative structural analysis. Aspects of a pilot application and the results of qualitative performance measurements are discussed.
CRITICAL STRESS ASSESSMENT IN ANGLE TO GUSSET PLATE BOLTED CONNECTION BY SIMPLIFIED FEM MODELLING
(2010)
Simplified modelling of friction-grip bolted connections between steel members and gusset plates is often applied in engineering practice. The paper deals with the simplification of the pre-tensioned bolt model and the simplification of load transfer within the connection. The influence on the normal strain (and thus stress) distribution at the critical cross-section is investigated. Laboratory tests of single-angle and double-angle member-to-gusset-plate bolted connections were taken as the basis for the numerical analysis. FE models were created using 1D and 2D elements; angles and gusset plates were modelled with shell elements. Two methods of modelling the friction-grip bolting were considered: a bolt-regarding approach with 1D element systems modelling the bolts, and two variants of a bolt-disregarding approach with special constraints over part of the member and gusset plate surfaces in contact: a) constraints over the whole contact area, b) constraints over the area around each bolt shank ("partially tied"). Modelling friction-grip bolted connections with a simplified bolt model may be effective, especially when the analysis concerns the elastic range only. In such a case, disregarding the bolts and replacing them with "partially tied" modelling seems more attractive: it is less time-consuming and provides results of similar accuracy compared to an analysis using the simplified bolt model.
Effective and efficient cooperation in communities and groups requires that the members of the community or group have adequate information about each other and their environment. In this paper, we outline the basic challenges of managing awareness information. We analyse the management of awareness information in face-to-face situations, and discuss challenges and requirements for the support of awareness management in distributed settings. Finally, after taking a look at related work, we present a simple yet powerful framework for awareness management based on constraint patterns, named COBRA.
The release of the large language model-based chatbot ChatGPT 3.5 in November 2022 has brought considerable attention to the subject of artificial intelligence, and not only among the public. From the perspective of higher education, ChatGPT challenges various learning and assessment formats, as it significantly reduces the effectiveness of their learning and assessment functions. In particular, ChatGPT might be applied to formats that require learners to generate text, such as bachelor theses or student research papers. Accordingly, the research question arises to what extent the writing of bachelor theses is still a valid learning and assessment format. Correspondingly, in this exploratory study, the first author was asked to write his bachelor's thesis exploiting ChatGPT. To trace the impact of ChatGPT methodically, an autoethnographic approach was used: first, all considerations on the potential use of ChatGPT were documented in logs, and second, all ChatGPT chats were logged. Both the logs and the chat histories were analyzed and are presented along with recommendations for students regarding the use of ChatGPT, organized by a common framework. In conclusion, ChatGPT is beneficial for thesis writing during various activities, such as brainstorming, structuring, and text revision. However, limitations arise, e.g., in referencing. Thus, ChatGPT requires continuous validation of the generated outcomes, which in turn fosters learning. Currently, ChatGPT is valued as a beneficial tool in thesis writing. However, writing a conclusive thesis still requires the learner's meaningful engagement. Accordingly, writing a thesis is still a valid learning and assessment format. With further releases of ChatGPT, an increase in capabilities is to be expected, and the research question will need to be reevaluated from time to time.
A technique for using object-oriented technologies to write structural analysis software has been developed. The structural design information of an individual building is stored in an object-oriented database; a global database provides general design values such as material data and safety factors. A class library for load elements has been developed to model the transfer of loads in a building. This class library is the basis for the development of further classes for other structural elements such as beams, columns or slabs. Software has been developed to monitor the forces transferred from one structural member to another in a building for load cases and combinations according to Eurocode 1. The results of the analysis are stored in the project database, from which a structural design report may be generated. The software was developed under Microsoft Visual C++. The Microsoft Foundation Class Library (MFC) was used to program the Graphical User Interface (GUI). Object Linking and Embedding (OLE) technology is used to include any type of OLE server object, for example texts written with a word processor or CAD drawings, in the structural design report. The Object-Oriented Database Management System (OODBMS) ObjectStore provides the services to store the large number of objects.
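The load-transfer idea behind such a class library can be sketched in a few lines; the class and member names are hypothetical, and the sketch is in Python rather than the C++/MFC implementation described:

```python
# Hypothetical sketch of the load-element idea: each structural member collects
# the loads of the members it supports and forwards the total downwards.
class Member:
    def __init__(self, name, self_weight):
        self.name = name
        self.self_weight = self_weight      # kN
        self.supported = []                 # members resting on this one

    def carries(self, member):
        self.supported.append(member)

    def total_load(self):
        """Own weight plus everything transferred from supported members."""
        return self.self_weight + sum(m.total_load() for m in self.supported)

slab = Member("slab", 120.0)
beam = Member("beam", 15.0)
column = Member("column", 8.0)
beam.carries(slab)       # slab rests on beam
column.carries(beam)     # beam rests on column
print(column.total_load())   # 143.0 kN reaches the column
```

In the actual system, subclasses for beams, columns and slabs would refine this transfer with Eurocode 1 load cases and combination factors instead of a plain sum.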
This study applies reliability analysis to address the mechanical behaviour issues in the current structural design of fabric structures. Purely predictive material models are highly desirable to facilitate an optimized design scheme and to significantly reduce time and cost at the design stage, for instance for experimental characterization.
The present study examined the role of three major tasks: a) single-objective optimization, b) sensitivity analyses and c) multi-objective optimization on proposed weave structures for woven fabric composites. For the single-objective optimization task, the first goal is to optimize the elastic properties of the proposed complex weave structure on a unit-cell basis with periodic boundary conditions.
We predict the geometric characteristics regarding the skewness of woven fabric composites via an Evolutionary Algorithm (EA) and a parametric study. We also demonstrate the effect of complex weave structures on the fray tendency of woven fabric composites via a tightness evaluation. We utilize a procedure that does not require a numerical averaging process for evaluating the elastic properties of woven fabric composites. The fray tendency and skewness of woven fabrics depend upon the behaviour of the floats, which is related to the weave factor. The results of this study may suggest a broader view for further research into the effects of complex weave structures, or may provide an alternative to the fray and skewness problems of current weave structures in woven fabric composites.
A comprehensive study is carried out on the complex weave structure model, which adopts the dry woven fabric of the most promising pattern from the single-objective optimization and incorporates the uncertain parameters of woven fabric composites. The comprehensive study covers regression-based and variance-based sensitivity analyses. The goal of the second task is to introduce the uncertain fabric parameters and to elaborate how they can be incorporated into finite element models of macroscopic material parameters, such as the elastic modulus and shear modulus of dry woven fabric subjected to uni-axial and biaxial deformations. Significant correlations in the study would indicate the need for a thorough investigation of woven fabric composites under uncertain parameters. The study described here could serve as an alternative for identifying effective material properties without prolonged time consumption and expensive experimental tests.
The last part focuses on a hierarchical stochastic multi-scale optimization approach (fine-scale and coarse-scale optimization) under uncertain geometrical parameters for hybrid composites with a complex weave structure. The fine-scale optimization determines the best lamina pattern that maximizes the macroscopic elastic properties, conducted by EA under the following uncertain mesoscopic parameters: yarn spacing, yarn height, yarn width and misalignment of the yarn angle. The coarse-scale optimization is carried out to optimize the stacking sequence of a symmetric hybrid laminated composite plate with uncertain mesoscopic parameters by employing the Ant Colony Algorithm (ACO). The objective functions of the coarse-scale optimization are to minimize the cost (C) and weight (W) of the hybrid laminated composite plate, with the fundamental frequency and the buckling load factor as design constraints.
Based on the uncertainty criteria of the design parameters, the appropriate variation required for the structural design standards can be evaluated using the reliability tool, and then an optimized design decision in consideration of cost can be subsequently determined.
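The coarse-scale step described above can be sketched with a minimal ant-colony search over ply materials. All material numbers, the weighted-sum objective, and the single stiffness constraint (a stand-in for the frequency and buckling constraints of the actual study) are illustrative assumptions, not data from this work:

```python
import random

# Hypothetical per-ply properties: (cost, weight, stiffness contribution).
# These numbers are illustrative, not measured material data.
MATERIALS = {
    "carbon": {"cost": 8.0, "weight": 1.0, "stiffness": 4.0},
    "glass":  {"cost": 2.0, "weight": 1.6, "stiffness": 1.5},
}
N_PLIES = 8           # half-laminate (symmetric), so 16 plies in total
MIN_STIFFNESS = 18.0  # stand-in for the frequency/buckling constraints

def objective(seq):
    # Weighted-sum scalarization of the two objectives cost (C) and weight (W).
    cost = sum(MATERIALS[m]["cost"] for m in seq)
    weight = sum(MATERIALS[m]["weight"] for m in seq)
    return cost + 3.0 * weight

def feasible(seq):
    return sum(MATERIALS[m]["stiffness"] for m in seq) >= MIN_STIFFNESS

def aco(n_ants=20, n_iter=50, rho=0.1, seed=1):
    rng = random.Random(seed)
    names = list(MATERIALS)
    tau = [{m: 1.0 for m in names} for _ in range(N_PLIES)]  # pheromone trails
    best, best_f = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Each ant picks a material per ply, proportional to pheromone.
            seq = []
            for i in range(N_PLIES):
                total = sum(tau[i].values())
                r, acc = rng.uniform(0.0, total), 0.0
                for m in names:
                    acc += tau[i][m]
                    if r <= acc:
                        seq.append(m)
                        break
            if feasible(seq):
                f = objective(seq)
                if f < best_f:
                    best, best_f = seq, f
        # Evaporate pheromone and reinforce the best-so-far sequence.
        for i in range(N_PLIES):
            for m in names:
                tau[i][m] *= (1.0 - rho)
            if best is not None:
                tau[i][best[i]] += 1.0 / best_f
    return best, best_f

best_seq, best_val = aco()
```

For these toy numbers the optimum mixes a few stiff, expensive plies with cheaper ones; the pheromone update gradually concentrates the search around such feasible, low-cost sequences.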
As human thought developed, the technology used for illumination grew alongside it. But a journey through history, reviewing its pages and analyzing them, inevitably raises old and new questions, such as: Is it possible to negatively alter the image of historic buildings and monuments through inadequate lighting, to the degree of distorting people's perception of the work? And if so, what causes this? Do lighting designers take into consideration criteria to protect not only historic buildings and monuments, but also the environment? What consequences may inadequate lighting of urban heritage generate for the environment? What factors must be considered for proper illumination of urban heritage? The answers to these questions will help lay the foundation for proper illumination of urban heritage, minimizing light pollution and the effects it generates, and seeking a balanced, harmonious reconciliation between technology, urban heritage and the environment. The framework and case study is the urban heritage of a colonial-era city in southern Mexico with pre-Hispanic roots, whose streets and buildings still convey an atmosphere of mysticism reflecting its folklore and traditions: Chiapa de Corzo, Chiapas.
Object-Oriented Damage Information Modeling Concepts and Implementation for Bridge Inspection
(2022)
Bridges are designed to last for more than 50 years and consume up to 50% of their life-cycle costs during their operation phase. Several inspections and assessment actions are executed during this period. Bridge and damage information must be gathered, digitized, and exchanged between different stakeholders. Currently, the inspection and assessment practices rely on paper-based data collection and exchange, which is time-consuming and error-prone, and leads to loss of information. Storing and exchanging damage and building information in a digital format may lower costs and errors during inspection and assessment and support future needs, for example, immediate simulations regarding performance assessment, automated maintenance planning, and mixed reality inspections. This study focused on the concept for modeling damage information to support bridge reviews and structural analysis. Starting from the definition of multiple use cases and related requirements, the data model for damage information is defined independently from the subsequent implementation. In the next step, the implementation via an established standard is explained. Functional tests aim to identify problems in the concept and implementation. To show the capability of the final model, two example use cases are illustrated: the inspection review of the entire bridge and a finite-element analysis of a single component. Main results are the definition of necessary damage data, an object-oriented damage model, which supports multiple use cases, and the implementation of the model in a standard. Furthermore, the tests have shown that the standard is suitable to deliver damage information; however, several software programs lack proper implementation of the standard.
The influences of polymeric additives on the formation of the microstructure in the early stage of hardening and on the properties, in particular the durability, of the modified mortars were investigated. The question to be answered was whether modification can improve the durability of mortars more than is possible through conventional concrete-technology measures. The formation of the microstructure during the first 24 hours of hardening was examined with various methods, including ESEM. Conceptual models of the formation of the organic and inorganic matrices were developed: the interactions are adsorption reactions, agglomerations and hindrance of hydration. Investigations of fresh and hardened mortars were described and interpreted. Various durability tests were carried out and evaluated. The microstructure of the hardened mortars was examined with regard to its influence on durability.
In machine learning, if the training data is independently and identically distributed with the test data, then a trained model can make accurate predictions for new data samples. Conventional machine learning depends strongly on massive amounts of domain-specific training data to understand their latent patterns. In contrast, domain adaptation and transfer learning are sub-fields of machine learning concerned with solving the inescapable problem of insufficient training data by relaxing the domain-dependence hypothesis. In this contribution, this issue is addressed: by combining both methods in a novel way, we develop a computationally efficient and practical algorithm to solve boundary value problems based on nonlinear partial differential equations. We adopt a meshfree analysis framework to integrate prevailing geometric modelling techniques based on NURBS and present an enhanced deep collocation approach that also plays an important role in the accuracy of solutions. We start with a brief introduction on how these methods expand upon this framework. We observe an excellent agreement between these methods and show how fine-tuning a pre-trained network to a specialized domain may lead to outstanding performance compared to existing approaches. As proof of concept, we illustrate the performance of our proposed model on several benchmark problems.
Assessing Essential Qualities of Urban Space with Emotional and Visual Data Based on GIS Technique
(2016)
Finding a method to evaluate people’s emotional responses to urban spaces in a valid and objective way is fundamentally important for urban design practices and related policy making. Analysis of the essential qualities of urban space could be made both more effective and more accurate using innovative information techniques that have become available in the era of big data. This study introduces an integrated method based on geographical information systems (GIS) and an emotion-tracking technique to quantify the relationship between people’s emotional responses and urban space. This method can evaluate the degree to which people’s emotional responses are influenced by multiple urban characteristics such as building shapes and textures, isovist parameters, visual entropy, and visual fractals. The results indicate that urban spaces may influence people’s emotional responses through both spatial sequence arrangements and shifting scenario sequences. Emotional data were collected with body sensors and GPS devices. Spatial clustering was detected to target effective sampling locations; then, isovists were generated to extract building textures. Logistic regression and a receiver operating characteristic analysis were used to determine the key isovist parameters and the probabilities that they influenced people’s emotions. Finally, based on the results, we make some suggestions for design professionals in the field of urban space optimization.
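The logistic-regression step of such an analysis can be sketched on synthetic data. The features (openness, visual entropy) and the generated labels below are purely illustrative stand-ins for the study's isovist parameters and emotion measurements; the ROC area is computed with the Mann-Whitney formulation:

```python
import math, random

rng = random.Random(0)

# Synthetic stand-ins for isovist parameters; the binary "positive emotional
# response" label is generated so that openness raises the odds of a positive
# response. Purely illustrative data, not the study's measurements.
X, y = [], []
for _ in range(400):
    openness = rng.uniform(0.0, 1.0)
    entropy = rng.uniform(0.0, 1.0)
    p = 1.0 / (1.0 + math.exp(-(4.0 * openness - 2.0 * entropy - 1.0)))
    X.append((1.0, openness, entropy))  # leading 1.0 is the intercept term
    y.append(1 if rng.random() < p else 0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Plain gradient ascent on the log-likelihood of the logistic model.
w = [0.0, 0.0, 0.0]
for _ in range(1000):
    grad = [0.0, 0.0, 0.0]
    for xi, yi in zip(X, y):
        err = yi - sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        for j in range(3):
            grad[j] += err * xi[j]
    w = [wj + 0.01 * gj / len(X) for wj, gj in zip(w, grad)]

scores = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) for xi in X]

def roc_auc(labels, scores):
    # AUC as the probability that a positive outranks a negative.
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

auc = roc_auc(y, scores)
```

A fitted coefficient `w[1] > 0` would indicate that greater openness raises the probability of a positive response, and the AUC quantifies how well the fitted parameters separate the two response classes.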
This dissertation is devoted to the theoretical development and experimental laboratory verification of a new damage localization method: the state projection estimation error (SP2E). This method is based on the subspace identification of mechanical structures, Krein space based H-infinity estimation, and oblique projections. To explain the SP2E method, several theories are discussed, and laboratory experiments have been conducted and analysed.
A fundamental approach of structural dynamics is outlined first by explaining mechanical systems based on first principles. Following that, a fundamentally different approach, subspace identification, is comprehensively explained. While both theories, first-principle and subspace-identification-based descriptions of mechanical systems, may be seen as widespread methods, barely known and new techniques follow. Therefore, the indefinite quadratic estimation theory is explained. Based on a Popov function approach, this leads to the Krein space based H-infinity theory. Subsequently, a new method for damage identification, namely SP2E, is proposed. Here, the introduction of a difference process, its analysis via the average process power, and the application of oblique projections are discussed in depth.
Finally, the new method is verified in laboratory experiments. To this end, the identification of a laboratory structure at Leipzig University of Applied Sciences is elaborated. Structural alterations were then applied experimentally and subsequently localized by SP2E. In the end, four experimental sensitivity studies are shown and discussed. For each measurement series the structural alteration was increased, and this was successfully tracked by SP2E. The experimental results are plausible and in accordance with the developed theories. By repeating these experiments, the applicability of SP2E for damage localization is experimentally proven.
The task-based view of web search implies that retrieval should take the user perspective into account. Going beyond merely retrieving the most relevant result set for the current query, the retrieval system should aim to surface results that are actually useful to the task that motivated the query.
This dissertation explores how retrieval systems can better understand and support their users’ tasks from three main angles: First, we study and quantify search engine user behavior during complex writing tasks, and how task success and behavior are associated in such settings. Second, we investigate search engine queries formulated as questions, and explore patterns in a large query log that may help search engines to better support this increasingly prevalent interaction pattern. Third, we propose a novel approach to reranking the search result lists produced by web search engines, taking into account retrieval axioms that formally specify properties of a good ranking.
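The axiomatic reranking idea can be sketched with two toy axioms in the spirit of the classical TFC1 (term frequency) and LNC1 (length) constraints from axiomatic IR. The documents, features and aggregation rule (Copeland-style vote counting) are illustrative assumptions, not the dissertation's actual method:

```python
# Each "axiom" votes on ordered document pairs: +1 if it prefers d1 over d2,
# -1 for the reverse, 0 if indifferent. Documents are (id, tf, length) tuples.

def axiom_tf(d1, d2):
    # Prefer the document with higher query-term frequency (TFC1-style).
    return (d1[1] > d2[1]) - (d1[1] < d2[1])

def axiom_brevity(d1, d2):
    # At equal term frequency, prefer the shorter document (LNC1-style).
    if d1[1] != d2[1]:
        return 0
    return (d1[2] < d2[2]) - (d1[2] > d2[2])

AXIOMS = [axiom_tf, axiom_brevity]

def rerank(docs):
    # Copeland-style aggregation: each document's score is the sum of
    # pairwise axiom votes against every other document.
    score = {d[0]: 0 for d in docs}
    for d1 in docs:
        for d2 in docs:
            if d1[0] == d2[0]:
                continue
            score[d1[0]] += sum(ax(d1, d2) for ax in AXIOMS)
    return sorted(docs, key=lambda d: score[d[0]], reverse=True)

# Original ranking as (id, query-term frequency, document length):
ranking = [("d1", 1, 900), ("d2", 3, 400), ("d3", 3, 200)]
reranked = rerank(ranking)  # d3 wins: same tf as d2 but shorter
```

Here the short, term-rich document d3 moves to the top because both axioms prefer it in every pairwise comparison, illustrating how formal ranking properties can reorder an initial result list.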
In this work, practice-based research is conducted to rethink the understanding of aesthetics, especially in relation to current media art. Granted, we live in times when technologies merge with living organisms, but we also live in times that provide unlimited resources of knowledge and maker tools. I raise the question: In what way does the hybridization of living organisms and non-living technologies affect art audiences in the culture that may be defined as Maker culture? My hypothesis is that active participation of an audience in an artwork is inevitable for experiencing the artwork itself, while also suggesting that the impact of the umwelt changes the perception of an artwork. I emphasize artistic projects that unfold through mutual interaction among diverse peers, including humans, non-human organisms, and machines. In my thesis, I pursue collaborative scenarios that lead to the realization of artistic ideas: (1) the development of ideas by others influenced by me and (2) the materialization of my own ideas influenced by others. By developing the scenarios of collaborative work as an artistic experience, I conclude that the role of an artist in Maker culture is to mediate different types of knowledge and different positions, whereas the role of the audience is to actively engage in the artwork itself. At the same time, aesthetics as experience is triggered by the other, including living and non-living actors. It is intended that the developed methodologies could be further adapted in artistic practices, philosophy, anthropology, and environmental studies.
In this paper, an artificial neural network is implemented to predict the thermal conductivity ratio of TiO2-Al2O3/water nanofluid. TiO2-Al2O3/water, an innovative type of nanofluid, was synthesized by the sol–gel method. The results indicated that 1.5 vol.% of nanofluid enhanced the thermal conductivity by up to 25%. It was shown that the heat transfer coefficient increased linearly with nanoparticle concentration, but its variation with temperature was nonlinear. It should be noted that an increase in concentration may cause the particles to agglomerate, which reduces the thermal conductivity. An increase in temperature also increases the thermal conductivity, due to increased Brownian motion and particle collisions. In this research, an artificial neural network is implemented to predict the thermal conductivity of TiO2-Al2O3/water nanofluid as a function of volumetric concentration and temperature. For this purpose, SOM (self-organizing map) and BP-LM (Back-Propagation Levenberg-Marquardt) algorithms were used. Based on the results obtained, these algorithms can be considered an exceptional tool for predicting thermal conductivity. Additionally, the correlation coefficients were 0.938 and 0.98 for the SOM and BP-LM algorithms, respectively, which is highly acceptable.
Tall buildings have become an integral part of cities despite all their pros and cons. Some current tall buildings cause several problems because of their unsuitable locations; the problems include increased density, traffic imposed on urban thoroughfares, blocked view corridors, etc. Some of these buildings have destroyed desirable views of the city. In this research, different criteria were chosen, such as environment, access, socio-economic factors, land use, and physical context. These criteria and sub-criteria were prioritized and weighted by the analytic network process (ANP) based on experts’ opinions, using Super Decisions V2.8 software. In parallel, layers corresponding to the sub-criteria were made in ArcGIS 10.3, and then a locating plan was created via a weighted overlay (map algebra). In the next step, seven hypothetical tall buildings (20 stories) in the best part of the locating plan were considered to evaluate how much of these hypothetical buildings would be visible (fuzzy visibility) from the streets and open spaces throughout the city. These processes were modeled in MATLAB, and the final fuzzy visibility plan was created with ArcGIS. The fuzzy visibility results can help city managers and planners choose which location is suitable for a tall building and how much visibility may be appropriate. The proposed model can locate tall buildings based on technical and visual criteria in the future development of the city, and it can be widely used in any city as long as the criteria and weights are localized.
The initial shear modulus, Gmax, of soil is an important parameter for a variety of geotechnical design applications. This modulus is typically associated with shear strain levels of about 5×10^-3 % and below. The critical role of small-strain soil stiffness in the design and analysis of geotechnical infrastructure is now widely accepted.
Gmax is a key parameter in small-strain dynamic analyses, such as those used to predict soil behavior or soil-structure interaction during earthquakes, explosions, and machine or traffic vibration, where it is necessary to know how the shear modulus degrades from its small-strain value as the level of shear strain increases. Gmax can be equally important for small-strain cyclic situations, such as those caused by wind or wave loading, and for small-strain static situations as well. Gmax may also be used as an indirect indication of various soil parameters, as it in many cases correlates well with other soil properties such as density and sample disturbance. In recent years, a technique using bender elements was developed to investigate the small-strain shear modulus Gmax.
The objective of this thesis is to study the initial shear stiffness of various sands with different void ratios, densities and grain size distributions under dry and saturated conditions, and then to compare empirical equations for predicting Gmax, as well as results from other testing devices, with the bender element results of this study.
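The bender element evaluation of Gmax rests on the standard relation Gmax = ρ·Vs², with the shear wave velocity Vs taken as the tip-to-tip travel distance divided by the measured travel time. A minimal sketch, with purely illustrative numbers rather than measurements from this thesis:

```python
def shear_wave_velocity(tip_to_tip_m, travel_time_s):
    """V_s from bender element tip-to-tip distance and wave travel time."""
    return tip_to_tip_m / travel_time_s

def g_max(density_kg_m3, v_s):
    """Small-strain shear modulus G_max = rho * V_s**2, in Pa."""
    return density_kg_m3 * v_s ** 2

# Illustrative values for a medium-dense dry sand (not data from this thesis):
rho = 1650.0   # bulk density in kg/m^3
L = 0.10       # tip-to-tip distance in m
t = 5.0e-4     # first-arrival travel time in s

vs = shear_wave_velocity(L, t)  # 200 m/s
G = g_max(rho, vs)              # 66 MPa
```

The sensitivity of the result to the picked travel time is evident from the squared velocity term: a 5% error in t produces roughly a 10% error in Gmax.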
For many purposes, geometric information about existing buildings is necessary, e.g. the planning of conservation or reconstruction. Architectural photogrammetry is a technique to acquire 3D geometric data of buildings for a CAD model from images. In this paper, the state of the art in architectural photogrammetry and some developments towards automation are described. The photogrammetric process consists of image acquisition, orientation and restitution. Special attention is paid to digital methods, from digital image acquisition to restitution methods supported by digital image processing. There are a few fields of development towards automation, e.g. feature extraction, the extraction of edges and lines, and the detection of corresponding points. The acquired data may be used in a CAD environment or for visualization in Virtual Reality models, using digital orthoimages for texture mapping.
Effective knowledge management is increasingly considered a cornerstone of sustainable business success. Knowledge management systems are strategically valuable for both ensuring consistency and continuously improving various aspects such as quality delivery, productivity and competitiveness. Small and medium enterprises (SMEs) in the construction industry mostly operate under tighter timeframes, narrower profit margins and more constrained resources. Hence the recently commenced SMILE-SMC (Strategic Management with Information Leveraged Excellence for Small and Medium Contractors) project aims to support the information and knowledge management needs of small and medium contractors in Hong Kong. This paper presents some snapshots of the SMILE-SMC project and its conceptualized deliverables, with some highlights of recent developments.
In this contribution, the design of an analysis environment is presented that supports an analyst in reaching a decision within a gradual collaborative planning process. An analyst represents a project manager, planner or any other person involved in the planning process. Today, planning processes are managed by several geographically distributed planners and project managers; thus, the complexity of such processes rises even more. Predicting the consequences of many planning decisions is not possible, in particular since assessing planning progress is not trivial. Several viewpoints that depend on individual perceptions have to be considered. In the following, methods to realize planning decision support are presented.
The uncertainty in the construction industry is greater than in other industries. Consequently, most construction projects do not go entirely as planned. The project management plan therefore needs to be adapted repeatedly within the project lifecycle to suit actual project conditions. Generally, the risks of change in the project management plan are difficult to identify in advance, especially if these risks are caused by unexpected events such as human errors or changes in client preferences. Knowledge acquired from different resources is essential to identify probable deviations as well as to find proper solutions to the change risks faced. Hence, it is necessary to have a knowledge base that contains known solutions for the common exceptional cases that may cause changes in each construction domain. The ongoing research work presented in this paper uses the process modeling technique of Event-driven Process Chains to describe different patterns of structural change in schedule networks. This results in several so-called “change templates”. Under each template, different types of change risk/response pairs can be categorized and stored in a knowledge base. This knowledge base is described as an ontology model populated with reference construction process data. The implementation of the developed approach can be seen as an iterative scheduling cycle that is repeated within the project lifecycle as new change risks surface. This helps check the availability of ready solutions in the knowledge base for the situation at hand. Moreover, if a solution is adopted, CPSP (“Change Project Schedule Plan”), a prototype developed for the purpose of this research work, is used to make the needed structural changes to the schedule network automatically, based on the change template.
What-if scenarios can be implemented using the CPSP prototype in the planning phase to study the effect of specific situations without endangering the project objectives. Hence, better designed and more maintainable project schedules can be achieved.
This paper deals with the modelling and analysis of masonry vaults. Numerical FEM analyses are performed using the LUSAS code. Two vault typologies are analysed (barrel and cross-ribbed vaults), parametrically varying geometrical proportions and constraints. The proposed model and the developed numerical procedure are implemented in a computer analysis. Numerical applications are developed to assess the effectiveness of the model and the efficiency of the numerical procedure. The main object of the present paper is the development of a computational procedure that allows the 3D structural behaviour of masonry vaults to be defined. For each investigated example, the homogenized limit analysis approach has been employed to predict the ultimate load and failure mechanisms. Finally, both a mesh dependence study and a sensitivity analysis are reported. The sensitivity analysis is conducted by varying mortar tensile strength and mortar friction angle over a wide range, with the aim of investigating the influence of the mechanical properties of the joints on the collapse load and failure mechanisms. The proposed computer model is validated by comparison with experimental results available in the literature.
A lot of real-life problems frequently lead to the solution of a complicated (large-scale, multicriteria, unstable, nonsmooth, etc.) nonlinear optimization problem. In order to cope with large-scale problems and to develop many optimal plans, a hierarchical approach to problem solving may be useful. The idea of hierarchical decision making is to reduce the overall complex problem into smaller and simpler approximate problems (subproblems), which may thereupon be treated independently. One way to break a problem into smaller subproblems is the use of decomposition-coordination schemes. For finding proper values of the coordination parameters in convex programming, some rapidly convergent iterative methods are developed, and their convergence properties and computational aspects are examined. Problems of their global implementation and a polyalgorithmic approach are discussed as well.
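A decomposition-coordination scheme of the kind described can be sketched on a two-subproblem toy example: a price-type coordinator adjusts a multiplier by subgradient steps, while each subproblem is solved independently in closed form. The quadratic objectives and the coupling constraint are illustrative assumptions, not taken from this work:

```python
# Toy problem: minimize x**2 + 2*y**2 subject to x + y = DEMAND.
# Dual decomposition: each subproblem minimizes its own cost minus the
# price lam times its contribution; the coordinator raises or lowers lam
# until the coupling constraint is satisfied.

def sub1(lam):
    # argmin_x  x**2 - lam*x  (closed form: x = lam/2)
    return lam / 2.0

def sub2(lam):
    # argmin_y  2*y**2 - lam*y  (closed form: y = lam/4)
    return lam / 4.0

DEMAND = 3.0  # coupling constraint: x + y must equal DEMAND

lam = 0.0
for _ in range(200):
    x, y = sub1(lam), sub2(lam)
    residual = DEMAND - (x + y)  # current constraint violation
    lam += 0.5 * residual        # subgradient step on the dual variable

x, y = sub1(lam), sub2(lam)      # coordinated solution: x = 2, y = 1
```

The iteration converges geometrically here because the dual update is a contraction for this step size; the analytic optimum of the coupled problem is x = 2, y = 1 with multiplier lam = 4.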
This paper concerns schedule synchronization problems in public transit networks. It consists of three main parts. In the first part, the subject area is introduced, terms are defined, and a framework for optimal synchronization in the form of problem representation and formulation is proposed. The second part is devoted to the transfer synchronization problem, where passengers change transit lines at transfer points. An integrated Tabu Search and Genetic solution method is developed for this specific problem. The third part deals with the headway harmonization problem, i.e., the synchronization of different transit lines' schedules on common segments of routes. For the solution of this problem, a new bilevel optimization method is proposed, with zone harmonization at the bottom level and coordination of zones, by time buffers assigned to timing points, at the upper level. Finally, the synchronization problems are numerically illustrated by real-life examples of public transport lines in Cracow.
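The core of the transfer synchronization problem can be sketched for a single transfer point: choose the offset of one line's schedule so that the mean wait of passengers transferring from the other line is minimized. The grid search below stands in for the Tabu Search / Genetic hybrid of the full network problem, and the headways are illustrative:

```python
def avg_transfer_wait(h1, h2, offset, horizon=120):
    """Mean wait (minutes) when transferring from line 1 arrivals
    (headway h1) to line 2 departures (headway h2, shifted by offset)."""
    waits = []
    t = 0.0
    while t < horizon:
        waits.append((offset - t) % h2)  # time until next line-2 departure
        t += h1
    return sum(waits) / len(waits)

def best_offset(h1, h2, step=0.5):
    # Simple grid search over candidate offsets; a stand-in for the
    # metaheuristic used for the full multi-line synchronization problem.
    candidates = [i * step for i in range(int(h2 / step))]
    return min(candidates, key=lambda o: avg_transfer_wait(h1, h2, o))

off = best_offset(10, 15)
w = avg_transfer_wait(10, 15, off)
```

With headways of 10 and 15 minutes the arrival pattern repeats every 30 minutes, and the achievable mean wait is limited by that incommensurability; the search simply picks the offset realizing the minimum.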
BAUHAUS ISOMETRY AND FIELDS
(2012)
While integration increases through networking, segregation strides ahead too. Most of us fixate our minds on special topics. Yet we also rely on our intuition. We are sometimes waiting for the inflow of new ideas or valuable information that we hold in high esteem, although we are not entirely conscious of its origin. We may even say the most precious intuitions are rooted in deep subconscious, collective layers of the mind. Take as a simple example the emergence of orientation in paleolithic events and its relation to the dihedral symmetry of the compass. Consider also the extension of this algebraic matter into the operational structures of the mind on the one hand and into the algebra of geometry, Clifford algebra as we call it today, on the other. Culture and mind, and even the individual act of creation, may be connected with transient events that are subconscious and inaccessible to cognition in principle. Other events causative for our work may be merely invisible to us, though in principle they should turn out to be attainable. In this case we are just ignorant of the whole creative process. Sometimes we begin to use unusual tools or turn into handicraft enthusiasts. Then our small institutes turn into workshops and factories. All this is indeed joining with the Bauhaus and its spirit. We shall go into this together, and we shall present a record of this session.
Most of the existing seismic resistant design codes are based on response spectrum theory. The influence of inelastic deformations can be evaluated by considering an inelastic type of resisting force; the inelastic spectrum is then considerably different from the elastic one. Also, the influence of stiffness degradation and strength deterioration can be accounted for by including more precise material models. Some recent papers discuss the corresponding changes in response spectra due to the P-Δ effect. The experience accumulated from recent earthquakes indicates that structural pounding may considerably influence the response of structures and should be taken into account in design procedures. The most convenient way to do that is to predict the influence of pounding on the response spectra for accelerations, velocities and displacements. Generally speaking, contact problems such as pounding are characterized by a large extent of nonlinearity and slow convergence of the computational procedures. Thus, obtaining spectra in which the contact problem is accounted for seems very attractive from an engineering point of view, because they could easily be implemented into design procedures. However, it is worth noting that there is no rigorous mathematical proof that the original system can be decomposed into single equations related to single-degree-of-freedom systems. It is the purpose of this paper to study the influence of pounding on the response spectra and to evaluate the amplification due to impact. For this purpose, two adjacent SDOF systems are considered that are able to interact during the vibration process. This problem is solved versus the elastic stiffness ratio, which appears to be very important for such an assemblage. The contact between the masses is numerically simulated using opening gap elements as links.
Comparisons between calculated response spectra and linear response spectra are made in order to derive analytical relationships for simply obtaining the contribution of pounding. The results are illustrated in response spectra format, and the influence of the stiffness ratio is clarified.
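The two-SDOF pounding setup can be sketched numerically: a stiff gap element engages whenever the relative closing displacement exceeds the separation gap. The time integration here is plain semi-implicit Euler rather than the paper's procedure, and all parameter values (masses, 1:4 stiffness ratio, gap, contact stiffness) are illustrative assumptions:

```python
import math

# Two adjacent SDOF oscillators with an opening gap element between them:
# a stiff contact spring engages only while the gap is closed.
M1, M2 = 1.0, 1.0       # masses
K1, K2 = 100.0, 400.0   # elastic stiffnesses (stiffness ratio 1:4)
KC = 1.0e4              # contact (gap element) stiffness
GAP = 0.01              # initial separation gap in m
DT = 1.0e-4             # time step in s

def simulate(v0=1.0, steps=50000):
    u1 = u2 = 0.0
    v1, v2 = v0, -v0    # initial velocities toward each other
    contacts, peak1 = 0, 0.0
    for _ in range(steps):
        closing = u1 - u2 - GAP
        fc = KC * closing if closing > 0.0 else 0.0  # gap element force
        if fc > 0.0:
            contacts += 1
        a1 = (-K1 * u1 - fc) / M1
        a2 = (-K2 * u2 + fc) / M2
        v1 += a1 * DT
        v2 += a2 * DT
        u1 += v1 * DT   # semi-implicit (symplectic) Euler update
        u2 += v2 * DT
        peak1 = max(peak1, abs(u1))
    return contacts, peak1

contacts, peak1 = simulate()
# Free-vibration amplitude of oscillator 1 without pounding: v0 / omega1.
no_pound_peak = 1.0 / math.sqrt(K1 / M1)
```

Comparing `peak1` with `no_pound_peak` gives, for one frequency pair, the kind of amplification (or reduction) due to impact that the spectra in the paper characterize across the full period range.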
Biofeedback constitutes a well-established, non-invasive method to voluntarily interfere in emotional processing by means of cognitive strategies. However, treatment durations exhibit strong inter-individual variations, and first successes can often be achieved only after a large number of sessions. Sham feedback constitutes a rather untapped approach, providing feedback that does not correspond to the participant’s actual state. The current study aims to gain insights into the mechanisms of sham feedback processing in order to support new techniques in biofeedback therapy. We carried out two experiments and applied different types of sham feedback on skin conductance responses and pupil size changes during affective processing. Results indicate that standardized but context-sensitive sham signals based on skin conductance responses exert a stronger influence on emotional regulation than individual sham feedback from ongoing pupil dynamics. Also, sham feedback should forego unnatural signal behavior to avoid irritation and skepticism among participants. Altogether, a reasonable combination of stimulus features and sham feedback characteristics makes it possible to considerably reduce actual bodily responsiveness already within a single session.
The cost of keeping large area urban computer aided architectural design (CAAD) models up to date justifies wider use and access. This paper reviews the potential for collaborative groupwork creation and maintenance of such models and suggests an approach to data entry, data management and the generation of appropriate levels of detail from a Geographic Information System (GIS). Staff at the University of the West of England (UWE) modelled a large area of Bristol to demonstrate millennium landmark proposals. It became swiftly apparent that continued amendment of the model to keep it an accurate reflection of changes on the ground was a major data management problem. Piecing in new CAAD models received from architectural practices to visualise them in context as part of the planning negotiation process has often taken staff several days of work for each instance. The model is so complex and proprietary that Bristol City operates a specialist visualisation bureau service. UWE later modelled the environs of the Tower of London to support bids for funding and to provide the context for judging the visual impact of iterative design development. Further research continued to develop more effective approaches. Data conversion and amalgamation from all the diverse sources was the major impediment to effective group working to create the models. It became apparent that a GIS would assist in retrieving all the appropriate data describing the part of the model under creation. It was possible to predict that the management of many historic part models stepping back through time, allowing different expert interpretations to co-exist, would in itself be a major task requiring a spatial database/GIS. UWE started afresh from the original source data to explore the collaborative use of GIS and Virtual Reality Modelling Language (VRML) to integrate models and interventions from various sources and to generate an overall navigable interactive whole.
Current exploration of the combination of event-driven behaviours and Structured Query Language seeks to define how to appropriately modify objects in the VRML model on demand. This is beginning to realise the potential of this process for: asynchronous group modelling along the lines of a collaborative virtual design studio; historic building maintenance management; visitor management; and the interpretation of historic sites to visitors and public planning information.