Scientific colloquium held from 27 to 30 June 1996 at the Bauhaus-Universität Weimar on the theme 'Techno-Fiction. Zur Kritik der technologischen Utopien' (Techno-Fiction: On the Critique of Technological Utopias)
Scientific colloquium held from 19 to 22 April 2007 at the Bauhaus-Universität Weimar on the theme 'Die Realität des Imaginären. Architektur und das digitale Bild' (The Reality of the Imaginary: Architecture and the Digital Image)
Scientific colloquium held from 24 to 27 April 2003 at the Bauhaus-Universität Weimar on the theme 'MediumArchitektur - Zur Krise der Vermittlung' (MediumArchitecture: On the Crisis of Mediation)
Architecture and association
(2003)
Architecture and Air
(2003)
The economic losses from earthquakes can hit a national economy considerably; models capable of estimating the vulnerability and losses of future earthquakes are therefore highly valuable to emergency planners for risk mitigation. This demands a mass prioritization filtering of structures to identify vulnerable buildings for retrofitting. Applying advanced structural analysis to every building to study its earthquake response is impractical due to the complex calculations, long computation times, and exorbitant cost. This creates the need for a fast and reliable preliminary method, commonly known as Rapid Visual Screening (RVS). The method serves as a preliminary screening platform, using an optimal number of seismic parameters of the structure and predefined output damage states. In this study, the efficacy of Machine Learning (ML) for damage prediction was investigated using a Support Vector Machine (SVM) model as the damage classification technique. The developed model was trained and evaluated on damage data from the 1999 Düzce earthquake in Turkey, where each building's record consists of 22 performance modifiers implemented with supervised machine learning.
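The classification idea above can be sketched with a minimal linear SVM trained by hinge-loss sub-gradient descent. The two features and the labels below are hypothetical stand-ins for the 22 performance modifiers and the damage states of the Düzce dataset, which is not reproduced here:

```python
def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Primal linear SVM via hinge-loss sub-gradient descent.
    Labels y must be +1 / -1."""
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # sample violates the margin: hinge gradient
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only the regularisation term pulls on w
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b >= 0 else -1

# Toy stand-in: two "performance modifier" features;
# +1 = severe damage, -1 = light damage (hypothetical data).
X = [[0.9, 0.8], [0.8, 0.9], [1.0, 0.7], [0.1, 0.2], [0.2, 0.1], [0.15, 0.3]]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
```

In the study itself the SVM would be trained on the full modifier vectors with damage-state labels; the sketch only illustrates the separating-hyperplane mechanics.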
The creation of a hierarchical sequence of plastic and viscoplastic models corresponding to different levels of structural approximation is considered. The developed strategy of multimodel analysis is described; it consists of creating a library of inelastic models, determining a system of selection criteria, and carrying out multivariant, sequentially refined computations. Application of the multimodel approach in numerical computations has demonstrated the possibility of reliably predicting the stress-strain response under a wide variety of combined nonproportional loadings.
This paper presents an agent-based software system, the Virtual Administrator System (VAS), for the small-scale maintenance of school buildings. VAS is capable of handling a heavy load of routine, low-tech maintenance jobs. It assigns a different priority to each job application according to its significance and urgency, and automatically adjusts schedules for maintenance engineers when on-site supervision is needed. The system can help ease the burden of routine small-scale maintenance work, making the overall management of school building maintenance more cost-effective and efficient. VAS posts jobs on the Web in a multimedia format and classifies all applications into four categories: the on-call maintenance contract, the term maintenance contract, the guaranty maintenance contract, and the regular maintenance contract. It then estimates their urgency level and passes the information to maintenance engineers, who decide whether on-site inspection is needed. Based on the engineers' feedback, VAS automatically schedules inspections and sends out real-time or batch notifications to contractors. All these activities are recorded in a database to allow continuous research and data mining, as well as the analysis and diagnosis of specific jobs for follow-up maintenance plans.
The Carbon journal is pleased to introduce a themed collection of recent articles in the area of computational carbon nanoscience. This virtual special issue was assembled from previously published Carbon articles by Guest Editors Quan Wang and Behrouz Arash, and can be accessed as a set in the special issue section of the journal website homepage: www.journals.elsevier.com/carbon. The article below by our guest editors serves as an introduction to this virtual special issue, and also a commentary on the growing role of computation as a tool to understand the synthesis and properties of carbon nanoforms and their behavior in composite materials.
The individual views on a building product held by the people involved in the design process imply different models for planning and calculation. To interpret the geometrical, topological, and semantic data of a building model, we identify a structural component graph, a graph of room faces, a room graph, and a relational object graph as aids, and we explain algorithms to derive these relations. The application of the presented technique is demonstrated by the analysis and discretization of a sample model in the scope of building energy simulation.
This paper presents an evaluation system for the steel structures of hydroelectric power stations, including hydraulic gates and penstocks, based on Fault Tree Analysis (FTA) and performance maps. The system consists of FTA fault tree diagrams, performance maps, design and analysis systems, and engineering databases. These four modules are integrated by appropriate hyperlinks so that the user can work with the system easily and seamlessly. The developed system was applied to several illustrative example cases, which showed that the methodology and system worked well; users found the system useful and effective for their maintenance tasks at power stations.
Classical Internet of Things routing over wireless sensor networks can provide more precise monitoring of the covered area thanks to the higher number of deployed nodes. Because the transfer medium is shared, many nodes in the network are prone to collisions during simultaneous transmissions. Medium access control protocols are usually practical only in networks with low traffic that are not subjected to external noise from adjacent frequencies. Congestion management in the network admits preventive, detection, and control solutions, all of which are the focus of this study. In the congestion prevention phase, the proposed method chooses the next hop of the path using a fuzzy decision-making system to distribute network traffic over optimal paths. In the congestion detection phase, a dynamic queue management approach was designed to detect congestion in the least amount of time and prevent collisions. In the congestion control phase, the back-pressure method was used, based on queue quality, to decrease the probability of routing through a pre-congested node. The main goals of this study are to balance energy consumption across network nodes, reduce the rate of lost packets, and increase the quality of service in routing. Simulation results showed that the proposed Congestion Control Fuzzy Decision Making (CCFDM) method was more capable of improving routing parameters than recent algorithms.
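The congestion-prevention step can be illustrated with a toy fuzzy next-hop rule. The membership functions, the rule base (good next hop = high energy AND low queue occupancy), and the neighbor values are all hypothetical, not the paper's actual CCFDM rule set:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def suitability(node):
    """Fuzzy 'good next hop' score from residual energy and queue
    occupancy, both normalised to [0, 1] (hypothetical rule base)."""
    high_energy = tri(node["energy"], 0.3, 1.0, 1.7)   # peaks at full energy
    low_queue = tri(node["queue"], -0.7, 0.0, 0.7)     # peaks at empty queue
    return min(high_energy, low_queue)                 # fuzzy AND via min

def choose_next_hop(neighbors):
    """Forward to the neighbor with the highest fuzzy suitability."""
    return max(neighbors, key=suitability)

neighbors = [
    {"id": "A", "energy": 0.9, "queue": 0.2},
    {"id": "B", "energy": 0.5, "queue": 0.1},
    {"id": "C", "energy": 0.8, "queue": 0.8},
]
```

Node A wins here: B has low energy and C a nearly full queue, so the min-combination penalises both, which is the traffic-spreading behavior the prevention phase aims for.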
The laser beam is a small, flexible, and fast polishing tool. With laser radiation it is possible to finish many contours or geometries on quartz glass surfaces in a very short time. The temperature that develops during polishing determines the achievable surface smoothing and, as a negative side effect, causes residual stresses in the material. To determine which parameters govern the laser polishing process and the resulting surface roughness, and to estimate the residual stresses, temperature simulations and extensive polishing experiments were carried out. During these experiments, starting and machining parameters were varied and temperatures were measured contact-free. The accuracy of the thermal and mechanical simulation was improved through advanced FE analysis.
A framed-tube system with multiple internal tubes is analysed using an orthotropic box-beam analogy in which each tube is individually modelled by a box beam that accounts for flexural and shear deformations as well as shear-lag effects. A simple numerical modelling technique is proposed for estimating the shear-lag phenomenon in tube structures with multiple internal tubes. The proposed method idealizes such framed-tube structures as equivalent multiple tubes, each composed of four equivalent orthotropic plate panels. The numerical analysis is based on the principle of minimum potential energy in conjunction with a variational approach. The shear-lag phenomenon of such structures is studied taking into account the additional bending moments in the tubes, and a detailed numerical analysis of the additional bending moment is carried out. A moment factor is further introduced to quantify the shear lag along with the additional moment.
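The variational basis can be indicated schematically. For a single equivalent tube of height $H$ with generic flexural stiffness $EI$, shear stiffness $GA$, and distributed lateral load $q$, a Timoshenko-type potential energy functional (a simplified stand-in for the paper's orthotropic-panel functional, which additionally carries warping terms for shear lag) reads:

```latex
\Pi(u,\theta) = \frac{1}{2}\int_0^H \Big[\, EI\,(\theta')^2 + GA\,(u' - \theta)^2 \,\Big]\,\mathrm{d}z
\;-\; \int_0^H q\,u\,\mathrm{d}z ,
\qquad \delta\Pi = 0 ,
```

where $u$ is the lateral deflection and $\theta$ the section rotation. Stationarity of $\Pi$ yields the governing equilibrium equations; in the paper's formulation, shear lag enters through additional warping degrees of freedom in the panel displacement field, and the extra bending moments follow from the same stationarity condition.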
An architecture of a distributed planning system for the building industry has been developed. The emphasis is on highly collaborative environments in steelwork, timber construction etc. where designers concurrently handle 3D models. The overall system connects local design systems by the so-called Design Framework DFW. This framework consists of the definition of distributed components and protocols which make the collaborative design work. The process of collaborative design has been formalized on an abstract level. This paper describes how this has been done. A sample is given to illustrate the mapping of concrete scenarios of the ‘real design world’ to an abstract scenario level. This work is funded by the Deutsche Forschungsgemeinschaft DFG as part of the project SPP1103 (Meißner et al. 2003).
In this article, I show why it is necessary to abolish the use of predictive algorithms in the US criminal justice system at sentencing. After presenting the functioning of these algorithms in their context of emergence, I offer three arguments to demonstrate why their abolition is imperative. First, I show that sentencing based on predictive algorithms induces a process of rewriting the temporality of the judged individual, flattening their life into a present inescapably doomed by its past. Second, I demonstrate that recursive processes, comprising predictive algorithms and the decisions based on their predictions, systematically suppress outliers and progressively transform reality to match predictions. In my third and final argument, I show that decisions made on the basis of predictive algorithms actively perform a biopolitical understanding of justice as management and modulation of risks. In such a framework, justice becomes a means to maintain a perverse social homeostasis that systematically exposes disenfranchised Black and Brown populations to risk.
A vast number of existing buildings were constructed before the development and enforcement of seismic design codes and therefore run the risk of being severely damaged under seismic excitation. This poses not only a threat to human life but also affects the socio-economic stability of the affected area. It is therefore necessary to assess the present vulnerability of such buildings in order to make an educated decision regarding risk mitigation through seismic strengthening techniques such as retrofitting. However, it is not feasible, economically or in terms of time, to inspect, repair, and augment every old building on an urban scale. As a result, reliable rapid screening methods, such as Rapid Visual Screening (RVS), have garnered increasing interest among researchers and decision-makers alike. In this study, the effectiveness of five different Machine Learning (ML) techniques for vulnerability prediction was investigated. Damage data from four earthquakes, in Ecuador, Haiti, Nepal, and South Korea, was used to train and test the developed models. Eight performance modifiers were implemented as variables in a supervised ML setting. The investigations in this paper show that the vulnerability classes assessed by the ML techniques were very close to the actual damage levels observed in the buildings.
In this paper, we explore the relation between electricity consumption in the residential sector and automobile energy consumption in the transportation sector as a function of location within the city, employing a Geographic Information System (GIS). We found that electricity consumption per capita in Utsunomiya City tends to be higher in the city center and lower in the suburbs. There is, however, little difference in total consumption between the city center and the suburbs: the density of electric appliances tends to be higher in the small houses of the city center, while residential automobile energy consumption is lower in the city center than in the suburbs.
Rice husk ash (RHA) is classified as a highly reactive pozzolan. It has a very high silica content, similar to that of silica fume (SF). Using less expensive and locally available RHA as a mineral admixture in concrete brings ample benefits in terms of cost, the technical properties of the concrete, and the environment. An experimental study of the effect of RHA blending on the workability, strength, and durability of high performance fine-grained concrete (HPFGC) is presented. The results show that the addition of RHA to HPFGC significantly improved compressive strength, splitting tensile strength, and chloride penetration resistance. Interestingly, the ratio of compressive strength to splitting tensile strength of HPFGC was lower than that of ordinary concrete, especially for the concrete made with 20 % RHA. The compressive strength and splitting tensile strength of HPFGC containing RHA were similar to and slightly higher, respectively, than those of HPFGC containing SF. The chloride penetration resistance of HPFGC containing 10–15 % RHA was comparable with that of HPFGC containing 10 % SF.
This paper presents a new design environment based on multi-agents and Virtual Reality (VR). In this research, a design system with a virtual reality function was developed; the virtual world was realized using GL4Java, liquid crystal shutter glasses, sensor systems, and related hardware. The previously developed multi-agent CAD system with product models was integrated with the VR design system. A prototype system was developed for highway steel plate girder bridges and applied to a design problem; the application verified the effectiveness of the developed system.
A Product Model of a Road
(1997)
Many errors and delays appear when data is exchanged between particular tasks in the lifecycle of a road. Inter-task connections are therefore of great importance for the quality of the final product. The article describes a product model of a road which is the kernel of an integrated information system intended to support all important stages of the road lifecycle: design, evaluation (through different analysis procedures), construction, and maintenance. Since particular tasks are often executed at different places and in different companies, the interconnections are supported by a special metafile which contains all specific data of the product model. The concept of the integrated system is object- and component-oriented. Additionally, existing conventional program packages are included to support some common tasks (methods). A conventional relational database system as well as an open spatial database system with the relevant GIS functionality are included to support the data structures of the model.
Global structural analyses in civil engineering are usually performed assuming linear-elastic material behavior. For steel structures, however, a certain degree of plasticization may be considered, depending on the member classification. Corresponding plastic analyses that take material nonlinearities into account are effectively realized using numerical methods. Commonly applied finite elements of two- and three-dimensional models evaluate plasticity at defined nodes using a yield surface, i.e. by a yield condition, a hardening rule, and a flow rule. Such calculations entail a large numerical and time-related effort, and they do not rest on the theoretical background of beam theory, to which the regulations of standards mainly correspond. For that reason, methods using one-dimensional beam elements combined with cross-sectional analyses are commonly applied to steel members in terms of plastic-zone theories. In these approaches, plasticization is generally assessed by means of axial stress only. In this paper, a more precise numerical representation of combined stress states, i.e. axial and shear stresses, is presented, and the results of the proposed approach are validated and discussed.
The amount of adsorbed styrene acrylate copolymer (SA) particles on cementitious surfaces at the early stage of hydration was quantitatively determined using three different methodological approaches: the depletion method, the visible spectrophotometry (VIS) and the thermo-gravimetry coupled with mass spectrometry (TG–MS). Considering the advantages and disadvantages of each method, including the respectively required sample preparation, the results for four polymer-modified cement pastes, varying in polymer content and cement fineness, were evaluated.
In some cases, significant discrepancies in the adsorption degrees were observed. Significantly lower amounts of adsorbed polymer tended to be identified using TG–MS compared to the values determined with the depletion method, with the spectrophotometrically generated values lying in between these extremes. This tendency was found for three of the four cement pastes examined and originates in sample preparation and methodological limitations.
The main influencing factor is the falsification of the polymer concentration in the liquid phase during centrifugation, caused by interactions at the interface between sediment and supernatant. The newly developed method using TG–MS for the quantification of SA particles proved suitable for dealing with these issues: instead of the fluid phase, the sediment is examined with regard to its polymer content, on which the influence of centrifugation is considerably lower.
The K-nearest neighbors (KNN) machine learning algorithm is a well-known non-parametric classification method. However, like other traditional data mining methods, applying it to big data comes with computational challenges. KNN determines the class of a new sample from the classes of its nearest neighbors, and identifying those neighbors in a large amount of data imposes a computational cost so large that the task is no longer tractable on a single computing machine. One of the techniques proposed to make classification methods applicable to large datasets is pruning. LC-KNN is an improved KNN method which first clusters the data into smaller partitions using K-means clustering and then, for each new sample, applies KNN on the partition whose center is nearest. However, because the clusters have different shapes and densities, selecting the appropriate cluster is a challenge. In this paper, an approach is proposed to improve the pruning phase of the LC-KNN method by taking these factors into account. The proposed approach helps to choose a more appropriate cluster in which to look for the neighbors, thus increasing the classification accuracy. The performance of the proposed approach is evaluated on different real datasets. The experimental results show its effectiveness and its higher classification accuracy and lower time cost in comparison to other recent relevant methods.
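The baseline LC-KNN pipeline the paper improves on can be sketched in a few lines: K-means partitions the data once, and each query is classified by plain KNN inside the partition whose center is nearest. The toy data and the deterministic center initialization are illustrative choices, not the paper's setup:

```python
import math
from collections import Counter

def dist(a, b):
    return math.dist(a, b)

def kmeans(X, k, iters=25):
    # Deterministic init for this sketch: spread the initial centers
    # across the data instead of sampling randomly.
    centers = [list(X[0]), list(X[-1])] if k == 2 else [list(x) for x in X[:k]]
    assign = [0] * len(X)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: dist(x, centers[j])) for x in X]
        for j in range(k):
            members = [x for x, a in zip(X, assign) if a == j]
            if members:
                centers[j] = [sum(col) / len(members) for col in zip(*members)]
    return centers, assign

def lc_knn_predict(query, X, y, centers, assign, k_neighbors=3):
    # Pruning phase: search only the partition whose center is nearest.
    j = min(range(len(centers)), key=lambda c: dist(query, centers[c]))
    part = [(x, lbl) for x, lbl, a in zip(X, y, assign) if a == j]
    # Classification phase: plain KNN restricted to that partition.
    part.sort(key=lambda xl: dist(query, xl[0]))
    votes = Counter(lbl for _, lbl in part[:k_neighbors])
    return votes.most_common(1)[0][0]

# Two well-separated toy classes (hypothetical data).
X = [[0, 0], [1, 0], [0, 1], [1, 1], [5, 5], [6, 5], [5, 6], [6, 6]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
centers, assign = kmeans(X, 2)
```

The paper's contribution targets the `min(... dist(query, centers[c]))` line: when clusters differ in shape and density, the nearest center is not always the right partition, so the improved pruning weighs in those factors.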
A Multi-objective Model for Optimizing Construction Planning of Repetitive Infrastructure Projects
(2004)
This paper presents the development of a model for optimizing resource utilization in repetitive infrastructure projects. The model is capable of simultaneously minimizing project duration and work interruptions for construction crews, and it provides, in a single run, a set of nondominated solutions that represent the tradeoff between these two objectives. The model incorporates a multiobjective genetic algorithm and a scheduling algorithm. It initially generates a randomly selected set of solutions that evolves toward a near-optimal set of tradeoff solutions in subsequent generations. Each solution represents a unique schedule associated with a certain project duration and a number of interruption days for the utilized construction crews. As such, the model provides project planners with alternative schedules along with their expected duration and resource utilization efficiency.
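The notion of a nondominated solution set, which the genetic algorithm maintains across generations, can be made concrete with a small filter over (duration, interruption) pairs. The schedules below are hypothetical, not from the paper:

```python
def nondominated(solutions):
    """Filter (duration, interruption_days) pairs, both minimised:
    keep a schedule unless some other schedule is at least as good in
    both objectives and strictly better in at least one."""
    front = []
    for s in solutions:
        dominated = any(
            o[0] <= s[0] and o[1] <= s[1] and (o[0] < s[0] or o[1] < s[1])
            for o in solutions
        )
        if not dominated:
            front.append(s)
    return front

# Hypothetical (project duration in days, crew interruption days).
schedules = [(30, 5), (28, 9), (35, 2), (30, 7), (40, 1)]
front = nondominated(schedules)
```

Here (30, 7) drops out because (30, 5) matches its duration with fewer interruptions; the remaining four schedules form the tradeoff front handed to the planner.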
Tall buildings have become an integral part of cities despite all their pros and cons. Some existing tall buildings suffer from an unsuitable location: they increase density, impose traffic on urban thoroughfares, block view corridors, and in some cases have destroyed desirable views of the city. In this research, different criteria were chosen, such as environment, access, socio-economic factors, land use, and physical context. These criteria and sub-criteria were prioritized and weighted by the analytic network process (ANP) based on experts' opinions, using Super Decisions V2.8 software. In parallel, layers corresponding to the sub-criteria were built in ArcGIS 10.3, and a locating plan was then created via a weighted overlay (map algebra). In the next step, seven hypothetical tall buildings (20 stories) in the best part of the locating plan were considered in order to evaluate how much of these hypothetical buildings would be visible (fuzzy visibility) from the streets and open spaces throughout the city. These processes were modeled in MATLAB, and the final fuzzy visibility plan was created in ArcGIS. The fuzzy visibility results can help city managers and planners decide which locations are suitable for a tall building and how much visibility may be appropriate. The proposed model can locate tall buildings based on technical and visual criteria in the future development of the city, and it can be widely used in any city as long as the criteria and weights are localized.
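The weighted-overlay step is just a cell-wise weighted sum of criterion rasters. A minimal sketch, with made-up criterion names, grids, and weights standing in for the ANP-derived values:

```python
def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of equally sized raster layers.
    `layers` maps criterion name -> 2D grid of suitability scores;
    `weights` holds one weight per criterion (hypothetical values;
    in the study these come from ANP on expert judgements)."""
    names = list(layers)
    rows = len(layers[names[0]])
    cols = len(layers[names[0]][0])
    return [[sum(weights[n] * layers[n][r][c] for n in names)
             for c in range(cols)] for r in range(rows)]

# Two toy 2x2 criterion rasters with hypothetical ANP weights.
layers = {"access": [[1, 3], [2, 4]], "environment": [[4, 2], [3, 1]]}
weights = {"access": 0.6, "environment": 0.4}
plan = weighted_overlay(layers, weights)
```

Each output cell is the locating-plan score for that location; the "best part of the locating plan" in the abstract corresponds to the cells with the highest combined score.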
A Machine Learning Framework for Assessing Seismic Hazard Safety of Reinforced Concrete Buildings
(2020)
Although averting a seismic disturbance and its physical, social, and economic disruption is practically impossible, advancements in computational science and numerical modeling equip us to predict its severity, understand the outcomes, and prepare for post-disaster management. Many buildings amidst developed metropolitan areas are aged and still in service, designed before national seismic codes were established or construction regulations were introduced. In that case, risk reduction is significant for developing alternatives and designing suitable models to enhance the existing structures' performance; such models can classify the risks and casualties of possible earthquakes through emergency preparation. It is thus crucial to recognize structures that are susceptible to earthquake vibrations and should be prioritized for retrofitting. However, the behavior of each building under seismic actions cannot be studied through full structural analysis, which is unrealistic because of the rigorous computations, long computation times, and substantial expenditure. This calls for a simple, reliable, and accurate process known as Rapid Visual Screening (RVS), which serves as a primary screening platform based on an optimal number of seismic parameters and predetermined damage states for structures. In this study, the damage classification technique was studied, and the efficacy of Machine Learning (ML) for damage prediction via a Support Vector Machine (SVM) model was explored. The ML model was trained and tested separately on damage data from four different earthquakes, in Ecuador, Haiti, Nepal, and South Korea; each dataset consists of a varying number of input samples and eight performance modifiers. Based on the study and its results, the SVM model classifies the given input data into their respective classes and performs well in the seismic hazard safety evaluation of buildings.
We apply keyquery-based taxonomy composition to compute a classification system for the CORE dataset, a shared crawl of about 850,000 scientific papers. Keyquery-based taxonomy composition can be understood as a two-phase hierarchical document clustering technique that utilizes search queries as cluster labels. In the first phase, the document collection is indexed by a reference search engine, and the documents are tagged with the search queries for which they are relevant, their so-called keyqueries. In the second phase, a hierarchical clustering is formed from the keyqueries within an iterative process. We use the explicit topic model ESA as the document retrieval model for indexing the CORE dataset in the reference search engine. Under the ESA retrieval model, documents are represented as vectors of similarities to Wikipedia articles, a methodology proven advantageous for text categorization tasks. Our paper presents the generated taxonomy and reports on quantitative properties such as document coverage and processing requirements.
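The first step of the clustering phase can be sketched as inverting the document-to-keyquery tagging into keyquery-labeled clusters and ordering them from general to specific. The paper titles and queries below are hypothetical, and the real method builds a full hierarchy rather than a flat ordering:

```python
def keyquery_clusters(doc_queries):
    """Invert a document -> keyqueries mapping into keyquery -> cluster
    of documents, ordered from general (large document set) to specific;
    a crude stand-in for the start of the hierarchy-forming phase."""
    clusters = {}
    for doc, queries in doc_queries.items():
        for q in queries:
            clusters.setdefault(q, set()).add(doc)
    return sorted(clusters.items(), key=lambda kv: -len(kv[1]))

# Hypothetical keyquery tags for three papers.
doc_queries = {
    "paper1": {"machine learning"},
    "paper2": {"machine learning", "support vector machine"},
    "paper3": {"machine learning", "document clustering"},
}
ordered = keyquery_clusters(doc_queries)
```

A query covering many documents ("machine learning" here) becomes a candidate for an upper taxonomy node, with more specific keyqueries nested beneath it by document-set containment.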
A comprehensive framework for information management systems for construction projects in China has been established through an extensive literature survey and field investigation. It utilizes available information technologies and covers practical management patterns as well as the major aspects of construction project management. It can be used to guide and evaluate the design of information management systems for construction projects, making such systems applicable to a wide variety of projects and robust to changes in project management.
The synchronous distributed processing of common source code in the software development process is supported by well-proven methods. The planning process has similarities with the software development process; however, there are no consistent and similarly successful methods for applications in construction projects. A new approach is proposed in this contribution.
Recently, demand for residence and use of urban infrastructure has increased, elevating the risk to human lives from natural calamities. Occupancy demand has rapidly increased the construction rate, while inadequately designed structures are all the more vulnerable. Buildings constructed before the development of seismic codes are additionally susceptible to earthquake vibrations, and structural collapse causes economic losses as well as setbacks for human lives. Applying different theoretical methods to analyze structural behavior is expensive and time-consuming; therefore, a rapid vulnerability assessment method for checking structural performance is necessary for future developments. This process is known as Rapid Visual Screening (RVS), a technique created to identify, inventory, and screen potentially hazardous structures. Sometimes poor construction quality means that some of the required parameters are unavailable, in which case the RVS process becomes tedious. To tackle such situations, multiple-criteria decision-making (MCDM) methods open a new gateway for seismic vulnerability assessment: the different parameters required by RVS can be taken up in MCDM, which evaluates multiple conflicting criteria in decision making across several fields. This paper aims to bridge the gap between RVS and MCDM. Furthermore, to define the correlation between these techniques, the methodologies from the Indian, Turkish, and Federal Emergency Management Agency (FEMA) codes were implemented, and the resulting seismic vulnerability assessments of structures were observed and compared.
Methods with convergence order p ≥ 2 (Newton's method, the tangent hyperbolas method, the tangent parabolas method, etc.) and their approximate variants are studied. Conditions are presented under which the approximate variants preserve the convergence rate intrinsic to these methods, and some computational aspects are discussed (possibilities for organizing parallel computation, globalization of a method, solving the linear equations versus inverting the matrix at every iteration, etc.). Polyalgorithmic computational schemes (hybrid methods) combining the best features of various methods are developed, and possibilities for applying them to the numerical solution of two-point boundary-value problems in ordinary differential equations and to decomposition-coordination problems in convex programming are analyzed.
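The contrast between an exact method and an approximate variant can be shown in the scalar case: classical Newton (order 2) against the chord method, which freezes the derivative at the starting point, i.e. reuses one "inversion" for all iterations at the price of a lower convergence rate. The test function is an illustrative choice:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration (convergence order 2):
    x <- x - f(x) / f'(x), with f' re-evaluated every step."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def chord(f, df, x0, tol=1e-12, max_iter=200):
    """Approximate variant: the derivative is frozen at x0, so each
    step is cheaper but convergence drops to linear."""
    d0 = df(x0)
    x = x0
    for _ in range(max_iter):
        step = f(x) / d0
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x * x - 2.0   # root: sqrt(2)
df = lambda x: 2.0 * x
```

A polyalgorithm in the spirit of the abstract would switch between such variants, e.g. using frozen-derivative steps far from the solution and full Newton steps near it, keeping the quadratic endgame while saving derivative evaluations.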
A categorical perspective towards aerodynamic models for aeroelastic analyses of bridge decks
(2019)
Reliable modelling in structural engineering is crucial for the serviceability and safety of structures. The huge variety of aerodynamic models for aeroelastic analyses of bridges poses natural questions about their complexity and thus their quality. Moreover, a direct comparison of aerodynamic models is typically either impossible or meaningless, as the models can be based on very different physical assumptions. Therefore, to address the question of principal comparability and complexity of models, a more abstract approach, accounting for the effect of basic physical assumptions, is necessary.
This paper presents an application of a recently introduced category theory-based modelling approach to a diverse set of models from bridge aerodynamics. Initially, the categorical approach is extended to allow an adequate description of aerodynamic models. Complexity of the selected aerodynamic models is evaluated, based on which model comparability is established. Finally, the utility of the approach for model comparison and characterisation is demonstrated on an illustrative example from bridge aeroelasticity. The outcome of this study is intended to serve as an alternative framework for model comparison and impact future model assessment studies of mathematical models for engineering applications.