Rice husk ash (RHA) is classified as a highly reactive pozzolan, with a very high silica content similar to that of silica fume (SF). Using the less expensive and locally available RHA as a mineral admixture in concrete benefits cost, the technical properties of the concrete, and the environment. An experimental study of the effect of RHA blending on the workability, strength, and durability of high-performance fine-grained concrete (HPFGC) is presented. The results show that the addition of RHA to HPFGC significantly improved compressive strength, splitting tensile strength, and chloride penetration resistance. Interestingly, the ratio of compressive strength to splitting tensile strength of HPFGC was lower than that of ordinary concrete, especially for the concrete made with 20 % RHA. The compressive strength of HPFGC containing RHA was similar to, and its splitting tensile strength slightly higher than, that of HPFGC containing SF. The chloride penetration resistance of HPFGC containing 10–15 % RHA was comparable with that of HPFGC containing 10 % SF.
Scientific colloquium held 24–27 April 2003 at the Bauhaus-Universität Weimar on the theme 'MediumArchitektur - Zur Krise der Vermittlung' (Medium Architecture: On the Crisis of Mediation)
A vast number of existing buildings were constructed before the development and enforcement of seismic design codes and therefore run the risk of being severely damaged under seismic excitation. This poses not only a threat to human life but also affects the socio-economic stability of the affected area. It is therefore necessary to assess the present vulnerability of such buildings in order to make an informed decision on risk mitigation by seismic strengthening techniques such as retrofitting. However, it is not feasible, in terms of either cost or time, to inspect, repair, and augment every old building on an urban scale. As a result, reliable rapid screening methods, namely Rapid Visual Screening (RVS), have garnered increasing interest among researchers and decision-makers alike. In this study, the effectiveness of five different Machine Learning (ML) techniques for vulnerability prediction has been investigated. Damage data from four earthquakes, in Ecuador, Haiti, Nepal, and South Korea, have been utilized to train and test the developed models. Eight performance modifiers have been implemented as variables in a supervised ML setting. The investigations in this paper illustrate that the vulnerability classes assessed by the ML techniques were very close to the actual damage levels observed in the buildings.
In this research, an attempt was made to reduce the input dimension of wavelet-ANN/ANFIS (artificial neural network / adaptive neuro-fuzzy inference system) models, both to obtain reliable forecasts and to decrease computational cost. To this end, principal component analysis (PCA) was performed on the input time series, decomposed by a discrete wavelet transform, before feeding the ANN/ANFIS models. The models were applied to forecasting dissolved oxygen (DO) in rivers, an important variable affecting aquatic life and water quality. The current values of DO, water surface temperature, salinity, and turbidity were used as input variables to forecast DO three time steps ahead. The results revealed that PCA can be employed as a powerful tool both for reducing the dimension of the input variables and for detecting their inter-correlation. Results of the PCA-wavelet-ANN models are compared with those of the wavelet-ANN models; the former have the advantage of less computational time than the latter. For ANFIS models, PCA is even more beneficial, as it prevents wavelet-ANFIS models from creating too many rules, which deteriorates their efficiency. Moreover, applying PCA to the wavelet-ANFIS models leads to a significant decrease in computational time. Finally, it was found that the PCA-wavelet-ANN/ANFIS models can provide reliable forecasts of dissolved oxygen as an important water quality indicator in rivers.
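The dimension-reduction step described in this abstract can be sketched in miniature. The following is an illustrative toy, not the study's implementation: two strongly correlated (hypothetical) wavelet sub-series are collapsed onto their leading principal axis before they would be fed to a forecasting model.

```python
# Minimal sketch of PCA-based input dimension reduction
# (illustrative only; data and variable names are invented).
import math

def pca_2d(samples):
    """Return the leading principal axis of 2-D samples via the
    closed-form eigendecomposition of the 2x2 covariance matrix."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / n
    syy = sum((y - my) ** 2 for _, y in samples) / n
    sxy = sum((x - mx) * (y - my) for x, y in samples) / n
    # Eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)   # leading eigenvalue
    v = (sxy, lam1 - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    axis = (v[0] / norm, v[1] / norm)
    explained = lam1 / tr                          # variance share kept
    return axis, explained

def project(samples, axis):
    """Reduce each 2-D sample to its score on the leading axis."""
    return [x * axis[0] + y * axis[1] for x, y in samples]

# Two strongly correlated sub-series collapse to one model input
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]
axis, share = pca_2d(data)
scores = project(data, axis)
```

Because the two inputs are almost collinear, one component retains nearly all the variance, which is exactly the situation in which PCA halves the model's input dimension at negligible information loss.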
Real-world labs hold the potential to catalyse rapid urban transformations through real-world experimentation. Characterised by a rather radical, responsive, and location-specific nature, real-world labs face constraints in the scaling of experimental knowledge. To make a significant contribution to urban transformation, the produced knowledge must go beyond the level of a building, street, or small district where real-world experiments are conducted. Thus, a conflict arises between experimental boundaries and the stimulation of broader implications. The challenges of scaling experimental knowledge have been recognised as a problem, but remain largely unexplained. Based on this, the article will discuss the applicability of the “typology of amplification processes” by Lam et al. (2020) to explore and evaluate the potential of scaling experimental knowledge from real-world labs. The application of the typology is exemplified in the case of the Bauhaus.MobilityLab. The Bauhaus.MobilityLab takes a unique approach by testing and developing cross-sectoral mobility, energy, and logistics solutions with a distinct focus on scaling knowledge and innovation. For this case study, different qualitative research techniques are combined according to “within-method triangulation” and synthesised in a strengths, weaknesses, opportunities, and threats (SWOT) analysis. The analysis of the Bauhaus.MobilityLab proves that the “typology of amplification processes” is useful as a systematic approach to identifying and evaluating the potential of scaling experimental knowledge.
The current study attempts to establish an adequate classification for semi-rigid beam-to-column connections by investigating their strength, stiffness, and ductility. For this purpose, experimental tests were carried out to investigate the moment-rotation (M-θ) characteristics of flush end-plate (FEP) connections, with variable parameters including the size and number of bolts, the thickness of the end-plate, and the size of the beams and columns. The initial elastic stiffness and ultimate moment capacity of the connections were determined by an extensive analytical procedure following the methods prescribed by the ANSI/AISC 360-10 and Eurocode 3 Part 1-8 specifications. The behaviour of beams with partially restrained or semi-rigid connections was also studied using classical analysis methods. The results confirmed that the thicknesses of the column flange and end-plate substantially govern the initial rotational stiffness of flush end-plate connections. The results also clearly showed that EC3 provides the more reliable classification index for FEP connections. The findings of this study make a significant contribution to the current literature, as the actual response characteristics of such connections are non-linear; this semi-rigid behaviour should therefore be incorporated into analysis and design methods.
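The stiffness classification referred to here follows boundaries of the form given in EN 1993-1-8, clause 5.2.2 (rigid above roughly 8·EIb/Lb for braced frames, 25·EIb/Lb for unbraced ones; nominally pinned below 0.5·EIb/Lb). A hedged sketch of that banding, with invented numerical values:

```python
# Sketch of EC3 (EN 1993-1-8, cl. 5.2.2) stiffness classification
# boundaries for a beam-to-column joint; input values are illustrative.
def classify_joint(s_j_ini, e_ib_over_lb, braced=True):
    """Classify a joint by its initial rotational stiffness S_j,ini.
    e_ib_over_lb = E*Ib/Lb for the connected beam."""
    k_b = 8.0 if braced else 25.0          # rigid-boundary factor
    if s_j_ini >= k_b * e_ib_over_lb:
        return "rigid"
    if s_j_ini <= 0.5 * e_ib_over_lb:
        return "nominally pinned"
    return "semi-rigid"

# A flush end-plate joint typically falls in the intermediate band
print(classify_joint(s_j_ini=3.0e4, e_ib_over_lb=1.0e4))  # semi-rigid
```

The same stiffness can classify differently in braced and unbraced frames, which is one reason classification indices, rather than raw stiffness values, are compared in the study.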
Pressure fluctuations beneath hydraulic jumps can endanger the stability of stilling basins. This paper deals with the mathematical modeling of laboratory-scale experimental results to estimate extreme pressures. Experiments were carried out on a smooth stilling basin beneath free hydraulic jumps downstream of an Ogee spillway. From the probability distribution of the measured instantaneous pressures, pressures with different probabilities could be determined. It was verified that the maximum pressure fluctuations, and the negative pressures, are located near the spillway toe, while the minimum pressure fluctuations are located downstream of the hydraulic jumps. It was possible to assess the cumulative curves of the pressure data for the characteristic points along the basin and for different Froude numbers. To benchmark the results, the dimensionless forms of the statistical parameters, including the mean pressure (P*m), the standard deviation of the pressure fluctuations (σ*X), the pressures with different non-exceedance probabilities (P*k%), and the statistical coefficient of the probability distribution (Nk%), were assessed. It was found that an existing method can be used to interpret the present data, and the pressure distribution under similar conditions, by using new second-order fractional relationships for σ*X and Nk%. The values of the Nk% coefficient indicated a single mean value for each probability.
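The notion of a pressure with a given non-exceedance probability can be illustrated with a toy empirical quantile over a sampled record. The record below is invented, not data from the study:

```python
# Toy sketch: pressure with a given non-exceedance probability k,
# taken as a nearest-rank quantile of a sampled record (invented data).
def pressure_k(samples, k):
    """Empirical pressure not exceeded with probability k (0..1)."""
    s = sorted(samples)
    idx = min(len(s) - 1, max(0, int(round(k * (len(s) - 1)))))
    return s[idx]

record = [2.1, -0.4, 1.3, 0.8, 3.0, -1.2, 0.2, 1.9, 2.6, 0.5]
p_mean = sum(record) / len(record)
p_1 = pressure_k(record, 0.01)    # near-minimum (extreme negative)
p_99 = pressure_k(record, 0.99)   # near-maximum
```

In practice P*1% is relevant for uplift and cavitation (negative extremes) and P*99% for peak loading, which is why both tails of the distribution are characterized along the basin.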
The economic losses from earthquakes tend to hit the national economy considerably; therefore, models capable of estimating the vulnerability and losses caused by future earthquakes are highly consequential for emergency planners aiming at risk mitigation. This demands a mass prioritization filtering of structures to identify vulnerable buildings for retrofitting purposes. Applying advanced structural analysis to each building to study its earthquake response is impractical due to the complex calculations, long computational time, and exorbitant cost. This exhibits the need for a fast, reliable, and rapid method, commonly known as Rapid Visual Screening (RVS). The method serves as a preliminary screening platform, using an optimum number of seismic parameters of the structure and predefined output damage states. In this study, the efficacy of a Machine Learning (ML) approach to damage prediction, through a Support Vector Machine (SVM) model as the damage classification technique, has been investigated. The developed model was trained and examined based on damage data from the 1999 Düzce Earthquake in Turkey, where the building data consist of 22 performance modifiers implemented with supervised machine learning.
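The supervised classification idea — mapping performance modifiers to a damage class — can be sketched with a minimal linear classifier. Note this is a perceptron stand-in, not the SVM used in the study, and the features and labels are invented:

```python
# Minimal linear classifier over performance modifiers: a perceptron
# stand-in for the study's SVM (features and labels are invented).
def train_perceptron(rows, labels, epochs=20, lr=0.1):
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):       # y in {-1, +1}
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:               # misclassified -> update
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy rows: [storeys, soft_storey, plan_irregularity]; +1 = damaged
X = [[2, 0, 0], [6, 1, 1], [3, 0, 1], [7, 1, 0], [1, 0, 0], [5, 1, 1]]
y = [-1, 1, -1, 1, -1, 1]
w, b = train_perceptron(X, y)
preds = [predict(w, b, x) for x in X]
```

An SVM replaces the perceptron update with a maximum-margin optimization, but the screening workflow — encode modifiers, train on observed damage, predict a class for unscreened buildings — is the same.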
The empire of language
(2003)
This paper presents an application of dynamic decision making under uncertainty in planning and estimating underground construction. The application of the proposed methodology is illustrated by its application to an actual tunneling project—The Hanging Lake Tunnel Project in Colorado, USA. To encompass the typical risks in underground construction, tunneling decisions are structured as a risk-sensitive Markov decision process that reflects the decision process faced by a contractor in each tunneling round. This decision process consists of five basic components: (1) decision stages (locations), (2) system states (ground classes and tunneling methods), (3) alternatives (tunneling methods), (4) ground class transition probabilities, and (5) tunneling cost structure. The paper also presents concepts related to risk preference that are necessary to model the contractor’s risk attitude, including the lottery concept, utility theory, and the delta property. The optimality equation is formulated, the model components are defined, and the model is solved by stochastic dynamic programming. The main results are the optimal construction plans and risk-adjusted project costs, both of which reflect the dynamics of subsurface construction, the uncertainty about geologic variability as a function of available information, and the contractor’s risk preference.
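The backward-induction structure over the five components listed above can be sketched as follows. All transition probabilities and costs are invented, and this toy is risk-neutral (expected cost), whereas the paper's formulation is risk-sensitive via a utility function:

```python
# Sketch of stochastic dynamic programming over tunneling rounds:
# stages = rounds, states = ground classes, alternatives = methods.
# Probabilities and costs are invented; risk-neutral simplification.
GROUND = ["good", "poor"]
METHODS = {"open_face": {"good": 1.0, "poor": 4.0},   # cost per round
           "shielded":  {"good": 2.0, "poor": 2.5}}
# P(next ground class | current ground class), method-independent here
TRANS = {"good": {"good": 0.7, "poor": 0.3},
         "poor": {"good": 0.4, "poor": 0.6}}

def solve(n_rounds):
    """Backward induction: expected cost-to-go and optimal method
    for every (round, ground class)."""
    value = {g: 0.0 for g in GROUND}          # after the last round
    policy = []
    for _ in range(n_rounds):
        new_value, choice = {}, {}
        for g in GROUND:
            best = None
            for m, cost in METHODS.items():
                exp = cost[g] + sum(TRANS[g][h] * value[h] for h in GROUND)
                if best is None or exp < best[1]:
                    best = (m, exp)
            choice[g], new_value[g] = best
        policy.append(choice)
        value = new_value
    return value, policy

value, policy = solve(3)
```

The resulting `value` corresponds to the risk-adjusted project cost in the paper (here merely the expected cost), and `policy` to the optimal construction plan per ground class and round.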
The development of 3D technologies in many different areas during the last decades is leading us towards a complete 3D representation of planet Earth at a high level of detail. At the lowest level we have geographical information systems (GIS) representing the outer layer of our planet as a 3D model. Meanwhile, these systems do not only provide a geographical model but also present additional information, such as ownership and infrastructure, that might be of interest for the construction business. In the future these systems will serve as the basis for virtual environments for planning and simulating construction sites. In addition, work is being done on the integration of GIS with 3D city models in the area of urban planning, and thus on the integration of different levels of detail. This article presents research on the use of 3D models in construction at the next level of detail below that of urban planning. The 3D city model is taken as the basis for the 3D model of the construction site. In this virtual nD world a contractor can organize and plan resources, simulate different variants of construction processes, and thus find the most effective solution with respect to cost and time. Building on earlier research, the authors present a new approach for cost estimation and simulation using development technologies from game software.
We apply keyquery-based taxonomy composition to compute a classification system for the CORE dataset, a shared crawl of about 850,000 scientific papers. Keyquery-based taxonomy composition can be understood as a two-phase hierarchical document clustering technique that uses search queries as cluster labels: in the first phase, the document collection is indexed by a reference search engine, and each document is tagged with the search queries for which it is relevant, its so-called keyqueries. In the second phase, a hierarchical clustering is formed from the keyqueries in an iterative process. We use the explicit topic model ESA as the document retrieval model to index the CORE dataset in the reference search engine. Under the ESA retrieval model, documents are represented as vectors of similarities to Wikipedia articles, a methodology proven to be advantageous for text categorization tasks. Our paper presents the generated taxonomy and reports on quantitative properties such as document coverage and processing requirements.
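The first phase — tagging documents with the queries for which they rank highly — can be illustrated in miniature. Here naive term-overlap retrieval stands in for the ESA model, and documents, queries, and the top-k cutoff are all invented:

```python
# Toy version of keyquery tagging: a document is tagged with a query
# if it appears in the top-k results for that query. Term-overlap
# retrieval stands in for ESA; all data is invented.
def retrieve(query, docs):
    """Rank documents by term overlap with the query."""
    q = set(query.split())
    scored = [(len(q & set(text.split())), d) for d, text in docs.items()]
    scored.sort(key=lambda t: (-t[0], t[1]))
    return [d for s, d in scored if s > 0]

def keyqueries(docs, queries, k=2):
    """Tag each document with the queries for which it is in the top-k."""
    tags = {d: [] for d in docs}
    for q in queries:
        for d in retrieve(q, docs)[:k]:
            tags[d].append(q)
    return tags

docs = {"d1": "concrete strength silica ash",
        "d2": "machine learning seismic damage",
        "d3": "seismic strength assessment"}
tags = keyqueries(docs, ["seismic damage", "concrete strength"])
```

In the second phase, the shared and distinct keyqueries among documents would drive the iterative construction of the hierarchy; the queries themselves then serve directly as human-readable cluster labels.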
The purpose of this study is to evaluate the safety impact of deceleration lanes in the upstream zone of at-grade U-turns on four-lane divided Thai highways. Vehicles require a substantial speed reduction for diverging and making a U-turn, and deceleration lanes are provided for this purpose. These lanes also provide storage space for U-turning vehicles, avoiding unnecessary blockage of the through lanes and reducing the potential for rear-end collisions. Safety at a U-turn is greatly influenced by the proper or improper use of the deceleration lane: depending on its length, full or partial speed adjustment can occur within the lane, and road users' behavior is influenced accordingly. To assess the safety impact, four groups of U-turns with varying lengths of deceleration lanes were identified. Owing to limitations in the availability and reliability of road crash data in Thailand, the widely accepted Traffic Conflict Technique (TCT) was used as an alternative, proactive methodology. Geometric data, traffic conflicts, and volume data were recorded in the field at 8 U-turn locations, for 8 hours per location. A Severity Conflict Rate (SCR) was assessed by applying a weighting factor (based on the severity grades of the Czech TCT) to the observed conflicts, related to the conflicting traffic volumes. A comparatively higher SCR represents a lower level of safety. According to the results, an increase in the functional length of the deceleration lane yields a lower SCR and hence a higher level of road safety.
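The SCR computation described above amounts to severity-weighted conflicts normalized by conflicting volume. A hedged sketch, with invented weights and counts rather than the actual Czech TCT grades:

```python
# Sketch of a severity conflict rate: observed conflicts weighted by
# severity grade, related to the conflicting traffic volume.
# Weights and counts are invented, not the Czech TCT values.
def severity_conflict_rate(conflicts, volume):
    """conflicts: {severity_grade: count}; volume: conflicting vehicles.
    Returns weighted conflicts per 100 vehicles."""
    weighted = sum(grade * count for grade, count in conflicts.items())
    return 100.0 * weighted / volume

site_a = severity_conflict_rate({1: 12, 2: 5, 3: 1}, volume=500)
site_b = severity_conflict_rate({1: 6, 2: 2, 3: 0}, volume=500)
# Lower SCR -> higher level of safety (site_b here)
```

Normalizing by volume is what makes sites with different traffic exposure comparable, which is the point of relating conflicts to conflicting volumes rather than reporting raw counts.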
Scientific colloquium held 19–22 April 2007 at the Bauhaus-Universität Weimar on the theme 'Die Realität des Imaginären. Architektur und das digitale Bild' (The Reality of the Imaginary: Architecture and the Digital Image)
A highway product model based on the length information of the centerline is proposed, and an application system is developed. This paper presents the schema and the modeling process of the product model, which includes geometric elements such as the alignment, lanes, sidewalks, shoulders and sprits, and accessories such as guard fences, plantings and signs. Furthermore, the Highway Sequence Editor (HSE) is developed as an application system to verify the model.
The characteristic values of climatic actions in current structural design codes are based on a specified probability of exceedance during the design working life of a structure. These values are traditionally determined from past observation data under a stationary climate assumption. However, this assumption becomes invalid in the context of climate change, where the frequency and intensity of climatic extremes vary with time. This paper presents a methodology to calculate non-stationary characteristic values using state-of-the-art climate model projections. The non-stationary characteristic values are calculated in compliance with the requirements of structural design codes by forming quasi-stationary windows of the entire bias-corrected climate model data. Three approaches for calculating non-stationary characteristic values considering the design working life of a structure are compared, and their consequences for the exceedance probability are discussed.
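The quasi-stationary-window idea can be sketched as follows. This toy takes an empirical quantile per window of an invented annual-maxima series; a real code-compliant analysis would fit an extreme-value distribution within each window instead:

```python
# Sketch of quasi-stationary windows: one characteristic value per
# window as an empirical quantile of annual maxima (invented data;
# a real analysis would fit an extreme-value distribution).
def characteristic_per_window(annual_maxima, window, p=0.98):
    """Split the series into windows and take the p-quantile
    (nearest rank) of each window's annual maxima."""
    out = []
    for i in range(0, len(annual_maxima) - window + 1, window):
        w = sorted(annual_maxima[i:i + window])
        out.append(w[min(len(w) - 1, int(p * len(w)))])
    return out

# A warming trend shifts the windows' characteristic values upward
series = [10, 12, 11, 13, 12, 14, 15, 14, 16, 17, 18, 17]
chars = characteristic_per_window(series, window=4)
```

Within each window the stationary machinery of the design codes still applies; the non-stationarity enters only through the drift of the characteristic value from one window to the next.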
The amount of styrene acrylate copolymer (SA) particles adsorbed on cementitious surfaces at an early stage of hydration was quantitatively determined using three different methodological approaches: the depletion method, visible spectrophotometry (VIS), and thermogravimetry coupled with mass spectrometry (TG-MS). Considering the advantages and disadvantages of each method, including the sample preparation each requires, the results for four polymer-modified cement pastes, varying in polymer content and cement fineness, were evaluated.
Significant discrepancies in the adsorption degrees were observed in some cases. Considerably lower amounts of adsorbed polymer tended to be identified using TG-MS than with the depletion method, with the spectrophotometrically determined values lying in between these extremes. This tendency was found for three of the four cement pastes examined and originates from sample preparation and methodological limitations.
The main influencing factor is the falsification of the polymer concentration in the liquid phase during centrifugation, caused by interactions at the interface between sediment and supernatant. The newly developed method using TG-MS for the quantification of SA particles proved suitable for dealing with these issues: instead of the fluid phase, the sediment is examined with regard to its polymer content, on which the influence of centrifugation is considerably lower.
Recently, the demand for residence in and usage of urban infrastructure has increased, elevating the risk to human life from natural calamities. The occupancy demand has rapidly increased the construction rate, while inadequately designed structures remain highly vulnerable. Buildings constructed before the development of seismic codes are additionally susceptible to earthquake vibrations. Structural collapse causes economic losses as well as losses of human life. Applying detailed theoretical methods to analyze the behavior of each structure is expensive and time-consuming. Therefore, a rapid vulnerability assessment method for checking structural performance is necessary for future developments. This process is known as Rapid Visual Screening (RVS). The technique was created to identify, inventory, and screen structures that are potentially hazardous. Sometimes, poor construction quality means that some of the required parameters are unavailable; in such cases the RVS process becomes tedious. Hence, to tackle such situations, multiple-criteria decision-making (MCDM) methods open a new gateway for seismic vulnerability assessment. The various parameters required by RVS can be taken up in MCDM, which evaluates multiple conflicting criteria in decision making across several fields. This paper aims to bridge the gap between RVS and MCDM. Furthermore, to define the correlation between these techniques, the methodologies from the Indian, Turkish, and Federal Emergency Management Agency (FEMA) codes have been implemented, and their assessments of the seismic vulnerability of structures have been observed and compared.
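One of the simplest MCDM schemes, simple additive weighting (SAW), suffices to illustrate how RVS parameters can be aggregated into a priority ranking. The criteria, weights, and building scores below are invented for illustration:

```python
# Sketch of simple additive weighting (SAW), one of the simplest
# MCDM methods, over RVS-style criteria (weights and scores invented).
def saw_rank(buildings, weights):
    """buildings: {name: {criterion: normalized score in [0, 1]}}.
    Higher aggregate score = more vulnerable = higher priority."""
    agg = {name: sum(weights[c] * s for c, s in crit.items())
           for name, crit in buildings.items()}
    return sorted(agg.items(), key=lambda kv: -kv[1])

weights = {"age": 0.3, "irregularity": 0.4, "quality": 0.3}
buildings = {
    "B1": {"age": 0.9, "irregularity": 0.8, "quality": 0.7},
    "B2": {"age": 0.2, "irregularity": 0.1, "quality": 0.3},
    "B3": {"age": 0.6, "irregularity": 0.9, "quality": 0.4},
}
ranking = saw_rank(buildings, weights)
```

More elaborate MCDM methods differ mainly in how they derive the weights and aggregate conflicting criteria, but the output is the same kind of retrofit priority list.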
Rapid Visual Screening (RVS) is a procedure that estimates structural scores for buildings and prioritizes their retrofit and upgrade requirements. Despite the speed and simplicity of RVS, many of the collected parameters are non-commensurable and include subjectivity due to visual observation. This may cause uncertainties in the evaluation, which motivates the use of a fuzzy-based method. This study proposes a novel RVS methodology based on an interval type-2 fuzzy logic system (IT2FLS) to prioritize vulnerable buildings for detailed assessment while covering uncertainties and minimizing their effects during evaluation. The proposed method estimates the vulnerability of a building in terms of a damage index, considering the number of storeys, age of the building, plan irregularity, vertical irregularity, building quality, and peak ground velocity as inputs, with a single output variable. The applicability of the proposed method has been investigated using a post-earthquake damage database of reinforced concrete buildings from the Bingöl and Düzce earthquakes in Turkey.
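The fuzzy inference idea can be sketched in heavily simplified form. This toy is a type-1 system over a single input (the study uses interval type-2 memberships over six inputs), and every membership function and rule consequent is invented:

```python
# Heavily simplified type-1 fuzzy sketch (the study uses interval
# type-2): triangular memberships over one input, weighted-average
# defuzzification to a damage index. All numbers are invented.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def damage_index(storeys):
    """Fire two rules on 'number of storeys' and defuzzify."""
    low = tri(storeys, 0, 2, 6)     # rule 1: low rise  -> DI 0.2
    high = tri(storeys, 2, 8, 14)   # rule 2: high rise -> DI 0.8
    num = low * 0.2 + high * 0.8
    den = low + high
    return num / den if den else 0.0

di_low = damage_index(2)   # fully "low rise"
di_mid = damage_index(5)   # partial membership in both rules
```

An interval type-2 system replaces each crisp membership value with an interval, so the defuzzified damage index carries the uncertainty of the visual observations instead of discarding it.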
Vertical green system for gray water treatment: Analysis of the VertiKKA-module in a field test
(2022)
This work presents a modular Vertical Green System (VGS) for gray water treatment, developed at the Bauhaus-Universität Weimar. The concept was implemented in a field study, with four module sets built and tested with synthetic gray water. Each module set contains a smaller and a larger module with the same treatment substrate and was fed hourly. A combination of lightweight structural material and biochar from agricultural residues or wood chips was used as the treatment substrate. In this article, we present the first 18 weeks of operation. Regarding treatment efficiency, the parameters chemical oxygen demand (COD), total phosphorus (TP), ortho-phosphate (ortho-P), total bound nitrogen (TNb), ammonium nitrogen (NH4-N), and nitrate nitrogen (NO3-N) were analyzed and are presented here. The results for the modules with agricultural residues are promising: up to 92% COD reduction is observed in the data, and the phosphate and nitrogen fractions are reduced significantly in these modules. By contrast, the modules with wood chips reduce only 67% of the incoming COD, and correspondingly less of the phosphate and nitrogen fractions.
The laser beam is a small, flexible, and fast polishing tool. With laser radiation it is possible to finish many contours or geometries on quartz glass surfaces in a very short time. The temperature developed during polishing determines the achievable surface smoothing and, as a negative side effect, causes material stresses. To determine which parameters govern the laser polishing process and the resulting surface roughness, and to estimate the material stresses, temperature simulations and extensive polishing experiments were carried out. During these experiments, the starting and machining parameters were varied and temperatures were measured contact-free. The accuracy of the thermal and mechanical simulation was improved through advanced FE analysis.
This work studied the structure and properties of gypsum compositions modified with a man-made modifier based on metallurgical dust and multi-walled carbon nanotubes. The results show that changing the structure of the solid gypsum increases the bending and compressive strength by 70.5% and 138%, respectively, improves the water resistance, and raises the softening factor to 0.85. Modifying the gypsum composition with the complex additive leads to the formation of amorphous structures based on calcium hydrosilicates on the surface of the primary gypsum crystallohydrates, which bond the gypsum crystals and reduce the access of water.
Architecture and Air
(2003)
The scope of this paper is the development of methods and procedures for describing the motion of an arbitrarily shaped foundation. Since the infinite half-space cannot be properly described by a model of finite dimensions without violating the radiation condition, the basic problems are the infinite dimensions of the half-space as well as its non-homogeneous nature. Consequently, an approach has been investigated that solves this problem indirectly by developing a Green's function in which the non-homogeneity and the infiniteness of the half-space are included. Once the Green's function is known, the next step is the evaluation of the contact stresses acting between the foundation and the surface of the half-space through an integral equation, which is solved over the area of the foundation using the Green's function as the kernel. The derivation of the three-dimensional Green's function for the homogeneous half-space (Kobayashi and Sasaki 1991) has been made using the potential method; the partial differential equations occurring in the problem are reduced to ordinary ones through the Hankel integral transform. The general idea for obtaining the three-dimensional Green's function for the layered half-space is similar, but in that case additional phenomena may occur. One of them is the possible appearance of Stoneley surface waves propagating along the contact surfaces of the layers; their contribution to the final result is in most cases important enough that they should not be neglected. The main advantage of the results presented here, compared to others obtained with numerical methods, is their accuracy, especially in the case of thin layers, because all essential steps of the Green's function evaluation, except for the contour integration along the branch cut, have been made analytically.
On the other hand, the disadvantage of this method is that the mathematical effort for obtaining the Green's function increases drastically with the number of layers. Future work will therefore be directed at simplifying the process described above.
There is a big gap between the capabilities of current 3D CAD applications and their actual usage in practice. Many architects and planners still prefer to draft in 2D because the benefits of 3D modeling are difficult to explain. This presentation offers a basis for viewing the 3D building model not merely as the source for 2D plan generation. By adding extra dimensions such as time and cost to the 3D building model, it becomes possible to generate dynamic information on building construction progress with regard to the materials, resources, and cost involved. These additional benefits are key for many planners and contractors and may therefore widen the acceptance of 3D building modeling in general.
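The time-and-cost extension of a building model can be sketched as a data structure: each element carries a schedule and a cost, so progress and spent cost become queryable per date. The element names, dates, and costs below are invented:

```python
# Sketch of extending a 3D element with time and cost dimensions so
# that construction progress and spent cost can be queried per date
# (element names, dates, and costs are invented).
from datetime import date

class Element:
    def __init__(self, name, start, end, cost):
        self.name, self.start, self.end, self.cost = name, start, end, cost

    def progress(self, day):
        """Fraction built on a given date (linear within the task)."""
        if day <= self.start:
            return 0.0
        if day >= self.end:
            return 1.0
        return (day - self.start).days / (self.end - self.start).days

def spent_cost(elements, day):
    """Cost incurred up to a date across the whole model."""
    return sum(e.cost * e.progress(day) for e in elements)

model = [Element("foundation", date(2024, 3, 1), date(2024, 3, 11), 50_000),
         Element("walls", date(2024, 3, 11), date(2024, 3, 31), 80_000)]
mid = spent_cost(model, date(2024, 3, 21))
```

Linking geometry, schedule, and cost in one model is exactly what lets a planner compare construction-process variants by their cost-over-time curves rather than by static plans.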
We provide a critical overview of the current status of computational support for construction management and building performance evaluation in North America. This overview is based on research conducted in relation to the design and construction of the Intelligent Workplace (IW) at Carnegie Mellon University, Pittsburgh, Pennsylvania, USA. With regard to commercial software products in the field of construction management, the following limitations can be identified: although project planning, cost estimating, and construction simulation are supported, tasks like bidding as well as site and material management have not received the same level of attention. Few project management software packages are integrated into a total design support software system, and little analysis or evaluation functionality is provided to support managerial decision making. Various research groups address construction planning and scheduling, construction contracting, and site layout generation, as well as the integration of these three topics. Problems such as efficient material management and the calculation of environmentally and energy-responsive site management are currently insufficiently addressed within ongoing research projects. In the domain of building performance simulation and decision support, one can notice that the development and application of computational tools is industry-driven. As a result, the concerns addressed by the tools are mainly issues pertaining to the selection and sizing of systems and components, rather than an integrated performance evaluation. Consequently, these programs are rarely used by building designers, especially in the early design stages, where the predictive capabilities of simulation tools could be of significant value.
Although many research institutions address the necessity of integrating performance simulation within overall design support environments, most of the practically available performance simulation tools remain mono-dimensional and isolated.
Wissenschaftliches Kolloquium vom 19. bis 22. April 2007 in Weimar an der Bauhaus-Universität zum Thema: ‚Die Realität des Imaginären. Architektur und das digitale Bild'
In this paper, systematic analyses of shoring systems installed to support the loads applied during construction are performed on the basis of a numerical approach. Based on a rigorous time-dependent analysis, the structural behavior of reinforced concrete (RC) frame structures under changes in design variables such as the type of shoring system, shore stiffness, and shore spacing is analyzed and discussed. The time-dependent deformations of concrete, such as creep and shrinkage, and the construction sequence of the frame structure are also taken into account, since these effects may increase the axial forces delivered to the shores; considering them minimizes structural instability and leads to an improved shoring system design. In addition, the influence of the column shortening effect, commonly discussed for tall building structures, is analyzed. From many parametric studies, it is concluded that the most effective shoring system for RC frame structures is 2S1R (two shores and one reshore), regardless of changes in the design variables.
Processes underlying crowding in visual letter recognition were examined by investigating effects of training. Experiment 1 revealed that training reduces crowding mainly for trained strings. This was corroborated in Experiment 2, where no training effects were obvious after 3 days of training when strings changed from trial to trial. Experiment 3 specified that after a short amount of training, learning effects remained specific to trained strings and also to the trained retinal eccentricity and the interletter spacing used in training. Transfer to other than trained conditions was observed only after further training. Experiment 4 showed that transfer occurred earlier when words were used as stimuli. These results thus demonstrate that part of crowding results from the absence of higher level representations of the stimulus. Such representations can be acquired through learning visual properties of the stimulus.
A Machine Learning Framework for Assessing Seismic Hazard Safety of Reinforced Concrete Buildings
(2020)
Although averting a seismic event and its physical, social, and economic disruption is practically impossible, advancements in computational science and numerical modeling can help predict its severity, understand the outcomes, and prepare for post-disaster management. Many aging buildings in developed metropolitan areas are still in service; they were designed before national seismic codes were established or without the introduction of construction regulations. In such cases, risk reduction is significant for developing alternatives and designing suitable models to enhance the performance of existing structures. Such models can classify the risks and casualties of possible earthquakes and support emergency preparation. It is therefore crucial to recognize structures that are susceptible to earthquake vibrations and need to be prioritized for retrofitting. However, the behavior of each individual building under seismic actions cannot be studied through detailed structural analysis, as this is unrealistic because of the rigorous computations, long duration, and substantial expenditure involved. This calls for a simple, reliable, and accurate process known as Rapid Visual Screening (RVS), which serves as a primary screening platform, including an optimum number of seismic parameters and predetermined performance damage conditions for structures. In this study, the damage classification technique was studied, and the efficacy of the Machine Learning (ML) method in damage prediction via a Support Vector Machine (SVM) model was explored. The ML model is trained and tested separately on damage data from four different earthquakes, in Ecuador, Haiti, Nepal, and South Korea. Each dataset consists of a varying number of input data and eight performance modifiers.
Based on the results, the SVM model classifies the given input data into the appropriate damage classes and performs well in the hazard safety evaluation of buildings.
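To illustrate the classification step, the following is a minimal linear SVM trained by Pegasos-style sub-gradient descent on synthetic two-class data. This is a generic sketch, not the study's model: the actual damage data, kernel choice, and eight performance modifiers are not reproduced, and the bias term is omitted because the toy clusters are centered around the origin.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM (hinge loss) via Pegasos-style sub-gradient descent.
    Labels in y must be -1/+1. No bias term (toy data are centered)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            if y[i] * (X[i] @ w) < 1:        # sample violates the margin
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w      # regularization shrink only
    return w

def predict(X, w):
    return np.where(X @ w >= 0, 1, -1)

# Toy stand-in for two damage classes: two well-separated 2-D clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (40, 2)), rng.normal(2, 0.5, (40, 2))])
y = np.array([-1] * 40 + [1] * 40)
w = train_linear_svm(X, y)
acc = (predict(X, w) == y).mean()
```

In the study itself a full SVM implementation (with kernels and bias) would be used; the sketch only shows the margin-based update that separates the classes.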
The present article aims to provide an overview of the consequences of dynamic soil-structure interaction (SSI) on building structures and the available modelling techniques to resolve SSI problems. The role of SSI has been traditionally considered beneficial to the response of structures. However, contemporary studies and evidence from past earthquakes showed detrimental effects of SSI in certain conditions. An overview of the related investigations and findings is presented and discussed in this article. Additionally, the main approaches to evaluate seismic soil-structure interaction problems with the commonly used modelling techniques and computational methods are highlighted. The strength, limitations, and application cases of each model are also discussed and compared. Moreover, the role of SSI in various design codes and global guidelines is summarized. Finally, the advancements and recent findings on the SSI effects on the seismic response of buildings with different structural systems and foundation types are presented. In addition, with the aim of helping new researchers to improve previous findings, the research gaps and future research tendencies in the SSI field are pointed out.
Scientific colloquium held from 19 to 22 April 2007 at the Bauhaus-Universität Weimar on the theme 'Die Realität des Imaginären. Architektur und das digitale Bild' ('The Reality of the Imaginary: Architecture and the Digital Image')
We propose an enhanced iterative scheme for the precise reconstruction of piezoelectric material parameters from electric impedance and mechanical displacement measurements. It is based on finite-element simulations of the full three-dimensional piezoelectric equations, combined with an inexact Newton or nonlinear Landweber iterative inversion scheme. We apply our method to two piezoelectric materials and test its performance. For the first material, the manufacturer provides a full data set; for the second one, no material data set is available. For both cases, our inverse scheme, using electric impedance measurements as input data, performs well.
Broadband dielectric measurement methods based on a vector network analyzer coupled with a coaxial transmission-line cell (CC) or an open-ended coaxial probe (OC) are briefly reviewed and used to investigate the dielectric behavior, in the frequency range of 1 MHz to 3 GHz, of two practical geomaterials. Kaolin after modified compaction with different water contents is measured using the CC. The results are consistent with a previous study on standardized compacted kaolin and suggest that the dielectric properties at frequencies below 100 MHz are a function not only of water content but also of other soil-state parameters, including dry density. The hydration process of a commercial grout is monitored in real time using the OC. It is found that the time-dependent dielectric properties can accurately reveal the different stages of the hydration process. These measurement results demonstrate the practicability of the introduced methods in determining the dielectric properties of soft geomaterials.
Beyond metropolitan areas, many peripheral regions and their cities in Europe have, in manifold ways, been significantly shaped by industrialisation. In the context of the relocation of industrial production to other countries over the last decades, the question has been raised as to the role this heritage can play in future regional development, as well as the potential local identification with this history. Hence, this article seeks to analyse the perception of the industrial heritage in the Vogtland region, located along the border of three German federal states and the Czech Republic. It inquires into the perception of the industrial heritage by the local population and related potential future narrations. Drawing on spontaneous and explorative interviews with local people as an empirical base, a discrepancy between the perception of the tangible and intangible dimensions of the industrial heritage can be observed. On the one hand, tangible heritage such as older factories and production complexes is seen as a functional legacy, and an "eyesore" narrative is attributed to it. On the other hand, people often reference the personal and familial connection to the industry and highlight its importance for the historical development and the wealth of the region. But these positive associations are mainly limited to the intangible dimension and are disconnected from the material artefacts of industrial production.
Scientific colloquium held from 27 to 30 June 1989 at the Hochschule für Architektur und Bauwesen Weimar on the theme 'Produktivkraftentwicklung und Umweltgestaltung' ('Development of Productive Forces and Environmental Design: Social and Scientific-Technical Progress in Their Effects on Architecture and Industrial Design in Our Time. On the Occasion of Hannes Meyer's 100th Birthday')
The creation of a hierarchical sequence of plastic and viscoplastic models according to different levels of structural approximation is considered. The developed strategy of multimodel analysis, which consists of creating a library of inelastic models, determining a system of selection criteria, and carrying out multivariant sequential refining computations, is described. Application of the multimodel approach in numerical computations has demonstrated the possibility of reliably predicting the stress-strain response under a wide variety of combined non-proportional loading.
Proposed in this abstract is an instrumental system for the analysis of problems in solid mechanics. It allows the researcher to describe, within a single task, the input data for the object under analysis and a problem scheme based upon variational principles. A particular feature of the system is the possibility of describing an object of any geometrical shape together with a computation scheme defined programmatically for the purpose. The methods allow the computation of tasks with an indefinite functional and indefinite geometry of the object (or set of objects). The system can compute tasks with an indefinite scheme based upon the Finite Element Method (FEM); the restrictions on its use are therefore determined by the restrictions of the FEM itself. In contrast to other known FEM programs (ANSYS, LS-DYNA, etc.), the described system is more universal in defining input data and choosing the computational scheme. An original subsystem for the analysis of numerical results is built in; it can visualise all numerical results, plot diagrams of the unknown variables, and so on. The subsystem has been validated in solving two- and three-dimensional problems of elasticity and plasticity under conditions of geometrical nonlinearity. Contact problems of statics and dynamics are also discussed.
Urban planning involves many aspects and various disciplines, demanding an asynchronous planning approach. The level of complexity rises with each aspect to be considered and makes it difficult to find universally satisfactory solutions. To improve this situation we propose a new approach, which complements traditional design methods with a computational urban planning method that can fulfill formalizable design requirements automatically. Based on this approach we present a design space exploration framework for complex urban planning projects. For a better understanding of the idea of design space exploration, we introduce the concept of a digital scout which guides planners through the design space and assists them in their creative explorations. The scout can support planners during manual design by informing them about potential impacts or by suggesting different solutions that fulfill predefined quality requirements. The planner can change flexibly between a manually controlled and a completely automated design process. The developed system is presented using an exemplary urban planning scenario on two levels, from the street layout to the placement of building volumes. Based on Self-Organizing Maps, we implemented a method which makes it possible to visualize the multi-dimensional solution space in an easily analysable and comprehensible form.
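The core of a Self-Organizing Map, which projects high-dimensional solutions onto a 2-D grid for visual exploration, can be sketched as follows. This is a generic minimal implementation under illustrative assumptions: the grid size, learning-rate schedule, and toy 5-dimensional "solutions" are not the parameters of the described framework.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=40, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map training loop."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    d = data.shape[1]
    weights = rng.random((rows, cols, d))
    # Grid coordinates, used by the neighbourhood kernel.
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3  # shrinking neighbourhood
        for x in data:
            dist = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(dist.argmin(), dist.shape)  # best-matching unit
            gdist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-gdist2 / (2 * sigma ** 2))   # Gaussian neighbourhood
            weights += lr * h[..., None] * (x - weights)
    return weights

# Map 100 toy 5-dimensional "design solutions" onto a 6x6 grid.
rng = np.random.default_rng(2)
solutions = rng.random((100, 5))
som = train_som(solutions)
# Quantisation error: mean distance of each sample to its best-matching unit.
qe = np.mean([np.linalg.norm(som - x, axis=-1).min() for x in solutions])
```

Each grid cell ends up representing a region of the solution space, which is what makes the multi-dimensional set of design variants inspectable on a flat map.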
Scientific colloquium held from 24 to 27 April 2003 at the Bauhaus-Universität Weimar on the theme 'MediumArchitektur - Zur Krise der Vermittlung' ('Medium Architecture: On the Crisis of Mediation')
Bolted connections are commonly used in steel construction. The load-bearing behavior of bolted fittings has been studied extensively in various research activities, and the bearing capacity of bolted connections can be assessed well by standard regulations for practical applications. With regard to tensile loading, the nut does not have a strong influence on the resistance, since failure occurs in the bolt due to the higher material strength of the nut. In some applications, so-called “blind holes” are used to connect plated components. In a manner of speaking, the nut is replaced by the “outer” plate with a prefabricated hole and thread, into which the bolt can be screwed and tightened. In such connections, the limit load capacity cannot be assessed solely from the bolt resistance, since the threaded hole in the base material has a strong influence on the structural behavior. In this context, the available screw-in depth of the blind hole is of fundamental importance. The German National Annex of EN 1993-1-8 provides information on the depth necessary to transfer the full tensile capacity of the bolt. However, some connections do not allow such depths to be fabricated; in these cases, the capacity of the connection is unclear and not specified. In this paper, first experiments on corresponding connections with different screw-in depths are presented and compared to the limit load capacities according to the standard.
This article aims to develop a social theory of violence that emphasizes the role of the third party as well as the communication between the involved subjects. For this Teresa Koloma Beck’s essay ‘The Eye of the Beholder: Violence as a Social Process’ is taken as a starting point, which adopts a social-constructivist perspective. On the one hand, the basic concepts and the benefits of this approach are presented. On the other hand, social-theoretical problems of this approach are revealed. These deficits are counteracted by expanding Koloma Beck’s approach with a communicative-constructivist framework. Thus, the role of communicative action and the ‘objectification of violence’ is emphasized. These aspects impact the perception, judgement and (de-)legitimation of violence phenomena and the emergence of a ‘knowledge of violence’. Communicative actions and objectifications form a key to understanding violent interactions and the link between the micro and macro levels. Finally, the methodological consequences for the research of violence and Communicative Constructivism are discussed. Furthermore, possible research fields are outlined, which open up by looking at communicative action and the objectifications within the ‘triads of violence’.
In this study, an application of evolutionary multi-objective optimization algorithms to the optimization of sandwich structures is presented. The solution strategy, known as the Elitist Non-Dominated Sorting Evolution Strategy (ENSES), embeds Evolution Strategies (ES) as the Evolutionary Algorithm (EA) within the elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) procedure. Evolutionary algorithms are a suitable approach for multi-objective optimization problems because they are inspired by natural evolution, which is closely linked to Artificial Intelligence (AI) techniques, and elitism has been shown to be an important factor in improving evolutionary multi-objective search. To evaluate the performance of ENSES, well-known study cases of sandwich structures are reconsidered. For Case 1, the goals of the multi-objective optimization are minimization of the deflection and the weight of the sandwich structure; the length and the core and skin thicknesses are the design variables. For Case 2, the objective functions are the fabrication cost, the beam weight and the end deflection of the sandwich structure; there are four design variables, i.e., the weld height, the weld length, the beam depth and the beam width. Numerical results are presented in terms of Pareto-optimal solutions for both evaluated cases.
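The elitist non-dominated sorting step at the core of NSGA-II (and hence ENSES) can be sketched generically as follows. The toy (deflection, weight) pairs are illustrative only; this is not the authors' code.

```python
def non_dominated_sort(objs):
    """Return Pareto fronts (lists of indices) for a minimisation problem,
    following the fast non-dominated sorting scheme of NSGA-II."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # indices that solution i dominates
    dom_count = [0] * n                     # how many solutions dominate i

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and \
               any(x < y for x, y in zip(a, b))

    for i in range(n):
        for j in range(n):
            if i != j and dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif i != j and dominates(objs[j], objs[i]):
                dom_count[i] += 1
    fronts = [[i for i in range(n) if dom_count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:       # j only dominated by earlier fronts
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Toy (deflection, weight) pairs: solutions 0 and 1 trade off against each other.
objs = [(1.0, 4.0), (2.0, 3.0), (3.0, 5.0), (1.5, 4.5)]
fronts = non_dominated_sort(objs)
# fronts[0] is the Pareto-optimal set reported to the designer.
```

Elitism then simply carries the best fronts of the combined parent and offspring population into the next generation, which is the improvement the abstract refers to.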
This study aims to develop an approach to couple a computational fluid dynamics (CFD) solver to the University of California, Berkeley (UCB) thermal comfort model to accurately evaluate thermal comfort. The coupling was made using an iterative JavaScript to automatically transfer data for each individual segment of the human body back and forth between the CFD solver and the UCB model until reaching convergence defined by a stopping criterion. The location from which data are transferred to the UCB model was determined using a new approach based on the temperature difference between subsequent points on the temperature profile curve in the vicinity of the body surface. This approach was used because the microclimate surrounding the human body differs in thickness depending on the body segment and the surrounding environment. To accurately simulate the thermal environment, the numerical model was validated beforehand using experimental data collected in a climate chamber equipped with a thermal manikin. Furthermore, an example of the practical implementations of this coupling is reported in this paper through radiant floor cooling simulation cases, in which overall and local thermal sensation and comfort were investigated using the coupled UCB model.
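The back-and-forth data exchange with a stopping criterion can be illustrated by a generic fixed-point coupling loop. The two callables below are hypothetical stand-ins for the CFD solver and the UCB model, not their actual interfaces; the scalar temperature is a simplification of the per-segment data the study exchanges.

```python
def couple(cfd_step, comfort_step, t0, tol=1e-4, max_iter=50):
    """Two-solver coupling: exchange boundary data until the change
    between successive iterations falls below the stopping criterion."""
    t = t0
    for i in range(max_iter):
        air = cfd_step(t)           # e.g. air temperature near a body segment
        t_new = comfort_step(air)   # e.g. surface temperature returned by UCB
        if abs(t_new - t) < tol:    # converged: solvers agree
            return t_new, i + 1
        t = t_new
    raise RuntimeError("coupling did not converge")

# Toy stand-ins with a known fixed point: t = 0.5*t + 10  ->  t* = 20.
temp, iters = couple(lambda t: 0.5 * t, lambda a: a + 10.0, t0=30.0)
```

In the study the same pattern is driven by an iterative script per body segment; the essential design choice is that neither solver is modified, only the exchanged boundary data.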
Conventional superplasticizers based on polycarboxylate ether (PCE) show an intolerance to clay minerals due to intercalation of their polyethylene glycol (PEG) side chains into the interlayers of the clay mineral. An intolerance to very basic media is also known. This makes PCE an unsuitable choice as a superplasticizer for geopolymers. Bio-based superplasticizers derived from starch showed comparable effects to PCE in a cementitious system. The aim of the present study was to determine if starch superplasticizers (SSPs) could be a suitable additive for geopolymers by carrying out basic investigations with respect to slump, hardening, compressive and flexural strength, shrinkage, and porosity. Four SSPs were synthesized, differing in charge polarity and specific charge density. Two conventional PCE superplasticizers, differing in terms of molecular structure, were also included in this study. The results revealed that SSPs improved the slump of a metakaolin-based geopolymer (MK-geopolymer) mortar while the PCE investigated showed no improvement. The impact of superplasticizers on early hardening (up to 72 h) was negligible. Less linear shrinkage over the course of 56 days was seen for all samples in comparison with the reference. Compressive strengths of SSP specimens tested after 7 and 28 days of curing were comparable to the reference, while PCE led to a decline. The SSPs had a small impact on porosity with a shift to the formation of more gel pores while PCE caused an increase in porosity. Throughout this research, SSPs were identified as promising superplasticizers for MK-geopolymer mortar and concrete.
This paper proposes a practice-theoretical journalism research approach for an alternate and innovative perspective of digital journalism’s current empirical challenges. The practice-theoretical approach is introduced by demonstrating its explanatory power in relation to demarcation problems, technological changes, economic challenges and challenges to journalism’s legitimacy. Its respective advantages in dealing with these problems are explained and then compared to established journalism theories. The particular relevance of the theoretical perspective is due to (1) its central decision to observe journalistic practices, (2) the transgression of conventional journalistic boundaries, (3) the denaturalization of journalistic norms and laws, (4) the explicit consideration of a material, socio-technical dimension of journalism, (5) a focus on the conflicting relationship between journalistic practices and media management practices, and (6) prioritizing order generation over stability.
Why Do Digital Native News Media Fail? An Investigation of Failure in the Early Start-Up Phase
(2020)
Digital native news media have great potential for improving journalism. Theoretically, they can be the sites where new products, novel revenue streams and alternative ways of organizing digital journalism are discovered, tested, and advanced. In practice, however, the situation appears to be more complicated. Besides the normal pressures facing new businesses, entrepreneurs in digital news are faced with specific challenges. Against the background of general and journalism-specific entrepreneurship literature, and in light of a practice-theoretical approach, this qualitative case study research on 15 German digital native news media outlets empirically investigates what barriers curb their innovative capacity in the early start-up phase. In the new media organizations under study here, there are, among other problems, a high degree of homogeneity within founding teams, tensions between journalistic and economic practices, insufficient user orientation, as well as a tendency for organizations to be underfinanced. The patterns of failure investigated in this study can raise awareness, help news start-ups avoid common mistakes before actually entering the market, and help industry experts and investors to realistically estimate the potential of new ventures within the digital news industry.
Previous publications about biochar in anaerobic digestion show encouraging results with regard to increased biogas yields. This work investigates such effects in a solid-state fermentation of bio-waste. Unlike in previous trials, the influence of biochar is tested with a setup that simulates an industrial-scale biogas plant. Both the biogas and the methane yield increased by around 5% with a biochar addition of 5%, based on the ratio of organic dry matter of biochar to bio-waste; an addition of 10% increased the yield by around 3%. While scaling effects prohibit a simple transfer of the results to industrial-scale plants, and although the certainty of the results is reduced by the heterogeneity of the bio-waste, further research in this direction seems promising.
The lack of information technology applications on construction projects leads to a complex flow of data during the project life cycle. Building Information Modeling (BIM) has gained attention in the Architecture, Engineering and Construction (AEC) industry; it envisages the use of virtual n-dimensional (n-D) models to identify potential conflicts in the design, construction or operation of any facility. A questionnaire was designed to investigate perceptions regarding the advantages of BIM, and around 102 valid responses were received from diverse stakeholders. The results showed very low BIM adoption with a low level of 'buzz'. According to the perception of AEC professionals in Pakistan, the top three advantages are that BIM is a faster and more effective method for design and construction management, that it improves the quality of design and construction, and that it reduces rework during construction. BIM was perceived to have the least impact on the reduction of cost, time and human resources. This research is a benchmark study for understanding the adoption and advantages of BIM in the Pakistani construction industry.
Spatializing difference
(2003)
Patients and staff in hospitals are exposed to a complex sound environment with rather high noise levels. In intensive care units, the main noise sources are hospital staff on duty and medical equipment, which generates both operating noise and acoustic alarms. Although noise is in most cases produced during activities for the purpose of saving life, it can induce significant changes in the depth and quality of sleep and negatively affect health in general. Results of a survey of hospital staff are presented, as well as measurements in two German hospital wards: a standard two-bed room and a special Intermediate Care Unit (IMC-Unit), each in a different Intensive Care Unit (ICU). Sound pressure data were collected over a 48-hour period and converted into different levels (LAFeq, LAFmax, LAFmin, LAF 5%), as well as a rating level LAr, which takes tonality and impulsiveness into account. An analysis of the survey and the measured data, together with a comparison against the thresholds of national and international regulations and standards, describes the acoustic situation and its likely noise effects on staff and patients.
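Equivalent continuous levels such as LAFeq are energetic, not arithmetic, averages of the measured decibel values, which a short function can illustrate:

```python
import math

def laeq(levels_db):
    """Energy-equivalent continuous sound level from equally long
    short-term A-weighted levels (dB): 10*log10(mean(10^(L/10)))."""
    energies = [10 ** (l / 10.0) for l in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))

# Two equally long intervals at 50 dB and 60 dB do not average to 55 dB;
# the louder interval dominates the energetic mean.
level = laeq([50.0, 60.0])
```

This is why short loud events (e.g. alarms) raise LAFeq far more than their duration alone would suggest.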
Occupant needs with regard to residential buildings are not well known due to a lack of representative scientific studies. To improve the lack of data, a large scale study was carried out using a Post Occupancy Evaluation of 1,416 building occupants. Several criteria describing the needs of occupants were evaluated with regard to their subjective level of relevance. Additionally, we investigated the degree to which deficiencies subjectively exist, and the degree to which occupants were able to accept them. From the data obtained, a hierarchy of criteria was created. It was found that building occupants ranked the physiological needs of air quality and thermal comfort the highest. Health hazards such as mould and contaminated building materials were unacceptable for occupants, while other deficiencies were more likely to be tolerated. Occupant satisfaction was also investigated. We found that most occupants can be classified as satisfied, although some differences do exist between different populations. To explain the relationship between the constructs of what we call relevance, acceptance, deficiency and satisfaction, we then created an explanatory model. Using correlation and regression analysis, the validity of the model was then confirmed by applying the collected data. The results of the study are both relevant in shaping further research and in providing guidance on how to maximize tenant satisfaction in real estate management.
Tensile strain and compressive strain can greatly affect the thermal conductivity of graphene nanoribbons (GNRs). However, the effect of shear strain, which is also one of the main strain effects, has not yet been studied systematically. In this work, we employ reverse nonequilibrium molecular dynamics (RNEMD) to systematically study the thermal conductivity of GNRs (with a model size of 4 nm × 15 nm) under shear strain. Our studies show that the thermal conductivity of GNRs is not sensitive to shear strain, decreasing by only 12–16% before the pristine structure breaks. Furthermore, the phonon frequencies and the changes in the micro-structure of the GNRs, such as bond angle and bond length, are analyzed to explain this tendency. The results show that the main influence of shear strain is on the in-plane phonon density of states (PDOS), whose G band (higher-frequency peaks) moves to lower frequencies, thus decreasing the thermal conductivity. The unique thermal properties of GNRs under shear strain suggest great potential for graphene nanodevices and for thermal management and thermoelectric applications.
This article focuses on further developments of the background-oriented schlieren (BOS) technique to visualize convective indoor air flow, which is usually defined by very small density gradients. Since the light rays deflect when passing through fluids with different densities, BOS can detect the resulting refractive index gradients as integration along a line of sight. In this paper, the BOS technique is used to yield a two-dimensional visualization of small density gradients. The novelty of the described method is the implementation of a highly sensitive BOS setup to visualize the ascending thermal plume from a heated thermal manikin with temperature differences of minimum 1 K. To guarantee steady boundary conditions, the thermal manikin was seated in a climate laboratory. For the experimental investigations, a high-resolution DLSR camera was used capturing a large field of view with sufficient detail accuracy. Several parameters such as various backgrounds, focal lengths, room air temperatures, and distances between the object of investigation, camera, and structured background were tested to find the most suitable parameters to visualize convective indoor air flow. Besides these measurements, this paper presents the analyzing method using cross-correlation algorithms and finally the results of visualizing the convective indoor air flow with BOS. The highly sensitive BOS setup presented in this article complements the commonly used invasive methods that highly influence weak air flows.
The K-nearest neighbors (KNN) machine learning algorithm is a well-known non-parametric classification method. However, like other traditional data mining methods, applying it to big data comes with computational challenges. KNN determines the class of a new sample based on the classes of its nearest neighbors; identifying those neighbors in a large amount of data, however, imposes a computational cost so large that the method is no longer applicable on a single computing machine. One of the techniques proposed to make classification methods applicable to large datasets is pruning. LC-KNN is an improved KNN method which first clusters the data into smaller partitions using the K-means clustering method and then, for each new sample, applies KNN on the partition whose center is nearest. However, because the clusters have different shapes and densities, selecting the appropriate cluster is a challenge. In this paper, an approach is proposed to improve the pruning phase of the LC-KNN method by taking these factors into account. The proposed approach helps to choose a more appropriate cluster of data in which to look for the neighbors, thus increasing the classification accuracy. The performance of the proposed approach is evaluated on different real datasets. The experimental results show the effectiveness of the proposed approach and its higher classification accuracy and lower time cost in comparison to other recent relevant methods.
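The baseline LC-KNN pruning idea (cluster first, then search for neighbours only inside the cluster with the nearest center) can be sketched as follows. The farthest-point k-means initialisation and the toy data are illustrative choices, not the paper's setup, and the paper's own improvement to cluster selection is not reproduced here.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means with deterministic farthest-point initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])   # farthest point from chosen centers
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers, axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def lc_knn_predict(x, X, y, centers, labels, k_neighbors=3):
    """Search neighbours only in the cluster whose center is nearest to x."""
    c = np.argmin(np.linalg.norm(centers - x, axis=1))
    Xc, yc = X[labels == c], y[labels == c]
    nn = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k_neighbors]
    vals, counts = np.unique(yc[nn], return_counts=True)
    return vals[np.argmax(counts)]        # majority vote among the neighbours

# Toy data: two well-separated classes, one cluster each.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
centers, labels = kmeans(X, k=2)
pred = lc_knn_predict(np.array([4.8, 5.1]), X, y, centers, labels)
```

The saving comes from the neighbour search touching only one partition of the data; the paper's contribution is a smarter rule for which partition to pick when clusters differ in shape and density.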
Volume rendering is a visualization technique for presenting various kinds of spatial measurement and simulation data in a descriptive, interactive graphical form. The following contribution presents a method for overlaying several volume datasets with an architectural surface model. This complex rendering computation takes place on the graphics card using hardware-accelerated shaders. The contribution introduces the implemented software prototype "VolumeRendering". In addition to the interactive computation method, emphasis was placed on user-friendly operation; the goal was to enable specialist planners to evaluate the volume data easily. Overlaying, for example, different measurement methods with a surface model yields synergies and new evaluation possibilities. Finally, the application of the software prototype is illustrated with examples from an interdisciplinary research project.
This paper presents a new design environment based on multi-agents and Virtual Reality (VR). In this research, a design system with a virtual reality function was developed. The virtual world was realized using GL4Java, liquid crystal shutter glasses, sensor systems, etc. The Multi-Agent CAD system with product models, which had been developed previously, was integrated with the VR design system. A prototype system was developed for highway steel plate girder bridges and applied to a design problem. The application verified the effectiveness of the developed system.
A Multi-objective Model for Optimizing Construction Planning of Repetitive Infrastructure Projects
(2004)
This paper presents the development of a model for optimizing resource utilization in repetitive infrastructure projects. The model is capable of simultaneously minimizing both project duration and work interruptions for construction crews, and it provides, in a single run, a set of nondominated solutions that represent the tradeoff between these two objectives. The model incorporates a multi-objective genetic algorithm and a scheduling algorithm. It initially generates a randomly selected set of solutions that evolves toward a near-optimal set of tradeoff solutions in subsequent generations. Each solution represents a unique schedule associated with a certain project duration and a number of interruption days for the utilized construction crews. As such, the model provides project planners with alternative schedules along with their expected duration and resource utilization efficiency.
The paper investigates the accuracy of deflection predictions made by the finite element package ATENA and the design code methods of ACI and EC2. Deflections have been calculated for a large number of experimental reinforced concrete beams reported by three investigators. Statistical parameters have been established for each technique at different load levels, separately for beams with small and moderate reinforcement ratios.
Thin elastic plates are basic structural elements and are very often subjected to dynamic effects, especially in machine-building structures. Designing them economically to avoid resonant operating conditions is an extremely complicated task which cannot be solved analytically. In the present report, an efficient and sufficiently general method for the optimal design of thin plates is worked out on the basis of the energy resonance method of Wilder, the finite element method for dynamic analysis, and methods of parameter optimization. By means of these methods, various limitations and requirements imposed on the plates by the designer can be taken into account. A programme module is developed for the numerical investigation of the variation of plate weight as a function of the design-thickness variable under different support conditions. The reasons for the considerable quantitative and qualitative differences between the obtained optimal designs are also analysed.
A wide variety of behavioural models exists in microscopic traffic simulation. Commercial programs often use closed-source policies and are confined to their respective simulation platforms, while open-source approaches mainly focus on distinctive, highly specialized traffic situations. In the scope of this paper, an open-source framework for developing modular, object-oriented simulation systems is presented, capable of simultaneously accommodating different driving models and enabling the user to modify and extend the catalogue of driving behaviours. The existing driving behaviours and the computational implementation of the simulation are described.
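The paper does not specify its driving models; as an illustrative stand-in, the Intelligent Driver Model is one widely published open car-following model that such a modular catalogue could include. A minimal sketch with common textbook parameter values (assumed here, not taken from the paper):

```python
import math

def idm_acceleration(v, gap, dv, v0=33.3, T=1.5, a=1.0, b=1.5, s0=2.0):
    """Intelligent Driver Model: acceleration of a following vehicle.
    v: own speed [m/s]; gap: bumper-to-bumper distance to leader [m];
    dv: approach rate, own speed minus leader speed [m/s].
    v0: desired speed, T: desired time headway, a: max acceleration,
    b: comfortable deceleration, s0: minimum standstill gap."""
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)
```

In a modular, object-oriented framework, such a model would be one interchangeable strategy object behind a common acceleration interface, so users can swap or extend driving behaviours.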
This work presents a discrete event simulator that can be used for the automated sequencing and optimization of building processes. The sequencing is based on the commonly used component–activity–resource relations, taking structural and process constraints into account. For the optimization, a genetic algorithm approach was developed, implemented, and successfully applied to several real-life steel constructions. In this contribution we discuss the application of the discrete event simulator, including its optimization capabilities, to a 4D process model of the steel structure of an automobile recycling facility.
Digital maps are readily used as route guide maps. Route guide maps are provided through Web or mobile phone services, and demand for such services is increasing. However, producing a route guide map requires a great deal of time, which makes it difficult for general users to create one. The purpose of the present research is the development of a system that can generate a route guide map using the Digital Map 2500 (Spatial Data Framework) published by the Geographical Survey Institute. This system requires neither advanced equipment nor expert knowledge, so anyone can produce route guide maps easily and quickly. Using the Digital Map 2500 reduces the time and cost required to generate a map. Moreover, a useful route guide map can be created by simplifying the map form based on the human cognitive map.
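The abstract does not name its simplification method; the Ramer-Douglas-Peucker procedure is one common technique for reducing the detail of map polylines and serves here purely as an illustrative sketch of what "simplifying the map form" can mean computationally:

```python
def simplify(points, eps):
    """Ramer-Douglas-Peucker polyline simplification (illustrative sketch).
    Keeps only vertices that deviate from the simplifying chord by more
    than eps, so prominent corners survive and small wiggles are dropped."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # perpendicular distance of p to the chord from first to last point
        px, py = p
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5
        return num / den if den else ((px - x1) ** 2 + (py - y1) ** 2) ** 0.5

    idx = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[idx]) > eps:
        left = simplify(points[:idx + 1], eps)
        right = simplify(points[idx:], eps)
        return left[:-1] + right
    return [points[0], points[-1]]
```

Applied to a digitized street path, a larger eps yields a sparser, sketch-like guide map closer to a cognitive map than to the full survey geometry.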
Development of Urban Land Use Model to Compare Transit-Oriented and Automobile-Oriented Cities
(2004)
This study attempts to develop a simple simulation model that can compare automobile-oriented and transit-oriented cities and clarify how city form differs with the transportation mode. Following the development of a theoretical model, a series of simulation runs is performed. The model allocates people who commute to the CBD from residential zones along a transportation corridor. The simulation analyses show that automobiles need much more traffic space than transit, as indicated by the proposed traffic space ratio both in the CBD and along the corridor.
Multimodel Numerical Analysis of the Elasto-Visco-Plastic Deformation of Materials and Constructions
(1997)
At present there is no generally accepted theory of visco-plasticity applicable to a wide class of materials and arbitrary loading paths. The multimodel approach, based on the creation of a hierarchical sequence of models, is the most rational. The developed library of elasto-visco-plastic models includes both the simplest models and sophisticated ones demanding numerous experimental data. A unified general form of the constitutive equations for all elasto-visco-plastic models used is presented, based upon the concept of tensorial internal state variables. This permits a unified algorithm for the solution of boundary value problems with different variants of the material models. The developed system of selection criteria generates the necessary conditions and provides the choice of the simplest variant of the theory sufficient for a correct solution of the problem. The formulation of the selection criteria is based on the peculiarities of viscoplastic material behaviour over a wide range of thermomechanical loading and on numerous computational experiments with structures of different complexity levels. A set of effective schemes for integrating the stress-strain relations and for solving the nonlinear finite element system is discussed for the considered class of material models. The applicability of the different material models is studied both for a material element and for complicated structures. The application of the multimodel approach in numerical computations has demonstrated the possibility of reliably predicting the stress-strain response under a wide variety of combined loadings.
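As a toy illustration of the "simplest variant" end of such a model hierarchy (not the authors' library), a one-dimensional Perzyna-type overstress model with stress and viscoplastic strain as internal state variables can be integrated explicitly; all material constants below are assumed placeholder values:

```python
def perzyna_step(strain_rate, sigma, eps_vp, dt, E=2.0e5, sigma_y=250.0, eta=1.0e3):
    """One explicit time step of a 1-D Perzyna-type elasto-viscoplastic model.
    Internal state variables: stress sigma [MPa] and viscoplastic strain eps_vp.
    Viscous flow is driven by the overstress beyond the yield stress sigma_y;
    E is Young's modulus and eta a viscosity parameter (placeholder values)."""
    over = abs(sigma) - sigma_y
    vp_rate = (over / eta) * (1.0 if sigma >= 0 else -1.0) if over > 0.0 else 0.0
    eps_vp += vp_rate * dt
    sigma += E * (strain_rate - vp_rate) * dt
    return sigma, eps_vp

# Drive at a constant total strain rate; the stress relaxes towards
# sigma_y + eta * strain_rate once the overstress flow balances the loading.
sigma, eps_vp = 0.0, 0.0
for _ in range(10000):
    sigma, eps_vp = perzyna_step(1.0e-3, sigma, eps_vp, 1.0e-3)
```

A model library of the kind described above would wrap many such update rules, from this scalar sketch up to tensorial internal-state-variable formulations, behind one common integration interface.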
In the given paper, a generalized formulation of the problem of computer modelling of the interaction of complex composite structures with different types of dynamic loads and effects is discussed, together with an analysis of the use of some universal computing systems for the solution of such problems. It is also shown that quantification of the dynamic models of complex composite systems with variable structure, depending on the character and intensity of the effects, is necessary. Different variants of modelling the joints and spatial structural elements are suggested, which makes it possible to consider the complex modes of coupled bending-torsional oscillations of structures such as bridges, towers, and high-rise buildings. The peculiarities of modelling and testing some problems of object aerodynamics and of the interaction of framework structures with shock and moving loads are considered. Examples of the dynamic analysis of complex composite structures are shown, achieved by means of special methods for introducing the real excitations and loads of the in-service analog object into the computational model. The suggested models have found wide use both in the design of new structures and in the dynamic monitoring of structures in service.
A framed-tube system with multiple internal tubes is analysed using an orthotropic box beam analogy approach in which each tube is individually modelled by a box beam that accounts for the flexural and shear deformations, as well as the shear-lag effects. A simple numerical modelling technique is proposed for estimating the shear-lag phenomenon in tube structures with multiple internal tubes. The proposed method idealizes the framed-tube structure with multiple internal tubes as equivalent multiple tubes, each composed of four equivalent orthotropic plate panels. The numerical analysis is based on the minimum potential energy principle in conjunction with the variational approach. The shear-lag phenomenon of such structures is studied taking into account the additional bending moments in the tubes. Detailed work is carried out through the numerical analysis of the additional bending moment. A moment factor is further introduced to identify the shear-lag phenomenon along with the additional moment.
When working on urban planning projects there are usually multiple aspects to consider. Often these aspects are contradictory and it is not possible to choose one over the other; instead, each needs to be fulfilled as well as possible. Planners typically draw on past experience when subjectively prioritising which aspects to consider, and with which degree of importance, in their planning concepts. This practice, although understandable, places power and authority in the hands of people with varying degrees of expertise, which means that the best possible solution is not always found, because it is either not sought or the problem is regarded as too complex for human capabilities. To improve this situation, the project presented here shows the potential of multi-criteria optimisation algorithms using the example of a new housing layout for an urban block. In addition, it is shown how self-organizing maps can be used to visualise multi-dimensional solution spaces in an easily analysable and comprehensible form.
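A self-organizing map of the kind mentioned above can be sketched minimally as follows; this 1-D toy implementation (all parameters assumed) only illustrates the principle of projecting multi-dimensional solution vectors onto a low-dimensional grid of units:

```python
import math
import random

def train_som(data, n_units=8, epochs=100, seed=1):
    """Minimal 1-D self-organizing map (illustrative sketch).
    Maps multi-dimensional vectors onto a line of units so that
    neighbouring units come to represent similar solutions."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)                            # decaying learning rate
        radius = max(1.0, (n_units / 2.0) * (1.0 - t / epochs))  # shrinking neighbourhood
        for x in data:
            # best matching unit: the unit closest to the input vector
            bmu = min(range(n_units),
                      key=lambda i: sum((units[i][d] - x[d]) ** 2 for d in range(dim)))
            for i in range(n_units):
                # pull each unit towards x, weighted by grid distance to the BMU
                h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                for d in range(dim):
                    units[i][d] += lr * h * (x[d] - units[i][d])
    return units
```

After training, each candidate layout (encoded as a vector of criteria values) can be assigned to its best-matching unit, giving an ordered, easily inspectable arrangement of the solution space.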
The human body is surrounded by a micro‐climate which results from its convective release of heat. In this study, the air temperature and flow velocity of this micro‐climate were measured in a climate chamber at various room temperatures, using a thermal manikin simulating the heat release of the human being. Different techniques (Particle Streak Tracking, thermography, anemometry, and thermistors) were used for measurement and visualization. The manikin surface temperature was adjusted to the particular indoor climate based on simulations with a thermoregulation model (UCBerkeley Thermal Comfort Model). We found that generally, the micro‐climate is thinner at the lower part of the torso, but expands going up. At the head, there is a relatively thick thermal layer, which results in an ascending plume above the head. However, the micro‐climate shape strongly depends not only on the body segment, but also on boundary conditions: the higher the temperature difference between the surface temperature of the manikin and the air temperature, the faster the air flow in the micro‐climate. Finally, convective heat transfer coefficients strongly increase with falling room temperature, while radiative heat transfer coefficients decrease. The type of body segment strongly influences the convective heat transfer coefficient, while only minimally influencing the radiative heat transfer coefficient.
One of the frequently examined design recommendations in multimedia learning is the personalization principle. Based on empirical evidence, this principle states that using personalised messages in multimedia learning is more beneficial than using formal language (e.g. using ‘you’ instead of ‘the’). Although there is evidence that these slight changes in language style affect learning, motivation, and perceived cognitive load, it remains unclear (1) whether the positive effects of personalised language transfer to all kinds of learning material (e.g. specific, potentially aversive health issues) and (2) what the underlying processes (e.g. attention allocation) of the personalization effect are. German university students (N = 37) learned the symptoms and causes of cerebral haemorrhages with either a formal or a personalised version of the learning material. The analysis revealed results comparable to the few existing previous studies, indicating an inverted personalization effect for potentially aversive learning material. This effect was specifically revealed in decreased average fixation duration and number of fixations exclusively on the images in the personalised compared to the formal version. This result can be seen as an indicator of an inverted effect of personalization at the level of visual attention.
Research into bio-based epoxy resins has intensified in recent decades. Here, it is of great importance to use raw materials that do not compete with food production. In addition, the performance of the newly developed materials should be comparable to that of conventional products. Possible starting materials are lignin degradation products, such as vanillin and syringaldehyde, for which new synthesis routes to the desired products must be found and their properties determined. In this article, the first synthesis of two amine hardeners, starting from vanillin and syringaldehyde and using the Smiles rearrangement reaction, is reported. The amine hardeners were mixed with bisphenol A diglycidyl ether, and the curing was compared to isophorone diamine, 4,4′-diaminodiphenyl sulfone, and 4-aminobenzylamine by means of differential scanning calorimetry. It was found that the two amines prepared are cold-curing. As TG-MS studies showed, the thermal stability of at least one of the polymers prepared with the potentially bio-based amines is comparable to that of the polymer prepared with isophorone diamine, and similar degradation products are formed during pyrolysis.
Performance assessment of a ductless personalized ventilation system using a validated CFD model
(2018)
The aim of this study is twofold: to validate a computational fluid dynamics (CFD) model, and then to use the validated model to evaluate the performance of a ductless personalized ventilation (DPV) system. To validate the numerical model, a series of measurements was conducted in a climate chamber equipped with a thermal manikin. Various turbulence models, settings, and options were tested; simulation results were compared to the measured data to determine the turbulence model and solver settings that achieve the best agreement between the measured and simulated values. Subsequently, the validated CFD model was used to evaluate the thermal environment and indoor air quality in a room equipped with a DPV system combined with displacement ventilation. Results from the numerical model were then used to quantify thermal sensation and comfort using the UC Berkeley thermal comfort model.
The described study aims to find correlations between urban spatial configurations and human emotions. To this end, the authors measured people’s emotions while they walked along a path in an urban area, using an instrument that measures skin conductance and skin temperature; the corresponding locations of the test persons (n = 13) were recorded using a GPS tracker. The results are interpreted and categorized as measures of positive and negative emotional arousal in order to evaluate the technical and methodological process. The test results offer initial evidence that certain spaces or spatial sequences do cause positive or negative emotional arousal while others are relatively neutral. To achieve the goal of the study, the outcome was used as a basis for testing correlations between people’s emotional responses and urban spatial configurations represented by isovist properties of the urban form. Using their model, the authors can explain negative emotional arousal for certain places, but they could not find a model that predicts emotional responses for individual spatial configurations.
In this paper we present a theoretical background for a coupled analytical–numerical approach to model a crack propagation process in two-dimensional bounded domains. The goal of the coupled analytical–numerical approach is to obtain the correct solution behaviour near the crack tip with the help of an analytical solution constructed using tools of complex function theory, and to couple it continuously with the finite element solution in the region far from the singularity. In this way, crack propagation can be modelled without remeshing. Possible directions of crack growth can be calculated through the minimization of the total energy composed of the potential energy and the dissipated energy based on the energy release rate. Within this setting, an analytical solution of a mixed boundary value problem based on complex analysis and conformal mapping techniques is presented in a circular region containing an arbitrary crack path. More precisely, the linear elastic problem is transformed into a Riemann–Hilbert problem in the unit disk for holomorphic functions. Utilising the advantages of the analytical solution in the region near the crack tip, the total energy can be evaluated within short computation times for various crack kink angles and lengths, leading to a potentially efficient way of computing the minimization procedure. To this end, the paper presents a general strategy for the new coupled approach to crack propagation modelling. Additionally, we discuss obstacles in the way of a practical realisation of this strategy.
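In the simplest discrete setting, the minimization over candidate kink angles and increment lengths described above amounts to a grid search over the evaluated total energies. The surrogate energy function below is purely hypothetical; only the search pattern reflects the strategy of the paper:

```python
import math

def best_kink(total_energy, angles, lengths):
    """Grid search for the crack increment (kink angle, length) that
    minimises the total energy (potential plus dissipated energy)."""
    return min(((a, l) for a in angles for l in lengths),
               key=lambda al: total_energy(*al))

# Purely hypothetical surrogate energy with a minimum near a = 0.5 rad:
E = lambda a, l: (a - 0.5) ** 2 - 0.1 * l + 0.3 * l ** 2
angles = [i * math.pi / 36 for i in range(-18, 19)]   # -90 to 90 degrees in 5 degree steps
lengths = [0.05 * k for k in range(1, 11)]            # candidate increment lengths
```

In the coupled approach, each energy evaluation would use the fast analytical near-tip solution, which is what makes scanning many candidate increments affordable.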
When predicting sound pressure levels induced by structure-borne sound sources and describing the sound propagation path through the building structure as exactly as possible, it is necessary to characterize the vibration behavior of the structure-borne sound sources. In this investigation, the characterization of structure-borne sound sources was performed using the two-stage method (TSM) described in EN 15657. Four different structure-borne sound sources were characterized and subsequently installed in a lightweight test stand. The resulting sound pressure levels in an adjacent receiving room were measured. In the second step, sound pressure levels were predicted according to EN 12354-5 based on the parameters of the structure-borne sound sources. Subsequently, the predicted and the measured sound pressure levels were compared to obtain reliable statements on the achievable accuracy when using source quantities determined by TSM with this prediction method.
Coronary Artery Disease Diagnosis: Ranking the Significant Features Using a Random Trees Model
(2020)
Heart disease is one of the most common diseases among middle-aged citizens. Among the vast number of heart diseases, coronary artery disease (CAD) is a common cardiovascular disease with a high death rate. The most popular tool for diagnosing CAD is medical imaging, e.g., angiography. However, angiography is costly and associated with a number of side effects. Hence, the purpose of this study is to increase the accuracy of coronary heart disease diagnosis by selecting significant predictive features in order of their ranking. We propose an integrated method using machine learning: the methods of random trees (RTs), the C5.0 decision tree, support vector machine (SVM), and the Chi-squared automatic interaction detection (CHAID) decision tree are used in this study. The proposed method shows promising results, and the study confirms that the RTs model outperforms the other models.
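Tree-based methods rank features by how much a split on each feature reduces impurity. As a simplified, self-contained stand-in for the random-trees importance ranking (not the study's actual pipeline), a single-split information-gain score per feature can be sketched as:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def stump_gain(xs, ys):
    """Best information gain achievable by one threshold split on a feature."""
    base = entropy(ys)
    best = 0.0
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(ys)
        best = max(best, base - weighted)
    return best

def rank_features(X, y, names):
    """Rank feature names by their single-split information gain, best first."""
    gains = [(stump_gain([row[j] for row in X], y), names[j])
             for j in range(len(names))]
    return [name for gain, name in sorted(gains, reverse=True)]
```

Full tree ensembles aggregate such gains over many splits and bootstrapped trees; the stump score above only illustrates the ranking principle.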
We present StarWatch, our application for the real-time analysis of radio astronomical data in a virtual environment. Serving as an interface to radio astronomical databases or applied to live data from radio telescopes, the application supports various data filters measuring the signal-to-noise ratio (SNR), Doppler drift, and the degree of signal localization on the celestial sphere, along with other useful tools for signal extraction and classification. Originally designed for the database of narrow-band signals from the SETI Institute (setilive.org), the application has recently been extended for the detection of wide-band periodic signals, necessary for the search for pulsars. We also address the detection of weak signals possessing arbitrary waveforms and present several data filters suitable for this purpose.
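A signal-to-noise-ratio filter of the kind listed above can be sketched minimally; the 10 dB detection threshold below is an assumed placeholder, not a value from the application:

```python
import math

def snr_db(samples, noise_floor):
    """Estimate SNR in dB from mean signal power versus mean noise power."""
    p_sig = sum(x * x for x in samples) / len(samples)
    p_noise = sum(x * x for x in noise_floor) / len(noise_floor)
    return 10.0 * math.log10(p_sig / p_noise)

def passes_filter(samples, noise_floor, threshold_db=10.0):
    """Keep only candidate signals whose SNR exceeds a detection threshold."""
    return snr_db(samples, noise_floor) >= threshold_db
```

In a pipeline like the one described, such a filter would be one stage among several (Doppler drift, sky localization) that together separate candidate signals from interference.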
The laser beam is a small, flexible and fast polishing tool. With laser radiation it is possible to finish many contours or geometries on quartz glass surfaces in a very short time. The temperature that develops during polishing determines the achievable surface smoothing and, as a negative side effect, causes material tensions. To find out which parameters are important for the laser polishing process and the resulting surface roughness, and to estimate material tensions, temperature simulations and extensive polishing experiments were carried out. During these experiments, starting and machining parameters were varied and temperatures were measured contact-free.
We demonstrate how logical operations can be implemented in ensembles of protoplasmic tubes of acellular slime mold Physarum polycephalum. The tactile response of the protoplasmic tubes is used to actuate analogs of two- and four-input logical gates and memory devices. The slime mold tube logical gates display results of logical operations by blocking flow in mechanically stimulated tube fragments and redirecting the flow to output tube fragments. We demonstrate how XOR and NOR gates are constructed. We also exemplify circuits of hybrid gates and a memory device. The slime mold based gates are non-electronic, simple and inexpensive, and several gates can be realized simultaneously at sites where protoplasmic tubes merge.