Document Type: Article (57)
Institute
- Institut für Strukturmechanik (ISM) (33)
- Professur Bauphysik (6)
- Junior-Professur Organisation und vernetzte Medien (3)
- Professur Bauchemie und Polymere Werkstoffe (3)
- Professur Sozialwissenschaftliche Stadtforschung (2)
- Professur Stadtplanung (2)
- Bauhaus-Institut für zukunftsweisende Infrastruktursysteme (b.is) (1)
- Graduiertenkolleg 1462 (1)
- Professur Angewandte Mathematik (1)
- Professur Immobilienwirtschaft und -management (1)
Keywords
- OA-Publikationsfonds2020 (25)
- Maschinelles Lernen (17)
- Machine learning (12)
- Erdbeben (6)
- Deep learning (5)
- big data (5)
- Raumklima (4)
- rapid visual screening (4)
- computational fluid dynamics (3)
- earthquake safety assessment (3)
- machine learning (3)
- random forest (3)
- Behaglichkeit (2)
- Brücke (2)
- Fluid (2)
- Fotovoltaik (2)
- Fuzzy-Logik (2)
- Geopolymere (2)
- Intelligente Stadt (2)
- Internet of things (2)
- Journalismus (2)
- Neuronales Netz (2)
- Strömungsmechanik (2)
- Wohnen (2)
- Wohnungspolitik (2)
- artificial intelligence (2)
- artificial neural networks (2)
- buildings (2)
- damaged buildings (2)
- data science (2)
- ductless personalized ventilation (2)
- earthquake (2)
- smart cities (2)
- soft computing techniques (2)
- support vector machine (2)
- thermal comfort (2)
- urban morphology (2)
- vulnerability assessment (2)
- Abwasserwirtschaft (1)
- Aerodynamik (1)
- Akustische Laufzeit-Tomographie (1)
- Algorithmus (1)
- Anthropozän (1)
- Artificial neural network (1)
- Background-oriented schlieren (1)
- Balkan (1)
- Balkanroute (1)
- Bauklimatik (1)
- Bauphysik (1)
- Bayes-Verfahren (1)
- Belüftung (1)
- Bildanalyse (1)
- Bodentemperatur (1)
- Bridge (1)
- Bubble column reactor (1)
- Building Information Modeling (1)
- ContikiMAC (1)
- Convective indoor air flow (1)
- Cross-correlation (1)
- Datenmodell (1)
- Deutschland (1)
- Deutschland <Östliche Länder> (1)
- Digital image correlation (1)
- Digitalisierung (1)
- Dirac-Operator (1)
- Earthquake (1)
- Energieeffizienz (1)
- Entrepreneurship (1)
- Erdbebensicherheit (1)
- Erneuerbare Energien (1)
- Europäische Union (1)
- Fernerkundung (1)
- Finite-Elemente-Methode (1)
- Flow visualization (1)
- Flächengerechtigkeit (1)
- Flächenverbrauch (1)
- Flüchtlingspolitik (1)
- Funktionentheorie (1)
- Fuzzy Logic (1)
- Fuzzy-Regelung (1)
- Gaussian process regression (1)
- Gebäude (1)
- Geoinformatik (1)
- Geometrie (1)
- Gerechtigkeit (1)
- Gesundheitsinformationssystem (1)
- Gesundheitswesen (1)
- Grundwasser (1)
- Größenverhältnis (1)
- Housing (1)
- Housing Policy (1)
- Human thermal plume (1)
- Hydrological drought (1)
- IAQ (1)
- IOT (1)
- Infrastructures (1)
- Ingenieurwissenschaften (1)
- Inspektion (1)
- Instrument (1)
- Internet der Dinge (1)
- Internet der dinge (1)
- K-nearest neighbors (1)
- KNN (1)
- Kontamination (1)
- Konzeptkunst (1)
- Körper (1)
- Kühlkörper (1)
- Künste (1)
- Künstliche Intelligenz (1)
- Land surface temperature (1)
- Leistungsverhalten (1)
- Literaturrecherche (1)
- Luftqualität (1)
- Lüftung (1)
- M5 model tree (1)
- Machine Learning (1)
- Marmara Region (1)
- Mensch (1)
- Metakaolin (1)
- Mieten (1)
- Morphologie (1)
- Multi-criteria decision making (1)
- Nachhaltigkeit (1)
- Nanofluid (1)
- Nanomaterials (1)
- Nanostrukturiertes Material (1)
- Nasskühlung (1)
- Naturkatastrophe (1)
- Neuartige Sanitärsysteme (1)
- Neue Medien (1)
- Nitratbelastung (1)
- OA-Publikationsfonds2019 (1)
- Oberflächentemperatur (1)
- Ostdeutschland (1)
- Performance (1)
- Peripherisierungsforschung (1)
- Polymere (1)
- Postpolitik (1)
- Postwachstumsstadt (1)
- Public-Private Partnerships (1)
- RSSI (1)
- Randwertproblem (1)
- Rapid Visual Screening (1)
- Raumluftströmungen (1)
- Raumordnung (1)
- Renewable energy (1)
- Responsibilisierung (1)
- Schlierenspiegel (1)
- Selbstgenutztes Wohneigentum (1)
- Siedlungswasserwirtschaft (1)
- Simulation (1)
- Smog (1)
- Social Housing (1)
- Solar (1)
- Sozialer Wohnungsbau (1)
- Stadtplanung (1)
- Steuerungsansätze (1)
- Strukturmechanik (1)
- Strömung (1)
- Städtischer Wohnungsmarkt (1)
- Superplasticizer (1)
- Sustainability (1)
- Temperatur (1)
- Thermal conductivity (1)
- Transformation (1)
- Transformation risks (1)
- Transformationsrisiken (1)
- Umweltbelastung (1)
- Umweltgerechtigkeit (1)
- Umweltveränderung (1)
- Vulnerability (1)
- Vulnerability assessment (1)
- Wastewater management (1)
- Welfare State (1)
- Wettbewerb (1)
- Wohnfläche (1)
- Wohnraum (1)
- Wohnungsbau (1)
- Wohnungseigentum (1)
- Wohnungsfrage (1)
- Wärmeleitfähigkeit (1)
- Zement (1)
- action recognition (1)
- adaptive neuro-fuzzy inference system (ANFIS) (1)
- adaptive pushover (1)
- alumosilicate (1)
- ant colony optimization algorithm (ACO) (1)
- artificial neural network (1)
- back-pressure (1)
- battery (1)
- berlinite (1)
- cement (1)
- classification (1)
- classifier (1)
- clear channel assessments (1)
- cluster density (1)
- cluster shape (1)
- clustering (1)
- competition (1)
- computation (1)
- computational fluid dynamics (CFD) (1)
- congestion control (1)
- coronary artery disease (1)
- cross-contamination (1)
- deep learning neural network (1)
- depletion method (1)
- desk fan (1)
- digital native news media (1)
- digital-born news media (1)
- digitization (1)
- dimensionality reduction (1)
- discrete Dirac operator (1)
- discrete boundary value problems (1)
- discrete monogenic functions (1)
- duty-cycles (1)
- earthquake damage (1)
- earthquake vulnerability assessment (1)
- energy consumption (1)
- energy efficiency (1)
- ensemble model (1)
- entrepreneurial journalism (1)
- experimental validation (1)
- extreme events (1)
- extreme pressure (1)
- firefly optimization algorithm (1)
- fisher-information matrix (1)
- fixed effects regression (1)
- flow pattern (1)
- fog computing (1)
- food informatics (1)
- fractional-order control (1)
- fuzzy decision making (1)
- fuzzy set qualitative comparative analysis (1)
- geoinformatics (1)
- geopolymer (1)
- ground water contamination (1)
- gully erosion susceptibility (1)
- health (1)
- health informatics (1)
- heart disease diagnosis (1)
- heat sink (1)
- human blob (1)
- human body proportions (1)
- human thermal plume (1)
- hybrid machine learning (1)
- hybrid machine learning model (1)
- hydraulic jump (1)
- hydrological model (1)
- hydrology (1)
- image processing (1)
- indoor air quality (1)
- industry 4.0 (1)
- journalism (1)
- journalism theories (1)
- künstlerischer Aktivismus (1)
- least square support vector machine (LSSVM) (1)
- longitudinal dispersion coefficient (1)
- mass spectrometry (1)
- mathematical modeling (1)
- mean-squared error (1)
- media performance (1)
- mitigation (1)
- nanofluid (1)
- natural hazard (1)
- neural networks (NNs) (1)
- news start-ups (1)
- particle swarm optimization (1)
- personalisierte Lüftung (1)
- personalized ventilation (1)
- photovoltaic (1)
- photovoltaic-thermal (PV/T) (1)
- physical activities (1)
- polymer adsorption (1)
- practice theories (1)
- practice theory (1)
- precipitation (1)
- predictive model (1)
- principal component analysis (1)
- public health (1)
- public service media (1)
- public space (1)
- received signal strength indicator (1)
- reinforcement learning (1)
- remote sensing (1)
- residential buildings (1)
- rice (1)
- rivers (1)
- rule based classification (1)
- schlieren imaging (1)
- schlieren velocimetry (1)
- seasonal precipitation (1)
- seismic assessment (1)
- seismic hazard analysis (1)
- seismic risk estimation (1)
- seismic vulnerability (1)
- signal processing (1)
- site-specific spectrum (1)
- smart sensors (1)
- sodium silicate solution (1)
- soil temperature (1)
- spatial analysis (1)
- spatiotemporal database (1)
- spearman correlation coefficient (1)
- square root cubature Kalman filter (1)
- standard deviation of pressure fluctuations (1)
- statistical analysis (1)
- statistical coefficient of the probability distribution (1)
- stilling basin (1)
- support vector regression (1)
- sustainability (1)
- theory development (1)
- thermisches Empfinden (1)
- tower-like structures (1)
- tracer gas (1)
- type-3 fuzzy systems (1)
- urban health (1)
- urban sustainability (1)
- visible spectrophotometry (1)
- water quality (1)
- wavelet transform (1)
- wireless sensor network (1)
- wireless sensor networks (1)
- Öffentlich-private Partnerschaft (1)
- Öffentlich-rechtlicher Rundfunk (1)
- äquivalente Temperatur (1)
Year of publication: 2020 (57)
Due to the importance of identifying crop cultivars, the advancement of accurate methods for their assessment is essential. Existing methods for identifying rice cultivars are mainly time-consuming, costly, and destructive, so the development of novel methods is highly beneficial. The aim of the present research is to classify common rice cultivars in Iran based on color, morphologic, and texture properties using artificial intelligence (AI) methods. Digital images of 13 Iranian rice cultivars in three forms (paddy, brown, and white) were analyzed through pre-processing and segmentation in MATLAB. Ninety-two features, including 60 color, 14 morphologic, and 18 texture properties, were extracted for each cultivar. Next, the normal distribution of the data was evaluated, and significant differences between cultivars across all features were assessed using analysis of variance; the least significant difference (LSD) test was performed for a more precise pairwise comparison. To reduce the data dimensions and focus on the most effective components, principal component analysis (PCA) was employed. The accuracy of cultivar separation calculated with discriminant analysis (DA) was 89.2%, 87.7%, and 83.1% for paddy, brown rice, and white rice, respectively. To identify and classify the cultivars, a multilayer perceptron neural network was trained on the most effective components; it identified and classified all of the mentioned cultivars with 100% accuracy. Hence, an integrated approach of image processing and pattern recognition methods, such as statistical classification and artificial neural networks, can be used for the identification and classification of rice cultivars.
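The pipeline of dimensionality reduction followed by a multilayer perceptron can be sketched as follows. This is a minimal illustration on synthetic data: the 92-feature and 13-class counts match the abstract, but the data itself, the number of retained components, and the network size are assumptions, not the study's values.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 92 color/morphologic/texture features of 13 cultivars.
X, y = make_classification(n_samples=650, n_features=92, n_informative=20,
                           n_classes=13, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA keeps the most effective components; the MLP then classifies the cultivars.
model = make_pipeline(PCA(n_components=20),
                      MLPClassifier(hidden_layer_sizes=(32,),
                                    max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)  # held-out classification accuracy
```

On the study's real features the reported network accuracy was 100%; on this synthetic stand-in the exact score depends on the generated data.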
Recently, the demand for residence and for the use of urban infrastructure has increased, raising the risk natural calamities pose to human lives. Occupancy demand has rapidly increased the construction rate, while inadequately designed structures remain highly vulnerable. Buildings constructed before the development of seismic codes are additionally susceptible to earthquake vibrations, and structural collapse causes economic losses as well as the loss of human lives. Applying detailed theoretical methods to analyze structural behavior is expensive and time-consuming, so a rapid vulnerability assessment method for checking structural performance is necessary for future developments. Such a process is known as Rapid Visual Screening (RVS), a technique developed to identify, inventory, and screen potentially hazardous structures. When poor construction quality leaves some of the required parameters unavailable, the RVS process becomes tedious; in such situations, multiple-criteria decision-making (MCDM) methods open a new gateway for seismic vulnerability assessment, since the parameters required by RVS can be treated as MCDM criteria. MCDM evaluates multiple conflicting criteria in decision making and is applied in several fields. This paper aims to bridge the gap between RVS and MCDM. Furthermore, to define the correlation between these techniques, the methodologies from the Indian, Turkish, and Federal Emergency Management Agency (FEMA) codes were implemented, and the resulting assessments of seismic vulnerability were observed and compared.
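The idea of feeding RVS parameters into an MCDM scheme can be illustrated with a minimal weighted-sum ranking. The buildings, criteria, and weights below are hypothetical and are not taken from the Indian, Turkish, or FEMA methodologies.

```python
import numpy as np

# Hypothetical RVS parameters (normalized 0-1, higher = more vulnerable) for
# three buildings; columns: soft storey, plan irregularity, building age, soil class.
decision_matrix = np.array([
    [0.8, 0.3, 0.9, 0.5],
    [0.2, 0.6, 0.4, 0.7],
    [0.5, 0.5, 0.5, 0.5],
])

# Criteria weights as an MCDM method (e.g. AHP/ANP) might supply them; assumed here.
weights = np.array([0.4, 0.2, 0.25, 0.15])

scores = decision_matrix @ weights   # weighted-sum vulnerability score per building
ranking = np.argsort(scores)[::-1]   # indices sorted from most to least vulnerable
```

Real MCDM methods (TOPSIS, ANP, etc.) replace the simple weighted sum with more elaborate aggregation, but the structure (a decision matrix scored against criterion weights) is the same.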
A Machine Learning Framework for Assessing Seismic Hazard Safety of Reinforced Concrete Buildings
(2020)
Although averting a seismic disturbance and its physical, social, and economic disruption is practically impossible, advancements in computational science and numerical modeling equip humanity to predict its severity, understand the outcomes, and prepare for post-disaster management. Many buildings amidst developed metropolitan areas are aging yet still in service; they were designed before national seismic codes were established or before construction regulations were introduced. In that case, risk reduction is significant for developing alternatives and designing suitable models to enhance the existing structures' performance. Such models can classify the risks and casualties related to possible earthquakes and support emergency preparation. It is thus crucial to recognize structures that are susceptible to earthquake vibrations and should be prioritized for retrofitting. However, studying each building's behavior under seismic actions through structural analysis is unrealistic because of the rigorous computations, long duration, and substantial expenditure. This calls for a simple, reliable, and accurate process known as Rapid Visual Screening (RVS), which serves as a primary screening platform using an optimum number of seismic parameters and predetermined performance damage conditions for structures. In this study, the damage classification technique was examined, and the efficacy of a Machine Learning (ML) method for damage prediction via a Support Vector Machine (SVM) model was explored. The ML model was trained and tested separately on damage data from four different earthquakes, in Ecuador, Haiti, Nepal, and South Korea; each dataset consists of a varying number of input samples and eight performance modifiers. Based on the study and the results, the SVM model classifies the given input data into the corresponding damage classes and accomplishes the hazard safety evaluation of buildings.
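A minimal sketch of the SVM damage-classification idea, using synthetic stand-ins for the eight performance modifiers; the labels, the class-generating rule, and the kernel settings are assumptions for illustration, not the study's data or configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: 300 buildings, eight performance modifiers each;
# labels are damage classes (0 = none/light, 1 = moderate, 2 = severe).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)

# Scaling before the RBF-kernel SVM is standard practice for SVM inputs.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
pred = clf.predict(X)
```

In the study, separate models of this shape were trained and tested per earthquake dataset; here a single synthetic set stands in for all of them.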
Tall buildings have become an integral part of cities despite all their pros and cons. Some existing tall buildings suffer from unsuitable locations, with problems that include increasing density, imposing traffic on urban thoroughfares, and blocking view corridors; some have destroyed desirable views of the city. In this research, different criteria were chosen, such as environment, access, socio-economic factors, land use, and physical context. These criteria and sub-criteria were prioritized and weighted by the analytic network process (ANP) based on experts' opinions, using Super Decisions V2.8 software. In parallel, layers corresponding to the sub-criteria were created in ArcGIS 10.3, and a locating plan was then produced via a weighted overlay (map algebra). In the next step, seven hypothetical tall buildings (20 stories) in the best part of the locating plan were considered to evaluate how visible these hypothetical buildings would be (fuzzy visibility) from streets and open spaces throughout the city. These processes were modeled in MATLAB, and the final fuzzy visibility plan was created in ArcGIS. The fuzzy visibility results can help city managers and planners choose which locations are suitable for tall buildings and how much visibility may be appropriate. The proposed model can locate tall buildings based on technical and visual criteria in the future development of the city, and it can be widely used in any city as long as the criteria and weights are localized.
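The weighted overlay (map algebra) step can be sketched outside ArcGIS with plain arrays: each raster scores one sub-criterion, and the weighted sum yields a suitability surface. The rasters and ANP-style weights below are invented for illustration.

```python
import numpy as np

# Tiny 2x2 suitability rasters (0-10) for three assumed sub-criteria.
env     = np.array([[3, 7], [8, 2]], dtype=float)   # environment
access  = np.array([[5, 5], [9, 1]], dtype=float)   # access
landuse = np.array([[2, 8], [6, 4]], dtype=float)   # land use

# Weights as an ANP prioritization might supply them (assumed, must sum to 1).
weights = {"env": 0.5, "access": 0.3, "landuse": 0.2}

# Weighted overlay: cell-wise weighted sum of the criterion rasters.
suitability = (weights["env"] * env
               + weights["access"] * access
               + weights["landuse"] * landuse)
best_cell = np.unravel_index(np.argmax(suitability), suitability.shape)
```

In the actual workflow each raster covers the whole city at fine resolution; the arithmetic per cell is exactly this weighted sum.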
The K-nearest neighbors (KNN) machine learning algorithm is a well-known non-parametric classification method. However, like other traditional data mining methods, applying it to big data comes with computational challenges. KNN determines the class of a new sample based on the classes of its nearest neighbors, but identifying those neighbors in a large amount of data imposes a computational cost so large that the method is no longer applicable on a single computing machine. One technique proposed to make classification methods applicable to large datasets is pruning. LC-KNN is an improved KNN method that first clusters the data into smaller partitions using the K-means clustering method and then, for each new sample, applies KNN only on the partition whose center is nearest to the sample. However, because the clusters have different shapes and densities, selecting the appropriate cluster is a challenge. In this paper, an approach is proposed to improve the pruning phase of the LC-KNN method by taking these factors into account. The proposed approach helps to choose a more appropriate cluster in which to look for the neighbors, thus increasing the classification accuracy. Its performance was evaluated on different real datasets; the experimental results show the effectiveness of the proposed approach and its higher classification accuracy and lower time cost in comparison to other recent relevant methods.
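The cluster-then-prune idea behind LC-KNN can be sketched as follows; this is a toy version on synthetic blobs that selects the partition by nearest center, not the paper's improved selection rule that also accounts for cluster shape and density.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import make_blobs

# Synthetic labelled data, partitioned once with k-means.
X, y = make_blobs(n_samples=600, centers=4, random_state=0)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

def pruned_knn_predict(query, k=5):
    # Prune: keep only the partition whose center is nearest to the query,
    # then run KNN inside that partition instead of the full dataset.
    c = int(np.argmin(np.linalg.norm(km.cluster_centers_ - query, axis=1)))
    mask = km.labels_ == c
    knn = KNeighborsClassifier(n_neighbors=k).fit(X[mask], y[mask])
    return knn.predict(query.reshape(1, -1))[0]

label = pruned_knn_predict(X[0])
```

The cost saving comes from searching neighbors in one partition of roughly n/k points; the paper's contribution is making the choice of that partition more robust when clusters differ in shape and density.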
The amount of styrene acrylate copolymer (SA) particles adsorbed on cementitious surfaces at the early stage of hydration was quantitatively determined using three different methodological approaches: the depletion method, visible spectrophotometry (VIS), and thermogravimetry coupled with mass spectrometry (TG–MS). Considering the advantages and disadvantages of each method, including the sample preparation each requires, the results for four polymer-modified cement pastes, varying in polymer content and cement fineness, were evaluated.
To some extent, significant discrepancies in the adsorption degrees were observed. There is a tendency for significantly lower amounts of adsorbed polymer to be identified with TG–MS than with the depletion method, with spectrophotometrically determined values lying between these extremes. This tendency was found for three of the four cement pastes examined and originates from sample preparation and methodical limitations.
The main influencing factor is the distortion of the polymer concentration in the liquid phase during centrifugation, caused by interactions at the interface between sediment and supernatant. The newly developed method using TG–MS for the quantification of SA particles proved suitable for dealing with these issues: instead of the fluid phase, the sediment is examined with regard to its polymer content, on which the influence of centrifugation is considerably lower.
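The depletion method's core arithmetic infers the adsorbed amount from the drop in polymer concentration in the liquid phase. A minimal sketch with purely illustrative numbers (all values assumed, not from the study):

```python
# Depletion-method arithmetic: adsorbed polymer is inferred from the difference
# between initial and residual concentration in the liquid phase.
c_initial = 10.0     # g/L polymer in the mixing water before contact with cement (assumed)
c_residual = 6.5     # g/L measured in the centrifuged supernatant (assumed)
volume = 0.050       # L of liquid phase (assumed)
cement_mass = 100.0  # g of cement (assumed)

adsorbed_per_gram = (c_initial - c_residual) * volume / cement_mass  # g polymer / g cement
adsorption_degree = (c_initial - c_residual) / c_initial             # fraction adsorbed
```

The abstract's point is that `c_residual` itself can be distorted by centrifugation effects at the sediment–supernatant interface, which propagates directly into both derived quantities; examining the sediment via TG–MS sidesteps this.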
Classical Internet of Things routing and wireless sensor networks can provide more precise monitoring of the covered area due to the higher number of utilized nodes. Because of the limitations of the shared transfer medium, many nodes in the network are prone to collisions during simultaneous transmissions. Medium access control protocols are usually practical only in networks with low traffic that are not subjected to external noise from adjacent frequencies. There are preventive, detection, and control solutions to congestion management in the network, all of which are the focus of this study. In the congestion prevention phase, the proposed method chooses the next step of the path using a fuzzy decision-making system to distribute network traffic via optimal paths. In the congestion detection phase, a dynamic approach to queue management was designed to detect congestion in the least amount of time and prevent collisions. In the congestion control phase, the back-pressure method was used, based on the quality of the queue, to decrease the probability of routing through a pre-congested node. The main goals of this study are to balance energy consumption across network nodes, reduce the rate of lost packets, and increase the quality of service in routing. Simulation results show that the proposed Congestion Control Fuzzy Decision Making (CCFDM) method improves routing parameters compared to recent algorithms.
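The fuzzy decision-making step for choosing an uncongested, energy-rich next hop can be sketched with triangular membership functions. The membership shapes, input values, and min-style conjunction below are illustrative assumptions, not the CCFDM design itself.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def next_hop_score(queue_occupancy, residual_energy):
    # Degree to which the node is "uncongested" and "energy-rich" (assumed shapes).
    low_queue = tri(queue_occupancy, -0.5, 0.0, 0.7)
    high_energy = tri(residual_energy, 0.3, 1.0, 1.5)
    # Mamdani-style AND via min; a higher score means a better next hop.
    return min(low_queue, high_energy)

# Candidate neighbours as (queue occupancy, residual energy), both 0-1.
neighbours = {"A": (0.2, 0.9), "B": (0.6, 0.95), "C": (0.1, 0.4)}
best = max(neighbours, key=lambda n: next_hop_score(*neighbours[n]))
```

Node B is nearly full and node C is nearly drained, so the min-conjunction favors A, which is moderately good on both criteria; this is the balancing behavior a fuzzy next-hop rule is meant to provide.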
The economic losses from earthquakes tend to hit the national economy considerably; therefore, models capable of estimating the vulnerability and losses of future earthquakes are highly consequential for emergency planners aiming at risk mitigation. This demands a mass prioritization filtering of structures to identify vulnerable buildings for retrofitting purposes. Applying advanced structural analysis to each building to study its earthquake response is impractical due to complex calculations, long computational times, and exorbitant cost. This exhibits the need for a fast and reliable method, commonly known as Rapid Visual Screening (RVS). The method serves as a preliminary screening platform, using an optimum number of seismic parameters of the structure and predefined output damage states. In this study, the efficacy of a Machine Learning (ML) application in damage prediction through a Support Vector Machine (SVM) model as the damage classification technique has been investigated. The developed model was trained and evaluated on damage data from the 1999 Düzce Earthquake in Turkey, where each building's data consists of 22 performance modifiers used as inputs for supervised machine learning.
This article focuses on further developments of the background-oriented schlieren (BOS) technique to visualize convective indoor air flow, which is usually characterized by very small density gradients. Since light rays are deflected when passing through fluids of different densities, BOS can detect the resulting refractive index gradients as an integration along a line of sight. In this paper, the BOS technique is used to yield a two-dimensional visualization of small density gradients. The novelty of the described method is the implementation of a highly sensitive BOS setup to visualize the ascending thermal plume from a heated thermal manikin at temperature differences as small as 1 K. To guarantee steady boundary conditions, the thermal manikin was seated in a climate laboratory. For the experimental investigations, a high-resolution DSLR camera was used to capture a large field of view with sufficient detail. Several parameters, such as different backgrounds, focal lengths, room air temperatures, and distances between the object of investigation, the camera, and the structured background, were tested to find the most suitable configuration for visualizing convective indoor air flow. Besides these measurements, this paper presents the analysis method using cross-correlation algorithms and, finally, the results of visualizing the convective indoor air flow with BOS. The highly sensitive BOS setup presented in this article complements the commonly used invasive methods, which strongly influence weak air flows.
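The cross-correlation evaluation at the heart of BOS can be illustrated in one dimension: recover a known pixel shift between a reference and a distorted intensity signal via an FFT-based correlation. This is a toy stand-in for the 2-D, windowed image correlation used in practice.

```python
import numpy as np

# Synthetic background intensity pattern and a copy shifted by a known amount,
# mimicking the apparent background displacement BOS measures.
rng = np.random.default_rng(1)
reference = rng.random(256)
shift = 7
distorted = np.roll(reference, shift)

# Circular cross-correlation via FFT; the index of the peak gives the shift.
corr = np.fft.ifft(np.fft.fft(distorted) * np.conj(np.fft.fft(reference))).real
estimated_shift = int(np.argmax(corr))
```

Real BOS evaluations correlate small interrogation windows of reference and measurement images (often with sub-pixel peak fitting) to map the deflection field; the peak-finding principle is the same.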
When it comes to monitoring huge structures, the main issues are limited time, high costs, and dealing with the large amount of data. Methods from the field of optimal design of experiments are useful and supportive for reducing and managing these. Having optimal experimental designs at hand before conducting any measurements leads to a highly informative measurement concept in which the sensor positions are optimized with respect to minimal errors in the structure's model. To reduce computational time, a combined two-step approach using the Fisher information matrix and the mean-squared error is proposed under consideration of different error types; the error descriptions contain random/aleatoric and systematic/epistemic portions. Applying this combined approach to a finite element model, using artificial acceleration time measurement data with artificially added errors, yields the optimized sensor positions. These findings are compared to results from laboratory experiments on the modeled structure, a tower-like structure represented by a hollow pipe acting as a cantilever beam. In conclusion, the combined approach leads to a sound experimental design that provides a good estimate of the structure's behavior and model parameters without the need for preliminary measurements for model updating.
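Fisher-information-based sensor placement can be sketched as a greedy determinant-maximizing selection over candidate positions. The mode-shape matrix, unit-variance noise model, and greedy strategy below are illustrative assumptions, not the paper's combined two-step procedure.

```python
import numpy as np

# Toy stand-in for candidate observation rows (e.g. FE mode shapes sampled
# at 20 candidate sensor positions for 3 modes).
rng = np.random.default_rng(0)
n_candidates, n_modes = 20, 3
Phi = rng.normal(size=(n_candidates, n_modes))

def greedy_fim_selection(Phi, n_sensors):
    """Greedily add the sensor that most increases det(FIM) (D-optimality)."""
    chosen = []
    for _ in range(n_sensors):
        best, best_det = None, -np.inf
        for i in range(Phi.shape[0]):
            if i in chosen:
                continue
            rows = Phi[chosen + [i]]
            # FIM for unit-variance noise; small ridge keeps early dets finite.
            fim = rows.T @ rows + 1e-9 * np.eye(Phi.shape[1])
            d = np.linalg.det(fim)
            if d > best_det:
                best, best_det = i, d
        chosen.append(best)
    return chosen

sensors = greedy_fim_selection(Phi, 4)
```

Maximizing det(FIM) minimizes the volume of the parameter-uncertainty ellipsoid; the paper's second step additionally screens candidates via the mean-squared error, which this sketch omits.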