Refine
Has Fulltext
- yes (57)
Document Type
- Article (57)
Institute
- Institut für Strukturmechanik (ISM) (33)
- Professur Bauphysik (6)
- Junior-Professur Organisation und vernetzte Medien (3)
- Professur Bauchemie und Polymere Werkstoffe (3)
- Professur Sozialwissenschaftliche Stadtforschung (2)
- Professur Stadtplanung (2)
- Bauhaus-Institut für zukunftsweisende Infrastruktursysteme (b.is) (1)
- Graduiertenkolleg 1462 (1)
- Professur Angewandte Mathematik (1)
- Professur Immobilienwirtschaft und -management (1)
Keywords
- OA-Publikationsfonds2020 (25)
- Maschinelles Lernen (17)
- Machine learning (12)
- Erdbeben (6)
- Deep learning (5)
- big data (5)
- Raumklima (4)
- rapid visual screening (4)
- computational fluid dynamics (3)
- earthquake safety assessment (3)
- machine learning (3)
- random forest (3)
- Behaglichkeit (2)
- Brücke (2)
- Fluid (2)
- Fotovoltaik (2)
- Fuzzy-Logik (2)
- Geopolymere (2)
- Intelligente Stadt (2)
- Internet of things (2)
- Journalismus (2)
- Neuronales Netz (2)
- Strömungsmechanik (2)
- Wohnen (2)
- Wohnungspolitik (2)
- artificial intelligence (2)
- artificial neural networks (2)
- buildings (2)
- damaged buildings (2)
- data science (2)
- ductless personalized ventilation (2)
- earthquake (2)
- smart cities (2)
- soft computing techniques (2)
- support vector machine (2)
- thermal comfort (2)
- urban morphology (2)
- vulnerability assessment (2)
- Abwasserwirtschaft (1)
- Aerodynamik (1)
- Akustische Laufzeit-Tomographie (1)
- Algorithmus (1)
- Anthropozän (1)
- Artificial neural network (1)
- Background-oriented schlieren (1)
- Balkan (1)
- Balkanroute (1)
- Bauklimatik (1)
- Bauphysik (1)
- Bayes-Verfahren (1)
- Belüftung (1)
- Bildanalyse (1)
- Bodentemperatur (1)
- Bridge (1)
- Bubble column reactor (1)
- Building Information Modeling (1)
- ContikiMAC (1)
- Convective indoor air flow (1)
- Cross-correlation (1)
- Datenmodell (1)
- Deutschland (1)
- Deutschland <Östliche Länder> (1)
- Digital image correlation (1)
- Digitalisierung (1)
- Dirac-Operator (1)
- Earthquake (1)
- Energieeffizienz (1)
- Entrepreneurship (1)
- Erdbebensicherheit (1)
- Erneuerbare Energien (1)
- Europäische Union (1)
- Fernerkundung (1)
- Finite-Elemente-Methode (1)
- Flow visualization (1)
- Flächengerechtigkeit (1)
- Flächenverbrauch (1)
- Flüchtlingspolitik (1)
- Funktionentheorie (1)
- Fuzzy Logic (1)
- Fuzzy-Regelung (1)
- Gaussian process regression (1)
- Gebäude (1)
- Geoinformatik (1)
- Geometrie (1)
- Gerechtigkeit (1)
- Gesundheitsinformationssystem (1)
- Gesundheitswesen (1)
- Grundwasser (1)
- Größenverhältnis (1)
- Housing (1)
- Housing Policy (1)
- Human thermal plume (1)
- Hydrological drought (1)
- IAQ (1)
- IOT (1)
- Infrastructures (1)
- Ingenieurwissenschaften (1)
- Inspektion (1)
- Instrument (1)
- Internet der Dinge (1)
- Internet der dinge (1)
- K-nearest neighbors (1)
- KNN (1)
- Kontamination (1)
- Konzeptkunst (1)
- Körper (1)
- Kühlkörper (1)
- Künste (1)
- Künstliche Intelligenz (1)
- Land surface temperature (1)
- Leistungsverhalten (1)
- Literaturrecherche (1)
- Luftqualität (1)
- Lüftung (1)
- M5 model tree (1)
- Machine Learning (1)
- Marmara Region (1)
- Mensch (1)
- Metakaolin (1)
- Mieten (1)
- Morphologie (1)
- Multi-criteria decision making (1)
- Nachhaltigkeit (1)
- Nanofluid (1)
- Nanomaterials (1)
- Nanostrukturiertes Material (1)
- Nasskühlung (1)
- Naturkatastrophe (1)
- Neuartige Sanitärsysteme (1)
- Neue Medien (1)
- Nitratbelastung (1)
- OA-Publikationsfonds2019 (1)
- Oberflächentemperatur (1)
- Ostdeutschland (1)
- Performance (1)
- Peripherisierungsforschung (1)
- Polymere (1)
- Postpolitik (1)
- Postwachstumsstadt (1)
- Public-Private Partnerships (1)
- RSSI (1)
- Randwertproblem (1)
- Rapid Visual Screening (1)
- Raumluftströmungen (1)
- Raumordnung (1)
- Renewable energy (1)
- Responsibilisierung (1)
- Schlierenspiegel (1)
- Selbstgenutztes Wohneigentum (1)
- Siedlungswasserwirtschaft (1)
- Simulation (1)
- Smog (1)
- Social Housing (1)
- Solar (1)
- Sozialer Wohnungsbau (1)
- Stadtplanung (1)
- Steuerungsansätze (1)
- Strukturmechanik (1)
- Strömung (1)
- Städtischer Wohnungsmarkt (1)
- Superplasticizer (1)
- Sustainability (1)
- Temperatur (1)
- Thermal conductivity (1)
- Transformation (1)
- Transformation risks (1)
- Transformationsrisiken (1)
- Umweltbelastung (1)
- Umweltgerechtigkeit (1)
- Umweltveränderung (1)
- Vulnerability (1)
- Vulnerability assessment (1)
- Wastewater management (1)
- Welfare State (1)
- Wettbewerb (1)
- Wohnfläche (1)
- Wohnraum (1)
- Wohnungsbau (1)
- Wohnungseigentum (1)
- Wohnungsfrage (1)
- Wärmeleitfähigkeit (1)
- Zement (1)
- action recognition (1)
- adaptive neuro-fuzzy inference system (ANFIS) (1)
- adaptive pushover (1)
- alumosilicate (1)
- ant colony optimization algorithm (ACO) (1)
- artificial neural network (1)
- back-pressure (1)
- battery (1)
- berlinite (1)
- cement (1)
- classification (1)
- classifier (1)
- clear channel assessments (1)
- cluster density (1)
- cluster shape (1)
- clustering (1)
- competition (1)
- computation (1)
- computational fluid dynamics (CFD) (1)
- congestion control (1)
- coronary artery disease (1)
- cross-contamination (1)
- deep learning neural network (1)
- depletion method (1)
- desk fan (1)
- digital native news media (1)
- digital-born news media (1)
- digitization (1)
- dimensionality reduction (1)
- discrete Dirac operator (1)
- discrete boundary value problems (1)
- discrete monogenic functions (1)
- duty-cycles (1)
- earthquake damage (1)
- earthquake vulnerability assessment (1)
- energy consumption (1)
- energy efficiency (1)
- ensemble model (1)
- entrepreneurial journalism (1)
- experimental validation (1)
- extreme events (1)
- extreme pressure (1)
- firefly optimization algorithm (1)
- fisher-information matrix (1)
- fixed effects regression (1)
- flow pattern (1)
- fog computing (1)
- food informatics (1)
- fractional-order control (1)
- fuzzy decision making (1)
- fuzzy set qualitative comparative analysis (1)
- geoinformatics (1)
- geopolymer (1)
- ground water contamination (1)
- gully erosion susceptibility (1)
- health (1)
- health informatics (1)
- heart disease diagnosis (1)
- heat sink (1)
- human blob (1)
- human body proportions (1)
- human thermal plume (1)
- hybrid machine learning (1)
- hybrid machine learning model (1)
- hydraulic jump (1)
- hydrological model (1)
- hydrology (1)
- image processing (1)
- indoor air quality (1)
- industry 4.0 (1)
- journalism (1)
- journalism theories (1)
- künstlerischer Aktivismus (1)
- least square support vector machine (LSSVM) (1)
- longitudinal dispersion coefficient (1)
- mass spectrometry (1)
- mathematical modeling (1)
- mean-squared error (1)
- media performance (1)
- mitigation (1)
- nanofluid (1)
- natural hazard (1)
- neural networks (NNs) (1)
- news start-ups (1)
- particle swarm optimization (1)
- personalisierte Lüftung (1)
- personalized ventilation (1)
- photovoltaic (1)
- photovoltaic-thermal (PV/T) (1)
- physical activities (1)
- polymer adsorption (1)
- practice theories (1)
- practice theory (1)
- precipitation (1)
- predictive model (1)
- principal component analysis (1)
- public health (1)
- public service media (1)
- public space (1)
- received signal strength indicator (1)
- reinforcement learning (1)
- remote sensing (1)
- residential buildings (1)
- rice (1)
- rivers (1)
- rule based classification (1)
- schlieren imaging (1)
- schlieren velocimetry (1)
- seasonal precipitation (1)
- seismic assessment (1)
- seismic hazard analysis (1)
- seismic risk estimation (1)
- seismic vulnerability (1)
- signal processing (1)
- site-specific spectrum (1)
- smart sensors (1)
- sodium silicate solution (1)
- soil temperature (1)
- spatial analysis (1)
- spatiotemporal database (1)
- spearman correlation coefficient (1)
- square root cubature Kalman filter (1)
- standard deviation of pressure fluctuations (1)
- statistical analysis (1)
- statistical coefficient of the probability distribution (1)
- stilling basin (1)
- support vector regression (1)
- sustainability (1)
- theory development (1)
- thermisches Empfinden (1)
- tower-like structures (1)
- tracer gas (1)
- type-3 fuzzy systems (1)
- urban health (1)
- urban sustainability (1)
- visible spectrophotometry (1)
- water quality (1)
- wavelet transform (1)
- wireless sensor network (1)
- wireless sensor networks (1)
- Öffentlich-private Partnerschaft (1)
- Öffentlich-rechtlicher Rundfunk (1)
- äquivalente Temperatur (1)
Year of publication
- 2020 (57)
In this paper, an artificial neural network is implemented to predict the thermal conductivity ratio of TiO2-Al2O3/water nanofluid. TiO2-Al2O3/water, an innovative type of nanofluid, was synthesized by the sol–gel method. The results indicated that 1.5 vol.% of nanoparticles enhanced the thermal conductivity by up to 25%. The heat transfer coefficient increased linearly with nanoparticle concentration, but its variation with temperature was nonlinear. It should be noted that a further increase in concentration may cause the particles to agglomerate, which reduces the thermal conductivity. Increasing temperature also raises the thermal conductivity, owing to stronger Brownian motion and more frequent particle collisions. To predict the thermal conductivity of TiO2-Al2O3/water nanofluid as a function of volumetric concentration and temperature, SOM (self-organizing map) and BP-LM (Back-Propagation Levenberg-Marquardt) algorithms were used. Based on the results obtained, these algorithms can be considered an exceptional tool for predicting thermal conductivity; the correlation coefficient values were 0.938 and 0.98 for the SOM and BP-LM algorithms, respectively, which is highly acceptable.
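As an illustration of the input-output mapping described above, here is a minimal sketch of a one-hidden-layer network trained on synthetic stand-in data; the paper's measurements are not reproduced here, and plain backpropagation replaces its Levenberg-Marquardt step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: thermal-conductivity ratio as a smooth function of
# temperature (°C) and volume fraction (%); the real measurements are not public.
T = rng.uniform(25, 55, 200)
phi = rng.uniform(0.0, 1.5, 200)
k_ratio = 1.0 + 0.12 * phi + 0.002 * T * phi + 0.01 * rng.normal(size=200)

X = np.column_stack([T, phi])
X = (X - X.mean(axis=0)) / X.std(axis=0)      # normalize inputs
y = k_ratio.reshape(-1, 1)

# One-hidden-layer MLP trained with plain gradient descent; the paper's
# Levenberg-Marquardt variant converges faster but is longer to code.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    pred = h @ W2 + b2                        # linear output
    err = pred - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)            # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
r = np.corrcoef(pred.ravel(), y.ravel())[0, 1]   # correlation coefficient
print(f"correlation coefficient: {r:.3f}")
```

The correlation coefficient plays the same role as the 0.938/0.98 figures reported above, only on synthetic data.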
The K-nearest neighbors (KNN) machine learning algorithm is a well-known non-parametric classification method. However, like other traditional data mining methods, applying it to big data comes with computational challenges. KNN determines the class of a new sample from the classes of its nearest neighbors; identifying those neighbors in a large amount of data, however, imposes a computational cost so large that a single computing machine can no longer handle it. One of the proposed techniques for making classification methods applicable to large datasets is pruning. LC-KNN is an improved KNN method which first clusters the data into smaller partitions using the K-means clustering method and then, for each new sample, applies KNN on the partition whose center is nearest to the sample. However, because the clusters have different shapes and densities, selecting the appropriate cluster is a challenge. In this paper, an approach is proposed to improve the pruning phase of the LC-KNN method by taking these factors into account. The proposed approach helps to choose a more appropriate cluster of data in which to look for the neighbors, thus increasing the classification accuracy. The performance of the proposed approach is evaluated on different real datasets. The experimental results show its effectiveness, with higher classification accuracy and lower time cost in comparison to other recent relevant methods.
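The cluster-then-search idea behind LC-KNN can be sketched in a few lines of NumPy; the data, cluster count, and neighbor count below are illustrative stand-ins, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two labeled Gaussian blobs as stand-in training data.
X0 = rng.normal(loc=[0, 0], scale=0.5, size=(100, 2))
X1 = rng.normal(loc=[4, 4], scale=0.5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Step 1 (pruning): partition the data with a few iterations of k-means.
k = 4
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(10):
    assign = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[assign == j].mean(axis=0) if (assign == j).any()
                        else centers[j] for j in range(k)])

def lc_knn_predict(x, n_neighbors=5):
    """Classify x using only the partition whose center is nearest (LC-KNN idea)."""
    j = np.argmin(((centers - x) ** 2).sum(-1))       # pick nearest cluster
    Xc, yc = X[assign == j], y[assign == j]
    order = np.argsort(((Xc - x) ** 2).sum(-1))[:n_neighbors]
    votes = np.bincount(yc[order])
    return votes.argmax()

print(lc_knn_predict(np.array([0.2, -0.1])))  # near blob 0 → 0
print(lc_knn_predict(np.array([3.8, 4.2])))   # near blob 1 → 1
```

The paper's contribution sits in the `argmin` line: instead of plain center distance, cluster shape and density inform which partition is searched.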
When it comes to monitoring huge structures, the main issues are limited time, high costs, and handling the large amount of data. Methods from the field of optimal design of experiments are useful and supportive in reducing and managing these. Having optimal experimental designs at hand before conducting any measurements leads to a highly informative measurement concept in which the sensor positions are optimized with respect to minimal errors in the structure's models. To reduce computational time, a combined approach using the Fisher information matrix and the mean-squared error in a two-step procedure is proposed under consideration of different error types; the error descriptions contain random/aleatoric and systematic/epistemic portions. Applying this combined approach to a finite element model using artificial acceleration time measurement data with artificially added errors yields the optimized sensor positions. These findings are compared to results from laboratory experiments on the modeled structure, a tower-like structure represented by a hollow pipe acting as a cantilever beam. In conclusion, the combined approach produces a sound experimental design that gives a good estimate of the structure's behavior and model parameters without the need for preliminary measurements for model updating.
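The core of the Fisher-information step — ranking candidate sensor positions by the information they add — can be sketched with a greedy determinant criterion on a toy cantilever model; the mode shapes, sensor count, and small ridge term below are illustrative assumptions, not the paper's FE model:

```python
import numpy as np

# Mode shapes of a cantilever (tower-like) beam evaluated at 10 candidate
# sensor heights — an analytical stand-in for the paper's FE model.
z = np.linspace(0.1, 1.0, 10)
Phi = np.column_stack([z**2, z**2 * (3 - 2 * z), np.sin(1.5 * np.pi * z)])

# Greedy selection: add the sensor that maximizes det(Phi_s^T Phi_s), the
# determinant of the (unit-noise) Fisher information matrix. The tiny ridge
# keeps the determinant informative before 3 sensors have been placed.
selected = []
remaining = list(range(len(z)))
for _ in range(4):                      # place 4 sensors
    best, best_det = None, -1.0
    for i in remaining:
        idx = selected + [i]
        F = Phi[idx].T @ Phi[idx] + 1e-9 * np.eye(3)   # Fisher information
        d = np.linalg.det(F)
        if d > best_det:
            best, best_det = i, d
    selected.append(best)
    remaining.remove(best)

print("sensor heights:", np.round(z[sorted(selected)], 2))
```

On this toy beam the tip position (where all mode shapes are largest) is picked first, matching the intuition that informative sensors sit where the model responds most.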
This paper reports the formation and structure of fast-setting geopolymers activated by three sodium silicate solutions with different moduli (1.6, 2.0 and 2.4) and a berlinite-type aluminum orthophosphate. By varying the concentration of the aluminum orthophosphate, different Si/Al ratios were established (6, 3 and 2). The reaction kinetics of the binders were determined by isothermal calorimetric measurements at 20 °C. X-ray diffraction analysis as well as nuclear magnetic resonance (NMR) measurements were performed on the binders to determine differences in structure caused by varying the alkalinity of the sodium silicate solutions and the Si/Al ratio. The calorimetric results indicated that the higher the alkalinity of the sodium silicate solution, the higher the solubility and degree of conversion of the aluminum orthophosphate. The results of X-ray diffraction and Rietveld analysis, as well as the NMR measurements, confirmed the assumption from the calorimetric experiments that the aluminum orthophosphate first dissolved and then polycondensed into an amorphous aluminosilicate network. The different amounts of amorphous phases formed as a function of the alkalinity of the sodium silicate solution indicate that tetrahydroxoaluminate species were formed during the dissolution of the aluminum orthophosphate, which reduce the pH value. This halted further dissolution of the aluminum orthophosphate, which remained unreacted.
Calculating the solubility of hydrocarbon components of natural gases is known as one of the important issues in operational work in petroleum and chemical engineering. In this work, a novel solubility estimation tool has been proposed for hydrocarbon gases—including methane, ethane, propane, and butane—in aqueous electrolyte solutions, based on the extreme learning machine (ELM) algorithm. Comparing the ELM outputs with a comprehensive real databank of 1175 solubility points yielded R-squared values of 0.985 and 0.987 for the training and testing phases, respectively. Furthermore, the visual comparison of estimated and actual hydrocarbon solubility confirmed the ability of the proposed solubility model. Additionally, sensitivity analysis was employed on the input variables of the model to identify their impact on hydrocarbon solubility. Such a comprehensive and reliable study can help engineers and scientists to successfully determine the important thermodynamic properties, which are key factors in optimizing and designing different industrial units such as refineries and petrochemical plants.
Hydrological drought forecasting plays a substantial role in water resources management, as hydrological drought strongly affects water allocation and hydropower generation. In this research, short-term hydrological drought was forecasted based on hybrids of novel nature-inspired optimization algorithms and Artificial Neural Networks (ANN). For this purpose, the Standardized Hydrological Drought Index (SHDI) and the Standardized Precipitation Index (SPI) were calculated for one, three, and six aggregated months. Then, three states were proposed for SHDI forecasting, and 36 input-output combinations were extracted based on cross-correlation analysis. In the next step, newly proposed optimization algorithms, including the Grasshopper Optimization Algorithm (GOA), Salp Swarm Algorithm (SSA), Biogeography-Based Optimization (BBO), and Particle Swarm Optimization (PSO), were hybridized with the ANN for SHDI forecasting, and the results were compared to the conventional ANN. Results indicated that the hybridized models outperformed the conventional ANN, and PSO performed better than the other optimization algorithms. The best models forecasted SHDI1 with R2 = 0.68 and RMSE = 0.58, SHDI3 with R2 = 0.81 and RMSE = 0.45, and SHDI6 with R2 = 0.82 and RMSE = 0.40.
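The index construction underlying SHDI/SPI — aggregate over k months, then standardize — can be sketched as follows; the runoff series is synthetic, and a plain z-score stands in for the gamma-distribution transform used operationally:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in monthly runoff series (10 years); the paper standardizes observed
# hydrological data to build SHDI, analogous to SPI for precipitation.
runoff = rng.gamma(shape=2.0, scale=30.0, size=120)

def standardized_index(series, window):
    """Aggregate over `window` months, then standardize (simple z-score
    variant; operational SPI/SHDI fits a gamma distribution first)."""
    agg = np.convolve(series, np.ones(window), mode="valid")  # rolling sums
    return (agg - agg.mean()) / agg.std()

shdi1 = standardized_index(runoff, 1)
shdi3 = standardized_index(runoff, 3)
shdi6 = standardized_index(runoff, 6)
print(len(shdi1), len(shdi3), len(shdi6))  # 120 118 115
```

Values below roughly -1 in such an index are commonly read as drought conditions; the forecasting models above predict these standardized values one step ahead.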
Pressure fluctuations beneath hydraulic jumps potentially endanger the stability of stilling basins. This paper deals with the mathematical modeling of the results of laboratory-scale experiments to estimate the extreme pressures. Experiments were carried out on a smooth stilling basin underneath free hydraulic jumps downstream of an Ogee spillway. From the probability distribution of measured instantaneous pressures, pressures with different probabilities could be determined. It was verified that the maximum pressure fluctuations, and the negative pressures, are located near the spillway toe, while the minimum pressure fluctuations are located downstream of the hydraulic jumps. It was possible to assess the cumulative curves of pressure data related to the characteristic points along the basin and to different Froude numbers. To benchmark the results, the dimensionless forms of statistical parameters, including the mean pressures (P*m), the standard deviations of pressure fluctuations (σ*X), the pressures with different non-exceedance probabilities (P*k%), and the statistical coefficient of the probability distribution (Nk%), were assessed. It was found that an existing method can be used to interpret the present data, and the pressure distribution in similar conditions, by using new second-order fractional relationships for σ*X and Nk%. The values of the Nk% coefficient indicated a single mean value for each probability.
This research aims to model soil temperature (ST) using the machine learning models of the multilayer perceptron (MLP) algorithm and support vector machine (SVM) in hybrid form with the firefly optimization algorithm, i.e. MLP-FFA and SVM-FFA. In the current study, measured ST and meteorological parameters of the Tabriz and Ahar weather stations for the period 2013–2015 are used for training and testing of the studied models with delays of one and two days. To ascertain conclusive results for validation of the proposed hybrid models, the error metrics are benchmarked in an independent testing period; moreover, Taylor diagrams were utilized for that purpose. The results showed that, in the case of a one-day delay, except in predicting ST at 5 cm below the soil surface (ST5cm) at Tabriz station, MLP-FFA produced superior results compared with the MLP, SVM, and SVM-FFA models. For a two-day delay, however, MLP-FFA showed increased accuracy in predicting ST5cm and ST20cm of Tabriz station and ST10cm of Ahar station in comparison with SVM-FFA. Additionally, for all of the prescribed models, the performance of the MLP-FFA and SVM-FFA hybrid models in the testing phase was found to be meaningfully superior to that of the classical MLP and SVM models.
In this study, a new approach based on intelligent systems and machine learning algorithms is introduced for solving singular multi-pantograph differential equations (SMDEs). For the first time, a type-2 fuzzy logic based approach is formulated to find an approximated solution. The rules of the suggested type-2 fuzzy logic system (T2-FLS) are optimized by the square root cubature Kalman filter (SCKF) such that the proposed fitness function is minimized. Furthermore, the stability and boundedness of the estimation error are proved by a novel approach based on the Lyapunov theorem. The accuracy and robustness of the suggested algorithm are verified by several statistical examinations. It is shown that the suggested method results in an accurate solution with rapid convergence and a lower computational cost.
For this paper, the problem of energy/voltage management in photovoltaic (PV)/battery systems was studied, and a new fractional-order control system based on type-3 (T3) fuzzy logic systems (FLSs) was developed. New fractional-order learning rules are derived for tuning the T3-FLSs such that stability is ensured. In addition, using fractional-order calculus, the robustness was studied versus dynamic uncertainties, perturbations of irradiation and temperature, and abrupt faults in output loads, and new compensators were proposed accordingly. In several examinations under difficult operating conditions, such as random temperature, variable irradiation, and abrupt changes in output load, the capability of the proposed controller was verified. In addition, its superiority was demonstrated in comparison with other methods, such as the proportional-integral-derivative (PID) controller, sliding mode controller (SMC), passivity-based control (PBC), and linear quadratic regulator (LQR).
In this research, an attempt was made to reduce the dimension of wavelet-ANFIS/ANN (artificial neural network / adaptive neuro-fuzzy inference system) models toward reliable forecasts as well as to decrease computational cost. To this end, principal component analysis (PCA) was performed on the input time series, decomposed by a discrete wavelet transform, to feed the ANN/ANFIS models. The models were applied to forecasting dissolved oxygen (DO) in rivers, an important variable affecting aquatic life and water quality. The current values of DO, water surface temperature, salinity, and turbidity were considered as input variables to forecast DO three time steps ahead. The results of the study revealed that PCA can be employed as a powerful tool for dimension reduction of input variables and also for detecting their inter-correlation. Results of the PCA-wavelet-ANN models are compared with those obtained from wavelet-ANN models, the former having the advantage of less computational time. For the ANFIS models, PCA is even more beneficial, since it prevents wavelet-ANFIS models from creating too many rules, which deteriorates their efficiency; manipulating the wavelet-ANFIS models using PCA leads to a significant decrease in computational time. Finally, it was found that the PCA-wavelet-ANN/ANFIS models can provide reliable forecasts of dissolved oxygen as an important water quality indicator in rivers.
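The PCA step — keeping only the components that explain most of the variance of the wavelet sub-series — might look like this in NumPy; the design matrix is a synthetic stand-in for the decomposed inputs, and the 95% threshold is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in design matrix: 8 wavelet sub-series as columns, deliberately
# correlated so PCA can compress them (as with the paper's decomposed inputs).
base = rng.normal(size=(300, 3))
mix = rng.normal(size=(3, 8))
X = base @ mix + 0.05 * rng.normal(size=(300, 8))

def pca_reduce(X, var_kept=0.95):
    """Project X onto the fewest principal components explaining var_kept."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()          # variance share per component
    n = int(np.searchsorted(np.cumsum(explained), var_kept) + 1)
    return Xc @ Vt[:n].T, n

Z, n_components = pca_reduce(X)
print(f"{X.shape[1]} inputs compressed to {n_components} components")
```

Feeding `Z` instead of `X` into an ANN/ANFIS shrinks the input layer (and, for ANFIS, the rule count), which is exactly the computational saving the abstract describes.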
This contribution links the debate on the post-political city with the growing scholarly and activist engagement with the Anthropocene, a concept describing the ecological and socio-political implications of human action on the Earth's surface. Drawing on three selected case studies, we explore how the specifically anthropogenic, i.e. human-made, crisis of urban air pollution is problematized in artistic positions. Against the backdrop of a potential rise of post-politics, we discuss how the ambivalent discourse of the Anthropocene favors depoliticization on the one hand while, on the other, opening up new possibilities for the repoliticization of global environmental challenges.
Wind effects can be critical for the design of lifelines such as long-span bridges. The existence of a significant number of aerodynamic force models, used to assess the performance of bridges, poses an important question regarding their comparison and validation. This study utilizes a unified set of metrics for a quantitative comparison of time-histories in bridge aerodynamics with a host of characteristics. Accordingly, nine comparison metrics are included to quantify the discrepancies in local and global signal features such as phase, time-varying frequency and magnitude content, probability density, nonstationarity and nonlinearity. Among these, seven metrics available in the literature are introduced after recasting them for time-histories associated with bridge aerodynamics. Two additional metrics are established to overcome the shortcomings of the existing metrics. The performance of the comparison metrics is first assessed using generic signals with prescribed signal features. Subsequently, the metrics are applied to a practical example from bridge aerodynamics to quantify the discrepancies in the aerodynamic forces and response based on numerical and semi-analytical aerodynamic models. In this context, it is demonstrated how a discussion based on the set of comparison metrics presented here can aid a model evaluation by offering deeper insight. The outcome of the study is intended to provide a framework for quantitative comparison and validation of aerodynamic models based on the underlying physics of fluid-structure interaction. Immediate further applications are expected for the comparison of time-histories that are simulated by data-driven approaches.
Tall buildings have become an integral part of cities despite all their pros and cons. Some current tall buildings have several problems because of their unsuitable location; the problems include increasing density, imposing traffic on urban thoroughfares, blocking view corridors, etc. Some of these buildings have destroyed desirable views of the city. In this research, different criteria have been chosen, such as environment, access, social-economic, land-use, and physical context. These criteria and sub-criteria are prioritized and weighted by the analytic network process (ANP) based on experts’ opinions, using Super Decisions V2.8 software. At the same time, layers corresponding to the sub-criteria were made in ArcGIS 10.3, and then, via a weighted overlay (map algebra), a locating plan was created. In the next step, seven hypothetical tall buildings (20 stories), in the best part of the locating plan, were considered to evaluate how much of these hypothetical buildings would be visible (fuzzy visibility) from the street and open spaces throughout the city. These processes have been modeled in MATLAB software, and the final fuzzy visibility plan was created with ArcGIS. Fuzzy visibility results can help city managers and planners to choose which location is suitable for a tall building and how much visibility may be appropriate. The proposed model can locate tall buildings based on technical and visual criteria in the future development of the city, and it can be widely used in any city as long as the criteria and weights are localized.
The longitudinal dispersion coefficient (LDC) plays an important role in modeling the transport of pollutants and sediment in natural rivers. As a result of transportation processes, the concentration of pollutants changes along the river. Various studies have been conducted to provide simple equations for estimating LDC. In this study, machine learning methods, namely support vector regression, Gaussian process regression, M5 model tree (M5P) and random forest, and multiple linear regression were examined in predicting the LDC in natural streams. Data sets from 60 rivers around the world with different hydraulic and geometric features were gathered to develop models for LDC estimation. Statistical criteria, including correlation coefficient (CC), root mean squared error (RMSE) and mean absolute error (MAE), were used to scrutinize the models. The LDC values estimated by these models were compared with the corresponding results of common empirical models. The Taylor chart was used to evaluate the models and the results showed that among the machine learning models, M5P had superior performance, with CC of 0.823, RMSE of 454.9 and MAE of 380.9. The model of Sahay and Dutta, with CC of 0.795, RMSE of 460.7 and MAE of 306.1, gave more precise results than the other empirical models. The main advantage of M5P models is their ability to provide practical formulae. In conclusion, the results proved that the developed M5P model with simple formulations was superior to other machine learning models and empirical models; therefore, it can be used as a proper tool for estimating the LDC in rivers.
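The three evaluation criteria named above are straightforward to compute; a minimal sketch with hypothetical LDC values:

```python
import numpy as np

def cc(obs, pred):
    """Correlation coefficient between observed and predicted values."""
    return np.corrcoef(obs, pred)[0, 1]

def rmse(obs, pred):
    """Root mean squared error."""
    return np.sqrt(np.mean((obs - pred) ** 2))

def mae(obs, pred):
    """Mean absolute error."""
    return np.mean(np.abs(obs - pred))

# Toy check with hypothetical LDC values (m^2/s): a model that tracks the
# observations perfectly but is biased high by 10 units.
obs = np.array([120.0, 300.0, 455.0, 610.0, 980.0])
pred = obs + 10.0
print(cc(obs, pred), rmse(obs, pred), mae(obs, pred))  # 1.0 10.0 10.0
```

The toy case also shows why all three criteria are reported together: a constant bias leaves CC at 1.0 while RMSE and MAE expose the offset.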
Cooling Performance of a Novel Circulatory Flow Concentric Multi-Channel Heat Sink with Nanofluids
(2020)
Heat rejection from electronic devices such as processors necessitates a high heat removal rate. The present study focuses on a liquid-cooled novel heat sink geometry made from four channels (width 4 mm, depth 3.5 mm) configured in a concentric shape with alternate flow passages (slots of 3 mm gap). In this study, the cooling performance of the heat sink was tested under simulated controlled conditions. The lower bottom surface of the heat sink was heated at a constant heat flux condition based on dissipated powers of 50 W and 70 W. The computations were carried out for different volume fractions of nanoparticles, namely 0.5% to 5%, with water as the base fluid, at flow rates of 30 to 180 mL/min. The results showed a higher rate of heat rejection from the nanofluid-cooled heat sink compared with water. The enhancement in performance was analyzed with the help of the difference between the nanofluid outlet temperature and the water outlet temperature under similar operating conditions. The enhancement was ~2% for a 0.5% volume fraction of nanofluids and ~17% for a 5% volume fraction.
The Marmara Region (NW Turkey) has experienced significant earthquakes (M > 7.0) to date. A destructive earthquake is also expected in the region. To determine the effect of the specific design spectrum, eleven provinces located in the region were chosen according to the Turkey Earthquake Building Code updated in 2019. Additionally, the differences between the previous and updated regulations of the country were investigated. Peak Ground Acceleration (PGA) and Peak Ground Velocity (PGV) were obtained for each province by using earthquake ground motion levels with 2%, 10%, 50%, and 68% probability of exceedance in 50-year periods. The PGA values in the region range from 0.16 to 0.7 g for earthquakes with a return period of 475 years. For each province, a sample of a reinforced-concrete building having two different numbers of stories with the same ground and structural characteristics was chosen. Static adaptive pushover analyses were performed for the sample reinforced-concrete building using each province’s design spectrum. The variations in the earthquake and structural parameters were investigated according to different geographical locations. It was determined that the site-specific design spectrum significantly influences target displacements for performance-based assessments of buildings due to seismicity characteristics of the studied geographic location.
In view of its effects on the social and ecological situation, the settlement and housing policy practiced in Germany in 2020 leaves a bitter aftertaste. Rich and poor continue to drift apart, and historically and systemically rooted path dependencies still stand in the way of a targeted ecological transformation of how urban development and housing policy are shaped. These dependencies become visible only through an integrated consideration of social and economic aspects, and they point to one of the original questions of left-wing social research: the relationship between property and justice.
Three key findings result: first, the discourse on protecting the climate and biodiversity directly touches the parameters of density, mixed use, and land consumption; second, land consumption rises relatively with increasing individually available capital, particularly in owner-occupied housing compared with rental housing; and third, the ownership share grows with the advancing financialization of the housing market, so that the risk of social and ecological crises intensifies.
Classical Internet of Things routing and wireless sensor networks can provide more precise monitoring of the covered area due to the higher number of utilized nodes. Because of the limitations of shared transfer media, many nodes in the network are prone to collisions in simultaneous transmissions. Medium access control protocols are usually more practical in networks with low traffic that are not subjected to external noise from adjacent frequencies. There are preventive, detection, and control solutions to congestion management in the network, all of which are the focus of this study. In the congestion prevention phase, the proposed method chooses the next step of the path using a fuzzy decision-making system to distribute network traffic via optimal paths. In the congestion detection phase, a dynamic approach to queue management was designed to detect congestion in the least amount of time and prevent collisions. In the congestion control phase, the back-pressure method was used, based on queue quality, to decrease the probability of routing through a pre-congested node. The main goals of this study are to balance energy consumption across network nodes, reduce the rate of lost packets, and increase the quality of service in routing. Simulation results showed that the proposed Congestion Control Fuzzy Decision Making (CCFDM) method was more capable of improving routing parameters than recent algorithms.
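A minimal sketch of fuzzy next-hop scoring in the spirit of the prevention phase; the membership functions, input set, and min-as-AND aggregation are illustrative assumptions, not the actual CCFDM rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def next_hop_score(queue_occupancy, residual_energy, link_quality):
    """Hypothetical fuzzy score for a candidate next hop (all inputs in [0, 1]):
    prefer a lightly loaded queue, high residual energy, and a good link."""
    low_queue = tri(queue_occupancy, -0.5, 0.0, 0.6)
    high_energy = tri(residual_energy, 0.4, 1.0, 1.5)
    good_link = tri(link_quality, 0.4, 1.0, 1.5)
    # Min as the fuzzy AND of the three antecedents.
    return float(np.minimum(np.minimum(low_queue, high_energy), good_link))

candidates = {
    "A": next_hop_score(0.1, 0.9, 0.8),   # lightly loaded, healthy node
    "B": next_hop_score(0.7, 0.9, 0.9),   # congested queue
}
best = max(candidates, key=candidates.get)
print(best)  # A
```

Scoring every neighbor this way and forwarding to the highest-scoring one spreads traffic away from congested queues before congestion ever has to be detected.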
Coronary Artery Disease Diagnosis: Ranking the Significant Features Using a Random Trees Model
(2020)
Heart disease is one of the most common diseases in middle-aged citizens. Among the vast number of heart diseases, coronary artery disease (CAD) is considered as a common cardiovascular disease with a high death rate. The most popular tool for diagnosing CAD is the use of medical imaging, e.g., angiography. However, angiography is known for being costly and also associated with a number of side effects. Hence, the purpose of this study is to increase the accuracy of coronary heart disease diagnosis through selecting significant predictive features in order of their ranking. In this study, we propose an integrated method using machine learning. The machine learning methods of random trees (RTs), decision tree of C5.0, support vector machine (SVM), and decision tree of Chi-squared automatic interaction detection (CHAID) are used in this study. The proposed method shows promising results and the study confirms that the RTs model outperforms other models.
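One simple way to rank features by predictive contribution, in the spirit of the ranking described above, is permutation importance; the sketch below uses synthetic stand-in data and a linear scorer rather than the paper's tree models:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for a CAD dataset: feature 0 is strongly predictive,
# feature 1 weakly, feature 2 pure noise.
n = 500
X = rng.normal(size=(n, 3))
y = (2.0 * X[:, 0] + 0.3 * X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(int)

# A least-squares linear scorer stands in for the paper's tree ensembles.
w, *_ = np.linalg.lstsq(X, 2 * y - 1, rcond=None)
base_acc = np.mean((X @ w > 0) == y)

# Permutation importance: the accuracy drop when one feature is shuffled.
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(base_acc - np.mean((Xp @ w > 0) == y))

ranking = np.argsort(importance)[::-1]
print("features ranked by importance:", ranking)
```

Reading the ranking off the accuracy drops mirrors the study's goal: keep the features whose removal hurts diagnosis most, and drop the rest before the costly tests.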