Grinding, as a comminution process, has been one of the most important ways of processing materials of all kinds since the dawn of humanity - from the milling of grain, through the breaking down of medicinal herbs in mortars, to the production of toners for printers and copiers. Cement grinding in particular is both an economic and an ecological factor in modern societies. More than two thirds of the electrical energy used in cement production is consumed by the grinding of raw meal and of clinker and composite materials. This is only one reason why the grinding process is increasingly becoming the focus of research and development projects. The complexity of cement grinding is steadily increasing. Simply "grinding to cement fineness" has long been obsolete. Cements are tailor-made, produced with a wide variety of combination products, ground separately or together, in different grinding units or with entirely new approaches. In addition, the construction material recycling sector, with all its associated challenges, is gaining ever more importance. In addressing the question of how the grinding process can produce high-performance products on the one hand while meeting growing sustainability requirements on the other, the grinding unit stands at the center of attention. Accordingly, in addition to a thorough literature review of the state of knowledge, this thesis is divided into two main parts:
The first part presents investigations of conventional grinding units using core products of the cement industry such as Portland cement clinker, limestone, fly ash and granulated blast furnace slag. To ensure the most effective grinding of cement and composite materials, it is important to know the effect of mill parameters. For this purpose, an extensive test matrix was set up and worked through. The spectrum of analytical methods was equally broad and was applied both to the ground materials and to the cements and concretes produced from them. It was shown that, above all, the distinction between mills with grinding media and mills without grinding media has a decisive influence on the granulometry and thus on cement performance. The processing properties were affected particularly strongly, especially the water demand and, with it, the pore structure and ultimately the compressive strengths and durability properties of the concretes produced from these cements. In investigations of the intergrinding of limestone and clinker, unfavorable enrichment effects of the easily grindable limestone and of clayey minor constituents led to poorer performance in all cement tests.
The second part is devoted to high-energy grinding. The underlying technology has been used for decades in other industries, such as pharmaceuticals, biology and the food industry, and has for some time also found its way into cement research; planetary ball mills and stirred media mills may be cited as representative examples. In addition to fundamental investigations of cement clinker and conventional composite materials such as blast furnace slag and limestone, the main cement clinker phase, alite, was also examined. High-energy grinding of conventional composite materials generated additional reactivity at identical granulometry compared with conventional grinding. This was observed above all for inherently reactive cement clinker as well as for latent-hydraulic blast furnace slag. Ground fly ashes could be further activated only to a small extent. The general influence of surface enlargement, structural defects and relaxation effects of a ground product was examined in depth and weighted. The results of high-energy grinding of alite showed that the structural defects introduced by grinding increase its reactivity. It was found that primarily surface defects, structural (bulk) defects and, as their counterpart, self-healing effects are the factors governing reactivity. Furthermore, experiments on the grinding of crushed concrete sand were carried out. Specifically, it was investigated to what extent crushed concrete sand, as the unusable part of concrete rubble, can be returned to the construction material cycle in the form of a cement composite material. The grinding technology used for this comprised both conventional mills and high-energy mills. Composite cements with varying recycled material content were produced and examined for fundamental properties. To assess product quality, the so-called "activation coefficient" was introduced. It turned out that the recycling of crushed concrete sand as a potential composite material depends essentially on the proportion of hardened cement paste. For example, pure hardened cement paste, ground up as a composite material, showed better performance than crushed concrete sand containing aggregate.
In terms of the measured heats of hydration and compressive strengths, the activation coefficient decreased with decreasing degree of abstraction. The activation coefficient likewise fell with increasing substitution level. For comparison, the same materials were processed in conventional mills. The results obtained there can in part be judged equivalent to those of high-energy grinding. Consequently, in the activation of recycled materials it is less the grinding technology than the proportion of activatable hardened cement paste that is decisive.
Linking the social and the ecological question is one of the central challenges of left politics and critically engaged scholarship today. The public and academic debates around the housing question are good examples of how little this has succeeded so far. This call is an invitation to the collective body of knowledge from academia and activism to elaborate, in individual contributions, the various aspects of the ecological housing question, which have so far been treated in a highly fragmented way, and to examine their structural connection to the social housing question.
Why isn't Google welcome in Kreuzberg? Social movement and the effects of Internet on urban space
(2020)
Advances in information and communication technologies such as the Internet have driven a great transformation in the interactions between individuals and the urban environment. As the use of the Internet in cities becomes more intense and diverse, urban space is also being restructured, and this restructuring is experienced by groups in society in various ways, according to the specificity of each context. Accordingly, large Internet companies have emerged as new players in the processes of urbanization, either through partnerships with the public administration or through various services offered directly to urban residents. Since these corporations are key actors in the digitalization of urban services, their operations can affect patterns of urban inequality and generate a series of new struggles over the production of space. Interested in analyzing this phenomenon from the perspective of civil society, this Master's thesis examined a social movement that prevented Google from settling a new startup campus in the district of Kreuzberg, in Berlin. By asking why Google was not welcome in that context, this study also sought to understand how the Internet, as well as its main operators, has affected everyday life in the city. Thus, besides analyzing the movement, I investigated the particularities of the urban context in which it arose and the elements that distinguish the mobilization's opponent. In pursuit of an interdisciplinary approach, I analyzed and discussed the results of empirical research in dialogue with critical theories in the fields of urban studies and the Internet, with emphasis on Castells' definitions of urban social movements and the network society (1983, 2009, 2015), Couldry and Mejias' (2019) idea of data colonialism, Lefebvre's (1991, 1996) concepts of abstract space and the right to the city, as well as Zuboff's (2019) theory of surveillance capitalism.
The case at hand exposed that Google plays a prominent role in the way the Internet has been developed and deployed in cities. From the perspective examined, the current appropriation of Internet technologies has been detrimental to individual autonomy and has contributed to intensifying existing inequalities in the city. The alternative vision to this relies mainly on the promotion of decentralized solidarity networks.
A complex artistic research on the theme of cultural heritage and (neo)colonial processes of material and immaterial expropriation. Starting from the encounter with a phonographic relic at the Berliner Phonogramm-Archiv, the artist embarks on a journey to her own roots embodied in the practice of the Afro-Brazilian religion Candomblé. In the form of a theoretical treatise, an archive (photos, diagrams, maps, newspaper clippings, letters, documents), as well as a sound performance in the public space of the city of Weimar, several theoretical and performative elements are brought together in this transmedia artistic research that proposes a true decolonial practice.
Why Do Digital Native News Media Fail? An Investigation of Failure in the Early Start-Up Phase
(2020)
Digital native news media have great potential for improving journalism. In theory, they can be the sites where new products, novel revenue streams and alternative ways of organizing digital journalism are discovered, tested, and advanced. In practice, however, the situation appears to be more complicated. Besides the normal pressures facing new businesses, entrepreneurs in digital news face specific challenges. Against the background of the general and journalism-specific entrepreneurship literature, and in light of a practice-theoretical approach, this qualitative case study of 15 German digital native news outlets empirically investigates which barriers curb their innovative capacity in the early start-up phase. In the new media organizations under study, the problems include a high degree of homogeneity within founding teams, tensions between journalistic and economic practices, insufficient user orientation, and a tendency for organizations to be underfinanced. The patterns of failure investigated in this study can raise awareness, help news start-ups avoid common mistakes before actually entering the market, and help industry experts and investors to realistically estimate the potential of new ventures within the digital news industry.
The innovation management of media organizations is currently undergoing considerable change: in a transformed market environment, flexibility, rapid changes of direction and adaptability are proving central. Media management research must respond in kind: validly studying the agility of current business practice requires equally agile, adaptive research. To this end, this contribution proposes a practice-theoretical perspective on the innovation management of media organizations. The empirical research designs that result from such an approach are discussed with regard to both their methodological challenges and their research project management. The contribution also takes up the new spaces of possibility in academic publishing, university management and the organization of research that practice-theoretically grounded empirical innovation research in the media industry demands.
The evolution of urbanism under dictatorship forms the core of the present research. This thesis is part of a research network at Bauhaus-Universität Weimar that studies twentieth-century urbanism under different dictatorships. The network has provided a cross-cultural and cross-border environment and has enabled the author to communicate with other like-minded researchers. The group's book 'Urbanism and Dictatorship: A European Perspective', published in 2015, strengthens the theoretical and methodological foundation of this research.
This thesis investigates the urban policies and plans that drove the advancement of urbanization and the transformation of urban space in Iran under the second Pahlavi (1941-1979), when the country faced a milestone in its history: the nationalization of the Iranian oil industry. By reflecting the influence of the economic and socio-political determinants of the time on urbanism and the urbanization process, this work critically traces the effect of dictatorship on the urbanism that evolved before and after the oil nationalization in 1951.
Research on urbanism under the second Pahlavi has received limited attention and has only recently expanded. Most of the existing studies date back less than a decade and do not cover all the episodes of second-Pahlavi urbanism. These works have often investigated urbanism and architecture by focusing merely on physical features and urban products in different years, regardless of the importance of urbanism as a tool in the service of hegemony. In other words, the majority of the available literature does not address the socio-economic and political roots of urban transformations; by asking 'what has been built?', it examines individual urban projects and plans by individual designers without linking these projects to the state's urban planning program, tracing the beneficiaries of those projects, or asking 'built for whom?'
Moreover, some chapters of this modern urbanism have rarely been investigated. For instance, scant research has looked into the works of foreign designers and consultants involved in the projects such as Peter Georg Ahrens or Constantinos A. Doxiadis. Similarly, the urbanism of the first decade of the second Pahlavi, including the government of Mossadegh, has mainly been overlooked.
Therefore, by critically analyzing the state's urban planning program and the process of urbanization in Iran during the second Pahlavi, this research aims to bridge the literature gap and to unravel the effect of the power structure on urban planning and products while seeking to find a pattern behind the regime's policies.
The main body of this work concentrates on the history of urbanism in Iran, in which the collection of data and descriptions played a crucial role. To avoid the limitations associated with single methods, the methodology of this research is based on methodological triangulation (Denzin, 2017). Within this triangulation scheme, data were gathered by combining different qualitative and quantitative methods, including library, archival and media research, online resources, non-participatory observation, and photography. For the empirical part, the city of Tehran was selected as the case study. In addition, individual unstructured interviews with locals were conducted to gain further insights into the urban projects.
Recurring loads, such as those acting on bridges or wind turbines, can reach up to 1,000,000,000 load cycles within the service life of such structures. To investigate the resulting fatigue behavior of concrete, these cyclic loads are reproduced in mechanical tests on cylindrical specimens. So that tests with such high numbers of load cycles can be completed within acceptable time spans, the loading frequency is increased. As a consequence of this increased loading frequency, however, the concrete specimens heat up, which can lead to an earlier, unrealistic time of failure; the heating must therefore be limited. Experiments and simulations were carried out to investigate the heat release in the specimen. This contribution presents the analytical and metrological analysis of the heat transfer at heated concrete cylinders. Based on these results, an approach for reducing the heating of cyclically loaded concrete cylinders is presented.
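To illustrate why the loading frequency must be raised, the test duration for a given cycle count follows directly from the frequency. The frequencies below are assumed for illustration only; the contribution does not specify values:

```python
# test duration for a given number of load cycles and test frequency
# (illustrative frequencies; not taken from the contribution)
CYCLES = 1_000_000_000
SECONDS_PER_YEAR = 86_400 * 365

for f_hz in (1, 10, 60):
    years = CYCLES / f_hz / SECONDS_PER_YEAR
    print(f"{f_hz:>2} Hz -> {years:5.1f} years")
```

Even at 60 Hz, a billion cycles take roughly half a year of continuous loading, which is why the frequency increase, and hence the specimen heating, is hard to avoid.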
Novel sanitary systems aim at a resource-oriented recovery of wastewater. This is to be achieved by collecting wastewater streams separately. Among professionals in water management and spatial planning, novel sanitary systems are regarded as a suitable approach for securing wastewater disposal in rural areas in the future. Although the practical viability of these systems has been demonstrated in research projects, a variety of risks has so far made it difficult for wastewater utilities to introduce resource-oriented wastewater management. Starting from an analysis of the contexts of implementing a novel sanitary system in rural Thuringia, this contribution examines how the introduction of resource-oriented system approaches can be supported at the state level with the instruments of wastewater management. Its central elements are a presentation of the main transformation risks associated with introducing innovative solutions, an explanation of the specific wastewater-management instruments, and a discussion of governance approaches by which the introduction of novel sanitary systems can be promoted. The results show that novel sanitary systems become feasible through the strategic use of these instruments, although the water sector, by extending its previous system boundaries, depends on cooperation with other areas of public services.
Along with environmental pollution, urban planning has been connected to public health. Research indicates that the quality of built environments plays an important role in reducing mental disorders and improving overall health. The structure and shape of the city are considered among the factors influencing happiness and health in urban communities and the type of daily activities of citizens. The aim of this study was to promote physical activity within the main structure of the city via urban design, such that the form and morphology of the city encourage citizens to move around and be physically active. Based on the literature review, functional, physical, cultural-social, and perceptual-visual features are regarded as the most important and effective criteria for increasing physical activity in urban spaces. The environmental quality of urban spaces and their role in the physical activity of citizens were assessed using a questionnaire, the analytical network process (ANP), and structural equation modeling. Further, the space syntax method was utilized to evaluate the role of the spatial integration of urban spaces in improving physical activity. Based on the results, consideration of functional diversity, spatial flexibility and integration, security, and the aesthetic and visual quality of urban spaces plays an important role in improving the physical health of citizens. Moreover, more physical activity, including motivation for walking and a sense of public health and happiness, was observed in streets with higher connectivity and space syntax indexes with respect to their surrounding urban fabric.
This thesis is a work of urban history that aims not to describe the city but to interpret it. I interpret the city through the role played by so-called 'great property' in the European city-making process during the last three decades of the 20th century, focusing on the concrete case of military properties in Italy. I also consider the role played by other kinds of great property, i.e. industrial and railway estates, which previously shaped the production of the built environment in a different way than the military ones. Since all of them share the common denominator of being 'capital in land', I analysed great industrial and railway properties in order to extract a methodology that helped me interpret the relationship between military properties and the city-making process in Europe in the late 20th century.
I analysed the relationship between capital in land and the city-making process on the basis of understanding the interrelation between great property, urban development, and the agents involved in urban and territorial planning. I show that urban planning is not the decisive factor influencing the city-making process; rather, it is the power held by capital in land. I found that great property is the trigger for the creation of new 'areas of centrality', understood as large areas for consumerism. The role played by great property has also evolved over time: originally, industrial and railway properties were regenerated into a wide range of new profit-driven spaces; subsequently, most regenerations of military premises aimed to materialise areas of centrality. My interpretation rests on the military premises in Italy: I classified their typology at the time they were built and, most importantly, at the time they were regenerated into new areas of centrality.
Space is a social product and a social producer. The main aim of this thesis is to reveal ‘the process of totalitarian city making in Pyongyang’, especially in the light of the interaction between the power and urban space.
The totalitarian city of Pyongyang was born out of modernization in the process of the formation of the masses. During the growth of colonial capitalism and Christian liberal ideas, Pyongyang was modernized and displayed the characteristics of a modern city shaped by industrialization and urbanization. Under Japanese colonial capitalism, peasants, women, and slaves became the first masses and urban poor; they later transformed into the mob, whose violence was finally demonstrated during the Anti-Chinese Riot.
After the 1945 independence, Kim’s regime formed the one-party state with a cry for revolution. They produced an atmosphere of imminent war to instill fear and hatred into the psyche of Pyongyang citizens. The regime eliminated all political opponents in 1967 and finally declared the totalitarian ideology in 1974. During this process, Pyongyang demonstrated two main characteristics of a totalitarian city: the space of terror and of ideology. The space of terror produces the fear of death and the space of ideology controls the thought and life of citizens.
After the market entered North Korean society, the regime used the strategy of location exchange to maintain Kim's controlling power. The camp, the market, and the Foreign Currency Shop were effective tools for providing gifts to the executive elite. However, the market also produces a desire not only for consumption but also for freedom and truth; it is tearing down the foundation of the totalitarian city of Pyongyang.
This research focuses primarily on the interaction between political power and urban space. In the process of making a totalitarian city, power produced urban space, which in turn influenced the psyche of Pyongyang's citizens. Even though this spatial transition created the totalitarian city and helped maintain political power, it also produced intended and unintended social change in Pyongyang society.
The Marmara Region (NW Turkey) has experienced significant earthquakes (M > 7.0) to date, and a destructive earthquake is also expected in the region. To determine the effect of the site-specific design spectrum, eleven provinces located in the region were chosen according to the Turkish Earthquake Building Code updated in 2019. Additionally, the differences between the previous and updated regulations were investigated. Peak Ground Acceleration (PGA) and Peak Ground Velocity (PGV) were obtained for each province using earthquake ground motion levels with 2%, 10%, 50%, and 68% probability of exceedance in 50 years. The PGA values in the region range from 0.16 to 0.7 g for earthquakes with a return period of 475 years. For each province, a sample reinforced-concrete building with two different numbers of stories but the same ground and structural characteristics was chosen. Static adaptive pushover analyses were performed for the sample building using each province's design spectrum. The variations in the earthquake and structural parameters were investigated according to geographical location. It was determined that the site-specific design spectrum significantly influences target displacements for performance-based assessments of buildings, owing to the seismicity characteristics of the studied location.
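The ground-motion levels cited above map to mean return periods through the standard Poisson relation T = -t / ln(1 - p). As a quick check (a sketch of the standard relation, not code from the study), the 10%-in-50-years level yields exactly the 475-year return period mentioned in the abstract:

```python
import math

def return_period(p_exceed, t_years=50):
    """Mean return period for a probability of exceedance p_exceed
    within t_years, assuming Poisson earthquake occurrence."""
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.02, 0.10, 0.50, 0.68):
    print(f"{p:.0%} in 50 years -> {return_period(p):.0f}-year return period")
```

The four probabilities give roughly 2475-, 475-, 72-, and 44-year return periods, matching the common code-level ground-motion definitions.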
The "Stapelhaus": Experimental building project on the campus of the Bauhaus-Universität Weimar
(2020)
The project is a cooperation between the bauhaus.ifex and MFPA Weimar and is intended to develop step by step as an experimental student village. Special focus is given to sustainability and construction using different building materials. For the construction of the first room module, CemCel was chosen as a new, lightweight and fibre-based building material.
Synergistic Framework for Analysis and Model Assessment in Bridge Aerodynamics and Aeroelasticity
(2020)
Wind-induced vibrations often represent a major design criterion for long-span bridges. This work deals with the assessment and development of models for aerodynamic and aeroelastic analyses of long-span bridges.
Computational Fluid Dynamics (CFD) and semi-analytical aerodynamic models are employed to compute the bridge response due to both turbulent and laminar free-stream. For the assessment of these models, a comparative methodology is developed that consists of two steps, a qualitative and a quantitative one. The first, qualitative, step involves an extension
of an existing approach based on Category Theory and its application to the field of bridge aerodynamics. Initially, the approach is extended to consider model comparability and completeness. Then, the complexity of the CFD and twelve semi-analytical models are evaluated based on their mathematical constructions, yielding a diagrammatic representation of model quality.
In the second, quantitative, step of the comparative methodology, the discrepancy of a system response quantity for time-dependent aerodynamic models is quantified using comparison metrics for time-histories. Nine metrics are established on a uniform basis to quantify the discrepancies in local and global signal features that are of interest in bridge aerodynamics. These signal features involve quantities such as phase, time-varying frequency and magnitude content, probability density, non-stationarity, and nonlinearity.
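The nine metrics themselves are developed in the thesis; purely to illustrate the idea of reducing the discrepancy between two time histories to a scalar (this is not necessarily one of the thesis's metrics), a relative magnitude metric can be written as:

```python
import numpy as np

def magnitude_discrepancy(ref, comp):
    """Relative L2 discrepancy between a reference and a comparison
    time history; 0 means the signals are identical."""
    ref = np.asarray(ref, dtype=float)
    comp = np.asarray(comp, dtype=float)
    return np.linalg.norm(comp - ref) / np.linalg.norm(ref)

t = np.linspace(0.0, 10.0, 1000)
reference = np.sin(2.0 * np.pi * t)
comparison = 1.1 * np.sin(2.0 * np.pi * t)  # 10% amplitude error
print(round(magnitude_discrepancy(reference, comparison), 3))  # 0.1
```

Metrics for phase, frequency content, or non-stationarity follow the same pattern but operate on transformed versions of the signals rather than on the raw samples.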
The two-dimensional (2D) Vortex Particle Method is used for the discretization of the Navier-Stokes equations including a Pseudo-three dimensional (Pseudo-3D) extension within an existing CFD solver. The Pseudo-3D Vortex Method considers the 3D structural behavior for aeroelastic analyses by positioning 2D fluid strips along a line-like structure. A novel turbulent Pseudo-3D Vortex Method is developed by combining the laminar Pseudo-3D VPM and a previously developed 2D method for the generation of free-stream turbulence. Using analytical derivations, it is shown that the fluid velocity correlation is maintained between the CFD strips.
Furthermore, a new method is presented for the determination of the complex aerodynamic admittance under deterministic sinusoidal gusts using the Vortex Particle Method. The sinusoidal gusts are simulated by modeling the wakes of flapping airfoils in the CFD domain with inflow vortex particles. Positioning a section downstream yields sinusoidal forces that are used for determining all six components of the complex aerodynamic admittance. A closed-form analytical relation is derived, based on an existing analytical model. With this relation, the inflow particles’ strength can be related with the target gust amplitudes a priori.
The developed methodologies are combined in a synergistic framework, which is applied to both fundamental examples and practical case studies. Where possible, the results are verified and validated. The outcome of this work is intended to shed some light on the complex wind–bridge interaction and suggest appropriate modeling strategies for an enhanced design.
In Germany, bridges have an average age of 40 years. A bridge consumes between 0.4% and 2% of its construction cost per year over its entire life cycle. This means that up to 80% of the construction cost is additionally needed for operation, inspection, maintenance, and demolition. Current practices rely either on paper-based inspections or on abstract specialist software. Every application in the inspection and maintenance sector uses its own data model for structures, inspections, defects, and maintenance. As a result, data and properties have to be transferred manually, or a converter is necessary for every data exchange between two applications. To overcome this issue, an adequate model standard for inspections, damage, and maintenance is necessary. Modern 3D models may serve as a single source of truth, as suggested by the Building Information Modeling (BIM) concept. Further, these models offer a clear visualization of the built infrastructure and improve not only the planning and construction phases but also the operation phase of construction projects. BIM is established mostly in the Architecture, Engineering, and Construction (AEC) sector to plan and construct new buildings. Currently, BIM does not cover the whole life cycle of a building, and especially not inspection and maintenance. Creating damage models requires the building model first, because a defect depends on the building component, its properties, and its material. Hence, a building information model is necessary to draw meaningful conclusions from damage information. This paper analyzes the requirements arising from practice and the research that has been done on modeling damage and related information for bridges. With a look at damage categories and use cases related to inspection and maintenance, the scientific literature is discussed and synthesized. Finally, research gaps and needs are identified and discussed.
Temporary changes in precipitation may lead to sustained and severe droughts or massive floods in different parts of the world. Knowledge of precipitation variability can effectively help water-resources decision-makers in water resources management. Large-scale circulation drivers have a considerable impact on precipitation in different parts of the world. In this research, the impact of the El Niño-Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO), and the North Atlantic Oscillation (NAO) on seasonal precipitation over Iran was investigated. For this purpose, 103 synoptic stations with at least 30 years of data were utilized. The Spearman correlation coefficient between the indices in the previous 12 months and seasonal precipitation was calculated, and the meaningful correlations were extracted. Then, the month in which each of these indices has the highest correlation with seasonal precipitation was determined. Finally, the overall increase or decrease in seasonal precipitation due to each of these indices was calculated. The results indicate that the Southern Oscillation Index (SOI), the NAO, and the PDO have the strongest impact on seasonal precipitation, in that order. These indices have the highest impact on precipitation in winter, autumn, spring, and summer, respectively. The SOI affects winter precipitation in the opposite direction to the PDO and NAO, while in the other seasons each index has its own specific impact on seasonal precipitation. Generally, all indices in different phases may decrease seasonal precipitation by up to 100%; however, seasonal precipitation may also increase by more than 100% in different seasons due to these indices. The results of this study can be used effectively in water resources management and especially in dam operation.
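The core statistical step, a Spearman rank correlation between an antecedent monthly index value and a seasonal precipitation total, can be sketched as follows. The data here are synthetic with a built-in monotone link; the study itself uses the 103 station records:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks
    (valid for ties-free data, as with continuous measurements)."""
    rank_x = np.argsort(np.argsort(x))
    rank_y = np.argsort(np.argsort(y))
    return float(np.corrcoef(rank_x, rank_y)[0, 1])

rng = np.random.default_rng(0)
# hypothetical 30-year record: one antecedent-month index value per year
index = rng.normal(size=30)
# synthetic seasonal precipitation tied to the index, plus noise
precip = 50.0 + 20.0 * index + rng.normal(scale=5.0, size=30)

print(round(spearman_rho(index, precip), 2))
```

Repeating this for each of the 12 antecedent months and picking the month with the largest significant correlation mirrors the selection step described in the abstract.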
The purpose of this study is to develop self-contained methods for obtaining smooth meshes which are compatible with isogeometric analysis (IGA). The study contains three main parts. We start by developing a better understanding of shapes and splines through the study of an image-related problem. Then we proceed towards obtaining smooth volumetric meshes of the given voxel-based images. Finally, we treat the smoothness issue on the multi-patch domains with C1 coupling. Following are the highlights of each part.
First, we present a B-spline convolution method for the boundary representation of voxel-based images. We adopt a filtering technique to compute the B-spline coefficients and the gradients of the images efficiently. We then apply the B-spline convolution to develop a non-rigid image registration method. The proposed method is in some sense 'isoparametric', as all computation is done within the B-spline framework. In particular, updating the images via B-spline composition promotes a smooth transformation map between the images. We show possible medical applications of our method by applying it to the registration of brain images.
Secondly, we develop a self-contained volumetric parametrization method based on the B-spline boundary representation. We aim to convert given voxel-based data into a matching C1 representation with hierarchical cubic splines. The concept of the osculating circle is employed to enhance the geometric approximation; this is achieved with a single template and linear transformations (scaling, translation, and rotation), without the need to solve an optimization problem. Moreover, we use Laplacian smoothing and refinement techniques to avoid irregular meshes and to improve mesh quality. We show with several examples that the method is capable of handling complex 2D and 3D configurations. In particular, we parametrize the 3D Stanford bunny, which contains irregular shapes and voids.
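One ingredient mentioned above, Laplacian smoothing, is easy to illustrate in isolation. The toy example below smooths a 1-D polyline by moving each interior vertex toward the average of its neighbours; the thesis applies the same idea to volumetric meshes:

```python
import numpy as np

def laplacian_smooth(pts, iterations=10, lam=0.5):
    """Simple Laplacian smoothing of an open polyline: each interior
    vertex moves a fraction lam toward the midpoint of its two
    neighbours; the end points stay fixed."""
    pts = np.asarray(pts, dtype=float).copy()
    for _ in range(iterations):
        avg = 0.5 * (pts[:-2] + pts[2:])       # neighbour midpoints
        pts[1:-1] += lam * (avg - pts[1:-1])   # relax interior vertices
    return pts

zigzag = np.array([[0, 0], [1, 1], [2, -1], [3, 1], [4, 0]], float)
smoothed = laplacian_smooth(zigzag)
print(smoothed[2])  # the middle vertex is pulled toward the axis
```

Repeated relaxation flattens the oscillation, which is exactly the behavior used to regularize irregular mesh regions before refinement.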
Finally, we propose the Bézier ordinates approach and the splines approach for C1 coupling. In the first approach, the new basis functions are defined in terms of the Bernstein-Bézier polynomials. In the second approach, the new basis is defined as a linear combination of C0 basis functions. The methods are not limited to planar or bilinear mappings. They allow the modeling of solutions to fourth-order partial differential equations (PDEs) on complex geometric domains, provided that the given patches are G1 continuous. Both methods have their advantages: in particular, the Bézier approach offers more degrees of freedom, while the spline approach is more computationally efficient. In addition, we propose partial degree elevation to overcome the C1-locking issue caused by over-constraining the solution space. We demonstrate the potential of the resulting C1 basis functions for applications in IGA involving fourth-order PDEs, such as those appearing in Kirchhoff-Love shell models, the Cahn-Hilliard phase-field equation, and biharmonic problems.
Smart Cities and Mobility Stations: Lessons learned from the Smarter Together in Vienna and Munich
(2020)
With an increasing urban population and urban problems arising from unplanned growth, several projects promoting sustainable urban development have emerged. Smart mobility strategies, such as shared mobility and mobility stations, represent some of the solutions for promoting changes in travel behavior. Despite their beneficial impacts, however, the implementation of such infrastructure is criticized for not addressing current urban issues and for often disregarding knowledge about urban space and its functioning.
In this context, Smarter Together, a joint research and innovation project funded through the European Union's H2020 program, was implemented. The project selected three lighthouse cities to test and upscale innovative solutions: Vienna, Munich, and Lyon.
This master thesis presents the main characteristics of the mobility station systems implemented in Vienna and Munich within the scope of the Smarter Together project. Its main goal is to share what can be learned from their experiences while critically examining the concept of smart cities. The thesis identifies important aspects to take into account when planning, implementing, and operating mobility stations, and provides an understanding of smart cities and smart mobility that goes beyond the adoption of technology. Several methods were combined for its development, such as analysis of quantitative secondary data, observational studies, survey forms, explorative expert interviews, and a literature review.
This work demonstrates that Smarter Together has a cutting-edge scope and contributed greatly to research and innovation by creating living laboratories to test the application of technology in the urban environment. From the perspective of the mobility station assessment, however, several caveats emerged. In short, many lessons could be learned; they are presented throughout this work with the aim of contributing to the improvement of the mobility stations implemented in the project areas in Munich and Vienna, as well as inspiring other cities in Europe and worldwide.
In this thesis, a generic model for the post-failure behavior of concrete in tension is proposed. A mesoscale model of concrete representing the heterogeneous nature of concrete is formulated. The mesoscale model is composed of three phases: aggregate, mortar matrix, and the Interfacial Transition Zone between them. Both local and non-local formulations of the damage are implemented and the results are compared. Three homogenization schemes from the literature are employed to obtain the homogenized constitutive relationship for the macroscale model. Three groups of numerical examples are provided.
Hydrological drought forecasting plays a substantial role in water resources management, as hydrological drought strongly affects water allocation and hydropower generation. In this research, short-term hydrological drought was forecasted based on the hybridization of novel nature-inspired optimization algorithms with Artificial Neural Networks (ANN). For this purpose, the Standardized Hydrological Drought Index (SHDI) and the Standardized Precipitation Index (SPI) were calculated for one-, three-, and six-month aggregation periods. Then, three scenarios were proposed for SHDI forecasting, and 36 input-output combinations were extracted based on cross-correlation analysis. In the next step, newly proposed optimization algorithms, including the Grasshopper Optimization Algorithm (GOA), the Salp Swarm Algorithm (SSA), Biogeography-Based Optimization (BBO), and Particle Swarm Optimization (PSO), were hybridized with the ANN and utilized for SHDI forecasting, and the results were compared to those of the conventional ANN. Results indicated that the hybridized models outperformed the conventional ANN, with PSO performing best among the optimization algorithms. The best models forecasted SHDI1 with R² = 0.68 and RMSE = 0.58, SHDI3 with R² = 0.81 and RMSE = 0.45, and SHDI6 with R² = 0.82 and RMSE = 0.40.
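As a rough sketch of how such an index is aggregated and standardized: the formal SPI/SHDI fits a probability distribution to the aggregates before transforming to a standard normal, so the z-score shortcut below is only illustrative, and the gamma-distributed flows are synthetic:

```python
import numpy as np

def standardized_index(series, window):
    """Simplified standardized drought index: aggregate the monthly
    series over `window` months, then z-score the aggregates.
    (The formal SPI/SHDI fits a distribution first; this z-score
    variant is only a rough illustration.)"""
    s = np.asarray(series, dtype=float)
    agg = np.convolve(s, np.ones(window), mode="valid")  # running sums
    return (agg - agg.mean()) / agg.std()

rng = np.random.default_rng(1)
flows = rng.gamma(shape=2.0, scale=10.0, size=120)  # 10 years, monthly
shdi3 = standardized_index(flows, window=3)          # 3-month scale
print(shdi3.shape)  # one value per 3-month aggregate
```

Negative values of the resulting index flag drier-than-normal periods; the forecasting models in the study predict these index values one step ahead.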
What you are about to read is the very last issue of the ZMK. Since our overall research enterprise, the IKKM, has to cease all of its activities due to the end of its twelve years’ funding by the German federal government, the ZMK will also come to an end. Its last topic, Schalten und Walten has also been the subject of the concluding biannual conference of the IKKM, and we hope it will be a fitting topic to resume the research of the IKKM on Operative Ontologies.
Although this final issue is in English, we decided to leave its title in German: Schalten und Walten. As is the case for the name of the IKKM (Internationales Kolleg für Kulturtechnikforschung und Medienphilosophie), the term seems untranslatable to us, and not only for the poetic reason of the rhyming sound of the words. Switching and Ruling might be accepted as English versions, but an unbridgeable difference remains. In German, Schalten und Walten is a rather common and widespread idiom found in everyday life. Whoever, the idiom stipulates, is able to execute Schalten und Walten has the power to act, freedom of decision, and power of disposition.
Although both terms are mentioned together and belong together in the German expression Schalten und Walten, they are nevertheless complements to each other. They both refer to the exercise and existence of domination, disposal or power, but they nonetheless designate two quite different modes of being. Schalten is not so much sheer command over something, but government or management. It is linked to control, intervention and change, in short: it is operative and goes along with distinctive measures and cause-and-effect relations. The English equivalent switching reflects this more or less adequately.
The spread of mobile smartphones, and especially their ubiquitous localization technologies, is lastingly changing how people navigate through space. Parallel to the rapid development of everyday carried devices, the longer-running development of virtual-reality technology is transitioning into an extended and augmented mixed reality. Within this field of tension, this thesis investigates to what extent direction-bound, binaurally reproduced stereophony can influence human movement in space, and explores the potential of rediscovering a comparatively long-known technique. As part of this work, the author developed a binaural mobile application for direction-bound stereophony with which virtual moving or static audio hotspots can be placed in space. Thus a virtual or actual sound can be located to the left or right of a person, or 30 meters in front of them. Through the binaural reproduction of the sound sources, computed in real time and played over stereo headphones, these spatially located sounds can be perceived three-dimensionally with two ears, similar to spatial vision with two eyes. Using several localized sound sources as a soundscape creates an augmented auditory reality that extends physical reality. The position and navigation of the user are influenced interactively and cybernetically through binaural volume modulation (the volume increases as the distance to the source decreases) and stereo panning with delay modulation (the direction is spatially located left-right-front via a stereo signal on both ears).
Guided by their interest in the audible virtual sound sources, users navigate through a dynamically generated, three-dimensional acoustic space that is simultaneously a virtual and a cybernetic space, since the representation of the sounds is adapted to the users' movement and orientation in space. This work investigates whether human movement can be influenced by (virtual) sounds and how large or measurable this influence is. Not all artistic, architectural, and philosophical questions can be discussed within the scope of this text, although they remain of interest as questions of spatial theory. The main subject of this work is to explore whether direction-bound stereophony can make a relevant contribution to human navigation, mainly on foot, in urban areas, primarily outdoors. The first part, 'Space and Sound', discusses spatial-theoretical considerations on human movement in space, conceptions of space, spatial sounds and sound perception, as well as the development of stereophonic apparatuses and aspects of augmented audio reality. The second part presents three demonstrators as application scenarios and three outdoor evaluations. The tests examine whether the method is suitable for pedestrian navigation and to what extent users' movement behavior can be influenced. The evaluations show that stereophonic sounds are fundamentally suitable as a navigation system, since a large majority of participants easily found the acoustically marked destinations. A clear influence on movement patterns is also evident, although it depends on individual interests and preferences.
Finally, the results of the investigations are discussed in the context of the presented theories, and the potential of stereophonic applications is treated in an outlook. When designing, producing, and applying mobile systems, the different mental and spatial models and conceptions of developers and users must be taken into account. Since a comprehensive transdisciplinary perspective requires clear terminology, arguments for a spatial-theoretical vocabulary are discussed. These are highly relevant for a design-oriented use of direction-bound stereophony, especially in the context of mobile navigation through acoustically augmented spaces.
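The described volume and panning modulation can be caricatured in a few lines. This is a toy model with assumed geometry (listener at the origin facing +y) and assumed gain laws, far simpler than true binaural rendering with head-related transfer functions:

```python
import math

def binaural_gain(listener, source, ref_dist=1.0):
    """Toy distance/direction model for one virtual sound source:
    overall gain falls off with distance, and an interaural level
    difference pans the source left/right. Assumes the listener is
    at `listener` facing the +y direction."""
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = max(math.hypot(dx, dy), ref_dist)
    gain = ref_dist / dist                 # louder when closer
    azimuth = math.atan2(dx, dy)           # 0 = straight ahead
    pan = math.sin(azimuth)                # -1 left ... +1 right
    return gain * (1 - pan) / 2, gain * (1 + pan) / 2  # (left, right)

left, right = binaural_gain((0.0, 0.0), (3.0, 4.0))  # source front-right
print(left < right)  # True: the right ear is louder
```

Updating these two gains in real time as the listener moves and turns is, in essence, the cybernetic feedback loop the thesis describes.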
Radikale Planung
(2020)
A new large‐field, high‐sensitivity, single‐mirror coincident schlieren optical instrument has been installed at the Bauhaus‐Universität Weimar for the purpose of indoor air research. Its performance is assessed by the non‐intrusive measurement of the thermal plume of a heated manikin. The schlieren system produces excellent qualitative images of the manikin's thermal plume and also quantitative data, especially schlieren velocimetry of the plume's velocity field that is derived from the digital cross‐correlation analysis of a large time sequence of schlieren images. The quantitative results are compared with thermistor and hot‐wire anemometer data obtained at discrete points in the plume. Good agreement is obtained, once the differences between path‐averaged schlieren data and planar anemometry data are reconciled.
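The cross-correlation step underlying the schlieren velocimetry can be illustrated in one dimension; the actual analysis correlates 2-D interrogation windows between successive schlieren images, but the principle of locating the correlation peak is the same:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Integer shift of frame_b relative to frame_a, taken from the
    peak of their cross-correlation (a 1-D analogue of the 2-D
    window correlation used in schlieren velocimetry)."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

x = np.arange(200)
frame1 = np.exp(-0.01 * (x - 80) ** 2)  # a blob in the first frame
frame2 = np.roll(frame1, 5)             # moved 5 samples by the next frame
print(estimate_shift(frame1, frame2))   # 5
```

Dividing such displacements by the inter-frame time converts them into the velocity field reported for the plume.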
Schreibwalzer: An aura of inviolability surrounds canonical texts in particular, and it seems as if one needs a certain kind of legitimation to approach them critically or to place one's own thoughts on the same level. In the prelude, we physically feel our way toward these heavyweights, and beyond, with scissors, pencil, and glue stick; with respect, but without intimidation.
Pressure fluctuations beneath hydraulic jumps potentially endanger the stability of stilling basins. This paper deals with the mathematical modeling of laboratory-scale experimental results to estimate extreme pressures. Experiments were carried out on a smooth stilling basin beneath free hydraulic jumps downstream of an Ogee spillway. From the probability distribution of the measured instantaneous pressures, pressures with different probabilities could be determined. It was verified that the maximum pressure fluctuations, and the negative pressures, are located near the spillway toe, while the minimum pressure fluctuations are located downstream of the hydraulic jumps. It was possible to assess the cumulative curves of the pressure data for the characteristic points along the basin and for different Froude numbers. To benchmark the results, dimensionless forms of the statistical parameters, including the mean pressures (P*m), the standard deviations of the pressure fluctuations (σ*X), the pressures with different non-exceedance probabilities (P*k%), and the statistical coefficient of the probability distribution (Nk%), were assessed. It was found that an existing method can be used to interpret the present data, and the pressure distribution in similar conditions, by using new second-order fractional relationships for σ*X and Nk%. The values of the Nk% coefficient indicated a single mean value for each probability.
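The statistical parameters can be sketched for a single pressure tap. The record below is synthetic Gaussian data, and Nk% is taken in its commonly used form (P_k% - P_m)/σ_X; the paper's actual records and fitted relationships come from the laboratory measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic instantaneous pressure record at one basin tap (kPa);
# real records come from the laboratory transducers
p = rng.normal(loc=12.0, scale=3.0, size=50_000)

p_mean = p.mean()                    # P*m analogue
sigma_x = p.std()                    # sigma*X analogue
p1, p99 = np.percentile(p, [1, 99])  # 1% and 99% non-exceedance pressures

# statistical coefficient, commonly N_k% = (P_k% - P_m) / sigma_X
n1 = (p1 - p_mean) / sigma_x
n99 = (p99 - p_mean) / sigma_x
print(round(n1, 2), round(n99, 2))   # near -2.33 and +2.33 for Gaussian data
```

For skewed records near the spillway toe, the two coefficients would no longer be symmetric, which is what the probability-dependent Nk% relationships capture.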
In this paper, an artificial neural network is implemented to predict the thermal conductivity ratio of TiO2-Al2O3/water nanofluid. TiO2-Al2O3/water, an innovative type of nanofluid, was synthesized by the sol-gel method. The results indicated that, at 1.5 vol.%, the nanofluid enhanced the thermal conductivity by up to 25%. It was shown that the heat transfer coefficient increased linearly with nanoparticle concentration, but its variation with temperature was nonlinear. It should be noted that an increase in concentration may cause the particles to agglomerate, which reduces the thermal conductivity. An increase in temperature also increases the thermal conductivity, due to intensified Brownian motion and collision of particles. To predict the thermal conductivity of the TiO2-Al2O3/water nanofluid as a function of volumetric concentration and temperature, SOM (self-organizing map) and BP-LM (Back-Propagation Levenberg-Marquardt) algorithms were used. Based on the results obtained, these algorithms can be considered an exceptional tool for predicting thermal conductivity. The correlation coefficients were 0.938 and 0.98 for the SOM and BP-LM algorithms, respectively, which is highly acceptable.
A novel combination of the ant colony optimization algorithm (ACO) and computational fluid dynamics (CFD) data is proposed for modeling multiphase chemical reactors. The proposed intelligent model presents a probabilistic computational strategy for predicting various levels of three-dimensional bubble column reactor (BCR) flow. The results show close agreement between the ant colony predictions and the CFD data in different sections of the BCR.
The assessment of wind-induced vibrations is considered vital for the design of long-span bridges. The aim of this research is to develop a methodological framework for robust and efficient prediction strategies for complex aerodynamic phenomena using hybrid models that employ numerical analyses as well as meta-models. Here, an approach to predict motion-induced aerodynamic forces is developed using an artificial neural network (ANN). The ANN is implemented in the classical formulation and trained with a comprehensive dataset obtained from computational fluid dynamics forced-vibration simulations. The input to the ANN is the response time histories of a bridge section, whereas the output is the motion-induced forces. The developed ANN has been tested on training and test data for different cross-section geometries, providing promising predictions. The prediction is also performed for an ambient response input with multiple frequencies. Moreover, the trained ANN for aerodynamic forcing is coupled with the structural model to perform fully coupled fluid-structure interaction analysis to determine the aeroelastic instability limit. The sensitivity of the ANN parameters to the model prediction quality and efficiency is also highlighted. The proposed methodology has wide application in the analysis and design of long-span bridges.
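The input/output arrangement described above (motion histories in, forces out) can be sketched as a sliding-window dataset builder. The window length and the toy linear "force" relation below are assumptions for illustration only, not the paper's CFD data:

```python
# Sketch of arranging a response time history into ANN training samples:
# each input is a short window of past displacements, each target the
# motion-induced force at the window's end.

def windowed_dataset(displacement, force, window):
    X, y = [], []
    for t in range(window, len(displacement)):
        X.append(displacement[t - window:t])  # past motion as features
        y.append(force[t])                    # force target at time t
    return X, y

disp = [0.1 * t for t in range(10)]   # toy ramp response
frc = [2.0 * d for d in disp]         # toy linear "aerodynamic" force
X, y = windowed_dataset(disp, frc, window=3)
```

Any regression model, such as the classical feed-forward ANN used in the paper, can then be fitted to (X, y).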
This paper proposes a practice-theoretical approach to journalism research as an alternative and innovative perspective on digital journalism's current empirical challenges. The practice-theoretical approach is introduced by demonstrating its explanatory power in relation to demarcation problems, technological changes, economic challenges, and challenges to journalism's legitimacy. Its respective advantages in dealing with these problems are explained and then compared to established journalism theories. The particular relevance of the theoretical perspective is due to (1) its central decision to observe journalistic practices, (2) the transgression of conventional journalistic boundaries, (3) the denaturalization of journalistic norms and laws, (4) the explicit consideration of a material, socio-technical dimension of journalism, (5) a focus on the conflicting relationship between journalistic practices and media management practices, and (6) prioritizing order generation over stability.
Cities without growth: until now a barely imaginable vision. Yet climate change, the squandering of resources, growing social inequality, and many other future threats fundamentally call into question growth as the supposed cure-all. How do we want to live together, today and tomorrow? How do we shape a good life for all in the city? While these questions are already being answered tentatively in individual niches, comprehensive designs and transformation approaches that give contour to a fundamentally different, solidary city are still lacking. The project Postwachstumsstadt (post-growth city) ventures this attempt.
This book brings together conceptual and pragmatic aspects from various fields of urban politics that point out and interconnect new paths. The contributions discuss urban growth crises, transformative planning, and conflicts over the power to shape the city. Not least, the question of the role of urban utopias is posed anew. The aim is to spark a long-overdue debate on how the necessary urban turnarounds can be realized locally through a social-ecological reorientation.
The performance of ductless personalized ventilation (DPV) was compared to the performance of a typical desk fan since they are both stand-alone systems that allow the users to personalize their indoor environment. The two systems were evaluated using a validated computational fluid dynamics (CFD) model of an office room occupied by two users. To investigate the impact of DPV and the fan on the inhaled air quality, two types of contamination sources were modelled in the domain: an active source and a passive source. Additionally, the influence of the compared systems on thermal comfort was assessed using the coupling of CFD with the comfort model developed by the University of California, Berkeley (UCB model). Results indicated that DPV performed generally better than the desk fan. It provided better thermal comfort and showed a superior performance in removing the exhaled contaminants. However, the desk fan performed better in removing the contaminants emitted from a passive source near the floor level. This indicates that the performance of DPV and desk fans depends highly on the location of the contamination source. Moreover, the simulations showed that both systems increased the spread of exhaled contamination when used by the source occupant.
Securing medical care in the Federal Republic of Germany has been the subject of increasing debate for several years. Particularly in rural regions, guaranteeing needs-based medical care is problematic. This has long since become an issue for society as a whole. Lena Wild's bachelor's thesis examines the interface between spatial planning and medical-care planning, asking: "To what extent do current planning activities and funding programmes make a positive contribution to securing medical care in rural Thuringia?"
In addition to a literature and data analysis, the thesis centres on expert interviews with various municipal and medical actors from selected districts in Thuringia.
After situating medical care in the context of spatial planning, an overview of the German health system and its particularities in eastern Germany is given. Subsequently, instruments for steering medical care are examined in more detail, with a particular focus on statutory needs-based planning of physician services as the central steering instrument, on the funding options in Thuringia, and on how they take effect. The thesis then turns to rural areas and the challenges they pose for securing medical care, as well as to communication and cooperation among the actors involved.
The thesis is to be understood as an excursion from spatial planning into a sectoral planning discipline. Medical-care planning and spatial planning interact, and knowing the instruments and mechanisms of the other discipline creates added value for reaching shared goals such as comprehensive medical care and, with it, sustainable spatial development.
Welfare‐state transformation and entrepreneurial urban politics in Western welfare states since the late 1970s have yielded converging trends in the transformation of the dominant Fordist paradigm of social housing in terms of its societal function and institutional and spatial form. In this article I draw from a comparative case study on two cities in Germany to show that the resulting new paradigm is simultaneously shaped by the idiosyncrasies of the country's national housing regime and local housing policies. While German governments have successively limited the societal function of social housing as a legitimate instrument only for addressing exceptional housing crises, local policies on providing and organizing social housing within this framework display significant variation. However, planning and design principles dominating the spatial forms of social housing have been congruent. They may be interpreted as both an expression of the marginalization of social housing within the restructured welfare housing regime and a tool of its implementation according to the logics of entrepreneurial urban politics.
In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has been increasing in recent years. Automated image-based inspections of the structural surface provide valuable data from which quantitative information about deterioration, such as crack patterns, can be extracted. However, the knowledge gain comes from processing this information in a structural context, i.e. relating the damage artifacts to building components; this enables the transfer to structural analysis. This approach sets two further requirements: the availability of structural bridge information and a standardized storage format for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype therefore allows interaction at subsequent model stages. This has the advantage that interpretation steps can be separated into a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.
This study aims to evaluate a new approach to modeling gully erosion susceptibility (GES) based on a deep learning neural network (DLNN) model and an ensemble of the particle swarm optimization (PSO) algorithm with DLNN (PSO-DLNN), comparing these approaches with common artificial neural network (ANN) and support vector machine (SVM) models in the Shirahan watershed, Iran. For this purpose, 13 independent variables affecting GES in the study area were prepared, namely altitude, slope, aspect, plan curvature, profile curvature, drainage density, distance from a river, land use, soil, lithology, rainfall, stream power index (SPI), and topographic wetness index (TWI). A total of 132 gully erosion locations were identified during field visits. To implement the proposed model, the dataset was divided into two categories of training (70%) and testing (30%). The results indicate that the area under the curve (AUC) value from the receiver operating characteristic (ROC) on the testing dataset of PSO-DLNN is 0.89, which indicates superb accuracy. The remaining models achieve optimal accuracy with results similar to the PSO-DLNN model; the AUC values from ROC of DLNN, SVM, and ANN for the testing datasets are 0.87, 0.85, and 0.84, respectively. The proposed ensemble thus increased the efficiency of GES prediction. It can be concluded that the DLNN model and its ensemble with the PSO algorithm can be used as a novel and practical method to predict gully erosion susceptibility, which can help planners and managers to manage and reduce the risk of this phenomenon.
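The AUC criterion used to rank the models above can be computed directly from its rank interpretation: the probability that a randomly chosen positive (gully) location scores higher than a randomly chosen negative one. The scores and labels below are invented for illustration:

```python
# Minimal AUC sketch via pairwise comparisons of positive vs. negative scores.

def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    # Count positive-beats-negative pairs, ties counted as half.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # model susceptibility scores
labels = [1, 1, 0, 1, 0, 0]               # 1 = observed gully location
a = auc(scores, labels)
```

For large datasets a rank-based formula or a library routine is preferable, but the pairwise form shows exactly what an AUC of 0.89 claims.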
Material properties play a critical role in the manufacturing of durable products. Estimating precise characteristics at different scales requires complex and expensive experimental measurements. Computational methods can potentially provide a platform to determine fundamental properties before the final experiment. Multi-scale computational modeling covers various time and length scales, including the nano, micro, meso, and macro scales. These scales can be modeled separately or in correlation with coarser scales. Depending on the scales of interest, the right selection of multi-scale methods leads to reliable results at an affordable computational cost. The present dissertation deals with problems at various length and time scales using computational methods including density functional theory (DFT), molecular mechanics (MM), molecular dynamics (MD), and finite element (FE) methods.
Physical and chemical interactions at lower scales determine the properties at coarser scales. Modeling particle interactions and exploring fundamental properties are significant challenges of computational science. Finer-scale models require more computational effort due to the large number of interacting atoms/particles. To deal with this problem and treat a fine-scale (nano) problem as a coarse-scale (macro) one, we extended an atomic-continuum framework. The discrete atomic models are solved as a continuum problem using the computationally efficient FE method. The MM, or force field, method approximates a solution on the atomic scale based on a set of assumptions. In this method, atoms and bonds are modeled as harmonic oscillators, i.e., as a system of masses and springs. The negative gradient of the potential energy equals the force on each atom. In this way, each bond's total potential energy, including bonded and non-bonded contributions, is simulated as an equivalent structural strain energy. Finally, the chemical nature of the atomic bond is modeled as a piezoelectric beam element that is solved by the FE method.
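The harmonic mass-spring picture of a bond described above can be sketched in a few lines; the stiffness and equilibrium length below are placeholder values, not fitted force-field constants:

```python
# Harmonic bond model: E = 1/2 * k * (r - r0)^2, with the force as the
# negative gradient of the potential energy (the relation stated above).

def bond_energy(r, k, r0):
    """Bond stretching energy at length r."""
    return 0.5 * k * (r - r0) ** 2

def bond_force(r, k, r0):
    """F = -dE/dr: restoring force toward the equilibrium length r0."""
    return -k * (r - r0)

k, r0 = 100.0, 1.4          # assumed stiffness and equilibrium bond length
E = bond_energy(1.5, k, r0) # energy of a slightly stretched bond
F = bond_force(1.5, k, r0)  # negative: pulls the bond back toward r0
```

In the FE analogue, this strain energy is carried by an equivalent beam element, which is what lets the discrete lattice be solved as a continuum.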
Exploring novel materials with unique properties is in demand for various industrial applications. During the last decade, many two-dimensional (2D) materials have been synthesized and have shown outstanding properties. Investigating probable defects arising during the formation/fabrication process and studying their strength under severe service conditions are critical tasks for exploring performance prospects. We studied various defects, including nanocracks, notches, and point vacancies (Stone-Wales defects), employing MD analysis. Classical MD has been used to simulate a considerable number of molecules at the micro- and meso-scales. Pristine and defective nanosheet structures were considered under uniaxial tensile loading at various temperatures using the open-source LAMMPS code. The results were visualized with the open-source software OVITO and VMD.
Quantum-based first-principles calculations are conducted at the electronic scale and are known as the most accurate ab initio methods. However, they are computationally too expensive to apply to large systems. We used density functional theory (DFT) to estimate the mechanical and electrochemical response of the 2D materials. The many-body Schrödinger equation describes the motion and interactions of the solid-state particles. A solid is described as a system of positive nuclei and negative electrons, all interacting electromagnetically, where the wave function describes the quantum state of the set of particles. However, dealing with the 3N coordinates of the electrons and nuclei, plus the N spin components of the electrons, makes the governing equation unsolvable for more than a few interacting atoms. Assumptions and theories such as the Born-Oppenheimer approximation, the Hartree-Fock mean field, and the Hohenberg-Kohn theorems are needed to treat this equation. First, the Born-Oppenheimer approximation reduces it to the electronic coordinates only. Then Kohn and Sham, building on the Hartree-Fock and Hohenberg-Kohn theories, assumed an equivalent fictitious system of non-interacting electrons, expressed as a functional of the electron density, whose ground-state energy equals that of the interacting system. Exchange-correlation energy functionals are responsible for satisfying the equivalence between the two systems. The exact form of the exchange-correlation functional is not known; however, there are widely used methods to derive functionals, such as the local density approximation (LDA), the generalized gradient approximation (GGA), and hybrid functionals (e.g., B3LYP). In our study, DFT was performed using the VASP code within the GGA/PBE approximation, and visualization/post-processing of the results was realized via the open-source software VESTA.
Extensive DFT calculations are conducted to assess the prospects of 2D nanomaterials as anode/cathode electrode materials for batteries. The performance of metal-ion batteries strongly depends on the design of novel electrode materials. Two-dimensional (2D) materials have attracted remarkable interest for use as electrodes in battery cells due to their excellent properties. Desirable battery energy storage systems (BESS) must offer high energy density, safe operation, and efficient production costs. Batteries are used in electronic devices and provide a solution to environmental issues by storing the intermittent energy generated from renewable wind or solar power plants. Therefore, exploring optimal electrode materials can improve storage capacity and charging/discharging rates, leading to the design of advanced batteries.
Our results at multiple scales highlight not only the efficiency of the proposed and employed methods but also the promising prospects of recently synthesized nanomaterials and their applications as anode materials. First, a novel approach was developed for modeling a 1D nanotube as a continuum piezoelectric beam element. The results converged and matched closely with those from experiments and other, more complex models. Then the mechanical properties of nanosheets were estimated, and the failure-mechanism results provide a useful guide for further use in prospective applications. Our results give a comprehensive and useful picture of the mechanical properties of nanosheets with and without defects. Finally, the mechanical and electrochemical properties of several 2D nanomaterials are explored for the first time. Their performance as anode materials shows high potential for manufacturing super-stretchable and ultrahigh-capacity battery energy storage systems (BESS), exhibiting better performance than the available commercial anode materials.
Rechargeable lithium-ion batteries (LIBs) play a very significant role in power supply and storage. In recent decades, LIBs have attracted tremendous attention in mobile communication, portable electronics, and electric vehicles. Furthermore, global warming has become a worldwide issue due to the ongoing production of greenhouse gases, motivating solutions such as renewable sources of energy. Solar and wind energy are the most important renewable sources, and as these technologies progress they will require batteries to store the produced power and to balance power generation and consumption. Nowadays, rechargeable batteries such as LIBs are considered one of the best solutions. They provide high specific energy and high rate performance, while their rate of self-discharge is low.
The performance of LIBs can be improved through the modification of battery characteristics. The size of the solid particles in the electrodes affects the specific energy and the cyclability of batteries. It can improve the lithium content of the electrode, which is a vital parameter for the capacity and capability of a battery. There exist different sources of heat generation in LIBs, such as the heat produced during electrochemical reactions and by the internal resistance of the battery. The size of the electrode's electroactive particles directly affects the heat produced in the battery. It will be shown that smaller solid particles enhance the thermal characteristics of LIBs.
Thermal issues such as overheating, temperature maldistribution in the battery, and thermal runaway have confined the applications of LIBs. Such thermal challenges reduce the life cycle of LIBs and may lead to dangerous conditions such as fire or even explosion. However, recent advances in the fabrication of advanced materials such as graphene and carbon nanotubes, with their extraordinary thermal conductivity and electrical properties, offer new opportunities to enhance battery performance. Since experimental work is expensive, our objective is to use computational methods to investigate the thermal issues in LIBs. Dissipating the heat produced in the battery can improve the cyclability and specific capacity of LIBs. In real applications, LIB packs consisting of several battery cells are used as the power source. It is therefore worthwhile to investigate the thermal characteristics of battery packs under charging/discharging cycles at different applied current rates. To remove the heat produced in batteries, they can be surrounded by materials with high thermal conductivity. Paraffin wax absorbs large amounts of energy thanks to its high latent heat; this absorption occurs at nearly constant temperature during the phase change. Moreover, the thermal conductivity of paraffin can be enhanced with nanomaterials such as graphene, CNTs, and fullerene to form a nanocomposite medium. Improving the heat dissipation from batteries in this way is a vital issue in battery thermal management systems. The application of two-dimensional (2D) materials has been on the rise since the exfoliation of graphene from bulk graphite. 2D materials are single layers of nanometre-scale thickness that show superior thermal, mechanical, and optoelectronic properties. They are potential candidates for energy storage and supply, particularly as electrode materials in lithium-ion batteries.
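As a hedged illustration of the heat generation and dissipation balance discussed above, a lumped single-cell energy balance can be sketched: Joule heating from the internal resistance competes with convective cooling, m*c*dT/dt = I^2*R - h*A*(T - T_amb). All parameter values are invented, and electrochemical reaction heat is ignored:

```python
# Lumped thermal model of one cell, integrated with explicit Euler steps.

def simulate_cell_temperature(I, R, h, A, m, c, T_amb, dt, steps):
    T = T_amb
    for _ in range(steps):
        q_gen = I * I * R             # Joule heat from internal resistance [W]
        q_loss = h * A * (T - T_amb)  # convective loss to surroundings [W]
        T += dt * (q_gen - q_loss) / (m * c)
    return T

# Invented parameters: 2 A discharge, 50 mOhm cell, modest air cooling.
T_end = simulate_cell_temperature(I=2.0, R=0.05, h=10.0, A=0.004,
                                  m=0.045, c=900.0, T_amb=25.0,
                                  dt=1.0, steps=600)
```

With these numbers the cell warms toward a steady-state rise of I^2*R / (h*A) = 5 K above ambient; a higher-conductivity surrounding (larger effective h*A, as with a paraffin nanocomposite) lowers that rise.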
The high thermal conductivity of graphene and graphene-like materials can play a significant role in the thermal management of batteries. However, defects always exist in nanomaterials, since there is no ideal fabrication process. Among the most important defects are nanocracks, which can dramatically weaken the mechanical properties of a material. The newly synthesized crystalline carbon nitride with the stoichiometry C3N has attracted much attention due to its extraordinary mechanical and thermal properties. Another nanomaterial is phagraphene, whose anisotropic mechanical characteristics are ideal for the production of nanocomposites.
Phagraphene shows ductile fracture behavior under uniaxial loading. It is worthwhile to investigate the thermo-mechanical properties of these materials in their pristine and defective states. We hope that the findings of our work will not only be useful for experimental and theoretical research but also help in designing advanced electrodes for LIBs.
Unmanned aircraft systems (UAS) show large potential for the construction industry. Their use in condition assessment has increased significantly due to technological and computational progress. UAS play a crucial role in developing a digital maintenance strategy for infrastructure, saving cost and effort while increasing safety and reliability. Automated visual UAS inspections of a building's condition are part of that strategy. The resulting images can be analyzed automatically to identify and localize damage to the structure that has to be monitored. Further interest in parts of a structure can arise from events like accidents or collisions. Areas of low interest, where low-resolution monitoring is sufficient, also exist.
From different requirements for resolution, different levels of detail can be derived. They require special image acquisition parameters that differ mainly in the distance between camera and structure. Areas with a higher level of detail require a smaller distance to the object, producing more images. This work proposes a multi-scale flight path planning procedure, enabling higher resolution requirements for areas of special interest, while reducing the number of required images to a minimum. Careful selection of the camera positions maintains the complete coverage of the structure, while achieving the required resolution in all areas. The result is an efficient UAS inspection, reducing effort for the maintenance of infrastructure.
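The link between a required level of detail and the camera-to-structure distance can be sketched via the ground sampling distance (GSD), which for a pinhole camera scales linearly with standoff distance. The camera parameters below are assumptions for illustration, not values from this work:

```python
# Each required resolution (GSD) fixes a maximum camera-to-surface distance:
# GSD = pixel_size * distance / focal_length  =>  distance = GSD * f / pixel.

def max_distance(gsd_required_mm, focal_length_mm, pixel_size_um):
    """Largest camera-to-surface distance (in m) that still meets the GSD."""
    pixel_size_mm = pixel_size_um / 1000.0
    return gsd_required_mm * focal_length_mm / pixel_size_mm / 1000.0

# Assumed camera: 16 mm lens, 4 um pixels.
d_coarse = max_distance(gsd_required_mm=1.0, focal_length_mm=16.0,
                        pixel_size_um=4.0)   # low-interest areas
d_fine = max_distance(gsd_required_mm=0.2, focal_length_mm=16.0,
                      pixel_size_um=4.0)     # areas of special interest
```

A finer level of detail forces the UAS closer to the surface and therefore multiplies the number of images, which is exactly the trade-off the multi-scale flight path planning exploits.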
How can journalistic offerings be financed sustainably? This remains the core question for media houses and journalistic start-ups when developing and building viable digital business models.
The authors of this volume provide a broad overview of the state of knowledge on paid content, platforms, and willingness to pay in journalism, and open up innovative perspectives on novel platform models as well as on the motives and needs of users of digital journalistic content. Based on empirical research, recommendations for the user-centred design of paid-content offerings and new perspectives on willingness to pay in digital journalism are developed, relevant to scholarship as well as to media practice.
Evaporation is a very important process and one of the most critical factors in agricultural, hydrological, and meteorological studies. Due to the interactions of multiple climatic factors, evaporation is a complex and nonlinear phenomenon to model; machine learning methods have therefore gained popularity in this realm. In the present study, four machine learning methods, Gaussian Process Regression (GPR), K-Nearest Neighbors (KNN), Random Forest (RF), and Support Vector Regression (SVR), were used to predict pan evaporation (PE). Meteorological data, including PE, temperature (T), relative humidity (RH), wind speed (W), and sunshine hours (S), were collected from 2011 through 2017. The accuracy of the studied methods was determined using the statistical indices of Root Mean Squared Error (RMSE), correlation coefficient (R), and Mean Absolute Error (MAE). Furthermore, Taylor diagrams were utilized for evaluating the accuracy of the models. The results showed that at the Gonbad-e Kavus, Gorgan, and Bandar Torkman stations, GPR with RMSE of 1.521 mm/day, 1.244 mm/day, and 1.254 mm/day, KNN with RMSE of 1.991 mm/day, 1.775 mm/day, and 1.577 mm/day, RF with RMSE of 1.614 mm/day, 1.337 mm/day, and 1.316 mm/day, and SVR with RMSE of 1.55 mm/day, 1.262 mm/day, and 1.275 mm/day performed appropriately in estimating PE values. GPR with input parameters T, W, and S for the Gonbad-e Kavus station, and GPR with input parameters T, RH, W, and S for the Gorgan and Bandar Torkman stations, gave the most accurate predictions and are proposed for precise estimation of PE. The findings indicate that PE values may be accurately estimated from a few easily measured meteorological parameters.
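The three error metrics used to rank the models (RMSE, MAE, and the correlation coefficient R) can be written directly from their definitions; the observed/predicted series below are invented, not station data:

```python
# Error metrics for model evaluation, implemented from their definitions.
import math

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def corr(obs, pred):
    mo = sum(obs) / len(obs)
    mp = sum(pred) / len(pred)
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)

observed = [3.0, 4.2, 5.1, 6.3, 7.0]    # e.g. measured PE in mm/day (toy)
predicted = [2.8, 4.5, 5.0, 6.0, 7.4]
```

RMSE penalizes large misses more strongly than MAE, which is why the two can rank models differently even on the same data.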
This contribution presents measurements and computations that accurately describe the temperature development in concrete cylinders under cyclic loading. The measurements were carried out on a test rig, the computations with the FEM program ANSYS. With the help of the temperature measurements, the simulations of the temperature development of the concrete cylinders could be validated for the concrete mix used. The investigations lead to the conclusion that, under cyclic loading of the specimen and the accompanying specimen strain, energy is dissipated, and that this dissipation is chiefly responsible for the heating of the specimen.
Medien/Denken/Um/Formatieren
(2020)
The media-studies practice of Um/Formatieren (re/formatting) was developed and tested in the winter semester 2017/18 with forty bachelor's students of media studies, theatre studies, and gender studies. The seminar dealt with concrete standard media formats, such as the carte de visite, the vinyl record, 16 mm or 35 mm film stock, digital file formats such as JPEG, GIF, and MP3, the video cassette, and paper formats such as DIN A0 or A8.
Acoustic travel-time TOMography (ATOM) allows the measurement and reconstruction of air temperature distributions. Due to limiting factors, such as the challenge of estimating the travel times of early reflections in the room impulse response, which depend heavily on the positions of the transducers inside the measurement area, ATOM has mainly been applied outdoors. To apply ATOM in buildings, this paper presents a numerical method to optimize the positions of the transducers. This optimization avoids reflection overlaps, leading to distinguishable travel times in the impulse response reflectogram. To increase the accuracy of the measured temperature within the tomographic voxels, an additional objective is added to the proposed numerical method to minimize the number of sound-path-free voxels, ensuring the best sound-ray coverage of the room. Subsequently, an experimental set-up was used to verify the proposed numerical method. The results indicate the positive impact of the optimized transducer positions on the distribution of ATOM temperatures.
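The physical relation ATOM rests on can be sketched in a few lines: for dry air the speed of sound depends on temperature, c = sqrt(gamma * R_s * T), so a measured travel time over a known path length yields the mean temperature along that path. The constants are standard dry-air values; the travel time below is invented:

```python
# Path-averaged air temperature from an acoustic travel time.
GAMMA = 1.4     # adiabatic index of air
R_S = 287.05    # specific gas constant of dry air, J/(kg K)

def path_temperature(path_length_m, travel_time_s):
    c = path_length_m / travel_time_s   # mean sound speed along the path [m/s]
    return c * c / (GAMMA * R_S)        # mean temperature along the path [K]

T = path_temperature(5.0, 0.01456)      # roughly 343 m/s over a 5 m path
```

The tomographic reconstruction then combines many such path averages, crossing the voxels, into a spatial temperature field, which is why voxels without any sound path must be minimized.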
The contribution explores the migratory situation on the Balkans and more specifically in the so-called Refugee District in Belgrade from a spatial perspective. By visualizing the areas of tensions in the Refugee District, the city of Belgrade, Serbia and Europe it aims to disentangle the political and socio-spatial levels that lead to the stuck situation of in-betweenness at the gates of the European Union.
In this study, a new approach based on intelligent systems and machine learning algorithms is introduced for solving singular multi-pantograph differential equations (SMDEs). For the first time, a type-2 fuzzy logic based approach is formulated to find an approximated solution. The rules of the suggested type-2 fuzzy logic system (T2-FLS) are optimized by the square root cubature Kalman filter (SCKF) such that the proposed fitness function is minimized. Furthermore, the stability and boundedness of the estimation error are proved by a novel approach based on the Lyapunov theorem. The accuracy and robustness of the suggested algorithm are verified by several statistical examinations. It is shown that the suggested method results in an accurate solution with rapid convergence and a lower computational cost.
The main goal of the present work was to realize a continuous coupling between the analytical and the numerical solution of boundary value problems with singularities. The interpolation-based coupling method achieves global C0 continuity. For this purpose a special finite element (coupling element) is used, which guarantees the continuity of the solution both with the analytical element and with the standard CST elements.
Die interpolationsbasierte gekoppelte Methode ist zwar für beliebige Knotenanzahl auf dem Interface ΓAD anwendbar, aber es konnte durch die Untersuchung von der Interpolationsmatrix und numerische Simulationen festgestellt werden, dass sie schlecht konditioniert ist. Um das Problem mit den numerischen Instabilitäten zu bewältigen, wurde eine approximationsbasierte Kopplungsmethode entwickelt und untersucht. Die Stabilität dieser Methode wurde anschließend anhand der Untersuchung von der Gramschen Matrix des verwendeten Basissystems auf zwei Intervallen [−π,π] und [−2π,2π] beurteilt. Die Gramsche Matrix auf dem Intervall [−2π,2π] hat einen günstigeren Konditionszahl in der Abhängigkeit von der Anzahl der Kopplungsknoten auf dem Interface aufgewiesen. Um die dazu gehörigen numerischen Instabilitäten ausschließen zu können wird das Basissystem mit Hilfe vom Gram-Schmidtschen Orthogonalisierungsverfahren auf beiden Intervallen orthogonalisiert. Das orthogonale Basissystem lässt sich auf dem Intervall [−2π,2π] mit expliziten Formeln schreiben. Die Methode des konsistentes Sampling, die häufig in der Nachrichtentechnik verwendet wird, wurde zur Realisierung von der approximationsbasierten Kopplung herangezogen. Eine Beschränkung dieser Methode ist es, dass die Anzahl der Sampling-Basisfunktionen muss gleich der Anzahl der Wiederherstellungsbasisfunktionen sein. Das hat dazu geführt, dass das eingeführt Basissys-tem (mit 2 n Basisfunktionen) nur mit n Basisfunktion verwendet werden kann.
Zur Lösung diese Problems wurde ein alternatives Basissystems (Variante 2) vorgestellt. Für die Verwendung dieses Basissystems ist aber eine Transformationsmatrix M nötig und bei der Orthogonalisierung des Basissystems auf dem Intervall [−π,π] kann die Herleitung von dieser Matrix kompliziert und aufwendig sein. Die Formfunktionen wurden anschließend für die beiden Varianten hergeleitet und grafisch (für n = 5) dargestellt und wurde gezeigt, dass diese Funktionen die Anforderungen an den Formfunktionen erfüllen und können somit für die FE- Approximation verwendet werden.
Anhand numerischer Simulationen, die mit der Variante 1 (mit Orthogonalisierung auf dem Intervall [−2π,2π]) durchgeführt wurden, wurden die grundlegenden Fragen (Beispielsweise: Stetigkeit der Verformungen auf dem Interface ΓAD, Spannungen auf dem analytischen Gebiet) über-
prüft.
The contribution links the debate on the post-political city with the growing scholarly and activist engagement with the Anthropocene, a concept describing the ecological and socio-political implications of human action on the Earth's surface. Drawing on three selected case studies, we explore how the specifically anthropogenic, i.e. human-made, crisis of urban air pollution is problematized in artistic positions. In the context of a potential advance of post-politics, we discuss how the ambivalent discourse of the Anthropocene on the one hand favors depoliticization and on the other opens up new possibilities for the repoliticization of global environmental challenges.
Körperstreik
(2020)
This speculative handbook offers a wide range of techniques for radical learning and teaching. It comprises concrete instructions, experiences, and theoretical reflections. The texts contribute to the conception of a pedagogy that (re)introduces shared experimentation.
Learning and unlearning take place in the seminar room, in workshops, at festivals, in corridors, parks, and the city. Texts and instructions cover, among other things: film essays, collages, bank robberies, the University of the Dead, wild writing, conceptual speed dating, neurodiverse learning, thinking in formats, the Theater of Care, the writing lab, and the body strike (Körperstreik).
The Theater of Care (Theater der Sorge) is conceived as a three-phase model:
In phase 1, the aim is to create a laboratory situation in which prefigurative forms of living, thinking, and working can be tested in the present, in a discursive and playful manner, and transferred into constituent processes.
In phase 2, the aim is the collective conception, preparation, and realization of a public presentation that concludes the laboratory and builds on the collective experiences and results of the labs. In this way, societal resonances can be generated and the constituent processes provisionally instituted.
In phase 3, the aim is then to transfer all substantive insights and aesthetic experiments into a repeatable theater production / performance. The preconditions and requirements for these formats, which we call performances (Vorstellungen), are not relevant to the question discussed here (indeed, they would exceed the available scope), so we leave this complex aside.
Rapid Visual Screening (RVS) is a procedure that estimates structural scores for buildings and prioritizes their retrofit and upgrade requirements. Despite the speed and simplicity of RVS, many of the collected parameters are non-commensurable and include subjectivity due to visual observation. This can cause uncertainties in the evaluation, which motivates the use of a fuzzy-based method. This study proposes a novel RVS methodology based on the interval type-2 fuzzy logic system (IT2FLS) to prioritize vulnerable buildings for detailed assessment while covering uncertainties and minimizing their effects during evaluation. The proposed method estimates the vulnerability of a building, in terms of a damage index, from the number of stories, age of the building, plan irregularity, vertical irregularity, building quality, and peak ground velocity as inputs, with a single output variable. The applicability of the proposed method has been investigated using a post-earthquake damage database of reinforced concrete buildings from the Bingöl and Düzce earthquakes in Turkey.
Image Analysis Using Human Body Geometry and Size Proportion Science for Action Classification
(2020)
Gestures are one of the basic modes of human communication and are usually used to represent different actions. Automatic recognition of these actions forms the basis for solving more complex problems such as human behavior analysis, video surveillance, event detection, and sign language recognition. Action recognition from images is a challenging task, as key information such as temporal data, object trajectories, and optical flow is not available in still images. Measuring the sizes of different regions of the human body, i.e., step size, arm span, and the lengths of the arm, forearm, and hand, however, provides valuable clues for identifying human actions. In this article, a framework for the classification of human actions is presented in which humans are detected and localized through faster region-based convolutional neural networks followed by morphological image processing techniques. Geometric features are then extracted from the human blob and fed into classification rules for six human actions: standing, walking, single-hand side wave, single-hand top wave, both hands side wave, and both hands top wave. The performance of the proposed technique has been evaluated using precision, recall, omission error, and commission error. In terms of overall accuracy, the proposed technique compares favorably with existing approaches.
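As a hedged sketch of how geometric cues from a human blob can feed classification rules for the six actions above (the feature names and thresholds here are illustrative assumptions, not the published rule set):

```python
def classify_action(aspect_ratio, arms_side, arms_top):
    """Rule-based action label from simple blob geometry.

    aspect_ratio: human-blob width divided by height
    arms_side:    number of arms (0-2) detected extended sideways
    arms_top:     number of arms (0-2) detected raised above the head
    All thresholds are illustrative, not the paper's calibrated values."""
    if arms_top == 2:
        return "both hands top wave"
    if arms_top == 1:
        return "single-hand top wave"
    if arms_side == 2:
        return "both hands side wave"
    if arms_side == 1:
        return "single-hand side wave"
    # no raised arms: a wider blob (larger step size) suggests walking
    return "walking" if aspect_ratio > 0.45 else "standing"
```

In the paper's pipeline these inputs would come from the detected and morphologically cleaned blob; here they are passed in directly to keep the rule logic visible.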
Im Workshop des Sinnlichen
(2020)
The following fictitious situation is meant to mark a problem discussed in this contribution, which in the author's view represents a phenomenon widespread at art schools. In a mentoring workshop, a group of students presents working material from their current project in order to discuss it with the group afterwards. The aim of this format is to accompany students in the development of their artistic practices and not to overdetermine those practices by craft criteria, but to follow their inner logic, pursue the aesthetic potentials inherent in them, and create an awareness of the contexts and discourses in which they are situated. The students presenting their project today have brought a photo album into which, following the arrangement logic of holiday pictures, they have pasted a series of analog photographs showing dozens of snapshots of a distant island outlined behind the sea surface that occupies the foreground of the image.
The book takes up the close interconnection of industrialization and urbanization that has decisively shaped Europe's cities and their urban-planning history over the past 250 years or so. This raises a wide range of questions and tasks for heritage conservation.
The habilitation thesis contributes to recognizing and preserving the urban-design and townscape values of historic industrial complexes. How can we survey industrial urban landscapes? How do we design conversions and adaptive reuse in a manner appropriate to their heritage status, and integrate aspects of sustainable urban development within a heritage-management framework?
Hans Ruin: Being with the Dead—Burial, ancestral politics, and the roots of historical consciousness
(2020)
How can society be thought of as something in which the living and the dead interact throughout history? In Being with the Dead. Burial, Ancestral Politics, and the Roots of Historical Consciousness, Hans Ruin turns to the relationship between the living and the dead as well as ‘historical consciousness’. He is referring to the expression ‘being with the dead’ (Mitsein mit dem Toten). Rather en passant, Martin Heidegger (1962: 282) shaped this existential-ontological term, which so far has hardly received any consideration. But for Ruin, it now forms the starting point for his “expanded phenomenological social ontology” (p. XI). By illuminating history and historical consciousness with the category ‘being with the dead,’ he gains remarkable insights into the meaning of ancestrality. Concerning ‘necropolitics,’ Ruin shows that the political space includes the living as well as the dead and how they constitute it. The foci of his considerations are the human sciences, above all sociology, anthropology, archaeology, philology and history. Ruin’s book aims at a “metacritical thanatology,” which he elaborates as “an exploration of the social ontology of being with the dead mediated through critical analyses of the human-historical sciences themselves” (p. XII). As a result, in a total of seven chapters, he succeeds astonishingly in emphasizing the political and ethical importance of a scientific gaze that cultivates the interaction of the living and the dead.
In recent years, substantial attention has been devoted to thermoelastic multifield problems and their numerical analysis. Thermoelasticity is one of the important categories of multifield problems which deals with the effect of mechanical and thermal disturbances on an elastic body. In other words, thermoelasticity encompasses the phenomena that describe the elastic and thermal behavior of solids and their interactions under thermo-mechanical loadings. Since providing an analytical solution for general coupled thermoelasticity problems is mathematically complicated, the development of alternative numerical solution techniques seems essential.
Due to the nature of numerical analysis methods, presence of error in results is inevitable, therefore in any numerical simulation, the main concern is the accuracy of the approximation. There are different error estimation (EE) methods to assess the overall quality of numerical approximation. In many real-life numerical simulations, not only the overall error, but also the local error or error in a particular quantity of interest is of main interest. The error estimation techniques which are developed to evaluate the error in the quantity of interest are known as “goal-oriented” error estimation (GOEE) methods.
This project, for the first time, investigates the classical a posteriori error estimation and goal-oriented a posteriori error estimation in 2D/3D thermoelasticity problems. Generally, the a posteriori error estimation techniques can be categorized into two major branches of recovery-based and residual-based error estimators. In this research, application of both recovery- and residual-based error estimators in thermoelasticity are studied. Moreover, in order to reduce the error in the quantity of interest efficiently and optimally in 2D and 3D thermoelastic problems, goal-oriented adaptive mesh refinement is performed.
As the first application category, error estimation in classical thermoelasticity (CTE) is investigated. In the first step, an rh-adaptive thermo-mechanical formulation based on goal-oriented error estimation is proposed. The developed goal-oriented error estimation relies on different stress recovery techniques, i.e., the superconvergent patch recovery (SPR), L2-projection patch recovery (L2-PR), and weighted superconvergent patch recovery (WSPR). Moreover, a new adaptive refinement strategy (ARS) is presented that minimizes the error in a quantity of interest and refines the discretization such that the error is equally distributed in the refined mesh. The method is validated by numerous numerical examples for which an analytical or reference solution is available.
After investigating error estimation in classical thermoelasticity and evaluating the quality of the presented error estimators, we extended the application of the developed goal-oriented error estimation and the associated adaptive refinement technique to classical fully coupled dynamic thermoelasticity. In this part, we present an adaptive method for coupled dynamic thermoelasticity problems based on goal-oriented error estimation. We use dimensionless variables in the finite element formulation, and for the time integration we employ the acceleration-based Newmark-β method. Here, the SPR, L2-PR, and WSPR recovery methods are exploited to estimate the error in the quantity of interest (QoI). By using
adaptive refinement in space, the error in the quantity of interest is minimized. Therefore, the discretization is refined such that the error is equally distributed in the refined mesh. We demonstrate the efficiency of this method by numerous numerical examples.
After studying the recovery-based error estimators, we investigated residual-based error estimation in thermoelasticity. In the last part of this research, we present a 3D adaptive method for thermoelastic problems based on goal-oriented error estimation in which the error is measured with respect to a pointwise quantity of interest. We developed a method for a posteriori error estimation and mesh adaptation based on the dual weighted residual (DWR) method, which relies on duality principles and involves the solution of an adjoint problem. Here, we consider the application of the derived estimator and mesh refinement to two- and three-dimensional (2D/3D) thermo-mechanical multifield problems. In this study, the goal is given by singular pointwise functions, such as the point value or a point-value derivative at a specific point of interest (PoI). An adaptive algorithm has been adopted to refine the mesh so as to minimize the error in the quantity of interest.
The mesh adaptivity procedure based on the DWR method is performed by adaptive local h-refinement/coarsening with allowed hanging nodes. According to the proposed DWR method, the error contribution of each element is evaluated. In the refinement process, the contribution of each element to the goal error is considered as the mesh refinement criterion.
In this study, we substantiate the accuracy and performance of this method by several numerical examples with available analytical solutions. Here, 2D and 3D problems under thermo-mechanical loadings are considered as benchmark problems. To show how accurately the derived estimator captures the exact error in the evaluation of the pointwise quantity of interest, in all examples, considering the analytical solutions, the goal error effectivity index as a standard measure of the quality of an estimator is calculated. Moreover, in order to demonstrate the efficiency of the proposed method and show the optimal behavior of the employed refinement method, the results of different conventional error estimators and refinement techniques (e.g., global uniform refinement, Kelly, and weighted Kelly techniques) are used for comparison.
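The goal error effectivity index invoked above is the standard quality measure for such estimators; a minimal sketch (the sample values are illustrative, not results from this work):

```python
def effectivity_index(estimated_error, exact_error):
    """Effectivity index theta = estimated error / exact error of an
    a posteriori error estimator. Values of theta close to 1 mean the
    estimator captures the true error in the quantity of interest well;
    theta > 1 overestimates, theta < 1 underestimates."""
    return estimated_error / exact_error

# example: an estimator reporting 2.1e-3 against an exact goal error of 2.0e-3
theta = effectivity_index(2.1e-3, 2.0e-3)   # slight overestimation (theta > 1)
```

Here the exact error is computable because the benchmark problems admit analytical solutions; in practice theta can only be evaluated on such reference cases.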
In a systematic interpretation of Vilém Flusser's work, this thesis proposes to understand Flusser's approach as a media-philosophical one insofar as it places the "how" of the media-philosophical question at its center. Media do not become an essential component of Flusser's philosophy only when he makes them the explicit object of his investigations of contemporary culture and society or of historical retrospectives; thinking always takes place in media or medial practices and is not merely (co-)shaped by them. Without media there would be no thinking, and conversely philosophy changes with each new medium. Starting from concepts, or rather figures of thought, that address not only the "what" of the topic under discussion but also the "how" of reflection itself, the "upheaval in the structure of thinking" is understood both as a description of media upheavals, with the leap into the universe of computation as its vanishing point, and as the enactment of the current transformation of the "method of thinking". Flusser's attempts at a reflection that is no longer structured by the medium of writing, but gives weight both to old media such as the image (or practices of depicting, representing, imagining, and so on) and to new media (computation), culminate in a contradictory diagnosis of the new universe of computation (in other words, of technical images): a cybernetically inspired vision of freely modelable realities on the one hand, and the dystopia of a world in which apparatuses dominate thinking, perception, and action on the other. The thesis shows how Flusser arrives at this aporia of media reflection, which remains virulent far beyond Flusser's work, and how it could be resolved, starting from his figure of the gesture, in the sense of a performative media reflection.
While Public-Private Partnership (PPP) is widely adopted across various sectors, its meagre utilisation in the housing sector raises questions. This paper therefore gauges the perspective of stakeholders in the building industry towards the application of PPP in various building sectors, including housing. It assesses the performance reliability of PPP for housing by drawing possible take-aways from other sectors. Key stakeholders in the industry carry considerable responsibility for informed understanding and decision-making. To this end, a two-tier investigation comprising surveys and expert interviews was conducted with stakeholders in the PPP industry in Europe, involving the public sector, the private sector, consultants, and other community/user representatives.
The survey results demonstrated the success rate of PPPs, the major factors important for PPPs, such as profitability and end-user acceptability, the prevalent practices and trends in the PPP world, and the broad support expressed for the suitability of PPP for housing. The interviews added more detailed dimensions to the understanding of the PPP industry and its functioning, enabling the formation of a comprehensive outlook. The results present the perspectives, approaches, and experiences of stakeholders regarding PPP practices, current trends and scenarios, and their views on PPP in housing. They should aid in understanding the challenges of applying the PPP approach to housing and enable policymakers and industry stakeholders to make provisions for higher uptake to accelerate housing provision.
In this paper, the problem of energy/voltage management in photovoltaic (PV)/battery systems was studied, and a new fractional-order control system based on type-3 (T3) fuzzy logic systems (FLSs) was developed. New fractional-order learning rules are derived for tuning the T3-FLSs such that stability is ensured. In addition, using fractional-order calculus, robustness was studied against dynamic uncertainties, perturbations of irradiation and temperature, and abrupt faults in output loads, and new compensators were proposed accordingly. The capability of the designed controller was verified in several examinations under difficult operating conditions, such as random temperature, variable irradiation, and abrupt changes in output load. In comparison with other methods, such as the proportional-integral-derivative (PID) controller, sliding mode controller (SMC), passivity-based control (PBC), and linear quadratic regulator (LQR), the superiority of the suggested method was demonstrated.
The subject of this work is based on a phenomenological observation of international photographic positions of the self-portrait which, since the 1960s, have exhibited related content, similar pictorial-aesthetic characteristics, and similar processes of photographic production. What the artists discussed in this work have in common is that their becoming-image takes place on their own bodies and is tied to a course of action characterized by movement. Their respective visual languages display an ephemeral aesthetic in which the notion of letting go, in both its physical and its philosophical sense, plays a role. The artistic positions that are the subject of this Ph.D. thesis comprise works by Bas Jan Ader (1942-1975), Francesca Woodman (1958-1981), Bernhard (1937-2011) and Anna Blume (*1937), Antoine d'Agata (*1961), and Tom Pope (*1986).
This paper reports the formation and structure of fast-setting geopolymers activated using three sodium silicate solutions with different moduli (1.6, 2.0 and 2.4) and a berlinite-type aluminum orthophosphate. By varying the concentration of the aluminum orthophosphate, different Si/Al ratios were established (6, 3 and 2). The reaction kinetics of the binders were determined by isothermal calorimetric measurements at 20 °C. X-ray diffraction analysis as well as nuclear magnetic resonance (NMR) measurements were performed on the binders to determine structural differences arising from the alkalinity of the sodium silicate solutions and the Si/Al ratio. The calorimetric results indicated that the higher the alkalinity of the sodium silicate solution, the higher the solubility and degree of conversion of the aluminum orthophosphate. The results of X-ray diffraction and Rietveld analysis, as well as the NMR measurements, confirmed the assumption from the calorimetric experiments that the aluminum orthophosphate first dissolved and then polycondensed into an amorphous aluminosilicate network. The different amounts of amorphous phases formed as a function of the alkalinity of the sodium silicate solution indicate that tetrahydroxoaluminate species were formed during the dissolution of the aluminum orthophosphate, which reduced the pH value. This prevented further dissolution of the aluminum orthophosphate, which remained unreacted.
Starting from the state's manifold exploitation of peasant clothing during socialism in Romania, this work interrogates the "made-ness" of folk costumes along the discourses operating in the period under study, such as the process of modernization or the emphasis on national values. The artistic research relies on simulacra (Roland Barthes). The aim was to appropriate traditional formats of preparing and disseminating knowledge, including strategies operating at the level of images and language, in order to enable a re-reading both of "folk costume" under socialism and of its counterparts after 1989.
Calculating the solubility of hydrocarbon components of natural gases is one of the important issues in operational work in petroleum and chemical engineering. In this work, a novel solubility estimation tool based on the extreme learning machine (ELM) algorithm has been proposed for hydrocarbon gases, including methane, ethane, propane, and butane, in aqueous electrolyte solutions. Comparing the ELM outputs with a comprehensive real databank of 1175 solubility points yielded R-squared values of 0.985 and 0.987 for the training and testing phases, respectively. Furthermore, visual comparison of the estimated and actual hydrocarbon solubilities confirmed the ability of the proposed solubility model. Additionally, a sensitivity analysis was performed on the input variables of the model to identify their impact on hydrocarbon solubility. Such a comprehensive and reliable study can help engineers and scientists to determine important thermodynamic properties, which are key factors in optimizing and designing industrial units such as refineries and petrochemical plants.
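The ELM recipe named above is simple enough to sketch. This is the generic algorithm (a random, untrained hidden layer plus a least-squares output layer); the toy data and layer size are illustrative assumptions, not the authors' trained solubility model:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme learning machine: hidden weights are drawn at random and never
    trained; only the linear output layer is solved by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # Moore-Penrose least-squares fit
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy regression: learn a smooth 1-D function
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0])
W, b, beta = elm_fit(X, y)
y_hat = elm_predict(X, W, b, beta)
```

Because only the output layer is solved, fitting reduces to one pseudo-inverse, which is why ELM training is so fast compared with backpropagation-based networks.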
Experimente lernen, Techniken tauschen
Ein spekulatives Handbuch
In this study, the machine learning methods of artificial neural networks (ANNs), least squares support vector machines (LSSVM), and neuro-fuzzy systems are used to develop prediction models for the thermal performance of a photovoltaic-thermal solar collector (PV/T). In the proposed models, the inlet temperature, flow rate, heat, solar radiation, and sun heat have been considered as the input variables. The data set was extracted through experimental measurements from a novel solar collector system. Different analyses are performed to examine the credibility of the introduced models and evaluate their performance. The proposed LSSVM model outperformed the ANFIS and ANN models. The LSSVM model is reported to be suitable when laboratory measurements are costly and time-consuming, or when obtaining such values requires sophisticated interpretation.
Piping erosion is one form of water erosion that leads to significant changes in the landscape and environmental degradation. In the present study, we evaluated piping erosion modeling in the Zarandieh watershed of Markazi province in Iran based on random forest (RF), support vector machine (SVM), and Bayesian generalized linear model (Bayesian GLM) machine learning algorithms. Given the importance of various geo-environmental and soil properties in the evolution and creation of piping erosion, 18 variables were considered for modeling piping erosion susceptibility in the Zarandieh watershed. A total of 152 piping erosion points were recognized in the study area and divided into training (70%) and validation (30%) sets for modeling. The area under the curve (AUC) was used to assess the efficiency of the RF, SVM, and Bayesian GLM models. The piping erosion susceptibility results indicated that all three models had high efficiency in the testing step, with AUC values of 0.9 for RF, 0.88 for SVM, and 0.87 for Bayesian GLM. Altitude, pH, and bulk density were the variables with the greatest influence on piping erosion susceptibility in the Zarandieh watershed. This result indicates that geo-environmental and soil chemical variables are accountable for the expansion of piping erosion in the Zarandieh watershed.
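The AUC figure used for validation above can be computed from ranks alone. A minimal sketch via the Mann-Whitney rank-sum identity (this simple version ignores tied scores, and the toy data are illustrative, not from the study):

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.
    labels: 0/1 ground truth; scores: predicted susceptibility.
    Assumes no tied scores, for brevity."""
    scores = np.asarray(scores)
    labels = np.asarray(labels)
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos = int(pos.sum())
    n_neg = len(labels) - n_pos
    # fraction of (positive, negative) pairs ranked in the correct order
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# toy check: one mis-ranked pair out of four gives AUC = 0.75
a = auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is the scale on which the 0.87-0.9 values above should be read.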
The longitudinal dispersion coefficient (LDC) plays an important role in modeling the transport of pollutants and sediment in natural rivers. As a result of transportation processes, the concentration of pollutants changes along the river. Various studies have been conducted to provide simple equations for estimating LDC. In this study, machine learning methods, namely support vector regression, Gaussian process regression, M5 model tree (M5P) and random forest, and multiple linear regression were examined in predicting the LDC in natural streams. Data sets from 60 rivers around the world with different hydraulic and geometric features were gathered to develop models for LDC estimation. Statistical criteria, including correlation coefficient (CC), root mean squared error (RMSE) and mean absolute error (MAE), were used to scrutinize the models. The LDC values estimated by these models were compared with the corresponding results of common empirical models. The Taylor diagram was used to evaluate the models, and the results showed that among the machine learning models, M5P had superior performance, with CC of 0.823, RMSE of 454.9 and MAE of 380.9. The model of Sahay and Dutta, with CC of 0.795, RMSE of 460.7 and MAE of 306.1, gave more precise results than the other empirical models. The main advantage of M5P models is their ability to provide practical formulae. In conclusion, the results proved that the developed M5P model with simple formulations was superior to other machine learning models and empirical models; therefore, it can be used as a proper tool for estimating the LDC in rivers.
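The three validation statistics used above are quick to state precisely. A minimal sketch with made-up observed/predicted LDC values (the numbers are illustrative, not data from the study):

```python
import numpy as np

def cc(obs, pred):
    """Pearson correlation coefficient between observations and predictions."""
    return float(np.corrcoef(obs, pred)[0, 1])

def rmse(obs, pred):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(pred)) ** 2)))

def mae(obs, pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(obs) - np.asarray(pred))))

# illustrative observed/predicted pairs only
obs  = [100.0, 250.0, 400.0, 800.0]
pred = [120.0, 230.0, 420.0, 790.0]
```

CC measures how well the models track the variation of the LDC, while RMSE and MAE quantify the absolute size of the errors; reporting all three, as the study does, separates ranking quality from magnitude accuracy.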
Matthias Bernt and Andrej Holm rightly point out that research on East German cities is needed as a conceptually independent field, one that centers the specific spatialization of the profound societal transformation process after 1990. They regard the field of housing in particular as productive for gaining insight into the structure and effects of this process. However, they remain vague about how such housing research specifically directed at East Germany should be conceived, and about how the particularities of East German developments, and their parallels, should be related to the transformations of housing and urban development policy in West Germany and internationally.
Marine macroalgae possess promising properties and constituents for use as an energy carrier, as food, or as a raw material for pharmaceuticals. However, the fluctuating quantity and quality of macroalgae grown in natural environments reduce their usability and impede access to high-priced market segments. Moreover, expanding cultivation in marine and coastal aquacultures in Europe currently holds little promise, since promising areas are already designated for fishing or as recreational or nature conservation areas. In this thesis, a closed photobioreactor system for macroalgae cultivation is therefore developed that provides comprehensive control of the abiotic cultivation parameters and effective treatment of the culture medium, in order to enable algae production independent of location. To assess the overall concept of cultivation and utilization (material or energetic), the specific growth rates and methane formation potentials of the algae species Ulva intestinalis, Fucus vesiculosus and Palmaria palmata are determined in practical experiments.
For the current stage of development of the cultivation plant, the result is a positive balance for the material utilization of the species Ulva intestinalis and a negative balance for the energetic utilization of all species investigated. Even in an optimal scenario in which the stocking densities and growth rates of the algae are increased, the energy balance remains negative. However, the financial revenues from selling the algae as a product amount annually to €460,869 for Ulva intestinalis, €4,010 for Fucus vesiculosus and €16,913 for Palmaria palmata. In conclusion, material utilization of the cultivated green alga Ulva intestinalis in particular should be pursued, and the productivity of the cultivation plant should be increased in line with the optimal scenario.
In this dissertation, a toolbox model for transdisciplinary water resources management was developed. The model provides the methodological framework for managing water resources sustainably and transdisciplinarily.
The concept of sustainability, and the concrete realization of sustainable management of global water resources, appear unmanageably broad and seem to call for a new universal formula. The global significance of water resources, the region-specific characteristics of the natural water balance and of anthropogenic use, the time scale, and the contextualization within all affected and neighboring disciplines point to the complexity of the subject. A systematization of the planning process for water resources becomes necessary, on the basis of which a holistic approach with strategy development for region-specific priority problems can be pursued. The aim of this work is to develop a strategy for systematization according to these requirements and to provide a toolbox model as a planning instrument for transdisciplinary water resources management.
The toolbox model provides the conceptual framework for managing water resources with transdisciplinary research methods. The main challenges in applying the transdisciplinary method are the implementation of different scales, handling the complexity of data, maintaining transparency and objectivity, and enabling a planning process that is transferable to other regions.
The theoretical foundations of scientific research on sustainability have their origins in the biological and geographical disciplines. The interlocking of natural spatial relationships and the influence of anthropogenic use and technical innovation on the natural balance are at the core of overarching causal thinking and understanding. The approach of integrated water resources management (IWRM) incorporates economic and socioeconomic objectives into the planning process for ecologically sustainable water management. The EU Water Framework Directive (EU-WFD) is a directive oriented toward water ecology that provides for the integration of different stakeholders into the planning process. The concept of novel sanitation systems is based on material flows between competing fields of action, such as waste management, resource management and agriculture.
The integrated approaches lack an overarching common target strategy, a so-called phase zero. This phase zero, the learning of all relevant, competing and harmonizing fields of action within a planning horizon, is enabled by a transdisciplinary perspective. While the integral perspective foregrounds discipline-oriented cooperation, the transdisciplinary perspective calls for problem-oriented cooperation between stakeholders (Werlen 2015). The existing concepts and guidelines for the sustainable management of water resources are established and evaluated. According to the literature, further development from the perspective of transdisciplinarity is required. The toolbox model for integral water resources management is a planning tool consisting of instruments for applying scientific methods. The compilation of methods and tools fulfills the method of transdisciplinary research. The tool for establishing the relevant fields of action comprises the characterization of a study area and planning framework, and the causal linking of the management concept with competing as well as mutually supportive stakeholders. The tool of contextualization and indicator compilation applies a method of stepwise, scale-independent assessment of the state of the environment for prioritizing objectives. In this way, the toolbox model addresses the problems of complexity and data availability. Using the ABC method, the assessment variables are structured in a differentiated way across different scales and data resources (A = initial screening, B = indicator values, C = model/index). The ABC method enables planning even on an uncertain and incomplete data basis, can be extended at any time, and thus offers operational knowledge generation during the design process.
For the assessment and prioritization tool, the composite programming algorithm is applied. This multi-objective planning method meets the requirements of permanent extensibility and of transparent, objective decision-making. The complexity of transdisciplinary water resources management can be systematized by means of composite programming. The main result of this work is the successful development and application of the toolbox model for transdisciplinary water resources management in the study area of the city of Darkhan in Mongolia. Its particular hydrological and structural situation makes the relevance of a sustainable management concept evident. Within the cross-cutting module of the MoMo project, a data basis suitable for the toolbox model was compiled. Planning-relevant fields of action were developed in a workshop with various stakeholders. As a result, the systematics of an objective tree with main objectives and subordinate sub-objectives was established as the basis for prioritization, in line with the holistic claim of transdisciplinary research. Indicators were developed to measure the extent to which sub-objectives have been achieved or action is still required. The compilation of indicators was carried out exemplarily for the field of urban water management across all scales of the ABC system. The comprehensive data basis generated in the BMBF MoMo project enabled the application and evaluation of the toolbox model with different quantitative and qualitative data inputs. Different combinations of A (initial screening), B (indicator values) and C (model/index) as the basis for prioritization with composite programming allowed the transdisciplinary planning tool to be executed and evaluated. The rankings of sub-objectives determined with different assessment variants showed similar tendencies. This indicates that, for future applications of the toolbox model, operational knowledge generation, i.e. the stepwise addition of newly determined, more reliable data, works. Difficult data availability, or a scientific analysis still in progress, should not be obstacles to a stepwise and extensible prioritization of objectives and planning of measures. Despite the complexity of the transdisciplinary approach, the application of the toolbox model enables efficient and goal-oriented prioritization of action. Efficiency is achieved through resource-saving, flexible, goal-focused data acquisition; time and costs in the planning process can be saved. The resulting prioritization of recommendations for action is tailored individually to the characteristics of the study area, which is considered promising with regard to its effect.
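Composite programming, as used for the prioritization tool, can be pictured as aggregating normalized indicator values into a weighted Lp distance from an ideal point, where a larger distance signals higher priority for action. A minimal single-level sketch; the field-of-action names, indicator values and weights are invented for illustration, not MoMo project data:

```python
def normalize(value, worst, best):
    # map an indicator onto [0, 1], where 1 is the ideal state;
    # works for "higher is better" and "lower is better" alike
    return (value - worst) / (best - worst)

def composite_distance(indicators, weights, p=2.0):
    # weighted Lp distance from the ideal point (all indicators = 1)
    s = sum(w * (1.0 - z) ** p for z, w in zip(indicators, weights))
    return s ** (1.0 / p)

# hypothetical sub-objectives of an urban water management field of action
wastewater = [normalize(60, 0, 100),      # share of households connected (%)
              normalize(0.8, 2.0, 0.2)]   # effluent load (mg/l, lower is better)
drinking   = [normalize(85, 0, 100),
              normalize(0.4, 2.0, 0.2)]

w = [0.6, 0.4]  # stakeholder-derived weights (illustrative)
ranking = sorted([("wastewater", composite_distance(wastewater, w)),
                  ("drinking water", composite_distance(drinking, w))],
                 key=lambda t: t[1], reverse=True)
print(ranking)  # larger distance = higher priority for action
```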
Observable shorter product life cycles and faster market penetration of product technologies require adaptive, high-performance production plants. Adaptivity allows the production plant to be adapted to new products, and the plant's performance ensures that sufficient products can be manufactured in a short time and at low cost. Adaptivity can be achieved by modularizing the production plant. Today, however, every adaptation requires manual effort, e.g. to adapt proprietary signals or higher-level functions. This reduces the performance of the plant.
The goal of this work is to ensure interoperability with respect to the use of information in modular production plants. To this end, information is described by semantic models. This enables uniform information access: higher-level functions gain access to all information of the production modules, regardless of the type, manufacturer and age of a module. The manual effort involved in adapting the modular production system is thus eliminated, which increases the performance of the plant and reduces downtimes.
After determining the requirements for a modeling formalism, potential formalisms were compared against these requirements. OWL DL proved to be a suitable formalism and was used to create the semantic model in this work. A semantic model was created exemplarily for the three use cases of interaction, orchestration and diagnosis. The generality of the model was assessed by comparing the modeling elements of different use cases. It was shown that a general model for technical use cases is achievable and requires only a few hundred terms.
To evaluate the models, an adaptable production system of the SmartFactoryOWL was used, on which the use cases were implemented. For this purpose, a runtime environment was created that merges the semantic models of the individual modules into an overall model, transfers data from the plant into the model, and provides an interface for the services. The services realize higher-level functions and use the information of the semantic model. In all three use cases, the semantic models were merged correctly, and with the information they contained, the task of the respective use case was solved without additional manual effort.
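The merging step of such a runtime environment can be pictured as a union of subject-predicate-object triples contributed by each module. A toy sketch under that simplification; the module names and vocabulary are invented, and a real implementation would operate on OWL DL models via an RDF library and reasoner:

```python
# each module ships its own model as subject-predicate-object triples
module_a = {
    ("ModuleA", "rdf:type", "FillingModule"),
    ("ModuleA", "hasSkill", "Fill"),
    ("Fill", "requiresMaterial", "Liquid"),
}
module_b = {
    ("ModuleB", "rdf:type", "CappingModule"),
    ("ModuleB", "hasSkill", "Cap"),
}

# the runtime environment merges the module models into one overall model
overall = module_a | module_b

def skills(model):
    # a higher-level service queries the merged model uniformly,
    # independent of module type, manufacturer, or age
    return {obj for (subj, pred, obj) in model if pred == "hasSkill"}

print(sorted(skills(overall)))
```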
Energy-Efficient Method for Wireless Sensor Networks Low-Power Radio Operation in Internet of Things
(2020)
Radio operation is the most common source of power consumption in wireless sensor networks (WSN) in Internet of Things (IoT) applications. Consequently, recognizing and controlling the factors affecting radio operation is valuable for managing node power consumption. Among these factors, the time spent checking the radio is of utmost importance, as it can lead to false WakeUps or idle listening in radio duty cycles. ContikiMAC, a low-power radio duty-cycling protocol in Contiki OS, periodically checks the radio status using clear channel assessment (CCA) in its WakeUp mode. This paper presents a detailed analysis of the factors determining the radio WakeUp time in ContikiMAC. Furthermore, we propose a lightweight CCA (LW-CCA) as an extension to ContikiMAC that reduces radio duty cycles in false WakeUps and idle listening through a dynamic received signal strength indicator (RSSI) status check time. Simulation results in the Cooja simulator show that LW-CCA reduces node energy consumption by about 8% while maintaining up to 99% of the packet delivery rate (PDR).
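The mechanism behind such a saving can be illustrated with a simple duty-cycle energy model: the energy spent on periodic channel checks scales linearly with the per-check listen time, so shortening the average RSSI check time shrinks that term directly. A sketch with illustrative timing and current figures, not measurements from the paper:

```python
def radio_energy(check_time_s, wakeups_per_s, listen_current_a, voltage_v, duration_s):
    # energy (joules) spent on periodic channel checks over `duration_s`:
    # E = t_check * f_wakeup * I_listen * V * T
    return check_time_s * wakeups_per_s * listen_current_a * voltage_v * duration_s

# illustrative figures for a low-power 2.4 GHz transceiver (assumptions, not paper data)
V, I_LISTEN = 3.0, 0.0188   # supply voltage (V), listen-mode current (A)
WAKEUPS = 8                 # channel check rate (Hz)
HOUR = 3600.0
T_CCA = 0.000192            # assumed duration of one fixed CCA check (s)

fixed_cca = radio_energy(T_CCA * 2.0, WAKEUPS, I_LISTEN, V, HOUR)  # two fixed checks
dynamic   = radio_energy(T_CCA * 1.2, WAKEUPS, I_LISTEN, V, HOUR)  # shorter average check

saving = 1.0 - dynamic / fixed_cca
print(f"energy saving on channel checks: {saving:.0%}")
```

The model only covers the channel-check term; total node savings are smaller because transmission and processing energy are unaffected.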
The current housing crisis has a socio-ecological core problem. The socially unjust and ecologically problematic distribution of living space is mostly invisible and is not sufficiently problematized as a question of spatial justice in either academic or activist contexts. For housing and land in a city are not endlessly available goods: if some people live on a lot of space, less space remains for others. And the people who are least responsible for the shortage of housing suffer the most from it. This article first elaborates the concept of housing-space justice, referring to the unequal distribution of living space and its societal implications under current housing allocation mechanisms. The consumption of (housing) space is then problematized from an ecological perspective. The article discusses apparent as well as transformation-oriented solutions and courses of action. Finally, it calls for a stronger debate on housing-space justice in critical urban research and in activist contexts, a justice whose realization has a social as well as an ecological dimension.
Conventional superplasticizers based on polycarboxylate ether (PCE) show an intolerance to clay minerals due to intercalation of their polyethylene glycol (PEG) side chains into the interlayers of the clay mineral. An intolerance to very basic media is also known. This makes PCE an unsuitable choice as a superplasticizer for geopolymers. Bio-based superplasticizers derived from starch showed comparable effects to PCE in a cementitious system. The aim of the present study was to determine if starch superplasticizers (SSPs) could be a suitable additive for geopolymers by carrying out basic investigations with respect to slump, hardening, compressive and flexural strength, shrinkage, and porosity. Four SSPs were synthesized, differing in charge polarity and specific charge density. Two conventional PCE superplasticizers, differing in terms of molecular structure, were also included in this study. The results revealed that SSPs improved the slump of a metakaolin-based geopolymer (MK-geopolymer) mortar while the PCE investigated showed no improvement. The impact of superplasticizers on early hardening (up to 72 h) was negligible. Less linear shrinkage over the course of 56 days was seen for all samples in comparison with the reference. Compressive strengths of SSP specimens tested after 7 and 28 days of curing were comparable to the reference, while PCE led to a decline. The SSPs had a small impact on porosity with a shift to the formation of more gel pores while PCE caused an increase in porosity. Throughout this research, SSPs were identified as promising superplasticizers for MK-geopolymer mortar and concrete.
Earthquakes are among the most devastating natural disasters, causing severe economic, environmental, and social destruction. Earthquake safety assessment and building hazard monitoring can contribute greatly to urban sustainability by identifying and providing insight into optimum materials and structures. While the vulnerability of structures mainly depends on structural resistance, the safety assessment of buildings can be highly challenging. In this paper, we consider the Rapid Visual Screening (RVS) method, a qualitative procedure for estimating structural scores for buildings that is suitable for medium- to high-seismicity cases. This paper presents an overview of the common RVS methods, i.e., FEMA P-154, IITK-GGSDMA, and EMPI. To examine their accuracy and validity, a practical comparison is performed between their assessments and the observed damage of reinforced concrete buildings from a street survey in the Bingöl region, Turkey, after the 1 May 2003 earthquake. The results demonstrate that the application of RVS methods for preliminary damage estimation is a vital tool. Furthermore, the comparative analysis showed that FEMA P-154 produces an assessment that overestimates damage states and is not economically viable, while EMPI and IITK-GGSDMA provide more accurate and more practical estimates, respectively.
Recent earthquakes have shown that many existing buildings, particularly in developing countries, are not safe from earthquake damage. A variety of statistical and machine-learning approaches have been proposed to identify vulnerable buildings for the prioritization of retrofitting. The present work investigates earthquake susceptibility through the combination of six building performance variables that can be used to obtain an optimal prediction of the damage state of reinforced concrete buildings using an artificial neural network (ANN). To this end, a multi-layer perceptron network is trained and optimized using a database of 484 damaged buildings from the Düzce earthquake in Turkey. The results demonstrate the feasibility and effectiveness of the selected ANN approach for classifying concrete structural damage, which can be used as a preliminary assessment technique to identify vulnerable buildings in disaster risk-management programs.
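A multi-layer perceptron of the kind described can be sketched compactly: one hidden layer of sigmoid units trained by backpropagation. The two toy features and binary damage labels below are invented stand-ins for the six building performance variables and damage states of the Düzce database:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyMLP:
    """One-hidden-layer perceptron for binary damage classification."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
             for ws, b in zip(self.w1, self.b1)]
        y = sigmoid(sum(w * hi for w, hi in zip(self.w2, h)) + self.b2)
        return h, y

    def train(self, data, epochs=3000, lr=0.5):
        # stochastic gradient descent with backpropagation (squared error)
        for _ in range(epochs):
            for x, t in data:
                h, y = self.forward(x)
                dy = (y - t) * y * (1 - y)
                for j, hj in enumerate(h):
                    dh = dy * self.w2[j] * hj * (1 - hj)  # use w2 before updating it
                    self.w2[j] -= lr * dy * hj
                    for i, xi in enumerate(x):
                        self.w1[j][i] -= lr * dh * xi
                    self.b1[j] -= lr * dh
                self.b2 -= lr * dy

# toy rows: (normalized building features, damage label: 0 = light, 1 = severe)
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
net = TinyMLP(n_in=2, n_hidden=3)
net.train(data)
preds = [round(net.forward(x)[1]) for x, _ in data]
print(preds)
```

A real study would of course use a mature library with proper train/test splits and hyperparameter search rather than a hand-rolled network.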
Discrete function theory in the higher-dimensional setting has been in active development for many years. However, the available results focus on the discrete setting for such canonical domains as the half-space, while the case of bounded domains has generally remained unconsidered. This paper therefore extends higher-dimensional function theory to the case of arbitrary bounded domains in R^n. Along the way, a discrete Stokes formula, a discrete Borel–Pompeiu formula, as well as discrete Hardy spaces for general bounded domains are constructed. Finally, several discrete Hilbert problems are considered.
In this research, an attempt was made to reduce the input dimension of wavelet-ANFIS/ANN (artificial neural network / adaptive neuro-fuzzy inference system) models, both to obtain reliable forecasts and to decrease computational cost. To this end, principal component analysis (PCA) was performed on the input time series decomposed by a discrete wavelet transform before feeding the ANN/ANFIS models. The models were applied to forecasting dissolved oxygen (DO) in rivers, an important variable affecting aquatic life and water quality. The current values of DO, water surface temperature, salinity, and turbidity were used as input variables to forecast DO three time steps ahead. The results revealed that PCA can be employed as a powerful tool for dimension reduction of the input variables and for detecting their inter-correlation. Results of the PCA-wavelet-ANN models are compared with those obtained from wavelet-ANN models, the former having the advantage of less computational time. For the ANFIS models, PCA is even more beneficial, as it prevents wavelet-ANFIS models from creating too many rules, which deteriorates their efficiency. Moreover, manipulating the wavelet-ANFIS models using PCA leads to a significant decrease in computational time. Overall, the PCA-wavelet-ANN/ANFIS models can provide reliable forecasts of dissolved oxygen as an important water quality indicator in rivers.
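The dimension-reduction step can be illustrated with a bare-bones PCA: form the covariance matrix of the (standardized) inputs and extract the leading principal component by power iteration. The four-variable rows below are invented, not river measurements:

```python
import math

def covariance_matrix(rows):
    # sample covariance of row vectors (divides by n - 1)
    n, d = len(rows), len(rows[0])
    means = [sum(r[i] for r in rows) / n for i in range(d)]
    c = [[0.0] * d for _ in range(d)]
    for r in rows:
        for i in range(d):
            for j in range(d):
                c[i][j] += (r[i] - means[i]) * (r[j] - means[j]) / (n - 1)
    return c

def leading_component(cov, iters=200):
    # power iteration: repeated multiplication converges to the
    # eigenvector of the largest eigenvalue (the first PC)
    d = len(cov)
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# invented rows: (DO, temperature, salinity, turbidity), already standardized;
# DO and temperature co-vary, so one component captures most of the variance
rows = [(1.0, 0.9, 0.1, 0.0), (0.8, 1.1, -0.1, 0.1),
        (-1.0, -0.9, 0.0, -0.1), (-0.8, -1.1, 0.0, 0.0)]
pc1 = leading_component(covariance_matrix(rows))
scores = [sum(p * x for p, x in zip(pc1, r)) for r in rows]  # reduced 1-D inputs
print([round(s, 2) for s in scores])
```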
This expert report deals with the innovation landscape of German journalism. Innovation is understood as an essential prerequisite for developing viable solutions to the current problems of journalism. The report centers on the question of how innovation policy in journalism, i.e. public-sector support for innovation, can be designed to function well. It follows the innovation systems approach, which identifies problems, barriers and obstacles that fundamentally stand in the way of the innovative capacity of journalism in Germany.
Der perfekte Bankraub
(2020)
Financial independence, being able to survive, being a superhero or a pop star, an adrenaline rush, lifelong complicity and eternal romantic attachment, conspiracy, victorious outwitting, techniques of deception: the fantasies woven around the idea of the bank robbery are as varied as the people who have them. A bank heist is probably the dream of many, given the increasing precarization of personal economies, and, at the same time or precisely because of this, a spectacularized, almost pop-cultural event that is well documented in the media and illustrated and spun further in countless films.
Personalized ventilation (PV) can improve thermal comfort as well as the quality of inhaled air by supplying fresh air separately to each workplace. This paper investigates the effect of PV on the thermal comfort of occupants under summer boundary conditions. Two approaches for evaluating the cooling effect of PV were examined: one based on (1) the equivalent temperature and one based on (2) thermal sensation. The evaluation is based on data measured in a climate chamber as well as on numerically simulated data. Before the simulations were carried out, the numerical model was first validated against the measured data. The results show that the approach based on thermal sensation can be more appropriate for evaluating the cooling effect of PV, since it better accounts for the complex physiological factors involved.
This thesis explores how cultural heritage plays a role in the development of urban identity by engaging both actively and passively with memory, i.e. remembering and forgetting. I argue that architectural heritage is a medium where specific cultural and social decisions form its way of presentation, and it reflects the values and interests of the period. By the process of remembering and forgetting, the meanings between inhabitant and object in urban environment are practiced, and the meanings are created.
To enable the research in narrative observation, cultural tourism management is chosen as the main research object, as it reflects the alteration of the interaction between architectural heritage and urban identity. In identifying the role of heritage management, the definition of social resilience and the prospects of cultural heritage as a means of social resilience are addressed. The case region of the research is East Germany; the study therefore examines the distinct approaches and objectives regarding heritage management under the different political systems along the German reunification process.
The framework is based on various theoretical paradigms to investigate the broad research questions: 1) What is the role of historic urban quarters in the revitalisation of East German towns? 2) How was the transition processed by cultural heritage management? 3) How did policy affect residents’ lives?
The case study is applied at the macro level (city level: Gotha and Eisenach) and the micro level (object level: specific heritage sites) to analyse the performance of selective remembering and the making of tourist destinations through giving significance to specific heritage. By means of site observations, archival research, qualitative interviews, photographs, and discourse analysis of printed tourism materials, the study demonstrates that certain sites and characteristics of the city enable creating and focusing messages, which aids social resilience.
Combining theory and empirical studies, this thesis attempts to widen the academic discussion regarding the practice of remembering and forgetting driven by cultural heritage. The thesis argues for cultural heritage tourism as an element of social resilience, one that embraces the historic and cultural identity of the inhabitants.
This study investigates the performance of two systems: personalized ventilation (PV) and ductless personalized ventilation (DPV). Even though the literature indicates a compelling performance of PV, it is not often used in practice due to its impracticality. Therefore, the present study assesses the possibility of replacing the inflexible PV with DPV in office rooms equipped with displacement ventilation (DV) in the summer season. Numerical simulations were utilized to evaluate the inhaled concentration of pollutants when PV and DPV are used. The systems were compared in a simulated office with two occupants: a susceptible occupant and a source occupant. Three types of pollution were simulated: exhaled infectious air, dermally emitted contamination, and room contamination from a passive source. Results indicated that PV improved the inhaled air quality regardless of the location of the pollution source; a higher PV supply flow rate positively impacted the inhaled air quality. Contrarily, the performance of DPV was highly sensitive to the source location and the personalized flow rate. A higher DPV flow rate tends to decrease the inhaled air quality due to increased mixing of pollutants in the room. Moreover, both systems achieved better results when the personalized system of the source occupant was switched off.
Coronary Artery Disease Diagnosis: Ranking the Significant Features Using a Random Trees Model
(2020)
Heart disease is one of the most common diseases among middle-aged people. Among the many heart diseases, coronary artery disease (CAD) is a common cardiovascular disease with a high death rate. The most popular tool for diagnosing CAD is medical imaging, e.g., angiography. However, angiography is costly and associated with a number of side effects. Hence, the purpose of this study is to increase the accuracy of coronary heart disease diagnosis by selecting significant predictive features in order of their ranking. We propose an integrated method using machine learning: the methods of random trees (RTs), the C5.0 decision tree, support vector machine (SVM), and the decision tree of Chi-squared automatic interaction detection (CHAID) are used in this study. The proposed method shows promising results, and the study confirms that the RTs model outperforms the other models.
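Tree-based models such as RTs rank features by how much a split on them reduces impurity. A minimal sketch of that criterion using Gini impurity on single-feature splits; the patient rows and feature names are invented, not the study's CAD data:

```python
def gini(labels):
    # Gini impurity of a set of binary labels
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 ** 2 - (1.0 - p1) ** 2

def split_gain(values, labels, threshold):
    # impurity decrease achieved by splitting at `threshold`
    left = [l for v, l in zip(values, labels) if v <= threshold]
    right = [l for v, l in zip(values, labels) if v > threshold]
    n = len(labels)
    child = (len(left) * gini(left) + len(right) * gini(right)) / n
    return gini(labels) - child

def best_gain(values, labels):
    # best achievable impurity decrease for one feature
    return max(split_gain(values, labels, t) for t in sorted(set(values))[:-1])

# invented patient rows: (age, cholesterol, typical chest pain) -> CAD label
rows = [(45, 180, 0, 0), (52, 240, 1, 1), (61, 260, 1, 1),
        (40, 200, 0, 0), (58, 210, 1, 1), (63, 230, 0, 0)]
labels = [r[3] for r in rows]
features = ["age", "cholesterol", "chest_pain"]
gains = {f: best_gain([r[i] for r in rows], labels)
         for i, f in enumerate(features)}
ranking = sorted(gains, key=gains.get, reverse=True)
print(ranking)
```

An ensemble like RTs averages such gains over many randomized trees, which makes the ranking far more robust than this single-split sketch.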
Cooling Performance of a Novel Circulatory Flow Concentric Multi-Channel Heat Sink with Nanofluids
(2020)
Heat rejection from electronic devices such as processors necessitates a high heat removal rate. The present study focuses on a novel liquid-cooled heat sink geometry made from four channels (width 4 mm, depth 3.5 mm) configured in a concentric shape with alternate flow passages (slots with a 3 mm gap). The cooling performance of the heat sink was tested under simulated controlled conditions. The bottom surface of the heat sink was heated at a constant heat flux based on dissipated powers of 50 W and 70 W. The computations were carried out for different volume fractions of nanoparticles, from 0.5% to 5%, with water as the base fluid, at flow rates of 30 to 180 mL/min. The results showed a higher rate of heat rejection from the nanofluid-cooled heat sink compared with water. The enhancement in performance was analyzed by means of the difference between the nanofluid outlet temperature and the water outlet temperature under similar operating conditions. The enhancement was ~2% for a 0.5% volume fraction and ~17% for a 5% volume fraction.
The effect of urban form on energy consumption has been the subject of various studies around the world. Having examined the effect of buildings on energy consumption, these studies indicate that the physical form of a city has a notable impact on the amount of energy consumed in its spaces. The present study identified the variables that affected energy consumption in residential buildings and analyzed their effects on energy consumption in four neighborhoods in Tehran: Apadana, Bimeh, Ekbatan-phase I, and Ekbatan-phase II. After extracting the variables, their effects are estimated with statistical methods, and the results are compared with the land surface temperature (LST) remote sensing data derived from Landsat 8 satellite images taken in the winter of 2019. The results showed that physical variables, such as the size of buildings, population density, vegetation cover, texture concentration, and surface color, have the greatest impacts on energy usage. For the Apadana neighborhood, the factors with the most potent effect on energy consumption were found to be the size of buildings and the population density. However, for other neighborhoods, in addition to these two factors, a third factor was also recognized to have a significant effect on energy consumption. This third factor for the Bimeh, Ekbatan-I, and Ekbatan-II neighborhoods was the type of buildings, texture concentration, and orientation of buildings, respectively.
This study applies reliability analysis to address the mechanical behaviour issues in the current structural design of fabric structures. Purely predictive material models are highly desirable to facilitate an optimized design scheme and to significantly reduce time and cost at the design stage, for example by replacing experimental characterization.
The present study examined three major tasks: a) single-objective optimization, b) sensitivity analyses, and c) multi-objective optimization of proposed weave structures for woven fabric composites. For the single-objective optimization task, the first goal is to optimize the elastic properties of the proposed complex weave structure on a unit-cell basis using periodic boundary conditions.
We predict the geometric characteristics regarding skewness of woven fabric composites via an Evolutionary Algorithm (EA) and a parametric study. We also demonstrate the effect of complex weave structures on the fray tendency in woven fabric composites via a tightness evaluation. We utilize a procedure that does not require a numerical averaging process for evaluating the elastic properties of woven fabric composites. The fray tendency and skewness of woven fabrics depend upon the behaviour of the floats, which is related to the weave factor. The results of this study may suggest a broader view for further research into the effects of complex weave structures, or may provide an alternative to the fray and skewness problems of current weave structures in woven fabric composites.
A comprehensive study is conducted on the complex weave structure model, which adopts the dry woven fabric of the most promising pattern from the single-objective optimization and incorporates the uncertainty parameters of woven fabric composites. The comprehensive study covers regression-based and variance-based sensitivity analyses. The goal of the second task is to introduce the fabric uncertainty parameters and to elaborate how they can be incorporated into finite element models of macroscopic material parameters, such as the elastic modulus and shear modulus of dry woven fabric subjected to uniaxial and biaxial deformations. Significant correlations in the study indicate the need for a thorough investigation of woven fabric composites under uncertainty parameters. The approach described here could serve as an alternative for identifying effective material properties without prolonged time consumption and expensive experimental tests.
The last part focuses on a hierarchical stochastic multi-scale optimization approach (fine-scale and coarse-scale optimizations) under uncertain geometrical parameters for hybrid composites with a complex weave structure. The fine-scale optimization determines the best lamina pattern that maximizes the macroscopic elastic properties, conducted by the EA under the following uncertain mesoscopic parameters: yarn spacing, yarn height, yarn width and misalignment of the yarn angle. The coarse-scale optimization optimizes the stacking sequence of a symmetric hybrid laminated composite plate with uncertain mesoscopic parameters by employing the Ant Colony Algorithm (ACO). The objective of the coarse-scale optimization is to minimize the cost (C) and weight (W) of the hybrid laminated composite plate, considering the fundamental frequency and the buckling load factor as design constraints.
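A minimal sketch of the coarse-scale step, assuming toy, hypothetical per-ply cost, weight and stiffness values and a single aggregate stiffness constraint standing in for the frequency and buckling constraints; the material names, numbers, and weighted-sum scalarization are illustrative, not taken from the study.

```python
import random

random.seed(42)

# Hypothetical ply options: (cost, weight, stiffness) per ply.
MATERIALS = {"carbon": (8.0, 1.0, 3.0), "glass": (2.0, 1.8, 1.0)}
NAMES = list(MATERIALS)
N_PLIES = 8            # plies in one symmetric half of the laminate
MIN_STIFFNESS = 14.0   # stand-in for frequency/buckling constraints

def evaluate(seq):
    cost = sum(MATERIALS[m][0] for m in seq)
    weight = sum(MATERIALS[m][1] for m in seq)
    stiff = sum(MATERIALS[m][2] for m in seq)
    # weighted-sum of C and W, with a large penalty if infeasible
    return cost + 5.0*weight + (0.0 if stiff >= MIN_STIFFNESS else 1e3)

def ant_colony(n_ants=20, n_iter=100, rho=0.1, q=1.0):
    tau = [{m: 1.0 for m in NAMES} for _ in range(N_PLIES)]  # pheromone
    best_seq, best_obj = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # each ant builds a stacking sequence ply by ply
            seq = [random.choices(NAMES, weights=[tau[i][m] for m in NAMES])[0]
                   for i in range(N_PLIES)]
            obj = evaluate(seq)
            if obj < best_obj:
                best_seq, best_obj = seq, obj
        # evaporate, then reinforce the best-so-far sequence
        for i in range(N_PLIES):
            for m in NAMES:
                tau[i][m] *= (1.0 - rho)
            tau[i][best_seq[i]] += q / best_obj
    return best_seq, best_obj

best_seq, best_obj = ant_colony()
```

With these toy numbers the optimum uses the cheapest mix of glass plies that still meets the stiffness floor, which is exactly the cost/weight trade-off the coarse-scale stage negotiates.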
Based on the uncertainty criteria of the design parameters, the appropriate variation required by structural design standards can be evaluated using a reliability tool, and an optimized design decision that accounts for cost can then be determined.
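A first-pass reliability evaluation of this kind can be sketched as a crude Monte Carlo estimate of the failure probability and the Cornell reliability index, assuming a hypothetical linear limit state with purely illustrative normal distributions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical limit state g = R - S: failure when the load effect S
# exceeds the resistance R; the distributions are illustrative only.
R = rng.normal(30.0, 3.0, n)     # resistance (capacity)
S = rng.normal(20.0, 4.0, n)     # load effect (demand)
g = R - S
pf = np.mean(g < 0.0)            # Monte Carlo failure probability
beta = g.mean() / g.std()        # Cornell reliability index
```

Comparing `beta` against a target index from the design standard (and re-running with candidate designs of different cost) is the decision step the abstract describes.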
Despite digitization and platformization, mass media and established media companies still play a crucial role in the provision of journalistic content in democratic societies. Competition is one key driver of (media) company behavior and is considered to have an impact on the media’s performance. However, theory and empirical research are ambiguous about this relationship. The objective of this article is to empirically analyze the effect of competition on media performance in a cross-national context. We assessed media performance of media companies as the importance of journalistic goals within their stated corporate goal system. We conducted a content analysis of letters to the shareholders in annual reports of more than 50 media companies from 2000 to 2014 to operationalize journalistic goal importance. Results from a fixed effects regression analysis, as well as from a fuzzy set qualitative comparative analysis, suggest that competition has a positive effect on the importance of journalistic goals, while the existence of a strong public service media sector appears to have the effect of “crowding out” commercial media companies.
Wind effects can be critical for the design of lifelines such as long-span bridges. The existence of a significant number of aerodynamic force models, used to assess the performance of bridges, poses an important question regarding their comparison and validation. This study utilizes a unified set of metrics for a quantitative comparison of time-histories in bridge aerodynamics, which exhibit a host of characteristics. Accordingly, nine comparison metrics are included to quantify the discrepancies in local and global signal features such as phase, time-varying frequency and magnitude content, probability density, nonstationarity and nonlinearity. Among these, seven metrics available in the literature are introduced after recasting them for time-histories associated with bridge aerodynamics. Two additional metrics are established to overcome the shortcomings of the existing metrics. The performance of the comparison metrics is first assessed using generic signals with prescribed signal features. Subsequently, the metrics are applied to a practical example from bridge aerodynamics to quantify the discrepancies in the aerodynamic forces and response based on numerical and semi-analytical aerodynamic models. In this context, it is demonstrated how a discussion based on the set of comparison metrics presented here can aid a model evaluation by offering deeper insight. The outcome of the study is intended to provide a framework for quantitative comparison and validation of aerodynamic models based on the underlying physics of fluid-structure interaction. Immediate further applications are expected for the comparison of time-histories that are simulated by data-driven approaches.
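Two of the simplest metrics of this family, a relative magnitude discrepancy and a phase discrepancy at the dominant frequency, can be sketched as follows; these are generic stand-ins to illustrate the idea, not the nine metrics of the study.

```python
import numpy as np

def magnitude_metric(ref, sig):
    # relative RMS discrepancy; 0 means identical magnitude content
    return np.linalg.norm(sig - ref) / np.linalg.norm(ref)

def phase_metric(ref, sig):
    # phase discrepancy (rad) at the dominant frequency of the reference
    R, S = np.fft.rfft(ref), np.fft.rfft(sig)
    k = int(np.argmax(np.abs(R[1:]))) + 1   # dominant nonzero frequency bin
    return float(np.angle(S[k] / R[k]))

# Toy check: a 1 Hz "response" delayed by 0.05 s relative to the reference
dt = 0.01
t = np.arange(0.0, 10.0, dt)
ref = np.sin(2*np.pi*t)
lagged = np.sin(2*np.pi*(t - 0.05))
```

Here the magnitude metric alone would flag a discrepancy even though the two signals differ only in phase, which is precisely why a set of complementary metrics, rather than a single number, is needed.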
This research aims to model soil temperature (ST) using the machine learning models of the multilayer perceptron (MLP) algorithm and the support vector machine (SVM), hybridized with the Firefly optimization algorithm (FFA), i.e. MLP-FFA and SVM-FFA. In the current study, measured ST and meteorological parameters of the Tabriz and Ahar weather stations over the period 2013–2015 are used for training and testing of the studied models with delays of one and two days. To ascertain conclusive results for validation of the proposed hybrid models, the error metrics are benchmarked in an independent testing period. Moreover, Taylor diagrams were utilized for that purpose. The obtained results showed that, in the case of a one-day delay, except in predicting ST at 5 cm below the soil surface (ST5cm) at Tabriz station, MLP-FFA produced superior results compared with the MLP, SVM, and SVM-FFA models. However, for a two-day delay, MLP-FFA indicated increased accuracy in predicting ST5cm and ST20cm of Tabriz station and ST10cm of Ahar station in comparison with SVM-FFA. Additionally, for all of the prescribed models, the performance of the MLP-FFA and SVM-FFA hybrid models in the testing phase was found to be meaningfully superior to that of the classical MLP and SVM models.
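The Firefly optimization algorithm at the core of both hybrids can be sketched as follows; a sphere function stands in for the actual objective (e.g. the validation error of the MLP or SVM over its hyperparameters), and all settings (population size, attraction and cooling constants, search bounds) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def sphere(x):
    # stand-in objective; in the hybrid models this would be the
    # validation error of an MLP/SVM with hyperparameter vector x
    return float(np.sum(x**2))

def firefly(obj, dim=2, n=15, iters=200, beta0=1.0, gamma=0.1, alpha=0.2):
    X = rng.uniform(-2.0, 2.0, (n, dim))       # firefly positions
    F = np.array([obj(x) for x in X])          # brightness = -objective
    for t in range(iters):
        a = alpha * 0.97**t                    # cool the random step
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:                # j is brighter: attract i
                    r2 = np.sum((X[i] - X[j])**2)
                    X[i] += (beta0*np.exp(-gamma*r2)*(X[j] - X[i])
                             + a*(rng.random(dim) - 0.5))
                    F[i] = obj(X[i])
    k = int(np.argmin(F))
    return X[k], F[k]

best_x, best_f = firefly(sphere)
```

In the hybrid setting, each firefly position encodes a candidate hyperparameter set, so the swarm search replaces manual tuning of the MLP and SVM.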