TY - JOUR A1 - Knecht, Katja T1 - Augmented Urban Model: Ein Tangible User Interface zur Unterstützung von Stadtplanungsprozessen N2 - Im architektonischen und städtebaulichen Kontext erfüllen physische und digitale Modelle aufgrund ihrer weitgehend komplementären Eigenschaften und Qualitäten unterschiedliche, nicht verknüpfte Aufgaben und Funktionen im Entwurfs- und Planungsprozess. Während physische Modelle vor allem als Darstellungs- und Kommunikationsmittel, aber auch als Arbeitswerkzeug genutzt werden, unterstützen digitale Modelle darüber hinaus die Evaluation eines Entwurfs durch computergestützte Analyse- und Simulationstechniken. Analysiert wurden im Rahmen der in diesem Arbeitspapier vorgestellten Arbeit neben dem Einsatz des Modells als analogem und digitalem Werkzeug im Entwurf die Bedeutung des Modells für den Arbeitsprozess sowie Vorbilder aus dem Bereich der Tangible User Interfaces mit Bezug zu Architektur und Städtebau. Aus diesen Betrachtungen heraus wurde ein Prototyp entwickelt, das Augmented Urban Model, das unter anderem auf den frühen Projekten und Forschungsansätzen aus dem Gebiet der Tangible User Interfaces aufsetzt, wie dem metaDESK von Ullmer und Ishii und dem Urban Planning Tool Urp von Underkoffler und Ishii. Das Augmented Urban Model zielt darauf ab, die im aktuellen Entwurfs- und Planungsprozess fehlende Brücke zwischen realen und digitalen Modellwelten zu schlagen und gleichzeitig eine neue tangible Benutzerschnittstelle zu schaffen, welche die Manipulation von und die Interaktion mit digitalen Daten im realen Raum ermöglicht. T3 - Arbeitspapiere Informatik in der Architektur - Nr. 6 KW - tangible user interface KW - simulation Y1 - 2011 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160823-26740 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - BOOK ED - Maier, Matthias ED - Simon-Ritz, Frank T1 - Alles digital? E-Books in Studium und Forschung : Weimarer EDOC-Tage 2011 N2 - Es ist ein Bild aus alten Tagen: ein wissbegieriger Student, auf der Suche nach fundierter wissenschaftlicher Information, begibt sich an den heiligsten Ort aller Bücher – die Universitätsbibliothek. Doch seit einiger Zeit tummeln sich Studierende nicht mehr nur in Bibliotheken, sondern auch immer häufiger im Internet. Sie suchen und finden dort digitale Bücher, sogenannte E-Books. Wie lässt sich der Wandel durch den Einzug des E-Books in das etablierte Forschungssystem beschreiben, welche Konsequenzen lassen sich daraus ablesen und wird schließlich alles digital, sogar die Bibliothek? Diesen Fragen geht ein elfköpfiges Expertenteam aus Deutschland und der Schweiz während der zweitägigen Konferenz auf den Grund. Bei den Weimarer E-DOC-Tagen geht es nun um die Veränderung des institutionellen Gefüges rund um das digitale Buch. Denn traditionell sind Verlage und Bibliotheken wichtige Bestandteile der Wissensversorgung in Studium und Lehre. Doch mit dem Aufkommen des E-Books verlagert sich die Recherche mehr und mehr ins Internet. Die Suchmaschine Google tritt als neuer Konkurrent der klassischen Bibliotheksrecherche auf. Aber auch Verlage müssen verstärkt auf die neuen Herausforderungen eines digitalen Buchmarktes reagieren. In Kooperation mit der Universitätsbibliothek und dem Master-Studiengang Medienmanagement diskutieren Studierende, Wissenschaftler, Bibliothekare und Verleger, wie das E-Book unseren Umgang mit Literatur verändert. Der Tagungsband stellt alle Perspektiven und Ergebnisse zum Nachlesen zusammen.
KW - Elektronisches Buch KW - Studium KW - Lehre KW - Forschung KW - EDOC-Tage Weimar 2011 KW - E-Book KW - Digitaler Buchmarkt Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120223-15699 SN - 978-3-86068-454-2 N1 - Die Tagung fand am 27. - 28. Mai 2011 an der Bauhaus-Universität Weimar im Audimax statt. PB - Verlag der Bauhaus-Universität CY - Weimar ER - TY - JOUR A1 - Köhler, Hermann A1 - König, Reinhard T1 - Aktionsräume in Dresden N2 - In vorliegender Studie werden die Aktionsräume von Befragten in Dresden über eine standardisierte Befragung (n=360) untersucht. Die den Aktionsräumen zugrundeliegenden Aktivitäten werden unterschieden in Einkaufen für den täglichen Bedarf, Ausgehen (z.B. in Café, Kneipe, Gaststätte), Erholung im Freien (z.B. spazieren gehen, Nutzung von Grünanlagen) und private Geselligkeit (z.B. Feiern, Besuch von Verwandten/Freunden). Der Aktionsradius wird unterschieden in Wohnviertel, Nachbarviertel und sonstiges weiteres Stadtgebiet. Um aus den vier betrachteten Aktivitäten einen umfassenden Kennwert für den durchschnittlichen Aktionsradius eines Befragten zu bilden, wird ein Modell für den Kennwert eines Aktionsradius entwickelt. Die Studie kommt zu dem Ergebnis, dass das Alter der Befragten einen signifikanten – wenn auch geringen – Einfluss auf den Aktionsradius hat. Das Haushaltsnettoeinkommen hat einen mit Einschränkung signifikanten, ebenfalls geringen Einfluss auf alltägliche Aktivitäten der Befragten. T3 - Arbeitspapiere Informatik in der Architektur - Nr. 14 KW - Aktionsraumforschung KW - Quantitative Sozialforschung KW - Dresden Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160822-26726 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - JOUR A1 - Köhler, Hermann T1 - Ergebnisse der Befragung zu Wohnstandortpräferenzen von Lebensweltsegmenten in Dresden N2 - In vorliegender Studie werden die Wohnstandortpräferenzen der Sinus-Milieugruppen in Dresden über eine standardisierte Befragung (n=318) untersucht. Es wird unterschieden zwischen handlungsleitenden Wohnstandortpräferenzen, die durch Anhaltspunkte auf der Handlungsebene stärker in Betracht gezogen werden sollten, und Wohnstandortpräferenzen, welche eher orientierenden Charakter haben. Die Wohnstandortpräferenzen werden untersucht anhand der Kategorien Ausstattung/Zustand der Wohnung/des näheren Wohnumfeldes, Versorgungsstruktur, soziales Umfeld, Baustrukturtyp, Ortsgebundenheit sowie des Aspektes des Images eines Stadtviertels. Um die Befragten den Sinus-Milieugruppen zuordnen zu können, wird ein Lebensweltsegment-Modell entwickelt, welches den Anspruch hat, die Sinus-Milieugruppen in der Tendenz abzubilden. Die Studie kommt zu dem Ergebnis, dass die Angehörigen der verschiedenen Lebensweltsegmente in jeder Kategorie - wenn auch z.T. auf geringerem Niveau - signifikante Unterschiede in der Bewertung einzelner Wohnstandortpräferenzen aufweisen. T3 - Arbeitspapiere Informatik in der Architektur - Nr.
12 KW - Milieuforschung KW - Residentielle Mobilität KW - Wohnstandortpräferenzen KW - Wohnstandortentscheidungen KW - Quantitative Sozialforschung Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160822-26704 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - THES A1 - Azari, Banafsheh T1 - Bidirectional Texture Functions: Acquisition, Rendering and Quality Evaluation N2 - As one of its primary objectives, Computer Graphics aims at the simulation of fabrics’ complex reflection behaviour. Characteristic surface reflectance effects of fabrics, such as highlights, anisotropy or retro-reflection, are difficult to synthesize. This problem can be addressed with Bidirectional Texture Functions (BTFs), 2D textures captured under various light and view directions. However, the acquisition of Bidirectional Texture Functions requires an expensive setup, and the measurement process is very time-consuming. Moreover, the size of BTF data can range from hundreds of megabytes to several gigabytes, as a large number of high-resolution pictures have to be used in the ideal case. Furthermore, the three-dimensional textured models rendered through the BTF rendering method are subject to various types of distortion during acquisition, synthesis, compression, and processing. An appropriate image quality assessment scheme is a useful tool for evaluating image processing algorithms, especially algorithms designed to leave the image visually unchanged. In this contribution, we present and conduct an investigation aimed at locating a robust threshold for downsampling BTF images without losing perceptual quality. To this end, an experimental study on how decreasing the texture resolution influences the perceived quality of the rendered images is presented and discussed. Next, two basic improvements to the use of BTFs for rendering are presented: firstly, the study addresses the cost of BTF acquisition by introducing a flexible low-cost step motor setup for BTF acquisition that allows a high-quality BTF database to be captured at user-defined arbitrary angles. Secondly, the number of acquired textures is adapted to the perceptual quality of the renderings so that the database is not overloaded and fits better in memory when rendered. Although visual attention is one of the essential attributes of the human visual system (HVS), it is neglected in most existing quality metrics. In this thesis an objective quality metric based on extracting visual attention regions from images, together with an investigation of the influence of visual attention on perceived image quality, is proposed: the Visual Attention Based Image Quality Metric (VABIQM). The novel metric indicates that considering visual saliency offers significant benefits for constructing objective quality metrics that predict visible quality differences in images rendered from compressed and non-compressed BTFs, and it also outperforms existing straightforward image quality metrics at detecting perceivable differences. KW - Wahrnehmung KW - Bildqualität KW - BTF Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20180820-37790 ER - TY - JOUR A1 - Tonn, Christian A1 - Tatarin, René T1 - Volumen Rendering in der Architektur: Überlagerung und Kombination von 3D Voxel Volumendaten mit 3D Gebäudemodellen N2 - Volumenrendering ist eine Darstellungstechnik, um verschiedene räumliche Mess- und Simulationsdaten anschaulich, interaktiv grafisch darzustellen.
Im folgenden Beitrag wird ein Verfahren vorgestellt, mehrere Volumendaten mit einem Architekturflächenmodell zu überlagern. Diese komplexe Darstellungsberechnung findet mit hardwarebeschleunigten Shadern auf der Grafikkarte statt. Im Beitrag wird hierzu der implementierte Softwareprototyp "VolumeRendering" vorgestellt. Neben dem interaktiven Berechnungsverfahren wurde ebenso Wert auf eine nutzerfreundliche Bedienung gelegt. Das Ziel bestand darin, eine einfache Bewertung der Volumendaten durch Fachplaner zu ermöglichen. Durch die Überlagerung, z. B. verschiedener Messverfahren mit einem Flächenmodell, ergeben sich Synergien und neue Auswertungsmöglichkeiten. Abschließend wird anhand von Beispielen aus einem interdisziplinären Forschungsprojekt die Anwendung des Softwareprototyps illustriert. T3 - Arbeitspapiere Informatik in der Architektur - Nr. 13 KW - Multiple Volume Rendering KW - Overlay KW - 3D Surface Models Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160822-26718 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - THES A1 - Gerold, Fabian T1 - Konzepte zur interaktiven Entwurfsraum-Exploration im Tragwerksentwurf N2 - Der Entwurfsraum für den Entwurf eines Tragwerks ist ein n-dimensionaler Raum, der aus allen freien Parametern des Modells aufgespannt wird. Traditionell werden nur wenige Punkte dieses Raumes durch eine numerische (computergestützte) Simulation evaluiert, meist auf Basis der Finite-Elemente-Methode. Mehrere Faktoren führen dazu, dass heute oft viele Revisionen eines Simulationsmodells durchlaufen werden: Zum einen ergeben sich oft Planungsänderungen, zum anderen ist oft die Untersuchung von Planungsalternativen und die Suche nach einem Optimum wünschenswert. In dieser Arbeit soll für ein vorhandenes Finite-Elemente-Framework die sequentielle Datei-Eingabeschnittstelle durch eine Netzwerkschnittstelle ersetzt werden, die den Erfordernissen einer interaktiven Arbeitsweise entspricht. So erlaubt die hier konzipierte Schnittstelle interaktive, inkrementelle Modelländerungen sowie Status- und Berechnungsergebnis-Abfragen durch eine bidirektionale Schnittstelle. Die Kombination aus interaktiver numerischer Simulation und Interoperabilität durch die Anwendung von Konzepten zur Bauwerks-Informations-Modellierung im Tragwerksentwurf ist Ziel dieser Dissertation. Die Beschreibung der Konzeption und prototypischen Umsetzung ist Gegenstand der schriftlichen Arbeit. KW - Interaktive numerische Simulation KW - Industry Foundation Classes (IFC) Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20140408-21532 ER - TY - GEN A1 - Simon-Ritz, Frank A1 - Liehr, Harald S. T1 - Das Urheberrecht - ein Pulverfass für Lehre und Forschung N2 - Radiodiskussion bei bauhaus.fm am 5. November 2012. Harald S. Liehr ist Lektor und Leiter der Niederlassung Weimar des Böhlau-Verlags (Wien / Köln / Weimar), Dr. Frank Simon-Ritz ist Direktor der Universitätsbibliothek der Bauhaus-Universität Weimar. Die Fragen stellten René Tauschke und Jean-Marie Schaldach. KW - Urheberrecht Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20121130-17753 ER - TY - JOUR A1 - Kalisch, Dominik T1 - Wissen wer wo wohnt N2 - In cities people live together in neighbourhoods. Here they can find the infrastructure they need, ranging from shops for daily needs to life-cycle-based infrastructures such as kindergartens or nursing homes. But not all neighbourhoods are identical.
The infrastructure mixture varies from neighbourhood to neighbourhood, but different people have different needs, which can change, e.g., with their life-cycle situation or their affiliation with a specific milieu. We can assume that a person or family tries to settle in a specific neighbourhood that satisfies their needs. So, if the residents are happy with a neighbourhood, we can further assume that this neighbourhood satisfies their needs. The socio-economic panel (SOEP) of the German Institute for Economic Research (DIW) is a survey that investigates the economic structure of the German population. Every four years one part of this survey includes questions about what infrastructures can be found in the respondents' neighbourhood and about the respondents' satisfaction with their neighbourhood. Further, it is possible to add a milieu estimation for each respondent or household. This gives us the possibility to analyse the typical neighbourhoods in German cities as well as the infrastructure profiles of the different milieus. Therefore, we take the environment variables from the dataset and recode them into a binary variable – whether an infrastructure is available or not. According to Faust (2005), these sets can also be understood as a network of actors in a neighbourhood who share two, three or more infrastructures. Like these networks, this neighbourhood network can also be visualized as a bipartite affiliation network and therefore analysed using correspondence analysis. We will show how a neighbourhood analysis can benefit from an upstream correspondence analysis and how this could be done. We will also present and discuss the results of such an analysis. T3 - Arbeitspapiere Informatik in der Architektur - Nr. 11 KW - urban planning KW - cluster analysis KW - urban research-quantitative KW - complex data analysis KW - singular value decomposition Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160822-26695 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - JOUR A1 - Hahlbrock, David A1 - Braun, Michael A1 - Heidel, Robin A1 - Lemmen, Patrik A1 - Boumann, Roland A1 - Bruckmann, Tobias A1 - Schramm, Dieter A1 - Helm, Volker A1 - Willmann, Jan T1 - Cable Robotic 3D-printing: additive manufacturing on the construction site JF - Construction Robotics N2 - This paper outlines an important step in characterizing a novel field of robotic construction research in which a cable-driven parallel robot is used to extrude cementitious material in three-dimensional space, thus offering a comprehensive new approach to computational design and construction, and to robotic fabrication at larger scales. Developed by the Faculty of Art and Design at Bauhaus-University Weimar (Germany), the Faculty of Architecture at the University of Applied Sciences Dortmund (Germany) and the Chair of Mechatronics at the University of Duisburg-Essen (Germany), this approach offers unique advantages over existing additive manufacturing methods: the system is easily transportable and scalable, it does not require additional formwork or scaffolding, and it offers digital integration and informational oversight across the entire design and building process. This paper considers 1) key research components of cable robotic 3D-printing (such as computational design, material exploration, and robotic control), and 2) the integration of these parameters into a unified design and building process.
The demonstration of the approach at full scale is of particular concern. KW - Robotik KW - Automation KW - 3D-Printing KW - Cable-driven parallel robots Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20230124-48791 UR - https://link.springer.com/article/10.1007/s41693-022-00082-3 VL - 2022 SP - 1 EP - 14 PB - Springer International Publishing CY - Cham ER - TY - JOUR A1 - Stadler, Max T1 - Gründerzeit. Hightech und Alternativen der Wissenschaft in West-Berlin JF - NTM Zeitschrift für Geschichte der Wissenschaften, Technik und Medizin N2 - Zu den diversen Unternehmungen sozialbewegter „Gegenwissenschaft“, die um 1980 auf der Bildfläche der BRD erschienen, zählte der 1982 gegründete Berliner Wissenschaftsladen e. V., kurz WILAB – eine Art „alternatives“ Spin-off der Technischen Universität Berlin. Der vorliegende Beitrag situiert die Ausgründung des „Ladens“ im Kontext zeitgenössischer Fortschritte der (regionalen) Forschungs- und Technologiepolitik. Gezeigt wird, wie der deindustrialisierenden Inselstadt, qua „innovationspolitischer“ Gegensteuerung, dabei sogar eine gewisse Vorreiterrolle zukam: über die Stadtgrenzen hinaus sichtbare Neuerungen wie die Gründermesse BIG TECH oder das 1983 eröffnete Berliner Innovations- und Gründerzentrum (BIG), der erste „Incubator“ [sic] der BRD, etwa gingen auf das Konto der 1977/78 lancierten Technologie-Transferstelle der TU Berlin, TU-transfer. Anders gesagt: tendenziell bekam man es hier nun mit Verhältnissen zu tun, die immer weniger mit den Träumen einer „kritischen“, nicht-fremdbestimmten (Gegen‑)Wissenschaft kompatibel waren. Latent konträr zur historiographischen Prominenz des wissenschaftskritischen Zeitgeists fristeten „alternativen“ Zielsetzungen verpflichtete Unternehmungen wie „WILAB“ ein relativ marginalisiertes Nischendasein. Dennoch wirft das am WILAB verfolgte, so gesehen wenig aussichtsreiche Anliegen, eine andere, nämlich „humanere“ Informationstechnologie in die Wege zu leiten, ein instruktives Licht auf die Aufbrüche „unternehmerischer“ Wissenschaft in der BRD um 1980. KW - Berlin KW - Wissenschaftspolitik KW - Gegenwissenschaft KW - Gegenwissen KW - Informatik KW - Strukturkrise Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20230124-48800 UR - https://link.springer.com/article/10.1007/s00048-022-00352-9 VL - 2022 IS - 30 (2022) SP - 599 EP - 632 PB - Birkhäuser CY - Basel ER - TY - THES A1 - Dang, Trang T1 - Automated Detailing of 4D Schedules N2 - The increasing success of BIM (Building Information Model) and the emergence of its implementation in 3D construction models have paved the way for improving the scheduling process. Recent research on the application of BIM in scheduling has focused on quantity take-off, duration estimation for individual trades, schedule visualization, and clash detection. Several experiments indicated that the lack of detailed planning causes about 30% non-productive time and stacking of trades. However, detailed planning still has not been implemented in practice despite receiving a lot of interest from researchers. The reason is associated with the huge amount and complexity of input data. Creating a detailed plan is time-consuming: activities have to be decomposed manually, and the relevant detailed information collected and calculated. Moreover, the coordination of detailed activities requires much effort to deal with their complex constraints. This dissertation aims to support the generation of detailed schedules from a rough schedule.
It proposes a model for automated detailing of 4D schedules by integrating BIM, simulation and Pareto-based optimization. T3 - Schriften der Professur Baubetrieb und Bauverfahren - 32 KW - BIM KW - Simulation KW - Pareto optimization KW - multi-objective optimization KW - scheduling Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20141006-23103 ER - TY - THES A1 - Carvalho Daher, Cesar Felipe T1 - Horoskopos: a virtual planetarium for astrology N2 - This report details the development of Horoskopos, a virtual planetarium for astrology. This project was an attempt to develop a learning tool for studying astrological concepts as connected to observational astronomy. The premise that astrology and observational astronomy were once inseparable from each other in ancient times guided the conceptualization of this tool as an interactive planetarium. The main references were existing software and applications for visualization in astrology and astronomy. Professional astrology teachers were consulted in order to better understand the state of astrological teaching and learning, as well as existing tools and practice. Horoskopos was built using the Unity3D development interface, which is based on the C# programming language. It also relied on the Swiss Ephemeris coding interface from Astrodienst. The development process was experimental, and many of the required skills were acquired along the way. Usability tests were performed as new features were added to the interface. The final version of Horoskopos is fully usable, with many interactive visualization features and a defined visual identity. It was validated together with professional astrologers for its effectiveness in concept and visualization. KW - Mediendesign KW - Astrologie KW - Astronomie KW - Planetarium KW - Information design KW - Interaction design KW - Astrology KW - Observational astronomy Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220930-47181 ER - TY - CHAP A1 - König, Reinhard A1 - Treyer, Lukas A1 - Schmitt, Gerhard T1 - Graphical smalltalk with my optimization system for urban planning tasks T2 - 31st eCAADe Conference – Volume 2 N2 - Based on the description of a conceptual framework for the representation of planning problems on various scales, we introduce an evolutionary design optimization system. This system is exemplified by means of the generation of street networks with locally defined properties for centrality. We show three different scenarios for planning requirements and evaluate the resulting structures with respect to the requirements of our framework. Finally, the potentials and challenges of the presented approach are discussed in detail. KW - Städtebau KW - Architektur KW - Design optimization KW - evolutionary multi-criteria optimization KW - generative system integration KW - interactive planning support system Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25171 SP - 195 EP - 203 PB - TU Delft CY - Delft, Netherlands ER - TY - CHAP A1 - Chirkin, Artem A1 - König, Reinhard T1 - Concept of Interactive Machine Learning in Urban Design Problems : proceedings N2 - This work presents a concept of interactive machine learning in a human design process. An urban design problem is viewed as a multiple-criteria optimization problem. A defining feature of an urban design problem is the dependence of the design goal on the context of the problem.
We model the design goal as a randomized fitness measure that depends on the context. In terms of multiple-criteria decision analysis (MCDA), the defined measure corresponds to a subjective expected utility of a user. In the first stage of the proposed approach we let the algorithm explore a design space using clustering techniques. The second stage is an interactive design loop; the user makes a proposal, then the program optimizes it, gets the user’s feedback and returns control of the application interface back to the user. KW - MCDM KW - interactive machine learning KW - urban design KW - multiple-criteria optimization KW - Stadtgestaltung Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26000 UR - http://dl.acm.org/citation.cfm?id=2898365 SP - 10 EP - 13 PB - ACM New York, NY, USA CY - San Jose, CA, USA ER - TY - CHAP A1 - König, Reinhard A1 - Müller, Daniela T1 - Simulating the development of residential areas of the city of Vienna from 1888 to 2001 T2 - Compendium of Abstracts of the 8th International Conference on Urban Planning and Environment (UPE8) N2 - The structure and development of cities can be seen and evaluated from different points of view. By replicating the growth or shrinkage of a city using historical maps depicting different time states, we can obtain momentary snapshots of the dynamic mechanisms of the city. An examination of how these snapshots change over the course of time and a comparison of the different static time states reveals the various interdependencies of population density, technical infrastructure and the availability of public transport facilities. Urban infrastructure and facilities are not distributed evenly across the city – rather they are subject to different patterns and speeds of spread over the course of time and follow different spatial and temporal regularities. The reasons and underlying processes that cause the transition from one state to another result from the same recurring but varyingly pronounced hidden forces and their complex interactions. Such forces encompass a variety of economic, social, cultural and ecological conditions whose respective weighting defines the development of a city in general. Urban development is, however, not solely a product of the different spatial distribution of economic, legal or social indicators but also of the distribution of infrastructure. But to what extent is the development of a city affected by the changing provision of infrastructure? KW - urban simulation KW - Simulation Y1 - 2009 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26066 CY - Kaiserslautern, Germany ER - TY - CHAP A1 - Treyer, Lukas A1 - Klein, Bernhard A1 - König, Reinhard A1 - Meixner, Christine T1 - Lightweight urban computation interchange (LUCI) system T2 - Proceedings N2 - In this paper we introduce LUCI, a Lightweight Urban Calculation Interchange system, designed to bring the advantages of a calculation and content co-ordination system to small planning and design groups by means of an open source middle-ware. The middle-ware focuses on problems typical of urban planning and therefore features a geo-data repository as well as a job runtime administration to coordinate simulation models and their multiple views. The described system architecture is accompanied by two exemplary use cases that have been used to test and further develop our concepts and implementations.
KW - Luci KW - distributed computing KW - middleware Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-25982 UR - https://e-pub.uni-weimar.de/opus4/frontdoor/index/index/docId/2504 PB - FOSS4G CY - Seoul, South Korea ER - TY - CHAP A1 - Hijazi, Ihab Hamzi A1 - Hussein, M. H. A1 - König, Reinhard T1 - Enabling geo-design: Evaluating the capacity of 3D city model to support thermal design in building T2 - 9th 3DGeoInfo Conference N2 - Enabling geo-design: Evaluating the capacity of 3D city model to support thermal design in building KW - Informatik KW - bim; cad; citygml; gbxml; thermal design Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160118-25089 CY - Dubai, UAE ER - TY - RPRT A1 - König, Reinhard A1 - Tapias, Estefania A1 - Schmitt, Gerhard T1 - New Methods in Urban Analysis and Simulation: Documentation of teaching results from the spring semester 2015 N2 - Documentation of teaching results from the spring semester 2015 at the chair of Information Architecture at ETH Zurich KW - Architektur Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160118-25052 ER - TY - CHAP A1 - König, Reinhard ED - Martens, Bob ED - Wurzer, Gabriel ED - Grasl, Tomas ED - Lorenz, Wolfgang ED - Schaffranek, Richard T1 - CPlan: An Open Source Library for Computational Analysis and Synthesis T2 - 33rd eCAADe Conference N2 - Some CAAD packages offer additional support for the optimization of spatial configurations, but the possibilities for applying optimization are usually limited either by the complexity of the data model or by the constraints of the underlying CAAD system. Since we missed a system that allows experimenting with optimization techniques for the synthesis of spatial configurations, we have developed a collection of methods over the past years. This collection is now combined in the presented open source library for computational planning synthesis, called CPlan. The aim of the library is to provide an easy-to-use programming framework with a flat learning curve for people with basic programming knowledge. It offers an extensible structure that allows adding new customized parts for various purposes. In this paper the existing functionality of the CPlan library is described. KW - Architektur KW - Computer KW - CAAD KW - cplan KW - CAD Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160118-25037 SP - 245 EP - 250 PB - Vienna University of Technology CY - Vienna ER - TY - JOUR A1 - Knecht, Katja A1 - König, Reinhard T1 - Automatische Grundstücksumlegung mithilfe von Unterteilungsalgorithmen und typenbasierte Generierung von Stadtstrukturen N2 - Dieses Arbeitspapier beschreibt, wie ausgehend von einem vorhandenen Straßennetzwerk Bebauungsareale mithilfe von Unterteilungsalgorithmen automatisch umgelegt, d.h. in Grundstücke unterteilt, und anschließend auf Basis verschiedener städtebaulicher Typen bebaut werden können. Die Unterteilung von Bebauungsarealen und die Generierung von Bebauungsstrukturen unterliegen dabei bestimmten stadtplanerischen Einschränkungen, Vorgaben und Parametern. Ziel ist es, aus den dargestellten Untersuchungen heraus ein Vorschlagssystem für stadtplanerische Entwürfe zu entwickeln, das anhand der Umsetzung eines ersten Softwareprototyps zur Generierung von Stadtstrukturen weiter diskutiert wird. T3 - Arbeitspapiere Informatik in der Architektur - Nr.
15 KW - Automatisierung KW - Grundstücksumlegung KW - städtische Strukturen KW - Unterteilungsalgorithmen KW - Computational Design Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160822-26730 UR - http://infar.architektur.uni-weimar.de/service/drupal-infar/Arbeitspapiere ER - TY - RPRT A1 - König, Reinhard A1 - Tapias, Estefania A1 - Schmitt, Gerhard T1 - New Methods in Urban Analysis and Simulation: Documentation of teaching results from the autumn semester 2013 N2 - Documentation of teaching results from the autumn semester 2013 at ETH Zurich KW - Städtebau Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25168 ER - TY - THES A1 - Schrader, Kai T1 - Hybrid 3D simulation methods for the damage analysis of multiphase composites T1 - Hybride 3D Simulationsmethoden zur Abbildung der Schädigungsvorgänge in Mehrphasen-Verbundwerkstoffen N2 - Modern digital material approaches for the visualization and simulation of heterogeneous materials allow the investigation of the behavior of complex multiphase materials, with their physically nonlinear material response, at various scales. However, these computational techniques require extensive hardware resources with respect to computing power and main memory to numerically solve large-scale discretized models in 3D. Due to a very high number of degrees of freedom, which may rapidly increase into the two-digit million range, the limited hardware resources have to be utilized in the most efficient way to enable execution of the numerical algorithms in minimal computation time. Hence, in the field of computational mechanics, various methods and algorithms can lead to an optimized runtime behavior of nonlinear simulation models, several of which are proposed and investigated in this thesis. Today, the numerical simulation of damage effects in heterogeneous materials is performed by the adaptation of multiscale methods. A consistent modeling in three-dimensional space with an appropriate discretization resolution on each scale (based on a hierarchical or concurrent multiscale model), however, still poses computational challenges with respect to the convergence behavior, the scale transition or the solver performance of the weakly coupled problems. The computational efficiency and the distribution among available hardware resources (often based on a parallel hardware architecture) can be significantly improved. In the past years, high-performance computing (HPC) and graphics processing unit (GPU) based computation techniques were established for the investigation of scientific objectives. Their application results in the modification of existing and the development of new computational methods for the numerical implementation, which makes it possible to take advantage of massively clustered computer hardware resources. In the field of numerical simulation in material science, e.g. within the investigation of damage effects in multiphase composites, the suitability of such models is often restricted by the number of degrees of freedom (d.o.f.s) in the three-dimensional spatial discretization. This proves to be difficult for the type of implementation method used for the nonlinear simulation procedure and simultaneously has a great influence on memory demand and computational time.
In this thesis, a hybrid discretization technique has been developed for the three-dimensional discretization of a three-phase material that respects the numerical efficiency of nonlinear (damage) simulations of these materials. The increase in computational efficiency is enabled by the improved scalability of the numerical algorithms. Consequently, substructuring methods for partitioning the hybrid mesh were implemented, tested and adapted to the HPC computing framework, using several hundred CPU (central processing unit) nodes for building the finite element assembly. A memory-efficient, iterative and parallelized equation solver combined with a special preconditioning technique for solving the underlying equation system was modified and adapted to enable combined CPU- and GPU-based computations. Hence, the author recommends applying the substructuring method to hybrid meshes, which respects the different material phases and their mechanical behavior and enables splitting the structure into elastic and inelastic parts. The consideration of the nonlinear material behavior, specified for the corresponding phase, is then limited to the inelastic domains only, which decreases the computing time of the nonlinear procedure. Due to the high numerical effort of such simulations, an alternative approach for the nonlinear finite element analysis, based on sequential linear analysis, was implemented with respect to scalable HPC. The incremental-iterative procedure in finite element analysis (FEA) during the nonlinear step was then replaced by a sequence of linear FE analyses when damage in critical regions occurred, known in the literature as the saw-tooth approach. As a result, qualitative (smeared) crack initiation in 3D multiphase specimens has been efficiently simulated. T3 - ISM-Bericht // Institut für Strukturmechanik, Bauhaus-Universität Weimar - 2013,2 KW - high-performance computing KW - finite element method KW - heterogeneous material KW - domain decomposition KW - scalable smeared crack analysis KW - FEM KW - multiphase KW - damage KW - HPC KW - solver Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20131021-20595 ER - TY - THES A1 - Carvajal Bermúdez, Juan Carlos T1 - New methods of citizen participation based on digital technologies N2 - The current thesis presents research on new methods of citizen participation based on digital technologies. The focus of the research lies on decentralized methods of participation in which citizens take the role of co-creators. The research project first conducted a review of the literature on citizen participation, its origins and the different paradigms that have emerged over the years. The literature review also looked at the influence of technologies on participation processes and the theoretical frameworks that have emerged to understand the introduction of technologies in the context of urban development. The literature review generated the conceptual basis for the further development of the thesis. The research began with a survey of technology-enabled participation applications that examined the roles and structures emerging due to the introduction of technology. The results showed that cities use technology mostly to control and monitor urban infrastructure and are rather reluctant to give citizens the role of co-creators. Based on these findings, three case studies were developed. Digital tools for citizen participation were conceived and introduced for each case study.
The adoption and reaction of the citizens were observed using three data collection methods. The results of the case studies consistently showed that previous participation and engagement with informal citizen participation are a determining factor in the potential adoption of digital tools for decentralized engagement. Based on these results, the case studies proposed methods and frameworks that can be used for the conception and introduction of technologies for decentralized citizen participation. KW - Partizipation KW - Beteiligung KW - Technologie KW - Citizen participation KW - Digital technologies Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220906-47124 ER - TY - THES A1 - Vogler, Verena T1 - A framework for artificial coral reef design: Integrating computational modelling and high precision monitoring strategies for artificial coral reefs – an Ecosystem-aware design approach in times of climate change N2 - Tropical coral reefs, one of the world’s oldest ecosystems, which support some of the highest levels of biodiversity on the planet, are currently facing an unprecedented ecological crisis during this massive human-activity-induced period of extinction. Hence, tropical reefs symbolically stand for the destructive effects of human activities on nature [4], [5]. Artificial reefs are excellent examples of how architectural design can be combined with ecosystem regeneration [6], [7], [8]. However, working at the interface between the artificial and the complex and temporal nature of natural systems presents a challenge, i.a. with respect to the B-rep modelling legacy of computational modelling. The presented doctorate investigates strategies for applying digital practice to realise what is an essential bulwark to retain reefs in impossibly challenging times. Beyond the main question of integrating computational modelling and high precision monitoring strategies in artificial coral reef design, this doctorate explores techniques, methods, and linking frameworks to support future research and practice in ecology-led design contexts. Considering the many existing approaches to artificial coral reef design, one finds that they often fall short in precisely understanding the relationships between architectural and ecological aspects (e.g. how a surface design and material composition can foster coral larvae settlement, or structural three-dimensionality enhance biodiversity) and lack an integrated underwater (UW) monitoring process. Such a process is necessary in order to gather knowledge about the ecosystem and make it available for design, and to learn whether artificial structures contribute to reef regeneration or rather harm the coral reef ecosystem. For the research, empirical experimental methods were applied: algorithmic coral reef design, high precision UW monitoring, computational modelling and simulation, all validated through parallel real-world physical experimentation – two Artificial Reef Prototypes (ARPs) in Gili Trawangan, Indonesia (2012–today). Multiple discrete methods and sub-techniques were developed in seventeen computational experiments and applied in such a way that many are cross-validated and integrated in an overall framework that is offered as a significant contribution to the field.
Other main contributions include the Ecosystem-aware design approach, Key Performance Indicators (KPIs) for coral reef design, algorithmic design and fabrication of Biorock cathodes, new high precision UW monitoring strategies, long-term real-world constructed experiments, new digital analysis methods and two new front-end web-based tools for designing and monitoring reefs. The methodological framework is a finding of the research, with many technical components that were tested and combined in this way for the very first time. In summary, the thesis responds to the urgency and relevance of preserving marine species in tropical reefs during this massive extinction period by offering a differentiated approach towards artificial coral reefs – demonstrating the feasibility of digitally designing such ‘living architecture’ according to multiple context and performance parameters. It also provides an in-depth critical discussion of computational design and architecture in the context of ecosystem regeneration and Planetary Thinking. In that respect, the thesis functions as both theoretical and practical background for computational design, ecology and marine conservation – not only to foster the design of artificial coral reefs technically but also to provide essential criteria and techniques for conceiving them. Keywords: Artificial coral reefs, computational modelling, high precision underwater monitoring, ecology in design. N2 - Charakteristisch für das Zeitalter des Klimawandels sind die durch den Menschen verursachte Meeresverschmutzung sowie ein massiver Rückgang der Artenvielfalt in den Weltmeeren. Tropische Korallenriffe sind als eines der ältesten und artenreichsten Ökosysteme der Erde besonders stark gefährdet und stehen somit symbolisch für die zerstörerischen Auswirkungen menschlicher Aktivitäten auf die Natur [4], [5]. Um dem massiven Rückgang der Korallenriffe entgegenzuwirken, wurden von Menschen künstliche Riffsysteme entwickelt [6], [7]. Sie sind Beispiele dafür, wie Architektur und die Regenerierung von Ökosystemen miteinander verbunden werden können [8]. Eine Verknüpfung von einerseits künstlichen und andererseits komplexen, sich verändernden natürlichen Systemen, stellt jedoch eine Herausforderung dar, u.a. in Bezug auf die Computermodellierung (B-Rep Modellierung). Zum Erhalt der Korallenriffe werden in der vorliegende Doktorarbeit Strategien aus der digitalen Praxis neuartig auf das Entwerfen von künstlichen Korallenriffen angewendet. Die Hauptfrage befasst sich damit, wie der Entwurfsprozess von künstlichen Korallenriffen unter Einbeziehung von Computermodellierung und hochpräzisen Überwachungsstrategien optimiert werden kann. In diesem Zusammenhang werden Techniken, Methoden sowie ein übergeordnetes Framework erforscht, welche zukünftige Forschung und Praxis in Bezug auf Ökologie-geleitete Entwurfsprozesse fördern sollen. In Anbetracht der vielen vorhandenen künstlichen Riffsysteme kann man feststellen, dass die Zusammenhänge zwischen Architektur- und Ökosystem-Anforderungen nicht genau untersucht und dadurch bei der Umsetzung nicht entsprechend berücksichtigt werden. Zum Beispiel wie Oberflächenbeschaffenheit und Materialität eine Ansiedlung von Korallenlarven begünstigt oder wie eine räumlich vielseitige Struktur die Artenvielfalt verbessern kann. Zudem fehlt ein integrierter Unterwasser-Überwachungsprozess, welcher Informationen über das Ökosystem liefert und diese dem Entwurf bereitstellt.
Zusätzlich ist eine Unterwasser-Überwachung notwendig, um herauszufinden, ob die künstlichen Riffstrukturen zur Regenerierung beitragen oder dem Ökosystem gänzlich schaden. In dieser Forschungsarbeit werden empirische und experimentelle Methoden angewendet: Algorithmisches Entwerfen für Korallenriffe, hochpräzise Unterwasser-Überwachung, Computermodellierung und -simulation. Die Forschung wird seit 2012 bis heute durch zwei Riffprototypen (Artificial Reef Prototypes – ARPs) in Gili Trawangan, Indonesien validiert. Zusätzlich wurden weitere separate Methoden und Techniken in insgesamt siebzehn computergestützten Experimenten entwickelt und so angewendet, dass viele kreuzvalidiert und in ein Framework integriert sind, welches dann als bedeutender Beitrag dem Forschungsgebiet zur Verfügung steht. Weitere Hauptbeiträge sind der Ökosystem-bewusste Entwurfsansatz (Ecosystem-aware design approach), Key Performance Indicators (KPIs) für das Gestalten von Korallenriffen, algorithmisches Entwerfen und die Herstellung von Biorock-Kathoden, neue hochpräzise Unterwasser-Überwachungsstrategien, reale Langzeitexperimente, neue digitale Analysemethoden sowie zwei webbasierte Softwareanwendungen für die Gestaltung und die Überwachung von künstlichen Korallenriffen. Das methodische Framework ist das Hauptergebnis der Forschung, da die vielen technischen Komponenten in dieser Weise zum ersten Mal getestet und kombiniert wurden. Zusammenfassend reagiert die vorliegende Doktorarbeit sowohl auf die Dringlichkeit als auch auf die Relevanz der Erhaltung von Artenvielfalt in tropischen Korallenriffen in Zeiten eines massiven Aussterbens, indem sie einen differenzierten Entwurfsansatz für künstliche Korallenriffe offeriert. Die Arbeit zeigt auf, dass ein digitales Entwerfen einer solchen „lebendigen Architektur“ unter Berücksichtigung vielfältiger Anforderungen und Leistungsparameter machbar ist. Zusätzlich bietet sie eine ausführliche kritische Diskussion über die Rolle von computergestütztem Entwerfen und Architektur im Zusammenhang mit Regenerierung von Ökosystemen und “Planetary Thinking”. In dieser Hinsicht fungiert die Doktorarbeit als theoretischer und praktischer Hintergrund für computergestütztes Entwerfen, Ökologie und Meeresschutz. Eine Verbesserung des Entwerfens von künstlichen Korallenriffen wird nicht nur auf technischer Ebene aufgezeigt, sondern es werden auch die wesentlichen Kriterien und Techniken für deren Umsetzung benannt. Schlüsselwörter: Künstliche Korallenriffe, Computermodellierung, hochpräzise Unterwasser-Überwachung, Ökologie im Architekturentwurf. KW - Korallenriff KW - Algorithmus KW - Architektur KW - Meeresökologie KW - Software KW - Artificial coral reefs KW - Computational modelling KW - High precision underwater monitoring KW - Ecology in design KW - Künstliche Korallenriffe KW - Unterwasserarchitektur Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220322-46115 UR - https://artificialreefdesign.com/ SN - 978-3-00-074495-2 N1 - Die URL führt zu 3D-Modellen von echten Korallenriffen. ER - TY - JOUR A1 - König, Reinhard T1 - Computers in the design phase - Ten theses on their uselessness T1 - Der Computer in der Entwurfsphase - Zehn Thesen zu seiner Nutzlosigkeit JF - Der Generalist N2 - At the end of the 1960s, architects at various universities worldwide began to explore the potential of computer technology for their profession.
With the decline in prices for PCs in the 1990s and the development of various computer-aided architectural design systems (CAAD), the use of such systems in architectural and planning offices grew continuously. Because today no architectural office manages without a costly CAAD system and because intensive software training has become an integral part of a university education, the question arises about what influence the various computer systems have had on the design process forming the core of architectural practice. The text at hand develops ten theses about why there has been no success to this day in introducing computers such that new qualitative possibilities for design result. KW - computational design KW - CAD Y1 - 2008 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26075 ER - TY - RPRT A1 - König, Reinhard A1 - Tapias, Estefania A1 - Schmitt, Gerhard T1 - New Methods in Urban Analysis and Simulation: Documentation of the teaching results from the spring semester 2014 N2 - Documentation of the teaching results from the spring semester 2014 at ETH Zurich KW - Städtebau Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25154 ER - TY - RPRT A1 - König, Reinhard A1 - Tapias, Estefania A1 - König, Gerhard T1 - Digital Urban Simulation: Documentation of the teaching results from the fall semester 2014 N2 - Documentation of the teaching results from the fall semester 2014 KW - Städtebau Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25125 ER - TY - JOUR A1 - König, Reinhard A1 - Standfest, Matthias A1 - Schmitt, Gerhard ED - Thompson, Emine Mine T1 - Evolutionary multi-criteria optimization for building layout planning: Exemplary application based on the PSSA framework JF - 32nd eCAADe Conference - Volume 2 N2 - When working on urban planning projects there are usually multiple aspects to consider. Often these aspects are contradictory and it is not possible to choose one over the other; instead, they each need to be fulfilled as well as possible. Planners typically draw on past experience when subjectively prioritising which aspects to consider with which degree of importance for their planning concepts. This practice, although understandable, places power and authority in the hands of people who have varying degrees of expertise, which means that the best possible solution is not always found, because it is either not sought or the problem is regarded as being too complex for human capabilities. To improve this situation, the project presented here shows the potential of multi-criteria optimisation algorithms using the example of a new housing layout for an urban block. In addition it is shown how Self-Organizing Maps can be used to visualise multi-dimensional solution spaces in an easily analysable and comprehensible form.
KW - Architektur Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25139 UR - http://cumincad.scix.net/cgi-bin/works/Show?_id=ecaade2014_019&sort=DEFAULT&search=series%3aecaade year%3a2014&hits=132 SP - 567 EP - 574 ER - TY - JOUR A1 - König, Reinhard A1 - Knecht, Katja T1 - Comparing two evolutionary algorithm based methods for layout generation: Dense packing versus subdivision JF - Artificial Intelligence for Engineering Design, Analysis and Manufacturing N2 - We present and compare two evolutionary algorithm based methods for rectangular architectural layout generation: dense packing and subdivision algorithms. We analyze the characteristics of the two methods on the basis of three floor plan scenarios. Our analyses include the speed with which solutions are generated, the reliability with which optimal solutions can be found, and the number of different solutions that can be found overall. In a following step, we discuss the methods with respect to their different user interaction capabilities. In addition, we show that each method has the capability to generate more complex L-shaped layouts. Finally, we conclude that neither of the methods is superior but that each of them is suitable for use in distinct application scenarios because of its different properties. KW - Architektur KW - Informatik KW - Kremlas Y1 - 2014 UR - http://www.journals.cambridge.org/abstract_S0890060414000237 N1 - Paper is only available from the journal home page. SP - 285 EP - 299 ER - TY - CHAP A1 - Treyer, Lukas A1 - Klein, Bernhard A1 - König, Reinhard A1 - Meixner, Christine T1 - Lightweight urban computation interchange (LUCI) system T2 - FOSS4G 2015 Conference N2 - In this paper we introduce LUCI, a Lightweight Urban Calculation Interchange system, designed to bring the advantages of a calculation and content co-ordination system to small planning and design groups by means of an open source middle-ware. The middle-ware focuses on problems typical of urban planning and therefore features a geo-data repository as well as a job runtime administration to coordinate simulation models and their multiple views. The described system architecture is accompanied by two exemplary use cases that have been used to test and further develop our concepts and implementations. KW - Architektur KW - Informatik KW - Geographie KW - Luci KW - distributed computing KW - middleware Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160118-25042 PB - FOSS4G CY - Seoul, South Korea ER - TY - CHAP A1 - König, Reinhard A1 - Bauriedel, Christian T1 - Computer-generated Urban Structures T2 - Proceedings of the Generative Art Conference N2 - How do particular structures form in cities, and which forces play a role in this process? To which elements can these phenomena be reduced in order to find the respective combination rules? How must general principles be formulated so that urban processes can be described in a way that produces different structural qualities? With the aid of mathematical methods, models based on four basic levels are generated in the computer, through which the connections between the elements and the rules of their interaction can be examined. Conclusions about the functioning of development processes and further urban growth can be derived.
KW - Computational Urban Design Y1 - 2004 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160623-26090 SP - 1 EP - 10 CY - Milan, Italy ER - TY - JOUR A1 - Klein, Bernhard A1 - König, Reinhard T1 - Computational Urban Planning: Using the Value Lab as Control Center JF - FCL Magazine, Special Issue Simulation Platform N2 - Urban planning involves many aspects and various disciplines, demanding an asynchronous planning approach. The level of complexity rises with each aspect to be considered and makes it difficult to find universally satisfactory solutions. To improve this situation we propose a new approach which complements traditional design methods with a computational urban planning method that can fulfil formalizable design requirements automatically. Based on this approach we present a design space exploration framework for complex urban planning projects. For a better understanding of the idea of design space exploration, we introduce the concept of a digital scout which guides planners through the design space and assists them in their creative explorations. The scout can support planners during manual design by informing them about potential impacts or by suggesting different solutions that fulfill predefined quality requirements. The planner can change flexibly between a manually controlled and a completely automated design process. The developed system is presented using an exemplary urban planning scenario on two levels from the street layout to the placement of building volumes. Based on Self-Organizing Maps we implemented a method which makes it possible to visualize the multi-dimensional solution space in an easily analysable and comprehensible form. KW - Computational Urban Design KW - Stadtgestaltung Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26011 SP - 38 EP - 45 ER - TY - JOUR A1 - König, Reinhard T1 - Die Stadt der Agenten und Automaten JF - FORUM - Architektur & Bauforum N2 - PLANUNGSUNTERSTÜTZUNG DURCH DIE ANALYSE RÄUMLICHER PROZESSE MITTELS COMPUTERSIMULATIONEN. Erst wenn man – zumindest im Prinzip – versteht, wie eine Stadt mit ihren komplexen, verwobenen Vorgängen im Wesentlichen funktioniert, ist eine sinnvolle Stadtplanung möglich. Denn jede Planung bedeutet einen Eingriff in den komplexen Organismus einer Stadt. Findet dieser Eingriff ohne Wissen über die Funktionsweise des Organismus statt, können auch die Auswirkungen nicht abgeschätzt werden. Dieser Beitrag stellt dar, wie urbane Prozesse mittels Computersimulationen unter Zuhilfenahme so genannter Multi-Agenten-Systeme und Zellulärer Automaten verstanden werden können. KW - Computational Urban Design KW - CAD Y1 - 2007 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26083 ER - TY - JOUR A1 - Treyer, Lukas A1 - Klein, Bernhard A1 - König, Reinhard A1 - Meixner, Christine T1 - Lightweight Urban Computation Interchange (LUCI): A System to Couple Heterogenous Simulations and Views JF - Spatial Information Research N2 - In this paper we introduce LUCI, a Lightweight Urban Calculation Interchange system, designed to bring the advantages of a calculation and content co-ordination system to small planning and design groups by means of an open source middle-ware. The middle-ware focuses on problems typical of urban planning and therefore features a geo-data repository as well as a job runtime administration to coordinate simulation models and their multiple views.
The described system architecture is accompanied by two exemplary use cases that have been used to test and further develop our concepts and implementations. KW - Middle-ware KW - Design-simulation-loop KW - Computational urban planning KW - Distributed computing KW - Multiple comparative views Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26037 UR - http://link.springer.com/article/10.1007/s41324-016-0025-y SP - 1 EP - 12 ER - TY - CHAP A1 - König, Reinhard A1 - Schmitt, Gerhard ED - Szoboszlai, Mihály T1 - Backcasting and a new way of command in computational design : Proceedings T2 - CAADence in Architecture Conference N2 - It is not uncommon that analysis and simulation methods are used mainly to evaluate finished designs and to prove their quality, whereas their real potential is to lead or control a design process from the beginning on. Therefore, we introduce a design method that moves away from a “what-if” forecasting philosophy and increases the focus on backcasting approaches. We use the power of computation by combining sophisticated methods to generate designs with analysis methods to close the gap between analysis and synthesis of designs. For the development of a future-oriented computational design support we need to be aware of the human designer’s role. A productive combination of the excellence of human cognition with the power of modern computing technology is needed. We call this approach “cognitive design computing”. The computational part aims to mimic the way a designer’s brain works by combining state-of-the-art optimization and machine learning approaches with available simulation methods. The cognition part respects the complex nature of design problems by the provision of models for human-computation interaction. This means that a design problem is distributed between computer and designer. In the context of the conference slogan “back to command”, we ask how we may imagine the command over a cognitive design computing system. We expect that designers will need to let go of control over some parts of the design process to machines, but in exchange they will get a new powerful command over complex computing processes. This means that designers have to explore the potentials of their role as commanders of partially automated design processes. In this contribution we describe an approach for the development of a future cognitive design computing system with the focus on urban design issues. The aim of this system is to enable an urban planner to treat a planning problem as a backcasting problem by defining what performance a design solution should achieve and to automatically query or generate a set of best possible solutions. This kind of computational planning process offers proof that the design meets the original, explicitly defined design requirements. A key way in which digital tools can support designers is by generating design proposals. Evolutionary multi-criteria optimization methods allow us to explore a multi-dimensional design space and provide a basis for the designer to evaluate contradicting requirements: a task urban planners are faced with frequently. We also reflect on why designers will give more and more control to machines. Therefore, we investigate first approaches that employ machine learning methods to learn how designers use computational design support systems in combination with manual design strategies to deal with urban design problems.
By observing how designers work, it is possible to derive more complex artificial solution strategies that can help computers make better suggestions in the future. KW - Cognitive design computing KW - machine learning KW - backcasting KW - design synthesis KW - evolutionary optimization KW - CAD Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-25996 SP - 15 EP - 25 CY - Budapest ER - TY - JOUR A1 - König, Reinhard A1 - Bauriedel, Christian T1 - Generating settlement structures: a method for urban planning and analysis supported by cellular automata JF - Environment and Planning B: Planning and Design N2 - Previous models for the explanation of settlement processes pay little attention to the interactions between settlement spreading and road networks. On the basis of a dielectric breakdown model in combination with cellular automata, we present a method to steer precisely the generation of settlement structures with regard to their global and local density as well as the size and number of forming clusters. The resulting structures depend on the logic of how the dependence between the settlements and the road network is implemented in the simulation model. After analysing the state of the art we begin with a discussion of the mutual dependence of roads and land development. Next, we elaborate a model that permits the precise control of permeability in the developing structure as well as the settlement density, using the fewest necessary control parameters. On the basis of different characteristic values, possible settlement structures are analysed and compared with each other. Finally, we reflect on the theoretical contribution of the model with regard to the context of urban dynamics. KW - Cellular automata KW - Computational urban design Y1 - 2009 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160624-26054 UR - http://www.envplan.com.proxy.lib.sfu.ca/abstract.cgi?id=b34025 SP - 602 EP - 624 ER - TY - CHAP A1 - König, Reinhard A1 - Varoudis, Tasos T1 - Spatial Optimizations: Merging depthmapX, spatial graph networks and evolutionary design in Grasshopper T2 - Proceedings of ecaade 34: Complexity & Simplicity N2 - In the Space Syntax community, the standard tool for computing all kinds of spatial graph network measures is depthmapX (Turner, 2004; Varoudis, 2012). The process of evaluating many design variants of networks is relatively complicated, since they need to be drawn in a separate CAD system, then exported and imported into depthmapX via the dxf file format. This procedure prevents a continuous integration into a design process. Furthermore, the standalone character of depthmapX makes it impossible to use its network centrality calculation for optimization processes. To overcome these limitations, we present in this paper the first steps of experimenting with a Grasshopper component (reference omitted until final version) that can access the functions of depthmapX and integrate them into Grasshopper/Rhino3D.
Here the component is implemented in such a way that it can be used directly by an evolutionary algorithm (EA) implemented in a Python scripting component in Grasshopper. KW - depthmapx KW - python KW - optimization KW - space syntax KW - grasshopper Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26040 SP - 1 EP - 6 CY - Oulu, Finland ER - TY - JOUR A1 - Bimber, Oliver A1 - Iwai, Daisuke T1 - Superimposing Dynamic Range JF - Eurographics 2009 N2 - Replacing a uniform illumination with a high-frequency illumination enhances the contrast of observed and captured images. We modulate spatially and temporally multiplexed (projected) light with reflective or transmissive matter to achieve high dynamic range visualizations of radiological images on printed paper or ePaper, and to boost the optical contrast of images viewed or imaged with light microscopes. KW - CGI KW - Computer graphics KW - Image processing KW - Computer vision KW - 54.73 Y1 - 2009 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120130-15325 ER - TY - THES A1 - Walsdorf, Joern T1 - M-Learning: Lernen im mobilen Kontext an Hochschulen N2 - A fundamental characteristic of human beings is the desire to start learning at the moment of birth. The rather formal learning process that learners have to deal with in school, in vocational training, or at university is currently subject to fundamental changes. The increasing technologization, the omnipresence of mobile devices, the ubiquitous access to digital information, and students being early adopters of all these technological innovations require reactions on the part of the educational system. This study examines such a reaction: the use of mobile learning in higher education. Examining the subject of m-learning first requires an investigation of the educational model of e-learning. Many universities have already established e-learning as one of their educational segments, providing a wide range of methods to support this kind of teaching. This study includes an empirical acceptance analysis regarding the general learning behavior of students and their approval of e-learning methods. A survey on the approval of m-learning supplements the results. Mobile learning is characterized by the mobility of both the communication devices and the users. Both factors lead to new correlations, demonstrate the potential of today's mobile devices, and show the potential to increase learning performance. The dissertation addresses these correlations and the use of mobile devices in the context of m-learning. M-learning and the usage of mobile devices not only require a reflection from a technological point of view. In addition to the technical features of such mobile devices, the usability of their applications plays an important role, especially with regard to the limited display size. For the purpose of evaluating mobile apps and browser-based applications, various analytical methods are suitable. The concluding heuristic evaluation points out the weaknesses of an established m-learning application, reveals the need for improvement, and shows an approach to rectify the shortcoming. KW - Mobile Learning KW - M-Learning KW - E-Learning KW - Electronic Learning KW - Learning on demand KW - Mobiles Lernen Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20140304-21361 ER - TY - THES A1 - Kulik, Alexander T1 - User Interfaces for Cooperation N2 - This thesis suggests cooperation as a design paradigm for human-computer interaction.
The basic idea is that the synergistic co-operation of interfaces through concurrent user activities enables increased interaction fluency and expressiveness. This applies to bimanual interaction and multi-finger input, e.g., touch typing, as well as to the collaboration of multiple users. Cooperative user interfaces offer more interaction flexibility and expressivity for single and multiple users. Part I of this thesis analyzes the state of the art in user interface design. It explores limitations of common approaches and reveals the crucial role of cooperative action in several established user interfaces and research prototypes. A review of related research in psychology and human-computer interaction offers insights into the cognitive, behavioral, and ergonomic foundations of cooperative user interfaces. Moreover, this thesis suggests a broad applicability of generic cooperation patterns and contributes three high-level design principles. Part II presents three experiments towards cooperative user interfaces in detail. A study on desktop-based 3D input devices explores fundamental benefits of cooperative bimanual input and the impact of interface design on bimanual cooperative behavior. A novel interaction technique for multitouch devices is presented that follows the paradigm of cooperative user interfaces and demonstrates advantages over the status quo. Finally, this thesis introduces a fundamentally new display technology that provides up to six users with their individual perspectives of a shared 3D environment. The system creates new possibilities for the cooperative interaction of multiple users. Part III of this thesis builds on the research results described in Part II, in particular, the multi-user 3D display system. A series of case studies in the field of collaborative virtual reality provides exemplary evidence for the relevance and applicability of the suggested design principles. N2 - Die vorliegende Arbeit betrachtet Kooperation als Gestaltungsparadigma für Mensch-Maschine Schnittstellen. Dabei geht es um Kooperation im Sinne paralleler Aktivitäten und deren synergetischer Kombination mit dem Ziel einer flüssigen und effektiven Computerarbeit. Dieses Interaktionsmuster ist für zweihändige Eingaben und die Nutzung mehrerer Finger, z.B. beim Maschinenschreiben, genauso anwendbar wie für die Zusammenarbeit mehrerer Nutzer. Kooperative Benutzungsschnittstellen bieten Einzelpersonen sowie Gruppen von Nutzern mehr Flexibilität und Ausdrucksmöglichkeiten. Teil I dieser Arbeit betrachtet den Stand von Forschung und Technik zu diesem Thema. Dabei werden sowohl Limitierungen etablierter Benutzungsschnittstellen als auch das Potential und die Bedeutung kooperativer Interaktion untersucht. Auf Grundlage von Forschungsergebnissen aus der Psychologie, den Bewegungswissenschaften und der Forschung zu Mensch-Maschine Schnittstellen werden kognitive und ergonomische Grundlagen kooperativer Benutzungsschnittstellen abgeleitet. Darüber hinaus werden generische Kooperationsmuster diskutiert und die Anforderungen an kooperative Benutzungsschnittstellen in drei Gestaltungsprinzipien zusammengefasst. Teil II dieser Arbeit stellt drei Forschungsarbeiten zur Entwicklung und Untersuchung kooperativer Benutzungsschnittstellen vor. In Kapitel 8 wird zweihändige Kooperation am Beispiel tischbasierter 3D Eingabegeräte untersucht.
Kapitel 9 stellt eine neue Multitouch-Interaktionstechnik vor, die dem Paradigma kooperativer Benutzungsschnittstellen folgt und klare Vorteile gegenüber einer etablierten Technik aufweist. Kapitel 10 präsentiert die Entwicklung und Untersuchung einer neuen 3D-Projektionstechnologie, die bis zu sechs Personen individuelle Perspektiven auf eine gemeinsame virtuelle Umgebung bietet. Daraus ergeben sich völlig neue Möglichkeiten für die kooperative Interaktion mehrerer Nutzer mit dreidimensionalen Daten. Teil III dieser Arbeit baut auf den Ergebnissen der in Teil II beschriebenen Experimente auf. Fallstudien aus dem Bereich der virtuellen Realität für mehrere Nutzer belegen die Relevanz und Anwendbarkeit der vorgeschlagenen Gestaltungsprinzipien. KW - Human-Computer Interaction (HCI) KW - Mensch-Maschine-Interaktion (MMI) KW - Kollaborative Arbeit KW - User Interfaces and Interaction Techniques KW - Computer Supported Collaborative Work KW - Bimanual Interaction KW - 3D User Interfaces and Interaction Techniques KW - Display Technology, Collaboration, Virtual Reality Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20161202-27207 ER - TY - THES A1 - Moehring, Mathias T1 - Realistic Interaction with Virtual Objects within Arm's Reach N2 - The automotive industry requires realistic virtual reality applications more than other domains to increase the efficiency of product development. Currently, the visual quality of virtual environments resembles reality, but interaction within these environments is usually far from what is known in everyday life. Several realistic research approaches exist; however, they are still not all-encompassing enough to be usable in industrial processes. This thesis realizes lifelike direct multi-hand and multi-finger interaction with arbitrary objects, and proposes algorithmic and technical improvements that also approach lifelike usability. In addition, the thesis proposes methods to measure the effectiveness and usability of such interaction techniques and discusses different types of grasping feedback that support the user during interaction. Realistic and reliable interaction is achieved through the combination of robust grasping heuristics and plausible pseudophysical object reactions. The easy-to-compute grasping rules use the objects’ surface normals and mimic human grasping behavior. The novel concept of Normal Proxies increases grasping stability and diminishes challenges induced by adverse normals. The intricate act of picking up thin and tiny objects remains challenging for some users. These cases are further supported by the consideration of finger pinches, which are measured with a specialized finger tracking device. With regard to typical object constraints, realistic object motion is geometrically calculated as a plausible reaction to user input. The resulting direct finger-based interaction technique enables realistic and intuitive manipulation of arbitrary objects. The thesis proposes two methods that assess and compare effectiveness and usability. An expert review indicates that experienced users quickly familiarize themselves with the technique. A quantitative and qualitative user study shows that direct finger-based interaction is preferred over indirect interaction in the context of functional car assessments. While controller-based interaction is more robust, the direct finger-based interaction provides greater realism, and becomes nearly as reliable when the pinch-sensitive mechanism is used.
At present, the haptic channel is not used in industrial virtual reality applications; it is therefore available for grasping feedback, which improves the users’ understanding of the grasping situation. This thesis realizes a novel pressure-based tactile feedback at the fingertips. As an alternative, vibro-tactile feedback at the same location is realized, as well as visual feedback by the coloring of grasp-involved finger segments. The feedback approaches are also compared within the user study, which reveals that grasping feedback is a requirement to judge grasp status and that tactile feedback improves interaction independently of the display system used. The considerably stronger vibrational tactile feedback can quickly become annoying during interaction. The interaction improvements and hardware enhancements make it possible to interact with virtual objects in a realistic and reliable manner. By addressing realism and reliability, this thesis paves the way for the virtual evaluation of human-object interaction, which is necessary for a broader application of virtual environments in the automotive industry and other domains. N2 - Stärker als andere Branchen benötigt die Automobilindustrie realistische Virtual Reality Anwendungen für eine effiziente Produktentwicklung. Während sich die visuelle Qualität virtueller Darstellungen bereits der Realität angenähert hat, ist die Interaktion mit virtuellen Umgebungen noch weit vom täglichen Erleben der Menschen entfernt. Einige Forschungsansätze haben sich mit realistischer Interaktion befasst, gehen aber nicht weit genug, um in industriellen Prozessen eingesetzt zu werden. Diese Arbeit realisiert eine lebensnahe mehrhändige und fingerbasierte Interaktion mit beliebigen Objekten. Dabei ermöglichen algorithmische und technische Verbesserungen eine realitätsnahe Usability. Außerdem werden Methoden für die Evaluation dieser Interaktionstechnik vorgestellt und benutzerunterstützende Greiffeedbackarten diskutiert. Die verlässliche und gleichzeitig realistische Interaktion wird durch die Kombination von robusten Greifheuristiken und pseudophysikalischen Objektreaktionen erreicht. Die das menschliche Greifverhalten nachbildenden Greifregeln basieren auf den Oberflächennormalen der Objekte. Die Reduktion negativer Einflüsse verfälschter Normalen und eine höhere Griffstabilität werden durch das neuartige Konzept der Normal Proxies erreicht. Dennoch bleibt für manche Nutzer das Aufnehmen von dünnen und kleinen Objekten problematisch. Diese Fälle werden zusätzlich durch die Einbeziehung von Fingerberührungen unterstützt, die mit einem speziellen Fingertracking-Gerät erfasst werden. Plausible Objektreaktionen auf Benutzereingaben werden unter Berücksichtigung typischer Objekteinschränkungen geometrisch berechnet. Die Arbeit schlägt zwei Methoden zur Evaluierung der fingerbasierten Interaktion vor. Ein Expertenreview zeigt, dass sich erfahrene Benutzer sehr schnell in die Technik einfinden. In einer Benutzerstudie wird nachgewiesen, dass fingerbasierte Interaktion im hier untersuchten Kontext gegenüber indirekter Interaktion mit einem Eingabegerät bevorzugt wird. Während letztere robuster zu handhaben ist, stellt die fingerbasierte Interaktion einen deutlich höheren Realismus bereit und erreicht mit den vorgeschlagenen Verbesserungen eine vergleichbare Verlässlichkeit. Um Greifsituationen transparent zu gestalten, realisiert diese Arbeit ein neuartiges druckbasiertes taktiles Feedback an den Fingerspitzen.
Alternativ wird ein vibrotaktiles Feedback am gleichen Ort realisiert und visuelles Feedback durch die Einfärbung der griffbeteiligten Fingersegmente umgesetzt. Die verschiedenen Feedbackansätze werden in der Benutzerstudie verglichen. Dabei wird Greiffeedback als Voraussetzung identifiziert, um den Greifzustand zu beurteilen. Taktiles Feedback verbessert dabei die Interaktion unabhängig vom eingesetzten Display. Das merklich stärkere Vibrationsfeedback kann während der Interaktion störend wirken. Die vorgestellten Interaktionsverbesserungen und Hardwareerweiterungen ermöglichen es, mit virtuellen Objekten auf realistische und zuverlässige Art zu interagieren. Indem die Arbeit Realismus und Verlässlichkeit gleichzeitig adressiert, bereitet sie den Boden für die virtuelle Untersuchung von Mensch-Objekt Interaktionen und ermöglicht so einen breiteren Einsatz virtueller Techniken in der Automobilindustrie und in anderen Bereichen. KW - Virtuelle Realität KW - Interaktion KW - Mensch-Maschine-Interaktion KW - Medieninformatik Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20130301-18592 ER - TY - THES A1 - Lux, Christopher T1 - A Data-Virtualization System for Large Model Visualization N2 - Interactive scientific visualizations are widely used for the visual exploration and examination of physical data resulting from measurements or simulations. Driven by technical advancements of data acquisition and simulation technologies, especially in the geo-scientific domain, large amounts of highly detailed subsurface data are generated. The oil and gas industry is particularly pushing such developments as hydrocarbon reservoirs are increasingly difficult to discover and exploit. Suitable visualization techniques are vital for the discovery of the reservoirs as well as their development and production. However, the ever-growing scale and complexity of geo-scientific data sets result in an expanding disparity between the size of the data and the capabilities of current computer systems with regard to limited memory and computing resources. In this thesis we present a unified out-of-core data-virtualization system supporting geo-scientific data sets consisting of multiple large seismic volumes and height-field surfaces, wherein each data set may exceed the size of the graphics memory or possibly even the main memory. Current data sets fall within the range of hundreds of gigabytes up to terabytes in size. Through the mutual utilization of memory and bandwidth resources by multiple data sets, our data-management system is able to share and balance limited system resources among different data sets. We employ multi-resolution methods based on hierarchical octree and quadtree data structures to generate level-of-detail working sets of the data stored in main memory and graphics memory for rendering. The working set generation in our system is based on a common feedback mechanism with inherent support for translucent geometric and volumetric data sets. This feedback mechanism collects information about required levels of detail during the rendering process and is capable of directly resolving data visibility without the application of any costly occlusion culling approaches. A central goal of the proposed out-of-core data management system is an effective virtualization of large data sets. 
Through an abstraction of the level-of-detail working sets, our system allows developers to work with extremely large data sets independently of their complex internal data representations and physical memory layouts. Based on this out-of-core data virtualization infrastructure, we present distinct rendering approaches for specific visualization problems of large geo-scientific data sets. We demonstrate the application of our data virtualization system and show how multi-resolution data can be treated exactly the same way as regular data sets during the rendering process. An efficient volume ray casting system is presented for the rendering of multiple arbitrarily overlapping multi-resolution volume data sets. Binary space-partitioning volume decomposition of the bounding boxes of the cube-shaped volumes is used to identify the overlapping and non-overlapping volume regions in order to optimize the rendering process. We further propose a ray casting-based rendering system for the visualization of geological subsurface models consisting of multiple very detailed height fields. The rendering of an entire stack of height-field surfaces is accomplished in a single rendering pass using a two-level acceleration structure, which combines a minimum-maximum quadtree for empty-space skipping and sorted lists of depth intervals to restrict ray intersection searches to relevant height fields and depth ranges. Ultimately, we present a unified rendering system for the visualization of entire geological models consisting of highly detailed stacked horizon surfaces and massive volume data. We demonstrate a single-pass ray casting approach facilitating correct visual interaction between distinct translucent model components, while increasing the rendering efficiency by reducing processing overhead of potentially invisible parts of the model. The combination of image-order rendering approaches and the level-of-detail feedback mechanism used by our out-of-core data-management system inherently accounts for occlusions of different data types without the application of costly culling techniques. The unified out-of-core data-management and virtualization infrastructure considerably facilitates the implementation of complex visualization systems. We demonstrate its applicability for the visualization of large geo-scientific data sets using output-sensitive rendering techniques. As a result, the magnitude and multitude of data sets that can be interactively visualized is significantly increased compared to existing approaches. KW - Computer Graphics KW - Visualisation KW - Volume Rendering KW - Large Data Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20130725-19855 ER - TY - THES A1 - Fleischmann, Ewan T1 - Analysis and Design of Blockcipher Based Cryptographic Algorithms N2 - This thesis focuses on the analysis and design of hash functions and authenticated encryption schemes that are blockcipher based. We give an introduction into these fields of research – taking a blockcipher based point of view – with special emphasis on the topics of double length, double call blockcipher based compression functions. The first main topic (thesis parts I - III) is on the analysis and design of hash functions. We start with a collision security analysis of some well known double length blockcipher based compression functions and hash functions: Abreast-DM, Tandem-DM and MDC-4. We also propose new double length compression functions that have elevated collision security guarantees.
We complement the collision analysis with a preimage analysis by stating (near) optimal security results for Abreast-DM, Tandem-DM, and Hirose-DM. Also, some generalizations are discussed. These are the first preimage security results for blockcipher based double length hash functions that go beyond the birthday barrier. We then raise the abstraction level and analyze the notion of ’hash function indifferentiability from a random oracle’. So we no longer focus on how to obtain a good compression function but, instead, on how to obtain a good hash function using (other) cryptographic primitives. In particular we give some examples where this strong notion of hash function security might give questionable advice for building a practical hash function. In the second main topic (thesis part IV), which is on authenticated encryption schemes, we present an on-line authenticated encryption scheme, McOEx, that simultaneously achieves privacy and integrity and is secure against nonce-misuse. It is the first dedicated scheme that achieves high standards of security and – at the same time – is on-line computable. N2 - Die Schwerpunkte dieser Dissertation sind die Analyse und das Design von blockchiffrenbasierten Hashfunktionen (Abschnitte I-III) sowie die Entwicklung von robusten Verfahren zur authentifizierten Verschlüsselung (Abschnitt IV). Die Arbeit beginnt mit einer Einführung in diese Themengebiete, wobei – insbesondere bei den Hashfunktionen – eine blockchiffrenzentrierte Perspektive eingenommen wird. Die Abschnitte I-III dieser Dissertation beschäftigen sich mit der Analyse und dem Design von Hashfunktionen. Zu Beginn wird die Kollisionssicherheit einiger wohlbekannter Kompressions- und Hashfunktionen mit zweifacher Blockchiffrenausgabelänge näher analysiert: Abreast-DM, Tandem-DM und MDC-4. Ebenso werden neue Designs vorgestellt, welche erhöhte Kollisionssicherheitsgarantien haben. Ergänzend zur Kollisionssicherheitsanalyse wird die Resistenz gegen Urbildangriffe von Kompressionsfunktionen doppelter Ausgabelänge untersucht. Dabei werden nahezu optimale Sicherheitsschranken für Abreast-DM, Tandem-DM und Hirose-DM abgeleitet. Einige Verallgemeinerungen sind ebenfalls Teil der Diskussion. Das sind die ersten Sicherheitsresultate gegen Urbildangriffe auf blockchiffrenbasierte Kompressionsfunktionen doppelter Länge, die weit über die bis dahin bekannten Sicherheitsresultate hinausgehen. Daran anschließend folgt eine Betrachtung, die auf einem erhöhten Abstraktionslevel durchgeführt wird und den Begriff der Undifferenzierbarkeit einer Hashfunktion von einem Zufallsorakel diskutiert. Hierbei liegt der Fokus nicht darauf, wie man eine gute Kompressionsfunktion auf Basis anderer kryptographischer Funktionen erstellt, sondern auf dem Design einer Hashfunktion auf Basis einer Kompressionsfunktion. Unter Einnahme eines eher praktischen Standpunktes wird anhand einiger Beispiele aufgezeigt, dass die relativ starke Eigenschaft der Undifferenzierbarkeit einer Hashfunktion zu widersprüchlichen Designempfehlungen für praktikable Hashfunktionen führen kann. Im zweiten Schwerpunkt, in Abschnitt IV, werden Verfahren zur authentifizierten Verschlüsselung behandelt. Es wird ein neues Schema zur authentifizierten Verschlüsselung vorgestellt, McOEx. Es schützt gleichzeitig die Integrität und die Vertraulichkeit einer Nachricht. McOEx ist das erste konkrete Schema, das sowohl robust gegen die Wiederverwendung von Nonces ist als auch on-line berechnet werden kann.
KW - Kryptologie KW - Kryptographie KW - Authentifizierte Verschlüsselung KW - Beweisbare Sicherheit KW - Blockchiffrenbasierte Hashfunktion KW - Blockcipher Based Hashing Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20130722-19835 ER - TY - THES A1 - Gollub, Tim T1 - Information Retrieval for the Digital Humanities N2 - In ten chapters, this thesis presents information retrieval technology which is tailored to the research activities that arise in the context of corpus-based digital humanities projects. The presentation is structured by a conceptual research process that is introduced in Chapter 1. The process distinguishes a set of five research activities: research question generation, corpus acquisition, research question modeling, corpus annotation, and result dissemination. Each of these research activities elicits different information retrieval tasks with special challenges, for which algorithmic approaches are presented after an introduction of the core information retrieval concepts in Chapter 2. A vital concept in many of the presented approaches is the keyquery paradigm introduced in Chapter 3, which represents an operation that returns relevant search queries in response to a given set of input documents. Keyqueries are proposed in Chapter 4 for the recommendation of related work, and in Chapter 5 for improving access to aspects hidden in the long tail of search result lists. With pseudo-descriptions, a document expansion approach is presented in Chapter 6. The approach improves the retrieval performance for corpora where only bibliographic meta-data is originally available. In Chapter 7, the keyquery paradigm is employed to generate dynamic taxonomies for corpora in an unsupervised fashion. Chapter 8 turns to the exploration of annotated corpora, and presents scoped facets as a conceptual extension to faceted search systems, which is particularly useful in exploratory search settings. For the purpose of highlighting the major topical differences in a sequence of sub-corpora, an algorithm called topical sequence profiling is presented in Chapter 9. The thesis concludes with two pilot studies regarding the visualization of (re)search results as a means of successful result dissemination: a metaphoric interpretation of the information nutrition label, as well as the philosophical bodies, which are 3D-printed search results. N2 - In zehn Kapiteln stellt diese Arbeit Information-Retrieval-Technologien vor, die auf die Forschungsaktivitäten korpusbasierter Digital-Humanities-Projekte zugeschnitten sind. Die Arbeit strukturiert sich anhand eines konzeptionellen Forschungsprozesses, der in Kapitel 1 vorgestellt wird. Der Prozess gliedert sich in fünf Forschungsaktivitäten: Die Generierung einer Forschungsfrage, die Korpusakquise, die Modellierung der Forschungsfrage, die Annotation des Korpus sowie die Verbreitung der Ergebnisse. Jede dieser Forschungsaktivitäten bringt unterschiedliche Information-Retrieval-Aufgaben mit besonderen Herausforderungen mit sich, für die, nach einer Einführung in die zentralen Information-Retrieval-Konzepte in Kapitel 2, algorithmische Ansätze vorgestellt werden. Ein wesentliches Konzept der vorgestellten Ansätze ist das in Kapitel 3 eingeführte Keyquery-Paradigma. Hinter dem Paradigma steht eine Suchoperation, die als Antwort auf eine gegebene Menge von Eingabedokumenten relevante Suchanfragen zurückgibt.
Keyqueries werden in Kapitel 4 für die Empfehlung verwandter Arbeiten und in Kapitel 5 für die Verbesserung des Zugangs zu Aspekten im Long Tail von Suchergebnislisten vorgeschlagen. Mit Pseudo-Beschreibungen wird in Kapitel 6 ein Ansatz zur Document-Expansion vorgestellt. Der Ansatz verbessert die Suchleistung für Korpora, bei denen ursprünglich nur bibliografische Metadaten vorhanden sind. In Kapitel 7 wird das Keyquery-Paradigma eingesetzt, um auf unüberwachte Weise dynamische Taxonomien für Korpora zu generieren. Kapitel 8 wendet sich der Exploration von annotierten Korpora zu und stellt Scoped Facets als konzeptionelle Erweiterung von facettierten Suchsystemen vor, die besonders in explorativen Suchszenarien nützlich ist. Um die wichtigsten thematischen Unterschiede und Entwicklungen in einer Sequenz von Sub-Korpora hervorzuheben, wird in Kapitel 9 ein Algorithmus zum Topical Sequence Profiling vorgestellt. Die Arbeit schließt mit zwei Pilotstudien zur Visualisierung von Such- bzw. Forschungsergebnissen als Mittel für eine erfolgreiche Ergebnisverbreitung: eine metaphorische Interpretation des Information-Nutrition-Labels sowie die philosophischen Körper, 3D-gedruckte Suchergebnisse. KW - Information Retrieval KW - Explorative Suche KW - Digital Humanities KW - keyqueries Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220801-46738 ER - TY - THES A1 - Anderka, Maik T1 - Analyzing and Predicting Quality Flaws in User-generated Content: The Case of Wikipedia N2 - Web applications that are based on user-generated content are often criticized for containing low-quality information; a popular example is the online encyclopedia Wikipedia. The major points of criticism pertain to the accuracy, neutrality, and reliability of information. The identification of low-quality information is an important task since for a huge number of people around the world it has become a habit to first visit Wikipedia in case of an information need. Existing research on quality assessment in Wikipedia either investigates only small samples of articles, or else deals with the classification of content into high-quality or low-quality. This thesis goes further: it targets the investigation of quality flaws, thus providing specific indications of the respects in which low-quality content needs improvement. The original contributions of this thesis, which relate to the fields of user-generated content analysis, data mining, and machine learning, can be summarized as follows: (1) We propose the investigation of quality flaws in Wikipedia based on user-defined cleanup tags. Cleanup tags are commonly used in the Wikipedia community to tag content that has some shortcomings. Our approach is based on the hypothesis that each cleanup tag defines a particular quality flaw. (2) We provide the first comprehensive breakdown of Wikipedia's quality flaw structure. We present a flaw organization schema, and we conduct an extensive exploratory data analysis which reveals (a) the flaws that actually exist, (b) the distribution of flaws in Wikipedia, and (c) the extent of flawed content. (3) We present the first breakdown of Wikipedia's quality flaw evolution. We consider the entire history of the English Wikipedia from 2001 to 2012, which comprises more than 508 million page revisions, summing up to 7.9 TB. Our analysis reveals (a) how the incidence and the extent of flaws have evolved, and (b) how the handling and the perception of flaws have changed over time.
(4) We are the first to operationalize an algorithmic prediction of quality flaws in Wikipedia. We cast quality flaw prediction as a one-class classification problem, develop a tailored quality flaw model, and employ a dedicated one-class machine learning approach. A comprehensive evaluation based on human-labeled Wikipedia articles underlines the practical applicability of our approach. KW - Data Mining KW - Machine Learning KW - Wikipedia KW - User-generated Content Analysis KW - Information Quality Assessment KW - Quality Flaw Prediction Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20130709-19778 ER - TY - THES A1 - Forler, Christian T1 - Analysis Design & Applications of Cryptographic Building Blocks N2 - This thesis deals with the basic design and rigorous analysis of cryptographic schemes and primitives, especially of authenticated encryption schemes, hash functions, and password-hashing schemes. In the last decade, security issues such as the PS3 jailbreak demonstrate that common security notions are rather restrictive, and it seems that they do not model the real world adequately. As a result, in the first part of this work, we introduce a less restrictive security model that is closer to reality. In this model it turned out that existing (on-line) authenticated encryption schemes can no longer be considered secure, i.e. they can guarantee neither data privacy nor data integrity. Therefore, we present two novel authenticated encryption schemes, namely COFFE and McOE, which are not only secure in the standard model but also reasonably secure in our generalized security model, i.e. both preserve full data integrity. In addition, McOE preserves a reasonable level of data privacy. The second part of this thesis starts with proposing the hash function Twister-Pi, a revised version of the accepted SHA-3 candidate Twister. We not only fixed all known security issues of Twister, but also increased the overall soundness of our hash-function design. Furthermore, we present some fundamental groundwork in the area of password-hashing schemes. This research was mainly inspired by the media omnipresence of password-leakage incidents. We show that the password-hashing scheme scrypt is vulnerable to cache-timing attacks due to the existence of a password-dependent memory-access pattern. Finally, we introduce Catena, the first password-hashing scheme that is both memory-consuming and resistant against cache-timing attacks. KW - Kryptologie KW - symmetric crypto KW - provable security KW - authenticated encryption KW - hash functions KW - password scrambler Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20150330-23764 PB - Shaker Verlag ER - TY - THES A1 - Schneider, Sven T1 - Sichtbarkeitsbasierte Raumerzeugung - Automatisierte Erzeugung räumlicher Konfigurationen in Architektur und Städtebau auf Basis sichtbarkeitsbasierter Raumrepräsentationen N2 - Das Erzeugen räumlicher Konfigurationen ist eine zentrale Aufgabe im architektonischen bzw. städtebaulichen Entwurfsprozess und hat zum Ziel, eine für Menschen angenehme Umwelt zu schaffen. Der Geometrie der entstehenden Räume kommt hierbei eine zentrale Rolle zu, da sie einen großen Einfluss auf das Empfinden und Verhalten der Menschen ausübt und nur noch mit großem Aufwand verändert werden kann, wenn sie einmal gebaut wurde. Die meisten Entscheidungen zur Festlegung der Geometrie von Räumen werden während eines sehr kurzen Zeitraums (Entwurfsphase) getroffen.
Fehlentscheidungen, die in dieser Phase getroffen werden, haben langfristige Auswirkungen auf das Leben von Menschen und damit auch Konsequenzen für ökonomische und ökologische Aspekte. Mittels computerbasierter Layoutsysteme lässt sich der Entwurf räumlicher Konfigurationen sinnvoll unterstützen, da sie es ermöglichen, in kürzester Zeit eine große Anzahl an Varianten zu erzeugen und zu überprüfen. Daraus ergeben sich zwei Vorteile. Erstens kann die große Menge an Varianten dazu beitragen, bessere Lösungen zu finden. Zweitens kann das Formalisieren von Bewertungskriterien zu einer größeren Objektivität und Transparenz bei der Lösungsfindung führen. Um den Entwurf räumlicher Konfigurationen optimal zu unterstützen, muss ein Layoutsystem in der Lage sein, ein möglichst großes Spektrum an Grundrissvarianten zu erzeugen (Vielfalt), zahlreiche Möglichkeiten und Detaillierungsstufen zur Problembeschreibung zu bieten (Flexibilität) sowie Mittel anzubieten, mit denen sich die Anforderungen an die räumliche Konfiguration adäquat beschreiben lassen (Relevanz). Bezüglich Letzterem spielen wahrnehmungs- und nutzungsbezogene Kriterien (wie z. B. Grad an Privatheit, Gefühl von Sicherheit, Raumwirkung, Orientierbarkeit, Potenzial zu sozialer Interaktion) eine wichtige Rolle. Die bislang entwickelten Layoutsysteme weisen hinsichtlich Vielfalt, Flexibilität und Relevanz wesentliche Beschränkungen auf, welche auf eine ungeeignete Methode zur Repräsentation von Räumen zurückzuführen sind. Die in einem Layoutsystem verwendeten Raumrepräsentationsmethoden bestimmen die Möglichkeiten zur Formerzeugung und Problembeschreibung wesentlich. Sichtbarkeitsbasierte Raumrepräsentationen (Sichtfelder, Sichtachsen, Konvexe Räume) eignen sich in besonderer Weise zur Abbildung von Räumen in Layoutsystemen, da sie einerseits ein umfangreiches Repertoire zur Verfügung stellen, um räumliche Konfigurationen hinsichtlich wahrnehmungs- und nutzungsbezogener Kriterien zu beschreiben. Andererseits lassen sie sich vollständig aus der Geometrie der begrenzenden Oberflächen ableiten und sind nicht an bestimmte zur Formerzeugung verwendete geometrische Objekte gebunden. In der vorliegenden Arbeit wird ein Layoutsystem entwickelt, welches auf diesen Raumrepräsentationen basiert. Es wird ein Evaluationsmechanismus (EM) entwickelt, welcher es ermöglicht, beliebige zweidimensionale räumliche Konfigurationen hinsichtlich wahrnehmungs- und nutzungsrelevanter Kriterien zu bewerten. Hierzu wurde eine Methodik entwickelt, die es ermöglicht, automatisch Raumbereiche (O-Spaces und P-Spaces) zu identifizieren, welche bestimmte Eigenschaften haben (z.B. sichtbare Fläche, Kompaktheit des Sichtfeldes, Tageslicht) und bestimmte Relationen zueinander (wie gegenseitige Sichtbarkeit, visuelle und physische Distanz) aufweisen. Der EM wurde mit Generierungsmechanismen (GM) gekoppelt, um zu prüfen, ob er sich eignet, in großen Variantenräumen nach geeigneten räumlichen Konfigurationen zu suchen. Die Ergebnisse dieser Experimente zeigen, dass die entwickelte Methodik einen vielversprechenden Ansatz zur automatisierten Erzeugung von räumlichen Konfigurationen darstellt: Erstens ist der EM vollständig vom GM getrennt, wodurch es möglich ist, verschiedene GM in einem Entwurfssystem zu verwenden und somit den Variantenraum zu vergrößern (Vielfalt). Zweitens erlaubt der EM, die Anforderungen an eine räumliche Konfiguration flexibel zu beschreiben (unterschiedliche Maßstäbe, unterschiedlicher Detaillierungsgrad).
Letztlich erlauben die verwendeten Repräsentationsmethoden, eine Problembeschreibung vorzunehmen, die stark an der Wirkung des Raumes auf den Menschen orientiert ist (Relevanz). Die in der Arbeit entwickelte Methodik leistet einen wichtigen Beitrag zur Verbesserung evidenzbasierter Entwurfsprozesse, da sie eine Brücke zwischen der nutzerorientierten Bewertung von räumlichen Konfigurationen und deren Erzeugung schlägt. T3 - bauhaus.ifex research series - 3 KW - Architektur KW - Raum KW - CAAD KW - Sichtbarkeit KW - Entwurfsmethodik KW - Räumliche Konfiguration KW - Isovist KW - Entwurfssystem KW - Space Syntax Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160613-25900 ER - TY - JOUR A1 - Söbke, Heinrich A1 - Lück, Andrea T1 - Framing Algorithm-Driven Development of Sets of Objectives Using Elementary Interactions JF - Applied System Innovation N2 - Multi-criteria decision analysis (MCDA) is an established methodology to support the decision-making of multi-objective problems. For conducting an MCDA, in most cases, a set of objectives (SOO) is required, which consists of a hierarchical structure comprised of objectives, criteria, and indicators. The development of an SOO is usually based on moderated development processes requiring high organizational and cognitive effort from all stakeholders involved. This article proposes elementary interactions as a key paradigm of an algorithm-driven development process for an SOO that requires little moderation effort. Elementary interactions are self-contained information requests that may be answered with little cognitive effort. The pairwise comparison of elements in the well-known analytic hierarchy process (AHP) is an example of an elementary interaction. Each elementary interaction in the development process presented contributes to the stepwise development of an SOO. Based on the hypothesis that an SOO may be developed exclusively using elementary interactions (EIs), a concept for a multi-user platform is proposed. Essential components of the platform are a Model Aggregator, an Elementary Interaction Stream Generator, a Participant Manager, and a Discussion Forum. While the latter component serves the professional exchange of the participants, the first three components are intended to be automatable by algorithms. The platform concept proposed has been evaluated partly in an explorative validation study demonstrating the general functionality of the algorithms outlined. In summary, the platform concept suggested demonstrates the potential to ease SOO development processes, as the platform concept does not restrict the application domain; it is intended to work with little administration and moderation effort, and it supports the further development of an existing SOO in the event of changes in external conditions. The algorithm-driven development of SOOs proposed in this article may ease the development of MCDA applications and, thus, may have a positive effect on the spread of MCDA applications.
KW - Multikriteria-Entscheidung KW - Multikriterielle Entscheidungsanalyse KW - multi-criteria decision analysis KW - set of objectives KW - crowdsourcing KW - platform KW - elementary interaction KW - OA-Publikationsfonds2022 Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220713-46624 UR - https://www.mdpi.com/2571-5577/5/3/49 VL - 2022 IS - Volume 5, issue 3, article 49 SP - 1 EP - 20 PB - MDPI CY - Basel ER - TY - JOUR A1 - König, Reinhard T1 - Interview on Information Architecture JF - Swiss Architecture in the Moving Image N2 - Interview on Information Architecture KW - Architektur Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20180422-25078 SP - 151 EP - 154 ER - TY - JOUR A1 - Mosavi, Amir A1 - Hosseini Imani, Mahmood A1 - Zalzar, Shaghayegh A1 - Shamshirband, Shahaboddin T1 - Strategic Behavior of Retailers for Risk Reduction and Profit Increment via Distributed Generators and Demand Response Programs JF - Energies N2 - Following the restructuring of the power industry, electricity supply to end-use customers has undergone fundamental changes. In the restructured power system, some of the responsibilities of the vertically integrated distribution companies have been assigned to network managers and retailers. Under the new situation, retailers are in charge of providing electrical energy to electricity consumers who have already signed a contract with them. Retailers usually provide the required energy at a variable price, from wholesale electricity markets, forward contracts with energy producers, or distributed energy generators, and sell it at a fixed retail price to their clients. Different strategies are implemented by retailers to reduce the potential financial losses and risks associated with the uncertain nature of wholesale spot electricity market prices and the electrical load of the consumers. In this paper, the strategic behavior of retailers in implementing forward contracts, distributed energy sources, and demand-response programs with the aim of increasing their profit and reducing their risk, while keeping their retail prices as low as possible, is investigated. For this purpose, the risk management problem of the retailer companies collaborating with wholesale electricity markets is modeled through a bi-level programming approach, and a comprehensive framework for retail electricity pricing, considering customers’ constraints, is provided in this paper. In the first level of the proposed bi-level optimization problem, the retailer maximizes its expected profit for a given risk level of profit variability, while in the second level, the customers minimize their consumption costs. The proposed programming problem is modeled as a Mixed Integer Programming (MIP) problem and can be efficiently solved using available commercial solvers. The simulation results on a test case approve the effectiveness of the proposed demand-response program based on a dynamic pricing approach in reducing the retailer’s risk and increasing its profit. In this paper, the decision-making problem of the retailers under a dynamic pricing approach for demand response integration has been investigated. The retailer was supposed to rely on forward contracts, DGs, and the spot electricity market to supply the required active and reactive power of its customers. To verify the effectiveness of the proposed model, four schemes for the retailer’s scheduling problem are considered and the resulting profit under each scheme is analyzed and compared.
The simulation results on a test case indicate that providing more options for the retailer to buy the required power of its customers and increasing its flexibility in buying energy from the spot electricity market reduces the retailer’s risk and increases its profit. From the customers’ perspective, the retailers’ access to different power supply sources may also lead to a reduction in the retail electricity prices, since the retailer would be able to decrease its electricity selling price to the customers without losing its profitability, with the aim of attracting more customers. In this work, the conditional value at risk (CVaR) measure is used for considering and quantifying risk in the decision-making problems. Among all the possible options in front of the retailer to optimize its profit and risk, demand response programs are the most beneficial option for both the retailer and its customers. The simulation results on the case study prove that implementing a dynamic pricing approach on retail electricity prices to integrate demand response programs can successfully provoke customers to shift their flexible demand from peak-load hours to mid-load and low-load hours. Comparing the simulation results of the third and fourth schemes evidences the impact of DRPs and customers’ load shifting on the reduction of the retailer’s risk, as well as the reduction of the retailer’s payment to contract holders, DG owners, and the spot electricity market. Furthermore, the numerical results imply the potential of reducing average retail prices by up to 8% under demand response activation. Consequently, it provides a win–win solution for both the retailer and its customers. KW - Risikomanagement KW - demand response programs KW - stochastic programming KW - forward contracts KW - risk management KW - retailer KW - OA-Publikationsfonds2018 Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20180628-37546 UR - http://www.mdpi.com/1996-1073/11/6/1602 VL - 2018 IS - 11, 6 PB - MDPI CY - Basel ER - TY - JOUR A1 - Ghazvinei, Pezhman Taherei A1 - Darvishi, Hossein Hassanpour A1 - Mosavi, Amir A1 - Yusof, Khamaruzaman bin Wan A1 - Alizamir, Meysam A1 - Shamshirband, Shahaboddin A1 - Chau, Kwok-Wing T1 - Sugarcane growth prediction based on meteorological parameters using extreme learning machine and artificial neural network JF - Engineering Applications of Computational Fluid Mechanics N2 - Management strategies for sustainable sugarcane production need to deal with the increasing complexity and variability of the whole sugar system. Moreover, they need to accommodate the multiple goals of different industry sectors and the wider community. Traditional disciplinary approaches are unable to provide integrated management solutions, and an approach based on whole systems analysis is essential to bring about beneficial change to industry and the community. The application of this approach to water management, environmental management and cane supply management is outlined, where the literature indicates that the application of extreme learning machine (ELM) has never been explored in this realm. Consequently, the leading objective of the current research was to fill this gap by applying ELM to establish a swift and accurate data-driven model for crop production. The key learning has been the need for innovation in the technical aspects of system function, underpinned by the modelling of sugarcane growth. Therefore, the current study is an attempt to establish an integrated model using ELM to predict the final growth amount of sugarcane.
Prediction results were evaluated and further compared with artificial neural network (ANN) and genetic programming models. The accuracy of the ELM model is calculated using the statistical indicators of Root Mean Square Error (RMSE), Pearson Coefficient (r), and Coefficient of Determination (R2), with promising results of 0.8, 0.47, and 0.89, respectively. The results also show better generalization ability in addition to a faster learning curve. Thus, the proficiency of the ELM for supplementary work on advancing prediction models for sugarcane growth was confirmed with promising results. KW - Künstliche Intelligenz KW - Sustainable production KW - ELM KW - prediction KW - machine learning KW - sugarcane KW - estimation KW - growth mode KW - extreme learning machine KW - OA-Publikationsfonds2018 Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20181017-38129 UR - https://www.tandfonline.com/doi/full/10.1080/19942060.2018.1526119 VL - 2018 IS - 12,1 SP - 738 EP - 749 PB - Taylor & Francis ER - TY - JOUR A1 - Faizollahzadeh Ardabili, Sina A1 - Najafi, Bahman A1 - Alizamir, Meysam A1 - Mosavi, Amir A1 - Shamshirband, Shahaboddin A1 - Rabczuk, Timon T1 - Using SVM-RSM and ELM-RSM Approaches for Optimizing the Production Process of Methyl and Ethyl Esters JF - Energies N2 - The production of a desired product needs an effective use of the experimental model. The present study proposes an extreme learning machine (ELM) and a support vector machine (SVM) integrated with the response surface methodology (RSM) to solve the complexity in optimization and prediction of the ethyl ester and methyl ester production process. The novel hybrid models of ELM-RSM and SVM-RSM are further used as a case study to estimate the yield of methyl and ethyl esters through a trans-esterification process from waste cooking oil (WCO) based on American Society for Testing and Materials (ASTM) standards. The results of the prediction phase were also compared with artificial neural networks (ANNs) and an adaptive neuro-fuzzy inference system (ANFIS), which were recently developed by the second author of this study. Based on the results, an ELM with a correlation coefficient of 0.9815 and 0.9863 for methyl and ethyl esters, respectively, had a high estimation capability compared with that for SVM, ANNs, and ANFIS. Accordingly, the maximum production yield was obtained in the case of using ELM-RSM of 96.86% for ethyl ester at a temperature of 68.48 °C, a catalyst value of 1.15 wt. %, a mixing intensity of 650.07 rpm, and an alcohol to oil molar ratio (A/O) of 5.77; for methyl ester, the production yield was 98.46% at a temperature of 67.62 °C, a catalyst value of 1.1 wt. %, a mixing intensity of 709.42 rpm, and an A/O of 6.09. Therefore, ELM-RSM increased the production yield by 3.6% for ethyl ester and 3.1% for methyl ester, compared with those for the experimental data. KW - Biodiesel KW - Optimierung KW - extreme learning machine KW - machine learning KW - response surface methodology KW - support vector machine KW - OA-Publikationsfonds2018 Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20181025-38170 UR - https://www.mdpi.com/1996-1073/11/11/2889 IS - 11, 2889 SP - 1 EP - 20 PB - MDPI CY - Basel ER - TY - THES A1 - List, Eik T1 - Design, Analysis, and Implementation of Symmetric-key (Authenticated) Ciphers N2 - Modern cryptography has become an often invisible but essential part of our daily lives.
Protocols for secure authentication and encryption protect our communication with various digital services, from private messaging and online shopping to bank transactions and the exchange of sensitive information. Those high-level protocols can naturally only be as secure as the authentication or encryption schemes underneath. Moreover, on a more detailed level, those schemes can at best inherit the security of their underlying primitives. While widespread standards in modern symmetric-key cryptography, such as the Advanced Encryption Standard (AES), have resisted analysis until now, closer analysis and design of related primitives can deepen our understanding. The present thesis consists of two parts that present six contributions: The first part considers block-cipher cryptanalysis of the round-reduced AES, the AES-based tweakable block cipher Kiasu-BC, and TNT. The second part studies the design, analysis, and implementation of provably secure authenticated encryption schemes. In general, cryptanalysis aims at finding distinguishing properties in the output distribution. Block ciphers are a core primitive of symmetric-key cryptography, useful for the construction of various higher-level schemes ranging from authentication and encryption to authenticated encryption and integrity protection. Therefore, their analysis is crucial for securing cryptographic schemes at their lowest level. With rare exceptions, block-cipher cryptanalysis employs a systematic strategy of investigating known attack techniques, and modern proposals are expected to be evaluated against these techniques. The considerable evaluation effort, however, demands contributions not only from the designers but also from external parties. The Advanced Encryption Standard (AES) is one of the most widespread block ciphers nowadays and is therefore a natural target for further analysis. Tweakable block ciphers augment the usual inputs of a secret key and a public plaintext by an additional public input called a tweak. Among various proposals of the previous decade, this thesis identifies Kiasu-BC as a noteworthy attempt to construct a tweakable block cipher that stays very close to the AES. Hence, its analysis intertwines closely with that of the AES and best illustrates the impact of the tweak on security. Moreover, the thesis revisits the generic tweakable block cipher Tweak-and-Tweak (TNT) and its instantiation based on the round-reduced AES. The first part investigates the security of the AES against several forms of differential cryptanalysis, developing distinguishers on four to six (out of ten) rounds of AES. For Kiasu-BC, it exploits the additional freedom in the tweak to develop two forms of differential-based attacks: rectangles and impossible differentials. The results on Kiasu-BC cover an additional round compared to attacks on the (untweaked) AES. The authors of TNT had provided an initial security analysis that still left a gap between provable guarantees and attacks. Our analysis takes a considerable step towards closing this gap. For TNT-AES - an instantiation of TNT built upon the AES round function - this thesis further shows how to transform our distinguisher into a key-recovery attack. Many applications require the simultaneous authentication and encryption of transmitted data. Authenticated encryption (AE) schemes provide both properties. Modern AE schemes usually demand a unique public input called a nonce that must not repeat.
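To make the nonce discussion concrete, the following minimal sketch shows deterministic, misuse-resistant AE in practice. It uses the standardized AES-SIV scheme from the pyca/cryptography package as a readily available analogue; it is not one of the thesis' own constructions (RIV, SIV-x, DCT).

# Illustration of nonce-misuse-resistant (deterministic) AE with AES-SIV.
# Requires the pyca/cryptography package (>= 35.0). AES-SIV is a
# standardized misuse-resistant scheme, used here only as an analogue
# to the schemes proposed in the thesis.
from cryptography.hazmat.primitives.ciphers.aead import AESSIV

key = AESSIV.generate_key(512)          # 512-bit key for AES-256-SIV
aead = AESSIV(key)

msg = b"transfer 100 EUR to account 42"
ad = [b"header-v1"]                     # associated data, authenticated only

ct1 = aead.encrypt(msg, ad)
ct2 = aead.encrypt(msg, ad)
assert ct1 == ct2   # deterministic: repeated inputs leak only equality,
                    # instead of breaking confidentiality as nonce reuse
                    # would in conventional nonce-based AE modes
assert aead.decrypt(ct1, ad) == msg     # tag verification + decryption

With a conventional nonce-based mode, repeating the nonce for two different messages can be catastrophic; with a misuse-resistant scheme, repeating all inputs merely reveals that the same message was sent twice.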
However, this requirement cannot always be guaranteed in practice. As part of a remedy, misuse-resistant and robust AE try to reduce the impact of occasional misuse. Robust AE considers not only the potential reuse of nonces: common authenticated encryption also demands that the entire ciphertext be buffered until the authentication tag has been successfully verified. In practice, this is difficult to ensure, since the setting may lack the resources for buffering the messages. Moreover, robustness guarantees in the case of misuse are valuable features. The second part of this thesis proposes three authenticated encryption schemes: RIV, SIV-x, and DCT. RIV is robust against nonce misuse and the release of unverified plaintexts. Both SIV-x and DCT provide high security independent of nonce repetitions. As the core underlying SIV-x, this thesis revisits the proof of a highly secure parallel MAC, PMAC-x, revises its details, and proposes SIV-x as a highly secure authenticated encryption scheme. Finally, DCT is a generic approach to n-bit-secure deterministic AE that expands the ciphertext-tag string by no more than n bits beyond the plaintext length. With its first part, this thesis aims to extend the understanding of (1) the cryptanalysis of round-reduced AES and (2) AES-like tweakable block ciphers. With its second part, it demonstrates how to extend known approaches to (3) robust nonce-based as well as (4) highly secure deterministic authenticated encryption. N2 - Modern cryptography has evolved into an often invisible but essential part of our everyday communication. Protocols for secure authentication and encryption protect us as a society in many areas of life, be it in messaging or online shopping, up to the protection of bank transactions or governmental communication. In most cases, ensuring authenticity and encryption is of crucial importance. Just as a chain draws its strength from its individual links, however, any cryptographic protocol can only be as secure as its underlying schemes and primitives. Through open nomination competitions, today's standards, such as the Advanced Encryption Standard (AES) for block ciphers or the Galois/Counter Mode for authenticated encryption, have become established; moreover, the past two decades of analysis by the cryptographic research community have earned them a high degree of trust. Particularly in symmetric-key cryptography, which places great emphasis on the efficiency of its schemes, predominantly systematic analysis techniques are employed. The process of developing and deepening understanding is therefore never complete, but requires new perspectives and ideas. As core components of everyday Internet communication, block ciphers and authenticated encryption schemes are of particular interest here. This dissertation is organized into two main parts comprising a total of six contributions. The first part is devoted to the analysis of AES-based (tweakable) block ciphers. The second part presents new constructions for robust authenticated encryption with high security guarantees. Cryptanalysis seeks to find non-ideal properties in the outputs of cryptographic systems.
Block ciphers, as important building blocks of symmetric-key cryptography, are of particular interest here, since they form the basis for constructing secure schemes for authentication and encryption. These, in turn, protect us in our daily lives through protocols in a multitude of applications. The analysis of block ciphers as basic operations is therefore essential for understanding the security of the schemes built upon them. The AES is probably the most widely used modern block cipher today, which alone makes it a highly interesting object of study. Tweakable block ciphers extend their classical counterparts by an additional public input, the tweak, which especially in the recent past has led to more efficient or more secure schemes. AES-based constructions attempt to profit from the large number of known properties of the AES and from its efficiency. The role of the tweak for security, however, is not fully understood, which makes its analysis of particular interest. The first part of this thesis presents a new perspective on the differential analysis of four to six rounds of AES. It then considers differential attacks on Kiasu-BC, a tweakable block cipher based on the AES that tries to stay as close as possible to the AES design. Its analysis can therefore illuminate the differences caused by the tweak more precisely than would be possible with dedicated constructions. Finally, the first part analyzes Tweak-and-Tweak (TNT), a generic transformation of a classical into a tweakable block cipher, as well as its instance TNT-AES. Through an improved security proof for TNT, it closes the gap (up to a logarithmic factor) between the previously proven security and known attacks. Many applications of authentication or encryption require both properties simultaneously. Authenticated encryption (AE) schemes have therefore become indispensable in everyday use. Until a few years ago, however, the usage requirements of AE schemes were very restrictive: modern AE schemes typically require, for each message to be encrypted, an input that must not repeat, a nonce. Nonces are generally accepted and sensible - but their uniqueness cannot be guaranteed across all application environments. For many constructions, a violation of this requirement means that no security whatsoever can be guaranteed anymore. Robust AE schemes were introduced to limit the damage and guarantee security even in the case of nonce repetitions. Such schemes moreover consider the case that invalid plaintexts are unintentionally released, which is particularly relevant in settings where they cannot be buffered until successful verification. The present work proposes three block-cipher-based schemes for misuse-resilient authenticated encryption: RIV, SIV-x, and DCT. RIV offers robustness against nonce repetitions as well as against cases in which information about invalid decryptions is released. SIV-x and DCT provide very high security guarantees without relying on nonces, each based on tweakable block ciphers.
In its first part, the present work aims, on the one hand, to extend the understanding of the differential security of the round-reduced AES towards distinguishing attacks with extremely small but still significant differences. On the other hand, this part extends the analysis of AES-like tweakable block ciphers. The second part complements the portfolio of robust authenticated encryption schemes with RIV and shows how such schemes can be implemented efficiently. With SIV-x and DCT, it further shows how deterministic authenticated encryption with high security requirements can be realized with two iterations, of which only one needs to be pseudorandom. KW - Kryptologie KW - Cryptography KW - Symmetric-key cryptography KW - Cryptanalysis KW - Provable security KW - Encryption Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20211103-45235 ER - TY - THES A1 - Schürenberg, Philipp T1 - Automatisierte fachliche Qualitätssicherung in einem Crowdsourcing-basierten bauphysikalischen Lernspiel N2 - In crowdsourcing-based systems, quality assurance of user-generated content is of great importance for maintaining usability. The building-physics learning game "BuildVille" uses a crowdsourcing approach for its quiz application: the quiz questions are generated by the users themselves. This thesis aims to ensure that faulty questions, entered by mistake or for fun, are detected as early as possible and then corrected or excluded from further distribution. To this end, a concept for the quality assurance measures in BuildVille is developed based on an analysis of existing crowdsourcing-based systems with regard to the quality assurance measures they implement. KW - Crowdsourcing KW - BuildVille Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120704-16734 ER - TY - THES A1 - Franke, Carolin T1 - Bauphysikalisches Quartett N2 - Quartets is a card game as old as it is popular. It enjoys great popularity above all among children, while hardly anyone in the older generations still plays with quartet cards. Quartet games designed specifically for young children are largely filled with content that conveys knowledge in a playful way, and good learning outcomes are recorded in this target group. How, then, are these learning outcomes achieved by playing with quartet cards? And how can this effect be transferred to students? The goal of this thesis is to apply the concept of the quartet card game to building-physics content and, where necessary, to extend or modify the game principles. The target group of the game are the students of the Faculty of Civil Engineering. A particular challenge is to bring together different classes of objects of building-physics relevance in one game and to make them comparable. The resulting quartet card game should contain not just one object class but several, chosen such that categories with building-physics content can be found. Attention should also be paid to structuring the learning content in order to allow an easy transfer of the game concept to other subject domains. The result of this thesis are two complete and playable quartet games.
KW - Quartett KW - Lernspiel KW - Bauphysik KW - Baustoff Quartett Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20121130-17723 ER - TY - THES A1 - Müller, Naira T1 - Erweiterung von Fliplife mit bauphysikalischen Inhalten N2 - In this thesis, a concept was developed that extends Fliplife by a building-physics career path. Exemplary building-physics content, as well as game mechanics compatible with the game concept and suited to conveying knowledge, were implemented in the game. KW - Social Game KW - Bauphysik KW - Fliplife KW - Lernspiele KW - Digital Games Based Learning Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120704-16763 ER - TY - JOUR A1 - Bielik, Martin A1 - Schneider, Sven A1 - Kuliga, Saskia A1 - Griego, Danielle A1 - Ojha, Varun A1 - König, Reinhard A1 - Schmitt, Gerhard A1 - Donath, Dirk ED - Resch, Bernd ED - Szell, Michael T1 - Examining Trade-Offs between Social, Psychological, and Energy Potential of Urban Form JF - ISPRS International Journal of Geo-Information N2 - Urban planners are often challenged with the task of developing design solutions which must meet multiple, and often contradictory, criteria. In this paper, we investigated the trade-offs between the social, psychological, and energy potential of the fundamental elements of urban form: the street network and the building massing. Since formal methods to evaluate urban form from the psychological and social point of view are not readily available, we developed a methodological framework to quantify these criteria as the first contribution of this paper. To evaluate the psychological potential, we conducted a three-tiered empirical study, starting from real-world environments and then abstracting them to virtual environments. In each context, the implicit (physiological) response and explicit (subjective) response of pedestrians were measured. To quantify the social potential, we developed a street-network centrality-based measure of social accessibility. For the energy potential, we created an energy model to analyze the impact of pure geometric form on the energy demand of the building stock. The second contribution of this work is a method to identify distinct clusters of urban form and, for each, explore the trade-offs between the selected design criteria. We applied this method to two case studies, identifying nine types of urban form and their respective potential trade-offs, which are directly applicable for the assessment of strategic decisions regarding urban form during the early planning stages. KW - Planung KW - social accessibility KW - urban perception KW - energy demand KW - urban form KW - trade-offs KW - OA-Publikationsfonds2019 Y1 - 2019 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190408-38695 UR - https://www.mdpi.com/2220-9964/8/2/52 VL - 2019 EP - Volume 8, Issue 2, 52 ER - TY - THES A1 - Kavrakov, Igor T1 - Structural Optimization of Composite Cross-Sections and Elements using Energy Methods N2 - Structural optimization has gained considerable attention in the design of engineering structures, especially in the preliminary phase. This study introduces an unconventional approach to structural optimization by utilizing the Energy method with Integral Material Behavior (EIM), based on Lagrange's principle of minimum potential energy.
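For reference, the variational principle invoked here can be stated in standard notation (a generic textbook form, not the thesis' own formulation): for a displacement field u with strain energy density \psi, body forces \mathbf{b}, and surface tractions \mathbf{t}, the total potential energy is

\Pi(u) = \int_\Omega \psi\bigl(\varepsilon(u)\bigr)\,\mathrm{d}\Omega \;-\; \int_\Omega \mathbf{b}\cdot\mathbf{u}\,\mathrm{d}\Omega \;-\; \int_{\Gamma_t} \mathbf{t}\cdot\mathbf{u}\,\mathrm{d}\Gamma, \qquad \delta\Pi(u) = 0,

so that equilibrium states are exactly the stationary (here, minimum) points of \Pi. As the record goes on to describe, the proposed procedure exploits this by treating equilibrium as the inner minimization of a two-level optimization.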
An automated two-level optimization search process is proposed, which integrates the EIM, as an alternative method for nonlinear structural analysis, with bilevel optimization. The proposed procedure secures equilibrium by minimizing the potential energy on one level and, on a higher level, a design objective function. For this, the most robust strategy of bilevel optimization, the nested method, is used. The function of the potential energy is investigated, along with its instabilities in physically nonlinear analysis, through representative examples, by which the advantages and limitations of using this method are reviewed. Furthermore, optimization algorithms are discussed. A fully functional numerical code is developed for nonlinear cross-section, element and 2D frame analysis, utilizing different finite elements, and is verified against existing EIM programs. As a proof of concept, the method is applied to selected examples at the cross-section and element levels using this code. For the former, a comparison is made with the standard procedure, which employs the equilibrium equations within the constraints. The element level is validated against a theoretical solution of an arch bridge, and finally a truss bridge is optimized. Most of the examples are chosen to be adequate for everyday engineering practice, to demonstrate the effectiveness of the proposed method. This study implies that, with further development, this method could become just as competitive as conventional structural optimization techniques using the Finite Element Method. KW - Strukturoptimierung Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190815-39593 ER - TY - THES A1 - Schollmeyer, Andre T1 - Efficient and High-Quality Rendering of Higher-Order Geometric Data Representations N2 - Computer-aided design (CAD) refers to the design of industrial products by means of virtual 3D models. A CAD model consists of parametric curves and surfaces, in most cases non-uniform rational B-splines (NURBS). This mathematical description is also used for the analysis, optimization, and presentation of the model. Each of these development phases requires a different visual representation to provide appropriate feedback to the respective users. Designers, for example, prefer illustrative or realistic renderings, engineers need a comprehensible visualization of simulation results, while an immersive 3D presentation can be helpful in a usability analysis or for design selection. The interactive rendering of NURBS models and simulation data, however, is a major challenge due to the high computational effort and limited hardware support. This thesis presents four novel approaches that address the interactive rendering of NURBS models and simulation data. The presented algorithms exploit new capabilities of current graphics hardware to improve on the state of the art in terms of quality, efficiency, and rendering performance. Two of these approaches concern the direct rendering of the parametric description without approximations or time-consuming precomputations.
The data structures and algorithms presented enable the efficient partition, classification, tessellation, and rendering of trimmed NURBS surfaces, as well as an interactive ray-casting algorithm for the isosurface visualization of NURBS-based isogeometric analyses. The other two approaches describe, on the one hand, the versatile concept of programmable semi-transparency for illustrative and comprehensible visualizations of depth-complex CAD models and, on the other hand, a new hybrid method for the reprojection of semi-transparent and opaque image information to accelerate the generation of stereoscopic image pairs. The latter two approaches are based on rasterized geometry and are thus also applicable to ordinary triangle models, which makes these contributions relevant to the fields of computer graphics and virtual reality as well. The evaluation was carried out with large, real-world NURBS data sets. The results show that direct rendering based on the parametric description is possible at interactive frame rates and with subpixel-precise quality. The introduction of programmable semi-transparency furthermore enables collaborative 3D interaction techniques for exploring the models in virtual environments, as well as illustrative and comprehensible visualizations of depth-complex CAD models. The generation of stereoscopic image pairs for interactive visualization on 3D displays was accelerated. This measurable improvement was also found to be perceptible and beneficial in a user study. N2 - In computer-aided design (CAD), industrial products are designed using a virtual 3D model. A CAD model typically consists of curves and surfaces in a parametric representation, in most cases non-uniform rational B-splines (NURBS). The same representation is also used for the analysis, optimization and presentation of the model. In each phase of this process, different visualizations are required to provide appropriate user feedback. Designers work with illustrative and realistic renderings, engineers need a comprehensible visualization of the simulation results, and usability studies or product presentations benefit from using a 3D display. However, the interactive visualization of NURBS models and corresponding physical simulations is a challenging task because of the computational complexity and the limited graphics hardware support. This thesis proposes four novel rendering approaches that improve the interactive visualization of CAD models and their analysis. The presented algorithms exploit the latest graphics hardware capabilities to advance the state of the art in terms of quality, efficiency and performance. In particular, two approaches describe the direct rendering of the parametric representation without precomputed approximations and time-consuming pre-processing steps. New data structures and algorithms are presented for the efficient partition, classification, tessellation, and rendering of trimmed NURBS surfaces, as well as the first direct isosurface ray-casting approach for NURBS-based isogeometric analysis. The other two approaches introduce the versatile concept of programmable order-independent semi-transparency for the illustrative and comprehensible visualization of depth-complex CAD models, and a novel method for the hybrid reprojection of opaque and semi-transparent image information to accelerate stereoscopic rendering.
Both approaches are also applicable to standard polygonal geometry, which makes them a contribution to the computer graphics and virtual reality research communities as well. The evaluation is based on real-world NURBS-based models and simulation data. The results show that rendering can be performed directly on the underlying parametric representation with interactive frame rates and subpixel-precise image results. The computational costs of additional visualization effects, such as semi-transparency and stereoscopic rendering, are reduced to maintain interactive frame rates. The benefit of this performance gain was confirmed by quantitative measurements and a pilot user study. KW - Rendering KW - CAD KW - NURBS KW - Computer-Aided Design KW - Isogeometric Analysis KW - Graphics hardware Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20181120-38234 ER - TY - THES A1 - Lang, Kevin T1 - Argument Search with Voice Assistants N2 - The need to find persuasive arguments can arise in a variety of domains, such as politics, finance, marketing or personal entertainment. In these domains, there is a demand to make decisions oneself or to convince somebody of a specific topic. To reach a conclusion, one has to thoroughly search different sources in the literature and on the web to compare various arguments. Voice interfaces, in the form of smartphone applications or smart speakers, offer the user natural conversations as a comfortable way of making search requests, in contrast to a traditional search interface with keyboard and display. The benefits and obstacles of such a new interface are analyzed in two studies. The first is a survey analyzing the target group, with questions about situations, motivations, and potentially demanded features. The second is a wizard-of-Oz experiment investigating how users formulate requests to such a novel system. The results indicate that a search interface with conversational abilities can make a helpful assistant, but that some additional information retrieval and visualization features need to be implemented to satisfy the demands of a broader audience. KW - Amazon Alexa KW - Argument KW - Suche KW - Stimme KW - Assistent KW - alexa KW - voice KW - assistant KW - arguments KW - search Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190617-39353 ER - TY - THES A1 - Held, Janina T1 - Entwurf eines Spieler-Modells für eine erweiterbare Spielplattform zur Ausbildung in der Bauphysik N2 - In the project Intelligent Learning, the chairs of Content Management and Web Technologies, Virtual Reality Systems, and Building Physics at the Bauhaus-Universität Weimar are working on the development of innovative information technologies for eLearning environments. In the sub-areas of retrieval, extraction, and visualization of large document collections, as well as simulation- and plan-based knowledge transfer, algorithms and tools are investigated in order to make eLearning systems more capable and thus optimize learning outcomes. The goal of the project in the field of simulation-based knowledge transfer is the development of a multiplayer online game (MOG) to support education in building physics. Within this bachelor's thesis, a player model for managing the player-specific data of this digital learning software is designed and integrated into the existing framework.
The focus of the thesis lies in organizing the skills the player has acquired and in selecting suitable game tasks adapted to the player's state of knowledge. For application in the eLearning domain, the extensibility of the model by new learning complexes is an essential requirement. KW - Skill KW - Skillsystem KW - Skillmanager KW - Missionsmanager KW - Spielermodell Y1 - 2011 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120117-15249 ER - TY - CHAP A1 - Fedior, Marco A1 - Hamel, Wido ED - Steiner, Maria ED - Theiler, Michael ED - Mirboland, Mahsa T1 - Simulationsumgebung zur Evaluation von umweltorientierten Verkehrsmanagement-Strategien T2 - 30. Forum Bauinformatik N2 - This contribution describes the difficulties of forecasting traffic-induced pollutant immissions. Its focus is the development and construction of a simulation environment for the evaluation of environmentally oriented traffic management strategies. The simulation environment is developed across the three fields of traffic, emission, and immission, and is first applied to the evaluation of traffic measures for the Friedberger Landstraße in Frankfurt am Main. KW - Verkehr KW - Simulation KW - Emission KW - Luftverunreinigender Stoff Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190328-38678 ER - TY - THES A1 - Beck, Stephan T1 - Immersive Telepresence Systems and Technologies N2 - Modern immersive telepresence systems enable people at different locations to meet in virtual environments using realistic three-dimensional representations of their bodies. For the realization of such a three-dimensional version of a video conferencing system, each user is continuously recorded in 3D. These 3D recordings are exchanged over the network between remote sites. At each site, the remote recordings of the users, referred to as 3D video avatars, are seamlessly integrated into a shared virtual scenery and displayed in stereoscopic 3D for each user from his or her perspective. This thesis reports on algorithmic and technical contributions to modern immersive telepresence systems and presents the design, implementation and evaluation of the first immersive group-to-group telepresence system in which each user is represented as a realistic life-size 3D video avatar. The system enabled two remote user groups to meet and collaborate in a consistent shared virtual environment. The system relied on novel methods for the precise calibration and registration of color and depth (RGBD) sensors into the coordinate system of the application, as well as an advanced distributed processing pipeline that reconstructs realistic 3D video avatars in real time. In the course of this thesis, the calibration of 3D capturing systems was greatly improved. While the first development focused on precisely calibrating individual RGBD sensors, the second stage presents a new method for calibrating and registering multiple color and depth sensors at a very high precision throughout a large 3D capturing volume. This method was further refined by a novel automatic optimization process that significantly speeds up the manual operation and yields similarly high accuracy. A core benefit of the new calibration method is its high runtime efficiency, achieved by directly mapping raw depth sensor measurements into the application coordinate system and to the coordinates of the associated color sensor.
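The principle of such a mapping can be sketched as follows; this is a minimal illustration with assumed names and a conventional pinhole back-projection, whereas the thesis' volumetric calibration maps raw measurements directly and thereby also absorbs lens distortion:

import numpy as np

# Minimal sketch: map a raw depth image into an application (world)
# coordinate system using camera intrinsics and a rigid calibration
# transform. fx, fy, cx, cy and T_world_from_cam are assumed to come
# from a prior calibration step.
def depth_to_world(depth, fx, fy, cx, cy, T_world_from_cam):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth                                   # metric depth per pixel
    x = (u - cx) / fx * z                       # back-project to camera space
    y = (v - cy) / fy * z
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1)  # homogeneous
    pts_world = pts_cam @ T_world_from_cam.T    # rigid transform to world
    return pts_world[..., :3]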
As a result, the calibration method is an efficient solution in terms of precision and applicability in virtual reality and immersive telepresence applications. In addition to the core contributions, the results of two case studies, which address 3D reconstruction and data streaming, lead to the final conclusion of this thesis and to directions of future work in the rapidly advancing field of immersive telepresence research. N2 - In modern 3D telepresence systems, users are represented realistically in three dimensions and can meet in a shared virtual environment. Since the users can see each other realistically, limitations of conventional two-dimensional video conferencing systems are overcome and new possibilities for collaboration are created. To realize an immersive telepresence system, each user is continuously captured in 3D and reconstructed as a so-called 3D video avatar. The 3D video avatars are exchanged between the remote sites over a network connection, integrated into a shared virtual scene on each side, and displayed to each user in perspectively correct 3D. This work contributes algorithmically and technically to current research in the field of 3D telepresence and presents the design, implementation, and evaluation of a new immersive telepresence system. For the first time, groups of users at different locations can meet in a consistent shared virtual environment and see each other as realistic life-size 3D video avatars. The system is based on newly developed methods that enable the precise calibration and registration of multiple color and depth cameras into a common coordinate system, as well as on a newly developed distributed processing pipeline that enables the realistic reconstruction of 3D video avatars in real time. Within this work, the calibration of 3D capturing systems based on multiple color and depth cameras was improved considerably. A first development concentrated on the precise calibration and registration of individual depth cameras. A substantial new development makes it possible to volumetrically calibrate multiple color and depth cameras with very high accuracy within a large 3D capturing volume and to register them into a superordinate coordinate system. In the course of the work, the required user interaction was significantly reduced by an automatic optimization procedure, which enables the calibration of 3D capturing systems within a few minutes at high accuracy. A major advantage of this new volumetric calibration method is that measured depth values are mapped directly into the coordinate system of the application and into the coordinate system of the corresponding color camera. In particular, no lens undistortion computations are needed at application runtime, since these are already implicitly compensated by the volumetric calibration. The immersive telepresence system developed in this work stands out from related work. The virtual meeting space created by the system allows natural forms of interaction, such as gestures or facial expressions, and at the same time offers established interaction techniques of virtual reality that support the joint exploration and analysis of 3D content.
The calibration method newly developed in this work represents an efficient solution in terms of accuracy and flexibility for virtual reality and modern 3D telepresence applications. In addition to the presented developments, the results of two case studies in the areas of 3D reconstruction and network transmission contribute to this work and support suggestions and outlooks for future developments in the advancing field of 3D telepresence research. KW - Virtuelle Realität KW - Telepräsenz KW - Mensch-Maschine-Kommunikation KW - Tiefensensor KW - Camera Calibration KW - Depth Camera KW - 3D Telepresence KW - Virtual Reality Y1 - 2019 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190218-38569 ER - TY - JOUR A1 - Ouaer, Hocine A1 - Hosseini, Amir Hossein A1 - Amar, Menad Nait A1 - Ben Seghier, Mohamed El Amine A1 - Ghriga, Mohammed Abdelfetah A1 - Nabipour, Narjes A1 - Andersen, Pål Østebø A1 - Mosavi, Amir A1 - Shamshirband, Shahaboddin T1 - Rigorous Connectionist Models to Predict Carbon Dioxide Solubility in Various Ionic Liquids JF - Applied Sciences N2 - Estimating the solubility of carbon dioxide in ionic liquids using reliable models is of paramount importance from both environmental and economic points of view. In this regard, the current research aims at evaluating the performance of two data-driven techniques, namely multilayer perceptron (MLP) and gene expression programming (GEP), for predicting the solubility of carbon dioxide (CO2) in ionic liquids (ILs) as a function of pressure, temperature, and four thermodynamic parameters of the ionic liquid. To develop the above techniques, 744 experimental data points derived from the literature, covering 13 ILs, were used (80% of the points for training and 20% for validation). Two backpropagation-based methods, namely Levenberg–Marquardt (LM) and Bayesian Regularization (BR), were applied to optimize the MLP algorithm. Various statistical and graphical assessments were applied to check the credibility of the developed techniques. The results were then compared with those calculated using the Peng–Robinson (PR) or Soave–Redlich–Kwong (SRK) equations of state (EoS). The highest coefficient of determination (R2 = 0.9965) and the lowest root mean square error (RMSE = 0.0116) were recorded for the MLP-LMA model on the full dataset (with a negligible difference to the MLP-BR model). The comparison of results from this model with the widely applied thermodynamic equation-of-state models revealed slightly better performance, but the EoS approaches also performed well, with R2 from 0.984 up to 0.996. Lastly, the newly established correlation based on the GEP model exhibited very satisfactory results, with overall values of R2 = 0.9896 and RMSE = 0.0201. KW - Maschinelles Lernen KW - Machine learning KW - OA-Publikationsfonds2020 Y1 - 2019 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200107-40558 UR - https://www.mdpi.com/2076-3417/10/1/304 VL - 2020 IS - Volume 10, Issue 1, 304 PB - MDPI ER - TY - THES A1 - Berhe, Asgedom Haile T1 - Mitigating Risks of Corruption in Construction: A theoretical rationale for BIM adoption in Ethiopia N2 - This PhD thesis sets out to investigate the potentials of Building Information Modeling (BIM) to mitigate risks of corruption in the Ethiopian public construction sector. The wide-ranging capabilities and promises of BIM have led to the strong perception among researchers and practitioners that it is an indispensable technology.
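A minimal sketch of the MLP workflow described in the preceding record is given below. It is illustrative only: the placeholder data is random, and scikit-learn's default Adam solver stands in for the Levenberg–Marquardt and Bayesian-regularization training used in the paper.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

# X: pressure, temperature, and four thermodynamic IL parameters;
# y: CO2 solubility (placeholder random data in place of the 744
# literature points used in the paper).
X, y = np.random.rand(744, 6), np.random.rand(744)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2  :", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)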
Consequently, it has become a frequent subject of science and research. Meanwhile, many countries, especially developed ones, have committed themselves to applying the technology extensively. Increasing productivity is the most common and frequently cited reason for this. However, both technology developers and adopters are often oblivious to the potential of BIM for addressing critical challenges in the construction sector, such as corruption. This is particularly significant in developing countries like Ethiopia, where the problems and effects of corruption are acute. Studies reveal that bribery and corruption have long pervaded the construction industry worldwide. The complex and fragmented nature of the sector provides an environment for corruption. The Ethiopian construction sector is not immune to this epidemic reality. In fact, it is regarded as one of the most vulnerable sectors owing to varying socio-economic and political factors. Since 2015, Ethiopia has been adopting BIM, yet without clear goals and strategies. As a result, the potential of BIM for combating concrete problems of the sector remains untapped. To this end, this dissertation does pioneering work by showing how the collaboration and coordination features of the technology contribute to minimizing the opportunities for corruption; tracing loopholes would otherwise remain complex and ineffective in traditional documentation processes. Proceeding from this anticipation, this thesis raises two primary questions: what are the areas and risks of corruption in Ethiopian public construction projects, and how could BIM be leveraged to mitigate these risks? To tackle these and other secondary questions, the research employs a mixed-method approach. The selected main research strategies are Survey, Grounded Theory (GT) and Archival Study. First, the author disseminates an online questionnaire among Ethiopian construction engineering professionals to pinpoint areas of vulnerability to corruption; 155 responses are compiled and scrutinized quantitatively. Then, semi-structured in-depth interviews are conducted with 20 senior professionals, primarily to comprehend the opportunities for and risks of corruption in the project stages and decision points identified as highly vulnerable. At the same time, open interviews (consultations) are held with 14 informants to gauge the state of construction documentation, BIM, and loopholes for corruption in the country. These qualitative data are then analyzed utilizing the principles of GT, heat/risk mapping and Social Network Analysis (SNA). The risk mapping assists the researcher in prioritizing corruption risks, while SNA makes it feasible to methodically identify the key actors/stakeholders in the corruption venture. Based on the generated research data, the author constructs a [substantive] grounded theory around the elements of corruption in the Ethiopian public construction sector. This theory later guides the strategic proposition of BIM. Finally, 85 public-construction-related cases are analyzed systematically to substantiate and confirm the previous findings. By way of these multiple research endeavors, based first and foremost on the triangulation of qualitative and quantitative data analysis, the author arrives at a number of key findings.
First, estimations, tender document preparation and evaluation, construction materials, quality control, and additional work orders are found to be the most vulnerable stages in the design, tendering and construction phases, respectively. Second, middle-management personnel of contractors and clients, aided by brokers, play the most critical roles in corrupt transactions within the prevalent corruption network. Third, grand corruption persists in the sector, attributed to the fact that top management and higher officials exercise their overriding power, supported by the lack of project audits and accountability; individuals at the operational level, by contrast, exploit intentional and unintentional 'errors' as opportunities for corruption. In light of these findings, two conceptual BIM-based risk mitigation strategies are prescribed: active and passive automation of project audits, and the monitoring of project information throughout the projects' value chain. These propositions rely on BIM's present dimensional capabilities and the promises of Integrated Project Delivery (IPD). Moreover, the synergies of BIM with other technologies, such as Information and Communication Technology (ICT) and radio frequency technologies, are treated as well. All these arguments form the basis for the main thesis of this dissertation: that BIM is able to mitigate corruption risks in the Ethiopian public construction sector. The discourse on skepticism about BIM, stemming from the complex nature of corruption and from the strategic as well as technological limitations of BIM, is also illuminated and complemented by this work. Thus, the thesis uncovers possible research gaps and lays the foundation for further studies. KW - Building Information Modeling KW - Korruption KW - Risikomanagement KW - Äthiopien KW - Corruption KW - BIM KW - Risk management KW - Construction KW - Ethiopia Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20211007-45175 ER - TY - THES A1 - Preis Dutra, Joatan T1 - Cultural Heritage on Mobile Devices: Building Guidelines for UNESCO World Heritage Sites' Apps N2 - Technological improvements and access provide a fertile scenario for creating and developing mobile applications (apps). This scenario has resulted in a myriad of apps providing information on touristic destinations, including those with a cultural profile, such as apps dedicated to UNESCO World Heritage Sites (WHS). However, not all of these apps are equally effective. A successful app must consider usability aspects and features aligned with reliable content. Although guidelines for mobile usability are broadly available, they are generic, and none of them concentrates specifically on cultural heritage sites, especially those in open-air settings. This research aims to fill this gap in the literature and discusses how to adapt and develop specific guidelines for a better outdoor WHS experience. It uses an empirical approach applied to an open-air WHS city: Weimar and its Bauhaus and Classical Weimar sites. In order to build a new set of guidelines for open-air WHS, this research used a systematic approach to compare literature-based guidelines with industry-based ones (based on affordances), extracted from the available apps dedicated to WHS in Germany.
The guidelines compiled from both sources were comparatively tested using two prototypes built from the distinct sets of guidelines, resulting in a set of recommendations that collects the best approaches from both sources and adds new ones suggested by the evaluation. N2 - Technological progress and access to it offer a fertile scenario for the innovation and development of mobile applications (apps). This has produced a multitude of available apps that provide information on tourist destinations, in particular those with a cultural background, such as UNESCO World Heritage Sites (WHS). However, not all apps are equally effective. To develop a successful app, the usability of its features must be considered, together with the reliability of its content. Although guidelines for mobile usability are widely available, they are generic; no guideline is explicitly tailored to cultural heritage sites, let alone to open-air locations. The aim of this work is to close this gap in the literature and to show how specific guidelines can be adapted and developed to create better open-air WHS experiences. The work proceeds empirically, using the German city of Weimar as an example, whose Classical Weimar and Bauhaus sites constitute open-air WHS. Germany is the fifth most prominent country on the UNESCO list and is known for its technological profile and access to technology. Choosing this country as a suitable environment for the empirical approach is a good starting point for gathering experience for innovative projects that open up cultural heritage through apps. To establish a set of new guidelines for open-air WHS, this work compared guidelines from academic sources with industry-based guidelines derived from an extensive sample of available apps dedicated to WHS in Germany. In the industry-based approach, WHS-related apps available on the German app market (for iOS and Android devices) were selected, and their features and use were examined from the user's perspective. From this, commonalities could be derived among the current features and tools with which WHS are promoted. The analysis yielded guidelines from the layout, navigation, design, and content perspectives, from which a prototype app was created that reflects the state of the art recorded in the industry. While the guidelines from the industry survey followed an observational approach to existing offerings, the guidelines obtained from the literature review took a systematic approach to the academic material. The literature-based approach drew on publications about the usability of mobile apps available in research databases. In addition, existing usability models and official guidelines from the leading companies developing mobile operating systems (iOS and Android) were included, which connect and combine different approaches to and views on mobile interface design and which could be used for WHS.
Considering the diversity of visitor profiles in Weimar, studies on user interfaces for older users were also included. Although the guidelines do not concentrate on didactic functions, studies on mobile learning were also incorporated insofar as the user interface was part of the study objective. This decision was made because the city of Weimar also receives young students as visitors, who explore and get to know the city's historical sights. Overall, this work concentrated on clear instructions that could be turned into guidelines. The analysis was not limited to the shortlist of study objects, however, but was extrapolated to take into account relevant references cited in the publications selected for the sample. Whenever a guideline or recommendation was found, it was entered into a table, following a structure similar to that of the guidelines obtained from the survey of the app industry. New categories were added to accommodate the findings of the literature review. By and large, some guidelines of the literature-based approach confirmed the instructions of the industry-oriented approach, while others contradicted them. This made it possible to create a new set of guidelines, which were tested against each other. The content of both prototypes (industry- and literature-based) was elaborated to address the WHS in Weimar, drawing on content available on the city's official tourism website and from Weimar's largest cultural foundation, the Klassik Stiftung. Using two different simple prototypes also offered the opportunity to test further features, such as different ways of displaying maps and content. To examine and compare the different sets of guidelines, a task-based test and a comparative evaluation survey were conducted. Testers from different age groups performed a series of predefined tasks in both prototypes and answered a questionnaire in which the features and formats presented in the two versions could be compared. The questions were to be answered from a selection of predefined responses, suitable for statistical evaluation, particularly on the topic of user satisfaction. Open questions allowing the testers to make personal contributions were also offered. This method was decisive for comparing and analyzing the two sets of guidelines (industry vs. literature review) against each other. In this way, an ideal approach could be found for guidelines for apps dealing with open-air World Heritage Sites, and further recommendations arising from the evaluation could be added. The result is a comprehensive set of guidelines that can be applied in future open-air tourism apps dealing with World Heritage Sites.
KW - Benutzerschnittstellenentwurfssystem KW - Welterbe KW - Benutzerfreundlichkeit KW - App KW - Mobiles Endgerät KW - Interface Design KW - world heritage sites KW - usability KW - mobile devices Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20211129-45319 UR - www.joatan.com.br/docs/Thesis_JoatanPreisDutra.pdf ER - TY - THES A1 - Rausch, Gabriel T1 - Dreidimensionale Webanwendungen zur Verräumlichung von Informationen N2 - This thesis examines the potential of 3D web applications for conveying information in general and for presenting urban design contexts in particular. As a fundamental factor of visual and functional quality, which directly influences the user's perception, the feasibility of 3D web content is assessed through an explorative, qualitative evaluation of web agencies. Building on this, the potential of 3D web applications is examined from the user's perspective in order to establish connections between feasibility in development on the one hand and acceptance criteria among recipients on the other. The empirical study, modeled for this thesis with the research partner Bosch, investigates how 3D compared with 2D and 2.5D, and WebGL compared with previous 3D web technologies, influence the user's visual perception and cognitive performance. The findings show parallels to existing studies from fields outside the web. To derive the significance of 3D web applications for improving decision-making processes in urban planning projects, aspects of interaction and visual perception are placed in the specific context of urban planning tools. It is examined whether web-based 3D visualizations can be meaningfully employed to convey urban design contexts and to what extent existing projects, such as the research project urbanAPI developed by Fraunhofer IGD and used as an example in this thesis, can make use of the WebGL technology. Against this background, the thesis aims to identify acceptance criteria and usage barriers of WebGL-based 3D web applications in order to contribute to the feasibility of web applications and to the development of corresponding urban planning tools. KW - WEB KW - WebGL KW - 3D KW - Web KW - Application KW - Urban Planning Tool KW - Stadtplanung Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20161116-27029 ER - TY - GEN A1 - Müller, Naira A1 - Hennig, Christoph A1 - Aubel, Mario A1 - Hesse, Tobias A1 - Schneider, Sascha T1 - FlipLife als Mitarbeiterrekrutierungsquelle N2 - Starting from the task at hand, this work outlines the particularities of employee recruitment and its application on the Internet, in particular in social games, as exemplified by the game FlipLife. KW - Fliplife Mitarbeiterrekrutierung Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20120229-15727 ER - TY - JOUR A1 - Hijazi, Ihab Hamzi A1 - König, Reinhard A1 - Schneider, Sven A1 - Li, Xin A1 - Bielik, Martin A1 - Schmitt, Gerhard A1 - Donath, Dirk T1 - Geostatistical Analysis for the Study of Relationships between the Emotional Responses of Urban Walkers to Urban Spaces JF - International Journal of E-Planning Research N2 - The described study aims to find correlations between urban spatial configurations and human emotions.
To this end, the authors measured people's emotions while walking along a path in an urban area, using an instrument that measures skin conductance and skin temperature. The corresponding locations of the test persons were recorded using a GPS tracker (n=13). The results are interpreted and categorized as measures of positive and negative emotional arousal in order to evaluate the technical and methodological process. The test results offer initial evidence that certain spaces or spatial sequences do cause positive or negative emotional arousal, while others are relatively neutral. To achieve the goal of the study, this outcome was used as a basis for testing correlations between people's emotional responses and urban spatial configurations represented by isovist properties of the urban form. Using their model, the authors can explain negative emotional arousal for certain places, but they could not find a model that predicts emotional responses for individual spatial configurations. KW - Emotion KW - Isovist KW - Geo-Statistical Analysis KW - GIS KW - Geografie KW - Architektur Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160622-26025 N1 - Published in: International Journal of E-Planning Research ; Volume 5, Issue 1 (2016) SP - 1 EP - 19 ER - TY - CHAP A1 - König, Reinhard A1 - Schneider, Sven A1 - Hijazi, Ihab Hamzi A1 - Li, Xin A1 - Bielik, Martin A1 - Schmitt, Gerhard A1 - Donath, Dirk T1 - Using geo statistical analysis to detect similarities in emotional responses of urban walkers to urban space T2 - Sixth International Conference on Design Computing and Cognition (DCC14) N2 - Using geostatistical analysis to detect similarities in the emotional responses of urban walkers to urban space. KW - Städtebau Y1 - 2014 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20160121-25146 ER - TY - BOOK A1 - Breuer, Johannes A1 - Bart, Marlene A1 - Freier, Alex Leo A1 - Rünker, Maximilian A1 - Jakubek, Kristin A1 - Rubiano, Juan A1 - Groos, Cora A1 - Šálek, Martin A1 - Fritz, Henrieke A1 - Kokkinidou, Eirini A1 - Richter, Fabian A1 - Liu, Ani A1 - Held, Tobias A1 - Moses, Gabriel S A1 - Blasius, Clara Maria A1 - Spång, Fanny A1 - Bencicova, Evelyn A1 - Rückeis, Julia A1 - Thurow, Katharina A1 - Maas, Frederike A1 - Farfán, Vanessa A1 - Tikka, Emilia A1 - Lee, Sang A1 - Holzheu, Stefanie ED - Breuer, Johannes ED - Bart, Marlene ED - Freier, Alex Leo T1 - Atlas der Datenkörper. Körperbilder in Kunst, Design und Wissenschaft im Zeitalter digitaler Medien N2 - Digital technologies and social media change self-perception and the perception of the body, distorting, amplifying, or producing specific body images in the process. The contributors map these phenomena, inquire into their medial mode of existence as well as the possibilities of their critique, and meet their novelty with a transdisciplinary approach. The landscape of recent body images and techniques of digital corporeality is examined from the perspectives of artistic and design research as well as art, cultural and media studies, psychology, and neuroscience.
T3 - Atlas der Datenkörper - 1 KW - Körperbild KW - Kunst KW - Design KW - Digitale Medien KW - Körper KW - Digitalisierung KW - Künstlerische Forschung KW - OA-Publikationsfonds2022 Y1 - 2022 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220411-46248 SN - 2750-7483 VL - 2022 PB - transcript Verlag CY - Bielefeld ER - TY - THES A1 - Mthunzi, Everett T1 - Interactive Surface Environments: Design and Implementation N2 - This dissertation presents three studies on the design and implementation of interactive surface environments. It puts forward approaches to engineering interactive surface prototypes using prevailing methodologies and technologies. The scholarly findings from each study have been condensed into academic manuscripts, which are conferred herewith. The first study identifies a communication gap between engineers of interactive surface systems (i.e., originators of concepts) and future developers. To bridge the gap, it explores a UML-based framework to establish a formal syntax for modeling the hardware, middleware, and software of interactive surface prototypes. The proposed framework targets models-as-end-products, towards enabling a shared view of research prototypes, thereby facilitating dialogue between concept originators and future developers. The second study positions itself to support developers with an open-source solution for exploiting 3D point clouds for interactive tabletop applications using CPU architectures. Given dense 3D point-cloud representations of tabletop environments, the study aims toward mitigating high computational effort by segmenting candidate interaction regions as a preprocessing step. The study contributes a robust open-source solution for reducing computational costs when leveraging 3D point clouds for interactive tabletop applications. The solution itself is flexible and adaptable to variable interactive surface applications. The third study contributes an archetypal concept for integrating mobile devices as active components in augmented tabletop surfaces. With emphasis on transparent development trails, the study demonstrates the utility of the open-source tool developed in the second study. In addition to leveraging 3D point clouds for real-time interaction, the research considers recent advances in computer vision and wireless communication to realize a modern, interactive tabletop application. A robust strategy that combines spatial augmented reality, point-cloud-based depth perception, CNN-based object detection, and Bluetooth communication is put forward. In addition to seamless communication between ad hoc mobile devices and interactive tabletop systems, the archetypal concept demonstrates the benefits of preprocessing point clouds by segmenting candidate interaction regions, as suggested in the second study. Collectively, the studies presented in this dissertation contribute: (1) bridging the gap between originators of interactive surface concepts and future developers, (2) promoting the exploration of 3D point clouds for interactive surface applications using CPU-based architectures, and (3) leveraging 3D point clouds together with emerging CNN-based object detection and Bluetooth communication technologies to advance existing surface interaction concepts.
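As a loose illustration of the preprocessing idea described in the second study above (segmenting candidate interaction regions from a tabletop point cloud before any expensive processing), a minimal Python sketch might look as follows; the thresholds, grid size, and function name are invented assumptions, not the dissertation's actual open-source tool:

import numpy as np

def candidate_regions(points, table_z, tol=0.01, cell=0.05, min_pts=50):
    """Illustrative sketch: keep points above the tabletop plane and bin them
    into a coarse x/y grid; dense cells become candidate interaction regions.
    All thresholds are assumed values, not taken from the dissertation."""
    above = points[points[:, 2] > table_z + tol]  # discard the table surface
    keys = map(tuple, np.floor(above[:, :2] / cell).astype(int))
    regions = {}
    for key, pt in zip(keys, above):
        regions.setdefault(key, []).append(pt)
    return [np.array(p) for p in regions.values() if len(p) >= min_pts]

Restricting subsequent processing to these dense regions is one plausible way to realize the computational savings the abstract describes.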
KW - Mensch-Maschine-Kommunikation KW - Human-machine communication Y1 - 2023 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20230704-64065 ER - TY - JOUR A1 - Meiabadi, Mohammad Saleh A1 - Moradi, Mahmoud A1 - Karamimoghadam, Mojtaba A1 - Ardabili, Sina A1 - Bodaghi, Mahdi A1 - Shokri, Manouchehr A1 - Mosavi, Amir Hosein T1 - Modeling the Producibility of 3D Printing in Polylactic Acid Using Artificial Neural Networks and Fused Filament Fabrication JF - Polymers N2 - Polylactic acid (PLA) is a highly applicable material that is used in 3D printers due to some significant features such as its deformation property and affordable cost. To improve the end-use quality, it is of significant importance to enhance the quality of fused filament fabrication (FFF)-printed objects in PLA. The purpose of this investigation was to boost toughness and to reduce the production cost of FFF-printed tensile test samples with the desired part thickness. To remove the need for numerous trial printing samples, the response surface method (RSM) was used. A statistical analysis was performed to deal with this concern by considering extruder temperature (ET), infill percentage (IP), and layer thickness (LT) as controlled factors. The artificial intelligence methods of artificial neural networks (ANN) and the ANN-genetic algorithm (ANN-GA) were further developed to estimate the toughness, part thickness, and production-cost-dependent variables. Results were evaluated by correlation coefficient and RMSE values. According to the modeling results, ANN-GA as a hybrid machine learning (ML) technique could enhance the accuracy of modeling by about 7.5, 11.5, and 4.5% for toughness, part thickness, and production cost, respectively, in comparison with the single ANN method. On the other hand, the optimization results confirm that the optimized specimen is cost-effective and able to comparatively undergo deformation, which enables the usability of printed PLA objects. KW - 3D-Druck KW - Polymere KW - Maschinelles Lernen KW - 3D printing KW - machine learning KW - fused filament fabrication KW - OA-Publikationsfonds2021 Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20220110-45518 UR - https://www.mdpi.com/2073-4360/13/19/3219 VL - 2021 IS - Volume 13, issue 19, article 3219 SP - 1 EP - 21 PB - MDPI CY - Basel ER - TY - JOUR A1 - Schwenke, Nicolas A1 - Söbke, Heinrich A1 - Kraft, Eckhard T1 - Potentials and Challenges of Chatbot-Supported Thesis Writing: An Autoethnography JF - Trends in Higher Education N2 - The release of the large language model-based chatbot ChatGPT 3.5 in November 2022 has brought considerable attention to the subject of artificial intelligence, not only among the public. From the perspective of higher education, ChatGPT challenges various learning and assessment formats, as it significantly reduces the effectiveness of their learning and assessment functionalities. In particular, ChatGPT might be applied to formats that require learners to generate text, such as bachelor's theses or student research papers. Accordingly, the research question arises to what extent the writing of bachelor's theses is still a valid learning and assessment format. Correspondingly, in this exploratory study, the first author was asked to write his bachelor's thesis exploiting ChatGPT. To trace the impact of ChatGPT methodically, an autoethnographic approach was used.
First, all considerations on the potential use of ChatGPT were documented in logs, and second, all ChatGPT chats were logged. Both logs and chat histories were analyzed and are presented along with recommendations for students regarding the use of ChatGPT, structured by a common framework. In conclusion, ChatGPT is beneficial for thesis writing during various activities, such as brainstorming, structuring, and text revision. However, limitations arise, e.g., in referencing. Thus, ChatGPT requires continuous validation of the generated outcomes and thereby fosters learning. Currently, ChatGPT is valued as a beneficial tool in thesis writing. However, writing a conclusive thesis still requires the learner's meaningful engagement. Accordingly, writing a thesis is still a valid learning and assessment format. With further releases of ChatGPT, an increase in capabilities is to be expected, and the research question needs to be reevaluated from time to time. KW - Chatbot KW - Künstliche Intelligenz KW - Hochschulbildung KW - AIEd KW - artificial intelligence KW - academic writing KW - ChatGPT KW - education Y1 - 2023 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20231207-65016 UR - https://www.mdpi.com/2813-4346/2/4/37 VL - 2023 IS - Volume 2, issue 4 SP - 611 EP - 635 PB - MDPI CY - Basel ER - TY - THES A1 - Al Khatib, Khalid T1 - Computational Analysis of Argumentation Strategies N2 - The computational analysis of argumentation strategies is substantial for many downstream applications. It is required for nearly all kinds of text synthesis, writing assistance, and dialogue-management tools. While various tasks have been tackled in the area of computational argumentation, such as argumentation mining and quality assessment, the task of the computational analysis of argumentation strategies in texts has so far been overlooked. This thesis principally approaches the analysis of the strategies manifested in persuasive argumentative discourses, which aim at persuasion, as well as in deliberative argumentative discourses, which aim at consensus. To this end, the thesis presents a novel view of argumentation strategies for these two goals. Based on this view, new models for pragmatic and stylistic argument attributes are proposed, new methods for the identification of the modelled attributes are developed, and a new set of strategy principles in texts according to the identified attributes is presented and explored. Overall, the thesis contributes to the theory, data, method, and evaluation aspects of the analysis of argumentation strategies. The models, methods, and principles developed and explored in this thesis can be regarded as essential for promoting the applications mentioned above, among others.
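The abstract does not spell out the modelled pragmatic and stylistic attributes; purely as a toy illustration of what a stylistic attribute feature can look like, consider this hedging-density measure in Python (the word list, function name, and scoring are invented placeholders, not the thesis's models):

# Toy stylistic-attribute feature: hedging density of an argumentative text.
# The word list and the scoring are invented placeholders.
HEDGES = {"may", "might", "perhaps", "possibly", "arguably", "likely", "seems"}

def hedging_density(text: str) -> float:
    tokens = text.lower().split()
    return sum(t.strip(".,;:!?") in HEDGES for t in tokens) / max(len(tokens), 1)

print(hedging_density("This policy might arguably reduce costs."))  # ~0.33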
KW - Argumentation KW - Natürliche Sprache KW - Argumentation Strategies KW - Sprachverarbeitung KW - Natural Language Processing KW - Computational Argumentation Y1 - 2021 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20210719-44612 ER - TY - JOUR A1 - Ahmadi, Mohammad Hossein A1 - Baghban, Alireza A1 - Sadeghzadeh, Milad A1 - Zamen, Mohammad A1 - Mosavi, Amir A1 - Shamshirband, Shahaboddin A1 - Kumar, Ravinder A1 - Mohammadi-Khanaposhtani, Mohammad T1 - Evaluation of electrical efficiency of photovoltaic thermal solar collector JF - Engineering Applications of Computational Fluid Mechanics N2 - In this study, the machine learning methods of artificial neural networks (ANNs), least squares support vector machines (LSSVM), and neuro-fuzzy systems are used for advancing prediction models for the thermal performance of a photovoltaic-thermal solar collector (PV/T). In the proposed models, the inlet temperature, flow rate, heat, solar radiation, and sun heat have been considered as the input variables. The data set has been extracted through experimental measurements from a novel solar collector system. Different analyses are performed to examine the credibility of the introduced models and evaluate their performances. The proposed LSSVM model outperformed the ANFIS and ANN models. The LSSVM model is reported to be suitable when laboratory measurements are costly and time-consuming, or when obtaining such values requires sophisticated interpretation. KW - Fotovoltaik KW - Erneuerbare Energien KW - Solar KW - Deep learning KW - Machine learning KW - Renewable energy KW - neural networks (NNs) KW - adaptive neuro-fuzzy inference system (ANFIS) KW - least square support vector machine (LSSVM) KW - photovoltaic-thermal (PV/T) KW - hybrid machine learning model KW - OA-Publikationsfonds2020 Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200304-41049 UR - https://www.tandfonline.com/doi/full/10.1080/19942060.2020.1734094 VL - 2020 IS - volume 14, issue 1 SP - 545 EP - 565 PB - Taylor & Francis ER - TY - JOUR A1 - Mosavi, Amir A1 - Shamshirband, Shahaboddin A1 - Esmaeilbeiki, Fatemeh A1 - Zarehaghi, Davoud A1 - Neyshabouri, Mohammadreza A1 - Samadianfard, Saeed A1 - Ghorbani, Mohammad Ali A1 - Nabipour, Narjes A1 - Chau, Kwok-Wing T1 - Comparative analysis of hybrid models of firefly optimization algorithm with support vector machines and multilayer perceptron for predicting soil temperature at different depths JF - Engineering Applications of Computational Fluid Mechanics N2 - This research aims to model soil temperature (ST) using the machine learning models of the multilayer perceptron (MLP) algorithm and support vector machine (SVM) in hybrid form with the firefly optimization algorithm, i.e. MLP-FFA and SVM-FFA. In the current study, measured ST and meteorological parameters of the Tabriz and Ahar weather stations in the period 2013-2015 are used for training and testing of the studied models, with delays of one and two days. To ascertain conclusive results for validation of the proposed hybrid models, the error metrics are benchmarked in an independent testing period. Moreover, Taylor diagrams are utilized for that purpose. The obtained results showed that, in the case of a one-day delay, except in predicting ST at 5 cm below the soil surface (ST5cm) at Tabriz station, MLP-FFA produced superior results compared with the MLP, SVM, and SVM-FFA models.
However, for a two-day delay, MLP-FFA indicated increased accuracy in predicting ST5cm and ST20cm at Tabriz station and ST10cm at Ahar station in comparison with SVM-FFA. Additionally, for all of the prescribed models, the performance of the MLP-FFA and SVM-FFA hybrid models in the testing phase was found to be meaningfully superior to that of the classical MLP and SVM models. KW - Bodentemperatur KW - Algorithmus KW - Maschinelles Lernen KW - Neuronales Netz KW - firefly optimization algorithm KW - soil temperature KW - artificial neural networks KW - hybrid machine learning KW - OA-Publikationsfonds2019 Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200911-42347 UR - https://www.tandfonline.com/doi/full/10.1080/19942060.2020.1788644 VL - 2020 IS - Volume 14, Issue 1 SP - 939 EP - 953 ER - TY - JOUR A1 - Homaei, Mohammad Hossein A1 - Soleimani, Faezeh A1 - Shamshirband, Shahaboddin A1 - Mosavi, Amir A1 - Nabipour, Narjes A1 - Varkonyi-Koczy, Annamaria R. T1 - An Enhanced Distributed Congestion Control Method for Classical 6LowPAN Protocols Using Fuzzy Decision System JF - IEEE Access N2 - Classical Internet of Things routing and wireless sensor networks can provide more precise monitoring of the covered area due to the higher number of utilized nodes. Because of the limitations of shared transfer media, many nodes in the network are prone to collision in simultaneous transmissions. Medium access control protocols are usually more practical in networks with low traffic, which are not subjected to external noise from adjacent frequencies. There are preventive, detection, and control solutions to congestion management in the network, all of which are the focus of this study. In the congestion prevention phase, the proposed method chooses the next step of the path using a fuzzy decision-making system to distribute network traffic via optimal paths. In the congestion detection phase, a dynamic approach to queue management was designed to detect congestion in the least amount of time and prevent collision. In the congestion control phase, the back-pressure method was used based on the quality of the queue to decrease the probability of linking into the pathway from the pre-congested node. The main goals of this study are to balance energy consumption across network nodes, to reduce the rate of lost packets, and to increase the quality of service in routing. Simulation results proved that the proposed Congestion Control Fuzzy Decision Making (CCFDM) method was more capable of improving routing parameters than recent algorithms. KW - Internet der Dinge KW - IOT KW - Internet of things KW - wireless sensor network KW - congestion control KW - fuzzy decision making KW - back-pressure Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200213-40805 UR - https://ieeexplore.ieee.org/document/8967114 IS - volume 8 SP - 20628 EP - 20645 PB - IEEE ER - TY - JOUR A1 - Hassannataj Joloudari, Javad A1 - Hassannataj Joloudari, Edris A1 - Saadatfar, Hamid A1 - GhasemiGol, Mohammad A1 - Razavi, Seyyed Mohammad A1 - Mosavi, Amir A1 - Nabipour, Narjes A1 - Shamshirband, Shahaboddin A1 - Nadai, Laszlo T1 - Coronary Artery Disease Diagnosis: Ranking the Significant Features Using a Random Trees Model JF - International Journal of Environmental Research and Public Health, IJERPH N2 - Heart disease is one of the most common diseases in middle-aged citizens.
Among the vast number of heart diseases, coronary artery disease (CAD) is considered a common cardiovascular disease with a high death rate. The most popular tool for diagnosing CAD is medical imaging, e.g., angiography. However, angiography is known for being costly and is also associated with a number of side effects. Hence, the purpose of this study is to increase the accuracy of coronary heart disease diagnosis by selecting significant predictive features in order of their ranking. In this study, we propose an integrated method using machine learning. The machine learning methods of random trees (RTs), the C5.0 decision tree, support vector machine (SVM), and the decision tree of Chi-squared automatic interaction detection (CHAID) are used in this study. The proposed method shows promising results, and the study confirms that the RTs model outperforms the other models. KW - Maschinelles Lernen KW - Machine learning KW - Deep learning KW - coronary artery disease KW - heart disease diagnosis KW - health informatics KW - data science KW - big data KW - predictive model KW - ensemble model KW - random forest KW - industry 4.0 Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200213-40819 UR - https://www.mdpi.com/1660-4601/17/3/731 VL - 2020 IS - Volume 17, Issue 3, 731 PB - MDPI ER - TY - JOUR A1 - Kargar, Katayoun A1 - Samadianfard, Saeed A1 - Parsa, Javad A1 - Nabipour, Narjes A1 - Shamshirband, Shahaboddin A1 - Mosavi, Amir A1 - Chau, Kwok-Wing T1 - Estimating longitudinal dispersion coefficient in natural streams using empirical models and machine learning algorithms JF - Engineering Applications of Computational Fluid Mechanics N2 - The longitudinal dispersion coefficient (LDC) plays an important role in modeling the transport of pollutants and sediment in natural rivers. As a result of transportation processes, the concentration of pollutants changes along the river. Various studies have been conducted to provide simple equations for estimating the LDC. In this study, machine learning methods, namely support vector regression, Gaussian process regression, M5 model tree (M5P) and random forest, as well as multiple linear regression, were examined for predicting the LDC in natural streams. Data sets from 60 rivers around the world with different hydraulic and geometric features were gathered to develop models for LDC estimation. Statistical criteria, including the correlation coefficient (CC), root mean squared error (RMSE) and mean absolute error (MAE), were used to scrutinize the models. The LDC values estimated by these models were compared with the corresponding results of common empirical models. The Taylor chart was used to evaluate the models, and the results showed that among the machine learning models, M5P had superior performance, with a CC of 0.823, an RMSE of 454.9 and an MAE of 380.9. The model of Sahay and Dutta, with a CC of 0.795, an RMSE of 460.7 and an MAE of 306.1, gave more precise results than the other empirical models. The main advantage of M5P models is their ability to provide practical formulae. In conclusion, the results proved that the developed M5P model with simple formulations was superior to the other machine learning models and the empirical models; therefore, it can be used as a proper tool for estimating the LDC in rivers.
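For readers who want to reproduce the ranking logic of comparative studies like the one above, the three statistical criteria reduce to a few lines of Python under their usual definitions (a generic sketch, not the authors' code):

import numpy as np

def cc_rmse_mae(observed, predicted):
    """Correlation coefficient, root mean squared error and mean absolute
    error - the criteria used above to rank the LDC models (generic sketch)."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    cc = np.corrcoef(o, p)[0, 1]
    rmse = np.sqrt(np.mean((o - p) ** 2))
    mae = np.mean(np.abs(o - p))
    return cc, rmse, mae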
KW - Maschinelles Lernen KW - Gaussian process regression KW - longitudinal dispersion coefficient KW - M5 model tree KW - random forest KW - support vector regression KW - rivers Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200128-40775 UR - https://www.tandfonline.com/doi/full/10.1080/19942060.2020.1712260 VL - 2020 IS - Volume 14, No. 1 SP - 311 EP - 322 PB - Taylor & Francis ER - TY - JOUR A1 - Dehghani, Majid A1 - Salehi, Somayeh A1 - Mosavi, Amir A1 - Nabipour, Narjes A1 - Shamshirband, Shahaboddin A1 - Ghamisi, Pedram T1 - Spatial Analysis of Seasonal Precipitation over Iran: Co-Variation with Climate Indices JF - ISPRS, International Journal of Geo-Information N2 - Temporary changes in precipitation may lead to sustained and severe drought or massive floods in different parts of the world. Knowing the variation in precipitation can effectively help decision-makers in water resources management. Large-scale circulation drivers have a considerable impact on precipitation in different parts of the world. In this research, the impact of the El Niño-Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO), and the North Atlantic Oscillation (NAO) on seasonal precipitation over Iran was investigated. For this purpose, 103 synoptic stations with at least 30 years of data were utilized. The Spearman correlation coefficient between the indices of the previous 12 months and seasonal precipitation was calculated, and the meaningful correlations were extracted. Then, the month in which each of these indices has the highest correlation with seasonal precipitation was determined. Finally, the overall amount of increase or decrease in seasonal precipitation due to each of these indices was calculated. Results indicate that the Southern Oscillation Index (SOI), NAO, and PDO have the greatest impact on seasonal precipitation, in that order. Additionally, these indices have the highest impact on precipitation in winter, autumn, spring, and summer, respectively. SOI affects winter precipitation differently than the PDO and NAO, while in the other seasons each index has its own specific impact on seasonal precipitation. Generally, all indices in different phases may decrease seasonal precipitation by up to 100%. However, seasonal precipitation may also increase by more than 100% in different seasons due to the impact of these indices. The results of this study can be used effectively in water resources management and especially in dam operation. KW - Maschinelles Lernen KW - Machine learning KW - spatiotemporal database KW - spatial analysis KW - seasonal precipitation KW - spearman correlation coefficient Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200128-40740 UR - https://www.mdpi.com/2220-9964/9/2/73 VL - 2020 IS - Volume 9, Issue 2, 73 PB - MDPI ER - TY - JOUR A1 - Saqlai, Syed Muhammad A1 - Ghani, Anwar A1 - Khan, Imran A1 - Ahmed Khan Ghayyur, Shahbaz A1 - Shamshirband, Shahaboddin A1 - Nabipour, Narjes A1 - Shokri, Manouchehr T1 - Image Analysis Using Human Body Geometry and Size Proportion Science for Action Classification JF - Applied Sciences N2 - Gestures are one of the basic modes of human communication and are usually used to represent different actions. Automatic recognition of these actions forms the basis for solving more complex problems such as human behavior analysis, video surveillance, event detection, and sign language recognition.
Action recognition from images is a challenging task, as key information such as temporal data, object trajectories, and optical flow is not available in still images. Measuring the size of different regions of the human body, i.e., step size, arm span, and the lengths of the arm, forearm, and hand, however, provides valuable clues for the identification of human actions. In this article, a framework for the classification of human actions is presented in which humans are detected and localized through faster region-based convolutional neural networks, followed by morphological image processing techniques. Furthermore, geometric features are extracted from the human blob and incorporated into classification rules for six human actions, i.e., standing, walking, single-hand side wave, single-hand top wave, both hands side wave, and both hands top wave. The performance of the proposed technique has been evaluated using precision, recall, omission error, and commission error. The proposed technique has been comparatively analyzed in terms of overall accuracy with existing approaches, showing that it performs well in contrast to its counterparts. KW - Bildanalyse KW - Mensch KW - Größenverhältnis KW - Geometrie KW - Körper KW - action recognition KW - rule based classification KW - human body proportions KW - human blob KW - OA-Publikationsfonds2020 Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200904-42322 UR - https://www.mdpi.com/2076-3417/10/16/5453 VL - 2020 IS - volume 10, issue 16, article 5453 PB - MDPI CY - Basel ER - TY - THES A1 - Fröhlich, Jan T1 - On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis N2 - In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has been increasing in recent years. Automated image-based inspections of the structural surface provide valuable data for extracting quantitative information about deteriorations, such as crack patterns. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components. This way, transformation to structural analysis is enabled. This approach sets two further requirements: availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype thus allows interaction at subsequent model stages. The latter has the advantage that interpretation steps can be individually separated into either a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.
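The workflow above hinges on relating damage artifacts to IFC building components; a minimal sketch of that lookup step with the open-source ifcopenshell library might look as follows (the damage record and its schema are hypothetical illustrations, not the thesis's data model):

import ifcopenshell  # open-source IFC parser; the model path is an assumption

def building_components(ifc_path):
    """Index all building elements of an IFC model by GlobalId so that damage
    artifacts (e.g., detected cracks) can be related to them."""
    model = ifcopenshell.open(ifc_path)
    return {e.GlobalId: e.is_a() for e in model.by_type("IfcBuildingElement")}

# Hypothetical damage record linking one detected crack to one component:
damage_record = {"artifact": "crack_007", "component": None}  # set via lookup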
KW - Brückenbau KW - Strukturanalyse KW - Schadensanalyse KW - BIM KW - IFC KW - bridge inspection KW - damage information model KW - structural analysis Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200416-41310 ER - TY - THES A1 - Lang, Kevin T1 - Worteinbettung als semantisches Feature in der argumentativen Analyse N2 - This thesis deals with the use of word embeddings in the automatic analysis of argumentative texts. It discusses important settings of the embedding procedure as well as various methods of applying the embedded word vectors to three tasks of automatic argumentation analysis: text segmentation, argumentativeness classification, and relation identification. My experiments on two standard argumentation data sets show the following main findings: no improvements could be achieved for text segmentation, while small gains were obtained for argumentativeness classification and relation identification, and certain specific research hypotheses could be confirmed. The discussion addresses why plain word embeddings hardly yield usable results in argumentation analysis and how this may improve in the future through extended word embedding procedures. KW - Argumentation KW - Worteinbettung Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20190617-39343 ER - TY - JOUR A1 - Harirchian, Ehsan A1 - Lahmer, Tom A1 - Kumari, Vandana A1 - Jadhav, Kirti T1 - Application of Support Vector Machine Modeling for the Rapid Seismic Hazard Safety Evaluation of Existing Buildings JF - Energies N2 - The economic losses from earthquakes tend to hit the national economy considerably; therefore, models capable of estimating the vulnerability and losses of future earthquakes are highly consequential for emergency planners with the purpose of risk mitigation. This demands a mass prioritization filtering of structures to identify vulnerable buildings for retrofitting purposes. The application of advanced structural analysis to each building to study the earthquake response is impractical due to complex calculations, long computational time, and exorbitant cost. This exhibits the need for a fast and reliable method, commonly known as Rapid Visual Screening (RVS). The method serves as a preliminary screening platform, using an optimum number of seismic parameters of the structure and predefined output damage states. In this study, the efficacy of applying Machine Learning (ML) to damage prediction has been investigated, with a Support Vector Machine (SVM) model as the damage classification technique. The developed model was trained and examined based on damage data from the 1999 Düzce Earthquake in Turkey, where the building data consist of 22 performance modifiers that have been implemented with supervised machine learning.
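To make the SVM classification step concrete, a minimal scikit-learn sketch is shown below, with synthetic data standing in for the Düzce survey; the feature scaling and hyperparameters are assumptions, not the study's published pipeline:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 22))    # 22 performance modifiers per building
y = rng.integers(0, 3, size=200)  # synthetic damage states (0=low..2=severe)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)).fit(X, y)
print(clf.predict(X[:5]))         # predicted damage state per building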
KW - Erdbeben KW - Maschinelles Lernen KW - earthquake vulnerability assessment KW - rapid visual screening KW - machine learning KW - support vector machine KW - buildings KW - OA-Publikationsfonds2020 Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200707-41915 UR - https://www.mdpi.com/1996-1073/13/13/3340 VL - 2020 IS - volume 13, issue 13, 3340 PB - MDPI CY - Basel ER - TY - JOUR A1 - Kavrakov, Igor A1 - Kareem, Ahsan A1 - Morgenthal, Guido T1 - Comparison Metrics for Time-histories: Application to Bridge Aerodynamics N2 - Wind effects can be critical for the design of lifelines such as long-span bridges. The existence of a significant number of aerodynamic force models, used to assess the performance of bridges, poses an important question regarding their comparison and validation. This study utilizes a unified set of metrics for a quantitative comparison of time-histories in bridge aerodynamics with a host of characteristics. Accordingly, nine comparison metrics are included to quantify the discrepancies in local and global signal features such as phase, time-varying frequency and magnitude content, probability density, nonstationarity and nonlinearity. Among these, seven metrics available in the literature are introduced after recasting them for time-histories associated with bridge aerodynamics. Two additional metrics are established to overcome the shortcomings of the existing metrics. The performance of the comparison metrics is first assessed using generic signals with prescribed signal features. Subsequently, the metrics are applied to a practical example from bridge aerodynamics to quantify the discrepancies in the aerodynamic forces and response based on numerical and semi-analytical aerodynamic models. In this context, it is demonstrated how a discussion based on the set of comparison metrics presented here can aid a model evaluation by offering deeper insight. The outcome of the study is intended to provide a framework for quantitative comparison and validation of aerodynamic models based on the underlying physics of fluid-structure interaction. Immediate further applications are expected for the comparison of time-histories that are simulated by data-driven approaches. KW - Ingenieurwissenschaften KW - Aerodynamik KW - Brücke KW - Bridge Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200625-41863 UR - https://ascelibrary.org/doi/10.1061/%28ASCE%29EM.1943-7889.0001811 N1 - This material may be downloaded for personal use only. Any other use requires prior permission of the American Society of Civil Engineers. This material may be found at https://ascelibrary.org/doi/10.1061/%28ASCE%29EM.1943-7889.0001811. N1 - This is the final draft of the following article: https://ascelibrary.org/doi/10.1061/%28ASCE%29EM.1943-7889.0001811, which has been published in final form at https://doi.org/10.1061/(ASCE)EM.1943-7889.0001811 ER - TY - JOUR A1 - Shabani, Sevda A1 - Samadianfard, Saeed A1 - Sattari, Mohammad Taghi A1 - Mosavi, Amir A1 - Shamshirband, Shahaboddin A1 - Kmet, Tibor A1 - Várkonyi-Kóczy, Annamária R. T1 - Modeling Pan Evaporation Using Gaussian Process Regression K-Nearest Neighbors Random Forest and Support Vector Machines; Comparative Analysis JF - Atmosphere N2 - Evaporation is a very important process; it is one of the most critical factors in agricultural, hydrological, and meteorological studies. 
Due to the interactions of multiple climatic factors, evaporation is considered a complex and nonlinear phenomenon to model. Thus, machine learning methods have gained popularity in this realm. In the present study, the four machine learning methods of Gaussian Process Regression (GPR), K-Nearest Neighbors (KNN), Random Forest (RF) and Support Vector Regression (SVR) were used to predict pan evaporation (PE). Meteorological data including PE, temperature (T), relative humidity (RH), wind speed (W), and sunny hours (S) were collected from 2011 through 2017. The accuracy of the studied methods was determined using the statistical indices of Root Mean Squared Error (RMSE), correlation coefficient (R) and Mean Absolute Error (MAE). Furthermore, Taylor charts were utilized for evaluating the accuracy of the mentioned models. The results of this study showed that at the Gonbad-e Kavus, Gorgan and Bandar Torkman stations, GPR with RMSE of 1.521 mm/day, 1.244 mm/day, and 1.254 mm/day, KNN with RMSE of 1.991 mm/day, 1.775 mm/day, and 1.577 mm/day, RF with RMSE of 1.614 mm/day, 1.337 mm/day, and 1.316 mm/day, and SVR with RMSE of 1.55 mm/day, 1.262 mm/day, and 1.275 mm/day had more appropriate performances in estimating PE values. It was found that GPR for Gonbad-e Kavus Station with input parameters of T, W and S, and GPR for the Gorgan and Bandar Torkmen stations with input parameters of T, RH, W and S, had the most accurate predictions and were proposed for precise estimation of PE. The findings of the current study indicated that PE values may be accurately estimated with a few easily measured meteorological parameters. KW - Maschinelles Lernen KW - Machine learning KW - Deep learning Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200110-40561 UR - https://www.mdpi.com/2073-4433/11/1/66 VL - 2020 IS - Volume 11, Issue 1, 66 ER - TY - JOUR A1 - Abbaspour-Gilandeh, Yousef A1 - Molaee, Amir A1 - Sabzi, Sajad A1 - Nabipour, Narjes A1 - Shamshirband, Shahaboddin A1 - Mosavi, Amir T1 - A Combined Method of Image Processing and Artificial Neural Network for the Identification of 13 Iranian Rice Cultivars JF - Agronomy N2 - Due to the importance of identifying crop cultivars, the advancement of accurate assessment of cultivars is considered essential. The existing methods for identifying rice cultivars are mainly time-consuming, costly, and destructive. Therefore, the development of novel methods is highly beneficial. The aim of the present research is to classify common rice cultivars in Iran based on color, morphologic, and texture properties using artificial intelligence (AI) methods. In doing so, digital images of 13 rice cultivars in Iran in three forms of paddy, brown, and white rice are analyzed through pre-processing and segmentation using MATLAB. Ninety-two features, including 60 color, 14 morphologic, and 18 texture properties, were identified for each rice cultivar. In the next step, the normal distribution of the data was evaluated, and the possibility of observing a significant difference between all features of the cultivars was studied using analysis of variance. In addition, the least significant difference (LSD) test was performed to obtain a more accurate comparison between cultivars. To reduce data dimensions and focus on the most effective components, principal component analysis (PCA) was employed.
Accordingly, the accuracy of rice cultivar separation was calculated for paddy, brown rice, and white rice using discriminant analysis (DA), yielding 89.2%, 87.7%, and 83.1%, respectively. To identify and classify the desired cultivars, a multilayer perceptron neural network was implemented based on the most effective components. The results showed 100% accuracy of the network in identifying and classifying all mentioned rice cultivars. Hence, it is concluded that the integrated method of image processing and pattern recognition methods, such as statistical classification and artificial neural networks, can be used for the identification and classification of rice cultivars. KW - Maschinelles Lernen KW - Machine learning KW - food informatics KW - big data KW - artificial neural networks KW - artificial intelligence KW - image processing KW - rice Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200123-40695 UR - https://www.mdpi.com/2073-4395/10/1/117 VL - 2020 IS - Volume 10, Issue 1, 117 PB - MDPI ER - TY - THES A1 - Theiler, Michael T1 - Interaktive Visualisierung von Qualitätsdefiziten komplexer Bauwerksinformationsmodelle auf Basis der Industry Foundation Classes (IFC) in einer webbasierten Umgebung N2 - With the steadily growing use of building information models (BIM) for different use cases, quality assurance of their content is of great importance. It has to be carried out for every software application involved in the data exchange, in accordance with the project goal. With the Industry Foundation Classes (IFC), an established format for describing and exchanging such a model is available. For the quality assurance process, a server-based test environment will become part of the new IFC certification procedure. To this end, a Global Testing Documentation Server (GTDS) was implemented by the "iabi - Institut für angewandte Bauinformatik" in cooperation with "buildingSMART e.V." (http://www.buildingsmart.de). The GTDS is a database-backed web application that pursues the following intentions: • providing a tool for the qualitative testing of IFC-based models • supporting the communication between IFC developers and users • documenting the quality of IFC-based software applications • providing a platform for the certification of IFC applications. The subject of this thesis is the design and exemplary implementation of a tool for the interactive visualization of quality deficits detected in the model by the GTDS. The exemplary implementation is to build on the OPEN IFC TOOLS (http://www.openifctools.org). KW - BIM KW - IFC KW - GTDS KW - Visualisierung Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20121214-17869 ER - TY - JOUR A1 - Faroughi, Maryam A1 - Karimimoshaver, Mehrdad A1 - Aram, Farshid A1 - Solgi, Ebrahim A1 - Mosavi, Amir A1 - Nabipour, Narjes A1 - Chau, Kwok-Wing T1 - Computational modeling of land surface temperature using remote sensing data to investigate the spatial arrangement of buildings and energy consumption relationship JF - Engineering Applications of Computational Fluid Mechanics N2 - The effect of urban form on energy consumption has been the subject of various studies around the world.
Having examined the effect of buildings on energy consumption, these studies indicate that the physical form of a city has a notable impact on the amount of energy consumed in its spaces. The present study identified the variables that affect energy consumption in residential buildings and analyzed their effects on energy consumption in four neighborhoods in Tehran: Apadana, Bimeh, Ekbatan-phase I, and Ekbatan-phase II. After extracting the variables, their effects are estimated with statistical methods, and the results are compared with the land surface temperature (LST) remote sensing data derived from Landsat 8 satellite images taken in the winter of 2019. The results showed that physical variables, such as the size of buildings, population density, vegetation cover, texture concentration, and surface color, have the greatest impacts on energy usage. For the Apadana neighborhood, the factors with the strongest effect on energy consumption were found to be the size of buildings and the population density. However, for the other neighborhoods, in addition to these two factors, a third factor was also recognized to have a significant effect on energy consumption. This third factor for the Bimeh, Ekbatan-I, and Ekbatan-II neighborhoods was the type of buildings, texture concentration, and orientation of buildings, respectively. KW - Fernerkundung KW - Intelligente Stadt KW - Oberflächentemperatur KW - remote sensing KW - smart cities KW - land surface temperature KW - energy consumption KW - residential buildings KW - urban morphology KW - urban sustainability Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200110-40585 UR - https://www.tandfonline.com/doi/full/10.1080/19942060.2019.1707711 VL - 2020 IS - Volume 14, No. 1 SP - 254 EP - 270 PB - Taylor & Francis ER - TY - JOUR A1 - Nabipour, Narjes A1 - Mosavi, Amir A1 - Baghban, Alireza A1 - Shamshirband, Shahaboddin A1 - Felde, Imre T1 - Extreme Learning Machine-Based Model for Solubility Estimation of Hydrocarbon Gases in Electrolyte Solutions JF - Processes N2 - Calculating the solubility of the hydrocarbon components of natural gases is known as one of the important issues in operational work in petroleum and chemical engineering. In this work, a novel solubility estimation tool has been proposed for hydrocarbon gases, including methane, ethane, propane, and butane, in aqueous electrolyte solutions, based on the extreme learning machine (ELM) algorithm. Comparing the ELM outputs with a comprehensive real databank of 1175 solubility points yielded R-squared values of 0.985 and 0.987 for the training and testing phases, respectively. Furthermore, the visual comparison of estimated and actual hydrocarbon solubility confirmed the ability of the proposed solubility model. Additionally, a sensitivity analysis has been employed on the input variables of the model to identify their impacts on hydrocarbon solubility. Such a comprehensive and reliable study can help engineers and scientists to successfully determine the important thermodynamic properties, which are key factors in optimizing and designing different industrial units such as refineries and petrochemical plants.
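The core of an extreme learning machine is compact enough to sketch directly: a fixed random hidden layer and a least-squares readout. The following generic NumPy sketch is not the authors' implementation, and the hidden-layer size is an arbitrary assumption:

import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Train an ELM: random input weights stay fixed; only the linear
    readout beta is fitted in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form readout
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

Because only the readout is trained, fitting reduces to a single linear solve, which is what makes ELMs attractive for fast regression on tabular data such as solubility points.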
KW - Maschinelles Lernen KW - Machine learning KW - Deep learning Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200113-40624 UR - https://www.mdpi.com/2227-9717/8/1/92 VL - 2020 IS - Volume 8, Issue 1, 92 PB - MDPI ER - TY - INPR A1 - Steiner, Maria A1 - Bourinet, Jean-Marc A1 - Lahmer, Tom T1 - An adaptive sampling method for global sensitivity analysis based on least-squares support vector regression N2 - In the field of engineering, surrogate models are commonly used for approximating the behavior of a physical phenomenon in order to reduce the computational costs. Generally, a surrogate model is created based on a set of training data, where a typical method for the statistical design is Latin hypercube sampling (LHS). Even though a space-filling distribution of the training data is reached, the sampling process takes no information on the underlying behavior of the physical phenomenon into account, and new data cannot be sampled in the same distribution if the approximation quality is not sufficient. Therefore, in this study we present a novel adaptive sampling method based on a specific surrogate model, the least-squares support vector regression. The adaptive sampling method generates training data based on the uncertainty in the local prognosis capabilities of the surrogate model - areas of higher uncertainty require more sample data. The approach offers a cost-efficient calculation due to the properties of the least-squares support vector regression. The advantages of the adaptive sampling method are demonstrated in comparison with LHS on different analytical examples. Furthermore, the adaptive sampling method is applied to the calculation of global sensitivity values according to Sobol, where it shows faster convergence than the LHS method. With the applications in this paper it is shown that the presented adaptive sampling method improves the estimation of global sensitivity values, hence reducing the overall computational costs visibly. KW - Approximation KW - Sensitivitätsanalyse KW - Abtastung KW - Surrogate models KW - Least-squares support vector regression KW - Adaptive sampling method KW - Global sensitivity analysis KW - Sampling Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20181218-38320 N1 - This is the pre-peer reviewed version of the following article: https://www.sciencedirect.com/science/article/pii/S0951832017311808, which has been published in final form at https://doi.org/10.1016/j.ress.2018.11.015. SP - 1 EP - 33 ER - TY - INPR A1 - Rezakazemi, Mashallah A1 - Mosavi, Amir A1 - Shirazian, Saeed T1 - ANFIS pattern for molecular membranes separation optimization N2 - In this work, the molecular separation of aqueous-organic systems was simulated using combined soft-computing and mechanistic approaches. The considered separation system was a microporous membrane contactor for the separation of benzoic acid from water by contact with an organic phase containing extractor molecules. Indeed, extractive separation is carried out using membrane technology, where a solute-organic complex is formed at the interface. The main focus was to develop a simulation methodology for predicting the concentration distribution of the solute (benzoic acid) on the feed side of the membrane system, as the removal efficiency of the system is determined by the concentration distribution of the solute in the feed channel.
The pattern of the Adaptive Neuro-Fuzzy Inference System (ANFIS) was optimized by finding the optimum membership function, learning percentage, and number of rules. The ANFIS was trained using data extracted from the CFD simulation of the membrane system. Comparisons between the concentration distribution predicted by ANFIS and the CFD data revealed that the optimized ANFIS pattern can be used as a predictive tool for simulation of the process. An R2 higher than 0.99 was obtained for the optimized ANFIS model. The main advantage of the developed methodology is its very low computational time for simulating the system; it can thus be used as a rigorous simulation tool for the understanding and design of membrane-based systems. Highlights: molecular separation using microporous membranes; development of a hybrid ANFIS-CFD model for the separation process; optimization of the ANFIS structure for prediction of the separation process. KW - Fluid KW - Simulation KW - Molecular Liquids KW - optimization KW - machine learning KW - Membrane contactors KW - CFD Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20181122-38212 N1 - This is the pre-peer reviewed version of the following article: https://www.sciencedirect.com/science/article/pii/S0167732218345008, which has been published in final form at https://doi.org/10.1016/j.molliq.2018.11.017. VL - 2018 SP - 1 EP - 20 ER - TY - INPR A1 - Kavrakov, Igor A1 - Morgenthal, Guido T1 - A synergistic study of a CFD and semi-analytical models for aeroelastic analysis of bridges in turbulent wind conditions N2 - Long-span bridges are prone to wind-induced vibrations. Therefore, a reliable representation of the aerodynamic forces acting on a bridge deck is of major significance for the design of such structures. This paper presents a systematic study of the two-dimensional (2D) fluid-structure interaction of a bridge deck under smooth and turbulent wind conditions. Aerodynamic forces are modeled by two approaches: a computational fluid dynamics (CFD) model and six semi-analytical models. The vortex particle method is utilized for the CFD model, and the free-stream turbulence is introduced by seeding vortex particles upstream of the deck with prescribed spectral characteristics. The employed semi-analytical models are based on the quasi-steady and linear unsteady assumptions and on aerodynamic coefficients obtained from CFD analyses. The underlying assumptions of the semi-analytical aerodynamic models are used to interpret the results of buffeting forces and aeroelastic response due to free-stream turbulence in comparison with the CFD model. Extensive discussions are provided to analyze the effect of linear fluid memory and quasi-steady nonlinearity from a CFD perspective. The outcome of the analyses indicates that fluid memory is a governing effect in the buffeting forces and aeroelastic response, while the effect of the nonlinearity is overestimated by the quasi-steady models. Finally, flutter analyses are performed and the obtained critical velocities are further compared with wind tunnel results, followed by a brief examination of the post-flutter behavior. The results of this study provide a deeper understanding of the extent to which the applied models are able to replicate the physical processes for fluid-structure interaction phenomena in bridge aerodynamics and aeroelasticity.
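The quasi-steady assumption mentioned above amounts to evaluating static force coefficients at an instantaneous effective angle of attack. A minimal sketch of the per-unit-length quasi-steady lift in this generic textbook form follows; it is not one of the paper's six calibrated models, and the example coefficient slope is an assumption:

import numpy as np

def quasi_steady_lift(rho, U, B, c_l, alpha_eff):
    """Per-unit-length quasi-steady lift: 0.5*rho*U^2*B*C_L(alpha_eff).
    c_l is a callable static lift-coefficient curve (e.g., from CFD or wind
    tunnel); alpha_eff is the instantaneous effective angle of attack (rad)."""
    return 0.5 * rho * U**2 * B * c_l(np.asarray(alpha_eff))

# Example with a linearized coefficient curve (slope is an assumed value):
lift = quasi_steady_lift(1.25, 30.0, 31.0, lambda a: 0.1 + 5.0 * a, 0.02)

Unlike the linear unsteady models discussed in the abstract, this form has no fluid memory: the force responds instantaneously to the effective angle of attack, which is precisely the simplification the paper interrogates.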
KW - Ingenieurwissenschaften KW - Aerodynamik KW - Bridge KW - Aerodynamic nonlinearity KW - Fluid memory KW - Vortex particle method KW - Buffeting KW - Flutter Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200206-40873 N1 - This is the pre-peer reviewed version of the following article: https://www.sciencedirect.com/science/article/abs/pii/S0889974617308423?via%3Dihub, which has been published in final form at https://doi.org/10.1016/j.jfluidstructs.2018.06.013 ER - TY - INPR A1 - Abbas, Tajammal A1 - Kavrakov, Igor A1 - Morgenthal, Guido A1 - Lahmer, Tom T1 - Prediction of aeroelastic response of bridge decks using artificial neural networks N2 - The assessment of wind-induced vibrations is considered vital for the design of long-span bridges. The aim of this research is to develop a methodological framework for robust and efficient prediction strategies for complex aerodynamic phenomena using hybrid models that employ numerical analyses as well as meta-models. Here, an approach to predict motion-induced aerodynamic forces is developed using an artificial neural network (ANN). The ANN is implemented in the classical formulation and trained with a comprehensive dataset obtained from computational fluid dynamics forced-vibration simulations. The input to the ANN is the response time histories of a bridge section, whereas the output is the motion-induced forces. The developed ANN has been tested on training and test data of different cross-section geometries, which provide promising predictions. The prediction is also performed for an ambient response input with multiple frequencies. Moreover, the trained ANN for aerodynamic forcing is coupled with the structural model to perform fully coupled fluid-structure interaction analysis to determine the aeroelastic instability limit. The sensitivity of the model prediction quality and efficiency to the ANN parameters has also been highlighted. The proposed methodology has wide application in the analysis and design of long-span bridges. KW - Aerodynamik KW - Artificial neural network KW - Ingenieurwissenschaften KW - Bridge KW - Bridge aerodynamics KW - Aerodynamic derivatives KW - Motion-induced forces KW - Bridges Y1 - 2020 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20200225-40974 N1 - This is the pre-peer reviewed version of the following article: https://www.sciencedirect.com/science/article/abs/pii/S0045794920300018?via%3Dihub, https://doi.org/10.1016/j.compstruc.2020.106198 ER - TY - INPR A1 - Mosavi, Amir A1 - Torabi, Mehrnoosh A1 - Hashemi, Sattar A1 - Saybani, Mahmoud Reza A1 - Shamshirband, Shahaboddin T1 - A Hybrid Clustering and Classification Technique for Forecasting Short-Term Energy Consumption N2 - Electrical energy distributor companies in Iran have to announce their energy demand at least three days ahead of the market opening. Therefore, an accurate load estimation is highly crucial. This research invoked a methodology based on CRISP data mining and used SVM, ANN, and CBA-ANN-SVM (a novel hybrid model of clustering with both of the widely used ANN and SVM) to predict the short-term electrical energy demand of Bandarabbas. In previous studies, researchers introduced only a few effective parameters and achieved no reasonable error for Bandarabbas power consumption. In this research, we tried to identify all relevant parameters, and with the use of the CBA-ANN-SVM model, the rate of error has been minimized.
After consulting with experts in the field of power consumption and plotting daily power consumption for each week, this research showed that official holidays and weekends have an impact on power consumption. When the weather gets warmer, the consumption of electrical energy increases due to the use of electrical air conditioners. Also, consumption patterns in warm and cold months are different. Analyzing the power consumption of the same month across different years showed high similarity in power consumption patterns. Factors with a high impact on power consumption were identified, and statistical methods were utilized to prove their impacts. Using SVM, ANN and CBA-ANN-SVM, the model was built. Since the proposed method (CBA-ANN-SVM) has a low MAPE of 1.474 (4 clusters) and 1.297 (3 clusters) in comparison with SVM (MAPE = 2.015) and ANN (MAPE = 1.790), this model was selected as the final model. The final model has the benefits of both models as well as the benefits of clustering. A clustering algorithm, by discovering the data structure, divides the data into several clusters based on the similarities and differences between them. Because the data inside each cluster are more similar to each other than to the data set as a whole, modeling each cluster will present better results. For future research, we suggest using fuzzy methods, genetic algorithms, or a hybrid of both to forecast each cluster. It is also possible to use fuzzy methods or genetic algorithms or a hybrid of both without clustering. It is expected that such models will produce better and more accurate results. This paper presents a hybrid approach to predict the electric energy usage of weather-sensitive loads. The presented method utilizes the clustering paradigm along with ANN and SVM approaches for accurate short-term prediction of electric energy usage, using weather data. Since the methodology invoked in this research is based on CRISP data mining, data preparation has received a great deal of attention in this research. Once data pre-processing was done, the underlying pattern of electric energy consumption was extracted by means of machine learning methods to precisely forecast short-term energy consumption. The proposed approach (CBA-ANN-SVM) was applied to real load data, resulting in higher accuracy compared to the existing models. © 2018 American Institute of Chemical Engineers, Environ Prog, 2018, https://doi.org/10.1002/ep.12934 KW - Data Mining KW - support vector machine (SVM) KW - Machine Learning KW - forecasting KW - Prediction KW - Electric Energy Consumption KW - clustering KW - artificial neural networks (ANN) Y1 - 2018 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20180907-37550 N1 - This is the pre-peer reviewed version of the following article: https://onlinelibrary.wiley.com/doi/10.1002/ep.12934, which has been published in final form at https://doi.org/10.1002/ep.12934. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions. ER - TY - INPR A1 - Simon-Ritz, Frank A1 - Rudolf, Sylvelin T1 - Ein Schaufenster für die Kunst N2 - Art exhibitions in libraries. KW - Ausstellung KW - Kunstausstellung Y1 - 2017 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:gbv:wim2-20170711-32307 N1 - This is the pre-peer reviewed version of the following article: https://zs.thulb.uni-jena.de/receive/jportal_jparticle_01036154 VL - 69 IS - Heft 6 SP - 312 EP - 317 ER -