Refine
Document Type
- Article (57)
- Doctoral Thesis (26)
- Part of a Book (16)
- Master's Thesis (7)
- Book (4)
- Preprint (3)
- Conference Proceeding (2)
- Habilitation (2)
- Report (2)
- Bachelor Thesis (1)
Institute
- Institut für Strukturmechanik (ISM) (42)
- Junior-Professur Bildtheorie (17)
- Professur Bauphysik (8)
- Professur Sozialwissenschaftliche Stadtforschung (7)
- Junior-Professur Organisation und vernetzte Medien (6)
- Institut für Europäische Urbanistik (5)
- Professur Bauchemie und Polymere Werkstoffe (3)
- Professur Denkmalpflege und Baugeschichte (3)
- Professur Modellierung und Simulation - Konstruktion (3)
- Bauhaus-Institut für zukunftsweisende Infrastruktursysteme (b.is) (2)
Keywords
- OA-Publikationsfonds2020 (27)
- Maschinelles Lernen (17)
- Machine learning (12)
- Künstlerische Forschung (10)
- Erdbeben (7)
- Deep learning (5)
- Theater (5)
- big data (5)
- Medien (4)
- Raumklima (4)
Year of publication
- 2020 (123)
This speculative handbook offers a wide range of techniques for radical learning and teaching. It comprises concrete instructions, experiences, and theoretical reflections. The texts contribute to conceiving a form of mediation that (re)introduces shared experimentation.
Learning and unlearning take place in the seminar room, in workshops, at festivals, in corridors, parks, and the city. Texts and instructions include, among others: film essays, collages, bank robberies, the university of the dead, wild writing, conceptual speed dating, neurodiverse learning, thinking in formats, the theater of care, the writing lab, and the body strike.
In a systematic interpretation of Vilém Flusser's work, this thesis proposes to understand Flusser's approach as a media-philosophical one insofar as it places the "how" of the media-philosophical question at its center. Media do not become an essential part of Flusser's philosophy only when he makes them the explicit object of his studies of contemporary culture and society or of historical retrospectives; thinking always takes place in media or medial practices and is not merely (co-)shaped by them: without media there would be no thinking, and conversely, philosophy changes with each new medium. Starting from concepts, or rather figures of thought, that address not only the "what" of the topic under discussion but also the "how" of reflection itself, the "upheaval in the structure of thinking" is understood both as a description of media upheavals, with the leap into the universe of computation as its vanishing point, and as the enactment of the current change in the "method of thinking". Flusser's attempts at a reflection that is no longer structured by the medium of writing, but that does justice both to old media such as the image (i.e., practices of depicting, representing, imagining, etc.) and to new media (computation), lead to a contradictory diagnosis of the new universe of computation (in other words, of technical images): on the one hand, a cybernetically inspired vision of freely modelable realities; on the other, the dystopia of a world in which apparatuses dominate thinking, perception, and action. The thesis shows how Flusser arrives at this aporia of media reflection, which remains virulent far beyond Flusser's work, and how it could be resolved, starting from his figure of the gesture, in the sense of a performative media reflection.
Energy‐Efficient Method for Wireless Sensor Networks Low‐Power Radio Operation in Internet of Things
(2020)
Radio operation is the most common source of power consumption in wireless sensor networks (WSN) for Internet of Things (IoT) applications. Consequently, recognizing and controlling the factors affecting radio operation can be valuable for managing node power consumption. Among the essential factors, the time spent checking the radio is of utmost importance, as it can lead to false WakeUps or idle listening in radio duty cycles. ContikiMAC is a low-power radio duty-cycle protocol in Contiki OS that performs clear channel assessment (CCA) in WakeUp mode to check the radio status periodically. This paper presents a detailed analysis of the radio WakeUp time factors of ContikiMAC. Furthermore, we propose a lightweight CCA (LW-CCA) as an extension to ContikiMAC that reduces radio duty cycles during false WakeUps and idle listening through a dynamic received signal strength indicator (RSSI) status check time. Simulation results in the Cooja simulator show that LW-CCA reduces node energy consumption by about 8% while maintaining up to 99% of the packet delivery rate (PDR).
Classical Internet of Things routing and wireless sensor networks can provide more precise monitoring of the covered area thanks to the higher number of deployed nodes. Because of the limitations of the shared transfer medium, many nodes in the network are prone to collisions during simultaneous transmissions. Medium access control protocols are usually practical only in networks with low traffic that are not subject to external noise from adjacent frequencies. Congestion management in the network comprises preventive, detection, and control solutions, all of which are the focus of this study. In the congestion prevention phase, the proposed method chooses the next hop of the path using a fuzzy decision-making system to distribute network traffic via optimal paths. In the congestion detection phase, a dynamic approach to queue management was designed to detect congestion in the least amount of time and prevent collisions. In the congestion control phase, the back-pressure method was used, based on the quality of the queue, to decrease the probability of routing through a pre-congested node. The main goals of this study are to balance energy consumption across network nodes, reduce the packet loss rate, and increase the quality of service in routing. Simulation results showed that the proposed Congestion Control Fuzzy Decision Making (CCFDM) method improves routing parameters compared with recent algorithms.
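The congestion-prevention step above selects the next hop with a fuzzy decision-making system. The abstract does not specify the membership functions or input variables of CCFDM, so the sketch below is hypothetical: it assumes two inputs per neighbor (residual energy and free queue space, both normalized to [0, 1]) and combines them with a Mamdani-style AND.

```python
# Hypothetical sketch of fuzzy next-hop selection; membership shapes,
# inputs, and parameters are illustrative, not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_score(residual_energy, queue_free):
    """Degree to which a neighbor is a 'good' next hop (inputs in [0, 1])."""
    energy_high = tri(residual_energy, 0.3, 1.0, 1.7)  # peaks at full energy
    queue_empty = tri(queue_free, 0.3, 1.0, 1.7)       # peaks at empty queue
    return min(energy_high, queue_empty)               # Mamdani-style AND

def choose_next_hop(neighbors):
    """neighbors: dict node_id -> (residual_energy, queue_free)."""
    return max(neighbors, key=lambda n: fuzzy_score(*neighbors[n]))
```

A real system would defuzzify over a rule base and add further inputs (hop count, link quality); the point here is only how fuzzy memberships turn per-neighbor state into a ranking.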
Coronary Artery Disease Diagnosis: Ranking the Significant Features Using a Random Trees Model
(2020)
Heart disease is one of the most common diseases in middle-aged citizens. Among the vast number of heart diseases, coronary artery disease (CAD) is considered a common cardiovascular disease with a high death rate. The most popular tool for diagnosing CAD is medical imaging, e.g., angiography. However, angiography is costly and associated with a number of side effects. Hence, the purpose of this study is to increase the accuracy of coronary heart disease diagnosis by selecting significant predictive features in order of their ranking. We propose an integrated method using machine learning: the methods of random trees (RTs), the C5.0 decision tree, support vector machine (SVM), and the Chi-squared automatic interaction detection (CHAID) decision tree are used in this study. The proposed method shows promising results, and the study confirms that the RTs model outperforms the other models.
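The feature-ranking idea behind the RTs model can be illustrated with a stripped-down, pure-Python stand-in: rank each feature by the best Gini-impurity decrease a single threshold split achieves. The feature names and data below are invented; a real random-trees model aggregates such impurity decreases over many trees and splits.

```python
# Simplified stand-in for tree-based feature ranking (binary labels 0/1).

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 1.0 - p * p - (1 - p) * (1 - p)

def stump_importance(col, y):
    """Best Gini-impurity decrease over all threshold splits on one feature."""
    base, best = gini(y), 0.0
    for t in sorted(set(col)):
        left = [yi for xi, yi in zip(col, y) if xi <= t]
        right = [yi for xi, yi in zip(col, y) if xi > t]
        if not left or not right:
            continue
        weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        best = max(best, base - weighted)
    return best

def rank_features(X, y, names):
    """Return (name, score) pairs sorted by decreasing importance."""
    scores = [stump_importance([row[j] for row in X], y) for j in range(len(names))]
    return sorted(zip(names, scores), key=lambda s: -s[1])
```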
The main goal of this thesis was to realize a continuous coupling between the analytical and numerical solutions of boundary value problems with singularities. The interpolation-based coupling method achieves global C0 continuity. For this purpose, a special finite element (coupling element) is used that guarantees continuity of the solution both with the analytical element and with the standard CST elements.
Although the interpolation-based coupling method can be applied for an arbitrary number of nodes on the interface ΓAD, examination of the interpolation matrix and numerical simulations showed that it is ill-conditioned. To overcome the resulting numerical instabilities, an approximation-based coupling method was developed and investigated. The stability of this method was then assessed by examining the Gram matrix of the basis system used on the two intervals [−π,π] and [−2π,2π]. The Gram matrix on the interval [−2π,2π] exhibited a more favorable condition number as a function of the number of coupling nodes on the interface. To rule out the associated numerical instabilities, the basis system was orthogonalized on both intervals using the Gram-Schmidt orthogonalization procedure. On the interval [−2π,2π], the orthogonal basis system can be written with explicit formulas. The method of consistent sampling, frequently used in communications engineering, was employed to realize the approximation-based coupling. One limitation of this method is that the number of sampling basis functions must equal the number of reconstruction basis functions. As a consequence, the introduced basis system (with 2n basis functions) can only be used with n basis functions.
To solve this problem, an alternative basis system (variant 2) was presented. Using this basis system, however, requires a transformation matrix M, and when the basis system is orthogonalized on the interval [−π,π], deriving this matrix can be complicated and laborious. The shape functions were then derived for both variants and plotted (for n = 5); it was shown that these functions satisfy the requirements for shape functions and can therefore be used for the FE approximation.
Using numerical simulations carried out with variant 1 (with orthogonalization on the interval [−2π,2π]), the fundamental questions (for example, continuity of the deformations on the interface ΓAD and stresses on the analytical domain) were examined.
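The Gram-Schmidt step described above can be sketched in a few lines; the quadrature-based inner product and the toy trigonometric basis below are illustrative stand-ins for the actual basis system of the thesis.

```python
import math

def inner(f, g, a, b, n=2000):
    """Approximate the L2 inner product <f, g> on [a, b] via the trapezoidal rule."""
    h = (b - a) / n
    s = 0.5 * (f(a) * g(a) + f(b) * g(b))
    for k in range(1, n):
        x = a + k * h
        s += f(x) * g(x)
    return s * h

def _minus_proj(g, q, c):
    return lambda x: g(x) - c * q(x)

def _normalized(g, norm):
    return lambda x: g(x) / norm

def gram_schmidt(basis, a, b):
    """Orthonormalize a list of functions on [a, b] (modified Gram-Schmidt)."""
    ortho = []
    for f in basis:
        g = f
        for q in ortho:
            g = _minus_proj(g, q, inner(g, q, a, b))
        ortho.append(_normalized(g, math.sqrt(inner(g, g, a, b))))
    return ortho
```

On [−2π,2π] the trigonometric functions span full periods, which is one intuition for the better conditioning reported there compared with [−π,π].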
This thesis explores how cultural heritage plays a role in the development of urban identity by engaging both actively and passively with memory, i.e. remembering and forgetting. I argue that architectural heritage is a medium whose mode of presentation is shaped by specific cultural and social decisions and reflects the values and interests of its period. Through the process of remembering and forgetting, meanings between inhabitants and objects in the urban environment are practised and created.
To enable research based on narrative observation, cultural tourism management is chosen as the main research object, as it reflects the changing interaction between architectural heritage and urban identity. In identifying the role of heritage management, the definition of social resilience and the prospects of cultural heritage as a means of social resilience are addressed. The case region of the research is East Germany; the study therefore examines the distinct approaches to and objectives of heritage management under the different political systems along the German reunification process.
The framework is based on various theoretical paradigms to investigate the broad research questions: 1) What is the role of historic urban quarters in the revitalisation of East German towns? 2) How was the transition processed by cultural heritage management? 3) How did policy affect residents’ lives?
The case study is applied at the macro level (city level: Gotha and Eisenach) and the micro level (object level: specific heritage sites) to analyse the performance of selective remembering and the making of a tourist destination through giving significance to specific heritage. By means of site observations, archival research, qualitative interviews, photographs, and discourse analysis of printed tourism materials, the study demonstrates that certain sites and characteristics of the city enable creating and focusing messages, which aids social resilience.
Combining theory and empirical studies, this thesis attempts to widen the academic discussion regarding the practice of remembering and forgetting driven by cultural heritage. The thesis argues for cultural heritage tourism as an element of social resilience, one that embraces the historic and cultural identity of the inhabitants.
The longitudinal dispersion coefficient (LDC) plays an important role in modeling the transport of pollutants and sediment in natural rivers. As a result of transportation processes, the concentration of pollutants changes along the river. Various studies have been conducted to provide simple equations for estimating LDC. In this study, machine learning methods, namely support vector regression, Gaussian process regression, M5 model tree (M5P) and random forest, and multiple linear regression were examined in predicting the LDC in natural streams. Data sets from 60 rivers around the world with different hydraulic and geometric features were gathered to develop models for LDC estimation. Statistical criteria, including correlation coefficient (CC), root mean squared error (RMSE) and mean absolute error (MAE), were used to scrutinize the models. The LDC values estimated by these models were compared with the corresponding results of common empirical models. The Taylor chart was used to evaluate the models and the results showed that among the machine learning models, M5P had superior performance, with CC of 0.823, RMSE of 454.9 and MAE of 380.9. The model of Sahay and Dutta, with CC of 0.795, RMSE of 460.7 and MAE of 306.1, gave more precise results than the other empirical models. The main advantage of M5P models is their ability to provide practical formulae. In conclusion, the results proved that the developed M5P model with simple formulations was superior to other machine learning models and empirical models; therefore, it can be used as a proper tool for estimating the LDC in rivers.
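The evaluation criteria named above (CC, RMSE, MAE) are straightforward to compute; a minimal, dependency-free implementation:

```python
import math

def metrics(obs, pred):
    """Correlation coefficient, RMSE, and MAE between observed and predicted values."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    cc = cov / math.sqrt(sum((o - mo) ** 2 for o in obs)
                         * sum((p - mp) ** 2 for p in pred))
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)
    mae = sum(abs(o - p) for o, p in zip(obs, pred)) / n
    return cc, rmse, mae
```

Note that CC is insensitive to a constant offset: a model that is biased but perfectly correlated scores CC = 1, which is why the study reports RMSE and MAE alongside it.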
Temporary changes in precipitation may lead to sustained and severe drought or massive floods in different parts of the world. Knowing the variation in precipitation can effectively help water resources decision-makers in water resources management. Large-scale circulation drivers have a considerable impact on precipitation in different parts of the world. In this research, the impact of the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), and North Atlantic Oscillation (NAO) on seasonal precipitation over Iran was investigated. For this purpose, 103 synoptic stations with at least 30 years of data were utilized. The Spearman correlation coefficient between the indices in the previous 12 months and seasonal precipitation was calculated, and the meaningful correlations were extracted. Then, the month in which each of these indices has the highest correlation with seasonal precipitation was determined. Finally, the overall amount of increase or decrease in seasonal precipitation due to each of these indices was calculated. Results indicate that the Southern Oscillation Index (SOI), NAO, and PDO have the greatest impact on seasonal precipitation, in that order. Additionally, these indices have the highest impact on precipitation in winter, autumn, spring, and summer, respectively. SOI affects winter precipitation differently from the PDO and NAO, while in the other seasons each index has its own distinct impact on seasonal precipitation. Generally, all indices in different phases may decrease seasonal precipitation by up to 100%. However, seasonal precipitation may also increase by more than 100% in different seasons due to the impact of these indices. The results of this study can be used effectively in water resources management and especially in dam operation.
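The core computation, the Spearman correlation between a lagged index series and seasonal precipitation, can be sketched without external libraries: compute ranks with tie averaging, then the Pearson correlation of the ranks. The sample series in the test are invented.

```python
import math

def ranks(v):
    """Ranks of v (1-based), with ties assigned the average rank."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx)
                    * sum((b - my) ** 2 for b in ry))
    return num / den
```

Being rank-based, Spearman correlation captures monotone but nonlinear index-precipitation relationships that Pearson correlation would understate.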
Hydrological drought forecasting plays a substantial role in water resources management, as hydrological drought strongly affects water allocation and hydropower generation. In this research, short-term hydrological drought was forecast based on hybrids of novel nature-inspired optimization algorithms and Artificial Neural Networks (ANN). For this purpose, the Standardized Hydrological Drought Index (SHDI) and the Standardized Precipitation Index (SPI) were calculated for one, three, and six aggregated months. Then, three states were proposed for SHDI forecasting, and 36 input-output combinations were extracted based on cross-correlation analysis. In the next step, newly proposed optimization algorithms, including the Grasshopper Optimization Algorithm (GOA), Salp Swarm Algorithm (SSA), Biogeography-Based Optimization (BBO), and Particle Swarm Optimization (PSO), were hybridized with the ANN and utilized for SHDI forecasting, and the results were compared to the conventional ANN. Results indicated that the hybridized models outperformed the conventional ANN, with PSO performing better than the other optimization algorithms. The best models forecasted SHDI1 with R2 = 0.68 and RMSE = 0.58, SHDI3 with R2 = 0.81 and RMSE = 0.45, and SHDI6 with R2 = 0.82 and RMSE = 0.40.
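A minimal particle swarm optimization (PSO) loop of the kind used to tune ANN weights might look as follows. It is shown minimizing a generic loss function; the inertia and acceleration coefficients are common textbook defaults, not the values used in the study.

```python
import random

def pso(loss, dim, n_particles=20, iters=150, seed=0):
    """Minimize loss(vector) with a basic global-best PSO; returns (best, value)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [loss(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest_val[i], pbest[i] = v, pos[i][:]
                if v < loss(g):
                    g = pos[i][:]
    return g, loss(g)
```

In the hybrid setting, `loss` would evaluate the ANN's forecasting error for a candidate weight vector, so the swarm replaces gradient-based training.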
In Germany, bridges have an average age of 40 years. A bridge consumes between 0.4% and 2% of its construction cost per year over its entire life cycle. This means that up to 80% of the construction cost is additionally needed for operation, inspection, maintenance, and demolition. Current practices rely either on paper-based inspections or on abstract specialist software. Every application in the inspection and maintenance sector uses its own data model for structures, inspections, defects, and maintenance. As a result, data and properties have to be transferred manually; otherwise, a converter is necessary for every data exchange between two applications. To overcome this issue, an adequate model standard for inspections, damage, and maintenance is necessary. Modern 3D models may serve as a single source of truth, as suggested by the Building Information Modeling (BIM) concept. Further, these models offer a clear visualization of the built infrastructure and improve not only the planning and construction phases but also the operation phase of construction projects. BIM is established mostly in the Architecture, Engineering, and Construction (AEC) sector to plan and construct new buildings. Currently, BIM does not cover the whole life cycle of a building, and especially not inspection and maintenance. Creating damage models requires the building model first, because a defect depends on the building component, its properties, and its material. Hence, a building information model is necessary to obtain meaningful conclusions from damage information. This paper analyzes the requirements that arise from practice and the research that has been done in modeling damage and related information for bridges. With a look at damage categories and use cases related to inspection and maintenance, scientific literature is discussed and synthesized. Finally, research gaps and needs are identified and discussed.
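A single-source-of-truth damage model of the kind argued for above could start from a schema that ties every defect to the building component it occurs on. The classes and fields below are a hypothetical sketch, not an existing standard; a real model would reference components by a stable identifier such as an IFC GlobalId.

```python
# Hypothetical minimal schema: every Damage references its Component,
# so defect data cannot exist detached from the building model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:
    guid: str       # stable component id, e.g. an IFC GlobalId
    name: str
    material: str

@dataclass
class Damage:
    damage_id: str
    category: str   # e.g. "crack", "spalling", "corrosion"
    component: Component
    width_mm: float = 0.0

@dataclass
class Inspection:
    date: str
    inspector: str
    findings: List[Damage] = field(default_factory=list)
```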
This contribution connects the discussion of the post-political city with the growing scholarly and activist engagement with the Anthropocene, a concept describing the ecological and socio-political implications of human action on the Earth's surface. Using three selected case studies, we explore how the specifically anthropogenic, i.e. human-made, crisis of urban air pollution is problematized in artistic positions. In the context of the potential advance of post-politics, we discuss how the ambivalent discourse of the Anthropocene favors depoliticization on the one hand and, on the other, opens up new possibilities for repoliticizing global environmental challenges.
Conventional superplasticizers based on polycarboxylate ether (PCE) show an intolerance to clay minerals due to intercalation of their polyethylene glycol (PEG) side chains into the interlayers of the clay mineral. An intolerance to very basic media is also known. This makes PCE an unsuitable choice as a superplasticizer for geopolymers. Bio-based superplasticizers derived from starch showed comparable effects to PCE in a cementitious system. The aim of the present study was to determine if starch superplasticizers (SSPs) could be a suitable additive for geopolymers by carrying out basic investigations with respect to slump, hardening, compressive and flexural strength, shrinkage, and porosity. Four SSPs were synthesized, differing in charge polarity and specific charge density. Two conventional PCE superplasticizers, differing in terms of molecular structure, were also included in this study. The results revealed that SSPs improved the slump of a metakaolin-based geopolymer (MK-geopolymer) mortar while the PCE investigated showed no improvement. The impact of superplasticizers on early hardening (up to 72 h) was negligible. Less linear shrinkage over the course of 56 days was seen for all samples in comparison with the reference. Compressive strengths of SSP specimens tested after 7 and 28 days of curing were comparable to the reference, while PCE led to a decline. The SSPs had a small impact on porosity with a shift to the formation of more gel pores while PCE caused an increase in porosity. Throughout this research, SSPs were identified as promising superplasticizers for MK-geopolymer mortar and concrete.
The current housing crisis has a social-ecological core. The socially unjust and ecologically problematic distribution of living space remains largely invisible and is insufficiently problematized as a question of spatial justice in either academic or activist contexts. Housing and land in a city are not endlessly available goods: when some people live on a lot of space, less remains for others. And the people least responsible for the shortage of housing suffer most from it. This article first develops the concept of housing-space justice, referring to the unequal distribution of living space and its societal implications under current housing allocation mechanisms. It then problematizes the consumption of (residential) land from an ecological perspective. The article discusses apparent as well as transformation-oriented solutions and courses of action. Finally, it calls for a stronger debate in critical urban studies and in activist contexts on housing-space justice, whose realization has a social as well as an ecological dimension.
Matthias Bernt and Andrej Holm rightly point out that research on East German cities is needed as a conceptually independent field, one that centers the specific spatialization of the profound societal transformation process after 1990. They regard the field of housing in particular as productive for gaining insight into the structure and effects of this process. However, they remain vague about how such research on housing specifically focused on East Germany should be conceived, and about how the particularities of East German developments and their parallels to the transformations of housing and urban development policy in West Germany, and internationally, should be related to one another.
Hans Ruin: Being with the Dead—Burial, ancestral politics, and the roots of historical consciousness
(2020)
How can society be thought of as something in which the living and the dead interact throughout history? In Being with the Dead: Burial, Ancestral Politics, and the Roots of Historical Consciousness, Hans Ruin turns to the relationship between the living and the dead as well as to 'historical consciousness'. He takes up the expression 'being with the dead' (Mitsein mit dem Toten), an existential-ontological term that Martin Heidegger (1962: 282) coined rather en passant and that has so far received little consideration. For Ruin, it now forms the starting point for his "expanded phenomenological social ontology" (p. XI). By illuminating history and historical consciousness through the category 'being with the dead,' he gains remarkable insights into the meaning of ancestrality. Concerning 'necropolitics,' Ruin shows that the political space includes the living as well as the dead, and how both constitute it. The foci of his considerations are the human sciences, above all sociology, anthropology, archaeology, philology, and history. Ruin's book aims at a "metacritical thanatology," which he elaborates as "an exploration of the social ontology of being with the dead mediated through critical analyses of the human-historical sciences themselves" (p. XII). Across seven chapters, he succeeds impressively in emphasizing the political and ethical importance of a scholarly gaze that cultivates the interaction of the living and the dead.
Connecting the social and the ecological question is one of the central challenges for left politics and critically engaged scholarship today. The public and academic debates around the housing question are good examples of how little this has succeeded so far. This call is an invitation to the collective body of knowledge from academia and activism to elaborate, in individual contributions, the various aspects of the ecological housing question, which have so far been treated in a highly fragmented way, and to examine their structural connection with the social housing question.
Prediction of the groundwater nitrate concentration is of utmost importance for pollution control and water resource management. This research aims to model the spatial groundwater nitrate concentration in the Marvdasht watershed, Iran, using several machine learning methods: support vector machine (SVM), Cubist, random forest (RF), and Bayesian artificial neural network (Bayesian-ANN). For this purpose, 11 independent variables affecting groundwater nitrate, namely elevation, slope, plan curvature, profile curvature, rainfall, piezometric depth, distance from the river, distance from residential areas, sodium (Na), potassium (K), and the topographic wetness index (TWI), were prepared for the study area. Nitrate levels were also measured in 67 wells and used as the dependent variable for modeling. Data were divided into training (70%) and testing (30%) sets. The evaluation criteria coefficient of determination (R2), mean absolute error (MAE), root mean square error (RMSE), and Nash-Sutcliffe efficiency (NSE) were used to evaluate the performance of the models. The results showed that the RF model (R2 = 0.89, RMSE = 4.24, NSE = 0.87) outperformed the Cubist (R2 = 0.87, RMSE = 5.18, NSE = 0.81), SVM (R2 = 0.74, RMSE = 6.07, NSE = 0.74), and Bayesian-ANN (R2 = 0.79, RMSE = 5.91, NSE = 0.75) models. Zoning of the groundwater nitrate concentration showed that the northern, agricultural parts of the study area have the highest nitrate levels.
The most important causes of nitrate pollution in these areas are agricultural activity and the irrigation of crops with groundwater from wells close to farmland: chemical fertilizers are applied indiscriminately, washed out by irrigation or rainwater, and penetrate the groundwater, polluting the aquifer.
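Of the evaluation criteria used above, the Nash-Sutcliffe efficiency (NSE) is the least standard outside hydrology; a minimal implementation:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    mean_obs = sum(obs) / len(obs)
    return 1.0 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                  / sum((o - mean_obs) ** 2 for o in obs))
```

An NSE above 0 means the model predicts better than simply using the observed mean, which is why values like 0.87 for the RF model indicate a substantial fit.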
Image Analysis Using Human Body Geometry and Size Proportion Science for Action Classification
(2020)
Gestures are one of the basic modes of human communication and are usually used to represent different actions. Automatic recognition of these actions forms the basis for solving more complex problems like human behavior analysis, video surveillance, event detection, and sign language recognition. Action recognition from images is a challenging task, as key information like temporal data, object trajectories, and optical flow is not available in still images. Measuring the size of different regions of the human body, i.e., step size, arm span, and the length of the arm, forearm, and hand, provides valuable clues for identifying human actions. In this article, a framework for the classification of human actions is presented in which humans are detected and localized through faster region-based convolutional neural networks, followed by morphological image processing techniques. Furthermore, geometric features are extracted from the human blob and incorporated into classification rules for six human actions: standing, walking, single-hand side wave, single-hand top wave, both-hands side wave, and both-hands top wave. The performance of the proposed technique has been evaluated using precision, recall, omission error, and commission error, and comparatively analyzed in terms of overall accuracy with existing approaches, showing that it performs well in contrast to its counterparts.
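The idea of rule-based classification over geometric features can be caricatured in a few lines. The input features and thresholds below are hypothetical; the paper derives its features from the detected human blob and distinguishes side from top waves, which this sketch collapses for brevity.

```python
# Hypothetical rule set over geometric blob features; thresholds are invented.

def classify_action(aspect_ratio, left_arm_raised, right_arm_raised):
    """aspect_ratio: blob height/width; arm flags from blob geometry."""
    if left_arm_raised and right_arm_raised:
        return "both-hands wave"
    if left_arm_raised or right_arm_raised:
        return "single-hand wave"
    # a wide stance during walking lowers the height/width ratio
    return "standing" if aspect_ratio > 2.0 else "walking"
```

The appeal of such rules, as the abstract notes, is that they need no temporal data: body-proportion cues alone separate the action classes.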
This paper proposes a practice-theoretical journalism research approach for an alternate and innovative perspective of digital journalism’s current empirical challenges. The practice-theoretical approach is introduced by demonstrating its explanatory power in relation to demarcation problems, technological changes, economic challenges and challenges to journalism’s legitimacy. Its respective advantages in dealing with these problems are explained and then compared to established journalism theories. The particular relevance of the theoretical perspective is due to (1) its central decision to observe journalistic practices, (2) the transgression of conventional journalistic boundaries, (3) the denaturalization of journalistic norms and laws, (4) the explicit consideration of a material, socio-technical dimension of journalism, (5) a focus on the conflicting relationship between journalistic practices and media management practices, and (6) prioritizing order generation over stability.