In this note, we describe quite explicitly the Howe duality for Hodge systems and connect it with well-known facts from harmonic analysis and Clifford analysis. In Section 2, we briefly recall the Fisher decomposition and the Howe duality for harmonic analysis. In Section 3, the well-known fact that Clifford analysis is a real refinement of harmonic analysis is illustrated by the Fisher decomposition and the Howe duality for the space of spinor-valued polynomials in Euclidean space under the so-called L-action. On the other hand, for Clifford algebra valued polynomials we can consider another action, called in Clifford analysis the H-action. In the last section, we recall the recently obtained Fisher decomposition for the H-action. Just as the Dirac equation plays the prominent role in Clifford analysis, in this case the basic set of equations is formed by the Hodge system. Moreover, the analysis of Hodge systems can be viewed even as a refinement of Clifford analysis. In this note, we describe the Howe duality for the H-action. In particular, in Proposition 1, we recognize the Howe dual partner of the orthogonal group O(m) in this case as the Lie superalgebra sl(2|1). Furthermore, Theorem 2 gives the corresponding multiplicity-free decomposition with an explicit description of the irreducible pieces.
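For orientation, the classical background facts this note builds on can be summarized as follows (the standard Fisher decomposition and Howe dual pairs for the harmonic and monogenic cases; the new H-action results of Proposition 1 and Theorem 2 are not restated here):

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Classical Fisher decomposition: homogeneous polynomials of degree k split
% into harmonics multiplied by even powers of |x|.
\[
  \mathcal{P}_k \;=\; \bigoplus_{j=0}^{\lfloor k/2\rfloor} |x|^{2j}\,\mathcal{H}_{k-2j},
  \qquad \Delta\,\mathcal{H}_{k-2j}=0 .
\]
% Howe duality for harmonic analysis: the dual pair is (O(m), sl(2)), the sl(2)
% being generated by multiplication with |x|^2, the Laplacian and the Euler operator:
\[
  E_+=\tfrac12\,|x|^2,\qquad E_-=-\tfrac12\,\Delta,\qquad
  H=\mathbb{E}+\tfrac{m}{2},\qquad [H,E_\pm]=\pm 2E_\pm,\quad [E_+,E_-]=H .
\]
% Refinement under the L-action of Clifford analysis: spinor-valued polynomials
% decompose into spherical monogenics (null solutions of the Dirac operator D),
% and the dual pair becomes (Spin(m), osp(1|2)).
\[
  \mathcal{P}_k\otimes\mathbb{S} \;=\; \bigoplus_{j=0}^{k} x^{j}\,\mathcal{M}_{k-j},
  \qquad D\,\mathcal{M}_{k-j}=0 .
\]
\end{document}
```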
THE FOURIER-BESSEL TRANSFORM
(2010)
In this paper, we devise a new multi-dimensional integral transform within the Clifford analysis setting, the so-called Fourier-Bessel transform. It appears that, in the two-dimensional case, it coincides with the Clifford-Fourier and cylindrical Fourier transforms introduced earlier. We show that this new integral transform satisfies operational formulae which are similar to those of the classical tensorial Fourier transform. Moreover, the L2-basis elements consisting of generalized Clifford-Hermite functions appear to be eigenfunctions of the Fourier-Bessel transform.
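For comparison, the operational formulae of the classical tensorial Fourier transform referred to above are the following (standard facts; the kernel of the new Fourier-Bessel transform itself is not reproduced here):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Classical tensorial Fourier transform on R^m and its operational formulae:
% differentiation and multiplication by the variable are exchanged (up to factors of i).
\[
  \mathcal{F}[f](\xi)=(2\pi)^{-m/2}\!\int_{\mathbb{R}^m} e^{-i\langle x,\xi\rangle} f(x)\,dx ,
\]
\[
  \mathcal{F}\!\left[\partial_{x_j} f\right](\xi)= i\,\xi_j\,\mathcal{F}[f](\xi),
  \qquad
  \mathcal{F}\!\left[x_j f\right](\xi)= i\,\partial_{\xi_j}\mathcal{F}[f](\xi).
\]
\end{document}
```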
This paper describes the application of interval calculus to the calculation of plate deflection, taking into account the inevitable and acceptable tolerances of the input data (input parameters). A simply supported reinforced concrete plate loaded by a uniformly distributed load was taken as an example. Several parameters that influence the plate deflection are given as closed intervals. Accordingly, the results are obtained as intervals, so it was possible to follow the direct influence of a change of one or more input parameters on the output values (in our example, the deflection) using one model and one computing procedure. The described procedure could be applied to any FEM calculation in order to keep calculation tolerances, ISO tolerances, and production tolerances within close (admissible) limits. Wolfram Mathematica was used as the tool for the interval calculations.
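A minimal sketch of the interval idea (not the authors' Mathematica/FEM model): assuming a square, simply supported Kirchhoff plate under uniform load, the maximum deflection is w_max = alpha*q*a^4/D with D = E*h^3/(12*(1-nu^2)), and with interval-valued load q, modulus E and thickness h the deflection interval follows from the monotonicity of the formula. All numerical values below are illustrative only.

```python
# Interval evaluation of the mid-span deflection of a square, simply supported
# Kirchhoff plate under uniform load (illustrative values, not the paper's data).
# w_max = alpha * q * a**4 / D,  D = E * h**3 / (12 * (1 - nu**2)),
# alpha ~ 0.00406 for a square plate with nu = 0.3.

def deflection(q, a, E, h, nu=0.3, alpha=0.00406):
    D = E * h**3 / (12.0 * (1.0 - nu**2))   # flexural rigidity
    return alpha * q * a**4 / D             # maximum deflection

# Interval-valued inputs: (lower bound, upper bound)
q = (4.0e3, 5.0e3)      # load [N/m^2]
E = (28.0e9, 32.0e9)    # Young's modulus [Pa]
h = (0.18, 0.20)        # plate thickness [m]
a = 5.0                 # span [m], taken as exact here

# w grows with q and shrinks with E and h, so the interval bounds follow
# from evaluating the formula at the corresponding endpoint combinations.
w_lo = deflection(q[0], a, E[1], h[1])
w_hi = deflection(q[1], a, E[0], h[0])
print(f"deflection interval: [{w_lo*1e3:.2f}, {w_hi*1e3:.2f}] mm")
```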
In this dissertation, a new, unique, and original biaxial device for testing unsaturated soil was designed and developed. A study of the mechanical behaviour of unsaturated sand under plane-strain conditions using the new device is presented. The tests were mainly conducted on Hostun sand specimens. A series of experiments, including basic characterisation, soil water characteristic curves, and biaxial compression tests on dry, saturated, and unsaturated sand, was conducted. A set of bearing capacity tests of a strip model footing on unsaturated sand was performed. Additionally, since the presence of fines (i.e., clay) influences the behaviour of soils, soil water characteristic tests were also performed on sand-kaolin mixture specimens.
This paper presents a framework for a distributed dynamic product model (FREAC), which serves experimental software development. The development of FREAC attempted to realize the following properties, which are largely missing in conventional systems: first, a high degree of flexibility, i.e. the greatest possible adaptability to different disciplines; second, the ability to link different tools seamlessly; third, distributed model editing in real time; fourth, storage of the entire model-editing process; fifth, dynamic extensibility both for software developers and for the users of the tools. The name FREAC covers both the framework for developing and maintaining a product model (FREAC-Development) and the tools developed with it (FREAC-Tools).
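As a purely illustrative sketch (names and structure are assumptions, not FREAC's actual API or data model), the fourth property, storing the entire model-editing process, can be pictured as an append-only change log from which any intermediate model state can be replayed:

```python
# Hypothetical, minimal change log for a product model: every edit is appended,
# and any earlier state can be reconstructed by replaying the log. This only
# illustrates the idea of persisting the whole editing process.
from dataclasses import dataclass, field

@dataclass
class Edit:
    element: str      # id of the model element being edited
    attribute: str
    value: object

@dataclass
class ProductModelLog:
    edits: list = field(default_factory=list)

    def apply(self, edit: Edit) -> None:
        self.edits.append(edit)               # nothing is overwritten or lost

    def state_at(self, step: int) -> dict:
        """Replay the first `step` edits into a plain {element: {attribute: value}} view."""
        state: dict = {}
        for e in self.edits[:step]:
            state.setdefault(e.element, {})[e.attribute] = e.value
        return state

log = ProductModelLog()
log.apply(Edit("wall-01", "height", 2.8))
log.apply(Edit("wall-01", "material", "concrete"))
print(log.state_at(1))   # state after the first edit
print(log.state_at(2))   # full current state
```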
On 25 March 2010, the Professur Baubetrieb und Bauverfahren, as part of its annual series of construction management conferences, organised a one-day workshop entitled "Modellierung von Prozessen zur Fertigung von Unikaten" (Modelling of Processes for the Production of One-of-a-Kind Products), together with the working group "Unikatprozesse" within the expert group "Simulation in Produktion und Logistik" (SPL) of the Arbeitsgemeinschaft Simulation (ASIM). Many construction processes are characterised by their one-of-a-kind nature. One-of-a-kind products are characterised by prototypical uniqueness, individuality, diverse boundary conditions, and a low degree of standardisation and repetition. This makes realistic modelling for the simulation of such one-of-a-kind processes difficult. The majority of the conference contributions reproduced in this volume are devoted to this particular challenge.
Besides home entertainment and business presentations, video projectors are powerful tools for modulating images spatially as well as temporally. The re-evolving need for stereoscopic displays increases the demand for low-latency projectors, and recent advances in LED technology also offer high modulation frequencies. Combining such high-frequency illumination modules with synchronized, fast cameras makes it possible to develop specialized high-speed illumination systems for visual effects production. In this thesis, we present different systems for using spatially as well as temporally modulated illumination in combination with a synchronized camera to simplify the requirements of standard digital video compositing techniques for film and television productions and to offer new possibilities for visual effects generation. After an overview of the basic terminology and a summary of related methods, we discuss and give examples of how modulated light can be applied in a scene-recording context to enable a variety of effects which cannot be realized using standard methods, such as virtual studio technology or chroma keying. We propose using high-frequency, synchronized illumination which, in addition to providing illumination, is modulated in terms of intensity and wavelength to encode technical information for visual effects generation. This is carried out in such a way that the technical components do not influence the final composite and are also not visible to observers on the film set. Using this approach, we present a real-time flash keying system for the generation of perspectively correct augmented composites by projecting imperceptible markers for optical camera tracking. Furthermore, we present a system which enables the generation of various digital video compositing effects outside of completely controlled studio environments, such as virtual studios. A third temporal keying system is presented that aims to overcome the constraints of traditional chroma keying in terms of color spill and color dependency. ...
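A minimal sketch of the temporal-keying idea described above (illustrative only, not the thesis's actual pipeline): with the camera synchronized to the modulated light source, consecutive frames are captured with the coded illumination on and off; their difference isolates the coded contribution (e.g. a matte or projected tracking markers), while averaging the pair yields an ordinary-looking image, mimicking what an observer on set perceives at high flash rates.

```python
import numpy as np

def temporal_key(frame_lit: np.ndarray, frame_unlit: np.ndarray,
                 threshold: float = 0.05):
    """Illustrative temporal keying on float images in [0, 1], captured
    back-to-back with the coded illumination on and off."""
    # The difference contains only what the modulated light added to the scene.
    coded = np.clip(frame_lit - frame_unlit, 0.0, 1.0)
    # Threshold the coded signal into a binary matte (or marker image for tracking).
    matte = (coded.max(axis=-1) > threshold).astype(np.float64)
    # Averaging the on/off pair gives a frame in which the modulation is not visible.
    visible = 0.5 * (frame_lit + frame_unlit)
    return matte, visible

# Synthetic example: the lit frame contains an extra illuminated region.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.4, size=(4, 4, 3))
lit = scene.copy()
lit[1:3, 1:3, :] += 0.3          # region hit by the coded light
matte, visible = temporal_key(lit, scene)
print(matte)
```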
This paper deals with the modelling and analysis of masonry vaults. Numerical FEM analyses are performed using the LUSAS code. Two vault typologies are analysed (barrel and cross-ribbed vaults), parametrically varying geometrical proportions and constraints. The proposed model and the developed numerical procedure are implemented in a computer analysis. Numerical applications are developed to assess the effectiveness of the model and the efficiency of the numerical procedure. The main objective of the present paper is the development of a computational procedure that allows the 3D structural behaviour of masonry vaults to be determined. For each investigated example, the homogenized limit analysis approach has been employed to predict the ultimate load and the failure mechanisms. Finally, both a mesh-dependence study and a sensitivity analysis are reported. The sensitivity analysis is conducted by varying the mortar tensile strength and the mortar friction angle over wide ranges, with the aim of investigating the influence of the mechanical properties of the joints on the collapse load and the failure mechanisms. The proposed computer model is validated by comparison with experimental results available in the literature.
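A minimal sketch of how such a two-parameter sensitivity sweep can be organised (the solver is a hypothetical placeholder and the grid values are illustrative; the actual analyses were run with LUSAS and the homogenized limit analysis model):

```python
import itertools

def collapse_load(f_t: float, phi: float) -> float:
    """Placeholder for one homogenized limit analysis run: should return the
    collapse load for mortar tensile strength f_t [MPa] and mortar friction
    angle phi [deg]."""
    raise NotImplementedError("plug the FE / limit-analysis computation in here")

# Joint parameters swept in the sensitivity study (illustrative grid, not the paper's values).
tensile_strengths = [0.05, 0.10, 0.20, 0.40]   # MPa
friction_angles = [20.0, 25.0, 30.0, 35.0]     # degrees

# Enumerate the parameter combinations; each one is a separate vault analysis
# whose collapse load and failure mechanism would be recorded.
runs = list(itertools.product(tensile_strengths, friction_angles))
for f_t, phi in runs:
    print(f"scheduled run: f_t = {f_t:.2f} MPa, phi = {phi:.1f} deg")
```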
In recent years, special hypercomplex Appell polynomials have been introduced by several authors, and their main properties have been studied by different methods and with different objectives. As in the classical theory of Appell polynomials, their generating function is a hypercomplex exponential function. The observation that this generalized exponential function has, for example, a close relationship with Bessel functions confirmed the practical significance of such an approach to special classes of hypercomplex differentiable functions. Its usefulness for combinatorial studies has also been investigated. Moreover, an extension of these ideas led to the construction of complete sets of hypercomplex Appell polynomial sequences. Here we show how this opens the way for a more systematic study of the relation between some classes of Special Functions and Elementary Functions in Hypercomplex Function Theory.
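For reference, the classical Appell template that the hypercomplex construction mirrors is the following (standard facts; in the hypercomplex setting the hypercomplex derivative and a generalized exponential function take the place of d/dx and e^{xt}):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A sequence (A_n) of polynomials is an Appell sequence iff it satisfies
\[
  \frac{d}{dx}A_n(x) = n\,A_{n-1}(x), \qquad n\ge 1, \quad \deg A_n = n,
\]
% equivalently, its exponential generating function has the form
\[
  \sum_{n=0}^{\infty} A_n(x)\,\frac{t^n}{n!} \;=\; a(t)\,e^{xt},
  \qquad a(0)\neq 0 .
\]
\end{document}
```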
The numerical simulation of microstructure models in 3D requires, due to the enormous number of d.o.f., significant memory resources as well as parallel computational power. Compared to homogeneous materials, the material heterogeneity on the microscale induced by the different material phases demands adequate computational methods for the discretization and for the solution process of the resulting highly nonlinear problem. To enable an efficient and scalable solution process for the linearized equation systems, the heterogeneous FE problem is described by a FETI-DP (Finite Element Tearing and Interconnecting - Dual Primal) discretization. The fundamental FETI-DP equation can be solved by a number of different approaches. In our approach, the FETI-DP problem is reformulated as a saddle point system by eliminating the primal and Lagrangian variables. For the reduced saddle point system, defined only by the interior and dual variables, special Uzawa algorithms can be adapted to iteratively solve the FETI-DP saddle point equation system (FETI-DP SPE). A conjugate gradient version of the Uzawa algorithm is presented, as well as some numerical tests regarding the FETI-DP discretization of small examples using the presented solution technique. Furthermore, the inverse of the interior-dual Schur complement operator can be approximated using different techniques, building an adequate preconditioning matrix and thereby leading to substantial gains in computing time efficiency.
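As a minimal illustration of the iteration named above, here is a generic, non-preconditioned Uzawa step for a saddle point system [[A, B^T], [B, 0]] [u; lambda] = [f; g] with a symmetric positive definite leading block; the FETI-DP specific operators, the conjugate gradient variant, and the Schur complement preconditioner of the paper are not reproduced here.

```python
import numpy as np

def uzawa(A, B, f, g, omega=0.5, tol=1e-10, max_iter=500):
    """Basic Uzawa iteration for the saddle point system
        [A  B^T] [u  ]   [f]
        [B   0 ] [lam] = [g]
    with A symmetric positive definite: each step solves with A and then
    updates the Lagrange multipliers with a simple gradient step."""
    lam = np.zeros(B.shape[0])
    u = np.zeros(A.shape[0])
    for _ in range(max_iter):
        u = np.linalg.solve(A, f - B.T @ lam)   # primal solve
        r = B @ u - g                           # constraint residual
        if np.linalg.norm(r) < tol:
            break
        lam = lam + omega * r                   # multiplier update
    return u, lam

# Tiny synthetic test: SPD block A and one linear constraint.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4.0 * np.eye(4)
B = np.array([[1.0, 1.0, 0.0, 0.0]])
f = rng.standard_normal(4)
g = np.array([0.0])
u, lam = uzawa(A, B, f, g)
print("constraint residual:", float(B @ u - g))
```

In the FETI-DP setting the role of A is played by the interior-dual operator, so the plain solve above is replaced by subdomain solves, and approximating the inverse of the interior-dual Schur complement yields the preconditioner mentioned in the abstract.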