The aim of this paper is to present the so-called discrete-continual boundary element method (DCBEM) of structural analysis. Its field of application comprises buildings, structures, and structural parts and components for residential, commercial and uninhabited construction whose physical and geometrical parameters are invariant along some dimensions. We should mention here in particular such objects as beams, thin-walled bars, strip foundations, plates, shells, deep beams, high-rise buildings, extended buildings, pipelines, rails, dams and others. DCBEM belongs to the group of semianalytical methods. Semianalytical formulations are contemporary mathematical models that are currently becoming practical to realize owing to the substantial growth of computer performance. DCBEM is based on the theory of pseudodifferential boundary equations; the corresponding pseudodifferential operators are discretely approximated using Fourier or wavelet analysis. The main advantages of DCBEM over other numerical methods are a twofold reduction of the problem dimension (the discrete mesh is applied not to the full region of interest but only to the boundary of its cross-section, so that one effectively solves a one-dimensional problem with a finite step along the boundary of the region), the opportunity to carry out very detailed analysis of specifically chosen zones, simplified preparation of initial data, and simple, adaptive algorithms. Two formulations of DCBEM have been developed: indirect (IDCBEM) and direct (DDCBEM); as in the boundary element method (BEM), the indirect formulation is applied and used somewhat more often than the direct one.

The execution of project activities generally requires the use of (renewable) resources like machines, equipment or manpower. The resource allocation problem consists in assigning time intervals to the execution of the project activities while taking into account temporal constraints between activities, emanating from technological or organizational requirements, and the costs incurred by the resource allocation. If the total procurement cost of the different renewable resources has to be minimized, we speak of a resource investment problem. If the cost depends on the smoothness of the resource utilization over time, the underlying problem is called a resource levelling problem. In this paper we consider a new tree-based enumeration method for solving resource investment and resource levelling problems that exploits some fundamental properties of spanning trees. The enumeration scheme is embedded in a branch-and-bound procedure using a workload-based lower bound and a depth-first search. Preliminary computational results show that the proposed procedure is promising for instances with up to 30 activities.
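The branch-and-bound idea can be sketched on a toy instance. Everything below (the activity data, the single resource, the planning horizon) is an illustrative assumption; the paper's spanning-tree enumeration scheme is not reproduced, only the combination of a workload-based lower bound with a depth-first search:

```python
# Toy branch-and-bound for a resource investment flavour of the problem:
# choose start times within time windows so that the peak usage of one
# renewable resource (whose procurement cost grows with the peak) is
# minimal. Hypothetical instance, for illustration only.
import math

activities = [  # (duration, resource demand, earliest start, latest start)
    (3, 2, 0, 4),
    (2, 3, 0, 5),
    (4, 1, 1, 3),
]
HORIZON = 8

def peak_usage(starts):
    # build the resource profile over the horizon and return its maximum
    profile = [0] * HORIZON
    for (dur, dem, _, _), s in zip(activities, starts):
        for t in range(s, s + dur):
            profile[t] += dem
    return max(profile)

def workload_lower_bound():
    # the total workload spread evenly over the horizon bounds the peak
    total = sum(d * r for d, r, _, _ in activities)
    return math.ceil(total / HORIZON)

best = [math.inf]

def dfs(starts):
    # depth-first enumeration of start times, pruned by the lower bound
    if len(starts) == len(activities):
        best[0] = min(best[0], peak_usage(starts))
        return
    dur, dem, es, ls = activities[len(starts)]
    for s in range(es, ls + 1):
        if s + dur <= HORIZON and workload_lower_bound() < best[0]:
            dfs(starts + [s])

dfs([])
```

In a real resource investment solver the lower bound would be recomputed per partial schedule; here it is global, which keeps the sketch short.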

We present recent developments of adaptive wavelet solvers for elliptic eigenvalue problems. We describe the underlying abstract iteration scheme of the preconditioned perturbed iteration. We apply the iteration to a simple model problem in order to identify the main ideas which a numerical realization of the abstract scheme is based upon. This indicates how these concepts carry over to wavelet discretizations. Finally we present numerical results for the Poisson eigenvalue problem on an L-shaped domain.

We present an algebraically extended 2D image representation in this paper. In order to obtain more degrees of freedom, a 2D image is embedded into a certain geometric algebra. Combining methods of differential geometry, tensor algebra, the monogenic signal and quadrature filters, the novel 2D image representation can be derived as the monogenic extension of a curvature tensor. The 2D spherical harmonics are employed as basis functions to construct the algebraically extended 2D image representation. From this representation, the monogenic signal and the monogenic curvature signal for modeling intrinsically one- and two-dimensional (i1D/i2D) structures are obtained as special cases. Local features of amplitude, phase and orientation can be extracted at the same time in this unique framework. Compared with related work, our approach has the advantage of simultaneous estimation of local phase and orientation. The main contribution is the rotationally invariant phase estimation, which enables phase-based processing in many computer vision tasks.

An analysis of changes in the geometry of a reinforced concrete chimney and their influence on the stresses in the chimney mantle was carried out. All the changes were introduced to a model chimney and compared. Relations between the stresses in the mantle of the chimney and the deformations determined by the change of the chimney's vertical axis geometry were investigated. The vertical axis of the chimney was described by a linear function (corresponding to the real rotation of the chimney together with the foundation) and by a parabolic function (corresponding to the real displacement of the chimney under the influence of horizontal forces, i.e. wind). The positive stress pattern in the concrete as well as the negative stress pattern in the reinforcing steel are presented, and the two cases are compared. An analysis of the stress changes in the chimney mantle depending on modifications of the mantle thickness (the thickness was altered in either a linear or an abrupt way) was carried out. The relation between the stresses and the change of the chimney's diameter from the bottom to the top of the chimney was also investigated. All the analyses were conducted by means of a specially developed computer program created in the Mathematica environment. The program also makes it possible to control the calculations and to visualize the results at every stage of the calculation process.
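The two vertical-axis geometries described above can be sketched directly; the height and top displacement below are assumed values for illustration, not taken from the paper:

```python
# Sketch (not the authors' Mathematica program) of the two axis shapes:
# a linear function for rigid rotation with the foundation, and a
# parabolic function for bending-type displacement under wind load.
H = 100.0       # chimney height [m], assumed
U_TOP = 0.25    # horizontal displacement at the top [m], assumed

def axis_linear(z):
    # rigid rotation of chimney and foundation
    return U_TOP * z / H

def axis_parabolic(z):
    # bending-type displacement under horizontal (wind) forces
    return U_TOP * (z / H) ** 2

# both shapes agree at the top, but the parabolic axis stays closer
# to the vertical over the lower part of the chimney
assert axis_parabolic(H / 2) < axis_linear(H / 2)
```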

Nodal integration of finite elements has been investigated recently. Compared with full integration it shows better convergence when applied to incompressible media, allows easier remeshing and greatly reduces the number of material evaluation points, thus improving efficiency. Furthermore, understanding it may help to create new integration schemes in meshless methods as well. The new integration technique requires a nodally averaged deformation gradient. For the tetrahedral element it is possible to formulate a nodal strain which passes the patch test. On the downside, it introduces non-physical low-energy modes. Most of these "spurious modes" are local deformation maps of neighbouring elements. Present stabilization schemes rely on adding a stabilizing potential to the strain energy. This stabilization is discussed within this article. Its drawbacks are easily identified in numerical experiments: nonlinear material laws are not well represented, plastic strains may often be underestimated, and geometrically nonlinear stabilization greatly reduces computational efficiency. The article reinterprets nodal integration in terms of imposing a nonconforming C0-continuous strain field on the structure. By doing so, the origins of the spurious modes are discussed and two methods are presented that solve this problem. First, a geometric constraint is formulated and solved using a mixed formulation of Hu-Washizu type. This assumption leads to a consistent representation of the strain energy while eliminating spurious modes. The solution is exact, but only of theoretical interest since it produces global support. Second, an integration scheme is presented that approximates the stabilization criterion. The latter leads to a highly efficient scheme and can even be extended to other finite element types such as hexahedra. Numerical efficiency, convergence behaviour and stability of the new method are validated using linear tetrahedral and hexahedral elements.

The subject of the paper is the realisation of a model-based efficiency control system for PV generators using a simulation model. A standard two-diode model of the PV generator forms the basis of the ColSim model, which is implemented in ANSI C for flexible code export. The algorithm is based on discretised U-I characteristics, which allow the calculation of string topologies with parallel and serial PV cells and modules. Shadowing effects can be modelled down to the cell configuration using polar horizon definitions. The simulation model was ported to a real-time environment to calculate the efficiency of a PV system. Embedded system technology allows networked operation and the integration of standard I/O devices. Further work focuses on adapting the shadowing routine to obtain the environmental conditions from real operation.
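The two-diode model and the discretised U-I characteristic mentioned above can be sketched as follows. The ColSim implementation itself is in ANSI C and is not reproduced here; all parameter values are illustrative assumptions, and the series resistance is omitted so the current can be written explicitly:

```python
# Minimal sketch of a two-diode PV cell model on a discretised voltage
# grid. Parameter values are assumed, not taken from ColSim.
import math

I_PH = 5.0      # photocurrent [A]
I_01 = 1e-9     # saturation current, diffusion diode [A]
I_02 = 1e-6     # saturation current, recombination diode [A]
V_T  = 0.0257   # thermal voltage at about 25 degC [V]
R_P  = 10.0     # parallel (shunt) resistance [Ohm]

def cell_current(v):
    """Two-diode model without series resistance (explicit form)."""
    return (I_PH
            - I_01 * (math.exp(v / V_T) - 1.0)
            - I_02 * (math.exp(v / (2.0 * V_T)) - 1.0)
            - v / R_P)

# discretised U-I characteristic, the per-cell building block for
# combining cells and modules into string topologies
voltages = [i * 0.01 for i in range(61)]          # 0 .. 0.6 V
ui_curve = [(v, cell_current(v)) for v in voltages]
```

On such a grid, series connection adds voltages at equal current and parallel connection adds currents at equal voltage, which is what makes the discretised representation convenient for string calculations.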

Most insolvencies in Germany occur in the construction industry. The reasons for this are manifold, but a modern management information system (M-I-S) and construction-site controlling make it possible to recognise early how site results are developing. This requires that the working cost estimate be kept continuously up to date. Only then are monthly target/actual comparisons and a cost-to-complete analysis possible and meaningful. A monthly rolling forecast of the site result at completion ensures that serious changes in the result are uncovered immediately. Only with knowledge of these developments can management act early (in the sense of an early-warning system) and take corrective measures. The forecast of the result at completion is not sufficient on its own as a control instrument. The financial situation of the construction site must also be checked regularly, i.e. the state of completion must be reconciled with the invoices issued to the client, and the client's unpaid invoices must be reviewed. The best forecast result is worthless if the client does not pay for the services rendered. The economic data are available to those responsible online in the construction-site information system (B-I-S). A traffic-light system illustrates the economic situation of the construction site.

The Carbon journal is pleased to introduce a themed collection of recent articles in the area of computational carbon nanoscience. This virtual special issue was assembled from previously published Carbon articles by Guest Editors Quan Wang and Behrouz Arash, and can be accessed as a set in the special issue section of the journal website homepage: www.journals.elsevier.com/carbon. The article below by our guest editors serves as an introduction to this virtual special issue, and also a commentary on the growing role of computation as a tool to understand the synthesis and properties of carbon nanoforms and their behavior in composite materials.

For the dynamic behavior of lightweight structures like thin shells and membranes exposed to fluid flow, the interaction between the two fields is often essential. Computational fluid-structure interaction provides a tool to predict this interaction and to complement or possibly replace expensive experiments. Partitioned analysis techniques enjoy great popularity for the numerical simulation of these interactions. This is due to their computational superiority over simultaneous, i.e. fully coupled monolithic, approaches, as they allow the independent use of suitable discretization methods and modular analysis software. For the fluid we use GLS-stabilized finite elements on a moving domain, based on the unsteady incompressible Navier-Stokes equations, where the formulation guarantees geometric conservation on the deforming domain. The structure is discretized by nonlinear, three-dimensional shell elements.
Commonly used sequential staggered coupling schemes may exhibit instabilities due to the so-called artificial added mass effect. As the best remedy to this problem, subiterations should be invoked to guarantee kinematic and dynamic continuity across the fluid-structure interface. Since iterative coupling algorithms are computationally very costly, their convergence rate is decisive for their usability. To ensure and accelerate the convergence of this iteration, the updates of the interface position are relaxed. The time-dependent 'optimal' relaxation parameter is determined automatically, without any user input, by exploiting a gradient method or applying an Aitken iteration scheme.
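The Aitken relaxation can be illustrated on a scalar toy fixed-point problem. In an actual FSI loop the unknown would be the vector of interface displacements and the map g one fluid-solve/structure-solve round trip; the contraction below is purely a stand-in:

```python
# Aitken dynamic relaxation for a fixed-point iteration x = g(x),
# scalar sketch of the scheme used to stabilize partitioned coupling.

def g(x):
    # toy contractive interface operator (stands in for one
    # fluid-structure coupling sweep); fixed point at x = 2
    return 0.5 * x + 1.0

x, omega = 0.0, 1.0     # initial guess and initial relaxation factor
r_prev = None
for _ in range(50):
    r = g(x) - x                        # interface residual
    if r_prev is not None:
        # scalar Aitken update of the relaxation factor
        omega = -omega * r_prev / (r - r_prev)
    x += omega * r                      # relaxed interface update
    if abs(r) < 1e-12:
        break
    r_prev = r
```

For a linear scalar map this converges in two effective steps; in the vector case the quotient is replaced by inner products of successive residual differences.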

We apply keyquery-based taxonomy composition to compute a classification system for the CORE dataset, a shared crawl of about 850,000 scientific papers. Keyquery-based taxonomy composition can be understood as a two-phase hierarchical document clustering technique that utilizes search queries as cluster labels: In a first phase, the document collection is indexed by a reference search engine, and the documents are tagged with the search queries for which they are relevant, their so-called keyqueries. In a second phase, a hierarchical clustering is formed from the keyqueries within an iterative process. We use the explicit topic model ESA as the document retrieval model in order to index the CORE dataset in the reference search engine. Under the ESA retrieval model, documents are represented as vectors of similarities to Wikipedia articles, a methodology proven to be advantageous for text categorization tasks. Our paper presents the generated taxonomy and reports on quantitative properties such as document coverage and processing requirements.
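The keyquery notion from the first phase can be sketched in a few lines. The documents, candidate queries and word-overlap scoring below are toy assumptions standing in for the CORE collection and the ESA retrieval model:

```python
# Toy illustration of keyqueries: a query q is a keyquery for document
# d if d appears in the top-k results the reference search engine
# returns for q. Word-overlap scoring stands in for ESA.
DOCS = {
    "d1": "wavelet methods for elliptic eigenvalue problems",
    "d2": "adaptive wavelet solvers numerical analysis",
    "d3": "resource allocation in project scheduling",
}
K = 2  # result-list depth that counts as "relevant"

def search(query):
    # rank documents by the number of shared words with the query
    scores = {d: len(set(query.split()) & set(text.split()))
              for d, text in DOCS.items()}
    return sorted(scores, key=scores.get, reverse=True)

def keyqueries(doc, candidate_queries):
    return [q for q in candidate_queries if doc in search(q)[:K]]
```

In the actual method the keyqueries then become the labeled nodes from which the hierarchical clustering is built.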

A practical framework for generating cross correlated fields with a specified marginal distribution function, an autocorrelation function and cross correlation coefficients is presented in the paper. The contribution builds on a recent journal paper [1]. The approach relies on well-known series expansion methods for the simulation of a Gaussian random field. The proposed method requires all cross correlated fields over the domain to share an identical autocorrelation function, and the cross correlation structure between each pair of simulated fields to be defined simply by a cross correlation coefficient. These relations result in specific properties of the eigenvectors of the covariance matrices of the discretized fields over the domain. These properties are used to decompose the eigenproblem, which must normally be solved in computing the series expansion, into two smaller eigenproblems; this decomposition represents a significant reduction of computational effort. Non-Gaussian components of a multivariate random field are simulated via memoryless transformation of the underlying Gaussian random fields, for which the Nataf model is employed to modify the correlation structure. In this method, the autocorrelation structure of each field is fulfilled exactly while the cross correlation is only approximated. The associated errors can be computed before performing the simulations, and it is shown that they arise especially in the cross correlation between distant points and are negligibly small in practical situations.
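The core mixing step behind cross correlated Gaussian fields can be sketched for a single point of the domain and two fields. This is only the Cholesky mixing of the cross correlation matrix; the paper's eigenproblem decomposition of the series expansion and the Nataf transformation are not reproduced:

```python
# Two cross correlated standard Gaussian samples from two independent
# ones, mixed with the Cholesky factor of [[1, RHO], [RHO, 1]].
import math
import random

random.seed(0)
RHO = 0.8          # target cross correlation coefficient
N = 200_000        # number of realizations

# Cholesky factor of the 2x2 cross correlation matrix, written out
L11, L21, L22 = 1.0, RHO, math.sqrt(1.0 - RHO * RHO)

z1 = [random.gauss(0, 1) for _ in range(N)]
z2 = [random.gauss(0, 1) for _ in range(N)]
f1 = [L11 * a for a in z1]                         # field 1
f2 = [L21 * a + L22 * b for a, b in zip(z1, z2)]   # field 2, correlated

# sample estimate of the cross correlation between the two fields
est = sum(a * b for a, b in zip(f1, f2)) / N
```

In the full method the same mixing is applied to the independent random variables of the series expansion, which is why all fields must share one autocorrelation function.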

The reduction of oscillation amplitudes of structural elements is necessary not only to maintain their durability and longevity but also to eliminate the harmful effect of oscillations on people and technological operations. Dampers are widely applied for this purpose. One of the most widespread models of structural friction forces, having a piecewise linear relation to displacement, was analysed. The author suggests the application of phase trajectory mapping in the "acceleration - displacement" plane. Unlike trajectory mappings in the "velocity - displacement" plane, these do not require a large number of geometrical constructions for the identification of the characteristics of dynamic systems, which improves accuracy. The analytical assumptions were verified by numerical modeling. The results show sufficiently good agreement between the numerical and analytical estimations of the dissipative characteristic.

What is nowadays called (classic) Clifford analysis consists in the establishment of a function theory for functions belonging to the kernel of the Dirac operator. While such functions can very well describe problems of a particle with internal SU(2)-symmetries, higher order symmetries are beyond this theory. Although many modifications (such as Yang-Mills theory) were suggested over the years, they could not address the principal problem: the need for an n-fold factorization of the d'Alembert operator. In this paper we present the basic tools of a fractional function theory in higher dimensions for the transport operator (α = 1/2) by means of a fractional correspondence to the Weyl relations via fractional Riemann-Liouville derivatives. A Fischer decomposition, fractional Euler and Gamma operators, a monogenic projection, and basic fractional homogeneous powers are constructed.
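For orientation, the fractional Riemann-Liouville derivative invoked above is the standard left-sided one (general definition, not specific to this paper):

```latex
\left(D^{\alpha}_{a^{+}} f\right)(x)
  \;=\; \frac{1}{\Gamma(1-\alpha)}\,\frac{d}{dx}
        \int_{a}^{x} \frac{f(t)}{(x-t)^{\alpha}}\,dt,
  \qquad 0 < \alpha < 1 .
```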

The article presents an analysis of the stress distribution in a reinforced concrete support beam bracket which is a component of a prefabricated reinforced concrete building. The building structure is a spatial frame in which expansion joints were applied. The proper stiffness of the structure is provided by frames with stiff joints, monolithic lift shafts and staircases. The prefabricated slab floors are supported by beam shelves shaped as an inverted letter 'T'. The beams are supported by the column brackets. In order to lower the storey height and at the same time fulfill the architectural demands, the designer lowered the height of the beam in the support zone. The analyzed case refers to the bracket zone where a slant crack on the support beam bracket was observed. It could have appeared as a result of the allowable tensile stresses in the reinforced concrete being exceeded in the bracket zone. It should be noted that the construction solution applied, i.e. the concurrent support of the "undercut" beam on the column bracket, causes a local concentration of stresses in the undercut zone, where the largest transverse forces and tangential stresses occur concurrently. Additional rectangular stresses resulting from placing the slab floors on the lower part of the beam shelves are superimposed on those described above.

The paper contains a description of dynamic effects in the silo wall during the outflow of a stored material. The work allows for determining the danger of construction damage due to resonant vibrations and is of practical importance in determining the influence of cyclic pressures and vibro-creep during prolonged use of a silo. The paper was devised as a result of tests on silo walls at semi-technical scale. The model is generally applicable and allows for the identification of parameters in real-size silos as well.

The paper proposes a new method for general 3D measurement and 3D point reconstruction. The method explicitly aims at practical applications: low technical expense and minimal user interaction, a clear separation of the problem into steps that are solved by simple mathematical methods (direct, stable and optimal in the least-squares sense), and scalability. The method expects as inputs the internal and radial distortion parameters of the camera(s) used, and a plane quadrangle with known geometry within the scene. At first, for each single picture the 3D position of the reference quadrangle (with respect to each camera coordinate frame) is calculated. These 3D reconstructions of the reference quadrangle are then used to obtain the relative external parameters of each camera with respect to the first one. With known external parameters, triangulation is finally possible. The differences from other known procedures are outlined, with attention to the stable mathematical methods (no use of nonlinear optimization) combined with low user interaction and good results.

Many researchers are working on developing robots into adequate partners, be it at the workplace, at home or in leisure activities, or on enabling elderly persons to lead a self-determined, independent life. While quite some progress has been made in, e.g., understanding, processing and expressing speech or emotion, the relations between humans and robots are usually only short-term. In order to build long-term, i.e. social, relations, qualities like empathy, trust building, dependability, non-patronizing behavior, and others will be required. But these are just terms and as such no adequate starting points for "programming" these capacities, let alone for avoiding the problems and pitfalls in interactions between humans and robots. However, a rich source for doing this is available, until now unused for this purpose: artistic productions, namely literature, theater plays, not to forget operas, and films, with their multitude of examples. Poets, writers, dramatists, screenwriters, etc. have studied for centuries the facets of interactions between persons, their dynamics, and the related snags. And since we wish for human-robot relations to be master-servant relations - the human obviously being the master - the study of these relations will be prominent. A procedure with four consecutive steps is proposed, namely Selection, Analysis, Categorization, and Integration. Only if we succeed in developing robots which are seen as servants will we be successful in supporting and helping humans through robots.

Since the 1990s the Pascal matrix, its generalizations and applications have been the focus of a great number of publications. As is well known, the Pascal matrix, the symmetric Pascal matrix and other special matrices of Pascal type play an important role in many scientific areas, among them numerical analysis, combinatorics, number theory, probability, image processing, signal processing, electrical engineering, etc. We present a unified approach to matrix representations of special polynomials in several hypercomplex variables (new Bernoulli, Euler etc. polynomials), extending the results of H. Malonek, G. Tomaz: Bernoulli polynomials and Pascal matrices in the context of Clifford Analysis, Discrete Appl. Math. 157(4) (2009) 838-847. The hypercomplex version of a new Pascal matrix with block structure, which resembles the ordinary one for polynomials of one variable, will be discussed in detail.
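For reference, the ordinary (lower triangular) Pascal matrix and its symmetric counterpart mentioned above are standard objects and can be generated directly; the hypercomplex block-structured version discussed in the paper is not reproduced here:

```python
# The ordinary lower triangular Pascal matrix P with P[i][j] = C(i, j),
# and the symmetric Pascal matrix S = P P^T with S[i][j] = C(i+j, i).
from math import comb

def pascal(n):
    # lower triangular Pascal matrix of order n
    return [[comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def symmetric_pascal(n):
    # S = P P^T, using the Vandermonde identity
    # sum_k C(i,k) C(j,k) = C(i+j, i)
    P = pascal(n)
    return [[sum(P[i][k] * P[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]
```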