Abstract: The calculation of self-consistent shear viscous corrections closely follows the standard determination of shear viscosity from covariant Boltzmann kinetic theory. The collision term is linearized in the deviation from local thermal equilibrium for each particle species, while the free streaming term is taken at zeroth order in the deviation, i.e., p · ∂f → p · ∂feq. This leads to an integral equation for the shear corrections. It can be shown that for each species i the corrections can be reduced to a one-dimensional function χi(|p|) that only depends on the magnitude of the particle momentum in the fluid rest frame. The integral equation turns out to be equivalent to a variational problem for a functional that is quadratic in the χi, which then becomes a linear algebra maximizatio...
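The variational structure described above can be written schematically as follows; the notation here is our own assumption, not taken verbatim from the project:

```latex
% Schematic shear ansatz for species i (notation assumed):
\delta f_i \;=\; f_i^{\mathrm{eq}}\,\chi_i(|p|)\,
                 \hat p^{\langle\mu}\hat p^{\nu\rangle}\,\sigma_{\mu\nu}
% The linearized collision operator C and source S yield the integral equation
\mathcal{C}[\chi]_i \;=\; S_i \,,
% which is the stationarity condition of the quadratic functional
Q[\chi] \;=\; (\chi, S) \;-\; \tfrac{1}{2}\,\bigl(\chi, \mathcal{C}\,\chi\bigr)\,.
```

Expanding each χi in a finite set of basis functions, χi = Σk ck φk, turns the maximization of Q into the linear system Σl Ckl cl = Sk, which is the linear-algebra maximization the abstract refers to.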
László Ábel Somlai (2016.04.01 - 2016.07.01)
Supervisor: Mátyás Vasúth
Abstract: In the year 2015, the first gravitational-wave signal (GW150914) was detected by the LIGO-Virgo collaboration. Its waveform, emanating from the inward spiral and merger of a pair of black holes, matched the predictions of general relativity. The continuous development of the detectors gives the opportunity to detect more subtle effects, one of which is the cosmological constant. It therefore seems reasonable to determine its effect on the known waveforms in order to extend the set of searched parameters.
István Csabai PhD (2018.08.01-2018.12.31)
ELTE, The Department of Physics of Complex Systems
Abstract: Although the LCDM model has achieved remarkable success, in recent years the accuracy of the measurements has reached the limit where parameter estimates from various observations, such as the Hubble constant determined from both the CMB and supernovae, are incompatible with it. Recently we have developed a model, based on N-body simulations, which is able to resolve this tension by taking better account of complex structure formation and without introducing dark energy. During the project we will develop a new type of N-body simulation algorithm, "StePS", that overcomes limitations of current methods by mapping the infinite spatial extent of the universe onto a compact manifold. Specifically, we use stereographic projection onto the surface of a four-dimensional sphere. The discretization of this surface leads to a systematic multi-resolution simulation with unprecedented dynamic range for given computational resources and perfect consistency with the Newtonian force law. Our approach retains the best features of multipole solvers and AMR simulations through a continuous, mathematically consistent refinement of scales toward the center of the simulations and constant angular resolution of distant fluctuations. The algorithm is ideal for GPUs, harnessing a recent cost-effective numerical hardware revolution. A prototype of our algorithm has been successfully tested against GADGET; an early version of the code is open source and available on GitHub, and the paper on the preliminary results, Rácz et al. (2018), has been submitted to MNRAS.
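The stereographic mapping mentioned above can be sketched as follows; this is a minimal unit-radius illustration of the idea, not the actual StePS code, which may use a different scaling:

```python
import numpy as np

def to_sphere(x):
    """Stereographic projection R^3 -> unit S^3 in R^4 (projected from the north pole)."""
    r2 = np.dot(x, x)
    return np.append(2.0 * x / (1.0 + r2), (r2 - 1.0) / (1.0 + r2))

def to_plane(X):
    """Inverse projection S^3 -> R^3 (undefined at the north pole itself)."""
    return X[:3] / (1.0 - X[3])

# A point in flat space, its image on the 3-sphere, and the round trip back
x = np.array([0.3, -1.2, 2.0])
X = to_sphere(x)
```

Distant regions of R^3 all map close to the north pole, which is what gives the method its naturally decreasing resolution for far-away fluctuations.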
Abstract: When modeling a heavy ion collision, hydrodynamics is not applicable in the regime where deviations from local thermal equilibrium are not guaranteed to be small. Further, the fact that experiments detect particles warrants a switch from a fluid dynamical picture to a particle picture. One approach to modeling particles with non-equilibrium dynamics utilizes the relativistic Boltzmann Transport Equation (BTE). The BTE describes the evolution of particle phase space distributions via collision terms for various scattering processes. While the elastic 2 → 2 collision terms are useful to study the approach to thermal equilibrium, the radiative 2 ↔ 3 collision terms are necessary to study the approach to chemical equilibrium in systems with changing particle number.
Peter D. Anderson
Publication: Loop equations and bootstrap methods in the lattice
Abstract: Monte Carlo lattice simulations of pure Yang-Mills theory, such as the one proposed here, have been shown to be ideally suited for GPU computations since all changes in the action are local (as opposed to dynamical fermions that are non-local in the lattice simulation). The action is written in terms of link variables and only depends on nearest neighbors. Thus any lattice site with even (or odd) parity can be run simultaneously. The link is an element of the gauge group SU(NC) and acts as a parallel transporter from one site to the next site. A Wilson Loop is defined as the trace of a product of links associated with a closed path in the lattice. Its expectation value is measured by averaging over a large number of statistically independent configurations that are obtained in the simulation...
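As a toy illustration of the Wilson-loop definition above, the sketch below puts random SU(2) link variables on a tiny 2D periodic lattice and evaluates the trace of the smallest closed loop (a 1×1 plaquette). All names and conventions are assumptions for illustration; a production code works in 4D and in SU(NC):

```python
import numpy as np

rng = np.random.default_rng(42)

def random_su2():
    """Random SU(2) matrix a0*I + i*(a . sigma) with |a| = 1 (unit determinant)."""
    a = rng.normal(size=4)
    a /= np.linalg.norm(a)
    return np.array([[a[0] + 1j * a[3],  a[2] + 1j * a[1]],
                     [-a[2] + 1j * a[1], a[0] - 1j * a[3]]])

L = 4  # lattice extent
# links[mu, x, y]: parallel transporter from site (x, y) in direction mu (0=x, 1=y)
links = np.array([[[random_su2() for _ in range(L)] for _ in range(L)]
                  for _ in range(2)])

def plaquette(x, y):
    """Trace of the 1x1 Wilson loop based at site (x, y); reversed links enter daggered."""
    U1 = links[0, x, y]
    U2 = links[1, (x + 1) % L, y]
    U3 = links[0, x, (y + 1) % L]
    U4 = links[1, x, y]
    return np.trace(U1 @ U2 @ U3.conj().T @ U4.conj().T)
```

The even/odd parallel-update scheme mentioned in the abstract works because each plaquette only ever couples a link to its nearest neighbors.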
PhD. Gábor Marschalkó (2016.03.01. - 2016.09.30.)
Supervisor: Emese Forgács-Dajka
Abstract: To study the light variations of eclipsing binaries one needs to create an ensemble of model light curves. Since these models contain numerous parameters, solving this problem requires significant computing resources, so parallelization is an obvious approach. During this project we would like to parallelize our code, initially determining the orbits from the Kepler equation using the Newton-Raphson method and computing the surface light intensity without feedback effects (e.g. light reflection).
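The Kepler-equation step mentioned above can be sketched with a standard textbook Newton-Raphson iteration (a generic version, not the project's actual code):

```python
import math

def eccentric_anomaly(M, e, tol=1e-12, max_iter=50):
    """Solve Kepler's equation E - e*sin(E) = M for the eccentric anomaly E."""
    E = M if e < 0.8 else math.pi  # common starting guess
    for _ in range(max_iter):
        # Newton-Raphson step for f(E) = E - e*sin(E) - M, f'(E) = 1 - e*cos(E)
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

E = eccentric_anomaly(M=1.0, e=0.3)
```

This per-orbit-point solve is embarrassingly parallel, which is what makes the problem a natural fit for GPUs.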
János Sztakovics (2016.03.01. - 2016.08.31.)
Supervisors: Emese Forgács-Dajka, Tamás Borkovits
Abstract: About 50% of stars are part of binary or multiple systems. Investigating the components of these systems allows us to determine some of their physical parameters. By observing eclipsing binaries we can, among other things, calculate these parameters in more detail. Analyzing and modelling the light curves of eclipsing binaries is important for getting a clearer picture of the individual systems (components, orbits, etc.). To determine eccentricities and arguments of periastron we have to measure the duration of the eclipses and the phase of the secondary minimum relative to the primary minimum. I would like to process the huge amount of data from the space missions with fast and precise algorithms. The goal is to further develop the single-thread program in C and CUDA to measure th...
Tamás Hajdu (2016.03.01. - 2016.08.31.)
Supervisors: Emese Forgács-Dajka, Tamás Borkovits
Abstract: More than half of the stars around us are part of a binary or multiple system, therefore their observation and examination plays a major role in the development of star formation and stellar evolution models. Thanks to today's accurate photometric measurements, many effects can be detected from eclipse timing variations, such as the light-travel-time effect, apsidal motion and dynamical effects. During my work I will use the Kepler and K2 databases. To determine the time of each eclipse I will use Monte Carlo and bootstrap methods. With these methods I will get much more accurate O-C data than before. I will use a parallel programming architecture, based on my previous C code, to reduce the running time. The results will be screened to collect those which indicate the presence...
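The bootstrap idea for an eclipse mid-time can be sketched as follows: resample the measured timings with replacement and take the spread of the resampled estimates as the uncertainty. The data here are synthetic and the estimator is deliberately simplified; the project's actual mid-time estimator is more involved:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic repeated measurements of one eclipse mid-time (days), sigma = 0.002 d
times = 2455000.5 + rng.normal(0.0, 0.002, size=40)

def bootstrap_midtime(t, n_boot=2000):
    """Bootstrap estimate of the mid-time and its 1-sigma uncertainty."""
    idx = rng.integers(0, len(t), size=(n_boot, len(t)))  # resample with replacement
    means = t[idx].mean(axis=1)
    return means.mean(), means.std(ddof=1)

mid, err = bootstrap_midtime(times)
```

The resulting err is what feeds into the O-C diagram as a per-eclipse error bar, and each bootstrap replica is independent, so the loop parallelizes trivially.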
Ernő Dávid, Dávid El-Saig and Zoltán Lehóczky (2019.08.31 - 2019.12.31)
Wigner RCP and Lombiq Technologies Ltd. cooperation
Abstract: Hastlayer by Lombiq Technologies allows software developers of the .NET platform to utilize FPGAs as compute accelerators. It converts standard .NET constructs into equivalent hardware implementations, automatically enhancing the performance while lowering the power consumption of suitable algorithms. Developers keep writing .NET programs as usual, no hardware design knowledge is required.
Hastlayer needs to support each FPGA board specifically, and formerly it supported only one board, suitable for testing and proofs of concept but not for high-performance computing scenarios. The work ongoing in collaboration with Wigner RCP is about making it support the high-performance FPGAs of the Microsoft Catapult platform. Wigner's task is to create the FPGA-side hardware framework that hosts the automatically generated hardware cores produced by Hastlayer.
Tuan Máté Nguyen (2016.09.01 - 2016.10.31)
Supervisor: Gergely Gábor Barnaföldi
Abstract: The Hough transform is a frequent data analysis and pattern recognition task. It can be used to detect lines in noisy data, or to decide whether a set of points lies on a single line or on multiple lines and, if so, what the parameters of those lines are. The principle of operation is that the incoming discretized values (bins or pixels) are mapped to lines, which in turn are drawn on an image, and we then select the points where the most lines intersect each other. The coordinates of these crossing points can be transformed into the slope and intercept values of the line passing through the points. If the input values have errors, we need to generalize the algorithm by drawing stripes instead of thin lines. The transformation is in itself computationally intensive, but if we'd like to process lots of data with i...
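The accumulation step described above can be sketched for the line case using the (θ, ρ) parametrization ρ = x·cosθ + y·sinθ; the bin sizes below are arbitrary choices for the illustration:

```python
import math
from collections import Counter

def hough_lines(points, n_theta=180, d_rho=0.1):
    """Vote in (theta, rho) space; returns a Counter of (theta_bin, rho_bin) votes."""
    acc = Counter()
    for x, y in points:
        for ti in range(n_theta):          # theta in 1-degree steps
            theta = math.pi * ti / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(ti, round(rho / d_rho))] += 1
    return acc

# Four points on the line y = 2*x + 1
points = [(0, 1), (1, 3), (2, 5), (3, 7)]
acc = hough_lines(points)
(theta_bin, rho_bin), votes = acc.most_common(1)[0]
```

Every point votes once per θ bin, so the inner loop maps directly onto one GPU thread per (point, θ) pair with atomic increments into the accumulator.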
János Endre Maróti (2016.04.01 - 2016.07.01)
Supervisor: Mátyás Vasúth
Abstract: We can detect gravitational waves from binary systems only before the merger. The development of the detectors gives the opportunity for more subtle detections. Based on a binary system's signal and the detector's signal-to-noise ratio, we can determine which parameters of the binary system are detectable at a specific time before the merger.
Balázs Kacskovics (2016.04.01 - 2016.07.01)
Supervisor: Mátyás Vasúth
Abstract: The supermassive binary black hole system OJ 287 gives a unique possibility to examine gravitational effects to high accuracy. The quasiperiodic light variations of this object have been observed for more than one century. Based on the times of these outbursts, the orbital elements and other parameters of the system were determined to high accuracy. Using recent assumptions on the spins of the components, we study the effects of spin contributions to the orbit up to 3.5 post-Newtonian order.
Márton Vargyas (2016.09.01-2018.08.31)
Abstract: The task of the Budapest Advanced Quality Assurance (QA-A) Centre is to test and classify the Gas Electron Multiplier (GEM) foils that will be an integral part of the upgrade of ALICE's Time Projection Chamber (TPC) detector. To classify the foils we use the known correlation between their hole size and electrical properties. We take high-definition images of the foils in a clean room equipped with an X-Y-Z robot and a telecentric lens (the images of a foil's two sides can take up to 50 GB), then we recognize the holes with GPU-accelerated software, which identifies every hole. Then we decide the fate of the foil: either it is built into the detector, allowing continuous readout and effectively making it a 3D camera, or it is returned to the manufacturer.