Tuesday, July 19th, 2022. Time: 12 noon
Blended Learning - O.C. Zienkiewicz Conference Room, C1 Building, UPC Campus Nord, Barcelona - Link for online session: https://meet.google.com/qjo-sttx-dgo
This 'yearly report'-like talk concentrates on a few of the algorithmic advances that have taken place over the last year:
a) Finding Isolated Domains
b) Computing Viewfactors
c) Adjoint-Based Sensitivities to Boundary Conditions
d) Bias Ordering for Deep Neural Nets
---------------------------------------------------------------------------------
a) Finding Isolated Domains
Many applications where the surface geometry may not be topologically consistent or clean require the use of embedded or immersed methods. Some of these problems present many (possibly thousands of) objects/fragments embedded in the mesh. These in turn may produce isolated domains (pieces, islands) with disconnected flowfields.
The unknowns in these isolated regions need to be reset in order to avoid instabilities. The isolated regions change every timestep, so a fast, scalable (OpenMP, MPI) algorithm needs to be found.
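A minimal serial sketch of the underlying idea (connected-component labeling on the element adjacency graph; the input names `neighbors`, `active` and `seeds` are assumptions for illustration, and the scalable OpenMP/MPI version discussed in the talk is not shown here):

```python
from collections import deque

def find_isolated_elements(neighbors, active, seeds):
    """Return the elements belonging to isolated domains.

    neighbors : list of adjacency lists (element -> neighboring elements)
    active    : per-element flags (False = blanked by an embedded object)
    seeds     : elements known to lie in the 'main' flow region
    """
    label = [-1] * len(neighbors)
    comp = 0
    # Breadth-first flood fill: label each connected component of active elements
    for start in range(len(neighbors)):
        if not active[start] or label[start] != -1:
            continue
        queue = deque([start])
        label[start] = comp
        while queue:
            e = queue.popleft()
            for nb in neighbors[e]:
                if active[nb] and label[nb] == -1:
                    label[nb] = comp
                    queue.append(nb)
        comp += 1
    # Components not reachable from any seed are isolated islands
    connected = {label[s] for s in seeds if active[s]}
    return [e for e in range(len(neighbors))
            if active[e] and label[e] not in connected]
```

The unknowns of the returned elements would then be reset every timestep.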
b) Computing Viewfactors
Coupling fluid, thermal and radiation codes in order to predict the effect of heat waves in cities has become the focus of much activity worldwide. So-called viewfactors are required for diffuse radiation. For N faces, the brute-force algorithmic complexity is of O(N^3). Given that N is in the millions for a city, it is imperative to obtain better algorithms. Octree- and mesh-based techniques will be discussed and compared.
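To illustrate where the cost comes from, here is a brute-force patch-to-patch viewfactor sketch (the inputs are hypothetical; the occlusion test against all other faces, which supplies the third factor of N, is omitted, so this loop alone is O(N^2)):

```python
import numpy as np

def viewfactor_matrix(centroids, normals, areas):
    """Small-patch viewfactor approximation F_ij ~ cos(t_i) cos(t_j) A_j / (pi r^2).

    No visibility/occlusion testing: checking each pair against the
    remaining N faces would raise the complexity from O(N^2) to O(N^3).
    """
    n = len(centroids)
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = centroids[j] - centroids[i]
            d2 = r @ r
            cos_i = (normals[i] @ r) / np.sqrt(d2)    # angle at emitter
            cos_j = (-normals[j] @ r) / np.sqrt(d2)   # angle at receiver
            if cos_i > 0.0 and cos_j > 0.0:           # patches must face each other
                F[i, j] = cos_i * cos_j * areas[j] / (np.pi * d2)
    return F
```

For millions of city faces, even the O(N^2) pair loop is prohibitive, which motivates the octree- and mesh-based alternatives of the talk.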
c) Adjoint-Based Sensitivities to Boundary Conditions
The analysis of haemodynamic phenomena and their clinical relevance via computational mechanics (fluids, solids, ...) is now common in research and development. Yet a recurring question has been the influence of boundary conditions and geometry on "clinically relevant measures". Adjoints offer an appealing way to answer this question. Some interesting theoretical results have been obtained as part of this effort.
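In generic terms (a standard discrete-adjoint sketch, not necessarily the exact formulation of this work), with state u, boundary data g, residual R(u, g) = 0 and a clinically relevant measure J(u, g), the sensitivity follows from one adjoint solve:

```latex
% Adjoint equation for the multiplier \lambda:
\left(\frac{\partial R}{\partial u}\right)^{T} \lambda
  = \left(\frac{\partial J}{\partial u}\right)^{T}
% Sensitivity of J with respect to the boundary data g:
\frac{\mathrm{d}J}{\mathrm{d}g}
  = \frac{\partial J}{\partial g}
  - \lambda^{T}\,\frac{\partial R}{\partial g}
```

One adjoint solve thus yields the sensitivity of a single measure J to all boundary data at once, which is what makes the approach appealing here.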
d) Bias Ordering for Deep Neural Nets
No talk in the year 2022 would be complete without some mention of deep neural nets, the current hammer looking for all possible nails. Yet even in the simple task of optimizing weights and biases, one observation seems to have escaped notice: a neural net with a layer of N neurons has N! equally optimal solutions, since permuting the neurons of a layer (together with their weights) leaves the network unchanged.
A bias ordering technique has been developed to reduce this to O(1), making it much easier for optimizers to converge to an optimal solution.
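A sketch of one way such a bias ordering could work (the function name and single-hidden-layer layout are assumptions for illustration, not necessarily the developed technique): sorting the neurons of a hidden layer by bias, and permuting the adjacent weight matrices to match, leaves the network function unchanged while selecting one canonical representative of the N! equivalent optima.

```python
import numpy as np

def order_by_bias(W_in, b, W_out):
    """Canonicalize one hidden layer of a net y = W_out @ act(W_in @ x + b).

    Sorts the hidden neurons by ascending bias and applies the same
    permutation to the incoming rows of W_in and outgoing columns of
    W_out, so the network output is unchanged.
    """
    perm = np.argsort(b)
    return W_in[perm, :], b[perm], W_out[:, perm]
```

For a one-hidden-layer net y = W_out @ tanh(W_in @ x + b), the output before and after reordering is identical, but only the bias-sorted member of each permutation family survives, which is what eases convergence for the optimizer.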
Rainald Lohner is the head of the CFD Center at George Mason University in Fairfax, VA, on the outskirts of Washington, D.C.
He received an MSc in Mechanical Engineering from the Technische Universitaet Braunschweig, Germany, as well as a PhD and DSc in Civil Engineering from the University College of Swansea, Wales.
His areas of interest include numerical methods, solvers, grid generation, parallel computing, visualization, pre-processing, fluid-structure interaction, shape and process optimization and computational crowd dynamics.
His codes and methods have been applied in many fields, including aerodynamics of airplanes, cars and trains,
hydrodynamics of ships, submarines and UAVs, shock-structure interaction, dispersion analysis in urban areas and the built environment, haemodynamics of vascular diseases and pedestrian safety assessments.
He is the author of more than 800 articles covering the fields enumerated above, as well as a textbook on Applied CFD Techniques.