Events

Wednesday, July 24, 2024, 2:15 pm (WIAS-ESH)
Berlin Oberseminar "Nonlinear Partial Differential Equations" (Langenbach Seminar)
Prof. Dr. Emil Wiedemann, Friedrich-Alexander-Universität Erlangen-Nürnberg:
Measure-valued solutions in fluid dynamics
Venue
Weierstraß-Institut, Mohrenstr. 39, 10117 Berlin, ground floor, Erhard-Schmidt-Hörsaal

Abstract
As more and more ill-posedness results have been established for fluid PDEs (not only by convex integration!), the idea of solving the Cauchy problem by some unique weak or entropy solution has become questionable. Instead, non-deterministic solution concepts, such as measure-valued or statistical solutions, have sparked much recent research interest. They also seem to be more in line with well-known theories of turbulence, which are typically statistical. I will give an overview of measure-valued solution concepts, including their weak-strong stability, their relation to more conventional solutions, and questions of existence. Links to other notions of "very weak" solution (dissipative solutions, subsolutions, energy-variational solutions) will be discussed briefly.
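For orientation, the measure-valued solution concept can be sketched in a standard Young-measure formulation of the incompressible Euler equations (an illustrative textbook form, omitting concentration effects, not taken from the talk itself):

```latex
% A parametrized probability measure \nu_{t,x} on velocity space
% replaces the pointwise velocity field; the equations hold for the
% barycentric averages in the sense of distributions:
\partial_t \langle \nu_{t,x}, \xi \rangle
  + \operatorname{div}_x \langle \nu_{t,x}, \xi \otimes \xi \rangle
  + \nabla_x p = 0,
\qquad
\operatorname{div}_x \langle \nu_{t,x}, \xi \rangle = 0.
```

Classical solutions correspond to the atomic case $\nu_{t,x} = \delta_{u(t,x)}$, which recovers the usual Euler equations for $u$.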

Further information
Oberseminar "Nonlinear Partial Differential Equations" (Langenbach Seminar)

Organizers
Humboldt-Universität zu Berlin
WIAS Berlin
Tuesday, July 23, 2024, 3:00 pm (WIAS-405-406)
Seminar Modern Methods in Applied Stochastics and Nonparametric Statistics
Yuanyuan Li, Fudan University, China:
Function and derivative approximation by shallow neural networks
Venue
Weierstraß-Institut, Mohrenstr. 39, 10117 Berlin, 4th floor, Room 405/406

Abstract
We investigate a Tikhonov regularization scheme specifically tailored for shallow neural networks in the context of a classic inverse problem: approximating an unknown function and its derivatives on a bounded domain from noisy measurements. The penalty term is based on one of three distinct yet closely related network (semi)norms: the extended Barron norm, the variation norm, and the Radon-BV seminorm; the appropriate choice depends on the architecture of the network being used. We establish connections between these network norms, tracing in particular their dependence on the dimensionality index, in order to clarify how the norms interplay. We revisit the universality of function approximation with respect to the various norms, carry out a rigorous error-bound analysis for the Tikhonov regularization scheme, and make the dependence on the dimensionality index explicit, providing a clearer understanding of how dimensionality affects approximation performance.
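Schematically, a Tikhonov regularization scheme of the kind described can be written as follows (notation is illustrative and not taken from the talk):

```latex
% Regularized fit over shallow networks f_\theta, given noisy samples (x_i, y_i^\delta):
\min_{\theta}\;
\frac{1}{n} \sum_{i=1}^{n} \bigl( f_\theta(x_i) - y_i^{\delta} \bigr)^2
  \;+\; \lambda\, \| f_\theta \|_{\mathcal{N}}^{2},
```

where $\|\cdot\|_{\mathcal{N}}$ stands for one of the three network (semi)norms mentioned above (extended Barron norm, variation norm, or Radon-BV seminorm), and $\lambda > 0$ balances data fidelity against network complexity.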

Further information
This talk will also take place via Zoom: https://zoom.us/j/492088715

Organizer
WIAS Berlin