Amanda Howard: Multifidelity, domain decomposition, and stacking for improving training for physics-informed networks
14 February 2025, 12:30 to 13:15 - Location: LB01.170
Physics-informed neural networks and operator networks have shown promise for effectively solving equations that model physical systems. However, these networks can be difficult or impossible to train accurately for some systems of equations. One way to improve training is to use a small amount of data; however, such data is expensive to produce. We will introduce our novel multifidelity framework for stacking physics-informed neural networks and operator networks that facilitates training by progressively reducing the errors in our predictions when no data is available. In stacking networks, we successively build a chain of networks, where the output at one step can act as a low-fidelity input for training the next step, gradually increasing the expressivity of the learned model. Finally, we will discuss the extension to domain decomposition using the finite basis method, including applications to the newly developed Kolmogorov-Arnold Networks.
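The abstract only sketches the stacking idea in words. Purely as an illustration, and not the speaker's implementation, the following is a minimal JAX sketch of that general scheme under simple assumptions: each stage takes the previous stage's prediction as a low-fidelity input and learns an additive correction, trained with a physics-informed loss on a toy ODE du/dt = -u with u(0) = 1. All function names, network sizes, and the example problem are hypothetical choices made for this sketch.

```python
import jax
import jax.numpy as jnp


def init_mlp(key, sizes):
    """Initialize a small MLP as a list of (W, b) pairs."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (d_in, d_out)) * jnp.sqrt(2.0 / d_in)
        params.append((W, jnp.zeros(d_out)))
    return params


def mlp(params, x):
    """Forward pass for a single input vector x, returning a scalar."""
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b)[0]


def stage_prediction(params, t, u_low):
    """One stacking stage: the network sees (t, low-fidelity u) and adds a correction."""
    return u_low + mlp(params, jnp.stack([t, u_low]))


def pinn_loss(params, u_low_fn, ts):
    """Physics-informed loss for the toy ODE du/dt = -u with u(0) = 1."""
    def u(t):
        return stage_prediction(params, t, u_low_fn(t))

    residual = jax.vmap(jax.grad(u))(ts) + jax.vmap(u)(ts)  # du/dt + u = 0
    ic = (u(0.0) - 1.0) ** 2                                 # initial condition
    return jnp.mean(residual ** 2) + ic


def train_stage(key, u_low_fn, ts, steps=2000, lr=1e-2):
    """Train one stage against the physics residual (plain gradient descent)."""
    params = init_mlp(key, [2, 16, 16, 1])
    grad_fn = jax.jit(jax.grad(pinn_loss), static_argnums=1)
    for _ in range(steps):
        grads = grad_fn(params, u_low_fn, ts)
        params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params


# Build the stack: start from a crude low-fidelity guess and refine it stage by stage.
ts = jnp.linspace(0.0, 1.0, 64)
key = jax.random.PRNGKey(0)
u_low_fn = lambda t: 1.0 - t  # stage-0 guess: a crude linearization of exp(-t)

for _ in range(2):
    key, sub = jax.random.split(key)
    params = train_stage(sub, u_low_fn, ts)
    # The trained stage becomes the low-fidelity input for the next stage.
    u_low_fn = (lambda p, prev: (lambda t: stage_prediction(p, t, prev(t))))(params, u_low_fn)

print(jax.vmap(u_low_fn)(ts)[:5])  # compare against the exact solution exp(-t)
```

In this toy setup, earlier stages stay frozen and only the newest correction is trained, so each optimization problem stays small; that is the intuition behind the progressive error reduction the abstract describes, though the talk's actual framework and training details may differ.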