Introduction
Function operators are ubiquitous in mathematics and physics: they are used to describe the dynamics of physical systems, such as the Navier-Stokes equations in fluid dynamics. As the solutions of these systems are functions, it is natural to transfer the concept of function mappings to machine learning.
Operators
In mathematics, operators are function mappings: they map functions to functions. Let

$$U = \{ u: X \to \mathbb{R}^{d_u} \}$$

be a set of functions that map a domain $X \subset \mathbb{R}^{d_x}$ to $\mathbb{R}^{d_u}$, and let

$$V = \{ v: Y \to \mathbb{R}^{d_v} \}$$

be a set of functions that map a domain $Y \subset \mathbb{R}^{d_y}$ to $\mathbb{R}^{d_v}$. An operator

$$G: U \to V$$

maps functions $u \in U$ to functions $G(u) \in V$.
Example

The differentiation operator $G(u) = \partial_x u$ maps a function $u$ to its derivative $\partial_x u$.
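As a quick numerical illustration (a standalone sketch, not part of continuiti's API), a differentiation operator can be approximated on a grid of function evaluations using finite differences:

```python
import numpy as np

# Discretized differentiation operator: maps evaluations u(x_i) of a
# function on a grid to approximate derivative values G(u)(x_i) via
# finite differences.
x = np.linspace(0.0, 2 * np.pi, 200)
u = np.sin(x)              # evaluations of the input function u
v = np.gradient(u, x)      # G(u) = du/dx, approximately cos(x)

# The pointwise error shrinks as the grid is refined.
print(np.max(np.abs(v - np.cos(x))))
```

This already hints at the discretization idea used below: the operator never sees the function $u$ itself, only its evaluations at a finite set of points.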
Learning Operators
Operator learning is the task of learning the mapping $G$ (or an approximation of it) from a set of observed input-output pairs $(u_k, G(u_k))$.
Discretization
In continuiti, we use the general approach of mapping function evaluations to represent both input and output functions.

Let $x_i$, $i = 1, \dots, N$, be `num_sensors` many sensor positions (or collocation points) in the input domain $X$, and let $u(x_i)$ be the corresponding evaluations of the input function. The mapped function $v = G(u)$ is represented by its values $v(y_j)$ at `num_evaluations` many evaluation points (or query points) $y_j$ in the output domain $Y$.
In Python, we write the operator call as `v = operator(x, u, y)` with tensors of shapes

x: (batch_size, x_dim, num_sensors...)
u: (batch_size, u_dim, num_sensors...)
y: (batch_size, y_dim, num_evaluations...)
v: (batch_size, v_dim, num_evaluations...)
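To make the shape convention concrete, here is a minimal sketch with a dummy `operator` (a hypothetical stand-in, not a continuiti class) that simply broadcasts the mean sensor value of each batch element to every query point:

```python
import numpy as np

batch_size, num_sensors, num_evaluations = 32, 16, 100
x_dim = u_dim = y_dim = v_dim = 1

def operator(x, u, y):
    # Dummy stand-in for a neural operator with the (x, u, y) -> v
    # calling convention: broadcast the mean sensor value of u to
    # every evaluation point in y.
    v = u.mean(axis=-1, keepdims=True)   # (batch_size, u_dim, 1)
    return np.broadcast_to(v, (u.shape[0], v_dim, y.shape[-1]))

x = np.random.rand(batch_size, x_dim, num_sensors)      # sensor positions
u = np.sin(x)                                           # evaluations u(x_i)
y = np.random.rand(batch_size, y_dim, num_evaluations)  # query points

v = operator(x, u, y)
print(v.shape)  # (32, 1, 100)
```

Note that `x` and `u` share the `num_sensors` axis, while `y` and `v` share the `num_evaluations` axis; the two point sets are independent of each other.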
This is the most general interface for implementing operators, as neural operator architectures differ in how they handle input and output values.
Wrapping
For convenience, the call can be wrapped to mimic the mathematical syntax $v = G(u)(y)$. For instance, for a fixed set of collocation points `x`, we could define `G = lambda u: lambda y: operator(x, u, y)` and then write `v = G(u)(y)`.
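One way to sketch such a wrapper (with a hypothetical dummy `operator` for illustration, since any callable with the `(x, u, y)` signature works):

```python
import numpy as np

def operator(x, u, y):
    # Hypothetical dummy operator with the (x, u, y) -> v convention:
    # linearly interpolate the sensor values u(x_i) at the query points y.
    return np.interp(y, x, u)

def wrap(operator, x):
    # Fix the collocation points x so that the wrapped operator can be
    # called like the mathematical object: v = G(u)(y).
    def G(u):
        return lambda y: operator(x, u, y)
    return G

x = np.linspace(0.0, 1.0, 10)   # fixed collocation points
u = x**2                        # evaluations of u(x) = x^2
G = wrap(operator, x)
v = G(u)(np.array([0.25, 0.5, 0.75]))
print(v)  # approximately [0.0625, 0.25, 0.5625]
```

The closure over `x` keeps the discretization details out of the call site, so downstream code reads like the mathematical notation $G(u)(y)$.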
Applications
Neural operators extend the concept of neural networks to function mappings, which enables discretization-invariant and mesh-free mappings of data with applications to physics-informed training, super-resolution, and more. See our Examples section for more on this.
Further Reading
Follow our introduction to Functions in continuiti and proceed with the Training example to learn more about operator learning in continuiti.
Created: 2023-12-04