
Introduction

Function operators are ubiquitous in mathematics and physics: they are used to describe the dynamics of physical systems, for instance via the Navier-Stokes equations in fluid dynamics. As the solutions of such systems are functions, it is natural to transfer the concept of function mappings to machine learning.

Operators

In mathematics, operators are function mappings: they map functions to functions. Let

$$U = \{ u : X \subset \mathbb{R}^{d_x} \to \mathbb{R}^{d_u} \}$$

be a set of functions that map a $d_x$-dimensional input to a $d_u$-dimensional output, and

$$V = \{ v : Y \subset \mathbb{R}^{d_y} \to \mathbb{R}^{d_v} \}$$

be a set of functions that map a $d_y$-dimensional input to a $d_v$-dimensional output.

An operator

$$G : U \to V, \quad u \mapsto v,$$

maps functions $u \in U$ to functions $v \in V$.

Example

The operator $G(u) = \partial_x u$ maps functions $u$ to their partial derivative $\partial_x u$.
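For instance, applied to $u(x) = \sin(x)$, this operator yields the function $G(u)(x) = \cos(x)$.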

Learning Operators

Operator learning is the task of learning the mapping $G$ from data. In the context of neural networks, we want to train a neural network $G_\theta$ with parameters $\theta$ that, given a set of input-output pairs $(u_k, v_k) \in U \times V$, maps $u_k$ to $v_k$. We generally refer to such a neural network $G_\theta$ as a neural operator.
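One common way (though not the only one) to make this precise is to minimize an empirical loss over the given pairs, for example the mean squared error

$$\min_\theta \frac{1}{K} \sum_{k=1}^{K} \lVert G_\theta(u_k) - v_k \rVert^2,$$

where, in practice, the norm is approximated from function evaluations at a finite set of points, as described in the Discretization section below.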

Discretization

In continuiti, we use the general approach of mapping function evaluations: both the input function $u$ and the output function $v$ are represented in discretized form by their evaluations at a finite set of points.

Let $x_i \in X$ be num_sensors many sensor positions (or collocation points) in the input domain $X$ of $u$. We represent the function $u$ by its evaluations at these sensors and write $\mathbf{x} = (x_i)_i$ and $\mathbf{u} = (u(x_i))_i$. This finite-dimensional representation is fed into the neural operator.

The mapped function $v = G(u)$, on the other hand, is also represented by function evaluations only. Let $y_j \in Y$ be num_evaluations many evaluation points (or query points) in the domain $Y$ of $v$ and $\mathbf{y} = (y_j)_j$. Then, the output values $\mathbf{v} = (v(y_j))_j$ are approximated by the neural operator:

$$v(\mathbf{y}) = G(u)(\mathbf{y}) \approx G_\theta(\mathbf{x}, \mathbf{u}, \mathbf{y}) = \mathbf{v}.$$

In Python, we write the operator call as

v = operator(x, u, y)
with tensors of shapes

x: (batch_size, x_dim, num_sensors...)
u: (batch_size, u_dim, num_sensors...)
y: (batch_size, y_dim, num_evaluations...)
v: (batch_size, v_dim, num_evaluations...)
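
As a minimal sketch of what such tensors might look like (assuming PyTorch and a hypothetical one-dimensional setting with x_dim = u_dim = y_dim = 1), they could be constructed as follows; the final call is commented out because it requires an actual neural operator instance:

import torch

batch_size, num_sensors, num_evaluations = 8, 32, 100

# Sensor positions and input function evaluations (here u(x) = sin(x))
x = torch.rand(batch_size, 1, num_sensors)   # (batch_size, x_dim, num_sensors)
u = torch.sin(x)                             # (batch_size, u_dim, num_sensors)

# Query points where the output function is evaluated
y = torch.linspace(0, 1, num_evaluations).repeat(batch_size, 1, 1)  # (batch_size, y_dim, num_evaluations)

# v = operator(x, u, y)                      # (batch_size, v_dim, num_evaluations)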

This signature covers the most general case for implementing operators, as some neural operators differ in the way they handle input and output values.

Wrapping

For convenience, the call can be wrapped to mimic the mathematical syntax. For instance, for a fixed set of collocation points x, we could define

G = lambda u: lambda y: operator(x, u(x), y)
v = G(u)(y)
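
Here, u is assumed to be callable, so that u(x) returns its evaluations at the collocation points x in the shape expected by the operator.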

Applications

Neural operators extend the concept of neural networks to function mappings, which enables discretization-invariant and mesh-free mappings of data with applications to physics-informed training, super-resolution, and more. See our Examples section for more on this.

Further Reading

Follow our introduction to Functions in continuiti and proceed with the Training example to learn more about operator learning.

