Super-resolution¶
Neural operators, being agnostic to a specific discretization, can be used for super-resolution tasks, where the goal is to increase the resolution of an image.
In this example, we will employ the FLAME dataset, a set of flow samples of resolution 32x32 that should be up-sampled to 128x128. You can download the data set from Kaggle and put it into the data/flame directory to reproduce this example.
Setup¶
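The example starts with its imports. Below is a minimal sketch of such a setup cell; the exact continuiti module paths are assumptions based on the library's layout and may differ between versions.

```python
import matplotlib.pyplot as plt
import torch

# Module paths below are assumptions; adjust to your continuiti version.
from continuiti.data.flame import Flame
from continuiti.operators import DeepONet
from continuiti.trainer import Trainer
from continuiti.trainer.callbacks import LearningCurve
```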
Flame Dataset¶
continuiti provides the Flame class (a special OperatorDataset) that reads and exports samples from the FLAME data. The data set contains train/val splits and has four channels: ux, uy, uz, and rho. In this example, we only use channel ux from the first four samples of the val split, and we visualize the provided data using matplotlib.
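A sketch of what loading and plotting could look like; the Flame constructor arguments (data path, split, and channel selection) are assumptions, since the source only names the class.

```python
# Read four samples of channel "ux" from the val split.
# Constructor arguments are assumptions; check the Flame docstring.
dataset = Flame("data/flame", split="val", channels=["ux"], size=4)

fig, axs = plt.subplots(1, 4, figsize=(12, 3))
for i, ax in enumerate(axs.flat):
    # continuiti datasets yield (x, u, y, v) tuples of sensor
    # positions/values and evaluation positions/values.
    x, u, y, v = dataset[i]
    ax.imshow(u.reshape(32, 32), origin="lower")
    ax.set_title(f"Sample {i}")
plt.show()
```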
Operator¶
We define a DeepONet to map the low-resolution data to a continuous function. Note that we increase the expressivity of the trunk network by increasing its width and depth.
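A sketch of the operator definition; the dataset.shapes attribute and the trunk keyword names are assumptions based on typical continuiti usage.

```python
# A wider and deeper trunk net yields a more expressive continuous
# output function. Keyword names are assumptions; check the DeepONet
# signature of your continuiti version.
operator = DeepONet(
    dataset.shapes,
    trunk_width=64,
    trunk_depth=16,
)
```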
Training¶
With an OperatorDataset at hand, training is straightforward using the Trainer.fit method. Here, we add the LearningCurve callback to monitor the training loss.
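A sketch of the training step; the epochs and callbacks arguments of Trainer.fit are assumptions about its signature.

```python
# Fit the operator on the low-resolution samples; the LearningCurve
# callback records the training loss over epochs.
trainer = Trainer(operator)
trainer.fit(dataset, epochs=1000, callbacks=[LearningCurve()])
```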
Evaluation¶
As the trained operator can be evaluated at arbitrary positions, we can plot the mapped function on a fine mesh with 128 positions per dimension (or any other resolution), a.k.a. super-resolution!
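A sketch of the super-resolution evaluation; the coordinate range of the mesh and the tensor layout (batch, points, coordinate dim) are assumptions.

```python
# Build a fine 128x128 evaluation mesh; the coordinate range
# [-1, 1]^2 is an assumption about the sensor positions.
n = 128
grid = torch.linspace(-1, 1, n)
yy, xx = torch.meshgrid(grid, grid, indexing="ij")
y_fine = torch.stack([xx.flatten(), yy.flatten()], dim=-1)

# Evaluate the trained operator for the first sample on the fine
# mesh; operators in continuiti are called as operator(x, u, y).
x, u, _, _ = dataset[0]
v_fine = operator(x.unsqueeze(0), u.unsqueeze(0), y_fine.unsqueeze(0))

plt.imshow(v_fine.reshape(n, n).detach(), origin="lower")
plt.title("Super-resolved ux")
plt.show()
```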