Warning

The package poppy (see https://pypi.org/project/poppy/) must be installed!

2. First simulation with Pyxel#

../_images/pyxel_logo.png

2.1. Authors#

The Pyxel development team

2.2. Keywords#

Exposure mode

2.3. Prerequisites#

Concepts | Importance | Notes
Setup    | Necessary  | Background
xarray   | Helpful    | Background
bokeh    | Helpful    |

2.4. Learning Goals#

By the end of the lesson you will know how to:

  • Load the configuration file

  • Run Pyxel in exposure mode from the Jupyter notebook

  • Display the final detector object

  • Save the outputs

2.5. Summary#

In this notebook we will see Pyxel in action by running a simple simulation using the exposure mode! We will apply a number of models to a .fits image of the Pleiades. All the necessary configuration is provided in the file exposure.yaml.

We will use Pyxel’s configuration module to load the configuration file, and the functions run_mode and display_detector from the run and notebook modules.

import pyxel

2.6. Load configuration#

The main input of a Pyxel simulation is the YAML configuration file, which specifies the running mode, the detector and all the models the user wants to apply (the pipeline). The configuration file is loaded with the function load(), which returns an instance of the class Configuration, in our case the config object.

Feel free to check out the configuration file! For example, you will find the path to the input .fits file as an argument of the load_image model under pipeline -> photon_generation.

config = pyxel.load("exposure.yaml")  # class Configuration

2.7. Create running mode, detector and pipeline objects#

By inspecting the configuration file, one can see that it is separated into three main compartments, each representing a class in the Pyxel architecture (Exposure, CCD, DetectionPipeline). They are stored as attributes of the configuration object, and we can access them as follows:

exposure = config.exposure  # class Exposure
detector = config.ccd_detector  # class CCD
pipeline = config.pipeline  # class DetectionPipeline

2.8. Run the pipeline#

We can run the exposure mode simulation with the function run_mode(), passing the objects exposure, detector and pipeline. By doing so, the detector object passes through the pipeline, where it is edited by the models. By default the data is read out once, at a readout time of 1 second; here the configuration specifies three readout times (1, 5 and 7 seconds), which appear as the time coordinate of the result below.

result = pyxel.run_mode(
    mode=exposure,
    detector=detector,
    pipeline=pipeline,
)

result
<xarray.DatasetView> Size: 21MB
Dimensions:  (time: 3, y: 450, x: 450)
Coordinates:
  * y        (y) int64 4kB 0 1 2 3 4 5 6 7 8 ... 442 443 444 445 446 447 448 449
  * x        (x) int64 4kB 0 1 2 3 4 5 6 7 8 ... 442 443 444 445 446 447 448 449
  * time     (time) float64 24B 1.0 5.0 7.0
Data variables:
    photon   (time, y, x) float64 5MB 1.432e+04 1.321e+04 ... 2.964e+04
    charge   (time, y, x) float64 5MB 1.432e+04 1.321e+04 ... 2.964e+04
    pixel    (time, y, x) float64 5MB 1.432e+04 1.321e+04 ... 2.964e+04
    signal   (time, y, x) float64 5MB 1.432 1.321 1.309 ... 2.792 2.807 2.964
    image    (time, y, x) uint16 1MB 9386 8654 8576 8619 ... 18298 18398 19422
Attributes:
    pyxel version:  2.9
    running mode:   Exposure

What happens to the detector object as it moves through the pipeline is easiest to explain with an image. As shown below, the detector stores its properties as well as data buckets representing different stages of the imaging process: the Photon, Pixel, Signal and Image arrays, and Charge, a pandas dataframe representing the charge point cloud. Data is an xarray Dataset that collects processed data from the models. These buckets are edited by the functions in the pipeline, i.e. the models.

../_images/architecture.png
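As a quick sanity check you can inspect these buckets directly on the detector object. A minimal sketch, assuming the array buckets are exposed as photon, pixel, signal and image, each providing its values as a NumPy array through an .array attribute:

# Assumed bucket attributes; each .array is a NumPy array
print(detector.photon.array.shape)   # photons per pixel
print(detector.pixel.array.dtype)    # collected charge per pixel
print(detector.signal.array.mean())  # signal array
print(detector.image.array.max())    # digitised image (ADU)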

You can now check out the output folder for saved data.

2.9. Display detector#

We can display the detector at the end of the simulation with the function display_detector(), which shows the different arrays stored inside.

pyxel.display_detector(detector)

Charge is not part of this display, since it is a dataframe describing the charge cloud in the detector volume, not an array.
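Since the charge point cloud is not displayed, you can inspect it directly as a pandas dataframe. A minimal sketch, assuming the Charge bucket exposes the dataframe through a .frame attribute:

# First few charges of the point cloud stored inside the detector
detector.charge.frame.head()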

You can also find the saved outputs that were specified in the configuration file in the output folder.
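You can also list the saved files from Python. A short sketch; the folder name below is only an assumption and must match the output folder configured in exposure.yaml:

from pathlib import Path

# Recursively list everything written to the (assumed) output folder
for path in sorted(Path("output").rglob("*")):
    print(path)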

To get to know the YAML configuration and Pyxel classes a bit more in depth, continue with the next step of the tutorial: Pyxel configuration and classes.

2.10. Accessing data stored inside the xarray dataset#

result["image"].sel(time=1)
<xarray.DataArray 'image' (y: 450, x: 450)> Size: 405kB
array([[ 9386,  8654,  8576, ...,  9008,  9080,  9687],
       [ 8705,  7779,  7719, ...,  8166,  8276,  9111],
       [ 8670,  7769,  7725, ...,  8079,  8214,  9065],
       ...,
       [ 8859,  7914,  7761, ...,  8124,  8196,  8989],
       [ 8959,  8084,  8477, ...,  8294,  8357,  9082],
       [10342,  9684,  8984, ...,  9148,  9199,  9710]],
      shape=(450, 450), dtype=uint16)
Coordinates:
  * y        (y) int64 4kB 0 1 2 3 4 5 6 7 8 ... 442 443 444 445 446 447 448 449
  * x        (x) int64 4kB 0 1 2 3 4 5 6 7 8 ... 442 443 444 445 446 447 448 449
    time     float64 8B 1.0
Attributes:
    units:      adu
    long_name:  Image

To convert to a NumPy array, the method to_numpy() can be used.

result["image"].sel(time=1).to_numpy()
array([[ 9386,  8654,  8576, ...,  9008,  9080,  9687],
       [ 8705,  7779,  7719, ...,  8166,  8276,  9111],
       [ 8670,  7769,  7725, ...,  8079,  8214,  9065],
       ...,
       [ 8859,  7914,  7761, ...,  8124,  8196,  8989],
       [ 8959,  8084,  8477, ...,  8294,  8357,  9082],
       [10342,  9684,  8984, ...,  9148,  9199,  9710]],
      shape=(450, 450), dtype=uint16)

Xarray datasets also support math operations:

result["image"].sel(time=1).sum()
<xarray.DataArray 'image' ()> Size: 8B
array(2328646708, dtype=uint64)
Coordinates:
    time     float64 8B 1.0
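As a further example, you can reduce over the spatial dimensions to see how the mean image level evolves with readout time (plain xarray, nothing Pyxel-specific assumed):

# Mean ADU level for each readout time
result["image"].mean(dim=["y", "x"])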

2.11. Plotting the image as a function of readout time#

The holoviews library can be used to quickly plot data stored inside xarray datasets; here we use it with the bokeh backend (the matplotlib backend is also supported). xarray’s own matplotlib-based plotting is shown first below.

# Display one 'image'
result["image"].sel(time=1).plot()
<matplotlib.collections.QuadMesh at 0x7992aa548ec0>
../_images/5e24b4a10cf9497d00f32444de73954198d843bd5e8060ae2cde66097946c910.png
# Display all 'images'
result["image"].plot(col="time")
<xarray.plot.facetgrid.FacetGrid at 0x7992aa549d30>
../_images/044b9ad61f1157317d4eaf7b2a822c8c73715c377644e8f9d32f0d21f7b4a291.png
import holoviews as hv
from holoviews import opts

hv.extension("bokeh")
out = hv.Dataset(result["image"])
image = out.to(hv.Image, ["x", "y"], dynamic=True)
plot = image.opts(opts.Image(aspect=1, cmap="gray", tools=["hover"])).opts(
    framewise=True, axiswise=True
)

plot

2.12. Plotting the image with xarray#

It is also possible to plot directly using the xarray library. See here for more information: https://docs.xarray.dev/en/stable/user-guide/plotting.html
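For example, a histogram of the digitised image at a single readout time (plain xarray plotting):

# Histogram of pixel values at readout time 1 s
result["image"].sel(time=1).plot.hist(bins=50)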