"""
3D map simulation
=================

Simulate a 3D observation of a source with the CTA 1DC response and fit it with the assumed source model.

Prerequisites
-------------

-  Knowledge of 3D extraction and datasets used in gammapy, see for
   example the :doc:`/tutorials/starting/analysis_1` tutorial.

Context
-------

To simulate a specific observation, it is not always necessary to
simulate the full photon list. For many use cases, directly simulating
a reduced binned dataset is enough: the IRFs reduced in the correct
geometry are combined with a source model to predict an actual number of
counts per bin. The latter is then used to simulate a reduced dataset
using the Poisson probability distribution.

This can be done to check the feasibility of a measurement (performance
/ sensitivity study), to test whether fitted parameters really provide a
good fit to the data, etc.

Here we will see how to perform a 3D simulation of a CTA observation,
assuming both the spectral and spatial morphology of an observed source.

**Objective: simulate a 3D observation of a source with CTA using the
CTA 1DC response and fit it with the assumed source model.**

Proposed approach
-----------------

Here we can’t use the regular observation objects that are connected to
a `DataStore`. Instead we will create a fake
`~gammapy.data.Observation` that contains some pointing information and
the CTA 1DC IRFs (that are loaded with `~gammapy.irf.load_cta_irfs`).

Then we will define the `~gammapy.datasets.MapDataset` geometry and
produce the reduced dataset with the `~gammapy.makers.MapDatasetMaker`.

Finally we will define a model consisting of a
`~gammapy.modeling.models.PowerLawSpectralModel` and a
`~gammapy.modeling.models.GaussianSpatialModel`. We will assign it to
the dataset and fake the count data.

"""


######################################################################
# Imports and versions
# --------------------
#

# %matplotlib inline

import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from gammapy.data import Observation, observatory_locations
from gammapy.datasets import MapDataset
from gammapy.irf import load_cta_irfs
from gammapy.makers import MapDatasetMaker, SafeMaskMaker
from gammapy.maps import MapAxis, WcsGeom
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    FoVBackgroundModel,
    GaussianSpatialModel,
    Models,
    PowerLawSpectralModel,
    SkyModel,
)

######################################################################
# Simulation
# ----------
#


######################################################################
# We will simulate using the CTA-1DC IRFs shipped with gammapy. Note that
# for dedicated CTA simulations, you can simply use
# ``Observation.from_caldb()`` without having to externally load
# the IRFs.
#

# Loading IRFs
irfs = load_cta_irfs(
    "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
)

# Define the observation parameters (typically the observation duration and the pointing position):
livetime = 2.0 * u.hr
pointing = SkyCoord(0, 0, unit="deg", frame="galactic")

# Define map geometry for binned simulation
energy_reco = MapAxis.from_edges(
    np.logspace(-1.0, 1.0, 10), unit="TeV", name="energy", interp="log"
)
geom = WcsGeom.create(
    skydir=(0, 0),
    binsz=0.02,
    width=(6, 6),
    frame="galactic",
    axes=[energy_reco],
)
# It is usually useful to have a separate binning for the true energy axis
energy_true = MapAxis.from_edges(
    np.logspace(-1.5, 1.5, 30), unit="TeV", name="energy_true", interp="log"
)

empty = MapDataset.create(geom, name="dataset-simu", energy_axis_true=energy_true)

# Define the sky model used to simulate the data.
# Here we use a Gaussian spatial model and a power-law spectral model.
spatial_model = GaussianSpatialModel(
    lon_0="0.2 deg", lat_0="0.1 deg", sigma="0.3 deg", frame="galactic"
)
spectral_model = PowerLawSpectralModel(
    index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV"
)
model_simu = SkyModel(
    spatial_model=spatial_model,
    spectral_model=spectral_model,
    name="model-simu",
)

bkg_model = FoVBackgroundModel(dataset_name="dataset-simu")

models = Models([model_simu, bkg_model])
print(models)


######################################################################
# Now comes the main part of the dataset simulation. We create an
# in-memory observation and an empty dataset. We then predict the number
# of counts for the given model, and Poisson fluctuate it using `fake()`
# to make a simulated counts map. Keep in mind that it is important to
# specify the `selection` of the maps that you want to produce.
#

# Create an in-memory observation
location = observatory_locations["cta_south"]
obs = Observation.create(
    pointing=pointing, livetime=livetime, irfs=irfs, location=location
)
print(obs)

# Make the MapDataset
maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"])

maker_safe_mask = SafeMaskMaker(methods=["offset-max"], offset_max=4.0 * u.deg)

dataset = maker.run(empty, obs)
dataset = maker_safe_mask.run(dataset, obs)
print(dataset)

# Add the model on the dataset and Poisson fluctuate
dataset.models = models
dataset.fake()
# Do a print on the dataset - there is now a counts map
print(dataset)
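Conceptually, `fake()` draws each counts bin from a Poisson distribution whose mean is the predicted number of counts in that bin. A minimal numpy sketch of that step, using a small toy predicted-counts array rather than the actual dataset:

```python
import numpy as np

# Toy stand-in for a predicted-counts (npred) map: mean counts per bin.
# These values are made up for illustration only.
npred = np.array([[4.0, 9.0], [1.0, 16.0]])

# Poisson-fluctuate each bin independently (seeded for reproducibility)
rng = np.random.default_rng(0)
counts = rng.poisson(npred)

print(counts.shape)  # same shape as npred, integer counts
```

Each call with a different seed yields a different simulated realization of the same expected map.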


######################################################################
# Now use this dataset as you would in any standard analysis. You can plot
# the maps, or proceed with your custom analysis. In the next section, we
# show the standard 3D fitting as in the :doc:`/tutorials/analysis-3d/analysis_3d`
# tutorial.
#

# To plot, e.g., counts:
dataset.counts.smooth(0.05 * u.deg).plot_interactive(add_cbar=True, stretch="linear")


######################################################################
# Fit
# ---
#
# In this section, we do a usual 3D fit with the same model used to
# simulate the data and check the stability of the simulations. Often, it
# is useful to simulate many such datasets and look at the distribution of
# the reconstructed parameters.
#

models_fit = models.copy()

# We do not want to fit the background in this case, so we will freeze the parameters
models_fit["dataset-simu-bkg"].spectral_model.norm.frozen = True
models_fit["dataset-simu-bkg"].spectral_model.tilt.frozen = True

dataset.models = models_fit
print(dataset.models)

# %%time
fit = Fit(optimize_opts={"print_level": 1})
result = fit.run(datasets=[dataset])

dataset.plot_residuals_spatial(method="diff/sqrt(model)", vmin=-0.5, vmax=0.5)
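As noted in the comment above, one often simulates many such datasets and studies the distribution of the reconstructed parameters. A self-contained toy sketch of that idea, fluctuating a fixed expected-counts map many times and looking at the spread of a simple estimator (the total counts) instead of running the full gammapy fit each time:

```python
import numpy as np

# Toy expected-counts map (stands in for dataset.npred(); values are made up)
npred = np.full((50, 50), 2.0)  # 2500 bins, 2 expected counts each -> 5000 total

rng = np.random.default_rng(1)
totals = []
for _ in range(500):
    counts = rng.poisson(npred)  # one simulated dataset realization
    totals.append(counts.sum())  # toy "fitted parameter": total counts

totals = np.array(totals)
# Realizations scatter around the true expectation npred.sum() = 5000,
# with a Poisson standard deviation of about sqrt(5000) ~ 71
print(totals.mean(), totals.std())
```

In a real study, each realization would come from `dataset.fake()` followed by a fit, and the histogram of a fitted parameter (e.g. the amplitude) would be compared to the injected value.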


######################################################################
# Compare the injected and fitted models:
#

print(
    "True model: \n",
    model_simu,
    "\n\n Fitted model: \n",
    models_fit["model-simu"],
)


######################################################################
# Get the errors on the fitted parameters from the parameter table
#

result.parameters.to_table()
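A common sanity check on such a table is the pull of each fitted parameter, (fitted − true) / error, which should be of order one if the fit recovers the injected model within its uncertainty. A toy sketch with hypothetical numbers (in practice the fitted value and error come from the parameter table above, and the true value from the injected model):

```python
# Hypothetical fitted value, injected (true) value, and fitted error
fitted, true, error = 3.1, 3.0, 0.08

pull = (fitted - true) / error
print(f"pull = {pull:.2f}")  # of order one -> consistent with the injected model
```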