Merge pull request #80 from weizmannk/main
Update of ZTF and Rubin light curves detection files

Former-commit-id: ef58d24
weizmannk authored Apr 19, 2023
2 parents f8fff08 + 9eae08e commit cd7b639
Showing 3 changed files with 219 additions and 98 deletions.
81 changes: 81 additions & 0 deletions doc/gwemopt_light_curves_detection.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,81 @@
# Instructions for using **[nmma]** and **[gwemopt]** to simulate KNe detection with ZTF and LSST

First of all, we have to install **[nmma]** and **[gwemopt]** together in the same environment.

1 **- Install nmma**:

For `nmma` installation instructions, click here: **[NMMA-installation]**

2 **- Install gwemopt**:

For `gwemopt` installation instructions, click here: **[gwemopt-installation]**


3 **- Split the Farah (GWTC-3) CBC data into BNS and NSBH**

In the injections.dat files produced by **[observing-scenarios-simulations]**, the CBC masses are detector-frame masses. To split them into BNS, NSBH and BBH populations, we first have to convert them to source-frame masses.

All CBCs in the ``Farah (GWTC-3)`` distribution that passed the SNR threshold are recorded here: **[obs-scenarios-data-2022]**. To keep the repository light, only the data needed to produce the light curves (i.e. the nmma simulations) are stored there.
For additional data such as sky maps, please go to **[zenodo]**; you will need to download them to get the skymap files (``.fits`` files) required to run the KNe detection process with **[gwemopt]**.

Use the following Python code for the ``split``:

Click here: **[split_CBC_data_for_nmma.py]**
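The conversion and classification behind the split can be sketched as follows. This is an illustrative sketch, not the exact logic of `split_CBC_data_for_nmma.py`: the 3 solar-mass NS/BH boundary and the low-redshift Hubble-law distance-to-redshift conversion are assumptions (for the distant events in the catalogue a full cosmology, e.g. astropy's `z_at_value`, should be used instead).

```python
# Sketch of the BNS/NSBH/BBH split, assuming detector-frame masses and a
# luminosity distance per event. The 3 Msun boundary and the Hubble-law
# redshift below are illustrative assumptions.

H0_KM_S_MPC = 67.7      # assumed Hubble constant
C_KM_S = 299792.458     # speed of light

def classify(m1_det, m2_det, distance_mpc, max_ns_mass=3.0):
    """Convert detector-frame masses to the source frame and label the binary."""
    # Low-redshift approximation z ~ H0 d / c; replace with a full
    # cosmology for distant events.
    z = H0_KM_S_MPC * distance_mpc / C_KM_S
    m1, m2 = m1_det / (1 + z), m2_det / (1 + z)
    if m1 <= max_ns_mass and m2 <= max_ns_mass:
        return "BNS"
    if m2 <= max_ns_mass < m1:
        return "NSBH"
    return "BBH"

print(classify(1.6, 1.5, 40.0))   # -> BNS (both components below 3 Msun)
```

Each row of `injections.dat` would then be appended to the BNS, NSBH or BBH output table according to this label.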

## **Generate a `simulation set`**

4 **- Use the ``BNS`` or ``NSBH`` ``injections.dat`` to create an injection file with nmma.**

This generates a JSON file (``injection.json``) through BILBY's processing of compact binary merger events. The injection contains a set of simulated parameters (luminosity_distance, log10_mej_wind, KNphi, inclination_EM, KNtimeshift, geocent_time) for the KNe model (`Bu2019lm` for BNS or `Bu2019nsbh` for NSBH).

**`BNS` type**

nmma_create_injection --prior-file ./priors/Bu2019lm.prior --injection-file ./injections.dat --eos-file ./example_files/eos/ALF2.dat --binary-type BNS --original-parameters --extension json --aligned-spin --filename ./outdir_BNS/Bu2019lm_injection

**`NSBH` type**

nmma_create_injection --prior-file ./priors/Bu2019nsbh.prior --injection-file ./injections.dat --eos-file ./example_files/eos/ALF2.dat --binary-type NSBH --original-parameters --extension json --aligned-spin --filename ./outdir_NSBH/Bu2019nsbh_injection
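The resulting injection file follows bilby's JSON serialization, with the parameter table stored under an ``injections`` key. The sketch below uses a hypothetical, minimal mock of that layout to show how per-event parameters can be read back; real files carry many more fields, and the exact names may vary with the nmma version.

```python
# Minimal mock of the injection-file structure (an assumption based on
# bilby's JSON convention); real files hold many more parameters.
mock = {
    "injections": {
        "simulation_id": [12, 57],
        "luminosity_distance": [112.4, 301.8],
        "log10_mej_wind": [-1.7, -2.1],
        "inclination_EM": [0.4, 1.1],
    }
}

def summarize(injection_dict):
    """Return one dict per simulated event, keyed by parameter name."""
    table = injection_dict["injections"]
    n = len(table["simulation_id"])
    return [{k: v[i] for k, v in table.items()} for i in range(n)]

events = summarize(mock)
print(events[0]["luminosity_distance"])   # -> 112.4
```

The `simulation_id` column is what later lets the detection step match each light curve back to its observing-scenarios skymap.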

5 **- Generate light curves**

This command line concerns ``ZTF``. For ``Rubin``, please replace ``--injection-detection-limit 24.1,21.7,21.4,20.9,24.5,23.0,23.2,22.6,22.6`` with ``--injection-detection-limit 23.9,25.0,24.7,24.0,23.3,22.1,23.2,22.6,22.6``.
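Note that the detection limits pair positionally with the ``--filters`` list. A quick sanity check before launching ``light_curve_generation`` (the values are the ZTF and Rubin limits quoted above):

```python
# The limits are matched to filters by position, so the two lists must
# have the same length; a mismatch silently shifts every band's limit.
filters = "u,g,r,i,z,y,J,H,K".split(",")
ztf_limits = [24.1, 21.7, 21.4, 20.9, 24.5, 23.0, 23.2, 22.6, 22.6]
rubin_limits = [23.9, 25.0, 24.7, 24.0, 23.3, 22.1, 23.2, 22.6, 22.6]

assert len(filters) == len(ztf_limits) == len(rubin_limits)
limits = dict(zip(filters, ztf_limits))
print(limits["g"])   # -> 21.7
```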

**`BNS`**

light_curve_generation --model Bu2019lm --svd-path ./svdmodels --outdir ./outdir_BNS --label injection_Bu2019lm --dt 0.5 --injection ./outdir_BNS/injection_Bu2019lm.json --injection-detection-limit 24.1,21.7,21.4,20.9,24.5,23.0,23.2,22.6,22.6 --filters u,g,r,i,z,y,J,H,K --absolute --plot --generation-seed 42

**`NSBH`**


light_curve_generation --model Bu2019nsbh --svd-path ./svdmodels --outdir ./outdir_NSBH --label injection_Bu2019nsbh --dt 0.5 --injection ./outdir_NSBH/injection_Bu2019nsbh.json --injection-detection-limit 24.1,21.7,21.4,20.9,24.5,23.0,23.2,22.6,22.6 --filters u,g,r,i,z,y,J,H,K --absolute --plot --generation-seed 42

6 **- KNe detection process using ``gwemopt``**


* Here we need to use the ``skymap`` files from **[zenodo]**. In this example we use **O4** (``--skymap-dir ./runs/O4/farah/allsky``), but this should be replaced by **O5** if you work with **O5** data.
* We also need the JSON injection file (``injection_Bu2019lm.json`` for BNS or ``injection_Bu2019nsbh.json`` for NSBH), which contains the simulation IDs we need to match each event with its corresponding ``skymap`` data.
* And the path to the ``config`` folder in the **[gwemopt]** repository.

* In **``--telescope ZTF``**, **ZTF** should be replaced by ``LSST`` for the ``Rubin`` telescope.
* Also, ``--exposuretime 300`` sets the observation time to 300 seconds; use ``--exposuretime 180`` for 180 seconds.

**`BNS`**

light_curve_detection --configDirectory ./gwemopt/config --outdir ./ztf_detection/outdir_BNS --injection-file ./outdir_BNS/injection_Bu2019lm.json --skymap-dir ./runs/O4/farah/allsky --lightcurve-dir ./outdir_BNS --binary-type BNS --filters u,g,r,i,z,y,J,H,K --exposuretime 300 --detections-file ./ztf_detection/bns_lc_skymap_detection.txt --telescope ZTF --tmin 0 --tmax 14 --dt 0.5 --parallel --number-of-cores 10 --generation-seed 42

**`NSBH`**

light_curve_detection --configDirectory ./gwemopt/config --outdir ./ztf_detection/outdir_NSBH --injection-file ./outdir_NSBH/injection_Bu2019nsbh.json --skymap-dir ./runs/O4/farah/allsky --lightcurve-dir ./outdir_NSBH --binary-type NSBH --filters u,g,r,i,z,y,J,H,K --exposuretime 300 --detections-file ./ztf_detection/nsbh_lc_skymap_detection.txt --telescope ZTF --tmin 0 --tmax 14 --dt 0.5 --parallel --number-of-cores 10 --generation-seed 42
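The ``--detections-file`` output can then be summarized, for example to estimate a detection efficiency. The reader below is a hypothetical sketch: the column layout (whitespace-separated rows, a trailing 0/1 detection flag, ``#`` comment header) is an assumption, so check the header of your own ``bns_lc_skymap_detection.txt`` first.

```python
# Hypothetical reader for the detections file written by
# light_curve_detection; the column layout is an assumption.
def detection_efficiency(path):
    """Fraction of injected events flagged as detected (last column == 1)."""
    total = detected = 0
    with open(path) as f:
        for line in f:
            if line.startswith("#") or not line.strip():
                continue  # skip header and blank lines
            total += 1
            detected += int(float(line.split()[-1]) == 1.0)
    return detected / total if total else 0.0
```

On a file listing three events of which two carry a detection flag of 1, this returns 2/3.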





[nmma]: https://github.com/nuclear-multimessenger-astronomy/nmma
[NMMA-installation]: https://github.com/nuclear-multimessenger-astronomy/nmma/blob/main/doc/installation.md
[gwemopt]: https://github.com/mcoughlin/gwemopt
[gwemopt-installation]:https://github.com/mcoughlin/gwemopt/blob/main/README.md
[observing-scenarios-simulations]:https://github.com/lpsinger/observing-scenarios-simulations
[zenodo]:https://doi.org/10.5281/zenodo.7026209
[obs-scenarios-data-2022]:https://github.com/weizmannk/obs-scenarios-data-2022
[split_CBC_data_for_nmma.py]:https://github.com/weizmannk/obs-scenarios-data-2022/blob/main/split_CBC_data_for_nmma.py
174 changes: 102 additions & 72 deletions nmma/em/create_lightcurves.py
Expand Up @@ -2,7 +2,6 @@
import numpy as np
import argparse
import json
from scipy.interpolate import interpolate as interp

from astropy import time

Expand All @@ -16,26 +15,36 @@
KilonovaGRBLightCurveModel,
)
from .injection import create_light_curve_data
from .utils import NumpyEncoder, check_default_attr


def main():

parser = argparse.ArgumentParser(
description="Inference on kilonova ejecta parameters."
)
parser.add_argument(
"--model", type=str, required=True, help="Name of the kilonova model to be used"
"--model",
type=str,
required=True,
help="Name of the kilonova model to be used"
)
parser.add_argument(
"--svd-path",
type=str,
default="svdmodels",
help="Path to the SVD directory, with {model}_mag.pkl and {model}_lbol.pkl",
)
parser.add_argument(
"--outdir", type=str, required=True, help="Path to the output directory"
"--outdir",
type=str,
required=True,
help="Path to the output directory"
)
parser.add_argument(
"--label",
type=str,
required=True,
help="Label for the run"
)
parser.add_argument("--label", type=str, required=True, help="Label for the run")
parser.add_argument(
"--tmin",
type=float,
Expand All @@ -49,7 +58,10 @@ def main():
help="Days to be stoped analysing from the trigger time (default: 14)",
)
parser.add_argument(
"--dt", type=float, default=0.1, help="Time step in day (default: 0.1)"
"--dt",
type=float,
default=0.1,
help="Time step in day (default: 0.1)"
)
parser.add_argument(
"--svd-mag-ncoeff",
Expand All @@ -69,18 +81,6 @@ def main():
help="A comma seperated list of filters to use (e.g. g,r,i). If none is provided, will use all the filters available",
default="u,g,r,i,z,y,J,H,K",
)
parser.add_argument(
"--grb-resolution",
type=float,
default=5,
help="The upper bound on the ratio between thetaWing and thetaCore (default: 5)",
)
parser.add_argument(
"--jet-type",
type=int,
default=0,
help="Jet type to used used for GRB afterglow light curve (default: 0)",
)
parser.add_argument(
"--generation-seed",
metavar="seed",
Expand All @@ -89,7 +89,10 @@ def main():
help="Injection generation seed (default: 42)",
)
parser.add_argument(
"--injection", metavar="PATH", type=str, help="Path to the injection json file"
"--injection",
metavar="PATH",
type=str,
help="Path to the injection json file"
)
parser.add_argument(
"--joint-light-curve",
Expand All @@ -102,7 +105,10 @@ def main():
action="store_true",
)
parser.add_argument(
"--plot", action="store_true", default=False, help="add best fit plot"
"--plot",
action="store_true",
default=False,
help="add best fit plot"
)
parser.add_argument(
"--verbose",
Expand All @@ -124,10 +130,15 @@ def main():
default="sklearn_gp",
)
parser.add_argument(
"--absolute", action="store_true", default=False, help="Absolute Magnitude"
"--absolute",
action="store_true",
default=False,
help="Absolute Magnitude"
)
parser.add_argument(
"--ztf-sampling", help="Use realistic ZTF sampling", action="store_true"
"--ztf-sampling",
help="Use realistic ZTF sampling",
action="store_true"
)
parser.add_argument(
"--ztf-uncertainties",
Expand Down Expand Up @@ -191,7 +202,6 @@ def main():

bilby.core.utils.setup_logger(outdir=args.outdir, label=args.label)
bilby.core.utils.check_directory_exists_and_if_not_mkdir(args.outdir)

# initialize light curve model
sample_times = np.arange(args.tmin, args.tmax + args.dt, args.dt)

Expand Down Expand Up @@ -248,7 +258,8 @@ def main():
interpolation_type=args.interpolation_type,
)
light_curve_model = SVDLightCurveModel(**light_curve_kwargs)


## read injection file
with open(args.injection, "r") as f:
injection_dict = json.load(f, object_hook=bilby.core.utils.decode_bilby_json)

Expand All @@ -265,14 +276,27 @@ def main():
args.kilonova_error = 0

injection_df = injection_dict["injections"]

# save simulation_id from observing scenarios data
# we save the light curve for each event with its initial simulation ID
# from observing scenarios
simulation_id = injection_df['simulation_id']

mag_ds = {}
for index, row in injection_df.iterrows():

injection_outfile = os.path.join(args.outdir, "%d.dat" % index)
injection_outfile = os.path.join(args.outdir, "%d.dat" % simulation_id[index])
if os.path.isfile(injection_outfile):
with open(injection_outfile, "r") as outfile:
mag_ds[index] = json.loads(outfile.read())
continue
try:
mag_ds[index] = np.loadtxt(injection_outfile)
continue

except ValueError:
print('\n===================================================================')
print('The previous run generated light curves with unreadable content.\n')
print('Please remove all output files in .dat format, then retry.')
print('===================================================================\n')
exit()

injection_parameters = row.to_dict()

Expand All @@ -292,40 +316,57 @@ def main():
)
print("Injection generated")

with open(injection_outfile, "w") as outfile:
json.dump(data, outfile, cls=NumpyEncoder)

mag_ds[index] = data

try:
fid = open(injection_outfile, "w")
# fid.write('# t[days] u g r i z y J H K\n')
# fid.write(str(" ".join(('# t[days]'," ".join(args.filters.split(',')),"\n"))))
fid.write("# t[days] ")
fid.write(str(" ".join(args.filters.split(","))))
fid.write("\n")

for ii, tt in enumerate(sample_times):
fid.write("%.5f " % sample_times[ii])
for filt in data.keys():
if args.filters:
if filt not in args.filters.split(","):
continue
fid.write("%.3f " % data[filt][ii, 1])
fid.write("\n")
fid.close()

except IndexError:
print('\n===================================================================')
print('Sorry, we could not use the ZTF or Rubin flags to generate these statistics.\n')
print('Please remove all flags related to ZTF or Rubin, then retry.')
print('===================================================================\n')
exit()

mag_ds[index] = np.loadtxt(injection_outfile)

if args.plot:
import matplotlib.pyplot as plt
import matplotlib

matplotlib.use("agg")
params = {
"backend": "pdf",
"axes.labelsize": 30,
"legend.fontsize": 30,
"xtick.labelsize": 24,
"ytick.labelsize": 24,
"axes.labelsize": 42,
"legend.fontsize": 42,
"xtick.labelsize": 42,
"ytick.labelsize": 42,
"text.usetex": True,
"font.family": "Times New Roman",
"figure.figsize": [16, 20],
}
matplotlib.rcParams.update(params)

ztf_sampling = check_default_attr(args, "ztf_sampling")
ztf_ToO = check_default_attr(args, "ztf_ToO")
rubin_ToO = check_default_attr(args, "rubin_ToO")
photometry_augmentation = check_default_attr(
args, "photometry_augmentation", default=None
)

plotName = os.path.join(
args.outdir, "injection_" + args.model + "_lightcurves.pdf"
)
fig = plt.figure(figsize=(16, 18))


fig = plt.figure()

filts = args.filters.split(",")
#filts = "u,g,r,i,z,y,J,H,K".split(",")
ncols = 1
nrows = int(np.ceil(len(filts) / ncols))
gs = fig.add_gridspec(nrows=nrows, ncols=ncols, wspace=0.6, hspace=0.5)
Expand All @@ -337,14 +378,7 @@ def main():

data_out = []
for jj, key in enumerate(list(mag_ds.keys())):
data_set = np.array(mag_ds[key][filt])
if ztf_sampling or ztf_ToO or rubin_ToO or photometry_augmentation:
f = interp.interp1d(
data_set[:, 0], data_set[:, 1], fill_value="extrapolate"
)
data_out.append(f(sample_times))
else:
data_out.append(data_set[:, 1])
data_out.append(mag_ds[key][:, ii + 1])
data_out = np.vstack(data_out)

bins = np.linspace(-20, 1, 50)
Expand All @@ -360,15 +394,10 @@ def return_hist(x):
hist = hist.astype(np.float64)
hist[hist == 0.0] = np.nan

ax.pcolormesh(X, Y, hist.T, shading="auto", cmap="viridis", alpha=0.7)
c = ax.pcolormesh(X, Y, hist.T, shading="auto", cmap="cividis", alpha=0.7)

# plot 10th, 50th, 90th percentiles
ax.plot(
sample_times,
np.nanpercentile(data_out, 50, axis=0),
c="k",
linestyle="--",
)
ax.plot(sample_times, np.nanpercentile(data_out, 50, axis=0), "k--")
ax.plot(sample_times, np.nanpercentile(data_out, 90, axis=0), "k--")
ax.plot(sample_times, np.nanpercentile(data_out, 10, axis=0), "k--")

Expand All @@ -381,20 +410,21 @@ def return_hist(x):
else:
plt.setp(ax.get_xticklabels(), visible=False)
ax.set_yticks([-18, -16, -14, -12])
ax.tick_params(axis="x", labelsize=30)
ax.tick_params(axis="y", labelsize=30)
ax.tick_params(axis="x", labelsize=42)
ax.tick_params(axis="y", labelsize=42)
ax.grid(which="both", alpha=0.5)

fig.text(0.45, 0.05, "Time [days]", fontsize=30)

fig.colorbar(c, ax=ax)
fig.text(0.4, 0.05, r"Time [days]", fontsize=42)
fig.text(
0.01,
0.5,
"Absolute Magnitude",
r"Absolute Magnitude",
va="center",
rotation="vertical",
fontsize=30,
fontsize=42,
)

# plt.tight_layout()
#plt.tight_layout()
plt.savefig(plotName, bbox_inches="tight")
plt.close()
plt.close()