Last modified: 14 Dec 2022

URL: https://cxc.cfa.harvard.edu/ciao/threads/srcflux_plugin/

Using srcflux plugins

CIAO 4.15 Science Threads


Overview

Synopsis:

srcflux extracts many of the standard data products and creates many of the standard response files needed to perform a variety of analysis tasks. This includes images, spectra, and lightcurves as well as exposure maps, PSFs, ARFs, and RMFs. These products can be used to extract radial profiles, measure source extent, fit spectral models, compute hardness ratios, search for variability, and more.

srcflux was updated in CIAO 4.15 to allow users to write their own analysis plugins that make use of these data products to perform custom analysis tasks and have the results automatically collected in the .flux output file. These plugins are Python routines that are run for each source in each individual observation, and again on the merged products.

In this thread we discuss the requirements for a user supplied plugin, provide information about a catalog of sample plugins, go into the details about a user plugin to fit spectra with an absorbed APEC spectral model, and then show how to access the results.

Purpose:

Users will want to run this thread if they wish to perform additional data analysis with the outputs from the srcflux tool.

Related Links:

Last Update: 14 Dec 2022 - First publication of this thread.


Contents


Getting Started

For this thread we are going to combine the data from two observations of HL Tau. Not all of these steps are needed just to use a srcflux plugin, but they are shown here for completeness.

Download the sample data: 18915, 20906 (HL Tau)

unix% download_chandra_obsid 18915,20906

While these observations were planned to minimize the offset between individual observations, we still need to correct for any fine astrometric offsets. The bulk of this section gives a very brief summary of the steps needed to align the data. Additional information about each step can be found in the related threads linked in each subsection.

[NOTE]
Looping: bash shell vs. tcsh shell

This thread uses bash syntax for loops. tcsh users should change the loops from

for variable in list
do
  stuff
done
to
foreach variable (list)
  stuff
end

Reprocess data to apply latest calibrations

We begin by applying the latest calibrations by running chandra_repro

unix% chandra_repro 18915,20906 outdir= clob+

Here chandra_repro is given a stack of directories to process. With outdir= set to a blank string, a repro directory is created under each individual input directory.


Create initial images

The next step is to create images, exposure maps, and PSF maps for each individual observation; these are needed to detect sources with wavdetect. For simplicity we restrict the data to just the aimpoint chip, ACIS-7.

unix% /bin/ls */repro/*evt2.fits > evt.lis
unix% merge_obs @evt.lis"[ccd_id=7]" out=merge/out bin=1 band=broad psfecf=0.5 mode=h clob+

The output from merge_obs includes the images, exposure maps, PSF maps, and the event files reprojected to a common tangent point for each individual observation as well as the combined products.

For more details see Using merge_obs to combine observations and create exposure-corrected images.


Detect sources in each individual observation

Next we will detect sources in each individual observation using wavdetect.

unix% for obsid in 18915 20906
do
  punlearn wavdetect wrecon wtransform
  pset wavdetect \
     infile=merge/out_${obsid}_broad_thresh.img \
     expfile=merge/out_${obsid}_broad_thresh.expmap \
     psffile=merge/out_${obsid}_broad_thresh.psfmap \
     outfile=wav_${obsid}.src \
     scales="1.4 2 4" \
     scell=wav.cell defn=wav.nbg imagef=wav.recon clob+ 
  wavdetect mode=h
done

For more details see the Running wavdetect thread.


Compute fine astrometric offsets

Users analyzing a single observation can skip to the next section.

We strongly recommend that users inspect the output from wavdetect and possibly filter their source lists before using them for any analysis, including cross-matching for astrometric correction. We did not find any spurious sources that would affect the fine astrometric correction in this example.

The next step is to cross-match the sources. For this example thread we simply choose to match the observations to OBS_ID 20906. This was an arbitrary choice. Users may want to cross-match with an external catalog (such as Gaia) or develop their own heuristics for picking a reference observation (e.g. longest observation, most sources, etc.).

unix% ref=20906
unix% for obsid in 18915 20906
do
  punlearn wcs_match
  wcs_match infile=wav_${obsid}.src ref=wav_${ref}.src out=xform_${obsid}.fits method=trans radius=4 clob+ \
          wcs=merge/out_${obsid}_reproj_evt.fits
  dmlist xform_${obsid}.fits"[cols t1,t2]" data,clean
done

The output looks like

#  t1                   t2
    -0.16420759778384    -0.59824529646612
#  t1                   t2
                    0                    0

t1 and t2 are the X and Y offsets between the observations, in physical pixels (0.492"). There is over a half-pixel shift between the two observations, which can be significant, especially for sources near the optical axis.
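As a quick sanity check, the translation can be converted to an angular offset; a minimal Python sketch (the 0.492 arcsec/pixel plate scale is the ACIS value quoted above):

```python
import math

ACIS_PIXEL_ARCSEC = 0.492   # ACIS sky pixel size, arcsec/pixel

def offset_arcsec(t1, t2):
    """Convert the wcs_match t1/t2 translation (physical pixels) to arcsec."""
    return math.hypot(t1, t2) * ACIS_PIXEL_ARCSEC

# Offsets for OBS_ID 18915 taken from the xform file above
print(round(offset_arcsec(-0.16420759778384, -0.59824529646612), 3))  # → 0.305
```

So the half-pixel shift corresponds to roughly a third of an arcsecond.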

For more details and options see the Correcting Absolute Astrometry thread and the Calculate source count rates and fluxes for combined datasets thread.


Apply astrometric corrections

The next step is to apply the fine astrometric correction to each observation's event file and aspect solution files. We also copy the necessary auxiliary files (the bad pixel and detector mask files) into the same directory so that the analysis scripts can locate them automatically.

unix% mkdir -p fine_astro

unix% for obsid in 18915 20906
do
  cp merge/out_${obsid}_reproj_evt.fits fine_astro/out_${obsid}_reproj_fa_evt.fits 
  wcs_update fine_astro/out_${obsid}_reproj_fa_evt.fits trans=xform_${obsid}.fits outfile= \
    wcs=merge/out_${obsid}_reproj_evt.fits
  wcs_update ${obsid}/repro/pcadf${obsid}_00?N00?_asol1.fits \
    outfile=fine_astro/out_${obsid}_fa_asol.fits \
    trans=xform_${obsid}.fits wcs=merge/out_${obsid}_reproj_evt.fits clob+
  dmhedit fine_astro/out_${obsid}_reproj_fa_evt.fits file= op=add key=ASOLFILE value=out_${obsid}_fa_asol.fits

  cp ${obsid}/repro/*bpix1.fits fine_astro/
  cp ${obsid}/repro/*msk1.fits fine_astro/
done

The event file is copied first because wcs_update updates it in place (only header keywords are modified), whereas a new aspect solution file is created since every row is updated.

Again for more details see the Correcting Absolute Astrometry thread.


Create final merged products and detect sources

The final data preparation step then is to re-merge the astrometric corrected data and run wavdetect on the co-aligned image.

unix% /bin/ls fine_astro/*evt.fits > fa_evt2.lis
unix% merge_obs @fa_evt2.lis merge_fa/out bin=1 band=broad psfecf=0.5 mode=h clob+
unix% punlearn wavdetect wrecon wtransform
unix% pset wavdetect \
  infile=merge_fa/out_broad_thresh.img \
  expfile=merge_fa/out_broad_thresh.expmap \
  psffile=merge_fa/out_broad_thresh.psfmap \
  outfile=wav_merge.src scell=wav.cell defn=wav.nbkg imagef=wav.recon \
  scales="1.4 2 4" clob+
unix% wavdetect mode=h

unix% dmlist wav_merge.src counts
6

The combined image with the source detections is shown in Figure 1.

Figure 1: Broad band counts image for combined dataset


The broad band image for the merged dataset. Both observations were performed using a 256 row (one-quarter chip) subarray. The wavdetect identified source regions are the overlaid green ellipses.


The user plugin

Introduction to Plugins

[New] Plugins allow users to write code to perform their own custom analysis and have srcflux automatically run that code and collect the outputs. Plugins were added in CIAO 4.15. srcflux supports two plugins: one to be run for each individual observation, and one to be run on the merged results.

There are a few simple requirements when writing a plugin.

  • The plugin is written in Python.
  • For the per-observation plugin, the Python script must define a routine named srcflux_obsid_plugin. The routine is run separately for each observation, for each source, and for each energy band. The routine's parameters are:
    • infile : the name of the input event file.
    • outroot : the name of the output root for this source number.
    • band : the name of the energy band.
    • elo : the low energy bound for this energy band, in keV.
    • ehi : the high energy bound for this energy band, in keV.
    • src_num : the number of the source (1 to N) being processed.

    For example:

    def srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
        return []
    
  • For the merged results plugin, the Python script must define a routine named srcflux_merge_plugin. The parameters are the same as srcflux_obsid_plugin, except that the infile is a list of all the input event file names.
    def srcflux_merge_plugin(infile, outroot, band, elo, ehi, src_num):
        return []
    
  • The plugin must always return a list of values to be added to the output .flux file. If there are no values, it must still return an empty list, i.e. []. The list must be returned even if an Exception is raised.
  • Each item in the returned list must have the following attributes: name, units, value, and description. This may be a NamedTuple, a dataclass, or any other object type with those attributes. The output column name must not already exist in the .flux file and must adhere to FITS standards (letters, numbers, hyphens and underscores only).
  • The plugin can create files, but it must clean up any temporary files itself.
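The requirements above can be condensed into a minimal skeleton plugin; this is just a sketch, and the column name example_value is purely illustrative:

```python
from collections import namedtuple

# Each returned item must expose: name, value, units, description
ReturnValue = namedtuple('ReturnValue', 'name value units description')

def srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    """Run once per source, per observation, per energy band."""
    try:
        # ... custom analysis using the ${outroot} data products ...
        value = 1.0
    except Exception:
        value = float('nan')   # still return the same list on failure
    return [ReturnValue("example_value", value, "",
                        "An example scalar result")]
```

The same skeleton works for srcflux_merge_plugin; only the routine name and the interpretation of infile change.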

Sample Plugins

We have provided several sample plugins in the $ASCDS_CONTRIB/data/sample_srcflux_plugins.py file.

unix% cat $ASCDS_CONTRIB/data/sample_srcflux_plugins.py
"Sample plugin file for srcflux"


"""
These are the files that will be available for the plugin to use.
For a single input event file, the ${outroot} will be the same as 
the outroot parameter.  For multiple input event files, 
the obi number is appended to the outroot parameter.

    The main output files w/ all the src properties:
        ${outroot}_${band}.flux

    Files used for spectral fits:
        ${outroot}.arf
        ${outroot}.pi
        ${outroot}.rmf
        ${outroot}_bkg.pi
        ${outroot}_bkg.arf   (only if bkgresp=yes)
        ${outroot}_bkg.rmf   (only if bkgresp=yes)
        ${outroot}_grp.pi
        ${outroot}_nopsf.arf

    Region files:
        ${outroot}_bkgreg.fits
        ${outroot}_srcreg.fits

    Light curves
        ${outroot}_${band}.gllc
        ${outroot}_${band}.lc

    Postage stamp images
        ${outroot}_${band}_flux.img
        ${outroot}_${band}_thresh.expmap
        ${outroot}_${band}_thresh.img

    PSF (only if psfmethod=marx)
        ${outroot}_${band}_projrays.fits
        ${outroot}_${band}.psf

    aprates probability density function
        ${outroot}_${band}_rates.prob

For multiple input event files, these files will be available to the 
plugin. The outroot is the same as the outroot parameter.

    The main output files w/ all the src properties:
        ${outroot}_${band}.flux

    Files used for spectral fits:
        ${outroot}_src.arf   \
        ${outroot}_src.pi     - Note: merge file names have _src 
        ${outroot}_src.rmf   /
        ${outroot}_bkg.pi
        ${outroot}_bkg.arf   (only if bkgresp=yes)
        ${outroot}_bkg.rmf   (only if bkgresp=yes)

"""
...

In this file users will find the following plugins:

  • def srcflux_merge_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin creates merged images.
  • def spectral_fit_sample_srcflux_merge_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin performs a spectral fit using the combined data products.
  • def hardness_ratio_sample_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin computes hardness ratios for individual observations.
  • def radial_profile_example_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin creates a radial profile.
  • def arestore_sample_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin performs a blind deconvolution.
  • def srcextent_sample_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin tries to estimate the extent of the source compared to the PSF.
  • def spectral_fit_sample_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin fits the spectra for each individual observation.
  • def aplimits_srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    This plugin computes upper limits and false source probabilities.

Our plugin: Spectral fitting an absorbed APEC model

Based on the above examples we have written the following plugin. Users may download the file here: my_plugin.py.

unix% cat -n my_plugin.py
     1	
     2	'An example plugin for the srcflux script'
     3	
     4	import os
     5	import numpy as np
     6	import matplotlib
     7	import matplotlib.pylab as plt
     8	from sherpa.astro.ui import *
     9	
    10	from collections import namedtuple
    11	ReturnValue = namedtuple('ReturnValue', 'name value units description')
    12	
    13	matplotlib.use("agg")
    14	
    15	def srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num):
    16	    """
    17	    Sample srcflux plugin: fitting spectrum for each individual obsid
    18	
    19	    This sample plugin uses sherpa to fit a spectral model, and
    20	    return an estimate of the flux w/ errors calculated with
    21	    the sample_flux routine.
    22	
    23	    We wrap the actual fitting code below in a try/except block
    24	    here so that we can always return nan's in the case of any error.
    25	    """
    26	
    27	    try:
    28	        return doit_specfit(infile, outroot, band, elo, ehi, src_num)
    29	    except Exception as wrong:
    30	        print(str(wrong))
    31	        print(f"Problem fitting {outroot} spectrum. Skipping it.")
    32	
    33	        return [ReturnValue("fitted_Nh", np.nan, "cm**22", "Fitted Absorption value"),
    34	                ReturnValue("kT", np.nan, "keV", "Plasma Temperature"),
    35	                ReturnValue("normalization", np.nan, "", "Spectrum Normalization"),
    36	                ReturnValue("reduced_statistic", np.nan, "", "Reduced Fit Statistic"),
    37	                ReturnValue("fit_statistic", np.nan, "", "Fit Statistic"),
    38	                ReturnValue("dof", np.nan, "", "Degrees of Freedom"),
    39	                ReturnValue("sample_flux", np.nan, "", "2-10 keV Sample Flux"),
    40	                ReturnValue("sample_flux_lo", np.nan, "", "2-10 keV Sample Flux Uncertainty Low"),
    41	                ReturnValue("sample_flux_hi", np.nan, "", "2-10 keV Sample Flux Uncertainty High"),
    42	                ]
    43	
    44	
    45	def srcflux_merge_plugin(infile, outroot, band, elo, ehi, src_num):
    46	    """
    47	    Sample srcflux plugin: fitting the combined spectrum.
    48	
    49	    In this example it is the same plugin as the per-observation plugin,
    50	    but it does not have to be.
    51	    """
    52	
    53	    return srcflux_obsid_plugin(infile, outroot, band, elo, ehi, src_num)
    54	
    55	
    56	def doit_specfit(infile, outroot, band, elo, ehi, src_num):
    57	    'Perform the spectral fit: an absorbed APEC model'
    58	    
    59	    from sherpa.utils.logging import SherpaVerbosity
    60	
    61	    # Find the spectrum file (per-obi vs. merged file names)
    62	    if os.path.exists(outroot+".pi"):
    63	        pi_file = outroot+".pi"
    64	    elif os.path.exists(outroot+"_src.pi"):
    65	        pi_file = outroot+"_src.pi"
    66	    else:
    67	        raise IOError(f"Unable to locate source spectrum for this source number: {src_num}")
    68	
    69	    # Suppress some of the sherpa chatter
    70	    with SherpaVerbosity('WARN'):
    71	        load_data(pi_file)
    72	        group_counts(10)
    73	        set_source(xsphabs.abs1*xsapec.thrm)
    74	        set_method("neldermead")
    75	        subtract()
    76	        counts = calc_data_sum(lo=0.5, hi=7.0)
    77	        if counts < 100:
    78	            raise ValueError(f"{pi_file} only has {counts} counts. Need 100 to do spectral fit")
    79	        notice(0.5, 8.0)
    80	        fit()
    81	        fit_info = get_fit_results()
    82	        fflux, cflux, vals = sample_flux(lo=2.0, hi=10.0)
    83	        f0, fhi, flo = cflux
    84	        plot_fit_delchi()
    85	        plt.savefig(outroot+".png")
    86	
    87	    return [ReturnValue("fitted_Nh", abs1.nH.val, "cm**-22", "Fitted Absorption value"),
    88	            ReturnValue("kT", thrm.kT.val, "keV", "Plasma Temperature"),
    89	            ReturnValue("normalization", thrm.norm.val, "", "Spectrum Normalization"),
    90	            ReturnValue("reduced_statistic", fit_info.rstat, "", "Reduced Fit Statistic"),
    91	            ReturnValue("fit_statistic", fit_info.statval, "", "Fit Statistic"),
    92	            ReturnValue("dof", fit_info.dof, "", "Degrees of Freedom"),
    93	            ReturnValue("sample_flux", f0, "", "2-10 keV Sample Flux"),
    94	            ReturnValue("sample_flux_lo", flo, "", "2-10 keV Sample Flux Uncertainty Low"),
    95	            ReturnValue("sample_flux_hi", fhi, "", "2-10 keV Sample Flux Uncertainty High"),
    96	            ]

Below we go into a little more detail about the code.

  • Lines 10 and 11: srcflux requires that the plugin return an object that has 4 attributes: name, value, units, and description. We have chosen to use a namedtuple.
  • Line 13: Tell matplotlib to use the non-interactive agg backend for plotting.
  • Line 15: this is the function definition line for the per-observation plugin with each of the required parameters.
  • Lines 33 through 41: srcflux requires that the plugin return a list of the return values even when an Exception occurs. To make this easier, the routine that does the real work, doit_specfit, is wrapped in a try/except block so that any Exceptions raised during the fit are handled and the script returns the list of parameters with all the values set to NaN.
  • Line 45: this is the function definition line for the plugin to run on the merged dataset.
  • Line 53: We have written our code to handle both the per-observation and merged datasets so the plugin for the merged data just calls the plugin for the per-observation data.
  • Line 56: The routine that does the actual spectral fitting on either the per-obsid or the merged products.
  • Lines 62-67: Identifies the spectrum file to load based on the outroot; outroot is different for the per-observation and the merged datasets.
  • Lines 59 and 70: these allow us to suppress some of the Sherpa chatter and are completely optional.
  • Lines 71 through 85: this is the actual Sherpa fitting code. It performs some setup by loading the file, grouping the data, setting the source model, and subtracting the background. It also checks whether there are fewer than 100 counts, in which case it skips the fit. Then the fit is performed, the results are used to compute an estimate of the flux in the 2-10 keV range, and we create a plot of the fit with residuals. This could be expanded to compute confidence intervals on the fitted parameters, to do something iterative like freezing some parameters for an initial fit and then thawing them for a final fit, to use set_xsxset to set any of the internal XSPEC parameters, etc.
  • Lines 87 to 96: This returns the list of parameter values we have chosen to report. The list of parameters returned in Lines 33 through 41 when an Exception occurs must match the list returned here. Users may choose to return any scalar values; these could be confidence intervals, execution times, etc.
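Since the NaN list in the except branch must mirror the list returned by doit_specfit, one defensive pattern is to build both lists from a single table of (name, units, description) entries. A sketch (only the first three columns of our plugin are shown):

```python
from collections import namedtuple

ReturnValue = namedtuple('ReturnValue', 'name value units description')

# Single source of truth for the output columns
COLUMNS = [("fitted_Nh", "cm**-22", "Fitted Absorption value"),
           ("kT", "keV", "Plasma Temperature"),
           ("normalization", "", "Spectrum Normalization")]

def make_return(values=None):
    """Build the plugin return list; with no values, every column is NaN."""
    if values is None:
        values = [float('nan')] * len(COLUMNS)
    return [ReturnValue(name, val, units, desc)
            for (name, units, desc), val in zip(COLUMNS, values)]
```

With this helper the except branch is simply `return make_return()`, and the success and failure branches can never drift out of sync.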

Run srcflux

unix% mkdir -p flux
unix% punlearn srcflux

unix% srcflux \
  infile=@fa_evt2.lis \
  pos=wav_merge.src \
  outroot=flux/out \
  psfmethod=marx \
  bkgresp=yes conf=0.68 \
  plugin=my_plugin.py \
  mode=h clob+

The terminal output looks like:

...
Processing OBI 001
Extracting counts
Simulating psfs using marx
...
Scaling model flux confidence limits
Appending variability results onto output
Running user plugin from 'my_plugin.py'
flux/out_obi001_0001.pi only has 14.44082023906261 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi001_0001 spectrum. Skipping it.
flux/out_obi001_0002.pi only has 3.0032852954630744 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi001_0002 spectrum. Skipping it.
flux/out_obi001_0003.pi only has 31.003395044292073 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi001_0003 spectrum. Skipping it.
Reading APEC data from 3.0.9

flux/out_obi001_0006.pi only has 7.9182515358234244 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi001_0006 spectrum. Skipping it.
Processing OBI 002
Extracting counts
...
Appending variability results onto output
Running user plugin from 'my_plugin.py'
flux/out_obi002_0001.pi only has 23.22241617768667 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi002_0001 spectrum. Skipping it.
flux/out_obi002_0002.pi only has 18.821272454515007 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi002_0002 spectrum. Skipping it.
flux/out_obi002_0003.pi only has 14.63876212278057 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi002_0003 spectrum. Skipping it.
WARNING: hard minimum hit for parameter abs1.nH
WARNING: hard maximum hit for parameter abs1.nH
WARNING: hard minimum hit for parameter thrm.kT
WARNING: hard maximum hit for parameter thrm.kT
WARNING: hard minimum hit for parameter thrm.norm
WARNING: hard maximum hit for parameter thrm.norm
WARNING: Covariance failed for 'abs1.nH', trying Confidence...
WARNING: Covariance failed for 'thrm.kT', trying Confidence...
WARNING: Covariance failed for 'thrm.norm', trying Confidence...
flux/out_obi002_0006.pi only has 4.50757727416939 counts. Need 100 to do spectral fit
Problem fitting flux/out_obi002_0006 spectrum. Skipping it.
Combining count rates
Combining spectra and running model flux for each source
...
Running user plugin from 'my_plugin.py'
flux/out_0001_src.pi only has 37.66323641674928 counts. Need 100 to do spectral fit
Problem fitting flux/out_0001 spectrum. Skipping it.
flux/out_0002_src.pi only has 22.552543701019964 counts. Need 100 to do spectral fit
Problem fitting flux/out_0002 spectrum. Skipping it.
flux/out_0003_src.pi only has 45.32505559025648 counts. Need 100 to do spectral fit
Problem fitting flux/out_0003 spectrum. Skipping it.
flux/out_0006_src.pi only has 14.154296401621957 counts. Need 100 to do spectral fit
Problem fitting flux/out_0006 spectrum. Skipping it.
...

From this we see that sources 1, 2, 3, and 6 have fewer than 100 counts, so no fit was performed. Only sources 4 and 5 met the criterion we chose.


Getting the results

The plugin outputs are stored in the output .flux files:

unix% /bin/ls -1 flux/*flux
flux/out_broad.flux
flux/out_obi001_broad.flux
flux/out_obi002_broad.flux
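If you need to process these outputs in a script, the per-obi and merged files can be separated from the naming convention alone; a small sketch (the _obiNNN_ pattern follows the file names listed above):

```python
import re

def split_flux_files(filenames):
    """Split srcflux output files into (per-obi, merged) lists.

    Per-obi files carry an _obiNNN_ tag in the name; files
    without it hold the merged results.
    """
    per_obi, merged = [], []
    for name in filenames:
        (per_obi if re.search(r'_obi\d{3}_', name) else merged).append(name)
    return per_obi, merged

files = ["flux/out_broad.flux",
         "flux/out_obi001_broad.flux",
         "flux/out_obi002_broad.flux"]
print(split_flux_files(files))
```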

obi001 is OBS_ID 18915 and obi002 is OBS_ID 20906. The file without obi in the name holds the merged results. The .flux files now contain the extra columns created by our plugin:

unix% dmlist flux/out_obi001_broad.flux cols
 
--------------------------------------------------------------------------------
Columns for Table Block HISTOGRAM
--------------------------------------------------------------------------------
 
ColNo  Name                 Unit        Type             Range
   1   sky(X,Y)             pixel        Real8          -Inf:+Inf            Position
   2   EQPOS(RA,Dec)        deg          Real8          -360.0:      360.0   Position
...
  68   fitted_Nh            cm**22       Real8          -Inf:+Inf            Fitted Absorption value
  69   kT                   keV          Real8          -Inf:+Inf            Plasma Temperature
  70   normalization                     Real8          -Inf:+Inf            Spectrum Normalization
  71   reduced_statistic                 Real8          -Inf:+Inf            Reduced Fit Statistic
  72   fit_statistic                     Real8          -Inf:+Inf            Fit Statistic
  73   dof                               Real8          -Inf:+Inf            Degrees of Freedom
  74   sample_flux                       Real8          -Inf:+Inf            2-10 keV Sample Flux
  75   sample_flux_lo                    Real8          -Inf:+Inf            2-10 keV Sample Flux Uncertainty Low
  76   sample_flux_hi                    Real8          -Inf:+Inf            2-10 keV Sample Flux Uncertainty High
... 

and we can display the data

unix% dmlist flux/out_obi001_broad.flux"[cols fitted_nH,kT,reduced_statistic]" data,clean
#  fitted_Nh            kT                   reduced_statistic
                  NaN                  NaN                  NaN
                  NaN                  NaN                  NaN
                  NaN                  NaN                  NaN
     0.63375063306261     0.85055049657958         1.3150511448
         2.4775294607         3.2966331631     0.63913573975030
                  NaN                  NaN                  NaN
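The data,clean output is plain whitespace-delimited text, so it is easy to post-process outside of CIAO; a minimal sketch that converts it into rows of floats (the NaN rows are the sources that were skipped):

```python
import math

def parse_dmlist_clean(text):
    """Parse 'dmlist ... data,clean' output into a list of float rows.

    Lines starting with '#' are column headers and are skipped.
    """
    rows = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        rows.append([float(tok) for tok in line.split()])
    return rows

sample = """\
#  fitted_Nh            kT                   reduced_statistic
                  NaN                  NaN                  NaN
     0.63375063306261     0.85055049657958         1.3150511448
"""
rows = parse_dmlist_clean(sample)
print(rows[1])   # the fitted source
```

Python's float() accepts the literal "NaN", so the skipped sources come through as math.nan values that can be filtered out.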

We also created plots of the fits

unix% /bin/ls -1 flux/*png
flux/out_0004.png
flux/out_0005.png
flux/out_obi001_0004.png
flux/out_obi001_0005.png
flux/out_obi002_0004.png
flux/out_obi002_0005.png

The output root file name convention is the same as above, with the trailing zero-padded, four-digit number corresponding to the source number. The plot of the merged results for source 4 is shown in Figure 2.

Figure 2: Spectral Fit Results

[This shows a standard Sherpa plot of the spectrum on the top with the best fit model drawn as a solid connected line. The residuals are shown below.]

The spectral fit results for the combined spectrum, ARF, and RMF for source number 4 located at 04:31:40.08 +18:13:56.50.


Summary

In this thread we introduced srcflux plugins. We described the requirements for creating a plugin and provided information about a catalog of sample plugins. Based on those examples we created our own plugin that fits an absorbed APEC spectral model to any source with more than 100 counts, both for each individual observation and for the merged products.

Beyond the examples provided, users could write plugins that access third-party applications (installed separately) such as astropy or scipy. Plugins could also provide an interactive user interface, for example by launching ds9 or some other graphical application. The possibilities for user plugins are limitless.

As a final note: users should be aware that while most of srcflux is set up to run in parallel, plugins are always run serially, one at a time. Given the scope of what plugins can do, running them in parallel could cause unmanageable race conditions.


History

14 Dec 2022 First publication of this thread.