Last modified: December 2023

URL: https://cxc.cfa.harvard.edu/ciao/ahelp/convert_xspec_user_model.html
AHELP for CIAO 4.16

convert_xspec_user_model

Context: Tools::Utilities

Synopsis

Compile a XSPEC user model for use in Sherpa *experimental*

Syntax

convert_xspec_user_model name modelfile

Unlike most CIAO contributed scripts, there is no parameter file.

The supported command-line flags can be found using -h/--help:

--udmget build with the udmget package (FORTRAN only)
--udmget64 build with the udmget64 package (FORTRAN only)
-l / --local build the package locally rather than globally
-p / --prefix control the prefix added to the model names
--pyver set the version number of the Python module
-c / --clobber will overwrite existing files.
-v / --verbose changes the amount of screen output.
--version prints the script version

Description

The convert_xspec_user_model tool will compile a XSPEC user model into a form usable by Sherpa; it is therefore similar to the "initpackage" command in XSPEC. Unlike initpackage, this script does not require that the XSPEC source package is installed, and it is run from the command line rather than from within Sherpa.

This script is *experimental* and will not work for all models; please see the "Known Problems" section below for more information and contact the CXC HelpDesk if you find problems or models that it will not compile.

The script takes the model file - often called model.dat or lmodel.dat - and the source code in the working directory, using them to create a Python module consisting of the compiled model and some supporting Python code that creates the Sherpa model. The output is a Python module, whose name is the first argument to the script. This can be loaded into Sherpa by saying (assuming that you ran 'convert_xspec_user_model relxill ...'):

sherpa> import relxill.ui

Loading the module

By default the module is installed globally, so that this import should work from any directory, but you can use the --local flag to build a version of the code that requires either changing to the directory in which you ran convert_xspec_user_model or adding that directory to the Python path list.

Using the models

At this point the models can then be used; the default behavior of the script is to add the prefix "xsum" to the model name (in a similar manner to how Sherpa uses the prefix "xs" for the XSPEC models), but this can be changed using the --prefix flag, as shown below in the examples.

Table models

Sherpa already supports additive, multiplicative, and exponential XSPEC table models using the load_xstable_model command (support for exponential table models was added in CIAO 4.14).
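
For example, an additive table model can be loaded without needing convert_xspec_user_model at all; a minimal sketch, where the file name "mytable.fits" and the component names "tbl" and "gal" are placeholders:

sherpa> load_xstable_model("tbl", "mytable.fits")   # "mytable.fits" is illustrative
sherpa> set_source(xsphabs.gal * tbl)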

Removing the models

The models can be removed by running the following command (replacing name with the value used when running convert_xspec_user_model):

unix% pip uninstall name

This step is not needed if you used the --local flag.


Examples

Example 1

unix% ls *.f
mdl.f
unix% convert_xspec_user_model mymodel lmodel.dat

The mymodel module is created, which contains the model(s) defined in the file lmodel.dat and with code in mdl.f. In this example it defines two models called, imaginatively enough, mdl1 (additive) and mdl2 (multiplicative). If the script completes successfully you can then load the models into Sherpa with the command:

sherpa> import mymodel.ui
Adding additive XSPEC local model: xsummdl1
Adding multiplicative XSPEC local model: xsummdl2

At this point any additive, convolution, and multiplicative models will be available using the XSPEC name with the prefix "xsum". So (assuming mdl2 has an alpha parameter) we can use them like any other XSPEC model:

sherpa> set_source(xsphabs.galabs * xsummdl1.mdl)
sherpa> print(mdl)
sherpa> xsummdl2.gas
sherpa> gas.alpha = 2

Example 2

sherpa> from sherpa.utils.logging import SherpaVerbosity
sherpa> with SherpaVerbosity("WARN"):
...         import mymodel.ui

When the "model.ui" module is loaded a line is displayed for each model that is added to Sherpa. This output is controlled by the Sherpa logging interface and can be turned off temporarily, as shown above, to hide this output. Note that the final line, re-setting the level, should not be forgotten or else you may miss useful information from Sherpa functions.

Example 3

unix% convert_xspec_user_model mymodel2 lmodel.dat --prefix

In this case the model names are not preceded by "xsum", as they were in the example above, because the --prefix argument has been given with no value. This means that instead of saying xsummdl1.mdl you would just say mdl1.mdl.

Note that there is only a limited check that the model names do not match existing Sherpa models or function names, so there is a chance for confusion.

To use a different prefix than "xsum", supply an argument to the --prefix flag; it must start with a capital letter - e.g.

--prefix XS

would make the model name xsmdl1 rather than xsummdl1. The requirement for a capital letter is because the prefix is also used to create a Python class (this class is not used in normal use of Sherpa).
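
Returning to the mymodel2 module built above with a bare --prefix flag, a minimal sketch of how the un-prefixed models would then be used (the instance names gas and clus are arbitrary):

sherpa> import mymodel2.ui
sherpa> set_source(mdl2.gas * mdl1.clus)
sherpa> gas.alpha = 2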

Example 4

unix% ls *.c
polar.c polarutils.c
unix% convert_xspec_user_model polar lmodel_polar.dat --clobber

Here the models in the file lmodel_polar.dat - with C code automatically picked up from the polar.c and polarutils.c files - are compiled to create a polar module. The --clobber argument is used to delete any existing version of the module.

Example 5

unix% cd /data/xspeclmodels/foo
unix% convert_xspec_user_model foo model.dat --local

This builds the module "foo" but does not install it into Python. To use the module you either need to change to this directory or add the directory to the Python search path, for example:

import os
os.sys.path.insert(0, '/data/xspeclmodels/foo/src')
import foo.ui

Loading the module

The default behavior is to add the module to the Python installation, in which case the module can be loaded with just the command (replacing name with the name given when running convert_xspec_user_model):

sherpa> import name.ui

If you used the --local option then you either have to run Python (or Sherpa) from the directory where you ran convert_xspec_user_model, or adjust the Python search path, either by setting the PYTHONPATH environment variable or by changing the os.sys.path list. For example, the following will allow models defined in /data/models/ to be loaded:

unix% setenv PYTHONPATH /data/models

or

sherpa> import os
sherpa> os.sys.path.insert(0, "/data/models")

Automatically loading the module into Sherpa

Models can automatically be loaded into Sherpa by taking advantage of the IPython startup directory. Any Python file in $HOME/.ipython-ciao/profile_sherpa/startup/ will be loaded when Sherpa is started up, so add a file to this directory - e.g. 90-xspec-models.py - with commands to set up the models. An example file is shown below; adjust as appropriate:

import os

# If needed, set the path with commands like
#   os.sys.path.insert(0, "/data/models/carbatm")
#
import carbatm.ui
import relxill.ui

# The carbatm model needs to know where the data files are
# via XSPEC settings; relxill via an environment variable:
#
set_xsxset("CARBATM", "/data/models/carbatm")
os.environ["RELLINE_TABLES"] = "/data/models/relxill"

Starting Sherpa will then display a line for each model that is loaded, which can be useful but quickly gets annoying. To hide these messages use the SherpaVerbosity context manager (new in CIAO 4.14):

from sherpa.utils.logging import SherpaVerbosity

with SherpaVerbosity('WARN'):
    import carbatm.ui
    import relxill.ui

Notes

This only works for the Sherpa application. If you have loaded Sherpa into an IPython session or a script then you will have to manually load the models you need.
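
For example, a standalone Python script would need to load both the Sherpa interface and the model module explicitly; a minimal sketch, assuming a globally-installed module called relxill:

from sherpa.astro.ui import *   # the Sherpa session functions
import relxill.ui               # registers the local models with the session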

Known Problems

Please contact the CXC HelpDesk if you have a problem or need to use an unsupported feature, such as a platform or language not listed below.

Platform support

If possible the conda installation of CIAO is to be preferred (as the compiler versions are likely to better match those used to build CIAO).

The macOS support is limited. It may be necessary to set the following environment variables (adjusted to point to the locations on your system):

setenv CC `which clang`
setenv CXX `which clang++`
setenv FC `which gfortran`
setenv CONDA_BUILD_SYSROOT /opt/MacOSX10.9.sdk

Language support

All the languages supported by XSPEC user models - i.e. C, C++, and Fortran - should be supported, as long as the compiler is compatible with that used to compile the XSPEC models in CIAO (see below).

Model types

Sherpa only supports the additive, multiplicative, and convolution models; other models will be ignored.

Models that need to be re-evaluated per spectrum, or that calculate model variances, are converted, but there has been no testing to see if they work correctly (note that Sherpa ignores the error array that is returned by these models).

When using convolution models, one possible problem is if you have ignored an energy - or wavelength - range within the data: if this range is larger than the width of the RMF response at that energy then the convolution may not properly account for edge effects.

Choice of module name

The first argument to convert_xspec_user_model is used to create a Python module of the same name that can then be loaded into Sherpa or a Python script. It cannot match the name of one of the models (once the prefix value has been added), and it should not match other Python modules or common symbols, otherwise it could confuse users. There is only limited checking to see whether there are possible name clashes for the module and model names.

Parameter types

Periodic parameters are not supported.

Accessing XSPEC routines

The default behavior is to try and link against the XSPEC libraries, but the results have not been well tested. One known problem is if the gfortran version is significantly different to that used to build the XSPEC models provided as part of CIAO, which can lead to errors such as

undefined symbol: _gfortran_copy_string

External files or set up

The equivalent location of the spectral/xspec/ directory of XSPEC is the spectral/modelData/ directory within $ASCDS_INSTALL. The location depends on whether CIAO was installed with conda or ciao-install, and can be found with the sherpa.astro.xspec.get_xspath_model() routine. The set_xsxset command can be used to set an XSPEC "xset" variable, if the model uses these to define alternative locations for data files or other configuration information. The os.environ dictionary can be used to set an environment variable if these are used by the model; for example

sherpa> set_xsxset('CARBATM', '/data/models/data/carbatm')
sherpa> os.environ['TBLLOC'] = '/data/models/data'

will set the CARBATM XSPEC variable and the TBLLOC environment variable.
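
The model-data location can be checked from Sherpa with the get_xspath_model routine mentioned above, for example:

sherpa> from sherpa.astro import xspec
sherpa> print(xspec.get_xspath_model())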

What to do if the module fails to compile?

Unsupported models

The script will error out if there are no supported models in the modelfile (e.g. 'lmodel.dat'). Please contact the CXC HelpDesk if this happens, but please note that there is no support in Sherpa for these types of models.

Problems while building the code

If there is a problem when building the interface the resulting error message is unlikely to be useful. In this case try running

unix% pip install . --verbose

as it should provide more information. Possible problems are models that refer to XSPEC code from older (or newer) versions of XSPEC than used by CIAO, as well as access to certain parts of the XSPEC build system. Some of these can be worked around - such as removing the problematic models or tweaking how the extension module is built in setup.py - and the CXC HelpDesk may be able to help.

Parameter names

Not all parameter names are usable in Sherpa, due to the Python grammar, so some names are adjusted during the conversion. If you are in doubt then just print out an instance of a model to see what the parameter names are.
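
For example, using the models from Example 1, creating a component and printing it lists each parameter name as Sherpa will accept it (the component name "tst" is arbitrary):

sherpa> import mymodel.ui
sherpa> print(xsummdl1.tst)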

What does the script do?

The compilation of the code is based on the approach taken in the xspeclmodels project, which uses the build structure provided by Python and Sherpa. There are two parts to the process: parsing the model definition file and creating the compiled code.

Parsing the model definition file

The model definition file is parsed to extract all the models, which provides details on the model name, its type (such as additive or multiplicative), the arguments, the name of the routine that evaluates the model, and the language used for this routine. The parsing follows the definition given in the XSPEC appendix on local models and is based on the sherpa.astro.utils.xspec.parse_xspec_model_description routine from Sherpa.
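
A minimal sketch of this first step, calling the Sherpa routine directly (the file name lmodel.dat is just an example):

from sherpa.astro.utils.xspec import parse_xspec_model_description

# Parse the local-model definition file; one entry is returned per model
mdls = parse_xspec_model_description("lmodel.dat")
for mdl in mdls:
    print(mdl)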

Creating the compiled code

The compiled code is built following the approach Sherpa uses to build its interface to the XSPEC model library. The C++ code is generated in the src/{modname}/src/_models.cxx file. The sherpa.astro.utils.xspec.create_xspec_code routine is used to create the files, although some post-processing is needed to fix up known problems and to integrate it into a full Python module.
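
As a rough sketch of how the two routines combine (the script itself then post-processes the generated code before writing out the module files):

from sherpa.astro.utils.xspec import parse_xspec_model_description, create_xspec_code

mdls = parse_xspec_model_description("lmodel.dat")
code = create_xspec_code(mdls)   # holds the generated Python and C++ code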

What files are compiled?

The following files in the current working directory are compiled: *.f, *.f03, *.f90, *.c, *.cxx, *.C, *.cpp, and *.cc. The CXX files are checked and any that match lpack*.cxx or *FunctionMap.cxx are removed, as they are assumed to have been created by XSPEC's initpackage call. Please contact the CXC HelpDesk if you have problems with this file selection.

The selected files are compiled via the Python Extension class (details are in the setup.py file), although there is some extra work to compile FORTRAN files using this interface, including support for the udmget interface. Any improvements to this code would be gratefully received!

Creating the python code

The script creates two Python files: src/{modname}/__init__.py - which defines the Python classes that represent the models - and src/{modname}/ui.py - which should be used to load the models into a Sherpa session. The model class names are created by appending the model name from the definition file to the prefix value (which defaults to "XSUM"); since Python classes are expected to start with a capital letter the prefix must be capitalized (or, if blank, then the first character of the model name is capitalized).
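
An illustrative sketch, assuming the mdl1 model from Example 1 and the default "XSUM" prefix (so the class would be called XSUMmdl1); creating the class directly is not needed in normal use of Sherpa:

sherpa> from mymodel import XSUMmdl1   # class name: prefix + model name
sherpa> cpt = XSUMmdl1("cpt")
sherpa> print(cpt)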

Changes in the scripts 4.16.0 (December 2023) release

Updated to match Sherpa in CIAO 4.16.

Changes in the scripts 4.15.1 (January 2023) release

Fix a problem that meant the script could not be used on models that did not include any C++ files. The script will now also refuse to create a module that matches any of the model names, as that would cause confusion.

Initial support for FORTRAN models requiring udmget

If the XSPEC compilation includes the -udmget or -udmget64 flag in the initpackage call then the --udmget or --udmget64 flag should be used when calling convert_xspec_user_model. This is not guaranteed to work, so please contact the CXC HelpDesk if there are any issues.

Changes in the scripts 4.15.0 (December 2022) release

The script has been updated to work with Sherpa in CIAO 4.15, as the interface to the XSPEC model library was changed. The script will now exclude any file that appears to have been created by a call to the initpackage routine of XSPEC.

Changes in the scripts 4.14.4 (November 2022) release

Support for models that use *.cpp files has been added, such as recent versions of relxill, and the order that files are compiled has been switched to alphabetical, as this appears to be the approach taken by XSPEC and fixes an issue with the reltrans set of models. The build system has been simplified which should reduce the time taken to build, and avoid possible issues with Sherpa and NumPy.

Changes in the scripts 4.14.0 (December 2021) release

The script has been updated to handle changes in XSPEC 12.12.0 (as it makes building packages easier) and in Sherpa. In particular the parsing of XSPEC model files is now handled by the sherpa.astro.utils.xspec.parse_xspec_model_description routine and the code generation relies on the sherpa.astro.utils.xspec.create_xspec_code routine, both new in Sherpa 4.14.0.

Please contact the CXC HelpDesk if you are unable to run the script with a local model.

Changes in the scripts 4.13.0 (December 2020) release

The script has been re-worked and added back to the contributed scripts package. Please contact the CXC HelpDesk if you have problems. One major change is that the models should be imported using the model name plus ".ui"; that is

sherpa> import mymodel.ui

whereas in previous versions you would have just imported the mymodel module.

No support for the lmod function

In earlier versions we provided an lmod function which could load the module. Due to changes in how the module is built this functionality has been removed.

Convolution models

The code takes advantage of the support for XSPEC convolution models added in CIAO 4.13, which means that it no longer creates routines starting with load_xxx, but just lets you create the model components as any other model.

Changes in the scripts 4.11.1 (December 2018) release

The script has not been updated to work with XSPEC 12.10.0e (which is distributed as part of CIAO 4.11). Please contact the CXC HelpDesk if you find this script useful.

Changes in the scripts 4.9.2 (April 2017) release

The script should now work with XSPEC convolution models.

Changes in the scripts 4.9.1 (December 2016) release

The script has been updated to work with CIAO 4.9 but has only seen very limited testing. There is still no support for convolution models.

Changes in the scripts 4.8.2 (January 2016) release

Support for models with an initialization string

XSPEC models which use an initialization string, such as the snapec model, can now be converted.

Changes in the scripts 4.8.1 (December 2015) release

This script has not been updated to reflect changes made in Sherpa in this release. Please contact the CXC HelpDesk if you need to use this script.

Changes in the scripts 4.6.6 (September 2014) release

The script is new in this release.

Notes

This script is not an official part of the CIAO release but is made available as "contributed" software via the CIAO scripts page. Please see the installation instructions page for help on installing the package.


Bugs

See the bugs page for this script on the CIAO website for an up-to-date listing of known bugs.
