A very large MARX simulation was run to calibrate the optimal source region for extraction of LETG spectra. Many wads of rays with a flat spectral distribution in wavelength space were input to the simulation, resulting in somewhat fewer wads of output rays evenly distributed along the dispersion axis. To match the definition of the MPE grating model, the simulated data were filtered to contain only photons with a primary diffraction order of 1; all orders of support-structure diffraction were included. The plot below shows the locations of 10^6 of the filtered output rays, with the current LETG extraction region overplotted in light blue.
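The order filtering described above can be sketched as follows. This is a minimal illustration, not the actual pipeline: the column names (`wav`, `disp`, `xdisp`, `order`) and the inline sample values are placeholders for whatever layout the real MARX ray list uses.

```python
import numpy as np

# Hypothetical ray list: wavelength (A), dispersion coordinate,
# cross-dispersion coordinate, and primary diffraction order.
# Column names and values are illustrative placeholders only.
rays = np.array(
    [(5.0, 1.2, 0.01, 1), (5.1, 1.3, 0.02, 2), (6.0, 1.5, -0.01, 1)],
    dtype=[("wav", "f8"), ("disp", "f8"), ("xdisp", "f8"), ("order", "i4")],
)

# Keep only photons whose primary diffraction order is 1, matching
# the definition of the MPE grating model.
first_order = rays[rays["order"] == 1]
print(len(first_order))
```

In practice the same boolean mask would be applied to the full simulated event list rather than a hand-built array.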
For each 1 A bin in the dispersion direction, the percentage of total counts falling within the extraction region was calculated. A plot of these values across the detector is shown below. There were no input rays with wavelengths shorter than 2 A; any photons in this region are due to scattering, so the low extraction percentages there are meaningless.
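The per-bin calculation amounts to two histograms and a ratio. The sketch below uses synthetic rays and a toy constant-half-width extraction region purely for illustration; the real region geometry and wavelength range differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the filtered rays: dispersion wavelength (A)
# and cross-dispersion offset.  Distributions are illustrative only.
wav = rng.uniform(2.0, 170.0, 100_000)
xdisp = rng.normal(0.0, 0.5, wav.size)

# Toy extraction region: |cross-dispersion| below a fixed half-width.
half_width = 1.0
inside = np.abs(xdisp) < half_width

# Percentage of counts inside the region for each 1 A dispersion bin.
bins = np.arange(2.0, 171.0, 1.0)
total, _ = np.histogram(wav, bins=bins)
extracted, _ = np.histogram(wav[inside], bins=bins)
pct = np.where(total > 0, 100.0 * extracted / np.maximum(total, 1), np.nan)
```

For a Gaussian cross-dispersion profile with sigma 0.5, a half-width of 1.0 corresponds to roughly 95% enclosed, so each bin of `pct` should sit near that value.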
Fits to the positive- and negative-order sides of the extraction efficiency curve were made for use in later analyses and are overplotted in red below. The maximum deviation of the fits is on the order of 2%; the standard deviation is on the order of 10^-4. The few-percent dip on the long-wavelength side of the negative-order plot is due to the upper extraction limit nibbling into the source region, and will be addressed in the next version of the region.
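A fit of this kind, with residual statistics matching those quoted above, might look like the following. The efficiency curve here is a fabricated smooth trend with small structure; the fitting function and polynomial degree are assumptions, not the actual model used.

```python
import numpy as np

# Hypothetical per-bin extraction efficiencies (fraction of counts in
# the region) versus wavelength, one side of the dispersion axis.
wav = np.arange(2.5, 170.0, 1.0)            # bin centres, A (illustrative)
eff = 0.99 - 1e-4 * wav + 1e-4 * np.sin(wav)  # toy efficiency curve

# Low-order polynomial fit to the efficiency curve.
coeffs = np.polyfit(wav, eff, deg=3)
model = np.polyval(coeffs, wav)

# Residual statistics, analogous to the quoted max and std deviations.
resid = eff - model
max_dev = np.abs(resid).max()
std_dev = resid.std()
print(f"max deviation: {max_dev:.1e}, std deviation: {std_dev:.1e}")
```

Whatever functional form is used, checking both the maximum and the standard deviation of the residuals guards against a fit that is good on average but poor in one wavelength range.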
The fitted extraction efficiency is available in tabular form here.