I am examining an ACIS-S3 50 ks exposure and find a period of exactly 1000
seconds. I used wavdetect to get all the source regions with the default
ellsigma setting, then picked out the bright sources with Nphoton > 2000 and
used XRONOS to carry out the timing analysis. One source with 43000+ X-ray
photons shows a period of exactly 1000 seconds; its folded light curve is a
sine curve with a modulation amplitude of about 20%.
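For reference, the folding step is conceptually what the short Python sketch
below does (a sketch only, not the actual XRONOS code: the file name
evt2.fits, the EVENTS extension, the TIME column, and the 20 phase bins are
placeholder assumptions, not my real settings):

    import numpy as np
    from astropy.io import fits

    period = 1000.0   # candidate period in seconds
    nbins = 20        # phase bins for the folded light curve

    # Read photon arrival times from the event list
    with fits.open("evt2.fits") as hdul:
        times = hdul["EVENTS"].data["time"].astype(float)

    # Fold arrival times at the candidate period and bin in phase
    phase = np.mod(times - times.min(), period) / period
    counts, _ = np.histogram(phase, bins=nbins, range=(0.0, 1.0))

    # Fractional modulation: half the peak-to-trough swing over the mean
    mean = counts.mean()
    amp = (counts.max() - counts.min()) / (2.0 * mean)
    print("folded profile counts per bin:", counts)
    print("approximate fractional amplitude: %.2f" % amp)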
The other bright sources and the background events do not show such a period.
The period is so exactly 1000.00 seconds that I cannot help thinking it is
some kind of artifact. Has anyone run into this problem before?