DG Notes on 20170904 Event

Purpose

There are various scripts for doing the imaging and self-calibration, but they require some explanation of how to use them. This page provides a step-by-step walkthrough of the analysis of one event, the M5.5 flare that occurred on 2017-Sep-04. I did an initial study of this event earlier, but in a hurried fashion in order to use the results for a proposal (which was successful, by the way!). Here I would like to do a more careful job with the initial calibration and analysis, and then continue with a second round of self-calibration.

Initial Preparation

The data begin as raw data in Miriad format, recorded to disk by the pipeline system in the form of IDB files. Each file contains 10 minutes of data, and several such files together make up a scan. The easiest way to identify the time period of interest is to examine the overview spectral plots on the RHESSI Browser, with "EOVSA Radio Data" checked at the upper left. Use the time you identify to find the corresponding IDB file(s) under /data1/eovsa/fits/IDB/yyyymmdd/ (also accessible from the EOVSA web site). The scan containing this flare is quite a long one, starting at 18:53:33 UT and continuing to about 21:30:00 UT.

Pinning down the exact time range of interest takes some iteration, especially for such a long flare with several peaks and smaller events. After some checking, I settled on the three 10-min files starting at 20:23:33, 20:33:33, and 20:43:33 UT; the script below also includes the following file starting at 20:53:33 UT, to begin following the decay. Undoubtedly an additional one or two 10-min intervals after the end time of 21:03:33 would be useful for following the rest of the decay of the event, but in the interest of time, analysis of these will be postponed.
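
A quick way to list the IDB files that cover a chosen window is sketched below. It assumes only the directory layout and IDByyyymmddhhmmss file naming described above; find_idb_files is an illustrative helper written for this page, not an EOVSA routine, and the script in the next section does the equivalent selection properly using the FDB records.

import os
from datetime import datetime

def find_idb_files(tstart, tend, root='/data1/eovsa/fits/IDB/'):
    # Return full paths of IDB files under root/yyyymmdd/ whose start times
    # (taken from the IDByyyymmddhhmmss filename) fall between tstart and tend
    daydir = os.path.join(root, tstart.strftime('%Y%m%d'))
    files = []
    for name in sorted(os.listdir(daydir)):
        if not name.startswith('IDB') or len(name) < 17:
            continue
        try:
            t0 = datetime.strptime(name[3:17], '%Y%m%d%H%M%S')
        except ValueError:
            continue
        if tstart <= t0 <= tend:
            files.append(os.path.join(daydir, name))
    return files

# For the window used below this returns the four files starting 20:23:33 through 20:53:33 UT
print(find_idb_files(datetime(2017, 9, 4, 20, 20), datetime(2017, 9, 4, 21, 0)))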

Importing EOVSA Data

Once the desired time range is known, the next step is to convert the raw IDB files to calibrated CASA measurement set (ms) files. The script below does this from within CASA, and it must be run on pipeline, which has access to the raw data and the SQL database. The script may take 30 minutes or more to run.

from suncasa.tasks import task_calibeovsa as calibeovsa
from suncasa.tasks import task_importeovsa as timporteovsa
import dump_tsys as dt
from util import Time
import numpy as np
import os
# import copy
get_tp = True
workpath = '/data1/dgary/solar/20170904_Mflare/'
if not os.path.exists(workpath):
    os.makedirs(workpath)
os.chdir(workpath)
# This time range will read files IDB20170904202333, IDB20170904203333, IDB20170904204333, and IDB20170904205333
trange = Time(['2017-09-04 20:20:00', '2017-09-04 21:00:00'])
# Read the FDB records (per-file metadata) for this date and select the solar files
# (SOURCEID 'Sun', PROJECTID 'NormalObserving') whose start times fall within trange
info = dt.rd_fdb(Time('2017-09-04'))
sidx = np.where(
    np.logical_and(info['SOURCEID'] == 'Sun', info['PROJECTID'] == 'NormalObserving')
    & np.logical_and(info['ST_TS'].astype(float) >= trange[0].lv,
                     info['ST_TS'].astype(float) <= trange[1].lv))
filelist = info['FILE'][sidx]
outpath = 'msdata/'
if not os.path.exists(outpath):
    os.makedirs(outpath)
inpath = '/data1/eovsa/fits/IDB/{}/'.format(trange[0].datetime.strftime("%Y%m%d"))
ncpu = 1
#
#
# Convert each selected IDB file into its own CASA measurement set under outpath
timporteovsa.importeovsa(idbfiles=[inpath + ll for ll in filelist], ncpu=ncpu, timebin="0s", width=1, visprefix=outpath,
                         nocreatms=False,
                         doconcat=False, modelms="", doscaling=False, keep_nsclms=False, udb_corr=True)
msfiles = [outpath + ll + '.ms' for ll in filelist]
# invis=copy.copy(msfiles)
concatvis = os.path.basename(msfiles[0])[:11] + '_2023-2103.ms'
# Apply the reference-phase and phase calibrations, flag antennas 13-15,
# and concatenate the calibrated ms files into a single measurement set
vis = calibeovsa.calibeovsa(msfiles, caltype=['refpha', 'phacal'], interp='nearest', doflag=True, flagant='13~15',
                            doimage=False, doconcat=True,
                            msoutdir='msdata', concatvis=concatvis, keep_orig_ms=True)

This will result in four measurement sets, IDB20170904202333.ms, IDB20170904203333.ms, IDB20170904204333.ms, and IDB20170904205333.ms, being created in the msdata/ subdirectory of the working directory, and then concatenated into a single ms named IDB20170904_2023-2103.ms.
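
As a quick sanity check before transferring anything, the concatenated measurement set can be summarized from within CASA using the standard listobs task; the listfile name here is just an example.

# Summarize the scans, fields, spectral windows, and times of the concatenated ms
listobs(vis='IDB20170904_2023-2103.ms', listfile='IDB20170904_2023-2103.listobs')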

At this stage, the calibrated, concatenated ms can be transferred to baozi for further processing. For example, the command

tar -czf IDB20170904_2023-2103.ms.tgz IDB20170904_2023-2103.ms

creates a compressed tar file that can be copied directly to baozi if the appropriate tunnel exists, or it can be staged in two steps. The file is 1 GB in size, so it could take some time to transfer.