Date: Fri, 8 Mar 2013 15:58:21 -0500 (EST)
From: Ken Young 
To: Mark A. Gurwell 
Cc: aargon@cfa.harvard.edu, cqi@cfa.harvard.edu, jzhao@cfa.harvard.edu, dwilner@cfa.harvard.edu, spaine@cfa.harvard.edu
Subject: Re: Modification to the tsys_read file

   I think we should store only measured quantities in the data set,
rather than things like atmospheric models.   If we want to use our
measured Tsys values as an input to an atmospheric model that might
give us Tsys for each channel, it seems to me that's a job for the
calibration software, not the realtime system.   Perhaps we should
abandon storing a Tsys for each chunk, since there is no reason to expect
that Tsys will change right at the boundary of a chunk, and we're never
apt to measure Tsys finely enough to provide a Tsys for each chunk
anyway.   What would you think about us moving to a variable-length
Tsys file (something like sch_read) where the file contains n Tsys values
and n descriptors specifying the frequency boundaries between which that
particular Tsys measurement was acquired?   Then it would be up to the
data reduction software to figure out how to use those Tsys vs frequency
values to produce a per-chunk, or even per-channel, Tsys value.

Taco

On Fri, 8 Mar 2013, Mark A. Gurwell wrote:

> Ultimately it might be cool to have the possibility to interpolate
> the Tsys values across the 12 GHz and apply as finely as possible.
> If we were really really nifty we'd couple the measurements to the
> "am" model of Scott Paine's, such that we could better handle the
> O3 lines.   But that will require some extra thought and a fair
> amount of work.  Perhaps we should discuss this at some point
> maybe with Scott also?
>
> I think we should definitely figure out a way to save all the Tsys
> data in a format that we can use, interpolate, and apply as finely
> as channel by channel (note, I am *not* suggesting we stick
> a Tsys value on each channel! With SWARM that would be a big waste
> of space...).
>
> Mark
>       From rtm@cfa.harvard.edu  Fri Mar  8 15:19:36 2013
>       Date: Fri, 8 Mar 2013 15:19:35 -0500 (EST)
>       From: Ken Young 
>       To: cqi@cfa.harvard.edu, jzhao@cfa.harvard.edu, mgurwell@cfa.harvard.edu,
>               aargon@cfa.harvard.edu
>       Subject: Modification to the tsys_read file
>
>       Dear SMA Data Format Stakeholder,
>
>          Charlie pointed out that I was not writing Tsys values into the
>       tsys_read file for the new SWARM correlator chunks (s49 and s50).
>       Indeed, I had forgotten even to expand the size of the Tsys array to make
>       room for them.   That problem has been fixed.   From now on the tsys_read
>       file will have slots for 50 chunk temperatures (in the newFormat copy
>       of the data, of course - the standard data pathway is unchanged).
>       I have attached to this missive a copy of the new mirStructures.h file
>       which defines the format.   I have also updated the "Proposed new SMA data
>       file format" wiki page.
>
>          This brings up a trickier issue.   Around the time that we get the
>       SWARM correlator, we will also get new continuum detectors which are
>       designed to allow us to measure a separate Tsys value for each 1 GHz
>       interval along our 12 GHz IF (it may even handle a 16 GHz IF).   So at
>       long last, the tsys_read file will not just contain a zillion copies of
>       the same Tsys values, repeated for every chunk.   The problem is, the
>       new SWARM correlator chunks will be so wide that we will have more than
>       one Tsys measurement for each of those chunks.   What should we do about
>       that?
>
>       Taco