{5} Assigned, Active Tickets by Owner (Full Description) (21 matches)
List assigned tickets, grouped by ticket owner. This report demonstrates the use of full-row display.
Kana Sugimoto (4 matches)
Ticket | Summary | Component | Milestone | Type | Severity | Created
---|---|---|---|---|---|---
Description
#234 | improve asapplotter by using Matplotlib 1.0.0 functions | General | Unified development | defect | normal | 14 years ago
Matplotlib 1.0.0 has improvements in plotting subplots, etc. Look into the new APIs and update asapplotter.
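For context, Matplotlib 1.0.0 added helpers such as pyplot.subplots, which builds a figure and a grid of axes (optionally with shared axes) in one call. A minimal generic sketch of that API, not asapplotter itself; the Agg backend and output file name are chosen only for the demo:

```python
# Sketch of the Matplotlib 1.0.0-era subplot API the ticket refers to
# (plain Matplotlib usage, not the asapplotter code).
import matplotlib
matplotlib.use("Agg")          # headless backend; no display needed
import matplotlib.pyplot as plt

# plt.subplots (new in Matplotlib 1.0.0) creates the figure and a 2x2
# grid of axes in one call, with axes shared between panels.
fig, axes = plt.subplots(2, 2, sharex=True, sharey=True)
for i, ax in enumerate(axes.flat):
    ax.plot([0, 1, 2], [0, i, 2 * i])
    ax.set_title("panel %d" % i)
fig.savefig("panels.png")
```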
#287 | Improve pointing plots | General | Unified development | enhancement | normal | 12 years ago
Improve plots produced by asapplotter.plotpointings:
#290 | Implement sideband separation algorithm in ASAP (from CAS-4141) | General | Unified development | defect | normal | 12 years ago
ALMA Bands 9 and 10 are observed with double-sideband receivers. It is necessary to develop a way to suppress and/or separate signals from the sidebands. The algorithm to separate sidebands recommended by the Commissioning and Science Verification team of ALMA is based on a Fourier transform technique (Emerson, Klein, and Haslam 1979, A&A 76, 92).
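As a toy illustration of the idea behind such techniques (not the actual Fourier-domain algorithm of the cited paper): when the LO is shifted between observations, signal-sideband features stay put after realignment while image-sideband features move, so averaging the realigned observations suppresses the image. Channel positions and offsets below are invented for the demo.

```python
# Toy model: for LO offset d, each observation sees the signal sideband S
# shifted by +d and the image sideband I shifted by -d. Realigning on the
# signal and averaging keeps S intact while smearing I across channels.

def observe(signal, image, d):
    """Simulate one observation with LO offset d (circular shifts)."""
    n = len(signal)
    return [signal[(ch + d) % n] + image[(ch - d) % n] for ch in range(n)]

def align_and_average(observations, offsets):
    """Undo each observation's signal shift, then average."""
    n = len(observations[0])
    acc = [0.0] * n
    for obs, d in zip(observations, offsets):
        for ch in range(n):
            acc[ch] += obs[(ch - d) % n]
    return [v / len(observations) for v in acc]

n = 64
signal = [0.0] * n; signal[10] = 1.0   # line in the signal sideband
image = [0.0] * n; image[40] = 1.0     # contaminating image-sideband line
offsets = [0, 1, 2, 3]
obs = [observe(signal, image, d) for d in offsets]
avg = align_and_average(obs, offsets)
# avg[10] stays 1.0; the image line is spread over several channels at 0.25
```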
#296 | Enable polarization average of spectra without time averaging. (from CAS-5194) | c++ | Unified development | enhancement | normal | 11 years ago
Reporter: Erik Muller (EA-ARC). Polarizations are not averaged together correctly; currently, they include a temporal average by default.
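The distinction can be illustrated with a pure-Python sketch (indexing and function names invented for the demo; this is not the ASAP API): averaging over the polarization axis should keep one spectrum per integration, whereas the reported behaviour also collapses the time axis.

```python
# Spectra indexed as data[time][pol][channel].

def average_pols(data):
    """Average over the polarization axis only, keeping one spectrum
    per time stamp (the requested behaviour)."""
    out = []
    for pols in data:                      # one entry per integration
        nchan = len(pols[0])
        npol = len(pols)
        out.append([sum(p[ch] for p in pols) / npol for ch in range(nchan)])
    return out

def average_pols_and_time(data):
    """What the ticket says currently happens: a single spectrum,
    averaged over both polarization and time."""
    per_time = average_pols(data)
    nchan = len(per_time[0])
    return [sum(s[ch] for s in per_time) / len(per_time) for ch in range(nchan)]

# two integrations, two polarizations, three channels
data = [
    [[1.0, 2.0, 3.0], [3.0, 4.0, 5.0]],
    [[5.0, 6.0, 7.0], [7.0, 8.0, 9.0]],
]
print(average_pols(data))           # [[2.0, 3.0, 4.0], [6.0, 7.0, 8.0]]
print(average_pols_and_time(data))  # [4.0, 5.0, 6.0]
```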
Malte Marquarding (17 matches)
#196 | Create new Scantable definition | wiki | Unified development | enhancement | normal | 14 years ago
Make a formal wiki page for the Scantable version 4 design.
#212 | On OSX 10.6: IERSpredict not found despite successfully running asap_update_data | General | ASAP 3.0 | defect | blocker | 14 years ago
Then, when running my reduction scripts, these are the messages; the script finally fails on average_pols (which may or may not be related). The same script runs fine on kaputar.

    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309) Requested data table IERSpredict cannot be found in the searched directories:
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ ./
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ ./data/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Users/balt/aips++/data//ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Library/Python/2.6/site-packages/asap/data//ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Users/balt/aips++/data//geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Library/Python/2.6/site-packages/asap/data//geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/geodetic/
    2010-09-23 05:50:35 INFO MeasIERS::initMeas(MeasIERS::Files) Cannot read IERS table IERSpredict
    2010-09-23 05:50:35 INFO MeasIERS::initMeas(MeasIERS::Files) + Calculations will proceed with lower precision
    2010-09-23 05:50:35 INFO MeasTable::dUT1(Double) No requested dUT1 data available from IERS tables.
    2010-09-23 05:50:35 INFO MeasTable::dUT1(Double) + Proceeding with probably less precision.
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309) Requested data table TAI_UTC cannot be found in the searched directories:
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ ./
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ ./data/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Users/balt/aips++/data//ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Library/Python/2.6/site-packages/asap/data//ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/ephemerides/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Users/balt/aips++/data//geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /Library/Python/2.6/site-packages/asap/data//geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/geodetic/
    2010-09-23 05:50:35 WARN MeasIERS::fillMeas(MeasIERS::Files, Double) (file measures/Measures/MeasIERS.cc, line 309)+ /local/mar637/pyenv/share/casacore/data/geodetic/
    2010-09-23 05:50:35 SEVERE MeasTable::dUTC(Double) (file measures/Measures/MeasTable.cc, line 6295) Cannot read leap second table TAI_UTC
    2010-09-23 05:50:35 WARN CoordinateUtil::makeFrequencyMachine Unable to convert between the input and output SpectralCoordinates
    2010-09-23 05:50:35 WARN CoordinateUtil::makeFrequencyMachine+ this probably means one is in the REST frame which requires
    2010-09-23 05:50:35 WARN CoordinateUtil::makeFrequencyMachine+ the radial velocity - this is not implemented yet
    2010-09-23 05:50:35 SEVERE CoordinateUtil::makeFrequencyMachine A trial conversion failed - something wrong with frequency conversion machine
    2010-09-23 05:50:35 SEVERE CoordinateUtil::makeFrequencyMachine A trial conversion failed - something wrong with frequency conversion machine
    Aligned at reference Epoch 2010/09/21/13:04:07 (UTC) in frame BARY
    Traceback (most recent call last):
      File "./process.py", line 41, in <module>
        iav = av.average_pol(weight='tsys')
    AttributeError: 'NoneType' object has no attribute 'average_pol'
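The log above is the measures machinery walking a fixed list of directories and giving up. A pure-Python sketch of that search-order behaviour (the directory names are examples taken from the log; this is not the casacore implementation):

```python
import os

def find_table(table_name, search_dirs):
    """Return the path of the first directory that actually contains
    table_name, mirroring the search order shown in the MeasIERS log.
    None corresponds to the 'Requested data table ... cannot be found'
    case."""
    for d in search_dirs:
        candidate = os.path.join(d, table_name)
        if os.path.isdir(candidate):
            return candidate
    return None

# directories in the order the log shows them being tried
search_dirs = [
    "./",
    "./data/",
    "/Users/balt/aips++/data//ephemerides/",
    "/local/mar637/pyenv/share/casacore/data/ephemerides/",
]
hit = find_table("IERSpredict", search_dirs)
if hit is None:
    print("Requested data table IERSpredict cannot be found")
```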
#181 | plotting a saved file with one IF doesn't work | General | ASAP 3.0 | defect | major | 15 years ago
Code to demo the bug:

    d = scantable('2010-03-24_1445_316p81-0p06.rpf')
    m = d.auto_quotient()
    m.set_freqframe('LSRK')
    m.set_unit('GHz')
    m.freq_align()
    iffreqs = [23.6945e9, 24.13939e9]
    m.set_restfreqs(iffreqs)   # Set the (1,1) and (4,4) freqs
    pdq = m.average_pol(weight='tsys')
    av = average_time(pdq, weight='tsys', align=False)
    bsp = av.copy()
    bsp.save('blspec', overwrite=True)
    resp = scantable('blspec')
    plotter.plot(resp)

works, but

    ifno = 0
    sel = selector()
    sel.set_ifs(ifno)
    bsp.set_selection(sel)
    bsp.save('only1IF', overwrite=True)
    resp = scantable('only1IF')
    plotter.plot(resp)

dies with the message

    /usr/lib64/python2.6/site-packages/asap/asapplotter.py in _plot(self, scan)
        765                 y = scan._getspectrum(r, polmodes[scan.getpol(r)])
        766             else:
    --> 767                 y = scan._getspectrum(r)
        768             m = scan._getmask(r)
        769             from matplotlib.numerix import logical_not, logical_and
    RuntimeError: Table DataManager error: Invalid operation: MSM: no array in row 0 of /local/home/gaf/DATA/Parkes/MMB_NH3/temp14634_856/table.f1
#288 | SDFITS export import doesn't keep flags. | General | Unified development | defect | major | 12 years ago
SDFITS does not preserve flag data. Hi there, I am writing out ASAP data in SDFITS format and have noticed that flagged scans or cycles become unflagged when read back into ASAP, i.e. the process of writing to an SDFITS file appears to delete flag information. If I save in ASAP format the flags are preserved, but I need to write SDFITS format because the data then go through livedata/gridzilla. Later. Andrew xxx
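A generic, regression-style check for this kind of round-trip bug, with stand-in save/load functions (a real test would go through ASAP's SDFITS writer and reader; everything here is a placeholder):

```python
def roundtrip_preserves_flags(save, load, spectra, flags):
    """Write (spectra, flags) with `save`, read back with `load`,
    and report whether the flags survived the round trip."""
    stored = save(spectra, flags)
    _, flags_back = load(stored)
    return flags_back == flags

# Stand-ins illustrating the two behaviours described in the ticket:
def save_lossless(spectra, flags):
    return {"data": spectra, "flags": flags}                  # ASAP-format-like

def save_lossy(spectra, flags):
    return {"data": spectra, "flags": [False] * len(flags)}   # buggy-SDFITS-like

def load_any(stored):
    return stored["data"], stored["flags"]

spectra = [[1.0, 2.0], [3.0, 4.0]]
flags = [False, True]          # second row flagged
print(roundtrip_preserves_flags(save_lossless, load_any, spectra, flags))  # True
print(roundtrip_preserves_flags(save_lossy, load_any, spectra, flags))     # False
```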
#168 | scantable.save doesn't honour frequency conversions for non-ASAP output | General | ASAP 2.4 | defect | minor | 16 years ago
Frequency conversion is only done on-the-fly for
The current workaround is to use
This should become a new ticket; it breaks the table format -> major version change.
#111 | writing to SDFITS fails if data are pol averaged | General | | enhancement | normal | 18 years ago
using data /DATA/KAPUTAR_3/mopra/MOPSarchive/2007-06-25_2212-M232.rpf with these steps:

    s = scantable(fileinname)
    q = s.auto_quotient()
    tav = q.average_time(weight='tsys')
    polav = tav.average_pol()
    polav.set_unit('MHz')
    plotter.plot(tav)
    polav.save('NMLTAU2007-06-25_2212', 'SDFITS')

generates:

    Traceback (most recent call last):
    RuntimeError: poltype = stokes not yet supported in output.

The error isn't produced if the average_pol step is omitted.
#134 | scantable constructor doesn't work for multiple asap files | python | | enhancement | normal | 17 years ago
Need to handle multiple files via
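The requested behaviour can be sketched generically: accept either one filename or a list, open each file, and concatenate the results (ASAP exposes sd.merge for tables that are already open; the names below are placeholders, not the actual API):

```python
def open_tables(filenames, open_one):
    """Accept a single filename or a list of filenames, open each with
    `open_one`, and return the concatenated rows: the pattern the ticket
    asks the scantable constructor to support."""
    if isinstance(filenames, str):
        filenames = [filenames]
    rows = []
    for fn in filenames:
        rows.extend(open_one(fn))
    return rows

# toy opener: pretend each file holds two rows labelled by its name
fake_open = lambda fn: [fn + ":row0", fn + ":row1"]
print(open_tables("a.rpf", fake_open))            # ['a.rpf:row0', 'a.rpf:row1']
print(open_tables(["a.rpf", "b.rpf"], fake_open))
```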
#144 | Ability to create empty scantable? | General | | enhancement | normal | 16 years ago
I've recently written a script to load data stored as a two-column text file (velocity and flux density) and use the ASAP Gaussian fitting, and it works very nicely. I do this using code like:

    def gauss_fit_data(x, y, ngauss, outname='none'):
        ....

where x and y are lists containing the data read from the file. I would like to be able to create a scantable with the same data in it using scantable._setspectrum (and scantable._setabcissa, although that doesn't seem to exist?). In particular it would be useful to be able to create/use masks, use the built-in statistics etc., but there doesn't seem to be any facility to create an empty scantable? I know that I can fake some of this by loading in an unrelated dataset, but it's not very elegant (nor portable).
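A toy container showing the facility being asked for. The method names echo the _setspectrum/_setabcissa calls mentioned in the ticket, but the class itself is hypothetical, not part of ASAP:

```python
import math

class ToySpectrum:
    """Minimal stand-in for an 'empty scantable': created empty, then
    filled with an abcissa, a spectrum, and an optional mask, with a
    built-in rms statistic over unmasked channels."""

    def __init__(self):
        self.abcissa = []
        self.spectrum = []
        self.mask = []

    def set_abcissa(self, x):
        self.abcissa = list(x)

    def set_spectrum(self, y):
        self.spectrum = list(y)
        self.mask = [True] * len(self.spectrum)   # all channels unmasked

    def set_mask(self, mask):
        self.mask = list(mask)

    def rms(self):
        vals = [y for y, m in zip(self.spectrum, self.mask) if m]
        return math.sqrt(sum(v * v for v in vals) / len(vals))

s = ToySpectrum()
s.set_abcissa([-2.0, -1.0, 0.0, 1.0])
s.set_spectrum([3.0, 4.0, 0.0, 0.0])
print(s.rms())   # sqrt((9 + 16) / 4) = 2.5
```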
#173 | NH3 line fitting | General | ASAP 4.0 | enhancement | normal | 15 years ago
Hi, following on from ticket:166 it would be nice to have ammonia line fitting in ASAP. I know it is possible to export data to CLASS format, but if I can keep everything within ASAP, so much the better. The old SPC program did have an NH3fit routine which could be ported to ASAP? Cheers, Stacy.
#215 | how to get columns not returned by get_column_names() | General | Unified development | question | normal | 14 years ago
When asap reads in one of my FITS files, what does it do with columns that are not returned by scantable.get_column_names()? More specifically, does it keep the data, and can I get at them somehow? I'm attaching an example SDFITS file.
#217 | How does auto_quotient(mode='time') recognize scans? | General | Unified development | question | normal | 14 years ago
What information in an SDFITS record tells ASAP that a scan is an OFF or an ON? The only thing I see in the column names that might do it is SRCTYPE. If that is what it is, what are acceptable values, and what SDFITS column name would map into that in the scantable?
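How such a mapping could work can be sketched, assuming (purely hypothetically) that SRCTYPE 0 marks an ON record and 1 an OFF record; the actual codes ASAP expects are exactly what this ticket is asking about:

```python
def quotients(records, on_code=0, off_code=1):
    """Pair each ON record with the most recent OFF record and form the
    classic (on - off) / off quotient per channel. The srctype codes are
    assumed for the demo, not taken from any SDFITS convention."""
    last_off = None
    out = []
    for srctype, spectrum in records:
        if srctype == off_code:
            last_off = spectrum
        elif srctype == on_code and last_off is not None:
            out.append([(on - off) / off
                        for on, off in zip(spectrum, last_off)])
    return out

records = [
    (1, [10.0, 10.0]),   # OFF
    (0, [11.0, 12.0]),   # ON
    (1, [20.0, 20.0]),   # OFF
    (0, [22.0, 30.0]),   # ON
]
print(quotients(records))   # [[0.1, 0.2], [0.1, 0.5]]
```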
#255 | RuntimeError: Couldn't convert frequency frame | General | Unified development | defect | normal | 13 years ago
Hello, I am still not getting far with ASAP; after an update this is my return:

    jose@ubuntu:~$ sudo asap_update_data
    [sudo] password for jose:
    Checking if an update is required.
    Data already at latest available version.
    If you still get errors running asap, please report this.
    jose@ubuntu:~$ asap
    Loading ASAP...
    Welcome to ASAP v4.0.0 (2011-10-05) - the ATNF Spectral Analysis Package
    Please report any bugs via:
    http://svn.atnf.csiro.au/trac/asap/simpleticket
    [IMPORTANT: ASAP is 0-based]
    Type commands() to get a list of all available ASAP commands.
    ASAP>s = scantable('2008-03-12_0932-M999.rpf')
    Found ATMOPRA data.
    Auto averaging integrations
    Importing 2008-03-12_0932-M999.rpf...
    ASAP>print s
    RuntimeError                       Traceback (most recent call last)
    /home/jose/<ipython console> in <module>()
    /usr/lib/pymodules/python2.7/asap/scantable.pyc in __str__(self)
    --> 440     Scantable._summary(self, tempFile.name)
    RuntimeError: Couldn't convert frequency frame.
    ASAP>s = scantable('2008-03-12_0932-M999.rpf')
    WARN: Requested data table IERSeop97 cannot be found in the searched directories:
    ./
    ./data/
    /home/jose/aips++/dataephemerides/
    /usr/lib/pymodules/python2.7/asap/dataephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /home/jose/aips++/datageodetic/
    /usr/lib/pymodules/python2.7/asap/datageodetic/
    /usr/local/share/casacore/data/geodetic/
    /usr/local/share/casacore/data/geodetic/
    Cannot read IERS table IERSeop97
    Calculations will proceed with lower precision
    WARN: Requested data table IERSpredict cannot be found in the searched directories:
    ./
    ./data/
    /home/jose/aips++/dataephemerides/
    /usr/lib/pymodules/python2.7/asap/dataephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /home/jose/aips++/datageodetic/
    /usr/lib/pymodules/python2.7/asap/datageodetic/
    /usr/local/share/casacore/data/geodetic/
    /usr/local/share/casacore/data/geodetic/
    Cannot read IERS table IERSpredict
    Calculations will proceed with lower precision
    No requested dUT1 data available from IERS tables.
    Proceeding with probably less precision.
    WARN: Requested data table TAI_UTC cannot be found in the searched directories:
    ./
    ./data/
    /home/jose/aips++/dataephemerides/
    /usr/lib/pymodules/python2.7/asap/dataephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /home/jose/aips++/datageodetic/
    /usr/lib/pymodules/python2.7/asap/datageodetic/
    /usr/local/share/casacore/data/geodetic/
    /usr/local/share/casacore/data/geodetic/
    SEVERE: Cannot read leap second table TAI_UTC
    WARN: Unable to convert between the input and output SpectralCoordinates
    this probably means one is in the REST frame which requires
    the radial velocity - this is not implemented yet
    WARN: Requested data table TAI_UTC cannot be found in the searched directories:
    ./
    ./data/
    /home/jose/aips++/dataephemerides/
    /usr/lib/pymodules/python2.7/asap/dataephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /usr/local/share/casacore/data/ephemerides/
    /home/jose/aips++/datageodetic/
    /usr/lib/pymodules/python2.7/asap/datageodetic/
    /usr/local/share/casacore/data/geodetic/
    /usr/local/share/casacore/data/geodetic/
    SEVERE: Cannot read leap second table TAI_UTC
    WARN: Unable to convert between the input and output SpectralCoordinates
    this probably means one is in the REST frame which requires
    the radial velocity - this is not implemented yet
    Found ATMOPRA data.
    Auto averaging integrations
    Importing 2008-03-12_0932-M999.rpf...

Thanks for any light...

    ASAP>
#280 | possible incorrect sdfits headers | General | Unified development | defect | normal | 12 years ago
I am trying to process Parkes mosaicing data partially through ASAP. The standard route is to use livedata, then gridzilla. But ASAP is preferred partially due to the superior baseline fitter (there are other reasons). I have tried two methods to reduce the data, whose steps are outlined below: Method1:
Method2:
The problem is that the resultant cubes for the two methods are offset in velocity by about 1.18 km/s. I suspect that the ASAP stage is either incorrectly writing header information or writing it in a format that subsequent gridzilla processing does not recognise. I come to this conclusion partly because the output sdfits header from ASAP (Method2) says:

    SPECSYS = 'TOPOCENT' / Doppler reference frame (transformed)

whilst the sdfits output from livedata (Method1) says:

    SPECSYS = 'LSRK ' / Doppler reference frame (transformed)

Note that manually editing the sdfits file from ASAP to read "LSRK" does NOT change the final data cube; it is still offset in velocity. I have attached the following files (in a tar file):

    raw rpfits file (2012-05-13_1506-P817_G335.0-1.0_lon_dfb4.rpf)
    sdfits file made using livedata in step 1 of Method1 (method1.sdfits)
    sdfits file made using livedata in step 1 of Method2 (2012-05-13_1506-P817_G335.0-1.0_lon_dfb4.sdfits)
    sdfits file made using ASAP in step 2 of Method2 (1506.sdfits)
    ASAP script used to produce 1506.sdfits above (red.py)

Later. Andrew xxx
#281 | Several rms values for one spectrum | General | ASAP 4.0 | task | normal | 12 years ago
Hi, I'm reducing Mopra data taken in position-switching mode. After removing the bad scans and performing the quotient calculation, I merged the relevant scantables, time-averaged the whole bunch and finally averaged both polarisations. When I plot the resulting 'scantable', I get one single spectrum (as expected). However, when I use the task 'stats' to get the rms depth, it gives me several different values, corresponding to different times. Is this normal? And if so, how should I interpret these values? Thanks, Flor
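One plausible explanation, offered as an assumption rather than a confirmed diagnosis: a per-row statistics task returns one value per row of the table, so if the averaging left several rows behind (here apparently one per time stamp), you get several rms values rather than a single number. A generic sketch of per-row rms:

```python
import math

def rms_per_row(rows):
    """Return one rms value per row, the way a per-row 'stats' task
    would; a table reduced to a single merged spectrum gives a single
    value."""
    return [math.sqrt(sum(v * v for v in row) / len(row)) for row in rows]

# three rows still present after averaging -> three rms values
print(rms_per_row([[1.0, -1.0], [2.0, -2.0], [0.0, 0.0]]))  # [1.0, 2.0, 0.0]
```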
#285 | error in frequency frame conversion, old data, old script, new install. | General | Unified development | defect | normal | 12 years ago
    Welcome to ASAP v4.0.0 (2011-10-05) - the ATNF Spectral Analysis Package
    ASAP>from asap import *
    ASAP>file='2008-03-12_0932-M999.rpf'
    ASAP>s = scantable(file, average=True)
    Found ATMOPRA data.
    Auto averaging integrations
    Importing 2008-03-12_0932-M999.rpf...
    ASAP>print s
    RuntimeError                       Traceback (most recent call last)
    /Users/erik/<ipython console> in <module>()
    /Library/Python/2.6/site-packages/asap-4.0.0-py2.6.egg/asap/scantable.pyc in __str__(self)
    --> 440     Scantable._summary(self, tempFile.name)
    RuntimeError: Couldn't convert frequency frame.
#291 | alignment error for Tid data | General | Unified development | defect | normal | 12 years ago
The NH3 features of the averaged spectrum after processing don't line up; there is a shift of approximately 50 km/s. It looks like an issue with frequency alignment of data from several epochs. The following is the process before merging:

    a1 = sd.scantable('2011_260_t199/2011-09-17_074510_T199.rpf')      # lupus tid 2011 260 some bad scans, but 1,1 and 2,2 present LupIR
    a2 = sd.scantable('2011_260_t199/2011-09-17_083236_T199.rpf')      # lupus tid 2011 260 nothing LupIR
    a3 = sd.scantable('2011_260_t199/2011-09-17_092613_T199.rpf')      # lupus tid 2011 260 mostly nothing with 1 scan with 2,2 LupIR (0-25) and LupINW (26-42)
    a4 = sd.scantable('2012_087_t199/2012-03-27_194022_T199.rpf')      # lupus tid 2012 087 some 1,1 (LupIR, only one scan)
    a5Scan = sd.scantable('2012_091_t199/2012-03-31_182325_T199.rpf')  # lupus tid 2012 091 noisy but some 1,1 transition LupINW (0-8) LupISE (9-34) LupISW (35-84) LupIIIMMS (85-109) LupIIIS (110-134) LupIIIN (135-159) LupIV (160-182)
    a5 = a5Scan.get_scan(range(0,85))
    a6 = sd.scantable('2012_092_t199/2012-04-01_185240_T199.rpf')      # lupus tid 2012 092 bad start, nothing det LISW
    a7 = sd.scantable('2012_092_t199/2012-04-01_203555_T199.rpf')      # lupus tid 2012 092 1,1 overall noise is bad LupIR (0,1) LupISW (2 scan)
    a8Scan = sd.scantable('2012_099_t199/2012-04-08_182004_T199.rpf')  # lupus tid 2012 099 good noise lvl, few 1,1 and one 2,2 last scan bit dodgy LupIR (0-1) LupISW (2-7) LupIIIMMS (8-21)
    a8 = a8Scan.get_scan(range(0,7))
    a9 = sd.scantable('2012_172_t199/2012-06-20_095048_T199.rpf')      # lupus tid 2012 172 Lupus I NW (0-7) central hyperfine detected 2 scans (LupINW01-02)
    a10 = sd.scantable('2012_172_t199/2012-06-20_101136_T199.rpf')     # lupus tid 2012 172 Lupus I NW (0-17) and SE (18-35), possible central hyperfine detected.
    a11 = sd.scantable('2012_182_t199/2012-06-30_165737_T199.rpf')     # lupus tid 2012 182 no detection covering LupISE (0-6) and SW (7-36)
    a12 = sd.scantable('2012_209_t199/2012-07-27_150854_T199.rpf')     # lupus tid 2012 209 LupI SW only one scan src
    a13 = sd.scantable('2012_209_t199/2012-07-27_151516_T199.rpf')     # lupus tid 2012 209 LupI SW 0-14 nothing
    a14Scan = sd.scantable('2012_209_t199/2012-07-27_154021_T199.rpf') # lupus tid 2012 209 LupI SW 0-19 Lup3MMS 20-40 (scan 39-33 1,1 detection)
    a14 = a14Scan.get_scan(range(0,19))
    a15 = sd.scantable('2012_251_t199/2012-09-07_112832_T199.rpf')     # lupus tid 2012 251 Lupus IR reasonable 1,1 detection
    a16 = sd.scantable('2012_251_t199/2012-09-07_114049_T199.rpf')     # lupus tid 2012 251 Lupus IR (0,1) detection 1,1 and 2,2 and sw
    a17 = sd.scantable('2012_251_t199/2012-09-07_115617_T199.rpf')     # lupus tid 2012 251 Lupus IR

    # the arrays you want to process separately and merge
    arrayScan = [a1,a2,a3,a4,a6,a7,a8,a9,a10,a11,a13,a14,a15,a16,a17]
    qArray = []        # leave this blank for merging the scan tables
    counter = 0        # counter for naming convention
    for i in arrayScan:
        qString = 'q' + str(counter)   # naming convention for the qArray and merged scan tables
        qString = i.auto_quotient()    # Make the quotient spectra
        qString.set_freqframe('LSRK')  # Plot/select in velocity
        qString.set_unit('km/s')
        # Correct for gain/el effects
        qString.recalc_azel()          # Tid does not write the elevation
        qString.gain_el()
        qString.opacity(0.05)
        qString.freq_align()
        qArray.append(qString)         # need this for merging the scanTables
        counter = counter + 1

    totalScan = sd.merge(qArray)       # merging the scan tables
    print(totalScan)
    totalScan.set_restfreqs([23694.4700e6])   # 1,1 detection
    cropScan = totalScan.average_pol()
    msk = cropScan.create_mask([-100,-70],[70,100])
    cropScan.poly_baseline(mask=msk, order=1)
    sd.plotter.plot(cropScan)
    sd.plotter.set_layout(4,4)         # change the format of plotter.plot
    sd.plotter.set_histogram()
    sd.plotter.plot(cropScan)
    cropScan.auto_poly_baseline()
    sd.plotter.plot()
#209 | Change FIT PARAMETER Column from Double to Float | c++ | Unified development | enhancement | normal | 14 years ago
The fitting is done as Float, so no need for conversions. This breaks the Scantable version, so it should be done when we change the schema.