source: trunk/docs/hints.tex @ 285

Last change on this file since 285 was 285, checked in by Matthew Whiting, 17 years ago

\secA{Notes and hints on the use of \duchamp}
\label{sec-notes}

In using \duchamp, the user has to make a number of decisions about
the way the program runs. This section is designed to give the user
some idea of what to choose.

The main choice is whether to alter the cube to try to enhance the
detectability of objects, by either smoothing or reconstructing via
the \atrous method. The main benefits of both methods are the marked
reduction in the noise level, leading to regularly-shaped detections,
and good reliability for faint sources.

The main drawback of the \atrous method is its long execution time:
reconstructing a $170\times160\times1024$ (\hipass) cube often
requires three iterations and takes about 20--25 minutes to run
completely. Note that this is for the more complete three-dimensional
reconstruction: using \texttt{reconDim=1} makes the reconstruction
quicker (the full program then takes less than 5 minutes), but the
reconstruction still dominates the execution time.
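
As an illustration, a minimal input parameter file enabling the
quicker one-dimensional reconstruction might look as follows. The
file name is a placeholder and the cutoff value is only a starting
point -- check the parameter names and defaults against the parameter
descriptions elsewhere in this guide:

\begin{verbatim}
imageFile   cube.fits   # input FITS cube (placeholder name)
flagATrous  true        # perform the a trous reconstruction
reconDim    1           # 1D (spectral) reconstruction: much quicker
snrRecon    4.          # S/N cutoff for wavelet coefficients
\end{verbatim}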

The smoothing procedure is computationally simpler, and thus quicker,
than the reconstruction. The spectral Hanning method adds only a very
small overhead to the execution, and the spatial Gaussian method,
while taking longer, completes (for the above example) in less than
2 minutes. Note that these times will depend on the size of the
filter/kernel used: a larger filter means more calculations.

The searching part of the procedure is much quicker: searching an
un-reconstructed cube takes less than a minute. The ability to read
in a previously-saved reconstructed array also makes running the
reconstruction more than once a far more feasible prospect.
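
For instance, a reconstruction can be saved on the first run and
re-used on later runs, assuming the version in use supports the
output and re-use parameters shown here (the file name is purely
illustrative):

\begin{verbatim}
# First run: perform and save the reconstruction
flagATrous       true
flagOutputRecon  true

# Subsequent runs: read the saved array instead of recomputing
flagReconExists  true
reconFile        cube.RECON.fits
\end{verbatim}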

On the positive side, the shape of the detections in a cube that has
been reconstructed or smoothed will be much more regular and smooth --
the ragged edges that objects in the raw cube possess are smoothed by
the removal of most of the noise. This enables better determination of
the shapes and characteristics of objects.

A further point to consider when using the reconstruction is that the
two-dimensional reconstruction (\texttt{reconDim=2}) can be
susceptible to edge effects. If the valid area in the cube (\ie the
part that is not BLANK) has non-rectangular edges, the convolution
can produce artefacts in the reconstruction that mimic the edges and
can lead (depending on the selection threshold) to some spurious
sources. Caution is advised with such data: check the reconstructed
cube carefully for the presence of such artefacts. Note, however,
that the 1- and 3-dimensional reconstructions are \emph{not}
susceptible in the same way, since the spectral direction does not
generally exhibit these BLANK edges, and so we recommend the use of
either of these.

If one chooses the reconstruction method, a further decision is
required on the signal-to-noise cutoff used in determining acceptable
wavelet coefficients. A larger value will remove more noise from the
cube, at the expense of losing fainter sources, while a smaller value
will include more noise, which may produce spurious detections, but
will be more sensitive to faint sources. Values of less than about
$3\sigma$ tend not to reduce the noise a great deal and can lead to
many spurious sources (this depends, of course, on the cube itself).
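
In the parameter file this cutoff is a single value; something in the
range $3$--$4\sigma$ is often a reasonable starting point (the value
shown is illustrative only):

\begin{verbatim}
snrRecon  4.   # wavelet coefficients below this S/N are discarded
\end{verbatim}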

The smoothing options have fewer parameters to consider: essentially
just the size of the smoothing function or kernel. Spectrally
smoothing with a Hanning filter of width 3 (the smallest possible) is
very efficient at removing spurious one-channel objects that may
result purely from statistical fluctuations of the noise. Larger
widths, or kernels of larger size, can be used to look for features
of a particular scale in the cube.
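
By way of example, the two smoothing modes might be configured as
follows. The kernel sizes are illustrative, and the exact parameter
names and units should be checked against the parameter descriptions
in this guide:

\begin{verbatim}
# Spectral (Hanning) smoothing:
flagSmooth    true
smoothType    spectral
hanningWidth  3         # width 3 is the smallest possible

# Spatial (Gaussian) smoothing:
#flagSmooth   true
#smoothType   spatial
#kernMaj      5.        # major-axis size of the kernel
#kernMin      -1        # negative value => set equal to kernMaj
\end{verbatim}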

When it comes to searching, the FDR method produces more reliable
results than simple sigma-clipping, particularly in the absence of
reconstruction. However, it does not work in exactly the way one
would expect for a given value of \texttt{alpha}. For instance,
setting fairly liberal values of \texttt{alpha} (say, 0.1) will often
lead to a much smaller fraction of false detections (\ie much less
than 10\%). This is the effect of the merging algorithms, which
combine the sources after the detection stage and reject detections
not meeting the minimum pixel or channel requirements. It is thus
better to aim for larger \texttt{alpha} values than those derived
from a straight conversion of the desired false detection rate.
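
A sketch of the relevant parameters, bearing in mind that the
effective false-detection fraction will usually end up smaller than
the nominal value (the name \texttt{alphaFDR} and the value shown are
assumptions to be checked against the parameter descriptions):

\begin{verbatim}
flagFDR   true
alphaFDR  0.1   # nominal rate; merging makes the real rate lower
\end{verbatim}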

Finally, as \duchamp is still undergoing development, there are some
elements that are not fully developed. In particular, it is not as
clever as I would like at avoiding interference. The ability to place
requirements on the minimum number of channels and pixels partially
circumvents this problem, but work is being done to make \duchamp
smarter at rejecting signals that are clearly (to a human eye at
least) interference. See the following section for further
improvements that are planned.
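
As a sketch, such requirements are set via the minimum-pixel and
minimum-channel parameters. The values here are illustrative;
requiring more than one channel rejects most single-channel
interference spikes:

\begin{verbatim}
minPix       5   # minimum number of spatial pixels per detection
minChannels  3   # minimum number of channels per detection
\end{verbatim}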