IPAC 2MASS Working Group Meeting #96 Minutes 6/11/96

Attendees: R. Beck, T. Chester, R. Cutri, T. Evans, J. Fowler, T. Jarrett, D. Kirkpatrick, B. Light, H. McCallon, J. White


  1. PCP Disk Requirements
  2. Maintaining Products From Different Processing Dates/Times
  3. Extended Source CDR
  4. Effects of Varying Matched Filter and Smoothing Kernel on DAOFind


  1. PCP Disk Requirements -- J. White reported that PCP development has progressed to the point of designing the disk allocation layout for the processing of each scan. Each PCP (Pipeline Control Processor) processes a single scan at a time, and each CPU on each production machine (two machines, four CPUs each) will run one PCP during the corresponding phase of the 2MAPPS execution. In order to meet the goal of reducing I/O contention between PCPs, the current design for each production machine calls for three input volumes (one per band), four scratch volumes (one per CPU/PCP), and two output volumes, hence nine disk volumes per machine. This is a change from the previous requirement for eight disk volumes per machine; R. Cutri will inform R. Scholey of the increased requirement. If CPUs are added to a production machine with the intention of running more PCPs, then one additional disk volume per additional CPU will also be required.
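The volume budget described above is simple arithmetic, sketched below for clarity; the function name and parameters are illustrative, not part of any 2MAPPS code.

```python
# Hypothetical sketch of the per-machine disk-volume budget described in
# item 1: one input volume per band, one scratch volume per CPU/PCP, plus
# a fixed number of output volumes. Names are illustrative only.
def volumes_per_machine(n_cpus, n_bands=3, n_output=2):
    """Total disk volumes required on one production machine."""
    return n_bands + n_cpus + n_output

# Current production configuration: four CPUs per machine.
print(volumes_per_machine(4))  # 3 input + 4 scratch + 2 output = 9

# Adding a CPU (and hence a PCP) adds one scratch volume.
print(volumes_per_machine(5))  # 10
```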

    R. Beck reported that for testing in the near term, one karloff disk volume could be used as a ninth lugosi volume (assuming PCP testing on lugosi), but the karloff disk should be used as an input volume, since reads across the karloff-lugosi connection are significantly faster than writes.

  2. Maintaining Products From Different Processing Dates/Times -- 2MAPPS subsystem cognizant engineers have recently been encouraged to include the processing date and time in the output headers of the files their subsystems generate (see the minutes to meeting no. 93). T. Evans pointed out that the current database design contains no provision for storing multiple output file sets for the same scan processed on different dates or times, and that adding this capability would complicate the indexing needed for the rapid access that is the goal of the current design. Simply storing products in a hierarchical scheme in which processing date is one level of the hierarchy does not fit the design contemplated; the different processing dates/times would somehow have to be woven into the indexing scheme, and no good way of doing that is apparent. Some team members felt that storing these different output sets in a readily accessible fashion should be considered highly desirable. T. Evans will study this matter further.
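One way to see the indexing complication is a minimal sketch in which output-file sets are keyed by both scan and processing time; every name here is hypothetical and purely illustrative of the design issue, not of the actual database.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sketch: once the same scan can be processed on several
# dates, each lookup needs a second key (the processing time) in
# addition to the scan identifier. That extra level is what complicates
# the one-file-set-per-scan indexing the current design assumes.
class ProductIndex:
    def __init__(self):
        # scan_id -> {processing_time: file_set}
        self._by_scan = defaultdict(dict)

    def add(self, scan_id, proc_time, file_set):
        self._by_scan[scan_id][proc_time] = file_set

    def latest(self, scan_id):
        """Fast path: most queries want only the newest processing."""
        runs = self._by_scan[scan_id]
        return runs[max(runs)]

    def at(self, scan_id, proc_time):
        """Retrieving an older run requires knowing the second key."""
        return self._by_scan[scan_id][proc_time]

idx = ProductIndex()
idx.add("scan001", datetime(1996, 6, 1), "run_a")
idx.add("scan001", datetime(1996, 6, 10), "run_b")
print(idx.latest("scan001"))                      # run_b
print(idx.at("scan001", datetime(1996, 6, 1)))    # run_a
```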

  3. Extended Source CDR -- R. Cutri reported that the Extended Source Critical Design Review will be held Wednesday, June 26, and that a revised agenda has been circulated. T. Chester touched on some of the issues that will be raised. These included a presentation showing that the "super coadd" (as known in galaxy processing circles, i.e., multi-band, as opposed to multiple coverages in a single band) is not required to meet project requirements, would be costly in terms of extra CPU time, and suffers from fundamental theoretical problems. A run-through of the presentations will be held on Friday, June 21.

  4. Effects of Varying Matched Filter and Smoothing Kernel on DAOFind -- R. Cutri presented the results of some analysis he and S. Wheelock have been doing to study the effects on DAOFind of varying the FWHM of the matched filter and the smoothing kernel used in producing the coadded image employed for aperture photometry. The M67 field was used in the study, which evaluated completeness and reliability as functions of the parameters varied. The dependence on the matched filter FWHM was weak in the vicinity of the correct value, especially for slightly larger values. The dependence on the H parameter of the smoothing kernel was more involved: a value of 0.3 yielded "peakier" point sources, so that more detections resulted, while a value of 0.4 yielded fewer detections but with higher reliability, so that overall completeness did not suffer significantly. The lower reliability of the 0.3 value was compensated for by rejection downstream, so the bottom line was that any reasonable settings produced more or less the same final result. A suggestion that H = 0.35 be used was passed by acclamation.
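The "peakier" behavior reported above can be illustrated with a one-dimensional toy convolution; this is not the actual DAOFind or coaddition code, and the Gaussian widths are arbitrary stand-ins for the H = 0.3 versus H = 0.4 kernels.

```python
import math

# Illustrative 1-D sketch (not DAOFind itself): convolve a unit point
# source with Gaussian smoothing kernels of two widths. The narrower
# kernel leaves the source "peakier", so a fixed detection threshold
# finds more sources, mirroring the H = 0.3 vs. 0.4 behavior above.
def gaussian_kernel(sigma, half_width=5):
    """Normalized Gaussian kernel sampled at integer offsets."""
    k = [math.exp(-0.5 * (x / sigma) ** 2)
         for x in range(-half_width, half_width + 1)]
    total = sum(k)
    return [v / total for v in k]

def smooth(signal, kernel):
    """Direct convolution, zero-padded at the edges."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += w * signal[idx]
        out.append(acc)
    return out

spike = [0.0] * 21
spike[10] = 1.0                    # unit point source
narrow = smooth(spike, gaussian_kernel(1.0))
wide = smooth(spike, gaussian_kernel(2.0))
print(max(narrow) > max(wide))     # True: less smoothing keeps the peak higher
```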