Minutes of Detector Characterization Sessions
---------------------------------------------

     (Stanford LSC Meeting, July 19-21, 1999)

[The following minutes are meant to be brief summaries of presentations
 and discussions. They are not meant to be exhaustive and may contain
 errors.  Speakers or others who attended are encouraged to provide
 corrections. The transparencies of all speakers will be posted on the
 LSC home page shortly.]

               MONDAY AFTERNOON SESSION

John Zweizig (Caltech) - Data Monitoring Tool Status
----------------------------------------------------
The Data Monitoring Tool (DMT) software package provides an environment
for running offline monitoring algorithms either in background
(C language) or in foreground (C/C++ language), using CERN's ROOT
program for graphics and command interpretation. The background
and foreground programs can be downloaded from the web:
  http://www.ligo.caltech.edu/~jzweizig/cdist.tar.gz
  http://www.ligo.caltech.edu/~jzweizig/rdist-0.1.tar.gz
The DMT will run on Sun workstations at the sites. The Hanford
workstation (E-450) has been delivered and will be commissioned
shortly.

As part of the brand-new ROOT-based release, John has defined
a large number of classes, including ways to access, store,
manipulate and display data. These include some basic signal
processing tasks, such as windowing, filtering and Fourier
transforms. Example code with graphics output was shown.
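[A minimal illustration of the basic signal-processing chain mentioned
 above: window a time series, Fourier transform it, and estimate a power
 spectrum. It is sketched in Python/NumPy for brevity rather than in the
 DMT's C/C++ environment; the sample rate and the injected 60 Hz line
 are invented for the example.]

  import numpy as np

  fs = 2048.0                       # assumed sample rate (Hz)
  t = np.arange(0, 4.0, 1.0 / fs)   # 4 s of stand-in channel data
  x = np.sin(2 * np.pi * 60.0 * t) + np.random.randn(t.size)

  # Window the data to reduce spectral leakage, then Fourier transform.
  w = np.hanning(x.size)
  X = np.fft.rfft(x * w)
  f = np.fft.rfftfreq(x.size, 1.0 / fs)

  # One-sided power spectral density, corrected for the window power.
  psd = np.abs(X) ** 2 / (fs * np.sum(w ** 2))
  print(f[np.argmax(psd)])          # the 60 Hz line should dominate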

               TUESDAY AFTERNOON SESSION

Keith Riles (Michigan) - Update on Performance Characterization
---------------------------------------------------------------
Performance characterization refers to describing systematically
the stationary and quasi-stationary behavior of the interferometers.
A number of volunteers had committed at Gainesville or in the interim
to completing some high-priority perfchar tasks by fall 1999. Others
had expressed interest in a number of areas. KR spent most of this
presentation recruiting new volunteers and encouraging firm
commitments from those who had already expressed interest.

The large task table assembled as part of the analysis white paper
preparation served as a menu, and many new commitments were indeed made.
(The current list of tasks/priorities/people/institutions can be
viewed at http://www-mhp.physics.lsa.umich.edu/~keithr/lscdc/tasktables.html)
In addition, a number of task definitions were refined, broadened, or
consolidated.

Fred Raab (LHO) - Transient Working Group Status
------------------------------------------------
Transient analysis refers to identifying & characterizing transient
artifacts in the data arising from purely instrumental or environmental causes.
A web page lists expressions of interest in a variety of trananal tasks, but
there has been little detectable activity. A number of urgent transient
signatures and methods were described. Several data samples from the
40-meter and from LHO are available now for testing algorithms, and more
are on the way.

This presentation too used the large task table as a menu to recruit
volunteers. Again, there were several new commitments, and some task
definitions were modified.

David Strom (Oregon) - Report from the Data Set Reduction Working Group
-----------------------------------------------------------------------
This working group is devoted to customized data sets for detector
characterization and to the more general problem of LIGO data reduction
for scientific analysis. The group's efforts are divided into four
broad categories: 1) providing infrastructure for data distribution from
the sites to LSC institutions, both over the internet for small data samples
and via data tapes for larger samples; 2) providing simple working examples of
a variety of tools to look at distributed data; 3) data compression techniques
(lossless or nearly so); and 4) data reduction algorithms that exploit
the nature of the data itself (e.g., filtered decimation, statistical
descriptions). Volunteers to help in this effort are encouraged to come
forward, especially in the area of reduction algorithm development &
implementation.
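[As an illustration of item 4, a sketch of filtered decimation: an
 anti-aliasing low-pass filter followed by downsampling, which reduces
 the data rate while preserving the band below the new Nyquist frequency.
 Python/SciPy is used for brevity; the rates and factor are invented.]

  import numpy as np
  from scipy import signal

  fs = 16384                      # assumed raw sample rate (Hz)
  x = np.random.randn(fs * 10)    # 10 s of stand-in raw data

  # Reduce the rate by 8x: decimate() low-pass filters the data before
  # downsampling, so frequencies below the new Nyquist (1024 Hz) survive.
  y = signal.decimate(x, 8, ftype='fir', zero_phase=True)
  print(x.size, '->', y.size)     # 163840 -> 20480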

               WEDNESDAY MORNING + AFTERNOON SESSIONS

Sam Finn (Penn State) - Simulated Data Sets
-------------------------------------------
This working group is focused in the near term on phenomenological data
set simulation, using parametrized models. Eventually this effort
will be merged into the more ambitious End-to-End Model project
(see Yamamoto talk below). The group plans to provide a first release
of the simulation software in early November, software that allows
real-time generation of time-domain data for immediate analysis or
for storage. Ideally, the software will include certain known instrumental
artifacts (e.g., stationary lines, decaying violin modes, fast transients)
on top of random Gaussian or non-Gaussian background noise. Superposition
of astrophysical signals will be supported. The first (basic) release will
produce data suitable for Matlab analysis. The second release, scheduled for
December, will produce data in the frame format and support more of the
ideal features described above. To be most useful, the simulation should
include the effects of the servo control loops and whitening filters of
the interferometer. Daniel Sigg & Nergis Mavalvala offered to provide
the appropriate parametrizations.
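[A toy sketch of the kind of simulated data stream described above:
 Gaussian background noise plus a stationary line, a decaying violin
 mode and a fast transient. Python/NumPy is used for illustration; all
 frequencies, amplitudes and decay times are invented, and this is not
 the working group's actual code.]

  import numpy as np

  fs = 2048.0
  t = np.arange(0, 16.0, 1.0 / fs)

  data = np.random.randn(t.size)                     # Gaussian background
  data += 0.5 * np.sin(2 * np.pi * 60.0 * t)         # stationary line
  data += 2.0 * np.exp(-t / 5.0) \
              * np.sin(2 * np.pi * 580.0 * t)        # decaying violin mode
  data += 5.0 * np.exp(-(t - 8.0) ** 2 / (2 * 0.01 ** 2)) \
              * np.sin(2 * np.pi * 250.0 * t)        # fast transient at 8 s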

Sam Finn - Thermal Noise from Structurally Damped Systems
---------------------------------------------------------
This was a brief description of an algorithm used to simulate
thermal noise in damped mechanical systems. A physical model
of N parallel spring/dashpot connections to a test mass proves
quite useful in producing compact, causal digital filters for simulating
the effects of sharp mechanical resonances. Comparisons of transfer functions
and power spectra between the ideal analog system and the simulated digital
filters were shown.
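[A minimal sketch of the building block behind such filters: a two-pole
 digital resonator driven by white noise, which produces a sharp
 resonance peak. The actual algorithm assembles N spring/dashpot sections
 to reproduce structural (rather than viscous) damping; the single
 viscously damped section, sample rate and mode parameters below are
 illustrative assumptions, in Python/SciPy.]

  import numpy as np
  from scipy import signal

  fs = 16384.0         # assumed sample rate (Hz)
  f0, Q = 345.0, 1e5   # hypothetical resonance frequency and quality factor

  # Two poles at radius r and angles +/- 2*pi*f0/fs give a compact,
  # causal IIR filter with a sharp peak at f0.
  r = np.exp(-np.pi * f0 / (Q * fs))
  theta = 2 * np.pi * f0 / fs
  b, a = [1.0], [1.0, -2 * r * np.cos(theta), r ** 2]

  drive = np.random.randn(int(10 * fs))   # white-noise thermal drive
  x = signal.lfilter(b, a, drive)         # simulated resonant motion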

Hiro Yamamoto (Caltech) - End-to-End Model Update
-------------------------------------------------
The End-to-End Model is an ambitious project to simulate the interferometers
from first principles, including the effects of optical losses, servo control
electronics, misalignments, ambient noise, etc. Ultimately an entire
interferometer will be simulated in detail in both the frequency and
time domain, but work is proceeding according to the subsystems coming
online first. Simulation of the pre-stabilized laser system and of
the input optics is nearly complete. Work is underway on the suspensions
& seismic modelling. Simulation of complex optics configurations with
servo controls is available now in the multi-mode acquisition code.

Plans for the future include implementing more complex subsystems,
streamlining the code to handle more complex optics and fields,
validating the model against real data, enhancing the user interface,
adopting the ANSI Standard Template Library, implementing thermal
lensing, speeding up the simulation code, and writing documentation.

Robert Schofield (Oregon) - Ambient & Diagnostic Magnetic Fields Measured
                            inside of a BSC Vacuum Chamber at Hanford
--------------------------------------------------------------------------
Knowledge of magnetic fields in the vicinity of the test masses
is important because of the magnets attached to the masses for
actuation. Measurements have been carried out to determine both
the ambient fields in one of the Hanford vacuum chambers and the
transfer function from fields generated immediately outside the chamber
to the inside. Two magnetometers were mounted on a fiberglass rod suspended
in the chamber, and measurements were made along two different lines
in the chamber. Field magnitudes and gradients were determined for
frequencies up to 800 Hz. Effects of the clean room fans were
clearly visible in the data.

From the ambient field and field gradient measurements, a model
(D. Coyne) was used to predict resulting test mass displacement
noise. First indications are that noise is generally below the
LIGO 1 requirement, at least for the chamber measured. The measured
transfer functions are in rough agreement with a simple model
of eddy current damping in the chamber walls.
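[The minutes do not spell out the model; one standard ingredient of such
 eddy-current shielding estimates is the skin depth, quoted here for
 orientation. For a wall of thickness d large compared to the skin depth,
 external fields at angular frequency omega are attenuated roughly as]

  \frac{B_{\rm inside}}{B_{\rm outside}} \sim e^{-d/\delta},
  \qquad
  \delta = \sqrt{\frac{2}{\mu_0\,\sigma\,\omega}}

[where sigma is the conductivity of the wall material and mu_0 the
 vacuum permeability.]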

Daniel Sigg (LHO) - On-Site Seismic Correlation
-----------------------------------------------
Seismometer measurements were taken at two locations at Hanford
(the corner station and one mid-station) over thirty 512-second
intervals. Power spectra, cross-power spectra and coherence were
calculated using a Hanning window. Measurements were taken along three
axes at each location. All
individual power spectra show the micro-seismic peak near 0.2 Hz while
noise at much lower frequencies is attributed to instrument drift.
The noise bottoms out near 2 Hz and is still rising at 10 Hz (limit
of measurements shown). Coherence between orthogonal axes of measurement
is generally weak, even for the same seismometer, but coherence between
parallel axes of the two seismometers is quite strong in the vicinity
of the micro-seismic peak, reaching 80-90%. The phase angle of
the cross spectrum at that frequency (about 60 degrees) is roughly
consistent with the expected speed of seismic waves.
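[A sketch of the analysis described: coherence and cross-spectrum phase
 between two channels, using Hanning-windowed 512-second segments as in
 the measurement. Python/SciPy stands in for the actual tools; the
 sample rate and the stand-in data are invented.]

  import numpy as np
  from scipy import signal

  fs = 256.0                               # assumed sample rate (Hz)
  x = np.random.randn(30 * 512 * int(fs))  # stand-in seismometer channels
  y = 0.5 * x + np.random.randn(x.size)    # (parallel axes, two locations)

  nperseg = int(512 * fs)                  # 512-second segments
  f, Cxy = signal.coherence(x, y, fs=fs, window='hann', nperseg=nperseg)
  f, Pxy = signal.csd(x, y, fs=fs, window='hann', nperseg=nperseg)
  phase = np.angle(Pxy, deg=True)          # cross-spectrum phase (degrees)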

Gabriela Gonzalez (Penn State) - Data Analysis: A Test for Stationarity
-----------------------------------------------------------------------
Techniques for testing and quantifying stationarity have been tried out
on 40-meter data. If the noise is Gaussian and stationary, the amplitude
in successive spectrogram bins follows a Rayleigh distribution, and the
corresponding power follows an exponential distribution whose standard
deviation equals its mean. Dividing the standard deviation by the mean
(both positive definite) should therefore ideally give a ratio of one,
flat in frequency. The 40-meter data come close to this ideal below
about 80 Hz and above about 400 Hz. In the intermediate range, though,
where one expects thermal noise to be important, the standard deviation
is much larger than the mean (except in narrow bands where violin modes
dominate). Most of this behavior turns out to be due to a small number
of rapid transients. Removing the affected time intervals yields a
distribution only slightly higher than ideal and very nearly ideal
at the violin mode frequencies.
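[A compact version of the test, sketched in Python/SciPy with an
 invented sample rate: because the power in each spectrogram bin of
 stationary Gaussian noise is exponentially distributed, the
 per-frequency ratio of standard deviation to mean over time should
 hover near one, and transients push it above one.]

  import numpy as np
  from scipy import signal

  fs = 2048.0
  x = np.random.randn(int(64 * fs))        # stationary Gaussian stand-in

  # Power in each time-frequency bin of the spectrogram.
  f, t, Sxx = signal.spectrogram(x, fs=fs, window='hann', nperseg=2048)

  # Ratio ~ 1 for each frequency if the noise is stationary and Gaussian;
  # excess above 1 flags the non-stationary bands discussed above.
  ratio = Sxx.std(axis=1) / Sxx.mean(axis=1)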

Soumya Mohanty (Penn State) - Non-Parametric Method for Detecting
                              Non-Stationarity
------------------------------------------------------------------
The goal is to create a test for non-stationarity that does not depend
on knowing a priori the right noise distribution. Ideally the test should
detect true transients with good efficiency but should also have a low
false-alarm rate. It is meant for generic bursts where one does not
know the burst waveform, precluding optimized filters. Put simply, the
test used here compares power spectral density estimates computed at
two different times (with normalizations determined by the measured
variances) for various bins in frequency. From these comparisons, a
"carpet plot" (differential form of "ordinary" carpet plot)
is generated as time passes. A true rapid transient is characterized by a
double band in time of large values over some frequency range. The
test has been evaluated on various background noise distributions
and power spectrum shapes, with thresholds set to give a false-alarm rate
of 1/hour. With those settings, one can obtain 80% burst detection efficiency
for various burst types (narrowband Gaussian at different central frequencies
and white Gaussian) with maximum excursions in amplitude of order a
few standard deviations. The algorithm will be converted to run in the
Data Monitoring Tool environment this fall.
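[A much-simplified sketch of the idea, not the actual statistic: compare
 Welch PSD estimates from two stretches of data, bin by bin, normalized
 by a rough estimate of their statistical spread. Python/SciPy; the
 sample rate, stretch length and normalization are illustrative
 assumptions.]

  import numpy as np
  from scipy import signal

  fs = 2048.0
  seg = int(4 * fs)                 # compare 4-second stretches

  def compare(x1, x2):
      f, p1 = signal.welch(x1, fs=fs, window='hann', nperseg=int(fs))
      f, p2 = signal.welch(x2, fs=fs, window='hann', nperseg=int(fs))
      navg = seg // int(fs)         # crude count of averaged periodograms
      sigma = (p1 + p2) / np.sqrt(navg)   # rough spread of the estimates
      return f, (p1 - p2) / sigma   # large |values| flag non-stationarity

  x = np.random.randn(2 * seg)
  f, d = compare(x[:seg], x[seg:])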

Soma Mukherjee (Penn State) - Simultaneous Dynamical Tracking & Removal
                              of Multiple Violin Modes
------------------------------------------------------------------------
The goal is to track closely the noise in narrow violin mode resonances
using a Kalman filter technique to allow accurate subtraction of
the known lines from the data. The Kalman filter uses a dynamical model
of a viscously damped simple harmonic oscillator driven by white noise
and applies minimum mean-square estimation to derive a best estimate
of the underlying oscillation strength (and associated "mode
temperature"). The algorithm has been tested on 40-meter data from 1994,
tracking about 25 violin modes simultaneously; subtracting the derived
mode signals reduced the total variance in the 565-610 Hz band
enormously. This algorithm too will be converted to run in the DMT
environment this fall.
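[A minimal Kalman tracker for a single mode, to make the ingredients
 concrete; it is not the Penn State implementation. The state is the
 (position, velocity) of a viscously damped oscillator driven by white
 noise, and the measurement is the position buried in broadband noise.
 The sample rate, mode parameters and noise levels are invented.
 Python/NumPy/SciPy.]

  import numpy as np
  from scipy.linalg import expm

  fs, dt = 2048.0, 1.0 / 2048.0
  f0, tau = 580.0, 5.0            # hypothetical mode frequency, decay time
  w0 = 2 * np.pi * f0

  # Continuous dynamics x'' + (2/tau) x' + w0^2 x = white noise;
  # expm gives the exact discrete-time transition matrix.
  A = np.array([[0.0, 1.0], [-w0 ** 2, -2.0 / tau]])
  F = expm(A * dt)
  Q = np.diag([0.0, 1e-4])        # process (drive) noise, assumed
  H = np.array([[1.0, 0.0]])      # we observe position plus noise
  R = np.array([[1.0]])           # measurement noise variance, assumed

  def track(y):
      """Minimum mean-square estimate of the mode signal in y."""
      s, P = np.zeros(2), np.eye(2)
      out = np.empty(y.size)
      for i, yi in enumerate(y):
          s, P = F @ s, F @ P @ F.T + Q                 # predict
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
          s = s + K @ (yi - H @ s)                      # update
          P = (np.eye(2) - K @ H) @ P
          out[i] = s[0]
      return out                  # subtract from y to remove the mode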

Bernard Whiting (Florida/ANU) - Progress in Noise Characterization
------------------------------------------------------------------
The Florida and Australia groups have investigated using spectral
uniformity as a means of characterizing stationarity of noise.
Histograms of the magnitude, and the real and imaginary parts of
FFT coefficients at different frequencies in 40-meter data show
clear deviations from Gaussian behavior for frequency bins affected
by power line noise (60 Hz & harmonics). It was demonstrated that
mere numerical inaccuracy (e.g., digitization noise) or a mismatch
between a line frequency and the frequency binning can produce
misleading effects in power spectra, and that frequency aliasing is a
prevalent hazard.
In general, it was shown that one must be careful in interpreting
spectral features and be aware of artifacts.
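[An illustration of the histogram test: FFT many data segments and
 histogram the real and imaginary parts of the coefficient in a single
 frequency bin. For stationary Gaussian noise both parts are Gaussian
 (and the magnitude is Rayleigh), while a line at that frequency distorts
 the histograms. Python/NumPy, with invented rates and segment counts.]

  import numpy as np

  fs, nfft, nseg = 2048.0, 2048, 512
  x = np.random.randn(nseg * nfft)             # stand-in detector data

  # FFT each windowed segment; pick the bin nearest 60 Hz (1 Hz bins).
  segs = x.reshape(nseg, nfft) * np.hanning(nfft)
  coeffs = np.fft.rfft(segs, axis=1)[:, 60]

  hist_re, edges = np.histogram(coeffs.real, bins=50)
  hist_im, _ = np.histogram(coeffs.imag, bins=edges)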

Sergei Klimenko (Florida) - Input Optics Simulation Status
----------------------------------------------------------
The goal is to build an End-to-End module for simulating the
input optics that includes the mode cleaner and Faraday isolator
and that interfaces to the pre-stabilized laser and core optics
simulations. The simulation includes control servos and effects of noise.
Work is far enough along to verify that explicit time domain modelling
gives results consistent with frequency-domain modelling.
The optics part of the IO simulation is complete. Work is underway
on the mode cleaner length control servo, integration with the
PSL system and implementing the mode cleaner wave front sensing servo.
Completion of the input optics simulation is expected in early September.

Neven Simicevic (Louisiana Tech) - New Seismic Measurements at Livingston
-------------------------------------------------------------------------
Seismometers more sensitive than the nominal LIGO sensors have been tested
at Livingston, pushing the low-frequency limits of sensitivity down
by an order of magnitude. Four sensors were placed on
site, one at the corner station, one at one end station and two at or
near the other end station. Time series were taken over several days.
Aside from giving a more precise measure of ambient noise at the site,
the measurements revealed daily disturbances from trains passing to the
south, disturbances that can be distinguished clearly from background for
durations of about 15 minutes.

Discussion on data tape technology
----------------------------------
Various data tape technologies were discussed, using a web page table
of options (created by Stuart Anderson) as background. Given concerns
about cost and reliability of other technologies, it was decided that
the Sony AIT-2 8-mm choice was most desirable. One can obtain single
tape drives for about $3k. The technology allows data transfer rates
up to 6 MB/s, and a single tape holds 50 GB. One significant drawback,
however, is that each tape costs about $100.
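[For scale, simple arithmetic on the figures above:]

  media cost:  $100 / 50 GB          =  $2 per GB
  write time:  50,000 MB / (6 MB/s)  =  ~8,300 s  =  ~2.3 hours per tape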

Discussion on data tape mounting at sites
-----------------------------------------
Various off-site LSC groups will want data from the IFO sites from
time to time to carry out detector characterization tasks. Although
the infrastructure is being put in place to produce such tapes,
recording data on them and shipping them could be a significant
manpower drain at Hanford and Livingston. The Lab is reluctant to
commit to that task and has suggested that LSC members provide such
manpower as part of participating in the project.

Some comments made during discussion:

 * It's too expensive for individual groups to send a person to a site
   merely to record data on a tape.
 * It's quite reasonable to expect the LSC to help detector operations
   in this way as part of helping "run shifts" at the sites. In the long
   term one expects/hopes several LSC members to be on site at any given
   time.
 * Data tape recording requests should be made via informal proposals
   to the Lab, perhaps by the detector characterization w.g. collectively.
 * Data is not stored long at the sites. It will not be possible to request
   full data samples very long after the fact.
 * Without someone on site to verify channels are connected properly and
   sensors are operating, data taken at a random time during commissioning
   is likely to be worthless.
 * It will be desirable to arrange in advance for short data taking periods
   in the coming months from which many LSC groups can benefit, rather than
   forcing each group to make its own arrangements.

Conclusion:

 No real decision was made, but for the short term (next 6 months or so),
 there will probably be two modes for getting data on tape:
 * A group sends someone to the site not only to take care of the
   data recording, but also to verify the data's integrity.
 * In coordination with commissioning, we collectively arrange for occasional
   data recording periods and the shipping of tapes to interested groups.

In either mode, a participating group must provide its own raw tapes
for recording. It should be kept in mind that small data transfers are also
possible over the internet.

Discussion on analysis white paper
----------------------------------
Some suggestions for improvement:
 * Add a data flow diagram and explanatory introductory text.
 * Prune the sections on astrophysical sources.
 * Clarify the role of non-Lab LSC groups who participate in
   commissioning / operations. A number of groups would like to
   help out at the sites but do not want to violate NSF rules on
   double-funding installation/commissioning.
These suggestions have been forwarded to Rai Weiss and the
other editors of the white paper.