Minutes of Detector Characterization Teleconference (December 10, 1999)
Present:
-
AEI: Chassande-Mottin, Sintes
-
ANU: Charlton
-
Caltech: Majid, Yamamoto, Zweizig
-
Dublin: Ottewill
-
Hanford: Gustafson, Penn, Raab, Savage, Schofield, Sigg, Whitcomb
-
Livingston: Kovalik, Marka, Saulson
-
Louisiana State: Giaime, Johnson, McNeil
-
Michigan: Riles
-
MIT: Daw, Penn, Shoemaker
-
Oregon: Mauceli
-
Penn State: Finn
-
Stanford: Brau
-
UT Brownsville: Romano
Introduction & Overview (KR):
-
The milestone of seeing light at the end of one of the Hanford 2-km arms
is a good indication that the 2-km commissioning is well underway (see
Sigg presentation below).
-
Back in the spring and summer we identified a number of priority-1 software
tasks we felt would be needed to be ready in at least a preliminary form
at the start of the 2-km commissioning, which at that time was expected
to be October.
-
Although the commissioning schedule has slipped a bit since then, it's
clear that the software delivery schedule has slipped even more.
-
The main purpose of this meeting is to see where we stand and decide how
best to focus our efforts in the next several months. Suggestions from
Hanford scientists working in the trenches are invited.
-
An important software issue is the overhead required to install the Data
Monitor Tool (DMT) on various platforms at LSC institutes. Comments from
DMT users and would-be users are invited.
-
Data was taken recently with the Caltech 40-Meter prototype by the Michigan
and Caltech groups. Whether and how to use that data for detector characterization
will be discussed.
Detector / Commissioning Status at Hanford (Daniel Sigg):
-
The laser and mode cleaner are working well. The 2-km stacks and optics
are fully installed, except for one end test mass.
-
The single-arm 2-km cavity has been observed to have flashing fringes,
and work is underway on the control system to permit a single-arm lock.
(Note added: Lock was achieved later in the week.)
-
Next in line is single-arm characterization and an attempt to lock a simple
Michelson.
-
The data acquisition (daq) system is working in all buildings, including
GPS time recording and frame building.
-
Many environmental monitors have been connected. Some subsystems, such
as seismometers and weather monitors, have been commissioned to an advanced
state. Others still need attention. Connecting and verifying the validity
of monitor data is a tedious job, and many channels remain to be addressed.
LSC volunteers to help in this are most welcome.
-
The diagnostics system's stimulus-response engine is now working.
-
The control room has six operating workstations and multiple monitors.
-
The online diagnostics data viewer can be used to view any channel.
-
Most of the DMT hardware is running at Hanford and being installed at Livingston.
Two Sun-450 workstations receive the framed data from the daq system. They
have successfully received data at rates up to 20 MB/s, but they are plagued
by occasional failures.
-
The tape robot to be used for distributing reduced-data samples to the
LSC has arrived, but is not yet installed.
-
Fred remarked that for 1/3 to 1/2 of the time there is now tremendous control
over the beam without even leaving the control room. He also mentioned
that area high school teachers and students have been running and analyzing
data from seismometers to produce periodic reports on quakes and the strength
of the microseismic peak.
-
In regard to ways that the LSC can contribute to the ongoing work, Daniel,
Fred & Stan had the following suggestions:
-
Daniel: Pick certain channels or physics problems and tackle them, i.e.,
visit the site, carry out calibrations, and take a tape home for analysis
(tape production will be available soon)
-
Fred: Identify correlations in lock-killing events
-
Stan: Look for correlations in acoustics and seismic channels with control
signals. He urged scientists to make contact with someone on site in advance
of visiting to ensure that both people and the relevant data channels will
be available. Temporary channel disconnections are common.
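Stan's suggestion amounts to cross-correlating an environmental channel against a control signal over a range of time lags. A minimal numpy sketch (illustrative only; the channel contents and the 7-sample delay are invented for the example, not taken from site data):

```python
import numpy as np

def corr_at_lag(a, b, lag):
    """Correlation of a[t] with b[t - lag]; a and b should be normalized."""
    if lag >= 0:
        x, y = a[lag:], b[: len(b) - lag]
    else:
        x, y = a[: len(a) + lag], b[-lag:]
    return float(np.mean(x * y))

# Fake channels: the "control" signal follows the "seismic" one 7 samples later.
rng = np.random.default_rng(3)
seis = rng.normal(0.0, 1.0, 5000)
ctrl = np.roll(seis, 7) + 0.3 * rng.normal(0.0, 1.0, 5000)

a = (ctrl - ctrl.mean()) / ctrl.std()
b = (seis - seis.mean()) / seis.std()
best_lag = max(range(-20, 21), key=lambda L: corr_at_lag(a, b, L))
peak = corr_at_lag(a, b, best_lag)
```

Scanning the lag axis this way both confirms the correlation and recovers the propagation delay between the channels.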
Data Monitor Tool Status (John Zweizig):
-
John's outline
-
Two DMT machines are running at Hanford (sand.ligo-wa.caltech.edu
and stone.ligo-wa.caltech.edu). The software resides on sand and
is mirrored each night to stone. Anyone wanting an account on these machines
should contact John or Daniel after getting a regular general
computing account (first contact Christine Patton at Hanford).
-
Much development software has been installed along
with the DMT code (root and gcc compiler).
-
Because of the communication reliability troubles
(see above), it is presently hard to get a contiguous sample of data longer
than about 2 hours.
-
Automated startup and monitoring infrastructure is
working. See http://blue.ligo-wa.caltech.edu/
to see updated status reports from the monitors which look for bit errors
in the streaming data. (Off-site browsers may have trouble seeing this
online system web page.)
-
Version 1.0 of the root-compatible DMT version was
released on November 9. Version 1.1 will be ready very soon.
-
A new data monitor environment has been defined,
which allows development of code under root with easy migration to a compiled
background process. See http://www.ligo.caltech.edu/~jzweizig/DMT-BackEnv.html
for more information. A strip chart tool for monitoring band-limited RMS
in some seismic channels has been developed in this environment and works.
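The band-limited RMS quantity that such a strip chart tracks can be sketched as follows (a minimal illustration, not the DMT code; the FFT-mask bandpass is one possible implementation choice):

```python
import numpy as np

def band_limited_rms(x, fs, f_lo, f_hi):
    """RMS of the signal content between f_lo and f_hi (Hz)."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # zero the out-of-band bins
    y = np.fft.irfft(X, n=len(x))              # back to the time domain
    return float(np.sqrt(np.mean(y ** 2)))

# Example: a 10 Hz tone contributes to the 5-15 Hz band but not to 50-100 Hz.
fs = 256.0
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10.0 * t)
in_band = band_limited_rms(x, fs, 5.0, 15.0)
out_band = band_limited_rms(x, fs, 50.0, 100.0)
```

A monitor would evaluate this on successive data blocks and append each value to the strip chart.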
-
Passing data around has been made more efficient
through the use of pointers.
-
More flexibility is permitted for defining and manipulating
time series, FFT's and power spectra.
-
Frame writing does not work in version 1.0 but will
in version 1.1.
-
With help from Sam, filters have been added, using
an abstract base class from which different specific filters can be derived.
Automated design of FIR filters specified by zeroes and poles has been
provided.
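The abstract-base-class pattern described here might look like the following sketch (class and method names are invented for illustration and are not the actual DMT filter API):

```python
import numpy as np
from abc import ABC, abstractmethod

class Filter(ABC):
    """Abstract base class in the spirit of the DMT filter API
    (illustrative names only, not the real interface)."""
    @abstractmethod
    def apply(self, x):
        ...

class FIRFilter(Filter):
    """Concrete filter defined by its tap coefficients."""
    def __init__(self, taps):
        self.taps = np.asarray(taps, dtype=float)
    def apply(self, x):
        return np.convolve(x, self.taps, mode="same")

class MovingAverage(FIRFilter):
    """A specific FIR filter derived from the generic one."""
    def __init__(self, n):
        super().__init__(np.ones(n) / n)

smooth = MovingAverage(4).apply(np.array([0., 0., 4., 0., 0., 0.]))
```

Code that consumes filters only needs the base-class interface, so specific designs (FIR, IIR, Kalman, ...) can be swapped in freely.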
-
New signal processing classes allow for windowing
and heterodyning.
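As an illustration of the heterodyning operation, the sketch below mixes a signal down by a reference frequency and averages, which recovers the complex amplitude of a line at that frequency (a toy example, not the DMT classes; the mean here plays the role of the low-pass stage, and a window could be applied before averaging):

```python
import numpy as np

def heterodyne(x, fs, f0):
    """Mix x down by f0 Hz; the mean of the product estimates the complex
    amplitude of a line at f0 (the factor 2 restores the real amplitude)."""
    t = np.arange(len(x)) / fs
    return 2.0 * np.mean(x * np.exp(-2j * np.pi * f0 * t))

fs = 1024.0
t = np.arange(0, 2, 1 / fs)
x = 3.0 * np.cos(2 * np.pi * 60.0 * t + 0.5)  # 60 Hz line, amplitude 3, phase 0.5
z = heterodyne(x, fs, 60.0)
amp, phase = abs(z), float(np.angle(z))
```

Tracking amp and phase over successive data blocks is a common way to monitor wandering lines such as power-line and violin-mode features.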
-
The latest version of DMT requires the very latest
version of the GNU compiler, which is once again called gcc, the name
it bore before the egcs era. (Note added: John later decided
to "back out" of the gcc requirement so that the last egcs version will
work with DMT version 1.1.)
-
KR brought up the problem of installing DMT/root/egcs
on non-Solaris, non-Linux platforms at LSC institutes and suggested that
the DMT web page provide a clearinghouse for reporting problems and solutions
to installation. It was suggested that volunteers agree to take responsibility
for beta testing code on these other platforms and to provide the information
posted in the clearinghouse. KR agreed to serve that role for hp-ux systems.
-
(Note added later: After further discussions with
users, John agreed to "freeze" the root/compiler choices for periods of
6 months or more so that installing each new release of DMT doesn't automatically
require installation of new root and egcs/gcc versions too. If a bug in
an older version of root seriously interferes with DMT development, however,
John reserves the discretion to upgrade to a newer root version. More guidance
will be provided to the user on which root/compiler versions are needed
for a given DMT release.)
-
Daniel mentioned that he is using root to develop
a graphics package for the online diagnostics.
-
Sam asked whether hooks could be installed in the
DMT for forking/piping information to other graphics packages, such as
xmgr, to avoid the overhead of the root package.
-
John said that not all of root must be loaded. The
shared libraries permit only the graphics routines to be loaded. Providing
a piping option would not be a problem, but the user would be responsible
for defining the output format and reading it with the target program.
It was universally agreed that John doesn't have the time to provide graphics
interfaces for xmgr and the like. Daniel suggested that users could write
the alternate output in the lightweight data format.
Data Set Simulation Status (Sam Finn):
-
The first stage (matlab-based) of the data set simulator
was released, as originally promised, on November 1.
-
Its inputs are interferometer parameters, and its
current output is random noise that includes thermal noise from the substrate,
laser shot noise and radiation noise.
-
Seismic noise and violin-mode artifacts will be available
in the next release. The seismic noise can be modelled, but finite dynamic
range requires whitening, and Sam has just received the zeroes and poles
needed to describe planned LIGO whitening filters. Implementing violin
mode artifacts likewise requires only knowing the correct frequencies and
Q's to insert.
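As an illustration of why whitening helps with dynamic range, the sketch below applies a filter built from first-order zero/pole sections (a toy construction with made-up parameter values, not LIGO's planned whitening filters) to a signal dominated by a large low-frequency, seismic-like component:

```python
import numpy as np

def zpk_filter(x, zeros, poles, gain=1.0):
    """Apply a digital filter given by real z-plane zeros and poles,
    as a cascade of first-order sections."""
    y = np.asarray(x, dtype=float) * gain
    for z in zeros:                  # each zero: y[n] = x[n] - z * x[n-1]
        y = y - z * np.concatenate(([0.0], y[:-1]))
    for p in poles:                  # each pole: y[n] = x[n] + p * y[n-1]
        out = np.empty_like(y)
        acc = 0.0
        for n, v in enumerate(y):
            acc = v + p * acc
            out[n] = acc
        y = out
    return y

# A zero near z = 1 acts like a differentiator, suppressing the huge
# low-frequency component relative to the small high-frequency one.
fs = 256.0
t = np.arange(0, 4, 1 / fs)
x = 10.0 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.sin(2 * np.pi * 50.0 * t)
w = zpk_filter(x, zeros=[0.99], poles=[])
```

After whitening, the two components have comparable amplitudes, so the data fit in a much smaller dynamic range.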
-
The second stage of the simulator will include these
two noise sources, writing of framed data and some non-Gaussian noise.
-
The non-Gaussian noise will be generic spikes that
are Poisson-distributed in time with user-specified average amplitude and
frequency. The spikes also serve as templates for future plug-in transient
modules.
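Generic Poisson-distributed spike noise of the kind described might be generated as in this sketch (parameter names are illustrative; the exponential amplitude distribution is an assumption, since the minutes specify only a user-chosen average amplitude and rate):

```python
import numpy as np

rng = np.random.default_rng(0)

def add_poisson_spikes(noise, fs, rate_hz, mean_amp):
    """Superpose one-sample spikes: Poisson-distributed in time at the given
    mean rate, with exponentially distributed amplitudes of the given mean."""
    out = noise.copy()
    duration = len(noise) / fs
    n_spikes = int(rng.poisson(rate_hz * duration))     # number of events
    times = rng.integers(0, len(noise), size=n_spikes)  # uniform arrival samples
    amps = rng.exponential(mean_amp, size=n_spikes)
    signs = rng.choice([-1.0, 1.0], size=n_spikes)
    np.add.at(out, times, amps * signs)  # accumulate even if times repeat
    return out, n_spikes

fs = 1024.0
gaussian = rng.normal(0.0, 1.0, int(10 * fs))  # 10 s of unit-variance noise
data, n = add_poisson_spikes(gaussian, fs, rate_hz=2.0, mean_amp=20.0)
```

Writing the generated (time, amplitude) pairs to a separate file would provide the "answer key" Fred suggests below.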
-
Once released, the second stage will be turned over
to the End-to-End Model group for incorporation.
-
On a Sparc Ultra 30, the simulator produces data
at five times real time.
-
John offered to use spare DMT cpu power for generating
large data samples.
-
Warren asked whether simulated data is the best way
to test algorithms.
-
Fred pointed out the tremendous virtue in knowing
the right answer when testing code on a stretch of data.
-
KR worried that too much testing on simulated data
would lead to complacency and urged that confronting real data not be postponed
too long.
-
Phil asked whether line noise would be included in
the next simulator release. Sam offered to include it.
-
Fred cautioned that Sam would have an endless job
if he agreed to put in every requested form of noise. He suggested
providing a generic facility that allows a user to provide a short time-series
to describe the shape of a transient. He also suggested that an "answer
key" be written out to a separate file when such transients are generated.
Sam stated that most of the infrastructure for generic transients already
exists in the code.
Performance Characterization (KR):
-
A note was sent out early in the week to all LSC
scientists who had agreed to carry out a priority 1 task in performance
characterization (characterization of stationary or quasi-stationary behavior)
asking for brief status reports. The following are summaries of the summaries
of those who responded.
-
Phil (ANU) reported that the ANU is evaluating a
variety of line-removal techniques.
-
Sam (PSU) reported that the PSU group has started
installing Kalman filters in the DMT and setting up a standard API for
all filters, as discussed above. The Kalman filter will be used to track
frequency lines in the data. He is also working on a similar data conditioning
API with Joe Romano and Susan Scott for the LDAS software. They are trying
to maintain commonality between the LDAS and DMT components. When the Kalman
filter is up and running in the DMT, he will implement an IIR transfer
function between environment and data channels for studying inter-channel
correlations, work that will also involve system identification tools.
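For readers unfamiliar with the technique, a scalar Kalman filter of the sort used to track a drifting line frequency can be sketched as follows (a toy random-walk model with invented numbers, not the PSU implementation):

```python
import numpy as np

def kalman_track(measurements, q, r):
    """Scalar Kalman filter: the state is a random walk with process
    variance q, observed with measurement variance r."""
    x, p = measurements[0], 1.0        # initial estimate and its variance
    out = []
    for z in measurements:
        p = p + q                      # predict: uncertainty grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)           # update with the innovation
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# Example: a line frequency drifting from 60.00 to 60.05 Hz, measured noisily.
rng = np.random.default_rng(1)
true_f = np.linspace(60.00, 60.05, 500)
meas = true_f + rng.normal(0.0, 0.05, 500)
est = kalman_track(meas, q=1e-4, r=0.05 ** 2)
err_raw = float(np.mean(np.abs(meas - true_f)))
err_kf = float(np.mean(np.abs(est - true_f)))
```

The filter follows the slow drift while averaging down the measurement noise, which is exactly what is wanted for tracking wandering lines.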
-
Alicia (AEI) reported that she has a Matlab version
of a line removal program running, which she plans to convert to the DMT.
She has installed root on a silicon graphics workstation at AEI, but she
has not yet been able to install the root version of DMT because she needs
a new compiler to be installed first by her system support people (see
above discussion).
-
Steve (Syr) is at Hanford and hopes to get some DMT
coding done while there. He is beta testing the new DMT code on linux and
trying to get the build procedure to work. He is also working with Gabriela
Gonzalez on analytic methods for tuning suspension controllers.
-
Adrian (UCD) visited Milwaukee last month to work
with Bruce Allen on environmental correlation code, but was handicapped
by inconsistencies between the available versions of root & DMT. He
is now installing the latest code at Dublin and hopes to make substantial
progress early next year.
-
KR (Mich) will be travelling to Hanford in one week
and had hoped to install some operational state software during that visit,
but now-familiar DMT/root/compiler incompatibilities (on Michigan hewlett-packard
workstations) have stalled progress.
-
In summary, some work has been steaming ahead, but
software version/platform incompatibilities have slowed many of us considerably.
Those having trouble getting DMT to work at home should keep in mind
the temporary fallback of logging in remotely to sand.ligo-wa.caltech.edu
to develop code, displaying any needed graphics back home via X windows.
Transient Analysis Status (Fred Raab):
-
Fred briefly summarized what he knows of ongoing
transient analysis activity with invitations to those present to report:
-
Soumya Mohanty(PSU) has seismic monitoring software
running in Matlab. Sam mentioned that Soumya is out of the country at the
moment, but he expects the work to be completed approximately two months
after the DMT is up and running at PSU.
-
KR (Mich) is working on an algorithm for detecting
the onset of servo instability, but has no code ready to release. He will
work on it during the upcoming Hanford visit.
-
Rai Weiss (MIT) volunteered to work on an event catalog.
Fred believes he is working with Rana Adhikari on this.
-
Eric (AEI) is working on impulse recognition and
has code working in Matlab. His DMT integration is held up along with Alicia's
by compiler problems at AEI.
-
The Oregon group is working on detection of wind
gusts, quakes, magnetic field transients, with the magnetic field transients
receiving the highest priority.
-
Sergei Klimenko (UFL) is working on a wavelet analysis
for generic transient analysis.
-
Walid (CIT) will start work in February on detecting
dust in the beam.
-
Fred said he would be requesting status updates from
the various groups for posting on the transient analysis web page.
-
It was mentioned that Rolf Bork has commissioned
at Hanford a Matlab interface on the control room workstations that allows
direct access to the framed data as it emerges from the daq system.
Data Set Reduction (Jim Brau):
-
Jim provided in advance two online documents: a summary
of the goals and plans of the data set reduction group and a first
stab at defining the contents of a standard
reduced data set.
-
The proposed standard RDS had been circulated to
a small group for comment and is now released for wider consideration.
Comments and suggestions for improvements should be sent to Jim.
Specific suggestions on what decimation rates to use for various control
signals are especially welcome. Warren suggested storing higher moments
of seismic data.
-
Every data channel is represented in at least the
form of min, max & rms trends. Most channels are heavily decimated.
The proposed content gives an estimated reduced data rate of 200 kB/s summed
over all three interferometers. This represents a little over 1% of the
original full data, which was the group's goal. Fred suggested implementing
the proposed format and revising its content according to user demand.
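The min/max/rms trend reduction is straightforward to sketch (illustrative only; the actual RDS contents, rates, and formats are defined in Jim's documents):

```python
import numpy as np

def second_trends(x, fs):
    """Reduce full-rate data to per-second (min, max, rms) trend values."""
    n = int(fs)
    nsec = len(x) // n
    blocks = x[: nsec * n].reshape(nsec, n)   # one row per second
    return (blocks.min(axis=1), blocks.max(axis=1),
            np.sqrt(np.mean(blocks ** 2, axis=1)))

def decimate(x, factor):
    """Naive decimation (keep every factor-th sample); a real reduction
    pipeline would low-pass filter first to prevent aliasing."""
    return x[::factor]

fs = 2048.0
x = np.sin(2 * np.pi * 5.0 * np.arange(0, 3, 1 / fs))  # 3 s of a 5 Hz sine
mins, maxs, rmss = second_trends(x, fs)
reduced = decimate(x, 8)
```

Trends shrink a channel by the sample rate (three values per second), while decimation trades bandwidth for a tunable rate reduction.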
-
A discussion ensued over the logistics of producing
both standard reduced data sets and customized reduced data sets for analysis
at LSC institutes. The CDS group plans to provide tape manager software
to run the recently purchased tape robot, allowing non-interfering multiple
tape production. The tentative model involves user programs writing frame
files to a specified buffer directory where the tape manager later looks
for them. An automated tape manager is thought to be the best way to avoid
conflict between the high-priority standard RDS production and lower-priority
customized RDS production.
-
Walid wondered whether logical channels would be
provided for, namely new channels created from manipulation of multiple
IFO data channels. This should be straightforward in the DMT environment.
Daniel pointed out that any information derived from test point data channels
would have to be obtained in the DMT environment, the last stage where
test points can be seen.
Use of recently taken 40-Meter data (KR, Walid
Majid):
-
Through heroic efforts by Dick Gustafson (Michigan)
and valuable support from the Caltech group, data was taken with the Caltech
40-Meter prototype interferometer in two several-day data runs in fall
1999.
-
More than 100 data channels were recorded, and the
interferometer was in a fully recycled, LIGO-like configuration.
-
The first data run was arranged to coincide with
a TAMA 300 data run (TAMA was in a non-recycled configuration) and achieved
a displacement sensitivity of about 10**(-17) m/sqrt(Hz) at 1 kHz.
-
The second run, a few weeks later after more machine
tuning, reached a sensitivity of about 4 * 10**(-18) m/sqrt(Hz). In both
runs, however, the noise was not fundamental in the 100-1000 Hz band, but
rather seemed to be electronic.
-
Although the data is probably not astrophysically
interesting, it does provide a nice testbed for exercising detector characterization
algorithms and some astrophysical search algorithms. In particular, the
recording of so many interferometer and environmental channels allows for
detailed correlation studies.
-
Walid reported on his
initial analysis of the 1st data run (coincidence with TAMA):
-
Roughly 300 GB of data has been stored in the CACR
archive. Only 7 seconds of data was lost in the entire run.
-
About 1% of the data has a bad GPS time stamp, but
that is correctable.
-
The longest single lock stretch was about 1900 seconds.
The in-lock duty factor was typically 85-95%, and the coincidence stretches
of simultaneous 40-Meter and TAMA locks were substantial.
-
A plot of ambient noise versus time shows a general
trend of improvement which resulted from Dick's tweaking of various servo
gains during the run as experience was gained.
-
Walid has catalogued all events with more than five-sigma
prolonged deviation from the mean in the gravitational strain channel.
These come mostly in the first minute or so after lock is acquired. He
is working to correlate these transients with the common mode servo channel
and will also look at correlations with the beam splitter servo. He also
reported an odd set of transients in which the light transmitted through
the end masses dropped to near zero, but lock was not lost.
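A five-sigma excursion catalog of the kind Walid describes can be sketched as follows (illustrative; his actual analysis, thresholding, and clustering details may differ):

```python
import numpy as np

def catalog_events(x, threshold_sigma=5.0):
    """Group contiguous samples deviating more than threshold_sigma standard
    deviations from the mean into (start, end) sample-index events."""
    dev = np.abs(x - np.mean(x)) / np.std(x)
    idx = np.flatnonzero(dev > threshold_sigma)
    if idx.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(idx) > 1)   # gaps split the events
    starts = np.concatenate(([idx[0]], idx[breaks + 1]))
    ends = np.concatenate((idx[breaks], [idx[-1]]))
    return list(zip(starts.tolist(), ends.tolist()))

# Unit Gaussian noise with one injected transient well above five sigma.
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 4096)
x[1000:1005] += 50.0
events = catalog_events(x)
```

The resulting event list can then be compared sample range by sample range against servo channels to look for the correlations mentioned above.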
-
A small group of Caltech & Michigan physicists
will be carrying out a coincidence analysis of the 1st data run, and other
LSC scientists have been invited to participate via a formal analysis proposal
to the LSC. More information can be found in a note
circulated by Rai Weiss in November.
-
If there is interest, KR will assemble a similar
analysis proposal on behalf of the working group to use the recent data
for exercising / testing detector characterization algorithms. (KR will
look at correlations among longitudinal control servo channels and between
orientational and longitudinal servos to better understand some unresolved
controls instabilities.)
-
Interested participants are encouraged to provide
KR with a specific work plan to be carried out as part of the proposed
analysis project. In keeping with the guidelines for LSC analysis proposals
outlined in the LSC Data Analysis White Paper, some detail should be provided
on the scientific problem addressed; the computational and analysis methods
to be used, the logistics to carry out the analysis (resources needed,
participating scientists, schedule) and any publications/documents expected
to arise from the work.
-
Sam expressed interest in looking at the data, but
wants first to see exactly which IFO and environmental channels were recorded.
KR & Dick agreed to compile and circulate a list of channels believed
to be reliable during the two data runs. Fred pointed out that many environmental
channels were commissioned after the experience of the November 1994 data
run.
A.O.B.
-
Subgroups are encouraged to hold at least one more
teleconference before the March LSC meeting.
-
A teleconference devoted to the Data Monitor Tool
software development will be held in late January or in February.
-
Be sure to periodically check the working
group bulletin board for important updates and miscellaneous links
to useful documentation.