Minutes of Detector Characterization Sessions
LSC Meeting - LIGO Hanford Observatory (August 15-17, 2000)
Introduction (KR - Michigan) (see also opening and closing plenary session transparencies)
-
With close to 20 presentations over the next three days, speakers were
urged to keep to their assigned times, and questioners were warned that
discussion might be cut short at times. The plenitude of talks is a welcome
sign of the large amount of ongoing detector characterization activity.
-
Points to discuss during the sessions:
-
Unfilled Data Monitor Tool (DMT) tasks
-
Better integration of DC group into on-site science, including participation
in upcoming engineering runs (round-table discussion)
-
Infrastructure (hardware & software) for future off-site DC analysis
(round-table discussion)
Data Monitoring Tool Status (John Zweizig - LIGO-CIT)
-
John began with a summary of data acquisition (DAQ)
and DMT performance during the April engineering run (2-km single arm).
The IFO was in lock most of the time with 42 locked sections of average
length 1667 seconds. Unlocked sections averaged only 37 seconds.
-
174 channels were selected and written to disk,
giving 50 GB of data, all of which is available online at LHO and at CACR.
A timing error in initialization caused 18 frames of data to be lost, and
1.5 hours were lost when the fortress computer hung. Neither problem is
expected to recur in the next run, and monitors have been written
to look for both in real time.
-
Although a glitch prevented testing of trigger generation
during the run, the test was carried out afterward in a real-time replay
mode and was successful. One monitor tested looked for discontinuities
in the data stream and saturation, and another recorded power in resonance
peaks into trend frames. The interface to the meta-database (GUILD
- P. Shawhan) worked very well, but could be further improved with more
flexible trigger analysis for verification.
-
A new production version (1.2) of the DMT was released
July 14 with extensive changes (see /export/home/dmt/pro/Changes on
the sand machine). These included various bug fixes and the following new
functionality in the infrastructure:
-
Reworked trigger generation and a trigger manager
-
More APIs and utilities for background monitors
-
More efficient operation
-
Extension of many classes
-
In addition, software contributed by LSC members
was incorporated (multitaper line tracker - A. Ottewill, operational state
conditions - KR (see talk below), time-frequency plots - S. Mohanty &
S. Siddiqui)
-
A new patch release (1.2.1) is in preparation with
fixes to new bugs and additional functionality, including a script for
monitoring the data distribution. The patch will also include support for
wavelet tools and a line remover (S. Klimenko - see talk below) and another
line remover (A. Sintes - see talk below).
Report from the Data Set Reduction Working Group (David Strom - Oregon)
-
A quick review was given of the data reduction milestones decided upon
at the last LSC meeting:
-
A data distribution template is in draft form. P. Shawhan's clearing
house web page gives information on accessing and understanding
the data from the April engineering run. D. Strom has provided a web
page of links to more information on data access and includes a sample
program for looking at data using root and the DMT.
-
A prototype decimation routine (E. Mauceli) for the DMT has been written
and used in a tiltmeter study (see talk below). This code needs further
development to allow arbitrary bandwidth and should allow for a wavelet
compression option.
-
Benchmarks for data reduction have been proposed (S. Klimenko): information
loss is quantified with a chi-square method in the time domain, and CPU
efficiency is quantified by comparing any proposed new algorithm against two
existing ones (differentiation with gzip, and the frame library algorithm).
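As an illustration of a time-domain chi-square-style loss measure, the sketch below normalizes the mean squared reconstruction error by the variance of the original series. The normalization is an assumption for illustration, not necessarily the definition used in the proposed benchmark.

```python
# Hypothetical information-loss figure of merit for a data reduction:
# mean squared reconstruction error divided by the variance of the
# original time series (assumed normalization, for illustration only).

def information_loss(original, reconstructed):
    n = len(original)
    m = sum(original) / n
    variance = sum((v - m) ** 2 for v in original) / n
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / n
    return mse / variance

x = [0.0, 1.5, -2.25, 3.125, -1.0, 0.5]
print(information_loss(x, x))                      # 0.0 for lossless reduction
print(information_loss(x, [round(v) for v in x]))  # small loss from rounding
```

A lossless algorithm scores exactly zero; the score grows with the severity of the reduction, which makes such a measure usable for comparing algorithms at equal compression ratios.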
-
Wavelet compression (lossy and lossless) has been tested on the April engineering
data (see Klimenko talk below).
-
Linear predictor code is under development (N. Zotov).
-
The outline of a straw man reduced data set selection was presented at
the last LSC meeting, leading to about 200 kB/s per interferometer. No
feedback was received on the strawman, but the channels selected for the
engineering run in April amounted to about 680 kB/s with no decimation.
-
Reduced data samples will be useful in future engineering runs, and an
effort will be mounted to prepare for the next one this fall at Hanford.
Discussion ensued on what should be attempted during or after that run.
It's not clear that manpower is available to complete the necessary software
changes for real-time data reduction. D. Sigg pointed out that even using
the decimation-by-two algorithm (FIR filter) already installed in the DMT
creates non-trivial synchronization problems for channels decimated by
different factors. Ultimately it is planned to make available a reduced
data set with which DMT software developers can test their algorithms
to see what if any degradation is caused by the reduction.
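The decimation step itself is conceptually simple; the synchronization issue arises because each decimation stage delays its channel by the filter's group delay. A minimal sketch with an illustrative three-tap low-pass (not the FIR coefficients actually installed in the DMT):

```python
# Decimation-by-two: low-pass filter (anti-aliasing), then keep every
# other sample. TAPS are illustrative; channels decimated by different
# factors acquire different filter delays and must be re-aligned.

TAPS = [0.25, 0.5, 0.25]  # crude half-band low-pass; real designs use many taps

def fir_filter(x, taps):
    pad = [0.0] * (len(taps) - 1)          # zero history before the data
    xp = pad + list(x)
    return [sum(t * xp[i + k] for k, t in enumerate(reversed(taps)))
            for i in range(len(x))]

def decimate_by_two(x):
    return fir_filter(x, TAPS)[::2]

print(decimate_by_two([1.0] * 8))  # constant signal preserved after start-up
```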
Note added by KR: Further discussions are underway to make sure these
tests can be carried out soon. Benoit Mours of VIRGO who is on sabbatical
at Caltech this year has offered to help out. Technical details in data
transfer to tape and decisions on what reduction/compression should be
done on LDAS computers must be confronted in the long term. For the next
engineering run, some data will be written to fortress (as in April) for
recording to tape, and data compression will probably not be attempted
in real-time (instead carried out afterward with a dedicated program).
Status of Data Set Simulation (Sam Finn - Penn State)
-
The data set simulation program using phenomenological parametrization
(written in Matlab) is essentially complete. It is available on a Penn
State web site and has also been given to the End-To-End modelling group
for incorporation. Some of that incorporation is complete.
-
Additional functionality in this final version includes violin modes, better
thermal noise modelling, hooks for astrophysical sources (prototype for
inspirals included), and a streamlined frame reader and writer. Version
4 frames are produced. The parametrization used is the same as in Sam's
Bench program.
-
On a 5-year-old SGI workstation, the program can generate simulated data
at a rate 10 times real-time (with all bells & whistles enabled).
-
F. Raab would like the program to be available to IFO operators and urged
that someone from Penn State arrange for the required knowledge to be imparted.
-
D. Sigg mentioned that incompatibilities might arise in frame files since
some of the code running on site might not recognize simulated data. It
was pointed out that we should be careful to mark in the frames that they
represent simulated not real data! One scheme is to use negative run numbers;
another is to use the formal simdata structure supported by the frame library,
but here incompatibility with data-reading software is more likely.
Status of DMT Software Tasks (KR - Michigan)
-
Although the DMT infrastructure is in great shape (see Zweizig talk above),
there are several problems that need to be addressed:
-
LSC-contributed software is arriving in various states, most not amenable
to immediate use by operators or commissioners.
-
Other DMT code that is in use at one or both sites is not integrated into
the DMT process manager environment.
-
Some priority 2 tasks have no volunteers yet.
-
Some promised software shows no sign of imminent delivery.
-
Most DMT software developers must keep in mind two somewhat conflicting
goals: to provide flexible tools for general use and specific monitors
useful now to non-C++-conversant operators and commissioners. To date providing
tools has received by far the most attention.
-
To help developers make code more useful, templates are needed for background
monitors. Two categories of such monitors are needed: performance characterization,
where summary info is written periodically to trend frame files (perhaps along
with other info), and transient analysis, where aperiodic triggers
are generated for logging, setting EPICS alarms, and writing meta-database
entries. Both types of monitor should also serve up data on request to
the DMT display manager (now under development by D. Sigg & J. Zweizig).
-
KR will work with John to make a transient analysis template based on servo
instability detection code (see talk below). A volunteer was requested
to help develop a useful performance characterization template. Peter Shawhan
and Sergey Klimenko expressed interest later in helping out on this.
-
DMT software developers should model deliverables after one of these
templates (or at least provide the same functionality).
-
LHO operator Rick Graff has agreed to work with software developers to
test new code and to transmit information to other LHO operators.
-
Brief status summaries were given for eighteen priority 1 and 2 DMT tasks
- see transparencies.
Round-table Discussion of LSC Participation in Science at the Sites
and Upcoming Engineering Runs
-
KR had raised the issue in the opening plenary session (see transparencies)
of the need for increasing participation by non-Lab LSC members in the
science at the Hanford and Livingston sites. In short, too many members
are too detached from site operations to make effective contributions to
detector characterization. For example, contributed DMT software sits unused
because it requires writing new programs to use tools or simply because
no effort has been made to educate the operators or IFO commissioners (see
preceding talk too). In addition, upcoming engineering runs at
LHO and LLO will require scientific monitoring shifts to be manned.
-
B. Barish prefaced the discussion by restating the Lab's willingness to
help groups with travel support while they await word from NSF on travel
supplement requests or increased-baseline proposals. He stressed, however,
that only trips of at least two weeks' duration would be considered for Lab
support. He has spoken with Joe Dehmer of NSF, who says he recognizes the
need for such travel support and believes it will not be a problem. Barry
also mentioned the Lab policy that Lab scientists are expected to spend
an average of one week out of four at one of the sites and are encouraged
to pick one site to maximize efficiency. It was suggested that physicists
responding to the need for more LSC presence at the sites plan to spend
an average of at least one week out of six at a site in order not to qualify
as a "tourist".
-
S. Whitcomb suggested that the LIGO PAC be told explicitly which LSC groups
are making the most useful contributions at the sites, information to be
used by the PAC in evaluating NSF proposals. Barry remarked that the PAC
has begun to take a more active role in prioritizing such proposals and
that the NSF does listen.
-
Representatives of several groups stated their plans for stationing physicists
on site:
-
Penn State I (Sam Finn) - Willing to station a postdoc on site one week
out of six. Have requested additional travel funds from NSF to start in
August 2001. Would have to request Lab support to start sooner.
-
Penn State II (Gabriela Gonzalez) - Planning to send a graduate student
to a site for two weeks twice in each semester.
-
Oregon (Jim Brau) - Postdoc Robert Schofield is resident at Hanford, along
with graduate student Masahiro Ito. Departing postdoc Evan Mauceli has
been making regular visits to Hanford, and his replacement will too. Faculty
and other students make occasional visits.
-
Florida (Bernard Whiting) - Planning to send a member to Hanford for an
average of one week out of eight (Sergey Klimenko & Bob Goldberg).
(This is in addition to Florida physicists working on input optics commissioning).
It was suggested that, rather than alternating between Sergey and Bob,
only one person make regular visits in order to be more efficient (or
that the frequency be increased for both).
-
Syracuse (Peter Saulson) - Peter is resident at Livingston this year on
sabbatical. Willing to ratchet up the presence of postdoc Steve Penn who
now makes occasional visits.
-
LSU (Joe Giaime) - Postdoc Ed Daw is resident at Livingston. With frequent
visits by Joe and students, at least another half-FTE is on site. It is
hoped to station an additional postdoc there half time in the future.
-
Michigan (KR) - Research scientist Dick Gustafson is resident at Hanford.
Graduate student Dave Chin will be making regular visits of at least two
weeks duration roughly three times per semester. KR will increase the frequency
of his visits.
-
Details of the upcoming engineering run haven't been worked out, but the
most likely time for the Hanford 2-km run will be the 2nd or 3rd week of
November. A "signup sheet" will be posted on the Web showing the commitments
by LSC groups to man scientific monitoring shifts during each upcoming
run. LSC groups are urged to "vote with their feet" to demonstrate commitment
to making LIGO successful. Members of the ASIS working group are also strongly
urged to join in this effort.
Compression of LIGO Data with Wavelets (Sergey Klimenko - Florida)
-
Wavelet compression has been developed and tested on the April engineering
data, using both lossless and lossy techniques. It was suggested that Level
1 data (first cut at reduction) may be appropriate for lossless or "quasi-lossless"
reduction, while Level 2 and 3 data would require lossy techniques.
-
Wavelets allow de-correlation and decomposition transformations that give
a compact representation of the data. Some of the same infrastructure can
be used for lossless and lossy compression. A lossless Random Data Compression
(RDC) algorithm has been developed that exploits the near Gaussianity of
LIGO data. A transformation makes the data more Gaussian and then an encoder
optimized for Gaussian noise is applied. The bit mapping for the 16-bit
sampled LIGO data words was outlined with diagrams and examples. The RDC
achieves more compression of white Gaussian noise than the gzip program
and runs about five times faster.
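The transform-then-encode idea can be demonstrated with everyday tools: here simple differencing stands in for the Gaussianizing transform, and zlib stands in for the Gaussian-optimized encoder. This is an analogy to the RDC approach, not its actual algorithm.

```python
# Sketch of transform-then-encode compression: a decorrelating transform
# (simple differencing) followed by a general-purpose encoder (zlib)
# shrinks correlated data better than encoding the raw samples directly.
import random, struct, zlib

random.seed(1)
walk, v = [], 0
for _ in range(4096):
    v += random.randint(-3, 3)   # strongly correlated random walk
    walk.append(v)

def pack(samples):
    # 16-bit little-endian words, like the sampled data words above
    return struct.pack("<%dh" % len(samples), *samples)

raw = zlib.compress(pack(walk))
diff = zlib.compress(pack([walk[0]] + [b - a for a, b in zip(walk, walk[1:])]))
print(len(raw), len(diff))       # the differenced stream compresses better
```

On strongly correlated data the decorrelating transform concentrates the information into a narrow symbol distribution, which is what the entropy coder then exploits.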
-
A table of lossless compression ratios achieved with various IFO and environmental
channels recorded in the April engineering run was shown with comparisons
among gzip, ERI (commercial package) and the RDC for time series data and
for various forms of transformed data. RDC consistently outcompressed gzip
and usually did better than ERI. For 16 kHz channels, the average RDC compression
ratio was about 2.2 and for 2 kHz data was about 1.9.
-
In carrying out lossy compression, the technique chosen was frequency-dependent
reduction of dynamic range in the wavelet domain. To avoid introducing
non-random noise artifacts, the algorithm is based on a simple truncation
of an integer ratio.
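A toy version of this reduction: one Haar averaging/difference split, with the detail (high-frequency) coefficients truncated to an integer multiple of a step. The transform, step size, and single decomposition level are illustrative simplifications of the wavelet-domain scheme.

```python
# Toy lossy reduction: Haar split, then truncate the detail coefficients
# to an integer multiple of a step (frequency-dependent in the real
# scheme; a single fixed step here, for illustration only).

def haar_forward(x):
    avg = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    det = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for s, d in zip(avg, det):
        out += [s + d, s - d]
    return out

def truncate(coeffs, step):
    return [step * int(c / step) for c in coeffs]  # integer-ratio truncation

x = [10.0, 11.0, 9.0, 8.5, 12.0, 12.5, 7.0, 6.0]
avg, det = haar_forward(x)
y = haar_inverse(avg, truncate(det, 1.0))   # keep averages, truncate details
print(y)
print(max(abs(a - b) for a, b in zip(x, y)))  # error bounded by the step
```

Because truncation acts per coefficient, the reconstruction error stays bounded by the step size.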
-
Tables and plots were shown of achieved compressions for the GW channel
vs the loss as defined by the chi-square figure of merit. A compression
ratio of 3 can be achieved for an information loss of 0.1%. Comparisons
were also shown with the MPEG3 algorithm used for audio compression. Plots
of spectra before and after lossy compression show well-preserved spectral
shapes even for compression ratios approaching five, except near the Nyquist
frequency for 16 kHz channels.
Detecting the Micro-Seismic Peak with Tiltmeters (David Strom - Oregon)
-
This study (based on the April engineering run data) was carried out to
investigate whether tiltmeters could be used as effective accelerometers
in a feed-forward control servo to stabilize test masses against micro-seismic
motion at about 0.15 Hz (suggested by R. Schofield). Tiltmeters also seem
more stable against noise at lower frequencies. They can be used as accelerometers
because a tilt by theta produces an apparent horizontal acceleration of g*theta.
-
The main component seen in the one-arm test error signal is tidal drift.
For the moment, the drift has been fitted with a polynomial and removed.
This still leaves oscillations with a period of about 500 seconds, perhaps
due to the temperature cycle of the air handling systems (needs further
investigation). The power spectrum of the residual error signal shows a
very strong micro-seismic peak.
-
The tiltmeter signal also shows long-term drift, thought to be from thermal
effects, perhaps differential expansion of the legs (needs further investigation).
-
One expects a correlation between the length control signal and the difference
in tiltmeter signals at the LVEA and the mid station. To make the comparison,
the tiltmeter signals were subjected to a simple bandpass filter in the
0.1 to 1.0 Hz range, and a pair of coefficients was determined from a chi-square
fit of the error signal to a sum of terms depending on both (filtered)
tiltmeter signals. The coefficients found were -0.42 +- 0.01 and +0.23
+- 0.01 times seconds**2 times g.
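The fit amounts to solving 2x2 normal equations for the two coefficients. A sketch on synthetic data (the generating coefficients are chosen to echo the quoted values purely for illustration):

```python
# Two-coefficient least-squares fit: model the error signal as
# c1*t1 + c2*t2 (two filtered tiltmeter signals) and solve the
# 2x2 normal equations directly. Data here are synthetic.

def fit_two(t1, t2, e):
    s11 = sum(a * a for a in t1)
    s22 = sum(b * b for b in t2)
    s12 = sum(a * b for a, b in zip(t1, t2))
    r1 = sum(a * y for a, y in zip(t1, e))
    r2 = sum(b * y for b, y in zip(t2, e))
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

t1 = [1.0, 2.0, -1.0, 0.5, 3.0]
t2 = [0.5, -1.0, 2.0, 1.0, -0.5]
e = [-0.42 * a + 0.23 * b for a, b in zip(t1, t2)]  # exact linear mix
c1, c2 = fit_two(t1, t2, e)
print(round(c1, 6), round(c2, 6))   # recovers -0.42 and 0.23
```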
-
Applying the coefficients allows a reduction in total rms of the error
signal by 4% (after tidal correction) and in the 0.1-0.3 Hz range by about
30%. Similar results are found from using twice-differentiated seismometer
signals with nearly identical coefficients obtained. The seismometer signals
provide a 35% reduction in rms in the 0.1-0.3 Hz range.
-
The same technique was also applied at much lower frequencies to see if
the 500-second oscillations might be real tilts. One can reduce the power at
those low frequencies, but it's not clear for this 1-day data sample whether
the correlations are accidental or secondary. D. Sigg pointed out that
during the April run the microseismic activity was much lower than is typical,
suggesting that the tiltmeter correction will typically have a larger effect
on total rms.
-
Future plans include applying better filters, studying long-term tiltmeter
behavior and better tidal effect corrections (working with F. Raab).
Offline Analysis Tools for Detector Characterization (Peter Shawhan - LIGO-CIT)
-
Online tools (Data Viewer, Diagnostic Test Tool, Data Monitor Tool) are
in good shape, but we will also want to carry out analysis offline for
1) detailed transient investigation, 2) analysis of large data samples,
and 3) comparison of different analysis techniques on same data. This talk
addressed data access & viewing, environment for automated analysis,
summary data object standards, and viewing / manipulation of summary data
objects.
-
For offline analysis the Guild program allows display of meta-database
information and writing of raw data to disk using the frameAPI. The Xlook
program allows viewing of the lightweight data files. But what is lacking
is display of raw data, display of trend data, and access by programs.
-
Peter proposed making the Data Viewer and DTT available for offline use,
setting up a trend server for off-site clients, a data flow manager (an
intermediary between programs and LDAS, along with a network data server
for transplanted online programs), an index of information services, and
a LAL package for parsing lightweight LIGO data.
-
For manipulating data summary objects (e.g., power spectra), objects would
be stored in lightweight format in individual files with indices for random
access. Matlab would be used to display and manipulate the objects.
-
An offline environment is needed for automated analysis (repetitive tasks).
Candidates considered so far include C/C++, Matlab, Triana, and PSE/SCIRun:
-
C/C++ provides flexibility and allows (in principle) linking to LAL and
DMT libraries, but signal processing functions must be written from scratch.
One possibility is to link to the Matlab math library. In any case, one
would want to establish a standard i/o paradigm and standard main programs.
-
Matlab has many advantages (maturity and widespread use, many built-in
functions, nice graphics, good documentation, extensibility with compiled
C/C++ code and access to primitive datatypes & operations). Disadvantages
include cost ($1200+ for academic license), slowness for interpreted scripts,
and difficulty in parallelizing. Adapting to LIGO needs would require creating
tools and example scripts. Simulink might be a good model to follow. One
would probably want to link in the LAL and/or DMT libraries.
-
Triana (developed by GEO in java) has a nice graphical interface, customized
GW analysis signal processing tools, a natural pipeline analysis structure
and can be parallelized. Disadvantages include unwieldy primitive operations,
a limited graphics module, and the relative unfamiliarity & slowness
of java.
-
PSE / SCIRun (developed at University of Utah for biophysics and 3-D problem
solving/visualization) is written in C++ with a Tcl/Tk graphical interface.
Its advantages include that it's free, multi-threading for multi-processor
machines, speed & good memory management, and extensibility in familiar
programming languages. Disadvantages include no built-in signal processing,
no conventional 2-D visualization and little formal documentation. Much
work would be required to make it usable.
-
Peter's recommendation is to focus on C++ for batch processing with link
access to the DMT and LAL libraries (and perhaps to Matlab libraries) and
to focus on Matlab for scripted interactive analysis and for viewing /
manipulation of summary data objects.
-
A key point is the need to start building tools and expertise for offline
analysis with widespread sharing. Peter offered to set up a clearinghouse
web site for offline analysis tools, documentation, and usage notes. He
will also make a list of tasks needed to provide the ideal analysis environment.
Collaborators are most welcome.
Getting "Old" Data for Diagnostic Archives (Daniel Sigg - LIGO-Hanford)
-
This talk and the previous one served as introductions to a round-table
discussion on off-site detector characterization analysis. Daniel began
with sobering numbers on data access rates.
-
Reading frames from disk, one can read all channels at 5 times real-time
and 1-10 channels at 50-100 times real-time. Reading from tape, one contends
with a 2-minute delay, data rates of 10-20 MB/s, giving a maximum of four
times real-time (if one is reading a tape with all channels recorded).
-
In reading data from archived frames at CACR, a request for 1 week of 1
channel requires 2 days of reading and 2 days for T1 internet transfer.
1 week of 1000 channels requires the same time to read, but 1 year to transfer!
This assumes data is stored frame by frame, as produced at the sites.
-
If the data is stored instead as striped frames (separate files for separate
channels), then the total read/transfer time is proportional to the number
of channels requested, and 8-bit 2 kHz channels can be read 16 times faster
than 16-bit 16 kHz channels. For example, one week of a slow channel that
has been compressed by a factor of two can be read in 1-2 minutes and transferred
over the internet in 3 hours. Thus striping helps in both read-time and
transfer, but the transfer times are still long.
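For orientation, a back-of-the-envelope version of one of these numbers; the link rate and channel format are assumptions (a T1 at roughly 1.5 Mbit/s, one 16-bit 16 kHz channel), so the result is indicative only, and real transfers carry extra protocol overhead.

```python
# Rough read/transfer time arithmetic. All rates are assumptions for
# illustration; actual figures depend on link, protocol, and overheads.

WEEK = 7 * 24 * 3600                      # seconds in one week

def days_to_move(total_bytes, bytes_per_second):
    return total_bytes / bytes_per_second / 86400

one_fast_channel = 2 * 16384 * WEEK       # one 16-bit, 16 kHz channel, 1 week
t1_rate = 1.5e6 / 8                       # assumed T1 link, in bytes/s

print(round(days_to_move(one_fast_channel, t1_rate), 1))  # about a day
```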
-
Four user models for detector characterization analysis were presented
for consideration:
-
Model 0: Work with online data only
-
Model 1: Work with reduced data sets at home institutes and contend
with substantial delays
-
Model 2: Work on computers near archive (physically or remotely)
-
Model 3: Work online over the internet
-
Model 2 seems the most useful and versatile by far, but it requires access
to general-purpose computing with high-bandwidth access to the data archive(s).
Round-table Discussion of Off-site Detector Characterization Analysis
-
Peter's and Daniel's presentations raised many issues that hadn't received
much attention, given the focus to date on online analysis, leading to
much discussion that wandered at times.
-
Here is an over-simplified and abbreviated summary of salient points:
-
The issues raised are important and need to be addressed sooner than we
had thought.
-
Data will disappear quickly at the sites (fortress can store of order 1
day of full data), and the local LDAS tape archive can store about a week.
-
Stuart Anderson stated that LDAS plans to provide several general-purpose
workstations near the CACR archive that permit five simultaneous users
with data access rates of 10 MB/s. This may well be sufficient for our
needs. These computers are separate from the "sandbox" linux machines for
LDAS analysis development.
-
LDAS will keep three copies of the full data: one at CACR in the original
frame-by-frame format with all channels, one at CACR in channel-striped
frames, and one at another physical location for safekeeping.
-
Matlab has served people well for interactive and even batch offline analysis.
There are many tools already developed for GW analysis which could be immediately
useful. Peter was strongly encouraged to create his proposed clearinghouse
web page for Matlab tools and other offline analysis tools. He was also
encouraged to compile a list of tasks needed to create a useful offline
analysis environment that could run on the LDAS workstations at CACR
(and perhaps elsewhere), based on C/C++ programs and Matlab.
PEM Audit at LIGO Livingston (Joe Kovalik - LIGO-LLO)
-
Several UT Brownsville undergraduates (recruited by Joe Romano) spent the
summer at Livingston systematically "auditing" the Physical Environment
Monitoring (PEM) channels. The purpose of the audit was to verify channel
operation from PEM sensor to data acquisition and to catalog the "typical"
behaviors of the channels.
-
Verification includes checking the sensor for reasonably correct response
to controlled disturbances (for the time being, calibrations provided by
sensor manufacturers have been assumed to be correct), checking cabling,
correct channel labelling and the response of the associated A/D.
-
For every data channel, a record is stored of a sample time series, power
spectrum of appropriate bandwidth, a long time series where appropriate,
a histogram of time series amplitudes (with and without filtering), and
probability density plots for the histograms. In addition, records were
made for different periods (night/day, quiet/noisy). Input from the DC
group on other quantities to compile is encouraged.
-
Roughly 100 channels of non-IFO data have been catalogued. These have various
sampling rates and can be used for vetoes or for feed-forward servo control.
-
Examples of cataloguing info were shown for tiltmeters and accelerometers.
The tiltmeters were first run through a "huddle" test with 4 instruments
in the same location to estimate systematic noise and test mounting techniques.
One instrument each was then placed in the corner station, end stations
and on the surface in a "stay clear zone". The tiltmeters showed very good
correlation in the huddle test, with some variation in DC offset with instrument
and support (styrofoam or granite, styrofoam giving a bit less noise).
-
For the accelerometers, sample real-time data viewer output, power spectra,
amplitude histogram and derived probability density were shown. R. Weiss
urged that the compiled information be placed on the Web.
-
Joe remarked that this has been an extremely effective outreach program
for UTB students from which both they and LIGO greatly benefited.
Monitoring the LIGO Environment (Evan Mauceli - Oregon)
-
A new DMT monitor class has been created that allows detection of a variety
of environmental transients and is easily extensible. So far derived classes
have been written to look for seismic and magnetic field transients.
-
A sample calling sequence and a sample class definition were shown.
For the time being, thresholds and other parameters are hardwired into
the class definition. Evan (or his successor at Oregon) was urged to provide
run-time configuration files to allow easy tuning of monitors during development.
-
The basis of each transient detection is a certain number of channels exceeding
a threshold absolute deviation from the nominal mean by some number of
sigma. Generated triggers contain the start/stop times for a disturbance,
an amplitude figure of merit and current channel statistical measures.
-
The statistical measures for each channel used to define baseline conditions
are based on the last 10 intervals of 30 seconds where no trigger may occur
during that time.
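A stripped-down sketch of this thresholding scheme, with a rolling window of recent samples standing in for the ten 30-second intervals (window sizes and the trigger-free requirement are simplified away):

```python
# Flag a sample when it deviates from a rolling baseline mean by more
# than n_sigma standard deviations. The baseline here is the previous
# `window` samples; the real monitor uses ten trigger-free 30 s blocks.
from statistics import mean, stdev

def detect(samples, window=10, n_sigma=5.0):
    triggers = []
    for i in range(window, len(samples)):
        base = samples[i - window:i]
        m, s = mean(base), stdev(base)
        if s > 0 and abs(samples[i] - m) > n_sigma * s:
            triggers.append(i)
    return triggers

quiet = [0.1, -0.2, 0.0, 0.15, -0.1, 0.05, -0.05, 0.1, 0.0, -0.15]
data = quiet + [0.1, 8.0, 0.0]   # a large excursion at index 11
print(detect(data))              # [11]
```

The quoted scheme additionally excludes triggered intervals from the baseline, which prevents a loud transient from inflating its own detection threshold.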
-
The seismic monitor was run for six days in late July and detected all
three earthquakes reported by the Pacific Northwest seismic network.
Three additional triggers were generated of unknown origin, one of which
is suspected to come from some cryogenic tank filling. Time series were
shown of computed averages and standard deviations for five seismometer
channels over the test run, with large quakes clearly visible.
-
One problem with the algorithm for determining the threshold was seen
during a large Russian earthquake, in which multiple triggers were generated
for the single quake.
-
Future improvements include triggered threshold adjustments, sending of
trigger info to a web page for display, and band-limiting filters.
-
A. Lazzarini suggested Evan contact Rohay of Battelle to share seismic
data. LIGO seismometers add significantly to the local seismic network.
Wavelet Analysis / Line Removal (Sergey Klimenko - Florida)
-
Wavelet tools have been integrated into the next DMT release (see Zweizig
talk above) in a package called the Wavelet Analysis Tool (WAT). The wavelet
compression algorithms described above use these tools, which include several
different "families" of wavelets with associated methods. root macro files
are also available for plotting wavelet information.
-
The families include Daubechies and Lifting wavelets (which allow direct
mapping of integers to integers). Wavelet coefficients can be computed
according to an ordinary or binary wavelet tree.
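The integer-to-integer property of the lifting construction can be shown with the simplest case, an integer Haar (S-transform) step; the WAT families are more elaborate, but reconstruct exactly in the same way.

```python
# Integer Haar lifting step: maps integers to integers, with exact
# reconstruction (the property of the Lifting wavelets noted above).

def forward(x):
    s = [(a + b) >> 1 for a, b in zip(x[::2], x[1::2])]  # floor of average
    d = [a - b for a, b in zip(x[::2], x[1::2])]         # difference
    return s, d

def inverse(s, d):
    out = []
    for avg, diff in zip(s, d):
        b = avg - (diff >> 1)   # undoes the floor-average exactly
        out += [b + diff, b]
    return out

x = [7, 3, -4, 10, 0, 0, 255, -256]
s, d = forward(x)
print(inverse(s, d) == x)       # True: perfect integer reconstruction
```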
-
Examples were shown of wavelet transformations on white Gaussian noise
data with a linear chirp superposed. The linear chirp is quite visible
in the wavelet time-frequency plot. Examples were also shown for the single-arm
control signal in the April engineering run. The wavelet tools provide
a means for detection, identification and perhaps removal of a broad range
of transients.
-
A line filter class has also been written for the DMT that allows finding
and removing specified harmonics of a given line. The information is stored
in a linked list and a number of convenient methods provided for accessing
and applying the information to clean data.
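The harmonic structure can be illustrated by projecting the data onto sine/cosine pairs at each harmonic of a known fundamental and subtracting; this fixed-amplitude fit is a simplification of the Quasi-Monochromatic algorithm, which tracks slowly varying amplitude and phase.

```python
# Remove a line and its harmonics by least-squares projection onto
# sin/cos at each harmonic of a known fundamental (illustrative only).
import math

def remove_harmonics(x, f0, fs, n_harmonics):
    n = len(x)
    y = list(x)
    for k in range(1, n_harmonics + 1):
        w = 2 * math.pi * k * f0 / fs
        c = [math.cos(w * i) for i in range(n)]
        s = [math.sin(w * i) for i in range(n)]
        # amplitudes by projection (exact when n spans whole cycles)
        a_cos = 2 * sum(a * b for a, b in zip(y, c)) / n
        a_sin = 2 * sum(a * b for a, b in zip(y, s)) / n
        y = [a - a_cos * b - a_sin * d for a, b, d in zip(y, c, s)]
    return y

fs, f0, n = 256, 8.0, 256
line = [math.sin(2 * math.pi * f0 * i / fs) +
        0.3 * math.cos(2 * math.pi * 2 * f0 * i / fs) for i in range(n)]
clean = remove_harmonics(line, f0, fs, 2)
print(max(abs(v) for v in clean) < 1e-9)   # True: line and harmonic removed
```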
-
The Quasi-Monochromatic line removal algorithm developed at Florida is
used at the moment, but the class has been designed with other line tracking/removal
algorithms in mind and can be extended in a straightforward way. Sergey
has offered to work with Adrian Ottewill and Alicia Sintes (see talk below)
on integrating their algorithms into the same framework.
-
Documentation for the Wavelet Analysis tool can be found at http://www.phys.ufl.edu/LIGO/wavelet/index.html
and documentation on the line removal software at http://www.phys.ufl.edu/~klimenko.
-
Future plans include adding families of Gaussian and symmetric Daubechies
wavelets and developing wavelet-based transient detection.
Transient Identification in Engineering Data (Julien Sylvestre - LIGO-MIT)
-
Julien began with a detailed explanation of the algorithm used to detect
transients and then gave examples of transients found in the April engineering
run data. The software (program called tid) is routinely run at both IFO
sites using the DMT library. Operators have been given a tutorial
on its use.
-
The technique is based on applying an adaptive threshold to spectrograms.
The algorithm is fast and robust against stationary non-Gaussian noise, colored
noise and strong transients. The statistics of transient detection can
be controlled in a known way by the choice of time-frequency resolution.
-
Actual detection is based on looking for significant "clusters" of pixels
in a time-frequency display where a pixel is turned on if the power in
a certain frequency band exceeds an adjustable, frequency-dependent threshold.
Application of the algorithm to the April 1-arm control signal shows a
false-alarm rate much higher than expected for Gaussian noise for nominal
parameters.
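-
A minimal sketch of the adaptive-threshold idea follows. The specifics
(Hann-windowed non-overlapping segments, a median-based per-frequency
threshold, 4-connected clustering, the threshold factor) are assumptions
made for illustration, not details of tid itself:

```python
import numpy as np
from collections import deque

def spectrogram(x, fs, nfft=256):
    """Power spectrogram from non-overlapping Hann-windowed segments;
    rows are time, columns are frequency."""
    win = np.hanning(nfft)
    nseg = len(x) // nfft
    segs = x[:nseg * nfft].reshape(nseg, nfft) * win
    return np.abs(np.fft.rfft(segs, axis=1)) ** 2

def on_pixels(P, factor=8.0):
    """Turn a pixel on where power exceeds a frequency-dependent
    threshold, here estimated from the median of each frequency bin
    (robust against strong transients contaminating the estimate)."""
    return P > factor * np.median(P, axis=0)

def clusters(mask, min_size=3):
    """4-connected clusters of on-pixels above a minimum size (BFS)."""
    seen = np.zeros_like(mask, dtype=bool)
    out, (nt, nf) = [], mask.shape
    for i in range(nt):
        for j in range(nf):
            if mask[i, j] and not seen[i, j]:
                q, cl = deque([(i, j)]), []
                seen[i, j] = True
                while q:
                    a, b = q.popleft()
                    cl.append((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = a + da, b + db
                        if 0 <= u < nt and 0 <= v < nf and mask[u, v] \
                                and not seen[u, v]:
                            seen[u, v] = True
                            q.append((u, v))
                if len(cl) >= min_size:
                    out.append(cl)
    return out

# White noise with one short loud transient at 200 Hz
fs = 1024.0
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, int(fs * 16))
x[8000:8300] += 5 * np.sin(2 * np.pi * 200 * np.arange(300) / fs)
found = clusters(on_pixels(spectrogram(x, fs)))
```

The injected transient shows up as a significant cluster near segment
31 and the 200 Hz bin (bin 50 at 4 Hz resolution).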
-
This transient detection has been augmented by searches for coincidences
between the GW and PEM channels. A "coincidence gate" allows adjustment
of false alarm rates. The gate is based on a metric-like measure of closeness
of two clusters where the "coordinates" represent numbers of pixels, mean
occurrence time, mean frequency, etc. An example was shown of computing
a confidence interval for the GW "foreground" rate from measured GW foreground,
GW background and seismic foreground.
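-
The metric-like closeness measure might be sketched as below; the choice
of feature coordinates and the scale factors are illustrative assumptions,
not the actual tid definitions:

```python
import numpy as np

def cluster_features(pixels):
    """Summarize a cluster of (time, frequency) pixels by simple
    'coordinates': pixel count, mean occurrence time, mean frequency."""
    t = np.array([p[0] for p in pixels], float)
    f = np.array([p[1] for p in pixels], float)
    return np.array([len(pixels), t.mean(), f.mean()])

def cluster_distance(a, b, scales=(10.0, 1.0, 5.0)):
    """Metric-like closeness of two clusters: Euclidean distance of the
    feature vectors, each coordinate normalized by a chosen scale."""
    d = (cluster_features(a) - cluster_features(b)) / np.array(scales)
    return float(np.sqrt((d ** 2).sum()))

def coincident(a, b, gate=2.0):
    """Coincidence gate: accept a pair if its distance is below 'gate';
    tightening the gate lowers the false alarm rate."""
    return cluster_distance(a, b) < gate

gw = [(100, 20), (100, 21), (101, 20)]        # cluster in the GW channel
seismic = [(100, 22), (101, 22), (101, 23)]   # nearby cluster in a PEM channel
far = [(400, 90), (401, 90), (401, 91)]       # unrelated cluster
```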
-
The tid program has been used to detect and identify overhead aircraft,
which show the characteristic Doppler shift of a monochromatic source (at
~80 Hz) during approach and recession. An example was shown of a fitted curve
to an s-shaped band of T-F pixels due apparently to a plane travelling
overhead at 560 km/h at 3750 m altitude. A filter bank runs continuously
at Hanford to scan over "airplane parameter space", looking for the characteristic
s-curve signature.
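-
The s-shaped track follows from simple kinematics. A sketch (neglecting
the finite sound travel time to the ground, and using the quoted event
parameters):

```python
import numpy as np

def doppler_track(f0, v, h, t, c_s=343.0):
    """Frequency received on the ground from a monochromatic source
    (frequency f0) flying level at speed v and altitude h, with closest
    approach at t = 0; c_s is the speed of sound in air."""
    x = v * t                       # horizontal offset from overhead
    r = np.sqrt(h ** 2 + x ** 2)    # slant range
    v_radial = v * x / r            # range rate (positive when receding)
    return f0 / (1.0 + v_radial / c_s)

# ~80 Hz source, 560 km/h, 3750 m altitude, as in the observed event
t = np.linspace(-60.0, 60.0, 601)
f = doppler_track(80.0, 560.0 / 3.6, 3750.0, t)
```

The curve falls monotonically through 80 Hz at closest approach, which
is the s-curve signature the filter bank looks for.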
-
Other examples of recurring transient patterns observed include:
-
Narrow-band periodic bursts of duration ~100 seconds every ~15 minutes.
-
A ~17 Hz resonance (pendulum roll mode?) driven by occasional impulses
with the resonance decaying over ~100 seconds.
-
Strings of symmetric bursts.
Operational State and Servo Instability DMT
Software (KR - Michigan)
-
Operational state conditions allow a set of conditions to be specified
that must hold before an analysis is undertaken or frames are written to
disk. A DMT class has been written to allow run-time ascii config file
specification of such conditions, including Boolean combinations of conditions.
Example conditions include requiring that the mean or rms value in a time
interval exceed a threshold or lie in a certain range, or requiring certain
bits in a control flag to be on or off.
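-
A Python sketch of the idea (the actual OSC class is C++ and driven by
ascii config files; the class names and interface below are invented for
illustration):

```python
import numpy as np

class MeanAbove:
    """Condition: mean of the channel in the interval exceeds a threshold."""
    def __init__(self, thresh):
        self.thresh = thresh
    def satisfied(self, data):
        return float(np.mean(data)) > self.thresh

class RmsInRange:
    """Condition: rms of the channel lies inside [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def satisfied(self, data):
        rms = float(np.sqrt(np.mean(np.square(data))))
        return self.lo <= rms <= self.hi

class And:
    """Boolean combination of conditions (Or/Not are analogous)."""
    def __init__(self, *conds):
        self.conds = conds
    def satisfied(self, data):
        return all(c.satisfied(data) for c in self.conds)

# e.g. "locked" if the mean exceeds 0.5 and the rms stays below 2
locked = And(MeanAbove(0.5), RmsInRange(0.0, 2.0))
```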
-
Example code and config files were shown. The code has been integrated
into the latest DMT release and is used by the designer data set writer.
The source code, sample program, makefile and documentation can be found
in ~dmt/cvs/dmt/src/dmtlib/osc on the Hanford sand or stone machines.
-
The operational state conditions have been enhanced recently to allow detection
of servo instabilities, including true runaway behavior, gain peaking
and excitation of out-of-band resonances such as internal test mass vibration
modes. Examples of new conditions include excess power in a given frequency
band and rapid rises in the absolute or fractional power in a band.
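-
A band-power rise condition might look like the following sketch; the
chunking scheme and the trigger ratio are invented for illustration:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Power of x in the band [f_lo, f_hi] Hz (one-sided FFT sum)."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    sel = (freqs >= f_lo) & (freqs <= f_hi)
    return float(np.sum(np.abs(X[sel]) ** 2)) / len(x)

def rapid_rise(x, fs, f_lo, f_hi, n_chunks=8, ratio=10.0):
    """Flag an instability if the band power in any chunk exceeds
    'ratio' times the power in the preceding chunk."""
    p = [band_power(c, fs, f_lo, f_hi) for c in np.array_split(x, n_chunks)]
    return any(b > ratio * a for a, b in zip(p, p[1:]))

# Quiet noise vs. noise with a resonance ringing up near 1 kHz
fs = 4096.0
t = np.arange(int(fs)) / fs
rng = np.random.default_rng(2)
quiet = rng.normal(0.0, 1.0, t.size)
ringing = quiet + np.where(t > 0.75,
                           50 * np.sin(2 * np.pi * 1000 * t), 0.0)
```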
-
A separate ascii config file is used to control the behavior of a monitor
that uses the new conditions. Examples of config files used to detect excitation
of the "butterfly" test mass resonance were shown. One can choose merely
to log detected instabilities or in addition, to generate an EPICS alarm
and/or write an entry to the meta-database.
-
Although the expanded set of servo conditions is complete, a dedicated
stand-alone monitor is still under development with completion expected
by September 15.
-
Future plans include augmenting the monitor to allow communication with
the DMT display manager for responding to operator requests for more information
and creating a set of tuned config files for some of the key servos used
in the Hanford 2-km interferometer (in collaboration with Dick Gustafson
on site). That work is planned for completion by November 15.
Comparison of Line Removal Techniques (Bernard
Whiting - Florida)
-
There are a variety of lines to worry about, including electrical mains,
violin modes, and pendula. The lines are coherent, sometimes large amplitude
and can be highly non-Gaussian. Removing the lines can reduce data volume
and improve Gaussianity, allowing better matched filters in searches for
astrophysical sources.
-
A catalog of methods and types of lines removed was shown. This study was
a comparison of three of these methods (multitaper - Ottewill/Allen, quasi-monochromatic
- Klimenko, and coherent - Sintes/Schutz) on the 1994 40 Meter data. These
three methods are now available in the DMT or LAL package. In the comparison
the following criteria have been or will be examined:
-
Statistical properties: Improvement in Gaussianity, detection of non-Gaussian
components
-
Spectral properties: Creation of "glitches" in removal, shape of residual
noise in cleaned data
-
Signal detection ability: False alarm rate, detection failure, dependence
upon SNR threshold
-
A comparison was shown of a histogram in FFT coefficients (real part)
for the frequency bin centered on 14.46 Hz before line removal and after
line removal using the three techniques. In each case a Gaussian of the
same RMS is shown for comparison. The multitaper technique performs best
by far in creating a Gaussian residual of low rms at this frequency.
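-
The diagnostic itself is easy to reproduce on synthetic data: in a bin
containing a coherent line, the real FFT coefficients collected over many
segments sit far from a zero-mean Gaussian. The sampling rate, line
frequency and amplitude below are arbitrary illustration values:

```python
import numpy as np

def bin_coefficients(x, fs, nfft, f_bin):
    """Real part of the FFT coefficient at f_bin for successive
    non-overlapping segments of length nfft."""
    nseg = len(x) // nfft
    segs = x[:nseg * nfft].reshape(nseg, nfft)
    k = int(round(f_bin * nfft / fs))
    return np.real(np.fft.rfft(segs, axis=1)[:, k])

fs, nfft = 1024.0, 512
rng = np.random.default_rng(3)
t = np.arange(int(fs * 512)) / fs
noise = rng.normal(0.0, 1.0, t.size)
line = noise + 2 * np.cos(2 * np.pi * 180.0 * t)  # coherent 180 Hz line

c_noise = bin_coefficients(noise, fs, nfft, 180.0)  # Gaussian, zero mean
c_line = bin_coefficients(line, fs, nfft, 180.0)    # strongly offset
```

Histogramming c_noise against c_line shows exactly the kind of
non-Gaussian residual the comparison plots probe.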
-
A comparison of the techniques on an incoherent "bump" near 600 Hz, however,
showed much smaller differences among the techniques tried.
-
Spectra were shown in the vicinity of 180 and 300 Hz, along with a superposition
of spectra from 60, 180 and 300 Hz (+/-30 Hz window). Many nearby lines
are visible. The same superposition was shown using the Klimenko code with
default and with more extreme removal parameters, with distortion visible
in the extreme case. The Sintes and Ottewill (GRASP) techniques do not
clean up the superposed spectrum as well, with apparent artificial sidebands
created by the Sintes method.
-
Finally, the residual non-Gaussian noise was examined via a histogram of
real FFT components at 180 Hz before and after the three line removal techniques.
Here the Ottewill and Klimenko methods give similar performance, with a
distinctly broader residual for the Sintes method.
-
KR wondered how much of the difference in performance of the techniques
was due to non-linear effects, giving shoulders and sidebands on the visible
lines. Bernard mentioned that future plans include looking at magnetometer
data for a cleaner determination of 60 Hz amplitude & phase.
Coherent Line Removal - in LAL C Code (Alicia
Sintes - AEI/Potsdam)
-
This presentation was a brief description of the method used for line removal
and a description of the LAL C code routines used.
-
The algorithm is meant to remove coherent lines that generate harmonics
while preserving stochastic detector noise and any nearby unassociated
lines. The code can also work on lines & harmonics that change slowly
in frequency with time (e.g., the drifting 50 Hz mains seen in Glasgow
data). It is recommended to monitor at least five harmonics for best performance.
-
The principle is to reconstruct a nearly monochromatic time-domain function
m(t) whose effect on the data is to add (coherently) a series of lines
and harmonics and then to subtract m(t) from the data. Examples were shown
from Glasgow data with wandering harmonics (450 and 750 Hz) of the mains.
The mains are cleanly removed while preserving a nearby violin mode at
745 Hz and an artificial 452 Hz line.
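-
A least-squares stand-in for the idea (the actual method reconstructs
m(t) by demodulating the data, which handles wandering lines; fitting
fixed-frequency quadratures, as below, only works for stationary lines,
and all numbers are illustrative):

```python
import numpy as np

def coherent_remove(x, fs, f0, n_harmonics=5):
    """Fit and subtract a coherent interference with harmonics of f0,
    leaving broadband noise and unrelated nearby lines untouched."""
    t = np.arange(len(x)) / fs
    cols = []
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * f0 * t),
                 np.sin(2 * np.pi * k * f0 * t)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, x, rcond=None)
    return x - A @ coeffs   # subtract the reconstructed interference

fs = 4096.0
t = np.arange(int(fs * 4)) / fs
rng = np.random.default_rng(4)
x = rng.normal(0.0, 0.5, t.size)
x += np.cos(2 * np.pi * 150.0 * t)          # mains-like fundamental
x += 0.5 * np.cos(2 * np.pi * 450.0 * t)    # its third harmonic
x += 0.8 * np.sin(2 * np.pi * 452.0 * t)    # unrelated line to preserve
clean = coherent_remove(x, fs, 150.0, n_harmonics=3)
```

The harmonics at 150 and 450 Hz are removed while the unassociated
452 Hz line survives, mirroring the Glasgow-data example.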
-
Time-frequency plots before and after line removal show dramatically
the cleaning up of the wandering mains at 750 Hz with preservation of two
nearby violin modes.
-
The line removal code is part of a package called cohlineremoval which
includes routines for finding harmonics, generating a reference interference
signal, and removing all harmonics. The code is written to the LAL specification
of July 2000 and has been tested. (The code was originally written in Matlab.)
-
Structures for storing time and frequency series and relevant parameters
were shown, along with the calling sequences (and arguments) needed for
the finder, reference generator and cleaner routines. The package includes
code documentation and a sample frame file for testing the code.
Status of Detector Commissioning (Nergis
Mavalvala - LIGO-CIT)
-
Nergis began with an overview of recent news from the Livingston and Hanford
sites. At Livingston, the PSL has been installed and characterized. The
mode cleaner (MC) has been installed and tested. Installation of in-vacuum
components and realignment of core optics is ongoing. She emphasized that
characterization is being carried out more systematically on the 4-km Livingston
IFO than was done on the Hanford 2-km IFO because the 2-km was viewed as
a "pathfinder" testbed for identifying the major problems to be solved.
-
At Hanford, the single-arm cavity tests were completed in April, including
the 24-hour engineering run and a test of the critical common mode servo.
A number of improvements, including beam-reducing telescopes and rehanging
of optics were done afterward. Frequency noise in the input optics was
reduced with various improvements, including reduction of acousto-mechanical
coupling on the PSL table. The power recycled Michelson (PRM) cavity can
now be operated, and work is ongoing to attempt simultaneous locking of
the PRM with one and two arms.
-
During the single-arm tests in the spring, the locking servo was cobbled
together in analog with feedback to the input test mass alone. Although
the original design for this servo has a high bandwidth for full IFO lock
acquisition, it was found that lock acquisition was easier at lower gain
(bw<100 Hz). Gains could then be raised after acquisition (bw=300 Hz).
-
It was necessary (as expected) to notch out axisymmetric test mass resonances
at 9.5 and 14.5 kHz. Unexpected was the need to notch out the non-axisymmetric
"butterfly" mode at 6.5 kHz. Better beam centering is expected to help
reduce excitation of this mode. It was pointed out that the butterfly mode
has the virtue of giving an error signal on mis-centering!
-
The single-arm operation allowed testing of a preliminary version of the
common mode servo, the most complex of the LIGO I longitudinal servos.
Feedback actuation is made on test mass positions at low frequency and
on both the MC length coil driver and the MC error offset at high frequencies.
Plots were shown of the original servo design and of the realization in
hardware after problems were encountered in locking. In the new scheme
the MC length path does not dominate the open loop gain at any frequency
in the servo realization.
-
A plot of measured open loop gain (inferred from closed loop gain) was
shown with a unity-gain frequency of about 9 kHz. The best achieved was
20 kHz with a phase margin of 80 degrees.
-
About 40% of the alignment controls system was tested successfully (wave
front sensing scheme), i.e., with four mirror orientation angles and two input
beam pointing degrees of freedom. The alignment controls are low-bandwidth
(few Hz).
-
Several properties of the single-arm cavities were measured:
-
Resonant reflectivity gave a visibility of 0.02, corresponding to an apparent
70 ppm average loss per mirror. Expected was a visibility of 0.01 for 30
ppm per mirror. Beam clipping cannot be excluded as the explanation.
-
The cavity storage time was measured to be 467 usec, corresponding to a
finesse of 220 and a sum of input mirror transmission and cavity loss of
0.0281 (design = 0.04).
-
The cavity length was measured to be 2009.11 m with RF resonant techniques,
in excellent agreement with the survey value of 2009.12 m.
-
Mode matching into the arm was found to be 96%.
-
The Michelson contrast was found to be 0.32%.
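-
The storage-time, finesse and loss figures above can be cross-checked with
the usual high-finesse cavity relations, assuming the quoted storage time
is the 1/e power decay time (a convention assumed here, not stated in the
talk):

```python
import math

# For a two-mirror cavity of length L, the 1/e power storage time is
# tau = (2L/c) / (T + loss), and the finesse is 2*pi / (T + loss).
c = 2.998e8        # m/s
L = 2009.11        # m, measured cavity length
tau = 467e-6       # s, measured storage time

loss_sum = (2 * L / c) / tau        # input transmission + cavity loss
finesse = 2 * math.pi / loss_sum
```

This reproduces the quoted values (finesse ~220, summed loss ~0.028) to
within rounding.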
-
The status of the core optic suspensions was also discussed. There are
four sensors and four actuators to control nominally three degrees of freedom
(position, pitch, yaw), with non-negligible cross couplings to vertical,
side and roll motions. Diagonalization of the sensor/actuator matrices
required some work (see talk by Penn at last LSC meeting).
-
Measurements of decay times in situ give Q values for test mass internal
modes of 10^4-10^7.
-
Scattering of the 1 micron main laser beam into the sensor photodiodes
has prevented high beam intensities. A solution using new modulated LED's
with demodulated sensor output is to be tried soon, along with better shielding
and optical filtering.
-
The locked single arm allowed a measurement of PSL frequency noise. The acousto-mechanical
resonances are still apparent, despite improvements in optical mounting
and elimination of parasitic interferometers. There is also a broad background
ascribed to carrier resonance in the reference cavity with saturation of
RF photodetection. The noise level is significantly above the design value
(but which itself is 10 times below what becomes a serious problem).
-
The digital servo controls will be commissioned soon, with feedback to
end test masses (was unavailable in spring running). Next on the agenda
is getting the PRM to lock with one and then both arms.
Characterization of LIGO Input Optics (Haisheng
Rong - Florida)
-
The primary functions of the Input Optics (IO) system include conditioning
of the PSL output (frequency, intensity, pointing stability), mode matching
of delivered beam, optical isolation and generation of RF sidebands for
IFO sensing. A schematic diagram of the IO system leading up to beam delivery
into the IFO was shown, along with a block diagram of IO interfaces to
other LIGO subsystems.
-
Overview of IO status for the three IFO's:
-
LHO 2K - Operational since Aug 99, major characterization complete, including
integration with 2K arm cavities, modifications/realignment carried out
in April/May 00.
-
LLO 4K - Mode cleaner locked in March 00, characterization/improvement
March-June 00 and realignment June-August 00.
-
LHO 4K - Installation started July 00.
-
A large number of characterization measurements have been carried out.
Results were shown for the LHO 2K IO system:
-
The MC length has been measured with RF resonant sideband detuning.
-
The transmission has been measured to be 98 +- 5% and the mode matching
to be 95-98%.
-
The pointing stability has a long term drift of about 4 urad/hour and a
jitter of 10^-10 rad/sqrt(Hz) for freq > 20 Hz. A power spectrum
of horizontal and vertical fluctuations shows strong peaks at known resonances
below 20 Hz. Wave front sensing is not (presently) effective above 1 Hz.
-
The MC cavity linewidth has been measured both using a ringdown technique
and via the optical gain. The HWHM was found to be 3.66 kHz in Sept 99
and 3.55 kHz in Feb 00, giving estimated cavity losses of 148 ppm and 14
ppm, respectively, with a +-50 ppm uncertainty, dominated by the uncertainty
in transmission.
-
The internal Q's of the three MC mirrors have been measured to be 0.75,
0.36 and 1.29 x 10^6 for MC1, MC2 and MC3, respectively,
with resonant frequencies all near 28.2 kHz. Measurements were carried
out by observing ringing after excitation.
-
The MC has been used to analyze the PSL noise, as discussed by Mavalvala
above.
-
Power spectra for the MC length control signal were shown for January and
June 2000. The June PSD looks much improved, believed to be primarily due
to better levelling of the MC beam plane.
-
The frequency noise of the MC/PSL system has itself been measured with
the 2K arms. Power spectral densities (units of Hz/sqrt(Hz)) were shown
for the 2K arm error signal ("common mode"), along with the control signals
fed back to the input test mass, the Mode Cleaner length and the AO path.
-
Coming up soon is an attempt to lock the MC at the designed input power
after installing new OSEM sensors that include demodulation detection to
mitigate effects of scattering of the beam into the photodiodes (see Mavalvala
talk above).
Source and Propagation of the Predominant
1-50 Hz Seismic Signal from Off-site at LIGO-Hanford (Robert Schofield
- Oregon)
-
A power spectrum of seismometer data at Hanford shows a broad enhancement
at the 1-10 nanometer level over about 1 to 50 Hz, with peaking in the
4-10 Hz range. This talk focussed on determining both the predominant source
of that noise and pinning down its propagation path to the LIGO seismometers.
-
Investigation in the field established early on that the noise comes primarily
from automobile traffic (trucks being worse) on highway 240. Other local
roads were eliminated, as was a small weir spill on the Yakima River. It
wasn't clear, however, whether the primary source was from the engines,
or from the interaction of the tires with the pavement. By running both
a car and truck along the highway while taking measurements, it was found
that the peak frequency of noise correlated well with that expected from
the axle spacing between front and back wheels, consistent with the rough
and ungrooved pavement on the highway. Resurfacing the highway would help
reduce the noise.
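-
The axle-spacing argument is simple arithmetic: successive axles excite
the same pavement roughness a time d/v apart, so the excitation peaks
near f = v/d. The speed and spacing below are illustrative assumptions,
not values quoted in the talk:

```python
def axle_frequency(speed_kmh, axle_spacing_m):
    """Bump-excitation frequency from a vehicle's axle spacing."""
    return (speed_kmh / 3.6) / axle_spacing_m

# A highway truck at ~100 km/h with ~5.5 m between front and rear axles
f_truck = axle_frequency(100.0, 5.5)   # ~5 Hz, inside the 4-10 Hz peak
```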
-
Pinning down the propagation path took some care. Hypotheses included direct
acoustic (through air) coupling to seismometers, acoustic coupling to the
ground, or an entirely-ground path of propagation.
-
The first hypothesis was tested (and rejected) by suspending a speaker
near a seismometer to measure its sensitivity to acoustic noise. Also,
insulating the seismometer underground successfully suppressed acoustic
noise detected by a microphone, but had no effect on the seismometer.
-
The acoustic effect of traffic on ground motion was tested with planes
and a helicopter where one could estimate reasonably well the expected
amplitude of seismic disturbance (from an 1885 calculation!) and remove
the possibility of direct coupling to the ground. Measured displacements
were within a factor of three of calculation for the aircraft, but the
same calculations applied to a truck on the highway failed to account for
measured seismic vibrations by three orders of magnitude. These results
suggested that the propagation takes place entirely through the ground.
-
To test this, the velocity of sound propagation through the ground was
measured vs frequency using portable ground tampers and two seismometers.
Then the velocity of the highway 240 disturbances was measured via correlations
between the two seismometers. The highway 240 disturbance velocity was
measured to be 492 +- 62 m/s, consistent with the tamper measurements for
signals peaking below 10 Hz.
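-
The two-seismometer velocity measurement can be emulated on synthetic
data via the cross-correlation lag; the sampling rate, sensor separation
and delay below are invented for the sketch:

```python
import numpy as np

def propagation_velocity(s_near, s_far, fs, separation_m):
    """Ground-wave velocity from the cross-correlation lag between two
    seismometers a known distance apart along the propagation path."""
    corr = np.correlate(s_far, s_near, mode="full")
    lag = abs(int(np.argmax(corr)) - (len(s_near) - 1))  # delay, samples
    return separation_m * fs / lag

# Synthetic test: the same broadband signal, delayed 50 samples over 100 m
fs = 256.0
rng = np.random.default_rng(5)
src = rng.normal(0.0, 1.0, int(fs * 8))
d = 50
s_near = src[d:]    # nearer sensor sees the wave first
s_far = src[:-d]    # farther sensor sees it d samples later
v_est = propagation_velocity(s_near, s_far, fs, 100.0)
```

With these assumed numbers the recovered velocity is 512 m/s, the same
order as the measured highway-disturbance velocity.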
-
A spatial "Q" value was estimated from the attenuation observed in signals
between the end and middle Y stations. In the truck peak range (4.4-6 Hz)
the measured value was Q = 68 +- 12. One expects a Q of about 40 for the
local soil.
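-
Under the simplest attenuation model, A(x) = A0 exp(-pi f x / (Q v)) with
no geometric-spreading correction, the spatial Q follows from an amplitude
ratio measured over a known distance. The 2 km spacing is the Hanford
mid-to-end station distance; the amplitude ratio here is an assumption
chosen only to illustrate the inversion:

```python
import math

def spatial_q(amp_ratio, distance_m, freq_hz, velocity_mps):
    """Spatial Q from the amplitude ratio A0/A over a distance,
    assuming A(x) = A0 * exp(-pi * f * x / (Q * v))."""
    return (math.pi * freq_hz * distance_m
            / (velocity_mps * math.log(amp_ratio)))

# An amplitude ratio of ~2.6 over 2 km at 5 Hz and 492 m/s gives Q ~ 67
Q = spatial_q(amp_ratio=2.6, distance_m=2000.0,
              freq_hz=5.0, velocity_mps=492.0)
```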
-
The effect of these road disturbances on the arm control signals will be
looked at in the future.