Minutes of Detector Characterization Sessions
LSC Meeting - LIGO Livingston Observatory (March 16-18, 2000)
Introduction (KR - Michigan):
-
With more than 20 presentations over the next three days, speakers were
urged to keep to assigned times, and questioners warned that discussion
may be cut short at times. The plenitude of talks is a positive reminder
of much ongoing detector characterization activity.
-
Points to discuss during the sessions:
-
Revision of task tables, including more realistic milestones, consolidation
of groups signed up for common tasks with assignment of a "Lead Group"
for certain tasks, and revision of priorities & task definitions.
-
Clearer definition of Data Monitor Tool "Deliverable"
-
Mechanism for distributing upcoming engineering data to LSC groups (issue
already raised by Rai Weiss in plenary session)
-
Participation in analysis of fall 1999 40 Meter data (coincidence run with
TAMA)
Characterization of the 2-km Hanford Single-Arm IFO (Peter Fritschel
- LIGO-MIT):
-
This presentation gave a brief account of commissioning
progress to date on the 2-km single arms at Hanford, along with a sampling
of measurements to characterize the arms.
-
Goals of the ongoing characterization include:
-
Optical parameter measurement, specifically cavity
losses (via resonant reflectivity) and mode matching (tunable with input
telescope)
-
Environmental influences, including the effects of
lunar tides and the microseism on length fluctuations and including angular
fluctuations
-
Servo control performance. Concerns here include
lock acquisition and effects of test mass resonances. One wants to test
the alignment servos (wave front sensing), the frequency stabilization
in the common mode servo, and the stack fine actuators.
-
Most of the work to date has focussed on the y arm,
which has a measured finesse of about 200. Actuation on the input mass
is presently via an analog servo loop. Although mode matching can be sensed,
the servo loops to control it have not yet been closed.
-
Input power has been limited by an unintended leakage
of laser light into the OSEM photodiodes used to sense mode cleaner mirror
position and alignment. The laser is run at 300 mW for the moment, and
about 4 mW impinges on the beam splitter. (The recycling mirror is deliberately
misaligned for simplicity, leading to large reflection of the incident
beam from the mode cleaner.)
-
When the gate valves between the corner and mid stations
were first opened in November, the laser light was immediately seen to
scatter off the midstation suspension cage, indicating excellent
a priori alignment.
-
Much effort has gone into acquiring/maintaining lock,
with 1-2 hour locks common since the end of February.
-
Wavefront sensor alignment has been implemented (early
March) and its benefits confirmed (increased power in the arm, as indicated
in a plot shown). Unity gain frequency is now limited to 1 Hz, but the
goal is about 5 Hz.
-
Characterization of the single arm will continue
through the first week of April.
-
The length locking servo has a measured unity gain
frequency of 500 Hz. A plot was shown of cavity power and length sensing
error vs time when the arm was out of lock. The width of the fringe
transient indicated an open-loop velocity of about 1/4 of a wavelength
per second, a low value indicating quiet conditions.
-
To avoid exciting test mass resonances, the initial
scheme was to notch out (analog filter) two axisymmetric resonances at
9.4 and 14 kHz. This scheme didn't work well, however, because non-axisymmetric
("butterfly") modes at 6.2 kHz and above 20 kHz were excited. To suppress
these excitations, another notch was added at 6.2 kHz, and the gain rolloff
above 10 kHz was steepened. These normal modes have Q's of 10**6 - 10**7,
making them easy to excite.
-
The total loss in the cavity was measured via
the reflectivity of the cavity on resonance vs off resonance, giving
150 ppm, three times higher than the expected 50 ppm.
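-
The relation behind this measurement can be sketched numerically. Assuming a simple two-mirror cavity with an input-mirror power transmission of about 3% (roughly consistent with the quoted finesse of ~200, but an assumed value here) and off-resonance reflectivity of ~1, the round-trip loss follows from the on/off-resonance reflectivity ratio:

```python
import numpy as np
from scipy.optimize import brentq

T1 = 0.030   # assumed input-mirror power transmission (hypothetical value)

def refl_on_resonance(loss):
    """Power reflectivity of a two-mirror cavity on resonance, with all
    round-trip losses other than the input mirror lumped into 'loss'."""
    r1 = np.sqrt(1.0 - T1)
    r2 = np.sqrt(1.0 - loss)
    r = (r1 - r2) / (1.0 - r1 * r2)
    return r ** 2

# Synthesize the "measured" on/off-resonance ratio from the quoted 150 ppm,
# then invert it numerically, as one would with real data.
measured_ratio = refl_on_resonance(150e-6)
loss = brentq(lambda L: refl_on_resonance(L) - measured_ratio, 1e-9, T1 / 2)
print(f"inferred round-trip loss: {loss * 1e6:.0f} ppm")
```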
-
Inexact beam centering led initially to higher losses,
but seems to have been fixed. It is hoped that the higher loss than expected
is an artifact of beam clipping in non-cavity optical elements in the vacuum
system.
-
Mode matching will eventually be monitored continuously
(and controlled) via "bullseye" photodetectors, but for now it has been
measured from the transient created when the input beam is suddenly chopped,
giving 80% matching.
-
Drifts of the 2 km length close to 1 micron/minute
were initially observed and traced to temperature drift in the laser's
reference cavity chamber, which has now been temperature stabilized. Lower
drift rates are also seen and ascribed to tidal distortion.
-
Correlation between seismometer readings and the
cavity length control signal has been studied. A strong correlation is seen at
the micro-seismic peak (0.1-0.2 Hz), as expected. The correlation becomes
even stronger when the difference between LVEA and midstation seismometers
is compared to the length control signal. This is illustrated not
only by correlation power spectra with coherence, but also by the relatively
small residual in a time series of the length control when the normalized
differential seismic signal is subtracted.
-
The spectra of the error and control signals above 1 Hz indicate that the
control signal is largely contending with frequency noise.
The Diagnostic Test and Display Tool (Daniel Sigg - LIGO-LHO):
-
A comprehensive software package called the Diagnostic
Test Tool (DTT) has been written to support online diagnostics at the sites,
including real-time graphics display based on CERN's ROOT package. It was
developed at Hanford, but was recently installed at Livingston. This presentation
included example outputs from the program and a real-time demonstration
of the program in the Livingston control room.
-
Example plots shown in warmup:
-
Power spectra of the 2-km arm error and control signals
-
Same with seismic spectra superimposed
-
Coherence vs frequency for various pairs of
control signal and seismic signals
-
Spectra of pitch and yaw monitors when dithering
present
-
Measured transfer functions (magnitude) between orientation
excitation voltages and wave front sensor signals
-
Measured transfer functions (magnitude and phase)
between PEM and optical lever signals, along with derived transfer coefficients
binned in frequency
-
A block-diagram overview of the diagnostics system
was shown, illustrating the flow of data both from and to (excitations)
the interferometer.
-
At Hanford the test tool can also be run on the fortress
machine which has access to two 70 GB disk stores and an AIT-2 tape robot
(50 GB/tape) which can be written at 6 MB/s.
-
Illustrations of the GUI screens used to control
the DTT were shown for the arbitrary waveform generator, which supports
excitations of sine waves, square waves, ramps, swept sines, etc. It looks
very much like a software version of an HP spectrum analyzer.
-
Other GUI screens shown illustrated controls for
taking power spectra and displaying results.
-
The DTT also supports a command-line interface, allowing
scripts to define and carry out measurements, including manipulation of
the arbitrary waveform generator. Sample commands and resulting output
were shown.
-
Summary of DTT capabilities:
-
Excitation with 8 kHz bandwidth, digital & analog
with many waveforms
-
Access to all on-line data with integration of excitation
signals, including correct synchronization
-
Fourier analysis with transfer function and triggered
responses
-
Control panel GUI with integrated graphics to display
results
-
Support of several graphics output formats (ps, eps,
pdf, jpeg, etc.)
-
Export of ASCII or binary test results in LIGO lightweight
format
-
Can be run offline to look at stored data with same
capability (minus excitation!)
Status of the Data Monitor Tool (John Zweizig - LIGO-CIT):
-
At LHO there are two 4-cpu Sun E-450 workstations
(sand and stone) running DMT software. The dedicated frame builder is working,
and designer data sets can be written to disk or tape on the fortress machine.
Typically, four monitors are running at any given time, with 2-3 users
logged on. Resources are plentiful - for the moment.
-
At LLO a new machine (delaronde) has been brought
up with configuration identical to sand and stone at LHO. The DMT code
is running, but there is no connection yet to the CDS or LDAS networks.
The frame builder is running.
-
There have been several DMT releases in recent months,
the most recent being 1.1b on 2/22/00. The DMT web page has been upgraded
to include more information on installation.
-
A new data analysis environment has been defined
(DatEnv) to provide more structure for generic data monitor development,
insulating users from details of frame i/o. This environment is available
in the background or under ROOT. The environment is defined by a base class
in C++, and a template example is included in the DMT distribution package.
-
The DMT includes a frame writer which is used at
LHO for saving small data samples locally, and another utility has been
written to keep track of frame files in specified directories.
-
A number of technical improvements to the data container
classes will be available in the next DMT release (1.2).
-
Work is underway with Daniel Sigg to develop communication
with DMT background monitors, allowing interrogation and display by operators.
The software developed for the Diagnostic Test Tool will be adapted for
presenting graphics. The plan is to incorporate a data server automatically
into every monitor to standardize and simplify communication.
-
Other miscellaneous DMT improvements for the future
include an audio API, a trend-frame-writing API, better frame reading efficiency
(using the new framecpp table of contents), and closer integration with
LDAS software.
-
To kick off discussion on the definition of a DMT
deliverable, John outlined the following steps for full installation:
-
Add monitor sources & documentation to CVS software
repository
-
Link to documentation from GDS/DMT web page
-
Compile from CVS into production directory
-
Add monitor to process manager table
-
Enable trigger logging
-
The documentation should include:
-
Purpose of monitor
-
Algorithm used
-
Description of all inputs
-
Description of all outputs (triggers, reports, trend
data, served data)
-
List of required packages
-
The full source code should be provided, along with
a sample make file.
-
Some discussion ensued on how to handle software
that does not yet meet all of the standards, but which should be available
for use / testing by other physicists.
-
It was decided that a "pre-delivery" of development
software would be appropriate before true delivery of a final installed
monitor. That pre-delivery would involve placing (working!) DMT code in
a publicly accessible directory on the LHO machines (sand/stone) and announcing
its availability for trial. Documentation of use and algorithm used would
be required, with a link on the DMT web page. A mailing list will be set
up for DMT code developers.
-
If a monitor is indeed found to be useful (and not
harmful!), the final full delivery would be expected to occur within 3
months of pre-delivery.
Status of Performance Characterization DMT Software (KR - Michigan)
-
This was more a cajoling session than a status report, focussed on obtaining
firm commitments to deliver priority 1 (and some priority 2) software for
the DMT.
-
Lead persons were assigned to Line Noise Sources (Ottewill), Seismic Noise
(Daw), Inter-channel Correlations (Ottewill), Operational State (Riles),
and Band-limited RMS (Daw) software tasks. Revised task tables will be
posted on the working group web site.
-
Later discussions concerning the Stack Vibrations and Bilinear Cross-couplings
tasks (both priority 1) argued for combining these two into one task. Steve
Penn is interested in this area and will decide soon whether to make a
commitment.
Status of Transient Analysis DMT Software (Fred Raab - LIGO-LHO)
-
This was a brief status report on promised software deliveries, followed
by additional cajoling for firmer commitments, including realistic dates.
-
In the end, commitments were obtained for all transient analysis tasks
except the monitoring of Flickering Optical Modes (priority 1) and Automated
Transient ID (priority 3).
Status of Data Set Reduction and Experience in HEP Data Sharing (David
Strom - Oregon)
-
The White Paper data model was quickly reviewed, in which it was envisioned
that "Level 1" archived reduced data would amount to about 25 TB/year with
Level 2 at 2.5 TB/year and Level 3 at 0.2 TB/year.
-
A strawman proposal for channels to include in the Level 2 data set was
shown, giving 200 kB/s or 1.6 TB/year, assuming a 50% duty cycle and a
50% compression. One could imagine exporting data of this size to LSC member
institutes. Feedback on channel decimation choices is requested from working
group members.
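-
As a quick arithmetic check of the quoted figure:

```python
raw_rate = 200e3                       # strawman Level 2 rate, bytes/s
seconds_per_year = 365 * 86400
duty_cycle = 0.5                       # fraction of time taking science data
compression = 0.5                      # compressed size / raw size
tb_per_year = raw_rate * seconds_per_year * duty_cycle * compression / 1e12
print(f"{tb_per_year:.2f} TB/year")    # rounds to the quoted 1.6 TB/year
```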
-
Various data compression schemes are under consideration. Lossless gzip
compression could give a factor of two reduction. Wavelet compression (proposed
by Klimenko) might bring a factor of five, but with some loss. Other approaches
are being investigated by Zotov.
-
More sophisticated reduction could combine information from multiple channels.
-
Presently one can use the JDclient and RDSWriter programs running on site
to select out a subset of channels to write to disk or tape. A more powerful
reduction program will be developed to run in the DMT, one that includes
filtered decimation.
-
In response to an earlier request by Albert Lazzarini, a description was
given of the data sharing & analysis facilities used by the high energy
physics experiment OPAL:
-
Data is stored in a 1 TB area and in a 10 TB tape robot with 0.7 TB disk
buffer. A small farm of 200 MHz SGI workstations can read this data and
ship highly reduced samples to office workstations for further, more cpu-intensive
analysis. The data farm is called SHIFT (Scalable Heterogeneous Facility
Testbed). Running over all data normally requires 1-2 weeks.
-
This system was begun in 1992, and better systems should be available now.
Roughly 50-100 publications arise each year from analysis carried out in
part on the SHIFT system.
-
CPU time is allotted democratically, but OPAL management has reassigned
priorities in high-use periods preceding conferences.
-
While LIGO will face some problems in common with OPAL, LIGO analysis is
intrinsically more CPU intensive.
Status of Data Set Simulation (Sam Finn - Penn State)
-
This was primarily a description of the Simdata program (runs in Matlab)
for parametrized time-domain modelling of interferometer noise.
-
The first release in Nov 1999 included parametrized modelling of shot noise,
radiation pressure noise, and thermal noise from suspensions and internal
test mass normal modes.
-
A second release, scheduled for May 15, will write data output in frames,
include seismic noise and violin modes, and allow whitened
output data. Once complete, this program will be turned over to the End-to-End
Team for incorporation of the algorithms.
-
A third release will include hooks for injecting transients and signals,
improved internal thermal noise modelling, and a streamlined frame writing
program.
-
The program runs at 5 times real time for a 16 kHz sampling rate on a Sun
Ultra 30 and 10 times on a modern laptop.
Application of the End-to-End Model to the LHO 2-km Interferometer (Hiro
Yamamoto - LIGO-CIT)
-
The End-to-End Model is now being exercised on the partial interferometers
at Hanford.
-
Detailed comparisons between predictions and measurements are now possible
for many subsystems, and E2E code developers are spending time at the sites.
-
These comparisons allow various IFO parameters to be determined.
-
Examples of fruitful comparisons include:
-
Measurement of arm finesse and test mass velocity from fringe spacing and
widths
-
Confirmation that 120 Hz and 240 Hz peaks seen in reflected light can be
explained by 60 Hz frequency modulation in the mode cleaner control system
-
Mode Cleaner and single-arm mode matching measurement from ringdown curves
with laser intensity chopping
-
Verification of seismic noise modelling via power spectra
-
Detailed simulation is underway of the laser reference cavity, including
a mechanical model of its optical table and mechanical disturbances.
-
Lock acquisition simulation has begun, superseding earlier work based on
the SMAC simulation package. Single-mode acquisition simulation is running,
and multi-mode simulation is planned for the future.
-
Other detailed modelling in the plans includes photodiode motion and non-uniformity,
lens motion, and possible spurious cavities from photodiode reflection.
Interchannel Correlations, Line Tracking, and Porting GRASP Code to
the DMT (Adrian Ottewill - Dublin)
-
A description was given of the algorithm used in GRASP for automatic cross
talk removal from multi-channel data, based on correlations between the
GW channel and other channels in narrow frequency bands. Examples of denoising
improvements were shown, along with graphs of correlation coefficients
vs frequency for the 1994 40 meter data.
-
A description was given of the Slepian multi-taper method used in GRASP
for estimating the amplitude and phase of line sources for monitoring or
removal.
-
An example of porting GRASP code to the DMT was shown, illustrating translation
rules for mapping of GRASP I/O, graphing and spectral analysis routines
to corresponding routines in the DMT. Plots were shown of output
from both programs for the same data.
-
Adrian has signed up to deliver the interchannel correlation DMT code by
May 16 and the line tracking code by April 16.
Noise Characterization & Higher Order Statistics (Albert Lazzarini
- LIGO-CIT)
-
This presentation was a report on work carried out by a summer SURF student,
Denis Petrovic.
-
In addition to the familiar power spectra and 2nd-order correlations used
to analyze data, one can apply more general, higher-order measures called
cumulants and polyspectra.
-
These higher order measures are well suited to understanding / quantifying
non-Gaussian noise. For Gaussian data, the polyspectra of order n>2 vanish.
-
More generally, these techniques allow measurement of non-linear effects
in a signal.
-
For this study, a 3rd-order measure called the bicoherence was used, which
is a normalized version of a polyspectrum and depends on two frequencies.
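-
A minimal estimator of the squared bicoherence, averaging the bispectrum over data segments (a generic textbook-style implementation, not the SURF code itself); the normalization keeps the values between 0 and 1:

```python
import numpy as np

def bicoherence(x, nseg=64, nfft=256):
    """Estimate the squared bicoherence b^2(f1, f2) by averaging the
    bispectrum over segments; values lie between 0 and 1."""
    segs = x[: nseg * nfft].reshape(nseg, nfft)
    X = np.fft.rfft(segs * np.hanning(nfft), axis=1)
    nf = nfft // 4                      # keep f1 + f2 within the band
    num = np.zeros((nf, nf), dtype=complex)
    d1 = np.zeros((nf, nf))
    d2 = np.zeros((nf, nf))
    for Xs in X:
        p = Xs[:nf, None] * Xs[None, :nf]                         # X(f1) X(f2)
        s = Xs[np.arange(nf)[:, None] + np.arange(nf)[None, :]]   # X(f1+f2)
        num += p * np.conj(s)
        d1 += np.abs(p) ** 2
        d2 += np.abs(s) ** 2
    return np.abs(num) ** 2 / (d1 * d2 + 1e-30)

rng = np.random.default_rng(1)
b2 = bicoherence(rng.standard_normal(64 * 256))
print(f"max bicoherence for Gaussian noise: {b2.max():.2f}")
```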
-
The 1994 40 Meter data was used, data characterized not only by many harmonics
but also by sidebands on those harmonics, indicating non-linear behavior.
-
From the data, one can determine the probability distribution in the bicoherence
value squared (ranges from 0 to 1) and compare with what is expected for
truly Gaussian data.
-
Although a simple plot of time series amplitudes looks pretty Gaussian to
the eye, the long tail in the bicoherence indicates otherwise.
-
To understand how to interpret these results, several toy models were simulated:
uncorrelated lines with a coincident frequency sum (bicoherence uniform),
strong mixing of two lines (peaks in the bicoherence at the sum and difference
frequencies), and upconverted broadband noise ("wings" appear on peaks
in the bicoherence).
-
The 40 meter data exhibits such wings, indicating upconversion, consistent
with previous interpretation of visible sidebands.
-
Albert has no one to pursue this technique with in the near future and
hopes to interest another LSC member in taking it on. It seems to be
a natural analysis tool for the tasks of bilinear cross-couplings and stack
vibration monitoring discussed above.
Some Effects of Earthquakes, Temperature, Wind Storms and Barometric
Pressure on the Interferometer at Hanford (Robert Schofield - Oregon)
-
Several environmental influences have been studied at the Hanford site.
These influences, primarily at low frequencies, can lead to loss of lock,
a need for increased servo gain and actuator dynamic range, and
increased noise in the GW bandwidth due to upconversion.
-
A nitty-gritty necessity in estimating environmental influences is a set
of calibration factors relating a physical quantity to an ADC count. A
table of such factors, determined empirically, was shown for seismometers,
tilt meters, shadow sensors, optical levers and the test mass actuation.
-
An example was shown of a time series transient in optical lever signals due
to a (large) California earthquake, along with power spectra taken during
recent quakes in Oregon and Alaska from seismometer, tilt meter and shadow
sensor signals.
-
Rough agreement in implied test mass motion is seen among the tilt meter,
seismometer and shadow sensor during the latter quakes, given the calibration
scale factors above.
-
As previewed above, it was discovered that the laser's reference cavity
temperature correlated strongly with drifts seen in the arm cavity length
control signal, drifts large enough to cause periodic lock loss. The cavity
has now been temperature stabilized.
-
Very strong gusts (tens of meters/second) measured by anemometers mounted
on building roofs show a striking correlation with seismometer and optical
lever signals, indicating a significant coupling of wind forces to the
interferometer.
-
A strong anti-correlation was also discovered between barometric pressure
and the voltage controlling the piezo-electric transducer that defines
the pre-mode cleaner cavity length. Losses of lock due to exceeding the
pzt voltage range could be directly attributed to pressure variation, which
affects the optical path length in the cavity. To accommodate historical
pressure variations since 1955, the existing pzt would need to have a factor
of 6.4 more dynamic range. An alternative to replacing the pzt is to seal
the cavity, perhaps in vacuum.
Seismic Monitoring at LHO with DMT (Evan Mauceli - Oregon)
-
A continuous monitor of local seismic activity has been written in the DMT,
with an emphasis on earthquake detection.
-
The program monitors frequencies below 0.5 Hz, including the microseismic
peak and surface waves with 20-second period from distant quakes.
-
Plots were shown of seismometer data from Mt. Rainier and from LHO of a
quake originating in the Vanuatu Islands.
-
The monitoring algorithm analyzes data frame by frame, looking at 10 seismometer
channels distributed over the Hanford arms and LVEA.
-
Channels are sub-sampled to 1 Hz and a threshold applied to each, corresponding
to 3 sigma deviation, based on a preceding hour of data. An earthquake
trigger is defined by 6 of the 10 channels exceeding threshold.
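-
The trigger logic can be sketched as follows (the channel count, 3-sigma threshold from the preceding hour, and 6-of-10 coincidence are from the talk; everything else, including the raw sample rate and the injected disturbance, is illustrative):

```python
import numpy as np

def quake_trigger(channels, fs, nsigma=3.0, min_channels=6):
    """Sub-sample each channel to 1 Hz, threshold each at its mean plus
    nsigma standard deviations computed from the preceding hour, and
    report seconds where at least min_channels channels are over."""
    sub = channels.reshape(channels.shape[0], -1, fs).mean(axis=2)  # 1 Hz
    history = sub[:, :3600]                   # preceding hour of data
    thresh = history.mean(axis=1) + nsigma * history.std(axis=1)
    over = sub[:, 3600:] > thresh[:, None]    # per-channel exceedances
    return np.flatnonzero(over.sum(axis=0) >= min_channels) + 3600

rng = np.random.default_rng(2)
nchan, nsec, fs = 10, 7200, 16            # 2 hours of 10 channels at 16 Hz
data = rng.standard_normal((nchan, nsec * fs))
data[:8, 5000 * fs:5020 * fs] += 5.0      # injected 20 s quake on 8 channels
times = quake_trigger(data, fs)
print(f"trigger seconds: {times.min()}..{times.max()}")
```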
-
The trigger information (now written to XML and text files) includes the
GPS time of occurrence, channels above threshold and the corresponding
seismometer reading (microns/s).
-
A sample trigger output file was shown for a recent quake in Alaska.
-
Planned extensions include quadrature sums of x, y, and z motion from monitors,
multiple frequency bands and monitoring of other transients, such as magnetic
field transients.
-
Evan has signed up to deliver magnetic transient detection code by May
1 and seismic / wind gust monitoring code by June 2001 (although he expects
and was strongly encouraged to have it ready much sooner).
Seismic Monitoring at LLO (Warren Johnson - Louisiana State)
-
No transparencies turned in --> no minutes written (yet).
Results on the Gladstone HS Microseismic Monitoring Project (Fred Raab
- LIGO-LHO)
-
A team of Gladstone high school students and teachers has been monitoring
the microseismic motion at Hanford, using the DMT.
-
The PSD is estimated for 0.1-0.2 Hz and written to a web-accessible file
every 15 minutes.
-
Further analysis (trending, statistics, correlation, event ID) is then
carried out at Gladstone throughout the school year. In fact, on a number
of occasions the students have called up to find out why the data looked
strange or was missing, prompting LHO physicists to discover a malfunction.
-
The motivation for studying the microseismic over a long period was a worry
that much earlier and short-term pre-construction measurements had missed
occasional prolonged fluctuations upward by as much as a factor of ten.
-
The data from the last several months do not support that worry. Fluctuations
are of order three, with a typical displacement rms of about 5 x 10**-7.
-
An effort is now underway at Gladstone to check NOAA deep-ocean buoy data
for correlation with the local microseismic strength.
Report on Application of the DMT for Transient Searches and Seismic
Data Analysis (Ed Daw - LIGO-MIT)
-
DMT software has been written to monitor seismic noise, measuring both
band-limited RMS and searching for transients using filter banks.
-
The code is running at LHO and analyzes 16-second segments, using data
from three seismometers which are sampled at 256 Hz.
-
A FIR digital filter is used for anti-aliasing, followed by downsampling
to 16 Hz and subtraction of the DC level. For now, a microseismic band
below 1 Hz and a 1-5 Hz band are implemented, with an octave scheme planned
for the future.
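-
A sketch of this processing chain on synthetic data (the actual DMT filters are not specified in the talk, so a generic FIR decimator and Butterworth band-pass stand in; the injected signal and amplitudes are invented):

```python
import numpy as np
from scipy.signal import butter, decimate, sosfiltfilt

rng = np.random.default_rng(3)
fs = 256                                  # raw seismometer rate (Hz)
t = np.arange(16 * fs) / fs               # one 16-second segment
x = np.sin(2 * np.pi * 0.2 * t) + 0.3 * rng.standard_normal(t.size)

y = decimate(x, 16, ftype='fir')          # FIR anti-alias + downsample to 16 Hz
y -= y.mean()                             # subtract the DC level
fs_low = 16

def band_rms(sig, lo, hi, fs):
    """Band-limited RMS via a zero-phase Butterworth band-pass."""
    sos = butter(4, [lo, hi], btype='band', fs=fs, output='sos')
    return np.sqrt(np.mean(sosfiltfilt(sos, sig) ** 2))

print(f"microseism band rms: {band_rms(y, 0.05, 1.0, fs_low):.2f}")
print(f"1-5 Hz band rms:     {band_rms(y, 1.0, 5.0, fs_low):.2f}")
```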
-
A separate analysis calculates direction / speed of seismic disturbances
by comparing phases from FFT's of the various seismometers.
-
A 12.5-hour history of seismic activity from several monitors at Hanford
indicates a clear increase in motion during working hours.
-
A similar plot from Livingston from two monitors shows the passage of two
trains separated by a few hours.
-
For detecting and classifying transients, a bank of three filters has been
created, corresponding to different time scales. An algorithm searches
for peaks in each filter output, checks for coincidences, and accumulates
statistics. The algorithm automatically adjusts peak-defining thresholds
based on recent history.
-
Plans for the future include delivery of band-limited rms monitor by April
16, extension of the seismic characterization tools, and speeding up of
the filter bank analysis via decimation.
Wavelet Analysis & Line Removal (Sergey Klimenko - Florida)
-
This was one half of a talk, the other part of which was given in the ASIS
working group session.
This part focussed on line removal using wavelet algorithms.
-
For removing a line of a particular frequency, the data is resampled at
a rate close to the target value, using polynomial interpolation.
-
Optimal filtering in the Fourier domain has been tried. It is possible
to reconstruct the original waveform fully from the filtered data, knowledge
of the line source filter used, and the derived coefficients.
-
In precisely determining a given line, its harmonics are also examined.
-
Examples were shown of spectra in the 1994 40 meter data before and after
line removal of power line harmonics and the 582.4 Hz violin mode.
-
The amplitude and phase of a line can also be tracked vs time.
-
The code is already in C++ form, so porting it to the DMT shouldn't be
a major undertaking.
-
Sergey plans to deliver a wavelet toolbox package for the DMT by late summer.
He has signed up to deliver a transient detection routine by June 2001,
based on wavelet analysis, and he is interested in providing the line monitoring/removal
code in DMT form too.
Operational State and Servo Instability DMT Software (KR - Michigan)
-
DMT code is being written to allow specification of the interferometer
state (e.g., require in-lock, quiet, high laser intensity, etc.). These
are conditions specified on a particular channel or as a Boolean combination
of previously defined conditions. Conditions can be defined via C++ code
or via a configuration file read during run-time. The code was estimated
to be ready by April 8.
-
Code is also being written to detect servo instability, meaning true instability
(runaway behavior) or merely excessive gain, leading to a distorted power
spectrum, including excitation of resonances out of the servo's bandwidth
(e.g., excitation of "butterfly" test mass modes). Again, a run-time configuration
file can define channels to be monitored and the nature of the instability
to watch for. Trigger levels will be defined, depending on the severity
of the detected instability. This code was estimated to be ready by April
30.
Optical Sensing Matrices (Steve Penn - Syracuse)
-
The methodology for determining the sensing and actuation matrices of the
test masses was described. The OSEM system provides four photodiode readings
from the test mass face and one from the side, ideally allowing determination
of pitch, yaw, position, and transverse motion via a sensing matrix with
simple elements of zero, 1 and -1. Similarly, an actuation matrix converts
these into coil currents to push on magnets at nearly the same locations
as the photodiodes. Again, that matrix ideally has a simple form.
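-
The ideal sensing matrix can be written down explicitly (sign conventions below are illustrative; the real wiring determines them):

```python
import numpy as np

# Ideal OSEM sensing matrix: rows are (position, pitch, yaw, side); columns
# are the four face sensors (UL, UR, LL, LR) and the side sensor.
S = np.array([
    [1,  1,  1,  1, 0],    # position: all face sensors move together
    [1,  1, -1, -1, 0],    # pitch: top sensors vs bottom sensors
    [1, -1,  1, -1, 0],    # yaw: left sensors vs right sensors
    [0,  0,  0,  0, 1],    # transverse: side sensor only
]) / np.array([[4], [4], [4], [1]])

# Pure pitch motion: top sensors +1, bottom sensors -1, side unchanged
readings = np.array([1.0, 1.0, -1.0, -1.0, 0.0])
dof = S @ readings
print(dof)   # only the pitch entry is nonzero
```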
-
In practice, there are non-ideal couplings in both the sensing and actuation,
requiring an empirical determination of corresponding matrices to recover
orthogonality in sensing and control of the test mass degrees of freedom.
-
A power spectrum for the raw position, pitch, yaw, and transverse signals
using the naive ideal sensing matrix shows the expected large peaks at
the known resonances for those motions, but one can also see significant
contamination of each peak in the signals for the other motions, indicating
coupling. The coupling between raw pitch and position is essentially unavoidable
because the mass traverses the arc of a circle. Correcting for this calculable
effect does help a bit in decoupling position and pitch, as seen in revised
spectra, but more cleaning is necessary.
-
This cleaning is essentially a frequency-dependent eigenvalue/eigenvector
problem dependent upon the measured power spectral densities of the channels
and their cross-spectral densities.
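-
A schematic of this frequency-by-frequency diagonalization (my reading of the description; the actual decoupling code may differ): build the cross-spectral density matrix of the channels at each frequency and take its eigen-decomposition. Channel contents and coupling coefficients below are invented.

```python
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(4)
fs, n, nfft = 256, 65536, 1024
a = rng.standard_normal(n)               # underlying "position" motion
b = rng.standard_normal(n)               # underlying "pitch" motion
chans = [a + 0.3 * b, 0.3 * a + b]       # sensed channels with cross-coupling

# Cross-spectral density matrix at each frequency...
m = len(chans)
f = csd(chans[0], chans[0], fs=fs, nperseg=nfft)[0]
C = np.zeros((f.size, m, m), dtype=complex)
for i in range(m):
    for j in range(m):
        C[:, i, j] = csd(chans[i], chans[j], fs=fs, nperseg=nfft)[1]

# ...then an eigen-decomposition, batched over the frequency axis
w, v = np.linalg.eigh(C)
print(f"eigenvalues real and non-negative: {bool((w > -1e-12).all())}")
```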
-
Applying the technique does clean up the data considerably, reducing the
secondary peaks at the "wrong" resonances in each channel. However, although
one can orthogonalize the side sensor component in offline software, the
present amplifier boards do not support inclusion of that signal in controlling
position, pitch and yaw. The boards are being upgraded to allow complete
orthogonalization soon.
-
Work has been carried out in addition to relate the optical lever pitch/yaw
measurements to those derived from the OSEMs by way of a simple rotation/translation
transformation. This is determined via a fit to trajectories in pitch/yaw
space.
Analysis of Fall 1999 40-Meter/TAMA Coincidence Data (Walid Majid -
LIGO-CIT)
-
(KR gave the presentation for Walid, who had to cancel his trip at the
last moment.)
-
Two several-day data runs were carried out at the 40 meter in fall 1999
by Dick Gustafson (Michigan) and Steve Vass (CIT) with support from the
Caltech LIGO team. The first of those runs coincided with an engineering
data run by the TAMA interferometer.
-
A small group of LSC physicists is starting to look at the data and plans
to carry out a coincidence analysis with a group of TAMA physicists. Other
LSC physicists are welcome to join in this effort.
-
The duty cycle of 40 meter locks was about 90% in the evenings and varied
from 50% to 80% during the day. Conditions improved during the run, which
occurred mostly over a weekend. The longest lock period was 1900 seconds,
with a median of about 600 seconds. There are approximately 30 hours of
coincidence data between the 40 meter and TAMA (which did not attempt to
run around the clock).
-
Walid is developing software tools that look at the data in an automated
way and exploit the existing LDAS database.
-
Issues that the analysis group will look at (some of which Walid has already
made a first stab at) include non-stationarity and non-Gaussianity of the
data, burst events, time series amplitude outliers, and matched filtering
methods.
-
A plot of GW ADC counts from one short segment of data shows good Gaussianity
for the bulk, but significant tails at low and high count values. There
are a greatly disproportionate number of outliers beyond 15 sigma.
-
The general improvement in lock duty cycle with time is echoed in a
generally decreasing rms motion through the run.
-
A formal analysis proposal to the LSC is in preparation, and discussions
with TAMA are underway.
Transient Identification (Julien Sylvestre - LIGO-MIT)
-
An analysis searching for transients in the recent 40 meter data
was described.
-
An analysis pipeline has been set up, based on threshold cuts in the time-frequency
domain, where a list of times is generated for each transient candidate,
allowing for subsequent display of the original time domain data.
-
A time-frequency plot was shown, illustrating a clear transient.
-
The thresholding is based on exceeding a Rayleigh probability distribution
cut in power for a given narrow frequency band.
-
To go further, transients are sought in several detector channels, and
a trigger defined by coincidence of excess power in more than one channel
in the same frequency band.
-
An example was shown of multiple time domain plots, all showing a sudden
excess power in the 80-250 Hz band.
-
A number of different models are under consideration for transient identification,
including allowing for time delays and non-linearity. One guide in choosing
among them is the natural time scale of transients to be identified.
Adaptive Denoising Techniques (Eric Chassande-Mottin - AEI Potsdam)
-
While matched filter techniques are optimal for Gaussian, stationary data,
they may be non-optimal for more realistic noise. An alternate method
that is more robust, model independent and computationally inexpensive
was described, based on adaptive linear prediction techniques.
-
One tries to minimize the mean square error between measurements and predictions
based on preceding measurements (e.g., assuming the presence of a line
source disturbance). A steepest descent approach is used to find the minimum,
following the gradient and not imposing a model requirement on the random
noise.
-
The technique has a number of adjustable parameters which must be tuned
to optimize performance and depend on the type of noise source being removed.
Line and ringdown shapes have been considered so far.
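The adaptive prediction idea above can be sketched in a few lines of Python. This is not Eric's implementation; it uses the normalized (NLMS) variant of the steepest-descent update for stability, and the filter order and step size are the kind of adjustable parameters the talk mentioned. A predictable line component is learned and subtracted, leaving the unpredictable broadband noise in the residual.

```python
import numpy as np

def nlms_residual(x, order=32, mu=0.2):
    """Adaptive linear prediction: predict each sample from the preceding
    `order` samples, updating the weights by a normalized steepest-descent
    (NLMS) step on the squared prediction error. Returns the residual,
    i.e. the data with the predictable (line-like) part suppressed."""
    x = np.asarray(x, dtype=float)
    w = np.zeros(order)
    resid = np.zeros(len(x))
    for n in range(order, len(x)):
        past = x[n - order:n][::-1]
        e = x[n] - w @ past          # prediction error = denoised sample
        resid[n] = e
        w += mu * e * past / (past @ past + 1e-12)
    return resid

# Demo: a 60 Hz line of amplitude 2 buried in unit-variance white noise
fs = 1024
rng = np.random.default_rng(2)
t = np.arange(8 * fs) / fs
x = 2.0 * np.sin(2 * np.pi * 60.0 * t) + rng.normal(size=t.size)
resid = nlms_residual(x)
```

After convergence the residual variance approaches the noise floor, while the line power is adaptively removed without any model of the noise statistics.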
-
Examples were shown of before- and after-correction time series and time-frequency
plots.
-
It was suggested that this technique might be useful for data compression
also.
Violin Mode Analysis (Robert Coldwell - Florida)
-
In principle, any data signal can be created from a series of driving-force
impulses on a violin mode. The possibility of creating a gravitational
wave artifact in this way has been investigated.
-
Examples of pulse superposition were shown to illustrate the effect. Fortunately,
to mimic a waveform with significant off-peak frequency content requires
a large number of pulses with particular initial conditions.
-
This study seems to confirm that a series of accidental pulses is exceedingly
unlikely to give a waveform easily confused with a gravitational wave transient.
Line Removal Techniques (Bernard Whiting - Florida)
-
The Florida group is attacking line noise on several fronts (see talks
by Klimenko and Coldwell above). This presentation addressed comparisons
of different techniques for quantifying line noise developed by Allen &
Ottewill (see above), Schutz & Sintes, Klimenko (see above), and Finn
& Mukherjee (see August 99 LSC meeting minutes/transparencies).
Not all of these methods can be applied both to 50/60 Hz harmonics and
to violin modes.
-
In particular, one must keep in mind the natures of these two "line" sources:
-
50/60 Hz & harmonics are controlled by an external agent, can have
a time-dependent frequency, and are strongly coherent with nearly constant
amplitude and very slowly varying phase. However, the coupling of the harmonic
into a given data channel may not be stationary.
-
Violin modes are affected by local controls, show time dependence also,
and are inherently stochastic with large variations in amplitude and phase.
-
Both the 1994 40 meter data and some Glasgow data (which has a prominent
line at 725.1 Hz from a turbo pump) have been studied.
-
Histograms of Fourier transform components for frequency bins neighboring
a 60 Hz or 50 Hz harmonic show a dramatic improvement in Gaussianity after
a line removal algorithm is applied.
-
Monitoring the calculated amplitude and phase of both mains harmonics and
violin modes shows large variations in frequency over short periods
O(100 seconds) in the 40 Meter data. One also sees very large fluctuations
in violin mode amplitudes, as expected, and small fluctuations in the mains
amplitudes.
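Amplitude and phase tracking of this kind can be illustrated with a standard heterodyne, sketched below under simplifying assumptions (a known nominal line frequency and block averaging; not the methods compared in the talk): mix the data against the nominal frequency and average, giving one slowly varying amplitude/phase estimate per block.

```python
import numpy as np

def track_line(x, fs, f0, block=1.0):
    """Heterodyne x against the nominal line frequency f0 and average
    over blocks of `block` seconds, returning one (amplitude, phase)
    estimate per block; the fast 2*f0 term averages away."""
    t = np.arange(len(x)) / fs
    z = x * np.exp(-2j * np.pi * f0 * t)
    n = int(block * fs)
    nblk = len(x) // n
    z = z[:nblk * n].reshape(nblk, n).mean(axis=1)
    return 2.0 * np.abs(z), np.angle(z)

# Demo: a 60 Hz line of amplitude 3 and phase 0.5 rad in weak noise
fs = 1024
rng = np.random.default_rng(3)
t = np.arange(16 * fs) / fs
x = 3.0 * np.cos(2 * np.pi * 60.0 * t + 0.5) + 0.1 * rng.normal(size=t.size)
amp, phase = track_line(x, fs, 60.0)
```

Plotting the per-block amplitude and phase against time is the kind of monitor that reveals the large violin-mode fluctuations versus the small mains-amplitude fluctuations reported above.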
-
Similar variations in violin mode amplitude are seen when the effective
noise temperature (K) is plotted over these time scales.
-
Various statistical tools have been used to measure the success of line removal,
including a power/variance ratio (presented by Gonzalez at the August 99 LSC
meeting), a likelihood ratio for Gaussianity (a joint development of UR &
ANU), and a fitting technique (Coldwell).
-
Planned studies include separate estimations of coherent and stochastic
contributions to mains harmonics, statistical analysis of Kalman filter
output (Mukherjee method), and perhaps investigation of the Chassande-Mottin
technique.
-
A long discussion ensued concerning measuring non-linear influences related
to mains harmonics, for example, sidebands (see Lazzarini talk above) and
whether the line removal techniques under consideration adequately take
into account such effects. Bernard expressed interest in looking into this.
Analysis of LLO Seismic Data (Dick Greenwood - Louisiana Tech)
-
Seismic data from the Livingston site has been analyzed using time-frequency
methods. The data were taken with borrowed Guralp seismometers, which are
three times more sensitive than the standard LIGO monitors in the LIGO monitoring
bands and much more sensitive at lower frequencies.
-
The rms in a 0.05-0.20 Hz band was computed over 15-minute
intervals. Plots of the rms vs time from several distributed sensors
show a very large increase in activity in the aftermath (3-4 hours) of
a magnitude 7.5 earthquake in Mexico.
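A band-limited rms of this kind can be sketched in Python (an illustration, not the analysis used; the brick-wall FFT band selection and the short demo interval are assumptions of the sketch, with the real analysis using 15-minute intervals):

```python
import numpy as np

def band_rms(x, fs, flo, fhi, interval):
    """rms of x restricted to the band [flo, fhi] Hz, one value per
    `interval` seconds, using an FFT brick-wall band selection."""
    n = int(interval * fs)
    f = np.fft.rfftfreq(n, 1.0 / fs)
    keep = (f >= flo) & (f <= fhi)
    rms = []
    for i in range(len(x) // n):
        X = np.fft.rfft(x[i * n:(i + 1) * n])
        X[~keep] = 0.0                       # zero everything out of band
        rms.append(np.std(np.fft.irfft(X, n)))
    return np.array(rms)

# Demo: an in-band 0.1 Hz tone plus an out-of-band 1 Hz tone; only the
# 0.1 Hz tone (amplitude 1, rms 1/sqrt(2)) survives the band cut
fs = 10.0
t = np.arange(int(2000 * fs)) / fs
x = np.sin(2 * np.pi * 0.1 * t) + np.sin(2 * np.pi * 1.0 * t)
rms = band_rms(x, fs, 0.05, 0.20, interval=100.0)
```

Tracking this band-limited rms over successive intervals is what makes the post-earthquake increase in microseismic activity stand out.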
-
This increased activity, phase information, and known positions of the
monitors permit reconstruction of the dominant seismic wave direction and
speed vs frequency, using analysis software (the Seismic Analysis Code,
SAC, developed at LLNL).
-
Contour plots of northern and eastern "wave numbers" (cycles/km) show striking
differences between data taken just before the earthquake and that during
its aftermath, both in compressional and vertical oscillation. The wave
numbers measured during the active period are consistent with the epicenter
of the quake.
-
The ambient microseismic activity (non-quake) is found to be consistent
with the early measurements carried out by Rohay.
A.O.B.
-
Stan Whitcomb invited LSC members to participate directly in an engineering
data run in early April at Hanford with one arm of the 2-km interferometer,
both to lend a hand and to learn more about the interferometer and its
data. KR promised to recruit volunteers.