FAQ
Frequently Asked Questions concerning run_survey
July 10, 1996
I noticed that in your test data for page4 you used rcm5 as a reference
station. Is this because its rinex *n file is in daily format, rather than
hourly format like ccv1 and eky1? Is there any way to process 24 hours of
data using a different station as reference, say ccv1? I guess what I'm asking
is whether there is a program like cato that works for rinex *n files (if one
is necessary).
Response
The RINEX n file is used for one or possibly two purposes. It is always
used in mergedb to rotate the phases and ranges from different sites
to common times. You can also specify that the broadcast ephemeris,
also contained in the "n" file, be used as the only ephemeris (I can tell you
how to do this if you are interested, but I'd have to think about it for
a minute). In both cases, you should have an "n" file which covers the
time span of the data.
The info contained in the RINEX n file is generic: any satellite broadcasts
the identical info about all other satellites. Therefore any "n" file
which covers the data time span will do. (The exception is the brief period
when new info is being uploaded.)
We have a program called rinbal which properly concatenates "n" files
together. I didn't build it and don't know anything about it, but can
track it and its source code down if you are interested.
Our data center (Geosciences Lab - maybe CORS but I'm not sure) provides
a global "n" file called globlDDD.orb where DDD is the day-of-year. The
IGS data center at the cddis has a similar product called brdcDDD.YYn where
DDD is the day-of-year, as before, and YY is the year. CORS may provide
the globlDDD.orb file.
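To illustrate the two naming conventions just described, here is a small helper of my own (not part of the software) that builds both file names from the day-of-year and two-digit year:

```python
def broadcast_names(doy, yy):
    """Return (globlDDD.orb, brdcDDD.YYn) names for a day-of-year and 2-digit year.

    Illustrative only; the real files come from the Geosciences Lab / CORS
    and the IGS data center at the cddis, as noted above.
    """
    return ("globl%03d.orb" % doy, "brdc%03d.%02dn" % (doy, yy))
```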
Remember that any "n" file will do, so rcm5, wes2, tmgo, brmu, or the
"weather" sites lmno, denc, wsmn, ... will work.
Finally, I chose rcm5 as a reference because it is an IGS global tracking
site with good coordinates in the ITRF. Using the rcm5 "n" file was
simply a convenience: I wanted the "o" file, and the "n" file is there.
July 11, 1996
In order for me to process Cedar Key data in pages, I believe I
need to create a POM file for the site. Could you tell me what the format is
for the POM files? Also, if there is any other new info I may need before
processing, could you please let me know?
Response
The pom file is very simple: position / offset / met / antenna. Using
rcm5.pom as an example:
961334.762 -5674074.250 2740535.196
.000 .000 .110 -.018
15.000 980.000 75.000
4
The first line is the geocentric XYZ coordinates.
The second line is the offset in local north/east/up plus the L1-L2 phase
center difference in height. I see that rcm5 has a 0.11 m vertical offset.
This tells me that the coordinates are for the antenna reference point (ARP).
The IGS has a text file showing these offsets from ARP's, and L1 and L2
offsets. I'll include that below.
The third line gives the temperature [C], pressure [mbar], and humidity [%].
The fourth line is the antenna ID. This number should match up with an
entry in the antenna phase correction file (phscor). A "4" shows this
site to have a Dorne Margolin T antenna, which matches the 0.110 and
-0.018 in the offset line.
Make sure the coords, offset and antenna are all consistent.
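For reference, the four-line layout above could be read with a short sketch like this. The function and field names are my own inventions, not part of pages; the actual software may store these quantities differently:

```python
def parse_pom(text):
    """Parse a pom file: position / offset / met / antenna (sketch only)."""
    lines = [ln.split() for ln in text.strip().splitlines()]
    return {
        "xyz": tuple(float(v) for v in lines[0]),             # geocentric X, Y, Z [m]
        "offset_neu": tuple(float(v) for v in lines[1][:3]),  # local north/east/up [m]
        "l1l2_dh": float(lines[1][3]),       # L1-L2 phase-center height difference [m]
        "met": {"temp_c": float(lines[2][0]),      # temperature [C]
                "press_mbar": float(lines[2][1]),  # pressure [mbar]
                "humid_pct": float(lines[2][2])},  # relative humidity [%]
        "antenna_id": int(lines[3][0]),      # index into the phscor file
    }

rcm5_pom = """\
  961334.762 -5674074.250  2740535.196
        .000         .000         .110  -.018
   15.000  980.000   75.000
   4
"""
pom = parse_pom(rcm5_pom)
```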
Many antennas have been calibrated recently (like the values in the phscor
file), but those results have gone into a new "standard format" file
distributed July 1. Unfortunately, the antenna names and file formats are
still in flux at this time.
In principle you simply need the RINEX data from your new and a known
site, poms, and an ephemeris, then you're set to go. Depending upon
how good your initial coords are you may need to do a couple iterations
(run_survey - update the pom file - clean directory - run_survey).
Within a few meters is good; within a few hundred meters things
start to get interesting. If any data makes it through the automatic
editing you'll probably get a better position from the adjustment.
If not or if no data makes it through the auto-edit, let me know and
I'll send you a couple options to try.
Once you have a few days of good data and solutions under your belt, you
probably should consider starting a site info file to keep a history of
changes to the site. pages can read this file and process any epoch
appropriately.
July 12, 1996
Is there maybe a way to modify the
values in doofussp to allow for high a priori estimates for a site?
Response
doofuss is controlled from a doofuss.inp file. When these are generated
automatically, they are stored in the inpt subdirectory under the work
(yy_doy) directory and will have names like doofuss.cccc where cccc is
the four character database ID. The first line of this file controls
the editing parameters.
When created automatically, the parameters on that first line are taken
from the file doofussp in the files directory. doofussp is literally the
first line of the doofuss.inp file. The doofussp file contains good numbers
for a variety of baselines with the assumption that the positions are good.
The files directory is at the same level as the work directories. For
example, in that sample I put in with the run_survey executables, you
should see a directory structure like:
      data  inpt  plts
         \    |    /
          96_085      files
               \       /
                \     /
                 Test
There are two simple things to try...
- modify a copy of run_survey so that doofuss is not run at all
and reprocess the day just like before except with the new script
You may have to manually edit the database though. I can talk you
through that very easily.
- copy doofussp to doofussp.sav then modify doofussp. I would try
increasing the second parameter. Now I believe your doofussp should
have numbers like
0 0.015 10 20 3 5 .30 0.5 10 0
try
0 0.045 10 20 3 5 .30 0.5 10 0
Then reprocess the data normally. That second parameter is the maximum
rate of change for double-difference residuals in cycles/second.
(I can't remember if there is a multiplier applied, but that is a secondary
issue.) What you're telling the program is that the double-difference
residuals can change 3 times faster, as when there is large curvature
from a poor initial position. If that doesn't get you something, try
0.15. These parameters are:
C LPLT PLOT SWITCH (1=PLOTS, 0=NO PLOTS)
C
C CYCLE SLIP DETECTION PARAMETERS
C
C DDRLIM DOUBLE DIFFERENCE RATE (cy/s) ABOVE WHICH CYCLE SLIP OCCURS
C
C CYCLE SLIP CORRECTION PARAMETERS
C
C NSPAN NUMBER OF POINTS ON EITHER SIDE OF SLIP FOR POLYNOMIAL FIT
C NGAP SIZE OF GAP TO BE INSERTED WHEN CORRECTIONS FAIL
C NMIN MIN NUMBER OF GOOD POINTS NEEDED ON EITHER SIDE OF SLIP
C ISRCH +- INTEGER SEARCH RANGE IN L1 AND L2
C W1 WEIGHTING FACTOR FOR L1, L2 AND WL
C WSUMMX MAX VALUE FOR WEIGHTED SUM OF FIT DIFFERENCES. ELSE PUT GAP
C NSEGMN MIN NUMBER OF POINTS NEEDED FOR SEGMENT - OTHERWISE DELETED
C JX 0 OR SV # OF SV TO BE DIAGNOSED IN TEST FILE
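The suggested tweak (save a copy, then triple the second parameter) can be sketched in a few lines of Python. This helper is my own, written under the assumption that doofussp is a single line of whitespace-separated numbers with DDRLIM in the second field:

```python
import shutil

def scale_ddrlim(path, factor=3.0):
    """Copy path to path.sav, then multiply the 2nd field (DDRLIM) by factor."""
    shutil.copy(path, path + ".sav")        # keep the original values, as suggested
    with open(path) as f:
        fields = f.read().split()
    fields[1] = "%.3f" % (float(fields[1]) * factor)   # e.g. 0.015 -> 0.045
    with open(path, "w") as f:
        f.write(" ".join(fields) + "\n")
```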
In either case, once you get an estimate for Cedar Key, you should put
the updated coords in the pom file and reprocess one more time (this
time with the original doofussp, since your coords should now be pretty good).
That final iteration should have plenty of obs and coords good to a
few centimeters. For the number of obs, I always use the rule of thumb
five double-differences per epoch; so four hours of 30sec data should
give you something like:
5 dd/epoch * 1 epoch/30 sec * 3600 sec/hr * 4 hr = 2400 obs
or
5 dd/epoch * 1 epoch/30 sec * 3600 sec/hr * 24 hr = 14400 obs
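That rule of thumb is easy to encode; this little helper is my own, just the arithmetic above:

```python
def expected_obs(hours, rate_s=30, dd_per_epoch=5):
    """Rough expected observation count: ~5 double differences per epoch
    at the given sampling rate (seconds) over the given span (hours)."""
    return dd_per_epoch * (3600 // rate_s) * hours
```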
Starting up a new site will be the most frustrating thing you'll do.
Everything after this is simply typing run_survey then doing science.
August 26, 1996
Is there a way to get post-fit RMS per baseline from
pages?
Response
The bad news is that when I modified list to read the "new" format page4.sum
file, the RMS by baseline was lost. The good news is that the information is
still in the page4.sum file. Open up the page4.sum and search for
"POST-FIT". I think right under the block containing "POST-FIT RMS:"
the statistics for each baseline should be listed in table/matrix form.
Look for the baseline you're interested in. The first column is the
OVERALL POST-FIT RMS for that baseline. This carries an assumption that
you're doing a complete run on the second pass (a zero in the first line
of the page4.inp file). I believe that is the way we left it.
The following is an example of a page4.sum file to give you an idea. Note
the "BLOCK= 1 OVERALL" and related column.
--
OVERALL STATISTICS:
A PRIORI RMS: .1740 M FROM 1025 OBSERVATIONS; 27 OMITTED
POST-FIT RMS: .0122 M FROM 1025 OBSERVATIONS; 27 OMITTED
STD. ERROR: .1223 M FROM 1025 OBSERVATIONS; 27 OMITTED
WEIGHTED MEAN TIME = 96/07/22 02:44:35.86 ( 50286.11430391)
FREE PARAMETERS = 39; PORTION OF MATRIX USED = 41
------------------------------------------------------------------------------
POST-FIT RMS BY SATELLITE VS. BASELINE
BLOCK= 1 OVERALL 01 02 03 04 05 06 07 09
------------------------------------------------------------------------------
fair-brw1| .012 .010 ... ... ... .011 ... .012 .018
RMS BY PRN .010 ... ... ... .011 ... .012 .018
------------------------------------------------------------------------------
14 15 16 17 18 19 20 21 22
------------------------------------------------------------------------------
fair-brw1| .010 .014 ... ... ... ... ... .013 ...
RMS BY PRN .010 .014 ... ... ... ... ... .013 ...
--
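If you end up pulling this number often, the OVERALL column can be scraped from the page4.sum text. This sketch is my own and assumes exactly the layout shown above (baseline rows look like "name-name|" followed by numbers, with the first number in the first block being the overall RMS); it may need adjusting for other versions of the file:

```python
def overall_rms_by_baseline(sum_text):
    """Extract the OVERALL post-fit RMS per baseline from page4.sum text.

    Assumption: the first block is headed "BLOCK= 1 OVERALL", baseline rows
    contain a "|", and the first field after the bar is the overall RMS.
    Only the first occurrence of each baseline row is used, since later
    blocks repeat the baseline name with per-PRN columns only.
    """
    rms = {}
    seen_overall = False
    for line in sum_text.splitlines():
        if "BLOCK=" in line and "OVERALL" in line:
            seen_overall = True
        elif seen_overall and "|" in line:
            name, rest = line.split("|", 1)
            name = name.strip()
            fields = rest.split()
            if name not in rms and fields and fields[0] != "...":
                rms[name] = float(fields[0])
    return rms
```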
We've just about finished with the new version of page4 (officially, you
are running a beta version of that program). I was required to make some
additional changes to page4.sum. Once this is done and the format of
the new page4.sum file is fixed, I'll be free to put some of the old stuff
back in list. We should be putting the new version out for our people this
week. I'll let you know when we start distributing it. Hopefully, page4 will
be static for a while after that.
faq.html
June 24, 1999