TOC | | pages.skl | Pages | Gpscom | INDEX
Helmert blocking is a technique for breaking a least squares adjustment problem that is too large to be managed as a single computation into many smaller computational tasks that can be handled with the resources available. Not only are the individual computational problems smaller, but with a good blocking strategy there is actually much less computation to be done overall, because the technique introduces and takes advantage of sparsity in the normal equation system. The technique is in fact very similar to the method for solving sparse normal equation systems known as nested dissection (George 1973) and, with the right blocking strategy, should have the same advantages.
From a computational point of view it is probably better to break the problem into quite small blocks, i.e. small groups of sites for Pages to process. Small blocks run much faster in the Pages program and usually result in a sparser normal equation system, thus reducing the overall computational task. However, the concept of making many small blocks, each individually processed with the Pages program, does have a potential disadvantage. The double differencing method used by Pages introduces correlations between the double difference observations, and the Pages program has an algorithm (Hilla and Milbert 1989) that corrects for these correlations. For the decorrelation method to work, all observations at a given epoch must be handled simultaneously by the Pages program. With the Helmert blocking method, the correlations of double difference observations on a baseline that spans two blocks cannot be handled completely correctly. In practical tests we have performed with blocked data, however, the differences between a Helmert blocked adjustment and one in which all data were processed through Pages, so that all correlations could be handled correctly, were within the normal noise level of our GPS adjustments.
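The reason all observations at an epoch must be handled together can be illustrated with a small sketch. This is not the Pages or Hilla-and-Milbert code; the differencing matrix and noise level below are hypothetical, chosen only to show that double differencing produces a fully populated covariance that a Cholesky factor can whiten.

```python
import numpy as np

# Assume 4 independent single differences at one epoch, each with unit
# variance. Double differencing against a reference satellite is a
# linear operator D (hypothetical differencing scheme).
sigma2 = 1.0
D = np.array([[1., -1.,  0.,  0.],
              [1.,  0., -1.,  0.],
              [1.,  0.,  0., -1.]])

# Covariance of the double differences: fully populated, not diagonal,
# which is why observations at an epoch must be processed together.
C = sigma2 * D @ D.T

# Decorrelate (whiten) with the inverse Cholesky factor of C.
L = np.linalg.cholesky(C)
dd = D @ np.array([0.3, -0.1, 0.2, 0.05])   # some double differences
dd_white = np.linalg.solve(L, dd)           # whitened observations

# Covariance of the whitened observations is the identity.
C_white = np.linalg.solve(L, np.linalg.solve(L, C).T)
```

If a baseline's observations are split across two blocks, the corresponding off-diagonal entries of C are simply unavailable to either block, which is the approximation discussed above.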
While the above discussion describes one shortcoming of the Helmert blocking system as we are using it, the counterpoint is that it allows us to solve much larger problems than we could otherwise handle and to include much larger amounts of real data in these adjustments. While the numerical tests done thus far do not prove that the technique will always provide good results, they do indicate that it is working well in our current applications. We will continue to monitor the pros and cons of the technique, and we hope to show in the near future that the large amount of data we will be able to handle produces even better results than we have achieved before.
One geographically based method of blocking the data is to divide the sites in the network into geographical regions containing approximately equal numbers of sites. Each region can then be processed independently with the Pages program, as is usually done for any network, with one exception. In order to connect the regions into one complete network, one baseline should be selected to connect two adjacent regions, and the observations from this baseline should be assigned to only one of the adjacent regional blocks. The site of that baseline not originally included in the block receiving the observations must also be added to that block; it thus becomes a junction site by virtue of being in more than one block. Each region needs to be connected in this way to only one adjacent region, but the connections should cause all regions to become one complete network. While this may be one of the simplest strategies, and many others are possible, each needs to follow a few simple rules:
1) The same observations cannot be processed in more than one block. This is perhaps easiest to achieve if the same baseline is not included in more than one block.
2) The final network formed by the network design internal to the blocks, combined with the baselines that cross block boundaries, should not contain closed loops in the baseline connections.
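Rule 2 says the connecting baselines must join the regions into one network without forming a loop, i.e. they must form a spanning tree over the regions. The check below is a sketch of how a blocking plan could be validated before processing; it is not part of Pages or Gpscom, and the region names are hypothetical.

```python
def check_block_connections(regions, connecting_baselines):
    """Return True if the inter-block baselines join all regions into
    one network with no closed loops (i.e. they form a spanning tree)."""
    parent = {r: r for r in regions}

    def find(r):                      # union-find with path compression
        while parent[r] != r:
            parent[r] = parent[parent[r]]
            r = parent[r]
        return r

    for a, b in connecting_baselines:
        ra, rb = find(a), find(b)
        if ra == rb:                  # regions already connected: loop
            return False
        parent[ra] = rb               # merge the two partial networks
    roots = {find(r) for r in regions}
    return len(roots) == 1            # one complete connected network

regions = ["NE", "SE", "NW", "SW"]            # hypothetical regions
tree = [("NE", "SE"), ("SE", "SW"), ("SW", "NW")]   # valid: a tree
loop = tree + [("NW", "NE")]                         # adds a closed loop
```

For n regions a valid plan always has exactly n-1 connecting baselines; any extra baseline necessarily closes a loop.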
If one needs to process data spanning several days, or observing sessions, the above procedure can be applied to each day's observations, which can then be combined with Gpscom. The resulting normal equation files from the daily combinations can then be combined with Gpscom to produce a final global adjustment of all the data.
When the Pages program is run for a combined adjustment, the parameters can generally be grouped into three categories. Parameters that are included in the output normal equation file to be carried forward into another combined adjustment are referred to as global parameters, whereas parameters that are reduced and eliminated from the matrix and the normal equation file in the current adjustment are referred to as disposable or sometimes nuisance parameters. These are parameters that must be included in the GPS observation equations but whose final values, after data editing and initial processing, are no longer of interest in a combined adjustment. Bias terms are typical of this type of parameter. A third category of parameter, called "local parameters", is intermediate between the other two: local parameters do not need to be carried forward into a combined global adjustment, but their adjusted values are of interest. An example of this type of parameter would be the daily orbit parameters included in a longer term weekly or yearly combined adjustment. The output normal equation file can be made by either Pages or Gpscom and is always created before any constraints involving the global parameters are applied. This allows the final set of constraints to be defined at the time of the final adjustment.
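The "reduce and eliminate" step for nuisance parameters is the standard Schur-complement reduction of the normal equations. A minimal numerical sketch, assuming a made-up design matrix and an arbitrary split into global and nuisance blocks (nothing here is Pages or Gpscom internals), shows that eliminating the nuisance block leaves the global estimates unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(12, 5))            # hypothetical design matrix
N = A.T @ A + 0.1 * np.eye(5)           # normal matrix, kept well-conditioned
b = A.T @ rng.normal(size=12)           # right-hand side

g, n = slice(0, 3), slice(3, 5)         # 3 global, 2 nuisance parameters

# Reduce and eliminate the nuisance block (Schur complement):
#   N_red = Ngg - Ngn Nnn^-1 Nng,  b_red = bg - Ngn Nnn^-1 bn
Nnn_inv = np.linalg.inv(N[n, n])
N_red = N[g, g] - N[g, n] @ Nnn_inv @ N[n, g]
b_red = b[g] - N[g, n] @ Nnn_inv @ b[n]

x_full = np.linalg.solve(N, b)            # solve everything at once
x_glob = np.linalg.solve(N_red, b_red)    # solve reduced system only
```

The reduced system N_red, b_red is what gets written to the output normal equation file for the global parameters; the nuisance parameters never leave the current adjustment.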
When one wants to put together data sets that have been processed independently with the Pages program and that cover the same time span, one must define the junction sites. Junction sites are those which, due to double differencing with more than one additional site, end up being processed by Pages in multiple separate executions. These sites will have troposphere parameters defined in each of the Pages processing steps in which they occur. In order to get a single common estimate of the troposphere parameters for such a site, those parameters must be carried forward as global parameters into an adjustment by Gpscom, where all of the contributions to the troposphere parameters can be combined. At that point the troposphere parameters can be reduced and eliminated from the matrix as "nuisance" parameters. The names of the junction sites are given to the Pages program using the option "MAKE SITE TROPO GLOBAL" (in older versions of the program this was called "DEFINE JUNCTION SITES"). When this is done, the troposphere parameters associated with the selected junction sites will be defined as global parameters and included in the normal equation file to be used in a combined adjustment by the program Gpscom. This allows a rigorous adjustment of data for a given time span that has been processed in multiple runs of the Pages program, so that many more tracking sites can be adjusted than could easily be accommodated in one execution of the Pages program.
A time span of one day is usually used in this type of processing, and the day's data are divided into groups of stations. Stations that, for double differencing purposes, are included in more than one group are defined as junction sites, and the data span and troposphere interval must be the same for each group of data to be combined. When all of the data for the day have been processed by Pages, the data sets can be combined by Gpscom and the troposphere parameters eliminated as nuisance parameters. The resulting normal equation file for that day's data can then be combined with the data from other days in another adjustment by Gpscom.
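Combining the groups amounts to summing the normal equation contributions for parameters with the same name, so a junction site's troposphere parameters accumulate contributions from every group it appears in. A sketch of that stacking, with hypothetical parameter names and tiny hand-made matrices (not the Gpscom file format or code):

```python
import numpy as np

def stack_normals(sets):
    """Combine independent normal-equation sets by summing the
    contributions of parameters with the same name. Each set is
    (param_names, N, b); junction parameters appear in several sets."""
    names = []
    for pnames, _, _ in sets:
        for p in pnames:
            if p not in names:
                names.append(p)
    idx = {p: i for i, p in enumerate(names)}
    N = np.zeros((len(names), len(names)))
    b = np.zeros(len(names))
    for pnames, Ni, bi in sets:
        sel = [idx[p] for p in pnames]
        N[np.ix_(sel, sel)] += Ni     # scatter-add this set's normals
        b[sel] += bi
    return names, N, b

# Two groups sharing the junction parameter "J" (hypothetical values):
set1 = (["A", "J"], np.array([[2., 1.], [1., 3.]]), np.array([1., 2.]))
set2 = (["J", "B"], np.array([[4., 1.], [1., 2.]]), np.array([3., 1.]))
names, N, b = stack_normals([set1, set2])
```

Parameters that never share a group ("A" and "B" here) get zero off-diagonal entries, which is exactly the sparsity that Helmert blocking exploits.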
If the "MAKE SATELITES GLOBAL" option is specified for the Pages program, the satellite parameters will be carried forward into the normal equation file to be used in a combined adjustment in the Gpscom program. When satellite parameters are defined as global parameters, the time spans of the groups of data being combined need careful consideration and in most cases must match. This is because the satellite parameter identifiers assigned by the Pages program are unique only to the particular parameter type and provide no way to distinguish between different time spans. Thus the data for a given time span must be combined with Gpscom and the satellite parameters eliminated from the matrix before the data can be combined with the data from a different time span. An exception would be if one wanted the satellite orbit parameters to be adjusted over a longer time span, such as two or three days, in which case all of the data could be combined together to get one set of orbit parameters.
If the "MAKE EOP GLOBAL" option is specified for the Pages program, the earth orientation parameters will be carried forward into the normal equation file to be used in a combined adjustment in the Gpscom program. As with troposphere and satellite parameters, the alignment of the time spanned by the adjusted parameters is important. In the case of the EOP parameters, however, multiple days of parameters can be combined. When multiple EOP parameters are solved for in Pages they can be solved for as piecewise linear terms, and the sequence of terms in the piecewise linear model can be extended using the Gpscom program as long as the time boundaries are adjacent and there are no breaks in the sequence. Gaps in the sequence will cause singularities, as only one constraint can be supplied for each parameter type. An alternative to the piecewise linear model is available in Gpscom using the option "DAILY EOP VALUES", which makes the EOP parameters of successive data sets independent of each other.
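The piecewise linear model estimates EOP values at a sequence of time nodes; an observation at time t contributes to the two nodes that bracket it. The helper below is a hypothetical illustration of that parameterization (not the Pages or Gpscom implementation), showing why adjacent days can share a boundary node while a missing day would leave interior nodes unobserved and hence singular.

```python
def pwl_design_row(t, nodes):
    """Partial derivatives of a piecewise-linear EOP value at time t
    with respect to the node values at the given node epochs."""
    row = [0.0] * len(nodes)
    for k in range(len(nodes) - 1):
        t0, t1 = nodes[k], nodes[k + 1]
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)   # fraction of the way into segment
            row[k] = 1.0 - w
            row[k + 1] = w
            break
    return row

# Two adjacent one-day segments share the node at the day boundary 1.0,
# so the sequence extends day by day with no gap (times in days).
nodes = [0.0, 1.0, 2.0]
```

An observation at t = 0.25 loads nodes 0 and 1; one at t = 1.5 loads nodes 1 and 2, so the shared node ties the days together.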
George, A., 1973. Nested dissection of a regular finite element mesh, SIAM J. Numer. Anal., 10, 345-363.
Helmert, F. R., 1880. Die mathematischen und physikalischen Theorien der höheren Geodäsie, 1. Teil, Leipzig, 631 pp.
Hilla, S. A., and Milbert, D. G., 1989. Algorithms for Computing Covariance Matrices for GPS Double Difference Observables, internal memorandum, National Geodetic Survey, NOAA, Silver Spring, Md., USA.
Wolf, H., 1978. The Helmert block method, its origin and development, Proceedings of the Second International Symposium on Problems Related to the Redefinition of North American Geodetic Networks, Arlington, Va., April 24-28, 319-326.