OPUS Projects
User Instructions
and Technical Guide
NOAA | National Geodetic Survey
a. Communication: Robust, low-latency wireless or WiFi communication between the RTN and the rovers is essential while locating points. This technology continues to expand and become more reliable.
b. Check in often on known project marks: Checking in on a known mark, both before collecting new data and during the session, and confirming the expected results builds confidence in the applied RTN correctors and prevents a blunder from biasing the entire initialization.
c. Redundancy needed: Acquiring redundant coordinates on marks within the expected accuracy requirements provides reliable results and confidence in the RTN station coordinates.
d. Avoid multipath: Gathering or staking out project coordinates under varied satellite geometry helps mitigate multipath error. Rover users should avoid collecting data or staking points under tree canopy or near metal structures, water surfaces, signs, or other reflective surfaces above the antenna height.
C. NSRS Validation for RTN
The NGS proposes simple methods by which RTN Administrators can confirm that their network is aligned with the NSRS. This confirmation would pave the way toward a “validation” in which, after a technical review, the NGS would corroborate the RTN station coordinates. Further validation guidelines and dialogue may be adopted over time. If OP is used to adjust an RTN, then NSRS alignment and validation can be easily verified and confirmed.
D. Monitoring RTN Station Coordinates
The RTN Administrator should plan to monitor the positions of all RTN stations using network quality-assurance software for any given day of RTN operation, assuring users that the positions do not vary by more than an expected amount. How much variance or movement in position is acceptable? What are the seasonal effects on station positions? Each RTN should determine acceptable normal limits and, beyond those limits, consider re-positioning the station. For consistency, consider adopting procedures for updating a station’s coordinates and/or velocity if coordinate differences exceeding 2 cm in either horizontal dimension and/or 4 cm in ellipsoid height persist over a period of several days.
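The persistence rule described above can be sketched as a simple daily check. This is only an illustrative sketch, not NGS software: the function name, the choice of three days for "several days," and the coordinate-difference input format are all assumptions.

```python
# Sketch of the coordinate-monitoring rule described above.  Assumed values:
# 2 cm tolerance in either horizontal dimension, 4 cm in ellipsoid height,
# and three consecutive days standing in for "several days."
PERSIST_DAYS = 3     # assumed interpretation of "several days"
H_TOL_M = 0.02       # 2 cm horizontal tolerance
V_TOL_M = 0.04       # 4 cm ellipsoid-height tolerance

def needs_update(daily_diffs):
    """daily_diffs: list of (dN, dE, dU) offsets in meters, one per day,
    relative to the station's published coordinates.  Returns True when a
    tolerance is exceeded on PERSIST_DAYS consecutive days."""
    run = 0
    for d_n, d_e, d_u in daily_diffs:
        exceeded = abs(d_n) > H_TOL_M or abs(d_e) > H_TOL_M or abs(d_u) > V_TOL_M
        run = run + 1 if exceeded else 0
        if run >= PERSIST_DAYS:
            return True
    return False

# A 3 cm northward offset persisting three days triggers an update:
print(needs_update([(0.03, 0.0, 0.0)] * 3))   # True
# A 1 cm offset, however long it persists, does not:
print(needs_update([(0.01, 0.0, 0.0)] * 10))  # False
```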
After completing a new RTN adjustment or updating individual station coordinates, the RTN Administrator should consider publishing each station’s position on the RTN web site. Include metadata about the adopted coordinates and the relative positioning network accuracies for the RTN stations, based on the assumption that the CORSs are errorless. Network accuracies should be published for each station so that users may include those errors when performing local project static network adjustments using the RTN stations. Thus, the local project network error will include the RTN network error. The network error should normally be reported at the 95% (2σ) confidence level.
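The way a user might fold the published RTN network error into a local project error budget can be sketched as follows. This is a simplified illustration assuming the two error sources are independent (so they add in quadrature) and that the 95% level is approximated by the 2-sigma scaling stated above; the function name is invented for this example.

```python
import math

# Combine a published RTN network error with a local survey error,
# assuming independence (quadrature sum) and a 2-sigma approximation
# of the 95% confidence level.
def combined_95(rtn_sigma_m, local_sigma_m):
    """One-sigma errors in meters in; approximate 95% combined error out."""
    return 2.0 * math.hypot(rtn_sigma_m, local_sigma_m)

# e.g. a 5 mm RTN network error and an 8 mm local adjustment error:
print(round(combined_95(0.005, 0.008), 4))  # 0.0189 (meters)
```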
To answer the question "Do users achieve coordinates in the field from an RTN that are consistent with the NSRS?", RTN users should consider establishing passive check-in marks at their offices and on projects so that they can check in with their rovers on a daily basis to confirm alignment with the RTN.
3.6 What's Under the Hood - Processing Baselines with PAGES
PAGES (Program for Adjustment of GPS Ephemerides) is the orbit/baseline estimation software used within OP. Using double-differenced phase as its observable, PAGES should be suitable for most projects requiring the highest accuracy. Many types of parameters can be estimated, including tropospheric corrections, station (mark and CORS) coordinates, linear velocities, satellite vectors, and polar motion. An analysis strategy summary that includes a terse description of the PAGES program is kept at the IGS Central Bureau.
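The double-differenced phase observable mentioned above can be illustrated with a minimal numerical sketch. This is not PAGES code; the function and the phase values are fabricated for illustration. The point is the structure: differencing between two receivers cancels the satellite clock error, and differencing those single differences across two satellites cancels the receiver clock error as well.

```python
# Forming a double-differenced carrier-phase observable from the phases
# observed at two receivers (A, B) for two satellites (i, j) at one epoch.
# Phase values are arbitrary illustrative numbers in cycles.
def double_difference(phase_a, phase_b, sat_i, sat_j):
    """phase_a, phase_b: dicts mapping satellite id -> carrier phase (cycles)
    observed at receivers A and B at the same epoch."""
    sd_i = phase_a[sat_i] - phase_b[sat_i]   # single difference, satellite i
    sd_j = phase_a[sat_j] - phase_b[sat_j]   # single difference, satellite j
    return sd_i - sd_j                       # double difference

phase_a = {"G01": 1203.25, "G07": 987.50}
phase_b = {"G01": 1198.75, "G07": 985.25}
print(double_difference(phase_a, phase_b, "G01", "G07"))  # 2.25
```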
3.7 Adjusting Networks with GPSCOM
GPSCOM is a program for the combined adjustment of multiple GPS data sets initially processed by the program PAGES. GPSCOM is a simple Helmert blocking, normal-equation processor: it combines multiple PAGES-processed GPS data sets to form and partially reduce normal equations, eliminating numerous nuisance parameters that are not generally of interest in a large global adjustment. The normal-equation elements for the global parameters, those to be passed on to a combined adjustment, are written by PAGES into a normal equation file, which becomes the basic input data for GPSCOM. One or more of these files, as well as GPSCOM's own output normal equation files, can then be processed by GPSCOM to provide a combined adjustment of the global parameters.
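At the level of the global parameters, the combination step works like this: each session contributes a normal-equation system (N_k, u_k) over the same parameters, and the combined solution solves (sum of N_k) x = (sum of u_k). The sketch below illustrates only that arithmetic; it is not GPSCOM code, and the two tiny 2-parameter "sessions" are fabricated for the example.

```python
# Combining independent normal-equation systems over the same parameters,
# as a toy stand-in for a GPSCOM-style combined adjustment.
def solve2(N, u):
    """Solve a 2x2 system N x = u by Cramer's rule."""
    (a, b), (c, d) = N
    det = a * d - b * c
    return [(d * u[0] - b * u[1]) / det, (a * u[1] - c * u[0]) / det]

def combine_normals(systems):
    """systems: list of (N, u) pairs sharing the same parameter order."""
    N = [[sum(s[0][i][j] for s in systems) for j in range(2)] for i in range(2)]
    u = [sum(s[1][i] for s in systems) for i in range(2)]
    return solve2(N, u)

# Two independent "sessions" contributing normals for the same two parameters:
N1, u1 = [[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0]
N2, u2 = [[2.0, 0.0], [0.0, 2.0]], [2.0, 0.0]
x = combine_normals([(N1, u1), (N2, u2)])  # combined estimate of both parameters
```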
3.8 Making Combined Adjustments
Making combined adjustments using the programs PAGES and GPSCOM is a process usually called Helmert blocking in the traditional geodetic community. The technique was first described by Friedrich Helmert (1880), and a good description of the application of his method was presented by Wolf (1978), including the original instructions given by Helmert for using this approach. The program GPSCOM used here for combining the normal equation matrices is in many ways a direct descendant of a prototype program (Dillinger 1978) coded many years ago for the same type of task.
Once some initial processing of data through PAGES is complete, one can begin making combined adjustments using GPSCOM and the normal equation files that have been generated. Since the normal equation files generated by GPSCOM are identical in structure to those made by PAGES, one can combine partial normal equations from one execution of GPSCOM with those from PAGES or from other runs of GPSCOM. This allows one to use a pyramid-type structure to build up combined adjustments with larger and larger amounts of data.
3.8.1 Helmert Blocking
Helmert blocking is basically a technique for breaking up a least squares adjustment problem that is too large to be managed as a single computation with the resources available into many smaller computational tasks that can be managed. Not only are the individual computational problems thus smaller, but with a good blocking strategy there is actually much less computation to be done, as the technique introduces and takes advantage of sparsity in the normal equation system. In fact, the technique is very similar to the method for solving sparse normal equation systems known as nested dissection (George 1973) and, with the right blocking strategy, should have the same advantages.
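The core step of Helmert blocking, eliminating a block's interior (nuisance) parameters so that only the boundary parameters are passed upward, can be shown in miniature. The sketch below assumes one interior parameter and two boundary parameters; the reduction is the standard Schur-complement formula, and all numbers are fabricated for illustration.

```python
# A toy Helmert-blocking step: eliminate one interior parameter x_i from a
# 3-parameter normal system, leaving reduced ("partially reduced") normal
# equations in the two boundary parameters x_b.
def reduce_interior(n_ii, n_ib, N_bb, u_i, u_b):
    """n_ii : scalar normal element of the interior parameter
    n_ib : length-2 coupling row between interior and boundary parameters
    N_bb : 2x2 boundary normal matrix; u_i, u_b : right-hand sides."""
    N_r = [[N_bb[r][c] - n_ib[r] * n_ib[c] / n_ii for c in range(2)]
           for r in range(2)]
    u_r = [u_b[r] - n_ib[r] * u_i / n_ii for r in range(2)]
    return N_r, u_r

def solve2(N, u):
    """Solve a 2x2 system N x = u by Cramer's rule."""
    (a, b), (c, d) = N
    det = a * d - b * c
    return [(d * u[0] - b * u[1]) / det, (a * u[1] - c * u[0]) / det]

n_ii, n_ib, u_i = 4.0, [1.0, 2.0], 2.0
N_bb, u_b = [[5.0, 1.0], [1.0, 4.0]], [3.0, 1.0]

N_r, u_r = reduce_interior(n_ii, n_ib, N_bb, u_i, u_b)
x_b = solve2(N_r, u_r)                                    # boundary solution
x_i = (u_i - n_ib[0] * x_b[0] - n_ib[1] * x_b[1]) / n_ii  # back-substitution
# The recovered (x_i, x_b) satisfies the original full 3x3 normal system.
```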
From a computational point of view, it is probably better to break the problem into quite small blocks, i.e., groups of marks for PAGES to process. Small blocks run much faster in the PAGES program and usually result in a sparser normal equation system, thus reducing the overall computational task. However, the concept of making many smaller blocks to be individually processed with PAGES does have a potential disadvantage, because the double-differencing method used by PAGES introduces correlations between the observations. PAGES has an algorithm (Hilla and Milbert 1989) that corrects for these correlations, but for the de-correlation method to work, all observations at a given epoch must be handled simultaneously by PAGES. Using the Helmert blocking method with double-difference observations prevents the correlations for observations of a baseline that spans two blocks from being handled completely correctly. Practical