Many surveyors are making heavy use of GPS reference networks these days, routinely absorbing 3D data from satellites. While doing research for an upcoming article, I came across information that may call into question the accuracy of GPS reference networks. I believe these potential faults should be researched, evaluated, solved, standardized and regulated.
When I first wrote about GPS reference systems in January 2005, I initially referred to them as VRS for Virtual Reference Systems until I found that Trimble had trademarked this term. Because of that, I titled my article “Imaginary Reference Systems,” since the purpose was to create an imaginary base station where the rover is. This article produced a number of questions about the networks, such as “Who is watching the watcher?”
Why is that question important, you ask? It’s important because we need to learn a lot more about how these systems actually work. The manufacturers of the GPS systems developed the software that assembles the data from surrounding base stations and performs the computations for dissemination to the user. These manufacturers implemented mathematical formulas to read the data, assess errors, compute corrections and apply those values to a dynamic stream of data. The problem is that the manufacturers consider their systems proprietary and therefore not open to scrutiny by the public.
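To make the discussion concrete, here is a minimal sketch of the kind of computation a network performs: estimating a correction at the rover’s position by weighting the corrections observed at surrounding base stations. This example uses simple inverse-distance weighting and invented station coordinates and values; it is purely illustrative. The manufacturers’ actual algorithms are proprietary and model the individual error sources (ionosphere, troposphere, orbits) far more rigorously.

```python
# Illustrative only: real network RTK software models ionospheric,
# tropospheric and orbit errors separately; the vendors' actual
# algorithms are proprietary.  All stations and values are invented.

def idw_correction(rover, stations, power=2.0):
    """Inverse-distance-weighted estimate of a scalar correction at the rover.

    rover    -- (x, y) position in metres
    stations -- list of ((x, y), correction) tuples from surrounding bases
    """
    num = den = 0.0
    for (sx, sy), corr in stations:
        d2 = (rover[0] - sx) ** 2 + (rover[1] - sy) ** 2
        if d2 == 0.0:
            return corr  # rover sits exactly on a base station
        w = 1.0 / d2 ** (power / 2.0)
        num += w * corr
        den += w
    return num / den

# Three hypothetical bases on a 10 km triangle, corrections in metres.
stations = [((0.0, 0.0), 0.10), ((10000.0, 0.0), 0.16), ((0.0, 10000.0), 0.13)]
print(round(idw_correction((3000.0, 3000.0), stations), 4))
```

The point of the sketch is only that the rover’s correction is a derived, weighted quantity, not a direct observation, so its quality depends entirely on the formulas chosen and how they were implemented.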
When I wrote the article in 2005, I noted that each manufacturer had developed software to produce corrections to the data being captured. Trimble uses the Trimble RTKnet software; Leica developed Spidernet; and Topcon offers the TopNet Reference Station Software Suite.
Based on e-mail responses I received from the article, many people questioned the output of these solutions. So I decided to research the issue and write a follow-up article on what algorithms were being used to produce these computations.
As part of my research, I contacted each of the major manufacturers of these systems and asked if we could discuss the mathematics. My intent was to briefly describe how the corrections were performed and whether they had been approved by a licensed surveyor. The meetings did not go quite as I had hoped: each manufacturer deflected my questions, claiming that the systems were proprietary and not completely open to public scrutiny. One advised me to read the white papers on the manufacturer’s website, because that was the only information they were willing to make public. I did review those white papers, albeit briefly, but they did not answer my questions.
Left at a dead end in my research, I remarked that this is the first instance I can think of in which surveyors go to sleep at night trusting completely in someone else’s answers. These professionals spend a lot of time checking their answers, and then they check the check. With the advent of reference networks, surveyors must trust the data they are collecting even though they don’t know exactly where it came from.
This lack of information has left me with many questions regarding these systems, including:
Does a licensed professional oversee these systems?
When a system crosses state lines, was the overseeing professional licensed by each of the states in question?
What happens when a construction surveyor collects some data from one network and then, for whatever reason, switches to another?
Can one replicate exactly the same solution when this switch occurs? If not, why not? Is that OK with everyone?
Who is overseeing this national but independent network of solutions?
Other related questions I have include: Who checks the algorithms of these correction systems? Who verifies the inputs, the outputs and the dissemination of the data? Is the software bug-free? If not, what corrections are made, and when are they made: after complaints, or after QA/QC? Is QA/QC occurring at all? Is a licensed surveyor overseeing, reviewing and approving this information and the mathematics and formulas on which it is based?
Let’s say that a licensed surveyor does approve these computations and processes. What happens when a customer crosses into areas where that professional is not certified or licensed? This could easily occur when a project’s limits cross jurisdictions, as with the Heartland Corridor project that runs between Newport News, Va., and Chicago. In that case, the surveyor is collecting information from multiple states’ networks. If a licensed professional certified the system, was he or she licensed by all of the states the system serves? So where does the buck stop? Who is responsible?
Now add to that another complication, one where the surveyor jumps from one manufacturer’s network to another’s. A project could begin on one network and later, perhaps due to a weaker signal, the equipment operator elects to take advantage of his subscription to an overlapping network. When the collected data must form a homogeneous, seamless stream, are the different manufacturers using precisely the same algorithms? Have they implemented the mathematics in the same way? I suggest that the answer is uniformly “no.” I say no because there is currently no regulation, and since the manufacturers won’t divulge their secrets, how could they all possibly be doing it the same way?
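To illustrate why two networks need not agree, consider a hypothetical comparison of two generic interpolation schemes, inverse-distance weighting versus an exact planar fit, applied to the very same three base-station corrections. Neither scheme is claimed to be any vendor’s method, and the stations and values are invented; the point is only that two reasonable implementations can disagree at the centimeter level over a 10 km triangle.

```python
# Hypothetical comparison: two generic interpolation schemes applied to the
# same base-station corrections.  Neither represents any vendor's actual
# algorithm; the exercise only shows that different implementations
# produce different answers for the same rover position.

def idw(rover, stations):
    """Inverse-distance-squared weighting of station corrections."""
    num = den = 0.0
    for (sx, sy), corr in stations:
        d2 = (rover[0] - sx) ** 2 + (rover[1] - sy) ** 2
        if d2 == 0.0:
            return corr
        num += corr / d2
        den += 1.0 / d2
    return num / den

def plane(rover, stations):
    """Exact planar fit c(x, y) = a + b*x + d*y through three stations."""
    (x1, y1), c1 = stations[0]
    (x2, y2), c2 = stations[1]
    (x3, y3), c3 = stations[2]
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    b = ((c2 - c1) * (y3 - y1) - (c3 - c1) * (y2 - y1)) / det
    d = ((x2 - x1) * (c3 - c1) - (x3 - x1) * (c2 - c1)) / det
    a = c1 - b * x1 - d * y1
    return a + b * rover[0] + d * rover[1]

stations = [((0.0, 0.0), 0.10), ((10000.0, 0.0), 0.16), ((0.0, 10000.0), 0.13)]
rover = (3000.0, 3000.0)
print(round(idw(rover, stations), 4))    # about 0.117 m
print(round(plane(rover, stations), 4))  # 0.127 m, roughly 1 cm different
```

Two surveyors occupying the same point on two such networks would each get an internally consistent answer, and the answers would still differ.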
Because of this uncertainty, I believe we need an independent third party, perhaps the NGS or Weights and Measures, to step in, identify the issues involved, document the solutions and check that the manufacturers have implemented those solutions correctly.
I recently read an article about the aeronautical industry, which is wrestling with a similar concept for stations providing guidance information to landing aircraft. The author also seemed distressed that regulation and conformity didn’t exist, and recommended that all station operators standardize on the Radio Technical Commission for Maritime Services (RTCM) Special Committee 104 standard for Differential Global Navigation Satellite System (DGNSS) services. In another article, the proposed solution was that the “format of GPS measurement corrections should be standardized to ensure that the system is independent of any single receiver manufacturer. This can be solved by adopting the RTCM standard for RTK multiple reference stations v3.0.”
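For readers unfamiliar with what such standardization buys, here is a sketch of the RTCM 3 transport framing (a 0xD3 preamble byte, a 10-bit payload length and a 24-bit CRC-24Q check) as it is commonly documented. The field layout, the generator polynomial 0x1864CFB and the example message number are stated from general knowledge rather than from the standard’s text, so verify against RTCM 10403 before relying on them. The value of a published frame format is exactly what the quoted article argues: any party, not just the manufacturer, can check that a correction stream is well-formed.

```python
# Sketch of RTCM 3 transport framing as commonly documented (verify against
# RTCM 10403): preamble 0xD3, 6 reserved bits, 10-bit payload length,
# payload (first 12 bits = message number), 24-bit CRC-24Q.

CRC24Q_POLY = 0x1864CFB  # commonly cited CRC-24Q generator polynomial

def crc24q(data: bytes) -> int:
    """Bitwise CRC-24Q over the given bytes (init 0, no reflection)."""
    crc = 0
    for byte in data:
        crc ^= byte << 16
        for _ in range(8):
            crc <<= 1
            if crc & 0x1000000:
                crc ^= CRC24Q_POLY
    return crc & 0xFFFFFF

def frame(payload: bytes) -> bytes:
    """Wrap a payload in an RTCM 3 transport frame."""
    if len(payload) > 1023:
        raise ValueError("payload exceeds 10-bit length field")
    header = bytes([0xD3, (len(payload) >> 8) & 0x03, len(payload) & 0xFF])
    crc = crc24q(header + payload)
    return header + payload + crc.to_bytes(3, "big")

def parse(buf: bytes):
    """Return (message_number, payload) if the frame checks out, else None."""
    if len(buf) < 6 or buf[0] != 0xD3:
        return None
    length = ((buf[1] & 0x03) << 8) | buf[2]
    if len(buf) < 3 + length + 3:
        return None
    payload = buf[3:3 + length]
    crc = int.from_bytes(buf[3 + length:6 + length], "big")
    if crc24q(buf[:3 + length]) != crc:
        return None
    msg_num = (payload[0] << 4) | (payload[1] >> 4)  # first 12 payload bits
    return msg_num, payload

# Round-trip a minimal payload whose first 12 bits encode message type 1004.
framed = frame(bytes([0x3E, 0xC0]))
print(parse(framed)[0])  # 1004
```

Note that the CRC lets a receiver detect corruption independently, which is the kind of third-party verifiability this article argues the correction computations themselves still lack.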
I now pose these questions to the practicing professionals, those using the reference networks daily: Do you know the basis and final implementation of the system you are using? Do you cross networks in performing projects? How confident are you that two or more reference networks would return the exact same answer for a single collected position? If you cross state lines, who checked the computations you are collecting? Please share your comments below or e-mail me with your thoughts.