How do you measure the level of the sea to mm precision? It is highly doubtful that we are able to do that with any kind of instrumentation, much less around the whole world at the same time.
EDIT: they say they measure everything with satellites, but that does not mean there is no uncertainty in the measurement. No measurement is 100% accurate, and I'd certainly like to see the error range of the satellites, when even GPS can't do better than +/- 30 cm precision.
AFAIK, satellite sea level measurements are only really accurate to a few cm. But when many measurements are averaged over long periods of time, that's still good enough to observe trends on the order of a few mm per year.
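The statistics behind this can be sketched with a quick simulation (a toy model, not the actual altimetry processing chain; all the numbers below are illustrative assumptions): give each individual measurement a few cm of zero-mean random noise, take many measurements over two decades, and fit a linear trend.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: true sea level rises 3 mm/year; each individual
# measurement carries 30 mm (3 cm) of zero-mean random noise.
true_trend_mm_per_year = 3.0
noise_sigma_mm = 30.0

# Roughly 10 measurements per day for 20 years
t_years = np.linspace(0, 20, 20 * 365 * 10)
measurements = (true_trend_mm_per_year * t_years
                + rng.normal(0, noise_sigma_mm, t_years.size))

# A least-squares linear fit recovers the trend far more precisely
# than any single 3 cm-accurate measurement could.
slope, intercept = np.polyfit(t_years, measurements, 1)
print(f"recovered trend: {slope:.2f} mm/year")  # close to 3.0
```

The caveat is that this only works for noise that is zero-mean and uncorrelated between measurements; a systematic instrument bias would not average out, which is why calibration matters so much.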
Each diagram should have a FAQ with answers to questions like ekianjo's, otherwise a conspiratorial attack is too easy.
The "pro" side should read and learn from forum discussions like this one, making notes on criticisms and improving their presentations. Other issues pointed out:

> "All of the charts should use the same time scale. It's confusing and misleading to use different scales for each"

> "I wonder why the chart for CO2 concentration starts at 1960 vs 1850 for the global mean temperature difference"

> "Interesting graphs, but it's a little confusing that all but one of them use difference of the metric (i.e. relative values) instead of absolute values -- what and when are the differences relative to?"

I think "The definition of insanity is doing the same thing over and over again, but expecting different results" has some relevance here.
The nice thing about satellite orbits is that they are extremely steady and predictable. Over long time scales, a satellite's orbit drifts due to many effects, such as non-uniformity of Earth's gravity. But over short timescales, its motion is very precisely determined by its orbital parameters.
In particular, there's a precise relationship between a satellite's orbital period and its orbital radius (technically, its semi-major axis). A one-centimeter variation in altitude would result in a timing error of several hundred microseconds per day, which is enough to be detected using precise clocks and Doppler effect measurements.
> A one-centimeter variation in altitude would result in a timing error of several hundred microseconds per day
Source or math for this? Because for any signal in the MHz range, I'm not sure I believe it.
Several hundred microseconds of a 150 MHz wave is tens of thousands of cycles. That seems... questionable.
I did a check on a decibel calculator with a 150 MHz signal, and a 1 meter change was approximately 0.01 dB... which is effectively undetectable in a real-world application. Signal strength isn't the same as propagation delay, I know. But yeah...
I look forward to being corrected, but I can’t say that claim seems legitimate on its face.
EDIT: Nope. Did some (probably bad) math on this on my own; the claim is nonsense. Especially because the extra distance is in space, where radio travels at the speed of light.
I don't understand what you think is nonsense about this claim. Can you elaborate?
The timing numbers I quoted are purely based on the orbital motion of a (hypothetical) satellite, and have nothing to do with radio signals. Kepler's third law states that a body's orbital period varies in proportion to the 1.5th power of its semi-major axis. A 1cm altitude difference for a satellite in LEO corresponds to a change of about 1.5 parts per billion, which translates to a 2.2 ppb change in orbital period. As I said, this amounts to a cumulative difference of a couple hundred microseconds per day.
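That arithmetic can be sketched in a few lines (using a hypothetical altitude of ~1336 km, similar to a Jason-class altimetry satellite; the exact altitude is an illustrative assumption):

```python
import math

# Kepler's third law: T ∝ a^(3/2), so dT/T = 1.5 * da/a.
earth_radius_m = 6.371e6
altitude_m = 1.336e6                 # hypothetical LEO altitude
a = earth_radius_m + altitude_m      # semi-major axis

da = 0.01                            # 1 cm altitude change
rel_a = da / a                       # fractional change in semi-major axis
rel_T = 1.5 * rel_a                  # fractional change in orbital period

# That tiny fractional period change accumulates over a full day.
seconds_per_day = 86400
drift_us_per_day = rel_T * seconds_per_day * 1e6
print(f"da/a = {rel_a:.2e}")                               # ~1.3e-9
print(f"dT/T = {rel_T:.2e}")                               # ~1.9e-9
print(f"timing drift: {drift_us_per_day:.0f} us/day")      # ~170 us/day
```

So a 1 cm orbit change shows up as a cumulative timing shift on the order of a couple hundred microseconds per day, well within reach of a good clock.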
And it's actually much easier to precisely measure frequency differences than amplitude differences, if you have sufficiently accurate clocks. If you have a 150.000000 MHz reference signal and a 150.000001 MHz Doppler-shifted signal, you can simply multiply them together to get a 1 Hz beat frequency. Using this technique, you can measure phase differences that are considerably less than a single cycle of the original signal.
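Here's a numerically scaled-down sketch of that mixing trick (150 Hz vs 151 Hz standing in for 150.000000 MHz vs 150.000001 MHz, so it can be simulated at a modest sample rate; the principle is identical):

```python
import numpy as np

fs = 2000          # sample rate, Hz
duration = 10.0    # seconds; long enough to resolve a 1 Hz beat
t = np.arange(0, duration, 1 / fs)

reference = np.cos(2 * np.pi * 150.0 * t)   # local reference oscillator
shifted = np.cos(2 * np.pi * 151.0 * t)     # "Doppler-shifted" signal

# Multiplying the two signals produces sum (301 Hz) and difference
# (1 Hz) components; a low-pass filter keeps only the beat.
mixed = reference * shifted
spectrum = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

low = freqs < 50   # crude low-pass: inspect only the band below 50 Hz
beat = freqs[low][np.argmax(np.abs(spectrum[low]))]
print(f"beat frequency: {beat:.1f} Hz")   # -> 1.0 Hz
```

The beat frequency directly encodes the tiny frequency offset, which is why a stable reference clock turns a sub-ppb Doppler shift into something you can count.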
A major limiting factor, of course, is the stability and precision of your reference clocks. Apparently, the Jason-2 satellite that (until recently) was responsible for a lot of these measurements had a high-precision quartz oscillator that was stable to roughly one part per trillion: https://www.ncbi.nlm.nih.gov/pubmed/30004875
Measuring the absolute position and velocity of a satellite is comparatively a lot more difficult. But with sufficiently precise Doppler relative-velocity measurements from multiple points, you can solve for both the orbital parameters and the slowly-varying perturbations with a high degree of accuracy.
I don't agree with this claim, unless you quantify it. This has already been touched upon before, for example here:
> "It depends upon the orbit and what time scales you are talking about. Satellites are subjected to many perturbations in its orbit. There are effects due to atmospheric drag, which as you'd expect affect lower satellite orbits more than higher orbits, but the atmosphere swells up all the time depending upon the level of solar activity. Gravitationally, the Earth is not a point mass and it has regions where the gravity gradient changes, which causes the satellite to get pulled one way or another (very slightly) as it orbits around."
The link in the top-level comment addresses all of these concerns, among other considerations and carefully calibrated corrections. They clearly know what they are doing.
They have a high quality map of the variations of gravity across the surface of the Earth. They also have a model that accounts for atmospheric drag.
The problem is that every instrument involved in the chain of measurement has its own inaccuracies, and at the end of the day you would need to make sure that, once you add up each of their inaccuracies, they do not compound to more than what you are actually measuring. This is a very complex subject and I'm not sure it's as "settled" as you seem to portray it.
Oh, sure, I don't mean to minimize the engineering challenges involved. I'm far from an expert in the details of how these particular satellites work; I'm just trying to describe the general principles, to make the point that this level of measurement accuracy shouldn't be viewed as intrinsically unattainable.
> But when many measurements are averaged over long periods of time, that's still good enough to observe trends on the order of a few mm per year.
That would be assuming your satellites have no drift in measurement error over time (doubtful), and that all the satellites used over several decades (since it's not a single satellite) have the same accuracy and bias. In practice that's very rarely the case: you end up averaging different measurements with different levels of inaccuracy, and that does not make for a very convincing single resulting value.
You are very casually implying gross incompetence or dishonesty among the many highly skilled individuals working on these projects.
Have you actually thoroughly researched the methodologies they use? Do you have any hard evidence that they are not properly accounting for sources of error or overstating their results?
The claim that you can actually measure the growth of grass from a satellite (which is the kind of precision we are talking about here) seems just too extraordinary to accept at face value, when no other technology we use with satellites has such precision. It does not pass the sniff test. I could be wrong, but I am willing to bet the error bars on such measurements would be huge.
Since we're interested in the delta, even an uneven error won't matter as long as the error distribution stays constant over the whole experiment.
If you are within the error range of your instruments, you can average all you want and all you will get is random noise. Averaging within the noise threshold does not bring you closer to the truth. On top of that, there are waves on the sea, so this is not even a flat, fixed surface we are talking about.