Surveying II. Lecture 1.


Posted on 11-Feb-2016








<ul><li><p>Surveying II.</p><p>Lecture 1.</p></li>
<li><p>Types of errors</p><p>There are several types of error that can occur, each with different characteristics.</p><p>Mistakes (blunders)</p><p>Such as miscounting the number of tape lengths when measuring long distances, or transposing numbers when booking. Mistakes can occur during the whole surveying process: observing, booking, computing or plotting.</p><p>Solution: creating suitable procedures and checking the measurements. In terms of probability theory, the effect of a blunder is much larger than the acceptable error of the applied measurement technique.</p></li>
<li><p>Types of errors</p><p>Systematic errors</p><p>Systematic errors arise from sources that act in a similar manner on the observations.</p><p>Examples: the expansion of steel tapes due to temperature changes, or frequency changes in electromagnetic distance measurements. These errors are dangerous when observations have to be added, because they act in the same direction, so the total effect is the sum of the individual errors.</p><p>Solution: calibrating the instruments, i.e. comparing the observations with observations made by other instruments.</p></li>
<li><p>Types of errors</p><p>Random errors</p><p>All the discrepancies remaining once mistakes and systematic errors have been eliminated. Even when a quantity is measured many times with the same technology and instrumentation, it is highly unlikely that the results will be identical.</p><p>Although these errors are called random, they have the following characteristics: small errors occur more frequently than large ones; positive and negative errors are equally likely; very large errors occur rarely. Due to this, a normal statistical distribution can be assumed.</p><p>Solution: repetition of observations.</p></li>
<li><p>The aim of processing the observations</p><p>Questions:</p><p>How can the variability of the observations be described numerically? (Error theory)</p><p>How can we describe the variability of functions of observations (area, volume, etc.)?
(Error propagation laws)</p><p>How can we remove the discrepancies from the observations? (Computational adjustment)</p></li>
<li><p>Basics of error theory</p><p>Probabilistic variables (PV): quantities on which random processes have an effect.</p><p>Discrete PV: the variable can take only a countable number of distinct values.</p><p>Continuous PV: the variable can take an infinite number of values within an interval.</p></li>
<li><p>Probability Distribution Function</p><p>[Figures: frequency curve, proportional frequency curve, and the probability distribution curve]</p></li>
<li><p>Probability Distribution Function</p><p>Properties of the PDF: the probability that the PV lies within the interval (c, d) is P(c &lt; ξ ≤ d) = F(d) − F(c).</p></li>
<li><p>The normal distribution</p><p>If the value of a PV depends on a large number of independent, random factors, and their individual effects are small, then the PV usually follows the normal distribution.</p></li>
<li><p>The bell curve</p><p>[Figures: the effect of a change in the mean value and of a change in the standard deviation]</p></li>
<li><p>The standard normal distribution</p><p>Instead of the PV, the standardized PV t = (ξ − μ)/σ could also be used for computations.</p></li>
<li><p>The 3σ rule</p><p>The probability that the PV lies within the interval ±3σ around its mean value (μ) is 99.73% (almost certain).</p></li>
<li><p>Important quantities: the mean value (μ), the variance (σ²) and the standard deviation (σ).</p></li>
<li><p>Observation errors</p><p>Let δ denote the difference between the theoretical value and the mean value, and ξ the difference between the i-th observation and the mean value. Then: total error = systematic error + random error.</p></li>
<li><p>The mean error (Gauss)</p><p>Recall the definition of the standard deviation. If we separate the systematic and the random errors, δ is the mean systematic error and mξ is the mean random error.</p></li>
<li><p>Correlation and covariance</p><p>In the case of two PVs the question may arise: are they independent?
Do they depend on each other? The correlation and the covariance answer the question: is there a linear relationship between ξ and η?</p><p>If ρ = +1 or −1, there is a linear relationship between the two quantities; if ρ = 0, this is a necessary but not sufficient criterion for independence.</p></li>
<li><p>Estimations</p><p>Please note that up to now, all PVs were continuous PVs. BUT we do not know the probability distribution of the PVs, therefore it must be estimated from a number of samples (observations).</p><p>Unbiased (undistorted) estimation: the mean value of the estimation equals the estimated quantity.</p><p>Efficiency of the estimation: of two unbiased estimations, the more efficient is the one with the lower variance.</p></li>
<li><p>Estimations</p><p>The mean value is estimated by the arithmetic mean. For the mean error: since the observation errors (εi) are not known, we use the differences from the arithmetic mean instead. That estimation is biased, therefore we use the corrected standard deviation.</p></li>
<li><p>Error propagation</p><p>If the observations are PVs, then their functions are PVs, too. That is the law of error propagation.</p><p>We assume that the observations are independent.</p><p>Let there be n observations (L1, L2, …, Ln) and their function G = g(L1, L2, …, Ln).</p><p>Questions: how big is the error of the value of G (εG) when the errors of the Li (εi) are known? What is the standard deviation of G (σG) when the standard deviations of the Li are known? Let us suppose that the function G is linear (if not, it should be expanded as a Taylor series).</p></li>
<li><p>Error propagation</p><p>[Figure: the error εy of y = G(x) caused by an error εx in x; for the linearized function, εy = εx · g, where g = G′(x)]</p></li>
<li><p>Error propagation</p><p>Propagation of observation errors: the propagation of observation errors is linear. If some systematic errors remain in the observations, their effect also propagates linearly.</p></li>
<li><p>Error propagation</p><p>Propagation of mean errors.</p></li>
<li><p>Error propagation</p><p>Simple cases:</p><p>an observation multiplied by a constant (G = cL)</p><p>the sum of two quantities</p><p>the product of two quantities</p><p>the mean value of the samples</p></li></ul>
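The 99.73% figure of the 3σ rule can be verified numerically. The sketch below is not part of the original lecture; it uses the identity that, for a normal distribution, the probability of falling within ±k·σ of the mean equals erf(k/√2). The function name `within_k_sigma` is my own:

```python
import math

def within_k_sigma(k):
    """Probability that a normally distributed PV falls within +/- k standard
    deviations of its mean; for the normal distribution this is
    erf(k / sqrt(2)), independent of the particular mean and sigma."""
    return math.erf(k / math.sqrt(2))

# The 3-sigma rule: mean +/- 3*sigma covers about 99.73% of the values.
p3 = within_k_sigma(3)
```

The same function reproduces the familiar one-sigma figure of about 68.27% with `within_k_sigma(1)`.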
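The correlation and covariance slide can be illustrated with a short computation. This is a sketch assuming the usual sample estimators with the (n − 1) divisor; the function names and data are illustrative, not from the lecture:

```python
import math

def covariance(xs, ys):
    """Sample covariance of two PVs, estimated with the (n - 1) divisor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

def correlation(xs, ys):
    """Correlation coefficient: the covariance normalized by both standard
    deviations, so it always lies between -1 and +1."""
    return covariance(xs, ys) / math.sqrt(covariance(xs, xs) * covariance(ys, ys))
```

A value of ±1 indicates an exact linear relationship (e.g. `correlation([1, 2, 3], [2, 4, 6])` gives +1), while a value of 0 is necessary but not sufficient for independence.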
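The estimation procedure described in the slides — arithmetic mean plus corrected standard deviation computed from the residuals — can be sketched as follows. The observation values are made up for illustration:

```python
import math

def arithmetic_mean(observations):
    """Estimate of the mean value: the arithmetic mean of the sample."""
    return sum(observations) / len(observations)

def corrected_std_dev(observations):
    """Corrected standard deviation.  The true observation errors are unknown,
    so the residuals from the arithmetic mean are used instead; dividing by
    (n - 1) rather than n removes the bias of that substitution."""
    n = len(observations)
    mean = arithmetic_mean(observations)
    return math.sqrt(sum((x - mean) ** 2 for x in observations) / (n - 1))

# Hypothetical repeated distance observations in metres (made-up values):
obs = [120.014, 120.018, 120.011, 120.016, 120.013]
mean_estimate = arithmetic_mean(obs)
mean_error_estimate = corrected_std_dev(obs)
```

Python's standard `statistics.stdev` applies the same (n − 1) correction.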
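The "simple cases" of error propagation all follow from the general law for a linear(ized) function of independent observations, σG² = Σ gᵢ²·σᵢ². A minimal sketch, with function names of my own choosing:

```python
import math

def propagated_sigma(partials, sigmas):
    """Propagation of mean errors for a linear(ized) function
    G = g1*L1 + ... + gn*Ln of independent observations:
    sigma_G = sqrt(sum(gi**2 * sigma_i**2))."""
    return math.sqrt(sum((g * s) ** 2 for g, s in zip(partials, sigmas)))

# The simple cases from the last slide, expressed through the general law:

def sigma_scaled(c, sigma):
    """G = c * L: the mean error is scaled by |c|."""
    return propagated_sigma([c], [sigma])

def sigma_sum(sigma1, sigma2):
    """G = L1 + L2: the variances add."""
    return propagated_sigma([1.0, 1.0], [sigma1, sigma2])

def sigma_product(l1, l2, sigma1, sigma2):
    """G = L1 * L2, linearized: the partial derivatives are L2 and L1."""
    return propagated_sigma([l2, l1], [sigma1, sigma2])

def sigma_mean(sigma, n):
    """Arithmetic mean of n samples with equal mean errors: sigma / sqrt(n)."""
    return propagated_sigma([1.0 / n] * n, [sigma] * n)
```

The last case shows why repeating observations helps: the mean error of the arithmetic mean shrinks with √n.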