R3-COP Grant agreement no. 100233
Project acronym R3-COP
Project full title Resilient Reasoning Robotic Co-operating Systems
Dissemination level PU
Date of Delivery 08. Feb. 2013
Deliverable Number D4.2.3 Version: 1.1
Deliverable Name Criticality Analysis for Computer Vision Algorithms – CV HAZOP
AL / Task related WP4.2: T4.2.4
Editor Oliver Zendel (AIT), Markus Murschitz (AIT)
Contributors Oliver Zendel (AIT), Wolfgang Herzner (AIT), Markus Murschitz (AIT), Gisbert Lawitzky (SIE), Georg von Wichert (SIE), Wendelin Feiten (SIE), Ulrich Köthe (HCI), Jan Fischer (IPA)
Keywords: computer vision, criticality analysis, hazard and operability study, HAZOP, coverage metrics, model-based test case generation, autonomous systems
Abstract During the development of techniques for model-based generation of test data for robot vision, a detailed analysis of visual aspects that can hamper the correct interpretation of observed scenes by computer vision algorithms has been carried out in WP4.2, based on HAZOP concepts. This document contains the results of this study, in order both to base further (e.g. implementation) work on it and to ease its continuous public extension.
ARTEMIS-2009-1 R3-COP / Visual Perception Testing D4.2.3
Page 2 / 187
Document History

| Ver. | Date | Changes | Editor |
|------|------------|---------|--------|
| 0.0 | 01.06.2012 | Document skeleton prepared | W. Herzner (AIT) |
| 0.1 | 11.06.2012 | Integration of peer-reviewed criticalities (originally CV_CritAnalysis_v0 4 2) of O. Zendel, W. Herzner and M. Murschitz; format clean-up | M. Murschitz (AIT) (integration), O. Zendel (AIT) |
| 0.2 | 13.07.2012 | Merged with Num, Observer from OZ | O. Zendel (AIT) |
| 0.3 | 17.07.2012 | Merged with PSF (Oliver Z); added analysis and conclusion; removed most to-dos; added references to certain specific vision topics | M. Murschitz (AIT) |
| 0.4 | 18.07.2012 | Executive summary updated; remaining to-dos removed | W. Herzner (AIT) |
| 0.5 | 25.07.2012 | Minor open issues solved; incorporated inputs from G. Lawitzky; incorporated inputs from U. Köthe | W. Herzner (AIT), M. Murschitz (AIT), O. Zendel (AIT) |
| 0.6 | 27.07.2012 | Statistics update and reordering; clean-up; incorporated inputs from J. Fischer; corrected some lower-/uppercase collisions; removed empty bullet lists | W. Herzner (AIT), M. Murschitz (AIT), O. Zendel (AIT) |
| 1.0 | 31.07.2012 | Fixed some bullet list styles; finalised | O. Zendel (AIT), W. Herzner (AIT) |
| | 13.12.2012 | 5 internal reference errors corrected | W. Herzner (AIT) |
| 1.1 | 08.02.2013 | All HAZOP tables: "Criticalities" removed from the header of column 5; p. 145, Table 14, column 2: list headers ('3', '4') removed; Figures 5 and 6 optically improved; added HAZOP Terminology Remarks | W. Herzner (AIT), O. Zendel (AIT) |
Table of contents

Abbreviations ..... 5
Executive Summary ..... 6
1 Background ..... 7
  1.1 General Aspects of Risk Analysis ..... 7
  1.2 HAZOP ..... 7
    1.2.1 Principles ..... 7
    1.2.2 Technique ..... 7
    1.2.3 Parameters ..... 8
    1.2.4 Guidewords ..... 8
  1.3 Information-Oriented Approach for CV HAZOP ..... 9
    1.3.1 General Considerations ..... 9
    1.3.2 Information Path / Locations ..... 10
      1 / Light Source ..... 11
      2 / Medium ..... 12
      3 / Object ..... 12
      4 / Observer (Sensor) ..... 13
      5 / Software (Algorithm) ..... 15
2 HAZOP of CV Systems ..... 17
  2.1 General Aspects ..... 17
    2.1.1 System Borders ..... 17
    2.1.2 Unwanted Results ..... 17
    2.1.3 Guidewords ..... 17
    2.1.4 Parameter Selection ..... 18
    2.1.5 References between Criticalities ..... 18
    2.1.6 HAZOP Terminology Remarks ..... 19
  2.2 Light Sources ..... 19
    2.2.1 Parameters ..... 19
    2.2.2 HAZOP ..... 20
  2.3 Medium ..... 40
    2.3.1 Parameters ..... 40
    2.3.2 HAZOP ..... 40
  2.4 Object ..... 53
    2.4.1 Parameters ..... 53
    2.4.2 HAZOP ..... 54
  2.5 Objects ..... 82
    2.5.1 Parameters ..... 82
    2.5.2 HAZOP ..... 82
  2.6 Observer - Optomechanics ..... 106
    2.6.1 Parameters ..... 106
    2.6.2 HAZOP ..... 108
  2.7 Observer - Electr(on)ics ..... 144
    2.7.1 Parameters ..... 144
    2.7.2 HAZOP ..... 145
  2.8 Algorithm ..... 161
    2.8.1 Parameters ..... 161
    2.8.2 HAZOP ..... 161
  2.9 Missed Criticalities ..... 180
  2.10 Removed Criticalities ..... 182
3 Summary ..... 184
  3.1 Qualitative Analysis ..... 184
  3.2 Quantitative Analysis ..... 184
  3.3 Conclusion ..... 185
  3.4 Next Steps ..... 186
References ..... 187
List of Images

Figure 1: HAZOP workflow ..... 8
Figure 2: CV HAZOP: locations and information paths ..... 10
Figure 3: optical effects: overexposure, rays, inner lens reflections ..... 14
Figure 4: sensor effect: blooming ..... 15
Figure 5: CV-HAZOP statistics: entries per location ..... 185
Figure 6: Percentage of non-meaningful combinations of guideword and parameter per location ..... 185

List of Tables

Table 1: Process parameters for consideration in a HAZOP ..... 8
Table 2: HAZOP guidewords ..... 9
Table 3: Used HAZOP guidewords ..... 17
Table 4: Light Source Parameters ..... 19
Table 5: Light Source HAZOP ..... 20
Table 6: Medium Parameters ..... 40
Table 7: Medium HAZOP ..... 40
Table 8: Object Parameters ..... 53
Table 9: Object HAZOP ..... 54
Table 10: Objects Parameters ..... 82
Table 11: Objects HAZOP ..... 82
Table 12: Observer - Optomechanics Parameters ..... 106
Table 13: Observer - Optomechanics HAZOP ..... 108
Table 14: Observer - Electronics Parameters ..... 144
Table 15: Observer - Electronics HAZOP ..... 145
Table 16: Algorithm Parameters ..... 161
Table 17: Algorithm HAZOP ..... 161
Table 18: Missed Criticalities ..... 180
Table 19: Removed Criticalities ..... 182
Abbreviations

| Abbreviation | Meaning |
|--------------|---------|
| Appl. | Application |
| Beam | Beam properties |
| Calib. | Calibration |
| Compl. | Complexity |
| CV | Computer Vision |
| CV alg. | Computer Vision algorithm |
| DOF | Depth of Field/Focus |
| Expos. | Exposure |
| Focus. | Focusing |
| FOV | Field of View |
| HAZOP | Hazard and Operability Study |
| HDR | High Dynamic Range |
| Hist. | History |
| IR | Infrared |
| L.s. | Light source |
| LDR | Low Dynamic Range |
| LGeom. | Lenses Geometry |
| LNum. | Lenses Number |
| LOD | Level of Detail |
| Num. | Number |
| Obj. | Object |
| Obs. | Observer |
| Occl. | Occlusion |
| Orient. | Orientation |
| Param. | Parameters |
| Perf. | Real-time Performance |
| Pos. | Position |
| PSF | Point Spread Function |
| PTZ | Pan-tilt-zoom (camera) |
| Qual. | Quality |
| Quant. | Quantization/Sampling |
| Refl. | Reflectance |
| Resol. | Resolution |
| ROI | Region of Interest |
| Runtime. | Runtime-Environment |
| RV | Robot Vision |
| Shad. | Shadow |
| SNR | Signal to Noise Ratio |
| Spec.Eff. | Spectral efficiency |
| SUT | System Under Test |
| Transp. | Transparency |
| UV | Ultraviolet |
| VOrient. | Viewing Orientation |
| VPos. | Viewing Position |
| Wave | Wave properties |
Executive Summary

Despite the decades-long use of image processing techniques as sensory capabilities of computer-based systems, still today no general methodology exists for testing their robustness with measurable coverage of scenes and situational aspects typical and, in particular, critical for a given application. Since, however, public authorities often require safety certification for the commercial application of safety-critical systems, the exploitation of autonomous and robotic systems in human or critical environments will be hampered by missing means for certifying visual perception. Therefore, in work package 4.2 of R3-COP a model-based approach for enabling this kind of certification is developed. One important step in this approach is the identification of visual aspects that can decrease the capability of a robot's vision system (or, more generally, a computer vision system) to interpret a scene correctly: for example, reflections, textures, shadows, occlusions, but also artefacts generated by the sensor system itself, e.g. aberration or blooming. We will refer to such aspects as (visual) criticalities.

Due to the open-endedness of the environments the systems encounter, and the practical infinity of situations they can be confronted with, it appears hopeless to find all possible, or even only the relevant, criticalities. However, the safety industry has developed several techniques to systematically analyse and identify potential risks in the context of a given application. One of them is the so-called HAZOP (HAZard and OPerability) study, which is based on a systematic combination of guidewords with system elements; experts use these combinations to think about their potential meanings. While HAZOP was originally developed for chemical plants, railway interlocking or similar systems, its applicability is not restricted to such domains.
By adapting the guidewords and modelling the system to be analysed appropriately ("from light source to algorithm"), it was possible to use HAZOP also for identifying potential risks for computer vision components, which resulted in approximately one thousand criticalities. This paper contains the results of this "CV HAZOP". The work is based on the following ingredients:

• general aspects of risk analysis
• how a HAZard and OPerability (HAZOP) study works
• a novel generic computer vision system model which is based on an underlying information-oriented concept
• the technical means on which the HAZOP is applied, i.e. distortions and deviations characteristic of a computer vision system.
These ingredients are introduced and discussed in the next chapter. In Chapter 2, the HAZOP study is carried out and its results collected in tabular form. The final Chapter 3 analyses the HAZOP entries and draws a conclusion. Additionally, it identifies several succeeding steps.

It should be noted that the CV HAZOP as presented in this paper is not considered complete. Its usage in generating tests for computer vision components will surely reveal further criticalities. For instance, the end of Chapter 2 contains criticalities that were missed initially and added later to the tables, which may provide guidance for future extensions.
1 Background
1.1 General Aspects of Risk Analysis

When any kind of risk analysis is to be applied to a target system, the following questions need to be answered:
• What are the system borders? In particular:
  • What belongs to the system and what does not?
  • If necessary: how are interfaces defined?
  • Are human beings part of it?
• What are the "unwanted results" that should be avoided when the analysed system is used? That is, it needs to be defined what the analysis shall focus on. For instance, in a train interlocking system it must not happen that two trains collide, or that a train passes a railway crossing while the barriers are not down.
For CV HAZOP, these questions will be addressed in clause 2.1.
1.2 HAZOP
1.2.1 Principles

HAZOP stands for Hazard and Operability Study and is a structured and systematic examination of a planned or existing process or operation, in order to identify and evaluate problems that may represent risks to personnel or equipment, or prevent efficient operation [Red+99]. The HAZOP technique was initially developed to analyse chemical process systems, but has later been extended to other types of systems, to complex operations, and to software systems. A HAZOP is a qualitative technique based on guidewords and is typically carried out by a multi-disciplinary team (HAZOP team) during a set of meetings¹.

Rausand further notes: "the HAZOP study should preferably be carried out as early in the design phase as possible - to have influence on the design. On the other hand, to carry out a HAZOP we need a rather complete design. As a compromise, the HAZOP is usually carried out as a final check when the detailed design has been completed." Since we have a clear concept of the computer vision process, including the observed phenomena, we consider HAZOP as appropriate for guiding our criticality analysis.
1.2.2 Technique

Historically, Process HAZOP is the most elaborated type of HAZOP. Hence, most literature deals with it, sometimes complemented by aspects of Procedure HAZOP. We therefore outline the HAZOP technique based on these types for identifying potential criticalities associated with computer vision algorithms. Readers interested in Software HAZOP are referred again to [Red+99] as well as to Fenelon et al. [Fenelon+94], who compare Software HAZOP to other risk analysis techniques with respect to software design and present a formal model for HAZOP.

According to Rausand, the HAZOP procedure can be illustrated as in Figure 1. Sections denote main components such as "reactor" or "storage", while nodes refer to parts of these main components, e.g. a "vessel" or a "pump", but also "operating instruction". Nodes can also be understood as specific locations in a process where (deviations of) the process or design are evaluated. For guidewords and parameters, see the next clauses.
¹ From: Marvin Rausand, "HAZOP" - presentation slides. Dep. of Production and Quality Engineering, Norwegian University of Science and Technology, [email protected]; Oct. 7, 2005
Figure 1: HAZOP workflow
1.2.3 Parameters

Parameters describe relevant properties of a process or its conditions. They can refer to physical aspects as well as to operational ones. Rausand lists the following examples of "process parameters":
Table 1: Process parameters for consideration in a HAZOP

| | | |
|---|---|---|
| Flow | Addition | Control |
| Pressure | Mixing | Measure |
| Temperature | Reaction | Sequence |
| Viscosity | Time | Signal |
| Composition | Speed | Start/Stop |
| Separation | Stirring | Operate/Maintain |
| Phase | Transfer | Services |
| Particle Size | pH | Communication |
1.2.4 Guidewords A guideword is a short word that shall help to create the notion of a deviation of the design/process intent:
Guideword + Parameter → Deviation Usually, basic and additional guidewords are distinguished.
[Figure 1 flowchart: divide the system into "sections"; for all sections, divide the section into "nodes"; for all nodes, and for each combination of guideword and parameter, ask "Any hazards or problems?"; if yes, record consequences and causes, and suggest …; if not sure, get more information; if no, continue.]
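The nested loops of the Figure 1 workflow, combined with the "Guideword + Parameter → Deviation" rule of clause 1.2.4, can be sketched in code. This is a minimal illustration only; the section, node, guideword, and parameter names below are invented for the sketch and do not come from the study.

```python
from itertools import product

# Illustrative system decomposition: sections contain nodes (names invented).
sections = {
    "observer": ["lens", "sensor"],
    "scene": ["light source", "object"],
}
guidewords = ["no", "more", "less", "as well as",
              "part of", "reverse", "other than"]
parameters = ["intensity", "position", "spectrum"]  # illustrative

def is_meaningful(guideword, parameter):
    """Placeholder filter: not every guideword fits every parameter."""
    return not (guideword == "reverse" and parameter == "intensity")

deviations = []
for section, nodes in sections.items():          # for all sections ...
    for node in nodes:                           # ... and all nodes,
        # combine every guideword with every parameter:
        for gw, param in product(guidewords, parameters):
            if is_meaningful(gw, param):
                # An expert team would now ask "any hazards or problems?"
                # and record consequences and causes for this deviation.
                deviations.append((section, node, gw, param))

print(len(deviations))
```

The point of the sketch is only the combinatorial structure: the number of candidate deviations grows as sections × nodes × guidewords × parameters, which is why the study filters out non-meaningful combinations (compare Figure 6).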
Table 2: HAZOP guidewords

| Guideword | Meaning | Example |
|-----------|---------|---------|
| **Basic** | | |
| No (not, none) | None of the design intents are achieved | No flow when production is expected |
| More (more of, higher) | Quantitative increase (of parameter) | Higher temperature than designed |
| Less (less of, lower) | Quantitative decrease (of parameter) | Lower pressure than normal |
| As well as | Qualitative increase (an additional activity occurs) | Other valves closed at the same time (logic fault or human error) |
| Part of | Qualitative decrease (only some of the design intention is achieved) | Only part of the system is shut down |
| Reverse | Logical opposite of the design intention occurs | Back-flow when the system shuts down |
| Other than | Complete substitution: another activity takes place | Liquids in the gas piping |
| **Additional** | | |
| Before / after | The step (or part of it) is effected out of sequence, relative to other events | Ignition is started before all valves are closed |
| Faster / slower | The step is not done with the right timing | Tank is filled faster than emptied |
| Where else | As for OTHER THAN when considering position, sources, or destination | Gas pipe is connected to oil tank |
Of course, not all guidewords fit all parameters.
1.3 Information-Oriented Approach for CV HAZOP
1.3.1 General Considerations

This approach tries to address the identification of criticalities for CV algorithms by assessing the loss of information on its way from the scene through the sensor to the algorithm, and relies on the following observations:
a) Conceptually, initially full and precise information about an observed scene exists.
b) However, any information about the observed scene available to the CV component is given by the electromagnetic spectrum (simply referred to as "light" in the rest of the paper) and received by the "observer" (i.e. the SUT's sensor). Hence, the information in the scene has to be converted to light in some way. An observer receives light mostly from a certain pyramid-shaped infinite volume, a fraction of the whole scene, the so-called Field of View (FOV) (with the exception of parasitic effects like internal lens reflections).
c) Each generation of light and interaction of light with the scene, as well as each sensing/processing step within the CV component, distorts and/or reduces this information. E.g., no light eliminates all information from the CV component's point of view.
d) The sensing process, i.e. the transformation of received light into digital data, further reduces and distorts the information carried by the received light.
e) Finally, the processing of these data by the CV algorithm may also lose or distort information (through rounding errors, integration, etc.).
Therefore, we distinguish two information carriers to be dealt with:
• light outside the system to be tested (SUT)
• digital data within the SUT
According to these considerations, the questions raised in clause 1.1 can be answered as follows:

System borders: outside the sensing process, everything that can influence the distribution of light, including light sources, has to be considered in the CV HAZOP, and therefore as within the system to be analysed. At the other end of the information-processing pipeline, the interface of the CV algorithm to the software consuming its output defines the "inner" system border.
Interfaces: three interfaces are of relevance: (a) between the observed scene and the sensor / sensing process, (b) between the sensing process and the CV algorithm, and (c) between the CV algorithm and following processes. (a) and (b) are the subject matter of the CV HAZOP, as far as potential risks for the correct scene interpretation are concerned. (c) is not directly addressed, as only the risks for correct scene interpretation by the CV algorithm itself are in the focus of the CV HAZOP.
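The three interfaces can be pictured as the boundaries between stages of a pipeline. The following toy sketch is purely illustrative; the stage functions, field names, and numbers are invented and are not part of the deliverable.

```python
# Toy pipeline; the boundaries between the stages mark the three interfaces.
def illuminate_and_reflect(scene):
    """Scene -> light. Interface (a) sits between this stage and the sensor."""
    return [obj["brightness"] for obj in scene]

def sense(light):
    """Light -> digital data: crude 8-bit quantization. Interface (b) feeds
    the CV algorithm with this output."""
    return [round(v * 255) for v in light]

def interpret(digital_data):
    """Digital data -> interpretation. Its output crosses interface (c),
    which is outside the scope of the CV HAZOP."""
    return {"bright_pixels": sum(1 for v in digital_data if v > 128)}

scene = [{"brightness": 0.9}, {"brightness": 0.2}, {"brightness": 0.7}]
result = interpret(sense(illuminate_and_reflect(scene)))
print(result)
```

The sketch also makes the scoping rule concrete: the CV HAZOP examines what can go wrong up to and including `sense` and `interpret`, but not how later processes consume `result`.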
Involvement of humans: humans can only be part of the observed scene. As long as only passive observation is concerned, the CV algorithm itself cannot directly be harmful to them². Active light, however, bears some risk to human eyes.
Unwanted results (clause 1.1): each misinterpretation of a scene is an unwanted result.
The following kinds of information distortion can be distinguished:

Reduction: parts of the information are lost, so that a (possibly infinite) set of initially different values becomes identical. It can be represented by a surjective mapping from the original to the reduced information.
Erasure: a special case of reduction, where the information is completely lost.
Windowing: a further special case for multi-valued information, where only a part of the values is completely lost.
Transformation: information is transformed in a way that conceptually allows a lossless retrieval of the original information, given that the transformation parameters are precisely known (which is usually not the case). An example is the red-shift of a frequency spectrum.
Blending: addition of other information. Knowing the added information precisely (which is usually not the case) would allow retrieving the original information.
In practice, combinations of these distortions are likely.
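The five distortion kinds can be illustrated on a toy one-dimensional "signal". All values below are made up for the illustration and carry no physical meaning.

```python
# Illustrating the five distortion kinds on a toy signal (made-up values).
signal = [0.12, 0.48, 0.51, 0.95]

# Reduction: a surjective mapping; initially different values may coincide.
reduced = [round(v, 1) for v in signal]      # 0.48 and 0.51 both map to 0.5

# Erasure: special case of reduction where the information is completely lost.
erased = [0.0 for _ in signal]

# Windowing: only part of a multi-valued signal survives (cf. the FOV).
windowed = signal[1:3]

# Transformation: lossless in principle, if the parameter (here: the shift)
# were precisely known, which it usually is not.
shift = 0.25                                  # stand-in for e.g. a red-shift
transformed = [v + shift for v in signal]

# Blending: addition of other information; invertible only if the added
# component were precisely known, which it usually is not.
noise = [0.01, -0.02, 0.03, 0.00]
blended = [v + n for v, n in zip(signal, noise)]

print(reduced, windowed)
```

Note how only reduction, erasure, and windowing destroy information outright; transformation and blending destroy it in practice because the transformation parameters and the blended-in component are unknown to the CV component.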
1.3.2 Information Path / Locations

Figure 2 sketches the possible paths and interactions of information from light sources to the CV algorithm, or the software implementing it, respectively.
Figure 2: CV HAZOP: locations and information paths
² Of course, it can cause harm indirectly by providing wrong information to succeeding actor processes, but this is caused by wrong scene interpretation, which again is the central part of the HAZOP.
[Figure 2 nodes: Light Source, Medium, Object, Observer (Sensor), Software (Algorithm).]
The nodes of this graph are denoted as “locations”. Evidently, only “light” is the information carrier between the locations before and up to the observer, while “digital data” are provided from the observer to the software.
Locations can "interact" with light in several ways:

Emission: sending out light which is not received.
Absorption: consumption of received light without emitting anything of it.
Reflection: sending light out from the same point (at an object's surface) where light is received.
Transmission: sending out light from another point (of an object's surface) than where light is received.
Refraction: influences transmission by changing direction, perhaps frequency-dependent.
Scattering: redistribution of received light into various directions, usually with a maximum direction and decreasing intensity with increasing deviation from the maximum direction.
Polarisation: usually, the oscillation planes of photons are arbitrarily oriented. Reflection in particular may align the oscillation planes of photons, potentially causing the appearance of virtual colours at the reflecting locations due to interference. In the case of circular polarisation, the oscillation plane rotates continuously.
Coherence: the oscillation phase is a further property of photons. Usually, the phases of photons emitted from a source are equally distributed, but special sources such as lasers emit equally phased or coherent light.
With digital data, locations can “interact” in the following ways:
• Generation: sending out digital data which is not received
• Processing: received data are (functionally) transformed according to given rules, and the results provided as output
• Storage & Retrieval: received data are stored, and can be read at a later point in time
• Consumption: receiving data without emitting anything of it
In the following, we try to characterize these locations.
1 / Light Source

Description: Light sources emit light into the scene under consideration. They can be located both outside and (partially) inside the scene. In the latter case, a light source is very likely co-located with at least one Object. But even if this is not the case, it may be necessary to model its geometry and physical properties such as extension, inner reflections etc. Physically, light sources are characterized by their location and the spectrum emitted into every possible direction. Since realistic light sources have extension (even tiny LEDs are not points in the physical sense), conceptually each point on/in them should be considered as a light source. For solid light sources with a non-transparent surface, it is sufficient to consider their surface; for others, e.g. fire, every possible point of them could be a light source. Of course, in practice it will be necessary to simplify their modelling to cope with the computational effort. In particular, if light sources are outside of the observable scene, in many cases it is sufficient to model them as spot lights or similar simple objects.
Influences on information: Although light sources are fundamental for generating the “information carrier” from scene to Observer, they also contribute to its distortion with their characteristics: in most cases, they have a specific emission spectrum which possibly varies with direction. If they are extended, parts of them may obscure light emitted from other parts; this property, however, shall be treated with their “object” nature. Hence, light sources interact with light only in one way:
• Emission
Further, emitted light might be polarized or even coherent.
Light sources apply the following distortions to the light emanating from their location:
• Reduction (frequency-dependent factor, describing the emission spectrum)
• Blending (with light received from other sources or parts of itself)
Interactions with other locations: It is assumed that light emanating from light sources always travels through the “medium” before it either hits the observer or a scene object. In particular, it is not considered that a light source directly touches the Observer. However, light sources are objects; hence, they will also interact through their object nature. There are also applications where the light source itself is the object of interest, rather than an illuminated object (e.g. car head-light tracking). Depending on this distinction between light as a necessity for lit objects and light as the object of interest, the criticalities differ in kind and severity. Further, light sources may receive light from the medium, which can affect their own emission characteristics.
2 / Medium

Description: The medium is the space in between the objects within the scene, the light sources, and the observer. It is mostly air, but could also be water or other liquids.
Influences on information: The medium affects light traversing it, and hence affects the information carried by it. The most common medium for CV applications is air. Air is very transparent and colour-neutral, as long as it does not contain tiny particles such as water drops (fog) or dust. Most indoor applications will therefore not need to consider its influence on light, while in outdoor applications it may play a (significant) role, in particular with increasing distance from the observer. Water at least has a clear dimming effect, and also attenuates light selectively, absorbing blue light less than other colours. Hence, the medium can interact with light in the following ways:
• Absorption
• Transmission
• Refraction
• Scattering
All effects besides transmission increase in proportion to the distance travelled, while transmission decreases with distance from the observer. The following distortions can be caused by the Medium:
• Reduction: the most common effect. Both air and water are more transparent to blue light than to other frequencies. A special case is windowing: some media absorb certain wavelengths completely.
• Transformation: some liquids and gases can retransmit received light at other frequencies.
• Blending: by means of refraction, the medium can bring light into not directly illuminated (e.g. shadowed) zones. Further, optical effects such as halos (which are caused by both refraction and reflection) can redistribute light in areas around light sources, or even opposite to them, such that some zones appear brighter than others.
Interactions with other locations: Medium interacts with all other scene constituents. It is even possible that two different media touch each other, e.g. air and water. At their interfaces, at least refraction will occur, due to their different refraction indices.
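The distance-proportional attenuation described for the medium can be made concrete with the Beer–Lambert law, which relates absorption and path length. The sketch below is illustrative only: the absorption coefficients are rough, made-up values for clear water, not measured data.

```python
import math

def transmitted_fraction(absorption_per_m: float, distance_m: float) -> float:
    """Beer-Lambert law: fraction of light surviving a path through the medium."""
    return math.exp(-absorption_per_m * distance_m)

# Illustrative (not measured) absorption coefficients per metre of clear water;
# blue is absorbed least, which is why underwater scenes look blue-shifted.
absorption = {"red": 0.30, "green": 0.05, "blue": 0.02}

# Fraction of each colour channel that survives a 10 m path:
received = {colour: transmitted_fraction(a, 10.0) for colour, a in absorption.items()}
```

Such a model captures the Reduction distortion of the medium: attenuation grows with distance, and grows at different rates for different wavelengths.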
3 / Object

Description: Objects establish the main constituents of scenes. They are clearly distinguished from media and light sources. A special case are tiny but numerous objects of the same kind, e.g. dust particles or water drops. It depends on the application whether it is preferred to model them by means of the medium rather than as individual objects. Also, “object” can include things we would intuitively
not describe as objects, such as the horizon or a rainbow. What matters is the semantic identity as a possible object of interest, not the physical manifestation.
Influences on information: Besides emission, objects can interact in all ways with light:
• Absorption
• Reflection
• Transmission
• Refraction
• Scattering
• Polarization
and hence may distort received light in several ways:
• Reduction: most surfaces reflect light with a frequency-dependent factor, which causes the impression of a specific colour.
• Erasure: an opaque object may cast shadows, in which case the incoming light is completely erased. (The medium may bring light through blending into such zones.) An opaque object may also occlude other objects (partially) from the position of the observer, in which case the light from the occluded parts of these objects towards the observer is erased.
• Blending: objects may blend incoming light from different sources into the same direction, e.g. through simultaneous reflection and scattering or transmission.
• Transformation: some materials retransmit at least certain incoming wavelengths at other, usually longer, ones. For instance, some white cloths emit incoming UV in the blue range.
Interactions with other locations: Objects interact mainly with the medium, but can interact with other objects in several ways:
− An object can (partially) occlude other objects.
− An object can cast shadows onto other objects.
− An object may reflect received light onto other objects.
− Two objects touch each other, and at least one transmits light (i.e. is at least partially transparent).
A light source which is part of the scene is likely co-located with an object modelling its housing. This may lead to a direct interaction between them if, for instance, a solid body is both emanating light and absorbing it. However, e.g. a bulb and a reflector around it do not interact directly (there is a medium between them).
4 / Observer (Sensor)

Description: The observer receives part of the light travelling through the scene via the medium. Today, it is usually realized by a sensor consisting of an optical and an electronic part: the optics serve to collect and focus light, while the electronics convert it into intensity information at several points (“pixels”) in the “field of view”. As such, it represents the interface between the “domain of light” – the scene – and the “domain of digitized information” – the CV algorithm. The set of intensity information for all pixels at a certain point in time is denoted as an “image”.
Note: Although sensor technologies for electromagnetic waves exist which provide only 1D data (along a scan line), values for different pixels only at different points in time, or analogue values, we regard images as two-dimensional arrays of pixels. Pixels are regarded as records of discrete values; the values usually denote intensities in different spectral bands, e.g. triples for red/green/blue intensities; all pixels of an image are of the same kind, and are assigned to the same timestamp3.
3 This does not necessarily mean that they originate from the same moment in time. Exceptions are rolling-shutter acquisition (common in consumer CMOS cameras) or HDR images which have been composed from several LDR images.
Influences on information: In principle, the only interaction the observer should have with light is absorbing it. However, its optical component can affect incoming light in a couple of ways, and the electronic part will contribute further distortions to the derived information too. In particular, the optics, as a transparent object, may interact with received light (besides emanation), causing the following distortions:
• Reduction: only part of the incoming light will be focused onto the (electronic) sensor; the rest will be absorbed or reflected. This may differ between pixels, e.g. with increased dimming towards the image borders.
• Transformation: different refraction of different wavelengths may cause spectrum splitting, creating coloured borders around objects.
• Blending: basically, the optics should not add light to that received. However, incoming light can be transferred to other locations within the image. Three typical optical effects, emanating from a strong light source located within an image, are shown in Figure 3 below: overexposure – erasing the trees close to the sun, rays, and lens reflections – with the (mostly hexagonal) shape of the shutter.
Figure 3: optical effects: overexposure, rays, inner lens reflections
from http://freeimagefinder.com/pic/nazareth.html "Optical Effects / Nicholas_T"
• Aberration: lens effects may distort the projection, e.g. forcing straight lines to be projected as curves. The literature refers to the five Seidel aberrations. These are the lens effects which do not follow the linear lens model (the pinhole model). They are Spherical Aberration, Coma, Astigmatism, Curvature of Field and Distortion. Most of them lead, by our definition, to Blending effects as described before. But e.g. Distortion, the most severe deviation from the pinhole model, is not a Blending effect and has to be considered separately (see [Szeliski+2011]).
• Vignetting: the image appears darker towards the edges.
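The Distortion aberration mentioned above is commonly approximated by a radial polynomial model (cf. [Szeliski+2011]). The sketch below uses made-up coefficients purely for illustration:

```python
def radial_distort(x: float, y: float, k1: float, k2: float = 0.0):
    """Apply the radial polynomial distortion model to normalized image
    coordinates (x, y): r' = r * (1 + k1*r^2 + k2*r^4).
    k1 < 0 gives barrel distortion, k1 > 0 pincushion distortion."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A straight horizontal line of ideal pinhole projections ...
ideal = [(u / 10.0, 0.5) for u in range(-5, 6)]
# ... is bent by barrel distortion (illustrative coefficient k1 = -0.2):
distorted = [radial_distort(x, y, k1=-0.2) for x, y in ideal]
```

With k1 < 0 the points are pulled towards the image centre, so a straight line of scene points no longer projects onto a straight line in the image.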
The electronic (photoelectric) sensor can cause the following deviations:
• Thermal noise: heating of the device during operation causes the generation of “artificial photons”.
• Dark current: similarly, thermal effects will cause that even in the full absence of incoming light some intensity will be measured.
• Missed photons: photons may be missed (spectral efficiency), resulting in a “darker” output than would be correct.
• Spectral response: the sensor discretizes the whole spectrum into digital values which depend on the sensor’s spectral efficiency.
• Temporal discretisation: the information is gathered by the sensor over a certain time span – the exposure time – and all information is hereby integrated.
• Blooming: a white stripe emanating vertically from an overexposed light source, mostly towards the bottom. It is caused by the fact that in a CCD device a pixel can hold only a limited amount of charge. If this is exceeded, the charge “overflows” towards the next pixel along the read-out direction, which is usually parallel to the short image edge.
Figure 4: sensor effect: blooming
from http://de.wikipedia.org/wiki/Blooming
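Several of the sensor deviations above (thermal noise, dark current, and saturation, the precursor of blooming) can be mimicked with a simple additive toy model. All parameter values are illustrative and not taken from any real sensor:

```python
import random

def sensed_value(photons: float, dark_current: float = 2.0,
                 thermal_sigma: float = 1.0, full_well: float = 255.0) -> float:
    """Toy sensor model: even with zero incoming photons, dark current and
    thermal noise produce a non-zero reading; very large inputs saturate
    at the full-well capacity (saturation is what precedes blooming)."""
    reading = photons + dark_current + random.gauss(0.0, thermal_sigma)
    return min(max(reading, 0.0), full_well)

random.seed(0)
dark_frame = [sensed_value(0.0) for _ in range(1000)]
mean_dark = sum(dark_frame) / len(dark_frame)
# The mean dark-frame value sits near the dark-current level, not at zero.
```

This illustrates the HAZOP entry “Sensor will receive no light, but thermal noise or dark current can cause wrong input”: a dark frame is not a zero frame.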
Another group of effects concerns the conversion of light into digital data; these can be divided into effects of the value domain and of the time domain:
• Precision: the representation of data used in most CV software is of limited accuracy, i.e. using either integers, fixed-point notations, or floating-point notations with a fixed number of bits for both mantissa and exponent. In general, the conversion from analogue to digital representations always suffers from the so-called quantization error.
• Non-linear conversion: ratios between light intensities can be mapped to different ratios between the digital data representing these intensities, in particular with unexpected results.
• Timeliness: in some cases it is possible that the ADC provides the conversion results too late (or – in rare cases – too early). This distortion is relevant for sensor techniques which provide their output in pixel-sequential order or event-driven, rather than for conventional CCD devices.
Again, all these effects can have different strengths for different pixels, although in general those differences are negligible.
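Both value-domain effects can be demonstrated in a few lines: a non-linear (gamma) mapping followed by rounding to a fixed number of levels. The function below is a toy sketch, not the transfer curve of any particular camera:

```python
def digitize(intensity: float, gamma: float = 2.2, bits: int = 8) -> int:
    """Non-linear (gamma) conversion followed by quantization: equal ratios
    of physical intensity do not map to equal ratios of digital codes, and
    the rounding step introduces an irrecoverable quantization error."""
    levels = (1 << bits) - 1
    encoded = min(max(intensity, 0.0), 1.0) ** (1.0 / gamma)
    return round(encoded * levels)

# Doubling the physical intensity does not double the digital code:
low, high = digitize(0.1), digitize(0.2)
```

Nearby intensities also collapse onto the same digital code, which is exactly the quantization error mentioned above.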
Interactions with other locations: Clearly, the optics interact with the medium, and the electronics interact with the software processing their output. The latter was just treated (precision … timeliness). The medium can affect the optics in two ways:
− The surface of the optics can be “contaminated”, e.g. by drops, condensation, dust, or a mixture of them.
− Different refraction indices of medium and optics can cause a difference between the direction where an object is actually located and the corresponding location in the image.
The interactions between optics and electronics have already been described under “influences on information”. For the software, the observer acts as a data generator.
5 / Software (Algorithm)

Description: This location is the actual target of the activities of which this work is part. We have to distinguish, though, between the algorithm and the software implementing it, for at least two reasons: (i) the software can be erroneous, i.e. not correctly implement the algorithm; (ii) even if it can be considered correct, it will only be an approximation, as numerical operations carried out by it can (and will) exhibit imprecision, in particular when processing of real or complex numbers is required. Therefore, we will consider the implementation rather than the underlying algorithm here, and will use the terms software and implementation synonymously.
The implementation is the only location in the information processing path considered here which receives (only) digital data as input. Usually, it will also output digital data, and we will only consider this type of output here.
Influences on information: We consider two categories of effects on the processed data: numerical (e.g. rounding) errors, common to all digital computer applications, and effects introduced “by purpose” by the implementation. While the first category can be subsumed under the reduction distortion, the second can be structured as:
• Reduction: only part of the data are used (filtering) or the data are accumulated (integration, average etc.). From the resulting data the input data are in general not reproducible.
• Precision: the representation of data used in most CV software might use a lower precision than what is supplied by the sensor.
• Transformation: data can be mathematically transformed, e.g. into another colour model. Surjective transformations will not be unambiguously invertible.
• Blending: for this location, blending is regarded as a sub-case of transformation.
Interactions with other locations: The interaction with the sensor has already been discussed. Other interactions are not possible.
2 HAZOP of CV Systems

This chapter contains a HAZOP of the system model outlined in Section 1.3, in order to identify potential criticalities for CV algorithms.
2.1 General Aspects
2.1.1 System Borders

The borders of the system are shown in the diagram at the beginning of Section 1.3.2: they include the objects in the scene, any medium and light source, and the sensor and software of the SUT. Probability and severity are domain-specific and thus not considered in this analysis, which looks for all potential criticalities.
2.1.2 Unwanted Results

Although the specific results to be avoided depend on the SUT, in general two categories can be distinguished:
• False Positives: the SUT should not report observations where no respective objects or situations are present.
• False Negatives: the SUT should not fail to report observations where respective objects or situations are present.
A third group, namely reporting some event A erroneously as event B, is considered as a combination of a false negative (not reporting A) and a false positive (reporting B). In addition, a quality measure for true positives is often of interest, in order to judge the overall performance of a CV algorithm and perhaps to compare it with others. A typical measure is the sum of squared distances between the exact responses (ground truth, GT) and the answers from the algorithm.
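As a minimal sketch of such a quality measure, the following function (the names are ours, not from any standard library) computes the sum of squared distances between ground-truth values and algorithm answers:

```python
def sum_squared_distance(ground_truth, responses):
    """Quality measure for true positives: sum of squared distances between
    the exact (ground-truth) responses and the algorithm's answers.
    Lower values indicate better agreement."""
    if len(ground_truth) != len(responses):
        raise ValueError("each response must be paired with a ground-truth value")
    return sum((g - r) ** 2 for g, r in zip(ground_truth, responses))

# Comparing two hypothetical detectors against the same ground truth:
gt = [10.0, 20.0, 30.0]
score_a = sum_squared_distance(gt, [10.5, 19.5, 30.0])  # small deviations
score_b = sum_squared_distance(gt, [12.0, 20.0, 28.0])  # larger deviations
```

Such a scalar score allows two algorithms to be ranked on the same test data, which is the comparison use case mentioned above.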
2.1.3 Guidewords

The guidewords chosen for the HAZOP are listed in Table 3, together with explanations in the context of CV.
Table 3: Used HAZOP guidewords

| Guideword | Meaning | Example |
| --- | --- | --- |
| Basic | | |
| No (not, none) | No information can be derived | No light at all is reflected by a surface |
| | The strength of the respective effect is not known | Position of light source not known |
| More (more of, higher) | Quantitative increase (of parameter) above expected level | Light spectrum has a higher average frequency than expected |
| Less (less of, lower) | Quantitative decrease (of parameter) below expected level | Medium is thinner than expected |
| As well as | Qualitative increase (additional situational element) | Two lights shine on the same object |
| Part of | Qualitative decrease (only part of the situational element) | Part of an object is occluded by another object |
| Reverse | Logical opposite of the design intention occurs | Light casts a shadow instead of providing light |
| Other than | Complete substitution – another situation encountered | Light source emits a completely different light texture than expected |
| Additional-Spatial | | |
| Where else | “Other than” when considering position, direction, or related aspects | Light reaches the sensor from an unexpected direction |
| Spatial periodic | Parameter causes a spatially periodic or regular effect | Light source projects a repeating pattern; aberration increases from the lens centre outwards |
| Spatial aperiodic | Parameter causes a spatially irregular effect | Texture on object shows a stochastic pattern |
| Close / Remote | Effects caused by a parameter when something is spatially close to / remote from something else | |
| In front of / Behind | Effects caused by a parameter according to position relative to other objects | One object completely occludes another object |
| Additional-Temporal | | |
| Before / After | The step (or part of it) is effected out of sequence relative to other events, or any other deviation from schedule | Flash is triggered after exposure of the camera terminated, or the camera iris opens too early |
| Faster / Slower | The step is not done with the right timing | Object moves faster than expected |
| Temporal periodic | Parameter causes a temporally regular effect | Light flickers periodically with 50 Hz |
| Temporal aperiodic | Parameter causes a temporally irregular effect | Intensity of light source has stochastic breakdowns |
2.1.4 Parameter Selection

During the process of deciding which parameters to use for a location, two aspects have to be considered: on the one hand, the set of parameters should allow statements to be made in an expressive, sometimes even abstract manner; on the other hand, the set should provide a complete characterization of the location. To fulfil both requirements, the set is allowed to be redundant to a certain degree. However, redundancy has to be limited to generate a feasible amount of entries. Therefore, a trade-off has to be found which incorporates expressiveness, completeness and feasibility.
An example is the selection of Field of View in Section 2.6, Observer – Optomechanics: the parameters shown in Table 12 are chosen according to the trade-off between systematic accuracy and feasibility. We knowingly chose them to be redundant to a certain degree for feasibility reasons. For example: on the one hand, one could argue that the parameter Field of View is entirely dependent on other low-level parameters (a function of focal length, sensor resolution, size of the photo element, and orientation and position of the observer). On the other hand, the concept of Field of View as “the section of the world which is captured by a sensor” is powerful: it allows for interpretation on a different level with a higher expressiveness. Still, the rule of cause and effect applies: changes in position or orientation have a direct influence on the Field of View. Therefore, their criticalities propagate.
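The claimed dependence of Field of View on lower-level parameters can be illustrated with the pinhole model; the focal lengths and sensor width below are arbitrary example values:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Pinhole-model field of view: FoV = 2 * atan(sensor_width / (2 * f)).
    Illustrates that FoV is a function of lower-level parameters
    (focal length and sensor size), as argued above."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# A shorter focal length on the same sensor widens the captured section of the world:
wide = horizontal_fov_deg(18.0, 36.0)   # a wide-angle lens on a 36 mm sensor
tele = horizontal_fov_deg(200.0, 36.0)  # a narrow telephoto view
```

Because FoV is a pure function of these parameters, any criticality affecting focal length or sensor geometry propagates directly to the Field of View, as stated above.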
2.1.5 References between Criticalities

Sometimes, different combinations of guideword and parameter for a given location lead to the same Meaning, Consequence, or potential Hazard. In this case, the following notation is used to refer to the same entries:
“See [Guideword]”: refers to application of same Guideword to the same parameter for current location. Example: See More
“See [Guideword(i)]”: as before, but helps to distinguish between different entries for same <guideword parameter location>. i denotes the number of the entry from top down.
“See [Guideword Param]”: refers to application of Guideword to Param of current location. Example: See Less Trans
“See [Location]”: refers to application of current guide word to current parameter of Location. Example: See l.s.
If for a combination no meaning could be found, “n/a” (not applicable) is entered.
2.1.6 HAZOP Terminology Remarks

In some fields, the word criticality is used to describe how critical an entry is (i.e. as a prioritization, to categorize how much damage an individual hazard might cause). In this document, we define a (vision) criticality to be a circumstance under which the performance or the output quality of a computer vision system can potentially be reduced. Each individual valid entry in the CV-HAZOP list for which a meaning was found is a criticality.
The difference between consequence and hazard is twofold:
• A consequence is not judgmental (i.e. it is considered neither harmful nor helpful, but just a causality); the hazard is always harmful to the system. This is useful as the harmfulness of consequences can depend on the intent of the system, which is not defined here since we are considering a generic computer vision case. The hazard, on the other hand, is more specialized and has exemplary applications in mind.
• A consequence often applies to the current location; the hazard often applies to the actual output result/performance of the whole system.
2.2 Light Sources
2.2.1 Parameters
Table 4: Light Source Parameters

| Parameter | Values / Range | Meaning |
| --- | --- | --- |
| Number | N | Number of (distinguishable) light sources |
| Position | 3D point | One emphasized point of the light source; usually its centre of gravity |
| Area | 3D extent | The radiating area of the light source |
| Spectrum | Frequency distribution | Colour, i.e. richness of the light source with respect to the emission spectrum (both spatial and temporal distribution) |
| Texture | Pattern of Spectrum | Contrast distribution of emitted light: one spectrum in the simplest case, a direction-dependent spectrum in the most complicated case |
| Intensity | R0+ | Scaling factor for Texture |
| Beam properties | angle | Opening angle and direction of a light beam; also includes its profile and cross-section, respectively. No precise distribution function needed for the HAZOP |
| Wave properties | polarization, coherence | Polarization: degree, angle, direction, rotation period. Coherence: degree, phase |
R0+ denotes a non-negative real value.
2.2.2 HAZOP

Note: probability and severity are domain-specific and thus not considered in this analysis, which looks for all potential criticalities.
Table 5: Light Source HAZOP

| Guideword | Par. | Meaning | Consequences | Hazards | Note |
| --- | --- | --- | --- | --- | --- |
| No | Num | No light sources | • No light available | • Sensor will receive no light, but thermal noise or dark current can cause wrong input | |
| More | Num | Many light sources (more light sources than expected) | • Too much light • Too few shadows | • Overexposure (of whole image) • Algorithms using shadows can be confused | Feasible in applications where an average number of (distributed) light sources is typical |
| Less | Num | Few light sources (fewer light sources than expected) | • Too faint light (in parts of the scene) • Too many shadows • Very sharp shadows | • Sensor will receive too faint light from some scene regions • Algorithms can be confused by shadows | Feasible in applications where an average number of (distributed) light sources is typical |
| As well as | Num | Mirrors fake additional light sources | • Light sources can appear at locations other than where they are • Increases shadow complexity • Also as “More” | • Algorithm confuses position of light sources • Algorithm detects more light sources than exist • Also as “More” | A mirror is a combination of light source and object |
| Part of | Num | Part of expected light source configuration is missing | • Light source configuration appears different than “usual” • Shadow configuration appears different than “usual” | • Scene-relevant light-source or shadow configuration not detected • Scene-relevant light-source or shadow configuration confused with other configuration | Examples: • Car with one head light • Multiple reflectors in application-specific configuration (e.g. flood light) |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
| --- | --- | --- | --- | --- | --- |
| Reverse | Num | n/a | | | |
| Other than | Num | n/a | | | |
| Where else | Num | Light sources are arranged differently than expected | • Characteristic arrangements are corrupted | • CV algorithms based on characteristic arrangements of lights fail | E.g.: the characteristic arrangement of a train head light (triangle arrangement) is lost if a fourth light occurs |
| Spatial periodic | Num | Several light sources are configured in a periodic manner | • Consequences depend on combined parameter | • Hazards depend on combined parameter | Attention! Is actually a combination of parameter “Number” with at least one other light source parameter |
| Spatial aperiodic | Num | See “spatial periodic” | • See “spatial periodic” | • See “spatial periodic” | See “spatial periodic” |
| Temporal periodic | Num | Number of light sources varies periodically | • Illumination changes predictably with time | • Illumination may change significantly between different exposures | Attention! Is to be considered with intensity and texture |
| Temporal aperiodic | Num | See “temporal periodic” | • See “temporal periodic” | • See “temporal periodic” | See “temporal periodic” |
| Close (spat.) | Num | n/a | | | |
| Remote (spat.) | Num | n/a | | | |
| Before (temp.) | Num | n/a | | | |
| After (temp.) | Num | n/a | | | |
| In front of (spat.) | Num | n/a | | | |
Guideword Par. Meaning Consequences Hazards Note Behind (spat.)
Num n/a
Faster Num n/a Slower Num n/a All following analyses are per l.s., if not otherwise indicated No Pos. Position of light source not
known • Light source is not in direct view or not detected as such
• An algorithm cannot relate a shadow from the scene to the respective light source and might confuse it with an object
No Pos No finite position known, hence infinitely far away (e.g. sun or moon in approx.)
• Light rays are parallel • No measurable dimming within “terrestrial” distances due to distance from l.s.
• Rendering: this approxima-tion can lead to shadows different from real ones
Example: sun or moon in approximation. Note: Hazard does not belong to this HAZOP, but is listed for being not forgotten
More Pos. L.s. far away from scene • Lighting of scene can be too weak
• Also: See No
• See No • also: See Less Num
Less Pos. L.s. near to observer
• Lighting of scene can be too strong
• Light intensity may decrease significantly within the scene (with increasing distance from the l.s.)
• Shadow direction may change significantly within the scene
• Over- and underexposure possible in the same scene
• Only parts close to the l.s. are sufficiently illuminated
• Different shadow directions may confuse the algorithm
• also: See More Num
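The strong intensity variation caused by a nearby light source follows from the inverse-square law. A minimal sketch (not part of the HAZOP tables; function name is illustrative):

```python
# Sketch: why a light source close to the scene produces large relative
# intensity differences across it, while a remote source illuminates the
# scene almost uniformly (point-source inverse-square law, I ~ 1/d^2).

def relative_intensity(d_near: float, d_far: float) -> float:
    """Ratio of irradiance at the nearest vs. farthest scene point."""
    return (d_far / d_near) ** 2

# L.s. 1 m from the nearest object, scene 2 m deep:
close_ratio = relative_intensity(1.0, 3.0)        # 9x brighter near the l.s.
# Same scene lit by a l.s. 100 m away:
remote_ratio = relative_intensity(100.0, 102.0)   # ~1.04x, nearly uniform

print(close_ratio, remote_ratio)
```

The 9:1 ratio in the first case is what makes simultaneous over- and underexposure within one scene plausible.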
As well as Pos. L.s. can only be described by several positions
• See Num • See Num
Part of Pos. Part of l.s. is visible
• Light source at the image’s edge looks different than in the middle
• L.s. may look strange
• Overexposure (of image parts)
• Reflections of optics in image
• Blooming
• L.s. may be confused with another l.s. by CV alg.
Reverse Pos. n/a
Other than Pos. See “Where else” (by definition of “Where else”)
Where else Pos. L.s. has a different position than expected
• Algorithms needing a l.s. pos. are not applicable
• Shadows and object visibility are different than expected
• See More & Less for Area (combination of both)
• The changed position will reduce the detection rate of an algorithm if it was trained using old scene data
Remark (for Pos = Other than): “Meaning” is covered by Temporal (a)periodic and More Area
Where else Pos. L.s. position cannot be assigned precisely (with a certain position)
• See Where else Pos. (1) • See Where else Pos. (1) See Where else Pos. (1)
Spatial periodic
Pos. n/a for sets of l.s. see Num Note: special configs of l.s. and slits (in objects) can cause special diffraction patterns
Spatial aperiodic
Pos. n/a for sets of l.s. see Num
Temporal periodic
Pos. L.s. position changes periodically
• Illumination and shading (shadowing) varies predictably
• Appearance and shading of same scene may change significantly but predictably (“learnable”)
Temporal aperiodic
Pos. L.s. position changes aperiodically
• Illumination and shading (shadowing) varies unpredictably
• Appearance and shading of same scene may change significantly and unpredictably
Close (spat.)
Pos. See Less
Remote (spat.)
Pos. See More
Before (temp.)
Pos. L.s. moves along a path earlier than expected
• See Other than • See Other than
Before (temp.)
Pos. L.s. moves away before Observer has started to gather information
• See Less • See Less
After (temp.)
Pos. L.s. moves along a path later than expected
• See Other than • See Other than
After (temp.)
Pos. L.s. moves into view after Observer has stopped gathering information
• See Less • See Less
In front of (spat.)
Pos. Two or more light sources in a row: The expected light source is in the back
• Wrong side of objects is lit
• Could be same as Less, depending on strength of other light source
• See Less
In front of (spat.)
Pos. L.s. is part of scene (in front of observer)
• L.s. can be directly visible from observer
• Overexposure (of image parts), local outshining
• Reflections of optics in image
• Virtual rays in image
• Blooming
Behind (spat.)
Pos. Two or more light sources in a row: The expected light source is in front
• Could be same as More, depending on strength of other light source
• See More
Behind (spat.)
Pos. L.s. behind Observer
• Objects illuminated with small angle between direction of light and direction of view
• Shadows hardly visible and short
• Little contrast on smooth surfaces
• Observer can cast shadow onto the scene (has to be modelled if important for appl.)
• Small irregularities on object surfaces with same colours as surroundings may remain undetected
• Transparent objects may remain undetected
• Reflecting areas oriented parallel to the image plane may appear overexposed
Faster Pos. L.s. moves faster than expected
• L.s. stays at a place for a shorter time than expected
• Relativistic effects, Doppler shift
• Too weak light
• Underexposure
• Motion blur
Slower Pos. L.s. moves slower than expected
• L.s. stays longer at a place than expected
• Too much light
• Overexposure
No Area Point-like l.s.
• No l.s. extent (needs to be computed)
• Extremely sharp shadows if not diffused by medium
e.g. distant stars, laser or high-intensity LED
More Area Large l.s.
• Complex illumination computation needed
• Diffuse shadows • Low contrasts
e.g. indirect light
Less Area L.s. with small but not point-like extent
• See No • See No e.g. light bulbs
As well as Area L.s. with both small and large extents
• Mixture of illumination models needed
• Overexposure • Anisotropic shadowing
e.g. neon tubes
Part of Area Part of l.s. (area) is visible (although Pos is outside)
• See Pos • See Pos
Reverse Area L.s. shadows the area (because it is an object too)
• Creates unexpected Texture
• Too weak light on scene
• See Less and Other than Texture
Other than Area n/a
Where else Area n/a
Spatial periodic
Area See Num • See Num • See Num
Spatial aperiodic
Area Instead of focusing the light in the wanted area, the light source spreads the light into different patches
• This creates a texture
• See More Texture
Temporal periodic
Area Radiating area changes periodically
• Illumination and shading (shadowing) changes predictably
• Appearance and shading of scene may change (significantly) but predictably (“learnable”)
Temporal aperiodic
Area Radiating area changes aperiodically
• Illumination and shading (shadowing) changes unpredictably
• Appearance and shading of scene may change (significantly) and unpredictably
e.g. open fire
Close (spat.)
Area n/a
Remote (spat.)
Area n/a
Before (temp.)
Area n/a
After (temp.)
Area n/a
In front of (spat.)
Area See Part of
Behind (spat.)
Area L.s. behind observer
• See “Behind Pos.”
• See “Behind Pos.”
• Large area may remove contrasts to a high degree (even completely)
• Small area may cause the observer to cast a large shadow onto the scene
Faster Area n/a
Slower Area n/a
No Spectrum Monochromatic light source
• Too little texture, as contrast is also created by different colours which are not visible under monochromatic light
• Colours in scene reduced to one frequency (strong information reduction)
• No colours can be distinguished in scene
• If observer not sensitive to emitted colour, underexposure will likely result
often in combination with “Less Beam”, e.g. laser
More Spectrum More continuous spectrum (l.s. has a very broad spectrum)
• Little influence on scene colours by l.s.
• Very broad spectrum means the intensity itself will be low, as much of the energy is wasted in regions the sensor cannot use
• Underexposure
“white light” is covered by this guideword/param. combination
Less Spectrum Less continuous spectrum (l.s. has a very sharp spectrum)
• Only a few “colours” emitted by l.s.
• Like “No”, if only one colour emitted
• Scene may appear in unusual colours
• If observer is sensitive neither to emitted light nor to frequencies to which l.s. light is transformed by some object/process, underexposure may result
As well as Spectrum Light source has additional peak in spectrum where it is not expected
• See Meaning
• Ultraviolet light in addition to visible light can cause unexpected effects in electronic sensor part of observer
• If additional peak is within the sensor’s response, this will create some unwanted overexposure
As well as Spectrum One l.s. has no or less spectrum and another has more
• Illumination characteristics depend on orientation and position of l.s.s
• Different kinds of illumination in different zones of scene can confuse observer
Part of Spectrum The light source only emits a part of the expected spectrum
• Skewed spectrum
• Objects appear in different colours than expected
• Algorithm can miss detections as the environment is very different from the expected/trained one
Reverse Spectrum The light source emits no light in the expected spectrum
• Insufficient light in scene • Underexposure
Other than Spectrum A completely different l.s. spectrum is emitted than usual in appl.
• See Part of
• Illumination characteristics differ significantly from expected
• CV algorithms relying on colour may be severely confused
Where else Spectrum Note: objects reemitting incoming light in other wavelengths to be treated in Object HAZOP
Spatial periodic
Spectrum The peaks within the light source spectrum are periodic
• Special properties of resulting light
• Interference
very special case; See also Texture
Spatial aperiodic
Spectrum n/a See also Texture
Temporal periodic
Spectrum Emitted spectrum changes periodically over time
• “Colouring” of scene changes predictably over time
• Colouring of captured image depends on period phase and frequency, but between single images it may vary significantly ⇒ “unsynchronized” CV alg. misinterprets scene “phase”
• After a constant number of captures the (almost) same colouring will reappear ⇒ learnable
Temporal aperiodic
Spectrum Emitted spectrum changes aperiodically over time
• “Colouring” of scene changes unpredictably over time
• Colouring of individual images depends on speed of changes, but may vary highly unpredictably ⇒ CV alg. misinterprets scene “phase”
• If variations are based on a statistical distribution with constant parameters, learning of mean colouring is possible
Close (spat.)
Spectrum Peaks in the light source spectrum are too close to each other so that they become indistinguishable
• Special colour contrasts in peak frequencies may be reduced
• Special observers (filters) may be affected
Rare case
Remote (spat.)
Spectrum Peaks in the light source spectrum which are expected to be close to each other are too far away, so that they become separated instead of giving a joint response
• Small textural light contrasts in scene may become exaggerated
• Special observers (filters) may be affected
Rare case
Before (temp.)
Spectrum A change in l.s. spectrum occurs earlier than expected
• The spectrum does not correspond to the expectation → See Other than
E.g.: An eclipse happens and is not expected by a system which can only deal with sunsets.
After (temp.)
Spectrum See Before
In front of (spat.)
Spectrum n/a
Behind (spat.)
Spectrum n/a
Faster Spectrum Light source is moving towards/ away from the observer
• Spectrum is skewed by relative speed of light source and/or observer → Doppler effect; away = redshift, towards = blueshift
• Different colouring of scene may confuse CV alg.
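The spectrum skew above is the relativistic Doppler shift. A minimal sketch (illustrative function name; the effect is negligible at everyday robot speeds and only relevant for extreme relative velocities):

```python
import math

# Sketch: relativistic Doppler shift of an emitted wavelength for a light
# source receding from (v > 0, redshift) or approaching (v < 0, blueshift)
# the observer along the line of sight.

C = 299_792_458.0  # speed of light [m/s]

def doppler_wavelength(lambda_emitted: float, v: float) -> float:
    """Observed wavelength; v is the recession speed along line of sight."""
    beta = v / C
    return lambda_emitted * math.sqrt((1 + beta) / (1 - beta))

green = 532e-9  # green laser [m]
print(doppler_wavelength(green, 0.1 * C))   # receding fast -> longer (redder)
print(doppler_wavelength(green, -0.1 * C))  # approaching   -> shorter (bluer)
```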
Faster Spectrum Spectrum changes faster than shutter time of observer
• Image captured by observer has mixture of l.s. spectra
• L.s. colour less significant for observer
Slower Spectrum n/a possible effects covered by other guidewords for Spectrum
No Texture Light source has no texture
• Homogeneous scene illumination with respect to l.s. colour
• Potential impact on dedicated CV algorithms (e.g. stereo on smooth objects with diffuse surface)
Normally this is wanted
More Texture Light source has too much texture
• The light source produces a texture of its own by projecting a textured light beam (virtual texture)
• Texture of emitted light is confused with texture on object. This creates false positive detections.
• CV algorithm may be confused by virtual textures
Less Texture The light source has insufficient texture
• Any structured light system dependent on texture fails
• Structured light approach is not able to identify projected pattern
As well as Texture Light source projects combination of two textures, one expected, the other unexpected
• Blending of lightings, complex illumination and shadowing
• Texture of emitted light mixes with texture on object. This provokes missed detections.
• CV algorithms may be hampered by this illumination, in particular:
• Small changes in l.s. configuration may cause large differences in responses of CV algorithm (e.g. Moiré)
Part of Texture Only a part of a projected textured light pattern is captured
• Missing texture information in the scene
• A structured light system is not able to identify the projection of a pattern
Reverse Texture n/a
Other than Texture L.s. emits completely different l.s. texture than expected
• Illumination characteristics (contrast) differ significantly from the expected
• CV algorithms relying on illumination texture may be severely confused
Where else Texture Texture is as expected, but is projected into the wrong direction
• See Beam • See Beam
Spatial periodic
Texture Periodic texture • Pattern reduces contrast on existing texture (interference more likely)
• “virtual” textures in scene can fake periodic textures
• CV algorithms can take virtual textures for real, in particular on smooth, monochrome surfaces
• e.g. Moiré
Spatial aperiodic
Texture Aperiodic texture • “virtual” textures in scene can fake stochastic textures
• CV algorithms can take virtual textures for real, in particular on smooth, monochrome surfaces
Temporal periodic
Texture Light texture changes periodically over time
• Illumination contrast changes predictably over time
• Appearance of captured image depends on period phase and frequency, but between single images it may vary significantly
Temporal aperiodic
Texture Light texture changes aperiodically over time
• Illumination contrast changes unpredictably over time
• Appearance of individual images depends on speed of changes, but may vary highly unpredictably
Close (spat.)
Texture Light source emits a texture that is very similar to that of a different light source
• Distinction between l.s. is hampered
• Algorithm confuses this light source with another light source
Remote (spat.)
Texture Light source emits a texture that is very different from the expected (and, e.g., trained) one
• See Other than • See Other than
Before (temp.)
Texture A texture is projected too early
• The wrong, or no, texture projection is captured
• Projection of texture (pattern) cannot be detected
In case of temporally changing structured light
After (temp.)
Texture See Before • See Before • See Before See Before
In front of (spat.)
Texture n/a
Behind (spat.)
Texture n/a
Faster Texture Texture changes faster than shutter time of observer
• Image captured by observer has mixture of l.s. contrasts
• L.s. contrasts less significant for observer
e.g. rotating texture
Faster Texture Texture changes faster than expected by CV alg.
• Sequence of images may show significantly different illumination characteristics
• Artefacts can occur because of temporal aliasing effects, whenever the texture is changing periodically and its frequency exceeds half the frame rate [Oppenheim+99]
• CV alg. may have problems with matching consecutive images
• CV alg. interprets the artefact rather than the measurement
e.g. rotating texture, e.g. recording a TV screen
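The temporal-aliasing condition above (texture frequency exceeding half the frame rate) can be sketched numerically; the function below is an illustrative assumption, computing the apparent frequency a sampled periodic signal folds down to:

```python
# Sketch: temporal aliasing when a periodically changing texture is sampled
# by a camera. If the signal frequency exceeds half the frame rate (Nyquist),
# the captured sequence shows a wrong, lower "alias" frequency.

def alias_frequency(signal_hz: float, frame_rate_hz: float) -> float:
    """Apparent frequency of a periodic signal sampled at frame_rate_hz."""
    f = signal_hz % frame_rate_hz
    return min(f, frame_rate_hz - f)

# A texture rotating at 24 Hz filmed at 25 fps appears to change at 1 Hz
# (the classic "wagon wheel" effect, cf. recording a TV screen):
print(alias_frequency(24.0, 25.0))  # -> 1.0
# At 10 Hz the signal stays below Nyquist and its frequency is preserved:
print(alias_frequency(10.0, 25.0))  # -> 10.0
```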
Slower Texture n/a
No Intensity L.s. is off
• No light from this l.s. in scene
• Underexposure if no other light sources on
• Captured camera noise leads to fake effects
More Intensity L.s. is too strong
• Too much light in scene
• Overexposure of lit objects
Less Intensity Similar to “No”
As well as Intensity Strong and weak light sources mixed
• Weak light sources are outshone by strong ones
• If light source should be detected by CV algorithm, this may be hampered
• Might exceed the intensity range of sensor
The range of a sensor is limited; auto exposure and auto aperture can compensate only as long as the range of the scene is sufficiently limited
Part of Intensity n/a
Reverse Intensity n/a
Other than Intensity Intensity of light source is completely different than expected by appl.
• Scene too bright or too dim
• Over- or underexposure of relevant objects or scene elements
Where else Intensity n/a Spatial periodic
Intensity See Texture
Spatial aperiodic
Intensity See Texture
Temporal periodic
Intensity L.s. intensity changes periodically over time (exposure time interval not synchronized with the light source)
• Brightness of illuminated scene changes periodically over time
• Brightness of captured image depends on period phase and frequency, but between single images it may vary significantly
After a constant number of captures the (almost) same brightness will reappear ⇒ learnable
Temporal aperiodic
Intensity L.s. intensity changes aperiodically over time
• Brightness of illuminated scene changes aperiodically over time
• Brightness of individual images depends on speed of changes, but may vary highly unpredictably
If variations are based on a statistical distribution with constant parameters, learning of mean brightness is possible; e.g. clouds
Close (spat.)
Intensity Two light sources have close intensity values
• They are not distinguishable by their intensity
• Algorithm is confused and misclassifies
E.g.: a star (sun) and a planet of similar intensity
Remote (spat.)
Intensity n/a
Before (temp.)
Intensity Light source is switched off before a picture was taken
• See No
• See No
E.g.: flash is triggered after camera exposure has terminated
After (temp.)
Intensity Light source is switched on after a picture was taken
• See No • See No See Before
After (temp.)
Intensity Light source takes too much time to reach the expected intensity after being switched on
• Increases runtime if recognized by the algorithm; see Less otherwise
• See Less
After (temp.)
Intensity The aging light source delivers a different result than at its start
• Change of texture over a longer time scope
• Algorithm is confused by changing light source into thinking that the objects have changed
In front of (spat.)
Intensity See Pos
Behind (spat.)
Intensity See Pos
Faster Intensity Intensity changes faster than expected.
• Image brightness depends on exposure time and duration
• Over- or underexposure
• Observer electronics (exposure control) gets confused when trying to compensate
Example: arc lamps powered by AC flicker at 50 Hz
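The flicker hazard can be made concrete by integrating a flickering intensity over an exposure window: depending on where the window falls within the flicker period, consecutive frames come out differently bright. A sketch under the assumption of a rectified-sine intensity profile (as from mains-powered lamps):

```python
import math

# Sketch: a short exposure under a flickering light source. The integrated
# brightness depends on the phase at which the exposure window starts, so
# frame-to-frame brightness varies and exposure control may oscillate.

def frame_brightness(t_start: float, exposure: float, flicker_hz: float,
                     steps: int = 1000) -> float:
    """Mean of a rectified-sine light intensity over one exposure window."""
    total = 0.0
    for i in range(steps):
        t = t_start + exposure * i / steps
        total += abs(math.sin(2 * math.pi * flicker_hz * t))
    return total / steps

# 1 ms exposure under a 100 Hz intensity flicker (50 Hz mains, rectified):
bright = frame_brightness(0.0025, 0.001, 100.0)  # window around a peak
dark = frame_brightness(0.0000, 0.001, 100.0)    # window around a zero crossing
print(bright, dark)  # noticeably different frame brightness
```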
Slower Intensity Intensity changes slowly over time, e.g. at sunset or sunrise
• Scene brightness changes significantly over longer time intervals
• CV alg. may miss resulting scene changes
No Beam Beam width is infinitely small • This l.s. does not contribute to the scene illumination
• As Num if Num==1
More Beam Large beam angle, even omni-directional emission of light
• All objects will be lit
• Reflections in all shiny surfaces possible
e.g. sun; part of “l.s. : obj.” HAZOP
Less Beam Focused beam • Only fractions of objects will be lit
• Large parts of scene may be dark
• Unsmooth illumination of surfaces
• Diffraction patterns possible
e.g. laser or spot
As well as Beam Light source produces multiple beams, or the light beam has an elliptical profile
• Complex illumination model needed
• Strange scene illumination • Confusing shadowing
e.g. lighthouse ray
Part of Beam n/a
Reverse Beam Beam angle is broad where it should be thin and the other way around
• See Less/More • See Less/More
Reverse Beam Beam is pointing in the opposite direction than expected
• See Less/More • See Less/More
Other than Beam Beam points in wrong direction
• Too weak light on object • See Less Intensity
Other than Beam No preferred light direction (“shining medium”)
• Light comes from “everywhere”
• No shadows
e.g. in special physical conditions (Cherenkov radiation), but water and fog can have a similar effect. An ambient lighting term is often part of the rendering equation to hide the absence of global illumination
Where else Beam The beam points in the wrong direction
• The illuminated area is displaced from the intended position
E.g.: laser triangulation
Spatial periodic
Beam See Texture for sets of l.s. see Num.
Spatial aperiodic
Beam See Texture for sets of l.s. see Num.
Temporal periodic
Beam Shape and direction of beam (width) changes periodically
• Illumination and shading (shadowing) changes predictably
• also see Texture
• Appearance and shading of scene may change (significantly) but predictably (“learnable”)
• also see Texture
Temporal aperiodic
Beam Shape and direction of beam (width) changes aperiodically
• Illumination and shading (shadowing) changes unpredictably
• also see Texture
• Appearance and shading of scene may change (significantly) and unpredictably
• also see Texture
Close (spat.)
Beam See Less
Remote (spat.)
Beam n/a
Before (temp.)
Beam n/a
After (temp.)
Beam n/a
In front of (spat.)
Beam See Pos
Behind (spat.)
Beam See Pos
Faster Beam n/a
Slower Beam n/a
No Wave Light is neither polarized nor coherent
• No interference between light from different sources
• Distinction between light from different sources by polarization is impossible
• Contrast increase with polarization filter will be diminished ⇒ loss of contrast and thus misdetections
• Observers relying on polarization or coherence will not work
More Wave Light of some source is more polarized or coherent than expected
• Interference effects like diffraction rings may appear in the scene
• Extinction of light intensity when observed with polarization filters
• Interference effects can disturb perception, e.g. may be confused with texture or objects, or obfuscate details
• Underexposure (polarization may diminish light)
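The extinction of polarized light behind a polarization filter is described by Malus’s law, I = I₀·cos²θ. A minimal sketch (illustrative function name):

```python
import math

# Sketch: Malus's law. Intensity of linearly polarized light transmitted by
# a polarization filter rotated by theta relative to the polarization plane.
# A crossed filter (theta = 90 deg) gives near-total extinction, i.e. the
# underexposure hazard noted for strongly polarized light sources.

def transmitted_intensity(i0: float, theta_deg: float) -> float:
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(transmitted_intensity(1.0, 0.0))    # aligned filter: full transmission
print(transmitted_intensity(1.0, 90.0))   # crossed filter: (near) extinction
print(transmitted_intensity(1.0, 45.0))   # half intensity
```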
Less Wave Light of some source is less polarized or coherent than expected
• See No • See No
As well as Wave L.s. is coherent as well as polarized
• Effects as described under More will combine
• Scene may look completely different than expected
e.g. laser; depends strongly on objects interacting with the light
As well as Wave The light of a coherent/polarized l.s. gets mixed with light having a different wave property
• See No
• See No
Exception: two, for example, orthogonally polarized light sources do not affect each other
Part of Wave n/a
Reverse Wave Wave properties of emitted light are inverted relative to expectations
• Polarization plane changed by 90°, or, if circular, reversed rotation direction
• Observed effects can become inverse to expected ones
Other than Wave Wave property is other than expected
• Wave is linearly polarized instead of circular, or polarized instead of coherent
• See Reverse
Interaction with surfaces: part of objects HAZOP. E.g. reflection splits the beam into a beam polarized along the object surface axis and a perpendicular-axis beam; see Brewster’s angle (this happens at the object)
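The polarization-by-reflection mechanism mentioned in the note is governed by Brewster’s angle, θ_B = arctan(n₂/n₁). A minimal sketch (illustrative function name):

```python
import math

# Sketch: Brewster's angle. Light reflected off a dielectric surface at this
# incidence angle is completely polarized parallel to the surface, which is
# why reflection can split a beam into differently polarized components
# (this happens at the object, not the light source).

def brewster_angle_deg(n1: float, n2: float) -> float:
    """Brewster's angle for light travelling from medium n1 into medium n2."""
    return math.degrees(math.atan2(n2, n1))

print(brewster_angle_deg(1.0, 1.33))  # air -> water: ~53.1 degrees
print(brewster_angle_deg(1.0, 1.5))   # air -> glass: ~56.3 degrees
```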
Where else Wave n/a
Spatial periodic
Wave Wave properties depend periodically on the part of the light source where the light is emitted
• Polarization plane depends periodically on the location within the light source area where it is emitted
• Visual effects depend on scene configuration, including observer location
• Object appearance can be periodically modified in a way the observer is not able to cope with
• Theoretically, these effects are predictable
Spatial aperiodic
Wave Wave properties depend irregularly on the part of the light source where the light is emitted
• Polarization plane depends stochastically on the location within the light source area where it is emitted
• Same as Spatial periodic, but in an unpredictable way
Temporal periodic
Wave Wave properties of emitted light are periodically modified
• Linear polarization plane rotates regularly over time
• Visual effects depend on image capture time on Observer
• Object appearance can be periodically modified in a way the observer is not able to cope with
Theoretically, these effects are predictable; e.g. observation of a stereo projection using temporally interleaved polarised frames
Temporal aperiodic
Wave Wave properties of emitted light are irregularly modified.
• Linear polarization plane rotates irregularly over time
• Visual effects depend on image capture time on Observer
• Object appearance can be modified in a way the observer is not able to cope with
• These effects are not predictable
e.g. unstable laser
Close (spat.)
Wave n/a
Remote (spat.)
Wave n/a
Before (temp.)
Wave Generation of special wave properties happens earlier than expected
• E.g. switch of polarization plane
• If synchronization with observer or other object in scene is needed: expected effect may not occur
E.g.: a pulsed laser is out of sync
After (temp.)
Wave Generation of special wave properties happens later than expected
• See Before • See Before See above
In front of (spat.)
Wave n/a
Behind (spat.)
Wave n/a
Faster Wave n/a
Slower Wave n/a
2.3 Medium
2.3.1 Parameters
Table 6: Medium Parameters
Parameters Values / Range Meaning
Transparency R Dimming factor per wavelength and distance unit
Spectrum Frequency distribution Colour, i.e. richness of medium with respect to absorption spectrum (isotropic or anisotropic)
Texture Pattern of Spectrum Generated by density fluctuations and at surfaces (e.g. water waves)
Wave properties Polarization, coherence
Particles <distrib., size, optical prop.> Effects and influences of particles within the medium
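Under the assumption that the Transparency parameter (dimming factor per wavelength and distance unit) acts like a Beer–Lambert attenuation coefficient, the surviving light fraction after a path through the medium can be sketched as:

```python
# Sketch (assumption, not taken from the deliverable): a per-unit-distance
# transparency t dims light multiplicatively, so the fraction surviving a
# path of length d is t**d (equivalent to exp(-mu * d), Beer-Lambert law).

def transmitted_fraction(transparency_per_unit: float, distance: float) -> float:
    """Fraction of light surviving `distance` units of the medium."""
    return transparency_per_unit ** distance

clear_air = transmitted_fraction(0.999, 100.0)  # ~90% survives 100 m
fog = transmitted_fraction(0.90, 100.0)         # ~0.003% survives 100 m
print(clear_air, fog)
```

The exponential falloff explains why a seemingly mild per-metre dimming factor can make the medium effectively opaque over scene-scale distances (cf. the "No Transp." row below the Table 7 header).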
2.3.2 HAZOP
Table 7: Medium HAZOP
Guideword Par. Meaning Consequences Hazards Note
No Transp. Medium is not transparent
• No light can pass through medium
• The scene is neither illuminated nor is the sensor capturing any light
• Observer will not detect/react to anything engulfed within the medium
• Is like a blocking object
E.g.: extreme fog, polluted water
More Transp. Medium is clearer/more transparent than expected
• More light can pass through
• Increases contrasts
• The object is illuminated to a greater extent, therefore more affected by the properties (e.g. non-uniformities) of the l.s.
• More contrast than expected could result in mismatches
• Overexposure • See l.s./Intensity/More
E.g.: Atmospheric dimming hides unnecessary details and contrasts in the far distance. If the atmosphere is very clear, much more optical clutter from the background obscures the objects in the foreground.
E.g.: In a cloudy outdoor scene diffuse light dominates. Without clouds the light is more directed => specular reflections have a severe effect.
Less Transp. Medium is optically thicker than expected
• Less light can pass through • Reduces contrasts
• Less contrast than expected could result in mismatches
• Underexposure • See l.s./Intensity/Less
As well as Transp. Two different media have same optical thickness
• It is not possible to use light intensity passing through the two media to distinguish between them
• Refraction index of both media is the same
• Borders between media hard/impossible to detect
• Objects may remain undetected
As well as Transp. Two different media have a different optical thickness
• Refraction occurs: changes the path of light to the object
• Refraction occurs: changes the path of light from the object to the observer
• Changed light-direction-dependent phenomena (shading, specular reflections …)
• The object appears to be displaced
E.g.: An object in water, observer above the water surface
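The displacement hazard above follows from Snell's law at the boundary between two media with different refractive indices. A minimal sketch; the water/air indices are standard approximate values:

```python
import math

def refracted_angle(theta_incident_rad, n1, n2):
    """Snell's law n1*sin(t1) = n2*sin(t2); returns the refracted
    angle in radians, or None for total internal reflection."""
    s = n1 * math.sin(theta_incident_rad) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection
    return math.asin(s)

# Light leaving water (n ~ 1.33) into air (n ~ 1.00) bends away from
# the normal, so a submerged object appears displaced to the observer.
t2 = refracted_angle(math.radians(30), 1.33, 1.00)
print(round(math.degrees(t2), 1))  # ~41.7 degrees
```

A CV algorithm that back-projects pixel rays without modelling the boundary will locate the object along the unrefracted ray, i.e. at the wrong position.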
Part of Transp. Part of medium has different transp. than expected ⇒ textured ⇒ See Texture entries
Reverse Transp. The transparency is reversed (See Less or More)
Other than Transp. Medium has an unexpected Transparency
• All effects caused by other keywords possible → see there
• All hazards described at other keywords possible → see there
Where else Transp. Medium is non-uniformly transparent ⇒ unexpected pattern ⇒ see Texture entries
Spatial periodic
Transp. See Texture
Spatial aperiodic
Transp. See Texture
Temporal periodic
Transp. Transparency changes periodically with time
• Visibility/appearance of scene changes periodically with time
• CV alg. calibrated for a certain transparency has a periodically (and predictably) changing response quality
Temporal aperiodic
Transp. Transparency changes aperiodically with time
• Visibility/appearance of scene changes aperiodi-cally with time
• CV alg. calibrated for a certain transparency has an aperiodically changing response quality, not predictable
Close (spat.)
Transp. At close range the transparency of a medium does not influence the light enough to be distinguishable
• Example: Testing liquids by their colour might fail if the liquid quantities are very small because their colour is very faint
Remote (spat.)
Transp. Transparency decreases with (increasingly) remote distances
• Medium transparency is declining with distance (transp. is a function of the thickness of medium, which increases with distance)
• Distant objects too faint to be detected
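The fading of distant objects can be sketched with a Koschmieder-style exponential contrast decay; the extinction coefficient below is an illustrative assumption:

```python
import math

def apparent_contrast(inherent_contrast, extinction_coeff, distance):
    """Koschmieder-style model: apparent contrast of an object viewed
    through a uniformly scattering medium decays exponentially."""
    return inherent_contrast * math.exp(-extinction_coeff * distance)

# Illustrative extinction coefficient of 0.02 per metre (fog-like)
for d in (10, 50, 150):
    print(d, "m:", round(apparent_contrast(1.0, 0.02, d), 3))
```

Once the apparent contrast drops below the detector's contrast threshold, the object is "too faint to be detected" as stated in the row above.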
Before (temp.)
Transp. n/a
After (temp.)
Transp. n/a
In front of (spat.)
Transp. See combination of Remote and More
Behind (spat.)
Transp. See combination of Remote and Less
Faster Transp. A medium is moving very fast
• Streaming effects smear out
• Medium cannot be distinguished from (solid) objects
Examples: fog that moves very fast looks more uniform, water jet looks like a solid object Note: actually applied to medium as a whole rather than only to its transparency, but is mostly influenced by it
Slower Transp. A medium is moving very slowly
• Streaming effects remain distinguishable
• Medium is detected as a distinct feature
good case
No Spectrum Medium has no distinct colour
• Medium does not change colours of objects seen through it
• If transparency is high, medium may remain undetected although it should be detected
More Spectrum Additive spectrum mixing caused by medium
• Skewed colours
• Failed detection as training data was obtained under unfiltered light
Less Spectrum Medium acts as a filter and changes the light’s spectrum
• Skewed colours • Failed detection – since training data was obtained under unfiltered light
As well as Spectrum Medium has similar colour as nearby l.s./object
• Low contrast • Objects and medium become indistinguishable
Part of Spectrum See Less
Reverse Spectrum Medium has very different colour than nearby l.s./object
• High contrast • Creates edges where none are expected
Other than Spectrum See Reverse
Where else Spectrum Medium shifts transmitted spectra
• Shifted colours
• Failed detection as training data was obtained without medium
Spatial periodic
Spectrum See Texture
Spatial aperiodic
Spectrum See Texture
Temporal periodic
Spectrum n/a
Temporal aperiodic
Spectrum n/a
Close (spat.)
Spectrum See As well as
Remote (spat.)
Spectrum See Reverse
Before (temp.)
Spectrum n/a
After (temp.)
Spectrum n/a
In front of (spat.)
Spectrum n/a
Behind (spat.)
Spectrum n/a
Faster Spectrum See Trans
Slower Spectrum See Trans
No Texture Medium is featureless
• Medium does not affect visual appearance of objects
• If also transparency is high, medium may remain undetected although it should be detected
Usually the intention of a CV system.
More Texture Medium is more texturized than expected
• Object appearance can be significantly distorted, e.g. fragmented
• Medium has an unexpected non-uniformity that casts a shadow
• Object recognition hampered
• Medium can be misinterpreted as an object
• See l.s. More Texture
Less Texture Medium has less non-uniformity than expected
• See No • See No
As well as Texture Medium has a surface/non-uniformity that looks like an object/l.s.
• See Meaning • Non-existing objects detected
As well as Texture Medium has a surface/non-uniformity that looks like another medium
• See Meaning • Media are confused by CV alg.
E.g.: thin (hot) oil and water; hot water and cold water
Part of Texture n/a
Reverse Texture Texture of medium is inverted
• See Other than • See Other than
Other than Texture Texture of medium is different than expected
• Influence of medium on scene appearance is unexpected
• Algorithms trained for a certain medium are confused
Other than Texture See As well as (1) • See As well as (1) • See As well as (1)
Other than Texture See As well as (2) • See As well as (2) • See As well as (2)
Where else Texture See Other than • See Other than (1) • See Other than (1)
Spatial periodic
Texture Texture of medium is periodic (periodic density fluctuations)
• Interference with other textures more likely
• Medium projects periodic texture onto surfaces
• Transparency changes periodically throughout medium (in one or several directions)
• Can create pattern/texture where there is no object
• Projected spectrum may be misinterpreted as object spectrum
• Confusion with object texture by CV alg. possible
• One extended object may appear as several (through medium zones with more transparency)
• Misinterpretation of medium texture as object texture
See Spectrum
Spatial aperiodic
Texture Texture of medium is aperiodic (aperiodic density fluctuations)
• Transparency changes aperiodically throughout medium (in one or several directions)
• Decreases contrast
• Opaque zones could be misinterpreted as solid objects
• Scene appearance strongly depends on observer location and viewing direction
Examples:
• Shadows of water waves at a pool floor
• Clouds
Temporal periodic
Texture Texture of medium changes periodically over time
• Texture changes predictably over time ⇒ scene appearance changes between consecutive capturings
• See l.s.
Temporal aperiodic
Texture Texture of medium changes aperiodically over time
• Texture changes randomly over time ⇒ scene appearance changes between consecutive capturings
• See l.s.
Close (spat.)
Texture See As well as/More E.g.: Dirt on a transparent camera casing
Remote (spat.)
Texture See Other than
Before (temp.)
Texture n/a
After (temp.)
Texture n/a
In front of (spat.)
Texture n/a
Behind (spat.)
Texture n/a
Faster Texture See Transparency
Slower Texture See Transparency
No Wave Medium destroys wave property (no coherency and no polarization afterwards)
• If observer needs wave property to distinguish between different l.s./objects this medium would reduce information
• Failed identifications due to missing wave properties
More Wave Medium increases coherency or polarizes the light
• Corresponding effects emanate or are increased
• See l.s.
Less Wave Medium decreases coherency or reduces the polarization of the light
• See No • See No
As well as Wave Medium polarizes light as well as decreases coherency
• See More • See More
Part of Wave n/a
Reverse Wave Wave property of passing light is inverted.
• E.g. unpolarised light becomes polarized, or incoherent light becomes coherent.
• See l.s.
Other than Wave Wave property of passing light is changed (in an unexpected way)
• E.g. polarization plane of light is changed by 90°, or circular polarization reverses rotation direction when passing through the medium
• See Reverse E.g. Faraday effect
Where else Wave n/a
Spatial periodic
Wave Wave properties of passing light are modified by medium in a spatial periodic or regular way
• E.g. polarization plane depends periodically on the location in the medium where the light passes through
• See l.s. Polarisation depending on density of medium that decreases with pressure
Spatial aperiodic
Wave Wave properties of passing light are modified by medium in a spatial irregular way
• E.g. polarization plane depends stochastically on the location in the medium where the light passes through
• See l.s. Polarisation depending on temperature of medium
Temporal periodic
Wave Wave properties of passing light are modified by medium in a temporally periodic way
• See l.s. • See l.s. See spatial periodic, if pressure/temperature changes periodically (e.g. day/night cycle)
Temporal aperiodic
Wave Wave properties of passing light are modified by medium in a temporally irregular way
• See l.s. • See l.s. See spatial periodic, if pressure/temperature changes aperiodically
Close (spat.)
Wave n/a
Remote (spat.)
Wave Medium changes wave properties with increasing distance
• Light is polarized by scattering in medium (larger distance means more polarization)
• Colours and contrasts depend on observer orientation if it uses polarization filter
Example: sunlight is polarized. Normally a good case for navigation (bees)
Before (temp.)
Wave Change of wave properties of transmitted light happens before it is expected
• See l.s. • See l.s.
After (temp.)
Wave Change of wave properties of transmitted light happens after it is expected
• See l.s. • See l.s.
In front of (spat.)
Wave n/a
Behind (spat.)
Wave n/a
Faster Wave n/a
Slower Wave n/a
No Particles No particles in the medium
• No particles in the medium which scatter transmitted light
• The medium has no texture
• If particles are needed, e.g. to visualize flow dynamics, this will be hampered
In general a good case; only in special applications a potential risk
No Particle No information about Particles
• It is not possible to decide if there are any particles in the medium
• Correct recognition of medium is hampered or impossible
Particles should help to recognize existence or nature of medium. Example: drinking glass filled with water: without bubbles it can be impossible to decide whether there is water in it or it’s empty
More Particles More particles in the medium than expected
• Particles in the medium scatter light and reduce transparency
• Scene gets clustered/congested by particles in medium
• Object recognition is (severely) reduced
• Particles are misinterpreted as texture
• Stereo vision can be heavily hampered
Examples: snowflakes, raindrops, small hailstones
• Distance of sight is decreased
• See Less Trans
• See Less Trans
More Particles Particles are large(r than expected)
• Particles appear as distinct objects
• Particles are misinterpreted as objects
Examples: large hailstones or snowflakes
More Particle Particle size is bigger than the light’s wavelength
• Geometric Scattering • See Less Trans or More Texture
Less Particles Less particles in the medium (than expected)
• Too few particles for recognizing them as part of an “ensemble”
• Background is more visible
• Distance of sight is increased
• See Trans
• Particles are misinterpreted (if recognized at all)
• See Trans
Examples: a small number of snowflakes appearing as insects
Less Particles Smaller particles than expected
• Actual visual effects differ from anticipated ones
• CV alg. misinterprets unexpected visual effects
Example: small water drops produce rainbow effects more effectively than larger ones
Less Particles Particle size is smaller than the light’s wavelength
• Rayleigh Scattering – colour of transmitted light is changed
• Misinterpretation of colours
Reason for blue sky colour
As well as Particles Particles of different kinds occur simultaneously
• Mixture of appearances and optical effects
• Reason of disturbances remains undetected by CV alg.
Example: fresh snow and raised ice crystals
As well as Particles Particles are clustered together
• Particle clusters can become arbitrarily large
• Particle clusters are interpreted as real (relevant) objects
Snowflakes assemble on the ground to form a layer of snow
As well as Particles Particles are of the same size as the light’s wavelength
• Mie Scattering – fog effect • Vision hampered Reason for white cloud colour
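The Rayleigh/Mie/geometric distinction used in the rows above is commonly made via the dimensionless size parameter x = 2πr/λ. A rough classifier sketch; the regime boundaries are the usual textbook approximations, not exact thresholds:

```python
import math

def scattering_regime(particle_radius_m, wavelength_m):
    """Classify scattering by the size parameter x = 2*pi*r/lambda
    (rough, commonly used boundaries)."""
    x = 2 * math.pi * particle_radius_m / wavelength_m
    if x < 0.1:
        return "Rayleigh"   # particle much smaller than the wavelength
    if x < 50:
        return "Mie"        # particle comparable to the wavelength
    return "geometric"      # particle much larger than the wavelength

green = 530e-9  # green light, metres
print(scattering_regime(1e-10, green))  # molecule scale
print(scattering_regime(5e-7, green))   # cloud-droplet scale
print(scattering_regime(1e-3, green))   # raindrop scale
```

This makes explicit why the same medium can produce colour shifts (Rayleigh), fog-like whitening (Mie), or distinct visible particles (geometric) depending only on particle size.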
Part of Particles Particles break up and form a spray of smaller particles
• See More Particles than expected and Less Particle size than expected
• See More Particles than expected and Less Particle size than
Raindrop hits an edge and splashes into a thin spray of water
expected
Reverse Particles n/a
Other than Particles Other particles than expected
• Scene appearance not as anticipated
• See More for “other than no particles”
• Preconfigured CV alg. features can lead to wrong scene interpretation or high noise in results
Other than Particles Different types of particles are mixed in the medium
• Different visual effects combined
• Increased number of recognition errors
Snow and rain, dust and rain…
Where else Particles Particles fill up different parts of the scene with different density
• Different areas of scene exhibit different visual effects
• Different recognition quality throughout an image
Differences at large scale; e.g. left side of scene is foggy, while right side is clear
Spatial periodic
Particles Particles are periodically distributed
• Anisotropic visual effects, i.e. depending on Observer position and viewing direction
• Scene appearance depends on Observer position and viewing direction
Example: Halo effects, rainbow
Spatial aperiodic
Particles Particles are aperiodically distributed
• Typical distribution of particles
• See More .. As well as
Temporal periodic
Particles Particle distribution varies periodically over time
• Visual effects change periodically with time
• See Trans
• Scene appearance changes periodically, therefore depends on moment of recording
• See Trans
(windshield) wiper
Temporal aperiodic
Particles Particle distribution varies aperiodically over time
• Typical behaviour of particles
• See Trans
• See Trans • See More .. As well as
snow or rain fall
Close (spat.)
Particles Particles very close to Observer
• Single particles may cover larger scene fractions
• Single particles are confused with real scene objects
• Particle occludes objects in scene
Remote (spat.)
Particles Particles far away from Observer
• No more individual particles resolvable, effects merge with those of Remote Trans
• The particles blend into the background
• See Remote Trans
Before (temp.)
Particles n/a
After (temp.)
Particles n/a
In front of (spat.)
Particles See Close
Behind (spat.)
Particles Particles behind object but before l.s.
• Object silhouette affected • Object detection hampered
Example: aura
Faster Particles Particles move faster than expected
• Motion blur of particles • Blurred particles obfuscate (parts of) scene ⇒ image recognition severely reduced
Faster Particle Light is scattered while traversing a density-changing medium
• Brillouin Scattering
• Object misinterpretation due to schlieren
Atmospheric turbulences
Slower Particles Particles move slower than expected
• Particle outline shapes are sharply resolved (instead of being blurred)
• Cannot be compensated by temporal averaging – leads to artefacts
E.g.: moving particles can mark the flow of the medium
E.g.: A rain-expecting system is confused by slowly falling snow
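Whether moving particles smear out or stay sharply resolved depends on how far a particle travels during one exposure relative to a pixel's scene footprint. A minimal sketch with invented numbers:

```python
def blur_length_px(speed_m_s, exposure_s, metres_per_pixel):
    """Approximate streak length in pixels: distance travelled by a
    particle during the exposure, expressed in pixel units."""
    return speed_m_s * exposure_s / metres_per_pixel

# Illustrative numbers: 10 ms exposure, 2 mm scene footprint per pixel
for name, v in (("snow", 1.0), ("rain", 8.0)):
    print(name, blur_length_px(v, 0.010, 0.002), "px")
```

Streaks much longer than a pixel blend into the background (Faster); streaks near one pixel stay resolved as distinct features (Slower).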
2.4 Object
2.4.1 Parameters
Table 8: Object Parameters
Parameter | Values / Range | Meaning
Position | 3D-Point | One emphasized point of Object; usually its center of gravity.
Size | R+3 | Size of Object (minimal 3D bounding box)
Orientation | 3 angles | Two angles for orientation of main axes, a third for the angle between the object's main front and the observer. Better used as a qualitative value like “upright”, “tilt” etc.
Complexity | R+ | Measure for shape complexity; the larger, the more complex the Object appears (e.g. sphere: very low, human being: high, tree: very high). Also used for the concept of “shape” in general.
Level of detail | N | Different resolution levels exhibit different kinds of details (geometry and texture) that require different descriptions. One level of detail stands for the concept of a resolution-scale range which corresponds to scale-specific semantics. A wood, for example, can be seen as a whole (the wood and its extent) -> LOD1, a pool of trees -> LOD2, which are pools of (leaves and trunks and roots) -> LOD3 … LOD is potentially endless in the real world and is only restricted by the semantics the application requires. E.g.: Atomic level will not make sense if the application shall categorize trees.
Spectrum | Frequency distribution | Colour of reflected & reemitted light
Texture | Pattern of Spectrum | Contrast/reflectance variation on the object's surface
Reflectance | R [0..1] | Reflection properties of the object: the amount of reflection and its kind (diffuse, specular, retroreflection)
Transparency | R [0..1] | Transparency of object, refractive effects (0 = opaque, 1 = fully transparent), tinting (in combination with Spectrum)
Wave properties | Polarization, coherence |
2.4.2 HAZOP
Table 9: Object HAZOP
Guideword Par. Meaning Consequences Hazards Note
No Pos Pos. cannot be defined/detected
• An object's “central” point cannot be defined
• Scene is missing reference points
• Scene is filled by just one object
• The part which identifies the position of an object is not within the captured view field.
• One object is reported as several
• An object is reported at wrong position
Example: large, diffuse or highly structured or flexible objects like clouds (belongs to medium), fungus mycelium, or cloth, e.g. tablecloths.
2nd Example: Feet, lower body and shadow of a person are out of view and this prevents the system from knowing whether a person stands on the ground or sits on a chair
More Pos See Remote
Less Pos See Close
As well as Pos Pos. is ambiguous
• Position of Object cannot be clearly identified
• CV alg. delivers wrong location of object
Example: diffuse object with several more pronounced parts
Part of Pos Only part of the position is known (e.g. only left/right relations but front/back is missing; x/y coordinates are known from pixel coordinates but the z coordinate is missing – stereo normally needs two images)
• See No • CV alg. reports some erroneous position coordinates while others are accurate
Reverse Pos n/a
Other than Pos Pos. is other than expected
• Object is placed at another position than usual
• False negative: object remains undetected
• False positive: object is reported as another object
Cup is glued to the wall
Other than Pos Object is out of place for this scenery
• Unexpected object • CV alg. cannot deal with the occurrence of the object and produces unexpected results
Elephant stands in the kitchen
Other than Pos Obj. Position is assumed to be different because an ego motion is confused with an object motion
• Observer and object pose are confused
• No problem as long as only the relative motion is of interest
• Motion models fail in consecutive calcula-tions
E.g.: An observer in a train watching another train moving cannot determine who is moving
Where else Pos Pos. is “wrong” or impossible
• Object is placed at a “forbidden” location
• Entirely misinterpreted scene
• See Other than
Example: objects which cannot fly are detached from supporting ground
Spatial periodic
Pos Object can only be at special integral positions that have the same distance from each other
• Most positions “illegal” • Objects at illegal positions misinterpreted
Object is constrained by a rail or other objects and “locks” into different positions, where each position has the same distance from the one before it.
Spatial aperiodic
Pos Objects have no deterministic spatial placing; e.g. can be free floating or randomly placed within the scene
• All/Most positions considered to be “legal”
• More degrees of freedom than expected
• Too many possible positions for CV alg.
Temporal periodic
Pos Obj. Pos. changes periodically
• Position of object within scene depends on time of recording, but periodically (and hence predictably)
• Temporal aliasing, if frame rate of sensor is insufficient for the movement's frequency
• Mismatch of objects in consecutive images
• CV alg. interprets the aliasing frequency instead of the real one
E.g. (for aliasing): the picket fence effect
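The temporal-aliasing hazard can be quantified: sampling a periodic motion at frame rate fs folds the true frequency into the band [0, fs/2]. A minimal sketch; the spoke rate and camera frame rate are illustrative:

```python
def apparent_frequency(true_hz, frame_rate_hz):
    """Frequency a frame-sampled observer perceives for a periodic
    motion: the true frequency folded toward the nearest multiple
    of the frame rate."""
    fs = frame_rate_hz
    return abs(true_hz - round(true_hz / fs) * fs)

# Illustrative: fence pickets passing at 58 Hz, camera at 30 frames/s
print(apparent_frequency(58.0, 30.0))  # aliased to 2 Hz
print(apparent_frequency(30.0, 30.0))  # appears stationary
```

A CV algorithm estimating motion from the image sequence recovers the aliased frequency, not the real one, exactly as the hazard column states.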
Temporal aperiodic
Pos Obj. Pos. changes aperiodically
• Position of object within scene depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.)
Pos Object Closer to Observer than expected
• Object is larger (and covers more of the scene) than expected
• Object covers complete scene
• False positive: Object not correctly recognized
• Other parts of scene not (correctly) recognized
Interactions: • Level of Detail • Texture • Reflectance
Remote (spat.)
Pos Object More Remote from Observer than expected
• Object is smaller than expected
• Object covers only a few pixels
• Object details are lost
• False positive: Object not correctly recognized
• False negative: Object is not detected as an object but merges into background
Interactions: • Level of Detail
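The Close/Remote size effects follow from the pinhole projection model, where image size scales as focal length (in pixels) times object size over distance. A minimal sketch with illustrative values:

```python
def projected_size_px(object_size_m, distance_m, focal_length_px):
    """Pinhole-camera approximation: image size of an object is its
    physical size scaled by focal length (in pixels) over distance."""
    return focal_length_px * object_size_m / distance_m

# Illustrative: a 0.5 m object seen by a camera with f = 800 px
for d in (1.0, 10.0, 100.0):
    print(d, "m ->", projected_size_px(0.5, d, 800.0), "px")
```

When the projected size approaches a few pixels, object details are lost and the object merges into the background; when it approaches the sensor size, it covers the complete scene.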
Before (temp.)
Pos Obj. moves along a path earlier than expected
• See Other than • See Other than
Before (temp.)
Pos Obj. moves out of scene before Observer has started to gather information
• Object is not in scene • False positive: non-existing object reported
After (temp.)
Pos Obj. moves into view after Observer has ended to gather information
• See Before • See Before
After (temp.)
Pos Obj. moves along a path later than expected
• See Other than • See Other than
In front of (spat.)
Pos See Objects
Behind (spat.)
Pos See Objects
Faster Pos Obj. moves faster than expected
• Obj. stays shorter at a place than expected
• Relativistic effects • Doppler Shift
• Transversal motion blur
• Temporal aliasing, see Temporal Periodic
Slower Pos Obj. moves slower than expected
• Obj. stays longer at a place than expected
• Moving object is visible although it's supposed to be transparent (due to persistence of vision of the background)
Example: Fast moving rotor blades in front of an airplane are normally not visible; if they move too slowly they may occlude parts of the scene
No Size Obj. is “infinitely small” • Obj. is too small to be detected by Observer
• False negative: no object reported (object leaves no footprint in image)
practically useless
More Size Obj. is larger than expected
• Obj. has a size more similar to other objects than its own characteristic size
• Obj. is too large to be detected, because its edges are outside of view
• False positive: object is confused with some other object
• False negative: no object reported (too few details of object in image)
• Depth of object not correctly recognized
From the current view point the true depth is obscured by the object itself (its form) and the training data suggests a thinner object with less depth
Interactions: • Level of Detail • Texture • Reflectance
Less Size Obj. is smaller than expected
• Obj. has a size more similar to other objects than its own characteristic size
• Obj. is too small to be (correctly) detected by Observer
• False positive: object is confused with some other object
• False negative: no object reported (object leaves no footprint in image)
This corresponds to the typical test case: “decrease object size until the CV alg. fails to recognize it correctly”
Interactions: • Level of Detail
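This test case can be sketched as a simple sweep; `render` and `detector` stand in for the model-based test-image generator and the CV algorithm under test (both hypothetical placeholders):

```python
def min_detectable_scale(detector, render, scales):
    """Shrink the object step by step (largest scale first) and
    return the smallest scale the detector still handles, or
    None if it fails even at the largest scale."""
    smallest_ok = None
    for s in sorted(scales, reverse=True):
        if detector(render(s)):
            smallest_ok = s
        else:
            break  # first failure ends the sweep
    return smallest_ok
```

For a detector that fails below scale 0.3, a sweep over 0.1…1.0 reports 0.3 as the recognition limit.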
As well as Size Obj. is both larger and smaller than expected
• Obj. has a size more similar to other objects than its own characteristic size
• Obj. has an unusual length:width:height ratio (i.e. unusual elongation)
• False positive: object is confused with some other object
• False negative: no object reported
Example: an eel is also a fish, but with unusual length Interactions: • More Complexity
Part of Size Only Part of Obj. area is visible
• Obj. is partially occluded, or
• Obj. is located partially outside the field of view
• Object is not correctly recognized
Note: the side turned away from observer is usually not visible (exception: transparent objects), but this is not meant here Interactions: • Orientation
Part of Size One of the Object extents is missing
• Flat object or even thin wire
• Degenerated configuration of object surface
• Object is misinterpreted
• If looked at from the wrong viewpoint, undetectable
• CV alg. fails because of a degenerated case
E.g.: Camera plane perpendicular to a sheet of paper makes the paper invisible. E.g.: Co-planarity of all points on a surface leads to linear dependence – not solvable without assumptions or additional noise
Reverse Size Object is turned inside out • Exchange of inner and outer parts
• Object not recog-nized or misinter-preted
E.g.: Indoor versus outdoor – the free space inside a room is bounded by the inner side of the walls, while outside it is bounded by their outer side.
Other than Size Obj. has an Other Size than expected (than usual)
• See More, Less, As well as
• See More, Less, As well as
Object has a different length:width:height ratio than expected
Where else Size n/a Spatial periodic
Size Obj. has a regular shape • Several symmetries possible
• Obj. is too regular to distinguish different orientations
• Detection of orientation hampered
Example: sphere, regular polyhedron with large number of vertices
Spatial aperiodic
Size Obj. has an irregular shape • No symmetries • Detection vulnerable to orientation
Interaction: • Complexity • Orientation
Temporal periodic
Size Obj. Size changes periodically
• Obj. Size depends on time of recording, but periodically (and hence predictable)
• Temporal aliasing, if the frame rate of the sensor is insufficient for the movement's frequency
• Mismatch of objects in consecutive images
• CV alg. interprets the aliasing frequency instead of the real one
Example: breathing human moves waist with each breath predictably
Temporal aperiodic
Size Obj. Size changes aperiodically
• Obj. Size depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.)
Size See Pos. Typical optical illusion: the visible size of an object depends on its distance from the observer. An object might appear closer than it is because its size is unusually large for its type
Remote (spat.)
Size See Pos. Example: miniatures fake a real scene in small scale
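Both rows exploit the same size-distance ambiguity of a single view: scaling size and distance by the same factor leaves the angular extent unchanged, so a miniature is indistinguishable from the full-size scene. A minimal check (illustrative names):

```python
import math

def angular_size_rad(size_m, distance_m):
    """Angular extent of an object under the pinhole model; a
    miniature at 1/10 of the distance subtends exactly the same
    angle as the full-size original."""
    return 2.0 * math.atan(size_m / (2.0 * distance_m))

# angular_size_rad(1.8, 20.0) equals angular_size_rad(0.18, 2.0)
```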
Before (temp.)
Size Object size grows/shrinks/pulses earlier than expected
• Object’s appearance other than expected
• Misinterpretation of scene
After (temp.)
Size Object size grows/shrinks/pulses later than expected
• Object’s appearance other than expected
• Misinterpretation of scene
In front of (spat.)
Size See Objects
Behind (spat.)
Size See Objects
Faster Size Obj. size changes faster than expected
• Obj. shrinks/increases/pulses remarkably during exposure
• Temporal aliasing, if the frame rate of the sensor is insufficient for the movement's frequency
• Radial motion blur
• CV alg. interprets the aliasing frequency instead of the real one
Slower Size Obj. size changes slower than expected
• Obj. shrinks/increases/pulses undetectably during exposure
• Missing variation in size leads to wrong scene interpretation (in image sequences)
No Orient. Obj. has no (specific, preferred) orientation
• Obj. can be present in any orientation
• Object is recognized but orientation is still unknown
Can lead to failures in further computation steps
No Orient. Orientation of Obj. is not (easily) recognizable
• See Spatial periodic Size
• See Spatial periodic Size
See Spatial periodic Size
More Orient. Obj. has a more unusual Orientation than expected
• Obj. is present in scene in unusual orientation
• Obj. is not correctly recognized
Less Orient. Obj. has a less usual Orientation than expected
• See More • See More
As well as Orient. Object points at multiple directions
• Object’s orientation is ambiguous
• Wrong orientation taken as the “desired one” by CV alg.
Example: a sign post with two signs
Part of Orient. Part of orientation is unusual
• See Meaning
• Obj. is not correctly recognized
• Obj. is correctly recognized, but its orientation is not
Example: object is upright as usual, but with the back side (which is usually not seen) towards the Observer
Reverse Orient. Orientation is opposite to usual
• All orientation angles are inverted
• See Part of Example: object is “upside down”
Other than Orient. Orientation is other than expected
• See More, Less, Part of, Reverse
• See More, Less, Part of, Reverse
Example: object points into wrong direction, object is upside down
Where else Orient. Obj.'s orient. depends on Pos.
• Object reels or staggers while moving through the scene
• Object’s recognition hampered
Spatial periodic
Orient. Object can only be oriented in discrete steps
• Only a limited number of orientations possible
• CV alg. might expect a continuous range and interpolate between the discrete steps, thus producing an invalid result
Example: hands of a clock with only a finite number of positions
Spatial aperiodic
Orient. Object can be oriented in any direction
• Unlimited number of orientations possible
• Object not correctly recognized in all orientations
Temporal periodic
Orient. Obj. orient. changes periodically
• Obj. orient. depends on time of recording, but periodically (and hence predictable)
• If the frequency of the orientation change exceeds half the frame rate, temporal aliasing occurs
• Mismatch of objects in consecutive images
• The virtual aliasing frequency is treated as the actual one
Mismatch can be avoided by learning (in the case of no aliasing). E.g. for aliasing: the wagon-wheel effect
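The wagon-wheel effect can be reproduced numerically: fold the true per-frame rotation of an n-fold symmetric wheel into its unambiguous range. Function and parameter names are illustrative:

```python
def perceived_step_deg(rot_hz, fps, n_spokes=1):
    """Apparent per-frame rotation (degrees) of a wheel with
    n_spokes identical spokes: the true step is folded into
    (-period/2, period/2], so the wheel can appear to stand
    still or rotate backwards."""
    period = 360.0 / n_spokes
    step = (360.0 * rot_hz / fps) % period
    return step - period if step > period / 2.0 else step

# A wheel at 23 rev/s filmed at 24 fps seems to turn backwards:
# perceived_step_deg(23, 24) -> -15.0 degrees per frame
```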
Temporal periodic
Orient. Obj. orient. change deviates from prediction
• Obj. orient. depends on time of recording, though not periodically but still predictable
• Misinterpretation of temporal progress of scene
Example: person performs an expected action earlier than expected
Temporal aperiodic
Orient. Obj. orient. changes aperiodically
• Obj. orient. depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.)
Orient. Obj. is oriented such that parts of the object are close to the Observer
• Extreme perspective • Obj. is not correctly recognized
Remote (spat.)
Orient. n/a Not symmetric to Close, because at distance perspective distortions diminish
Before (temp.)
Orient. Orientation changes before expected
• See Other Than Pos • Obj. is not correctly recognized
Before (temp.)
Orient. Analogue to Size • See Size • See Size
After (temp.)
Orient. Analogue to Size • See Size • See Size
In front of (spat.)
Orient. n/a
Behind (spat.)
Orient. n/a
Faster Orient. Orient. changes faster than expected
• Obj. rotates remarkably during exposure
• See Temporal periodic for aliasing
• Rotational motion blur
• See Temporal periodic
E.g.: The wagon wheel effect
Slower Orient. n/a
No Compl. Object has No Complexity
• Object has no features which support identification and recognition – extremely simple object
• Object not detected
• Object detected, but not identified
• Object detected, but aspects such as position not identifiable
Example: homogeneous sphere
More Compl. Object is more complex than expected
• Object has features or feature combinations – shape, texture, transparency – which make its correct recognition difficult
• Object is not correctly recognized
• Parts of object not recognized or misinterpreted
Examples: • Hairs interpreted as other material
• In stereo images, quills or other repeating surface structures lead to mismatches
Interactions: • As well as Size
Less Compl. Object is less complex than expected
• Object misses features or feature combinations which are expected or relevant for its kind
• Object lacks natural features
• Parts of object not or not correctly recognized
• An insufficient amount of natural features leads to faulty/no results in 3D reconstruction or self-localisation
E.g.: A bald person's face is hard to detect
As well as Compl. Object has several distinct complexities or misses some usual one
• Object’s appearance strongly depends on orientation
• Object’s appearance strongly depends on its distance
• Object is recognized differently from different sides and at different distances
Examples: human head – front different from rear side; tree – looks like a bush at remote distance
Part of Compl. See Less
Reverse Compl. Object's shape is contrary to expected
• Object's shape is inverted, e.g. bumps are pits, smooth parts are rough, etc.
• Object is confused with some other
• Object parts are confused
E.g.: a non-rigid object expected to be rigid (like muscles)
Other than Compl. Object has a completely different complexity (shape) than expected
• Like Meaning
• Object is confused with some other
• Object is not recognized at all
Example: caterpillar instead of butterfly; bud instead of blossom; car fully damaged instead of undamaged
Where else Compl. n/a
Spatial periodic
Compl. Object has spatially periodic shape
• Parts of object are identical
• Mismatch of object parts in stereo leads to wrong depth or shape recognition
• Mismatch of object parts in image sequences leads to wrong movement estimation
E.g.: A house facade with a regular grid of windows.
Spatial periodic
Compl. Object has (partially) self-similar shape
• Parts of objects are similar to different detail levels of themselves (fractal)
• Object’s distance is wrongly computed
Interactions: • Level of Detail
Spatial aperiodic
Compl. Object has spatially aperiodic shape
• All object parts differ from each other, and all parts are relevant
• Object is not correctly recognized, if not enough of its parts are visible
Example: a distinct landscape region in bird's-eye view
Spatial aperiodic
Compl. Object has spatially aperiodic shape
• All object parts differ from each other, but not each part is relevant
• Object is not correctly recognized, if too close
• Object is not correctly recognized, if not enough relevant parts are visible
Example: trees – not each leaf is relevant, only the shape as a whole
Spatial aperiodic
Compl. Object shape has symmetries
• Some object parts are symmetric counterparts of others
• Object is not correctly recognized, if symmetry is invisible
Temporal periodic
Compl. Object shape changes periodically or at least predictably
• See Orientation
• See Orientation
See Orientation. Aliasing example: waves on the surface of a liquid
Temporal aperiodic
Compl. Object shape changes aperiodically
• See Orientation • See Orientation
Close (spat.)
Compl. Parts close to Observer show more Compl. than remote ones
• Object exhibits decreasing LOD along the view axis
• Object not correctly recognized
Remote (spat.)
Compl. See Close
Before (temp.)
Compl. Desired complexity degree occurs before it is expected
• Complexity is not captured at its peak
• A simplified model is assumed
E.g.: Reconstruction of a thunderbolt's plasma channel with wrong timing
Before (temp.)
Compl. Analogue to Size • See Size • See Size
After (temp.)
Compl. Desired complexity degree occurs after it is expected
• See Before • See Before See Before
After (temp.)
Compl. Analogue to Size • See Size • See Size
In front of (spat.)
Compl. n/a
Behind (spat.)
Compl. n/a
Faster Compl. Object shape changes faster than expected
• See Meaning
• Aliasing, if a periodic change has too high a frequency
• Object is not correctly recognized
• Failed re-identification in consecutive images of a (video) sequence
• Heterogeneous motion blur
• Temporal aliasing effect causes interpretation of a virtual frequency
Example: flames. Aliasing example: waves on the surface of a liquid
Slower Compl. n/a
No LOD No differences in LOD at different distances (only one LOD)
• Object looks the same whether small and close or large and remote
• If Object can have different sizes: size and distance are confused
E.g.: Uniform sphere
More LOD Object has more LOD than expected
• Object looks more different at different distances than expected
• Level of detail might not be captured due to insufficient spatial resolution
• Object is not recognized at all, or not correctly at certain distances
Interactions: • Size • Pos.
Less LOD Object has less LOD than expected
• Object looks less different at different distances than expected
• CV alg. expects more structure when zooming in for better classification, but this structure is not present
Interactions: • Size • Pos.
As well as LOD Object has both more and less LOD than expected
• Depending on orientation, Object looks either more or less different at different distances than expected
• Depending on orientation, object is not correctly recognized
Example: a glass bottle has a simple shape, but its label contains a large amount of small print
Part of LOD Part of the Object has unexpected LOD
• See As well as • See As well as See As well as
Reverse LOD LOD changes with increasing distance inversely to expected
• Object looks different from expected at all (or at least many) distances
• Object is (likely) not correctly recognized at many distances
Examples: for some kinds of objects like blood vessel systems LOD decreases with decreasing distance, while for true fractals LOD is either constant or may even virtually increase with “decreasing distance” (increasing resolution)
Other than LOD LOD is other than expected • See More, Less, Part of, Reverse
• See More, Less, Part of, Reverse
Example: smooth surface reveals distinct pattern when zoomed in very far/under special lighting
Where else LOD n/a Spatial periodic
LOD n/a
Spatial aperiodic
LOD n/a
Temporal periodic
LOD LOD changes periodically
• LOD depends on time of recording, but periodically (and hence predictably)
• Object is sometimes not correctly recognized
Example: pendulum swinging along the optical axis
Temporal aperiodic
LOD LOD changes aperiodically
• LOD depends on time of recording, not periodically (and hence not predictably)
• See Temporal periodic
Close (spat.)
LOD Object has unexpected LOD at small distance
• At close-ups, Object looks different than expected
• If too close, Object is not correctly recognized
Example: leaves
Remote (spat.)
LOD Object has unexpected LOD at large distance
• If remote, Object looks different than expected
• If too far away, Object is not correctly recognized
Rare effect
Before (temp.)
LOD Analogue to Size • See Size • See Size
After (temp.)
LOD Analogue to Size • See Size • See Size
In front of (spat.)
LOD n/a
Behind (spat.)
LOD n/a
Faster LOD LOD changes faster than expected
• See Compl. • See Compl. Object approaches observer faster than expected
Slower LOD n/a No hazard identifiable
No Colour Object has no specific colour
• Object's colour fully depends on incoming light, i.e. it reflects all light as received
• Object is not recognized at all (e.g. a white wall or a mirror)
• Object is confused with that of a projected image (e.g. a video projection wall)
Interaction: • Reflectance: fully diffuse – white; fully shiny – mirror
No Colour Object is strictly mono-coloured
• Object filters incoming light strongly. If the light consists of frequencies different from the object's own, the object becomes dark
• Underexposure (object appears darker than expected)
Interaction: • L.s. / Colour
More Colour Object has more (specific) colour(s) than expected
• Coloured illumination can change appearance significantly
• Colour distortion, if illuminated with a different colour
• Underexposure, if illuminated with a too different colour
More Colour The scene contains more of a certain colour than expected
• Auto white balance of the observer is confused by too many occurrences of a certain colour within the scene
• Captured image has a different white balance
Less Colour Object has less (specific) colour(s) than expected
• Colour is no significant identification property
• Object not (correctly) recognized
Example: aged objects often have bleached colours See After (2)
As well as Colour Object has both more and less (specific) colour than expected
• Combination of more and less
• Orientation can change appearance unexpectedly
• See More and Less, depending on orientation
Interaction: • Orientation
Part of Colour See Less
Reverse Colour See Other than
Other than Colour Object has different colour(s) than expected
• Colour can be a misleading identification property
• Object not (correctly) recognized
Example: sere grass or leaves are brown instead of green
Where else Colour n/a Spatial periodic
Colour See Texture
Spatial aperiodic
Colour See Texture
Temporal periodic
Colour Colour changes periodically
• Obj. Colour depends on time of recording, but periodically (and hence predictable)
• Or at least predictable
• Mismatch of objects in consecutive images
• Periodically incorrect recognition of object
Temporal aperiodic
Colour Colour changes aperiodically
• Obj. Colour depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
• Aperiodically incorrect recognition of object
Close (spat.)
Colour n/a
Remote (spat.)
Colour n/a
Before (temp.)
Colour Analogue to Size • See Size • See Size
After (temp.)
Colour Analogue to Size • See Size • See Size
After (temp.)
Colour After the object has reached a certain age it has less colour
• Colour information is less meaningful
• Object not (correctly) recognized
Example: aged objects often have bleached colours
In front of (spat.)
Colour n/a
Behind (spat.)
Colour n/a
Faster Colour Colour changes faster than expected
• See Meaning
• Object’s state is misinterpreted
Example: leaves normally become brown in fall
Slower Colour Colour changes slower than expected
• See Meaning • See Faster Example: colour of metals changes when melting
No Texture Object has no texture
• Object is either monochrome, highly reflective, or transparent
• Texture-based CV alg. will not work
• Fewer significant points (corners/edges) can be detected
E.g.: Shape from texture gradient will fail. E.g.: Finding corners for creating sparse features is more difficult.
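A minimal textureless-region check along these lines: low gradient energy in a grayscale patch signals that corner- and texture-based algorithms have little to work with (a sketch; names and the zero baseline are assumptions):

```python
import numpy as np

def gradient_energy(patch):
    """Mean squared finite-difference gradient of a grayscale
    patch; values near zero flag textureless regions where
    feature detection is unreliable."""
    p = patch.astype(float)
    gx = np.diff(p, axis=1)  # horizontal intensity changes
    gy = np.diff(p, axis=0)  # vertical intensity changes
    return float((gx ** 2).mean() + (gy ** 2).mean())

# A uniform patch has zero gradient energy; any ramp or texture
# yields a strictly positive value.
```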
More Texture Object has more texture than expected
• Texture can be misleading
• Object not correctly recognized
• Texture-sensitive CV alg. is confused
• Virtual object is taken for real
E.g.: In shape from shading, contrast should only depend on light sources, not texture. Note: in a highly textured scene, a non-textured object is highly visible!
Less Texture Object has less texture than expected
• Texture is no significant identification property
• Texture-based CV alg. is hampered
E.g.: Any vision algorithm based on texture features (stitching, SLAM)
As well as Texture Object has a combination of (No,) More and Less texture
• Object’s appearance deviates from the expected one
• Object not correctly recognized
As well as Texture Object has a mixture of periodic and aperiodic texture
• Semi-periodic texture: periodic at coarse LOD, but differences in detail
• Object not correctly recognized
Example: handmade tiles
As well as Texture Texture resembles silhouette of another object
• The texture is misinterpreted as the other object
• Object recognition reports a false positive
Example: a picture of an object
Part of Texture Object has only part of expected texture
• Parts of the expected texture are missing
• See Less
Example: from a checked pattern, only one direction of stripes is present
Reverse Texture Object’s texture is inverted • Object looks like its “negative”
• Object confused or not recognized
Depends on definition of textures and inverse element
Other than Texture Object has a completely different texture than expected
• Texture is a misleading object property
• Object is confused with some other or not recognized at all
Example: replaced floor cover
Where else Texture Textures are located on objects at other places than expected
• The parts of the object that are expected to be textured are empty, and vice versa
• Object parts are confused
Spatial periodic
Texture Obj. texture is periodic
• Same appearance of texture on different parts of objects
• If the period undercuts half the resolution of the observer, spatial aliasing occurs
• Object parts are confused
• Stereo algs. compute wrong depth maps due to mismatch of texture cells
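The depth error caused by locking onto the neighbouring texture cell follows directly from the rectified-stereo relation Z = f·B/d; a sketch with illustrative numbers:

```python
def stereo_depth_m(disparity_px, focal_px, baseline_m):
    """Rectified-stereo depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error_m(true_disp_px, period_px, focal_px, baseline_m):
    """Depth error when a periodic texture shifts the match by
    one texture period in disparity."""
    return (stereo_depth_m(true_disp_px + period_px, focal_px, baseline_m)
            - stereo_depth_m(true_disp_px, focal_px, baseline_m))

# f = 1000 px, B = 0.1 m, true disparity 20 px (Z = 5 m): a 5 px
# texture period pulls the estimate to 4 m, a 1 m error.
```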
Spatial aperiodic
Texture Obj. texture is aperiodic
• Texture does not precisely repeat, but variations are irrelevant
• Irregular (stochastic) mismatches in stereo images
Example: typical floor covers
Spatial aperiodic
Texture Obj. texture is aperiodic
• Texture does not precisely repeat, but variations are relevant
• Textures are confused, leading to confusion of texture-carrying object (part)s
Example: on some animals, e.g. certain fish, pattern depends on its location on the body
Temporal periodic
Texture Obj. texture changes periodically
• Obj. Texture depends on time of recording, but periodically (and hence predictable)
• De-synched scene interpretation causes mismatch of objects in consecutive images
Mismatch can be avoided by learning or prior knowledge. E.g.: Frame rate of an observed PAL TV is known to be 25 fps progressive
Temporal periodic
Texture Obj. texture changes periodically
• If the frequency exceeds the frame rate of the observer, temporal aliasing occurs
• The aliased frequency is treated as the actual one
Temporal aperiodic
Texture Obj. texture changes aperiodically
• Obj. Texture depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.)
Texture Texture only recognizable if Obj. close to Observer
• Small texture smears out at larger distances
• Object incorrectly recognized at larger distances
Close (spat.)
Texture Two similar textures are touching on the same object
• Borders between the textures are barely visible
• Object recognition hampered, if relevant parts are to be distinguished by their texture
Remote (spat.)
Texture Texture only recognizable if obj. at remote distance from observer
• Large texture, not fully recognizable at small distances
• Object incorrectly recognized at smaller distances
Remote (spat.)
Texture Two very different textures are touching on the same object
• Borders between the textures create strong contrast there
• Object recognition hampered, if border between textures creates new visual artifacts
Before (temp.)
Texture See Size
After (temp.)
Texture See Size
In front of (spat.)
Texture n/a
Behind (spat.)
Texture n/a
Faster Texture Texture changes faster than expected
• Object appearance changes during recording – texture smeared out
• If periodic, aliasing can occur (see Temporal periodic)
• Texture not recognized
Example: cuttlefish (Sepia) can change their skin patterns quickly
Slower Texture Texture changes slower than expected
• See Meaning
• Object’s state is misinterpreted
E.g.:
• Frame rate of an observed TV screen is lower than expected
• Running text on electronic billboards moves very slowly
• A texture representing a physical state, e.g. temperature, changes according to a different law than anticipated
No Refl. Obj. has no reflectance
• No light reflected; if not transparent, object appears very dark – no features visible
• Underexposure
• Object confused with shadow
• Object not recognized
More Refl. Obj. has much Refl. (more than expected)
• Shiny surface – mirror
• Overexposure of the observer
• Object not recognized
• Reflected objects taken for real
E.g. for overexposure: a retroreflecting street sign
Less Refl. Obj. has little Refl. (less than expected)
• Dull surface – diffuse reflection
• See No
As well as Refl. Obj. has both shiny and dull surface
• Diffuse reflection with highlight/glare
• Object recognition distorted by glares
• Local overexposure due to glares
Part of Refl. Obj. has only part of expected reflectance
• Object appears more dull or less shiny than expected
• Obj. confused with others
Reverse Refl. Obj. reflects into inverse direction • Obj. appears to violate the reflection law (reflection angle = incidence angle)
• CV alg. is confused • See More for retroreflectance
Examples: • Aura around own shadow on a smooth surface, caused by micro globules within that surface which totally reflect light back toward the incoming rays (retroreflecting materials);
• Aura around shadow on clouds, caused by tunnel effects [Nussenzweig+03]
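The reflection law cited in this row (reflection angle = incidence angle) and its retroreflecting violation can be contrasted in a few lines; the helper names below are illustrative, not from the deliverable:

```python
def reflect(d, n):
    """Specular reflection of incoming direction d at unit normal n:
    r = d - 2 (d . n) n, so reflection angle equals incidence angle."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def retroreflect(d, n):
    """A retroreflector sends light straight back along -d,
    regardless of the surface normal n."""
    return tuple(-di for di in d)

# Ray hitting a horizontal mirror at 45 degrees:
d = (1.0, -1.0, 0.0)       # incoming direction
n = (0.0, 1.0, 0.0)        # surface normal
print(reflect(d, n))       # specular: (1.0, 1.0, 0.0)
print(retroreflect(d, n))  # retro: back along -d, toward the light source
```

A CV algorithm that assumes `reflect`-style behaviour will mispredict highlight positions on a `retroreflect`-style surface, which is exactly the confusion this row describes.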
Reverse Refl. The parts that are expected to be reflecting are dull and vice versa
• Special case of Other than
• See Other than
Other than Refl. Obj. has other refl. properties than expected
• Obj.’s appearance is (massively) changed
• Object not (correctly) recognized
Example: dull mirror
Where else Refl. There are multiple reflections on different parts of the object
• See As well as • See As well as
Spatial periodic
Refl. Reflecting surface has spatially periodic changing reflectance properties
• Obj.’s appearance is sensitive to the relative positioning of l.s., obj., and obs., but predictable
• Output of CV alg. strongly depends on its position
E.g.: • Fish scales • Reflecting industrial encasements with regular ventilation slots
Spatial aperiodic
Refl. Reflecting surface has spatially aperiodic changing reflectance properties
• See Spatial periodic, but not predictable
• See Spatial periodic See Spatial periodic
Temporal periodic
Refl. Refl. changes periodically with time
• Obj. Refl. depends on time of recording, but periodically (and hence predictable)
• Mismatch of objects in consecutive images
Mismatch can be avoided by learning
Temporal aperiodic
Refl. Refl. changes aperiodically with time
• Obj. Refl. depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.)
Refl. Objects seen through a reflection might appear nearer than they are
• See Meaning • Misinterpretation of scene
Remote (spat.)
Refl. Objects seen through a reflection might appear farther away than they are
• See Meaning • See Close
Before (temp.)
Refl. Analogue to Size • See Size • See Size
After (temp.)
Refl. Analogue to Size • See Size • See Size
In front of (spat.)
Refl. n/a
Behind (spat.)
Refl. n/a
Faster Refl. n/a
Slower Refl. n/a
No Transp. Obj. is not transparent • Opaque object • Part of scene invisible if object should be transparent – wrong scene interpretation
• Casts a shadow – therefore the scene is not illuminated as expected
E.g.: Completely dirty window
More Transp. • Obj. is highly transparent
• Obj. is more transparent than expected
• Transparent object • Object not recognized
• Objects behind it not correctly recognized due to distortions, e.g. through glass
• Objects within it not correctly recognized due to distortions, e.g. through glass
Less Transp. Obj. is less transparent than expected
• Objects behind it are less visible than expected
• Objects behind it not correctly recognized due to reduced transp.
• Casts a shadow – see No
Example: dirty glass
As well as Transp. Obj. is both more and less transp. than expected
• Obj. consists of parts with high and low transparency
• Object is recognized as several objects
• Object seen through it is recognized as several objects
• Object itself and objects behind it are merged
Example: a glass bottle is transparent but opaque at its label
Part of Transp. Parts of the object are transparent
• See As well as • See As well as
Reverse Transp. Object transparency is opposite to what is expected
• Extreme case of Other than
• See Other than
Other than Transp. Obj. has another transp. than expected
• See More, Less, As well as
• See More, Less, As well as
Where else Transp. See Other than
Spatial periodic
Transp. Transp. of Obj. is spatially periodic
• Relative positioning of Obj. and Observer is relevant (for what is seen through)
• Wrong objects behind transp. Obj.
Example: glass roundels, wall with windows
Spatial aperiodic
Transp. Transp. of Obj. is spatially aperiodic
• See Spatial periodic • See Spatial periodic
Temporal periodic
Transp. Transp. changes periodically
• Transp. depends on time of recording, but periodically (and hence predictable)
• Temporally periodic “Transp.” errors as listed above
• Mismatch of objects in consecutive images
E.g.: Liquid crystal and polarization panes of an LCD screen
Temporal aperiodic
Transp. Transp. changes aperiodically
• Transp. depends on time of recording, not periodically (and hence not predictable)
• Temporally aperiodic “Transp.” errors as listed above
• Mismatch of objects in consecutive images
See Temporal periodic
Close (spat.)
Transp. Visible object transparency is increased for closer objects
• See Meaning • Object not recog-nized
Example: window
Remote (spat.)
Transp. Visible object transparency is reduced for more distant objects
• See Meaning • Objects behind not correctly recognized
Example: window
Before (temp.)
Transp. Analogue to Size • See Size • See Size
After (temp.)
Transp. Analogue to Size • See Size • See Size
In front of (spat.)
Transp. Objects in front of a transparent object are visible together with the other object;
• See Meaning • Objects are confused Example: a glass pane can create the illusion of a mixed shape of objects that are behind the pane and reflected objects.
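The mixed appearance described in this row can be modelled, under simplifying assumptions, as a linear blend of the transmitted and reflected light; the coefficients and function name below are ours, chosen only for illustration:

```python
def through_glass(behind, reflected, transmit=0.75, reflect=0.125):
    """Intensity observed at a glass pane: a weighted mix of the scene
    behind it and the scene it reflects. With transmit + reflect < 1
    the pane also absorbs light (the 'Less Transp.' dirty-glass case)."""
    return transmit * behind + reflect * reflected

# Object behind the pane (intensity 80) overlaid with a brighter
# reflected object (intensity 200): the observer sees neither value.
print(through_glass(80.0, 200.0))  # 85.0 -> the two shapes appear merged
```

The single observed value is consistent with many (behind, reflected) pairs, which is why a CV alg. can confuse objects behind the pane with reflected ones.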
Behind (spat.)
Transp. n/a
Faster Transp. Transp. changes faster than expected
• See Other than • See Other than
Slower Transp. Transp. changes slower than expected
• See Other than • See Other than
No Wave Received light is not polarized
• No respective visual effects caused by Obj.
• CV algs. based on polarization effects are hampered
E.g. Material stress analysis using photoelastic materials expects certain polarization effects
More Wave Received light is more polarized than expected
• Reflected light is polarized – rainbow colours appear
• Polarization colours confuse CV alg.
Less Wave Received light is less polarized than expected
• Too few respective visual effects caused by Obj.
• CV alg. based on polarization effects won’t work
E.g. Material stress analysis using photoelastic materials expects certain polarization effects
As well as Wave Received light is both more and less polarized than expected (by different object parts)
• See More and Less • Mixture of More and Less effects, de-pending on orienta-tion
See Less
Part of Wave n/a
Reverse Wave Effects on received light are inverse to expected • See Other than • See Other than E.g.: The polarization plane is orthogonal to the expected plane
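The orthogonal-plane case above follows Malus's law for an ideal linear polarizer, I = I0·cos²θ; a minimal sketch (function name ours):

```python
import math

def malus(i0, theta_deg):
    """Transmitted intensity behind an analyzing polarizer oriented at
    angle theta (degrees) to the light's polarization plane."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(malus(1.0, 0))   # aligned planes: full transmission, 1.0
print(malus(1.0, 90))  # orthogonal planes ('Reverse'): essentially 0
```

So a polarization-based CV alg. expecting an aligned plane receives almost no signal when the plane is reversed by 90 degrees.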
Other than Wave Effects on received light are different from expected
• Unexpected object appearance
• Object not (correctly) recognized
Where else Wave n/a
Spatial periodic
Wave Polarization changes periodically with location along object
• E.g. the polarization plane depends periodically on location, generating positioning- and orientation-dependent optical effects on the object surface
• CV alg. confused by resulting periodic virtual texture
Spatial aperiodic
Wave Polarization changes aperiodically with location along object
• See Spatial periodic • See Spatial periodic
Temporal periodic
Wave Wave properties of reflected/transferred light are periodically modified.
• E.g. linear polarization changes regularly over time
• Object appearance can be periodically modified in a way the observer is not able to cope with
Theoretically, these effects are predictable
Temporal aperiodic
Wave Wave properties of reflected/transferred light are irregularly modified.
• E.g. linear polarization plane rotates irregularly over time
• Object appearance can be modified in a way the observer is not able to cope with
These effects are not predictable
Close (spat.)
Wave n/a
Remote (spat.)
Wave n/a
Before (temp.)
Wave See Size
After (temp.)
Wave See Size
In front of (spat.)
Wave n/a
Behind (spat.)
Wave n/a
Faster Wave See More
Slower Wave See More
2.5 Objects
Objects and Object are analyzed in separate CV-HAZOPs because objects change information in their combination, not only individually. This fact is represented by the feedback loop in the object model that was presented in Section 1.3.2.
2.5.1 Parameters
Table 10: Objects Parameters
Parameters – Values / Range – Meaning:
• Number (N): number of (distinguishable) objects
• Positions (Relative Position): relative position of objects to each other
• Occlusion: amount and complexity of occlusion between multiple objects
• Shadowing: amount and complexity of how objects shadow other objects and themselves (cast and form shadows)
• Reflectance: amount and complexity of how objects reflect other objects
• Transparency: amount and complexity of the property of transparency, allowing objects to be seen through other objects
• Wave (Frequency): influence of wave properties through interference between objects
Notes: Occlusion .. Transparency should consider the Object parameters Size, LOD, and Complexity.
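The "amount of occlusion" parameter can be quantified in several ways; one simple, hypothetical choice (not prescribed by the deliverable) is the fraction of an object's image-plane bounding box covered by a closer object:

```python
def occluded_fraction(back, front):
    """Fraction of the rear box's area covered by the front box.
    Boxes are axis-aligned image rectangles (x0, y0, x1, y1)."""
    w = min(back[2], front[2]) - max(back[0], front[0])
    h = min(back[3], front[3]) - max(back[1], front[1])
    inter = max(0, w) * max(0, h)
    area = (back[2] - back[0]) * (back[3] - back[1])
    return inter / area

# Front object covers the right half of the rear object:
print(occluded_fraction((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.5
```

Such a scalar lets test-case generators sweep occlusion from 0 (the "No Occl." row) toward 1 (the "Close (spat.)" complete-hiding row) in a controlled way.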
2.5.2 HAZOP
Table 11: Objects HAZOP
Guideword Par. Meaning Consequences Hazards Note
No Num No objects • Scene with no objects (only light sources and media) • False positives: non-existing objects erroneously reported by CV alg.
No Num Number of objects is not detectable/decidable
• Scene with “unknown” number of objects
• False negatives: CV alg. misses detection of some objects
• False positives: non-existent objects reported
Example: Scene contains many objects that are too small to distinguish from each other (all blend into the background)
More Num More objs. than expected • Scene is more complex than expected
• Occlusions between objects are more likely than expected
• False negatives: objects are missed
• Misinterpretation of number of objects within scene; e.g.:
• Partially occluded objects remain completely undetected
• An object is covered such that uncovered parts are interpreted as belonging to different objects
• Parts of different objects are interpreted as some other, non-existent object
Less Num Less objs. than expected • Scene is simpler than expected
• Less activity within the scene than expected (than usual)
• Scene is emptier than expected (than usual)
• (Parts of) background interpreted as (foreground) object(s)
• False positives: objects erroneously reported by CV alg.
As well as Num Both more objs. (of kind X) and less objs. (of kind Y) than expected
• Scene is different from expected
• Objects of kind X are reported as of kind Y (or vice versa)
• Objects of kind X or Y are reported as of some other kind Z
• False negatives: objects are missed
As well as Num Both more and less objs. than expected
• Unusual (unexpected) distribution of objects: unusually many in some areas of the scene, unusually few in other parts
• See More and Less Also: As well as
Part of Num See Less
Part of Num One object is split into multiple objects because parts of the object are covered by a different object or are outside of the view
• See Part of Occl. • See Part of Occl.
Reverse Num n/a
Other than Num Other objs. than expected • Unusual (unexpected) objects within the scene • False positives: objects are misinterpreted
Other than Num See More and Less • See More and Less • See More and Less
Where else Num Objects are arranged differently than expected E.g.: A bicycle with three wheels
Spatial periodic Num Object arrangement is periodic • Observer resolution and windowing have to be appropriate to capture a characteristic arrangement
• If resolution or field of view is not appropriate, detection based on characteristic arrangements is corrupted
E.g.: Dash-dotted line (“_ . _ . _”) – if only a small part is visible it is not identifiable as such
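The dash-dotted-line example shows that a periodic arrangement is only identifiable when enough of it is visible. A crude sketch of a periodicity test on a 1-D occupancy signal (function name and representation are ours):

```python
def dominant_period(signal):
    """Smallest shift p at which a binary occupancy signal repeats
    exactly over its whole visible extent; None if no period fits.
    A crude periodicity test for object arrangements."""
    n = len(signal)
    for p in range(1, n):
        if all(signal[i] == signal[i - p] for i in range(p, n)):
            return p
    return None

# '_ . _ .' arrangement sampled as presence/absence per cell:
full = [1, 0, 1, 0, 1, 0, 1, 0]
print(dominant_period(full))    # 2 -> periodic arrangement detected
# Only a tiny window visible: too short to confirm any period.
print(dominant_period([1, 0]))  # None -> not identifiable as periodic
```

This mirrors the hazard: with an inadequate field of view the characteristic arrangement simply cannot be established.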
Spatial aperiodic
Num n/a
Temporal periodic
Num Number of objects varies temporally periodically
• Number of obj. in scene depends on time of recording, but periodically (and hence predictably)
• If the frequency exceeds the frame rate (faster), temporal aliasing occurs
• Mismatch of objects in consecutive images
• Aliasing frequency is interpreted instead of actual one
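The aliasing hazard follows the standard sampling relation: a periodic change at frequency f, sampled at frame rate fs, appears folded into the range 0..fs/2. A minimal sketch (function name ours):

```python
def aliased_frequency(f, fs):
    """Apparent frequency when a periodic change at f Hz is sampled
    at fs frames per second (folded into the range 0..fs/2)."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

# Objects appearing/vanishing 9 times per second, camera at 10 fps:
print(aliased_frequency(9.0, 10.0))  # 1.0 -> CV alg. sees 1 Hz, not 9 Hz
```

This is the "aliasing frequency is interpreted instead of the actual one" hazard in one line.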
Temporal aperiodic
Num Number of objects varies temporally aperiodically
• Number of Obj. in scene depends on time of recording, not periodically (and hence not predictable)
• Mismatch of objects in consecutive images
Close (spat.) Num A number of objs. is closer to observer than expected
• See Obj. Pos. • See Obj. Pos.
Remote (spat.) Num A number of objs. is more remote from observer than expected
• See Obj. Pos. • See Obj. Pos.
Before (temp.) Num n/a
After (temp.) Num n/a
In front of (spat.)
Num A number of obj. is in front of each other (with respect to the observer)
• They cover each other • They are indistinguishable
Behind (spat.) Num See In front of • See In front of • See In front of
Faster Num The num. of obj. changes faster than expected • The number of objects in scene depends significantly on time of exposure
• Aliasing – see Temporal periodic
• Scene phase misinterpreted
Slower Num n/a
No Pos No difference in position, hence multiple objects share the same position
• One object completely hides another
• Two objects are intertwined but share the same center point
• Algorithm misses detection of hidden object
• Algorithm sees multiple objects as one
E.g.: Two typewriter letter impressions at the same spot on a paper; something inside an opaque box
No Pos No Positions of Objects relative to each other known
• It is not possible to decide how an object is positioned relative to another
• Objects’ order in distance from Observer is confused
• Lateral relationship of objects is misinterpreted
Examples: mutual overlap, optical illusions, silhouette of at least one object outside of image
More Pos More different positions than expected • See Num • See Num
More Pos More discrete relative Pos.s of Obj.s than expected
• Object positions are quantized at a greater resolution than expected
• CV alg. expects discrete positions but gets positions in between → confused
Knowing relative pos.s of objs. more precisely than assumed means no risk, as long as the algorithm does not exploit the regularity. E.g.: Cars in a parking lot should be parked according to a grid of parking markings, but usually are not
More Pos. One or more extents of relative position are bigger than expected
• An arrangement of objects is “stretched”
• CV alg. based on assigning a group of objects to one entity is less likely to find the compound
E.g.: See Remote (spat.)
Less Pos Relative Pos.s of objects known Less (precisely) than expected
• It is difficult to decide how an object is positioned relative to another
• Some ordering of objects in distance is confused
• Some lateral relationships of objects are misinterpreted
Less Pos Less different positions than expected
• See Num • See Num
Less Pos. One or more extents of position are smaller than expected
• An arrangement of objects is “shrunken”
• The CV alg. is less likely to distinguish the two objects than to identify them as one
As well as Pos See No (2)
As well as Pos One extent of pos. is smaller and another is bigger than expected
• See More, Less • See More, Less
Part of Pos Multiple objects are partially overlapping
• See No • See No
Part of Pos. Only part of relative obj. Pos. known
• Information about object parts is missing
• Scene misinterpreted, because position of whole objects misleads CV alg.
Example: position of human is known, but not the pos.s of her arms
Reverse Pos Two objects have swapped positions (from the expected)
• Swapped positions create different scene than expected
• CV alg. confuses multiple objects because it expects each object at a specific position or relative position
(special case of Other than)
Other than Pos. Relative pos. of obj. is Other than expected
• Unexpected scene • Scene is completely misinterpreted
• Parts of scene are misinterpreted
• An object or some objects not correctly recognized or not recognized at all
Examples: object which should be behind another is in front of it; objects are vertically grouped instead of horizontally
Where else Pos. See Other than
Spatial periodic Pos. Objects are located regularly • Same kind of objects appear in a geometrically regular pattern
• Individual objects are confused
• Stereo CV alg. mismatches objects/points ⇒ distances of objects reported incorrectly
Example: windows in a larger building, ventilation holes in ceiling
Spatial periodic Pos. Objects are located regularly
• Different kinds of objects appear in a geometrically regular pattern
• Only regularity detected, but not the individual objects
Example: people in a row, books on a shelf
Spatial aperiodic
Pos. Objects are placed in a chaotic, unordered fashion
• No relative positions can be expected
• CV alg. cannot use a priori knowledge of positional relations (except maybe statistical knowledge)
• CV alg. expects special positional relations and is confused if the relation does not hold
Temporal periodic
Pos. Objects’ relative pos. changes periodically
• Objects’ relative pos. depends on time of observation (exposure), but reappears periodically and hence predictably
• “Unsynchronized” CV alg. misinterprets scene “phase”
• Also aliasing (misinterpretation of frequency) can occur if the period is too short
• Confusion of objects or their positioning
Example: an object orbits another ⇒ after a constant number of captures the (almost) same positioning will reappear ⇒ learnable. Interaction: Object
Temporal aperiodic
Pos. Objects’ movement schedule diverges from the expected periodic movement, hence being aperiodic
• There is an expected movement schedule from which the system diverges
• CV alg. expects a special movement schedule and is confused if the schedule is not met
Temporal aperiodic
Pos. Objects move in a chaotic, unordered fashion
• No movement schedule can be created
• CV alg. cannot use a priori knowledge of movements
Close (spat.) Pos Multiple objects closer together than expected (also see Less Pos)
• Visible connection between multiple objects is strengthened
• Multiple objects directly contact each other
• (also see Less Pos)
• CV alg. might interpret the multiple objects as something different (e.g. a single different object)
• (See Less Pos)
(Special case of Other than)
Remote (spat.) Pos Multiple objects are further apart from each other than expected (also see More Pos)
• Visible connection between multiple objects is weakened
• (also see More Pos)
• CV alg. might miss essential relationship between multiple objects because they are too far apart
Examples: legs of a table not recognized as belonging to one object if Observer too close; car headlights not recognized as belonging to the same car
Before (temp.) Pos One object is moving earlier than the others
• n/a • n/a This case does not produce special visual effects
After (temp.) Pos One object is moving later than the others
• n/a • n/a
In front of (spat.)
Pos Object is in front of another • Object covers (partially) another object → depending on its transparency, the object behind is more or less visible
• See Occlusion and Transparency
Interactions: • Occlusion • Transparency
Behind (spat.) Pos See In front of
Faster Pos The whole scene acts/moves faster than expected
• See Observer Exposure and Quantization
• See Observer Exposure and Quantization
Slower Pos The whole scene acts/moves slower than expected
• See Observer Exposure and Quantization
• See Observer Exposure and Quantization
No Occl. Objects are not occluding each other
• All objects are completely visible
• CV alg. is confused by clutter in scene
• Too many objects visible for correct and timely processing by CV alg.
Example: untidy room scene shows all objects that should be found as well as many potential false positive objects in one single view
More Occl. More objects occlude each other than expected • Less details of objects are visible
• Some objects are occluded to a degree which makes their detection difficult
• Detection quality is decreased because less information about the needed objects is available
More Occl. More visible area of an object is occluded by others than expected
• Less details of object are visible
• Detection quality is decreased because less information about the needed object is available
See also Obj. Part of Size
Less Occl. Less objects occlude each other than expected
• More details of objects are visible
• Detection quality is decreased by too much clutter in the scene
• See No
Less Occl. Less visible area of an object is occluded by others than expected
• More details of object are visible
• Detection quality is decreased by too much clutter in the scene
• See No
As well as Occl. Occluded objects further occlude other objects
• Multiple levels of occlusion mean most occlusions will remain even if the observer moves to a different position
• Object separation severely hampered
As well as Occl. Combination of More and Less
• See More and Less • See More and Less
Part of Occl. Object is partially occluded • Remaining visible parts have a different appearance than the whole object
• Confusion of partial image with another object can create malfunctions
Reverse Occl. Instead of occluding an object, it is made visible See Transparency entries
Reverse Occl. Occlusion is reverted • Object B occludes object A instead of A occluding B, as usual
• Objects are confused (B taken for A)
E.g.: Foreground/background confusion
Other than Occl. See Where else
Where else Occl. Different parts of an object are occluded than expected • Parts of an object which are usually occluded are visible, while others are occluded which are usually visible
• Object is not recognized
• Object is confused with another
• CV alg. needs special parts of an object to be covered; a lack of important parts hampers detection
Examples: on an assembly line a part of a glass bottle label is always occluded by some grabber-part of the assembly line; one grabber is malformed and the label is exposed, resulting in a malfunction in the visual quality control system
Spatial periodic Occl. Occl. creates an ordered pattern; there is some order/rule as to what parts of an object are occluded
• Ordered occlusion creates a pattern
• CV alg. confuses occlusion pattern with object
Examples: picket fence; tree trunks in a dense forest - complex shaped objects occlude each other - view through periodically perforated object
Spatial aperiodic
Occl. Occl. creates a chaotic/unordered pattern
• Occlusions are chaotic • CV alg. does not handle occlusions correctly
• Scene analysis takes longer than allowed
Example: view through an aperiodically perforated object
Temporal periodic
Occl. Occl. follows a time schedule • See Observer Exposure and Quantization
• Occlusions can be filtered/reduced by time
• See Observer Exposure and Quantization
• CV alg. expects occlusions to follow a time schedule but this schedule is not adhered to
When filming footage from a helicopter the camera might trigger/integrate light whenever the helicopter blades are not in view. If the synchronicity is amiss, the camera might film the blade/blade shadow as well
Temporal aperiodic
Occl. Occl. follows no special time schedule
• Occl. cannot be filtered by time
• CV alg. cannot filter Occl. by time
Close (spat.) Occl. Object which occludes another is closer (to Observer) than expected
• Remote object is completely hidden by the closer object
• Remote object is invisible
• As long as:
− Closer object is opaque
− No shadow of remote object is visible
− It is not reflected in some third object
− Gravitational lens effect of occluding object is ignorable
Remote (spat.) Occl. Object which occludes another is more Remote (from Observer) than expected
• Nearer object covers only some “inner” part of remote object → complete silhouette of remote object is visible
• Closer object is considered as part of remote object
Example: moon in front of planet
Before (temp.) Occl. Occl. appears earlier than expected • See Temp. periodic • See Temp. periodic See Temp. periodic
After (temp.) Occl. Occl. appears later than expected • See Temp. periodic • See Temp. periodic See Temp. periodic
In front of (spat.)
Occl. See Close, Remote
Behind (spat.) Occl. See Close, Remote
Faster Occl. Occl. changes faster than expected • Mixture of effects from Pos. and other Occl. entries
• Mixture of effects from Pos. and other Occl. entries
Slower Occl. Occl. changes slower than expected
• Mixture of effects from Pos. and other Occl. entries
• Mixture of effects from Pos. and other Occl. entries
No Shad. No shadowing in the scene / Object is not shadowing the scene
• No cast shadows visible. Possible reasons:
− only diffuse light
− many l.s. in/around scene
− (strong) l.s. very close to observer (e.g. ring light)
• Small (minimal) contrast
• Without shadow the object position is more difficult to derive
• Objects not, or wrongly, recognized, because roughness, hills or dips on their surface are not detected
• Relationships among objects incorrectly interpreted
• Completely occluded object remains undetected (no cast shadow from it visible)
• CV alg. dependent on shadows miscalculates the object’s position
This can be the desired case, while others depend on shadowing. E.g.: Shape from shadowing will fail. A car detector is trained for shadows beneath the car and the recognition rate suffers from lack of shadows
No Shad. Object shadow is hidden by other object or lies outside of FOV
• Without shadow the object position is more difficult to derive
• CV alg. miscalculates the object’s position
More Shad. Object has bigger/darker shadow than expected
• Shadow may “occlude” more in the scene than expected
• See More Occl.
• See More Occl.
More Shad. Object is within bigger/darker shadow than expected
• See More Occl. • See More Occl.
More Shad. More shadowing than expected
• Large parts of scene in shadow
• Underexposure: objects in shadow not detected
Less Shad. Object has smaller/lighter shadow than expected
• Shadow may “occlude” less in the scene than expected
• See Less Occl.
• See Less Occl.
Less Shad. Object is within smaller/lighter shadow than expected
• See Less Occl. • See Less Occl.
Less Shad. Less shadowing than expected
• More parts of scene in light than usual
• More parts of the scene contain visible information that must be processed
• Overexposure: similar to No
• Excessive computing time for image processing needed
As well as Shad. Several objects cast shadows onto the same object
• Combination and thus more complex shadows are possible
• See More Shad.
• See More Shad.
As well as Shad. Both more and less Shadowing than expected
• Combination of More and Less • Combination of More and Less in different image regions
Possible reasons: strong l.s. within scene; wide angle view
Part of Shad. Partial shadow is visible on Object
• See Occl. • See Occl.
Reverse Shad. Shading object and shadowed object exchange roles (not their position, e.g. by changing pos. of l.s.)
• See Where else • See Where else
Reverse Shad. Shadowing of objects is reversed
• Parts usually shadowed are not shadowed and vice versa
• CV alg. confused – misinterpretation of scene
Possible reason: l.s. pos. changed
Reverse Shad. Shadows are reversed • All shadows point in opposite direction than usual
• CV alg. confused – misinterpretation of scene
Possible reason: l.s. pos. or Observer pos. on the other side of the scene than usual. Example: in satellite images with the sun near the horizon, valleys look like hills and vice versa
Other than Shad. See Where else
Where else Shad. Different parts of an object than expected are shadowed
• See More, Less, As well as
• CV alg. is confused by unexpected shading -> misdetections
Where else Shad. Reflecting obj. is within shadow
• Reflecting object is less recognizable than if illuminated → reflected objects become more prominent, but distortion effects through reflecting obj. remain
• Reflecting objects in shadow remain undetected
• Reflected objects are interpreted as real, but aliasing effects distort reflected images → hampered recognition
Where else Shad. Transp. obj. is shad.
• Transparent object is less recognizable than if illuminated → objects behind it become more prominent, but aliasing effects through transp. remain
• Objects in shadow remain undetected
• Aliasing effects distort seen-through objects → hampered recognition
Where else Shad. Refl. obj. is outside of scene
• Shadows cast into the scene from aside
• Hampered object recognition
Spatial periodic Shad. Spatial periodic shadows; there is some order/rule as to what parts of an object are shaded
• Regular shadows create a pattern
• See Where else
• CV alg. confuses shadow pattern with object
• CV alg. confuses shadow pattern with texture
Spatial aperiodic Shad. Spatial aperiodic shadows - Shad. creates a chaotic/unordered pattern
• Shadows are chaotic
• CV alg. confused by irregular shadows -> misdetections (shadows taken for textures)
Temporal periodic Shad. Shad. follows a time schedule
• According Pos. • See Observer Exposure and Quantization
• According Pos. • See Observer Exposure and Quantization
Requires regular movement of l.s.
Temporal aperiodic Shad. Shad. follows no special time schedule
• Shad. cannot be filtered by time / exploiting scene phase
• CV alg. cannot filter Shad. by exploiting scene phase
Requires irregular movement of l.s.
Close (spat.) Shad. Shadow is closer than expected
• See More • See More
Close (spat.) Shad. Object which shades another is closer to it than expected
• Depending on relative size of l.s., shadowed object is either more (e.g. completely) or less in shadow
• See More and Less
Close (spat.) Shad. Object which shades another is closer to l.s. than expected
• See More and Less • See More and Less
Remote (spat.) Shad. Shadow is more remote than expected
• See Less • See Less
Before (temp.) Shad. Shad. appears earlier than expected
• See Temporal aperiodic • See Temporal aperiodic
After (temp.) Shad. Shad. appears later than expected
• See Temporal aperiodic • See Temporal aperiodic
In front of (spat.) Shad. See More / Less / As well as / Part of
Behind (spat.) Shad. See More / Less / As well as / Part of
Faster Shad. Shad. (Obj.) moves faster than expected
• Shadows move “faster than exposure”
• Motion blur of shadows – shadows diffused – hampered scene recognition
Slower Shad. Shad. (Obj.) moves slower than expected
• Shadows move “slower than exposure”
• Shadows sharper than expected – hampered scene recognition
• Shadows move so slowly that they are taken for texture
Example: cloud shadows on landscape
No Refl. There are no reflections between objects
• Hidden objects cannot be seen
• Hidden objects remain undetected
as long as no transparencies or extruding shadows occur
More Refl. There are more reflections between objects than expected
• Creates multiple views within the scene
• Can create multiple visible instances of the same object
• CV alg. might confuse reflectance with reality and infer wrong position/relation data
• CV alg. detects more objects than there are in the scene
Less Refl. There are fewer reflections between objects than expected
• Reflection can be used to see different views in one image; if the expected reflection is hampered, this reduces the quality/visibility of this different view
• CV alg. could miss a second point of view and thus needed additional information about objects
• The scene may not be sufficiently illuminated, due to only dark, non-reflecting objects
− Mirror/reflector is used to concentrate light
− Mirror is used to inspect front and back side of an object with one camera
As well as Refl. See More
As well as Refl. Both More and Less Refl. Obj. in scene than expected – in different scene areas
• Strongly diversified complexity in different scene areas
• See More and less/No
Part of Refl. A reflection of an object is partially visible
• See Occl. • See Occl.
Part of Refl. A reflection on an object is partially visible
• A highlight in an object is partially covered by another
• Overblending – partial hampering of correct situation recognition
Reverse Refl. Observer sees itself in a reflection instead of expected object
• Own body / observer itself visible as an object
• CV alg. confuses observer/its own body with other objects
Reverse Refl. Refl. are reversed
• Object reflections appear reversed to expected, e.g. upside down or laterally inverted
• CV alg. confused
Example: in concave mirrors, objects appear laterally reversed, upside down etc.
Other than Refl. Reflections are different than expected
• Reflections occur where none should and vice versa
• See No/Less/More
Where else Refl. Refl. Obj. is Shad. • See Shad. • See Shad.
Where else Refl. Refl. Obj. is Transp.
• On Obj.’s surface, reflected and seen-through objects merge
• Misinterpretation of reflecting object and its associated images
Example: window panes
Spatial periodic Refl. Refl. creates an ordered pattern; there is some order/rule as to what parts of an object are visible through reflections
• Ordered reflectance creates a pattern
• See Where else
• CV alg. confuses reflectance pattern with object or textures
Spatial aperiodic Refl. Refl. creates a chaotic/unordered pattern
• Refl. is chaotic/irregular • CV alg. confused by irregular reflectance -> misdetections
Temporal periodic Refl. Refl. follows a time schedule
• See Observer Exposure and Quantization
• See Observer Exposure and Quantization
Temporal aperiodic Refl. Refl. follows no special time schedule
• Refl. cannot be filtered by time / exploiting scene phase
• CV alg. cannot filter Refl. by exploiting scene phase
Close (spat.) Refl. Refl. is closer than expected
• See More • See More
Close (spat.) Refl. Refl. Obj. is closer to reflected object than expected
• Real and reflected object may appear as (symmetrical) pair
• False positive: object actually interpreted as (symmetrical) pair
Close (spat.) Refl. Refl. Obj. is closer to Observer than expected
• Reflections are larger and/or brighter than expected
• False positive: mir-rored scene taken for real
• Overexposure: reflec-tion too bright
Remote (spat.) Refl. Refl. is more remote than expected
• See Less • See Less
Remote (spat.) Refl. Refl. Obj. is more remote from reflected object than expected
• Association between real obj. and mirror image lost
• False positive: mirror image reported as real object
Remote (spat.) Refl. Refl. Obj. is more remote from Observer than expected
• Reflections are not recognizable any more
• No mirror effect can be exploited (hidden objects cannot be detected through reflections anymore)
Before (temp.) Refl. Refl. changes earlier than expected
• Refl. cannot be filtered by time / exploiting scene phase⁴
• CV alg. cannot filter Refl. by exploiting scene phase
Requires predictable scene changes. E.g.: a rotating reflecting object with known and expected period (e.g. gripper of a lathe) deviates from this period.
After (temp.) Refl. Refl. changes later than expected
• Refl. cannot be filtered by time / exploiting scene phase
• CV alg. cannot filter Refl. by exploiting scene phase
See above
After (temp.) Refl. After reaching a certain age obj. are losing reflection property
• See Less Reflection • CV alg. could miss a second point of view and thus needed additional information about objects
“Only new things shine”
In front of (spat.) Refl. Refl. occurs directly in front of observer
• (Almost) no means given to distinguish reflection from reflected object(s)
• Reflection is confused with original object(s)
System (e.g. robot) sees itself in a mirror
In front of (spat.) Refl. Reflecting obj. in front of expected object
• Expected reflections not detectable by Observer – as long as reflecting Obj. is opaque
• No mirror effect can be exploited
• Also see Where else Shad.
In front of (spat.) Refl. Refl. obj. in front of observer
• See Close • See Close
Behind (spat.) Refl. See As well as / Part of
⁴ “exploiting scene phase” meaning: there exists a temporal schedule that ensures that the l.s. and the objects are positioned relative to each other in such a way that the amount of reflections is minimized during the exposure timeframe.
Behind (spat.) Refl. Refl. obj. behind observer
• If also a reflecting object is in front of observer, infinite reflections can occur
• CV alg. confused
Faster Refl. Refl. changes faster than expected
• Refl. change during exposure • Reflections are smeared out / blurred
Slower Refl. Refl. changes slower than expected
• Refl. stay at one pos. longer than expected
• Overexposures of reflections
No Transp. All objects are opaque
• Objects which are occluded by other objects are hidden from Observer’s view, as long as shadows extruding behind occluding objects don’t reveal them
• Occluded objects not detected
• Cast shadows misinterpreted as objects or textures on visible objects or background
More Transp. The objects are more transparent than expected
• More objects are visible than usual
• More “looking through” effects than usual
• CV alg. is confused by clutter in scene
• Scene analysis takes too much time
Less Transp. The objects are less transparent than expected
• Objects can be occluded, or the bad visibility of objects through another object can reduce the visible contrast of these objects
• Decreased visible image quality can reduce the detection quality of a CV alg.
An autonomous car sees through the front pane, but the pane is murky and the car cannot detect the lane correctly
As well as Transp. Different object transparencies are combined
• Multiple layers of transparency create a reduced/combined transparency
• See Less
As well as Transp. Different object transparencies in different scene areas
• Strongly diversified complexity in different scene areas
• See More and less/No
Part of Transp. Parts of an object are transparent and allow a part of another object to be seen
• Complex mixture of multiple objects visible through projection although the objects are not intertwined
• Misdetection of objects as appearances are changed
Part of Transp. Transp. obj. in unusual part of scene
• Scene looks different from usual
• CV alg. confused Example: floor is transparent
Reverse Transp. See More and Less
Other than Transp. Other parts of an object than expected are shown through transparency
• Different object parts visible • Misdetection of objects as appearances are changed
Where else Transp. Different parts of an object show other objects through Transp. than expected
• Different object parts visible • Misdetection of objects as appearances are changed
Where else Transp. Transp. obj. is shad. • See Shad. • See Shad.
Where else Transp. Transp. obj. is reflecting • See Refl. • See Refl.
Spatial periodic Transp. Transp. creates an ordered pattern; there is some order/rule which parts of an object are visible through other objects
• Regular transparencies in scene
• Ordered transp. creates a pattern
• Different object parts visible
• CV alg. confuses transparency pattern with object
Example: conventional large buildings with arrays of windows
Spatial aperiodic Transp. Transp. creates a chaotic/unordered pattern
• Transp. is chaotic • CV alg. confused by irregular transparency -> misdetections
Temporal periodic Transp. Transp. follows a time schedule
• According Refl. • See Refl.
Temporal aperiodic Transp. Transp. follows no special time schedule
• According Refl. • See Refl.
Close (spat.) Transp. Transp. is closer to seen-through object than expected
• Separation between transp. obj. and those behind is aggravated
• Objects not correctly distinguished
Close (spat.) Transp. Transp. obj. is closer to observer than expected
• Silhouette of transparent obj. only partially visible, or not at all
• Transp. obj. confused with medium
• Optical distortions of seen-through obj. not correctly interpreted
Remote (spat.) Transp. Transp. obj. is more remote from seen-through object(s) than expected
• See Less • See Less
Remote (spat.) Transp. Transp. obj. is more remote from Observer than expected
• Transparent object smears with background
• Transp. obj. too far away or too faint for being detected
Before (temp.) Transp. Transp. appears earlier than expected
• According Refl. • See Refl.
After (temp.) Transp. Transp. appears later than expected
• According Refl. • See Refl.
In front of (spat.) Transp. Transp. obj. in front of another Transp. obj.
• Transparency effects accumulate
• Objects not correctly separated
• CV alg. confused
Examples: several windows in sequence; drinking glass in front of another; carafe seen through window
Behind (spat.) Transp. Transp. obj. behind another Transp. obj.
• See In front of • See In front of See In front of
Faster Transp. Transp. appears faster than expected
• According Refl. • According Refl.
Slower Transp. Transp. appears slower than expected
• According Refl. • According Refl.
No Wave No interference between objects
• No respective optical effects caused by objects
• Systems depending on coherent or polarized light fail
The usual case avoids interferences, but special applications like the visualisation of stress in materials use light interference on purpose
More Wave More interferences between objects than expected
• Accumulation of optical effects such as halos, rainbows, auras etc.
• CV alg. confused – misinterpretation of visual effects
Example: special weather conditions, with rain, clouds, low sun
Less Wave Fewer interferences between objects than expected
• No optical effects where they are expected
• Wrong scene interpretation
Example: in particular physical experiments, expected effects do not occur because of e.g. pollution of used materials
As well as Wave Both more and less interferences between objects than expected (in different scene areas)
• Highly complex scene – see More and Less
• Mixture of More and Less effects, depending on relative positioning of objects
Example: sunset with special weather conditions seen through window, indirect light with no wavelength effects in interior
Part of Wave Only part of expected optical effects
• Interpretation of scene is complicated
• Hampered or wrong scene interpretation
E.g.: only the first mode of an interference pattern is within the field of view, therefore not allowing measurement of the distance between modes
Reverse Wave n/a
Other than Wave n/a
Where else Wave n/a
Spatial periodic Wave Spatial periodic variation of Wave effects (of some objects)
• Interferences occur regularly in scene
• Confusion of objects causing interference effects
• Interferences are interpreted as objects or textures
Example: diffraction rings around barely visible holes in chemical diaphragm
Spatial aperiodic Wave Spatial aperiodic variation of Wave effects (of some objects)
• See Spatial periodic • See Spatial periodic
Temporal periodic Wave Temporal periodic change of optical interferences between several objects
• Scene appearance depends on exposure time, but repeatable and hence learnable
• Misinterpretation of scene “phase”
In principle, effects are predictable
Temporal aperiodic Wave Temporal aperiodic change of optical interferences between several objects
• Scene appearance depends on exposure time, not precisely repeatable
• Misinterpretation of scene “phase”
Effects are only statistically repeatable
Close (spat.) Wave Interfering objects are close to each other (closer than expected)
• In general, optical effects are reduced,
• but small distance changes can influence appearance of optical effects more significantly
• Use of optical effects to calculate object positions – interferometry – is hampered
Remote (spat.) Wave Interfering objects are far away from each other (more than expected)
• Optical effects can be aggravated,
• but less sensitive to small distance changes
• Scene not correctly recognized
• Distance between objects imprecisely measured
Before (temp.) Wave n/a
After (temp.) Wave n/a
In front of (spat.) Wave Interfering obj. in front of another
• Complex composition of optical effects
• Scene not correctly recognized
• Complex optical effects interpreted as texture or objects of their own
Example: polarization filter in front of diffraction hole
Behind (spat.) Wave Interfering obj. behind another
• See In front of • See In front of See In front of
Faster Wave Interferences change faster than expected
• Interferences change during exposure – smeared out
• Hampered recognition of visual effects
Slower Wave n/a
2.6 Observer - Optomechanics
2.6.1 Parameters
Table 12: Observer - Optomechanics Parameters
Parameters Values / Range Meaning
Number N Number of observers: The number of observers is not necessarily the same as the number of cameras used to capture a scene. A camera used multiple times from different positions is equivalent to a number of distinct cameras capturing the scene from different positions at the same time, as long as the scene is static.
Viewing orientation 3 angles • Viewing direction of observer (optical axis + direction) in the world • Rotation of the image plane around the optical axis (so-called up-vector)
Viewing position position vector (3D) Viewing position of observer within the scene assuming a world coordinate system
Focusing • Focal distance: the distance which is focused on (also called object distance), the distance where objects are most in focus.
• Depth of field (DoF): distance range of “sharp” scene mapping. The DoF is a function of aperture, focal length and the size of the image sensor.
• Note: The focal length is not part of Focusing since it corresponds to the amount of
magnification or zoom and is therefore assigned to the Field of View
Field of View horizontal and vertical angle of view, determined by focal length and photoelectric sensor size
Two angles span the infinite (pyramid-shaped) volume within the world which is observed. The extent of the FOV is a function of focal length and photoelectric sensor size, while its position is a function of the Viewing orientation and Viewing position.
Transparency Light spectrum gathered and transmitted, inner material effects and optical shutter effects
Spectrum Colour effects
Lenses number N Number of lenses in lens assembly
Lenses geometry • Thickness • Curvature (concave, convex, plane) • Surface (or outer material effects)
Wave property • Polarization • Coherence
Aperture • Diameter • Shape • Other aperture param. which can affect capturing
Optical Point Spread Function
Distribution of light from a point-like source over the sensor surface. Formally:
• Representation of the spatial impulse response of the optical system, which represents the response of a particular pixel sensor to an ideal point light source. Includes all optical effects of the lens system like blurring and anti-aliasing filters [Joshi+08] but is also affected by aperture, focal length, focusing distance, and illuminant spectrum [Shih+2012].
• Can be interpreted as the PSF (the 2-dimensional impulse response) or its counterpart, the MTF (Modulation Transfer Function), in the frequency domain.
• Has a non-negligible extent: the neighbourhood of pixels that is affected to a non-negligible amount.
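The geometric relations stated for Field of View and Focusing above can be sketched numerically. This is a minimal illustration using the standard pinhole and thin-lens approximations; the function names and sample values (full-frame sensor, 50 mm lens, circle of confusion 0.03 mm) are illustrative assumptions, not taken from this deliverable:

```python
import math

def angle_of_view(sensor_dim_mm, focal_length_mm):
    # Pinhole model: FOV = 2 * atan(d / (2 * f)). This is why focal length
    # belongs to Field of View (magnification), not to Focusing.
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

def dof_limits(focus_dist_mm, focal_len_mm, f_number, coc_mm=0.03):
    # Thin-lens DoF approximation: near/far limits of acceptable sharpness
    # around focus distance s, for aperture N and circle of confusion c
    # (c encodes the sensor/pixel-size dependence).
    f2 = focal_len_mm ** 2
    x = f_number * coc_mm * (focus_dist_mm - focal_len_mm)
    near = focus_dist_mm * f2 / (f2 + x)
    far = focus_dist_mm * f2 / (f2 - x) if f2 > x else float("inf")
    return near, far

# Hypothetical full-frame sensor (36 x 24 mm) with a 50 mm lens:
h_fov = angle_of_view(36, 50)   # ~39.6 degrees horizontal
v_fov = angle_of_view(24, 50)   # ~27.0 degrees vertical

# Same lens focused at 5 m: stopping down (larger f-number) widens the DoF.
near_f8, far_f8 = dof_limits(5000, 50, 8)
near_f2, far_f2 = dof_limits(5000, 50, 2)
```

Opening the aperture (f/8 → f/2) shrinks the in-focus range, consistent with the table's statement that DoF is a function of aperture, focal length and sensor size.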
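The Point Spread Function described above can be made concrete with a minimal sketch: a normalized Gaussian kernel stands in for a measured PSF (a simplifying assumption; real PSFs also depend on aperture, focus and spectrum), and a direct 2D convolution spreads an ideal point source over its pixel neighbourhood:

```python
import numpy as np

def gaussian_psf(size=5, sigma=1.0):
    # Normalized 2D Gaussian used as a simple stand-in for a measured PSF.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def apply_psf(image, psf):
    # Direct "same"-size 2D convolution with zero padding: each output pixel
    # is the PSF-weighted sum of its neighbourhood, i.e. the spatial impulse
    # response applied across the image.
    kh, kw = psf.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * psf[::-1, ::-1])
    return out

# An ideal point light source: a single bright pixel.
point = np.zeros((15, 15))
point[7, 7] = 1.0
blurred = apply_psf(point, gaussian_psf())
# The response to the point source *is* the PSF: energy is preserved but
# spread over a non-negligible pixel neighbourhood.
```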
2.6.2 HAZOP
Table 13: Observer - Optomechanics HAZOP
Guideword Par. Meaning Consequences Hazards Note
No Num No observers present
• No observations available • Misinterpreted as unlit scene
• SUT cannot detect anything but does not realize that there is no observer
• SUT detects false objects (archived input or supplied by other components that are no visual observers, “synesthesia”)
More Num More observers present than expected
• More observations are made
• Input data grows
• Calculation time exceeds available time
• Multiple observers are confused with each other, as fewer observers are expected
• More resources are needed to process the increased amount of information, thus unnecessarily blocking them for other components
Less Num Insufficient observers present
• Views on scene do not allow capturing the complexity of the scene
• The imperfect input data leads to imperfect results. The imperfection can lead to no, unexpected, inaccurate or wrong output
E.g.:
No: 3D reconstruction needs at least two observers
Inaccurate: bundle adjustment accuracy increases with number of Obs.
Unexpected/wrong: a complex scene needs to be seen from a number of positions to be able to capture its complexity.
Less Num System uses less observers than expected
• There are fewer different observers present than expected
• There is less diverse information available than needed to achieve the necessary goals/quality
e.g. CCTV does not cover a specific place -> no detection there
As well as Num System uses the same observer in multiple ways
• The same observer output is taken for different observers from different positions
• See Less e.g. Left camera = right camera for stereo vision
Part of Num Only a part of the expected observers are identifiable (can be located)
• See Less • See Less
Reverse Num n/a
Other than Num The number of Observers differs from the expected Num
• See Part of, More, Less • See Part of, More, Less
Where else Num Observers are located/oriented differently than expected
• All data dependent on the expected position is inaccurate
• Wrong, inaccurate, or indeterminable output
E.g.: a certain fixed baseline length is expected in a stereo system. Deviation from this length leads to inaccurate, wrong or indeterminable output.
Spatial periodic Num Several observers are placed in a regular grid/array
• The spatial information in the scene (its detail) must not exceed a certain frequency, otherwise it is lost or aliasing can occur
• Insufficiently detailed or wrong output
E.g.: a setup like this allows for interpolation of images in between by quantization of the plenoptic function. Another example: a compound-eye lens
Spatial aperiodic Num Several observers are placed at different positions
• See Spatial Periodic • See Spatial Periodic See Spatial Periodic
Temporal periodic
Num See Temporal aperiodic • See Temporal aperiodic • See Temporal aperiodic More likely to be aperiodic
Temporal aperiodic
Num Over time the number of observers fluctuates chaotically
• The total amount of available information about the scene fluctuates chaotically
• The system’s performance fluctuates chaotically
Temporal aperiodic
Num Number of “usable” observers varies aperiodically
• Observer positions are determinable with periodically changing accuracy, therefore previous positions can have varying confidence
• If the varying confidence is not taken into account, inaccuracy will propagate over time
Close (spat.) Num All observers are close to each other (short baseline)
• A short baseline makes triangulation results less accurate since the displacement of corresponding image points is smaller
• The common FOV is limited to a certain direction
• Reconstruction results will be accurate for this direction only
• Camera pose estimation fails or is inaccurate
• Reconstruction fails or is inaccurate
• Observer-direction-dependent optical effects are misinterpreted as object-inherent (e.g. texture)
The accuracy of 3D reconstruction is proportional to the ratio of baseline to object distance
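The baseline-to-distance relation in the Note above can be made concrete with a first-order error model for stereo triangulation. A minimal Python sketch, in which the focal length (pixels), baseline (metres), and disparity error are illustrative assumptions rather than values from this study:

```python
def depth_error(z_m, focal_px, baseline_m, disparity_err_px=0.25):
    """Approximate depth error of stereo triangulation at distance z_m.

    From z = f*B/d it follows that a disparity error dd maps to a
    depth error of roughly dz = z^2 / (f*B) * dd.
    """
    return (z_m ** 2) / (focal_px * baseline_m) * disparity_err_px

# A shorter baseline ("Close (spat.)") gives a larger depth error
# at the same object distance:
short = depth_error(z_m=5.0, focal_px=800.0, baseline_m=0.06)
long_ = depth_error(z_m=5.0, focal_px=800.0, baseline_m=0.30)
assert short > long_
```

The quadratic growth in z also illustrates why distant objects suffer disproportionately from a short baseline.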
Remote (spat.) Num Observers are far from each other (wide baseline)
• Hard to find correspondences between images from different positions -> Reconstruction results are more likely to fail
• Relative camera positions might not be determinable
• Relative object positions might not be determinable
Before (temp.) Num The number of observers is reduced before it was expected
• See Less • See Less e.g. some cameras in CCTV are shut down too soon (i.e. before closing time), so that parts of a store are open but not supervised
Before (temp.) Num The number of observers is increased before it was expected
• See More • See More e.g. some cameras in CCTV are started too soon (i.e. before opening time), so that parts of a store are supervised, but not open (this could drain resources that are needed elsewhere)
After (temp.) Num The minimum number of observers is reached after it was expected
• See meaning • Any event that shall be registered within a 3D environment is lost until the minimum number of observers is reached
E.g.: At least two observers are needed to initialize a 3D environment. If a camera stays static and shall be treated as multiple observers the system cannot be initialized.
In front of (spat.) Num See Pos
Behind (spat.) Num See Pos
Faster Num Over time the number of observers changes faster than expected
• See Before • See Before
Slower Num Over time the number of observers changes slower than expected
• See After • See After
Slower Num Moving single observer (treated as multiple observers) is too slow in capturing a dynamic scene, which is assumed to be static
• Artefacts are introduced in the observed image data itself due to asynchronous capturing
• Unpredictable results
The assumption that a moving single observer equals multiple observers is not fulfilled. See definition of Num
All following analyses are per observer, if not otherwise indicated.
Guideword Par. Meaning • Consequences • Hazards Note
No FOV No FOV at all
• Extremely small section of scene visible
• Nothing (correctly) detected
focal length infinite
More FOV Observer uses a bigger FOV than expected
• Focal length smaller than expected
• More of scene captured than expected
• Object size shrinks/grows with increasing/decreasing distance more strongly than expected
• Details are lost for the sake of a bigger FOV
• Objects appear smaller than expected
• Image distortion (pincushion) and vignetting can become noticeable
• View probably has more overlap with another view
• System will be out of focus
• More distant entities not detected
• Objects interpreted as smaller than they are
• Image processing takes longer than feasible, because “more world” has to be covered
• The lens system has to change the focus distance accordingly, otherwise the system is out of focus
Example: wide angle instead of normal angle. Happens whenever zoom optics are misused. Whenever a zoom lens changes its focal length, it also has to change the focus distance so that the object distance stays the same; otherwise an object at that distance will not be in focus
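The coupling between focal length and FOV described above follows the pinhole relation FOV = 2·atan(w/2f). A hedged Python sketch; the sensor width and focal lengths are illustrative assumptions, not parameters from this study:

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_mm):
    """Horizontal field of view of a (distortion-free) pinhole camera model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

# "More FOV": a smaller focal length than expected widens the view
# and makes objects appear smaller.
wide = horizontal_fov_deg(36.0, focal_mm=24.0)    # misused wide-angle lens
normal = horizontal_fov_deg(36.0, focal_mm=50.0)  # expected "normal" lens
assert wide > normal
```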
Less FOV Observer uses a smaller FOV than expected
• Focal length larger than expected
• Less of scene captured than expected – overview reduced
• Objects appear bigger than expected
• Object size shrinks with distance more weakly than expected
• View probably has less overlap with previous view
• System will be out of focus
• Motion estimation possibly hampered (aperture problem, see [Hildreth+83])
• Objects interpreted as bigger than they are
• Relevant scene elements not seen (not within image)
• Connection to previous view might be lost – (registration/visual odometry wrong)
• The lens system has to change the focus distance accordingly, otherwise the system is out of focus
See More FOV
As well as FOV Viewing angle both larger and smaller than expected → aspect ratio different from expected
• Scene appears compressed along one image axis
• Objects’ appearance misinterpreted
• Objects confused
As well as FOV Multiple Observers are used together
• Intersection of FOVs needed to merge information
• See Num
• See Calib. of Algorithm • See Num
Part of FOV Part of FOV is always blocked
• See Observer Less Transp
• See Medium Less Transp
• See Obj. Close (spat.)
• See Observer Less Transp
• See Medium Less Transp
e.g. lens body or lens hood (sunshade) extending into FOV. Covered by other combinations, e.g. “FOV clipped in depth”: semi-opaque medium, walls etc.
Reverse FOV Aspect ratio reversed (e.g. portrait instead of landscape)
• See Orientation • See Orientation
Reverse FOV FOV points in opposite direction than expected
• See Orientation • See Orientation
Other than FOV Orientation/Position is Other than expected
• Tilted view • View upside down
• Objects not recognized • Whole scene not correctly recognized
Can happen if the sensor-carrying unit tips
Where else FOV See VOrient/VPos
Spatial periodic FOV FOV changes its orientation periodically
• Can miss things happening in overall (connected) FOV
• Detection fails
E.g.: An automated PTZ camera is tricked by a burglar who stays out of its changing FOV
Spatial aperiodic FOV FOV changes spatially and temporally aperiodically
• See periodic • See periodic
E.g.: An automated PTZ camera which is tracking an object
Temporal periodic
FOV FOV changes in a temporally periodic manner
• FOV changes in a periodic manner
• Continuous confusion of CV alg.
Example: Failure of zoom automatics
Temporal periodic
FOV FOV changes in a temporally periodic manner
• See Spatial periodic • See Spatial Periodic See Spatial Periodic
Temporal aperiodic
FOV FOV changes in a temporally irregular manner
• FOV changes in a temporally non-periodic manner
• Continuous confusion of CV alg.
Example: Failure of camera automatics
Close (spat.) FOV The FOVs of two observers have a large overlap
• Leads to small disparities in corresponding points
• See Close (spat.) Num
• See Close (spat.) Num If additional views are available, this may not be critical at all
Remote (spat.) FOV The FOVs of two observers have a small overlap
• Leads to large disparities in corresponding points
• See Remote (spat.) Num
• See Remote (spat.) Num
Before (temp.) FOV FOV changes earlier than expected
• FOV changes earlier than expected
• CV system misinterprets scene phase
After (temp.) FOV FOV changes later than expected
• FOV changes later than expected
• See Before
In front of (spat.)
FOV One observer is in front of the other
• The object characteristics of the observer in front are captured by the rear observer
• See Close
• Possible misdetections: the front observer is mistaken for an object of interest
• Front observer blocks the view of the rear one (See Obj. Position Close)
• See Close
Behind (spat.) FOV One observer is behind the other
• See in front of • See in front of
Faster FOV FOV changes faster than expected
• PTZ movement causes motion blur
• PTZ motion confused with Observer motion
Slower FOV FOV changes slower than expected
• Wrong section of scene captured
• See More/Less
No VOrient Relative orientation of observer in world unknown
• Camera orientation undefined
• See No FOV
• Miscalculation of relative positions
• Miscalculation of all consecutive steps
• See No FOV
E.g: Flexible/moving Observer mounting introduces additional uncertainties about the current FOV
More VOrient. Observer is more oriented towards ROI than expected
• The object has not yet entered the FOV; this entrance event is not captured
• Missed detection of dynamic objects
Less VOrient. VOrient less oriented towards ROI than expected
• Only part of relevant scene is captured
• Wrong objects confused with expected ones
• Scene not correctly recognized
• Unusual objects in scene cause longer processing time than allowed
As well as VOrient. See Part of
Part of VOrient. Misorientation in only one or two rotational axes • See Less • See Less
Reverse VOrient. Observer looks in the opposite direction than expected
• FOV includes a very different scene than expected
• See Other than
Other than VOrient. Observer oriented towards a wrong direction
• (Completely) different scene captured
• Wrong objects confused with expected ones
• Nothing correctly recognized
• Unusual objects in scene cause longer processing time than allowed
Where else VOrient. See Other than • See Other than • See Other than
Spatial periodic
VOrient. Observer orientation can only be set to discrete values
• Best orientation to center the view on an object or capture multiple objects in one view cannot be used as it is not available due to the movement discretization
• Objects might be partially outside the FOV
• Spatial aliasing possible
• Misdetections • Incomplete reconstruction (holes)
E.g. for aliasing: in image-based rendering (quantization of the plenoptic field)
Spatial aperiodic
VOrient. VOrient is not constrained (within a given range)
• Additional uncertainties due to arbitrary viewing direction
• Additional uncertainties introduce further uncertainties for the position estimation of objects relative to each other
E.g.: Manually controlled ego-motion camera; the observer/sensor can be turned arbitrarily relative to the SUT’s origin, or the SUT can fly
Temporal periodic
VOrient. VOrient changes in a temporally periodic manner
• Motion blur
• Captured FOV changes in a periodic manner
• Coupled with synchronised exposure, only a fraction of the scene could be recorded over longer periods of time
• Can miss things happening in overall (connected) FOV
• Temporal aliasing if the frequency exceeds half the frame rate
• Periodic misinterpretation of scene
• Periodic violation of computing time limits
• Aliasing frequency is taken for real
Example: camera head rotates See FOV spatial periodic
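The temporal-aliasing hazard in the row above is the Nyquist condition: a periodic orientation change is sampled faithfully only if its frequency stays below half the frame rate. A minimal sketch; all rates are illustrative assumptions:

```python
def is_temporally_aliased(motion_hz, frame_rate_hz):
    """True if a periodic motion exceeds the Nyquist limit of the sampling rate."""
    return motion_hz > frame_rate_hz / 2.0

# A 5 Hz camera-head oscillation is safely sampled at 30 fps,
# while a 20 Hz oscillation aliases (and may be "taken for real"):
assert not is_temporally_aliased(motion_hz=5.0, frame_rate_hz=30.0)
assert is_temporally_aliased(motion_hz=20.0, frame_rate_hz=30.0)
```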
Temporal aperiodic
VOrient. VOrient changes in a temporally aperiodic manner
• Motion blur
• Captured scene cut-out changes in an irregular manner
• Uncertainty of position over time is increased
• Occasional misinterpretation of scene
• Occasional violation of computing time limits
Close (spat.) VOrient. n/a
Remote (spat.) VOrient. n/a
Before (temp.) VOrient. VOrient changes earlier than expected
• ROI lost earlier than expected
• Following an object is difficult
• Relevant aspects in the scene are no longer captured
• Moving object might not be detected
After (temp.) VOrient. VOrient changes later than expected
• Changing ROI lost (temporally)
• Following an object is difficult
• See Before
In front of (spat.)
VOrient. n/a
Behind (spat.) VOrient. n/a
Faster VOrient. VOrient changes faster than expected
• Observer rotation blur
• Aliasing if periodic motion is too fast for the given frame rate
• Blur wrongly interpreted as high speed of the Observer
• See Temporal periodic
Slower VOrient. VOrient changes slower than expected
• See After • See After
No VPos. Observer position unknown
• Uncertainties in scene localisation
• Misinterpretation of scene and contained objects
More VPos. See Remote
Less VPos. See Close
As well as VPos. Multiple observers at different positions are used together
• Several observer positions increase computational complexity
• Several observer positions increase the potential for making mistakes in fusing observer data
• Optical (lens) effects can be different in corresponding images
• Image mismatch causes wrong scene interpretation
• Non-overlapping parts cannot be used, or are not interpreted correctly
• Different optical effects hamper registration/matching
Example: stereo-vision
Part of VPos. VPos is Part of scene (within scene)
• Sensor too close to scene – scene partially defocused
• Defocused objects not correctly recognized
Reverse VPos. See Other than
Other than VPos. VPos not at expected position with respect to scene
• Scene appears different from expected (different perspective on scene than expected)
• Scene (partially) misinterpreted
Example: scene seen from above instead of frontally
Other than VPos. Observer’s camera center is at a different position than expected
• Offset between the camera center and the center of mechanical camera rotation differs from the expectation (which can be 0)
• The placement of the observer’s housing is different than expected
• Wrong expectation of the camera housing’s position can lead to collisions
E.g.: Systems that expect the camera center to be at the observer’s center of rotation (as in simplified panorama stitching) will fail
Other than VPos. Principal point position is different than expected
• Different representations of origins are confused
• Miscalculations in positions
• Shifts in position calculation
Where else VPos. See Other than
Spatial periodic
VPos. Observer position can only be set to discrete values
• Preferred best position (to optimally capture the object(s) of interest) cannot be achieved
• Misdetections
e.g. special position for observer on system (normally stays fixed to remainder of the system); e.g. fixed observer, or observer limited to a rail. Note: To achieve the desired preferred position, it might be possible to move the whole SUT into the correct position
Spatial aperiodic
VPos. Observer position is not constrained (perhaps within a given range)
• Additional uncertainties due to arbitrary position of observer
• Additional uncertainties introduce further uncertainties for the position estimation of objects
e.g. the observer/sensor can be moved arbitrarily, or the observer can fly
Temporal periodic
VPos. See VOrient
Temporal aperiodic
VPos. See VOrient
Close (spat.) VPos. VPos is closer to scene than expected – object distance shorter
• Object distance is shorter than expected (out of focus)
• Objects have more details than expected
• View contains fewer objects than expected
• View includes only part of the object instead of whole objects
• Objects too close to Observer to capture their significance
• Too close objects confuse scene interpretation
• See Less Focus • See Object Pos. Close
Remote (spat.) VPos. VPos is more remote from scene than expected
• Object distance is bigger than expected (out of focus)
• Objects have less detail than expected
• View contains more objects than expected
• View includes whole objects instead of viewing sections/parts of the object
• Relevant scene elements (objects) too remote from Observer
• Relevant scene details not recognized
• Object distances estimated less accurately
Before (temp.) VPos. See VOrient
After (temp.) VPos. See VOrient
In front of (spat.) VPos. See Close
Behind (spat.) VPos. Observer is at a position where the expected view is blocked
• See Objects Occl. • See Objects Occl.
Faster VPos. Observer moves faster than expected
• Motion blur more likely with longer exposures
• Less time for observing scene
• Internal representation lags behind observed scene
• Blurred objects misdetected
• Scene wrongly interpreted
Slower VPos. Observer moves slower than expected
• Internal representation runs ahead of observed scene
• Scene wrongly interpreted
No Transp. The sensor optics is not transparent at all
• No light is transmitted • Confusion with “complete darkness”
• Sensor does not detect that it is blind but reports exaggerated/extrapolated data
e.g. lens cap still on lens, malfunction of shutter/aperture
More Transp. The sensor optics is more transparent than expected
• Overall scene intensity as received by the electronics is higher than expected
• Light with unexpected (unwanted) frequencies is received by Obs. electronics
• Colours or relative intensities (between objects) as received differ from expected ones
• Overexposure, at least of specific objects
• Misinterpretation of whole scene or at least some objects
Examples: unexpected wavelengths like near IR transmitted
Less Transp. The sensor optics is less transparent than expected
• Only part of expected light is received by Obs. electronics
• Underexposure • Misinterpretation of whole scene or at least of some objects
Possible reason: dirt or bugs on lenses, cataract
As well as Transp. The sensor optics is both more and less transparent at the same time
• Combination of More and Less
• Parts of scene are misinterpreted
E.g.: During cleaning a lens
Part of Transp. Part of optics are less transparent than expected (e.g. dirt on lens)
• Defocused areas • Dirt creates virtual pattern on sensor
• Misdetections E.g.: Thick dust irregularly distributed on lens surfaces
Part of Transp. Observer blocks part of the image
• Parts of the image are black
• Objects may be (partially) invisible (in black image area)
• (Partially) blocked objects are not detected
• Information within the blocked FOV cannot be interpreted
• See Part of FOV
E.g.: Lens body/lens hood is too long and its corners thus block the view
Reverse Transp. n/a
Other than Transp. The transparency of sensor optics is completely different from expected, e.g. due to broken lenses
• The scene looks completely different than expected, e.g. parts of it are multiplied
• Strong confusion of CV alg. if fault not detected
Where else Transp. Lens body is not completely light proof, light can reach sensor from the side of the body
• Flare effects • See More
• See More • Additional light effects misinterpreted as scene components
E.g.: gap between lens body screw and sensor assembly
Spatial periodic
Transp. The transparency of the lenses changes in a spatially periodic manner
• Parts of the FOV are blocked or reduced in intensity
• Strong confusion of CV alg.
E.g.: A dirty Fresnel-lens
Spatial aperiodic
Transp. The transparency of the lenses changes in a spatially irregular manner
• The intensity of the image is irregularly reduced
• CV alg. is hampered
• In particular, dense stereo vision is significantly hampered, since pollution is very likely different for both cameras
E.g.: Pollution of lens surface
Temporal periodic
Transp. Transparency changes in a temporally periodic manner
• See Less • If not learned and compensated, see Less
E.g.: An outdoor mounted camera with morning dew every day
Temporal aperiodic
Transp. Transparency changes in a temporally aperiodic manner
• See Less E.g.: As a function of temperature or moisture
Close (spat.) Transp. Last lens in assembly has different transparency than expected
• See Other Than/Less/More
• See Other Than/Less/More
Remote (spat.) Transp. First lens in assembly has different transparency than the rest
• The lens system either magnifies local transparency changes or focuses them
• For global effects: See More, Less
• See Less, Part of E.g.: The image of dirt on the first lens is magnified or focused
Before (temp.) Transp. Shutter opens or closes before this is expected
• Exposure time differs from expected
• Photoelectric events are exposed to light out of schedule
• Over/under-exposure, see Less Exposure
• Rolling-shutter effect can occur if the image sensor is reading data while the shutter is open, see Slower Exposure
• Not synchronized with flash trigger, see Before L.s Intensity
• See Less/More/Faster Exposure
• Rolling shutter causes artefacts which are misinterpreted as object properties
A mechanical shutter is opening or closing earlier than expected
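The rolling-shutter artefacts mentioned in the row above can be quantified with a first-order model: rows are read out sequentially over the readout time, so an object moving horizontally in the image is sheared by roughly its image-plane speed times the readout time. A hedged sketch; the speed and readout time are illustrative assumptions:

```python
def rolling_shutter_skew_px(object_speed_px_per_s, readout_time_s):
    """Approximate horizontal shear (pixels) between the first and last
    sensor row for an object moving at constant image-plane speed."""
    return object_speed_px_per_s * readout_time_s

# An object crossing the image at 1000 px/s with a 20 ms readout
# appears sheared by about 20 px top-to-bottom:
skew = rolling_shutter_skew_px(1000.0, 0.02)
assert abs(skew - 20.0) < 1e-9
```

Such a shear, if not attributed to the shutter, is exactly the kind of artefact that can be misinterpreted as an object property.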
After (temp.) Transp. Shutter opens or closes after this is expected
• See Before Transp.
In front of (spat.)
Transp. Something is directly in front of the lens (e.g. dirt)
• See Less, Part of • See Less, Part of
Behind (spat.) Transp. Something is trapped behind the lens (e.g. dirt)
• See Less, Part of • See Less, Part of
Faster Transp. n/a
Slower Transp. n/a
No Spectrum No colour is transmitted (monochromatic), because only a very small fraction of spectrum is passed
• Objects not emitting (reflecting etc.) light in this range are “black”
• Underexposure (less energy transmitted)
• Most objects not (correctly) recognized
• Objects are not distinguishable any more
• Confusion of shadows with ‘black’ objects
More Spectrum Lenses add intensity at certain frequencies
• Some incoming colours are more adulterated than others
• Object classification is hampered (confusion of similar objects or object parts)
• Textures not distinguished or confused
E.g.: Inter-lens reflections
More Spectrum Lens transmits more of the incoming spectrum than expected
• Unwanted spectral components of light hit the sensor (e.g. IR or UV)
• Reduction of received light to distinguish e.g. different light sources cannot be used
• Effects caused by unwanted spectral components occur (e.g. noise or exceeding the dynamic range of the sensor)
• Intended selection of view-specific scene components fails
E.g.: Infrared cut-off filter is not mounted, and CCD or CMOS sensors are very sensitive to IR
Less Spectrum Lenses are opaque to many frequencies in visual range
• Reduces contrast • Different colours than expected
• Certain image features are lost
• Misinterpretation • Misdetection
As well as Spectrum See More and Less • See More and Less • See More and Less
Part of Spectrum See Less • See More and Less • See More and Less
Reverse Spectrum A range of spectrum that is expected to be blocked is passing, while the desired range is blocked
• See Meaning • Photo elements are disturbed by the passing spectrum
E.g.: An IR cut-off filter should prevent the sensor from being exposed to IR light, but a high pass filter is mounted
Other than Spectrum Lenses absorb certain frequencies and re-emit them at other wavelengths
• Most (or all) incoming colours are changed
• See More e.g. UV turned into visible light
Where else Spectrum Different parts of spectrum are transmitted to different locations
• Chromatic aberration: coloured edges
• Washed-out/defocused edges
• Edge detection and segmentation is hampered
• Stereo imaging: matching precision decreased
Chrom. aberr.: Light is refracted, and the different colours have different refractive indices (dispersion). This results in chromatic aberration and thus coloured edges
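Lateral chromatic aberration, as discussed in the Note above, can be modelled to first order as a per-channel radial magnification: a point imaged at radius r in the green channel appears at r·k_c in channel c, so colour fringes widen towards the image border. A hedged sketch; the magnification coefficients are illustrative assumptions, not measured lens data:

```python
def fringe_width_px(radius_px, k_red=1.0005, k_blue=0.9995):
    """Red/blue edge separation (pixels) at a given image radius,
    for a first-order lateral chromatic aberration model."""
    return radius_px * (k_red - k_blue)

# Fringes are negligible near the optical axis and grow linearly
# towards the border, which is why edge-based matching degrades there:
assert fringe_width_px(100.0) < fringe_width_px(1000.0)
```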
Spatial periodic
Spectrum Analogous to Transp. • Spectral errors are coupled to compound lenses
• Combination of “Where else” and “Transp.”
E.g.: Aberration in compound lenses
Spatial aperiodic
Spectrum The spectral effects of the lens change in a spatially aperiodic manner
• Diffuse coloured edges occur around objects (above a certain contrast level)
• See Where else
Example: manufacturing errors such as bubbles in lenses close to the focal plane
Temporal periodic
Spectrum n/a
Temporal aperiodic
Spectrum n/a
Close (spat.) Spectrum Objects close to each other (adjacent) get their spectra merged
• See Where else • See Where else
Remote (spat.) Spectrum n/a
Before (temp.) Spectrum n/a
After (temp.) Spectrum n/a
In front of (spat.) Spectrum See Where else
Behind (spat.) Spectrum See Where else
Faster Spectrum Observer is moved sufficiently fast to cause a Doppler effect
• Colours of fast moving objects are changed
• Misinterpretation of fast moving objects
Slower Spectrum n/a
No LNum. No lenses are used • Defocusing • See Less Focus
e.g. Lenses fall out of lens body or lens body is missing
More LNum. More lenses are in lens assembly than expected
• More reflections between lenses
• Lens reflections appear in image
• Transmitted frequencies are filtered more – stronger filtering, more dimming, less contrast
• Lens reflections are misinterpreted as textures or objects
• Underexposure
• See also Less Transp.
Less LNum. Fewer lenses are in the lens assembly than expected
• More uncorrected chromatic lens errors, i.e. aberration: halos, glares
• Coma: small bright objects show a coma-like effect towards the image border
• Astigmatism (occurs for each lens): rays from a point off the optical axis are less focused on the image plane; this is usually reduced in multi-lens arrangements
• More light intensity gets through the whole lens assembly (more intensity at image)
• More contrast than expected
• More saturation than expected
• Spherical aberration: object and texture recognition is hampered
• Coma, halo, glare: misinterpreted as objects or texture
• Astigmatism: object’s shape wrongly computed, or object’s distance wrongly computed
• Overexposure • See also More Transp. • See Less Focus, Other than FOV
As well as LNum. Multiple different lens assemblies are used
• Different lens effects than expected
• Registration and matching of images hampered, e.g. in stereo vision
Part of LNum. Part of the lens assembly is missing
• See Where else Trans • See Where else Trans
Reverse LNum. Lens body is used in reversed order
• Projected real/virtual image size very different from expected
• Object sizes different from expected
• Failure to detect misconfiguration leads to wrong interpretations in size or distance
• See Less Focus, Other than FOV
Other than LNum. Different number of lenses than expected are used
• Changed amount of optical aberration
• Different focal length than expected
• See Less Focus, Other than FOV
Where else LNum. Lenses are arranged differently than expected
• Changed amount of optical aberration
• Different focal length than expected
• See Less Focus, Other than FOV
Spatial periodic LNum. n/a
Spatial aperiodic LNum. n/a
Temporal periodic LNum. n/a
Temporal aperiodic LNum. n/a
Close (spat.) LNum. Lenses in the assembly are too close together
• Optical aberrations are not corrected as expected
• Increased aberration misinterpreted as virtual pattern
Remote (spat.) LNum. Lenses in the assembly are too far apart
• Too many total reflections between lenses
• Resulting optical effects misinterpreted as objects or virtual patterns
Before (temp.) LNum. n/a After (temp.) LNum. n/a In front of (spat.)
LNum. n/a
Behind (spat.) LNum. n/a Faster LNum. n/a Slower LNum. n/a
No LGeom. Lenses are flat (simple window)
• Lenses cannot be used to focus
• Lenses cannot be used to reduce optical aberration
• See No Focus
More LGeom. More optical effects due to strongly curved lens surfaces
• Distortion: barrel or pincushion
• Vignetting: image darkening towards the edges
• Focus range is limited in distance (near-sighted)
• Distortion: scene geometry misinterpreted
• Vignetting: increased underexposure towards the edges
• Misinterpretation of colours
• Misinterpretation of shadows
• Remote objects poorly recognized
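The barrel/pincushion distortion named above can be sketched with the standard one-coefficient radial distortion model. This is a minimal sketch, not part of the study; the function name and the single-coefficient simplification are ours (real calibrations use further terms k2, k3 and tangential p1, p2):

```python
# One-coefficient radial distortion (Brown-Conrady style):
# k1 < 0 gives barrel distortion, k1 > 0 gives pincushion distortion.
def distort(x: float, y: float, k1: float) -> tuple:
    """Map an ideal normalized image point to its distorted position."""
    r2 = x * x + y * y                 # squared distance from the centre
    scale = 1.0 + k1 * r2              # radial scaling factor
    return x * scale, y * scale

# Barrel distortion pulls off-centre points towards the image centre,
# while the principal point (0, 0) stays fixed:
xd, yd = distort(0.5, 0.5, k1=-0.2)
```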
Less LGeom. Less optical corrections due to weakly curved lens surfaces
• Focus range is limited in distance (long-sighted)
• Close objects defocused – poorly recognized
As well as LGeom. Lens surface is more weakly curved on one side and more strongly curved on the other side than expected
• See More/Less • See More/Less
Part of LGeom. Parts of the lens have other geometry than expected
• Non-uniform deformation of image
• Distances and object sizes can be miscalculated
• See Less Focus
Example: thermal effects cause non-uniform distortion of optics
Part of LGeom. Part of one lens is missing or lens is shattered
• Cracks/edges of lens pieces introduce complex patterns and interreflections
• Visible cracks/edges block/defocus parts of the scene
• Objects in blocked/defocused areas of the image are not correctly detected
• Cracks/edges are confused with objects
Reverse LGeom. Lens optics are reverse to expected
• Lens is zooming out instead of in and vice versa
• Wrong distances to objects are estimated
Other than LGeom. Lenses have a different geometry than expected
• See More/Less • See More/Less A wide-angle lens is used instead of a telephoto lens and vice versa
Where else LGeom. n/a Spatial periodic
LGeom. Lens geometry has a periodic pattern
• Lens structure introduces a virtual pattern of distortions
• Virtual pattern (kaleidoscopic) is confused with object pattern
e.g. compound eye lens
Spatial aperiodic
LGeom. Spatial aperiodic disturbance or imperfections of lens geometry
• Bright rays virtually emanate from bright objects within scene
• Imperfections create zones of increased defocus and aberrations
• Imperfections create virtual patterns
• Objects in defect zones of image are not detected correctly
• Defects are confused with objects; e.g. rays are misinterpreted as edges or complete objects
• Locations (disparities) are corrupted
e.g. scratches or rain drops in front of the lens
Temporal periodic
LGeom. n/a
Temporal aperiodic
LGeom. LGeom changes over time due to environmental conditions or aging
• See Other Than FOV
• Compensations due to calibration do not fit the actual observer's behaviour any more
• See Other Than FOV
• Overcompensation of distortion and wrong focal length (See Other Than Calibration)
E.g.: Temperature has an impact on the lens system, changes
Close (spat.) LGeom. n/a Remote (spat.) LGeom. n/a Before (temp.) LGeom. n/a
After (temp.) LGeom. Lens geometry changes after calibration has been performed
• See Temporal aperiodic • See Other Than Calibration
• See Temporal aperiodic • See Other Than Calibration
• See Temporal aperiodic
In front of (spat.)
LGeom. n/a
Behind (spat.) LGeom. Lens has imperfections inside its medium (“Behind surface”)
• See Spatial aperiodic / Part of
• See Spatial aperiodic / Part of
e.g. air bubbles
Faster LGeom. Lens geometry changes faster (between calibration intervals) than expected
• See Temporal aperiodic • See Other Than Calibration
• See Temporal aperiodic • See Other Than Calibration
• See Temporal aperiodic
Slower LGeom. n/a
No Focus No part of scene is within DOF
• Highly defocused image, but different distances within scene show different degrees of defocusing
• Image recognition severely hampered
No Focus Lenses are not shaped to allow focusing
• See above • See FOV No 2
• See above
More Focus DoF is larger than expected
• Larger depth of field than expected, e.g. background as sharp as foreground
• Contrast is increased
• More objects are clearly visible than expected
• CV alg. exploiting difference between sharply edged and blurred segments are hampered
• Separation of objects from background hampered
• Computing time increased
More Focus DoF is larger than expected
• Reduces accuracy in shape from focus
• 3D reconstruction is smeared or entirely corrupted
Less Focus DoF is smaller than expected
• Smaller depth of field than expected
• Essential scene parts are out of focus
• Blurry image • Edges are smoothed • Contrast is reduced
• Objects cannot be completely recognised
• Blurred image areas misinterpreted as being empty or “medium only”
• Detection of edges and their correct position deteriorated
As well as Focus Focusing on multiple different distances (special optics or plenoptic camera)
• Multiple different distances are sharp, with defocused zones in between
• Multiple focal planes confuse CV alg.
E.g.: Front and rear side of an object is sharp, and the unsharp area in between causes CV alg. to interpret the object as two separate objects
Part of Focus Depth of field is (strongly) limited
• Parts of scene are out of focus
• See Less. Note: Standard optics and sensors only allow setting one focal distance, which means that part of the scene is always out of focus; bigger objects and multiple objects will always be partially out of focus
Other than Focus Focal distance is at another distance than expected
• A different distance than expected is sharply mapped – unwanted objects are sharply mapped
• Wrong objects recognised
• Shape from focus => wrong distance estimation
Example: snowflakes in foreground are pictured sharply, while objects of interest are blurred
Where else Focus See Other than
Spatial periodic Focus See Spatial aperiodic
Spatial aperiodic
Focus The optics focus parts of the image differently than others
• Image always partially out of focus
• See Less • See Other than
• See Less • See Other than
Example: tilt lens, shift lens
Temporal periodic
Focus Focal distance changes periodically with time
• Some or all effects from No to Other than appear periodically
• The plane in focus changes over time (intended for shape from focus)
• Some or all hazards from No to Other than occur periodically
Example: out-of-control focus drive turns periodically through the focal range. Attention: this is intended in shape from focus
Temporal aperiodic
Focus Focal distance changes aperiodically with time
• Some or all effects from No to Other than appear in unpredictable order
• Some or all hazards from No to Other than occur aperiodically
Example: out-of-order focus drive turns through the focal range at unpredictable times
Close (spat.) Focus Depth of field is nearer than expected
• “Wrong” objects in focal range
• “Wrong” objects recognized
Example: small insect or spider in front of optics forces autofocus to turn on it
Remote (spat.) Focus Depth of field is more remote than expected
• “Wrong” objects in focal range
• “Wrong” objects recognized
Example: small object upfront should be recognized, but background is in focus
Before (temp.) Focus The focal distance is at a certain value before expected
• “Wrong” objects in focus • In shape from focus a wrong distance is assumed
E.g.: Shape from focus: The focus stack (a number of images taken at different focal distance) is captured with a focal distance offset
Before (temp.) Focus Focusing is changed in advance of the object’s movement
• See Less • See Less Predicted velocity is too high
After (temp.) Focus The focal distance is at a certain value after expected
• See Before • See Before
After (temp.) Focus Focusing is changed too late after the object’s movement
• See Less • See Less No/Wrong prediction of movement
In front of (spat.) Focus See Close
Behind (spat.) Focus See Remote
Faster Focus Focus adapts faster than expected
• Wrong DoF in slow changing scene
• “Wrong” or no objects recognized
Slower Focus Focus adapts slower than expected
• Wrong DoF in fast changing scene
• “Wrong” or no objects recognized
No Apert. No aperture is used
• Very large circles of confusion
• Very small DOF • See Less Focus
• See Less Focus
More Apert. Aperture is more open than expected
• Smaller DOF than expected
• Focusing has stronger influence on highlighting objects
• More light hitting electronics than necessary
• See Less Focus • Overexposure
Less Apert. Aperture is less open than expected
• Larger DOF than expected
• Focusing has a weaker influence on highlighting objects
• Less light hitting electronics than necessary
• See More Focus • Underexposure
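The aperture/DOF relationship described in the More/Less Apert. rows can be quantified with the standard thin-lens approximation. This is a minimal sketch under assumed, illustrative parameter values (the function name and the 0.03 mm circle of confusion are ours):

```python
# Approximate depth of field for a thin lens: a smaller aperture
# (larger f-number N) enlarges the DOF, as the rows above describe.
def depth_of_field(f_mm: float, N: float, s_mm: float, c_mm: float = 0.03) -> float:
    """DOF (mm) for focal length f, f-number N, subject distance s,
    circle of confusion c. Valid when s is well within the hyperfocal
    distance H and well beyond the focal length."""
    H = f_mm * f_mm / (N * c_mm) + f_mm          # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm)
    return far - near

# Stopping a 50 mm lens down from f/2 to f/8 at a 2 m subject distance
# widens the depth of field considerably:
dof_wide_open = depth_of_field(50.0, 2.0, 2000.0)
dof_stopped_down = depth_of_field(50.0, 8.0, 2000.0)
```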
As well as Apert. Aperture changes opening size multiple times during one exposure
• Aperture-specific aberrations integrated → broader but weaker aberrations
• See Less Focus
• See Less Focus
Part of Apert. Aperture is only partially opened/closed
• See More/Less • Aperture is non-uniform
• Entrance and exit pupil degenerate to complex forms, as does the shape of a lens flare
• Increase of optical aberrations (including chromatic aberration)
• See Less Focus • Asymmetric DoF
• Lens flare specific forms cannot be detected or compensated
Parts of aperture blades are stuck, therefore the shape of the entrance pupil is no longer a regular polygon. See Where else Apert.
Reverse Apert. n/a
Other than Apert. Aperture opening is different from expected
• DOF different from expected • See More/Less • Wrong image brightness
• See More/Less • Over- or underexposure
Where else Apert. Aperture is placed at a different location in the lens assembly than expected
• Changed optical aberration amount
• Different DOF than expected
• See Less Focus
Example: Lens assembly does not fit perfectly into mounting
Where else Apert. Aperture form is projected into different places within the image
• Visible highlight artifacts blend/overexpose sections within the image
• Chromatic aberration in shape of aperture (See More Colour)
• Contrast of objects or parts of objects is reduced
• Objects not detected
• Aperture projection is mistaken for an object
Flare effects: each lens in the assembly creates an aperture-sized highlight on the image plane
Spatial periodic
Apert. n/a
Spatial aperiodic
Apert. n/a
Temporal periodic
Apert. Aperture changes periodically
• Periodic change of image brightness
• See Other than
• Periodic change between over-, correct, and under-exposure
• See Other than
Example: out-of-control device changes aperture periodically
Temporal aperiodic
Apert. Aperture changes aperiodically
• Aperiodic change of image brightness
• See Other than
• Aperiodic change between over-, correct, and under-exposure
• See Other than
Close (spat.) Apert. Aperture is closer to observer electronics than expected
• See Other than • See Other than
Remote (spat.) Apert. Aperture is more remote from observer electronics than expected
• See Other than • See Other than
Before (temp.) Apert. Aperture changes earlier than expected
• Overexposure • See Consequences. Since aperture and exposure are correlated, see Exposure
After (temp.) Apert. Aperture changes later than expected
• Underexposure • See Consequences. Since aperture and exposure are correlated, see Exposure
In front of (spat.)
Apert. n/a
Behind (spat.) Apert. n/a
Faster Apert. Aperture changes faster than expected • See Other than • See Other than
Slower Apert. Aperture changes slower than expected • See Other than • See Other than
No Wave Wave property is not changed by the observer
• See Less • See Less
More Wave Wave property is changed more strongly by the observer than expected
• Light intensity is reduced by filtering
• Light leaving lenses is strongly polarised
• Sensor electronics receive and capture only a fraction of incoming light – underexposure, modified colours
Less Wave Wave property is changed more weakly by the observer than expected
• Strength of filtering effects of different l.s. based on their wave property is reduced
• Light leaving lenses is less polarised than expected
• Effects of different l.s. cannot be differentiated
• Effects of different l.s. are confused
• Recording electronics overcompensate expected polarisation effects
Example: polarised light used for distinguishing lighting effects from different l.s.
As well as Wave n/a
Part of Wave (Only) part of transmitted light is polarised
• Different frequency ranges of light leaving lenses are differently polarised
• Sensor electronics receive and capture different frequencies of incoming light with different efficiency – modified colours
Reverse Wave Polarisation of transmitted light is reversed to expected
• See Other than • See Other than Example: horizontally instead of vertically polarized
Other than Wave Polarisation of transmitted light is other than expected
• See Meaning • Poor compensation or even exaggeration of lens effects by sensor electronics
Where else Wave See Other than Spatial periodic
Wave Polarisation changes spatially periodically over optics area, e.g. concentric
• Different polarisation effects at different image areas
• Like Other than, but in a spatially periodic manner, which causes overall scene misinterpretation; e.g. strengthening of vignetting
Spatial aperiodic
Wave Polarisation changes spatially aperiodically over optics area
• See Spatial periodic • Like Other than, but in a spatially irregular manner
Example: filter defect
Temporal periodic
Wave n/a
Temporal aperiodic
Wave Polarisation changes with operating temperature and/or with age
• Over time, polarisation properties change gradually and are hardly predictable
• In extreme cases, small influence on image quality
It is assumed that the effect is only small
Close (spat.) Wave n/a Remote (spat.) Wave n/a Before (temp.) Wave n/a After (temp.) Wave n/a In front of (spat.)
Wave n/a
Behind (spat.) Wave n/a Faster Wave n/a Slower Wave n/a
No PSF The PSF of the optical system is unknown
• The image blur cannot be (partially) compensated
• The boundaries of objects in a blurred image are fuzzy
• The features of objects in a blurred image are fuzzy
• Errors of calculated object positions and extents have an unknown variance
• Misdetection of objects
Blur reduction can be done by a deconvolution with the PSF (e.g. Wiener deconvolution)
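The Wiener deconvolution mentioned in the note can be sketched in one dimension. This is a dependency-free illustration (the naive O(n²) DFT and the noise-to-signal constant K are our own simplifications; real code would use numpy/scipy FFTs and an estimated noise spectrum):

```python
# Minimal 1-D Wiener deconvolution: when the PSF is known, blur can be
# (partially) undone in the frequency domain.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * i * k / n) for k in range(n))
            for i in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[i] * cmath.exp(2j * cmath.pi * i * k / n) for i in range(n)) / n
            for k in range(n)]

def wiener_deconvolve(blurred, psf, K=1e-3):
    """Estimate the original signal from a circularly blurred one.
    Wiener filter: conj(H) / (|H|^2 + K), applied per frequency bin."""
    B, H = dft(blurred), dft(psf)
    X = [b * h.conjugate() / (abs(h) ** 2 + K) for b, h in zip(B, H)]
    return [v.real for v in idft(X)]

# Blur a spike with a centred 3-tap box PSF (circular), then recover it:
signal = [0, 0, 1, 0, 0, 0, 0, 0]
psf = [1 / 3, 1 / 3, 0, 0, 0, 0, 0, 1 / 3]
blurred = [sum(signal[(k - j) % 8] * psf[j] for j in range(8)) for k in range(8)]
restored = wiener_deconvolve(blurred, psf)
```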
No PSF The PSF of the optical system is everywhere zero
• The optics prevent light from reaching the sensor
• See No Transp, Spectral Efficiency, No Exposure …
• See No Transp., …
No PSF The PSF has only constant components (the MTF is a single spike at frequency 0)
• The whole sensor works like a single integrating pixel
• See No Resolution
E.g.: The system is severely out of focus
No PSF No optical blurring before discretisation
• Aliasing artefacts
• Staircasing of edges and lines
• Moiré patterns in intensity and colour of repetitive textures (e.g. stripes on clothing, brick walls)
• Large errors in location and orientation estimates
• Apparent texture differs from true texture
• Unpredictable differences between appearance of corresponding points/regions in consecutive frames
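The aliasing hazard above can be demonstrated numerically: sampling a fine stripe pattern without optical pre-blur makes it indistinguishable from a much coarser pattern. The stripe frequencies below are illustrative assumptions of ours:

```python
# Sampling a 9-stripes-per-unit cosine pattern at only 8 samples per
# unit (Nyquist limit: 4.5 stripes) aliases it onto a 1-stripe pattern.
import math

def scene(x: float, stripes_per_unit: float = 9.0) -> float:
    """Continuous stripe pattern as seen before discretisation."""
    return math.cos(2 * math.pi * stripes_per_unit * x)

# The 8 samples of the fine pattern coincide with samples of a coarse
# 1-stripe-per-unit pattern - the true texture is unrecoverable:
samples = [scene(k / 8) for k in range(8)]
alias = [math.cos(2 * math.pi * 1.0 * (k / 8)) for k in range(8)]
```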
More PSF The PSF’s extent is larger (affecting a bigger neighbourhood of pixels) than expected
• The amount of blurriness is larger than expected
• Loss of detail • Loss of contrast
• See Less Focus • Errors of calculated object positions and extents have a larger variance than assumed
• Loss of small objects • Object shape distortions (e.g. rounded corners)
Less PSF The PSF’s extent is smaller (affecting a smaller neighbourhood of pixels) than expected
• The amount of blurriness is smaller than expected
• See More Focus • Any filtering assuming an e.g. Gaussian correlation between neighbouring pixels is hampered
As well as PSF Multiple PSFs are combined
• Combination of multiple fuzziness
• The different PSF influences cannot be differentiated and are harder to compensate
• See More
As well as PSF The PSF has multiple maxima
• A shifted version of the image is blended with the original
• Any positions an alg. has to find are ambiguous
E.g.: double image seen through thick glass
Part of PSF The PSF is nearly circular except for some section
• Anisotropic blurriness • Blur pattern might introduce fake virtual pattern in image that creates fake detections or interpretations
Reverse PSF The actual PSF is near to the inverted form of the expected PSF
• The effect of the PSF differs from the expected one
• PSF correction has the opposite effect, increasing effects as under More or Other than
E.g.: inverse Fresnel lens
Other than PSF The PSF is different from the expected
• Blur compensation is ineffective or even overcompensating (creates more distortions)
• Correction of PSF creates additional pattern in the resulting image, thus creating false detections and misinterpretations
Where else PSF The PSF form is shifted from the origin
• The image is shifted • Any locations have an unexpected displacement
Spatial periodic
PSF There are periodic patterns visible in the PSF; i.e. the PSF is spatially periodic
• Additional small scale blurriness creating a spatial pattern
• Contours of objects are duplicated and create possibility for confusions
Inter-lens reflections; the usual case for diffraction effects, e.g. at the aperture
Spatial periodic
PSF PSF changes regularly over the sensitive area (e.g. radially)
• Information quality of the sensitive area changes regularly; e.g. reduces from centre outwards
• Objects at image border are more likely to be misdetected
E.g. vignetting
Spatial aperiodic
PSF The PSF is not as periodic as expected
• Image quality is greatly reduced due to irregular blurring pattern and speckles (high noise)
• Misdetections
Temporal periodic
PSF The PSF fluctuates periodically over time
• Image quality/blurriness fluctuates regularly over time
• Performance is not stable, results fade in/out all the time; but can potentially be compensated
e.g. due to vibrations
Temporal aperiodic
PSF The PSF is not as periodic over time as expected
• The modelled PSF is changed periodically for de-convolution whereas the actual PSF is not => See Other Than
• See Other Than
E.g.: In shape from focus
Temporal aperiodic
PSF The PSF fluctuates chaotically over time
• Blur compensation cannot keep up with chaotically changing blurriness
• Image interpretation compromised due to irreversible blurriness
Close (spat.) PSF The PSF is unexpected for close object distances
• Image is more / less blurry than expected for objects at short distance
• Over- and under-compensation of PSF during de-convolution
• Inaccuracy in computation results
• Up to compromised interpretation due to irreversible blurriness
The PSF varies most rapidly for objects close to the observer [Shih+2012]
Remote (spat.) PSF The PSF is different for objects that are far away
• Image is more / less blurry than expected for objects with far distance
• Different PSF patterns may overlap in image regions where different object distances overlap
• See Close Due to its dependency on focal distance
Before (temp.) PSF n/a After (temp.) PSF n/a In front of (spat.)
PSF n/a
Behind (spat.) PSF n/a
Faster PSF PSF is changing faster than expected
• See Other Than • See Other Than
Slower PSF PSF is changing slower than expected
• See Other Than • See Other Than
2.7 Observer - Electr(on)ics
2.7.1 Parameters
Table 14: Observer - Electronics Parameters
Parameters | Values / Range | Meaning
Exposure and shutter | seconds | Exposure duration/starting time, and shutter effects
Resolution (spatial) | m * n | Pixel number, aspect ratio, pixel size/ratio (spatial resolution/quantisation)
Spectral efficiency | Frequency distribution | Colour response curves
Quality | Signal to Noise Ratio (SNR) | Quality of electrics/electronics; perfect would mean no deviation introduced due to electric components
Quantization/Sampling | | Reduction of continuous time/amplitude into the discrete domain

It shall be noted that exposure corresponds to the time the image sensor is exposed to light. This can be realised in two ways, by a mechanical or an electronic shutter. The mechanical shutter is placed either in the lens system (iris shutter), which would make it part of the Observer – Optics HAZOP, or at the focal plane. The electronic shutter is clearly within this HAZOP. Since the effects on information, mainly the influence of exposure on the SNR and the rise in intensity, fit better within this category, exposure is treated at this HAZOP location. Also, the effects on information of the electronic shutter and the focal-plane shutter are similar, while the iris shutter is closest to optimal and its aberrations are the smallest. It shall also be noted that the exposure time is not necessarily similar to the inverse frame rate of a video camera. The frame rate is the sampling frequency for temporal Quantisation/Sampling, while the exposure time is a time interval over which the observed light is integrated.
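The distinction between exposure time and frame interval made above can be illustrated numerically. The sketch below is our own (function name and all parameter values are illustrative): motion blur depends on the exposure time, while temporal sampling depends on the frame interval.

```python
# Motion blur grows with exposure time, independently of the frame rate.
def motion_blur_px(speed_px_per_s: float, exposure_s: float) -> float:
    """Streak length (pixels) of an object moving across the sensor."""
    return speed_px_per_s * exposure_s

frame_interval_s = 1 / 30    # 30 fps: temporal sampling period
exposure_s = 1 / 250         # much shorter than the frame interval

# An object moving at 1000 px/s leaves only a short streak per frame,
# far shorter than it would if the sensor integrated the whole interval:
blur = motion_blur_px(1000.0, exposure_s)
blur_if_full_interval = motion_blur_px(1000.0, frame_interval_s)
```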
2.7.2 HAZOP
Table 15: Observer - Electronics HAZOP
Guideword Par. Meaning Consequences Hazards Note
No Expos. Exposure is not triggered • No image is taken • Algorithm waits indefinitely on image from sensor, which never arrives
More Expos. Longer exposure time than expected
• More light captured per image than expected
• Latency: During exposure the current image data is not accessible
• Overexposure
• Motion blur if movement is too fast
• Latency is a critical factor in every closed-loop control (oscillations, losing)
Less Expos. Shorter exposure time than expected
• Less light captured per image than expected
• Underexposure • Effects which are usually filtered
by averaging over time have a stronger impact
As well as Expos. Multiple exposures • Multiple frames superimposed into one image
• Number of elements is miscalculated
• Movement is miscalculated
Part of Expos. Only a part of photoelectric sensor area is exposed to light
• Only parts of the image show the scenery; the rest stays black
• Misinterpretation of scene by CV alg.
• Missed detections • Missing input information causes incomplete output
Potential reason: geometric mismatch with optics
Reverse Expos. Electronics emit light instead of receiving it
• Additional l.s. creates unexpected shadows / patterns
• See More l.s.
• See More l.s. E.g.: IR camera is causing heat and IR light respectively
Other than Expos. Exposure time is different than expected
• Temporal coherence of frames is lost due to changing exposure
• Two consecutive frames are so different that corresponding elements cannot be found any more => any tracking is confused
• Scene recognition breaks down or is at least severely hampered
Potential reason: fault in control unit, e.g. computes exposure time proportional to intensity of incident light
Where else Expos. n/a Spatial periodic
Expos. Exposure is not spatially synchronized with data readout
• Not all pixels are exposed at the same time
• Rolling shutter effect
• The frequency of the rolling shutter is misinterpreted as movement
E.g: Standard CMOS sensor without a mechanical shutter, where the readout triggers the exposure
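The rolling shutter effect named above can be quantified with a simple row-delay model. This is an illustrative sketch of ours (function name, readout time, and object speed are assumed values):

```python
# Rolling-shutter skew: each sensor row is exposed slightly later than
# the one above, so a vertical edge moving horizontally appears slanted.
def row_skew_px(row: int, total_rows: int, readout_s: float,
                speed_px_per_s: float) -> float:
    """Horizontal displacement of a moving vertical edge at a given row."""
    row_delay_s = readout_s * row / total_rows   # readout delay of this row
    return speed_px_per_s * row_delay_s

# Over a 20 ms readout of a 1080-row sensor, an edge moving at 500 px/s
# is displaced by 10 px between the top and the bottom of the frame:
skew_top = row_skew_px(0, 1080, 0.020, 500.0)
skew_bottom = row_skew_px(1080, 1080, 0.020, 500.0)
```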
Spatial aperiodic
Expos. Exposure time of photoelectric elements is not constant
• Due to the mechanical opening of the shutter the time of exposure varies
• Photometric calibration differs spatially
• The brightness constancy assumption of optical flow is not entirely fulfilled
• LDR to HDR based on photometric calibration fails
• Optical flow is inaccurate
E.g: A malfunction of the camera
Temporal periodic
Expos. Exposure time changes periodically
• Frequent (and quick) changes between over- and underexposure
• See other than
• See other than
Temporal aperiodic
Expos. Continuous recording without a constant exposure time
• Movements of objects appear erratic
• Miscalculation of positions • Miscalculation of motion
Close (spat.)
Expos. n/a
Remote (spat.)
Expos. n/a
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Before (temp.) | Expos. | Exposure occurred before expected time | • A too early scene is captured | • Wrong scene interpretation if wrongly matched to expected scene phase | |
| Before (temp.) | Expos. | Exposure starts earlier than expected | • If exposure duration is correct, a too early scene is captured • If exposure duration is too long, overexposure results | • Wrong scene interpretation if wrongly matched to expected scene phase • Hampered recognition due to overexposure | Example: in physical experiments, precise timing can be jeopardized |
| After (temp.) | Expos. | Exposure occurred after expected time | • A too late scene is captured | • See Before (1) | |
| After (temp.) | Expos. | Exposure starts later than expected | • If exposure duration is correct, a too late scene is captured • If exposure duration is too long, overexposure results | • See Before (temp.) (2) | |
| In front of (spat.) | Expos. | Something is in front of the detector electronics | • Increased static image noise • See Less Focus | • See Less Focus | Layers in front of the light-sensing CCD (e.g. readout lines) |
| Behind (spat.) | Expos. | Something behind the detector electronics which emits light | • Detector receives more light than expected • Depending on wave properties, this can lead to annihilation | • Overexposure • Dimming or colour modification | Example: cat-eye reflector |
| Faster | Expos. | See Less | | | |
| Slower | Expos. | See More | | | |
| No | Resol. | Sensor has no resolution | • All incoming light accumulated into one "pixel" | • De facto no detection of any object possible | Example: photo diode |
| More | Resol. | The sensor resolution is higher than expected; pixel size is smaller than expected | • More data to be processed than expected • Usually less light sensitivity and more noise than expected • Angular resolution is finer than expected → more details visible | • If the wrong image size is not recognized by the CV alg.: line mismatch occurs (i.e. the part of the 1st pixel line exceeding the expected width is interpreted as the beginning of the second line, and so on) • If the image size is correctly recognized: processing time too long • Increased error rate in scene interpretation due to higher noise rate • CV alg. parameters have to be adapted (e.g. minimal size of sliding window), otherwise functionality is lost or hampered | E.g.: a different camera is used, or all pixels of the sensor are used instead of only the inner ones (it is a common technique to drop a number of rim pixels in order to reduce distortion effects) |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Less | Resol. | The sensor resolution is lower than expected; pixel size is larger than expected | • Less data to be processed than expected • Lower noise level than expected • Angular resolution is coarser than expected | • Complementary to More | |
| As well as | Resol. | Image sensor creates images of multiple resolutions | • See Other than | • Algorithm might confuse a low resolution image with a high resolution image → see Other than | e.g. progressive JPG |
| Part of | Resol. | Resolution differs from expected along one dimension only | • Image/pixel ratio other than expected → image distortion • If the ratio of actual to expected pixel number is simple (e.g. 1:2 or 2:1): wrong appearance can look similar to interlacing | • If the ratio of actual to expected pixel number is simple (e.g. 1:2 or 2:1) and interlacing is used: subtle image differences may distort the image in a way simulating higher or lower motion speed of (objects in) the scene • Otherwise like As well as | Example: image is 512*256 instead of 512*512 |
| Part of | Resol. | Only part of the detector area is sensitive to light | • Not all expected parts of the scene are visible; e.g. if the sensitive area is smaller than expected, a dark area around the image border results • Static image noise increased | • Fewer objects or events detected • Scene recognition hampered | Examples: area smaller than expected; some pixels damaged |
| Part of | Resol. | Part of the pixel area is insensitive | • Noise increased • Detectable light intensity reduced | • Scene recognition hampered • Underexposure | |
| Reverse | Resol. | Resolution is n*m instead of m*n | • Size of pixel lines and columns reversed, but number of pixels per image as expected • If lens properties are chosen according to the expected ratio, image borders show strong aberration and occlusions | • If this is not recognized: strongly confused scene interpretation | |
| Other than | Resol. | Resolution is different from expected | • See More/Less | • See More/Less | Example: image width and height are not a power of 2 |
| Other than | Resol. | Aspect ratio differs from the expected one | • If the resulting number of pixels equals the expected number: see Reverse • Else: see More/Less | • See Reverse • See More/Less | |
| Other than | Resol. | Pixel ratio differs from expected | • Distorted image | • Object mismatch or scene misinterpretation | |
| Where else | Resol. | The pixel geometry is not rectangular; e.g. hexagonal | • Centers of pixels not on a rectangular raster, and pixel shape not rectangular | • If this is not recognized: strange appearance of the scene – several distortion effects | |
| Spatial periodic | Resol. | Pixel size/position changes regularly over the view | • Different parts of the sensor sample the same scene area at different sizes | • If this is not recognized, the effects correspond to those of barrel or pincushion distortion – see Optics/L.Geom./More • Miscalculation of sizes • Miscalculation of relative relations and positions | E.g. (like an eye): resolution at the center higher than at the image edges; e.g. also compound eye |
| Spatial aperiodic | Resol. | Pixel size changes non-uniformly (irregularly) over the view | • As Spatial periodic, but perhaps limited to a small number of pixels; presumably due to production faults | • Miscalculation of sizes • Miscalculation of relative relations and positions | |
| Temporal periodic | Resol. | The resolution changes over time in a regular manner | • Number of pixel lines and columns changes periodically • Limit of captured scene details changes synchronously | • If this is not recognized: continuously changing scene distortion • If this is recognized: computation time changes correspondingly, and some scene details are lost periodically | Possibly irrelevant |
| Temporal aperiodic | Resol. | The resolution changes over time in a spontaneous manner | • Number of pixel lines and columns changes abruptly at unforeseeable points in time • Limit of captured scene details changes synchronously | • If this is not recognized, effects as with More … Reverse occur | Some capture electronics may change resolution if the overall intensity passes a given threshold for a given amount of time |
| Close (spat.) | Resol. | n/a | | | |
| Remote (spat.) | Resol. | n/a | | | |
| Before (temp.) | Resol. | The resolution changes earlier than expected | • The resolution change is expected in principle, but not at the time it actually occurs | • The effects of Temporal (a)periodic occur until the expected point in time | |
| After (temp.) | Resol. | The resolution changes later than expected | • See Before | • The effects of Temporal (a)periodic occur from the expected point in time until the actual point in time | |
| In front of (spat.) | Resol. | n/a | | | |
| Behind (spat.) | Resol. | n/a | | | |
| Faster | Resol. | See Before and Temporal (a)periodic | | | |
| Slower | Resol. | See After and Temporal (a)periodic | | | |
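The line-mismatch hazard of "More Resolution" (a raw stream reshaped with the wrong width) is easy to reproduce. The following Python sketch is illustrative; the image sizes are hypothetical and not taken from the study. A vertical line shears into a diagonal because each wrongly sized row "steals" pixels from the next true row:

```python
import numpy as np

# The sensor delivers rows of ACTUAL_W pixels, but the CV stage reshapes the
# raw byte stream assuming EXPECTED_W.
ACTUAL_W, EXPECTED_W, H = 100, 96, 50

true_img = np.zeros((H, ACTUAL_W), dtype=np.uint8)
true_img[:, 40] = 255                     # one vertical white line at x = 40

stream = true_img.ravel()                 # raw readout, row-major order
wrong = stream[:H * EXPECTED_W].reshape(H, EXPECTED_W)

# In the wrongly reshaped image the line's column drifts by
# (ACTUAL_W - EXPECTED_W) = 4 pixels per row instead of staying at x = 40.
cols = [int(np.argmax(wrong[r] > 0)) for r in range(5)]
print(cols)   # → [40, 44, 48, 52, 56]
```

The drift per row equals the width mismatch, which is exactly why a straight structure appears as a diagonal "shear" when the image size is not recognized.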
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| No | Spec.Eff. | No colour response at all – the sensor is completely insensitive | • Black image, only thermal noise generated | • No scene or objects detectable | |
| No | Spec.Eff. | Image is sampled in monochrome | • Spectrum reduced to the intensity of a single frequency | • Distinction of objects/material based on their specific colour re-emission might fail | |
| More | Spec.Eff. | Sensor is sensitive to more frequencies than expected | • More frequencies contribute to the scene impression than expected – strange or "wrong" colours | • Misinterpretation of objects or their parts • Overexposure | |
| More | Spec.Eff. | Image is sampled with more colour channels than expected | • Image contains more colour planes than expected | • Distinction of objects/material based on their specific colour re-emission might fail • Too long processing time • Colour planes are wrongly grouped (e.g. (x,y,z,k), (x,y,z,k) processed as (x,y,z), (k,x,y), …) | |
| Less | Spec.Eff. | Sensor is sensitive to fewer frequencies than expected | • Fewer frequencies contribute to the scene impression than expected – strange or "wrong" colours | • Misinterpretation of objects or their parts • Overexposure | |
| Less | Spec.Eff. | Image is sampled with fewer colour channels than expected | • Image contains fewer colour planes than expected | • Distinction of objects/material based on their specific colour re-emission might fail • Complementary to the 2nd More | e.g. satellite imagery has many different wavelengths (NIR→UV); one essential wavelength is missing |
| As well as | Spec.Eff. | In the expected frequency range the sensor is less sensitive, but in other frequencies it is more sensitive | • Other frequencies than expected contribute to the scene impression – strange or "wrong" colours | • See More and Less | Compared with the human eye, cheap sensors are more sensitive to near IR but less sensitive to near UV |
| As well as | Spec.Eff. | Optical filters and sensor colour filter interact | • Colours are skewed • Underexposure | • Distinction of objects/material based on their specific colour re-emission might fail | |
| Part of | Spec.Eff. | Spectral efficiency has a strong break in it (band-rejection filter) | • Colours are skewed • Colour and intensity are mixed | • Misdetections | |
| Reverse | Spec.Eff. | Spectral efficiency curve is inverted | • Completely wrong colour interpretation | • Misdetections | |
| Other than | Spec.Eff. | Spec. Eff. curve is different than expected | • See More/Less/As well as | • See More/Less/As well as | |
| Other than | Spec.Eff. | Automatically adapted Spec. Eff. (auto white balance) is erroneous | • See More/Less/As well as | • See More/Less/As well as | Can be caused by More Colour Obj. |
| Other than | Spec.Eff. | Image is sampled with different colour channels than expected | • Wrong colour interpretation | • Distinction of objects/material based on their specific colour re-emission might fail | e.g. a different colour filter array/Bayer pattern |
| Where else | Spec.Eff. | Spec. Eff. changes depending on the position on the detector area | • Static noise is increased and influences colour intensity and hue | • Misdetections | |
| Spatial periodic | Spec.Eff. | Spec. Eff. changes regularly over the sensor area | • Different sensor areas show different response curves, but in a predictable way | • Effects of As well as merge with those of Optics | Example: sensitivity reduces towards image edges – corresponds to vignetting |
| Spatial aperiodic | Spec.Eff. | Sensitivity changes irregularly over the sensor area | • Different sensor areas show different response curves, which is not or hardly predictable | • Depending on the strength of the differences, the "scattered" image hampers correct scene interpretation severely | Example: aging sensor |
| Temporal periodic | Spec.Eff. | Sensitivity changes regularly over time | • The sensor's response curves change over time, but in a (widely) predictable manner | • Changing colour temperature hampers recognition of previously learned colours or textures, but correction is possible | Example: due to warming up |
| Temporal aperiodic | Spec.Eff. | Sensitivity changes irregularly over time | • The sensor's response curves change over time in a (widely) unpredictable manner | • Changing colour temperature hampers recognition of previously learned colours or textures, if not corrected | Example: changing outside temperature |
| Close (spat.) | Spec.Eff. | n/a | | | |
| Remote (spat.) | Spec.Eff. | n/a | | | |
| Before (temp.) | Spec.Eff. | Colour response curve changes earlier than expected | • See Meaning | • Deviation of "generated" from "real" colours causes the CV alg. to misinterpret the scene status, e.g. point in time | E.g. in LDR to HDR |
| After (temp.) | Spec.Eff. | Colour response curve changes later than expected | • See Meaning | • See Before | See Before |
| In front of (spat.) | Spec.Eff. | n/a | | | |
| Behind (spat.) | Spec.Eff. | n/a | | | |
| Faster | Spec.Eff. | See Before | | | |
| Slower | Spec.Eff. | See After | | | |
| No | Quality | No quality of sensor electronics at all | • Generated image data are purely random – only temporal noise | • No object detected • Noise misinterpreted, e.g. as particles in a medium | |
| More | Quality | Sensor quality is higher than expected | • Less noise, vignetting etc. than expected • More contrast | • CV algorithms which try to compensate sensor faults introduce artefacts • Clutter in the scene might be interpreted as objects | E.g.: a denoising procedure assuming a certain amount of noise will flatten the image |
| Less | Quality | More (thermal) pixel noise than expected | • More pixel values are randomly modified than expected • Pixel values are more strongly randomly modified than expected • Less contrast | • Textures wrongly interpreted • Wrong textures detected • Small image details not detected • Wrong small image details detected | |
| Less | Quality | More intensity offset error than expected | • Dark scene zones appear brighter than expected | • Details in the dark (shadows) not detected | E.g.: black is not detected as black, it is grey |
| Less | Quality | More overflow effects than expected | • E.g. blooming | • Blooming effects misinterpreted as objects or object parts | |
| Less | Quality | Image noise is stronger than any signal caused by light intensities | • Image noise is prevalent • Contrast is reduced | • Misdetections | |
| As well as | Quality | Different levels of image noise are present under different conditions | • Training-data image noise and image noise under test can vary | • Misdetections due to "wrong" learning | e.g. when sampling with more frames, the static noise increases; after a warm-up phase the sensor behaves differently |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| As well as | Quality | All Less Quality effects combined | • See all Less Quality | • See all Less Quality | |
| Part of | Quality | Some Less Quality effects combined | • See all Less Quality | • See all Less Quality | |
| Reverse | Quality | n/a | | | |
| Other than | Quality | Image noise is different than expected | • See More/Less | • See More/Less | |
| Where else | Quality | The intensity offset error is expected to be high in a certain region of the image but is high in another region | • See Meaning | • Overcompensation in some and undercompensation in other regions | E.g.: in radiometric calibration |
| Spatial periodic | Quality | Sensor quality changes over its area in a regular manner | • Consequences listed under Less Quality occur at different image locations with different intensities, but in a regular and widely predictable manner (with respect to the variation parameters) • Image noise can produce a virtual pattern | • Hazards listed under Less Quality occur with different intensities over the image, but could in principle be compensated | |
| Spatial aperiodic | Quality | Sensor quality changes over its area in an irregular manner | • Like Spatial periodic, but in an irregular and hence hardly predictable manner • Image noise can produce a virtual pattern | • Hazards listed under Less Quality occur with different intensities over the image, and mitigation is difficult • Image noise could be confused with objects or textures | |
| Temporal periodic | Quality | Sensor quality changes over time in a regular and hence predictable manner | • Consequences listed under Less Quality change regularly – and hence at least conceptually predictably – over time | • Hazards listed under Less Quality occur with different intensities over time, but could in principle be mitigated | E.g.: image noise that is random but occurs at a 1/f frequency and can be filtered if modelled |
| Temporal aperiodic | Quality | Sensor quality changes over time in an irregular manner | • Consequences listed under Less Quality change irregularly – and hence are hard to predict even conceptually – over time | • Hazards listed under Less Quality occur with different intensities over time, and can hardly be mitigated | |
| Close (spat.) | Quality | n/a | | | |
| Remote (spat.) | Quality | n/a | | | |
| Before (temp.) | Quality | Before the system is warmed up, the image noise level is different than expected | • See As well as | • See As well as | |
| After (temp.) | Quality | After the system has reached a certain age, the image noise level is different than expected | • See As well as • Static image noise changes over the lifetime | • See As well as • Filters to remove static noise work with reduced efficiency | |
| In front of (spat.) | Quality | n/a | | | |
| Behind (spat.) | Quality | n/a | | | |
| Faster | Quality | See Temporal aperiodic | | | |
| Slower | Quality | See Temporal aperiodic | | | |
| Slower | Quality | Electronics degrade very slowly | • Only small changes in image noise over time | • Detection of deterioration might fail, as the changes are too small to go unnoticed | |
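The counterintuitive "More Quality" hazard (a denoiser tuned for a noisy sensor flattens a cleaner-than-expected image) can be sketched as follows. This is an illustrative Python example, not part of the study; the 3×3 mean filter merely stands in for any denoiser that assumes a certain noise level:

```python
import numpy as np

def box_blur3(img):
    """3x3 mean filter with edge replication - a stand-in for a denoiser
    that expects (and would normally remove) sensor noise."""
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    return sum(p[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

# A clean, noise-free image with genuine fine detail: a 1-px checker texture.
clean = np.indices((32, 32)).sum(axis=0) % 2 * 255.0
denoised = box_blur3(clean)

contrast_before = clean.std()
contrast_after = denoised.std()

# The smoothing intended for noise erodes real texture: most of the
# contrast is gone, so texture-based recognition is hampered.
assert contrast_after < 0.5 * contrast_before
```

This is the "clutter vs. detail" trade-off in the table: compensation tuned to an assumed fault level becomes a fault itself when the sensor is better than assumed.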
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| No | Quant. | No quantisation at all | • Inputs are only available in analogue form • Or only one bit per pixel • Or only one constant output | • CV alg. not able to process analogue data • No scene elements recognizable | May happen when the sensor breaks down |
| More | Quant. | More value quantisation levels than expected | • Encoding of quantised data uses more bits than expected • Encoding uses a larger fraction of the possible range (e.g. 0..32767 instead of only 0..10000 with 15 bits) | • If pixel boundaries become misaligned: image is completely deformed • If pixel boundaries remain aligned: exceeding values are not handled by the CV alg. – usually bright parts are affected; e.g. all values exceeding the expected maximum are mapped to "maximum white" → image brighter than correct, or all values exceeding the expected maximum are mapped into the expected range in a round-robin manner | |
| More | Quant. | Space quantisation is stronger than expected | • See Resolution | • See Resolution | |
| More | Quant. | Frame rate is lower than expected | • Fewer observations per time | • Misses important observations since observed objects or the change of their properties is too fast • A different sampling rate has to be considered in dynamic models, otherwise they do not fit • Temporal aliasing occurs | See Fast Obj. properties; see Fast l.s. properties |
| Less | Quant. | Value quantisation is lower than expected | • Encoding of discretised data uses fewer bits than expected • Encoding uses only a fraction of the possible range (e.g. 0..10000 instead of 0..32767 with 15 bits) | • If pixel boundaries become misaligned: image is completely deformed • If pixel boundaries remain aligned: image dimmer than correct • Subpixel accuracy is reduced due to fewer possible data values | |
| Less | Quant. | Space quantisation is weaker than expected | • See Resolution | • See Resolution | |
| Less | Quant. | Frame rate is higher than expected | • More observations per time | • A different sampling rate has to be considered in dynamic models, otherwise they do not fit | |
| As well as | Quant. | Multiple quantisation effects (value, space & time) interact with each other | • See More/Less | • See More/Less | |
| Part of | Quant. | Only part of the quantisation is performed | • Some of the expected dimensions are not provided | • Misinterpretation of the available data leads to a highly deformed image | Example: only intensity provided instead of r/g/b |
| Reverse | Quant. | Received intensity is encoded inversely to expected | • Image is encoded as its "negative" | • Scene recognition breaks down • Dips are interpreted as hills and vice versa • L.s. positions confused | Example: 0=white, 255=black |
| Other than | Quant. | A/D converter introduces additional current and fakes light intensity | • Increased image noise | • Misdetections | |
| Other than | Quant. | Combination of Part of and Reverse | | | |
| Other than | Quant. | Value quantisation is other than expected | • Intensity output is other than expected • More/fewer different intensity values than expected • More/less contrast | • Colours and shadows misinterpreted; derived scene geometry has systematic deviations • Less detail can result in missed detections • More image noise and background clutter can result in additional false detections | e.g. logarithmic quantisation instead of linear |
| Other than | Quant. | Time quantisation is other than expected | • Speed assumptions do not fit the observations | • Dynamic models have to be adapted | |
| Where else | Quant. | Different pixels quantise the same intensity with different values | • Image has a scattered structure overlaid; e.g. certain pixels always appear darker than their neighbours | • Misinterpretation, e.g. as texture | |
| Spatial periodic | Quant. | Sensor pixels sample only at discrete locations, thus quantising space | • Scene is sampled only at specific locations → increased contrast (steps) | • Stronger contrast misinterpreted, e.g. as shadow edges | |
| Spatial periodic | Quant. | Quantisation changes over the sensor in a regular manner | This is considered so unusual that no further analysis is provided | | |
| Spatial aperiodic | Quant. | Quantisation changes over the sensor in an irregular manner | This is considered so unusual that no further analysis is provided | | |
| Temporal periodic | Quant. | Quantisation changes in a temporally regular manner | (Not to be confused with Exposure.) This is considered so unusual that no further analysis is provided | | |
| Temporal periodic | Quant. | Sensor samples the environment only with a certain frame rate, thus quantising time | • See Exposure | • See Exposure | |
| Temporal aperiodic | Quant. | Quantisation changes in a temporally irregular manner | See Temporal periodic | | |
| Close (spat.) | Quant. | n/a | | | |
| Remote (spat.) | Quant. | n/a | | | |
| Before (temp.) | Quant. | n/a | | | |
| After (temp.) | Quant. | See Exposure | | | |
| In front of (spat.) | Quant. | n/a | | | |
| Behind (spat.) | Quant. | n/a | | | |
| Faster | Quant. | See Exposure | | | |
| Slower | Quant. | See Exposure | | | |
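The "Less Quantisation" case with aligned pixel boundaries (image dimmer than correct) follows directly from a bit-depth mismatch. The Python sketch below is illustrative only; the bit depths are hypothetical. A pipeline that normalises 10-bit sensor data as if it filled a 16-bit range makes every pixel roughly 64× too dark:

```python
import numpy as np

rng = np.random.default_rng(0)
raw10 = rng.integers(0, 1024, size=(4, 4), dtype=np.uint16)  # 10-bit data

as_16bit = raw10.astype(float) / 65535.0   # wrong assumption: 16-bit range
correct = raw10.astype(float) / 1023.0     # correct 10-bit normalisation

# Pixel boundaries stay aligned, so the image is not deformed - it is only
# dimmed by the ratio of the assumed to the actual value range. Thresholds
# tuned for the expected range would flag the whole frame as underexposed.
ratio = correct.mean() / as_16bit.mean()
assert abs(ratio - 65535.0 / 1023.0) < 1e-6
```

The inverse mismatch (16-bit data squeezed through a 10-bit assumption) gives the "More Quantisation" hazards instead: clipping to "maximum white" or round-robin wrap-around.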
2.8 Algorithm
2.8.1 Parameters
Table 16: Algorithm Parameters

| Parameter | Values / Range | Meaning |
|---|---|---|
| Parameters | | Input parameters, thresholds, intervals, sample sizes, mask sizes, kernel sizes |
| History | | Effects dependent on previous inputs/results, memory time span |
| Models | | Object models (quality: representation; coverage: size relation between model and environment; content: overlap between training and operation), probabilities, physics, dynamic models (e.g. Kalman) |
| Calibration | | Perspective and observer lens geometry/position correction, colour calibration |
| Real-time Performance | | Performance of the algorithm, results per second, time intervals between arrival of different input data (sampling) |
| Runtime-Environment | | Timing constraints (synchronicity), interference with other processes, memory load/utilization, processors, GPU |
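One way to make the parameter groups of Table 16 concrete in code is an explicit, validated configuration object, so that deviations such as "parameter not set" or "parameter value too high/low" can be rejected up front. The following Python sketch is purely illustrative; all field names and default values are hypothetical and not part of any R3-COP implementation:

```python
from dataclasses import dataclass

@dataclass
class AlgorithmConfig:
    # Parameters: thresholds, window/kernel sizes, ...
    edge_threshold: float = 0.1      # unitless, expected in (0, 1)
    kernel_size: int = 3             # pixels, must be odd
    # History: memory time span for temporal filtering
    history_frames: int = 10
    # Calibration: lens geometry / colour correction inputs
    focal_length_mm: float = 8.0     # unit made explicit in the name
    # Real-time performance budget
    max_latency_ms: float = 50.0

    def validate(self):
        """Reject 'No Param.' and More/Less Param. deviations early."""
        assert 0.0 < self.edge_threshold < 1.0
        assert self.kernel_size >= 1 and self.kernel_size % 2 == 1
        assert self.history_frames >= 1
        assert self.max_latency_ms > 0.0
        return self

cfg = AlgorithmConfig().validate()
```

Encoding the unit in the field name (`focal_length_mm`) is one simple defence against the "Other than: wrong units" deviation analysed in the HAZOP table below.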
2.8.2 HAZOP
Table 17: Algorithm HAZOP

| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| No | Param. | (Some) parameters not set | • System uses default parameters | • Default parameters don't fit the current situation | |
| More | Param. | Parameter value is too high | • Threshold is too high • Window too big for the target frequency in the data | • Functionality hampered | Example: binary segmentation is too optimistic/pessimistic; classifier: false positives/false negatives |
| Less | Param. | Parameter value is too low | • See More | • See More | |
| As well as | Param. | One parameter is too low, the other too high | • See More and Less | • See More and Less | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Part of | Param. | Some parameters are not set | • See No | • See No | |
| Reverse | Param. | Parameters are input in reverse | • See Where else | • See Where else | Example: byte order is reversed |
| Other than | Param. | Parameter is provided in units different from expected | • Misinterpretation, or the scale of inputs is wrong | • Incorrect model • Incorrect action planning | Example: cm assumed but inches output |
| Other than | Param. | Parameter does not fit reality | • Deviation of the system's view of reality from the real world | • The system fails or presents erroneous data | The baseline length parameter of a stereo system is set inaccurately |
| Where else | Param. | The value for one parameter is confused with another | • Parameters are confused (wrong values assigned) | • Functionality nullified | Example: length and width of a 3D volume to be used for dense stereo are confused |
| Spatial periodic | Param. | n/a | | | |
| Spatial aperiodic | Param. | n/a | | | |
| Temporal periodic | Param. | Parameter is oscillating | • No stable parameter set • No stable result | • Unreliable result | Calling unit introduces an oscillation of a parameter |
| Temporal aperiodic | Param. | Parameter is stochastically changed | • See Temporal periodic | • See Temporal periodic | |
| Close (spat.) | Param. | n/a | | | |
| Remote (spat.) | Param. | n/a | | | |
| Before (temp.) | Param. | Initialization is done in the wrong order | • Temporal dependencies are ignored | • Functionality hampered • Deadlocks • Race conditions • Results are used although they are not yet valid | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Before (temp.) | Param. | Parameter change occurs before expected | • System misinterprets situation | • Functionality hampered | |
| After (temp.) | Param. | See Before | | | |
| After (temp.) | Param. | Phase shift in parameter oscillation | • See Before | • See Before | |
| In front of (spat.) | Param. | See Reverse | | | |
| Behind (spat.) | Param. | See Reverse | | | |
| Faster | Param. | See Temporal periodic | | | |
| Slower | Param. | See Temporal periodic | | | |
| No | History | History is not available | • History is empty | • Results are invalid | |
| No | History | No history is used at all | • See Meaning | • Time-based effects are missed • Temporal patterns such as motion cannot be used to analyse the observed scene | |
| More | History | History size/time span is too long | • Calculation time higher • Low-pass filter effect • Increased smoothing • Historical data are weighted more than current inputs • Important details are drowned in a sea of unimportant data | • Results are smoothed too much • Results arrive too late (delays) • Relevant but short-time phenomena are ignored | Example: Speed data is smoothed too much (current speed value is the average of a longer time frame); picket fence removed due to repeated mismatch in subsequent images |
| Less | History | History is not yet complete | • Initialization is not yet completed • Algorithm must first sample inputs for some time to create a history, or default assumptions about the history are needed • Algorithm might use old history (e.g. from last run); results are based on an old history | • Results at the beginning are incorrect • History was only valid for the old time frame/location; using the old history now results in incorrect behaviour | Example: Speed/pose/movement calculations are not stable yet after only one frame |
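The over-smoothing hazard in the "More History" row can be made concrete with a small sketch (hypothetical speed signal and window sizes; plain Python, no CV library assumed): a causal averaging window that is far too long keeps reporting substantial speed long after the platform has stopped.

```python
def moving_average(signal, window):
    """Causal moving average: each output uses at most the last `window` samples."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = signal[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical speed signal: constant 1 m/s, then a sudden stop at sample 50.
speed = [1.0] * 50 + [0.0] * 50

short_hist = moving_average(speed, 3)   # history span appropriate
long_hist = moving_average(speed, 40)   # "More History": span far too long

lag_short = short_hist[55]  # window covers samples 53..55, all zero
lag_long = long_hist[55]    # window covers samples 16..55, mostly ones
```

Five samples after the stop, the short history already reports zero speed, while the long history still reports 0.85 of the old value: the relevant short-time phenomenon is drowned out and the result arrives late.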
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Less | History | History size/time span is too short | • Calculation time shorter • High-pass filter effect • Decreased smoothing • Historical data are weighted less than appropriate | • Results are not smoothed enough • Results arrive too early • Short-time phenomena are overemphasized | |
| As well as | History | For some aspects too much history is used, while for others too little | • Historical data is applied inappropriately | • Scene is partially misinterpreted | |
| Part of | History | History is only partially complete | • See Less • See Temporal aperiodic | • See Less • See Temporal aperiodic | |
| Reverse | History | History time is inverted | • Old entries appear as new and vice versa | • Confusion of events and causality can create erratic system behaviour | |
| Other than | History | History from a different time span than expected is used | • Results are based on outdated history • Wrong events are correlated | • Functionality hampered • Results incorrect • Miscalculations of timings and movements | |
| Where else | History | History from a different location is used | • Wrong events are correlated | • Miscalculations of location and tracking | Example: CCTV uses history of the wrong camera to analyse a different location |
| Spatial periodic | History | Spatial structures are sampled and turned into history data | • Missed frames change interpretation of spatial structure | • Miscounts • Misinterpretation of spatial periodicals | |
| Spatial aperiodic | History | See Realtime | | | |
| Temporal periodic | History | History is periodically reset | • History is rebuilt every n frames | • Synchronized phenomena are interpreted differently in a periodic manner | |
| Temporal aperiodic | History | Some frames in history are missing | • Information of some frames is missing | • CV algorithm does not detect the missing frame(s) and crashes • Missing information is interpolated in a wrong way | |
| Close (spat.) | History | History data points are spatially incorrect | • Errors in spatial differences between history data points are increasing | • Incorrect results • Incomplete results | History data points are an interpretation of raw data. Example: Tracking moves to infinity |
| Remote (spat.) | History | History data points are spatially far away | • Errors in spatial differences | • See Close | Example: Tracking at far distance will fail because image data is too small; tracking moves in from infinity |
| Before (temp.) | History | See After | | | |
| Before (temp.) | History | Phase shift in frame sequence capture (too early) | • History is phase shifted | • Results arrive too early • Synchronisation to other systems is lost | |
| After (temp.) | History | Newest history includes an extreme event | • History is weighted strongly and the influence of the extreme event is significant • History spikes | • Results outside of specifications • Incorrect results | Example: After flash light |
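The "After (temp.)" flash-light example can be illustrated numerically (hypothetical per-frame brightness values): a single extreme event dominates a mean-based history, while a robust statistic such as the median is barely affected.

```python
# Hypothetical per-frame scene brightness; frame 9 contains a camera flash.
history = [10.0] * 9 + [255.0]

# A mean-based history lets the single extreme event dominate the estimate.
mean_estimate = sum(history) / len(history)

# A median-based history ignores the isolated spike.
median_estimate = sorted(history)[len(history) // 2]
```

The mean jumps from 10 to 34.5 because of one frame; a specification built around "typical brightness near 10" would be violated even though the scene itself has not changed.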
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| In front of (spat.) | History | n/a | | | |
| Behind (spat.) | History | n/a | | | |
| Faster | History | History time base is misinterpreted (too fast) | • Dynamics and movement calculation error increased | • Miscalculation of dynamics and movements | Example: Variable frame rate effects |
| Slower | History | See Faster | | | |
| No | Models | No models available | • Default results can be reported • Recognition of application-typical objects and situations not trained | • Incorrect results • Everything reported as unknown • Distinction between uncritical and critical situations hampered • Detection of typical objects and situations not more efficient than of others | |
| More | Models | More models available than expected | • Runtime higher • Model domain does not fit environment | • Results arrive too late • Mismatches • False positives (too many classes in the classifiers) • False negatives: no critical scenes detected, because "everything" is learned as normal | |
| More | Models | Modelling extent too large or over-specific | • Algorithm is very pedantic • Noise effects are enhanced • Noise confused with signal | • Matching will not work, as no patch exactly matches the very specific model • Classifier has too sharp peaks | E.g. overfitting: due to training the model with too much specialized data, it does not generalize enough and learns characteristics which are not task-specific but specific to the dataset |
| Less | Models | Modelling extent too small or too generic | • Large range of hypotheses is assigned to a small number of results | • More false positives | Trained with a database which does not generalize sufficiently (e.g.: cars always have shadows) |
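A minimal sketch of the overfitting case in the "More Models" row (synthetic data; a deliberately over-specific 1-nearest-neighbour "model" that memorizes every training sample, noise included — all names and values hypothetical):

```python
import random

random.seed(0)

def true_label(x):
    return 1 if x > 0.5 else 0

# Training data with roughly 20% label noise: an over-specific model that
# memorizes every sample also memorizes the noise.
train = []
for _ in range(50):
    x = random.random()
    y = true_label(x)
    if random.random() < 0.2:
        y = 1 - y  # label noise
    train.append((x, y))

def one_nn(query):
    """1-nearest-neighbour 'model': maximally specific, no generalization."""
    return min(train, key=lambda p: abs(p[0] - query))[1]

# Perfect on the training set (it memorized the noisy labels) ...
train_acc = sum(one_nn(x) == y for x, y in train) / len(train)

# ... but worse on fresh data, where the memorized noise causes errors.
test = [random.random() for _ in range(200)]
test_acc = sum(one_nn(x) == true_label(x) for x in test) / len(test)
```

Training accuracy is 1.0 while held-out accuracy drops: the dataset-specific noise, not the task, has been learned.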
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Less | Models | Number of models too few | • Small range of hypotheses is covered | • More false negatives | |
| As well as | Models | Redundancy in models | • Hypotheses are separated although they should match the same model • Total selectivity of the system reduced • Likelihoods are washed out instead of peaked | • Ambiguous results • Misdetections • Tracking errors (consistency between frames lost) | Example: Combining models from different sources |
| Part of | Models | Models are incomplete | • System uses incomplete model data • Some parts of the model are correct, but others are missed • Model misses essential aspects or situations • Trained model does not cover enough variants of a task | • Misinterpretation • Wrong defaults are used to fill out missing information • False negatives | Example: Reference point of the reference coordinate system not specified per model. E.g. for a trained model: a person detector has never been trained on sitting persons, only on standing and walking ones |
| Part of | Models | Dynamical models are incomplete | • Calculations are missing inputs from models | • Tracking fails | |
| Reverse | Models | Training data are opposite to those occurring during application | • "Good" cases trained as "bad" and vice versa | • Continuous wrong scene interpretation | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Other than | Models | Inadequate model assumptions | • Applicability of model to given problem is reduced • Missing regularisation | • Performance is negatively affected • More false positives / false negatives | Example: Neural net with too many neurons |
| Other than | Models | Inconsistent models are supplied | • See Part of • Specified objects are physically impossible | • CV algorithm is confused by inconsistent model → infinite loop | Example: Model has only front faces and no back faces |
| Other than | Models | Units or reference systems are other than expected | • Scaling of model is different from expected • Orientation of objects can differ from the expected • Specified objects are physically impossible | • Result has wrong units/scale • Miscalculation of distances • Misinterpretation of relations | Examples: different coordinate systems used, imperial units instead of metric; Euler angle rotation used to define orientation uses different axes (e.g. ZYZ instead of XYZ) |
| Other than | Models | Wrong probabilities used | • Confidences are incorrect • Relations between objects are changed | • Results incorrect | Example: Deterministic model or model with wrong probabilities used |
| Other than | Models | The training data is erroneous (label noise) | • The resulting model is inaccurate | • Results are erroneous or ambiguous | E.g. human error |
| Other than | Models | Global model is used instead of local model and vice versa | • A global algorithm is applied and fails due to the occurrence of local phenomena, and vice versa | • Foreground/background segmentation fails because it uses a single threshold for the whole image (global illumination model), whereas a local model (adaptive threshold) which deals with local illumination changes succeeds | Can happen during pre-processing or during the computation of the core algorithm. E.g. global and local thresholds |
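The global-versus-local threshold hazard can be sketched on a synthetic 1-D "image" with an illumination ramp (all values hypothetical; plain Python, no CV library assumed): a single global threshold mislabels half the background and misses one object entirely, while a local (adaptive) threshold recovers both objects.

```python
# Synthetic 1-D "image": illumination ramp 100..199 with two objects
# that are 30 units darker than their local background.
width = 100
background = [100 + i for i in range(width)]
objects = set(range(10, 20)) | set(range(80, 90))
image = [background[i] - 30 if i in objects else background[i]
         for i in range(width)]

# Global model: one threshold for the whole image.
global_t = 150
global_mask = {i for i in range(width) if image[i] < global_t}

# Local model: compare each pixel against the mean of its neighbourhood.
def local_threshold(img, radius=10, offset=10):
    out = set()
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        local_mean = sum(img[lo:hi]) / (hi - lo)
        if img[i] < local_mean - offset:
            out.add(i)
    return out

local_mask = local_threshold(image)
```

Under the ramp, the global threshold flags the entire dark half of the background and cannot see the object sitting on the bright half; the local model segments exactly the two objects.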
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Where else | Models | n/a | | | |
| Spatial periodic | Models | Model includes a spatial periodicity | • Algorithm expects same periodicity in scene as in model | • Failed detection caused by small deviation in periodicity of the current scene compared to the periodicity of the model | |
| Spatial aperiodic | Models | Event is physically inconsistent with regard to general assumptions | • Algorithm interprets physically inconsistent input as real | • Algorithm is confused by physical inconsistency | |
| Temporal periodic | Models | Model includes a temporal periodicity | • Algorithm expects same periodicity in scene as in model • See Other than | • See Other than | Example: Models are built in summer but used in winter |
| Temporal periodic | Models | Model includes temporal periodicity instead of ignoring it | • Algorithm expects parts of scene to be static • Algorithm expects special configuration of object | • Misdetections • False negatives | Example: Walking humans, beating heart, moving flag |
| Temporal aperiodic | Models | Model captures singular event as typical | • General cases are not supported | • Misclassifications | |
| Close (spat.) | Models | Assumptions about ranges in model violated | • Scaling is different from expected | • Classifications fail • Measurements incorrect | |
| Close (spat.) | Models | Two modes are so close (in parameter space) that they are hard to distinguish | • Confusion between the two modes | • Mode-based classification severely hampered | Possible reasons: classes have too low inter-class variance, or the parameter space has not enough dimensions |
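A small simulation of the "Close (spat.)" modes case (hypothetical 1-D class centres, minimum-distance classifier): as the two modes approach each other in parameter space, the misassignment rate grows drastically.

```python
import random

random.seed(1)

def classify(x, mu_a, mu_b):
    # Minimum-distance classifier between two 1-D class centres ("modes").
    return "A" if abs(x - mu_a) < abs(x - mu_b) else "B"

def error_rate(mu_a, mu_b, sigma=1.0, n=2000):
    """Fraction of class-A samples misassigned to mode B."""
    wrong = 0
    for _ in range(n):
        x = random.gauss(mu_a, sigma)
        if classify(x, mu_a, mu_b) != "A":
            wrong += 1
    return wrong / n

far_error = error_rate(0.0, 6.0)    # well-separated modes
close_error = error_rate(0.0, 0.5)  # modes close in parameter space
```

With modes six standard deviations apart almost nothing is misassigned; at half a standard deviation apart, roughly 40% of the samples land on the wrong side of the decision boundary.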
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Remote (spat.) | Models | See Parameters | | | |
| Before (temp.) | Models | See After | | | |
| Before (temp.) | Models | Model building/training stopped too early | • Only part of the relevant situations learned | • Unlearned situations are more likely misinterpreted | |
| After (temp.) | Models | Model includes assumptions about temporal sequences | • Algorithm expects temporal sequence in input data | • Waiting infinitely for events | |
| After (temp.) | Models | Training occurs while in operation | • Continuous learning | • Critical situations occurring repeatedly during operation are eventually considered normal | |
| After (temp.) | Models | Model building/training started too late | • Operational phase started too late | • Early critical situations missed | |
| In front of (spat.) | Models | n/a | | | |
| Behind (spat.) | Models | n/a | | | |
| Faster | Models | Model time scale different from expected | • Algorithm expects a different time scale | • Tracking fails • Matching fails | |
| Slower | Models | See Faster | | | |
| Slower | Models | Training occurs slower than expected | • Training phase takes too long | • Operation is started too late and misses critical situations | |
| No | Calib. | No calibration is used | • See Less | • See Less | |
| More | Calib. | Calibration has more parameters than can be estimated | • Redundant parameter has no meaningful value • Calibration converges slowly or against a local minimum | • Positions incorrect • Calibration does not converge | Example: Tracking: the combined effect of car drifting and road drifting could cancel out by default; pivot point of camera not known |
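The "More Calib." row (more parameters than can be estimated) can be sketched with a toy pinhole model (all numbers hypothetical): when only the ratio of two parameters is observable, different parameter pairs explain every observation identically, so the redundant parameter has no meaningful value and the calibration cannot converge to a unique solution.

```python
def predict(f_mm, pixel_size_mm, x_m, z_m):
    """Toy pinhole projection in pixels.

    Only the ratio f_mm / pixel_size_mm is observable from image data:
    the two parameters cannot be estimated separately.
    """
    return (f_mm / pixel_size_mm) * x_m / z_m

observations = [(0.5, 2.0), (1.0, 4.0), (0.3, 1.5)]  # (X, Z) pairs in metres

# Two different parameter pairs with the same ratio (512 px/mm of X/Z)
# fit every observation identically.
fit_1 = [predict(8.0, 0.015625, x, z) for x, z in observations]
fit_2 = [predict(4.0, 0.0078125, x, z) for x, z in observations]
```

Any optimizer sees a flat valley of equally good solutions along the redundant direction, which is exactly the slow or non-converging behaviour listed in the table.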
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Less | Calib. | Less calibration is used than appropriate | • Scene geometry not known to the scene interpreter • Distortion effects are not compensated • Epipolar constraints do not apply | • Scene geometry wrongly interpreted → wrong distances and sizes derived • Objects appear differently depending on their position within the image → misdetections | |
| Less | Calib. | Calibration has not enough parameters | • Algorithm uses defaults/fixed compensations | • Robustness reduced • Functionality hampered | |
| Less | Calib. | Calibration is based on too few data points | • Quality of calibration is reduced | • See No | |
| As well as | Calib. | Different calibration methods are combined into one calibration set | • Sequence of calibration methods influences calibration outcome | • See Other than | |
| Part of | Calib. | See Less | | | |
| Reverse | Calib. | Scene geometry reversed | • Negative scaling factor is used | • Near objects are interpreted as far away and vice versa | |
| Other than | Calib. | Calibration other than expected is used | • Calibration is inappropriate • Calibration faulty | • Results hampered • Scene misinterpreted | |
| Where else | Calib. | Calibration applies to a different environment than expected | • Calibration doesn't match current scene | • See Other than | Example: in-air calibration used underwater; air temperature has influence on calibration |
| Spatial periodic | Calib. | Error in correspondence of period in periodic calibration pattern (e.g. checkerboard) | • Wrong calibration result, e.g. for extrinsic calibration | • Range errors in creation of world model • Range error in perception of objects | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Spatial periodic | Calib. | The calibration pattern is presented at too regular positions | • Not enough data to get a valid calibration for a large operation volume | • Bad results at places other than those considered in the calibration | |
| Spatial aperiodic | Calib. | The calibration patterns used violate an assumption about periodicity | • Too much noise in the calibration data, wrong calibration result | • Bad recognition results for classification and localization | |
| Temporal periodic | Calib. | The stimuli to calibrate the latency are periodic and there is a period mismatch | • The parameter for the latency is set to a wrong value | • Visual servoing doesn't work well | This is for calibration of the vision system latency w.r.t. the overall system |
| Temporal aperiodic | Calib. | The clock for the cameras has some systematic jitter which should have been calibrated but hasn't | • Asynchronous stereo images mismatch | • Position errors in the images and hence distance errors from stereo | |
| Close (spat.) | Calib. | If an assumption about distance is part of the calibration measurement, the calibration object may be closer than modeled | • The focal length is underestimated • The stereo base is underestimated | • Distance and object size measurements are wrong | |
| Remote (spat.) | Calib. | See Close | | | Example: differences of focal length and baseline cancel each other out |
| Before (temp.) | Calib. | There may be a suitable order of steps for calibration, e.g. intrinsic first, then extrinsic, then latency; this may be mixed up | • Arrive at wrong local minimum • Get fundamental matrix and intrinsic camera parameters wrong | • See In front of | We might make a list of consequences for the case that the calibration is messed up beyond all recovery |
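The "Remote (spat.)" note, that focal-length and baseline errors can cancel each other out, follows directly from the ideal rectified-stereo depth formula Z = f * B / d (toy numbers, chosen hypothetically): only the product f * B is constrained by depth measurements, so opposite errors in the two parameters are invisible.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Ideal rectified stereo: depth Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

true_depth = stereo_depth(1024.0, 0.25, 8.0)    # correct calibration: 32 m

# Focal length halved AND baseline doubled: the errors cancel exactly,
# depth measurements cannot reveal the miscalibration.
compensating = stereo_depth(512.0, 0.5, 8.0)

# Only the focal length is wrong: depth is off by the same factor.
uncompensated = stereo_depth(512.0, 0.25, 8.0)
```

A calibration validated purely on depth targets can therefore look correct while both intrinsic and extrinsic parameters are individually wrong; any later use of f or B alone (e.g. reprojection) then fails.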
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Before (temp.) | Calib. | A trigger comes earlier than modeled | • See Temporal periodic and Temporal aperiodic | • See Temporal periodic and Temporal aperiodic | |
| After (temp.) | Calib. | See Before | | | |
| In front of (spat.) | Calib. | Some objects are in front of the calibration object (or some distorting media) | • Noisy or missing calibration data, maybe even with systematic error • Poor calibration results | • Bad image rectification leads to shape errors, poor feature extraction, bad stereo correspondences etc. | E.g. in colour calibration, be careful not to have large, strongly coloured objects in the surroundings |
| Behind (spat.) | Calib. | Calibration object in front of a background, parts of which could be mistaken as calibration-relevant; segmentation of the calibration object from the background may be poor | • Extra (wrong) calibration input • Poor calibration results | • See In front of | E.g. in colour calibration, be careful not to have large, strongly coloured objects in the background |
| Faster | Calib. | If motion models are used for calibration, the real velocity could be faster than the modeled velocity | • Comparing velocities in image and world yields wrong calibration results | • See Close | Similar effect as when the object passes at a closer distance than modelled |
| Slower | Calib. | See Faster | | | |
| Slower | Calib. | Calibration takes more time than appropriate | • Computation completed too late or is pre-terminated | • Scene geometry misinterpreted | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| No | Perf. | No images are captured, the process hangs, the last image that was captured is still returned as the result | • The software does not derive recent results | • Either no results are available or invalid old results are used | In general the line between performance and runtime is not too strict; e.g. whether for algorithmic reasons or clock errors, the consequences of late arrival of results are similar. Some similarities in consequences and hazards might occur |
| More | Perf. | The frame rate is higher than assumed by the system | • The processes seem to be slower than they are | • Visual servoing fails • Velocity-based classification fails | |
| More | Perf. | Some processing is much faster than expected | • If the process gets its data from some stack, it might overtake the other processes and the data become inconsistent | • Inconsistent data may lead to completely wrong results | Get mechanisms in place that guarantee correct synchronisation and correspondences |
| Less | Perf. | In case of parallel processing, some processing may take longer than expected | • Dependent processes get no results or old results | • Subsequent processes might crash due to lack of valid input or might produce wrong results because of using old, invalid input | |
| As well as | Perf. | Processes might be applied to more data than actually required (e.g. if downstream a sub-sampling to every n-th image is made, there is no need to process every raw image) | • The available resources (computation power, memory, communication bandwidth) are wasted | • There are not enough resources to do other important things | |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Part of | Perf. | There are spurious delays, i.e. generally the performance is OK, and then occasionally one result is too late | • See Less, only that it happens infrequently and spuriously | • See Less | |
| Reverse | Perf. | In order to ensure performance, fixed limits on the data to be processed are given (e.g. the number of features) | • The limits are worst case for each process, so the overall limitation is too strong | • Poor performance | |
| Other than | Perf. | The processor doesn't have a math coprocessor, so there are many function calls and the software runs a lot slower | • See Slower | • See Slower | |
| Where else | Perf. | Closely interacting processes are assigned to units that are too far spread out | • Long latencies | • Poor performance | |
| Spatial periodic | Perf. | n/a | | | |
| Spatial aperiodic | Perf. | n/a | | | |
| Temporal periodic | Perf. | A subprocess might be started periodically based on some clock, but then be out of sync with the input data | • No input data available • Old input data reused • Input data are lost | • Wrong results | |
| Temporal aperiodic | Perf. | A subprocess might be used to trigger other processes, but spuriously fails to deliver the trigger | • The waiting subprocess doesn't start | • System stalls or some results are missing | |
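The "Temporal periodic" performance entry can be simulated with toy timestamps (hypothetical frame rates): a consumer running on its own clock silently drops frames when it is slower than the producer, and reuses frames when it is faster.

```python
producer_period = 1 / 30                            # camera frames at 30 Hz
frames = [i * producer_period for i in range(60)]   # 2 s of frame timestamps

def consume(consumer_period, n_ticks):
    """At each consumer clock tick, take the newest frame already captured."""
    used = []
    for k in range(n_ticks):
        t = k * consumer_period
        newest = max(i for i, ts in enumerate(frames) if ts <= t)
        used.append(newest)
    return used

slow = consume(1 / 25, 50)    # consumer clock slower than the camera
fast = consume(1 / 60, 120)   # consumer clock faster than the camera

dropped = set(range(60)) - set(slow)   # frames that are never processed
reused = len(fast) - len(set(fast))    # frames that are processed twice
```

Neither failure raises an error: the slow loop loses input data (every sixth frame plus the tail), the fast loop processes stale data twice and so overweights it, matching the hazards listed above.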
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Close (spat.) | Perf. | n/a | | | |
| Remote (spat.) | Perf. | See Where else | | | |
| Before (temp.) | Perf. | Between temporally ordered subprocesses (e.g. a predictor–corrector algorithm or an expectation–maximization algorithm), the processes get out of order | • Process operates on old data or data are missing | • Wrong results out, or no results, or subsequent process crashes | Again a theoretical threat rather than a practical one; one simply wouldn't rely on runtimes to determine the order of invocation |
| Before (temp.) | Perf. | One might use an estimated processing time or latency of a hardware driver to assign a time stamp for data that do not come from a subsystem with a proprietary, synchronized clock | • The estimated time might be wrong • The time might vary a lot between calls to the driver (in that case see Spatial aperiodic) | • Wrong time stamps lead to wrong positions, orientations, velocities etc. • Also wrong data associations might result | The situation might be inevitable depending on the hardware of the subsystems, but then very careful calibration of the timing is mandatory |
| After (temp.) | Perf. | See Before | | | |
| In front of (spat.) | Perf. | n/a | | | |
| Behind (spat.) | Perf. | n/a | | | |
| Faster | Perf. | The process is faster than expected, so that it uses old input data repeatedly instead of once | • Overestimate importance of the data | • Wrong results | Rather theoretical; in a good system design one would never rely on assumed processing times alone |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Slower | Perf. | The software (or function) needs a lot more time to perform than expected, maybe because there are much more data to be processed than expected (e.g. too many features) | • The results are not available in time • If the rest of the system just keeps running, the effect is the same as missing results | • The entire system might halt, or might produce bad results due to missing data | |
| No | Runtime | The system does not run, some subsystem does not run, or some threads do not run | • Results missing | • The entire perception result or some parts of it are not present, or they are not as good as one might expect (e.g. missing measurements may degrade a filter, but not throw it off completely) | In general the line between performance and runtime is not too strict; e.g. whether for algorithmic reasons or clock errors, the consequences of late arrival of results are similar. Some similarities in consequences and hazards might occur |
| More | Runtime | More threads are running than needed, or threads not belonging to the perception system steal the performance | • The intended process performance is not reached | • Too few results; results are not in time | Too much speed (runtime performance, clock rate too high): see Faster |
| Less | Runtime | Some subsystem does not run or some threads do not run | • Results missing | • Some parts of the expected perception result are not present, or they are not as good as one might expect (e.g. missing measurements may degrade a filter, but not throw it off completely) | |
| As well as | Runtime | The same or competing results are obtained by two instances of a process | • Certainties are overestimated • Maybe results are accepted that should have been discarded | • Wrong perception result | |
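The "As well as" runtime hazard, certainties being overestimated when two instances of a process deliver the same result, shows up directly in standard inverse-variance fusion (toy numbers): fusing a measurement with its own duplicate halves the reported variance although no new information has arrived.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance fusion of two *independent* estimates."""
    w = 1 / var_a + 1 / var_b
    mean = (est_a / var_a + est_b / var_b) / w
    return mean, 1 / w

measurement, variance = 5.0, 4.0

# A second instance of the same process reports the identical measurement,
# and the fusion wrongly treats it as independent evidence.
fused_mean, fused_var = fuse(measurement, variance, measurement, variance)
```

The fused mean is unchanged, but the variance drops from 4.0 to 2.0: downstream consumers now trust the value twice as much as they should, which is exactly how results get accepted that should have been discarded.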
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Part of | Runtime | Serialized data are split at the wrong time | • E.g. if there is a serialized analogue TV signal, a wrong time base leads to a beam in the middle and part-old, part-new images | • The image is useless | |
| Part of | Runtime | Some bits or sectors in the memory are corrupted | • Data are wrong • More noise • Control flow may crash | • The result is invalid, or the certainty of the result is overestimated | |
| Reverse | Runtime | Instead of being started or of getting high priority, a process is stopped or gets low priority | • No results available • Timing is other than expected, fewer results than needed | • No valid results, or a stream of results where many instances are missing or wrong | |
| Other than | Runtime | A process is assigned to the wrong hardware module, e.g. a calculation is assigned not to the GPU but to a general-purpose CPU | • The hardware might still do the calculation but be much slower, at the expense of other processes | • Performance degradation • Or no results at all, depending on the robustness w.r.t. assignment of processor | |
| Other than | Runtime | A different version of a process is started than the intended version, caused by a naming error, version error or wrong path | • The interpretation of results might be wrong • The data are incompatible • The timing is different than intended | • Wrong perception results | Hard to specify the error without knowing about the other process; try to compare with known good results / ground truth |
| Where else | Runtime | A process is assigned to a processor that is farther away than expected and has more communication latency | • See After | • See After | Example: poor system configuration for processing of very wide baseline stereo |
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Spatial periodic | Runtime | n/a | | | |
| Spatial aperiodic | Runtime | n/a | | | |
| Temporal periodic | Runtime | Wrong trigger times | • Subsystems get out of sync, bad data association | • Wrong perception results, wrong estimation of distances or velocities | |
| Temporal aperiodic | Runtime | Poor signal quality on the clock | • Clock ticks are spuriously lost | • Old data are used instead of new ones • Bad data association | |
| Close (spat.) | Runtime | The designer may have assigned signal latencies based on distance, resp. spatial length of the communication channel; if the channel is a lot shorter, the timing is different than expected | • See Before | • See Before | |
| Remote (spat.) | Runtime | A system may be far off, so that significant delays are introduced due to communication time | • See After (2) | • See After (2) | |
| Before (temp.) | Runtime | There is a race in the signals, e.g. confusing the order of reset and calculation | • A result is calculated on a wrongly assumed previous state (the expected result of the reset) and then deleted by the reset | • No results, or wrong results | |
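The "wrong trigger times" hazard for velocity estimation follows from v = dx / dt: a wrong inter-frame time scales every velocity derived from positions (toy numbers, chosen hypothetically).

```python
# Two object positions from consecutive frames, in metres.
x0, x1 = 10.0, 10.5

true_dt = 1 / 30    # frames really are 1/30 s apart
wrong_dt = 1 / 25   # a clock error stamps them 1/25 s apart

true_velocity = (x1 - x0) / true_dt     # about 15.0 m/s
wrong_velocity = (x1 - x0) / wrong_dt   # about 12.5 m/s from the same data
```

The position data are perfect; the 20% velocity error comes purely from the mis-timed trigger, so it cannot be detected by inspecting the images.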
| Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|
| Before (temp.) | Runtime | The results of a subsystem arrive earlier than modeled | • The time stamps assigned to events are wrong resp. inconsistent | • When creating models in motion, the places and angles are wrong | |
| After (temp.) | Runtime | See Before (1) | | | |
| After (temp.) | Runtime | The results of a subsystem arrive later than modeled | • The time stamps assigned to events are wrong resp. inconsistent | • When creating models in motion, the places and angles are wrong | |
| In front of (spat.) | Runtime | n/a | | | |
| Behind (spat.) | Runtime | n/a | | | |
| Faster | Runtime | One subsystem runs faster than expected | • It is out of sync with the other subsystems • Wrong data correspondence | • Results are wrong | |
| Slower | Runtime | See Faster | | | |
2.9 Missed Criticalities

In this clause, criticalities are listed which were initially overlooked but detected later due to observations or hints. This list shall help to avoid such omissions in future HAZOPs.
Table 18: Missed Criticalities

| Criticality | Treatment | Origin |
|---|---|---|
| Flickering light source can confuse the exposure controller of the Observer (electronics) | Added to "Faster Intensity" of light source | R3-COP partner Wichert/SIEMENS (e-mail 2011-10-19, 13:52) |
| Reflections show objects upside down or laterally inverted | Added to "Reverse Reflection" of Objects | Herzner, 6.11.11 |
| Grid or fence objects: (a)periodically perforated object | Partially addressed by "More History" of CV alg.; added 4.2.12 as example to "Spatial (a)periodic Occlusion" of Objects | Herzner, 12.11.11 |
| Complex-shaped objects strongly occlude another object of very similar shape – separation extremely difficult | Added 4.2.12 as example to "Spatial periodic Occlusion" of Objects | Herzner, 12.11.11 |
| Viewing parameters – direction, orientation, width (angle) – missing | Added 30.11.11 (viewing params of Observer – Optics) | Zendel, 14.11.11 |
| Co-planarity of feature points or objects: if coplanar points occur, calculations cannot be performed since the lines in the equation systems are not linearly independent | Part of "Size" of Object | Murschitz, 18.5.12 |
| Global versus local: a global algorithm is applied and fails due to the occurrence of local phenomena, and vice versa. Can happen during pre-processing or during the computation of the core algorithm. E.g. global and local thresholds: foreground/background segmentation fails because it uses a single threshold for the whole image (global illumination model), whereas a local model (adaptive threshold), which deals with local illumination changes, succeeds | Added 18.7.12 as own "Other than / Models" case to Algorithm | Murschitz, 18.5.12 |
| Not enough image structure (texture and edges) to find correspondences | Covered by "low contrast" | Murschitz, 4.6.12 |
| Motion perception is ambiguous due to the limited FOV (aperture problem, http://en.wikipedia.org/wiki/Motion_perception#The_aperture_problem) | Added 18.7.2012: Observer "Less FOV" | Murschitz, 15.6.12 |
| Ego versus system motion: an observer in a train watching another train moving cannot determine who is moving | Added 18.7.2012: "Other Than Pos" of Object / Observer | Murschitz, 4.6.12 |
| Trained with a database which does not generalize enough (e.g. cars always have shadows) | Added 18.7.2012: Algorithm "Less Models" | Murschitz, 4.6.12 |
| Trained with a database which has too much label noise (e.g. human error) | Added 18.7.2012: "Other Than Models" | Murschitz, 4.6.12 |
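The "global versus local" entry above can be illustrated with a small synthetic segmentation sketch. The image, object layout, and threshold values are invented for illustration: a single global threshold fails under an illumination gradient, while a crude local (adaptive) threshold, which compares each pixel to the mean of its neighbourhood, still separates the foreground pattern.

```python
# Synthetic foreground/background segmentation under an illumination
# gradient: one global threshold vs. a simple local (adaptive) one.
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64

# Ground truth: two bright squares (foreground) on a dark background.
truth = np.zeros((h, w), dtype=bool)
truth[8:24, 8:24] = True
truth[40:56, 40:56] = True

# Image: foreground is brighter than its surroundings, but a strong
# left-to-right illumination gradient dominates the absolute values.
gradient = np.repeat(np.linspace(0.0, 1.0, w)[None, :], h, axis=0)
img = gradient + 0.25 * truth + 0.02 * rng.standard_normal((h, w))

def accuracy(mask):
    return float((mask == truth).mean())

# Global model: one threshold for the whole image.
global_mask = img > img.mean()

# Local model: compare each pixel to the mean of a horizontal window
# around it (a crude stand-in for an adaptive mean threshold).
k = 16
local_mean = np.empty_like(img)
for x in range(w):
    lo, hi = max(0, x - k), min(w, x + k + 1)
    local_mean[:, x] = img[:, lo:hi].mean(axis=1)
local_mask = img > local_mean + 0.05

# The local model clearly outperforms the global one on this image.
print(accuracy(global_mask), accuracy(local_mask))
```

The global threshold labels most of the bright half of the image as foreground and misses the foreground square in the dark half, exactly the failure mode the table entry describes.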
2.10 Removed Criticalities

This table lists criticalities which were initially part of the HAZOP list but were removed later due to their exotic or physically improbable nature. They show that the HAZOP inspired us to think out of the box.
Table 19: Removed Criticalities

| Location | Guideword | Par. | Meaning | Consequences | Hazards | Note |
|---|---|---|---|---|---|---|
| Light source | Slower | Wave | Light moves detectably slower through a super-cooled medium (e.g. a Bose-Einstein condensate) | Assumption that light is instantaneous (does not need measurable time) is invalid | Malfunction of any time-dependent algorithms working under that assumption | Near 0 K in a Bose-Einstein condensate |
| Medium | No | Spectrum | Medium turns any spectrum into the same spectrum | Monochrome image results | No colours can be distinguished by the CV alg. | |
| Object | No | Spectrum | From the colour spectrum, only the integral intensity within a given spectral range is transmitted | Different spectra with the same intensity cannot be distinguished | Algorithms relying on colour differentiation cannot work properly; in particular, textures with similar intensities cannot (or only hardly) be distinguished, and object distinction is hampered | |
| Object | Faster | Refl. | Reflectance changes faster than expected | See Meaning | Object is not correctly recognized; failed re-identification in consecutive images of a (video) sequence | E.g. the reflectance of a wet road increases faster than expected because of heavy rain |
| Object | Slower | Refl. | Reflectance changes slower than expected | See Meaning | Object's state is misinterpreted | E.g. the reflectance of a wet road increases slower than expected because of weaker rain than expected |
3 Summary
3.1 Qualitative Analysis

The quality of the CV-HAZOP was assessed during the whole course of this work; this assessment is an inherent part of the CV-HAZOP workflow itself (as shown in Figure 1). Whenever the not-sure path is reached, the quality of the CV-HAZOP is increased by adding more parameters or by modifying the list of guidewords. Since several persons with different backgrounds – from the computer vision as well as the V&V sector – have contributed inputs corresponding to their fields, the presented CV-HAZOP should incorporate all the aspects important to both domains.
Once all the criticalities had been accumulated, a unification and clean-up phase was entered. During this phase, the main concerns were uniformity and the minimization of redundancy. Additionally, some of the criticalities were shifted from the main catalogue into Section 2.10, as they are of no practical relevance; however, they show that the CV-HAZOP leads to out-of-the-box thinking.
3.2 Quantitative Analysis

A total of 1034 meaningful criticalities was identified. 30% of the entries concern the object(s), which reflects their importance as the most influential elements in a scene; the 25% of observer entries reflect the importance of the observer to a vision system. Figure 5 shows the distribution of CV-HAZOP entries according to their location. The 17% of non-meaningful combinations need further examination: in the performed CV-HAZOP, a number of criticalities have been assigned "n/a" (not applicable). This corresponds to the fact that for some guideword-parameter combinations no meaningful criticality can be identified.[5]
By analysing the guideword-location combinations in Figure 6, we see that the generic guidewords ("No" to "Other Than") lead to a reasonably low percentage of invalid entries (criticalities without meaning), with the exception of "Reverse". The applicability of the guidewords for temporal aspects ("Temporal periodic" to "Slower") and spatial aspects ("Spatial periodic" to "Behind") is location-dependent. In particular, the concept of space is not applicable to a number of parameters at various locations and therefore fails to create meaningful interpretations; e.g. "Close" and "Electronic Observer" do not bear any meaningful combination at all, since their concepts are unrelated.

Nevertheless, the usage of the guideword and parameter combinations leads to interpretations which would have been hard to find otherwise.
[5] In contrast, a single parameter-guideword combination can also lead to multiple criticalities which describe entirely distinct situations.
Figure 5: CV-HAZOP statistics: entries per location. Number of valid CV-HAZOP entries per location, and the number of non-meaningful parameter-guideword combinations ("n/a").

Figure 6: Percentage of meaningful combinations of guideword and parameter per location (black: 0%, white: 100%).
3.3 Conclusion

We presented the application of the well-known risk analysis procedure HAZOP to the broad field of visual criticalities with respect to computer vision systems. The analysis was performed by first introducing a novel generic model of computer vision systems and then performing the so-called CV-HAZOP on it.
The model separates the system into different units, each corresponding to a specific influence on the information. These units are referred to as locations, e.g. light source, medium, object, observer, and algorithm. During the course of the CV-HAZOP, a number of vision-relevant guidewords were identified. Additionally, a set of parameters for each location was defined; this set is sufficient to characterize the location's individual influence on the information. The parameters have been analysed according to the HAZOP scheme. Each parameter-guideword combination can lead to one or more meaningful deviations (from a "normal" scene) and therefore to criticalities.
Additionally, a quantitative analysis of the CV-HAZOP has been presented, supporting the feasibility of the chosen guidewords, locations, and parameters. A qualitative analysis has been performed continuously, as part of the workflow according to Figure 1.
3.4 Next Steps

The main goal is to incorporate the CV-HAZOP into an automatic test-case generator, which will provide a full proof of concept. A necessary prerequisite for this automation is the formalisation of the HAZOP entries in a machine-readable way.
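As a sketch of what such a formalisation could look like, the following encodes one catalogue entry (content paraphrased from the "Reverse Reflection" entry of Table 18) as a serialisable record. All field names are illustrative assumptions, not the project's final schema; a test-case generator could then filter entries by location or guideword.

```python
# Hypothetical machine-readable shape for a CV-HAZOP entry.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class HazopEntry:
    location: str          # e.g. "Light source", "Medium", "Observer"
    parameter: str         # e.g. "Intensity", "Spectrum", "Runtime"
    guideword: str         # e.g. "No", "More", "Faster", "Other than"
    meaning: str
    consequences: list[str] = field(default_factory=list)
    hazards: list[str] = field(default_factory=list)
    note: str = ""

entry = HazopEntry(
    location="Object",
    parameter="Reflection",
    guideword="Reverse",
    meaning="Reflections show objects upside down or laterally inverted",
    consequences=["A mirrored duplicate of the object appears in the image"],
    hazards=["Object is detected twice or at a wrong position"],
)

# Serialising to JSON makes the catalogue consumable by a generator.
record = json.dumps(asdict(entry), indent=2)
print(record)
```

A flat record like this is deliberately simple; cross-references between entries ("See Before", "See Faster") would additionally need entry identifiers.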
Furthermore, this CV-HAZOP has to stay open for modification. New technologies in computer vision will constantly enlarge the number of solvable problems; at the same time, the visual criticalities will increase in number and complexity. An example of an emerging technology is the plenoptic camera: even though it eliminates some critical situations such as defocusing, it also raises a new category of possible hazards that might not be fully understood yet.