Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings up to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These strategies largely separate translations from rotations through a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent distances unambiguously; rather, they are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
Behaviour needs to be orchestrated on a variety of spatial scales
The behaviour of moving animals takes place in space, in the case of flying animals, like most insects, in three-dimensional space. The range of distances relevant for spatial behaviour varies widely: For example, if a prey object is to be grasped, it must be within the grasping range of the animal’s limbs. Often, however, the goal object must first be reached by locomotion over a certain distance. On the path to such a goal, other objects may be in the way. These must then be identified as obstacles to be able to initiate suitable evasive manoeuvres. Objects that are obstacles under certain conditions can, however, serve as landing objects in other behavioural contexts. For both collision avoidance and landing decisions, the behaviourally relevant distance range is determined by the locomotion velocity as well as the time the animal needs to initiate an avoidance manoeuvre or a landing approach. However, if goals such as a prolific food source or the nest are to be found after a long excursion, the behaviourally relevant distance range may extend far beyond the spatial range that can be directly perceived by the animal. The wide range over which spatial information must be gathered and represented in the animal's brain implies a variety of underlying mechanisms.
Spatial vision is often taken to be essentially equivalent to stereoscopic vision, which is based on disparities between the retinal images of the two eyes. Stereoscopic vision is of central importance for primates in the near range when grasping and manipulating objects with their hands (Howard and Rogers 2012; Read 2021), but also for other animals such as toads aiming at prey with their tongue (Collett 1977) or praying mantids sitting in ambush to catch prey with the claws of their forelegs (Rossel 1983; Nityananda et al. 2018; O'Keeffe et al. 2022). However, for geometric reasons, stereoscopic vision is not practicable at distances much larger than the immediate grasping range, regardless of the species. Especially during fast locomotion in complex environments, behavioural decisions often need to be made at relatively large distances from objects. Then other sources of distance information are needed. Although a number of spatial information sources are available (Collett and Harkness 1982; Howard 2012), for moving animals ranging from insects to birds and humans, the optic flow (OF), i.e., the pattern of image shifts on the eyes induced during locomotion, plays a dominant role in guiding spatial behaviour at all distance ranges, from very close to far away.
Optic flow as a source of distance information
OF has its basis in the image displacements on the eyes induced during self-motion of the animal. For geometric reasons, the OF pattern can be formally decomposed into two components, reflecting the translation vector and the corresponding rotation vector of self-motion in three-dimensional space. The rotational components of the OF pattern depend only on the rotation velocity of the animal, whereas the translational OF components are determined by both the translation velocity of the animal and the distance to the objects in the environment. The spatial information is thus contained exclusively in the translational OF component, which, at a given translation velocity, causes closer objects to move faster on the eyes than more distant ones. However, the spatial information is closely intertwined with the animal's translation velocity, as both a smaller distance to surrounding objects and a larger translation velocity lead to a larger image speed and thus a stronger OF. Hence the spatial information in the translational OF component is ambiguous (Longuet-Higgins and Prazdny 1980; Heeger and Jepson 1992). This important property of OF-based spatial information has consequences for spatial behaviour in various functional contexts.
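The geometry behind this ambiguity can be made concrete with a few lines of Python. For pure translation, the angular image speed of a point at distance d, seen at angle theta from the direction of travel, is (v/d)·sin(theta); the function name and numerical values below are illustrative, not taken from any of the cited studies.

```python
import math

def translational_flow(v, d, theta):
    """Angular image speed (rad/s) of a point at distance d, seen at angle
    theta from the direction of translation, for translation speed v."""
    return (v / d) * math.sin(theta)

# A nearby object passed slowly and a distant object passed quickly
# can produce exactly the same retinal image speed:
slow_near = translational_flow(v=1.0, d=2.0, theta=math.pi / 2)
fast_far = translational_flow(v=2.0, d=4.0, theta=math.pi / 2)
print(slow_near == fast_far)  # True: only the ratio v/d reaches the eye
```

Because only the ratio v/d is available, distance can be recovered from translational OF only if the translation velocity is known, or if distances are accepted as being expressed in units of the animal's own speed.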
Apart from this geometric ambiguity, it is important to note that OF information can only be detected by visual mechanisms if the retinal input signals are modulated in a time-dependent manner. This is not the case when the animal passes, for instance, a homogeneous, unstructured surface. Contrasts, such as object edges, where the brightness of the background differs from that of the objects, are therefore a prerequisite for determining OF and thus the spatial information based on it.
Reduction of optic flow complexity by active vision strategies to facilitate spatial vision
As already mentioned, the spatial information contained in the OF pattern is interwoven in an intricate way with the actual movement information. Theoretically, under certain conditions, the translational OF component and, in the next step, the spatial information can be extracted computationally from this complex flow field (Longuet-Higgins and Prazdny 1980; Strübbe et al. 2015). However, the necessary computational processes are complex and would represent a great challenge for the small brains of insects. One way to reduce the computational effort of the nervous system for extracting behaviourally relevant spatial information from the OF is to actively shape the rotational and translational components of self-motion. Three such active vision strategies relative to environmental objects are particularly relevant in the context of spatial behaviour of insects; they are contrasted in Fig. 1 with a pure rotation of the animal around its vertical axis, which leads to OF without spatial information.
1. Looming OF induced by translational locomotion towards an object: The retinal image of an object that is approached enlarges more and more with a characteristic retinal speed that depends on the speed of self-motion and the distance to the object. Such looming stimuli are thus accompanied by a characteristic OF pattern that scales with the locomotion speed and contains distance information, which can be used for behavioural control in various contexts.
2. Motion parallax induced by pure translational locomotion: During pure translational locomotion, the projected image of an object on the retina shifts at a larger speed the closer it is to the animal. Hence, the distance to an object is defined at a given translation velocity of the animal by the object's retinal velocity. For geometrical reasons, the OF is zero directly in the direction of translation ('focus of expansion') and increases in an equidistant environment within the visual field towards the direction orthogonal to the direction of translation, where it is largest. Hence, an animal should move sideways if distance information relative to an object needs to be gathered in the frontal visual field. Indeed, prominent sideways flight components can be observed in many behavioural situations when the animal appears to scrutinize the spatial layout of its environment.
3. Pivoting parallax induced by a specific combination of translational and rotational locomotion: Distance information relative to a behaviourally relevant goal location in the nearby environment can be generated by a specific combination of translational and rotational self-motion that leads to a rotation of the animal about this goal location, i.e., the pivoting point, rather than about the animal's centre. For this peculiar form of locomotion, the OF scales for a given velocity of the animal with increasing distance to the pivoting point.
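The geometry of pivoting parallax can be sketched in a short simulation: an animal orbiting a pivoting point while turning its body at the same rate keeps the pivot's retinal position fixed, whereas the images of objects away from the pivot drift. All coordinates, rates and object positions below are illustrative choices of ours.

```python
import math

def bearing(p, heading, target):
    """Angle of target as seen from position p, relative to body heading (rad)."""
    ang = math.atan2(target[1] - p[1], target[0] - p[0])
    return (ang - heading + math.pi) % (2 * math.pi) - math.pi

goal = (0.0, 0.0)    # the pivoting point
other = (3.0, 0.0)   # another object, farther from the pivot
r, omega = 1.0, 0.5  # orbit radius (m) and turn rate (rad/s)

bearings_goal, bearings_other = [], []
for t in (0.0, 0.5, 1.0):
    # animal orbits the pivot while rotating its body at the same rate
    p = (r * math.cos(omega * t), r * math.sin(omega * t))
    heading = omega * t + math.pi / 2
    bearings_goal.append(bearing(p, heading, goal))
    bearings_other.append(bearing(p, heading, other))

print(max(bearings_goal) - min(bearings_goal) < 1e-9)   # True: pivot image stationary
print(max(bearings_other) - min(bearings_other) > 0.1)  # True: other image drifts
```

The stationary pivot thus acts as a reference against which the retinal drift of surrounding objects, and hence their distance from the goal location, can be read off.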
Animals can use such active vision strategies in specific behavioural situations and environmental contexts (see below) to capture behaviourally relevant spatial information in as computationally parsimonious a way as possible. Apart from such specific situations, insects, at least those where this has been systematically studied, i.e., mainly flies and Hymenoptera, employ an active flight and gaze strategy even if their behaviour is not explicitly directed by objects. This saccadic flight and gaze strategy is characterized by a sequence of often extremely fast rotations, the saccades, and by intervals in which the animals primarily translate with a largely constant gaze direction, leading to an OF on the eyes in which the translational component dominates (Fig. 2A) (Land 1999; van Hateren and Schilstra 1999; Dickinson 2005; Zeil et al. 2008; Egelhaaf et al. 2012; Boeddeker et al. 2015; Kress et al. 2015; Muijres et al. 2015; Doussot et al. 2020). The relative duration of the intersaccades, i.e., the time during which the direction of gaze changes only slightly, accounts for more than 80% of the total flight time. Intersaccadic flight can be further subdivided into prototypical movements, each characterized by the specific relative contribution of forward and sideways flight components (Fig. 2B) (Braun et al. 2012). The saccadic flight and gaze strategy facilitates the evaluation of spatial information from the OF by largely separating the translational and rotational OF components behaviourally. This saccadic flight style prevails during most of the time that the insects are flying around, even if no immediate reference object for their behaviour is apparent. This makes sense because an animal should have constant access to spatial information about its immediate surroundings.
Only then can the animal detect objects of whatever spatial dimension as existing and react appropriately to them—potentially using one of the dedicated active vision strategies mentioned above. In other words, the general saccadic flight and gaze strategy helps to segment a cluttered spatial scenery without much computational effort into nearby and distant objects and on this basis allows the animal to act towards those objects that are of special behavioural relevance by specific dedicated flight manoeuvres.
Spatial behaviour based on optic flow information in insects
So far, it has been outlined how the OF can be simplified by specifically orchestrated locomotion in such a way that extraction of behaviourally relevant spatial information by the brain is possible with computationally parsimonious mechanisms; computational parsimony with simultaneous efficiency can be expected in particular from insects with their small brains. In the following, it is outlined how insects solve certain spatial behavioural tasks and what role OF might play as a source of spatial information in each case. The outlined behavioural tasks take place on a broad spectrum of spatial scales, which implies a broad spectrum of mechanisms for the acquisition of spatial information and behavioural control. The choice of behavioural tasks summarised covers only a small spectrum of the tasks insects face and is limited to those that have each been studied to date at least to some extent.
Peering movements for behaviour in the near range
Even though stereoscopic vision is used at least by praying mantises in the near range to strike prey, i.e., in a situation where they should be as motionless as possible to not scare away the potential prey (Rossel 1983; Nityananda et al. 2016; Read 2021), there are situations where sitting insects use distance information based on OF to control their behaviour. For instance, sitting locusts generate translational sideways movements with their bodies (‘peering’), which lead to motion parallax on their eyes, especially in the frontal visual field. The resulting distance information is then used to jump accurately to a twig or another target object (Collett 1978; Kral and Poteser 1997; Kral 2012). In experiments in which the parallax velocity of the object and thus its apparent distance were systematically manipulated by the experimenter during the animal's peering movements, it could be shown that the locust’s jump velocity is controlled by OF-based distance information (Sobel 1990). If the animals are sitting, peering movements of the body and head are the only way to gain distance information through movement parallax. Peering movements also play a role during flight in a variety of behavioural contexts, such as when the insect pinpoints the distance to a food source location (Boeddeker and Hemmi 2010), assesses its flight altitude (Baird et al. 2021) or the width of a gap to be flown through (see below).
Regulating flight speed depending on the spatial layout of the environment
Flying insects regulate their overall translation velocity depending on the spatial layout of the environment. For instance, insects decelerate if the width of a flight corridor becomes narrower or if the flight path is partially obstructed by objects (Srinivasan et al. 1996; Baird and Dacke 2012; Kern et al. 2012; Serres and Ruffier 2017). Flight speed is thought to be controlled by OF generated during translational flight. Flies, bees, and moths were concluded to keep the OF on their eyes at a “preset” level by adjusting their flight speed. Accordingly, they decelerate when the translational OF increases, for instance, while passing a narrow segment of a flight corridor (Srinivasan et al. 1991; Baird et al. 2005; Portelli et al. 2011; Kern et al. 2012; Linander et al. 2015, 2016; Stöckl et al. 2019; Grittner et al. 2021). Not all parts of the visual field contribute equally to the input of the flight velocity controller: For flies the most prominent role can be attributed to the intersaccadic OF generated in eye regions looking in front of the insect (Fig. 3). In these regions of the visual field, the intersaccadic retinal velocities are kept in a range where the responses of the motion vision system still increase monotonically with increasing velocity and decrease with decreasing velocity (see below; Egelhaaf et al. 2012; Kern et al. 2012). Bumblebees have been concluded to rely on the OF and thus on relative nearness information in the lateral visual field when negotiating narrow corridors, but on ventral OF and, thus, relative nearness information to the ground in wider terrain; the optic flow in the frontal field of view plays a particularly important role when it comes to detecting and responding to changes in the proximity of the environment well before these changes in the spatial layout are encountered (Baird et al. 2010; Linander et al. 2015, 2016).
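The "preset OF level" hypothesis has a simple consequence that can be sketched numerically: if lateral OF in a corridor is roughly v/w (speed over half-width), a regulator holding it at a setpoint settles on a speed proportional to corridor width. The setpoint, gain and widths below are illustrative values of ours, not parameters measured in any of the cited studies.

```python
def speed_for_of_setpoint(half_width, of_setpoint=2.0):
    """Steady-state forward speed at which the lateral OF (v / w) equals
    the setpoint; of_setpoint in rad/s, half_width in m (illustrative)."""
    return of_setpoint * half_width

def control_step(v, half_width, of_setpoint=2.0, gain=0.05):
    """One step of a minimal OF regulator: decelerate when the lateral OF
    exceeds the setpoint, accelerate when it falls below it."""
    return v + gain * (of_setpoint - v / half_width)

# flying into a narrow tunnel section: the regulator settles on a lower speed
v = 1.0
for _ in range(200):
    v = control_step(v, half_width=0.25)
print(round(v, 3))  # 0.5: half the corridor width, half the speed
```

No explicit distance estimate is needed; the deceleration in narrow sections emerges automatically from holding the OF constant.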
Although it is clear overall that flight speed is reduced in confined and cluttered terrain, there are still unanswered questions and sometimes conflicting findings about the underlying control mechanisms. It is generally accepted that the OF is balanced between the two eyes for a wide range of environmental conditions, as is reflected in experimental studies in flight tunnels on a variety of insect species meandering along the tunnel's midline ("centering response"). However, if the textural and/or spatial properties of the environment in front of the two eyes differ in certain ways, deviations from centering can be observed (Srinivasan et al. 1991; Serres et al. 2008; Dyhr and Higgins 2010; Baird and Dacke 2012; Kern et al. 2012; Linander et al. 2015, 2016; Chakravarthi et al. 2017; Serres and Ruffier 2017; Lecoeur et al. 2019; Baird 2020). Two findings in particular show that balancing the OF in front of the two eyes is not always essential to allow insects to pursue a roughly straight flight course: (1) Functionally blinding one of the eyes hardly affects the ability of flies to fly straight, though monocular flies tend to fly slower and to tilt their body axis slightly to the side of the seeing eye. Detailed analyses of the OF patterns on the non-occluded eye show that this surprising performance of monocular flies can be explained by a kind of monocular optomotor equilibrium, i.e., the superimposed translational and rotational OF components on the seeing eye sum to zero (Kern and Egelhaaf 2000). (2) Large OF asymmetries on the two eyes may, however, also occur under more natural conditions, for instance, when an insect flies along a textured wall with an open field on the other side of a flight tunnel after the bee has been trained to fixate and fly towards a feeder close to the wall at the other end of the tunnel. This situation, which has been experimentally analyzed for bees, cannot be accounted for by balancing OF across the two eyes.
Instead, a lateral OF regulator has been proposed, i.e., a feedback system that is aimed at keeping the unilateral OF constant (Serres et al. 2006, 2008).
Landing
The ability to land is essential for all flying animals and thus represents a central spatial vision requirement. As the animals are in motion when initiating landing manoeuvres, OF is predestined to be used as a source of spatial information. Successful landings require continuous precise adjustment of flight speed, body orientation and leg posture to the respective distance from the landing site to ensure a smooth landing. Usually, the animals decelerate before landing to ensure a smooth touchdown (see below). However, some hymenopteran species were found to do the opposite and accelerate just before landing; not much is known about the potential role of OF in controlling this peculiar behaviour (Shackleton et al. 2019; Tichit et al. 2020). Two fundamentally different landing situations can be distinguished, namely landing on relatively flat surfaces and landing on small objects. In both spatial situations, OF is considered the decisive source of information for the control of the landing behaviour.
When landing on a flat horizontal surface, honeybees were concluded to pursue a computationally simple OF-based strategy by keeping the retinal velocity of the ground constant during the approach, thus automatically ensuring that flight speed is close to zero at touchdown (Srinivasan et al. 2001). If the surface they approach is vertical, the bees also gradually and automatically reduce their flight speed by keeping the speed of the image expansion constant (Baird et al. 2013). Whereas honeybees continuously decelerate when landing on a vertical surface, bumblebees were concluded to employ a somewhat different strategy by exhibiting a series of deceleration bouts. During each bout, the bumblebee keeps the relative rate of expansion constant; from one bout to the next, the bumblebee tends to shift to a higher constant relative rate of expansion. This landing strategy is interpreted to be faster than that described for honeybees (Goyal et al. 2021).
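The elegance of the constant-expansion strategy is that holding the relative rate of expansion v/d at a fixed value makes speed proportional to the remaining distance, so the approach decelerates smoothly towards touchdown without any explicit distance measurement. The following Euler-step sketch uses illustrative parameter values of ours:

```python
# Hold the relative rate of expansion v/d at a constant value k:
# the commanded speed is then always proportional to the remaining
# distance, so it tapers off smoothly towards touchdown.
k, dt = 1.0, 0.01      # expansion setpoint (1/s) and time step (s); illustrative
d = 2.0                # starting distance to the surface (m)
speeds = []
for _ in range(500):   # 5 s of simulated approach
    v = k * d          # the control law: hold v / d = k
    speeds.append(v)
    d -= v * dt
print(all(a > b for a, b in zip(speeds, speeds[1:])))  # True: continuous deceleration
print(speeds[-1] < 0.05 * speeds[0])                   # True: nearly stopped at the end
```

The resulting approach is exponential, which matches the intuition that flight speed is close to zero at touchdown without the distance to the surface ever being computed explicitly.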
Under natural conditions, insects often land on objects that differ from their background in some way. They can land even on objects that are indistinguishable from the background in texture and colour and can only be detected based on relative motion cues (Lehrer et al. 1988; Srinivasan et al. 1989; Kimmerle et al. 1996; Kern et al. 1997). Regardless of the characteristics that distinguish a landing object, the OF induced during the landing approach with its strong looming component plays the decisive role in controlling the pre-landing deceleration and the timely extension of the legs before touchdown; this has been shown in both free flight and tethered flight experiments (Wagner 1982; Borst 1990; Kern et al. 1997).
Collision avoidance
An existential challenge to spatial vision, especially for fast-moving animals, is avoiding collisions with objects that are in the way. OF is the most relevant sensory source of information for solving this task. It has been shown that the OF during flights of bumblebees in a tunnel with numerous obstacles represents the proximity to these obstacles; this is a consequence of the saccadic flight and gaze strategy and the associated relative stability of the gaze direction during the intersaccadic intervals (Fig. 4) (Ravi et al. 2022). Obstacle avoidance behaviour was concluded to be essentially robust over the entire brightness range under which bumblebees are naturally active (Baird 2020). How insects deal with the OF characteristics under different collision avoidance conditions has been analyzed in detail during both tethered and free flight. The evasive reactions in free flight are based on the extremely fast saccadic rotations characteristic of many insects (see above). The evasive reactions observed in tethered flight are much slower and only reflect to some extent what happens under free flight conditions (Tammero and Dickinson 2002; Bender and Dickinson 2006). Independent of these dynamic aspects, the studies agree that asymmetries in the OF pattern on both eyes, which is largely translational during intersaccadic intervals, along with a strong looming component, are crucial for controlling the direction and amplitude of saccades leading to collision avoidance. However, it is not yet clear which of the various parameters that characterise the asymmetries in binocular OF are most important. The asymmetry of the flow pattern may be due to the location of the expansion focus in front of one eye or to a difference between the total OF in the visual fields of the left and right eye (Bertrand et al. 2015; Serres and Ruffier 2017; Thoma et al. 2020).
In this context, bumblebees were shown to extract the maximum rate of image motion in the frontal visual field and in this way steer away from obstacles (Lecoeur et al. 2019). Moreover, the bees' maximum evasive acceleration depends linearly on the relative retinal expansion velocity of the obstacle, i.e., the ratio of the retinal image expansion of the obstacle to its retinal size (Ravi et al. 2022). Not all parts of the visual field appear to be involved in controlling the saccadic turns that lead to collision avoidance manoeuvres. In blowflies, for example, OF in the lateral visual field plays no role in determining the direction of avoidance saccades. This property is probably related to the flight style of blowflies. During intersaccadic intervals, they fly predominantly forward, with some lateral components immediately after the saccades that shift the expansion pole of the OF slightly towards the frontolateral eye regions (Kern et al. 2012). In contrast, Drosophila, hoverflies, but also bees can hover and fly sideways to a certain degree. Here, lateral and even posterior parts of the eye can get involved in triggering evasive saccades as part of a collision avoidance flight strategy (Braun et al. 2012; Geurten et al. 2012; Muijres et al. 2014).
Negotiating gaps in cluttered environments
In cluttered natural environments, insects not only have to avoid obstacles. Often, they also must fly through small gaps without damaging their wings. To do this, they must assess whether a gap itself and the clearance behind the gap are large enough to fly through it unharmed. The assessment of the passability of gaps is an elementary task of spatial vision and aerial flight control. Whereas a brightness-based strategy for gap detection and negotiation was concluded to be a fast, computationally simple and efficient mechanism for orchid bees living in tropical rainforests (Baird and Dacke 2016), bumblebees have been shown to employ a specific active vision strategy to generate the OF required to assess the passability of a gap and to ensure injury-free passage. When bumblebees approach a gap, they already slow down at some distance if the clearance behind the gap or the width of the gap appear to fall below a critical size (Fig. 5A, D). A rough estimation of these critical measures thus seems to be possible already in the normal cruising flight mode through the translational OF during the intersaccades. While the bumblebees reduce their forward speed, they increase their sideways speed and perform mainly sideways scanning movements in front of the gap (Fig. 5B) providing OF-based spatial information essentially in the frontal visual field. As the clearance behind the gap or the width of the gap decreases, bumblebees spend more and more time on these scanning movements (Fig. 5C). Based on the distinct translational OF in the frontal visual field, bumblebees appear to evaluate the geometry of the gap and thus its passability (Ravi et al. 2019). Through this active flight strategy, bumblebees are able to assess whether a gap is wide enough to fly through head-first in the normal direction of flight, or whether they need to pass at an angle or even sideways (Fig. 5E, F). 
The bees thus seem to "know" their own body size along the different axes, because they behave in this peculiar way even when they have had no previous experience with the respective environment. Recently, it has been shown that this elaborate gap-crossing behaviour scales with the body size of the animals; body size and wingspan can vary by up to a factor of two even within a given colony: small bumblebees show the characteristic scanning behaviour and typical sideways passages only at smaller gaps than do larger animals (Ravi et al. 2020). Hence, flying bumblebees perceive the affordances of their surroundings relative to their individual body size to navigate safely through complex environments.
Optic flow in local navigation based on landmark information
All flying insects should be able to avoid collisions with obstacles and to negotiate narrow gaps in three-dimensional environments, but also to interrupt a flight by landing on an object. In contrast, navigation between the nest and food sources is a special ability of Hymenopterans such as bees, some wasps and ants, which take care of their brood and therefore need to return to their nest after foraging (Zeil et al. 2009; Zeil 2012; Collett et al. 2013; Zeil 2022). Basic components of navigation, in particular place memory, i.e., learning a location in space using visual cues in its vicinity, could also be identified in walking Drosophila (Ofstad et al. 2011). Visual landmarks represent important spatial cues and are used to locate an often barely visible goal, such as the nest hole in solitary bees or bumblebees. Information about the landmark constellation around the goal location is gained in characteristic learning flights during which the animal frequently orients towards the goal location or objects in its vicinity (Lehrer 1991; Zeil 1993a, b; Hempel de Ibarra et al. 2009; Philippides et al. 2013; Riabinina et al. 2014; Lobecke et al. 2018). During these learning flights, the animals are thought to gather relevant information about the landmark constellation, which can then be used to localise the goal when returning to it after an excursion. Although a variety of visual features such as contrast, texture and colour are relevant for defining landmarks and are used to locate the goal (Collett and Collett 2002; Zeil et al. 2009; Zeil 2022), there is evidence that the spatial arrangement of the landmark constellation can also play an important role and that landmarks defined solely by motion cues can be used for goal location (Lehrer and Collett 1994; Dittmar et al. 2010, 2011). 
Although the mechanisms by which landmark constellations are learned and what information is eventually used to locate the goal are not yet fully understood, the OF information actively generated during typical learning and search flights of bees is essential for the acquisition of spatial memory of the goal environment (Zeil et al. 2009; Zeil 2012). Moreover, in the proximity of landmarks, animals adjust their flight movements depending on the landmarks’ specific textural properties. Landmarks near the goal are best suited for its localisation because the retinal positions of near landmarks are shifted more during translational movements than those of more distant ones. Hence, special weight is implicitly given to nearby landmarks when navigating close to the goal—just because of the geometric properties of the OF during intersaccadic translational locomotion. The processing of behaviourally relevant visual information is thus facilitated by the characteristic active gaze strategy in free flight (Egelhaaf et al. 2012).
Estimation of flight distances: path integration in flying insects
OF information is used at least by flying Hymenoptera (honeybees, bumblebees, solitary wasps), but also to a certain extent by ants, for spatial tasks related to navigation over long distances from their nest site. They monitor travel distances and directions when foraging on often curvy and complex routes in an unknown environment to be able to return to their nest later. While the direction of locomotion is determined by employing compass information provided by the sun and the associated polarisation pattern in the sky and/or by other directional cues (Homberg 2004; Pfeiffer and Homberg 2007; Wolf 2011; Seelig and Jayaraman 2015; Srinivasan 2015; Heinze et al. 2018; Wehner 2020), OF experienced during flight to a food source or some other behaviourally relevant location, such as the nest, plays a crucial role in determining the distance travelled; OF in both the lateral and ventral visual field, depending on the spatial layout of the surroundings, is used for estimating the distance travelled (Chittka et al. 1995; Esch and Burns 1996; Srinivasan et al. 1996, 1997, 2000; Esch et al. 2001; Hrncir et al. 2003; Tautz et al. 2004; Srinivasan 2014). The measured directions and the corresponding distance estimates are combined to determine the vector for the direct route back to the goal location ('path integration'). However, the OF generated by translatory locomotion depends on both the flight speed and the spatial layout of the environment flown through, i.e., in particular the distance to objects in the surroundings and—in the ventral visual field—to the ground (see above). This means that even for a given flight velocity the OF and, thus, the distance estimates based on it are highly ambiguous. Estimating distances travelled becomes even more demanding, as both the lateral and ventral optic flow are used not only to assess the distances travelled, but also to regulate flight velocity and/or flight height (Baird et al. 2005, 2006; Portelli et al.
2010, 2017; Linander et al. 2015). Behavioural experiments with honeybees in both experimenter-designed and natural environments revealed that these ambiguities are reflected both in the characteristics of the waggle dances, with which honeybee foragers communicate the estimated direction and distance of prolific food sources to their hive mates, and in the locations where recruited bees search for food after the environment has been manipulated in a dedicated way (Fig. 6) (Srinivasan et al. 2000; Esch et al. 2001; Tautz et al. 2004). These environment-dependent ambiguities of OF-based distance estimates might be a serious challenge to finding a goal location using path integration unless certain conditions are met: (1) The flight altitude, independent of how it is controlled (see above), should be quite similar on the outbound and return flight. (2) The environment in which the outbound flight takes place and during which the direction and distance from the starting point are determined by path integration should not differ statistically in terms of its spatial layout (e.g., density and distance of environmental objects) from the environment in which the animal flies back as straight as possible according to the path integration vector. (3) The perceived OF depends at any time on the current distance to and texture of the objects in both the lateral and ventral visual field, even after some spatial pooling. Thus, the OF-based distance measurements fluctuate in time, especially in cluttered terrain. To smooth out these fluctuations and to obtain a reasonably accurate estimate of the distances travelled, integration of the OF over a sufficiently long time period is necessary (Meyer et al. 2011; Schwegmann et al. 2014b).
This means that path integration relying on OF-based measurements of travel distances is reasonably accurate only on a relatively large spatial scale, and this spatial scale is expected to depend on the statistical properties of the spatial layout of the environment. This might not be too much of a problem for flying hymenopterans, if they do not rely on path integration alone, but can also use previously learned landmarks in the environment to find their goal again.
Optic flow computation, representation of spatial information in the insect brain and behavioural control
The retinal OF patterns, defined as shifts in the geometric projections of objects in the environment on the retina induced by the animal's own movements (Koenderink 1986; Strübbe et al. 2015), are not directly available to the visual system. Rather, the input to the visual system is provided by the time-dependent brightness changes that self-motion evokes in the array of photoreceptors; these are transformed into electrical photoreceptor signals for further processing in the subsequent neuropil layers of the optic lobes, i.e., the lamina, the medulla, and the lobula complex, which in flies is subdivided into the lobula and the lobula plate (Fig. 7). OF information is initially determined locally by a large number of motion detectors that are arranged retinotopically; they jointly subserve the entire visual field. This local motion information and the spatial information it contains may then be combined with other visual information and further processed in partly parallel pathways involved in mediating the different components of spatial behaviour. These pathways are supplied by visual projection neurons that carry information from the optic lobes to distinct regions of the central brain and eventually via descending neurons to the motor control centres in the thoracic ganglia (Fig. 7).
Local optic flow processing in early visual areas
The neural circuits of motion detection receive their input from a retinotopic array of discrete sampling points of the ommatidial lattice. The relatively low spatial resolution of insect eyes does not seem to be a disadvantage, as many insects are quite capable of performing highly aerobatic flight manoeuvres and solving challenging spatial tasks based on OF information (see above). The computational mechanism that has been proposed as the basis for local visual motion processing in flying insects is the correlation-type motion detector (Reichardt 1961; Borst and Egelhaaf 1989, 1993; Egelhaaf and Borst 1993). In its simplest form, a local motion detector consists of two mirror-symmetric subunits. In each subunit, the spatially and temporally filtered brightness signals from neighbouring points in visual space are combined by a multiplicative interaction after one of them has been delayed. The final response of the detector results from the subtraction of the output signals of two such subunits with opposite preferred directions, greatly improving the directional selectivity of the motion detection circuit. Each motion detector responds with a positive signal to motion in a particular direction, i.e., either horizontally or vertically, and with a negative signal to motion in the opposite direction. Different variants of this basic motion detection scheme have been proposed to explain the responses of insect motion-sensitive neurons under a variety of stimulus conditions, including even natural OF as experienced under free-flight conditions (Borst and Egelhaaf 1989; Franceschini et al. 1989; Baird et al. 2013; Egelhaaf et al. 2014; Mauss et al. 2017; Borst 2018; Yang and Clandinin 2018; Chen et al. 2019; Zavatone-Veth et al. 2020; Kohn et al. 2021). 
By combining the sophisticated toolkit of genetic and molecular approaches in Drosophila with electrophysiological and imaging techniques, great progress has been made in recent years in identifying the different cellular elements of the neural circuits underlying local motion detection. From a functional point of view, a particularly relevant result is the separation of the motion detector input circuits already at the level of the lamina into ON and OFF channels that process brightness increases and decreases separately (Fig. 7). This separation is maintained in the intricate neural circuitry of motion detection; the ON and OFF motion signals and the spatial information they carry are then fed retinotopically by two types of neurons (T4, T5) into the downstream processing of the OF. Because excellent reviews on the cellular implementation of the local mechanisms of motion detection already exist, this important aspect of OF computation will not be elaborated on here (Mauss et al. 2017; Strother et al. 2017; Yang and Clandinin 2018; Borst et al. 2020).
From a functional perspective, it is important to note that the local motion detectors do not provide a veridical representation of the OF—with several consequences for the OF-based spatial information: (1) Velocity dependence: Local motion detectors do not function like odometers, even if the temporal and/or spatial average of their responses is taken into account. Their average response amplitude increases with increasing speed, reaches a maximum and then decreases again; biological motion detectors therefore do not uniquely encode motion speed. The response characteristics of biological motion detection systems are even more complex, as their velocity maximum depends on the textural properties of a moving stimulus pattern or, transferred to the natural world, on the texture and shape of the objects the animal flies past (Borst and Egelhaaf 1993; Egelhaaf and Borst 1993; Egelhaaf et al. 2014) (Fig. 8A). The pattern dependence of velocity tuning is less noticeable when the pattern consists of a broad spectrum of spatial frequencies (Dror et al. 2001), as is characteristic of natural scenes (Schwegmann et al. 2014b). Despite these ambiguities, free-flying flies and bees, the insects in which this important aspect has been particularly thoroughly studied, appear to regulate their translational velocities in such a way that retinal velocities remain within the part of the working range of the motion detection system where its response increases monotonically with velocity. (2) Time course of local motion responses: The responses of local motion detectors do not unambiguously reflect the local pattern speed, since they depend to a large extent on the local texture properties and the spatial layout of the visual environment. Since the response modulations of neighbouring motion detectors are out of phase with each other, spatial pooling of many such detectors reduces the pattern-dependent response modulations. 
There is thus a kind of trade-off between the spatial resolution with which the motion information is perceived and the quality of the represented time course of local pattern velocity (Fig. 8B) (Egelhaaf et al. 1989; Meyer et al. 2011; O'Carroll et al. 2011). (3) Motion adaptation: Motion-sensitive neurons adapt their response strength to the given stimulus conditions. This is inevitably at the expense of veridical velocity encoding, but improves sensitivity to velocity changes, such as those that may occur when flying past nearby objects. This property has been interpreted as an adaptation to facilitate the detection of spatial discontinuities (Liang et al. 2008, 2012; Kurtz 2012; Li et al. 2017). Motion adaptation occurs at multiple processing levels of the visual motion pathway, both at the level of local motion-sensitive elements and at the level of downstream spatially integrating cells (see below). Since motion adaptation at the level of local motion detection is direction-independent, the representation of the local direction of motion is independent of the overall adaptation state (Li et al. 2021).
These characteristics of the local motion detectors have direct consequences for the spatial information that is represented at this processing stage and made available for further downstream processing. Since the retinal speed of an object scales inversely with its distance during translational locomotion of the animal, a nearby object leads to stronger local motion detector responses than more distant objects. As a consequence, a visual scene is segregated into near and far objects without much computational effort, as demonstrated by computer modelling based on physiological data (Egelhaaf et al. 2014; Schwegmann et al. 2014a). Since motion responses are elicited only by textured surfaces or object boundaries, the array of local motion detectors represents, with increasing response strength, the proximity of object contours rather than the nearness of the objects themselves (Fig. 9). This representation of spatial information is further enhanced by the motion adaptation processes operating at the level of local motion detectors (Li et al. 2017).
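How an array of local motion measurements turns into a contrast-weighted nearness map during pure translation can be made concrete with a toy calculation (hypothetical numbers; retinal speed is idealised as v·sin(azimuth)/distance, and detector output is crudely approximated as retinal speed times local contrast):

```python
import numpy as np

def contrast_weighted_nearness(distances, contrasts, azimuths_deg, v=1.0):
    """Toy retinotopic map: during translation at speed v, retinal speed
    scales with nearness (1/distance), and motion responses arise only
    where there is contrast, so the map encodes proximity of *contours*,
    not of the objects per se."""
    az = np.deg2rad(np.asarray(azimuths_deg, float))
    retinal_speed = v * np.sin(az) / np.asarray(distances, float)
    return retinal_speed * np.asarray(contrasts, float)
```

In this sketch a near textured contour yields a larger map value than a distant one, a contrast-free object leaves no trace at all, and doubling the translation speed doubles every map value, which is exactly the speed–distance ambiguity discussed in the text.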
Optic flow processing for behavioural control in higher order brain regions
The local retinotopically organised motion signals computed in the early visual motion areas provide the input information of the downstream computational processes that are involved in mediating the different types of OF-based spatial behaviour. OF-based behaviour relies on global, not just local, characteristics of the OF fields that are actively induced on the eyes during locomotion. This means that the mechanisms mediating the different components of spatial behaviour must combine local motion measurements from different parts of the visual field and, if necessary, bring them together with information from other visual channels depending on the requirements of the spatial task. This is done in parallel pathways depending on the respective behavioural tasks and the spatial information required. Three such pathways, all involved in solving OF-based spatial vision tasks, will be briefly outlined here (Fig. 7).
Global optic flow sensing and its dependence on the spatial structure of the environment
Much is known about the integration of local motion information over large parts of the visual field to estimate the different components of the animal's self-motion. This is relevant for course stabilisation in space, but also in other behavioural contexts, such as the regulation of flight speed depending on the spatial structure of cluttered environments (see above). The underlying mechanisms of OF processing have been studied in detail in flies. Pooling of local motion information takes place in the lobula plate on the extended dendrites of wide-field neurons, the lobula plate tangential cells (LPTCs). The different LPTCs respond in a characteristic manner to the direction of the animal's self-motion depending on their respective input organisation. The specificity for the different types of OF patterns is further enhanced by synaptic interactions between LPTCs within the same and/or both brain hemispheres (Hausen 1984; Krapp 2000, 2014; Egelhaaf and Kern 2002; Egelhaaf 2006; Borst et al. 2010; Hennig et al. 2011; Egelhaaf et al. 2012; Liang et al. 2012; Borst 2014; Hardcastle and Krapp 2016; Mauss and Borst 2019). Apart from providing information about self-motion of the animal, during translatory self-motion these neurons also represent OF-based spatial information, averaged within the confines of their receptive fields (Kern et al. 2005; Liang et al. 2012; Ullrich et al. 2015). The neuronal representation of spatial information is further enhanced by motion adaptation (see above; Liang et al. 2008; Li et al. 2017). The retinal velocity range, and thus the spatial range that can be encoded, is constrained by the inevitable neuronal noise resulting from the biophysical mechanisms of signal transduction in photoreceptors and cellular information processing (Laughlin 1994; Grewe et al. 2003, 2007; Warzecha et al. 2013). 
Under spatially constrained conditions, where flies fly at translation velocities of only slightly more than 0.5 m/s, the spatial range within which significant distance-dependent information is represented by LPTCs during intersaccades is about 2 m (Kern et al. 2005). Since retinal speed decreases with increasing distance to an object and increases with increasing translational velocity, a given retinal speed is achieved by different combinations of object distance and translational velocity. The behaviourally relevant spatial range thus scales with the intersaccadic translational velocity. From an ecological perspective, this scaling of the behaviourally relevant depth range is functional: a fast-flying animal should, for example, initiate a turn leading to an evasive manoeuvre or a deceleration when approaching an object to land at an earlier time and at a larger distance to the object than a slow-flying animal (Egelhaaf et al. 2012). This expectation matches, at least qualitatively, the characteristics of insect landing behaviour (see above). Furthermore, motion-sensitive wide-field neurons show a characteristic time-dependent response profile that reflects the spatial landmark constellation surrounding a behaviourally relevant goal of bees or bumblebees, such as the nest hole or a food source (Egelhaaf et al. 2014; Mertes et al. 2014). The information about rotational and translational self-motion is conveyed by these wide-field neurons to the posterior slope region of the central brain and from here via descending neurons to the motor control centres in the thoracic ganglia, but also to muscles that are responsible for the animal's head movements and thus head–body coordination (Fig. 7). 
The information conveyed by these descending neurons is assumed to play a role in stabilising an intended course of the animal against involuntary disturbances, in coordinating head and body movements, e.g., keeping the head horizontally aligned while the body needs to make roll movements during sharp curves (‘banked turns’) and during sidewards movements (Haag et al. 2007; Huston and Krapp 2008; Wertz et al. 2008, 2009, 2012; Suver et al. 2016; Namiki et al. 2018). During intersaccadic translatory self-motion, these pathways may also provide the global nearness information required for velocity control or balancing the flight trajectories in densely cluttered environments (see above).
Processing of looming OF for escape, collision avoidance and landing
When an animal approaches an object, such as a landing site or an obstacle, the OF pattern on the eyes expands (‘looming’). Depending on the behavioural state of the animal, the situational context, and the dynamics of the looming stimulus, different behavioural responses may be appropriate, i.e., either an escape response, an avoidance response, or a landing response. Expanding OF fields can, in principle, be encoded by integrating the output signals of several local motion-sensitive elements with suitable preferred directions in specific areas of the visual field. Evidence for a functional role of such local motion-sensitive elements in encoding looming stimuli comes from experiments in which the extension of the legs, as is characteristic of landing reactions, and the evasive behaviour of tethered-flying Drosophila in response to looming stimuli could be eliminated by optogenetically switching off such local motion-sensitive elements (Schilling and Borst 2015). In flies, a population of visual projection neurons in the lobula, the lobula columnar (LC) cells (Fig. 7), plays a key role in mediating these and other behavioural responses, as shown by optogenetic activation experiments (Wu et al. 2016). The LC neurons can be classified into distinct anatomical types with response specificities for different visual features. Each LC type comprises several retinotopically arranged neurons with similar morphology, whose individual dendritic branches extend over part of the lobula column arrangement and, in some of these neurons, also over part of the lobula plate (lobula plate—lobula columnar (LPLC) cells; Fig. 7). A hallmark of most of these neurons is the convergence of their axons onto cell-type-specific target regions in the central brain, often referred to as ‘optic glomeruli’ (Panser et al. 2016; Wu et al. 2016; Timaeus et al. 2020; Klapoetke et al. 2022). 
Looming stimuli are detected by specific types of LC as well as LPLC cells that, as projection neurons, distribute looming information from the optic lobe to several of the nearly 20 optic glomeruli in the posterior lateral protocerebrum (PLP) and posterior ventrolateral protocerebrum (PVLP) (Fig. 7). A pathway transferring visual looming information from these optic glomeruli via a giant descending neuron to the motor control centres in the thoracic ganglia elicits looming-induced escape responses (Fig. 7) (von Reyn et al. 2014; Klapoetke et al. 2017, 2022; Ache et al. 2019a, 2019b). Further descending neurons that mediate looming stimulus-induced landing receive their visual input from other types of visual projection neurons that combine input from the lobula and lobula plate. These descending neurons are gated by the behavioural state of the animal, i.e., their gain is severely attenuated if the animal does not fly and, thus, the initiation of landing would be inappropriate (Ache et al. 2019a). Gating of the looming responses of these descending neurons is thus a mechanism that ensures a meaningful context dependence of the control of spatial behaviour.
Translational OF processing for path integration
The spatial behaviours and their respective neural basis considered so far are involved in solving tasks on a time scale of a few milliseconds to a few tens or hundreds of milliseconds at most. Path integration in the context of navigation behaviour takes place on a much longer time scale. Here, the translational OF components and the corresponding directions of locomotion during excursions to a target, e.g., a food source (see above), have to be integrated over times of several seconds up to the minute range. In recent years, essential aspects of OF-based estimation of travel distances have been elucidated, which could play a key role in path integration. To provide a neural representation of travel distance based on OF, the activity of motion-sensitive cells in appropriate areas of the visual field must be integrated over relatively long periods of time, taking into account the respective direction in which the animal was moving. Thus, path integration implies that direction and distance information are continually combined in an appropriate way. One brain region, the central complex, has emerged as the likely site of path integration in the insect brain, where the necessary distance and directional information is computed and brought together. The different areas of the central complex, with their regular, repetitive neuroarchitecture of 16–18 vertical columns and several horizontal layers, have been characterised anatomically, electrophysiologically and with imaging approaches as a functionally ring-shaped computational system and have been modelled on the basis of the experimental data (Pfeiffer and Homberg 2014; Turner-Evans and Jayaraman 2016; Webb and Wystrach 2016; Varga et al. 2017; Heinze et al. 2018; Honkanen et al. 2019; Webb 2019; Pabst et al. 2022). Representations of directional information that can be used as a compass in path integration are found in this structure in a wide range of insects. 
Depending on the species and the specific ecological conditions under which they navigate, this information comes from different sources, such as, in particular, the celestial compass provided by the sun and/or the polarisation pattern of the sky. Directional information can, however, also be derived from distant landmarks or by integrating estimates of rotational movements of the animal based on the rotational OF and/or proprioceptive signals to encode the current course of the insect (Heinze and Homberg 2007; Seelig and Jayaraman 2015; Turner-Evans et al. 2017; Green and Maimon 2018; Rosner et al. 2019; Pisokas et al. 2020). Path integration requires combining directional signals with information about the distances travelled. As mentioned above, bees use translational OF to estimate travel distances, while walking animals like ants primarily rely on integrating their steps (Collett and Collett 2017; Stone et al. 2017; Ronacher 2020). In several insect groups the central complex has been shown to house neurons sensitive to wide-field motion (Bausenwein et al. 1994; Kathman et al. 2014; Weir et al. 2014). In bees, a set of four prominent neurons in the noduli of the central complex has been characterised that respond to translational OF in a speed-dependent manner. They respond best to backward or forward flight in directions that deviate by about 45° from the animal's longitudinal body axis (Stone et al. 2017). These neurons thus divide the animal's movement space into four cardinal directions and together can robustly encode all of the animal's translational movements—even if the body axis is not aligned with the direction of movement, e.g., when the animal makes lateral movements before crossing a narrow gap (see above). 
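The population code carried by these four neurons can be sketched as follows (an idealised sketch, not the measured tuning: each unit is assumed to respond linearly to the component of the body-centred velocity vector along a preferred direction at ±45°/±135° from the body axis, and rectification and noise are omitted):

```python
import numpy as np

# hypothetical preferred directions, degrees relative to the body axis
PREF_DIRS = np.deg2rad([45.0, 135.0, -135.0, -45.0])

def unit_responses(v_forward, v_sideways):
    """Signed projection of the body-centred velocity onto each
    preferred direction (a linear idealisation of speed tuning)."""
    v = np.array([v_forward, v_sideways])
    dirs = np.stack([np.cos(PREF_DIRS), np.sin(PREF_DIRS)], axis=1)
    return dirs @ v

def decode_velocity(responses):
    """Least-squares readout of the translation vector from the four
    responses; recovers sideways motion as well as forward motion."""
    dirs = np.stack([np.cos(PREF_DIRS), np.sin(PREF_DIRS)], axis=1)
    return np.linalg.lstsq(dirs, responses, rcond=None)[0]
```

Because the four preferred directions span the plane, the population determines the full two-dimensional translation vector, including purely lateral movements in which the body axis and the direction of movement are decoupled.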
Although it is not yet understood in detail at the neural level how OF-based distance information is integrated in the context of path integration, possible mechanisms have been made plausible by several computational modelling approaches based on established connections between existing central complex neurons (Stone et al. 2017; Pabst et al. 2022). In effect, the cellular mechanism leads to an accumulation of neuronal activity for each of eight directions represented by the population of direction cells. Since each accumulator unit acts as a separate directional odometer, there is no need for an overarching odometer cell representing the total distance travelled. Rather, the combined activity of all integrators represents, as a distributed neural code, the home vector, i.e., the distance and direction to the starting point at any given time during foraging (Stone et al. 2017).
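The distributed-odometer idea can be made concrete with a small simulation (a schematic sketch in the spirit of the cited models, with arbitrary step data: each of eight direction columns accumulates the component of every movement step along its preferred direction, and the home vector is read out from the population activity at the end):

```python
import numpy as np

COLUMN_DIRS = np.deg2rad(np.arange(0, 360, 45))      # eight direction columns

def integrate_path(steps):
    """steps: iterable of (heading_deg, distance). Each column acts as a
    separate directional odometer; no single cell stores total distance."""
    acc = np.zeros(len(COLUMN_DIRS))
    for heading_deg, dist in steps:
        h = np.deg2rad(heading_deg)
        acc += dist * np.cos(h - COLUMN_DIRS)        # per-column accumulation
    return acc

def home_vector(acc):
    """Decode the vector pointing back to the start from the population
    activity (the inverted net displacement)."""
    x = np.sum(acc * np.cos(COLUMN_DIRS)) * 2 / len(COLUMN_DIRS)
    y = np.sum(acc * np.sin(COLUMN_DIRS)) * 2 / len(COLUMN_DIRS)
    return -np.array([x, y])
```

Because each column accumulates cos(heading − column direction)·distance, the population activity is exactly the net displacement projected onto the eight directions, and the home vector can be decoded from it at any moment of the excursion without any cell ever representing total path length.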
Conclusions
In flying insects, the OF generated on the eyes during locomotion is the main source of spatial information. Although other sources of spatial information, such as disparities in the two retinal images that form the basis for stereoscopic vision, are relevant for some insects (see above), stereoscopic vision can only be used in the immediate near field of the animal. However, especially during fast locomotion, behavioural decisions are necessary at much larger distances to behaviourally relevant structures in the respective environment.
Distance information, however, is only contained in the translational components of the OF; if these are overlaid by too strong rotational flow, it becomes computationally difficult to extract valid spatial information from the OF patterns. The saccadic flight and gaze strategy of many insects, in which changes in flight direction are squeezed into rapid saccadic turns and the animal moves essentially translationally during intersaccadic phases, is therefore interpreted as an active vision strategy that facilitates the evaluation of spatial information. This general flight and gaze strategy can be further refined (e.g., movement parallax or pivoting parallax) in specific behavioural contexts, e.g., when it is necessary to traverse narrow gaps or to determine distances of objects relative to a behaviourally relevant target near the animal.
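Why the saccadic strategy helps can be seen directly from the geometry of the flow field (an idealised one-dimensional sketch along the horizon: the flow at viewing angle α is ω(α) = ω_rot + (v/d(α))·sin α, so the rotational term is identical everywhere and carries no distance information, while the translational term scales with nearness):

```python
import numpy as np

def flow_field(azimuths_deg, distances, v_trans=1.0, omega_rot=0.0):
    """Retinal angular velocity along the horizon for combined yaw
    rotation and forward translation (small-field idealisation)."""
    az = np.deg2rad(np.asarray(azimuths_deg, float))
    # rotational term: the same at every azimuth, distance-independent;
    # translational term: scales with nearness 1/d and with sin(azimuth)
    return omega_rot + v_trans * np.sin(az) / np.asarray(distances, float)
```

During a saccade the rotational term dominates and the flow is nearly uniform regardless of the scene layout; in the translational intersaccadic intervals the flow is proportional to nearness, which is why restricting rotation to brief saccades leaves the depth signal accessible.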
However, OF can only be detected if there are contrasts or spatial discontinuities at object boundaries in the scenery. Accordingly, the array of retinotopically organised motion detectors in the early visual system does not provide a nearness map of the environment, but rather a contrast-weighted nearness map, i.e., information about the proximity of contours. This OF-based distance information is not metric: the underlying OF signal scales inversely with object distance and proportionally with the animal's translational velocity. This means that a given value of an OF measurement at a large locomotion velocity corresponds to a larger object distance than at a smaller locomotion velocity. From an ecological perspective, these characteristics of OF-based distance information should not be a major problem in solving most spatial tasks, since object-related behavioural responses at higher locomotion speeds should already be initiated at a larger object distance than at a smaller speed. This applies especially to collision avoidance or the initiation of landings. An additional challenge arises if distance information based on OF is not determined directly to objects in the field of view, but if the distances over which the animal travels must be estimated, as is required for path integration between behaviourally relevant locations: since the OF-based measurements of travel distances that are to be integrated depend on the spatial layout of the environment that the animal sees along its path as well as on its flight altitude, OF-based path integration can only lead to valid results if the animal flies at roughly the same height during its outbound and return flights and if the environment in which path integration takes place corresponds, at least statistically, in its spatial layout to the environment in which the animal eventually moves back to its starting point guided by the return vector determined in this way.
Overall, much progress has been made in recent years in understanding the mechanisms underlying OF-based spatial vision in different behavioural contexts, both at the behavioural and at the neural level. The neuronal processes of local motion detection have already been elucidated at a very high level of detail. This also applies to a large extent to the functional pathways for various aspects of spatial vision based on these local motion measurements, even though many questions are still open here. Moreover, it remains an exciting challenge to look at the detailed knowledge of the neuronal mechanisms of OF-based spatial behaviour not only from the point of view of cross-species commonalities of neuronal mechanisms, but also from the perspective of the species-specific adaptations of such mechanisms to the particularly impressive and often extreme behavioural feats performed, for example, by migratory locusts and monarch butterflies, by navigating Hymenoptera, and by several other fast-flying insect species in their rapid and usually collision-free traversal of dense, cluttered environments.
Data availability statement
This review article is based on a synopsis and interpretation of numerous previously published papers. No experimental data were obtained specifically for this review article.
References
Ache JM, Namiki S, Lee A, Branson K, Card GM (2019a) State-dependent decoupling of sensory and motor circuits underlies behavioral flexibility in Drosophila. Nat Neurosci 22:1132–1139
Ache JM, Polsky J, Alghailani S, Parekh R, Breads P, Peek MY, Bock DD, von Reyn CR, Card GM (2019b) Neural basis for looming size and velocity encoding in the Drosophila giant fiber escape pathway. Curr Biol 28:1073–1081
Baird E (2020) Obstacle avoidance in bumblebees is robust to changes in light intensity. Anim Cogn 23:1081–1086
Baird E, Dacke M (2012) Visual flight control in naturalistic and artificial environments. J Comp Physiol A 198:869–876
Baird E, Dacke M (2016) Finding the gap: a brightness-based strategy for guidance in cluttered environments. Proc R Soc B Biol Sci 283:20152988
Baird E, Srinivasan MV, Zhang S, Cowling A (2005) Visual control of flight speed in honeybees. J Exp Biol 208:3895–3905
Baird E, Srinivasan MV, Zhang S, Lamont R, Cowling A (2006) Visual control of flight speed and height in the honeybee. In: Nolfi S, Baldassare G, Calabretta R et al (eds) From Animals to Animats 9. Lecture notes in computer science. Springer, Berlin, pp 40–51
Baird E, Kornfeldt T, Dacke M (2010) Minimum viewing angle for visually guided ground speed control in bumblebees. J Exp Biol 213:1625–1632
Baird E, Boeddeker N, Ibbotson MR, Srinivasan MV (2013) A universal strategy for visually guided landing. Proc Natl Acad Sci 110:18686–18691
Baird E, Boeddeker N, Srinivasan MV (2021) The effect of optic flow cues on honeybee flight control in wind. Proc R Soc B 288:20203051
Bausenwein B, Müller NR, Heisenberg M (1994) Behavior-dependent activity labeling in the central complex of Drosophila during controlled visual stimulation. J Comp Neurol 340:255–268
Bender JA, Dickinson MH (2006) Visual stimulation of saccades in magnetically tethered Drosophila. J Exp Biol 209:3170–3182
Bertrand OJ, Lindemann JP, Egelhaaf M (2015) A bio-inspired collision avoidance model based on spatial information derived from motion detectors leads to common routes. PLoS Comput Biol 11:e1004339
Boeddeker N, Hemmi JM (2010) Visual gaze control during peering flight manoeuvres in honeybees. Proc R Soc B 277:1209–1217
Boeddeker N, Mertes M, Dittmar L, Egelhaaf M (2015) Bumblebee homing: the fine structure of head turning movements. PLoS ONE 10:e0135020
Borst A (1990) How do flies land? From behavior to neuronal circuits. Bioscience 40:292–299
Borst A (2014) Neural circuits for motion vision in the fly. Cold Spring Harb Symp Quant Biol 79:131–139
Borst A (2018) A biophysical mechanism for preferred direction enhancement in fly motion vision. PLoS Comput Biol 14:e1006240
Borst A, Egelhaaf M (1989) Principles of visual motion detection. Trends Neurosci 12:297–306
Borst A, Egelhaaf M (1993) Detecting visual motion: theory and models. In: Miles FA, Wallman J (eds) Visual motion and its role in the stabilization of gaze. Elsevier, Amsterdam, pp 3–27
Borst A, Haag J, Reiff DF (2010) Fly motion vision. Ann Rev Neurosci 33:49–70
Borst A, Haag J, Mauss AS (2020) How fly neurons compute the direction of visual motion. J Comp Physiol A 206:109–124
Braun E, Dittmar L, Boeddeker N, Egelhaaf M (2012) Prototypical components of honeybee homing flight behaviour depend on the visual appearance of objects surrounding the goal. Front Behav Neurosci 6:1
Chakravarthi A, Kelber A, Baird E, Dacke M (2017) High contrast sensitivity for visually guided flight control in bumblebees. J Comp Physiol A 203:999–1006
Chen J, Mandel HB, Fitzgerald JE, Clark DA (2019) Asymmetric ON-OFF processing of visual motion cancels variability induced by the structure of natural scenes. Elife 8:e47579
Chittka L, Geiger K, Kunze J (1995) The influences of landmarks on distance estimation of honey bees. Anim Behav 50:23–31
Collett T (1977) Stereopsis in toads. Nature 267:349–351
Collett TS (1978) Peering—a locust behavior pattern for obtaining motion parallax information. J Exp Biol 76:237–241
Collett TS, Collett M (2002) Memory use in insect visual navigation. Nat Rev Neurosci 3:542–552
Collett M, Collett TS (2017) Path integration: combining optic flow with compass orientation. Curr Biol 27:R1113–R1116
Collett TS, Harkness LIK (1982) Depth vision in animals. In: Ingle DJ, Goodale MA, Mansfield RJW (eds) Analysis of visual behavior. The MIT Press, Cambridge, pp 111–176
Collett M, Chittka L, Collett TS (2013) Spatial memory in insect navigation. Curr Biol 23:R789-800
Dickinson MH (2005) The initiation and control of rapid flight maneuvers in fruit flies. Integr Comp Biol 45:274–281
Dittmar L, Stürzl W, Baird E, Boeddeker N, Egelhaaf M (2010) Goal seeking in honeybees: matching of optic flow snapshots. J Exp Biol 213:2913–2923
Dittmar L, Egelhaaf M, Sturzl W, Boeddeker N (2011) The behavioral relevance of landmark texture for honeybee homing. Front Behav Neurosci 5:20
Doussot C, Bertrand OJN, Egelhaaf M (2020) The critical role of head movements for spatial representation during bumblebees learning Flight. Front Behav Neurosci 14:606590
Dror RO, O’Carroll DC, Laughlin SB (2001) Accuracy of velocity estimation by Reichardt correlators. J Opt Soc Am A 18:241–252
Dyhr JP, Higgins CM (2010) The spatial frequency tuning of optic-flow-dependent behaviors in the bumblebee Bombus impatiens. J Exp Biol 213:1643–1650
Egelhaaf M (2006) The neural computation of visual motion. In: Warrant E, Nilsson DE (eds) Invertebrate vision. Cambridge University Press, Cambridge, pp 399–461
Egelhaaf M, Borst A (1993) Movement detection in arthropods. In: Miles FA, Wallman J (eds) Visual motion and its role in the stabilization of gaze. Elsevier, Amsterdam, pp 53–77
Egelhaaf M, Kern R (2002) Vision in flying insects. Curr Opin Neurobiol 12:699–706
Egelhaaf M, Borst A, Reichardt W (1989) The nonlinear mechanism of direction selectivity in the fly motion detection system. Naturwiss 76:32–35
Egelhaaf M, Boeddeker N, Kern R, Lindemann JP (2012) Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 6:108
Egelhaaf M, Kern R, Lindemann JP (2014) Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 8:127
Esch HE, Burns JM (1996) Distance estimation by foraging honeybees. J Exp Biol 199:155–162
Esch HE, Zhang S, Srinivasan MV, Tautz J (2001) Honeybee dances communicate distances measured by optic flow. Nature 411:581–583
Franceschini N, Riehle A, Le Nestour A (1989) Directionally selective motion detection by insect neurons. In: Stavenga D, Hardie R (eds) Facets of vision. Springer, Berlin, pp 360–390
Geurten BRH, Kern R, Egelhaaf M (2012) Species-specific flight styles of flies are reflected in the response dynamics of a homolog motion-sensitive neuron. Front Integr Neurosci 6:11
Goyal P, Cribellier A, de Croon GCHE, Lankheet MJ, van Leeuwen JL, Pieters RPM, Muijres FT (2021) Bumblebees land rapidly and robustly using a sophisticated modular flight control strategy. iScience 24:102407
Green J, Maimon G (2018) Building a heading signal from anatomically defined neuron types in the Drosophila central complex. Curr Opin Neurobiol 52:156–164
Grewe J, Kretzberg J, Warzecha A-K, Egelhaaf M (2003) Impact of photon-noise on the reliability of a motion-sensitive neuron in the fly’s visual system. J Neurosci 23:10776–10783
Grewe J, Weckström M, Egelhaaf M, Warzecha A-K (2007) Information and discriminability as measures of reliability of sensory coding. PLoS ONE 2:e1328
Grittner R, Baird E, Stöckl A (2021) Spatial tuning of translational optic flow responses in hawkmoths of varying body size. J Comp Physiol A 208:279–296
Haag J, Wertz A, Borst A (2007) Integration of lobula plate output signals by DNOVS1, an identified premotor descending neuron. J Neurosci 27:1992–2000
Hardcastle BJ, Krapp HG (2016) Evolution of biological image stabilization. Curr Biol 26:R1010–R1021
Hausen K (1984) The lobula-complex of the fly: Structure, function and significance in visual behaviour. In: Ali MA (ed) Photoreception and vision in invertebrates. Plenum Press, New York, pp 523–559
Heeger DJ, Jepson AD (1992) Subspace methods for recovering rigid motion I: algorithm and implementation. Int J Comput Vis 7:95–117
Heinze S, Homberg U (2007) Maplike representation of celestial E-vector orientations in the brain of an insect. Science 315:995–997
Heinze S, Narendra A, Cheung A (2018) Principles of insect path integration. Curr Biol 28:R1043–R1058
Hempel de Ibarra N, Philippides A, Riabinina O, Collett TS (2009) Preferred viewing directions of bumblebees (Bombus terrestris L.) when learning and approaching their nest site. J Exp Biol 212:3193–3204
Hennig P, Kern R, Egelhaaf M (2011) Binocular integration of visual information: a model study on naturalistic optic flow processing. Front Neural Circuits 5:4
Homberg U (2004) In search of the sky compass in the insect brain. Naturwiss 91:199–208
Honkanen A, Adden A, da Silva Freitas J, Heinze S (2019) The insect central complex and the neural basis of navigational strategies. J Exp Biol 222(Suppl 1):jeb188854
Howard IP (2012) Perceiving in depth, vol 3: other mechanisms of depth perception. Oxford University Press, Oxford
Howard IP, Rogers BJ (2012) Perceiving in depth, vol 2: stereoscopic vision. Oxford University Press, Oxford
Hrncir M, Jarau S, Zucchi R, Barth FG (2003) A stingless bee (Melipona seminigra) uses optic flow to estimate flight distances. J Comp Physiol A 189:761–768
Huston SJ, Krapp HG (2008) Visuomotor transformation in the fly gaze stabilization system. PLoS Biol 6:1468–1478
Kathman ND, Kesavan M, Ritzmann RE (2014) Encoding wide-field motion and direction in the central complex of the cockroach Blaberus discoidalis. J Exp Biol 217:4079–4090
Kern R, Egelhaaf M (2000) Optomotor course control in flies with largely asymmetric visual input. J Comp Physiol A 186:45–55
Kern R, Egelhaaf M, Srinivasan MV (1997) Edge detection by landing honeybees: Behavioural analysis and model simulations of the underlying mechanism. Vis Res 37:2103–2117
Kern R, van Hateren JH, Michaelis C, Lindemann JP, Egelhaaf M (2005) Function of a fly motion-sensitive neuron matches eye movements during free flight. PLoS Biol 3:1130–1138
Kern R, Boeddeker N, Dittmar L, Egelhaaf M (2012) Blowfly flight characteristics are shaped by environmental features and controlled by optic flow information. J Exp Biol 215:2501–2514
Kimmerle B, Srinivasan MV, Egelhaaf M (1996) Object detection by relative motion in freely flying flies. Naturwiss 83:380–381
Klapoetke NC, Nern A, Peek MY, Rogers EM, Breads P, Rubin GM, Reiser MB, Card GM (2017) Ultra-selective looming detection from radial motion opponency. Nature 551:237–241
Klapoetke NC, Nern A, Rogers EM, Rubin GM, Reiser MB, Card GM (2022) A functionally ordered visual feature map in the Drosophila brain. Neuron 110:1700–1711
Koenderink JJ (1986) Optic flow. Vis Res 26:161–180
Kohn JR, Portes JP, Christenson MP, Abbott LF, Behnia R (2021) Flexible filtering by neural inputs supports motion computation across states and stimuli. Curr Biol 31:5249–5260
Kral K (2012) The functional significance of mantis peering behaviour. Eur J Entomol 109:295–301
Kral K, Poteser M (1997) Motion parallax as a source of distance information in locusts and mantids. J Insect Behav 10:145–163
Krapp HG (2000) Neuronal matched filters for optic flow processing in flying insects. In: Lappe M (ed) Neuronal processing of optic flow. Academic Press, San Diego, pp 93–120
Krapp HG (2014) Optic flow processing. In: Encyclopedia of computational neuroscience. Springer, New York, pp 2539–2558
Kress D, van Bokhorst E, Lentink D (2015) How lovebirds maneuver rapidly using super-fast head saccades and image feature stabilization. PLoS ONE 10:e0129287
Kurtz R (2012) Enhancement of prominent texture cues in fly optic flow processing. Front Neural Circuits 6:78
Land MF (1999) Motion and vision: why animals move their eyes. J Comp Physiol A 185:341–352
Laughlin SB (1994) Matching coding, circuits, cells, and molecules to signals: general principles of retinal design in the fly’s eye. Prog Retin Eye Res 13:165–196
Lecoeur J, Dacke M, Floreano D, Baird E (2019) The role of optic flow pooling in insect flight control in cluttered environments. Sci Rep 9:1–13
Lehrer M (1991) Bees which turn back and look. Naturwiss 78:274–276
Lehrer M, Collett TS (1994) Approaching and departing bees learn different cues to the distance of a landmark. J Comp Physiol A 175:171–177
Lehrer M, Srinivasan MV, Zhang SW, Horridge GA (1988) Motion cues provide the bee’s visual world with a third dimension. Nature 332:356–357
Li J, Lindemann J, Egelhaaf M (2017) Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLoS Comput Biol 13:e1005919
Li J, Niemeier M, Kern R, Egelhaaf M (2021) Disentangling of local and wide-field motion adaptation. Front Neural Circuits 15:713285
Liang P, Kern R, Egelhaaf M (2008) Motion adaptation enhances object-induced neural activity in three-dimensional virtual environment. J Neurosci 28:11328–11332
Liang P, Heitwerth J, Kern R, Kurtz R, Egelhaaf M (2012) Object representation and distance encoding in three-dimensional environments by a neural circuit in the visual system of the blowfly. J Neurophysiol 107:3446–3457
Linander N, Dacke M, Baird E (2015) Bumblebees measure optic flow for position and speed control flexibly within the frontal visual field. J Exp Biol 218:1051–1059
Linander N, Baird E, Dacke M (2016) Bumblebee flight performance in environments of different proximity. J Comp Physiol A 202:97–103
Lobecke A, Kern R, Egelhaaf M (2018) Taking a goal-centred dynamic snapshot as a possibility for local homing in initially naïve bumblebees. J Exp Biol 221:jeb168674
Longuet-Higgins HC, Prazdny K (1980) The interpretation of a moving retinal image. Proc R Soc Lond B 208:385–397
Mauss AS, Borst A (2019) Optic flow-based course control in insects. Curr Opin Neurobiol 60:21–27
Mauss AS, Vlasits A, Borst A, Feller M (2017) Visual circuits for direction selectivity. Annu Rev Neurosci 40:211–230. https://doi.org/10.1146/annurev-neuro-072116-031335
Mertes M, Dittmar L, Egelhaaf M, Boeddeker N (2014) Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 8:335
Meyer HG, Lindemann JP, Egelhaaf M (2011) Pattern-dependent response modulations in motion-sensitive visual interneurons—a model study. PLoS ONE 6:e21488
Muijres FT, Elzinga MJ, Melis JM, Dickinson MH (2014) Flies evade looming targets by executing rapid visually directed banked turns. Science 344:172–177
Muijres FT, Elzinga MJ, Iwasaki NA, Dickinson MH (2015) Body saccades of Drosophila consist of stereotyped banked turns. J Exp Biol 218:864–875
Namiki S, Dickinson MH, Wong AM, Korff W, Card GM (2018) The functional organization of descending sensory-motor pathways in Drosophila. Elife 7:e34272
Nityananda V, Tarawneh G, Rosner R, Nicolas J, Crichton S, Read J (2016) Insect stereopsis demonstrated using a 3D insect cinema. Sci Rep 6:1–9
Nityananda V, Tarawneh G, Henriksen S, Umeton D, Simmons A, Read JCA (2018) A novel form of stereo vision in the praying mantis. Curr Biol 28:588–593
O’Carroll DC, Barnett PD, Nordstrom K (2011) Local and global responses of insect motion detectors to the spatial structure of natural scenes. J Vis 11:1–17
Ofstad TA, Zuker CS, Reiser MB (2011) Visual place learning in Drosophila melanogaster. Nature 474:204–207
O’Keeffe J, Yap SH, Llamas-Cornejo I, Nityananda V, Read JCA (2022) A computational model of stereoscopic prey capture in praying mantises. PLoS Comput Biol 18:e1009666
Pabst K, Zittrell F, Homberg U, Endres D (2022) A model for optic flow integration in locust central-complex neurons tuned to head direction. In: Proceedings of the annual meeting of the cognitive science society, vol 44
Panser K, Tirian L, Schulze F, Villalba S, Jefferis GS, Buhler K, Straw AD (2016) Automatic segmentation of Drosophila neural compartments using GAL4 expression data reveals novel visual pathways. Curr Biol 26:1943–1954
Pfeiffer K, Homberg U (2007) Coding of azimuthal directions via time-compensated combination of celestial compass cues. Curr Biol 17:960–965
Pfeiffer K, Homberg U (2014) Organization and functional roles of the central complex in the insect brain. Annu Rev Entomol 59:165–184
Philippides A, Hempel de Ibarra N, Riabinina O, Collett TS (2013) Bumblebee calligraphy: the design and control of flight motifs in the learning and return flights of Bombus terrestris. J Exp Biol 216:1093–1104
Pisokas I, Heinze S, Webb B (2020) The head direction circuit of two insect species. Elife 9:e53985
Portelli G, Ruffier F, Franceschini N (2010) Honeybees change their height to restore their optic flow. J Comp Physiol A 196:307–313
Portelli G, Ruffier F, Roubieu FL, Franceschini N, Krapp HG (2011) Honeybees’ speed depends on dorsal as well as lateral, ventral and frontal optic flows. PLoS ONE 6:e19486
Portelli G, Serres JR, Ruffier F (2017) Altitude control in honeybees: joint vision-based learning and guidance. Sci Rep 7:1–10
Ravi S, Bertrand O, Siesenop T, Manz L-S, Doussot C, Fisher A, Egelhaaf M (2019) Gap perception in bumblebees. J Exp Biol 222:jeb184135
Ravi S, Siesenop T, Bertrand O, Li L, Doussot C, Warren WH, Combes SA, Egelhaaf M (2020) Bumblebees perceive the spatial layout of their environment in relation to their body size and form to minimize inflight collisions. Proc Natl Acad Sci 117:31494–31499
Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M (2022) Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 225:jeb243021
Read JCA (2021) Binocular vision and stereopsis across the animal kingdom. Annu Rev Vis Sci 7:389–415
Reichardt W (1961) Autocorrelation, a principle for the evaluation of sensory information by the central nervous system. In: Rosenblith WA (ed) Sensory communication. MIT Press/Wiley, New York/London, pp 303–317
Riabinina O, Hempel de Ibarra NH, Philippides A, Collett TS (2014) Head movements and the optic flow generated during the learning flights of bumblebees. J Exp Biol 217:2633–2642
Ronacher B (2020) Path integration in a three-dimensional world: the case of desert ants. J Comp Physiol A 206:379–387
Rosner R, Pegel U, Homberg U (2019) Responses of compass neurons in the locust brain to visual motion and leg motor activity. J Exp Biol 222:jeb196261
Rossel S (1983) Binocular stereopsis in an insect. Nature 302:821–822
Schilling T, Borst A (2015) Local motion detectors are required for the computation of expansion flow-fields. Biol Open 4:1105–1108
Schwegmann A, Lindemann JP, Egelhaaf M (2014a) Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Front Comput Neurosci 8:83
Schwegmann A, Lindemann JP, Egelhaaf M (2014b) Temporal statistics of natural image sequences generated by movements with insect flight characteristics. PLoS ONE 9:e110386
Seelig JD, Jayaraman V (2015) Neural dynamics for landmark orientation and angular path integration. Nature 521:186–191
Serres JR, Ruffier F (2017) Optic flow-based collision-free strategies: from insects to robots. Arthropod Struct Dev 46:703–717
Serres J, Ruffier F, Viollet S, Franceschini N (2006) Toward optic flow regulation for wall-following and centering behaviours. Int J Adv Robot Syst 3:147–154
Serres JR, Masson GP, Ruffier F, Franceschini N (2008) A bee in the corridor: centering and wall-following. Naturwiss 95:1181–1187
Shackleton K, Balfour NJ, Toufailia HA, Alves DA, Bento JM, Ratnieks FLW (2019) Unique nest entrance structure of Partamona helleri stingless bees leads to remarkable ‘crash-landing’ behaviour. Insectes Soc 66:471–477
Sobel EC (1990) The locust’s use of motion parallax to measure distance. J Comp Physiol A 167:579–588
Srinivasan MV (2014) Going with the flow: a brief history of the study of the honeybee’s navigational “odometer.” J Comp Physiol A 200:563–573
Srinivasan MV (2015) Where paths meet and cross: navigation by path integration in the desert ant and the honeybee. J Comp Physiol A 201:533–546
Srinivasan MV, Lehrer M, Zhang SW, Horridge GA (1989) How honeybees measure their distance from objects of unknown size. J Comp Physiol A 165:605–613
Srinivasan MV, Lehrer M, Kirchner WH, Zhang SW (1991) Range perception through apparent image speed in freely flying honeybees. Vis Neurosci 6:519–535
Srinivasan MV, Zhang SW, Lehrer M, Collett TS (1996) Honeybee navigation en route to the goal: visual flight control and odometry. J Exp Biol 199:237–244
Srinivasan MV, Zhang SW, Bidwell NJ (1997) Visually mediated odometry in honeybees. J Exp Biol 200:2513–2522
Srinivasan MV, Zhang S, Altwein M, Tautz J (2000) Honeybee navigation: nature and calibration of the “odometer.” Science 287:851–853
Srinivasan MV, Zhang S, Chahl JS (2001) Landing strategies in honeybees, and possible applications to autonomous airborne vehicles. Biol Bull 200:216–221
Stöckl A, Grittner R, Pfeiffer K (2019) The role of lateral optic flow cues in hawkmoth flight control. J Exp Biol 222:jeb199406
Stone T, Webb B, Adden A, Weddig NB, Honkanen A, Templin R, Wcislo W, Scimeca L, Warrant E, Heinze S (2017) An anatomically constrained model for path integration in the bee brain. Curr Biol 27:3069–3085
Strother JA, Wu ST, Wong AM, Nern A, Rogers EM, Le JQ, Rubin GM, Reiser MB (2017) The emergence of directional selectivity in the visual motion pathway of Drosophila. Neuron 94:168–182
Strübbe S, Stürzl W, Egelhaaf M (2015) Insect-inspired self-motion estimation with dense flow fields—an adaptive matched filter approach. PLoS ONE 10:e0128413
Suver MP, Huda A, Iwasaki N, Safarik S, Dickinson MH (2016) An array of descending visual interneurons encoding self-motion in Drosophila. J Neurosci 36:11768–11780
Tammero LF, Dickinson MH (2002) Collision-avoidance and landing responses are mediated by separate pathways in the fruit fly, Drosophila melanogaster. J Exp Biol 205:2785–2798
Tautz J, Zhang S, Spaethe J, Brockmann A, Si A, Srinivasan M (2004) Honeybee odometry: performance in varying natural terrain. PLoS Biol 2:915–923
Thoma A, Fisher A, Bertrand O, Braun C (2020) Evaluation of possible flight strategies for close object evasion from bumblebee experiments. In: Biomimetic and biohybrid systems. Lecture notes in computer science, pp 354–365
Tichit P, Alves-Dos-Santos I, Dacke M, Baird E (2020) Accelerated landing in a stingless bee and its unexpected benefits for traffic congestion. Proc R Soc B 287:20192720
Timaeus L, Geid L, Sancer G, Wernet MF, Hummel T (2020) Parallel visual pathways with topographic versus nontopographic organization connect the Drosophila eyes to the central brain. iScience 23:101590
Turner-Evans DB, Jayaraman V (2016) The insect central complex. Curr Biol 26:R453–R457. https://doi.org/10.1016/j.cub.2016.04.006
Turner-Evans D, Wegener S, Rouault H, Franconville R, Wolff T, Seelig JD, Druckmann S, Jayaraman V (2017) Angular velocity integration in a fly heading circuit. Elife 6:e23496
Ullrich TW, Kern R, Egelhaaf M (2015) Influence of environmental information in natural scenes and the effects of motion adaptation on a fly motion-sensitive neuron during simulated flight. Biol Open 4:13–21
van Hateren JH, Schilstra C (1999) Blowfly flight and optic flow. II. Head movements during flight. J Exp Biol 202:1491–1500
Varga AG, Kathman ND, Martin JP, Guo P, Ritzmann RE (2017) Spatial navigation and the central complex: sensory acquisition, orientation, and motor control. Front Behav Neurosci 11:4
von Reyn CR, Breads P, Peek MY, Zheng GZ, Williamson WR, Yee AL, Leonardo A, Card GM (2014) A spike-timing mechanism for action selection. Nat Neurosci 17:962–970. https://doi.org/10.1038/nn.3741
Wagner H (1982) Flow-field variables trigger landing in flies. Nature 297:147–148
Warzecha A-K, Rosner R, Grewe J (2013) Impact and sources of neuronal variability in the fly’s motion vision pathway. J Physiol Paris 107:26–40
Webb B, Wystrach A (2016) Neural mechanisms of insect navigation. Curr Opin Insect Sci 15:27–39
Webb B (2019) The internal maps of insects. J Exp Biol 222:jeb188094
Wehner R (2020) Desert navigator. Harvard University Press, London
Weir PT, Schnell B, Dickinson MH (2014) Central complex neurons exhibit behaviorally gated responses to visual motion in Drosophila. J Neurophysiol 111:62–71
Wertz A, Borst A, Haag J (2008) Nonlinear integration of binocular optic flow by DNOVS2, a descending neuron of the fly. J Neurosci 28:3131–3140
Wertz A, Haag J, Borst A (2009) Local and global motion preferences in descending neurons of the fly. J Comp Physiol A 195:1107–1120
Wertz A, Haag J, Borst A (2012) Integration of binocular optic flow in cervical neck motor neurons of the fly. J Comp Physiol A 198:655–668
Wolf H (2011) Odometry and insect navigation. J Exp Biol 214:1629–1641
Wu M, Nern A, Williamson WR, Morimoto MM, Reiser MB, Card GM, Rubin GM (2016) Visual projection neurons in the Drosophila lobula link feature detection to distinct behavioral programs. Elife 5:e21022
Yang HH, Clandinin TR (2018) Elementary motion detection in Drosophila: algorithms and mechanisms. Annu Rev Vis Sci 4:143–163
Zavatone-Veth JA, Badwan BA, Clark DA (2020) A minimal synaptic model for direction selective neurons in Drosophila. J Vis 20:1–22
Zeil J (1993a) Orientation flights of solitary wasps (Cerceris, Sphecidae, Hymenoptera). I. Description of flights. J Comp Physiol A 172:189–205
Zeil J (1993b) Orientation flights of solitary wasps (Cerceris, Sphecidae, Hymenoptera). II. Similarities between orientation and return flights and the use of motion parallax. J Comp Physiol A 172:207–222
Zeil J (2012) Visual homing: an insect perspective. Curr Opin Neurobiol 22:285–293
Zeil J, Boeddeker N, Hemmi JM (2008) Vision and the organization of behaviour. Curr Biol 18:R320–R323
Zeil J, Boeddeker N, Stürzl W (2009) Visual homing in insects and robots. In: Floreano D, Zufferey JC, Srinivasan MV, Ellington CP (eds) Flying insects and robots. Springer, Heidelberg, pp 87–99
Zeil J (2022) Visual navigation: properties acquisition and use of views. J Comp Physiol A. https://doi.org/10.1007/s00359-022-01599-2
Acknowledgements
I would like to express my sincere thanks to all my co-workers who have carried out the research in our group over many years with great commitment and forward-looking ideas. Thanks also go to the Deutsche Forschungsgemeinschaft (DFG), which has always generously supported our research.
Funding
Open Access funding enabled and organized by Projekt DEAL.
Contributions
This is a review article that presents work from the relevant groups in the field, including my own group at Bielefeld University. The article was written exclusively by myself.
Ethics declarations
Conflict of interest
The author declares that he has no conflict of interest.
Handling Editor: Keram Pfeiffer
Cite this article
Egelhaaf, M. Optic flow based spatial vision in insects. J Comp Physiol A 209, 541–561 (2023). https://doi.org/10.1007/s00359-022-01610-w