Wang, Huatian (2021) Bio-inspired Neural Networks for Angular Velocity Estimation in Visually Guided Flights. PhD thesis, University of Lincoln.
Item Type: Thesis (PhD)
Item Status: Live Archive
Abstract
Executing delicate flight maneuvers using visual information is a huge challenge for future robotic vision systems. As a source of inspiration, insects are remarkably adept at navigating through woods and landing on surfaces, tasks which require delicate visual perception and flight control. The exquisite sensitivity of insects to image motion speed, as revealed recently, arises from a class of specific neurons called descending neurons. Some of these descending neurons demonstrate angular velocity selectivity as the image motion speed varies on the retina. Building a quantitative angular velocity detection model is the first step not only towards a further understanding of the biological visual system, but also towards providing robust and economical solutions for visual motion perception in artificial visual systems. This thesis aims to explore biological image processing methods for motion speed detection in visually guided flights. The major contributions are summarized as follows.
We have presented an angular velocity decoding model (AVDM), which estimates the visual motion speed by combining both textural and temporal information from the input signals. The model consists of three parts: elementary motion detection circuits, a wide-field texture estimation pathway, and an angular velocity decoding layer. When first tested with moving sinusoidal gratings, the model estimates the angular velocity accurately, with improved spatial frequency independence compared to state-of-the-art angular velocity detection models. This spatial independence is vital for accounting for the honeybee's flight behaviors. We have also investigated the spatial and temporal resolutions of honeybees to obtain a bio-plausible parameter setting for explaining these behaviors.
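
The abstract does not give the model equations, but the core idea of combining an elementary-motion-detector response with a wide-field texture cue to decode angular velocity can be sketched roughly as follows. This is a minimal illustrative sketch only: the filter, the correlation scheme, and the normalising decoding rule are assumptions for exposition, not the thesis's actual circuits or parameters.

```python
import numpy as np

def emd_response(frame_prev, frame_curr, tau=0.1, dt=0.01):
    """Illustrative Hassenstein-Reichardt-style correlation between
    neighbouring photoreceptor columns (NOT the thesis's exact circuit)."""
    # Crude first-order low-pass step used as the delayed arm.
    delayed = frame_prev + (dt / tau) * (frame_curr - frame_prev)
    # Correlate the delayed signal with the neighbouring column, both directions.
    rightward = delayed[:, :-1] * frame_curr[:, 1:]
    leftward = delayed[:, 1:] * frame_curr[:, :-1]
    return np.mean(rightward - leftward)   # wide-field motion response

def texture_estimate(frame):
    """Wide-field texture cue: mean absolute horizontal contrast (assumed)."""
    return np.mean(np.abs(np.diff(frame, axis=1))) + 1e-6

def angular_velocity_estimate(frame_prev, frame_curr):
    """Assumed decoding rule: normalise the motion response by the texture
    estimate so the output depends less on spatial frequency."""
    return emd_response(frame_prev, frame_curr) / texture_estimate(frame_curr)
```

The division by the texture estimate is only one plausible way to obtain the spatial-frequency independence described above; the thesis's decoding layer should be consulted for the actual formulation.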
To investigate whether the model can account for observations of the tunnel centering behaviors of honeybees, the model has been implemented in a virtual bee simulated in the Unity game engine. The results of a series of simulated experiments show that the agent can adjust its position to fly through patterned tunnels by balancing the angular velocities estimated by the two eyes under several circumstances. All tunnel simulations reproduce behaviors similar to those of real bees, which indicates that our model provides a possible explanation of how image velocity is estimated and can be used for regulating a micro aerial vehicle's (MAV's) flight course in tunnels. Furthermore, to verify the robustness of the model, visually guided terrain-following simulations have been carried out with a closed-loop control scheme that restores a preset angular velocity during flight. The simulated agent successfully flies over undulating terrain, verifying the feasibility and robustness of the AVDM in various application scenarios and showing its potential for micro aerial vehicle terrain following.
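
The two control ideas described above, balancing the left and right angular velocity estimates for tunnel centering and restoring a preset ventral angular velocity for terrain following, can be sketched as simple proportional rules. The gains, sign conventions, and function names below are illustrative assumptions, not the closed-loop scheme actually used in the thesis.

```python
def centering_command(av_left, av_right, k_lateral=0.5):
    """Tunnel centering: steer towards the side whose image moves more
    slowly, so that the two angular velocity estimates balance
    (illustrative proportional control; sign convention assumed)."""
    return k_lateral * (av_left - av_right)

def terrain_following_command(av_ventral, av_preset, k_vertical=0.5):
    """Terrain following: climb when the ventral angular velocity exceeds
    the preset value (ground too close) and descend when it falls below,
    thereby restoring the preset angular velocity during flight."""
    return k_vertical * (av_ventral - av_preset)
```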
In addition, we have also applied the AVDM to grazing landing using only visual information. An LGMD (lobula giant movement detector) neuron is also introduced to avoid collisions and to trigger the hover phase, which ensures a safe landing. By applying the honeybee's landing strategy of keeping a constant angular velocity, we have designed a closed-loop control scheme with an adaptive gain that controls the landing dynamics using the AVDM response as input. A series of controlled trials has been designed on the Unity platform to demonstrate the effectiveness of the proposed model and control scheme for visual landing under various conditions. The proposed model could be implemented on real small robots to investigate its robustness in real landing scenarios in the near future.
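
A single step of the constant-angular-velocity landing strategy, with an adaptive gain and an LGMD-triggered hover, might look like the sketch below. The gain schedule, the hover condition, and all names are assumptions made for illustration; the thesis defines the actual adaptive-gain controller.

```python
def landing_step(av_measured, av_setpoint, forward_speed, lgmd_spike, k0=0.8):
    """One illustrative landing-control step: hold the measured angular
    velocity at the setpoint by adjusting forward speed, with a gain that
    adapts as the approach slows; hover when the LGMD signals collision."""
    if lgmd_spike:
        return 0.0                        # trigger the hover phase for a safe touchdown
    gain = k0 / (1.0 + forward_speed)     # assumed adaptive-gain schedule
    error = av_setpoint - av_measured
    return max(0.0, forward_speed + gain * error)
```

Keeping the image angular velocity constant in this way causes the approach speed to fall roughly in proportion to the remaining distance, which is the behavioral principle attributed to honeybee grazing landings above.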