Recent technological advances in multibeam sonar have increased the volume of data to be analyzed. This problem is exacerbated in shallow-water surveys: at depths of less than 200 meters, the round-trip travel time is small enough to increase ping rates considerably. Removal of bad data (outliers) has previously relied on human intervention, and attempts to automate it have been plagued by sparse population of the covered area. No assumptions about the underlying physical structure are made, in order to increase the confidence of outlier detection probabilities. The authors' technique is embedded in a larger effort in progress at the Naval Oceanographic Office, called the PFM Editor, which ingests both GSF-logged multibeam data and native LIDAR data. Their process starts at the swath or single-beam level and continues in a cascading manner, applying simple algorithms, each of which builds on the previous analysis. The time-ordered data is operated on by an algorithm that exploits the physical characteristics of the collected bathymetric data through the joint probability density function (PDF) of bottom roughness (variance) and slope (local grade). This PDF is subsequently used a posteriori for outlier detection. The valid subset then serves as "anchors" for a multi-pass local aperture-filtering algorithm that classifies the remaining data: by examining the statistics of the validated data in the local aperture, reliable classification of the adjoining data points can be made. Once the data is geo-registered, it is examined using a variant of the above procedure.
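The two stages described above, outlier flagging via an empirical joint PDF of roughness and slope, followed by aperture filtering against validated anchors, can be sketched as below. This is a minimal illustration of the general idea, not the authors' implementation: the window sizes, bin counts, probability floor, and sigma threshold are all hypothetical parameters chosen for the example, since the abstract does not specify them.

```python
import numpy as np

def joint_pdf_outliers(depths, window=5, bins=16, pdf_floor=0.02):
    """Flag soundings whose (roughness, slope) pair falls in a
    low-probability region of the empirical joint PDF.
    All parameter values here are illustrative assumptions."""
    d = np.asarray(depths, dtype=float)
    half = window // 2
    pad = np.pad(d, half, mode="edge")
    # Bottom roughness: variance over a sliding window
    rough = np.array([pad[i:i + window].var() for i in range(len(d))])
    # Local slope (grade): magnitude of the central difference
    slope = np.abs(np.gradient(d))
    # Empirical joint PDF of (roughness, slope) via a 2-D histogram
    hist, re, se = np.histogram2d(rough, slope, bins=bins)
    pdf = hist / hist.sum()
    ri = np.clip(np.digitize(rough, re) - 1, 0, bins - 1)
    si = np.clip(np.digitize(slope, se) - 1, 0, bins - 1)
    return pdf[ri, si] < pdf_floor  # True = suspected outlier

def aperture_reclassify(depths, flagged, aperture=7, k=3.0):
    """One pass of local aperture filtering: valid neighbors act as
    'anchors'; a flagged point consistent with their statistics
    (within k standard deviations) is reinstated."""
    d = np.asarray(depths, dtype=float)
    out = flagged.copy()
    half = aperture // 2
    for i in np.flatnonzero(flagged):
        lo, hi = max(0, i - half), min(len(d), i + half + 1)
        anchors = d[lo:hi][~flagged[lo:hi]]
        if anchors.size >= 3:  # need enough anchors to trust the stats
            mu, sd = anchors.mean(), anchors.std()
            if sd > 0 and abs(d[i] - mu) <= k * sd:
                out[i] = False  # consistent with anchors: keep it
    return out
```

On a synthetic time-ordered profile (a gentle ramp with one depth spike), the spike lands in a low-probability region of the joint PDF and is flagged; neighboring points whose roughness is inflated by the spike are flagged in the first stage but reinstated by the anchor comparison, mirroring the cascading, multi-pass structure described in the abstract.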