Show simple item record

dc.contributor.author	Turner, D.
dc.contributor.author	Lucieer, A.
dc.contributor.author	McCabe, Matthew
dc.contributor.author	Parkes, Stephen
dc.contributor.author	Clarke, I.
dc.date.accessioned	2017-10-17T08:48:35Z
dc.date.available	2017-10-17T08:48:35Z
dc.date.issued	2017-08-31
dc.identifier.citation	Turner D, Lucieer A, McCabe M, Parkes S, Clarke I (2017) PUSHBROOM HYPERSPECTRAL IMAGING FROM AN UNMANNED AIRCRAFT SYSTEM (UAS) – GEOMETRIC PROCESSING WORKFLOW AND ACCURACY ASSESSMENT. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W6: 379–384. Available: http://dx.doi.org/10.5194/isprs-archives-xlii-2-w6-379-2017.
dc.identifier.issn	2194-9034
dc.identifier.doi	10.5194/isprs-archives-xlii-2-w6-379-2017
dc.identifier.uri	http://hdl.handle.net/10754/625872
dc.description.abstract	In this study, we assess two push broom hyperspectral sensors carried by small (10-15 kg) multi-rotor Unmanned Aircraft Systems (UAS). We used a Headwall Photonics micro-Hyperspec push broom sensor with 324 spectral bands (4-5 nm FWHM) and a Headwall Photonics nano-Hyperspec sensor with 270 spectral bands (6 nm FWHM), both in the VNIR spectral range (400-1000 nm). A gimbal was used to stabilise the sensors against the aircraft flight dynamics. For the micro-Hyperspec, a tightly coupled dual-frequency Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU), and a Machine Vision Camera (MVC) were used for attitude and position determination; for the nano-Hyperspec, a navigation-grade GNSS system and IMU provided position and attitude data. This study presents the geometric results of one flight over a grass oval on which a dense Ground Control Point (GCP) network was deployed, the aim being to ascertain the geometric accuracy achievable with the system. Using the PARGE software package (ReSe - Remote Sensing Applications), we ortho-rectify the push broom hyperspectral image strips and then quantify the accuracy of the ortho-rectification by using the GCPs as check points. The orientation (roll, pitch, and yaw) of the sensor is measured by the IMU. Alternatively, imagery from an MVC running at 15 Hz, combined with accurate camera position data, can be processed with Structure from Motion (SfM) software to obtain an estimated camera orientation. In this study, we assess which of these data sources yields a flight strip with the highest geometric accuracy.
dc.publisher	Copernicus GmbH
dc.relation.url	https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-2-W6/379/2017/
dc.rights	This work is distributed under the Creative Commons Attribution 4.0 License.
dc.rights.uri	https://creativecommons.org/licenses/by/4.0/
dc.subject	Geometric accuracy
dc.subject	Hyperspectral
dc.subject	PARGE
dc.subject	Push broom
dc.subject	UAS
dc.title	PUSHBROOM HYPERSPECTRAL IMAGING FROM AN UNMANNED AIRCRAFT SYSTEM (UAS) – GEOMETRIC PROCESSING WORKFLOW AND ACCURACY ASSESSMENT
dc.type	Article
dc.contributor.department	Biological and Environmental Sciences and Engineering (BESE) Division
dc.contributor.department	Environmental Science and Engineering Program
dc.contributor.department	Water Desalination and Reuse Research Center (WDRC)
dc.identifier.journal	ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
dc.conference.date	2017-09-04 to 2017-09-07
dc.conference.name	4th ISPRS International Conference on Unmanned Aerial Vehicles in Geomatics, UAV-g 2017
dc.conference.location	Bonn, DEU
dc.eprint.version	Publisher's Version/PDF
dc.contributor.institution	University of Tasmania, School of Land and Food, Hobart, TAS, Australia
kaust.person	McCabe, Matthew
kaust.person	Parkes, Stephen
refterms.dateFOA	2018-06-14T05:24:17Z


Files in this item

Name: isprs-archives-XLII-2-W6-379-2017.pdf
Size: 884.0Kb
Format: PDF
Description: Main article

