
eNewsletter March 2008

LiDAR Advances and Challenges

by Rod Franklin, Reporter

An expanded version of this story will be featured in the Spring 2008 issue of Imaging Notes, which will be mailed and posted on the website in April.

Standing-room-only crowds at this year's International LiDAR Mapping Forum (ILMF) conference in Denver provided evidence of the technology's status as a three-dimensional mapping technique that continues to grow in popularity. The two-day event, held Feb. 21-22, included workshops for beginners, as well as advanced presentations detailing many of the technical advantages and challenges associated with Light Detection and Ranging (LiDAR).

Attendance was more than double that of the 2007 conference, with more than 580 delegates from 32 countries participating, and exhibitors set up nearly twice as many booths as last year. Some vendors reported substantial new orders from the show floor.

The healthy turnout was reflective of growth that in recent years has added variety to the list of players contributing to the LiDAR art. Since its genesis in the mid-1990s, LiDAR has advanced a respectable distance beyond the realm of hardware production. Affiliate roles now include system operator, support services vendor and government sponsor. However, though certain developments have pushed the modality into more diversified applications, LiDAR remains a technological work in progress. In many scenarios, it continues to serve in an adjunctive capacity to alternate imaging techniques, rather than taking the lead role.

Some of the reasons for this subordinate role have to do with market forces and the requirements of individual projects. Others have to do with technology.

James Wilder Young, LiDAR manager for the Sanborn Map Company of Colorado Springs, profiled an industry that has evolved significantly since its commercial debut, with a number of defining parameters improving by orders of magnitude. For instance, systems that once hummed along at a meager 5 to 10 kHz now operate in the 150 to 300 kHz range. Another development has been the advent of multi-pulse LiDAR, which effectively put an end to one- and two-pulse configurations. Also, the nature of LiDAR data itself (typically millions of pulsed-light elevation data points that require formatting, importation and processing) isn't quite as vexing a problem as it used to be for the off-the-shelf software packages designed to groom it into textured vistas of urban and rural beauty.
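To put those pulse rates in perspective, a rough back-of-envelope calculation (a Python sketch using illustrative flight times and point-record sizes, not figures from the conference) shows how quickly modern scan rates translate into the millions of elevation points described above.

```python
# Rough estimate of LiDAR data volume at the pulse rates mentioned above.
# Flight duration and bytes-per-point are illustrative assumptions only.

def survey_point_count(pulse_rate_hz, flight_hours, returns_per_pulse=1.0):
    """Rough count of returns collected during a survey flight."""
    return pulse_rate_hz * flight_hours * 3600 * returns_per_pulse

BYTES_PER_POINT = 28   # roughly one uncompressed LAS point record

for rate_khz in (10, 150, 300):
    points = survey_point_count(rate_khz * 1000, flight_hours=2)
    print(f"{rate_khz:>3} kHz over 2 h -> {points / 1e6:,.0f} million points, "
          f"~{points * BYTES_PER_POINT / 1e9:.1f} GB raw")
```

Even at the low end of the modern range, a short flight produces hundreds of millions of returns, which is why formatting and processing remain a central concern.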

Despite these advances, LiDAR has its limits. Young observed that mapping options such as interferometric synthetic aperture radar are better tailored for certain projects. On a broader level, LiDAR suffers from the same kind of software development gap that plagues other high-tech industries, because programs coded to work with advanced LiDAR hardware generally fail to exploit the full potential of the device circuitry. Young showed a slide asserting that the LiDAR market "is growing all around the world, but LiDAR handling software is not," and identified three-dimensional urban modeling, automated classification and vegetation mapping as markets where this gap is especially problematic.

Others who spoke at the conference shared his thoughts on this issue. Jennifer Whitacre, who gave a presentation on the simultaneous use of LiDAR with large-format camera imaging, cited automatic data filtering as a feature that LiDAR experts rank high on their wish list of processing functionality. Specifically, she said, they want filtering capability robust enough to "clean up" LiDAR points so topographic features in complex elevation profiles can be distinguished, but not so aggressive that the targeted data itself is also stripped away.
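As a rough illustration of what such filtering does (not any vendor's actual algorithm), the sketch below classifies points as ground or non-ground by comparing each return to the lowest elevation in its grid cell; the cell size and height tolerance are exactly the kinds of parameters that have to be tuned per project.

```python
# A deliberately simplified sketch of the kind of point filtering discussed
# above: label returns as "ground" if they sit close to the lowest elevation
# in their grid cell. Production filters are far more sophisticated; cell_size
# and height_tol are hypothetical tuning parameters.
import numpy as np

def classify_ground(points, cell_size=2.0, height_tol=0.5):
    """points: (N, 3) array of x, y, z. Returns a boolean mask of ground points."""
    ij = np.floor(points[:, :2] / cell_size).astype(int)   # grid cell of each point
    keys = ij[:, 0] * 1_000_003 + ij[:, 1]                  # one integer id per cell
    order = np.argsort(keys)
    ground = np.zeros(len(points), dtype=bool)
    # walk the points cell by cell, keeping returns near the cell's lowest elevation
    for cell in np.split(order, np.where(np.diff(keys[order]) != 0)[0] + 1):
        z_min = points[cell, 2].min()
        ground[cell] = points[cell, 2] <= z_min + height_tol
    return ground
```

Set the tolerance too small and legitimate terrain points on slopes are stripped away along with the clutter; set it too large and vegetation or rooftop returns survive as "ground," which is the balance Whitacre described.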

"I think the higher technology systems are becoming so advanced that software does lag behind," said Whitacre, who handles LiDAR sales for Kansas City company M.J. Harden (now a GeoEye company). See Figure 1.

Figure 1. This LiDAR image of Kansas City was generated by M.J. Harden Associates Inc. from unedited first-return data captured at 0.9-m postings. Since the raw data was not modified by point classification at this stage, building features took on a rather spiky appearance.

The firm currently uses a software package capable of providing everything from data calibration to project management, tile parsing and data filtering. But because any given mapping assignment may call for modifications that program developers can't plan for in advance, "it's really important to be able to go in and tweak those algorithms to make them fit your specific project," Whitacre said. "I think all software has to be tweaked."

In his presentation on feature extraction from LiDAR data, Overwatch Textron Systems Chief Operating Officer Stuart Blundell observed that "software development is always chasing after advances in sensor capability." To some degree, that's just natural in the high-tech imaging marketplace: "I think it's always kind of been this way. It's a function of the physics (and) how optics are built, and (because) engineering capabilities are so sophisticated."

Blundell is dedicated to resolving the specific requirements of LiDAR in three areas that create bottlenecks during the feature extraction process. The first of these has to do with the use of multiple sensors during data acquisition. A second chore that taxes software involves the need to merge RGB color data or intensity data with XYZ data points. Third, high spatial resolutions and very large datasets present an additional set of challenges. Logjams result when the software chokes while trying to accommodate all of these characteristics of LiDAR. According to Blundell, these bottlenecks are most likely to occur at the data registration stage, during feature extraction, or when attributes are applied to the data so it can be useful in a GIS database.
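As a concrete, deliberately simplified example of the second bottleneck, the Python sketch below attaches RGB values from an orthophoto to XYZ points by a nearest-pixel lookup. The image array, its origin and its pixel size are assumed inputs here; a real workflow would also handle registration error, nodata values and points falling outside the image.

```python
# Minimal sketch of the XYZ + RGB merge described above: sample an orthophoto
# at each LiDAR point's ground position to produce colorized points.
import numpy as np

def colorize_points(xyz, ortho_rgb, origin_xy, pixel_size):
    """xyz: (N, 3) points; ortho_rgb: (rows, cols, 3) image. Returns (N, 6) x, y, z, r, g, b."""
    cols = ((xyz[:, 0] - origin_xy[0]) / pixel_size).astype(int)
    rows = ((origin_xy[1] - xyz[:, 1]) / pixel_size).astype(int)   # row 0 is the north edge
    rows = np.clip(rows, 0, ortho_rgb.shape[0] - 1)
    cols = np.clip(cols, 0, ortho_rgb.shape[1] - 1)
    return np.hstack([xyz, ortho_rgb[rows, cols]])
```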

If LiDAR data can be manipulated in a way that aligns with GIS-ready vector or shapefile formats, mapping experts should be able to move beyond the mere visualization of a scene and into the extraction of smaller physical features from within that scene. LiDAR Analyst, an Overwatch Textron Systems software product designed to provide this functionality, was introduced in 2004 as a plug-in for ArcGIS and ERDAS Imagine, and became available in 2008 as a plug-in for Remote View and ELT.
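LiDAR Analyst's internals aren't described here, but the final "GIS-ready" step can be illustrated generically with the open-source geopandas/shapely stack: hypothetical extracted building footprints are written out as a shapefile that any GIS database can ingest.

```python
# Generic illustration of exporting extracted features to a GIS-ready format.
# The footprints and attributes are hypothetical stand-ins for the output of
# the feature-extraction stage.
import geopandas as gpd
from shapely.geometry import Polygon

footprints = [
    Polygon([(0, 0), (12, 0), (12, 8), (0, 8)]),
    Polygon([(20, 5), (28, 5), (28, 15), (20, 15)]),
]
gdf = gpd.GeoDataFrame(
    {"height_m": [9.5, 14.2], "geometry": footprints},
    crs="EPSG:26915",          # example projected CRS (UTM zone 15N, NAD83)
)
gdf.to_file("buildings.shp")   # shapefile with geometry plus attributes
```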

Denver's ILMF conference showed that the ranks of LiDAR technologists are swelling. With more brains working in tandem and a very capable array of hardware waiting, LiDAR should be well positioned for the future as an option for digital mapping experts. But, as Blundell observed, "the software guys like us now have to deliver on that the best we can."

Bernie Krause during his "Giving Voice to the Map: The 'There' in 'Where'" session at the 2007 Where 2.0 Conference at the Fairmont Hotel in San Jose, California.

Explore the Future of Location-Aware Innovation at Where 2.0

The O'Reilly Where 2.0 conference brings together the people, projects, and issues inspiring change—the most provocative, interesting and useful event in the geospatial industry. Where 2.0 exposes the tools pushing the boundaries of the location frontier, tracks the emergence of new business models and services, and examines new sources of data and the platforms for collecting them. Discuss what's viable now, and what's lurking just below the radar at Where 2.0, May 12-14, Burlingame, CA.

Imaging Notes readers may use code "whr08img" to save 15% off registration fees.
