July 30, 2014

Head in the Point Clouds

Sam Pfeifle

Sam Pfeifle was Editor of SPAR Point Group from September 2010 - February 2013.

Mobile scanning - yeah, people are doing that

Why do I attend conferences? Well, it's my job, obviously. But only at conferences can I go to lunch and happen to sit across the table from a guy who's got a photo on his phone of himself driving a four-wheeler with a scanner strapped to it.

Such is what happened at the Riegl conference last month, when I sat down with a couple of slices of pizza, right across from Jake Mattox, a VP at EMC Survey, a firm with headquarters in Grenada, Mississippi, and offices in Florida, Texas, and Louisiana. There were also a couple of guys from Eagle Mapping in BC sitting with us, and we'd gotten to talking about how Riegl spends a lot of time emphasizing just how durable their scanners are. I mentioned how I'd heard people were strapping MDL's Dynascans onto four-wheelers and whatnot, since they were a little less expensive and maybe a crash wouldn't be quite such a disaster, like it would be with a more expensive (and more accurate and robust) Riegl system. 

Heck, Mattox says, look at this. And there's a Riegl VZ400 strapped to a four-wheeler, right there on his phone. Of course, he didn't have a card, but I gave him one of mine and asked him to please be in touch so I could show some of his rigs off. Well, he followed through, and here I am with a few images he sent me. Pretty dang cool, really.

First up, that four-wheeler (ahem, I mean "laser scanning platform," and, actually, I guess it's more accurately an "off road vehicle," a Ranger from Polaris):


EMC was kind enough to integrate the CADD, the point cloud and the actual image all into this one illustration for me. I think it's just about the coolest thing since sliced bread, to tell the truth. Just a great illustration of the company's capabilities. 

Here's another example, scanning a pipeline:


And, to reiterate, that's not Riegl's VMX-250 or -450 mobile scanning platform. That's their terrestrial scanner rigged up by EMC to perform mobile scanning duties. They do both terrestrial and mobile off the same platform. I guess they're just careful. Personally, I'd be a little worried about bringing a scanner of that value into this here environment:


You know, accidents happen? I wouldn't want to be the guy having to explain what happened when the Riegl dropped in the water because I thought I saw an alligator or something. (I once had a colleague have to explain why she spilled red wine all over her laptop - that was bad enough!)

Regardless, it's great to see cool solutions created for real-world situations. If your firm is doing anything similar, please send some photos along. The more the merrier. Nothing tells a story like a picture.


So that IS a Velodyne lidar unit on that Google car

We here at SPAR have obviously been covering driverless cars (or autonomous vehicles) for some time now. We had a presentation titled "Autonomous Vehicles from NIST's Intelligent Systems Division" at SPAR 2005. It was pretty cool for the time, using SICK lasers and flash lidar and multi-camera systems to create some working prototypes and at least figure out what the issues were. 

By 2007, though, things had moved along quite a bit. SPAR was on the scene for the final round of the DARPA Urban Challenge, which pitted 11 organizations against one another to figure out who had the best-performing autonomous vehicle navigating city streets with intersections and human drivers in the mix. Carnegie Mellon scored the top $2 million prize, followed by the teams from Stanford and Virginia Tech. Just three years prior to that, in the first Challenge event, 15 vehicles started the race, and not a single one finished. And that was a desert course without any traffic.

It's likely some of the ideas developed in those challenges wound up in Lockheed Martin's Squad Mission Support System, which we profiled last year, and is out in the field right now, following soldiers around and acting as a mule. 

Pretty clearly, the technology is moving forward at a rapid pace. Still, I was a little surprised last fall when Sergey Brin went public in a major way about Google's driverless car initiative, and I was similarly surprised this week to read the MAMMOTH article in Wired about driverless car technology and its looming commercial availability. 

Or should I say current commercial availability. Even being in the business, I guess I wasn't aware of just how far the use of lidar and 3D data acquisition has come in the motor vehicle industry. 

Maybe it's because I'm not often in the market for Mercedes and BMWs, but their high-end cars are already making it so you can't veer out of your lane or hit a pedestrian or have to cut off your cruise control when you roll up on some grandma going 50 on the highway. And those of you living in the San Francisco area have apparently been dealing with driverless cars pretty frequently lately - Google cars had logged some 140,000 miles already by 2010. It's certainly well into the millions now, and Tom Vanderbilt, author of the Wired article, does a fantastic job of projecting the wonder and desirability of these lidar-shooting driverless vehicles.

Vanderbilt had been in Stanford's car in 2008, doing 25 mph on closed off roads. Now:

“This car can do 75 mph,” Urmson says. “It can track pedestrians and cyclists. It understands traffic lights. It can merge at highway speeds.” In short, after almost a hundred years in which driving has remained essentially unchanged, it has been completely transformed in just the past half decade.

It really is amazing. Just this video alone kind of blows my mind:


Pretty awesome for a 15-second video, right?

But perhaps the clearest indication to me, as a technology writer for the past seven years, is that this stuff is starting to become less and less secretive and more and more matter of fact. The first time I interviewed Velodyne president Bruce Hall, in January of 2011, he would only hint that the giant spinning soda cans on top of the Google cars were his. Now, we get this matter-of-fact statement from the Wired article:

Google employs Velodyne’s rooftop Light Detection and Ranging system, which uses 64 lasers, spinning at upwards of 900 rpm, to generate a point cloud that gives the car a 360-degree view.

I certainly couldn't put that in my article at the time. But then maybe if I'd been writing for Wired... Just kidding. This happens all the time. Usually, the big company doesn't want the little company bragging that the big company just bought a bunch of its stuff. And that's usually because the big company isn't sure the little company's stuff is going to work exactly right, and they might want to scrap the whole project. At this point, with Brin doing his crowing, this project isn't going anywhere. 
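If you've never thought about how a spinning multi-laser head turns into that 360-degree point cloud, the geometry is simple: each of the 64 lasers is fixed at its own vertical angle, and as the head rotates, every (rotation angle, elevation angle, range) measurement becomes one x-y-z point. Here's a minimal sketch of that conversion; the evenly spaced laser fan and the constant range are made-up illustration values, not Velodyne's actual specs:

```python
import math

def polar_to_xyz(azimuth_deg, elevation_deg, range_m):
    """Convert one laser return (spherical coordinates) to a Cartesian point."""
    az = math.radians(azimuth_deg)    # head rotation angle, sweeps 0-360 degrees
    el = math.radians(elevation_deg)  # fixed per-laser vertical angle
    x = range_m * math.cos(el) * math.sin(az)
    y = range_m * math.cos(el) * math.cos(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One revolution of a hypothetical 64-laser head (evenly spaced fan -- an assumption):
elevations = [-24.0 + i * 0.75 for i in range(64)]
cloud = []
for step in range(360):                # one shot per degree of rotation, for illustration
    for el in elevations:
        r = 10.0                       # stand-in range; real returns vary per laser
        cloud.append(polar_to_xyz(step, el, r))

print(len(cloud))  # 360 * 64 = 23040 points per revolution
```

At 900 rpm that's 15 revolutions a second, which is why these units throw off points by the million.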

Nor is it clear, though, that lidar is going to be the data capture method of choice for autonomous cars. Check out what Mercedes is working on:

That’s why Mercedes has been working on a system beyond radar: a “6-D” stereo-vision system, soon to be standard in the company’s top models ... As we start to drive, a screen mounted in the center console depicts a heat map of the street in front of us, as if the Predator were striding through Silicon Valley sprawl. The colors, largely red and green, depict distance, calculations made not via radar or laser but by an intricate stereo camera system that mimics human depth perception. “It’s based on the displacement of certain points between the left and the right image, and we know the geometry of the relative position of the camera,” Barth says. “So based on these images, we can triangulate a 3-D point and estimate the scene depth.” As we drive, the processing software is extracting “feature points”—a constellation of dots that outline each object—then tracking them in real time. This helps the car identify something that’s moving at the moment and also helps predict, as Krehl notes, “where that object should be in the next second.”

That's videogrammetry at work, folks, and it may very well be that the cameras and software are cheaper to use, or more effective, than the lidar. Maybe the lidar is creating too much information. Only the folks in the trenches know the answer to that. 
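The triangulation Barth describes boils down to one formula: for a calibrated camera pair a baseline B apart, with focal length f (in pixels), a feature point that shifts by d pixels between the left and right images sits at depth Z = f * B / d. A toy sketch with made-up numbers, not anything from Mercedes' actual system:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a matched feature point from its left/right image displacement."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or a bad match")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700-pixel focal length, cameras mounted 30 cm apart.
f, B = 700.0, 0.30
for d in (70.0, 21.0, 7.0):  # bigger displacement between images = closer object
    print(f"disparity {d:5.1f} px -> depth {stereo_depth(f, B, d):.2f} m")
```

Run the matching over thousands of feature points per frame and you get exactly the "constellation of dots" the article describes, each with its own depth.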

Things will get really cool when all the cars are also rocking wifi and can communicate with each other in real time, so they can give each other virtual eye contact and virtual waves to let their autonomous counterparts go ahead. It's really ancillary to the 3D question, but I had to pass along this vision of the intersection of the future that was brought to us by Atlantic Cities, riffing on Wired's article: 


Pretty hard not to just stare at that forever, right?

As for whether people actually WANT autonomous cars, I'll leave that to the writers of those above linked-to articles. I have my own ideas, but they're probably not worth a whole lot. I will say, however, that Vanderbilt did get himself a great quote from Anthony Levandowski, business head of Google's self-driving car initiative:

“The fact that you’re still driving is a bug,” Levandowski says, “not a feature.”
