July 30, 2014

Head in the Point Clouds

Sam Pfeifle

Sam Pfeifle was Editor of SPAR Point Group from September 2010 - February 2013.

Uh oh. Laser scanning might be cool now...

You guys know what "bling" is? I'm guessing so. It's all that jewelry and nonsense that rappers put on (even capping their teeth with diamonds) and it kind of became a thing. At least here in the United States. Could the music industry create a similar boom for 3D data capture?

Hard to believe, I know, but see if you can stomach the video for this new song from will.i.am (of the Black Eyed Peas) and Britney Spears (of, well, a decade ago):


Did you get far enough to notice the 3D printer printing out will.i.am's head? Ok, probably not. I understand that the song is an abomination (although I sort of like the "oh wee oh wee oh wee oh" part). But the part you care about looked like this:


See, will.i.am and Britney are showing how cool they are by showing us the technology they're down with. You see an iPad with the cool typewriter keyboard link-up, an iPod underwater, a robot hand, even the latest digital camera hanging from one of the dancers' necks ACTING AS bling. Well, all of that plus a Lamborghini. And a bunch of gold chains. 

But the important part is that you can't make will.i.am's head print out without scanning it first. Or using photogrammetry of some sort, but it looks like a scan to me.

In fact, major music industry commentators have noted that tech has supplanted music as the place where all the envelope-pushers go. As the music is increasingly created with digital tools like MPC players and just plain software, it only makes sense. It used to be that tech rode music's coattails, but now it's starting to be the other way around. Musicians want to ride the latest tech, the apps and gadgets that fascinate us nowadays.

And one of those gadgets is a laser scanner. 

Get used to it. You're cool. But you knew that.

Permanent link

Laser scanning in the snow?

Okay, actually, I didn't see anyone trying to laser scan in the snow, but it's been the topic of conversation up this way in Maine, so I thought I'd give you guys a peek at what we experienced.

Portland set a record for snowfall from a single storm, with over 31 inches (the previous record was 27 inches), but up in Gray, where I live, it was more like 22-24 inches. Still, pretty impressive. 

Here's how it started on Friday:


Honestly, at this point, I thought it was going to be another storm where all the weather folks wig out while wearing their Storm Center sweaters, but nothing actually happens. That happens all the time. We get so much snow, the Storm Center folks are celebrities. A local rapper even did a spot for them that I quite like. Check it out:


But, overnight, things got nasty. This is what I woke up to on Saturday morning:


Yes, that snow ridge is well over the picnic table. The wind was howling. This is how my front door wound up:


But what's really crazy is that this was pretty much no big deal. We've got some serious snow-handling equipment here in Maine. By Sunday morning, I was driving 60 mph on back roads up to the local mountain to ski some crazy powder, and this morning I rolled into work five minutes early. No problem. But, as you can see, it might take a while before we lose the reminders of the storm. These snowbanks are immense:


Maybe we should scan them for posterity...

Permanent link

Those Hot Springs kids need our help

You'll remember my story from a couple weeks back about the kids at Hot Springs High and their work with laser scanning underground. Well, their project just won $40k from Samsung, and they've got a shot at $110k more. But they need your help. 

They've already beaten out 1,585 other entries to become a finalist in Samsung's Solve for Tomorrow contest, which encourages high schoolers to use technology to solve problems (a good idea, in general). Now, the 15 finalists go head to head in one of those public-voting deals, whereby there are 15 videos describing the finalists' projects and you've got to vote for theirs. 

So go vote. Right here.

(Caveat: The voting process is particularly annoying. You've got to give them first name/last name/email address/zip code, then you've got to verify your email address in your inbox. Then you can vote. Once a day only. Annoying.)

Too wary of Samsung's spam bots to get your vote on? Well, at least watch the Hot Springs video and check out what they're up to:


Permanent link

Live on stage: 3D projection mapping

There was no shortage of 3D-related stories coming out of CES, and I've covered a few of them (see my most recent entries on the blog), but none of them caught the ear the way the band Love in the Circus did. They found themselves front and center at CES as guinea pigs for the use of 3D projection mapping as part of the live band experience, thanks to a little donation from Sony.

While I've written about 3D projection mapping a few times before, and have already mentioned the opportunity here for laser scanning professionals, this is the first time I've seen it incorporated into a live performance, and it got my wheels turning as the subject of this week's SPARVlog. Take a look:


Additional resources:

Billboard article about Love in the Circus

Video made by the band about how they did the projection mapping

Permanent link

Lidar: All up in your grill

In my first CES round-up, I pointed out the buzz Lexus was getting for its self-driving vehicle, which features the well-known Velodyne scanner spinning around on the roof. How I missed Audi's much-sleeker entry into the lidar-based auto-driving market I'm not sure. It's pretty rad.

Car and Driver does a great job of summing up the user experience in their blog entry here detailing the test drive:

Traffic-jam assist combines the sensory information from Audi’s existing radar-based adaptive cruise control system and the car’s lane-keeping assist camera (which monitors lane markers) with a compact, bumper-mounted LIDAR (Light Detection and Ranging) laser. Audi is particularly proud of this LIDAR unit, not the least because it packs all of the sensory ability of the giant, roof-mounted LIDAR towers seen on Lexus’s autonomous LS, Google’s fleet of self-driving Priuses, and the Pikes Peak autonomous Audi TT into a brick-sized piece of hardware. Software allows the system to steer the car; if any significant torque is applied to the steering wheel, control is relinquished back to the driver.

Well, I don't know about "giant, roof-mounted lidar towers." That seems a bit overblown. But there's no arguing that Audi's version is far sleeker:


(For those still not understanding my headline, go here. But only if you have a sense of humor.)

I'd consider this a pretty impressive development. The lidar sensor has 145 degrees of visibility, so the car neither smacks into the vehicle ahead nor misses cars trying to cut in front of it in traffic. All of that while not looking ugly. Well done, Audi.

Engadget got a nice interview from CES in the Audi booth, which was by all descriptions very impressive. It outlines Audi's vision for the "piloted car" and how the company thinks people will use the functionality:


Am I the only one slightly amused by the constant assurances that the piloted function doesn't work on the highway because of how fun it is to drive an Audi on the highway and no one would want to give that up? I think they're just not quite ready to promote it at highway speeds. Try my commute to work in the Hyundai - I don't think an Audi would make 25 miles on the Maine Turnpike into a laugh riot. 

Regardless, all of this activity shows that carmakers are making a real commitment to the capture of 3D data to inform the performance of their vehicles, and I think it's only a matter of time before lidar is built into vehicles of all sorts on construction sites and in industrial facilities for safety purposes. 

But think about what else could be done here. What if, while these vehicles were scanning for obstructions, they were also constantly replenishing your as-built point cloud documentation? Like a Viametris unit in the front grill of your bucket loader. Couldn't you just do a data dump every night and then let it register overnight and, voila, updated point cloud?
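To make that nightly merge idea concrete, here's a minimal numpy sketch. Everything in it is an assumption of mine, not anything Viametris (or anyone else) actually ships: I assume each nightly scan arrives already registered to the site coordinate frame, and I use a simple voxel filter so the as-built cloud doesn't grow without bound as the same walls get rescanned every day.

```python
import numpy as np

def voxel_merge(master, nightly, voxel=0.05):
    """Merge tonight's (already registered) scan into the master
    as-built cloud, keeping one point per occupied voxel so
    rescanned areas don't pile up duplicate points."""
    combined = np.vstack([master, nightly])
    # Quantize each point (x, y, z) to its voxel index
    keys = np.floor(combined / voxel).astype(np.int64)
    # Keep only the first point seen in each occupied voxel
    _, idx = np.unique(keys, axis=0, return_index=True)
    return combined[np.sort(idx)]

# Toy example: one nightly point lands in a voxel the master
# cloud already covers, so it gets deduplicated on merge.
master = np.array([[0.00, 0.0, 0.0], [1.00, 0.0, 0.0]])
nightly = np.array([[2.00, 0.0, 0.0], [0.01, 0.0, 0.0]])
merged = voxel_merge(master, nightly)
print(merged.shape)  # (3, 3): four points in, one duplicate dropped
```

A real pipeline would run proper registration (ICP or similar) before this step; the point here is just that the "data dump every night" bookkeeping is cheap once the scans share a frame.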

Something to consider while you count your pennies in advance of the release of this new A6.

Permanent link

3D leads the top tech breakthroughs of 2012

Last year I thought it was a big deal when Faro's Focus3D landed in PopSci's "100 Best Innovations of the Year" issue, but Popular Mechanics has an even more exclusive list than that, and 3D data capture takes up a good 30 percent of the "Top 10 Tech Breakthroughs of 2012" (with a little printing thrown in).

• First up is the Lytro camera, which is really a 3D data capture device, in that it's constantly capturing light from all directions. I haven't yet seen someone harness this power for a major advancement in photogrammetry or some other jump forward, but it's only a matter of time, in my opinion. This ability to refocus after the fact has to have an application and there are smarter guys than me working on it, surely.

• Second up is 123D Catch, and that's got to be a nice validation for Autodesk. I 100 percent agree that this is a major step forward for the field of 3D data capture. I mean, this is FREE. Anyone can start creating 3D models from the world around them. This is Star Trek stuff come to life, people. I know many of us take this kind of technology for granted, since we work with it and talk about it all day, but to the world at large this is mind-blowing. I've seen minds blown just from what I show people on my phone. Good on Popular Mechanics for noticing.

• Cubify also gets a nod, and for good reason. I'm not personally sure the price point is low enough to create the DIY impact that many think the device will cause, but it's a pretty good start. 3D printing in the home will absolutely drive the desire for 3D data. I'm sure of it. I'm just not sure a $1,300 printer gets into the home very often. But maybe I'm wrong.

• Finally, there is Leap Motion, and I think this is going to be the next big game changer in our little community. Just as the Kinect brought 3D data to a whole new world of developers who hacked the device and created all manner of solutions (including Matterport's handheld device and MIT's new indoor mapper), the new Leap stands to create even more amazing advances. 

Why? Well, it's 200 times more accurate than the Kinect and it costs even less. About $70. Essentially it creates an area of about eight cubic feet where you can have a live point cloud accurate to about 0.01mm (well, that's what they say, anyway).

Here's a video that might get your mouth watering:


Pretty amazing, right? Sure, they're primarily interested as a company in gesture control as a way to operate your computer, but you don't garner $12.75 million in Series A funding from Highland Capital Partners because you might make the mouse obsolete (well, ACTUALLY, that's probably a pretty good reason to get $12.75 million in funding all by itself, now that I think about it). You get that money because people see a brand-new platform that people can crack open and take a whack at. Already, even though they've only shipped 30 units to NDA-protected developers, they've had 26,000 applications from developers looking for the SDK and a free developer unit.

That's amazing. Check out what potential developers have already pitched for ideas:

Leap applications are full of potential, and software developers are eager to push Leap’s technology towards new and exciting directions. Here is a list of the popular application categories Leap software developers would like to create for:

Games – 14%

Music and video – 12%

Art and design – 11%

Science and medicine – 8%

Robotics – 6%

Web and social media – 6%

Education – 4%

Other popular ideas for the Leap include computer-aided software design, translating sign language, using the Leap to drive a car or plane, supporting physical rehabilitation and physical disabilities and special needs, manipulating photos and videos, and creating new types of art.

And I know what you're thinking: "Great, eight cubic feet..." But you can daisy chain these things! They're $70 and they're smaller than my iPod Classic. Someone is going to come up with something that even 3D data capture pros are clamoring for.




Permanent link

Endorsements for 3D technology abound this week

The evolution of a technology industry isn't all that different from the evolution of a popular television show or website nowadays: People don't tend to watch or visit because they saw an advertisement. They watch or visit because a friend or trusted source told them to. Such is the power of social media, and media in general: it makes a person confident enough to give up some valuable time and take a chance on something.

"Hey, if all my friends are watching, I should, too."

In technology, it's business-related success that's emulated and taken as tacit endorsement: "Hey, if that company says lidar is the only way to go, we should probably give it a shot."

So, when I come across a video like this one, created and posted yesterday by the United States Geological Survey to promote its National Elevation Dataset, I'm thinking airborne lidar, especially, has pretty much made it:


I'm assuming marketing execs at Optech, Leica, and Riegl are rubbing their hands together maniacally: "Lidar is the NED's primary source of new elevation data. It's cost efficient, more accurate, and it captures data points from the earth's surface, to the tops of the features above, and in between."

"Its use goes beyond topographic mapping into other areas of science and to every day life."

"Lidar is essential in making accurate flood maps."

That's just a start! Such great endorsements of the technology from the USGS and FEMA (the U.S.'s Federal Emergency Management Agency). Corporations and the like that take their cues from the federal government, or that are looking to score government contracts, would be fools not to investigate and invest in lidar technology, right?

In a similar vein, a couple of specific companies in our space have gotten nice nods this week.

Allpoint Systems, which makes software for automatic feature recognition and impressed the crowds at SPAR this year with its remote-controlled scanning robot, has been named a finalist in the Tech 50, which celebrates the best technology firms in the Pittsburgh area. They're up against the likes of Wombat Security Technologies (which makes cybersecurity software) and Rhiza (a data sharing and visualization firm I don't completely understand). Regardless of whether they "win," it's great to see a data-capture company in the mix.

Also, a number of 3D companies have been named to the Marine Technology Report 100, a list of the best companies in the undersea space. 2G Robotics makes robots with underwater laser scanning capabilities for inspection purposes; Blueview Technologies (recently acquired by Teledyne) makes high definition multi-beam sonar; IxBlue has a number of products that create 3D with sonar and inertial navigation units; Optech works in bathymetry; and there may be others on the list in our space I'm simply not familiar with. You can download a pdf of the list here.

Just because I'd seen it recently, here's 2G Robotics' very interesting presentation from SPAR International 2011 comparing underwater laser scanning with sonar and how the two can be used together.


To see a list like that, specific to a certain vertical market, with so much 3D throughout is highly encouraging. This is technology that's proving effective and is being adopted in rapid fashion. When industry-leading publications hand out endorsements like this, it can be extremely helpful for adoption rates.

Permanent link

3D scanning: Let the hype begin!

Are you concerned that the capabilities of 3D laser scanning are being over-touted? Do you worry that people expect too much of 3D data capture and are bound to be disappointed with deliverables? Well, you might be right in line with the folks at Gartner, the IT-focused research and consulting firm. They added 3D scanning to their famous (some would say infamous) Hype Cycle chart this year, and why this matters is the subject of this week's SPARVlog. 

Take a look:


Supporting material: 

• Print-on-demand firm Shapeways has a reaction to seeing 3D printing at the top of the graph that you probably should take a look at.

• Here is the full press release from Gartner on this year's chart.


Some things I didn't have time to note in the video:

• Augmented reality was just cresting the peak of expectations last year, and this year Gartner have it just about to spiral toward the trough of disillusionment. Hasn't augmented reality already crashed and burned once before? Aren't people more open-eyed about it now? Or is that just for virtual reality?

• Mobile robots were just starting to climb last year, marked as 10 years or more from realization. They haven’t moved at all this year. Lidar and robots are pretty inextricably linked in my mind, so this is something to watch.

• Gesture Recognition was just leaving the expectations peak last year, and this year Gartner have renamed it “Gesture Control” and declared it well on its way to the trough. Along with Cloud Computing, by the way. Is anyone really disillusioned by cloud computing? I’d say it’s still being hyped virtually relentlessly. Heck, MAPPS/ASPRS is doing a full conference on "the revolution and evolution of cloud computing and other emerging new technologies impacting the practice of geospatial professionals," happening October 29 through November 1 in Tampa, Florida. Perhaps Gartner feels that a lack of universal broadband, or broadband simply not being fast enough, will lead toward disillusionment.

• Also new this year is HTML 5, which is pretty vital to the display of 3D models online. It would be great if HTML 5 just skipped right along to the plateau of productivity, as it will pull 3D along with it.

• Finally, back in 2010, 3D flat panel displays were at the top of the hype cycle - now they're nowhere to be found. Have they become irrelevant again? Mainstream? Old hat? I can't believe Gartner has deemed them obsolete...


Permanent link

Why the Kinect matters

Yesterday, I met my new neighbor for the first time. His name is Kevin. He's maybe 22 years old. Just got married and moved in next door in our little portion of the great middle of nowhere that is rural Maine with his new wife. He wants to be a preacher. (Or maybe a pastor. I can't remember the difference.) He goes to Bible school. He obviously knows nothing about 3D data capture. 

When I start to explain to him what I do - talking about lasers that bounce off of stuff and create something called a point cloud, blah, blah - he stops me and says, "Oh, you mean like what the Kinect does?"


For those of you who slander the Kinect as a "toy" and wonder why anyone would care about the data it collects, since the accuracy ain't exactly going to get that bridge built, this is why anyone would care about it. The Kinect is an ambassador of 3D data capture. It provides an incredibly low-priced entry point to the science of gathering information about the world around us, digitizing it, and then making some use of it. 

And everyone knows what it does. It introduces into the minds of kids and young adults all over the world the possibility that 3D data can be cheaply captured and made to power systems. I move my arm, stuff happens on my television screen, thanks to the Kinect. What else can moving my arm make happen? 

For just one example of how students are using Kinect technology that's incredibly timely (even though it was posted in 2011), check out this amazing video made by the folks at UC Davis (if you've got 3D glasses, put them on):


You know that the Curiosity rover just landed in Gale Crater, right? Well, UC Davis' Dawn Summer (what a great name) was co-chair of the Landing Site Working Group, and you can see in this video why Gale was chosen. But just look at the video! To quote from the information provided, "This video was filmed using a Virtual Globe program called Crusta written by Tony Bernardin at UCDavis, which runs on the VR library Vrui written by Oliver Kreylos, both of UCDavis' KeckCAVES (http://keckcaves.org). Oliver used two Kinects to capture Dawn as she described the Gale site in front of a 3D TV system with head and wiimote tracking with an optical tracking system. Oliver then re-rendered Dawn's interaction with Crusta and the Kinect reconstruction of Dawn together into one movie, including the sound track as well. The result is the merging of Dawn and Mars into a virtual world. (See http://youtube/okreylos for more on Kinect wiimote hacking.)"

The Kinect allows for inspired creativity using 3D data. It's far less likely people are just going to play around with a $100k laser scanner. Creativity leads to technological advances, even if those advances eventually lead to $100k devices. That's why the Kinect matters.

Permanent link

3D forensic work highlighted in Aurora coverage

There's not much I dislike more than the media feeding frenzy that surrounds tragedy. The vast array of news options all smell pageviews and eyeballs in a story like the Aurora movie theater shooting, and they look to sensationalize any and all angles that might bring in a few more clicks or ticks on the ratings dial. And yet here I am finding a 3D angle.

First and foremost, let me register my profound objection to CNN's decision to label the event "Massacre in Theater Nine." Like it's some cheap slasher flick. That's embarrassing for us as a society, if you ask me. 

However, I came across a report from Anderson Cooper that shines a little light on our corner of the world here in 3D data capture. First, give it a watch:


Yes, they're essentially picking up on a CBS broadcast, but I couldn't find the original video of that. Most importantly, listen to the endorsement handed out for 3D data capture: "special equipment to try to recreate what occurred" and "it's powerful in front of a jury" and "extremely useful when there is some doubt as to what really happened." Exactly. Couldn't have said it better myself. 

But why does the unidentified guy in the studio say you can "collect up to 30 million points of reference"? Heck, that's like 30 seconds of scanning with a phase-based scanner! Is he just picking a number out of thin air? I'm guessing he's just passing along a tidbit of information he found and picked a number he thought sounded big, but I could be wrong.
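For what it's worth, the "30 seconds" quip checks out as rough arithmetic. Assuming a round 1,000,000 points per second for a phase-based scanner (my assumed figure; real rates vary by model and resolution setting):

```python
# Sanity check on the "30 million points of reference" quote,
# assuming a phase-based scanner captures on the order of
# 1,000,000 points per second (an assumed round number).
points_quoted = 30_000_000
points_per_second = 1_000_000
seconds_of_scanning = points_quoted / points_per_second
print(seconds_of_scanning)  # 30.0
```

So 30 million points is a perfectly plausible total for a scene, but it's hardly an upper bound; a full scene with multiple setups runs into the hundreds of millions.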

To learn more about ARAS 360, since that was the program he seemed to be speaking about specifically, I gave them a call. They're located in Kamloops, British Columbia, and were founded in 2010 by Mike Kennedy and Mike Greenfield. Unlike a lot of CAD software for accident recreation, they didn't start out in 2D and move to 3D - they've always been nothing but 3D. And they weren't actually involved in the Aurora investigation, but were referred by a customer, Hal Sherman, a retired NYPD officer who works with CBS on crime scene consultation.

Pretty sweet that they got to be the "example" of this kind of technology.

I spoke with operations manager Ricci Krizmanich, who said users bring in data from just about any measuring device. Total stations feature most prominently - there's even a package you can buy with a Nikon total station - but she also specifically mentioned Leica and Faro laser scanners. 

Here's their page labeled "Total Station Download," which gives you some indication that most organizations they're working with haven't invested in laser scanners yet. However, they mention you can import your "points file" and "photogrammetry points" and "as many point sets as you like." It may be slightly easier said than done.

Maybe they haven't seen the size of some of the point clouds that are being created nowadays...

Anyway, the bigger picture here is yet more progress into the mainstream for 3D data collection. We saw a similar breakthrough in the UK in the case of the murdered MI6 agent, and this may end up being something of a watershed moment for laser scanning and forensic documentation in the US. Police officers put a lot of stock in the work their peers are doing, and this kind of validation in a high-profile case could go a long way. 

Permanent link