Posted on May 12, 2011
Dayton Technology Landscape Conference
Technology First is a local IT Trade Group, and their second annual “Technology Landscape Conference” was yesterday, so I dutifully (duty = I’m dating their intern) attended.
Ok, so there was some more duty… one of the companies presenting was ExpeData, a Dayton, Ohio (which is “local” for us folk) company that has a digital writing capture technology. We’ve been working with them for a few months to find some suitable applications and to discuss some security issues and requirements. It’s a fairly interesting technology, although I have some trouble finding its killer app.
Another interesting company whose presentation I attended was Persistent Surveillance Systems — these guys have a 190+ megapixel camera array that they fly over the Cincinnati area (among others), taking pictures about once per second. When they hear about a crime, typically a murder, after the fact, they can go back and assign analysts to review the captured images to track people in the vicinity. Their software allows analysts to assign colored tracks and markers to people, vehicles, and anything else of interest — they initially track suspects, then go back and track anyone they interacted with, anyone nearby (possible witnesses/accomplices), and whatnot. The wide view of the city and the long video timelines allow them to watch people drive all the way to their destination — a home, hideout, friend’s house, or whatever — where they can then work with police to get a warrant and follow up as appropriate. Their metadata is even good enough that they can apparently cross-reference locations to find that, for example, the getaway driver from murder A may have lived next door to the suspect from murder B, which may help detectives tie together previously unrelated crimes.
At least that’s the theory. The company claims the system has already been used to solve some 35+ murders. At $2,000/hr to run the flights, that would cost some $17.5 million for a year of full coverage (someone check my math). That may be worth it to some cities to solve murders, I’m not sure, but it’s certainly interesting technology. The company claimed that their short pole currently is analyst time; they need to improve automation (object tracking, among other things) because they can’t handle the volume of murders with the current manual work, and they’d like to branch into other crimes.
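For the curious, here’s a quick back-of-the-envelope check of that number in Python (this assumes round-the-clock flights for a full year, which is surely an overestimate of any real coverage schedule):

```python
# Rough annual cost of continuous aerial coverage at the quoted rate.
# Assumes 24/7 flight time for a full year -- surely an overestimate.
hourly_rate = 2000            # dollars per flight hour (quoted figure)
hours_per_year = 24 * 365     # 8,760 hours in a non-leap year

annual_cost = hourly_rate * hours_per_year
print(f"${annual_cost:,}")    # $17,520,000 -- roughly the $17.5M above
```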
There are probably some major privacy concerns floating around here; I’m not certain. Of course, if you’re out in the open these days you’re kind of submitting to being videotaped, but if you’re the paranoid type you may want to take routes through lots of tunnels when you’re in the Cincinnati area.
There was a large 3D screen set up by the folks from daytaOhio, a non-profit group working as part of Ohio’s Third Frontier program. Their talk focused on the human limitations around data consumption; the speaker noted that people process words at roughly 500 baud (bits per second — hadn’t heard that one in a while, so it was nice to hear a tech word that made me feel old). The goal of data visualization is to increase that consumption rate, dramatically if possible. Unfortunately I didn’t get to hear a lot about how they actually accomplish that — it seemed a bit smoke and mirrors, but the 3D tech they had in place was enjoyable enough to play with, and I got to talk to the speaker at length after the presentation about the issues we’re seeing in non-spatial data visualization and brainstorm a bit about how we could apply their practices.
It’s interesting to note that the core technology hasn’t changed much in a decade and a half — I worked a bit at Argonne National Labs (i.e. I was smuggled in by a friend) in the mid-nineties on one of their CAVEs (CAVE Automatic Virtual Environment — my first exposure to a recursive acronym). We wrote some OpenGL to let a user plot an area in a virtual rubber incinerator and track a cross-section of the heat/velocity/particles flowing through it, based on some fluid dynamics models. Very cool 3D environment with two walls, a floor, head tracking, a wand (3D mouse), and a really cool fish screensaver that was kind of scary when a huge blue whale swam underneath. In fact, it was almost identical to the setup that daytaOhio had, except that this one had only one wall and no head tracking or wand. :-\
There were some other compelling talks at the conference. PwC delivered the morning keynote, which covered a few declining technologies and had a nice overall technologist focus. Ford gave a lunch talk about their cutting-edge auto technologies, including some of the interesting Sync features they have.
The final talk, though, was by IBM on the Watson platform. The talk wandered nicely between some high-level challenges and some geeky discussions of the underlying technology. There’s no doubt that this sort of technology can be game-changing for the right adopters, but of course at this point the cost is incredibly high. IBM is working to sort that out, of course, and I wouldn’t be surprised to see some commercial offerings of their DeepQA platform being used for real decision support in the not-too-distant future.
I’m trying not to make this too terribly long a post, so I’ve left a lot of juicy details out. If anyone’s interested (and if anyone reads this), drop me a note and I’ll try to fill in any gaps. Technology First should have some of the slides posted soon, so I hope to go back and pilfer a few thoughts from them to refresh my memory about what exactly was cool.