This is a guest blog post by Yosun Chang, second place winner in the Dreamforce ’11 Hackathon.

So, a few weeks ago, I spent the better part of each day participating in the Dreamforce Hackathon 2011. The result of three days' work was SocialVoxels, and a nice $5000 prize! A couple of people have asked me how I came up with the idea and what the whole process was like, so here goes.

Initial Ideas

I found out about the hackathon late, so I didn’t start my brainstorming and planning process until Saturday/Sunday. I typically work in phases:
1) raw “first blush” ideas
2) read-the-rules (and judging criteria)
3) idea filtering
4) last-round ideas
5) final filtering

This whole process typically happens in parallel, so to speak, as I tackle weekend errands and get inspired by these usually dull events: checking into Costco on Foursquare, scrolling through the usual bunch of comments, transit, the works. There are tons of tips for every popular venue, just as there are tons of comments for every popular post online. There are lots of people out there, but their comments get lost in a sea of text. The problem is that people can't find what they need in this mess of metadata. Good stuff gets lost, and the more social content we post, the more goes up there, and the more of it gets lost.

Final Filtering

Idea no. 1: A graph-based "true relevance" search engine. Basically, "what you need, when you need it… through the social graph." People post certain need-to-know statuses, such as "I need a job," "I have a Burberry scarf to sell," or "I need a room," and then, through the relation-based magic of a graph database, they start learning who in their circle (and beyond) can "answer" these needs for them. By also scraping Twitter, Craigslist, and the like, it expands beyond just the people in the network. The plus is that InfiniteGraph does the graph magic, and it runs on Java and Heroku.
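To make that concrete, here is a minimal sketch of the kind of matching a graph store makes easy, written against a plain in-memory adjacency list rather than InfiniteGraph's actual API (every name here is hypothetical, not code from the project):

```java
import java.util.*;

// Minimal sketch: match a "need" post against "offer" posts reachable
// through the social graph. The real idea would use InfiniteGraph on
// Heroku; a plain in-memory adjacency list stands in here.
public class NeedMatcher {

    // person -> people they know
    static Map<String, List<String>> friends = new HashMap<>();
    // person -> things they can offer ("room", "burberry scarf", ...)
    static Map<String, List<String>> offers = new HashMap<>();

    // Breadth-first search outward from the asker, collecting anyone
    // within maxHops whose offers contain the needed keyword.
    static List<String> whoCanAnswer(String asker, String need, int maxHops) {
        List<String> matches = new ArrayList<>();
        Set<String> seen = new HashSet<>(List.of(asker));
        Queue<String> frontier = new ArrayDeque<>(List.of(asker));
        for (int hop = 0; hop < maxHops && !frontier.isEmpty(); hop++) {
            Queue<String> next = new ArrayDeque<>();
            for (String person : frontier) {
                for (String friend : friends.getOrDefault(person, List.of())) {
                    if (!seen.add(friend)) continue;   // already visited
                    if (offers.getOrDefault(friend, List.of()).contains(need)) {
                        matches.add(friend);
                    }
                    next.add(friend);
                }
            }
            frontier = next;
        }
        return matches;
    }

    public static void main(String[] args) {
        friends.put("me", List.of("alice"));
        friends.put("alice", List.of("bob"));
        offers.put("bob", List.of("room"));
        System.out.println(whoCanAnswer("me", "room", 2)); // prints [bob]
    }
}
```

The nice part of the graph framing is that the Twitter and Craigslist scrapes just become more nodes and edges feeding the same traversal.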

Idea no. 2: A human-readable way to express checkin data. Well, this would have to be creative, since omniscience, the ken to perceive it all in light of infinite data, is something we mere humans lack. I like building things in 3D, but there's always a limit to how much you can convey on mobile, and then there's the question of whether your typical user would get it. For the most part, everyone gets Lego blocks, even if they might not totally like them. And user data consciously contributed by the user is almost always lost in a sea of text; that's the bane of letting *everyone* post.

After much consideration, I decided it would be in my best interest to go with the second idea.

Judging Criteria

This was my thought process when considering the judges' criteria for the hackathon:
1. Originality/Innovation: What if there were a data type (other than photos) that was intrinsically meant for human eyes? It's beyond the usual metadata, for sure. Hmm, but how could this be useful data… Well, make it fun, at least… add in some sound.
2. Effective Use of Cloud Tech: Lots of data needs cloud storage.
3. Relevance to the DF11 "Social Enterprise" theme: a) True relevance via the graph: imagine HR being able to instantly recruit that one employee who can change the world! b) Lego-ize the world, a new social metric: "a new social enterprise – imagine the millions of people out there… each contributing a voxel at a venue."
4. Judges' Proclivity: I've been lucky enough to attend enough startup pitches in front of similar audiences to have a good feel for this last one.

The Meat of the Hack

The way I do hackathons is that I go in with a solid idea and then, when I'm finally there, start building it from scratch. It's a purist's perspective, and it's also aptly scrappy: if it doesn't turn out well, you've lost the least amount of time.

On Monday, I started setting down the foundations of the core engine. This was "architecture in a day." Part of the day went to LucidChart flowcharts, UML diagrams, layout sketches, and Todoist for task listing. The other part went to actually coding the 3D engine for adding the blocks, which didn't take long at all, thanks to Unity. By the end of the day, the app was wrapped up with data ready to be I/O'ed to cloud storage.
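For the curious, the data side of that wrap-up is the simple part: a placed block boils down to a small record keyed by venue and grid position, something like the sketch below (Java purely for illustration, since the app itself is built in Unity; every field name here is my guess, not the real schema):

```java
import java.util.Locale;

// Illustrative shape of one placed block, ready to be pushed to cloud
// storage. Field names are hypothetical, not SocialVoxels' real schema.
public class Voxel {
    String venueId;      // e.g. a Foursquare venue id
    int x, y, z;         // grid position within the venue's build area
    String colorHex;     // Lego-style block color
    String contributor;  // who placed it
    long placedAtMillis; // when it was placed

    // Hand-rolled JSON, enough for a POST to a storage endpoint.
    String toJson() {
        return String.format(Locale.ROOT,
            "{\"venueId\":\"%s\",\"x\":%d,\"y\":%d,\"z\":%d," +
            "\"color\":\"%s\",\"contributor\":\"%s\",\"placedAt\":%d}",
            venueId, x, y, z, colorHex, contributor, placedAtMillis);
    }
}
```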

On Tuesday and part of Wednesday, I focused on integrating a bunch of APIs and I/O-ing data from the cloud. I also tested the GPS side of things a bit on my iPad 2. I phased in and out of different Dreamforce sessions, attended the evening events (a few receptions and parties at the usual SF haunts), and also the Metallica concert!
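Conceptually, the cloud round-trip is just "give me the blocks near where I'm standing." Here is a rough sketch of that fetch step, with a made-up endpoint and parameters standing in for whatever the real storage service exposes:

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Locale;

// Sketch of the fetch step: ask the cloud store for voxels near the
// device's GPS fix. The endpoint URL and parameters are hypothetical.
public class VoxelFetcher {
    public static void main(String[] args) throws IOException, InterruptedException {
        double lat = 37.7840, lng = -122.4010;  // e.g. a fix near Moscone
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(String.format(Locale.ROOT,
                "https://example-voxel-store.herokuapp.com/voxels?lat=%f&lng=%f&radius=100",
                lat, lng)))
            .GET()
            .build();
        HttpResponse<String> response =
            client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());  // JSON list of nearby voxels
    }
}
```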

By Thursday, I was in zombie mode, both from having worked quasi-intensely through most of each day and from attending a bunch of evening events (averaging about three hours of sleep a night).

And then came the part I'd spent most of Thursday preparing for: pitching. I'm not good at public speaking or pitching, but, worst case, if nothing came of it, I'd at least get another chance to practice. It went okay, as, I guess, some of my points got through. I didn't get to demo as much of the app as I'd have liked (it was a strict two minutes), and I didn't even get to explain the real social enterprise significance.

So, the guy whose AR book I'm tech reviewing, Kyle Roche, won first place, and I won second. (No relation: we met for the first time at the event. Apress does everything so remotely that no one on the stack knows anyone else.)

One final note: if you are interested in SocialVoxels, I am planning to run a beta test in late October to make the app more stable. Please check my blog for the latest updates.
