Friday, June 25, 2010

Facebook as a search engine? How about a "Paid Like" marketplace?

There was a big splash yesterday from the allfacebook.com folks about Facebook declaring war against Google. In that post they said:
"While we suggested that the like had just replaced the link, it has now become abundantly clear what Facebook’s intentions are. Facebook wants to launch the social semantic search engine as we alluded to during f8. Now that the search results are officially showing up as Facebook search results, the war has begun."
Using "Like" as an input into relevance is a fine idea - but Facebook (or Twitter, for that matter) is a long way from building a full-blown search engine purely on that data. Why do I say that? Fundamentally, "Likes" are ambiguous, random, and spam-prone (and that spam is nearly impossible to detect). These are big issues any search engine has to solve before it can call itself a search engine.
Google's "Link" approach has many merits over Facebook's "Like" approach; a link-based approach fares better primarily in determining context and detecting spam. Technically, it is somewhat easier to detect a spam link or a link that was paid for SEO purposes. However, it is nearly impossible to determine whether a Like was a paid Like or a spurious one. It's just not easy.
On top of this, in Google's ecosystem a link is only as important as its source - the domain it's coming from. Google has built an authority system for domains, PageRank (PR), based on a very complex but trusted model - this system helps eliminate spam every time we perform a search on Google. If there were an equivalent from Facebook, it would have to be built around the people performing the Likes. But even if, for the sake of argument, you assume a Social Rank (SR) for a given individual based on their interests, friends, and related metadata, it is nearly impossible to determine the context around that person's likes and dislikes. The reason? Modeling a human's interaction context is extremely complicated and highly error-prone.
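To make the contrast concrete, here is a toy sketch of the PageRank idea - a simple power iteration over a tiny made-up link graph. This is just an illustration of how link-based authority accumulates, not Google's actual implementation, and the graph and parameter values are invented for the example:

```javascript
// Toy PageRank: power iteration over a tiny link graph.
// Each page's rank is split evenly among the pages it links to;
// the damping factor d models a surfer occasionally jumping to a random page.
function pageRank(links, iterations = 50, d = 0.85) {
  const pages = Object.keys(links);
  const n = pages.length;
  // Start with uniform rank across all pages.
  let rank = Object.fromEntries(pages.map(p => [p, 1 / n]));

  for (let i = 0; i < iterations; i++) {
    // Base rank from the random-jump component.
    const next = Object.fromEntries(pages.map(p => [p, (1 - d) / n]));
    for (const p of pages) {
      const outs = links[p];
      for (const q of outs) {
        // p passes a share of its rank to each page it links to.
        next[q] += (d * rank[p]) / outs.length;
      }
    }
    rank = next;
  }
  return rank;
}

// A links to B and C; B and C both link back to A.
const ranks = pageRank({ A: ["B", "C"], B: ["A"], C: ["A"] });
// A accumulates the most authority, since every other page links to it.
```

The point of the sketch: authority flows along links from trusted sources, which is exactly the structure that is hard to reproduce for a flat stream of Likes.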
Now, if Facebook were to combine this Like data with traditional link-based relevance, things could get interesting - but even then it's an incremental advancement, not a game changer that replaces Google.
I'm sure the folks at Facebook are thinking about the potential pitfalls of Like-based relevance and the shady "Paid Like" market it could create. I'm also sure that if Facebook were to launch a "search engine" without the proper technology and tools to combat "Like spam", we would all just grow to Dislike the Like.
As I said before, this is not an easy problem to solve. So is Facebook ready to be a Google rival? Not a chance - not yet, at least.

Wednesday, June 23, 2010

Render your own Heatmaps with HeatMapRenderer

Heatmaps (aka density maps) are a neat way to visualize geo data - and these days they are everywhere, from Starbucks WiFi heatmaps to The Dealmap local deals maps! Given how popular they are, I was surprised to learn that there aren't many (free) tools available to generate your own Google Maps-ready heatmap. So I have developed a simple utility, called HeatMapRenderer, that can generate Google Maps-ready heatmap tiles from any geo dataset that has latitude and longitude values.
This utility is built for Windows machines (with .NET 3.5) and is written in C#. If there is a need, and a community willing to build on top of what I have, I don't mind releasing it as open source; for now it's closed source, with an extensibility model built in. Think of HeatMapRenderer as a simple rendering engine into which you can plug different data sources. Each data source has its own adaptor - an adaptor is responsible for reading the latitude/longitude data by implementing a simple interface. HeatMapRenderer natively supports a CSV adaptor, so if you can dump your geo data into a CSV file you can use HeatMapRenderer instantly. I will write a follow-up on how to extend it and write your own adaptors in my next post.
So for now, let's assume you have your latitude/longitude data in a CSV file, and that the column indexes for latitude and longitude are 0 and 1. Now you need to add a config section like the one below to the HeatmapRenderer.exe.config file (from the download):
<add name="us-starbucks"
   sourcetype ="HeatMapRenderer.Sources.SimpleCSVSource, HeatMapRenderer.Sources"
   sourceconnectionstring="sampledata\starbucks.csv"
   sourcearguments="0,1"
   palettefilepath="palette\palette.bmp"
   pointradius="5"
   minmaplevel="1"
   maxmaplevel="10"
   outpath="map-tiles\us-starbucks">
</add>
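For reference, with sourcearguments="0,1" the CSV itself just needs latitude at column index 0 and longitude at index 1, one point per row. The coordinates below are made-up example values, not data from the download:

```
47.6097,-122.3331
47.6205,-122.3493
38.8977,-77.0365
```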
Once you add a config section as shown above, all you need to do to generate the heatmap tiles, is to type in a command like below:
HeatmapRenderer.exe us-starbucks
As you can see, it's very straightforward. Here is a bit more about the config values:
name = a unique name so that you can identify and run the tile generation job.
sourcetype = the source adaptor. If you have a CSV file you can use the built-in CSV source adaptor, and of course you can build your own adaptor too (more on that later).
sourceconnectionstring = points to the CSV file (or, if you have a MySQL data source, this would be the connection string, etc.).
sourcearguments = for a CSV, this identifies the lat/long column index values; you can use this field to pass any data required by your own adaptor.
palettefilepath = the default heatmap palette; you can generate your own palette if needed (the download includes a palette file).
pointradius = defaults to 5 pixels; you can reduce or increase the point radius based on your scenario.
minmaplevel = usually 1. Levels map to Google Maps zoom levels, so 1 is a very, very high-level map (world level).
maxmaplevel = usually determined by your app, but in this case 10. Use around 15 if you want somewhat street-level heatmaps.
outpath = where to store the generated heatmap tiles (a local disk path).
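One thing to keep in mind when picking maxmaplevel: at Google Maps zoom level z, the full world is covered by a 2^z × 2^z grid of tiles, so the tile pyramid grows four-fold with every extra level. A renderer would typically only emit tiles that actually contain data, but the full-grid numbers give a sense of scale:

```javascript
// At Google Maps zoom level z, the world is a 2^z x 2^z grid of 256px tiles,
// so the worst-case tile count quadruples with each additional zoom level.
function tileCount(zoom) {
  const side = Math.pow(2, zoom);
  return side * side;
}

console.log(tileCount(1));  // 4 tiles at world level
console.log(tileCount(10)); // 1048576 tiles
console.log(tileCount(15)); // 1073741824 tiles - why street level gets expensive
```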

Once the tiles are generated, you can easily integrate them into a Google Maps mashup using the following steps:
1. First, upload your tiles to a web server where they can be accessed without any authentication.
2. Then use the code below to integrate them:
var map = new GMap2(document.getElementById("map_canvas"));
map.setMapType(G_NORMAL_MAP);
map.setCenter(new GLatLng(38.86252601314520, -76.96975322980910), 2);

var myCopyright = new GCopyrightCollection("© Chandu Thota");
myCopyright.addCopyright(new GCopyright('Chandu Thota',
    new GLatLngBounds(new GLatLng(-90, -180), new GLatLng(90, 180)),
    0, '2010 chanduthota.com'));

var gtl = new GTileLayer(myCopyright);
// Build the tile URL; adjust this to match the naming scheme of your generated tiles.
gtl.getTileUrl = function(tile, zoom) {
    return "http://<your host tile path>/" + zoom + tile.x + tile.y + ".png";
};
gtl.isPng = function() { return true; };
gtl.getOpacity = function() { return 0.5; };

var tilelayer = new GTileLayerOverlay(gtl);
map.addOverlay(tilelayer);
That's it!
You can download the full package with the exe, sample adaptor code, a palette, and sample data files (Starbucks locations and DC crime locations).
Let me know if you find this tool useful.

Get the local deals in a fancy Carousel!

Just wanted to point out our new Carousel Deal Widget. :) It's part of our Dealmap widget family, but fancier - it displays top deals on your site with a better experience. Of course, we also have the standard IAB sizes if you want them.




Again, it only takes one line of code to integrate it into your site:

<iframe height="100px" scrolling="no" src="http://widgets.thedealmap.com/carousel/?lat=37.3165&lon=-121.874&d=8&width=420" style="border: 0px;" width="450px"></iframe>


You can check out full widgets here: http://www.thedealmap.com/widgets