WikiGraph As Delicious Data
I decided to upload my WikiLog graph to my Del.icio.us account. Because Everything Is A Graph...
I sucked down a couple of lists of the pages on my site (2005-11-21-CurrentWikilogCount):

- one just giving the last-mod-date
- another listing each WikiWord contained in each page

I grabbed Matt Biddulph's Python code to wrap the API, though it really wasn't necessary.

- I took the same Expanding Wiki Words code I use in my WikiLog to define the Description for each item
- I used the last-mod-date as the posting-date, but that doesn't seem to have mattered

I wrote some code to combine my 2 input lists and start uploading, with a built-in throttle to avoid getting blocked.
I'm up to roughly 3000 pages now. Boy is that a long tag list.
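The combine-and-upload step might look something like this minimal sketch: the base URL, throttle interval, and input shapes are my assumptions, and the del.icio.us v1 API (HTTP Basic auth, `posts/add` with `url`, `description`, `tags`, `dt`) is long defunct.

```python
import re
import time
import urllib.parse
import urllib.request

def expand_wiki_word(word):
    """Expanding-Wiki-Words step: "TomPeters" -> "Tom Peters"."""
    return re.sub(r"(?<=[a-z])(?=[A-Z])", " ", word)

def build_posts(mod_dates, page_words,
                base_url="http://webseitz.fluxent.com/wiki/"):
    """Combine the two input lists (page -> last-mod-date and
    page -> contained WikiWords) into one post record per page.
    base_url is a hypothetical example."""
    posts = []
    for page, date in mod_dates.items():
        posts.append({
            "url": base_url + page,
            "description": expand_wiki_word(page),
            "tags": " ".join(page_words.get(page, [])),
            "dt": date + "T00:00:00Z",  # last-mod-date as posting-date
        })
    return posts

def upload(posts, user, password, pause=2.0):
    """Push each record through the v1 posts/add call, sleeping
    between requests as the built-in throttle."""
    auth = urllib.request.HTTPBasicAuthHandler()
    auth.add_password("del.icio.us API", "https://api.del.icio.us/",
                      user, password)
    opener = urllib.request.build_opener(auth)
    for post in posts:
        opener.open("https://api.del.icio.us/v1/posts/add?"
                    + urllib.parse.urlencode(post))
        time.sleep(pause)  # throttle to avoid getting blocked
```

With ~3000 pages and a 2-second pause, a full run would take a couple of hours, which is roughly what a polite bulk upload looked like back then.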
Some possible things to do:

- count the unique tags
- tell other Wiki hosts to try the same thing (Blue Oxen, Community Wiki?)
- maybe auto-post all those sites? I don't have the link info; actually, they probably don't either... but they could write something that generates the data on the back end much more efficiently than crawling the site via HTTP...
- do a separate upload of remote links tied to my local tags
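Counting the unique tags doesn't need the API at all if the per-page WikiWord list is still on hand; a minimal sketch, assuming a dict of page -> WikiWords:

```python
from collections import Counter

def tag_counts(page_words):
    """Count how many pages carry each WikiWord as a tag;
    len() of the result is the number of unique tags."""
    counts = Counter()
    for words in page_words.values():
        counts.update(set(words))  # count each tag once per page
    return counts

counts = tag_counts({"PageA": ["WikiLog", "DelIcioUs"],
                     "PageB": ["WikiLog"]})
# len(counts) -> unique-tag total; counts.most_common() -> ranked list
```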
Thoughts after uploading:

- hmm, am I a combatant in the SpamWars now?
- is Del.icio.us a directory? To what?
- what kind of Peer Review/Reputation Management process is appropriate for Attention Management?
- is Technorati a better place to do this sort of cross-site tag-match?
- Tag Intersection queries
  - boy, I miss the AltaVista interface that gave you co-occurring words for your search terms and let you exclude/include them to refine a search
  - I smell some sort of Drag And Drop interface for query generation as a future need... esp. in the Universal Inbox?
- grabbing the HTML title of a page is [often](http://del.icio.us/tag/TomPeters) nasty...
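A Tag Intersection query is just a set intersection over the same page -> tags mapping (del.icio.us exposed the idea as `/tag/a+b` URLs); here's a local sketch with the data shape assumed, plus an AltaVista-style co-occurrence helper for include/exclude refinement:

```python
def tag_intersection(page_words, *tags):
    """Pages carrying ALL of the given tags."""
    wanted = set(tags)
    return sorted(page for page, words in page_words.items()
                  if wanted <= set(words))

def cooccurring(page_words, tag):
    """Tags that co-occur with `tag` on at least one page --
    candidates for narrowing (include) or pruning (exclude) a query."""
    found = set()
    for words in page_words.values():
        ws = set(words)
        if tag in ws:
            found |= ws - {tag}
    return sorted(found)

pages = {"PageA": ["WikiLog", "DelIcioUs"],
         "PageB": ["WikiLog", "Technorati"],
         "PageC": ["WikiLog", "DelIcioUs", "Technorati"]}
# tag_intersection(pages, "WikiLog", "DelIcioUs") -> ["PageA", "PageC"]
# cooccurring(pages, "Technorati") -> ["DelIcioUs", "WikiLog"]
```

A drag-and-drop query builder would basically be a UI over these two operations: drop a tag in to intersect, drag a co-occurring tag out to exclude.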