Writing the Feeds Page was a project to learn more about JavaScript, specifically the AJAX technique for pulling in content from other websites using JavaScript requests. AJAX stands for “Asynchronous JavaScript And XML,” which is a misnomer because XML is not necessarily involved. It is also referred to as XHR, after the XMLHttpRequest object which is used to make requests, or perhaps most informatively as “scripted HTTP” (in David Flanagan’s massive tome JavaScript: The Definitive Guide).
Example: Last.fm
I wanted to learn AJAX, but I also wanted to play around with some APIs I had recently learned of. One of the first pieces I put together was my Last.fm recent tracks, which uses their user.getRecentTracks API. Last.fm was easy because the API is well-documented, RESTful (basically means parameters are passed in the request’s URI), & allows cross-domain requests. The code looks like this:
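In sketch form, anyway — the endpoint & parameter names follow Last.fm's public user.getRecentTracks documentation, “XXX” stands in for a real API key, & the helper function & element id are just my own naming:

```javascript
// Build the user.getRecentTracks URI from "key=value" query parameters.
// "XXX" is a placeholder for a real API key.
function recentTracksURI(user, limit, apiKey) {
  return 'http://ws.audioscrobbler.com/2.0/' +
    '?method=user.getrecenttracks' +
    '&user=' + user +
    '&limit=' + limit +
    '&api_key=' + apiKey;
}

// Walk each <track> in the XML response, pull the text of the artist
// & name elements, & drop the concatenated HTML into the recent-tracks div.
function processXML(data) {
  var html = '';
  $(data).find('track').each(function () {
    html += '<p>' +
      $(this).find('artist').text() + ' – ' +
      $(this).find('name').text() +
      '</p>';
  });
  $('#recent-tracks').html(html);
}

// $.get takes the URI, a success handler, & the data type to expect.
// (The guard just keeps this from running where jQuery isn't loaded.)
if (typeof $ !== 'undefined') {
  $.get(recentTracksURI('phette23', 6, 'XXX'), processXML, 'xml');
}
```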
So where is this XMLHttpRequest object? I use jQuery to save myself from some verbose syntax & cross-browser headaches, so $.get handles the XHR for me. The first parameter passed to $.get is the URI to request, the second is a function to call if the request returns successfully, & the third is an optional one for the type of data to expect (XML). The URI uses query string parameters of the form “key=value” to request specific items: my URI says “I want the six most recent tracks from user phette23.” To reuse this code, you would need to get an API key of your own to fill in as the final “XXX” value.
Below $.get is the processXML function, which takes the response from Last.fm & constructs a long string of HTML. XML is akin to HTML but more structured; processXML iterates through the data (using jQuery’s each), takes the text of the artist & name elements from each track, & concatenates it all into one string. The final line puts that string into the recent-tracks div’s inner HTML.
All the Pieces
Each piece of the Feeds page works a little differently. For Last.fm, I used pure JavaScript to process XML data. The date field required a little special handling (see Struggles below) but for the most part it was straightforward.
For my Diigo links, I cheated quite a bit: Diigo has a nice linkroll that was exactly what I would have built, & because the Diigo API requires authentication & server-side scripting, I used their linkroll instead. I use jQuery to remove two elements from the linkroll that affect its styling, giving me more control over the display. For GitHub, I originally used my public Atom feed as an XML source, much like Last.fm, but ended up using Drupal’s Aggregator module instead. I hadn’t used Aggregator before, so it was a learning experience, which was the whole point.
For Twitter I used an approach similar to Last.fm, because Twitter also has a well-documented, RESTful API, but Twitter offers JSONP, which is even better for cross-domain requests. JSONP, or “JSON with Padding,” is essentially a way of smuggling data in via a script tag that contains what amounts to a JavaScript object. This object is then easy to process; the Twitter equivalent to my processXML function above does not even use jQuery, because it is working with native JavaScript structures, not manipulating the DOM.
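A sketch of that Twitter code — the endpoint is Twitter’s unauthenticated v1 timeline API, & the screen name, count, & target element id are placeholder assumptions:

```javascript
// Turn an array of tweet objects into one HTML string — plain JavaScript,
// no jQuery, since JSONP hands us native objects & arrays.
function formatTweets(tweets) {
  var html = '';
  for (var i = 0; i < tweets.length; i++) {
    html += '<p>' + tweets[i].text + '</p>';
  }
  return html;
}

// "callback=?" tells jQuery to generate a callback name & load the
// response as a cross-domain script. (Guard skips this outside a browser.)
if (typeof $ !== 'undefined') {
  $.getJSON(
    'http://api.twitter.com/1/statuses/user_timeline.json?screen_name=phette23&count=5&callback=?',
    function (tweets) {
      document.getElementById('tweets').innerHTML = formatTweets(tweets);
    }
  );
}
```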
For PataMetaData, my blog hosted on Blogger, I used Yahoo! Pipes to pull in the blog’s RSS feed & convert it to JSON, then followed a process similar to Twitter’s: jQuery requests the JSON, a few lines of code put the data into an HTML template, & the result is placed onto the page.
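The template-filling step amounts to a few lines like these; the value.items wrapper follows Pipes’ JSON output format, but the field names, pipe URL, & element id are assumptions:

```javascript
// Fill a tiny HTML template per blog post. Pipes wraps the converted
// RSS items in value.items; title & link field names are assumptions.
function renderPosts(items) {
  var html = '';
  for (var i = 0; i < items.length; i++) {
    html += '<h3><a href="' + items[i].link + '">' + items[i].title + '</a></h3>';
  }
  return html;
}

// jQuery requests the Pipes JSON, then the rendered result goes onto
// the page. (The pipe id "XXX" is a placeholder.)
if (typeof $ !== 'undefined') {
  $.getJSON(
    'http://pipes.yahoo.com/pipes/pipe.run?_id=XXX&_render=json&_callback=?',
    function (data) {
      document.getElementById('patametadata').innerHTML = renderPosts(data.value.items);
    }
  );
}
```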
Lastly, Goodreads required use of a server-side PHP script. This script just acts as a proxy, grabbing an XML file that JavaScript isn’t allowed to access & passing it along to my site as data. So the architecture resembles Last.fm’s: $.get calls a success handler that processes some XML into a long string.
All the Struggles
It took a while to get the Feeds page working. I struggled with many oddities that I still don’t have solid solutions for. For instance, my original GitHub code (relying on the public Atom feed of a user’s activity) suddenly stopped working, whether I used a PHP script as a proxy or straight JavaScript.
Twitter, despite offering JSONP, took a while to figure out. The hangup was passing the correct parameters: to receive a JSONP response I needed to add “callback=?” to the requested URI.
Even for Last.fm, the actual code is uglier than my example, because I must wrap the entire bit in a self-executing anonymous function that ensures $ = jQuery. This is necessary because Drupal, for whatever reason, includes jQuery but leaves $ undefined. Furthermore, the date field proved troublesome: it arrives as raw text in the GMT timezone, so to convert it to my local time I had to slice the hour out of the string, convert it to a number, subtract five hours, & wrap negative results around midnight (otherwise 02:00 would become -03:00, which is not a real time). Even then, I don’t know how to programmatically adjust for daylight savings time. Another interesting wrinkle is that currently-playing tracks have no date data, which I also had to handle.
Lastly, my finished product is hardly cross-browser: visit the Feeds page in IE8 for a demonstration, where only the Diigo linkroll (which required almost no work on my part) & Last.fm show up. Further work will involve making the page work in more browsers rather than adding further feeds. Just now, I tested in all major browsers on my MacBook & the results were actually quite positive: Chrome, Firefox, & Opera all displayed every section. I’m almost certain previous tests were not so successful. Safari, on first page load, didn’t show the PataMetaData feed, which then appeared upon refresh; I’m not sure what I need to do there.
Overall, this project not only provided an impetus to learn more about JavaScript, jQuery, & AJAX but also produced a nice web page summarizing my various Internet activities. As I begin to work with more APIs at work, I’m sure the experience will pay off handsomely.