Tag Archives: google

The Google Closure Minifier App CodeUnit 23 NOV 2010

I’ve mentioned Google’s cool Closure JavaScript minifier project here in these pages before, but now I see they’ve gone and put up a fantastic app that makes it as simple as a snap of the fingers to generate minified versions of your existing JavaScript libraries.

Using it is as simple as hitting the URL, adding the URL of the js library you want to minify, selecting the optimization type you want to employ, adding some additional formatting rules, and hitting compile!

Simple, nice, and it gives a great result: it tells you by how much the new file was compressed, gives you various other stats, shows off the compiled code, warnings, errors (and even the POST data), and most important of all, provides a download link to your minified js file.

Seriously cool in other words! :)

Related Link: http://closure-compiler.appspot.com/home

FPDF: Failing to Insert a Google Chart into a PDF CodeUnit 13 OCT 2010

My tried and trusted method for inserting graphs into PDFs is to use the nifty PHP FPDF library and insert an image into the document via a Google Chart URL.

In other words, your code would look something like this:

$charturl = "http://chart.apis.google.com/chart?cht=lc&chco=" . implode(',', $colors) . "&chs=930x310&chd=t:" . implode('|', $graphdata) . "&chxt=x,y&chxl=0:|" . implode('|', $monthsheading) . "|1:|10|20|30|40|50|60|70|80|90|100";
$pdf->Image($charturl, null, null, 180, 60, 'PNG');

99% of the time though, my PDF generation fails, gives me an error about not being able to load the image, and leaves me in tears.

And each and every time, without fail, I can go look at the PHP error log and be greeted with this:

failed to open stream: HTTP request failed! HTTP/1.0 400 Bad Request

And then I remember: URLENCODE the damn graph string before feeding it into the Image function!
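In other words, the snippet from above should really read something like this (same variables as before, with the parameter values that contain pipe and colon characters run through urlencode so they survive the trip):

```php
// The fix: urlencode the parameter values containing pipe and colon
// characters before building the chart URL.
$chd = urlencode('t:' . implode('|', $graphdata));
$chxl = urlencode('0:|' . implode('|', $monthsheading) . '|1:|10|20|30|40|50|60|70|80|90|100');
$charturl = "http://chart.apis.google.com/chart?cht=lc&chco=" . implode(',', $colors)
    . "&chs=930x310&chd=" . $chd . "&chxt=x,y&chxl=" . $chxl;
$pdf->Image($charturl, null, null, 180, 60, 'PNG');
```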

One day I’ll remember to do this from the get go, promise! :)

Related Link: http://php.net/manual/en/function.urlencode.php

PHP and Google Charts: Using a POST request when a GET String is too Long CodeUnit 16 JUN 2010

Google Charts is an extremely stable and reliable web-based charting engine that is perfect for dropping into any project where you quickly want to produce some high quality graphs for your end user.

The default way to generate a Google Chart is to simply create a normal image tag and set its source to the Google API script, with all the chart options passed along as a GET request string.

Of course, as we all know, there is a limit to the length of a GET string, so the question raised once you start generating bigger graphs is: how do you get around this particular limitation?

Well, the answer of course lies in using a POST request instead, and thanks to PHP’s explicit support for POST requests, this becomes as easy as pie! :)

Swiping the demo code straight from Google’s documentation (because I’m too lazy to paste my own development example), you can see that the PHP script which handles the generation is nothing more than a file that first pushes through the appropriate image content header (in this case a PNG). It then makes use of the classic fopen functionality (ensure that remote URL access is enabled on your server) to pull down the generated image from Google’s Chart API, setting a POST stream context to carry the chart parameters stored in the previously created data array.

The final leg of work is then of course to simply reference your PHP script as the src of an image tag on the page you want the actual graph to appear.

PHP Chart Script:

  // Create some random text-encoded data for a line chart.
  header('content-type: image/png');
  $url = 'http://chart.apis.google.com/chart';
  $chd = 't:';
  for ($i = 0; $i < 150; ++$i) {
    $data = rand(0, 100000);
    $chd .= $data . ',';
  }
  $chd = substr($chd, 0, -1);

  // Add data, chart type, chart size, and scale to params.
  $chart = array(
    'cht' => 'lc',
    'chs' => '600x200',
    'chds' => '0,100000',
    'chd' => $chd);

  // Send the request, and print out the returned bytes.
  $context = stream_context_create(
    array('http' => array(
      'method' => 'POST',
      'content' => http_build_query($chart))));
  fpassthru(fopen($url, 'r', false, $context));

HTML Page to Display Chart:
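All that’s needed here is a humble image tag pointing back at the PHP script – I’m assuming below that you saved it as chartserver.php, so adjust the filename to match whatever you actually called it:

```html
<img src="chartserver.php" />
```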

Simple, but remarkably effective and thus clever! :P

Related Link: http://code.google.com/apis/chart/docs/post_requests.html

Optimize Your Javascript by Minifying with Google’s Closure Compiler CodeUnit 02 MAR 2010

Optimize your JavaScript by minifying with Google’s Closure Compiler. Well, that pretty much says it all. By now we all know that there is plenty of scope for reducing the size of one’s JavaScript code by replacing bulky variable names with shorter versions, stripping out whitespace, and so on, but naturally, as one would expect, achieving this optimization by hand is a rather tiresome affair.

Enter the nifty Google Closure Compiler: simply put, a tool for making JavaScript download and run faster. It’s not a traditional code compiler, mind you: it doesn’t compile source code into machine code, but rather compiles JavaScript into better JavaScript, analyzing it, clearing out dead code, and rewriting and minimizing what’s left over. It also checks syntax, variable references and types, and even warns about common JavaScript pitfalls just for fun.

There are a number of ways in which you can set the compiler loose on your code: you can use the open source Java command line application, you can simply plug your script into the simple online web application, or you can even make use of their full RESTful API.
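By way of illustration, here’s a rough sketch of calling the RESTful API from PHP (the endpoint and parameter names are taken from the Closure Compiler service’s documentation, and as always, remote URL access needs to be enabled on your server):

```php
// Sketch: minify a snippet of JavaScript via the Closure Compiler
// REST API, using a POST stream context as per usual.
$params = array(
  'js_code'           => 'function hello(name) { alert("Hello, " + name); } hello("New User");',
  'compilation_level' => 'SIMPLE_OPTIMIZATIONS',
  'output_format'     => 'text',
  'output_info'       => 'compiled_code');
$context = stream_context_create(array('http' => array(
  'method'  => 'POST',
  'header'  => 'Content-type: application/x-www-form-urlencoded',
  'content' => http_build_query($params))));
// Prints the minified JavaScript returned by the service.
echo file_get_contents('http://closure-compiler.appspot.com/compile', false, $context);
```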

The benefits of using this great little system of course don’t need that much explanation. Firstly, in terms of efficiency, using the Closure Compiler will result in smaller JavaScript files, which in turn means faster loading applications, which obviously means reduced bandwidth needs. In terms of code checking, the Closure Compiler provides warnings for illegal JavaScript as well as for potentially dangerous operations, resulting in less buggy scripts that are simply easier to maintain.

And just in case you were wondering whether you should give it a spin, take note that jQuery has moved to the Closure Compiler to produce its minified scripts.

So what are you waiting for? ;)

Related Link: http://code.google.com/closure/compiler/

Using Google to Make Your jQuery Load Faster CodeUnit 04 FEB 2010

Google is a wonderful creature. Once the champion of the meek and “Do No Evil” standard bearer of the world, it has long since grown into a huge moneymaking behemoth that seeks to control every aspect of our forays into the online world.

However, no matter how much you dislike just how Microsoft-like they are becoming, they do have a lot of stuff that you want – like all their awesome, localized datacenters for example.

And just why would you want to make use of their awesome datacenters and data shifting capabilities then?

Well, for a start, they can certainly help speed up delivering your jQuery-powered websites to the masses! Just think about it: normally you simply host the main jQuery library script on your own webserver and link your webpage directly to it. However, there is a price to be paid here, namely the time it takes the browser to download the jQuery script before it can be put to use. So how does Google help us reduce this for our public-facing websites then?

Well, the first advantage of using Google to host the jQuery script is the decreased latency that comes with using Google’s content delivery network (CDN). CDNs usually work by distributing your static content across a range of diversely situated servers and then serving whatever file the user requests from the server located closest to that user – in other words, it technically should reach the user much faster. Say for instance your webserver sits in the United States and a user accesses it from South Africa. Seeing as Google has a server sitting much closer to the user, a request should be answered faster by the Google server than by the native server all the way back in the States. Needless to say, with Google’s huge worldwide footprint, I’m pretty sure you can see what I’m implying.

Secondly, using an alternative source for your jQuery inclusion helps speed things up in terms of parallelism. The web was designed to load asynchronously, meaning that elements are all pulled down and loaded up on their own little threads, all in order to speed things up a little. However, in order to avoid throttling a server, most browsers restrict the number of concurrent connections that can be made to any particular server! This could be a problem if all your content is being delivered from the same server, hence it makes sense to try and distribute the load across as many different servers as you can – in other words, using Google to serve up your jQuery file comes up trumps once again.

Finally, caching could just be the biggest advantage of using Google yet. Simply put, even if the jQuery file you are serving up is identical to the one being served up by every other jQuery-enabled website out there, the fact that these files come from different servers means that your browser is forced to download each site’s copy at least once, leaving you with multiple copies of the same file all over your system. However, if many sites serve jQuery off Google’s library, you may well have already grabbed the necessary file while browsing a site other than your own, meaning your browser no longer needs to pull the script file down again – saving you an entire download’s worth of time!

So no matter how you look at it, relying on the giant that is Google to serve up your jQuery for you might be one of the smartest ideas yet – though just how should you go about doing it?

Well, while Google does offer a nifty little loader call that determines the correct version of jQuery you are looking for and serves it up immediately, that technique does have an extra little check in it that will add a millisecond or two to the whole process, which could be a little bothersome to all you efficiency whores out there. Instead, it might be better to call the jQuery library directly using a standard <script> tag, leaving us with something like this:

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js" type="text/javascript"></script>
<script type="text/javascript">
$(document).ready(function(){
  //nice! :)
});
</script>

Fast, efficient, and perhaps well worth taking the time to check out whether or not this implementation is faster than your old way of calling it! ;)
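And for completeness, the version-resolving loader call mentioned above looks something like this (a quick sketch based on Google’s AJAX Libraries API loader, here asking for the latest 1.4.x release of jQuery):

```html
<script src="http://www.google.com/jsapi" type="text/javascript"></script>
<script type="text/javascript">
  // Ask Google's loader to resolve and serve the latest 1.4.x jQuery,
  // then run our setup code once it has loaded.
  google.load("jquery", "1.4");
  google.setOnLoadCallback(function() {
    $(document).ready(function(){
      // nice! :)
    });
  });
</script>
```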

Multi-search Using 3 Search Engines at Once! CodeUnit 27 JAN 2010

Search engines have become an indispensable part of modern life and the Internet, and as such almost all of us have long since determined our own particular favourite amongst the multitude of available workhorses.

However, search engines are not all the same and quite often return vastly differing search results on any particular search term, meaning that you’re never quite sure as to whether or not you are in fact getting the best search results possible.

So someone came up with the bright idea of harnessing the power of three of the biggest search engine names and showing their comparative search results alongside one another for any particular search term entered. Featuring a slick AJAX-driven interface, a good layout of information, as well as a clever display of search-term-related tag cloud information and image results, enter Yabigo, a particularly clever little monster.

Ya = Yahoo, bi = Bing and go = Google: Yabigo submits your query to all three search engine giants and then returns the plethora of results back to you, allowing you to quickly compare the three’s performance and increasing the chance of actually finding the information you were looking for in the first place!

Very clever boys, though I’m not sure how long you’ll last before the big guns put you down and out of the game. Still, very, very good idea! :)

Related Link: http://www.yabigo.com

Ubuntu: How to Install Google Chrome for Karmic Koala CodeUnit 25 DEC 2009

Installing the latest browser upstart, Google Chrome, on Ubuntu 9.10 Karmic Koala turns out to be a whole lot simpler than you might have thought.

The Chromium dev team have just released a Debian installer package for the Google Chrome beta, which you can head over and grab from http://www.google.com/chrome. (If you do it from a Windows machine you’ll need to navigate through to the Linux downloads, but if you hit the URL from Ubuntu, it defaults to the Linux download.)

Once the 12 MB file has finished downloading, it becomes a very simple matter of double clicking on the .deb package and following the Debian Package Installer prompts that follow. That easy.

Finally, when the go-ahead is lit and the package sits installed, you can access Google Chrome via the Applications -> Internet -> Google Chrome system menu option.

(Needless to say, first option you get is to import all your stuff over from Firefox. You’d better do it – after all, you don’t know when you’ll ever be going back! :P)

Related Link: http://www.google.com/chrome

And now Google DNS CodeUnit 12 DEC 2009

So, not content with their ever-expanding empire that basically covers all aspects of life on the Internet as we know it, Google went and released their own DNS system last week, taking yet another deep dig into knowing exactly what you’re doing on your PC at all points in time, no matter where you might be.

In case you are a little lost here, Domain Name System (DNS) servers are used to translate human-readable web addresses like http://www.craiglotter.co.za into the physical network addresses corresponding to where the servers actually are (e.g. 64.202.163.87).

Now for the most part, users don’t even know of the existence of these magic translation servers and happily go about using the default DNS server that their ISP provides, but over the last couple of years there has been a bit of a shift towards a more competitive market, with companies like OpenDNS and Neustar popping up and offering both free and premium services that are generally better than the ISP offerings, simply because the ISPs don’t have a vested interest in actually improving upon the technology.

Anyway, Google has now jumped on the bandwagon and is offering their very own DNS service which, simply by virtue of being a Google offering, automatically means that it should be rock solid, pretty damn clever and remarkably stable right from day one.

And according to Google, using their DNS servers offers you gains in terms of both performance and enhanced security, as outlined here and here.

You back yet?

Anyway, to test their new DNS service out, simply change your machine or browser’s network settings to use the IP addresses 8.8.8.8 and 8.8.4.4 as your DNS servers and you should be ready to go. More information on exactly how to do that (if you’re feeling not quite up to scratch) can be found here (thanks Google!), including instructions for getting it right on Ubuntu first time around!
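For the impatient, a quick-and-dirty sketch of doing this on Ubuntu is to append Google’s nameservers to /etc/resolv.conf (note that network managers often regenerate this file, so treat this purely as a temporary test):

```
# Temporary test only: /etc/resolv.conf is often regenerated by the
# network manager, so these entries may not survive a reboot.
sudo sh -c 'printf "nameserver 8.8.8.8\nnameserver 8.8.4.4\n" >> /etc/resolv.conf'
```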

Related Link: http://code.google.com/speed/public-dns/